Sample records for adaptive projection method

  1. Assessing Adaptive Instructional Design Tools and Methods in ADAPT[IT].

    ERIC Educational Resources Information Center

    Eseryel, Deniz; Spector, J. Michael

    ADAPT[IT] (Advanced Design Approach for Personalized Training - Interactive Tools) is a European project within the Information Society Technologies program that is providing design methods and tools to guide a training designer according to the latest cognitive science and standardization principles. ADAPT[IT] addresses users in two significantly…

  2. Adaptive projection intensity adjustment for avoiding saturation in three-dimensional shape measurement

    NASA Astrophysics Data System (ADS)

    Chen, Chao; Gao, Nan; Wang, Xiangjun; Zhang, Zonghua

    2018-03-01

    Phase-based fringe projection methods have been commonly used for three-dimensional (3D) measurements. However, image saturation results in incorrect intensities in captured fringe pattern images, leading to phase and measurement errors. Existing solutions are complex. This paper proposes an adaptive projection intensity adjustment method to avoid image saturation and maintain good fringe modulation when measuring objects with a wide range of surface reflectivities. The adapted fringe patterns are created using only one prior step of fringe-pattern projection and image capture. First, a set of phase-shifted fringe patterns with a maximum projection intensity value of 255 and a uniform gray-level pattern are projected onto the surface of an object. The patterns are reflected from and deformed by the object surface and captured by a digital camera. The best projection intensity corresponding to each saturated-pixel cluster is determined by fitting a polynomial function that transforms captured intensities into projected intensities. Subsequently, the adapted fringe patterns are constructed using the best projection intensities at the corresponding projector pixel coordinates. Finally, the adapted fringe patterns are projected for phase recovery and 3D shape calculation. The experimental results demonstrate that the proposed method achieves high measurement accuracy even for objects with a wide range of surface reflectivities.
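    The intensity-mapping step described above can be sketched numerically as follows. This is a minimal illustration, not the authors' implementation: the calibration pairs, the polynomial degree, and the target camera response of 250 are assumptions made here for demonstration.

    ```python
    import numpy as np

    # Minimal sketch (not the paper's code): estimate, for a saturated pixel
    # cluster, the projector intensity expected to keep the camera response
    # just below saturation. Calibration pairs and the target of 250 are
    # illustrative assumptions.

    def fit_captured_to_projected(captured, projected, degree=3):
        """Fit a polynomial that maps captured intensity -> projected intensity."""
        return np.polyfit(captured, projected, degree)

    def best_projection_intensity(poly, target_captured=250.0, lo=0, hi=255):
        """Projected intensity expected to yield `target_captured` at the camera."""
        return float(np.clip(np.polyval(poly, target_captured), lo, hi))

    # Illustrative calibration: unsaturated pixels observed while projecting a
    # uniform gray level and full-intensity fringes (values are made up).
    projected = np.array([ 40,  80, 120, 160, 200, 240], dtype=float)
    captured  = np.array([ 55, 105, 150, 190, 225, 252], dtype=float)

    poly = fit_captured_to_projected(captured, projected)
    print("adapted projection intensity:", best_projection_intensity(poly))
    ```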

  3. Developing adaptive interventions for adolescent substance use treatment settings: protocol of an observational, mixed-methods project.

    PubMed

    Grant, Sean; Agniel, Denis; Almirall, Daniel; Burkhart, Q; Hunter, Sarah B; McCaffrey, Daniel F; Pedersen, Eric R; Ramchand, Rajeev; Griffin, Beth Ann

    2017-12-19

    Over 1.6 million adolescents in the United States meet criteria for substance use disorders (SUDs). While there are promising treatments for SUDs, adolescents respond to these treatments differentially in part based on the setting in which treatments are delivered. One way to address such individualized response to treatment is through the development of adaptive interventions (AIs): sequences of decision rules for altering treatment based on an individual's needs. This protocol describes a project with the overarching goal of beginning the development of AIs that provide recommendations for altering the setting of an adolescent's substance use treatment. This project has three discrete aims: (1) explore the views of various stakeholders (parents, providers, policymakers, and researchers) on deciding the setting of substance use treatment for an adolescent based on individualized need, (2) generate hypotheses concerning candidate AIs, and (3) compare the relative effectiveness among candidate AIs and non-adaptive interventions commonly used in everyday practice. This project uses a mixed-methods approach. First, we will conduct an iterative stakeholder engagement process, using RAND's ExpertLens online system, to assess the importance of considering specific individual needs and clinical outcomes when deciding the setting for an adolescent's substance use treatment. Second, we will use results from the stakeholder engagement process to analyze an observational longitudinal data set of 15,656 adolescents in substance use treatment, supported by the Substance Abuse and Mental Health Services Administration, using the Global Appraisal of Individual Needs questionnaire. We will utilize methods based on Q-learning regression to generate hypotheses about candidate AIs. Third, we will use robust statistical methods that aim to appropriately handle casemix adjustment on a large number of covariates (marginal structural modeling and inverse probability of treatment weights
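    As a rough illustration of the Q-learning regression idea mentioned in the protocol, the sketch below estimates a two-stage decision rule by backward induction on simulated data. The covariates, the 0/1 setting coding, and the outcome model are invented for the example and have nothing to do with the project's actual data or analysis.

    ```python
    import numpy as np

    # Toy two-stage Q-learning sketch on simulated data. Covariates, the
    # treatment coding (0 = outpatient, 1 = residential), and the outcome
    # model are illustrative assumptions only.
    rng = np.random.default_rng(0)
    n = 2000
    x1 = rng.normal(size=n)                            # baseline severity
    a1 = rng.integers(0, 2, size=n)                    # stage-1 setting
    x2 = 0.5 * x1 + 0.5 * a1 + rng.normal(size=n)      # interim severity
    a2 = rng.integers(0, 2, size=n)                    # stage-2 setting
    y = -(x2 - a2 * (x2 > 0)) + rng.normal(size=n)     # final outcome (higher = better)

    def fit(X, y):
        """Ordinary least squares."""
        return np.linalg.lstsq(X, y, rcond=None)[0]

    # Stage 2: regress Y on (x2, a2, a2*x2); the optimal a2 maximizes predicted Y.
    X2 = np.column_stack([np.ones(n), x2, a2, a2 * x2])
    b2 = fit(X2, y)
    q2 = lambda x2v, a2v: b2[0] + b2[1] * x2v + b2[2] * a2v + b2[3] * a2v * x2v
    y_opt2 = np.maximum(q2(x2, 0), q2(x2, 1))          # value under the optimal stage-2 rule

    # Stage 1: regress the stage-2 optimal value on (x1, a1, a1*x1).
    X1 = np.column_stack([np.ones(n), x1, a1, a1 * x1])
    b1 = fit(X1, y_opt2)
    print("stage-2 rule: choose residential when", f"{b2[2]:+.2f} {b2[3]:+.2f}*x2 > 0")
    print("stage-1 rule: choose residential when", f"{b1[2]:+.2f} {b1[3]:+.2f}*x1 > 0")
    ```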

  4. Incorporating adaptive responses into future projections of coral bleaching.

    PubMed

    Logan, Cheryl A; Dunne, John P; Eakin, C Mark; Donner, Simon D

    2014-01-01

    Climate warming threatens to increase mass coral bleaching events, and several studies have projected the demise of tropical coral reefs this century. However, recent evidence indicates corals may be able to respond to thermal stress through adaptive processes (e.g., genetic adaptation, acclimatization, and symbiont shuffling). How these mechanisms might influence warming-induced bleaching remains largely unknown. This study compared how different adaptive processes could affect coral bleaching projections. We used the latest bias-corrected global sea surface temperature (SST) output from the NOAA/GFDL Earth System Model 2 (ESM2M) for the preindustrial period through 2100 to project coral bleaching trajectories. Initial results showed that, in the absence of adaptive processes, application of a preindustrial climatology to the NOAA Coral Reef Watch bleaching prediction method overpredicts the present-day bleaching frequency. This suggests that corals may have already responded adaptively to some warming over the industrial period. We then modified the prediction method so that the bleaching threshold either permanently increased in response to thermal history (e.g., simulating directional genetic selection) or temporarily increased for 2-10 years in response to a bleaching event (e.g., simulating symbiont shuffling). A bleaching threshold that changes relative to the preceding 60 years of thermal history reduced the frequency of mass bleaching events by 20-80% compared with the 'no adaptive response' prediction model by 2100, depending on the emissions scenario. When both types of adaptive responses were applied, up to 14% more reef cells avoided high-frequency bleaching by 2100. However, temporary increases in bleaching thresholds alone only delayed the occurrence of high-frequency bleaching by ca. 10 years in all but the lowest emissions scenario. Future research should test the rate and limit of different adaptive responses for coral species across latitudes and
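    The two threshold behaviours described above can be caricatured on synthetic data as follows. This toy sketch compares a fixed preindustrial threshold against a threshold tracking the preceding 60 years; the real method uses monthly SST climatologies and degree-heating metrics, and all numbers here are illustrative.

    ```python
    import numpy as np

    # Toy comparison of a static threshold from a fixed "preindustrial"
    # climatology versus a threshold tracking the preceding 60 years,
    # on synthetic annual maximum SSTs (values are illustrative).
    rng = np.random.default_rng(1)
    years = np.arange(1861, 2101)
    warming = 0.012 * (years - 1861)                     # ~2.9 C by 2100 (illustrative)
    sst_max = 29.0 + warming + rng.normal(0, 0.3, years.size)

    def bleaching_years(sst, years, adaptive, window=60, offset=1.0):
        baseline = sst[years < 1900].mean()              # fixed "preindustrial" climatology
        events = []
        for i, yr in enumerate(years):
            if adaptive and i >= window:
                threshold = sst[i - window:i].mean() + offset
            else:
                threshold = baseline + offset
            if sst[i] > threshold:
                events.append(yr)
        return events

    static = bleaching_years(sst_max, years, adaptive=False)
    rolling = bleaching_years(sst_max, years, adaptive=True)
    print(f"events, fixed preindustrial threshold: {len(static)}")
    print(f"events, 60-year rolling threshold    : {len(rolling)}")
    ```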

  5. Reservoir adaptive operating rules based on both of historical streamflow and future projections

    NASA Astrophysics Data System (ADS)

    Zhang, Wei; Liu, Pan; Wang, Hao; Chen, Jie; Lei, Xiaohui; Feng, Maoyuan

    2017-10-01

    Climate change is affecting hydrological variables and consequently is impacting water resources management. Historical strategies are no longer applicable under climate change. Therefore, adaptive management, especially adaptive operating rules for reservoirs, has been developed to mitigate the possible adverse effects of climate change. However, to date, adaptive operating rules are generally based on future projections involving uncertainties under climate change, yet ignoring historical information. To address this, we propose an approach for deriving adaptive operating rules considering both historical information and future projections, namely historical and future operating rules (HAFOR). A robustness index was developed by comparing benefits from HAFOR with benefits from conventional operating rules (COR). For both historical and future streamflow series, maximizations of both average benefits and the robustness index were employed as objectives, and four trade-offs were implemented to solve the multi-objective problem. Based on the integrated objective, the simulation-based optimization method was used to optimize the parameters of HAFOR. Using the Dongwushi Reservoir in China as a case study, HAFOR was demonstrated to be an effective and robust method for developing adaptive operating rules under an uncertain and changing environment. Compared with historical or projected future operating rules (HOR or FPOR), HAFOR can reduce the uncertainty and increase the robustness for future projections, especially regarding results of reservoir releases and volumes. HAFOR, therefore, facilitates adaptive management given that climate change is difficult to predict accurately.
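    One plausible way to combine average benefits with a robustness index measured against conventional rules, in the spirit of the integrated objective described above, is sketched below. The benefit function, the weights, and the streamflow scenarios are placeholders, not the paper's formulation.

    ```python
    import numpy as np

    # Illustrative scoring of a candidate operating rule against both historical
    # and projected streamflow: average benefit plus a robustness index relative
    # to conventional operating rules (COR). All functions and numbers are
    # placeholders, not the paper's.

    def benefit(release_cap, inflows):
        """Toy benefit: reward releases close to a fixed demand under a simple rule."""
        demand = 5.0
        releases = np.minimum(inflows, release_cap)
        return float(np.mean(1.0 - np.abs(releases - demand) / demand))

    def integrated_objective(param, hist_inflow, future_inflows, cor_param, w=0.5):
        b_hist = benefit(param, hist_inflow)
        b_fut = np.array([benefit(param, q) for q in future_inflows])
        b_cor = np.array([benefit(cor_param, q) for q in future_inflows])
        robustness = float(np.mean(b_fut >= b_cor))   # share of scenarios not worse than COR
        return w * (0.5 * b_hist + 0.5 * b_fut.mean()) + (1 - w) * robustness

    rng = np.random.default_rng(2)
    hist = rng.gamma(2.0, 3.0, 600)
    future = [rng.gamma(2.0, 3.0 * s, 600) for s in (0.8, 1.0, 1.2)]  # drier/similar/wetter

    candidates = np.linspace(3.0, 9.0, 25)
    scores = [integrated_objective(p, hist, future, cor_param=6.0) for p in candidates]
    print("best release cap:", candidates[int(np.argmax(scores))])
    ```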

  6. Focus on climate projections for adaptation strategies

    NASA Astrophysics Data System (ADS)

    Feijt, Arnout; Appenzeller, Christof; Siegmund, Peter; von Storch, Hans

    2016-01-01

    Most papers in this focus issue on ‘climate and climate impact projections for adaptation strategies’ are solicited by the guest editorial team and originate from a cluster of projects that were initiated 5 years ago. These projects aimed to provide climate change and climate change adaptation information for a wide range of societal areas for the lower parts of the deltas of the Rhine and Meuse rivers, and particularly for the Netherlands. The papers give an overview of our experiences, methods, approaches, results and surprises in the process of developing scientifically underpinned climate products and services for various clients. Although the literature on interactions between society and climate science has grown over the past decade with respect to policy-science framing in post-normal science (Storch et al 2011 J. Environ. Law Policy 1 1-15, van der Sluijs 2012 Nature and Culture 7 174-195), user-science framing (Berkhout et al 2014 Regional Environ. Change 14 879-93) and joint knowledge production (Hegger et al 2014 Regional Environ. Change 14 1049-62), there is still a lot to gain. With this focus issue we want to contribute to best practices in this quickly moving field between science and society.

  7. Optimal and adaptive methods of processing hydroacoustic signals (review)

    NASA Astrophysics Data System (ADS)

    Malyshkin, G. S.; Sidel'nikov, G. B.

    2014-09-01

    Different methods of optimal and adaptive processing of hydroacoustic signals under multipath propagation and scattering are considered. Advantages and drawbacks of the classical adaptive (Capon, MUSIC, and Johnson) algorithms and "fast" projection algorithms are analyzed for the case of multipath propagation and scattering of strong signals. The classical optimal approaches to detecting multipath signals are presented. A mechanism of controlled normalization of strong signals is proposed to automatically detect weak signals. The results of simulating the operation of different detection algorithms for a linear equidistant array under multipath propagation and scattering are presented. An automatic detector based on classical or fast projection algorithms is analyzed; it estimates the background using either median filtering or the method of bilateral spatial contrast.
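    For reference, a minimal sketch of the Capon (MVDR) spatial spectrum, one of the classical adaptive algorithms mentioned above, for a linear equidistant array. Array size, source directions, and noise level are illustrative.

    ```python
    import numpy as np

    # Capon (MVDR) spatial spectrum P(theta) = 1 / (a^H R^-1 a) for a uniform
    # linear array; two plane waves in white noise are simulated as snapshots.
    rng = np.random.default_rng(3)
    M, N, d = 16, 400, 0.5                        # sensors, snapshots, spacing (wavelengths)
    angles_deg = np.array([-20.0, 5.0])           # true source directions (illustrative)

    def steering(theta_deg):
        k = 2 * np.pi * d * np.sin(np.deg2rad(theta_deg))
        return np.exp(1j * k * np.arange(M))[:, None]

    S = rng.normal(size=(2, N)) + 1j * rng.normal(size=(2, N))
    X = sum(steering(a) * S[i] for i, a in enumerate(angles_deg))
    X += 0.3 * (rng.normal(size=(M, N)) + 1j * rng.normal(size=(M, N)))

    R = X @ X.conj().T / N
    Rinv = np.linalg.inv(R + 1e-6 * np.eye(M))    # small diagonal loading

    scan = np.linspace(-90, 90, 721)
    p = np.array([1.0 / np.real(steering(t).conj().T @ Rinv @ steering(t))[0, 0]
                  for t in scan])

    # Pick the two largest local maxima of the spectrum as direction estimates.
    loc = [i for i in range(1, len(p) - 1) if p[i] > p[i - 1] and p[i] > p[i + 1]]
    top = sorted(loc, key=lambda i: p[i])[-2:]
    print("estimated source directions (deg):", sorted(float(scan[i]) for i in top))
    ```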

  8. Adaptation to Climate Change: A Comparative Analysis of Modeling Methods for Heat-Related Mortality

    PubMed Central

    Hondula, David M.; Bunker, Aditi; Ibarreta, Dolores; Liu, Junguo; Zhang, Xinxin; Sauerborn, Rainer

    2017-01-01

    Background: Multiple methods are employed for modeling adaptation when projecting the impact of climate change on heat-related mortality. The sensitivity of impacts to each is unknown because they have never been systematically compared. In addition, little is known about the relative sensitivity of impacts to “adaptation uncertainty” (i.e., the inclusion/exclusion of adaptation modeling) relative to using multiple climate models and emissions scenarios. Objectives: This study had three aims: a) Compare the range in projected impacts that arises from using different adaptation modeling methods; b) compare the range in impacts that arises from adaptation uncertainty with ranges from using multiple climate models and emissions scenarios; c) recommend modeling method(s) to use in future impact assessments. Methods: We estimated impacts for 2070–2099 for 14 European cities, applying six different methods for modeling adaptation; we also estimated impacts with five climate models run under two emissions scenarios to explore the relative effects of climate modeling and emissions uncertainty. Results: The range of the difference (percent) in impacts between including and excluding adaptation, irrespective of climate modeling and emissions uncertainty, can be as low as 28% with one method and up to 103% with another (mean across 14 cities). In 13 of 14 cities, the ranges in projected impacts due to adaptation uncertainty are larger than those associated with climate modeling and emissions uncertainty. Conclusions: Researchers should carefully consider how to model adaptation because it is a source of uncertainty that can be greater than the uncertainty in emissions and climate modeling. We recommend absolute threshold shifts and reductions in slope. https://doi.org/10.1289/EHP634 PMID:28885979
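    The two recommended adaptation assumptions, an absolute threshold shift and a reduction in slope, can be illustrated with a toy exposure-response function as below. The temperatures, baseline deaths, threshold, slope, and adaptation amounts are invented for the example and are not the study's values.

    ```python
    import numpy as np

    # Toy linear exposure-response model: excess relative risk grows above a
    # temperature threshold; adaptation is modelled either as a threshold shift
    # or as a reduction in slope. All numbers are illustrative placeholders.
    rng = np.random.default_rng(4)
    future_tmax = rng.normal(27.0, 5.0, 365 * 30)        # daily temperatures, 2070-2099
    baseline_daily_deaths = 50.0

    def heat_attributable_deaths(tmax, threshold, slope_per_degC):
        excess_rr = np.clip(tmax - threshold, 0, None) * slope_per_degC
        af = excess_rr / (1.0 + excess_rr)                # attributable fraction
        return float(np.sum(baseline_daily_deaths * af))

    no_adapt  = heat_attributable_deaths(future_tmax, threshold=30.0, slope_per_degC=0.04)
    shift     = heat_attributable_deaths(future_tmax, threshold=32.0, slope_per_degC=0.04)
    slope_red = heat_attributable_deaths(future_tmax, threshold=30.0, slope_per_degC=0.03)
    print(f"no adaptation        : {no_adapt:8.0f} attributable deaths")
    print(f"+2 C threshold shift : {shift:8.0f} attributable deaths")
    print(f"25% slope reduction  : {slope_red:8.0f} attributable deaths")
    ```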

  9. Adaptation to Climate Change: A Comparative Analysis of Modeling Methods for Heat-Related Mortality.

    PubMed

    Gosling, Simon N; Hondula, David M; Bunker, Aditi; Ibarreta, Dolores; Liu, Junguo; Zhang, Xinxin; Sauerborn, Rainer

    2017-08-16

    Multiple methods are employed for modeling adaptation when projecting the impact of climate change on heat-related mortality. The sensitivity of impacts to each is unknown because they have never been systematically compared. In addition, little is known about the relative sensitivity of impacts to "adaptation uncertainty" (i.e., the inclusion/exclusion of adaptation modeling) relative to using multiple climate models and emissions scenarios. This study had three aims: a) Compare the range in projected impacts that arises from using different adaptation modeling methods; b) compare the range in impacts that arises from adaptation uncertainty with ranges from using multiple climate models and emissions scenarios; c) recommend modeling method(s) to use in future impact assessments. We estimated impacts for 2070-2099 for 14 European cities, applying six different methods for modeling adaptation; we also estimated impacts with five climate models run under two emissions scenarios to explore the relative effects of climate modeling and emissions uncertainty. The range of the difference (percent) in impacts between including and excluding adaptation, irrespective of climate modeling and emissions uncertainty, can be as low as 28% with one method and up to 103% with another (mean across 14 cities). In 13 of 14 cities, the ranges in projected impacts due to adaptation uncertainty are larger than those associated with climate modeling and emissions uncertainty. Researchers should carefully consider how to model adaptation because it is a source of uncertainty that can be greater than the uncertainty in emissions and climate modeling. We recommend absolute threshold shifts and reductions in slope. https://doi.org/10.1289/EHP634.

  10. Adaptive pixel-to-pixel projection intensity adjustment for measuring a shiny surface using orthogonal color fringe pattern projection

    NASA Astrophysics Data System (ADS)

    Chen, Chao; Gao, Nan; Wang, Xiangjun; Zhang, Zonghua

    2018-05-01

    Three-dimensional (3D) shape measurement based on fringe pattern projection techniques has been commonly used in various fields. One of the remaining challenges in fringe pattern projection is that camera sensor saturation may occur if there is a large range of reflectivity variation across the surface, which causes measurement errors. To overcome this problem, a novel fringe pattern projection method is proposed to avoid image saturation and maintain high-intensity modulation for measuring shiny surfaces by adaptively adjusting the pixel-to-pixel projection intensity according to the surface reflectivity. First, three sets of orthogonal color fringe patterns and a sequence of uniform gray-level patterns with different gray levels are projected onto a measured surface by a projector. The patterns are deformed with respect to the object surface and captured by a camera from a different viewpoint. Subsequently, the optimal projection intensity at each pixel is determined by fusing different gray levels and transforming the camera pixel coordinate system into the projector pixel coordinate system. Finally, the adapted fringe patterns are created and used for 3D shape measurement. Experimental results on a flat checkerboard and shiny objects demonstrate that the proposed method can measure shiny surfaces with high accuracy.

  11. Parallel 3D Mortar Element Method for Adaptive Nonconforming Meshes

    NASA Technical Reports Server (NTRS)

    Feng, Huiyu; Mavriplis, Catherine; VanderWijngaart, Rob; Biswas, Rupak

    2004-01-01

    High order methods are frequently used in computational simulation for their high accuracy. An efficient way to avoid unnecessary computation in smooth regions of the solution is to use adaptive meshes which employ fine grids only in areas where they are needed. Nonconforming spectral elements allow the grid to be flexibly adjusted to satisfy the computational accuracy requirements. The method is suitable for computational simulations of unsteady problems with very disparate length scales or unsteady moving features, such as heat transfer, fluid dynamics or flame combustion. In this work, we select the Mortar Element Method (MEM) to handle the non-conforming interfaces between elements. A new technique is introduced to efficiently implement MEM in 3-D nonconforming meshes. By introducing an "intermediate mortar", the proposed method decomposes the projection between 3-D elements and mortars into two steps. In each step, projection matrices derived in 2-D are used. The two-step method avoids explicitly forming/deriving large projection matrices for 3-D meshes, and also helps to simplify the implementation. This new technique can be used for both h- and p-type adaptation. This method is applied to an unsteady 3-D moving heat source problem. With our new MEM implementation, mesh adaptation is able to efficiently refine the grid near the heat source and coarsen the grid once the heat source passes. The savings in computational work resulting from the dynamic mesh adaptation are demonstrated by the reduction in the number of elements used and the CPU time spent. MEM and mesh adaptation, respectively, bring irregularity and dynamics to the computer memory access pattern. Hence, they provide a good way to gauge the performance of computer systems when running scientific applications whose memory access patterns are irregular and unpredictable. We select a 3-D moving heat source problem as the Unstructured Adaptive (UA) grid benchmark, a new component of the NAS Parallel
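    The two-step idea can be illustrated generically: when a face-to-mortar projection factors as a tensor product of one-dimensional projection matrices, applying the small matrices in sequence reproduces the full projection without forming the large matrix. The sketch below shows this generic identity; it is not the paper's MEM code, and the sizes are arbitrary.

    ```python
    import numpy as np

    # Generic illustration of the two-step projection idea: the large Kronecker
    # matrix kron(P_xi, P_eta) is never formed; the two 1-D projections are
    # applied one direction at a time instead.
    rng = np.random.default_rng(5)
    n_elem, n_mortar = 6, 9                       # 1-D polynomial orders (illustrative)
    P_xi  = rng.normal(size=(n_mortar, n_elem))   # 1-D projection in the xi direction
    P_eta = rng.normal(size=(n_mortar, n_elem))   # 1-D projection in the eta direction
    U = rng.normal(size=(n_elem, n_elem))         # element face data

    # One-step: build the (n_mortar^2 x n_elem^2) matrix explicitly.
    full = np.kron(P_xi, P_eta) @ U.reshape(-1)

    # Two-step: sweep the small matrices one direction at a time.
    two_step = P_xi @ U @ P_eta.T

    print("max difference:", np.max(np.abs(full.reshape(n_mortar, n_mortar) - two_step)))
    ```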

  12. Accounting for adaptation and intensity in projecting heat wave-related mortality.

    PubMed

    Wang, Yan; Nordio, Francesco; Nairn, John; Zanobetti, Antonella; Schwartz, Joel D

    2018-02-01

    How adaptation and the intensity of heat waves affect heat wave-related mortality is unclear, making health projections difficult. We estimated the effect of heat waves, the effect of the intensity of heat waves, and adaptation on mortality in 209 U.S. cities with 168 million people during 1962-2006. We improved the standard time-series models by incorporating the intensity of heat waves using the excess heat factor (EHF) and estimating adaptation empirically using interactions with yearly mean summer temperature (MST). We combined the epidemiological estimates for heat wave, intensity, and adaptation with the Coupled Model Intercomparison Project Phase 5 (CMIP5) multi-model dataset to project heat wave-related mortality by 2050. The effect of heat waves increased with their intensity. Adaptation to heat waves occurred, which was shown by the decreasing effect of heat waves with MST. However, adaptation was lessened as MST increased. Ignoring adaptation in projections would result in a substantial overestimate of the projected heat wave-related mortality (by 277-747% in 2050). Incorporating the empirically estimated adaptation into projections would result in little change in the projected heat wave-related mortality between 2006 and 2050. This differs regionally, however, with increasing mortality over time for cities in the southern and western U.S. but decreasing mortality over time for the north. Accounting for adaptation is important to reduce bias in the projections of heat wave-related mortality. The finding that the southern and western U.S. are the areas that face increasing heat-related deaths is novel, and indicates that more regional adaptation strategies are needed. Copyright © 2017 Elsevier Inc. All rights reserved.
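    One common formulation of the excess heat factor (EHF) combines a significance index (3-day mean temperature minus a climatological 95th percentile) with an acclimatization index (3-day mean minus the mean of the preceding 30 days). The sketch below uses synthetic temperatures; the exact EHF variant used in the paper may differ.

    ```python
    import numpy as np

    # EHF = EHI_sig * max(1, EHI_accl), with EHI_sig the 3-day mean temperature
    # minus the climatological 95th percentile and EHI_accl the 3-day mean minus
    # the mean of the preceding 30 days. Temperature series is synthetic.
    rng = np.random.default_rng(6)
    t_mean = 22 + 6 * np.sin(np.linspace(0, 8 * np.pi, 2000)) + rng.normal(0, 2, 2000)
    t95 = np.percentile(t_mean, 95)               # climatological 95th percentile

    def excess_heat_factor(t, t95):
        ehf = np.full(t.size, np.nan)
        for i in range(30, t.size - 2):
            three_day = t[i:i + 3].mean()
            ehi_sig = three_day - t95
            ehi_accl = three_day - t[i - 30:i].mean()
            ehf[i] = ehi_sig * max(1.0, ehi_accl)
        return ehf

    ehf = excess_heat_factor(t_mean, t95)
    print("days with positive EHF:", int(np.sum(ehf > 0)))
    ```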

  13. The Computerized Adaptive Testing System Development Project.

    ERIC Educational Resources Information Center

    McBride, James R.; Sympson, J. B.

    The Computerized Adaptive Testing (CAT) project is a joint Armed Services coordinated effort to develop and evaluate a system for automated, adaptive administration of the Armed Services Vocational Aptitude Battery (ASVAB). The CAT is a system for administering personnel tests that differs from conventional test administration in two major…

  14. Climate project screening tool: an aid for climate change adaptation

    Treesearch

    Toni Lyn Morelli; Sharon Yeh; Nikola M. Smith; Mary Beth Hennessy; Constance I. Millar

    2012-01-01

    To address the impacts of climate change, land managers need techniques for incorporating adaptation into ongoing or impending projects. We present a new tool, the Climate Project Screening Tool (CPST), for integrating climate change considerations into project planning as well as for developing concrete adaptation options for land managers. We designed CPST as part of...

  15. Construction of robust prognostic predictors by using projective adaptive resonance theory as a gene filtering method.

    PubMed

    Takahashi, Hiro; Kobayashi, Takeshi; Honda, Hiroyuki

    2005-01-15

    To establish prognostic predictors of various diseases using DNA microarray analysis technology, it is desirable to select only the genes significant for constructing the prognostic model and to eliminate non-specific genes or genes with errors before model construction. We applied projective adaptive resonance theory (PART) to gene screening for DNA microarray data. Genes selected by PART were subjected to our FNN-SWEEP modeling method for the construction of a cancer class prediction model. The model performance was evaluated through comparison with a conventional signal-to-noise (S2N) screening method or the nearest shrunken centroids (NSC) method. The FNN-SWEEP predictor with PART screening could discriminate classes of acute leukemia in blinded data with 97.1% accuracy and classes of lung cancer with 90.0% accuracy, while the predictor with S2N achieved only 85.3 and 70.0%, and the predictor with NSC 88.2 and 90.0%, respectively. The results demonstrate that PART was superior for gene screening. The software is available upon request from the authors. honda@nubio.nagoya-u.ac.jp

  16. Adaptive Monte Carlo methods

    NASA Astrophysics Data System (ADS)

    Fasnacht, Marc

    We develop adaptive Monte Carlo methods for the calculation of the free energy as a function of a parameter of interest. The methods presented are particularly well-suited for systems with complex energy landscapes, where standard sampling techniques have difficulties. The Adaptive Histogram Method uses a biasing potential derived from histograms recorded during the simulation to achieve uniform sampling in the parameter of interest. The Adaptive Integration Method directly calculates an estimate of the free energy from the average derivative of the Hamiltonian with respect to the parameter of interest and uses it as a biasing potential. We compare both methods to a state-of-the-art method, and demonstrate that they compare favorably for the calculation of potentials of mean force of dense Lennard-Jones fluids. We use the Adaptive Integration Method to calculate accurate potentials of mean force for different types of simple particles in a Lennard-Jones fluid. Our approach allows us to separate the contributions of the solvent to the potential of mean force from the effect of the direct interaction between the particles. With contributions of the solvent determined, we can find the potential of mean force directly for any other direct interaction without additional simulations. We also test the accuracy of the Adaptive Integration Method on a thermodynamic cycle, which allows us to perform a consistency check between potentials of mean force and chemical potentials calculated using the Adaptive Integration Method. The results demonstrate a high degree of consistency of the method.
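    A highly simplified one-dimensional illustration of the adaptive-histogram idea is sketched below: a biasing potential accumulated from the visit histogram flattens sampling along the parameter of interest, and the free energy profile is then read off from the converged bias. The toy potential, bin layout, and update schedule are assumptions made here and are not the author's algorithms.

    ```python
    import numpy as np

    # Toy adaptive-histogram sampling of a 1-D double well (kT = 1 units): the
    # bias is periodically incremented by log(visit counts), which drives the
    # biased sampling toward uniform; F(x) is then estimated up to a constant.
    rng = np.random.default_rng(7)
    energy = lambda x: (x**2 - 1.0)**2 / 0.25     # double well, ~4 kT barrier
    edges = np.linspace(-1.6, 1.6, 33)
    bias = np.zeros(edges.size - 1)
    hist = np.zeros_like(bias)
    bin_of = lambda x: int(np.clip(np.digitize(x, edges) - 1, 0, bias.size - 1))

    x = 1.0
    for step in range(200_000):
        x_new = np.clip(x + rng.normal(0, 0.2), -1.6, 1.6)
        dU = (energy(x_new) + bias[bin_of(x_new)]) - (energy(x) + bias[bin_of(x)])
        if dU <= 0 or rng.random() < np.exp(-dU):
            x = x_new
        hist[bin_of(x)] += 1
        if (step + 1) % 20_000 == 0:              # fold the histogram into the bias
            bias += np.log(np.maximum(hist, 1.0))
            bias -= bias.min()
            hist[:] = 0

    free_energy = bias.max() - bias               # rough F(x) estimate, zeroed at its minimum
    print("rough barrier estimate (kT):", round(float(free_energy[bin_of(0.0)]), 2))
    ```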

  17. "Intelligent Ensemble" Projections of Precipitation and Surface Radiation in Support of Agricultural Climate Change Adaptation

    NASA Technical Reports Server (NTRS)

    Taylor, Patrick C.; Baker, Noel C.

    2015-01-01

    Earth's climate is changing and will continue to change into the foreseeable future. Expected changes in the climatological distribution of precipitation, surface temperature, and surface solar radiation will significantly impact agriculture. Adaptation strategies are, therefore, required to reduce the agricultural impacts of climate change. Climate change projections of precipitation, surface temperature, and surface solar radiation distributions are necessary input for adaptation planning studies. These projections are conventionally constructed from an ensemble of climate model simulations (e.g., the Coupled Model Intercomparison Project 5 (CMIP5)) as an equal-weighted average, one model one vote. Each climate model, however, represents the array of climate-relevant physical processes with varying degrees of fidelity, influencing the projection of individual climate variables differently. Presented here is a new approach, termed the "Intelligent Ensemble," that constructs climate variable projections by weighting each model according to its ability to represent key physical processes, e.g., the precipitation probability distribution. This approach provides added value over the equal-weighted average method. Physical process metrics applied in the "Intelligent Ensemble" method are created using a combination of NASA and NOAA satellite and surface-based cloud, radiation, temperature, and precipitation data sets. The "Intelligent Ensemble" method is applied to the RCP4.5 and RCP8.5 anthropogenic climate forcing simulations within the CMIP5 archive to develop a set of climate change scenarios for precipitation, temperature, and surface solar radiation in each USDA Farm Resource Region for use in climate change adaptation studies.
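    The weighting idea can be sketched as follows: score each model's present-day simulation of a process metric against observations and weight its projection accordingly, instead of one model one vote. The inverse-RMSE skill score and all numbers below are illustrative assumptions, not the study's metrics.

    ```python
    import numpy as np

    # Skill-weighted ensemble versus equal weighting. "Skill" here is the inverse
    # RMSE of each model's simulated precipitation quantile curve against an
    # observational reference; data and projected changes are synthetic.
    rng = np.random.default_rng(8)
    obs = rng.gamma(2.0, 2.0, 5000)                      # observed daily precipitation (mm)
    models = {f"model_{i}": rng.gamma(2.0, s, 5000)      # simulated present-day climates
              for i, s in enumerate((1.6, 2.0, 2.4, 3.0))}
    projections = {name: 4.0 + 0.5 * i for i, name in enumerate(models)}  # e.g. % change by 2100

    q = np.linspace(1, 99, 99)
    def skill(sim):                                       # inverse RMSE of the quantile curve
        return 1.0 / np.sqrt(np.mean((np.percentile(sim, q) - np.percentile(obs, q))**2))

    w = np.array([skill(m) for m in models.values()])
    w /= w.sum()
    proj = np.array(list(projections.values()))
    print("equal-weight projection  :", proj.mean())
    print("skill-weighted projection:", float(w @ proj))
    ```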

  18. Los Angeles County Metropolitan Transportation Authority climate change adaptation pilot project report.

    DOT National Transportation Integrated Search

    2013-08-01

    This Climate Change Adaptation Pilot Project Report details the project background of the recently completed Los Angeles County Metropolitan Transportation Authority (Metro) Transit Climate Change Adaptation Pilot Project as well as the various wor...

  19. Building Knowledge in the Workplace and Beyond. Curriculum Adaptation Project.

    ERIC Educational Resources Information Center

    Ballinger, Ronda

    A project was conducted to adapt and modify the four-part workplace literacy curriculum previously created by the College of Lake County (Illinois) and six industries in the county in order to improve the usefulness and application of the information in the original curriculum. Information for the adaptation project was generated by instructors…

  20. Optimal Couple Projections for Domain Adaptive Sparse Representation-based Classification.

    PubMed

    Zhang, Guoqing; Sun, Huaijiang; Porikli, Fatih; Liu, Yazhou; Sun, Quansen

    2017-08-29

    In recent years, sparse representation based classification (SRC) has been one of the most successful methods and has shown impressive performance in various classification tasks. However, when the training data has a different distribution than the testing data, the learned sparse representation may not be optimal, and the performance of SRC will be degraded significantly. To address this problem, in this paper we propose an optimal couple projections for domain-adaptive sparse representation-based classification (OCPD-SRC) method, in which the discriminative features of data in the two domains are simultaneously learned with a dictionary that can succinctly represent the training and testing data in the projected space. OCPD-SRC is designed based on the decision rule of SRC, with the objective of learning coupled projection matrices and a common discriminative dictionary such that the between-class sparse reconstruction residuals of data from both domains are maximized, and the within-class sparse reconstruction residuals of data are minimized, in the projected low-dimensional space. Thus, the resulting representations can well fit SRC and simultaneously have better discriminant ability. In addition, our method can be easily extended to multiple domains and can be kernelized to deal with the nonlinear structure of data. The optimal solution for the proposed method can be efficiently obtained via an alternating optimization procedure. Extensive experimental results on a series of benchmark databases show that our method is better than or comparable to many state-of-the-art methods.
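    For context, the baseline SRC decision rule that OCPD-SRC builds on is sketched below, with a simple ISTA solver for the l1-regularized coding step; the learned coupled projections and shared dictionary of OCPD-SRC are not reproduced here. Data dimensions and the regularization weight are arbitrary.

    ```python
    import numpy as np

    # Base SRC: sparse-code the test sample over the training dictionary, then
    # assign the class whose columns give the smallest reconstruction residual.
    rng = np.random.default_rng(9)

    def ista(D, x, lam=0.05, n_iter=300):
        """Minimize 0.5*||x - D a||^2 + lam*||a||_1 by iterative soft-thresholding."""
        L = np.linalg.norm(D, 2) ** 2                    # Lipschitz constant of the gradient
        a = np.zeros(D.shape[1])
        for _ in range(n_iter):
            g = a + D.T @ (x - D @ a) / L
            a = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)
        return a

    def src_predict(D, labels, x):
        a = ista(D, x)
        residuals = {c: np.linalg.norm(x - D[:, labels == c] @ a[labels == c])
                     for c in np.unique(labels)}
        return min(residuals, key=residuals.get)

    # Toy data: two classes living near different 3-D subspaces of a 20-D space.
    bases = [np.linalg.qr(rng.normal(size=(20, 3)))[0] for _ in range(2)]
    D = np.hstack([b @ rng.normal(size=(3, 15)) for b in bases])          # training dictionary
    D /= np.linalg.norm(D, axis=0)
    labels = np.repeat([0, 1], 15)
    test = bases[1] @ rng.normal(size=3) + 0.05 * rng.normal(size=20)     # sample from class 1
    print("predicted class:", src_predict(D, labels, test))
    ```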

  1. Combining Adaptive Hypermedia with Project and Case-Based Learning

    ERIC Educational Resources Information Center

    Papanikolaou, Kyparisia; Grigoriadou, Maria

    2009-01-01

    In this article we investigate the design of educational hypermedia based on constructivist learning theories. According to the principles of project- and case-based learning, we present the design rationale of an Adaptive Educational Hypermedia system prototype named MyProject; learners working with MyProject undertake a project and the system…

  2. Projection Operator: A Step Towards Certification of Adaptive Controllers

    NASA Technical Reports Server (NTRS)

    Larchev, Gregory V.; Campbell, Stefan F.; Kaneshige, John T.

    2010-01-01

    One of the major barriers to wider use of adaptive controllers in commercial aviation is the lack of appropriate certification procedures. In order to be certified by the Federal Aviation Administration (FAA), an aircraft controller is expected to meet a set of guidelines on functionality and reliability while not negatively impacting other systems or safety of aircraft operations. Due to their inherent time-variant and non-linear behavior, adaptive controllers cannot be certified via the metrics used for linear conventional controllers, such as gain and phase margin. Projection Operator is a robustness augmentation technique that bounds the output of a non-linear adaptive controller while conforming to the Lyapunov stability rules. It can also be used to limit the control authority of the adaptive component so that the said control authority can be arbitrarily close to that of a linear controller. In this paper we will present the results of applying the Projection Operator to a Model-Reference Adaptive Controller (MRAC), varying the amount of control authority, and comparing the controller's performance and stability characteristics with those of a linear controller. We will also show how adjusting Projection Operator parameters can make it easier for the controller to satisfy the certification guidelines by enabling a tradeoff between the controller's performance and robustness.
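    A sketch of the standard vector projection operator often used to keep adaptive parameters inside a convex set (here a ball), with a toy loop showing the bound being respected. The convex-function form and constants follow one common textbook choice and are not necessarily the exact formulation used in this paper.

    ```python
    import numpy as np

    # Projection operator for bounded adaptation: theta_dot = proj(theta, y)
    # leaves y unchanged inside the set and removes the outward component of y
    # near the boundary of the ball of radius theta_max.

    def proj(theta, y, theta_max=2.0, eps=0.1):
        f = ((1.0 + eps) * theta @ theta - theta_max**2) / (eps * theta_max**2)
        grad = 2.0 * (1.0 + eps) * theta / (eps * theta_max**2)
        if f > 0.0 and y @ grad > 0.0:
            return y - grad * (grad @ y) * f / (grad @ grad)
        return y

    # Toy adaptation loop with an update that would otherwise grow without bound.
    theta = np.zeros(2)
    dt = 0.01
    for _ in range(5000):
        raw_update = np.array([3.0, 1.5])        # stand-in for the unmodified adaptive law
        theta = theta + dt * proj(theta, raw_update)

    print("||theta|| =", round(float(np.linalg.norm(theta)), 3), "(bounded near theta_max = 2.0)")
    ```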

  3. Lessons learned applying CASE methods/tools to Ada software development projects

    NASA Technical Reports Server (NTRS)

    Blumberg, Maurice H.; Randall, Richard L.

    1993-01-01

    This paper describes the lessons learned from introducing CASE methods/tools into organizations and applying them to actual Ada software development projects. This paper will be useful to any organization planning to introduce a software engineering environment (SEE) or evolving an existing one. It contains management level lessons learned, as well as lessons learned in using specific SEE tools/methods. The experiences presented are from Alpha Test projects established under the STARS (Software Technology for Adaptable and Reliable Systems) project. They reflect the front end efforts by those projects to understand the tools/methods, initial experiences in their introduction and use, and later experiences in the use of specific tools/methods and the introduction of new ones.

  4. Context-Specific Adaptation of Gravity-Dependent Vestibular Reflex Responses (NSBRI Neurovestibular Project 1)

    NASA Technical Reports Server (NTRS)

    Shelhamer, Mark; Goldberg, Jefim; Minor, Lloyd B.; Paloski, William H.; Young, Laurence R.; Zee, David S.

    1999-01-01

    Impairment of gaze and head stabilization reflexes can lead to disorientation and reduced performance in sensorimotor tasks such as piloting of spacecraft. Transitions between different gravitoinertial force (gif) environments - as during different phases of space flight - provide an extreme test of the adaptive capabilities of these mechanisms. We wish to determine to what extent the sensorimotor skills acquired in one gravity environment will transfer to others, and to what extent gravity serves as a context cue for inhibiting such transfer. We use the general approach of adapting a response (saccades, vestibuloocular reflex: VOR, or vestibulocollic reflex: VCR) to a particular change in gain or phase in one gif condition, adapting to a different gain or phase in a second gif condition, and then seeing if gif itself - the context cue - can recall the previously-learned adapted responses. Previous evidence indicates that unless there is specific training to induce context-specificity, reflex adaptation is sequential rather than simultaneous. Various experiments in this project investigate the behavioral properties, neurophysiological basis, and anatomical substrate of context-specific learning, using otolith (gravity) signals as a context cue. In the following, we outline the methods for all experiments in this project, and provide details and results on selected experiments.

  5. Inexact adaptive Newton methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertiger, W.I.; Kelsey, F.J.

    1985-02-01

    The Inexact Adaptive Newton method (IAN) is a modification of the Adaptive Implicit Method (AIM) with improved Newton convergence. Both methods simplify the Jacobian at each time step by zeroing coefficients in regions where saturations are changing slowly. The methods differ in how the diagonal block terms are treated. On test problems with up to 3,000 cells, IAN consistently saves approximately 30% of the CPU time when compared to the fully implicit method. AIM shows similar savings on some problems, but takes as much CPU time as fully implicit on other test problems due to poor Newton convergence.

  6. Parallel, adaptive finite element methods for conservation laws

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Devine, Karen D.; Flaherty, Joseph E.

    1994-01-01

    We construct parallel finite element methods for the solution of hyperbolic conservation laws in one and two dimensions. Spatial discretization is performed by a discontinuous Galerkin finite element method using a basis of piecewise Legendre polynomials. Temporal discretization utilizes a Runge-Kutta method. Dissipative fluxes and projection limiting prevent oscillations near solution discontinuities. A posteriori estimates of spatial errors are obtained by a p-refinement technique using superconvergence at Radau points. The resulting method is of high order and may be parallelized efficiently on MIMD computers. We compare results using different limiting schemes and demonstrate parallel efficiency through computations on an NCUBE/2 hypercube. We also present results using adaptive h- and p-refinement to reduce the computational cost of the method.

  7. Adaptive eigenspace method for inverse scattering problems in the frequency domain

    NASA Astrophysics Data System (ADS)

    Grote, Marcus J.; Kray, Marie; Nahum, Uri

    2017-02-01

    A nonlinear optimization method is proposed for the solution of inverse scattering problems in the frequency domain, when the scattered field is governed by the Helmholtz equation. The time-harmonic inverse medium problem is formulated as a PDE-constrained optimization problem and solved by an inexact truncated Newton-type iteration. Instead of a grid-based discrete representation, the unknown wave speed is projected to a particular finite-dimensional basis of eigenfunctions, which is iteratively adapted during the optimization. Truncating the adaptive eigenspace (AE) basis at a (small and slowly increasing) finite number of eigenfunctions effectively introduces regularization into the inversion and thus avoids the need for standard Tikhonov-type regularization. Both analytical and numerical evidence underpins the accuracy of the AE representation. Numerical experiments demonstrate the efficiency and robustness to missing or noisy data of the resulting adaptive eigenspace inversion method.

  8. Adaptive Knowledge Management of Project-Based Learning

    ERIC Educational Resources Information Center

    Tilchin, Oleg; Kittany, Mohamed

    2016-01-01

    The goal of an approach to Adaptive Knowledge Management (AKM) of project-based learning (PBL) is to intensify subject study through guiding, inducing, and facilitating the development of knowledge, accountability skills, and collaborative skills of students. Knowledge development is attained by knowledge acquisition, knowledge sharing, and knowledge…

  9. Advances in Adaptive Control Methods

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan

    2009-01-01

    This poster presentation describes recent advances in adaptive control technology developed by NASA. Optimal Control Modification is a novel adaptive law that can improve performance and robustness of adaptive control systems. A new technique has been developed to provide an analytical method for computing time delay stability margin for adaptive control systems.

  10. The Colorado Climate Preparedness Project: A Systematic Approach to Assessing Efforts Supporting State-Level Adaptation

    NASA Astrophysics Data System (ADS)

    Klein, R.; Gordon, E.

    2010-12-01

    Scholars and policy analysts often contend that an effective climate adaptation strategy must entail "mainstreaming," or incorporating responses to possible climate impacts into existing planning and management decision frameworks. Such an approach, however, makes it difficult to assess the degree to which decisionmaking entities are engaging in adaptive activities that may or may not be explicitly framed around a changing climate. For example, a drought management plan may not explicitly address climate change, but the activities and strategies outlined in it may reduce vulnerabilities posed by a variable and changing climate. Consequently, generating a strategic climate adaptation plan requires identifying the entire suite of activities that are implicitly linked to climate and may affect adaptive capacity within the system. Here we outline a novel, two-pronged approach, leveraging social science methods, to understanding adaptation throughout state government in Colorado. First, we conducted a series of interviews with key actors in state and federal government agencies, non-governmental organizations, universities, and other entities engaged in state issues. The purpose of these interviews was to elicit information about current activities that may affect the state’s adaptive capacity and to identify future climate-related needs across the state. Second, we have developed an interactive database cataloging organizations, products, projects, and people actively engaged in adaptive planning and policymaking that are relevant to the state of Colorado. The database includes a wiki interface, helping create a dynamic component that will enable frequent updating as climate-relevant information emerges. The results of this project are intended to paint a clear picture of sectors and agencies with higher and lower levels of adaptation awareness and to provide a roadmap for the next gubernatorial administration to pursue a more sophisticated climate adaptation agenda

  11. An adaptive angle-doppler compensation method for airborne bistatic radar based on PAST

    NASA Astrophysics Data System (ADS)

    Hang, Xu; Jun, Zhao

    2018-05-01

    Adaptive angle-Doppler compensation methods extract the requisite information adaptively from the data itself, thus avoiding the performance degradation caused by inertial system errors. However, such methods require estimation and eigendecomposition of the sample covariance matrix, which has a high computational complexity and limits real-time application. In this paper, an adaptive angle-Doppler compensation method based on projection approximation subspace tracking (PAST) is studied. The method uses cyclic iterative processing to quickly estimate the position of the spectral center of the maximum eigenvector of each range cell, so that the computational burden of covariance matrix estimation and eigendecomposition is avoided; the spectral centers of all range cells are then aligned by two-dimensional compensation. Simulation results show that the proposed method can effectively reduce the non-homogeneity of airborne bistatic radar, and that its performance is similar to that of eigendecomposition-based algorithms while the computational load is clearly reduced and the method is easy to implement.
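    The basic PAST recursion that such a tracker relies on can be sketched as follows; the compensation scheme built on top of it is not reproduced. Array size, subspace dimension, forgetting factor, and noise level are illustrative.

    ```python
    import numpy as np

    # Basic PAST recursion (projection approximation subspace tracking) for the
    # dominant signal subspace of complex snapshot data.
    rng = np.random.default_rng(10)
    n, r, beta = 8, 2, 0.97                        # sensors, subspace dim, forgetting factor
    A = np.linalg.qr(rng.normal(size=(n, r)) + 1j * rng.normal(size=(n, r)))[0]

    W = np.eye(n, r, dtype=complex)                # subspace estimate
    P = np.eye(r, dtype=complex)
    for _ in range(2000):
        s = rng.normal(size=r) + 1j * rng.normal(size=r)
        x = A @ s + 0.1 * (rng.normal(size=n) + 1j * rng.normal(size=n))
        y = W.conj().T @ x
        h = P @ y
        g = h / (beta + y.conj() @ h)
        P = (P - np.outer(g, h.conj())) / beta
        W = W + np.outer(x - W @ y, g.conj())

    # Check: residual of the true basis after projecting onto the tracked subspace.
    Q, _ = np.linalg.qr(W)
    print("subspace error:", float(np.linalg.norm(A - Q @ (Q.conj().T @ A))))
    ```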

  12. Addressing Climate Change Mitigation and Adaptation Together: A Global Assessment of Agriculture and Forestry Projects.

    PubMed

    Kongsager, Rico; Locatelli, Bruno; Chazarin, Florie

    2016-02-01

    Adaptation and mitigation share the ultimate purpose of reducing climate change impacts. However, they tend to be considered separately in projects and policies because of their different objectives and scales. Agriculture and forestry are related to both adaptation and mitigation: they contribute to greenhouse gas emissions and removals, are vulnerable to climate variations, and form part of adaptive strategies for rural livelihoods. We assessed how climate change project design documents (PDDs) considered a joint contribution to adaptation and mitigation in forestry and agriculture in the tropics, by analyzing 201 PDDs from adaptation funds, mitigation instruments, and project standards [e.g., climate community and biodiversity (CCB)]. We analyzed whether PDDs established for one goal reported an explicit contribution to the other (i.e., whether mitigation PDDs contributed to adaptation and vice versa). We also examined whether the proposed activities or expected outcomes allowed for potential contributions to the two goals. Despite the separation between the two goals in international and national institutions, 37% of the PDDs explicitly mentioned a contribution to the other objective, although only half of those substantiated it. In addition, most adaptation PDDs (90%) and all mitigation PDDs could potentially report at least a partial contribution to the other goal. Some adaptation project developers were interested in mitigation for the prospect of carbon funding, whereas mitigation project developers integrated adaptation to achieve greater long-term sustainability or to attain CCB certification. International and national institutions can provide incentives for projects to harness synergies and avoid trade-offs between adaptation and mitigation.

  13. Robust Optimal Adaptive Control Method with Large Adaptive Gain

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan T.

    2009-01-01

    In the presence of large uncertainties, a control system needs to be able to adapt rapidly to regain performance. Fast adaptation refers to the implementation of adaptive control with a large adaptive gain to reduce the tracking error rapidly. However, a large adaptive gain can lead to high-frequency oscillations which can adversely affect robustness of an adaptive control law. A new adaptive control modification is presented that can achieve robust adaptation with a large adaptive gain without incurring high-frequency oscillations as with the standard model-reference adaptive control. The modification is based on the minimization of the L2 norm of the tracking error, which is formulated as an optimal control problem. The optimality condition is used to derive the modification using the gradient method. The optimal control modification results in a stable adaptation and allows a large adaptive gain to be used for better tracking while providing sufficient stability robustness. Simulations were conducted for a damaged generic transport aircraft with both standard adaptive control and the adaptive optimal control modification technique. The results demonstrate the effectiveness of the proposed modification in tracking a reference model while maintaining a sufficient time delay margin.

  14. Neurology diagnostics security and terminal adaptation for PocketNeuro project.

    PubMed

    Chemak, C; Bouhlel, M-S; Lapayre, J-C

    2008-09-01

    This paper presents new approaches to medical information security and mobile phone terminal adaptation for the PocketNeuro project. The latter term refers to a project created for the management of neurological diseases. It consists of transmitting information about patients ("desk of patients") to a doctor's mobile phone during a visit and examination of a patient. These new approaches for the PocketNeuro project were analyzed in terms of medical information security and adaptation of the diagnostic images to the doctor's mobile phone. Images were extracted from a DICOM library. Matlab and its library were used as software to test our approaches and to validate our results. Experiments performed on a database of 30 neuronal medical images of 256 x 256 pixels indicated that our new approaches for the PocketNeuro project are valid and support plans for large-scale studies between French and Swiss hospitals using secured connections.

  15. Adaptive quantum computation in changing environments using projective simulation

    NASA Astrophysics Data System (ADS)

    Tiersch, M.; Ganahl, E. J.; Briegel, H. J.

    2015-08-01

    Quantum information processing devices need to be robust and stable against external noise and internal imperfections to ensure correct operation. In a setting of measurement-based quantum computation, we explore how an intelligent agent endowed with a projective simulator can act as controller to adapt measurement directions to an external stray field of unknown magnitude in a fixed direction. We assess the agent’s learning behavior in static and time-varying fields and explore composition strategies in the projective simulator to improve the agent’s performance. We demonstrate the applicability by correcting for stray fields in a measurement-based algorithm for Grover’s search. Thereby, we lay out a path for adaptive controllers based on intelligent agents for quantum information tasks.

  16. Adaptive quantum computation in changing environments using projective simulation

    PubMed Central

    Tiersch, M.; Ganahl, E. J.; Briegel, H. J.

    2015-01-01

    Quantum information processing devices need to be robust and stable against external noise and internal imperfections to ensure correct operation. In a setting of measurement-based quantum computation, we explore how an intelligent agent endowed with a projective simulator can act as controller to adapt measurement directions to an external stray field of unknown magnitude in a fixed direction. We assess the agent’s learning behavior in static and time-varying fields and explore composition strategies in the projective simulator to improve the agent’s performance. We demonstrate the applicability by correcting for stray fields in a measurement-based algorithm for Grover’s search. Thereby, we lay out a path for adaptive controllers based on intelligent agents for quantum information tasks. PMID:26260263

  17. Promoting Adoption and Adaptation. A Handbook for Teacher Corps Projects.

    ERIC Educational Resources Information Center

    Center for New Schools, Inc., Chicago, IL.

    This handbook was designed to assist local Teacher Corps projects to plan and implement the Teacher Corps' "Fourth Outcome": the adoption or adaptation of the project's educational improvement activities by other educational agencies and institutions. Section I provides an overview of seven scenarios which might be applicable to local…

  18. [A Method for Selecting Self-Adaptive Chromaticity of the Projected Markers].

    PubMed

    Zhao, Shou-bo; Zhang, Fu-min; Qu, Xing-hua; Zheng, Shi-wei; Chen, Zhe

    2015-04-01

    The authors designed a self-adaptive projection system composed of a color camera, a projector, and a PC. In this system, a digital micro-mirror device (DMD) acting as the spatial light modulator of the projector is introduced into the optical path to modulate the illumination spectrum produced by red, green, and blue light emitting diodes (LED). However, the color visibility of the active markers is also affected by the screen, whose reflective spectrum is unknown. Here the active markers are a projected spot array, and the chromaticity feature of the markers is sometimes submerged when the screen has a similar spectrum. In order to enhance the color visibility of the active markers relative to the screen, a method for selecting self-adaptive chromaticity of the projected markers in 3D scanning metrology is described. A color camera with only three channels limits the accuracy of device characterization, so to achieve interconversion between the device-independent and device-dependent color spaces, a high-dimensional linear model of the reflective spectrum was built. Prior training samples provide additional constraints that yield a high-dimensional linear model with more than three degrees of freedom. Meanwhile, the spectral power distribution of the ambient light was estimated. Subsequently, the markers' chromaticity in the CIE color space was selected by maximizing the Euclidean distance from the screen, and the corresponding RGB setting values were estimated via the inverse transform. Finally, we implemented a typical experiment to show the performance of the proposed approach. A 24-patch Munsell Color Checker was used as the projection screen, and the color difference in chromaticity coordinates between the active marker and the color patch was used to evaluate the color visibility of the active markers relative to the screen. A comparison between the self-adaptive projection system and a traditional diode-laser light projector is presented and discussed to highlight the advantage of the proposed method.
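    The selection principle reduces to a simple rule that can be sketched directly: among candidate marker chromaticities, pick the one farthest in Euclidean distance from the estimated screen chromaticity. The CIE-xy coordinates below are illustrative; the actual system first estimates the screen spectrum and ambient light.

    ```python
    import numpy as np

    # Pick the candidate marker chromaticity farthest from the estimated screen
    # chromaticity in CIE-xy space. All coordinates are illustrative values.
    screen_xy = np.array([0.38, 0.40])                      # estimated screen chromaticity
    candidates = {
        "red":   np.array([0.64, 0.33]),
        "green": np.array([0.30, 0.60]),
        "blue":  np.array([0.15, 0.06]),
        "cyan":  np.array([0.22, 0.33]),
    }
    best = max(candidates, key=lambda k: np.linalg.norm(candidates[k] - screen_xy))
    print("marker chromaticity with best visibility:", best)
    ```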

  19. An Adaptation of the Distance Driven Projection Method for Single Pinhole Collimators in SPECT Imaging

    NASA Astrophysics Data System (ADS)

    Ihsani, Alvin; Farncombe, Troy

    2016-02-01

    The modelling of the projection operator in tomographic imaging is of critical importance, especially when working with algebraic methods of image reconstruction. This paper proposes a distance-driven projection method targeted to single-pinhole single-photon emission computed tomography (SPECT) imaging, since it accounts for the finite size of the pinhole and the possible tilting of the detector surface, in addition to other collimator-specific factors such as geometric sensitivity. The accuracy and execution time of the proposed method are evaluated by comparing to a ray-driven approach in which the pinhole is sub-sampled with various sampling schemes. A point-source phantom whose projections were generated using OpenGATE was first used to compare the resolution of reconstructed images with each method using the full width at half maximum (FWHM). Furthermore, a high-activity Mini Deluxe Phantom (Data Spectrum Corp., Durham, NC, USA) SPECT resolution phantom was scanned using a Gamma Medica X-SPECT system and the signal-to-noise ratio (SNR) and structural similarity of reconstructed images were compared at various projection counts. Based on the reconstructed point-source phantom, the proposed distance-driven approach results in a lower FWHM than the ray-driven approach even when using a smaller detector resolution. Furthermore, based on the Mini Deluxe Phantom, it is shown that the distance-driven approach has consistently higher SNR and structural similarity compared to the ray-driven approach as the number of counts in the measured projections decreases.

  20. A framework for identifying tailored subsets of climate projections for impact and adaptation studies

    NASA Astrophysics Data System (ADS)

    Vidal, Jean-Philippe; Hingray, Benoît

    2014-05-01

    uncertainty associated with this modelling step. Besides, the climate projection dataset available for a given study has several characteristics that will heavily condition the type of conclusions that can be reached. Indeed, the dataset at hand may or may not sample different types of uncertainty (socio-economic, structural, parametric, along with internal variability). Moreover, these types are present at different steps in the well-known cascade of uncertainty, from the emission / concentration scenarios and the global climate to the regional-to-local climate. Critical choices for the selection are therefore conditioned on all features above. The type of selection (picking out, culling, or statistical sampling) is closely related to the study objectives and the uncertainty types present in the dataset. Moreover, grounds for picking out or culling projections may stem from global, regional or feature-specific present-day performance, representativeness, or covered range. An example use of this framework is a hierarchical selection for 3 classes of impact models among 3000 transient climate projections from different runs of 4 GCMs, statistically downscaled by 3 probabilistic methods, and made available for an integrated water resource adaptation study in the Durance catchment (southern French Alps). This work is part of the GICC R2D2-2050 project (Risk, water Resources and sustainable Development of the Durance catchment in 2050) and the EU FP7 COMPLEX project (Knowledge Based Climate Mitigation Systems for a Low Carbon Economy).

  1. Large Scale Analyses and Visualization of Adaptive Amino Acid Changes Projects.

    PubMed

    Vázquez, Noé; Vieira, Cristina P; Amorim, Bárbara S R; Torres, André; López-Fernández, Hugo; Fdez-Riverola, Florentino; Sousa, José L R; Reboiro-Jato, Miguel; Vieira, Jorge

    2018-03-01

    When changes at a few amino acid sites are the target of selection, adaptive amino acid changes in protein sequences can be identified using maximum-likelihood methods based on models of codon substitution (such as codeml). Although such methods have been employed numerous times using a variety of different organisms, the time needed to collect the data and prepare the input files means that usually only tens or hundreds of coding regions are analyzed. Nevertheless, the recent availability of flexible and easy-to-use computer applications that collect relevant data (such as BDBM) and infer positively selected amino acid sites (such as ADOPS) means that the entire process is easier and quicker than before. However, the lack of a batch option in ADOPS, here reported, still precludes the analysis of hundreds or thousands of sequence files. Given the interest and possibility of running such large-scale projects, we have also developed a database where ADOPS projects can be stored. Therefore, this study also presents the B+ database, which is both a data repository and a convenient interface that looks at the information contained in ADOPS projects without the need to download and unzip the corresponding ADOPS project file. The ADOPS projects available at B+ can also be downloaded, unzipped, and opened using the ADOPS graphical interface. The availability of such a database ensures results repeatability, promotes data reuse with significant savings on the time needed for preparing datasets, and effortlessly allows further exploration of the data contained in ADOPS projects.

  2. An analysis of European riverine flood risk and adaptation measures under projected climate change

    NASA Astrophysics Data System (ADS)

    Bouwer, Laurens; Burzel, Andreas; Holz, Friederike; Winsemius, Hessel; de Bruijn, Karin

    2015-04-01

    There is an increasing need to assess costs and benefits of adaptation at scales beyond the river basin. In Europe, such estimates are required at the European scale in order to set priorities for action and financing, for instance in the context of the EU Adaptation Strategy. The goal of this work as part of the FP7 BASE project is to develop a flood impact model that can be applied at a Pan-European scale and that is able to project changes in flood risk due to climate change and socio-economic developments, and costs of adaptation. For this research, we build upon the global flood hazard estimation method developed by Winsemius et al. (Hydrology and Earth System Sciences, 2013), which produces flood inundation maps at different return periods, for present day (EU WATCH) and future climate (IPCC scenarios RCP4.5 and 8.5, for five climate models). These maps are used for the assessment of flood impacts. We developed and tested a model for assessing direct economic flood damages by using large scale land use maps. We characterise vulnerable land use functions, in particular residential, commercial, industrial, infrastructure and agriculture, using depth-damage relationships. Furthermore, we apply up to NUTS3 level information on Gross Domestic Product, which is used as a proxy for relative differences in maximum damage values between different areas. Next, we test two adaptation measures, by adjusting flood protection levels and adjusting damage functions. The results show the projected changes in flood risk in the future. For example, on NUTS2 level, flood risk increases in some regions by up to 179% (between the baseline scenario 1960-1999 and time slice 2010-2049). At the country level there are increases of up to 60% for selected climate models. The conference presentation will show the most relevant improvements in damage modelling on the continental scale, and results of the analysis of adaptation measures. The results will be critically discussed under the aspect of major

  3. Network models for solving the problem of multicriterial adaptive optimization of investment projects control with several acceptable technologies

    NASA Astrophysics Data System (ADS)

    Shorikov, A. F.; Butsenko, E. V.

    2017-10-01

    This paper discusses the problem of multicriterial adaptive optimization of investment project control in the presence of several acceptable technologies. On the basis of network modeling, a new economic and mathematical model and a method for solving this problem are proposed. Network economic and mathematical modeling makes it possible to determine the optimal time and calendar schedule for implementing an investment project and serves as an instrument for increasing the economic potential and competitiveness of the enterprise. A meaningful practical example shows the process of forming network models, including the definition of the sequence of actions of a particular investment-projecting process and the construction of network-based work schedules. The parameters of the network models are calculated. Optimal (critical) paths are formed and the optimal time for implementing the chosen technologies of the investment project is calculated, as illustrated in the sketch below. The selection of the optimal technology from a set of possible technologies for project implementation, taking into account the time and cost of the work, is also shown. The proposed model and method for solving the problem of managing investment projects can serve as a basis for the development, creation and application of appropriate computer information systems to support managerial decision-making.
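
    The critical-path calculation described above can be illustrated with a minimal sketch of the classical critical path method (a forward and a backward pass over the activity network). The activity names and durations below are hypothetical and do not come from the paper; comparing one network per candidate technology would mirror the technology-selection step.

    ```python
    # Minimal critical path method (CPM) sketch for a network model of project
    # activities. The activity graph and durations are hypothetical.
    def critical_path(activities):
        """activities: {name: (duration, [predecessors])} -> (makespan, critical activities)."""
        order, placed = [], set()
        while len(order) < len(activities):           # simple topological ordering
            for name, (_, preds) in activities.items():
                if name not in placed and all(p in placed for p in preds):
                    order.append(name)
                    placed.add(name)
        earliest = {}
        for name in order:                            # forward pass: earliest finish times
            dur, preds = activities[name]
            earliest[name] = dur + max((earliest[p] for p in preds), default=0.0)
        makespan = max(earliest.values())
        latest = {}
        for name in reversed(order):                  # backward pass: latest finish times
            succs = [s for s, (_, preds) in activities.items() if name in preds]
            latest[name] = min((latest[s] - activities[s][0] for s in succs), default=makespan)
        critical = [n for n in order if abs(earliest[n] - latest[n]) < 1e-9]
        return makespan, critical

    acts = {"design": (3, []), "procure": (5, ["design"]),
            "build": (7, ["design"]), "test": (2, ["procure", "build"])}
    print(critical_path(acts))   # -> (12.0, ['design', 'build', 'test'])
    ```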

  4. Project power: Adapting an evidence-based HIV/STI prevention intervention for incarcerated women.

    PubMed

    Fasula, Amy M; Fogel, Catherine I; Gelaude, Deborah; Carry, Monique; Gaiter, Juarlyn; Parker, Sharon

    2013-06-01

    Incarcerated women are a critical population for targeted HIV/STI prevention programming; however, there is a dearth of evidence-based, gender-specific behavioral interventions for this population. Systematically adapting existing evidence-based interventions (EBIs) can help fill this gap. We illustrate the adaptation of the HIV/STI prevention EBI, Project Safe, for use among incarcerated women and delivery in prisons. Project POWER, the final adapted intervention, was developed using formative research with prison staff and administration, incarcerated and previously incarcerated women, and input from community advisory boards. Intervention delivery adaptations included: shorter, more frequent intervention sessions; booster sessions prior to and just after release; facilitator experience in prisons and counseling; and new videos. Intervention content adaptations addressed issues of empowerment, substance use, gender and power inequity in relationships, interpersonal violence, mental health, reentry, and social support. This illustration of the adaptation process provides information to guide additional efforts to adapt EBIs for this underserved population.

  5. Successful adaptation of a research methods course in South America.

    PubMed

    Tamariz, Leonardo; Vasquez, Diego; Loor, Cecilia; Palacio, Ana

    2017-01-01

    South America has low research productivity. The lack of a structured research curriculum is one of the barriers to conducting research. To report our experience adapting an active learning-based research methods curriculum to improve research productivity at a university in Ecuador. We used a mixed-method approach to test the adaptation of the research curriculum at Universidad Catolica Santiago de Guayaquil. The curriculum uses a flipped classroom and active learning approach to teach research methods. The adapted version was longitudinal, with a 16-hour programme of in-person teaching and a six-month online follow-up component. Learners were organized in theme groups according to interest, and each group had a faculty leader. Our primary outcome was research productivity, measured by successful presentation of the research project at a national meeting or publication in a peer-reviewed journal. Our secondary outcomes were knowledge and perceived competence before and after course completion. We conducted qualitative interviews of faculty members and students to evaluate themes related to participation in research. Fifty university students and 10 faculty members attended the course. We had a total of 15 groups. Both knowledge and perceived competence increased by 17 and 18 percentage points, respectively. The presentation or publication rate for the entire group was 50%. The qualitative analysis showed that a lack of research culture and curriculum were common barriers to research. A US-based curriculum can be successfully adapted in low-middle income countries. A research curriculum aids in achieving pre-determined milestones. UCSG: Universidad Catolica Santiago de Guayaquil; UM: University of Miami.

  6. On adaptive modified projective synchronization of a supply chain management system

    NASA Astrophysics Data System (ADS)

    Tirandaz, Hamed

    2017-12-01

    In this paper, the synchronization problem of a chaotic supply chain management system is studied. A novel adaptive modified projective synchronization method is introduced to control the behaviour of the leader supply chain system by a follower chaotic system and to adjust the leader system parameters until the measurable errors of the system parameters converge to zero. The stability evaluation and convergence analysis are carried out by the Lyapunov stability theorem. The proposed synchronization and antisynchronization techniques are studied for identical supply chain chaotic systems. Finally, some numerical simulations are presented to verify the effectiveness of the theoretical discussions.
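
    A minimal numerical sketch of projective synchronization is given below. Since the supply-chain equations are not reproduced in the abstract, the classic Lorenz system stands in for the chaotic model, and a simple active controller replaces the paper's adaptive modified scheme: with u = alpha*f(x) - f(y) - K*e, the error e = y - alpha*x obeys de/dt = -K*e and decays to zero.

    ```python
    import numpy as np

    def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        x, y, z = s
        return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

    def simulate(alpha=2.0, K=5.0, dt=1e-3, steps=20000):
        x = np.array([1.0, 1.0, 1.0])      # master (leader) state
        y = np.array([-3.0, 4.0, 10.0])    # slave (follower) state
        err = []
        for _ in range(steps):
            e = y - alpha * x                               # projective synchronization error
            u = alpha * lorenz(x) - lorenz(y) - K * e       # active control input
            x = x + dt * lorenz(x)                          # explicit Euler integration
            y = y + dt * (lorenz(y) + u)
            err.append(np.linalg.norm(e))
        return err

    err = simulate()
    print(f"initial error {err[0]:.2f}, final error {err[-1]:.2e}")
    ```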

  7. Finite-time hybrid projective synchronization of the drive-response complex networks with distributed-delay via adaptive intermittent control

    NASA Astrophysics Data System (ADS)

    Cheng, Lin; Yang, Yongqing; Li, Li; Sui, Xin

    2018-06-01

    This paper studies the finite-time hybrid projective synchronization of the drive-response complex networks. In the model, general transmission delays and distributed delays are also considered. By designing the adaptive intermittent controllers, the response network can achieve hybrid projective synchronization with the drive system in finite time. Based on finite-time stability theory and several differential inequalities, some simple finite-time hybrid projective synchronization criteria are derived. Two numerical examples are given to illustrate the effectiveness of the proposed method.

  8. Accelerated Adaptive Integration Method

    PubMed Central

    2015-01-01

    Conformational changes that occur upon ligand binding may be too slow to observe on the time scales routinely accessible using molecular dynamics simulations. The adaptive integration method (AIM) leverages the notion that when a ligand is either fully coupled or decoupled, according to λ, barrier heights may change, making some conformational transitions more accessible at certain λ values. AIM adaptively changes the value of λ in a single simulation so that conformations sampled at one value of λ seed the conformational space sampled at another λ value. Adapting the value of λ throughout a simulation, however, does not resolve issues in sampling when barriers remain high regardless of the λ value. In this work, we introduce a new method, called Accelerated AIM (AcclAIM), in which the potential energy function is flattened at intermediate values of λ, promoting the exploration of conformational space as the ligand is decoupled from its receptor. We show, with both a simple model system (Bromocyclohexane) and the more complex biomolecule Thrombin, that AcclAIM is a promising approach to overcome high barriers in the calculation of free energies, without the need for any statistical reweighting or additional processors. PMID:24780083

  9. Global Change adaptation in water resources management: the Water Change project.

    PubMed

    Pouget, Laurent; Escaler, Isabel; Guiu, Roger; Mc Ennis, Suzy; Versini, Pierre-Antoine

    2012-12-01

    In recent years, water resources management has been facing new challenges due to increasing changes and their associated uncertainties, such as changes in climate, water demand or land use, which can be grouped under the term Global Change. The Water Change project (LIFE+ funding) developed a methodology and a tool to assess the Global Change impacts on water resources, thus helping river basin agencies and water companies in their long-term planning and in the definition of adaptation measures. The main result of the project was the creation of a step-by-step methodology to assess Global Change impacts and define strategies of adaptation. This methodology was tested in the Llobregat river basin (Spain) with the objective of being applicable to any water system. It includes several steps such as setting up the problem with a DPSIR framework, developing Global Change scenarios, running river basin models and performing a cost-benefit analysis to define optimal strategies of adaptation. This methodology was supported by the creation of a flexible modelling system, which can link a wide range of models, such as hydrological, water quality, and water management models. The tool allows users to integrate their own models into the system, which can then exchange information among them automatically. This makes it possible to simulate the interactions among multiple components of the water cycle and to run a large number of Global Change scenarios quickly. The outcomes of this project make it possible to define and test different sets of adaptation measures for the basin that can be further evaluated through cost-benefit analysis. The integration of the results contributes to efficient decision-making on how to adapt to Global Change impacts. Copyright © 2012 Elsevier B.V. All rights reserved.

  10. Revealing Risks in Adaptation Planning: expanding Uncertainty Treatment and dealing with Large Projection Ensembles during Planning Scenario development

    NASA Astrophysics Data System (ADS)

    Brekke, L. D.; Clark, M. P.; Gutmann, E. D.; Wood, A.; Mizukami, N.; Mendoza, P. A.; Rasmussen, R.; Ikeda, K.; Pruitt, T.; Arnold, J. R.; Rajagopalan, B.

    2015-12-01

    Adaptation planning assessments often rely on single methods for climate projection downscaling and hydrologic analysis, do not reveal uncertainties from associated method choices, and thus likely produce overly confident decision-support information. Recent work by the authors has highlighted this issue by identifying strengths and weaknesses of widely applied methods for downscaling climate projections and assessing hydrologic impacts. This work has shown that many of the methodological choices made can alter the magnitude, and even the sign of the climate change signal. Such results motivate consideration of both sources of method uncertainty within an impacts assessment. Consequently, the authors have pursued development of improved downscaling techniques spanning a range of method classes (quasi-dynamical and circulation-based statistical methods) and developed approaches to better account for hydrologic analysis uncertainty (multi-model; regional parameter estimation under forcing uncertainty). This presentation summarizes progress in the development of these methods, as well as implications of pursuing these developments. First, having access to these methods creates an opportunity to better reveal impacts uncertainty through multi-method ensembles, expanding on present-practice ensembles which are often based only on emissions scenarios and GCM choices. Second, such expansion of uncertainty treatment combined with an ever-expanding wealth of global climate projection information creates a challenge of how to use such a large ensemble for local adaptation planning. To address this challenge, the authors are evaluating methods for ensemble selection (considering the principles of fidelity, diversity and sensitivity) that is compatible with present-practice approaches for abstracting change scenarios from any "ensemble of opportunity". Early examples from this development will also be presented.

  11. Laser beam projection with adaptive array of fiber collimators. II. Analysis of atmospheric compensation efficiency.

    PubMed

    Lachinova, Svetlana L; Vorontsov, Mikhail A

    2008-08-01

    We analyze the potential efficiency of laser beam projection onto a remote object in atmosphere with incoherent and coherent phase-locked conformal-beam director systems composed of an adaptive array of fiber collimators. Adaptive optics compensation of turbulence-induced phase aberrations in these systems is performed at each fiber collimator. Our analysis is based on a derived expression for the atmospheric-averaged value of the mean square residual phase error as well as direct numerical simulations. Operation of both conformal-beam projection systems is compared for various adaptive system configurations characterized by the number of fiber collimators, the adaptive compensation resolution, and atmospheric turbulence conditions.

  12. Adaptive Finite Element Methods for Continuum Damage Modeling

    NASA Technical Reports Server (NTRS)

    Min, J. B.; Tworzydlo, W. W.; Xiques, K. E.

    1995-01-01

    The paper presents an application of adaptive finite element methods to the modeling of low-cycle continuum damage and life prediction of high-temperature components. The major objective is to provide automated and accurate modeling of damaged zones through adaptive mesh refinement and adaptive time-stepping methods. The damage modeling methodology is implemented in the usual way by embedding damage evolution in the transient nonlinear solution of elasto-viscoplastic deformation problems. This nonlinear boundary-value problem is discretized by adaptive finite element methods. The automated h-adaptive mesh refinements are driven by error indicators, based on selected principal variables in the problem (stresses, non-elastic strains, damage, etc.). In the time domain, adaptive time-stepping is used, combined with a predictor-corrector time marching algorithm. The time-step selection is controlled by the required temporal accuracy. In order to take into account the strong temperature dependency of material parameters, the nonlinear structural solution is coupled with thermal analyses (one-way coupling). Several test examples illustrate the importance and benefits of adaptive mesh refinements in the accurate prediction of damage levels and failure time.
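
    A minimal sketch of the error-indicator-driven h-refinement loop is shown below, assuming a 1-D interpolation target with a sharp layer in place of the elasto-viscoplastic damage problem; the indicator (midpoint deviation from linear interpolation) and the marking fraction are illustrative choices, not the paper's.

    ```python
    import numpy as np

    def f(x):
        return np.tanh(50.0 * (x - 0.5))   # stand-in for a field with a localized damage zone

    def indicator(xs):
        # Per-cell error indicator: deviation of the midpoint value from linear interpolation.
        mids = 0.5 * (xs[:-1] + xs[1:])
        return np.abs(f(mids) - 0.5 * (f(xs[:-1]) + f(xs[1:])))

    xs = np.linspace(0.0, 1.0, 11)                        # coarse initial mesh
    for sweep in range(8):
        eta = indicator(xs)
        if eta.max() < 1e-3:                              # mesh resolves the sharp layer
            break
        marked = np.where(eta > 0.3 * eta.max())[0]       # maximum-strategy marking
        xs = np.sort(np.concatenate([xs, 0.5 * (xs[marked] + xs[marked + 1])]))  # bisect marked cells
        print(f"sweep {sweep}: {xs.size - 1} cells, max indicator {eta.max():.3e}")
    ```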

  13. Vibration control with adaptive structures: MAVO FASPAS project review

    NASA Astrophysics Data System (ADS)

    Hanselka, Holger; Melz, Tobias; Drossel, Welf-Guntram; Sporn, Dieter; Schönecker, Andreas; Poigné, Axel

    2006-03-01

    The mission of the Fraunhofer Gesellschaft, one of the biggest research facilities in Germany, is to identify technologies with a high impact potential for commercial applications and to take all necessary steps to successfully promote them by performing cooperative industrial research activities. One of these technologies is called smart structures, also known as adaptive structures. Most recently, Fraunhofer decided to strategically extend its portfolio to include this technology and summarize its R&D activities in the FIT (Fraunhofer Innovation Topics) ADAPTRONIK. To improve Fraunhofer's competencies in adaptronics, especially with respect to system design and implementation, the Fraunhofer internal project MAVO FASPAS was launched in 2003. Now, after 3 years of work, the project comes to a close. This article discusses some major project results.

  14. Quantifying the effect of autonomous adaptation to global river flood projections: application to future flood risk assessments

    NASA Astrophysics Data System (ADS)

    Kinoshita, Youhei; Tanoue, Masahiro; Watanabe, Satoshi; Hirabayashi, Yukiko

    2018-01-01

    This study represents the first attempt to quantify the effects of autonomous adaptation on the projection of global flood hazards and to assess future flood risk by including this effect. A vulnerability scenario, which varies according to the autonomous adaptation effect for conventional disaster mitigation efforts, was developed based on historical vulnerability values derived from flood damage records and a river inundation simulation. Coupled with general circulation model outputs and future socioeconomic scenarios, potential future flood fatalities and economic loss were estimated. By including the effect of autonomous adaptation, our multimodel ensemble estimates projected a 2.0% decrease in potential flood fatalities and an 821% increase in potential economic losses by 2100 under the highest emission scenario together with a large population increase. Vulnerability changes reduced potential flood consequences by 64%-72% in terms of potential fatalities and 28%-42% in terms of potential economic losses by 2100. Although socioeconomic changes made the greatest contribution to the potential increased consequences of future floods, about half of the increase in potential economic losses was mitigated by autonomous adaptation. There is a clear and positive relationship between the global temperature increase from the pre-industrial level and the estimated mean potential flood economic loss, while there is a negative relationship with potential fatalities due to the autonomous adaptation effect. A bootstrapping analysis suggests a significant increase in potential flood fatalities (+5.7%) without any adaptation if the temperature increases by 1.5 °C-2.0 °C, whereas the increase in potential economic loss (+0.9%) was not significant. Our method enables the effects of autonomous adaptation and additional adaptation efforts on climate-induced hazards to be distinguished, which would be essential for the accurate estimation of the cost of adaptation to

  15. Modeling Adaptive Educational Methods with IMS Learning Design

    ERIC Educational Resources Information Center

    Specht, Marcus; Burgos, Daniel

    2007-01-01

    The paper describes a classification system for adaptive methods developed in the area of adaptive educational hypermedia based on four dimensions: What components of the educational system are adapted? To what features of the user and the current context does the system adapt? Why does the system adapt? How does the system get the necessary…

  16. A resilience perspective to water risk management: case-study application of the adaptation tipping point method

    NASA Astrophysics Data System (ADS)

    Gersonius, Berry; Ashley, Richard; Jeuken, Ad; Nasruddin, Fauzy; Pathirana, Assela; Zevenbergen, Chris

    2010-05-01

    In a context of high uncertainty about hydrological variables due to climate change and other factors, the development of updated risk management approaches is as important as—if not more important than—the provision of improved data and forecasts of the future. Traditional approaches to adaptation attempt to manage future water risks to cities with the use of the predict-then-adapt method. This method uses hydrological change projections as the starting point to identify adaptive strategies, which is followed by analysing the cause-effect chain based on some sort of Pressures-State-Impact-Response (PSIR) scheme. The predict-then-adapt method presumes that it is possible to define a singular (optimal) adaptive strategy according to a most likely or average projection of future change. A key shortcoming of the method is, however, that the planning of water management structures is typically decoupled from forecast uncertainties and is, as such, inherently inflexible. This means that there is an increased risk of under- or over-adaptation, resulting in either mal-functioning or unnecessary costs. Rather than taking a traditional approach, responsible water risk management requires an alternative approach to adaptation that recognises and cultivates resiliency for change. The concept of resiliency relates to the capability of complex socio-technical systems to make aspirational levels of functioning attainable despite the occurrence of possible changes. Focusing on resiliency does not attempt to reduce uncertainty associated with future change, but rather to develop better ways of managing it. This makes it a particularly relevant perspective for adaptation to long-term hydrological change. Although resiliency is becoming more refined as a theory, the application of the concept to water risk management is still in an initial phase. Different methods are used in practice to support the implementation of a resilience-focused approach. Typically these approaches

  17. The New England Climate Adaptation Project: Enhancing Local Readiness to Adapt to Climate Change through Role-Play Simulations

    NASA Astrophysics Data System (ADS)

    Rumore, D.; Kirshen, P. H.; Susskind, L.

    2014-12-01

    Despite scientific consensus that the climate is changing, local efforts to prepare for and manage climate change risks remain limited. How can we raise concern about climate change risks and enhance local readiness to adapt to climate change's effects? In this presentation, we will share the lessons learned from the New England Climate Adaptation Project (NECAP), a participatory action research project that tested science-based role-play simulations as a tool for educating the public about climate change risks and simulating collective risk management efforts. NECAP was a 2-year effort involving the Massachusetts Institute of Technology, the Consensus Building Institute, the National Estuarine Research Reserve System, and four coastal New England municipalities. During 2012-2013, the NECAP team produced downscaled climate change projections, a summary risk assessment, and a stakeholder assessment for each partner community. Working with local partners, we used these assessments to create a tailored, science-based role-play simulation for each site. Through a series of workshops in 2013, NECAP engaged between 115 and 170 diverse stakeholders and members of the public in each partner municipality in playing the simulation and a follow-up conversation about local climate change risks and possible adaptation strategies. Data were collected through before-and-after surveys administered to all workshop participants, follow-up interviews with 25 percent of workshop participants, public opinion polls conducted before and after our intervention, and meetings with public officials. This presentation will report our research findings and explain how science-based role-play simulations can be used to help communicate local climate change risks and enhance local readiness to adapt.

  18. An Adaptive Approach to Managing Knowledge Development in a Project-Based Learning Environment

    ERIC Educational Resources Information Center

    Tilchin, Oleg; Kittany, Mohamed

    2016-01-01

    In this paper we propose an adaptive approach to managing the development of students' knowledge in the comprehensive project-based learning (PBL) environment. Subject study is realized by two-stage PBL. It shapes adaptive knowledge management (KM) process and promotes the correct balance between personalized and collaborative learning. The…

  19. Supporting UK adaptation: building services for the next set of UK climate projections

    NASA Astrophysics Data System (ADS)

    Fung, Fai; Lowe, Jason

    2016-04-01

    As part of the Climate Change Act 2008, the UK Government sets out a national adaptation programme to address the risks and opportunities identified in a national climate change risk assessment (CCRA) every five years. The last risk assessment in 2012 was based on the probabilistic projections for the UK published in 2009 (UKCP09). The second risk assessment will also use information from UKCP09 alongside other evidence on climate projections. However, developments in the science of climate projection, and evolving user needs (based partly on what has been learnt about the diverse user requirements of the UK adaptation community from the seven years of delivering and managing UKCP09 products, market research and the peer-reviewed literature), suggest now is an appropriate time to update the projections and how they are delivered. A new set of UK climate projections is now being produced to upgrade UKCP09 to reflect the latest developments in climate science, the first phase of which will be delivered in 2018 to support the third CCRA. A major component of the work is the building of a tailored service to support users of the new projections during their development and to involve users in key decisions so that the projections are of most use. We will set out the plan for the new climate projections, which seeks to address evolving user needs. We will also present a framework which aims to (i) facilitate the dialogue between users, boundary organisations and producers, reflecting their different decision-making roles, (ii) produce scientifically robust, user-relevant climate information, and (iii) provide the building blocks for developing further climate services to support adaptation activities in the UK.

  20. Adaptive mesh strategies for the spectral element method

    NASA Technical Reports Server (NTRS)

    Mavriplis, Catherine

    1992-01-01

    An adaptive spectral method was developed for the efficient solution of time-dependent partial differential equations. Adaptive mesh strategies that include resolution refinement and coarsening by three different methods are illustrated on solutions to the 1-D viscous Burgers equation and the 2-D Navier-Stokes equations for driven flow in a cavity. Sharp gradients, singularities, and regions of poor resolution are resolved optimally as they develop in time using error estimators which indicate the choice of refinement to be used. The adaptive formulation presents significant increases in efficiency, flexibility, and general capabilities for high-order spectral methods.

  1. Post-project appraisals in adaptive management of river channel restoration.

    PubMed

    Downs, Peter W; Kondolf, G Mathias

    2002-04-01

    Post-project appraisals (PPAs) can evaluate river restoration schemes in relation to their compliance with design, their short-term performance attainment, and their longer-term geomorphological compatibility with the catchment hydrology and sediment transport processes. PPAs provide the basis for communicating the results of one restoration scheme to another, thereby improving future restoration designs. They also supply essential performance feedback needed for adaptive management, in which management actions are treated as experiments. PPAs allow river restoration success to be defined both in terms of the scheme attaining its performance objectives and in providing a significant learning experience. Different levels of investment in PPA, in terms of pre-project data and follow-up information, bring with them different degrees of understanding and thus different abilities to gauge both types of success. We present four case studies to illustrate how the commitment to PPA has determined the understanding achieved in each case. In Moore's Gulch (California, USA), understanding was severely constrained by the lack of pre-project data and post-implementation monitoring. Pre-project data existed for the Kitswell Brook (Hertfordshire, UK), but the monitoring consisted only of one site visit and thus the understanding achieved is related primarily to design compliance issues. The monitoring undertaken for Deep Run (Maryland, USA) and the River Idle (Nottinghamshire, UK) enabled some understanding of the short-term performance of each scheme. The transferable understanding gained from each case study is used to develop an illustrative five-fold classification of geomorphological PPAs (full, medium-term, short-term, one-shot, and remains) according to their potential as learning experiences. The learning experience is central to adaptive management but rarely articulated in the literature. Here, we gauge the potential via superimposition onto a previous schematic

  2. Track and vertex reconstruction: From classical to adaptive methods

    NASA Astrophysics Data System (ADS)

    Strandlie, Are; Frühwirth, Rudolf

    2010-04-01

    This paper reviews classical and adaptive methods of track and vertex reconstruction in particle physics experiments. Adaptive methods have been developed to meet the experimental challenges at high-energy colliders, in particular, the CERN Large Hadron Collider. They can be characterized by the obliteration of the traditional boundaries between pattern recognition and statistical estimation, by the competition between different hypotheses about what constitutes a track or a vertex, and by a high level of flexibility and robustness achieved with a minimum of assumptions about the data. The theoretical background of some of the adaptive methods is described, and it is shown that there is a close connection between the two main branches of adaptive methods: neural networks and deformable templates, on the one hand, and robust stochastic filters with annealing, on the other hand. As both classical and adaptive methods of track and vertex reconstruction presuppose precise knowledge of the positions of the sensitive detector elements, the paper includes an overview of detector alignment methods and a survey of the alignment strategies employed by past and current experiments.
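
    The link between deformable templates and annealed robust filters can be illustrated with a toy adaptive straight-line track fit, in which each hit's assignment weight competes against a fixed cutoff while a temperature parameter is lowered; the data and parameters below are synthetic, not from any experiment.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 10.0, 30)
    y = 0.7 * x + 1.0 + 0.05 * rng.normal(size=x.size)    # hits from a true track
    y[[5, 12, 21]] += rng.uniform(2.0, 4.0, size=3)       # outlier (wrongly assigned) hits

    def weighted_line_fit(x, y, w):
        A = np.vstack([x, np.ones_like(x)]).T
        W = np.diag(w)
        return np.linalg.solve(A.T @ W @ A, A.T @ W @ y)  # (slope, intercept)

    slope, intercept = weighted_line_fit(x, y, np.ones_like(x))
    cutoff = 9.0                                          # chi2 of a competing "no-hit" hypothesis
    for T in [64.0, 16.0, 4.0, 1.0]:                      # annealing schedule
        r2 = ((y - (slope * x + intercept)) / 0.05) ** 2  # standardized squared residuals
        w = np.exp(-r2 / (2 * T)) / (np.exp(-r2 / (2 * T)) + np.exp(-cutoff / (2 * T)))
        slope, intercept = weighted_line_fit(x, y, w)     # re-fit with soft assignments
    print(f"slope {slope:.3f}, intercept {intercept:.3f}")  # close to the true 0.7 and 1.0
    ```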

  3. Towards More Comprehensive Projections of Urban Heat-Related Mortality: Estimates for New York City under Multiple Population, Adaptation, and Climate Scenarios

    PubMed Central

    Petkova, Elisaveta P.; Vink, Jan K.; Horton, Radley M.; Gasparrini, Antonio; Bader, Daniel A.; Francis, Joe D.; Kinney, Patrick L.

    2016-01-01

    Background: High temperatures have substantial impacts on mortality and, with growing concerns about climate change, numerous studies have developed projections of future heat-related deaths around the world. Projections of temperature-related mortality are often limited by insufficient information to formulate hypotheses about population sensitivity to high temperatures and future demographics. Objectives: The present study derived projections of temperature-related mortality in New York City by taking into account future patterns of adaptation or demographic change, both of which can have profound influences on future health burdens. Methods: We adopted a novel approach to modeling heat adaptation by incorporating an analysis of the observed population response to heat in New York City over the course of eight decades. This approach projected heat-related mortality until the end of the 21st century based on observed trends in adaptation over a substantial portion of the 20th century. In addition, we incorporated a range of new scenarios for population change until the end of the 21st century. We then estimated future heat-related deaths in New York City by combining the changing temperature–mortality relationship and population scenarios with downscaled temperature projections from the 33 global climate models (GCMs) and two Representative Concentration Pathways (RCPs). Results: The median number of projected annual heat-related deaths across the 33 GCMs varied greatly by RCP and adaptation and population change scenario, ranging from 167 to 3,331 in the 2080s compared with 638 heat-related deaths annually between 2000 and 2006. Conclusions: These findings provide a more complete picture of the range of potential future heat-related mortality risks across the 21st century in New York City, and they highlight the importance of both demographic change and adaptation responses in modifying future risks. Citation: Petkova EP, Vink JK, Horton RM, Gasparrini A, Bader

  4. Adaptive θ-methods for pricing American options

    NASA Astrophysics Data System (ADS)

    Khaliq, Abdul Q. M.; Voss, David A.; Kazmi, Kamran

    2008-12-01

    We develop adaptive θ-methods for solving the Black-Scholes PDE for American options. By adding a small, continuous term, the Black-Scholes PDE becomes an advection-diffusion-reaction equation on a fixed spatial domain. Standard implementation of θ-methods would require a Newton-type iterative procedure at each time step thereby increasing the computational complexity of the methods. Our linearly implicit approach avoids such complications. We establish a general framework under which θ-methods satisfy a discrete version of the positivity constraint characteristic of American options, and numerically demonstrate the sensitivity of the constraint. The positivity results are established for the single-asset and independent two-asset models. In addition, we have incorporated and analyzed an adaptive time-step control strategy to increase the computational efficiency. Numerical experiments are presented for one- and two-asset American options, using adaptive exponential splitting for two-asset problems. The approach is compared with an iterative solution of the two-asset problem in terms of computational efficiency.
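
    A hedged sketch of one such scheme is given below: a θ-method (Crank-Nicolson weighting, θ = 1/2) for a single-asset American put, with the early-exercise constraint enforced by projecting the solution onto the payoff after each step rather than by the paper's small continuous term; grid sizes, market parameters and the dense linear solve are purely illustrative.

    ```python
    import numpy as np

    def american_put_theta(K=10.0, r=0.05, sigma=0.3, T=1.0,
                           S_max=30.0, M=300, N=300, theta=0.5):
        S = np.linspace(0.0, S_max, M + 1)
        dS, dt = S[1] - S[0], T / N
        payoff = np.maximum(K - S, 0.0)
        V = payoff.copy()                      # value at expiry

        # Interior Black-Scholes operator L V = 0.5*sigma^2*S^2*V'' + r*S*V' - r*V.
        i = np.arange(1, M)
        a = 0.5 * sigma**2 * S[i]**2 / dS**2 - 0.5 * r * S[i] / dS   # sub-diagonal
        b = -sigma**2 * S[i]**2 / dS**2 - r                          # diagonal
        c = 0.5 * sigma**2 * S[i]**2 / dS**2 + 0.5 * r * S[i] / dS   # super-diagonal
        L = np.zeros((M - 1, M - 1))
        L[np.arange(M - 1), np.arange(M - 1)] = b
        L[np.arange(1, M - 1), np.arange(M - 2)] = a[1:]
        L[np.arange(M - 2), np.arange(1, M - 1)] = c[:-1]
        A = np.eye(M - 1) - theta * dt * L          # implicit part
        B = np.eye(M - 1) + (1.0 - theta) * dt * L  # explicit part

        for _ in range(N):                          # march backwards from expiry
            rhs = B @ V[1:M]
            rhs[0] += dt * a[0] * K                 # boundary V(0, t) = K for the put
            V[1:M] = np.linalg.solve(A, rhs)
            V = np.maximum(V, payoff)               # project onto the payoff (early exercise)
        return S, V

    S, V = american_put_theta()
    print(f"American put value at S = K: {np.interp(10.0, S, V):.3f}")
    ```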

  5. Fast and robust reconstruction for fluorescence molecular tomography via a sparsity adaptive subspace pursuit method.

    PubMed

    Ye, Jinzuo; Chi, Chongwei; Xue, Zhenwen; Wu, Ping; An, Yu; Xu, Han; Zhang, Shuang; Tian, Jie

    2014-02-01

    Fluorescence molecular tomography (FMT), as a promising imaging modality, can three-dimensionally locate the specific tumor position in small animals. However, it remains challenging to achieve effective and robust reconstruction of fluorescent probe distribution in animals. In this paper, we present a novel method based on sparsity adaptive subspace pursuit (SASP) for FMT reconstruction. Some innovative strategies including subspace projection, the bottom-up sparsity adaptive approach, and a backtracking technique are associated with the SASP method, which guarantees the accuracy, efficiency, and robustness of FMT reconstruction. Three numerical experiments based on a mouse-mimicking heterogeneous phantom have been performed to validate the feasibility of the SASP method. The results show that the proposed SASP method can achieve satisfactory source localization with a bias of less than 1 mm; the method is much faster than mainstream reconstruction methods; and this approach is robust even under quite ill-posed conditions. Furthermore, we have applied this method to an in vivo mouse model, and the results demonstrate the feasibility of the practical FMT application with the SASP method.
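
    For orientation, a generic subspace pursuit iteration for sparse recovery from y = A x with a fixed sparsity level K is sketched below; the paper's SASP additionally adapts the sparsity level bottom-up and adds backtracking for the FMT inverse problem, and that adaptive layer is not reproduced here.

    ```python
    import numpy as np

    def subspace_pursuit(A, y, K, max_iter=50):
        def ls_on(support):
            coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
            return coef
        support = np.argsort(np.abs(A.T @ y))[-K:]                # initial support estimate
        residual = y - A[:, support] @ ls_on(support)
        for _ in range(max_iter):
            candidates = np.argsort(np.abs(A.T @ residual))[-K:]  # K best new candidates
            merged = np.union1d(support, candidates)
            coef = ls_on(merged)
            new_support = merged[np.argsort(np.abs(coef))[-K:]]   # prune back to K entries
            new_residual = y - A[:, new_support] @ ls_on(new_support)
            if np.linalg.norm(new_residual) >= np.linalg.norm(residual):
                break                                             # no further improvement
            support, residual = new_support, new_residual
        x = np.zeros(A.shape[1])
        x[support] = ls_on(support)
        return x

    rng = np.random.default_rng(1)
    A = rng.normal(size=(40, 120)) / np.sqrt(40)      # random sensing matrix
    x_true = np.zeros(120)
    x_true[[7, 30, 77]] = [1.5, -2.0, 0.8]            # 3-sparse source
    x_hat = subspace_pursuit(A, A @ x_true, K=3)
    print("recovered support:", np.nonzero(x_hat)[0])
    ```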

  6. Methods used in adaptation of health-related guidelines: A systematic survey.

    PubMed

    Abdul-Khalek, Rima A; Darzi, Andrea J; Godah, Mohammad W; Kilzar, Lama; Lakis, Chantal; Agarwal, Arnav; Abou-Jaoude, Elias; Meerpohl, Joerg J; Wiercioch, Wojtek; Santesso, Nancy; Brax, Hneine; Schünemann, Holger; Akl, Elie A

    2017-12-01

    Adaptation refers to the systematic approach for considering the endorsement or modification of recommendations produced in one setting for application in another as an alternative to de novo development. To describe and assess the methods used for adapting health-related guidelines published in peer-reviewed journals, and to assess the quality of the resulting adapted guidelines. We searched Medline and Embase up to June 2015. We assessed the method of adaptation, and the quality of included guidelines. Seventy-two papers were eligible. Most adapted guidelines and their source guidelines were published by professional societies (71% and 68% respectively), and in high-income countries (83% and 85% respectively). Of the 57 adapted guidelines that reported any detail about adaptation method, 34 (60%) did not use a published adaptation method. The number (and percentage) of adapted guidelines fulfilling each of the ADAPTE steps ranged between 2 (4%) and 57 (100%). The quality of adapted guidelines was highest for the "scope and purpose" domain and lowest for the "editorial independence" domain (respective mean percentages of the maximum possible scores were 93% and 43%). The mean score for "rigor of development" was 57%. Most adapted guidelines published in peer-reviewed journals do not report using a published adaptation method, and their adaptation quality was variable.

  7. Adaptive method with intercessory feedback control for an intelligent agent

    DOEpatents

    Goldsmith, Steven Y.

    2004-06-22

    An adaptive architecture method with feedback control for an intelligent agent provides for adaptively integrating reflexive and deliberative responses to a stimulus according to a goal. An adaptive architecture method with feedback control for multiple intelligent agents provides for coordinating and adaptively integrating reflexive and deliberative responses to a stimulus according to a goal. Re-programming of the adaptive architecture is through a nexus which coordinates reflexive and deliberator components.

  8. Evidence of genomic adaptation to climate in Eucalyptus microcarpa: Implications for adaptive potential to projected climate change.

    PubMed

    Jordan, Rebecca; Hoffmann, Ary A; Dillon, Shannon K; Prober, Suzanne M

    2017-11-01

    Understanding whether populations can adapt in situ or whether interventions are required is of key importance for biodiversity management under climate change. Landscape genomics is becoming an increasingly important and powerful tool for rapid assessments of climate adaptation, especially in long-lived species such as trees. We investigated climate adaptation in Eucalyptus microcarpa using the DArTseq genomic approach. A combination of FST outlier and environmental association analyses were performed using >4200 genomewide single nucleotide polymorphisms (SNPs) from 26 populations spanning climate gradients in southeastern Australia. Eighty-one SNPs were identified as putatively adaptive, based on significance in FST outlier tests and significant associations with one or more climate variables related to temperature (70/81), aridity (37/81) or precipitation (35/81). Adaptive SNPs were located on all 11 chromosomes, with no particular region associated with individual climate variables. Climate adaptation appeared to be characterized by subtle shifts in allele frequencies, with no consistent fixed differences identified. Based on these associations, we predict adaptation under projected changes in climate will include a suite of shifts in allele frequencies. Whether this can occur sufficiently rapidly through natural selection within populations, or would benefit from assisted gene migration, requires further evaluation. In some populations, the absence or predicted increases to near fixation of particular adaptive alleles hint at potential limits to adaptive capacity. Together, these results reinforce the importance of standing genetic variation at the geographic level for maintaining species' evolutionary potential. © 2017 John Wiley & Sons Ltd.

  9. Methods for the cultural adaptation of a diabetes lifestyle intervention for Latinas: an illustrative project.

    PubMed

    Osuna, Diego; Barrera, Manuel; Strycker, Lisa A; Toobert, Deborah J; Glasgow, Russell E; Geno, Cristy R; Almeida, Fabio; Perdomo, Malena; King, Diane; Doty, Alyssa Tinley

    2011-05-01

    Because Latinas experience a high prevalence of type 2 diabetes and its complications, there is an urgent need to reach them with interventions that promote healthful lifestyles. This article illustrates a sequential approach that took an effective multiple-risk-factor behavior-change program and adapted it for Latinas with type 2 diabetes. Adaptation stages include (a) information gathering from literature and focus groups, (b) preliminary adaptation design, and (c) preliminary adaptation test. In this third stage, a pilot study finds that participants were highly satisfied with the intervention and showed improvement across diverse outcomes. Key implications for applications include the importance of a model for guiding cultural adaptations, and the value of procedures for obtaining continuous feedback from staff and participants during the preliminary adaptation test.

  10. A family of variable step-size affine projection adaptive filter algorithms using statistics of channel impulse response

    NASA Astrophysics Data System (ADS)

    Shams Esfand Abadi, Mohammad; AbbasZadeh Arani, Seyed Ali Asghar

    2011-12-01

    This paper extends the recently introduced variable step-size (VSS) approach to the family of adaptive filter algorithms. This method uses prior knowledge of the channel impulse response statistics. Accordingly, the optimal step-size vector is obtained by minimizing the mean-square deviation (MSD). The presented algorithms are the VSS affine projection algorithm (VSS-APA), the VSS selective partial update NLMS (VSS-SPU-NLMS), the VSS-SPU-APA, and the VSS selective regressor APA (VSS-SR-APA). In the VSS-SPU adaptive algorithms the filter coefficients are partially updated, which reduces the computational complexity. In VSS-SR-APA, the optimal selection of input regressors is performed during the adaptation. The presented algorithms feature good convergence speed, low steady-state mean square error (MSE), and low computational complexity. We demonstrate the good performance of the proposed algorithms through several simulations in a system identification scenario.
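
    A minimal fixed-step-size affine projection algorithm (APA) baseline for system identification is sketched below; the paper replaces the scalar step size with an MSD-optimal step-size vector and adds selective-partial-update and selective-regressor variants, none of which are reproduced here, and the channel and signals are synthetic.

    ```python
    import numpy as np

    def apa_identify(x, d, L=16, P=4, mu=0.5, delta=1e-3):
        """Identify an L-tap system from input x and desired output d with projection order P."""
        w = np.zeros(L)
        for n in range(L + P, len(x)):
            # P most recent length-L input regressors (columns of U) and desired samples.
            U = np.column_stack([x[n - p - L + 1:n - p + 1][::-1] for p in range(P)])
            dvec = d[n:n - P:-1]
            e = dvec - U.T @ w                                        # a-priori error vector
            w = w + mu * U @ np.linalg.solve(U.T @ U + delta * np.eye(P), e)
        return w

    rng = np.random.default_rng(2)
    h = rng.normal(size=16)                                   # unknown 16-tap channel
    x = rng.normal(size=5000)                                 # white input signal
    d = np.convolve(x, h)[:len(x)] + 0.01 * rng.normal(size=len(x))
    w = apa_identify(x, d)
    print(f"tap error norm: {np.linalg.norm(w - h):.4f}")
    ```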

  11. Methods for cost estimation in software project management

    NASA Astrophysics Data System (ADS)

    Briciu, C. V.; Filip, I.; Indries, I. I.

    2016-02-01

    The speed at which the processes used in the software development field have changed makes forecasting the overall costs of a software project very difficult. Many researchers have considered this task unachievable, while others argue that it can be addressed using well-known mathematical methods (e.g. multiple linear regression) and newer techniques such as genetic programming and neural networks. The paper presents a solution for building cost estimation models for software project management using genetic algorithms, starting from the PROMISE datasets related to the COCOMO 81 model. The first part of the paper summarizes the major achievements in the research area of estimating overall project costs and describes the existing software development process models. The last part proposes a basic mathematical model for genetic programming, including the chosen fitness function and chromosome representation. The perspective of the described model is linked to the current reality of software development, taking the software product life cycle as a basis along with the current challenges and innovations in the area. Based on the authors' experience and an analysis of the existing models and product life cycles, it is concluded that estimation models should be adapted to new technologies and emerging systems and that they depend largely on the chosen software development method.
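
    As a point of reference for the estimation models discussed, the basic COCOMO 81 effort and schedule equations are sketched below with the standard textbook coefficients; the genetic-programming approach described in the paper would instead learn such coefficients from the PROMISE records.

    ```python
    # Basic COCOMO 81: effort = a * KLOC**b (person-months), schedule = c * effort**d (months).
    COCOMO81_BASIC = {
        "organic":       (2.4, 1.05, 2.5, 0.38),
        "semi-detached": (3.0, 1.12, 2.5, 0.35),
        "embedded":      (3.6, 1.20, 2.5, 0.32),
    }

    def cocomo81_basic(kloc, mode="organic"):
        a, b, c, d = COCOMO81_BASIC[mode]
        effort = a * kloc ** b
        schedule = c * effort ** d
        return effort, schedule

    effort, schedule = cocomo81_basic(32, mode="semi-detached")   # hypothetical 32 KLOC project
    print(f"effort {effort:.1f} person-months over {schedule:.1f} months")
    ```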

  12. Evaluating success criteria and project monitoring in river enhancement within an adaptive management framework

    USGS Publications Warehouse

    O'Donnell, T. K.; Galat, D.L.

    2008-01-01

    Objective setting, performance measures, and accountability are important components of an adaptive-management approach to river-enhancement programs. Few lessons learned by river-enhancement practitioners in the United States have been documented and disseminated relative to the number of projects implemented. We conducted scripted telephone surveys with river-enhancement project managers and practitioners within the Upper Mississippi River Basin (UMRB) to determine the extent of setting project success criteria, monitoring, evaluation of monitoring data, and data dissemination. Investigation of these elements enabled a determination of those that inhibited adaptive management. Seventy river enhancement projects were surveyed. Only 34% of projects surveyed incorporated a quantified measure of project success. Managers most often relied on geophysical attributes of rivers when setting project success criteria, followed by biological communities. Ninety-one percent of projects that performed monitoring included biologic variables, but the lack of data collection before and after project completion and lack of field-based reference or control sites will make future assessments of ecologic success difficult. Twenty percent of projects that performed monitoring evaluated ≥1 variable but did not disseminate their evaluations outside their organization. Results suggest greater incentives may be required to advance the science of river enhancement. Future river-enhancement programs within the UMRB and elsewhere can increase knowledge gained from individual projects by offering better guidance on setting success criteria before project initiation and evaluation through established monitoring protocols. © 2007 Springer Science+Business Media, LLC.

  13. Adaptive envelope protection methods for aircraft

    NASA Astrophysics Data System (ADS)

    Unnikrishnan, Suraj

    Carefree handling refers to the ability of a pilot to operate an aircraft without the need to continuously monitor aircraft operating limits. At the heart of all carefree handling or maneuvering systems, also referred to as envelope protection systems, are algorithms and methods for predicting future limit violations. Recently, envelope protection methods that have gained more acceptance translate limit proximity information to its equivalent in the control channel. Envelope protection algorithms either use a very small prediction horizon or are static methods with no capability to adapt to changes in system configurations. Adaptive approaches that maximize the prediction horizon, such as dynamic trim, are only applicable to steady-state-response critical limit parameters. In this thesis, a new adaptive envelope protection method is developed that is applicable to steady-state and transient-response critical limit parameters. The approach is based upon devising the most aggressive optimal control profile to the limit boundary and using it to compute control limits. Pilot-in-the-loop evaluations of the proposed approach are conducted at the Georgia Tech Carefree Maneuver lab for transient longitudinal hub moment limit protection. Carefree maneuvering is the dual of carefree handling in the realm of autonomous Uninhabited Aerial Vehicles (UAVs). Designing a flight control system to fully and effectively utilize the operational flight envelope is very difficult. With the increasing role and demands for extreme maneuverability there is a need for developing envelope protection methods for autonomous UAVs. In this thesis, a full-authority automatic envelope protection method is proposed for limit protection in UAVs. The approach uses adaptive estimates of limit parameter dynamics and finite-time horizon predictions to detect impending limit boundary violations. Limit violations are prevented by treating the limit boundary as an obstacle and by correcting nominal control

  14. Adaptive Set-Based Methods for Association Testing.

    PubMed

    Su, Yu-Chen; Gauderman, William James; Berhane, Kiros; Lewinger, Juan Pablo

    2016-02-01

    With a typical sample size of a few thousand subjects, a single genome-wide association study (GWAS) using traditional one single nucleotide polymorphism (SNP)-at-a-time methods can only detect genetic variants conferring a sizable effect on disease risk. Set-based methods, which analyze sets of SNPs jointly, can detect variants with smaller effects acting within a gene, a pathway, or other biologically relevant sets. Although self-contained set-based methods (those that test sets of variants without regard to variants not in the set) are generally more powerful than competitive set-based approaches (those that rely on comparison of variants in the set of interest with variants not in the set), there is no consensus as to which self-contained methods are best. In particular, several self-contained set tests have been proposed to directly or indirectly "adapt" to the a priori unknown proportion and distribution of effects of the truly associated SNPs in the set, which is a major determinant of their power. A popular adaptive set-based test is the adaptive rank truncated product (ARTP), which seeks the set of SNPs that yields the best-combined evidence of association. We compared the standard ARTP, several ARTP variations we introduced, and other adaptive methods in a comprehensive simulation study to evaluate their performance. We used permutations to assess significance for all the methods and thus provide a level playing field for comparison. We found the standard ARTP test to have the highest power across our simulations followed closely by the global model of random effects (GMRE) and a least absolute shrinkage and selection operator (LASSO)-based test. © 2015 WILEY PERIODICALS, INC.
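
    A permutation-based sketch of the adaptive rank truncated product idea is given below, assuming per-SNP p-values from simple two-sample t-tests on a binary phenotype and illustrative truncation points; this is a generic ARTP-style set test, not the exact procedure compared in the study.

    ```python
    import numpy as np
    from scipy import stats

    def snp_pvalues(genotypes, phenotype):
        # One p-value per SNP: t-test of genotype means between cases and controls.
        cases, controls = genotypes[phenotype == 1], genotypes[phenotype == 0]
        return np.array([stats.ttest_ind(cases[:, j], controls[:, j]).pvalue
                         for j in range(genotypes.shape[1])])

    def artp_test(genotypes, phenotype, ks=(1, 2, 5, 10), n_perm=500, seed=0):
        rng = np.random.default_rng(seed)
        def rtp(pvals):
            srt = np.sort(pvals)
            return np.array([np.log(srt[:k]).sum() for k in ks])   # truncated products (log scale)
        observed = rtp(snp_pvalues(genotypes, phenotype))
        perms = np.array([rtp(snp_pvalues(genotypes, rng.permutation(phenotype)))
                          for _ in range(n_perm)])
        # Per-k p-values for the observed data and for every permutation, then minimize over k;
        # the final p-value compares the observed minimum against the permutation minima.
        per_k_obs = (perms <= observed).mean(axis=0)
        per_k_perm = (np.argsort(np.argsort(perms, axis=0), axis=0) + 1) / n_perm
        return (per_k_perm.min(axis=1) <= per_k_obs.min()).mean()

    rng = np.random.default_rng(3)
    genotypes = rng.binomial(2, 0.3, size=(400, 20)).astype(float)
    phenotype = rng.binomial(1, 0.5, size=400)
    carriers = genotypes[:, 0] > 1                                  # SNP 0 is weakly causal
    phenotype[carriers] = rng.binomial(1, 0.8, size=carriers.sum())
    print(f"ARTP-style set p-value: {artp_test(genotypes, phenotype):.3f}")
    ```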

  15. Highlight removal based on the regional-projection fringe projection method

    NASA Astrophysics Data System (ADS)

    Qi, Zhaoshuai; Wang, Zhao; Huang, Junhui; Xing, Chao; Gao, Jianmin

    2018-04-01

    In fringe projection profilometry, highlights usually cause saturation and blooming in the captured fringes and reduce the measurement accuracy. To solve the problem, a regional-projection fringe projection (RP-FP) method is proposed. Regional projection patterns (RP patterns) are projected onto the tested object surface to avoid saturation and blooming. Then, an image inpainting technique is employed to reconstruct the missing phases in the captured RP patterns, and a complete surface of the tested object is obtained. Experiments verified the effectiveness of the proposed method. The method can be widely used for industrial inspection and quality control in the mechanical and manufacturing industries.

  16. Project ADAPT: A Program to Assess Depression and Provide Proactive Treatment in Rural Areas

    ERIC Educational Resources Information Center

    Luptak, Marilyn; Kaas, Merrie J.; Artz, Margaret; McCarthy, Teresa

    2008-01-01

    Purpose: We describe and evaluate a project designed to pilot test an evidence-based clinical intervention for assessing and treating depression in older adults in rural primary care clinics. Project ADAPT--Assuring Depression Assessment and Proactive Treatment--utilized existing primary care resources to overcome barriers to sustainability…

  17. Model-free adaptive sliding mode controller design for generalized projective synchronization of the fractional-order chaotic system via radial basis function neural networks

    NASA Astrophysics Data System (ADS)

    Wang, L. M.

    2017-09-01

    A novel model-free adaptive sliding mode strategy is proposed for generalized projective synchronization (GPS) between two entirely unknown fractional-order chaotic systems subject to external disturbances. To cope with the limited knowledge of the master-slave system and to counteract the adverse effects of the external disturbances on the generalized projective synchronization, radial basis function neural networks are used to approximate the packaged unknown master system and the packaged unknown slave system (including the external disturbances). Consequently, based on sliding-mode techniques and neural network theory, a model-free adaptive sliding mode controller is designed to guarantee asymptotic stability of the generalized projective synchronization error. The main contribution of this paper is that a control strategy is provided for generalized projective synchronization between two entirely unknown fractional-order chaotic systems subject to unknown external disturbances, and the proposed control strategy only requires that the master system has the same fractional orders as the slave system. Moreover, the proposed method allows us to achieve all kinds of generalized projective chaos synchronizations by tuning the user-defined parameters to the desired values. Simulation results show the effectiveness of the proposed method and the robustness of the controlled system.

  18. Computerized Adaptive Assessment of Personality Disorder: Introducing the CAT-PD Project

    PubMed Central

    Simms, Leonard J.; Goldberg, Lewis R.; Roberts, John E.; Watson, David; Welte, John; Rotterman, Jane H.

    2011-01-01

    Assessment of personality disorders (PD) has been hindered by reliance on the problematic categorical model embodied in the most recent Diagnostic and Statistical Model of Mental Disorders (DSM), lack of consensus among alternative dimensional models, and inefficient measurement methods. This article describes the rationale for and early results from an NIMH-funded, multi-year study designed to develop an integrative and comprehensive model and efficient measure of PD trait dimensions. To accomplish these goals, we are in the midst of a five-phase project to develop and validate the model and measure. The results of Phase 1 of the project—which was focused on developing the PD traits to be assessed and the initial item pool—resulted in a candidate list of 59 PD traits and an initial item pool of 2,589 items. Data collection and structural analyses in community and patient samples will inform the ultimate structure of the measure, and computerized adaptive testing (CAT) will permit efficient measurement of the resultant traits. The resultant Computerized Adaptive Test of Personality Disorder (CAT-PD) will be well positioned as a measure of the proposed DSM-5 PD traits. Implications for both applied and basic personality research are discussed. PMID:22804677

  19. On The Behavior of Subgradient Projections Methods for Convex Feasibility Problems in Euclidean Spaces.

    PubMed

    Butnariu, Dan; Censor, Yair; Gurfil, Pini; Hadar, Ethan

    2008-07-03

    We study some methods of subgradient projections for solving a convex feasibility problem with general (not necessarily hyperplanes or half-spaces) convex sets in the inconsistent case and propose a strategy that controls the relaxation parameters in a specific self-adapting manner. This strategy leaves enough user-flexibility but gives a mathematical guarantee for the algorithm's behavior in the inconsistent case. We present numerical results of computational experiments that illustrate the computational advantage of the new method.
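
    A minimal cyclic subgradient projection sketch for a convex feasibility problem over disk-shaped sets is given below; the relaxation parameter is held fixed here, whereas the strategy studied in the paper controls it in a self-adapting manner, and the sets are illustrative.

    ```python
    import numpy as np

    def cyclic_subgradient_projection(x, constraints, lam=1.0, sweeps=200, tol=1e-10):
        """Find a point with g_i(x) <= 0 for all constraints (g_i, subgradient_i)."""
        for _ in range(sweeps):
            moved = False
            for g, subgrad in constraints:
                val = g(x)
                if val > tol:                         # violated set: subgradient projection step
                    s = subgrad(x)
                    x = x - lam * val / (s @ s) * s
                    moved = True
            if not moved:                             # feasible for every set: done
                break
        return x

    def disk(center, radius):
        c = np.asarray(center, dtype=float)
        return (lambda x: np.linalg.norm(x - c) - radius,      # g(x) <= 0 inside the disk
                lambda x: (x - c) / np.linalg.norm(x - c))     # a subgradient of g at x != c

    constraints = [disk((0.0, 0.0), 2.0), disk((1.5, 0.0), 1.0), disk((0.8, 0.9), 1.2)]
    x = cyclic_subgradient_projection(np.array([5.0, 5.0]), constraints)
    print("point:", x, "violations:", [round(g(x), 6) for g, _ in constraints])
    ```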

  20. A novel heterogeneous training sample selection method on space-time adaptive processing

    NASA Astrophysics Data System (ADS)

    Wang, Qiang; Zhang, Yongshun; Guo, Yiduo

    2018-04-01

    The ground target detection performance of space-time adaptive processing (STAP) degrades when the clutter power becomes non-homogeneous because training samples are contaminated by target-like signals. To solve this problem, a novel non-homogeneous training sample selection method based on sample similarity is proposed, which converts the training sample selection into a convex optimization problem. First, the existing deficiencies of sample selection using the generalized inner product (GIP) are analyzed. Second, the similarities of different training samples are obtained by calculating the mean Hausdorff distance so as to reject the contaminated training samples. Third, the cell under test (CUT) and the residual training samples are projected into the orthogonal subspace of the target in the CUT, and the mean Hausdorff distances between the projected CUT and training samples are calculated. Fourth, the distances are sorted by value, and the training samples with larger values are preferentially selected to realize dimension reduction. Finally, simulation results with Mountain-Top data verify the effectiveness of the proposed method.
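
    As a small illustration of the similarity measure used above, the sketch below computes a mean Hausdorff distance between two samples treated as point sets; the STAP-specific steps (GIP screening and projection onto the orthogonal subspace of the target in the CUT) are omitted, and the 2-D point sets stand in for the actual space-time snapshot vectors.

```python
# Sketch: mean Hausdorff distance between two samples treated as point sets,
# the similarity measure used above to reject contaminated training samples.
import numpy as np
from scipy.spatial.distance import cdist

def mean_hausdorff(a, b):
    """Average of the two directed mean nearest-neighbour distances."""
    d = cdist(a, b)                      # pairwise Euclidean distances
    forward = d.min(axis=1).mean()       # mean distance from a to b
    backward = d.min(axis=0).mean()      # mean distance from b to a
    return 0.5 * (forward + backward)

rng = np.random.default_rng(1)
cut = rng.normal(size=(64, 2))                  # stand-in for the cell under test
clean = rng.normal(size=(64, 2))                # homogeneous training sample
contaminated = clean + np.array([3.0, 0.0])     # target-like contamination

print("clean       :", mean_hausdorff(cut, clean))
print("contaminated:", mean_hausdorff(cut, contaminated))
```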

  1. Artisticc: An Art and Science Integration Project to Enquire into Community Level Adaptation to Climate Change

    NASA Astrophysics Data System (ADS)

    Vanderlinden, J. P.; Baztan, J.

    2014-12-01

    The purpose of this paper is to present the "Adaptation Research, a Transdisciplinary community and policy centered approach" (ARTisticc) project. ARTisticc's goal is to apply innovative, standardized, transdisciplinary art and science integrative approaches to foster community-centred adaptation to climate change that is socially, culturally and scientifically robust. The approach used in the project is based on the strong understanding that adaptation is: (a) still "a concept of uncertain form"; (b) a concept dealing with uncertainty; (c) a concept that calls for an analysis that goes beyond the traditional disciplinary organization of science; and (d) an unconventional process in the realm of science and policy integration. The project is centered on case studies in France, Greenland, Russia, India, Canada, Alaska, and Senegal. At every site we jointly develop artwork while analyzing how natural science, essentially the geosciences, can be used in order to adapt better in the future, how societies adapt to current changes, and how memories of past adaptations frame current and future processes. Artforms are mobilized in order to share scientific results with local communities and policy makers in a way that respects cultural specificities while empowering stakeholders; ARTisticc translates these "real life experiments" into stories and artwork that are meaningful to those affected by climate change. The scientific results and the culturally mediated productions will thereafter be used to co-construct, with NGOs and policy makers, policy briefs, i.e. robust and scientifically legitimate policy recommendations regarding coastal adaptation. This co-construction process will itself be analysed with the goal of increasing art's and science's performative functions in the universe of evidence-based policy making. The project involves scientists from the natural sciences, the social sciences and the humanities, as well as artists from the performing arts (playwrights

  2. On The Behavior of Subgradient Projections Methods for Convex Feasibility Problems in Euclidean Spaces

    PubMed Central

    Butnariu, Dan; Censor, Yair; Gurfil, Pini; Hadar, Ethan

    2010-01-01

    We study some methods of subgradient projections for solving a convex feasibility problem with general (not necessarily hyperplanes or half-spaces) convex sets in the inconsistent case and propose a strategy that controls the relaxation parameters in a specific self-adapting manner. This strategy leaves enough user-flexibility but gives a mathematical guarantee for the algorithm’s behavior in the inconsistent case. We present numerical results of computational experiments that illustrate the computational advantage of the new method. PMID:20182556

  3. Sliding Window Generalized Kernel Affine Projection Algorithm Using Projection Mappings

    NASA Astrophysics Data System (ADS)

    Slavakis, Konstantinos; Theodoridis, Sergios

    2008-12-01

    Very recently, a solution to the kernel-based online classification problem has been given by the adaptive projected subgradient method (APSM). The developed algorithm can be considered as a generalization of a kernel affine projection algorithm (APA) and the kernel normalized least mean squares (NLMS). Furthermore, sparsification of the resulting kernel series expansion was achieved by imposing a closed ball (convex set) constraint on the norm of the classifiers. This paper presents another sparsification method for the APSM approach to the online classification task by generating a sequence of linear subspaces in a reproducing kernel Hilbert space (RKHS). To cope with the inherent memory limitations of online systems and to embed tracking capabilities to the design, an upper bound on the dimension of the linear subspaces is imposed. The underlying principle of the design is the notion of projection mappings. Classification is performed by metric projection mappings, sparsification is achieved by orthogonal projections, while the online system's memory requirements and tracking are attained by oblique projections. The resulting sparsification scheme shows strong similarities with the classical sliding window adaptive schemes. The proposed design is validated by the adaptive equalization problem of a nonlinear communication channel, and is compared with classical and recent stochastic gradient descent techniques, as well as with the APSM's solution where sparsification is performed by a closed ball constraint on the norm of the classifiers.

  4. Adaptive reconnection-based arbitrary Lagrangian Eulerian method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bo, Wurigen; Shashkov, Mikhail

    We present a new adaptive Arbitrary Lagrangian Eulerian (ALE) method. This method is based on the reconnection-based ALE (ReALE) methodology of Refs. [35], [34] and [6]. The main elements in a standard ReALE method are: an explicit Lagrangian phase on an arbitrary polygonal (in 2D) mesh in which the solution and positions of grid nodes are updated; a rezoning phase in which a new grid is defined by changing the connectivity (using Voronoi tessellation) but not the number of cells; and a remapping phase in which the Lagrangian solution is transferred onto the new grid. Furthermore, in the standard ReALE method, the rezoned mesh is smoothed by using one or several steps toward centroidal Voronoi tessellation, but it is not adapted to the solution in any way.

  5. Adaptive reconnection-based arbitrary Lagrangian Eulerian method

    DOE PAGES

    Bo, Wurigen; Shashkov, Mikhail

    2015-07-21

    We present a new adaptive Arbitrary Lagrangian Eulerian (ALE) method. This method is based on the reconnection-based ALE (ReALE) methodology of Refs. [35], [34] and [6]. The main elements in a standard ReALE method are: an explicit Lagrangian phase on an arbitrary polygonal (in 2D) mesh in which the solution and positions of grid nodes are updated; a rezoning phase in which a new grid is defined by changing the connectivity (using Voronoi tessellation) but not the number of cells; and a remapping phase in which the Lagrangian solution is transferred onto the new grid. Furthermore, in the standard ReALE method, the rezoned mesh is smoothed by using one or several steps toward centroidal Voronoi tessellation, but it is not adapted to the solution in any way.

  6. Adaptive and dynamic meshing methods for numerical simulations

    NASA Astrophysics Data System (ADS)

    Acikgoz, Nazmiye

    -hoc application of the simulated annealing technique, which improves the likelihood of removing poor elements from the grid. Moreover, a local implementation of the simulated annealing is proposed to reduce the computational cost. Many challenging multi-physics and multi-field problems that are unsteady in nature are characterized by moving boundaries and/or interfaces. When the boundary displacements are large, which typically occurs when implicit time marching procedures are used, degenerate elements are easily formed in the grid such that frequent remeshing is required. To deal with this problem, in the second part of this work, we propose a new r-adaptation methodology. The new technique is valid for both simplicial (e.g., triangular, tet) and non-simplicial (e.g., quadrilateral, hex) deforming grids that undergo large imposed displacements at their boundaries. A two- or three-dimensional grid is deformed using a network of linear springs composed of edge springs and a set of virtual springs. The virtual springs are constructed in such a way as to oppose element collapsing. This is accomplished by confining each vertex to its ball through springs that are attached to the vertex and its projection on the ball entities. The resulting linear problem is solved using a preconditioned conjugate gradient method. The new method is compared with the classical spring analogy technique in two- and three-dimensional examples, highlighting the performance improvements achieved by the new method. Meshes are an important part of numerical simulations. Depending on the geometry and flow conditions, the most suitable mesh for each particular problem is different. Meshes are usually generated by either using a suitable software package or solving a PDE. In both cases, engineering intuition plays a significant role in deciding where clusterings should take place. In addition, for unsteady problems, the gradients vary for each time step, which requires frequent remeshing during simulations

  7. Modified signal-to-noise: a new simple and practical gene filtering approach based on the concept of projective adaptive resonance theory (PART) filtering method.

    PubMed

    Takahashi, Hiro; Honda, Hiroyuki

    2006-07-01

    Considering the recent advances in and the benefits of DNA microarray technologies, many gene filtering approaches have been employed for the diagnosis and prognosis of diseases. In our previous study, we developed a new filtering method, namely, the projective adaptive resonance theory (PART) filtering method. This method was effective in subclass discrimination. In the PART algorithm, the genes with a low variance in gene expression in either class, not both classes, were selected as important genes for modeling. Based on this concept, we developed novel simple filtering methods such as the modified signal-to-noise (S2N') in the present study. The discrimination model constructed using these methods showed higher accuracy with higher reproducibility compared with many conventional filtering methods, including the t-test, S2N, NSC and SAM. The reproducibility of prediction was evaluated based on the correlation between the sets of U-test p-values on randomly divided datasets. With respect to leukemia, lymphoma and breast cancer, the correlation was high; a difference of >0.13 was obtained by the model constructed using <50 genes selected by S2N'. The improvement was greater when fewer genes were used, and this higher correlation also held in comparison with the t-test, NSC and SAM. These results suggest that these modified methods, such as S2N', have high potential to function as new methods for marker gene selection in cancer diagnosis using DNA microarray data. Software is available upon request.
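
    The sketch below shows the classical signal-to-noise score together with one plausible modified variant; the exact S2N' formula is not given in this record, so the use of the smaller class standard deviation is an assumption motivated by the statement that low variance in either class (not both) is favoured.

```python
# Sketch: standard signal-to-noise (S2N) gene score and one plausible modified
# variant (S2N'). The S2N' definition here is an illustrative assumption, not
# the published formula.
import numpy as np

def s2n(x1, x2):
    """Classical signal-to-noise score per gene (columns are genes)."""
    return (x1.mean(axis=0) - x2.mean(axis=0)) / (
        x1.std(axis=0, ddof=1) + x2.std(axis=0, ddof=1))

def s2n_modified(x1, x2):
    """Illustrative S2N' variant: penalise only the smaller class variance."""
    s_min = np.minimum(x1.std(axis=0, ddof=1), x2.std(axis=0, ddof=1))
    return (x1.mean(axis=0) - x2.mean(axis=0)) / (2.0 * s_min)

rng = np.random.default_rng(0)
class1 = rng.normal(loc=0.0, scale=1.0, size=(20, 1000))   # 20 samples x 1000 genes
class2 = rng.normal(loc=0.3, scale=1.0, size=(25, 1000))

scores = np.abs(s2n_modified(class1, class2))
top50 = np.argsort(scores)[::-1][:50]       # marker genes used for the classifier
print(top50[:10])
```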

  8. Stanovich's arguments against the "adaptive rationality" project: An assessment.

    PubMed

    Polonioli, Andrea

    2015-02-01

    This paper discusses Stanovich's appeal to individual differences in reasoning and decision-making to undermine the "adaptive rationality" project put forth by Gigerenzer and his co-workers. I discuss two different arguments based on Stanovich's research. First, heterogeneity in the use of heuristics seems to be at odds with the adaptationist background of the project. Second, the existence of correlations between cognitive ability and susceptibility to cognitive bias suggests that the "standard picture of rationality" (Stein, 1996, 4) is normatively adequate. I argue that, as matters stand, none of the arguments can be seen as fully compelling. Nevertheless, my discussion is not only critical of Stanovich's research, as I also show that (and how) his research can push forward the so-called "rationality debate" by encouraging greater theoretical and experimental work. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Wavelet methods in multi-conjugate adaptive optics

    NASA Astrophysics Data System (ADS)

    Helin, T.; Yudytskiy, M.

    2013-08-01

    The next generation ground-based telescopes rely heavily on adaptive optics for overcoming the limitation of atmospheric turbulence. In the future adaptive optics modalities, like multi-conjugate adaptive optics (MCAO), atmospheric tomography is the major mathematical and computational challenge. In this severely ill-posed problem, a fast and stable reconstruction algorithm is needed that can take into account many real-life phenomena of telescope imaging. We introduce a novel reconstruction method for the atmospheric tomography problem and demonstrate its performance and flexibility in the context of MCAO. Our method is based on using locality properties of compactly supported wavelets, both in the spatial and frequency domains. The reconstruction in the atmospheric tomography problem is obtained by solving the Bayesian MAP estimator with a conjugate-gradient-based algorithm. An accelerated algorithm with preconditioning is also introduced. Numerical performance is demonstrated on the official end-to-end simulation tool OCTOPUS of European Southern Observatory.

  10. Solution-adaptive finite element method in computational fracture mechanics

    NASA Technical Reports Server (NTRS)

    Min, J. B.; Bass, J. M.; Spradley, L. W.

    1993-01-01

    Some recent results obtained using solution-adaptive finite element method in linear elastic two-dimensional fracture mechanics problems are presented. The focus is on the basic issue of adaptive finite element method for validating the applications of new methodology to fracture mechanics problems by computing demonstration problems and comparing the stress intensity factors to analytical results.

  11. Project Lifespan-based Nonstationary Hydrologic Design Methods for Changing Environment

    NASA Astrophysics Data System (ADS)

    Xiong, L.

    2017-12-01

    Under a changing environment, design floods must be associated with the design life period of a project to ensure that the hydrologic design is truly relevant to the project's operation, because the design value for a given exceedance probability over the project life period would differ significantly from that over other time periods of the same length due to the nonstationarity of probability distributions. Several hydrologic design methods that take the design life period of projects into account have been proposed in recent years, i.e. the expected number of exceedances (ENE), design life level (DLL), equivalent reliability (ER), and average design life level (ADLL). Among the four methods to be compared, the ENE and ER methods are return period-based, while DLL and ADLL are risk/reliability-based methods that estimate design values for given probability values of risk or reliability. However, the four methods can be unified under a general framework through a relationship that transforms the so-called representative reliability (RRE) into the return period, i.e. m = 1/(1 - RRE), in which the return period m is computed from the representative reliability RRE. The nonstationary design quantiles and associated confidence intervals calculated by ENE, ER and ADLL were very similar, since ENE and ER are either special cases of, or have similar expression forms to, ADLL. In particular, the design quantiles calculated by ENE and ADLL were the same when the return period was equal to the length of the design life. In addition, DLL can yield similar design values if the relationship between DLL and ER/ADLL return periods is considered. Furthermore, ENE, ER and ADLL adapted well to either an increasing or a decreasing situation, yielding design quantiles that are neither too large nor too small. This is important for applications of nonstationary hydrologic design methods in actual practice because of the concern of choosing the emerging
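
    A tiny illustration of the unifying transform, assuming only the relationship stated above:

```python
# Tiny illustration of the unifying transform m = 1 / (1 - RRE) between the
# representative reliability over the design life and an equivalent return period.
def return_period(rre: float) -> float:
    return 1.0 / (1.0 - rre)

for rre in (0.90, 0.98, 0.99):
    print(f"RRE = {rre:.2f}  ->  equivalent return period m = {return_period(rre):.0f} years")
```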

  12. Projected trends in high-mortality heatwaves under different scenarios of climate, population, and adaptation in 82 US communities.

    PubMed

    Anderson, G Brooke; Oleson, Keith W; Jones, Bryan; Peng, Roger D

    2018-02-01

    Some rare heatwaves have extreme daily mortality impacts; moderate heatwaves have lower daily impacts but occur much more frequently at present and so account for large aggregated impacts. We applied health-based models to project trends in high-mortality heatwaves, including the proportion of all heatwaves expected to be high-mortality, using the definition that a high-mortality heatwave increases mortality risk by ≥20 %. We projected these trends in 82 US communities in 2061-2080 under two scenarios of climate change (RCP4.5, RCP8.5), two scenarios of population change (SSP3, SSP5), and three scenarios of community adaptation to heat (none, lagged, on-pace) for large- and medium-ensemble versions of the National Center for Atmospheric Research's Community Earth System Model. More high-mortality heatwaves were expected compared to present under all scenarios except on-pace adaptation, and population exposure was expected to increase under all scenarios. At least seven more high-mortality heatwaves were expected in a twenty-year period in the 82 study communities under RCP8.5 than RCP4.5 when assuming no adaptation. However, high-mortality heatwaves were expected to remain <1 % of all heatwaves and heatwave exposure under all scenarios. Projections were most strongly influenced by the adaptation scenario: going from a scenario of on-pace to lagged adaptation, or from lagged to no adaptation, more than doubled the projected number of and exposure to high-mortality heatwaves. Based on our results, fewer high-mortality heatwaves are expected when following RCP4.5 versus RCP8.5 and under higher levels of adaptation, but high-mortality heatwaves are expected to remain a very small proportion of total heatwave exposure.

  13. 3D SAPIV particle field reconstruction method based on adaptive threshold.

    PubMed

    Qu, Xiangju; Song, Yang; Jin, Ying; Li, Zhenhua; Wang, Xuezhen; Guo, ZhenYan; Ji, Yunjing; He, Anzhi

    2018-03-01

    Particle image velocimetry (PIV) is a necessary flow field diagnostic technique that provides instantaneous velocimetry information non-intrusively. Three-dimensional (3D) PIV methods can supply a full understanding of a 3D structure, the complete stress tensor, and the vorticity vector in complex flows. In synthetic aperture particle image velocimetry (SAPIV), the flow field can be measured at high particle densities by different cameras imaging from the same direction. During SAPIV particle reconstruction, particles are commonly reconstructed by manually setting a threshold to filter out unfocused particles in the refocused images. In this paper, the particle intensity distribution in refocused images is analyzed, and a SAPIV particle field reconstruction method based on an adaptive threshold is presented. By using the adaptive threshold to filter the 3D measurement volume as a whole, the three-dimensional location information of the focused particles can be reconstructed. The cross correlations between images captured by the cameras and images projected from the reconstructed particle field are calculated for different threshold values. The optimal threshold is determined by cubic curve fitting and is defined as the threshold value that causes the correlation coefficient to reach its maximum. A numerical simulation of a 16-camera array and a particle field at two adjacent time events quantitatively evaluates the performance of the proposed method. An experimental system consisting of an array of 16 cameras was used to reconstruct four adjacent frames in a vortex flow field. The results show that the proposed reconstruction method can effectively reconstruct the 3D particle fields.
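
    The sketch below illustrates just the threshold-selection step: a correlation score is evaluated for a set of candidate thresholds, a cubic is fitted, and the threshold maximising the fit is taken. The refocusing and reprojection pipeline is replaced by a placeholder score function, which is an assumption for illustration only.

```python
# Sketch of the adaptive-threshold selection step: scan candidate thresholds,
# fit a cubic to the correlation scores, take the maximiser of the fit. The
# score function below is a synthetic placeholder, not a real reprojection.
import numpy as np

def correlation_score(threshold):
    """Placeholder for the cross-correlation between captured camera images and
    images re-projected from the particle field reconstructed at this threshold."""
    return -((threshold - 0.42) ** 2) + 0.9      # synthetic, peaked near 0.42

candidates = np.linspace(0.1, 0.9, 17)
scores = np.array([correlation_score(t) for t in candidates])

# Cubic fit of score vs. threshold, then pick the threshold maximising the fit.
coeffs = np.polyfit(candidates, scores, deg=3)
fine = np.linspace(candidates.min(), candidates.max(), 1001)
fitted = np.polyval(coeffs, fine)
optimal_threshold = fine[np.argmax(fitted)]
print("optimal threshold ~", optimal_threshold)
```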

  14. Adaptive Set-Based Methods for Association Testing

    PubMed Central

    Su, Yu-Chen; Gauderman, W. James; Kiros, Berhane; Lewinger, Juan Pablo

    2017-01-01

    With a typical sample size of a few thousand subjects, a single genomewide association study (GWAS) using traditional one-SNP-at-a-time methods can only detect genetic variants conferring a sizable effect on disease risk. Set-based methods, which analyze sets of SNPs jointly, can detect variants with smaller effects acting within a gene, a pathway, or other biologically relevant sets. While self-contained set-based methods (those that test sets of variants without regard to variants not in the set) are generally more powerful than competitive set-based approaches (those that rely on comparison of variants in the set of interest with variants not in the set), there is no consensus as to which self-contained methods are best. In particular, several self-contained set tests have been proposed to directly or indirectly ‘adapt’ to the a priori unknown proportion and distribution of effects of the truly associated SNPs in the set, which is a major determinant of their power. A popular adaptive set-based test is the adaptive rank truncated product (ARTP), which seeks the set of SNPs that yields the best-combined evidence of association. We compared the standard ARTP, several ARTP variations we introduced, and other adaptive methods in a comprehensive simulation study to evaluate their performance. We used permutations to assess significance for all the methods and thus provide a level playing field for comparison. We found the standard ARTP test to have the highest power across our simulations followed closely by the global model of random effects (GMRE) and a LASSO based test. PMID:26707371
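
    A compact permutation-based sketch of the adaptive rank truncated product idea is given below: the k smallest per-SNP p-values are combined for several truncation points, the method adapts by taking the most significant combination, and the adaptive minimum is assessed against the same permutations. The per-SNP p-value computation and many refinements of the published ARTP procedure are simplified away.

```python
# Compact permutation sketch of the adaptive rank truncated product (ARTP) idea.
import numpy as np

rng = np.random.default_rng(0)

def rank_truncated_stat(pvals, k):
    """-log product of the k smallest p-values (larger = stronger evidence)."""
    return -np.log(np.sort(pvals)[:k]).sum()

def artp_pvalue(p_obs, p_null, truncation_points=(1, 5, 10, 20)):
    """p_obs: (n_snps,) observed p-values; p_null: (B, n_snps) permutation p-values."""
    all_p = np.vstack([p_obs, p_null])              # row 0 = observed data
    # per-row, per-truncation-point combined statistics
    stats = np.array([[rank_truncated_stat(row, k) for k in truncation_points]
                      for row in all_p])
    # convert each column of statistics to empirical p-values across rows
    ranks = np.array([(stats[:, j][:, None] <= stats[:, j][None, :]).mean(axis=1)
                      for j in range(stats.shape[1])]).T
    adaptive = ranks.min(axis=1)                    # adapt over truncation points
    return (adaptive <= adaptive[0]).mean()         # significance of the minimum

n_snps, B = 50, 999
p_obs = rng.uniform(size=n_snps)
p_obs[:3] = rng.uniform(0, 0.01, size=3)            # a few truly associated SNPs
p_null = rng.uniform(size=(B, n_snps))
print("ARTP set-level p-value ~", artp_pvalue(p_obs, p_null))
```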

  15. Reflections on the Adaptive Designs Accelerating Promising Trials Into Treatments (ADAPT-IT) Process—Findings from a Qualitative Study

    PubMed Central

    Guetterman, Timothy C.; Fetters, Michael D.; Legocki, Laurie J.; Mawocha, Samkeliso; Barsan, William G.; Lewis, Roger J.; Berry, Donald A.; Meurer, William J.

    2015-01-01

    Context The context for this study was the Adaptive Designs Accelerating Promising Trials Into Treatments (ADAPT-IT) project, which aimed to incorporate flexible adaptive designs into pivotal clinical trials and to conduct an assessment of the trial development process. Little research provides guidance to academic institutions in planning adaptive trials. Objectives The purpose of this qualitative study was to explore the perspectives and experiences of stakeholders as they reflected on the interactive ADAPT-IT adaptive design development process, and to understand their perspectives regarding lessons learned about the design of the trials and trial development. Materials and methods We conducted semi-structured interviews with ten key stakeholders and observations of the process. We employed qualitative thematic text data analysis to reduce the data into themes about the ADAPT-IT project and adaptive clinical trials. Results The qualitative analysis revealed four themes: education of the project participants, how the process evolved with participant feedback, procedures that could enhance the development of other trials, and education of the broader research community. Discussion and conclusions While participants became more likely to consider flexible adaptive designs, additional education is needed to both understand the adaptive methodology and articulate it when planning trials. PMID:26622163

  16. Project Delivery Methods.

    ERIC Educational Resources Information Center

    Dolan, Thomas G.

    2003-01-01

    Describes project delivery methods that are replacing the traditional Design/Bid/Build linear approach to the management, design, and construction of new facilities. These variations can enhance construction management and teamwork. (SLD)

  17. What is the Value Added to Adaptation Planning by Probabilistic Projections of Climate Change?

    NASA Astrophysics Data System (ADS)

    Wilby, R. L.

    2008-12-01

    Probabilistic projections of climate change offer new sources of risk information to support regional impacts assessment and adaptation options appraisal. However, questions continue to surround how best to apply these scenarios in a practical context, and whether the added complexity and computational burden leads to more robust decision-making. This paper provides an overview of recent efforts in the UK to 'bench-test' frameworks for employing probabilistic projections ahead of the release of the next-generation UKCIP08 projections (in November 2008). This involves close collaboration between government agencies, research and stakeholder communities. Three examples will be cited to illustrate how probabilistic projections are already informing decisions about future flood risk management in London, water resource planning in trial river basins, and assessments of risks from rising water temperatures to Atlantic salmon stocks in southern England. When compared with conventional deterministic scenarios, ensemble projections allow exploration of a wider range of management options and highlight timescales for implementing adaptation measures. Users of probabilistic scenarios must keep in mind that other uncertainties (e.g., due to impacts model structure and parameterisation) should be handled in an equally rigorous way to those arising from climate models and emission scenarios. Finally, it is noted that a commitment to long-term monitoring is also critical for tracking environmental change, testing model projections, and evaluating the success (or not) of any scenario-led interventions.

  18. Cultural adaptation and translation of measures: an integrated method.

    PubMed

    Sidani, Souraya; Guruge, Sepali; Miranda, Joyal; Ford-Gilboe, Marilyn; Varcoe, Colleen

    2010-04-01

    Differences in the conceptualization and operationalization of health-related concepts may exist across cultures. Such differences underscore the importance of examining conceptual equivalence when adapting and translating instruments. In this article, we describe an integrated method for exploring conceptual equivalence within the process of adapting and translating measures. The integrated method involves five phases including selection of instruments for cultural adaptation and translation; assessment of conceptual equivalence, leading to the generation of a set of items deemed to be culturally and linguistically appropriate to assess the concept of interest in the target community; forward translation; back translation (optional); and pre-testing of the set of items. Strengths and limitations of the proposed integrated method are discussed. (c) 2010 Wiley Periodicals, Inc.

  19. Application of Bounded Linear Stability Analysis Method for Metrics-Driven Adaptive Control

    NASA Technical Reports Server (NTRS)

    Bakhtiari-Nejad, Maryam; Nguyen, Nhan T.; Krishnakumar, Kalmanje

    2009-01-01

    This paper presents the application of the Bounded Linear Stability Analysis (BLSA) method to metrics-driven adaptive control. The bounded linear stability analysis method is used for analyzing the stability of adaptive control models without linearizing the adaptive laws. Metrics-driven adaptive control introduces the notion that adaptation should be driven by stability metrics to achieve robustness. By applying the bounded linear stability analysis method, the adaptive gain is adjusted during adaptation in order to meet certain phase margin requirements. The metrics-driven adaptive control is evaluated for a second-order system that represents the pitch attitude control of a generic transport aircraft. The analysis shows that the system with the metrics-conforming variable adaptive gain becomes more robust to unmodeled dynamics or time delay. The effect of the analysis time window for BLSA on meeting the stability margin criteria is also evaluated.

  20. Multi-Role Project (MRP): A New Project-Based Learning Method for STEM

    ERIC Educational Resources Information Center

    Warin, Bruno; Talbi, Omar; Kolski, Christophe; Hoogstoel, Frédéric

    2016-01-01

    This paper presents the "Multi-Role Project" method (MRP), a broadly applicable project-based learning method, and describes its implementation and evaluation in the context of a Science, Technology, Engineering, and Mathematics (STEM) course. The MRP method is designed around a meta-principle that considers the project learning activity…

  1. Moving and adaptive grid methods for compressible flows

    NASA Technical Reports Server (NTRS)

    Trepanier, Jean-Yves; Camarero, Ricardo

    1995-01-01

    This paper describes adaptive grid methods developed specifically for compressible flow computations. The basic flow solver is a finite-volume implementation of Roe's flux difference splitting scheme on arbitrarily moving unstructured triangular meshes. The grid adaptation is performed according to geometric and flow requirements. Some results are included to illustrate the potential of the methodology.

  2. Adaptive Discontinuous Galerkin Methods in Multiwavelets Bases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Archibald, Richard K; Fann, George I; Shelton Jr, William Allison

    2011-01-01

    We use a multiwavelet basis with the Discontinuous Galerkin (DG) method to produce a multi-scale DG method. We apply this Multiwavelet DG method to convection and convection-diffusion problems in multiple dimensions. Merging the DG method with multiwavelets allows the adaptivity in the DG method to be resolved through manipulation of multiwavelet coefficients rather than grid manipulation. Additionally, the Multiwavelet DG method is tested on non-linear equations in one dimension and on the cubed sphere.

  3. Comparing Anisotropic Output-Based Grid Adaptation Methods by Decomposition

    NASA Technical Reports Server (NTRS)

    Park, Michael A.; Loseille, Adrien; Krakos, Joshua A.; Michal, Todd

    2015-01-01

    Anisotropic grid adaptation is examined by decomposing the steps of flow solution, adjoint solution, error estimation, metric construction, and simplex grid adaptation. Multiple implementations of each of these steps are evaluated by comparison to each other and expected analytic results when available. For example, grids are adapted to analytic metric fields and grid measures are computed to illustrate the properties of multiple independent implementations of grid adaptation mechanics. Different implementations of each step in the adaptation process can be evaluated in a system where the other components of the adaptive cycle are fixed. Detailed examination of these properties allows comparison of different methods to identify the current state of the art and where further development should be targeted.

  4. Health diplomacy the adaptation of global health interventions to local needs in sub-Saharan Africa and Thailand: Evaluating findings from Project Accept (HPTN 043)

    PubMed Central

    2012-01-01

    Background Study-based global health interventions, especially those that are conducted on an international or multi-site basis, frequently require site-specific adaptations in order to (1) respond to socio-cultural differences in risk determinants, (2) to make interventions more relevant to target population needs, and (3) in recognition of ‘global health diplomacy' issues. We report on the adaptations development, approval and implementation process from the Project Accept voluntary counseling and testing, community mobilization and post-test support services intervention. Methods We reviewed all relevant documentation collected during the study intervention period (e.g. monthly progress reports; bi-annual steering committee presentations) and conducted a series of semi-structured interviews with project directors and between 12 and 23 field staff at each study site in South Africa, Zimbabwe, Thailand and Tanzania during 2009. Respondents were asked to describe (1) the adaptations development and approval process and (2) the most successful site-specific adaptations from the perspective of facilitating intervention implementation. Results Across sites, proposed adaptations were identified by field staff and submitted to project directors for review on a formally planned basis. The cross-site intervention sub-committee then ensured fidelity to the study protocol before approval. Successfully-implemented adaptations included: intervention delivery adaptations (e.g. development of tailored counseling messages for immigrant labour groups in South Africa) political, environmental and infrastructural adaptations (e.g. use of local community centers as VCT venues in Zimbabwe); religious adaptations (e.g. dividing clients by gender in Muslim areas of Tanzania); economic adaptations (e.g. co-provision of income generating skills classes in Zimbabwe); epidemiological adaptations (e.g. provision of ‘youth-friendly’ services in South Africa, Zimbabwe and Tanzania), and

  5. User-Adaptable Microcomputer Graphics Software for Life Science Instruction. Final Project Report.

    ERIC Educational Resources Information Center

    Spain, James D.

    The objective of the SUMIT project was to develop, evaluate, and disseminate 20 course modules (microcomputer programs) for instruction in general biology and ecology. To encourage broad utilization, the programs were designed for the Apple II microcomputer and written in Applesoft Basic with a user-adaptable format. Each package focused on a key…

  6. Investigation of the Multiple Method Adaptive Control (MMAC) method for flight control systems

    NASA Technical Reports Server (NTRS)

    Athans, M.; Baram, Y.; Castanon, D.; Dunn, K. P.; Green, C. S.; Lee, W. H.; Sandell, N. R., Jr.; Willsky, A. S.

    1979-01-01

    The stochastic adaptive control of the NASA F-8C digital-fly-by-wire aircraft using the multiple model adaptive control (MMAC) method is presented. The selection of the performance criteria for the lateral and the longitudinal dynamics, the design of the Kalman filters for different operating conditions, the identification algorithm associated with the MMAC method, the control system design, and simulation results obtained using the real time simulator of the F-8 aircraft at the NASA Langley Research Center are discussed.
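
    The sketch below is a scalar toy illustration of the MMAC idea only: a bank of Kalman filters matched to different hypothesised plant models, posterior model probabilities updated from residual likelihoods, and a probability-weighted blend of per-model feedback gains. None of the F-8C models, performance criteria, or gains from the study are reproduced; all numbers are illustrative.

```python
# Toy scalar sketch of the MMAC idea: a bank of Kalman filters, residual-
# likelihood model probabilities, and probability-weighted control blending.
import numpy as np

rng = np.random.default_rng(0)

# Hypothesised plant models x_{k+1} = a_i x_k + b u_k + w,  y_k = x_k + v
models = [0.80, 0.95, 1.05]          # candidate 'a' values (true plant uses 0.95)
b, q, r = 1.0, 0.01, 0.04            # input gain, process and measurement noise
gains = [0.4, 0.7, 1.0]              # per-model feedback gains u = -K_i * x_hat_i

a_true, x, steps = 0.95, 2.0, 200
x_hat = np.zeros(len(models))
P = np.ones(len(models))
prob = np.full(len(models), 1.0 / len(models))

for _ in range(steps):
    # blended control from the probability-weighted per-model laws
    u = -sum(p * K * xh for p, K, xh in zip(prob, gains, x_hat))
    # true plant and noisy measurement
    x = a_true * x + b * u + rng.normal(scale=np.sqrt(q))
    y = x + rng.normal(scale=np.sqrt(r))
    # bank of Kalman filters and residual likelihoods
    like = np.empty(len(models))
    for i, a in enumerate(models):
        x_pred = a * x_hat[i] + b * u
        P_pred = a * P[i] * a + q
        s = P_pred + r                      # innovation variance
        res = y - x_pred                    # innovation (residual)
        k_gain = P_pred / s
        x_hat[i] = x_pred + k_gain * res
        P[i] = (1.0 - k_gain) * P_pred
        like[i] = np.exp(-0.5 * res ** 2 / s) / np.sqrt(2 * np.pi * s)
    prob = prob * like
    prob /= prob.sum()

print("posterior model probabilities:", np.round(prob, 3), " state:", round(x, 3))
```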

  7. [Exercise-referral to a specialist in adapted physical activity (APA) : a pilot project].

    PubMed

    Brugnerotto, Adeline; Cardinaux, Regula; Ueltschi, Yan; Bauwens, Marine; Nanchen, David; Cornuz, Jacques; Bize, Raphaël; Auer, Reto

    2016-11-02

    Family physicians have a key role in the promotion of physical activity, in particular in identifying and counseling persons who have a sedentary lifestyle. Some patients could benefit from intensive individual counseling. Physicians are often not aware of all physical activity promotion activities in the community that they could recommend their patients. In a pilot study, we have tested and adapted the referral of patients from family physicians to specialists in adapted physical activity (APAs). APAs are trained to assess and guide persons towards physical activities adapted to their needs and pathologies and thus towards an increase in physical activity. Pilot data suggest that, while few patients were oriented to the APAs in the pilot project, family physicians appreciate the possibility of collaborating with the APAs.

  8. Fast adaptive composite grid methods on distributed parallel architectures

    NASA Technical Reports Server (NTRS)

    Lemke, Max; Quinlan, Daniel

    1992-01-01

    The fast adaptive composite (FAC) grid method is compared with the asynchronous fast adaptive composite (AFAC) method under a variety of conditions, including vectorization and parallelization. Results are given for distributed-memory multiprocessor architectures (SUPRENUM, Intel iPSC/2 and iPSC/860). It is shown that the good performance of AFAC and its superiority over FAC in a parallel environment is a property of the algorithm and not dependent on peculiarities of any machine.

  9. Adaptive Flight Control Research at NASA

    NASA Technical Reports Server (NTRS)

    Motter, Mark A.

    2008-01-01

    A broad overview of current adaptive flight control research efforts at NASA is presented, as well as a more detailed discussion of selected specific approaches. The stated objective of the Integrated Resilient Aircraft Control Project, one of NASA's Aviation Safety programs, is to advance the state of the art of adaptive controls as a design option to provide enhanced stability and maneuverability margins for safe landing in the presence of adverse conditions such as actuator or sensor failures. Under this project, a number of adaptive control approaches are being pursued, including neural networks and multiple models. Validation of all the adaptive control approaches will use not only traditional methods such as simulation, wind tunnel testing and manned flight tests, but will be augmented with recently developed capabilities in unmanned flight testing.

  10. Agricultural Adaptation to Climate Change

    NASA Astrophysics Data System (ADS)

    Tam, A.; Jain, M.

    2016-12-01

    This research includes two projects pertaining to the adaptation of agricultural systems to climate change. The first project focuses on the wheat-growing regions of India. Wheat is a major staple crop, and many rural households and smallholder farmers rely on crop yields for survival. We examine the impacts of weather variability and groundwater depletion on agricultural systems, using geospatial and satellite-based analyses together with household- and census-based data sets. We use these methods to estimate crop yields and to identify which factors are associated with low- versus high-yielding regions. This can help identify strategies that should be further promoted to increase crop yields. The second project is a literature review. We conduct a meta-analysis and synthetic review of the literature on agricultural adaptation to climate change. We sort through numerous articles to identify and examine those that associate socio-economic, biophysical, and perceptional factors with farmers' adaptation to climate change. Our preliminary results show that researchers tend to associate few factors with a farmer's vulnerability and adaptive capacity, and that most of the research is concentrated in North America, whereas tropical regions that are highly vulnerable to weather variability are underrepresented in the literature. Neither research project has produced conclusive results so far.

  11. Online Sequential Projection Vector Machine with Adaptive Data Mean Update

    PubMed Central

    Chen, Lin; Jia, Ji-Ting; Zhang, Qiong; Deng, Wan-Yu; Wei, Wei

    2016-01-01

    We propose a simple online learning algorithm especially suited to high-dimensional data. The algorithm is referred to as the online sequential projection vector machine (OSPVM), which derives from the projection vector machine and can learn from data in one-by-one or chunk-by-chunk mode. In OSPVM, data centering, dimension reduction, and neural network training are integrated seamlessly. In particular, the model parameters including (1) the projection vectors for dimension reduction, (2) the input weights, biases, and output weights, and (3) the number of hidden nodes can be updated simultaneously. Moreover, only one parameter, the number of hidden nodes, needs to be determined manually, which makes it easy to use in real applications. Performance comparison was made on various high-dimensional classification problems for OSPVM against other fast online algorithms including the budgeted stochastic gradient descent (BSGD) approach, adaptive multihyperplane machine (AMM), primal estimated subgradient solver (Pegasos), online sequential extreme learning machine (OSELM), and SVD + OSELM (feature selection based on SVD is performed before OSELM). The results obtained demonstrated the superior generalization performance and efficiency of the OSPVM. PMID:27143958
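
    A tiny sketch of one ingredient named above, the adaptive (recursive) data-mean update used for online centering, is given below; the OSPVM updates of the projection vectors, network weights, and hidden-node count are not reproduced.

```python
# Sketch of the adaptive data-mean update: the running mean is updated
# recursively as each chunk arrives, so new data can be centred without
# revisiting old samples.
import numpy as np

class RunningMean:
    def __init__(self, dim):
        self.n = 0
        self.mean = np.zeros(dim)

    def update(self, chunk):
        chunk = np.atleast_2d(chunk)
        m = chunk.shape[0]
        self.mean = (self.n * self.mean + chunk.sum(axis=0)) / (self.n + m)
        self.n += m
        return chunk - self.mean          # chunk centred with the updated mean

rng = np.random.default_rng(0)
rm = RunningMean(dim=5)
for _ in range(10):                       # chunk-by-chunk arrival
    centred = rm.update(rng.normal(loc=3.0, size=(32, 5)))
print("running mean ~", np.round(rm.mean, 2))
```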

  12. Online Sequential Projection Vector Machine with Adaptive Data Mean Update.

    PubMed

    Chen, Lin; Jia, Ji-Ting; Zhang, Qiong; Deng, Wan-Yu; Wei, Wei

    2016-01-01

    We propose a simple online learning algorithm especially suited to high-dimensional data. The algorithm is referred to as the online sequential projection vector machine (OSPVM), which derives from the projection vector machine and can learn from data in one-by-one or chunk-by-chunk mode. In OSPVM, data centering, dimension reduction, and neural network training are integrated seamlessly. In particular, the model parameters including (1) the projection vectors for dimension reduction, (2) the input weights, biases, and output weights, and (3) the number of hidden nodes can be updated simultaneously. Moreover, only one parameter, the number of hidden nodes, needs to be determined manually, which makes it easy to use in real applications. Performance comparison was made on various high-dimensional classification problems for OSPVM against other fast online algorithms including the budgeted stochastic gradient descent (BSGD) approach, adaptive multihyperplane machine (AMM), primal estimated subgradient solver (Pegasos), online sequential extreme learning machine (OSELM), and SVD + OSELM (feature selection based on SVD is performed before OSELM). The results obtained demonstrated the superior generalization performance and efficiency of the OSPVM.

  13. Adaptive Nodal Transport Methods for Reactor Transient Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas Downar; E. Lewis

    2005-08-31

    Develop methods for adaptively treating the angular, spatial, and time dependence of the neutron flux in reactor transient analysis. These methods were demonstrated in the DOE transport nodal code VARIANT and the US NRC spatial kinetics code, PARCS.

  14. Integrated Modeling and Participatory Scenario Planning for Climate Adaptation: the Maui Groundwater Project

    NASA Astrophysics Data System (ADS)

    Keener, V. W.; Finucane, M.; Brewington, L.

    2014-12-01

    For the last century, the island of Maui, Hawaii, has been the center of environmental, agricultural, and legal conflict with respect to surface and groundwater allocation. Planning for adequate future freshwater resources requires flexible and adaptive policies that emphasize partnerships and knowledge transfer between scientists and non-scientists. In 2012 the Hawai'i state legislature passed the Climate Change Adaptation Priority Guidelines (Act 286) law requiring county and state policy makers to include island-wide climate change scenarios in their planning processes. This research details the ongoing work by researchers in the NOAA funded Pacific RISA to support the development of Hawaii's first island-wide water use plan under the new climate adaptation directive. This integrated project combines several models with participatory future scenario planning. The dynamically downscaled triply nested Hawaii Regional Climate Model (HRCM) was modified from the WRF community model and calibrated to simulate the many microclimates on the Hawaiian archipelago. For the island of Maui, the HRCM was validated using 20 years of hindcast data, and daily projections were created at a 1 km scale to capture the steep topography and diverse rainfall regimes. Downscaled climate data are input into a USGS hydrological model to quantify groundwater recharge. This model was previously used for groundwater management, and is being expanded utilizing future climate projections, current land use maps and future scenario maps informed by stakeholder input. Participatory scenario planning began in 2012 to bring together a diverse group of over 50 decision-makers in government, conservation, and agriculture to 1) determine the type of information they would find helpful in planning for climate change, and 2) develop a set of scenarios that represent alternative climate/management futures. This is an iterative process, resulting in flexible and transparent narratives at multiple scales

  15. Two Project Methods: Preliminary Observations on the Similarities and Differences between William Heard Kilpatrick's Project Method and John Dewey's Problem-Solving Method

    ERIC Educational Resources Information Center

    Sutinen, Ari

    2013-01-01

    The project method became a famous teaching method when William Heard Kilpatrick published his article "Project Method" in 1918. The key idea in Kilpatrick's project method is to try to explain how pupils learn things when they work in projects toward different common objects. The same idea of pupils learning by work or action in an…

  16. Wavelet-based adaptive thresholding method for image segmentation

    NASA Astrophysics Data System (ADS)

    Chen, Zikuan; Tao, Yang; Chen, Xin; Griffis, Carl

    2001-05-01

    A nonuniform background distribution may cause a global thresholding method to fail to segment objects. One solution is using a local thresholding method that adapts to local surroundings. In this paper, we propose a novel local thresholding method for image segmentation, using multiscale threshold functions obtained by wavelet synthesis with weighted detail coefficients. In particular, the coarse-to- fine synthesis with attenuated detail coefficients produces a threshold function corresponding to a high-frequency- reduced signal. This wavelet-based local thresholding method adapts to both local size and local surroundings, and its implementation can take advantage of the fast wavelet algorithm. We applied this technique to physical contaminant detection for poultry meat inspection using x-ray imaging. Experiments showed that inclusion objects in deboned poultry could be extracted at multiple resolutions despite their irregular sizes and uneven backgrounds.
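
    The sketch below illustrates the idea of a threshold surface obtained by attenuating wavelet detail coefficients (a high-frequency-reduced copy of the image) and segmenting against it; the wavelet, decomposition level, attenuation factor, and offset are illustrative choices, not the paper's.

```python
# Sketch: build a smoothly varying threshold surface by attenuating the wavelet
# detail coefficients, then segment by comparing the image against that surface.
import numpy as np
import pywt

def adaptive_wavelet_threshold(image, wavelet="db2", level=3, atten=0.1, offset=0.0):
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    # keep the approximation, attenuate every detail sub-band
    damped = [coeffs[0]] + [tuple(atten * d for d in detail) for detail in coeffs[1:]]
    surface = pywt.waverec2(damped, wavelet)
    surface = surface[: image.shape[0], : image.shape[1]]   # crop padding, if any
    return image > (surface + offset), surface

rng = np.random.default_rng(0)
background = np.linspace(0.0, 0.6, 256)[None, :] * np.ones((256, 1))  # uneven background
image = background + rng.normal(scale=0.02, size=(256, 256))
image[100:110, 40:50] += 0.3                                          # small inclusion
mask, surface = adaptive_wavelet_threshold(image, offset=0.1)
print("segmented pixels:", int(mask.sum()))
```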

  17. Adaptive methods, rolling contact, and nonclassical friction laws

    NASA Technical Reports Server (NTRS)

    Oden, J. T.

    1989-01-01

    Results and methods on three different areas of contemporary research are outlined. These include adaptive methods, the rolling contact problem for finite deformation of a hyperelastic or viscoelastic cylinder, and non-classical friction laws for modeling dynamic friction phenomena.

  18. The Formative Method for Adapting Psychotherapy (FMAP): A community-based developmental approach to culturally adapting therapy

    PubMed Central

    Hwang, Wei-Chin

    2010-01-01

    How do we culturally adapt psychotherapy for ethnic minorities? Although there has been growing interest in doing so, few therapy adaptation frameworks have been developed. The majority of these frameworks take a top-down theoretical approach to adapting psychotherapy. The purpose of this paper is to introduce a community-based developmental approach to modifying psychotherapy for ethnic minorities. The Formative Method for Adapting Psychotherapy (FMAP) is a bottom-up approach that involves collaborating with consumers to generate and support ideas for therapy adaptation. It involves five phases that target developing, testing, and reformulating therapy modifications. These phases include: (a) generating knowledge and collaborating with stakeholders, (b) integrating generated information with theory and empirical and clinical knowledge, (c) reviewing the initial culturally adapted clinical intervention with stakeholders and revising the culturally adapted intervention, (d) testing the culturally adapted intervention, and (e) finalizing the culturally adapted intervention. Application of the FMAP is illustrated using examples from a study adapting psychotherapy for Chinese Americans, but it can also be readily applied to modify therapy for other ethnic groups. PMID:20625458

  19. Stability and error estimation for Component Adaptive Grid methods

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph; Zhu, Xiaolei

    1994-01-01

    Component adaptive grid (CAG) methods for solving hyperbolic partial differential equations (PDE's) are discussed in this paper. Applying recent stability results for a class of numerical methods on uniform grids, the convergence of these methods for linear problems on component adaptive grids is established here. Furthermore, the computational error can be estimated on CAG's using the stability results. Using these estimates, the error can be controlled on CAG's. Thus, the solution can be computed efficiently on CAG's within a given error tolerance. Computational results for time-dependent linear problems in one and two space dimensions are presented.

  20. Capillary Electrophoresis Sensitivity Enhancement Based on Adaptive Moving Average Method.

    PubMed

    Drevinskas, Tomas; Telksnys, Laimutis; Maruška, Audrius; Gorbatsova, Jelena; Kaljurand, Mihkel

    2018-06-05

    In the present work, we demonstrate a novel approach to improve the sensitivity of the "out of lab" portable capillary electrophoretic measurements. Nowadays, many signal enhancement methods are (i) underused (nonoptimal), (ii) overused (distorts the data), or (iii) inapplicable in field-portable instrumentation because of a lack of computational power. The described innovative migration velocity-adaptive moving average method uses an optimal averaging window size and can be easily implemented with a microcontroller. The contactless conductivity detection was used as a model for the development of a signal processing method and the demonstration of its impact on the sensitivity. The frequency characteristics of the recorded electropherograms and peaks were clarified. Higher electrophoretic mobility analytes exhibit higher-frequency peaks, whereas lower electrophoretic mobility analytes exhibit lower-frequency peaks. On the basis of the obtained data, a migration velocity-adaptive moving average algorithm was created, adapted, and programmed into capillary electrophoresis data-processing software. Employing the developed algorithm, each data point is processed depending on a certain migration time of the analyte. Because of the implemented migration velocity-adaptive moving average method, the signal-to-noise ratio improved up to 11 times for sampling frequency of 4.6 Hz and up to 22 times for sampling frequency of 25 Hz. This paper could potentially be used as a methodological guideline for the development of new smoothing algorithms that require adaptive conditions in capillary electrophoresis and other separation methods.
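
    The sketch below illustrates the core idea of an adaptive moving average whose window grows with migration time, so that slower, broader peaks are smoothed with wider windows; the specific window-sizing rule and the synthetic electropherogram are assumptions for illustration.

```python
# Sketch of a migration velocity-adaptive moving average: each data point is
# averaged over a window matched to the expected peak width at that migration
# time (late, slow analytes give broader, lower-frequency peaks).
import numpy as np

def adaptive_moving_average(signal, sampling_hz, base_window_s=0.5, growth=0.02):
    out = np.empty_like(signal, dtype=float)
    n = len(signal)
    for i in range(n):
        t = i / sampling_hz                                  # migration time, s
        half = max(1, int(0.5 * (base_window_s + growth * t) * sampling_hz))
        lo, hi = max(0, i - half), min(n, i + half + 1)
        out[i] = signal[lo:hi].mean()
    return out

# synthetic electropherogram: an early narrow peak, a late broad peak, and noise
fs = 25.0
t = np.arange(0, 600, 1.0 / fs)
sig = (np.exp(-0.5 * ((t - 60) / 1.5) ** 2) +
       0.5 * np.exp(-0.5 * ((t - 480) / 6.0) ** 2) +
       np.random.default_rng(0).normal(scale=0.05, size=t.size))
smoothed = adaptive_moving_average(sig, fs)
```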

  1. Three-dimensional self-adaptive grid method for complex flows

    NASA Technical Reports Server (NTRS)

    Djomehri, M. Jahed; Deiwert, George S.

    1988-01-01

    A self-adaptive grid procedure for efficient computation of three-dimensional complex flow fields is described. The method is based on variational principles to minimize the energy of a spring system analogy which redistributes the grid points. Grid control parameters are determined by specifying maximum and minimum grid spacing. Multidirectional adaptation is achieved by splitting the procedure into a sequence of successive applications of a unidirectional adaptation. One-sided, two-directional constraints for orthogonality and smoothness are used to enhance the efficiency of the method. Feasibility of the scheme is demonstrated by application to a multinozzle, afterbody, plume flow field. Application of the algorithm for initial grid generation is illustrated by constructing a three-dimensional grid about a bump-like geometry.
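
    As a one-dimensional illustration of the spring-analogy redistribution described above, the sketch below derives edge stiffnesses from a solution-gradient weight and relaxes the spring system so points cluster where the weight is large; bounding the weight stands in for the user-specified maximum/minimum spacing control, and the multidirectional splitting and orthogonality/smoothness constraints are omitted.

```python
# 1-D illustration of spring-analogy grid redistribution: stiffer springs
# (large solution gradient) pull neighbouring nodes closer together.
import numpy as np

def adapt_grid_1d(x, u_func, iters=200, w_min=1.0, w_max=20.0):
    x = x.copy()
    for _ in range(iters):
        u = u_func(x)
        grad = np.abs(np.diff(u) / np.diff(x))          # weight on each edge
        k = np.clip(1.0 + grad, w_min, w_max)           # bounded spring stiffness
        # Gauss-Seidel sweep on interior nodes: spring equilibrium
        # k[i-1]*(x[i]-x[i-1]) = k[i]*(x[i+1]-x[i])
        for i in range(1, len(x) - 1):
            x[i] = (k[i - 1] * x[i - 1] + k[i] * x[i + 1]) / (k[i - 1] + k[i])
    return x

u = lambda x: np.tanh(50.0 * (x - 0.5))                 # steep layer at x = 0.5
x0 = np.linspace(0.0, 1.0, 41)
x_adapted = adapt_grid_1d(x0, u)
print("smallest spacing:", np.diff(x_adapted).min())
```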

  2. Efficient Unstructured Grid Adaptation Methods for Sonic Boom Prediction

    NASA Technical Reports Server (NTRS)

    Campbell, Richard L.; Carter, Melissa B.; Deere, Karen A.; Waithe, Kenrick A.

    2008-01-01

    This paper examines the use of two grid adaptation methods to improve the accuracy of the near-to-mid field pressure signature prediction of supersonic aircraft computed using the USM3D unstructured grid flow solver. The first method (ADV) is an interactive adaptation process that uses grid movement rather than enrichment to more accurately resolve the expansion and compression waves. The second method (SSGRID) uses an a priori adaptation approach to stretch and shear the original unstructured grid to align the grid with the pressure waves and reduce the cell count required to achieve an accurate signature prediction at a given distance from the vehicle. Both methods initially create negative volume cells that are repaired in a module in the ADV code. While both approaches provide significant improvements in the near field signature (< 3 body lengths) relative to a baseline grid without increasing the number of grid points, only the SSGRID approach allows the details of the signature to be accurately computed at mid-field distances (3-10 body lengths) for direct use with mid-field-to-ground boom propagation codes.

  3. Adaptive Prior Variance Calibration in the Bayesian Continual Reassessment Method

    PubMed Central

    Zhang, Jin; Braun, Thomas M.; Taylor, Jeremy M.G.

    2012-01-01

    Use of the Continual Reassessment Method (CRM) and other model-based approaches to design in Phase I clinical trials has increased due to the ability of the CRM to identify the maximum tolerated dose (MTD) better than the 3+3 method. However, the CRM can be sensitive to the variance selected for the prior distribution of the model parameter, especially when a small number of patients are enrolled. While methods have emerged to adaptively select skeletons and to calibrate the prior variance only at the beginning of a trial, there has not been any approach developed to adaptively calibrate the prior variance throughout a trial. We propose three systematic approaches to adaptively calibrate the prior variance during a trial and compare them via simulation to methods proposed to calibrate the variance at the beginning of a trial. PMID:22987660
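
    The sketch below is a minimal one-parameter power-model CRM showing where the prior variance of the model parameter enters the dose recommendation; the adaptive prior-variance calibration approaches studied in the paper are not reproduced, and the skeleton, target, and trial data are illustrative.

```python
# Minimal one-parameter power-model CRM sketch: posterior toxicity estimates and
# dose recommendation under different choices of the prior variance sigma^2.
import numpy as np
from scipy import integrate

skeleton = np.array([0.05, 0.12, 0.25, 0.40, 0.55])   # prior guesses of DLT rates
target = 0.25

def posterior_tox_probs(doses, dlts, sigma2):
    """Posterior mean toxicity at each dose, with p_i(a) = skeleton_i**exp(a)
    and prior a ~ N(0, sigma2)."""
    def lik(a):
        p = skeleton[doses] ** np.exp(a)
        return np.prod(p ** dlts * (1.0 - p) ** (1 - dlts))
    prior = lambda a: np.exp(-a * a / (2.0 * sigma2)) / np.sqrt(2 * np.pi * sigma2)
    norm, _ = integrate.quad(lambda a: lik(a) * prior(a), -10, 10)
    post = np.empty(len(skeleton))
    for i in range(len(skeleton)):
        num, _ = integrate.quad(lambda a: (skeleton[i] ** np.exp(a)) * lik(a) * prior(a),
                                -10, 10)
        post[i] = num / norm
    return post

doses = np.array([0, 0, 1, 1, 2, 2])     # dose indices given so far (illustrative)
dlts  = np.array([0, 0, 0, 1, 1, 1])     # dose-limiting toxicity outcomes
for sigma2 in (0.5, 1.34, 4.0):          # different prior variances
    probs = posterior_tox_probs(doses, dlts, sigma2)
    rec = int(np.argmin(np.abs(probs - target)))
    print(f"sigma^2={sigma2:4.2f}  est. tox probs {np.round(probs, 2)}  -> dose {rec + 1}")
```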

  4. Free energy calculations: an efficient adaptive biasing potential method.

    PubMed

    Dickson, Bradley M; Legoll, Frédéric; Lelièvre, Tony; Stoltz, Gabriel; Fleurat-Lessard, Paul

    2010-05-06

    We develop an efficient sampling and free energy calculation technique within the adaptive biasing potential (ABP) framework. By mollifying the density of states we obtain an approximate free energy and an adaptive bias potential that is computed directly from the population along the coordinates of the free energy. Because of the mollifier, the bias potential is "nonlocal", and its gradient admits a simple analytic expression. A single observation of the reaction coordinate can thus be used to update the approximate free energy at every point within a neighborhood of the observation. This greatly reduces the equilibration time of the adaptive bias potential. This approximation introduces two parameters: strength of mollification and the zero of energy of the bias potential. While we observe that the approximate free energy is a very good estimate of the actual free energy for a large range of mollification strength, we demonstrate that the errors associated with the mollification may be removed via deconvolution. The zero of energy of the bias potential, which is easy to choose, influences the speed of convergence but not the limiting accuracy. This method is simple to apply to free energy or mean force computation in multiple dimensions and does not involve second derivatives of the reaction coordinates, matrix manipulations nor on-the-fly adaptation of parameters. For the alanine dipeptide test case, the new method is found to gain as much as a factor of 10 in efficiency as compared to two basic implementations of the adaptive biasing force methods, and it is shown to be as efficient as well-tempered metadynamics with the postprocess deconvolution giving a clear advantage to the mollified density of states method.

  5. Locally adaptive parallel temperature accelerated dynamics method

    NASA Astrophysics Data System (ADS)

    Shim, Yunsic; Amar, Jacques G.

    2010-03-01

    The recently developed temperature-accelerated dynamics (TAD) method [M. Sørensen and A.F. Voter, J. Chem. Phys. 112, 9599 (2000)] along with the more recently developed parallel TAD (parTAD) method [Y. Shim et al., Phys. Rev. B 76, 205439 (2007)] allow one to carry out non-equilibrium simulations over extended time and length scales. The basic idea behind TAD is to speed up transitions by carrying out a high-temperature MD simulation and then use the resulting information to obtain event times at the desired low temperature. In a typical implementation, a fixed high temperature T_high is used. However, in general one expects that for each configuration there exists an optimal value of T_high which depends on the particular transition pathways and activation energies for that configuration. Here we present a locally adaptive high-temperature TAD method in which, instead of using a fixed T_high, the high temperature is dynamically adjusted in order to maximize simulation efficiency. Preliminary results of the performance obtained from parTAD simulations of Cu/Cu(100) growth using the locally adaptive T_high method will also be presented.

  6. A self-adaptive-grid method with application to airfoil flow

    NASA Technical Reports Server (NTRS)

    Nakahashi, K.; Deiwert, G. S.

    1985-01-01

    A self-adaptive-grid method is described that is suitable for multidimensional steady and unsteady computations. Based on variational principles, a spring analogy is used to redistribute grid points in an optimal sense to reduce the overall solution error. User-specified parameters, denoting both maximum and minimum permissible grid spacings, are used to define the all-important constants, thereby minimizing the empiricism and making the method self-adaptive. Operator splitting and one-sided controls for orthogonality and smoothness are used to make the method practical, robust, and efficient. Examples are included for both steady and unsteady viscous flow computations about airfoils in two dimensions, as well as for a steady inviscid flow computation and a one-dimensional case. These examples illustrate the precise control the user has with the self-adaptive method and demonstrate a significant improvement in accuracy and quality of the solutions.
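
    The sketch below illustrates the spring-analogy idea in one dimension, assuming spring stiffnesses that grow with the local solution gradient and user-specified minimum and maximum spacings; it is a simplified stand-in for the multidimensional, variational formulation described above.

```python
import numpy as np

def adapt_grid_1d(x0, u0, n_iter=200, w_floor=1.0, dx_min=1e-3, dx_max=0.2, relax=0.5):
    """Spring-analogy redistribution: spring stiffness grows with the local solution gradient,
    pulling grid points toward high-gradient regions while respecting spacing bounds."""
    x = x0.copy()
    for _ in range(n_iter):
        u = np.interp(x, x0, u0)                           # solution sampled on the current grid
        du = np.abs(np.diff(u)) / np.maximum(np.diff(x), 1e-12)
        k = w_floor + du                                   # spring stiffness per interval
        # equilibrium of a spring chain: each interior point sits at the stiffness-weighted average
        x_new = x.copy()
        x_new[1:-1] = (k[:-1] * x[:-2] + k[1:] * x[2:]) / (k[:-1] + k[1:])
        x = (1 - relax) * x + relax * x_new
        # user-specified minimum/maximum spacing constraints (approximate after rescaling)
        dx = np.clip(np.diff(x), dx_min, dx_max)
        x = x[0] + np.concatenate(([0.0], np.cumsum(dx)))
        x = x[0] + (x - x[0]) * (x0[-1] - x0[0]) / (x[-1] - x[0])
    return x

# Example: points cluster around a steep tanh front at x = 0.5
x0 = np.linspace(0.0, 1.0, 41)
u0 = np.tanh(40.0 * (x0 - 0.5))
x_adapted = adapt_grid_1d(x0, u0)
print("smallest spacing near the front:", np.diff(x_adapted).min().round(4))
```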

  7. Speckle reduction in optical coherence tomography by adaptive total variation method

    NASA Astrophysics Data System (ADS)

    Wu, Tong; Shi, Yaoyao; Liu, Youwen; He, Chongjun

    2015-12-01

    An adaptive total variation method based on the combination of speckle statistics and total variation restoration is proposed and developed for reducing speckle noise in optical coherence tomography (OCT) images. The statistical distribution of the speckle noise in OCT images is investigated and measured. With the measured parameters such as the mean value and variance of the speckle noise, the OCT image is restored by the adaptive total variation restoration method. The adaptive total variation restoration algorithm was applied to the OCT images of a volunteer's hand skin, which showed effective speckle noise reduction and image quality improvement. For image quality comparison, the commonly used median filtering method was also applied to the same images to reduce the speckle noise. The measured results demonstrate the superior performance of the adaptive total variation restoration method in terms of image signal-to-noise ratio, equivalent number of looks, contrast-to-noise ratio, and mean square error.
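
    A simplified sketch of the idea is given below, assuming a smoothed (differentiable) total variation term minimized by plain gradient descent and a regularization weight tied to the measured noise level; the weighting rule and the synthetic speckle model are illustrative, not the paper's exact algorithm.

```python
import numpy as np

def adaptive_tv_denoise(img, noise_sigma, n_iter=500, tau=0.01, eps=1e-3):
    """Gradient descent on 0.5*||u - img||^2 + lam*TV_eps(u); the fidelity/TV weight 'lam'
    is tied to the measured noise level (an illustrative adaptive rule)."""
    lam = 1.5 * noise_sigma                      # heavier smoothing for noisier images (assumption)
    u = img.astype(float).copy()
    for _ in range(n_iter):
        ux = np.diff(u, axis=1, append=u[:, -1:])
        uy = np.diff(u, axis=0, append=u[-1:, :])
        mag = np.sqrt(ux**2 + uy**2 + eps)
        px, py = ux / mag, uy / mag
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        u -= tau * ((u - img) - lam * div)       # descend the smoothed-TV objective
    return u

# Example: speckle-like multiplicative noise on a piecewise-constant image
rng = np.random.default_rng(1)
clean = np.full((128, 128), 0.5)
clean[:, 64:] = 1.0
noisy = clean * rng.gamma(shape=4.0, scale=0.25, size=clean.shape)
sigma_hat = noisy[:32, :32].std()                # noise level measured in a homogeneous patch
denoised = adaptive_tv_denoise(noisy, sigma_hat)
print("MSE before:", np.mean((noisy - clean) ** 2).round(4),
      " after:", np.mean((denoised - clean) ** 2).round(4))
```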

  8. Adapting Project Management Practices to Research-Based Projects

    NASA Technical Reports Server (NTRS)

    Bahr, P.; Baker, T.; Corbin, B.; Keith, L.; Loerch, L.; Mullenax, C.; Myers, R.; Rhodes, B.; Skytland, N.

    2007-01-01

    From dealing with the inherent uncertainties in outcomes of scientific research to the lack of applicability of current NASA Procedural Requirements guidance documentation, research-based projects present challenges that require unique application of classical project management techniques. If additionally challenged by the creation of a new program transitioning from basic to applied research in a technical environment often unfamiliar with the cost and schedule constraints addressed by project management practices, such projects can find themselves struggling throughout their life cycles. Finally, supplying deliverables to a prime vehicle customer, also in the formative stage, adds further complexity to the development and management of research-based projects. The Biomedical Research and Countermeasures Projects Branch at NASA Johnson Space Center encompasses several diverse applied research-based or research-enabling projects within the newly-formed Human Research Program. This presentation will provide a brief overview of the organizational structure and environment in which these projects operate and how the projects coordinate to address and manage technical requirements. We will identify several of the challenges (cost, technical, schedule, and personnel) encountered by projects across the Branch, present case reports of actions taken and techniques implemented to deal with these challenges, and then close the session with an open forum discussion of remaining challenges and potential mitigations.

  9. Cancer diagnosis marker extraction for soft tissue sarcomas based on gene expression profiling data by using projective adaptive resonance theory (PART) filtering method

    PubMed Central

    Takahashi, Hiro; Nemoto, Takeshi; Yoshida, Teruhiko; Honda, Hiroyuki; Hasegawa, Tadashi

    2006-01-01

    Background Recent advances in genome technologies have provided an excellent opportunity to determine the complete biological characteristics of neoplastic tissues, resulting in improved diagnosis and selection of treatment. To accomplish this objective, it is important to establish a sophisticated algorithm that can deal with large quantities of data such as gene expression profiles obtained by DNA microarray analysis. Results Previously, we developed the projective adaptive resonance theory (PART) filtering method as a gene filtering method. This is one of the clustering methods that can select specific genes for each subtype. In this study, we applied the PART filtering method to analyze microarray data that were obtained from soft tissue sarcoma (STS) patients for the extraction of subtype-specific genes. The performance of the filtering method was evaluated by comparison with other widely used methods, such as signal-to-noise, significance analysis of microarrays, and nearest shrunken centroids. In addition, various combinations of filtering and modeling methods were used to extract essential subtype-specific genes. The combination of the PART filtering method and boosting – the PART-BFCS method – showed the highest accuracy. Seven genes among the 15 genes that are frequently selected by this method – MIF, CYFIP2, HSPCB, TIMP3, LDHA, ABR, and RGS3 – are known prognostic marker genes for other tumors. These genes are candidate marker genes for the diagnosis of STS. Correlation analysis was performed to extract marker genes that were not selected by PART-BFCS. Sixteen genes among those extracted are also known prognostic marker genes for other tumors, and they could be candidate marker genes for the diagnosis of STS. Conclusion A two-step procedure, consisting of the PART-BFCS method followed by correlation analysis, was proposed. The results suggest that novel diagnostic and therapeutic targets for STS can be extracted by a procedure that includes

  10. Towards More Comprehensive Projections of Urban Heat-Related Mortality: Estimates for New York City under Multiple Population, Adaptation, and Climate Scenarios.

    PubMed

    Petkova, Elisaveta P; Vink, Jan K; Horton, Radley M; Gasparrini, Antonio; Bader, Daniel A; Francis, Joe D; Kinney, Patrick L

    2017-01-01

    High temperatures have substantial impacts on mortality and, with growing concerns about climate change, numerous studies have developed projections of future heat-related deaths around the world. Projections of temperature-related mortality are often limited by insufficient information to formulate hypotheses about population sensitivity to high temperatures and future demographics. The present study derived projections of temperature-related mortality in New York City by taking into account future patterns of adaptation or demographic change, both of which can have profound influences on future health burdens. We adopted a novel approach to modeling heat adaptation by incorporating an analysis of the observed population response to heat in New York City over the course of eight decades. This approach projected heat-related mortality until the end of the 21st century based on observed trends in adaptation over a substantial portion of the 20th century. In addition, we incorporated a range of new scenarios for population change until the end of the 21st century. We then estimated future heat-related deaths in New York City by combining the changing temperature-mortality relationship and population scenarios with downscaled temperature projections from the 33 global climate models (GCMs) and two Representative Concentration Pathways (RCPs). The median number of projected annual heat-related deaths across the 33 GCMs varied greatly by RCP and adaptation and population change scenario, ranging from 167 to 3,331 in the 2080s compared with 638 heat-related deaths annually between 2000 and 2006. These findings provide a more complete picture of the range of potential future heat-related mortality risks across the 21st century in New York City, and they highlight the importance of both demographic change and adaptation responses in modifying future risks.

  11. Vortical Flow Prediction Using an Adaptive Unstructured Grid Method

    NASA Technical Reports Server (NTRS)

    Pirzadeh, Shahyar Z.

    2001-01-01

    A computational fluid dynamics (CFD) method has been employed to compute vortical flows around slender wing/body configurations. The emphasis of the paper is on the effectiveness of an adaptive grid procedure in "capturing" concentrated vortices generated at sharp edges or flow separation lines of lifting surfaces flying at high angles of attack. The method is based on a tetrahedral unstructured grid technology developed at the NASA Langley Research Center. Two steady-state, subsonic, inviscid and Navier-Stokes flow test cases are presented to demonstrate the applicability of the method for solving practical vortical flow problems. The first test case concerns vortex flow over a simple 65deg delta wing with different values of leading-edge bluntness, and the second case is that of a more complex fighter configuration. The superiority of the adapted solutions in capturing the vortex flow structure over the conventional unadapted results is demonstrated by comparisons with the windtunnel experimental data. The study shows that numerical prediction of vortical flows is highly sensitive to the local grid resolution and that the implementation of grid adaptation is essential when applying CFD methods to such complicated flow problems.

  12. Method ranks competing projects by priorities, risk. [A method to help prioritize oil and gas pipeline project goals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moeckel, D.R.

    A practical, objective guide for ranking projects based on risk-based priorities has been developed by Sun Pipe Line Co. The deliberately simple system guides decisions on how to allocate scarce company resources because all managers employ the same criteria in weighing potential risks to the company versus benefits. Managers at all levels are continuously having to comply with an ever growing amount of legislative and regulatory requirements while at the same time trying to run their businesses effectively. The system primarily is designed for use as a compliance oversight and tracking process to document, categorize, and follow-up on work concerning various issues or projects. That is, the system consists of an electronic database which is updated periodically, and is used by various levels of management to monitor progress of health, safety, environmental and compliance-related projects. Criteria used in determining a risk factor and assigning a priority also have been adapted and found useful for evaluating other types of projects. The process enables management to better define potential risks and/or loss of benefits that are being accepted when a project is rejected from an immediate work plan or budget. In times of financial austerity, it is extremely important that the right decisions are made at the right time.

  13. Assessing gait adaptability in people with a unilateral amputation on an instrumented treadmill with a projected visual context.

    PubMed

    Houdijk, Han; van Ooijen, Mariëlle W; Kraal, Jos J; Wiggerts, Henri O; Polomski, Wojtek; Janssen, Thomas W J; Roerdink, Melvyn

    2012-11-01

    Gait adaptability, including the ability to avoid obstacles and to take visually guided steps, is essential for safe movement through a cluttered world. This aspect of walking ability is important for regaining independent mobility but is difficult to assess in clinical practice. The objective of this study was to investigate the validity of an instrumented treadmill with obstacles and stepping targets projected on the belt's surface for assessing prosthetic gait adaptability. This was an observational study. A control group of people who were able bodied (n=12) and groups of people with transtibial (n=12) and transfemoral (n=12) amputations participated. Participants walked at a self-selected speed on an instrumented treadmill with projected visual obstacles and stepping targets. Gait adaptability was evaluated in terms of anticipatory and reactive obstacle avoidance performance (for obstacles presented 4 steps and 1 step ahead, respectively) and accuracy of stepping on regular and irregular patterns of stepping targets. In addition, several clinical tests were administered, including timed walking tests and reports of incidence of falls and fear of falling. Obstacle avoidance performance and stepping accuracy were significantly lower in the groups with amputations than in the control group. Anticipatory obstacle avoidance performance was moderately correlated with timed walking test scores. Reactive obstacle avoidance performance and stepping accuracy performance were not related to timed walking tests. Gait adaptability scores did not differ in groups stratified by incidence of falls or fear of falling. Because gait adaptability was affected by walking speed, differences in self-selected walking speed may have diminished differences in gait adaptability between groups. Gait adaptability can be validly assessed by use of an instrumented treadmill with a projected visual context. When walking speed is taken into account, this assessment provides unique

  14. ICASE/LaRC Workshop on Adaptive Grid Methods

    NASA Technical Reports Server (NTRS)

    South, Jerry C., Jr. (Editor); Thomas, James L. (Editor); Vanrosendale, John (Editor)

    1995-01-01

    Solution-adaptive grid techniques are essential to the attainment of practical, user friendly, computational fluid dynamics (CFD) applications. In this three-day workshop, experts gathered together to describe state-of-the-art methods in solution-adaptive grid refinement, analysis, and implementation; to assess the current practice; and to discuss future needs and directions for research. This was accomplished through a series of invited and contributed papers. The workshop focused on a set of two-dimensional test cases designed by the organizers to aid in assessing the current state of development of adaptive grid technology. In addition, a panel of experts from universities, industry, and government research laboratories discussed their views of needs and future directions in this field.

  15. Technology transfer for adaptation

    NASA Astrophysics Data System (ADS)

    Biagini, Bonizella; Kuhl, Laura; Gallagher, Kelly Sims; Ortiz, Claudia

    2014-09-01

    Technology alone will not be able to solve adaptation challenges, but it is likely to play an important role. As a result of the role of technology in adaptation and the importance of international collaboration for climate change, technology transfer for adaptation is a critical but understudied issue. Through an analysis of Global Environment Facility-managed adaptation projects, we find there is significantly more technology transfer occurring in adaptation projects than might be expected given the pessimistic rhetoric surrounding technology transfer for adaptation. Most projects focused on demonstration and early deployment/niche formation for existing technologies rather than earlier stages of innovation, which is understandable considering the pilot nature of the projects. Key challenges for the transfer process, including technology selection and appropriateness under climate change, markets and access to technology, and diffusion strategies are discussed in more detail.

  16. Multi-Fault Diagnosis of Rolling Bearings via Adaptive Projection Intrinsically Transformed Multivariate Empirical Mode Decomposition and High Order Singular Value Decomposition.

    PubMed

    Yuan, Rui; Lv, Yong; Song, Gangbing

    2018-04-16

    Rolling bearings are important components in rotary machinery systems. In the field of multi-fault diagnosis of rolling bearings, the vibration signal collected from single channels tends to miss some fault characteristic information. Using multiple sensors to collect signals at different locations on the machine to obtain multivariate signal can remedy this problem. The adverse effect of a power imbalance between the various channels is inevitable, and unfavorable for multivariate signal processing. As a useful multivariate signal processing method, adaptive-projection intrinsically transformed multivariate empirical mode decomposition (APIT-MEMD) exhibits better performance than MEMD by adopting an adaptive projection strategy to alleviate power imbalances. The filter bank properties of APIT-MEMD are also adopted to enable more accurate and stable intrinsic mode functions (IMFs), and to ease mode mixing problems in multi-fault frequency extractions. By aligning IMF sets into a third order tensor, high order singular value decomposition (HOSVD) can be employed to estimate the fault number. The fault correlation factor (FCF) analysis is used to conduct correlation analysis, in order to determine effective IMFs; the characteristic frequencies of multi-faults can then be extracted. Numerical simulations and application to a multi-fault situation demonstrate that the proposed method is promising in multi-fault diagnoses of multivariate rolling bearing signals.
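
    A compact sketch of the tensor step is given below: IMF-like band signals from several channels are stacked into a third-order tensor, the mode singular values are obtained by SVD of each unfolding (a basic HOSVD), and the number of dominant components is estimated from an assumed energy criterion; the MEMD/APIT-MEMD decomposition itself is not reproduced here.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding of a tensor."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd_singular_values(T):
    """Singular values of each mode unfolding (the core of a higher-order SVD)."""
    return [np.linalg.svd(unfold(T, m), compute_uv=False) for m in range(T.ndim)]

def estimate_component_count(sv, energy=0.95):
    """Smallest rank capturing the requested energy fraction (illustrative criterion)."""
    c = np.cumsum(sv**2) / np.sum(sv**2)
    return int(np.searchsorted(c, energy) + 1)

# Example: 4 channels, 6 "IMF-like" bands, 2048 samples, built from two fault frequencies
t = np.arange(2048) / 2048.0
f_faults = [37.0, 91.0]
rng = np.random.default_rng(2)
tensor = np.zeros((4, 6, t.size))
for ch in range(4):
    for k, f in enumerate(f_faults):
        tensor[ch, k] = (0.5 + rng.random()) * np.sin(2 * np.pi * f * t + rng.random())
    tensor[ch] += 0.05 * rng.standard_normal((6, t.size))   # noise in all bands

sv_modes = hosvd_singular_values(tensor)
print("estimated number of fault components:",
      estimate_component_count(sv_modes[1]))                # singular values along the IMF mode
```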

  17. Parallel adaptive wavelet collocation method for PDEs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nejadmalayeri, Alireza, E-mail: Alireza.Nejadmalayeri@gmail.com; Vezolainen, Alexei, E-mail: Alexei.Vezolainen@Colorado.edu; Brown-Dymkoski, Eric, E-mail: Eric.Browndymkoski@Colorado.edu

    2015-10-01

    A parallel adaptive wavelet collocation method for solving a large class of Partial Differential Equations is presented. The parallelization is achieved by developing an asynchronous parallel wavelet transform, which allows one to perform parallel wavelet transform and derivative calculations with only one data synchronization at the highest level of resolution. The data are stored using tree-like structure with tree roots starting at a priori defined level of resolution. Both static and dynamic domain partitioning approaches are developed. For the dynamic domain partitioning, trees are considered to be the minimum quanta of data to be migrated between the processes. This allows fully automated and efficient handling of non-simply connected partitioning of a computational domain. Dynamic load balancing is achieved via domain repartitioning during the grid adaptation step and reassigning trees to the appropriate processes to ensure approximately the same number of grid points on each process. The parallel efficiency of the approach is discussed based on parallel adaptive wavelet-based Coherent Vortex Simulations of homogeneous turbulence with linear forcing at effective non-adaptive resolutions up to 2048³ using as many as 2048 CPU cores.

  18. Project Method, as One of the Basic Methods of Environmental Education

    ERIC Educational Resources Information Center

    Szállassy, Noémi

    2008-01-01

    Our aim was to present in this paper one of the most important methods of environmental education, the project method. We present here the steps and phases of the project method and give an example of how to use these elements in planning an activity for celebrating the World Day for Water.

  19. Methods for prismatic/tetrahedral grid generation and adaptation

    NASA Technical Reports Server (NTRS)

    Kallinderis, Y.

    1995-01-01

    The present work involves generation of hybrid prismatic/tetrahedral grids for complex 3-D geometries including multi-body domains. The prisms cover the region close to each body's surface, while tetrahedra are created elsewhere. Two developments are presented for hybrid grid generation around complex 3-D geometries. The first is a new octree/advancing front type of method for generation of the tetrahedra of the hybrid mesh. The main feature of the present advancing front tetrahedra generator that is different from previous such methods is that it does not require the creation of a background mesh by the user for the determination of the grid-spacing and stretching parameters. These are determined via an automatically generated octree. The second development is a method for treating the narrow gaps in between different bodies in a multiply-connected domain. This method is applied to a two-element wing case. A High Speed Civil Transport (HSCT) type of aircraft geometry is considered. The generated hybrid grid required only 170 K tetrahedra instead of an estimated two million had a tetrahedral mesh been used in the prisms region as well. A solution adaptive scheme for viscous computations on hybrid grids is also presented. A hybrid grid adaptation scheme that employs both h-refinement and redistribution strategies is developed to provide optimum meshes for viscous flow computations. Grid refinement is a dual adaptation scheme that couples 3-D, isotropic division of tetrahedra and 2-D, directional division of prisms.

  20. Adaptation Method for Overall and Local Performances of Gas Turbine Engine Model

    NASA Astrophysics Data System (ADS)

    Kim, Sangjo; Kim, Kuisoon; Son, Changmin

    2018-04-01

    An adaptation method was proposed to improve the modeling accuracy of the overall and local performances of a gas turbine engine. The adaptation method was divided into two steps. First, the overall performance parameters such as engine thrust, thermal efficiency, and pressure ratio were adapted by calibrating compressor maps, and second, the local performance parameters such as the temperature at component intersections and shaft speed were adjusted by additional adaptation factors. An optimization technique was used to find the correlation equation of adaptation factors for compressor performance maps. The multi-island genetic algorithm (MIGA) was employed in the present optimization. The correlations of local adaptation factors were generated based on the difference between the first adapted engine model and performance test data. The proposed adaptation method was applied to a low-bypass-ratio turbofan engine of 12,000 lb thrust. The gas turbine engine model was generated and validated based on the performance test data in the sea-level static condition. In a flight condition at 20,000 ft and Mach 0.9, the adapted engine model showed improved prediction of engine thrust (an overall performance parameter), reducing the difference from 14.5% to 3.3%. Moreover, there was further improvement in the comparison of low-pressure turbine exit temperature (a local performance parameter), with the difference reduced from 3.2% to 0.4%.

  1. Locally adaptive, spatially explicit projection of US population for 2030 and 2050.

    PubMed

    McKee, Jacob J; Rose, Amy N; Bright, Edward A; Huynh, Timmy; Bhaduri, Budhendra L

    2015-02-03

    Localized adverse events, including natural hazards, epidemiological events, and human conflict, underscore the criticality of quantifying and mapping current population. Building on the spatial interpolation technique previously developed for high-resolution population distribution data (LandScan Global and LandScan USA), we have constructed an empirically informed spatial distribution of projected population of the contiguous United States for 2030 and 2050, depicting one of many possible population futures. Whereas most current large-scale, spatially explicit population projections typically rely on a population gravity model to determine areas of future growth, our projection model departs from these by accounting for multiple components that affect population distribution. Modeled variables, which included land cover, slope, distances to larger cities, and a moving average of current population, were locally adaptive and geographically varying. The resulting weighted surface was used to determine which areas had the greatest likelihood for future population change. Population projections of county level numbers were developed using a modified version of the US Census's projection methodology, with the US Census's official projection as the benchmark. Applications of our model include incorporating multiple various scenario-driven events to produce a range of spatially explicit population futures for suitability modeling, service area planning for governmental agencies, consequence assessment, mitigation planning and implementation, and assessment of spatially vulnerable populations.
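
    A toy sketch of the allocation step follows, assuming a single county, synthetic covariates, and an illustrative weighting of land cover, slope, distance to larger cities, and smoothed current population; the actual LandScan-based model uses far richer, geographically varying inputs.

```python
import numpy as np

rng = np.random.default_rng(3)
n_cells = 1000                                  # grid cells belonging to one county (toy example)
current_pop = rng.poisson(20, n_cells).astype(float)
slope       = rng.uniform(0, 30, n_cells)       # degrees
dist_city   = rng.uniform(0, 80, n_cells)       # km to nearest larger city
developable = rng.random(n_cells) > 0.2         # land-cover mask (True = developable)

def growth_weights(pop, slope, dist, mask):
    """Locally varying suitability: existing population, gentle slopes, and proximity to cities
    raise a cell's weight; undevelopable land cover gets zero (illustrative weighting)."""
    smoothed = np.convolve(pop, np.ones(5) / 5, mode="same")    # crude moving average of population
    w = (1.0 + smoothed) * np.exp(-slope / 15.0) * np.exp(-dist / 40.0)
    w = w * mask
    return w / w.sum()

def allocate_projection(pop, county_target):
    """Spread the county-level projected change over cells in proportion to the weights."""
    change = county_target - pop.sum()
    new_pop = pop + change * growth_weights(pop, slope, dist_city, developable)
    return np.maximum(new_pop, 0.0)

pop_2030 = allocate_projection(current_pop, county_target=1.25 * current_pop.sum())
print("county total 2030:", round(pop_2030.sum()))
```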

  2. QUEST+: A general multidimensional Bayesian adaptive psychometric method.

    PubMed

    Watson, Andrew B

    2017-03-01

    QUEST+ is a Bayesian adaptive psychometric testing method that allows an arbitrary number of stimulus dimensions, psychometric function parameters, and trial outcomes. It is a generalization and extension of the original QUEST procedure and incorporates many subsequent developments in the area of parametric adaptive testing. With a single procedure, it is possible to implement a wide variety of experimental designs, including conventional threshold measurement; measurement of psychometric function parameters, such as slope and lapse; estimation of the contrast sensitivity function; measurement of increment threshold functions; measurement of noise-masking functions; Thurstone scale estimation using pair comparisons; and categorical ratings on linear and circular stimulus dimensions. QUEST+ provides a general method to accelerate data collection in many areas of cognitive and perceptual science.
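
    The sketch below shows the core adaptive loop for the simplest case (a single threshold parameter and a single stimulus dimension), assuming a logistic psychometric function with fixed slope, guess, and lapse, and expected-posterior-entropy minimization as the stimulus selection rule; QUEST+ itself generalizes this to arbitrary numbers of stimulus dimensions, parameters, and outcomes.

```python
import numpy as np

rng = np.random.default_rng(4)
thresh_grid = np.linspace(-10, 10, 81)          # candidate thresholds (dB)
stim_grid = np.linspace(-10, 10, 41)            # available stimulus intensities (dB)
slope, guess, lapse = 1.0, 0.5, 0.02            # fixed psychometric parameters (assumption)

def p_correct(stim, thresh):
    """Logistic psychometric function with guess and lapse rates."""
    core = 1.0 / (1.0 + np.exp(-slope * (stim - thresh)))
    return guess + (1.0 - guess - lapse) * core

def entropy(p):
    p = p[p > 0]
    return -(p * np.log(p)).sum()

def next_stimulus(post):
    """Choose the stimulus that minimizes the expected posterior entropy."""
    best_s, best_h = stim_grid[0], np.inf
    for s in stim_grid:
        pc = p_correct(s, thresh_grid)
        p_succ = (post * pc).sum()
        post_s = post * pc
        post_s /= post_s.sum()
        post_f = post * (1.0 - pc)
        post_f /= post_f.sum()
        h = p_succ * entropy(post_s) + (1.0 - p_succ) * entropy(post_f)
        if h < best_h:
            best_s, best_h = s, h
    return best_s

posterior = np.full(thresh_grid.size, 1.0 / thresh_grid.size)
true_threshold = 2.0
for trial in range(40):
    s = next_stimulus(posterior)
    correct = rng.random() < p_correct(s, true_threshold)
    likelihood = p_correct(s, thresh_grid) if correct else 1.0 - p_correct(s, thresh_grid)
    posterior = posterior * likelihood
    posterior /= posterior.sum()

print("threshold estimate (dB):", round(float((posterior * thresh_grid).sum()), 2))
```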

  3. Project-Method Fit: Exploring Factors That Influence Agile Method Use

    ERIC Educational Resources Information Center

    Young, Diana K.

    2013-01-01

    While the productivity and quality implications of agile software development methods (SDMs) have been demonstrated, research concerning the project contexts where their use is most appropriate has yielded less definitive results. Most experts agree that agile SDMs are not suited for all project contexts. Several project and team factors have been…

  4. Solving delay differential equations in S-ADAPT by method of steps.

    PubMed

    Bauer, Robert J; Mo, Gary; Krzyzanski, Wojciech

    2013-09-01

    S-ADAPT is a version of the ADAPT program that contains additional simulation and optimization abilities such as parametric population analysis. S-ADAPT utilizes LSODA to solve ordinary differential equations (ODEs), an algorithm designed for large-dimension non-stiff and stiff problems. However, S-ADAPT does not have a solver for delay differential equations (DDEs). Our objective was to implement in S-ADAPT a DDE solver using the method of steps. The method of steps allows one to solve virtually any DDE system by transforming it to an ODE system. The solver was validated for scalar linear DDEs with one delay and bolus and infusion inputs for which explicit analytic solutions were derived. Solutions of nonlinear DDE problems coded in S-ADAPT were validated by comparing them with ones obtained by the MATLAB DDE solver dde23. The estimation of parameters was tested on MATLAB-simulated population pharmacodynamics data. The S-ADAPT solutions for DDE problems agreed with the explicit solutions, as well as with the MATLAB-produced solutions, to at least 7 significant digits. The population parameter estimates obtained using importance sampling expectation-maximization in S-ADAPT agreed with those used to generate the data. Published by Elsevier Ireland Ltd.
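
    A minimal illustration of the method of steps, independent of S-ADAPT, is sketched below for a scalar linear DDE with one delay and constant history, using scipy's ODE solver on successive delay intervals; the test equation and tolerances are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

k, tau, t_end = 1.0, 1.0, 10.0
history = lambda t: 1.0                     # constant history for t <= 0

def solve_dde_by_steps():
    """Method of steps: on [n*tau, (n+1)*tau] the delayed term is a known function
    (the history or the previous segment), so each step is an ordinary ODE solve."""
    delayed = history
    t0, y0 = 0.0, history(0.0)
    ts, ys = [t0], [y0]
    while t0 < t_end:
        t1 = min(t0 + tau, t_end)
        rhs = lambda t, y: [-k * delayed(t - tau)]
        sol = solve_ivp(rhs, (t0, t1), [y0], dense_output=True, rtol=1e-8, atol=1e-10)
        seg_t = np.linspace(t0, t1, 50)[1:]
        ts.extend(seg_t)
        ys.extend(sol.sol(seg_t)[0])
        # the segment just computed becomes the "known past" for the next interval
        delayed = (lambda f: (lambda t: history(t) if t <= 0 else float(f(t)[0])))(sol.sol)
        t0, y0 = t1, ys[-1]
    return np.array(ts), np.array(ys)

t, y = solve_dde_by_steps()
# Analytic check on the first interval: y(t) = 1 - k*t for 0 <= t <= tau
print("max error on [0, tau]:", np.abs(y[t <= tau] - (1 - k * t[t <= tau])).max())
```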

  5. Uncertainty Analyses for Back Projection Methods

    NASA Astrophysics Data System (ADS)

    Zeng, H.; Wei, S.; Wu, W.

    2017-12-01

    So far few comprehensive error analyses for back projection methods have been conducted, although it is evident that high frequency seismic waves can be easily affected by earthquake depth, focal mechanisms and the Earth's 3D structures. Here we perform 1D and 3D synthetic tests for two back projection methods, MUltiple SIgnal Classification (MUSIC) (Meng et al., 2011) and Compressive Sensing (CS) (Yao et al., 2011). We generate synthetics for both point sources and finite rupture sources with different depths, focal mechanisms, as well as 1D and 3D structures in the source region. The 3D synthetics are generated through a hybrid scheme of the Direct Solution Method and the Spectral Element Method. Then we back project the synthetic data using MUSIC and CS. The synthetic tests show that the depth phases can be back projected as artificial sources both in space and time. For instance, for a source depth of 10 km, back projection gives a strong signal 8 km away from the true source. Such bias increases with depth, e.g., the error of horizontal location could be larger than 20 km for a depth of 40 km. If the array is located around the nodal direction of direct P-waves, the teleseismic P-waves are dominated by the depth phases. Therefore, back projections are actually imaging the reflection points of depth phases more than the rupture front. Besides depth phases, the strong and long-lasting coda waves due to 3D effects near the trench can introduce additional complexities, which we also test here. The strength contrast of different frequency contents in the rupture models also produces some variation in the back projection results. In the synthetic tests, MUSIC and CS yield consistent results. While MUSIC is more computationally efficient, CS works better for sparse arrays. In summary, our analyses indicate that the impact of the various factors mentioned above should be taken into consideration when interpreting back projection images, before we can use them to infer earthquake rupture physics.
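
    A bare-bones shift-and-stack back projection is sketched below for a synthetic 2-D geometry with a uniform velocity model; it illustrates the imaging principle only and includes none of the depth-phase, 3-D-structure, MUSIC, or compressive-sensing machinery discussed above.

```python
import numpy as np

rng = np.random.default_rng(5)
v = 6.0                                              # km/s, uniform velocity (simplification)
fs = 50.0                                            # Hz
stations = rng.uniform(-50, 50, size=(20, 2)) + np.array([300.0, 0.0])   # distant array
true_src = np.array([5.0, -3.0])

def travel_time(src, sta):
    return np.linalg.norm(sta - src, axis=-1) / v

# Build synthetic records: a short wavelet arriving at each station's travel time, plus noise
t = np.arange(0, 90, 1 / fs)
def wavelet(arrival):
    return np.exp(-((t - arrival) / 0.5) ** 2)
records = np.array([wavelet(tt) for tt in travel_time(true_src, stations)])
records += 0.1 * rng.standard_normal(records.shape)

# Back projection: shift-and-stack the records onto a grid of candidate source points
xg, yg = np.meshgrid(np.linspace(-20, 20, 41), np.linspace(-20, 20, 41))
power = np.zeros_like(xg)
for i in range(xg.shape[0]):
    for j in range(xg.shape[1]):
        cand = np.array([xg[i, j], yg[i, j]])
        shifts = np.rint(travel_time(cand, stations) * fs).astype(int)
        stack = sum(np.roll(rec, -s) for rec, s in zip(records, shifts))
        power[i, j] = np.max(stack ** 2)

best = np.unravel_index(np.argmax(power), power.shape)
print("true source:", true_src, " back-projected peak:", (xg[best], yg[best]))
```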

  6. Projections, plans, policies and politics in Prince George: reflections on five years of climate change adaptation in a northern Canadian community

    NASA Astrophysics Data System (ADS)

    Picketts, I. M.; Dery, S. J.; Curry, J.

    2013-12-01

    The City of Prince George, in central British Columbia, Canada, has partnered with academics and collaborators for over five years to address climate change adaptation at the local level. The first phase of research involved conducting a detailed overview of past climate trends and future projections for the region using the outputs of GCMs and downscaled RCMs. This information was communicated to senior local staff and community members, and feedback was applied to create a detailed adaptation strategy for the City, which identified priority impacts and outlined potential strategies to address them at the local level. The top priority impacts for Prince George are forest changes, increased flooding, and impacts to transportation infrastructure. During a second implementation phase of the project, eight local initiatives were completed focusing on: incorporating adaptation into a local sustainability plan and land use plan; exploring impacts related to forests, flooding and transportation infrastructure; and assessing trends and projections in freeze-thaw cycles and heavy rainfall events. This presentation will outline the adaptation initiatives undertaken in the City of Prince George during the second phase of research, and evaluate their effectiveness through reflections from interviews with local planners, engineers, managers, community champions and politicians. The initiatives deemed to be most successful - and most likely to be implemented - focus on topics that: are of high public concern; have clear cost implications; incorporate adaptation into policy; and/or incorporate adaptation into an ongoing project. Outcomes highlight challenges local researchers, practitioners and leaders face as they strive to implement proactive adaptation measures in policy and practice without strong support from policy and professional practices, and with a paucity of successful case study examples to build upon. Outcomes also reveal challenges as municipalities strive to do

  7. [Application of adaptive canceling methods in temperature control in ultrasonic therapeutical treatment].

    PubMed

    Deng, Jun; Liu, Du-ren

    2002-12-01

    Objective. To improve the quality of ultrasonic therapeutic treatment by improving the accuracy of temperature control. Method. Adaptive canceling methods were used to reduce the noise in the acquired temperature signal and enhance the signal-to-noise ratio. Result. The test results correspond closely to the theoretical curve. Conclusion. Adaptive canceling methods can be applied to clinical treatment.
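
    The abstract does not specify the algorithm; a standard LMS adaptive noise canceller, sketched below on a synthetic temperature signal with a separately measured noise reference, is one common way to realize such adaptive canceling.

```python
import numpy as np

rng = np.random.default_rng(6)
n, fs = 4000, 100.0
t = np.arange(n) / fs
temperature = 37.0 + 1.5 * (1 - np.exp(-t / 10.0))                         # slow heating curve
reference = rng.standard_normal(n)                                          # separately measured noise source
interference = np.convolve(reference, [0.6, 0.3, 0.1], mode="full")[:n]     # noise coupled into the sensor
primary = temperature + interference

def lms_cancel(primary, reference, n_taps=8, mu=0.01):
    """LMS adaptive noise canceller: the filter learns the reference-to-interference path,
    and the cancellation error is the cleaned temperature signal."""
    w = np.zeros(n_taps)
    out = np.zeros_like(primary)
    for k in range(n_taps, len(primary)):
        x = reference[k - n_taps:k][::-1]       # most recent reference samples
        y = w @ x                               # estimated interference
        e = primary[k] - y                      # error = cleaned-signal estimate
        w += 2 * mu * e * x                     # LMS weight update
        out[k] = e
    return out

cleaned = lms_cancel(primary, reference)
print("noise power before/after cancelling:",
      np.var(primary - temperature).round(3),
      np.var(cleaned[500:] - temperature[500:]).round(3))
```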

  8. Towards More Comprehensive Projections of Urban Heat-Related Mortality: Estimates for New York City Under Multiple Population, Adaptation, and Climate Scenarios

    NASA Technical Reports Server (NTRS)

    Petkova, Elisaveta P.; Vink, Jan K.; Horton, Radley M.; Gasparrini, Antonio; Bader, Daniel A.; Francis, Joe D.; Kinney, Patrick L.

    2016-01-01

    High temperatures have substantial impacts on mortality and, with growing concerns about climate change, numerous studies have developed projections of future heat-related deaths around the world. Projections of temperature-related mortality are often limited by insufficient information necessary to formulate hypotheses about population sensitivity to high temperatures and future demographics. This study has derived projections of temperature-related mortality in New York City by taking into account future patterns of adaptation or demographic change, both of which can have profound influences on future health burdens. We adopt a novel approach to modeling heat adaptation by incorporating an analysis of the observed population response to heat in New York City over the course of eight decades. This approach projects heat-related mortality until the end of the 21st century based on observed trends in adaptation over a substantial portion of the 20th century. In addition, we incorporate a range of new scenarios for population change until the end of the 21st century. We then estimate future heat-related deaths in New York City by combining the changing temperature-mortality relationship and population scenarios with downscaled temperature projections from the 33 global climate models (GCMs) and two Representative Concentration Pathways (RCPs). The median number of projected annual heat-related deaths across the 33 GCMs varied greatly by RCP and adaptation and population change scenario, ranging from 167 to 3331 in the 2080s compared to 638 heat-related deaths annually between 2000 and 2006. These findings provide a more complete picture of the range of potential future heat-related mortality risks across the 21st century in New York, and highlight the importance of both demographic change and adaptation responses in modifying future risks.

  9. Deep learning with domain adaptation for accelerated projection-reconstruction MR.

    PubMed

    Han, Yoseob; Yoo, Jaejun; Kim, Hak Hee; Shin, Hee Jung; Sung, Kyunghyun; Ye, Jong Chul

    2018-09-01

    The radial k-space trajectory is a well-established sampling trajectory used in conjunction with magnetic resonance imaging. However, the radial k-space trajectory requires a large number of radial lines for high-resolution reconstruction. Increasing the number of radial lines causes longer acquisition time, making it more difficult for routine clinical use. On the other hand, if we reduce the number of radial lines, streaking artifact patterns are unavoidable. To solve this problem, we propose a novel deep learning approach with domain adaptation to restore high-resolution MR images from under-sampled k-space data. The proposed deep network removes the streaking artifacts from the artifact-corrupted images. To address the situation of limited available data, we propose a domain adaptation scheme that employs a pre-trained network using a large number of X-ray computed tomography (CT) or synthesized radial MR datasets, which is then fine-tuned with only a few radial MR datasets. The proposed method outperforms existing compressed sensing algorithms, such as the total variation and PR-FOCUSS methods. In addition, the calculation time is several orders of magnitude faster than the total variation and PR-FOCUSS methods. Moreover, we found that pre-training using CT or MR data from a similar organ is more important than pre-training using data from the same modality for a different organ. We demonstrate the possibility of domain adaptation when only a limited amount of MR data is available. The proposed method surpasses the existing compressed sensing algorithms in terms of image quality and computation time. © 2018 International Society for Magnetic Resonance in Medicine.

  10. Adaptive multifocus image fusion using block compressed sensing with smoothed projected Landweber integration in the wavelet domain.

    PubMed

    V S, Unni; Mishra, Deepak; Subrahmanyam, G R K S

    2016-12-01

    The need for image fusion in current image processing systems is increasing mainly due to the increased number and variety of image acquisition techniques. Image fusion is the process of combining substantial information from several sensors using mathematical techniques in order to create a single composite image that will be more comprehensive and thus more useful for a human operator or other computer vision tasks. This paper presents a new approach to multifocus image fusion based on sparse signal representation. Block-based compressive sensing integrated with a projection-driven compressive sensing (CS) recovery that encourages sparsity in the wavelet domain is used as a method to get the focused image from a set of out-of-focus images. Compression is achieved during the image acquisition process using a block compressive sensing method. An adaptive thresholding technique within the smoothed projected Landweber recovery process reconstructs high-resolution focused images from low-dimensional CS measurements of out-of-focus images. Discrete wavelet transform and dual-tree complex wavelet transform are used as the sparsifying basis for the proposed fusion. The main finding lies in the fact that sparsification enables a better selection of the fusion coefficients and hence better fusion. A Laplacian mixture model fit is done in the wavelet domain and estimation of the probability density function (pdf) parameters by expectation maximization leads us to the proper selection of the coefficients of the fused image. Compared with the fusion scheme that omits the projected Landweber (PL) step and with other existing CS-based fusion approaches, the proposed method is observed to outperform them even with fewer samples.

  11. An adaptive singular spectrum analysis method for extracting brain rhythms of electroencephalography

    PubMed Central

    Hu, Hai; Guo, Shengxin; Liu, Ran

    2017-01-01

    Artifacts removal and rhythms extraction from electroencephalography (EEG) signals are important for portable and wearable EEG recording devices. Incorporating a novel grouping rule, we proposed an adaptive singular spectrum analysis (SSA) method for artifacts removal and rhythms extraction. Based on the EEG signal amplitude, the grouping rule determines adaptively the first one or two SSA reconstructed components as artifacts and removes them. The remaining reconstructed components are then grouped based on their peak frequencies in the Fourier transform to extract the desired rhythms. The grouping rule thus enables SSA to be adaptive to EEG signals containing different levels of artifacts and rhythms. The simulated EEG data based on the Markov Process Amplitude (MPA) EEG model and the experimental EEG data in the eyes-open and eyes-closed states were used to verify the adaptive SSA method. Results showed a better performance in artifacts removal and rhythms extraction, compared with the wavelet decomposition (WDec) and another two recently reported SSA methods. Features of the extracted alpha rhythms using adaptive SSA were calculated to distinguish between the eyes-open and eyes-closed states. Results showed a higher accuracy (95.8%) than those of the WDec method (79.2%) and the infinite impulse response (IIR) filtering method (83.3%). PMID:28674650
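
    A compact sketch of basic SSA with an amplitude-based artifact rule and peak-frequency grouping is given below; the window length, amplitude threshold, and band definitions are illustrative assumptions rather than the paper's exact grouping rule.

```python
import numpy as np

def ssa_decompose(x, window):
    """Basic SSA: trajectory-matrix embedding, SVD, and diagonal averaging per component."""
    n, k = len(x), len(x) - window + 1
    X = np.column_stack([x[i:i + window] for i in range(k)])        # window x k trajectory matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    comps = []
    for i in range(len(s)):
        Xi = s[i] * np.outer(U[:, i], Vt[i])
        comp = np.array([np.mean(np.diag(Xi[:, ::-1], j)) for j in range(k - 1, -window, -1)])
        comps.append(comp)
    return np.array(comps)

def group_components(comps, fs, amp_threshold=50.0):
    """Adaptive grouping rule (illustrative): large-amplitude leading components are treated as
    artifacts; the rest are grouped into rhythm bands by their FFT peak frequency."""
    bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
    rhythms = {b: np.zeros(comps.shape[1]) for b in bands}
    freqs = np.fft.rfftfreq(comps.shape[1], 1.0 / fs)
    for idx, c in enumerate(comps):
        if idx < 2 and np.ptp(c) > amp_threshold:            # artifact rule on the first components
            continue
        peak = freqs[np.argmax(np.abs(np.fft.rfft(c)))]
        for b, (lo, hi) in bands.items():
            if lo <= peak < hi:
                rhythms[b] += c
    return rhythms

# Example: an alpha rhythm plus a slow, large drift artifact
fs = 128
t = np.arange(0, 8, 1.0 / fs)
eeg = 20 * np.sin(2 * np.pi * 10 * t) + 120 * np.sin(2 * np.pi * 0.3 * t)
rhythms = group_components(ssa_decompose(eeg, window=64), fs)
print("alpha power:", np.var(rhythms["alpha"]).round(1))
```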

  12. Cockpit Adaptive Automation and Pilot Performance

    NASA Technical Reports Server (NTRS)

    Parasuraman, Raja

    2001-01-01

    The introduction of high-level automated systems in the aircraft cockpit has provided several benefits, e.g., new capabilities, enhanced operational efficiency, and reduced crew workload. At the same time, conventional 'static' automation has sometimes degraded human operator monitoring performance, increased workload, and reduced situation awareness. Adaptive automation represents an alternative to static automation. In this approach, task allocation between human operators and computer systems is flexible and context-dependent rather than static. Adaptive automation, or adaptive task allocation, is thought to provide for regulation of operator workload and performance, while preserving the benefits of static automation. In previous research we have reported beneficial effects of adaptive automation on the performance of both pilots and non-pilots of flight-related tasks. For adaptive systems to be viable, however, such benefits need to be examined jointly in the context of a single set of tasks. The studies carried out under this project evaluated a systematic method for combining different forms of adaptive automation. A model for effective combination of different forms of adaptive automation, based on matching adaptation to operator workload was proposed and tested. The model was evaluated in studies using IFR-rated pilots flying a general-aviation simulator. Performance, subjective, and physiological (heart rate variability, eye scan-paths) measures of workload were recorded. The studies compared workload-based adaptation to to non-adaptive control conditions and found evidence for systematic benefits of adaptive automation. The research provides an empirical basis for evaluating the effectiveness of adaptive automation in the cockpit. The results contribute to the development of design principles and guidelines for the implementation of adaptive automation in the cockpit, particularly in general aviation, and in other human-machine systems. Project goals

  13. Methods of multi-conjugate adaptive optics for astronomy

    NASA Astrophysics Data System (ADS)

    Flicker, Ralf

    2003-07-01

    This work analyses several aspects of multi-conjugate adaptive optics (MCAO) for astronomy. The research ranges from fundamental and technical studies for present-day MCAO projects, to feasibility studies of high-order MCAO instruments for the extremely large telescopes (ELTs) of the future. The first part is an introductory exposition on atmospheric turbulence, adaptive optics (AO) and MCAO, establishing the framework within which the research was carried out. The second part (papers I-VI) commences with a fundamental design parameter study of MCAO systems, based upon a first-order performance estimation Monte Carlo simulation. It is investigated how the number and geometry of deformable mirrors and reference beacons, and the choice of wavefront reconstruction algorithm, affect system performance. Multi-conjugation introduces the possibility of optically canceling scintillation in part, at the expense of additional optics, by applying the phase correction in a certain sequence. The effects of scintillation when this sequence is not observed are investigated. As a link in characterizing anisoplanatism in conventional AO systems, images made with the AO instrument Hokupa'a on the Gemini-North Telescope were analysed with respect to the anisoplanatism signal. By model-fitting of simulated data, conclusions could be drawn about the vertical distribution of turbulence above the observatory site (Mauna Kea), and the significance to future AO and MCAO instruments with conjugated deformable mirrors is addressed. The problem of tilt anisoplanatism with MCAO systems relying on artificial reference beacons—laser guide stars (LGSs)—is analysed, and analytical models for predicting the effects of tilt anisoplanatism are devised. A method is presented for real-time retrieval of the tilt anisoplanatism point spread function (PSF), using control loop data. An independent PSF estimation of high accuracy is thus obtained which enables accurate PSF photometry and deconvolution

  14. A modified adjoint-based grid adaptation and error correction method for unstructured grid

    NASA Astrophysics Data System (ADS)

    Cui, Pengcheng; Li, Bin; Tang, Jing; Chen, Jiangtao; Deng, Youqi

    2018-05-01

    Grid adaptation is an important strategy to improve the accuracy of output functions (e.g. drag, lift, etc.) in computational fluid dynamics (CFD) analysis and design applications. This paper presents a modified robust grid adaptation and error correction method for reducing simulation errors in integral outputs. The procedure is based on discrete adjoint optimization theory, in which the estimated global error of output functions can be directly related to the local residual error. According to this relationship, the local residual error contribution can be used as an indicator in a grid adaptation strategy designed to generate refined grids for accurately estimating the output functions. This grid adaptation and error correction method is applied to subsonic and supersonic simulations around three-dimensional configurations. Numerical results demonstrate that the grids to which the output functions are sensitive are detected and refined after grid adaptation, and the accuracy of the output functions is clearly improved after error correction. The proposed grid adaptation and error correction method is shown to compare very favorably in terms of output accuracy and computational efficiency relative to traditional feature-based grid adaptation.
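
    The underlying adjoint-weighted-residual identity can be illustrated on a small 1-D model problem, as sketched below: the output error of a coarse solution equals the fine-grid residual weighted by the discrete adjoint, and the local contributions serve as refinement indicators. The CFD-specific machinery is of course omitted, and the problem setup is purely illustrative.

```python
import numpy as np

def poisson_matrix(n, h):
    """Standard 1D finite-difference Laplacian with Dirichlet boundaries."""
    return (np.diag(2.0 * np.ones(n))
            - np.diag(np.ones(n - 1), 1)
            - np.diag(np.ones(n - 1), -1)) / h**2

# Fine and coarse discretisations of -u'' = f on (0, 1), output J(u) = mean of u
n_fine, n_coarse = 127, 31
h_f, h_c = 1.0 / (n_fine + 1), 1.0 / (n_coarse + 1)
x_f = np.linspace(h_f, 1 - h_f, n_fine)
x_c = np.linspace(h_c, 1 - h_c, n_coarse)
f = lambda x: 100.0 * np.exp(-200.0 * (x - 0.7) ** 2)      # sharp source term

A_f, b_f = poisson_matrix(n_fine, h_f), f(x_f)
A_c, b_c = poisson_matrix(n_coarse, h_c), f(x_c)
c_f = np.full(n_fine, 1.0 / n_fine)                        # output functional J = c^T u

u_c = np.linalg.solve(A_c, b_c)
u_H = np.interp(x_f, x_c, u_c)                             # coarse solution prolonged to the fine grid
u_f = np.linalg.solve(A_f, b_f)                            # "truth", for checking only

residual = b_f - A_f @ u_H                                  # fine-grid residual of the coarse solution
psi = np.linalg.solve(A_f.T, c_f)                           # discrete adjoint for the output J
error_estimate = psi @ residual                             # adjoint-weighted residual
indicator = np.abs(psi * residual)                          # local contributions -> refinement flags

print("true output error:     ", c_f @ (u_f - u_H))
print("adjoint-based estimate:", error_estimate)
print("cells flagged for refinement:", np.argsort(indicator)[-5:])
```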

  15. A velocity-correction projection method based immersed boundary method for incompressible flows

    NASA Astrophysics Data System (ADS)

    Cai, Shanggui

    2014-11-01

    In the present work we propose a novel direct forcing immersed boundary method based on the velocity-correction projection method of [J.L. Guermond, J. Shen, Velocity-correction projection methods for incompressible flows, SIAM J. Numer. Anal., 41 (1) (2003) 112]. The principal idea of the immersed boundary method is to correct the velocity in the vicinity of the immersed object by using an artificial force to mimic the presence of the physical boundaries. Therefore, the velocity-correction projection method is preferred to its pressure-correction counterpart in the present work. Since the velocity-correction projection method is considered a dual of the pressure-correction method, the proposed method can also be interpreted as follows: first the pressure is predicted by treating the viscous term explicitly without consideration of the immersed boundary, and the solenoidal velocity is used to determine the volume force on the Lagrangian points; then the no-slip boundary condition is enforced by correcting the velocity with the implicit viscous term. To demonstrate the efficiency and accuracy of the proposed method, several numerical simulations are performed and compared with the results in the literature. China Scholarship Council.

  16. Innovative Adaptive Control Method Demonstrated for Active Suppression of Instabilities in Engine Combustors

    NASA Technical Reports Server (NTRS)

    Kopasakis, George

    2005-01-01

    This year, an improved adaptive-feedback control method was demonstrated that suppresses thermoacoustic instabilities in a liquid-fueled combustor of a type used in aircraft engines. Extensive research has been done to develop lean-burning (low fuel-to-air ratio) combustors that can reduce emissions throughout the mission cycle to reduce the environmental impact of aerospace propulsion systems. However, these lean-burning combustors are susceptible to thermoacoustic instabilities (high-frequency pressure waves), which can fatigue combustor components and even downstream turbine blades. This can significantly decrease the safe operating life of the combustor and turbine. Thus, suppressing the thermoacoustic combustor instabilities is an enabling technology for meeting the low-emission goals of the NASA Ultra-Efficient Engine Technology (UEET) Project.

  17. Multi-Fault Diagnosis of Rolling Bearings via Adaptive Projection Intrinsically Transformed Multivariate Empirical Mode Decomposition and High Order Singular Value Decomposition

    PubMed Central

    Lv, Yong; Song, Gangbing

    2018-01-01

    Rolling bearings are important components in rotary machinery systems. In the field of multi-fault diagnosis of rolling bearings, the vibration signal collected from single channels tends to miss some fault characteristic information. Using multiple sensors to collect signals at different locations on the machine to obtain multivariate signal can remedy this problem. The adverse effect of a power imbalance between the various channels is inevitable, and unfavorable for multivariate signal processing. As a useful multivariate signal processing method, adaptive-projection intrinsically transformed multivariate empirical mode decomposition (APIT-MEMD) exhibits better performance than MEMD by adopting an adaptive projection strategy to alleviate power imbalances. The filter bank properties of APIT-MEMD are also adopted to enable more accurate and stable intrinsic mode functions (IMFs), and to ease mode mixing problems in multi-fault frequency extractions. By aligning IMF sets into a third order tensor, high order singular value decomposition (HOSVD) can be employed to estimate the fault number. The fault correlation factor (FCF) analysis is used to conduct correlation analysis, in order to determine effective IMFs; the characteristic frequencies of multi-faults can then be extracted. Numerical simulations and application to a multi-fault situation demonstrate that the proposed method is promising in multi-fault diagnoses of multivariate rolling bearing signals. PMID:29659510

  18. Method and apparatus for adaptive force and position control of manipulators

    NASA Technical Reports Server (NTRS)

    Seraji, Homayoun (Inventor)

    1989-01-01

    The present invention discloses systematic methods and apparatus for the design of real-time controllers. Real-time adaptive force/position control is achieved by use of feedforward and feedback controllers, with the feedforward controller being the inverse of the linearized model of robot dynamics and containing only proportional-double-derivative terms. The feedback controller, of the proportional-integral-derivative type, ensures that manipulator joints follow reference trajectories and achieves robust tracking of step-plus-exponential trajectories, all in real time. The adaptive controller includes adaptive force and position control within a hybrid control architecture. The adaptive force controller achieves tracking of desired force setpoints, and the adaptive position controller accomplishes tracking of desired position trajectories. Circuits in the adaptive feedback and feedforward controllers are varied by adaptation laws.
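
    A single-joint sketch in the same spirit is given below, combining a feedforward term from an (initially wrong) inverse linear model with PID feedback and a simple gradient-type parameter adaptation; the gains and adaptation law are illustrative assumptions, not the patented laws.

```python
import numpy as np

# 1-DOF manipulator joint: true dynamics m*qdd + b*qd = u  (true m, b unknown to the controller)
m_true, b_true, dt = 2.0, 0.5, 1e-3
m_hat, b_hat = 1.0, 0.0                       # initial model used by the feedforward term
kp, ki, kd = 80.0, 20.0, 15.0                 # PID feedback gains (assumption)
gamma = 0.5                                   # adaptation rate for the feedforward model (assumption)

t = np.arange(0, 5, dt)
q_des = 0.5 * (1 - np.cos(2 * np.pi * 0.5 * t))          # smooth reference trajectory
qd_des = np.gradient(q_des, dt)
qdd_des = np.gradient(qd_des, dt)

q = qd = 0.0
e_int = 0.0
err_log = []
for k in range(len(t)):
    e = q_des[k] - q
    ed = qd_des[k] - qd
    e_int += e * dt
    u_ff = m_hat * qdd_des[k] + b_hat * qd_des[k]         # feedforward: inverse of the (linear) model
    u_fb = kp * e + ki * e_int + kd * ed                  # PID feedback
    u = u_ff + u_fb
    # simple gradient adaptation of the feedforward parameters (illustrative, not the patented law)
    m_hat += gamma * (e + ed) * qdd_des[k] * dt
    b_hat += gamma * (e + ed) * qd_des[k] * dt
    # integrate the true plant with explicit Euler
    qdd = (u - b_true * qd) / m_true
    qd += qdd * dt
    q += qd * dt
    err_log.append(e)

print("final tracking error:", abs(err_log[-1]))
```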

  19. Adaptive finite element methods for two-dimensional problems in computational fracture mechanics

    NASA Technical Reports Server (NTRS)

    Min, J. B.; Bass, J. M.; Spradley, L. W.

    1994-01-01

    Some recent results obtained using solution-adaptive finite element methods in two-dimensional problems in linear elastic fracture mechanics are presented. The focus is on the basic issue of adaptive finite element methods for validating the new methodology by computing demonstration problems and comparing the stress intensity factors to analytical results.

  20. Regional projections of North Indian climate for adaptation studies.

    PubMed

    Mathison, Camilla; Wiltshire, Andrew; Dimri, A P; Falloon, Pete; Jacob, Daniela; Kumar, Pankaj; Moors, Eddy; Ridley, Jeff; Siderius, Christian; Stoffel, Markus; Yasunari, T

    2013-12-01

    Adaptation is increasingly important for regions around the world where large changes in climate could have an impact on populations and industry. The Brahmaputra-Ganges catchments have a large population, a main industry of agriculture and a growing hydro-power industry, making the region susceptible to changes in the Indian Summer Monsoon, annually the main water source. The HighNoon project has completed four regional climate model simulations for India and the Himalaya at high resolution (25 km) from 1960 to 2100 to provide an ensemble of simulations for the region. In this paper we have assessed the ensemble for these catchments, comparing the simulations with observations, to build confidence that the simulations provide a realistic representation of atmospheric processes and therefore of future climate. We have illustrated how these simulations could be used to provide information on potential future climate impacts and therefore aid decision-making, using climatology and threshold analysis. The ensemble analysis shows an increase in temperature between the baseline (1970-2000) and the 2050s (2040-2070) of between 2 and 4°C and an increase in the number of days with maximum temperatures above 28°C and 35°C. There is less certainty for precipitation and runoff, which show considerable variability even in this relatively small ensemble, with projected changes spanning zero. The HighNoon ensemble is the most complete dataset for the region, providing useful information on a wide range of variables for the regional climate of the Brahmaputra-Ganges region; however, there are processes not yet included in the models that could have an impact on the simulations of future climate. We have discussed these processes and show that the range from the HighNoon ensemble is similar in magnitude to potential changes in projections where these processes are included. Therefore strategies for adaptation must be robust and flexible, allowing for advances in the science and natural environmental changes
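
    The threshold analysis mentioned above amounts to counting exceedance days per year across ensemble members and periods; a minimal sketch with synthetic stand-in temperatures (a real analysis would read the HighNoon RCM output) is shown below.

```python
import numpy as np

rng = np.random.default_rng(7)
n_members, n_years, n_days = 4, 30, 365            # a small ensemble, 30-year periods
thresholds = [28.0, 35.0]                          # degrees C, as in the analysis above

def synthetic_tmax(mean_shift):
    """Stand-in daily maximum temperatures (a real analysis would use the RCM fields)."""
    seasonal = 22.0 + 10.0 * np.sin(2 * np.pi * np.arange(n_days) / 365.0)
    return (seasonal + mean_shift
            + 3.0 * rng.standard_normal((n_members, n_years, n_days)))

def days_above(tmax, threshold):
    """Mean number of days per year above a threshold, for each ensemble member."""
    return (tmax > threshold).sum(axis=2).mean(axis=1)

baseline = synthetic_tmax(mean_shift=0.0)          # 1970-2000 analogue
future   = synthetic_tmax(mean_shift=3.0)          # 2040-2070 analogue, roughly 3 degrees C warmer

for thr in thresholds:
    change = days_above(future, thr) - days_above(baseline, thr)
    print(f"> {thr} C: change of {change.min():.0f} to {change.max():.0f} days/yr across members")
```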

  1. ISTC projects devoted to improving laser beam quality

    NASA Astrophysics Data System (ADS)

    Malakhov, Yu. I.

    2007-05-01

    A short overview is given of ISTC activity concerned with improving high-power laser beam quality by means of nonlinear and linear adaptive optics methods. Completed projects #0591 and #1929 resulted in the development of a stimulated Brillouin scattering (SBS) phase conjugation mirror of superhigh fidelity, employing a new generation of kinoform optical elements (rasters of small lenses) designed for pulsed or pulse-periodic lasers with nanosecond-scale pulse durations. Project #2631 is devoted to the development of an adaptive optical system for phase registration and correction of laser beams with wavefront vortices. The principles of operation of conventional adaptive systems are based on the assumption that the phase is a smooth, continuous function in space. The solution of the project tasks will therefore represent a new step in adaptive optics.

  2. Finite element methods for the biomechanics of soft hydrated tissues: nonlinear analysis and adaptive control of meshes.

    PubMed

    Spilker, R L; de Almeida, E S; Donzelli, P S

    1992-01-01

    This chapter addresses computationally demanding numerical formulations in the biomechanics of soft tissues. The theory of mixtures can be used to represent soft hydrated tissues in the human musculoskeletal system as a two-phase continuum consisting of an incompressible solid phase (collagen and proteoglycan) and an incompressible fluid phase (interstitial water). We first consider the finite deformation of soft hydrated tissues in which the solid phase is represented as hyperelastic. A finite element formulation of the governing nonlinear biphasic equations is presented based on a mixed-penalty approach and derived using the weighted residual method. Fluid and solid phase deformation, velocity, and pressure are interpolated within each element, and the pressure variables within each element are eliminated at the element level. A system of nonlinear, first-order differential equations in the fluid and solid phase deformation and velocity is obtained. In order to solve these equations, the contributions of the hyperelastic solid phase are incrementally linearized, a finite difference rule is introduced for temporal discretization, and an iterative scheme is adopted to achieve equilibrium at the end of each time increment. We demonstrate the accuracy and adequacy of the procedure using a six-node, isoparametric axisymmetric element, and we present an example problem for which independent numerical solution is available. Next, we present an automated, adaptive environment for the simulation of soft tissue continua in which the finite element analysis is coupled with automatic mesh generation, error indicators, and projection methods. Mesh generation and updating, including both refinement and coarsening, for the two-dimensional examples examined in this study are performed using the finite quadtree approach. The adaptive analysis is based on an error indicator which is the L2 norm of the difference between the finite element solution and a projected finite element

  3. Adaptive restoration of river terrace vegetation through iterative experiments

    USGS Publications Warehouse

    Dela Cruz, Michelle P.; Beauchamp, Vanessa B.; Shafroth, Patrick B.; Decker, Cheryl E.; O’Neil, Aviva

    2014-01-01

    Restoration projects can involve a high degree of uncertainty and risk, which can ultimately result in failure. An adaptive restoration approach can reduce uncertainty through controlled, replicated experiments designed to test specific hypotheses and alternative management approaches. Key components of adaptive restoration include willingness of project managers to accept the risk inherent in experimentation, interest of researchers, availability of funding for experimentation and monitoring, and ability to restore sites as iterative experiments where results from early efforts can inform the design of later phases. This paper highlights an ongoing adaptive restoration project at Zion National Park (ZNP), aimed at reducing the cover of exotic annual Bromus on riparian terraces, and revegetating these areas with native plant species. Rather than using a trial-and-error approach, ZNP staff partnered with academic, government, and private-sector collaborators to conduct small-scale experiments to explicitly address uncertainties concerning biomass removal of annual bromes, herbicide application rates and timing, and effective seeding methods for native species. Adaptive restoration has succeeded at ZNP because managers accept the risk inherent in experimentation and ZNP personnel are committed to continue these projects over a several-year period. Techniques that result in exotic annual Bromus removal and restoration of native plant species at ZNP can be used as a starting point for adaptive restoration projects elsewhere in the region.

  4. Total variation regularization for seismic waveform inversion using an adaptive primal dual hybrid gradient method

    NASA Astrophysics Data System (ADS)

    Yong, Peng; Liao, Wenyuan; Huang, Jianping; Li, Zhenchuan

    2018-04-01

    Full waveform inversion is an effective tool for recovering the properties of the Earth from seismograms. However, it suffers from local minima caused mainly by the limited accuracy of the starting model and the lack of a low-frequency component in the seismic data. Because of the high velocity contrast between salt and sediment, the relation between the waveform and velocity perturbation is strongly nonlinear. Therefore, salt inversion can easily get trapped in the local minima. Since the velocity of salt is nearly constant, we can make the most of this characteristic with total variation regularization to mitigate the local minima. In this paper, we develop an adaptive primal dual hybrid gradient method to implement total variation regularization by projecting the solution onto a total variation norm constrained convex set, through which the total variation norm constraint is satisfied at every model iteration. The smooth background velocities are first inverted and the perturbations are gradually obtained by successively relaxing the total variation norm constraints. Numerical experiment of the projection of the BP model onto the intersection of the total variation norm and box constraints has demonstrated the accuracy and efficiency of our adaptive primal dual hybrid gradient method. A workflow is designed to recover complex salt structures in the BP 2004 model and the 2D SEG/EAGE salt model, starting from a linear gradient model without using low-frequency data below 3 Hz. The salt inversion processes demonstrate that wavefield reconstruction inversion with a total variation norm and box constraints is able to overcome local minima and inverts the complex salt velocity layer by layer.
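
    The abstract's computational core is a primal-dual hybrid gradient (PDHG) iteration acting on a total-variation term. As a hedged illustration only (not the authors' adaptive, constrained implementation), the sketch below applies a basic Chambolle-Pock PDHG iteration to the closely related penalized problem min_u ||∇u||_1 + (λ/2)||u − f||²; all step sizes and parameter values are plain textbook defaults.

```python
import numpy as np

def grad(u):
    """Forward-difference gradient of a 2D array (Neumann boundary)."""
    gx = np.zeros_like(u); gy = np.zeros_like(u)
    gx[:, :-1] = u[:, 1:] - u[:, :-1]
    gy[:-1, :] = u[1:, :] - u[:-1, :]
    return gx, gy

def div(px, py):
    """Discrete divergence, the negative adjoint of grad."""
    dx = np.zeros_like(px); dy = np.zeros_like(py)
    dx[:, 0] = px[:, 0]; dx[:, 1:-1] = px[:, 1:-1] - px[:, :-2]; dx[:, -1] = -px[:, -2]
    dy[0, :] = py[0, :]; dy[1:-1, :] = py[1:-1, :] - py[:-2, :]; dy[-1, :] = -py[-2, :]
    return dx + dy

def tv_denoise_pdhg(f, lam=10.0, n_iter=200):
    """PDHG for min_u ||grad u||_1 + lam/2 * ||u - f||^2 (illustrative only)."""
    tau = sigma = 1.0 / np.sqrt(8.0)        # satisfies tau*sigma*||grad||^2 <= 1 in 2D
    u, u_bar = f.copy(), f.copy()
    px, py = np.zeros_like(f), np.zeros_like(f)
    for _ in range(n_iter):
        gx, gy = grad(u_bar)
        px, py = px + sigma * gx, py + sigma * gy
        mag = np.maximum(1.0, np.sqrt(px**2 + py**2))   # project dual onto unit balls
        px, py = px / mag, py / mag
        u_new = (u + tau * div(px, py) + tau * lam * f) / (1.0 + tau * lam)
        u_bar, u = 2.0 * u_new - u, u_new
    return u
```

    In the paper's constrained formulation, the dual proximal step above would instead enforce the TV-ball constraint, and the step sizes would be adapted over the iterations.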

  5. Evaluation of intrinsic respiratory signal determination methods for 4D CBCT adapted for mice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martin, Rachael; Pan, Tinsu, E-mail: tpan@mdanderson.org; Rubinstein, Ashley

    Purpose: 4D CT imaging in mice is important in a variety of areas including studies of lung function and tumor motion. A necessary step in 4D imaging is obtaining a respiratory signal, which can be done through an external system or intrinsically through the projection images. A number of methods have been developed that can successfully determine the respiratory signal from cone-beam projection images of humans; however, only a few have been utilized in a preclinical setting and most of these rely on step-and-shoot style imaging. The purpose of this work is to assess and make adaptations of several successful methods developed for humans for an image-guided preclinical radiation therapy system. Methods: Respiratory signals were determined from the projection images of free-breathing mice scanned on the X-RAD system using four methods: the so-called Amsterdam shroud method, a method based on the phase of the Fourier transform, a pixel intensity method, and a center of mass method. The Amsterdam shroud method was modified so the sharp inspiration peaks associated with anesthetized mouse breathing could be detected. Respiratory signals were used to sort projections into phase bins and 4D images were reconstructed. Error and standard deviation in the assignment of phase bins for the four methods compared to a manual method considered to be ground truth were calculated for a range of region of interest (ROI) sizes. Qualitative comparisons were additionally made between the 4D images obtained using each of the methods and the manual method. Results: 4D images were successfully created for all mice with each of the respiratory signal extraction methods. Only minimal qualitative differences were noted between each of the methods and the manual method. The average error (and standard deviation) in phase bin assignment was 0.24 ± 0.08 (0.49 ± 0.11) phase bins for the Fourier transform method, 0.09 ± 0.03 (0.31 ± 0.08) phase bins for the modified Amsterdam shroud
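
    The pixel intensity surrogate mentioned above is simple enough to sketch. Below is a minimal, hypothetical illustration (not the authors' X-RAD pipeline): the mean intensity in a user-chosen ROI of each projection serves as the breathing trace, and projections are assigned to phase bins between successive inspiration peaks. The ROI, peak spacing, and bin count are assumptions.

```python
import numpy as np
from scipy.signal import detrend, find_peaks

def respiratory_trace(projections, roi):
    """Pixel-intensity surrogate: mean ROI intensity per projection.
    projections: (n_proj, rows, cols) array; roi: (r0, r1, c0, c1)."""
    r0, r1, c0, c1 = roi
    trace = projections[:, r0:r1, c0:c1].mean(axis=(1, 2))
    return detrend(trace)          # remove slow drift over the gantry rotation

def phase_bins(trace, n_bins=4, min_peak_distance=10):
    """Assign each projection a phase bin between successive inspiration peaks."""
    peaks, _ = find_peaks(trace, distance=min_peak_distance)
    bins = np.zeros(len(trace), dtype=int)
    for a, b in zip(peaks[:-1], peaks[1:]):
        idx = np.arange(a, b)
        bins[idx] = (idx - a) * n_bins // (b - a)   # 0 .. n_bins-1 within the cycle
    return bins
```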

  6. Reconstruction of brachytherapy seed positions and orientations from cone-beam CT x-ray projections via a novel iterative forward projection matching method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pokhrel, Damodar; Murphy, Martin J.; Todor, Dorin A.

    2011-01-15

    reconstructing seed orientations, as well as centroids, from a small number of radiographic projections, in support of intraoperative planning and adaptive replanning. Unlike standard back-projection methods, gIFPM avoids the need to match corresponding seed images on the projections. This algorithm also successfully reconstructs overlapping clustered and highly migrated seeds in the implant. The accuracy of better than 1 mm and 6 deg. demonstrates that gIFPM has the potential to support 2D Task Group 43 calculations in clinical practice.

  7. An adaptive grid scheme using the boundary element method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Munipalli, R.; Anderson, D.A.

    1996-09-01

    A technique to solve the Poisson grid generation equations by Green's function related methods has been proposed, with the source terms being purely position dependent. The use of distributed singularities in the flow domain coupled with the boundary element method (BEM) formulation is presented in this paper as a natural extension of the Green's function method. This scheme greatly simplifies the adaption process. The BEM reduces the dimensionality of the given problem by one. Internal grid-point placement can be achieved for a given boundary distribution by adding continuous and discrete source terms in the BEM formulation. A distribution of vortex doublets is suggested as a means of controlling grid-point placement and grid-line orientation. Examples for sample adaption problems are presented and discussed. 15 refs., 20 figs.

  8. Do Bayesian adaptive trials offer advantages for comparative effectiveness research? Protocol for the RE-ADAPT study

    PubMed Central

    Luce, Bryan R; Broglio, Kristine R; Ishak, K Jack; Mullins, C Daniel; Vanness, David J; Fleurence, Rachael; Saunders, Elijah; Davis, Barry R

    2013-01-01

    Background Randomized clinical trials, particularly for comparative effectiveness research (CER), are frequently criticized for being overly restrictive or untimely for health-care decision making. Purpose Our prospectively designed REsearch in ADAptive methods for Pragmatic Trials (RE-ADAPT) study is a ‘proof of concept’ to stimulate investment in Bayesian adaptive designs for future CER trials. Methods We will assess whether Bayesian adaptive designs offer potential efficiencies in CER by simulating a re-execution of the Antihypertensive and Lipid Lowering Treatment to Prevent Heart Attack Trial (ALLHAT) study using actual data from ALLHAT. Results We prospectively define seven alternate designs consisting of various combinations of arm dropping, adaptive randomization, and early stopping and describe how these designs will be compared to the original ALLHAT design. We identify the one particular design that would have been executed, which incorporates early stopping and information-based adaptive randomization. Limitations While the simulation realistically emulates patient enrollment, interim analyses, and adaptive changes to design, it cannot incorporate key features like the involvement of data monitoring committee in making decisions about adaptive changes. Conclusion This article describes our analytic approach for RE-ADAPT. The next stage of the project is to conduct the re-execution analyses using the seven prespecified designs and the original ALLHAT data. PMID:23983160

  9. Smart algorithms and adaptive methods in computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Tinsley Oden, J.

    1989-05-01

    A review is presented of the use of smart algorithms which employ adaptive methods in processing large amounts of data in computational fluid dynamics (CFD). Smart algorithms use a rationally based set of criteria for automatic decision making in an attempt to produce optimal simulations of complex fluid dynamics problems. The information needed to make these decisions is not known beforehand and evolves in structure and form during the numerical solution of flow problems. Once the code makes a decision based on the available data, the structure of the data may change, and criteria may be reapplied in order to direct the analysis toward an acceptable end. Intelligent decisions are made by processing vast amounts of data that evolve unpredictably during the calculation. The basic components of adaptive methods and their application to complex problems of fluid dynamics are reviewed. The basic components of adaptive methods are: (1) data structures, that is what approaches are available for modifying data structures of an approximation so as to reduce errors; (2) error estimation, that is what techniques exist for estimating error evolution in a CFD calculation; and (3) solvers, what algorithms are available which can function in changing meshes. Numerical examples which demonstrate the viability of these approaches are presented.
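
    As a concrete (if deliberately simple) illustration of the error-estimation and data-structure ingredients listed above, the hypothetical 1D sketch below flags the intervals with the largest gradient jumps and bisects them; production CFD adaptivity works on multidimensional meshes with far more sophisticated indicators.

```python
import numpy as np

def refine_once(x, u, frac=0.3):
    """One adaptive pass: use the jump in the discrete gradient as an error
    indicator and bisect the worst `frac` fraction of intervals."""
    du = np.diff(u) / np.diff(x)          # piecewise-constant gradient
    jump = np.abs(np.diff(du))            # gradient jump at interior nodes
    eta = np.zeros(len(x) - 1)
    eta[:-1] += 0.5 * jump                # share each jump between its two intervals
    eta[1:] += 0.5 * jump
    cutoff = np.quantile(eta, 1.0 - frac)
    midpoints = 0.5 * (x[:-1] + x[1:])
    return np.sort(np.concatenate([x, midpoints[eta >= cutoff]]))
```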

  10. Free-energy landscapes from adaptively biased methods: Application to quantum systems

    NASA Astrophysics Data System (ADS)

    Calvo, F.

    2010-10-01

    Several parallel adaptive biasing methods are applied to the calculation of free-energy pathways along reaction coordinates, choosing as a difficult example the double-funnel landscape of the 38-atom Lennard-Jones cluster. In the case of classical statistics, the Wang-Landau and adaptively biased molecular-dynamics (ABMD) methods are both found efficient if multiple walkers and replication and deletion schemes are used. An extension of the ABMD technique to quantum systems, implemented through the path-integral MD framework, is presented and tested on Ne38 against the quantum superposition method.

  11. Nonlinear optimization with linear constraints using a projection method

    NASA Technical Reports Server (NTRS)

    Fox, T.

    1982-01-01

    Nonlinear optimization problems that are encountered in science and industry are examined. A method of projecting the gradient vector onto a set of linear constraints is developed, and a program that uses this method is presented. The algorithm that generates this projection matrix is based on the Gram-Schmidt method and overcomes some of the objections to the Rosen projection method.
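
    The projection idea is easy to state concretely. The hedged sketch below (not the reported program) orthonormalizes the constraint rows with a QR factorization, which is the numerically stable form of Gram-Schmidt, and removes the corresponding component of the gradient so that a descent step stays on the constraint surface Ax = b.

```python
import numpy as np

def projected_gradient_step(x, grad_f, A, step=1e-2):
    """One projected-gradient step for min f(x) subject to A x = b,
    assuming x is already feasible."""
    Q, _ = np.linalg.qr(A.T)          # columns of Q span the row space of A
    g = grad_f(x)
    g_proj = g - Q @ (Q.T @ g)        # component of g in the null space of A
    return x - step * g_proj
```

    Repeating the step with a suitable line search gives the projected-gradient method; handling inequality constraints would additionally require managing an active set.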

  12. A constrained Delaunay discretization method for adaptively meshing highly discontinuous geological media

    NASA Astrophysics Data System (ADS)

    Wang, Yang; Ma, Guowei; Ren, Feng; Li, Tuo

    2017-12-01

    A constrained Delaunay discretization method is developed to generate high-quality doubly adaptive meshes of highly discontinuous geological media. Complex features such as three-dimensional discrete fracture networks (DFNs), tunnels, shafts, slopes, boreholes, water curtains, and drainage systems are taken into account in the mesh generation. The constrained Delaunay triangulation method is used to create adaptive triangular elements on planar fractures. Persson's algorithm (Persson, 2005), based on an analogy between triangular elements and spring networks, is enriched to automatically discretize a planar fracture into mesh points with varying density and smooth-quality gradient. The triangulated planar fractures are treated as planar straight-line graphs (PSLGs) to construct piecewise-linear complex (PLC) for constrained Delaunay tetrahedralization. This guarantees the doubly adaptive characteristic of the resulted mesh: the mesh is adaptive not only along fractures but also in space. The quality of elements is compared with the results from an existing method. It is verified that the present method can generate smoother elements and a better distribution of element aspect ratios. Two numerical simulations are implemented to demonstrate that the present method can be applied to various simulations of complex geological media that contain a large number of discontinuities.

  13. Speckle-metric-optimization-based adaptive optics for laser beam projection and coherent beam combining.

    PubMed

    Vorontsov, Mikhail; Weyrauch, Thomas; Lachinova, Svetlana; Gatz, Micah; Carhart, Gary

    2012-07-15

    Maximization of a projected laser beam's power density at a remotely located extended object (speckle target) can be achieved by using an adaptive optics (AO) technique based on sensing and optimization of the target-return speckle field's statistical characteristics, referred to here as speckle metrics (SM). SM AO was demonstrated in a target-in-the-loop coherent beam combining experiment using a bistatic laser beam projection system composed of a coherent fiber-array transmitter and a power-in-the-bucket receiver. SM sensing utilized a 50 MHz rate dithering of the projected beam that provided a stair-mode approximation of the outgoing combined beam's wavefront tip and tilt with subaperture piston phases. Fiber-integrated phase shifters were used for both the dithering and SM optimization with stochastic parallel gradient descent control.

  14. A multilevel correction adaptive finite element method for Kohn-Sham equation

    NASA Astrophysics Data System (ADS)

    Hu, Guanghui; Xie, Hehu; Xu, Fei

    2018-02-01

    In this paper, an adaptive finite element method is proposed for solving Kohn-Sham equation with the multilevel correction technique. In the method, the Kohn-Sham equation is solved on a fixed and appropriately coarse mesh with the finite element method in which the finite element space is kept improving by solving the derived boundary value problems on a series of adaptively and successively refined meshes. A main feature of the method is that solving large scale Kohn-Sham system is avoided effectively, and solving the derived boundary value problems can be handled efficiently by classical methods such as the multigrid method. Hence, the significant acceleration can be obtained on solving Kohn-Sham equation with the proposed multilevel correction technique. The performance of the method is examined by a variety of numerical experiments.

  15. Solid rocket booster internal flow analysis by highly accurate adaptive computational methods

    NASA Technical Reports Server (NTRS)

    Huang, C. Y.; Tworzydlo, W.; Oden, J. T.; Bass, J. M.; Cullen, C.; Vadaketh, S.

    1991-01-01

    The primary objective of this project was to develop an adaptive finite element flow solver for simulating internal flows in the solid rocket booster. Described here is a unique flow simulator code for analyzing highly complex flow phenomena in the solid rocket booster. New methodologies and features incorporated into this analysis tool are described.

  16. The Project Method in Agricultural Education: Then and Now

    ERIC Educational Resources Information Center

    Roberts, T. Grady; Harlin, Julie F.

    2007-01-01

    The purpose of this philosophical paper was to synthesize theoretical and historical foundations of the project method and compare them to modern best-practices. A review of historical and contemporary literature related to the project method yielded six themes: 1) purpose of projects; 2) project classification; 3) the process; 4) the context; 5)…

  17. Implementation of an improved adaptive-implicit method in a thermal compositional simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tan, T.B.

    1988-11-01

    A multicomponent thermal simulator with an adaptive-implicit-method (AIM) formulation/inexact-adaptive-Newton (IAN) method is presented. The final coefficient matrix retains the original banded structure so that conventional iterative methods can be used. Various methods for selection of the eliminated unknowns are tested. The AIM/IAN method has a lower work count per Newtonian iteration than fully implicit methods, but a wrong choice of unknowns will result in excessive Newtonian iterations. For the problems tested, the residual-error method described in the paper for selecting implicit unknowns, together with the IAN method, had an improvement of up to 28% in CPU time over the fully implicit method.

  18. Adaptive variational mode decomposition method for signal processing based on mode characteristic

    NASA Astrophysics Data System (ADS)

    Lian, Jijian; Liu, Zhuo; Wang, Haijun; Dong, Xiaofeng

    2018-07-01

    Variational mode decomposition is a completely non-recursive decomposition model in which all modes are extracted concurrently. However, the model requires a preset mode number, which limits the adaptability of the method, since a poorly chosen mode number will cause modes to be discarded or mixed. Hence, a method called Adaptive Variational Mode Decomposition (AVMD) is proposed to automatically determine the mode number based on the characteristics of the intrinsic mode functions. The method was used to analyze simulated signals and measured signals from a hydropower plant. Comparisons with VMD, EMD and EWT were also conducted to evaluate its performance. The results indicate that the proposed method has strong adaptability and is robust to noise; it can determine the mode number appropriately without modulation even when the signal frequencies are relatively close.
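
    A minimal sketch of the mode-number selection loop follows. It assumes a third-party VMD routine with the signature used by the vmdpy package (an assumption, not part of the paper), and it uses closeness of the recovered centre frequencies as a stand-in for the paper's mode-characteristic criterion; the thresholds are illustrative.

```python
import numpy as np
from vmdpy import VMD   # assumed third-party routine: VMD(f, alpha, tau, K, DC, init, tol)

def adaptive_vmd(signal, alpha=2000.0, tol=1e-7, max_modes=12, min_freq_sep=0.02):
    """Increase the mode number K until two centre frequencies come closer than
    min_freq_sep (a proxy for mode mixing), then keep the previous decomposition."""
    previous = None
    for K in range(2, max_modes + 1):
        u, _, omega = VMD(signal, alpha, 0.0, K, 0, 1, tol)
        freqs = np.sort(np.real(omega[-1]))   # converged centre frequencies
        if previous is not None and np.min(np.diff(freqs)) < min_freq_sep:
            return previous                   # K-1 modes were adequate
        previous = (K, u, freqs)
    return previous
```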

  19. Full-Scale Flight Research Testbeds: Adaptive and Intelligent Control

    NASA Technical Reports Server (NTRS)

    Pahle, Joe W.

    2008-01-01

    This viewgraph presentation describes the adaptive and intelligent control methods used for aircraft survival. The contents include: 1) Motivation for Adaptive Control; 2) Integrated Resilient Aircraft Control Project; 3) Full-scale Flight Assets in Use for IRAC; 4) NASA NF-15B Tail Number 837; 5) Gen II Direct Adaptive Control Architecture; 6) Limited Authority System; and 7) 837 Flight Experiments. A simulated destabilization failure analysis along with experience and lessons learned are also presented.

  20. Beta Hebbian Learning as a New Method for Exploratory Projection Pursuit.

    PubMed

    Quintián, Héctor; Corchado, Emilio

    2017-09-01

    In this research, a novel family of learning rules called Beta Hebbian Learning (BHL) is thoroughly investigated to extract information from high-dimensional datasets by projecting the data onto low-dimensional (typically two dimensional) subspaces, improving the existing exploratory methods by providing a clear representation of data's internal structure. BHL applies a family of learning rules derived from the Probability Density Function (PDF) of the residual based on the beta distribution. This family of rules may be called Hebbian in that all use a simple multiplication of the output of the neural network with some function of the residuals after feedback. The derived learning rules can be linked to an adaptive form of Exploratory Projection Pursuit and with artificial distributions, the networks perform as the theory suggests they should: the use of different learning rules derived from different PDFs allows the identification of "interesting" dimensions (as far from the Gaussian distribution as possible) in high-dimensional datasets. This novel algorithm, BHL, has been tested over seven artificial datasets to study the behavior of BHL parameters, and was later applied successfully over four real datasets, comparing its results, in terms of performance, with other well-known Exploratory and projection models such as Maximum Likelihood Hebbian Learning (MLHL), Locally-Linear Embedding (LLE), Curvilinear Component Analysis (CCA), Isomap and Neural Principal Component Analysis (Neural PCA).
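
    The negative-feedback Hebbian update underlying this family of rules is compact. The sketch below implements the generic exploratory-projection-pursuit network with the Maximum Likelihood Hebbian Learning residual function f(e) = sign(e)|e|^(p-1); BHL replaces that function with one derived from the beta PDF, whose exact form is not reproduced here. The learning rate, epoch count, and p are illustrative assumptions.

```python
import numpy as np

def hebbian_epp(X, n_out=2, lr=1e-3, p=3, n_epochs=50, seed=0):
    """Negative-feedback Hebbian network for exploratory projection pursuit.
    X: (n_samples, n_features) data, ideally centred and sphered beforehand."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=0.01, size=(n_out, X.shape[1]))
    for _ in range(n_epochs):
        for x in X[rng.permutation(len(X))]:
            y = W @ x                                 # feedforward projection
            e = x - W.T @ y                           # residual after feedback
            fe = np.sign(e) * np.abs(e) ** (p - 1)    # MLHL residual nonlinearity
            W += lr * np.outer(y, fe)                 # Hebbian update on the residual
    return W                                          # rows are projection directions
```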

  1. Systems and Methods for Parameter Dependent Riccati Equation Approaches to Adaptive Control

    NASA Technical Reports Server (NTRS)

    Kim, Kilsoo (Inventor); Yucelen, Tansel (Inventor); Calise, Anthony J. (Inventor)

    2015-01-01

    Systems and methods for adaptive control are disclosed. The systems and methods can control uncertain dynamic systems. The control system can comprise a controller that employs a parameter dependent Riccati equation. The controller can produce a response that causes the state of the system to remain bounded. The control system can control both minimum phase and non-minimum phase systems. The control system can augment an existing, non-adaptive control design without modifying the gains employed in that design. The control system can also avoid the use of high gains in both the observer design and the adaptive control law.

  2. The method for froth floatation condition recognition based on adaptive feature weighted

    NASA Astrophysics Data System (ADS)

    Wang, Jieran; Zhang, Jun; Tian, Jinwen; Zhang, Daimeng; Liu, Xiaomao

    2018-03-01

    The fusion of foam characteristics can play a complementary role in expressing the content of a foam image, and the weights assigned to those characteristics are key to making full use of the relationships between the different features. In this paper, an adaptive feature-weighted method for froth floatation condition recognition is proposed. Foam features with and without weights are classified using a support vector machine (SVM). The classification accuracy and the optimal equalization algorithm under each ore grade are regarded as the result of the adaptive feature weighting algorithm, and the effectiveness of the adaptive weighted method is demonstrated.
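
    The abstract leaves the weighting scheme unspecified, so the sketch below shows just one plausible reading (an assumption, not the paper's algorithm): weight each feature by the cross-validated accuracy of an SVM trained on that feature alone, then train the condition-recognition SVM on the weighted features.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def adaptive_feature_weights(X, y, cv=5):
    """Weight each froth feature by its single-feature SVM accuracy (normalized)."""
    scores = np.array([cross_val_score(SVC(), X[:, [j]], y, cv=cv).mean()
                       for j in range(X.shape[1])])
    return scores / scores.sum()

def weighted_condition_classifier(X, y, weights):
    """Train the flotation-condition classifier on weighted features."""
    return SVC().fit(X * weights, y)
```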

  3. A comparison of locally adaptive multigrid methods: LDC, FAC and FIC

    NASA Technical Reports Server (NTRS)

    Khadra, Khodor; Angot, Philippe; Caltagirone, Jean-Paul

    1993-01-01

    This study is devoted to a comparative analysis of three 'Adaptive ZOOM' (ZOom Overlapping Multi-level) methods based on similar concepts of hierarchical multigrid local refinement: LDC (Local Defect Correction), FAC (Fast Adaptive Composite), and FIC (Flux Interface Correction)--which we proposed recently. These methods are tested on two examples of a bidimensional elliptic problem. We compare, for V-cycle procedures, the asymptotic evolution of the global error evaluated by discrete norms, the corresponding local errors, and the convergence rates of these algorithms.

  4. Experimental Design and Primary Data Analysis Methods for Comparing Adaptive Interventions

    PubMed Central

    Nahum-Shani, Inbal; Qian, Min; Almirall, Daniel; Pelham, William E.; Gnagy, Beth; Fabiano, Greg; Waxmonsky, Jim; Yu, Jihnhee; Murphy, Susan

    2013-01-01

    In recent years, research in the area of intervention development is shifting from the traditional fixed-intervention approach to adaptive interventions, which allow greater individualization and adaptation of intervention options (i.e., intervention type and/or dosage) over time. Adaptive interventions are operationalized via a sequence of decision rules that specify how intervention options should be adapted to an individual’s characteristics and changing needs, with the general aim to optimize the long-term effectiveness of the intervention. Here, we review adaptive interventions, discussing the potential contribution of this concept to research in the behavioral and social sciences. We then propose the sequential multiple assignment randomized trial (SMART), an experimental design useful for addressing research questions that inform the construction of high-quality adaptive interventions. To clarify the SMART approach and its advantages, we compare SMART with other experimental approaches. We also provide methods for analyzing data from SMART to address primary research questions that inform the construction of a high-quality adaptive intervention. PMID:23025433

  5. The Application of the Real Options Method for the Evaluation of High-Rise Construction Projects

    NASA Astrophysics Data System (ADS)

    Izotov, Aleksandr; Rostova, Olga; Dubgorn, Alissa

    2018-03-01

    The paper is devoted to the problem of evaluation of high-rise construction projects in a rapidly changing environment. The authors proposed an algorithm for constructing and embedding real options in high-rise construction projects, which makes it possible to increase the flexibility of managing multi-stage projects that have the ability to adapt to changing conditions of implementation.

  6. Locally-Adaptive, Spatially-Explicit Projection of U.S. Population for 2030 and 2050

    DOE PAGES

    McKee, Jacob J.; Rose, Amy N.; Bright, Eddie A.; ...

    2015-02-03

    Localized adverse events, including natural hazards, epidemiological events, and human conflict, underscore the criticality of quantifying and mapping current population. Moreover, knowing the spatial distribution of future population allows for increased preparation in the event of an emergency. Building on the spatial interpolation technique previously developed for high resolution population distribution data (LandScan Global and LandScan USA), we have constructed an empirically-informed spatial distribution of the projected population of the contiguous U.S. for 2030 and 2050. Whereas most current large-scale, spatially explicit population projections typically rely on a population gravity model to determine areas of future growth, our projection model departs from these by accounting for multiple components that affect population distribution. Modelled variables, which included land cover, slope, distances to larger cities, and a moving average of current population, were locally adaptive and geographically varying. The resulting weighted surface was used to determine which areas had the greatest likelihood for future population change. Population projections of county level numbers were developed using a modified version of the U.S. Census's projection methodology with the U.S. Census's official projection as the benchmark. Applications of our model include, but are not limited to, suitability modelling, service area planning for governmental agencies, consequence assessment, mitigation planning and implementation, and assessment of spatially vulnerable populations.

  7. Locally-Adaptive, Spatially-Explicit Projection of U.S. Population for 2030 and 2050

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKee, Jacob J.; Rose, Amy N.; Bright, Eddie A.

    Localized adverse events, including natural hazards, epidemiological events, and human conflict, underscore the criticality of quantifying and mapping current population. Moreover, knowing the spatial distribution of future population allows for increased preparation in the event of an emergency. Building on the spatial interpolation technique previously developed for high resolution population distribution data (LandScan Global and LandScan USA), we have constructed an empirically-informed spatial distribution of the projected population of the contiguous U.S. for 2030 and 2050. Whereas most current large-scale, spatially explicit population projections typically rely on a population gravity model to determine areas of future growth, our projection model departs from these by accounting for multiple components that affect population distribution. Modelled variables, which included land cover, slope, distances to larger cities, and a moving average of current population, were locally adaptive and geographically varying. The resulting weighted surface was used to determine which areas had the greatest likelihood for future population change. Population projections of county level numbers were developed using a modified version of the U.S. Census's projection methodology with the U.S. Census's official projection as the benchmark. Applications of our model include, but are not limited to, suitability modelling, service area planning for governmental agencies, consequence assessment, mitigation planning and implementation, and assessment of spatially vulnerable populations.

  8. A self-organizing Lagrangian particle method for adaptive-resolution advection-diffusion simulations

    NASA Astrophysics Data System (ADS)

    Reboux, Sylvain; Schrader, Birte; Sbalzarini, Ivo F.

    2012-05-01

    We present a novel adaptive-resolution particle method for continuous parabolic problems. In this method, particles self-organize in order to adapt to local resolution requirements. This is achieved by pseudo forces that are designed so as to guarantee that the solution is always well sampled and that no holes or clusters develop in the particle distribution. The particle sizes are locally adapted to the length scale of the solution. Differential operators are consistently evaluated on the evolving set of irregularly distributed particles of varying sizes using discretization-corrected operators. The method does not rely on any global transforms or mapping functions. After presenting the method and its error analysis, we demonstrate its capabilities and limitations on a set of two- and three-dimensional benchmark problems. These include advection-diffusion, the Burgers equation, the Buckley-Leverett five-spot problem, and curvature-driven level-set surface refinement.

  9. Method and apparatus for telemetry adaptive bandwidth compression

    NASA Technical Reports Server (NTRS)

    Graham, Olin L.

    1987-01-01

    Methods and apparatus are provided for automatic and/or manual adaptive bandwidth compression of telemetry. An adaptive sampler samples a video signal from a scanning sensor and generates a sequence of sampled fields. Each field and range rate information from the sensor are hence sequentially transmitted to and stored in a multiple and adaptive field storage means. The field storage means then, in response to an automatic or manual control signal, transfers the stored sampled field signals to a video monitor in a form for sequential or simultaneous display of a desired number of stored signal fields. The sampling ratio of the adaptive sample, the relative proportion of available communication bandwidth allocated respectively to transmitted data and video information, and the number of fields simultaneously displayed are manually or automatically selectively adjustable in functional relationship to each other and detected range rate. In one embodiment, when relatively little or no scene motion is detected, the control signal maximizes sampling ratio and causes simultaneous display of all stored fields, thus maximizing resolution and bandwidth available for data transmission. When increased scene motion is detected, the control signal is adjusted accordingly to cause display of fewer fields. If greater resolution is desired, the control signal is adjusted to increase the sampling ratio.
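
    The control relationship described above can be summarized in a few lines. The sketch below is a hypothetical rendering of that logic, not the patented apparatus: a detected scene-motion level (0 = static, 1 = fast) drives both the sampling ratio and the number of simultaneously displayed fields.

```python
def adapt_compression(scene_motion, max_ratio=8, max_fields=8):
    """Map a scene-motion level in [0, 1] to (sampling_ratio, displayed_fields).
    Little motion -> heavy subsampling and all fields shown, maximizing
    resolution and the bandwidth left over for data; fast motion -> lighter
    subsampling and fewer displayed fields."""
    ratio = max(1, round(max_ratio * (1.0 - scene_motion)))
    fields = max(1, round(max_fields * (1.0 - scene_motion)))
    return ratio, fields
```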

  10. A Novel Unsupervised Adaptive Learning Method for Long-Term Electromyography (EMG) Pattern Recognition.

    PubMed

    Huang, Qi; Yang, Dapeng; Jiang, Li; Zhang, Huajie; Liu, Hong; Kotani, Kiyoshi

    2017-06-13

    Performance degradation will be caused by a variety of interfering factors for pattern recognition-based myoelectric control methods in the long term. This paper proposes an adaptive learning method with low computational cost to mitigate the effect in unsupervised adaptive learning scenarios. We present a particle adaptive classifier (PAC), constructed from a particle adaptive learning strategy and a universal incremental least square support vector classifier (LS-SVC). We compared PAC performance with an incremental support vector classifier (ISVC) and a non-adapting SVC (NSVC) in a long-term pattern recognition task in both unsupervised and supervised adaptive learning scenarios. Retraining time cost and recognition accuracy were compared by validating the classification performance on both simulated and realistic long-term EMG data. The classification results of realistic long-term EMG data showed that the PAC significantly decreased the performance degradation in unsupervised adaptive learning scenarios compared with NSVC (9.03% ± 2.23%, p < 0.05) and ISVC (13.38% ± 2.62%, p = 0.001), and reduced the retraining time cost compared with ISVC (2 ms per updating cycle vs. 50 ms per updating cycle).

  11. Distortion analysis of subband adaptive filtering methods for FMRI active noise control systems.

    PubMed

    Milani, Ali A; Panahi, Issa M; Briggs, Richard

    2007-01-01

    Delayless subband filtering structure, as a high performance frequency domain filtering technique, is used for canceling broadband fMRI noise (8 kHz bandwidth). In this method, adaptive filtering is done in subbands and the coefficients of the main canceling filter are computed by stacking the subband weights together. There are two types of stacking methods called FFT and FFT-2. In this paper, we analyze the distortion introduced by these two stacking methods. The effect of the stacking distortion on the performance of different adaptive filters in FXLMS algorithm with non-minimum phase secondary path is explored. The investigation is done for different adaptive algorithms (nLMS, APA and RLS), different weight stacking methods, and different number of subbands.

  12. Adaptation to Climate change Impacts on the Mediterranean islands' Agriculture (ADAPT2CLIMA)

    NASA Astrophysics Data System (ADS)

    Giannakopoulos, Christos; Karali, Anna; Lemesios, Giannis; Loizidou, Maria; Papadaskalopoulou, Christina; Moustakas, Konstantinos; Papadopoulou, Maria; Moriondo, Marco; Markou, Marinos; Hatziyanni, Eleni; Pasotti, Luigi

    2016-04-01

    Agriculture is one of the economic sectors that will likely be hit hardest by climate change, since it directly depends on climatic factors such as temperature, sunlight, and precipitation. The EU LIFE ADAPT2CLIMA (http://adapt2clima.eu/en/) project aims to facilitate the development of adaptation strategies for agriculture by deploying and demonstrating an innovative decision support tool. The ADAPT2CLIMA tool will make it possible to simulate the impacts of climate change on crop production and the effectiveness of selected adaptation options in decreasing vulnerability to climate change in three Mediterranean islands, namely Crete (Greece), Sicily (Italy), and Cyprus. The islands were selected for two reasons: firstly, they figure among the most important cultivation areas at national level. Secondly, they exhibit similarities in terms of location (climate), size, climate change threats faced (coastal agriculture, own water resources), agricultural practices, and policy relevance. In particular, the tool will provide: i) climate change projections; ii) hydrological conditions related to agriculture: iii) a vulnerability assessment of selected crops; iv) an evaluation of the adaptation options identified. The project is expected to contribute significantly to increasing climate resilience of agriculture areas in Sicily, Cyprus and Crete as well as at EU and international level by: • Developing, implementing and demonstrating an innovative and interactive decision support tool (ADAPT2CLIMA tool) for adaptation planning in agriculture that estimates future climate change impacts on local water resources, as well as the climate change vulnerability of the agricultural crop production in the project areas; • Evaluating the technical and economic viability of the implementation of the ADAPT2CLIMA tool; • Developing climate change adaptation strategies for agriculture (including a monitoring plan) for the three project areas and presenting them to the competent

  13. Projection of temperature-related mortality due to cardiovascular disease in beijing under different climate change, population, and adaptation scenarios.

    PubMed

    Zhang, Boya; Li, Guoxing; Ma, Yue; Pan, Xiaochuan

    2018-04-01

    Human health faces unprecedented challenges caused by climate change. Thus, studies of the effect of temperature change on total mortality have been conducted in numerous countries. However, few of those studies focused on temperature-related mortality due to cardiovascular disease (CVD) or considered future population changes and adaptation to climate change. We present herein a projection of temperature-related mortality due to CVD under different climate change, population, and adaptation scenarios in Beijing, a megacity in China. To this end, 19 global circulation models (GCMs), 3 representative concentration pathways (RCPs), 3 socioeconomic pathways, together with generalized linear models and distributed lag non-linear models, were used to project future temperature-related CVD mortality during periods centered around the years 2050 and 2070. The number of temperature-related CVD deaths in Beijing is projected to increase by 3.5-10.2% under different RCP scenarios compared with that during the baseline period. Using the same GCM, the future daily maximum temperatures projected using the RCP2.6, RCP4.5, and RCP8.5 scenarios showed a gradually increasing trend. When population change is considered, the annual rate of increase in temperature-related CVD deaths was up to fivefold greater than that under no-population-change scenarios. The decrease in the number of cold-related deaths did not compensate for the increase in that of heat-related deaths, leading to a general increase in the number of temperature-related deaths due to CVD in Beijing. In addition, adaptation to climate change may enhance rather than ameliorate the effect of climate change, as the increase in cold-related CVD mortality greater than the decrease in heat-related CVD mortality in the adaptation scenarios will result in an increase in the total number of temperature-related CVD mortalities. Copyright © 2018 Elsevier Inc. All rights reserved.

  14. Dual Adaptive Filtering by Optimal Projection Applied to Filter Muscle Artifacts on EEG and Comparative Study

    PubMed Central

    Peyrodie, Laurent; Szurhaj, William; Bolo, Nicolas; Pinti, Antonio; Gallois, Philippe

    2014-01-01

    Muscle artifacts constitute one of the major problems in electroencephalogram (EEG) examinations, particularly for the diagnosis of epilepsy, where pathological rhythms occur within the same frequency bands as those of artifacts. This paper proposes to use the method dual adaptive filtering by optimal projection (DAFOP) to automatically remove artifacts while preserving true cerebral signals. DAFOP is a two-step method. The first step consists in applying the common spatial pattern (CSP) method to two frequency windows to identify the slowest components which will be considered as cerebral sources. The two frequency windows are defined by optimizing convolutional filters. The second step consists in using a regression method to reconstruct the signal independently within various frequency windows. This method was evaluated by two neurologists on a selection of 114 pages with muscle artifacts, from 20 clinical recordings of awake and sleeping adults, subject to pathological signals and epileptic seizures. A blind comparison was then conducted with the canonical correlation analysis (CCA) method and conventional low-pass filtering at 30 Hz. The filtering rate was 84.3% for muscle artifacts with a 6.4% reduction of cerebral signals even for the fastest waves. DAFOP was found to be significantly more efficient than CCA and 30 Hz filters. The DAFOP method is fast and automatic and can be easily used in clinical EEG recordings. PMID:25298967

  15. THe Case Method of Instruction (CMI) Project. Final Report.

    ERIC Educational Resources Information Center

    McWilliam, P. J.; And Others

    This final report describes the Case Method of Instruction (CMI) Project, a project to develop, field test, and disseminate training materials to facilitate the use of the Case Method of Instruction by inservice and preservice instructors in developmental disabilities. CMI project activities focused on developing a collection of case stories and…

  16. Three-dimensional anisotropic adaptive filtering of projection data for noise reduction in cone beam CT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maier, Andreas; Wigstroem, Lars; Hofmann, Hannes G.

    2011-11-15

    Purpose: The combination of quickly rotating C-arm gantry with digital flat panel has enabled the acquisition of three-dimensional data (3D) in the interventional suite. However, image quality is still somewhat limited since the hardware has not been optimized for CT imaging. Adaptive anisotropic filtering has the ability to improve image quality by reducing the noise level and therewith the radiation dose without introducing noticeable blurring. By applying the filtering prior to 3D reconstruction, noise-induced streak artifacts are reduced as compared to processing in the image domain. Methods: 3D anisotropic adaptive filtering was used to process an ensemble of 2D x-ray views acquired along a circular trajectory around an object. After arranging the input data into a 3D space (2D projections + angle), the orientation of structures was estimated using a set of differently oriented filters. The resulting tensor representation of local orientation was utilized to control the anisotropic filtering. Low-pass filtering is applied only along structures to maintain high spatial frequency components perpendicular to these. The evaluation of the proposed algorithm includes numerical simulations, phantom experiments, and in-vivo data which were acquired using an AXIOM Artis dTA C-arm system (Siemens AG, Healthcare Sector, Forchheim, Germany). Spatial resolution and noise levels were compared with and without adaptive filtering. A human observer study was carried out to evaluate low-contrast detectability. Results: The adaptive anisotropic filtering algorithm was found to significantly improve low-contrast detectability by reducing the noise level by half (reduction of the standard deviation in certain areas from 74 to 30 HU). Virtually no degradation of high contrast spatial resolution was observed in the modulation transfer function (MTF) analysis. Although the algorithm is computationally intensive, hardware acceleration using Nvidia's CUDA Interface provided an

  17. [Comparative adaptation of crowns of selective laser melting and wax-lost-casting method].

    PubMed

    Li, Guo-qiang; Shen, Qing-yi; Gao, Jian-hua; Wu, Xue-ying; Chen, Li; Dai, Wen-an

    2012-07-01

    To investigate the marginal adaptation of crowns fabricated by selective laser melting (SLM) and by the wax-lost-casting method, so as to provide an experimental basis for clinical use. Co-Cr alloy full crowns were fabricated by SLM and by wax-lost-casting, with 24 samples in each group. All crowns were cemented with zinc phosphate cement and cut along the longitudinal axis with a line cutting machine. The gap between the crown tissue surface and the die was measured by a 6-point measuring method under a scanning electron microscope (SEM). The marginal adaptation of the crowns fabricated by SLM and by wax-lost-casting was compared statistically. The gaps for the SLM crowns were (36.51 ± 2.94), (49.36 ± 3.31), (56.48 ± 3.35), and (42.20 ± 3.60) µm, and those for the wax-lost-casting crowns were (68.86 ± 5.41), (58.86 ± 6.10), (70.62 ± 5.79), and (69.90 ± 6.00) µm. The differences between the two groups were significant (P < 0.05). Co-Cr alloy full crowns fabricated by the wax-lost-casting method and by SLM both provide clinically acceptable marginal adaptation, and the marginal adaptation of SLM is better than that of wax-lost-casting.

  18. FPGA Based Adaptive Rate and Manifold Pattern Projection for Structured Light 3D Camera System †

    PubMed Central

    Lee, Sukhan

    2018-01-01

    The quality of the captured point cloud and the scanning speed of a structured light 3D camera system depend on the system's ability to handle object surfaces with a large reflectance variation, traded off against the required number of patterns to be projected. In this paper, we propose and implement a flexible embedded framework that is capable of triggering the camera single or multiple times for capturing single or multiple projections within a single camera exposure setting. This allows the 3D camera system to synchronize the camera and projector even for mismatched frame rates, such that the system is capable of projecting different types of patterns for different scan speed applications. This enables the system to capture a high-quality 3D point cloud even for surfaces with a large reflectance variation while achieving a high scan speed. The proposed framework is implemented on a Field Programmable Gate Array (FPGA), where the camera trigger is adaptively generated in such a way that the position and the number of triggers are automatically determined according to camera exposure settings. In other words, the projection frequency is adaptive to different scanning applications without altering the architecture. In addition, the proposed framework is unique as it does not require any external memory for storage because pattern pixels are generated in real time, which minimizes the complexity and size of the application-specific integrated circuit (ASIC) design and implementation. PMID:29642506

  19. Physically Handicapped in Science: Final Project Report.

    ERIC Educational Resources Information Center

    O'Brien, Maureen B.; And Others

    A two-year project was conducted by St. Mary's Junior College to improve the science literacy of visually-impaired students (VIS) through the adaptation of instructional methods and materials. A four-step process was used: (1) learning materials were reviewed to identify problem areas; (2) preliminary adaptations were made based on the review; (3)…

  20. Adaptive-projection intrinsically transformed multivariate empirical mode decomposition in cooperative brain-computer interface applications.

    PubMed

    Hemakom, Apit; Goverdovsky, Valentin; Looney, David; Mandic, Danilo P

    2016-04-13

    An extension to multivariate empirical mode decomposition (MEMD), termed adaptive-projection intrinsically transformed MEMD (APIT-MEMD), is proposed to cater for power imbalances and inter-channel correlations in real-world multichannel data. It is shown that the APIT-MEMD exhibits similar or better performance than MEMD for a large number of projection vectors, whereas it outperforms MEMD for the critical case of a small number of projection vectors within the sifting algorithm. We also employ the noise-assisted APIT-MEMD within our proposed intrinsic multiscale analysis framework and illustrate the advantages of such an approach in notoriously noise-dominated cooperative brain-computer interface (BCI) based on the steady-state visual evoked potentials and the P300 responses. Finally, we show that for a joint cognitive BCI task, the proposed intrinsic multiscale analysis framework improves system performance in terms of the information transfer rate. © 2016 The Author(s).

  1. Manufacturing Methods and Technology (MMT) project execution report

    NASA Astrophysics Data System (ADS)

    Swim, P. A.

    1982-10-01

    This document is a summary compilation of the manufacturing methods and technology program project status reports (RCS DRCMT-301) submitted to IBEA from DARCOM major Army subcommands and project managers. Each page of the computerized section lists project number, title, status, funding, and projected completion date. Summary pages give information relating to the overall DARCOM program.

  2. Earthquake Rupture Dynamics using Adaptive Mesh Refinement and High-Order Accurate Numerical Methods

    NASA Astrophysics Data System (ADS)

    Kozdon, J. E.; Wilcox, L.

    2013-12-01

    Our goal is to develop scalable and adaptive (spatial and temporal) numerical methods for coupled, multiphysics problems using high-order accurate numerical methods. To do so, we are developing an open-source, parallel library known as bfam (available at http://bfam.in). The first application to be developed on top of bfam is an earthquake rupture dynamics solver using high-order discontinuous Galerkin methods and summation-by-parts finite difference methods. In earthquake rupture dynamics, wave propagation in the Earth's crust is coupled to frictional sliding on fault interfaces. This coupling is two-way, requiring the simultaneous simulation of both processes. The use of laboratory-measured friction parameters requires near-fault resolution that is 4-5 orders of magnitude higher than that needed to resolve the frequencies of interest in the volume. This, along with earlier simulations using a low-order, finite volume based adaptive mesh refinement framework, suggests that adaptive mesh refinement is ideally suited for this problem. The use of high-order methods is motivated by the high level of resolution required off the fault in the earlier low-order finite volume simulations; we believe this need for resolution is a result of the excessive numerical dissipation of low-order methods. In bfam, spatial adaptivity is handled using the p4est library and temporal adaptivity will be accomplished through local time stepping. In this presentation we will present the guiding principles behind the library as well as verification of the code against the Southern California Earthquake Center dynamic rupture code validation test problems.

  3. Using the Project Method in Distributive Education. Teacher's Manual.

    ERIC Educational Resources Information Center

    Maletta, Edwin

    The document explains how to integrate the project training methods into a distributive education curriculum for grades 10 or 11. The purpose of this teacher's manual is to give an overall picture of the project method in use. Ten sample projects are included which could apply to any distributive education student concentrating on the major areas…

  4. A Novel Unsupervised Adaptive Learning Method for Long-Term Electromyography (EMG) Pattern Recognition

    PubMed Central

    Huang, Qi; Yang, Dapeng; Jiang, Li; Zhang, Huajie; Liu, Hong; Kotani, Kiyoshi

    2017-01-01

    Performance degradation will be caused by a variety of interfering factors for pattern recognition-based myoelectric control methods in the long term. This paper proposes an adaptive learning method with low computational cost to mitigate the effect in unsupervised adaptive learning scenarios. We present a particle adaptive classifier (PAC), constructed from a particle adaptive learning strategy and a universal incremental least square support vector classifier (LS-SVC). We compared PAC performance with an incremental support vector classifier (ISVC) and a non-adapting SVC (NSVC) in a long-term pattern recognition task in both unsupervised and supervised adaptive learning scenarios. Retraining time cost and recognition accuracy were compared by validating the classification performance on both simulated and realistic long-term EMG data. The classification results of realistic long-term EMG data showed that the PAC significantly decreased the performance degradation in unsupervised adaptive learning scenarios compared with NSVC (9.03% ± 2.23%, p < 0.05) and ISVC (13.38% ± 2.62%, p = 0.001), and reduced the retraining time cost compared with ISVC (2 ms per updating cycle vs. 50 ms per updating cycle). PMID:28608824

  5. Combined Heat and Power Protocol for Uniform Methods Project | Advanced Manufacturing Research | NREL

    Science.gov Websites

    NREL developed a protocol that is consistent with the scope and other protocols developed for the Uniform Methods Project (UMP).

  6. Managing hardwood-softwood mixtures for future forests in eastern North America: assessing suitability to projected climate change

    Treesearch

    John M. Kabrick; Kenneth L. Clark; Anthony W. D' Amato; Daniel C. Dey; Laura S. Kenefic; Christel C. Kern; Benjamin O. Knapp; David A. MacLean; Patricia Raymond; Justin D. Waskiewicz

    2017-01-01

    Despite growing interest in management strategies for climate change adaptation, there are few methods for assessing the ability of stands to endure or adapt to projected future climates. We developed a means for assigning climate "Compatibility" and "Adaptability" scores to stands for assessing the suitability of tree species for projected climate...

  7. Reconstruction of brachytherapy seed positions and orientations from cone-beam CT x-ray projections via a novel iterative forward projection matching method.

    PubMed

    Pokhrel, Damodar; Murphy, Martin J; Todor, Dorin A; Weiss, Elisabeth; Williamson, Jeffrey F

    2011-01-01

    centroids, from a small number of radiographic projections, in support of intraoperative planning and adaptive replanning. Unlike standard back-projection methods, gIFPM avoids the need to match corresponding seed images on the projections. This algorithm also successfully reconstructs overlapping clustered and highly migrated seeds in the implant. The accuracy of better than 1 mm and 6 degrees demonstrates that gIFPM has the potential to support 2D Task Group 43 calculations in clinical practice.

  8. How to Select a Project Delivery Method for School Facilities

    ERIC Educational Resources Information Center

    Kalina, David

    2007-01-01

    In this article, the author discusses and explains three project delivery methods that are commonly used today in the United States. The first project delivery method mentioned is the design-bid-build, which is still the predominant method of project delivery for public works and school construction in the United States. The second is the…

  9. ZZ-Type a posteriori error estimators for adaptive boundary element methods on a curve☆

    PubMed Central

    Feischl, Michael; Führer, Thomas; Karkulik, Michael; Praetorius, Dirk

    2014-01-01

    In the context of the adaptive finite element method (FEM), ZZ-error estimators named after Zienkiewicz and Zhu (1987) [52] are mathematically well-established and widely used in practice. In this work, we propose and analyze ZZ-type error estimators for the adaptive boundary element method (BEM). We consider weakly singular and hyper-singular integral equations and prove, in particular, convergence of the related adaptive mesh-refining algorithms. Throughout, the theoretical findings are underlined by numerical experiments. PMID:24748725

  10. Catalog of Exemplary Projects: 1984-85.

    ERIC Educational Resources Information Center

    Virginia Community Coll. System, Sterling. Inst. for Instructional Excellence.

    This compilation of abstracts represents 39 projects that were funded by the State Council of Higher Education for Virginia under Adapter Grants (which involve experimentation with instructional methods or techniques) or Developer Grants (which involve the implementation of a uniquely innovative teaching method or other instructional procedure).…

  11. A multigrid method for steady Euler equations on unstructured adaptive grids

    NASA Technical Reports Server (NTRS)

    Riemslagh, Kris; Dick, Erik

    1993-01-01

    A flux-difference splitting type algorithm is formulated for the steady Euler equations on unstructured grids. The polynomial flux-difference splitting technique is used. A vertex-centered finite volume method is employed on a triangular mesh. The multigrid method is in defect-correction form. A relaxation procedure with a first-order accurate inner iteration and a second-order correction, performed only on the finest grid, is used. A multi-stage Jacobi relaxation method is employed as a smoother. Since the grid is unstructured, a Jacobi-type relaxation is chosen. The multi-staging is necessary to provide sufficient smoothing properties. The domain is discretized using a Delaunay triangular mesh generator. Three grids with more or less uniform distribution of nodes but with different resolution are generated by successive refinement of the coarsest grid. Nodes of coarser grids appear in the finer grids. The multigrid method is started on these grids. As soon as the residual drops below a threshold value, an adaptive refinement is started. The solution on the adaptively refined grid is accelerated by a multigrid procedure. The coarser multigrid grids are generated by successive coarsening through point removal. The adaption cycle is repeated a few times. Results are given for the transonic flow over a NACA-0012 airfoil.
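
    The record above describes a multi-stage Jacobi relaxation used as a multigrid smoother. The sketch below illustrates that idea on a generic sparse linear system standing in for the linearized residual operator; the stage coefficients and the test matrix are illustrative assumptions, not the values used in the paper.

```python
import numpy as np
import scipy.sparse as sp

def multistage_jacobi(A, b, x, alphas=(0.2, 0.4, 1.0)):
    """One multi-stage Jacobi relaxation sweep for A x = b.

    Each stage applies a damped Jacobi update from the state at the start of
    the sweep, in the spirit of multi-stage (Runge-Kutta-like) Jacobi smoothing
    on unstructured grids. The stage coefficients are illustrative only.
    """
    d_inv = 1.0 / A.diagonal()          # Jacobi preconditioner D^{-1}
    x0 = x.copy()
    for alpha in alphas:
        r = b - A @ x                   # residual with the latest stage value
        x = x0 + alpha * (d_inv * r)    # damped update from the sweep start
    return x

# Usage on a small 1D Poisson-like test matrix (hypothetical example).
n = 50
A = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)
x = np.zeros(n)
for _ in range(100):
    x = multistage_jacobi(A, b, x)
print("residual norm:", np.linalg.norm(b - A @ x))   # decreases slowly: a smoother, not a solver
```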

  12. A multi-block adaptive solving technique based on lattice Boltzmann method

    NASA Astrophysics Data System (ADS)

    Zhang, Yang; Xie, Jiahua; Li, Xiaoyue; Ma, Zhenghai; Zou, Jianfeng; Zheng, Yao

    2018-05-01

    In this paper, a parallel adaptive CFD algorithm is developed by combining the multi-block Lattice Boltzmann Method (LBM) with Adaptive Mesh Refinement (AMR). The mesh refinement criterion of this algorithm is based on the density, velocity and vortices of the flow field. The refined grid boundary is obtained by extending outward half a ghost cell from the coarse grid boundary, which makes the adaptive mesh more compact and the boundary treatment more convenient. Two numerical examples, flow separation behind a backward-facing step and unsteady flow around a circular cylinder, demonstrate that the vortex structures of the cold flow field are captured accurately.
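
    A minimal sketch of the kind of refinement criterion described above, flagging cells by density gradient, velocity gradient and vorticity, is given below; the thresholds and the uniform-block setting are assumptions, and the multi-block LBM/AMR machinery itself is not shown.

```python
import numpy as np

def flag_cells_for_refinement(rho, u, v, dx, thresholds=(0.05, 0.05, 0.05)):
    """Flag cells of a 2D block (arrays indexed as [y, x]) whose density
    gradient, velocity gradient, or vorticity magnitude exceeds a normalized
    threshold. Thresholds are illustrative placeholders.
    Returns a boolean array; True marks cells proposed for refinement.
    """
    drho = np.hypot(*np.gradient(rho, dx))               # |grad rho|
    dvel = np.hypot(*np.gradient(np.hypot(u, v), dx))    # |grad |u||
    dv_dy, dv_dx = np.gradient(v, dx)                    # axis 0 ~ y, axis 1 ~ x
    du_dy, du_dx = np.gradient(u, dx)
    vort = np.abs(dv_dx - du_dy)                         # |omega_z|

    def normalize(f):
        return f / (np.max(np.abs(f)) + 1e-30)

    t_rho, t_vel, t_vort = thresholds
    return (normalize(drho) > t_rho) | (normalize(dvel) > t_vel) | (normalize(vort) > t_vort)
```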

  13. Method and system for spatial data input, manipulation and distribution via an adaptive wireless transceiver

    NASA Technical Reports Server (NTRS)

    Wang, Ray (Inventor)

    2009-01-01

    A method and system for spatial data manipulation input and distribution via an adaptive wireless transceiver. The method and system include a wireless transceiver for automatically and adaptively controlling wireless transmissions using a Waveform-DNA method. The wireless transceiver can operate simultaneously over both the short and long distances. The wireless transceiver is automatically adaptive and wireless devices can send and receive wireless digital and analog data from various sources rapidly in real-time via available networks and network services.

  14. AP-Cloud: Adaptive particle-in-cloud method for optimal solutions to Vlasov–Poisson equation

    DOE PAGES

    Wang, Xingyu; Samulyak, Roman; Jiao, Xiangmin; ...

    2016-04-19

    We propose a new adaptive Particle-in-Cloud (AP-Cloud) method for obtaining optimal numerical solutions to the Vlasov–Poisson equation. Unlike the traditional particle-in-cell (PIC) method, which is commonly used for solving this problem, the AP-Cloud adaptively selects computational nodes or particles to deliver higher accuracy and efficiency when the particle distribution is highly non-uniform. Unlike other adaptive techniques for PIC, our method balances the errors in PDE discretization and Monte Carlo integration, and discretizes the differential operators using a generalized finite difference (GFD) method based on a weighted least square formulation. As a result, AP-Cloud is independent of the geometric shapes of computational domains and is free of artificial parameters. Efficient and robust implementation is achieved through an octree data structure with 2:1 balance. We analyze the accuracy and convergence order of AP-Cloud theoretically, and verify the method using an electrostatic problem of a particle beam with halo. Here, simulation results show that the AP-Cloud method is substantially more accurate and faster than the traditional PIC, and it is free of artificial forces that are typical for some adaptive PIC techniques.
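
    The generalized finite difference discretization mentioned above can be sketched as a weighted least-squares fit of a local Taylor expansion. The illustration below approximates first derivatives and the Laplacian at a point from scattered neighbors; the Gaussian weights and the neighborhood are assumptions, not the paper's exact choices.

```python
import numpy as np

def gfd_operators(x0, y0, xs, ys, h):
    """Weighted least-squares GFD stencil weights at (x0, y0).

    Fits u(x,y) ~ u0 + a dx + b dy + c dx^2/2 + d dy^2/2 + e dx*dy over the
    scattered neighbors (xs, ys) with Gaussian weights of scale h, and returns
    rows mapping neighbor values u_i - u0 to (du/dx, du/dy, Laplacian).
    """
    dx, dy = xs - x0, ys - y0
    A = np.column_stack([dx, dy, 0.5 * dx**2, 0.5 * dy**2, dx * dy])
    w = np.exp(-(dx**2 + dy**2) / h**2)         # distance-based weights (illustrative)
    Aw = A * w[:, None]
    # weighted normal equations: coefficients c = M @ (u_i - u0)
    M = np.linalg.solve(Aw.T @ A, Aw.T)
    d_dx, d_dy, d_xx, d_yy, _ = M
    return d_dx, d_dy, d_xx + d_yy

# Usage: approximate derivatives of u = x^2 + y^2 at the origin from random neighbors.
rng = np.random.default_rng(0)
xs, ys = rng.uniform(-0.1, 0.1, size=(2, 30))
u = xs**2 + ys**2
wx, wy, wlap = gfd_operators(0.0, 0.0, xs, ys, h=0.1)
print(wx @ u, wy @ u, wlap @ u)   # expect roughly (0, 0, 4)
```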

  15. AP-Cloud: Adaptive Particle-in-Cloud method for optimal solutions to Vlasov–Poisson equation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Xingyu; Samulyak, Roman, E-mail: roman.samulyak@stonybrook.edu; Computational Science Initiative, Brookhaven National Laboratory, Upton, NY 11973

    We propose a new adaptive Particle-in-Cloud (AP-Cloud) method for obtaining optimal numerical solutions to the Vlasov–Poisson equation. Unlike the traditional particle-in-cell (PIC) method, which is commonly used for solving this problem, the AP-Cloud adaptively selects computational nodes or particles to deliver higher accuracy and efficiency when the particle distribution is highly non-uniform. Unlike other adaptive techniques for PIC, our method balances the errors in PDE discretization and Monte Carlo integration, and discretizes the differential operators using a generalized finite difference (GFD) method based on a weighted least square formulation. As a result, AP-Cloud is independent of the geometric shapes of computational domains and is free of artificial parameters. Efficient and robust implementation is achieved through an octree data structure with 2:1 balance. We analyze the accuracy and convergence order of AP-Cloud theoretically, and verify the method using an electrostatic problem of a particle beam with halo. Simulation results show that the AP-Cloud method is substantially more accurate and faster than the traditional PIC, and it is free of artificial forces that are typical for some adaptive PIC techniques.

  16. AP-Cloud: Adaptive particle-in-cloud method for optimal solutions to Vlasov–Poisson equation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Xingyu; Samulyak, Roman; Jiao, Xiangmin

    We propose a new adaptive Particle-in-Cloud (AP-Cloud) method for obtaining optimal numerical solutions to the Vlasov–Poisson equation. Unlike the traditional particle-in-cell (PIC) method, which is commonly used for solving this problem, the AP-Cloud adaptively selects computational nodes or particles to deliver higher accuracy and efficiency when the particle distribution is highly non-uniform. Unlike other adaptive techniques for PIC, our method balances the errors in PDE discretization and Monte Carlo integration, and discretizes the differential operators using a generalized finite difference (GFD) method based on a weighted least square formulation. As a result, AP-Cloud is independent of the geometric shapes of computational domains and is free of artificial parameters. Efficient and robust implementation is achieved through an octree data structure with 2:1 balance. We analyze the accuracy and convergence order of AP-Cloud theoretically, and verify the method using an electrostatic problem of a particle beam with halo. Here, simulation results show that the AP-Cloud method is substantially more accurate and faster than the traditional PIC, and it is free of artificial forces that are typical for some adaptive PIC techniques.

  17. Analyses of historical and projected climates to support climate adaptation in the northern Rocky Mountains: Chapter 4

    USGS Publications Warehouse

    Gross, John E.; Tercek, Michael; Guay, Kevin; Chang, Tony; Talbert, Marian; Rodman, Ann; Thoma, David; Jantz, Patrick; Morisette, Jeffrey T.

    2016-01-01

    Most of the western United States is experiencing the effects of rapid and directional climate change (Garfin et al. 2013). These effects, along with forecasts of profound changes in the future, provide strong motivation for resource managers to learn about and prepare for future changes. Climate adaptation plans are based on an understanding of historic climate variation and their effects on ecosystems and on forecasts of future climate trends. Frameworks for climate adaptation thus universally identify the importance of a summary of historical, current, and projected climates (Glick, Stein, and Edelson 2011; Cross et al. 2013; Stein et al. 2014). Trends in physical climate variables are usually the basis for evaluating the exposure component in vulnerability assessments. Thus, this chapter focuses on step 2 of the Climate-Smart Conservation framework (chap. 2): vulnerability assessment. We present analyses of historical and current observations of temperature, precipitation, and other key climate measurements to provide context and a baseline for interpreting the ecological impacts of projected climate changes.

  18. The life cycles of six multi-center adaptive clinical trials focused on neurological emergencies developed for the Advancing Regulatory Science initiative of the National Institutes of Health and US Food and Drug Administration: Case studies from the Adaptive Designs Accelerating Promising Treatments Into Trials Project

    PubMed Central

    Guetterman, Timothy C; Fetters, Michael D; Mawocha, Samkeliso; Legocki, Laurie J; Barsan, William G; Lewis, Roger J; Berry, Donald A; Meurer, William J

    2017-01-01

    Objectives: Clinical trials are complicated, expensive, time-consuming, and frequently do not lead to discoveries that improve the health of patients with disease. Adaptive clinical trials have emerged as a methodology to provide more flexibility in design elements to better answer scientific questions regarding whether new treatments are efficacious. Limited observational data exist that describe the complex process of designing adaptive clinical trials. To address these issues, the Adaptive Designs Accelerating Promising Treatments Into Trials project developed six, tailored, flexible, adaptive, phase-III clinical trials for neurological emergencies, and investigators prospectively monitored and observed the processes. The objective of this work is to describe the adaptive design development process, the final design, and the current status of the adaptive trial designs that were developed. Methods: To observe and reflect upon the trial development process, we employed a rich, mixed methods evaluation that combined quantitative data from visual analog scale to assess attitudes about adaptive trials, along with in-depth qualitative data about the development process gathered from observations. Results: The Adaptive Designs Accelerating Promising Treatments Into Trials team developed six adaptive clinical trial designs. Across the six designs, 53 attitude surveys were completed at baseline and after the trial planning process completed. Compared to baseline, the participants believed significantly more strongly that the adaptive designs would be accepted by National Institutes of Health review panels and non-researcher clinicians. In addition, after the trial planning process, the participants more strongly believed that the adaptive design would meet the scientific and medical goals of the studies. Conclusion: Introducing the adaptive design at early conceptualization proved critical to successful adoption and implementation of that trial. Involving key

  19. An Adaptive Cross-Architecture Combination Method for Graph Traversal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    You, Yang; Song, Shuaiwen; Kerbyson, Darren J.

    2014-06-18

    Breadth-First Search (BFS) is widely used in many real-world applications including computational biology, social networks, and electronic design automation. The combination method, using both top-down and bottom-up techniques, is the most effective BFS approach. However, current combination methods rely on trial-and-error and exhaustive search to locate the optimal switching point, which may cause significant runtime overhead. To solve this problem, we design an adaptive method based on regression analysis to predict an optimal switching point for the combination method at runtime within less than 0.1% of the BFS execution time.
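
    The combination (top-down/bottom-up) BFS idea can be sketched as follows. The adaptive, regression-based prediction of the switching point is the paper's contribution and is not reproduced; a simple frontier-size fraction stands in for it here.

```python
def hybrid_bfs(adj, source, switch_fraction=0.05):
    """Breadth-first search combining top-down and bottom-up sweeps.

    adj : list of neighbor lists (undirected graph).
    The switching heuristic below is a placeholder for the paper's runtime
    regression predictor. Returns the BFS level of every vertex (-1 if unreachable).
    """
    n = len(adj)
    level = [-1] * n
    level[source] = 0
    frontier = [source]
    depth = 0
    while frontier:
        next_frontier = []
        if len(frontier) < switch_fraction * n:
            # Top-down: expand the frontier outward.
            for u in frontier:
                for v in adj[u]:
                    if level[v] == -1:
                        level[v] = depth + 1
                        next_frontier.append(v)
        else:
            # Bottom-up: every unvisited vertex looks for a parent in the frontier.
            in_frontier = set(frontier)
            for v in range(n):
                if level[v] == -1 and any(u in in_frontier for u in adj[v]):
                    level[v] = depth + 1
                    next_frontier.append(v)
        frontier = next_frontier
        depth += 1
    return level

# Usage on a small path-plus-branch graph (hypothetical example).
adj = [[1], [0, 2, 3], [1], [1, 4], [3]]
print(hybrid_bfs(adj, 0))   # -> [0, 1, 2, 2, 3]
```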

  20. The Impact of Climate Projection Method on the Analysis of Climate Change in Semi-arid Basins

    NASA Astrophysics Data System (ADS)

    Halper, E.; Shamir, E.

    2016-12-01

    In small basins with arid climates, rainfall characteristics are highly variable and stream flow is tightly coupled with the nuances of rainfall events (e.g., hourly precipitation patterns). Climate change assessments in these basins typically employ CMIP5 projections downscaled with Bias Corrected Statistical Downscaling and Bias Correction/Constructed Analogs (BCSD-BCCA) methods, but these products have drawbacks. Specifically, BCSD-BCCA projections do not explicitly account for localized physical precipitation mechanisms (e.g., monsoon and snowfall) that are essential to many hydrological systems in the U.S. Southwest. An investigation of the impact of different types of precipitation projections for two kinds of hydrologic studies is being conducted under the U.S. Bureau of Reclamation's Science and Technology Grant Program. An innovative modeling framework consisting of a weather generator of likely hourly precipitation scenarios, coupled with rainfall-runoff, river routing and groundwater models, has been developed for the Nogales, Arizona area. This framework can simulate the impact of future climate on municipal water operations. It allows the rigorous comparison of the BCSD-BCCA methods with alternative approaches, including rainfall output from dynamically downscaled Regional Climate Models (RCM), a stochastic rainfall generator forced by either Global Climate Models (GCM) or RCM, and projections using historical records conditioned on either GCM or RCM. The results will provide guidance for incorporating climate change projections into hydrologic studies of semi-arid areas. The project extends this comparison to analyses of flood control. Large flows on the Bill Williams River are a concern for the operation of dams along the Lower Colorado River. After adapting the weather generator for this region, we will evaluate the model performance for rainfall and stream flow, with emphasis on statistical features important to the specific needs of flood

  1. Tailoring the visual communication of climate projections for local adaptation practitioners in Germany and the UK

    PubMed Central

    Lorenz, Susanne; Dessai, Suraje; Forster, Piers M.; Paavola, Jouni

    2015-01-01

    Visualizations are widely used in the communication of climate projections. However, their effectiveness has rarely been assessed among their target audience. Given recent calls to increase the usability of climate information through the tailoring of climate projections, it is imperative to assess the effectiveness of different visualizations. This paper explores the complexities of tailoring through an online survey conducted with 162 local adaptation practitioners in Germany and the UK. The survey examined respondents’ assessed and perceived comprehension (PC) of visual representations of climate projections as well as preferences for using different visualizations in communicating and planning for a changing climate. Comprehension and use are tested using four different graph formats, which are split into two pairs. Within each pair the information content is the same but is visualized differently. We show that even within a fairly homogeneous user group, such as local adaptation practitioners, there are clear differences in respondents’ comprehension of and preference for visualizations. We do not find a consistent association between assessed comprehension and PC or use within the two pairs of visualizations that we analysed. There is, however, a clear link between PC and use of graph format. This suggests that respondents use what they think they understand the best, rather than what they actually understand the best. These findings highlight that audience-specific targeted communication may be more complex and challenging than previously recognized. PMID:26460109

  2. A Hyperspherical Adaptive Sparse-Grid Method for High-Dimensional Discontinuity Detection

    DOE PAGES

    Zhang, Guannan; Webster, Clayton G.; Gunzburger, Max D.; ...

    2015-06-24

    This study proposes and analyzes a hyperspherical adaptive hierarchical sparse-grid method for detecting jump discontinuities of functions in high-dimensional spaces. The method is motivated by the theoretical and computational inefficiencies of well-known adaptive sparse-grid methods for discontinuity detection. Our novel approach constructs a function representation of the discontinuity hypersurface of an N-dimensional discontinuous quantity of interest, by virtue of a hyperspherical transformation. Then, a sparse-grid approximation of the transformed function is built in the hyperspherical coordinate system, whose value at each point is estimated by solving a one-dimensional discontinuity detection problem. Due to the smoothness of the hypersurface, the new technique can identify jump discontinuities with significantly reduced computational cost, compared to existing methods. In addition, hierarchical acceleration techniques are also incorporated to further reduce the overall complexity. Rigorous complexity analyses of the new method are provided as are several numerical examples that illustrate the effectiveness of the approach.
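
    The hyperspherical transformation underlying the method can be illustrated with the standard Cartesian-to-hyperspherical mapping below; the sparse-grid approximation of the discontinuity hypersurface itself is not shown.

```python
import numpy as np

def to_hyperspherical(x):
    """Map a point x in R^N to (r, phi_1, ..., phi_{N-1}).

    Standard convention: phi_1..phi_{N-2} in [0, pi], phi_{N-1} in [-pi, pi].
    The discontinuity hypersurface r = g(phi) would then be approximated on a
    sparse grid in the angular coordinates (not shown here).
    """
    x = np.asarray(x, dtype=float)
    r = np.linalg.norm(x)
    phis = []
    for k in range(len(x) - 2):
        # angle between x_k and the norm of the remaining tail of the vector
        phis.append(np.arctan2(np.linalg.norm(x[k + 1:]), x[k]))
    phis.append(np.arctan2(x[-1], x[-2]))        # last angle covers the full circle
    return r, np.array(phis)

print(to_hyperspherical([1.0, 1.0, 0.0]))   # r = sqrt(2), phi_1 = pi/4, phi_2 = 0
```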

  3. Data Assimilation Methods on a Non-conservative Adaptive Mesh

    NASA Astrophysics Data System (ADS)

    Guider, Colin Thomas; Rabatel, Matthias; Carrassi, Alberto; Jones, Christopher K. R. T.

    2017-04-01

    Adaptive mesh methods are used to model a wide variety of physical phenomena. Some of these models, in particular those of sea ice movement, are particularly interesting in that they use a remeshing process to remove and insert mesh points at various points in their evolution. This presents a challenge in developing compatible data assimilation schemes, as the dimension of the state space we wish to estimate can change over time when these remeshings occur. In this work, we first describe a remeshing scheme for an adaptive mesh in one dimension. We then develop advanced data assimilation methods that are appropriate for such a moving and remeshed grid. We hope to extend these techniques to two-dimensional models, like the Lagrangian sea ice model neXtSIM (Rampal, Bouillon, Ólason, and Morlighem, "neXtSIM: a new Lagrangian sea ice model," The Cryosphere, 10(3): 1055-1073, 2016).

  4. Numerical simulation of h-adaptive immersed boundary method for freely falling disks

    NASA Astrophysics Data System (ADS)

    Zhang, Pan; Xia, Zhenhua; Cai, Qingdong

    2018-05-01

    In this work, a freely falling disk with aspect ratio 1/10 is directly simulated by using an adaptive numerical model implemented on a parallel computation framework JASMIN. The adaptive numerical model is a combination of the h-adaptive mesh refinement technique and the implicit immersed boundary method (IBM). Our numerical results agree well with the experimental results in all of the six degrees of freedom of the disk. Furthermore, very similar vortex structures observed in the experiment were also obtained.

  5. Mixed Methods in Intervention Research: Theory to Adaptation

    ERIC Educational Resources Information Center

    Nastasi, Bonnie K.; Hitchcock, John; Sarkar, Sreeroopa; Burkholder, Gary; Varjas, Kristen; Jayasena, Asoka

    2007-01-01

    The purpose of this article is to demonstrate the application of mixed methods research designs to multiyear programmatic research and development projects whose goals include integration of cultural specificity when generating or translating evidence-based practices. The authors propose a set of five mixed methods designs related to different…

  6. Adapting a perinatal empathic training method from South Africa to Germany.

    PubMed

    Knapp, Caprice; Honikman, Simone; Wirsching, Michael; Husni-Pascha, Gidah; Hänselmann, Eva

    2018-01-01

    Maternal mental health conditions are prevalent across the world. For women, the perinatal period is associated with increased rates of depression and anxiety. At the same time, there is widespread documentation of disrespectful care for women by maternity health staff. Improving the empathic engagement skills of maternity healthcare workers may enable them to respond to the mental health needs of their clients more effectively. In South Africa, a participatory empathic training method, the "Secret History", has been used as part of a national Department of Health training program with maternity staff and has shown promising results. For this paper, we aimed to describe an adaptation of the Secret History empathic training method from the South African to the German setting and to evaluate the adapted training. The pilot study occurred in an academic medical center in Germany. A focus group (n = 8) was used to adapt the training by describing the local context and changing the materials to be relevant to Germany. After adapting the materials, the pilot training was conducted with a mixed group of professionals (n = 15), many of whom were trainers themselves. A pre-post survey assessed the participants' empathy levels and attitudes towards the training method. In adapting the materials, the focus group discussion generated several experiences that were considered to be typical interpersonal and structural challenges facing healthcare workers in maternal care in Germany. These experiences were crafted into case scenarios that then formed the basis of the activities used in the Secret History empathic training pilot. Evaluation of the pilot training showed that although the participants had high levels of empathy in the pre-phase (100% estimated their empathic ability as high or very high), 69% became more aware of their own emotional experiences with patients and the need for self-care after the training. A majority, or 85%, indicated that the training

  7. HIFI-C: a robust and fast method for determining NMR couplings from adaptive 3D to 2D projections.

    PubMed

    Cornilescu, Gabriel; Bahrami, Arash; Tonelli, Marco; Markley, John L; Eghbalnia, Hamid R

    2007-08-01

    We describe a novel method for the robust, rapid, and reliable determination of J couplings in multi-dimensional NMR coupling data, including small couplings from larger proteins. The method, "High-resolution Iterative Frequency Identification of Couplings" (HIFI-C), is an extension of the adaptive and intelligent data collection approach introduced earlier in HIFI-NMR. HIFI-C collects one or more optimally tilted two-dimensional (2D) planes of a 3D experiment, identifies peaks, and determines couplings with high resolution and precision. The HIFI-C approach, demonstrated here for the 3D quantitative J method, offers vital features that advance the goal of rapid and robust collection of NMR coupling data. (1) Tilted plane residual dipolar coupling (RDC) data are collected adaptively in order to offer an intelligent trade-off between data collection time and accuracy. (2) Data from independent planes can provide a statistical measure of reliability for each measured coupling. (3) Fast data collection enables measurements in cases where sample stability is a limiting factor (for example in the presence of an orienting medium required for residual dipolar coupling measurements). (4) For samples that are stable, or in experiments involving relatively stronger couplings, robust data collection enables more reliable determinations of couplings in shorter time, particularly for larger biomolecules. As a proof of principle, we have applied the HIFI-C approach to the 3D quantitative J experiment to determine N-C' RDC values for three proteins ranging from 56 to 159 residues (including a homodimer with 111 residues in each subunit). A number of factors influence the robustness and speed of data collection. These factors include the size of the protein, the experimental set-up, and the coupling being measured, among others. To exhibit a lower bound on robustness and the potential for time saving, the measurement of dipolar couplings for the N-C' vector represents a realistic

  8. Method study on fuzzy-PID adaptive control of electric-hydraulic hitch system

    NASA Astrophysics Data System (ADS)

    Li, Mingsheng; Wang, Liubu; Liu, Jian; Ye, Jin

    2017-03-01

    In this paper, a fuzzy-PID adaptive control method is applied to the control of a tractor electric-hydraulic hitch system. According to the characteristics of the system, a fuzzy-PID adaptive controller is designed and the electric-hydraulic hitch system model is established. Traction control and position control performance simulations are carried out with the common PID control method. A field test rig was set up to test the electric-hydraulic hitch system. The test results showed that, after the fuzzy-PID adaptive control is adopted, when the tillage depth steps from 0.1 m to 0.3 m, the system transition process time is 4 s, without overshoot, and when the tractive force steps from 3000 N to 7000 N, the system transition process time is 5 s, with a system overshoot of 25%.
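
    As a rough illustration of fuzzy gain scheduling layered on a PID loop, the sketch below adjusts the proportional and derivative gains from a single error-magnitude membership; the membership function, rule base, gains and plant are illustrative placeholders, not the controller or hitch-system model from the paper.

```python
import numpy as np

class FuzzyPID:
    """PID controller whose gains are adjusted by a simple fuzzy-style rule.

    The rule base is reduced to one heuristic for illustration: large errors
    increase Kp, small errors near the setpoint increase Kd (more damping).
    """
    def __init__(self, kp=1.0, ki=0.5, kd=0.05, e_scale=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.e_scale = e_scale
        self.integral = 0.0
        self.prev_error = 0.0

    def _membership_large(self, e):
        # degree to which |e| is "large" (clipped linear membership in [0, 1])
        return float(np.clip(abs(e) / self.e_scale, 0.0, 1.0))

    def update(self, setpoint, measurement, dt):
        e = setpoint - measurement
        mu = self._membership_large(e)
        kp = self.kp * (1.0 + 0.5 * mu)        # boost proportional action for large errors
        kd = self.kd * (1.0 + 0.5 * (1 - mu))  # boost damping near the setpoint
        self.integral += e * dt
        de = (e - self.prev_error) / dt
        self.prev_error = e
        return kp * e + self.ki * self.integral + kd * de

# Usage: drive a toy first-order plant toward a 0.3 m tillage-depth setpoint (illustrative).
ctrl, y, dt = FuzzyPID(e_scale=0.3), 0.1, 0.01
for _ in range(3000):
    u = ctrl.update(0.3, y, dt)
    y += dt * (-y + u)        # toy plant dynamics, not the hitch-system model
print(round(y, 3))            # approaches the 0.3 setpoint
```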

  9. An examination of an adapter method for measuring the vibration transmitted to the human arms.

    PubMed

    Xu, Xueyan S; Dong, Ren G; Welcome, Daniel E; Warren, Christopher; McDowell, Thomas W

    2015-09-01

    The objective of this study is to evaluate an adapter method for measuring the vibration on the human arms. Four instrumented adapters with different weights were used to measure the vibration transmitted to the wrist, forearm, and upper arm of each subject. Each adapter was attached at each location on the subjects using an elastic cloth wrap. Two laser vibrometers were also used to measure the transmitted vibration at each location to evaluate the validity of the adapter method. The apparent mass at the palm of the hand along the forearm direction was also measured to enhance the evaluation. This study found that the adapter and laser-measured transmissibility spectra were comparable with some systematic differences. While increasing the adapter mass reduced the resonant frequency at the measurement location, increasing the tightness of the adapter attachment increased the resonant frequency. However, the use of lightweight (≤15 g) adapters under medium attachment tightness did not change the basic trends of the transmissibility spectrum. The resonant features observed in the transmissibility spectra were also correlated with those observed in the apparent mass spectra. Because the local coordinate systems of the adapters may be significantly misaligned relative to the global coordinates of the vibration test systems, large errors were observed for the adapter-measured transmissibility in some individual orthogonal directions. This study, however, also demonstrated that the misalignment issue can be resolved by either using the total vibration transmissibility or by measuring the misalignment angles to correct the errors. Therefore, the adapter method is acceptable for understanding the basic characteristics of the vibration transmission in the human arms, and the adapter-measured data are acceptable for approximately modeling the system.

  10. Non-orthogonal spin-adaptation of coupled cluster methods: A new implementation of methods including quadruple excitations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthews, Devin A., E-mail: dmatthews@utexas.edu; Stanton, John F.

    2015-02-14

    The theory of non-orthogonal spin-adaptation for closed-shell molecular systems is applied to coupled cluster methods with quadruple excitations (CCSDTQ). Calculations at this level of detail are of critical importance in describing the properties of molecular systems to an accuracy which can meet or exceed modern experimental techniques. Such calculations are of significant (and growing) importance in such fields as thermodynamics, kinetics, and atomic and molecular spectroscopies. With respect to the implementation of CCSDTQ and related methods, we show that there are significant advantages to non-orthogonal spin-adaption with respect to simplification and factorization of the working equations and to creating anmore » efficient implementation. The resulting algorithm is implemented in the CFOUR program suite for CCSDT, CCSDTQ, and various approximate methods (CCSD(T), CC3, CCSDT-n, and CCSDT(Q))« less

  11. Projection of Summer Climate on Tokyo Metropolitan Area using Pseudo Global Warming Method

    NASA Astrophysics Data System (ADS)

    Adachi, S. A.; Kimura, F.; Kusaka, H.; Hara, M.

    2010-12-01

    Recent surface air temperature observations in most urban areas show a remarkable increasing trend affected by global warming and the heat island effect. There are many populous areas in Japan. In such areas, the effects of land-use change and urbanization on the local climate are not negligible (Fujibe, 2010). Heat stress for citizens there is expected to increase further in the future. Therefore, spatially detailed climate projection is required for making adaptation and mitigation plans. This study focuses on the Tokyo metropolitan area (TMA) in summer and aims to estimate the local climate change over the TMA in the 2070s using a regional climate model. The Regional Atmospheric Modeling System (RAMS) was used for downscaling. A single-layer urban canopy model (Kusaka et al., 2001) is built into RAMS as a parameterization expressing the features of the urban surface. We performed two experiments for estimating the present and future climate. In the present climate simulation, the initial and boundary conditions for RAMS are provided from the JRA-25/JCDAS. On the other hand, the Pseudo Global Warming (PGW) method (Sato et al., 2007) is applied to estimate the future climate, instead of the conventional dynamical downscaling method. The PGW method is expected to reduce the model biases in the future projection estimated by Atmosphere-Ocean General Circulation Models (AOGCM). The boundary conditions used in the PGW method are given by the PGW data, which are obtained by adding the monthly climate difference between the 1990s and 2070s estimated by AOGCMs to the 6-hourly reanalysis data. In addition, the uncertainty in the regional climate projection depending on the AOGCM projections is estimated from additional downscaling experiments using the different PGW data obtained from five AOGCMs. Acknowledgment: This work was supported by the Global Environment Research Fund (S-5-3) of the Ministry of the Environment, Japan. References: 1. Fujibe, F., Int. J. Climatol., doi
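
    The core of the PGW perturbation, adding a monthly AOGCM climate-change signal to reanalysis boundary fields, can be sketched as below; the array shapes, variable names and synthetic numbers are assumptions for illustration.

```python
import numpy as np

def pseudo_global_warming(reanalysis, gcm_future_clim, gcm_present_clim, months):
    """Build PGW boundary conditions by adding the GCM monthly climate-change
    signal (future minus present monthly climatology) to 6-hourly reanalysis.

    reanalysis     : array (time, lat, lon) of 6-hourly fields (e.g., temperature)
    gcm_*_clim     : arrays (12, lat, lon) of monthly climatologies from an AOGCM
    months         : array (time,) of month indices 1..12 for each reanalysis step
    Shapes and names are assumptions for illustration.
    """
    delta = gcm_future_clim - gcm_present_clim           # (12, lat, lon) change signal
    return reanalysis + delta[np.asarray(months) - 1]    # broadcast per time step

# Usage with tiny synthetic fields (hypothetical numbers).
t, ny, nx = 8, 3, 4
rean = 290.0 + np.zeros((t, ny, nx))
present = 289.0 + np.zeros((12, ny, nx))
future = present + 2.5                                   # uniform +2.5 K by the 2070s
months = np.array([7] * t)                               # all steps in July
print(pseudo_global_warming(rean, future, present, months)[0, 0, 0])   # 292.5
```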

  12. Practicum in adapted physical activity: a Dewey-inspired action research project.

    PubMed

    Standal, Øyvind; Rugseth, Gro

    2014-07-01

    The purpose of this study was to investigate what adapted physical activity (APA) students learn from their practicum experiences. One cohort of APA students participated, and data were generated from an action research project that included observations, reflective journals, and a focus group interview. The theoretical framework for the study was Dewey's and Wackerhausen's theories of reflections. The findings show the objects of students' reflections, the kind of conceptual resources they draw on while reflecting, and their knowledge interests. In addition, two paradoxes are identified: the tension between reflecting from and on own values, and how practicum as a valued experience of reality can become too difficult to handle. In conclusion, we reflect on how practicum learning can be facilitated.

  13. Method and system for training dynamic nonlinear adaptive filters which have embedded memory

    NASA Technical Reports Server (NTRS)

    Rabinowitz, Matthew (Inventor)

    2002-01-01

    Described herein is a method and system for training nonlinear adaptive filters (or neural networks) which have embedded memory. Such memory can arise in a multi-layer finite impulse response (FIR) architecture, or an infinite impulse response (IIR) architecture. We focus on filter architectures with separate linear dynamic components and static nonlinear components. Such filters can be structured so as to restrict their degrees of computational freedom based on a priori knowledge about the dynamic operation to be emulated. The method is detailed for an FIR architecture which consists of linear FIR filters together with nonlinear generalized single layer subnets. For the IIR case, we extend the methodology to a general nonlinear architecture which uses feedback. For these dynamic architectures, we describe how one can apply optimization techniques which make updates closer to the Newton direction than those of a steepest descent method, such as backpropagation. We detail a novel adaptive modified Gauss-Newton optimization technique, which uses an adaptive learning rate to determine both the magnitude and direction of update steps. For a wide range of adaptive filtering applications, the new training algorithm converges faster and to a smaller value of cost than both steepest-descent methods such as backpropagation-through-time, and standard quasi-Newton methods. We apply the algorithm to modeling the inverse of a nonlinear dynamic tracking system 5, as well as a nonlinear amplifier 6.
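
    The adaptive modified Gauss-Newton idea can be illustrated on a generic nonlinear least-squares fit, as below. The Jacobian is formed by finite differences and the step size adapts by simple backtracking; this is a stand-in for, not a reproduction of, the patented training rule.

```python
import numpy as np

def adaptive_gauss_newton(residual, theta, n_iter=50, lam=1e-3):
    """Minimize ||residual(theta)||^2 with damped Gauss-Newton steps and an
    adaptive step size (simple grow-on-success, shrink-on-failure rule).

    residual : function mapping parameters theta -> residual vector.
    """
    step = 1.0
    for _ in range(n_iter):
        r = residual(theta)
        eps = 1e-6
        # finite-difference Jacobian (illustrative; analytic Jacobians are preferable)
        J = np.column_stack([
            (residual(theta + eps * e) - r) / eps
            for e in np.eye(len(theta))
        ])
        delta = np.linalg.solve(J.T @ J + lam * np.eye(len(theta)), J.T @ r)
        # adapt the learning rate: halve while the cost fails to decrease
        while step > 1e-8 and np.sum(residual(theta - step * delta) ** 2) > np.sum(r ** 2):
            step *= 0.5
        theta = theta - step * delta
        step = min(1.0, step * 2.0)
    return theta

# Usage: fit y = a * tanh(b * x), a toy stand-in for a static nonlinearity.
rng = np.random.default_rng(1)
x = np.linspace(-2, 2, 100)
y = 1.5 * np.tanh(0.8 * x) + 0.01 * rng.standard_normal(100)
fit = adaptive_gauss_newton(lambda th: th[0] * np.tanh(th[1] * x) - y, np.array([1.0, 1.0]))
print(fit)   # approximately [1.5, 0.8]
```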

  14. Adaptive optics image restoration algorithm based on wavefront reconstruction and adaptive total variation method

    NASA Astrophysics Data System (ADS)

    Li, Dongming; Zhang, Lijuan; Wang, Ting; Liu, Huan; Yang, Jinhua; Chen, Guifen

    2016-11-01

    To improve the quality of adaptive optics (AO) images, we study an AO image restoration algorithm based on wavefront reconstruction technology and an adaptive total variation (TV) method in this paper. Firstly, wavefront reconstruction using Zernike polynomials is used to obtain an initial estimate of the point spread function (PSF). Then, we develop our proposed iterative solutions for AO image restoration, addressing the joint deconvolution issue. Image restoration experiments are performed to verify the restoration effect of our proposed algorithm. The experimental results show that, compared with the RL-IBD algorithm and the Wiener-IBD algorithm, the GMG measures (for a real AO image) from our algorithm are increased by 36.92% and 27.44% respectively, the computation times are decreased by 7.2% and 3.4% respectively, and the estimation accuracy is significantly improved.

  15. Adaptive finite element method for turbulent flow near a propeller

    NASA Astrophysics Data System (ADS)

    Pelletier, Dominique; Ilinca, Florin; Hetu, Jean-Francois

    1994-11-01

    This paper presents an adaptive finite element method based on remeshing to solve incompressible turbulent free shear flow near a propeller. Solutions are obtained in primitive variables using a highly accurate finite element approximation on unstructured grids. Turbulence is modeled by a mixing length formulation. Two general purpose error estimators, which take into account swirl and the variation of the eddy viscosity, are presented and applied to the turbulent wake of a propeller. Predictions compare well with experimental measurements. The proposed adaptive scheme is robust, reliable and cost effective.

  16. Development of Underwater Laser Scaling Adapter

    NASA Astrophysics Data System (ADS)

    Bluss, Kaspars

    2012-12-01

    In this paper the developed laser scaling adapter is presented. The scaling adapter is equipped with a twin laser unit in which two parallel laser beams are projected onto any target, giving an exact indication of scale. The body of the laser scaling adapter is made of Teflon, the density of which is approximately two times the water density. The development involved multiple challenges: numerical hydrodynamic calculations for choosing an appropriate shape which would reduce the effects of turbulence, an accurate sealing of the power supply and the laser diodes, and others. The precision is estimated by the partial derivation method. Both experimental and theoretical data indicate the overall precision error to be within a 1% margin. This paper presents the development steps of such an underwater laser scaling adapter for a remotely operated vehicle (ROV).

  17. A solution-adaptive hybrid-grid method for the unsteady analysis of turbomachinery

    NASA Technical Reports Server (NTRS)

    Mathur, Sanjay R.; Madavan, Nateri K.; Rajagopalan, R. G.

    1993-01-01

    A solution-adaptive method for the time-accurate analysis of two-dimensional flows in turbomachinery is described. The method employs a hybrid structured-unstructured zonal grid topology in conjunction with appropriate modeling equations and solution techniques in each zone. The viscous flow region in the immediate vicinity of the airfoils is resolved on structured O-type grids while the rest of the domain is discretized using an unstructured mesh of triangular cells. Implicit, third-order accurate, upwind solutions of the Navier-Stokes equations are obtained in the inner regions. In the outer regions, the Euler equations are solved using an explicit upwind scheme that incorporates a second-order reconstruction procedure. An efficient and robust grid adaptation strategy, including both grid refinement and coarsening capabilities, is developed for the unstructured grid regions. Grid adaptation is also employed to facilitate information transfer at the interfaces between unstructured grids in relative motion. Results for grid adaptation to various features pertinent to turbomachinery flows are presented. Good comparisons between the present results and experimental measurements and earlier structured-grid results are obtained.

  18. Effectiveness of an Adaptation of the Project Connect Health Systems Intervention: Youth and Clinic-Level Findings.

    PubMed

    Loosier, Penny S; Doll, Shelli; Lepar, Danielle; Ward, Kristin; Gamble, Ginger; Dittus, Patricia J

    2016-08-01

    The Project Connect Health Systems Intervention (Project Connect) uses a systematic process of collecting community and healthcare infrastructure information to craft a referral guide highlighting local healthcare providers who provide high-quality sexual and reproductive healthcare. Previous self-report data on healthcare usage indicated Project Connect was successful with sexually experienced female youth, where it increased rates of human immunodeficiency virus (HIV) and sexually transmitted disease (STD) testing and receipt of contraception. This adaptation of Project Connect examined its effectiveness in a new context and via collection of clinic encounter-level data. Project Connect was implemented in 3 high schools (only 2 schools remained open throughout the entire project period). Participant recruitment and data collection occurred in 5 of 8 participating health clinics. Students completed Youth Surveys (N = 608) and a Clinic Survey (paired with medical data abstraction in 2 clinics [N = 305]). Students were more likely than nonstudents to report having reached a clinic via Project Connect. Nearly 40% of students attended a Project Connect school, with 32.7% using Project Connect to reach the clinic. Students were most likely to have been referred by a school nurse or coach. Project Connect is a low-cost, sustainable structural intervention with multiple applications within schools, either as a standalone intervention or in combination with ongoing efforts. © 2016, American School Health Association.

  19. Method and system for environmentally adaptive fault tolerant computing

    NASA Technical Reports Server (NTRS)

    Copenhaver, Jason L. (Inventor); Jeremy, Ramos (Inventor); Wolfe, Jeffrey M. (Inventor); Brenner, Dean (Inventor)

    2010-01-01

    A method and system for adapting fault tolerant computing. The method includes the steps of measuring an environmental condition representative of an environment. An on-board processing system's sensitivity to the measured environmental condition is measured. It is determined whether to reconfigure a fault tolerance of the on-board processing system based in part on the measured environmental condition. The fault tolerance of the on-board processing system may be reconfigured based in part on the measured environmental condition.

  20. Disturbance observer-based adaptive sliding mode hybrid projective synchronisation of identical fractional-order financial systems

    NASA Astrophysics Data System (ADS)

    Khan, Ayub; Tyagi, Arti

    2018-05-01

    In this paper, we have studied the hybrid projective synchronisation for incommensurate, integer and commensurate fractional-order financial systems with unknown disturbance. To tackle the problem of unknown bounded disturbance, fractional-order disturbance observer is designed to approximate the unknown disturbance. Further, we have introduced simple sliding mode surface and designed adaptive sliding mode controllers incorporating with the designed fractional-order disturbance observer to achieve a bounded hybrid projective synchronisation between two identical fractional-order financial model with different initial conditions. It is shown that the slave system with disturbance can be synchronised with the projection of the master system generated through state transformation. Simulation results are presented to ensure the validity and effectiveness of the proposed sliding mode control scheme in the presence of external bounded unknown disturbance. Also, synchronisation error for commensurate, integer and incommensurate fractional-order financial systems is studied in numerical simulation.

  1. An adaptive proper orthogonal decomposition method for model order reduction of multi-disc rotor system

    NASA Astrophysics Data System (ADS)

    Jin, Yulin; Lu, Kuan; Hou, Lei; Chen, Yushu

    2017-12-01

    The proper orthogonal decomposition (POD) method is a primary and efficient tool for order reduction of high-dimensional complex systems in many research fields. However, the robustness problem of this method remains unsolved, although some modified POD methods have been proposed to address it. In this paper, a new adaptive POD method called the interpolation Grassmann manifold (IGM) method is proposed to address the weakness of the local property of the interpolation tangent-space of Grassmann manifold (ITGM) method in a wider parametric region. The method is demonstrated on a nonlinear rotor system of 33 degrees of freedom (DOFs) with a pair of liquid-film bearings and a pedestal looseness fault. The motion region of the rotor system is divided into two parts: a simple motion region and a complex motion region. The adaptive POD method is compared with the ITGM method for large and small parameter spans in the two parametric regions to present the advantage of this method and the disadvantage of the ITGM method. Comparisons of the responses are used to verify the accuracy and robustness of the adaptive POD method, and the computational efficiency is also analyzed. As a result, the new adaptive POD method has strong robustness and high computational efficiency and accuracy over a wide range of parameters.
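
    The baseline POD projection underlying the reduced-order models discussed above can be sketched with an SVD of a snapshot matrix, as below; the adaptive Grassmann-manifold interpolation that is the paper's contribution is not reproduced, and the synthetic snapshots are illustrative.

```python
import numpy as np

def pod_basis(snapshots, energy=0.999):
    """Return a POD basis capturing the requested fraction of snapshot energy.

    snapshots : array (n_dof, n_snapshots) of full-order states.
    The adaptive Grassmann-manifold interpolation of bases across parameters
    is not reproduced here.
    """
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    cum = np.cumsum(s**2) / np.sum(s**2)
    k = int(np.searchsorted(cum, energy)) + 1
    return U[:, :k]                      # reduced basis Phi (n_dof x k)

# Usage: reduce synthetic snapshots of a 200-DOF system with 3 dominant modes.
rng = np.random.default_rng(2)
modes = rng.standard_normal((200, 3))
coeffs = rng.standard_normal((3, 40))
X = modes @ coeffs + 1e-6 * rng.standard_normal((200, 40))
Phi = pod_basis(X)
print(Phi.shape)                         # expected (200, 3); reduced coordinates q = Phi.T @ x
```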

  2. Hybrid Adaptive Flight Control with Model Inversion Adaptation

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan

    2011-01-01

    This study investigates a hybrid adaptive flight control method as a design possibility for a flight control system that can enable an effective adaptation strategy to deal with off-nominal flight conditions. The hybrid adaptive control blends both direct and indirect adaptive control in a model inversion flight control architecture. The blending of both direct and indirect adaptive control provides a much more flexible and effective adaptive flight control architecture than that with either direct or indirect adaptive control alone. The indirect adaptive control is used to update the model inversion controller by an on-line parameter estimation of uncertain plant dynamics based on two methods. The first parameter estimation method is an indirect adaptive law based on the Lyapunov theory, and the second method is a recursive least-squares indirect adaptive law. The model inversion controller is therefore made to adapt to changes in the plant dynamics due to uncertainty. As a result, the modeling error is reduced that directly leads to a decrease in the tracking error. In conjunction with the indirect adaptive control that updates the model inversion controller, a direct adaptive control is implemented as an augmented command to further reduce any residual tracking error that is not entirely eliminated by the indirect adaptive control.
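
    The recursive least-squares indirect adaptive law mentioned above can be sketched as a standard RLS parameter estimator, as below; the regressor, forgetting factor and flight-dynamics context are placeholders, and the model-inversion architecture itself is not shown.

```python
import numpy as np

class RecursiveLeastSquares:
    """Standard RLS estimator of theta for a linear-in-parameters model y = phi @ theta.

    In an indirect adaptive law, theta would hold the uncertain plant parameters
    and phi the measured regressor; here both are generic placeholders.
    """
    def __init__(self, n_params, forgetting=0.99):
        self.theta = np.zeros(n_params)
        self.P = 1e3 * np.eye(n_params)      # large initial covariance
        self.lam = forgetting

    def update(self, phi, y):
        phi = np.asarray(phi, dtype=float)
        Pphi = self.P @ phi
        gain = Pphi / (self.lam + phi @ Pphi)
        self.theta = self.theta + gain * (y - phi @ self.theta)
        self.P = (self.P - np.outer(gain, Pphi)) / self.lam
        return self.theta

# Usage: recover theta = [2.0, -1.0] from noisy scalar measurements.
rng = np.random.default_rng(3)
rls = RecursiveLeastSquares(2)
true_theta = np.array([2.0, -1.0])
for _ in range(500):
    phi = rng.standard_normal(2)
    y = phi @ true_theta + 0.01 * rng.standard_normal()
    rls.update(phi, y)
print(np.round(rls.theta, 2))   # approximately [ 2. -1.]
```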

  3. An examination of an adapter method for measuring the vibration transmitted to the human arms

    PubMed Central

    Xu, Xueyan S.; Dong, Ren G.; Welcome, Daniel E.; Warren, Christopher; McDowell, Thomas W.

    2016-01-01

    The objective of this study is to evaluate an adapter method for measuring the vibration on the human arms. Four instrumented adapters with different weights were used to measure the vibration transmitted to the wrist, forearm, and upper arm of each subject. Each adapter was attached at each location on the subjects using an elastic cloth wrap. Two laser vibrometers were also used to measure the transmitted vibration at each location to evaluate the validity of the adapter method. The apparent mass at the palm of the hand along the forearm direction was also measured to enhance the evaluation. This study found that the adapter and laser-measured transmissibility spectra were comparable with some systematic differences. While increasing the adapter mass reduced the resonant frequency at the measurement location, increasing the tightness of the adapter attachment increased the resonant frequency. However, the use of lightweight (≤15 g) adapters under medium attachment tightness did not change the basic trends of the transmissibility spectrum. The resonant features observed in the transmissibility spectra were also correlated with those observed in the apparent mass spectra. Because the local coordinate systems of the adapters may be significantly misaligned relative to the global coordinates of the vibration test systems, large errors were observed for the adapter-measured transmissibility in some individual orthogonal directions. This study, however, also demonstrated that the misalignment issue can be resolved by either using the total vibration transmissibility or by measuring the misalignment angles to correct the errors. Therefore, the adapter method is acceptable for understanding the basic characteristics of the vibration transmission in the human arms, and the adapter-measured data are acceptable for approximately modeling the system. PMID:26834309

  4. Prioritizing sewer rehabilitation projects using AHP-PROMETHEE II ranking method.

    PubMed

    Kessili, Abdelhak; Benmamar, Saadia

    2016-01-01

    The aim of this paper is to develop a methodology for the prioritization of sewer rehabilitation projects for Algiers (Algeria) sewer networks to support the National Sanitation Office in its challenge to make decisions on prioritization of sewer rehabilitation projects. The methodology applies multiple-criteria decision making. The study includes 47 projects (collectors) and 12 criteria to evaluate them. These criteria represent the different issues considered in the prioritization of the projects, which are structural, hydraulic, environmental, financial, social and technical. The analytic hierarchy process (AHP) is used to determine weights of the criteria and the Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE II) method is used to obtain the final ranking of the projects. The model was verified using the sewer data of Algiers. The results have shown that the method can be used for prioritizing sewer rehabilitation projects.
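
    The AHP step, deriving criteria weights from a pairwise comparison matrix via its principal eigenvector and checking consistency, can be sketched as below; the example matrix is hypothetical, and the PROMETHEE II outranking of the projects is not shown.

```python
import numpy as np

RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

def ahp_weights(pairwise):
    """AHP criteria weights from a reciprocal pairwise-comparison matrix.

    Returns (weights, consistency_ratio); CR < 0.1 is conventionally acceptable.
    """
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                     # principal (Perron) eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)            # consistency index
    return w, ci / RI[n]

# Hypothetical 3-criterion comparison (e.g., structural vs hydraulic vs social importance).
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w, cr = ahp_weights(A)
print(np.round(w, 3), round(cr, 3))
```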

  5. [Dissertations 25 year after date 41. Older people's adaptability].

    PubMed

    de Baat, C; Gerritsen, A E; van der Putten, G J; van der Maarel-Wierink, C D

    2015-09-01

    In 1990, the thesis 'Removable complete dentures in older people, an issue dealing with adaptability?' was published. Among other things, this thesis aimed at finding a method of measuring older people's adaptability to removable complete dentures. Its conclusion was that a subscale of the "Beoordelingsschaal voor Oudere Patiënten" (Rating scale for older patients) had predictive value. Subsequently, only a few research projects on this topic have been carried out. They dealt with the adaptation demonstrably achieved after treatment, the realised adaptation. The results were disappointing. Ever since the availability of endosseous oral implants, research into adaptability to conventional removable complete dentures has seemed less relevant. During the last decades, inquiries into a method of measuring treatment effectiveness have focused on older people's quality of life and general health condition. However, to adequately assess an older person's general health condition and load-taking capacity with respect to oral health care, some experience is indispensable.

  6. Computerized Adaptive Testing Project: Objectives and Requirements.

    DTIC Science & Technology

    1982-07-01

    developing a computerized adaptive testing system (CAT). A joint-service coordinated effort is in progress to develop a computerized adaptive testing (CAT) system and to evaluate its potential for use in the Military Enlistment Processing Stations... lead laboratory for this effort. This report is intended to serve as a working paper documenting CAT system functional requirements and schedules. It

  7. A Remote Sensing Image Fusion Method based on adaptive dictionary learning

    NASA Astrophysics Data System (ADS)

    He, Tongdi; Che, Zongxi

    2018-01-01

    This paper discusses a remote sensing fusion method based on adaptive sparse representation (ASP) that provides improved spectral information, reduces data redundancy and decreases system complexity. First, the training sample set is formed by taking random blocks from the images to be fused; the dictionary is then constructed from the training samples, and the remaining terms are clustered to obtain the complete dictionary by iterative processing at each step. Second, a self-adaptive weighted coefficient rule based on regional energy is used to select the feature fusion coefficients and complete the reconstruction of the image blocks. Finally, the reconstructed image blocks are rearranged and averaged to obtain the final fused images. Experimental results show that the proposed method is superior to other traditional remote sensing image fusion methods in both spectral information preservation and spatial resolution.

  8. Adaptation potential of naturally ventilated barns to high temperature extremes: The OptiBarn project

    NASA Astrophysics Data System (ADS)

    Menz, Christoph

    2016-04-01

    Climate change interferes with various aspects of the socio-economic system. One important aspect is its influence on animal husbandry, especially dairy farming. Dairy cows are usually kept in naturally ventilated barns (NVBs), which are particularly vulnerable to extreme events due to their low adaptation capabilities. Effective adaptation to high outdoor temperatures, for example, is only possible under certain wind and humidity conditions. High temperature extremes are expected to increase in number and strength under climate change. To assess the impact of this change on NVBs and dairy cows, the changes in wind and humidity also need to be considered. Hence we need to consider the multivariate structure of future temperature extremes. The OptiBarn project aims to develop sustainable adaptation strategies for dairy housing under climate change in Europe by considering the multivariate structure of high temperature extremes. In a first step we identify various multivariate high temperature extremes for three core regions in Europe. With respect to dairy cows in NVBs we will focus on the wind and humidity fields during high temperature events. In a second step we will use the CORDEX-EUR-11 ensemble to evaluate the capability of the RCMs to model such events and assess their future change potential. By transferring the outdoor conditions to indoor climate and animal wellbeing, the results of this assessment can be used to develop technical, architectural and animal-specific adaptation strategies for high temperature extremes.

  9. Integrated Framework for an Urban Climate Adaptation Tool

    NASA Astrophysics Data System (ADS)

    Omitaomu, O.; Parish, E. S.; Nugent, P.; Mei, R.; Sylvester, L.; Ernst, K.; Absar, M.

    2015-12-01

    Cities have an opportunity to become more resilient to future climate change through investments made in urban infrastructure today. However, most cities lack access to credible high-resolution climate change projection information needed to assess and address potential vulnerabilities from future climate variability. Therefore, we present an integrated framework for developing an urban climate adaptation tool (Urban-CAT). Urban-CAT consists of four modules. Firstly, it provides climate projections at different spatial resolutions for quantifying urban landscape. Secondly, this projected data is combined with socio-economic data using leading and lagging indicators for assessing landscape vulnerability to climate extremes (e.g., urban flooding). Thirdly, a neighborhood scale modeling approach is presented for identifying candidate areas for adaptation strategies (e.g., green infrastructure as an adaptation strategy for urban flooding). Finally, all these capabilities are made available as a web-based tool to support decision-making and communication at the neighborhood and city levels. In this paper, we present some of the methods that drive each of the modules and demo some of the capabilities available to-date using the City of Knoxville in Tennessee as a case study.

  10. Highly accurate adaptive TOF determination method for ultrasonic thickness measurement

    NASA Astrophysics Data System (ADS)

    Zhou, Lianjie; Liu, Haibo; Lian, Meng; Ying, Yangwei; Li, Te; Wang, Yongqing

    2018-04-01

    Determining the time of flight (TOF) is critical for precise ultrasonic thickness measurement. However, the relatively low signal-to-noise ratio (SNR) of the received signals can induce significant TOF determination errors. In this paper, an adaptive time delay estimation method is developed to improve the accuracy of TOF determination. An improved variable step size adaptive algorithm with a comprehensive step size control function is proposed. Meanwhile, a cubic spline fitting approach is employed to alleviate the restriction of the finite sampling interval. Simulation experiments under different SNR conditions were conducted for performance analysis. The simulation results demonstrate the performance advantage of the proposed TOF determination method over existing methods. Compared with the conventional fixed-step-size algorithm and the Kwong and Aboulnasr algorithms, the steady-state mean square deviation of the proposed algorithm was generally lower, which makes it more suitable for TOF determination. Further, ultrasonic thickness measurement experiments were performed on aluminum alloy plates of various thicknesses. They indicated that the proposed TOF determination method is more robust even under low SNR conditions and that the ultrasonic thickness measurement accuracy can be significantly improved.
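
    As a rough illustration of the adaptive time delay estimation idea, the sketch below pairs a variable-step-size LMS filter with cubic-spline peak refinement; the Kwong-style step-size rule and all parameter values are stand-ins for the paper's comprehensive step size control function, not its actual design.

        import numpy as np
        from scipy.interpolate import CubicSpline

        def vss_lms_delay(ref, recv, n_taps=64, mu=0.01, alpha=0.97, gamma=1e-4):
            """Estimate the delay (in samples) of `recv` relative to `ref` with a
            variable-step-size LMS filter; the step-size update is an illustrative
            error-driven rule, not the paper's control function."""
            w = np.zeros(n_taps)
            x_buf = np.zeros(n_taps)
            for n in range(len(recv)):
                x_buf = np.roll(x_buf, 1)
                x_buf[0] = ref[n] if n < len(ref) else 0.0
                e = recv[n] - w @ x_buf
                mu = alpha * mu + gamma * e * e                 # larger error -> larger step
                mu = min(mu, 0.5 / (x_buf @ x_buf + 1e-12))     # keep the update stable
                w += 2.0 * mu * e * x_buf
            # the dominant tap marks the delay; a cubic spline refines it to
            # sub-sample resolution, easing the finite-sampling-interval limit
            spline = CubicSpline(np.arange(n_taps), w)
            fine = np.linspace(0.0, n_taps - 1, 10 * n_taps)
            return fine[np.argmax(spline(fine))]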

  11. Advanced Dynamically Adaptive Algorithms for Stochastic Simulations on Extreme Scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiu, Dongbin

    2017-03-03

    The focus of the project is the development of mathematical methods and high-performance computational tools for stochastic simulations, with a particular emphasis on computations at extreme scales. The core of the project revolves around the design of highly efficient and scalable numerical algorithms that can adaptively and accurately resolve, in high-dimensional spaces, stochastic problems with limited smoothness, even those containing discontinuities.

  12. Societal transformation and adaptation necessary to manage dynamics in flood hazard and risk mitigation (TRANS-ADAPT)

    NASA Astrophysics Data System (ADS)

    Fuchs, Sven; Thaler, Thomas; Bonnefond, Mathieu; Clarke, Darren; Driessen, Peter; Hegger, Dries; Gatien-Tournat, Amandine; Gralepois, Mathilde; Fournier, Marie; Mees, Heleen; Murphy, Conor; Servain-Courant, Sylvie

    2015-04-01

    Facing the challenges of climate change, this project aims to analyse and evaluate the multiple uses of flood alleviation schemes with respect to social transformation in communities exposed to flood hazards in Europe. The overall goals are: (1) the identification of indicators and parameters necessary for strategies to increase societal resilience, (2) an analysis of the institutional settings needed for societal transformation, and (3) perspectives on the changing divisions of responsibilities between public and private actors necessary to arrive at more resilient societies. This proposal assesses societal transformations from the perspective of these changing divisions of responsibilities. Each risk mitigation measure is built on a narrative of exchanges and relations between people and may therefore condition the outputs. As such, governance emerges from people interacting and defining risk mitigation measures; such measures and climate change adaptation are therefore simultaneously both outcomes of, and productive of, public and private responsibilities. Building on current knowledge, this project will focus on different dimensions of adaptation and mitigation strategies based on social, economic and institutional incentives and settings, centring on the linkages between these dimensions and complementing existing flood risk governance arrangements. The policy dimension of adaptation, predominantly decisions on the societally admissible level of vulnerability and risk, will be evaluated by a human-environment interaction approach using multiple methods and the assessment of the social capacities of stakeholders across scales. As such, the challenges of adaptation to flood risk will be tackled by converting scientific frameworks into practical assessment and policy advice. In addressing the relationship between these dimensions of adaptation on different temporal and spatial scales, this

  13. Tailoring the visual communication of climate projections for local adaptation practitioners in Germany and the UK.

    PubMed

    Lorenz, Susanne; Dessai, Suraje; Forster, Piers M; Paavola, Jouni

    2015-11-28

    Visualizations are widely used in the communication of climate projections. However, their effectiveness has rarely been assessed among their target audience. Given recent calls to increase the usability of climate information through the tailoring of climate projections, it is imperative to assess the effectiveness of different visualizations. This paper explores the complexities of tailoring through an online survey conducted with 162 local adaptation practitioners in Germany and the UK. The survey examined respondents' assessed and perceived comprehension (PC) of visual representations of climate projections as well as preferences for using different visualizations in communicating and planning for a changing climate. Comprehension and use are tested using four different graph formats, which are split into two pairs. Within each pair the information content is the same but is visualized differently. We show that even within a fairly homogeneous user group, such as local adaptation practitioners, there are clear differences in respondents' comprehension of and preference for visualizations. We do not find a consistent association between assessed comprehension and PC or use within the two pairs of visualizations that we analysed. There is, however, a clear link between PC and use of graph format. This suggests that respondents use what they think they understand the best, rather than what they actually understand the best. These findings highlight that audience-specific targeted communication may be more complex and challenging than previously recognized. © 2015 The Authors.

  14. An Analytical Method for Measuring Competence in Project Management

    ERIC Educational Resources Information Center

    González-Marcos, Ana; Alba-Elías, Fernando; Ordieres-Meré, Joaquín

    2016-01-01

    The goal of this paper is to present a competence assessment method in project management that is based on participants' performance and value creation. It seeks to close an existing gap in competence assessment in higher education. The proposed method relies on information and communication technology (ICT) tools and combines Project Management…

  15. A Conditional Exposure Control Method for Multidimensional Adaptive Testing

    ERIC Educational Resources Information Center

    Finkelman, Matthew; Nering, Michael L.; Roussos, Louis A.

    2009-01-01

    In computerized adaptive testing (CAT), ensuring the security of test items is a crucial practical consideration. A common approach to reducing item theft is to define maximum item exposure rates, i.e., to limit the proportion of examinees to whom a given item can be administered. Numerous methods for controlling exposure rates have been proposed…

  16. An Adaptive Unstructured Grid Method by Grid Subdivision, Local Remeshing, and Grid Movement

    NASA Technical Reports Server (NTRS)

    Pirzadeh, Shahyar Z.

    1999-01-01

    An unstructured grid adaptation technique has been developed and successfully applied to several three dimensional inviscid flow test cases. The approach is based on a combination of grid subdivision, local remeshing, and grid movement. For solution adaptive grids, the surface triangulation is locally refined by grid subdivision, and the tetrahedral grid in the field is partially remeshed at locations of dominant flow features. A grid redistribution strategy is employed for geometric adaptation of volume grids to moving or deforming surfaces. The method is automatic and fast and is designed for modular coupling with different solvers. Several steady state test cases with different inviscid flow features were tested for grid/solution adaptation. In all cases, the dominant flow features, such as shocks and vortices, were accurately and efficiently predicted with the present approach. A new and robust method of moving tetrahedral "viscous" grids is also presented and demonstrated on a three-dimensional example.

  17. Optimal projection method determination by Logdet Divergence and perturbed von-Neumann Divergence.

    PubMed

    Jiang, Hao; Ching, Wai-Ki; Qiu, Yushan; Cheng, Xiao-Qing

    2017-12-14

    Positive semi-definiteness is a critical property for kernel methods in Support Vector Machines (SVMs), by which efficient solutions can be guaranteed through convex quadratic programming. However, many similarity functions in applications do not produce positive semi-definite kernels. We propose a projection method that constructs a projection matrix for indefinite kernels. As a generalization of the spectrum methods (the denoising method and the flipping method), the projection method shows better or comparable performance relative to the corresponding indefinite kernel methods on a number of real-world data sets. Under Bregman matrix divergence theory, a suggested optimal λ for the projection method can be found using unconstrained optimization in kernel learning. In this paper we focus on optimal λ determination, in pursuit of a precise method for determining the optimal λ within an unconstrained optimization framework. We developed a perturbed von-Neumann divergence to measure kernel relationships. We compared optimal λ determination with the Logdet divergence and the perturbed von-Neumann divergence, aiming to find a better λ for the projection method. Results on a number of real-world data sets show that the projection method with the optimal λ obtained by the Logdet divergence demonstrates near-optimal performance, and the perturbed von-Neumann divergence can help determine a relatively better optimal projection method. The projection method is easy to use for dealing with indefinite kernels, and the parameter embedded in the method can be determined through unconstrained optimization under Bregman matrix divergence theory. This may provide a new way in kernel SVMs for varied objectives.
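
    The spectrum-style treatment that the projection method generalizes can be sketched in a few lines; the snippet below shows only the basic eigenvalue-clipping projection onto the positive semi-definite cone and does not reproduce the paper's projection-matrix construction or its Bregman-divergence-based choice of λ.

        import numpy as np

        def project_to_psd(K):
            """Project a symmetric, possibly indefinite, similarity matrix onto the
            positive semi-definite cone by clipping negative eigenvalues to zero
            (the flip variant would take np.abs of the eigenvalues instead)."""
            K = 0.5 * (K + K.T)                          # enforce symmetry
            vals, vecs = np.linalg.eigh(K)
            return (vecs * np.clip(vals, 0.0, None)) @ vecs.T

        # a small indefinite similarity matrix and its PSD projection
        K = np.array([[1.0, 0.9, -0.8],
                      [0.9, 1.0,  0.3],
                      [-0.8, 0.3, 1.0]])
        print(np.linalg.eigvalsh(project_to_psd(K)))     # all eigenvalues >= 0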

  18. Searching for signposts: Adaptive planning thresholds in long-term water supply projections for the Western U.S.

    NASA Astrophysics Data System (ADS)

    Robinson, B.; Herman, J. D.

    2017-12-01

    Long-term water supply planning is challenged by highly uncertain streamflow projections across climate models and emissions scenarios. Recent studies have devised infrastructure and policy responses that can withstand or adapt to an ensemble of scenarios, particularly those outside the envelope of historical variability. An important aspect of this process is whether the proposed thresholds for adaptation (i.e., observations that trigger a response) truly represent a trend toward future change. Here we propose an approach to connect observations of annual mean streamflow with long-term projections by filtering GCM-based streamflow ensembles. Visualizations are developed to investigate whether observed changes in mean annual streamflow can be linked to projected changes in end-of-century mean and variance relative to the full ensemble. A key focus is identifying thresholds that point to significant long-term changes in the distribution of streamflow (+/- 20% or greater) as early as possible. The analysis is performed on 87 sites in the Western United States, using streamflow ensembles through 2100 from a recent study by the U.S. Bureau of Reclamation. Results focus on three primary questions: (1) how many years of observed data are needed to identify the most extreme scenarios, and by what year can they be identified? (2) are these features different between sites? and (3) using this analysis, do observed flows to date at each site point to significant long-term changes? This study addresses the challenge of severe uncertainty in long-term streamflow projections by identifying key thresholds that can be observed to support water supply planning.
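
    A toy version of the filtering step is shown below: it keeps only the GCM-driven streamflow traces whose early-period mean annual flow is consistent with observations. The chosen statistic, the 10% tolerance, and the array layout are assumptions for illustration only, not the study's actual criteria.

        import numpy as np

        def consistent_members(obs_annual_flow, ensemble_annual_flow, tol=0.10):
            """Keep ensemble traces whose mean over the observed years stays within
            a fractional tolerance of the observed mean annual flow."""
            obs = np.asarray(obs_annual_flow, dtype=float)
            ens = np.asarray(ensemble_annual_flow, dtype=float)   # shape: (members, years)
            ens_means = ens[:, :len(obs)].mean(axis=1)
            keep = np.abs(ens_means - obs.mean()) <= tol * obs.mean()
            return ens[keep]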

  19. Teacher Resource Guide, Project ECO.

    ERIC Educational Resources Information Center

    Ames Public Schools, IA.

    More than 100 outdoor education and field science projects are compiled in this teacher's resource book. Designed for use in grades K-9, the activities cover the areas of field taxonomy, laboratory taxonomy, autecology, synecology, adaptation, economic biology, conservation, museum methods, culturing, zoo keeping, gardening, and woodcraft. Each…

  20. Quantitative adaptation analytics for assessing dynamic systems of systems: LDRD Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gauthier, John H.; Miner, Nadine E.; Wilson, Michael L.

    2015-01-01

    Our society is increasingly reliant on systems and interoperating collections of systems, known as systems of systems (SoS). These SoS are often subject to changing missions (e.g., nation-building, arms-control treaties), threats (e.g., asymmetric warfare, terrorism), natural environments (e.g., climate, weather, natural disasters) and budgets. How well can SoS adapt to these types of dynamic conditions? This report details the results of a three year Laboratory Directed Research and Development (LDRD) project aimed at developing metrics and methodologies for quantifying the adaptability of systems and SoS. Work products include: derivation of a set of adaptability metrics, a method for combining the metrics into a system of systems adaptability index (SoSAI) used to compare adaptability of SoS designs, development of a prototype dynamic SoS (proto-dSoS) simulation environment which provides the ability to investigate the validity of the adaptability metric set, and two test cases that evaluate the usefulness of a subset of the adaptability metrics and SoSAI for distinguishing good from poor adaptability in a SoS. Intellectual property results include three patents pending: A Method For Quantifying Relative System Adaptability, Method for Evaluating System Performance, and A Method for Determining Systems Re-Tasking.

  1. Systematic, Multimethod Assessment of Adaptations Across Four Diverse Health Systems Interventions.

    PubMed

    Rabin, Borsika A; McCreight, Marina; Battaglia, Catherine; Ayele, Roman; Burke, Robert E; Hess, Paul L; Frank, Joseph W; Glasgow, Russell E

    2018-01-01

    Many health outcomes and implementation science studies have demonstrated the importance of tailoring evidence-based care interventions to local context to improve fit. By adapting to local culture, history, resources, characteristics, and priorities, interventions are more likely to lead to improved outcomes. However, it is unclear how best to adapt evidence-based programs and promising innovations. There are few guides or examples of how to best categorize or assess health-care adaptations, and even fewer that are brief and practical for use by non-researchers. This study describes the importance and potential of assessing adaptations before, during, and after the implementation of health systems interventions. We present a promising multilevel and multimethod approach developed and being applied across four different health systems interventions. Finally, we discuss implications and opportunities for future research. The four case studies are diverse in the conditions addressed, interventions, and implementation strategies. They include two nurse coordinator-based transition of care interventions, a data and training-driven multimodal pain management project, and a cardiovascular patient-reported outcomes project, all of which are using audit and feedback. We used the same modified adaptation framework to document changes made to the interventions and implementation strategies. To create the modified framework, we started with the adaptation and modification model developed by Stirman and colleagues and expanded it by adding concepts from the RE-AIM framework. Our assessments address the intuitive domains of Who, How, When, What, and Why to classify and organize adaptations. For each case study, we discuss how the modified framework was operationalized, the multiple methods used to collect data, results to date and approaches utilized for data analysis. These methods include a real-time tracking system and structured interviews at key times during the

  2. Adaptive grid methods for RLV environment assessment and nozzle analysis

    NASA Technical Reports Server (NTRS)

    Thornburg, Hugh J.

    1996-01-01

    Rapid access to highly accurate data about complex configurations is needed for multi-disciplinary optimization and design. In order to efficiently meet these requirements a closer coupling between the analysis algorithms and the discretization process is needed. In some cases, such as free surface, temporally varying geometries, and fluid structure interaction, the need is unavoidable. In other cases the need is to rapidly generate and modify high quality grids. Techniques such as unstructured and/or solution-adaptive methods can be used to speed the grid generation process and to automatically cluster mesh points in regions of interest. Global features of the flow can be significantly affected by isolated regions of inadequately resolved flow. These regions may not exhibit high gradients and can be difficult to detect. Thus excessive resolution in certain regions does not necessarily increase the accuracy of the overall solution. Several approaches have been employed for both structured and unstructured grid adaption. The most widely used involve grid point redistribution, local grid point enrichment/derefinement or local modification of the actual flow solver. However, the success of any one of these methods ultimately depends on the feature detection algorithm used to determine solution domain regions which require a fine mesh for their accurate representation. Typically, weight functions are constructed to mimic the local truncation error and may require substantial user input. Most problems of engineering interest involve multi-block grids and widely disparate length scales. Hence, it is desirable that the adaptive grid feature detection algorithm be developed to recognize flow structures of different type as well as differing intensity, and adequately address scaling and normalization across blocks. These weight functions can then be used to construct blending functions for algebraic redistribution, interpolation functions for unstructured grid generation

  3. Limits to health adaptation in a changing climate

    NASA Astrophysics Data System (ADS)

    Ebi, K. L.

    2015-12-01

    Introduction: Because the health risks of climate variability and change are not new, it has been assumed that health systems have the capacity, experience, and tools to effectively adapt to changing burdens of climate-sensitive health outcomes with additional climate change. However, as illustrated by the Ebola crisis, health systems in many low-income countries have insufficient capacity to manage current health burdens. These countries are also those most vulnerable to climate change, including changes in food and water safety and security, increases in extreme weather and climate events, and increases in the geographic range, incidence, and seasonality of a variety of infectious diseases. The extent to which they might be able to keep pace with projected risks depends on assumptions about the sustainability of development pathways. At the same time, the magnitude and pattern of climate change will depend on greenhouse gas emission pathways. Methods: Review of the success of health adaptation projects and expert judgment assessment of the degree to which adaptation efforts will be able to keep pace with projected changes in climate variability and change. Results: Health adaptation can reduce the current and projected burdens of climate-sensitive health outcomes over the short term in many countries, but the extent to which it could do so past mid-century will depend on emission and development pathways. Under high emission scenarios, climate change will be rapid and extensive, leading to fundamental shifts in the burden of climate-sensitive health outcomes that will be challenging for many countries to manage. Sustainable development pathways could delay but not eliminate associated health burdens. Conclusions: To prepare for and cope with the Anthropocene, health systems need additional adaptation policies and measures to develop more robust health systems, and need to advocate for rapid and significant reductions in greenhouse gas emissions.

  4. Asynchronous multilevel adaptive methods for solving partial differential equations on multiprocessors - Performance results

    NASA Technical Reports Server (NTRS)

    Mccormick, S.; Quinlan, D.

    1989-01-01

    The fast adaptive composite grid method (FAC) is an algorithm that uses various levels of uniform grids (global and local) to provide adaptive resolution and fast solution of PDEs. Like all such methods, it offers parallelism by using possibly many disconnected patches per level, but is hindered by the need to handle these levels sequentially. The finest levels must therefore wait for processing to be essentially completed on all the coarser ones. A recently developed asynchronous version of FAC, called AFAC, completely eliminates this bottleneck to parallelism. This paper describes timing results for AFAC, coupled with a simple load balancing scheme, applied to the solution of elliptic PDEs on an Intel iPSC hypercube. These tests include performance of certain processes necessary in adaptive methods, including moving grids and changing refinement. A companion paper reports on numerical and analytical results for estimating convergence factors of AFAC applied to very large scale examples.

  5. Fast alternating projection methods for constrained tomographic reconstruction

    PubMed Central

    Liu, Li; Han, Yongxin

    2017-01-01

    The alternating projection algorithms are easy to implement and effective for large-scale complex optimization problems, such as constrained reconstruction in X-ray computed tomography (CT). A typical method uses projection onto convex sets (POCS) for data fidelity and nonnegativity constraints, combined with total variation (TV) minimization (so-called TV-POCS), for sparse-view CT reconstruction. However, this type of method relies on empirically selected parameters for satisfactory reconstruction, is generally slow, and lacks convergence analysis. In this work, we use a convex feasibility set approach to address the problems associated with TV-POCS and propose a framework using full sequential alternating projections, or POCS (FS-POCS), to find the solution in the intersection of the convex constraints of a bounded TV function, bounded data fidelity error and non-negativity. The rationale behind FS-POCS is that the mathematically optimal solution of a constrained objective function may not be the physically optimal solution. Breaking constrained reconstruction down into an intersection of several feasible sets can lead to faster convergence and better quantification of reconstruction parameters in a physically meaningful way, rather than through empirical trial and error. In addition, for large-scale optimization problems, first-order methods are usually used. Not only is the condition for convergence of gradient-based methods derived, but a primal-dual hybrid gradient (PDHG) method is also used for fast convergence of the bounded TV constraint. The newly proposed FS-POCS is evaluated and compared with TV-POCS and another convex feasibility projection method (CPTV) using both digital phantom and pseudo-real CT data, showing superior performance in reconstruction speed, image quality and quantification. PMID:28253298
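
    To make the alternating-projection idea concrete, here is a minimal sketch that projects sequentially onto data-fidelity hyperplanes and the non-negativity set. It deliberately leaves out the bounded-TV constraint and the PDHG acceleration that FS-POCS adds, so it illustrates POCS in general rather than FS-POCS itself.

        import numpy as np

        def pocs_reconstruct(A, b, n_iters=50):
            """Toy POCS reconstruction: cyclically project onto the hyperplane of
            each measurement row (a Kaczmarz sweep for data fidelity) and then
            onto the non-negativity set."""
            x = np.zeros(A.shape[1])
            row_norms = np.einsum('ij,ij->i', A, A)
            for _ in range(n_iters):
                for i in range(A.shape[0]):
                    if row_norms[i] > 0:
                        # projection of x onto {z : A[i] @ z = b[i]}
                        x += (b[i] - A[i] @ x) / row_norms[i] * A[i]
                np.clip(x, 0.0, None, out=x)   # projection onto {z : z >= 0}
            return x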

  6. Adaptive MPC based on MIMO ARX-Laguerre model.

    PubMed

    Ben Abdelwahed, Imen; Mbarek, Abdelkader; Bouzrara, Kais

    2017-03-01

    This paper proposes a method for synthesizing an adaptive predictive controller using a reduced-complexity model. The latter is given by the projection of the ARX model onto Laguerre bases. The resulting model, termed MIMO ARX-Laguerre, is characterized by an easy recursive representation. The adaptive predictive control law is computed based on multi-step-ahead finite-element predictors, identified directly from experimental input/output data. The model is tuned at each iteration by online identification algorithms for both the model parameters and the Laguerre poles. The proposed approach avoids the time-consuming numerical optimization algorithms associated with most common linear predictive control strategies, which makes it suitable for real-time implementation. The method is used to synthesize and test, in numerical simulations, adaptive predictive controllers for the CSTR process benchmark. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  7. Methods for assessing wall interference in the 2- by 2-foot adaptive-wall wind tunnel

    NASA Technical Reports Server (NTRS)

    Schairer, E. T.

    1986-01-01

    Discussed are two methods for assessing two-dimensional wall interference in the adaptive-wall test section of the NASA Ames 2 x 2-Foot Transonic Wind Tunnel: (1) a method for predicting free-air conditions near the walls of the test section (the adaptive-wall method); and (2) a method for estimating wall-induced velocities near the model (the correction method). Both methods are based on measurements of either one or two components of flow velocity near the walls of the test section. Each method is demonstrated using simulated wind tunnel data and is compared with other methods of the same type. The two-component adaptive-wall and correction methods were found to be preferable to the corresponding one-component methods because: (1) they are more sensitive to, and give a more complete description of, wall interference; (2) they require measurements at fewer locations; (3) they can be used to establish free-stream conditions; and (4) they are independent of a description of the model and constants of integration.

  8. A quality improvement project aimed at adapting primary care to ensure the delivery of evidence-based psychotherapy for adult anxiety.

    PubMed

    Williams, Mark D; Sawchuk, Craig N; Shippee, Nathan D; Somers, Kristin J; Berg, Summer L; Mitchell, Jay D; Mattson, Angela B; Katzelnick, David J

    2018-01-01

    Primary care patients frequently present with anxiety, with prevalence up to 30%. Brief cognitive-behavioural therapy (CBT) has been shown in meta-analytic studies to have a strong effect size in the treatment of anxiety. However, in surveys of anxious primary care patients, nearly 80% indicated that they had not received CBT. In 2010, a model of CBT (Coordinated Anxiety Learning and Management (CALM)) adapted to primary care for adult anxiety was published based on the results of a randomised controlled trial. This project aimed to integrate an adaptation of CALM into one primary care practice, using results from the published research as a benchmark, with the secondary intent of spreading a successful model to other practices. A quality improvement approach was used to translate the CALM model of CBT for anxiety into one primary care clinic. Plan-Do-Study-Act steps are highlighted as important steps towards our goal of comparing our outcomes with benchmarks from the original research. Patients with anxiety, as measured by a score of 10 or higher on the Generalized Anxiety Disorder 7-item scale (GAD-7), were offered CBT delivered by licensed social workers with support from a PhD psychologist. Outcomes were tracked and entered into an electronic registry, which became a critical tool upon which to adapt and improve our delivery of psychotherapy to our patient population. Challenges and adaptations to the model are discussed. Our 6-month response rate on the GAD-7 was 51%, which was comparable with that of the original research (57%). Quality improvement methods were critical in discovering which adaptations were needed before spread. Among these, embedding a process of measurement and data entry, and providing ongoing feedback to patients and therapists using these data, are critical steps towards sustaining and improving the delivery of CBT in primary care.

  9. Adaptive model training system and method

    DOEpatents

    Bickford, Randall L; Palnitkar, Rahul M; Lee, Vo

    2014-04-15

    An adaptive model training system and method for filtering asset operating data values acquired from a monitored asset for selectively choosing asset operating data values that meet at least one predefined criterion of good data quality while rejecting asset operating data values that fail to meet at least the one predefined criterion of good data quality; and recalibrating a previously trained or calibrated model having a learned scope of normal operation of the asset by utilizing the asset operating data values that meet at least the one predefined criterion of good data quality for adjusting the learned scope of normal operation of the asset for defining a recalibrated model having the adjusted learned scope of normal operation of the asset.

  10. Adaptive model training system and method

    DOEpatents

    Bickford, Randall L; Palnitkar, Rahul M

    2014-11-18

    An adaptive model training system and method for filtering asset operating data values acquired from a monitored asset for selectively choosing asset operating data values that meet at least one predefined criterion of good data quality while rejecting asset operating data values that fail to meet at least the one predefined criterion of good data quality; and recalibrating a previously trained or calibrated model having a learned scope of normal operation of the asset by utilizing the asset operating data values that meet at least the one predefined criterion of good data quality for adjusting the learned scope of normal operation of the asset for defining a recalibrated model having the adjusted learned scope of normal operation of the asset.

  11. A Self-Adaptive Model-Based Wi-Fi Indoor Localization Method.

    PubMed

    Tuta, Jure; Juric, Matjaz B

    2016-12-06

    This paper presents a novel method for indoor localization, developed with the main aim of making it useful for real-world deployments. Many indoor localization methods exist, yet they have several disadvantages in real-world deployments: some are static, which is not suitable for long-term usage; some require costly human recalibration procedures; and others require special hardware such as Wi-Fi anchors and transponders. Our method is self-calibrating and self-adaptive, thus maintenance-free, and is based on Wi-Fi only. We have employed two well-known propagation models, free space path loss and the ITU model, which we have extended with additional parameters for better propagation simulation. Our self-calibrating procedure utilizes one propagation model to infer parameters of the space and the other to simulate the propagation of the signal, without requiring any additional hardware besides Wi-Fi access points, which is suitable for real-world usage. Our method is also one of the few model-based, Wi-Fi-only, self-adaptive approaches that do not require the mobile terminal to be in access-point mode. The only input requirements of the method are the Wi-Fi access point positions and the positions and properties of the walls. Our method has been evaluated in single- and multi-room environments, with measured mean errors of 2-3 and 3-4 m, respectively, which is similar to existing methods. The evaluation has proven that usable localization accuracy can be achieved in real-world environments solely by the proposed Wi-Fi method, which relies on simple hardware and software requirements.
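
    A bare-bones version of the model-based idea, assuming only the free-space path loss model and a brute-force position search, is sketched below. The ITU model, wall attenuation parameters, and the self-calibration procedure from the paper are not reproduced, and all parameter values (transmit power, frequency, search area) are illustrative.

        import numpy as np

        def fspl_rssi(tx_power_dbm, freq_hz, dist_m):
            """RSSI predicted by free-space path loss:
            FSPL(dB) = 20*log10(d) + 20*log10(f) - 147.55, with d in metres, f in Hz."""
            d = np.maximum(dist_m, 0.1)        # avoid log(0) right at an access point
            return tx_power_dbm - (20 * np.log10(d) + 20 * np.log10(freq_hz) - 147.55)

        def locate(ap_xy, measured_rssi, tx_power_dbm=20.0, freq_hz=2.4e9,
                   area=(0.0, 10.0, 0.0, 10.0), step=0.25):
            """Grid search for the position whose predicted RSSI best matches the
            measurements at every access point (least squares)."""
            best, best_err = None, np.inf
            for x in np.arange(area[0], area[1], step):
                for y in np.arange(area[2], area[3], step):
                    d = np.hypot(ap_xy[:, 0] - x, ap_xy[:, 1] - y)
                    err = np.sum((fspl_rssi(tx_power_dbm, freq_hz, d) - measured_rssi) ** 2)
                    if err < best_err:
                        best, best_err = (x, y), err
            return best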

  12. A Self-Adaptive Model-Based Wi-Fi Indoor Localization Method

    PubMed Central

    Tuta, Jure; Juric, Matjaz B.

    2016-01-01

    This paper presents a novel method for indoor localization, developed with the main aim of making it useful for real-world deployments. Many indoor localization methods exist, yet they have several disadvantages in real-world deployments—some are static, which is not suitable for long-term usage; some require costly human recalibration procedures; and others require special hardware such as Wi-Fi anchors and transponders. Our method is self-calibrating and self-adaptive, thus maintenance-free, and is based on Wi-Fi only. We have employed two well-known propagation models—free space path loss and ITU models—which we have extended with additional parameters for better propagation simulation. Our self-calibrating procedure utilizes one propagation model to infer parameters of the space and the other to simulate the propagation of the signal without requiring any additional hardware besides Wi-Fi access points, which is suitable for real-world usage. Our method is also one of the few model-based Wi-Fi only self-adaptive approaches that do not require the mobile terminal to be in the access-point mode. The only input requirements of the method are Wi-Fi access point positions, and positions and properties of the walls. Our method has been evaluated in single- and multi-room environments, with measured mean errors of 2–3 and 3–4 m, respectively, which is similar to existing methods. The evaluation has proven that usable localization accuracy can be achieved in real-world environments solely by the proposed Wi-Fi method that relies on simple hardware and software requirements. PMID:27929453

  13. Using Replication Projects in Teaching Research Methods

    ERIC Educational Resources Information Center

    Standing, Lionel G.; Grenier, Manuel; Lane, Erica A.; Roberts, Meigan S.; Sykes, Sarah J.

    2014-01-01

    It is suggested that replication projects may be valuable in teaching research methods, and also address the current need in psychology for more independent verification of published studies. Their use in an undergraduate methods course is described, involving student teams who performed direct replications of four well-known experiments, yielding…

  14. A one-shot-projection method for measurement of specular surfaces.

    PubMed

    Wang, Zhenzhou

    2015-02-09

    In this paper, a method is proposed to measure the shapes of specular surfaces with a one-shot projection of structured laser patterns. By intercepting the reflected laser pattern twice with two diffusive planes, a closed-form solution is obtained for each reflected ray. The points on the specular surface are reconstructed by computing the intersections of the incident rays and the reflected rays. The proposed method can measure both static and dynamic specular shapes thanks to its one-shot projection, which is beyond the capability of most state-of-the-art methods, which need multiple projections. To our knowledge, the proposed method is the only method so far that yields closed-form solutions for dynamic and specular surfaces.
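
    The reconstruction step in which incident and reflected rays are intersected can be illustrated with standard line-line geometry; the sketch below (the function name and the midpoint choice are our own) finds the closest-approach point of two 3-D rays, which stands in for the paper's closed-form solution.

        import numpy as np

        def ray_midpoint(p1, d1, p2, d2, eps=1e-12):
            """Given an incident ray p1 + t*d1 and a reconstructed reflected ray
            p2 + s*d2, return the midpoint of the shortest segment between them;
            with noise the rays only nearly intersect, so the midpoint is a
            natural estimate of the surface point."""
            d1 = d1 / np.linalg.norm(d1)
            d2 = d2 / np.linalg.norm(d2)
            w0 = p1 - p2
            a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
            d, e = d1 @ w0, d2 @ w0
            denom = a * c - b * b
            if abs(denom) < eps:               # parallel rays: no unique point
                return None
            t = (b * e - c * d) / denom
            s = (a * e - b * d) / denom
            return 0.5 * ((p1 + t * d1) + (p2 + s * d2))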

  15. An Evidence-Based Public Health Approach to Climate Change Adaptation

    PubMed Central

    Eidson, Millicent; Tlumak, Jennifer E.; Raab, Kristin K.; Luber, George

    2014-01-01

    Background: Public health is committed to evidence-based practice, yet there has been minimal discussion of how to apply an evidence-based practice framework to climate change adaptation. Objectives: Our goal was to review the literature on evidence-based public health (EBPH), to determine whether it can be applied to climate change adaptation, and to consider how emphasizing evidence-based practice may influence research and practice decisions related to public health adaptation to climate change. Methods: We conducted a substantive review of EBPH, identified a consensus EBPH framework, and modified it to support an EBPH approach to climate change adaptation. We applied the framework to an example and considered implications for stakeholders. Discussion: A modified EBPH framework can accommodate the wide range of exposures, outcomes, and modes of inquiry associated with climate change adaptation and the variety of settings in which adaptation activities will be pursued. Several factors currently limit application of the framework, including a lack of higher-level evidence of intervention efficacy and a lack of guidelines for reporting climate change health impact projections. To enhance the evidence base, there must be increased attention to designing, evaluating, and reporting adaptation interventions; standardized health impact projection reporting; and increased attention to knowledge translation. This approach has implications for funders, researchers, journal editors, practitioners, and policy makers. Conclusions: The current approach to EBPH can, with modifications, support climate change adaptation activities, but there is little evidence regarding interventions and knowledge translation, and guidelines for projecting health impacts are lacking. Realizing the goal of an evidence-based approach will require systematic, coordinated efforts among various stakeholders. Citation: Hess JJ, Eidson M, Tlumak JE, Raab KK, Luber G. 2014. An evidence-based public

  16. Adaptive control method for core power control in TRIGA Mark II reactor

    NASA Astrophysics Data System (ADS)

    Sabri Minhat, Mohd; Selamat, Hazlina; Subha, Nurul Adilla Mohd

    2018-01-01

    The 1 MWth Reactor TRIGA PUSPATI (RTP), a Mark II type, has undergone more than 35 years of operation. The existing core power control uses a feedback control algorithm (FCA). It is challenging to keep the core power stable at the desired value within acceptable error bands to meet the safety demands of the RTP, due to the sensitivity of nuclear research reactor operation. Currently, the system's power tracking performance is not satisfactory and can be improved. Therefore, a new core power control design is important to improve tracking performance and to regulate reactor power by controlling the movement of the control rods. In this paper, adaptive controllers, specifically Model Reference Adaptive Control (MRAC) and Self-Tuning Control (STC), were applied to core power control. The model for core power control was based on mathematical models of the reactor core, the adaptive controller model, and control rod selection programming. The mathematical models of the reactor core were based on the point kinetics model, thermal hydraulic models, and reactivity models. The MRAC law was derived using the Lyapunov method to ensure a stable closed-loop system, while the STC Generalised Minimum Variance (GMV) controller does not require exact knowledge of the plant transfer function when designing the core power control. The performance of the proposed adaptive control and the FCA is compared via computer simulation, and the simulation results demonstrate the effectiveness and good performance of the proposed control method for core power control.
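
    Since the paper's MRAC design is tied to the reactor's point-kinetics model, which is not reproduced here, the sketch below illustrates only the general MRAC idea with an MIT-rule gain update on a generic first-order plant; every model, gain, and parameter value is an assumption for illustration and is not the RTP controller.

        import numpy as np

        def mrac_gain_adaptation(ref, a=1.0, b=0.5, am=2.0, gamma=0.5, dt=0.01):
            """Generic MIT-rule MRAC demo on a first-order plant dy/dt = -a*y + b*u,
            tracking the reference model dym/dt = -am*ym + am*r via u = theta*r."""
            y = ym = theta = 0.0
            hist = []
            for r in ref:
                u = theta * r                    # adjustable feedforward controller
                y += dt * (-a * y + b * u)       # plant
                ym += dt * (-am * ym + am * r)   # reference model
                e = y - ym
                theta += dt * (-gamma * e * ym)  # MIT rule (gradient approximation)
                hist.append((y, ym, theta))
            return np.array(hist)

        trace = mrac_gain_adaptation(np.ones(3000))  # step reference; y approaches ym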

  17. A NDVI assisted remote sensing image adaptive scale segmentation method

    NASA Astrophysics Data System (ADS)

    Zhang, Hong; Shen, Jinxiang; Ma, Yanmei

    2018-03-01

    Multiscale segmentation of images can effectively form boundaries for objects of different scales. However, for remote sensing images with wide coverage and complicated ground objects, the number of suitable segmentation scales and the size of each scale are still difficult to determine accurately, which severely restricts rapid information extraction from the imagery. Many experiments have shown that the normalized difference vegetation index (NDVI) can effectively express the spectral characteristics of a variety of ground objects in remote sensing images. This paper presents an NDVI-assisted adaptive segmentation method for remote sensing images, which segments local areas by using an NDVI similarity threshold to iteratively select segmentation scales. For different regions consisting of different targets, different segmentation scale boundaries can be created. The experimental results showed that the NDVI-based adaptive segmentation method can effectively create object boundaries for different ground objects in remote sensing images.
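
    The NDVI side of the method is easy to show concretely. The sketch below computes NDVI per pixel and flags pairs of labelled regions whose mean NDVI falls within a similarity threshold; the threshold value and this pairwise test are a simplified stand-in for the paper's iterative, threshold-driven scale selection.

        import numpy as np

        def ndvi(nir, red, eps=1e-6):
            """NDVI = (NIR - Red) / (NIR + Red), computed per pixel."""
            nir = nir.astype(np.float64)
            red = red.astype(np.float64)
            return (nir - red) / (nir + red + eps)

        def similar_regions(ndvi_map, labels, threshold=0.05):
            """Return pairs of labelled regions whose mean NDVI differs by less than
            `threshold`; such regions are candidates for sharing one segmentation scale."""
            ids = np.unique(labels)
            means = {r: float(ndvi_map[labels == r].mean()) for r in ids}
            return [(r1, r2)
                    for i, r1 in enumerate(ids)
                    for r2 in ids[i + 1:]
                    if abs(means[r1] - means[r2]) < threshold]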

  18. Adaptive methods for nonlinear structural dynamics and crashworthiness analysis

    NASA Technical Reports Server (NTRS)

    Belytschko, Ted

    1993-01-01

    The objective is to describe three research thrusts in crashworthiness analysis: adaptivity; mixed time integration, or subcycling, in which different timesteps are used for different parts of the mesh in explicit methods; and methods for contact-impact which are highly vectorizable. The techniques are being developed to improve the accuracy of calculations, ease-of-use of crashworthiness programs, and the speed of calculations. The latter is still of importance because crashworthiness calculations are often made with models of 20,000 to 50,000 elements using explicit time integration and require on the order of 20 to 100 hours on current supercomputers. The methodologies are briefly reviewed and then some example calculations employing these methods are described. The methods are also of value to other nonlinear transient computations.

  19. MONITORING METHODS ADAPTABLE TO VAPOR INTRUSION MONITORING - USEPA COMPENDIUM METHODS TO-15, TO-15 SUPPLEMENT (DRAFT), AND TO-17

    EPA Science Inventory

    USEPA ambient air monitoring methods for volatile organic compounds (VOCs) using specially-prepared canisters and solid adsorbents are directly adaptable to monitoring for vapors in the indoor environment. The draft Method TO-15 Supplement, an extension of the USEPA Method TO-15,...

  20. The adaptive problems of female teenage refugees and their behavioral adjustment methods for coping

    PubMed Central

    Mhaidat, Fatin

    2016-01-01

    This study aimed at identifying the levels of adaptive problems among teenage female refugees in government schools and explored the behavioral methods that were used to cope with the problems. The sample was composed of 220 Syrian female students (seventh to first secondary grades) enrolled at government schools within the Zarqa Directorate who came to Jordan due to the war conditions in their home country. The study used a scale of adaptive problems that consists of four dimensions (depression, anger and hostility, low self-esteem, and feeling insecure) and a questionnaire on the behavioral adjustment methods for dealing with the problems of asylum. The results indicated that the Syrian teenage female refugees suffer a moderate degree of adaptation problems and that they used positive adjustment methods more than negative ones. PMID:27175098

  1. Sparse-view photoacoustic tomography using virtual parallel-projections and spatially adaptive filtering

    NASA Astrophysics Data System (ADS)

    Wang, Yihan; Lu, Tong; Wan, Wenbo; Liu, Lingling; Zhang, Songhe; Li, Jiao; Zhao, Huijuan; Gao, Feng

    2018-02-01

    To fully realize the potential of photoacoustic tomography (PAT) in preclinical and clinical applications, rapid measurements and robust reconstructions are needed. Sparse-view measurements have been adopted effectively to accelerate data acquisition. However, since reconstruction from sparse-view sampling data is challenging, both effective measurement and appropriate reconstruction must be taken into account. In this study, we present an iterative sparse-view PAT reconstruction scheme in which a virtual parallel-projection concept, matched to the proposed measurement condition, is introduced to help achieve the "compressive sensing" part of the reconstruction, while spatially adaptive filtering, which fully exploits the a priori information of mutually similar blocks in natural images, is introduced to effectively recover the partially unknown coefficients in the transformed domain. As a result, sparse-view PAT images can be reconstructed with higher quality compared with the results obtained by the universal back-projection (UBP) algorithm in the same sparse-view cases. The proposed approach has been validated by simulation experiments and exhibits desirable performance in image fidelity even from a small number of measuring positions.

  2. The block adaptive multigrid method applied to the solution of the Euler equations

    NASA Technical Reports Server (NTRS)

    Pantelelis, Nikos

    1993-01-01

    In the present study, a scheme capable of solving complex nonlinear systems of equations very quickly and robustly is presented. The Block Adaptive Multigrid (BAM) solution method offers multigrid acceleration and adaptive grid refinement based on the prediction of the solution error. The proposed solution method was used with an implicit upwind Euler solver for the solution of complex transonic flows around airfoils. Very fast results were obtained (an 18-fold acceleration of the solution) using one fourth of the volumes of a global grid, with the same solution accuracy, for two test cases.

  3. Project Salud: Using community-based participatory research to culturally adapt an HIV prevention intervention in the Latino migrant worker community.

    PubMed

    Sánchez, Jesús; Serna, Claudia A; de La Rosa, Mario

    2012-01-01

    Despite the unique and challenging circumstances confronting Latino migrant worker communities in the U.S., debate still exists as to the need to culturally adapt evidence-based interventions for dissemination with this population. Project Salud adopted a community-based participatory research model and utilized focus group methodology with 83 Latino migrant workers to explore the relevance of culturally adapting an evidence-based HIV prevention intervention to be disseminated within this population. Findings from this study indicate that, despite early reservations, Latino migrant workers wanted to participate in the cultural adaptation that would result in an intervention that was culturally relevant, respectful, responsive to their life experiences, and aligned with their needs. This study contributes to the cultural adaptation/fidelity debate by highlighting the necessity of exploring ways to develop culturally adapted interventions characterized by high cultural relevance without sacrificing high fidelity to the core components that have established efficacy for evidence-based HIV prevention interventions.

  4. Efficient and effective implementation of alternative project delivery methods.

    DOT National Transportation Integrated Search

    2017-05-01

    Over the past decade, the Maryland Department of Transportation State Highway : Administration (MDOT SHA) has implemented Alternative Project Delivery (APD) methods : in a number of transportation projects. While these innovative practices have produ...

  5. Lesion insertion in the projection domain: Methods and initial results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Baiyu; Leng, Shuai; Yu, Lifeng

    2015-12-15

    Purpose: To perform task-based image quality assessment in CT, it is desirable to have a large number of realistic patient images with known diagnostic truth. One effective way of achieving this objective is to create hybrid images that combine patient images with inserted lesions. Because conventional hybrid images generated in the image domain fails to reflect the impact of scan and reconstruction parameters on lesion appearance, this study explored a projection-domain approach. Methods: Lesions were segmented from patient images and forward projected to acquire lesion projections. The forward-projection geometry was designed according to a commercial CT scanner and accommodated both axial and helical modes with various focal spot movement patterns. The energy employed by the commercial CT scanner for beam hardening correction was measured and used for the forward projection. The lesion projections were inserted into patient projections decoded from commercial CT projection data. The combined projections were formatted to match those of commercial CT raw data, loaded onto a commercial CT scanner, and reconstructed to create the hybrid images. Two validations were performed. First, to validate the accuracy of the forward-projection geometry, images were reconstructed from the forward projections of a virtual ACR phantom and compared to physically acquired ACR phantom images in terms of CT number accuracy and high-contrast resolution. Second, to validate the realism of the lesion in hybrid images, liver lesions were segmented from patient images and inserted back into the same patients, each at a new location specified by a radiologist. The inserted lesions were compared to the original lesions and visually assessed for realism by two experienced radiologists in a blinded fashion. Results: For the validation of the forward-projection geometry, the images reconstructed from the forward projections of the virtual ACR phantom were consistent with the images

  6. Lesion insertion in the projection domain: Methods and initial results

    PubMed Central

    Chen, Baiyu; Leng, Shuai; Yu, Lifeng; Yu, Zhicong; Ma, Chi; McCollough, Cynthia

    2015-01-01

    Purpose: To perform task-based image quality assessment in CT, it is desirable to have a large number of realistic patient images with known diagnostic truth. One effective way of achieving this objective is to create hybrid images that combine patient images with inserted lesions. Because conventional hybrid images generated in the image domain fails to reflect the impact of scan and reconstruction parameters on lesion appearance, this study explored a projection-domain approach. Methods: Lesions were segmented from patient images and forward projected to acquire lesion projections. The forward-projection geometry was designed according to a commercial CT scanner and accommodated both axial and helical modes with various focal spot movement patterns. The energy employed by the commercial CT scanner for beam hardening correction was measured and used for the forward projection. The lesion projections were inserted into patient projections decoded from commercial CT projection data. The combined projections were formatted to match those of commercial CT raw data, loaded onto a commercial CT scanner, and reconstructed to create the hybrid images. Two validations were performed. First, to validate the accuracy of the forward-projection geometry, images were reconstructed from the forward projections of a virtual ACR phantom and compared to physically acquired ACR phantom images in terms of CT number accuracy and high-contrast resolution. Second, to validate the realism of the lesion in hybrid images, liver lesions were segmented from patient images and inserted back into the same patients, each at a new location specified by a radiologist. The inserted lesions were compared to the original lesions and visually assessed for realism by two experienced radiologists in a blinded fashion. Results: For the validation of the forward-projection geometry, the images reconstructed from the forward projections of the virtual ACR phantom were consistent with the images physically

  7. An A-T linker adapter polymerase chain reaction method for chromosome walking without restriction site cloning bias.

    PubMed

    Trinh, Quoclinh; Xu, Wentao; Shi, Hui; Luo, Yunbo; Huang, Kunlun

    2012-06-01

    A-T linker adapter polymerase chain reaction (PCR) was modified and employed for the isolation of genomic fragments adjacent to a known DNA sequence. The improvements in the method focus on two points. The first is the modification of the PO(4) and NH(2) groups in the adapter to inhibit the self-ligation of the adapter or the generation of nonspecific products. The second improvement is the use of the capacity of rTaq DNA polymerase to add an adenosine overhang at the 3' ends of digested DNA to suppress self-ligation in the digested DNA and simultaneously resolve restriction site clone bias. The combination of modifications in the adapter and in the digested DNA leads to T/A-specific ligation, which enhances the flexibility of this method and makes it feasible to use many different restriction enzymes with a single adapter. This novel A-T linker adapter PCR overcomes the inherent limitations of the original ligation-mediated PCR method such as low specificity and a lack of restriction enzyme choice. Moreover, this method also offers higher amplification efficiency, greater flexibility, and easier manipulation compared with other PCR methods for chromosome walking. Experimental results from 143 Arabidopsis mutants illustrate that this method is reliable and efficient in high-throughput experiments. Copyright © 2012 Elsevier Inc. All rights reserved.

  8. MFAM: Multiple Frequency Adaptive Model-Based Indoor Localization Method.

    PubMed

    Tuta, Jure; Juric, Matjaz B

    2018-03-24

    This paper presents MFAM (Multiple Frequency Adaptive Model-based localization method), a novel model-based indoor localization method that is capable of using multiple wireless signal frequencies simultaneously. It utilizes an indoor architectural model and the physical properties of wireless signal propagation through objects and space. The motivation for developing a multiple frequency localization method lies in the future Wi-Fi standards (e.g., 802.11ah) and the growing number of various wireless signals present in buildings (e.g., Wi-Fi, Bluetooth, ZigBee, etc.). Current indoor localization methods mostly rely on a single wireless signal type and often require many devices to achieve the necessary accuracy. MFAM utilizes multiple wireless signal types and improves the localization accuracy over the usage of a single frequency. It continuously monitors signal propagation through space and adapts the model according to the changes indoors. Using multiple signal sources lowers the required number of access points for a specific signal type while utilizing signals already present indoors. Due to the unavailability of the 802.11ah hardware, we have evaluated the proposed method with similar signals; we have used 2.4 GHz Wi-Fi and 868 MHz HomeMatic home automation signals. We have performed the evaluation in a modern two-bedroom apartment and measured a mean localization error of 2.0 to 2.3 m and a median error of 2.0 to 2.2 m. Based on our evaluation results, using two different signals improves the localization accuracy by 18% in comparison to the 2.4 GHz Wi-Fi-only approach. Additional signals would improve the accuracy even further. We have shown that MFAM provides better accuracy than competing methods, while having several advantages for real-world usage.

  9. MFAM: Multiple Frequency Adaptive Model-Based Indoor Localization Method

    PubMed Central

    Juric, Matjaz B.

    2018-01-01

    This paper presents MFAM (Multiple Frequency Adaptive Model-based localization method), a novel model-based indoor localization method that is capable of using multiple wireless signal frequencies simultaneously. It utilizes an indoor architectural model and the physical properties of wireless signal propagation through objects and space. The motivation for developing a multiple frequency localization method lies in the future Wi-Fi standards (e.g., 802.11ah) and the growing number of various wireless signals present in buildings (e.g., Wi-Fi, Bluetooth, ZigBee, etc.). Current indoor localization methods mostly rely on a single wireless signal type and often require many devices to achieve the necessary accuracy. MFAM utilizes multiple wireless signal types and improves the localization accuracy over the usage of a single frequency. It continuously monitors signal propagation through space and adapts the model according to the changes indoors. Using multiple signal sources lowers the required number of access points for a specific signal type while utilizing signals already present indoors. Due to the unavailability of the 802.11ah hardware, we have evaluated the proposed method with similar signals; we have used 2.4 GHz Wi-Fi and 868 MHz HomeMatic home automation signals. We have performed the evaluation in a modern two-bedroom apartment and measured a mean localization error of 2.0 to 2.3 m and a median error of 2.0 to 2.2 m. Based on our evaluation results, using two different signals improves the localization accuracy by 18% in comparison to the 2.4 GHz Wi-Fi-only approach. Additional signals would improve the accuracy even further. We have shown that MFAM provides better accuracy than competing methods, while having several advantages for real-world usage. PMID:29587352

  10. Urban-Climate Adaptation Tool: Optimizing Green Infrastructure

    NASA Astrophysics Data System (ADS)

    Fellows, J. D.; Bhaduri, B. L.

    2016-12-01

    Cities have an opportunity to become more resilient to future climate change, and greener, through investments made in urban infrastructure today. However, most cities lack access to the credible, high-resolution climate change projections and other environmental information needed to assess and address potential vulnerabilities from future climate variability. Therefore, we present an integrated framework for developing an urban climate adaptation tool (Urban-CAT). The initial focus of Urban-CAT is to optimize the placement of green infrastructure (e.g., green roofs, porous pavements, retention basins, etc.) to better control stormwater runoff and lower the ambient urban temperature. Urban-CAT consists of four modules. First, it provides climate projections at different spatial resolutions for quantifying the urban landscape. Second, these projected data are combined with socio-economic and other environmental data, using leading and lagging indicators, to assess landscape vulnerability to climate extremes (e.g., urban flooding). Third, a neighborhood-scale modeling approach is presented for identifying candidate areas for adaptation strategies (e.g., green infrastructure as an adaptation strategy for urban flooding). Finally, all of these capabilities are made available as a web-based tool to support decision-making and communication at the neighborhood and city levels. This presentation will highlight the methods that drive each of the modules, demo some of the capabilities using Knoxville, Tennessee, as a case study, and discuss the challenges of working with communities to incorporate climate change into their planning. The next step for Urban-CAT is to add capabilities that create a comprehensive climate adaptation tool covering energy, transportation, health, and other key urban services.

  11. Database Design Learning: A Project-Based Approach Organized through a Course Management System

    ERIC Educational Resources Information Center

    Dominguez, Cesar; Jaime, Arturo

    2010-01-01

    This paper describes an active method for database design learning through the development of practical tasks by student teams in a face-to-face course. This method integrates project-based learning with project management techniques and tools. Some scaffolding is provided at the beginning that forms a skeleton that adapts to a great variety of…

  12. Effective Teaching Methods--Project-based Learning in Physics

    ERIC Educational Resources Information Center

    Holubova, Renata

    2008-01-01

    The paper presents results of research into new, effective teaching methods in physics and science. It is found that it is necessary to educate pre-service teachers in approaches stressing the importance of students' own activity and in the competences needed to create an interdisciplinary project. Project-based physics teaching and learning…

  13. An alternative extragradient projection method for quasi-equilibrium problems.

    PubMed

    Chen, Haibin; Wang, Yiju; Xu, Yi

    2018-01-01

    An alternative extragradient projection method is designed for the quasi-equilibrium problem, in which the players' costs and their strategies both depend on their rivals' decisions. Different from the classical extragradient projection method, whose generated sequence has the contraction property with respect to the solution set, the newly designed method possesses an expansion property with respect to a given initial point. The global convergence of the method is established under the assumptions of pseudomonotonicity of the equilibrium function and continuity of the underlying multi-valued mapping. Furthermore, we show that the generated sequence converges to the point in the solution set nearest to the initial point. Numerical experiments show the efficiency of the method.
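
    The classical extragradient projection scheme that this record builds on alternates two projections per iteration: a prediction step projected onto the feasible set, followed by a correction step that re-evaluates the operator at the predicted point. The Python sketch below illustrates that classical scheme on a simple monotone variational inequality with a box constraint; the operator F, the box bounds, and the step size tau are illustrative assumptions, and the sketch does not implement the paper's specific variant (with its expansion property and multi-valued mapping).

```python
import numpy as np

def project_box(x, lo, hi):
    """Euclidean projection onto the box [lo, hi]^n."""
    return np.clip(x, lo, hi)

def extragradient(F, project, x0, tau=0.1, tol=1e-8, max_iter=5000):
    """Classical extragradient projection method for the variational
    inequality: find x* in C with <F(x*), y - x*> >= 0 for all y in C."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        y = project(x - tau * F(x))          # prediction (first projection)
        x_new = project(x - tau * F(y))      # correction (second projection)
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Illustrative monotone operator F(x) = A x + b with A positive definite.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([-1.0, -2.0])
F = lambda x: A @ x + b
sol = extragradient(F, lambda z: project_box(z, 0.0, 10.0), x0=np.zeros(2))
print("approximate VI solution:", sol)
```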

  14. Application fuzzy multi-attribute decision analysis method to prioritize project success criteria

    NASA Astrophysics Data System (ADS)

    Phong, Nguyen Thanh; Quyen, Nguyen Le Hoang Thuy To

    2017-11-01

    Project success is a foundation for a project owner to manage and control not only the current project but also future potential projects in construction companies. However, identifying the key success criteria for evaluating a particular project in real practice is a challenging task. Normally, it depends on many factors, such as the expectations of the project owner and stakeholders, the triple constraints of the project (cost, time, quality), and the company's mission, vision, and objectives. Traditional decision-making methods for measuring project success are usually based on the subjective opinions of panel experts, resulting in irrational and inappropriate decisions. Therefore, this paper introduces a multi-attribute decision analysis method (MADAM) for weighting project success criteria by using a fuzzy Analytical Hierarchy Process approach. It is found that this method is useful when dealing with imprecise and uncertain human judgments in evaluating project success criteria. Moreover, this research also suggests that although cost, time, and quality are three common project success criteria, the satisfaction of the project owner and the acceptance of project stakeholders with the completed project are the most important criteria for project success evaluation in Vietnam.

  15. An Adaptive Instability Suppression Controls Method for Aircraft Gas Turbine Engine Combustors

    NASA Technical Reports Server (NTRS)

    Kopasakis, George; DeLaat, John C.; Chang, Clarence T.

    2008-01-01

    An adaptive controls method for instability suppression in gas turbine engine combustors has been developed and successfully tested with a realistic aircraft engine combustor rig. This testing was part of a program that demonstrated, for the first time, successful active combustor instability control in an aircraft gas turbine engine-like environment. The controls method is called Adaptive Sliding Phasor Averaged Control. Testing of the control method has been conducted in an experimental rig with different configurations designed to simulate combustors with instabilities of about 530 and 315 Hz. Results demonstrate the effectiveness of this method in suppressing combustor instabilities. In addition, a dramatic improvement in suppression of the instability was achieved by focusing control on the second harmonic of the instability. This is believed to be due to a phenomenon discovered and reported earlier, the so-called Intra-Harmonic Coupling. These results may have implications for future research in combustor instability control.

  16. New methods and astrophysical applications of adaptive mesh fluid simulations

    NASA Astrophysics Data System (ADS)

    Wang, Peng

    The formation of stars, galaxies and supermassive black holes is among the most interesting unsolved problems in astrophysics. These problems are highly nonlinear and involve enormous dynamical ranges; thus numerical simulations with spatial adaptivity are crucial in understanding these processes. In this thesis, we discuss the development and application of adaptive mesh refinement (AMR) multi-physics fluid codes to simulate these nonlinear structure formation problems. To simulate the formation of star clusters, we have developed an AMR magnetohydrodynamics (MHD) code coupled with radiative cooling. We have also developed novel algorithms for sink particle creation, accretion, merging and outflows, all of which are coupled with the fluid algorithms using operator splitting. With this code, we have been able to perform the first AMR-MHD simulation of star cluster formation over several dynamical times, including sink particle and protostellar outflow feedback. The results demonstrate that protostellar outflows can drive supersonic turbulence in dense clumps and explain the observed slow and inefficient star formation. We also suggest that the global collapse rate is the most important factor in controlling the massive star accretion rate. On the topic of galaxy formation, we discuss the results of three projects. In the first project, using cosmological AMR hydrodynamics simulations, we found that isolated massive stars still form in cosmic string wakes even though the mega-parsec scale structure has been perturbed significantly by the cosmic strings. In the second project, we calculated the dynamical heating rate in galaxy formation. We found that balancing our heating rate with the atomic cooling rate gives a critical halo mass which agrees with the results of numerical simulations. This demonstrates that the effect of dynamical heating should be incorporated into semi-analytical work in the future. In the third project, using our AMR-MHD code coupled with radiative

  17. ADAPTIVE METHODS FOR STOCHASTIC DIFFERENTIAL EQUATIONS VIA NATURAL EMBEDDINGS AND REJECTION SAMPLING WITH MEMORY.

    PubMed

    Rackauckas, Christopher; Nie, Qing

    2017-01-01

    Adaptive time-stepping with high-order embedded Runge-Kutta pairs and rejection sampling provides efficient approaches for solving differential equations. While many such methods exist for solving deterministic systems, little progress has been made for stochastic variants. One challenge in developing adaptive methods for stochastic differential equations (SDEs) is the construction of embedded schemes with direct error estimates. We present a new class of embedded stochastic Runge-Kutta (SRK) methods with strong order 1.5 which have a natural embedding of strong order 1.0 methods. This allows for the derivation of an error estimate which requires no additional function evaluations. Next we derive a general method to reject the time steps without losing information about the future Brownian path termed Rejection Sampling with Memory (RSwM). This method utilizes a stack data structure to do rejection sampling, costing only a few floating point calculations. We show numerically that the methods generate statistically-correct and tolerance-controlled solutions. Lastly, we show that this form of adaptivity can be applied to systems of equations, and demonstrate that it solves a stiff biological model 12.28x faster than common fixed timestep algorithms. Our approach only requires the solution to a bridging problem and thus lends itself to natural generalizations beyond SDEs.

  18. ADAPTIVE METHODS FOR STOCHASTIC DIFFERENTIAL EQUATIONS VIA NATURAL EMBEDDINGS AND REJECTION SAMPLING WITH MEMORY

    PubMed Central

    Rackauckas, Christopher

    2017-01-01

    Adaptive time-stepping with high-order embedded Runge-Kutta pairs and rejection sampling provides efficient approaches for solving differential equations. While many such methods exist for solving deterministic systems, little progress has been made for stochastic variants. One challenge in developing adaptive methods for stochastic differential equations (SDEs) is the construction of embedded schemes with direct error estimates. We present a new class of embedded stochastic Runge-Kutta (SRK) methods with strong order 1.5 which have a natural embedding of strong order 1.0 methods. This allows for the derivation of an error estimate which requires no additional function evaluations. Next we derive a general method to reject the time steps without losing information about the future Brownian path termed Rejection Sampling with Memory (RSwM). This method utilizes a stack data structure to do rejection sampling, costing only a few floating point calculations. We show numerically that the methods generate statistically-correct and tolerance-controlled solutions. Lastly, we show that this form of adaptivity can be applied to systems of equations, and demonstrate that it solves a stiff biological model 12.28x faster than common fixed timestep algorithms. Our approach only requires the solution to a bridging problem and thus lends itself to natural generalizations beyond SDEs. PMID:29527134
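
    As a rough illustration of the adaptive time-stepping idea, the sketch below implements adaptive Euler-Maruyama with a step-doubling error estimate and a stack of Brownian-bridge pieces, so that rejected steps do not discard information about the future Brownian path. It is only a simplified stand-in for the embedded SRK pairs and the RSwM algorithm described in the abstract; the tolerance, step-size controller exponent, and test equation are arbitrary assumptions.

```python
import numpy as np

def adaptive_em(f, g, u0, t0, t1, dt0=1e-2, abstol=1e-4, dtmin=1e-7, rng=None):
    """Adaptive Euler-Maruyama for du = f(u,t) dt + g(u,t) dW.
    The local error is estimated by step doubling (one full step vs. two half
    steps). When a step is rejected, its Brownian increment is split with a
    Brownian bridge and both halves are pushed onto a stack for reuse, so no
    path information is lost (a simplified stand-in for RSwM bookkeeping)."""
    rng = np.random.default_rng() if rng is None else rng
    t, u, dt = float(t0), float(u0), float(dt0)
    stack = []                                   # queued (duration, increment) path pieces
    while t < t1 - 1e-14:
        if stack:
            h, dW = stack.pop()                  # reuse previously generated path
        else:
            h = min(dt, t1 - t)
            dW = rng.normal(0.0, np.sqrt(h))
        # Brownian bridge: split dW over [t, t+h] into two consistent halves.
        dW1 = 0.5 * dW + rng.normal(0.0, np.sqrt(h / 4.0))
        dW2 = dW - dW1
        u_full = u + f(u, t) * h + g(u, t) * dW
        u_half = u + f(u, t) * (h / 2) + g(u, t) * dW1
        u_half = u_half + f(u_half, t + h / 2) * (h / 2) + g(u_half, t + h / 2) * dW2
        err = abs(u_full - u_half)
        if err <= abstol or h <= dtmin:          # accept (keep the finer solution)
            t, u = t + h, u_half
            dt = h * min(2.0, 0.9 * np.sqrt(abstol / max(err, 1e-16)))
        else:                                    # reject: requeue both halves of the path
            stack.append((h / 2, dW2))
            stack.append((h / 2, dW1))
            dt = h / 2
    return u

# Geometric Brownian motion du = 0.1 u dt + 0.2 u dW as a smoke test.
u_end = adaptive_em(lambda u, t: 0.1 * u, lambda u, t: 0.2 * u,
                    u0=1.0, t0=0.0, t1=1.0, rng=np.random.default_rng(42))
print("u(1) =", u_end)
```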

  19. Guideline adaptation and implementation planning: a prospective observational study

    PubMed Central

    2013-01-01

    Background Adaptation of high-quality practice guidelines for local use has been advanced as an efficient means to improve acceptability and applicability of evidence-informed care. In a pan-Canadian study, we examined how cancer care groups adapted pre-existing guidelines to their unique context and began implementation planning. Methods Using a mixed-methods, case-study design, five cases were purposefully sampled from self-identified groups and followed as they used a structured method and resources for guideline adaptation. Cases received the ADAPTE Collaboration toolkit, facilitation, methodological and logistical support, resources and assistance as required. Documentary and primary data collection methods captured individual case experience, including monthly summaries of meeting and field notes, email/telephone correspondence, and project records. Site visits, process audits, interviews, and a final evaluation forum with all cases contributed to a comprehensive account of participant experience. Results Study cases took 12 to >24 months to complete guideline adaptation. Although participants appreciated the structure, most found the ADAPTE method complex and lacking practical aspects. They needed assistance establishing individual guideline mandate and infrastructure, articulating health questions, executing search strategies, appraising evidence, and achieving consensus. Facilitation was described as a multi-faceted process, a team effort, and an essential ingredient for guideline adaptation. While front-line care providers implicitly identified implementation issues during adaptation, they identified a need to add an explicit implementation planning component. Conclusions Guideline adaptation is a positive initial step toward evidence-informed care, but adaptation (vs. ‘de novo’ development) did not meet expectations for reducing time or resource commitments. Undertaking adaptation is as much about the process (engagement and capacity building) as it

  20. Adaptive mixed finite element methods for Darcy flow in fractured porous media

    NASA Astrophysics Data System (ADS)

    Chen, Huangxin; Salama, Amgad; Sun, Shuyu

    2016-10-01

    In this paper, we propose adaptive mixed finite element methods for simulating the single-phase Darcy flow in two-dimensional fractured porous media. The reduced model that we use for the simulation is a discrete fracture model coupling Darcy flows in the matrix and the fractures, and the fractures are modeled by one-dimensional entities. The Raviart-Thomas mixed finite element methods are utilized for the solution of the coupled Darcy flows in the matrix and the fractures. In order to improve the efficiency of the simulation, we use adaptive mixed finite element methods based on novel residual-based a posteriori error estimators. In addition, we develop an efficient upscaling algorithm to compute the effective permeability of the fractured porous media. Several interesting examples of Darcy flow in the fractured porous media are presented to demonstrate the robustness of the algorithm.

  1. Arbitrary Lagrangian-Eulerian Method with Local Structured Adaptive Mesh Refinement for Modeling Shock Hydrodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, R W; Pember, R B; Elliott, N S

    2001-10-22

    A new method that combines staggered grid Arbitrary Lagrangian-Eulerian (ALE) techniques with structured local adaptive mesh refinement (AMR) has been developed for solution of the Euler equations. This method facilitates the solution of problems currently at and beyond the boundary of problems soluble by traditional ALE methods, by focusing computational resources where they are required through dynamic adaption. Many of the core issues involved in the development of the combined ALE-AMR method hinge upon the integration of AMR with a staggered grid Lagrangian integration method. The novel components of the method are mainly driven by the need to reconcile traditional AMR techniques, which are typically employed on stationary meshes with cell-centered quantities, with the staggered grids and grid motion employed by Lagrangian methods. Numerical examples are presented which demonstrate the accuracy and efficiency of the method.

  2. Development and evaluation of a method of calibrating medical displays based on fixed adaptation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sund, Patrik, E-mail: patrik.sund@vgregion.se; Månsson, Lars Gunnar; Båth, Magnus

    2015-04-15

    Purpose: The purpose of this work was to develop and evaluate a new method for calibration of medical displays that includes the effect of fixed adaptation, using equipment and luminance levels typical of a modern radiology department. Methods: Low-contrast sinusoidal test patterns were derived at nine luminance levels from 2 to 600 cd/m² and used in a two-alternative forced choice observer study, where the adaptation level was fixed at the logarithmic average of 35 cd/m². The contrast sensitivity at each luminance level was derived by establishing a linear relationship between the ten pattern contrast levels used at every luminance level and a detectability index (d′) calculated from the fraction of correct responses. A Gaussian function was fitted to the data and normalized to the adaptation level. The corresponding equation was used in a display calibration method that included the grayscale standard display function (GSDF) but compensated for fixed adaptation. In the evaluation study, the contrast of circular objects with a fixed pixel contrast was displayed using both calibration methods and was rated on a five-grade scale. Results were calculated using a visual grading characteristics method. Error estimations in both observer studies were derived using a bootstrap method. Results: The contrast sensitivities for the darkest and brightest patterns compared to the contrast sensitivity at the adaptation luminance were 37% and 56%, respectively. The obtained Gaussian fit corresponded well with similar studies. The evaluation study showed a higher degree of equally distributed contrast throughout the luminance range with the calibration method compensated for fixed adaptation than for the GSDF. The two lowest scores for the GSDF were obtained for the darkest and brightest patterns. These scores were significantly lower than the lowest score obtained for the compensated GSDF. For the GSDF, the scores for all luminance levels were
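
    The fitting step described above (a Gaussian fitted to contrast sensitivity versus luminance and normalized to the 35 cd/m² adaptation level) can be sketched as follows. The sensitivity values below are invented placeholders rather than the study's data, and the log-luminance Gaussian form is an assumption made for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical contrast-sensitivity measurements at nine luminance levels (cd/m^2);
# the values are placeholders, not the study's measurements.
luminance = np.array([2, 5, 10, 20, 35, 70, 150, 300, 600], dtype=float)
sensitivity = np.array([0.37, 0.55, 0.75, 0.92, 1.00, 0.95, 0.82, 0.68, 0.56])

def log_gaussian(L, amp, mu, sigma):
    """Gaussian in log-luminance; the fit is later normalized to the adaptation level."""
    x = np.log10(L)
    return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2)

params, _ = curve_fit(log_gaussian, luminance, sensitivity,
                      p0=[1.0, np.log10(35.0), 1.0])
adaptation = 35.0  # cd/m^2, the fixed adaptation level used in the study
normalized = log_gaussian(luminance, *params) / log_gaussian(adaptation, *params)
print("fit parameters (amp, mu, sigma):", np.round(params, 3))
print("sensitivity relative to adaptation level:", np.round(normalized, 2))
```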

  3. Breakthrough Propulsion Physics Project: Project Management Methods

    NASA Technical Reports Server (NTRS)

    Millis, Marc G.

    2004-01-01

    To leap past the limitations of existing propulsion, the NASA Breakthrough Propulsion Physics (BPP) Project seeks further advancements in physics from which new propulsion methods can eventually be derived. Three visionary breakthroughs are sought: (1) propulsion that requires no propellant, (2) propulsion that circumvents existing speed limits, and (3) breakthrough methods of energy production to power such devices. Because these propulsion goals are presumably far from fruition, a special emphasis is to identify credible research that will make measurable progress toward these goals in the near-term. The management techniques to address this challenge are presented, with a special emphasis on the process used to review, prioritize, and select research tasks. This selection process includes these key features: (a) research tasks are constrained to only address the immediate unknowns, curious effects or critical issues, (b) reliability of assertions is more important than the implications of the assertions, which includes the practice where the reviewers judge credibility rather than feasibility, and (c) total scores are obtained by multiplying the criteria scores rather than by adding. Lessons learned and revisions planned are discussed.
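
    The effect of the multiplicative scoring rule mentioned above can be seen in a toy comparison; the criteria names and scores below are invented for illustration and are not taken from the BPP Project's review records.

```python
# Hypothetical reviewer scores (1-5) on three criteria for two candidate tasks.
tasks = {
    "task_A": {"relevance": 4, "credibility": 4, "progress": 4},
    "task_B": {"relevance": 5, "credibility": 1, "progress": 5},
}

for name, scores in tasks.items():
    additive = sum(scores.values())
    multiplicative = 1
    for s in scores.values():
        multiplicative *= s
    # Multiplying lets one very weak criterion (e.g., low credibility) sink the
    # total score, which an additive sum would largely mask.
    print(f"{name}: additive={additive}, multiplicative={multiplicative}")
```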

  4. A hyper-spherical adaptive sparse-grid method for high-dimensional discontinuity detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Guannan; Webster, Clayton G.; Gunzburger, Max D.

    This work proposes and analyzes a hyper-spherical adaptive hierarchical sparse-grid method for detecting jump discontinuities of functions in high-dimensional spaces is proposed. The method is motivated by the theoretical and computational inefficiencies of well-known adaptive sparse-grid methods for discontinuity detection. Our novel approach constructs a function representation of the discontinuity hyper-surface of an N-dimensional dis- continuous quantity of interest, by virtue of a hyper-spherical transformation. Then, a sparse-grid approximation of the transformed function is built in the hyper-spherical coordinate system, whose value at each point is estimated by solving a one-dimensional discontinuity detection problem. Due to the smoothness of themore » hyper-surface, the new technique can identify jump discontinuities with significantly reduced computational cost, compared to existing methods. Moreover, hierarchical acceleration techniques are also incorporated to further reduce the overall complexity. Rigorous error estimates and complexity analyses of the new method are provided as are several numerical examples that illustrate the effectiveness of the approach.« less

  5. Climate Hazard Assessment for Stakeholder Adaptation Planning in New York City

    NASA Technical Reports Server (NTRS)

    Horton, Radley M.; Gornitz, Vivien; Bader, Daniel A.; Ruane, Alex C.; Goldberg, Richard; Rosenzweig, Cynthia

    2011-01-01

    This paper describes a time-sensitive approach to climate change projections, developed as part of New York City's climate change adaptation process, that has provided decision support to stakeholders from 40 agencies, regional planning associations, and private companies. The approach optimizes production of projections given constraints faced by decision makers as they incorporate climate change into long-term planning and policy. New York City stakeholders, who are well-versed in risk management, helped pre-select the climate variables most likely to impact urban infrastructure, and requested a projection range rather than a single 'most likely' outcome. The climate projections approach is transferable to other regions and consistent with broader efforts to provide climate services, including impact, vulnerability, and adaptation information. The approach uses 16 Global Climate Models (GCMs) and three emissions scenarios to calculate monthly change factors based on 30-year average future time slices relative to a 30-year model baseline. Projecting these model mean changes onto observed station data for New York City yields dramatic changes in the frequency of extreme events such as coastal flooding and dangerous heat events. Based on these methods, the current 1-in-10 year coastal flood is projected to occur more than once every 3 years by the end of the century, and heat events are projected to approximately triple in frequency. These frequency changes are of sufficient magnitude to merit consideration in long-term adaptation planning, even though the precise changes in extreme event frequency are highly uncertain.
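
    A minimal sketch of the "change factor" (delta) approach described above, in which 30-year model mean changes are projected onto observed station data; all numbers below are invented placeholders, not the New York City projections.

```python
import numpy as np

# Hypothetical monthly mean temperatures (deg C): a 30-year GCM baseline slice,
# a 30-year future slice, and an observed station climatology. Numbers invented.
rng = np.random.default_rng(0)
months = np.arange(12)
model_baseline = 10 + 8 * np.sin(2 * np.pi * (months - 3) / 12) + rng.normal(0, 0.2, 12)
model_future = model_baseline + np.linspace(1.5, 2.5, 12)   # pretend warming signal
station_obs = 11 + 9 * np.sin(2 * np.pi * (months - 3) / 12)

# Monthly change factors: future-slice mean minus baseline mean, per month...
change_factor = model_future - model_baseline
# ...projected onto the observed station climatology.
station_projection = station_obs + change_factor
print("monthly change factors:", np.round(change_factor, 2))
print("projected station means:", np.round(station_projection, 1))
```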

  6. Robust time and frequency domain estimation methods in adaptive control

    NASA Technical Reports Server (NTRS)

    Lamaire, Richard Orville

    1987-01-01

    A robust identification method was developed for use in an adaptive control system. The type of estimator is called the robust estimator, since it is robust to the effects of both unmodeled dynamics and an unmeasurable disturbance. The development of the robust estimator was motivated by a need to provide guarantees in the identification part of an adaptive controller. To enable the design of a robust control system, a nominal model as well as a frequency-domain bounding function on the modeling uncertainty associated with this nominal model must be provided. Two estimation methods are presented for finding parameter estimates, and, hence, a nominal model. One of these methods is based on the well developed field of time-domain parameter estimation. In a second method of finding parameter estimates, a type of weighted least-squares fitting to a frequency-domain estimated model is used. The frequency-domain estimator is shown to perform better, in general, than the time-domain parameter estimator. In addition, a methodology for finding a frequency-domain bounding function on the disturbance is used to compute a frequency-domain bounding function on the additive modeling error due to the effects of the disturbance and the use of finite-length data. The performance of the robust estimator in both open-loop and closed-loop situations is examined through the use of simulations.

  7. Markov chains of infinite order and asymptotic satisfaction of balance: application to the adaptive integration method.

    PubMed

    Earl, David J; Deem, Michael W

    2005-04-14

    Adaptive Monte Carlo methods can be viewed as implementations of Markov chains with infinite memory. We derive a general condition for the convergence of a Monte Carlo method whose history dependence is contained within the simulated density distribution. In convergent cases, our result implies that the balance condition need only be satisfied asymptotically. As an example, we show that the adaptive integration method converges.
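
    A concrete example of an adaptive Monte Carlo scheme whose history dependence sits entirely in an accumulated density estimate, and whose balance condition holds only asymptotically as the modification factor shrinks, is the Wang-Landau flat-histogram method. The sketch below applies it to a toy discrete system; it is offered as a related illustration, not as the adaptive integration method analyzed in the paper, and all system parameters are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy system: a periodic chain of 20 states with small integer energies.
energies = rng.integers(0, 5, size=20)
levels = np.unique(energies)

lng = np.zeros(energies.max() + 1)   # running estimate of the log density of states
hist = np.zeros_like(lng)            # visit histogram used for the flatness check
f = 1.0                              # modification factor, reduced as the run proceeds
state = 0

while f > 1e-4:
    proposal = (state + rng.choice([-1, 1])) % len(energies)
    # Acceptance depends on the accumulated estimate lng, i.e. on the whole history.
    if rng.random() < np.exp(lng[energies[state]] - lng[energies[proposal]]):
        state = proposal
    e = energies[state]
    lng[e] += f                      # adapt the estimate (violates detailed balance...)
    hist[e] += 1
    if hist[levels].min() > 0.8 * hist[levels].mean():
        hist[:] = 0                  # ...but f -> 0, so balance is restored asymptotically
        f *= 0.5

print("estimated log density of states:", np.round(lng[levels] - lng[levels].min(), 2))
```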

  8. From vision to action: roadmapping as a strategic method and tool to implement climate change adaptation - the example of the roadmap 'water sensitive urban design 2020'.

    PubMed

    Hasse, J U; Weingaertner, D E

    2016-01-01

    As the central product of the BMBF-KLIMZUG-funded Joint Network and Research Project (JNRP) 'dynaklim - Dynamic adaptation of regional planning and development processes to the effects of climate change in the Emscher-Lippe region (North Rhine Westphalia, Germany)', the Roadmap 2020 'Regional Climate Adaptation' has been developed by the various regional stakeholders and institutions containing specific regional scenarios, strategies and adaptation measures applicable throughout the region. This paper presents the method, elements and main results of this regional roadmap process by using the example of the thematic sub-roadmap 'Water Sensitive Urban Design 2020'. With a focus on the process support tool 'KlimaFLEX', one of the main adaptation measures of the WSUD 2020 roadmap, typical challenges for integrated climate change adaptation like scattered knowledge, knowledge gaps and divided responsibilities but also potential solutions and promising chances for urban development and urban water management are discussed. With the roadmap and the related tool, the relevant stakeholders of the Emscher-Lippe region have jointly developed important prerequisites to integrate their knowledge, to clarify vulnerabilities, adaptation goals, responsibilities and interests, and to foresightedly coordinate measures, resources, priorities and schedules for an efficient joint urban planning, well-grounded decision-making in times of continued uncertainties and step-by-step implementation of adaptation measures from now on.

  9. Integrated Analysis of Pharmacologic, Clinical, and SNP Microarray Data using Projection onto the Most Interesting Statistical Evidence with Adaptive Permutation Testing

    PubMed Central

    Pounds, Stan; Cao, Xueyuan; Cheng, Cheng; Yang, Jun; Campana, Dario; Evans, William E.; Pui, Ching-Hon; Relling, Mary V.

    2010-01-01

    Powerful methods for integrated analysis of multiple biological data sets are needed to maximize interpretation capacity and acquire meaningful knowledge. We recently developed Projection Onto the Most Interesting Statistical Evidence (PROMISE). PROMISE is a statistical procedure that incorporates prior knowledge about the biological relationships among endpoint variables into an integrated analysis of microarray gene expression data with multiple biological and clinical endpoints. Here, PROMISE is adapted to the integrated analysis of pharmacologic, clinical, and genome-wide genotype data, incorporating knowledge about the biological relationships among pharmacologic and clinical response data. An efficient permutation-testing algorithm is introduced so that the statistical calculations are computationally feasible in this higher-dimensional setting. The new method is applied to a pediatric leukemia data set. The results clearly indicate that PROMISE is a powerful statistical tool for identifying genomic features that exhibit a biologically meaningful pattern of association with multiple endpoint variables. PMID:21516175

  10. Equivalent-Groups versus Single-Group Equating Designs for the Accelerated CAT-ASVAB (Computerized Adaptive Test-Armed Services Vocational Aptitude Battery) Project.

    DTIC Science & Technology

    1987-01-01

    Equivalent-Groups versus Single-Group Equating Designs for the Accelerated CAT-ASVAB Project. Peter H. Stoloff, Center for Naval Analyses (A Division of Hudson Institute). Keywords: ACAP (Accelerated CAT-ASVAB Program), aptitude tests, ASVAB (Armed Services Vocational Aptitude Battery), CAT (Computerized Adaptive Test).

  11. Gait-Event-Based Synchronization Method for Gait Rehabilitation Robots via a Bioinspired Adaptive Oscillator.

    PubMed

    Chen, Gong; Qi, Peng; Guo, Zhao; Yu, Haoyong

    2017-06-01

    In the field of gait rehabilitation robotics, achieving human-robot synchronization is very important. In this paper, a novel human-robot synchronization method using gait event information is proposed. This method includes two steps. First, seven gait events in one gait cycle are detected in real time with a hidden Markov model; second, an adaptive oscillator is utilized to estimate the stride percentage of human gait using any one of the gait events. Synchronous reference trajectories for the robot are then generated with the estimated stride percentage. This method is based on a bioinspired adaptive oscillator, a mathematical tool first proposed to explain the phenomenon of synchronous flashing among fireflies. The proposed synchronization method is implemented in a portable knee-ankle-foot robot and tested in 15 healthy subjects. This method has the advantages of a simple structure, flexible selection of gait events, and fast adaptation. Gait events are the only information needed, and hence the performance of the synchronization holds even when an abnormal gait pattern is involved. The results of the experiments reveal that our approach is efficient in achieving human-robot synchronization and feasible for rehabilitation robotics applications.
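
    A minimal sketch of the event-driven adaptive-oscillator idea: a phase-locked-loop style oscillator integrates a stride phase and corrects both phase and frequency whenever a gait event with a known nominal phase is detected. The gains, the simulated cadence, and the crude event detector below are illustrative assumptions, not the paper's HMM-based detector or its specific oscillator model.

```python
import numpy as np

def wrap(a):
    """Wrap an angle to (-pi, pi]."""
    return (a + np.pi) % (2 * np.pi) - np.pi

class GaitPhaseOscillator:
    """PLL-style adaptive oscillator: integrates a stride phase and, whenever a
    gait event with a known nominal phase arrives (e.g. heel strike at 0%),
    nudges phase and frequency toward agreement. Gains are illustrative."""
    def __init__(self, freq_hz=1.0, k_phase=0.5, k_freq=0.3):
        self.phase = 0.0                   # rad; 2*pi corresponds to one full stride
        self.omega = 2 * np.pi * freq_hz   # rad/s
        self.k_phase, self.k_freq = k_phase, k_freq

    def step(self, dt, event_phase=None):
        self.phase = (self.phase + self.omega * dt) % (2 * np.pi)
        if event_phase is not None:
            err = wrap(event_phase - self.phase)
            self.phase = (self.phase + self.k_phase * err) % (2 * np.pi)
            self.omega += self.k_freq * err        # adapt the stride frequency
        return 100.0 * self.phase / (2 * np.pi)    # estimated stride percentage

# Simulated subject walking at 0.9 Hz; heel strikes (nominal phase 0) once per stride.
osc, dt, stride = GaitPhaseOscillator(freq_hz=1.2), 0.01, 1.0 / 0.9
for i in range(1500):
    t = i * dt
    heel_strike = (t % stride) < dt                # crude periodic event detector
    pct = osc.step(dt, event_phase=0.0 if heel_strike else None)
print("adapted cadence estimate: %.2f Hz" % (osc.omega / (2 * np.pi)))
print("current stride percentage: %.1f %%" % pct)
```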

  12. Wavelet and adaptive methods for time dependent problems and applications in aerosol dynamics

    NASA Astrophysics Data System (ADS)

    Guo, Qiang

    Time dependent partial differential equations (PDEs) are widely used as mathematical models of environmental problems. Aerosols are now clearly identified as an important factor in many environmental aspects of climate and radiative forcing processes, as well as in the health effects of air quality. The mathematical models for the aerosol dynamics with respect to size distribution are nonlinear partial differential and integral equations, which describe processes of condensation, coagulation and deposition. Simulating the general aerosol dynamic equations over time, particle size and space exhibits serious difficulties because the size dimension ranges from a few nanometers to several micrometers while the spatial dimension is usually described in kilometers. Therefore, it is an important and challenging task to develop efficient techniques for solving time dependent dynamic equations. In this thesis, we develop and analyze efficient wavelet and adaptive methods for the time dependent dynamic equations on particle size and further apply them to the spatial aerosol dynamic systems. A wavelet Galerkin method is proposed to solve the aerosol dynamic equations in time and particle size because the aerosol distribution changes strongly along the size direction and the wavelet technique can resolve this very efficiently. Daubechies wavelets are considered in this study because they possess useful properties such as orthogonality, compact support, and exact representation of polynomials up to a certain degree. Another problem encountered in the solution of the aerosol dynamic equations results from its hyperbolic form, due to the condensation growth term. We propose a new characteristic-based fully adaptive multiresolution numerical scheme for solving the aerosol dynamic equation, which combines the attractive advantages of the adaptive multiresolution technique and the method of characteristics. Regarding theoretical analysis, the global existence and uniqueness of
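
    As a small illustration of why Daubechies wavelets suit sharply varying size distributions, the sketch below decomposes a synthetic signal with PyWavelets, keeps only the largest coefficients, and reconstructs it. The signal shape and threshold are arbitrary assumptions; this is not the thesis's wavelet Galerkin solver.

```python
import numpy as np
import pywt

# Synthetic aerosol-like profile: a sharp mode on a slowly varying background
# (purely illustrative data, not from the thesis).
x = np.linspace(0, 1, 1024)
signal = np.exp(-0.5 * ((x - 0.3) / 0.01) ** 2) + 0.2 * np.sin(2 * np.pi * x)

# Multi-level decomposition with an orthogonal, compactly supported Daubechies wavelet.
coeffs = pywt.wavedec(signal, wavelet='db4', level=5)

# Keep only the largest 5% of coefficients: the sharp mode survives because the
# wavelet basis concentrates it in a few coefficients.
flat = np.concatenate(coeffs)
threshold = np.quantile(np.abs(flat), 0.95)
compressed = [pywt.threshold(c, threshold, mode='hard') for c in coeffs]
reconstructed = pywt.waverec(compressed, wavelet='db4')
error = np.max(np.abs(reconstructed[:len(signal)] - signal))
print("max error after keeping ~5% of coefficients:", error)
```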

  13. Second derivatives for approximate spin projection methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, Lee M.; Hratchian, Hrant P., E-mail: hhratchian@ucmerced.edu

    2015-02-07

    The use of broken-symmetry electronic structure methods is required in order to obtain correct behavior of electronically strained open-shell systems, such as transition states, biradicals, and transition metals. This approach often has issues with spin contamination, which can lead to significant errors in predicted energies, geometries, and properties. Approximate projection schemes are able to correct for spin contamination and can often yield improved results. To fully make use of these methods and to carry out exploration of the potential energy surface, it is desirable to develop an efficient second energy derivative theory. In this paper, we formulate the analytical second derivatives for the Yamaguchi approximate projection scheme, building on recent work that has yielded an efficient implementation of the analytical first derivatives.

  14. A photoacoustic imaging reconstruction method based on directional total variation with adaptive directivity.

    PubMed

    Wang, Jin; Zhang, Chen; Wang, Yuanyuan

    2017-05-30

    In photoacoustic tomography (PAT), total variation (TV) based iteration algorithms are reported to have good performance in PAT image reconstruction. However, the classical TV-based algorithm fails to preserve the edges and texture details of the image because it is not sensitive to the direction of the image. Therefore, it is of great significance to develop a new PAT reconstruction algorithm that effectively overcomes this drawback of TV. In this paper, a directional total variation with adaptive directivity (DDTV) model-based PAT image reconstruction algorithm, which weights and sums the image gradients based on the spatially varying directivity pattern of the image, is proposed to overcome the shortcomings of TV. The orientation field of the image is adaptively estimated through a gradient-based approach. The image gradients are weighted at every pixel based on both its anisotropic direction and another parameter, which evaluates the reliability of the estimated orientation field. An efficient algorithm is derived to solve the iteration problem associated with DDTV, with the directivity of the image adaptively updated at each iteration step. Several texture images with various directivity patterns are chosen as the phantoms for the numerical simulations. The 180-, 90- and 30-view circular scans are conducted. Results obtained show that the DDTV-based PAT reconstruction algorithm outperforms the filtered back-projection (FBP) and TV algorithms in the quality of the reconstructed images, with peak signal-to-noise ratios (PSNR) exceeding those of TV and FBP by about 10 and 18 dB, respectively, for all cases. The Shepp-Logan phantom is studied with further discussion of multimode scanning, convergence speed, robustness and universality aspects. In-vitro experiments are performed for both sparse-view circular scanning and linear scanning. The results further prove the effectiveness of the DDTV, which shows better results than that of the TV with sharper image edges and

  15. Likelihood Methods for Adaptive Filtering and Smoothing. Technical Report #455.

    ERIC Educational Resources Information Center

    Butler, Ronald W.

    The dynamic linear model or Kalman filtering model provides a useful methodology for predicting the past, present, and future states of a dynamic system, such as an object in motion or an economic or social indicator that is changing systematically with time. Recursive likelihood methods for adaptive Kalman filtering and smoothing are developed.…
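
    For reference, a minimal predict/update cycle of the discrete Kalman filter on which the report builds can be sketched as follows; the constant-velocity model and noise levels are illustrative assumptions, not the report's setup.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of the discrete Kalman filter.
    x, P : prior state estimate and covariance; z : new measurement."""
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Illustrative 1D constant-velocity tracker (all values made up).
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])        # position/velocity transition
H = np.array([[1.0, 0.0]])                   # only position is observed
Q = 1e-3 * np.eye(2)
R = np.array([[0.25]])

rng = np.random.default_rng(1)
x, P = np.zeros(2), np.eye(2)
truth = np.array([0.0, 1.0])
for _ in range(20):
    truth = F @ truth
    z = H @ truth + rng.normal(0, 0.5, size=1)
    x, P = kalman_step(x, P, z, F, H, Q, R)
print("estimated state:", x, "true state:", truth)
```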

  16. Adapting ethanol fuels to diesel engines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    During the 2nd International Alcohol Symposium 1977, Daimler-Benz reported on the advantages and disadvantages of the various methods of using ethanol in originally diesel-operated commercial vehicles, and especially about the first results in the field of adapting the ethanol fuel to the requirements of conventional diesel engines. Investigations to this effect were continued by Daimler-Benz AG, Stuttgart, and Mercedes-Benz of Brasil in coordination with competent Brazilian government departments. The development effort is primarily adapted to Brazilian conditions, since ethanol fuel is intended as a long-term project in this country. This report is presented under headings - auto-ignition; durability tests; remedial measures; the injection systems; ethanol quality.

  17. Accurate Projection Methods for the Incompressible Navier–Stokes Equations

    DOE PAGES

    Brown, David L.; Cortez, Ricardo; Minion, Michael L.

    2001-04-10

    This paper considers the accuracy of projection method approximations to the initial–boundary-value problem for the incompressible Navier–Stokes equations. The issue of how to correctly specify numerical boundary conditions for these methods has been outstanding since the birth of the second-order methodology a decade and a half ago. It has been observed that while the velocity can be reliably computed to second-order accuracy in time and space, the pressure is typically only first-order accurate in the L∞-norm. Here, we identify the source of this problem in the interplay of the global pressure-update formula with the numerical boundary conditions and present an improved projection algorithm which is fully second-order accurate, as demonstrated by a normal mode analysis and numerical experiments. In addition, a numerical method based on a gauge variable formulation of the incompressible Navier–Stokes equations, which provides another option for obtaining fully second-order convergence in both velocity and pressure, is discussed. The connection between the boundary conditions for projection methods and the gauge method is explained in detail.
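
    A bare-bones sketch of the projection idea itself (an intermediate velocity, a pressure Poisson solve, and a divergence-removing correction) is given below on a doubly periodic grid, where FFTs make the Poisson solve trivial. Periodic boundaries deliberately sidestep the numerical boundary-condition issue the paper analyzes, and the time discretization is only first order; this is not the paper's second-order algorithm, and grid size, viscosity and time step are arbitrary demo values.

```python
import numpy as np

n, nu, dt = 64, 1e-2, 1e-3
k = 2 * np.pi * np.fft.fftfreq(n, d=1.0 / n)
kx, ky = np.meshgrid(k, k, indexing='ij')
k2 = kx**2 + ky**2
k2_safe = k2.copy()
k2_safe[0, 0] = 1.0                      # avoid 0/0 for the mean pressure mode

def ddx(f, kk):
    """Spectral derivative of a real periodic field along the axis matching kk."""
    return np.real(np.fft.ifft2(1j * kk * np.fft.fft2(f)))

def laplacian(f):
    return np.real(np.fft.ifft2(-k2 * np.fft.fft2(f)))

def projection_step(u, v):
    # 1. Explicit intermediate velocity u* (advection + diffusion, no pressure).
    adv_u = u * ddx(u, kx) + v * ddx(u, ky)
    adv_v = u * ddx(v, kx) + v * ddx(v, ky)
    us = u + dt * (-adv_u + nu * laplacian(u))
    vs = v + dt * (-adv_v + nu * laplacian(v))
    # 2. Pressure Poisson equation lap(p) = div(u*) / dt, solved spectrally.
    div_hat = 1j * kx * np.fft.fft2(us) + 1j * ky * np.fft.fft2(vs)
    p = np.real(np.fft.ifft2(-div_hat / (dt * k2_safe)))
    # 3. Project: subtract the pressure gradient to enforce div(u) = 0.
    return us - dt * ddx(p, kx), vs - dt * ddx(p, ky)

# Taylor-Green-like initial condition on the unit periodic square.
x = np.arange(n) / n
X, Y = np.meshgrid(x, x, indexing='ij')
u = np.sin(2 * np.pi * X) * np.cos(2 * np.pi * Y)
v = -np.cos(2 * np.pi * X) * np.sin(2 * np.pi * Y)
for _ in range(10):
    u, v = projection_step(u, v)
print("max divergence after projection:", np.abs(ddx(u, kx) + ddx(v, ky)).max())
```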

  18. Recent Results from NASA's Morphing Project

    NASA Technical Reports Server (NTRS)

    McGowan, Anna-Maria R.; Washburn, Anthony E.; Horta, Lucas G.; Bryant, Robert G.; Cox, David E.; Siochi, Emilie J.; Padula, Sharon L.; Holloway, Nancy M.

    2002-01-01

    The NASA Morphing Project seeks to develop and assess advanced technologies and integrated component concepts to enable efficient, multi-point adaptability in air and space vehicles. In the context of the project, the word "morphing" is defined as "efficient, multi-point adaptability" and may include macro, micro, structural and/or fluidic approaches. The project includes research on smart materials, adaptive structures, micro flow control, biomimetic concepts, optimization and controls. This paper presents an updated overview of the content of the Morphing Project including highlights of recent research results.

  19. A Formula for Fixing Troubled Projects: The Scientific Method Meets Leadership

    NASA Technical Reports Server (NTRS)

    Wagner, Sandra

    2006-01-01

    This presentation focuses on project management, specifically addressing project issues using the scientific method of problem-solving. Two sample projects where this methodology has been applied are provided.

  20. A new anisotropic mesh adaptation method based upon hierarchical a posteriori error estimates

    NASA Astrophysics Data System (ADS)

    Huang, Weizhang; Kamenski, Lennard; Lang, Jens

    2010-03-01

    A new anisotropic mesh adaptation strategy for finite element solution of elliptic differential equations is presented. It generates anisotropic adaptive meshes as quasi-uniform ones in some metric space, with the metric tensor being computed based on hierarchical a posteriori error estimates. A global hierarchical error estimate is employed in this study to obtain reliable directional information of the solution. Instead of solving the global error problem exactly, which is costly in general, we solve it iteratively using the symmetric Gauß-Seidel method. Numerical results show that a few GS iterations are sufficient for obtaining a reasonably good approximation to the error for use in anisotropic mesh adaptation. The new method is compared with several strategies using local error estimators or recovered Hessians. Numerical results are presented for a selection of test examples and a mathematical model for heat conduction in a thermal battery with large orthotropic jumps in the material coefficients.
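
    The symmetric Gauss-Seidel sweeps used above to approximate the global error problem can be sketched as follows; the test matrix and the number of sweeps are illustrative, and the sketch is not tied to the paper's finite element setting.

```python
import numpy as np

def symmetric_gauss_seidel(A, b, x0=None, sweeps=5):
    """A few symmetric Gauss-Seidel sweeps (forward then backward) for A x = b.
    As in the abstract, a small fixed number of sweeps often gives a usable
    approximation rather than an exact solve."""
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    for _ in range(sweeps):
        for i in range(n):               # forward sweep
            x[i] = (b[i] - A[i, :i] @ x[:i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
        for i in reversed(range(n)):     # backward sweep
            x[i] = (b[i] - A[i, :i] @ x[:i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

# Small SPD test problem (1D Laplacian); values are illustrative.
n = 8
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = symmetric_gauss_seidel(A, b, sweeps=10)
print("residual norm after 10 sweeps:", np.linalg.norm(A @ x - b))
```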

  1. Marginal and Internal Adaptation of Zirconia Crowns: A Comparative Study of Assessment Methods.

    PubMed

    Cunali, Rafael Schlögel; Saab, Rafaella Caramori; Correr, Gisele Maria; Cunha, Leonardo Fernandes da; Ornaghi, Bárbara Pick; Ritter, André V; Gonzaga, Carla Castiglia

    2017-01-01

    Marginal and internal adaptation is critical for the success of indirect restorations. New imaging systems make it possible to evaluate these parameters precisely and non-destructively. This study evaluated the marginal and internal adaptation of zirconia copings fabricated with two different systems using both the silicone replica and microcomputed tomography (micro-CT) assessment methods. A metal master model, representing a preparation for an all-ceramic full crown, was digitally scanned and polycrystalline zirconia copings were fabricated with either Ceramill Zi (Amann-Girrbach) or inCoris Zi (Dentsply-Sirona), n=10. For each coping, marginal and internal gaps were evaluated by the silicone replica and micro-CT assessment methods. Four assessment points of each replica cross-section and micro-CT image were evaluated using imaging software: marginal gap (MG), axial wall (AW), axio-occlusal angle (AO) and mid-occlusal wall (MO). Data were statistically analyzed by factorial ANOVA and Tukey test (α=0.05). There was no statistically significant difference between the methods for MG and AW. For AO, there were significant differences between methods for Amann copings, while for Dentsply-Sirona copings similar values were observed. For MO, both methods presented statistically significant differences. A positive correlation was observed between the MG values determined by the two assessment methods. In conclusion, the assessment method influenced the evaluation of marginal and internal adaptation of zirconia copings. Micro-CT showed lower marginal and internal gap values when compared to the silicone replica technique, although the difference was not always statistically significant. The marginal gap and axial wall assessment points showed the lowest gap values, regardless of the ceramic system and assessment method used.

  2. Quantification of organ motion based on an adaptive image-based scale invariant feature method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paganelli, Chiara; Peroni, Marta; Baroni, Guido

    2013-11-15

    Purpose: The availability of corresponding landmarks in IGRT image series allows quantifying the inter and intrafractional motion of internal organs. In this study, an approach for the automatic localization of anatomical landmarks is presented, with the aim of describing the nonrigid motion of anatomo-pathological structures in radiotherapy treatments according to local image contrast. Methods: An adaptive scale invariant feature transform (SIFT) was developed from the integration of a standard 3D SIFT approach with a local image-based contrast definition. The robustness and invariance of the proposed method to shape-preserving and deformable transforms were analyzed in a CT phantom study. The application of contrast transforms to the phantom images was also tested, in order to verify the variation of the local adaptive measure in relation to the modification of image contrast. The method was also applied to a lung 4D CT dataset, relying on manual feature identification by an expert user as ground truth. The 3D residual distance between matches obtained in adaptive-SIFT was then computed to verify the internal motion quantification with respect to the expert user. Extracted corresponding features in the lungs were used as regularization landmarks in a multistage deformable image registration (DIR) mapping the inhale vs exhale phase. The residual distances between the warped manual landmarks and their reference position in the inhale phase were evaluated, in order to provide a quantitative indication of the registration performed with the three different point sets. Results: The phantom study confirmed the method invariance and robustness properties to shape-preserving and deformable transforms, showing residual matching errors below the voxel dimension. The adapted SIFT algorithm on the 4D CT dataset provided automated and accurate motion detection of peak to peak breathing motion. The proposed method resulted in reduced residual errors with respect to

  3. Participatory Scenario Planning for Climate Change Adaptation: the Maui Groundwater Project

    NASA Astrophysics Data System (ADS)

    Keener, V. W.; Brewington, L.; Finucane, M.

    2015-12-01

    For the last century, the island of Maui in Hawai'i has been the center of environmental, agricultural, and legal conflict with respect to both surface and groundwater allocation. Planning for sustainable future freshwater supply in Hawai'i requires adaptive policies and decision-making that emphasizes private and public partnerships and knowledge transfer between scientists and non-scientists. We have downscaled dynamical climate models to 1 km resolution in Maui and coupled them with a USGS Water Budget model and a participatory scenario building process to quantify future changes in island-scale climate and groundwater recharge under different land uses. Although these projections are uncertain, the integrated nature of the Pacific RISA research program has allowed us to take a multi-pronged approach to facilitate the uptake of climate information into policy and management. This presentation details the ongoing work to support the development of Hawai'i's first island-wide water use plan under the new climate adaptation directive. Participatory scenario planning began in 2012 to bring together a diverse group of ~100 decision-makers in state and local government, watershed restoration, agriculture, and conservation to 1) determine the type of information (climate variables, land use and development, agricultural practices) they would find helpful in planning for climate change, and 2) develop a set of nested scenarios that represent alternative climate and management futures. This integration of knowledge is an iterative process, resulting in flexible and transparent narratives of complex futures comprised of information at multiple scales. We will present an overview of the downscaling, scenario building, hydrological modeling processes, and stakeholder response.

  4. An Optimal Control Modification to Model-Reference Adaptive Control for Fast Adaptation

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan T.; Krishnakumar, Kalmanje; Boskovic, Jovan

    2008-01-01

    This paper presents a method that can achieve fast adaptation for a class of model-reference adaptive control. It is well known that standard model-reference adaptive control exhibits high-gain control behaviors when a large adaptive gain is used to achieve fast adaptation in order to reduce tracking error rapidly. High-gain control creates high-frequency oscillations that can excite unmodeled dynamics and can lead to instability. The fast adaptation approach is based on the minimization of the squares of the tracking error, which is formulated as an optimal control problem. The necessary condition of optimality is used to derive an adaptive law using the gradient method. This adaptive law is shown to result in uniform boundedness of the tracking error by means of Lyapunov's direct method. Furthermore, this adaptive law allows a large adaptive gain to be used without causing undesired high-gain control effects. The method is shown to be more robust than standard model-reference adaptive control. Simulations demonstrate the effectiveness of the proposed method.
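
    A minimal sketch of gradient-based model-reference adaptive control for a first-order plant is given below; the plant, reference model, and adaptive gain are invented numbers, and the paper's optimal control modification term is only indicated in a comment rather than implemented.

```python
import numpy as np

# Standard first-order MRAC with a Lyapunov/gradient adaptive law (not the
# paper's modified law). All numerical values are made up for illustration.
a, b = 1.0, 3.0            # unknown, unstable plant:      dx/dt  = a x + b u
am, bm = -4.0, 4.0         # stable reference model:       dxm/dt = am xm + bm r
gamma = 10.0               # adaptive gain (larger -> faster but riskier adaptation)

dt, T = 1e-3, 10.0
x = xm = 0.0
kx, kr = 0.0, 0.0          # adaptive feedback / feedforward gains
for k in range(int(T / dt)):
    t = k * dt
    r = 1.0 if (t % 4.0) < 2.0 else -1.0       # square-wave reference command
    u = kx * x + kr * r
    e = x - xm                                  # tracking error
    # Gradient adaptive laws (sign(b) assumed known). The optimal control
    # modification of the paper adds an extra term that limits the effective
    # gain and damps high-gain oscillations; it is omitted here for brevity.
    kx += dt * (-gamma * e * x * np.sign(b))
    kr += dt * (-gamma * e * r * np.sign(b))
    x  += dt * (a * x + b * u)                  # plant integration (explicit Euler)
    xm += dt * (am * xm + bm * r)               # reference model integration
print("final gains kx=%.2f kr=%.2f (ideal: %.2f, %.2f)"
      % (kx, kr, (am - a) / b, bm / b))
```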

  5. VisAdapt: A Visualization Tool to Support Climate Change Adaptation.

    PubMed

    Johansson, Jimmy; Opach, Tomasz; Glaas, Erik; Neset, Tina-Simone; Navarra, Carlo; Linner, Bjorn-Ola; Rod, Jan Ketil

    2017-01-01

    The web-based visualization tool VisAdapt was developed to help laypeople in the Nordic countries assess how anticipated climate change will impact their homes. The tool guides users through a three-step visual process that helps them explore risks and identify adaptive actions tailored to their location and house type. This article walks through the tool's multistep, user-centered design process. Although VisAdapt's target end users are Nordic homeowners, the insights gained from the development process and the lessons learned from the project are applicable to a wide range of domains.

  6. The life cycles of six multi-center adaptive clinical trials focused on neurological emergencies developed for the Advancing Regulatory Science initiative of the National Institutes of Health and US Food and Drug Administration: Case studies from the Adaptive Designs Accelerating Promising Treatments Into Trials Project.

    PubMed

    Guetterman, Timothy C; Fetters, Michael D; Mawocha, Samkeliso; Legocki, Laurie J; Barsan, William G; Lewis, Roger J; Berry, Donald A; Meurer, William J

    2017-01-01

    Clinical trials are complicated, expensive, time-consuming, and frequently do not lead to discoveries that improve the health of patients with disease. Adaptive clinical trials have emerged as a methodology to provide more flexibility in design elements to better answer scientific questions regarding whether new treatments are efficacious. Limited observational data exist that describe the complex process of designing adaptive clinical trials. To address these issues, the Adaptive Designs Accelerating Promising Treatments Into Trials project developed six tailored, flexible, adaptive, phase-III clinical trials for neurological emergencies, and investigators prospectively monitored and observed the processes. The objective of this work is to describe the adaptive design development process, the final design, and the current status of the adaptive trial designs that were developed. To observe and reflect upon the trial development process, we employed a rich, mixed-methods evaluation that combined quantitative data from a visual analog scale assessing attitudes about adaptive trials with in-depth qualitative data about the development process gathered from observations. The Adaptive Designs Accelerating Promising Treatments Into Trials team developed six adaptive clinical trial designs. Across the six designs, 53 attitude surveys were completed at baseline and after the trial planning process was completed. Compared to baseline, the participants believed significantly more strongly that the adaptive designs would be accepted by National Institutes of Health review panels and non-researcher clinicians. In addition, after the trial planning process, the participants more strongly believed that the adaptive design would meet the scientific and medical goals of the studies. Introducing the adaptive design at early conceptualization proved critical to successful adoption and implementation of that trial. Involving key stakeholders from several scientific domains early

  7. Adaptive mesh refinement techniques for the immersed interface method applied to flow problems

    PubMed Central

    Li, Zhilin; Song, Peng

    2013-01-01

    In this paper, we develop an adaptive mesh refinement strategy of the Immersed Interface Method for flow problems with a moving interface. The work is built on the AMR method developed for two-dimensional elliptic interface problems in the paper [12] (CiCP, 12(2012), 515–527). The interface is captured by the zero level set of a Lipschitz continuous function φ(x, y, t). Our adaptive mesh refinement is built within a small band of |φ(x, y, t)| ≤ δ with finer Cartesian meshes. The AMR-IIM is validated for Stokes and Navier-Stokes equations with exact solutions, moving interfaces driven by the surface tension, and classical bubble deformation problems. A new simple area preserving strategy is also proposed in this paper for the level set method. PMID:23794763

  8. Projection methods for the numerical solution of Markov chain models

    NASA Technical Reports Server (NTRS)

    Saad, Youcef

    1989-01-01

    Projection methods for computing stationary probability distributions for Markov chain models are presented. A general projection method is a method which seeks an approximation from a subspace of small dimension to the original problem. Thus, the original matrix problem of size N is approximated by one of dimension m, typically much smaller than N. A particularly successful class of methods based on this principle is that of Krylov subspace methods, which utilize subspaces of the form span{v, Av, ..., A^(m-1) v}. These methods are effective in solving linear systems and eigenvalue problems (Lanczos, Arnoldi,...) as well as nonlinear equations. They can be combined with more traditional iterative methods such as successive overrelaxation, symmetric successive overrelaxation, or with incomplete factorization methods to enhance convergence.
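
    A small sketch of a Krylov projection method for a stationary distribution: an Arnoldi factorization builds an orthonormal basis of span{v, Av, ..., A^(m-1) v} with A = P^T, and the Ritz vector whose Ritz value is closest to 1 approximates the stationary vector. The chain below is a random toy example and the subspace dimension is arbitrary; this is an illustration of the principle, not the report's specific algorithm.

```python
import numpy as np

def arnoldi(A, v0, m):
    """Arnoldi factorization: orthonormal V spans span{v0, A v0, ..., A^(m-1) v0},
    with H the projected (Hessenberg) matrix V^T A V."""
    n = len(v0)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    V[:, 0] = v0 / np.linalg.norm(v0)
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):                  # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:                 # invariant subspace found
            return V[:, :j + 1], H[:j + 1, :j + 1]
        V[:, j + 1] = w / H[j + 1, j]
    return V[:, :m], H[:m, :m]

# Toy 100-state Markov chain (random row-stochastic matrix P); the stationary
# distribution satisfies pi P = pi, i.e. pi is the eigenvector of P^T at eigenvalue 1.
rng = np.random.default_rng(0)
P = rng.random((100, 100))
P /= P.sum(axis=1, keepdims=True)

V, H = arnoldi(P.T, np.ones(100), m=20)         # project onto a 20-dimensional subspace
evals, evecs = np.linalg.eig(H)
k = np.argmin(np.abs(evals - 1.0))              # Ritz value closest to 1
pi = np.real(V @ evecs[:, k])
pi /= pi.sum()                                  # normalize to a probability vector
print("residual ||pi P - pi|| =", np.linalg.norm(pi @ P - pi))
```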

  9. An improved adaptive weighting function method for State Estimation in Power Systems with VSC-MTDC

    NASA Astrophysics Data System (ADS)

    Zhao, Kun; Yang, Xiaonan; Lang, Yansheng; Song, Xuri; Wang, Minkun; Luo, Yadi; Wu, Lingyun; Liu, Peng

    2017-04-01

    This paper presents an effective approach for state estimation in power systems that include multi-terminal voltage source converter based high voltage direct current (VSC-MTDC) links, called the improved adaptive weighting function method. The proposed approach is simplified in that the VSC-MTDC system is solved first, followed by the AC system, because the new state estimation method only changes the weights and keeps the matrix dimension unchanged. Accurate and fast convergence of the AC/DC system can be realized by the adaptive weighting function method. This method also provides technical support for the simulation analysis and accurate regulation of AC/DC systems. Both theoretical analysis and numerical tests verify the practicability, validity and convergence of the new method.

  10. An edge-based solution-adaptive method applied to the AIRPLANE code

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Thomas, Scott D.; Cliff, Susan E.

    1995-01-01

    Computational methods to solve large-scale realistic problems in fluid flow can be made more efficient and cost effective by using them in conjunction with dynamic mesh adaption procedures that perform simultaneous coarsening and refinement to capture flow features of interest. This work couples the tetrahedral mesh adaption scheme, 3D_TAG, with the AIRPLANE code to solve complete aircraft configuration problems in transonic and supersonic flow regimes. Results indicate that the near-field sonic boom pressure signature of a cone-cylinder is improved, the oblique and normal shocks are better resolved on a transonic wing, and the bow shock ahead of an unstarted inlet is better defined.

  11. Novel Multistatic Adaptive Microwave Imaging Methods for Early Breast Cancer Detection

    NASA Astrophysics Data System (ADS)

    Xie, Yao; Guo, Bin; Li, Jian; Stoica, Petre

    2006-12-01

    Multistatic adaptive microwave imaging (MAMI) methods are presented and compared for early breast cancer detection. Due to the significant contrast between the dielectric properties of normal and malignant breast tissues, developing microwave imaging techniques for early breast cancer detection has attracted much interest lately. MAMI is one of the microwave imaging modalities and employs multiple antennas that take turns to transmit ultra-wideband (UWB) pulses while all antennas are used to receive the reflected signals. MAMI can be considered as a special case of the multi-input multi-output (MIMO) radar with the multiple transmitted waveforms being either UWB pulses or zeros. Since the UWB pulses transmitted by different antennas are displaced in time, the multiple transmitted waveforms are orthogonal to each other. The challenge to microwave imaging is to improve resolution and suppress strong interferences caused by the breast skin, nipple, and so forth. The MAMI methods we investigate herein utilize the data-adaptive robust Capon beamformer (RCB) to achieve high resolution and interference suppression. We will demonstrate the effectiveness of our proposed methods for breast cancer detection via numerical examples with data simulated using the finite-difference time-domain method based on a 3D realistic breast model.
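    A simplified stand-in for the data-adaptive beamforming step: a standard Capon beamformer with diagonal loading rather than the full robust Capon beamformer (RCB) of the paper. The array size, snapshots and steering vector below are hypothetical:

        import numpy as np

        def capon_weights(R, a, loading=1e-2):
            # Diagonal loading regularizes the sample covariance before inversion.
            Rl = R + loading * np.trace(R) / R.shape[0] * np.eye(R.shape[0])
            Ri_a = np.linalg.solve(Rl, a)
            return Ri_a / (a.conj() @ Ri_a)

        rng = np.random.default_rng(1)
        snapshots = rng.standard_normal((8, 200)) + 1j * rng.standard_normal((8, 200))
        R = snapshots @ snapshots.conj().T / snapshots.shape[1]
        a = np.ones(8, dtype=complex)            # hypothetical steering vector for a focal point

        w = capon_weights(R, a)
        backscatter_estimate = w.conj() @ snapshots   # adaptive estimate per snapshot
        print(backscatter_estimate.shape)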

  12. Team-Centered Perspective for Adaptive Automation Design

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III

    2003-01-01

    Automation represents a very active area of human factors research. The journal, Human Factors, published a special issue on automation in 1985. Since then, hundreds of scientific studies have been published examining the nature of automation and its interaction with human performance. However, despite a dramatic increase in research investigating human factors issues in aviation automation, there remain areas that need further exploration. This NASA Technical Memorandum describes a new area of automation design and research, called adaptive automation. It discusses the concepts and outlines the human factors issues associated with the new method of adaptive function allocation. The primary focus is on human-centered design, and specifically on ensuring that adaptive automation is from a team-centered perspective. The document shows that adaptive automation has many human factors issues common to traditional automation design. Much like the introduction of other new technologies and paradigm shifts, adaptive automation presents an opportunity to remediate current problems but poses new ones for human-automation interaction in aerospace operations. The review here is intended to communicate the philosophical perspective and direction of adaptive automation research conducted under the Aerospace Operations Systems (AOS), Physiological and Psychological Stressors and Factors (PPSF) project.

  13. An adaptive reentry guidance method considering the influence of blackout zone

    NASA Astrophysics Data System (ADS)

    Wu, Yu; Yao, Jianyao; Qu, Xiangju

    2018-01-01

    Reentry guidance has been researched as a popular topic because it is critical for a successful flight. In view that the existing guidance methods do not take into account the accumulated navigation error of Inertial Navigation System (INS) in the blackout zone, in this paper, an adaptive reentry guidance method is proposed to obtain the optimal reentry trajectory quickly with the target of minimum aerodynamic heating rate. The terminal error in position and attitude can be also reduced with the proposed method. In this method, the whole reentry guidance task is divided into two phases, i.e., the trajectory updating phase and the trajectory planning phase. In the first phase, the idea of model predictive control (MPC) is used, and the receding optimization procedure ensures the optimal trajectory in the next few seconds. In the trajectory planning phase, after the vehicle has flown out of the blackout zone, the optimal reentry trajectory is obtained by online planning to adapt to the navigation information. An effective swarm intelligence algorithm, i.e. pigeon inspired optimization (PIO) algorithm, is applied to obtain the optimal reentry trajectory in both of the two phases. Compared to the trajectory updating method, the proposed method can reduce the terminal error by about 30% considering both the position and attitude, especially, the terminal error of height has almost been eliminated. Besides, the PIO algorithm performs better than the particle swarm optimization (PSO) algorithm both in the trajectory updating phase and the trajectory planning phases.

  14. Advanced adaptive computational methods for Navier-Stokes simulations in rotorcraft aerodynamics

    NASA Technical Reports Server (NTRS)

    Stowers, S. T.; Bass, J. M.; Oden, J. T.

    1993-01-01

    A phase 2 research and development effort was conducted in the area of transonic, compressible, inviscid flows with the ultimate goal of numerically modeling complex flows inherent in advanced helicopter blade designs. The algorithms and methodologies are classified as adaptive methods: they employ error estimation techniques to approximate the local numerical error and automatically refine or unrefine the mesh so as to deliver a given level of accuracy. The result is a scheme which attempts to produce the best possible results with the least number of grid points, degrees of freedom, and operations. These types of schemes automatically locate and resolve shocks, shear layers, and other flow details to an accuracy level specified by the user of the code. The phase 1 work involved a feasibility study of h-adaptive methods for steady viscous flows, with emphasis on accurate simulation of vortex initiation, migration, and interaction. The phase 2 effort focused on extending these algorithms and methodologies to a three-dimensional topology.

  15. Scale-adaptive compressive tracking with feature integration

    NASA Astrophysics Data System (ADS)

    Liu, Wei; Li, Jicheng; Chen, Xiao; Li, Shuxin

    2016-05-01

    Numerous tracking-by-detection methods have been proposed for robust visual tracking, among which compressive tracking (CT) has obtained some promising results. A scale-adaptive CT method based on multifeature integration is presented to improve the robustness and accuracy of CT. We introduce a keypoint-based model to achieve the accurate scale estimation, which can additionally give a prior location of the target. Furthermore, by the high efficiency of data-independent random projection matrix, multiple features are integrated into an effective appearance model to construct the naïve Bayes classifier. At last, an adaptive update scheme is proposed to update the classifier conservatively. Experiments on various challenging sequences demonstrate substantial improvements by our proposed tracker over CT and other state-of-the-art trackers in terms of dealing with scale variation, abrupt motion, deformation, and illumination changes.
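    A minimal sketch of the compressive feature step: a sparse, data-independent random projection compresses high-dimensional patch features, and per-feature Gaussian statistics for a naive Bayes classifier are updated conservatively. Filter banks, scale handling and the keypoint model of the paper are omitted; dimensions and learning rate are assumptions:

        import numpy as np

        rng = np.random.default_rng(0)
        d_high, d_low = 10_000, 50

        # Very sparse, data-independent random projection matrix with entries in {-1, 0, +1}.
        R = rng.choice([-1.0, 0.0, 1.0], size=(d_low, d_high), p=[1/6, 2/3, 1/6])

        def compress(patch_features):
            return R @ patch_features

        def update(mu, sigma, samples, lr=0.15):
            # Conservative (low learning rate) update of the Gaussian parameters.
            mu_new, sigma_new = samples.mean(axis=0), samples.std(axis=0) + 1e-6
            mu = (1 - lr) * mu + lr * mu_new
            sigma = np.sqrt((1 - lr) * sigma**2 + lr * sigma_new**2)
            return mu, sigma

        pos = np.stack([compress(rng.random(d_high)) for _ in range(20)])
        mu_pos, sig_pos = pos.mean(axis=0), pos.std(axis=0) + 1e-6
        mu_pos, sig_pos = update(mu_pos, sig_pos, pos)
        print(mu_pos.shape, sig_pos.shape)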

  16. Shear wave speed estimation by adaptive random sample consensus method.

    PubMed

    Lin, Haoming; Wang, Tianfu; Chen, Siping

    2014-01-01

    This paper describes a new method for shear wave velocity estimation that is capable of excluding outliers automatically without a preset threshold. The proposed method is an adaptive random sample consensus (ARANDSAC), and the metric used here finds a certain percentage of inliers according to the closest-distance criterion. To evaluate the method, the simulation and phantom experiment results were compared with linear regression with all points (LRWAP) and the radon sum transform (RS) method. The assessment reveals that the relative biases of mean estimation are 20.00%, 4.67% and 5.33% for LRWAP, ARANDSAC and RS respectively for the simulation, and 23.53%, 4.08% and 1.08% for the phantom experiment. The results suggest that the proposed ARANDSAC algorithm is accurate in shear wave speed estimation.
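    A minimal RANSAC-style sketch of the speed estimate: fit arrival time versus lateral position with random two-point samples and keep the fit with the most inliers. The adaptive inlier-percentage rule of ARANDSAC is replaced here by a fixed distance threshold, and all numbers are synthetic:

        import numpy as np

        rng = np.random.default_rng(0)
        x = np.linspace(0, 10e-3, 40)                     # lateral positions [m]
        t = x / 2.5 + rng.normal(0, 2e-5, x.size)         # arrival times [s], true speed 2.5 m/s
        t[::7] += 5e-4                                    # a few outliers

        best_inliers, best_fit = 0, None
        for _ in range(500):
            i, j = rng.choice(x.size, 2, replace=False)
            slope = (t[j] - t[i]) / (x[j] - x[i])
            intercept = t[i] - slope * x[i]
            resid = np.abs(t - (slope * x + intercept))
            inliers = resid < 5e-5                        # fixed threshold stand-in
            if inliers.sum() > best_inliers:
                best_inliers = inliers.sum()
                best_fit = np.polyfit(x[inliers], t[inliers], 1)

        speed = 1.0 / best_fit[0]                         # fitted slope is 1/c
        print(f"estimated shear wave speed: {speed:.2f} m/s")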

  17. Construction of digital core by adaptive porosity method

    NASA Astrophysics Data System (ADS)

    Xia, Huifen; Liu, Ting; Zhao, Ling; Sun, Yanyu; Pan, Junliang

    2017-05-01

    The construction of a digital core has unique advantages in the study of water flooding or polymer flooding oil displacement efficiency. The frequency distribution of pore size was measured by a mercury injection experiment, the coordination number by CT scanning, and the wettability data by imbibition displacement. On this basis, taking the pore-throat ratio and wettability into account, the principle of adaptive porosity is used to construct the digital core. The results show that the water flooding recovery and the polymer flooding recovery are in good agreement with the results of the physical simulation experiment.

  18. Wavefront detection method of a single-sensor based adaptive optics system.

    PubMed

    Wang, Chongchong; Hu, Lifa; Xu, Huanyu; Wang, Yukun; Li, Dayu; Wang, Shaoxin; Mu, Quanquan; Yang, Chengliang; Cao, Zhaoliang; Lu, Xinghai; Xuan, Li

    2015-08-10

    In adaptive optics systems (AOS) for optical telescopes, the reported wavefront sensing strategy consists of two parts: a specific sensor for tip-tilt (TT) detection and another wavefront sensor for detecting other distortions. Thus, a part of the incident light has to be used for TT detection, which decreases the light energy available to the wavefront sensor and eventually reduces the precision of wavefront correction. In this paper, a wavefront measurement method based on a single Shack-Hartmann wavefront sensor is presented for measuring both large-amplitude TT and other distortions. Experiments were performed to test the presented wavefront measurement method and to validate the wavefront detection and correction ability of the single-sensor based AOS. With adaptive correction, the root-mean-square of the residual TT was less than 0.2 λ, and a clear image was obtained in the lab. Installed on a 1.23-meter optical telescope, the AOS clearly resolved binary stars with an angular separation of 0.6″. This wavefront measurement method removes the separate TT sensor, which not only simplifies the AOS but also saves light energy for subsequent wavefront sensing and imaging, and eventually improves the detection and imaging capability of the AOS.

  19. Cis and trans RET signaling control the survival and central projection growth of rapidly adapting mechanoreceptors

    PubMed Central

    Fleming, Michael S; Vysochan, Anna; Paixão, Sónia; Niu, Jingwen; Klein, Rüdiger; Savitt, Joseph M; Luo, Wenqin

    2015-01-01

    RET can be activated in cis or trans by its co-receptors and ligands in vitro, but the physiological roles of trans signaling are unclear. Rapidly adapting (RA) mechanoreceptors in dorsal root ganglia (DRGs) express Ret and the co-receptor Gfrα2 and depend on Ret for survival and central projection growth. Here, we show that Ret and Gfrα2 null mice display comparable early central projection deficits, but Gfrα2 null RA mechanoreceptors recover later. Loss of Gfrα1, the co-receptor implicated in activating RET in trans, causes no significant central projection or cell survival deficit, but Gfrα1;Gfrα2 double nulls phenocopy Ret nulls. Finally, we demonstrate that GFRα1 produced by neighboring DRG neurons activates RET in RA mechanoreceptors. Taken together, our results suggest that trans and cis RET signaling could function in the same developmental process and that the availability of both forms of activation likely enhances but not diversifies outcomes of RET signaling. DOI: http://dx.doi.org/10.7554/eLife.06828.001 PMID:25838128

  20. The GULLS project: a comparison of vulnerabilities across selected ocean hotspots and implications for adaptation to global change.

    NASA Astrophysics Data System (ADS)

    Cochrane, K.; Hobday, A. J.; Aswani, S.; Byfield, V.; Dutra, L.; Gasalla, M.; Haward, M.; Paytan, A.; Pecl, G.; Plaganyi-Lloyd, E.; Popova, K.; Salim, S. S.; Savage, C.; Sauer, W.; van Putten, I. E.; Visser, N.; Team, T G

    2016-12-01

    The GULLS project, `Global learning for local solutions: Reducing vulnerability of marine-dependent coastal communities' has been underway since October 2014. The project has been investigating six regional `hotspots': marine areas experiencing rapid warming. These are south-east Australia, Brazil, India, Solomon Islands, South Africa, and the Mozambique Channel and Madagascar. Rapid warming could be expected to have social, cultural and economic impacts that could affect these countries in different ways and may already be doing so. GULLS has focused on contributing to assessing and reducing the vulnerability of coastal communities and other stakeholders dependent on marine resources and to facilitate adaptation to climate change and variability through an integrated and trans-disciplinary approach. It includes participants from Australia, Brazil, India, Madagascar, New Zealand, South Africa, the United Kingdom and the United States of America. The research programme has been divided into six inter-linked components: ocean models, biological and ecological sensitivity analyses, system models, social vulnerability, policy mapping, and communication and education. This presentation will provide a brief overview of each of these components and describe the benefits that have resulted from the collaborative and transdisciplinary approach of GULLS. Following the standard vulnerability elements of exposure, sensitivity and adaptive capacity, the vulnerabilities of coastal communities and other stakeholders dependent on marine resources in the five hotspots will be compared using a set of indicators derived and populated from results of the research programme. The implications of similarities and differences between the hotspots for adaptation planning and options will be described.

  1. Influence of porcelain firing and cementation on the marginal adaptation of metal-ceramic restorations prepared by different methods.

    PubMed

    Kaleli, Necati; Saraç, Duygu

    2017-05-01

    Marginal adaptation plays an important role in the survival of metal-ceramic restorations. Porcelain firings and cementation may affect the adaptation of restorations. Moreover, conventional casting procedures and casting imperfections may cause deteriorations in the marginal adaptation of metal-ceramic restorations. The purpose of this in vitro study was to compare the marginal adaptation after fabrication of the framework, porcelain application, and cementation of metal-ceramic restorations prepared by using the conventional lost-wax technique, milling, direct metal laser sintering (DMLS), and LaserCUSING, a direct process powder-bed system. Alterations in the marginal adaptation of the metal frameworks during the fabrication stages and the precision of fabrication methods were evaluated. Forty-eight metal dies simulating prepared premolar and molar abutment teeth were fabricated to investigate marginal adaptation. They were divided into 4 groups (n=12) according to the fabrication method used (group C serving as the control group: lost-wax method; group M: milling method; group LS: DMLS method; group DP: direct process powder-bed method). Sixty marginal discrepancy measurements were recorded separately on each abutment tooth after fabrication of the framework, porcelain application, and cementation by using a stereomicroscope. Thereafter, each group was divided into 3 subgroups according to the measurements recorded in each fabrication stage: subgroup F (framework), subgroup P (porcelain application), and subgroup C (cementation). Data were statistically analyzed with univariate analysis of variance, followed by 1-way ANOVA and the Tamhane T2 test (α=.05). The lowest marginal discrepancy values were observed in restorations prepared by using the direct process powder-bed method, and this was significantly different (P<.001) from the other methods. The highest marginal discrepancy values were recorded after the cementation procedure in all groups. The results

  2. An adaptive segment method for smoothing lidar signal based on noise estimation

    NASA Astrophysics Data System (ADS)

    Wang, Yuzhao; Luo, Pingping

    2014-10-01

    An adaptive segmentation smoothing method (ASSM) is introduced in this paper to smooth the signal and suppress the noise. In the ASSM, the noise is defined as the 3σ of the background signal. An integer number N is defined for finding the changing positions in the signal curve. If the difference between two adjacent points is greater than 3Nσ, the position is recorded as an end point of a smoothing segment. All the end points detected in this way are recorded, and the curves between them are smoothed separately. In the traditional method, the end points of the smoothing windows in the signals are fixed. The ASSM creates changing end points for different signals, so the smoothing windows can be set adaptively. The windows are always set to half of the segment, and then the average smoothing method is applied within each segment. An iterative process is required to reduce the end-point aberration effect of the average smoothing method, and two or three iterations are enough. In the ASSM, the signals are smoothed in the spatial domain rather than the frequency domain, which means frequency-domain disturbances are avoided. A lidar echo was simulated in the experimental work. The echo was assumed to be created by a space-borne lidar (e.g. CALIOP), and white Gaussian noise was added to the echo to represent the random noise resulting from the environment and the detector. The novel method, ASSM, was applied to the noisy echo to filter the noise. In the test, N was set to 3 and the number of iterations was two. The results show that the signal can be smoothed adaptively by the ASSM, but N and the number of iterations might need to be optimized when the ASSM is applied to a different lidar.
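    A minimal sketch of the segmentation idea: detect segment end points where the jump between adjacent samples exceeds 3Nσ, then average-smooth each segment with a window of half the segment length. The iterative end-point correction of the paper is simplified, and the step-like echo is synthetic:

        import numpy as np

        def assm(signal, sigma, N=3):
            jumps = np.abs(np.diff(signal)) > 3 * N * sigma
            ends = np.concatenate(([0], np.where(jumps)[0] + 1, [signal.size]))
            out = signal.astype(float).copy()
            for a, b in zip(ends[:-1], ends[1:]):
                seg = signal[a:b]
                win = max(1, seg.size // 2)           # window set to half the segment
                kernel = np.ones(win) / win
                out[a:b] = np.convolve(seg, kernel, mode="same")
            return out

        rng = np.random.default_rng(0)
        clean = np.concatenate([np.full(200, 5.0), np.full(200, 1.0)])  # step-like echo
        sigma = 0.2                                    # 1-sigma of the background noise
        noisy = clean + rng.normal(0, sigma, clean.size)
        smoothed = assm(noisy, sigma)
        print(smoothed[:3], smoothed[-3:])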

  3. Shack-Hartmann wavefront sensor with large dynamic range by adaptive spot search method.

    PubMed

    Shinto, Hironobu; Saita, Yusuke; Nomura, Takanori

    2016-07-10

    A Shack-Hartmann wavefront sensor (SHWFS) that consists of a microlens array and an image sensor has been used to measure the wavefront aberrations of human eyes. However, a conventional SHWFS has a finite dynamic range depending on the diameter of each microlens. The dynamic range cannot be easily expanded without a decrease of the spatial resolution. In this study, an adaptive spot search method to expand the dynamic range of an SHWFS is proposed. In the proposed method, spots are searched with the help of their approximate displacements measured with low spatial resolution and large dynamic range. With the proposed method, a wavefront can be correctly measured even if the spot is beyond the detection area. The adaptive spot search method is realized by using a special microlens array that generates both spots and discriminable patterns. The proposed method enables expanding the dynamic range of an SHWFS with a single shot and short processing time. The performance of the proposed method is compared with that of a conventional SHWFS by optical experiments. Furthermore, the dynamic range of the proposed method is quantitatively evaluated by numerical simulations.

  4. Adaptive spline autoregression threshold method in forecasting Mitsubishi car sales volume at PT Srikandi Diamond Motors

    NASA Astrophysics Data System (ADS)

    Susanti, D.; Hartini, E.; Permana, A.

    2017-01-01

    Growing sales competition between companies in Indonesia means that every company needs proper planning in order to win the competition with other companies. One of the things that can be done to design such a plan is to forecast car sales for the next few periods, so that the inventory of cars to be sold is proportional to the number of cars needed. One of the methods that can be used to obtain an accurate forecast is the Adaptive Spline Threshold Autoregression (ASTAR) method. Therefore, this discussion focuses on the use of the ASTAR method in forecasting the volume of car sales at PT Srikandi Diamond Motors using time series data. In this research, forecasting with the ASTAR method produced approximately correct values.

  5. Evaluation and comparison of an adaptive method technique for improved performance of linear Fresnel secondary designs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hack, Madeline; Zhu, Guangdong; Wendelin, Timothy J.

    As a line-focus concentrating solar power (CSP) technology, linear Fresnel collectors have the potential to become a low-cost solution for electricity production and a variety of thermal energy applications. However, this technology often suffers from relatively low performance. A secondary reflector is a key component used to improve optical performance of a linear Fresnel collector. The shape of a secondary reflector is particularly critical in determining solar power captured by the absorber tube(s), and thus, the collector's optical performance. However, to the authors' knowledge, no well-established process existed to derive the optimal secondary shape prior to the development of a new adaptive method to optimize the secondary reflector shape. The new adaptive method does not assume any pre-defined analytical form; rather, it constitutes an optimum shape through an adaptive process by maximizing the energy collection onto the absorber tube. In this paper, the adaptive method is compared with popular secondary-reflector designs with respect to a collector's optical performance under various scenarios. For the first time, a comprehensive, in-depth comparison was conducted on all popular secondary designs for CSP applications. In conclusion, it is shown that the adaptive design exhibits the best optical performance.

  6. Evaluation and comparison of an adaptive method technique for improved performance of linear Fresnel secondary designs

    DOE PAGES

    Hack, Madeline; Zhu, Guangdong; Wendelin, Timothy J.

    2017-09-13

    As a line-focus concentrating solar power (CSP) technology, linear Fresnel collectors have the potential to become a low-cost solution for electricity production and a variety of thermal energy applications. However, this technology often suffers from relatively low performance. A secondary reflector is a key component used to improve optical performance of a linear Fresnel collector. The shape of a secondary reflector is particularly critical in determining solar power captured by the absorber tube(s), and thus, the collector's optical performance. However, to the authors' knowledge, no well-established process existed to derive the optimal secondary shape prior to the development of a new adaptive method to optimize the secondary reflector shape. The new adaptive method does not assume any pre-defined analytical form; rather, it constitutes an optimum shape through an adaptive process by maximizing the energy collection onto the absorber tube. In this paper, the adaptive method is compared with popular secondary-reflector designs with respect to a collector's optical performance under various scenarios. For the first time, a comprehensive, in-depth comparison was conducted on all popular secondary designs for CSP applications. In conclusion, it is shown that the adaptive design exhibits the best optical performance.

  7. Improved scatter correction using adaptive scatter kernel superposition

    NASA Astrophysics Data System (ADS)

    Sun, M.; Star-Lack, J. M.

    2010-11-01

    Accurate scatter correction is required to produce high-quality reconstructions of x-ray cone-beam computed tomography (CBCT) scans. This paper describes new scatter kernel superposition (SKS) algorithms for deconvolving scatter from projection data. The algorithms are designed to improve upon the conventional approach whose accuracy is limited by the use of symmetric kernels that characterize the scatter properties of uniform slabs. To model scatter transport in more realistic objects, nonstationary kernels, whose shapes adapt to local thickness variations in the projection data, are proposed. Two methods are introduced: (1) adaptive scatter kernel superposition (ASKS) requiring spatial domain convolutions and (2) fast adaptive scatter kernel superposition (fASKS) where, through a linearity approximation, convolution is efficiently performed in Fourier space. The conventional SKS algorithm, ASKS, and fASKS, were tested with Monte Carlo simulations and with phantom data acquired on a table-top CBCT system matching the Varian On-Board Imager (OBI). All three models accounted for scatter point-spread broadening due to object thickening, object edge effects, detector scatter properties and an anti-scatter grid. Hounsfield unit (HU) errors in reconstructions of a large pelvis phantom with a measured maximum scatter-to-primary ratio over 200% were reduced from -90 ± 58 HU (mean ± standard deviation) with no scatter correction to 53 ± 82 HU with SKS, to 19 ± 25 HU with fASKS and to 13 ± 21 HU with ASKS. HU accuracies and measured contrast were similarly improved in reconstructions of a body-sized elliptical Catphan phantom. The results show that the adaptive SKS methods offer significant advantages over the conventional scatter deconvolution technique.
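    A highly simplified, stationary sketch of the kernel-superposition idea: the projection is convolved with a broad Gaussian stand-in kernel to estimate scatter, which is then subtracted iteratively. The adaptive (thickness-dependent, asymmetric) kernels of ASKS/fASKS are not modelled, and the kernel amplitude and width are assumed values:

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def sks_correct(projection, amplitude=0.3, sigma_px=40.0, iters=3):
            primary = projection.copy()
            for _ in range(iters):
                scatter = amplitude * gaussian_filter(primary, sigma_px)  # scatter estimate
                primary = np.clip(projection - scatter, 0, None)          # scatter-corrected primary
            return primary

        proj = np.ones((256, 256))               # hypothetical flat projection
        proj[64:192, 64:192] += 2.0              # object region with higher signal
        corrected = sks_correct(proj)
        print(corrected.mean())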

  8. Comparison of adaptive statistical iterative and filtered back projection reconstruction techniques in quantifying coronary calcium.

    PubMed

    Takahashi, Masahiro; Kimura, Fumiko; Umezawa, Tatsuya; Watanabe, Yusuke; Ogawa, Harumi

    2016-01-01

    Adaptive statistical iterative reconstruction (ASIR) has been used to reduce radiation dose in cardiac computed tomography. However, changes in image parameters caused by ASIR as compared to filtered back projection (FBP) may influence the quantification of coronary calcium. The aim was to investigate the influence of ASIR on calcium quantification in comparison to FBP. In 352 patients, CT images were reconstructed using FBP alone, FBP combined with ASIR 30%, 50%, 70%, and ASIR 100% based on the same raw data. Image noise, plaque density, Agatston scores and calcium volumes were compared among the techniques. Image noise, Agatston score, and calcium volume decreased significantly with ASIR compared to FBP (each P < 0.001). Use of ASIR reduced the Agatston score by 10.5% to 31.0%. In calcified plaques of both patients and a phantom, ASIR decreased maximum CT values and calcified plaque size. In comparison to FBP, adaptive statistical iterative reconstruction (ASIR) may significantly decrease Agatston scores and calcium volumes. Copyright © 2016 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.

  9. Projected Hybrid Orbitals: A General QM/MM Method

    PubMed Central

    2015-01-01

    A projected hybrid orbital (PHO) method was described to model the covalent boundary in a hybrid quantum mechanical and molecular mechanical (QM/MM) system. The PHO approach can be used in ab initio wave function theory and in density functional theory with any basis set without introducing system-dependent parameters. In this method, a secondary basis set on the boundary atom is introduced to formulate a set of hybrid atomic orbitals. The primary basis set on the boundary atom used for the QM subsystem is projected onto the secondary basis to yield a representation that provides a good approximation to the electron-withdrawing power of the primary basis set to balance electronic interactions between QM and MM subsystems. The PHO method has been tested on a range of molecules and properties. Comparison with results obtained from QM calculations on the entire system shows that the present PHO method is a robust and balanced QM/MM scheme that preserves the structural and electronic properties of the QM region. PMID:25317748

  10. A psychophysical comparison of two methods for adaptive histogram equalization.

    PubMed

    Zimmerman, J B; Cousins, S B; Hartzell, K M; Frisse, M E; Kahn, M G

    1989-05-01

    Adaptive histogram equalization (AHE) is a method for adaptive contrast enhancement of digital images. It is an automatic, reproducible method for the simultaneous viewing of contrast within a digital image with a large dynamic range. Recent experiments have shown that in specific cases, there is no significant difference in the ability of AHE and linear intensity windowing to display gray-scale contrast. More recently, a variant of AHE which limits the allowed contrast enhancement of the image has been proposed. This contrast-limited adaptive histogram equalization (CLAHE) produces images in which the noise content of an image is not excessively enhanced, but in which sufficient contrast is provided for the visualization of structures within the image. Images processed with CLAHE have a more natural appearance and facilitate the comparison of different areas of an image. However, the reduced contrast enhancement of CLAHE may hinder the ability of an observer to detect the presence of some significant gray-scale contrast. In this report, a psychophysical observer experiment was performed to determine if there is a significant difference in the ability of AHE and CLAHE to depict gray-scale contrast. Observers were presented with computed tomography (CT) images of the chest processed with AHE and CLAHE. Subtle artificial lesions were introduced into some images. The observers were asked to rate their confidence regarding the presence of the lesions; this rating-scale data was analyzed using receiver operating characteristic (ROC) curve techniques. These ROC curves were compared for significant differences in the observers' performances. In this report, no difference was found in the abilities of AHE and CLAHE to depict contrast information.
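    A minimal example of the contrast-limited variant discussed above, using OpenCV's CLAHE implementation on an 8-bit grayscale slice; the file path is hypothetical, and raising clipLimit moves the behaviour toward unlimited AHE:

        import cv2

        # Load an 8-bit grayscale CT slice (hypothetical path).
        img = cv2.imread("ct.png", cv2.IMREAD_GRAYSCALE)

        # Contrast-limited adaptive histogram equalization on 8x8 tiles.
        clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
        enhanced = clahe.apply(img)

        cv2.imwrite("ct_clahe.png", enhanced)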

  11. Adapting Physical Education Activities.

    ERIC Educational Resources Information Center

    Bundschuh, Ernest; And Others

    Designed to meet the requirements of recent federal legislation, the booklet describes Project DART, which provides services in adapted physical education for handicapped children in Georgia. The first section examines the state of the art in adapted physical education and reviews the mandates of Public Law 94-142 (the Education for All…

  12. A goal-based angular adaptivity method for thermal radiation modelling in non grey media

    NASA Astrophysics Data System (ADS)

    Soucasse, Laurent; Dargaville, Steven; Buchan, Andrew G.; Pain, Christopher C.

    2017-10-01

    This paper investigates for the first time a goal-based angular adaptivity method for thermal radiation transport, suitable for non grey media when the radiation field is coupled with an unsteady flow field through an energy balance. Anisotropic angular adaptivity is achieved by using a Haar wavelet finite element expansion that forms a hierarchical angular basis with compact support and does not require any angular interpolation in space. The novelty of this work lies in (1) the definition of a target functional to compute the goal-based error measure equal to the radiative source term of the energy balance, which is the quantity of interest in the context of coupled flow-radiation calculations; (2) the use of different optimal angular resolutions for each absorption coefficient class, built from a global model of the radiative properties of the medium. The accuracy and efficiency of the goal-based angular adaptivity method is assessed in a coupled flow-radiation problem relevant for air pollution modelling in street canyons. Compared to a uniform Haar wavelet expansion, the adapted resolution uses 5 times fewer angular basis functions and is 6.5 times quicker, given the same accuracy in the radiative source term.

  13. Local initiatives and adaptation to climate change.

    PubMed

    Blanco, Ana V Rojas

    2006-03-01

    Climate change is expected to lead to an increase in the number and strength of natural hazards produced by climatic events. This paper presents some examples of the experiences of community-based organisations (CBOs) and non-governmental organisations (NGOs) of variations in climate, and looks at how they have incorporated their findings into the design and implementation of local adaptation strategies. Local organisations integrate climate change and climatic hazards into the design and development of their projects as a means of adapting to their new climatic situation. Projects designed to boost the resilience of local livelihoods are good examples of local adaptation strategies. To upscale these adaptation initiatives, there is a need to improve information exchange between CBOs, NGOs and academia. Moreover, there is a need to bridge the gap between scientific and local knowledge in order to create projects capable of withstanding stronger natural hazards.

  14. A NOISE ADAPTIVE FUZZY EQUALIZATION METHOD FOR PROCESSING SOLAR EXTREME ULTRAVIOLET IMAGES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Druckmueller, M., E-mail: druckmuller@fme.vutbr.cz

    A new image enhancement tool ideally suited for the visualization of fine structures in extreme ultraviolet images of the corona is presented in this paper. The Noise Adaptive Fuzzy Equalization method is particularly suited for the exceptionally high dynamic range images from the Atmospheric Imaging Assembly instrument on the Solar Dynamics Observatory. This method produces artifact-free images and gives significantly better results than methods based on convolution or Fourier transform which are often used for that purpose.

  15. An adaptive band selection method for dimension reduction of hyper-spectral remote sensing image

    NASA Astrophysics Data System (ADS)

    Yu, Zhijie; Yu, Hui; Wang, Chen-sheng

    2014-11-01

    Hyper-spectral remote sensing data are acquired by imaging the same area at multiple wavelengths, and a data set normally consists of hundreds of band images. Hyper-spectral images provide not only spatial information but also high-resolution spectral information, and they have been widely used in environment monitoring, mineral investigation and military reconnaissance. However, because of the correspondingly large data volume, hyper-spectral images are very difficult to transmit and store. A hyper-spectral image dimension reduction technique is desired to resolve this problem. Because of the high correlation and high redundancy between hyper-spectral bands, applying a dimension reduction method to compress the data volume is very feasible. This paper proposes a novel band selection-based dimension reduction method which can adaptively select the bands that contain more information and details. The proposed method is based on principal component analysis (PCA) and computes an index corresponding to every band. The indexes obtained are then ranked in order of magnitude from large to small. Based on a threshold, the system can adaptively and reasonably select the bands. The proposed method overcomes the shortcomings of transform-based dimension reduction methods and prevents the original spectral information from being lost. The performance of the proposed method has been validated by several experiments. The experimental results show that the proposed algorithm can reduce the dimensions of a hyper-spectral image with little information loss by adaptively selecting the band images.
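    A minimal sketch of PCA-based band ranking: each band receives an index built from its loadings on the leading principal components and bands above a threshold are kept. The exact index of the paper is not specified here; a variance-weighted loading magnitude and a mean-value threshold are used as stand-ins, and the data cube is synthetic:

        import numpy as np

        rng = np.random.default_rng(0)
        n_pixels, n_bands = 5000, 120
        cube = rng.random((n_pixels, n_bands))          # hypothetical hyperspectral data (pixels x bands)

        X = cube - cube.mean(axis=0)
        cov = X.T @ X / (n_pixels - 1)
        evals, evecs = np.linalg.eigh(cov)              # ascending order
        evals, evecs = evals[::-1], evecs[:, ::-1]

        k = 5                                           # leading components considered
        index = (np.abs(evecs[:, :k]) * evals[:k]).sum(axis=1)   # per-band index
        order = np.argsort(index)[::-1]                 # rank from large to small
        selected = order[index[order] > index.mean()]   # adaptive threshold (mean used here)
        print("bands selected:", selected.size)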

  16. New adaptive method to optimize the secondary reflector of linear Fresnel collectors

    DOE PAGES

    Zhu, Guangdong

    2017-01-16

    Performance of linear Fresnel collectors may largely depend on the secondary-reflector profile design when small-aperture absorbers are used. Optimization of the secondary-reflector profile is an extremely challenging task because there is no established theory to ensure superior performance of derived profiles. In this work, an innovative optimization method is proposed to optimize the secondary-reflector profile of a generic linear Fresnel configuration. The method correctly and accurately captures the impacts of both geometric and optical aspects of a linear Fresnel collector on secondary-reflector design. The proposed method is an adaptive approach that does not assume a secondary shape of any particular form, but rather, starts at a single edge point and adaptively constructs the next surface point to maximize the power reflected to the absorber(s). As a test case, the proposed optimization method is applied to an industrial linear Fresnel configuration, and the results show that the derived optimal secondary reflector is able to redirect more than 90% of the power to the absorber in a wide range of incidence angles. The proposed method can be naturally extended to other types of solar collectors as well, and it will be a valuable tool for solar-collector designs with a secondary reflector.
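    A much-simplified 2-D sketch of the adaptive construction idea: starting from an edge point, each new surface point is chosen so that the local specular normal redirects a single representative incoming ray toward the absorber centre, and the profile advances a small step along the implied tangent. The real method maximizes collected power over the full incident ray set; all coordinates and step sizes below are assumptions:

        import numpy as np

        absorber = np.array([0.0, 0.0])          # absorber centre (arbitrary units)
        point = np.array([0.08, 0.05])           # starting edge point of the secondary
        incoming = np.array([0.0, 1.0])          # representative incident ray direction (upward)

        profile = [point.copy()]
        step = 2e-3
        for _ in range(60):
            to_abs = absorber - point
            to_abs /= np.linalg.norm(to_abs)
            normal = to_abs - incoming           # specular law: normal along (reflected - incident)
            normal /= np.linalg.norm(normal)
            tangent = np.array([-normal[1], normal[0]])
            if tangent[0] > 0:                   # march consistently toward -x
                tangent = -tangent
            point = point + step * tangent
            profile.append(point.copy())

        profile = np.array(profile)
        print(profile[:3], profile[-1])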

  17. New adaptive method to optimize the secondary reflector of linear Fresnel collectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Guangdong

    Performance of linear Fresnel collectors may largely depend on the secondary-reflector profile design when small-aperture absorbers are used. Optimization of the secondary-reflector profile is an extremely challenging task because there is no established theory to ensure superior performance of derived profiles. In this work, an innovative optimization method is proposed to optimize the secondary-reflector profile of a generic linear Fresnel configuration. The method correctly and accurately captures the impacts of both geometric and optical aspects of a linear Fresnel collector on secondary-reflector design. The proposed method is an adaptive approach that does not assume a secondary shape of any particular form, but rather, starts at a single edge point and adaptively constructs the next surface point to maximize the power reflected to the absorber(s). As a test case, the proposed optimization method is applied to an industrial linear Fresnel configuration, and the results show that the derived optimal secondary reflector is able to redirect more than 90% of the power to the absorber in a wide range of incidence angles. The proposed method can be naturally extended to other types of solar collectors as well, and it will be a valuable tool for solar-collector designs with a secondary reflector.

  18. Adaptive Elastic Net for Generalized Methods of Moments.

    PubMed

    Caner, Mehmet; Zhang, Hao Helen

    2014-01-30

    Model selection and estimation are crucial parts of econometrics. This paper introduces a new technique that can simultaneously estimate and select the model in the generalized method of moments (GMM) context. The GMM is particularly powerful for analyzing complex data sets such as longitudinal and panel data, and it has wide applications in econometrics. This paper extends the least squares based adaptive elastic net estimator of Zou and Zhang (2009) to nonlinear equation systems with endogenous variables. The extension is not trivial and involves a new proof technique because the estimators lack closed-form solutions. Compared to the Bridge-GMM of Caner (2009), we allow the number of parameters to diverge to infinity as well as collinearity among a large number of variables, and the redundant parameters are set to zero via a data-dependent technique. This method has the oracle property, meaning that we can estimate the nonzero parameters with their standard limit and the redundant parameters are dropped from the equations simultaneously. Numerical examples are used to illustrate the performance of the new method.
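    A least-squares illustration of the adaptive elastic net idea only (the paper's GMM extension with endogenous regressors is not reproduced): a first-stage ridge estimate yields data-dependent weights, which are folded into a standard elastic net by the common column-rescaling trick, so redundant coefficients shrink to zero. Tuning values are arbitrary:

        import numpy as np
        from sklearn.linear_model import ElasticNet, Ridge

        rng = np.random.default_rng(0)
        n, p = 200, 30
        X = rng.standard_normal((n, p))
        beta_true = np.zeros(p)
        beta_true[:4] = [3.0, -2.0, 1.5, 1.0]          # only 4 nonzero parameters
        y = X @ beta_true + rng.normal(0, 1.0, n)

        beta_init = Ridge(alpha=1.0).fit(X, y).coef_   # first-stage estimate
        w = np.abs(beta_init) ** (-1.0) + 1e-8         # adaptive, data-dependent weights
        X_scaled = X / w                               # fold the weights into the design

        fit = ElasticNet(alpha=0.1, l1_ratio=0.7).fit(X_scaled, y)
        beta_hat = fit.coef_ / w                       # unscale the coefficients
        print(np.round(beta_hat, 2))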

  19. Evaluation of Adaptive Subdivision Method on Mobile Device

    NASA Astrophysics Data System (ADS)

    Rahim, Mohd Shafry Mohd; Isa, Siti Aida Mohd; Rehman, Amjad; Saba, Tanzila

    2013-06-01

    Recently, there have been significant improvements in the capabilities of mobile devices, but rendering a large 3D object is still tedious because of the resource constraints of mobile devices. To reduce the storage requirement, the 3D object is simplified, but certain areas of curvature are compromised and the surface will not be smooth. Therefore, a method to smooth selected areas of curvature is implemented. One of the popular methods is the adaptive subdivision method. Experiments are performed using two data sets, with results based on processing time, rendering speed and the appearance of the object on the devices. The results show a drop in frame rate performance due to the increase in the number of triangles with each level of iteration, while the processing time for generating the new mesh also significantly increases. Since there is a difference in screen size between the devices, the surface on the iPhone appears to have more triangles and be more compact than the surface displayed on the iPad.

  20. Impact of agile methodologies on team capacity in automotive radio-navigation projects

    NASA Astrophysics Data System (ADS)

    Prostean, G.; Hutanu, A.; Volker, S.

    2017-01-01

    The development processes used in automotive radio-navigation projects are constantly under adaptation pressure. While the software development models are based on automotive production processes, the integration of peripheral components into an automotive system triggers a high number of requirement modifications. The use of traditional development models in the automotive industry brings a team's development capacity to its limits. The root cause lies in the inflexibility of the current processes and their adaptation limits. This paper addresses a new project management approach for the development of radio-navigation projects. Understanding the weaknesses of the currently used models helped us develop and integrate agile methodologies into the traditional development model structure. In the first part we focus on change management methods to reduce the inflow of requests for change. Established change management risk analysis processes enable project management to judge the impact of a requirement change and also give the project time to implement some changes. However, in big automotive radio-navigation projects the saved time is not enough to implement the large number of changes that are submitted to the project. In the second part of this paper we focus on increasing team capacity by integrating agile methodologies into the traditional model at critical project phases. The overall objective of this paper is to demonstrate the need for process adaptation in order to solve project team capacity bottlenecks.

  1. Adaptive target binarization method based on a dual-camera system

    NASA Astrophysics Data System (ADS)

    Lei, Jing; Zhang, Ping; Xu, Jiangtao; Gao, Zhiyuan; Gao, Jing

    2018-01-01

    An adaptive target binarization method based on a dual-camera system that contains two dynamic vision sensors was proposed. First, a preprocessing procedure of denoising is introduced to remove the noise events generated by the sensors. Then, the complete edge of the target is retrieved and represented by events based on an event mosaicking method. Third, the region of the target is confirmed by an event-to-event method. Finally, a postprocessing procedure of image open and close operations of morphology methods is adopted to remove the artifacts caused by event-to-event mismatching. The proposed binarization method has been extensively tested on numerous degraded images with nonuniform illumination, low contrast, noise, or light spots and successfully compared with other well-known binarization methods. The experimental results, which are based on visual and misclassification error criteria, show that the proposed method performs well and has better robustness on the binarization of degraded images.
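    A minimal sketch of the morphological post-processing step only: opening removes small spurious foreground specks and closing fills pinholes in the binarized target mask. The event-based steps of the dual-camera pipeline are not reproduced, and the mask below is synthetic:

        import cv2
        import numpy as np

        mask = np.zeros((120, 160), np.uint8)
        cv2.rectangle(mask, (40, 30), (120, 90), 255, -1)   # hypothetical target region
        mask[50, 50] = 0                                    # pinhole artifact inside the target
        mask[5, 5] = 255                                    # isolated speck artifact

        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
        opened = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)    # remove specks
        cleaned = cv2.morphologyEx(opened, cv2.MORPH_CLOSE, kernel)  # fill pinholes
        print(int(cleaned[5, 5]), int(cleaned[50, 50]))     # speck removed, hole filled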

  2. Creating Schools and Strengthening Communities through Adaptive Reuse.

    ERIC Educational Resources Information Center

    Spector, Stephen

    This publication focuses on four school adaptive reuse projects--in Phoenix, Arizona; Wake County, North Carolina; Pomona, California; and Trenton, New Jersey. Together, the projects illustrate the many benefits of adaptive reuse and show that mainstream school districts can meet the regulatory and political challenges necessary to make such…

  3. Climate trends and projections for the Andean Altiplano and strategies for adaptation

    NASA Astrophysics Data System (ADS)

    Valdivia, C.; Thibeault, J.; Gilles, J. L.; García, M.; Seth, A.

    2013-04-01

    Climate variability and change impact production in rainfed agricultural systems of the Bolivian highlands. Maximum temperature trends are increasing for the Altiplano. Minimum temperature increases are significant in the northern region, and decreases are significant in the southern region. Producers' perceptions of climate hazards are high in the central region, while concerns with changing climate and unemployment are high in the north. Similar high-risk perceptions involve pests and diseases in both regions. Altiplano climate projections for end-of-century highlights include increases in temperature, extreme event frequency, change in the timing of rainfall, and reduction of soil humidity. Successful adaptation to these changes will require the development of links between the knowledge systems of producers and scientists. Two-way participatory approaches to develop capacity and information that involve decision makers and scientists are appropriate approaches in this context of increased risk, uncertainty and vulnerability.

  4. Adaptive Interfaces

    DTIC Science & Technology

    1991-03-01

    This Final Report for Project 7288, Adaptive Interfaces, Task A, was prepared by IIT Research Institute (IITRI) under contract F30602-87-D-0094, sponsored by the Rome Air Development Center (RADC). Project Engineer: Sharon M. Walter.

  5. Ideas Identified and Distributed through Project IDEA.

    ERIC Educational Resources Information Center

    American Alliance for Health, Physical Education, and Recreation, Washington, DC.

    This document contains ideas on a variety of subjects directed at the physical educator. The work was compiled by Project IDEA (Identify, Distribute, Exchange for Action). Topics include the following: (a) scheduling, (b) curriculum, (c) games, (d) specific courses, (e) life sports, (f) fitness, (g) adaptive Physical education, (h) course methods,…

  6. An optimization-based framework for anisotropic simplex mesh adaptation

    NASA Astrophysics Data System (ADS)

    Yano, Masayuki; Darmofal, David L.

    2012-09-01

    We present a general framework for anisotropic h-adaptation of simplex meshes. Given a discretization and any element-wise, localizable error estimate, our adaptive method iterates toward a mesh that minimizes the error for a given number of degrees of freedom. Utilizing mesh-metric duality, we consider a continuous optimization problem of the Riemannian metric tensor field that provides an anisotropic description of element sizes. First, our method performs a series of local solves to survey the behavior of the local error function. This information is then synthesized using an affine-invariant tensor manipulation framework to reconstruct an approximate gradient of the error function with respect to the metric tensor field. Finally, we perform gradient descent in the metric space to drive the mesh toward optimality. The method is first demonstrated to produce optimal anisotropic meshes minimizing the L2 projection error for a pair of canonical problems containing a singularity and a singular perturbation. The effectiveness of the framework is then demonstrated in the context of output-based adaptation for the advection-diffusion equation using a high-order discontinuous Galerkin discretization and the dual-weighted residual (DWR) error estimate. The method presented provides a unified framework for optimizing both the element size and anisotropy distribution using an a posteriori error estimate and enables efficient adaptation of anisotropic simplex meshes for high-order discretizations.

  7. The adaptive buffered force QM/MM method in the CP2K and AMBER software packages

    PubMed Central

    Mones, Letif; Jones, Andrew; Götz, Andreas W; Laino, Teodoro; Walker, Ross C; Leimkuhler, Ben; Csányi, Gábor; Bernstein, Noam

    2015-01-01

    The implementation and validation of the adaptive buffered force (AdBF) quantum-mechanics/molecular-mechanics (QM/MM) method in two popular packages, CP2K and AMBER are presented. The implementations build on the existing QM/MM functionality in each code, extending it to allow for redefinition of the QM and MM regions during the simulation and reducing QM-MM interface errors by discarding forces near the boundary according to the buffered force-mixing approach. New adaptive thermostats, needed by force-mixing methods, are also implemented. Different variants of the method are benchmarked by simulating the structure of bulk water, water autoprotolysis in the presence of zinc and dimethyl-phosphate hydrolysis using various semiempirical Hamiltonians and density functional theory as the QM model. It is shown that with suitable parameters, based on force convergence tests, the AdBF QM/MM scheme can provide an accurate approximation of the structure in the dynamical QM region matching the corresponding fully QM simulations, as well as reproducing the correct energetics in all cases. Adaptive unbuffered force-mixing and adaptive conventional QM/MM methods also provide reasonable results for some systems, but are more likely to suffer from instabilities and inaccuracies. © 2015 The Authors. Journal of Computational Chemistry Published by Wiley Periodicals, Inc. PMID:25649827

  8. The adaptive buffered force QM/MM method in the CP2K and AMBER software packages

    DOE PAGES

    Mones, Letif; Jones, Andrew; Götz, Andreas W.; ...

    2015-02-03

    We present the implementation and validation of the adaptive buffered force (AdBF) quantum-mechanics/molecular-mechanics (QM/MM) method in two popular packages, CP2K and AMBER. The implementations build on the existing QM/MM functionality in each code, extending it to allow for redefinition of the QM and MM regions during the simulation and reducing QM-MM interface errors by discarding forces near the boundary according to the buffered force-mixing approach. New adaptive thermostats, needed by force-mixing methods, are also implemented. Different variants of the method are benchmarked by simulating the structure of bulk water, water autoprotolysis in the presence of zinc and dimethyl-phosphate hydrolysis using various semiempirical Hamiltonians and density functional theory as the QM model. It is shown that with suitable parameters, based on force convergence tests, the AdBF QM/MM scheme can provide an accurate approximation of the structure in the dynamical QM region matching the corresponding fully QM simulations, as well as reproducing the correct energetics in all cases. Adaptive unbuffered force-mixing and adaptive conventional QM/MM methods also provide reasonable results for some systems, but are more likely to suffer from instabilities and inaccuracies.

  9. The adaptive buffered force QM/MM method in the CP2K and AMBER software packages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mones, Letif; Jones, Andrew; Götz, Andreas W.

    We present the implementation and validation of the adaptive buffered force (AdBF) quantum-mechanics/molecular-mechanics (QM/MM) method in two popular packages, CP2K and AMBER. The implementations build on the existing QM/MM functionality in each code, extending it to allow for redefinition of the QM and MM regions during the simulation and reducing QM-MM interface errors by discarding forces near the boundary according to the buffered force-mixing approach. New adaptive thermostats, needed by force-mixing methods, are also implemented. Different variants of the method are benchmarked by simulating the structure of bulk water, water autoprotolysis in the presence of zinc and dimethyl-phosphate hydrolysis using various semiempirical Hamiltonians and density functional theory as the QM model. It is shown that with suitable parameters, based on force convergence tests, the AdBF QM/MM scheme can provide an accurate approximation of the structure in the dynamical QM region matching the corresponding fully QM simulations, as well as reproducing the correct energetics in all cases. Adaptive unbuffered force-mixing and adaptive conventional QM/MM methods also provide reasonable results for some systems, but are more likely to suffer from instabilities and inaccuracies.

  10. Evaluation Guidelines for Service and Methods Demonstration Projects

    DOT National Transportation Integrated Search

    1976-02-01

    The document consists of evaluation guidelines for planning, implementing, and reporting the findings of the evaluation of Service and Methods Demonstration (SMD) projects sponsored by the Urban Mass Transportation Administration (UMTA). The objectiv...

  11. Evaluation of Load Analysis Methods for NASA's GIII Adaptive Compliant Trailing Edge Project

    NASA Technical Reports Server (NTRS)

    Cruz, Josue; Miller, Eric J.

    2016-01-01

    The Air Force Research Laboratory (AFRL), NASA Armstrong Flight Research Center (AFRC), and FlexSys Inc. (Ann Arbor, Michigan) have collaborated to flight test the Adaptive Compliant Trailing Edge (ACTE) flaps. These flaps were installed on a Gulfstream Aerospace Corporation (GAC) GIII aircraft and tested at AFRC at various deflection angles over a range of flight conditions. External aerodynamic and inertial load analyses were conducted with the intention to ensure that the change in wing loads due to the deployed ACTE flap did not overload the existing baseline GIII wing box structure. The objective of this paper was to substantiate the analysis tools used for predicting wing loads at AFRC. Computational fluid dynamics (CFD) models and distributed mass inertial models were developed for predicting the loads on the wing. The analysis tools included TRANAIR (full potential) and CMARC (panel) models. Aerodynamic pressure data from the analysis codes were validated against static pressure port data collected in-flight. Combined results from the CFD predictions and the inertial load analysis were used to predict the normal force, bending moment, and torque loads on the wing. Wing loads obtained from calibrated strain gages installed on the wing were used for substantiation of the load prediction tools. The load predictions exhibited good agreement compared to the flight load results obtained from calibrated strain gage measurements.

  12. A Meta-Analysis of Local Climate Change Adaptation Actions ...

    EPA Pesticide Factsheets

    Local governments are beginning to take steps to address the consequences of climate change, such as sea level rise and heat events. However, we do not have a clear understanding of what local governments are doing -- the extent to which they expect climate change to affect their community, the types of actions they have in place to address climate change, and the resources at their disposal for implementation. Several studies have been conducted by academics, non-governmental organizations, and public agencies to assess the status of local climate change adaptation. This project collates the findings from dozens of such studies to conduct a meta-analysis of local climate change adaptation actions. The studies will be characterized along several dimensions, including (a) methods used, (b) timing and geographic scope, (c) topics covered, (d) types of adaptation actions identified, (e) implementation status, and (f) public engagement and environmental justice dimensions considered. The poster presents the project's rationale and approach and some illustrative findings from early analyses. [Note: The document being reviewed is an abstract in which a poster is being proposed. The poster will enter clearance if the abstract is accepted] The purpose of this poster is to present the research framework and approaches I am developing for my ORISE postdoctoral project, and to get feedback on early analyses.

  13. Adaptation of LASCA method for diagnostics of malignant tumours in laboratory animals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ul'yanov, S S; Laskavyi, V N; Glova, Alina B

    The LASCA method is adapted for diagnostics of malignant neoplasms in laboratory animals. Tumours are studied in mice of Balb/c inbred line after inoculation of cells of syngeneic myeloma cell line Sp.2/0 Ag.8. The appropriateness of using the tLASCA method in tumour investigations is substantiated; its advantages in comparison with the sLASCA method are demonstrated. It is found that the most informative characteristic, indicating the presence of a tumour, is the fractal dimension of LASCA images.

  14. Nonlinear adaptive control system design with asymptotically stable parameter estimation error

    NASA Astrophysics Data System (ADS)

    Mishkov, Rumen; Darmonski, Stanislav

    2018-01-01

    The paper presents a new general method for nonlinear adaptive system design with asymptotic stability of the parameter estimation error. The advantages of the approach include asymptotic unknown parameter estimation without persistent excitation and the capability to directly control the estimates' transient response time. The method proposed modifies the basic parameter estimation dynamics designed via a known nonlinear adaptive control approach. The modification is based on the generalised prediction error, a priori constraints with a hierarchical parameter projection algorithm, and the stable data accumulation concepts. The data accumulation principle is the main tool for achieving asymptotic unknown parameter estimation. It relies on the parametric identifiability property introduced for the system. Necessary and sufficient conditions for exponential stability of the data accumulation dynamics are derived. The approach is applied in a nonlinear adaptive speed tracking vector control of a three-phase induction motor.
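
    To illustrate the projection ingredient mentioned in the abstract, the following minimal Python sketch shows a gradient-type parameter estimator whose estimates are clipped onto a priori box constraints after every update. The regressor, gains and bounds are invented for the example, and the hierarchical projection and data accumulation machinery of the paper is not reproduced.

    import numpy as np

    def project_estimate(theta_hat, theta_min, theta_max):
        """Project a parameter estimate onto elementwise a priori bounds."""
        return np.clip(theta_hat, theta_min, theta_max)

    # Toy adaptation loop: gradient-type update followed by projection.
    rng = np.random.default_rng(0)
    theta_true = np.array([2.0, -1.0])            # unknown parameters (simulation only)
    theta_hat = np.zeros(2)                       # initial estimate
    theta_min = np.array([0.0, -3.0])             # assumed a priori lower bounds
    theta_max = np.array([5.0, 0.0])              # assumed a priori upper bounds
    gamma = 0.1                                   # adaptation gain

    for _ in range(200):
        phi = rng.normal(size=2)                  # regressor sample
        y = phi @ theta_true                      # measured output
        e = y - phi @ theta_hat                   # prediction error
        theta_hat = theta_hat + gamma * e * phi   # gradient update
        theta_hat = project_estimate(theta_hat, theta_min, theta_max)

    print("estimate:", theta_hat)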

  15. Component model reduction via the projection and assembly method

    NASA Technical Reports Server (NTRS)

    Bernard, Douglas E.

    1989-01-01

    The problem of acquiring a simple but sufficiently accurate model of a dynamic system is made more difficult when the dynamic system of interest is a multibody system comprised of several components. A low order system model may be created by reducing the order of the component models and making use of various available multibody dynamics programs to assemble them into a system model. The difficulty is in choosing the reduced order component models to meet system level requirements. The projection and assembly method, proposed originally by Eke, solves this difficulty by forming the full order system model, performing model reduction at the system level using system level requirements, and then projecting the desired modes onto the components for component level model reduction. The projection and assembly method is analyzed to show the conditions under which the desired modes are captured exactly, to the numerical precision of the algorithm.
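
    A minimal sketch of the system-level half of this idea, assuming a toy mass-spring chain: the full-order model is assembled, the lowest modes are retained against a hypothetical system-level requirement, and the matrices are projected onto that modal basis. The component-level projection step of Eke's method is not reproduced here.

    import numpy as np
    from scipy.linalg import eigh

    # Full-order system: a toy chain of unit masses and springs (assumed example).
    n = 6
    M = np.eye(n)                                            # mass matrix
    K = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # stiffness matrix

    # System-level modal analysis: generalized eigenproblem K*phi = w^2 * M*phi.
    w2, Phi = eigh(K, M)

    # Keep the modes that meet the (hypothetical) system-level requirement.
    r = 2
    T = Phi[:, :r]                                           # projection basis

    # Reduced-order matrices obtained by projecting onto the retained modes.
    M_r = T.T @ M @ T
    K_r = T.T @ K @ T
    print("retained natural frequencies:", np.sqrt(np.diag(K_r) / np.diag(M_r)))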

  16. How Teaching Science Using Project-Based Learning Strategies Affects the Classroom Learning Environment

    ERIC Educational Resources Information Center

    Hugerat, Muhamad

    2016-01-01

    This study involved 458 ninth-grade students from two different Arab middle schools in Israel. Half of the students learned science using project-based learning strategies and the other half learned using traditional methods (non-project-based). The classes were heterogeneous regarding their achievements in the sciences. The adapted questionnaire…

  17. Bridging Scientific Reasoning and Conceptual Change through Adaptive Web-Based Learning

    ERIC Educational Resources Information Center

    She, Hsiao-Ching; Liao, Ya-Wen

    2010-01-01

    This study reports an adaptive digital learning project, Scientific Concept Construction and Reconstruction (SCCR), and examines its effects on 108 8th grade students' scientific reasoning and conceptual change through mixed methods. A one-group pre-, post-, and retention quasi-experimental design was used in the study. All students received tests…

  18. Large Scale Reduction of Graphite Oxide Project

    NASA Technical Reports Server (NTRS)

    Calle, Carlos; Mackey, Paul; Falker, John; Zeitlin, Nancy

    2015-01-01

    This project seeks to develop an optical method to reduce graphite oxide into graphene efficiently and in larger formats than currently available. Current reduction methods are expensive, time-consuming or restricted to small, limited formats. Graphene has potential uses in ultracapacitors, energy storage, solar cells, flexible and light-weight circuits, touch screens, and chemical sensors. In addition, graphite oxide is a sustainable material that can be produced from any form of carbon, making this method environmentally friendly and adaptable for in-situ reduction.

  19. A Cartesian Adaptive Level Set Method for Two-Phase Flows

    NASA Technical Reports Server (NTRS)

    Ham, F.; Young, Y.-N.

    2003-01-01

    In the present contribution we develop a level set method based on local anisotropic Cartesian adaptation as described in Ham et al. (2002). Such an approach should allow for the smallest possible Cartesian grid capable of resolving a given flow. The remainder of the paper is organized as follows. In section 2 the level set formulation for free surface calculations is presented and its strengths and weaknesses relative to the other free surface methods are reviewed. In section 3 the collocated numerical method is described. In section 4 the method is validated by solving the 2D and 3D drop oscillation problem. In section 5 we present some results from more complex cases including the 3D drop breakup in an impulsively accelerated free stream, and the 3D immiscible Rayleigh-Taylor instability. Conclusions are given in section 6.

  20. Adaptive Management for Urban Watersheds: The Slavic Village Pilot Project

    EPA Science Inventory

    Adaptive management is an environmental management strategy that uses an iterative process of decision-making to reduce the uncertainty in environmental management via system monitoring. A central tenet of adaptive management is that management involves a learning process that ca...

  1. Virtual reality based adaptive dose assessment method for arbitrary geometries in nuclear facility decommissioning.

    PubMed

    Liu, Yong-Kuo; Chao, Nan; Xia, Hong; Peng, Min-Jun; Ayodeji, Abiodun

    2018-05-17

    This paper presents an improved and efficient virtual reality-based adaptive dose assessment method (VRBAM) applicable to the cutting and dismantling tasks in nuclear facility decommissioning. The method combines the modeling strength of virtual reality with the flexibility of adaptive technology. The initial geometry is designed with the three-dimensional computer-aided design tools, and a hybrid model composed of cuboids and a point-cloud is generated automatically according to the virtual model of the object. In order to improve the efficiency of dose calculation while retaining accuracy, the hybrid model is converted to a weighted point-cloud model, and the point kernels are generated by adaptively simplifying the weighted point-cloud model according to the detector position, an approach that is suitable for arbitrary geometries. The dose rates are calculated with the Point-Kernel method. To account for radiation scattering effects, buildup factors are calculated with the Geometric-Progression formula in the fitting function. The geometric modeling capability of VRBAM was verified by simulating basic geometries, which included a convex surface, a concave surface, a flat surface and their combination. The simulation results show that the VRBAM is more flexible and superior to other approaches in modeling complex geometries. In this paper, the computation time and dose rate results obtained from the proposed method were also compared with those obtained using the MCNP code and an earlier virtual reality-based method (VRBM) developed by the same authors. © 2018 IOP Publishing Ltd.
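
    The dose-rate evaluation at the core of such methods can be sketched as a plain Point-Kernel sum. The Python snippet below is illustrative only: it assumes a single attenuation coefficient, a constant flux-to-dose factor and a simple linear buildup approximation in place of the Geometric-Progression fit, and the kernel positions and strengths are made up.

    import numpy as np

    def point_kernel_dose_rate(points, strengths, detector, mu, kappa=1.0):
        """Point-Kernel dose-rate estimate at a detector position.

        points    : (N, 3) coordinates of the point kernels (m)
        strengths : (N,) photon emission rates of the kernels (1/s)
        detector  : (3,) detector position (m)
        mu        : linear attenuation coefficient of the medium (1/m)
        kappa     : flux-to-dose conversion factor (assumed constant here)
        A simple linear buildup 1 + mu*r replaces the Geometric-Progression fit.
        """
        r = np.linalg.norm(points - detector, axis=1)
        buildup = 1.0 + mu * r
        flux = strengths * buildup * np.exp(-mu * r) / (4.0 * np.pi * r**2)
        return kappa * flux.sum()

    # Example: a small cloud of kernels standing in for a contaminated component.
    rng = np.random.default_rng(1)
    pts = rng.uniform(-0.5, 0.5, size=(200, 3))        # kernel positions (m)
    strengths = np.full(200, 1.0e6 / 200)              # 1e6 photons/s split evenly
    print(point_kernel_dose_rate(pts, strengths, np.array([2.0, 0.0, 0.0]), mu=0.01))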

  2. Removing damped sinusoidal vibrations in adaptive optics systems using a DFT-based estimation method

    NASA Astrophysics Data System (ADS)

    Kania, Dariusz

    2017-06-01

    The problem of vibration rejection in adaptive optics systems is still present in publications. These undesirable signals emerge because of shaking of the system structure, the tracking process, etc., and they are usually damped sinusoidal signals. There are some mechanical solutions to reduce the signals, but they are not very effective. Among software solutions, adaptive methods are very popular. An AVC (Adaptive Vibration Cancellation) method has been presented and developed in recent years. The method is based on the estimation of three vibration parameters; values of frequency, amplitude and phase are essential to produce and adjust a proper signal to reduce or eliminate the vibration signals. This paper presents a fast (below 10 ms) and accurate method for estimating the frequency, amplitude and phase of a multifrequency signal that can be used in the AVC method to increase the AO system performance. The method's accuracy depends on several parameters: CiR - number of signal periods in a measurement window, N - number of samples in the FFT procedure, H - time window order, SNR, THD, b - number of A/D converter bits in a real-time system, γ - the damping ratio of the tested signal, φ - the phase of the tested signal. Systematic errors increase when N, CiR and H decrease and when γ increases. The value of the systematic error for γ = 0.1%, CiR = 1.1 and N = 32 is approximately 10^-4 Hz/Hz. This paper focuses on the systematic errors and on the effect of the signal phase and the value of γ on the results.
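
    A generic stand-in for this kind of estimator is sketched below: a windowed FFT with parabolic interpolation of the magnitude peak gives the frequency, and a least-squares fit at that frequency gives amplitude and phase. The specific MSD windows and accuracy figures of the paper are not reproduced, and the signal parameters in the demo are arbitrary.

    import numpy as np

    def estimate_sine(x, fs):
        """Estimate frequency, amplitude and phase of the dominant sinusoid in x.

        Step 1: windowed FFT with parabolic interpolation of the magnitude peak.
        Step 2: least-squares fit at that frequency for amplitude and phase.
        """
        n = len(x)
        mag = np.abs(np.fft.rfft(x * np.hanning(n)))
        k = int(np.argmax(mag[1:-1])) + 1                      # peak bin (skip DC/Nyquist)
        a, b, c = mag[k - 1], mag[k], mag[k + 1]
        delta = 0.5 * (a - c) / (a - 2 * b + c)                # peak offset in bins
        freq = (k + delta) * fs / n
        t = np.arange(n) / fs
        z = 2.0 * np.mean(x * np.exp(-2j * np.pi * freq * t))  # complex amplitude
        return freq, np.abs(z), np.angle(z)

    fs = 1000.0
    t = np.arange(1024) / fs
    x = 0.8 * np.cos(2 * np.pi * 123.4 * t + 0.5) + 0.01 * np.random.randn(1024)
    print(estimate_sine(x, fs))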

  3. Errors in the estimation method for the rejection of vibrations in adaptive optics systems

    NASA Astrophysics Data System (ADS)

    Kania, Dariusz

    2017-06-01

    In recent years the problem of the impact of mechanical vibrations in adaptive optics (AO) systems has received renewed attention. These signals are damped sinusoidal signals and have a deleterious effect on the system. One software solution for rejecting the vibrations is an adaptive method called AVC (Adaptive Vibration Cancellation), whose procedure has three steps: estimation of the perturbation parameters, estimation of the frequency response of the plant, and updating of the reference signal to reject/minimize the vibration. In the first step a very important problem is the estimation method. A very accurate and fast (below 10 ms) method for estimating these three parameters has been presented in several publications in recent years. The method is based on spectrum interpolation and MSD time windows, and it can be used to estimate multifrequency signals. In this paper the estimation method is used in the AVC method to increase the system performance. There are several parameters that affect the accuracy of the obtained results, e.g. CiR - number of signal periods in a measurement window, N - number of samples in the FFT procedure, H - time window order, SNR, b - number of ADC bits, γ - damping ratio of the tested signal. Systematic errors increase when N, CiR and H decrease and when γ increases. The value of the systematic error is approximately 10^-10 Hz/Hz for N = 2048 and CiR = 0.1. This paper presents equations that can be used to estimate the maximum systematic errors for given values of H, CiR and N before the start of the estimation process.

  4. Adaptive Management Methods to Protect the California Sacramento-San Joaquin Delta Water Resource

    NASA Technical Reports Server (NTRS)

    Bubenheim, David

    2016-01-01

    The California Sacramento-San Joaquin River Delta is the hub for California's water supply, conveying water from Northern to Southern California agriculture and communities while supporting important ecosystem services, agriculture, and communities in the Delta. Changes in climate, long-term drought, water quality changes, and expansion of invasive aquatic plants threaten ecosystems, impede ecosystem restoration, and are economically, environmentally, and sociologically detrimental to the San Francisco Bay/California Delta complex. NASA Ames Research Center and the USDA-ARS partnered with the State of California and local governments to develop science-based, adaptive-management strategies for the Sacramento-San Joaquin Delta. The project combines science, operations, and economics related to integrated management scenarios for aquatic weeds to help land and waterway managers make science-informed decisions regarding management and outcomes. The team provides a comprehensive understanding of agricultural and urban land use in the Delta and the major watersheds (San Joaquin/Sacramento) supplying the Delta, and of their interaction with drought and climate impacts on the environment, water quality, and weed growth. The team recommends conservation and modified land-use practices and aids local Delta stakeholders in developing management strategies. New remote sensing tools have been developed to enhance the ability to assess conditions, inform decision support tools, and monitor management practices. Science gaps in understanding how native and invasive plants respond to altered environmental conditions are being filled and provide critical biological response parameters for Delta-SWAT simulation modeling. Operational agencies such as the California Department of Boating and Waterways provide testing and act as initial adopters of decision support tools. Methods developed by the project can become routine land and water management tools in complex river delta systems.

  5. An adaptive case management system to support integrated care services: Lessons learned from the NEXES project.

    PubMed

    Cano, Isaac; Alonso, Albert; Hernandez, Carme; Burgos, Felip; Barberan-Garcia, Anael; Roldan, Jim; Roca, Josep

    2015-06-01

    Extensive deployment and sustainability of integrated care services (ICS) constitute an unmet need to reduce the burden of chronic conditions. The European Union project NEXES (2008-2013) assessed the deployment of four ICS encompassing the spectrum of severity of chronic patients. The current study aims (i) to describe the open source Adaptive Case Management (ACM) system (Linkcare®) developed to support the deployment of ICS at the healthcare district level; (ii) to evaluate its performance; and (iii) to identify key challenges for regional deployment of ICS. We first defined a conceptual model for ICS management and execution composed of five main stages. We then specified an associated logical model considering the dynamic runtime of ACM. Finally, we implemented the four ICS as a physical model with an ICS editor to allow professionals (case managers) to play active roles in adapting the system to their needs. Instances of ICS were then run in Linkcare®. Four ICS provided a framework for evaluating the system: Wellness and Rehabilitation (W&R) (number of patients enrolled in the study (n)=173); Enhanced Care (EC) in frail chronic patients to prevent hospital admissions (n=848); Home Hospitalization and Early Discharge (HH/ED) (n=2314); and Support to remote diagnosis (Support) (n=7793). The method for assessment of telemedicine applications (MAST) was used for iterative evaluation. Linkcare® supports ACM with shared-care plans across healthcare tiers and offers integration with provider-specific electronic health records. Linkcare® successfully contributed to the deployment of the four ICS: W&R facilitated long-term sustainability of training effects (p<0.01) and an active lifestyle (p<0.03); EC showed significant positive outcomes (p<0.05); HH/ED reduced in-hospital stays by an average of 5 days per patient with a 30-d re-admission rate of 10%; and Support enhanced the quality of community-based forced spirometry testing (p<0.01). Key challenges for regional deployment

  6. Fantastic animals as an experimental model to teach animal adaptation

    PubMed Central

    Guidetti, Roberto; Baraldi, Laura; Calzolai, Caterina; Pini, Lorenza; Veronesi, Paola; Pederzoli, Aurora

    2007-01-01

    Background Science curricula and teachers should emphasize evolution in a manner commensurate with its importance as a unifying concept in science. The concept of adaptation represents a first step toward understanding the results of natural selection. We set up an experimental project of alternative didactics to improve knowledge of organism adaptation. Students were involved and stimulated in learning processes by creative activities. To set adaptation in a historic frame, fossil records as evidence of past life and evolution were considered. Results The experimental project is schematized in nine phases: review of previous knowledge; lesson on fossils; lesson on fantastic animals; planning an imaginary world; creation of an imaginary animal; revision of the imaginary animals; adaptations of real animals; adaptations of fossil animals; and public exposition. A rubric to evaluate the students' performances is reported. The project involved professors and students of the University of Modena and Reggio Emilia and of the "G. Marconi" Secondary School of First Degree (Modena, Italy). Conclusion The educational objectives of the project are in line with the National Indications of the Italian Ministry of Public Instruction: knowledge of the characteristics of living beings, the meanings of the term "adaptation", the meaning of fossils, the definition of ecosystem, and the particularity of the different biomes. At the end of the project, students will be able to grasp particular adaptations of real organisms and to deduce information about the environment in which the organism evolved. This project allows students to review previous knowledge and to form their personalities. PMID:17767729

  7. A spatially adaptive total variation regularization method for electrical resistance tomography

    NASA Astrophysics Data System (ADS)

    Song, Xizi; Xu, Yanbin; Dong, Feng

    2015-12-01

    The total variation (TV) regularization method has been used to solve the ill-posed inverse problem of electrical resistance tomography (ERT), owing to its good ability to preserve edges. However, the quality of the reconstructed images, especially in the flat region, is often degraded by noise. To optimize the regularization term and the regularization factor according to the spatial feature and to improve the resolution of reconstructed images, a spatially adaptive total variation (SATV) regularization method is proposed. A kind of effective spatial feature indicator named difference curvature is used to identify which region is a flat or edge region. According to different spatial features, the SATV regularization method can automatically adjust both the regularization term and regularization factor. At edge regions, the regularization term is approximate to the TV functional to preserve the edges; in flat regions, it is approximate to the first-order Tikhonov (FOT) functional to make the solution stable. Meanwhile, the adaptive regularization factor determined by the spatial feature is used to constrain the regularization strength of the SATV regularization method for different regions. Besides, a numerical scheme is adopted for the implementation of the second derivatives of difference curvature to improve the numerical stability. Several reconstruction image metrics are used to quantitatively evaluate the performance of the reconstructed results. Both simulation and experimental results indicate that, compared with the TV (mean relative error 0.288, mean correlation coefficient 0.627) and FOT (mean relative error 0.295, mean correlation coefficient 0.638) regularization methods, the proposed SATV (mean relative error 0.259, mean correlation coefficient 0.738) regularization method can endure a relatively high level of noise and improve the resolution of reconstructed images.
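
    The spatial adaptation can be sketched as a per-pixel blend between a TV-like and a Tikhonov-like smoothing term. In the snippet below the gradient magnitude is used as a simplified stand-in for the difference-curvature indicator, and the weights and step size are illustrative rather than taken from the paper.

    import numpy as np

    def adaptive_regularizer_step(u, eps=1e-3):
        """One smoothing step of a spatially adaptive regularizer on image u.

        A per-pixel edge indicator w in [0, 1] blends a TV-like flow (edges,
        w -> 1) with a first-order Tikhonov flow (flat regions, w -> 0).
        Gradient magnitude stands in for the difference-curvature indicator.
        """
        gx = np.gradient(u, axis=1)
        gy = np.gradient(u, axis=0)
        gmag = np.hypot(gx, gy)
        w = gmag / (gmag.max() + eps)                 # edge indicator in [0, 1]
        tv_x, tv_y = gx / (gmag + eps), gy / (gmag + eps)
        tv_flow = np.gradient(tv_x, axis=1) + np.gradient(tv_y, axis=0)
        tik_flow = np.gradient(gx, axis=1) + np.gradient(gy, axis=0)
        return u + 0.2 * (w * tv_flow + (1.0 - w) * tik_flow)

    # Demo: one adaptive smoothing step on a noisy piecewise-constant image.
    img = np.zeros((64, 64)); img[:, 32:] = 1.0
    noisy = img + 0.1 * np.random.default_rng(2).normal(size=img.shape)
    smoothed = adaptive_regularizer_step(noisy)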

  8. Proposed Project Selection Method for Human Support Research and Technology Development (HSR&TD)

    NASA Technical Reports Server (NTRS)

    Jones, Harry

    2005-01-01

    The purpose of HSR&TD is to deliver human support technologies to the Exploration Systems Mission Directorate (ESMD) that will be selected for future missions. This requires identifying promising candidate technologies and advancing them in technology readiness until they are acceptable. HSR&TD must select an array of technology development projects, guide them, and either terminate or continue them, so as to maximize the resulting number of usable advanced human support technologies. This paper proposes an effective project scoring methodology to support managing the HSR&TD project portfolio. Researchers strongly disagree as to what are the best technology project selection methods, or even if there are any proven ones. Technology development is risky and outstanding achievements are rare and unpredictable. There is no simple formula for success. Organizations that are satisfied with their project selection approach typically use a mix of financial, strategic, and scoring methods in an open, established, explicit, formal process. This approach helps to build consensus and develop management insight. It encourages better project proposals by clarifying the desired project attributes. We propose a project scoring technique based on a method previously used in a federal laboratory and supported by recent research. Projects are ranked by their perceived relevance, risk, and return - a new 3 R's. Relevance is the degree to which the project objective supports the HSR&TD goal of developing usable advanced human support technologies. Risk is the estimated probability that the project will achieve its specific objective. Return is the reduction in mission life cycle cost obtained if the project is successful. If the project objective technology performs a new function with no current cost, its return is the estimated cash value of performing the new function. The proposed project selection scoring method includes definitions of the criteria, a project evaluation

  9. [A study on training method for increasing adaptability to blood redistribution in human].

    PubMed

    Wu, Bin; You, Guang-xing; Wu, Ping; Xue, Yue-ying; Liu, Xing-hua; Su, Shuang-ning

    2003-01-01

    To verify that repeated body position change training increases the adaptability to blood redistribution in the human body, and to find a preferable training method for increasing astronauts' adaptability to blood redistribution. Twelve subjects were randomly divided into groups A and B. The six subjects in each group were trained with mode A or mode B repeated position change (9 times in 11 d), respectively. Their head-down tilt (HDT, -30 degrees/30 min) tolerance and orthostatic tolerance were determined before and after training to verify the training effects. 1) Both kinds of repeated body position change training modes increased all subjects' HDT tolerance. Compared with pre-training, during the HDT test the subjects' symptom scores in group B were significantly lower than those in group A (P<0.05), and after training the decrease in heart rate in group B was significantly larger (P<0.01). Mode B was therefore suggested as the preferable training method for increasing HDT tolerance. 2) Both kinds of training modes improved all subjects' orthostatic tolerance. Compared with pre-training, during the orthostatic tolerance test the increase in mean arterial blood pressure in group B was significantly larger (P<0.05), and the increase in heart rate in group B tended to be smaller than in group A (P<0.10). Mode B was also suggested as the preferable training method for increasing orthostatic tolerance. Repeated body position change training can increase the adaptability to blood redistribution in the human body. Mode B was the preferable training method and is promising for use in astronaut training.

  10. An HP Adaptive Discontinuous Galerkin Method for Hyperbolic Conservation Laws. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Bey, Kim S.

    1994-01-01

    This dissertation addresses various issues for model classes of hyperbolic conservation laws. The basic approach developed in this work employs a new family of adaptive, hp-version, finite element methods based on a special discontinuous Galerkin formulation for hyperbolic problems. The discontinuous Galerkin formulation admits high-order local approximations on domains of quite general geometry, while providing a natural framework for finite element approximations and for theoretical developments. The use of hp-versions of the finite element method makes possible exponentially convergent schemes with very high accuracies in certain cases; the use of adaptive hp-schemes allows h-refinement in regions of low regularity and p-enrichment to deliver high accuracy, while keeping problem sizes manageable and dramatically smaller than many conventional approaches. The use of discontinuous Galerkin methods is uncommon in applications, but the methods rest on a reasonable mathematical basis for low-order cases and have local approximation features that can be exploited to produce very efficient schemes, especially in a parallel, multiprocessor environment. The plan of this work is first and primarily to focus on a model class of linear hyperbolic conservation laws for which concrete mathematical results, methodologies, error estimates, convergence criteria, and parallel adaptive strategies can be developed, and to then briefly explore some extensions to more general cases. Next, we provide preliminaries to the study and a review of some aspects of the theory of hyperbolic conservation laws. We also provide a review of relevant literature on this subject and on the numerical analysis of these types of problems.

  11. Reconstruction of sparse-view X-ray computed tomography using adaptive iterative algorithms.

    PubMed

    Liu, Li; Lin, Weikai; Jin, Mingwu

    2015-01-01

    In this paper, we propose two reconstruction algorithms for sparse-view X-ray computed tomography (CT). Treating the reconstruction problems as data fidelity constrained total variation (TV) minimization, both algorithms adopt the alternating two-stage strategy: projection onto convex sets (POCS) for data fidelity and non-negativity constraints and steepest descent for TV minimization. The novelty of this work is to determine iterative parameters automatically from data, thus avoiding tedious manual parameter tuning. In TV minimization, the step sizes of steepest descent are adaptively adjusted according to the difference from the POCS update in either the projection domain or the image domain, while the step size of the algebraic reconstruction technique (ART) in POCS is determined based on the data noise level. In addition, projection errors are used to compare with the error bound to decide whether to perform ART so as to reduce computational costs. The performance of the proposed methods is studied and evaluated using both simulated and physical phantom data. Our methods with automatic parameter tuning achieve similar, if not better, reconstruction performance compared to a representative two-stage algorithm. Copyright © 2014 Elsevier Ltd. All rights reserved.
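
    The alternating two-stage strategy can be sketched on a generic linear model (a random matrix rather than a real CT projector): one ART/Kaczmarz sweep with a non-negativity clip, followed by a few TV steepest-descent steps whose size is tied to the magnitude of the POCS update, loosely following the adaptive step-size idea described above. Matrix size, step factors and iteration counts are arbitrary.

    import numpy as np

    def tv_grad(u, eps=1e-8):
        """Subgradient of isotropic total variation (periodic boundaries)."""
        gx = np.roll(u, -1, axis=1) - u
        gy = np.roll(u, -1, axis=0) - u
        norm = np.sqrt(gx**2 + gy**2 + eps)
        px, py = gx / norm, gy / norm
        return -((px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0)))

    def pocs_tv(A, b, shape, n_iter=50, n_tv=10):
        """Alternate one ART/POCS sweep with a few TV descent steps whose size
        is tied to the magnitude of the POCS update."""
        x = np.zeros(shape)
        for _ in range(n_iter):
            x_prev = x.copy()
            xf = x.ravel()
            for i in range(A.shape[0]):               # ART (Kaczmarz) sweep
                ai = A[i]
                xf += (b[i] - ai @ xf) / (ai @ ai) * ai
            x = np.maximum(xf.reshape(shape), 0.0)    # non-negativity constraint
            dp = np.linalg.norm(x - x_prev)           # size of the POCS change
            for _ in range(n_tv):                     # TV steepest descent
                g = tv_grad(x)
                x -= 0.2 * dp * g / (np.linalg.norm(g) + 1e-12)
        return x

    # Tiny demo: recover an 8x8 piecewise-constant image from 40 random projections.
    rng = np.random.default_rng(3)
    truth = np.zeros((8, 8)); truth[2:6, 2:6] = 1.0
    A = rng.normal(size=(40, 64))
    rec = pocs_tv(A, A @ truth.ravel(), truth.shape)
    print("relative error:", np.linalg.norm(rec - truth) / np.linalg.norm(truth))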

  12. Methods to achieve accurate projection of regional and global raster databases

    USGS Publications Warehouse

    Usery, E.L.; Seong, J.C.; Steinwand, D.R.; Finn, M.P.

    2002-01-01

    This research aims at building a decision support system (DSS) for selecting an optimum projection considering various factors, such as pixel size, areal extent, number of categories, spatial pattern of categories, resampling methods, and error correction methods. Specifically, this research will investigate three goals theoretically and empirically and, using the already developed empirical base of knowledge with these results, develop an expert system for map projection of raster data for regional and global database modeling. The three theoretical goals are as follows: (1) The development of a dynamic projection that adjusts projection formulas for latitude on the basis of raster cell size to maintain equal-sized cells. (2) The investigation of the relationships between the raster representation and the distortion of features, number of categories, and spatial pattern. (3) The development of an error correction and resampling procedure that is based on error analysis of raster projection.

  13. Isothermic and fixed intensity heat acclimation methods induce similar heat adaptation following short and long-term timescales.

    PubMed

    Gibson, Oliver R; Mee, Jessica A; Tuttle, James A; Taylor, Lee; Watt, Peter W; Maxwell, Neil S

    2015-01-01

    Heat acclimation requires the interaction between hot environments and exercise to elicit thermoregulatory adaptations. Optimal synergism between these parameters is unknown. Common practice involves utilising a fixed workload model where exercise prescription is controlled and core temperature is uncontrolled, or an isothermic model where core temperature is controlled and work rate is manipulated to control core temperature. Following a baseline heat stress test, 24 males took part in a between-groups experimental design, performing short term heat acclimation (STHA; five 90 min sessions) and long term heat acclimation (LTHA; STHA plus a further five 90 min sessions) utilising either fixed intensity (50% VO2peak), continuous isothermic (target rectal temperature 38.5 °C for STHA and LTHA), or progressive isothermic heat acclimation (target rectal temperature 38.5 °C for STHA, and 39.0 °C for LTHA). Identical heat stress tests followed STHA and LTHA to determine the magnitude of adaptation. All methods induced equal adaptation from baseline; however, the isothermic methods induced this adaptation with reduced exercise durations (STHA = -66% and LTHA = -72%) and lower mean session intensity (STHA = -13% VO2peak and LTHA = -9% VO2peak) in comparison to the fixed intensity method (p < 0.05). STHA decreased exercising heart rate (-10 b min(-1)), core (-0.2 °C) and skin temperature (-0.51 °C), with sweat losses increasing (+0.36 Lh(-1)) (p<0.05). No difference between heat acclimation methods, and no further benefit of LTHA, was observed (p > 0.05). Only thermal sensation improved from baseline to STHA (-0.2), and then between STHA and LTHA (-0.5) (p<0.05). Both the continuous and progressive isothermic methods elicited exercise durations, mean session intensities, and mean T(rec) values analogous to a more efficient administration for maximising adaptation. Short term isothermic methods are therefore optimal for individuals aiming to achieve heat adaptation most economically, i.e. when integrating heat acclimation into

  14. Adaptive silviculture for climate change: a national experiment in manager-scientist partnerships to apply an adaptation framework

    Treesearch

    Linda M. Nagel; Brian J. Palik; Michael A. Battaglia; Anthony W. D'Amato; James M. Guldin; Chris Swanston; Maria K. Janowiak; Matthew P. Powers; Linda A. Joyce; Constance I. Millar; David L. Peterson; Lisa M. Ganio; Chad Kirschbaum; Molly R. Roske

    2017-01-01

    Forest managers in the United States must respond to the need for climate-adaptive strategies in the face of observed and projected climatic changes. However, there is a lack of on-the-ground forest adaptation research to indicate what adaptation measures or tactics might be effective in preparing forest ecosystems to deal with climate change. Natural resource managers...

  15. An Adaptive Deghosting Method in Neural Network-Based Infrared Detectors Nonuniformity Correction.

    PubMed

    Li, Yiyang; Jin, Weiqi; Zhu, Jin; Zhang, Xu; Li, Shuo

    2018-01-13

    The problems of the neural network-based nonuniformity correction algorithm for infrared focal plane arrays mainly concern slow convergence speed and ghosting artifacts. In general, the more stringent the inhibition of ghosting, the slower the convergence speed. The factors that affect these two problems are the estimated desired image and the learning rate. In this paper, we propose a learning rate rule that combines adaptive threshold edge detection and a temporal gate. Through the noise estimation algorithm, the adaptive spatial threshold is related to the residual nonuniformity noise in the corrected image. The proposed learning rate is used to effectively and stably suppress ghosting artifacts without slowing down the convergence speed. The performance of the proposed technique was thoroughly studied with infrared image sequences with both simulated nonuniformity and real nonuniformity. The results show that the deghosting performance of the proposed method is superior to that of other neural network-based nonuniformity correction algorithms and that the convergence speed is equivalent to the tested deghosting methods.
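
    A simplified version of such a gated learning rate is sketched below for an LMS-style gain/offset correction: learning is switched off near edges (using a threshold tied to a robust noise estimate) and where the scene changes between frames (a temporal gate). The thresholds, the 5x5 mean filter used as the desired image, and the update gains are illustrative assumptions, not the paper's values.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def nuc_step(frame, gain, offset, prev_corr,
                 base_rate=0.05, k_sigma=3.0, t_gate=0.05):
        """One LMS-style gain/offset update with an edge- and motion-gated
        learning rate (illustrative thresholds, not the paper's values)."""
        corrected = gain * frame + offset
        desired = uniform_filter(corrected, size=5)      # estimate of the true scene
        err = corrected - desired

        # Adaptive spatial threshold from a robust estimate of residual noise.
        grad = np.hypot(*np.gradient(corrected))
        sigma = 1.4826 * np.median(np.abs(grad - np.median(grad)))
        edge_mask = grad < k_sigma * sigma               # learn only away from edges
        temporal_mask = np.abs(corrected - prev_corr) < t_gate   # temporal gate

        rate = base_rate * edge_mask * temporal_mask
        gain = gain - rate * err * frame
        offset = offset - rate * err
        return gain, offset, corrected                   # call once per incoming frame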

  16. An integration time adaptive control method for atmospheric composition detection of occultation

    NASA Astrophysics Data System (ADS)

    Ding, Lin; Hou, Shuai; Yu, Fei; Liu, Cheng; Li, Chao; Zhe, Lin

    2018-01-01

    When the sun is used as the light source for atmospheric composition detection, it is necessary to image the sun for accurate identification and stable tracking. Over the 180-second course of the occultation, the intensity of sunlight transmitted through the atmosphere changes greatly: the illumination varies by a factor of nearly 1100 between the maximum and minimum atmospheric conditions, and the change can be as fast as a factor of 2.9 per second. It is therefore difficult to control the integration time of the sun-imaging camera. In this paper, a novel adaptive integration time control method for occultation is presented. Taking the distribution of gray values in the image as the reference variable and applying the concepts of speed integral PID control, the method solves the adaptive integration time control problem of high-frequency imaging, and automatic integration time control over the large dynamic range of the occultation can be achieved.
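
    One possible reading of this scheme is an incremental (velocity-form) PID loop that drives a gray-level statistic of the sun image toward a setpoint by scaling the integration time, as sketched below; the setpoint, gains and integration-time limits are invented for illustration and are not taken from the paper.

    class IntegrationTimeController:
        """Incremental (velocity-form) PID that drives a gray-level statistic of
        the sun image toward a setpoint by scaling the camera integration time.
        Setpoint, gains and limits are illustrative."""

        def __init__(self, setpoint=180.0, kp=0.002, ki=0.004, kd=0.0005,
                     t_min=1e-5, t_max=1e-2):
            self.sp, self.kp, self.ki, self.kd = setpoint, kp, ki, kd
            self.t_min, self.t_max = t_min, t_max
            self.e1 = 0.0    # previous error
            self.e2 = 0.0    # error two steps back

        def update(self, gray_stat, t_int):
            """gray_stat: e.g. the 99th-percentile gray value of the current frame."""
            e = self.sp - gray_stat
            delta = (self.kp * (e - self.e1)                   # proportional increment
                     + self.ki * e                             # integral increment
                     + self.kd * (e - 2 * self.e1 + self.e2))  # derivative increment
            self.e2, self.e1 = self.e1, e
            t_new = t_int * (1.0 + delta)                      # relative adjustment
            return min(max(t_new, self.t_min), self.t_max)     # clamp to valid range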

  17. An adaptive tau-leaping method for stochastic simulations of reaction-diffusion systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Padgett, Jill M. A.; Ilie, Silvana, E-mail: silvana@ryerson.ca

    2016-03-15

    Stochastic modelling is critical for studying many biochemical processes in a cell, in particular when some reacting species have low population numbers. For many such cellular processes the spatial distribution of the molecular species plays a key role. The evolution of spatially heterogeneous biochemical systems with some species in low amounts is accurately described by the mesoscopic model of the Reaction-Diffusion Master Equation. The Inhomogeneous Stochastic Simulation Algorithm provides an exact strategy to numerically solve this model, but it is computationally very expensive on realistic applications. We propose a novel adaptive time-stepping scheme for the tau-leaping method for approximating the solution of the Reaction-Diffusion Master Equation. This technique combines effective strategies for variable time-stepping with path preservation to reduce the computational cost, while maintaining the desired accuracy. The numerical tests on various examples arising in applications show the improved efficiency achieved by the new adaptive method.
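
    The core tau-leaping step with an adaptive leap size can be sketched for a well-mixed toy system (the reaction-diffusion setting adds diffusion events between voxels, which are omitted here). The tau-selection rule below is a simplified bound on the expected relative change per species, not the authors' scheme, and the rate constants are arbitrary.

    import numpy as np

    rng = np.random.default_rng(4)

    # Well-mixed toy system (the RDME adds diffusion "reactions" between voxels):
    #   R1: A + B -> C   (rate c1)      R2: C -> A + B   (rate c2)
    nu = np.array([[-1, -1, +1],       # state change caused by R1
                   [+1, +1, -1]])      # state change caused by R2
    c = np.array([0.001, 0.1])

    def propensities(x):
        return np.array([c[0] * x[0] * x[1], c[1] * x[2]])

    def adaptive_tau(x, a, eps=0.03):
        """Largest leap keeping the expected relative change in every species
        below eps (a simplified version of standard tau-selection formulas)."""
        mu = nu.T @ a                    # mean change per unit time, per species
        var = (nu**2).T @ a              # variance of change per unit time
        bound = np.maximum(eps * x, 1.0)
        with np.errstate(divide="ignore"):
            return np.min(np.minimum(bound / np.abs(mu), bound**2 / var))

    x, t, t_end = np.array([1000.0, 800.0, 0.0]), 0.0, 10.0
    while t < t_end:
        a = propensities(x)
        tau = min(adaptive_tau(x, a), t_end - t)
        k = rng.poisson(a * tau)         # number of firings of each reaction channel
        x = np.maximum(x + nu.T @ k, 0)  # crude guard against negative populations
        t += tau
    print("final state:", x)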

  18. A projection method for low speed flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colella, P.; Pao, K.

    The authors propose a decomposition applicable to low speed, inviscid flows of all Mach numbers less than 1. By using the Hodge decomposition, they may write the velocity field as the sum of a divergence-free vector field and a gradient of a scalar function. Evolution equations for these parts are presented. A numerical procedure based on this decomposition is designed, using projection methods for solving the incompressible variables and a backward-Euler method for solving the potential variables. Numerical experiments are included to illustrate various aspects of the algorithm.
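
    The Hodge splitting itself is easy to sketch on a periodic grid: solve a Poisson equation for the scalar potential that carries the divergence, then subtract its gradient to obtain the solenoidal part. The FFT-based snippet below assumes a doubly periodic domain and a synthetic velocity field; it is not the authors' discretization.

    import numpy as np

    def hodge_project(u, v, dx=1.0):
        """Split a periodic 2-D velocity field (u, v) into a divergence-free part
        and a gradient part: solve lap(phi) = div(u, v) spectrally, then subtract
        grad(phi)."""
        ny, nx = u.shape
        kx = 2j * np.pi * np.fft.fftfreq(nx, d=dx)
        ky = 2j * np.pi * np.fft.fftfreq(ny, d=dx)
        KX, KY = np.meshgrid(kx, ky)
        div_hat = KX * np.fft.fft2(u) + KY * np.fft.fft2(v)
        k2 = KX**2 + KY**2
        k2[0, 0] = 1.0                               # avoid dividing the mean mode by zero
        phi_hat = div_hat / k2
        phi_hat[0, 0] = 0.0
        gx = np.real(np.fft.ifft2(KX * phi_hat))     # grad(phi), x component
        gy = np.real(np.fft.ifft2(KY * phi_hat))     # grad(phi), y component
        return u - gx, v - gy                        # divergence-free (solenoidal) part

    # Demo: a Taylor-Green vortex plus an added gradient field (0.3*sin x, 0.3*sin y).
    n = 64
    s = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    X, Y = np.meshgrid(s, s)
    u = np.cos(X) * np.sin(Y) + 0.3 * np.sin(X)
    v = -np.sin(X) * np.cos(Y) + 0.3 * np.sin(Y)
    us, vs = hodge_project(u, v, dx=s[1] - s[0])     # recovers the solenoidal vortex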

  19. Genome research elucidating environmental adaptation: Dark-fly project as a case study.

    PubMed

    Fuse, Naoyuki

    2017-08-01

    Organisms have the capacity to adapt to diverse environments, and environmental adaptation is a substantial driving force of evolution. Recent progress of genome science has addressed the genetic mechanisms underlying environmental adaptation. Whole genome sequencing has identified adaptive genes selected under particular environments. Genome editing technology enables us to directly test the role(s) of a gene in environmental adaptation. Genome science has also shed light on a unique organism, Dark-fly, which has been reared long-term in the dark. We determined the whole genome sequence of Dark-fly and reenacted environmental selections of the Dark-fly genome to identify the genes related to dark-adaptation. Here I will give an overview of current progress in genome science and summarize our study using Dark-fly, as a case study for environmental adaptation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Passive activity observation (PAO) method to estimate outdoor thermal adaptation in public space: case studies in Australian cities.

    PubMed

    Sharifi, Ehsan; Boland, John

    2018-06-18

    Outdoor thermal comfort is influenced by people's climate expectations, perceptions and adaptation capacity. Varied individual response to comfortable or stressful thermal environments results in a deviation between actual outdoor thermal activity choices and those predicted by thermal comfort indices. This paper presents a passive activity observation (PAO) method for estimating contextual limits of outdoor thermal adaptation. The PAO method determines in which thermal environments statistically meaningful changes occur in outdoor activity patterns, and it estimates thresholds of outdoor thermal neutrality and limits of thermal adaptation in public space based on activity observation and microclimate field measurement. Applications of the PAO method have been demonstrated in Adelaide, Melbourne and Sydney, where outdoor activities were analysed against outdoor thermal comfort indices between 2013 and 2014. Adjusted apparent temperature (aAT), adaptive predicted mean vote (aPMV), outdoor standard effective temperature (OUT_SET), physiological equivalent temperature (PET) and universal thermal comfort index (UTCI) are calculated from the PAO data. Using the PAO method, the high threshold of outdoor thermal neutrality was observed between 24 °C for optional activities and 34 °C for necessary activities (UTCI scale). Meanwhile, the ultimate limit of thermal adaptation in uncontrolled public spaces is estimated to be between 28 °C for social activities and 48 °C for necessary activities. Normalised results indicate that city-wide high thresholds for outdoor thermal neutrality vary from 25 °C in Melbourne to 26 °C in Sydney and 30 °C in Adelaide. The PAO method is a relatively fast and localised method for measuring limits of outdoor thermal adaptation and effectively informs urban design and policy making in the context of climate change.

  1. Fiber-optic fringe projection with crosstalk reduction by adaptive pattern masking

    NASA Astrophysics Data System (ADS)

    Matthias, Steffen; Kästner, Markus; Reithmeier, Eduard

    2017-02-01

    To enable in-process inspection of industrial manufacturing processes, measuring devices need to fulfill time and space constraints, while also being robust to environmental conditions, such as high temperatures and electromagnetic fields. A new fringe projection profilometry system is being developed, which is capable of performing the inspection of filigree tool geometries, e.g. gearing elements with tip radii of 0.2 mm, inside forming machines of the sheet-bulk metal forming process. Compact gradient-index rod lenses with a diameter of 2 mm allow for a compact design of the sensor head, which is connected to a base unit via flexible high-resolution image fibers with a diameter of 1.7 mm. The base unit houses a flexible DMD based LED projector optimized for fiber coupling and a CMOS camera sensor. The system is capable of capturing up to 150 gray-scale patterns per second as well as high dynamic range images from multiple exposures. Owing to fiber crosstalk and light leakage in the image fiber, signal quality suffers especially when capturing 3-D data of technical surfaces with highly varying reflectance or surface angles. An algorithm is presented, which adaptively masks parts of the pattern to reduce these effects via multiple exposures. The masks for valid surface areas are automatically defined according to different parameters from an initial capture, such as intensity and surface gradient. In a second step, the masks are re-projected to projector coordinates using the mathematical model of the system. This approach is capable of reducing both inter-pixel crosstalk and inter-object reflections on concave objects while maintaining measurement durations of less than 5 s.
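
    The camera-side part of the masking step can be sketched as grouping pixels of an initial capture by reflected intensity, one mask per planned exposure; the re-projection of these masks into projector coordinates through the calibrated system model, on which the paper relies, is not shown. The number of groups and the synthetic capture below are arbitrary.

    import numpy as np

    def build_exposure_masks(initial_capture, n_groups=3):
        """Group camera pixels of an initial capture by reflected intensity;
        each group becomes the valid region of one adapted projection pattern
        (the re-projection into projector coordinates is omitted here)."""
        valid = initial_capture > 0                    # pixels actually lit by the projector
        levels = np.quantile(initial_capture[valid], np.linspace(0.0, 1.0, n_groups + 1))
        return [valid & (initial_capture >= lo) & (initial_capture <= hi)
                for lo, hi in zip(levels[:-1], levels[1:])]

    # Toy capture of a surface whose reflectance varies strongly across the image.
    rng = np.random.default_rng(5)
    capture = np.clip(rng.gamma(2.0, 40.0, size=(120, 160)), 0, 255)
    for i, m in enumerate(build_exposure_masks(capture)):
        print(f"mask {i}: {m.mean():.0%} of pixels")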

  2. A Novel Adaptive H∞ Filtering Method with Delay Compensation for the Transfer Alignment of Strapdown Inertial Navigation Systems.

    PubMed

    Lyu, Weiwei; Cheng, Xianghong

    2017-11-28

    Transfer alignment is always a key technology in a strapdown inertial navigation system (SINS) because of its rapidity and accuracy. In this paper a transfer alignment model is established, which contains the SINS error model and the measurement model. The time delay in the process of transfer alignment is analyzed, and an H∞ filtering method with delay compensation is presented. Then the H∞ filtering theory and the robust mechanism of H∞ filter are deduced and analyzed in detail. In order to improve the transfer alignment accuracy in SINS with time delay, an adaptive H∞ filtering method with delay compensation is proposed. Since the robustness factor plays an important role in the filtering process and has effect on the filtering accuracy, the adaptive H∞ filter with delay compensation can adjust the value of robustness factor adaptively according to the dynamic external environment. The vehicle transfer alignment experiment indicates that by using the adaptive H∞ filtering method with delay compensation, the transfer alignment accuracy and the pure inertial navigation accuracy can be dramatically improved, which demonstrates the superiority of the proposed filtering method.

  3. A Novel Adaptive H∞ Filtering Method with Delay Compensation for the Transfer Alignment of Strapdown Inertial Navigation Systems

    PubMed Central

    Lyu, Weiwei

    2017-01-01

    Transfer alignment is always a key technology in a strapdown inertial navigation system (SINS) because of its rapidity and accuracy. In this paper a transfer alignment model is established, which contains the SINS error model and the measurement model. The time delay in the process of transfer alignment is analyzed, and an H∞ filtering method with delay compensation is presented. Then the H∞ filtering theory and the robust mechanism of H∞ filter are deduced and analyzed in detail. In order to improve the transfer alignment accuracy in SINS with time delay, an adaptive H∞ filtering method with delay compensation is proposed. Since the robustness factor plays an important role in the filtering process and has effect on the filtering accuracy, the adaptive H∞ filter with delay compensation can adjust the value of robustness factor adaptively according to the dynamic external environment. The vehicle transfer alignment experiment indicates that by using the adaptive H∞ filtering method with delay compensation, the transfer alignment accuracy and the pure inertial navigation accuracy can be dramatically improved, which demonstrates the superiority of the proposed filtering method. PMID:29182592

  4. M-AMST: an automatic 3D neuron tracing method based on mean shift and adapted minimum spanning tree.

    PubMed

    Wan, Zhijiang; He, Yishan; Hao, Ming; Yang, Jian; Zhong, Ning

    2017-03-29

    Understanding the working mechanism of the brain is one of the grandest challenges for modern science. Toward this end, the BigNeuron project was launched to gather a worldwide community to establish a big data resource and a set of state-of-the-art single neuron reconstruction algorithms. Many groups contributed their own algorithms to the project, including our mean shift and minimum spanning tree (M-MST) method. Although M-MST is intuitive and easy to implement, the MST considers only the spatial information of a single neuron and ignores the shape information, which might lead to less precise connections between some neuron segments. In this paper, we propose an improved algorithm, namely M-AMST, in which a rotating sphere model based on coordinate transformation is used to improve the weight calculation method of M-MST. Two experiments are designed to illustrate the effect of the adapted minimum spanning tree algorithm and the applicability of M-AMST in reconstructing a variety of neuron image datasets, respectively. In experiment 1, taking the reconstruction of APP2 as reference, we produce four difference scores (entire structure average (ESA), different structure average (DSA), percentage of different structure (PDS) and max distance of neurons' nodes (MDNN)) by comparing the neuron reconstructions of APP2 and the other 5 competing algorithms. The result shows that M-AMST gets lower difference scores than M-MST in ESA, PDS and MDNN. Meanwhile, M-AMST is better than N-MST in ESA and MDNN. This indicates that utilizing the adapted minimum spanning tree algorithm, which takes the shape information of the neuron into account, can achieve better neuron reconstructions. In experiment 2, 7 neuron image datasets are reconstructed and the four difference scores are calculated by comparing the gold standard reconstruction and the reconstructions produced by 6 competing algorithms. Comparing the four difference scores of M-AMST and the other 5 algorithms, we can conclude that

  5. Highly undersampled MR image reconstruction using an improved dual-dictionary learning method with self-adaptive dictionaries.

    PubMed

    Li, Jiansen; Song, Ying; Zhu, Zhen; Zhao, Jun

    2017-05-01

    The dual-dictionary learning (Dual-DL) method utilizes both a low-resolution dictionary and a high-resolution dictionary, which are co-trained for sparse coding and image updating, respectively. It can effectively exploit a priori knowledge regarding the typical structures, specific features, and local details of training set images. The prior knowledge helps to improve the reconstruction quality greatly. This method has been successfully applied in magnetic resonance (MR) image reconstruction. However, it relies heavily on the training sets, and the dictionaries are fixed and nonadaptive. In this research, we improve Dual-DL by using self-adaptive dictionaries. The low- and high-resolution dictionaries are updated correspondingly along with the image updating stage to ensure their self-adaptivity. The updated dictionaries incorporate both the prior information of the training sets and the test image directly. Both dictionaries feature improved adaptability. Experimental results demonstrate that the proposed method can efficiently and significantly improve the quality and robustness of MR image reconstruction.

  6. Phylogeny-based comparative methods question the adaptive nature of sporophytic specializations in mosses.

    PubMed

    Huttunen, Sanna; Olsson, Sanna; Buchbender, Volker; Enroth, Johannes; Hedenäs, Lars; Quandt, Dietmar

    2012-01-01

    Adaptive evolution has often been proposed to explain correlations between habitats and certain phenotypes. In mosses, a high frequency of species with specialized sporophytic traits in exposed or epiphytic habitats was, already 100 years ago, suggested as due to adaptation. We tested this hypothesis by contrasting phylogenetic and morphological data from two moss families, Neckeraceae and Lembophyllaceae, both of which show parallel shifts to a specialized morphology and to exposed epiphytic or epilithic habitats. Phylogeny-based tests for correlated evolution revealed that evolution of four sporophytic traits is correlated with a habitat shift. For three of them, evolutionary rates of dual character-state changes suggest that habitat shifts appear prior to changes in morphology. This suggests that they could have evolved as adaptations to new habitats. Regarding the fourth correlated trait the specialized morphology had already evolved before the habitat shift. In addition, several other specialized "epiphytic" traits show no correlation with a habitat shift. Besides adaptive diversification, other processes thus also affect the match between phenotype and environment. Several potential factors such as complex genetic and developmental pathways yielding the same phenotypes, differences in strength of selection, or constraints in phenotypic evolution may lead to an inability of phylogeny-based comparative methods to detect potential adaptations.

  7. Harvard Project Physics Newsletter 10. The Project Physics Course, Text.

    ERIC Educational Resources Information Center

    Harvard Univ., Cambridge, MA. Harvard Project Physics.

    A short description of the availability of Harvard Project Physics course components is given as is a discussion of the growth of the use of Project Physics in schools, including some enrollment data and survey results. Locations of the 1970 and 1971 Summer Institutes are listed. Adaptations of Project Physics course outside the United States are…

  8. Project JOVE. [microgravity experiments and applications

    NASA Technical Reports Server (NTRS)

    Lyell, M. J.

    1994-01-01

    The goal of this project is to investigate new areas of research pertaining to free surface-interface fluid mechanics and/or microgravity which have potential commercial applications. This paper presents an introduction to ferrohydrodynamics (FHD), and discusses some applications. Also, computational methods for solving free surface flow problems are presented in detail. Both have diverse applications in industry and in microgravity fluids applications. Three different modeling schemes for FHD flows are addressed and the governing equations, including Maxwell's equations, are introduced. In the area of computational modeling of free surface flows, both Eulerian and Lagrangian schemes are discussed. The state of the art in computational methods applied to free surface flows is elucidated. In particular, adaptive grids and re-zoning methods are discussed. Additional research results are addressed and copies of the publications produced under the JOVE Project are included.

  9. Hybrid Solution-Adaptive Unstructured Cartesian Method for Large-Eddy Simulation of Detonation in Multi-Phase Turbulent Reactive Mixtures

    DTIC Science & Technology

    2012-03-27

    CCL Report TR-2012-03-03 (grant FA9550...): Hybrid Solution-Adaptive Unstructured Cartesian Method for Large-Eddy Simulation of Detonation in Multi-Phase Turbulent Reactive Mixtures. Application areas mentioned in the record include pulse-detonation engines (PDE), stage separation, supersonic cavity oscillations, hypersonic aerodynamics, and detonation-induced structural ...

  10. Can “Model Projects of Need-Adapted Care” Reduce Involuntary Hospital Treatment and the Use of Coercive Measures?

    PubMed Central

    Wullschleger, Alexandre; Berg, Jürgen; Bermpohl, Felix; Montag, Christiane

    2018-01-01

    Intensive outpatient models of need-adapted psychiatric care have been shown to reduce the length of hospital stays and to improve retention in care for people with severe mental illnesses. In contrast, evidence regarding the impact of such models on involuntary hospital treatment and other coercive measures in inpatient settings is still sparse, although these represent important indicators of the patients' wellbeing. In Germany, intensive models of care still have not been routinely implemented, and their effectiveness within the German psychiatric system is only studied in a few pioneering regions. An innovative model of flexible, assertive, need-adapted care established in Berlin, Germany, in 2014, treating unselected 14% of the catchment area's patients, was evaluated on the basis of routine clinical data. Records of n = 302 patients diagnosed with severe mental disorders, who had been hospitalized at least once during a 4-year-observational period, were analyzed in a retrospective individual mirror-image design, comparing the 2 years before and after inclusion in the model project regarding the time spent in hospital, the number and duration of involuntary hospital treatments and the use of direct coercive interventions like restraint or isolation. After inclusion to the project, patients spent significantly less time in hospital. Among patients treated on acute wards and patients with a diagnosis of psychosis, the number of patients subjected to provisional detention due to acute endangerment of self or others decreased significantly, as did the time spent under involuntary hospital treatment. The number of patients subjected to mechanical restraint, but not to isolation, on the ward decreased significantly, while the total number of coercive interventions remained unchanged. Findings suggest some potential of intensive models of need-adapted care to reduce coercive interventions in psychiatry. However, results must be substantiated by evidence from

  11. Adaptive and Adaptable Automation Design: A Critical Review of the Literature and Recommendations for Future Research

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Kaber, David B.

    2006-01-01

    This report presents a review of literature on approaches to adaptive and adaptable task/function allocation and adaptive interface technologies for effective human management of complex systems that are likely to be issues for the Next Generation Air Transportation System, and a focus of research under the Aviation Safety Program, Integrated Intelligent Flight Deck Project. Contemporary literature retrieved from an online database search is summarized and integrated. The major topics include the effects of delegation-type, adaptable automation on human performance, workload and situation awareness, the effectiveness of various automation invocation philosophies and strategies to function allocation in adaptive systems, and the role of user modeling in adaptive interface design and the performance implications of adaptive interface technology.

  12. Analysis and development of adjoint-based h-adaptive direct discontinuous Galerkin method for the compressible Navier-Stokes equations

    NASA Astrophysics Data System (ADS)

    Cheng, Jian; Yue, Huiqiang; Yu, Shengjiao; Liu, Tiegang

    2018-06-01

    In this paper, an adjoint-based high-order h-adaptive direct discontinuous Galerkin method is developed and analyzed for the two dimensional steady state compressible Navier-Stokes equations. Particular emphasis is devoted to the analysis of the adjoint consistency for three different direct discontinuous Galerkin discretizations, including the original direct discontinuous Galerkin method (DDG), the direct discontinuous Galerkin method with interface correction (DDG(IC)) and the symmetric direct discontinuous Galerkin method (SDDG). Theoretical analysis shows the extra interface correction term adopted in the DDG(IC) method and the SDDG method plays a key role in preserving the adjoint consistency. To be specific, for the model problem considered in this work, we prove that the original DDG method is not adjoint consistent, while the DDG(IC) method and the SDDG method can be adjoint consistent with appropriate treatment of boundary conditions and correct modifications towards the underlying output functionals. The performance of those three DDG methods is carefully investigated and evaluated through typical test cases. Based on the theoretical analysis, an adjoint-based h-adaptive DDG(IC) method is further developed and evaluated; numerical experiments show its potential in the applications of adjoint-based adaptation for simulating compressible flows.

  13. Transcultural adaptation and psychometric properties of Spanish version of Pregnancy Physical Activity Questionnaire: the PregnActive project.

    PubMed

    Oviedo-Caro, Miguel Ángel; Bueno-Antequera, Javier; Munguía-Izquierdo, Diego

    2018-03-19

    To transculturally adapt the Pregnancy Physical Activity Questionnaire (PPAQ) into Spanish and analyze its psychometric properties. The PPAQ was transculturally adapted into Spanish. Test-retest reliability was evaluated in a subsample of 109 pregnant women. The validity was evaluated in a sample of 208 pregnant women who answered the questionnaire and wore the multi-sensor monitor for 7 valid days. The reliability (intraclass correlation coefficient), concordance (concordance correlation coefficient), correlation (Pearson correlation coefficient), agreement (Bland-Altman plots) and relative activity levels (Jonckheere-Terpstra test) between both administrations and methods were examined. Intraclass correlation coefficients between both administrations were good for all categories except transportation. A low but significant correlation was found for total activity (light and above) whereas no correlation was found for other intensities between both methods. Relative activity levels analysis showed a significant linear trend for increased total activity between both methods. The Spanish version of the PPAQ is a brief and easily interpretable questionnaire with good reliability and ability to rank individuals, but poor validity compared with the multi-sensor monitor. The use of the PPAQ provides information on pregnancy-specific activities in order to establish physical activity levels of pregnant women and adapt health promotion interventions. Copyright © 2018 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.

  14. An adaptive Fuzzy C-means method utilizing neighboring information for breast tumor segmentation in ultrasound images.

    PubMed

    Feng, Yuan; Dong, Fenglin; Xia, Xiaolong; Hu, Chun-Hong; Fan, Qianmin; Hu, Yanle; Gao, Mingyuan; Mutic, Sasa

    2017-07-01

    Ultrasound (US) imaging has been widely used in breast tumor diagnosis and treatment intervention. Automatic delineation of the tumor is a crucial first step, especially for computer-aided diagnosis (CAD) and US-guided breast procedures. However, the intrinsic properties of US images such as low contrast and blurry boundaries pose challenges to the automatic segmentation of the breast tumor. Therefore, the purpose of this study is to propose a segmentation algorithm that can contour the breast tumor in US images. To utilize the neighbor information of each pixel, a Hausdorff distance based fuzzy c-means (FCM) method was adopted. The size of the neighbor region was adaptively updated by comparing the mutual information between them. The objective function of the clustering process was updated by a combination of Euclidean distance and the adaptively calculated Hausdorff distance. Segmentation results were evaluated by comparing with three experts' manual segmentations. The results were also compared with a kernel-induced distance based FCM with spatial constraints, the method without adaptive region selection, and conventional FCM. Results from segmenting 30 patient images showed the adaptive method had a value of sensitivity, specificity, Jaccard similarity, and Dice coefficient of 93.60 ± 5.33%, 97.83 ± 2.17%, 86.38 ± 5.80%, and 92.58 ± 3.68%, respectively. The region-based metrics of average symmetric surface distance (ASSD), root mean square symmetric distance (RMSD), and maximum symmetric surface distance (MSSD) were 0.03 ± 0.04 mm, 0.04 ± 0.03 mm, and 1.18 ± 1.01 mm, respectively. All the metrics except sensitivity were better than those of the non-adaptive algorithm and the conventional FCM. Only three region-based metrics were better than those of the kernel-induced distance based FCM with spatial constraints. Adaptively including pixel neighborhood information in segmenting US images improved the segmentation performance. The results demonstrate the
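
    The paper's adaptive Hausdorff-distance formulation is not reproduced here; the following is a minimal sketch of a fuzzy c-means loop in which each pixel's distance to a cluster center is augmented with a neighborhood term, to illustrate how neighbor information can enter the objective. The 3x3 local-mean neighborhood feature, the weight alpha and all parameter values are illustrative assumptions rather than the authors' method.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def fcm_with_neighbors(img, n_clusters=2, m=2.0, alpha=0.5, n_iter=50):
    """Fuzzy c-means on pixel intensities with an added neighborhood term.

    alpha weights the neighborhood distance; the 3x3 local mean stands in
    for the paper's adaptively sized Hausdorff-distance term (assumption).
    """
    x = img.astype(float).ravel()
    nbr = uniform_filter(img.astype(float), size=3).ravel()  # neighbor feature

    rng = np.random.default_rng(0)
    u = rng.random((n_clusters, x.size))
    u /= u.sum(axis=0)

    for _ in range(n_iter):
        um = u ** m
        centers = (um @ x) / um.sum(axis=1)   # centers from intensities only (simplification)
        # combined squared distance: intensity term + alpha * neighborhood term
        d = (x[None, :] - centers[:, None]) ** 2 \
            + alpha * (nbr[None, :] - centers[:, None]) ** 2
        d = np.fmax(d, 1e-12)
        u = d ** (-1.0 / (m - 1))
        u /= u.sum(axis=0)

    return u.argmax(axis=0).reshape(img.shape), centers
```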

  15. A real-time regional adaptive exposure method for saving dose-area product in x-ray fluoroscopy

    PubMed Central

    Burion, Steve; Speidel, Michael A.; Funk, Tobias

    2013-01-01

    Purpose: Reduction of radiation dose in x-ray imaging has been recognized as a high priority in the medical community. Here the authors show that a regional adaptive exposure method can reduce dose-area product (DAP) in x-ray fluoroscopy. The authors' method is particularly geared toward providing dose savings for the pediatric population. Methods: The scanning beam digital x-ray system uses a large-area x-ray source with 8000 focal spots in combination with a small photon-counting detector. An imaging frame is obtained by acquiring and reconstructing up to 8000 detector images, each viewing only a small portion of the patient. Regional adaptive exposure was implemented by varying the exposure of the detector images depending on the local opacity of the object. A family of phantoms ranging in size from infant to obese adult was imaged in anteroposterior view with and without adaptive exposure. The DAP delivered to each phantom was measured in each case, and noise performance was compared by generating noise arrays to represent regional noise in the images. These noise arrays were generated by dividing the image into regions of about 6 mm2, calculating the relative noise in each region, and placing the relative noise value of each region in a one-dimensional array (noise array) sorted from highest to lowest. Dose-area product savings were calculated as one minus the ratio of DAP with adaptive exposure to DAP without adaptive exposure. The authors modified this value by a correction factor that matches the noise arrays where relative noise is highest, to report a final dose-area product saving. Results: The average dose-area product saving across the phantom family was (42 ± 8)% with the highest dose-area product saving in the child-sized phantom (50%) and the lowest in the phantom mimicking an obese adult (23%). Conclusions: Phantom measurements indicate that a regional adaptive exposure method can produce large DAP savings without compromising the
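
    As a rough illustration of the noise arrays described above, the sketch below divides an image into square regions, computes a relative-noise value per region, and returns the values sorted from highest to lowest. The region size in pixels and the std/mean definition of relative noise are assumptions, not the authors' exact procedure.

```python
import numpy as np

def noise_array(image, region_px=16):
    """Divide an image into square regions, compute a relative-noise value
    per region (std/mean, an assumed definition), and return the values
    sorted from highest to lowest, mimicking the paper's noise arrays."""
    h, w = image.shape
    vals = []
    for r in range(0, h - region_px + 1, region_px):
        for c in range(0, w - region_px + 1, region_px):
            block = image[r:r + region_px, c:c + region_px].astype(float)
            if block.mean() > 0:
                vals.append(block.std() / block.mean())
    return np.sort(np.array(vals))[::-1]

# A DAP saving can then be quoted at matched noise, e.g.
# saving = 1.0 - dap_adaptive / dap_uniform
```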

  16. Development of adaptive IWRM options for climate change mitigation and adaptation

    NASA Astrophysics Data System (ADS)

    Flügel, W.-A.

    2011-04-01

    Adaptive Integrated Water Resources Management (IWRM) options related to the impacts of climate change in the twinning basins of the Upper Danube River Basin (UDRB) and the Upper Brahmaputra River Basin (UBRB) are developed based on the results obtained in the different work packages of the BRAHMATWINN project. They are described and discussed in Chapters 2 through 9, and this paper refers to and integrates those findings with respect to their application and interpretation for the development of adaptive IWRM options addressing impacts of climate change in river basins. The data and information related to the results discussed in Chapters 2 through 8 have been input to the RBIS as a central component of the IWRMS (Chapter 9). While the UDRB has already been analysed with respect to IWRM and climate change impacts by various projects, e.g. the BMBF-funded GLOWA-Danube project (GLOWA Danube, 2009; Mauser and Ludwig, 2002), the UBRB has not yet been studied in a similar way as was done in the BRAHMATWINN project. Therefore the IWRM option development focuses on the UBRB, but the methodology presented can be applied to the UDRB and other river basins as well. Data presented and analysed in this chapter have been elaborated by the BRAHMATWINN project partners and are published in the project deliverable reports available from the project homepage http://www.brahmatwinn.uni-jena.de/index.php?id=5311&L=2.

  17. Adaptive-Grid Methods for Phase Field Models of Microstructure Development

    NASA Technical Reports Server (NTRS)

    Provatas, Nikolas; Goldenfeld, Nigel; Dantzig, Jonathan A.

    1999-01-01

    In this work the authors show how the phase field model can be solved in a computationally efficient manner that opens a new large-scale simulational window on solidification physics. Our method uses a finite element, adaptive-grid formulation, and exploits the fact that the phase and temperature fields vary significantly only near the interface. We illustrate how our method allows efficient simulation of phase-field models in very large systems, and verify the predictions of solvability theory at intermediate undercooling. We then present new results at low undercoolings that suggest that solvability theory may not give the correct tip speed in that regime. We model solidification using the phase-field model used by Karma and Rappel.

  18. Enhancing Functional Performance using Sensorimotor Adaptability Training Programs

    NASA Technical Reports Server (NTRS)

    Bloomberg, J. J.; Mulavara, A. P.; Peters, B. T.; Brady, R.; Audas, C.; Ruttley, T. M.; Cohen, H. S.

    2009-01-01

    During the acute phase of adaptation to novel gravitational environments, sensorimotor disturbances have the potential to disrupt the ability of astronauts to perform functional tasks. The goal of this project is to develop a sensorimotor adaptability (SA) training program designed to facilitate recovery of functional capabilities when astronauts transition to different gravitational environments. The project conducted a series of studies that investigated the efficacy of treadmill training combined with a variety of sensory challenges designed to increase adaptability including alterations in visual flow, body loading, and support surface stability.

  19. A method for evaluating the funding of components of natural resource and conservation projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wellington, John F., E-mail: welling@ipfw.edu; Lewis, Stephen A., E-mail: lewis.sa07@gmail.com

    Many public and private entities such as government agencies and private foundations have missions related to the improvement, protection, and sustainability of the environment. In pursuit of their missions, they fund projects with related outcomes. Typically, the funding scene consists of scarce funding dollars for the many project requests. In light of funding limitations and funder's search for innovative funding schemes, a method to support the allocation of scarce dollars among project components is presented. The proposed scheme has similarities to methods in the project selection literature but differs in its focus on project components and its connection to and enumeration of the universe of funding possibilities. The value of having access to the universe is demonstrated with illustrations. The presentation includes Excel implementations that should appeal to a broad spectrum of project evaluators and reviewers. Access to the space of funding possibilities facilitates a rich analysis of funding alternatives. - Highlights: • Method is given for allocating scarce funding dollars among competing projects. • Allocations are made to fund parts of projects. • Proposed method provides access to the universe of funding possibilities. • Proposed method facilitates a rich analysis of funding possibilities. • Excel spreadsheet implementations are provided.
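
    The Excel implementations are not reproduced in the record; as a hedged illustration of what enumerating the universe of funding possibilities can look like, the sketch below lists every subset of component requests that fits within a budget. Component names and costs are hypothetical.

```python
from itertools import combinations

def funding_universe(components, budget):
    """Enumerate every subset of project components whose total cost fits
    within the budget -- an illustrative stand-in for the paper's Excel-based
    enumeration of funding possibilities (exponential in the number of
    components, so only practical for modest component counts)."""
    names = list(components)
    feasible = []
    for k in range(len(names) + 1):
        for combo in combinations(names, k):
            cost = sum(components[c] for c in combo)
            if cost <= budget:
                feasible.append((combo, cost))
    return feasible

# Hypothetical component requests (in $1000s) against a $120k budget.
requests = {"habitat survey": 40, "fencing": 55, "outreach": 25, "monitoring": 30}
for combo, cost in funding_universe(requests, 120):
    print(combo, cost)
```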

  20. Projection methods for line radiative transfer in spherical media.

    NASA Astrophysics Data System (ADS)

    Anusha, L. S.; Nagendra, K. N.

    An efficient numerical method called the Preconditioned Bi-Conjugate Gradient (Pre-BiCG) method is presented for the solution of the radiative transfer equation in spherical geometry. A variant of this method called Stabilized Preconditioned Bi-Conjugate Gradient (Pre-BiCG-STAB) is also presented. These methods are based on projections onto Krylov subspaces of the n-dimensional Euclidean space R^n. The methods are shown to be faster in terms of convergence rate than contemporary iterative methods such as Jacobi, Gauss-Seidel and Successive Over-Relaxation (SOR).
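
    The radiative transfer discretization itself is not given in the record; purely to illustrate the Krylov-subspace machinery the abstract refers to, the sketch below runs SciPy's BiCGSTAB solver with a simple Jacobi preconditioner on a generic sparse system. The test matrix and the diagonal preconditioner are assumptions, not the authors' transfer operator or preconditioner.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import LinearOperator, bicgstab

# Generic sparse test system standing in for a discretized transfer equation.
n = 200
A = diags([-1.0, 2.5, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

# Simple Jacobi (diagonal) preconditioner -- an assumption; the paper's
# preconditioner is specific to the radiative transfer operator.
inv_diag = 1.0 / A.diagonal()
M = LinearOperator((n, n), matvec=lambda v: inv_diag * v)

x, info = bicgstab(A, b, M=M)
print("converged" if info == 0 else f"info={info}",
      "residual norm:", np.linalg.norm(A @ x - b))
```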

  1. Climate Change Impacts and Adaptation on Southwestern DoD Facilities

    DTIC Science & Technology

    2017-03-03

    integrating climate change risks into decision priorities. Subject terms: adaptation, baseline sensitivity, climate change, climate exposure. ... four bases we found that integrating climate change risks into the current decision matrix, by linking projected risks to current or past impacts ... data and decision tools and methods. Bases have some capacity to integrate climate-related information, but they have limited resources to undertake

  2. Eulerian Lagrangian Adaptive Fup Collocation Method for solving the conservative solute transport in heterogeneous porous media

    NASA Astrophysics Data System (ADS)

    Gotovac, Hrvoje; Srzic, Veljko

    2014-05-01

    Contaminant transport in natural aquifers is a complex, multiscale process that is frequently studied using different Eulerian, Lagrangian and hybrid numerical methods. Conservative solute transport is typically modeled using the advection-dispersion equation (ADE). Despite the large number of numerical methods that have been developed to solve it, the accurate numerical solution of the ADE still presents formidable challenges. In particular, current numerical solutions of multidimensional advection-dominated transport in non-uniform velocity fields are affected by one or all of the following problems: numerical dispersion that introduces artificial mixing and dilution, grid orientation effects, unresolved spatial and temporal scales and unphysical numerical oscillations (e.g., Herrera et al, 2009; Bosso et al., 2012). In this work we present the Eulerian Lagrangian Adaptive Fup Collocation Method (ELAFCM), based on Fup basis functions and a collocation approach for spatial approximation, and explicit stabilized Runge-Kutta-Chebyshev temporal integration (public domain routine SERK2), which is especially well suited for stiff parabolic problems. The spatial adaptive strategy is based on Fup basis functions, which are closely related to wavelets and splines: they are compactly supported basis functions that exactly describe algebraic polynomials and enable multiresolution adaptive analysis (MRA). MRA is performed here via the Fup Collocation Transform (FCT), so that at each time step the concentration solution is decomposed using only a few significant Fup basis functions on an adaptive collocation grid with appropriate scales (frequencies) and locations, a desired level of accuracy and a near-minimum computational cost. FCT adds more collocation points and higher resolution levels only in sensitive zones with sharp concentration gradients, fronts and/or narrow transition zones. According to our recent achievements there is no need for solving the large

  3. Local adaptation and the evolution of species' ranges under climate change.

    PubMed

    Atkins, K E; Travis, J M J

    2010-10-07

    The potential impact of climate change on biodiversity is well documented. A well developed range of statistical methods currently exists that projects the possible future habitat of a species directly from the current climate and a species distribution. However, studies incorporating ecological and evolutionary processes remain limited. Here, we focus on the potential role that local adaptation to climate may play in driving the range dynamics of sessile organisms. Incorporating environmental adaptation into a stochastic simulation yields several new insights. Counter-intuitively, our simulation results suggest that species with broader ranges are not necessarily more robust to climate change. Instead, species with broader ranges can be more susceptible to extinction as locally adapted genotypes are often blocked from range shifting by the presence of cooler adapted genotypes that persist even when their optimum climate has left them behind. Interestingly, our results also suggest that it will not always be the cold-adapted phenotypes that drive polewards range expansion. Instead, range shifts may be driven by phenotypes conferring adaptation to conditions prevalent towards the centre of a species' equilibrium distribution. This may have important consequences for the conservation method termed predictive provenancing. These initial results highlight the potential importance of local adaptation in determining how species will respond to climate change and we argue that this is an area requiring urgent theoretical and empirical attention. 2010 Elsevier Ltd. All rights reserved.

  4. Combining Project Management Methods: A Case Study of Distributed Work Practices

    NASA Astrophysics Data System (ADS)

    Backlund, Per; Lundell, Björn

    The increasing complexity of information systems development (ISD) projects calls for improved project management practices. This, together with an endeavour to improve the success rate of ISD projects (Lyytinen and Robey 1999; Cooke-Davies 2002; White and Fortune 2002), has served as a driver for various efforts in process improvement such as the introduction of new development methods (Fitzgerald 1997; Iivari and Maansaari 1998).

  5. Validation of an Adaptive Combustion Instability Control Method for Gas-Turbine Engines

    NASA Technical Reports Server (NTRS)

    Kopasakis, George; DeLaat, John C.; Chang, Clarence T.

    2004-01-01

    This paper describes ongoing testing of an adaptive control method to suppress high frequency thermo-acoustic instabilities like those found in lean-burning, low emission combustors that are being developed for future aircraft gas turbine engines. The method, called Adaptive Sliding Phasor Averaged Control, was previously tested in an experimental rig designed to simulate a combustor with an instability of about 530 Hz. Results published earlier, and briefly presented here, demonstrated that this method was effective in suppressing the instability. Because this test rig did not exhibit a well-pronounced instability, a question remained regarding the effectiveness of the control methodology when applied to a more coherent instability. To answer this question, a modified combustor rig was assembled at the NASA Glenn Research Center in Cleveland, Ohio. The modified rig exhibited a more coherent, higher amplitude instability, but at a lower frequency of about 315 Hz. Test results show that this control method successfully reduced the instability pressure of the lower frequency test rig. In addition, due to a phenomenon discovered and reported earlier, the so-called Intra-Harmonic Coupling, a dramatic suppression of the instability was achieved by focusing control on the second harmonic of the instability. These results and their implications are discussed, as well as a hypothesis describing the mechanism of intra-harmonic coupling.

  6. Research on a pulmonary nodule segmentation method combining fast self-adaptive FCM and classification.

    PubMed

    Liu, Hui; Zhang, Cai-Ming; Su, Zhi-Yuan; Wang, Kai; Deng, Kai

    2015-01-01

    The key problem of computer-aided diagnosis (CAD) of lung cancer is to segment pathologically changed tissues fast and accurately. As pulmonary nodules are a potential manifestation of lung cancer, we propose a fast, self-adaptive pulmonary nodule segmentation method based on a combination of FCM clustering and classification learning. The enhanced spatial function considers contributions to fuzzy membership from both the grayscale similarity between the central pixel and its individual neighboring pixels and the spatial similarity between the central pixel and its neighborhood, and it effectively improves the convergence rate and self-adaptivity of the algorithm. Experimental results show that the proposed method can achieve more accurate segmentation of vascular adhesion, pleural adhesion, and ground glass opacity (GGO) pulmonary nodules than other typical algorithms.

  7. Research on a Pulmonary Nodule Segmentation Method Combining Fast Self-Adaptive FCM and Classification

    PubMed Central

    Liu, Hui; Zhang, Cai-Ming; Su, Zhi-Yuan; Wang, Kai; Deng, Kai

    2015-01-01

    The key problem of computer-aided diagnosis (CAD) of lung cancer is to segment pathologically changed tissues fast and accurately. As pulmonary nodules are a potential manifestation of lung cancer, we propose a fast, self-adaptive pulmonary nodule segmentation method based on a combination of FCM clustering and classification learning. The enhanced spatial function considers contributions to fuzzy membership from both the grayscale similarity between the central pixel and its individual neighboring pixels and the spatial similarity between the central pixel and its neighborhood, and it effectively improves the convergence rate and self-adaptivity of the algorithm. Experimental results show that the proposed method can achieve more accurate segmentation of vascular adhesion, pleural adhesion, and ground glass opacity (GGO) pulmonary nodules than other typical algorithms. PMID:25945120

  8. Cultural adaptation of a supportive care needs measure for Hispanic men cancer survivors.

    PubMed

    Martinez Tyson, Dinorah; Medina-Ramirez, Patricia; Vázquez-Otero, Coralia; Gwede, Clement K; Bobonis, Margarita; McMillan, Susan C

    2018-01-01

    Research with ethnic minority populations requires instrumentation that is cultural and linguistically relevant. The aim of this study was to translate and culturally adapt the Cancer Survivor Unmet Needs measure into Spanish. We describe the iterative, community-engaged consensus-building approaches used to adapt the instrument for Hispanic male cancer survivors. We used an exploratory sequential mixed method study design. Methods included translation and back-translation, focus groups with cancer survivors (n = 18) and providers (n = 5), use of cognitive interview techniques to evaluate the comprehension and acceptability of the adapted instrument with survivors (n = 12), ongoing input from the project's community advisory board, and preliminary psychometric analysis (n = 84). The process emphasized conceptual, content, semantic, and technical equivalence. Combining qualitative and quantitative approaches offered a rigorous, systematic, and contextual approach to translation alone and supports the cultural adaptation of this measure in a purposeful and relevant manner. Our findings highlight the importance of going beyond translation when adapting measures for cross-cultural populations and illustrate the importance of taking culture, literacy, and language into consideration.

  9. Collaborative decision-making on wind power projects based on AHP method

    NASA Astrophysics Data System (ADS)

    Badea, A.; Proştean, G.; Tămăşilă, M.; Vârtosu, A.

    2017-01-01

    The complexity of implementing projects in Renewable Energy Sources (RES) requires finding collaborative alliances between suppliers and project developers in RES. Linked activities in the RES supply chain, such as transportation of heavy components, processing orders for quality raw materials, storage and materials handling, and packaging, require a collaborative logistics system that is permanently and properly dimensioned, selected and monitored. The stringent requirements of wind power project implementation inevitably involve constraints in infrastructure, implementation and logistics. Thus, following extensive research on RES projects, alternative forms of collaboration were identified to eliminate these constraints and provide feasible solutions at different levels of performance. The paper presents a critical analysis of different collaboration alternatives in the supply chain for RES projects, selecting the ones most suitable for particular situations by using the Analytic Hierarchy Process (AHP) decision-making method. The role of the AHP method was to formulate a decision model through which the choice of collaboration alternative can be established by mathematical calculation, reducing the impact of the constraints encountered. The solution provided by AHP offers a framework for identifying the optimal collaboration between suppliers and project developers in RES and avoids breaks in the chain by resizing safety buffers to level orders in RES projects.
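
    As a brief illustration of the AHP calculation underlying such a decision model, the sketch below derives the priority vector of four hypothetical collaboration alternatives from a pairwise comparison matrix and checks its consistency ratio. The matrix entries are invented for illustration; RI = 0.90 is Saaty's random index for n = 4.

```python
import numpy as np

# Pairwise comparison matrix for four hypothetical collaboration alternatives
# (entries on Saaty's 1-9 scale, invented for illustration).
A = np.array([
    [1.0, 3.0, 5.0, 1.0],
    [1/3, 1.0, 3.0, 1/5],
    [1/5, 1/3, 1.0, 1/7],
    [1.0, 5.0, 7.0, 1.0],
])

# Priority vector = principal eigenvector of the comparison matrix.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
cr = ci / 0.90                      # RI = 0.90 for n = 4 (Saaty's table)
print("priorities:", np.round(w, 3), " consistency ratio:", round(cr, 3))
```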

  10. Local statistics adaptive entropy coding method for the improvement of H.26L VLC coding

    NASA Astrophysics Data System (ADS)

    Yoo, Kook-yeol; Kim, Jong D.; Choi, Byung-Sun; Lee, Yung Lyul

    2000-05-01

    In this paper, we propose an adaptive entropy coding method to improve the VLC coding efficiency of the H.26L TML-1 codec. First, we show that the VLC coding presented in TML-1 does not satisfy the sibling property of entropy coding. We then modify the coding method into a local-statistics-adaptive one that satisfies the property. The proposed method, based on local symbol statistics, dynamically changes the mapping relationship between symbols and bit patterns in the VLC table according to the sibling property. Note that the codewords in the VLC table of the TML-1 codec are not changed. Since this changed mapping relationship is also derived on the decoder side using the decoded symbols, the proposed VLC coding method does not require any overhead information. The simulation results show that the proposed method gives about 30% and 37% reductions in average bit rate for MB type and CBP information, respectively.
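
    A minimal sketch of the core idea, under the assumption that encoder and decoder keep identical symbol counts: the fixed codeword table is left untouched, but symbols are re-ranked by local frequency so the most frequent symbol always maps to the shortest codeword, and no side information is needed. The count-based ranking and tie-breaking below are illustrative, not the TML-1 specifics.

```python
class AdaptiveVLCMapper:
    """Keeps a fixed codeword table but adaptively remaps symbols to codewords
    by local frequency. Encoder and decoder update their counts from the same
    decoded symbols, so the mapping stays synchronized without any overhead
    (the update rule here is an illustrative assumption)."""

    def __init__(self, symbols, codewords):
        self.codewords = list(codewords)       # fixed table, shortest code first
        self.counts = {s: 0 for s in symbols}

    def _ranked(self):
        # most frequent symbols first; stable sort breaks ties by initial order
        return sorted(self.counts, key=lambda s: -self.counts[s])

    def encode(self, symbol):
        mapping = dict(zip(self._ranked(), self.codewords))
        code = mapping[symbol]
        self.counts[symbol] += 1
        return code

    def decode(self, code):
        mapping = dict(zip(self.codewords, self._ranked()))
        symbol = mapping[code]
        self.counts[symbol] += 1
        return symbol

enc = AdaptiveVLCMapper("ABCD", ["0", "10", "110", "111"])
dec = AdaptiveVLCMapper("ABCD", ["0", "10", "110", "111"])
codes = [enc.encode(s) for s in "BBAABC"]
print(codes, "".join(dec.decode(c) for c in codes))
```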

  11. Using Direct Policy Search to Identify Robust Strategies in Adapting to Uncertain Sea Level Rise and Storm Surge

    NASA Astrophysics Data System (ADS)

    Garner, G. G.; Keller, K.

    2017-12-01

    Sea-level rise poses considerable risks to coastal communities, ecosystems, and infrastructure. Decision makers are faced with deeply uncertain sea-level projections when designing a strategy for coastal adaptation. The traditional methods have provided tremendous insight into this decision problem, but are often silent on tradeoffs as well as the effects of tail-area events and of potential future learning. Here we reformulate a simple sea-level rise adaptation model to address these concerns. We show that Direct Policy Search yields improved solution quality, with respect to Pareto-dominance in the objectives, over the traditional approach under uncertain sea-level rise projections and storm surge. Additionally, the new formulation produces high quality solutions with less computational demands than the traditional approach. Our results illustrate the utility of multi-objective adaptive formulations for the example of coastal adaptation, the value of information provided by observations, and point to wider-ranging application in climate change adaptation decision problems.

  12. A wavelet domain adaptive image watermarking method based on chaotic encryption

    NASA Astrophysics Data System (ADS)

    Wei, Fang; Liu, Jian; Cao, Hanqiang; Yang, Jun

    2009-10-01

    Digital watermarking, a specific branch of steganography that can be used in various applications, provides a novel way to solve security problems for multimedia information. In this paper, we propose a wavelet-domain adaptive image watermarking method using chaotic stream encryption and properties of human visual perception. The secret information, which can be seen as a watermark, is hidden in a host image that can be publicly accessed, so the transportation of the secret information will not attract the attention of an illegal receiver. The experimental results show that the method is invisible and robust against some image processing operations.
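
    A hedged sketch of the general scheme described: a logistic-map keystream encrypts the watermark bits, which are then embedded by quantizing wavelet detail coefficients. PyWavelets (pywt) is assumed to be available, and the map parameters, subband choice and quantization step are illustrative assumptions rather than the authors' settings.

```python
import numpy as np
import pywt  # PyWavelets, assumed available

def logistic_keystream(n, x0=0.7, r=3.99):
    """Chaotic bit stream from the logistic map (illustrative key parameters)."""
    bits, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        bits.append(1 if x > 0.5 else 0)
    return np.array(bits, dtype=np.uint8)

def embed_watermark(host, wm_bits, step=8.0):
    """Encrypt watermark bits with the chaotic stream, then embed them by
    forcing the parity of quantized horizontal-detail DWT coefficients.
    Assumes wm_bits.size does not exceed the number of LH coefficients."""
    enc = wm_bits.astype(np.uint8) ^ logistic_keystream(wm_bits.size)
    LL, (LH, HL, HH) = pywt.dwt2(host.astype(float), "haar")
    flat = LH.ravel()
    for i, bit in enumerate(enc):
        q = int(np.round(flat[i] / step))
        if q % 2 != int(bit):            # quantizer parity carries the encrypted bit
            q += 1
        flat[i] = q * step
    return pywt.idwt2((LL, (flat.reshape(LH.shape), HL, HH)), "haar")
```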

  13. Comparative study of adaptive controller using MIT rules and Lyapunov method for MPPT standalone PV systems

    NASA Astrophysics Data System (ADS)

    Tariba, N.; Bouknadel, A.; Haddou, A.; Ikken, N.; Omari, Hafsa El; Omari, Hamid El

    2017-01-01

    A photovoltaic generator (PVG) has a nonlinear characteristic relating current to voltage, I = f(U), that depends on the variation of solar irradiation and temperature; in addition, its operating point depends directly on the load it supplies. To fix this drawback and to extract the maximum power available at the terminals of the generator, an adaptation stage is introduced between the generator and the load to couple the two elements as well as possible. The adaptation stage is associated with an MPPT (Maximum Power Point Tracking) control, which is used to force the PVG to operate at the maximum power point (MPP) under varying climatic conditions and load. This paper presents a comparative study of adaptive controllers for PV systems using the MIT rule and the Lyapunov method to regulate the PV voltage. The Incremental Conductance (IC) algorithm is used to extract the maximum power from the PVG by calculating the reference voltage Vref, and the adaptive controller is used to regulate and quickly track the PV voltage. The two adaptive controller methods are compared to demonstrate their performance using PSIM tools and experimental tests, and the mathematical model of the step-up converter with the PVG model is presented.
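
    For reference, a minimal sketch of the MIT-rule adaptation such comparisons are built around: a feedforward gain is adjusted so a first-order plant follows a reference model, using the adaptation law dtheta/dt = -gamma*e*ym. The plant, reference model and gain values are illustrative and unrelated to the PVG model in the paper.

```python
# MIT-rule adaptation of a feedforward gain theta so that the plant
#   y_dot  = -a*y  + b*theta*u_c
# follows the reference model
#   ym_dot = -am*ym + bm*u_c.
# All numerical values are illustrative assumptions.
a, b, am, bm, gamma = 2.0, 0.5, 2.0, 2.0, 2.0
dt, T = 1e-3, 10.0
y = ym = theta = 0.0
for k in range(int(T / dt)):
    t = k * dt
    u_c = 1.0 if (t % 4.0) < 2.0 else -1.0    # square-wave command signal
    e = y - ym
    theta += -gamma * e * ym * dt             # MIT rule: dtheta/dt = -gamma*e*ym
    y += (-a * y + b * theta * u_c) * dt      # plant (forward Euler)
    ym += (-am * ym + bm * u_c) * dt          # reference model
print("adapted gain:", round(theta, 2), " model-matching value:", bm / b)
```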

  14. A novel ECG data compression method based on adaptive Fourier decomposition

    NASA Astrophysics Data System (ADS)

    Tan, Chunyu; Zhang, Liming

    2017-12-01

    This paper presents a novel electrocardiogram (ECG) compression method based on adaptive Fourier decomposition (AFD). AFD is a newly developed signal decomposition approach, which can decompose a signal with fast convergence, and hence reconstruct ECG signals with high fidelity. Unlike most of the high performance algorithms, our method does not make use of any preprocessing operation before compression. Huffman coding is employed for further compression. Validated with 48 ECG recordings of MIT-BIH arrhythmia database, the proposed method achieves the compression ratio (CR) of 35.53 and the percentage root mean square difference (PRD) of 1.47% on average with N = 8 decomposition times and a robust PRD-CR relationship. The results demonstrate that the proposed method has a good performance compared with the state-of-the-art ECG compressors.
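
    The adaptive Fourier decomposition itself is not reproduced here; the sketch below only computes the two reported figures of merit, compression ratio (CR) and percentage root-mean-square difference (PRD), using their standard definitions (an assumption) on a synthetic signal.

```python
import numpy as np

def compression_ratio(original_bits, compressed_bits):
    """CR = size of the original bit stream over the compressed one."""
    return original_bits / compressed_bits

def prd(x, x_rec):
    """Percentage root-mean-square difference between an original ECG
    segment x and its reconstruction x_rec (standard definition)."""
    x = np.asarray(x, dtype=float)
    x_rec = np.asarray(x_rec, dtype=float)
    return 100.0 * np.sqrt(np.sum((x - x_rec) ** 2) / np.sum(x ** 2))

# Example with a synthetic signal and a slightly perturbed "reconstruction".
t = np.linspace(0, 1, 360)
x = np.sin(2 * np.pi * 5 * t)
x_rec = x + 0.01 * np.random.default_rng(0).standard_normal(t.size)
print("PRD:", round(prd(x, x_rec), 3), " CR:", compression_ratio(360 * 11, 120))
```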

  15. An Adaptive Altitude Information Fusion Method for Autonomous Landing Processes of Small Unmanned Aerial Rotorcraft

    PubMed Central

    Lei, Xusheng; Li, Jingjing

    2012-01-01

    This paper presents an adaptive information fusion method to improve the accuracy and reliability of the altitude measurement information for small unmanned aerial rotorcraft during the landing process. Focusing on the low measurement performance of sensors mounted on small unmanned aerial rotorcraft, a wavelet filter is applied as a pre-filter to attenuate the high frequency noises in the sensor output. Furthermore, to improve altitude information, an adaptive extended Kalman filter based on a maximum a posteriori criterion is proposed to estimate measurement noise covariance matrix in real time. Finally, the effectiveness of the proposed method is proved by static tests, hovering flight and autonomous landing flight tests. PMID:23201993
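
    A hedged sketch of one common innovation-based way to adapt the measurement-noise variance online in a scalar altitude filter; it stands in for, but is not, the paper's maximum a posteriori estimator, and the random-walk altitude model and window length are assumptions.

```python
import numpy as np
from collections import deque

class AdaptiveAltitudeKF:
    """Scalar Kalman filter for altitude whose measurement-noise variance R
    is re-estimated from a sliding window of innovations (an illustrative
    stand-in for the paper's maximum a posteriori estimator)."""

    def __init__(self, q=0.01, r0=1.0, window=30):
        self.x, self.p = 0.0, 1.0        # state estimate and its variance
        self.q, self.r = q, r0
        self.innov = deque(maxlen=window)

    def update(self, z):
        # predict (altitude modeled as a random walk for simplicity)
        self.p += self.q
        # innovation statistics: E[nu^2] ~= P + R, so R ~= mean(nu^2) - P
        nu = z - self.x
        self.innov.append(nu)
        self.r = max(float(np.mean(np.square(self.innov))) - self.p, 1e-6)
        # correct
        k = self.p / (self.p + self.r)
        self.x += k * nu
        self.p *= (1.0 - k)
        return self.x
```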

  16. Effects of Content Balancing and Item Selection Method on Ability Estimation in Computerized Adaptive Tests

    ERIC Educational Resources Information Center

    Sahin, Alper; Ozbasi, Durmus

    2017-01-01

    Purpose: This study aims to reveal effects of content balancing and item selection method on ability estimation in computerized adaptive tests by comparing Fisher's maximum information (FMI) and likelihood weighted information (LWI) methods. Research Methods: Four groups of examinees (250, 500, 750, 1000) and a bank of 500 items with 10 different…

  17. Projection methods

    Treesearch

    Michael E. Goerndt; W. Keith Moser; Patrick D. Miles; Dave Wear; Ryan D. DeSantis; Robert J. Huggett; Stephen R. Shifley; Francisco X. Aguilar; Kenneth E. Skog

    2016-01-01

    One purpose of the Northern Forest Futures Project is to predict change in future forest attributes across the 20 States in the U.S. North for the period that extends from 2010 to 2060. The forest attributes of primary interest are the 54 indicators of forest sustainability identified in the Montreal Process Criteria and Indicators (Montreal Process Working Group, n.d...

  18. Balancing Plan-Driven and Agile Methods in Software Engineering Project Courses

    NASA Astrophysics Data System (ADS)

    Boehm, Barry; Port, Dan; Winsor Brown, A.

    2002-09-01

    For the past 6 years, we have been teaching a two-semester software engineering project course. The students organize into 5-person teams and develop largely web-based electronic services projects for real USC campus clients. We have been using and evolving a method called Model- Based (System) Architecting and Software Engineering (MBASE) for use in both the course and in industrial applications. The MBASE Guidelines include a lot of documents. We teach risk-driven documentation: if it is risky to document something, and not risky to leave it out (e.g., GUI screen placements), leave it out. Even so, students tend to associate more documentation with higher grades, although our grading eventually discourages this. We are always on the lookout for ways to have students learn best practices without having to produce excessive documentation. Thus, we were very interested in analyzing the various emerging agile methods. We found that agile methods and milestone plan-driven methods are part of a “how much planning is enough?” spectrum. Both agile and plan-driven methods have home grounds of project characteristics where they clearly work best, and where the other will have difficulties. Hybrid agile/plan-driven approaches are feasible, and necessary for projects having a mix of agile and plan-driven home ground characteristics. Information technology trends are going more toward the agile methods' home ground characteristics of emergent requirements and rapid change, although there is a concurrent increase in concern with dependability. As a result, we are currently experimenting with risk-driven combinations of MBASE and agile methods, such as integrating requirements, test plans, peer reviews, and pair programming into “agile quality management.”

  19. An adaptive signal-processing approach to online adaptive tutoring.

    PubMed

    Bergeron, Bryan; Cline, Andrew

    2011-01-01

    Conventional intelligent or adaptive tutoring online systems rely on domain-specific models of learner behavior based on rules, deep domain knowledge, and other resource-intensive methods. We have developed and studied a domain-independent methodology of adaptive tutoring based on domain-independent signal-processing approaches that obviate the need for the construction of explicit expert and student models. A key advantage of our method over conventional approaches is a lower barrier to entry for educators who want to develop adaptive online learning materials.

  20. Earth System Science Project

    ERIC Educational Resources Information Center

    Rutherford, Sandra; Coffman, Margaret

    2004-01-01

    For several decades, science teachers have used bottles for classroom projects designed to teach students about biology. Bottle projects do not have to just focus on biology, however. These projects can also be used to engage students in Earth science topics. This article describes the Earth System Science Project, which was adapted and developed…

  1. Formal methods demonstration project for space applications

    NASA Technical Reports Server (NTRS)

    Divito, Ben L.

    1995-01-01

    The Space Shuttle program is cooperating in a pilot project to apply formal methods to live requirements analysis activities. As one of the larger ongoing shuttle Change Requests (CR's), the Global Positioning System (GPS) CR involves a significant upgrade to the Shuttle's navigation capability. Shuttles are to be outfitted with GPS receivers and the primary avionics software will be enhanced to accept GPS-provided positions and integrate them into navigation calculations. Prior to implementing the CR, requirements analysts at Loral Space Information Systems, the Shuttle software contractor, must scrutinize the CR to identify and resolve any requirements issues. We describe an ongoing task of the Formal Methods Demonstration Project for Space Applications whose goal is to find an effective way to use formal methods in the GPS CR requirements analysis phase. This phase is currently under way and a small team from NASA Langley, ViGYAN Inc. and Loral is now engaged in this task. Background on the GPS CR is provided and an overview of the hardware/software architecture is presented. We outline the approach being taken to formalize the requirements, only a subset of which is being attempted. The approach features the use of the PVS specification language to model 'principal functions', which are major units of Shuttle software. Conventional state machine techniques form the basis of our approach. Given this background, we present interim results based on a snapshot of work in progress. Samples of requirements specifications rendered in PVS are offered as illustration. We walk through a specification sketch for the principal function known as GPS Receiver State processing. Results to date are summarized and feedback from Loral requirements analysts is highlighted. Preliminary data are shown comparing issues detected by the formal methods team versus those detected using existing requirements analysis methods. We conclude by discussing our plan to complete the remaining

  2. Patterns in Authoring of Adaptive Educational Hypermedia: A Taxonomy of Learning Styles

    ERIC Educational Resources Information Center

    Brown, Elizabeth; Cristea, Alexandra; Stewart, Craig; Brailsford, Tim

    2005-01-01

    This paper describes the use of adaptation patterns in the task of formulating standards for adaptive educational hypermedia (AEH) systems that is currently under investigation by the EU ADAPT project. Within this project, design dimensions for high granularity patterns have been established. In this paper we focus on detailing lower granularity…

  3. The adaptation challenge in the Arctic

    NASA Astrophysics Data System (ADS)

    Ford, James D.; McDowell, Graham; Pearce, Tristan

    2015-12-01

    It is commonly asserted that human communities in the Arctic are highly vulnerable to climate change, with the magnitude of projected impacts limiting their ability to adapt. At the same time, an increasing number of field studies demonstrate significant adaptive capacity. Given this paradox, we review climate change adaptation, resilience and vulnerability research to identify and characterize the nature and magnitude of the adaptation challenge facing the Arctic. We find that the challenge of adaptation in the Arctic is formidable, but suggest that drivers of vulnerability and barriers to adaptation can be overcome, avoided or reduced by individual and collective efforts across scales for many, if not all, climate change risks.

  4. The stochastic control of the F-8C aircraft using the Multiple Model Adaptive Control (MMAC) method

    NASA Technical Reports Server (NTRS)

    Athans, M.; Dunn, K. P.; Greene, E. S.; Lee, W. H.; Sandel, N. R., Jr.

    1975-01-01

    The purpose of this paper is to summarize results obtained for the adaptive control of the F-8C aircraft using the so-called Multiple Model Adaptive Control method. The discussion includes the selection of the performance criteria for both the lateral and the longitudinal dynamics, the design of the Kalman filters for different flight conditions, the 'identification' aspects of the design using hypothesis testing ideas, and the performance of the closed loop adaptive system.
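
    A compact sketch of the MMAC blending step: each model-matched filter's innovation is converted to a Gaussian likelihood, the model probabilities are updated by Bayes' rule, and the per-model control commands are mixed with those probabilities. The scalar residuals, variances and commands below are invented for illustration and are not the F-8C design values.

```python
import numpy as np

def mmac_step(residuals, variances, probs, controls):
    """One Multiple Model Adaptive Control update.

    residuals, variances : per-model Kalman-filter innovation and its variance
    probs                : prior model probabilities
    controls             : control command each model-matched design would issue
    Returns the probability-weighted control and the posterior probabilities.
    """
    residuals = np.asarray(residuals, float)
    variances = np.asarray(variances, float)
    probs = np.asarray(probs, float)
    # Gaussian likelihood of each model given its innovation
    lik = np.exp(-0.5 * residuals**2 / variances) / np.sqrt(2 * np.pi * variances)
    post = lik * probs
    post /= post.sum()
    return float(np.dot(post, controls)), post

u, p = mmac_step(residuals=[0.2, 1.5, 3.0],
                 variances=[0.5, 0.5, 0.5],
                 probs=[1/3, 1/3, 1/3],
                 controls=[0.1, 0.4, 0.9])
print(round(u, 3), np.round(p, 3))
```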

  5. Adaptive control system having hedge unit and related apparatus and methods

    NASA Technical Reports Server (NTRS)

    Johnson, Eric Norman (Inventor); Calise, Anthony J. (Inventor)

    2003-01-01

    The invention includes an adaptive control system used to control a plant. The adaptive control system includes a hedge unit that receives at least one control signal and a plant state signal. The hedge unit generates a hedge signal based on the control signal, the plant state signal, and a hedge model including a first model having one or more characteristics to which the adaptive control system is not to adapt, and a second model not having the characteristic(s) to which the adaptive control system is not to adapt. The hedge signal is used in the adaptive control system to remove the effect of the characteristic from a signal supplied to an adaptation law unit of the adaptive control system so that the adaptive control system does not adapt to the characteristic in controlling the plant.

  6. Adapting Western research methods to indigenous ways of knowing.

    PubMed

    Simonds, Vanessa W; Christopher, Suzanne

    2013-12-01

    Indigenous communities have long experienced exploitation by researchers and increasingly require participatory and decolonizing research processes. We present a case study of an intervention research project to exemplify a clash between Western research methodologies and Indigenous methodologies and how we attempted reconciliation. We then provide implications for future research based on lessons learned from Native American community partners who voiced concern over methods of Western deductive qualitative analysis. Decolonizing research requires constant reflective attention and action, and there is an absence of published guidance for this process. Continued exploration is needed for implementing Indigenous methods alone or in conjunction with appropriate Western methods when conducting research in Indigenous communities. Currently, examples of Indigenous methods and theories are not widely available in academic texts or published articles, and are often not perceived as valid.

  7. Adapting Western Research Methods to Indigenous Ways of Knowing

    PubMed Central

    Christopher, Suzanne

    2013-01-01

    Indigenous communities have long experienced exploitation by researchers and increasingly require participatory and decolonizing research processes. We present a case study of an intervention research project to exemplify a clash between Western research methodologies and Indigenous methodologies and how we attempted reconciliation. We then provide implications for future research based on lessons learned from Native American community partners who voiced concern over methods of Western deductive qualitative analysis. Decolonizing research requires constant reflective attention and action, and there is an absence of published guidance for this process. Continued exploration is needed for implementing Indigenous methods alone or in conjunction with appropriate Western methods when conducting research in Indigenous communities. Currently, examples of Indigenous methods and theories are not widely available in academic texts or published articles, and are often not perceived as valid. PMID:23678897

  8. Adaptive nonlocal means filtering based on local noise level for CT denoising

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Zhoubo; Trzasko, Joshua D.; Lake, David S.

    2014-01-15

    Purpose: To develop and evaluate an image-domain noise reduction method based on a modified nonlocal means (NLM) algorithm that is adaptive to local noise level of CT images and to implement this method in a time frame consistent with clinical workflow. Methods: A computationally efficient technique for local noise estimation directly from CT images was developed. A forward projection, based on a 2D fan-beam approximation, was used to generate the projection data, with a noise model incorporating the effects of the bowtie filter and automatic exposure control. The noise propagation from projection data to images was analytically derived. The analytical noise map was validated using repeated scans of a phantom. A 3D NLM denoising algorithm was modified to adapt its denoising strength locally based on this noise map. The performance of this adaptive NLM filter was evaluated in phantom studies in terms of in-plane and cross-plane high-contrast spatial resolution, noise power spectrum (NPS), subjective low-contrast spatial resolution using the American College of Radiology (ACR) accreditation phantom, and objective low-contrast spatial resolution using a channelized Hotelling model observer (CHO). Graphical processing units (GPU) implementation of this noise map calculation and the adaptive NLM filtering were developed to meet demands of clinical workflow. Adaptive NLM was piloted on lower dose scans in clinical practice. Results: The local noise level estimation matches the noise distribution determined from multiple repetitive scans of a phantom, demonstrated by small variations in the ratio map between the analytical noise map and the one calculated from repeated scans. The phantom studies demonstrated that the adaptive NLM filter can reduce noise substantially without degrading the high-contrast spatial resolution, as illustrated by modulation transfer function and slice sensitivity profile results. The NPS results show that adaptive NLM denoising preserves
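
    A small, unoptimized sketch of a nonlocal means filter whose smoothing strength varies with a per-pixel noise map, which is the adaptation idea described above; the patch and search sizes, the h = k*sigma scaling and the exponential weight are assumptions rather than the paper's GPU implementation.

```python
import numpy as np

def adaptive_nlm(img, noise_map, patch=3, search=7, k=1.5):
    """Nonlocal means denoising where the filtering strength at each pixel is
    h = k * local noise sigma taken from noise_map (same shape as img)."""
    pad_p, pad_s = patch // 2, search // 2
    off = pad_p + pad_s
    padded = np.pad(img.astype(float), off, mode="reflect")
    out = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            ci, cj = i + off, j + off
            ref = padded[ci - pad_p:ci + pad_p + 1, cj - pad_p:cj + pad_p + 1]
            h = max(k * float(noise_map[i, j]), 1e-6)
            wsum, acc = 0.0, 0.0
            for di in range(-pad_s, pad_s + 1):
                for dj in range(-pad_s, pad_s + 1):
                    ni, nj = ci + di, cj + dj
                    cand = padded[ni - pad_p:ni + pad_p + 1,
                                  nj - pad_p:nj + pad_p + 1]
                    d2 = np.mean((ref - cand) ** 2)   # patch distance
                    w = np.exp(-d2 / (h * h))         # noise-adaptive weight
                    wsum += w
                    acc += w * padded[ni, nj]
            out[i, j] = acc / wsum
    return out
```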

  9. Methods of Adapting Digital Content for the Learning Process via Mobile Devices

    ERIC Educational Resources Information Center

    Lopez, J. L. Gimenez; Royo, T. Magal; Laborda, Jesus Garcia; Calvo, F. Garde

    2009-01-01

    This article analyses different methods of adapting digital content for its delivery via mobile devices taking into account two aspects which are a fundamental part of the learning process; on the one hand, functionality of the contents, and on the other, the actual controlled navigation requirements that the learner needs in order to acquire high…

  10. Convergent Aeronautics Solutions (CAS) Showcase Presentation on Mission Adaptive Digital Composite Aerostructure Technologies (MADCAT)

    NASA Technical Reports Server (NTRS)

    Swei, Sean; Cheung, Kenneth

    2016-01-01

    This project is to develop a novel aerostructure concept that takes advantage of emerging digital composite materials and manufacturing methods to build high stiffness-to-density ratio, ultra-light structures that can enable mission-adaptive and aerodynamically efficient future N+3/N+4 air vehicles.

  11. Predicting missing values in a home care database using an adaptive uncertainty rule method.

    PubMed

    Konias, S; Gogou, G; Bamidis, P D; Vlahavas, I; Maglaveras, N

    2005-01-01

    Contemporary literature illustrates an abundance of adaptive algorithms for mining association rules. However, most literature is unable to deal with the peculiarities, such as missing values and dynamic data creation, that are frequently encountered in fields like medicine. This paper proposes an uncertainty rule method that uses an adaptive threshold for filling missing values in newly added records. A new approach for mining uncertainty rules and filling missing values is proposed, which is in turn particularly suitable for dynamic databases, like the ones used in home care systems. In this study, a new data mining method named FiMV (Filling Missing Values) is illustrated based on the mined uncertainty rules. Uncertainty rules have quite a similar structure to association rules and are extracted by an algorithm proposed in previous work, namely AURG (Adaptive Uncertainty Rule Generation). The main target was to implement an appropriate method for recovering missing values in a dynamic database, where new records are continuously added, without needing to specify any kind of thresholds beforehand. The method was applied to a home care monitoring system database. Multiple missing values were randomly introduced for each record's attributes (at rates of 5-20%, in 5% increments) in the initial dataset. FiMV demonstrated 100% completion rates with over 90% success in each case, while usual approaches, where all records with missing values are ignored or thresholds are required, experienced significantly reduced completion and success rates. It is concluded that the proposed method is appropriate for the data-cleaning step of the Knowledge Discovery process in databases. This step, which has a strong bearing on the output efficiency of any data mining technique, can improve the quality of the mined information.

  12. On Accuracy of Adaptive Grid Methods for Captured Shocks

    NASA Technical Reports Server (NTRS)

    Yamaleev, Nail K.; Carpenter, Mark H.

    2002-01-01

    The accuracy of two grid adaptation strategies, grid redistribution and local grid refinement, is examined by solving the 2-D Euler equations for the supersonic steady flow around a cylinder. Second- and fourth-order linear finite difference shock-capturing schemes, based on the Lax-Friedrichs flux splitting, are used to discretize the governing equations. The grid refinement study shows that for the second-order scheme, neither grid adaptation strategy improves the numerical solution accuracy compared to that calculated on a uniform grid with the same number of grid points. For the fourth-order scheme, the dominant first-order error component is reduced by the grid adaptation, while the design-order error component drastically increases because of the grid nonuniformity. As a result, both grid adaptation techniques improve the numerical solution accuracy only on the coarsest mesh or on very fine grids that are seldom found in practical applications because of the computational cost involved. Similar error behavior has been obtained for the pressure integral across the shock. A simple analysis shows that both grid adaptation strategies are not without penalties in the numerical solution accuracy. Based on these results, a new grid adaptation criterion for captured shocks is proposed.

  13. An Adaptive Deghosting Method in Neural Network-Based Infrared Detectors Nonuniformity Correction

    PubMed Central

    Li, Yiyang; Jin, Weiqi; Zhu, Jin; Zhang, Xu; Li, Shuo

    2018-01-01

    The problems of the neural network-based nonuniformity correction algorithm for infrared focal plane arrays mainly concern slow convergence speed and ghosting artifacts. In general, the more stringent the inhibition of ghosting, the slower the convergence speed. The factors that affect these two problems are the estimated desired image and the learning rate. In this paper, we propose a learning rate rule that combines adaptive threshold edge detection and a temporal gate. Through the noise estimation algorithm, the adaptive spatial threshold is related to the residual nonuniformity noise in the corrected image. The proposed learning rate is used to effectively and stably suppress ghosting artifacts without slowing down the convergence speed. The performance of the proposed technique was thoroughly studied with infrared image sequences with both simulated nonuniformity and real nonuniformity. The results show that the deghosting performance of the proposed method is superior to that of other neural network-based nonuniformity correction algorithms and that the convergence speed is equivalent to the tested deghosting methods. PMID:29342857

  14. Adaptive Formation Control of Electrically Driven Nonholonomic Mobile Robots With Limited Information.

    PubMed

    Bong Seok Park; Jin Bae Park; Yoon Ho Choi

    2011-08-01

    We present a leader-follower-based adaptive formation control method for electrically driven nonholonomic mobile robots with limited information. First, an adaptive observer is developed under the condition that the velocity measurement is not available. With the proposed adaptive observer, the formation control part is designed to achieve the desired formation and guarantee the collision avoidance. In addition, neural network is employed to compensate the actuator saturation, and the projection algorithm is used to estimate the velocity information of the leader. It is shown, by using the Lyapunov theory, that all errors of the closed-loop system are uniformly ultimately bounded. Simulation results are presented to illustrate the performance of the proposed control system.

  15. Precipitation Variability and Projection Uncertainties in Climate Change Adaptation: Go Local!

    EPA Science Inventory

    Presentations agenda includes: Regional and local climate change effects: The relevance; Variability and uncertainty in decision- making and adaptation approaches; Adaptation attributes for the U.S. Southwest: Water availability, storage capacity, and related; EPA research...

  16. A Comparative Study of Online Item Calibration Methods in Multidimensional Computerized Adaptive Testing

    ERIC Educational Resources Information Center

    Chen, Ping

    2017-01-01

    Calibration of new items online has been an important topic in item replenishment for multidimensional computerized adaptive testing (MCAT). Several online calibration methods have been proposed for MCAT, such as multidimensional "one expectation-maximization (EM) cycle" (M-OEM) and multidimensional "multiple EM cycles"…

  17. Adapting to Uncertainty: Comparing Methodological Approaches to Climate Adaptation and Mitigation Policy

    NASA Astrophysics Data System (ADS)

    Huda, J.; Kauneckis, D. L.

    2013-12-01

    Climate change adaptation represents a number of unique policy-making challenges. Foremost among these is dealing with the range of future climate impacts to a wide scope of inter-related natural systems, their interaction with social and economic systems, and uncertainty resulting from the variety of downscaled climate model scenarios and climate science projections. These cascades of uncertainty have led to a number of new approaches as well as a reexamination of traditional methods for evaluating risk and uncertainty in policy-making. Policy makers are required to make decisions and formulate policy irrespective of the level of uncertainty involved and while a debate continues regarding the level of scientific certainty required in order to make a decision, incremental change in the climate policy continues at multiple governance levels. This project conducts a comparative analysis of the range of methodological approaches that are evolving to address uncertainty in climate change policy. It defines 'methodologies' to include a variety of quantitative and qualitative approaches involving both top-down and bottom-up policy processes that attempt to enable policymakers to synthesize climate information into the policy process. The analysis examines methodological approaches to decision-making in climate policy based on criteria such as sources of policy choice information, sectors to which the methodology has been applied, sources from which climate projections were derived, quantitative and qualitative methods used to deal with uncertainty, and the benefits and limitations of each. A typology is developed to better categorize the variety of approaches and methods, examine the scope of policy activities they are best suited for, and highlight areas for future research and development.

  18. Curriculum Adaptation for Inclusive Classrooms.

    ERIC Educational Resources Information Center

    Neary, Tom; And Others

    This manual on curriculum adaptation for inclusive classrooms was developed as part of the PEERS (Providing Education for Everyone in Regular Schools) Project, a 5-year collaborative systems change project in California to facilitate the integration of students with severe disabilities previously at special centers into services at regular school…

  19. An adaptive reconstruction for Lagrangian, direct-forcing, immersed-boundary methods

    NASA Astrophysics Data System (ADS)

    Posa, Antonio; Vanella, Marcos; Balaras, Elias

    2017-12-01

    Lagrangian, direct-forcing, immersed boundary (IB) methods have been receiving increased attention due to their robustness in complex fluid-structure interaction problems. They are very sensitive, however, on the selection of the Lagrangian grid, which is typically used to define a solid or flexible body immersed in a fluid flow. In the present work we propose a cost-efficient solution to this problem without compromising accuracy. Central to our approach is the use of isoparametric mapping to bridge the relative resolution requirements of Lagrangian IB, and Eulerian grids. With this approach, the density of surface Lagrangian markers, which is essential to properly enforce boundary conditions, is adapted dynamically based on the characteristics of the underlying Eulerian grid. The markers are not stored and the Lagrangian data-structure is not modified. The proposed scheme is implemented in the framework of a moving least squares reconstruction formulation, but it can be adapted to any Lagrangian, direct-forcing formulation. The accuracy and robustness of the approach is demonstrated in a variety of test cases of increasing complexity.

  20. Improved neural network based scene-adaptive nonuniformity correction method for infrared focal plane arrays.

    PubMed

    Lai, Rui; Yang, Yin-tang; Zhou, Duan; Li, Yue-jin

    2008-08-20

    An improved scene-adaptive nonuniformity correction (NUC) algorithm for infrared focal plane arrays (IRFPAs) is proposed. This method simultaneously estimates the infrared detectors' parameters and eliminates the nonuniformity that causes fixed pattern noise (FPN) by using a neural network (NN) approach. In the learning process of neuron parameter estimation, the traditional LMS algorithm is replaced with a newly presented variable step size (VSS) normalized least-mean-square (NLMS) based adaptive filtering algorithm, which yields faster convergence, smaller misadjustment, and lower computational cost. In addition, a new NN structure is designed to estimate the desired target value, which improves the calibration precision considerably. The proposed NUC method achieves high correction performance, as validated quantitatively by experiments with a simulated test sequence and a real infrared image sequence.
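
    The per-pixel adaptation idea can be sketched as follows. This is a minimal, assumed reconstruction of the classical NN-based NUC scheme with an NLMS update and a simple error-dependent step size; the paper's specific variable-step-size rule and improved target-estimation structure are not reproduced, and the local-mean target and the step-size schedule below are illustrative assumptions.

      import numpy as np
      from scipy.ndimage import uniform_filter

      def vss_nlms_nuc(frames, mu_max=0.05, mu_min=0.005, eps=1e-6):
          """Scene-adaptive nonuniformity correction (sketch).
          frames: list of 2-D arrays (raw IR frames)."""
          first = frames[0].astype(float)
          g = np.ones_like(first)       # per-pixel gain estimate
          o = np.zeros_like(first)      # per-pixel offset estimate
          corrected = []
          for x in frames:
              x = x.astype(float)
              y = g * x + o                          # corrected frame
              d = uniform_filter(y, size=3)          # desired target: local mean (assumption)
              e = d - y                              # per-pixel error
              # Variable step size: larger steps for larger errors (assumed rule).
              mu = mu_min + (mu_max - mu_min) * np.abs(e) / (np.abs(e).max() + eps)
              norm = x * x + 1.0 + eps               # NLMS normalization of the input
              g += mu * e * x / norm
              o += mu * e / norm
              corrected.append(y)
          return corrected, g, o

      # Usage with synthetic data: 20 frames corrupted by fixed-pattern gain/offset noise.
      rng = np.random.default_rng(0)
      true_g = 1 + 0.1 * rng.standard_normal((64, 64))
      true_o = 5 * rng.standard_normal((64, 64))
      frames = [true_g * rng.uniform(50, 200, (64, 64)) + true_o for _ in range(20)]
      corr, g_est, o_est = vss_nlms_nuc(frames)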

  1. An Efficient Adaptive Window Size Selection Method for Improving Spectrogram Visualization.

    PubMed

    Nisar, Shibli; Khan, Omar Usman; Tariq, Muhammad

    2016-01-01

    Short Time Fourier Transform (STFT) is an important technique for the time-frequency analysis of a time-varying signal. The basic approach behind it involves the application of a Fast Fourier Transform (FFT) to a signal multiplied with an appropriate window function of fixed resolution. The selection of an appropriate window size is difficult when no background information about the input signal is known. In this paper, a novel empirical model is proposed that adaptively adjusts the window size for a narrow-band signal using a spectrum sensing technique. For wide-band signals, where a fixed time-frequency resolution is undesirable, the approach adopts the constant Q transform (CQT). Unlike the STFT, the CQT provides a varying time-frequency resolution. This results in a high spectral resolution at low frequencies and a high temporal resolution at high frequencies. In this paper, a simple but effective switching framework is provided between the STFT and the CQT. The proposed method also allows for the dynamic construction of a filter bank according to user-defined parameters. This helps in reducing redundant entries in the filter bank. Results obtained from the proposed method not only improve the spectrogram visualization but also reduce the computation cost, and the method achieves 87.71% accuracy in appropriate window length selection.
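
    A minimal sketch of the switching idea follows: estimate the occupied bandwidth by a simple spectrum-sensing step, choose a bandwidth-dependent STFT window for narrow-band signals, and defer to a constant-Q analysis for wide-band ones. The energy-fraction bandwidth estimate, the 4/bw window rule and the narrowband_limit parameter are illustrative assumptions, not the paper's empirical model.

      import numpy as np
      from scipy import signal

      def occupied_bandwidth(x, fs, energy_frac=0.99):
          """Spectrum sensing step: bandwidth containing `energy_frac` of the energy."""
          spec = np.abs(np.fft.rfft(x)) ** 2
          freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
          cum = np.cumsum(spec) / np.sum(spec)
          lo = freqs[np.searchsorted(cum, (1 - energy_frac) / 2)]
          hi = freqs[np.searchsorted(cum, 1 - (1 - energy_frac) / 2)]
          return hi - lo

      def adaptive_spectrogram(x, fs, narrowband_limit=0.1):
          """Adaptive analysis (sketch): STFT with a bandwidth-dependent window for
          narrow-band signals, constant-Q analysis otherwise."""
          bw = occupied_bandwidth(x, fs)
          if bw < narrowband_limit * (fs / 2):
              # Narrow-band: make the window long enough to resolve the occupied band.
              # The 4/bw factor is an illustrative rule, not the paper's model.
              nperseg = int(min(len(x), max(64, 4 * fs / max(bw, 1e-9))))
              f, t, Z = signal.stft(x, fs=fs, nperseg=nperseg)
              return "STFT", (f, t, np.abs(Z))
          # Wide-band: a constant-Q transform (e.g. librosa.cqt) would be used here.
          return "CQT", None

      # Example: a narrow-band two-tone signal sampled at 8 kHz.
      fs = 8000
      t = np.arange(0, 1.0, 1 / fs)
      x = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 460 * t)
      kind, out = adaptive_spectrogram(x, fs)
      print(kind)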

  2. GPR used in combination with other NDT methods for assessing pavements in PPP projects

    NASA Astrophysics Data System (ADS)

    Loizos, Andreas; Plati, Christina

    2014-05-01

    In recent decades, Public-Private Partnerships (PPP) have been adopted for highway infrastructure procurement in many countries. PPP projects typically take the form of a section of highway and connecting roadways which are to be constructed and managed for a given concession period. Over the course of the highway concession period, the private agency takes over the pavement maintenance and rehabilitation duties. For this purpose, it is critical to find the most cost-effective way to maintain the infrastructure in compliance with the agreed-upon performance measures, and a Pavement Management System (PMS) is critical to the success of this process. For the successful operation of a PMS it is necessary to have appropriate procedures for pavement monitoring and evaluation, which is important in many areas of pavement engineering. Non-Destructive Testing (NDT) has played a major role in pavement condition monitoring, assessment and evaluation, accomplishing continuous and quick collection of pavement data. The analysis of these data can lead to indicators related to trigger values (criteria) that define the pavement condition, based on which the pavement "health" is perceived, helping decide whether or not there is a need to intervene in the pavement. The resulting assessment determines the management activities required for preserving pavements, in favor not only of the involved highway/road agencies but also of users' service. Amongst NDT methods, Ground Penetrating Radar (GPR) seems to be a very powerful tool, as it provides a range of pavement condition and construction information. It can effectively support the implementation of PMS activities in the framework of pavement monitoring and evaluation. Given that, the present work aims at the development and adaptation of a protocol for the use of GPR in combination with other NDT methods, such as the Falling Weight Deflectometer (FWD), for assessing pavements in PPP projects. It is based on the experience of Laboratory of

  3. An adaptive two-stage dose-response design method for establishing proof of concept.

    PubMed

    Franchetti, Yoko; Anderson, Stewart J; Sampson, Allan R

    2013-01-01

    We propose an adaptive two-stage dose-response design where a prespecified adaptation rule is used to add and/or drop treatment arms between the stages. We extend the multiple comparison procedures-modeling (MCP-Mod) approach into a two-stage design. In each stage, we use the same set of candidate dose-response models and test for a dose-response relationship, or proof of concept (PoC), via model-associated statistics. The stage-wise test results are then combined to establish "global" PoC using a conditional error function. Our simulation studies showed that the proposed design method has good and more robust power compared with conventional fixed designs.
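
    As a simplified illustration of combining stage-wise evidence in an adaptive two-stage design (the paper uses a conditional error function, which is not reproduced here), the sketch below applies the standard weighted inverse-normal combination of two stage-wise p-values.

      import numpy as np
      from scipy import stats

      def combine_stages(p1, p2, w1=0.5, w2=0.5, alpha=0.025):
          """Weighted inverse-normal combination of two stage-wise p-values
          (a standard adaptive two-stage tool; not necessarily the paper's exact rule)."""
          z1 = stats.norm.isf(p1)            # stage-1 evidence
          z2 = stats.norm.isf(p2)            # stage-2 evidence (post-adaptation data only)
          z = (np.sqrt(w1) * z1 + np.sqrt(w2) * z2) / np.sqrt(w1 + w2)
          p_global = stats.norm.sf(z)
          return p_global, p_global <= alpha

      # Example: moderate evidence in stage 1, stronger evidence in stage 2.
      p_global, poc = combine_stages(0.08, 0.01)
      print(round(p_global, 4), "global PoC established:", poc)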

  4. Phase retrieval with the reverse projection method in the presence of object's scattering

    NASA Astrophysics Data System (ADS)

    Wang, Zhili; Gao, Kun; Wang, Dajiang

    2017-08-01

    X-ray grating interferometry can provide substantially increased contrast over traditional attenuation-based techniques in biomedical applications, and therefore novel and complementary information. Recently, special attention has been paid to quantitative phase retrieval in X-ray grating interferometry, which is mandatory to perform phase tomography, to achieve material identification, etc. An innovative approach, dubbed "Reverse Projection" (RP), has been developed for quantitative phase retrieval. The RP method abandons grating scanning completely, and is thus advantageous in terms of higher efficiency and reduced radiation damage. Therefore, it is expected that this novel method will find its potential in preclinical and clinical implementations. Strictly speaking, the reverse projection method is applicable to objects exhibiting only absorption and refraction. In this contribution, we discuss phase retrieval with the reverse projection method for general objects with absorption, refraction and scattering simultaneously. In particular, we investigate the influence of the object's scattering on the retrieved refraction signal. Both theoretical analysis and numerical experiments are performed. The results show that the retrieved refraction signal is the product of the object's refraction and scattering signals for small values. In the case of strong scattering, the reverse projection method cannot provide reliable phase retrieval. These results will guide the use of the reverse projection method in future practical applications, and help to explain some possible artifacts in the retrieved images and/or reconstructed slices.

  5. Potential benefit of the CT adaptive statistical iterative reconstruction method for pediatric cardiac diagnosis

    NASA Astrophysics Data System (ADS)

    Miéville, Frédéric A.; Ayestaran, Paul; Argaud, Christophe; Rizzo, Elena; Ou, Phalla; Brunelle, Francis; Gudinchet, François; Bochud, François; Verdun, Francis R.

    2010-04-01

    Adaptive Statistical Iterative Reconstruction (ASIR) is a new imaging reconstruction technique recently introduced by General Electric (GE). This technique, when combined with a conventional filtered back-projection (FBP) approach, is able to improve image noise reduction. To quantify the benefits provided by the ASIR method with respect to the pure FBP one in terms of image quality and dose reduction, the standard deviation (SD), the modulation transfer function (MTF), the noise power spectrum (NPS), the image uniformity and the noise homogeneity were examined. Measurements were performed on a quality control phantom while varying the CT dose index (CTDIvol) and the reconstruction kernels. A 64-MDCT was employed and raw data were reconstructed with different percentages of ASIR on a CT console dedicated to ASIR reconstruction. Three radiologists also assessed a pediatric cardiac exam reconstructed with different ASIR percentages using the visual grading analysis (VGA) method. For the standard, soft and bone reconstruction kernels, the SD is reduced when the ASIR percentage increases up to 100%, with a higher benefit for low CTDIvol. MTF medium frequencies were slightly enhanced and modifications of the NPS shape curve were observed. However, for the pediatric cardiac CT exam, VGA scores indicate an upper limit to the ASIR benefit: 40% ASIR was observed as the best trade-off between noise reduction and the clinical realism of organ images. Using the phantom results, 40% ASIR corresponded to an estimated dose reduction of 30% under pediatric cardiac protocol conditions. In spite of this discrepancy between phantom and clinical results, the ASIR method is an important option when considering the reduction of radiation dose, especially for pediatric patients.

  6. A method of camera calibration with adaptive thresholding

    NASA Astrophysics Data System (ADS)

    Gao, Lei; Yan, Shu-hua; Wang, Guo-chao; Zhou, Chun-lei

    2009-07-01

    In order to calculate the parameters of the camera correctly, we must figure out the accurate coordinates of certain points in the image plane. Corners are important features in 2D images. Generally speaking, they are points that have high curvature and lie at the junction of image regions of different brightness. Corner detection has therefore already been widely used in many fields. In this paper we use the pinhole camera model and the SUSAN corner detection algorithm to calibrate the camera. When using the SUSAN corner detection algorithm, we propose an approach to determine the gray difference threshold adaptively. That makes it possible to pick up the right chessboard inner corners under all kinds of gray contrast. Experimental results based on this method show that it is feasible.
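
    A minimal sketch of the idea, assuming a simplified SUSAN-style corner response in which the gray-difference threshold is derived from the image's own contrast rather than fixed; the paper's exact adaptive rule may differ.

      import numpy as np

      def susan_corner_response(img, radius=3, geometric_frac=0.5):
          """Simplified SUSAN-style corner response with an adaptively chosen
          gray-difference threshold (sketch)."""
          img = img.astype(float)
          t = max(1.0, 0.5 * img.std())        # adaptive gray-difference threshold (assumption)
          usan = np.zeros_like(img)
          offsets = [(dy, dx) for dy in range(-radius, radius + 1)
                              for dx in range(-radius, radius + 1)
                              if dy * dy + dx * dx <= radius * radius and (dy, dx) != (0, 0)]
          for dy, dx in offsets:
              # np.roll wraps at the borders; acceptable for this sketch.
              shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
              usan += np.exp(-((shifted - img) / t) ** 6)   # SUSAN similarity function
          g = geometric_frac * len(offsets)                 # geometric threshold
          return np.where(usan < g, g - usan, 0.0)          # corners have small USAN area

      # Example: the response peaks near the corners of a bright square on a dark background.
      img = np.zeros((64, 64))
      img[20:44, 20:44] = 200.0
      resp = susan_corner_response(img)
      print(np.unravel_index(np.argmax(resp), resp.shape))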

  7. Generalization and modularization of two-dimensional adaptive coordinate transformations for the Fourier modal method.

    PubMed

    Küchenmeister, Jens

    2014-04-21

    The Fourier modal method (FMM) has advanced greatly by using adaptive coordinates and adaptive spatial resolution. The convergence characteristics were shown to be improved significantly, a construction principle for suitable meshes was demonstrated and a guideline for the optimal choice of the coordinate transformation parameters was found. However, the construction guidelines published so far rely on a certain restriction that is overcome with the formulation presented in this paper. Moreover, a modularization principle is formulated that significantly eases the construction of coordinate transformations in unit cells with reappearing shapes and complex sub-structures.

  8. Grid and basis adaptive polynomial chaos techniques for sensitivity and uncertainty analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perkó, Zoltán, E-mail: Z.Perko@tudelft.nl; Gilli, Luca, E-mail: Gilli@nrg.eu; Lathouwers, Danny, E-mail: D.Lathouwers@tudelft.nl

    2014-03-01

    The demand for accurate and computationally affordable sensitivity and uncertainty techniques is constantly on the rise and has become especially pressing in the nuclear field with the shift to Best Estimate Plus Uncertainty methodologies in the licensing of nuclear installations. Besides traditional, already well developed methods – such as first order perturbation theory or Monte Carlo sampling – Polynomial Chaos Expansion (PCE) has been given a growing emphasis in recent years due to its simple application and good performance. This paper presents new developments of the research done at TU Delft on such Polynomial Chaos (PC) techniques. Our work is focused on the Non-Intrusive Spectral Projection (NISP) approach and adaptive methods for building the PCE of responses of interest. Recent efforts resulted in a new adaptive sparse grid algorithm designed for estimating the PC coefficients. The algorithm is based on Gerstner's procedure for calculating multi-dimensional integrals but proves to be computationally significantly cheaper, while at the same time it retains a similar accuracy as the original method. More importantly, the issue of basis adaptivity has been investigated and two techniques have been implemented for constructing the sparse PCE of quantities of interest. Not using the traditional full PC basis set leads to a further reduction in computational time, since the high order grids necessary for accurately estimating the near-zero expansion coefficients of polynomial basis vectors not needed in the PCE can be excluded from the calculation. Moreover, the sparse PC representation of the response is easier to handle when used for sensitivity analysis or uncertainty propagation due to the smaller number of basis vectors. The developed grid and basis adaptive methods have been implemented in Matlab as the Fully Adaptive Non-Intrusive Spectral Projection (FANISP) algorithm and were tested on four analytical problems. These show consistent good
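
    As a toy illustration of the non-intrusive spectral projection step itself (not the adaptive sparse-grid or basis-adaptive FANISP algorithm), the sketch below builds a one-dimensional Hermite PCE of a response in a standard normal input by Gauss-Hermite quadrature and reads the mean and variance off the coefficients.

      import math
      import numpy as np
      from numpy.polynomial import hermite_e as He

      def nisp_pce_1d(model, order=6, quad_pts=12):
          """Non-intrusive spectral projection (1-D sketch): project model(xi),
          xi ~ N(0,1), onto probabilists' Hermite polynomials He_k."""
          x, w = He.hermegauss(quad_pts)         # Gauss-Hermite nodes/weights (weight e^{-x^2/2})
          fx = model(x)
          coeffs = []
          for k in range(order + 1):
              Hk = He.hermeval(x, [0] * k + [1])  # He_k evaluated at the nodes
              # c_k = E[f He_k] / k!, with E[.] = (1/sqrt(2*pi)) * sum(w * .)
              ck = np.sum(w * fx * Hk) / (math.factorial(k) * math.sqrt(2 * math.pi))
              coeffs.append(ck)
          coeffs = np.array(coeffs)
          mean = coeffs[0]
          var = sum(coeffs[k] ** 2 * math.factorial(k) for k in range(1, order + 1))
          return coeffs, mean, var

      # Example: the response f(xi) = xi**2 has mean 1 and variance 2.
      coeffs, mean, var = nisp_pce_1d(lambda xi: xi ** 2)
      print(round(mean, 6), round(var, 6))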

  9. Fuzzy physical programming for Space Manoeuvre Vehicles trajectory optimization based on hp-adaptive pseudospectral method

    NASA Astrophysics Data System (ADS)

    Chai, Runqi; Savvaris, Al; Tsourdos, Antonios

    2016-06-01

    In this paper, a fuzzy physical programming (FPP) method is introduced for solving the multi-objective Space Manoeuvre Vehicle (SMV) skip trajectory optimization problem based on hp-adaptive pseudospectral methods. The dynamic model of the SMV is elaborated and then, by employing hp-adaptive pseudospectral methods, the problem is transformed into a nonlinear programming (NLP) problem. According to the mission requirements, the solutions were calculated for each single-objective scenario. To get a compromise solution for each target, the fuzzy physical programming (FPP) model is proposed. The preference function is established with consideration of the fuzzy factors of the system such that a proper compromise trajectory can be acquired. In addition, the NSGA-II is tested to obtain the Pareto-optimal solution set and verify the Pareto optimality of the FPP solution. Simulation results indicate that the proposed method is effective and feasible in terms of dealing with the multi-objective skip trajectory optimization for the SMV.

  10. An Analysis of a Finite Element Method for Convection-Diffusion Problems. Part II. A Posteriori Error Estimates and Adaptivity.

    DTIC Science & Technology

    1983-03-01

    An Analysis of a Finite Element Method for Convection-Diffusion Problems. Part II: A Posteriori Error Estimates and Adaptivity, by W. G. Szymczak and I. Babuška (final report covering the life of the contract).

  11. Integrating the Complete Research Project into a Large Qualitative Methods Course

    ERIC Educational Resources Information Center

    Raddon, Mary-Beth; Nault, Caleb; Scott, Alexis

    2008-01-01

    Participatory exercises are standard practice in qualitative methods courses; less common are projects that engage students in the entire research process, from research design to write-up. Although the teaching literature provides several models of complete research projects, their feasibility, and appropriateness for large, compulsory,…

  12. A propagation method with adaptive mesh grid based on wave characteristics for wave optics simulation

    NASA Astrophysics Data System (ADS)

    Tang, Qiuyan; Wang, Jing; Lv, Pin; Sun, Quan

    2015-10-01

    The propagation simulation method and the choice of mesh grid are both very important to obtaining correct propagation results in wave optics simulation. A new angular spectrum propagation method with an alterable mesh grid, based on the traditional angular spectrum method and the direct FFT method, is introduced. With this method, the sampling interval after propagation is no longer constrained by the propagation method, but can be chosen freely. However, the choice of mesh grid on the target plane directly influences the validity of the simulation results. So an adaptive mesh choosing method based on wave characteristics is proposed together with the introduced propagation method. We can calculate appropriate mesh grids on the target plane to get satisfying results. For a complex initial wave field or propagation through inhomogeneous media, we can also calculate and set the mesh grid rationally according to the above method. Finally, comparison with theoretical results shows that the simulation results of the proposed method coincide with theory. Comparison with the traditional angular spectrum method and the direct FFT method shows that the proposed method is able to adapt to a wider range of Fresnel number conditions. That is to say, the method can simulate propagation results efficiently and correctly for propagation distances from almost zero to infinity. So it can provide better support for more wave propagation applications such as atmospheric optics, laser propagation and so on.
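
    For reference, a plain fixed-grid angular spectrum propagator is sketched below; the paper's contribution is the alterable output mesh and the wave-characteristics-based mesh selection, which are not reproduced here.

      import numpy as np

      def angular_spectrum_propagate(u0, wavelength, dx, z):
          """Standard (fixed-grid) angular spectrum propagation of a sampled field u0."""
          n = u0.shape[0]
          fx = np.fft.fftfreq(n, d=dx)                 # spatial frequencies
          FX, FY = np.meshgrid(fx, fx, indexing="ij")
          arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
          kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
          H = np.exp(1j * kz * z) * (arg > 0)          # transfer function, evanescent waves dropped
          return np.fft.ifft2(np.fft.fft2(u0) * H)

      # Example: propagate a Gaussian beam (633 nm, 10 um sampling) by 5 cm.
      n, dx, wl = 256, 10e-6, 633e-9
      x = (np.arange(n) - n / 2) * dx
      X, Y = np.meshgrid(x, x, indexing="ij")
      u0 = np.exp(-(X ** 2 + Y ** 2) / (0.3e-3) ** 2)
      uz = angular_spectrum_propagate(u0, wl, dx, 0.05)
      print(abs(uz).max())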

  13. Building Adaptive Capacity with the Delphi Method and Mediated Modeling for Water Quality and Climate Change Adaptation in Lake Champlain Basin

    NASA Astrophysics Data System (ADS)

    Coleman, S.; Hurley, S.; Koliba, C.; Zia, A.; Exler, S.

    2014-12-01

    Eutrophication and nutrient pollution of surface waters occur within complex governance, social, hydrologic and biophysical basin contexts. The pervasive and perennial nutrient pollution in Lake Champlain Basin, despite decades of efforts, exemplifies problems found across the world's surface waters. Stakeholders with diverse values, interests, and forms of explicit and tacit knowledge determine water quality impacts through land use, agricultural and water resource decisions. Uncertainty, ambiguity and dynamic feedback further complicate the ability to promote the continual provision of water quality and ecosystem services. Adaptive management of water resources and land use requires mechanisms to allow for learning and integration of new information over time. The transdisciplinary Research on Adaptation to Climate Change (RACC) team is working to build regional adaptive capacity in Lake Champlain Basin while studying and integrating governance, land use, hydrological, and biophysical systems to evaluate implications for adaptive management. The RACC team has engaged stakeholders through mediated modeling workshops, online forums, surveys, focus groups and interviews. In March 2014, CSS2CC.org, an interactive online forum to source and identify adaptive interventions from a group of stakeholders across sectors was launched. The forum, based on the Delphi Method, brings forward the collective wisdom of stakeholders and experts to identify potential interventions and governance designs in response to scientific uncertainty and ambiguity surrounding the effectiveness of any strategy, climate change impacts, and the social and natural systems governing water quality and eutrophication. A Mediated Modeling Workshop followed the forum in May 2014, where participants refined and identified plausible interventions under different governance, policy and resource scenarios. Results from the online forum and workshop can identify emerging consensus across scales and sectors

  14. An Adaptive INS-Aided PLL Tracking Method for GNSS Receivers in Harsh Environments.

    PubMed

    Cong, Li; Li, Xin; Jin, Tian; Yue, Song; Xue, Rui

    2016-01-23

    As the weak link in global navigation satellite system (GNSS) signal processing, the phase-locked loop (PLL) easily suffers frequent cycle slips and loss of lock as a result of higher vehicle dynamics and lower signal-to-noise ratios. With inertial navigation system (INS) aid, the PLL's tracking performance can be improved. However, for harsh environments with high dynamics and signal attenuation, the traditional INS-aided PLL with fixed loop parameters has some limitations in improving tracking adaptability. In this paper, an adaptive INS-aided PLL capable of adjusting its noise bandwidth and coherent integration time is proposed. Through theoretical analysis, the relation between the INS-aided PLL phase tracking error and the carrier-to-noise density ratio (C/N₀), vehicle dynamics, aiding information update time, noise bandwidth, and coherent integration time has been established. The relation formulae are used to choose the optimal integration time and bandwidth for a given application under the minimum tracking error criterion. Software and hardware simulation results verify the correctness of the theoretical analysis, and demonstrate that the adaptive tracking method can effectively improve the PLL tracking ability and the integrated GNSS/INS navigation performance. For harsh environments, the tracking sensitivity is increased by 3 to 5 dB, velocity errors are decreased by 36% to 50% and position errors are decreased by 6% to 24% when compared with other INS-aided PLL methods.
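
    The bandwidth/integration-time selection idea can be sketched with standard textbook error terms. The formulas below are the usual third-order-PLL thermal-noise and dynamic-stress approximations (not the paper's own INS-aided error model), and the numbers in the example are illustrative.

      import numpy as np

      def pll_tracking_error(bn, T, cn0_dbhz, jerk_deg_s3):
          """Textbook 3rd-order PLL error terms (in degrees); the paper's INS-aided
          error model is more detailed and is not reproduced here."""
          cn0 = 10 ** (cn0_dbhz / 10.0)                       # C/N0 in Hz
          thermal = (360 / (2 * np.pi)) * np.sqrt(bn / cn0 * (1 + 1 / (2 * T * cn0)))
          w0 = bn / 0.7845                                    # natural frequency of a 3rd-order loop
          dyn = jerk_deg_s3 / w0 ** 3                         # steady-state dynamic stress error
          return 3 * thermal + dyn                            # 3-sigma rule-of-thumb total

      def choose_loop_params(cn0_dbhz, jerk_deg_s3,
                             bandwidths=np.arange(2.0, 30.0, 0.5),
                             int_times=(0.001, 0.002, 0.005, 0.01, 0.02)):
          """Grid search for the (Bn, T) pair minimizing the predicted tracking error."""
          best = min(((pll_tracking_error(b, T, cn0_dbhz, jerk_deg_s3), b, T)
                      for b in bandwidths for T in int_times))
          return best  # (error_deg, Bn_Hz, T_s)

      # Weak signal, small residual dynamics thanks to INS aiding (illustrative numbers).
      err, bn, T = choose_loop_params(cn0_dbhz=25.0, jerk_deg_s3=100.0)
      print(f"Bn = {bn} Hz, T = {T*1000:.0f} ms, predicted error = {err:.1f} deg")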

  15. Simulator Adaptation Syndrome Literature Review

    DTIC Science & Technology

    2004-01-16

    Simulator Adaptation Syndrome Literature Review. The review describes simulator sickness as a syndrome because it has many complex contributing causes and manifests itself with many potential symptoms.

  16. Investigation of the effects of color on judgments of sweetness using a taste adaptation method.

    PubMed

    Hidaka, Souta; Shimoda, Kazumasa

    2014-01-01

    It has been reported that color can affect the judgment of taste. For example, a dark red color enhances the subjective intensity of sweetness. However, the underlying mechanisms of the effect of color on taste have not been fully investigated; in particular, it remains unclear whether the effect is based on cognitive/decisional or perceptual processes. Here, we investigated the effect of color on sweetness judgments using a taste adaptation method. A sweet solution whose color was subjectively congruent with sweetness was judged as sweeter than an uncolored sweet solution both before and after adaptation to an uncolored sweet solution. In contrast, subjective judgment of sweetness for uncolored sweet solutions did not differ between the conditions following adaptation to a colored sweet solution and following adaptation to an uncolored one. Color affected sweetness judgment when the target solution was colored, but the colored sweet solution did not modulate the magnitude of taste adaptation. Therefore, it is concluded that the effect of color on the judgment of taste would occur mainly in cognitive/decisional domains.

  17. A Bayesian Ensemble Approach for Epidemiological Projections

    PubMed Central

    Lindström, Tom; Tildesley, Michael; Webb, Colleen

    2015-01-01

    Mathematical models are powerful tools for epidemiology and can be used to compare control actions. However, different models and model parameterizations may provide different predictions of outcomes. In other fields of research, ensemble modeling has been used to combine multiple projections. We explore the possibility of applying such methods to epidemiology by adapting Bayesian techniques developed for climate forecasting. We exemplify the implementation with single-model ensembles based on different parameterizations of the Warwick model, run for the 2001 United Kingdom foot and mouth disease outbreak, and compare the efficacy of different control actions. This allows us to investigate the effect that discrepancy among projections based on different modeling assumptions has on the ensemble prediction. A sensitivity analysis showed that the choice of prior can have a pronounced effect on the posterior estimates of quantities of interest, in particular for ensembles with large discrepancy among projections. However, by using a hierarchical extension of the method we show that prior sensitivity can be circumvented. We further extend the method to include a priori beliefs about different modeling assumptions and demonstrate that the effect of this can have different consequences depending on the discrepancy among projections. We propose that the method is a promising analytical tool for ensemble modeling of disease outbreaks. PMID:25927892

  18. Research on Comprehensive Evaluation Method for Heating Project Based on Analytic Hierarchy Processing

    NASA Astrophysics Data System (ADS)

    Han, Shenchao; Yang, Yanchun; Liu, Yude; Zhang, Peng; Li, Siwei

    2018-01-01

    Changing the distributed heat supply system is an effective way to reduce haze in winter. Thus, studies on a comprehensive index system and a scientific evaluation method for distributed heat supply projects are essential. First, the factors influencing heating modes are studied, and a multi-dimensional index system covering economic, environmental, risk and flexibility aspects is built, with all indexes quantified. Second, a comprehensive evaluation method based on AHP is put forward to analyze the proposed comprehensive index system. Lastly, a case study suggests that supplying heat with electricity has great advantages and is worth promoting. The comprehensive index system for distributed heat supply projects and the evaluation method in this paper can evaluate distributed heat supply projects effectively and provide scientific support for choosing a distributed heating project.
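
    The core AHP computation referred to above can be sketched as follows: priority weights are taken from the principal eigenvector of a pairwise comparison matrix and checked with Saaty's consistency ratio. The example judgment matrix is illustrative, not taken from the paper.

      import numpy as np

      def ahp_weights(pairwise):
          """Priority weights from a pairwise comparison matrix (principal eigenvector)
          and Saaty's consistency ratio."""
          A = np.asarray(pairwise, dtype=float)
          n = A.shape[0]
          eigvals, eigvecs = np.linalg.eig(A)
          k = np.argmax(eigvals.real)
          w = np.abs(eigvecs[:, k].real)
          w /= w.sum()                                   # normalized priority weights
          ci = (eigvals[k].real - n) / (n - 1)           # consistency index
          ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}.get(n, 1.32)  # Saaty random index
          return w, ci / ri                              # weights, consistency ratio (< 0.1 is OK)

      # Example: 4 criteria (economic, environmental, risk, flexibility) - illustrative judgments.
      A = [[1,   3,   5,   7],
           [1/3, 1,   3,   5],
           [1/5, 1/3, 1,   3],
           [1/7, 1/5, 1/3, 1]]
      w, cr = ahp_weights(A)
      print(np.round(w, 3), "CR =", round(cr, 3))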

  19. Developing Adaptive Teaching Competency through Coaching

    ERIC Educational Resources Information Center

    Vogt, Franziska; Rogalla, Marion

    2009-01-01

    The research project Adaptive Teaching Competency seeks to conceptualise the processes of tuning teaching to individual students' learning needs and to empirically test, within the field of science teaching, to what extent Adaptive Teaching Competency can be fostered through teacher education. 32 primary and secondary teachers took part in an…

  20. Systems and Methods for Derivative-Free Adaptive Control

    NASA Technical Reports Server (NTRS)

    Calise, Anthony J. (Inventor); Yucelen, Tansel (Inventor); Kim, Kilsoo (Inventor)

    2015-01-01

    An adaptive control system is disclosed. The control system can control uncertain dynamic systems. The control system can employ one or more derivative-free adaptive control architectures. The control system can further employ one or more derivative-free weight update laws. The derivative-free weight update laws can comprise a time-varying estimate of an ideal vector of weights. The control system of the present invention can therefore quickly stabilize systems that undergo sudden changes in dynamics, caused by, for example, sudden changes in weight. Embodiments of the present invention can also provide a less complex control system than existing adaptive control systems. The control system can control aircraft and other dynamic systems, such as, for example, those with non-minimum phase dynamics.

  1. Extension of moment projection method to the fragmentation process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Shaohua; Yapp, Edward K.Y.; Akroyd, Jethro

    2017-04-15

    The method of moments is a simple but efficient method of solving the population balance equation which describes particle dynamics. Recently, the moment projection method (MPM) was proposed and validated for particle inception, coagulation, growth and, more importantly, shrinkage; here the method is extended to include the fragmentation process. The performance of MPM is tested for 13 different test cases for different fragmentation kernels, fragment distribution functions and initial conditions. Comparisons are made with the quadrature method of moments (QMOM), hybrid method of moments (HMOM) and a high-precision stochastic solution calculated using the established direct simulation algorithm (DSA), and advantages of MPM are drawn.

  2. Development of a Coordinate Transformation method for direct georeferencing in map projection frames

    NASA Astrophysics Data System (ADS)

    Zhao, Haitao; Zhang, Bing; Wu, Changshan; Zuo, Zhengli; Chen, Zhengchao

    2013-03-01

    This paper develops a novel Coordinate Transformation method (CT-method), with which the orientation angles (roll, pitch, heading) of the local tangent frame of the GPS/INS system are transformed into those (omega, phi, kappa) of the map projection frame for direct georeferencing (DG). In particular, the orientation angles in the map projection frame are derived from a sequence of coordinate transformations. The effectiveness of the orientation angle transformation was verified by comparison with DG results obtained from conventional methods (the Legat method and the POSPac method) using empirical data. Moreover, the CT-method was also validated with simulated data. One advantage of the proposed method is that the orientation angles can be acquired simultaneously while calculating the position elements of the exterior orientation (EO) parameters and the auxiliary point coordinates by coordinate transformation. The three methods were demonstrated and compared using empirical data. Empirical results show that the CT-method is as sound and effective as the Legat method. Compared with the POSPac method, the CT-method is more suitable for calculating EO parameters for DG in map projection frames. The DG accuracy of the CT-method and the Legat method are at the same level. DG results of all three methods have systematic errors in height due to inconsistent length projection distortion in the vertical and horizontal components, and these errors can be significantly reduced using the EO height correction technique in Legat's approach. Similar to the results obtained with empirical data, the effectiveness of the CT-method was also confirmed with simulated data. (POSPac method: presented in the Applanix POSPac software technical note (Hutton and Savina, 1997) and implemented in the POSEO module of the POSPac software.)
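
    The general flavour of such an angle transformation can be sketched as below: build the attitude matrix from (roll, pitch, heading), rotate it into the mapping frame, and decompose the result into (omega, phi, kappa). The rotation conventions and the placeholder R_frame used here are assumptions for illustration only; the CT-method's actual chain of transformations differs.

      import numpy as np

      def rot_x(a):
          c, s = np.cos(a), np.sin(a)
          return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

      def rot_y(a):
          c, s = np.cos(a), np.sin(a)
          return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

      def rot_z(a):
          c, s = np.cos(a), np.sin(a)
          return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

      def opk_from_rph(roll, pitch, heading, R_frame=np.eye(3)):
          """Sketch: body attitude from (roll, pitch, heading), rotated into the
          mapping frame by R_frame (identity placeholder here), then decomposed
          into (omega, phi, kappa) assuming R = Rz(kappa) @ Ry(phi) @ Rx(omega)."""
          R_body = rot_z(heading) @ rot_y(pitch) @ rot_x(roll)  # one common RPH convention
          R = R_frame @ R_body
          phi = np.arcsin(-R[2, 0])
          omega = np.arctan2(R[2, 1], R[2, 2])
          kappa = np.arctan2(R[1, 0], R[0, 0])
          return omega, phi, kappa

      # With R_frame = identity the angles round-trip, which checks the decomposition.
      o, p, k = opk_from_rph(np.radians(1.0), np.radians(-2.0), np.radians(45.0))
      print(np.degrees([o, p, k]))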

  3. Developing Bayesian adaptive methods for estimating sensitivity thresholds (d′) in Yes-No and forced-choice tasks

    PubMed Central

    Lesmes, Luis A.; Lu, Zhong-Lin; Baek, Jongsoo; Tran, Nina; Dosher, Barbara A.; Albright, Thomas D.

    2015-01-01

    Motivated by Signal Detection Theory (SDT), we developed a family of novel adaptive methods that estimate the sensitivity threshold—the signal intensity corresponding to a pre-defined sensitivity level (d′ = 1)—in Yes-No (YN) and Forced-Choice (FC) detection tasks. Rather than focus stimulus sampling to estimate a single level of %Yes or %Correct, the current methods sample psychometric functions more broadly, to concurrently estimate sensitivity and decision factors, and thereby estimate thresholds that are independent of decision confounds. Developed for four tasks—(1) simple YN detection, (2) cued YN detection, which cues the observer's response state before each trial, (3) rated YN detection, which incorporates a Not Sure response, and (4) FC detection—the qYN and qFC methods yield sensitivity thresholds that are independent of the task's decision structure (YN or FC) and/or the observer's subjective response state. Results from simulation and psychophysics suggest that 25 trials (and sometimes less) are sufficient to estimate YN thresholds with reasonable precision (s.d. = 0.10–0.15 decimal log units), but more trials are needed for FC thresholds. When the same subjects were tested across tasks of simple, cued, rated, and FC detection, adaptive threshold estimates exhibited excellent agreement with the method of constant stimuli (MCS), and with each other. These YN adaptive methods deliver criterion-free thresholds that have previously been exclusive to FC methods. PMID:26300798

  4. Accurate Adaptive Level Set Method and Sharpening Technique for Three Dimensional Deforming Interfaces

    NASA Technical Reports Server (NTRS)

    Kim, Hyoungin; Liou, Meng-Sing

    2011-01-01

    In this paper, we demonstrate improved accuracy of the level set method for resolving deforming interfaces by proposing two key elements: (1) accurate level set solutions on adapted Cartesian grids by judiciously choosing interpolation polynomials in regions of different grid levels and (2) enhanced reinitialization by an interface sharpening procedure. The level set equation is solved using a fifth order WENO scheme or a second order central differencing scheme depending on the availability of uniform stencils at each grid point. Grid adaptation criteria are determined so that the Hamiltonian functions at nodes adjacent to interfaces are always calculated by the fifth order WENO scheme. This selective usage of the fifth order WENO and second order central differencing schemes is confirmed to give more accurate results compared to those in the literature for standard test problems. In order to further improve accuracy, especially near thin filaments, we suggest an artificial sharpening method, which is similar in form to the conventional re-initialization method but utilizes the sign of curvature instead of the sign of the level set function. Consequently, volume loss due to numerical dissipation on thin filaments is remarkably reduced for the test problems.

  5. A locally adaptive kernel regression method for facies delineation

    NASA Astrophysics Data System (ADS)

    Fernàndez-Garcia, D.; Barahona-Palomo, M.; Henri, C. V.; Sanchez-Vila, X.

    2015-12-01

    Facies delineation is defined as the separation of geological units with distinct intrinsic characteristics (grain size, hydraulic conductivity, mineralogical composition). A major challenge in this area stems from the fact that only a few scattered pieces of hydrogeological information are available to delineate geological facies. Several methods to delineate facies are available in the literature, ranging from those based only on existing hard data, to those including secondary data or external knowledge about sedimentological patterns. This paper describes a methodology to use kernel regression methods as an effective tool for facies delineation. The method uses both the spatial locations and the actual sampled values to produce, for each individual hard data point, a locally adaptive steering kernel function, self-adjusting the principal directions of the local anisotropic kernels to the direction of highest local spatial correlation. The method is shown to outperform the nearest neighbor classification method in a number of synthetic aquifers whenever the available number of hard data is small and randomly distributed in space. In the case of exhaustive sampling, the steering kernel regression method converges to the true solution. Simulations run in a suite of synthetic examples are used to explore the selection of kernel parameters in typical field settings. It is shown that, in practice, a rule of thumb can be used to obtain suboptimal results. The performance of the method is demonstrated to improve significantly when external information regarding facies proportions is incorporated. Remarkably, the method allows for a reasonable reconstruction of the facies connectivity patterns, shown in terms of breakthrough curve performance.

  6. Adaptive Flight Control for Aircraft Safety Enhancements

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan T.; Gregory, Irene M.; Joshi, Suresh M.

    2008-01-01

    This poster presents the current adaptive control research being conducted at NASA ARC and LaRC in support of the Integrated Resilient Aircraft Control (IRAC) project. The technique "Approximate Stability Margin Analysis of Hybrid Direct-Indirect Adaptive Control" has been developed at NASA ARC to address the need for stability margin metrics for adaptive control, which potentially enables future V&V of adaptive systems. The technique "Direct Adaptive Control With Unknown Actuator Failures" was developed at NASA LaRC to deal with unknown actuator failures. The technique "Adaptive Control with Adaptive Pilot Element" is being researched at NASA LaRC to investigate the effects of pilot interactions with adaptive flight control, which can have implications for stability and performance.

  7. RO1 Funding for Mixed Methods Research: Lessons learned from the Mixed-Method Analysis of Japanese Depression Project

    PubMed Central

    Arnault, Denise Saint; Fetters, Michael D.

    2013-01-01

    Mixed methods research has made significant in-roads in the effort to examine complex health-related phenomena. However, little has been published on the funding of mixed methods research projects. This paper addresses that gap by presenting an example of an NIMH-funded project using a mixed methods QUAL-QUAN triangulation design entitled "The Mixed-Method Analysis of Japanese Depression." We present the Cultural Determinants of Health Seeking model that framed the study, the specific aims, the quantitative and qualitative data sources informing the study, and an overview of the mixing of the two studies. Finally, we examine reviewers' comments and our insights related to writing a mixed methods proposal that was successful in achieving RO1-level funding. PMID:25419196

  8. Item Pocket Method to Allow Response Review and Change in Computerized Adaptive Testing

    ERIC Educational Resources Information Center

    Han, Kyung T.

    2013-01-01

    Most computerized adaptive testing (CAT) programs do not allow test takers to review and change their responses because it could seriously deteriorate the efficiency of measurement and make tests vulnerable to manipulative test-taking strategies. Several modified testing methods have been developed that provide restricted review options while…

  9. Wavelet-based Adaptive Mesh Refinement Method for Global Atmospheric Chemical Transport Modeling

    NASA Astrophysics Data System (ADS)

    Rastigejev, Y.

    2011-12-01

    Numerical modeling of global atmospheric chemical transport presents enormous computational difficulties, associated with simulating a wide range of time and spatial scales. These difficulties are exacerbated by the fact that hundreds of chemical species and thousands of chemical reactions are typically used to describe the chemical kinetic mechanism. These computational requirements very often force researchers to use relatively crude quasi-uniform numerical grids with inadequate spatial resolution that introduce significant numerical diffusion into the system. It was shown that this spurious diffusion significantly distorts the pollutant mixing and transport dynamics for typically used grid resolutions. These numerical difficulties have to be systematically addressed, considering that the demand for fast, high-resolution chemical transport models will be exacerbated over the next decade by the need to interpret satellite observations of tropospheric ozone and related species. In this study we offer a dynamically adaptive multilevel Wavelet-based Adaptive Mesh Refinement (WAMR) method for the numerical modeling of atmospheric chemical evolution equations. The adaptive mesh refinement is performed by adding finer levels of resolution in locations of fine-scale development and removing them in locations of smooth solution behavior. The algorithm is based on the mathematically well-established wavelet theory. This allows us to provide error estimates of the solution that are used in conjunction with an appropriate threshold criterion to adapt the non-uniform grid. Other essential features of the numerical algorithm include: an efficient wavelet spatial discretization that minimizes the number of degrees of freedom for a prescribed accuracy, a fast algorithm for computing wavelet amplitudes, and efficient and accurate derivative approximations on an irregular grid. The method has been tested on a variety of benchmark problems.
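
    The thresholding criterion at the heart of such wavelet-based refinement can be sketched in one dimension as below (using PyWavelets); cells whose detail coefficients exceed a tolerance are flagged for refinement. The full WAMR algorithm is multilevel and multidimensional and is not reproduced here; the tolerance and the coefficient-to-cell mapping are simplified assumptions.

      import numpy as np
      import pywt

      def flag_refinement(u, wavelet="db4", level=4, tol=1e-3):
          """Wavelet-based refinement indicator (1-D sketch): flag cells whose
          detail (wavelet) coefficients exceed a relative threshold."""
          coeffs = pywt.wavedec(u, wavelet, level=level)
          flags = np.zeros(len(u), dtype=bool)
          for lev, d in enumerate(coeffs[1:], start=1):        # detail coefficients per level
              stride = 2 ** (level - lev + 1)                  # cells roughly covered by one coefficient
              for k, c in enumerate(d):
                  if abs(c) > tol * np.max(np.abs(u)):
                      lo = k * stride
                      flags[lo:lo + stride] = True             # refine around this location
          return flags

      # Example: a smooth profile with a sharp front - flags cluster near the front.
      x = np.linspace(0, 1, 512)
      u = np.tanh((x - 0.6) / 0.005) + 0.1 * np.sin(2 * np.pi * x)
      flags = flag_refinement(u)
      print(flags.sum(), "of", len(x), "cells flagged")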

  10. Large project experiences with object-oriented methods and reuse

    NASA Technical Reports Server (NTRS)

    Wessale, William; Reifer, Donald J.; Weller, David

    1992-01-01

    The SSVTF (Space Station Verification and Training Facility) project is completing the Preliminary Design Review of a large software development using object-oriented methods and systematic reuse. An incremental developmental lifecycle was tailored to provide early feedback and guidance on methods and products, with repeated attention to reuse. Object oriented methods were formally taught and supported by realistic examples. Reuse was readily accepted and planned by the developers. Schedule and budget issues were handled by agreements and work sharing arranged by the developers.

  11. SAChES: Scalable Adaptive Chain-Ensemble Sampling.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swiler, Laura Painton; Ray, Jaideep; Ebeida, Mohamed Salah

    We present the development of a parallel Markov Chain Monte Carlo (MCMC) method called SAChES, Scalable Adaptive Chain-Ensemble Sampling. This capability is targeted at Bayesian calibration of computationally expensive simulation models. SAChES involves a hybrid of two methods: Differential Evolution Monte Carlo followed by Adaptive Metropolis. Both methods involve parallel chains. Differential evolution allows one to explore high-dimensional parameter spaces using loosely coupled (i.e., largely asynchronous) chains. Loose coupling allows the use of large chain ensembles, with far more chains than the number of parameters to explore. This reduces the per-chain sampling burden and enables high-dimensional inversions and the use of computationally expensive forward models. The large number of chains can also ameliorate the impact of silent errors, which may affect only a few chains. The chain ensemble can also be sampled to provide an initial condition when an aberrant chain is re-spawned. Adaptive Metropolis takes the best points from the differential evolution and efficiently hones in on the posterior density. The multitude of chains in SAChES is leveraged to (1) enable efficient exploration of the parameter space; and (2) ensure robustness to silent errors, which may be unavoidable in extreme-scale computational platforms of the future. This report outlines SAChES, describes four papers that are the result of the project, and discusses some additional results.
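
    One sweep of the Differential Evolution Monte Carlo stage can be sketched as below (the subsequent Adaptive Metropolis stage and all parallel/asynchronous machinery of SAChES are omitted); the scaling factor and jitter follow the standard DE-MC recipe and are not specific to SAChES.

      import numpy as np

      def demc_step(chains, log_post, gamma=None, eps=1e-4, rng=np.random.default_rng()):
          """One Differential Evolution Monte Carlo sweep over a chain ensemble (sketch).
          Each chain proposes x + gamma*(x_a - x_b) + noise using two other chains,
          then accepts with the usual Metropolis rule."""
          n_chains, dim = chains.shape
          if gamma is None:
              gamma = 2.38 / np.sqrt(2 * dim)            # standard DE-MC scaling
          new = chains.copy()
          logp = np.array([log_post(x) for x in chains])
          for i in range(n_chains):
              a, b = rng.choice([j for j in range(n_chains) if j != i], size=2, replace=False)
              prop = chains[i] + gamma * (chains[a] - chains[b]) + eps * rng.standard_normal(dim)
              lp = log_post(prop)
              if np.log(rng.uniform()) < lp - logp[i]:   # Metropolis acceptance
                  new[i], logp[i] = prop, lp
          return new

      # Example: sample a 2-D standard normal posterior with 10 loosely coupled chains.
      log_post = lambda x: -0.5 * np.sum(x ** 2)
      chains = np.random.default_rng(1).standard_normal((10, 2)) * 3
      for _ in range(2000):
          chains = demc_step(chains, log_post)
      print(chains.mean(axis=0), chains.std(axis=0))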

  12. Electronic-projecting Moire method applying CBR-technology

    NASA Astrophysics Data System (ADS)

    Kuzyakov, O. N.; Lapteva, U. V.; Andreeva, M. A.

    2018-01-01

    An electronic-projecting method based on the Moire effect for examining surface topology is suggested. Conditions for forming Moire fringes and the dependence of their parameters on the reference parameters of the object and virtual grids are analyzed. The control system structure and decision-making subsystem are elaborated. The decision-making subsystem employs CBR technology, based on applying a case base. The approach of analysing and forming a decision for each separate local area, with subsequent formation of a common topology map, is applied.

  13. FMM-Yukawa: An adaptive fast multipole method for screened Coulomb interactions

    NASA Astrophysics Data System (ADS)

    Huang, Jingfang; Jia, Jun; Zhang, Bo

    2009-11-01

    A Fortran program package is introduced for the rapid evaluation of the screened Coulomb interactions of N particles in three dimensions. The method utilizes an adaptive oct-tree structure, and is based on the new version of the fast multipole method in which exponential expansions are used to diagonalize the multipole-to-local translations. The program and its full description, as well as several closely related packages, are also available at http://www.fastmultipole.org/. This paper is a brief review of the program and its performance.
    Catalogue identifier: AEEQ_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEQ_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GPL 2.0
    No. of lines in distributed program, including test data, etc.: 12 385
    No. of bytes in distributed program, including test data, etc.: 79 222
    Distribution format: tar.gz
    Programming language: Fortran77 and Fortran90
    Computer: Any
    Operating system: Any
    RAM: Depends on the number of particles, their distribution, and the adaptive tree structure
    Classification: 4.8, 4.12
    Nature of problem: To evaluate the screened Coulomb potential and force field of N charged particles, and to evaluate a convolution type integral where the Green's function is the fundamental solution of the modified Helmholtz equation.
    Solution method: An adaptive oct-tree is generated, and a new version of the fast multipole method is applied in which the "multipole-to-local" translation operator is diagonalized.
    Restrictions: Only three and six significant digits accuracy options are provided in this version.
    Unusual features: Most of the codes are written in

  14. Projected impacts of climate change on hydrology, water resource use and adaptation needs for the Chu and Talas cross-border rivers basin, Central Asia

    NASA Astrophysics Data System (ADS)

    Iliasov, Shamil; Dolgikh, Svetlana; Lipponen, Annukka; Novikov, Viktor

    2014-05-01

    The observed long-term trends, variability and projections of future climate and hydrology of the Chu and Talas transboundary rivers basin were analysed using a common approach for the Kazakhstan and Kyrgyzstan parts of the basin. Historical, current and forecasted demands and main uses of water in the basin were elaborated through the joint effort of both countries. Such a cooperative approach, combining scientific data and water practitioners' outlook with decision-making needs, allowed for the first time a comprehensive assessment of climate change impacts on water resources in the Chu-Talas transboundary rivers basin, the identification of future needs and the development of an initial set of adaptation measures and recommendations. This work was carried out under the project "Promoting Cooperation to Adapt to Climate Change in the Chu and Talas Transboundary Basin", supported by the United Nations Economic Commission for Europe (UNECE) and the United Nations Development Programme (UNDP). Climate change projections, including air temperatures and rainfall in the 21st century, were determined with a spatial resolution of 0.5 degrees based on the integration of 15 climate change model outputs (derived from IPCC's 4th Assessment Report, and partially the 5th Assessment Report) combined with locally designed hydrology and glacier models. A significant increase in surface air temperatures by 3-6°C may be expected in the basin area, especially in summer and autumn. This change is likely to be accompanied by a rainfall increase during the cold season and a decrease in the warm half of the year. As a result, a deterioration of moisture conditions during the summer-autumn period is possible. Furthermore, milder winters and hotter summers can be expected. Mountains will likely receive more liquid precipitation than snow, while the area and volume of glaciers may significantly reduce. Projected changes in climate and glaciers have implications for river hydrology and different sectors of the economy dependent

  15. Development of a scalable generic platform for adaptive optics real time control

    NASA Astrophysics Data System (ADS)

    Surendran, Avinash; Burse, Mahesh P.; Ramaprakash, A. N.; Parihar, Padmakar

    2015-06-01

    The main objective of the present project is to explore the viability of an adaptive optics control system based exclusively on Field Programmable Gate Arrays (FPGAs), making strong use of their parallel processing capability. In an Adaptive Optics (AO) system, the generation of the Deformable Mirror (DM) control voltages from the Wavefront Sensor (WFS) measurements is usually through the multiplication of the wavefront slopes with a predetermined reconstructor matrix. The ability to access several hundred hard multipliers and memories concurrently in an FPGA allows performance far beyond that of a modern CPU or GPU for tasks with a well-defined structure such as Adaptive Optics control. The target of the current project is to generate a real-time wavefront correction signal from the signals coming from a Wavefront Sensor, where the system is flexible enough to accommodate all current wavefront sensing techniques as well as the different methods used for wavefront compensation. The system should also accommodate different data transmission protocols (like Ethernet, USB, IEEE 1394 etc.) for transmitting data to and from the FPGA device, thus providing a more flexible platform for Adaptive Optics control. Preliminary simulation results for the formulation of the platform and a design of a fully scalable slope computer are presented.
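
    The real-time computation that the FPGA parallelizes is essentially a reconstructor matrix-vector multiply; a plain numpy sketch is given below, with made-up array sizes and a simple leaky-integrator accumulation that are illustrative assumptions rather than the project's actual control law.

      import numpy as np

      def dm_commands(R, slopes, gain=0.5, prev=None):
          """Core real-time AO computation (sketch): DM command = -gain * R @ slopes,
          optionally accumulated as a leaky integrator. On an FPGA the rows of R are
          processed in parallel by the hard multipliers; here it is a plain product."""
          update = -gain * (R @ slopes)
          return update if prev is None else 0.99 * prev + update

      # 40x40 subapertures (x and y slopes) driving a 349-actuator DM - made-up sizes.
      n_slopes, n_act = 2 * 40 * 40, 349
      R = np.random.default_rng(2).standard_normal((n_act, n_slopes)) * 1e-3  # reconstructor
      slopes = np.random.default_rng(3).standard_normal(n_slopes)             # WFS measurements
      v = dm_commands(R, slopes)
      print(v.shape)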

  16. Feasibility of an online adaptive replanning method for cranial frameless intensity-modulated radiosurgery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calvo, Juan Francisco, E-mail: jfcdrr@gmail.com; San José, Sol; Garrido, LLuís

    2013-10-01

    To introduce an approach for online adaptive replanning (i.e., dose-guided radiosurgery) in frameless stereotactic radiosurgery, when a 6-dimensional (6D) robotic couch is not available in the linear accelerator (linac). Cranial radiosurgical treatments are planned in our department using an intensity-modulated technique. Patients are immobilized using a thermoplastic mask. A cone-beam computed tomography (CBCT) scan is acquired after the initial laser-based patient setup (CBCT{sub setup}). The online adaptive replanning procedure we propose consists of a 6D registration-based mapping of the reference plan onto the actual CBCT{sub setup}, followed by a reoptimization of the beam fluences (“6D plan”) to achieve a dosage similar to that originally intended, while the patient is lying on the linac couch and the original beam arrangement is kept. The goodness of the proposed online adaptive method was retrospectively analyzed for 16 patients with 35 targets treated with the CBCT-based frameless intensity-modulated technique. A simulation of the reference plan onto the actual CBCT{sub setup}, according to the 4 degrees of freedom supported by the linac couch, was also generated for each case (4D plan). Target coverage (D99%) and conformity index values of the 6D and 4D plans were compared with the corresponding values of the reference plans. Although the 4D-based approach does not always assure target coverage (D99% between 72% and 103%), the proposed online adaptive method gave perfect coverage in all cases analyzed as well as a conformity index value similar to that planned. The dose-guided radiosurgery approach is effective in assuring the dose coverage and conformity of an intracranial target volume, avoiding resetting the patient inside the mask in a “trial and error” way to remove the pitch and roll errors when a robotic table is not available.

  17. Relationship between Students' Scores on Research Methods and Statistics, and Undergraduate Project Scores

    ERIC Educational Resources Information Center

    Ossai, Peter Agbadobi Uloku

    2016-01-01

    This study examined the relationship between students' scores on Research Methods and Statistics and their undergraduate project scores in the final year. The purpose was to find out whether students matched knowledge of research with project-writing skill. The study adopted an ex post facto correlational design. Scores on Research Methods and Statistics for…

  18. Adaptation of trees, forests and forestry to climate change

    Treesearch

    Daniel J. Chmura; Glenn T. Howe; Paul D. Anderson; Bradley J. St Clair

    2010-01-01

    Ongoing climate change will likely expose trees and forests to new stresses and disturbances during this century. Trees naturally adapt to changes in climate, but their natural adaptive ability may be compromised by the rapid changes projected for this century. In the broad sense, adaptation to climate change also includes the purposeful adaptation of human systems,...

  19. A freestream-preserving fourth-order finite-volume method in mapped coordinates with adaptive-mesh refinement

    DOE PAGES

    Guzik, Stephen M.; Gao, Xinfeng; Owen, Landon D.; ...

    2015-12-20

    We present a fourth-order accurate finite-volume method for solving time-dependent hyperbolic systems of conservation laws on mapped grids that are adaptively refined in space and time. Some novel considerations for formulating the semi-discrete system of equations in computational space are combined with detailed mechanisms for accommodating the adapting grids. Furthermore, these considerations ensure that conservation is maintained and that the divergence of a constant vector field is always zero (freestream-preservation property). The solution in time is advanced with a fourth-order Runge-Kutta method. A series of tests verifies that the expected accuracy is achieved in smooth flows, and the solution of a Mach reflection problem demonstrates the effectiveness of the algorithm in resolving strong discontinuities.

  20. ENVIRONMENTAL METHODS TESTING SITE PROJECT: DATA MANAGEMENT PROCEDURES PLAN

    EPA Science Inventory

    The Environmental Methods Testing Site (EMTS) Data Management Procedures Plan identifies the computer hardware and software resources used in the EMTS project. It identifies the major software packages that are available for use by principal investigators for the analysis of data...