Sample records for computer adaptive short

  1. Measurement properties of the Spinal Cord Injury-Functional Index (SCI-FI) short forms.

    PubMed

    Heinemann, Allen W; Dijkers, Marcel P; Ni, Pengsheng; Tulsky, David S; Jette, Alan

    2014-07-01

    To evaluate the psychometric properties of the Spinal Cord Injury-Functional Index (SCI-FI) short forms (basic mobility, self-care, fine motor, ambulation, manual wheelchair, and power wheelchair) based on internal consistency; correlations between short form, full item bank, and 10-item computer adaptive test versions; magnitude of ceiling and floor effects; and test information functions. Cross-sectional cohort study. Six rehabilitation hospitals in the United States. Individuals with traumatic spinal cord injury (N=855) recruited from 6 national Spinal Cord Injury Model Systems facilities. Not applicable. SCI-FI full item bank, 10-item computer adaptive test, and parallel short form scores. The SCI-FI short forms (with separate versions for individuals with paraplegia and tetraplegia) demonstrate very good internal consistency, group-level reliability, excellent correlations between short forms and scores based on the total item bank, and minimal ceiling and floor effects (except ceiling effects for persons with paraplegia on self-care, fine motor, and power wheelchair ability and floor effects for persons with tetraplegia on self-care, fine motor, and manual wheelchair ability). The test information functions are acceptable across the range of scores where most persons in the sample performed. Clinicians and researchers should consider the SCI-FI short forms when computer adaptive testing is not feasible. Copyright © 2014 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
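
    The "test information functions" and the 10-item computer adaptive test mentioned above are standard item response theory (IRT) machinery. As a hedged illustration only (not the SCI-FI implementation, whose response model and item parameters are not given here), the sketch below computes 2PL item information and picks the next item by maximum information, the usual CAT selection rule; all parameter values are made up.

    ```python
    import numpy as np

    def item_information(theta, a, b):
        """Fisher information of a 2PL item at ability theta: a^2 * P * (1 - P)."""
        p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
        return a ** 2 * p * (1.0 - p)

    def select_next_item(theta_hat, a, b, administered):
        """Pick the unadministered item with maximum information at the current ability estimate."""
        info = item_information(theta_hat, a, b)
        info[list(administered)] = -np.inf
        return int(np.argmax(info))

    # Hypothetical 10-item bank (parameters are made up for illustration)
    rng = np.random.default_rng(0)
    a = rng.uniform(0.8, 2.0, size=10)    # discriminations
    b = rng.uniform(-2.0, 2.0, size=10)   # difficulties
    print(select_next_item(theta_hat=0.0, a=a, b=b, administered={3, 7}))
    ```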

  2. Senior Adults and Computers in the 1990s.

    ERIC Educational Resources Information Center

    Lawhon, Tommie; And Others

    1996-01-01

    Older adults use computers for entertainment, education, and creative and business endeavors. Computer training helps them increase productivity, learn skills, and boost short-term memory. Electronic mail, online services, and the Internet encourage socialization. Adapted technology helps disabled and ill elders use computers. (SK)

  3. Smart Swarms of Bacteria-Inspired Agents with Performance Adaptable Interactions

    PubMed Central

    Shklarsh, Adi; Ariel, Gil; Schneidman, Elad; Ben-Jacob, Eshel

    2011-01-01

    Collective navigation and swarming have been studied in animal groups, such as fish schools, bird flocks, bacteria, and slime molds. Computer modeling has shown that collective behavior of simple agents can result from simple interactions between the agents, which include short range repulsion, intermediate range alignment, and long range attraction. Here we study collective navigation of bacteria-inspired smart agents in complex terrains, with adaptive interactions that depend on performance. More specifically, each agent adjusts its interactions with the other agents according to its local environment – by decreasing the peers' influence while navigating in a beneficial direction, and increasing it otherwise. We show that inclusion of such performance dependent adaptable interactions significantly improves the collective swarming performance, leading to highly efficient navigation, especially in complex terrains. Notably, to afford such adaptable interactions, each modeled agent requires only simple computational capabilities with short-term memory, which can easily be implemented in simple swarming robots. PMID:21980274
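
    A hedged sketch of the performance-adaptable interaction rule described above, under simplifying assumptions of my own (2-D agents, a scalar "nutrient" field, a linear blending weight); it is not the authors' model, but it shows how peer influence can be down-weighted when an agent's own heading already follows the beneficial direction and up-weighted otherwise.

    ```python
    import numpy as np

    def swarm_step(pos, vel, grad, dt=0.1, radius=1.0):
        """One update of N agents; pos and vel are (N, 2) arrays, grad(p) returns the
        local nutrient gradient at point p. Peer influence shrinks when an agent is
        already moving up the gradient and grows otherwise."""
        new_vel = vel.copy()
        for i in range(pos.shape[0]):
            g = grad(pos[i])
            own = g / (np.linalg.norm(g) + 1e-9)                  # beneficial direction
            near = np.linalg.norm(pos - pos[i], axis=1) < radius  # neighbours (includes self)
            peers = vel[near].mean(axis=0)
            peers = peers / (np.linalg.norm(peers) + 1e-9)
            progress = np.dot(vel[i], own) / (np.linalg.norm(vel[i]) + 1e-9)
            w = 0.5 * (1.0 - progress)            # low peer weight when progress is good
            d = (1.0 - w) * own + w * peers
            new_vel[i] = d / (np.linalg.norm(d) + 1e-9)
        return pos + dt * new_vel, new_vel
    ```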

  4. Smart swarms of bacteria-inspired agents with performance adaptable interactions.

    PubMed

    Shklarsh, Adi; Ariel, Gil; Schneidman, Elad; Ben-Jacob, Eshel

    2011-09-01

    Collective navigation and swarming have been studied in animal groups, such as fish schools, bird flocks, bacteria, and slime molds. Computer modeling has shown that collective behavior of simple agents can result from simple interactions between the agents, which include short range repulsion, intermediate range alignment, and long range attraction. Here we study collective navigation of bacteria-inspired smart agents in complex terrains, with adaptive interactions that depend on performance. More specifically, each agent adjusts its interactions with the other agents according to its local environment--by decreasing the peers' influence while navigating in a beneficial direction, and increasing it otherwise. We show that inclusion of such performance dependent adaptable interactions significantly improves the collective swarming performance, leading to highly efficient navigation, especially in complex terrains. Notably, to afford such adaptable interactions, each modeled agent requires only simple computational capabilities with short-term memory, which can easily be implemented in simple swarming robots.

  5. Developing an item bank and short forms that assess the impact of asthma on quality of life.

    PubMed

    Stucky, Brian D; Edelen, Maria Orlando; Sherbourne, Cathy D; Eberhart, Nicole K; Lara, Marielena

    2014-02-01

    The present work describes the process of developing an item bank and short forms that measure the impact of asthma on quality of life (QoL) that avoids confounding QoL with asthma symptomatology and functional impairment. Using a diverse national sample of adults with asthma (N = 2032) we conducted exploratory and confirmatory factor analyses, and item response theory and differential item functioning analyses to develop a 65-item unidimensional item bank and separate short form assessments. A psychometric evaluation of the RAND Impact of Asthma on QoL item bank (RAND-IAQL) suggests that though the concept of asthma impact on QoL is multi-faceted, it may be measured as a single underlying construct. The performance of the bank was then evaluated with a real-data simulated computer adaptive test. From the RAND-IAQL item bank we then developed two short forms consisting of 4 and 12 items (reliability = 0.86 and 0.93, respectively). A real-data simulated computer adaptive test suggests that as few as 4-5 items from the bank are needed to obtain highly precise scores. Preliminary validity results indicate that the RAND-IAQL measures distinguish between levels of asthma control. To measure the impact of asthma on QoL, users of these items may choose from two highly reliable short forms, computer adaptive test administration, or content-specific subsets of items from the bank tailored to their specific needs. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Neural Computations in a Dynamical System with Multiple Time Scales.

    PubMed

    Mi, Yuanyuan; Lin, Xiaohan; Wu, Si

    2016-01-01

    Neural systems display rich short-term dynamics at various levels, e.g., spike-frequency adaptation (SFA) at the single-neuron level, and short-term facilitation (STF) and depression (STD) at the synapse level. These dynamical features typically cover a broad range of time scales and exhibit large diversity in different brain regions. It remains unclear what computational benefit the brain gains from having such variability in short-term dynamics. In this study, we propose that the brain can exploit such dynamical features to implement multiple seemingly contradictory computations in a single neural circuit. To demonstrate this idea, we use a continuous attractor neural network (CANN) as a working model and include STF, SFA and STD with increasing time constants in its dynamics. Three computational tasks are considered, which are persistent activity, adaptation, and anticipative tracking. These tasks require conflicting neural mechanisms, and hence cannot be implemented by a single dynamical feature or any combination with similar time constants. However, with properly coordinated STF, SFA and STD, we show that the network is able to implement the three computational tasks concurrently. We hope this study will shed light on the understanding of how the brain orchestrates its rich dynamics at various levels to realize diverse cognitive functions.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koniges, A.E.; Craddock, G.G.; Schnack, D.D.

    The purpose of the workshop was to assemble workers, both within and outside of the fusion-related computations areas, for discussion regarding the issues of dynamically adaptive gridding. There were three invited talks related to adaptive gridding application experiences in various related fields of computational fluid dynamics (CFD), and nine short talks reporting on the progress of adaptive techniques in the specific areas of scrape-off-layer (SOL) modeling and magnetohydrodynamic (MHD) stability. Adaptive mesh methods have been successful in a number of diverse fields of CFD for over a decade. The method involves dynamic refinement of computed field profiles in a way that disperses uniformly the numerical errors associated with discrete approximations. Because the process optimizes computational effort, adaptive mesh methods can be used to study otherwise intractable physical problems that involve complex boundary shapes or multiple spatial/temporal scales. Recent results indicate that these adaptive techniques will be required for tokamak fluid-based simulations involving diverted tokamak SOL modeling and MHD simulation problems related to the highest priority ITER-relevant issues. Individual papers are indexed separately on the energy databases.

  8. A computational cognitive model of syntactic priming.

    PubMed

    Reitter, David; Keller, Frank; Moore, Johanna D

    2011-01-01

    The psycholinguistic literature has identified two syntactic adaptation effects in language production: rapidly decaying short-term priming and long-lasting adaptation. To explain both effects, we present an ACT-R model of syntactic priming based on a wide-coverage, lexicalized syntactic theory that explains priming as facilitation of lexical access. In this model, two well-established ACT-R mechanisms, base-level learning and spreading activation, account for long-term adaptation and short-term priming, respectively. Our model simulates incremental language production and in a series of modeling studies, we show that it accounts for (a) the inverse frequency interaction; (b) the absence of a decay in long-term priming; and (c) the cumulativity of long-term adaptation. The model also explains the lexical boost effect and the fact that it only applies to short-term priming. We also present corpus data that verify a prediction of the model, that is, that the lexical boost affects all lexical material, rather than just heads. Copyright © 2011 Cognitive Science Society, Inc.

  9. Free energy landscapes of short peptide chains using adaptively biased molecular dynamics

    NASA Astrophysics Data System (ADS)

    Karpusenka, Vadzim; Babin, Volodymyr; Roland, Christopher; Sagui, Celeste

    2009-03-01

    We present the results of a computational study of the free energy landscapes of short polypeptide chains, as a function of several reaction coordinates meant to distinguish between several known types of helices. The free energy landscapes were calculated using the recently developed adaptively biased molecular dynamics method followed up with equilibrium "umbrella correction" runs. Specific polypeptides investigated include small chains of pure and mixed alanine, glutamate, leucine, lysine and methionine (all amino acids with strong helix-forming propensities), as well as glycine and proline (which have low helix-forming propensities), tyrosine, serine and arginine. Our results are consistent with the existing experimental and other theoretical evidence.

  10. Towards psychologically adaptive brain-computer interfaces

    NASA Astrophysics Data System (ADS)

    Myrden, A.; Chau, T.

    2016-12-01

    Objective. Brain-computer interface (BCI) performance is sensitive to short-term changes in psychological states such as fatigue, frustration, and attention. This paper explores the design of a BCI that can adapt to these short-term changes. Approach. Eleven able-bodied individuals participated in a study during which they used a mental task-based EEG-BCI to play a simple maze navigation game while self-reporting their perceived levels of fatigue, frustration, and attention. In an offline analysis, a regression algorithm was trained to predict changes in these states, yielding Pearson correlation coefficients in excess of 0.45 between the self-reported and predicted states. Two means of fusing the resultant mental state predictions with mental task classification were investigated. First, single-trial mental state predictions were used to predict correct classification by the BCI during each trial. Second, an adaptive BCI was designed that retrained a new classifier for each testing sample using only those training samples for which predicted mental state was similar to that predicted for the current testing sample. Main results. Mental state-based prediction of BCI reliability exceeded chance levels. The adaptive BCI exhibited significant, but practically modest, increases in classification accuracy for five of 11 participants and no significant difference for the remaining six despite a smaller average training set size. Significance. Collectively, these findings indicate that adaptation to psychological state may allow the design of more accurate BCIs.
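
    A hedged sketch of the second fusion strategy described above: for each test trial, the task classifier is retrained on only the most state-similar training trials. The feature arrays, the use of LDA, and the neighbourhood size k are illustrative assumptions, not details from the paper.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    def adaptive_predict(X_train, y_train, state_train, X_test, state_test, k=100):
        """Retrain per test trial on the k training trials with the most similar mental state.
        Assumes both task classes remain represented in every k-trial subset."""
        preds = []
        for x, s in zip(X_test, state_test):
            nearest = np.argsort(np.abs(state_train - s))[:k]   # most state-similar training trials
            clf = LinearDiscriminantAnalysis().fit(X_train[nearest], y_train[nearest])
            preds.append(clf.predict(x[None, :])[0])
        return np.array(preds)
    ```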

  11. A fast adaptive convex hull algorithm on two-dimensional processor arrays with a reconfigurable BUS system

    NASA Technical Reports Server (NTRS)

    Olariu, S.; Schwing, J.; Zhang, J.

    1991-01-01

    A bus system that can change dynamically to suit computational needs is referred to as reconfigurable. We present a fast adaptive convex hull algorithm on a two-dimensional processor array with a reconfigurable bus system (2-D PARBS, for short). Specifically, we show that computing the convex hull of a planar set of n points takes O(log n/log m) time on a 2-D PARBS of size mn x n with 3 ≤ m ≤ n. Our result implies that the convex hull of n points in the plane can be computed in O(1) time on a 2-D PARBS of size n^1.5 x n.

  12. Target Identification Using Harmonic Wavelet Based ISAR Imaging

    NASA Astrophysics Data System (ADS)

    Shreyamsha Kumar, B. K.; Prabhakar, B.; Suryanarayana, K.; Thilagavathi, V.; Rajagopal, R.

    2006-12-01

    A new approach has been proposed to reduce the computations involved in the ISAR imaging, which uses harmonic wavelet-(HW) based time-frequency representation (TFR). Since the HW-based TFR falls into a category of nonparametric time-frequency (T-F) analysis tool, it is computationally efficient compared to parametric T-F analysis tools such as adaptive joint time-frequency transform (AJTFT), adaptive wavelet transform (AWT), and evolutionary AWT (EAWT). Further, the performance of the proposed method of ISAR imaging is compared with the ISAR imaging by other nonparametric T-F analysis tools such as short-time Fourier transform (STFT) and Choi-Williams distribution (CWD). In the ISAR imaging, the use of HW-based TFR provides similar/better results with significant (92%) computational advantage compared to that obtained by CWD. The ISAR images thus obtained are identified using a neural network-based classification scheme with feature set invariant to translation, rotation, and scaling.

  13. Very-long-term and short-term chromatic adaptation: are their influences cumulative?

    PubMed

    Belmore, Suzanne C; Shevell, Steven K

    2011-02-09

    Very-long-term (VLT) chromatic adaptation results from exposure to an altered chromatic environment for days or weeks. Color shifts from VLT adaptation are observed hours or days after leaving the altered environment. Short-term chromatic adaptation, on the other hand, results from exposure for a few minutes or less, with color shifts measured within seconds or a few minutes after the adapting light is extinguished; recovery to the pre-adapted state is complete in less than an hour. Here, both types of adaptation were combined. All adaptation was to reddish-appearing long-wavelength light. Shifts in unique yellow were measured following adaptation. Previous studies demonstrate shifts in unique yellow due to VLT chromatic adaptation, but shifts from short-term chromatic adaptation to comparable adapting light can be far greater than from VLT adaptation. The question considered here is whether the color shifts from VLT adaptation are cumulative with large shifts from short-term adaptation or whether, alternatively, simultaneous short-term adaptation eliminates the color shifts caused by VLT adaptation. The results show the color shifts from VLT and short-term adaptation together are cumulative, which indicates that both short-term and very-long-term chromatic adaptation affect color perception during natural viewing. Copyright © 2010 Elsevier Ltd. All rights reserved.

  14. SWAT system performance predictions

    NASA Astrophysics Data System (ADS)

    Parenti, Ronald R.; Sasiela, Richard J.

    1993-03-01

    In the next phase of Lincoln Laboratory's SWAT (Short-Wavelength Adaptive Techniques) program, the performance of a 241-actuator adaptive-optics system will be measured using a variety of synthetic-beacon geometries. As an aid in this experimental investigation, a detailed set of theoretical predictions has also been assembled. The computational tools that have been applied in this study include a numerical approach in which Monte-Carlo ray-trace simulations of accumulated phase error are developed, and an analytical analysis of the expected system behavior. This report describes the basis of these two computational techniques and compares their estimates of overall system performance. Although their regions of applicability tend to be complementary rather than redundant, good agreement is usually obtained when both sets of results can be derived for the same engagement scenario.

  15. Computational model for behavior shaping as an adaptive health intervention strategy.

    PubMed

    Berardi, Vincent; Carretero-González, Ricardo; Klepeis, Neil E; Ghanipoor Machiani, Sahar; Jahangiri, Arash; Bellettiere, John; Hovell, Melbourne

    2018-03-01

    Adaptive behavioral interventions that automatically adjust in real-time to participants' changing behavior, environmental contexts, and individual history are becoming more feasible as the use of real-time sensing technology expands. This development is expected to address shortcomings associated with traditional behavioral interventions, such as the reliance on imprecise intervention procedures and limited/short-lived effects. However, just-in-time adaptive intervention (JITAI) adaptation strategies often lack a theoretical foundation. Increasing the theoretical fidelity of a trial has been shown to increase effectiveness. This research explores the use of shaping, a well-known process from behavioral theory for engendering or maintaining a target behavior, as a JITAI adaptation strategy. A computational model of behavior dynamics and operant conditioning was modified to incorporate the construct of behavior shaping by adding the ability to vary, over time, the range of behaviors that were reinforced when emitted. Digital experiments were performed with this updated model for a range of parameters in order to identify the behavior shaping features that optimally generated target behavior. Narrowing the range of reinforced behaviors continuously in time led to better outcomes compared with a discrete narrowing of the reinforcement window. Rapid narrowing followed by more moderate decreases in window size was more effective in generating target behavior than the inverse scenario. The computational shaping model represents an effective tool for investigating JITAI adaptation strategies. Model parameters must now be translated from the digital domain to real-world experiments so that model findings can be validated.
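
    A toy sketch of behavior shaping in the spirit of the study, under assumptions of my own (scalar behavior, Gaussian emission noise, a linear pull toward reinforced emissions); it is not the authors' model, but it contrasts continuous narrowing of the reinforcement window with discrete narrowing, which is the comparison the abstract describes.

    ```python
    import numpy as np

    def shape(target=10.0, steps=2000, continuous=True, seed=0):
        """Return the final operant level after shaping toward `target`."""
        rng = np.random.default_rng(seed)
        level, width0 = 0.0, 10.0
        for t in range(steps):
            frac = t / steps
            # reinforcement window half-width: continuous vs. discrete (4-step) narrowing
            width = width0 * (1 - frac) if continuous else width0 * (1 - np.floor(frac * 4) / 4)
            emitted = level + rng.normal(0.0, 1.0)
            if abs(emitted - target) <= width:        # emission falls inside the window: reinforce
                level += 0.05 * (emitted - level)     # operant level drifts toward reinforced emissions
        return level

    print(shape(continuous=True), shape(continuous=False))
    ```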

  16. Impedance computed tomography using an adaptive smoothing coefficient algorithm.

    PubMed

    Suzuki, A; Uchiyama, A

    2001-01-01

    In impedance computed tomography, a fixed coefficient regularization algorithm has been frequently used to mitigate the ill-conditioning problem of the Newton-Raphson algorithm. However, a lot of experimental data and a long computation time are needed to determine a good smoothing coefficient, because it has to be manually chosen from a number of candidates and remains constant across iterations. Thus, the fixed coefficient regularization algorithm sometimes distorts the information or has no effect. In this paper, a new adaptive smoothing coefficient algorithm is proposed. This algorithm automatically calculates the smoothing coefficient from the eigenvalue of the ill-conditioned matrix. Therefore, effective images can be obtained within a short computation time. The smoothing coefficient is also automatically adjusted according to the actual resistivity distribution and the data collection method. In our impedance system, we have reconstructed the resistivity distributions of two phantoms using this algorithm. As a result, this algorithm only needs one-fifth the computation time of the fixed coefficient regularization algorithm. The image is obtained more rapidly, making the method applicable to real-time monitoring of blood vessels.
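
    A hedged numerical sketch of the general idea (not the paper's exact formula): one regularized Gauss-Newton update in which the smoothing coefficient is derived from the spectrum of the ill-conditioned matrix J^T J rather than hand-tuned; the fraction `frac` is an illustrative assumption.

    ```python
    import numpy as np

    def adaptive_gn_step(rho, jacobian, residual, frac=1e-3):
        """One regularized Gauss-Newton update of the resistivity vector rho.
        jacobian(rho) returns J; residual(rho) returns measured-minus-modeled voltages."""
        J = jacobian(rho)
        r = residual(rho)
        JtJ = J.T @ J
        lam = frac * np.linalg.eigvalsh(JtJ).max()   # smoothing coefficient taken from the spectrum
        delta = np.linalg.solve(JtJ + lam * np.eye(JtJ.shape[0]), J.T @ r)
        return rho + delta
    ```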

  17. Study of aircraft in intraurban transportation systems, volume 1

    NASA Technical Reports Server (NTRS)

    Stout, E. G.; Kesling, P. H.; Matteson, H. C.; Sherwood, D. E.; Tuck, W. R., Jr.; Vaughn, L. A.

    1971-01-01

    An analysis of an effective short range, high density commuter transportation system for intraurban use is presented. The seven-county Detroit, Michigan, metropolitan area was chosen as the scenario for the analysis. The study consisted of an analysis and forecast of the Detroit market through 1985, a parametric analysis of appropriate short haul aircraft concepts and associated ground systems, and a preliminary overall economic analysis of a simplified total system designed to evaluate the candidate vehicles and select the most promising VTOL and STOL aircraft. Data are also included on the impact of advanced technology on the system, the sensitivity of mission performance to changes in aircraft characteristics and system operations, and identification of key problem areas that may be improved by additional research. The approach, logic, and computer models used are adaptable to other intraurban or interurban areas.

  18. Adaptive Time Stepping for Transient Network Flow Simulation in Rocket Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Majumdar, Alok K.; Ravindran, S. S.

    2017-01-01

    Fluid and thermal transients found in rocket propulsion systems, such as propellant feedline systems, are complex processes involving fast phases followed by slow phases. Therefore, their time-accurate computation requires the use of a short time step initially, followed by much larger time steps. Yet there are instances that involve fast-slow-fast phases. In this paper, we present a feedback control based adaptive time stepping algorithm and discuss its use in network flow simulation of fluid and thermal transients. The time step is automatically controlled during the simulation by monitoring changes in certain key variables and by feedback. In order to demonstrate the viability of time adaptivity for engineering problems, we applied it to simulate water hammer and cryogenic chilldown in pipelines. Our comparison and validation demonstrate the accuracy and efficiency of this adaptive strategy.
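
    A hedged sketch of feedback-controlled time stepping in its simplest "elementary controller" form (not necessarily the authors' exact rule): the step is scaled by the ratio of the tolerance to the monitored change in key variables, clamped to avoid abrupt jumps.

    ```python
    def next_dt(dt, change, tol, p=0.5, grow=2.0, shrink=0.2):
        """Scale the time step from the monitored relative change in key variables."""
        factor = (tol / max(change, 1e-12)) ** p
        return dt * min(grow, max(shrink, factor))

    print(next_dt(dt=1e-3, change=0.2, tol=0.01))    # fast transient -> smaller step
    print(next_dt(dt=1e-3, change=1e-4, tol=0.01))   # slow phase -> larger step
    ```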

  19. A short note on the use of the red-black tree in Cartesian adaptive mesh refinement algorithms

    NASA Astrophysics Data System (ADS)

    Hasbestan, Jaber J.; Senocak, Inanc

    2017-12-01

    Mesh adaptivity is an indispensable capability to tackle multiphysics problems with large disparity in time and length scales. With the availability of powerful supercomputers, there is a pressing need to extend time-proven computational techniques to extreme-scale problems. Cartesian adaptive mesh refinement (AMR) is one such method that enables simulation of multiscale, multiphysics problems. AMR is based on construction of octrees. Originally, an explicit tree data structure was used to generate and manipulate an adaptive Cartesian mesh. At least eight pointers are required in an explicit approach to construct an octree. Parent-child relationships are then used to traverse the tree. An explicit octree, however, is expensive in terms of memory usage and the time it takes to traverse the tree to access a specific node. For these reasons, implicit pointerless methods have been pioneered within the computer graphics community, motivated by applications requiring interactivity and realistic three dimensional visualization. Lewiner et al. [1] provides a concise review of pointerless approaches to generate an octree. Use of a hash table and Z-order curve are two key concepts in pointerless methods that we briefly discuss next.
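
    As a concrete illustration of the Z-order-curve concept mentioned at the end of the abstract, the sketch below computes a Morton key by bit interleaving; in a pointerless octree such keys, together with the refinement level, index node data in a hash table. This is a generic sketch, not code from the note.

    ```python
    def morton3d(x, y, z, bits=10):
        """Interleave the bits of integer cell coordinates into a single Z-order key."""
        key = 0
        for i in range(bits):
            key |= ((x >> i) & 1) << (3 * i)       # x bit
            key |= ((y >> i) & 1) << (3 * i + 1)   # y bit
            key |= ((z >> i) & 1) << (3 * i + 2)   # z bit
        return key

    # Pointerless storage: node data keyed by (level, Morton key) in a hash table
    octree = {(4, morton3d(3, 5, 1)): {"refined": False}}
    ```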

  20. Macintosh/LabVIEW based control and data acquisition system for a single photon counting fluorometer

    NASA Astrophysics Data System (ADS)

    Stryjewski, Wieslaw J.

    1991-08-01

    A flexible software system has been developed for controlling fluorescence decay measurements using the virtual instrument approach offered by LabVIEW. The time-correlated single photon counting instrument operates under computer control in both manual and automatic mode. Implementation time was short and the equipment is now easier to use, reducing the training time required for new investigators. It is not difficult to customize the front panel or adapt the program to a different instrument. We found LabVIEW much more convenient to use for this application than traditional, textual computer languages.

  1. Developing Item Response Theory-Based Short Forms to Measure the Social Impact of Burn Injuries.

    PubMed

    Marino, Molly E; Dore, Emily C; Ni, Pengsheng; Ryan, Colleen M; Schneider, Jeffrey C; Acton, Amy; Jette, Alan M; Kazis, Lewis E

    2018-03-01

    To develop self-reported short forms for the Life Impact Burn Recovery Evaluation (LIBRE) Profile. Short forms based on the item parameters of discrimination and average difficulty. A support network for burn survivors, peer support networks, social media, and mailings. Burn survivors (N=601) older than 18 years. Not applicable. The LIBRE Profile. Ten-item short forms were developed to cover the 6 LIBRE Profile scales: Relationships with Family & Friends, Social Interactions, Social Activities, Work & Employment, Romantic Relationships, and Sexual Relationships. Ceiling effects were ≤15% for all scales; floor effects were <1% for all scales. The marginal reliability of the short forms ranged from .85 to .89. The LIBRE Profile-Short Forms demonstrated credible psychometric properties. The short form version provides a viable alternative to administering the LIBRE Profile when resources do not allow computer or Internet access. The full item bank, computerized adaptive test, and short forms are all scored along the same metric, and therefore scores are comparable regardless of the mode of administration. Copyright © 2017 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  2. Human physiological adaptation to extended Space Flight and its implications for Space Station

    NASA Technical Reports Server (NTRS)

    Kutyna, F. A.; Shumate, W. H.

    1985-01-01

    Current work evaluating short-term space flight physiological data on the homeostatic changes due to weightlessness is presented as a means of anticipating Space Station long-term effects. An integrated systems analysis of current data shows a vestibulo-sensory adaptation within days; a loss of body mass, fluids, and electrolytes, stabilizing in a month; and a loss in red cell mass over a month. But bone demineralization, which did not level off, is seen as the biggest concern. Computer algorithms have been developed to simulate the human adaptation to weightlessness. So far these paradigms have been backed up by flight data, and it is hoped that they will provide valuable information for future Space Station design. A series of explanatory schematics is attached.

  3. Polynomial Phase Estimation Based on Adaptive Short-Time Fourier Transform

    PubMed Central

    Jing, Fulong; Zhang, Chunjie; Si, Weijian; Wang, Yu; Jiao, Shuhong

    2018-01-01

    Polynomial phase signals (PPSs) have numerous applications in many fields including radar, sonar, geophysics, and radio communication systems. Therefore, estimation of PPS coefficients is very important. In this paper, a novel approach for PPS parameters estimation based on adaptive short-time Fourier transform (ASTFT), called the PPS-ASTFT estimator, is proposed. Using the PPS-ASTFT estimator, both one-dimensional and multi-dimensional searches and error propagation problems, which widely exist in the PPS field, are avoided. In the proposed algorithm, the instantaneous frequency (IF) is estimated by S-transform (ST), which can preserve information on signal phase and provide a variable resolution similar to the wavelet transform (WT). The width of the ASTFT analysis window is equal to the local stationary length, which is measured by the instantaneous frequency gradient (IFG). The IFG is calculated by principal component analysis (PCA), which is robust to noise. Moreover, to improve estimation accuracy, a refinement strategy is presented to estimate signal parameters. Since the PPS-ASTFT avoids parameter search, the proposed algorithm can be computed in a reasonable amount of time. The estimation performance, computational cost, and implementation of the PPS-ASTFT are also analyzed. The conducted numerical simulations support our theoretical results and demonstrate an excellent statistical performance of the proposed algorithm. PMID:29438317
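
    A hedged sketch of the adaptive-window idea only (it uses a plain FFT peak rather than the S-transform, and a normalized IF gradient rather than the paper's PCA-based estimate): the analysis window is shortened where the estimated instantaneous frequency changes quickly and lengthened where the signal is locally stationary.

    ```python
    import numpy as np

    def stft_peak_freq(x, fs, center, win_len):
        """Peak frequency of a Hann-windowed segment centred at sample `center`."""
        lo, hi = max(0, center - win_len // 2), min(len(x), center + win_len // 2)
        seg = x[lo:hi] * np.hanning(hi - lo)
        spec = np.abs(np.fft.rfft(seg))
        return np.fft.rfftfreq(hi - lo, 1.0 / fs)[np.argmax(spec)]

    def adaptive_if(x, fs, coarse_win=256, min_win=64, max_win=512):
        """Coarse IF first, then per-instant windows shortened where the IF changes fast."""
        centers = np.arange(coarse_win, len(x) - coarse_win, coarse_win // 4)
        coarse = np.array([stft_peak_freq(x, fs, c, coarse_win) for c in centers])
        g = np.abs(np.gradient(coarse))
        g = g / (g.max() + 1e-12)                               # normalized local IF gradient
        wins = (max_win - (max_win - min_win) * g).astype(int)  # short window where IF varies fast
        return centers, np.array([stft_peak_freq(x, fs, c, int(w)) for c, w in zip(centers, wins)])
    ```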

  4. Polynomial Phase Estimation Based on Adaptive Short-Time Fourier Transform.

    PubMed

    Jing, Fulong; Zhang, Chunjie; Si, Weijian; Wang, Yu; Jiao, Shuhong

    2018-02-13

    Polynomial phase signals (PPSs) have numerous applications in many fields including radar, sonar, geophysics, and radio communication systems. Therefore, estimation of PPS coefficients is very important. In this paper, a novel approach for PPS parameters estimation based on adaptive short-time Fourier transform (ASTFT), called the PPS-ASTFT estimator, is proposed. Using the PPS-ASTFT estimator, both one-dimensional and multi-dimensional searches and error propagation problems, which widely exist in the PPS field, are avoided. In the proposed algorithm, the instantaneous frequency (IF) is estimated by S-transform (ST), which can preserve information on signal phase and provide a variable resolution similar to the wavelet transform (WT). The width of the ASTFT analysis window is equal to the local stationary length, which is measured by the instantaneous frequency gradient (IFG). The IFG is calculated by principal component analysis (PCA), which is robust to noise. Moreover, to improve estimation accuracy, a refinement strategy is presented to estimate signal parameters. Since the PPS-ASTFT avoids parameter search, the proposed algorithm can be computed in a reasonable amount of time. The estimation performance, computational cost, and implementation of the PPS-ASTFT are also analyzed. The conducted numerical simulations support our theoretical results and demonstrate an excellent statistical performance of the proposed algorithm.

  5. Development of Shunt-Type Three-Phase Active Power Filter with Novel Adaptive Control for Wind Generators

    PubMed Central

    2015-01-01

    This paper proposes a new adaptive filter for wind generators that combines instantaneous reactive power compensation technology and current prediction controller, and therefore this system is characterized by low harmonic distortion, high power factor, and small DC-link voltage variations during load disturbances. The performance of the system was first simulated using MATLAB/Simulink, and the possibility of an adaptive digital low-pass filter eliminating current harmonics was confirmed in steady and transient states. Subsequently, a digital signal processor was used to implement an active power filter. The experimental results indicate that, for the rated operation of 2 kVA, the system has a total harmonic distortion of current less than 5.0% and a power factor of 1.0 on the utility side. Thus, the transient performance of the adaptive filter is superior to the traditional digital low-pass filter and is more economical because of its short computation time compared with other types of adaptive filters. PMID:26451391

  6. Development of Shunt-Type Three-Phase Active Power Filter with Novel Adaptive Control for Wind Generators.

    PubMed

    Chen, Ming-Hung

    2015-01-01

    This paper proposes a new adaptive filter for wind generators that combines instantaneous reactive power compensation technology and current prediction controller, and therefore this system is characterized by low harmonic distortion, high power factor, and small DC-link voltage variations during load disturbances. The performance of the system was first simulated using MATLAB/Simulink, and the possibility of an adaptive digital low-pass filter eliminating current harmonics was confirmed in steady and transient states. Subsequently, a digital signal processor was used to implement an active power filter. The experimental results indicate that, for the rated operation of 2 kVA, the system has a total harmonic distortion of current less than 5.0% and a power factor of 1.0 on the utility side. Thus, the transient performance of the adaptive filter is superior to the traditional digital low-pass filter and is more economical because of its short computation time compared with other types of adaptive filters.

  7. A fast 4D cone beam CT reconstruction method based on the OSC-TV algorithm.

    PubMed

    Mascolo-Fortin, Julia; Matenine, Dmitri; Archambault, Louis; Després, Philippe

    2018-01-01

    Four-dimensional cone beam computed tomography allows for temporally resolved imaging with useful applications in radiotherapy, but raises particular challenges in terms of image quality and computation time. The purpose of this work is to develop a fast and accurate 4D algorithm by adapting a GPU-accelerated ordered subsets convex algorithm (OSC), combined with the total variation minimization regularization technique (TV). Different initialization schemes were studied to adapt the OSC-TV algorithm to 4D reconstruction: each respiratory phase was initialized either with a 3D reconstruction or a blank image. Reconstruction algorithms were tested on a dynamic numerical phantom and on a clinical dataset. 4D iterations were implemented for a cluster of 8 GPUs. All developed methods allowed for an adequate visualization of the respiratory movement and compared favorably to the McKinnon-Bates and adaptive steepest descent projection onto convex sets algorithms, while the 4D reconstructions initialized from a prior 3D reconstruction led to better overall image quality. The most suitable adaptation of OSC-TV to 4D CBCT was found to be a combination of a prior FDK reconstruction and a 4D OSC-TV reconstruction with a reconstruction time of 4.5 minutes. This relatively short reconstruction time could facilitate a clinical use.

  8. The Quantized Geometry of Visual Space: The Coherent Computation of Depth, Form, and Lightness. Revised Version.

    DTIC Science & Technology

    1982-08-01

    …binocular rivalry, reflectance rivalry, Fechner's paradox, decrease of threshold contrast with increased number of cycles in a grating pattern, hysteresis, adaptation level tuning, Weber law modulation, shift of sensitivity with background luminance, and the finite capacity of visual short-term memory are discussed in terms of a small set of …

  9. CALL in a Climate of Change: Adapting to Turbulent Global Conditions. Short Papers from EUROCALL 2017 (25th, Southampton, United Kingdom, August 23-26, 2017)

    ERIC Educational Resources Information Center

    Borthwick, Kate, Ed.; Bradley, Linda, Ed.; Thouësny, Sylvie, Ed.

    2017-01-01

    The 25th European Association of Computer-Assisted Language Learning (EUROCALL) conference was hosted by Modern Languages and Linguistics at the University of Southampton, in the United Kingdom, from the 23rd to the 26th of August 2017. The theme of the conference was "CALL in a climate of change." The theme encompassed the notion of how…

  10. Adaptation and implementation of standardized order sets in a network of multi-hospital corporations in rural Ontario.

    PubMed

    Meleskie, Jessica; Eby, Don

    2009-01-01

    Standardized, preprinted or computer-generated physician orders are an attractive project for organizations that wish to improve the quality of patient care. The successful development and maintenance of order sets is a major undertaking. This article recounts the collaborative experience of the Grey Bruce Health Network in adapting and implementing an existing set of physician orders for use in its three hospital corporations. An Order Set Committee composed of primarily front-line staff was given authority over the order set development, approval and implementation processes. This arrangement bypassed the traditional approval process and facilitated the rapid implementation of a large number of order sets in a short time period.

  11. Robot Competence Development by Constructive Learning

    NASA Astrophysics Data System (ADS)

    Meng, Q.; Lee, M. H.; Hinde, C. J.

    This paper presents a constructive learning approach for developing sensor-motor mapping in autonomous systems. The system's adaptation to environment changes is discussed and three methods are proposed to deal with long term and short term changes. The proposed constructive learning allows autonomous systems to develop network topology and adjust network parameters. The approach is supported by findings from psychology and neuroscience, especially during infants' cognitive development at early stages. A growing radial basis function network is introduced as a computational substrate for sensory-motor mapping learning. Experiments are conducted on a robot eye/hand coordination testbed and results show the incremental development of sensory-motor mapping and its adaptation to changes such as in tool-use.

  12. Robot Competence Development by Constructive Learning

    NASA Astrophysics Data System (ADS)

    Meng, Q.; Lee, M. H.; Hinde, C. J.

    This paper presents a constructive learning approach for developing sensor-motor mapping in autonomous systems. The system's adaptation to environment changes is discussed and three methods are proposed to deal with long term and short term changes. The proposed constructive learning allows autonomous systems to develop network topology and adjust network parameters. The approach is supported by findings from psychology and neuroscience, especially during infants' cognitive development at early stages. A growing radial basis function network is introduced as a computational substrate for sensory-motor mapping learning. Experiments are conducted on a robot eye/hand coordination testbed and results show the incremental development of sensory-motor mapping and its adaptation to changes such as in tool-use.

  13. Trends and Correlation Estimation in Climate Sciences: Effects of Timescale Errors

    NASA Astrophysics Data System (ADS)

    Mudelsee, M.; Bermejo, M. A.; Bickert, T.; Chirila, D.; Fohlmeister, J.; Köhler, P.; Lohmann, G.; Olafsdottir, K.; Scholz, D.

    2012-12-01

    Trend describes time-dependence in the first moment of a stochastic process, and correlation measures the linear relation between two random variables. Accurately estimating the trend and correlation, including uncertainties, from climate time series data in the uni- and bivariate domain, respectively, allows first-order insights into the geophysical process that generated the data. Timescale errors, ubiquitous in paleoclimatology, where archives are sampled for proxy measurements and dated, pose a problem for the estimation. Statistical science and the various applied research fields, including geophysics, have almost completely ignored this problem due to its theoretical almost-intractability. However, computational adaptations or replacements of traditional error formulas have become technically feasible. This contribution gives a short overview of such an adaptation package, bootstrap resampling combined with parametric timescale simulation. We study linear regression, parametric change-point models and nonparametric smoothing for trend estimation. We introduce pairwise-moving block bootstrap resampling for correlation estimation. Both methods share robustness against autocorrelation and non-Gaussian distributional shape. We briefly touch on computing-intensive calibration of bootstrap confidence intervals and consider options to parallelize the related computer code. The following examples serve not only to illustrate the methods but also tell their own climate stories: (1) the search for climate drivers of the Agulhas Current on recent timescales, (2) the comparison of three stalagmite-based proxy series of regional, western German climate over the later part of the Holocene, and (3) trends and transitions in benthic oxygen isotope time series from the Cenozoic. Financial support by Deutsche Forschungsgemeinschaft (FOR 668, FOR 1070, MU 1595/4-1) and the European Commission (MC ITN 238512, MC ITN 289447) is acknowledged.
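
    A hedged sketch of the pairwise moving-block bootstrap for correlation mentioned above: blocks of paired observations are resampled to preserve autocorrelation and a percentile confidence interval is formed from the resampled correlations. The authors' package additionally simulates timescale errors, which this generic sketch does not attempt.

    ```python
    import numpy as np

    def block_bootstrap_corr(x, y, block=20, n_boot=2000, seed=0):
        """95% percentile CI for the Pearson correlation via a pairwise moving-block bootstrap."""
        rng = np.random.default_rng(seed)
        n = len(x)
        rs = np.empty(n_boot)
        for b in range(n_boot):
            starts = rng.integers(0, n - block + 1, size=int(np.ceil(n / block)))
            idx = np.concatenate([np.arange(s, s + block) for s in starts])[:n]
            rs[b] = np.corrcoef(x[idx], y[idx])[0, 1]   # pairwise: same indices for both series
        return np.percentile(rs, [2.5, 97.5])
    ```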

  14. Quantum Dynamics with Short-Time Trajectories and Minimal Adaptive Basis Sets.

    PubMed

    Saller, Maximilian A C; Habershon, Scott

    2017-07-11

    Methods for solving the time-dependent Schrödinger equation via basis set expansion of the wave function can generally be categorized as having either static (time-independent) or dynamic (time-dependent) basis functions. We have recently introduced an alternative simulation approach which represents a middle road between these two extremes, employing dynamic (classical-like) trajectories to create a static basis set of Gaussian wavepackets in regions of phase-space relevant to future propagation of the wave function [J. Chem. Theory Comput., 11, 8 (2015)]. Here, we propose and test a modification of our methodology which aims to reduce the size of basis sets generated in our original scheme. In particular, we employ short-time classical trajectories to continuously generate new basis functions for short-time quantum propagation of the wave function; to avoid the continued growth of the basis set describing the time-dependent wave function, we employ Matching Pursuit to periodically minimize the number of basis functions required to accurately describe the wave function. Overall, this approach generates a basis set which is adapted to evolution of the wave function while also being as small as possible. In applications to challenging benchmark problems, namely a 4-dimensional model of photoexcited pyrazine and three different double-well tunnelling problems, we find that our new scheme enables accurate wave function propagation with basis sets which are around an order-of-magnitude smaller than our original trajectory-guided basis set methodology, highlighting the benefits of adaptive strategies for wave function propagation.
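
    A hedged sketch of the generic matching-pursuit pruning step mentioned above, in a plain real-vector setting (the paper works with complex Gaussian wavepacket bases, which are not reproduced here): basis functions are picked greedily by overlap with the residual until the target is represented to tolerance, yielding a near-minimal subset.

    ```python
    import numpy as np

    def matching_pursuit(target, basis, tol=1e-3, max_terms=None):
        """Greedy selection from `basis` (array of shape (n_basis, dim), rows normalized)."""
        residual = target.copy()
        chosen, coeffs = [], []
        max_terms = max_terms or len(basis)
        while np.linalg.norm(residual) > tol and len(chosen) < max_terms:
            overlaps = basis @ residual
            k = int(np.argmax(np.abs(overlaps)))   # basis vector with largest overlap
            chosen.append(k)
            coeffs.append(overlaps[k])
            residual = residual - overlaps[k] * basis[k]
        return chosen, coeffs
    ```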

  15. Adaptive Data-based Predictive Control for Short Take-off and Landing (STOL) Aircraft

    NASA Technical Reports Server (NTRS)

    Barlow, Jonathan Spencer; Acosta, Diana Michelle; Phan, Minh Q.

    2010-01-01

    Data-based Predictive Control is an emerging control method that stems from Model Predictive Control (MPC). MPC computes current control action based on a prediction of the system output a number of time steps into the future and is generally derived from a known model of the system. Data-based predictive control has the advantage of deriving predictive models and controller gains from input-output data. Thus, a controller can be designed from the outputs of complex simulation code or a physical system where no explicit model exists. If the output data happens to be corrupted by periodic disturbances, the designed controller will also have the built-in ability to reject these disturbances without the need to know them. When data-based predictive control is implemented online, it becomes a version of adaptive control. The characteristics of adaptive data-based predictive control are particularly appropriate for the control of nonlinear and time-varying systems, such as Short Take-off and Landing (STOL) aircraft. STOL is a capability of interest to NASA because conceptual Cruise Efficient Short Take-off and Landing (CESTOL) transport aircraft offer the ability to reduce congestion in the terminal area by utilizing existing shorter runways at airports, as well as to lower community noise by flying steep approach and climb-out patterns that reduce the noise footprint of the aircraft. In this study, adaptive data-based predictive control is implemented as an integrated flight-propulsion controller for the outer-loop control of a CESTOL-type aircraft. Results show that the controller successfully tracks velocity while attempting to maintain a constant flight path angle, using longitudinal command, thrust and flap setting as the control inputs.

  16. Recognizing short coding sequences of prokaryotic genome using a novel iteratively adaptive sparse partial least squares algorithm

    PubMed Central

    2013-01-01

    Background: Significant efforts have been made to address the problem of identifying short genes in prokaryotic genomes. However, most known methods are not effective in detecting short genes. Because of the limited information contained in short DNA sequences, it is very difficult to accurately distinguish between protein coding and non-coding sequences in prokaryotic genomes. We have developed a new Iteratively Adaptive Sparse Partial Least Squares (IASPLS) algorithm as the classifier to improve the accuracy of the identification process. Results: For testing, we chose the short coding and non-coding sequences from seven prokaryotic organisms. We used seven feature sets (including GC content, Z-curve, etc.) of short genes. In comparison with GeneMarkS, Metagene, Orphelia, and Heuristic Approach methods, our model achieved the best prediction performance in identification of short prokaryotic genes. Even when we focused on the very short length group ([60–100 nt)), our model provided sensitivity as high as 83.44% and specificity as high as 92.8%. These values are two or three times higher than those of three of the other methods, while Metagene fails to recognize genes in this length range. The experiments also proved that IASPLS can improve the identification accuracy in comparison with other widely used classifiers, i.e. Logistic, Random Forest (RF) and K nearest neighbors (KNN). The accuracy using IASPLS was improved by 5.90% or more in comparison with the other methods. In addition to the improvements in accuracy, IASPLS required ten times less computer time than KNN or RF. Conclusions: Our method is preferable for application as an automated method of short gene classification. Its linearity and easily optimized parameters make it practicable for predicting short genes of newly-sequenced or under-studied species. Reviewers: This article was reviewed by Alexey Kondrashov, Rajeev Azad (nominated by Dr J. Peter Gogarten) and Yuriy Fofanov (nominated by Dr Janet Siefert). PMID:24067167

  17. A fiber orientation-adapted integration scheme for computing the hyperelastic Tucker average for short fiber reinforced composites

    NASA Astrophysics Data System (ADS)

    Goldberg, Niels; Ospald, Felix; Schneider, Matti

    2017-10-01

    In this article we introduce a fiber orientation-adapted integration scheme for Tucker's orientation averaging procedure applied to non-linear material laws, based on angular central Gaussian fiber orientation distributions. This method is stable w.r.t. fiber orientations degenerating into planar states and enables the construction of orthotropic hyperelastic energies for truly orthotropic fiber orientation states. We establish a reference scenario for fitting the Tucker average of a transversely isotropic hyperelastic energy, corresponding to a uni-directional fiber orientation, to microstructural simulations, obtained by FFT-based computational homogenization of neo-Hookean constituents. We carefully discuss ideas for accelerating the identification process, leading to a tremendous speed-up compared to a naive approach. The resulting hyperelastic material map turns out to be surprisingly accurate, simple to integrate in commercial finite element codes and fast in its execution. We demonstrate the capabilities of the extracted model by a finite element analysis of a fiber reinforced chain link.

  18. Development of an adaptive hp-version finite element method for computational optimal control

    NASA Technical Reports Server (NTRS)

    Hodges, Dewey H.; Warner, Michael S.

    1994-01-01

    In this research effort, the usefulness of hp-version finite elements and adaptive solution-refinement techniques in generating numerical solutions to optimal control problems has been investigated. Under NAG-939, a general FORTRAN code was developed which approximated solutions to optimal control problems with control constraints and state constraints. Within that methodology, to get high-order accuracy in solutions, the finite element mesh would have to be refined repeatedly through bisection of the entire mesh in a given phase. In the current research effort, the order of the shape functions in each element has been made a variable, giving more flexibility in error reduction and smoothing. Similarly, individual elements can each be subdivided into many pieces, depending on the local error indicator, while other parts of the mesh remain coarsely discretized. The problem remains to reduce and smooth the error while still keeping computational effort reasonable enough to calculate time histories in a short enough time for on-board applications.

  19. An adaptive random search for short term generation scheduling with network constraints.

    PubMed

    Marmolejo, J A; Velasco, Jonás; Selley, Héctor J

    2017-01-01

    This paper presents an adaptive random search approach to address a short term generation scheduling problem with network constraints, which determines the startup and shutdown schedules of thermal units over a given planning horizon. In this model, we consider the transmission network through capacity limits and line losses. The mathematical model is stated in the form of a Mixed Integer Non Linear Problem with binary variables. The proposed heuristic is a population-based method that generates a set of new potential solutions via a random search strategy. The random search is based on the Markov Chain Monte Carlo method. The main key of the proposed method is that the noise level of the random search is adaptively controlled in order to explore and exploit the entire search space. In order to improve the solutions, we consider coupling a local search into the random search process. Several test systems are presented to evaluate the performance of the proposed heuristic. We use a commercial optimizer to compare the quality of the solutions provided by the proposed method. The solution of the proposed algorithm showed a significant reduction in computational effort with respect to the full-scale outer approximation commercial solver. Numerical results show the potential and robustness of our approach.
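
    A hedged, generic sketch of an adaptive random search in which the noise level is controlled by feedback from the recent acceptance rate (the paper's heuristic is population-based, uses Markov Chain Monte Carlo proposals, and handles network constraints, none of which is reproduced here).

    ```python
    import numpy as np

    def adaptive_random_search(cost, x0, iters=5000, sigma=1.0, target_rate=0.2, seed=0):
        rng = np.random.default_rng(seed)
        x = np.asarray(x0, dtype=float)
        fx = cost(x)
        accepted = 0
        for t in range(1, iters + 1):
            cand = x + rng.normal(0.0, sigma, size=x.shape)
            fc = cost(cand)
            if fc < fx:              # greedy acceptance; an MCMC variant would accept probabilistically
                x, fx = cand, fc
                accepted += 1
            if t % 100 == 0:         # feedback: adapt the noise level to the recent acceptance rate
                sigma *= 1.2 if accepted / 100 > target_rate else 0.8
                accepted = 0
        return x, fx

    # Example: minimize a simple quadratic
    print(adaptive_random_search(lambda v: float(np.sum((v - 3.0) ** 2)), [0.0, 0.0])[1])
    ```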

  20. Relative codon adaptation: a generic codon bias index for prediction of gene expression.

    PubMed

    Fox, Jesse M; Erill, Ivan

    2010-06-01

    The development of codon bias indices (CBIs) remains an active field of research due to their myriad applications in computational biology. Recently, the relative codon usage bias (RCBS) was introduced as a novel CBI able to estimate codon bias without using a reference set. The results of this new index when applied to Escherichia coli and Saccharomyces cerevisiae led the authors of the original publications to conclude that natural selection favours higher expression and enhanced codon usage optimization in short genes. Here, we show that this conclusion was flawed and based on the systematic oversight of an intrinsic bias for short sequences in the RCBS index and of biases in the small data sets used for validation in E. coli. Furthermore, we reveal how the RCBS can be corrected to produce useful results and how its underlying principle, which we here term relative codon adaptation (RCA), can be made into a powerful reference-set-based index that directly takes into account the genomic base composition. Finally, we show that RCA outperforms the codon adaptation index (CAI) as a predictor of gene expression when operating on the CAI reference set and that this improvement is significantly larger when analysing genomes with high mutational bias.
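
    For context, a hedged sketch of a CAI-style index: relative adaptiveness weights are taken from a reference set, and a gene's score is the geometric mean of its codons' weights. The paper's RCA index further accounts for genomic base composition, which this sketch omits; the pseudocount values are illustrative assumptions.

    ```python
    import math
    from collections import Counter

    def relative_adaptiveness(reference_genes, synonym_table):
        """w_c = count(c) / max count among c's synonymous codons, from a reference gene set.
        synonym_table maps each codon to the list of codons encoding the same amino acid."""
        counts = Counter(g[i:i + 3] for g in reference_genes for i in range(0, len(g) - 2, 3))
        return {c: (counts[c] or 0.5) / (max(counts[s] for s in syn) or 1)
                for c, syn in synonym_table.items()}

    def cai(gene, w):
        """Geometric mean of the weights of the gene's codons."""
        codons = [gene[i:i + 3] for i in range(0, len(gene) - 2, 3)]
        return math.exp(sum(math.log(w.get(c, 0.5)) for c in codons) / len(codons))
    ```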

  1. Beyond Born-Mayer: Improved models for short-range repulsion in ab initio force fields

    DOE PAGES

    Van Vleet, Mary J.; Misquitta, Alston J.; Stone, Anthony J.; ...

    2016-06-23

    Short-range repulsion within inter-molecular force fields is conventionally described by either Lennard-Jones or Born-Mayer forms. Despite their widespread use, these simple functional forms are often unable to describe the interaction energy accurately over a broad range of inter-molecular distances, thus creating challenges in the development of ab initio force fields and potentially leading to decreased accuracy and transferability. Herein, we derive a novel short-range functional form based on a simple Slater-like model of overlapping atomic densities and an iterated stockholder atom (ISA) partitioning of the molecular electron density. We demonstrate that this Slater-ISA methodology yields a more accurate, transferable, and robust description of the short-range interactions at minimal additional computational cost compared to standard Lennard-Jones or Born-Mayer approaches. Lastly, we show how this methodology can be adapted to yield the standard Born-Mayer functional form while still retaining many of the advantages of the Slater-ISA approach.
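
    For reference, the two conventional short-range forms named above, in their textbook expressions (the paper's Slater-ISA form is not reproduced here):

    ```latex
    V_{\mathrm{LJ}}(r) = 4\varepsilon\left[\left(\frac{\sigma}{r}\right)^{12}
                         - \left(\frac{\sigma}{r}\right)^{6}\right],
    \qquad
    V_{\mathrm{BM}}(r) = A\,e^{-Br}
    ```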

  2. Equipment: Antenna systems

    NASA Astrophysics Data System (ADS)

    Petrie, L. E.

    1983-05-01

    Some antenna fundamentals as well as definitions of the principal terms used in antenna engineering are described. Methods are presented for determining the desired antenna radiation patterns for an HF communication circuit or service area. Sources for obtaining or computing radiation pattern information are outlined. Comparisons are presented between the measured and computed radiation patterns. The effect of the properties of the ground on the antenna gain and pattern is illustrated for several types of antennas. Numerous examples are given of the radiation patterns for typical antennas used on short, intermediate and long distance circuits for both mobile and fixed service operations. The application of adaptive antenna arrays and active antennas in modern HF communication systems is briefly reviewed.

  3. Equipment: Antenna systems

    NASA Astrophysics Data System (ADS)

    Petrie, L. E.

    1986-03-01

    Some antenna fundamentals as well as definitions of the principal terms used in antenna engineering are described. Methods are presented for determining the desired antenna radiation patterns for an HF communication circuit or service area. Sources for obtaining or computing radiation pattern information are outlined. Comparisons are presented between the measured and computed radiation patterns. The effect of the properties of the ground on the antenna gain and the pattern is illustrated for several types of antennas. Numerous examples are given of the radiation patterns for typical antennas used on short, intermediate and long distance circuits for both mobile and fixed service operations. The application of adaptive antenna arrays and active antennas in modern HF communication systems is briefly reviewed.

  4. On-line pulse control for structural and mechanical systems

    NASA Technical Reports Server (NTRS)

    Udwadia, F. E.; Garba, J. A.; Tabaie, S.

    1981-01-01

    This paper studies the feasibility of using open-loop adaptive on-line pulse control for limiting the response of large linear multidegree of freedom systems subjected to general dynamic loading environments. Pulses of short durations are used to control the system when the system response exceeds a given threshold level. The pulse magnitudes are obtained in closed form, leading to large computational efficiencies when compared with optimal control theoretic methods. The technique is illustrated for a structural system subjected to earthquake-like base excitations.

  5. Joint Aerospace Weapon System Support, Sensors And Simulation Symposium (5th Annual). Held In San Diego, California on 13-18 June 1999

    DTIC Science & Technology

    1999-06-18

    and 1.54 microns and to compute the spectral extinction coefficient. 3. Near IR (1.54 um) laser rangefinders measure the time-of-flight of a short... quantitative understanding. Research (long term): encourage research in adaptive systems (evolutionary programming, genetic algorithms, neural nets)... Measures such as false alarm rate are not measurable in field applications. Other measures include Incremental Fault Resolution and Operational Isolation.

  6. Dynamical information encoding in neural adaptation.

    PubMed

    Luozheng Li; Wenhao Zhang; Yuanyuan Mi; Dahui Wang; Xiaohan Lin; Si Wu

    2016-08-01

    Adaptation refers to the general phenomenon that a neural system dynamically adjusts its response properties according to the statistics of external inputs. In response to a prolonged constant stimulation, neuronal firing rates first increase dramatically at the onset of the stimulation and afterwards decrease rapidly to a low level close to background activity. This attenuation of neural activity seems contradictory to our experience that we can still sense the stimulus after the neural system has adapted. This prompts a question: where is the stimulus information encoded during the adaptation? Here, we investigate a computational model in which the neural system employs a dynamical encoding strategy during neural adaptation: at the early stage of the adaptation, the stimulus information is mainly encoded in the strong independent firings; as time goes on, the information is shifted into the weak but concerted responses of neurons. We find that short-term plasticity, a general feature of synapses, provides a natural mechanism to achieve this goal. Furthermore, we demonstrate that with balanced excitatory and inhibitory inputs, this correlation-based information can be read out efficiently. The implications of this study for our understanding of neural information encoding are discussed.
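
    A minimal sketch of the short-term plasticity mechanism invoked here (a Tsodyks-Markram-style synapse) is given below: under a prolonged regular input, the synaptic efficacy, i.e. the product of release probability and available resources, transmits strongly at first and then depresses. Parameters are generic textbook values, not those of the cited model.

        import numpy as np

        # Minimal Tsodyks-Markram-style short-term depression/facilitation sketch.
        def stp_response(spike_times, U=0.2, tau_rec=0.5, tau_fac=0.05, dt=0.001, T=1.0):
            x, u = 1.0, U                     # available resources and release probability
            t_spikes = set(np.round(np.asarray(spike_times) / dt).astype(int))
            efficacy = []
            for step in range(int(T / dt)):
                # continuous recovery of resources and decay of facilitation
                x += dt * (1.0 - x) / tau_rec
                u += dt * (U - u) / tau_fac
                if step in t_spikes:          # presynaptic spike: release u*x, deplete resources
                    efficacy.append(u * x)
                    x -= u * x
                    u += U * (1.0 - u)
            return efficacy

        # A prolonged, regular 50 Hz input: early responses are strong, later ones depressed.
        print(stp_response(np.arange(0.02, 1.0, 0.02)))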

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Vleet, Mary J.; Misquitta, Alston J.; Stone, Anthony J.

    Short-range repulsion within inter-molecular force fields is conventionally described by either Lennard-Jones or Born-Mayer forms. Despite their widespread use, these simple functional forms are often unable to describe the interaction energy accurately over a broad range of inter-molecular distances, thus creating challenges in the development of ab initio force fields and potentially leading to decreased accuracy and transferability. Herein, we derive a novel short-range functional form based on a simple Slater-like model of overlapping atomic densities and an iterated stockholder atom (ISA) partitioning of the molecular electron density. We demonstrate that this Slater-ISA methodology yields a more accurate, transferable, and robust description of the short-range interactions at minimal additional computational cost compared to standard Lennard-Jones or Born-Mayer approaches. Lastly, we show how this methodology can be adapted to yield the standard Born-Mayer functional form while still retaining many of the advantages of the Slater-ISA approach.

  8. An initial investigation on developing a new method to predict short-term breast cancer risk based on deep learning technology

    NASA Astrophysics Data System (ADS)

    Qiu, Yuchen; Wang, Yunzhi; Yan, Shiju; Tan, Maxine; Cheng, Samuel; Liu, Hong; Zheng, Bin

    2016-03-01

    In order to establish a new personalized breast cancer screening paradigm, it is critically important to accurately predict the short-term risk of a woman having image-detectable cancer after a negative mammographic screening. In this study, we developed and tested a novel short-term risk assessment model based on a deep learning method. For the experiment, a set of 270 "prior" negative screening cases was assembled. In the next sequential ("current") screening mammography, 135 cases were positive and 135 cases remained negative. These cases were randomly divided into a training set with 200 cases and a testing set with 70 cases. A deep learning based computer-aided diagnosis (CAD) scheme was then developed for the risk assessment, which consists of two modules: an adaptive feature identification module and a risk prediction module. The adaptive feature identification module is composed of three pairs of convolution and max-pooling layers, which contain 20, 10, and 5 feature maps, respectively. The risk prediction module is implemented by a multilayer perceptron (MLP) classifier, which produces a risk score to predict the likelihood of the woman developing short-term mammography-detectable cancer. The results show that the new CAD-based risk model yielded a positive predictive value of 69.2% and a negative predictive value of 74.2%, with a total prediction accuracy of 71.4%. This study demonstrated that applying this new deep learning technology may have significant potential for developing a new short-term risk prediction scheme with improved performance in detecting early abnormal signs in negative mammograms.
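
    A minimal sketch of the described architecture follows, assuming PyTorch: three convolution/max-pooling pairs with 20, 10 and 5 feature maps, followed by an MLP that outputs a risk score. Kernel sizes, input resolution (1x64x64) and hidden width are assumptions for illustration, not the authors' exact configuration.

        import torch
        import torch.nn as nn

        class RiskCNN(nn.Module):
            """Adaptive feature identification (conv/pool pairs) plus MLP risk predictor."""
            def __init__(self):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(1, 20, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
                    nn.Conv2d(20, 10, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
                    nn.Conv2d(10, 5, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
                )
                self.classifier = nn.Sequential(      # multilayer perceptron risk predictor
                    nn.Flatten(),
                    nn.Linear(5 * 8 * 8, 64), nn.ReLU(),
                    nn.Linear(64, 1), nn.Sigmoid(),   # risk score in [0, 1]
                )

            def forward(self, x):
                return self.classifier(self.features(x))

        model = RiskCNN()
        print(model(torch.randn(2, 1, 64, 64)).shape)  # torch.Size([2, 1])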

  9. Numerical developments for short-pulsed Near Infra-Red laser spectroscopy. Part I: direct treatment

    NASA Astrophysics Data System (ADS)

    Boulanger, Joan; Charette, André

    2005-03-01

    This two-part study is devoted to the numerical treatment of short-pulsed laser near infra-red spectroscopy. The overall goal is to address the possibility of numerical inverse treatment based on a recently developed direct model to solve the transient radiative transfer equation. This model has been constructed to incorporate the latest improvements in short-pulsed laser interaction with semi-transparent media and combines a discrete-ordinates computation of the implicit source term appearing in the radiative transfer equation with an explicit treatment of the transport of the light intensity using advection schemes, a method encountered in reactive flow dynamics. The incident collimated beam is solved analytically through the Bouguer-Beer-Lambert extinction law. In this first part, the direct model is extended to fully non-homogeneous materials and tested with two different spatial schemes in order to be adapted to the inversion methods presented in the second part. First, the fundamental methods and schemes used in the direct model are presented. Then, tests are conducted by comparison with numerical simulations given as references. Finally, multi-dimensional extensions of the code are provided. This allows presentation of numerical results of short-pulse propagation in 1, 2 and 3D homogeneous and non-homogeneous materials, with parametric studies on medium properties and pulse shape. For comparison, an integral method adapted to non-homogeneous media irradiated by a pulsed laser beam is also developed for the 3D case.

  10. Short-term Temperature Prediction Using Adaptive Computing on Dynamic Scales

    NASA Astrophysics Data System (ADS)

    Hu, W.; Cervone, G.; Jha, S.; Balasubramanian, V.; Turilli, M.

    2017-12-01

    When predicting temperature, there are specific places and times when high accuracy predictions are harder. For example, not all the sub-regions in the domain require the same amount of computing resources to generate an accurate prediction. Plateau areas might require less computing resources than mountainous areas because of the steeper gradient of temperature change in the latter. However, it is difficult to estimate beforehand the optimal allocation of computational resources because several parameters, in addition to orography, play a role in determining the accuracy of the forecasts. The allocation of resources to perform simulations can become a bottleneck because it requires human intervention to stop jobs or start new ones. The goal of this project is to design and develop a dynamic approach to generate short-term temperature predictions that automatically determines the required computing resources and the geographic scales of the predictions based on the spatial and temporal uncertainties. The predictions and the prediction quality metrics are computed using a numerical weather prediction model, Analog Ensemble (AnEn), and the parallelization on high performance computing systems is accomplished using Ensemble Toolkit, one component of the RADICAL-Cybertools family of tools. RADICAL-Cybertools decouples the science needs from the computational capabilities by building an intermediate layer to run general ensemble patterns, regardless of the science. In this research, we show how the Ensemble Toolkit allows generating high resolution temperature forecasts at different spatial and temporal resolutions. The AnEn algorithm is run using NAM analysis and forecast data for the continental United States for a period of 2 years. AnEn results show that the temperature forecasts perform well according to different probabilistic and deterministic statistical tests.
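
    The core Analog Ensemble idea can be sketched in a few lines: for the current forecast, find the k most similar historical forecasts and use their verifying observations as the ensemble. The similarity metric, predictors and synthetic "history" below are simplifications for illustration, not the operational AnEn configuration.

        import numpy as np

        rng = np.random.default_rng(0)

        def analog_ensemble(current_forecast, past_forecasts, past_observations, k=10):
            """Return the k observations whose historical forecasts best match the
            current forecast (Euclidean distance over predictor variables)."""
            distances = np.linalg.norm(past_forecasts - current_forecast, axis=1)
            analog_idx = np.argsort(distances)[:k]
            return past_observations[analog_idx]          # ensemble of analog observations

        # Hypothetical history: temperature, wind and humidity predictors with a
        # synthetic verifying temperature observation for each past forecast.
        past_fc = np.column_stack([rng.normal(15.0, 5.0, 1000),    # temperature (C)
                                   rng.normal(3.0, 1.0, 1000),     # wind speed (m/s)
                                   rng.normal(60.0, 10.0, 1000)])  # relative humidity (%)
        past_ob = past_fc[:, 0] + rng.normal(0.0, 1.0, 1000)
        ens = analog_ensemble(np.array([18.0, 2.0, 65.0]), past_fc, past_ob)
        print(ens.mean(), ens.std())                       # ensemble mean and spread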

  11. Free energy calculations of short peptide chains using Adaptively Biased Molecular Dynamics

    NASA Astrophysics Data System (ADS)

    Karpusenka, Vadzim; Babin, Volodymyr; Roland, Christopher; Sagui, Celeste

    2008-10-01

    We performed a computational study of monomer peptides composed of methionine, alanine, leucine, glutamate, and lysine (all amino acids with helix-forming propensities), and of proline, glycine, tyrosine, serine, and arginine (which all have poor helix-forming propensities). The free energy landscapes as a function of the handedness and radius of gyration have been calculated using the recently introduced Adaptively Biased Molecular Dynamics (ABMD) method, combined with replica exchange, multiple walkers, and post-processing Umbrella Correction (UC). Minima that correspond to some of the left- and right-handed 310-, α- and π-helices were identified by secondary structure assignment methods (DSSP, Stride). The resulting free energy surface (FES) and the subsequent steered molecular dynamics (SMD) simulation results are in agreement with the empirical evidence of preferred secondary structures for the peptide chains considered.

  12. Adaptive voting computer system

    NASA Technical Reports Server (NTRS)

    Koczela, L. J.; Wilgus, D. S. (Inventor)

    1974-01-01

    A computer system is reported that uses adaptive voting to tolerate failures and operates in a fail-operational, fail-safe manner. Each of four computers is individually connected to one of four external input/output (I/O) busses which interface with external subsystems. Each computer is connected to receive input data and commands from the other three computers and to furnish output data and commands to the other three computers. An adaptive control apparatus including a voter-comparator-switch (VCS) is provided for each computer to receive signals from each of the computers; it permits adaptive voting among the computers, enabling fail-operational, fail-safe operation.
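
    A minimal sketch of the adaptive-voting idea follows: vote on the redundant channel outputs, compare each channel against the voted value, and mask out channels that disagree repeatedly. The median vote, tolerance and failure threshold here are illustrative assumptions, not the patented voter-comparator-switch design.

        from statistics import median

        class AdaptiveVoter:
            """Adaptive voting among redundant computer channels."""
            def __init__(self, n_channels=4, tolerance=0.05, max_faults=3):
                self.active = [True] * n_channels
                self.fault_counts = [0] * n_channels
                self.tolerance = tolerance
                self.max_faults = max_faults

            def vote(self, outputs):
                voted = median(v for v, ok in zip(outputs, self.active) if ok)
                for i, value in enumerate(outputs):
                    if self.active[i] and abs(value - voted) > self.tolerance:
                        self.fault_counts[i] += 1
                        if self.fault_counts[i] >= self.max_faults:
                            self.active[i] = False   # adaptively exclude a failed computer
                return voted

        voter = AdaptiveVoter()
        for outputs in ([1.00, 1.01, 0.99, 5.0],) * 4:   # channel 3 has failed
            print(voter.vote(outputs), voter.active)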

  13. Autonomous Navigation by a Mobile Robot

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terrance; Aghazarian, Hrand

    2005-01-01

    ROAMAN is a computer program for autonomous navigation of a mobile robot on a long (as much as hundreds of meters) traversal of terrain. Developed for use aboard a robotic vehicle (rover) exploring the surface of a remote planet, ROAMAN could also be adapted to similar use on terrestrial mobile robots. ROAMAN implements a combination of algorithms for (1) long-range path planning based on images acquired by mast-mounted, wide-baseline stereoscopic cameras, and (2) local path planning based on images acquired by body-mounted, narrow-baseline stereoscopic cameras. The long-range path-planning algorithm autonomously generates a series of waypoints that are passed to the local path-planning algorithm, which plans obstacle-avoiding legs between the waypoints. Both the long- and short-range algorithms use an occupancy-grid representation in computations to detect obstacles and plan paths. Maps that are maintained by the long- and short-range portions of the software are not shared because substantial localization errors can accumulate during any long traverse. ROAMAN is not guaranteed to generate an optimal shortest path, but does maintain the safety of the rover.

  14. Evaluating the Appropriateness of a New Computer-Administered Measure of Adaptive Function for Children and Youth with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Coster, Wendy J.; Kramer, Jessica M.; Tian, Feng; Dooley, Meghan; Liljenquist, Kendra; Kao, Ying-Chia; Ni, Pengsheng

    2016-01-01

    The Pediatric Evaluation of Disability Inventory-Computer Adaptive Test is an alternative method for describing the adaptive function of children and youth with disabilities using a computer-administered assessment. This study evaluated the performance of the Pediatric Evaluation of Disability Inventory-Computer Adaptive Test with a national…

  15. 21 CFR 874.1070 - Short increment sensitivity index (SISI) adapter.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Short increment sensitivity index (SISI) adapter. 874.1070 Section 874.1070 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN... short periodic sound pulses in specific small decibel increments that are intended to be superimposed on...

  16. 21 CFR 874.1070 - Short increment sensitivity index (SISI) adapter.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Short increment sensitivity index (SISI) adapter. 874.1070 Section 874.1070 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN... short periodic sound pulses in specific small decibel increments that are intended to be superimposed on...

  17. Adaptively biased molecular dynamics: An umbrella sampling method with a time-dependent potential

    NASA Astrophysics Data System (ADS)

    Babin, Volodymyr; Karpusenka, Vadzim; Moradi, Mahmoud; Roland, Christopher; Sagui, Celeste

    We discuss an adaptively biased molecular dynamics (ABMD) method for the computation of a free energy surface for a set of reaction coordinates. The ABMD method belongs to the general category of umbrella sampling methods with an evolving biasing potential. It is characterized by a small number of control parameters and an O(t) numerical cost with simulation time t. The method naturally allows for extensions based on multiple walkers and a replica exchange mechanism. The workings of the method are illustrated with a number of examples, including sugar puckering, free energy landscapes for polymethionine and polyproline peptides, and a short β-turn peptide. ABMD has been implemented into the latest version (Case et al., AMBER 10; University of California: San Francisco, 2008) of the AMBER software package and is freely available to the simulation community.
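
    A generic, one-dimensional illustration of an evolving biasing potential is sketched below: small Gaussian hills are deposited along the visited reaction coordinate so that the accumulated bias gradually fills the wells, and its negative estimates the free energy. This is a flooding-style toy example under assumed parameters, not the AMBER implementation of ABMD.

        import numpy as np

        rng = np.random.default_rng(1)

        def potential(x):
            return (x**2 - 1.0) ** 2                      # double well with minima at +/-1

        grid = np.linspace(-2.0, 2.0, 201)
        bias = np.zeros_like(grid)
        hill_height, hill_width, beta, step = 0.02, 0.1, 4.0, 0.1

        x = -1.0
        for _ in range(20000):
            # Metropolis move on the biased energy U(x) + V_bias(x)
            x_new = x + rng.normal(0.0, step)
            if abs(x_new) <= 2.0:
                e_old = potential(x) + np.interp(x, grid, bias)
                e_new = potential(x_new) + np.interp(x_new, grid, bias)
                if rng.random() < np.exp(-beta * (e_new - e_old)):
                    x = x_new
            # deposit a small Gaussian hill at the current reaction coordinate value
            bias += hill_height * np.exp(-((grid - x) ** 2) / (2 * hill_width**2))

        free_energy = -bias - (-bias).min()               # estimate, up to a constant
        print(free_energy[np.argmin(np.abs(grid))])       # rough barrier estimate at x = 0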

  18. Audio-Enhanced Tablet Computers to Assess Children's Food Frequency From Migrant Farmworker Mothers.

    PubMed

    Kilanowski, Jill F; Trapl, Erika S; Kofron, Ryan M

    2013-06-01

    This study sought to improve data collection in children's food frequency surveys for non-English speaking immigrant/migrant farmworker mothers using audio-enhanced tablet computers (ATCs). We hypothesized that by using technological adaptations, we would be able to improve data capture and therefore reduce lost surveys. The Food Frequency Questionnaire (FFQ), a paper-based dietary assessment tool, was adapted for ATCs and assessed consumption of 66 food items, asking 3 questions for each food item: frequency, quantity of consumption, and serving size. The tablet-based survey was audio enhanced, with each question "read" to participants and accompanied by food item images, together with an embedded short instructional video. Results indicated that respondents were able to complete the 198 questions from the 66-item FFQ on ATCs in approximately 23 minutes. Compared with paper-based FFQs, ATC-based FFQs had less missing data. Despite overall reductions in missing data by use of ATCs, respondents still appeared to have difficulty with question 2 of the FFQ. The ability to score the FFQ depended on which sections the missing data were located in. Unlike the paper-based FFQs, no ATC-based FFQs were left unscored due to the amount or location of missing data. An ATC-based FFQ was feasible and increased the ability to score this survey on children's food patterns from migrant farmworker mothers. This adapted technology may serve as an exemplar for other non-English speaking immigrant populations.

  19. Investigation of Item-Pair Presentation and Construct Validity of the Navy Computer Adaptive Personality Scales (NCAPS)

    DTIC Science & Technology

    2006-10-01

    This report documents one of the steps in the development of the Navy Computer Adaptive Personality Scales (NCAPS), a computer adaptive personality measure. Investigation of Item-Pair Presentation and Construct Validity of the Navy Computer Adaptive Personality Scales (NCAPS); Christina M. Underhill, Ph.D.; NPRST-TN-06-9, October 2006. Approved for public release; distribution is unlimited.

  20. Passive adaptation to stress in adulthood after short-term social instability stress during adolescence in mice.

    PubMed

    de Lima, A P N; Massoco, C O

    2017-05-01

    This study reports that short-term social instability stress (SIS) in adolescence increases passive coping in adulthood in male mice. Short-term SIS decreased the latency to immobility and increased the frequency and duration of immobility in the tail suspension test. These findings support the hypothesis that adolescent stress can induce a passive adaptation to stress in adulthood, even if it is only a short period of stress.

  1. How visual short-term memory maintenance modulates subsequent visual aftereffects.

    PubMed

    Saad, Elyana; Silvanto, Juha

    2013-05-01

    Prolonged viewing of a visual stimulus can result in sensory adaptation, giving rise to perceptual phenomena such as the tilt aftereffect (TAE). However, it is not known if short-term memory maintenance induces such effects. We examined how visual short-term memory (VSTM) maintenance modulates the strength of the TAE induced by subsequent visual adaptation. We reasoned that if VSTM maintenance induces aftereffects on subsequent encoding of visual information, then it should either enhance or reduce the TAE induced by a subsequent visual adapter, depending on the congruency of the memory cue and the adapter. Our results were consistent with this hypothesis and thus indicate that the effects of VSTM maintenance can outlast the maintenance period.

  2. QPSO-Based Adaptive DNA Computing Algorithm

    PubMed Central

    Karakose, Mehmet; Cigdem, Ugur

    2013-01-01

    DNA (deoxyribonucleic acid) computing, a new computation model that uses DNA molecules for information storage, has been increasingly used for optimization and data analysis in recent years. However, the DNA computing algorithm has some limitations in terms of convergence speed, adaptability, and effectiveness. In this paper, a new approach for the improvement of DNA computing is proposed. This approach aims to run the DNA computing algorithm with adaptive parameters towards the desired goal using quantum-behaved particle swarm optimization (QPSO). The contributions of the proposed QPSO-based adaptive DNA computing algorithm are as follows: (1) the parameters of population size, crossover rate, maximum number of operations, enzyme and virus mutation rate, and the fitness function of the DNA computing algorithm are simultaneously tuned for the adaptive process; (2) the adaptive algorithm is performed using the QPSO algorithm for goal-driven progress, faster operation, and flexibility in data; and (3) a numerical realization of the DNA computing algorithm with the proposed approach is implemented in system identification. Two experiments with different systems were carried out to evaluate the performance of the proposed approach, with comparative results. Experimental results obtained with Matlab and FPGA demonstrate the ability to provide effective optimization, considerable convergence speed, and high accuracy for the DNA computing algorithm. PMID:23935409
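
    The quantum-behaved PSO update at the heart of this approach can be sketched as follows: each particle is drawn around a per-particle attractor (a random mix of its personal best and the global best) with a logarithmic jump scaled by the distance to the mean of personal bests. The sketch tunes a generic two-parameter objective; the coupling to the DNA-algorithm parameters is not reproduced, and the objective and settings are assumptions.

        import numpy as np

        rng = np.random.default_rng(2)

        def sphere(x):                          # stand-in objective (e.g. identification error)
            return float(np.sum(x**2))

        def qpso(objective, dim=2, n_particles=20, iters=200, beta=0.75, bounds=(-5.0, 5.0)):
            """Minimal quantum-behaved particle swarm optimization (QPSO)."""
            x = rng.uniform(*bounds, size=(n_particles, dim))
            pbest = x.copy()
            pbest_val = np.array([objective(p) for p in pbest])
            gbest = pbest[np.argmin(pbest_val)].copy()
            for _ in range(iters):
                mbest = pbest.mean(axis=0)      # mean of personal bests
                for i in range(n_particles):
                    phi = rng.random(dim)
                    attractor = phi * pbest[i] + (1.0 - phi) * gbest
                    u = rng.random(dim)
                    sign = np.where(rng.random(dim) < 0.5, 1.0, -1.0)
                    x[i] = attractor + sign * beta * np.abs(mbest - x[i]) * np.log(1.0 / u)
                    x[i] = np.clip(x[i], *bounds)
                    val = objective(x[i])
                    if val < pbest_val[i]:
                        pbest[i], pbest_val[i] = x[i].copy(), val
                gbest = pbest[np.argmin(pbest_val)].copy()
            return gbest, pbest_val.min()

        print(qpso(sphere))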

  3. Adaptive bill morphology for enhanced tool manipulation in New Caledonian crows

    PubMed Central

    Matsui, Hiroshi; Hunt, Gavin R.; Oberhofer, Katja; Ogihara, Naomichi; McGowan, Kevin J.; Mithraratne, Kumar; Yamasaki, Takeshi; Gray, Russell D.; Izawa, Ei-Ichi

    2016-01-01

    Early increased sophistication of human tools is thought to be underpinned by adaptive morphology for efficient tool manipulation. Such adaptive specialisation is unknown in nonhuman primates but may have evolved in the New Caledonian crow, which has sophisticated tool manufacture. The straightness of its bill, for example, may be adaptive for enhanced visually-directed use of tools. Here, we examine in detail the shape and internal structure of the New Caledonian crow’s bill using Principal Components Analysis and Computed Tomography within a comparative framework. We found that the bill has a combination of interrelated shape and structural features unique within Corvus, and possibly birds generally. The upper mandible is relatively deep and short with a straight cutting edge, and the lower mandible is strengthened and upturned. These novel combined attributes would be functional for (i) counteracting the unique loading patterns acting on the bill when manipulating tools, (ii) a strong precision grip to hold tools securely, and (iii) enhanced visually-guided tool use. Our findings indicate that the New Caledonian crow’s innovative bill has been adapted for tool manipulation to at least some degree. Early increased sophistication of tools may require the co-evolution of morphology that provides improved manipulatory skills. PMID:26955788

  4. Patient-Adaptive Reconstruction and Acquisition in Dynamic Imaging with Sensitivity Encoding (PARADISE)

    PubMed Central

    Sharif, Behzad; Derbyshire, J. Andrew; Faranesh, Anthony Z.; Bresler, Yoram

    2010-01-01

    MR imaging of the human heart without explicit cardiac synchronization promises to extend the applicability of cardiac MR to a larger patient population and potentially expand its diagnostic capabilities. However, conventional non-gated imaging techniques typically suffer from low image quality or inadequate spatio-temporal resolution and fidelity. Patient-Adaptive Reconstruction and Acquisition in Dynamic Imaging with Sensitivity Encoding (PARADISE) is a highly-accelerated non-gated dynamic imaging method that enables artifact-free imaging with high spatio-temporal resolutions by utilizing novel computational techniques to optimize the imaging process. In addition to using parallel imaging, the method gains acceleration from a physiologically-driven spatio-temporal support model; hence, it is doubly accelerated. The support model is patient-adaptive, i.e., its geometry depends on dynamics of the imaged slice, e.g., subject’s heart-rate and heart location within the slice. The proposed method is also doubly adaptive as it adapts both the acquisition and reconstruction schemes. Based on the theory of time-sequential sampling, the proposed framework explicitly accounts for speed limitations of gradient encoding and provides performance guarantees on achievable image quality. The presented in-vivo results demonstrate the effectiveness and feasibility of the PARADISE method for high resolution non-gated cardiac MRI during a short breath-hold. PMID:20665794

  5. Integrated Adaptive Scenarios for Agriculture: Synergies and Tradeoffs.

    NASA Astrophysics Data System (ADS)

    Malek, K.; Rajagopalan, K.; Adam, J. C.; Brady, M.; Stockle, C.; Liu, M.; Kruger, C. E.

    2017-12-01

    A wide variety of factors can drive adaptation of the agricultural production sector in response to climate change. Warming and increased growing season length can lead to adoption of newer plant varieties as well as increases in double cropping systems. Changes in expectations of drought frequency or economic factors could lead to adoption of new technology (such as irrigation technology or water trading systems) or crop choices with a view to reducing farm-level risk, and these choices can result in unintended system-wide effects. These are all examples of producer adaptation decisions made with a long-term (multiple decades) view. In addition, producers respond to short-term (current year) shocks - such as drought events - through management strategies that include deficit irrigation, fallowing, nutrient management, and engaging in water trading. The effects of these short- and long-term decisions are not independent, and each can drive or be driven by the other. For example, investment in new irrigation systems (long-term) can be driven by expectations of short-term crop productivity losses in drought years. Similarly, the capacity to manage short-term shocks will depend on crop type and variety as well as adopted irrigation technologies. Our overarching objective is to understand the synergies and tradeoffs that exist when combining three potential long-term adaptation strategies and two short-term adaptation strategies. We apply the integrated crop-hydrology modeling framework VIC-CropSyst, along with the water management module Yakima RiverWare, to address these questions over our test area, the Yakima River basin. We consider adoption of (a) more efficient irrigation technologies, slower growing crop varieties, and increased prevalence of double cropping systems as long-term adaptation strategies; and (b) fallowing and deficit irrigation as short-term responses to droughts. We evaluate the individual and combined effects of these strategies on agricultural production. Preliminary results indicate that long-term adaptation strategies affect short-run adaptive capacities to drought shocks. The strategies are complementary in certain situations and result in tradeoffs in others, and we characterize these differences.

  6. Cross-cultural adaptation and clinimetric property of Korean version of the Chronic Pain Coping Inventory-42 in patients with chronic low back pain.

    PubMed

    Ko, Young-Mi; Park, Won-Beom; Lim, Jae-Young

    2010-03-15

    Validation of a translated, culturally adapted questionnaire. We developed a Korean version of the Chronic Pain Coping Inventory-42 (CPCI-42) by performing a cross-cultural adaptation, and evaluated its reliability and validity. The CPCI is a widely used and validated instrument for measuring coping strategies in chronic pain. However, no validated and culturally adapted version was available in Asian countries. We assessed 142 patients with chronic low back pain using the CPCI-42 and measures of physical disability, pain, and quality of life. Test-retest reliability was evaluated using results from 93 of the 142 patients; the interval between test and retest varied from 2 weeks to 1 month. Criterion validity was evaluated using correlations between the CPCI-42 and the Oswestry Disability Index, the Brief Pain Inventory, and the Short Form 36-item Health Survey (version 2.0). Construct validity was assessed using exploratory factor analysis. The Korean version of the CPCI-42 had high internal consistency (Cronbach's alpha >0.70), with the exception of the task persistence and relaxation subscales. Illness-focused coping (guarding, resting, asking for assistance) and other-focused coping (seeking social support) were most significantly correlated with the Oswestry Disability Index, the Brief Pain Inventory, and the Short Form 36-item Health Survey. Outcomes for task persistence were contrary to the other subscales in wellness-focused coping. Construct validity by factor analysis produced results similar to the original CPCI subscales. However, several factors showed cross-loading in the 8-factor solution. Despite linguistic and cultural differences, the Korean version of the CPCI-42 is overall a meaningful tool and produces results sufficiently similar to those of the original CPCI-42.

  7. Fast reversible wavelet image compressor

    NASA Astrophysics Data System (ADS)

    Kim, HyungJun; Li, Ching-Chung

    1996-10-01

    We present a unified image compressor with spline biorthogonal wavelets and dyadic rational filter coefficients which gives high computational speed and excellent compression performance. Convolutions with these filters can be performed using only arithmetic shifting and addition operations. Wavelet coefficients can be encoded with an arithmetic coder which also uses arithmetic shifting and addition operations. Therefore, from beginning to end, the whole encoding/decoding process can be completed within a short period of time. The proposed method extends naturally from lossless compression to the lossy, high compression range and can be easily adapted to progressive reconstruction.
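
    As a stand-in illustration of filtering with dyadic rational coefficients realized purely by shifts and additions, the sketch below implements the integer 5/3 lifting wavelet (predict weight 1/2, update weight 1/4). The paper's spline biorthogonal filters may differ; only the shift-and-add principle is shown.

        def lift_53_forward(x):
            """Integer 5/3 lifting transform; all scaling is done with bit shifts."""
            even, odd = x[0::2], x[1::2]
            n = len(odd)
            detail = [odd[i] - ((even[i] + even[min(i + 1, len(even) - 1)]) >> 1)
                      for i in range(n)]
            approx = [even[i] + ((detail[max(i - 1, 0)] + detail[min(i, n - 1)] + 2) >> 2)
                      for i in range(len(even))]
            return approx, detail

        def lift_53_inverse(approx, detail):
            """Exact inverse: undo the update step, then the predict step."""
            n = len(detail)
            even = [approx[i] - ((detail[max(i - 1, 0)] + detail[min(i, n - 1)] + 2) >> 2)
                    for i in range(len(approx))]
            odd = [detail[i] + ((even[i] + even[min(i + 1, len(even) - 1)]) >> 1)
                   for i in range(n)]
            out = []
            for e, o in zip(even, odd):
                out.extend([e, o])
            return out

        signal = [10, 12, 14, 13, 11, 9, 8, 10]
        a, d = lift_53_forward(signal)
        print(a, d, lift_53_inverse(a, d) == signal)   # perfect reconstruction check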

  8. Adaptive synchrosqueezing based on a quilted short-time Fourier transform

    NASA Astrophysics Data System (ADS)

    Berrian, Alexander; Saito, Naoki

    2017-08-01

    In recent years, the synchrosqueezing transform (SST) has gained popularity as a method for the analysis of signals that can be broken down into multiple components determined by instantaneous amplitudes and phases. One such version of SST, based on the short-time Fourier transform (STFT), enables the sharpening of instantaneous frequency (IF) information derived from the STFT, as well as the separation of amplitude-phase components corresponding to distinct IF curves. However, this SST is limited by the time-frequency resolution of the underlying window function, and may not resolve signals exhibiting diverse time-frequency behaviors with sufficient accuracy. In this work, we develop a framework for an SST based on a "quilted" short-time Fourier transform (SST-QSTFT), which allows adaptation to signal behavior in separate time-frequency regions through the use of multiple windows. This motivates us to introduce a discrete reassignment frequency formula based on a finite difference of the phase spectrum, ensuring computational accuracy for a wider variety of windows. We develop a theoretical framework for the SST-QSTFT in both the continuous and the discrete settings, and describe an algorithm for the automatic selection of optimal windows depending on the region of interest. Using synthetic data, we demonstrate the superior numerical performance of SST-QSTFT relative to other SST methods in a noisy context. Finally, we apply SST-QSTFT to audio recordings of animal calls to demonstrate the potential of our method for the analysis of real bioacoustic signals.
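
    A minimal sketch of the discrete reassignment-frequency idea follows: the instantaneous frequency is estimated from the frame-to-frame STFT phase increment, measured relative to each bin's expected rotation and added to the bin centre frequency. The window, hop, test signal and scipy phase convention are assumptions, and the quilted multi-window machinery of SST-QSTFT is not reproduced.

        import numpy as np
        from scipy.signal import stft

        fs = 1000.0
        t = np.arange(0, 2.0, 1.0 / fs)
        x = np.cos(2 * np.pi * (50 * t + 10 * t**2))        # linear chirp, 50 -> 90 Hz

        nperseg, hop = 256, 64
        f, tau, Z = stft(x, fs=fs, nperseg=nperseg, noverlap=nperseg - hop)

        # Phase increment between consecutive frames, demodulated by each bin's expected
        # rotation (assumes the phase of each frame is referenced to the segment start,
        # as in scipy.signal.stft); the result corrects the bin centre frequency.
        expected = np.exp(-1j * 2 * np.pi * f * hop / fs)[:, None]
        dphi = np.angle(Z[:, 1:] * np.conj(Z[:, :-1]) * expected)
        inst_freq = f[:, None] + dphi / (2 * np.pi * hop / fs)

        # Track the estimate at the strongest bin of each frame: it should sweep ~50 to ~90 Hz
        strongest = np.abs(Z[:, 1:]).argmax(axis=0)
        print(np.round(inst_freq[strongest, np.arange(len(strongest))], 1))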

  9. Contrast adaptation induced by defocus - a possible error signal for emmetropization?

    PubMed

    Ohlendorf, Arne; Schaeffel, Frank

    2009-01-01

    To describe some features of contrast adaptation as induced by imposed positive or negative defocus, and to study its time course and selectivity for the sign of the imposed defocus. Contrast adaptation, CA (here referred to as any change in supra-threshold contrast sensitivity), was induced by presenting a movie to the subjects on a computer screen at 1 m distance for 10 min, while the right eye was defocused by a trial lens (+4 D, n=25; -4 D, n=10; -2 D, n=11 subjects). The PowerRefractor was used to track accommodation binocularly. Contrast sensitivity at threshold was measured by a method of adjustment with a Gabor patch of 1 deg angular subtense, filled with a 3.22 cyc/deg sine wave grating presented on a computer screen at 1 m distance on a gray background (33 cd/m(2)). Supra-threshold contrast sensitivity was quantified by an interocular contrast matching task, in which the subject had to match the contrast of the sine wave grating seen with the right eye to the contrast of a grating with a fixed contrast of 0.1. (1) Contrast sensitivity thresholds were not lowered by previous viewing of defocused movies. (2) Wearing positive lenses raised the supra-threshold contrast sensitivity in the right eye by about 30%, and it remained elevated for at least 2 min until baseline was reached after about 5 min. (3) CA was induced only by positive, not by negative, lenses, even after the distance of the computer screen was taken into account (1 m, equivalent to +1 D). In five subjects, binocular accommodation was tracked over the full adaptation period. Accommodation appeared to focus the eye not wearing a lens, but short transient switches in focus to the lens-wearing eye could not be entirely excluded. Transient contrast adaptation was found at 3.22 cyc/deg when positive lenses were worn but not with negative lenses. This asymmetry is intriguing. While it may represent an epiphenomenon of physiological optics, further experiments are necessary to determine whether it could also trace back to differences in CA with defocus of different sign.

  10. Revision and Expansion of Navy Computer Adaptive Personality Scales (NCAPS)

    DTIC Science & Technology

    2007-08-01

    Navy Personnel Research, Studies, and Technology Division. Revision and Expansion of Navy Computer Adaptive Personality Scales (NCAPS); Robert J. Schneider, Ph.D.; Kerri L. Ferstl, Ph.D.; TN-07-12, August 2007.

  11. Investigation of Item-Pair Presentation and Construct Validity of the Navy Computer Adaptive Personality Scales (NCAPS)

    DTIC Science & Technology

    2006-10-01

    Investigation of Item-Pair Presentation and Construct Validity of the Navy Computer Adaptive Personality Scales (NCAPS); Christina M. Underhill, Ph.D. Reviewed and approved by Jacqueline A. Mottern. Program element numbers 0602236N and 0603236N.

  12. Linear hypergeneralization of learned dynamics across movement speeds reveals anisotropic, gain-encoding primitives for motor adaptation.

    PubMed

    Joiner, Wilsaan M; Ajayi, Obafunso; Sing, Gary C; Smith, Maurice A

    2011-01-01

    The ability to generalize learned motor actions to new contexts is a key feature of the motor system. For example, the ability to ride a bicycle or swing a racket is often first developed at lower speeds and later applied to faster velocities. A number of previous studies have examined the generalization of motor adaptation across movement directions and found that the learned adaptation decays in a pattern consistent with the existence of motor primitives that display narrow Gaussian tuning. However, few studies have examined the generalization of motor adaptation across movement speeds. Following adaptation to linear velocity-dependent dynamics during point-to-point reaching arm movements at one speed, we tested the ability of subjects to transfer this adaptation to short-duration higher-speed movements aimed at the same target. We found near-perfect linear extrapolation of the trained adaptation with respect to both the magnitude and the time course of the velocity profiles associated with the high-speed movements: a 69% increase in movement speed corresponded to a 74% extrapolation of the trained adaptation. The close match between the increase in movement speed and the corresponding increase in adaptation beyond what was trained indicates linear hypergeneralization. Computational modeling shows that this pattern of linear hypergeneralization across movement speeds is not compatible with previous models of adaptation in which motor primitives display isotropic Gaussian tuning of motor output around their preferred velocities. Instead, we show that this generalization pattern indicates that the primitives involved in the adaptation to viscous dynamics display anisotropic tuning in velocity space and encode the gain between motor output and motion state rather than motor output itself.

  13. An Investigation of the Validity and Reliability of the Adapted Mathematics Anxiety Rating Scale-Short Version (MARS-SV) among Turkish Students

    ERIC Educational Resources Information Center

    Baloglu, Mustafa

    2010-01-01

    This study adapted the Mathematics Anxiety Rating Scale-Short Version (MARS-SV) into Turkish and investigated the validity and reliability of the adapted instrument. Twenty-five bilingual experts agreed on the language validity, and 49 Turkish language experts agreed on the conformity and understandability of the scale's items. Thirty-two subject…

  14. Speech coding at low to medium bit rates

    NASA Astrophysics Data System (ADS)

    Leblanc, Wilfred Paul

    1992-09-01

    Improved search techniques coupled with improved codebook design methodologies are proposed to improve the performance of conventional code-excited linear predictive coders for speech. Improved methods for quantizing the short term filter are developed by applying a tree search algorithm and joint codebook design to multistage vector quantization. Joint codebook design procedures are developed to design locally optimal multistage codebooks. Weighting during centroid computation is introduced to improve the outlier performance of the multistage vector quantizer. Multistage vector quantization is shown to be robust both to input characteristics and to channel errors. Spectral distortions of about 1 dB are obtained at rates of 22-28 bits/frame. Structured codebook design procedures for excitation in code-excited linear predictive coders are compared to general codebook design procedures. Little performance is lost by imposing significant structure on the excitation codebooks, while the search complexity is greatly reduced. Sparse multistage configurations are proposed for reducing computational complexity and memory size. Improved search procedures that attempt joint optimization of the short term filter, the adaptive codebook, and the excitation are applied to code-excited linear prediction. Improvements in signal-to-noise ratio of 1-2 dB are realized in practice.

  15. Medical physics practice in the next decade

    PubMed Central

    Paliwal, Bhudatt

    2006-01-01

    Impressive advances in computers and materials science have fueled a broad-based confluence of basic science breakthroughs. These advances are making us reformulate our learning, teaching and credentialing methodologies and our research and development frontiers. We are now in the age of molecular medicine. Across the entire field of health care, a paradigm shift from population-based solutions to individual-specific care is taking place. These trends are reshaping the practice of medical physics. In this short presentation, examples are given to illustrate developments in image-guided, intensity-modulated and adaptive helical tomotherapy, and enhanced application of intensity-modulated radiotherapy (IMRT) using adaptive radiotherapy and conformal avoidance. These advances include improved normal tissue sparing and permit dose reconstruction and verification, thereby allowing significant biologically effective dose escalation and reduced radiation toxicity. The intrinsic capability of helical TomoTherapy for megavoltage CT imaging for IMRT image guidance is also discussed. Finally, developments in motion management are described. PMID:22275799

  16. Development and Validation of a Short-Form Adaptation of the Age-Related Vision Loss Scale: The AVL12

    ERIC Educational Resources Information Center

    Horowitz, Amy; Reinhardt, Joann P.; Raykov, Tenko

    2007-01-01

    This article describes the development and evaluation of a short form of the 24-item Adaptation to Age-Related Vision Loss (AVL) scale. The evaluation provided evidence of the reliability and validity of the short form (the AVL12), for significant interindividual differences at the baseline and for individual-level change in AVL scores over time.…

  17. Adjoint-Based, Three-Dimensional Error Prediction and Grid Adaptation

    NASA Technical Reports Server (NTRS)

    Park, Michael A.

    2002-01-01

    Engineering computational fluid dynamics (CFD) analysis and design applications focus on output functions (e.g., lift, drag). Errors in these output functions are generally unknown and conservatively accurate solutions may be computed. Computable error estimates can offer the possibility to minimize computational work for a prescribed error tolerance. Such an estimate can be computed by solving the flow equations and the linear adjoint problem for the functional of interest. The computational mesh can be modified to minimize the uncertainty of a computed error estimate. This robust mesh-adaptation procedure automatically terminates when the simulation is within a user specified error tolerance. This procedure for estimating and adapting to error in a functional is demonstrated for three-dimensional Euler problems. An adaptive mesh procedure that links to a Computer Aided Design (CAD) surface representation is demonstrated for wing, wing-body, and extruded high lift airfoil configurations. The error estimation and adaptation procedure yielded corrected functions that are as accurate as functions calculated on uniformly refined grids with ten times as many grid points.
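
    A tiny linear-algebra illustration of the adjoint error estimate follows: for A u = b with output J = g.u, the output error of an approximate solution u_h equals the adjoint-weighted residual psi.(b - A u_h), where A^T psi = g. In this linear setting the correction is exact; in CFD it provides an estimate that drives mesh adaptation. The matrices below are random stand-ins, not a flow problem.

        import numpy as np

        rng = np.random.default_rng(3)

        n = 50
        A = rng.normal(size=(n, n)) + n * np.eye(n)    # well-conditioned stand-in system
        b = rng.normal(size=n)
        g = rng.normal(size=n)                         # functional weights (e.g. "lift")

        u_exact = np.linalg.solve(A, b)
        u_h = u_exact + 1e-2 * rng.normal(size=n)      # stand-in for a coarse/inexact solution

        residual = b - A @ u_h
        psi = np.linalg.solve(A.T, g)                  # adjoint solution for the functional
        error_estimate = psi @ residual                # adjoint-weighted residual

        print("true output error:", g @ (u_exact - u_h))
        print("adjoint estimate :", error_estimate)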

  18. Near Real-Time Image Reconstruction

    NASA Astrophysics Data System (ADS)

    Denker, C.; Yang, G.; Wang, H.

    2001-08-01

    In recent years, post-facto image-processing algorithms have been developed to achieve diffraction-limited observations of the solar surface. We present a combination of frame selection, speckle-masking imaging, and parallel computing which provides real-time, diffraction-limited, 256×256 pixel images at a 1-minute cadence. Our approach to achieving diffraction-limited observations is complementary to adaptive optics (AO). At the moment, AO is limited by the fact that it corrects wavefront aberrations only for a field of view comparable to the isoplanatic patch. This limitation does not apply to speckle-masking imaging. However, speckle-masking imaging relies on short-exposure images, which limits its spectroscopic applications. The parallel processing of the data is performed on a Beowulf-class computer which utilizes off-the-shelf, mass-market technologies to provide high computational performance for scientific calculations and applications at low cost. Beowulf computers have great potential, not only for image reconstruction, but for any kind of complex data reduction. Immediate access to high-level data products and direct visualization of dynamic processes on the Sun are two of the advantages to be gained.

  19. Accurate and general treatment of electrostatic interaction in Hamiltonian adaptive resolution simulations

    NASA Astrophysics Data System (ADS)

    Heidari, M.; Cortes-Huerto, R.; Donadio, D.; Potestio, R.

    2016-10-01

    In adaptive resolution simulations the same system is concurrently modeled with different resolution in different subdomains of the simulation box, thereby enabling an accurate description in a small but relevant region, while the rest is treated with a computationally parsimonious model. In this framework, electrostatic interaction, whose accurate treatment is a crucial aspect in the realistic modeling of soft matter and biological systems, represents a particularly acute problem due to the intrinsic long-range nature of Coulomb potential. In the present work we propose and validate the usage of a short-range modification of Coulomb potential, the Damped shifted force (DSF) model, in the context of the Hamiltonian adaptive resolution simulation (H-AdResS) scheme. This approach, which is here validated on bulk water, ensures a reliable reproduction of the structural and dynamical properties of the liquid, and enables a seamless embedding in the H-AdResS framework. The resulting dual-resolution setup is implemented in the LAMMPS simulation package, and its customized version employed in the present work is made publicly available.
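
    The damped shifted force pair potential can be sketched directly; the form below is the commonly used Fennell-Gezelter expression, in which the erfc-damped 1/r interaction is shifted so that both the potential and the force vanish at the cutoff. Charges, damping parameter and cutoff are illustrative, and the H-AdResS coupling itself is not shown.

        import numpy as np
        from scipy.special import erfc

        def dsf_coulomb(r, qi, qj, alpha=0.2, rc=12.0):
            """Damped shifted force (DSF) Coulomb pair potential (Fennell-Gezelter form)."""
            r = np.asarray(r, dtype=float)
            shift = erfc(alpha * rc) / rc
            force_shift = (erfc(alpha * rc) / rc**2
                           + 2.0 * alpha / np.sqrt(np.pi) * np.exp(-(alpha * rc) ** 2) / rc)
            v = qi * qj * (erfc(alpha * r) / r - shift + force_shift * (r - rc))
            return np.where(r <= rc, v, 0.0)            # zero beyond the cutoff

        r = np.linspace(1.0, 12.0, 6)
        print(dsf_coulomb(r, qi=1.0, qj=-1.0))          # smoothly approaches zero at the cutoff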

  20. Kron-Branin modelling of ultra-short pulsed signal microelectrode

    NASA Astrophysics Data System (ADS)

    Xu, Zhifei; Ravelo, Blaise; Liu, Yang; Zhao, Lu; Delaroche, Fabien; Vurpillot, Francois

    2018-06-01

    An uncommon circuit model of a microelectrode for ultra-short signal propagation is developed. The proposed model is based on the Tensorial Analysis of Networks (TAN) using the Kron-Branin (KB) formalism. The systemic graph topology equivalent to the considered structure is established by taking the branch currents as the unknown variables. The TAN mathematical solution is determined after identification of the KB characteristic matrix. The TAN can integrate various physical parameters of the structure. As a proof of concept, via-hole-ended microelectrodes implemented on a Kapton substrate were designed, fabricated and tested. The 0.1-MHz-to-6-GHz S-parameter KB model, simulation and measurement are in good agreement. In addition, time-domain analyses with nanosecond-duration pulse signals were carried out to predict the microelectrode signal integrity. The modelled microstrip electrode is typically integrated in atom probe tomography instruments. The proposed KB method is particularly beneficial with respect to computation speed and adaptability to various structures.

  1. Star adaptation for two-algorithms used on serial computers

    NASA Technical Reports Server (NTRS)

    Howser, L. M.; Lambiotte, J. J., Jr.

    1974-01-01

    Two representative algorithms used on a serial computer and presently executed on the Control Data Corporation 6000 computer were adapted to execute efficiently on the Control Data STAR-100 computer. Gaussian elimination for the solution of simultaneous linear equations and the Gauss-Legendre quadrature formula for the approximation of an integral are the two algorithms discussed. A description is given of how the programs were adapted for STAR and why these adaptations were necessary to obtain an efficient STAR program. Some points to consider when adapting an algorithm for STAR are discussed. Program listings of the 6000 version coded in 6000 FORTRAN, the adapted STAR version coded in 6000 FORTRAN, and the STAR version coded in STAR FORTRAN are presented in the appendices.

  2. Learners' Perceptions and Illusions of Adaptivity in Computer-Based Learning Environments

    ERIC Educational Resources Information Center

    Vandewaetere, Mieke; Vandercruysse, Sylke; Clarebout, Geraldine

    2012-01-01

    Research on computer-based adaptive learning environments has shown exemplary growth. Although the mechanisms of effective adaptive instruction are unraveled systematically, little is known about the relative effect of learners' perceptions of adaptivity in adaptive learning environments. As previous research has demonstrated that the learners'…

  3. Sockets Manufactured by CAD/CAM Method Have Positive Effects on the Quality of Life of Patients With Transtibial Amputation.

    PubMed

    Karakoç, Mehmet; Batmaz, İbrahim; Sariyildiz, Mustafa Akif; Yazmalar, Levent; Aydin, Abdülkadir; Em, Serda

    2017-08-01

    Patients with amputation need a prosthesis to move around comfortably. One of the most important parts of a good prosthesis is the socket. Currently, the most commonly used method is the traditional socket manufacturing method, which involves manual work; however, computer-aided design/computer-aided manufacturing (CAD/CAM) has also been used in recent years. The present study aimed to investigate the effects of sockets manufactured by the traditional and CAD/CAM methods on the clinical characteristics and quality of life of patients with transtibial amputation. The study included 72 patients with transtibial amputation using a prosthesis, 36 of whom had CAD/CAM prosthetic sockets (group 1) and 36 of whom had traditional prosthetic sockets (group 2). Amputation reason, prosthesis lifetime, walking time and distance with the prosthesis, pain-free walking time with the prosthesis, production time of the prosthesis, and adaptation time to the prosthesis were recorded. Quality of life was assessed using the 36-item Short Form Health Survey questionnaire and the Trinity Amputation and Prosthesis Experience Scales. Walking time and distance and pain-free walking time with the prosthesis were significantly better in group 1 than in group 2. Furthermore, the prosthesis was applied in a significantly shorter time, and socket adaptation time was significantly shorter, in group 1. Except for emotional role limitation, all 36-item Short Form Health Survey questionnaire parameters were significantly better in group 1 than in group 2. Trinity Amputation and Prosthesis Experience Scales activity limitation scores of group 1 were lower, and satisfaction with the prosthesis scores were higher, than those of group 2. Our study demonstrated that sockets manufactured by the CAD/CAM method yield better outcomes in the quality of life of patients with transtibial amputation than sockets manufactured by the traditional method.

  4. Conflict-driven adaptive control is enhanced by integral negative emotion on a short time scale.

    PubMed

    Yang, Qian; Pourtois, Gilles

    2018-02-05

    Negative emotion influences cognitive control, and more specifically conflict adaptation. However, discrepant results have often been reported in the literature. In this study, we broke down negative emotion into integral and incidental components using a modern motivation-based framework, and assessed whether the former could change conflict adaptation. In the first experiment, we manipulated the duration of the inter-trial-interval (ITI) to assess the actual time-scale of this effect. Integral negative emotion was induced by using loss-related feedback contingent on task performance, and measured at the subjective and physiological levels. Results showed that conflict-driven adaptive control was enhanced when integral negative emotion was elicited, compared to a control condition without changes in defensive motivation. Importantly, this effect was only found when a short, as opposed to long ITI was used, suggesting that it had a short time scale. In the second experiment, we controlled for effects of feature repetition and contingency learning, and replicated an enhanced conflict adaptation effect when integral negative emotion was elicited and a short ITI was used. We interpret these new results against a standard cognitive control framework assuming that integral negative emotion amplifies specific control signals transiently, and in turn enhances conflict adaptation.

  5. Implementation of Multispectral Image Classification on a Remote Adaptive Computer

    NASA Technical Reports Server (NTRS)

    Figueiredo, Marco A.; Gloster, Clay S.; Stephens, Mark; Graves, Corey A.; Nakkar, Mouna

    1999-01-01

    As the demand for higher performance computers for the processing of remote sensing science algorithms increases, the need to investigate new computing paradigms is justified. Field Programmable Gate Arrays enable the implementation of algorithms at the hardware gate level, leading to orders of magnitude performance increases over microprocessor-based systems. The automatic classification of spaceborne multispectral images is an example of a computation-intensive application that can benefit from implementation on an FPGA-based custom computing machine (adaptive or reconfigurable computer). A probabilistic neural network is used here to classify pixels of a multispectral LANDSAT-2 image. The implementation described utilizes Java client/server application programs to access the adaptive computer from a remote site. Results verify that a remote hardware version of the algorithm (implemented on an adaptive computer) is significantly faster than a local software version of the same algorithm implemented on a typical general-purpose computer.
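
    A minimal probabilistic neural network (Parzen-window) classifier of the kind described can be sketched as follows: each class-conditional density is a sum of Gaussian kernels centred on that class's training pixels, and a pixel is assigned to the class with the highest density. Synthetic four-band data stand in for LANDSAT pixels; the FPGA and Java client/server layers are not shown.

        import numpy as np

        rng = np.random.default_rng(4)

        def pnn_classify(train_x, train_y, test_x, sigma=0.1):
            """Parzen-window PNN: class score = mean Gaussian kernel over that class's pixels."""
            labels = np.unique(train_y)
            scores = np.empty((len(test_x), len(labels)))
            for j, c in enumerate(labels):
                centers = train_x[train_y == c]
                d2 = ((test_x[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
                scores[:, j] = np.exp(-d2 / (2.0 * sigma**2)).mean(axis=1)
            return labels[np.argmax(scores, axis=1)]

        # Two synthetic spectral classes in 4 bands (stand-ins for multispectral pixels)
        class0 = rng.normal([0.2, 0.3, 0.4, 0.5], 0.05, size=(100, 4))
        class1 = rng.normal([0.5, 0.4, 0.3, 0.2], 0.05, size=(100, 4))
        train_x = np.vstack([class0, class1])
        train_y = np.array([0] * 100 + [1] * 100)
        test_x = rng.normal([0.5, 0.4, 0.3, 0.2], 0.05, size=(10, 4))
        print(pnn_classify(train_x, train_y, test_x))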

  6. Molecular dynamics study of the structural and dynamic characteristics of the polyextremophilic short-chain dehydrogenase from the Thermococcus sibiricus archaeon and its homologues

    NASA Astrophysics Data System (ADS)

    Popinako, Anna V.; Antonov, Mikhail Yu.; Bezsudnova, Ekaterina Yu.; Prokopiev, Georgiy A.; Popov, Vladimir O.

    2017-11-01

    The study of structural adaptations of proteins from polyextremophilic organisms using computational molecular dynamics methods is appealing because the obtained knowledge can be applied to the construction of synthetic proteins with high activity and stability in polyextreme media, which is useful for many industrial applications. To investigate molecular adaptations to high temperature, we have focused on the superthermostable short-chain dehydrogenase TsAdh319 from the polyextremophilic archaeon Thermococcus sibiricus and its closest structural homologues. The molecular dynamics method is widely used for molecular structure refinement, investigation of the motion of biological macromolecules, and, consequently, for interpreting the results of certain biophysical experiments. We performed molecular dynamics simulations of the proteins at different temperatures. Comparison of the root mean square fluctuations (RMSF) of the atoms in thermophilic alcohol dehydrogenases (ADHs) at 300 K and 358 K revealed the existence of stable residues at 358 K. These residues surround the active site and form a "nucleus of rigidity" in thermophilic ADHs. The results of our studies suggest that the existence of the "nucleus of rigidity" is crucial for the stability of TsAdh319. The absence of the "nucleus of rigidity" in non-thermostable proteins causes fluctuations throughout the protein, especially on the surface, triggering the process of denaturation at high temperatures.
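
    The per-atom RMSF compared at 300 K and 358 K can be computed as sketched below: the root mean square deviation of each atom's position from its time-averaged position over the trajectory. A synthetic trajectory stands in for the MD output, and frame superposition (removal of global rotation and translation) is omitted for brevity.

        import numpy as np

        rng = np.random.default_rng(5)

        def rmsf(trajectory):
            """Per-atom RMSF_i = sqrt(<|r_i(t) - <r_i>|^2>_t); trajectory: (n_frames, n_atoms, 3)."""
            mean_pos = trajectory.mean(axis=0)                  # time-averaged positions
            disp2 = ((trajectory - mean_pos) ** 2).sum(axis=2)  # squared displacements
            return np.sqrt(disp2.mean(axis=0))                  # per-atom RMSF

        # 500 frames, 20 atoms: atoms 0-9 fluctuate less, mimicking a "nucleus of rigidity"
        traj = np.concatenate([rng.normal(0.0, 0.2, size=(500, 10, 3)),
                               rng.normal(0.0, 0.8, size=(500, 10, 3))], axis=1)
        print(np.round(rmsf(traj), 2))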

  7. Computer Security Awareness Guide for Department of Energy Laboratories, Government Agencies, and others for use with Lawrence Livermore National Laboratory's (LLNL): Computer security short subjects videos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    Lonnie Moore, the Computer Security Manager, CSSM/CPPM at Lawrence Livermore National Laboratory (LLNL), and Gale Warshawsky, the Coordinator for Computer Security Education & Awareness at LLNL, wanted to share topics such as computer ethics, software piracy, privacy issues, and protecting information in a format that would capture and hold an audience's attention. Four Computer Security Short Subject videos were produced which ranged from 1-3 minutes each. These videos are very effective education and awareness tools that can be used to generate discussions about computer security concerns and good computing practices. Leaders may incorporate the Short Subjects into presentations. After talking about a subject area, one of the Short Subjects may be shown to highlight that subject matter. Another method for sharing them could be to show a Short Subject first and then lead a discussion about its topic. The cast of characters, and a bit of information about their personalities in the LLNL Computer Security Short Subjects, is included in this report.

  8. Audio-Enhanced Tablet Computers to Assess Children’s Food Frequency From Migrant Farmworker Mothers

    PubMed Central

    Kilanowski, Jill F.; Trapl, Erika S.; Kofron, Ryan M.

    2014-01-01

    This study sought to improve data collection in children’s food frequency surveys for non-English speaking immigrant/migrant farmworker mothers using audio-enhanced tablet computers (ATCs). We hypothesized that by using technological adaptations, we would be able to improve data capture and therefore reduce lost surveys. This Food Frequency Questionnaire (FFQ), a paper-based dietary assessment tool, was adapted for ATCs and assessed consumption of 66 food items, asking 3 questions for each food item: frequency, quantity of consumption, and serving size. The tablet-based survey was audio enhanced, with each question “read” to participants, accompanied by food item images, together with an embedded short instructional video. Results indicated that respondents were able to complete the 198 questions from the 66-item FFQ on ATCs in approximately 23 minutes. Compared with paper-based FFQs, ATC-based FFQs had less missing data. Despite overall reductions in missing data through use of ATCs, respondents still appeared to have difficulty with question 2 of the FFQ. The ability to score the FFQ depended on which sections the missing data were located in. Unlike the paper-based FFQs, no ATC-based FFQs were unscored due to the amount or location of missing data. An ATC-based FFQ was feasible and increased the ability to score this survey of children’s food patterns from migrant farmworker mothers. This adapted technology may serve as an exemplar for other non-English speaking immigrant populations. PMID:25343004

  9. Application of Adaptive Decision Aiding Systems to Computer-Assisted Instruction. Final Report, January-December 1974.

    ERIC Educational Resources Information Center

    May, Donald M.; And Others

    The minicomputer-based Computerized Diagnostic and Decision Training (CDDT) system described combines the principles of artificial intelligence, decision theory, and adaptive computer assisted instruction for training in electronic troubleshooting. The system incorporates an adaptive computer program which learns the student's diagnostic and…

  10. Computer-Adaptive Testing: Implications for Students' Achievement, Motivation, Engagement, and Subjective Test Experience

    ERIC Educational Resources Information Center

    Martin, Andrew J.; Lazendic, Goran

    2018-01-01

    The present study investigated the implications of computer-adaptive testing (operationalized by way of multistage adaptive testing; MAT) and "conventional" fixed order computer testing for various test-relevant outcomes in numeracy, including achievement, test-relevant motivation and engagement, and subjective test experience. It did so…

  11. Authoring of Adaptive Computer Assisted Assessment of Free-Text Answers

    ERIC Educational Resources Information Center

    Alfonseca, Enrique; Carro, Rosa M.; Freire, Manuel; Ortigosa, Alvaro; Perez, Diana; Rodriguez, Pilar

    2005-01-01

    Adaptation techniques can be applied not only to the multimedia contents or navigational possibilities of a course, but also to the assessment. In order to facilitate the authoring of adaptive free-text assessment and its integration within adaptive web-based courses, Adaptive Hypermedia techniques and Free-text Computer Assisted Assessment are…

  12. Contribution of the cyclic nucleotide gated channel subunit, CNG-3, to olfactory plasticity in Caenorhabditis elegans.

    PubMed

    O'Halloran, Damien M; Altshuler-Keylin, Svetlana; Zhang, Xiao-Dong; He, Chao; Morales-Phan, Christopher; Yu, Yawei; Kaye, Julia A; Brueggemann, Chantal; Chen, Tsung-Yu; L'Etoile, Noelle D

    2017-03-13

    In Caenorhabditis elegans, the AWC neurons are thought to deploy a cGMP signaling cascade in the detection of and response to AWC sensed odors. Prolonged exposure to an AWC sensed odor in the absence of food leads to reversible decreases in the animal's attraction to that odor. This adaptation exhibits two stages referred to as short-term and long-term adaptation. Previously, the protein kinase G (PKG), EGL-4/PKG-1, was shown necessary for both stages of adaptation and phosphorylation of its target, the beta-type cyclic nucleotide gated (CNG) channel subunit, TAX-2, was implicated in the short term stage. Here we uncover a novel role for the CNG channel subunit, CNG-3, in short term adaptation. We demonstrate that CNG-3 is required in the AWC for adaptation to short (thirty minute) exposures of odor, and contains a candidate PKG phosphorylation site required to tune odor sensitivity. We also provide in vivo data suggesting that CNG-3 forms a complex with both TAX-2 and TAX-4 CNG channel subunits in AWC. Finally, we examine the physiology of different CNG channel subunit combinations.

  13. Hybrid Self-Adaptive Evolution Strategies Guided by Neighborhood Structures for Combinatorial Optimization Problems.

    PubMed

    Coelho, V N; Coelho, I M; Souza, M J F; Oliveira, T A; Cota, L P; Haddad, M N; Mladenovic, N; Silva, R C P; Guimarães, F G

    2016-01-01

    This article presents an Evolution Strategy (ES)-based algorithm designed to self-adapt its mutation operators, guiding the search through the solution space using a Self-Adaptive Reduced Variable Neighborhood Search procedure. In view of the specific local search operators for each individual, the proposed population-based approach also fits into the context of Memetic Algorithms. The proposed variant uses the Greedy Randomized Adaptive Search Procedure with different greedy parameters for generating its initial population, providing an interesting exploration-exploitation balance. To validate the proposal, this framework is applied to solve three different NP-hard combinatorial optimization problems: an Open-Pit-Mining Operational Planning Problem with dynamic allocation of trucks, an Unrelated Parallel Machine Scheduling Problem with Setup Times, and the calibration of a hybrid fuzzy model for Short-Term Load Forecasting. Computational results point out the convergence of the proposed model and highlight its ability to combine move operations from distinct neighborhood structures throughout the optimization. The results gathered and reported in this article represent collective evidence of the performance of the method on challenging combinatorial optimization problems from different application domains. The proposed evolution strategy demonstrates an ability to adapt the strength of the mutation disturbance during the generations of its evolution process. The effectiveness of the proposal motivates the application of this novel evolutionary framework to other combinatorial optimization problems.
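
    The abstract describes self-adaptation of mutation strength without pseudocode; the minimal (1, lambda)-evolution-strategy sketch below illustrates the general idea of log-normal self-adaptation of a step size on a toy continuous problem. The objective function, population size, and learning rate are illustrative assumptions, not the paper's hybrid combinatorial framework.

        import numpy as np

        rng = np.random.default_rng(0)

        def sphere(x):
            """Toy objective (minimise); the paper tackles combinatorial problems instead."""
            return float(np.sum(x ** 2))

        def one_comma_lambda_es(dim=10, lam=20, generations=200):
            x, sigma = rng.normal(size=dim), 1.0
            tau = 1.0 / np.sqrt(dim)               # learning rate for step-size self-adaptation
            for _ in range(generations):
                # Each offspring mutates its own step size, then its solution vector.
                sigmas = sigma * np.exp(tau * rng.normal(size=lam))
                offspring = x + sigmas[:, None] * rng.normal(size=(lam, dim))
                best = int(np.argmin([sphere(o) for o in offspring]))
                x, sigma = offspring[best], sigmas[best]   # step size evolves with the solution
            return x, sigma

        x_best, sigma_final = one_comma_lambda_es()
        print(sphere(x_best), sigma_final)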

  14. Test Anxiety, Computer-Adaptive Testing and the Common Core

    ERIC Educational Resources Information Center

    Colwell, Nicole Makas

    2013-01-01

    This paper highlights the current findings and issues regarding the role of computer-adaptive testing in test anxiety. The computer-adaptive test (CAT) proposed by one of the Common Core consortia brings these issues to the forefront. Research has long indicated that test anxiety impairs student performance. More recent research indicates that…

  15. The Cultural Adaptation Process during a Short-Term Study Abroad Experience in Swaziland

    ERIC Educational Resources Information Center

    Conner, Nathan W.; Roberts, T. Grady

    2015-01-01

    Globalization continuously shapes our world and influences post-secondary education. This study explored the cultural adaptation process of participants during a short-term study abroad program. Participants experienced stages which included initial feelings, cultural uncertainty, cultural barriers, cultural negativity, academic and career growth,…

  16. Adaptive Decision Aiding in Computer-Assisted Instruction: Adaptive Computerized Training System (ACTS).

    ERIC Educational Resources Information Center

    Hopf-Weichel, Rosemarie; And Others

    This report describes results of the first year of a three-year program to develop and evaluate a new Adaptive Computerized Training System (ACTS) for electronics maintenance training. (ACTS incorporates an adaptive computer program that learns the student's diagnostic and decision value structure, compares it to that of an expert, and adapts the…

  17. Noise Equalization for Ultrafast Plane Wave Microvessel Imaging.

    PubMed

    Song, Pengfei; Manduca, Armando; Trzasko, Joshua D; Chen, Shigao

    2017-11-01

    Ultrafast plane wave microvessel imaging significantly improves ultrasound Doppler sensitivity by increasing the number of Doppler ensembles that can be collected within a short period of time. The rich spatiotemporal plane wave data also enable more robust clutter filtering based on singular value decomposition. However, due to the lack of transmit focusing, plane wave microvessel imaging is very susceptible to noise. This paper was designed to: 1) study the relationship between ultrasound system noise (primarily time gain compensation induced) and microvessel blood flow signal and 2) propose an adaptive and computationally cost-effective noise equalization method that is independent of hardware or software imaging settings to improve microvessel image quality.
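
    The abstract mentions singular value decomposition (SVD) clutter filtering but does not spell out the noise-equalization algorithm itself; the hedged sketch below shows only the standard SVD clutter-filtering step on simulated plane-wave data (Casorati matrix, low-order singular values removed). The data shapes and the cutoff rank are illustrative assumptions, not the authors' settings.

        import numpy as np

        def svd_clutter_filter(frames, cutoff=5):
            """Remove slowly varying tissue clutter from a plane-wave ensemble.

            frames: (n_frames, nz, nx) stack of beamformed frames. The stack is
            reshaped into a Casorati matrix (pixels x frames), the lowest-order
            singular components (tissue/clutter) are zeroed, and the blood-flow
            residual is reshaped back.
            """
            n_frames, nz, nx = frames.shape
            casorati = frames.reshape(n_frames, nz * nx).T          # (pixels, frames)
            u, s, vt = np.linalg.svd(casorati, full_matrices=False)
            s[:cutoff] = 0.0                                         # discard clutter subspace
            filtered = (u * s) @ vt
            return filtered.T.reshape(n_frames, nz, nx)

        # Synthetic example: static background plus weak random fluctuations.
        rng = np.random.default_rng(1)
        frames = np.ones((100, 64, 64)) + 0.01 * rng.normal(size=(100, 64, 64))
        flow = svd_clutter_filter(frames)
        print(flow.std())   # residual after clutter removal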

  18. Attention and apparent motion.

    PubMed

    Horowitz, T; Treisman, A

    1994-01-01

    Two dissociations between short- and long-range motion in visual search are reported. Previous research has shown parallel processing for short-range motion and apparently serial processing for long-range motion. This finding has been replicated and it has also been found that search for short-range targets can be impaired both by using bicontrast stimuli, and by prior adaptation to the target direction of motion. Neither factor impaired search in long-range motion displays. Adaptation actually facilitated search with long-range displays, which is attributed to response-level effects. A feature-integration account of apparent motion is proposed. In this theory, short-range motion depends on specialized motion feature detectors operating in parallel across the display, but subject to selective adaptation, whereas attention is needed to link successive elements when they appear at greater separations, or across opposite contrasts.

  19. OFF-AXIS THERMAL AND SYNCHROTRON EMISSION FOR SHORT GAMMA RAY BURST

    NASA Astrophysics Data System (ADS)

    Xie, Xiaoyi

    2018-01-01

    We present light curves of photospheric and synchrotron emission from a relativistic jet propagating through the ejecta cloud of a neutron star merger. We use a moving-mesh relativistic hydrodynamics code with adaptive mesh refinement to compute the continuous evolution of the jet over 13 orders of magnitude in radius, from the scale of the central merger engine all the way through the late afterglow phase. As the jet propagates through the cloud it forms a hot cocoon surrounding the jet core. We find that the photospheric emission released by the hot cocoon is bright for on-axis observers and is detectable for off-axis observers at a wide range of observing angles for sufficiently close sources. As the jet and cocoon drive an external shock into the surrounding medium, we compute synchrotron light curves and find bright off-axis emission that differs from that of top-hat Blandford-McKee jets, especially for lower explosion energies.

  20. Using Computational Cognitive Modeling to Diagnose Possible Sources of Aviation Error

    NASA Technical Reports Server (NTRS)

    Byrne, M. D.; Kirlik, Alex

    2003-01-01

    We present a computational model of a closed-loop, pilot-aircraft-visual scene-taxiway system created to shed light on possible sources of taxi error. Creating the cognitive aspects of the model using ACT-R required us to conduct studies with subject matter experts to identify experiential adaptations pilots bring to taxiing. Five decision strategies were found, ranging from cognitively-intensive but precise, to fast, frugal but robust. We provide evidence for the model by comparing its behavior to a NASA Ames Research Center simulation of Chicago O'Hare surface operations. Decision horizons were highly variable; the model selected the most accurate strategy given time available. We found a signature in the simulation data of the use of globally robust heuristics to cope with short decision horizons as revealed by errors occurring most frequently at atypical taxiway geometries or clearance routes. These data provided empirical support for the model.

  1. Automatic Learning of Fine Operating Rules for Online Power System Security Control.

    PubMed

    Sun, Hongbin; Zhao, Feng; Wang, Hao; Wang, Kang; Jiang, Weiyong; Guo, Qinglai; Zhang, Boming; Wehenkel, Louis

    2016-08-01

    Fine operating rules for security control and an automatic system for their online discovery were developed to adapt to the development of smart grids. The automatic system uses the real-time system state to determine critical flowgates, and then a continuation power flow-based security analysis is used to compute the initial transfer capability of critical flowgates. Next, the system applies Monte Carlo simulation of expected short-term operating condition changes, feature selection, and linear least-squares fitting to derive the fine operating rules. The proposed system was validated both on an academic test system and on a provincial power system in China. The results indicated that the derived rules provide accuracy and good interpretability and are suitable for real-time power system security control. The use of high-performance computing systems enables these fine operating rules to be refreshed online every 15 min.
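
    The abstract outlines the pipeline (Monte Carlo sampling of operating conditions, feature selection, linear least-squares fitting of operating rules) without formulas; a minimal sketch of the final fitting step follows, using synthetic samples and NumPy's least-squares solver. The feature names and the linear-rule form are assumptions for illustration only.

        import numpy as np

        rng = np.random.default_rng(2)

        # Monte Carlo samples of operating conditions (hypothetical features):
        # column 0 = load level, column 1 = generation at a key plant (per unit).
        X = rng.uniform(0.5, 1.2, size=(500, 2))

        # Synthetic "true" transfer capability of a critical flowgate (unknown in practice).
        y = 1.8 - 0.9 * X[:, 0] + 0.4 * X[:, 1] + 0.02 * rng.normal(size=500)

        # Fit a linear operating rule: capability ~ w0 + w1*load + w2*generation.
        A = np.column_stack([np.ones(len(X)), X])
        w, *_ = np.linalg.lstsq(A, y, rcond=None)
        print("fitted rule coefficients:", w)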

  2. Improvements in Routing for Packet-Switched Networks

    DTIC Science & Technology

    1975-02-18

    Only OCR fragments of this report survive in the record: Appendix B documents ARPSIM, an adaptive routing program for computer simulation (initially run on a DDP-24 computer), including a flow diagram of the adaptive routine and an explanation of its variables.

  3. Computerized Adaptive Assessment of Cognitive Abilities among Disabled Adults.

    ERIC Educational Resources Information Center

    Engdahl, Brian

    This study examined computerized adaptive testing and cognitive ability testing of adults with cognitive disabilities. Adult subjects (N=250) were given computerized tests on language usage and space relations in one of three administration conditions: paper and pencil, fixed length computer adaptive, and variable length computer adaptive.…

  4. Three-phase short circuit calculation method based on pre-computed surface for doubly fed induction generator

    NASA Astrophysics Data System (ADS)

    Ma, J.; Liu, Q.

    2018-02-01

    This paper presents an improved short circuit calculation method, based on pre-computed surface to determine the short circuit current of a distribution system with multiple doubly fed induction generators (DFIGs). The short circuit current, injected into power grid by DFIG, is determined by low voltage ride through (LVRT) control and protection under grid fault. However, the existing methods are difficult to calculate the short circuit current of DFIG in engineering practice due to its complexity. A short circuit calculation method, based on pre-computed surface, was proposed by developing the surface of short circuit current changing with the calculating impedance and the open circuit voltage. And the short circuit currents were derived by taking into account the rotor excitation and crowbar activation time. Finally, the pre-computed surfaces of short circuit current at different time were established, and the procedure of DFIG short circuit calculation considering its LVRT was designed. The correctness of proposed method was verified by simulation.
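
    The paper's pre-computed surfaces themselves are not reproduced here; as a hedged illustration of the lookup idea, the sketch below builds a toy surface of short-circuit current over a grid of calculating impedance and open-circuit voltage and reads it back by bilinear interpolation. The grid ranges and the placeholder current relation are assumptions, not the DFIG model from the paper.

        import numpy as np

        # Toy pre-computed surface: I_sc tabulated over impedance Z and open-circuit voltage U.
        z_grid = np.linspace(0.1, 1.0, 10)      # per-unit calculating impedance
        u_grid = np.linspace(0.2, 1.0, 9)       # per-unit open-circuit voltage
        ZZ, UU = np.meshgrid(z_grid, u_grid, indexing="ij")
        I_surface = UU / ZZ                      # placeholder relation, not the DFIG model

        def lookup(z, u):
            """Bilinear interpolation of the pre-computed short-circuit current surface."""
            i = np.clip(np.searchsorted(z_grid, z) - 1, 0, len(z_grid) - 2)
            j = np.clip(np.searchsorted(u_grid, u) - 1, 0, len(u_grid) - 2)
            tz = (z - z_grid[i]) / (z_grid[i + 1] - z_grid[i])
            tu = (u - u_grid[j]) / (u_grid[j + 1] - u_grid[j])
            return ((1 - tz) * (1 - tu) * I_surface[i, j]
                    + tz * (1 - tu) * I_surface[i + 1, j]
                    + (1 - tz) * tu * I_surface[i, j + 1]
                    + tz * tu * I_surface[i + 1, j + 1])

        print(lookup(0.37, 0.85))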

  5. Cosmological N-body Simulation

    NASA Astrophysics Data System (ADS)

    Lake, George

    1994-05-01

    The "N" in N-body calculations has doubled every year for the last two decades. To continue this trend, the UW N-body group is working on algorithms for the fast evaluation of gravitational forces on parallel computers and establishing rigorous standards for the computations. In these algorithms, the computational cost per time step is ~10^3 pairwise forces per particle. A new adaptive time integrator enables us to perform high quality integrations that are fully temporally and spatially adaptive. SPH (smoothed particle hydrodynamics) will be added to simulate the effects of dissipating gas and magnetic fields. The importance of these calculations is two-fold. First, they determine the nonlinear consequences of theories for the structure of the Universe. Second, they are essential for the interpretation of observations. Every galaxy has six coordinates of velocity and position. Observations determine two sky coordinates and a line of sight velocity that bundles universal expansion (distance) together with a random velocity created by the mass distribution. Simulations are needed to determine the underlying structure and masses. The importance of simulations has moved from ex post facto explanation to an integral part of planning large observational programs. I will show why high quality simulations with "large N" are essential to accomplish our scientific goals. This year, our simulations have N >~ 10^7. This is sufficient to tackle some niche problems, but well short of our 5 year goal--simulating The Sloan Digital Sky Survey using a few Billion particles (a Teraflop-year simulation). Extrapolating past trends, we would have to "wait" 7 years for this hundred-fold improvement. Like past gains, significant changes in the computational methods are required for these advances. I will describe new algorithms, algorithmic hacks and a dedicated computer to perform Billion particle simulations. Finally, I will describe research that can be enabled by Petaflop computers. This research is supported by the NASA HPCC/ESS program.
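
    The abstract refers to fast force-evaluation algorithms and adaptive time integration without detail; for orientation, a plain direct-summation gravitational force kernel with Plummer softening is sketched below. The tree codes and integrators the record alludes to are far more involved; units, particle count, and the softening length here are illustrative assumptions.

        import numpy as np

        def direct_forces(pos, mass, softening=1e-2, G=1.0):
            """O(N^2) gravitational accelerations with Plummer softening.

            pos: (N, 3) positions, mass: (N,) masses. Tree codes replace this
            all-pairs sum with roughly N log N approximate interactions.
            """
            diff = pos[None, :, :] - pos[:, None, :]                 # r_j - r_i
            dist2 = np.sum(diff ** 2, axis=-1) + softening ** 2
            inv_r3 = dist2 ** -1.5
            np.fill_diagonal(inv_r3, 0.0)                            # no self-interaction
            return G * np.sum(diff * (mass[None, :, None] * inv_r3[:, :, None]), axis=1)

        rng = np.random.default_rng(3)
        pos = rng.normal(size=(256, 3))
        acc = direct_forces(pos, np.ones(256) / 256)
        print(acc.shape)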

  6. Plant Chemistry and Local Adaptation of a Specialized Folivore

    PubMed Central

    Laukkanen, Liisa; Leimu, Roosa; Muola, Anne; Lilley, Marianna; Salminen, Juha-Pekka; Mutikainen, Pia

    2012-01-01

    Local adaptation is central for creating and maintaining spatial variation in plant-herbivore interactions. Short-lived insect herbivores feeding on long-lived plants are likely to adapt to their local host plants, because of their short generation time, poor dispersal, and geographically varying selection due to variation in plant defences. In a reciprocal feeding trial, we investigated the impact of geographic variation in plant secondary chemistry of a long-lived plant, Vincetoxicum hirundinaria, on among-population variation in local adaptation of a specialist leaf-feeding herbivore, Abrostola asclepiadis. The occurrence and degree of local adaptation varied among populations. This variation correlated with qualitative and quantitative differences in plant chemistry among the plant populations. These findings provide insights into the mechanisms driving variation in local adaptation in this specialized plant-herbivore interaction. PMID:22666493

  7. Adaptation in CRISPR-Cas Systems.

    PubMed

    Sternberg, Samuel H; Richter, Hagen; Charpentier, Emmanuelle; Qimron, Udi

    2016-03-17

    Clustered regularly interspaced short palindromic repeats (CRISPR) and CRISPR-associated (Cas) proteins constitute an adaptive immune system in prokaryotes. The system preserves memories of prior infections by integrating short segments of foreign DNA, termed spacers, into the CRISPR array in a process termed adaptation. During the past 3 years, significant progress has been made on the genetic requirements and molecular mechanisms of adaptation. Here we review these recent advances, with a focus on the experimental approaches that have been developed, the insights they generated, and a proposed mechanism for self- versus non-self-discrimination during the process of spacer selection. We further describe the regulation of adaptation and the protein players involved in this fascinating process that allows bacteria and archaea to harbor adaptive immunity. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. A locally p-adaptive approach for Large Eddy Simulation of compressible flows in a DG framework

    NASA Astrophysics Data System (ADS)

    Tugnoli, Matteo; Abbà, Antonella; Bonaventura, Luca; Restelli, Marco

    2017-11-01

    We investigate the possibility of reducing the computational burden of LES models by employing local polynomial degree adaptivity in the framework of a high-order DG method. A novel degree adaptation technique specifically designed to be effective for LES applications is proposed, and its effectiveness is compared to that of other criteria already employed in the literature. The resulting locally adaptive approach achieves significant reductions in the computational cost of representative LES computations.

  9. A Research Program in Computer Technology. 1982 Annual Technical Report

    DTIC Science & Technology

    1983-03-01

    Only OCR fragments of this annual report survive in the record: the program, performed for the Defense Advanced Research Projects Agency, applies computer science and technology to areas of high DoD/military impact and includes a New Computing Environment effort on the investigation and adaptation of developing computer technologies to serve the research and military user communities.

  10. The representation of object viewpoint in human visual cortex.

    PubMed

    Andresen, David R; Vinberg, Joakim; Grill-Spector, Kalanit

    2009-04-01

    Understanding the nature of object representations in the human brain is critical for understanding the neural basis of invariant object recognition. However, the degree to which object representations are sensitive to object viewpoint is unknown. Using fMRI we employed a parametric approach to examine the sensitivity to object view as a function of rotation (0 degrees-180 degrees ), category (animal/vehicle) and fMRI-adaptation paradigm (short or long-lagged). For both categories and fMRI-adaptation paradigms, object-selective regions recovered from adaptation when a rotated view of an object was shown after adaptation to a specific view of that object, suggesting that representations are sensitive to object rotation. However, we found evidence for differential representations across categories and ventral stream regions. Rotation cross-adaptation was larger for animals than vehicles, suggesting higher sensitivity to vehicle than animal rotation, and was largest in the left fusiform/occipito-temporal sulcus (pFUS/OTS), suggesting that this region has low sensitivity to rotation. Moreover, right pFUS/OTS and FFA responded more strongly to front than back views of animals (without adaptation) and rotation cross-adaptation depended both on the level of rotation and the adapting view. This result suggests a prevalence of neurons that prefer frontal views of animals in fusiform regions. Using a computational model of view-tuned neurons, we demonstrate that differential neural view tuning widths and relative distributions of neural-tuned populations in fMRI voxels can explain the fMRI results. Overall, our findings underscore the utility of parametric approaches for studying the neural basis of object invariance and suggest that there is no complete invariance to object view in the human ventral stream.

  11. Psychosocial profiles of children with achondroplasia in terms of their short stature-related stress: a nationwide survey in Japan.

    PubMed

    Nishimura, Naoko; Hanaki, Keiichi

    2014-11-01

    To assess psychosocial profiles of children with achondroplasia using a nationwide survey. Achondroplasia, showing short stature and disproportionately short limbs, causes physical inconvenience such as difficulty in reaching high objects. It is, however, still controversial whether the condition is associated with psychological problems, especially in childhood. A cross-sectional descriptive design was employed. To evaluate psychosocial profiles and adaptation processes in children with achondroplasia, we developed an inventory of scales based on the psychological stress model, whose conceptual framework comprised stressor, coping process, coping resource and adaptation outcome domains. Participants were recruited nationwide through the largest advocacy support group for achondroplasia in Japan. Of the 130 group members, 73 X-ray-diagnosed patients, aged 8-18 years, completed the inventory of questionnaires to be analysed. As for the stressor domain, patients experienced short stature-related unpleasant experiences more frequently (z-score: +1·3 on average, +3·9 in physical inconvenience). Nevertheless, these experiences had little effect on the coping process (threat appraisal: -0·2, control appraisal: +0·1) and the adaptation outcome (stress response: +0·3, self-concept: 0·0). Interestingly, self-efficacy in the coping resource domain was noticeably increased (+3·1) and was strongly correlated with most variables in the coping process and in adaptation outcome domains. Although the children with achondroplasia experienced more short stature-related stressors, there was no evidence of any psychosocial maladaptation. This finding suggests that coping process as well as coping resources such as self-efficacy could be important targets for promoting psychological adjustment in children with achondroplasia. To help children with achondroplasia adapt socially, nurses and other healthcare providers should routinely assess their psychological adaptation process, especially cognitive appraisal and self-efficacy.

  12. Intestinal adaptation in short bowel syndrome: A case report.

    PubMed

    Palla, Viktoria-Varvara; Karaolanis, Georgios; Pentazos, Panagiotis; Ladopoulos, Alexios; Papageorgiou, Evaggelos

    2015-06-01

    Short bowel syndrome is a clinical entity that includes loss of energy, fluid, electrolytes or micronutrient balance because of inadequate functional intestinal length. This case report demonstrates the case of a woman who compensated for short bowel syndrome through intestinal adaptation, which is a complex process worthy of further investigation for the avoidance of dependence on total parenteral nutrition and of intestinal transplantation in such patients. Copyright © 2015 Arab Journal of Gastroenterology. Published by Elsevier B.V. All rights reserved.

  13. Short-Term Adaptive Modification of Dynamic Ocular Accommodation

    PubMed Central

    Bharadwaj, Shrikant R.; Vedamurthy, Indu; Schor, Clifton M.

    2009-01-01

    Purpose Indirect observations suggest that the neural control of accommodation may undergo adaptive recalibration in response to age-related biomechanical changes in the accommodative system. However, there has been no direct demonstration of such an adaptive capability. This investigation was conducted to demonstrate short-term adaptation of accommodative step response dynamics to optically induced changes in neuromuscular demands. Methods Repetitive changes in accommodative effort were induced in 15 subjects (18–34 years) with a double-step adaptation paradigm wherein an initial 2-D step change in blur was followed 350 ms later by either a 2-D step increase in blur (increasing-step paradigm) or a 1.75-D step decrease in blur (decreasing-step paradigm). Peak velocity, peak acceleration, and latency of 2-D single-step test responses were assessed before and after 1.5 hours of training with these paradigms. Results Peak velocity and peak acceleration of 2-D step responses increased after adaptation to the increasing-step paradigm (9/12 subjects), and they decreased after adaptation to the decreasing-step paradigm (4/9 subjects). Adaptive changes in peak velocity and peak acceleration generalized to responses that were smaller (1 D) and larger (3 D) than the 2-D adaptation stimulus. The magnitude of adaptation correlated poorly with the subject's age, but it was significantly negatively correlated with the preadaptation dynamics. Response latency decreased after adaptation, irrespective of the direction of adaptation. Conclusions Short-term adaptive changes in accommodative step response dynamics could be induced, at least in some of our subjects between 18 and 34 years, with a directional bias toward increasing rather than decreasing the dynamics. PMID:19255153

  14. The Cultural Adaptation Process of Agricultural and Life Sciences Students on Short-Term Study Abroad Experiences

    ERIC Educational Resources Information Center

    Conner, Nathan William

    2013-01-01

    The purpose of this study was to explore how undergraduate students in a college of agricultural and life sciences experienced cultural adaptation during short-term study abroad programs. The specific objectives of this study were to describe how undergraduate students in the college of agricultural and life sciences experienced culture throughout…

  15. Persistence and Adaptation in Immunity: T Cells Balance the Extent and Thoroughness of Search

    PubMed Central

    Fricke, G. Matthew; Letendre, Kenneth A.; Moses, Melanie E.; Cannon, Judy L.

    2016-01-01

    Effective search strategies have evolved in many biological systems, including the immune system. T cells are key effectors of the immune response, required for clearance of pathogenic infection. T cell activation requires that T cells encounter antigen-bearing dendritic cells within lymph nodes, thus, T cell search patterns within lymph nodes may be a crucial determinant of how quickly a T cell immune response can be initiated. Previous work suggests that T cell motion in the lymph node is similar to a Brownian random walk, however, no detailed analysis has definitively shown whether T cell movement is consistent with Brownian motion. Here, we provide a precise description of T cell motility in lymph nodes and a computational model that demonstrates how motility impacts T cell search efficiency. We find that both Brownian and Lévy walks fail to capture the complexity of T cell motion. Instead, T cell movement is better described as a correlated random walk with a heavy-tailed distribution of step lengths. Using computer simulations, we identify three distinct factors that contribute to increasing T cell search efficiency: 1) a lognormal distribution of step lengths, 2) motion that is directionally persistent over short time scales, and 3) heterogeneity in movement patterns. Furthermore, we show that T cells move differently in specific frequently visited locations that we call “hotspots” within lymph nodes, suggesting that T cells change their movement in response to the lymph node environment. Our results show that like foraging animals, T cells adapt to environmental cues, suggesting that adaption is a fundamental feature of biological search. PMID:26990103
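
    The study's fitted parameters and hotspot analysis are not reproduced here; the sketch below simply simulates the kind of correlated random walk with lognormal step lengths that the abstract describes. The turning-angle concentration and lognormal parameters are chosen arbitrarily for illustration.

        import numpy as np

        rng = np.random.default_rng(4)

        def correlated_random_walk(n_steps=1000, mu=0.0, sigma=0.8, kappa=4.0):
            """2-D correlated random walk with lognormal step lengths.

            Heading changes are drawn from a von Mises distribution (kappa controls
            directional persistence); step lengths are lognormal, giving the
            heavy-ish tail of step lengths described in the abstract.
            """
            heading = rng.uniform(0.0, 2.0 * np.pi)
            pos = np.zeros((n_steps + 1, 2))
            for t in range(n_steps):
                heading += rng.vonmises(0.0, kappa)          # persistent turning
                step = rng.lognormal(mean=mu, sigma=sigma)   # heavy-tailed step length
                pos[t + 1] = pos[t] + step * np.array([np.cos(heading), np.sin(heading)])
            return pos

        track = correlated_random_walk()
        print("net displacement:", np.linalg.norm(track[-1] - track[0]))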

  16. Fault recovery for real-time, multi-tasking computer system

    NASA Technical Reports Server (NTRS)

    Hess, Richard (Inventor); Kelly, Gerald B. (Inventor); Rogers, Randy (Inventor); Stange, Kent A. (Inventor)

    2011-01-01

    System and methods for providing a recoverable real time multi-tasking computer system are disclosed. In one embodiment, a system comprises a real time computing environment, wherein the real time computing environment is adapted to execute one or more applications and wherein each application is time and space partitioned. The system further comprises a fault detection system adapted to detect one or more faults affecting the real time computing environment and a fault recovery system, wherein upon the detection of a fault the fault recovery system is adapted to restore a backup set of state variables.

  17. How to Represent Adaptation in e-Learning with IMS Learning Design

    ERIC Educational Resources Information Center

    Burgos, Daniel; Tattersall, Colin; Koper, Rob

    2007-01-01

    Adaptation in e-learning has been an important research topic for the last few decades in computer-based education. In adaptivity the behaviour of the user triggers some actions in the system that guides the learning process. In adaptability, the user makes changes and takes decisions. Progressing from computer-based training and adaptive…

  18. Identifying Differential Item Functioning in Multi-Stage Computer Adaptive Testing

    ERIC Educational Resources Information Center

    Gierl, Mark J.; Lai, Hollis; Li, Johnson

    2013-01-01

    The purpose of this study is to evaluate the performance of CATSIB (Computer Adaptive Testing-Simultaneous Item Bias Test) for detecting differential item functioning (DIF) when items in the matching and studied subtest are administered adaptively in the context of a realistic multi-stage adaptive test (MST). MST was simulated using a 4-item…

  19. A Guide to Computational Tools and Design Strategies for Genome Editing Experiments in Zebrafish Using CRISPR/Cas9.

    PubMed

    Prykhozhij, Sergey V; Rajan, Vinothkumar; Berman, Jason N

    2016-02-01

    The development of clustered regularly interspaced short palindromic repeats (CRISPR)/Cas9 technology for mainstream biotechnological use based on its discovery as an adaptive immune mechanism in bacteria has dramatically improved the ability of molecular biologists to modify genomes of model organisms. The zebrafish is highly amenable to applications of CRISPR/Cas9 for mutation generation and a variety of DNA insertions. Cas9 protein in complex with a guide RNA molecule recognizes where to cut the homologous DNA based on a short stretch of DNA termed the protospacer-adjacent motif (PAM). Rapid and efficient identification of target sites immediately preceding PAM sites, quantification of genomic occurrences of similar (off target) sites and predictions of cutting efficiency are some of the features where computational tools play critical roles in CRISPR/Cas9 applications. Given the rapid advent and development of this technology, it can be a challenge for researchers to remain up to date with all of the important technological developments in this field. We have contributed to the armamentarium of CRISPR/Cas9 bioinformatics tools and trained other researchers in the use of appropriate computational programs to develop suitable experimental strategies. Here we provide an in-depth guide on how to use CRISPR/Cas9 and other relevant computational tools at each step of a host of genome editing experimental strategies. We also provide detailed conceptual outlines of the steps involved in the design and execution of CRISPR/Cas9-based experimental strategies, such as generation of frameshift mutations, larger chromosomal deletions and inversions, homology-independent insertion of gene cassettes and homology-based knock-in of defined point mutations and larger gene constructs.
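
    As a small illustration of one computation the guide discusses (locating candidate Cas9 target sites immediately upstream of an NGG protospacer-adjacent motif), the sketch below scans a toy DNA string for 20-nt protospacers followed by NGG on the forward strand. It is a simplified stand-in rather than any of the tools the authors describe, and it ignores the reverse strand and off-target scoring; the toy sequence is invented.

        import re

        def find_cas9_sites(seq, guide_len=20):
            """Return (start, protospacer, PAM) for forward-strand NGG sites."""
            seq = seq.upper()
            sites = []
            # Look for any 20-mer followed by an NGG PAM (N = any base); the
            # lookahead allows overlapping candidate sites to be reported.
            for m in re.finditer(r"(?=([ACGT]{%d})([ACGT]GG))" % guide_len, seq):
                sites.append((m.start(), m.group(1), m.group(2)))
            return sites

        toy = "ATGCGTACGTTAGCTAGCTAGCTAGGCTAGCTAGCTAACGGGTTACG"
        for start, protospacer, pam in find_cas9_sites(toy):
            print(start, protospacer, pam)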

  20. Carbachol-induced volume adaptation in mouse bladder and length adaptation via rhythmic contraction in rabbit detrusor.

    PubMed

    Speich, John E; Wilson, Cameron W; Almasri, Atheer M; Southern, Jordan B; Klausner, Adam P; Ratz, Paul H

    2012-10-01

    The length-tension (L-T) relationships in rabbit detrusor smooth muscle (DSM) are similar to those in vascular and airway smooth muscles and exhibit short-term length adaptation characterized by L-T curves that shift along the length axis as a function of activation and strain history. In contrast to skeletal muscle, the length-active tension (L-T(a)) curve for rabbit DSM strips does not have a unique peak tension value with a single ascending and descending limb. Instead, DSM can exhibit multiple ascending and descending limbs, and repeated KCl-induced contractions at a particular muscle length on an ascending or descending limb display increasingly greater tension. In the present study, mouse bladder strips with and without urothelium exhibited KCl-induced and carbachol-induced length adaptation, and the pressure-volume relationship in mouse whole bladder displayed short-term volume adaptation. Finally, prostaglandin-E(2)-induced low-level rhythmic contraction produced length adaptation in rabbit DSM strips. A likely role of length adaptation during bladder filling is to prepare DSM cells to contract efficiently over a broad range of volumes. Mammalian bladders exhibit spontaneous rhythmic contraction (SRC) during the filling phase and SRC is elevated in humans with overactive bladder (OAB). The present data identify a potential physiological role for SRC in bladder adaptation and motivate the investigation of a potential link between short-term volume adaptation and OAB with impaired contractility.

  1. An FEC Adaptive Multicast MAC Protocol for Providing Reliability in WLANs

    NASA Astrophysics Data System (ADS)

    Basalamah, Anas; Sato, Takuro

    For wireless multicast applications like multimedia conferencing, voice over IP and video/audio streaming, a reliable transmission of packets within a short delivery delay is needed. Moreover, reliability is crucial to the performance of error-intolerant applications like file transfer, distributed computing, chat and whiteboard sharing. Forward Error Correction (FEC) is frequently used in wireless multicast to enhance Packet Error Rate (PER) performance, but cannot assure full reliability unless coupled with Automatic Repeat Request, forming what is known as Hybrid-ARQ. While reliable FEC can be deployed at different levels of the protocol stack, it cannot be deployed on the MAC layer of the unreliable IEEE802.11 WLAN due to its inability to exchange ACKs with multiple recipients. In this paper, we propose a Multicast MAC protocol that enhances WLAN reliability by using Adaptive FEC and study its performance through mathematical analysis and simulation. Our results show that our protocol can deliver high reliability and throughput performance.
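
    The protocol details (frame formats, feedback mechanism) are in the paper; as a hedged sketch of the adaptive-FEC idea only, the snippet below picks the number of parity packets for a block code so that the probability of recovering a k-packet block exceeds a target, given an estimated packet error rate. The recovery model (any k of n packets suffice, independent losses) and the numbers are assumptions, not the authors' scheme.

        from math import comb

        def block_recovery_prob(n, k, per):
            """P(block recovered) when any k of n packets suffice and losses are i.i.d."""
            return sum(comb(n, r) * (1 - per) ** r * per ** (n - r) for r in range(k, n + 1))

        def choose_parity(k, per, target=0.999, max_parity=32):
            """Smallest number of parity packets meeting the reliability target."""
            for parity in range(max_parity + 1):
                if block_recovery_prob(k + parity, k, per) >= target:
                    return parity
            return max_parity

        # Example: 16 data packets, estimated PER of 10 %.
        print(choose_parity(k=16, per=0.10))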

  2. A novel scene-based non-uniformity correction method for SWIR push-broom hyperspectral sensors

    NASA Astrophysics Data System (ADS)

    Hu, Bin-Lin; Hao, Shi-Jing; Sun, De-Xin; Liu, Yin-Nian

    2017-09-01

    A novel scene-based non-uniformity correction (NUC) method for short-wavelength infrared (SWIR) push-broom hyperspectral sensors is proposed and evaluated. This method relies on the assumption that for each band there will be ground objects with similar reflectance to form uniform regions when a sufficient number of scanning lines are acquired. The uniform regions are extracted automatically through a sorting algorithm, and are used to compute the corresponding NUC coefficients. SWIR hyperspectral data from airborne experiment are used to verify and evaluate the proposed method, and results show that stripes in the scenes have been well corrected without any significant information loss, and the non-uniformity is less than 0.5%. In addition, the proposed method is compared to two other regular methods, and they are evaluated based on their adaptability to the various scenes, non-uniformity, roughness and spectral fidelity. It turns out that the proposed method shows strong adaptability, high accuracy and efficiency.
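
    The exact sorting algorithm is not spelled out in the abstract; the sketch below shows one common scene-based take on the idea for push-broom data: per-detector samples are sorted along the scan direction, and a gain for each detector is estimated by matching a robust level of its sorted profile to the band average. This is an illustrative simplification under those assumptions, not the authors' method.

        import numpy as np

        def sorting_based_nuc(band, trim=0.1):
            """Estimate per-detector gains for one band of push-broom data.

            band: (n_lines, n_detectors) array. Sorting each detector's samples
            along the scan direction lets similar ground reflectances line up
            across detectors; a trimmed mean of the sorted profile gives a robust
            level whose ratio to the band-wide level yields a multiplicative NUC
            coefficient.
            """
            sorted_cols = np.sort(band, axis=0)
            lo, hi = int(trim * band.shape[0]), int((1 - trim) * band.shape[0])
            level = sorted_cols[lo:hi].mean(axis=0)            # per-detector level
            gain = level.mean() / level                        # normalise to band average
            return band * gain[None, :], gain

        rng = np.random.default_rng(5)
        scene = rng.uniform(0.2, 0.8, size=(2000, 128))
        striped = scene * rng.normal(1.0, 0.05, size=128)[None, :]   # detector gain stripes
        corrected, gains = sorting_based_nuc(striped)
        print(np.std(corrected.mean(axis=0)))   # residual stripe level after correction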

  3. Probabilistic resource allocation system with self-adaptive capability

    NASA Technical Reports Server (NTRS)

    Yufik, Yan M. (Inventor)

    1996-01-01

    A probabilistic resource allocation system is disclosed containing a low capacity computational module (Short Term Memory or STM) and a self-organizing associative network (Long Term Memory or LTM) where nodes represent elementary resources, terminal end nodes represent goals, and directed links represent the order of resource association in different allocation episodes. Goals and their priorities are indicated by the user, and allocation decisions are made in the STM, while candidate associations of resources are supplied by the LTM based on the association strength (reliability). Reliability values are automatically assigned to the network links based on the frequency and relative success of exercising those links in the previous allocation decisions. Accumulation of allocation history in the form of an associative network in the LTM reduces computational demands on subsequent allocations. For this purpose, the network automatically partitions itself into strongly associated high reliability packets, allowing fast approximate computation and display of allocation solutions satisfying the overall reliability and other user-imposed constraints. System performance improves in time due to modification of network parameters and partitioning criteria based on the performance feedback.

  4. Probabilistic resource allocation system with self-adaptive capability

    NASA Technical Reports Server (NTRS)

    Yufik, Yan M. (Inventor)

    1998-01-01

    A probabilistic resource allocation system is disclosed containing a low capacity computational module (Short Term Memory or STM) and a self-organizing associative network (Long Term Memory or LTM) where nodes represent elementary resources, terminal end nodes represent goals, and weighted links represent the order of resource association in different allocation episodes. Goals and their priorities are indicated by the user, and allocation decisions are made in the STM, while candidate associations of resources are supplied by the LTM based on the association strength (reliability). Weights are automatically assigned to the network links based on the frequency and relative success of exercising those links in the previous allocation decisions. Accumulation of allocation history in the form of an associative network in the LTM reduces computational demands on subsequent allocations. For this purpose, the network automatically partitions itself into strongly associated high reliability packets, allowing fast approximate computation and display of allocation solutions satisfying the overall reliability and other user-imposed constraints. System performance improves in time due to modification of network parameters and partitioning criteria based on the performance feedback.

  5. STAR adaptation of QR algorithm. [program for solving over-determined systems of linear equations

    NASA Technical Reports Server (NTRS)

    Shah, S. N.

    1981-01-01

    The QR algorithm used on a serial computer and executed on the Control Data Corporation 6000 Computer was adapted to execute efficiently on the Control Data STAR-100 computer. How the scalar program was adapted for the STAR-100 and why these adaptations yielded an efficient STAR program is described. Program listings of the old scalar version and the vectorized SL/1 version are presented in the appendices. Execution times for the two versions, applied to the same system of linear equations, are compared.
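
    For readers unfamiliar with the underlying numerical task, the short sketch below solves an over-determined linear system by QR factorization in NumPy; it illustrates the mathematics only, not the STAR-100 vectorized SL/1 implementation the report describes, and the test data are synthetic.

        import numpy as np

        # Over-determined system A x ~ b (more equations than unknowns).
        rng = np.random.default_rng(6)
        A = rng.normal(size=(50, 3))
        x_true = np.array([1.0, -2.0, 0.5])
        b = A @ x_true + 0.01 * rng.normal(size=50)

        # QR least squares: factor A = Q R, then solve the triangular system R x = Q^T b.
        Q, R = np.linalg.qr(A)
        x = np.linalg.solve(R, Q.T @ b)
        print(x)          # close to x_true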

  6. Multithreaded Model for Dynamic Load Balancing Parallel Adaptive PDE Computations

    NASA Technical Reports Server (NTRS)

    Chrisochoides, Nikos

    1995-01-01

    We present a multithreaded model for the dynamic load-balancing of numerical, adaptive computations required for the solution of Partial Differential Equations (PDE's) on multiprocessors. Multithreading is used as a means of exploring concurrency at the processor level in order to tolerate synchronization costs inherent to traditional (non-threaded) parallel adaptive PDE solvers. Our preliminary analysis for parallel, adaptive PDE solvers indicates that multithreading can be used as a mechanism to mask overheads required for the dynamic balancing of processor workloads with computations required for the actual numerical solution of the PDE's. Also, multithreading can simplify the implementation of dynamic load-balancing algorithms, a task that is very difficult for traditional data parallel adaptive PDE computations. Unfortunately, multithreading does not always reduce program complexity, often makes code reuse difficult, and increases software complexity.

  7. Water System Adaptation To Hydrological Changes: Module 11, Methods and Tools: Computational Models

    EPA Science Inventory

    This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...

  8. Management of Computer-Based Instruction: Design of an Adaptive Control Strategy.

    ERIC Educational Resources Information Center

    Tennyson, Robert D.; Rothen, Wolfgang

    1979-01-01

    Theoretical and research literature on learner, program, and adaptive control as forms of instructional management are critiqued in reference to the design of computer-based instruction. An adaptive control strategy using an online, iterative algorithmic model is proposed. (RAO)

  9. An evaluation of computerized adaptive testing for general psychological distress: combining GHQ-12 and Affectometer-2 in an item bank for public mental health research.

    PubMed

    Stochl, Jan; Böhnke, Jan R; Pickett, Kate E; Croudace, Tim J

    2016-05-20

    Recent developments in psychometric modeling and technology allow pooling well-validated items from existing instruments into larger item banks and their deployment through methods of computerized adaptive testing (CAT). Use of item response theory-based bifactor methods and integrative data analysis overcomes barriers in cross-instrument comparison. This paper presents the joint calibration of an item bank for researchers keen to investigate population variations in general psychological distress (GPD). Multidimensional item response theory was used on existing health survey data from the Scottish Health Education Population Survey (n = 766) to calibrate an item bank consisting of pooled items from the short common mental disorder screen (GHQ-12) and the Affectometer-2 (a measure of "general happiness"). Computer simulation was used to evaluate usefulness and efficacy of its adaptive administration. A bifactor model capturing variation across a continuum of population distress (while controlling for artefacts due to item wording) was supported. The numbers of items for different required reliabilities in adaptive administration demonstrated promising efficacy of the proposed item bank. Psychometric modeling of the common dimension captured by more than one instrument offers the potential of adaptive testing for GPD using individually sequenced combinations of existing survey items. The potential for linking other item sets with alternative candidate measures of positive mental health is discussed since an optimal item bank may require even more items than these.
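
    The paper's item bank and bifactor calibration are not reproduced here; as a schematic of how a computerized adaptive test sequences items, the sketch below runs a toy CAT under a 2-parameter logistic model, picking at each step the unused item with maximum Fisher information at the current ability estimate and updating the estimate by a coarse grid-based maximum likelihood. All item parameters and the simulated examinee are assumptions, not the GHQ-12/Affectometer-2 items.

        import numpy as np

        rng = np.random.default_rng(7)
        theta_grid = np.linspace(-4, 4, 161)

        # Simulated 2PL item bank: discrimination a, difficulty b.
        a = rng.uniform(0.8, 2.0, size=50)
        b = rng.normal(0.0, 1.0, size=50)

        def p_correct(theta, i):
            return 1.0 / (1.0 + np.exp(-a[i] * (theta - b[i])))

        def fisher_info(theta, i):
            p = p_correct(theta, i)
            return a[i] ** 2 * p * (1.0 - p)

        def run_cat(true_theta=0.7, n_items=15):
            asked, responses, theta_hat = [], [], 0.0
            for _ in range(n_items):
                info = [(-1 if i in asked else fisher_info(theta_hat, i)) for i in range(50)]
                item = int(np.argmax(info))                         # most informative unused item
                resp = rng.random() < p_correct(true_theta, item)   # simulated response
                asked.append(item)
                responses.append(resp)
                # Grid-based likelihood update of the ability estimate.
                loglik = np.zeros_like(theta_grid)
                for it, r in zip(asked, responses):
                    p = p_correct(theta_grid, it)
                    loglik += np.log(p if r else 1.0 - p)
                theta_hat = theta_grid[int(np.argmax(loglik))]
            return theta_hat

        print(run_cat())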

  10. Statistical learning methods for aero-optic wavefront prediction and adaptive-optic latency compensation

    NASA Astrophysics Data System (ADS)

    Burns, W. Robert

    Since the early 1970's research in airborne laser systems has been the subject of continued interest. Airborne laser applications depend on being able to propagate a near diffraction-limited laser beam from an airborne platform. Turbulent air flowing over the aircraft produces density fluctuations through which the beam must propagate. Because the index of refraction of the air is directly related to the density, the turbulent flow imposes aberrations on the beam passing through it. This problem is referred to as Aero-Optics. Aero-Optics is recognized as a major technical issue that needs to be solved before airborne optical systems can become routinely fielded. This dissertation research specifically addresses an approach to mitigating the deleterious effects imposed on an airborne optical system by aero-optics. A promising technology is adaptive optics: a feedback control method that measures optical aberrations and imprints the conjugate aberrations onto an outgoing beam. The challenge is that it is a computationally-difficult problem, since aero-optic disturbances are on the order of kilohertz for practical applications. High control loop frequencies and high disturbance frequencies mean that adaptive-optic systems are sensitive to latency in sensors, mirrors, amplifiers, and computation. These latencies build up to result in a dramatic reduction in the system's effective bandwidth. This work presents two variations of an algorithm that uses model reduction and data-driven predictors to estimate the evolution of measured wavefronts over a short temporal horizon and thus compensate for feedback latency. The efficacy of the two methods are compared in this research, and evaluated against similar algorithms that have been previously developed. The best version achieved over 75% disturbance rejection in simulation in the most optically active flow region in the wake of a turret, considerably outperforming conventional approaches. The algorithm is shown to be insensitive to changes in flow condition, and stable in the presence of small latency uncertainty. Consideration is given to practical implementation of the algorithms as well as computational requirement scaling.
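
    The dissertation's specific predictors are not detailed in the abstract; the sketch below illustrates the generic recipe it alludes to: reduce a sequence of wavefronts to a few proper-orthogonal-decomposition (POD) coefficients via SVD, fit a linear autoregressive predictor on those coefficients, and predict one step ahead to offset latency. The synthetic wavefront data and the model orders are assumptions.

        import numpy as np

        rng = np.random.default_rng(8)

        # Synthetic "wavefront" snapshots: a few travelling modes plus noise.
        t = np.arange(400)
        x = np.linspace(0, 1, 64)
        snapshots = (np.sin(2*np.pi*(x[None, :] - 0.01*t[:, None]))
                     + 0.5*np.sin(4*np.pi*(x[None, :] - 0.03*t[:, None]))
                     + 0.05*rng.normal(size=(400, 64)))

        # POD via SVD: keep the leading spatial modes, work with their time coefficients.
        U, S, Vt = np.linalg.svd(snapshots - snapshots.mean(0), full_matrices=False)
        n_modes, order = 4, 5
        coeffs = U[:, :n_modes] * S[:n_modes]                   # (time, modes)

        # Fit a linear one-step-ahead predictor on stacked past coefficients.
        X = np.hstack([coeffs[i:len(coeffs) - order + i] for i in range(order)])
        Y = coeffs[order:]
        W, *_ = np.linalg.lstsq(X, Y, rcond=None)

        pred = X[-1] @ W                                        # predicted next coefficients
        wavefront_pred = snapshots.mean(0) + pred @ Vt[:n_modes]
        print(np.linalg.norm(wavefront_pred - snapshots[-1]))   # crude check against last frame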

  11. A model of microsaccade-related neural responses induced by short-term depression in thalamocortical synapses

    PubMed Central

    Yuan, Wu-Jie; Dimigen, Olaf; Sommer, Werner; Zhou, Changsong

    2013-01-01

    Microsaccades during fixation have been suggested to counteract visual fading. Recent experiments have also observed microsaccade-related neural responses from cellular record, scalp electroencephalogram (EEG), and functional magnetic resonance imaging (fMRI). The underlying mechanism, however, is not yet understood and highly debated. It has been proposed that the neural activity of primary visual cortex (V1) is a crucial component for counteracting visual adaptation. In this paper, we use computational modeling to investigate how short-term depression (STD) in thalamocortical synapses might affect the neural responses of V1 in the presence of microsaccades. Our model not only gives a possible synaptic explanation for microsaccades in counteracting visual fading, but also reproduces several features in experimental findings. These modeling results suggest that STD in thalamocortical synapses plays an important role in microsaccade-related neural responses and the model may be useful for further investigation of behavioral properties and functional roles of microsaccades. PMID:23630494
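
    The full network model is in the paper; the snippet below only illustrates the standard short-term depression dynamics for a single synapse (a Tsodyks-Markram-style resource variable depleted by each presynaptic spike and recovering exponentially), which is the ingredient the abstract highlights. Parameter values and spike times are arbitrary.

        import numpy as np

        def std_synapse(spike_times, tau_rec=0.5, U=0.4, dt=1e-3, t_max=2.0):
            """Short-term depression: available resource x in [0, 1].

            Between spikes x recovers toward 1 with time constant tau_rec; each
            presynaptic spike uses a fraction U of the available resource, and the
            effective synaptic drive of that spike is proportional to U * x.
            """
            times = np.arange(0.0, t_max, dt)
            x, x_trace, drive = 1.0, [], []
            spike_set = set(np.round(np.asarray(spike_times) / dt).astype(int))
            for k, _ in enumerate(times):
                x += dt * (1.0 - x) / tau_rec            # recovery
                if k in spike_set:
                    drive.append(U * x)                   # transmitted efficacy
                    x -= U * x                            # depletion
                x_trace.append(x)
            return np.array(x_trace), np.array(drive)

        # A burst of presynaptic spikes at 50 Hz shows progressively weaker drive.
        trace, drive = std_synapse(spike_times=np.arange(0.1, 0.5, 0.02))
        print(drive[:5])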

  12. Adaptive CFD schemes for aerospace propulsion

    NASA Astrophysics Data System (ADS)

    Ferrero, A.; Larocca, F.

    2017-05-01

    The flow fields which can be observed inside several components of aerospace propulsion systems are characterised by the presence of very localised phenomena (boundary layers, shock waves,...) which can deeply influence the performances of the system. In order to accurately evaluate these effects by means of Computational Fluid Dynamics (CFD) simulations, it is necessary to locally refine the computational mesh. In this way the degrees of freedom related to the discretisation are focused in the most interesting regions and the computational cost of the simulation remains acceptable. In the present work, a discontinuous Galerkin (DG) discretisation is used to numerically solve the equations which describe the flow field. The local nature of the DG reconstruction makes it possible to efficiently exploit several adaptive schemes in which the size of the elements (h-adaptivity) and the order of reconstruction (p-adaptivity) are locally changed. After a review of the main adaptation criteria, some examples related to compressible flows in turbomachinery are presented. An hybrid hp-adaptive algorithm is also proposed and compared with a standard h-adaptive scheme in terms of computational efficiency.

  13. Measuring psychological trauma after spinal cord injury: Development and psychometric characteristics of the SCI-QOL Psychological Trauma item bank and short form

    PubMed Central

    Kisala, Pamela A.; Victorson, David; Pace, Natalie; Heinemann, Allen W.; Choi, Seung W.; Tulsky, David S.

    2015-01-01

    Objective: To describe the development and psychometric properties of the SCI-QOL Psychological Trauma item bank and short form. Design: Using a mixed-methods design, we developed and tested a Psychological Trauma item bank with patient and provider focus groups, cognitive interviews, and item response theory based analytic approaches, including tests of model fit, differential item functioning (DIF) and precision. Setting: We tested a 31-item pool at several medical institutions across the United States, including the University of Michigan, Kessler Foundation, Rehabilitation Institute of Chicago, the University of Washington, Craig Hospital and the James J. Peters/Bronx Veterans Administration hospital. Participants: A total of 716 individuals with SCI completed the trauma items. Results: The 31 items fit a unidimensional model (CFI=0.952; RMSEA=0.061) and demonstrated good precision (theta range between 0.6 and 2.5). Nine items demonstrated negligible DIF with little impact on score estimates. The final calibrated item bank contains 19 items. Conclusion: The SCI-QOL Psychological Trauma item bank is a psychometrically robust measurement tool from which a short form and a computer adaptive test (CAT) version are available. PMID:26010967

  14. Scalable improvement of SPME multipolar electrostatics in anisotropic polarizable molecular mechanics using a general short-range penetration correction up to quadrupoles.

    PubMed

    Narth, Christophe; Lagardère, Louis; Polack, Étienne; Gresh, Nohad; Wang, Qiantao; Bell, David R; Rackers, Joshua A; Ponder, Jay W; Ren, Pengyu Y; Piquemal, Jean-Philip

    2016-02-15

    We propose a general coupling of the Smooth Particle Mesh Ewald (SPME) approach for distributed multipoles to a short-range charge penetration correction modifying the charge-charge, charge-dipole and charge-quadrupole energies. Such an approach significantly improves electrostatics when compared to ab initio values and has been calibrated on Symmetry-Adapted Perturbation Theory reference data. Various neutral molecular dimers have been tested and results on the complexes of mono- and divalent cations with a water ligand are also provided. Transferability of the correction is addressed in the context of the implementation of the AMOEBA and SIBFA polarizable force fields in the TINKER-HP software. As the choices of the multipolar distribution are discussed, conclusions are drawn for future penetration-corrected polarizable force fields, highlighting the need for non-spurious procedures to obtain well-balanced and physically meaningful distributed moments. Finally, scalability and parallelism of the short-range corrected SPME approach are addressed, demonstrating that the damping function is computationally affordable and accurate for molecular dynamics simulations of complex bio- or bioinorganic systems in periodic boundary conditions. Copyright © 2016 Wiley Periodicals, Inc.

  15. ICCE/ICCAI 2000 Full & Short Papers (Computer-Assisted Language Learning).

    ERIC Educational Resources Information Center

    2000

    This document contains the following full and short papers on computer-assisted language learning (CALL) from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction): (1) "A Computer-Assisted English Abstract Words Learning Environment on the Web" (Wenli Tsou and…

  16. Letters: Noise Equalization for Ultrafast Plane Wave Microvessel Imaging

    PubMed Central

    Song, Pengfei; Manduca, Armando; Trzasko, Joshua D.

    2017-01-01

    Ultrafast plane wave microvessel imaging significantly improves ultrasound Doppler sensitivity by increasing the number of Doppler ensembles that can be collected within a short period of time. The rich spatiotemporal plane wave data also enables more robust clutter filtering based on singular value decomposition (SVD). However, due to the lack of transmit focusing, plane wave microvessel imaging is very susceptible to noise. This study was designed to: 1) study the relationship between ultrasound system noise (primarily time gain compensation-induced) and microvessel blood flow signal; 2) propose an adaptive and computationally cost-effective noise equalization method that is independent of hardware or software imaging settings to improve microvessel image quality. PMID:28880169

  17. RASCAL: A Rudimentary Adaptive System for Computer-Aided Learning.

    ERIC Educational Resources Information Center

    Stewart, John Christopher

    Both the background of computer-assisted instruction (CAI) systems in general and the requirements of a computer-aided learning system which would be a reasonable assistant to a teacher are discussed. RASCAL (Rudimentary Adaptive System for Computer-Aided Learning) is a first attempt at defining a CAI system which would individualize the learning…

  18. Discriminating Children with Autism from Children with Learning Difficulties with an Adaptation of the Short Sensory Profile

    ERIC Educational Resources Information Center

    O'Brien, Justin; Tsermentseli, Stella; Cummins, Omar; Happe, Francesca; Heaton, Pamela; Spencer, Janine

    2009-01-01

    In this article, we examine the extent to which children with autism and children with learning difficulties can be discriminated from their responses to different patterns of sensory stimuli. Using an adapted version of the Short Sensory Profile (SSP), sensory processing was compared in 34 children with autism to 33 children with typical…

  19. [Linguistic adaptation of the Russian version of the Short-form McGill Pain Questionnaire-2].

    PubMed

    Bakhtadze, M A; Bolotov, D A; Kuzminov, K O; Padun, M P; Zakharova, O B

    The aim was linguistic adaptation of the Russian version of the Short-form McGill Pain Questionnaire-2 (SF-MPQ-2) that is conceptually equivalent to the original questionnaire. The adaptation was performed in accordance with established rules in several stages: translation by two independent translators, development of a consensus Russian version, back translation of that version by two independent translators, and development of a consensus English version. The final Russian SF-MPQ-2 version was then created. The Russian version of the Short-form McGill Pain Questionnaire-2 (SF-MPQ-2-RU) was generated based on the established rules. This version was legally registered by the right holder, Mapi Research Trust, and recommended for research in the Russian Federation.

  20. Spatiotemporal variation in local adaptation of a specialist insect herbivore to its long-lived host plant.

    PubMed

    Kalske, Aino; Leimu, Roosa; Scheepens, J F; Mutikainen, Pia

    2016-09-01

    Local adaptation of interacting species to one another indicates geographically variable reciprocal selection. This process of adaptation is central in the organization and maintenance of genetic variation across populations. Given that the strength of selection and responses to it often vary in time and space, the strength of local adaptation should in theory vary between generations and among populations. However, such spatiotemporal variation has rarely been explicitly demonstrated in nature and local adaptation is commonly considered to be relatively static. We report persistent local adaptation of the short-lived herbivore Abrostola asclepiadis to its long-lived host plant Vincetoxicum hirundinaria over three successive generations in two studied populations and considerable temporal variation in local adaptation in six populations supporting the geographic mosaic theory. The observed variation in local adaptation among populations was best explained by geographic distance and population isolation, suggesting that gene flow reduces local adaptation. Changes in herbivore population size did not conclusively explain temporal variation in local adaptation. Our results also imply that short-term studies are likely to capture only a part of the existing variation in local adaptation. © 2016 The Author(s). Evolution © 2016 The Society for the Study of Evolution.

  1. Simple and Effective Algorithms: Computer-Adaptive Testing.

    ERIC Educational Resources Information Center

    Linacre, John Michael

    Computer-adaptive testing (CAT) allows improved security, greater scoring accuracy, shorter testing periods, quicker availability of results, and reduced guessing and other undesirable test behavior. Simple approaches can be applied by the classroom teacher, or other content specialist, who possesses simple computer equipment and elementary…

  2. Immediate and Short-Term Effects of the 5th Grade Version of the "keepin' it REAL" Substance Use Prevention Intervention

    ERIC Educational Resources Information Center

    Hecht, Michael L.; Elek, Elvira; Wagstaff, David A.; Kam, Jennifer A.; Marsiglia, Flavio; Dustman, Patricia; Reeves, Leslie; Harthun, Mary

    2009-01-01

    This study assessed the immediate and short-term outcomes of adapting a culturally-grounded middle school program, "keepin' it REAL", for elementary school students. After curriculum adaptation, 10 schools were randomly assigned to the intervention in 5th grade with follow-up boosters in 6th grade; 13 schools were randomly assigned to the control…

  3. Short Stories via Computers in EFL Classrooms: An Empirical Study for Reading and Writing Skills

    ERIC Educational Resources Information Center

    Yilmaz, Adnan

    2015-01-01

    The present empirical study scrutinizes the use of short stories via computer technologies in teaching and learning English language. The objective of the study is two-fold: to examine how short stories could be used through computer programs in teaching and learning English and to collect data about students' perceptions of this technique via…

  4. Development of the Computer-Adaptive Version of the Late-Life Function and Disability Instrument

    PubMed Central

    Tian, Feng; Kopits, Ilona M.; Moed, Richard; Pardasaney, Poonam K.; Jette, Alan M.

    2012-01-01

    Background. Having psychometrically strong disability measures that minimize response burden is important in assessing older adults. Methods. Using the original 48 items from the Late-Life Function and Disability Instrument and newly developed items, a 158-item Activity Limitation and a 62-item Participation Restriction item pool were developed. The item pools were administered to a convenience sample of 520 community-dwelling adults 60 years or older. Confirmatory factor analysis and item response theory were employed to identify content structure, calibrate items, and build the computer-adaptive tests (CATs). We evaluated real-data simulations of 10-item CAT subscales. We collected data from 102 older adults to validate the 10-item CATs against the Veteran’s Short Form-36 and assessed test–retest reliability in a subsample of 57 subjects. Results. Confirmatory factor analysis revealed a bifactor structure, and multi-dimensional item response theory was used to calibrate an overall Activity Limitation Scale (141 items) and an overall Participation Restriction Scale (55 items). Fit statistics were acceptable (Activity Limitation: comparative fit index = 0.95, Tucker Lewis Index = 0.95, root mean square error of approximation = 0.03; Participation Restriction: comparative fit index = 0.95, Tucker Lewis Index = 0.95, root mean square error of approximation = 0.05). Correlations of the 10-item CATs with the full item banks were substantial (Activity Limitation: r = .90; Participation Restriction: r = .95). Test–retest reliability estimates were high (Activity Limitation: r = .85; Participation Restriction: r = .80). Strength and pattern of correlations with Veteran’s Short Form-36 subscales were as hypothesized. Each CAT, on average, took 3.56 minutes to administer. Conclusions. The Late-Life Function and Disability Instrument CATs demonstrated strong reliability, validity, accuracy, and precision. The Late-Life Function and Disability Instrument CAT can achieve psychometrically sound disability assessment in older persons while reducing respondent burden. Further research is needed to assess their ability to measure change in older adults. PMID:22546960
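
    The core of a CAT of this kind is a loop that selects the most informative remaining item at the current ability estimate and then re-estimates ability. A minimal sketch under a two-parameter logistic (2PL) IRT model is shown below; the item bank, parameter values, and grid-search estimator are invented for illustration and are not the calibrated Late-Life Function and Disability Instrument items.

```python
import numpy as np

def p_correct(theta, a, b):
    """2PL item response function."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):
    p = p_correct(theta, a, b)
    return a**2 * p * (1.0 - p)

def run_cat(bank_a, bank_b, answer_fn, n_items=10):
    """Minimal CAT loop: pick the max-information item, then update theta by grid MLE."""
    theta, asked, responses = 0.0, [], []
    grid = np.linspace(-4, 4, 161)
    for _ in range(n_items):
        info = item_information(theta, bank_a, bank_b)
        info[asked] = -np.inf                     # never reuse an item
        j = int(np.argmax(info))
        asked.append(j)
        responses.append(answer_fn(j))
        loglik = np.zeros_like(grid)              # grid-search ML estimate of theta
        for jj, u in zip(asked, responses):
            p = p_correct(grid, bank_a[jj], bank_b[jj])
            loglik += np.where(u, np.log(p), np.log(1 - p))
        theta = float(grid[np.argmax(loglik)])
    return theta, asked

# Demo with a synthetic 100-item bank and a simulated respondent at theta = 1.2
rng = np.random.default_rng(1)
a = rng.uniform(0.8, 2.0, 100)
b = rng.uniform(-2.5, 2.5, 100)
true_theta = 1.2
answer = lambda j: rng.random() < p_correct(true_theta, a[j], b[j])
print(run_cat(a, b, answer))
```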

  5. Comparing Computer-Adaptive and Curriculum-Based Measurement Methods of Assessment

    ERIC Educational Resources Information Center

    Shapiro, Edward S.; Gebhardt, Sarah N.

    2012-01-01

    This article reported the concurrent, predictive, and diagnostic accuracy of a computer-adaptive test (CAT) and curriculum-based measurements (CBM; both computation and concepts/application measures) for universal screening in mathematics among students in first through fourth grade. Correlational analyses indicated moderate to strong…

  6. BUDS Candidate Success Through RTC: First Watch Results

    DTIC Science & Technology

    2007-01-01

    NCAPS (Navy Computer Adaptive Personality Scales) assesses traits including achievement, stress tolerance, self-reliance, and leadership. Unlike instruments developed for the general military population, NCAPS was developed specifically to predict success across all Navy…

  7. Multistage Adaptive Testing for a Large-Scale Classification Test: Design, Heuristic Assembly, and Comparison with Other Testing Modes. ACT Research Report Series, 2012 (6)

    ERIC Educational Resources Information Center

    Zheng, Yi; Nozawa, Yuki; Gao, Xiaohong; Chang, Hua-Hua

    2012-01-01

    Multistage adaptive tests (MSTs) have gained increasing popularity in recent years. MST is a balanced compromise between linear test forms (i.e., paper-and-pencil testing and computer-based testing) and traditional item-level computer-adaptive testing (CAT). It combines the advantages of both. On one hand, MST is adaptive (and therefore more…

  8. Procedure for Adapting Direct Simulation Monte Carlo Meshes

    NASA Technical Reports Server (NTRS)

    Woronowicz, Michael S.; Wilmoth, Richard G.; Carlson, Ann B.; Rault, Didier F. G.

    1992-01-01

    A technique is presented for adapting computational meshes used in the G2 version of the direct simulation Monte Carlo method. The physical ideas underlying the technique are discussed, and adaptation formulas are developed for use on solutions generated from an initial mesh. The effect of statistical scatter on adaptation is addressed, and results demonstrate the ability of this technique to achieve more accurate results without increasing necessary computational resources.

  9. PROMIS Physical Function Computer Adaptive Test Compared With Other Upper Extremity Outcome Measures in the Evaluation of Proximal Humerus Fractures in Patients Older Than 60 Years.

    PubMed

    Morgan, Jordan H; Kallen, Michael A; Okike, Kanu; Lee, Olivia C; Vrahas, Mark S

    2015-06-01

    To compare the PROMIS Physical Function Computer Adaptive Test (PROMIS PF CAT) to commonly used traditional PF measures for the evaluation of patients with proximal humerus fractures. Prospective. Two Level I trauma centers. Forty-seven patients older than 60 years with displaced proximal humerus fractures treated between 2006 and 2009. Evaluation included completion of the PROMIS PF CAT, the Constant Shoulder Score, the Disabilities of the Arm, Shoulder, and Hand (DASH) and the Short Musculoskeletal Functional Assessment (SMFA). Observed correlations among the administered PF outcome measures. On average, patients responded to 86 outcome-related items for this study: 4 for the PROMIS PF CAT (range: 4-8 items), 6 for the Constant Shoulder Score, 30 for the DASH, and 46 for the SMFA. Time to complete the PROMIS PF CAT (median completion time = 98 seconds) was significantly less than that for the DASH (median completion time = 336 seconds, P < 0.001) and for the SMFA (median completion time = 482 seconds, P < 0.001). PROMIS PF CAT scores showed statistically significant, moderate-to-high correlations with all other PF outcome measure scores administered. This study suggests that using the PROMIS PF CAT as the sole PF outcome measure can yield an assessment of upper extremity function similar to those provided by traditional PF measures, while substantially reducing patient assessment time.

  10. Mechanisms of Regulation of Olfactory Transduction and Adaptation in the Olfactory Cilium

    PubMed Central

    Antunes, Gabriela; Sebastião, Ana Maria; Simoes de Souza, Fabio Marques

    2014-01-01

    Olfactory adaptation is a fundamental process for the functioning of the olfactory system, but the underlying mechanisms regulating its occurrence in intact olfactory sensory neurons (OSNs) are not fully understood. In this work, we have combined stochastic computational modeling and a systematic pharmacological study of different signaling pathways to investigate their impact during short-term adaptation (STA). We used odorant stimulation and electroolfactogram (EOG) recordings of the olfactory epithelium treated with pharmacological blockers to study the molecular mechanisms regulating the occurrence of adaptation in OSNs. EOG responses to paired-pulses of odorants showed that inhibition of phosphodiesterases (PDEs) and phosphatases enhanced the levels of STA in the olfactory epithelium, and this effect was mimicked by blocking vesicle exocytosis and reduced by blocking cyclic adenosine monophosphate (cAMP)-dependent protein kinase (PKA) and vesicle endocytosis. These results suggest that G-coupled receptors (GPCRs) cycling is involved with the occurrence of STA. To gain insights on the dynamical aspects of this process, we developed a stochastic computational model. The model consists of the olfactory transduction currents mediated by the cyclic nucleotide gated (CNG) channels and calcium ion (Ca2+)-activated chloride (CAC) channels, and the dynamics of their respective ligands, cAMP and Ca2+, and it simulates the EOG results obtained under different experimental conditions through changes in the amplitude and duration of cAMP and Ca2+ response, two second messengers implicated with STA occurrence. The model reproduced the experimental data for each pharmacological treatment and provided a mechanistic explanation for the action of GPCR cycling in the levels of second messengers modulating the levels of STA. All together, these experimental and theoretical results indicate the existence of a mechanism of regulation of STA by signaling pathways that control GPCR cycling and tune the levels of second messengers in OSNs, and not only by CNG channel desensitization as previously thought. PMID:25144232

  11. Distributed Problem Solving: Adaptive Networks with a Computer Intermediary Resource. Intelligent Executive Computer Communication

    DTIC Science & Technology

    1991-06-01

    Interim report: Distributed Problem Solving: Adaptive Networks with a Computer Intermediary Resource: Intelligent Executive Computer Communication, by John Lyman and Carla J. Conaway, University of California at Los Angeles. Related work appears in Proceedings of the National Conference on Artificial Intelligence, pages 181-184, American Association for Artificial Intelligence, Pittsburgh.

  12. Synaptic plasticity, neural circuits, and the emerging role of altered short-term information processing in schizophrenia

    PubMed Central

    Crabtree, Gregg W.; Gogos, Joseph A.

    2014-01-01

    Synaptic plasticity alters the strength of information flow between presynaptic and postsynaptic neurons and thus modifies the likelihood that action potentials in a presynaptic neuron will lead to an action potential in a postsynaptic neuron. As such, synaptic plasticity and pathological changes in synaptic plasticity impact the synaptic computation which controls the information flow through the neural microcircuits responsible for the complex information processing necessary to drive adaptive behaviors. As current theories of neuropsychiatric disease suggest that distinct dysfunctions in neural circuit performance may critically underlie the unique symptoms of these diseases, pathological alterations in synaptic plasticity mechanisms may be fundamental to the disease process. Here we consider mechanisms of both short-term and long-term plasticity of synaptic transmission and their possible roles in information processing by neural microcircuits in both health and disease. As paradigms of neuropsychiatric diseases with strongly implicated risk genes, we discuss the findings in schizophrenia and autism and consider the alterations in synaptic plasticity and network function observed in both human studies and genetic mouse models of these diseases. Together these studies have begun to point toward a likely dominant role of short-term synaptic plasticity alterations in schizophrenia while dysfunction in autism spectrum disorders (ASDs) may be due to a combination of both short-term and long-term synaptic plasticity alterations. PMID:25505409

  13. Haloarcula hispanica CRISPR authenticates PAM of a target sequence to prime discriminative adaptation

    PubMed Central

    Li, Ming; Wang, Rui; Xiang, Hua

    2014-01-01

    The prokaryotic immune system CRISPR/Cas (Clustered Regularly Interspaced Short Palindromic Repeats/CRISPR-associated genes) adapts to foreign invaders by acquiring their short deoxyribonucleic acid (DNA) fragments as spacers, which guide subsequent interference to foreign nucleic acids based on sequence matching. The mechanism by which adaptation avoids acquiring ‘self’ DNA fragments is poorly understood. In Haloarcula hispanica, we previously showed that CRISPR adaptation requires being primed by a pre-existing spacer partially matching the invader DNA. Here, we further demonstrate that flanking a fully-matched target sequence, a functional PAM (protospacer adjacent motif) is still required to prime adaptation. Interestingly, interference utilizes only four PAM sequences, whereas adaptation-priming tolerates as many as 23 PAM sequences. This relaxed PAM selectivity explains how adaptation-priming maximizes its tolerance of PAM mutations (that escape interference) while avoiding mis-targeting the spacer DNA within the CRISPR locus. We propose that the primed adaptation, which hitches and cooperates with the interference pathway, distinguishes target from non-target by CRISPR ribonucleic acid guidance and PAM recognition. PMID:24803673

  14. Evo-devo of infantile and childhood growth.

    PubMed

    Hochberg, Ze'ev; Albertsson-Wikland, Kerstin

    2008-07-01

    Human size is a tradeoff between the evolutionary advantages and disadvantages of being small or big. We now propose that adult size is determined to an important extent during transition from infancy to childhood. This transition is marked by a growth spurt. A delay in the transition has a lifelong impact on stature and is responsible for 44% of children with short stature in developed countries and many more in developing countries. Here, we present the data and theory of an evolutionary adaptive strategy of plasticity in the timing of transition from infancy into childhood to match the prevailing energy supply. We propose that humans have evolved to withstand energy crises by decreasing their body size, and that evolutionary short-term adaptations to energy crises trigger a predictive adaptive response that modify the transition into childhood, culminating in short stature.

  15. Adaptive finite element methods for two-dimensional problems in computational fracture mechanics

    NASA Technical Reports Server (NTRS)

    Min, J. B.; Bass, J. M.; Spradley, L. W.

    1994-01-01

    Some recent results obtained using solution-adaptive finite element methods in two-dimensional problems in linear elastic fracture mechanics are presented. The focus is on the basic issue of adaptive finite element methods for validating the new methodology by computing demonstration problems and comparing the stress intensity factors to analytical results.

  16. Simple adaptive sparse representation based classification schemes for EEG based brain-computer interface applications.

    PubMed

    Shin, Younghak; Lee, Seungchan; Ahn, Minkyu; Cho, Hohyun; Jun, Sung Chan; Lee, Heung-No

    2015-11-01

    One of the main problems related to electroencephalogram (EEG) based brain-computer interface (BCI) systems is the non-stationarity of the underlying EEG signals. This results in the deterioration of the classification performance during experimental sessions. Therefore, adaptive classification techniques are required for EEG based BCI applications. In this paper, we propose simple adaptive sparse representation based classification (SRC) schemes. Supervised and unsupervised dictionary update techniques for new test data and a dictionary modification method by using the incoherence measure of the training data are investigated. The proposed methods are very simple and additional computation for the re-training of the classifier is not needed. The proposed adaptive SRC schemes are evaluated using two BCI experimental datasets. The proposed methods are assessed by comparing classification results with the conventional SRC and other adaptive classification methods. On the basis of the results, we find that the proposed adaptive schemes show relatively improved classification accuracy as compared to conventional methods without requiring additional computation. Copyright © 2015 Elsevier Ltd. All rights reserved.
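
    As background, baseline (non-adaptive) sparse representation classification codes a test trial over a dictionary of training trials and assigns the class whose atoms give the smallest reconstruction residual. The sketch below illustrates that baseline with an l1 solver from scikit-learn; the features, regularization value, and data are invented, and the paper's supervised/unsupervised dictionary-update steps are not shown.

```python
import numpy as np
from sklearn.linear_model import Lasso

def src_classify(x_test, D, labels, alpha=0.01):
    """Baseline sparse representation based classification (SRC).

    D: (n_features, n_train) dictionary whose columns are training trials;
    labels: class label of each column. The test trial is coded sparsely over
    the whole dictionary and assigned to the class with the smallest residual.
    """
    lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=5000)
    lasso.fit(D, x_test)
    coef = lasso.coef_
    best_cls, best_res = None, np.inf
    for cls in np.unique(labels):
        mask = labels == cls
        residual = np.linalg.norm(x_test - D[:, mask] @ coef[mask])
        if residual < best_res:
            best_cls, best_res = cls, residual
    return best_cls

# Demo on synthetic two-class "EEG feature" vectors
rng = np.random.default_rng(0)
proto = {0: rng.standard_normal(32), 1: rng.standard_normal(32)}
D = np.column_stack([proto[c] + 0.3 * rng.standard_normal(32)
                     for c in (0, 1) for _ in range(20)])
labels = np.array([0] * 20 + [1] * 20)
x = proto[1] + 0.3 * rng.standard_normal(32)
print(src_classify(x, D, labels))
```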

  17. Adaptive eye-gaze tracking using neural-network-based user profiles to assist people with motor disability.

    PubMed

    Sesin, Anaelis; Adjouadi, Malek; Cabrerizo, Mercedes; Ayala, Melvin; Barreto, Armando

    2008-01-01

    This study developed an adaptive real-time human-computer interface (HCI) that serves as an assistive technology tool for people with severe motor disability. The proposed HCI design uses eye gaze as the primary computer input device. Controlling the mouse cursor with raw eye coordinates results in sporadic motion of the pointer because of the saccadic nature of the eye. Even though eye movements are subtle and completely imperceptible under normal circumstances, they considerably affect the accuracy of an eye-gaze-based HCI. The proposed HCI system is novel because it adapts to each specific user's different and potentially changing jitter characteristics through the configuration and training of an artificial neural network (ANN) that is structured to minimize the mouse jitter. This task is based on feeding the ANN a user's initially recorded eye-gaze behavior through a short training session. The ANN finds the relationship between the gaze coordinates and the mouse cursor position based on the multilayer perceptron model. An embedded graphical interface is used during the training session to generate user profiles that make up these unique ANN configurations. The results with 12 subjects in test 1, which involved following a moving target, showed an average jitter reduction of 35%; the results with 9 subjects in test 2, which involved following the contour of a square object, showed an average jitter reduction of 53%. In both tests, the outcomes were trajectories that were significantly smoother and could reach fixed or moving targets with relative ease, within a 5% error margin or deviation from the desired trajectories. The positive effects of such jitter reduction are presented graphically for visual appreciation.
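
    To make the idea concrete, the sketch below trains a small multilayer perceptron on simulated calibration data so that it maps jittery gaze coordinates to intended cursor positions; the network size, training data, and scikit-learn implementation are illustrative assumptions rather than the configuration used in the study.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Simulate a short calibration session: the user fixates known targets while
# raw gaze samples jitter around them (saccadic noise).
rng = np.random.default_rng(0)
targets = rng.uniform(0, 1, size=(200, 2))              # intended cursor positions
gaze = targets + 0.03 * rng.standard_normal((200, 2))   # jittery gaze estimates

# Per-user profile: a small multilayer perceptron mapping gaze -> cursor.
mlp = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
mlp.fit(gaze, targets)

# At run time, incoming gaze samples are smoothed through the trained profile.
new_gaze = np.array([[0.52, 0.48], [0.10, 0.91]])
print(mlp.predict(new_gaze))
```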

  18. The future of computing--new architectures and new technologies.

    PubMed

    Warren, P

    2004-02-01

    All modern computers are designed using the 'von Neumann' architecture and built using silicon transistor technology. Both architecture and technology have been remarkably successful. Yet there are a range of problems for which this conventional architecture is not particularly well adapted, and new architectures are being proposed to solve these problems, in particular based on insight from nature. Transistor technology has enjoyed 50 years of continuing progress. However, the laws of physics dictate that within a relatively short time period this progress will come to an end. New technologies, based on molecular and biological sciences as well as quantum physics, are vying to replace silicon, or at least coexist with it and extend its capability. The paper describes these novel architectures and technologies, places them in the context of the kinds of problems they might help to solve, and predicts their possible manner and time of adoption. Finally it describes some key questions and research problems associated with their use.

  19. Real-time deblurring of handshake blurred images on smartphones

    NASA Astrophysics Data System (ADS)

    Pourreza-Shahri, Reza; Chang, Chih-Hsiang; Kehtarnavaz, Nasser

    2015-02-01

    This paper discusses an Android app for the purpose of removing blur that is introduced as a result of handshakes when taking images via a smartphone. This algorithm utilizes two images to achieve deblurring in a computationally efficient manner without suffering from artifacts associated with deconvolution deblurring algorithms. The first image is the normal or auto-exposure image and the second image is a short-exposure image that is automatically captured immediately before or after the auto-exposure image is taken. A low rank approximation image is obtained by applying singular value decomposition to the auto-exposure image which may appear blurred due to handshakes. This approximation image does not suffer from blurring while incorporating the image brightness and contrast information. The eigenvalues extracted from the low rank approximation image are then combined with those from the short-exposure image. It is shown that this deblurring app is computationally more efficient than the adaptive tonal correction algorithm which was previously developed for the same purpose.
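
    The low-rank approximation step is easy to prototype with a truncated SVD. The sketch below shows that step plus one plausible way of combining it with the short-exposure capture; the chosen rank, the additive combination rule, and the synthetic images are assumptions for illustration, not the published algorithm.

```python
import numpy as np

def low_rank_approximation(img, rank):
    """Rank-k approximation of a grayscale image via truncated SVD."""
    u, s, vh = np.linalg.svd(img.astype(float), full_matrices=False)
    return u[:, :rank] @ np.diag(s[:rank]) @ vh[:rank, :]

def combine_exposures(auto_exp, short_exp, rank=10):
    """Illustrative combination of the two captures.

    Brightness/contrast come from a low-rank approximation of the
    auto-exposure image; fine detail comes from the short-exposure image.
    Adding the detail on top of the low-rank base is an assumed rule,
    not the weighting used by the published algorithm.
    """
    base = low_rank_approximation(auto_exp, rank)
    detail = short_exp - low_rank_approximation(short_exp, rank)
    return base + detail

rng = np.random.default_rng(0)
auto_exp = rng.uniform(0, 255, (120, 160))
short_exp = 0.4 * auto_exp + rng.normal(0, 5, (120, 160))
print(combine_exposures(auto_exp, short_exp).shape)
```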

  20. Global Load Balancing with Parallel Mesh Adaption on Distributed-Memory Systems

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Oliker, Leonid; Sohn, Andrew

    1996-01-01

    Dynamic mesh adaption on unstructured grids is a powerful tool for efficiently computing unsteady problems to resolve solution features of interest. Unfortunately, this causes load imbalance among processors on a parallel machine. This paper describes the parallel implementation of a tetrahedral mesh adaption scheme and a new global load balancing method. A heuristic remapping algorithm is presented that assigns partitions to processors such that the redistribution cost is minimized. Results indicate that the parallel performance of the mesh adaption code depends on the nature of the adaption region and show a 35.5X speedup on 64 processors of an SP2 when 35% of the mesh is randomly adapted. For large-scale scientific computations, our load balancing strategy gives almost a sixfold reduction in solver execution times over non-balanced loads. Furthermore, our heuristic remapper yields processor assignments that are less than 3% off the optimal solutions but requires only 1% of the computational time.

  1. Usability of an Adaptive Computer Assistant that Improves Self-care and Health Literacy of Older Adults

    PubMed Central

    Blanson Henkemans, O. A.; Rogers, W. A.; Fisk, A. D.; Neerincx, M. A.; Lindenberg, J.; van der Mast, C. A. P. G.

    2014-01-01

    Summary Objectives We developed an adaptive computer assistant for the supervision of diabetics’ self-care, to help limit illness and the need for acute treatment, and to improve health literacy. This assistant monitors self-care activities logged in the patient’s electronic diary. Accordingly, it provides context-aware feedback. The objective was to evaluate whether older adults in general can make use of the computer assistant and to compare an adaptive computer assistant with a fixed one, concerning its usability and contribution to health literacy. Methods We conducted a laboratory experiment in the Georgia Tech Aware Home wherein 28 older adults participated in a usability evaluation of the computer assistant, while engaged in scenarios reflecting normal and health-critical situations. We evaluated the assistant on effectiveness, efficiency, satisfaction, and educational value. Finally, we studied the moderating effects of the subjects’ personal characteristics. Results Logging self-care tasks and receiving feedback from the computer assistant enhanced the subjects’ knowledge of diabetes. The adaptive assistant was more effective in dealing with normal and health-critical situations, and, generally, it led to greater time efficiency. Subjects’ personal characteristics had substantial effects on the effectiveness and efficiency of the two computer assistants. Conclusions Older adults were able to use the adaptive computer assistant. In addition, it had a positive effect on the development of health literacy. The assistant has the potential to support older diabetics’ self-care while maintaining quality of life. PMID:18213433

  2. Short-term differential adaptation to anaerobic stress via genomic mutations by Escherichia coli strains K-12 and B lacking alcohol dehydrogenase.

    PubMed

    Kim, Hyun Ju; Jeong, Haeyoung; Hwang, Seungwoo; Lee, Moo-Seung; Lee, Yong-Jik; Lee, Dong-Woo; Lee, Sang Jun

    2014-01-01

    Microbial adaptations often occur via genomic mutations under adverse environmental conditions. This study used Escherichia coli ΔadhE cells as a model system to investigate adaptation to anaerobic conditions, which we then compared with the adaptive mechanisms of two closely related E. coli strains, K-12 and B. In contrast to K-12 ΔadhE cells, the E. coli B ΔadhE cells exhibited significantly delayed adaptive growth under anaerobic conditions. Adaptation by the K-12 and B strains mainly employed anaerobic lactate fermentation to restore cellular growth. Several mutations were identified in the pta or pflB genes of adapted K-12 cells, but mostly in the pta gene of the B strains. However, the types of mutation in the adapted K-12 and B strains were similar. Cellular viability was affected directly by severe redox imbalance in B ΔadhE cells, which also impaired their ability to adapt to anaerobic conditions. This study demonstrates that closely related microorganisms may undergo different adaptations under the same set of adverse conditions, which might be associated with the specific metabolic characteristics of each strain. This study provides new insights into short-term microbial adaptation to stressful conditions, which may reflect dynamic microbial population changes in nature.

  3. 2-dimensional implicit hydrodynamics on adaptive grids

    NASA Astrophysics Data System (ADS)

    Stökl, A.; Dorfi, E. A.

    2007-12-01

    We present a numerical scheme for two-dimensional hydrodynamics computations using a 2D adaptive grid together with an implicit discretization. The combination of these techniques has offered favorable numerical properties in a variety of one-dimensional astrophysical problems, which motivated us to generalize the approach to two-dimensional applications. Because 2D grids differ topologically from 1D problems, grid adaptivity has to avoid severe grid distortions, which necessitates additional smoothing parameters in the formulation of a 2D adaptive grid. The concept of adaptivity is described in detail, and several test computations demonstrate the effectiveness of smoothing. The coupled solution of this grid equation together with the equations of hydrodynamics is illustrated by computation of a 2D shock tube problem.

  4. Cortico-striatal language pathways dynamically adjust for syntactic complexity: A computational study.

    PubMed

    Szalisznyó, Krisztina; Silverstein, David; Teichmann, Marc; Duffau, Hugues; Smits, Anja

    2017-01-01

    A growing body of literature supports a key role of fronto-striatal circuits in language perception. It is now known that the striatum plays a role in engaging attentional resources and linguistic rule computation while also serving phonological short-term memory capabilities. The ventral semantic and the dorsal phonological stream dichotomy assumed for spoken language processing also seems to play a role in cortico-striatal perception. Based on recent studies that correlate deep Broca-striatal pathways with complex syntax performance, we used a previously developed computational model of frontal-striatal syntax circuits and hypothesized that different parallel language pathways may contribute to canonical and non-canonical sentence comprehension separately. We modified and further analyzed a thematic role assignment task and corresponding reservoir computing model of language circuits, as previously developed by Dominey and coworkers. We examined the model's performance under various parameter regimes by influencing how fast the presented language input decays and altering the temporal dynamics of activated word representations. This enabled us to quantify canonical and non-canonical sentence comprehension abilities. The modeling results suggest that separate cortico-cortical and cortico-striatal circuits may be recruited differently for processing syntactically more difficult and less complicated sentences. Alternatively, a single circuit would need to dynamically and adaptively adjust to syntactic complexity. Copyright © 2016. Published by Elsevier Inc.

  5. Annual Rainfall Forecasting by Using Mamdani Fuzzy Inference System

    NASA Astrophysics Data System (ADS)

    Fallah-Ghalhary, G.-A.; Habibi Nokhandan, M.; Mousavi Baygi, M.

    2009-04-01

    Long-term rainfall prediction is very important to countries thriving on an agro-based economy. In general, climate and rainfall are highly non-linear phenomena in nature, giving rise to what is known as the "butterfly effect". The parameters that are required to predict the rainfall are enormous even for a short period. Soft computing is an innovative approach to construct computationally intelligent systems that are supposed to possess humanlike expertise within a specific domain, adapt themselves and learn to do better in changing environments, and explain how they make decisions. Unlike conventional artificial intelligence techniques, the guiding principle of soft computing is to exploit tolerance for imprecision, uncertainty, robustness, and partial truth to achieve tractability and better rapport with reality. In this paper, 33 years of rainfall data were analyzed for Khorasan province, in the northeastern part of Iran, situated at latitude-longitude pairs (31°-38°N, 74°-80°E). This research attempted to train Fuzzy Inference System (FIS) based prediction models with these 33 years of rainfall data. For performance evaluation, the model-predicted outputs were compared with the actual rainfall data. Simulation results reveal that soft computing techniques are promising and efficient. The test results of the FIS model showed an RMSE of 52 millimeters.
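
    For readers unfamiliar with Mamdani inference, the sketch below implements its basic machinery (triangular memberships, min implication, max aggregation, centroid defuzzification) on a toy two-rule system; the inputs, membership functions, and rules are invented and are not the system calibrated on the Khorasan data.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with corners a <= b <= c."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def mamdani_rainfall(humidity, temperature):
    """Toy two-input Mamdani FIS predicting annual rainfall (mm)."""
    y = np.linspace(0, 400, 401)                      # rainfall universe
    # Rule 1: IF humidity is high AND temperature is low  THEN rainfall is high
    w1 = min(tri(humidity, 50, 80, 100), tri(temperature, 0, 5, 15))
    # Rule 2: IF humidity is low  AND temperature is high THEN rainfall is low
    w2 = min(tri(humidity, 0, 20, 50), tri(temperature, 15, 30, 45))
    # Mamdani implication (clip) and aggregation (max)
    agg = np.maximum(np.minimum(w1, tri(y, 200, 300, 400)),
                     np.minimum(w2, tri(y, 0, 50, 150)))
    if agg.sum() == 0:
        return float('nan')
    return float((y * agg).sum() / agg.sum())         # centroid defuzzification

print(mamdani_rainfall(humidity=70.0, temperature=8.0))
```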

  6. A Structured-Inquiry Approach to Teaching Neurophysiology Using Computer Simulation

    PubMed Central

    Crisp, Kevin M.

    2012-01-01

    Computer simulation is a valuable tool for teaching the fundamentals of neurophysiology in undergraduate laboratories where time and equipment limitations restrict the amount of course content that can be delivered through hands-on interaction. However, students often find such exercises to be tedious and unstimulating. In an effort to engage students in the use of computational modeling while developing a deeper understanding of neurophysiology, an attempt was made to use an educational neurosimulation environment as the basis for a novel, inquiry-based research project. During the semester, students in the class wrote a research proposal, used the Neurodynamix II simulator to generate a large data set, analyzed their modeling results statistically, and presented their findings at the Midbrains Neuroscience Consortium undergraduate poster session. Learning was assessed in the form of a series of short term papers and two 10-min in-class writing responses to the open-ended question, “How do ion channels influence neuronal firing?”, which they completed on weeks 6 and 15 of the semester. Students’ answers to this question showed a deeper understanding of neuronal excitability after the project; their term papers revealed evidence of critical thinking about computational modeling and neuronal excitability. Suggestions for the adaptation of this structured-inquiry approach into shorter term lab experiences are discussed. PMID:23494064

  7. On the use of interaction error potentials for adaptive brain computer interfaces.

    PubMed

    Llera, A; van Gerven, M A J; Gómez, V; Jensen, O; Kappen, H J

    2011-12-01

    We propose an adaptive classification method for Brain Computer Interfaces (BCI) which uses Interaction Error Potentials (IErrPs) as a reinforcement signal and adapts the classifier parameters when an error is detected. We analyze the quality of the proposed approach in relation to the misclassification of the IErrPs. In addition, we compare static versus adaptive classification performance using artificial and MEG data. We show that the proposed adaptive framework significantly improves on the static classification methods. Copyright © 2011 Elsevier Ltd. All rights reserved.
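
    As a concrete (and deliberately simplified) picture of using an error potential as a reinforcement signal, the sketch below updates a linear classifier only when the IErrP detector flags the previous decision as wrong; the logistic-gradient update rule, learning rate, and data are assumptions, not the adaptation scheme of the cited work.

```python
import numpy as np

class ErrPGatedClassifier:
    """Linear classifier whose weights are updated only when an interaction
    error potential (IErrP) signals that the last decision was wrong."""

    def __init__(self, n_features, lr=0.05):
        self.w = np.zeros(n_features)
        self.lr = lr

    def predict(self, x):
        return 1 if x @ self.w >= 0 else 0

    def feedback(self, x, prediction, errp_detected):
        if not errp_detected:
            return
        target = 1 - prediction                    # the decision was wrong
        p = 1.0 / (1.0 + np.exp(-(x @ self.w)))
        self.w += self.lr * (target - p) * x       # one logistic-loss gradient step

# Usage: classify a trial, then adapt if the error-potential detector fires.
rng = np.random.default_rng(0)
clf = ErrPGatedClassifier(n_features=8)
x = rng.standard_normal(8)
y_hat = clf.predict(x)
clf.feedback(x, y_hat, errp_detected=True)
print(clf.w)
```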

  8. An Investigation on Computer-Adaptive Multistage Testing Panels for Multidimensional Assessment

    ERIC Educational Resources Information Center

    Wang, Xinrui

    2013-01-01

    The computer-adaptive multistage testing (ca-MST) has been developed as an alternative to computerized adaptive testing (CAT), and been increasingly adopted in large-scale assessments. Current research and practice only focus on ca-MST panels for credentialing purposes. The ca-MST test mode, therefore, is designed to gauge a single scale. The…

  9. Solution-adaptive finite element method in computational fracture mechanics

    NASA Technical Reports Server (NTRS)

    Min, J. B.; Bass, J. M.; Spradley, L. W.

    1993-01-01

    Some recent results obtained using solution-adaptive finite element method in linear elastic two-dimensional fracture mechanics problems are presented. The focus is on the basic issue of adaptive finite element method for validating the applications of new methodology to fracture mechanics problems by computing demonstration problems and comparing the stress intensity factors to analytical results.

  10. Computer Adaptive Testing, Big Data and Algorithmic Approaches to Education

    ERIC Educational Resources Information Center

    Thompson, Greg

    2017-01-01

    This article critically considers the promise of computer adaptive testing (CAT) and digital data to provide better and quicker data that will improve the quality, efficiency and effectiveness of schooling. In particular, it uses the case of the Australian NAPLAN test that will become an online, adaptive test from 2016. The article argues that…

  11. A Guide to Computer Simulations of Three Adaptive Instructional Models for the Advanced Instructional System Phases II and III. Final Report.

    ERIC Educational Resources Information Center

    Hansen, Duncan N.; And Others

    Computer simulations of three individualized adaptive instructional models (AIM) were undertaken to determine if these models function as prescribed in Air Force technical training programs. In addition, the project sought to develop a user's guide for effective understanding of adaptive models during field implementation. Successful simulations…

  12. Adaptive Calibration of Dynamic Accommodation—Implications for Accommodating Intraocular Lenses

    PubMed Central

    Schor, Clifton M.; Bharadwaj, Shrikant R.

    2009-01-01

    PURPOSE When the aging lens is replaced with prosthetic accommodating intraocular lenses (IOLs), with effective viscoelasticities different from those of the natural lens, mismatches could arise between the neural control of accommodation and the biomechanical properties of the new lens. These mismatches could lead to either unstable oscillations or sluggishness of dynamic accommodation. Using computer simulations, we investigated whether optimal accommodative responses could be restored through recalibration of the neural control of accommodation. Using human experiments, we also investigated whether the accommodative system has the capacity for adaptive recalibration in response to changes in lens biomechanics. METHODS Dynamic performance of two accommodating IOL prototypes was simulated for a 45-year-old accommodative system, before and after neural recalibration, using a dynamic model of accommodation. Accommodating IOL I, a prototype for an injectable accommodating IOL, was less stiff and less viscous than the natural 45-year-old lens. Accommodating IOL II, a prototype for a translating accommodating IOL, was less stiff and more viscous than the natural 45-year-old lens. Short-term adaptive recalibration of dynamic accommodation was stimulated using a double-step adaptation paradigm that optically induced changes in neuromuscular effort mimicking responses to changes in lens biomechanics. RESULTS Model simulations indicate that the unstable oscillations or sluggishness of dynamic accommodation resulting from mismatches between neural control and lens biomechanics might be restored through neural recalibration. CONCLUSIONS Empirical measures reveal that the accommodative system is capable of adaptive recalibration in response to optical loads that simulate effects of changing lens biomechanics. PMID:19044245

  13. BAUM: improving genome assembly by adaptive unique mapping and local overlap-layout-consensus approach.

    PubMed

    Wang, Anqi; Wang, Zhanyu; Li, Zheng; Li, Lei M

    2018-06-15

    It is highly desirable to assemble genomes of high continuity and consistency at low cost. The current bottleneck of draft genome continuity using second generation sequencing (SGS) reads is primarily caused by uncertainty among repetitive sequences. Even though single-molecule real-time sequencing technology is very promising for overcoming the uncertainty issue, its relatively high cost and error rate add burden on budget or computation. Many long-read assemblers take the overlap-layout-consensus (OLC) paradigm, which is less sensitive to sequencing errors, heterozygosity and variability of coverage. However, current assemblers of SGS data do not sufficiently take advantage of the OLC approach. Aiming at minimizing uncertainty, the proposed method, BAUM, breaks the whole genome into regions by adaptive unique mapping; then local OLC is used to assemble each region in parallel. BAUM can (i) perform reference-assisted assembly based on the genome of a close species or (ii) improve the results of existing assemblies obtained from short or long sequencing reads. Tests on two eukaryote genomes, the wild rice Oryza longistaminata and the parrot Melopsittacus undulatus, show that BAUM achieved substantial improvements in genome size and continuity. Besides, BAUM reconstructed a considerable number of repetitive regions that failed to be assembled by existing short-read assemblers. We also propose statistical approaches to control the uncertainty in different steps of BAUM. http://www.zhanyuwang.xin/wordpress/index.php/2017/07/21/baum. Supplementary data are available at Bioinformatics online.

  14. An Efficient Adaptive Window Size Selection Method for Improving Spectrogram Visualization.

    PubMed

    Nisar, Shibli; Khan, Omar Usman; Tariq, Muhammad

    2016-01-01

    Short Time Fourier Transform (STFT) is an important technique for the time-frequency analysis of a time varying signal. The basic approach behind it involves the application of a Fast Fourier Transform (FFT) to a signal multiplied with an appropriate window function with fixed resolution. The selection of an appropriate window size is difficult when no background information about the input signal is known. In this paper, a novel empirical model is proposed that adaptively adjusts the window size for a narrow-band signal using a spectrum sensing technique. For wide-band signals, where a fixed time-frequency resolution is undesirable, the approach adapts the constant Q transform (CQT). Unlike the STFT, the CQT provides a varying time-frequency resolution. This results in a high spectral resolution at low frequencies and high temporal resolution at high frequencies. In this paper, a simple but effective switching framework is provided between both STFT and CQT. The proposed method also allows for the dynamic construction of a filter bank according to user-defined parameters. This helps in reducing redundant entries in the filter bank. Results obtained from the proposed method not only improve the spectrogram visualization but also reduce the computation cost; the method selects the appropriate window length in 87.71% of cases.
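
    A bare-bones version of the narrow-band branch can be written in a few lines: sense the occupied bandwidth from the spectrum and size the STFT window from it. The bandwidth estimator, window-length rule, and thresholds below are assumptions made for illustration, not the paper's empirical model, and the wide-band CQT branch is omitted.

```python
import numpy as np
from scipy.signal import stft

def occupied_bandwidth(x, fs, power_fraction=0.99):
    """Crude spectrum-sensing step: bandwidth containing most of the power."""
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    order = np.argsort(spectrum)[::-1]
    cum = np.cumsum(spectrum[order]) / spectrum.sum()
    kept = order[: np.searchsorted(cum, power_fraction) + 1]
    return max(freqs[kept].max() - freqs[kept].min(), freqs[1])

def adaptive_stft(x, fs):
    """Pick the STFT window from the sensed bandwidth (illustrative rule:
    a few cycles of the narrowest resolvable band per window)."""
    bw = occupied_bandwidth(x, fs)
    nperseg = int(np.clip(4 * fs / bw, 64, len(x) // 4))
    return stft(x, fs=fs, nperseg=nperseg)

fs = 1000.0
t = np.arange(0, 2.0, 1.0 / fs)
x = np.sin(2 * np.pi * 60 * t) + 0.05 * np.random.randn(t.size)  # narrow-band tone
f, tt, Z = adaptive_stft(x, fs)
print(Z.shape)
```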

  15. Modeling Two Types of Adaptation to Climate Change

    EPA Science Inventory

    Mitigation and adaptation are the two key responses available to policymakers to reduce the risks of climate change. We model these two policies together in a new DICE-based integrated assessment model that characterizes adaptation as either short-lived flow spending or long-live...

  16. An Adaptive Testing System for Supporting Versatile Educational Assessment

    ERIC Educational Resources Information Center

    Huang, Yueh-Min; Lin, Yen-Ting; Cheng, Shu-Chen

    2009-01-01

    With the rapid growth of computer and mobile technology, it is a challenge to integrate computer based test (CBT) with mobile learning (m-learning) especially for formative assessment and self-assessment. In terms of self-assessment, computer adaptive test (CAT) is a proper way to enable students to evaluate themselves. In CAT, students are…

  17. An Adaptive Evaluation Structure for Computer-Based Instruction.

    ERIC Educational Resources Information Center

    Welsh, William A.

    Adaptive Evaluation Structure (AES) is a set of linked computer programs designed to increase the effectiveness of interactive computer-assisted instruction at the college level. The package has four major features, the first of which is based on a prior cognitive inventory and on the accuracy and pace of student responses. AES adjusts materials…

  18. Advances in petascale kinetic plasma simulation with VPIC and Roadrunner

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowers, Kevin J; Albright, Brian J; Yin, Lin

    2009-01-01

    VPIC, a first-principles 3d electromagnetic charge-conserving relativistic kinetic particle-in-cell (PIC) code, was recently adapted to run on Los Alamos's Roadrunner, the first supercomputer to break a petaflop (10^15 floating point operations per second) in the TOP500 supercomputer performance rankings. They give a brief overview of the modeling capabilities and optimization techniques used in VPIC and the computational characteristics of petascale supercomputers like Roadrunner. They then discuss three applications enabled by VPIC's unprecedented performance on Roadrunner: modeling laser plasma interaction in upcoming inertial confinement fusion experiments at the National Ignition Facility (NIF), modeling short pulse laser GeV ion acceleration, and modeling reconnection in magnetic confinement fusion experiments.

  19. Analysis of Environmental Stress Factors Using an Artificial Growth System and Plant Fitness Optimization

    PubMed Central

    Lee, Meonghun; Yoe, Hyun

    2015-01-01

    The environment promotes evolution. Evolutionary processes represent environmental adaptations over long time scales; evolution of crop genomes is not inducible within the relatively short time span of a human generation. Extreme environmental conditions can accelerate evolution, but such conditions are often stress inducing and disruptive. Artificial growth systems can be used to induce and select genomic variation by changing external environmental conditions, thus accelerating evolution. By using cloud computing and big-data analysis, we analyzed environmental stress factors for Pleurotus ostreatus by assessing, evaluating, and predicting information about the growth environment. Through the indexing of environmental stress, the growth environment can be precisely controlled, an approach that can be developed into a technology for improving crop quality and production. PMID:25874206

  20. Short-Term Intercultural Psychotherapy: Ethnographic Inquiry

    ERIC Educational Resources Information Center

    Seeley, Karen M.

    2004-01-01

    This article examines the challenges specific to short-term intercultural treatments and recently developed approaches to intercultural treatments based on notions of cultural knowledge and cultural competence. The article introduces alternative approaches to short-term intercultural treatments based on ethnographic inquiry adapted for clinical…

  1. Effects of hypoxia on sympathetic neural control in humans

    NASA Technical Reports Server (NTRS)

    Smith, M. L.; Muenter, N. K.

    2000-01-01

    This special issue is principally focused on the time domain of the adaptive mechanisms of ventilatory responses to short-term, long-term and intermittent hypoxia. The purpose of this review is to summarize the limited literature on the sympathetic neural responses to sustained or intermittent hypoxia in humans and attempt to discern the time domain of these responses and potential adaptive processes that are evoked during short and long-term exposures to hypoxia.

  2. [Effect of glutamine and growth hormone on adaptation in short bowel syndrome].

    PubMed

    Wu, Guo-hao; Wu, Zhao-han; Wu, Zhao-guang

    2005-09-01

    To assess the effects of parenteral glutamine and growth hormone supplementation on gut adaptation for patients with short bowel syndrome. Twenty-six patients [male 15, female 11, aged (39 +/- 23) years] with short bowel syndrome received parenteral nutrition (PN) 3-52 months after surgical resection. The median length of remnant small intestine was 42.5 (0-100) cm. All patients received growth hormone (0.10 +/- 0.06) mg.kg(-1).d(-1) plus glutamine (0.30 +/- 0.17) mg.kg(-1).d(-1) for two or three weeks. Among the 26 patients, PN was not required soon after treatment in 34.6% (n=9) of the patients, the frequency and volume of PN decreased from (6.0 +/- 1.0) d to (4.2 +/- 1.0) d, from (13.6 +/- 5.2) L per week to (8.2 +/- 3.3) L per week respectively in 30.8% (n=8) of the patients, while 34.6% (n=9) still required PN after treatment. The combined administration of glutamine and growth hormone can promote remnant intestinal adaptation in short bowel patients.

  3. A novel composite adaptive flap controller design by a high-efficient modified differential evolution identification approach.

    PubMed

    Li, Nailu; Mu, Anle; Yang, Xiyun; Magar, Kaman T; Liu, Chao

    2018-05-01

    Optimal tuning of an adaptive flap controller can improve adaptive flap control performance in uncertain operating environments, but the optimization process is usually time-consuming and it is difficult to design a proper optimal tuning strategy for the flap control system (FCS). To solve this problem, a novel adaptive flap controller is designed based on a highly efficient differential evolution (DE) identification technique and a composite adaptive internal model control (CAIMC) strategy. The optimal tuning can be easily obtained from the DE-identified inverse of the FCS via the CAIMC structure. To achieve fast tuning, a highly efficient modified adaptive DE algorithm is proposed, with a new mutant operator and a varying-range adaptive mechanism, for the FCS identification. A tradeoff between optimized adaptive flap control and low computation cost is successfully achieved by the proposed controller. Simulation results show the robustness of the proposed method and its superiority to the conventional adaptive IMC (AIMC) flap controller and to CAIMC flap controllers using other DE algorithms under various uncertain operating conditions. The high computational efficiency of the proposed controller is also verified from the computation time in those operating cases. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
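
    For orientation, the sketch below shows a plain DE/rand/1/bin loop identifying the gain and time constant of a toy first-order model from simulated step-response data; the model, parameter ranges, and DE settings are illustrative assumptions, and the modified mutant operator and varying-range mechanism of the cited controller are not reproduced.

```python
import numpy as np

def differential_evolution(cost, bounds, pop_size=20, F=0.7, CR=0.9, n_gen=100, seed=0):
    """Textbook DE/rand/1/bin minimizer over box-constrained parameters."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    pop = rng.uniform(lo, hi, size=(pop_size, len(lo)))
    fitness = np.array([cost(p) for p in pop])
    for _ in range(n_gen):
        for i in range(pop_size):
            idx = [k for k in range(pop_size) if k != i]
            a, b, c = pop[rng.choice(idx, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            cross = rng.random(len(lo)) < CR
            cross[rng.integers(len(lo))] = True          # force at least one gene
            trial = np.where(cross, mutant, pop[i])
            f_trial = cost(trial)
            if f_trial < fitness[i]:
                pop[i], fitness[i] = trial, f_trial
    return pop[np.argmin(fitness)], fitness.min()

# Identify gain K and time constant tau of a first-order "flap" response
# y[k+1] = y[k] + dt/tau * (K*u[k] - y[k]) from simulated step-response data.
dt, true_K, true_tau = 0.01, 2.0, 0.5
u = np.ones(300)
y = np.zeros(301)
for k in range(300):
    y[k + 1] = y[k] + dt / true_tau * (true_K * u[k] - y[k])

def simulate(params):
    K, tau = params
    yh = np.zeros(301)
    for k in range(300):
        yh[k + 1] = yh[k] + dt / tau * (K * u[k] - yh[k])
    return yh

cost = lambda p: np.mean((simulate(p) - y) ** 2)
best, err = differential_evolution(cost, np.array([[0.1, 5.0], [0.05, 2.0]]))
print(best, err)
```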

  4. The demodulated band transform

    PubMed Central

    Kovach, Christopher K.; Gander, Phillip E.

    2016-01-01

    Background Windowed Fourier decompositions (WFD) are widely used in measuring stationary and non-stationary spectral phenomena and in describing pairwise relationships among multiple signals. Although a variety of WFDs see frequent application in electrophysiological research, including the short-time Fourier transform, continuous wavelets, band-pass filtering and multitaper-based approaches, each carries certain drawbacks related to computational efficiency and spectral leakage. This work surveys the advantages of a WFD not previously applied in electrophysiological settings. New Methods A computationally efficient form of complex demodulation, the demodulated band transform (DBT), is described. Results DBT is shown to provide an efficient approach to spectral estimation with minimal susceptibility to spectral leakage. In addition, it lends itself well to adaptive filtering of non-stationary narrowband noise. Comparison with existing methods A detailed comparison with alternative WFDs is offered, with an emphasis on the relationship between DBT and Thomson's multitaper. DBT is shown to perform favorably in combining computational efficiency with minimal introduction of spectral leakage. Conclusion DBT is ideally suited to efficient estimation of both stationary and non-stationary spectral and cross-spectral statistics with minimal susceptibility to spectral leakage. These qualities are broadly desirable in many settings. PMID:26711370
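
    The idea behind complex demodulation is easy to demonstrate: shift the band of interest to baseband and low-pass it, yielding a band-limited complex envelope. The sketch below is generic complex demodulation with an FFT brick-wall filter, offered only as orientation; it is not the DBT's actual basis functions, windowing, or leakage-control scheme.

```python
import numpy as np

def complex_demodulate(x, fs, f_center, bandwidth):
    """Generic complex demodulation of one band: frequency-shift to baseband,
    then low-pass with an FFT brick-wall filter."""
    n = len(x)
    t = np.arange(n) / fs
    baseband = x * np.exp(-2j * np.pi * f_center * t)     # frequency shift
    spec = np.fft.fft(baseband)
    freqs = np.fft.fftfreq(n, d=1.0 / fs)
    spec[np.abs(freqs) > bandwidth / 2] = 0.0              # low-pass
    return 2 * np.fft.ifft(spec)                           # complex envelope

fs = 500.0
t = np.arange(0, 4.0, 1.0 / fs)
x = np.sin(2 * np.pi * 40 * t) * (1 + 0.5 * np.sin(2 * np.pi * 0.5 * t))
env = complex_demodulate(x, fs, f_center=40.0, bandwidth=4.0)
print(np.abs(env)[:5])   # instantaneous amplitude of the 40 Hz band
```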

  5. Computer-Aided Sensor Development Focused on Security Issues.

    PubMed

    Bialas, Andrzej

    2016-05-26

    The paper examines intelligent sensor and sensor system development according to the Common Criteria methodology, which is the basic security assurance methodology for IT products and systems. The paper presents how the development process can be supported by software tools, design patterns and knowledge engineering. The automation of this process brings cost-, quality-, and time-related advantages, because the most difficult and most laborious activities are software-supported and the design reusability is growing. The paper includes a short introduction to the Common Criteria methodology and its sensor-related applications. In the experimental section the computer-supported and patterns-based IT security development process is presented using the example of an intelligent methane detection sensor. This process is supported by an ontology-based tool for security modeling and analyses. The verified and justified models are transferred straight to the security target specification representing security requirements for the IT product. The novelty of the paper is to provide a patterns-based and computer-aided methodology for the sensors development with a view to achieving their IT security assurance. The paper summarizes the validation experiment focused on this methodology adapted for the sensors system development, and presents directions of future research.

  8. Modeling Adaptation as a Flow and Stock Decision with Mitigation

    EPA Science Inventory

    Mitigation and adaptation are the two key responses available to policymakers to reduce the risks of climate change. We model these two policies together in a new DICE-based integrated assessment model that characterizes adaptation as either short-lived flow spending or long-liv...

  9. On the Issue of Item Selection in Computerized Adaptive Testing with Response Times

    ERIC Educational Resources Information Center

    Veldkamp, Bernard P.

    2016-01-01

    Many standardized tests are now administered via computer rather than paper-and-pencil format. The computer-based delivery mode brings with it certain advantages. One advantage is the ability to adapt the difficulty level of the test to the ability level of the test taker in what has been termed computerized adaptive testing (CAT). A second…

  10. Preliminary Report on a National Cross-Validation of the Computerized Adaptive Screening Test (CAST).

    ERIC Educational Resources Information Center

    Knapp, Deirdre J.; Pliske, Rebecca M.

    A study was conducted to validate the Army's Computerized Adaptive Screening Test (CAST), using data from 2,240 applicants from 60 army recruiting stations across the nation. CAST is a computer-assisted adaptive test used to predict performance on the Armed Forces Qualification Test (AFQT). AFQT scores are computed by adding four subtest scores of…

  11. Adaptive 3D single-block grids for the computation of viscous flows around wings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagmeijer, R.; Kok, J.C.

    1996-12-31

    A robust algorithm for the adaption of a 3D single-block structured grid suitable for the computation of viscous flows around a wing is presented and demonstrated by application to the ONERA M6 wing. The effects of grid adaption on the flow solution and accuracy improvements is analyzed. Reynolds number variations are studied.

  12. Encoder-Decoder Optimization for Brain-Computer Interfaces

    PubMed Central

    Merel, Josh; Pianto, Donald M.; Cunningham, John P.; Paninski, Liam

    2015-01-01

    Neuroprosthetic brain-computer interfaces are systems that decode neural activity into useful control signals for effectors, such as a cursor on a computer screen. It has long been recognized that both the user and decoding system can adapt to increase the accuracy of the end effector. Co-adaptation is the process whereby a user learns to control the system in conjunction with the decoder adapting to learn the user's neural patterns. We provide a mathematical framework for co-adaptation and relate co-adaptation to the joint optimization of the user's control scheme ("encoding model") and the decoding algorithm's parameters. When the assumptions of that framework are respected, co-adaptation cannot yield better performance than that obtainable by an optimal initial choice of fixed decoder, coupled with optimal user learning. For a specific case, we provide numerical methods to obtain such an optimized decoder. We demonstrate our approach in a model brain-computer interface system using an online prosthesis simulator, a simple human-in-the-loop psychophysics setup which provides a non-invasive simulation of the BCI setting. These experiments support two claims: that users can learn encoders matched to fixed, optimal decoders and that, once learned, our approach yields expected performance advantages. PMID:26029919

  13. Encoder-decoder optimization for brain-computer interfaces.

    PubMed

    Merel, Josh; Pianto, Donald M; Cunningham, John P; Paninski, Liam

    2015-06-01

    Neuroprosthetic brain-computer interfaces are systems that decode neural activity into useful control signals for effectors, such as a cursor on a computer screen. It has long been recognized that both the user and decoding system can adapt to increase the accuracy of the end effector. Co-adaptation is the process whereby a user learns to control the system in conjunction with the decoder adapting to learn the user's neural patterns. We provide a mathematical framework for co-adaptation and relate co-adaptation to the joint optimization of the user's control scheme ("encoding model") and the decoding algorithm's parameters. When the assumptions of that framework are respected, co-adaptation cannot yield better performance than that obtainable by an optimal initial choice of fixed decoder, coupled with optimal user learning. For a specific case, we provide numerical methods to obtain such an optimized decoder. We demonstrate our approach in a model brain-computer interface system using an online prosthesis simulator, a simple human-in-the-loop psychophysics setup which provides a non-invasive simulation of the BCI setting. These experiments support two claims: that users can learn encoders matched to fixed, optimal decoders and that, once learned, our approach yields expected performance advantages.

  14. On high heels and short muscles: A multiscale model for sarcomere loss in the gastrocnemius muscle

    PubMed Central

    Zöllner, Alexander M.; Pok, Jacquelynn M.; McWalter, Emily J.; Gold, Garry E.; Kuhl, Ellen

    2014-01-01

    High heels are a major source of chronic lower limb pain. Yet, more than one third of all women compromise health for looks and wear high heels on a daily basis. Changing from flat footwear to high heels induces chronic muscle shortening associated with discomfort, fatigue, reduced shock absorption, and increased injury risk. However, the long-term effects of high-heeled footwear on the musculoskeletal kinematics of the lower extremities remain poorly understood. Here we create a multiscale computational model for chronic muscle adaptation to characterize the acute and chronic effects of global muscle shortening on local sarcomere lengths. We perform a case study of a healthy female subject and show that raising the heel by 13 cm shortens the gastrocnemius muscle by 5% while the Achilles tendon remains virtually unaffected. Our computational simulation indicates that muscle shortening displays significant regional variations with extreme values of 22% in the central gastrocnemius. Our model suggests that the muscle gradually adjusts to its new functional length by a chronic loss of sarcomeres in series. Sarcomere loss varies significantly across the muscle with an average loss of 9%, virtually no loss at the proximal and distal ends, and a maximum loss of 39% in the central region. These changes reposition the remaining sarcomeres back into their optimal operating regime. Computational modeling of chronic muscle shortening provides a valuable tool to shape our understanding of the underlying mechanisms of muscle adaptation. Our study could open new avenues in orthopedic surgery and enhance treatment for patients with muscle contracture caused by other conditions than high heel wear such as paralysis, muscular atrophy, and muscular dystrophy. PMID:25451524

  15. Multiple Motor Learning Strategies in Visuomotor Rotation

    PubMed Central

    Saijo, Naoki; Gomi, Hiroaki

    2010-01-01

    Background When exposed to a continuous directional discrepancy between movements of a visible hand cursor and the actual hand (visuomotor rotation), subjects adapt their reaching movements so that the cursor is brought to the target. Abrupt removal of the discrepancy after training induces reaching error in the direction opposite to the original discrepancy, which is called an aftereffect. Previous studies have shown that training with gradually increasing visuomotor rotation results in a larger aftereffect than with a suddenly increasing one. Although the aftereffect difference implies a difference in the learning process, it is still unclear whether the learned visuomotor transformations are qualitatively different between the training conditions. Methodology/Principal Findings We examined the qualitative changes in the visuomotor transformation after the learning of the sudden and gradual visuomotor rotations. The learning of the sudden rotation led to a significant increase of the reaction time for arm movement initiation and then the reaching error decreased, indicating that the learning is associated with an increase of computational load in motor preparation (planning). In contrast, the learning of the gradual rotation did not change the reaction time but resulted in an increase of the gain of feedback control, suggesting that the online adjustment of the reaching contributes to the learning of the gradual rotation. When the online cursor feedback was eliminated during the learning of the gradual rotation, the reaction time increased, indicating that additional computations are involved in the learning of the gradual rotation. Conclusions/Significance The results suggest that the change in the motor planning and online feedback adjustment of the movement are involved in the learning of the visuomotor rotation. The contributions of those computations to the learning are flexibly modulated according to the visual environment. Such multiple learning strategies would be required for reaching adaptation within a short training period. PMID:20195373

  16. Rapid equilibrium sampling initiated from nonequilibrium data.

    PubMed

    Huang, Xuhui; Bowman, Gregory R; Bacallado, Sergio; Pande, Vijay S

    2009-11-24

    Simulating the conformational dynamics of biomolecules is extremely difficult due to the rugged nature of their free energy landscapes and multiple long-lived, or metastable, states. Generalized ensemble (GE) algorithms, which have become popular in recent years, attempt to facilitate crossing between states at low temperatures by inducing a random walk in temperature space. Enthalpic barriers may be crossed more easily at high temperatures; however, entropic barriers will become more significant. This poses a problem because the dominant barriers to conformational change are entropic for many biological systems, such as the short RNA hairpin studied here. We present a new efficient algorithm for conformational sampling, called the adaptive seeding method (ASM), which uses nonequilibrium GE simulations to identify the metastable states, and seeds short simulations at constant temperature from each of them to quantitatively determine their equilibrium populations. Thus, the ASM takes advantage of the broad sampling possible with GE algorithms but generally crosses entropic barriers more efficiently during the seeding simulations at low temperature. We show that only local equilibrium is necessary for ASM, so very short seeding simulations may be used. Moreover, the ASM may be used to recover equilibrium properties from existing datasets that failed to converge, and is well suited to running on modern computer clusters.
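
    The population-estimation step can be illustrated with a toy calculation: given many short seeded trajectories discretized into metastable-state labels, count observed transitions, normalize to a transition matrix, and read off its stationary distribution. This is a generic NumPy sketch of that idea, not the authors' ASM code.

    ```python
    # Toy estimation of equilibrium state populations from short, seeded trajectories
    # (in the spirit of the ASM's seeding step; illustrative only).
    import numpy as np

    def equilibrium_populations(trajectories, n_states):
        """trajectories: list of 1-D integer arrays of state labels."""
        counts = np.zeros((n_states, n_states))
        for traj in trajectories:
            for i, j in zip(traj[:-1], traj[1:]):
                counts[i, j] += 1
        # Row-normalize to a transition matrix (small prior keeps empty rows defined).
        T = (counts + 1e-12) / (counts + 1e-12).sum(axis=1, keepdims=True)
        # Equilibrium distribution = left eigenvector of T with eigenvalue ~1.
        vals, vecs = np.linalg.eig(T.T)
        pi = np.real(vecs[:, np.argmax(np.real(vals))])
        return pi / pi.sum()

    # Example: two metastable states, a handful of short trajectories.
    trajs = [np.array([0, 0, 1, 1, 1]), np.array([1, 1, 0, 1, 1]), np.array([0, 0, 0, 1])]
    print(equilibrium_populations(trajs, n_states=2))
    ```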

  17. A 67-Item Stress Resilience item bank showing high content validity was developed in a psychosomatic sample.

    PubMed

    Obbarius, Nina; Fischer, Felix; Obbarius, Alexander; Nolte, Sandra; Liegl, Gregor; Rose, Matthias

    2018-04-10

    To develop the first item bank to measure Stress Resilience (SR) in clinical populations. Qualitative item development resulted in an initial pool of 131 items covering a broad theoretical SR concept. These items were tested in n=521 patients at a psychosomatic outpatient clinic. Exploratory and Confirmatory Factor Analysis (CFA), as well as other state-of-the-art item analyses and IRT were used for item evaluation and calibration of the final item bank. Out of the initial item pool of 131 items, we excluded 64 items (54 factor loading <.5, 4 residual correlations >.3, 2 non-discriminative Item Response Curves, 4 Differential Item Functioning). The final set of 67 items indicated sufficient model fit in CFA and IRT analyses. Additionally, a 10-item short form with high measurement precision (SE≤.32 in a theta range between -1.8 and +1.5) was derived. Both the SR item bank and the SR short form were highly correlated with an existing static legacy tool (Connor-Davidson Resilience Scale). The final SR item bank and 10-item short form showed good psychometric properties. When further validated, they will be ready to be used within a framework of Computer-Adaptive Tests for a comprehensive assessment of the Stress-Construct. Copyright © 2018. Published by Elsevier Inc.
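
    The SE≤.32 criterion is commonly read as "roughly 0.90 reliability" on a standardized theta metric, since marginal reliability at a given theta is approximately one minus the squared standard error:

    ```latex
    \[
      \mathrm{rel}(\theta) \approx 1 - \mathrm{SE}(\theta)^{2},
      \qquad
      \mathrm{SE}(\theta) = 0.32 \;\Rightarrow\; \mathrm{rel}(\theta) \approx 1 - 0.32^{2} \approx 0.90 .
    \]
    ```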

  18. Science Fiction on Film.

    ERIC Educational Resources Information Center

    Burmester, David

    1985-01-01

    Reviews science fiction films used in a science fiction class. Discusses feature films, short science fiction films, short story adaptations, original science fiction pieces and factual science films that enrich literature. (EL)

  19. Adaptive multimode signal reconstruction from time–frequency representations

    PubMed Central

    Meignen, Sylvain; Oberlin, Thomas; Depalle, Philippe; Flandrin, Patrick

    2016-01-01

    This paper discusses methods for the adaptive reconstruction of the modes of multicomponent AM–FM signals by their time–frequency (TF) representation derived from their short-time Fourier transform (STFT). The STFT of an AM–FM component or mode spreads the information relative to that mode in the TF plane around curves commonly called ridges. An alternative view is to consider a mode as a particular TF domain termed a basin of attraction. Here we discuss two new approaches to mode reconstruction. The first determines the ridge associated with a mode by considering the location where the direction of the reassignment vector sharply changes, the technique used to determine the basin of attraction being directly derived from that used for ridge extraction. A second uses the fact that the STFT of a signal is fully characterized by its zeros (and then the particular distribution of these zeros for Gaussian noise) to deduce an algorithm to compute the mode domains. For both techniques, mode reconstruction is then carried out by simply integrating the information inside these basins of attraction or domains. PMID:26953184
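
    As a point of reference for the ridge idea, the following sketch extracts a crude ridge from an STFT by taking the per-frame magnitude maximum; the reassignment-vector and zero-based methods discussed in the paper are considerably more refined. It assumes NumPy/SciPy, and the names are illustrative.

    ```python
    # Crude ridge extraction from an STFT magnitude: strongest bin per frame.
    import numpy as np
    from scipy.signal import stft

    def extract_ridge(x, fs, nperseg=256):
        """Return (times, ridge_frequencies) for the dominant ridge of x."""
        f, t, Z = stft(x, fs=fs, nperseg=nperseg)
        ridge_idx = np.argmax(np.abs(Z), axis=0)   # per-frame magnitude maximum
        return t, f[ridge_idx]

    # Example: a chirp whose ridge should sweep from ~5 Hz to ~20 Hz.
    fs = 200.0
    t = np.arange(0, 4, 1 / fs)
    x = np.cos(2 * np.pi * (5 * t + 1.875 * t**2))   # instantaneous freq 5 + 3.75 t
    times, ridge = extract_ridge(x, fs)
    ```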

  20. Sensitivity of a computer adaptive assessment for measuring functional mobility changes in children enrolled in a community fitness programme.

    PubMed

    Haley, Stephen M; Fragala-Pinkham, Maria; Ni, Pengsheng

    2006-07-01

    To examine the relative sensitivity to detect functional mobility changes with a full-length parent questionnaire compared with a computerized adaptive testing version of the questionnaire after a 16-week group fitness programme. Prospective, pre- and posttest study with a 16-week group fitness intervention. Three community-based fitness centres. Convenience sample of children (n = 28) with physical or developmental disabilities. A 16-week group exercise programme held twice a week in a community setting. A full-length (161 items) paper version of a mobility parent questionnaire based on the Pediatric Evaluation of Disability Inventory, but expanded to include expected skills of children up to 15 years old was compared with a 15-item computer adaptive testing version. Both measures were administered at pre- and posttest intervals. Both the full-length Pediatric Evaluation of Disability Inventory and the 15-item computer adaptive testing version detected significant changes between pre- and posttest scores, had large effect sizes, and standardized response means, with a modest decrease in the computer adaptive test as compared with the 161-item paper version. Correlations between the computer adaptive and paper formats across pre- and posttest scores ranged from r = 0.76 to 0.86. Both functional mobility test versions were able to detect positive functional changes at the end of the intervention period. Greater variability in score estimates was generated by the computerized adaptive testing version, which led to a relative reduction in sensitivity as defined by the standardized response mean. Extreme scores were generally more difficult for the computer adaptive format to estimate with as much accuracy as scores in the mid-range of the scale. However, the reduction in accuracy and sensitivity, which did not influence the group effect results in this study, is counterbalanced by the large reduction in testing burden.
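
    The two sensitivity statistics referred to here can be computed directly from pre/post scores; the sketch below uses illustrative values, not the study's data.

    ```python
    # Effect size (mean change / SD of baseline scores) and standardized response
    # mean (mean change / SD of the change scores), the statistics named above.
    import numpy as np

    def effect_size(pre, post):
        change = np.asarray(post, float) - np.asarray(pre, float)
        return change.mean() / np.std(pre, ddof=1)

    def standardized_response_mean(pre, post):
        change = np.asarray(post, float) - np.asarray(pre, float)
        return change.mean() / np.std(change, ddof=1)

    pre = np.array([42.0, 55.0, 38.0, 61.0, 47.0])
    post = np.array([45.0, 58.0, 44.0, 62.0, 51.0])
    print(effect_size(pre, post), standardized_response_mean(pre, post))
    ```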

  1. Computers for the Disabled.

    ERIC Educational Resources Information Center

    Lazzaro, Joseph J.

    1993-01-01

    Describes adaptive technology for personal computers that accommodate disabled users and may require special equipment including hardware, memory, expansion slots, and ports. Highlights include vision aids, including speech synthesizers, magnification, braille, and optical character recognition (OCR); hearing adaptations; motor-impaired…

  2. An adaptive brain actuated system for augmenting rehabilitation

    PubMed Central

    Roset, Scott A.; Gant, Katie; Prasad, Abhishek; Sanchez, Justin C.

    2014-01-01

    For people living with paralysis, restoration of hand function remains the top priority because it leads to independence and improvement in quality of life. In approaches to restore hand and arm function, a goal is to better engage voluntary control and counteract maladaptive brain reorganization that results from non-use. Standard rehabilitation augmented with developments from the study of brain-computer interfaces could provide a combined therapy approach for motor cortex rehabilitation and to alleviate motor impairments. In this paper, an adaptive brain-computer interface system intended for application to control a functional electrical stimulation (FES) device is developed as an experimental test bed for augmenting rehabilitation with a brain-computer interface. The system's performance is improved throughout rehabilitation by passive user feedback and reinforcement learning. By continuously adapting to the user's brain activity, similar adaptive systems could be used to support clinical brain-computer interface neurorehabilitation over multiple days. PMID:25565945

  3. Introducing Enabling Computational Tools to the Climate Sciences: Multi-Resolution Climate Modeling with Adaptive Cubed-Sphere Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jablonowski, Christiane

    The research investigates and advances strategies for bridging the scale discrepancies between local, regional and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically-adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway for modeling these interactions effectively with advanced computational tools, like the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids has been developed that serves as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The foci of the investigations have lain on the characteristics of both static mesh adaptations and dynamically-adaptive grids that can capture flow fields of interest like tropical cyclones. Six research themes have been chosen. These are (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling. The results of this research project demonstrate significant advances in all six research areas. The major conclusions are that statically-adaptive variable-resolution modeling is currently becoming mature in the climate sciences, and that AMR holds outstanding promise for future-generation weather and climate models on high-performance computing architectures.

  4. Adaptation to low pH and lignocellulosic inhibitors resulting in ethanolic fermentation and growth of Saccharomyces cerevisiae

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Narayanan, Venkatachalam; Sànchez i Nogué, Violeta; van Niel, Ed W. J.

    Here, lignocellulosic bioethanol from renewable feedstocks using Saccharomyces cerevisiae is a promising alternative to fossil fuels owing to environmental challenges. S. cerevisiae is frequently challenged by bacterial contamination and a combination of lignocellulosic inhibitors formed during the pre-treatment, in terms of growth, ethanol yield and productivity. We investigated the phenotypic robustness of a brewing yeast strain TMB3500 and its ability to adapt to low pH thereby preventing bacterial contamination along with lignocellulosic inhibitors by short-term adaptation and adaptive lab evolution (ALE). The short-term adaptation strategy was used to investigate the inherent ability of strain TMB3500 to activate a robust phenotype involving pre-culturing yeast cells in defined medium with lignocellulosic inhibitors at pH 5.0 until late exponential phase prior to inoculating them in defined media with the same inhibitor cocktail at pH 3.7. Adapted cells were able to grow aerobically, ferment anaerobically (glucose exhaustion by 19 +/- 5 h to yield 0.45 +/- 0.01 g ethanol per g glucose) and portray significant detoxification of inhibitors at pH 3.7, when compared to non-adapted cells. ALE was performed to investigate whether a stable strain could be developed to grow and ferment at low pH with lignocellulosic inhibitors in a continuous suspension culture. Though a robust population was obtained after 3600 h with an ability to grow and ferment at pH 3.7 with inhibitors, inhibitor robustness was not stable as indicated by the characterisation of the evolved culture, possibly due to phenotypic plasticity. With further research, this short-term adaptation and low pH strategy could be successfully applied in lignocellulosic ethanol plants to prevent bacterial contamination.

  5. Adaptation to low pH and lignocellulosic inhibitors resulting in ethanolic fermentation and growth of Saccharomyces cerevisiae

    DOE PAGES

    Narayanan, Venkatachalam; Sànchez i Nogué, Violeta; van Niel, Ed W. J.; ...

    2016-08-26

    Here, lignocellulosic bioethanol from renewable feedstocks using Saccharomyces cerevisiae is a promising alternative to fossil fuels owing to environmental challenges. S. cerevisiae is frequently challenged by bacterial contamination and a combination of lignocellulosic inhibitors formed during the pre-treatment, in terms of growth, ethanol yield and productivity. We investigated the phenotypic robustness of a brewing yeast strain TMB3500 and its ability to adapt to low pH thereby preventing bacterial contamination along with lignocellulosic inhibitors by short-term adaptation and adaptive lab evolution (ALE). The short-term adaptation strategy was used to investigate the inherent ability of strain TMB3500 to activate a robust phenotype involving pre-culturing yeast cells in defined medium with lignocellulosic inhibitors at pH 5.0 until late exponential phase prior to inoculating them in defined media with the same inhibitor cocktail at pH 3.7. Adapted cells were able to grow aerobically, ferment anaerobically (glucose exhaustion by 19 +/- 5 h to yield 0.45 +/- 0.01 g ethanol per g glucose) and portray significant detoxification of inhibitors at pH 3.7, when compared to non-adapted cells. ALE was performed to investigate whether a stable strain could be developed to grow and ferment at low pH with lignocellulosic inhibitors in a continuous suspension culture. Though a robust population was obtained after 3600 h with an ability to grow and ferment at pH 3.7 with inhibitors, inhibitor robustness was not stable as indicated by the characterisation of the evolved culture, possibly due to phenotypic plasticity. With further research, this short-term adaptation and low pH strategy could be successfully applied in lignocellulosic ethanol plants to prevent bacterial contamination.

  6. ICCE/ICCAI 2000 Full & Short Papers (Artificial Intelligence in Education).

    ERIC Educational Resources Information Center

    2000

    This document contains the full and short papers on artificial intelligence in education from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction) covering the following topics: a computational model for learners' motivation states in individualized tutoring system; a…

  7. ICCE/ICCAI 2000 Full & Short Papers (Student Modeling).

    ERIC Educational Resources Information Center

    2000

    This document contains the following full and short papers on student modeling from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction): (1) "A Computational Model for Learner's Motivation States in Individualized Tutoring System" (Behrouz H. Far and Anete H.…

  8. Numerical modeling of landslide-generated tsunami using adaptive unstructured meshes

    NASA Astrophysics Data System (ADS)

    Wilson, Cian; Collins, Gareth; Desousa Costa, Patrick; Piggott, Matthew

    2010-05-01

    Landslides impacting into or occurring under water generate waves, which can have devastating environmental consequences. Depending on the characteristics of the landslide the waves can have significant amplitude and potentially propagate over large distances. Linear models of classical earthquake-generated tsunamis cannot reproduce the highly nonlinear generation mechanisms required to accurately predict the consequences of landslide-generated tsunamis. Also, laboratory-scale experimental investigation is limited to simple geometries and short time-scales before wave reflections contaminate the data. Computational fluid dynamics models based on the nonlinear Navier-Stokes equations can simulate landslide-tsunami generation at realistic scales. However, traditional chessboard-like structured meshes introduce superfluous resolution and hence the computing power required for such a simulation can be prohibitively high, especially in three dimensions. Unstructured meshes allow the grid spacing to vary rapidly from high resolution in the vicinity of small scale features to much coarser, lower resolution in other areas. Combining this variable resolution with dynamic mesh adaptivity allows such high resolution zones to follow features like the interface between the landslide and the water whilst minimising the computational costs. Unstructured meshes are also better suited to representing complex geometries and bathymetries allowing more realistic domains to be simulated. Modelling multiple materials, like water, air and a landslide, on an unstructured adaptive mesh poses significant numerical challenges. Novel methods of interface preservation must be considered and coupled to a flow model in such a way that ensures conservation of the different materials. Furthermore this conservation property must be maintained during successive stages of mesh optimisation and interpolation. In this paper we validate a new multi-material adaptive unstructured fluid dynamics model against the well-known Lituya Bay landslide-generated wave experiment and case study [1]. In addition, we explore the effect of physical parameters, such as the shape, velocity and viscosity of the landslide, on wave amplitude and run-up, to quantify their influence on the landslide-tsunami hazard. As well as reproducing the experimental results, the model is shown to have excellent conservation and bounding properties. It also requires fewer nodes than an equivalent resolution fixed mesh simulation, therefore minimising at least one aspect of the computational cost. These computational savings are directly transferable to higher dimensions and some initial three dimensional results are also presented. These reproduce the experiments of DiRisio et al. [2], where an 80cm long landslide analogue was released from the side of an 8.9m diameter conical island in a 50 × 30m tank of water. The resulting impact between the landslide and the water generated waves with an amplitude of 1cm at wave gauges around the island. The range of scales that must be considered in any attempt to numerically reproduce this experiment makes it an ideal case study for our multi-material adaptive unstructured fluid dynamics model. [1] FRITZ, H. M., MOHAMMED, F., & YOO, J. 2009. Lituya Bay Landslide Impact Generated Mega-Tsunami 50th Anniversary. Pure and Applied Geophysics, 166(1), 153-175. [2] DIRISIO, M., DEGIROLAMO, P., BELLOTTI, G., PANIZZO, A., ARISTODEMO, F.,

  9. Short-term differential adaptation to anaerobic stress via genomic mutations by Escherichia coli strains K-12 and B lacking alcohol dehydrogenase

    PubMed Central

    Kim, Hyun Ju; Jeong, Haeyoung; Hwang, Seungwoo; Lee, Moo-Seung; Lee, Yong-Jik; Lee, Dong-Woo; Lee, Sang Jun

    2014-01-01

    Microbial adaptations often occur via genomic mutations under adverse environmental conditions. This study used Escherichia coli ΔadhE cells as a model system to investigate adaptation to anaerobic conditions, which we then compared with the adaptive mechanisms of two closely related E. coli strains, K-12 and B. In contrast to K-12 ΔadhE cells, the E. coli B ΔadhE cells exhibited significantly delayed adaptive growth under anaerobic conditions. Adaptation by the K-12 and B strains mainly employed anaerobic lactate fermentation to restore cellular growth. Several mutations were identified in the pta or pflB genes of adapted K-12 cells, but mostly in the pta gene of the B strains. However, the types of mutation in the adapted K-12 and B strains were similar. Cellular viability was affected directly by severe redox imbalance in B ΔadhE cells, which also impaired their ability to adapt to anaerobic conditions. This study demonstrates that closely related microorganisms may undergo different adaptations under the same set of adverse conditions, which might be associated with the specific metabolic characteristics of each strain. This study provides new insights into short-term microbial adaptation to stressful conditions, which may reflect dynamic microbial population changes in nature. PMID:25250024

  10. Global Load Balancing with Parallel Mesh Adaption on Distributed-Memory Systems

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Oliker, Leonid; Sohn, Andrew

    1996-01-01

    Dynamic mesh adaptation on unstructured grids is a powerful tool for efficiently computing unsteady problems to resolve solution features of interest. Unfortunately, this causes load imbalances among processors on a parallel machine. This paper describes the parallel implementation of a tetrahedral mesh adaption scheme and a new global load balancing method. A heuristic remapping algorithm is presented that assigns partitions to processors such that the redistribution cost is minimized. Results indicate that the parallel performance of the mesh adaption code depends on the nature of the adaption region and show a 35.5X speedup on 64 processors of an SP2 when 35 percent of the mesh is randomly adapted. For large-scale scientific computations, our load balancing strategy gives an almost sixfold reduction in solver execution times over non-balanced loads. Furthermore, our heuristic remapper yields processor assignments that are within 3 percent of the optimal solutions, but requires only 1 percent of the computational time.
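
    As a rough illustration of the remapping idea (assign each new partition to a processor that already holds much of its data, so little has to move), here is a toy greedy assignment; it is not the heuristic from the paper.

    ```python
    # Toy greedy partition-to-processor assignment that keeps redistribution small.
    def greedy_remap(overlap):
        """overlap[p][q] = amount of partition p's data currently on processor q.
        Returns a partition -> processor assignment (one partition per processor)."""
        n = len(overlap)
        assignment, used = {}, set()
        # Consider the largest overlaps first.
        pairs = sorted(((overlap[p][q], p, q) for p in range(n) for q in range(n)),
                       reverse=True)
        for amount, p, q in pairs:
            if p not in assignment and q not in used:
                assignment[p] = q
                used.add(q)
        return assignment

    # Three partitions, three processors.
    overlap = [[5, 1, 0],
               [0, 4, 2],
               [1, 0, 6]]
    print(greedy_remap(overlap))  # {2: 2, 0: 0, 1: 1}
    ```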

  11. Gradient-free MCMC methods for dynamic causal modelling

    DOE PAGES

    Sengupta, Biswa; Friston, Karl J.; Penny, Will D.

    2015-03-14

    Here, we compare the performance of four gradient-free MCMC samplers (random walk Metropolis sampling, slice-sampling, adaptive MCMC sampling and population-based MCMC sampling with tempering) in terms of the number of independent samples they can produce per unit computational time. For the Bayesian inversion of a single-node neural mass model, both adaptive and population-based samplers are more efficient than random walk Metropolis sampling or slice-sampling; yet adaptive MCMC sampling is more promising in terms of compute time. Slice-sampling yields the highest number of independent samples from the target density -- albeit at an almost 1000% increase in computational time, in comparison to the most efficient algorithm (i.e., the adaptive MCMC sampler).
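
    For reference, a generic random-walk Metropolis sampler of the kind compared in the study is sketched below; the neural mass model inversion itself is not reproduced, and the target here is simply a standard normal.

    ```python
    # Generic random-walk Metropolis sampler (illustrative baseline only).
    import numpy as np

    def rw_metropolis(log_target, x0, n_samples, step=0.5, rng=None):
        rng = rng or np.random.default_rng(0)
        x = np.asarray(x0, dtype=float)
        samples, logp = [], log_target(x)
        for _ in range(n_samples):
            prop = x + step * rng.standard_normal(x.shape)   # symmetric proposal
            logp_prop = log_target(prop)
            if np.log(rng.random()) < logp_prop - logp:      # accept/reject
                x, logp = prop, logp_prop
            samples.append(x.copy())
        return np.array(samples)

    # Example: sample a 2-D standard normal.
    draws = rw_metropolis(lambda z: -0.5 * np.sum(z**2), x0=[0.0, 0.0], n_samples=5000)
    ```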

  12. A single exercise bout and locomotor learning after stroke: physiological, behavioural, and computational outcomes.

    PubMed

    Charalambous, Charalambos C; Alcantara, Carolina C; French, Margaret A; Li, Xin; Matt, Kathleen S; Kim, Hyosub E; Morton, Susanne M; Reisman, Darcy S

    2018-05-15

    Previous work demonstrated an effect of a single high-intensity exercise bout coupled with motor practice on the retention of a newly acquired skilled arm movement, in both neurologically intact and impaired adults. In the present study, using behavioural and computational analyses we demonstrated that a single exercise bout, regardless of its intensity and timing, did not increase the retention of a novel locomotor task after stroke. Considering both present and previous work, we postulate that the benefits of exercise effect may depend on the type of motor learning (e.g. skill learning, sensorimotor adaptation) and/or task (e.g. arm accuracy-tracking task, walking). Acute high-intensity exercise coupled with motor practice improves the retention of motor learning in neurologically intact adults. However, whether exercise could improve the retention of locomotor learning after stroke is still unknown. Here, we investigated the effect of exercise intensity and timing on the retention of a novel locomotor learning task (i.e. split-belt treadmill walking) after stroke. Thirty-seven people post stroke participated in two sessions, 24 h apart, and were allocated to active control (CON), treadmill walking (TMW), or total body exercise on a cycle ergometer (TBE). In session 1, all groups exercised for a short bout (∼5 min) at low (CON) or high (TMW and TBE) intensity and before (CON and TMW) or after (TBE) the locomotor learning task. In both sessions, the locomotor learning task was to walk on a split-belt treadmill in a 2:1 speed ratio (100% and 50% fast-comfortable walking speed) for 15 min. To test the effect of exercise on 24 h retention, we applied behavioural and computational analyses. Behavioural data showed that neither high-intensity group showed greater 24 h retention compared to CON, and computational data showed that 24 h retention was attributable to a slow learning process for sensorimotor adaptation. Our findings demonstrated that acute exercise coupled with a locomotor adaptation task, regardless of its intensity and timing, does not improve retention of the novel locomotor task after stroke. We postulate that exercise effects on motor learning may be context specific (e.g. type of motor learning and/or task) and interact with the presence of genetic variant (BDNF Val66Met). © 2018 The Authors. The Journal of Physiology © 2018 The Physiological Society.
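
    The "slow learning process" language presumably refers to the fast/slow state-space family of adaptation models commonly used in such computational analyses; a generic two-state sketch with illustrative (not fitted) parameters is shown below, and is not the authors' model.

    ```python
    # Generic two-state (fast/slow) state-space model of sensorimotor adaptation.
    import numpy as np

    def two_state_adaptation(perturbation, Af=0.92, Bf=0.10, As=0.996, Bs=0.02):
        xf, xs = 0.0, 0.0
        output = []
        for p in perturbation:
            x = xf + xs                # net adaptation expressed on this stride
            e = p - x                  # error experienced
            xf = Af * xf + Bf * e      # fast process: learns and forgets quickly
            xs = As * xs + Bs * e      # slow process: learns slowly, retains longer
            output.append(x)
        return np.array(output)

    # 600 strides of a constant split-belt-like perturbation.
    adaptation = two_state_adaptation(np.ones(600))
    ```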

  13. The Effect of Test and Examinee Characteristics on the Occurrence of Aberrant Response Patterns in a Computerized Adaptive Test

    ERIC Educational Resources Information Center

    Rizavi, Saba; Hariharan, Swaminathan

    2001-01-01

    The advantages that computer adaptive testing offers over linear tests have been well documented. The Computer Adaptive Test (CAT) design is more efficient than the Linear test design as fewer items are needed to estimate an examinee's proficiency to a desired level of precision. In the ideal situation, a CAT will result in examinees answering…

  14. Computer-Adaptive Testing for Students with Disabilities: A Review of the Literature. Research Report. ETS RR-11-32

    ERIC Educational Resources Information Center

    Stone, Elizabeth; Davey, Tim

    2011-01-01

    There has been an increased interest in developing computer-adaptive testing (CAT) and multistage assessments for K-12 accountability assessments. The move to adaptive testing has been met with some resistance by those in the field of special education who express concern about routing of students with divergent profiles (e.g., some students with…

  15. ICCE/ICCAI 2000 Full & Short Papers (Web-Based Learning).

    ERIC Educational Resources Information Center

    2000

    This document contains full and short papers on World Wide Web-based learning from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction). Topics covered include: design and development of CAL (Computer Assisted Learning) systems; design and development of WBI (Web-Based…

  16. An Adaptive Sensor Mining Framework for Pervasive Computing Applications

    NASA Astrophysics Data System (ADS)

    Rashidi, Parisa; Cook, Diane J.

    Analyzing sensor data in pervasive computing applications brings unique challenges to the KDD community. The challenge is heightened when the underlying data source is dynamic and the patterns change. We introduce a new adaptive mining framework that detects patterns in sensor data, and more importantly, adapts to the changes in the underlying model. In our framework, the frequent and periodic patterns of data are first discovered by the Frequent and Periodic Pattern Miner (FPPM) algorithm; and then any changes in the discovered patterns over the lifetime of the system are discovered by the Pattern Adaptation Miner (PAM) algorithm, in order to adapt to the changing environment. This framework also captures vital context information present in pervasive computing applications, such as the startup triggers and temporal information. In this paper, we present a description of our mining framework and validate the approach using data collected in the CASAS smart home testbed.
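
    As a simple stand-in for the statistics such a miner works from, the snippet below counts contiguous event patterns in a sensor stream; it is illustrative only and is not the published FPPM or PAM algorithm.

    ```python
    # Count contiguous event subsequences in a sensor event stream.
    from collections import Counter

    def frequent_patterns(events, length=2, min_support=2):
        """Return patterns of the given length that occur at least min_support times."""
        windows = zip(*(events[i:] for i in range(length)))
        counts = Counter(windows)
        return {pattern: n for pattern, n in counts.items() if n >= min_support}

    stream = ["door", "kitchen_light", "coffee", "door", "kitchen_light", "coffee",
              "tv", "door", "kitchen_light"]
    print(frequent_patterns(stream))  # e.g. ('door', 'kitchen_light'): 3
    ```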

  17. Intentionally Short-Range Communications (ISRC) Exploratory Development Plan

    DTIC Science & Technology

    1992-06-01

    range voice communication links. In the 1980s, NOSC developed a short-range, 2400-bps, computer-to-computer link for the USMC (UV Communications, or UV...Communication Links," Proc. Tact. Comm. Conf. 1, 60. Hislop, A. R. 1982. "A Head-Worn 60 GHz Communicator for Short Range Applications." NOSC TN 1153

  18. Automated Analysis of Short Responses in an Interactive Synthetic Tutoring System for Introductory Physics

    ERIC Educational Resources Information Center

    Nakamura, Christopher M.; Murphy, Sytil K.; Christel, Michael G.; Stevens, Scott M.; Zollman, Dean A.

    2016-01-01

    Computer-automated assessment of students' text responses to short-answer questions represents an important enabling technology for online learning environments. We have investigated the use of machine learning to train computer models capable of automatically classifying short-answer responses and assessed the results. Our investigations are part…

  19. Prolegomena to the field

    NASA Astrophysics Data System (ADS)

    Chen, Su Shing; Caulfield, H. John

    1994-03-01

    Adaptive Computing, as distinct from Classical Computing, is emerging as a field that represents the culmination of more than 40 years of work in various scientific and technological areas, including cybernetics, neural networks, pattern recognition networks, learning machines, self-reproducing automata, genetic algorithms, fuzzy logics, probabilistic logics, chaos, electronics, optics, and quantum devices. This volume of "Critical Reviews on Adaptive Computing: Mathematics, Electronics, and Optics" is intended as a synergistic approach to this emerging field. There are many researchers in these areas working on important results. However, we have not seen a general effort to summarize and synthesize these results in theory as well as implementation. In order to reach a higher level of synergism, we propose Adaptive Computing as the field that comprises the above-mentioned computational paradigms and their various realizations. The field should include both the Theory (or Mathematics) and the Implementation. Our emphasis is on the interplay of Theory and Implementation. The interplay, an adaptive process itself, of Theory and Implementation is the only "holistic" way to advance our understanding and realization of brain-like computation. We feel that a theory without implementation has the tendency to become unrealistic and "out-of-touch" with reality, while an implementation without theory runs the risk of becoming superficial and obsolete.

  20. High-speed on-chip windowed centroiding using photodiode-based CMOS imager

    NASA Technical Reports Server (NTRS)

    Pain, Bedabrata (Inventor); Sun, Chao (Inventor); Yang, Guang (Inventor); Cunningham, Thomas J. (Inventor); Hancock, Bruce (Inventor)

    2003-01-01

    A centroid computation system is disclosed. The system has an imager array, a switching network, computation elements, and a divider circuit. The imager array has columns and rows of pixels. The switching network is adapted to receive pixel signals from the image array. The plurality of computation elements operates to compute inner products for at least x and y centroids. The plurality of computation elements has only passive elements to provide inner products of pixel signals from the switching network. The divider circuit is adapted to receive the inner products and compute the x and y centroids.
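
    In software terms, the quantity the circuit produces is the intensity-weighted centroid of a pixel window (inner products of pixel values with row/column indices, followed by a divide); the sketch below is illustrative and is not the patented analog implementation.

    ```python
    # Intensity-weighted centroid of a pixel window.
    import numpy as np

    def window_centroid(window):
        """window: 2-D array of pixel intensities. Returns (x_centroid, y_centroid)."""
        window = np.asarray(window, dtype=float)
        total = window.sum()
        ys, xs = np.indices(window.shape)
        # Inner products of pixel values with column/row indices, then a divide.
        return (xs * window).sum() / total, (ys * window).sum() / total

    spot = np.array([[0, 1, 0],
                     [1, 8, 2],
                     [0, 1, 0]])
    print(window_centroid(spot))  # slightly right of center: (~1.08, 1.0)
    ```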

  1. High-speed on-chip windowed centroiding using photodiode-based CMOS imager

    NASA Technical Reports Server (NTRS)

    Pain, Bedabrata (Inventor); Sun, Chao (Inventor); Yang, Guang (Inventor); Cunningham, Thomas J. (Inventor); Hancock, Bruce (Inventor)

    2004-01-01

    A centroid computation system is disclosed. The system has an imager array, a switching network, computation elements, and a divider circuit. The imager array has columns and rows of pixels. The switching network is adapted to receive pixel signals from the image array. The plurality of computation elements operates to compute inner products for at least x and y centroids. The plurality of computation elements has only passive elements to provide inner products of pixel signals from the switching network. The divider circuit is adapted to receive the inner products and compute the x and y centroids.

  2. Cooperative Drought Adaptation: Integrating Infrastructure Development, Conservation, and Water Transfers into Adaptive Policy Pathways

    NASA Astrophysics Data System (ADS)

    Zeff, H. B.; Characklis, G. W.; Reed, P. M.; Herman, J. D.

    2015-12-01

    Water supply policies that integrate portfolios of short-term management decisions with long-term infrastructure development enable utilities to adapt to a range of future scenarios. An effective mix of short-term management actions can augment existing infrastructure, potentially forestalling new development. Likewise, coordinated expansion of infrastructure such as regional interconnections and shared treatment capacity can increase the effectiveness of some management actions like water transfers. Highly adaptable decision pathways that mix long-term infrastructure options and short-term management actions require decision triggers capable of incorporating the impact of these time-evolving decisions on growing water supply needs. Here, we adapt risk-based triggers to sequence a set of potential infrastructure options in combination with utility-specific conservation actions and inter-utility water transfers. Individual infrastructure pathways can be augmented with conservation or water transfers to reduce the cost of meeting utility objectives, but they can also include cooperatively developed, shared infrastructure that expands regional capacity to transfer water. This analysis explores the role of cooperation among four water utilities in the 'Research Triangle' region of North Carolina by formulating three distinct categories of adaptive policy pathways: independent action (utility-specific conservation and supply infrastructure only), weak cooperation (utility-specific conservation and infrastructure development with regional transfers), and strong cooperation (utility-specific conservation and jointly developed regional infrastructure that supports transfers). Results suggest that strong cooperation aids the utilities in meeting their individual objectives at substantially lower costs and with fewer irreversible infrastructure options.

  3. Unstructured mesh adaptivity for urban flooding modelling

    NASA Astrophysics Data System (ADS)

    Hu, R.; Fang, F.; Salinas, P.; Pain, C. C.

    2018-05-01

    Over the past few decades, urban floods have been gaining more attention due to their increase in frequency. To provide reliable flooding predictions in urban areas, various numerical models have been developed to perform high-resolution flood simulations. However, the use of high-resolution meshes across the whole computational domain causes a high computational burden. In this paper, a 2D control-volume and finite-element flood model using adaptive unstructured mesh technology has been developed. This adaptive unstructured mesh technique enables meshes to be adapted optimally in time and space in response to the evolving flow features, thus providing sufficient mesh resolution where and when it is required. It has the advantage of capturing the details of local flows and of the wetting and drying front while reducing the computational cost. Complex topographic features are represented accurately during the flooding process. For example, the high-resolution meshes around the buildings and steep regions are placed when the flooding water reaches these regions. In this work a flooding event that happened in 2002 in Glasgow, Scotland, United Kingdom has been simulated to demonstrate the capability of the adaptive unstructured mesh flooding model. The simulations have been performed using both fixed and adaptive unstructured meshes, and the results have been compared with previously published 2D and 3D results. The presented method shows that the 2D adaptive mesh model provides accurate results while having a low computational cost.

  4. Career Adapt-Abilities Scale in a French-Speaking Swiss Sample: Psychometric Properties and Relationships to Personality and Work Engagement

    ERIC Educational Resources Information Center

    Rossier, Jerome; Zecca, Gregory; Stauffer, Sarah D.; Maggiori, Christian; Dauwalder, Jean-Pierre

    2012-01-01

    The aim of this study was to analyze the psychometric properties of the Career Adapt-Abilities Scale (CAAS) in a French-speaking Swiss sample and its relationship with personality dimensions and work engagement. The heterogeneous sample of 391 participants (M_age = 39.59, SD = 12.30) completed the CAAS-International and a short version…

  5. Adaptive Nulling for the Terrestrial Planet Finder Interferometer

    NASA Technical Reports Server (NTRS)

    Peters, Robert D.; Lay, Oliver P.; Jeganathan, Muthu; Hirai, Akiko

    2006-01-01

    A description of adaptive nulling for the Terrestrial Planet Finder Interferometer (TPF-I) is presented. The topics include: 1) Nulling in TPF-I; 2) Why Do Adaptive Nulling; 3) Parallel High-Order Compensator Design; 4) Phase and Amplitude Control; 5) Development Activities; 6) Requirements; 7) Simplified Experimental Setup; 8) Intensity Correction; and 9) Intensity Dispersion Stability. A short summary is also given on adaptive nulling for TPF-I.

  6. Stability and error estimation for Component Adaptive Grid methods

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph; Zhu, Xiaolei

    1994-01-01

    Component adaptive grid (CAG) methods for solving hyperbolic partial differential equations (PDE's) are discussed in this paper. Applying recent stability results for a class of numerical methods on uniform grids, the convergence of these methods for linear problems on component adaptive grids is established here. Furthermore, the computational error can be estimated on CAG's using the stability results. Using these estimates, the error can be controlled on CAG's. Thus, the solution can be computed efficiently on CAG's within a given error tolerance. Computational results for time-dependent linear problems in one and two space dimensions are presented.

  7. A Weibull distribution accrual failure detector for cloud computing.

    PubMed

    Liu, Jiaxi; Wu, Zhibo; Wu, Jin; Dong, Jian; Zhao, Yao; Wen, Dongxin

    2017-01-01

    Failure detectors are used to build high availability distributed systems as the fundamental component. To meet the requirement of a complicated large-scale distributed system, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on Weibull Distribution, called the Weibull Distribution Failure Detector, has been proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions in cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared based on public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing.
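
    The general accrual construction behind such a detector can be sketched as follows: fit a Weibull distribution to observed heartbeat inter-arrival times and report a suspicion level phi equal to minus the base-10 logarithm of the probability that the current silence is still consistent with a live process. The code below is a generic illustration (fixed shape parameter, scale derived from the sample mean), not the authors' estimator.

    ```python
    # Generic Weibull-based accrual failure detector sketch.
    import math

    class WeibullAccrualDetector:
        def __init__(self, shape=1.5):
            self.shape = shape          # Weibull shape k, assumed fixed here
            self.intervals = []
            self.last_heartbeat = None

        def heartbeat(self, now):
            if self.last_heartbeat is not None:
                self.intervals.append(now - self.last_heartbeat)
            self.last_heartbeat = now

        def phi(self, now):
            """Suspicion level: -log10 P(next heartbeat arrives later than now)."""
            if not self.intervals:
                return 0.0
            mean = sum(self.intervals) / len(self.intervals)
            scale = mean / math.gamma(1 + 1 / self.shape)   # Weibull scale from mean
            t = now - self.last_heartbeat
            survival = math.exp(-((t / scale) ** self.shape))
            return -math.log10(max(survival, 1e-300))

    det = WeibullAccrualDetector()
    for t in (0.0, 1.0, 2.1, 3.0):   # regular heartbeats ~1 s apart
        det.heartbeat(t)
    print(det.phi(3.5), det.phi(8.0))  # small vs. large suspicion
    ```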

  8. Three-dimensional geoelectric modelling with optimal work/accuracy rate using an adaptive wavelet algorithm

    NASA Astrophysics Data System (ADS)

    Plattner, A.; Maurer, H. R.; Vorloeper, J.; Dahmen, W.

    2010-08-01

    Despite the ever-increasing power of modern computers, realistic modelling of complex 3-D earth models is still a challenging task and requires substantial computing resources. The overwhelming majority of current geophysical modelling approaches includes either finite difference or non-adaptive finite element algorithms and variants thereof. These numerical methods usually require the subsurface to be discretized with a fine mesh to accurately capture the behaviour of the physical fields. However, this may result in excessive memory consumption and computing times. A common feature of most of these algorithms is that the modelled data discretizations are independent of the model complexity, which may be wasteful when there are only minor to moderate spatial variations in the subsurface parameters. Recent developments in the theory of adaptive numerical solvers have the potential to overcome this problem. Here, we consider an adaptive wavelet-based approach that is applicable to a large range of problems, also including nonlinear problems. In comparison with earlier applications of adaptive solvers to geophysical problems we employ here a new adaptive scheme whose core ingredients arose from a rigorous analysis of the overall asymptotically optimal computational complexity, including in particular, an optimal work/accuracy rate. Our adaptive wavelet algorithm offers several attractive features: (i) for a given subsurface model, it allows the forward modelling domain to be discretized with a quasi minimal number of degrees of freedom, (ii) sparsity of the associated system matrices is guaranteed, which makes the algorithm memory efficient and (iii) the modelling accuracy scales linearly with computing time. We have implemented the adaptive wavelet algorithm for solving 3-D geoelectric problems. To test its performance, numerical experiments were conducted with a series of conductivity models exhibiting varying degrees of structural complexity. Results were compared with a non-adaptive finite element algorithm, which incorporates an unstructured mesh to best-fitting subsurface boundaries. Such algorithms represent the current state-of-the-art in geoelectric modelling. An analysis of the numerical accuracy as a function of the number of degrees of freedom revealed that the adaptive wavelet algorithm outperforms the finite element solver for simple and moderately complex models, whereas the results become comparable for models with high spatial variability of electrical conductivities. The linear dependence of the modelling error and the computing time proved to be model-independent. This feature will allow very efficient computations using large-scale models as soon as our experimental code is optimized in terms of its implementation.

  9. Technical guidance and analytic services in support of SEASAT-A. [radar altimeters for altimetry and ocean wave height

    NASA Technical Reports Server (NTRS)

    Brooks, W. L.; Dooley, R. P.

    1975-01-01

    The design of a high resolution radar for altimetry and ocean wave height estimation was studied. From basic principles, it is shown that a short pulse wide beam radar is the most appropriate and recommended technique for measuring both altitude and ocean wave height. To achieve a topographic resolution of + or - 10 cm RMS at 5.0 meter RMS wave heights, as required for SEASAT-A, it is recommended that the altimeter design include an onboard adaptive processor. The resulting design, which assumes a maximum likelihood estimation (MLE) processor, is shown to satisfy all performance requirements. A design summary is given for the recommended radar altimeter, which includes a full deramp STRETCH pulse compression technique followed by an analog filter bank to separate range returns as well as the assumed MLE processor. The feedback loop implementation of the MLE on a digital computer was examined in detail, and computer size, estimation accuracies, and bias due to range sidelobes are given for the MLE with typical SEASAT-A parameters. The standard deviation of the altitude estimate was developed and evaluated for several adaptive and nonadaptive split-gate trackers. Split-gate tracker biases due to range sidelobes and transmitter noise are examined. An approximate closed form solution for the altimeter power return is derived and evaluated. The feasibility of utilizing the basic radar altimeter design for the measurement of ocean wave spectra was examined.

  10. Evolutionary computing based approach for the removal of ECG artifact from the corrupted EEG signal.

    PubMed

    Priyadharsini, S Suja; Rajan, S Edward

    2014-01-01

    Electroencephalogram (EEG) is an important tool for the clinical diagnosis of brain-related disorders and problems. However, it is corrupted by various biological artifacts, of which ECG is one that reduces the clinical usefulness of EEG, especially for epileptic patients and patients with short necks. The aim is to remove the ECG artifact from the measured EEG signal using an evolutionary computing approach based on the concept of a Hybrid Adaptive Neuro-Fuzzy Inference System (ANFIS), which helps neurologists in the diagnosis and follow-up of encephalopathy. The proposed hybrid learning methods are ANFIS-MA and ANFIS-GA, which use a Memetic Algorithm (MA) and a Genetic Algorithm (GA), respectively, for tuning the antecedent and consequent parts of the ANFIS structure individually. The performance of the proposed methods is compared with that of ANFIS and an adaptive Recursive Least Squares (RLS) filtering algorithm. The proposed methods are experimentally validated by applying them to simulated data sets subjected to non-linearity conditions and to real polysomnograph data sets. Performance metrics of the proposed ANFIS-MA method, namely sensitivity, specificity and accuracy in terms of correction rate, are found to be 93.8%, 100% and 99%, respectively, which is better than current state-of-the-art approaches. The evaluation demonstrates that ANFIS-MA is more effective in suppressing ECG artifacts from corrupted EEG signals than ANFIS-GA, ANFIS and the RLS algorithm.
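
    For context, the RLS baseline mentioned above is the textbook adaptive noise canceller in which the ECG channel serves as the reference input and the filter's a priori error is the cleaned EEG; the sketch below is a generic implementation, not the authors' configuration.

    ```python
    # Textbook exponentially-weighted RLS adaptive noise canceller.
    import numpy as np

    def rls_cancel(eeg, ecg_ref, order=8, lam=0.99, delta=100.0):
        """Return the cleaned EEG (the a priori error of the RLS filter)."""
        w = np.zeros(order)
        P = np.eye(order) * delta
        cleaned = np.zeros(len(eeg))
        for n in range(order - 1, len(eeg)):
            u = ecg_ref[n - order + 1:n + 1][::-1]   # most recent reference samples
            k = P @ u / (lam + u @ P @ u)            # gain vector
            e = eeg[n] - w @ u                       # a priori error = cleaned sample
            w = w + k * e
            P = (P - np.outer(k, u @ P)) / lam
            cleaned[n] = e
        return cleaned

    # Synthetic example: EEG-like noise plus a scaled copy of the ECG reference.
    rng = np.random.default_rng(0)
    ecg = np.sin(2 * np.pi * 1.2 * np.arange(0, 10, 1 / 250))
    eeg = 0.5 * rng.standard_normal(ecg.size) + 0.8 * ecg
    print(np.corrcoef(rls_cancel(eeg, ecg), ecg)[0, 1])  # much smaller than corrcoef(eeg, ecg)
    ```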

  11. Adaptive Crack Modeling with Interface Solid Elements for Plain and Fiber Reinforced Concrete Structures.

    PubMed

    Zhan, Yijian; Meschke, Günther

    2017-07-08

    The effective analysis of the nonlinear behavior of cement-based engineering structures not only demands physically-reliable models, but also computationally-efficient algorithms. Based on a continuum interface element formulation that is suitable to capture complex cracking phenomena in concrete materials and structures, an adaptive mesh processing technique is proposed for computational simulations of plain and fiber-reinforced concrete structures to progressively disintegrate the initial finite element mesh and to add degenerated solid elements into the interfacial gaps. In comparison with the implementation where the entire mesh is processed prior to the computation, the proposed adaptive cracking model allows simulating the failure behavior of plain and fiber-reinforced concrete structures with remarkably reduced computational expense.

  12. Adaptive Crack Modeling with Interface Solid Elements for Plain and Fiber Reinforced Concrete Structures

    PubMed Central

    Zhan, Yijian

    2017-01-01

    The effective analysis of the nonlinear behavior of cement-based engineering structures not only demands physically-reliable models, but also computationally-efficient algorithms. Based on a continuum interface element formulation that is suitable to capture complex cracking phenomena in concrete materials and structures, an adaptive mesh processing technique is proposed for computational simulations of plain and fiber-reinforced concrete structures to progressively disintegrate the initial finite element mesh and to add degenerated solid elements into the interfacial gaps. In comparison with the implementation where the entire mesh is processed prior to the computation, the proposed adaptive cracking model allows simulating the failure behavior of plain and fiber-reinforced concrete structures with remarkably reduced computational expense. PMID:28773130

  13. Architecture-Adaptive Computing Environment: A Tool for Teaching Parallel Programming

    NASA Technical Reports Server (NTRS)

    Dorband, John E.; Aburdene, Maurice F.

    2002-01-01

    Recently, networked and cluster computation have become very popular. This paper is an introduction to a new C-based parallel language for architecture-adaptive programming, aCe C. The primary purpose of aCe (Architecture-adaptive Computing Environment) is to encourage programmers to implement applications on parallel architectures by providing them the assurance that future architectures will be able to run their applications with a minimum of modification. A secondary purpose is to encourage computer architects to develop new types of architectures by providing an easily implemented software development environment and a library of test applications. This new language should be an ideal tool to teach parallel programming. In this paper, we focus on some fundamental features of aCe C.

  14. The Psychological Well-Being and Sociocultural Adaptation of Short-Term International Students in Ireland

    ERIC Educational Resources Information Center

    O'Reilly, Aileen; Ryan, Dermot; Hickey, Tina

    2010-01-01

    This article reports on an empirical study of the psychosocial adaptation of international students in Ireland. Using measures of social support, loneliness, stress, psychological well-being, and sociocultural adaptation, data were obtained from international students and a comparison sample of Irish students. The study found that, although…

  15. Digital and biological computing in organizations.

    PubMed

    Kampfner, Roberto R

    2002-01-01

    Michael Conrad unveiled many of the fundamental characteristics of biological computing. These characteristics underlie the behavioral variability and adaptability of biological systems, and include the ability of biological information processing to exploit quantum features at the atomic level, the powerful 3-D pattern recognition capabilities of macromolecules, computational efficiency, and the ability to support biological function. Among many other things, Conrad formalized and explicated the underlying principles of biological adaptability, characterized the differences between biological and digital computing in terms of a fundamental tradeoff between the adaptability and programmability of information processing, and discussed the challenges of interfacing digital computers and human society. This paper is about the encounter of biological and digital computing. The focus is on the nature of the biological information processing infrastructure of organizations and how it can be extended effectively with digital computing. To achieve this goal, however, we need to properly embed digital computing into the information processing aspects of human and social behavior and intelligence, which are fundamentally biological. Conrad's legacy provides a firm, strong, and inspiring foundation for this endeavor.

  16. Let Documents Talk to Each Other: A Computer Model for Connection of Short Documents.

    ERIC Educational Resources Information Center

    Chen, Z.

    1993-01-01

    Discusses the integration of scientific texts through the connection of documents and describes a computer model that can connect short documents. Information retrieval and artificial intelligence are discussed; a prototype system of the model is explained; and the model is compared to other computer models. (17 references) (LRW)

  17. Making the Computer Fit the Child Rather than the Child Fit the Computer: Conversations between Children and Robots.

    ERIC Educational Resources Information Center

    Draper, Thomas W.; And Others

    This paper introduces and develops the premise that technology should be used as a tool to be adapted to early childhood education rather than adapting the preschool curriculum to computers. Although recent evidence suggests a national interest in having high technology play a role in the teaching of young children, particularly in reading,…

  18. Sequential decision making in computational sustainability via adaptive submodularity

    USGS Publications Warehouse

    Krause, Andreas; Golovin, Daniel; Converse, Sarah J.

    2015-01-01

    Many problems in computational sustainability require making a sequence of decisions in complex, uncertain environments. Such problems are generally notoriously difficult. In this article, we review the recently discovered notion of adaptive submodularity, an intuitive diminishing returns condition that generalizes the classical notion of submodular set functions to sequential decision problems. Problems exhibiting the adaptive submodularity property can be efficiently and provably near-optimally solved using simple myopic policies. We illustrate this concept in several case studies of interest in computational sustainability: First, we demonstrate how it can be used to efficiently plan for resolving uncertainty in adaptive management scenarios. Secondly, we show how it applies to dynamic conservation planning for protecting endangered species, a case study carried out in collaboration with the US Geological Survey and the US Fish and Wildlife Service.
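
    A minimal sketch of the adaptive greedy policy that adaptive submodularity justifies is shown below; the coverage problem, action names and outcome probabilities are hypothetical, not the paper's case studies:

    ```python
    # Minimal sketch: adaptive greedy policy for adaptive stochastic coverage.
    # Each sensing action covers a random set of sites whose realization is
    # only observed after the action is taken.
    import random

    # Hypothetical problem data: each action maps to (probability, covered sites)
    # outcomes. In a conservation setting, "sites" could be habitat patches.
    ACTIONS = {
        "survey_A": [(0.5, {1, 2, 3}), (0.5, {1})],
        "survey_B": [(0.7, {3, 4}),    (0.3, set())],
        "survey_C": [(1.0, {5})],
        "survey_D": [(0.4, {2, 4, 5}), (0.6, {4})],
    }

    def expected_gain(action, covered):
        """Expected number of newly covered sites, given what is covered so far."""
        return sum(p * len(sites - covered) for p, sites in ACTIONS[action])

    def sample_outcome(action):
        r, acc = random.random(), 0.0
        for p, sites in ACTIONS[action]:
            acc += p
            if r <= acc:
                return sites
        return ACTIONS[action][-1][1]

    def adaptive_greedy(budget=2, seed=1):
        random.seed(seed)
        covered, chosen = set(), []
        remaining = set(ACTIONS)
        for _ in range(budget):
            # Myopically pick the action with the largest expected marginal gain
            # conditioned on the outcomes observed so far.
            best = max(remaining, key=lambda a: expected_gain(a, covered))
            outcome = sample_outcome(best)       # observe the realized coverage
            covered |= outcome
            chosen.append((best, sorted(outcome)))
            remaining.remove(best)
        return chosen, covered

    plan, covered = adaptive_greedy()
    print("actions taken:", plan)
    print("sites covered:", sorted(covered))
    ```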

  19. Impact of Load Balancing on Unstructured Adaptive Grid Computations for Distributed-Memory Multiprocessors

    NASA Technical Reports Server (NTRS)

    Sohn, Andrew; Biswas, Rupak; Simon, Horst D.

    1996-01-01

    The computational requirements for an adaptive solution of unsteady problems change as the simulation progresses. This causes workload imbalance among processors on a parallel machine which, in turn, requires significant data movement at runtime. We present a new dynamic load-balancing framework, called JOVE, that balances the workload across all processors with a global view. Whenever the computational mesh is adapted, JOVE is activated to eliminate the load imbalance. JOVE has been implemented on an IBM SP2 distributed-memory machine in MPI for portability. Experimental results for two model meshes demonstrate that mesh adaption with load balancing gives more than a sixfold improvement over one without load balancing. We also show that JOVE gives a 24-fold speedup on 64 processors compared to sequential execution.

  20. Gradient-free MCMC methods for dynamic causal modelling.

    PubMed

    Sengupta, Biswa; Friston, Karl J; Penny, Will D

    2015-05-15

    In this technical note we compare the performance of four gradient-free MCMC samplers (random walk Metropolis sampling, slice-sampling, adaptive MCMC sampling and population-based MCMC sampling with tempering) in terms of the number of independent samples they can produce per unit computational time. For the Bayesian inversion of a single-node neural mass model, both adaptive and population-based samplers are more efficient compared with random walk Metropolis sampler or slice-sampling; yet adaptive MCMC sampling is more promising in terms of compute time. Slice-sampling yields the highest number of independent samples from the target density - albeit at almost 1000% increase in computational time, in comparison to the most efficient algorithm (i.e., the adaptive MCMC sampler). Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
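
    As a concrete reference point, the sketch below implements one of the gradient-free strategies discussed, a random-walk Metropolis sampler with a simple adaptive proposal scale; the 2-D Gaussian target and the tuning rule are illustrative assumptions, not the dynamic causal modelling setup:

    ```python
    # Minimal sketch: random-walk Metropolis with an adaptive step size that
    # targets a fixed acceptance rate.
    import numpy as np

    def log_target(theta):
        # Hypothetical unnormalized log-posterior: a 2-D correlated Gaussian.
        cov = np.array([[1.0, 0.8], [0.8, 1.0]])
        return -0.5 * theta @ np.linalg.solve(cov, theta)

    def adaptive_rw_metropolis(n_samples=20000, target_accept=0.3, seed=0):
        rng = np.random.default_rng(seed)
        theta = np.zeros(2)
        log_p = log_target(theta)
        scale = 1.0
        samples, accepted = [], 0
        for i in range(1, n_samples + 1):
            proposal = theta + scale * rng.normal(size=2)   # random-walk proposal
            log_p_new = log_target(proposal)
            if np.log(rng.uniform()) < log_p_new - log_p:   # Metropolis accept
                theta, log_p = proposal, log_p_new
                accepted += 1
            # Robbins-Monro style adaptation of the proposal scale.
            rate = accepted / i
            scale *= np.exp((rate - target_accept) / np.sqrt(i))
            samples.append(theta.copy())
        return np.array(samples), accepted / n_samples

    samples, accept_rate = adaptive_rw_metropolis()
    print(f"acceptance rate: {accept_rate:.2f}, "
          f"posterior mean estimate: {samples[5000:].mean(axis=0).round(2)}")
    ```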

  1. Odor-context effects in free recall after a short retention interval: a new methodology for controlling adaptation.

    PubMed

    Isarida, Takeo; Sakai, Tetsuya; Kubota, Takayuki; Koga, Miho; Katayama, Yu; Isarida, Toshiko K

    2014-04-01

    The present study investigated context effects of incidental odors in free recall after a short retention interval (5 min). With a short retention interval, the results are not confounded by extraneous odors or encounters with the experimental odor and possible rehearsal during a long retention interval. A short study time condition (4 s per item), predicted not to be affected by adaptation to the odor, and a long study time condition (8 s per item) were used. Additionally, we introduced a new method for recovery from adaptation, where a dissimilar odor was briefly presented at the beginning of the retention interval, and we demonstrated the effectiveness of this technique. An incidental learning paradigm was used to prevent overshadowing from confounding the results. In three experiments, undergraduates (N = 200) incidentally studied words presented one-by-one and received a free recall test. Two pairs of odors and a third odor having different semantic-differential characteristics were selected from 14 familiar odors. One of the odors was presented during encoding, and during the test, the same odor (same-context condition) or the other odor within the pair (different-context condition) was presented. Without using a recovery-from-adaptation method, a significant odor-context effect appeared in the 4-s/item condition, but not in the 8-s/item condition. Using the recovery-from-adaptation method, context effects were found for both the 8- and the 4-s/item conditions. The size of the recovered odor-context effect did not change with study time. There were no serial position effects. Implications of the present findings are discussed.

  2. Cooperative drought adaptation: Integrating infrastructure development, conservation, and water transfers into adaptive policy pathways

    NASA Astrophysics Data System (ADS)

    Zeff, Harrison B.; Herman, Jonathan D.; Reed, Patrick M.; Characklis, Gregory W.

    2016-09-01

    A considerable fraction of urban water supply capacity serves primarily as a hedge against drought. Water utilities can reduce their dependence on firm capacity and forestall the development of new supplies using short-term drought management actions, such as conservation and transfers. Nevertheless, new supplies will often be needed, especially as demands rise due to population growth and economic development. Planning decisions regarding when and how to integrate new supply projects are fundamentally shaped by the way in which short-term adaptive drought management strategies are employed. To date, the challenges posed by long-term infrastructure sequencing and adaptive short-term drought management are treated independently, neglecting important feedbacks between planning and management actions. This work contributes a risk-based framework that uses continuously updating risk-of-failure (ROF) triggers to capture the feedbacks between short-term drought management actions (e.g., conservation and water transfers) and the selection and sequencing of a set of regional supply infrastructure options over the long term. Probabilistic regional water supply pathways are discovered for four water utilities in the "Research Triangle" region of North Carolina. Furthermore, this study distinguishes the status-quo planning path of independent action (encompassing utility-specific conservation and new supply infrastructure only) from two cooperative formulations: "weak" cooperation, which combines utility-specific conservation and infrastructure development with regional transfers, and "strong" cooperation, which also includes jointly developed regional infrastructure to support transfers. Results suggest that strong cooperation aids utilities in meeting their individual objectives at substantially lower costs and with less overall development. These benefits demonstrate how an adaptive, rule-based decision framework can coordinate integrated solutions that would not be identified using more traditional optimization methods.
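
    The sketch below illustrates a risk-of-failure (ROF) trigger of the kind described above; the storage model, synthetic inflow scenarios and trigger threshold are hypothetical placeholders rather than the study's utility models:

    ```python
    # Minimal sketch of a risk-of-failure (ROF) trigger: the utility simulates
    # many short inflow futures from its current storage and acts when the
    # simulated failure frequency exceeds a threshold.
    import numpy as np

    def risk_of_failure(storage, demand, inflow_scenarios, capacity,
                        failure_level=0.2, horizon=52):
        """Fraction of scenarios in which storage drops below failure_level*capacity."""
        failures = 0
        for inflows in inflow_scenarios:             # each row: weekly inflows
            s, failed = storage, False
            for week in range(horizon):
                s = min(capacity, s + inflows[week] - demand)
                if s < failure_level * capacity:
                    failed = True
                    break
            failures += failed
        return failures / len(inflow_scenarios)

    rng = np.random.default_rng(42)
    capacity = 100.0                                 # storage units, hypothetical
    scenarios = rng.gamma(shape=2.0, scale=1.0, size=(1000, 52))  # synthetic inflows
    current_storage, weekly_demand = 55.0, 2.4

    rof = risk_of_failure(current_storage, weekly_demand, scenarios, capacity)
    if rof > 0.05:                                   # conservation/transfer trigger
        print(f"ROF = {rof:.2f}: trigger conservation or request a transfer")
    else:
        print(f"ROF = {rof:.2f}: continue normal operations")
    ```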

  3. Purpose-driven biomaterials research in liver-tissue engineering.

    PubMed

    Ananthanarayanan, Abhishek; Narmada, Balakrishnan Chakrapani; Mo, Xuejun; McMillian, Michael; Yu, Hanry

    2011-03-01

    Bottom-up engineering of microscale tissue ("microtissue") constructs to recapitulate partially the complex structure-function relationships of liver parenchyma has been realized through the development of sophisticated biomaterial scaffolds, liver-cell sources, and in vitro culture techniques. With regard to in vivo applications, the long-lived stem/progenitor cell constructs can improve cell engraftment, whereas the short-lived, but highly functional hepatocyte constructs stimulate host liver regeneration. With regard to in vitro applications, microtissue constructs are being adapted or custom-engineered into cell-based assays for testing acute, chronic and idiosyncratic toxicities of drugs or pathogens. Systems-level methods and computational models that represent quantitative relationships between biomaterial scaffolds, cells and microtissue constructs will further enable their rational design for optimal integration into specific biomedical applications. Copyright © 2010 Elsevier Ltd. All rights reserved.

  4. High Performance Fortran for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Mehrotra, Piyush; Zima, Hans; Bushnell, Dennis M. (Technical Monitor)

    2000-01-01

    This paper focuses on the use of High Performance Fortran (HPF) for important classes of algorithms employed in aerospace applications. HPF is a set of Fortran extensions designed to provide users with a high-level interface for programming data parallel scientific applications, while delegating to the compiler/runtime system the task of generating explicitly parallel message-passing programs. We begin by providing a short overview of the HPF language. This is followed by a detailed discussion of the efficient use of HPF for applications involving multiple structured grids such as multiblock and adaptive mesh refinement (AMR) codes as well as unstructured grid codes. We focus on the data structures and computational structures used in these codes and on the high-level strategies that can be expressed in HPF to optimally exploit the parallelism in these algorithms.

  5. Parallel-aware, dedicated job co-scheduling within/across symmetric multiprocessing nodes

    DOEpatents

    Jones, Terry R.; Watson, Pythagoras C.; Tuel, William; Brenner, Larry; Caffrey, Patrick; Fier, Jeffrey

    2010-10-05

    In a parallel computing environment comprising a network of SMP nodes each having at least one processor, a parallel-aware co-scheduling method and system for improving the performance and scalability of a dedicated parallel job having synchronizing collective operations. The method and system uses a global co-scheduler and an operating system kernel dispatcher adapted to coordinate interfering system and daemon activities on a node and across nodes to promote intra-node and inter-node overlap of said interfering system and daemon activities as well as intra-node and inter-node overlap of said synchronizing collective operations. In this manner, the impact of random short-lived interruptions, such as timer-decrement processing and periodic daemon activity, on synchronizing collective operations is minimized on large processor-count SPMD bulk-synchronous programming styles.

  6. Spreadsheet-based program for alignment of overlapping DNA sequences.

    PubMed

    Anbazhagan, R; Gabrielson, E

    1999-06-01

    Molecular biology laboratories frequently face the challenge of aligning small overlapping DNA sequences derived from a long DNA segment. Here, we present a short program that can be used to adapt Excel spreadsheets as a tool for aligning DNA sequences, regardless of their orientation. The program runs on any Windows or Macintosh operating system computer with Excel 97 or Excel 98. The program is available for use as an Excel file, which can be downloaded from the BioTechniques Web site. Upon execution, the program opens a specially designed customized workbook and is capable of identifying overlapping regions between two sequence fragments and displaying the sequence alignment. It also performs a number of specialized functions such as recognition of restriction enzyme cutting sites and CpG island mapping without costly specialized software.
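
    The core operation can be sketched in a few lines (shown here in Python rather than the article's Excel macros); the sequence names and minimum overlap length are illustrative:

    ```python
    # Minimal sketch: find the longest exact overlap between two fragments,
    # checking both the given orientation and the reverse complement.
    def reverse_complement(seq):
        comp = {"A": "T", "T": "A", "G": "C", "C": "G", "N": "N"}
        return "".join(comp[b] for b in reversed(seq.upper()))

    def longest_overlap(a, b, min_len=8):
        """Length of the longest suffix of a that equals a prefix of b."""
        a, b = a.upper(), b.upper()
        for length in range(min(len(a), len(b)), min_len - 1, -1):
            if a[-length:] == b[:length]:
                return length
        return 0

    def align_fragments(a, b, min_len=8):
        """Report the best overlap of b (either strand) against the end of a."""
        candidates = {"forward": b, "reverse-complement": reverse_complement(b)}
        orientation, seq = max(candidates.items(),
                               key=lambda kv: longest_overlap(a, kv[1], min_len))
        n = longest_overlap(a, seq, min_len)
        if n == 0:
            return None
        return {"orientation": orientation, "overlap": n, "merged": a + seq[n:]}

    # Hypothetical fragments sharing an 11-base overlap.
    frag1 = "ACGTACGGTTCAGGCTAAGCTTAC"
    frag2 = "GCTAAGCTTACGGATCCGTTAACG"
    print(align_fragments(frag1, frag2, min_len=10))
    ```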

  7. Apo AIV and Citrulline Plasma Concentrations in Short Bowel Syndrome Patients: The Influence of Short Bowel Anatomy.

    PubMed

    López-Tejero, M Dolores; Virgili, Núria; Targarona, Jordi; Ruiz, Jorge; García, Natalia; Oró, Denise; García-Villoria, Judit; Creus, Gloria; Pita, Ana M

    Parenteral nutrition (PN) dependence in short bowel syndrome (SBS) patients is linked to the functionality of the remnant small bowel (RSB). Patients may wean off PN following a period of intestinal adaptation that restores this functionality. Currently, plasma citrulline is the standard biomarker for monitoring intestinal functionality and adaptation. However, available studies reveal that the relationship of this biomarker with the length and function of the RSB is arguable. Thus, having additional biomarkers would help in deciding when PN can be weaned. By measuring concomitant changes in citrulline and the novel biomarker apolipoprotein AIV (Apo AIV), and by taking into account the anatomy of the RSB, this exploratory study aims at a better understanding of the intestinal adaptation process and at characterizing SBS patients under PN. Thirty-four adult SBS patients were selected and assigned to adapted (aSBS) and non-adapted (nSBS) groups after reconstructive surgeries. Remaining jejunum and ileum lengths were recorded. The aSBS patients were either on an oral diet (ORAL group), those with intestinal insufficiency, or on oral and home parenteral nutrition (HPN group), those with chronic intestinal failure. Apo AIV and citrulline were analyzed in plasma samples after overnight fasting. An exploratory ROC analysis using citrulline as the gold standard was performed. Both biomarkers, Apo AIV and citrulline, showed a significant correlation with RSB length (RSBL) in aSBS patients. In jejuno-ileocolic patients, only Apo AIV correlated with RSBL (rb = 0.54) and with ileum length (rb = 0.84). In patients without ileum, neither biomarker showed any correlation with RSBL. ROC analysis indicated an Apo AIV cut-off value of 4.6 mg/100 mL for differentiating between the aSBS HPN and ORAL groups. Therefore, in addition to citrulline, Apo AIV can be used as a biomarker to monitor intestinal adaptation in SBS patients. As short bowel anatomy is shown to influence citrulline and Apo AIV plasma values, both biomarkers complement each other, furnishing new insight for managing PN dependence.

  8. Enhanced adaptive signal control using dedicated short-range communications.

    DOT National Transportation Integrated Search

    2014-05-01

    Connected vehicle technology with dedicated short-range communications can provide traffic : information in a spatial domain that conventional fixed-point detectors cannot provide. However, because : of low market penetration with this new data sourc...

  9. Adaptation of gastrointestinal nematode parasites to host genotype: single locus simulation models

    PubMed Central

    2013-01-01

    Background: Breeding livestock for improved resistance to disease is an increasingly important selection goal. However, the risk of pathogens adapting to livestock bred for improved disease resistance is difficult to quantify. Here, we explore the possibility of gastrointestinal worms adapting to sheep bred for low faecal worm egg count using computer simulation. Our model assumes sheep and worm genotypes interact at a single locus, such that the effect of an A allele in sheep is dependent on worm genotype, and the B allele in worms is favourable for parasitizing the A allele sheep but may increase mortality on pasture. We describe the requirements for adaptation and test if worm adaptation (1) is slowed by non-genetic features of worm infections and (2) can occur with little observable change in faecal worm egg count. Results: Adaptation in worms was found to be primarily influenced by overall worm fitness, viz. the balance between the advantage of the B allele during the parasitic stage in sheep and its disadvantage on pasture. Genetic variation at the interacting locus in worms could be from de novo or segregating mutations, but de novo mutations are rare and segregating mutations are likely constrained to have (near) neutral effects on worm fitness. Most other aspects of the worm infection we modelled did not affect the outcomes. However, the host-controlled mechanism to reduce faecal worm egg count by lowering worm fecundity reduced the selection pressure on worms to adapt compared to other mechanisms, such as increasing worm mortality. Temporal changes in worm egg count were unreliable for detecting adaptation, despite the steady environment assumed in the simulations. Conclusions: Adaptation of worms to sheep selected for low faecal worm egg count requires an allele segregating in worms that is favourable in animals with improved resistance but less favourable in other animals. Obtaining alleles with this specific property seems unlikely. With support from experimental data, we conclude that selection for low faecal worm egg count should be stable over a short time frame (e.g. 20 years). We are further exploring model outcomes with multiple loci and comparing outcomes to other control strategies. PMID:23714384
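
    A minimal sketch of a single-locus, two-stage selection recursion in the spirit of these simulations is given below; all parameter values (benefit in resistant hosts, cost on pasture, host genotype frequency) are hypothetical:

    ```python
    # Minimal sketch: the worm B allele is favoured inside resistant (A-carrying)
    # sheep but pays a survival cost on pasture; its frequency is iterated
    # deterministically over generations.
    def b_allele_trajectory(p0=0.01, host_A_freq=0.5, benefit=0.10, cost=0.04,
                            generations=100):
        """Deterministic allele-frequency recursion under two-stage selection."""
        p = p0
        trajectory = [p]
        for _ in range(generations):
            # Stage 1: parasitic stage -- B is advantageous only in A-type hosts.
            w_B, w_b = 1.0 + benefit * host_A_freq, 1.0
            p = p * w_B / (p * w_B + (1.0 - p) * w_b)
            # Stage 2: free-living stage on pasture -- B carries a mortality cost.
            w_B = 1.0 - cost
            p = p * w_B / (p * w_B + (1.0 - p))
            trajectory.append(p)
        return trajectory

    # Net selection is positive only if the within-host benefit outweighs the
    # pasture cost, which is the balance the simulations identify as decisive.
    traj = b_allele_trajectory()
    print("B allele frequency after 100 generations:", round(traj[-1], 3))
    ```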

  10. Evaluation Parameters for Computer-Adaptive Testing

    ERIC Educational Resources Information Center

    Georgiadou, Elisabeth; Triantafillou, Evangelos; Economides, Anastasios A.

    2006-01-01

    With the proliferation of computers in test delivery today, adaptive testing has become quite popular, especially when examinees must be classified into two categories (pass/fail, master/non-master). Several well-established organisations have provided standards and guidelines for the design and evaluation of educational and psychological testing.…

  11. Between-Trial Forgetting Due to Interference and Time in Motor Adaptation.

    PubMed

    Kim, Sungshin; Oh, Youngmin; Schweighofer, Nicolas

    2015-01-01

    Learning a motor task with temporally spaced presentations or with other tasks intermixed between presentations reduces performance during training, but can enhance retention post training. These two effects are known as the spacing and contextual interference effect, respectively. Here, we aimed at testing a unifying hypothesis of the spacing and contextual interference effects in visuomotor adaptation, according to which forgetting between trials due to either spaced presentations or interference by another task will promote between-trial forgetting, which will depress performance during acquisition, but will promote retention. We first performed an experiment with three visuomotor adaptation conditions: a short inter-trial-interval (ITI) condition (SHORT-ITI); a long ITI condition (LONG-ITI); and an alternating condition with two alternated opposite tasks (ALT), with the same single-task ITI as in LONG-ITI. In the SHORT-ITI condition, there was fastest increase in performance during training and largest immediate forgetting in the retention tests. In contrast, in the ALT condition, there was slowest increase in performance during training and little immediate forgetting in the retention tests. Compared to these two conditions, in the LONG-ITI, we found intermediate increase in performance during training and intermediate immediate forgetting. To account for these results, we fitted to the data six possible adaptation models with one or two time scales, and with interference in the fast, or in the slow, or in both time scales. Model comparison confirmed that two time scales and some degree of interferences in either time scale are needed to account for our experimental results. In summary, our results suggest that retention following adaptation is modulated by the degree of between-trial forgetting, which is due to time-based decay in single adaptation task and interferences in multiple adaptation tasks.
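
    The two-time-scale structure favoured by the model comparison can be sketched as the classic fast/slow state-space model below; the retention/learning rates and the 30 degree perturbation are illustrative values, not the paper's fitted parameters:

    ```python
    # Minimal sketch: two-time-scale adaptation model. A fast process learns
    # quickly but forgets quickly, a slow process does the opposite, and their
    # sum is the measured adaptation on each trial.
    import numpy as np

    def two_state_adaptation(perturbation, A_f=0.60, B_f=0.25, A_s=0.992, B_s=0.03):
        x_f, x_s = 0.0, 0.0
        output = []
        for p in perturbation:
            x = x_f + x_s                 # net adaptation expressed on this trial
            e = p - x                     # error experienced on the trial
            x_f = A_f * x_f + B_f * e     # fast state: large learning, strong decay
            x_s = A_s * x_s + B_s * e     # slow state: small learning, weak decay
            output.append(x)
        return np.array(output)

    # Training on a constant 30 degree visuomotor rotation.
    training = np.full(80, 30.0)
    adapt = two_state_adaptation(training)
    print("adaptation at end of training:", round(adapt[-1], 1), "deg")
    ```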

  12. [Possibilities of the pharmacological correction of adaptive reactions of human organism in short-term moving from middle to high altitude].

    PubMed

    Kundashev, U K; Zurdinov, A Z; Barchukov, V G

    2014-01-01

    The efficacy of drugs belonging to the class of actoprotectors and antihypoxants, which are prescribed for short-term adaptation of the human organism during movement from middle altitude (1670 m) to high altitude (3750 m), was assessed. Volunteers stayed at the middle altitude for 15 days before moving to the high altitude. Prior to the ascent, upon arrival at high altitude, and on the 3rd day, the state of the CNS, the cardiovascular system, red and white blood cell indices, and some indices of energy metabolism were assessed. Drugs or placebo were administered as tablets after the first testing (one hour before moving to high altitude), upon arrival at altitude (the next day, after breakfast), and on the third day (one hour before the last testing). It is established that a combination of metaprote and ladasten (in doses of 0.125 and 0.1 g, respectively) accelerates short-term adaptation, which is manifested by improved tolerance of physical activity at high altitude and faster responses of the white and red blood cell indices. With the combined administration of both actoprotectors, the adaptive reactions of exercise tolerance and the blood system were more pronounced than with the administration of hypoxen (0.5 g) alone. A distinguishing feature of the response to the combination of actoprotectors was that adaptation of the energy supply system (judging from the metabolites studied) already took place in the first hours of staying at high altitude, whereas upon taking 0.5 g hypoxen it was observed on the 3rd day, and the adaptive reactions in the placebo group were still developing on the 3rd day.

  13. Bacterial computing: a form of natural computing and its applications.

    PubMed

    Lahoz-Beltra, Rafael; Navarro, Jorge; Marijuán, Pedro C

    2014-01-01

    The capability to establish adaptive relationships with the environment is an essential characteristic of living cells. Both bacterial computing and bacterial intelligence are general traits manifested in adaptive behaviors that respond to surrounding environmental conditions. These two traits have generated a variety of theoretical and applied approaches. As the different systems of bacterial signaling and the different modes of genetic change become better known and more carefully explored, the whole range of adaptive possibilities of bacteria may be studied from new angles. For instance, instances of molecular "learning" appear within the mechanisms of evolution. More concretely, looking specifically at the time dimension, the bacterial mechanisms of learning and evolution appear as two different and related mechanisms for adaptation to the environment: in somatic time the former and in evolutionary time the latter. In the present chapter, the possible application of both kinds of mechanisms to prokaryotic molecular computing schemes, as well as to the solution of real-world problems, is reviewed.

  14. Bacterial computing: a form of natural computing and its applications

    PubMed Central

    Lahoz-Beltra, Rafael; Navarro, Jorge; Marijuán, Pedro C.

    2014-01-01

    The capability to establish adaptive relationships with the environment is an essential characteristic of living cells. Both bacterial computing and bacterial intelligence are general traits manifested in adaptive behaviors that respond to surrounding environmental conditions. These two traits have generated a variety of theoretical and applied approaches. As the different systems of bacterial signaling and the different modes of genetic change become better known and more carefully explored, the whole range of adaptive possibilities of bacteria may be studied from new angles. For instance, instances of molecular “learning” appear within the mechanisms of evolution. More concretely, looking specifically at the time dimension, the bacterial mechanisms of learning and evolution appear as two different and related mechanisms for adaptation to the environment: in somatic time the former and in evolutionary time the latter. In the present chapter, the possible application of both kinds of mechanisms to prokaryotic molecular computing schemes, as well as to the solution of real-world problems, is reviewed. PMID:24723912

  15. Short-term adaptation of the VOR: non-retinal-slip error signals and saccade substitution

    NASA Technical Reports Server (NTRS)

    Eggers, Scott D Z.; De Pennington, Nick; Walker, Mark F.; Shelhamer, Mark; Zee, David S.

    2003-01-01

    We studied short-term (30 min) adaptation of the vestibulo-ocular reflex (VOR) in five normal humans using a "position error" stimulus without retinal image motion. Both before and after adaptation a velocity gain (peak slow-phase eye velocity/peak head velocity) and a position gain (total eye movement during chair rotation/amplitude of chair motion) were measured in darkness using search coils. The vestibular stimulus was a brief ( approximately 700 ms), 15 degrees chair rotation in darkness (peak velocity 43 degrees /s). To elicit adaptation, a straight-ahead fixation target disappeared during chair movement and when the chair stopped the target reappeared at a new location in front of the subject for gain-decrease (x0) adaptation, or 10 degrees opposite to chair motion for gain-increase (x1.67) adaptation. This position-error stimulus was effective at inducing VOR adaptation, though for gain-increase adaptation the primary strategy was to substitute augmenting saccades during rotation while for gain-decrease adaptation both corrective saccades and a decrease in slow-phase velocity occurred. Finally, the presence of the position-error signal alone, at the end of head rotation, without any attempt to fix upon it, was not sufficient to induce adaptation. Adaptation did occur, however, if the subject did make a saccade to the target after head rotation, or even if the subject paid attention to the new location of the target without actually looking at it.

  16. Computer-aided detection of initial polyp candidates with level set-based adaptive convolution

    NASA Astrophysics Data System (ADS)

    Zhu, Hongbin; Duan, Chaijie; Liang, Zhengrong

    2009-02-01

    In order to eliminate or weaken the interference between different topological structures on the colon wall, adaptive and normalized convolution methods were used to compute the first and second order spatial derivatives of computed tomographic colonography images, which is the beginning of various geometric analyses. However, the performance of such methods greatly depends on the single-layer representation of the colon wall, which is called the starting layer (SL) in the following text. In this paper, we introduce a level set-based adaptive convolution (LSAC) method to compute the spatial derivatives, in which the level set method is employed to determine a more reasonable SL. The LSAC was applied to a computer-aided detection (CAD) scheme to detect the initial polyp candidates, and experiments showed that it benefits the CAD scheme in both the detection sensitivity and specificity as compared to our previous work.

  17. A Weibull distribution accrual failure detector for cloud computing

    PubMed Central

    Wu, Zhibo; Wu, Jin; Zhao, Yao; Wen, Dongxin

    2017-01-01

    Failure detectors are a fundamental component for building high-availability distributed systems. To meet the requirements of complicated large-scale distributed systems, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on the Weibull distribution, called the Weibull Distribution Failure Detector, has been proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions of cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared using public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing. PMID:28278229
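
    The accrual idea can be sketched as below: heartbeat inter-arrival times are fitted with a Weibull model, and the suspicion level is the negative log of the probability that the monitored process is still alive; the fitting call, window size and phi usage are assumptions for illustration, not the paper's exact implementation:

    ```python
    # Minimal sketch of a Weibull-based accrual failure detector.
    import numpy as np
    from scipy.stats import weibull_min

    class WeibullAccrualDetector:
        def __init__(self, window=200):
            self.intervals = []
            self.window = window
            self.last_heartbeat = None

        def heartbeat(self, now):
            if self.last_heartbeat is not None:
                self.intervals.append(now - self.last_heartbeat)
                self.intervals = self.intervals[-self.window:]
            self.last_heartbeat = now

        def phi(self, now):
            """Suspicion level: -log10 of the probability the process is alive."""
            if len(self.intervals) < 10:
                return 0.0
            shape, loc, scale = weibull_min.fit(self.intervals, floc=0)
            elapsed = now - self.last_heartbeat
            p_later = weibull_min.sf(elapsed, shape, loc=loc, scale=scale)
            return -np.log10(max(p_later, 1e-12))

    # Hypothetical run: jittery heartbeats every ~100 ms, then silence.
    rng = np.random.default_rng(7)
    det, t = WeibullAccrualDetector(), 0.0
    for _ in range(100):
        t += rng.gamma(4.0, 0.025)        # ~100 ms mean inter-arrival time
        det.heartbeat(t)
    for silence in (0.05, 0.2, 0.5):
        print(f"phi after {silence*1000:.0f} ms of silence: {det.phi(t + silence):.2f}")
    ```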

  18. Parallel 3D Mortar Element Method for Adaptive Nonconforming Meshes

    NASA Technical Reports Server (NTRS)

    Feng, Huiyu; Mavriplis, Catherine; VanderWijngaart, Rob; Biswas, Rupak

    2004-01-01

    High order methods are frequently used in computational simulation for their high accuracy. An efficient way to avoid unnecessary computation in smooth regions of the solution is to use adaptive meshes, which employ fine grids only in areas where they are needed. Nonconforming spectral elements allow the grid to be flexibly adjusted to satisfy the computational accuracy requirements. The method is suitable for computational simulations of unsteady problems with very disparate length scales or unsteady moving features, such as heat transfer, fluid dynamics or flame combustion. In this work, we select the Mortar Element Method (MEM) to handle the non-conforming interfaces between elements. A new technique is introduced to efficiently implement MEM in 3-D nonconforming meshes. By introducing an "intermediate mortar", the proposed method decomposes the projection between 3-D elements and mortars into two steps. In each step, projection matrices derived in 2-D are used. The two-step method avoids explicitly forming/deriving large projection matrices for 3-D meshes, and also helps to simplify the implementation. This new technique can be used for both h- and p-type adaptation. This method is applied to an unsteady 3-D moving heat source problem. With our new MEM implementation, mesh adaptation is able to efficiently refine the grid near the heat source and coarsen the grid once the heat source passes. The savings in computational work resulting from the dynamic mesh adaptation are demonstrated by the reduction in the number of elements used and CPU time spent. MEM and mesh adaptation, respectively, bring irregularity and dynamics to the computer memory access pattern. Hence, they provide a good way to gauge the performance of computer systems when running scientific applications whose memory access patterns are irregular and unpredictable. We select a 3-D moving heat source problem as the Unstructured Adaptive (UA) grid benchmark, a new component of the NAS Parallel Benchmarks (NPB). In this paper, we present some interesting performance results of our OpenMP parallel implementation on different architectures such as the SGI Origin2000, SGI Altix, and Cray MTA-2.

  19. Unstructured Grid Adaptation: Status, Potential Impacts, and Recommended Investments Toward CFD Vision 2030

    NASA Technical Reports Server (NTRS)

    Park, Michael A.; Krakos, Joshua A.; Michal, Todd; Loseille, Adrien; Alonso, Juan J.

    2016-01-01

    Unstructured grid adaptation is a powerful tool to control discretization error for Computational Fluid Dynamics (CFD). It has enabled key increases in the accuracy, automation, and capacity of some fluid simulation applications. Slotnick et al. provide a number of case studies in the CFD Vision 2030 Study: A Path to Revolutionary Computational Aerosciences to illustrate the current state of CFD capability and capacity. The authors forecast the potential impact of the High Performance Computing (HPC) environments anticipated in the year 2030 and identify mesh generation and adaptivity as continuing significant bottlenecks in the CFD work flow. These bottlenecks may persist because very little government investment has been targeted in these areas. To motivate investment, the impacts of improved grid adaptation technologies are identified. The CFD Vision 2030 Study roadmap and anticipated capabilities in complementary disciplines are quoted to provide context for the progress made in grid adaptation in the past fifteen years, the current status, and a forecast for the next fifteen years with recommended investments. These investments are specific to mesh adaptation and impact other aspects of the CFD process. Finally, a strategy is identified to diffuse grid adaptation technology into production CFD work flows.

  20. An SDR-Based Real-Time Testbed for GNSS Adaptive Array Anti-Jamming Algorithms Accelerated by GPU

    PubMed Central

    Xu, Hailong; Cui, Xiaowei; Lu, Mingquan

    2016-01-01

    Nowadays, software-defined radio (SDR) has become a common approach to evaluate new algorithms. However, in the field of Global Navigation Satellite System (GNSS) adaptive array anti-jamming, previous work has been limited by the high computational power demanded by adaptive algorithms, and often lacks flexibility and configurability. In this paper, the design and implementation of an SDR-based real-time testbed for GNSS adaptive array anti-jamming accelerated by a Graphics Processing Unit (GPU) are documented. This testbed distinguishes itself as a feature-rich and extendible platform with great flexibility and configurability, as well as high computational performance. Both Space-Time Adaptive Processing (STAP) and Space-Frequency Adaptive Processing (SFAP) are implemented with a wide range of parameters. Raw data from as many as eight antenna elements can be processed in real-time in either an adaptive nulling or beamforming mode. To take full advantage of the parallelism provided by the GPU, a batched programming method is proposed. Tests and experiments are conducted to evaluate both the computational and anti-jamming performance. This platform can be used for research and prototyping, as well as serve as a real product in certain applications. PMID:26978363

  1. An SDR-Based Real-Time Testbed for GNSS Adaptive Array Anti-Jamming Algorithms Accelerated by GPU.

    PubMed

    Xu, Hailong; Cui, Xiaowei; Lu, Mingquan

    2016-03-11

    Nowadays, software-defined radio (SDR) has become a common approach to evaluate new algorithms. However, in the field of Global Navigation Satellite System (GNSS) adaptive array anti-jamming, previous work has been limited by the high computational power demanded by adaptive algorithms, and often lacks flexibility and configurability. In this paper, the design and implementation of an SDR-based real-time testbed for GNSS adaptive array anti-jamming accelerated by a Graphics Processing Unit (GPU) are documented. This testbed distinguishes itself as a feature-rich and extendible platform with great flexibility and configurability, as well as high computational performance. Both Space-Time Adaptive Processing (STAP) and Space-Frequency Adaptive Processing (SFAP) are implemented with a wide range of parameters. Raw data from as many as eight antenna elements can be processed in real-time in either an adaptive nulling or beamforming mode. To take full advantage of the parallelism provided by the GPU, a batched programming method is proposed. Tests and experiments are conducted to evaluate both the computational and anti-jamming performance. This platform can be used for research and prototyping, as well as serve as a real product in certain applications.

  2. Electrooptical adaptive switching network for the hypercube computer

    NASA Technical Reports Server (NTRS)

    Chow, E.; Peterson, J.

    1988-01-01

    An all-optical network design for the hyperswitch network using regular free-space interconnects between electronic processor nodes is presented. The adaptive routing model used is described, and an adaptive routing control example is presented. The design demonstrates that existing electrooptical techniques are sufficient for implementing efficient parallel architectures without the need for more complex means of implementing arbitrary interconnection schemes. The electrooptical hyperswitch network significantly improves the communication performance of the hypercube computer.

  3. Computation of free energy profiles with parallel adaptive dynamics

    NASA Astrophysics Data System (ADS)

    Lelièvre, Tony; Rousset, Mathias; Stoltz, Gabriel

    2007-04-01

    We propose a formulation of an adaptive computation of free energy differences, in the adaptive biasing force or nonequilibrium metadynamics spirit, using conditional distributions of samples of configurations which evolve in time. This allows us to present a truly unifying framework for these methods, and to prove convergence results for certain classes of algorithms. From a numerical viewpoint, a parallel implementation of these methods is very natural, the replicas interacting through the reconstructed free energy. We demonstrate how to improve this parallel implementation by resorting to some selection mechanism on the replicas. This is illustrated by computations on a model system of conformational changes.
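
    A minimal serial sketch of the adaptive biasing force idea on a toy 1-D double well is given below (the parallel, replica-interacting scheme of the paper is not reproduced); all simulation parameters are assumed for illustration:

    ```python
    # Minimal sketch: adaptive biasing force (ABF) on V(x) = (x^2 - 1)^2. The
    # running mean force in each bin along the coordinate is cancelled by a
    # bias, and the free energy is recovered by integrating that mean force.
    import numpy as np

    def potential_force(x):
        return -4.0 * x * (x ** 2 - 1.0)       # force for V(x) = (x^2 - 1)^2

    def abf(n_steps=200_000, dt=1e-3, kT=1.0, bins=60, xlim=1.6, seed=3):
        rng = np.random.default_rng(seed)
        edges = np.linspace(-xlim, xlim, bins + 1)
        force_sum, counts = np.zeros(bins), np.zeros(bins)
        x = -1.0
        for _ in range(n_steps):
            b = np.clip(np.searchsorted(edges, x) - 1, 0, bins - 1)
            f = potential_force(x)
            force_sum[b] += f
            counts[b] += 1
            bias = -force_sum[b] / counts[b]    # adaptive bias opposes the mean force
            # Overdamped Langevin step under the biased force.
            x += dt * (f + bias) + np.sqrt(2.0 * kT * dt) * rng.normal()
            x = np.clip(x, -xlim, xlim)
        mean_force = np.where(counts > 0, force_sum / counts, 0.0)
        centers = 0.5 * (edges[:-1] + edges[1:])
        free_energy = -np.cumsum(mean_force) * (edges[1] - edges[0])
        return centers, free_energy - free_energy.min()

    centers, A = abf()
    barrier = A[np.argmin(np.abs(centers))] - A[np.argmin(np.abs(centers + 1.0))]
    print(f"estimated barrier height: {barrier:.2f} kT (exact value is 1.0)")
    ```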

  4. Long-Lasting Modifications of Saccadic Eye Movements Following Adaptation Induced in the Double-Step Target Paradigm

    ERIC Educational Resources Information Center

    Alahyane, Nadia; Pelisson, Denis

    2005-01-01

    The adaptation of saccadic eye movements to environmental changes occurring throughout life is a good model of motor learning and motor memory. Numerous studies have analyzed the behavioral properties and neural substrate of oculomotor learning in short-term saccadic adaptation protocols, but to our knowledge, none have tested the persistence of…

  5. Implementing Process-Oriented, Guided-Inquiry Learning for the First Time: Adaptations and Short-Term Impacts on Students' Attitude and Performance

    ERIC Educational Resources Information Center

    Chase, Anthony; Pakhira, Deblina; Stains, Marilyne

    2013-01-01

    Innovative, research-based instructional practices are critical to transforming the conventional undergraduate instructional landscape into a student-centered learning environment. Research on dissemination of innovation indicates that instructors often adapt rather than adopt these practices. These adaptations can lead to the loss of critical…

  6. Towards a neuro-computational account of prism adaptation.

    PubMed

    Petitet, Pierre; O'Reilly, Jill X; O'Shea, Jacinta

    2017-12-14

    Prism adaptation has a long history as an experimental paradigm used to investigate the functional and neural processes that underlie sensorimotor control. In the neuropsychology literature, prism adaptation behaviour is typically explained by reference to a traditional cognitive psychology framework that distinguishes putative functions, such as 'strategic control' versus 'spatial realignment'. This theoretical framework lacks conceptual clarity, quantitative precision and explanatory power. Here, we advocate for an alternative computational framework that offers several advantages: 1) an algorithmic explanatory account of the computations and operations that drive behaviour; 2) expressed in quantitative mathematical terms; 3) embedded within a principled theoretical framework (Bayesian decision theory, state-space modelling); 4) that offers a means to generate and test quantitative behavioural predictions. This computational framework offers a route towards mechanistic neurocognitive explanations of prism adaptation behaviour. Thus it constitutes a conceptual advance compared to the traditional theoretical framework. In this paper, we illustrate how Bayesian decision theory and state-space models offer principled explanations for a range of behavioural phenomena in the field of prism adaptation (e.g. visual capture, magnitude of visual versus proprioceptive realignment, spontaneous recovery and dynamics of adaptation memory). We argue that this explanatory framework can advance understanding of the functional and neural mechanisms that implement prism adaptation behaviour, by enabling quantitative tests of hypotheses that go beyond merely descriptive mapping claims that 'brain area X is (somehow) involved in psychological process Y'. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.

  7. Short-Term Effects of Playing Computer Games on Attention

    ERIC Educational Resources Information Center

    Tahiroglu, Aysegul Yolga; Celik, Gonca Gul; Avci, Ayse; Seydaoglu, Gulsah; Uzel, Mehtap; Altunbas, Handan

    2010-01-01

    Objective: The main aim of the present study is to investigate the short-term cognitive effects of computer games in children with different psychiatric disorders and normal controls. Method: One hundred one children are recruited for the study (aged between 9 and 12 years). All participants played a motor-racing game on the computer for 1 hour.…

  8. Reconstruction of Haplotype-Blocks Selected during Experimental Evolution.

    PubMed

    Franssen, Susanne U; Barton, Nicholas H; Schlötterer, Christian

    2017-01-01

    The genetic analysis of experimentally evolving populations typically relies on short reads from pooled individuals (Pool-Seq). While this method provides reliable allele frequency estimates, the underlying haplotype structure remains poorly characterized. With small population sizes and adaptive variants that start from low frequencies, the interpretation of selection signatures in most Evolve and Resequencing studies remains challenging. To facilitate the characterization of selection targets, we propose a new approach that reconstructs selected haplotypes from replicated time series, using Pool-Seq data. We identify selected haplotypes through the correlated frequencies of alleles carried by them. Computer simulations indicate that selected haplotype-blocks of several Mb can be reconstructed with high confidence and low error rates, even when allele frequencies change only by 20% across three replicates. Applying this method to real data from D. melanogaster populations adapting to a hot environment, we identify a selected haplotype-block of 6.93 Mb. We confirm the presence of this haplotype-block in evolved populations by experimental haplotyping, demonstrating the power and accuracy of our haplotype reconstruction from Pool-Seq data. We propose that the combination of allele frequency estimates with haplotype information will provide the key to understanding the dynamics of adaptive alleles. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
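
    The central heuristic, grouping SNPs whose replicated allele-frequency trajectories rise together, can be sketched as below; the synthetic data, correlation threshold and greedy grouping rule are illustrative assumptions rather than the authors' pipeline:

    ```python
    # Minimal sketch: alleles carried on the same selected haplotype rise
    # together, so SNPs with strongly correlated frequency trajectories are
    # grouped into a candidate haplotype block.
    import numpy as np

    def correlated_blocks(freqs, positions, min_r=0.9):
        """Greedy grouping of SNPs by correlation of their frequency trajectories.

        freqs     : array (n_snps, n_timepoints * n_replicates) of frequencies
        positions : genomic positions of the SNPs (sorted)
        """
        r = np.corrcoef(freqs)                  # trajectory correlation matrix
        unassigned = list(range(len(positions)))
        blocks = []
        while unassigned:
            seed = unassigned.pop(0)
            members = [seed] + [j for j in unassigned if r[seed, j] >= min_r]
            unassigned = [j for j in unassigned if j not in members]
            blocks.append(sorted(members))
        return [(positions[b[0]], positions[b[-1]], len(b)) for b in blocks]

    # Synthetic example: 6 SNPs, 3 replicates x 4 time points. The first three
    # SNPs ride on a selected haplotype and rise in concert; the rest drift.
    rng = np.random.default_rng(5)
    rising = np.tile(np.array([0.05, 0.15, 0.35, 0.55]), 3)
    freqs = np.vstack([
        rising + rng.normal(0, 0.02, 12),
        rising + rng.normal(0, 0.02, 12),
        rising + rng.normal(0, 0.02, 12),
        rng.uniform(0.1, 0.3, 12),
        rng.uniform(0.1, 0.3, 12),
        rng.uniform(0.4, 0.6, 12),
    ])
    positions = np.array([1.00, 1.25, 1.60, 4.10, 5.70, 8.20])  # Mb, hypothetical
    for start, end, n in correlated_blocks(freqs, positions):
        print(f"block {start:.2f}-{end:.2f} Mb with {n} correlated SNPs")
    ```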

  9. Adaptive Technology that Provides Access to Computers. DO-IT Program.

    ERIC Educational Resources Information Center

    Washington Univ., Seattle.

    This brochure describes the different types of barriers individuals with mobility impairments, blindness, low vision, hearing impairments, and specific learning disabilities face in providing computer input, interpreting output, and reading documentation. The adaptive hardware and software that has been developed to provide functional alternatives…

  10. The Feasibility of Adaptive Unstructured Computations On Petaflops Systems

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Oliker, Leonid; Heber, Gerd; Gao, Guang; Saini, Subhash (Technical Monitor)

    1999-01-01

    This viewgraph presentation covers the advantages of mesh adaptation, unstructured grids, and dynamic load balancing. It illustrates parallel adaptive communications, and explains PLUM (Parallel dynamic load balancing for adaptive unstructured meshes), and PSAW (Proper Self Avoiding Walks).

  11. Selection of Norway spruce somatic embryos by computer vision

    NASA Astrophysics Data System (ADS)

    Hamalainen, Jari J.; Jokinen, Kari J.

    1993-05-01

    A computer vision system was developed for the classification of plant somatic embryos. The embryos are in a Petri dish that is transferred with constant speed, and they are recognized as they pass a line scan camera. A classification algorithm needs to be installed for every plant species. This paper describes an algorithm for the recognition of Norway spruce (Picea abies) embryos. A short review of conifer micropropagation by somatic embryogenesis is also given. The recognition algorithm is based on features calculated from the boundary of the object. Only the part of the boundary corresponding to the developing cotyledons (2-15) and the straight sides of the embryo are used for recognition. An index of the length of the cotyledons describes the developmental stage of the embryo. The testing set for classifier performance consisted of 118 embryos and 478 nonembryos. With the classification tolerances chosen, 69% of the objects classified as embryos by a human classifier were selected and 31% rejected. Less than 1% of the nonembryos were classified as embryos. The basic features developed can probably be easily adapted for the recognition of other conifer somatic embryos.

  12. Atropos: specific, sensitive, and speedy trimming of sequencing reads.

    PubMed

    Didion, John P; Martin, Marcel; Collins, Francis S

    2017-01-01

    A key step in the transformation of raw sequencing reads into biological insights is the trimming of adapter sequences and low-quality bases. Read trimming has been shown to increase the quality and reliability while decreasing the computational requirements of downstream analyses. Many read trimming software tools are available; however, no tool simultaneously provides the accuracy, computational efficiency, and feature set required to handle the types and volumes of data generated in modern sequencing-based experiments. Here we introduce Atropos and show that it trims reads with high sensitivity and specificity while maintaining leading-edge speed. Compared to other state-of-the-art read trimming tools, Atropos achieves significant increases in trimming accuracy while remaining competitive in execution times. Furthermore, Atropos maintains high accuracy even when trimming data with elevated rates of sequencing errors. The accuracy, high performance, and broad feature set offered by Atropos makes it an appropriate choice for the pre-processing of Illumina, ABI SOLiD, and other current-generation short-read sequencing datasets. Atropos is open source and free software written in Python (3.3+) and available at https://github.com/jdidion/atropos.

  13. Atropos: specific, sensitive, and speedy trimming of sequencing reads

    PubMed Central

    Collins, Francis S.

    2017-01-01

    A key step in the transformation of raw sequencing reads into biological insights is the trimming of adapter sequences and low-quality bases. Read trimming has been shown to increase the quality and reliability while decreasing the computational requirements of downstream analyses. Many read trimming software tools are available; however, no tool simultaneously provides the accuracy, computational efficiency, and feature set required to handle the types and volumes of data generated in modern sequencing-based experiments. Here we introduce Atropos and show that it trims reads with high sensitivity and specificity while maintaining leading-edge speed. Compared to other state-of-the-art read trimming tools, Atropos achieves significant increases in trimming accuracy while remaining competitive in execution times. Furthermore, Atropos maintains high accuracy even when trimming data with elevated rates of sequencing errors. The accuracy, high performance, and broad feature set offered by Atropos makes it an appropriate choice for the pre-processing of Illumina, ABI SOLiD, and other current-generation short-read sequencing datasets. Atropos is open source and free software written in Python (3.3+) and available at https://github.com/jdidion/atropos. PMID:28875074

  14. Beamforming using subspace estimation from a diagonally averaged sample covariance.

    PubMed

    Quijano, Jorge E; Zurk, Lisa M

    2017-08-01

    The potential benefit of a large-aperture sonar array for high resolution target localization is often challenged by the lack of sufficient data required for adaptive beamforming. This paper introduces a Toeplitz-constrained estimator of the clairvoyant signal covariance matrix corresponding to multiple far-field targets embedded in background isotropic noise. The estimator is obtained by averaging along subdiagonals of the sample covariance matrix, followed by covariance extrapolation using the method of maximum entropy. The sample covariance is computed from limited data snapshots, a situation commonly encountered with large-aperture arrays in environments characterized by short periods of local stationarity. Eigenvectors computed from the Toeplitz-constrained covariance are used to construct signal-subspace projector matrices, which are shown to reduce background noise and improve detection of closely spaced targets when applied to subspace beamforming. Monte Carlo simulations corresponding to increasing array aperture suggest convergence of the proposed projector to the clairvoyant signal projector, thereby outperforming the classic projector obtained from the sample eigenvectors. Beamforming performance of the proposed method is analyzed using simulated data, as well as experimental data from the Shallow Water Array Performance experiment.
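
    The two core steps, subdiagonal averaging of the sample covariance and construction of a signal-subspace projector from its leading eigenvectors, can be sketched as below; the array geometry and source scenario are hypothetical, and the maximum-entropy extrapolation step is omitted:

    ```python
    # Minimal sketch: impose Toeplitz structure on the sample covariance by
    # averaging its subdiagonals, then build a signal-subspace projector from
    # the leading eigenvectors.
    import numpy as np

    def toeplitz_average(R):
        """Replace each subdiagonal of R with its average value."""
        n = R.shape[0]
        T = np.zeros_like(R)
        for k in range(n):
            avg = np.mean(np.diagonal(R, offset=k))
            idx = np.arange(n - k)
            T[idx, idx + k] = avg
            T[idx + k, idx] = np.conj(avg)
        return T

    def signal_subspace_projector(R, n_sources):
        eigvals, eigvecs = np.linalg.eigh(R)
        Us = eigvecs[:, -n_sources:]            # leading eigenvectors
        return Us @ Us.conj().T

    # Hypothetical scenario: a 40-element uniform line array, two closely spaced
    # far-field sources, and only 10 snapshots (snapshot-starved regime).
    rng = np.random.default_rng(11)
    n, snapshots = 40, 10
    def steering(sin_theta):
        return np.exp(1j * np.pi * np.arange(n) * sin_theta)

    A = np.column_stack([steering(0.00), steering(0.05)])
    S = rng.normal(size=(2, snapshots)) + 1j * rng.normal(size=(2, snapshots))
    noise = 0.3 * (rng.normal(size=(n, snapshots)) + 1j * rng.normal(size=(n, snapshots)))
    X = A @ S + noise
    R_sample = X @ X.conj().T / snapshots

    P = signal_subspace_projector(toeplitz_average(R_sample), n_sources=2)
    for label, v in [("source at sin(theta)=0.00", steering(0.00)),
                     ("off-source direction     ", steering(0.40))]:
        power = np.real(v.conj() @ P @ v) / n
        print(f"{label}: projected fraction = {power:.2f}")
    ```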

  15. Classical density functional theory and the phase-field crystal method using a rational function to describe the two-body direct correlation function.

    PubMed

    Pisutha-Arnond, N; Chan, V W L; Iyer, M; Gavini, V; Thornton, K

    2013-01-01

    We introduce a new approach to represent a two-body direct correlation function (DCF) in order to alleviate the computational demand of classical density functional theory (CDFT) and enhance the predictive capability of the phase-field crystal (PFC) method. The approach utilizes a rational function fit (RFF) to approximate the two-body DCF in Fourier space. We use the RFF to show that short-wavelength contributions of the two-body DCF play an important role in determining the thermodynamic properties of materials. We further show that using the RFF to empirically parametrize the two-body DCF allows us to obtain the thermodynamic properties of solids and liquids that agree with the results of CDFT simulations with the full two-body DCF without incurring significant computational costs. In addition, the RFF can also be used to improve the representation of the two-body DCF in the PFC method. Last, the RFF allows for a real-space reformulation of the CDFT and PFC method, which enables descriptions of nonperiodic systems and the use of nonuniform and adaptive grids.

  16. Retrieval of long and short lists from long term memory: a functional magnetic resonance imaging study with human subjects.

    PubMed

    Zysset, S; Müller, K; Lehmann, C; Thöne-Otto, A I; von Cramon, D Y

    2001-11-13

    Previous studies have shown that reaction time in an item-recognition task with both short and long lists is a quadratic function of list length. This suggests that either different memory retrieval processes are implied for short and long lists or an adaptive process is involved. An event-related functional magnetic resonance imaging study with nine subjects and list lengths varying between 3 and 18 words was conducted to identify the underlying neuronal structures of retrieval from long and short lists. For the retrieval and processing of word-lists a single fronto-parietal network, including premotor, left prefrontal, left precuneal and left parietal regions, was activated. With increasing list length, no additional regions became involved in retrieving information from long-term memory, suggesting that not necessarily different, but highly adaptive retrieval processes are involved.

  17. Age-related changes in gait adaptability in response to unpredictable obstacles and stepping targets.

    PubMed

    Caetano, Maria Joana D; Lord, Stephen R; Schoene, Daniel; Pelicioni, Paulo H S; Sturnieks, Daina L; Menant, Jasmine C

    2016-05-01

    A large proportion of falls in older people occur when walking. Limitations in gait adaptability might contribute to tripping, a frequently reported cause of falls in this group. The aim was to evaluate age-related changes in gait adaptability in response to obstacles or stepping targets presented at short notice, i.e., approximately two steps ahead. Fifty older adults (aged 74±7 years; 34 females) and 21 young adults (aged 26±4 years; 12 females) completed 3 usual gait speed (baseline) trials. They then completed the following randomly presented gait adaptability trials: obstacle avoidance, short stepping target, long stepping target and no target/obstacle (3 trials of each). Compared with the young adults, the older adults slowed significantly in the no target/obstacle trials relative to the baseline trials. They took more steps and spent more time in double support while approaching the obstacle and stepping targets, demonstrated poorer stepping accuracy and made more stepping errors (failed to hit the stepping targets/avoid the obstacle). The older adults also reduced velocity of the two preceding steps and shortened the previous step in the long stepping target condition and in the obstacle avoidance condition. Compared with their younger counterparts, the older adults exhibited a more conservative adaptation strategy characterised by slow, short and multiple steps with longer time in double support. Even so, they demonstrated poorer stepping accuracy and made more stepping errors. This reduced gait adaptability may place older adults at increased risk of falling when negotiating unexpected hazards. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Lawrence Livermore National Laboratory's Computer Security Short Subjects Videos: Hidden Password, The Incident, Dangerous Games and The Mess; Computer Security Awareness Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    A video on computer security is described. Lonnie Moore, the Computer Security Manager, CSSM/CPPM at Lawrence Livermore National Laboratory (LLNL) and Gale Warshawsky, the Coordinator for Computer Security Education and Awareness at LLNL, wanted to share topics such as computer ethics, software piracy, privacy issues, and protecting information in a format that would capture and hold an audience's attention. Four Computer Security Short Subject videos were produced which ranged from 1 to 3 minutes each. These videos are very effective education and awareness tools that can be used to generate discussions about computer security concerns and good computing practices.

  19. High-resolution ophthalmic imaging system

    DOEpatents

    Olivier, Scot S.; Carrano, Carmen J.

    2007-12-04

    A system for providing an improved resolution retina image comprising an imaging camera for capturing a retina image and a computer system operatively connected to the imaging camera, the computer producing short exposures of the retina image and providing speckle processing of the short exposures to provide the improved resolution retina image. The system comprises the steps of capturing a retina image, producing short exposures of the retina image, and speckle processing the short exposures of the retina image to provide the improved resolution retina image.
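
    The patent does not spell out its speckle-processing algorithm, but one elementary strategy for combining short exposures, shift-and-add registration via cross-correlation, can be sketched as follows. Frame sizes, the reference-frame choice, and the toy data are assumptions for illustration only.

      import numpy as np

      def shift_and_add(frames):
          """Register each short exposure to a reference frame by circular cross-correlation and average."""
          ref = frames[int(np.argmax([f.max() for f in frames]))]      # pick the brightest frame as reference
          F_ref = np.fft.fft2(ref)
          out = np.zeros_like(ref)
          for f in frames:
              xcorr = np.fft.ifft2(F_ref * np.conj(np.fft.fft2(f))).real
              dy, dx = np.unravel_index(int(np.argmax(xcorr)), xcorr.shape)
              out += np.roll(np.roll(f, dy, axis=0), dx, axis=1)       # undo the measured shift
          return out / len(frames)

      base = np.random.poisson(5.0, (64, 64)).astype(float)            # stand-in retina scene
      frames = [np.roll(base, s, axis=1) for s in range(4)]            # short exposures with different shifts
      result = shift_and_add(frames)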

  20. Method and apparatus for detection of catalyst failure on-board a motor vehicle using a dual oxygen sensor and an algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clemmens, W.B.; Koupal, J.W.; Sabourin, M.A.

    1993-07-20

    Apparatus is described for detecting motor vehicle exhaust gas catalytic converter deterioration comprising a first exhaust gas oxygen sensor adapted for communication with an exhaust stream before passage of the exhaust stream through a catalytic converter and a second exhaust gas oxygen sensor adapted for communication with the exhaust stream after passage of the exhaust stream through the catalytic converter, an on-board vehicle computational means, said computational means adapted to accept oxygen content signals from the before and after catalytic converter oxygen sensors and adapted to generate signal threshold values, said computational means adapted to compare over repeated time intervals the oxygen content signals to the signal threshold values and to store the output of the compared oxygen content signals, and in response after a specified number of time intervals for a specified mode of motor vehicle operation to determine and indicate a level of catalyst deterioration.
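
    In plain terms, the claim describes comparing both oxygen-sensor signals against threshold values over repeated intervals and declaring deterioration after enough intervals have been evaluated. The sketch below shows one common textbook realization of that idea, counting rich/lean threshold crossings of the upstream and downstream signals over an interval; it is an illustration rather than the patented algorithm, and the threshold and ratio limit are made-up values.

      def count_crossings(signal, threshold=0.45):
          """Count how often the sensor voltage crosses the rich/lean threshold."""
          return sum(1 for a, b in zip(signal, signal[1:])
                     if (a - threshold) * (b - threshold) < 0)

      def catalyst_deteriorated(pre_sensor, post_sensor, ratio_limit=0.7):
          """A healthy catalyst damps downstream switching, so a crossing ratio near 1 flags deterioration."""
          pre = count_crossings(pre_sensor)
          post = count_crossings(post_sensor)
          return pre > 0 and (post / pre) > ratio_limit

      # toy interval: upstream sensor switches every sample, downstream is heavily damped
      pre = [0.2, 0.7] * 50
      post = [0.45 + 0.01 * ((i // 25) % 2) for i in range(100)]
      print(catalyst_deteriorated(pre, post))      # -> False (catalyst judged healthy in this interval)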

  1. Adaptive subdomain modeling: A multi-analysis technique for ocean circulation models

    NASA Astrophysics Data System (ADS)

    Altuntas, Alper; Baugh, John

    2017-07-01

    Many coastal and ocean processes of interest operate over large temporal and geographical scales and require a substantial amount of computational resources, particularly when engineering design and failure scenarios are also considered. This study presents an adaptive multi-analysis technique that improves the efficiency of these computations when multiple alternatives are being simulated. The technique, called adaptive subdomain modeling, concurrently analyzes any number of child domains, with each instance corresponding to a unique design or failure scenario, in addition to a full-scale parent domain providing the boundary conditions for its children. To contain the altered hydrodynamics originating from the modifications, the spatial extent of each child domain is adaptively adjusted during runtime depending on the response of the model. The technique is incorporated in ADCIRC++, a re-implementation of the popular ADCIRC ocean circulation model with an updated software architecture designed to facilitate this adaptive behavior and to utilize concurrent executions of multiple domains. The results of our case studies confirm that the method substantially reduces computational effort while maintaining accuracy.

  2. Short-term adaptation and chronic cardiac remodelling to high altitude in lowlander natives and Himalayan Sherpa.

    PubMed

    Stembridge, Mike; Ainslie, Philip N; Shave, Rob

    2015-11-01

    What is the topic of this review? At high altitude, the cardiovascular system must adapt in order to meet the metabolic demand for oxygen. This review summarizes recent findings relating to short-term and life-long cardiac adaptation to high altitude in the context of exercise capacity. What advances does it highlight? Both Sherpa and lowlanders exhibit smaller left ventricular volumes at high altitude; however, myocardial relaxation, as evidenced by diastolic untwist, is reduced only in Sherpa, indicating that short-term hypoxia does not impair diastolic relaxation. Potential remodelling of systolic function, as evidenced by lower left ventricular systolic twist in Sherpa, may facilitate the requisite sea-level mechanical reserve required during exercise, although this remains to be confirmed. Both short-term and life-long high-altitude exposure challenge the cardiovascular system to meet the metabolic demand for O2 in a hypoxic environment. As the demand for O2 delivery increases during exercise, the circulatory component of oxygen transport is placed under additional stress. Acute adaptation and chronic remodelling of cardiac structure and function may occur to facilitate O2 delivery in lowlanders during sojourn to high altitude and in permanent highland residents. However, our understanding of cardiac structural and functional adaption in Sherpa remains confined to a higher maximal heart rate, lower pulmonary vascular resistance and no differences in resting cardiac output. Ventricular form and function are intrinsically linked through the left ventricular (LV) mechanics that facilitate efficient ejection, minimize myofibre stress during contraction and aid diastolic recoil. Recent examination of LV mechanics has allowed detailed insight into fundamental cardiac adaptation in high-altitude Sherpa. In this symposium report, we review recent advances in our understanding of LV function in both lowlanders and Sherpa at rest and discuss the potential consequences for exercise capacity. Collectively, data indicate chronic structural ventricular adaptation, with adult Sherpa having smaller absolute and relative LV size. Consistent with structural remodelling, cardiac mechanics also differ in Sherpa when compared with lowlanders at high altitude. These differences are characterized by a reduction in resting systolic deformation and slower diastolic untwisting, a surrogate of relaxation. These changes may reflect a functional cardiac adaptation that affords Sherpa the same mechanical reserve seen in lowlanders at sea level, which is absent when they ascend to high altitude. © 2014 The Authors. Experimental Physiology © 2014 The Physiological Society.

  3. Advances in Adaptive Control Methods

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan

    2009-01-01

    This poster presentation describes recent advances in adaptive control technology developed by NASA. Optimal Control Modification is a novel adaptive law that can improve performance and robustness of adaptive control systems. A new technique has been developed to provide an analytical method for computing time delay stability margin for adaptive control systems.

  4. Accurate typing of short tandem repeats from genome-wide sequencing data and its applications.

    PubMed

    Fungtammasan, Arkarachai; Ananda, Guruprasad; Hile, Suzanne E; Su, Marcia Shu-Wei; Sun, Chen; Harris, Robert; Medvedev, Paul; Eckert, Kristin; Makova, Kateryna D

    2015-05-01

    Short tandem repeats (STRs) are implicated in dozens of human genetic diseases and contribute significantly to genome variation and instability. Yet profiling STRs from short-read sequencing data is challenging because of their high sequencing error rates. Here, we developed STR-FM, short tandem repeat profiling using flank-based mapping, a computational pipeline that can detect the full spectrum of STR alleles from short-read data, can adapt to emerging read-mapping algorithms, and can be applied to heterogeneous genetic samples (e.g., tumors, viruses, and genomes of organelles). We used STR-FM to study STR error rates and patterns in publicly available human and in-house generated ultradeep plasmid sequencing data sets. We discovered that STRs sequenced with a PCR-free protocol have up to ninefold fewer errors than those sequenced with a PCR-containing protocol. We constructed an error correction model for genotyping STRs that can distinguish heterozygous alleles containing STRs with consecutive repeat numbers. Applying our model and pipeline to Illumina sequencing data with 100-bp reads, we could confidently genotype several disease-related long trinucleotide STRs. Utilizing this pipeline, for the first time we determined the genome-wide STR germline mutation rate from a deeply sequenced human pedigree. Additionally, we built a tool that recommends minimal sequencing depth for accurate STR genotyping, depending on repeat length and sequencing read length. The required read depth increases with STR length and is lower for a PCR-free protocol. This suite of tools addresses the pressing challenges surrounding STR genotyping, and thus is of wide interest to researchers investigating disease-related STRs and STR evolution. © 2015 Fungtammasan et al.; Published by Cold Spring Harbor Laboratory Press.

  5. Scaffolding and Integrated Assessment in Computer Assisted Learning (CAL) for Children with Learning Disabilities

    ERIC Educational Resources Information Center

    Beale, Ivan L.

    2005-01-01

    Computer assisted learning (CAL) can involve a computerised intelligent learning environment, defined as an environment capable of automatically, dynamically and continuously adapting to the learning context. One aspect of this adaptive capability involves automatic adjustment of instructional procedures in response to each learner's performance,…

  6. Parallel Algorithm Solves Coupled Differential Equations

    NASA Technical Reports Server (NTRS)

    Hayashi, A.

    1987-01-01

    Numerical methods adapted to concurrent processing. Algorithm solves set of coupled partial differential equations by numerical integration. Adapted to run on hypercube computer, algorithm separates problem into smaller problems solved concurrently. Increase in computing speed with concurrent processing over that achievable with conventional sequential processing appreciable, especially for large problems.

  7. ADAPTIVE-GRID SIMULATION OF GROUNDWATER FLOW IN HETEROGENEOUS AQUIFERS. (R825689C068)

    EPA Science Inventory

    The prediction of contaminant transport in porous media requires the computation of the flow velocity. This work presents a methodology for high-accuracy computation of flow in a heterogeneous isotropic formation, employing a dual-flow formulation and adaptive...

  8. Artificial Intelligence Methods in Computer-Based Instructional Design. The Minnesota Adaptive Instructional System.

    ERIC Educational Resources Information Center

    Tennyson, Robert

    1984-01-01

    Reviews educational applications of artificial intelligence and presents empirically-based design variables for developing a computer-based instruction management system. Taken from a programmatic research effort based on the Minnesota Adaptive Instructional System, variables include amount and sequence of instruction, display time, advisement,…

  9. SIMCAT 1.0: A SAS Computer Program for Simulating Computer Adaptive Testing

    ERIC Educational Resources Information Center

    Raiche, Gilles; Blais, Jean-Guy

    2006-01-01

    Monte Carlo methodologies are frequently applied to study the sampling distribution of the estimated proficiency level in adaptive testing. These methods eliminate real situational constraints. However, these Monte Carlo methodologies are not currently supported by the available software programs, and when these programs are available, their…

  10. Computer-Adaptive Testing in Second Language Contexts.

    ERIC Educational Resources Information Center

    Chalhoub-Deville, Micheline; Deville, Craig

    1999-01-01

    Provides a broad overview of computerized testing issues with an emphasis on computer-adaptive testing (CAT). A survey of the potential benefits and drawbacks of CAT are given, the process of CAT development is described; and some L2 instruments developed to assess various language skills are summarized. (Author/VWL)

  11. Dynamic grid refinement for partial differential equations on parallel computers

    NASA Technical Reports Server (NTRS)

    Mccormick, S.; Quinlan, D.

    1989-01-01

    The fast adaptive composite grid method (FAC) is an algorithm that uses various levels of uniform grids to provide adaptive resolution and fast solution of PDEs. An asynchronous version of FAC, called AFAC, that completely eliminates the bottleneck to parallelism is presented. This paper describes the advantage that this algorithm has in adaptive refinement for moving singularities on multiprocessor computers. This work is applicable to the parallel solution of two- and three-dimensional shock tracking problems.

  12. Unstructured mesh generation and adaptivity

    NASA Technical Reports Server (NTRS)

    Mavriplis, D. J.

    1995-01-01

    An overview of current unstructured mesh generation and adaptivity techniques is given. Basic building blocks taken from the field of computational geometry are first described. Various practical mesh generation techniques based on these algorithms are then constructed and illustrated with examples. Issues of adaptive meshing and stretched mesh generation for anisotropic problems are treated in subsequent sections. The presentation is organized in an educational manner, for readers familiar with computational fluid dynamics who wish to learn more about current unstructured mesh techniques.

  13. Estimating Skin Cancer Risk: Evaluating Mobile Computer-Adaptive Testing.

    PubMed

    Djaja, Ngadiman; Janda, Monika; Olsen, Catherine M; Whiteman, David C; Chien, Tsair-Wei

    2016-01-22

    Response burden is a major detriment to questionnaire completion rates. Computer adaptive testing may offer advantages over non-adaptive testing, including reduction of numbers of items required for precise measurement. Our aim was to compare the efficiency of non-adaptive (NAT) and computer adaptive testing (CAT) facilitated by Partial Credit Model (PCM)-derived calibration to estimate skin cancer risk. We used a random sample from a population-based Australian cohort study of skin cancer risk (N=43,794). All 30 items of the skin cancer risk scale were calibrated with the Rasch PCM. A total of 1000 cases generated following a normal distribution (mean [SD] 0 [1]) were simulated using three Rasch models with three fixed-item (dichotomous, rating scale, and partial credit) scenarios, respectively. We calculated the comparative efficiency and precision of CAT and NAT (shortening of questionnaire length and the count difference number ratio less than 5% using independent t tests). We found that use of CAT led to smaller person standard error of the estimated measure than NAT, with substantially higher efficiency but no loss of precision, reducing response burden by 48%, 66%, and 66% for dichotomous, Rating Scale Model, and PCM models, respectively. CAT-based administrations of the skin cancer risk scale could substantially reduce participant burden without compromising measurement precision. A mobile computer adaptive test was developed to help people efficiently assess their skin cancer risk.
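
    The efficiency gain of CAT over NAT comes from administering only the most informative remaining item at each step. The sketch below is a deliberately simplified dichotomous-Rasch CAT loop (the study itself used the Partial Credit Model with calibrated item parameters): it selects the unused item with maximum Fisher information at the current ability estimate, updates the estimate by Newton-Raphson, and stops at a target standard error. The item bank, stopping rule, and simulated respondent are assumptions for illustration.

      import math, random

      def prob(theta, b):
          """Rasch model: probability of endorsing an item of difficulty b at ability theta."""
          return 1.0 / (1.0 + math.exp(-(theta - b)))

      def run_cat(difficulties, answer, se_target=0.4, max_items=30):
          theta, administered, responses, info = 0.0, [], [], 0.0
          while len(administered) < max_items:
              remaining = [i for i in range(len(difficulties)) if i not in administered]
              def item_info(i, t=theta):
                  p = prob(t, difficulties[i])
                  return p * (1.0 - p)                   # Fisher information of a Rasch item
              nxt = max(remaining, key=item_info)        # adaptive step: most informative item at theta
              administered.append(nxt)
              responses.append(answer(difficulties[nxt]))
              for _ in range(10):                        # Newton-Raphson update of the ability estimate
                  ps = [prob(theta, difficulties[i]) for i in administered]
                  grad = sum(r - p for r, p in zip(responses, ps))
                  info = sum(p * (1.0 - p) for p in ps)
                  theta = max(-4.0, min(4.0, theta + grad / info))
              if 1.0 / math.sqrt(info) < se_target:      # stop once the estimate is precise enough
                  break
          return theta, administered

      random.seed(1)
      bank = [random.uniform(-2.0, 2.0) for _ in range(30)]
      true_theta = 0.8
      theta_hat, used = run_cat(bank, answer=lambda b: random.random() < prob(true_theta, b))
      print(round(theta_hat, 2), "items used:", len(used))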

  14. Adaptations in Electronic Structure Calculations in Heterogeneous Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Talamudupula, Sai

    Modern quantum chemistry deals with electronic structure calculations of unprecedented complexity and accuracy. They demand full power of high-performance computing and must be in tune with the given architecture for superior efficiency. To make such applications resource-aware, it is desirable to enable their static and dynamic adaptations using some external software (middleware), which may monitor both system availability and application needs, rather than mix science with system-related calls inside the application. The present work investigates scientific application interlinking with middleware based on the example of the computational chemistry package GAMESS and middleware NICAN. The existing synchronous model is limited by the possible delays due to the middleware processing time under the sustainable runtime system conditions. Proposed asynchronous and hybrid models aim at overcoming this limitation. When linked with NICAN, the fragment molecular orbital (FMO) method is capable of adapting statically and dynamically its fragment scheduling policy based on the computing platform conditions. Significant execution time and throughput gains have been obtained due to such static adaptations when the compute nodes have very different core counts. Dynamic adaptations are based on the main memory availability at run time. NICAN prompts FMO to postpone scheduling certain fragments, if there is not enough memory for their immediate execution. Hence, FMO may be able to complete the calculations whereas without such adaptations it aborts.

  15. Adaptive Tracking Control for Robots With an Interneural Computing Scheme.

    PubMed

    Tsai, Feng-Sheng; Hsu, Sheng-Yi; Shih, Mau-Hsiang

    2018-04-01

    Adaptive tracking control of mobile robots requires the ability to follow a trajectory generated by a moving target. The conventional analysis of adaptive tracking uses energy minimization to study the convergence and robustness of the tracking error when the mobile robot follows a desired trajectory. However, in the case that the moving target generates trajectories with uncertainties, a common Lyapunov-like function for energy minimization may be extremely difficult to determine. Here, to solve the adaptive tracking problem with uncertainties, we wish to implement an interneural computing scheme in the design of a mobile robot for behavior-based navigation. The behavior-based navigation adopts an adaptive plan of behavior patterns learning from the uncertainties of the environment. The characteristic feature of the interneural computing scheme is the use of neural path pruning with rewards and punishment interacting with the environment. On this basis, the mobile robot can be exploited to change its coupling weights in paths of neural connections systematically, which can then inhibit or enhance the effect of flow elimination in the dynamics of the evolutionary neural network. Such dynamical flow translation ultimately leads to robust sensory-to-motor transformations adapting to the uncertainties of the environment. A simulation result shows that the mobile robot with the interneural computing scheme can perform fault-tolerant behavior of tracking by maintaining suitable behavior patterns at high frequency levels.

  16. Translation, Validation, and Reliability of the Dutch Late-Life Function and Disability Instrument Computer Adaptive Test.

    PubMed

    Arensman, Remco M; Pisters, Martijn F; de Man-van Ginkel, Janneke M; Schuurmans, Marieke J; Jette, Alan M; de Bie, Rob A

    2016-09-01

    Adequate and user-friendly instruments for assessing physical function and disability in older adults are vital for estimating and predicting health care needs in clinical practice. The Late-Life Function and Disability Instrument Computer Adaptive Test (LLFDI-CAT) is a promising instrument for assessing physical function and disability in gerontology research and clinical practice. The aims of this study were: (1) to translate the LLFDI-CAT to the Dutch language and (2) to investigate its validity and reliability in a sample of older adults who spoke Dutch and dwelled in the community. For the assessment of validity of the LLFDI-CAT, a cross-sectional design was used. To assess reliability, measurement of the LLFDI-CAT was repeated in the same sample. The item bank of the LLFDI-CAT was translated with a forward-backward procedure. A sample of 54 older adults completed the LLFDI-CAT, World Health Organization Disability Assessment Schedule 2.0, RAND 36-Item Short-Form Health Survey physical functioning scale (10 items), and 10-Meter Walk Test. The LLFDI-CAT was repeated in 2 to 8 days (mean=4.5 days). Pearson's r and the intraclass correlation coefficient (ICC) (2,1) were calculated to assess validity, group-level reliability, and participant-level reliability. A correlation of .74 for the LLFDI-CAT function scale and the RAND 36-Item Short-Form Health Survey physical functioning scale (10 items) was found. The correlations of the LLFDI-CAT disability scale with the World Health Organization Disability Assessment Schedule 2.0 and the 10-Meter Walk Test were -.57 and -.53, respectively. The ICC (2,1) of the LLFDI-CAT function scale was .84, with a group-level reliability score of .85. The ICC (2,1) of the LLFDI-CAT disability scale was .76, with a group-level reliability score of .81. The high percentage of women in the study and the exclusion of older adults with recent joint replacement or hospitalization limit the generalizability of the results. The Dutch LLFDI-CAT showed strong validity and high reliability when used to assess physical function and disability in older adults dwelling in the community. © 2016 American Physical Therapy Association.

  17. SAGE - MULTIDIMENSIONAL SELF-ADAPTIVE GRID CODE

    NASA Technical Reports Server (NTRS)

    Davies, C. B.

    1994-01-01

    SAGE, Self Adaptive Grid codE, is a flexible tool for adapting and restructuring both 2D and 3D grids. Solution-adaptive grid methods are useful tools for efficient and accurate flow predictions. In supersonic and hypersonic flows, strong gradient regions such as shocks, contact discontinuities, shear layers, etc., require careful distribution of grid points to minimize grid error and produce accurate flow-field predictions. SAGE helps the user obtain more accurate solutions by intelligently redistributing (i.e. adapting) the original grid points based on an initial or interim flow-field solution. The user then computes a new solution using the adapted grid as input to the flow solver. The adaptive-grid methodology poses the problem in an algebraic, unidirectional manner for multi-dimensional adaptations. The procedure is analogous to applying tension and torsion spring forces proportional to the local flow gradient at every grid point and finding the equilibrium position of the resulting system of grid points. The multi-dimensional problem of grid adaption is split into a series of one-dimensional problems along the computational coordinate lines. The reduced one dimensional problem then requires a tridiagonal solver to find the location of grid points along a coordinate line. Multi-directional adaption is achieved by the sequential application of the method in each coordinate direction. The tension forces direct the redistribution of points to the strong gradient region. To maintain smoothness and a measure of orthogonality of grid lines, torsional forces are introduced that relate information between the family of lines adjacent to one another. The smoothness and orthogonality constraints are direction-dependent, since they relate only the coordinate lines that are being adapted to the neighboring lines that have already been adapted. Therefore the solutions are non-unique and depend on the order and direction of adaption. Non-uniqueness of the adapted grid is acceptable since it makes possible an overall and local error reduction through grid redistribution. SAGE includes the ability to modify the adaption techniques in boundary regions, which substantially improves the flexibility of the adaptive scheme. The vectorial approach used in the analysis also provides flexibility. The user has complete choice of adaption direction and order of sequential adaptions without concern for the computational data structure. Multiple passes are available with no restraint on stepping directions; for each adaptive pass the user can choose a completely new set of adaptive parameters. This facility, combined with the capability of edge boundary control, enables the code to individually adapt multi-dimensional multiple grids. Zonal grids can be adapted while maintaining continuity along the common boundaries. For patched grids, the multiple-pass capability enables complete adaption. SAGE is written in FORTRAN 77 and is intended to be machine independent; however, it requires a FORTRAN compiler which supports NAMELIST input. It has been successfully implemented on Sun series computers, SGI IRIS's, DEC MicroVAX computers, HP series computers, the Cray YMP, and IBM PC compatibles. Source code is provided, but no sample input and output files are provided. The code reads three datafiles: one that contains the initial grid coordinates (x,y,z), one that contains corresponding flow-field variables, and one that contains the user control parameters. 
It is assumed that the first two datasets are formatted as defined in the plotting software package PLOT3D. Several machine versions of PLOT3D are available from COSMIC. The amount of main memory is dependent on the size of the matrix. The standard distribution medium for SAGE is a 5.25 inch 360K MS-DOS format diskette. It is also available on a .25 inch streaming magnetic tape cartridge in UNIX tar format or on a 9-track 1600 BPI ASCII CARD IMAGE format magnetic tape. SAGE was developed in 1989, first released as a 2D version in 1991 and updated to 3D in 1993.
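
    The spring-analogy redistribution along a single coordinate line reduces to a small tridiagonal linear system. The sketch below is not SAGE; it only illustrates one pass of gradient-weighted point redistribution in one dimension, with an assumed stiffness law and parameters, and it omits the torsion terms, boundary-region control, and the iterative re-evaluation of weights at the new point locations.

      import numpy as np

      def adapt_line(x, f, alpha=5.0):
          """Redistribute points so intervals with steep gradients (stiff springs) become shorter."""
          n = len(x)
          grad = np.abs(np.gradient(f, x))
          w = 1.0 + alpha * grad / grad.max()           # tension-spring stiffness at each point
          wmid = 0.5 * (w[:-1] + w[1:])                 # stiffness of each interval
          A = np.zeros((n, n))
          b = np.zeros(n)
          A[0, 0] = A[-1, -1] = 1.0                     # endpoints stay fixed
          b[0], b[-1] = x[0], x[-1]
          for i in range(1, n - 1):                     # equilibrium: w_{i-1/2}(x_i - x_{i-1}) = w_{i+1/2}(x_{i+1} - x_i)
              A[i, i - 1] = wmid[i - 1]
              A[i, i] = -(wmid[i - 1] + wmid[i])
              A[i, i + 1] = wmid[i]
          return np.linalg.solve(A, b)                  # tridiagonal system, solved densely for brevity

      x = np.linspace(0.0, 1.0, 41)
      f = np.tanh(40.0 * (x - 0.5))                     # a shock-like gradient at x = 0.5
      x_new = adapt_line(x, f)                          # points cluster near the steep gradient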

  18. Evaluation of electrosurgical interference to low-power spread-spectrum local area net transceivers.

    PubMed

    Gibby, G L; Schwab, W K; Miller, W C

    1997-11-01

    To study whether an electrosurgery device interferes with the operation of a low-power spread-spectrum wireless network adapter. Nonrandomized, unblinded trials with controls, conducted in the corridor of our institution's operating suite using two portable computers equipped with RoamAbout omnidirectional 250 mW spread-spectrum 928 MHz wireless network adapters. To simulate high power electrosurgery interference, a 100-watt continuous electrocoagulation arc was maintained five feet from the receiving adapter, while device reported signal to noise values were measured at 150 feet and 400 feet distance between the wireless-networked computers. At 150 feet range, and with continuous 100-watt electrocoagulation arc five feet from one computer, error-corrected local area net throughput was measured by sending and receiving a large file multiple times. The reported signal to noise (N = 50) decreased with electrocoagulation from 36.42+/-3.47 (control) to 31.85+/-3.64 (electrocoagulation) (p < 0.001) at 400 feet inter-adapter distance, and from 64.53+/-1.43 (control) to 60.12+/-3.77 (electrocoagulation) (p < 0.001) at 150 feet inter-adapter distance. There was no statistically significant change in network throughput (average 93 kbyte/second) at 150 feet inter-adapter distance, either transmitting or receiving during continuous 100 Watt electrocoagulation arc. The manufacturer indicates "acceptable" performance will be obtained with signal to noise values as low as 20. In view of this, while electrocoagulation affects this spread spectrum network adapter, the effects are small even at 400 feet. At a distance of 150 feet, no discernible effect on network communications was found, suggesting that if other obstructions are minimal, within a wide range on one floor of an operating suite, network communications may be maintained using the technology of this wireless spread spectrum network adapter. The impact of such adapters on cardiac pacemakers should be studied. Wireless spread spectrum network adapters are an attractive technology for mobile computer communications in the operating room.

  19. Short Films--Sure Winners: A Select List of the Best of the Recent Releases.

    ERIC Educational Resources Information Center

    Reingoldas, Elena; And Others

    1979-01-01

    Lists and offers brief descriptions of 47 recently released short films, arranged in the following categories: literary adaptations, profiles and personal growth, energy and technology, upbeat and positive, and human concerns. (FL)

  20. Short-Term Memory in Habituation and Dishabituation

    ERIC Educational Resources Information Center

    Whitlow, Jesse William, Jr.

    1975-01-01

    The present research evaluated the refractorylike response decrement, as found in habituation of auditory evoked peripheral vasoconstriction in rabbits, to determine whether or not it represents a short-term habituation process distinct from effector fatigue or sensory adaptation. (Editor)

  1. Reconfigurable environmentally adaptive computing

    NASA Technical Reports Server (NTRS)

    Coxe, Robin L. (Inventor); Galica, Gary E. (Inventor)

    2008-01-01

    Described are methods and apparatus, including computer program products, for reconfigurable environmentally adaptive computing technology. An environmental signal representative of an external environmental condition is received. A processing configuration is automatically selected, based on the environmental signal, from a plurality of processing configurations. A reconfigurable processing element is reconfigured to operate according to the selected processing configuration. In some examples, the environmental condition is detected and the environmental signal is generated based on the detected condition.

  2. Adaptive-optics optical coherence tomography processing using a graphics processing unit.

    PubMed

    Shafer, Brandon A; Kriske, Jeffery E; Kocaoglu, Omer P; Turner, Timothy L; Liu, Zhuolin; Lee, John Jaehwan; Miller, Donald T

    2014-01-01

    Graphics processing units are increasingly being used for scientific computing for their powerful parallel processing abilities and moderate price compared to supercomputers and computing grids. In this paper we have used a general purpose graphics processing unit to process adaptive-optics optical coherence tomography (AOOCT) images in real time. Increasing the processing speed of AOOCT is an essential step in moving this super-high-resolution technology closer to clinical viability.

  3. Measuring resilience after spinal cord injury: Development, validation and psychometric characteristics of the SCI-QOL Resilience item bank and short form.

    PubMed

    Victorson, David; Tulsky, David S; Kisala, Pamela A; Kalpakjian, Claire Z; Weiland, Brian; Choi, Seung W

    2015-05-01

    To describe the development and psychometric properties of the Spinal Cord Injury--Quality of Life (SCI-QOL) Resilience item bank and short form. Using a mixed-methods design, we developed and tested a resilience item bank through the use of focus groups with individuals with SCI and clinicians with expertise in SCI, cognitive interviews, and item-response theory based analytic approaches, including tests of model fit and differential item functioning (DIF). We tested a 32-item pool at several medical institutions across the United States, including the University of Michigan, Kessler Foundation, the Rehabilitation Institute of Chicago, the University of Washington, Craig Hospital and the James J. Peters/Bronx Department of Veterans Affairs medical center. A total of 717 individuals with SCI completed the Resilience items. A unidimensional model was observed (CFI=0.968; RMSEA=0.074) and measurement precision was good (theta range between -3.1 and 0.9). Ten items were flagged for DIF, however, after examination of effect sizes we found this to be negligible with little practical impact on score estimates. The final calibrated item bank resulted in 21 retained items. This study indicates that the SCI-QOL Resilience item bank represents a psychometrically robust measurement tool. Short form items are also suggested and computer adaptive tests are available.

  4. Measuring resilience after spinal cord injury: Development, validation and psychometric characteristics of the SCI-QOL Resilience item bank and short form

    PubMed Central

    Victorson, David; Tulsky, David S.; Kisala, Pamela A.; Kalpakjian, Claire Z.; Weiland, Brian; Choi, Seung W.

    2015-01-01

    Objective To describe the development and psychometric properties of the Spinal Cord Injury - Quality of Life (SCI-QOL) Resilience item bank and short form. Design Using a mixed-methods design, we developed and tested a resilience item bank through the use of focus groups with individuals with SCI and clinicians with expertise in SCI, cognitive interviews, and item-response theory based analytic approaches, including tests of model fit and differential item functioning (DIF). Setting We tested a 32-item pool at several medical institutions across the United States, including the University of Michigan, Kessler Foundation, the Rehabilitation Institute of Chicago, the University of Washington, Craig Hospital and the James J. Peters/Bronx Department of Veterans Affairs medical center. Participants A total of 717 individuals with SCI completed the Resilience items. Results A unidimensional model was observed (CFI = 0.968; RMSEA = 0.074) and measurement precision was good (theta range between −3.1 and 0.9). Ten items were flagged for DIF, however, after examination of effect sizes we found this to be negligible with little practical impact on score estimates. The final calibrated item bank resulted in 21 retained items. Conclusion This study indicates that the SCI-QOL Resilience item bank represents a psychometrically robust measurement tool. Short form items are also suggested and computer adaptive tests are available. PMID:26010971

  5. Adaptive Instrument Module: Space Instrument Controller "Brain" through Programmable Logic Devices

    NASA Technical Reports Server (NTRS)

    Darrin, Ann Garrison; Conde, Richard; Chern, Bobbie; Luers, Phil; Jurczyk, Steve; Mills, Carl; Day, John H. (Technical Monitor)

    2001-01-01

    The Adaptive Instrument Module (AIM) will be the first true demonstration of reconfigurable computing with field-programmable gate arrays (FPGAs) in space, enabling the 'brain' of the system to evolve or adapt to changing requirements. In partnership with NASA Goddard Space Flight Center and the Australian Cooperative Research Centre for Satellite Systems (CRC-SS), APL has built the flight version to be flown on the Australian university-class satellite FEDSAT. The AIM provides satellites the flexibility to adapt to changing mission requirements by reconfiguring standardized processing hardware rather than incurring the large costs associated with new builds. This ability to reconfigure the processing in response to changing mission needs leads to true evolveable computing, wherein the instrument 'brain' can learn from new science data in order to perform state-of-the-art data processing. The development of the AIM is significant in its enormous potential to reduce total life-cycle costs for future space exploration missions. The advent of RAM-based FPGAs whose configuration can be changed at any time has enabled the development of the AIM for processing tasks that could not be performed in software. The use of the AIM enables reconfiguration of the FPGA circuitry while the spacecraft is in flight, with many accompanying advantages. The AIM demonstrates the practicalities of using reconfigurable computing hardware devices by conducting a series of designed experiments. These include the demonstration of implementing data compression, data filtering, and communication message processing and inter-experiment data computation. The second generation is the Adaptive Processing Template (ADAPT) which is further described in this paper. The next step forward is to make the hardware itself adaptable and the ADAPT pursues this challenge by developing a reconfigurable module that will be capable of functioning efficiently in various applications. ADAPT will take advantage of radiation tolerant RAM-based field programmable gate array (FPGA) technology to develop a reconfigurable processor that combines the flexibility of a general purpose processor running software with the performance of application specific processing hardware for a variety of high performance computing applications.

  6. Communicating climate change adaptation information using web-based platforms

    NASA Astrophysics Data System (ADS)

    Karali, Eleni; Mattern, Kati

    2017-07-01

    To facilitate progress in climate change adaptation policy and practice, it is important not only to ensure the production of accurate, comprehensive and relevant information, but also the easy, timely and affordable access to it. This can contribute to better-informed decisions and improve the design and implementation of adaptation policies and other relevant initiatives. Web-based platforms can play an important role in communicating and distributing data, information and knowledge that become constantly available, reaching out to a large group of potential users. Indeed, in the last decade there has been an extensive increase in the number of platforms developed for this purpose in many fields, including climate change adaptation. This short paper concentrates on the web-based adaptation platforms developed in Europe. It provides an overview of the recently emerged landscape, examines the basic characteristics of a set of platforms that operate at national, transnational and European level, and discusses some of the key challenges related to their development, maintenance and overall management. Findings presented in this short paper are discussed in greater detail in the European Environment Agency Technical Report 'Overview of climate change adaptation platforms in Europe'.

  7. Mitigation and Adaptation within a Climate Policy Portfolio

    EPA Science Inventory

    An effective policy response to climate change will include, among other things, investments in lowering greenhouse gas emissions (mitigation), as well as short-term temporary (flow) and long-lived capital-intensive (stock) adaptation to climate change. A critical near-term ques...

  8. Water System Adaptation To Hydrological Changes: Module 7, Adaptation Principles and Considerations

    EPA Science Inventory

    This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...

  9. Synaptic plasticity in a recurrent neural network for versatile and adaptive behaviors of a walking robot.

    PubMed

    Grinke, Eduard; Tetzlaff, Christian; Wörgötter, Florentin; Manoonpong, Poramate

    2015-01-01

    Walking animals, like insects, with little neural computing can effectively perform complex behaviors. For example, they can walk around their environment, escape from corners/deadlocks, and avoid or climb over obstacles. While performing all these behaviors, they can also adapt their movements to deal with an unknown situation. As a consequence, they successfully navigate through their complex environment. The versatile and adaptive abilities are the result of an integration of several ingredients embedded in their sensorimotor loop. Biological studies reveal that the ingredients include neural dynamics, plasticity, sensory feedback, and biomechanics. Generating such versatile and adaptive behaviors for a many degrees-of-freedom (DOFs) walking robot is a challenging task. Thus, in this study, we present a bio-inspired approach to solve this task. Specifically, the approach combines neural mechanisms with plasticity, exteroceptive sensory feedback, and biomechanics. The neural mechanisms consist of adaptive neural sensory processing and modular neural locomotion control. The sensory processing is based on a small recurrent neural network consisting of two fully connected neurons. Online correlation-based learning with synaptic scaling is applied to adequately change the connections of the network. By doing so, we can effectively exploit neural dynamics (i.e., hysteresis effects and single attractors) in the network to generate different turning angles with short-term memory for a walking robot. The turning information is transmitted as descending steering signals to the neural locomotion control which translates the signals into motor actions. As a result, the robot can walk around and adapt its turning angle for avoiding obstacles in different situations. The adaptation also enables the robot to effectively escape from sharp corners or deadlocks. Using backbone joint control embedded in the locomotion control allows the robot to climb over small obstacles. Consequently, it can successfully explore and navigate in complex environments. We first tested our approach in a physical simulation environment and then applied it to our real biomechanical walking robot AMOSII with 19 DOFs to adaptively avoid obstacles and navigate in the real world.
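
    The sketch below gives a rough impression of the kind of mechanism described here; the network size matches the description (two fully connected neurons), but the learning-rule details, parameters, and sensory signals are assumptions, not the authors' controller.

      import numpy as np

      def step(state, inputs, W, bias):
          """One update of the two-neuron recurrent network."""
          return np.tanh(W @ state + inputs + bias)

      def learn(W, state, rate=0.05, scaling=0.01, target=1.0):
          """Online correlation-based (Hebbian) growth of the cross-connections plus synaptic scaling."""
          dW = rate * np.outer(state, state)
          np.fill_diagonal(dW, 0.0)                     # only the mutual connections are learned here
          W = W + dW
          W = W - scaling * (np.abs(W).sum(axis=1, keepdims=True) - target) * W   # scaling keeps weights bounded
          return W

      W = np.array([[1.0, -0.5],
                    [-0.5, 1.0]])                       # self-excitation with mutual inhibition
      bias = np.zeros(2)
      state = np.zeros(2)
      for t in range(200):
          sensors = np.array([np.sin(0.1 * t), np.cos(0.1 * t)])   # stand-in obstacle-sensor signals
          state = step(state, sensors, W, bias)
          W = learn(W, state)
      steering = state[0] - state[1]                    # the two outputs could be mapped to a turning angle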

  10. Synaptic plasticity in a recurrent neural network for versatile and adaptive behaviors of a walking robot

    PubMed Central

    Grinke, Eduard; Tetzlaff, Christian; Wörgötter, Florentin; Manoonpong, Poramate

    2015-01-01

    Walking animals, like insects, with little neural computing can effectively perform complex behaviors. For example, they can walk around their environment, escape from corners/deadlocks, and avoid or climb over obstacles. While performing all these behaviors, they can also adapt their movements to deal with an unknown situation. As a consequence, they successfully navigate through their complex environment. The versatile and adaptive abilities are the result of an integration of several ingredients embedded in their sensorimotor loop. Biological studies reveal that the ingredients include neural dynamics, plasticity, sensory feedback, and biomechanics. Generating such versatile and adaptive behaviors for a many degrees-of-freedom (DOFs) walking robot is a challenging task. Thus, in this study, we present a bio-inspired approach to solve this task. Specifically, the approach combines neural mechanisms with plasticity, exteroceptive sensory feedback, and biomechanics. The neural mechanisms consist of adaptive neural sensory processing and modular neural locomotion control. The sensory processing is based on a small recurrent neural network consisting of two fully connected neurons. Online correlation-based learning with synaptic scaling is applied to adequately change the connections of the network. By doing so, we can effectively exploit neural dynamics (i.e., hysteresis effects and single attractors) in the network to generate different turning angles with short-term memory for a walking robot. The turning information is transmitted as descending steering signals to the neural locomotion control which translates the signals into motor actions. As a result, the robot can walk around and adapt its turning angle for avoiding obstacles in different situations. The adaptation also enables the robot to effectively escape from sharp corners or deadlocks. Using backbone joint control embedded in the locomotion control allows the robot to climb over small obstacles. Consequently, it can successfully explore and navigate in complex environments. We first tested our approach in a physical simulation environment and then applied it to our real biomechanical walking robot AMOSII with 19 DOFs to adaptively avoid obstacles and navigate in the real world. PMID:26528176

  11. Investigation of Grid Adaptation to Reduce Computational Efforts for a 2-D Hydrogen-Fueled Dual-Mode Scramjet

    NASA Astrophysics Data System (ADS)

    Foo, Kam Keong

    A two-dimensional dual-mode scramjet flowpath is developed and evaluated using the ANSYS Fluent density-based flow solver with various computational grids. Results are obtained for fuel-off, fuel-on non-reacting, and fuel-on reacting cases at different equivalence ratios. A one-step global chemical kinetics hydrogen-air model is used in conjunction with the eddy-dissipation model. Coarse, medium and fine computational grids are used to evaluate grid sensitivity and to investigate a lack of grid independence. Different grid adaptation strategies are performed on the coarse grid in an attempt to emulate the solutions obtained from the finer grids. The goal of this study is to investigate the feasibility of using various mesh adaptation criteria to significantly decrease computational efforts for high-speed reacting flows.

  12. Static roll-tilt over 5 minutes locally distorts the internal estimate of direction of gravity.

    PubMed

    Tarnutzer, A A; Bockisch, C J; Straumann, D; Marti, S; Bertolini, G

    2014-12-01

    The subjective visual vertical (SVV) indicates perceived direction of gravity. Even in healthy human subjects, roll angle-dependent misestimations, roll overcompensation (A-effect, head-roll > 60° and <135°) and undercompensation (E-effect, head-roll < 60°), occur. Previously, we demonstrated that, after prolonged roll-tilt, SVV estimates when upright are biased toward the preceding roll position, which indicates that perceived vertical (PV) is shifted by the prior tilt (Tarnutzer AA, Bertolini G, Bockisch CJ, Straumann D, Marti S. PLoS One 8: e78079, 2013). Hypothetically, PV in any roll position could be biased toward the previous roll position. We asked whether such a "global" bias occurs or whether the bias is "local". The SVV of healthy human subjects (N = 9) was measured in nine roll positions (-120° to +120°, steps = 30°) after 5 min of roll-tilt in one of two adaptation positions (±90°) and compared with control trials without adaptation. After adapting, adjustments were shifted significantly (P < 0.05) toward the previous adaptation position for nearby roll-tilted positions (±30°, ±60°) and upright only. We computationally simulated errors based on the sum of a monotonically increasing function (producing roll undercompensation) and a mixture of Gaussian functions (representing roll overcompensation centered around PV). In combination, the pattern of A- and E-effects could be generated. By shifting the function representing local overcompensation toward the adaptation position, the experimental postadaptation data could be fitted successfully. We conclude that prolonged roll-tilt locally distorts PV rather than globally shifting it. Short-term adaptation of roll overcompensation may explain these shifts and could reflect the brain's strategy to optimize SVV estimates around recent roll positions. Thus postural stability can be improved by visually-mediated compensatory responses at any sustained body-roll orientation. Copyright © 2014 the American Physiological Society.
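
    The error model described above, a monotonic undercompensation term plus Gaussian overcompensation components that can be shifted toward the adaptation position, can be written down compactly. The functional forms, amplitudes, and widths below are assumptions for illustration, not the study's fitted parameters.

      import numpy as np

      def svv_error(roll_deg, under_gain=0.15, over_amp=25.0, center=90.0, width=40.0, shift=0.0):
          """SVV error = monotonic undercompensation (E-effect) + local overcompensation bumps (A-effect)."""
          roll = np.asarray(roll_deg, dtype=float)
          under = under_gain * roll
          gauss = lambda mu: np.exp(-((roll - mu) ** 2) / (2.0 * width ** 2))
          over = -over_amp * (gauss(center + shift) - gauss(-center + shift))
          return under + over

      rolls = np.arange(-120, 121, 30)
      baseline = svv_error(rolls)                       # control condition, bumps centred near +/-90 deg
      post_adapt = svv_error(rolls, shift=15.0)         # bumps nudged toward the +90 deg adaptation position
      print(np.round(post_adapt - baseline, 1))         # largest changes near the adaptation side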

  13. Adapting the serial Alpgen parton-interaction generator to simulate LHC collisions on millions of parallel threads

    NASA Astrophysics Data System (ADS)

    Childers, J. T.; Uram, T. D.; LeCompte, T. J.; Papka, M. E.; Benjamin, D. P.

    2017-01-01

    As the LHC moves to higher energies and luminosity, the demand for computing resources increases accordingly and will soon outpace the growth of the Worldwide LHC Computing Grid. To meet this greater demand, event generation Monte Carlo was targeted for adaptation to run on Mira, the supercomputer at the Argonne Leadership Computing Facility. Alpgen is a Monte Carlo event generation application that is used by LHC experiments in the simulation of collisions that take place in the Large Hadron Collider. This paper details the process by which Alpgen was adapted from a single-processor serial-application to a large-scale parallel-application and the performance that was achieved.

  14. Grid adaption based on modified anisotropic diffusion equations formulated in the parametric domain

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagmeijer, R.

    1994-11-01

    A new grid-adaption algorithm for problems in computational fluid dynamics is presented. The basic equations are derived from a variational problem formulated in the parametric domain of the mapping that defines the existing grid. Modification of the basic equations provides desirable properties in boundary layers. The resulting modified anisotropic diffusion equations are solved for the computational coordinates as functions of the parametric coordinates and these functions are numerically inverted. Numerical examples show that the algorithm is robust, that shocks and boundary layers are well-resolved on the adapted grid, and that the flow solution becomes a globally smooth function of the computational coordinates.

  15. Adaptive Core Simulation Employing Discrete Inverse Theory - Part I: Theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abdel-Khalik, Hany S.; Turinsky, Paul J.

    2005-07-15

    Use of adaptive simulation is intended to improve the fidelity and robustness of important core attribute predictions such as core power distribution, thermal margins, and core reactivity. Adaptive simulation utilizes a selected set of past and current reactor measurements of reactor observables, i.e., in-core instrumentation readings, to adapt the simulation in a meaningful way. A meaningful adaption will result in high-fidelity and robust adapted core simulator models. To perform adaption, we propose an inverse theory approach in which the multitudes of input data to core simulators, i.e., reactor physics and thermal-hydraulic data, are to be adjusted to improve agreement with measured observables while keeping core simulator models unadapted. At first glance, devising such adaption for typical core simulators with millions of input and observables data would spawn not only several prohibitive challenges but also numerous disparaging concerns. The challenges include the computational burdens of the sensitivity-type calculations required to construct Jacobian operators for the core simulator models. Also, the computational burdens of the uncertainty-type calculations required to estimate the uncertainty information of core simulator input data present a demanding challenge. The concerns however are mainly related to the reliability of the adjusted input data. The methodologies of adaptive simulation are well established in the literature of data adjustment. We adopt the same general framework for data adjustment; however, we refrain from solving the fundamental adjustment equations in a conventional manner. We demonstrate the use of our so-called Efficient Subspace Methods (ESMs) to overcome the computational and storage burdens associated with the core adaption problem. We illustrate the successful use of ESM-based adaptive techniques for a typical boiling water reactor core simulator adaption problem.
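
    The underlying data-adjustment step can be summarized as a generalized least-squares update that balances prior input uncertainty against measurement uncertainty through the Jacobian. The sketch below shows that conventional formulation on a toy problem; it is not the Efficient Subspace Method, and the Jacobian, covariances, and residuals are illustrative assumptions.

      import numpy as np

      def adjust_inputs(x0, J, residual, C_x, C_m):
          """Minimize dx'*inv(C_x)*dx + (J*dx - residual)'*inv(C_m)*(J*dx - residual)."""
          lhs = np.linalg.inv(C_x) + J.T @ np.linalg.inv(C_m) @ J
          rhs = J.T @ np.linalg.inv(C_m) @ residual
          return x0 + np.linalg.solve(lhs, rhs)

      J = np.array([[1.0, 0.5, 0.0],
                    [0.0, 1.0, 2.0]])                   # sensitivities of 2 observables to 3 inputs
      x0 = np.zeros(3)                                  # prior (unadjusted) input data
      residual = np.array([0.4, -0.2])                  # measured minus predicted observables
      C_x = 0.1 * np.eye(3)                             # prior input covariance
      C_m = 0.01 * np.eye(2)                            # measurement covariance
      x_adj = adjust_inputs(x0, J, residual, C_x, C_m)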

  16. Endurance Exercise in Hypoxia, Hyperoxia and Normoxia: Mitochondrial and Global Adaptations.

    PubMed

    Przyklenk, Axel; Gutmann, Boris; Schiffer, Thorsten; Hollmann, Wildor; Strueder, Heiko K; Bloch, Wilhelm; Mierau, Andreas; Gehlert, Sebastian

    2017-07-01

    We hypothesized short-term endurance exercise (EN) in hypoxia (HY) to exert decreased mitochondrial adaptation, peak oxygen consumption (VO2peak) and peak power output (PPO) compared to EN in normoxia (NOR) and hyperoxia (PER). 11 male subjects performed repeated unipedal cycling EN in HY, PER, and NOR over 4 weeks in a cross-over design. VO2peak, PPO, rate of perceived exertion (RPE) and blood lactate (Bla) were determined pre- and post-intervention to assess physiological demands and adaptation. Skeletal muscle biopsies were collected to determine molecular mitochondrial signaling and adaptation. Despite reduced exercise intensity (P<0.05), increased Bla and RPE levels in HY revealed higher metabolic load compared to PER (P<0.05) and NOR (n.s.). PPO increased in all groups (P<0.05) while VO2peak and mitochondrial signaling were unchanged (P>0.05). Electron transport chain complexes tended to increase in all groups with the highest increase in HY (n.s.). EN-induced mitochondrial adaptability and exercise capacity neither decreased significantly in HY nor increased in PER compared to NOR. Despite decreased exercise intensity, short term EN under HY may not necessarily impair mitochondrial adaptation and exercise capacity while PER does not augment adaptation. HY might strengthen adaptive responses under circumstances when absolute training intensity has to be reduced. © Georg Thieme Verlag KG Stuttgart · New York.

  17. Whole-genome resequencing of Escherichia coli K-12 MG1655 undergoing short-term laboratory evolution in lactate minimal media reveals flexible selection of adaptive mutations

    PubMed Central

    2009-01-01

    Background Short-term laboratory evolution of bacteria followed by genomic sequencing provides insight into the mechanism of adaptive evolution, such as the number of mutations needed for adaptation, genotype-phenotype relationships, and the reproducibility of adaptive outcomes. Results In the present study, we describe the genome sequencing of 11 endpoints of Escherichia coli that underwent 60-day laboratory adaptive evolution under growth rate selection pressure in lactate minimal media. Two to eight mutations were identified per endpoint. Generally, each endpoint acquired mutations to different genes. The most notable exception was an 82 base-pair deletion in the rph-pyrE operon that appeared in 7 of the 11 adapted strains. This mutation conferred an approximately 15% increase to the growth rate when experimentally introduced to the wild-type background and resulted in an approximately 30% increase to growth rate when introduced to a background already harboring two adaptive mutations. Additionally, most endpoints had a mutation in a regulatory gene (crp or relA, for example) or the RNA polymerase. Conclusions The 82 base-pair deletion found in the rph-pyrE operon of many endpoints may function to relieve a pyrimidine biosynthesis defect present in MG1655. In contrast, a variety of regulators acquire mutations in the different endpoints, suggesting flexibility in overcoming regulatory challenges in the adaptation. PMID:19849850

  18. Disturbance-Adaptive Short-Term Frequency Support of a DFIG Associated With the Variable Gain Based on the ROCOF and Rotor Speed

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hwang, Min; Muljadi, Eduard; Jang, Gilsoo

    This paper proposes a disturbance-adaptive short-term frequency support scheme of a doubly fed induction generator (DFIG) that can improve the frequency-supporting capability while ensuring stable operation. In the proposed scheme, the output of the additional control loop is determined as the product of the frequency deviation and adaptive gain, which is modified depending on the rate of change of frequency (ROCOF) and rotor speed. To achieve these objectives, the adaptive gain is set to be high during the early stage of a disturbance, when the ROCOF and rotor speed are high. Until the frequency nadir (FN), the gain decreases with the ROCOF and rotor speed. After the FN, the gain decreases only with the rotor speed. The simulation results demonstrate that the proposed scheme improves the FN and maximum ROCOF while ensuring the stable operation of a DFIG under various wind conditions irrespective of the disturbance conditions by adaptively changing the control gain with the ROCOF and rotor speed, even if the wind speed decreases and a consecutive disturbance occurs.
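
    As an illustration of the control idea, a minimal sketch of a disturbance-adaptive gain is given below; the functional form and all constants are assumptions chosen to mimic the behavior described in the abstract (large gain early in the disturbance, decaying with ROCOF and rotor speed before the nadir, and with rotor speed only after it), not the controller used in the paper.

        def support_power(freq_dev, rocof, rotor_speed, before_nadir,
                          k_rocof=2.0, k_speed=1.5, w_min=0.7):
            # Adaptive gain: large in the early stage of a disturbance (high ROCOF,
            # high rotor speed), shrinking as both fall; after the frequency nadir it
            # decays with rotor speed only, preserving stable operation.
            speed_margin = max(rotor_speed - w_min, 0.0)   # usable kinetic-energy headroom
            if before_nadir:
                gain = k_rocof * abs(rocof) + k_speed * speed_margin
            else:
                gain = k_speed * speed_margin
            return gain * freq_dev   # additional active-power command of the DFIG

        # Early in a disturbance: large ROCOF, rotor near rated speed
        p_extra = support_power(freq_dev=-0.3, rocof=-0.5, rotor_speed=1.2, before_nadir=True)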

  19. Translation and Cultural Adaptation of the Short-Form Food Frequency Questionnaire for Pregnancy into Brazilian Portuguese.

    PubMed

    Kasawara, Karina Tamy; Paulino, Daiane S M; Bgeginski, Roberta; Cleghorn, Chistine L; Mottola, Michelle F; Surita, Fernanda Garanhani

    2018-05-18

    To translate and culturally adapt the short-form Food Frequency Questionnaire (SFFFQ) for pregnant women, which contains 24 questions, into Brazilian Portuguese. Description of the process of translation and cultural adaptation of the SFFFQ into Brazilian Portuguese. The present study followed the recommendation of the International Society for Pharmacoeconomics and Outcomes Research for translation and cultural adaptation with the following steps: 1) preparation; 2) first translation; 3) reconciliation; 4) back translation; 5) revision of back translation; 6) harmonization; 7) cognitive debriefing; 8) revision of debriefing results; 9) syntax and orthographic revision; and 10) final report. Five obstetricians, five dietitians and five pregnant women were interviewed to contribute with the language content of the SFFFQ. Few changes were made to the SFFFQ compared with the original version. These changes were discussed with the research team, and differences in language were adapted to suit all regions of Brazil. The SFFFQ translated to Brazilian Portuguese can now be validated for use in the Brazilian population. Thieme Revinter Publicações Ltda Rio de Janeiro, Brazil.

  20. Leisure activities in Prader-Willi syndrome: implications for health, cognition and adaptive functioning.

    PubMed

    Dykens, Elisabeth M

    2014-02-01

    Although hyperphagia and compulsivity in Prader-Willi syndrome (PWS) are well described, recreation and adaptive skills are relatively unexplored. Parents of 123 participants with PWS (4-48 years) completed measures of their child's adaptive, recreation, and problem behaviors. Offspring received cognitive testing. Watching TV was the most frequent recreational activity, and was associated with compulsivity and skin picking. BMIs were negatively correlated with physical play, and highest in those who watched TV and played computer games. Computer games and physical activities were associated with higher IQ and adaptive scores. People with PWS and other disabilities need to watch less TV and be more engaged in physical activities, games, and leisure pursuits that are fun, and may bring cognitive or adaptive advantages.

  1. A computer simulation of an adaptive noise canceler with a single input

    NASA Astrophysics Data System (ADS)

    Albert, Stuart D.

    1991-06-01

    A description of an adaptive noise canceler using Widrow's LMS algorithm is presented. A computer simulation of canceler performance (adaptive convergence time and frequency transfer function) was written for use as a design tool. The simulations, assumptions, and input parameters are described in detail. The simulation is used in a design example to predict the performance of an adaptive noise canceler in the simultaneous presence of both strong and weak narrow-band signals (a cosited frequency hopping radio scenario). On the basis of the simulation results, it is concluded that the simulation is suitable for use as an adaptive noise canceler design tool; i.e., it can be used to evaluate the effect of design parameter changes on canceler performance.
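
    A minimal sketch of the kind of LMS canceler the abstract refers to is shown below. Because only a single input is available, it is written as an adaptive line enhancer in which a delayed copy of the input serves as the reference; the filter length, step size, and delay are illustrative assumptions, and this is not the simulation code described above.

        import numpy as np

        def lms_line_enhancer(x, n_taps=32, mu=0.01, delay=1):
            # Single-input adaptive noise canceler (adaptive line enhancer) using the
            # LMS update: the delayed input is the reference, so narrow-band components
            # are predicted (output y) while broadband noise remains in the error e.
            w = np.zeros(n_taps)
            y = np.zeros_like(x)
            e = np.zeros_like(x)
            for n in range(n_taps + delay, len(x)):
                ref = x[n - delay - n_taps + 1: n - delay + 1][::-1]  # delayed tap vector
                y[n] = w @ ref                # prediction of narrow-band content
                e[n] = x[n] - y[n]            # residual (broadband part)
                w += 2 * mu * e[n] * ref      # Widrow-Hoff LMS weight update
            return y, e

        # Example: a sinusoid buried in white noise
        t = np.arange(4000)
        x = np.sin(2 * np.pi * 0.05 * t) + 0.5 * np.random.randn(len(t))
        enhanced, residual = lms_line_enhancer(x)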

  2. The design and evaluation of a peripheral device for use with a computer game intended for children with motor disabilities.

    PubMed

    Scardovelli, Terigi Augusto; Frère, Annie France

    2015-01-01

    Many children with motor impairments cannot participate in the games and play activities that contribute to their development. Currently, there are few commercial software options and sufficiently flexible access devices that meet the needs of this group of children. In this study, a peripheral access device and a 3D computerized game that do not require the actions of dragging, clicking, or activating various keys at the same time were developed. The peripheral access device consists of a webcam and a supervisory system that processes the images. This method provides a field of action that can be adjusted to various types of motor impairments. To analyze the sensitivity of the commands, a virtual course was developed using the scenario of a path of straight lines and curves. A volunteer with good ability in virtual games performed a short training with the virtual course and, after 15 min of training, obtained similar results with a standard keyboard and the adapted peripheral device. A 3D game in the Amazon forest was developed using the Blender 3D tool. This free software was used to model the characters and scenarios. To evaluate the usability of the 3D game, the game was tested by 20 volunteers without motor impairments (group A) and 13 volunteers with severe motor limitations of the upper limbs (group B). All the volunteers (groups A and B) could easily execute all the actions of the game using the adapted peripheral device. The majority positively evaluated the questions of usability and expressed their satisfaction. The computerized game coupled to the adapted device will offer the option of leisure and learning to people with severe motor impairments who previously lacked this possibility. It also provided equality in this activity to all the users. Copyright © 2014. Published by Elsevier Ireland Ltd.

  3. Rangeland Rummy - a board game to support adaptive management of rangeland-based livestock systems.

    PubMed

    Farrié, B; Jouven, M; Launay, F; Moreau, J-C; Moulin, C-H; Piquet, M; Taverne, M; Tchakérian, E; Thénard, V; Martin, G

    2015-01-01

    Rangeland-based livestock systems have to deal with the significant instability and uncertainty of the agricultural context (policy changes, volatility of input prices, etc.), and especially of the climatic context. Thus, they are particularly concerned with adaptive management strategies. To support the development of such strategies, we developed a board game including a computer model called "Rangeland Rummy". It is to be used by groups of farmers and agricultural consultants in the context of short workshops (about 3 h). Rangeland Rummy builds upon five types of material objects: (i) a game board; (ii) a calendar stick indicating the starting date of the game board; (iii) sticks marked with the feed resources available for combinations of vegetation types and their management practices; (iv) cards to define animal groups and their feeding requirements throughout the year; (v) cards related to types of feed that can be attributed to animal groups throughout the year. Using these material objects, farmers collectively design a rangeland-based livestock system. This system is immediately evaluated using a computer model, i.e. a spreadsheet providing graphs and indicators on, among other things, the extent to which quantitative and qualitative animal feeding requirements are covered across the year. Playing the game thus consists of collectively and iteratively designing and evaluating rangeland-based livestock systems, while confronting the players with new contextual challenges (e.g. interannual variability of weather, volatility of input prices) or new farmers' objectives (e.g. being self-sufficient for animal feeding). An example of application of Rangeland Rummy with three farmers in southern France is reported. Applications show that it tends to develop farmers' adaptive capacity by stimulating their discussions and the exchange of locally-relevant knowledge on management strategies and practices in rangeland-based livestock systems. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Adaptive surrogate model based multi-objective transfer trajectory optimization between different libration points

    NASA Astrophysics Data System (ADS)

    Peng, Haijun; Wang, Wei

    2016-10-01

    An adaptive surrogate model-based multi-objective optimization strategy that combines the benefits of invariant manifolds and low-thrust control toward developing a low-computational-cost transfer trajectory between libration orbits around the L1 and L2 libration points in the Sun-Earth system has been proposed in this paper. A new structure for a multi-objective transfer trajectory optimization model that divides the transfer trajectory into several segments and assigns the dominant roles of invariant manifolds and low-thrust control in the different segments has been established. To reduce the computational cost of multi-objective transfer trajectory optimization, a mixed sampling strategy-based adaptive surrogate model has been proposed. Numerical simulations show that the results obtained from the adaptive surrogate-based multi-objective optimization are in agreement with the results obtained using direct multi-objective optimization methods, and the computational workload of the adaptive surrogate-based multi-objective optimization is only approximately 10% of that of direct multi-objective optimization. Furthermore, the generating efficiency of the Pareto points of the adaptive surrogate-based multi-objective optimization is approximately 8 times that of the direct multi-objective optimization. Therefore, the proposed adaptive surrogate-based multi-objective optimization provides obvious advantages over direct multi-objective optimization methods.

  5. Necessary, Yet Dissociable Contributions of the Insular and Ventromedial Prefrontal Cortices to Norm Adaptation: Computational and Lesion Evidence in Humans

    PubMed Central

    Gu, Xiaosi; Wang, Xingchao; Hula, Andreas; Wang, Shiwei; Xu, Shuai; Lohrenz, Terry M.; Knight, Robert T.; Gao, Zhixian; Dayan, Peter

    2015-01-01

    Social norms and their enforcement are fundamental to human societies. The ability to detect deviations from norms and to adapt to norms in a changing environment is therefore important to individuals' normal social functioning. Previous neuroimaging studies have highlighted the involvement of the insular and ventromedial prefrontal (vmPFC) cortices in representing norms. However, the necessity and dissociability of their involvement remain unclear. Using computational modeling and neuropsychological lesion approaches, we examined the contributions of the insula and vmPFC to norm adaptation in seven human patients with focal insula lesions and six patients with focal vmPFC lesions, in comparison with forty neurologically intact controls and six brain-damaged controls. There were three computational signals of interest as participants played a fairness game (ultimatum game): sensitivity to the fairness of offers, sensitivity to deviations from expected norms, and the speed at which people adapt to norms. Significant group differences were assessed using bootstrapping methods. Patients with insula lesions displayed abnormally low adaptation speed to norms, yet detected norm violations with greater sensitivity than controls. Patients with vmPFC lesions did not have such abnormalities, but displayed reduced sensitivity to fairness and were more likely to accept the most unfair offers. These findings provide compelling computational and lesion evidence supporting the necessary, yet dissociable roles of the insula and vmPFC in norm adaptation in humans: the insula is critical for learning to adapt when reality deviates from norm expectations, and the vmPFC is important for the valuation of fairness during social exchange. PMID:25589742
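
    A common way to formalize the three signals listed above is a Rescorla-Wagner-style update of an internal fairness norm together with a norm-dependent utility; the toy sketch below follows that idea with illustrative parameter names and an assumed utility form, and is not the authors' fitted model.

        def run_ultimatum_trials(offers, adaptation_rate=0.3, fairness_sens=1.0,
                                 initial_norm=0.5):
            # Norm-adaptation sketch for the ultimatum game: the internal fairness norm
            # is pulled toward each observed offer (norm prediction error), and the
            # utility of an offer is penalized when it falls below the current norm.
            norm = initial_norm
            history = []
            for offer in offers:
                prediction_error = offer - norm                      # deviation from expected norm
                utility = offer - fairness_sens * max(norm - offer, 0.0)
                norm += adaptation_rate * prediction_error           # norm adaptation step
                history.append((offer, prediction_error, utility, norm))
            return history

        trials = run_ultimatum_trials([0.5, 0.4, 0.2, 0.1, 0.3])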

  6. Why the leopard got its spots: relating pattern development to ecology in felids

    PubMed Central

    Allen, William L.; Cuthill, Innes C.; Scott-Samuel, Nicholas E.; Baddeley, Roland

    2011-01-01

    A complete explanation of the diversity of animal colour patterns requires an understanding of both the developmental mechanisms generating them and their adaptive value. However, only two previous studies, which involved computer-generated evolving prey, have attempted to make this link. This study examines variation in the camouflage patterns displayed on the flanks of many felids. After controlling for the effects of shared ancestry using a fully resolved molecular phylogeny, this study shows how phenotypes from plausible felid coat pattern generation mechanisms relate to ecology. We found that likelihood of patterning and pattern attributes, such as complexity and irregularity, were related to felids' habitats, arboreality and nocturnality. Our analysis also indicates that disruptive selection is a likely explanation for the prevalence of melanistic forms in Felidae. Furthermore, we show that there is little phylogenetic signal in the visual appearance of felid patterning, indicating that camouflage adapts to ecology over relatively short time scales. Our method could be applied to any taxon with colour patterns that can reasonably be matched to reaction–diffusion and similar models, where the kinetics of the reaction between two or more initially randomly dispersed morphogens determines the outcome of pattern development. PMID:20961899
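
    The pattern-generation mechanisms referred to above are reaction-diffusion models. The sketch below runs a standard Gray-Scott two-morphogen system as a stand-in; the parameter values (which select spots versus stripes) are illustrative and are not those matched to felid coats in the study.

        import numpy as np

        def laplacian(z):
            # Five-point Laplacian with periodic boundaries
            return (np.roll(z, 1, 0) + np.roll(z, -1, 0) +
                    np.roll(z, 1, 1) + np.roll(z, -1, 1) - 4 * z)

        def gray_scott(n=128, steps=5000, Du=0.16, Dv=0.08, F=0.035, k=0.065):
            # Two randomly seeded morphogens whose kinetics and diffusion rates
            # determine whether spots, stripes or blotches emerge.
            U = np.ones((n, n))
            V = np.zeros((n, n))
            rng = np.random.default_rng(1)
            seed = rng.random((n, n)) < 0.02     # sparse random initial perturbation
            U[seed], V[seed] = 0.5, 0.25
            for _ in range(steps):
                uvv = U * V * V
                U += Du * laplacian(U) - uvv + F * (1 - U)
                V += Dv * laplacian(V) + uvv - (F + k) * V
            return V                             # pattern field (e.g. pigment concentration)

        pattern = gray_scott()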

  7. Data-driven forecasting algorithms for building energy consumption

    NASA Astrophysics Data System (ADS)

    Noh, Hae Young; Rajagopal, Ram

    2013-04-01

    This paper introduces two forecasting methods for building energy consumption data that are recorded from smart meters in high resolution. For utility companies, it is important to reliably forecast the aggregate consumption profile to determine energy supply for the next day and prevent any crisis. The proposed methods involve forecasting individual loads on the basis of their measurement history and weather data, without using complicated models of the building system. The first method is most efficient for very short-term prediction, such as a prediction period of one hour, and uses a simple adaptive time-series model. For longer-term prediction, a nonparametric Gaussian process has been applied to forecast the load profiles and their uncertainty bounds to predict a day ahead. These methods are computationally simple and adaptive and thus suitable for analyzing a large set of data whose pattern changes over time. These forecasting methods are applied to several sets of building energy consumption data for lighting and heating-ventilation-air-conditioning (HVAC) systems collected from a campus building at Stanford University. The measurements are collected every minute, and corresponding weather data are provided hourly. The results show that the proposed algorithms can predict those energy consumption data with high accuracy.
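
    A minimal sketch of a "simple adaptive time-series model" for the one-hour-ahead case is given below, using an AR model whose coefficients are updated by exponentially weighted recursive least squares so they track changes in the consumption pattern; the model order, forgetting factor, and synthetic data are assumptions, and this is not the authors' exact formulation (nor the Gaussian-process day-ahead method).

        import numpy as np

        def adaptive_ar_forecast(load, order=3, forgetting=0.98):
            # One-step-ahead forecasts from an AR model fitted by exponentially
            # weighted recursive least squares (coefficients adapt over time).
            theta = np.zeros(order)                 # AR coefficients
            P = np.eye(order) * 1e3                 # inverse information matrix
            forecasts = np.full(len(load), np.nan)
            for t in range(order, len(load)):
                phi = load[t - order:t][::-1]       # most recent observations first
                forecasts[t] = theta @ phi          # predict before seeing load[t]
                err = load[t] - forecasts[t]
                gain = P @ phi / (forgetting + phi @ P @ phi)
                theta += gain * err                 # RLS coefficient update
                P = (P - np.outer(gain, phi @ P)) / forgetting
            return forecasts

        # Synthetic hourly load with a daily cycle plus noise
        hourly_load = 50 + 10 * np.sin(np.arange(500) * 2 * np.pi / 24) \
                      + np.random.randn(500)
        pred = adaptive_ar_forecast(hourly_load)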

  8. Computational Properties of the Hippocampus Increase the Efficiency of Goal-Directed Foraging through Hierarchical Reinforcement Learning

    PubMed Central

    Chalmers, Eric; Luczak, Artur; Gruber, Aaron J.

    2016-01-01

    The mammalian brain is thought to use a version of Model-based Reinforcement Learning (MBRL) to guide “goal-directed” behavior, wherein animals consider goals and make plans to acquire desired outcomes. However, conventional MBRL algorithms do not fully explain animals' ability to rapidly adapt to environmental changes, or learn multiple complex tasks. They also require extensive computation, suggesting that goal-directed behavior is cognitively expensive. We propose here that key features of processing in the hippocampus support a flexible MBRL mechanism for spatial navigation that is computationally efficient and can adapt quickly to change. We investigate this idea by implementing a computational MBRL framework that incorporates features inspired by computational properties of the hippocampus: a hierarchical representation of space, “forward sweeps” through future spatial trajectories, and context-driven remapping of place cells. We find that a hierarchical abstraction of space greatly reduces the computational load (mental effort) required for adaptation to changing environmental conditions, and allows efficient scaling to large problems. It also allows abstract knowledge gained at high levels to guide adaptation to new obstacles. Moreover, a context-driven remapping mechanism allows learning and memory of multiple tasks. Simulating dorsal or ventral hippocampal lesions in our computational framework qualitatively reproduces behavioral deficits observed in rodents with analogous lesions. The framework may thus embody key features of how the brain organizes model-based RL to efficiently solve navigation and other difficult tasks. PMID:28018203

  9. A self-adaptive-grid method with application to airfoil flow

    NASA Technical Reports Server (NTRS)

    Nakahashi, K.; Deiwert, G. S.

    1985-01-01

    A self-adaptive-grid method is described that is suitable for multidimensional steady and unsteady computations. Based on variational principles, a spring analogy is used to redistribute grid points in an optimal sense to reduce the overall solution error. User-specified parameters, denoting both maximum and minimum permissible grid spacings, are used to define the all-important constants, thereby minimizing the empiricism and making the method self-adaptive. Operator splitting and one-sided controls for orthogonality and smoothness are used to make the method practical, robust, and efficient. Examples are included for both steady and unsteady viscous flow computations about airfoils in two dimensions, as well as for a steady inviscid flow computation and a one-dimensional case. These examples illustrate the precise control the user has with the self-adaptive method and demonstrate a significant improvement in accuracy and quality of the solutions.
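
    The spring analogy described above can be illustrated in one dimension: cell "springs" stiffen where the solution gradient is large, pulling grid points toward those regions, while user-specified minimum and maximum spacings bound the adapted cells. The sketch below is such a 1D redistribution with illustrative constants, not the multidimensional method of the paper.

        import numpy as np

        def spring_adapt(x_ref, u_ref, n_pts=41, n_iter=200, ds_min=0.005, ds_max=0.08):
            # Redistribute n_pts grid points over the interval of (x_ref, u_ref):
            # stiffer springs (larger solution jumps) attract points, and the
            # user-specified min/max spacings keep the mesh well behaved.
            x = np.linspace(x_ref[0], x_ref[-1], n_pts)
            for _ in range(n_iter):
                u = np.interp(x, x_ref, u_ref)
                grad = np.abs(np.diff(u))
                stiffness = 1.0 + grad / (grad.mean() + 1e-12)
                # each interior node relaxes to the stiffness-weighted average of neighbours
                x[1:-1] = (stiffness[:-1] * x[:-2] + stiffness[1:] * x[2:]) / \
                          (stiffness[:-1] + stiffness[1:])
                # enforce spacing bounds, then rescale back onto the original interval
                ds = np.clip(np.diff(x), ds_min, ds_max)
                x = np.concatenate(([0.0], np.cumsum(ds)))
                x = x_ref[0] + x * (x_ref[-1] - x_ref[0]) / x[-1]
            return x

        # Example: cluster points around a steep tanh front
        x_ref = np.linspace(0.0, 1.0, 2001)
        u_ref = np.tanh((x_ref - 0.5) / 0.02)
        x_adapted = spring_adapt(x_ref, u_ref)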

  10. Wavefront measurement using computational adaptive optics.

    PubMed

    South, Fredrick A; Liu, Yuan-Zhi; Bower, Andrew J; Xu, Yang; Carney, P Scott; Boppart, Stephen A

    2018-03-01

    In many optical imaging applications, it is necessary to correct for aberrations to obtain high quality images. Optical coherence tomography (OCT) provides access to the amplitude and phase of the backscattered optical field for three-dimensional (3D) imaging samples. Computational adaptive optics (CAO) modifies the phase of the OCT data in the spatial frequency domain to correct optical aberrations without using a deformable mirror, as is commonly done in hardware-based adaptive optics (AO). This provides improvement of image quality throughout the 3D volume, enabling imaging across greater depth ranges and in highly aberrated samples. However, the CAO aberration correction has a complicated relation to the imaging pupil and is not a direct measurement of the pupil aberrations. Here we present new methods for recovering the wavefront aberrations directly from the OCT data without the use of hardware adaptive optics. This enables both computational measurement and correction of optical aberrations.
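
    The forward operation that CAO performs, applying a pupil-phase correction to the complex OCT field in the spatial frequency domain, can be sketched as below; the defocus-only phase, its coefficient, and the random field standing in for an en-face OCT plane are illustrative assumptions. Wavefront measurement as described in the paper would instead recover the phase coefficients from the data, e.g. by maximizing an image-sharpness metric.

        import numpy as np

        def apply_cao_correction(field, coeff_defocus=3.0):
            # Fourier-transform the complex en-face field, multiply by the conjugate
            # of an assumed pupil-phase aberration (a single defocus term here), and
            # transform back to obtain the computationally corrected image plane.
            n = field.shape[0]
            fx = np.fft.fftfreq(n)
            FX, FY = np.meshgrid(fx, fx, indexing="ij")
            rho2 = (FX**2 + FY**2) / fx.max()**2          # normalized pupil radius^2
            phase = coeff_defocus * (2 * rho2 - 1)        # defocus-like Zernike term
            spectrum = np.fft.fft2(field)
            return np.fft.ifft2(spectrum * np.exp(-1j * phase))

        # Random complex field standing in for one OCT depth plane
        rng = np.random.default_rng(0)
        plane = rng.normal(size=(256, 256)) + 1j * rng.normal(size=(256, 256))
        corrected_plane = apply_cao_correction(plane)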

  11. Studying the Global Bifurcation Involving Wada Boundary Metamorphosis by a Method of Generalized Cell Mapping with Sampling-Adaptive Interpolation

    NASA Astrophysics Data System (ADS)

    Liu, Xiao-Ming; Jiang, Jun; Hong, Ling; Tang, Dafeng

    In this paper, a new method of Generalized Cell Mapping with Sampling-Adaptive Interpolation (GCMSAI) is presented in order to enhance the efficiency of the computation of the one-step probability transition matrix of the Generalized Cell Mapping method (GCM). Integrations with one mapping step are replaced by sampling-adaptive interpolations of third order. An explicit formula for the interpolation error is derived for a sampling-adaptive control to switch on integrations for the accuracy of computations with GCMSAI. By applying the proposed method to a two-dimensional forced damped pendulum system, global bifurcations are investigated with observations of boundary metamorphoses including full to partial and partial to partial as well as the birth of a fully Wada boundary. Moreover, GCMSAI requires only one-thirtieth to one-fiftieth of the computational time of the previous GCM.

  12. Incorporating IStation into Early Childhood Classrooms to Improve Reading Comprehension

    ERIC Educational Resources Information Center

    Luo, Tian; Lee, Guang-Lea; Molina, Cynthia

    2017-01-01

    Aim/Purpose: IStation is an adaptive computer-based reading program that adapts to the learner's academic needs. This study investigates if the IStation computer-based reading program promotes reading improvement scores as shown on the STAR Reading test and the IStation test scaled scores for elementary school third-grade learners on different…

  13. Development and Validation of a Computer Adaptive EFL Test

    ERIC Educational Resources Information Center

    He, Lianzhen; Min, Shangchao

    2017-01-01

    The first aim of this study was to develop a computer adaptive EFL test (CALT) that assesses test takers' listening and reading proficiency in English with dichotomous items and polytomous testlets. We reported in detail on the development of the CALT, including item banking, determination of suitable item response theory (IRT) models for item…

  14. Promoting Contextual Vocabulary Learning through an Adaptive Computer-Assisted EFL Reading System

    ERIC Educational Resources Information Center

    Wang, Y.-H.

    2016-01-01

    The study developed an adaptive computer-assisted reading system and investigated its effect on promoting English as a foreign language learner-readers' contextual vocabulary learning performance. Seventy Taiwanese college students were assigned to two reading groups. Participants in the customised reading group read online English texts, each of…

  15. The Effect of Adaptive Confidence Strategies in Computer-Assisted Instruction on Learning and Learner Confidence

    ERIC Educational Resources Information Center

    Warren, Richard Daniel

    2012-01-01

    The purpose of this research was to investigate the effects of including adaptive confidence strategies in instructionally sound computer-assisted instruction (CAI) on learning and learner confidence. Seventy-one general educational development (GED) learners recruited from various GED learning centers at community colleges in the southeast United…

  16. An adaptive mesh-moving and refinement procedure for one-dimensional conservation laws

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Flaherty, Joseph E.; Arney, David C.

    1993-01-01

    We examine the performance of an adaptive mesh-moving and/or local mesh refinement procedure for the finite difference solution of one-dimensional hyperbolic systems of conservation laws. Adaptive motion of a base mesh is designed to isolate spatially distinct phenomena, and recursive local refinement of the time step and cells of the stationary or moving base mesh is performed in regions where a refinement indicator exceeds a prescribed tolerance. These adaptive procedures are incorporated into a computer code that includes a MacCormack finite difference scheme with Davis' artificial viscosity model and a discretization error estimate based on Richardson's extrapolation. Experiments are conducted on three problems in order to qualify the advantages of adaptive techniques relative to uniform mesh computations and the relative benefits of mesh moving and refinement. Key results indicate that local mesh refinement, with and without mesh moving, can provide reliable solutions at much lower computational cost than possible on uniform meshes; that mesh motion can be used to improve the results of uniform mesh solutions for a modest computational effort; that the cost of managing the tree data structure associated with refinement is small; and that a combination of mesh motion and refinement reliably produces solutions for the least cost per unit accuracy.
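
    The Richardson-extrapolation error indicator mentioned above can be sketched as follows: comparing solutions obtained with spacing h and h/2 gives a local estimate of the discretization error of the refined solution, and cells whose estimate exceeds the tolerance are flagged for refinement. The function, the order value, and the tolerance below are illustrative, not the paper's implementation.

        import numpy as np

        def richardson_refinement_flags(u_coarse, u_fine, order=2, tol=1e-3):
            # Estimate the leading-order error of the refined (h/2) solution from the
            # difference between coarse and fine values at shared points, and flag
            # locations whose estimate exceeds the tolerance for further refinement.
            u_fine_on_coarse = u_fine[::2]                 # fine values at coarse points
            error_est = np.abs(u_fine_on_coarse - u_coarse) / (2**order - 1)
            return error_est > tol, error_est

        # Example: estimated error between a coarse and a finer sampling of sin(x)
        x_c = np.linspace(0, np.pi, 51)
        x_f = np.linspace(0, np.pi, 101)
        flags, err = richardson_refinement_flags(np.sin(x_c), np.sin(x_f))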

  17. Comparing computer adaptive and curriculum-based measures of math in progress monitoring.

    PubMed

    Shapiro, Edward S; Dennis, Minyi Shih; Fu, Qiong

    2015-12-01

    The purpose of the study was to compare the use of a Computer Adaptive Test and Curriculum-Based Measurement in the assessment of mathematics. This study also investigated the degree to which slope or rate of change predicted student outcomes on the annual state assessment of mathematics above and beyond scores of single point screening assessments (i.e., the computer adaptive test or the CBM assessment just before the administration of the state assessment). Repeated measurement of mathematics once per month across a 7-month period using a Computer Adaptive Test (STAR-Math) and Curriculum-Based Measurement (CBM, AIMSweb Math Computation, AIMSweb Math Concepts/Applications) was collected for a maximum total of 250 third, fourth, and fifth grade students. Results showed STAR-Math in all 3 grades and AIMSweb Math Concepts/Applications in the third and fifth grades had primarily linear growth patterns in mathematics. AIMSweb Math Computation in all grades and AIMSweb Math Concepts/Applications in Grade 4 had decelerating positive trends. Predictive validity evidence showed the strongest relationships were between STAR-Math and outcomes for third and fourth grade students. The blockwise multiple regression by grade revealed that slopes accounted for only a very small proportion of additional variance above and beyond what was explained by the scores obtained on a single point of assessment just prior to the administration of the state assessment. (c) 2015 APA, all rights reserved.

  18. The short-term stress response - Mother nature's mechanism for enhancing protection and performance under conditions of threat, challenge, and opportunity.

    PubMed

    Dhabhar, Firdaus S

    2018-03-26

    Our group has proposed that in contrast to chronic stress that can have harmful effects, the short-term (fight-or-flight) stress response (lasting for minutes to hours) is nature's fundamental survival mechanism that enhances protection and performance under conditions involving threat/challenge/opportunity. Short-term stress enhances innate/primary, adaptive/secondary, vaccine-induced, and anti-tumor immune responses, and post-surgical recovery. Mechanisms and mediators include stress hormones, dendritic cell, neutrophil, macrophage, and lymphocyte trafficking/function and local/systemic chemokine and cytokine production. Short-term stress may also enhance mental/cognitive and physical performance through effects on brain, musculo-skeletal, and cardiovascular function, reappraisal of threat/anxiety, and training-induced stress-optimization. Therefore, short-term stress psychology/physiology could be harnessed to enhance immuno-protection, as well as mental and physical performance. This review aims to provide a conceptual framework and targets for further investigation of mechanisms and conditions under which the protective/adaptive aspects of short-term stress/exercise can be optimized/harnessed, and for developing pharmacological/biobehavioral interventions to enhance health/healing, and mental/cognitive/physical performance. Copyright © 2018 Elsevier Inc. All rights reserved.

  19. Adapting to the surface: A comparison of handwriting measures when writing on a tablet computer and on paper.

    PubMed

    Gerth, Sabrina; Dolk, Thomas; Klassert, Annegret; Fliesser, Michael; Fischer, Martin H; Nottbusch, Guido; Festman, Julia

    2016-08-01

    Our study addresses the following research questions: Are there differences between handwriting movements on paper and on a tablet computer? Can experienced writers, such as most adults, adapt their graphomotor execution during writing to a rather unfamiliar surface, for instance a tablet computer? We examined the handwriting performance of adults in three tasks of different complexity: (a) graphomotor abilities, (b) visuomotor abilities and (c) handwriting. Each participant performed each task twice, once on paper and once on a tablet computer with a pen. We tested 25 participants by measuring their writing duration, in-air time, number of pen lifts, writing velocity and number of inversions in velocity. The data were analyzed using linear mixed-effects modeling with repeated measures. Our results reveal differences between writing on paper and on a tablet computer which were partly task-dependent. Our findings also show that participants were able to adapt their graphomotor execution to the smoother surface of the tablet computer during the tasks. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Service Mediation and Negotiation Bootstrapping as First Achievements Towards Self-adaptable Cloud Services

    NASA Astrophysics Data System (ADS)

    Brandic, Ivona; Music, Dejan; Dustdar, Schahram

    Nowadays, novel computing paradigms such as Cloud Computing are gaining more and more importance. In Cloud Computing, users pay for the usage of computing power provided as a service. Beforehand, they can negotiate specific functional and non-functional requirements relevant to the application execution. However, providing computing power as a service poses several research challenges. On the one hand, dynamic, versatile, and adaptable services are required that can cope with system failures and environmental changes. On the other hand, human interaction with the system should be minimized. In this chapter we present the first results in establishing adaptable, versatile, and dynamic services, considering negotiation bootstrapping and service mediation, achieved in the context of the Foundations of Self-Governing ICT Infrastructures (FoSII) project. We discuss novel meta-negotiation and SLA mapping solutions for Cloud services that bridge the gap between current QoS models and Cloud middleware and represent important prerequisites for the establishment of autonomic Cloud services.

  1. Daylength and temperature during seed production interactively affect adaptive performance of Picea abies progenies.

    PubMed

    Johnsen, Oystein; Daehlen, Ola Gram; Ostreng, Geir; Skrøppa, Tore

    2005-12-01

    Adaptive traits in Picea abies (Norway spruce) progenies are influenced by the maternal temperatures during seed production. Here, we have extended these studies by testing the effects of maternal photoperiod and temperature on phenology and frost hardiness in progenies. Using eight phytotron rooms, seeds from three unrelated crosses were produced in a 2 x 2 factorial combination of long and short days and high and low temperatures. The progenies were then forced to cease growth rapidly at the end of the first growing season. An interactive memory effect was expressed in the second growth season. Progenies from high temperature and short days, and from low temperatures and long days, started growth later in spring, ceased shoot growth later in summer, grew taller and were less frost hardy in the autumn than their full siblings from low temperatures and short days, and from high temperatures and long days. Norway spruce has developed a memory mechanism, regulating adaptive plasticity by photoperiod and temperature, which could counteract harmful effects of a rapidly changing climate.

  2. Adapting to life: ocean biogeochemical modelling and adaptive remeshing

    NASA Astrophysics Data System (ADS)

    Hill, J.; Popova, E. E.; Ham, D. A.; Piggott, M. D.; Srokosz, M.

    2013-11-01

    An outstanding problem in biogeochemical modelling of the ocean is that many of the key processes occur intermittently at small scales, such as the sub-mesoscale, that are not well represented in global ocean models. As an example, state-of-the-art models give values of primary production approximately two orders of magnitude lower than those observed in the ocean's oligotrophic gyres, which cover a third of the Earth's surface. This is partly due to their failure to resolve sub-mesoscale phenomena, which play a significant role in nutrient supply. Simply increasing the resolution of the models may be an inefficient computational solution to this problem. An approach based on recent advances in adaptive mesh computational techniques may offer an alternative. Here the first steps in such an approach are described, using the example of a simple vertical column (quasi 1-D) ocean biogeochemical model. We present a novel method of simulating ocean biogeochemical behaviour on a vertically adaptive computational mesh, where the mesh changes in response to the biogeochemical and physical state of the system throughout the simulation. We show that the model reproduces the general physical and biological behaviour at three ocean stations (India, Papa and Bermuda) as compared to a high-resolution fixed mesh simulation and to observations. The simulations capture both the seasonal and inter-annual variations. The use of an adaptive mesh does not increase the computational error, but reduces the number of mesh elements by a factor of 2-3, so reducing computational overhead. We then show the potential of this method in two case studies where we change the metric used to determine the varying mesh sizes in order to capture the dynamics of chlorophyll at Bermuda and sinking detritus at Papa. We therefore demonstrate adaptive meshes may provide a suitable numerical technique for simulating seasonal or transient biogeochemical behaviour at high spatial resolution whilst minimising computational cost.

  3. Parallel optical information, concept, and response evolver: POINCARE

    NASA Astrophysics Data System (ADS)

    Caulfield, H. John; Caulfield, Kimberly

    1991-08-01

    It is now possible to build a nonlinear adaptive system which will incorporate many of the properties of the human mind, such as true originality in such skills as reasoning by analogy and reasoning by retrodiction, including literally unpredictable thoughts; and development of individual styles, personalities, expertise, etc. Like humans, these optical processors will have a rich 'subconscious' experience. Like humans, they will be clonable, but clones will develop differently as they experience the world differently, make different decisions, develop different habits, etc. In short, powerful optical processors with some of the properties normally associated with human intelligence can be made. This approach can result in a powerful optical processor with those properties. A demonstration chosen for simplicity of implementation is suggested. This could be the first computer of any type which uses quantum indeterminacy in an integral and important way.

  4. Measuring anxiety after spinal cord injury: Development and psychometric characteristics of the SCI-QOL Anxiety item bank and linkage with GAD-7.

    PubMed

    Kisala, Pamela A; Tulsky, David S; Kalpakjian, Claire Z; Heinemann, Allen W; Pohlig, Ryan T; Carle, Adam; Choi, Seung W

    2015-05-01

    To develop a calibrated item bank and computer adaptive test to assess anxiety symptoms in individuals with spinal cord injury (SCI), transform scores to the Patient Reported Outcomes Measurement Information System (PROMIS) metric, and create a statistical linkage with the Generalized Anxiety Disorder (GAD)-7, a widely used anxiety measure. Grounded-theory based qualitative item development methods; large-scale item calibration field testing; confirmatory factor analysis; graded response model item response theory analyses; statistical linking techniques to transform scores to a PROMIS metric; and linkage with the GAD-7. Setting: Five SCI Model System centers and one Department of Veterans Affairs medical center in the United States. Participants: Adults with traumatic SCI. Main outcome measure: Spinal Cord Injury-Quality of Life (SCI-QOL) Anxiety Item Bank. Seven hundred sixteen individuals with traumatic SCI completed 38 items assessing anxiety, 17 of which were PROMIS items. After 13 items (including 2 PROMIS items) were removed, factor analyses confirmed unidimensionality. Item response theory analyses were used to estimate slopes and thresholds for the final 25 items (15 from PROMIS). The observed Pearson correlation between the SCI-QOL Anxiety and GAD-7 scores was 0.67. The SCI-QOL Anxiety item bank demonstrates excellent psychometric properties and is available as a computer adaptive test or short form for research and clinical applications. SCI-QOL Anxiety scores have been transformed to the PROMIS metric and we provide a method to link SCI-QOL Anxiety scores with those of the GAD-7.
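
    The adaptive step at the heart of such a computer adaptive test is selecting, at each point, the unadministered item that is most informative at the current ability estimate. The sketch below illustrates this with a hypothetical 2-parameter logistic item bank for simplicity; the SCI-QOL banks themselves are calibrated with the graded response model, and the parameters here are made up.

        import numpy as np

        def item_information_2pl(theta, a, b):
            # Fisher information of 2-parameter logistic items at ability theta
            p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
            return a**2 * p * (1 - p)

        def select_next_item(theta_hat, a, b, administered):
            # Core CAT step: administer the remaining item with maximum information
            # at the current ability estimate.
            info = item_information_2pl(theta_hat, a, b)
            info[list(administered)] = -np.inf
            return int(np.argmax(info))

        # Hypothetical 10-item bank
        rng = np.random.default_rng(3)
        a = rng.uniform(0.8, 2.0, 10)      # discriminations (slopes)
        b = rng.uniform(-2.0, 2.0, 10)     # difficulties (thresholds)
        first_item = select_next_item(theta_hat=0.0, a=a, b=b, administered=set())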

  5. NASA Tech Briefs, October 2013

    NASA Technical Reports Server (NTRS)

    2013-01-01

    Topics include: A Short-Range Distance Sensor with Exceptional Linearity; Miniature Trace Gas Detector Based on Microfabricated Optical Resonators; Commercial Non-Dispersive Infrared Spectroscopy Sensors for Sub-Ambient Carbon Dioxide Detection; Fast, Large-Area, Wide-Bandgap UV Photodetector for Cherenkov Light Detection; Mission Data System Java Edition Version 7; Adaptive Distributed Environment for Procedure Training (ADEPT); LEGEND, a LEO-to-GEO Environment Debris Model; Electronics/Computers; Millimeter-Wave Localizers for Aircraft-to-Aircraft Approach Navigation; Impedance Discontinuity Reduction Between High-Speed Differential Connectors and PCB Interfaces; SpaceCube Version 1.5; High-Pressure Lightweight Thrusters; Non-Magnetic, Tough, Corrosion- and Wear-Resistant Knives From Bulk Metallic Glasses and Composites; Ambient Dried Aerogels; Applications for Gradient Metal Alloys Fabricated Using Additive Manufacturing; Passivation of Flexible YBCO Superconducting Current Lead With Amorphous SiO2 Layer; Propellant-Flow-Actuated Rocket Engine Igniter; Lightweight Liquid Helium Dewar for High-Altitude Balloon Payloads; Method to Increase Performance of Foil Bearings Through Passive Thermal Management; Unibody Composite Pressurized Structure; JWST Integrated Science Instrument Module Alignment Optimization Tool; Radar Range Sidelobe Reduction Using Adaptive Pulse Compression Technique; Digitally Calibrated TR Modules Enabling Real-Time Beamforming SweepSAR Architectures; Electro-Optic Time-to-Space Converter for Optical Detector Jitter Mitigation; Partially Transparent Petaled Mask/Occulter for Visible-Range Spectrum; Educational NASA Computational and Scientific Studies (enCOMPASS); Coarse-Grain Bandwidth Estimation Scheme for Large-Scale Network; Detection of Moving Targets Using Soliton Resonance Effect; High-Efficiency Nested Hall Thrusters for Robotic Solar System Exploration; High-Voltage Clock Driver for Photon-Counting CCD Characterization; Development of the Code RITRACKS; and Enabling Microliquid Chromatography by Microbead Packing of Microchannels.

  6. Identification of small RNAs abundant in Burkholderia cenocepacia biofilms reveal putative regulators with a potential role in carbon and iron metabolism.

    PubMed

    Sass, Andrea; Kiekens, Sanne; Coenye, Tom

    2017-11-15

    Small RNAs play a regulatory role in many central metabolic processes of bacteria, as well as in developmental processes such as biofilm formation. Small RNAs of Burkholderia cenocepacia, an opportunistic pathogenic beta-proteobacterium, are to date not well characterised. To address that, we performed genome-wide transcriptome structure analysis of biofilm grown B. cenocepacia J2315. 41 unannotated short transcripts were identified in intergenic regions of the B. cenocepacia genome. 15 of these short transcripts, highly abundant in biofilms, widely conserved in Burkholderia sp. and without known function, were selected for in-depth analysis. Expression profiling showed that most of these sRNAs are more abundant in biofilms than in planktonic cultures. Many are also highly abundant in cells grown in minimal media, suggesting they are involved in adaptation to nutrient limitation and growth arrest. Their computationally predicted targets include a high proportion of genes involved in carbon metabolism. Expression and target genes of one sRNA suggest a potential role in regulating iron homoeostasis. The strategy used for this study to detect sRNAs expressed in B. cenocepacia biofilms has successfully identified sRNAs with a regulatory function.

  7. Water System Adaptation To Hydrological Changes: Module 12, Models and Tools for Stormwater and Wastewater System Adaptation

    EPA Science Inventory

    This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...

  8. Exposure Control Using Adaptive Multi-Stage Item Bundles.

    ERIC Educational Resources Information Center

    Luecht, Richard M.

    This paper presents a multistage adaptive testing test development paradigm that promises to handle content balancing and other test development needs, psychometric reliability concerns, and item exposure. The bundled multistage adaptive testing (BMAT) framework is a modification of the computer-adaptive sequential testing framework introduced by…

  9. Adaptive Wavelet Modeling of Geophysical Data

    NASA Astrophysics Data System (ADS)

    Plattner, A.; Maurer, H.; Dahmen, W.; Vorloeper, J.

    2009-12-01

    Despite the ever-increasing power of modern computers, realistic modeling of complex three-dimensional Earth models is still a challenging task and requires substantial computing resources. The overwhelming majority of current geophysical modeling approaches includes either finite difference or non-adaptive finite element algorithms, and variants thereof. These numerical methods usually require the subsurface to be discretized with a fine mesh to accurately capture the behavior of the physical fields. However, this may result in excessive memory consumption and computing times. A common feature of most of these algorithms is that the modeled data discretizations are independent of the model complexity, which may be wasteful when there are only minor to moderate spatial variations in the subsurface parameters. Recent developments in the theory of adaptive numerical solvers have the potential to overcome this problem. Here, we consider an adaptive wavelet based approach that is applicable to a large scope of problems, also including nonlinear problems. To the best of our knowledge such algorithms have not yet been applied in geophysics. Adaptive wavelet algorithms offer several attractive features: (i) for a given subsurface model, they allow the forward modeling domain to be discretized with a quasi minimal number of degrees of freedom, (ii) sparsity of the associated system matrices is guaranteed, which makes the algorithm memory efficient, and (iii) the modeling accuracy scales linearly with computing time. We have implemented the adaptive wavelet algorithm for solving three-dimensional geoelectric problems. To test its performance, numerical experiments were conducted with a series of conductivity models exhibiting varying degrees of structural complexity. Results were compared with a non-adaptive finite element algorithm, which incorporates an unstructured mesh to best fit subsurface boundaries. Such algorithms represent the current state-of-the-art in geoelectrical modeling. An analysis of the numerical accuracy as a function of the number of degrees of freedom revealed that the adaptive wavelet algorithm outperforms the finite element solver for simple and moderately complex models, whereas the results become comparable for models with spatially highly variable electrical conductivities. The linear dependency of the modeling error and the computing time proved to be model-independent. This feature will allow very efficient computations using large-scale models as soon as our experimental code is optimized in terms of its implementation.

  10. Adapting to life: ocean biogeochemical modelling and adaptive remeshing

    NASA Astrophysics Data System (ADS)

    Hill, J.; Popova, E. E.; Ham, D. A.; Piggott, M. D.; Srokosz, M.

    2014-05-01

    An outstanding problem in biogeochemical modelling of the ocean is that many of the key processes occur intermittently at small scales, such as the sub-mesoscale, that are not well represented in global ocean models. This is partly due to their failure to resolve sub-mesoscale phenomena, which play a significant role in vertical nutrient supply. Simply increasing the resolution of the models may be an inefficient computational solution to this problem. An approach based on recent advances in adaptive mesh computational techniques may offer an alternative. Here the first steps in such an approach are described, using the example of a simple vertical column (quasi-1-D) ocean biogeochemical model. We present a novel method of simulating ocean biogeochemical behaviour on a vertically adaptive computational mesh, where the mesh changes in response to the biogeochemical and physical state of the system throughout the simulation. We show that the model reproduces the general physical and biological behaviour at three ocean stations (India, Papa and Bermuda) as compared to a high-resolution fixed mesh simulation and to observations. The use of an adaptive mesh does not increase the computational error, but reduces the number of mesh elements by a factor of 2-3. Unlike previous work the adaptivity metric used is flexible and we show that capturing the physical behaviour of the model is paramount to achieving a reasonable solution. Adding biological quantities to the adaptivity metric further refines the solution. We then show the potential of this method in two case studies where we change the adaptivity metric used to determine the varying mesh sizes in order to capture the dynamics of chlorophyll at Bermuda and sinking detritus at Papa. We therefore demonstrate that adaptive meshes may provide a suitable numerical technique for simulating seasonal or transient biogeochemical behaviour at high vertical resolution whilst minimising the number of elements in the mesh. More work is required to move this to fully 3-D simulations.

  11. ICCE/ICCAI 2000 Full & Short Papers (Collaborative Learning).

    ERIC Educational Resources Information Center

    2000

    This document contains the full and short papers on collaborative learning from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction) covering the following topics: comparison of applying Internet to cooperative and traditional learning; a distributed backbone system for…

  12. ICCE/ICCAI 2000 Full & Short Papers (Creative Learning).

    ERIC Educational Resources Information Center

    2000

    This document contains the following full and short papers on creative learning from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction): (1) "A Collaborative Learning Support System Based on Virtual Environment Server for Multiple Agents" (Takashi Ohno, Kenji…

  13. ICCE/ICCAI 2000 Full & Short Papers (Others).

    ERIC Educational Resources Information Center

    2000

    This document contains the following full and short papers from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction): (1) "A Code Restructuring Tool To Help Scaffold Novice Programmers" (Stuart Garner); (2) "An Assessment Framework for Information Technology Integrated…

  14. Parallel computations and control of adaptive structures

    NASA Technical Reports Server (NTRS)

    Park, K. C.; Alvin, Kenneth F.; Belvin, W. Keith; Chong, K. P. (Editor); Liu, S. C. (Editor); Li, J. C. (Editor)

    1991-01-01

    The equations of motion for structures with adaptive elements for vibration control are presented for parallel computations to be used as a software package for real-time control of flexible space structures. A brief introduction to state-of-the-art parallel computational capability is also presented. Time marching strategies are developed for effective use of massively parallel mapping, partitioning, and the necessary arithmetic operations. An example is offered for the simulation of control-structure interaction on a parallel computer, and the impact of the presented approach for applications in disciplines other than the aerospace industry is assessed.

  15. Computer-Assisted Virtual Planning for Surgical Guide Manufacturing and Internal Distractor Adaptation in the Management of Midface Hypoplasia in Cleft Patients.

    PubMed

    Scolozzi, Paolo; Herzog, Georges

    2017-07-01

    We report the treatment of severe maxillary hypoplasia in two patients with unilateral cleft lip and palate using a specific approach that combines the Le Fort I distraction osteogenesis technique with computer-aided design/computer-aided manufacturing customized surgical guides and internal distractors based on virtual computational planning. This technology allows the transfer of the virtually planned reconstruction to the operating room by using custom patient-specific implants, surgical splints, surgical cutting guides, and surgical guides for plate or distractor adaptation.

  16. Method and system for spatial data input, manipulation and distribution via an adaptive wireless transceiver

    NASA Technical Reports Server (NTRS)

    Wang, Ray (Inventor)

    2009-01-01

    A method and system for spatial data manipulation input and distribution via an adaptive wireless transceiver. The method and system include a wireless transceiver for automatically and adaptively controlling wireless transmissions using a Waveform-DNA method. The wireless transceiver can operate simultaneously over both short and long distances. The wireless transceiver is automatically adaptive, and wireless devices can send and receive wireless digital and analog data from various sources rapidly in real-time via available networks and network services.

  17. hp-Adaptive time integration based on the BDF for viscous flows

    NASA Astrophysics Data System (ADS)

    Hay, A.; Etienne, S.; Pelletier, D.; Garon, A.

    2015-06-01

    This paper presents a procedure based on the Backward Differentiation Formulas of order 1 to 5 to obtain efficient time integration of the incompressible Navier-Stokes equations. The adaptive algorithm performs both stepsize and order selections to control, respectively, the solution accuracy and the computational efficiency of the time integration process. The stepsize selection (h-adaptivity) is based on a local error estimate and an error controller to guarantee that the numerical solution accuracy is within a user-prescribed tolerance. The order selection (p-adaptivity) relies on the idea that low-accuracy solutions can be computed efficiently by low order time integrators while accurate solutions require high order time integrators to keep computational time low. The selection is based on a stability test that detects growing numerical noise and deems a method of order p stable if there is no method of lower order that delivers the same solution accuracy for a larger stepsize. Hence, it guarantees both that (1) the integration method in use operates inside its stability region and (2) the time integration procedure is computationally efficient. The proposed time integration procedure also features time-step rejection and quarantine mechanisms, a modified Newton method with a predictor, and dense output techniques to compute the solution at off-step points.
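
    The h-adaptivity idea described above rests on an elementary error controller: scale the next step by (tol/err)^(1/(p+1)), bounded by safety and growth factors, and retry rejected steps with the smaller step. The sketch below shows that controller with typical illustrative constants; it is not the paper's specific controller or its quarantine mechanism.

        def new_stepsize(h, error_est, tol, order, safety=0.9, fac_min=0.2, fac_max=2.0):
            # Classical step-size controller: scale h by (tol/error)^(1/(p+1)),
            # limited by safety and growth factors; a step with error > tol is
            # rejected and retried with the smaller h.
            factor = safety * (tol / max(error_est, 1e-16)) ** (1.0 / (order + 1))
            return h * min(fac_max, max(fac_min, factor)), error_est <= tol

        # Example: a BDF2 step whose local error estimate is 4x the tolerance
        h_next, accepted = new_stepsize(h=0.01, error_est=4e-6, tol=1e-6, order=2)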

  18. Molecular determinants of enzyme cold adaptation: comparative structural and computational studies of cold- and warm-adapted enzymes.

    PubMed

    Papaleo, Elena; Tiberti, Matteo; Invernizzi, Gaetano; Pasi, Marco; Ranzani, Valeria

    2011-11-01

    The identification of molecular mechanisms underlying enzyme cold adaptation is a hot topic for both fundamental research and industrial applications. In the present contribution, we review the last decades of structural computational investigations on cold-adapted enzymes in comparison to their warm-adapted counterparts. Comparative sequence and structural studies allow the definition of a multitude of adaptation strategies. Different enzymes have evolved diverse mechanisms to adapt to low temperatures, so that a general theory for enzyme cold adaptation cannot be formulated. However, some common features can be traced in the dynamic and flexibility properties of these enzymes, as well as in their intra- and inter-molecular interaction networks. Interestingly, the current data suggest that a family-centered point of view is necessary in the comparative analyses of cold- and warm-adapted enzymes. In fact, enzymes belonging to the same family or superfamily, thus sharing at least the three-dimensional fold and common features of the functional sites, have evolved similar structural and dynamic patterns to overcome the detrimental effects of low temperatures.

  19. Direction-specific adaptation effects acquired in a slow rotation room

    NASA Technical Reports Server (NTRS)

    Graybiel, A.; Knepton, J.

    1972-01-01

    Thirty-eight subjects were required to execute 120 head movements in a slow rotation room at each 1-rpm increase in velocity of the room between 0 and 6 rpm and, after a single-step gradual return to zero velocity, execute 120 head movements either immediately after the return or after delay periods varying from 1 to 24 hours unless, at any time, more than mild symptoms of motion sickness were elicited. A second stress profile differed by the sequential addition of an incremental adaptation schedule in which the direction of rotation was reversed. The experimental findings demonstrated the acquisition of direction-specific adaptation effects that underwent spontaneous decay with a short time constant (hours). Speculations are presented which could account for the simultaneous acquisition of short-term and long-term adaptation effects. The findings support the theory that motion sickness, although a consequence of vestibular stimulation, has its immediate origin in nonvestibular systems, implying a facultative or temporary linkage between the vestibular and nonvestibular systems.

  20. Unbounded and revocable hierarchical identity-based encryption with adaptive security, decryption key exposure resistant, and short public parameters

    PubMed Central

    Wang, Baosheng; Tao, Jing

    2018-01-01

    Revocation functionality and hierarchical key delegation are two necessary and crucial requirements for identity-based cryptosystems. Revocable hierarchical identity-based encryption (RHIBE) has attracted a lot of attention in recent years; many RHIBE schemes have been proposed but shown to be either insecure or bounded, in the sense that the maximum hierarchical depth must be fixed at setup. In this paper, we propose a new unbounded RHIBE scheme with decryption key exposure resilience and short public system parameters, and prove our RHIBE scheme to be adaptively secure. Our system model is inherently scalable, accommodating more user levels adaptively without added workload or restarting the system. By carefully designing the hybrid games, we overcome the subtle obstacle in applying the dual system encryption methodology to unbounded and revocable HIBE. To the best of our knowledge, this is the first construction of an adaptively secure unbounded RHIBE scheme. PMID:29649326

  1. Adaptive angular-velocity Vold-Kalman filter order tracking - Theoretical basis, numerical implementation and parameter investigation

    NASA Astrophysics Data System (ADS)

    Pan, M.-Ch.; Chu, W.-Ch.; Le, Duc-Do

    2016-12-01

    The paper presents an alternative Vold-Kalman filter order tracking (VKF_OT) method, i.e. the adaptive angular-velocity VKF_OT technique, to extract and characterize order components in an adaptive manner for the condition monitoring and fault diagnosis of rotary machinery. The order/spectral waveforms to be tracked can be recursively solved by using a Kalman filter based on the one-step state prediction. The paper comprises the theoretical derivation of the computation scheme, numerical implementation, and parameter investigation. Comparisons of the adaptive VKF_OT scheme with two other ones are performed through processing synthetic signals of designated order components. Processing parameters such as the weighting factor and the correlation matrix of process noise, and data conditions like the sampling frequency, which influence tracking behavior, are explored. The merits of the proposed scheme, such as its adaptive processing nature and computational efficiency, are addressed, although the computation was performed in off-line conditions. The proposed scheme can simultaneously extract multiple spectral components, and effectively decouple close and crossing orders associated with multi-axial reference rotating speeds.
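
    As an illustration of the recursive idea behind Kalman-filter order tracking, the sketch below tracks a single order with a two-state Kalman filter whose state is the slowly varying in-phase/quadrature envelope and whose measurement is the raw vibration sample. The function name, the random-walk state model, and the noise parameters are illustrative assumptions, not the paper's adaptive angular-velocity VKF_OT formulation.

```python
import numpy as np

def track_order(signal, rpm, order, fs, q=1e-4, r=1e-2):
    """Sketch of single-order tracking with a 2-state Kalman filter.
    State x = [a, b] is the slowly varying in-phase/quadrature amplitude;
    measurement model: y_k = a*cos(phi_k) + b*sin(phi_k) + noise."""
    n = len(signal)
    # instantaneous shaft phase integrated from the measured rotating speed (rpm)
    phase = 2.0 * np.pi * order * np.cumsum(rpm / 60.0) / fs
    x = np.zeros(2)              # envelope state estimate
    P = np.eye(2)                # state covariance
    Q = q * np.eye(2)            # process noise (random-walk envelope model)
    waveform = np.zeros(n)
    for k in range(n):
        P = P + Q                               # predict (state unchanged under random walk)
        H = np.array([np.cos(phase[k]), np.sin(phase[k])])
        innov = signal[k] - H @ x               # innovation
        S = H @ P @ H + r                       # innovation variance
        K = P @ H / S                           # Kalman gain
        x = x + K * innov
        P = P - np.outer(K, H @ P)
        waveform[k] = H @ x                     # tracked order waveform sample
    return waveform
```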

  2. Optimizing pKa computation in proteins with pH adapted conformations.

    PubMed

    Kieseritzky, Gernot; Knapp, Ernst-Walter

    2008-05-15

    pK(A) values in proteins are determined by electrostatic energy computations using a small number of optimized protein conformations derived from crystal structures. In these protein conformations hydrogen positions and geometries of salt bridges on the protein surface were determined self-consistently with the protonation pattern at three pHs (low, ambient, and high). Considering salt bridges at protein surfaces is most relevant, since they open at low and high pH. In the absence of these conformational changes, computed pK(A)(comp) of acidic (basic) groups in salt bridges dramatically underestimate (overestimate) experimental pK(A)(exp). The pK(A)(comp) for 15 different proteins with 185 known pK(A)(exp) yield an RMSD of 1.12, comparable with two other methods. One of these methods is fully empirical with many adjustable parameters. The other is also based on electrostatic energy computations using many non-optimized side chain conformers but employs larger dielectric constants at short distances of charge pairs that diminish their electrostatic interactions. These empirical corrections, which account implicitly for additional conformational flexibility, were needed to describe the energetics of salt bridges appropriately. This is not needed in the present approach. The RMSD of the present approach improves if one considers only strongly shifted pK(A)(exp), in contrast to the other methods under these conditions. Our method allows interpreting pK(A)(comp) in terms of pH-dependent hydrogen bonding patterns and salt bridge geometries. A web service is provided to perform pK(A) computations. 2007 Wiley-Liss, Inc.

  3. Modification of Motion Perception and Manual Control Following Short-Duration Spaceflight

    NASA Technical Reports Server (NTRS)

    Wood, S. J.; Vanya, R. D.; Esteves, J. T.; Rupert, A. H.; Clement, G.

    2011-01-01

    Adaptive changes during space flight in how the brain integrates vestibular cues with other sensory information can lead to impaired movement coordination and spatial disorientation following G-transitions. This ESA-NASA study was designed to examine both the physiological basis and operational implications for disorientation and tilt-translation disturbances following short-duration spaceflights. The goals of this study were to (1) examine the effects of stimulus frequency on adaptive changes in motion perception during passive tilt and translation motion, (2) quantify decrements in manual control of tilt motion, and (3) evaluate vibrotactile feedback as a sensorimotor countermeasure.

  4. Heating Augmentation for Short Hypersonic Protuberances

    NASA Technical Reports Server (NTRS)

    Mazaheri, Alireza R.; Wood, William A.

    2008-01-01

    Computational aeroheating analyses of the Space Shuttle Orbiter plug repair models are validated against data collected in the Calspan University of Buffalo Research Center (CUBRC) 48 inch shock tunnel. The comparison shows that the average difference between computed heat transfer results and the data is about 9.5%. Using CFD and Wind Tunnel (WT) data, an empirical correlation for estimating heating augmentation on short hypersonic protuberances (k/delta < 0.33) is proposed. This proposed correlation is compared with several computed flight simulation cases and good agreement is achieved. Accordingly, this correlation is proposed for further investigation on other short hypersonic protuberances for estimating heating augmentation.

  5. Heating Augmentation for Short Hypersonic Protuberances

    NASA Technical Reports Server (NTRS)

    Mazaheri, Ali R.; Wood, William A.

    2008-01-01

    Computational aeroheating analyses of the Space Shuttle Orbiter plug repair models are validated against data collected in the Calspan University of Buffalo Research Center (CUBRC) 48 inch shock tunnel. The comparison shows that the average difference between computed heat transfer results and the data is about 9.5%. Using CFD and Wind Tunnel (WT) data, an empirical correlation for estimating heating augmentation on short hypersonic protuberances (k/delta less than 0.3) is proposed. This proposed correlation is compared with several computed flight simulation cases and good agreement is achieved. Accordingly, this correlation is proposed for further investigation on other short hypersonic protuberances for estimating heating augmentation.

  6. Towards feasible and effective predictive wavefront control for adaptive optics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poyneer, L A; Veran, J

    We have recently proposed Predictive Fourier Control, a computationally efficient and adaptive algorithm for predictive wavefront control that assumes frozen flow turbulence. We summarize refinements to the state-space model that allow operation with arbitrary computational delays and reduce the computational cost of solving for new control. We present initial atmospheric characterization using observations with Gemini North's Altair AO system. These observations, taken over 1 year, indicate that frozen flow exists, contains substantial power, and is strongly detected 94% of the time.

  7. Lessons Learned in Designing and Implementing a Computer-Adaptive Test for English

    ERIC Educational Resources Information Center

    Burston, Jack; Neophytou, Maro

    2014-01-01

    This paper describes the lessons learned in designing and implementing a computer-adaptive test (CAT) for English. The early identification of students with weak L2 English proficiency is of critical importance in university settings that have compulsory English language course graduation requirements. The most efficient means of diagnosing the L2…

  8. Technical Adequacy of Growth Estimates from a Computer Adaptive Test: Implications for Progress Monitoring

    ERIC Educational Resources Information Center

    Van Norman, Ethan R.; Nelson, Peter M.; Parker, David C.

    2017-01-01

    Computer adaptive tests (CATs) hold promise to monitor student progress within multitiered systems of support. However, the relationship between how long and how often data are collected and the technical adequacy of growth estimates from CATs has not been explored. Given CAT administration times, it is important to identify optimal data…

  9. Stratified and Maximum Information Item Selection Procedures in Computer Adaptive Testing

    ERIC Educational Resources Information Center

    Deng, Hui; Ansley, Timothy; Chang, Hua-Hua

    2010-01-01

    In this study we evaluated and compared three item selection procedures: the maximum Fisher information procedure (F), the a-stratified multistage computer adaptive testing (CAT) (STR), and a refined stratification procedure that allows more items to be selected from the high a strata and fewer items from the low a strata (USTR), along with…

  10. Students' Perceived Usefulness of Formative Feedback for a Computer-Adaptive Test

    ERIC Educational Resources Information Center

    Lilley, Mariana; Barker, Trevor

    2007-01-01

    In this paper we report on research related to the provision of automated feedback based on a computer adaptive test (CAT), used in formative assessment. A cohort of 76 second year university undergraduates took part in a formative assessment with a CAT and were provided with automated feedback on their performance. A sample of students responded…

  11. Do You Think You Can? The Influence of Student Self-Efficacy on the Effectiveness of Tutorial Dialogue for Computer Science

    ERIC Educational Resources Information Center

    Wiggins, Joseph B.; Grafsgaard, Joseph F.; Boyer, Kristy Elizabeth; Wiebe, Eric N.; Lester, James C.

    2017-01-01

    In recent years, significant advances have been made in intelligent tutoring systems, and these advances hold great promise for adaptively supporting computer science (CS) learning. In particular, tutorial dialogue systems that engage students in natural language dialogue can create rich, adaptive interactions. A promising approach to increasing…

  12. Comparing Computer Adaptive and Curriculum-Based Measures of Math in Progress Monitoring

    ERIC Educational Resources Information Center

    Shapiro, Edward S.; Dennis, Minyi Shih; Fu, Qiong

    2015-01-01

    The purpose of the study was to compare the use of a Computer Adaptive Test and Curriculum-Based Measurement in the assessment of mathematics. This study also investigated the degree to which slope or rate of change predicted student outcomes on the annual state assessment of mathematics above and beyond scores of single point screening…

  13. Development of a Postacute Hospital Item Bank for the New Pediatric Evaluation of Disability Inventory-Computer Adaptive Test

    ERIC Educational Resources Information Center

    Dumas, Helene M.

    2010-01-01

    The PEDI-CAT is a new computer adaptive test (CAT) version of the Pediatric Evaluation of Disability Inventory (PEDI). Additional PEDI-CAT items specific to postacute pediatric hospital care were recently developed using expert reviews and cognitive interviewing techniques. Expert reviews established face and construct validity, providing positive…

  14. The Predictive Validity of a Computer-Adaptive Assessment of Kindergarten and First-Grade Reading Skills

    ERIC Educational Resources Information Center

    Clemens, Nathan H.; Hagan-Burke, Shanna; Luo, Wen; Cerda, Carissa; Blakely, Alane; Frosch, Jennifer; Gamez-Patience, Brenda; Jones, Meredith

    2015-01-01

    This study examined the predictive validity of a computer-adaptive assessment for measuring kindergarten reading skills using the STAR Early Literacy (SEL) test. The findings showed that the results of SEL assessments administered during the fall, winter, and spring of kindergarten were moderate and statistically significant predictors of year-end…

  15. Effects of Differentially Time-Consuming Tests on Computer-Adaptive Test Scores

    ERIC Educational Resources Information Center

    Bridgeman, Brent; Cline, Frederick

    2004-01-01

    Time limits on some computer-adaptive tests (CATs) are such that many examinees have difficulty finishing, and some examinees may be administered tests with more time-consuming items than others. Results from over 100,000 examinees suggested that about half of the examinees must guess on the final six questions of the analytical section of the…

  16. The Development and Evaluation of a Software Prototype for Computer-Adaptive Testing

    ERIC Educational Resources Information Center

    Lilley, M.; Barker, T.; Britton, C.

    2004-01-01

    This paper presents ongoing research at the University of Hertfordshire on the use of computer-adaptive tests (CATs) in Higher Education. A software prototype based on Item Response Theory has been developed and is described here. This application was designed to estimate the level of proficiency in English for those students whose first language…

  17. Content Range and Precision of a Computer Adaptive Test of Upper Extremity Function for Children with Cerebral Palsy

    ERIC Educational Resources Information Center

    Montpetit, Kathleen; Haley, Stephen; Bilodeau, Nathalie; Ni, Pengsheng; Tian, Feng; Gorton, George, III; Mulcahey, M. J.

    2011-01-01

    This article reports on the content range and measurement precision of an upper extremity (UE) computer adaptive testing (CAT) platform of physical function in children with cerebral palsy. Upper extremity items representing skills of all abilities were administered to 305 parents. These responses were compared with two traditional standardized…

  18. The Effects of Routing and Scoring within a Computer Adaptive Multi-Stage Framework

    ERIC Educational Resources Information Center

    Dallas, Andrew

    2014-01-01

    This dissertation examined the overall effects of routing and scoring within a computer adaptive multi-stage framework (ca-MST). Testing in a ca-MST environment has become extremely popular in the testing industry. Testing companies enjoy its efficiency benefits compared to traditional linear testing and its quality-control features over…

  19. Adaptation from Paper-Pencil to Web-Based Administration of a Parent-Completed Developmental Questionnaire for Young Children

    ERIC Educational Resources Information Center

    Yovanoff, Paul; Squires, Jane; McManus, Suzanne

    2013-01-01

    Adapting traditional paper-pencil instruments to computer-based environments has received considerable attention from the research community due to the possible administration mode effects on obtained measures. When differences due to mode of completion (i.e., paper-pencil, computer-based) are present, threats to measurement validity are posed. In…

  20. Water System Adaptation To Hydrological Changes: Module 14, Life Cycle Analysis (LCA) and Prioritization Tools in Water System Adaptation

    EPA Science Inventory

    This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...

  1. Test Information Targeting Strategies for Adaptive Multistage Testing Designs.

    ERIC Educational Resources Information Center

    Luecht, Richard M.; Burgin, William

    Adaptive multistage testlet (MST) designs appear to be gaining popularity for many large-scale computer-based testing programs. These adaptive MST designs use a modularized configuration of preconstructed testlets and embedded score-routing schemes to prepackage different forms of an adaptive test. The conditional information targeting (CIT)…

  2. Water System Adaptation to Hydrological Changes: Module 10, Basic Principles of Incorporating Adaptation Science into Hydrologic Planning and Design

    EPA Science Inventory

    This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...

  3. Water System Adaptation To Hydrological Changes: Module 5, Water Quality and Infrastructure Response to Rapid Urbanization: Adaptation Case Study in China

    EPA Science Inventory

    This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...

  4. Short-term effects of playing computer games on attention.

    PubMed

    Tahiroglu, Aysegul Yolga; Celik, Gonca Gul; Avci, Ayse; Seydaoglu, Gulsah; Uzel, Mehtap; Altunbas, Handan

    2010-05-01

    The main aim of the present study is to investigate the short-term cognitive effects of computer games in children with different psychiatric disorders and normal controls. One hundred one children aged between 9 and 12 years were recruited for the study. All participants played a motor-racing game on the computer for 1 hour. The TBAG form of the Stroop task was administered to all participants twice, before playing and immediately after playing the game. Participants whose posttest scores improved relative to their pretest scores used the computer on average 0.67 +/- 1.1 hr/day, whereas participants with worse or unaltered scores used it on average 1.6 +/- 1.4 and 1.3 +/- 0.9 hr/day, respectively. According to the regression model, male gender, younger age, duration of daily computer use, and ADHD inattention type were found to be independent risk factors for worsened posttest scores. Time spent playing computer games can exert a short-term effect on attention as measured by the Stroop test.

  5. Adapting the serial Alpgen parton-interaction generator to simulate LHC collisions on millions of parallel threads

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Childers, J. T.; Uram, T. D.; LeCompte, T. J.

    As the LHC moves to higher energies and luminosity, the demand for computing resources increases accordingly and will soon outpace the growth of the Worldwide LHC Computing Grid. To meet this greater demand, event generation Monte Carlo was targeted for adaptation to run on Mira, the supercomputer at the Argonne Leadership Computing Facility. Alpgen is a Monte Carlo event generation application that is used by LHC experiments in the simulation of collisions that take place in the Large Hadron Collider. This paper details the process by which Alpgen was adapted from a single-processor serial-application to a large-scale parallel-application and the performance that was achieved.

  6. Adapting the serial Alpgen parton-interaction generator to simulate LHC collisions on millions of parallel threads

    DOE PAGES

    Childers, J. T.; Uram, T. D.; LeCompte, T. J.; ...

    2016-09-29

    As the LHC moves to higher energies and luminosity, the demand for computing resources increases accordingly and will soon outpace the growth of the Worldwide LHC Computing Grid. To meet this greater demand, event generation Monte Carlo was targeted for adaptation to run on Mira, the supercomputer at the Argonne Leadership Computing Facility. Alpgen is a Monte Carlo event generation application that is used by LHC experiments in the simulation of collisions that take place in the Large Hadron Collider. Finally, this paper details the process by which Alpgen was adapted from a single-processor serial-application to a large-scale parallel-application and the performance that was achieved.

  7. Adapting the serial Alpgen parton-interaction generator to simulate LHC collisions on millions of parallel threads

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Childers, J. T.; Uram, T. D.; LeCompte, T. J.

    As the LHC moves to higher energies and luminosity, the demand for computing resources increases accordingly and will soon outpace the growth of the Worldwide LHC Computing Grid. To meet this greater demand, event generation Monte Carlo was targeted for adaptation to run on Mira, the supercomputer at the Argonne Leadership Computing Facility. Alpgen is a Monte Carlo event generation application that is used by LHC experiments in the simulation of collisions that take place in the Large Hadron Collider. Finally, this paper details the process by which Alpgen was adapted from a single-processor serial-application to a large-scale parallel-application and the performance that was achieved.

  8. Adaptive compressive ghost imaging based on wavelet trees and sparse representation.

    PubMed

    Yu, Wen-Kai; Li, Ming-Fei; Yao, Xu-Ri; Liu, Xue-Feng; Wu, Ling-An; Zhai, Guang-Jie

    2014-03-24

    Compressed sensing is a theory which can reconstruct an image almost perfectly with only a few measurements by finding its sparsest representation. However, the computation time consumed for large images may be a few hours or more. In this work, we both theoretically and experimentally demonstrate a method that combines the advantages of both adaptive computational ghost imaging and compressed sensing, which we call adaptive compressive ghost imaging, whereby both the reconstruction time and measurements required for any image size can be significantly reduced. The technique can be used to improve the performance of all computational ghost imaging protocols, especially when measuring ultra-weak or noisy signals, and can be extended to imaging applications at any wavelength.
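
    The compressed-sensing recovery step that such ghost-imaging protocols rely on can be illustrated with a plain iterative soft-thresholding (ISTA) solver. The sketch below is a generic l1-regularized reconstruction, assuming the sensing matrix A already incorporates the sparsifying basis; it does not reproduce the paper's adaptive wavelet-tree sampling scheme.

```python
import numpy as np

def ista_reconstruct(A, y, lam=0.05, n_iter=200):
    """Generic compressed-sensing reconstruction by iterative soft thresholding:
    minimize 0.5*||A x - y||^2 + lam*||x||_1 for a sparse x."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the data-fit gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)           # gradient of 0.5*||A x - y||^2
        z = x - grad / L                   # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold
    return x
```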

  9. Sensitivity analysis of a short distance atmospheric dispersion model applied to the Fukushima disaster

    NASA Astrophysics Data System (ADS)

    Périllat, Raphaël; Girard, Sylvain; Korsakissok, Irène; Mallet, Vinien

    2015-04-01

    In a previous study, the sensitivity of a long distance model was analyzed on the Fukushima Daiichi disaster case with the Morris screening method. It showed that a few variables, such as the horizontal diffusion coefficient or cloud thickness, have a weak influence on most of the chosen outputs. The purpose of the present study is to apply a similar methodology to the IRSN's operational short distance atmospheric dispersion model, called pX. Atmospheric dispersion models are very useful in the case of accidental releases of pollutants to minimize the population exposure during the accident and to obtain an accurate assessment of short and long term environmental and sanitary impact. Long range models are mostly used for consequence assessment while short range models are more adapted to the early phases of the crisis and are used to make prognoses. The Morris screening method was used to estimate the sensitivity of a set of outputs and to rank the inputs by their influences. The input ranking is highly dependent on the considered output, but a few variables seem to have a weak influence on most of them. This first step revealed that interactions and non-linearity are much more pronounced with the short range model than with the long range one. Afterward, the Sobol method was used to obtain more quantitative results on the same set of outputs. Using this method was possible for the short range model because it is far less computationally demanding than the long range model. The study also confronts two parameterizations, Doury's and Pasquill's models, to contrast their behavior. Doury's model seems to excessively inflate the influence of some inputs compared to Pasquill's model, such as the altitude of emission and the air stability, which do not have the same role in the two models. The outputs of the long range model were dominated by only a few inputs. On the contrary, in this study the influence is shared more evenly between the inputs.
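
    The Morris screening step used above can be sketched with a basic elementary-effects implementation. The code below is a simplified variant (one-at-a-time trajectories with a fixed positive step); `model` is a placeholder callable standing in for the pX dispersion model, which is not public, and all parameter names are illustrative.

```python
import numpy as np

def morris_elementary_effects(model, bounds, r=20, levels=4, seed=0):
    """Simplified Morris screening: r one-at-a-time trajectories over a grid of
    `levels` values per input. `model` maps a parameter vector to a scalar
    output; `bounds` is a list of (low, high) pairs. Returns mu* (mean absolute
    elementary effect) and sigma per input, used to rank input influence."""
    rng = np.random.default_rng(seed)
    k = len(bounds)
    delta = levels / (2.0 * (levels - 1))          # standard Morris step on [0, 1]
    grid = np.arange(levels) / (levels - 1)
    base_levels = grid[grid <= 1.0 - delta + 1e-12]  # start points that stay in [0, 1]
    effects = np.zeros((r, k))
    for t in range(r):
        x = rng.choice(base_levels, size=k)        # random base point in the unit cube
        order = rng.permutation(k)                 # random order of perturbations
        y_prev = model(_scale(x, bounds))
        for i in order:
            x[i] += delta                          # one-at-a-time positive step
            y_new = model(_scale(x, bounds))
            effects[t, i] = (y_new - y_prev) / delta
            y_prev = y_new
    return np.abs(effects).mean(axis=0), effects.std(axis=0)

def _scale(u, bounds):
    """Map unit-cube coordinates to the physical parameter ranges."""
    return np.array([lo + ui * (hi - lo) for ui, (lo, hi) in zip(u, bounds)])
```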

  10. Intestinal mucosal atrophy and adaptation

    PubMed Central

    Shaw, Darcy; Gohil, Kartik; Basson, Marc D

    2012-01-01

    Mucosal adaptation is an essential process in gut homeostasis. The intestinal mucosa adapts to a range of pathological conditions including starvation, short-gut syndrome, obesity, and bariatric surgery. Broadly, these adaptive functions can be grouped into proliferation and differentiation. These are influenced by diverse interactions with hormonal, immune, dietary, nervous, and mechanical stimuli. It seems likely that clinical outcomes can be improved by manipulating the physiology of adaptation. This review will summarize current understanding of the basic science surrounding adaptation, delineate the wide range of potential targets for therapeutic intervention, and discuss how these might be incorporated into an overall treatment plan. Deeper insight into the physiologic basis of adaptation will identify further targets for intervention to improve clinical outcomes. PMID:23197881

  11. An adaptive maneuvering logic computer program for the simulation of one-on-one air-to-air combat. Volume 1: General description

    NASA Technical Reports Server (NTRS)

    Burgin, G. H.; Fogel, L. J.; Phelps, J. P.

    1975-01-01

    A technique for computer simulation of air combat is described. Volume 1 describes the computer program and its development in general terms. Two versions of the program exist. Both incorporate a logic for selecting and executing air combat maneuvers with performance models of specific fighter aircraft. In the batch processing version the flight paths of two aircraft engaged in interactive aerial combat and controlled by the same logic are computed. The real-time version permits human pilots to fly air-to-air combat against the adaptive maneuvering logic (AML) in the Langley Differential Maneuvering Simulator (DMS). Volume 2 consists of a detailed description of the computer programs.

  12. ICCE/ICCAI 2000 Full & Short Papers (Virtual Lab/Classroom/School).

    ERIC Educational Resources Information Center

    2000

    This document contains the following full and short papers on virtual laboratories, classrooms, and schools from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction): (1) "A Collaborative Learning Support System Based on Virtual Environment Server for Multiple…

  13. Computed 3D visualisation of an extinct cephalopod using computer tomographs.

    PubMed

    Lukeneder, Alexander

    2012-08-01

    The first 3D visualisation of a heteromorph cephalopod species from the Southern Alps (Dolomites, northern Italy) is presented. Computed tomography, palaeontological data and 3D reconstructions were included in the production of a movie, which shows a life reconstruction of the extinct organism. This detailed reconstruction accords with current knowledge of the shape and mode of life as well as habitat of this animal. The results are based on the most complete shell known thus far of the genus Dissimilites. Object-based combined analyses from computed tomography and various computed 3D facility programmes help to understand morphological details as well as their ontogenetic changes in fossil material. In this study, an additional goal was to show changes in locomotion during different ontogenetic phases of such fossil, marine shell-bearing animals (ammonoids). Hence, the presented models and tools can serve as starting points for discussions on morphology and locomotion of extinct cephalopods in general, and of the genus Dissimilites in particular. The heteromorph ammonoid genus Dissimilites is interpreted here as an active swimmer of the Tethyan Ocean. This study portrays non-destructive methods of 3D visualisation applied to palaeontological material, starting with computed tomography and resulting in animated, high-quality video clips. The 3D geometrical models and animation presented here, which are based on palaeontological material, demonstrate the wide range of applications and analytical techniques and also outline possible limitations of 3D models in earth sciences and palaeontology. The realistic 3D models and motion pictures can easily be shared amongst palaeontologists. Data, images and short clips can be discussed online and, if necessary, adapted in morphological details and motion-style to better represent the cephalopod animal.

  14. Computed 3D visualisation of an extinct cephalopod using computer tomographs

    NASA Astrophysics Data System (ADS)

    Lukeneder, Alexander

    2012-08-01

    The first 3D visualisation of a heteromorph cephalopod species from the Southern Alps (Dolomites, northern Italy) is presented. Computed tomography, palaeontological data and 3D reconstructions were included in the production of a movie, which shows a life reconstruction of the extinct organism. This detailed reconstruction accords with current knowledge of the shape and mode of life as well as habitat of this animal. The results are based on the most complete shell known thus far of the genus Dissimilites. Object-based combined analyses from computed tomography and various computed 3D facility programmes help to understand morphological details as well as their ontogenetic changes in fossil material. In this study, an additional goal was to show changes in locomotion during different ontogenetic phases of such fossil, marine shell-bearing animals (ammonoids). Hence, the presented models and tools can serve as starting points for discussions on morphology and locomotion of extinct cephalopods in general, and of the genus Dissimilites in particular. The heteromorph ammonoid genus Dissimilites is interpreted here as an active swimmer of the Tethyan Ocean. This study portrays non-destructive methods of 3D visualisation applied to palaeontological material, starting with computed tomography and resulting in animated, high-quality video clips. The 3D geometrical models and animation presented here, which are based on palaeontological material, demonstrate the wide range of applications and analytical techniques and also outline possible limitations of 3D models in earth sciences and palaeontology. The realistic 3D models and motion pictures can easily be shared amongst palaeontologists. Data, images and short clips can be discussed online and, if necessary, adapted in morphological details and motion-style to better represent the cephalopod animal.

  15. Computed 3D visualisation of an extinct cephalopod using computer tomographs

    PubMed Central

    Lukeneder, Alexander

    2012-01-01

    The first 3D visualisation of a heteromorph cephalopod species from the Southern Alps (Dolomites, northern Italy) is presented. Computed tomography, palaeontological data and 3D reconstructions were included in the production of a movie, which shows a life reconstruction of the extinct organism. This detailed reconstruction accords with current knowledge of the shape and mode of life as well as habitat of this animal. The results are based on the most complete shell known thus far of the genus Dissimilites. Object-based combined analyses from computed tomography and various computed 3D facility programmes help to understand morphological details as well as their ontogenetic changes in fossil material. In this study, an additional goal was to show changes in locomotion during different ontogenetic phases of such fossil, marine shell-bearing animals (ammonoids). Hence, the presented models and tools can serve as starting points for discussions on morphology and locomotion of extinct cephalopods in general, and of the genus Dissimilites in particular. The heteromorph ammonoid genus Dissimilites is interpreted here as an active swimmer of the Tethyan Ocean. This study portrays non-destructive methods of 3D visualisation applied to palaeontological material, starting with computed tomography and resulting in animated, high-quality video clips. The 3D geometrical models and animation presented here, which are based on palaeontological material, demonstrate the wide range of applications and analytical techniques and also outline possible limitations of 3D models in earth sciences and palaeontology. The realistic 3D models and motion pictures can easily be shared amongst palaeontologists. Data, images and short clips can be discussed online and, if necessary, adapted in morphological details and motion-style to better represent the cephalopod animal. PMID:24850976

  16. A virtual reality atlas of craniofacial anatomy.

    PubMed

    Smith, Darren M; Oliker, Aaron; Carter, Christina R; Kirov, Miro; McCarthy, Joseph G; Cutting, Court B

    2007-11-01

    Head and neck anatomy is complex and represents an educational challenge to the student. Conventional two-dimensional illustrations inherently fall short in conveying intricate anatomical relationships that exist in three dimensions. A gratis three-dimensional virtual reality atlas of craniofacial anatomy is presented in an effort to address the paucity of readily accessible and customizable three-dimensional educational material available to the student of head and neck anatomy. Three-dimensional model construction was performed in Alias Maya 4.5 and 6.0. A basic three-dimensional skull model was altered to include surgical landmarks and proportions. Some of the soft tissues were adapted from previous work, whereas others were constructed de novo. Texturing was completed with Adobe Photoshop 7.0 and Maya. The Internet application was designed in Viewpoint Enliven 1.0. A three-dimensional computer model of craniofacial anatomy (bone and soft tissue) was completed. The model is compatible with many software packages and can be accessed by means of the Internet or downloaded to a personal computer. As the three-dimensional meshes are publicly available, they can be extensively manipulated by the user, even at the polygonal level. Three-dimensional computer graphics has yet to be fully exploited for head and neck anatomy education. In this context, the authors present a publicly available computer model of craniofacial anatomy. This model may also find applications beyond clinical medicine. The model can be accessed gratis at the Plastic and Reconstructive Surgery Web site or obtained as a three-dimensional mesh, also gratis, by contacting the authors.

  17. An ESS treatment of the pattern of female arrival at the mating site in the yellow dung fly scathophaga stercoraria (L.)

    PubMed

    Reuter; Ward; Blanckenhorn

    1998-12-07

    In most previous work on the yellow dung fly Scathophaga stercoraria (L.), as on other species, adaptive explanations have been sought for male behaviour whereas female behaviour has not been examined in similar detail. Here, the arrival of females at the mating site, fresh cattle droppings, is investigated. While almost all males are present shortly after pat deposition, females arrive at a low, decreasing rate over an interval of about 5 hours. We propose that the distribution of female arrival times represents a mixed Evolutionarily Stable Strategy (ESS), formed by different trade-offs between costs and benefits of early and late arrival. Early arrival could be favoured by advantages due to better conditions for oviposition, faster egg development, or reduced larval competition. Late arrival could be favoured by negative effects on females of male-male competition being weaker later after deposition. Computer simulations with distributions of arrival times deviating from the natural one were performed to "measure" the costs for females arriving at different times. These costs were compared with estimated benefits corresponding to the females' arrival times. This procedure revealed that females coming to the pat later in a population of females arriving shortly after deposition would be favoured. In a population arriving according to a uniform distribution, early females would have fitness advantages. Thus, evolution should lead to an intermediate distribution of arrival times, as in nature, i.e. female arrival behaviour is probably adaptive. The simulations also revealed that the intensity of sexual selection through male-male competition is highest with the natural pattern of female arrival. Therefore, natural selection generating this pattern amplifies the intensity of male-male interaction as a by-product. Copyright 1998 Academic Press

  18. Necessary, yet dissociable contributions of the insular and ventromedial prefrontal cortices to norm adaptation: computational and lesion evidence in humans.

    PubMed

    Gu, Xiaosi; Wang, Xingchao; Hula, Andreas; Wang, Shiwei; Xu, Shuai; Lohrenz, Terry M; Knight, Robert T; Gao, Zhixian; Dayan, Peter; Montague, P Read

    2015-01-14

    Social norms and their enforcement are fundamental to human societies. The ability to detect deviations from norms and to adapt to norms in a changing environment is therefore important to individuals' normal social functioning. Previous neuroimaging studies have highlighted the involvement of the insular and ventromedial prefrontal (vmPFC) cortices in representing norms. However, the necessity and dissociability of their involvement remain unclear. Using computational modeling and neuropsychological lesion approaches, we examined the contributions of the insula and vmPFC to norm adaptation in seven human patients with focal insula lesions and six patients with focal vmPFC lesions, in comparison with forty neurologically intact controls and six brain-damaged controls. There were three computational signals of interest as participants played a fairness game (ultimatum game): sensitivity to the fairness of offers, sensitivity to deviations from expected norms, and the speed at which people adapt to norms. Significant group differences were assessed using bootstrapping methods. Patients with insula lesions displayed abnormally low adaptation speed to norms, yet detected norm violations with greater sensitivity than controls. Patients with vmPFC lesions did not have such abnormalities, but displayed reduced sensitivity to fairness and were more likely to accept the most unfair offers. These findings provide compelling computational and lesion evidence supporting the necessary, yet dissociable roles of the insula and vmPFC in norm adaptation in humans: the insula is critical for learning to adapt when reality deviates from norm expectations, and the vmPFC is important for valuation of fairness during social exchange. Copyright © 2015 Gu et al.
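
    A minimal sketch of the kind of norm-adaptation model referred to above is shown below for an ultimatum-game responder: an internal fairness norm updated by a prediction error (adaptation speed as a learning rate), a utility penalty for offers below the norm (sensitivity to norm violation), and a softmax choice rule. Parameter names and the exact functional form are illustrative assumptions, not the study's fitted model.

```python
import numpy as np

def simulate_responder(offers, alpha=0.3, norm0=0.5, envy=5.0, temp=0.2):
    """Norm-adaptation sketch for an ultimatum-game responder.
    `offers` are proposer offers as fractions of the stake; `alpha` is the
    adaptation speed, `envy` the sensitivity to norm violations."""
    norm = norm0
    accept_prob, norms = [], []
    for o in offers:
        # utility of accepting: the offer minus a penalty for falling short of the norm
        u_accept = o - envy * max(norm - o, 0.0)
        u_reject = 0.0
        p = 1.0 / (1.0 + np.exp(-(u_accept - u_reject) / temp))   # softmax choice rule
        accept_prob.append(p)
        norms.append(norm)
        norm += alpha * (o - norm)          # norm prediction-error update
    return np.array(accept_prob), np.array(norms)
```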

  19. Probabilistic co-adaptive brain-computer interfacing

    NASA Astrophysics Data System (ADS)

    Bryan, Matthew J.; Martin, Stefan A.; Cheung, Willy; Rao, Rajesh P. N.

    2013-12-01

    Objective. Brain-computer interfaces (BCIs) are confronted with two fundamental challenges: (a) the uncertainty associated with decoding noisy brain signals, and (b) the need for co-adaptation between the brain and the interface so as to cooperatively achieve a common goal in a task. We seek to mitigate these challenges. Approach. We introduce a new approach to brain-computer interfacing based on partially observable Markov decision processes (POMDPs). POMDPs provide a principled approach to handling uncertainty and achieving co-adaptation in the following manner: (1) Bayesian inference is used to compute posterior probability distributions (‘beliefs’) over brain and environment state, and (2) actions are selected based on entire belief distributions in order to maximize total expected reward; by employing methods from reinforcement learning, the POMDP’s reward function can be updated over time to allow for co-adaptive behaviour. Main results. We illustrate our approach using a simple non-invasive BCI which optimizes the speed-accuracy trade-off for individual subjects based on the signal-to-noise characteristics of their brain signals. We additionally demonstrate that the POMDP BCI can automatically detect changes in the user’s control strategy and can co-adaptively switch control strategies on-the-fly to maximize expected reward. Significance. Our results suggest that the framework of POMDPs offers a promising approach for designing BCIs that can handle uncertainty in neural signals and co-adapt with the user on an ongoing basis. The fact that the POMDP BCI maintains a probability distribution over the user’s brain state allows a much more powerful form of decision making than traditional BCI approaches, which have typically been based on the output of classifiers or regression techniques. Furthermore, the co-adaptation of the system allows the BCI to make online improvements to its behaviour, adjusting itself automatically to the user’s changing circumstances.
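
    The belief-tracking core of such a POMDP-based BCI can be sketched as a discrete Bayes filter plus a belief-based action rule. The functions below are a minimal illustration under assumed discrete states and a hand-set commitment threshold; a full POMDP solver would compute this wait-versus-commit trade-off from the reward function rather than from a fixed threshold, and this is not the authors' implementation.

```python
import numpy as np

def belief_update(belief, likelihood, transition):
    """One step of the discrete Bayes filter a POMDP BCI would run:
    propagate the belief through the state-transition model, weight it by the
    likelihood of the newly decoded (noisy) neural observation, renormalize."""
    predicted = transition.T @ belief            # prediction step
    posterior = predicted * likelihood           # measurement update
    return posterior / posterior.sum()

def select_action(belief, q_values, commit_threshold=0.8):
    """Pick the action with highest expected reward under the current belief,
    but defer (return None, i.e. keep observing) while the belief is too
    uncertain to commit; `q_values` is an (actions x states) reward table."""
    expected = q_values @ belief                 # expected reward of each action
    best = int(np.argmax(expected))
    return best if belief.max() >= commit_threshold else None
```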

  20. Computational efficiency and Amdahl’s law for the adaptive resolution simulation technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Junghans, Christoph; Agarwal, Animesh; Delle Site, Luigi

    Here, we discuss the computational performance of the adaptive resolution technique in molecular simulation when it is compared with equivalent full coarse-grained and full atomistic simulations. We show that an estimate of its efficiency, within 10%–15% accuracy, is given by the Amdahl’s Law adapted to the specific quantities involved in the problem. The derivation of the predictive formula is general enough that it may be applied to the general case of molecular dynamics approaches where a reduction of degrees of freedom in a multi scale fashion occurs.

  1. Computational efficiency and Amdahl’s law for the adaptive resolution simulation technique

    DOE PAGES

    Junghans, Christoph; Agarwal, Animesh; Delle Site, Luigi

    2017-06-01

    Here, we discuss the computational performance of the adaptive resolution technique in molecular simulation when it is compared with equivalent full coarse-grained and full atomistic simulations. We show that an estimate of its efficiency, within 10%–15% accuracy, is given by the Amdahl’s Law adapted to the specific quantities involved in the problem. The derivation of the predictive formula is general enough that it may be applied to the general case of molecular dynamics approaches where a reduction of degrees of freedom in a multi scale fashion occurs.
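
    A back-of-the-envelope version of this Amdahl-type argument can be written as a one-line estimate: treat the atomistic region as the fraction of work that cannot be cheapened and the coarse-grained remainder as running at a reduced relative cost. The functional form and numbers below are illustrative, not the paper's calibrated expression.

```python
def adress_speedup_estimate(frac_atomistic, cost_ratio_cg):
    """Amdahl-style speedup estimate for an adaptive resolution run relative to
    a fully atomistic one: `frac_atomistic` is the fraction of the system kept
    at atomistic resolution, `cost_ratio_cg` the relative per-particle cost of
    the coarse-grained region (e.g. 0.1 means 10x cheaper)."""
    return 1.0 / (frac_atomistic + (1.0 - frac_atomistic) * cost_ratio_cg)

# Example: 20% of the system atomistic, coarse-grained part ~10x cheaper
# -> estimated speedup of roughly 3.6x over a fully atomistic simulation.
print(adress_speedup_estimate(0.2, 0.1))
```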

  2. Sparse-grid, reduced-basis Bayesian inversion: Nonaffine-parametric nonlinear equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Peng, E-mail: peng@ices.utexas.edu; Schwab, Christoph, E-mail: christoph.schwab@sam.math.ethz.ch

    2016-07-01

    We extend the reduced basis (RB) accelerated Bayesian inversion methods for affine-parametric, linear operator equations which are considered in [16,17] to non-affine, nonlinear parametric operator equations. We generalize the analysis of sparsity of parametric forward solution maps in [20] and of Bayesian inversion in [48,49] to the fully discrete setting, including Petrov–Galerkin high-fidelity (“HiFi”) discretization of the forward maps. We develop adaptive, stochastic collocation based reduction methods for the efficient computation of reduced bases on the parametric solution manifold. The nonaffinity and nonlinearity with respect to (w.r.t.) the distributed, uncertain parameters and the unknown solution is collocated; specifically, by the so-called Empirical Interpolation Method (EIM). For the corresponding Bayesian inversion problems, computational efficiency is enhanced in two ways: first, expectations w.r.t. the posterior are computed by adaptive quadratures with dimension-independent convergence rates proposed in [49]; the present work generalizes [49] to account for the impact of the PG discretization in the forward maps on the convergence rates of the Quantities of Interest (QoI for short). Second, we propose to perform the Bayesian estimation only w.r.t. a parsimonious, RB approximation of the posterior density. Based on the approximation results in [49], the infinite-dimensional parametric, deterministic forward map and operator admit N-term RB and EIM approximations which converge at rates which depend only on the sparsity of the parametric forward map. In several numerical experiments, the proposed algorithms exhibit dimension-independent convergence rates which equal, at least, the currently known rate estimates for N-term approximation. We propose to accelerate Bayesian estimation by first offline construction of reduced basis surrogates of the Bayesian posterior density. The parsimonious surrogates can then be employed for online data assimilation and for Bayesian estimation. They also open a perspective for optimal experimental design.

  3. Accessible bioprinting: adaptation of a low-cost 3D-printer for precise cell placement and stem cell differentiation.

    PubMed

    Reid, John A; Mollica, Peter A; Johnson, Garett D; Ogle, Roy C; Bruno, Robert D; Sachs, Patrick C

    2016-06-07

    The precision and repeatability offered by computer-aided design and computer-numerically controlled techniques in biofabrication processes are quickly becoming an industry standard. However, many hurdles still exist before these techniques can be used in research laboratories for cellular and molecular biology applications. Extrusion-based bioprinting systems have been characterized by high development costs, injector clogging, difficulty achieving small cell number deposits, decreased cell viability, and altered cell function post-printing. To circumvent the high-price barrier to entry of conventional bioprinters, we designed and 3D printed components for the adaptation of an inexpensive 'off-the-shelf' commercially available 3D printer. We also demonstrate via goal-based computer simulations that the needle geometries of conventional commercially standardized, 'luer-lock' syringe-needle systems cause many of the issues plaguing conventional bioprinters. To address these performance limitations we optimized flow within several microneedle geometries, which revealed that a short tapered injector design with minimal cylindrical needle length was ideal to minimize cell strain and accretion. We then experimentally quantified these geometries using pulled glass microcapillary pipettes and our modified, low-cost 3D printer. This system's performance validated our models, exhibiting reduced clogging, single-cell print resolution, and maintenance of cell viability without the use of a sacrificial vehicle. Using this system we show the successful printing of human induced pluripotent stem cells (hiPSCs) into Geltrex and note their retention of a pluripotent state 7 d post printing. We also show embryoid body differentiation of hiPSC by injection into differentiation conducive environments, wherein we observed continuous growth, emergence of various evaginations, and post-printing gene expression indicative of the presence of all three germ layers. These data demonstrate an accessible open-source 3D bioprinter capable of serving the needs of any laboratory interested in 3D cellular interactions and tissue engineering.

  4. Slow adaptation of ventricular repolarization as a cause of arrhythmia?

    PubMed

    Bueno-Orovio, A; Hanson, B M; Gill, J S; Taggart, P; Rodriguez, B

    2014-01-01

    This article is part of the Focus Theme of Methods of Information in Medicine on "Biosignal Interpretation: Advanced Methods for Studying Cardiovascular and Respiratory Systems". Adaptation of the QT-interval to changes in heart rate reflects on the body-surface electrocardiogram the adaptation of action potential duration (APD) at the cellular level. The initial fast phase of APD adaptation has been shown to modulate the arrhythmia substrate. Whether the slow phase is potentially proarrhythmic remains unclear. Our objective was to analyze in-vivo human data and use computer simulations to examine effects of the slow APD adaptation phase on dispersion of repolarization and reentry in the human ventricle. Electrograms were acquired from 10 left and 10 right ventricle (LV/RV) endocardial sites in 15 patients with normal ventricles during RV pacing. Activation-recovery intervals, as a surrogate for APD, were measured during a sustained increase in heart rate. Observed dynamics were studied using computer simulations of human tissue electrophysiology. Spatial heterogeneity of rate adaptation was observed in all patients. Inhomogeneity in slow APD adaptation time constants (Δτ(s)) was greater in LV than RV (Δτ(s)(LV) = 31.8 ± 13.2, Δτ(s)(RV) = 19.0 ± 12.8 s, P < 0.01). Simulations showed that altering local slow time constants of adaptation was sufficient to convert partial wavefront block to block with successful reentry. Using electrophysiological data acquired in-vivo in humans together with computer simulations, we identify heterogeneity in the slow phase of APD adaptation as an important component of arrhythmogenesis.
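
    Extracting a slow adaptation time constant from activation-recovery interval data of the kind described above amounts to fitting an exponential relaxation. The sketch below assumes a single-exponential form and illustrative initial guesses; the study's actual fitting protocol may differ.

```python
import numpy as np
from scipy.optimize import curve_fit

def fit_slow_adaptation(t, apd, apd_inf_guess=250.0, tau_guess=60.0):
    """Fit APD(t) = APD_inf + A*exp(-t/tau) to activation-recovery intervals
    measured after a sustained rate increase; returns (APD_inf, A, tau).
    Initial guesses are illustrative placeholders."""
    model = lambda t, apd_inf, a, tau: apd_inf + a * np.exp(-t / tau)
    p0 = (apd_inf_guess, apd[0] - apd_inf_guess, tau_guess)
    (apd_inf, a, tau), _ = curve_fit(model, t, apd, p0=p0, maxfev=10000)
    return apd_inf, a, tau
```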

  5. Predictor-Based Model Reference Adaptive Control

    NASA Technical Reports Server (NTRS)

    Lavretsky, Eugene; Gadient, Ross; Gregory, Irene M.

    2009-01-01

    This paper is devoted to robust, Predictor-based Model Reference Adaptive Control (PMRAC) design. The proposed adaptive system is compared with the now-classical Model Reference Adaptive Control (MRAC) architecture. Simulation examples are presented. Numerical evidence indicates that the proposed PMRAC tracking architecture has better transient characteristics than MRAC. In this paper, we presented a state-predictor-based direct adaptive tracking design methodology for multi-input dynamical systems with partially known dynamics. Efficiency of the design was demonstrated using the short-period dynamics of an aircraft. Formal proof of the reported PMRAC benefits constitutes future research and will be reported elsewhere.
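
    For reference, the now-classical MRAC baseline that PMRAC is compared against can be sketched for a scalar plant as below. The gains, reference model, and adaptation rate are illustrative choices, and the predictor-based modification itself is not shown.

```python
import numpy as np

def simulate_mrac(a=1.0, b=2.0, a_m=-4.0, b_m=4.0, gamma=10.0, dt=1e-3, T=10.0):
    """Classical direct MRAC for the scalar plant xdot = a*x + b*u with
    reference model xmdot = a_m*xm + b_m*r (a_m < 0) and control
    u = kx*x + kr*r; Lyapunov-rule adaptation of kx, kr assuming sign(b) > 0."""
    n = int(T / dt)
    x, xm, kx, kr = 0.0, 0.0, 0.0, 0.0
    hist = np.zeros((n, 3))
    for i in range(n):
        t = i * dt
        r = 1.0 if (t % 4.0) < 2.0 else -1.0      # square-wave reference command
        e = x - xm                                 # tracking error
        u = kx * x + kr * r
        x += dt * (a * x + b * u)                  # plant (forward Euler)
        xm += dt * (a_m * xm + b_m * r)            # reference model
        kx += dt * (-gamma * e * x)                # adaptation laws
        kr += dt * (-gamma * e * r)
        hist[i] = (x, xm, e)
    return hist
```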

  6. Load Balancing Unstructured Adaptive Grids for CFD Problems

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Oliker, Leonid

    1996-01-01

    Mesh adaption is a powerful tool for efficient unstructured-grid computations but causes load imbalance among processors on a parallel machine. A dynamic load balancing method is presented that balances the workload across all processors with a global view. After each parallel tetrahedral mesh adaption, the method first determines if the new mesh is sufficiently unbalanced to warrant a repartitioning. If so, the adapted mesh is repartitioned, with new partitions assigned to processors so that the redistribution cost is minimized. The new partitions are accepted only if the remapping cost is compensated by the improved load balance. Results indicate that this strategy is effective for large-scale scientific computations on distributed-memory multiprocessors.
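
    The repartitioning decision described above can be sketched as a simple cost/benefit test: repartition only if the adapted mesh is sufficiently unbalanced, and accept the new mapping only if the projected gain outweighs the data-redistribution cost. The threshold and cost model below are illustrative placeholders, not the paper's heuristics.

```python
def maybe_rebalance(loads, remap_cost, threshold=1.2):
    """Schematic accept/reject logic after a mesh adaption step.
    `loads` is the per-processor work of the adapted mesh; `remap_cost` is an
    estimate (in the same work units) of moving data to the new partitions."""
    avg = sum(loads) / len(loads)
    imbalance = max(loads) / avg                  # >1 means some processor is overloaded
    if imbalance < threshold:
        return "keep current partitions"          # not unbalanced enough to repartition
    gain = max(loads) - avg                       # work saved per step by perfect balance
    if gain > remap_cost:
        return "repartition and remap"
    return "repartition rejected (remapping too expensive)"
```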

  7. Short term load forecasting using a self-supervised adaptive neural network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, H.; Pimmel, R.L.

    The authors developed a self-supervised adaptive neural network to perform short term load forecasts (STLF) for a large power system covering a wide service area with several heavy load centers. They used the self-supervised network to extract correlational features from temperature and load data. In using data from the calendar year 1993 as a test case, they found a 0.90 percent error for hour-ahead forecasting and 1.92 percent error for day-ahead forecasting. These levels of error compare favorably with those obtained by other techniques. The algorithm ran in a couple of minutes on a PC containing an Intel Pentium 120 MHz CPU. Since the algorithm included searching the historical database, training the network, and actually performing the forecasts, this approach provides a real-time, portable, and adaptable STLF.
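
    As a rough illustration of hour-ahead short term load forecasting (not the paper's self-supervised adaptive network), the sketch below refits a linear model on a recent window of load and temperature each time a forecast is made, so the predictor adapts as new observations arrive.

```python
import numpy as np

def hour_ahead_forecast(history_load, history_temp, next_temp, window=24):
    """Illustrative adaptive hour-ahead forecaster: least-squares fit of the
    next hour's load on the previous `window` hourly loads plus the forecast
    hour's temperature, refit on the most recent history at every call."""
    X, y = [], []
    for t in range(window, len(history_load) - 1):
        X.append(np.concatenate([history_load[t - window:t], [history_temp[t + 1]]]))
        y.append(history_load[t + 1])
    X, y = np.asarray(X), np.asarray(y)
    coef, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], y, rcond=None)
    x_new = np.concatenate([history_load[-window:], [next_temp], [1.0]])
    return float(x_new @ coef)
```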

  8. ICCE/ICCAI 2000 Full & Short Papers (Knowledge Construction and Navigation).

    ERIC Educational Resources Information Center

    2000

    This document contains the following full and short papers on knowledge construction and navigation from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction): (1) "An XML-Based Tool for Building and Using Conceptual Maps in Education and Training Environments"…

  9. ICCE/ICCAI 2000 Full & Short Papers (Lifelong Learning).

    ERIC Educational Resources Information Center

    2000

    This document contains the following full and short papers on lifelong learning from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction): (1) "A Study on the School Information Technology Pilot Scheme: Possibilities of Creative and Lifelong Learning" (Siu-Cheung Kong,…

  10. ICCE/ICCAI 2000 Full & Short Papers (Intelligent Tutoring Systems).

    ERIC Educational Resources Information Center

    2000

    This document contains the full and short papers on intelligent tutoring systems (ITS) from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction) covering the following topics: a framework for Internet-based distributed learning; a fuzzy-based assessment for the Perl tutoring…

  11. ICCE/ICCAI 2000 Full & Short Papers (Interactive Learning Environments).

    ERIC Educational Resources Information Center

    2000

    This document contains the full and short papers on interactive learning environments from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction) covering the following topics: a CAL system for appreciation of 3D shapes by surface development; a constructivist virtual physics…

  12. ICCE/ICCAI 2000 Full & Short Papers (System Design and Development).

    ERIC Educational Resources Information Center

    2000

    This document contains the full and short papers on system design and development from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction) covering the following topics: a code restructuring tool to help scaffold novice programmers; a framework for Internet-based…

  13. Computing The No-Escape Envelope Of A Short-Range Missile

    NASA Technical Reports Server (NTRS)

    Neuman, Frank

    1991-01-01

    Method for computing no-escape envelope of short-range air-to-air missile devised. Useful for analysis of both strategies for avoidance and strategies for attack. With modifications, also useful in analysis of control strategies for one-on-one air-to-air combat, or wherever multiple control strategies considered.

  14. ICCE/ICCAI 2000 Full & Short Papers (Multimedia and Hypermedia in Education).

    ERIC Educational Resources Information Center

    2000

    This document contains the full and short papers on multimedia and hypermedia in education from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction) covering the following topics: learner-centered navigation path planning in world Wide Web-based learning; the relation…

  15. ICCE/ICCAI 2000 Full & Short Papers (Methodologies).

    ERIC Educational Resources Information Center

    2000

    This document contains the full text of the following full and short papers on methodologies from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction): (1) "A Methodology for Learning Pattern Analysis from Web Logs by Interpreting Web Page Contents" (Chih-Kai Chang and…

  16. ICCE/ICCAI 2000 Full & Short Papers (Teaching and Learning Processes).

    ERIC Educational Resources Information Center

    2000

    This document contains the full and short papers on teaching and learning processes from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction) covering the following topics: a code restructuring tool to help scaffold novice programmers; efficient study of Kanji using…

  17. Method for Assessing Contrast Performance under Lighting Conditions such as Entering a Tunnel on Sunny Day.

    PubMed

    Huang, Y; Menozzi, M

    2015-04-01

    Clinical assessment of dark adaptation is time consuming and requires specialised instrumentation such as a nyktometer. It is therefore not surprising that dark adaptation is rarely tested in practice. As for the case of testing fitness of a driver, demands on adaptation in daily driving tasks mostly depart from settings in a nyktometer. In daily driving, adaptation is stressed by high and fast transitions of light levels, and the period of time which is relevant to safe driving starts right after a transition and ends several seconds later. In the nyktometer, dark adaptation is tested after completion of the adaptation process. Results of a nyktometer test may therefore deliver little information about adaptation shortly after light transitions. In an attempt to develop a clinical test that both keeps measurement time short and offers test conditions comparable to those in driving, we conducted a preliminary study in which contrast sensitivity thresholds were recorded for light transitions as found in daily driving tasks and for various times after transition onsets. Contrast sensitivity performance is compared to dark adaptation performance as assessed by a nyktometer. Contrast sensitivity thresholds were recorded in 17 participants by means of a twin projection apparatus. The apparatus enabled the projection of an adapting field and of a Landolt ring, both with a variable luminance. Five different stepwise transitions in levels of adapting luminance were tested. All transitions occurred from bright to dark. The Landolt ring was flashed 100 or 500 ms after the transition had occurred. Participants were instructed to report the orientation of the Landolt ring. A Rodenstock Nyktometer, Plate 501, was used to record the dark adaptation threshold. Experimental data from the proposed test revealed a noticeably higher contrast detection threshold measured in dark adaptation for the stronger transition from 14 000 to 8 cd/m2 than for the weaker transition from 2000 to 8 cd/m2. By raising the dark adaptation luminance level from 8 to 60 cd/m2 in the stronger transition case, the contrast detection threshold was then improved by a factor of four. Another main finding showed that for the adaptation process from strong glare stimuli to dark adaptation, a peak deterioration in contrast sensitivity occurred at the light adaptation level of 6000 cd/m2. Comparing the contrast performance assessed by the proposed test with that of the nyktometer test, there was no clear correlation between the two methods. Our suggested method to assess dark adaptation performance proved to be practical in use and, since the patient does not have to spend a long time to attain complete dark adaptation, the method required a short time for measurement. Our negative experience in the use of the nyktometer was in agreement with reported experience in the literature. Georg Thieme Verlag KG Stuttgart · New York.

  18. Cross-cultural adaptation and validation of the Danish version of the Short Musculoskeletal Function Assessment questionnaire (SMFA).

    PubMed

    Lindahl, Marianne; Andersen, Signe; Joergensen, Annette; Frandsen, Christian; Jensen, Liselotte; Benedikz, Eirikur

    2018-01-01

    The aim of this study was to translate and culturally adapt the Short Musculoskeletal Function Assessment (SMFA) into Danish (SMFA-DK) and assess its psychometric properties. The SMFA was translated and cross-culturally adapted according to a standardized procedure. Minor changes in the wording of three items were made to adapt to Danish conditions. Acute patients (n = 201) and rehabilitation patients (n = 231) with musculoskeletal problems aged 18-87 years were included. The following analyses were made to evaluate the psychometric quality of SMFA-DK: reliability with Cronbach's alpha, content validity by coding according to the International Classification of Functioning, Disability and Health (ICF), floor/ceiling effects, construct validity by factor analysis, correlations between SMFA-DK and the Short Form 36 (SF-36), and the known-groups method. Responsiveness and effect size were calculated. Cronbach's alpha values were between 0.79 and 0.94. SMFA-DK captured all components of the ICF, and there were no floor/ceiling effects. Factor analysis demonstrated four subscales. SMFA-DK correlated well with the SF-36 subscales for the rehabilitation patients and less strongly for the newly injured patients. Effect sizes were excellent and better for SMFA-DK than for SF-36. The study indicates that SMFA-DK can be a valid and responsive measure of outcome in rehabilitation settings.

  19. Adaptive Conditioning of Multiple-Point Geostatistical Facies Simulation to Flow Data with Facies Probability Maps

    NASA Astrophysics Data System (ADS)

    Khodabakhshi, M.; Jafarpour, B.

    2013-12-01

    Characterization of complex geologic patterns that create preferential flow paths in certain reservoir systems requires higher-order geostatistical modeling techniques. Multipoint statistics (MPS) provides a flexible grid-based approach for simulating such complex geologic patterns from a conceptual prior model known as a training image (TI). In this approach, a stationary TI that encodes the higher-order spatial statistics of the expected geologic patterns is used to represent the shape and connectivity of the underlying lithofacies. While MPS is quite powerful for describing complex geologic facies connectivity, the nonlinear and complex relation between the flow data and facies distribution makes flow data conditioning quite challenging. We propose an adaptive technique for conditioning facies simulation from a prior TI to nonlinear flow data. Non-adaptive strategies for conditioning facies simulation to flow data can involve many forward flow model solutions that can be computationally very demanding. To improve the conditioning efficiency, we develop an adaptive sampling approach through a data feedback mechanism based on the sampling history. In this approach, after a short period of sampling burn-in time where unconditional samples are generated and passed through an acceptance/rejection test, an ensemble of accepted samples is identified and used to generate a facies probability map. This facies probability map contains the common features of the accepted samples and provides conditioning information about facies occurrence in each grid block, which is used to guide the conditional facies simulation process. As the sampling progresses, the initial probability map is updated according to the collective information about the facies distribution in the chain of accepted samples to increase the acceptance rate and efficiency of the conditioning. This conditioning process can be viewed as an optimization approach where each new sample is proposed based on the sampling history to improve the data mismatch objective function. We extend the application of this adaptive conditioning approach to the case where multiple training images are proposed to describe the geologic scenario in a given formation. We discuss the advantages and limitations of the proposed adaptive conditioning scheme and use numerical experiments from fluvial channel formations to demonstrate its applicability and performance compared to non-adaptive conditioning techniques.
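
    The record above describes the adaptive conditioning loop only in words; the sketch below illustrates the burn-in / acceptance-rejection / probability-map-update cycle. The helpers `simulate_facies` and `flow_mismatch` are hypothetical stand-ins for an MPS simulator and a forward flow solver, and the thresholds are illustrative choices, not values from the paper.

    ```python
    import numpy as np

    def adaptive_conditioning(simulate_facies, flow_mismatch, n_burn_in=50, n_samples=500, tol=0.1):
        """Sketch of conditioning MPS facies realizations to flow data.

        simulate_facies(prob_map) -> binary facies array (hypothetical MPS call;
                                     prob_map=None means unconditional sampling)
        flow_mismatch(facies)     -> scalar data-mismatch objective (hypothetical forward model)
        """
        accepted = []
        prob_map = None                                   # unconditional during burn-in

        for i in range(n_samples):
            realization = simulate_facies(prob_map)       # guided once a map exists
            if flow_mismatch(realization) < tol:          # acceptance/rejection test
                accepted.append(realization)
            # After burn-in, build/update the facies probability map from the chain
            # of accepted samples: per-cell fraction of accepted realizations with
            # channel facies, used to guide subsequent conditional simulation.
            if i >= n_burn_in and accepted:
                prob_map = np.mean(accepted, axis=0)
        return accepted, prob_map
    ```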

  20. Collective Signal Processing in Cluster Chemotaxis: Roles of Adaptation, Amplification, and Co-attraction in Collective Guidance

    PubMed Central

    Camley, Brian A.; Zimmermann, Juliane; Levine, Herbert; Rappel, Wouter-Jan

    2016-01-01

    Single eukaryotic cells commonly sense and follow chemical gradients, performing chemotaxis. Recent experiments and theories, however, show that even when single cells do not chemotax, clusters of cells may, if their interactions are regulated by the chemoattractant. We study this general mechanism of “collective guidance” computationally with models that integrate stochastic dynamics for individual cells with biochemical reactions within the cells, and diffusion of chemical signals between the cells. We show that if clusters of cells use the well-known local excitation, global inhibition (LEGI) mechanism to sense chemoattractant gradients, the speed of the cell cluster becomes non-monotonic in the cluster’s size—clusters either larger or smaller than an optimal size will have lower speed. We argue that the cell cluster speed is a crucial readout of how the cluster processes chemotactic signals; both amplification and adaptation will alter the behavior of cluster speed as a function of size. We also show that, contrary to the assumptions of earlier theories, collective guidance does not require persistent cell-cell contacts and strong short range adhesion. If cell-cell adhesion is absent, and the cluster cohesion is instead provided by a co-attraction mechanism, e.g. chemotaxis toward a secreted molecule, collective guidance may still function. However, new behaviors, such as cluster rotation, may also appear in this case. Co-attraction and adaptation allow for collective guidance that is robust to varying chemoattractant concentrations while not requiring strong cell-cell adhesion. PMID:27367541

  1. Molecular PET imaging for biology-guided adaptive radiotherapy of head and neck cancer.

    PubMed

    Hoeben, Bianca A W; Bussink, Johan; Troost, Esther G C; Oyen, Wim J G; Kaanders, Johannes H A M

    2013-10-01

    Integration of molecular imaging PET techniques into therapy selection strategies and radiation treatment planning for head and neck squamous cell carcinoma (HNSCC) can serve several purposes. First, pre-treatment assessments can steer decisions about radiotherapy modifications or combinations with other modalities. Second, biology-based objective functions can be introduced to the radiation treatment planning process by co-registration of molecular imaging with planning computed tomography (CT) scans. Thus, customized heterogeneous dose distributions can be generated with escalated doses to tumor areas where radiotherapy resistance mechanisms are most prevalent. Third, monitoring of temporal and spatial variations in these radiotherapy resistance mechanisms early during the course of treatment can discriminate responders from non-responders. With such information available shortly after the start of treatment, modifications can be implemented or the radiation treatment plan can be adapted to follow the biological response pattern. Currently, these strategies are in various phases of clinical testing, mostly in single-center studies. Further validation in a multicenter setting is needed. Ultimately, this should result in availability for routine clinical practice requiring stable production and accessibility of tracers, reproducibility and standardization of imaging and analysis methods, as well as general availability of knowledge and expertise. Small studies employing adaptive radiotherapy based on functional dynamics and early response mechanisms demonstrate promising results. In this context, we focus this review on the widely used PET tracer (18)F-FDG and PET tracers depicting hypoxia and proliferation; two well-known radiation resistance mechanisms.

  2. Ventricular structure, function, and mechanics at high altitude: chronic remodeling in Sherpa vs. short-term lowlander adaptation.

    PubMed

    Stembridge, Mike; Ainslie, Philip N; Hughes, Michael G; Stöhr, Eric J; Cotter, James D; Nio, Amanda Q X; Shave, Rob

    2014-08-01

    Short-term, high-altitude (HA) exposure raises pulmonary artery systolic pressure (PASP) and decreases left-ventricular (LV) volumes. However, relatively little is known of the long-term cardiac consequences of prolonged exposure in Sherpa, a highly adapted HA population. To investigate short-term adaptation and potential long-term cardiac remodeling, we studied ventricular structure and function in Sherpa at 5,050 m (n = 11; 31 ± 13 yr; mass 68 ± 10 kg; height 169 ± 6 cm) and lowlanders at sea level (SL) and following 10 ± 3 days at 5,050 m (n = 9; 34 ± 7 yr; mass 82 ± 10 kg; height 177 ± 6 cm) using conventional and speckle-tracking echocardiography. At HA, PASP was higher in Sherpa and lowlanders compared with lowlanders at SL (both P < 0.05). Sherpa had smaller right-ventricular (RV) and LV stroke volumes than lowlanders at SL with lower RV systolic strain (P < 0.05) but similar LV systolic mechanics. In contrast to LV systolic mechanics, LV diastolic untwisting velocity was significantly lower in Sherpa compared with lowlanders at both SL and HA. After partial acclimatization, lowlanders demonstrated no change in the RV end-diastolic area; however, both RV strain and LV end-diastolic volume were reduced. In conclusion, short-term hypoxia induced a reduction in RV systolic function that was also evident in Sherpa following chronic exposure. We propose that this was consequent to a persistently higher PASP. In contrast to the RV, remodeling of LV volumes and normalization of systolic mechanics indicate structural and functional adaptation to HA. However, altered LV diastolic relaxation after chronic hypoxic exposure may reflect differential remodeling of systolic and diastolic LV function. Copyright © 2014 the American Physiological Society.

  3. Ventricular structure, function, and mechanics at high altitude: chronic remodeling in Sherpa vs. short-term lowlander adaptation

    PubMed Central

    Ainslie, Philip N.; Hughes, Michael G.; Stöhr, Eric J.; Cotter, James D.; Nio, Amanda Q. X.; Shave, Rob

    2014-01-01

    Short-term, high-altitude (HA) exposure raises pulmonary artery systolic pressure (PASP) and decreases left-ventricular (LV) volumes. However, relatively little is known of the long-term cardiac consequences of prolonged exposure in Sherpa, a highly adapted HA population. To investigate short-term adaptation and potential long-term cardiac remodeling, we studied ventricular structure and function in Sherpa at 5,050 m (n = 11; 31 ± 13 yr; mass 68 ± 10 kg; height 169 ± 6 cm) and lowlanders at sea level (SL) and following 10 ± 3 days at 5,050 m (n = 9; 34 ± 7 yr; mass 82 ± 10 kg; height 177 ± 6 cm) using conventional and speckle-tracking echocardiography. At HA, PASP was higher in Sherpa and lowlanders compared with lowlanders at SL (both P < 0.05). Sherpa had smaller right-ventricular (RV) and LV stroke volumes than lowlanders at SL with lower RV systolic strain (P < 0.05) but similar LV systolic mechanics. In contrast to LV systolic mechanics, LV diastolic untwisting velocity was significantly lower in Sherpa compared with lowlanders at both SL and HA. After partial acclimatization, lowlanders demonstrated no change in the RV end-diastolic area; however, both RV strain and LV end-diastolic volume were reduced. In conclusion, short-term hypoxia induced a reduction in RV systolic function that was also evident in Sherpa following chronic exposure. We propose that this was consequent to a persistently higher PASP. In contrast to the RV, remodeling of LV volumes and normalization of systolic mechanics indicate structural and functional adaptation to HA. However, altered LV diastolic relaxation after chronic hypoxic exposure may reflect differential remodeling of systolic and diastolic LV function. PMID:24876358

  4. Network Bursting Dynamics in Excitatory Cortical Neuron Cultures Results from the Combination of Different Adaptive Mechanisms

    PubMed Central

    Masquelier, Timothée; Deco, Gustavo

    2013-01-01

    In the brain, synchronization among cells of an assembly is a common phenomenon, and thought to be functionally relevant. Here we used an in vitro experimental model of cell assemblies, cortical cultures, combined with numerical simulations of a spiking neural network (SNN) to investigate how and why spontaneous synchronization occurs. In order to deal with excitation only, we pharmacologically blocked GABAAergic transmission using bicuculline. Synchronous events in cortical cultures tend to involve almost every cell and to display relatively constant durations. We have thus named these “network spikes” (NS). The inter-NS-intervals (INSIs) proved to be a more interesting phenomenon. In most cortical cultures NSs typically come in series or bursts (“bursts of NSs”, BNS), with short (∼1 s) INSIs and separated by long silent intervals (tens of s), which leads to bimodal INSI distributions. This suggests that a facilitating mechanism is at work, presumably short-term synaptic facilitation, as well as two fatigue mechanisms: one with a short timescale, presumably short-term synaptic depression, and another one with a longer timescale, presumably cellular adaptation. We thus incorporated these three mechanisms into the SNN, which, indeed, produced realistic BNSs. Next, we systematically varied the recurrent excitation for various adaptation timescales. Strong excitability led to frequent, quasi-periodic BNSs (CV∼0), and weak excitability led to rare BNSs, approaching a Poisson process (CV∼1). Experimental cultures appear to operate within an intermediate weakly-synchronized regime (CV∼0.5), with an adaptation timescale in the 2–8 s range, and well described by a Poisson-with-refractory-period model. Taken together, our results demonstrate that the INSI statistics are indeed informative: they allowed us to infer the mechanisms at work, and many parameters that we cannot access experimentally. PMID:24146781

  5. Rapid adaptation of invertebrate pests to climatic stress?

    PubMed

    Hoffmann, Ary A

    2017-06-01

    There is surprisingly little information on adaptive responses of pests and disease vectors to climatic stresses even though the short generation times and large population sizes associated with pests make rapid adaptation likely. Most evidence of adaptive differentiation has been obtained from geographic comparisons and these can directly or indirectly indicate rates of adaptation where historical data on invasions are available. There is very little information on adaptive shifts in pests detected through molecular comparisons even though the genomes of many pests are now available and can help to identify markers underlying adaptation. While the limited evidence available points to frequent rapid adaptation that can affect pest and disease vector control, constraints to adaptation are also evident and a predictive framework around the likelihood and limits of rapid adaptation is required. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. A modified adjoint-based grid adaptation and error correction method for unstructured grid

    NASA Astrophysics Data System (ADS)

    Cui, Pengcheng; Li, Bin; Tang, Jing; Chen, Jiangtao; Deng, Youqi

    2018-05-01

    Grid adaptation is an important strategy to improve the accuracy of output functions (e.g. drag, lift, etc.) in computational fluid dynamics (CFD) analysis and design applications. This paper presents a modified robust grid adaptation and error correction method for reducing simulation errors in integral outputs. The procedure is based on discrete adjoint optimization theory in which the estimated global error of output functions can be directly related to the local residual error. According to this relationship, the local residual error contribution can be used as an indicator in a grid adaptation strategy designed to generate refined grids for accurately estimating the output functions. This grid adaptation and error correction method is applied to subsonic and supersonic simulations around three-dimensional configurations. Numerical results demonstrate that the grid regions to which the output functions are sensitive are detected and refined by the grid adaptation, and the accuracy of the output functions is clearly improved after error correction. The proposed grid adaptation and error correction method is shown to compare very favorably in terms of output accuracy and computational efficiency with traditional feature-based grid adaptation.
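
    As a rough illustration of the adjoint-based indicator this record refers to, the sketch below forms a per-cell adjoint-weighted residual, sums it into an output error estimate for correction, and flags the largest contributors for refinement. The array shapes, refinement fraction, and function name are assumptions for illustration, not the paper's implementation.

    ```python
    import numpy as np

    def adaptation_indicator(residual, adjoint, frac=0.1):
        """Adjoint-weighted residual indicator for output-based grid adaptation.

        residual: (n_cells, n_eq) local residual evaluated on the current grid
        adjoint:  (n_cells, n_eq) discrete adjoint of the output (e.g. drag)
        Returns the per-cell indicator, an output error estimate usable as a
        correction, and the indices of cells flagged for refinement.
        """
        weighted = np.sum(adjoint * residual, axis=1)     # psi_i^T R_i per cell
        correction = weighted.sum()                       # estimated output error
        eta = np.abs(weighted)                            # refinement indicator
        n_flag = max(1, int(frac * eta.size))
        flagged = np.argsort(eta)[-n_flag:]               # most sensitive cells
        return eta, correction, flagged
    ```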

  7. Progress Monitoring with Computer Adaptive Assessments: The Impact of Data Collection Schedule on Growth Estimates

    ERIC Educational Resources Information Center

    Nelson, Peter M.; Van Norman, Ethan R.; Klingbeil, Dave A.; Parker, David C.

    2017-01-01

    Although extensive research exists on the use of curriculum-based measures for progress monitoring, little is known about using computer adaptive tests (CATs) for progress-monitoring purposes. The purpose of this study was to evaluate the impact of the frequency of data collection on individual and group growth estimates using a CAT. Data were…

  8. Using Neural Net Technology To Enhance the Efficiency of a Computer Adaptive Testing Application.

    ERIC Educational Resources Information Center

    Van Nelson, C.; Henriksen, Larry W.

    The potential for computer adaptive testing (CAT) has been well documented. In order to improve the efficiency of this process, it may be possible to utilize a neural network, or more specifically, a back propagation neural network. The paper asserts that in order to accomplish this end, it must be shown that grouping examinees by ability as…

  9. Identifying Students at Risk: An Examination of Computer-Adaptive Measures and Latent Class Growth Analysis

    ERIC Educational Resources Information Center

    Keller-Margulis, Milena; McQuillin, Samuel D.; Castañeda, Juan Javier; Ochs, Sarah; Jones, John H.

    2018-01-01

    Multitiered systems of support depend on screening technology to identify students at risk. The purpose of this study was to examine the use of a computer-adaptive test and latent class growth analysis (LCGA) to identify students at risk in reading with focus on the use of this methodology to characterize student performance in screening.…

  10. Web-based computer adaptive assessment of individual perceptions of job satisfaction for hospital workplace employees

    PubMed Central

    2011-01-01

    Background To develop a web-based computer adaptive testing (CAT) application for efficiently collecting data regarding workers' perceptions of job satisfaction, we examined whether a 37-item Job Content Questionnaire (JCQ-37) could evaluate the job satisfaction of individual employees as a single construct. Methods The JCQ-37 makes data collection via CAT on the internet easy, viable and fast. A Rasch rating scale model was applied to analyze data from 300 randomly selected hospital employees who participated in job-satisfaction surveys in 2008 and 2009 via non-adaptive and computer-adaptive testing, respectively. Results Of the 37 items on the questionnaire, 24 items fit the model fairly well. Person-separation reliability for the 2008 surveys was 0.88. Measures from both years, and item 8 (job satisfaction) for groups, were successfully evaluated through item-by-item analyses using t-tests. Workers aged 26-35 felt that job satisfaction was significantly worse in 2009 than in 2008. Conclusions The web-based CAT developed in the present paper was shown to be more efficient than traditional computer-based or pen-and-paper assessments at collecting data regarding workers' perceptions of job content. PMID:21496311
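
    The record does not give the CAT algorithm itself; the following is a generic sketch of Rasch-based adaptive testing (maximum-information item selection plus a damped, clipped Newton ability update), offered only to make the idea concrete. The function names, item bank, and update scheme are illustrative assumptions, not the authors' Web-CAT code.

    ```python
    import numpy as np

    def rasch_prob(theta, b):
        """Rasch model: probability of endorsing an item of difficulty b at ability theta."""
        return 1.0 / (1.0 + np.exp(-(theta - b)))

    def next_item(theta, difficulties, administered):
        """Pick the unused item with maximum Fisher information p(1-p) at theta."""
        p = rasch_prob(theta, difficulties)
        info = p * (1.0 - p)
        info[administered] = -np.inf
        return int(np.argmax(info))

    def update_theta(theta, responses, difficulties, items, lr=0.5, n_iter=20):
        """Damped Newton maximum-likelihood update of ability from responses so far."""
        for _ in range(n_iter):
            p = rasch_prob(theta, difficulties[items])
            grad = np.sum(responses - p)            # score function
            hess = -np.sum(p * (1.0 - p))           # (negative) observed information
            theta -= lr * grad / hess
            theta = float(np.clip(theta, -4.0, 4.0))  # keep early all-correct patterns finite
        return theta

    rng = np.random.default_rng(0)
    bank = np.linspace(-2, 2, 24)                   # illustrative bank of 24 item difficulties
    theta_hat, administered, responses = 0.0, [], []
    for _ in range(10):                             # a 10-item adaptive test
        j = next_item(theta_hat, bank, administered)
        administered.append(j)
        responses.append(int(rng.random() < rasch_prob(0.7, bank[j])))   # simulated respondent
        theta_hat = update_theta(theta_hat, np.array(responses), bank, administered)
    print("ability estimate:", round(theta_hat, 2))
    ```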

  11. IFCPT S-Duct Grid-Adapted FUN3D Computations for the Third Propulsion Aerodynamics Workshop.

    NASA Technical Reports Server (NTRS)

    Davis, Zach S.; Park, M. A.

    2017-01-01

    Contributions of the unstructured Reynolds-averaged Navier-Stokes code, FUN3D, to the 3rd AIAA Propulsion Aerodynamics Workshop are described for the diffusing IFCPT S-Duct. Using workshop-supplied grids, results for the baseline S-Duct, baseline S-Duct with Aerodynamic Interface Plane (AIP) rake hardware, and baseline S-Duct with flow control devices are compared with experimental data and results computed with output-based, off-body grid adaptation in FUN3D. Due to the absence of influential geometry components, total pressure recovery is overpredicted on the baseline S-Duct and S-Duct with flow control vanes when compared to experimental values. An estimate for the exact value of total pressure recovery is derived for these cases given an infinitely refined mesh. When results from output-based mesh adaptation are compared with those computed on workshop-supplied grids, a considerable improvement in predicting total pressure recovery is observed. By including more representative geometry, output-based mesh adaptation compares very favorably with experimental data in terms of predicting the total pressure recovery cost-function; whereas, results computed using the workshop-supplied grids are underpredicted.

  12. Monitoring California Hardwood Rangeland Resources: An Adaptive Approach

    Treesearch

    Raul Tuazon

    1991-01-01

    This paper describes monitoring hardwood rangelands in California within the context of an adaptive or anticipatory approach. A heuristic process of policy evolution under conditions of complexity and uncertainty is presented. Long-term, short-term and program effectiveness monitoring for hardwood rangelands are discussed relative to the process described. The...

  13. A Short History of the Computer.

    ERIC Educational Resources Information Center

    Leon, George

    1984-01-01

    Briefly traces the development of computers from the abacus, John Napier's logarithms, the first computer/calculator (known as the Difference Engine), the first computer programming via steel punched cards, the electrical analog computer, electronic digital computer, and the transistor to the microchip of today's computers. (MBR)

  14. A Gaussian mixture model based adaptive classifier for fNIRS brain-computer interfaces and its testing via simulation

    NASA Astrophysics Data System (ADS)

    Li, Zheng; Jiang, Yi-han; Duan, Lian; Zhu, Chao-zhe

    2017-08-01

    Objective. Functional near infra-red spectroscopy (fNIRS) is a promising brain imaging technology for brain-computer interfaces (BCI). Future clinical uses of fNIRS will likely require operation over long time spans, during which neural activation patterns may change. However, current decoders for fNIRS signals are not designed to handle changing activation patterns. The objective of this study is to test via simulations a new adaptive decoder for fNIRS signals, the Gaussian mixture model adaptive classifier (GMMAC). Approach. GMMAC can simultaneously classify and track activation pattern changes without the need for ground-truth labels. This adaptive classifier uses computationally efficient variational Bayesian inference to label new data points and update mixture model parameters, using the previous model parameters as priors. We test GMMAC in simulations in which neural activation patterns change over time and compare to static decoders and unsupervised adaptive linear discriminant analysis classifiers. Main results. Our simulation experiments show GMMAC can accurately decode under time-varying activation patterns: shifts of activation region, expansions of activation region, and combined contractions and shifts of activation region. Furthermore, the experiments show the proposed method can track the changing shape of the activation region. Compared to prior work, GMMAC performed significantly better than the other unsupervised adaptive classifiers on a difficult activation pattern change simulation: 99% versus  <54% in two-choice classification accuracy. Significance. We believe GMMAC will be useful for clinical fNIRS-based brain-computer interfaces, including neurofeedback training systems, where operation over long time spans is required.
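
    GMMAC itself is not published as code in this record; as a loose stand-in, the sketch below re-fits a warm-started Bayesian Gaussian mixture (scikit-learn) on successive unlabeled data blocks whose class means drift, so the previous fit serves as the starting point for the next. This only mimics the warm-start idea behind tracking changing activation patterns; it is not the paper's variational update or its fNIRS features.

    ```python
    import numpy as np
    from sklearn.mixture import BayesianGaussianMixture

    rng = np.random.default_rng(0)
    gmm = BayesianGaussianMixture(n_components=2, warm_start=True, max_iter=200)

    for block in range(5):
        drift = 0.3 * block                                    # activation pattern slowly shifts
        X = np.vstack([rng.normal(0.0 + drift, 1.0, (50, 2)),  # synthetic class-A features
                       rng.normal(3.0 + drift, 1.0, (50, 2))]) # synthetic class-B features
        gmm.fit(X)                     # previous parameters act as the initialization
        labels = gmm.predict(X)        # unsupervised class assignments for this block
        print(block, np.bincount(labels))
    ```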

  15. Spatial adaption procedures on unstructured meshes for accurate unsteady aerodynamic flow computation

    NASA Technical Reports Server (NTRS)

    Rausch, Russ D.; Batina, John T.; Yang, Henry T. Y.

    1991-01-01

    Spatial adaption procedures for the accurate and efficient solution of steady and unsteady inviscid flow problems are described. The adaption procedures were developed and implemented within a two-dimensional unstructured-grid upwind-type Euler code. These procedures involve mesh enrichment and mesh coarsening to either add points in a high gradient region or the flow or remove points where they are not needed, respectively, to produce solutions of high spatial accuracy at minimal computational costs. A detailed description is given of the enrichment and coarsening procedures and comparisons with alternative results and experimental data are presented to provide an assessment of the accuracy and efficiency of the capability. Steady and unsteady transonic results, obtained using spatial adaption for the NACA 0012 airfoil, are shown to be of high spatial accuracy, primarily in that the shock waves are very sharply captured. The results were obtained with a computational savings of a factor of approximately fifty-three for a steady case and as much as twenty-five for the unsteady cases.

  16. Low Boom Configuration Analysis with FUN3D Adjoint Simulation Framework

    NASA Technical Reports Server (NTRS)

    Park, Michael A.

    2011-01-01

    Off-body pressure, forces, and moments for the Gulfstream Low Boom Model are computed with a Reynolds Averaged Navier Stokes solver coupled with the Spalart-Allmaras (SA) turbulence model. This is the first application of viscous output-based adaptation to reduce estimated discretization errors in off-body pressure for a wing body configuration. The output adaptation approach is compared to an a priori grid adaptation technique designed to resolve the signature on the centerline by stretching and aligning the grid to the freestream Mach angle. The output-based approach produced good predictions of centerline and off-centerline measurements. Eddy viscosity predicted by the SA turbulence model increased significantly with grid adaptation. Computed lift as a function of drag compares well with wind tunnel measurements for positive lift, but predicted lift, drag, and pitching moment as a function of angle of attack has significant differences from the measured data. The sensitivity of longitudinal forces and moment to grid refinement is much smaller than the differences between the computed and measured data.

  17. Adaptive θ-methods for pricing American options

    NASA Astrophysics Data System (ADS)

    Khaliq, Abdul Q. M.; Voss, David A.; Kazmi, Kamran

    2008-12-01

    We develop adaptive θ-methods for solving the Black-Scholes PDE for American options. By adding a small, continuous term, the Black-Scholes PDE becomes an advection-diffusion-reaction equation on a fixed spatial domain. Standard implementation of θ-methods would require a Newton-type iterative procedure at each time step thereby increasing the computational complexity of the methods. Our linearly implicit approach avoids such complications. We establish a general framework under which θ-methods satisfy a discrete version of the positivity constraint characteristic of American options, and numerically demonstrate the sensitivity of the constraint. The positivity results are established for the single-asset and independent two-asset models. In addition, we have incorporated and analyzed an adaptive time-step control strategy to increase the computational efficiency. Numerical experiments are presented for one- and two-asset American options, using adaptive exponential splitting for two-asset problems. The approach is compared with an iterative solution of the two-asset problem in terms of computational efficiency.
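
    To make the linearly implicit idea concrete, here is a minimal θ-scheme for an American put in which the early-exercise constraint is handled by an explicit penalty term. Grid sizes, boundary treatment, and the penalty strength are illustrative choices, not the paper's formulation.

    ```python
    import numpy as np

    # Minimal linearly implicit theta-scheme for an American put with an explicit
    # penalty term pushing V toward payoff. All numbers are illustrative choices.
    sigma, r, K, T = 0.3, 0.05, 100.0, 1.0
    S_max, N, M, theta = 300.0, 300, 600, 0.5

    S = np.linspace(0.0, S_max, N + 1)
    dS, dt = S[1] - S[0], T / M
    C = 1.0 / dt                      # penalty strength: roughly restores V to payoff in one step
    payoff = np.maximum(K - S, 0.0)
    V = payoff.copy()

    i = np.arange(1, N)
    a = 0.5 * sigma**2 * S[i]**2 / dS**2 - 0.5 * r * S[i] / dS   # sub-diagonal
    b = -sigma**2 * S[i]**2 / dS**2 - r                          # diagonal
    c = 0.5 * sigma**2 * S[i]**2 / dS**2 + 0.5 * r * S[i] / dS   # super-diagonal
    L = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)     # interior operator (dense for brevity)
    A = np.eye(N - 1) - dt * theta * L                           # implicit part
    B = np.eye(N - 1) + dt * (1 - theta) * L                     # explicit part

    for _ in range(M):
        pen = C * np.maximum(payoff[i] - V[i], 0.0)              # small continuous penalty term
        rhs = B @ V[i] + dt * pen
        rhs[0] += dt * a[0] * K                                  # boundary V(0, t) = K
        V[i] = np.linalg.solve(A, rhs)
        V[0], V[-1] = K, 0.0

    print("American put value near S = K:", np.interp(K, S, V))
    ```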

  18. Adaptive independent joint control of manipulators - Theory and experiment

    NASA Technical Reports Server (NTRS)

    Seraji, H.

    1988-01-01

    The author presents a simple decentralized adaptive control scheme for multijoint robot manipulators based on the independent joint control concept. The proposed control scheme for each joint consists of a PID (proportional-integral-derivative) feedback controller and a position-velocity-acceleration feedforward controller, both with adjustable gains. The static and dynamic couplings that exist between the joint motions are compensated by the adaptive independent joint controllers while ensuring trajectory tracking. The proposed scheme is implemented on a MicroVAX II computer for motion control of the first three joints of a PUMA 560 arm. Experimental results are presented to demonstrate that trajectory tracking is achieved despite strongly coupled, highly nonlinear joint dynamics. The results confirm that the proposed decentralized adaptive control of manipulators is feasible, in spite of strong interactions between joint motions. The control scheme presented is computationally very fast and is amenable to parallel processing implementation within a distributed computing architecture, where each joint is controlled independently by a simple algorithm on a dedicated microprocessor.
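
    A schematic of one such independent joint controller is sketched below: PID feedback plus position/velocity/acceleration feedforward, both with adjustable gains. The gain-adaptation rule shown is a simple illustrative gradient-style update, not the adaptation law of the paper; one instance of the class would run per joint on its own processor.

    ```python
    class AdaptiveJointController:
        """One independent joint: PID feedback + position/velocity/acceleration
        feedforward, with gains slowly adjusted from the tracking error.
        The adaptation rule here is a crude gradient-style law for illustration."""

        def __init__(self, dt, gamma=1e-3):
            self.dt, self.gamma = dt, gamma
            self.kp, self.ki, self.kd = 10.0, 1.0, 0.5       # PID feedback gains
            self.fp, self.fv, self.fa = 0.0, 0.0, 0.0        # feedforward gains
            self.e_int, self.e_prev = 0.0, 0.0

        def torque(self, q, q_des, qd_des, qdd_des):
            e = q_des - q                                    # joint position error
            self.e_int += e * self.dt
            de = (e - self.e_prev) / self.dt
            self.e_prev = e
            u = (self.kp * e + self.ki * self.e_int + self.kd * de          # PID feedback
                 + self.fp * q_des + self.fv * qd_des + self.fa * qdd_des)  # feedforward
            # illustrative adaptation: raise authority where error persists
            self.kp += self.gamma * e * e
            self.fv += self.gamma * e * qd_des
            return u
    ```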

  19. An adaptive angle-doppler compensation method for airborne bistatic radar based on PAST

    NASA Astrophysics Data System (ADS)

    Hang, Xu; Jun, Zhao

    2018-05-01

    Adaptive angle-Doppler compensation methods extract the requisite information from the data itself adaptively, thus avoiding the performance degradation caused by inertial system errors. However, such methods require estimation and eigendecomposition of the sample covariance matrix, which has a high computational complexity and limits real-time application. In this paper, an adaptive angle-Doppler compensation method based on projection approximation subspace tracking (PAST) is studied. The method uses cyclic iterative processing to quickly estimate the position of the spectral center of the maximum eigenvector of each range cell, so the computational burden of matrix estimation and eigen-decomposition is avoided; the spectral centers of all range cells are then aligned by two-dimensional compensation. Simulation results show the proposed method can effectively reduce the non-homogeneity of airborne bistatic radar, and its performance is similar to that of eigen-decomposition algorithms, but the computational load is markedly reduced and the method is easy to implement.
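
    For reference, the core PAST recursion (Yang's projection approximation subspace tracking) that avoids forming and eigendecomposing a covariance matrix looks roughly as follows; this is the textbook rank-r update only, not the authors' full angle-Doppler compensation chain.

    ```python
    import numpy as np

    def past_update(W, P, x, beta=0.97):
        """One PAST step: track the r-dimensional dominant subspace of snapshots x.

        W: (n, r) current subspace basis estimate
        P: (r, r) inverse correlation matrix of the compressed data
        x: (n,)   new snapshot (real or complex)
        """
        y = W.conj().T @ x
        h = P @ y
        g = h / (beta + np.vdot(y, h))          # RLS-style gain
        P = (P - np.outer(g, h.conj())) / beta
        e = x - W @ y                           # projection approximation error
        W = W + np.outer(e, g.conj())
        return W, P
    ```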

  20. Spatial adaption procedures on unstructured meshes for accurate unsteady aerodynamic flow computation

    NASA Technical Reports Server (NTRS)

    Rausch, Russ D.; Yang, Henry T. Y.; Batina, John T.

    1991-01-01

    Spatial adaption procedures for the accurate and efficient solution of steady and unsteady inviscid flow problems are described. The adaption procedures were developed and implemented within a two-dimensional unstructured-grid upwind-type Euler code. These procedures involve mesh enrichment and mesh coarsening to either add points in high gradient regions of the flow or remove points where they are not needed, respectively, to produce solutions of high spatial accuracy at minimal computational cost. The paper gives a detailed description of the enrichment and coarsening procedures and presents comparisons with alternative results and experimental data to provide an assessment of the accuracy and efficiency of the capability. Steady and unsteady transonic results, obtained using spatial adaption for the NACA 0012 airfoil, are shown to be of high spatial accuracy, primarily in that the shock waves are very sharply captured. The results were obtained with a computational savings of a factor of approximately fifty-three for a steady case and as much as twenty-five for the unsteady cases.

  1. Mechanisms underlying interlimb transfer of visuomotor rotations

    PubMed Central

    Wang, Jinsung; Sainburg, Robert L.

    2013-01-01

    We previously reported that opposite arm training improved the initial direction of dominant arm movements, whereas it only improved the final position accuracy of non-dominant arm movements. We now ask whether each controller accesses common, or separate, short-term memory resources. To address this question, we investigated interlimb transfer of learning for visuomotor rotations that were directed oppositely [clockwise (CW)/counterclockwise (CCW)] for the two arms. We expected that if information obtained by initial training was stored in the same short-term memory space for both arms, opposite arm training of a CW rotation would interfere with subsequent adaptation to a CCW rotation. All subjects first adapted to a 30° rotation (CW) in the visual display during reaching movements. Following this, they adapted to a 30° rotation in the opposite direction (CCW) with the other arm. In contrast to our previous findings for interlimb transfer of same direction rotations (CCW/CCW), no effects of opposite arm adaptation were indicated in the initial trials performed. This indicates that interlimb transfer is not obligatory, and suggests that short-term memory resources for the two limbs are independent. Through single trial analysis, we found that the direction and final position errors of the first trial of movement, following opposite arm training, were always the same as those of naive performance. This was true whether the opposite arm was trained with the same or the opposing rotation. When trained with the same rotation, transfer of learning did not occur until the second trial. These findings suggest that the selective use of opposite arm information is dependent on the first trial to probe current movement conditions. Interestingly, the final extent of adaptation appeared to be reduced by opposite arm training of opposing rotations. Thus, the extent of adaptation, but not initial information transfer, appears obligatorily affected by prior opposite arm adaptation. According to our findings, it is plausible that the initiation and the final extent of adaptation involve two independent neural processes. Theoretical implications of these findings are discussed. PMID:12677333

  2. Unstructured and adaptive mesh generation for high Reynolds number viscous flows

    NASA Technical Reports Server (NTRS)

    Mavriplis, Dimitri J.

    1991-01-01

    A method for generating and adaptively refining a highly stretched unstructured mesh suitable for the computation of high-Reynolds-number viscous flows about arbitrary two-dimensional geometries was developed. The method is based on the Delaunay triangulation of a predetermined set of points and employs a local mapping in order to achieve the high stretching rates required in the boundary-layer and wake regions. The initial mesh-point distribution is determined in a geometry-adaptive manner which clusters points in regions of high curvature and sharp corners. Adaptive mesh refinement is achieved by adding new points in regions of large flow gradients, and locally retriangulating; thus, obviating the need for global mesh regeneration. Initial and adapted meshes about complex multi-element airfoil geometries are shown and compressible flow solutions are computed on these meshes.
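
    A toy version of the point-insertion step can be written with SciPy's incremental Delaunay triangulation, as below; the local stretching map, geometry-adaptive initial distribution, and real flow-gradient sensor of the paper are replaced by placeholders.

    ```python
    import numpy as np
    from scipy.spatial import Delaunay

    rng = np.random.default_rng(1)
    points = rng.random((200, 2))                        # placeholder initial point distribution
    tri = Delaunay(points, incremental=True)             # Qhull in incremental mode

    # Stand-in "flow gradient": a sharp feature near x = 0.5, for illustration only.
    def gradient_magnitude(p):
        return np.exp(-50.0 * (p[:, 0] - 0.5) ** 2)

    centroids = tri.points[tri.simplices].mean(axis=1)
    new_pts = centroids[gradient_magnitude(centroids) > 0.5]
    if len(new_pts):
        tri.add_points(new_pts)                          # insert points without regenerating the mesh
    tri.close()
    print(len(tri.simplices), "triangles after refinement")
    ```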

  3. Adaptive regularization network based neural modeling paradigm for nonlinear adaptive estimation of cerebral evoked potentials.

    PubMed

    Zhang, Jian-Hua; Böhme, Johann F

    2007-11-01

    In this paper we report an adaptive regularization network (ARN) approach to realizing fast blind separation of cerebral evoked potentials (EPs) from background electroencephalogram (EEG) activity with no need to make any explicit assumption on the statistical (or deterministic) signal model. The ARNs are proposed to construct nonlinear EEG and EP signal models. A novel adaptive regularization training (ART) algorithm is proposed to improve the generalization performance of the ARN. Two adaptive neural modeling methods based on the ARN are developed and their implementation and performance analysis are also presented. The computer experiments using simulated and measured visual evoked potential (VEP) data have shown that the proposed ARN modeling paradigm yields computationally efficient and more accurate VEP signal estimation owing to its intrinsic model-free and nonlinear processing characteristics.

  4. SAGE: The Self-Adaptive Grid Code. 3

    NASA Technical Reports Server (NTRS)

    Davies, Carol B.; Venkatapathy, Ethiraj

    1999-01-01

    The multi-dimensional self-adaptive grid code, SAGE, is an important tool in the field of computational fluid dynamics (CFD). It provides an efficient method to improve the accuracy of flow solutions while simultaneously reducing computer processing time. Briefly, SAGE enhances an initial computational grid by redistributing the mesh points into more appropriate locations. The movement of these points is driven by an equal-error-distribution algorithm that utilizes the relationship between high flow gradients and excessive solution errors. The method also provides a balance between clustering points in the high gradient regions and maintaining the smoothness and continuity of the adapted grid. The latest version, Version 3, includes the ability to change the boundaries of a given grid to more efficiently enclose flow structures and provides alternative redistribution algorithms.

  5. A Stochastic Total Least Squares Solution of Adaptive Filtering Problem

    PubMed Central

    Ahmad, Noor Atinah

    2014-01-01

    An efficient and computationally linear algorithm is derived for total least squares solution of adaptive filtering problem, when both input and output signals are contaminated by noise. The proposed total least mean squares (TLMS) algorithm is designed by recursively computing an optimal solution of adaptive TLS problem by minimizing instantaneous value of weighted cost function. Convergence analysis of the algorithm is given to show the global convergence of the proposed algorithm, provided that the stepsize parameter is appropriately chosen. The TLMS algorithm is computationally simpler than the other TLS algorithms and demonstrates a better performance as compared with the least mean square (LMS) and normalized least mean square (NLMS) algorithms. It provides minimum mean square deviation by exhibiting better convergence in misalignment for unknown system identification under noisy inputs. PMID:24688412
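
    The TLMS recursion itself is not reproduced here, but the normalized LMS baseline it is compared against is standard and short; the sketch below shows that baseline for context (filter order and step size are arbitrary illustrative values).

    ```python
    import numpy as np

    def nlms(x, d, order=8, mu=0.5, eps=1e-6):
        """Normalized LMS adaptive filter (one of the baselines the TLMS is compared to).

        x: input signal, d: desired signal; returns final weights and the error signal.
        """
        w = np.zeros(order)
        e = np.zeros(len(x))
        for n in range(order, len(x)):
            u = x[n - order:n][::-1]                 # most recent samples first
            y = w @ u                                # filter output
            e[n] = d[n] - y
            w += mu * e[n] * u / (eps + u @ u)       # normalized gradient step
        return w, e
    ```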

  6. Computer image generation: Reconfigurability as a strategy in high fidelity space applications

    NASA Technical Reports Server (NTRS)

    Bartholomew, Michael J.

    1989-01-01

    The demand for realistic, high fidelity, computer image generation systems to support space simulation is well established. However, as the number and diversity of space applications increase, the complexity and cost of computer image generation systems also increase. One strategy used to harmonize cost with varied requirements is establishment of a reconfigurable image generation system that can be adapted rapidly and easily to meet new and changing requirements. The reconfigurability strategy through the life cycle of system conception, specification, design, implementation, operation, and support for high fidelity computer image generation systems is discussed. The discussion is limited to those issues directly associated with reconfigurability and adaptability of a specialized scene generation system in a multi-faceted space applications environment. Examples and insights gained through the recent development and installation of the Improved Multi-function Scene Generation System at Johnson Space Center, Systems Engineering Simulator are reviewed and compared with current simulator industry practices. The results are clear; the strategy of reconfigurability applied to space simulation requirements provides a viable path to supporting diverse applications with an adaptable computer image generation system.

  7. Parallel Adaptive Mesh Refinement for High-Order Finite-Volume Schemes in Computational Fluid Dynamics

    NASA Astrophysics Data System (ADS)

    Schwing, Alan Michael

    For computational fluid dynamics, the governing equations are solved on a discretized domain of nodes, faces, and cells. The quality of the grid or mesh can be a driving source for error in the results. While refinement studies can help guide the creation of a mesh, grid quality is largely determined by user expertise and understanding of the flow physics. Adaptive mesh refinement is a technique for enriching the mesh during a simulation based on metrics for error, impact on important parameters, or location of important flow features. This can offload from the user some of the difficult and ambiguous decisions necessary when discretizing the domain. This work explores the implementation of adaptive mesh refinement in an implicit, unstructured, finite-volume solver. Consideration is made for applying modern computational techniques in the presence of hanging nodes and refined cells. The approach is developed to be independent of the flow solver in order to provide a path for augmenting existing codes. It is designed to be applicable for unsteady simulations and refinement and coarsening of the grid does not impact the conservatism of the underlying numerics. The effect on high-order numerical fluxes of fourth- and sixth-order are explored. Provided the criteria for refinement is appropriately selected, solutions obtained using adapted meshes have no additional error when compared to results obtained on traditional, unadapted meshes. In order to leverage large-scale computational resources common today, the methods are parallelized using MPI. Parallel performance is considered for several test problems in order to assess scalability of both adapted and unadapted grids. Dynamic repartitioning of the mesh during refinement is crucial for load balancing an evolving grid. Development of the methods outlined here depend on a dual-memory approach that is described in detail. Validation of the solver developed here against a number of motivating problems shows favorable comparisons across a range of regimes. Unsteady and steady applications are considered in both subsonic and supersonic flows. Inviscid and viscous simulations achieve similar results at a much reduced cost when employing dynamic mesh adaptation. Several techniques for guiding adaptation are compared. Detailed analysis of statistics from the instrumented solver enable understanding of the costs associated with adaptation. Adaptive mesh refinement shows promise for the test cases presented here. It can be considerably faster than using conventional grids and provides accurate results. The procedures for adapting the grid are light-weight enough to not require significant computational time and yield significant reductions in grid size.

  8. Computer aided segmentation of kidneys using locally shape constrained deformable models on CT images

    NASA Astrophysics Data System (ADS)

    Erdt, Marius; Sakas, Georgios

    2010-03-01

    This work presents a novel approach for model based segmentation of the kidney in images acquired by Computed Tomography (CT). The developed computer aided segmentation system is expected to support computer aided diagnosis and operation planning. We have developed a deformable model based approach based on local shape constraints that prevents the model from deforming into neighboring structures while allowing the global shape to adapt freely to the data. Those local constraints are derived from the anatomical structure of the kidney and the presence and appearance of neighboring organs. The adaptation process is guided by a rule-based deformation logic in order to improve the robustness of the segmentation in areas of diffuse organ boundaries. Our work flow consists of two steps: 1.) a user guided positioning and 2.) an automatic model adaptation using affine and free form deformation in order to robustly extract the kidney. In cases which show pronounced pathologies, the system also offers real time mesh editing tools for a quick refinement of the segmentation result. Evaluation results based on 30 clinical cases using CT data sets show an average dice correlation coefficient of 93% compared to the ground truth. The results are therefore in most cases comparable to manual delineation. Computation times of the automatic adaptation step are lower than 6 seconds which makes the proposed system suitable for an application in clinical practice.

  9. Ontogeny of the daily profile of plasma melatonin in European starlings raised under long or short photoperiods.

    PubMed

    Dawson, Alistair; Van't Hof, Thomas J

    2002-06-01

    Photoperiodic manipulation of young European starlings suggests that their reproductive physiology is incapable of responding to a short photoperiod until they are fully grown. This study aimed to determine whether the lack of response to a short photoperiod is reflected in the daily profile of plasma melatonin concentrations. Five-day-old starlings taken from nest boxes showed a significant (p < 0.0001) rhythm in plasma melatonin concentrations, with high values during night. In nestlings hand-reared from 5 days of age on a long photoperiod (LD 16:8), equivalent to natural photoperiod at the time, the amplitude of the daily rhythm in melatonin increased significantly (p < 0.01) with age until birds were fully grown (20 days old). In nestlings reared on a short photoperiod (LD 8:16), the daily melatonin profile remained almost identical to that of long photoperiod birds until they were fully grown. However, after 20 days old, the duration of elevated nighttime melatonin began to extend to encompass the entire period of darkness. In contrast, fully grown starlings transferred from a long to a short photoperiod had partially adapted to the short photoperiod after 5 days; by 10 days, the daily melatonin profile was identical to that of birds held chronically on a short photoperiod. Thus, consistent with responses of reproductive physiology, the pineal of young birds appears to be incapable of perceiving, or adapting to, a short photoperiod.

  10. An Evaluation of Short-Term Distributed Online Learning Events

    ERIC Educational Resources Information Center

    Barker, Bradley; Brooks, David

    2005-01-01

    The purpose of this study was to evaluate the effectiveness of short-term distributed online training events using an adapted version of the compressed evaluation form developed by Wisher and Curnow (1998). Evaluating online distributed training events provides insight into course effectiveness, the contribution of prior knowledge to learning, and…

  11. A Bayesian framework for adaptive selection, calibration, and validation of coarse-grained models of atomistic systems

    NASA Astrophysics Data System (ADS)

    Farrell, Kathryn; Oden, J. Tinsley; Faghihi, Danial

    2015-08-01

    A general adaptive modeling algorithm for selection and validation of coarse-grained models of atomistic systems is presented. A Bayesian framework is developed to address uncertainties in parameters, data, and model selection. Algorithms for computing output sensitivities to parameter variances, model evidence and posterior model plausibilities for given data, and for computing what are referred to as Occam Categories in reference to a rough measure of model simplicity, make up components of the overall approach. Computational results are provided for representative applications.
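
    As a toy illustration of turning model evidences into posterior plausibilities (one ingredient of the framework described above), the sketch below integrates a Gaussian likelihood over a single parameter for two hypothetical one-parameter models; the models, prior, and data are invented for illustration and are unrelated to the paper's coarse-grained systems.

    ```python
    import numpy as np

    def log_likelihood(theta, data, model):
        """Gaussian (unit-variance) likelihood of data under a one-parameter model."""
        pred = model(theta)
        return -0.5 * np.sum((data - pred) ** 2) - 0.5 * len(data) * np.log(2 * np.pi)

    def evidence(data, model, theta_grid, prior):
        """Marginal likelihood approximated by a simple Riemann sum over the grid."""
        like = np.exp([log_likelihood(t, data, model) for t in theta_grid])
        return np.sum(like * prior(theta_grid)) * (theta_grid[1] - theta_grid[0])

    data = 2.0 + 0.1 * np.random.default_rng(2).normal(size=20)   # synthetic observations
    models = {"fitted constant": lambda t: np.full(20, t),        # hypothetical model 1
              "fixed zero": lambda t: np.zeros(20)}               # hypothetical model 2
    grid = np.linspace(-5.0, 5.0, 401)
    flat_prior = lambda t: np.full_like(t, 1.0 / 10.0)            # uniform prior on [-5, 5]

    ev = {name: evidence(data, m, grid, flat_prior) for name, m in models.items()}
    total = sum(ev.values())
    plausibility = {name: e / total for name, e in ev.items()}    # equal prior model probabilities
    print(plausibility)
    ```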

  12. ICCE/ICCAI 2000 Full & Short Papers (Evaluation of Learning and Systems).

    ERIC Educational Resources Information Center

    2000

    This document contains the full and short papers on evaluation of learning and systems from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction) covering the following topics: a new method for efficient study of Kanji using mnemonics and software; a study on the relation…

  13. Multigrid methods for flow transition in three-dimensional boundary layers with surface roughness

    NASA Technical Reports Server (NTRS)

    Liu, Chaoqun; Liu, Zhining; Mccormick, Steve

    1993-01-01

    The efficient multilevel adaptive method has been successfully applied to perform direct numerical simulations (DNS) of flow transition in 3-D channels and 3-D boundary layers with 2-D and 3-D isolated and distributed roughness in a curvilinear coordinate system. A fourth-order finite difference technique on stretched and staggered grids, a fully-implicit time marching scheme, a semi-coarsening multigrid method associated with line distributive relaxation scheme, and an improved outflow boundary-condition treatment, which needs only a very short buffer domain to damp all order-one wave reflections, are developed. These approaches make the multigrid DNS code very accurate and efficient. This allows us not only to be able to do spatial DNS for the 3-D channel and flat plate at low computational costs, but also to do spatial DNS for transition in the 3-D boundary layer with 3-D single and multiple roughness elements, which would have extremely high computational costs with conventional methods. Numerical results show good agreement with the linear stability theory, the secondary instability theory, and a number of laboratory experiments. The contribution of isolated and distributed roughness to transition is analyzed.

  14. Dispersion interactions between neighboring Bi atoms in (BiH3)2 and Te(BiR2)2.

    PubMed

    Haack, Rebekka; Schulz, Stephan; Jansen, Georg

    2018-03-13

    Triggered by the observation of a short Bi⋯Bi distance and a BiTeBi bond angle of only 86.6° in the crystal structure of bis(diethylbismuthanyl)tellurane, quantum chemical computations on interactions between neighboring Bi atoms in Te(BiR2)2 molecules (R = H, Me, Et) and in (BiH3)2 were undertaken. Bi⋯Bi distances were found to shorten significantly upon inclusion of the d shells of the heavy metal atoms into the electron correlation treatment, and it was confirmed that interaction energies from spin component-scaled second-order Møller-Plesset theory (SCS-MP2) agree well with coupled-cluster singles and doubles theory including perturbative triples (CCSD(T)). Density functional theory-based symmetry-adapted perturbation theory (DFT-SAPT) was used to study the anisotropy of the interplay of dispersion attraction and steric repulsion between the Bi atoms. Finally, geometries and relative stabilities of syn-syn and syn-anti conformers of Te(BiR2)2 (R = H, Me, Et) and interconversion barriers between them were computed. © 2018 Wiley Periodicals, Inc.

  15. Modeling disease transmission near eradication: An equation free approach

    NASA Astrophysics Data System (ADS)

    Williams, Matthew O.; Proctor, Joshua L.; Kutz, J. Nathan

    2015-01-01

    Although disease transmission in the near eradication regime is inherently stochastic, deterministic quantities such as the probability of eradication are of interest to policy makers and researchers. Rather than running large ensembles of discrete stochastic simulations over long intervals in time to compute these deterministic quantities, we create a data-driven and deterministic "coarse" model for them using the Equation Free (EF) framework. In lieu of deriving an explicit coarse model, the EF framework approximates any needed information, such as coarse time derivatives, by running short computational experiments. However, the choice of the coarse variables (i.e., the state of the coarse system) is critical if the resulting model is to be accurate. In this manuscript, we propose a set of coarse variables that result in an accurate model in the endemic and near eradication regimes, and demonstrate this on a compartmental model representing the spread of Poliomyelitis. When combined with adaptive time-stepping coarse projective integrators, this approach can yield over a factor of two speedup compared to direct simulation, and due to its lower dimensionality, could be beneficial when conducting systems level tasks such as designing eradication or monitoring campaigns.
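
    A generic coarse projective integration loop in the spirit of the EF framework is sketched below: short fine-scale bursts estimate the coarse time derivative, which is then extrapolated over a large projective step. The fine-scale model here is a stand-in logistic equation with identity lift/restrict maps, not the stochastic polio model of the paper, and all step sizes are illustrative.

    ```python
    import numpy as np

    def coarse_projective_integration(fine_step, restrict, lift, u0,
                                      burst_steps=20, dt_fine=0.01, dt_proj=0.5, n_proj=40):
        """Equation-free loop: run a short fine-scale burst, estimate the coarse
        time derivative from its endpoints, then take a large projective step."""
        U = [np.asarray(u0, dtype=float)]
        for _ in range(n_proj):
            state = lift(U[-1])                       # consistent fine-scale state
            u_start = restrict(state)
            for _ in range(burst_steps):              # short computational experiment
                state = fine_step(state, dt_fine)
            u_end = restrict(state)
            dudt = (u_end - u_start) / (burst_steps * dt_fine)
            U.append(u_end + dt_proj * dudt)          # projective (extrapolation) step
        return np.array(U)

    # Stand-in fine model: logistic growth integrated with small explicit steps.
    fine = lambda x, dt: x + dt * x * (1.0 - x)
    traj = coarse_projective_integration(fine, restrict=lambda s: s, lift=lambda u: u, u0=0.05)
    print(traj[-1])
    ```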

  16. Transmission Line Ampacity Improvements of AltaLink Wind Plant Overhead Tie-Lines Using Weather-Based Dynamic Line Rating

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhattarai, Bishnu P.; Gentle, Jake P.; Hill, Porter

    Overhead transmission lines (TLs) are conventionally given seasonal ratings based on conservative environmental assumptions. Such an approach often results in underutilization of the line ampacity, as the worst conditions prevail only for a short period over a year/season. We present dynamic line rating (DLR) as an enabling smart grid technology that adaptively computes ratings of TLs based on local weather conditions to utilize additional headroom of existing lines. In particular, a general line ampacity state solver utilizes measured weather data to compute the real-time thermal rating of the TLs. The performance of the presented method is demonstrated through a field study of DLR technology implementation on four TL segments at AltaLink, Canada. The performance is evaluated and quantified by comparing the existing static and proposed dynamic line ratings, and by assessing the potential benefits of DLR for enhanced transmission asset utilization. For the given line segments, the proposed DLR results in real-time ratings above the seasonal static ratings most of the time (up to 95.1% of the time), with a mean increase of 72% over the static rating.
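
    At its core, a weather-based rating comes from a steady-state conductor heat balance; the sketch below solves that balance for current in a grossly simplified form (in the spirit of IEEE Std 738, with all heat terms supplied as numbers rather than computed from weather) and is not the solver used in the study. The example values are invented only to show that stronger convective cooling raises the real-time rating.

    ```python
    import numpy as np

    def dynamic_rating(q_convective, q_radiated, q_solar, r_ac):
        """Steady-state heat balance I^2 * R = q_c + q_r - q_s solved for current.

        Heat terms are per unit length (W/m) and would normally be derived from a
        weather model (wind speed/direction, ambient temperature, solar radiation);
        r_ac is the conductor AC resistance (ohm/m) at the maximum allowed temperature."""
        return np.sqrt(np.maximum(q_convective + q_radiated - q_solar, 0.0) / r_ac)

    calm = dynamic_rating(q_convective=30.0, q_radiated=15.0, q_solar=12.0, r_ac=9e-5)
    windy = dynamic_rating(q_convective=80.0, q_radiated=15.0, q_solar=12.0, r_ac=9e-5)
    print(f"calm: {calm:.0f} A, windy: {windy:.0f} A")
    ```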

  17. The MaxQuant computational platform for mass spectrometry-based shotgun proteomics.

    PubMed

    Tyanova, Stefka; Temu, Tikira; Cox, Juergen

    2016-12-01

    MaxQuant is one of the most frequently used platforms for mass-spectrometry (MS)-based proteomics data analysis. Since its first release in 2008, it has grown substantially in functionality and can be used in conjunction with more MS platforms. Here we present an updated protocol covering the most important basic computational workflows, including those designed for quantitative label-free proteomics, MS1-level labeling and isobaric labeling techniques. This protocol presents a complete description of the parameters used in MaxQuant, as well as of the configuration options of its integrated search engine, Andromeda. This protocol update describes an adaptation of an existing protocol that substantially modifies the technique. Important concepts of shotgun proteomics and their implementation in MaxQuant are briefly reviewed, including different quantification strategies and the control of false-discovery rates (FDRs), as well as the analysis of post-translational modifications (PTMs). The MaxQuant output tables, which contain information about quantification of proteins and PTMs, are explained in detail. Furthermore, we provide a short version of the workflow that is applicable to data sets with simple and standard experimental designs. The MaxQuant algorithms are efficiently parallelized on multiple processors and scale well from desktop computers to servers with many cores. The software is written in C# and is freely available at http://www.maxquant.org.

  18. Time-frequency analysis of band-limited EEG with BMFLC and Kalman filter for BCI applications

    PubMed Central

    2013-01-01

    Background Time-Frequency analysis of electroencephalogram (EEG) during different mental tasks received significant attention. As EEG is non-stationary, time-frequency analysis is essential to analyze brain states during different mental tasks. Further, the time-frequency information of EEG signal can be used as a feature for classification in brain-computer interface (BCI) applications. Methods To accurately model the EEG, band-limited multiple Fourier linear combiner (BMFLC), a linear combination of truncated multiple Fourier series models is employed. A state-space model for BMFLC in combination with Kalman filter/smoother is developed to obtain accurate adaptive estimation. By virtue of construction, BMFLC with Kalman filter/smoother provides accurate time-frequency decomposition of the bandlimited signal. Results The proposed method is computationally fast and is suitable for real-time BCI applications. To evaluate the proposed algorithm, a comparison with short-time Fourier transform (STFT) and continuous wavelet transform (CWT) for both synthesized and real EEG data is performed in this paper. The proposed method is applied to BCI Competition data IV for ERD detection in comparison with existing methods. Conclusions Results show that the proposed algorithm can provide optimal time-frequency resolution as compared to STFT and CWT. For ERD detection, BMFLC-KF outperforms STFT and BMFLC-KS in real-time applicability with low computational requirement. PMID:24274109
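
    A compact sketch of the BMFLC-with-Kalman-filter idea is given below: the state holds sine/cosine weights on a fixed frequency grid inside the band, a random-walk model lets them drift, and a scalar Kalman update tracks the time-varying per-frequency amplitudes. The band, grid spacing, and noise variances are illustrative choices, not those of the paper.

    ```python
    import numpy as np

    def bmflc_kalman(y, fs, f_lo=8.0, f_hi=12.0, df=0.5, q=1e-4, r=1e-2):
        """Band-limited multiple Fourier linear combiner tracked with a Kalman filter.

        y: sampled EEG-like signal, fs: sampling rate (Hz). Returns the frequency
        grid and the per-sample amplitude estimate at each grid frequency."""
        freqs = np.arange(f_lo, f_hi + df, df)
        n = 2 * len(freqs)
        x = np.zeros(n)                               # sine/cosine weights (state)
        P = np.eye(n)
        Q, R = q * np.eye(n), r
        amps = np.zeros((len(y), len(freqs)))
        for k, yk in enumerate(y):
            t = k / fs
            H = np.concatenate([np.sin(2 * np.pi * freqs * t),
                                np.cos(2 * np.pi * freqs * t)])   # observation row
            P = P + Q                                 # predict (random-walk weights)
            S = H @ P @ H + R                         # innovation variance (scalar)
            K = P @ H / S                             # Kalman gain
            x = x + K * (yk - H @ x)                  # update weights
            P = P - np.outer(K, H) @ P
            amps[k] = np.hypot(x[:len(freqs)], x[len(freqs):])    # per-frequency amplitude
        return freqs, amps
    ```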

  19. Water System Adaptation To Hydrological Changes: Module 15, Course Summary and Project Presentation

    EPA Science Inventory

    This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...

  20. Adaptive Statistical Iterative Reconstruction-V Versus Adaptive Statistical Iterative Reconstruction: Impact on Dose Reduction and Image Quality in Body Computed Tomography.

    PubMed

    Gatti, Marco; Marchisio, Filippo; Fronda, Marco; Rampado, Osvaldo; Faletti, Riccardo; Bergamasco, Laura; Ropolo, Roberto; Fonio, Paolo

    The aim of this study was to evaluate the impact on dose reduction and image quality of the new iterative reconstruction technique adaptive statistical iterative reconstruction-V (ASIR-V). Fifty consecutive oncologic patients served as their own controls, undergoing computed tomography scans with both ASIR and ASIR-V during their follow-up. Each study was analyzed in a double-blinded fashion by 2 radiologists. Both quantitative and qualitative analyses of image quality were conducted. Computed tomography scanner radiation output was 38% (29%-45%) lower (P < 0.0001) for the ASIR-V examinations than for the ASIR ones. The quantitative image noise was significantly lower (P < 0.0001) for ASIR-V. Adaptive statistical iterative reconstruction-V had a higher performance for subjective image noise (P = 0.01 for 5 mm and P = 0.009 for 1.25 mm), with the other parameters (image sharpness, diagnostic acceptability, and overall image quality) being similar (P > 0.05). Adaptive statistical iterative reconstruction-V is a new iterative reconstruction technique that has the potential to provide image quality equal to or greater than ASIR, with a dose reduction of around 40%.

  1. Three-dimensional computational fluid dynamics modeling of particle uptake by an occupational air sampler using manually-scaled and adaptive grids

    PubMed Central

    Landázuri, Andrea C.; Sáez, A. Eduardo; Anthony, T. Renée

    2016-01-01

    This work presents fluid flow and particle trajectory simulation studies to determine the aspiration efficiency of a horizontally oriented occupational air sampler using computational fluid dynamics (CFD). Grid adaption and manual scaling of the grids were applied to two sampler prototypes based on a 37-mm cassette. The standard k–ε model was used to simulate the turbulent air flow, and a second-order streamline-upwind discretization scheme was used to stabilize the convective terms of the Navier–Stokes equations. Successively scaled grids for each configuration were created manually and by means of grid adaption using the velocity gradient in the main flow direction. Solutions were verified to assess iterative convergence, grid independence, and monotonic convergence. Particle aspiration efficiencies determined for both prototype samplers were indistinguishable, indicating that the porous filter does not play a noticeable role in particle aspiration. The results show that grid adaption is a powerful tool for refining the specific regions that require finer resolution, thereby better resolving the flow detail. It was verified that adaptive grids provided a larger number of locations with monotonic convergence than the manual grids and required the least computational effort. PMID:26949268

  2. Techniques for grid manipulation and adaptation. [computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Choo, Yung K.; Eisemann, Peter R.; Lee, Ki D.

    1992-01-01

    Two approaches have been taken to provide systematic grid manipulation for improved grid quality. One is the control point form (CPF) of algebraic grid generation. It provides explicit control of the physical grid shape and grid spacing through the movement of the control points. It works well in the interactive computer graphics environment and hence can be a good candidate for integration with other emerging technologies. The other approach is grid adaptation using a numerical mapping between the physical space and a parametric space. Grid adaptation is achieved by modifying the mapping functions through the effects of grid control sources. The adaptation process can be repeated in a cyclic manner if satisfactory results are not achieved after a single application.

  3. Accuracy requirements of optical linear algebra processors in adaptive optics imaging systems

    NASA Technical Reports Server (NTRS)

    Downie, John D.

    1990-01-01

    A ground-based adaptive optics imaging telescope system attempts to improve image quality by detecting and correcting for atmospherically induced wavefront aberrations. The required control computations during each cycle will take a finite amount of time. Longer time delays result in larger values of residual wavefront error variance since the atmosphere continues to change during that time. Thus an optical processor may be well-suited for this task. This paper presents a study of the accuracy requirements in a general optical processor that will make it competitive with, or superior to, a conventional digital computer for the adaptive optics application. An optimization of the adaptive optics correction algorithm with respect to an optical processor's degree of accuracy is also briefly discussed.

  4. How Big Is Big Enough? Sample Size Requirements for CAST Item Parameter Estimation

    ERIC Educational Resources Information Center

    Chuah, Siang Chee; Drasgow, Fritz; Luecht, Richard

    2006-01-01

    Adaptive tests offer the advantages of reduced test length and increased accuracy in ability estimation. However, adaptive tests require large pools of precalibrated items. This study looks at the development of an item pool for 1 type of adaptive administration: the computer-adaptive sequential test. An important issue is the sample size required…

  5. Small target pre-detection with an attention mechanism

    NASA Astrophysics Data System (ADS)

    Wang, Yuehuan; Zhang, Tianxu; Wang, Guoyou

    2002-04-01

    We introduce the concept of predetection based on an attention mechanism to improve the efficiency of small-target detection by limiting the image region of detection. According to the characteristics of small-target detection, local contrast is taken as the only feature in predetection, and a nonlinear sampling model is adopted to make the predetection adaptive to small targets with different area sizes. To simplify the predetection itself and decrease the false alarm probability, neighboring nodes in the sampling grid are used to generate a saliency map, and a short-term memory is adopted to accelerate the 'pop-out' of targets. We show that the proposed approach has low computational complexity. In addition, even in a cluttered background, attention can be led to targets in a satisfyingly small number of iterations, which ensures that the detection efficiency will not be decreased by false alarms. Experimental results are presented to demonstrate the applicability of the approach.
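    The predetection step boils down to a cheap centre-surround contrast map followed by a threshold that passes only a few candidate locations to the detector. A minimal sketch of that idea is given below; the window sizes, the local-maximum test, and the keep fraction are illustrative assumptions, and the paper's nonlinear sampling model and short-term memory are not reproduced.

```python
import numpy as np
from scipy.ndimage import uniform_filter, maximum_filter

# Contrast-based pre-detection: build a centre-surround local contrast map and keep
# only the most salient locations as candidate regions for the (more expensive)
# small-target detector.

def saliency_map(image, center=3, surround=15):
    img = image.astype(float)
    center_mean = uniform_filter(img, size=center)       # local mean in a small window
    surround_mean = uniform_filter(img, size=surround)    # local mean in a larger window
    return np.abs(center_mean - surround_mean)            # centre-surround contrast

def candidate_mask(image, keep_fraction=0.02):
    s = saliency_map(image)
    thresh = np.quantile(s, 1.0 - keep_fraction)           # keep the top few percent
    peaks = (s == maximum_filter(s, size=9))               # local maxima only
    return (s >= thresh) & peaks

# Toy frame: a weak 3x3 target on a noisy background.
rng = np.random.default_rng(0)
frame = rng.normal(100.0, 2.0, size=(128, 128))
frame[60:63, 80:83] += 12.0
mask = candidate_mask(frame)
print("candidate pixels passed to the detector:", int(mask.sum()), "of", frame.size)
```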

  6. Materials challenges for repeatable RF wireless device reconfiguration with microfluidic channels

    NASA Astrophysics Data System (ADS)

    Griffin, Anthony S.; Sottos, Nancy R.; White, Scott R.

    2018-03-01

    Recently, adaptive wireless devices have utilized displacement of EGaIn within microchannels as an electrical switching mechanism to enable reconfigurable electronics. Device reconfiguration using EGaIn in microchannels overcomes many challenges encountered by more traditional reconfiguration mechanisms such as diodes and microelectromechanical systems (MEMS). However, reconfiguration using EGaIn is severely limited by undesired permanent shorting due to retention of the liquid in microchannels caused by wetting and rapid oxide skin formation. Here, we investigate the conditions that prevent repeatable electrical switching using EGaIn in microchannels. Initial contact angle tests of EGaIn on epoxy surfaces demonstrate the wettability of EGaIn on flat surfaces. SEM cross-sections of microchannels reveal adhesion of EGaIn residue to channel walls. Micro-computed tomography (microCT) scans provide volumetric measurements of the EGaIn remaining inside the channels after flow cycling. Non-wetting coatings are proposed as a materials-based strategy to overcome these issues in future work.

  7. A cholinergic feedback circuit to regulate striatal population uncertainty and optimize reinforcement learning.

    PubMed

    Franklin, Nicholas T; Frank, Michael J

    2015-12-25

    Convergent evidence suggests that the basal ganglia support reinforcement learning by adjusting action values according to reward prediction errors. However, adaptive behavior in stochastic environments requires the consideration of uncertainty to dynamically adjust the learning rate. We consider how cholinergic tonically active interneurons (TANs) may endow the striatum with such a mechanism in computational models spanning three Marr's levels of analysis. In the neural model, TANs modulate the excitability of spiny neurons, their population response to reinforcement, and hence the effective learning rate. Long TAN pauses facilitated robustness to spurious outcomes by increasing divergence in synaptic weights between neurons coding for alternative action values, whereas short TAN pauses facilitated stochastic behavior but increased responsiveness to change-points in outcome contingencies. A feedback control system allowed TAN pauses to be dynamically modulated by uncertainty across the spiny neuron population, allowing the system to self-tune and optimize performance across stochastic environments.
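    Abstracting away the spiking-network detail, the core computational claim is that the effective learning rate should scale with the population's uncertainty and be re-inflated after change-points. A toy, Kalman-style bandit version of that idea is sketched below; it is a didactic abstraction under illustrative constants, not the authors' model at any of Marr's levels.

```python
import numpy as np

# Uncertainty-modulated learning: each action value carries a variance, the
# effective learning rate kappa grows when uncertainty is high, and uncertainty
# is re-inflated every trial so the learner stays responsive to change-points.

rng = np.random.default_rng(1)
n_arms, n_trials = 2, 400
true_p = np.array([0.75, 0.25])        # reward probabilities, swapped mid-session
mu = np.full(n_arms, 0.5)              # value estimates
var = np.full(n_arms, 1.0)             # estimate uncertainty
drift_var, obs_var, beta = 0.01, 0.25, 5.0

better_choices = 0
for trial in range(n_trials):
    if trial == n_trials // 2:
        true_p = true_p[::-1]          # change-point in outcome contingencies
    var += drift_var                   # uncertainty grows between observations
    probs = np.exp(beta * mu) / np.exp(beta * mu).sum()   # softmax action choice
    a = rng.choice(n_arms, p=probs)
    r = float(rng.random() < true_p[a])
    kappa = var[a] / (var[a] + obs_var)          # uncertainty-scaled learning rate
    mu[a] += kappa * (r - mu[a])
    var[a] *= (1.0 - kappa)                      # observation shrinks uncertainty
    better_choices += int(true_p[a] == true_p.max())

print(f"fraction of better-arm choices: {better_choices / n_trials:.2f}")
```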

  8. Modeling Complex Biological Flows in Multi-Scale Systems using the APDEC Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trebotich, D

    We have developed advanced numerical algorithms to model biological fluids in multiscale flow environments using the software framework developed under the SciDAC APDEC ISIC. The foundation of our computational effort is an approach for modeling DNA-laden fluids as "bead-rod" polymers whose dynamics are fully coupled to an incompressible viscous solvent. The method is capable of modeling short range forces and interactions between particles using soft potentials and rigid constraints. Our methods are based on higher-order finite difference methods in complex geometry with adaptivity, leveraging algorithms and solvers in the APDEC Framework. Our Cartesian grid embedded boundary approach to incompressible viscous flow in irregular geometries has also been interfaced to a fast and accurate level-sets method within the APDEC Framework for extracting surfaces from volume renderings of medical image data and used to simulate cardio-vascular and pulmonary flows in critical anatomies.

  9. Modeling complex biological flows in multi-scale systems using the APDEC framework

    NASA Astrophysics Data System (ADS)

    Trebotich, David

    2006-09-01

    We have developed advanced numerical algorithms to model biological fluids in multiscale flow environments using the software framework developed under the SciDAC APDEC ISIC. The foundation of our computational effort is an approach for modeling DNA-laden fluids as "bead-rod" polymers whose dynamics are fully coupled to an incompressible viscous solvent. The method is capable of modeling short range forces and interactions between particles using soft potentials and rigid constraints. Our methods are based on higher-order finite difference methods in complex geometry with adaptivity, leveraging algorithms and solvers in the APDEC Framework. Our Cartesian grid embedded boundary approach to incompressible viscous flow in irregular geometries has also been interfaced to a fast and accurate level-sets method within the APDEC Framework for extracting surfaces from volume renderings of medical image data and used to simulate cardio-vascular and pulmonary flows in critical anatomies.

  10. A Monte Carlo Simulation Investigating the Validity and Reliability of Ability Estimation in Item Response Theory with Speeded Computer Adaptive Tests

    ERIC Educational Resources Information Center

    Schmitt, T. A.; Sass, D. A.; Sullivan, J. R.; Walker, C. M.

    2010-01-01

    Imposed time limits on computer adaptive tests (CATs) can result in examinees having difficulty completing all items, thus compromising the validity and reliability of ability estimates. In this study, the effects of speededness were explored in a simulated CAT environment by varying examinee response patterns to end-of-test items. Expectedly,…

  11. Validating the ACE Model for Evaluating Student Performance Using a Teaching-Learning Process Based on Computational Modeling Systems

    ERIC Educational Resources Information Center

    Louzada, Alexandre Neves; Elia, Marcos da Fonseca; Sampaio, Fábio Ferrentini; Vidal, Andre Luiz Pestana

    2014-01-01

    The aim of this work is to adapt and test, in a Brazilian public school, the ACE model proposed by Borkulo for evaluating student performance as a teaching-learning process based on computational modeling systems. The ACE model is based on different types of reasoning involving three dimensions. In addition to adapting the model and introducing…

  12. Advanced Dynamically Adaptive Algorithms for Stochastic Simulations on Extreme Scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiu, Dongbin

    2017-03-03

    The focus of the project is the development of mathematical methods and high-performance computational tools for stochastic simulations, with a particular emphasis on computations on extreme scales. The core of the project revolves around the design of highly efficient and scalable numerical algorithms that can adaptively and accurately, in high dimensional spaces, resolve stochastic problems with limited smoothness, even containing discontinuities.

  13. Using a Computer-Adapted, Conceptually Based History Text to Increase Comprehension and Problem-Solving Skills of Students with Disabilities

    ERIC Educational Resources Information Center

    Twyman, Todd; Tindal, Gerald

    2006-01-01

    The purpose of this study was to improve the comprehension and problem-solving skills of students with disabilities in social studies using a conceptually framed, computer-adapted history text. Participants were 11th and 12th grade students identified with learning disabilities in reading and writing from two intact, self-contained social studies…

  14. Exploring the Cross-Linguistic Transfer of Reading Skills in Spanish to English in the Context of a Computer Adaptive Reading Intervention

    ERIC Educational Resources Information Center

    Baker, Doris Luft; Basaraba, Deni Lee; Smolkowski, Keith; Conry, Jillian; Hautala, Jarkko; Richardson, Ulla; English, Sherril; Cole, Ron

    2017-01-01

    We explore the potential of a computer-adaptive decoding game in Spanish to increase the decoding skills and oral reading fluency in Spanish and English of bilingual students. Participants were 78 first-grade Spanish-speaking students attending bilingual programs in five classrooms in Texas. Classrooms were randomly assigned to the treatment…

  15. Study of Personnel Attrition and Revocation within U.S. Marine Corps Air Traffic Control Specialties

    DTIC Science & Technology

    2012-03-01

    …screening at Military Entrance Processing Stations (MEPS) and recruit depots, to include non-cognitive testing such as the Navy Computer Adaptive Personality Scales (NCAPS), during recruitment. It is also recommended that an economic analysis be conducted comparing the… (Report keywords: Revocation, Selection, MOS, Regression, Probit, dProbit, STATA, Statistics, Marginal Effects, ASVAB, AFQT, Composite Scores, Screening, NCAPS.)

  16. Using Patterns of Summed Scores in Paper-and-Pencil Tests and Computer-Adaptive Tests to Detect Misfitting Item Score Patterns

    ERIC Educational Resources Information Center

    Meijer, Rob R.

    2004-01-01

    Two new methods have been proposed to determine unexpected sum scores on sub-tests (testlets) both for paper-and-pencil tests and computer adaptive tests. A method based on a conservative bound using the hypergeometric distribution, denoted p, was compared with a method where the probability for each score combination was calculated using a…

  17. The Impact of Item Dependency on the Efficiency of Testing and Reliability of Student Scores from a Computer Adaptive Assessment of Reading Comprehension

    ERIC Educational Resources Information Center

    Petscher, Yaacov; Foorman, Barbara R.; Truckenmiller, Adrea J.

    2017-01-01

    The objective of the present study was to evaluate the extent to which students who took a computer adaptive test of reading comprehension accounting for testlet effects were administered fewer passages and had a more precise estimate of their reading comprehension ability compared to students in the control condition. A randomized controlled…

  18. Adaptively biased sequential importance sampling for rare events in reaction networks with comparison to exact solutions from finite buffer dCME method

    PubMed Central

    Cao, Youfang; Liang, Jie

    2013-01-01

    Critical events that occur rarely in biological processes are of great importance, but are challenging to study using Monte Carlo simulation. By introducing biases to reaction selection and reaction rates, weighted stochastic simulation algorithms based on importance sampling allow rare events to be sampled more effectively. However, existing methods do not address the important issue of barrier crossing, which often arises from multistable networks and systems with complex probability landscape. In addition, the proliferation of parameters and the associated computing cost pose significant problems. Here we introduce a general theoretical framework for obtaining optimized biases in sampling individual reactions for estimating probabilities of rare events. We further describe a practical algorithm called adaptively biased sequential importance sampling (ABSIS) method for efficient probability estimation. By adopting a look-ahead strategy and by enumerating short paths from the current state, we estimate the reaction-specific and state-specific forward and backward moving probabilities of the system, which are then used to bias reaction selections. The ABSIS algorithm can automatically detect barrier-crossing regions, and can adjust bias adaptively at different steps of the sampling process, with bias determined by the outcome of exhaustively generated short paths. In addition, there are only two bias parameters to be determined, regardless of the number of the reactions and the complexity of the network. We have applied the ABSIS method to four biochemical networks: the birth-death process, the reversible isomerization, the bistable Schlögl model, and the enzymatic futile cycle model. For comparison, we have also applied the finite buffer discrete chemical master equation (dCME) method recently developed to obtain exact numerical solutions of the underlying discrete chemical master equations of these problems. This allows us to assess sampling results objectively by comparing simulation results with true answers. Overall, ABSIS can accurately and efficiently estimate rare event probabilities for all examples, often with smaller variance than other importance sampling algorithms. The ABSIS method is general and can be applied to study rare events of other stochastic networks with complex probability landscape. PMID:23862966
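    ABSIS adds an adaptive, look-ahead-derived bias on top of the standard weighted stochastic simulation machinery. The sketch below shows only that underlying machinery for the birth-death example: reaction selection is biased while the exponential waiting times are drawn from the true process, and each trajectory accumulates the likelihood-ratio weight that keeps the rare-event estimate unbiased. The constant bias factor stands in for ABSIS's state-dependent bias, and all rates and thresholds are illustrative assumptions.

```python
import numpy as np

# Weighted SSA for a birth-death process: estimate the probability that the
# population reaches n_target before t_max, using biased reaction selection
# corrected by likelihood-ratio weights.

rng = np.random.default_rng(2)
birth, death = 10.0, 0.25          # birth rate and per-molecule death rate (mean ~40)
n0, n_target, t_max = 40, 65, 10.0 # rare event: population reaches n_target before t_max
bias_toward_birth = 2.0            # >1 favours the birth reaction in the proposal

def weighted_trajectory():
    n, t, weight = n0, 0.0, 1.0
    while t < t_max:
        a = np.array([birth, death * n])         # true propensities
        a0 = a.sum()
        if a0 == 0.0:
            break
        t += rng.exponential(1.0 / a0)           # waiting time from the TRUE process
        if t >= t_max:
            break
        p_true = a / a0                          # true selection probabilities
        b = np.array([bias_toward_birth, 1.0]) * a
        p_bias = b / b.sum()                     # biased selection probabilities
        j = rng.choice(2, p=p_bias)
        weight *= p_true[j] / p_bias[j]          # likelihood-ratio correction
        n += 1 if j == 0 else -1
        if n >= n_target:
            return weight                        # rare event reached
    return 0.0

estimates = [weighted_trajectory() for _ in range(20000)]
print(f"estimated rare-event probability: {np.mean(estimates):.2e}")
```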

  19. Adaptively biased sequential importance sampling for rare events in reaction networks with comparison to exact solutions from finite buffer dCME method

    NASA Astrophysics Data System (ADS)

    Cao, Youfang; Liang, Jie

    2013-07-01

    Critical events that occur rarely in biological processes are of great importance, but are challenging to study using Monte Carlo simulation. By introducing biases to reaction selection and reaction rates, weighted stochastic simulation algorithms based on importance sampling allow rare events to be sampled more effectively. However, existing methods do not address the important issue of barrier crossing, which often arises from multistable networks and systems with complex probability landscape. In addition, the proliferation of parameters and the associated computing cost pose significant problems. Here we introduce a general theoretical framework for obtaining optimized biases in sampling individual reactions for estimating probabilities of rare events. We further describe a practical algorithm called adaptively biased sequential importance sampling (ABSIS) method for efficient probability estimation. By adopting a look-ahead strategy and by enumerating short paths from the current state, we estimate the reaction-specific and state-specific forward and backward moving probabilities of the system, which are then used to bias reaction selections. The ABSIS algorithm can automatically detect barrier-crossing regions, and can adjust bias adaptively at different steps of the sampling process, with bias determined by the outcome of exhaustively generated short paths. In addition, there are only two bias parameters to be determined, regardless of the number of the reactions and the complexity of the network. We have applied the ABSIS method to four biochemical networks: the birth-death process, the reversible isomerization, the bistable Schlögl model, and the enzymatic futile cycle model. For comparison, we have also applied the finite buffer discrete chemical master equation (dCME) method recently developed to obtain exact numerical solutions of the underlying discrete chemical master equations of these problems. This allows us to assess sampling results objectively by comparing simulation results with true answers. Overall, ABSIS can accurately and efficiently estimate rare event probabilities for all examples, often with smaller variance than other importance sampling algorithms. The ABSIS method is general and can be applied to study rare events of other stochastic networks with complex probability landscape.

  20. Adaptively biased sequential importance sampling for rare events in reaction networks with comparison to exact solutions from finite buffer dCME method.

    PubMed

    Cao, Youfang; Liang, Jie

    2013-07-14

    Critical events that occur rarely in biological processes are of great importance, but are challenging to study using Monte Carlo simulation. By introducing biases to reaction selection and reaction rates, weighted stochastic simulation algorithms based on importance sampling allow rare events to be sampled more effectively. However, existing methods do not address the important issue of barrier crossing, which often arises from multistable networks and systems with complex probability landscape. In addition, the proliferation of parameters and the associated computing cost pose significant problems. Here we introduce a general theoretical framework for obtaining optimized biases in sampling individual reactions for estimating probabilities of rare events. We further describe a practical algorithm called adaptively biased sequential importance sampling (ABSIS) method for efficient probability estimation. By adopting a look-ahead strategy and by enumerating short paths from the current state, we estimate the reaction-specific and state-specific forward and backward moving probabilities of the system, which are then used to bias reaction selections. The ABSIS algorithm can automatically detect barrier-crossing regions, and can adjust bias adaptively at different steps of the sampling process, with bias determined by the outcome of exhaustively generated short paths. In addition, there are only two bias parameters to be determined, regardless of the number of the reactions and the complexity of the network. We have applied the ABSIS method to four biochemical networks: the birth-death process, the reversible isomerization, the bistable Schlögl model, and the enzymatic futile cycle model. For comparison, we have also applied the finite buffer discrete chemical master equation (dCME) method recently developed to obtain exact numerical solutions of the underlying discrete chemical master equations of these problems. This allows us to assess sampling results objectively by comparing simulation results with true answers. Overall, ABSIS can accurately and efficiently estimate rare event probabilities for all examples, often with smaller variance than other importance sampling algorithms. The ABSIS method is general and can be applied to study rare events of other stochastic networks with complex probability landscape.

  1. An adaptive inverse kinematics algorithm for robot manipulators

    NASA Technical Reports Server (NTRS)

    Colbaugh, R.; Glass, K.; Seraji, H.

    1990-01-01

    An adaptive algorithm for solving the inverse kinematics problem for robot manipulators is presented. The algorithm is derived using model reference adaptive control (MRAC) theory and is computationally efficient for online applications. The scheme requires no a priori knowledge of the kinematics of the robot if Cartesian end-effector sensing is available, and it requires knowledge of only the forward kinematics if joint position sensing is used. Computer simulation results are given for the redundant seven-DOF robotics research arm, demonstrating that the proposed algorithm yields accurate joint angle trajectories for a given end-effector position/orientation trajectory.
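    The scheme above is derived from MRAC theory and needs no kinematic model when Cartesian end-effector sensing is available. As a rough, model-free illustration of that idea (and explicitly not the paper's MRAC law), the sketch below estimates the Jacobian online with Broyden secant updates from sensed end-effector displacements and steers a planar two-link arm by damped least squares; the forward-kinematics routine merely stands in for the Cartesian sensor, and all gains are illustrative assumptions.

```python
import numpy as np

L1, L2 = 1.0, 0.8

def end_effector(q):
    """Plays the role of the Cartesian position sensor."""
    return np.array([L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
                     L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])])

def adaptive_ik(target, q0, steps=200, gain=0.5, damping=1e-3, probe=1e-3):
    q = np.array(q0, dtype=float)
    x = end_effector(q)
    # Bootstrap the Jacobian estimate with two tiny sensed probes (still model-free).
    J = np.zeros((2, 2))
    for i in range(2):
        dq = np.zeros(2)
        dq[i] = probe
        J[:, i] = (end_effector(q + dq) - x) / probe
    for _ in range(steps):
        err = target - x
        if np.linalg.norm(err) < 1e-6:
            break
        # Damped least-squares step toward the target using the *estimated* Jacobian.
        dq = J.T @ np.linalg.solve(J @ J.T + damping * np.eye(2), gain * err)
        x_new = end_effector(q + dq)
        dx = x_new - x
        # Broyden secant update: correct J with the sensed Cartesian displacement.
        J += np.outer(dx - J @ dq, dq) / (dq @ dq + 1e-12)
        q, x = q + dq, x_new
    return q, x

target = np.array([0.9, 1.1])
q, x = adaptive_ik(target, q0=[0.3, 0.5])
print("reached", np.round(x, 4), "target", target, "joints", np.round(q, 3))
```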

  2. Real-time control system for adaptive resonator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flath, L; An, J; Brase, J

    2000-07-24

    Sustained operation of high average power solid-state lasers currently requires an adaptive resonator to produce the optimal beam quality. We describe the architecture of a real-time adaptive control system for correcting intra-cavity aberrations in a heat capacity laser. Image data collected from a wavefront sensor are processed and used to control phase with a high-spatial-resolution deformable mirror. Our controller takes advantage of recent developments in low-cost, high-performance processor technology. A desktop-based computational engine and object-oriented software architecture replaces the high-cost rack-mount embedded computers of previous systems.

  3. Accuracy requirements of optical linear algebra processors in adaptive optics imaging systems.

    PubMed

    Downie, J D; Goodman, J W

    1989-10-15

    A ground-based adaptive optics imaging telescope system attempts to improve image quality by measuring and correcting for atmospherically induced wavefront aberrations. The necessary control computations during each cycle will take a finite amount of time, which adds to the residual error variance since the atmosphere continues to change during that time. Thus an optical processor may be well-suited for this task. This paper investigates this possibility by studying the accuracy requirements in a general optical processor that will make it competitive with, or superior to, a conventional digital computer for adaptive optics use.

  4. Multi-Level Adaptive Techniques (MLAT) for singular-perturbation problems

    NASA Technical Reports Server (NTRS)

    Brandt, A.

    1978-01-01

    The multilevel (multigrid) adaptive technique, a general strategy of solving continuous problems by cycling between coarser and finer levels of discretization is described. It provides very fast general solvers, together with adaptive, nearly optimal discretization schemes. In the process, boundary layers are automatically either resolved or skipped, depending on a control function which expresses the computational goal. The global error decreases exponentially as a function of the overall computational work, in a uniform rate independent of the magnitude of the singular-perturbation terms. The key is high-order uniformly stable difference equations, and uniformly smoothing relaxation schemes.
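    The coarse/fine cycling at the heart of MLAT is easiest to see in a plain two-grid correction scheme. The sketch below applies it to -u'' = f in one dimension with zero Dirichlet boundaries: weighted-Jacobi smoothing, full-weighting restriction of the residual, an exact coarse solve, and linear-interpolation correction. It deliberately omits the adaptive discretization and boundary-layer treatment that make MLAT more than plain multigrid; grid sizes and smoothing counts are illustrative assumptions.

```python
import numpy as np

def apply_A(u, h):
    """Action of the standard 3-point Laplacian on interior unknowns (zero BCs)."""
    Au = 2.0 * u.copy()
    Au[1:] -= u[:-1]
    Au[:-1] -= u[1:]
    return Au / h**2

def jacobi(u, f, h, sweeps, omega=2.0 / 3.0):
    for _ in range(sweeps):
        u = u + omega * (h**2 / 2.0) * (f - apply_A(u, h))   # weighted Jacobi
    return u

def two_grid_cycle(u, f, h, nc):
    u = jacobi(u, f, h, sweeps=3)                            # pre-smoothing
    r = f - apply_A(u, h)
    rc = 0.25 * r[0:-2:2] + 0.5 * r[1:-1:2] + 0.25 * r[2::2] # full-weighting restriction
    H = 2.0 * h
    Ac = (np.diag(2.0 * np.ones(nc)) - np.diag(np.ones(nc - 1), 1)
          - np.diag(np.ones(nc - 1), -1)) / H**2
    ec = np.linalg.solve(Ac, rc)                             # exact coarse correction
    ef = np.zeros_like(u)                                    # linear interpolation
    ef[1::2] = ec                                            # coarse points inject
    ef[2:-1:2] = 0.5 * (ec[:-1] + ec[1:])                    # interior even points
    ef[0], ef[-1] = 0.5 * ec[0], 0.5 * ec[-1]                # next to the boundaries
    u = u + ef
    return jacobi(u, f, h, sweeps=3)                         # post-smoothing

nc = 31                       # coarse interior points
n = 2 * nc + 1                # fine interior points
h = 1.0 / (n + 1)
xs = np.linspace(h, 1.0 - h, n)
f = np.sin(np.pi * xs)        # exact solution is sin(pi*x) / pi^2
u = np.zeros(n)
for cycle in range(5):
    u = two_grid_cycle(u, f, h, nc)
    print(f"cycle {cycle + 1}: residual = {np.linalg.norm(f - apply_A(u, h)):.3e}")
```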

  5. [Energy expenditure at rest and obesity].

    PubMed

    Müllerová, D; Matĕjková, D; Rusavý, Z; Müller, L

    1998-01-01

    Because of daily fluctuations in energy intake and energy needs, the adult human body must have very precise adaptive mechanisms for maintaining heat homeostasis and a nearly stable body weight and body composition, which are optimal for life and reproduction. These short-term adaptive mechanisms are called "empty biochemical mechanisms", in which chemically bound energy is transformed to heat without the performance of work. They are present at the cellular level (substrate cycles, uncoupling of the respiratory chain) and at the interorgan metabolic level (glycolysis and gluconeogenesis between the liver and adipose tissue: the glucose-lactate cycle). The central nervous system controls them via many factors, the most important being catecholamines, leptin, insulin, thyroid hormones, cortisol, and growth and sex hormones. Neurotransmitters and neuronal networks influence energy intake and other behavior. Obesity appears to be associated with a weakening, or an overwhelming, of these short-term effective adaptive mechanisms.

  6. Hardware Acceleration of Adaptive Neural Algorithms.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    James, Conrad D.

    As traditional numerical computing has faced challenges, researchers have turned towards alternative computing approaches to reduce power-per-computation metrics and improve algorithm performance. Here, we describe an approach towards non-conventional computing that strengthens the connection between machine learning and neuroscience concepts. The Hardware Acceleration of Adaptive Neural Algorithms (HAANA) project has developed neural machine learning algorithms and hardware for applications in image processing and cybersecurity. While machine learning methods are effective at extracting relevant features from many types of data, the effectiveness of these algorithms degrades when subjected to real-world conditions. Our team has generated novel neural-inspired approaches to improve the resiliency and adaptability of machine learning algorithms. In addition, we have also designed and fabricated hardware architectures and microelectronic devices specifically tuned towards the training and inference operations of neural-inspired algorithms. Finally, our multi-scale simulation framework allows us to assess the impact of microelectronic device properties on algorithm performance.

  7. A proposed study of multiple scattering through clouds up to 1 THz

    NASA Technical Reports Server (NTRS)

    Gerace, G. C.; Smith, E. K.

    1992-01-01

    A rigorous computation of the electromagnetic field scattered from an atmospheric liquid water cloud is proposed. The recent development of a fast recursive algorithm (Chew algorithm) for computing the fields scattered from numerous scatterers now makes a rigorous computation feasible. A method is presented for adapting this algorithm to a general case where there are an extremely large number of scatterers. It is also proposed to extend a new binary PAM channel coding technique (El-Khamy coding) to multiple levels with non-square pulse shapes. The Chew algorithm can be used to compute the transfer function of a cloud channel. Then the transfer function can be used to design an optimum El-Khamy code. In principle, these concepts can be applied directly to the realistic case of a time-varying cloud (adaptive channel coding and adaptive equalization). A brief review is included of some preliminary work on cloud dispersive effects on digital communication signals and on cloud liquid water spectra and correlations.

  8. Adaptive Mesh Refinement for Microelectronic Device Design

    NASA Technical Reports Server (NTRS)

    Cwik, Tom; Lou, John; Norton, Charles

    1999-01-01

    Finite element and finite volume methods are used in a variety of design simulations when it is necessary to compute fields throughout regions that contain varying materials or geometry. Convergence of the simulation can be assessed by uniformly increasing the mesh density until an observable quantity stabilizes. Depending on the electrical size of the problem, uniform refinement of the mesh may be computationally infeasible due to memory limitations. Similarly, depending on the geometric complexity of the object being modeled, uniform refinement can be inefficient since regions that do not need refinement add to the computational expense. In either case, convergence to the correct (measured) solution is not guaranteed. Adaptive mesh refinement methods attempt to selectively refine the region of the mesh that is estimated to contain proportionally higher solution errors. The refinement may be obtained by decreasing the element size (h-refinement), by increasing the order of the element (p-refinement) or by a combination of the two (h-p refinement). A successful adaptive strategy refines the mesh to produce an accurate solution measured against the correct fields without undue computational expense. This is accomplished by the use of a) reliable a posteriori error estimates, b) hierarchal elements, and c) automatic adaptive mesh generation. Adaptive methods are also useful when problems with multi-scale field variations are encountered. These occur in active electronic devices that have thin doped layers and also when mixed physics is used in the calculation. The mesh needs to be fine at and near the thin layer to capture rapid field or charge variations, but can coarsen away from these layers where field variations smoothen and charge densities are uniform. This poster will present an adaptive mesh refinement package that runs on parallel computers and is applied to specific microelectronic device simulations. Passive sensors that operate in the infrared portion of the spectrum as well as active device simulations that model charge transport and Maxwell's equations will be presented.
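    In its simplest one-dimensional form, the h-refinement loop described above marks cells whose a posteriori error indicator is large and splits them, leaving the rest of the mesh coarse. The sketch below uses a slope-jump indicator on a synthetic field with a sharp front; the field, the indicator, and the marking threshold are illustrative assumptions, not the package's estimator or its parallel machinery.

```python
import numpy as np

# 1-D h-refinement: split the cells whose slope-jump indicator is a large fraction
# of the worst indicator, so the mesh clusters around the front and stays coarse
# elsewhere.

def field(x):
    """Stand-in for the computed solution: a sharp front near x = 0.6."""
    return np.tanh(40.0 * (x - 0.6))

def refine(nodes, fraction=0.3, passes=4):
    for _ in range(passes):
        u = field(nodes)
        slopes = np.diff(u) / np.diff(nodes)               # one slope per cell
        jump = np.abs(np.diff(slopes))                     # slope jump at interior nodes
        indicator = np.zeros(len(nodes) - 1)               # per-cell error indicator
        indicator[:-1] = np.maximum(indicator[:-1], jump)  # jump at each cell's right node
        indicator[1:] = np.maximum(indicator[1:], jump)    # jump at each cell's left node
        marked = indicator > fraction * indicator.max()
        midpoints = 0.5 * (nodes[:-1] + nodes[1:])[marked] # split the marked cells
        nodes = np.sort(np.concatenate([nodes, midpoints]))
    return nodes

nodes = np.linspace(0.0, 1.0, 11)                          # coarse initial mesh
refined = refine(nodes)
print(f"{len(refined)} nodes; smallest cell = {np.diff(refined).min():.4f}, "
      f"largest cell = {np.diff(refined).max():.4f}")
```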

  9. Cyber-workstation for computational neuroscience.

    PubMed

    Digiovanna, Jack; Rattanatamrong, Prapaporn; Zhao, Ming; Mahmoudi, Babak; Hermer, Linda; Figueiredo, Renato; Principe, Jose C; Fortes, Jose; Sanchez, Justin C

    2010-01-01

    A Cyber-Workstation (CW) to study in vivo, real-time interactions between computational models and large-scale brain subsystems during behavioral experiments has been designed and implemented. The design philosophy seeks to directly link the in vivo neurophysiology laboratory with scalable computing resources to enable more sophisticated computational neuroscience investigation. The architecture designed here allows scientists to develop new models and integrate them with existing models (e.g. recursive least-squares regressor) by specifying appropriate connections in a block-diagram. Then, adaptive middleware transparently implements these user specifications using the full power of remote grid-computing hardware. In effect, the middleware deploys an on-demand and flexible neuroscience research test-bed to provide the neurophysiology laboratory extensive computational power from an outside source. The CW consolidates distributed software and hardware resources to support time-critical and/or resource-demanding computing during data collection from behaving animals. This power and flexibility is important as experimental and theoretical neuroscience evolves based on insights gained from data-intensive experiments, new technologies and engineering methodologies. This paper describes briefly the computational infrastructure and its most relevant components. Each component is discussed within a systematic process of setting up an in vivo, neuroscience experiment. Furthermore, a co-adaptive brain machine interface is implemented on the CW to illustrate how this integrated computational and experimental platform can be used to study systems neurophysiology and learning in a behavior task. We believe this implementation is also the first remote execution and adaptation of a brain-machine interface.

  10. Cyber-Workstation for Computational Neuroscience

    PubMed Central

    DiGiovanna, Jack; Rattanatamrong, Prapaporn; Zhao, Ming; Mahmoudi, Babak; Hermer, Linda; Figueiredo, Renato; Principe, Jose C.; Fortes, Jose; Sanchez, Justin C.

    2009-01-01

    A Cyber-Workstation (CW) to study in vivo, real-time interactions between computational models and large-scale brain subsystems during behavioral experiments has been designed and implemented. The design philosophy seeks to directly link the in vivo neurophysiology laboratory with scalable computing resources to enable more sophisticated computational neuroscience investigation. The architecture designed here allows scientists to develop new models and integrate them with existing models (e.g. recursive least-squares regressor) by specifying appropriate connections in a block-diagram. Then, adaptive middleware transparently implements these user specifications using the full power of remote grid-computing hardware. In effect, the middleware deploys an on-demand and flexible neuroscience research test-bed to provide the neurophysiology laboratory extensive computational power from an outside source. The CW consolidates distributed software and hardware resources to support time-critical and/or resource-demanding computing during data collection from behaving animals. This power and flexibility is important as experimental and theoretical neuroscience evolves based on insights gained from data-intensive experiments, new technologies and engineering methodologies. This paper describes briefly the computational infrastructure and its most relevant components. Each component is discussed within a systematic process of setting up an in vivo, neuroscience experiment. Furthermore, a co-adaptive brain machine interface is implemented on the CW to illustrate how this integrated computational and experimental platform can be used to study systems neurophysiology and learning in a behavior task. We believe this implementation is also the first remote execution and adaptation of a brain-machine interface. PMID:20126436

  11. Validation of Patient-Reported Outcomes Measurement Information System Computerized Adaptive Tests Against the Foot and Ankle Outcome Score for 6 Common Foot and Ankle Pathologies.

    PubMed

    Koltsov, Jayme C B; Greenfield, Stephen T; Soukup, Dylan; Do, Huong T; Ellis, Scott J

    2017-08-01

    The field of foot and ankle surgery lacks a widely accepted gold-standard patient-reported outcome instrument. With the changing infrastructure of the medical profession, more efficient patient-reported outcome tools are needed to reduce respondent burden and increase participation while providing consistent and reliable measurement across multiple pathologies and disciplines. The primary purpose of the present study was to validate 3 Patient-Reported Outcomes Measurement Information System computer adaptive tests (CATs) most relevant to the foot and ankle discipline against the Foot and Ankle Outcome Score (FAOS) and the Short Form 12 general health status survey in patients with 6 common foot and ankle pathologies. Patients (n = 240) indicated for operative treatment for 1 of 6 common foot and ankle pathologies completed the CATs, FAOS, and Short Form 12 at their preoperative surgical visits, 1 week subsequently (before surgery), and at 6 months postoperatively. The psychometric properties of the instruments were assessed and compared. The Patient-Reported Outcomes Measurement Information System CATs each took less than 1 minute to complete, whereas the FAOS took 6.5 minutes, and the Short Form 12 took 3 minutes. CAT scores were more normally distributed and had fewer floor and ceiling effects than those on the FAOS, which reached as high as 24%. The CATs were more precise than the FAOS and had similar responsiveness and test-retest reliability. The physical function and mobility CATs correlated strongly with the activities subscale of the FAOS, and the pain interference CAT correlated strongly with the pain subscale of the FAOS. The CATs and FAOS were responsive to changes with operative treatment for 6 common foot and ankle pathologies. The CATs performed as well as or better than the FAOS in all aspects of psychometric validity. The Patient-Reported Outcomes Measurement Information System CATs show tremendous potential for improving the study of patient outcomes in foot and ankle research through improved precision and reduced respondent burden. Level II, prospective comparative study.

  12. Water System Adaptation To Hydrological Changes: Module 8, Regulatory Framework Intersections: Past, Present, and Future

    EPA Science Inventory

    This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...

  13. Short forms of the Schedule for Nonadaptive and Adaptive Personality (SNAP) for self- and collateral ratings: development, reliability, and validity.

    PubMed

    Harlan, E; Clark, L A

    1999-06-01

    Researchers and clinicians alike increasingly seek brief, reliable, and valid measures to obtain personality trait ratings from both selves and peers. We report the development of a paragraph-descriptor short form of a full-length personality assessment instrument, the Schedule for Nonadaptive and Adaptive Personality (SNAP) with both self- and other versions. Reliability and validity data were collected on a sample of 294 college students, from 90 of whom we also obtained parental ratings of their personality. Internal consistency reliability was good in both self- and parent data. The factorial structures of the self-report short and long forms were very similar. Convergence between parental ratings was moderately high. Self-parent convergence was variable, with lower agreement on scales assessing subjective distress than those assessing more observable behaviors; it also was stronger for higher order factors than for scales.

  14. Parenting Young Arab Children: Psychometric Properties of an Adapted Arabic Brief Version of the Alabama Parenting Questionnaire.

    PubMed

    Badahdah, Abdallah; Le, Kien Trung

    2016-06-01

    Research has shown a connection between negative parenting practices and child conduct problems. One of the most commonly used measures to assess parenting practices is the Alabama parenting questionnaire (APQ). The current study aimed to culturally adapt and assess the psychometric properties of a short version of the APQ for use in Arabic cultures, using a sample of 251 Qatari parents of children ages 4-12. An exploratory factor analysis proposed a five-model solution that corresponds to the original proposed model in the full version of the APQ. The five constructs of the APQ correlated in the expected direction with the Conduct Problem Subscale from the Strength and Difficulties Questionnaire. This study provides support for the utility of the 15-item short version of the APQ in Arabic cultures. More studies are needed to validate the performance of the short version of APQ in clinical settings.

  15. The Role of Atomic Repertoires in Complex Behavior

    ERIC Educational Resources Information Center

    Palmer, David C.

    2012-01-01

    Evolution and reinforcement shape adaptive forms and adaptive behavior through many cycles of blind variation and selection, and therein lie their parsimony and power. Human behavior is distinctive in that this shaping process is commonly "short circuited": Critical variations are induced in a single trial. The processes by which this economy is…

  16. The Adaptive Basis of Psychosocial Acceleration: Comment on beyond Mental Health, Life History Strategies Articles

    ERIC Educational Resources Information Center

    Nettle, Daniel; Frankenhuis, Willem E.; Rickard, Ian J.

    2012-01-01

    Four of the articles published in this special section of "Developmental Psychology" build on and refine psychosocial acceleration theory. In this short commentary, we discuss some of the adaptive assumptions of psychosocial acceleration theory that have not received much attention. Psychosocial acceleration theory relies on the behavior of…

  17. Cas4 Facilitates PAM-Compatible Spacer Selection during CRISPR Adaptation.

    PubMed

    Kieper, Sebastian N; Almendros, Cristóbal; Behler, Juliane; McKenzie, Rebecca E; Nobrega, Franklin L; Haagsma, Anna C; Vink, Jochem N A; Hess, Wolfgang R; Brouns, Stan J J

    2018-03-27

    CRISPR-Cas systems adapt their immunological memory against their invaders by integrating short DNA fragments into clustered regularly interspaced short palindromic repeat (CRISPR) loci. While Cas1 and Cas2 make up the core machinery of the CRISPR integration process, various class I and II CRISPR-Cas systems encode Cas4 proteins for which the role is unknown. Here, we introduced the CRISPR adaptation genes cas1, cas2, and cas4 from the type I-D CRISPR-Cas system of Synechocystis sp. 6803 into Escherichia coli and observed that cas4 is strictly required for the selection of targets with protospacer adjacent motifs (PAMs) conferring I-D CRISPR interference in the native host Synechocystis. We propose a model in which Cas4 assists the CRISPR adaptation complex Cas1-2 by providing DNA substrates tailored for the correct PAM. Introducing functional spacers that target DNA sequences with the correct PAM is key to successful CRISPR interference, providing a better chance of surviving infection by mobile genetic elements. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  18. Infrastructure Systems for Advanced Computing in E-science applications

    NASA Astrophysics Data System (ADS)

    Terzo, Olivier

    2013-04-01

    The e-science field has a growing need for computing infrastructure that is more dynamic and customizable, with an "on demand" model of use that matches the exact request in terms of resources and storage capacity. Integrating grid and cloud infrastructure solutions allows us to offer services whose availability can adapt by scaling resources up and down. The main challenge for e-science domains is to implement infrastructure solutions for scientific computing that dynamically adapt to the demand for computing resources, with a strong emphasis on optimizing resource use to reduce investment costs. Instrumentation, data volumes, algorithms, and analysis increase the complexity of applications that require high processing power and storage for a limited time, often exceeding the computational resources available to most laboratories and research units in an organization. It is frequently necessary to adapt, or even rethink, tools and algorithms and to consolidate existing applications through a phase of reverse engineering in order to deploy them on cloud infrastructure. For example, in areas such as rainfall monitoring, meteorological analysis, hydrometeorology, climatology, bioinformatics, next-generation sequencing, computational electromagnetics, and radio occultation, the complexity of the analysis raises issues such as processing time, the scheduling of processing tasks, storage of results, and multi-user environments. For these reasons, the way e-science applications are written must be rethought so that they can exploit the potential of cloud computing services through the IaaS, PaaS, and SaaS layers. Another important focus is creating and using hybrid infrastructure, typically a federation between private and public clouds: when all resources owned by the organization are in use, a federated cloud infrastructure makes it easy to add resources from the public cloud to meet computational and storage needs and to release them when the processes are finished. Under the hybrid model, the scheduling approach is important for managing both cloud models. With this infrastructure model, additional IT capacity is always available "on demand" for a limited time, without having to purchase additional servers.

  19. Unstructured Adaptive Grid Computations on an Array of SMPs

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Pramanick, Ira; Sohn, Andrew; Simon, Horst D.

    1996-01-01

    Dynamic load balancing is necessary for parallel adaptive methods to solve unsteady CFD problems on unstructured grids. In this paper, we have presented such a dynamic load balancing framework, called JOVE. Results on a four-POWER-node POWER CHALLENGE array demonstrated that load balancing gives significant performance improvements over no load balancing for such adaptive computations. The parallel speedup of JOVE, implemented using MPI on the POWER CHALLENGE array, was significant, reaching as high as 31 for 32 processors. An implementation of JOVE that exploits an 'array of SMPs' architecture was also studied; this hybrid JOVE outperformed flat JOVE by up to 28% on the meshes and adaption models tested. With large, realistic meshes and actual flow-solver and adaption phases incorporated into JOVE, hybrid JOVE can be expected to yield a significant advantage over flat JOVE, especially as the number of processors is increased, thus demonstrating the scalability of the array-of-SMPs architecture.

  20. Adaptive control of Parkinson's state based on a nonlinear computational model with unknown parameters.

    PubMed

    Su, Fei; Wang, Jiang; Deng, Bin; Wei, Xi-Le; Chen, Ying-Yuan; Liu, Chen; Li, Hui-Yan

    2015-02-01

    The objective here is to explore the use of adaptive input-output feedback linearization method to achieve an improved deep brain stimulation (DBS) algorithm for closed-loop control of Parkinson's state. The control law is based on a highly nonlinear computational model of Parkinson's disease (PD) with unknown parameters. The restoration of thalamic relay reliability is formulated as the desired outcome of the adaptive control methodology, and the DBS waveform is the control input. The control input is adjusted in real time according to estimates of unknown parameters as well as the feedback signal. Simulation results show that the proposed adaptive control algorithm succeeds in restoring the relay reliability of the thalamus, and at the same time achieves accurate estimation of unknown parameters. Our findings point to the potential value of adaptive control approach that could be used to regulate DBS waveform in more effective treatment of PD.
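    Stripped of the Parkinsonian network detail, the control idea above is adaptive feedback linearization: cancel the plant nonlinearity with an estimated parameter, drive the state toward a reference model, and update the estimate with a Lyapunov-derived gradient law. The scalar toy below illustrates only that mechanism; the plant, reference model, and gains are illustrative assumptions and bear no relation to the paper's thalamocortical model or DBS waveform.

```python
import numpy as np

# Adaptive feedback linearization of a scalar nonlinear plant with an unknown parameter.
#   plant:      x_dot = theta * x**2 + u          (theta unknown)
#   reference:  xm_dot = -a_m * xm + a_m * r
#   control:    u = -theta_hat * x**2 - a_m * x + a_m * r   (cancels the nonlinearity)
#   adaptation: theta_hat_dot = gamma * x**2 * e, with e = x - xm (Lyapunov gradient law)

dt, T = 1e-3, 20.0
a_m, gamma, theta_true = 2.0, 5.0, 1.5
x, xm, theta_hat = 0.5, 0.5, 0.0

for k in range(int(T / dt)):
    t = k * dt
    r = 1.0 if (t % 4.0) < 2.0 else -1.0          # square-wave reference input
    e = x - xm
    u = -theta_hat * x**2 - a_m * x + a_m * r     # feedback-linearizing control
    x += dt * (theta_true * x**2 + u)             # plant (Euler step)
    xm += dt * (-a_m * xm + a_m * r)              # reference model
    theta_hat += dt * (gamma * x**2 * e)          # parameter adaptation

print(f"tracking error {abs(x - xm):.4f}, theta_hat {theta_hat:.3f} (true {theta_true})")
```

    With V = e**2/2 + (theta - theta_hat)**2/(2*gamma), the chosen adaptation law gives V_dot = -a_m * e**2 <= 0, so the tracking error is driven toward zero even before the parameter estimate has converged.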
