Sample records for context adaptive coding

  1. CAreDroid: Adaptation Framework for Android Context-Aware Applications

    PubMed Central

    Elmalaki, Salma; Wanner, Lucas; Srivastava, Mani

    2015-01-01

    Context-awareness is the ability of software systems to sense and adapt to their physical environment. Many contemporary mobile applications adapt to changing locations, connectivity states, available computational and energy resources, and proximity to other users and devices. Nevertheless, there is little systematic support for context-awareness in contemporary mobile operating systems. Because of this, application developers must build their own context-awareness adaptation engines, dealing directly with sensors and polluting application code with complex adaptation decisions. In this paper, we introduce CAreDroid, a framework designed to decouple the application logic from the complex adaptation decisions in Android context-aware applications. In this framework, developers are required only to focus on the application logic by providing a list of methods that are sensitive to certain contexts, along with the permissible operating ranges under those contexts. At run time, CAreDroid monitors the context of the physical environment and intercepts calls to sensitive methods, activating only the blocks of code that best fit the current physical context. CAreDroid is implemented as part of the Android runtime system. By pushing context monitoring and adaptation into the runtime system, CAreDroid eases the development of context-aware applications and increases their efficiency. In particular, case study applications implemented using CAreDroid are shown to (1) require at least half as many lines of code and (2) execute at least 10× faster compared to equivalent context-aware applications that use only standard Android APIs. PMID:26834512
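    The abstract's core idea (registering context-sensitive method variants with permissible operating ranges, then dispatching to whichever variant fits the current context) can be sketched as follows. This is an illustrative Python sketch, not the actual CAreDroid API; the dispatcher class, ranges, and method names are invented for the demo.

```python
# Illustrative sketch (not the actual CAreDroid API): register method
# variants with permissible operating ranges for a context variable, and
# dispatch to the variant whose range fits the current context value.

class ContextDispatcher:
    def __init__(self):
        self.variants = []  # list of (low, high, function) triples

    def register(self, low, high):
        """Register a method variant valid while context is in [low, high)."""
        def decorator(fn):
            self.variants.append((low, high, fn))
            return fn
        return decorator

    def call(self, context, *args):
        # Intercept the call and activate the block that fits the context.
        for low, high, fn in self.variants:
            if low <= context < high:
                return fn(*args)
        raise ValueError("no variant fits context %r" % context)

battery = ContextDispatcher()

@battery.register(0.5, 1.01)   # plenty of battery: high-accuracy path
def locate_gps():
    return "gps"

@battery.register(0.0, 0.5)    # low battery: cheap fallback path
def locate_cell():
    return "cell"
```

    With this sketch, `battery.call(0.8)` selects the GPS variant and `battery.call(0.2)` the cheap fallback, keeping the adaptation decision out of the application methods themselves.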

  2. CAreDroid: Adaptation Framework for Android Context-Aware Applications.

    PubMed

    Elmalaki, Salma; Wanner, Lucas; Srivastava, Mani

    2015-09-01

    Context-awareness is the ability of software systems to sense and adapt to their physical environment. Many contemporary mobile applications adapt to changing locations, connectivity states, available computational and energy resources, and proximity to other users and devices. Nevertheless, there is little systematic support for context-awareness in contemporary mobile operating systems. Because of this, application developers must build their own context-awareness adaptation engines, dealing directly with sensors and polluting application code with complex adaptation decisions. In this paper, we introduce CAreDroid, a framework designed to decouple the application logic from the complex adaptation decisions in Android context-aware applications. In this framework, developers are required only to focus on the application logic by providing a list of methods that are sensitive to certain contexts, along with the permissible operating ranges under those contexts. At run time, CAreDroid monitors the context of the physical environment and intercepts calls to sensitive methods, activating only the blocks of code that best fit the current physical context. CAreDroid is implemented as part of the Android runtime system. By pushing context monitoring and adaptation into the runtime system, CAreDroid eases the development of context-aware applications and increases their efficiency. In particular, case study applications implemented using CAreDroid are shown to (1) require at least half as many lines of code and (2) execute at least 10× faster compared to equivalent context-aware applications that use only standard Android APIs.

  3. Deficits in context-dependent adaptive coding of reward in schizophrenia

    PubMed Central

    Kirschner, Matthias; Hager, Oliver M; Bischof, Martin; Hartmann-Riemer, Matthias N; Kluge, Agne; Seifritz, Erich; Tobler, Philippe N; Kaiser, Stefan

    2016-01-01

    Theoretical principles of information processing and empirical findings suggest that to efficiently represent all possible rewards in the natural environment, reward-sensitive neurons have to adapt their coding range dynamically to the current reward context. Adaptation ensures that the reward system is most sensitive for the most likely rewards, enabling the system to efficiently represent a potentially infinite range of reward information. A deficit in neural adaptation would prevent precise representation of rewards and could have detrimental effects for an organism’s ability to optimally engage with its environment. In schizophrenia, reward processing is known to be impaired and has been linked to different symptom dimensions. However, despite the fundamental significance of coding reward adaptively, no study has elucidated whether adaptive reward processing is impaired in schizophrenia. We therefore studied patients with schizophrenia (n=27) and healthy controls (n=25), using functional magnetic resonance imaging in combination with a variant of the monetary incentive delay task. Compared with healthy controls, patients with schizophrenia showed less efficient neural adaptation to the current reward context, which leads to imprecise neural representation of reward. Importantly, the deficit correlated with total symptom severity. Our results suggest that some of the deficits in reward processing in schizophrenia might be due to inefficient neural adaptation to the current reward context. Furthermore, because adaptive coding is a ubiquitous feature of the brain, we believe that our findings provide an avenue in defining a general impairment in neural information processing underlying this debilitating disorder. PMID:27430009

  4. Partial Adaptation of Obtained and Observed Value Signals Preserves Information about Gains and Losses

    PubMed Central

    Baddeley, Michelle; Tobler, Philippe N.; Schultz, Wolfram

    2016-01-01

    Given that the range of rewarding and punishing outcomes of actions is large but neural coding capacity is limited, efficient processing of outcomes by the brain is necessary. One mechanism to increase efficiency is to rescale neural output to the range of outcomes expected in the current context, and process only experienced deviations from this expectation. However, this mechanism comes at the cost of not being able to discriminate between unexpectedly low losses when times are bad versus unexpectedly high gains when times are good. Thus, too much adaptation would result in disregarding information about the nature and absolute magnitude of outcomes, preventing learning about the longer-term value structure of the environment. Here we investigate the degree of adaptation in outcome coding brain regions in humans, for directly experienced outcomes and observed outcomes. We scanned participants while they performed a social learning task in gain and loss blocks. Multivariate pattern analysis showed two distinct networks of brain regions adapt to the most likely outcomes within a block. Frontostriatal areas adapted to directly experienced outcomes, whereas lateral frontal and temporoparietal regions adapted to observed social outcomes. Critically, in both cases, adaptation was incomplete and information about whether the outcomes arose in a gain block or a loss block was retained. Univariate analysis confirmed incomplete adaptive coding in these regions but also detected nonadapting outcome signals. Thus, although neural areas rescale their responses to outcomes for efficient coding, they adapt incompletely and keep track of the longer-term incentives available in the environment. SIGNIFICANCE STATEMENT Optimal value-based choice requires that the brain precisely and efficiently represents positive and negative outcomes. One way to increase efficiency is to adapt responding to the most likely outcomes in a given context. 
However, too strong adaptation would result in loss of precise representation (e.g., when the avoidance of a loss in a loss-context is coded the same as receipt of a gain in a gain-context). We investigated an intermediate form of adaptation that is efficient while maintaining information about received gains and avoided losses. We found that frontostriatal areas adapted to directly experienced outcomes, whereas lateral frontal and temporoparietal regions adapted to observed social outcomes. Importantly, adaptation was intermediate, in line with influential models of reference dependence in behavioral economics. PMID:27683899
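    The intermediate adaptation described in this record can be sketched numerically: a response that blends a context-relative code with an absolute code keeps an avoided loss distinguishable from an equally "good" gain. The blending weight and the example numbers below are illustrative choices, not values estimated in the paper.

```python
# Hedged illustration of partial (incomplete) adaptation: mix a
# context-relative code with an absolute code. The weight w is a free
# parameter for the demo, not a fitted quantity from the study.

def partial_adaptation(outcome, context_mean, w=0.7):
    """Blend relative coding (outcome - context_mean) with absolute coding."""
    relative = outcome - context_mean
    absolute = outcome
    return w * relative + (1 - w) * absolute

# An avoided loss (outcome 0 in a loss block with mean -5) and a gain
# (outcome 10 in a gain block with mean +5) have the same relative value
# (+5). Full adaptation (w=1) would code both identically; partial
# adaptation keeps them distinct while still compressing the range.
avoided_loss = partial_adaptation(0, context_mean=-5)   # 0.7*5 + 0.3*0
gain = partial_adaptation(10, context_mean=5)           # 0.7*5 + 0.3*10
```

    Here `avoided_loss` and `gain` differ (3.5 vs 6.5), mirroring the paper's point that incomplete adaptation preserves information about the longer-term incentive structure.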

  5. Normalized value coding explains dynamic adaptation in the human valuation process.

    PubMed

    Khaw, Mel W; Glimcher, Paul W; Louie, Kenway

    2017-11-28

    The notion of subjective value is central to choice theories in ecology, economics, and psychology, serving as an integrated decision variable by which options are compared. Subjective value is often assumed to be an absolute quantity, determined in a static manner by the properties of an individual option. Recent neurobiological studies, however, have shown that neural value coding dynamically adapts to the statistics of the recent reward environment, introducing an intrinsic temporal context dependence into the neural representation of value. Whether valuation exhibits this kind of dynamic adaptation at the behavioral level is unknown. Here, we show that the valuation process in human subjects adapts to the history of previous values, with current valuations varying inversely with the average value of recently observed items. The dynamics of this adaptive valuation are captured by divisive normalization, linking these temporal context effects to spatial context effects in decision making as well as spatial and temporal context effects in perception. These findings suggest that adaptation is a universal feature of neural information processing and offer a unifying explanation for contextual phenomena in fields ranging from visual psychophysics to economic choice.
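    The divisive-normalization account of history dependence can be sketched in a few lines, using the common textbook form v_norm = v / (sigma + mean of recent values). The semisaturation constant sigma and the window contents below are illustrative assumptions, not parameters fitted in the paper.

```python
# Minimal sketch of temporal divisive normalization: the current valuation
# is divided by a semisaturation constant plus the average of recently
# observed values, so valuations vary inversely with recent value history.

def normalized_value(value, history, sigma=1.0):
    context = sum(history) / len(history) if history else 0.0
    return value / (sigma + context)

# The same item is valued less after a run of high-value items than after
# a run of low-value items.
after_high = normalized_value(10.0, [20.0, 18.0, 22.0])  # context mean 20
after_low = normalized_value(10.0, [2.0, 1.0, 3.0])      # context mean 2
```

    This reproduces the qualitative behavioral finding: `after_low` exceeds `after_high`, i.e., current valuations vary inversely with the average value of recently observed items.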

  6. Adapting the coping in deliberation (CODE) framework: a multi-method approach in the context of familial ovarian cancer risk management.

    PubMed

    Witt, Jana; Elwyn, Glyn; Wood, Fiona; Rogers, Mark T; Menon, Usha; Brain, Kate

    2014-11-01

    To test whether the coping in deliberation (CODE) framework can be adapted to a specific preference-sensitive medical decision: risk-reducing bilateral salpingo-oophorectomy (RRSO) in women at increased risk of ovarian cancer. We performed a systematic literature search to identify issues important to women during deliberations about RRSO. Three focus groups with patients (most were pre-menopausal and untested for genetic mutations) and 11 interviews with health professionals were conducted to determine which issues mattered in the UK context. Data were used to adapt the generic CODE framework. The literature search yielded 49 relevant studies, which highlighted various issues and coping options important during deliberations, including mutation status, risks of surgery, family obligations, physician recommendation, peer support and reliable information sources. Consultations with UK stakeholders confirmed most of these factors as pertinent influences on deliberations. Questions in the generic framework were adapted to reflect the issues and coping options identified. The generic CODE framework was readily adapted to a specific preference-sensitive medical decision, showing that deliberations and coping are linked during deliberations about RRSO. Adapted versions of the CODE framework may be used to develop tailored decision support methods and materials in order to improve patient-centred care. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  7. A biological inspired fuzzy adaptive window median filter (FAWMF) for enhancing DNA signal processing.

    PubMed

    Ahmad, Muneer; Jung, Low Tan; Bhuiyan, Al-Amin

    2017-10-01

    Digital signal processing techniques commonly employ fixed-length window filters to process the signal contents. DNA signals differ in characteristics from common digital signals since they carry nucleotides as contents. The nucleotides carry genetic code context and exhibit fuzzy behaviors due to their special structure and order in the DNA strand. Employing conventional fixed-length window filters for DNA signal processing produces spectral leakage and hence results in signal noise. A biological-context-aware adaptive window filter is required to process DNA signals. This paper introduces a biologically inspired fuzzy adaptive window median filter (FAWMF) which computes the fuzzy membership strength of nucleotides in each slide of the window and filters nucleotides based on median filtering with a combination of s-shaped and z-shaped filters. Since coding regions cause 3-base periodicity through an unbalanced nucleotide distribution that produces a relatively high bias in nucleotide usage, this fundamental characteristic of nucleotides has been exploited in FAWMF to suppress signal noise. Along with the adaptive response of FAWMF, a strong correlation between median nucleotides and the Π-shaped filter was observed, which produced enhanced discrimination between coding and non-coding regions compared to fixed-length conventional window filters. The proposed FAWMF attains a significant enhancement in coding region identification, i.e., 40% to 125%, compared to other conventional window filters tested over more than 250 benchmarked and randomly taken DNA datasets of different organisms. This study shows that conventional fixed-length window filters applied to DNA signals do not achieve significant results, since the nucleotides carry genetic code context. The proposed FAWMF algorithm is adaptive and significantly outperforms them in processing DNA signal contents. Applied to a variety of DNA datasets, the algorithm produced noteworthy discrimination between coding and non-coding regions compared to fixed-length conventional window filters. Copyright © 2017 Elsevier B.V. All rights reserved.
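    The 3-base periodicity the filter exploits is easy to demonstrate: coding DNA shows a spectral peak at the DFT bin k = N/3 of the four binary nucleotide indicator sequences. The sketch below is a textbook illustration of that property only, not the FAWMF algorithm itself; the sequences are made up for the demo.

```python
import cmath

# Illustrative sketch of 3-base periodicity: project each of the four
# binary nucleotide indicator sequences onto the k = N/3 DFT bin and sum
# the spectral powers. Coding-like period-3 sequences give a large value;
# sequences with no period-3 structure give (numerically) zero.

def period3_power(seq):
    n = len(seq)
    k = n // 3                      # DFT bin corresponding to period 3
    total = 0.0
    for base in "ACGT":
        # DFT coefficient of the indicator sequence for this base at bin k
        x = sum(cmath.exp(-2j * cmath.pi * k * i / n)
                for i, c in enumerate(seq) if c == base)
        total += abs(x) ** 2
    return total

periodic = "ATG" * 30               # strong period-3 structure (90 bases)
flat = "A" * 90                     # no period-3 structure
```

    Here `period3_power(periodic)` is large while `period3_power(flat)` vanishes, which is the spectral cue that coding-region detectors, including adaptive window filters, rely on.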

  8. A Simulation Testbed for Adaptive Modulation and Coding in Airborne Telemetry

    DTIC Science & Technology

    2014-05-29

    its modulation waveforms and LDPC for the FEC codes. It also uses several sets of published telemetry channel sounding data as its channel models. Within the context... low-density parity-check (LDPC) codes with tunable code rates, and both static and dynamic telemetry channel models are included. In an effort to maximize the

  9. Non-tables look-up search algorithm for efficient H.264/AVC context-based adaptive variable length coding decoding

    NASA Astrophysics Data System (ADS)

    Han, Yishi; Luo, Zhixiao; Wang, Jianhua; Min, Zhixuan; Qin, Xinyu; Sun, Yunlong

    2014-09-01

    In general, context-based adaptive variable length coding (CAVLC) decoding in the H.264/AVC standard requires frequent access to the unstructured variable length coding tables (VLCTs), consuming a significant number of memory accesses. Heavy memory access causes high power consumption and time delays, which are serious problems for applications in portable multimedia devices. We propose a method for high-efficiency CAVLC decoding that uses a program instead of the VLCTs. The decoded codeword can be obtained without any table look-up or memory access. The experimental results show that the proposed algorithm achieves a 100% memory access saving and a 40% decoding time saving without degrading video quality. Additionally, the proposed algorithm outperforms conventional CAVLC decoding approaches such as table look-up by sequential search, table look-up by binary search, Moon's method, and Kim's method.
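    One concrete instance of table-free VLC decoding from the H.264 standard: the CAVLC level_prefix element is a unary code (n zero bits followed by a one), so its value can be computed by counting zeros rather than consulting a table. The sketch below operates on a string of bits for clarity and is a simplified fragment, not a full CAVLC decoder.

```python
# Table-free decoding of the H.264 CAVLC level_prefix element: the value
# equals the number of leading zero bits before the terminating '1'.

def decode_level_prefix(bits, pos):
    """Return (value, new_position) for a unary-coded level_prefix."""
    zeros = 0
    while bits[pos + zeros] == "0":
        zeros += 1
    return zeros, pos + zeros + 1   # consume the zeros plus the '1' bit

# "0001" decodes to level_prefix = 3, leaving the position after the '1'.
value, new_pos = decode_level_prefix("0001101", 0)
```

    Replacing each table consultation with arithmetic of this kind is what allows the paper's approach to eliminate VLCT memory accesses entirely.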

  10. Adaptive Value Normalization in the Prefrontal Cortex Is Reduced by Memory Load.

    PubMed

    Holper, L; Van Brussel, L D; Schmidt, L; Schulthess, S; Burke, C J; Louie, K; Seifritz, E; Tobler, P N

    2017-01-01

    Adaptation facilitates neural representation of a wide range of diverse inputs, including reward values. Adaptive value coding typically relies on contextual information either obtained from the environment or retrieved from and maintained in memory. However, it is unknown whether having to retrieve and maintain context information modulates the brain's capacity for value adaptation. To address this issue, we measured hemodynamic responses of the prefrontal cortex (PFC) in two studies on risky decision-making. In each trial, healthy human subjects chose between a risky and a safe alternative; half of the participants had to remember the risky alternatives, whereas for the other half they were presented visually. The value of safe alternatives varied across trials. PFC responses adapted to contextual risk information, with steeper coding of safe alternative value in lower-risk contexts. Importantly, this adaptation depended on working memory load, such that response functions relating PFC activity to safe values were steeper with presented versus remembered risk. An independent second study replicated the findings of the first study and showed that similar slope reductions also arose when memory maintenance demands were increased with a secondary working memory task. Formal model comparison showed that a divisive normalization model fitted effects of both risk context and working memory demands on PFC activity better than alternative models of value adaptation, and revealed that reduced suppression of background activity was the critical parameter impairing normalization with increased memory maintenance demand. Our findings suggest that mnemonic processes can constrain normalization of neural value representations.

  11. Exploring the read-write genome: mobile DNA and mammalian adaptation.

    PubMed

    Shapiro, James A

    2017-02-01

    The read-write genome idea predicts that mobile DNA elements will act in evolution to generate adaptive changes in organismal DNA. This prediction was examined in the context of mammalian adaptations involving regulatory non-coding RNAs, viviparous reproduction, early embryonic and stem cell development, the nervous system, and innate immunity. The evidence shows that mobile elements have played specific and sometimes major roles in mammalian adaptive evolution by generating regulatory sites in the DNA and providing interaction motifs in non-coding RNA. Endogenous retroviruses and retrotransposons have been the predominant mobile elements in mammalian adaptive evolution, with the notable exception of bats, where DNA transposons are the major agents of RW genome inscriptions. A few examples of independent but convergent exaptation of mobile DNA elements for similar regulatory rewiring functions are noted.

  12. Memory-efficient table look-up optimized algorithm for context-based adaptive variable length decoding in H.264/advanced video coding

    NASA Astrophysics Data System (ADS)

    Wang, Jianhua; Cheng, Lianglun; Wang, Tao; Peng, Xiaodong

    2016-03-01

    Table look-up operations play a very important role in the decoding process of context-based adaptive variable length decoding (CAVLD) in H.264/advanced video coding (AVC). However, frequent table look-up operations result in heavy table memory access, which in turn leads to high table power consumption. To reduce the heavy memory access of current methods, and thereby the high power consumption, a memory-efficient table look-up optimized algorithm is presented for CAVLD. The contribution of this paper is the introduction of index search technology to reduce the memory access required for table look-up, and thus the table power consumption. Specifically, in our scheme, we use index search technology to reduce memory access by reducing the searching and matching operations for code_word, taking advantage of the internal relationship among the length of the zero run in code_prefix, the value of code_suffix, and code_length, thus saving the power consumed by table look-up. The experimental results show that our proposed index-search-based table look-up algorithm lowers memory access consumption by about 60% compared with table look-up by sequential search, saving considerable power for CAVLD in H.264/AVC.
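    The index-search idea can be sketched as follows: instead of sequentially matching the bitstream against every codeword in a VLC table, key the table by the number of leading zeros in the prefix and the value of the fixed-length suffix, so each look-up becomes a single indexed access. The toy code table below is invented for illustration; it is not an H.264 table.

```python
# Hedged sketch of index search over a VLC table: the (leading-zero count,
# suffix value) pair directly indexes the decoded symbol, replacing a
# sequential scan of all codewords with one dictionary access.

SUFFIX_LEN = {0: 0, 1: 1, 2: 1}    # suffix length depends on the prefix

TOY_TABLE = {                       # (leading_zeros, suffix_value) -> symbol
    (0, 0): "A",                    # codeword "1"
    (1, 0): "B",                    # codeword "010"
    (1, 1): "C",                    # codeword "011"
    (2, 0): "D",                    # codeword "0010"
}

def decode_indexed(bits, pos):
    zeros = 0
    while bits[pos + zeros] == "0":
        zeros += 1
    pos += zeros + 1                        # consume prefix and its '1'
    n = SUFFIX_LEN[zeros]
    suffix = int(bits[pos:pos + n], 2) if n else 0
    return TOY_TABLE[(zeros, suffix)], pos + n
```

    Decoding "011" yields ("C", 3) with exactly one table access, which is the mechanism behind the reported reduction in memory accesses.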

  13. ContextProvider: Context awareness for medical monitoring applications.

    PubMed

    Mitchell, Michael; Meyers, Christopher; Wang, An-I Andy; Tyson, Gary

    2011-01-01

    Smartphones are sensor-rich and Internet-enabled. With their on-board sensors, web services, social media, and external biosensors, smartphones can provide contextual information about the device, user, and environment, thereby enabling the creation of rich, biologically driven applications. We introduce ContextProvider, a framework that offers a unified, query-able interface to contextual data on the device. Unlike other context-based frameworks, ContextProvider offers interactive user feedback, self-adaptive sensor polling, and minimal reliance on third-party infrastructure. ContextProvider also allows for rapid development of new context- and bio-aware applications. Evaluation of ContextProvider shows the incorporation of an additional monitoring sensor into the framework with fewer than 100 lines of Java code. With adaptive sensor monitoring, power consumption per sensor can be reduced to a 1% overhead. Finally, through the use of context, accuracy of data interpretation can be improved by up to 80%.
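    Self-adaptive sensor polling of the kind described here is commonly implemented as an exponential back-off that resets when the context changes. The sketch below illustrates that general pattern only; it is not ContextProvider's actual API, and the rates and threshold are invented for the demo.

```python
# Illustrative self-adaptive polling policy: double the polling interval
# while readings are stable (saving power), and snap back to the fast base
# rate as soon as a reading changes by more than a threshold.

def next_interval(current, prev_reading, reading,
                  base=1.0, ceiling=60.0, threshold=0.5):
    if abs(reading - prev_reading) > threshold:
        return base                       # context changed: poll fast again
    return min(current * 2, ceiling)      # stable: back off exponentially

slow = next_interval(4.0, prev_reading=20.0, reading=20.1)  # stable
fast = next_interval(4.0, prev_reading=20.0, reading=25.0)  # changed
```

    Policies like this are how per-sensor polling cost can be driven toward a small constant overhead when the monitored context is quiescent.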

  14. Adaptive Value Normalization in the Prefrontal Cortex Is Reduced by Memory Load

    PubMed Central

    Burke, C. J.; Seifritz, E.; Tobler, P. N.

    2017-01-01

    Abstract Adaptation facilitates neural representation of a wide range of diverse inputs, including reward values. Adaptive value coding typically relies on contextual information either obtained from the environment or retrieved from and maintained in memory. However, it is unknown whether having to retrieve and maintain context information modulates the brain’s capacity for value adaptation. To address this issue, we measured hemodynamic responses of the prefrontal cortex (PFC) in two studies on risky decision-making. In each trial, healthy human subjects chose between a risky and a safe alternative; half of the participants had to remember the risky alternatives, whereas for the other half they were presented visually. The value of safe alternatives varied across trials. PFC responses adapted to contextual risk information, with steeper coding of safe alternative value in lower-risk contexts. Importantly, this adaptation depended on working memory load, such that response functions relating PFC activity to safe values were steeper with presented versus remembered risk. An independent second study replicated the findings of the first study and showed that similar slope reductions also arose when memory maintenance demands were increased with a secondary working memory task. Formal model comparison showed that a divisive normalization model fitted effects of both risk context and working memory demands on PFC activity better than alternative models of value adaptation, and revealed that reduced suppression of background activity was the critical parameter impairing normalization with increased memory maintenance demand. Our findings suggest that mnemonic processes can constrain normalization of neural value representations. PMID:28462394

  15. Assessing Implementation Fidelity and Adaptation in a Community-Based Childhood Obesity Prevention Intervention

    ERIC Educational Resources Information Center

    Richards, Zoe; Kostadinov, Iordan; Jones, Michelle; Richard, Lucie; Cargo, Margaret

    2014-01-01

    Little research has assessed the fidelity, adaptation or integrity of activities implemented within community-based obesity prevention initiatives. To address this gap, a mixed-method process evaluation was undertaken in the context of the South Australian Obesity Prevention and Lifestyle (OPAL) initiative. An ecological coding procedure assessed…

  16. Percept User Manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carnes, Brian; Kennon, Stephen Ray

    2017-05-01

    This document is the main user guide for the Sierra/Percept capabilities including the mesh_adapt and mesh_transfer tools. Basic capabilities for uniform mesh refinement (UMR) and mesh transfers are discussed. Examples are used to provide illustration. Future versions of this manual will include more advanced features such as geometry and mesh smoothing. Additionally, all the options for the mesh_adapt code will be described in detail. Capabilities for local adaptivity in the context of offline adaptivity will also be included.

  17. A Spanish version for the new ERA-EDTA coding system for primary renal disease.

    PubMed

    Zurriaga, Óscar; López-Briones, Carmen; Martín Escobar, Eduardo; Saracho-Rotaeche, Ramón; Moina Eguren, Íñigo; Pallardó Mateu, Luis; Abad Díez, José María; Sánchez Miret, José Ignacio

    2015-01-01

    The European Renal Association and the European Dialysis and Transplant Association (ERA-EDTA) have issued an English-language new coding system for primary kidney disease (PKD) aimed at solving the problems that were identified in the list of "Primary renal diagnoses" that has been in use for over 40 years. In the context of the Registro Español de Enfermos Renales (Spanish Registry of Renal Patients, [REER]), the need for a translation and adaptation of terms, definitions and notes for the new ERA-EDTA codes was perceived in order to help those who have Spanish as their working language when using such codes. Bilingual nephrologists contributed a professional translation and were involved in a terminological adaptation process, which included a number of phases to contrast translation outputs. Codes, paragraphs, definitions and diagnostic criteria were reviewed, and the agreements and disagreements that arose for each term were labelled. Finally, the version accepted by a majority of reviewers was adopted. Wide agreement was reached in the first review phase, with only 5 points of discrepancy remaining, which were resolved in the final phase. Translation and adaptation into Spanish represent an improvement that will help to introduce and use the new coding system for PKD, as it can help reduce the time devoted to coding and also the period of adaptation of health workers to the new codes. Copyright © 2015 The Authors. Published by Elsevier España, S.L.U. All rights reserved.

  18. Adaptive neural coding: from biological to behavioral decision-making

    PubMed Central

    Louie, Kenway; Glimcher, Paul W.; Webb, Ryan

    2015-01-01

    Empirical decision-making in diverse species deviates from the predictions of normative choice theory, but why such suboptimal behavior occurs is unknown. Here, we propose that deviations from optimality arise from biological decision mechanisms that have evolved to maximize choice performance within intrinsic biophysical constraints. Sensory processing utilizes specific computations such as divisive normalization to maximize information coding in constrained neural circuits, and recent evidence suggests that analogous computations operate in decision-related brain areas. These adaptive computations implement a relative value code that may explain the characteristic context-dependent nature of behavioral violations of classical normative theory. Examining decision-making at the computational level thus provides a crucial link between the architecture of biological decision circuits and the form of empirical choice behavior. PMID:26722666

  19. Coding efficiency of AVS 2.0 for CBAC and CABAC engines

    NASA Astrophysics Data System (ADS)

    Cui, Jing; Choi, Youngkyu; Chae, Soo-Ik

    2015-12-01

    In this paper we compare the coding efficiency of AVS 2.0[1] for the engines of the Context-based Binary Arithmetic Coding (CBAC)[2] in AVS 2.0 and the Context-Adaptive Binary Arithmetic Coder (CABAC)[3] in HEVC[4]. For a fair comparison, the CABAC is embedded in the reference code RD10.1, because the CBAC was embedded in the HEVC in our previous work[5]. The rate estimation table is employed only for RDOQ in the RD code. Therefore, to reduce the computational complexity of the video encoder, we modified the RD code so that the rate estimation table is employed for all RDO decisions. Furthermore, we also reduced the complexity of the rate estimation table by reducing the bit depth of its fractional part from 8 to 2. The simulation results show that the CABAC has a BD-rate loss of about 0.7% compared to the CBAC. This suggests that the CBAC is slightly more efficient than the CABAC in AVS 2.0.

  20. QOS-aware error recovery in wireless body sensor networks using adaptive network coding.

    PubMed

    Razzaque, Mohammad Abdur; Javadi, Saeideh S; Coulibaly, Yahaya; Hira, Muta Tah

    2014-12-29

    Wireless body sensor networks (WBSNs) for healthcare and medical applications are real-time and life-critical infrastructures, which require a strict guarantee of quality of service (QoS) in terms of latency, error rate and reliability. Considering the criticality of healthcare and medical applications, WBSNs need to fulfill users'/applications' and the corresponding network's QoS requirements. For instance, for a real-time application to support on-time data delivery, a WBSN needs to guarantee a constrained delay at the network level. Network coding-based error recovery is an emerging mechanism that can be used in these systems to support QoS at very low energy, memory and hardware cost. However, under dynamic network environments and user requirements, the original non-adaptive version of network coding fails to support some of the network and user QoS requirements. This work explores the QoS requirements of WBSNs from both the network and the user/application perspective. Based on these requirements, this paper proposes an adaptive network coding-based, QoS-aware error recovery mechanism for WBSNs. It utilizes network-level and user-/application-level information to make it adaptive in both contexts. Thus, it provides improved QoS support adaptively in terms of reliability, energy efficiency and delay. Simulation results show the potential of the proposed mechanism in terms of adaptability, reliability, real-time data delivery and network lifetime compared to its counterparts.
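    The simplest form of the error recovery that network coding provides can be sketched with XOR parity: alongside k data packets the sender transmits one coded packet, and the receiver reconstructs any single lost packet without retransmission. This is a generic textbook illustration, not the adaptive mechanism proposed in the paper; packet contents are arbitrary demo bytes.

```python
# Minimal XOR network-coded error recovery: parity = p1 ^ p2 ^ ... ^ pk,
# so any one missing packet equals the XOR of the parity with all the
# packets that did arrive.

from functools import reduce

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def make_parity(packets):
    return reduce(xor_bytes, packets)

def recover(received, parity):
    """Reconstruct the single missing packet from those received + parity."""
    return reduce(xor_bytes, received, parity)

packets = [b"\x01\x02", b"\x0f\x00", b"\xaa\x55"]
parity = make_parity(packets)

# Suppose the second packet is lost in transit:
recovered = recover([packets[0], packets[2]], parity)
```

    An adaptive variant, as proposed in the paper, would tune how much redundancy of this kind is sent based on network- and application-level QoS information.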

  1. Context adaptive binary arithmetic coding-based data hiding in partially encrypted H.264/AVC videos

    NASA Astrophysics Data System (ADS)

    Xu, Dawen; Wang, Rangding

    2015-05-01

    A scheme of data hiding directly in a partially encrypted version of H.264/AVC videos is proposed which includes three parts, i.e., selective encryption, data embedding and data extraction. Selective encryption is performed on context adaptive binary arithmetic coding (CABAC) bin-strings via stream ciphers. By careful selection of CABAC entropy coder syntax elements for selective encryption, the encrypted bitstream is format-compliant and has exactly the same bit rate. Then a data-hider embeds the additional data into partially encrypted H.264/AVC videos using a CABAC bin-string substitution technique without accessing the plaintext of the video content. Since bin-string substitution is carried out on those residual coefficients with approximately the same magnitude, the quality of the decrypted video is satisfactory. Video file size is strictly preserved even after data embedding. In order to adapt to different application scenarios, data extraction can be done either in the encrypted domain or in the decrypted domain. Experimental results have demonstrated the feasibility and efficiency of the proposed scheme.

  2. A Neural Mechanism for Time-Window Separation Resolves Ambiguity of Adaptive Coding

    PubMed Central

    Hildebrandt, K. Jannis; Ronacher, Bernhard; Hennig, R. Matthias; Benda, Jan

    2015-01-01

    The senses of animals are confronted with changing environments and different contexts. Neural adaptation is one important tool to adjust sensitivity to varying intensity ranges. For instance, in a quiet night outdoors, our hearing is more sensitive than when we are confronted with the plurality of sounds in a large city during the day. However, adaptation also removes available information on absolute sound levels and may thus cause ambiguity. Experimental data on the trade-off between benefits and loss through adaptation is scarce and very few mechanisms have been proposed to resolve it. We present an example where adaptation is beneficial for one task—namely, the reliable encoding of the pattern of an acoustic signal—but detrimental for another—the localization of the same acoustic stimulus. With a combination of neurophysiological data, modeling, and behavioral tests, we show that adaptation in the periphery of the auditory pathway of grasshoppers enables intensity-invariant coding of amplitude modulations, but at the same time, degrades information available for sound localization. We demonstrate how focusing the response of localization neurons to the onset of relevant signals separates processing of localization and pattern information temporally. In this way, the ambiguity of adaptive coding can be circumvented and both absolute and relative levels can be processed using the same set of peripheral neurons. PMID:25761097

  3. Recursive time-varying filter banks for subband image coding

    NASA Technical Reports Server (NTRS)

    Smith, Mark J. T.; Chung, Wilson C.

    1992-01-01

    Filter banks and wavelet decompositions that employ recursive filters have been considered previously and are recognized for their efficiency in partitioning the frequency spectrum. This paper presents an analysis of a new infinite impulse response (IIR) filter bank in which these computationally efficient filters may be changed adaptively in response to the input. The filter bank is presented and discussed in the context of finite-support signals with the intended application in subband image coding. In the absence of quantization errors, exact reconstruction can be achieved and by the proper choice of an adaptation scheme, it is shown that IIR time-varying filter banks can yield improvement over conventional ones.

  4. Adaptive partially hidden Markov models with application to bilevel image coding.

    PubMed

    Forchhammer, S; Rasmussen, T S

    1999-01-01

    Partially hidden Markov models (PHMMs) have previously been introduced. The transition and emission/output probabilities from hidden states, as known from the HMMs, are conditioned on the past. This way, the HMM may be applied to images introducing the dependencies of the second dimension by conditioning. In this paper, the PHMM is extended to multiple sequences with a multiple token version and adaptive versions of PHMM coding are presented. The different versions of the PHMM are applied to lossless bilevel image coding. To reduce and optimize the model cost and size, the contexts are organized in trees and effective quantization of the parameters is introduced. The new coding methods achieve results that are better than the JBIG standard on selected test images, although at the cost of increased complexity. By the minimum description length principle, the methods presented for optimizing the code length may apply as guidance for training (P)HMMs for, e.g., segmentation or recognition purposes. Thereby, the PHMM models provide a new approach to image modeling.

  5. Parallel design of JPEG-LS encoder on graphics processing units

    NASA Astrophysics Data System (ADS)

    Duan, Hao; Fang, Yong; Huang, Bormin

    2012-01-01

    With recent technical advances, graphics processing units (GPUs) have outperformed CPUs in terms of compute capability and memory bandwidth, and many successful GPU applications to high performance computing have been reported. JPEG-LS is an ISO/IEC standard for lossless image compression which utilizes adaptive context modeling and run-length coding to improve compression ratio. However, adaptive context modeling causes data dependency among adjacent pixels, and the run-length coding has to be performed sequentially. Hence, using JPEG-LS to compress large-volume hyperspectral image data is quite time-consuming. We implement an efficient parallel JPEG-LS encoder for lossless hyperspectral compression on an NVIDIA GPU using the compute unified device architecture (CUDA) programming technology. We use a block-parallel strategy, as well as such CUDA techniques as coalesced global memory access, parallel prefix sum, and asynchronous data transfer. We also show the relation between GPU speedup and AVIRIS block size, as well as the relation between compression ratio and AVIRIS block size. When AVIRIS images are divided into blocks of 64×64 pixels each, we obtain the best GPU performance: a 26.3× speedup over the original CPU code.
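
    One reason the parallel prefix sum matters here can be sketched in a few lines (this is an illustration of the general pattern, not the paper's CUDA kernel): once blocks are encoded independently, an exclusive prefix sum over per-block code lengths yields each block's write offset, so the variable-length outputs can be assembled into one bitstream in parallel.

    ```python
    # Illustrative sketch: exclusive prefix sum turns per-block coded lengths
    # into independent write offsets for parallel output assembly.
    def exclusive_prefix_sum(lengths):
        offsets, total = [], 0
        for n in lengths:          # on a GPU this serial loop becomes a parallel scan
            offsets.append(total)
            total += n
        return offsets, total

    lengths = [17, 5, 12, 9]       # hypothetical coded bit length of each 64x64 block
    offsets, total = exclusive_prefix_sum(lengths)
    # offsets == [0, 17, 22, 34], total == 43
    ```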

  6. Visual adaptation and face perception

    PubMed Central

    Webster, Michael A.; MacLeod, Donald I. A.

    2011-01-01

    The appearance of faces can be strongly affected by the characteristics of faces viewed previously. These perceptual after-effects reflect processes of sensory adaptation that are found throughout the visual system, but which have been considered only relatively recently in the context of higher level perceptual judgements. In this review, we explore the consequences of adaptation for human face perception, and the implications of adaptation for understanding the neural-coding schemes underlying the visual representation of faces. The properties of face after-effects suggest that they, in part, reflect response changes at high and possibly face-specific levels of visual processing. Yet, the form of the after-effects and the norm-based codes that they point to show many parallels with the adaptations and functional organization that are thought to underlie the encoding of perceptual attributes like colour. The nature and basis for human colour vision have been studied extensively, and we draw on ideas and principles that have been developed to account for norms and normalization in colour vision to consider potential similarities and differences in the representation and adaptation of faces. PMID:21536555

  7. Informational basis of sensory adaptation: entropy and single-spike efficiency in rat barrel cortex.

    PubMed

    Adibi, Mehdi; Clifford, Colin W G; Arabzadeh, Ehsan

    2013-09-11

    We showed recently that exposure to whisker vibrations enhances coding efficiency in rat barrel cortex despite increasing correlations in variability (Adibi et al., 2013). Here, to understand how adaptation achieves this improvement in sensory representation, we decomposed the stimulus information carried in neuronal population activity into its fundamental components in the framework of information theory. In the context of sensory coding, these components are the entropy of the responses across the entire stimulus set (response entropy) and the entropy of the responses conditional on the stimulus (conditional response entropy). We found that adaptation decreased response entropy and conditional response entropy at both the level of single neurons and the pooled activity of neuronal populations. However, the net effect of adaptation was to increase the mutual information because the drop in the conditional entropy outweighed the drop in the response entropy. The information transmitted by a single spike also increased under adaptation. As population size increased, the information content of individual spikes declined but the relative improvement attributable to adaptation was maintained.
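
    The decomposition described above, I(S;R) = H(R) − H(R|S), can be made concrete with a minimal plug-in estimator (an illustration of the information-theoretic identity, not the authors' analysis code):

    ```python
    # Mutual information as response entropy minus conditional response entropy,
    # estimated from (stimulus, response) trial pairs.
    import math
    from collections import Counter

    def entropy(probs):
        return -sum(p * math.log2(p) for p in probs if p > 0)

    def mutual_information(trials):
        """trials: list of (stimulus, response) pairs."""
        n = len(trials)
        resp = Counter(r for _, r in trials)
        h_r = entropy([c / n for c in resp.values()])          # response entropy
        h_r_given_s = 0.0
        stims = Counter(s for s, _ in trials)
        for s, ns in stims.items():
            cond = Counter(r for s2, r in trials if s2 == s)
            h_r_given_s += (ns / n) * entropy([c / ns for c in cond.values()])
        return h_r - h_r_given_s

    # Perfectly discriminating responses to two equiprobable stimuli: 1 bit.
    trials = [(0, 'low'), (0, 'low'), (1, 'high'), (1, 'high')]
    ```

    The abstract's finding corresponds to adaptation lowering both H(R) and H(R|S), with the conditional term dropping more, so their difference (the mutual information) increases.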

  8. Visual adaptation and face perception.

    PubMed

    Webster, Michael A; MacLeod, Donald I A

    2011-06-12

    The appearance of faces can be strongly affected by the characteristics of faces viewed previously. These perceptual after-effects reflect processes of sensory adaptation that are found throughout the visual system, but which have been considered only relatively recently in the context of higher level perceptual judgements. In this review, we explore the consequences of adaptation for human face perception, and the implications of adaptation for understanding the neural-coding schemes underlying the visual representation of faces. The properties of face after-effects suggest that they, in part, reflect response changes at high and possibly face-specific levels of visual processing. Yet, the form of the after-effects and the norm-based codes that they point to show many parallels with the adaptations and functional organization that are thought to underlie the encoding of perceptual attributes like colour. The nature and basis for human colour vision have been studied extensively, and we draw on ideas and principles that have been developed to account for norms and normalization in colour vision to consider potential similarities and differences in the representation and adaptation of faces.

  9. Computational Fluid Dynamics Technology for Hypersonic Applications

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.

    2003-01-01

    Several current challenges in computational fluid dynamics and aerothermodynamics for hypersonic vehicle applications are discussed. Example simulations are presented from code validation and code benchmarking efforts to illustrate capabilities and limitations. Opportunities to advance the state of the art in algorithms, grid generation and adaptation, and code validation are identified. Highlights of diverse efforts to address these challenges are then discussed. One such effort to re-engineer and synthesize the existing analysis capability in LAURA, VULCAN, and FUN3D will provide context for these discussions. The critical (and evolving) role of agile software engineering practice in the capability enhancement process is also noted.

  10. Normalization is a general neural mechanism for context-dependent decision making

    PubMed Central

    Louie, Kenway; Khaw, Mel W.; Glimcher, Paul W.

    2013-01-01

    Understanding the neural code is critical to linking brain and behavior. In sensory systems, divisive normalization seems to be a canonical neural computation, observed in areas ranging from retina to cortex and mediating processes including contrast adaptation, surround suppression, visual attention, and multisensory integration. Recent electrophysiological studies have extended these insights beyond the sensory domain, demonstrating an analogous algorithm for the value signals that guide decision making, but the effects of normalization on choice behavior are unknown. Here, we show that choice models using normalization generate significant (and classically irrational) choice phenomena driven by either the value or number of alternative options. In value-guided choice experiments, both monkey and human choosers show novel context-dependent behavior consistent with normalization. These findings suggest that the neural mechanism of value coding critically influences stochastic choice behavior and provide a generalizable quantitative framework for examining context effects in decision making. PMID:23530203
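
    Divisive normalization has a simple canonical form, v_i / (σ + Σ_j v_j), which already exhibits the context dependence described above. The sketch below is a generic illustration with made-up numbers, not the authors' fitted model:

    ```python
    # Canonical divisive normalization of option values.
    def normalize(values, sigma=1.0):
        denom = sigma + sum(values)
        return [v / denom for v in values]

    # Adding a low-value third option shrinks the represented difference
    # between the two best options -- a context effect on choice.
    two = normalize([10.0, 8.0])          # difference between top options ~0.105
    three = normalize([10.0, 8.0, 6.0])   # difference between top options ~0.080
    ```

    With noisy readout of these normalized values, the compressed difference in the three-option case predicts more stochastic (classically "irrational") choices between the two best options.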

  11. Survey of adaptive image coding techniques

    NASA Technical Reports Server (NTRS)

    Habibi, A.

    1977-01-01

    The general problem of image data compression is discussed briefly with attention given to the use of Karhunen-Loeve transforms, suboptimal systems, and block quantization. A survey is then conducted encompassing the four categories of adaptive systems: (1) adaptive transform coding (adaptive sampling, adaptive quantization, etc.), (2) adaptive predictive coding (adaptive delta modulation, adaptive DPCM encoding, etc.), (3) adaptive cluster coding (blob algorithms and the multispectral cluster coding technique), and (4) adaptive entropy coding.

  12. What is coded into memory in the absence of outcome feedback?

    PubMed

    Henriksson, Maria P; Elwin, Ebba; Juslin, Peter

    2010-01-01

    Although people often have to learn from environments with scarce and highly selective outcome feedback, the question of how nonfeedback trials are represented in memory and affect later performance has received little attention in models of learning and decision making. In this article, the authors use the generalized context model (Nosofsky, 1986) as a vehicle to test contrasting hypotheses about the coding of nonfeedback trials. Data across 3 experiments with selective decision-contingent and selective outcome-contingent feedback provide support for the hypothesis of constructivist coding (Elwin, Juslin, Olsson, & Enkvist, 2007), according to which the outcomes on nonfeedback trials are coded with the most likely outcome, as inferred by the individual. The relation to sampling-based approaches to judgment, and the adaptive significance of constructivist coding, are discussed. (PsycINFO Database Record (c) 2009 APA, all rights reserved).

  13. Auditory spatial processing in the human cortex.

    PubMed

    Salminen, Nelli H; Tiitinen, Hannu; May, Patrick J C

    2012-12-01

    The auditory system codes spatial locations in a way that deviates from the spatial representations found in other modalities. This difference is especially striking in the cortex, where neurons form topographical maps of visual and tactile space but where auditory space is represented through a population rate code. In this hemifield code, sound source location is represented in the activity of two widely tuned opponent populations, one tuned to the right and the other to the left side of auditory space. Scientists are only beginning to uncover how this coding strategy adapts to various spatial processing demands. This review presents the current understanding of auditory spatial processing in the cortex. To this end, the authors consider how various implementations of the hemifield code may exist within the auditory cortex and how these may be modulated by the stimulation and task context. As a result, a coherent set of neural strategies for auditory spatial processing emerges.
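
    A toy version of the hemifield code makes the population rate code concrete. The sigmoidal tuning shape and its slope are illustrative assumptions; the point is only that two broadly tuned opponent populations suffice to read out azimuth from a rate difference:

    ```python
    # Toy hemifield code: one population prefers leftward azimuths, one rightward;
    # sound-source laterality is decoded from their rate difference.
    import math

    def rate(azimuth_deg, preferred_sign):
        # sigmoidal tuning across the midline, saturating within one hemifield
        return 1.0 / (1.0 + math.exp(-preferred_sign * azimuth_deg / 20.0))

    def decode(azimuth_deg):
        left = rate(azimuth_deg, -1)
        right = rate(azimuth_deg, +1)
        return right - left   # >0: source on the right, <0: left, ~0: center
    ```

    Task- and stimulation-dependent modulation, as discussed in the review, would correspond to changes in the gain or tuning of the two populations rather than to a topographic map.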

  14. Tol2 transposon-mediated transgenesis in the Midas cichlid (Amphilophus citrinellus) - towards understanding gene function and regulatory evolution in an ecological model system for rapid phenotypic diversification.

    PubMed

    Kratochwil, Claudius F; Sefton, Maggie M; Liang, Yipeng; Meyer, Axel

    2017-11-23

    The Midas cichlid species complex (Amphilophus spp.) is widely known among evolutionary biologists as a model system for sympatric speciation and adaptive phenotypic divergence within extremely short periods of time (a few hundred generations). The repeated parallel evolution of adaptive phenotypes in this radiation, combined with their near genetic identity, makes them an excellent model for studying phenotypic diversification. While many ecological and evolutionary studies have been performed on Midas cichlids, the molecular basis of specific phenotypes, particularly adaptations, and their underlying coding and cis-regulatory changes have not yet been studied thoroughly. For the first time in any New World cichlid, we use Tol2 transposon-mediated transgenesis in the Midas cichlid (Amphilophus citrinellus). By adapting existing microinjection protocols, we established an effective protocol for transgenesis in Midas cichlids. Embryos were injected with a Tol2 plasmid construct that drives enhanced green fluorescent protein (eGFP) expression under the control of the ubiquitin promoter. The transgene was successfully integrated into the germline, driving strong ubiquitous expression of eGFP in the first transgenic Midas cichlid line. Additionally, we show transient expression of two further transgenic constructs, ubiquitin::tdTomato and mitfa::eGFP. Transgenesis in Midas cichlids will facilitate further investigation of the genetic basis of species-specific traits, many of which are adaptations. Transgenesis is a versatile tool not only for studying regulatory elements such as promoters and enhancers, but also for testing gene function through overexpression of allelic gene variants. As such, it is an important first step in establishing the Midas cichlid as a powerful model for studying adaptive coding and non-coding changes in an ecological and evolutionary context.

  15. Configuring a Context-Aware Middleware for Wireless Sensor Networks

    PubMed Central

    Gámez, Nadia; Cubo, Javier; Fuentes, Lidia; Pimentel, Ernesto

    2012-01-01

    In the Future Internet, applications based on Wireless Sensor Networks will have to support reconfiguration with minimal human intervention, depending on dynamic context changes in their environment. These situations create a need to build such applications as adaptive software, including techniques for context acquisition and adaptation decisions. However, contexts typically consist of complex information acquired from heterogeneous devices and user characteristics, making them difficult to manage. So, instead of building context-aware applications from scratch, we propose to use FamiWare, a family of middleware for Ambient Intelligence specifically designed to be aware of contexts in sensor and smartphone devices. It provides both several monitoring services to acquire contexts from devices and users, and a context-awareness service to analyze and detect context changes. However, the current version of FamiWare does not support automatically incorporating the management of new contexts into the FamiWare family. To overcome this shortcoming, in this work we first present how to model context using a metamodel that defines the contexts to be taken into account in an instantiation of FamiWare for a given Ambient Intelligence system. Then, to configure a new context-aware version of FamiWare and to generate code ready to install on heterogeneous devices, we define a mapping that automatically transforms metamodel elements defining contexts into elements of the FamiWare family, and we use the FamiWare configuration process to customize the new context-aware variant. Finally, we evaluate the benefits of our process, analyzing both whether the new version of the middleware works as expected and whether it manages contexts efficiently. PMID:23012505

  16. Gender, Cultural Influences, and Coping with Musculoskeletal Pain at Work: The Experience of Malaysian Female Office Workers.

    PubMed

    Maakip, Ismail; Oakman, Jodi; Stuckey, Rwth

    2017-06-01

    Purpose: Workers with musculoskeletal pain (MSP) often continue to work despite their condition. Understanding the factors that enable them to remain at work provides insights into the development of appropriate workplace accommodations. This qualitative study aims to explore the strategies utilised by female Malaysian office workers with MSP to maintain productive employment. Methods: A qualitative approach using thematic analysis was used. Individual semi-structured interviews were conducted with 13 female Malaysian office workers with MSP. Initial codes were identified and refined through iterative discussion to further develop the emerging codes and modify the coding framework. A further stage of coding was undertaken to eliminate redundant codes and establish analytic connections between distinct themes. Results: Two major themes were identified: managing the demands of work and maintaining employment with persistent musculoskeletal pain. Participants reported developing strategies to assist them to remain at work, but most focused on individually initiated adaptations or peer support, rather than systemic changes to work systems or practices. A combination of the patriarchal and hierarchical cultural occupational context emerged as a critical factor in the finding of individual or peer based adaptations rather than organizational accommodations. Conclusions: It is recommended that supervisors be educated in the benefits of maintaining and retaining employees with MSP, and encouraged to challenge cultural norms and develop appropriate flexible workplace accommodations through consultation and negotiation with these workers.

  17. MILCOM '85 - Military Communications Conference, Boston, MA, October 20-23, 1985, Conference Record. Volumes 1, 2, & 3

    NASA Astrophysics Data System (ADS)

    The present conference on the development status of communications systems in the context of electronic warfare gives attention to topics in spread spectrum code acquisition, digital speech technology, fiber-optics communications, free space optical communications, the networking of HF systems, and applications and evaluation methods for digital speech. Also treated are issues in local area network system design, coding techniques and applications, technology applications for HF systems, receiver technologies, software development status, channel simulation/prediction methods, C3 networking, spread spectrum networks, the improvement of communication efficiency and reliability through technical control methods, mobile radio systems, and adaptive antenna arrays. Finally, communications system cost analyses, spread spectrum performance, voice and image coding, switched networks, and microwave GaAs ICs are considered.

  18. Weighted bi-prediction for light field image coding

    NASA Astrophysics Data System (ADS)

    Conti, Caroline; Nunes, Paulo; Ducla Soares, Luís.

    2017-09-01

    Light field imaging based on a single-tier camera equipped with a microlens array - also known as integral, holoscopic, and plenoptic imaging - has recently emerged as a practical and promising approach for future visual applications and services. However, successfully deploying actual light field imaging applications and services will require developing adequate coding solutions to efficiently handle the massive amount of data involved in these systems. In this context, self-similarity compensated prediction is a non-local spatial prediction scheme based on block matching that has been shown to achieve high efficiency for light field image coding based on the High Efficiency Video Coding (HEVC) standard. As previously shown by the authors, this is possible by simply averaging two predictor blocks that are jointly estimated from a causal search window in the current frame itself, referred to as self-similarity bi-prediction. However, theoretical analyses of motion compensated bi-prediction have suggested that further rate-distortion performance improvements are possible by adaptively estimating the weighting coefficients of the two predictor blocks. Therefore, this paper presents a comprehensive study of the rate-distortion performance of HEVC-based light field image coding when using different sets of weighting coefficients for self-similarity bi-prediction. Experimental results demonstrate that the previous theoretical conclusions extend to light field image coding and show that the proposed adaptive weighting coefficient selection leads to up to 5% bit savings compared to the previous self-similarity bi-prediction scheme.
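
    The core idea of weighted bi-prediction can be sketched in a few lines. The weight set and the sum-of-squared-differences criterion below are illustrative assumptions, not the coefficients or cost function of the paper's HEVC implementation:

    ```python
    # Weighted bi-prediction sketch: the predictor is a weighted average of two
    # candidate blocks, and the weight minimizing prediction error is selected.
    def predict(b0, b1, w0):
        return [w0 * x + (1.0 - w0) * y for x, y in zip(b0, b1)]

    def best_weight(current, b0, b1, weights=(0.25, 0.5, 0.75)):
        def ssd(w):  # sum of squared differences to the block being coded
            return sum((c - x) ** 2 for c, x in zip(current, predict(b0, b1, w)))
        return min(weights, key=ssd)

    # If the current block matches predictor b0 more closely, a weight
    # above 0.5 (favoring b0) is chosen:
    # best_weight([10, 10], [10, 10], [0, 0]) == 0.75
    ```

    Plain averaging is the special case w0 = 0.5; adaptive selection generalizes it at the cost of signaling or inferring the chosen weight.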

  19. Adaptive EAGLE dynamic solution adaptation and grid quality enhancement

    NASA Technical Reports Server (NTRS)

    Luong, Phu Vinh; Thompson, J. F.; Gatlin, B.; Mastin, C. W.; Kim, H. J.

    1992-01-01

    In the effort described here, the elliptic grid generation procedure in the EAGLE grid code was separated from the main code into a subroutine, and a new subroutine which evaluates several grid quality measures at each grid point was added. The elliptic grid routine can now be called, either by a computational fluid dynamics (CFD) code to generate a new adaptive grid based on flow variables and quality measures through multiple adaptation, or by the EAGLE main code to generate a grid based on quality measure variables through static adaptation. Arrays of flow variables can be read into the EAGLE grid code for use in static adaptation as well. These major changes in the EAGLE adaptive grid system make it easier to convert any CFD code that operates on a block-structured grid (or single-block grid) into a multiple adaptive code.

  20. Distributed Coding/Decoding Complexity in Video Sensor Networks

    PubMed Central

    Cordeiro, Paulo J.; Assunção, Pedro

    2012-01-01

    Video Sensor Networks (VSNs) are recent communication infrastructures used to capture and transmit dense visual information from an application context. In such large scale environments which include video coding, transmission and display/storage, there are several open problems to overcome in practical implementations. This paper addresses the most relevant challenges posed by VSNs, namely stringent bandwidth usage and processing time/power constraints. In particular, the paper proposes a novel VSN architecture where large sets of visual sensors with embedded processors are used for compression and transmission of coded streams to gateways, which in turn transrate the incoming streams and adapt them to the variable complexity requirements of both the sensor encoders and end-user decoder terminals. Such gateways provide real-time transcoding functionalities for bandwidth adaptation and coding/decoding complexity distribution by transferring the most complex video encoding/decoding tasks to the transcoding gateway at the expense of a limited increase in bit rate. Then, a method to reduce the decoding complexity, suitable for system-on-chip implementation, is proposed to operate at the transcoding gateway whenever decoders with constrained resources are targeted. The results show that the proposed method achieves good performance and its inclusion into the VSN infrastructure provides an additional level of complexity control functionality. PMID:22736972

  1. Distributed coding/decoding complexity in video sensor networks.

    PubMed

    Cordeiro, Paulo J; Assunção, Pedro

    2012-01-01

    Video Sensor Networks (VSNs) are recent communication infrastructures used to capture and transmit dense visual information from an application context. In such large scale environments which include video coding, transmission and display/storage, there are several open problems to overcome in practical implementations. This paper addresses the most relevant challenges posed by VSNs, namely stringent bandwidth usage and processing time/power constraints. In particular, the paper proposes a novel VSN architecture where large sets of visual sensors with embedded processors are used for compression and transmission of coded streams to gateways, which in turn transrate the incoming streams and adapt them to the variable complexity requirements of both the sensor encoders and end-user decoder terminals. Such gateways provide real-time transcoding functionalities for bandwidth adaptation and coding/decoding complexity distribution by transferring the most complex video encoding/decoding tasks to the transcoding gateway at the expense of a limited increase in bit rate. Then, a method to reduce the decoding complexity, suitable for system-on-chip implementation, is proposed to operate at the transcoding gateway whenever decoders with constrained resources are targeted. The results show that the proposed method achieves good performance and its inclusion into the VSN infrastructure provides an additional level of complexity control functionality.

  2. Reduced adaptability, but no fundamental disruption, of norm-based face coding following early visual deprivation from congenital cataracts.

    PubMed

    Rhodes, Gillian; Nishimura, Mayu; de Heering, Adelaide; Jeffery, Linda; Maurer, Daphne

    2017-05-01

    Faces are adaptively coded relative to visual norms that are updated by experience, and this adaptive coding is linked to face recognition ability. Here we investigated whether adaptive coding of faces is disrupted in individuals (adolescents and adults) who experience face recognition difficulties following visual deprivation from congenital cataracts in infancy. We measured adaptive coding using face identity aftereffects, where smaller aftereffects indicate less adaptive updating of face-coding mechanisms by experience. We also examined whether the aftereffects increase with adaptor identity strength, consistent with norm-based coding of identity, as in typical populations, or whether they show a different pattern indicating some more fundamental disruption of face-coding mechanisms. Cataract-reversal patients showed significantly smaller face identity aftereffects than did controls (Experiments 1 and 2). However, their aftereffects increased significantly with adaptor strength, consistent with norm-based coding (Experiment 2). Thus we found reduced adaptability but no fundamental disruption of norm-based face-coding mechanisms in cataract-reversal patients. Our results suggest that early visual experience is important for the normal development of adaptive face-coding mechanisms. © 2016 John Wiley & Sons Ltd.
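
    A toy norm-based coding sketch can illustrate the two signatures tested above: aftereffects grow with adaptor identity strength, and reduced adaptability shrinks them without changing that pattern. The linear norm-shift model and its parameter are illustrative assumptions, not the study's model:

    ```python
    # Toy norm-based face coding: identity is a vector relative to a norm face.
    # Adaptation shifts the norm toward the adaptor, biasing a test face away
    # from it; `adaptability` scales how much experience updates the norm.
    def aftereffect(adaptor_strength, adaptability=0.3):
        norm = adaptability * adaptor_strength   # norm shifts toward the adaptor
        test = 0.0                               # a previously average test face
        return test - norm                       # now perceived as "anti-adaptor"

    weak, strong = aftereffect(0.5), aftereffect(1.0)
    # abs(strong) > abs(weak): aftereffects scale with adaptor strength,
    # the norm-based coding signature; a smaller `adaptability` reproduces the
    # patients' smaller (but still strength-dependent) aftereffects.
    ```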

  3. Space-time adaptive solution of inverse problems with the discrete adjoint method

    NASA Astrophysics Data System (ADS)

    Alexe, Mihai; Sandu, Adrian

    2014-08-01

    This paper develops a framework for the construction and analysis of discrete adjoint sensitivities in the context of time dependent, adaptive grid, adaptive step models. Discrete adjoints are attractive in practice since they can be generated with low effort using automatic differentiation. However, this approach brings several important challenges. The space-time adjoint of the forward numerical scheme may be inconsistent with the continuous adjoint equations. A reduction in accuracy of the discrete adjoint sensitivities may appear due to the inter-grid transfer operators. Moreover, the optimization algorithm may need to accommodate state and gradient vectors whose dimensions change between iterations. This work shows that several of these potential issues can be avoided through a multi-level optimization strategy using discontinuous Galerkin (DG) hp-adaptive discretizations paired with Runge-Kutta (RK) time integration. We extend the concept of dual (adjoint) consistency to space-time RK-DG discretizations, which are then shown to be well suited for the adaptive solution of time-dependent inverse problems. Furthermore, we prove that DG mesh transfer operators on general meshes are also dual consistent. This allows the simultaneous derivation of the discrete adjoint for both the numerical solver and the mesh transfer logic with an automatic code generation mechanism such as algorithmic differentiation (AD), potentially speeding up development of large-scale simulation codes. The theoretical analysis is supported by numerical results reported for a two-dimensional non-stationary inverse problem.

  4. Vector Adaptive/Predictive Encoding Of Speech

    NASA Technical Reports Server (NTRS)

    Chen, Juin-Hwey; Gersho, Allen

    1989-01-01

    A vector adaptive/predictive technique for digital encoding of speech signals yields decoded speech of very good quality after transmission at a coding rate of 9.6 kb/s, and of reasonably good quality at 4.8 kb/s, requiring 3 to 4 million multiplications and additions per second. It combines advantages of adaptive/predictive coding and of code-excited linear prediction; the latter yields speech of high quality but requires 600 million multiplications and additions per second at an encoding rate of 4.8 kb/s. The vector adaptive/predictive coding technique thus bridges the gaps in performance and complexity between adaptive/predictive coding and code-excited linear prediction.

  5. Performance optimization of PM-16QAM transmission system enabled by real-time self-adaptive coding.

    PubMed

    Qu, Zhen; Li, Yao; Mo, Weiyang; Yang, Mingwei; Zhu, Shengxiang; Kilper, Daniel C; Djordjevic, Ivan B

    2017-10-15

    We experimentally demonstrate self-adaptive coded 5×100 Gb/s WDM polarization-multiplexed 16 quadrature amplitude modulation transmission over a 100 km fiber link, enabled by a real-time control plane. The real-time optical signal-to-noise ratio (OSNR) is measured using an optical performance monitoring device. The OSNR measurement is processed and fed back using control-plane logic and messaging to the transmitter side for code adaptation, where the binary data are adaptively encoded with three types of large-girth low-density parity-check (LDPC) codes with code rates of 0.8, 0.75, and 0.7. The total code-adaptation latency is measured to be 2273 ms. Compared with transmission without adaptation, average net capacity improvements of 102%, 36%, and 7.5%, respectively, are obtained by adaptive LDPC coding.
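The control-plane decision can be sketched as a simple rate lookup driven by the monitored OSNR; the OSNR thresholds below are invented for illustration, not taken from the experiment.

```python
# Hypothetical sketch of a self-adaptive link's rate selection: pick the
# highest-rate LDPC code whose (assumed, illustrative) required OSNR is
# met by the monitored OSNR.

LDPC_CODES = [          # (code rate, required OSNR in dB) -- illustrative values
    (0.80, 18.0),
    (0.75, 16.5),
    (0.70, 15.0),
]

def select_code(osnr_db):
    """Return the highest-rate code supported at the measured OSNR, or None."""
    for rate, required in LDPC_CODES:   # sorted from highest to lowest rate
        if osnr_db >= required:
            return rate
    return None                          # no code closes the link margin
```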

  6. Adaptation can explain evidence for encoding of probabilistic information in macaque inferior temporal cortex.

    PubMed

    Vinken, Kasper; Vogels, Rufin

    2017-11-20

    In predictive coding theory, the brain is conceptualized as a prediction machine that constantly constructs and updates expectations of the sensory environment [1]. In the context of this theory, Bell et al. [2] recently studied the effect of the probability of task-relevant stimuli on the activity of macaque inferior temporal (IT) neurons and observed a reduced population response to expected faces in face-selective neurons. They concluded that "IT neurons encode long-term, latent probabilistic information about stimulus occurrence", supporting predictive coding. They manipulated expectation through the frequency of face versus fruit stimuli in blocks of trials. With such a design, stimulus repetition is confounded with expectation. Because previous studies showed that IT neurons decrease their response with repetition [3], such adaptation (or repetition suppression), rather than the expectation suppression assumed by the authors, could explain their effects. The authors attempted to control for this alternative interpretation with a multiple regression approach. Here we show by simulation that adaptation can still masquerade as the expectation effects reported in [2]. Further, the results from the regression model used for most analyses cannot be trusted, because the model is not uniquely defined. Copyright © 2017 Elsevier Ltd. All rights reserved.
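The confound can be reproduced in a few lines: a purely adapting unit, whose response is suppressed by recent face repetitions and which carries no expectation signal at all, still responds less to faces in a face-frequent block. All parameters and sequences below are illustrative, not taken from either study.

```python
# Toy simulation of the repetition/expectation confound: short-term
# repetition suppression alone produces a lower mean face response in a
# face-frequent (80%) block than in a face-rare (20%) block.

def face_responses(trials, window=5, suppression=0.8):
    """Response to each face trial under short-term repetition suppression."""
    responses = []
    for i, stim in enumerate(trials):
        if stim == "face":
            recent = trials[max(0, i - window):i].count("face")
            responses.append(suppression ** recent)
    return responses

high_prob = ["face", "face", "face", "face", "fruit"] * 20    # 80% faces
low_prob = ["fruit", "fruit", "fruit", "fruit", "face"] * 20  # 20% faces

mean_high = sum(face_responses(high_prob)) / len(face_responses(high_prob))
mean_low = sum(face_responses(low_prob)) / len(face_responses(low_prob))
```

The reduced response in the high-probability block mimics "expectation suppression" even though the model encodes no probabilities, which is the authors' point.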

  7. Facial expression coding in children and adolescents with autism: Reduced adaptability but intact norm-based coding.

    PubMed

    Rhodes, Gillian; Burton, Nichola; Jeffery, Linda; Read, Ainsley; Taylor, Libby; Ewing, Louise

    2018-05-01

    Individuals with autism spectrum disorder (ASD) can have difficulty recognizing emotional expressions. Here, we asked whether the underlying perceptual coding of expression is disrupted. Typical individuals code expression relative to a perceptual (average) norm that is continuously updated by experience. This adaptability of face-coding mechanisms has been linked to performance on various face tasks. We used an adaptation aftereffect paradigm to characterize expression coding in children and adolescents with autism. We asked whether face expression coding is less adaptable in autism and whether there is any fundamental disruption of norm-based coding. If expression coding is norm-based, then the face aftereffects should increase with adaptor expression strength (distance from the average expression). We observed this pattern in both autistic and typically developing participants, suggesting that norm-based coding is fundamentally intact in autism. Critically, however, expression aftereffects were reduced in the autism group, indicating that expression-coding mechanisms are less readily tuned by experience. Reduced adaptability has also been reported for coding of face identity and gaze direction. Thus, there appears to be a pervasive lack of adaptability in face-coding mechanisms in autism, which could contribute to face processing and broader social difficulties in the disorder. © 2017 The British Psychological Society.
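The norm-based prediction tested here can be sketched with a one-dimensional model in which adaptation shifts the perceptual norm toward the adaptor by a gain g (our invented stand-in for adaptability): the aftereffect grows with adaptor strength, and a smaller gain shrinks it without changing that pattern.

```python
# One-dimensional norm-based coding sketch: an expression is coded by its
# distance from the norm; adaptation shifts the norm toward the adaptor.

def aftereffect(adaptor_strength, gain):
    """Change in a test face's perceived distance from the norm after the
    norm shifts by gain * adaptor_strength toward the adaptor."""
    norm_before = 0.0
    norm_after = norm_before + gain * adaptor_strength
    test = -1.0                      # anti-adaptor test face
    return abs(test - norm_after) - abs(test - norm_before)
```

With a lower gain (the assumed reduced adaptability), aftereffects are smaller at every adaptor strength yet still increase with strength, matching the signature of intact norm-based coding.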

  8. Professional codes in a changing nursing context: literature review.

    PubMed

    Meulenbergs, Tom; Verpeet, Ellen; Schotsmans, Paul; Gastmans, Chris

    2004-05-01

    Professional codes played a definitive role during a specific period of time, when the professional context of nursing was characterized by increasing professionalization. Today, however, this professional context has changed. This paper reports on a study that aimed to explore the meaning of professional codes in the current context of the nursing profession. A literature review on professional codes and the nursing profession was carried out. The literature was systematically investigated using the electronic databases PubMed and The Philosopher's Index, and the keywords nursing codes, professional codes in nursing, ethics codes/ethical codes, and professional ethics. Owing to the nursing profession's growing multidisciplinary nature, the increasing dominance of economic discourse, and the intensified legal framework in which health care professionals need to operate, the context of nursing is changing. In this changed professional context, nursing professional codes have to accommodate the increasing ethical demands placed upon the profession. Therefore, an ethicization of these codes is desirable, and their moral objectives need to be revalued.

  9. The agents of natural genome editing.

    PubMed

    Witzany, Guenther

    2011-06-01

    DNA serves as a stable information storage medium, and every protein needed by the cell is produced from this blueprint via an RNA intermediate. More recently it was found that an abundance of various RNA elements cooperate in a variety of steps and substeps as regulatory and catalytic units with multiple competencies to act on RNA transcripts. Natural genome editing on one side is the competent agent-driven generation and integration of meaningful DNA nucleotide sequences into pre-existing genomic content arrangements, and the ability to (re-)combine and (re-)regulate them according to context-dependent (i.e. adaptational) purposes of the host organism. Natural genome editing on the other side designates the integration of all RNA activities acting on RNA transcripts without altering DNA-encoded genes. If we take the genetic code seriously as a natural code, there must be agents that are competent to act on this code, because no natural code codes itself, just as no natural language speaks itself. Viral and subviral agents have been suggested as code-editing agents, because several indicators show viruses to be competent in both RNA and DNA natural genome editing.

  10. Reduced adaptability, but no fundamental disruption, of norm-based face-coding mechanisms in cognitively able children and adolescents with autism.

    PubMed

    Rhodes, Gillian; Ewing, Louise; Jeffery, Linda; Avard, Eleni; Taylor, Libby

    2014-09-01

    Faces are adaptively coded relative to visual norms that are updated by experience. This coding is compromised in autism and the broader autism phenotype, suggesting that atypical adaptive coding of faces may be an endophenotype for autism. Here we investigate the nature of this atypicality, asking whether adaptive face-coding mechanisms are fundamentally altered, or simply less responsive to experience, in autism. We measured adaptive coding, using face identity aftereffects, in cognitively able children and adolescents with autism and neurotypical age- and ability-matched participants. We asked whether these aftereffects increase with adaptor identity strength as in neurotypical populations, or whether they show a different pattern indicating a more fundamental alteration in face-coding mechanisms. As expected, face identity aftereffects were reduced in the autism group, but they nevertheless increased with adaptor strength, like those of our neurotypical participants, consistent with norm-based coding of face identity. Moreover, their aftereffects correlated positively with face recognition ability, consistent with an intact functional role for adaptive coding in face recognition ability. We conclude that adaptive norm-based face-coding mechanisms are basically intact in autism, but are less readily calibrated by experience. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Code of Ethics in a Multicultural Company and its Legal Context

    NASA Astrophysics Data System (ADS)

    Odlerová, Eva; Ďurišová, Jaroslava; Šramel, Bystrík

    2012-12-01

    The entry of foreign investors and the simultaneous expansion of different national cultures, religions, rules, and moral and ethical standards are bringing up problems of cooperation and coexistence among different nationalities, ethnicities, and cultures. Working in an international environment therefore requires adaptation to a variety of economic, political, legal, technical, social, cultural, and historical conditions. One possible solution is to define a code of ethics: guidelines that find enough common moral principles to become the basis for the adoption of general ethical standards, while respecting national and cultural differences and practices. In this article, the authors pay attention not only to the analysis of the common ethical rules in a multicultural company, but also to the legal aspects of codes of ethics. Each code of ethics is a set of standards which, like legal norms, regulate the behaviour of individuals. These standards, however, must simultaneously meet certain statutory criteria that define the boundaries of regulation of employees' behaviour.

  12. Numerical studies of film formation in context of steel coating

    NASA Astrophysics Data System (ADS)

    Aniszewski, Wojciech; Zaleski, Stephane; Popinet, Stephane

    2017-11-01

    In this work, we present a detailed example of a numerical study of film formation in the context of metal coating. Liquid metal is drawn from a reservoir onto a retracting solid sheet, forming a coating film characterized by phenomena such as longitudinal thickness variation (in 3D) or waves akin to those predicted by Kapitza and Kapitza (visible in two dimensions as well). While the industry-standard configuration for zinc coating is marked by the coexistence of a medium capillary number (Ca = 0.03) and a film Reynolds number above 1000, we also present parametric studies to establish more clearly to what degree the numerical method influences the film regimes obtained in the target configuration. The simulations have been performed using Basilisk, a grid-adapting, strongly optimized code derived from Gerris. Mesh adaptation allows for arbitrary precision in relevant regions such as the contact line or the meniscus, while a coarse grid is applied elsewhere. This adaptation strategy, as the results indicate, is the only realistic approach for a numerical method to cover the wide range of necessary scales, from the predicted film thickness (hundreds of microns) to the domain size (meters).

  13. Adaptive coding of MSS imagery. [Multi Spectral band Scanners

    NASA Technical Reports Server (NTRS)

    Habibi, A.; Samulon, A. S.; Fultz, G. L.; Lumb, D.

    1977-01-01

    A number of adaptive data compression techniques are considered for reducing the bandwidth of multispectral data. They include adaptive transform coding, adaptive DPCM, adaptive cluster coding, and a hybrid method. The techniques are simulated and their performance in compressing the bandwidth of Landsat multispectral images is evaluated and compared using signal-to-noise ratio and classification consistency as fidelity criteria.
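One ingredient of such adaptive schemes can be sketched as per-block adaptive quantization, in which high-variance ("busy") blocks of a scan line receive more bits than flat ones; the block size, bit allocation rule, and activity threshold below are invented for illustration, not taken from the paper.

```python
# Sketch of per-block adaptive quantization: allocate more bits to blocks
# of a scan line whose variance indicates high activity.

def encode_block(block, bits):
    scale = max(abs(v) for v in block) or 1.0
    step = 2.0 * scale / (2 ** bits)
    return scale, [round(v / step) for v in block]

def decode_block(scale, codes, bits):
    step = 2.0 * scale / (2 ** bits)
    return [q * step for q in codes]

def adaptive_code(line, blk=8, low=3, high=6, var_thresh=0.01):
    """Encode then decode a scan line, adapting bits per block to activity."""
    out = []
    for i in range(0, len(line), blk):
        block = line[i:i + blk]
        mean = sum(block) / len(block)
        var = sum((v - mean) ** 2 for v in block) / len(block)
        bits = high if var > var_thresh else low   # adapt bits to activity
        scale, codes = encode_block(block, bits)
        out.extend(decode_block(scale, codes, bits))
    return out
```

Flat blocks spend few bits at little cost in error, while busy blocks keep the error below half of their finer quantizer step, which is the basic trade-off adaptive bit allocation exploits.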

  14. Interplay between cardiac transcription factors and non-coding RNAs in predisposing to atrial fibrillation.

    PubMed

    Mikhailov, Alexander T; Torrado, Mario

    2018-05-12

    There is growing evidence that putative gene regulatory networks including cardio-enriched transcription factors, such as PITX2, TBX5, ZFHX3, and SHOX2, and their effector/target genes along with downstream non-coding RNAs can play a potentially important role in the process of adaptive and maladaptive atrial rhythm remodeling. In turn, expression of atrial fibrillation-associated transcription factors is under the control of upstream regulatory non-coding RNAs. This review broadly explores gene regulatory mechanisms associated with susceptibility to atrial fibrillation-with key examples from both animal models and patients-within the context of both cardiac transcription factors and non-coding RNAs. These two systems appear to have multiple levels of cross-regulation and act coordinately to achieve effective control of atrial rhythm effector gene expression. Perturbations of a dynamic expression balance between transcription factors and corresponding non-coding RNAs can provoke the development or promote the progression of atrial fibrillation. We also outline deficiencies in current models and discuss ongoing studies to clarify remaining mechanistic questions. An understanding of the function of transcription factors and non-coding RNAs in gene regulatory networks associated with atrial fibrillation risk will enable the development of innovative therapeutic strategies.

  15. Results of an Integrative Analysis: A Call for Contextualizing HIV and AIDS Clinical Practice Guidelines to Support Evidence-Based Practice.

    PubMed

    Edwards, Nancy; Kahwa, Eulalia; Hoogeveen, Katie

    2017-12-01

    Practice guidelines aim to improve the standard of care for people living with HIV/AIDS. Successfully implementing guidelines requires tailoring them to populations served and to social and organizational influences on care. To examine dimensions of context, which nurses and midwives described as having a significant impact on their care of patients living with HIV/AIDS in Kenya, Uganda, South Africa, and Jamaica and to determine whether HIV/AIDS guidelines include adaptations congruent with these dimensions of context. Two sets of data were used. The first came from a qualitative study. In-depth interviews were conducted with purposively selected nurses, midwives, and nurse managers from 21 districts in four study countries. A coding framework was iteratively developed and themes inductively identified. Context dimensions were derived from these themes. A second data set of published guidelines for HIV/AIDS care was then assembled. Guidelines were identified through Google and PubMed searches. Using a deductive integrative analysis approach, text related to context dimensions was extracted from guidelines and categorized into problem and strategy statements. Ninety-six individuals participated in qualitative interviews. Four discrete dimensions of context were identified: health workforce adequacy, workplace exposure risk, workplace consequences for nurses living with HIV/AIDS, and the intersection of work and family life. Guidelines most often acknowledged health human resource constraints and presented mitigation strategies to offset them, and least often discussed workplace consequences and the intersections of family and work life. Guidelines should more consistently acknowledge diverse implementation contexts, propose how recommendations can be adapted to these realities, and suggest what role frontline healthcare providers have in realizing the structural changes necessary for healthier work environments and better patient care. 
Guideline recommendations should include more explicit advice on adapting their recommendations to different care conditions. © 2017 The Authors. Worldviews on Evidence-Based Nursing published by Wiley Periodicals, Inc. on behalf of Sigma Theta Tau International The Honor Society of Nursing.

  16. A qualitative study of implementation and adaptations to Progressive Tinnitus Management (PTM) delivery.

    PubMed

    Tuepker, Anaïs; Elnitsky, Christine; Newell, Summer; Zaugg, Tara; Henry, James A

    2018-01-01

    Tinnitus is a common condition, especially prevalent among military Veterans. Progressive Tinnitus Management (PTM) is an interdisciplinary, structured, stepped-care approach to providing clinical services, including teaching coping skills, to people bothered by tinnitus. PTM has been shown to be effective at reducing functional distress, but implementation of the intervention outside of a research setting has not been studied, even though dissemination is underway within the Veterans Health Administration (VHA) system in the United States. This study was designed to address a gap in knowledge of PTM clinical implementation to date, with a focus on factors facilitating or hindering implementation in VHA audiology and mental health clinic contexts, and whether implementing sites had developed intervention adaptations. Qualitative interviews were conducted with 21 audiology and mental health clinicians and service chiefs across a regional service network. Interviews were transcribed and coded using a hybrid inductive-deductive analytic approach guided by existing implementation research frameworks and then iteratively developed for emergent themes. PTM prioritization was rare overall, with providers across disciplines challenged by lack of capacity for implementation, but with differences by discipline in challenges to prioritization. Where PTM was prioritized and delivered, this was facilitated by perception of unique value, provider's own experience of tinnitus, observation/experience with PTM delivery, intervention fit with provider's skills, and an environment with supportive leadership and adaptive reserve. PTM was frequently adapted to local contexts to address delivery challenges and diversify patient options. Adaptations included shifting from group to individual formats, reducing or combining sessions, and employing novel therapeutic approaches. 
Existing adaptations highlight the need to better understand mechanisms underlying PTM's effectiveness, and research on the impact of adaptations on patient outcomes is an important next step. Prioritization of PTM is a key barrier to the scale up and spread of this evidence-based intervention. Developing clinician champions may facilitate dissemination, especially if accompanied by signals of systemic prioritization. Novel approaches exposing clinicians and administrators to PTM may identify and develop clinical champions. Acknowledging the potential for PTM adaptations may make delivery more feasible in the context of existing system constraints and priorities.

  17. Towards Timed Automata and Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Hutzler, G.; Klaudel, H.; Wang, D. Y.

    2004-01-01

    The design of reactive systems must comply with logical correctness (the system does what it is supposed to do) and timeliness (the system satisfies a set of temporal constraints) criteria. In this paper, we propose a global approach for the design of adaptive reactive systems, i.e., systems that dynamically adapt their architecture depending on the context. We use the timed automata formalism for the design of the agents' behavior. This allows the properties of the system (regarding logical correctness and timeliness) to be evaluated beforehand, thanks to model-checking and simulation techniques. The model is enhanced with tools that we developed for the automatic generation of code, allowing a running multi-agent prototype satisfying the properties of the model to be produced very quickly.
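The kind of artifact such code generation targets can be sketched as a runner for a single-clock timed automaton that checks both logical order and timing guards; the automaton below (a request must be served within 5 time units) is an invented example, not one from the paper.

```python
# Minimal single-clock timed automaton runner: transitions carry a guard
# on clock x and an optional reset; a run is accepted iff every action
# is enabled and every guard holds at its firing time.

# (state, action) -> (next state, guard on clock x as (lo, hi), reset?)
TRANSITIONS = {
    ("idle", "request"): ("busy", (0.0, float("inf")), True),
    ("busy", "serve"): ("idle", (0.0, 5.0), False),
}

def accepts(timed_word, start="idle"):
    """timed_word: list of (action, absolute time), times nondecreasing."""
    state, clock_ref = start, 0.0
    for action, t in timed_word:
        key = (state, action)
        if key not in TRANSITIONS:
            return False                   # logical correctness violated
        nxt, (lo, hi), reset = TRANSITIONS[key]
        x = t - clock_ref                  # current clock value
        if not (lo <= x <= hi):
            return False                   # timeliness violated
        if reset:
            clock_ref = t
        state = nxt
    return True
```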

  18. Individual differences in adaptive coding of face identity are linked to individual differences in face recognition ability.

    PubMed

    Rhodes, Gillian; Jeffery, Linda; Taylor, Libby; Hayward, William G; Ewing, Louise

    2014-06-01

    Despite their similarity as visual patterns, we can discriminate and recognize many thousands of faces. This expertise has been linked to 2 coding mechanisms: holistic integration of information across the face and adaptive coding of face identity using norms tuned by experience. Recently, individual differences in face recognition ability have been discovered and linked to differences in holistic coding. Here we show that they are also linked to individual differences in adaptive coding of face identity, measured using face identity aftereffects. Identity aftereffects correlated significantly with several measures of face-selective recognition ability. They also correlated marginally with own-race face recognition ability, suggesting a role for adaptive coding in the well-known other-race effect. More generally, these results highlight the important functional role of adaptive face-coding mechanisms in face expertise, taking us beyond the traditional focus on holistic coding mechanisms. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  19. Autistic traits are linked to reduced adaptive coding of face identity and selectively poorer face recognition in men but not women.

    PubMed

    Rhodes, Gillian; Jeffery, Linda; Taylor, Libby; Ewing, Louise

    2013-11-01

    Our ability to discriminate and recognize thousands of faces despite their similarity as visual patterns relies on adaptive, norm-based, coding mechanisms that are continuously updated by experience. Reduced adaptive coding of face identity has been proposed as a neurocognitive endophenotype for autism, because it is found in autism and in relatives of individuals with autism. Autistic traits can also extend continuously into the general population, raising the possibility that reduced adaptive coding of face identity may be more generally associated with autistic traits. In the present study, we investigated whether adaptive coding of face identity decreases as autistic traits increase in an undergraduate population. Adaptive coding was measured using face identity aftereffects, and autistic traits were measured using the Autism-Spectrum Quotient (AQ) and its subscales. We also measured face and car recognition ability to determine whether autistic traits are selectively related to face recognition difficulties. We found that men who scored higher on levels of autistic traits related to social interaction had reduced adaptive coding of face identity. This result is consistent with the idea that atypical adaptive face-coding mechanisms are an endophenotype for autism. Autistic traits were also linked with face-selective recognition difficulties in men. However, there were some unexpected sex differences. In women, autistic traits were linked positively, rather than negatively, with adaptive coding of identity, and were unrelated to face-selective recognition difficulties. These sex differences indicate that autistic traits can have different neurocognitive correlates in men and women and raise the intriguing possibility that endophenotypes of autism can differ in males and females. © 2013 Elsevier Ltd. All rights reserved.

  20. The birth of information in the brain: Edgar Adrian and the vacuum tube.

    PubMed

    Garson, Justin

    2015-03-01

    As historian Henning Schmidgen notes, the scientific study of the nervous system would have been "unthinkable" without the industrialization of communication in the 1830s. Historians have investigated extensively the way nerve physiologists have borrowed concepts and tools from the field of communications, particularly regarding the nineteenth-century work of figures like Helmholtz and in the American Cold War Era. The following focuses specifically on the interwar research of the Cambridge physiologist Edgar Douglas Adrian, and on the technology that led to his Nobel-Prize-winning research, the thermionic vacuum tube. Many countries used the vacuum tube during the war for the purpose of amplifying and intercepting coded messages. These events provided a context for Adrian's evolving understanding of the nerve fiber in the 1920s. In particular, they provide the background for Adrian's transition around 1926 to describing the nerve impulse in terms of "information," "messages," "signals," or even "codes," and for translating the basic principles of the nerve, such as the all-or-none principle and adaptation, into such an "informational" context. The following also places Adrian's research in the broader context of the changing relationship between science and technology, and between physics and physiology, in the first few decades of the twentieth century.

  1. Context-aware and locality-constrained coding for image categorization.

    PubMed

    Xiao, Wenhua; Wang, Bin; Liu, Yu; Bao, Weidong; Zhang, Maojun

    2014-01-01

    Improving the coding strategy for BOF (Bag-of-Features) based feature design has drawn increasing attention in recent image categorization works. However, ambiguity in the coding procedure still impedes its further development. In this paper, we introduce a context-aware and locality-constrained coding (CALC) approach that uses context information to describe objects in a discriminative way. This is generally achieved by learning a word-to-word co-occurrence prior and imposing context information over locality-constrained coding. Firstly, the local context of each category is evaluated by learning a word-to-word co-occurrence matrix representing the spatial distribution of local features in a neighbor region. Then, the learned co-occurrence matrix is used for measuring the context distance between local features and code words. Finally, a coding strategy that simultaneously considers locality in feature space and context space, while weighting each feature, is proposed. This novel coding strategy not only semantically preserves information in coding, but also has the ability to alleviate the noise distortion of each class. Extensive experiments on several available datasets (Scene-15, Caltech101, and Caltech256) are conducted to validate the superiority of our algorithm by comparing it with baselines and recently published methods. Experimental results show that our method significantly improves the performance of the baselines and achieves comparable or even better performance than the state of the art.
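The CALC coding step can be sketched schematically (the mixing rule and uniform code weights are our simplification, not the paper's exact formulation): a local feature is coded over its k nearest words under a distance that blends feature-space distance with a context distance derived from the word-to-word co-occurrence matrix.

```python
# Schematic locality-constrained coding with a co-occurrence-based
# context distance: words that rarely co-occur with the neighborhood's
# dominant word are treated as contextually far.

def calc_code(feature, codebook, cooc, neighbor_word, alpha=0.5, k=2):
    """feature: vector; codebook: list of word vectors; cooc[i][j]:
    co-occurrence count of words i and j; neighbor_word: dominant word in
    the spatial neighborhood, supplying the context."""
    def feat_dist(w):
        return sum((a - b) ** 2 for a, b in zip(feature, codebook[w])) ** 0.5
    def ctx_dist(w):                      # rare co-occurrence = far in context
        return 1.0 / (1.0 + cooc[neighbor_word][w])
    scored = sorted(range(len(codebook)),
                    key=lambda w: (1 - alpha) * feat_dist(w) + alpha * ctx_dist(w))
    code = [0.0] * len(codebook)
    for w in scored[:k]:                  # locality constraint: k nearest only
        code[w] = 1.0 / k                 # uniform weights for simplicity
    return code
```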

  2. FPGA-based rate-adaptive LDPC-coded modulation for the next generation of optical communication systems.

    PubMed

    Zou, Ding; Djordjevic, Ivan B

    2016-09-05

    In this paper, we propose a rate-adaptive FEC scheme based on LDPC codes, together with its software-reconfigurable unified FPGA architecture. By FPGA emulation, we demonstrate that the proposed class of rate-adaptive LDPC codes, based on shortening, with an overhead from 25% to 42.9%, provides a coding gain ranging from 13.08 dB to 14.28 dB at a post-FEC BER of 10^-15 for BPSK transmission. In addition, the proposed rate-adaptive LDPC coding has been demonstrated in combination with higher-order modulations, including QPSK, 8-QAM, 16-QAM, 32-QAM, and 64-QAM, covering a wide range of signal-to-noise ratios. Furthermore, we apply unequal error protection by employing different LDPC codes on different bits in 16-QAM and 64-QAM, which yields an additional 0.5 dB gain compared to conventional LDPC-coded modulation with the same code rate.
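The rate arithmetic behind shortening can be sketched directly: shortening s information bits of an (n, k) mother code yields an (n - s, k - s) code with the same number of parity bits, hence a lower rate and higher overhead. The mother-code size below is illustrative, not the paper's actual code length.

```python
# Rate adaptation by shortening: the parity budget n - k is fixed, so
# removing information bits lowers the rate and raises the overhead.

def shortened_params(n, k, s):
    """(code length, dimension, rate, overhead %) after shortening s bits."""
    n_s, k_s = n - s, k - s
    rate = k_s / n_s
    overhead = 100.0 * (n_s - k_s) / k_s
    return n_s, k_s, rate, overhead
```

For an illustrative rate-0.8 mother code (25% overhead), shortening roughly a third of the information bits reaches rate 0.7 (about 42.9% overhead), matching the range quoted in the abstract.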

  3. Software Helps Retrieve Information Relevant to the User

    NASA Technical Reports Server (NTRS)

    Mathe, Natalie; Chen, James

    2003-01-01

    The Adaptive Indexing and Retrieval Agent (ARNIE) is a code library, designed to be used by an application program, that assists human users in retrieving desired information in a hypertext setting. Using ARNIE, the program implements a computational model for interactively learning what information each human user considers relevant in context. The model, called a "relevance network," incrementally adapts retrieved information to users' individual profiles on the basis of feedback from the users regarding specific queries. The model also generalizes such knowledge for the subsequent derivation of relevant references for similar queries and profiles, thereby assisting users in filtering information by relevance. ARNIE thus enables users to categorize and share information of interest in various contexts. ARNIE encodes the relevance and structure of information in a neural network dynamically configured with a genetic algorithm. ARNIE maintains an internal database, wherein it saves associations, and from which it returns associated items in response to a query. A C++ compiler for the platform on which ARNIE will be utilized is necessary for creating the ARNIE library but is not necessary for the execution of the software.
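The relevance-feedback loop can be sketched with a simple per-user weight table; the update rule below is an invented stand-in for ARNIE's actual neural-network/genetic-algorithm implementation.

```python
# Schematic relevance-feedback retrieval: per-user weights link query
# terms to items and are nudged toward +1/-1 by explicit user feedback.

class RelevanceProfile:
    def __init__(self, lr=0.5):
        self.w = {}                      # (term, item) -> learned relevance
        self.lr = lr

    def score(self, query_terms, item):
        return sum(self.w.get((t, item), 0.0) for t in query_terms)

    def rank(self, query_terms, items):
        """Items ordered by learned relevance to the query, best first."""
        return sorted(items, key=lambda it: -self.score(query_terms, it))

    def feedback(self, query_terms, item, relevant):
        """Move each (term, item) weight toward +1 (relevant) or -1 (not)."""
        target = 1.0 if relevant else -1.0
        for t in query_terms:
            old = self.w.get((t, item), 0.0)
            self.w[(t, item)] = old + self.lr * (target - old)
```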

  4. Implementing Culture Change in Nursing Homes: An Adaptive Leadership Framework

    PubMed Central

    Corazzini, Kirsten; Twersky, Jack; White, Heidi K.; Buhr, Gwendolen T.; McConnell, Eleanor S.; Weiner, Madeline; Colón-Emeric, Cathleen S.

    2015-01-01

    Purpose of the Study: To describe key adaptive challenges and leadership behaviors to implement culture change for person-directed care. Design and Methods: The study design was a qualitative, observational study of nursing home staff perceptions of the implementation of culture change in each of 3 nursing homes. We conducted 7 focus groups of licensed and unlicensed nursing staff, medical care providers, and administrators. Questions explored perceptions of facilitators and barriers to culture change. Using a template organizing style of analysis with immersion/crystallization, themes of barriers and facilitators were coded for adaptive challenges and leadership. Results: Six key themes emerged, including relationships, standards and expectations, motivation and vision, workload, respect of personhood, and physical environment. Within each theme, participants identified barriers that were adaptive challenges and facilitators that were examples of adaptive leadership. Commonly identified challenges were how to provide person-directed care in the context of extant rules or policies or how to develop staff motivated to provide person-directed care. Implications: Implementing culture change requires the recognition of adaptive challenges for which there are no technical solutions, but which require reframing of norms and expectations, and the development of novel and flexible solutions. Managers and administrators seeking to implement person-directed care will need to consider the role of adaptive leadership to address these adaptive challenges. PMID:24451896

  5. From Theory to Practice: Measuring end-of-life communication quality using multiple goals theory.

    PubMed

    Van Scoy, L J; Scott, A M; Reading, J M; Chuang, C H; Chinchilli, V M; Levi, B H; Green, M J

    2017-05-01

    To describe how multiple goals theory can be used as a reliable and valid measure (i.e., coding scheme) of the quality of conversations about end-of-life issues. We analyzed 17 conversations in which 68 participants (mean age = 51 years) played a game that prompted discussion in response to open-ended questions about end-of-life issues. Conversations (mean duration = 91 min) were audio-recorded and transcribed. Communication quality was assessed by three coders who assigned numeric scores rating how well individuals accomplished task, relational, and identity goals in the conversation. The coding measure, which results in a quantifiable outcome, yielded strong reliability (intra-class correlation range = 0.73-0.89 and Cronbach's alpha range = 0.69-0.89 for each of the coded domains) and validity (using multilevel nonlinear modeling, we detected significant variability in scores between games for each of the coded domains, all p-values <0.02). Our coding scheme provides a theory-based measure of end-of-life conversation quality that is superior to other methods of measuring communication quality. Our description of the coding method enables researchers to adapt and apply this measure to communication interventions in other clinical contexts. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
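The reported Cronbach's alpha can be computed from the three coders' scores for one coded domain as follows; the scores below are made up purely to exercise the formula.

```python
# Cronbach's alpha over coders' scores: alpha = k/(k-1) * (1 - sum of
# per-coder variances / variance of per-conversation totals).

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(ratings):
    """ratings: one list of scores per coder, aligned across conversations."""
    k = len(ratings)
    totals = [sum(col) for col in zip(*ratings)]
    item_var = sum(variance(r) for r in ratings)
    return k / (k - 1) * (1.0 - item_var / variance(totals))
```

Perfectly agreeing coders give alpha = 1; small disagreements pull it into the 0.69-0.89 range the abstract reports.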

  6. PARAMESH: A Parallel Adaptive Mesh Refinement Community Toolkit

    NASA Technical Reports Server (NTRS)

    MacNeice, Peter; Olson, Kevin M.; Mobarry, Clark; deFainchtein, Rosalinda; Packer, Charles

    1999-01-01

    In this paper, we describe a community toolkit which is designed to provide parallel support with adaptive mesh capability for a large and important class of computational models, those using structured, logically cartesian meshes. The package of Fortran 90 subroutines, called PARAMESH, is designed to provide an application developer with an easy route to extend an existing serial code which uses a logically cartesian structured mesh into a parallel code with adaptive mesh refinement. Alternatively, in its simplest use, and with minimal effort, it can operate as a domain decomposition tool for users who want to parallelize their serial codes, but who do not wish to use adaptivity. The package can provide them with an incremental evolutionary path for their code, converting it first to uniformly refined parallel code, and then later if they so desire, adding adaptivity.
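The refine-where-needed control flow the abstract describes can be reduced to a toy example. The following Python sketch (PARAMESH itself is a package of Fortran 90 subroutines; the data layout and the gradient-based refinement criterion here are invented for illustration) performs one pass over a set of 1-D blocks, splitting any block whose solution varies too sharply:

```python
import numpy as np

def refine_blocks(blocks, threshold=0.5):
    """One toy refinement pass over a 1-D block-structured mesh.

    Each block is a dict with 'x0', 'x1' bounds and 'u', an array of
    cell values. Blocks whose maximum cell-to-cell jump exceeds the
    threshold are split into two children at double resolution.
    """
    refined = []
    for b in blocks:
        grad = np.max(np.abs(np.diff(b['u'])))
        if grad > threshold:
            mid = 0.5 * (b['x0'] + b['x1'])
            n = len(b['u'])
            x = np.linspace(b['x0'], b['x1'], 2 * n)
            u = np.interp(x, np.linspace(b['x0'], b['x1'], n), b['u'])
            refined.append({'x0': b['x0'], 'x1': mid, 'u': u[:n]})
            refined.append({'x0': mid, 'x1': b['x1'], 'u': u[n:]})
        else:
            refined.append(b)
    return refined
```

A real AMR package would also manage guard cells, derefinement, and parallel distribution of the block tree; the sketch shows only the adaptive splitting step.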

  7. Affective functioning among early adolescents at high and low familial risk for depression and their mothers: A focus on individual and transactional processes across contexts

    PubMed Central

    McMakin, Dana L.; Burkhouse, Katie L.; Olino, Thomas M.; Siegle, Greg J.; Dahl, Ronald E.; Silk, Jennifer S.

    2013-01-01

    This study aimed to characterize affective functioning in families of youth at high familial risk for depression, with particular attention to features of affective functioning that appear critical to adaptive functioning but have been underrepresented in prior research, including positive and negative affect across multiple contexts, individual and transactional processes, and affective flexibility. Interactions among early adolescents (ages 9-14) and their mothers were coded for affective behaviors across both positive and negative contexts. Primary analyses compared never-depressed youth at high (n=44) and low (n=57) familial risk for depression. The high-risk group showed a relatively consistent pattern of low positive affect across negative and positive contexts at both the individual and transactional level. In contrast to prior studies focusing on negative affect, which did not support disruptions in negative affect, the data from this study suggest variability by context (i.e., increased negativity in a positive, but not negative, context) and by individual vs. transactional processes (e.g., negative escalation). Findings are discussed with attention to affective flexibility and contextual and transactional factors. PMID:21744058

  8. Resident challenges with daily life in Chinese long-term care facilities: A qualitative pilot study.

    PubMed

    Song, Yuting; Scales, Kezia; Anderson, Ruth A; Wu, Bei; Corazzini, Kirsten N

    As traditional family-based care in China declines, the demand for residential care increases. Knowledge of residents' experiences with long-term care (LTC) facilities is essential to improving quality of care. This pilot study aimed to describe residents' experiences in LTC facilities, particularly as they related to physical function. Semi-structured open-ended interviews were conducted in two facilities with residents stratified by three functional levels (n = 5). Directed content analysis was guided by the Adaptive Leadership Framework. A two-cycle coding approach was used, with first-cycle descriptive coding and second-cycle dramaturgical coding. Interviews provided examples of challenges faced by residents in meeting their daily care needs. Five themes emerged: staff care, care from family members, physical environment, other residents in the facility, and personal strategies. Findings demonstrate the significance of organizational context for care quality and reveal foci for future research. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. The influence of patient-centered communication during radiotherapy education sessions on post-consultation patient outcomes.

    PubMed

    Dong, Skye; Butow, Phyllis N; Costa, Daniel S J; Dhillon, Haryana M; Shields, Cleveland G

    2014-06-01

    To adapt an observational tool for assessing patient-centeredness of radiotherapy consultations and to assess whether scores for this tool and an existing tool assessing patient-perceived patient-centeredness predict patient outcomes. The Measure of Patient-Centered Communication (MPCC), an observational coding system that assesses depth of discussion during a consultation, was adapted to the radiotherapy context. Fifty-six radiotherapy patients (from 10 radiation therapists) had their psycho-education sessions recorded and coded using the MPCC. Patients also completed instruments assessing their perception of patient-centeredness, trust in the radiation therapist, satisfaction with the consultation, authentic self-representation (ASR) and state anxiety. The MPCC correlated weakly with patient-perceived patient-centeredness. The Feelings subcomponent of the MPCC predicted one aspect of ASR and trust, and interacted with level of therapist experience to predict trust. Patient-perceived patient-centeredness, which exhibited a ceiling effect, predicted satisfaction. Patient-centered communication is an important predictor of patient outcomes in radiotherapy and obviates some negative aspects of radiation therapists' experience on patient trust. As in other studies, there is a weak association between self-reported and observational coding of PCC. Radiation therapists have both technical and supportive roles to play in patient care, and may benefit from training in their supportive role. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  10. Perceptually-Based Adaptive JPEG Coding

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.; Rosenholtz, Ruth; Null, Cynthia H. (Technical Monitor)

    1996-01-01

    An extension to the JPEG standard (ISO/IEC DIS 10918-3) allows spatial adaptive coding of still images. As with baseline JPEG coding, one quantization matrix applies to an entire image channel, but in addition the user may specify a multiplier for each 8 x 8 block, which multiplies the quantization matrix, yielding the new matrix for the block. MPEG 1 and 2 use much the same scheme, except there the multiplier changes only on macroblock boundaries. We propose a method for perceptual optimization of the set of multipliers. We compute the perceptual error for each block based upon DCT quantization error adjusted according to contrast sensitivity, light adaptation, and contrast masking, and pick the set of multipliers which yield maximally flat perceptual error over the blocks of the image. We investigate the bitrate savings due to this adaptive coding scheme and the relative importance of the different sorts of masking on adaptive coding.
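The per-block multiplier scheme described above takes only a few lines to illustrate. This Python fragment is a simplified sketch, not the Part 3 reference implementation; the lower clamp on the multiplier is an assumption of the sketch:

```python
import numpy as np

def quantize_block(dct_block, q_matrix, multiplier):
    """Quantize one 8x8 block of DCT coefficients with a scaled matrix.

    One base quantization matrix applies to the whole image channel;
    each block contributes only a scalar multiplier that scales that
    matrix, as in spatially adaptive JPEG coding. A perceptual
    optimizer would choose the multipliers to flatten perceptual
    error across blocks; here the multiplier is simply given.
    """
    m = max(multiplier, 1.0 / 16.0)  # clamp is an assumption of this sketch
    return np.round(dct_block / (q_matrix * m)).astype(int)
```

Larger multipliers coarsen quantization (fewer bits, more distortion) for blocks whose errors are better masked; smaller ones protect perceptually fragile blocks.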

  11. Transcription in space--environmental vs. genetic effects on differential immune gene expression.

    PubMed

    Lenz, Tobias L

    2015-09-01

    Understanding how organisms adapt to their local environment is one of the key goals in molecular ecology. Adaptation can be achieved through qualitative changes in the coding sequence and/or quantitative changes in gene expression, where the optimal dosage of a gene's product in a given environment is being selected for. Differences in gene expression among populations inhabiting distinct environments can be suggestive of locally adapted gene regulation and have thus been studied in different species (Whitehead & Crawford; Hodgins-Davis & Townsend). However, in contrast to a gene's coding sequence, its expression level at a given point in time may depend on various factors, including the current environment. Although critical for understanding the extent of local adaptation, it is usually difficult to disentangle heritable differences in gene regulation from environmental effects. In this issue of Molecular Ecology, Stutz et al. describe an experiment in which they reciprocally transplanted three-spined sticklebacks (Gasterosteus aculeatus) between independent pairs of small and large lakes. Their experimental design allows them to attribute differences in gene expression among sticklebacks either to lake of origin or to destination lake. Interestingly, they find that translocated sticklebacks show a pattern of gene expression more similar to individuals from the destination lake than to individuals from the lake of origin, suggesting that expression of the targeted genes is more strongly regulated by environmental effects than by genetics. The environmental effect by itself is not entirely surprising; however, its relative extent is. Especially when put in the context of local adaptation and population differentiation, as done here, these findings cast a new light on the heritability of differential gene expression and specifically its relative importance during population divergence and ultimately ecological speciation. © 2015 John Wiley & Sons Ltd.

  12. Processing Code-Switching in Algerian Bilinguals: Effects of Language Use and Semantic Expectancy

    PubMed Central

    Kheder, Souad; Kaan, Edith

    2016-01-01

    Using a cross-modal naming paradigm this study investigated the effect of sentence constraint and language use on the expectancy of a language switch during listening comprehension. Sixty-five Algerian bilinguals who habitually code-switch between Algerian Arabic and French (AA-FR) but not between Standard Arabic and French (SA-FR) listened to sentence fragments and named a visually presented French target NP out loud. Participants’ speech onset times were recorded. The sentence context was either highly semantically constraining toward the French NP or not. The language of the sentence context was either in Algerian Arabic or in Standard Arabic, but the target NP was always in French, thus creating two code-switching contexts: a typical and recurrent code-switching context (AA-FR) and a non-typical code-switching context (SA-FR). Results revealed a semantic constraint effect indicating that the French switches were easier to process in the high compared to the low-constraint context. In addition, the effect size of semantic constraint was significant in the more typical code-switching context (AA-FR) suggesting that language use influences the processing of switching between languages. The effect of semantic constraint was also modulated by code-switching habits and the proficiency of L2 French. Semantic constraint was reduced in bilinguals who frequently code-switch and in bilinguals with high proficiency in French. Results are discussed with regards to the bilingual interactive activation model (Dijkstra and Van Heuven, 2002) and the control process model of code-switching (Green and Wei, 2014). PMID:26973559

  13. Adaptive variable-length coding for efficient compression of spacecraft television data.

    NASA Technical Reports Server (NTRS)

    Rice, R. F.; Plaunt, J. R.

    1971-01-01

    An adaptive variable-length coding system is presented. Although the system was developed primarily for the proposed Grand Tour missions, many of its features clearly indicate much wider applicability. Using sample-to-sample prediction, the coding system produces output rates within 0.25 bit/picture element (pixel) of the one-dimensional difference entropy for entropy values ranging from 0 to 8 bit/pixel. This is accomplished without the necessity of storing any code words. Performance improvements of 0.5 bit/pixel can be simply achieved by utilizing previous line correlation. A Basic Compressor, using concatenated codes, adapts to rapid changes in source statistics by automatically selecting one of three codes to use for each block of 21 pixels. The system adapts to less frequent, but more dramatic, changes in source statistics by adjusting the mode in which the Basic Compressor operates on a line-to-line basis. Furthermore, the compression system is independent of the quantization requirements of the pulse-code modulation system.
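The per-block code-selection step can be illustrated with Rice codes, the family this line of work led to. This Python sketch is illustrative only: the actual Basic Compressor selects among three specific concatenated codes over blocks of 21 mapped prediction residuals, whereas here we simply pick, per block, the Rice parameter that minimizes the encoded length:

```python
def rice_length(v, k):
    """Bits needed to Rice-code non-negative integer v with parameter k:
    a unary quotient (v >> k, plus a stop bit) followed by k remainder bits."""
    return (v >> k) + 1 + k

def choose_code(block, options=(0, 1, 2)):
    """Adaptive selection: for one block of residuals, return the
    parameter (one of a small fixed menu) giving the fewest total bits,
    plus that bit count. The decoder needs only the chosen index."""
    costs = {k: sum(rice_length(v, k) for v in block) for k in options}
    best = min(costs, key=costs.get)
    return best, costs[best]
```

Blocks of small residuals (flat image regions) select a small parameter; busy regions select a larger one, so the coder tracks rapid changes in source statistics with no stored codebooks.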

  14. Differential expression and emerging functions of non-coding RNAs in cold adaptation.

    PubMed

    Frigault, Jacques J; Morin, Mathieu D; Morin, Pier Jr

    2017-01-01

    Several species undergo substantial physiological and biochemical changes to confront the harsh conditions associated with winter. Small mammalian hibernators and cold-hardy insects are examples of natural models of cold adaptation that have been amply explored. While the molecular picture associated with cold adaptation has started to become clearer in recent years, notably through the use of high-throughput experimental approaches, the underlying cold-associated functions attributed to several non-coding RNAs, including microRNAs (miRNAs) and long non-coding RNAs (lncRNAs), remain to be better characterized. Nevertheless, key pioneering work has provided clues on the likely relevance of these molecules in cold adaptation. With an emphasis on mammalian hibernation and insect cold hardiness, this work first reviews various molecular changes documented so far in these processes. The cascades leading to miRNA and lncRNA production as well as the mechanisms of action of these non-coding RNAs are subsequently described. Finally, we present examples of differentially expressed non-coding RNAs in models of cold adaptation and elaborate on the potential significance of this modulation with respect to low-temperature adaptation.

  15. Place field assembly distribution encodes preferred locations

    PubMed Central

    Mamad, Omar; Stumpp, Lars; McNamara, Harold M.; Ramakrishnan, Charu; Deisseroth, Karl; Reilly, Richard B.

    2017-01-01

    The hippocampus is the main locus of episodic memory formation and the neurons there encode the spatial map of the environment. Hippocampal place cells represent location, but their role in the learning of preferential location remains unclear. The hippocampus may encode locations independently from the stimuli and events that are associated with these locations. We have discovered a unique population code for the experience-dependent value of the context. The degree of reward-driven navigation preference highly correlates with the spatial distribution of the place fields recorded in the CA1 region of the hippocampus. We show place field clustering towards rewarded locations. Optogenetic manipulation of the ventral tegmental area demonstrates that the experience-dependent place field assembly distribution is directed by tegmental dopaminergic activity. The ability of the place cells to remap parallels the acquisition of reward context. Our findings present key evidence that the hippocampal neurons are not merely mapping the static environment but also store the concurrent context reward value, enabling episodic memory for past experience to support future adaptive behavior. PMID:28898248

  16. Reward value-based gain control: divisive normalization in parietal cortex.

    PubMed

    Louie, Kenway; Grattan, Lauren E; Glimcher, Paul W

    2011-07-20

    The representation of value is a critical component of decision making. Rational choice theory assumes that options are assigned absolute values, independent of the value or existence of other alternatives. However, context-dependent choice behavior in both animals and humans violates this assumption, suggesting that biological decision processes rely on comparative evaluation. Here we show that neurons in the monkey lateral intraparietal cortex encode a relative form of saccadic value, explicitly dependent on the values of the other available alternatives. Analogous to extra-classical receptive field effects in visual cortex, this relative representation incorporates target values outside the response field and is observed in both stimulus-driven activity and baseline firing rates. This context-dependent modulation is precisely described by divisive normalization, indicating that this standard form of sensory gain control may be a general mechanism of cortical computation. Such normalization in decision circuits effectively implements an adaptive gain control for value coding and provides a possible mechanistic basis for behavioral context-dependent violations of rationality.
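The divisive normalization the abstract describes corresponds to a simple ratio, R_i = V_i / (sigma + sum_j V_j): the response encoding option i is its value divided by a semi-saturation constant plus the summed value of all available options. A minimal Python sketch (the parameter values are illustrative, not the paper's fits):

```python
import numpy as np

def normalized_value(values, sigma=1.0, i=0):
    """Relative value signal for option i under divisive normalization:
    R_i = V_i / (sigma + sum_j V_j). Adding or raising alternatives
    increases the denominator and suppresses the response to any one
    target -- the context dependence described above."""
    values = np.asarray(values, dtype=float)
    return values[i] / (sigma + values.sum())
```

The same target value thus yields a smaller response when more, or more valuable, alternatives are present.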

  17. A single-rate context-dependent learning process underlies rapid adaptation to familiar object dynamics.

    PubMed

    Ingram, James N; Howard, Ian S; Flanagan, J Randall; Wolpert, Daniel M

    2011-09-01

    Motor learning has been extensively studied using dynamic (force-field) perturbations. These induce movement errors that result in adaptive changes to the motor commands. Several state-space models have been developed to explain how trial-by-trial errors drive the progressive adaptation observed in such studies. These models have been applied to adaptation involving novel dynamics, which typically occurs over tens to hundreds of trials, and which appears to be mediated by a dual-rate adaptation process. In contrast, when manipulating objects with familiar dynamics, subjects adapt rapidly within a few trials. Here, we apply state-space models to familiar dynamics, asking whether adaptation is mediated by a single-rate or dual-rate process. Previously, we reported a task in which subjects rotate an object with known dynamics. By presenting the object at different visual orientations, adaptation was shown to be context-specific, with limited generalization to novel orientations. Here we show that a multiple-context state-space model, with a generalization function tuned to visual object orientation, can reproduce the time-course of adaptation and de-adaptation as well as the observed context-dependent behavior. In contrast to the dual-rate process associated with novel dynamics, we show that a single-rate process mediates adaptation to familiar object dynamics. The model predicts that during exposure to the object across multiple orientations, there will be a degree of independence for adaptation and de-adaptation within each context, and that the states associated with all contexts will slowly de-adapt during exposure in one particular context. We confirm these predictions in two new experiments. Results of the current study thus highlight similarities and differences in the processes engaged during exposure to novel versus familiar dynamics. In both cases, adaptation is mediated by multiple context-specific representations. In the case of familiar object dynamics, however, the representations can be engaged based on visual context, and are updated by a single-rate process.
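The multiple-context, single-rate model described above can be sketched as a trial-by-trial update. In this Python illustration, each visual orientation has its own adaptive state, updated by a fraction of each trial's error and generalizing to other contexts through an orientation-tuned Gaussian; the retention and learning-rate parameters and the tuning width are invented for illustration, not the paper's fitted values:

```python
import numpy as np

def simulate(contexts, a=0.98, b=0.3, width=30.0):
    """Single-rate, multiple-context state-space sketch.

    contexts: sequence of visual orientations (degrees), one per trial.
    One adaptive state per distinct orientation; on each trial the
    state for the current context produces the compensation, and all
    states are retained by factor a and updated by b times the error,
    weighted by a Gaussian generalization function over orientation.
    """
    oris = np.array(sorted(set(contexts)), dtype=float)
    x = np.zeros(len(oris))              # adaptive state per context
    out = []
    for c in contexts:
        idx = np.argmin(np.abs(oris - c))
        out.append(x[idx])               # compensation on this trial
        err = 1.0 - x[idx]               # target output normalized to 1
        g = np.exp(-0.5 * ((oris - c) / width) ** 2)
        x = a * x + b * err * g          # single-rate update, all contexts
    return out
```

Repeated exposure in one context drives its state toward full compensation within a handful of trials, while distant orientations receive only the small tail of the generalization function, reproducing the limited transfer described above.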

  18. Starting a new conversation: Engaging Veterans with spinal cord injury in discussions of what function means to them, the barriers/facilitators they encounter, and the adaptations they use to optimize function.

    PubMed

    Hill, Jennifer N; Balbale, Salva; Lones, Keshonna; LaVela, Sherri L

    2017-01-01

    Assessments of function in persons with spinal cord injury (SCI) often utilize pre-defined constructs and measures without consideration of patient context, including how patients define function and what matters to them. We utilized photovoice to understand how individuals define function, facilitators and barriers to function, and adaptations to support functioning. Veterans with SCI were provided with cameras and guidelines to take photographs of things that: (1) help with functioning, (2) are barriers to function, and (3) represent adaptations used to support functioning. Interviews to discuss photographs followed and were audio-recorded, transcribed, and analyzed using grounded-thematic coding. Nvivo 8 was used to store and organize data. Participants (n = 9) were male (89%), Caucasian (67%), had paraplegia (75%), averaged 64 years of age, and were injured, on average, for 22 years. Function was described in several ways: the concept of 'normalcy,' aspects of daily living, and ability to be independent. Facilitators included: helpful tools, physical therapy/therapists, transportation, and caregivers. Barriers included: wheelchair-related issues and interior/exterior barriers both in the community and in the hospital. Examples of adaptations included: traditional examples like ramps, and also creative examples like the use of rubber bands on a can to help with grip. Patient-perspectives elicited in-depth information that expanded the common definition of function by highlighting the concept of "normality," facilitators and barriers to function, and adaptations to optimize function. These insights emphasize function within a patient-context, emphasizing a holistic definition of function that can be used to develop personalized, patient-driven care plans. Published by Elsevier Inc.

  19. Was Wright Right? The Canonical Genetic Code is an Empirical Example of an Adaptive Peak in Nature; Deviant Genetic Codes Evolved Using Adaptive Bridges

    PubMed Central

    2010-01-01

    The canonical genetic code is on a sub-optimal adaptive peak with respect to its ability to minimize errors, and is close to, but not quite, optimal. This is demonstrated by the near-total adjacency of synonymous codons, the similarity of adjacent codons, and comparisons of frequency of amino acid usage with number of codons in the code for each amino acid. As a rare empirical example of an adaptive peak in nature, it shows adaptive peaks are real, not merely theoretical. The evolution of deviant genetic codes illustrates how populations move from a lower to a higher adaptive peak. This is done by the use of “adaptive bridges,” neutral pathways that cross over maladaptive valleys by virtue of masking of the phenotypic expression of some maladaptive aspects in the genotype. This appears to be the general mechanism by which populations travel from one adaptive peak to another. There are multiple routes a population can follow to cross from one adaptive peak to another. These routes vary in the probability that they will be used, and this probability is determined by the number and nature of the mutations that happen along each of the routes. A modification of the depiction of adaptive landscapes showing genetic distances and probabilities of travel along their multiple possible routes would throw light on this important concept. PMID:20711776

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strauss, H.R.

    This paper describes the code FEMHD, an adaptive finite element MHD code, which is applied in a number of different ways to model MHD behavior and edge plasma phenomena in a diverted tokamak. The code uses an unstructured triangular mesh in 2D and wedge-shaped mesh elements in 3D. The code has been adapted to study neutral and charged particle dynamics in the plasma scrape-off region, and has been extended into a full MHD-particle code.

  1. Reliability associated with the Roter Interaction Analysis System (RIAS) adapted for the telemedicine context.

    PubMed

    Nelson, Eve-Lynn; Miller, Edward Alan; Larson, Kiley A

    2010-01-01

    This study's purpose was to adapt the Roter Interaction Analysis System (RIAS) for telemedicine clinics and to investigate the adapted measure's reliability. The study also sought to better understand the volume of technology-related utterance in established telemedicine clinics and the feasibility of using the measure within the telemedicine setting. This initial evaluation is a first step before broadly using the adapted measure across technologies and raters. An expert panel adapted the RIAS for the telemedicine context. This involved accounting for all consultation participants (patient, provider, presenter, family) and adding technology-specific subcategories. Ten new and 36 follow-up telemedicine encounters were videotaped and double coded using the adapted RIAS. These consisted primarily of follow-up visits (78.0%) involving patients, providers, presenters, and other parties. Reliability was calculated for those categories with 15 or more utterances. Traditional RIAS categories related to socioemotional and task-focused clusters had fair to excellent levels of reliability in the telemedicine setting. Although there were too few utterances to calculate the reliability of the specific technology-related subcategories, the summary technology-related category proved reliable for patients, providers, and presenters. Overall patterns seen in traditional patient-provider interactions were observed, with the number of provider utterances far exceeding patient, presenter, and family utterances, and few technology-specific utterances. The traditional RIAS is reliable when applied across multiple participants in the telemedicine context. Reliability of technology-related subcategories could not be evaluated; however, the aggregate technology-related cluster was found to be reliable and may be especially relevant in understanding communication patterns with patients new to the telemedicine setting. Use of the RIAS instrument is encouraged to facilitate comparison between traditional, face-to-face clinics and telemedicine; among diverse consultation mediums and technologies; and across different specialties. Future research is necessary to further investigate the reliability and validity of adding technology-related subcategories to the RIAS. The limited number of technology-related utterances, however, implies a certain degree of comfort with two-way interactive video consultation among study participants. Telemedicine continues to increase access to healthcare. The technology-related categories of the adapted RIAS were reliable when aggregated, thereby providing a tool to better understand how telemedicine affects provider-patient communication and outcomes.

  2. Emergence of Adaptive Computation by Single Neurons in the Developing Cortex

    PubMed Central

    Famulare, Michael; Gjorgjieva, Julijana; Moody, William J.

    2013-01-01

    Adaptation is a fundamental computational motif in neural processing. To maintain stable perception in the face of rapidly shifting input, neural systems must extract relevant information from background fluctuations under many different contexts. Many neural systems are able to adjust their input–output properties such that an input's ability to trigger a response depends on the size of that input relative to its local statistical context. This “gain-scaling” strategy has been shown to be an efficient coding strategy. We report here that this property emerges during early development as an intrinsic property of single neurons in mouse sensorimotor cortex, coinciding with the disappearance of spontaneous waves of network activity, and can be modulated by changing the balance of spike-generating currents. Simultaneously, developing neurons move toward a common intrinsic operating point and a stable ratio of spike-generating currents. This developmental trajectory occurs in the absence of sensory input or spontaneous network activity. Through a combination of electrophysiology and modeling, we demonstrate that developing cortical neurons develop the ability to perform nearly perfect gain scaling by virtue of the maturing spike-generating currents alone. We use reduced single neuron models to identify the conditions for this property to hold. PMID:23884925
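The gain-scaling property described above, that an input's effect depends on its size relative to its local statistical context, can be illustrated with a toy unit. In this Python sketch the logistic nonlinearity and its parameters are stand-ins for the spike-generating mechanism, not the paper's biophysical model; the point is only that dividing by the local standard deviation makes the input-output curve invariant to rescaling the whole stimulus distribution:

```python
import numpy as np

def gain_scaled_response(inputs, gain=5.0):
    """Toy gain-scaling unit: inputs are first expressed in units of
    the local standard deviation of the stimulus, then passed through
    a fixed logistic nonlinearity. Because only the ratio input/sd
    enters the nonlinearity, stretching the entire input distribution
    by any factor leaves the responses unchanged (perfect gain scaling)."""
    inputs = np.asarray(inputs, dtype=float)
    sd = inputs.std()
    z = inputs / sd                      # input relative to local context
    return 1.0 / (1.0 + np.exp(-gain * (z - 1.0)))
```

A neuron that only approximates this computation would show residual dependence on absolute scale; the abstract's claim is that maturing spike-generating currents bring cortical neurons close to the invariant case.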

  3. Quality Scalability Aware Watermarking for Visual Content.

    PubMed

    Bhowmik, Deepayan; Abhayaratne, Charith

    2016-11-01

    Scalable coding-based content adaptation poses serious challenges to traditional watermarking algorithms, which do not consider the scalable coding structure and hence cannot guarantee correct watermark extraction in the media consumption chain. In this paper, we propose a novel concept of scalable blind watermarking that ensures more robust watermark extraction at various compression ratios while not affecting the visual quality of the host media. The proposed algorithm generates a scalable and robust watermarked image code-stream that allows the user to constrain embedding distortion for target content adaptations. The watermarked image code-stream consists of hierarchically nested joint distortion-robustness coding atoms. The code-stream is generated by a new wavelet domain blind watermarking algorithm guided by a quantization-based binary tree. The code-stream can be truncated at any distortion-robustness atom to generate the watermarked image with the desired distortion-robustness requirements. A blind extractor is capable of extracting watermark data from the watermarked images. The algorithm is further extended to incorporate a bit-plane discarding-based quantization model used in scalable coding-based content adaptation, e.g., JPEG2000. This improves the robustness against quality scalability of JPEG2000 compression. The simulation results verify the feasibility of the proposed concept, its applications, and its improved robustness against quality-scalable content adaptation. Our proposed algorithm also outperforms existing methods, showing a 35% improvement. In terms of robustness to quality-scalable video content adaptation using Motion JPEG2000 and wavelet-based scalable video coding, the proposed method shows major improvement for video watermarking.

  4. Star adaptation for two algorithms used on serial computers

    NASA Technical Reports Server (NTRS)

    Howser, L. M.; Lambiotte, J. J., Jr.

    1974-01-01

    Two representative algorithms used on a serial computer and presently executed on the Control Data Corporation 6000 computer were adapted to execute efficiently on the Control Data STAR-100 computer. Gaussian elimination for the solution of simultaneous linear equations and the Gauss-Legendre quadrature formula for the approximation of an integral are the two algorithms discussed. A description is given of how the programs were adapted for STAR and why these adaptations were necessary to obtain an efficient STAR program. Some points to consider when adapting an algorithm for STAR are discussed. Program listings of the 6000 version coded in 6000 FORTRAN, the adapted STAR version coded in 6000 FORTRAN, and the STAR version coded in STAR FORTRAN are presented in the appendices.

  5. Mapping to Irregular Torus Topologies and Other Techniques for Petascale Biomolecular Simulation

    PubMed Central

    Phillips, James C.; Sun, Yanhua; Jain, Nikhil; Bohm, Eric J.; Kalé, Laxmikant V.

    2014-01-01

    Currently deployed petascale supercomputers typically use toroidal network topologies in three or more dimensions. While these networks perform well for topology-agnostic codes on a few thousand nodes, leadership machines with 20,000 nodes require topology awareness to avoid network contention for communication-intensive codes. Topology adaptation is complicated by irregular node allocation shapes and holes due to dedicated input/output nodes or hardware failure. In the context of the popular molecular dynamics program NAMD, we present methods for mapping a periodic 3-D grid of fixed-size spatial decomposition domains to 3-D Cray Gemini and 5-D IBM Blue Gene/Q toroidal networks to enable hundred-million atom full machine simulations, and to similarly partition node allocations into compact domains for smaller simulations using multiple-copy algorithms. Additional enabling techniques are discussed and performance is reported for NCSA Blue Waters, ORNL Titan, ANL Mira, TACC Stampede, and NERSC Edison. PMID:25594075

  6. Adaptive format conversion for scalable video coding

    NASA Astrophysics Data System (ADS)

    Wan, Wade K.; Lim, Jae S.

    2001-12-01

    The enhancement layer in many scalable coding algorithms is composed of residual coding information. There is another type of information that can be transmitted instead of (or in addition to) residual coding. Since the encoder has access to the original sequence, it can utilize adaptive format conversion (AFC) to generate the enhancement layer and transmit the different format conversion methods as enhancement data. This paper investigates the use of adaptive format conversion information as enhancement data in scalable video coding. Experimental results are shown for a wide range of base layer qualities and enhancement bitrates to determine when AFC can improve video scalability. Since the parameters needed for AFC are small compared to residual coding, AFC can provide video scalability at low enhancement layer bitrates that are not possible with residual coding. In addition, AFC can be used alongside residual coding to improve video scalability at higher enhancement layer bitrates. Adaptive format conversion has not been studied in detail, but many scalable applications may benefit from it. An example of an application for which AFC is well suited is the migration path for digital television, where AFC can provide immediate video scalability as well as assist future migrations.
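The idea of transmitting conversion choices instead of residuals can be sketched directly. In this Python illustration (the candidate-method names and the mean-squared-error selection criterion are assumptions of the sketch, not the paper's method set), the encoder, which holds the original, tries each candidate format conversion on the base-layer picture and signals only the index of the winner; the decoder reproduces the conversion locally:

```python
import numpy as np

def afc_choose(original, base_layer, methods):
    """Pick the format-conversion method that best reconstructs the
    original from the base layer. 'methods' maps a name to a function
    applied to the base-layer signal; only the chosen name (an index
    in a real bitstream) needs to be transmitted as enhancement data,
    which is far cheaper than coding a residual."""
    best_name, best_err = None, np.inf
    for name, f in methods.items():
        err = np.mean((f(base_layer) - original) ** 2)
        if err < best_err:
            best_name, best_err = name, err
    return best_name, best_err
```

Because the enhancement data is a method index per region rather than a coefficient residual, this works even at enhancement bitrates too low for residual coding, as the abstract notes.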

  7. Awareness Becomes Necessary Between Adaptive Pattern Coding of Open and Closed Curvatures

    PubMed Central

    Sweeny, Timothy D.; Grabowecky, Marcia; Suzuki, Satoru

    2012-01-01

    Visual pattern processing becomes increasingly complex along the ventral pathway, from the low-level coding of local orientation in the primary visual cortex to the high-level coding of face identity in temporal visual areas. Previous research using pattern aftereffects as a psychophysical tool to measure activation of adaptive feature coding has suggested that awareness is relatively unimportant for the coding of orientation, but awareness is crucial for the coding of face identity. We investigated where along the ventral visual pathway awareness becomes crucial for pattern coding. Monoptic masking, which interferes with neural spiking activity in low-level processing while preserving awareness of the adaptor, eliminated open-curvature aftereffects but preserved closed-curvature aftereffects. In contrast, dichoptic masking, which spares spiking activity in low-level processing while wiping out awareness, preserved open-curvature aftereffects but eliminated closed-curvature aftereffects. This double dissociation suggests that adaptive coding of open and closed curvatures straddles the divide between weakly and strongly awareness-dependent pattern coding. PMID:21690314

  8. Visual coding of human bodies: perceptual aftereffects reveal norm-based, opponent coding of body identity.

    PubMed

    Rhodes, Gillian; Jeffery, Linda; Boeing, Alexandra; Calder, Andrew J

    2013-04-01

    Despite the discovery of body-selective neural areas in occipitotemporal cortex, little is known about how bodies are visually coded. We used perceptual adaptation to determine how body identity is coded. Brief exposure to a body (e.g., anti-Rose) biased perception toward an identity with opposite properties (Rose). Moreover, the size of this aftereffect increased with adaptor extremity, as predicted by norm-based, opponent coding of body identity. A size change between adapt and test bodies minimized the effects of low-level, retinotopic adaptation. These results demonstrate that body identity, like face identity, is opponent coded in higher-level vision. More generally, they show that a norm-based multidimensional framework, which is well established for face perception, may provide a powerful framework for understanding body perception.

  9. Enhanced attention amplifies face adaptation.

    PubMed

    Rhodes, Gillian; Jeffery, Linda; Evangelista, Emma; Ewing, Louise; Peters, Marianne; Taylor, Libby

    2011-08-15

    Perceptual adaptation not only produces striking perceptual aftereffects, but also enhances coding efficiency and discrimination by calibrating coding mechanisms to prevailing inputs. Attention to simple stimuli increases adaptation, potentially enhancing its functional benefits. Here we show that attention also increases adaptation to faces. In Experiment 1, face identity aftereffects increased when attention to adapting faces was increased using a change detection task. In Experiment 2, figural (distortion) face aftereffects increased when attention was increased using a snap game (detecting immediate repeats) during adaptation. Both were large effects. Contributions of low-level adaptation were reduced using free viewing (both experiments) and a size change between adapt and test faces (Experiment 2). We suggest that attention may enhance adaptation throughout the entire cortical visual pathway, with functional benefits well beyond the immediate advantages of selective processing of potentially important stimuli. These results highlight the potential to facilitate adaptive updating of face-coding mechanisms by strategic deployment of attentional resources. Copyright © 2011 Elsevier Ltd. All rights reserved.

  10. Context-specific adaptation of the gain of the oculomotor response to lateral translation using roll and pitch head tilts as contexts

    NASA Technical Reports Server (NTRS)

    Shelhamer, Mark; Peng, Grace C Y.; Ramat, Stefano; Patel, Vivek

    2002-01-01

Previous studies established that vestibular and oculomotor behaviors can have two adapted states (e.g., gain) simultaneously, and that a context cue (e.g., vertical eye position) can switch between the two states. The present study examined this phenomenon of context-specific adaptation for the oculomotor response to interaural translation (which we term "linear vestibulo-ocular reflex" or LVOR even though it may have extravestibular components). Subjects sat upright on a linear sled and were translated at 0.7 Hz and 0.3 g peak acceleration while a visual-vestibular mismatch paradigm was used to adaptively increase (x2) or decrease (x0) the gain of the LVOR. In each experimental session, gain increase was asked for in one context, and gain decrease in another context. Testing in darkness with steps and sines before and after adaptation, in each context, assessed the extent to which the context itself could recall the gain state that was imposed in that context during adaptation. Two different contexts were used: head pitch (26 degrees forward and backward) and head roll (26 degrees or 45 degrees, right and left). Head roll tilt worked well as a context cue: with the head rolled to the right the LVOR could be made to have a higher gain than with the head rolled to the left. Head pitch tilt was less effective as a context cue. This suggests that the more closely related a context cue is to the response being adapted, the more effective it is.

  11. A novel bit-wise adaptable entropy coding technique

    NASA Technical Reports Server (NTRS)

    Kiely, A.; Klimesh, M.

    2001-01-01

We present a novel entropy coding technique which is adaptable in that each bit to be encoded may have an associated probability estimate which depends on previously encoded bits. The technique may have advantages over arithmetic coding. The technique can achieve arbitrarily small redundancy and admits a simple and fast decoder.
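
    The abstract does not describe the technique's internals, so the following is only a generic illustration of the bit-wise adaptive setup it assumes: each bit carries a probability estimate derived from previously coded bits. Here a toy exact-arithmetic coder (a stand-in, not the authors' method) is driven by a Laplace-smoothed adaptive model; since encoder and decoder update identical counts, no probabilities are transmitted:

```python
import math
from fractions import Fraction

class AdaptiveBitModel:
    """Probability estimate for the next bit, adapted from bits seen so far."""
    def __init__(self):
        self.counts = [1, 1]                  # Laplace smoothing
    def p0(self):
        return Fraction(self.counts[0], self.counts[0] + self.counts[1])
    def update(self, bit):
        self.counts[bit] += 1

def encode(bits):
    """Shrink [0,1) around the input, splitting at the adaptive estimate."""
    low, high = Fraction(0), Fraction(1)
    model = AdaptiveBitModel()
    for b in bits:
        split = low + (high - low) * model.p0()
        low, high = (low, split) if b == 0 else (split, high)
        model.update(b)
    # emit the shortest binary fraction m / 2**k inside [low, high)
    k = 0
    while True:
        m = math.ceil(low * 2 ** k)
        if Fraction(m, 2 ** k) < high:
            return [(m >> (k - 1 - i)) & 1 for i in range(k)]
        k += 1

def decode(code, n):
    """Replay the same adaptive splits to recover n bits from the code value."""
    x = sum(Fraction(b, 2 ** (i + 1)) for i, b in enumerate(code))
    low, high = Fraction(0), Fraction(1)
    model = AdaptiveBitModel()
    out = []
    for _ in range(n):
        split = low + (high - low) * model.p0()
        b = 0 if x < split else 1
        low, high = (low, split) if b == 0 else (split, high)
        model.update(b)
        out.append(b)
    return out
```

    On a heavily skewed input the adaptive estimate drives the code length well below the raw bit count, which is the behavior any such scheme, arithmetic or otherwise, must deliver.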

  12. Flowfield computer graphics

    NASA Technical Reports Server (NTRS)

    Desautel, Richard

    1993-01-01

    The objectives of this research include supporting the Aerothermodynamics Branch's research by developing graphical visualization tools for both the branch's adaptive grid code and flow field ray tracing code. The completed research for the reporting period includes development of a graphical user interface (GUI) and its implementation into the NAS Flowfield Analysis Software Tool kit (FAST), for both the adaptive grid code (SAGE) and the flow field ray tracing code (CISS).

  13. A novel, implicit treatment for language comprehension processes in right hemisphere brain damage: Phase I data

    PubMed Central

    Tompkins, Connie A.; Blake, Margaret T.; Wambaugh, Julie; Meigh, Kimberly

    2012-01-01

    Background This manuscript reports the initial phase of testing for a novel, “Contextual constraint” treatment, designed to stimulate inefficient language comprehension processes in adults with right hemisphere brain damage (RHD). Two versions of treatment were developed to target two normal comprehension processes that have broad relevance for discourse comprehension and that are often disrupted by RHD: coarse semantic coding and suppression. The development of the treatment was informed by two well-documented strengths of the RHD population. The first is consistently better performance on assessments that are implicit, or nearly so, than on explicit, metalinguistic measures of language and cognitive processing. The second is improved performance when given linguistic context that moderately-to-strongly biases an intended meaning. Treatment consisted of providing brief context sentences to prestimulate, or constrain, intended interpretations. Participants made no explicit associations or judgments about the constraint sentences; rather, these contexts served only as implicit primes. Aims This Phase I treatment study aimed to determine the effects of a novel, implicit, Contextual Constraint treatment in adults with RHD whose coarse coding or suppression processes were inefficient. Treatment was hypothesized to speed coarse coding or suppression function in these individuals. Methods & Procedures Three adults with RHD participated in this study, one (P1) with a coarse coding deficit and two (P2, P3) with suppression deficits. Probe tasks were adapted from prior studies of coarse coding and suppression in RHD. The dependent measure was the percentage of responses that met predetermined response time criteria. When pre-treatment baseline performance was stable, treatment was initiated. There were two levels of contextual constraint, Strong and Moderate, and treatment for each item began with the provision of the Strong constraint context. 
Outcomes & Results Treatment-contingent gains were evident after brief periods of treatment, for P1 on two treatment lists, and for P2. P3 made slower but still substantial gains. Maintenance of gains was evident for P1, the only participant for whom it was measured. Conclusions This Phase I treatment study documents the potential for considerable gains from an implicit, Contextual constraint treatment. If replicated, this approach to treatment may hold promise for individuals who do poorly with effortful, metalinguistic treatment tasks, or for whom it is desirable to minimize errors during treatment. The real test of this treatment’s benefit will come from later-phase studies, which will test broad-based generalization to various aspects of discourse comprehension. PMID:22368317

  14. How Attending Physician Preceptors Negotiate Their Complex Work Environment: A Collective Ethnography.

    PubMed

    Lemaire, Jane B; Wallace, Jean E; Sargious, Peter M; Bacchus, Maria; Zarnke, Kelly; Ward, David R; Ghali, William A

    2017-12-01

    To generate an empiric, detailed, and updated view of the attending physician preceptor role and its interface with the complex work environment. In 2013, the authors conducted a modified collective ethnography with observations of internal medicine medical teaching unit preceptors from two university hospitals in Canada. Eleven observers conducted 32 observations (99.5 hours) of 26 preceptors (30 observations [93.5 hours] of 24 preceptors were included in the analysis). An inductive thematic approach was used to analyze the data with further axial coding to identify connections between themes. Four individuals coded the main data set; differences were addressed through discussion to achieve consensus. Three elements or major themes of the preceptor role were identified: (1) competence or the execution of traditional physician competencies, (2) context or the extended medical teaching unit environment, and (3) conduct or the manner of acting or behaviors and attitudes in the role. Multiple connections between the elements emerged. The preceptor role appeared to depend on the execution of professional skills (competence) but also was vulnerable to contextual factors (context) independent of these skills, many of which were unpredictable. This vulnerability appeared to be tempered by preceptors' use of adaptive behaviors and attitudes (conduct), such as creativity, interpersonal skills, and wellness behaviors. Preceptors not only possess traditional competencies but also enlist additional behaviors and attitudes to deal with context-driven tensions and to negotiate their complex work environment. These skills could be incorporated into role training, orientation, and mentorship.

  15. The Complex Nature of Bilinguals' Language Usage Modulates Task-Switching Outcomes

    PubMed Central

    Yang, Hwajin; Hartanto, Andree; Yang, Sujin

    2016-01-01

In view of inconsistent findings regarding bilingual advantages in executive functions (EF), we reviewed the literature to determine whether bilinguals' different language usage causes measurable changes in the shifting aspects of EF. By drawing on the theoretical framework of the adaptive control hypothesis, which postulates a critical link between bilinguals' varying demands on language control and adaptive cognitive control (Green and Abutalebi, 2013), we examined three factors that characterize bilinguals' language-switching experience: (a) the interactional context of conversational exchanges, (b) frequency of language switching, and (c) typology of code-switching. We also examined whether methodological variations in previous task-switching studies modulate task-specific demands on control processing and lead to inconsistencies in the literature. Our review demonstrates that not only methodological rigor but also a more finely grained, theory-based approach will be required to understand the cognitive consequences of bilinguals' varied linguistic practices in shifting EF. PMID:27199800

  16. Rate adaptive multilevel coded modulation with high coding gain in intensity modulation direct detection optical communication

    NASA Astrophysics Data System (ADS)

    Xiao, Fei; Liu, Bo; Zhang, Lijia; Xin, Xiangjun; Zhang, Qi; Tian, Qinghua; Tian, Feng; Wang, Yongjun; Rao, Lan; Ullah, Rahat; Zhao, Feng; Li, Deng'ao

    2018-02-01

A rate-adaptive multilevel coded modulation (RA-MLC) scheme based on fixed code length, together with a corresponding decoding scheme, is proposed. The RA-MLC scheme combines multilevel coding and modulation with a binary linear block code at the transmitter. Bit division, coding, optional interleaving, and modulation are carried out according to a preset rule; the signal is then transmitted through a standard single-mode fiber span of 100 km. The receiver improves decoding accuracy by passing soft information between the different layers, which enhances performance. Simulations are carried out for an intensity modulation-direct detection optical communication system using MATLAB®. Results show that the RA-MLC scheme can achieve a bit error rate of 1E-5 when the optical signal-to-noise ratio is 20.7 dB. It also reduced the number of decoders by 72% and realized 22-rate adaptation without significantly increasing the computing time. The coding gain is increased by 7.3 dB at BER=1E-3.

  17. Local statistics adaptive entropy coding method for the improvement of H.26L VLC coding

    NASA Astrophysics Data System (ADS)

    Yoo, Kook-yeol; Kim, Jong D.; Choi, Byung-Sun; Lee, Yung Lyul

    2000-05-01

In this paper, we propose an adaptive entropy coding method to improve the VLC coding efficiency of the H.26L TML-1 codec. First, we show that the VLC coding presented in TML-1 does not satisfy the sibling property of entropy coding. We then modify the coding method into a local-statistics-adaptive one that satisfies the property. The proposed method, based on local symbol statistics, dynamically changes the mapping relationship between symbols and bit patterns in the VLC table according to the sibling property. Note that the codewords in the VLC table of the TML-1 codec are not changed. Since the changed mapping relationship is also derived at the decoder side from the decoded symbols, the proposed VLC coding method does not require any overhead information. The simulation results show that the proposed method gives about 30% and 37% reductions in average bit rate for MB type and CBP information, respectively.
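
    The general idea, keeping a fixed codeword set but adaptively remapping symbols to it by local frequency, can be sketched as follows. This is an illustration of the principle, not the exact TML-1 modification; the codeword set is a hypothetical example. Because the decoder performs the identical count updates from decoded symbols, the remapping carries no side information:

```python
CODEWORDS = ["0", "10", "110", "111"]       # fixed, prefix-free; never changed

class AdaptiveVLC:
    """Remap symbols to fixed codewords so frequent symbols get short codes."""
    def __init__(self, symbols):
        self.counts = {s: 0 for s in symbols}
        self.order = list(symbols)           # rank -> symbol

    def _update(self, sym):
        self.counts[sym] += 1
        # stable sort: encoder and decoder evolve identical rankings
        self.order.sort(key=lambda s: -self.counts[s])

    def encode(self, seq):
        bits = ""
        for s in seq:
            bits += CODEWORDS[self.order.index(s)]
            self._update(s)
        return bits

    def decode(self, bits, n):
        out, i = [], 0
        for _ in range(n):
            rank = next(r for r, cw in enumerate(CODEWORDS)
                        if bits.startswith(cw, i))
            sym = self.order[rank]
            i += len(CODEWORDS[rank])
            out.append(sym)
            self._update(sym)
        return out
```

    After a run of one symbol, that symbol occupies rank 0 and costs a single bit, which is the mechanism by which local statistics shorten the bitstream without any table retransmission.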

  18. A domain specific language for performance portable molecular dynamics algorithms

    NASA Astrophysics Data System (ADS)

    Saunders, William Robert; Grant, James; Müller, Eike Hermann

    2018-03-01

    Developers of Molecular Dynamics (MD) codes face significant challenges when adapting existing simulation packages to new hardware. In a continuously diversifying hardware landscape it becomes increasingly difficult for scientists to be experts both in their own domain (physics/chemistry/biology) and specialists in the low level parallelisation and optimisation of their codes. To address this challenge, we describe a "Separation of Concerns" approach for the development of parallel and optimised MD codes: the science specialist writes code at a high abstraction level in a domain specific language (DSL), which is then translated into efficient computer code by a scientific programmer. In a related context, an abstraction for the solution of partial differential equations with grid based methods has recently been implemented in the (Py)OP2 library. Inspired by this approach, we develop a Python code generation system for molecular dynamics simulations on different parallel architectures, including massively parallel distributed memory systems and GPUs. We demonstrate the efficiency of the auto-generated code by studying its performance and scalability on different hardware and compare it to other state-of-the-art simulation packages. With growing data volumes the extraction of physically meaningful information from the simulation becomes increasingly challenging and requires equally efficient implementations. A particular advantage of our approach is the easy expression of such analysis algorithms. We consider two popular methods for deducing the crystalline structure of a material from the local environment of each atom, show how they can be expressed in our abstraction and implement them in the code generation framework.
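
    The separation-of-concerns idea can be sketched in miniature: the science specialist writes only a per-pair kernel as source text, and a generator wraps it in the complete loop structure. This is a toy illustration under assumed names (KERNEL, generate_pair_loop), not the paper's framework; a real system would emit OpenMP, MPI, or GPU variants of the same loop from the same kernel:

```python
# Per-pair "science" kernel: squared distance between particles i and j,
# accumulated into acc[0].  The domain scientist writes only this text.
KERNEL = """
r2 = (xi[0]-xj[0])**2 + (xi[1]-xj[1])**2 + (xi[2]-xj[2])**2
acc[0] += r2
"""

def generate_pair_loop(kernel_src, name="pair_loop"):
    """Specialize a per-pair kernel into a complete all-pairs loop.

    The loop structure, and in a real DSL its parallelisation, is the
    scientific programmer's concern; the kernel never mentions it.
    """
    indented = "\n".join("            " + ln
                         for ln in kernel_src.strip().splitlines())
    src = f"""
def {name}(positions):
    acc = [0.0]
    n = len(positions)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            xi, xj = positions[i], positions[j]
{indented}
    return acc[0]
"""
    env = {}
    exec(src, env)                            # compile the generated code
    return env[name]
```

    Analysis algorithms of the kind mentioned above (e.g., local-environment classifiers) would be expressed the same way: as a short per-pair or per-particle kernel handed to the generator.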

  19. Visualization of Octree Adaptive Mesh Refinement (AMR) in Astrophysical Simulations

    NASA Astrophysics Data System (ADS)

    Labadens, M.; Chapon, D.; Pomaréde, D.; Teyssier, R.

    2012-09-01

Computer simulations are important in current cosmological research. Those simulations run in parallel on thousands of processors and produce huge amounts of data. Adaptive mesh refinement is used to reduce the computing cost while keeping good numerical accuracy in regions of interest. RAMSES is a cosmological code developed by the Commissariat à l'énergie atomique et aux énergies alternatives (English: Atomic Energy and Alternative Energies Commission) which uses octree adaptive mesh refinement. Compared to grid-based AMR, octree AMR has the advantage of fitting the adaptive resolution of the grid very precisely to the local problem complexity. However, this specific octree data type needs dedicated software to be visualized, as generic visualization tools work on Cartesian grid data. This is why the PYMSES software has also been developed by our team. It relies on the Python scripting language to ensure modular and easy access for exploring those specific data. In order to take advantage of the high-performance computer which runs the RAMSES simulation, it also uses MPI and multiprocessing to run some parallel code. We present our PYMSES software in more detail, with some performance benchmarks. PYMSES currently has two visualization techniques which work directly on the AMR. The first is a splatting technique, and the second is a custom ray-tracing technique. Both have their own advantages and drawbacks. We have also compared two parallel programming techniques: the Python multiprocessing library versus MPI runs. The load-balancing strategy has to be smartly defined in order to achieve a good speedup in our computation. Results obtained with this software are illustrated in the context of a massive, 9000-processor parallel simulation of a Milky Way-like galaxy.

  20. Lightweight Adaptation of Classifiers to Users and Contexts: Trends of the Emerging Domain

    PubMed Central

    Vildjiounaite, Elena; Gimel'farb, Georgy; Kyllönen, Vesa; Peltola, Johannes

    2015-01-01

Intelligent computer applications need to adapt their behaviour to contexts and users, but conventional classifier adaptation methods require long data collection and/or training times. Therefore classifier adaptation is often performed as follows: at design time application developers define typical usage contexts and provide reasoning models for each of these contexts, and then at runtime an appropriate model is selected from the available ones. Typically, the definition of usage contexts and reasoning models relies heavily on domain knowledge. However, in practice many applications are used in such diverse situations that no developer can predict them all and collect adequate training and test databases for each situation. Such applications have to adapt to a new user or unknown context at runtime just from interaction with the user, preferably in fairly lightweight ways, that is, requiring limited user effort to collect training data and limited time to perform the adaptation. This paper analyses adaptation trends in several emerging domains and outlines promising ideas, proposed for making multimodal classifiers user-specific and context-specific without significant user effort, detailed domain knowledge, and/or complete retraining of the classifiers. Based on this analysis, this paper identifies important application characteristics and presents guidelines to consider these characteristics in adaptation design. PMID:26473165

  1. Adaptive Transmission and Channel Modeling for Frequency Hopping Communications

    DTIC Science & Technology

    2009-09-21

proposed adaptive transmission method has much greater system capacity than conventional non-adaptive MC direct-sequence (DS)-CDMA system. • We...several mobile radio systems. First, a new improved allocation algorithm was proposed for multicarrier code-division multiple access (MC-CDMA) system...Multicarrier code-division multiple access (MC-CDMA) system with adaptive frequency hopping (AFH) has attracted attention of researchers due to its

  2. Prism adaptation in virtual and natural contexts: Evidence for a flexible adaptive process.

    PubMed

    Veilleux, Louis-Nicolas; Proteau, Luc

    2015-01-01

Prism exposure when aiming at a visual target in a virtual condition (e.g., when the hand is represented by a video representation) produces no or only small adaptations (after-effects), whereas prism exposure in a natural condition produces large after-effects. Some researchers suggested that this difference may arise from distinct adaptive processes, but other studies suggested a unique process. The present study reconciled these conflicting interpretations. Forty participants were divided into two groups: one group used visual feedback of their hand (natural context), and the other used computer-generated representational feedback (virtual context). Visual feedback during adaptation was concurrent or terminal. All participants underwent a laterally displacing prism perturbation. The results showed that the after-effects were twice as large in the "natural context" as in the "virtual context". No significant differences were observed between the concurrent and terminal feedback conditions. The after-effects generalized to untested targets and workspace. These results suggest that prism adaptation in virtual and natural contexts involves the same process. The smaller after-effects in the virtual context suggest that the depth of adaptation is a function of the degree of convergence between the proprioceptive and visual information that arises from the hand.

  3. Implementing Culture Change in Nursing Homes: An Adaptive Leadership Framework.

    PubMed

    Corazzini, Kirsten; Twersky, Jack; White, Heidi K; Buhr, Gwendolen T; McConnell, Eleanor S; Weiner, Madeline; Colón-Emeric, Cathleen S

    2015-08-01

    To describe key adaptive challenges and leadership behaviors to implement culture change for person-directed care. The study design was a qualitative, observational study of nursing home staff perceptions of the implementation of culture change in each of 3 nursing homes. We conducted 7 focus groups of licensed and unlicensed nursing staff, medical care providers, and administrators. Questions explored perceptions of facilitators and barriers to culture change. Using a template organizing style of analysis with immersion/crystallization, themes of barriers and facilitators were coded for adaptive challenges and leadership. Six key themes emerged, including relationships, standards and expectations, motivation and vision, workload, respect of personhood, and physical environment. Within each theme, participants identified barriers that were adaptive challenges and facilitators that were examples of adaptive leadership. Commonly identified challenges were how to provide person-directed care in the context of extant rules or policies or how to develop staff motivated to provide person-directed care. Implementing culture change requires the recognition of adaptive challenges for which there are no technical solutions, but which require reframing of norms and expectations, and the development of novel and flexible solutions. Managers and administrators seeking to implement person-directed care will need to consider the role of adaptive leadership to address these adaptive challenges. © The Author 2014. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  4. Low complexity Reed-Solomon-based low-density parity-check design for software defined optical transmission system based on adaptive puncturing decoding algorithm

    NASA Astrophysics Data System (ADS)

    Pan, Xiaolong; Liu, Bo; Zheng, Jianglong; Tian, Qinghua

    2016-08-01

We propose and demonstrate a low-complexity Reed-Solomon-based low-density parity-check (RS-LDPC) code with an adaptive puncturing decoding algorithm for elastic optical transmission systems. Part of the received code and the relevant columns in the parity-check matrix can be punctured to reduce the calculation complexity, via an adaptive parity-check matrix, during the decoding process. The results show that the complexity of the proposed decoding algorithm is reduced by 30% compared with the regular RS-LDPC system. The optimized code rate of the RS-LDPC code can be obtained after five iterations.
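
    As a schematic sketch of the bookkeeping involved (not the authors' algorithm, and omitting how a real decoder would treat punctured bits as erasures), dropping a set of code positions together with the matching parity-check columns shrinks the matrix the decoder iterates over:

```python
def puncture(H, received, drop):
    """Remove punctured code positions and the corresponding columns of the
    parity-check matrix H, reducing per-iteration decoding work."""
    keep = [j for j in range(len(received)) if j not in drop]
    H_p = [[row[j] for j in keep] for row in H]
    r_p = [received[j] for j in keep]
    return H_p, r_p

def syndrome(H, word):
    """Parity checks over GF(2); an all-zero syndrome means word satisfies H."""
    return [sum(h * w for h, w in zip(row, word)) % 2 for row in H]
```

    The adaptive element in the scheme above is choosing which positions to drop at decode time; the matrix surgery itself is as simple as shown here.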

  5. Pragmatic turn in biology: From biological molecules to genetic content operators.

    PubMed

    Witzany, Guenther

    2014-08-26

Erwin Schrödinger's question "What is life?" was answered for decades with "physics + chemistry". The concepts of Alan Turing and John von Neumann introduced a third term: "information". This led to the understanding of nucleic acid sequences as a natural code. Manfred Eigen adapted the concept of Hamming's "sequence space". Similar to Hilbert space, in which every ontological entity could be defined by an unequivocal point in a mathematical axiomatic system, in the abstract "sequence space" concept each point represents a unique syntactic structure and the value of their separation represents their dissimilarity. In this concept molecular features of the genetic code evolve by means of self-organisation of matter. Biological selection determines the fittest types among varieties of replication errors of quasi-species. The quasi-species concept dominated evolution theory for many decades. In contrast to this, recent empirical data on the evolution of DNA and its forerunners, the RNA-world and viruses, indicate cooperative agent-based interactions. Group behaviour of quasi-species consortia constitutes de novo and arranges available genetic content for adaptational purposes within real-life contexts that determine epigenetic markings. This review focuses on some fundamental changes in biology, discarding its traditional status as a subdiscipline of physics and chemistry.

  6. Constrained-transport Magnetohydrodynamics with Adaptive Mesh Refinement in CHARM

    NASA Astrophysics Data System (ADS)

    Miniati, Francesco; Martin, Daniel F.

    2011-07-01

    We present the implementation of a three-dimensional, second-order accurate Godunov-type algorithm for magnetohydrodynamics (MHD) in the adaptive-mesh-refinement (AMR) cosmological code CHARM. The algorithm is based on the full 12-solve spatially unsplit corner-transport-upwind (CTU) scheme. The fluid quantities are cell-centered and are updated using the piecewise-parabolic method (PPM), while the magnetic field variables are face-centered and are evolved through application of the Stokes theorem on cell edges via a constrained-transport (CT) method. The so-called multidimensional MHD source terms required in the predictor step for high-order accuracy are applied in a simplified form which reduces their complexity in three dimensions without loss of accuracy or robustness. The algorithm is implemented on an AMR framework which requires specific synchronization steps across refinement levels. These include face-centered restriction and prolongation operations and a reflux-curl operation, which maintains a solenoidal magnetic field across refinement boundaries. The code is tested against a large suite of test problems, including convergence tests in smooth flows, shock-tube tests, classical two- and three-dimensional MHD tests, a three-dimensional shock-cloud interaction problem, and the formation of a cluster of galaxies in a fully cosmological context. The magnetic field divergence is shown to remain negligible throughout.
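
    The defining property behind the constrained-transport update and the reflux-curl synchronization above is that a face-centered field constructed as a discrete curl has identically zero discrete divergence, so the telescoping cancellation holds to machine precision. A minimal 2-D check of that identity (unit grid spacing; an illustration, not the CHARM implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 8
Az = rng.random((N + 1, N + 1))          # vector potential at cell corners

# Face-centered field as a discrete curl (Stokes theorem on cell edges):
Bx = Az[:, 1:] - Az[:, :-1]              # Bx = dAz/dy, on x-faces
By = -(Az[1:, :] - Az[:-1, :])           # By = -dAz/dx, on y-faces

# The discrete divergence in each cell telescopes to zero:
div = (Bx[1:, :] - Bx[:-1, :]) + (By[:, 1:] - By[:, :-1])
max_div = np.max(np.abs(div))            # ~1e-16, i.e. machine precision
```

    An AMR scheme must preserve exactly this cancellation when fluxes and edge electric fields are averaged across refinement boundaries, which is what the reflux-curl operation accomplishes.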

  7. PS1-41: Just Add Data: Implementing an Event-Based Data Model for Clinical Trial Tracking

    PubMed Central

    Fuller, Sharon; Carrell, David; Pardee, Roy

    2012-01-01

    Background/Aims Clinical research trials often have similar fundamental tracking needs, despite being quite variable in their specific logic and activities. A model tracking database that can be quickly adapted by a variety of studies has the potential to achieve significant efficiencies in database development and maintenance. Methods Over the course of several different clinical trials, we have developed a database model that is highly adaptable to a variety of projects. Rather than hard-coding each specific event that might occur in a trial, along with its logical consequences, this model considers each event and its parameters to be a data record in its own right. Each event may have related variables (metadata) describing its prerequisites, subsequent events due, associated mailings, or events that it overrides. The metadata for each event is stored in the same record with the event name. When changes are made to the study protocol, no structural changes to the database are needed. One has only to add or edit events and their metadata. Changes in the event metadata automatically determine any related logic changes. In addition to streamlining application code, this model simplifies communication between the programmer and other team members. Database requirements can be phrased as changes to the underlying data, rather than to the application code. The project team can review a single report of events and metadata and easily see where changes might be needed. In addition to benefitting from streamlined code, the front end database application can also implement useful standard features such as automated mail merges and to do lists. Results The event-based data model has proven itself to be robust, adaptable and user-friendly in a variety of study contexts. We have chosen to implement it as a SQL Server back end and distributed Access front end. Interested readers may request a copy of the Access front end and scripts for creating the back end database. 
Discussion An event-based database with a consistent, robust set of features has the potential to significantly reduce development time and maintenance expense for clinical trial tracking databases.
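
    The event-as-data idea can be sketched in a few lines. The event names and metadata fields below are hypothetical, not the authors' schema (their implementation is a SQL Server back end with an Access front end); the point is that protocol changes become edits to the records, not to the code:

```python
# Hypothetical event metadata table: each event is a data record carrying
# its own prerequisites, subsequent events due, and events it overrides.
EVENTS = {
    "consent_signed":  {"prereqs": [],                 "overrides": []},
    "baseline_visit":  {"prereqs": ["consent_signed"], "overrides": []},
    "month1_followup": {"prereqs": ["baseline_visit"], "overrides": []},
    "withdrawal":      {"prereqs": ["consent_signed"],
                        "overrides": ["baseline_visit", "month1_followup"]},
}

def due_events(completed):
    """Events whose prerequisites are met, not yet done, and not overridden.

    All tracking logic is derived from the metadata; adding or editing an
    EVENTS record changes behavior with no code change.
    """
    done = set(completed)
    overridden = set()
    for e in done:
        overridden.update(EVENTS[e]["overrides"])
    return sorted(e for e, meta in EVENTS.items()
                  if e not in done
                  and e not in overridden
                  and set(meta["prereqs"]) <= done)
```

    For example, recording a withdrawal automatically cancels the downstream visits it overrides, with no conditional logic written for that specific case.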

  8. Clinical diagnostic and sociocultural dimensions of deliberate self-harm in Mumbai, India.

    PubMed

    Parkar, Shubhangi R; Dawani, Varsha; Weiss, Mitchell G

    2006-04-01

    Patients' accounts complement psychiatric assessment of deliberate self-harm (DSH). In this study we examined psychiatric disorders, and sociocultural and cross-cultural features of DSH. SCID diagnostic interviews and a locally adapted EMIC interview were used to study 196 patients after DSH at a general hospital in Mumbai, India. Major depression was the most common diagnosis (38.8%), followed by substance use disorders (16.8%), but 44.4% of patients did not meet criteria for an enduring Axis-I disorder (no diagnosis, V-code, or adjustment disorder). Psychache arising from patient-identified sociocultural contexts and stressors complements, but does not necessarily fulfill, criteria for explanatory psychiatric disorders.

  9. First benchmark of the Unstructured Grid Adaptation Working Group

    NASA Technical Reports Server (NTRS)

    Ibanez, Daniel; Barral, Nicolas; Krakos, Joshua; Loseille, Adrien; Michal, Todd; Park, Mike

    2017-01-01

    Unstructured grid adaptation is a technology that holds the potential to improve the automation and accuracy of computational fluid dynamics and other computational disciplines. The difficulty of producing the highly anisotropic elements necessary for simulation on complex curved geometries, while satisfying a resolution request, has limited this technology's widespread adoption. The Unstructured Grid Adaptation Working Group is an open gathering of researchers working on adapting simplicial meshes to conform to a metric field. Current members span a wide range of institutions, including academia, industry, and national laboratories. The purpose of this group is to create a common basis for understanding and improving mesh adaptation. We present our first major contribution: a common set of benchmark cases, including input meshes and analytic metric specifications, that are publicly available to be used for evaluating any mesh adaptation code. We also present the results of several existing codes on these benchmark cases, to illustrate their utility in identifying key challenges common to all codes and important differences between available codes. Future directions are defined to expand this benchmark to mature the technology necessary to impact practical simulation workflows.

  10. AN ADVANCED LEAKAGE SCHEME FOR NEUTRINO TREATMENT IN ASTROPHYSICAL SIMULATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perego, A.; Cabezón, R. M.; Käppeli, R., E-mail: albino.perego@physik.tu-darmstadt.de

    We present an Advanced Spectral Leakage (ASL) scheme to model neutrinos in the context of core-collapse supernovae (CCSNe) and compact binary mergers. Based on previous gray leakage schemes, the ASL scheme computes the neutrino cooling rates by interpolating local production and diffusion rates (relevant in optically thin and thick regimes, respectively) separately for discretized values of the neutrino energy. Neutrino trapped components are also modeled, based on equilibrium and timescale arguments. The better accuracy achieved by the spectral treatment allows a more reliable computation of neutrino heating rates in optically thin conditions. The scheme has been calibrated and tested against Boltzmann transport in the context of Newtonian spherically symmetric models of CCSNe. ASL shows a very good qualitative and a partial quantitative agreement for key quantities from collapse to a few hundred milliseconds after core bounce. We have proved the adaptability and flexibility of our ASL scheme, coupling it to an axisymmetric Eulerian and to a three-dimensional smoothed particle hydrodynamics code to simulate core collapse. Therefore, the neutrino treatment presented here is ideal for large parameter-space explorations, parametric studies, high-resolution tests, code developments, and long-term modeling of asymmetric configurations, where more detailed neutrino treatments are not available or are currently computationally too expensive.

  11. Multipath search coding of stationary signals with applications to speech

    NASA Astrophysics Data System (ADS)

    Fehn, H. G.; Noll, P.

    1982-04-01

    This paper deals with the application of multipath search coding (MSC) concepts to the coding of stationary memoryless and correlated sources, and of speech signals, at a rate of one bit per sample. Use is made of three MSC classes: (1) codebook coding, or vector quantization, (2) tree coding, and (3) trellis coding. The paper explains the performance of these coders and compares them both with conventional coders and with rate-distortion bounds. The potential of MSC coding strategies is demonstrated by illustrations. The paper also reports results of MSC coding of speech, where the strategies of adaptive quantization and adaptive prediction were both included in the coder design.

  12. Research on pre-processing of QR Code

    NASA Astrophysics Data System (ADS)

    Sun, Haixing; Xia, Haojie; Dong, Ning

    2013-10-01

    QR codes encode many kinds of information thanks to their advantages: large storage capacity, high reliability, omnidirectional ultra-high-speed reading, small printed size, and efficient representation of Chinese characters. In order to obtain a clearer binarized image from a complex background and improve the recognition rate of QR codes, this paper investigates pre-processing methods for QR (Quick Response) codes and presents algorithms and results of image pre-processing for QR code recognition. The conventional method is improved by modifying Sauvola's adaptive text-binarization method. Additionally, a QR code extraction step that adapts to different image sizes and a flexible image-correction approach are introduced, improving the efficiency and accuracy of QR code image processing.
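
    The Sauvola adaptive threshold mentioned above can be stated compactly: T = m · (1 + k · (s/R − 1)), where m and s are the local window mean and standard deviation, R is the dynamic range of the standard deviation (128 for 8-bit images), and k is a tuning constant. The paper modifies this method; the sketch below implements only the standard formula, using integral images so each window statistic costs O(1) per pixel:

```python
import numpy as np

def sauvola_threshold(img, w=15, k=0.2, R=128.0):
    """Per-pixel Sauvola threshold T = m * (1 + k*(s/R - 1)) over a w x w window."""
    img = img.astype(np.float64)
    pad = w // 2
    p = np.pad(img, pad, mode="reflect")
    # Integral images of the padded image and its square, with a zero row/column
    # prepended so that sum(p[a:b, c:d]) = S[b,d] - S[a,d] - S[b,c] + S[a,c].
    s1 = np.pad(np.cumsum(np.cumsum(p, axis=0), axis=1), ((1, 0), (1, 0)))
    s2 = np.pad(np.cumsum(np.cumsum(p * p, axis=0), axis=1), ((1, 0), (1, 0)))
    H, W = img.shape
    y0 = np.arange(H)[:, None]; y1 = y0 + w
    x0 = np.arange(W)[None, :]; x1 = x0 + w
    area = w * w
    win_sum = s1[y1, x1] - s1[y0, x1] - s1[y1, x0] + s1[y0, x0]
    win_sq  = s2[y1, x1] - s2[y0, x1] - s2[y1, x0] + s2[y0, x0]
    mean = win_sum / area
    var = np.maximum(win_sq / area - mean ** 2, 0.0)   # guard tiny negatives
    return mean * (1.0 + k * (np.sqrt(var) / R - 1.0))

def binarize(img, **kw):
    """Foreground = pixels above their local Sauvola threshold."""
    return (img > sauvola_threshold(img, **kw)).astype(np.uint8)
```

    On a flat region (s = 0) the threshold collapses to m · (1 − k), which is why Sauvola is robust to the uneven illumination typical of camera-captured QR codes.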

  13. Divided multimodal attention sensory trace and context coding strategies in spatially congruent auditory and visual presentation.

    PubMed

    Kristjánsson, Tómas; Thorvaldsson, Tómas Páll; Kristjánsson, Arni

    2014-01-01

    Previous research involving both unimodal and multimodal studies suggests that single-response change detection is a capacity-free process while a discriminatory up or down identification is capacity-limited. The trace/context model assumes that this reflects different memory strategies rather than inherent differences between identification and detection. To perform such tasks, one of two strategies is used, a sensory trace or a context coding strategy, and if one is blocked, people will automatically use the other. A drawback to most preceding studies is that stimuli are presented at separate locations, creating the possibility of a spatial confound, which invites alternative interpretations of the results. We describe a series of experiments, investigating divided multimodal attention, without the spatial confound. The results challenge the trace/context model. Our critical experiment involved a gap before a change in volume and brightness, which according to the trace/context model blocks the sensory trace strategy, simultaneously with a roaming pedestal, which should block the context coding strategy. The results clearly show that people can use strategies other than sensory trace and context coding in the tasks and conditions of these experiments, necessitating changes to the trace/context model.

  14. Incorporating spike-rate adaptation into a rate code in mathematical and biological neurons

    PubMed Central

    Ralston, Bridget N.; Flagg, Lucas Q.; Faggin, Eric

    2016-01-01

    For a slowly varying stimulus, the simplest relationship between a neuron's input and output is a rate code, in which the spike rate is a unique function of the stimulus at that instant. In the case of spike-rate adaptation, there is no unique relationship between input and output, because the spike rate at any time depends both on the instantaneous stimulus and on prior spiking (the “history”). To improve the decoding of spike trains produced by neurons that show spike-rate adaptation, we developed a simple scheme that incorporates “history” into a rate code. We utilized this rate-history code successfully to decode spike trains produced by 1) mathematical models of a neuron in which the mechanism for adaptation (IAHP) is specified, and 2) the gastropyloric receptor (GPR2), a stretch-sensitive neuron in the stomatogastric nervous system of the crab Cancer borealis, that exhibits long-lasting adaptation of unknown origin. Moreover, when we modified the spike rate either mathematically in a model system or by applying neuromodulatory agents to the experimental system, we found that changes in the rate-history code could be related to the biophysical mechanisms responsible for altering the spiking. PMID:26888106
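
    A toy model can make the rate-history idea concrete. The sketch below is not the authors' model (the adaptation mechanism in GPR2 is of unknown origin); it assumes a simple low-pass adaptation variable subtracted from the drive, and shows that decoding from the rate plus a reconstructed history term recovers a stimulus that the instantaneous rate alone would misreport:

```python
import numpy as np

# Toy spike-rate adaptation: rate = max(gain*stimulus - a, 0), where the
# adaptation variable a low-pass filters the neuron's own recent rate.
def simulate(stim, gain=2.0, tau=20.0, dt=1.0):
    rate, a = np.zeros_like(stim), 0.0
    for t, s in enumerate(stim):
        r = max(gain * s - a, 0.0)
        a += dt / tau * (r - a)          # adaptation tracks recent firing
        rate[t] = r
    return rate

def decode(rate, gain=2.0, tau=20.0, dt=1.0):
    """Rate-history decoding: rebuild the history term from the observed
    rate, then invert rate + history instead of the rate alone."""
    stim_hat, a = np.zeros_like(rate), 0.0
    for t, r in enumerate(rate):
        stim_hat[t] = (r + a) / gain     # stimulus from rate AND history
        a += dt / tau * (r - a)          # same recursion as the encoder
    return stim_hat

stim = np.concatenate([np.zeros(50), np.ones(200), np.zeros(50)])
r = simulate(stim)
s_hat = decode(r)   # matches stim wherever the rate is not rectified to zero
```

    During the constant step the rate sags from 2 toward 1 (adaptation), so a pure rate code would wrongly report a falling stimulus; the rate-history decoder stays at 1.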

  15. Interactive Finite Elements for General Engine Dynamics Analysis

    NASA Technical Reports Server (NTRS)

    Adams, M. L.; Padovan, J.; Fertis, D. G.

    1984-01-01

    General nonlinear finite element codes were adapted for the purpose of analyzing the dynamics of gas turbine engines. In particular, this adaptation required the development of a squeeze-film damper element software package and its implementation in a representative current-generation code. The ADINA code was selected because of prior use and familiarity with its internal structure and logic. This objective was met, and the results indicate that such use of general-purpose codes is a viable alternative to specialized codes for general dynamics analysis of engines.

  16. A User's Guide to AMR1D: An Instructional Adaptive Mesh Refinement Code for Unstructured Grids

    NASA Technical Reports Server (NTRS)

    deFainchtein, Rosalinda

    1996-01-01

    This report documents the code AMR1D, which is currently posted on the World Wide Web (http://sdcd.gsfc.nasa.gov/ESS/exchange/contrib/de-fainchtein/adaptive_mesh_refinement.html). AMR1D is a one-dimensional finite element fluid-dynamics solver, capable of adaptive mesh refinement (AMR). It was written as an instructional tool for AMR on unstructured mesh codes. It is meant to illustrate the minimum requirements for AMR in more than one dimension. For that purpose, it uses the same type of data structure that would be necessary in a two-dimensional AMR code (loosely following the algorithm described by Lohner).

  17. The effect of multiple internal representations on context-rich instruction

    NASA Astrophysics Data System (ADS)

    Lasry, Nathaniel; Aulls, Mark W.

    2007-11-01

    We discuss n-coding, a theoretical model of multiple internal mental representations. The n-coding construct is developed from a review of cognitive and imaging data that demonstrate the independence of information processed along different modalities, such as the verbal, visual, kinesthetic, logico-mathematic, and social modalities. A study testing the effectiveness of the n-coding construct in classrooms is presented. Four sections differing in the level of n-coding opportunities were compared. Besides a traditional-instruction section used as a control group, each of the remaining three sections was given context-rich problems, which differed by the level of n-coding opportunities designed into their laboratory environment. To measure the effectiveness of the construct, problem-solving skills were assessed as conceptual learning using the force concept inventory. We also developed several new measures that take students' confidence in concepts into account. Our results show that the n-coding construct is useful in designing context-rich environments and can be used to increase learning gains in problem solving, conceptual knowledge, and concept confidence. Specifically, when using props in designing context-rich problems, we find n-coding to be a useful construct in guiding which additional dimensions need to be attended to.

  18. Wall-interference assessment and corrections for transonic NACA 0012 airfoil data from various wind tunnels. M.S. Thesis - George Washington Univ., 1988

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Newman, Perry A.

    1991-01-01

    A nonlinear, four wall, post-test wall interference assessment/correction (WIAC) code was developed for transonic airfoil data from solid wall wind tunnels with flexibly adaptable top and bottom walls. The WIAC code was applied over a broad range of test conditions to four sets of NACA 0012 airfoil data, from two different adaptive wall wind tunnels. The data include many test points for fully adapted walls, as well as numerous partially adapted and unadapted test points, which together represent many different model/tunnel configurations and possible wall interference effects. Small corrections to the measured Mach numbers and angles of attack were obtained from the WIAC code even for fully adapted data; these corrections generally improve the correlation among the various sets of airfoil data and simultaneously improve the correlation of the data with calculations for a 2-D, free air, Navier-Stokes code. The WIAC corrections for airfoil data taken in fully adapted wall test sections are shown to be significantly smaller than those for comparable airfoil data from straight, slotted wall test sections. This indicates, as expected, a lesser degree of wall interference in the adapted wall tunnels relative to the slotted wall tunnels. Application of the WIAC code to this data was, however, somewhat more difficult and time consuming than initially expected from similar previous experience with WIAC applications to slotted wall data.

  19. Open-source framework for documentation of scientific software written on MATLAB-compatible programming languages

    NASA Astrophysics Data System (ADS)

    Konnik, Mikhail V.; Welsh, James

    2012-09-01

    Numerical simulators for adaptive optics systems have become an essential tool for the research and development of future advanced astronomical instruments. However, as the simulator's code base grows, the code itself becomes difficult to support. Inadequate documentation of astronomical software for adaptive optics simulators may complicate development, since the documentation must contain up-to-date schemes and the mathematical descriptions implemented in the software. Although most modern programming environments like MATLAB or Octave have built-in documentation abilities, these are often insufficient for describing a typical adaptive optics simulator code. This paper describes a general cross-platform framework for the documentation of scientific software using open-source tools such as LATEX, mercurial, Doxygen, and Perl. Using a Perl script that translates the comments of MATLAB M-files into C-like syntax, one can use Doxygen to generate and update the documentation for the scientific source code. The documentation generated by this framework contains the current code description with mathematical formulas, images, and bibliographical references. A detailed description of the framework components is presented, as well as guidelines for deploying the framework. Examples of code documentation for the scripts and functions of a MATLAB-based adaptive optics simulator are provided.
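
    The comment-translation step can be illustrated with a short filter. The framework described uses a Perl script; the Python sketch below is only an illustrative analogue (the `//!` Doxygen comment marker is standard, everything else here is an assumption): MATLAB `%` comments are rewritten so that Doxygen's C-style parser will pick them up.

```python
import re

def m2dox(mfile_text):
    """Rewrite MATLAB '%' (and '%%') comment lines as C-style Doxygen '//!'
    lines. Illustrative analogue of the framework's Perl filter; trailing
    same-line comments are deliberately not handled in this sketch."""
    out = []
    for line in mfile_text.splitlines():
        m = re.match(r"^(\s*)%%?\s?(.*)$", line)
        if m:
            out.append(f"{m.group(1)}//! {m.group(2)}")
        else:
            out.append(line)
    return "\n".join(out)

src = "% @brief Hypothetical wavefront-sensor stub\nfunction y = wfs(x)\n% internal comment\ny = x;\n"
print(m2dox(src))
```

    Hooked in as a Doxygen `INPUT_FILTER`, such a script lets the documentation be regenerated from the M-files themselves, keeping formulas and descriptions in step with the code.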

  20. Complexity control algorithm based on adaptive mode selection for interframe coding in high efficiency video coding

    NASA Astrophysics Data System (ADS)

    Chen, Gang; Yang, Bing; Zhang, Xiaoyun; Gao, Zhiyong

    2017-07-01

    The latest high efficiency video coding (HEVC) standard significantly increases the encoding complexity for improving its coding efficiency. Due to the limited computational capability of handheld devices, complexity constrained video coding has drawn great attention in recent years. A complexity control algorithm based on adaptive mode selection is proposed for interframe coding in HEVC. Considering the direct proportionality between encoding time and computational complexity, the computational complexity is measured in terms of encoding time. First, complexity is mapped to a target in terms of prediction modes. Then, an adaptive mode selection algorithm is proposed for the mode decision process. Specifically, the optimal mode combination scheme, chosen through offline statistics, is applied at low complexity. If the complexity budget has not been used up, an adaptive mode sorting method is employed to further improve coding efficiency. The experimental results show that the proposed algorithm achieves a very large complexity control range (as low as 10%) for the HEVC encoder while maintaining good rate-distortion performance. For the low-delay P condition, compared with the direct resource allocation method and the state-of-the-art method, average gains of 0.63 and 0.17 dB in BD-PSNR are observed for 18 sequences when the target complexity is around 40%.
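
    The budget-tracking idea can be sketched as a small control loop. The mode sets and their relative time costs below are hypothetical placeholders, not measurements from the paper; the point is the mechanism: each coding unit is offered the richest candidate-mode set whose predicted cost fits an even share of whatever budget remains.

```python
# Hypothetical mode sets: (relative time cost per coding unit, candidate modes).
MODE_SETS = [
    (1.0, ["merge"]),
    (2.5, ["merge", "inter_2Nx2N"]),
    (5.0, ["merge", "inter_2Nx2N", "inter_NxN", "intra"]),
]

def encode_frame(num_cus, budget):
    """Pick a mode set per coding unit under a total complexity budget,
    reallocating the unspent budget evenly over the remaining units."""
    spent, choices = 0.0, []
    for i in range(num_cus):
        per_cu = (budget - spent) / (num_cus - i)   # even share of what's left
        cost, modes = max((ms for ms in MODE_SETS if ms[0] <= per_cu),
                          key=lambda ms: ms[0],
                          default=MODE_SETS[0])     # cheapest set as fallback
        spent += cost
        choices.append(modes)
    return choices, spent
```

    Because units that finish cheaply return their surplus to the pool, later units can afford richer mode sets, which is the "use up the remaining budget to improve efficiency" behavior the abstract describes.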

  1. Code-Switching in Persian-English and Telugu-English Conversations: With a Focus on Light Verb Constructions

    ERIC Educational Resources Information Center

    Moradi, Hamzeh

    2014-01-01

    Depending on the demands of a particular communicative situation, bilingual or multilingual speakers ("bilingualism-multilingualism") will switch between language varieties. Code-switching is the practice of moving between variations of languages in different contexts. In an educational context, code-switching is defined as the practice…

  2. Microanalytic Coding versus Global Rating of Maternal Parenting Behaviour

    ERIC Educational Resources Information Center

    Morawska, Alina; Basha, Allison; Adamson, Michelle; Winter, Leanne

    2015-01-01

    This study examined the relationship between microanalytic coding and global rating systems when coding maternal parenting behaviour in two contexts. Observational data from 55 mother-child interactions with two- to four-year-old children, in either a mealtime (clinic; N = 20 or control; N = 20) or a playtime context (community; N = 15), were…

  3. Fuzzy support vector machines for adaptive Morse code recognition.

    PubMed

    Yang, Cheng-Hong; Jin, Li-Cheng; Chuang, Li-Yeh

    2006-11-01

    Morse code is now being harnessed for use in rehabilitation applications of augmentative-alternative communication and assistive technology, facilitating mobility, environmental control and adapted worksite access. In this paper, Morse code is selected as a communication adaptive device for persons who suffer from muscle atrophy, cerebral palsy or other severe handicaps. A stable typing rate is strictly required for Morse code to be effective as a communication tool. Therefore, an adaptive automatic recognition method with a high recognition rate is needed. The proposed system uses both fuzzy support vector machines and the variable-degree variable-step-size least-mean-square algorithm to achieve these objectives. We apply fuzzy memberships to each point, and provide different contributions to the decision learning function for support vector machines. Statistical analyses demonstrated that the proposed method elicited a higher recognition rate than other algorithms in the literature.

  4. Nine-year-old children use norm-based coding to visually represent facial expression.

    PubMed

    Burton, Nichola; Jeffery, Linda; Skinner, Andrew L; Benton, Christopher P; Rhodes, Gillian

    2013-10-01

    Children are less skilled than adults at making judgments about facial expression. This could be because they have not yet developed adult-like mechanisms for visually representing faces. Adults are thought to represent faces in a multidimensional face-space, and have been shown to code the expression of a face relative to the norm or average face in face-space. Norm-based coding is economical and adaptive, and may be what makes adults more sensitive to facial expression than children. This study investigated the coding system that children use to represent facial expression. An adaptation aftereffect paradigm was used to test 24 adults and 18 children (9 years 2 months to 9 years 11 months old). Participants adapted to weak and strong antiexpressions. They then judged the expression of an average expression. Adaptation created aftereffects that made the test face look like the expression opposite that of the adaptor. Consistent with the predictions of norm-based but not exemplar-based coding, aftereffects were larger for strong than weak adaptors for both age groups. Results indicate that, like adults, children's coding of facial expressions is norm-based. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  5. Adaptive image coding based on cubic-spline interpolation

    NASA Astrophysics Data System (ADS)

    Jiang, Jian-Xing; Hong, Shao-Hua; Lin, Tsung-Ching; Wang, Lin; Truong, Trieu-Kien

    2014-09-01

    Previous investigations have shown that, at low bit rates, downsampling prior to coding and upsampling after decoding can achieve better compression performance than standard coding algorithms, e.g., JPEG and H.264/AVC. However, at high bit rates, sampling-based schemes generate more distortion. Additionally, the maximum bit rate at which a sampling-based scheme outperforms the standard algorithm is image-dependent. In this paper, a practical adaptive image coding algorithm based on cubic-spline interpolation (CSI) is proposed. The proposed algorithm adaptively selects the image coding method, from CSI-based modified JPEG or standard JPEG, under a given target bit rate, utilizing so-called ρ-domain analysis. The experimental results indicate that, compared with standard JPEG, the proposed algorithm shows better performance at low bit rates and maintains the same performance at high bit rates.

  6. Reservoir Computing Properties of Neural Dynamics in Prefrontal Cortex

    PubMed Central

    Procyk, Emmanuel; Dominey, Peter Ford

    2016-01-01

    Primates display a remarkable ability to adapt to novel situations. Determining what is most pertinent in these situations is not always possible based only on the current sensory inputs, and often also depends on recent inputs and behavioral outputs that contribute to internal states. Thus, one can ask how cortical dynamics generate representations of these complex situations. It has been observed that mixed selectivity in cortical neurons contributes to represent diverse situations defined by a combination of the current stimuli, and that mixed selectivity is readily obtained in randomly connected recurrent networks. In this context, these reservoir networks reproduce the highly recurrent nature of local cortical connectivity. Recombining present and past inputs, random recurrent networks from the reservoir computing framework generate mixed selectivity which provides pre-coded representations of an essentially universal set of contexts. These representations can then be selectively amplified through learning to solve the task at hand. We thus explored their representational power and dynamical properties after training a reservoir to perform a complex cognitive task initially developed for monkeys. The reservoir model inherently displayed a dynamic form of mixed selectivity, key to the representation of the behavioral context over time. The pre-coded representation of context was amplified by training a feedback neuron to explicitly represent this context, thereby reproducing the effect of learning and allowing the model to perform more robustly. This second version of the model demonstrates how a hybrid dynamical regime combining spatio-temporal processing of reservoirs, and input driven attracting dynamics generated by the feedback neuron, can be used to solve a complex cognitive task. We compared reservoir activity to neural activity of dorsal anterior cingulate cortex of monkeys which revealed similar network dynamics. 
We argue that reservoir computing is a pertinent framework to model local cortical dynamics and their contribution to higher cognitive function. PMID:27286251

  7. The strategic management of organizational knowledge exchange related to hospital quality measurement and reporting.

    PubMed

    Rangachari, Pavani

    2008-01-01

    CONTEXT/PURPOSE: With the growing momentum toward hospital quality measurement and reporting by public and private health care payers, hospitals face increasing pressures to improve their medical record documentation and administrative data coding accuracy. This study explores the relationship between the organizational knowledge-sharing structure related to quality and hospital coding accuracy for quality measurement. Simultaneously, this study seeks to identify other leadership/management characteristics associated with coding for quality measurement. Drawing upon complexity theory, the literature on "professional complex systems" has put forth various strategies for managing change and turnaround in professional organizations. In so doing, it has emphasized the importance of knowledge creation and organizational learning through interdisciplinary networks. This study integrates complexity, network structure, and "subgoals" theories to develop a framework for knowledge-sharing network effectiveness in professional complex systems. This framework is used to design an exploratory and comparative research study. The sample consists of 4 hospitals, 2 showing "good coding" accuracy for quality measurement and 2 showing "poor coding" accuracy. Interviews and surveys are conducted with administrators and staff in the quality, medical staff, and coding subgroups in each facility. Findings of this study indicate that good coding performance is systematically associated with a knowledge-sharing network structure rich in brokerage and hierarchy (with leaders connecting different professional subgroups to each other and to the external environment), rather than in density (where everyone is directly connected to everyone else). 
It also implies that for the hospital organization to adapt to the changing environment of quality transparency, senior leaders must undertake proactive and unceasing efforts to coordinate knowledge exchange across physician and coding subgroups and connect these subgroups with the changing external environment.

  8. Lossless compression of VLSI layout image data.

    PubMed

    Dai, Vito; Zakhor, Avideh

    2006-09-01

    We present a novel lossless compression algorithm called Context Copy Combinatorial Code (C4), which integrates the advantages of two very disparate compression techniques: context-based modeling and Lempel-Ziv (LZ) style copying. While the algorithm can be applied to many lossless compression applications, such as document image compression, our primary target application has been lossless compression of integrated circuit layout image data. These images contain a heterogeneous mix of data: dense repetitive data better suited to LZ-style coding, and less dense structured data, better suited to context-based encoding. As part of C4, we have developed a novel binary entropy coding technique called combinatorial coding which is simultaneously as efficient as arithmetic coding, and as fast as Huffman coding. Compression results show C4 outperforms JBIG, ZIP, BZIP2, and two-dimensional LZ, and achieves lossless compression ratios greater than 22 for binary layout image data, and greater than 14 for gray-pixel image data.
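
    Combinatorial (enumerative) coding of a binary block is a classical, well-defined technique: an n-bit block containing k ones is transmitted as the pair (k, rank), where rank is the block's lexicographic index among all C(n, k) such blocks, so the payload needs only ceil(log2 C(n, k)) bits. The sketch below illustrates the principle only, not C4's optimized coder:

```python
from math import comb

def enum_encode(bits):
    """Enumerative encoding: map an n-bit block with k ones to (n, k, rank)."""
    n, k = len(bits), sum(bits)
    rank, ones_left = 0, k
    for i, b in enumerate(bits):
        if b:
            # All blocks with the same prefix but a 0 here rank lower:
            # they must place ones_left ones in the remaining n-i-1 slots.
            rank += comb(n - i - 1, ones_left)
            ones_left -= 1
    return n, k, rank

def enum_decode(n, k, rank):
    """Invert enum_encode by re-deriving each bit from the remaining rank."""
    bits, ones_left = [], k
    for i in range(n):
        skip = comb(n - i - 1, ones_left)   # count of blocks with a 0 here
        if rank >= skip:
            bits.append(1)
            rank -= skip
            ones_left -= 1
        else:
            bits.append(0)
    return bits
```

    Because rank ranges exactly over [0, C(n, k)), this mapping is bijective and hence lossless, which is what lets it match arithmetic coding's efficiency while decoding with table lookups and integer arithmetic.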

  9. Adaptive Nodal Transport Methods for Reactor Transient Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas Downar; E. Lewis

    2005-08-31

    Develop methods for adaptively treating the angular, spatial, and time dependence of the neutron flux in reactor transient analysis. These methods were demonstrated in the DOE transport nodal code VARIANT and the US NRC spatial kinetics code, PARCS.

  10. An FPGA design of generalized low-density parity-check codes for rate-adaptive optical transport networks

    NASA Astrophysics Data System (ADS)

    Zou, Ding; Djordjevic, Ivan B.

    2016-02-01

    Forward error correction (FEC) is one of the key technologies enabling next-generation high-speed fiber optical communications. In this paper, we propose a rate-adaptive scheme using a class of generalized low-density parity-check (GLDPC) codes with a Hamming code as the local code. We show that with the proposed unified GLDPC decoder architecture, variable net coding gains (NCGs) can be achieved with no error floor at BERs down to 10^-15, making it a viable solution for next-generation high-speed fiber optical communications.
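
    As a concrete view of such a local code, a Hamming(7,4) encoder and single-error corrector fit in a few lines. This shows only the component code, not the paper's GLDPC construction or decoder architecture: in a GLDPC code, each generalized check constrains its attached bits to be a codeword of a component code like this one.

```python
import numpy as np

# Systematic Hamming(7,4): G = [I4 | P], H = [P^T | I3], so G @ H.T = 0 mod 2.
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def hamming_encode(msg):
    """Encode a 4-bit message into a 7-bit codeword."""
    return (np.array(msg) @ G) % 2

def hamming_correct(word):
    """Correct any single bit error: a nonzero syndrome equals the column
    of H at the flipped position."""
    word = np.array(word)
    syn = (H @ word) % 2
    if syn.any():
        pos = next(j for j in range(7) if np.array_equal(H[:, j], syn))
        word[pos] ^= 1
    return word
```

    The syndrome lookup is what makes Hamming codes cheap local decoders; a GLDPC decoder iterates such component decodings with message passing on the global graph.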

  11. A Context-Aware Self-Adaptive Fractal Based Generalized Pedagogical Agent Framework for Mobile Learning

    ERIC Educational Resources Information Center

    Boulehouache, Soufiane; Maamri, Ramdane; Sahnoun, Zaidi

    2015-01-01

    The Pedagogical Agents (PAs) for Mobile Learning (m-learning) must be able not only to adapt the teaching to the learner knowledge level and profile but also to ensure the pedagogical efficiency within unpredictable changing runtime contexts. Therefore, to deal with this issue, this paper proposes a Context-aware Self-Adaptive Fractal Component…

  12. Addressing Methodological Challenges in Large Communication Data Sets: Collecting and Coding Longitudinal Interactions in Home Hospice Cancer Care.

    PubMed

    Reblin, Maija; Clayton, Margaret F; John, Kevin K; Ellington, Lee

    2016-07-01

    In this article, we present strategies for collecting and coding a large longitudinal communication data set collected across multiple sites, consisting of more than 2000 hours of digital audio recordings from approximately 300 families. We describe our methods within the context of implementing a large-scale study of communication during cancer home hospice nurse visits, but this procedure could be adapted to communication data sets across a wide variety of settings. This research is the first study designed to capture home hospice nurse-caregiver communication, a highly understudied location and type of communication event. We present a detailed example protocol encompassing data collection in the home environment, large-scale, multisite secure data management, the development of theoretically-based communication coding, and strategies for preventing coder drift and ensuring reliability of analyses. Although each of these challenges has the potential to undermine the utility of the data, reliability between coders is often the only issue consistently reported and addressed in the literature. Overall, our approach demonstrates rigor and provides a "how-to" example for managing large, digitally recorded data sets from collection through analysis. These strategies can inform other large-scale health communication research.

  13. Addressing Methodological Challenges in Large Communication Datasets: Collecting and Coding Longitudinal Interactions in Home Hospice Cancer Care

    PubMed Central

    Reblin, Maija; Clayton, Margaret F; John, Kevin K; Ellington, Lee

    2015-01-01

    In this paper, we present strategies for collecting and coding a large longitudinal communication dataset collected across multiple sites, consisting of over 2000 hours of digital audio recordings from approximately 300 families. We describe our methods within the context of implementing a large-scale study of communication during cancer home hospice nurse visits, but this procedure could be adapted to communication datasets across a wide variety of settings. This research is the first study designed to capture home hospice nurse-caregiver communication, a highly understudied location and type of communication event. We present a detailed example protocol encompassing data collection in the home environment, large-scale, multi-site secure data management, the development of theoretically-based communication coding, and strategies for preventing coder drift and ensuring reliability of analyses. Although each of these challenges has the potential to undermine the utility of the data, reliability between coders is often the only issue consistently reported and addressed in the literature. Overall, our approach demonstrates rigor and provides a “how-to” example for managing large, digitally-recorded data sets from collection through analysis. These strategies can inform other large-scale health communication research. PMID:26580414

  14. Dopamine Modulates Adaptive Prediction Error Coding in the Human Midbrain and Striatum.

    PubMed

    Diederen, Kelly M J; Ziauddeen, Hisham; Vestergaard, Martin D; Spencer, Tom; Schultz, Wolfram; Fletcher, Paul C

    2017-02-15

    Learning to optimally predict rewards requires agents to account for fluctuations in reward value. Recent work suggests that individuals can efficiently learn about variable rewards through adaptation of the learning rate, and coding of prediction errors relative to reward variability. Such adaptive coding has been linked to midbrain dopamine neurons in nonhuman primates, and evidence in support of a similar role of the dopaminergic system in humans is emerging from fMRI data. Here, we sought to investigate the effect of dopaminergic perturbations on adaptive prediction error coding in humans, using a between-subject, placebo-controlled pharmacological fMRI study with a dopaminergic agonist (bromocriptine) and antagonist (sulpiride). Participants performed a previously validated task in which they predicted the magnitude of upcoming rewards drawn from distributions with varying SDs. After each prediction, participants received a reward, yielding trial-by-trial prediction errors. Under placebo, we replicated previous observations of adaptive coding in the midbrain and ventral striatum. Treatment with sulpiride attenuated adaptive coding in both midbrain and ventral striatum, and was associated with a decrease in performance, whereas bromocriptine did not have a significant impact. Although we observed no differential effect of SD on performance between the groups, computational modeling suggested decreased behavioral adaptation in the sulpiride group. These results suggest that normal dopaminergic function is critical for adaptive prediction error coding, a key property of the brain thought to facilitate efficient learning in variable environments. Crucially, these results also offer potential insights for understanding the impact of disrupted dopamine function in mental illness. SIGNIFICANCE STATEMENT To choose optimally, we have to learn what to expect. Humans dampen learning when there is a great deal of variability in reward outcome, and two brain regions that are modulated by the brain chemical dopamine are sensitive to reward variability. Here, we aimed to directly relate dopamine to learning about variable rewards, and the neural encoding of associated teaching signals. We perturbed dopamine in healthy individuals using dopaminergic medication and asked them to predict variable rewards while we scanned their brains. Dopamine perturbations impaired learning and the neural encoding of reward variability, thus establishing a direct link between dopamine and adaptation to reward variability. These results aid our understanding of clinical conditions associated with dopaminergic dysfunction, such as psychosis. Copyright © 2017 Diederen et al.
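    The adaptive coding idea in this record, scaling prediction errors by reward variability so that learning dampens under high variance, can be sketched as a simple learning rule (a hypothetical illustration, not the authors' computational model):

```python
def adaptive_update(value, reward, sigma, lr=0.5):
    """Update a reward prediction with a variability-scaled error.

    The raw prediction error is divided by the reward SD, so the same
    surprise moves the estimate less when outcomes are more variable
    (the behavioral dampening described in the abstract).
    """
    scaled_error = (reward - value) / sigma
    return value + lr * scaled_error
```

    With this rule, a 4-point surprise moves the estimate half as far when the reward SD doubles, e.g. `adaptive_update(10.0, 14.0, sigma=2.0)` versus `sigma=4.0`.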

  15. Is Integration Always most Adaptive? The Role of Cultural Identity in Academic Achievement and in Psychological Adaptation of Immigrant Students in Germany.

    PubMed

    Schotte, Kristin; Stanat, Petra; Edele, Aileen

    2018-01-01

    Immigrant adaptation research views identification with the mainstream context as particularly beneficial for sociocultural adaptation, including academic achievement, and identification with the ethnic context as particularly beneficial for psychological adaptation. A strong identification with both contexts is considered most beneficial for both outcomes (integration hypothesis). However, it is unclear whether the integration hypothesis applies in assimilative contexts, across different outcomes, and across different immigrant groups. This study investigates the association of cultural identity with several indicators of academic achievement and psychological adaptation in immigrant adolescents (N = 3894, 51% female, M age = 16.24, SD age = 0.71) in Germany. Analyses support the integration hypothesis for aspects of psychological adaptation but not for academic achievement. Moreover, for some outcomes, findings vary across immigrant groups from Turkey (n = 809), the former Soviet Union (n = 712), and heterogeneous other countries (n = 2373). The results indicate that the adaptive potential of identity integration is limited in assimilative contexts, such as Germany, and that it may vary across different outcomes and groups. As each identification is positively associated with at least one outcome, however, both identification dimensions seem to be important for the adaptation of immigrant adolescents.

  16. Beyond the Triplet Code: Context Cues Transform Translation.

    PubMed

    Brar, Gloria A

    2016-12-15

    The elucidation of the genetic code remains among the most influential discoveries in biology. While innumerable studies have validated the general universality of the code and its value in predicting and analyzing protein coding sequences, established and emerging work has also suggested that full genome decryption may benefit from a greater consideration of a codon's neighborhood within an mRNA than has been broadly applied. This Review examines the evidence for context cues in translation, with a focus on several recent studies that reveal broad roles for mRNA context in programming translation start sites, the rate of translation elongation, and stop codon identity. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Turning back the hands of time: autobiographical memories in dementia cued by a museum setting.

    PubMed

    Miles, Amanda N; Fischer-Mogensen, Lise; Nielsen, Nadia H; Hermansen, Stine; Berntsen, Dorthe

    2013-09-01

    The current study examined the effects of cuing autobiographical memory retrieval in 12 older participants with dementia through immersion in a historically authentic environment that recreated the material and cultural context of the participants' youth. Participants conversed in either an everyday setting (control condition) or a museum setting furnished in early twentieth century style (experimental condition) while being presented with condition-matched cues. Conversations were coded for memory content based on an adapted version of the Levine, Svoboda, Hay, Winocur, and Moscovitch (2002) coding scheme. More autobiographical memories were recalled in the museum setting, and these memories were more elaborated, more spontaneous, and in particular included more internal (episodic) details compared with memories in the control condition. The findings have theoretical and practical implications by showing that the memories retrieved in the museum setting were both quantitatively and qualitatively different from memories retrieved in the control condition. Copyright © 2013 Elsevier Inc. All rights reserved.

  18. Lexicons, contexts, events, and images: commentary on Elman (2009) from the perspective of dual coding theory.

    PubMed

    Paivio, Allan; Sadoski, Mark

    2011-01-01

    Elman (2009) proposed that the traditional role of the mental lexicon in language processing can largely be replaced by a theoretical model of schematic event knowledge founded on dynamic context-dependent variables. We evaluate Elman's approach and propose an alternative view, based on dual coding theory and evidence that modality-specific cognitive representations contribute strongly to word meaning and language performance across diverse contexts which also have effects predictable from dual coding theory. Copyright © 2010 Cognitive Science Society, Inc.

  19. Capacity achieving nonbinary LDPC coded non-uniform shaping modulation for adaptive optical communications.

    PubMed

    Lin, Changyu; Zou, Ding; Liu, Tao; Djordjevic, Ivan B

    2016-08-08

    A mutual information inspired nonbinary coded modulation design with non-uniform shaping is proposed. Instead of traditional power-of-two signal constellation sizes, we design 5-QAM, 7-QAM and 9-QAM constellations, which can be used in adaptive optical networks. The non-uniform shaping and the LDPC code rate are jointly considered in the design, which results in a scheme with better performance at the same SNR values. The matched nonbinary (NB) LDPC code is used for this scheme, which further improves the coding gain and the overall performance. We analyze both coding performance and system SNR performance. We show that the proposed NB LDPC-coded 9-QAM has more than 2 dB gain in symbol SNR compared to traditional LDPC-coded star-8-QAM. On the other hand, the proposed NB LDPC-coded 5-QAM and 7-QAM have even better performance than LDPC-coded QPSK.

  20. Cross-Layer Design for Video Transmission over Wireless Rician Slow-Fading Channels Using an Adaptive Multiresolution Modulation and Coding Scheme

    NASA Astrophysics Data System (ADS)

    Pei, Yong; Modestino, James W.

    2007-12-01

    We describe a multilayered video transport scheme for wireless channels capable of adapting to channel conditions in order to maximize end-to-end quality of service (QoS). This scheme combines a scalable H.263+ video source coder with unequal error protection (UEP) across layers. The UEP is achieved by employing different channel codes together with a multiresolution modulation approach to transport the different priority layers. Adaptivity to channel conditions is provided through a joint source-channel coding (JSCC) approach which attempts to jointly optimize the source and channel coding rates together with the modulation parameters to obtain the maximum achievable end-to-end QoS for the prevailing channel conditions. In this work, we model the wireless links as slow-fading Rician channels whose conditions can be described in terms of the channel signal-to-noise ratio (SNR) and the ratio of specular-to-diffuse energy. The multiresolution modulation/coding scheme consists of binary rate-compatible punctured convolutional (RCPC) codes used together with nonuniform phase-shift keyed (PSK) signaling constellations. Results indicate that this adaptive JSCC scheme employing scalable video encoding together with a multiresolution modulation/coding approach leads to significant improvements in delivered video quality for specified channel conditions. In particular, the approach results in considerably improved graceful degradation properties for decreasing channel SNR.
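    At its core, the JSCC adaptation described here is a search over channel-code rate and modulation combinations for the pair with the best expected end-to-end quality at the current SNR. A minimal sketch, where the per-configuration quality models are purely illustrative placeholders, not values from the paper:

```python
def pick_jscc_params(snr_db, configs):
    """Pick the (channel-code rate, modulation) pair that maximizes
    estimated end-to-end quality at the current channel SNR.

    `configs` maps (rate, modulation) -> a callable estimating
    delivered quality (e.g., PSNR in dB) from SNR.
    """
    return max(configs, key=lambda c: configs[c](snr_db))
```

    A robust low-rate/low-order configuration tends to win at low SNR, while a fragile high-rate/high-order one wins once the channel is good, which is exactly the graceful-degradation behavior the abstract reports.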

  1. Advanced technology development for image gathering, coding, and processing

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.

    1990-01-01

    Three overlapping areas of research activities are presented: (1) Information theory and optimal filtering are extended to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing. (2) Focal-plane processing techniques and technology are developed to combine effectively image gathering with coding. The emphasis is on low-level vision processing akin to the retinal processing in human vision. (3) A breadboard adaptive image-coding system is being assembled. This system will be used to develop and evaluate a number of advanced image-coding technologies and techniques as well as research the concept of adaptive image coding.

  2. The multidimensional Self-Adaptive Grid code, SAGE, version 2

    NASA Technical Reports Server (NTRS)

    Davies, Carol B.; Venkatapathy, Ethiraj

    1995-01-01

    This new report on Version 2 of the SAGE code includes all the information in the original publication plus all upgrades and changes to the SAGE code since that time. The two most significant upgrades are the inclusion of a finite-volume option and the ability to adapt and manipulate zonal-matching multiple-grid files. In addition, the original SAGE code has been upgraded to Version 1.1 and includes all options mentioned in this report, with the exception of the multiple grid option and its associated features. Since Version 2 is a larger and more complex code, it is suggested (but not required) that Version 1.1 be used for single-grid applications. This document contains all the information required to run both versions of SAGE. The formulation of the adaptation method is described in the first section of this document. The second section is presented in the form of a user guide that explains the input and execution of the code. The third section provides many examples. Successful application of the SAGE code in both two and three dimensions for the solution of various flow problems has proven the code to be robust, portable, and simple to use. Although the basic formulation follows the method of Nakahashi and Deiwert, many modifications have been made to facilitate the use of the self-adaptive grid method for complex grid structures. Modifications to the method and the simple but extensive input options make this a flexible and user-friendly code. The SAGE code can accommodate two-dimensional and three-dimensional, finite-difference and finite-volume, single grid, and zonal-matching multiple grid flow problems.
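    The core idea behind self-adaptive gridding of this kind is to redistribute a fixed number of points so that a solution-based weight (e.g., a gradient measure) is spread evenly between neighboring points. A one-dimensional equidistribution sketch, an illustration of the principle rather than the SAGE algorithm itself:

```python
import numpy as np

def equidistribute(x, w, n):
    """Redistribute n grid points over x so the integral of the
    positive weight w is equal between neighboring points.

    x, w: arrays giving the old grid and a weight evaluated on it.
    Returns the n adapted grid-point locations.
    """
    # Cumulative trapezoidal integral of the weight along the grid.
    cum = np.concatenate([[0.0], np.cumsum(0.5 * (w[1:] + w[:-1]) * np.diff(x))])
    # Equal slices of the total weight define the new point positions.
    targets = np.linspace(0.0, cum[-1], n)
    return np.interp(targets, cum, x)
```

    With a uniform weight the grid stays uniform; with a weight peaked at a feature (a shock, say), points cluster there without increasing their total number.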

  3. Adaptive Online Sequential ELM for Concept Drift Tackling

    PubMed Central

    Basaruddin, Chan

    2016-01-01

    A machine learning method needs to adapt to changes in the environment over time. Such changes are known as concept drift. In this paper, we propose a concept-drift tackling method as an enhancement of Online Sequential Extreme Learning Machine (OS-ELM) and Constructive Enhancement OS-ELM (CEOS-ELM) by adding adaptive capability for classification and regression problems. The scheme is named adaptive OS-ELM (AOS-ELM). It is a single-classifier scheme that works well to handle real drift, virtual drift, and hybrid drift. The AOS-ELM also works well for sudden drift and recurrent context changes. The scheme is a simple unified method implemented in a few lines of code. We evaluated AOS-ELM on regression and classification problems by using public concept drift data sets (SEA and STAGGER) and other public data sets such as MNIST, USPS, and IDS. Experiments show that our method gives a higher kappa value compared to the multiclassifier ELM ensemble. Even though AOS-ELM in practice does not need the number of hidden nodes to increase, we address some issues related to increasing the hidden nodes, such as the error condition and rank values. We propose taking the rank of the pseudoinverse matrix as an indicator parameter to detect the “underfitting” condition. PMID:27594879

  4. Complex interplay between neutral and adaptive evolution shaped differential genomic background and disease susceptibility along the Italian peninsula.

    PubMed

    Sazzini, Marco; Gnecchi Ruscone, Guido Alberto; Giuliani, Cristina; Sarno, Stefania; Quagliariello, Andrea; De Fanti, Sara; Boattini, Alessio; Gentilini, Davide; Fiorito, Giovanni; Catanoso, Mariagrazia; Boiardi, Luigi; Croci, Stefania; Macchioni, Pierluigi; Mantovani, Vilma; Di Blasio, Anna Maria; Matullo, Giuseppe; Salvarani, Carlo; Franceschi, Claudio; Pettener, Davide; Garagnani, Paolo; Luiselli, Donata

    2016-09-01

    The Italian peninsula has long represented a natural hub for human migrations across the Mediterranean area, being involved in several prehistoric and historical population movements. Coupled with a patchy environmental landscape entailing different ecological/cultural selective pressures, this might have produced peculiar patterns of population structure and local adaptations responsible for the heterogeneous genomic background of present-day Italians. To disentangle this complex scenario, genome-wide data from 780 Italian individuals were generated and set into the context of European/Mediterranean genomic diversity by comparison with genotypes from 50 populations. To maximize the possibility of pinpointing functional genomic regions that have played adaptive roles during Italian natural history, our survey also included ~250,000 exomic markers and ~20,000 coding/regulatory variants with well-established clinical relevance. This enabled fine-grained dissection of Italian population structure through the identification of clusters of genetically homogeneous provinces and of genomic regions underlying their local adaptations. Description of such patterns disclosed crucial implications for understanding differential susceptibility to some inflammatory/autoimmune disorders, coronary artery disease and type 2 diabetes of diverse Italian subpopulations, suggesting the evolutionary causes that made some of them particularly exposed to the metabolic and immune challenges imposed by dietary and lifestyle shifts that involved western societies in the last centuries.

  5. Context-dependent adaptation of visually-guided arm movements and vestibular eye movements: role of the cerebellum

    NASA Technical Reports Server (NTRS)

    Lewis, Richard F.

    2003-01-01

    Accurate motor control requires adaptive processes that correct for gradual and rapid perturbations in the properties of the controlled object. The ability to quickly switch between different movement synergies using sensory cues, referred to as context-dependent adaptation, is a subject of considerable interest at present. The potential function of the cerebellum in context-dependent adaptation remains uncertain, but the data reviewed below suggest that it may play a fundamental role in this process.

  6. The numerical simulation tool for the MAORY multiconjugate adaptive optics system

    NASA Astrophysics Data System (ADS)

    Arcidiacono, C.; Schreiber, L.; Bregoli, G.; Diolaiti, E.; Foppiani, I.; Agapito, G.; Puglisi, A.; Xompero, M.; Oberti, S.; Cosentino, G.; Lombini, M.; Butler, R. C.; Ciliegi, P.; Cortecchia, F.; Patti, M.; Esposito, S.; Feautrier, P.

    2016-07-01

    The Multiconjugate Adaptive Optics RelaY (MAORY) is an adaptive optics module to be mounted on the ESO European Extremely Large Telescope (E-ELT). It is a hybrid natural and laser guide star system that will correct the atmospheric turbulence volume above the telescope, feeding the Multi-AO Imaging Camera for Deep Observations (MICADO) near-infrared spectro-imager. We developed an end-to-end Monte Carlo adaptive optics simulation tool to investigate the performance of MAORY and its calibration, acquisition, and operation strategies. MAORY will implement multiconjugate adaptive optics, combining Laser Guide Star (LGS) and Natural Guide Star (NGS) measurements. The simulation tool implements the various aspects of MAORY in an end-to-end fashion. The code has been developed in IDL and uses libraries in C++ and CUDA for efficiency. Here we recall the code architecture, describe the modeled instrument components, and present the control strategies implemented in the code.

  7. MAG3D and its application to internal flowfield analysis

    NASA Technical Reports Server (NTRS)

    Lee, K. D.; Henderson, T. L.; Choo, Y. K.

    1992-01-01

    MAG3D (multiblock adaptive grid, 3D) is a 3D solution-adaptive grid generation code which redistributes grid points to improve the accuracy of a flow solution without increasing the number of grid points. The code is applicable to structured grids with a multiblock topology. It is independent of the original grid generator and the flow solver. The code uses the coordinates of an initial grid and the flow solution interpolated onto the new grid. MAG3D uses a numerical mapping and potential theory to modify the grid distribution based on properties of the flow solution on the initial grid. The adaptation technique is discussed, and the capability of MAG3D is demonstrated with several internal flow examples. Advantages of using solution-adaptive grids are also shown by comparing flow solutions on adaptive grids with those on initial grids.

  8. An approach enabling adaptive FEC for OFDM in fiber-VLLC system

    NASA Astrophysics Data System (ADS)

    Wei, Yiran; He, Jing; Deng, Rui; Shi, Jin; Chen, Shenghai; Chen, Lin

    2017-12-01

    In this paper, we propose an orthogonal circulant matrix transform (OCT)-based adaptive frame-level forward error correction (FEC) scheme for fiber-visible laser light communication (VLLC) systems and experimentally demonstrate it with Reed-Solomon (RS) codes. In this method, no extra bits are spent on adaptation messages apart from the training sequence (TS), which is simultaneously used for synchronization and channel estimation. RS coding can therefore be performed adaptively frame by frame, driven by the codeword-error-rate (CER) feedback estimated from the TSs of the previous few OFDM frames. In addition, the experimental results show that, over 20 km of standard single-mode fiber (SSMF) and 8 m of visible light transmission, the overhead of RS codewords is up to 14.12% lower than that of conventional adaptive subcarrier-RS-code based 16-QAM OFDM at a bit error rate (BER) of 10^-5.
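    The frame-by-frame adaptation loop reduces to mapping the last estimated CER onto an RS code of appropriate strength. A sketch with hypothetical thresholds and a hypothetical code ladder (the paper's actual parameters are not reproduced here):

```python
def choose_rs_code(cer, ladder=((255, 239), (255, 223), (255, 191))):
    """Pick an RS(n, k) code from a ladder of increasing strength,
    based on the last observed codeword error rate (CER).

    Thresholds and the code ladder are illustrative only: a clean
    channel gets light protection (high rate), a noisy one gets
    strong protection (low rate).
    """
    if cer < 1e-4:
        return ladder[0]  # light protection, highest rate
    if cer < 1e-2:
        return ladder[1]
    return ladder[2]      # strong protection, lowest rate
```

    Because the CER estimate rides on training sequences already needed for synchronization and channel estimation, the adaptation itself costs no extra signaling bits, which is the point the abstract emphasizes.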

  9. A study on multiresolution lossless video coding using inter/intra frame adaptive prediction

    NASA Astrophysics Data System (ADS)

    Nakachi, Takayuki; Sawabe, Tomoko; Fujii, Tetsuro

    2003-06-01

    Lossless video coding is required in the fields of archiving and editing digital cinema or digital broadcasting contents. This paper combines a discrete wavelet transform and adaptive inter/intra-frame prediction in the wavelet transform domain to create multiresolution lossless video coding. The multiresolution structure offered by the wavelet transform facilitates interchange among several video source formats such as Super High Definition (SHD) images, HDTV, SDTV, and mobile applications. Adaptive inter/intra-frame prediction is an extension of JPEG-LS, a state-of-the-art lossless still image compression standard. Based on the image statistics of the wavelet transform domains in successive frames, inter/intra frame adaptive prediction is applied to the appropriate wavelet transform domain. This adaptation offers superior compression performance. This is achieved with low computational cost and no increase in additional information. Experiments on digital cinema test sequences confirm the effectiveness of the proposed algorithm.
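    The inter/intra switching can be illustrated by computing both residuals for a wavelet-subband block and keeping whichever predictor is cheaper, a simplified stand-in for the statistics-based decision the abstract describes:

```python
import numpy as np

def select_prediction(block, prev_frame_block, intra_pred):
    """Choose inter- or intra-frame prediction for a subband block.

    Compares the summed absolute residual of temporal (inter)
    prediction from the previous frame against a spatial (intra)
    prediction, returning the chosen mode and its residual.
    """
    inter_res = block - prev_frame_block
    intra_res = block - intra_pred
    if np.abs(inter_res).sum() <= np.abs(intra_res).sum():
        return "inter", inter_res
    return "intra", intra_res
```

    In a static scene the inter residual is near zero and temporal prediction wins; at a scene cut the intra predictor takes over, which is the adaptation that yields the compression gain at low computational cost.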

  10. A-7 Aloft Demonstration Flight Test Plan

    DTIC Science & Technology

    1975-09-01

    6095979 72A2130 Power Supply 12 VDC 6095681 72A29 NOC 72A30 ALOFT ASCU Adapter Set 72A3100 ALOFT ASCU Adapter L20-249-1 72A3110 Page assy L Bay and ASCU ...checks will also be performed for each of the following: 3.1.2.1.1 ASCU Codes. Verification will be made that all legal ASCU codes are recognized and...invalid codes inhibit attack mode. A check will also be made to verify that the ASCU codes for pilot-option weapons A-25 enable the retarded weapons

  11. Barriers and facilitators to the implementation of person-centred care in different healthcare contexts.

    PubMed

    Moore, Lucy; Britten, Nicky; Lydahl, Doris; Naldemirci, Öncel; Elam, Mark; Wolf, Axel

    2017-12-01

    To empower patients and improve the quality of care, policy-makers increasingly adopt systems to enhance person-centred care. Although models of person-centredness and patient-centredness vary, respecting the needs and preferences of individuals receiving care is paramount. In Sweden, as in other countries, healthcare providers seek to improve person-centred principles and address gaps in practice. Consequently, researchers at the University of Gothenburg Centre for Person-Centred Care are currently delivering person-centred interventions employing a framework that incorporates three routines. These include eliciting the patient's narrative, agreeing a partnership with shared goals between patient and professional, and safeguarding this through documentation. The aim of this study was to explore the barriers and facilitators to the delivery of person-centred care interventions in different contexts. Qualitative interviews were conducted with a purposeful sample of 18 researchers from seven research studies across contrasting healthcare settings. Interviews were transcribed, translated and thematically analysed, adopting some basic features of grounded theory. The ethical code of conduct was followed and conformed to the ethical guidelines adopted by the Swedish Research Council. Barriers to the implementation of person-centred care covered three themes: traditional practices and structures; sceptical, stereotypical attitudes from professionals; and factors related to the development of person-centred interventions. Facilitators included organisational factors, leadership and training and an enabling attitude and approach by professionals. Trained project managers, patients taking an active role in research and adaptive strategies by researchers all helped person-centred care delivery. At the University of Gothenburg, a model of person-centred care is being initiated and integrated into practice through research. 
Knowledgeable, well-trained professionals facilitate the routines of narrative elicitation and partnership. Strong leadership and adaptive strategies are important for overcoming existing practices, routines and methods of documentation. This study provides guidance for practitioners when delivering and adapting person-centred care in different contexts. © 2016 The Authors. Scandinavian Journal of Caring Sciences published by John Wiley & Sons Ltd on behalf of Nordic College of Caring Science.

  12. Adaptive grid embedding for the two-dimensional flux-split Euler equations. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Warren, Gary Patrick

    1990-01-01

    A numerical algorithm is presented for solving the 2-D flux-split Euler equations using a multigrid method with adaptive grid embedding. The method uses an unstructured data set along with a system of pointers for communication on the irregularly shaped grid topologies. An explicit two-stage time advancement scheme is implemented. A multigrid algorithm is used to provide grid level communication and to accelerate the convergence of the solution to steady state. Results are presented for a subcritical airfoil and a transonic airfoil with 3 levels of adaptation. Comparisons are made with a structured upwind Euler code which uses the same flux integration techniques as the present algorithm. Good agreement is obtained with converged surface pressure coefficients. The lift coefficients of the adaptive code are within 2 1/2 percent of the structured code for the subcritical case and within 4 1/2 percent of the structured code for the transonic case using approximately one-third the number of grid points.

  13. A methodological survey identified eight proposed frameworks for the adaptation of health related guidelines.

    PubMed

    Darzi, Andrea; Abou-Jaoude, Elias A; Agarwal, Arnav; Lakis, Chantal; Wiercioch, Wojtek; Santesso, Nancy; Brax, Hneine; El-Jardali, Fadi; Schünemann, Holger J; Akl, Elie A

    2017-06-01

    Our objective was to identify and describe published frameworks for the adaptation of clinical, public health, and health services guidelines. We included reports describing methods of adaptation of guidelines in sufficient detail to allow their reproduction. We searched the Medline and EMBASE databases. We also searched personal files, as well as manuals and handbooks of organizations and professional societies that proposed methods of adaptation and adoption of guidelines. We followed standard systematic review methodology. Our search captured 12,021 citations, out of which we identified eight proposed methods of guideline adaptation: ADAPTE, Adapted ADAPTE, Alberta Ambassador Program adaptation phase, GRADE-ADOLOPMENT, MAGIC, RAPADAPTE, Royal College of Nursing (RCN), and Systematic Guideline Review (SGR). The ADAPTE framework consists of a 24-step process to adapt guidelines to a local context taking into consideration the needs, priorities, legislation, policies, and resources. The Alexandria Center for Evidence-Based Clinical Practice Guidelines updated one of ADAPTE's tools, modified three tools, and added three new ones. In addition, they proposed optionally using three other tools. The Alberta Ambassador Program adaptation phase consists of 11 steps and focused on adapting good-quality guidelines for nonspecific low back pain into a local context. GRADE-ADOLOPMENT is an eight-step process based on the GRADE Working Group's Evidence to Decision frameworks, applied to 22 guidelines in the context of a national guideline development program. The MAGIC research program developed a five-step adaptation process, informed by ADAPTE and the GRADE approach, in the context of adapting thrombosis guidelines. The RAPADAPTE framework consists of 12 steps based on ADAPTE and using synthesized evidence databases, retrospectively derived from the experience of producing a high-quality guideline for the treatment of breast cancer with limited resources in Costa Rica. 
    The RCN outlines a five-step strategy for the adaptation of guidelines to the local context. The SGR method consists of nine steps and takes into consideration both methodological gaps and context-specific normative issues in source guidelines. Through searching personal files, we also identified two abandoned methods. We identified and described eight proposed frameworks for the adaptation of health-related guidelines. There is a need to evaluate these frameworks to assess the rigor, efficiency, and transparency of their proposed processes. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Improve load balancing and coding efficiency of tiles in high efficiency video coding by adaptive tile boundary

    NASA Astrophysics Data System (ADS)

    Chan, Chia-Hsin; Tu, Chun-Chuan; Tsai, Wen-Jiin

    2017-01-01

    High efficiency video coding (HEVC) not only improves the coding efficiency drastically compared to the well-known H.264/AVC but also introduces coding tools for parallel processing, one of which is tiles. Tile partitioning is allowed to be arbitrary in HEVC, but how to decide tile boundaries remains an open issue. An adaptive tile boundary (ATB) method is proposed to select a better tile partitioning to improve load balancing (ATB-LoadB) and coding efficiency (ATB-Gain) with a unified scheme. Experimental results show that, compared to ordinary uniform-space partitioning, the proposed ATB can save up to 17.65% of encoding times in parallel encoding scenarios and can reduce up to 0.8% of total bit rates for coding efficiency.
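    Tile-boundary selection for load balancing can be sketched as placing boundaries so that each tile's estimated coding cost is as even as possible. A greedy one-dimensional illustration over CTU-column costs (a toy heuristic, not the proposed ATB algorithm):

```python
def balance_tile_columns(ctu_costs, n_tiles):
    """Place n_tiles - 1 vertical tile boundaries over a row of
    CTU-column costs so that each tile's summed cost approaches
    total / n_tiles. Returns the boundary indices.
    """
    target = sum(ctu_costs) / n_tiles
    bounds, acc = [], 0.0
    for i, cost in enumerate(ctu_costs):
        acc += cost
        if acc >= target and len(bounds) < n_tiles - 1:
            bounds.append(i + 1)  # boundary after column i
            acc = 0.0
    return bounds
```

    With uniform costs this reduces to uniform-space partitioning; with skewed costs the boundaries shift toward the expensive columns, which is the load-balancing effect the ATB method exploits in parallel encoding.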

  15. Tsunami modelling with adaptively refined finite volume methods

    USGS Publications Warehouse

    LeVeque, R.J.; George, D.L.; Berger, M.J.

    2011-01-01

    Numerical modelling of transoceanic tsunami propagation, together with the detailed modelling of inundation of small-scale coastal regions, poses a number of algorithmic challenges. The depth-averaged shallow water equations can be used to reduce this to a time-dependent problem in two space dimensions, but even so it is crucial to use adaptive mesh refinement in order to efficiently handle the vast differences in spatial scales. This must be done in a 'well-balanced' manner that accurately captures very small perturbations to the steady state of the ocean at rest. Inundation can be modelled by allowing cells to dynamically change from dry to wet, but this must also be done carefully near refinement boundaries. We discuss these issues in the context of Riemann-solver-based finite volume methods for tsunami modelling. Several examples are presented using the GeoClaw software, and sample codes are available to accompany the paper. The techniques discussed also apply to a variety of other geophysical flows. © 2011 Cambridge University Press.
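    The refinement criterion described, targeting cells whose state deviates from the ocean-at-rest equilibrium, can be sketched as a simple flagging function (an illustration of the idea, not GeoClaw's actual implementation):

```python
import numpy as np

def flag_for_refinement(eta, sea_level=0.0, tol=1e-3):
    """Flag cells whose surface elevation eta (in meters) perturbs
    the ocean-at-rest steady state by more than tol, marking them
    as candidates for mesh refinement.
    """
    return np.abs(eta - sea_level) > tol
```

    A well-balanced scheme guarantees the unperturbed state stays exactly at rest numerically, so any flagged deviation reflects the physical wave rather than discretization noise, and refinement follows the tsunami as it propagates.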

  16. Statistical context shapes stimulus-specific adaptation in human auditory cortex

    PubMed Central

    Henry, Molly J.; Fromboluti, Elisa Kim; McAuley, J. Devin

    2015-01-01

    Stimulus-specific adaptation is the phenomenon whereby neural response magnitude decreases with repeated stimulation. Inconsistencies between recent nonhuman animal recordings and computational modeling suggest dynamic influences on stimulus-specific adaptation. The present human electroencephalography (EEG) study investigates the potential role of statistical context in dynamically modulating stimulus-specific adaptation by examining the auditory cortex-generated N1 and P2 components. As in previous studies of stimulus-specific adaptation, listeners were presented with oddball sequences in which the presentation of a repeated tone was infrequently interrupted by rare spectral changes taking on three different magnitudes. Critically, the statistical context varied with respect to the probability of small versus large spectral changes within oddball sequences (half of the time a small change was most probable; in the other half a large change was most probable). We observed larger N1 and P2 amplitudes (i.e., release from adaptation) for all spectral changes in the small-change compared with the large-change statistical context. The increase in response magnitude also held for responses to tones presented with high probability, indicating that statistical adaptation can overrule stimulus probability per se in its influence on neural responses. Computational modeling showed that the degree of coadaptation in auditory cortex changed depending on the statistical context, which in turn affected stimulus-specific adaptation. Thus the present data demonstrate that stimulus-specific adaptation in human auditory cortex critically depends on statistical context. Finally, the present results challenge the implicit assumption of stationarity of neural response magnitudes that governs the practice of isolating established deviant-detection responses such as the mismatch negativity. PMID:25652920

  17. More About Vector Adaptive/Predictive Coding Of Speech

    NASA Technical Reports Server (NTRS)

    Jedrey, Thomas C.; Gersho, Allen

    1992-01-01

    Report presents additional information about digital speech-encoding and -decoding system described in "Vector Adaptive/Predictive Encoding of Speech" (NPO-17230). Summarizes development of vector adaptive/predictive coding (VAPC) system and describes basic functions of algorithm. Describes refinements introduced enabling receiver to cope with errors. VAPC algorithm implemented in integrated-circuit coding/decoding processors (codecs). VAPC and other codecs tested under variety of operating conditions. Tests designed to reveal effects of various background quiet and noisy environments and of poor telephone equipment. VAPC found competitive with and, in some respects, superior to other 4.8-kb/s codecs and other codecs of similar complexity.

  18. Building locally relevant ethics curricula for nursing education in Botswana.

    PubMed

    Barchi, F; Kasimatis Singleton, M; Magama, M; Shaibu, S

    2014-12-01

    The goal of this multi-institutional collaboration was to develop an innovative, locally relevant ethics curriculum for nurses in Botswana. Nurses in Botswana face ethical challenges that are compounded by lack of resources, pressures to handle tasks beyond training or professional levels, workplace stress and professional isolation. Capacity to teach nursing ethics in the classroom and in professional practice settings has been limited. A pilot curriculum, including cases set in local contexts, was tested with nursing faculty in Botswana in 2012. Thirty-three per cent of the faculty members indicated they would be more comfortable teaching ethics. A substantial number of faculty members were more likely to introduce the International Council of Nurses Code of Ethics in teaching, practice and mentoring as a result of the training. Based on evaluation data, curricular materials were developed using the Code and the regulatory requirements for nursing practice in Botswana. A web-based repository of sample lectures, discussion cases and evaluation rubrics was created to support the use of the materials. A new master degree course, Nursing Ethics in Practice, has been proposed for fall 2015 at the University of Botswana. The modular nature of the materials and the availability of cases set within the context of clinical nurse practice in Botswana make them readily adaptable to various student academic levels and continuing professional development programmes. The ICN Code of Ethics for Nursing is a valuable teaching tool in developing countries when taught using locally relevant case materials and problem-based teaching methods. The approach used in the development of a locally relevant nursing ethics curriculum in Botswana can serve as a model for nursing education and continuing professional development programmes in other sub-Saharan African countries to enhance use of the ICN Code of Ethics in nursing practice. © 2014 International Council of Nurses.

  19. Accelerating Convolutional Sparse Coding for Curvilinear Structures Segmentation by Refining SCIRD-TS Filter Banks.

    PubMed

    Annunziata, Roberto; Trucco, Emanuele

    2016-11-01

    Deep learning has shown great potential for curvilinear structure (e.g., retinal blood vessels and neurites) segmentation as demonstrated by a recent auto-context regression architecture based on filter banks learned by convolutional sparse coding. However, learning such filter banks is very time-consuming, thus limiting the number of filters employed and the adaptation to other data sets (i.e., slow re-training). We address this limitation by proposing a novel acceleration strategy to speed up convolutional sparse coding filter learning for curvilinear structure segmentation. Our approach is based on a novel initialisation strategy (warm start), and therefore it differs from recent methods that improve the optimisation itself. Our warm-start strategy is based on carefully designed hand-crafted filters (SCIRD-TS), modelling appearance properties of curvilinear structures, which are then refined by convolutional sparse coding. Experiments on four diverse data sets, including retinal blood vessels and neurites, suggest that the proposed method significantly reduces the time taken to learn convolutional filter banks (by up to 82%) compared to conventional initialisation strategies. Remarkably, this speed-up does not worsen performance; in fact, filters learned with the proposed strategy often achieve a much lower reconstruction error and match or exceed the segmentation performance of random and DCT-based initialisation when used as input to a random forest classifier.

  20. Adaptive Wavelet Coding Applied in a Wireless Control System.

    PubMed

    Gama, Felipe O S; Silveira, Luiz F Q; Salazar, Andrés O

    2017-12-13

    Wireless control systems can sense, control and act on the information exchanged between the wireless sensor nodes in a control loop. However, the exchanged information becomes susceptible to the degenerative effects produced by multipath propagation. Several techniques have recently been investigated to minimize the destructive effects characteristic of wireless channels. Among them, wavelet coding is a good alternative for wireless communications because of its robustness to the effects of multipath and its low computational complexity. This work proposes an adaptive wavelet coding scheme whose code rate and signal constellation vary according to the fading level, and evaluates the use of this transmission system in a control loop implemented by wireless sensor nodes. The performance of the adaptive system was evaluated in terms of bit error rate (BER) versus Eb/N0 and spectral efficiency, considering a time-varying channel with flat Rayleigh fading, and in terms of processing overhead on a control system with wireless communication. The results obtained through computational simulations and experimental tests show the performance gains obtained by inserting the adaptive wavelet coding into a control loop with nodes interconnected by a wireless link. These results support the use of this technique in wireless control loops.
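    The core adaptation idea above — trading spectral efficiency for robustness as the channel degrades — can be sketched as a simple threshold policy. The SNR thresholds and the mode set below are illustrative assumptions, not the parameters used in this work:

    ```python
    def select_mode(est_snr_db):
        """Pick a (code rate, constellation) pair from the estimated channel
        SNR in dB. Better channels get higher spectral efficiency; deep fades
        fall back to robust low-rate modes. Thresholds are hypothetical."""
        modes = [                       # (min SNR dB, code rate, constellation)
            (18.0, 3 / 4, "16-QAM"),
            (10.0, 1 / 2, "QPSK"),
            (float("-inf"), 1 / 4, "BPSK"),
        ]
        for threshold, rate, constellation in modes:
            if est_snr_db >= threshold:
                return rate, constellation

    print(select_mode(22.0))   # good channel: efficient mode
    print(select_mode(4.0))    # deep fade: robust mode
    ```

    A real system would additionally hysterese the thresholds so that noise in the SNR estimate does not cause rapid mode flapping.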

  1. Reflexive Principlism as an Effective Approach for Developing Ethical Reasoning in Engineering.

    PubMed

    Beever, Jonathan; Brightman, Andrew O

    2016-02-01

    An important goal of teaching ethics to engineering students is to enhance their ability to make well-reasoned ethical decisions in their engineering practice: a goal in line with the stated ethical codes of professional engineering organizations. While engineering educators have explored a wide range of methodologies for teaching ethics, a satisfying model for developing ethical reasoning skills has not been adopted broadly. In this paper we argue that a principlist-based approach to ethical reasoning is uniquely suited to engineering ethics education. Reflexive Principlism is an approach to ethical decision-making that focuses on internalizing a reflective and iterative process of specification, balancing, and justification of four core ethical principles in the context of specific cases. In engineering, that approach provides structure to ethical reasoning while allowing the flexibility for adaptation to varying contexts through specification. Reflexive Principlism integrates well with the prevalent and familiar methodologies of reasoning within the engineering disciplines as well as with the goals of engineering ethics education.

  2. Yellow and social perceptions of racing cyclists' sportspersonship: Proposing an inter-contextual analysis.

    PubMed

    Chantal, Yves; Bernache-Assollant, Iouri

    2017-03-01

    Through inter-contextual designs, the present set of experiments explored whether the colour yellow affects social perceptions of sportspersonship exclusively in relation to competitive cycling. In Experiment 1 (N = 149), a silhouette image of a cyclist on a yellow background yielded lower perceptions of sportspersonship than on grey, or than in the context of motocross regardless of colour. That interaction was conceptually replicated in Experiment 2 (N = 146) while changing the measures (i.e., an adaptation of the World Anti-Doping Code) and the comparison context to sprinting. Furthermore, female and male observers' scores did not differ significantly, suggesting that yellow affected perceived sportspersonship similarly across gender. On the whole, these findings suggest that yellow can generate negative impressions of racing cyclists because, over the years, this colour has come to connote opportunism through its frequent pairing with doping. We close by discussing a number of limitations and future research avenues.

  3. Mistranslation: from adaptations to applications.

    PubMed

    Hoffman, Kyle S; O'Donoghue, Patrick; Brandl, Christopher J

    2017-11-01

    The conservation of the genetic code indicates that there was a single origin, but like all genetic material, the cell's interpretation of the code is subject to evolutionary pressure. Single nucleotide variations in tRNA sequences can modulate codon assignments by altering codon-anticodon pairing or tRNA charging. Either can increase translation errors and even change the code. The frozen accident hypothesis argued that changes to the code would destabilize the proteome and reduce fitness. In studies of model organisms, mistranslation often acts as an adaptive response. These studies reveal evolutionary conserved mechanisms to maintain proteostasis even during high rates of mistranslation. This review discusses the evolutionary basis of altered genetic codes, how mistranslation is identified, and how deviations to the genetic code are exploited. We revisit early discoveries of genetic code deviations and provide examples of adaptive mistranslation events in nature. Lastly, we highlight innovations in synthetic biology to expand the genetic code. The genetic code is still evolving. Mistranslation increases proteomic diversity that enables cells to survive stress conditions or suppress a deleterious allele. Genetic code variants have been identified by genome and metagenome sequence analyses, suppressor genetics, and biochemical characterization. Understanding the mechanisms of translation and genetic code deviations enables the design of new codes to produce novel proteins. Engineering the translation machinery and expanding the genetic code to incorporate non-canonical amino acids are valuable tools in synthetic biology that are impacting biomedical research. This article is part of a Special Issue entitled "Biochemistry of Synthetic Biology - Recent Developments" Guest Editor: Dr. Ilka Heinemann and Dr. Patrick O'Donoghue. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Adaptive Core Simulation Employing Discrete Inverse Theory - Part II: Numerical Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abdel-Khalik, Hany S.; Turinsky, Paul J.

    2005-07-15

    Use of adaptive simulation is intended to improve the fidelity and robustness of important core attribute predictions such as core power distribution, thermal margins, and core reactivity. Adaptive simulation utilizes a selected set of past and current reactor measurements of reactor observables, i.e., in-core instrumentation readings, to adapt the simulation in a meaningful way. The companion paper, "Adaptive Core Simulation Employing Discrete Inverse Theory - Part I: Theory," describes in detail the theoretical background of the proposed adaptive techniques. This paper, Part II, demonstrates several computational experiments conducted to assess the fidelity and robustness of the proposed techniques. The intent is to check the ability of the adapted core simulator model to predict future core observables that are not included in the adaption or core observables that are recorded at core conditions that differ from those at which adaption is completed. Also, this paper demonstrates successful utilization of an efficient sensitivity analysis approach to calculate the sensitivity information required to perform the adaption for millions of input core parameters. Finally, this paper illustrates a useful application for adaptive simulation - reducing the inconsistencies between two different core simulator code systems, where the multitudes of input data to one code are adjusted to enhance the agreement between both codes for important core attributes, i.e., core reactivity and power distribution. Also demonstrated is the robustness of such an application.

  5. Ethical, Legal and Social Issues related to the health data-warehouses: re-using health data in the research and public health research.

    PubMed

    Lamas, Eugenia; Barh, Anne; Brown, Dario; Jaulent, Marie-Christine

    2015-01-01

    Research derived from the application of information and communication technologies in medicine operates in a context involving the globalization of the collection, sharing, storage, transfer and re-use of personal health data. The computerization of health data within clinical information systems (such as Electronic Healthcare Records) should allow the re-use of health data for clinical research and public health purposes. One of the objects allowing the integration of healthcare and research information systems is the health data-warehouse (DWH). However, the ethical-legal frameworks in force are not adapted to these DWHs because they were not conceived for re-using data in a different context than that of their acquisition. Accordingly, the modalities of access to data-warehouses must ensure respect for patients' rights: information to the patient, as well as confidentiality and security. Through a literature review, several Ethical, Legal and Social Issues (ELSI) were identified: patients' rights; modalities of implementation of the DWHs; solidarity and the common good; and transparency and trust. A comparative analysis between Directive 95/46/CE and the "Proposal for regulation on protection of individuals with regard to the processing of personal data" shows that this regulation is intended to allow the re-use of key-coded data for scientific purposes. However, since this new regulation does not align with the ethical and legal requirements at an operational level, a Code of Practice on Secondary Use of Medical Data in Scientific Research Projects has been developed at the European level. This Code provides guidance for the Innovative Medicines Initiative (IMI) and will help to propose practical solutions to overcome the issue of the re-use of data for research purposes.

  6. RD Optimized, Adaptive, Error-Resilient Transmission of MJPEG2000-Coded Video over Multiple Time-Varying Channels

    NASA Astrophysics Data System (ADS)

    Bezan, Scott; Shirani, Shahram

    2006-12-01

    To reliably transmit video over error-prone channels, the data should be both source and channel coded. When multiple channels are available for transmission, the problem extends to that of partitioning the data across these channels. The condition of transmission channels, however, varies with time. Therefore, the error protection added to the data at one instant of time may not be optimal at the next. In this paper, we propose a method for adaptively adding error-correction coding to an MJPEG2000 constant-rate-coded frame of video in a rate-distortion (RD) optimized manner, using rate-compatible punctured convolutional (RCPC) codes. We analyze the rate-distortion tradeoff of each of the coding units (tiles and packets) in each frame and adapt the error-correction code assigned to each unit, taking into account the bandwidth and error characteristics of the channels. This method is applied to both single and multiple time-varying channel environments. We compare our method with a basic protection method in which data is either not transmitted, transmitted with no protection, or transmitted with a fixed amount of protection. Simulation results show promising performance for our proposed method.
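    The per-unit RD analysis described above amounts to spending parity bits where they buy the largest drop in expected distortion. A minimal greedy sketch of that allocation, with hypothetical per-tile (parity bits, expected distortion) options rather than actual RCPC rates:

    ```python
    def allocate_protection(units, budget):
        """Greedy RD allocation: repeatedly upgrade the unit whose next
        protection level saves the most expected distortion per extra parity
        bit, until the rate budget is spent. `units` maps a unit name to a
        list of (parity_bits, expected_distortion) options, assumed ordered
        by strictly increasing protection and convex in the RD sense."""
        level = {u: 0 for u in units}
        spent = 0
        while True:
            best = None
            for u, opts in units.items():
                i = level[u]
                if i + 1 < len(opts):
                    extra = opts[i + 1][0] - opts[i][0]   # extra parity bits
                    saved = opts[i][1] - opts[i + 1][1]   # distortion saved
                    if spent + extra <= budget and (best is None or saved / extra > best[0]):
                        best = (saved / extra, u)
            if best is None:
                return level, spent
            u = best[1]
            spent += units[u][level[u] + 1][0] - units[u][level[u]][0]
            level[u] += 1

    tiles = {
        "tile0": [(0, 100.0), (8, 40.0), (16, 30.0)],   # fragile, important tile
        "tile1": [(0, 20.0), (8, 12.0), (16, 10.0)],    # already robust tile
    }
    level, spent = allocate_protection(tiles, budget=16)
    print(level, spent)
    ```

    With a 16-bit budget the whole budget goes to the fragile tile, mirroring the paper's point that protection should follow the channel and content, not be fixed per unit.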

  7. Mutual information-based analysis of JPEG2000 contexts.

    PubMed

    Liu, Zhen; Karam, Lina J

    2005-04-01

    Context-based arithmetic coding has been widely adopted in image and video compression and is a key component of the new JPEG2000 image compression standard. In this paper, the contexts used in JPEG2000 are analyzed using the mutual information, which is closely related to the compression performance. We first show that, when combining the contexts, the mutual information between the contexts and the encoded data will decrease unless the conditional probability distributions of the combined contexts are the same. Given I, the initial number of contexts, and F, the final desired number of contexts, there are S(I, F) possible context classification schemes where S(I, F) is called the Stirling number of the second kind. The optimal classification scheme is the one that gives the maximum mutual information. Instead of using an exhaustive search, the optimal classification scheme can be obtained through a modified generalized Lloyd algorithm with the relative entropy as the distortion metric. For binary arithmetic coding, the search complexity can be reduced by using dynamic programming. Our experimental results show that the JPEG2000 contexts capture the correlations among the wavelet coefficients very well. At the same time, the number of contexts used as part of the standard can be reduced without loss in the coding performance.
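    The key fact this analysis builds on — merging contexts reduces the mutual information between context and coded symbol unless their conditional distributions coincide — is easy to verify numerically. A small sketch with toy counts (not JPEG2000's actual context tables):

    ```python
    import numpy as np

    def mutual_information(joint):
        """I(C;X) in bits from a joint count table (contexts x symbols)."""
        p = joint / joint.sum()
        pc = p.sum(axis=1, keepdims=True)   # context marginal
        px = p.sum(axis=0, keepdims=True)   # symbol marginal
        mask = p > 0
        return float((p[mask] * np.log2(p[mask] / (pc @ px)[mask])).sum())

    # Toy counts of a coded binary symbol under four contexts.
    counts = np.array([
        [90, 10],   # context 0: mostly 0s
        [80, 20],   # context 1
        [30, 70],   # context 2
        [30, 70],   # context 3: same conditional distribution as context 2
    ], dtype=float)

    i_full = mutual_information(counts)

    # Merging contexts 2 and 3 (identical conditionals) loses no information...
    merged_same = np.vstack([counts[:2], counts[2] + counts[3]])
    # ...while merging contexts 0 and 2 (different conditionals) does.
    merged_diff = np.vstack([counts[0] + counts[2], counts[1:2], counts[3:]])

    print(round(i_full, 4),
          round(mutual_information(merged_same), 4),
          round(mutual_information(merged_diff), 4))
    ```

    Searching over all S(I, F) merge schemes for the one with maximal mutual information is exactly the classification problem the paper solves with a generalized Lloyd algorithm instead of exhaustive enumeration.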

  8. Finding your way through EOL challenges in the ICU using Adaptive Leadership behaviours: A qualitative descriptive case study.

    PubMed

    Adams, Judith A; Bailey, Donald E; Anderson, Ruth A; Thygeson, Marcus

    2013-12-01

    Using the Adaptive Leadership framework, we describe behaviours that providers used while interacting with family members facing the challenges of recognising that their loved one was dying in the ICU. In this prospective pilot case study, we selected one ICU patient with end-stage illness who lacked decision-making capacity. Participants included four family members, one nurse and two physicians. The principle investigator observed and recorded three family conferences and conducted one in-depth interview with the family. Three members of the research team independently coded the transcripts using a priori codes to describe the Adaptive Leadership behaviours that providers used to facilitate the family's adaptive work, met to compare and discuss the codes and resolved all discrepancies. We identified behaviours used by nurses and physicians that facilitated the family's ability to adapt to the impending death of a loved one. Examples of these behaviours include defining the adaptive challenges for families and foreshadowing a poor prognosis. Nurse and physician Adaptive Leadership behaviours can facilitate the transition from curative to palliative care by helping family members do the adaptive work of letting go. Further research is warranted to create knowledge for providers to help family members adapt. Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. Context-Specific Adaptation of Gravity-Dependent Vestibular Reflex Responses (NSBRI Neurovestibular Project 1)

    NASA Technical Reports Server (NTRS)

    Shelhamer, Mark; Goldberg, Jefim; Minor, Lloyd B.; Paloski, William H.; Young, Laurence R.; Zee, David S.

    1999-01-01

    Impairment of gaze and head stabilization reflexes can lead to disorientation and reduced performance in sensorimotor tasks such as piloting of spacecraft. Transitions between different gravitoinertial force (gif) environments - as during different phases of space flight - provide an extreme test of the adaptive capabilities of these mechanisms. We wish to determine to what extent the sensorimotor skills acquired in one gravity environment will transfer to others, and to what extent gravity serves as a context cue for inhibiting such transfer. We use the general approach of adapting a response (saccades, vestibuloocular reflex: VOR, or vestibulocollic reflex: VCR) to a particular change in gain or phase in one gif condition, adapting to a different gain or phase in a second gif condition, and then seeing if gif itself - the context cue - can recall the previously-learned adapted responses. Previous evidence indicates that unless there is specific training to induce context-specificity, reflex adaptation is sequential rather than simultaneous. Various experiments in this project investigate the behavioral properties, neurophysiological basis, and anatomical substrate of context-specific learning, using otolith (gravity) signals as a context cue. In the following, we outline the methods for all experiments in this project, and provide details and results on selected experiments.

  10. A progressive data compression scheme based upon adaptive transform coding: Mixture block coding of natural images

    NASA Technical Reports Server (NTRS)

    Rost, Martin C.; Sayood, Khalid

    1991-01-01

    A method for efficiently coding natural images using a vector-quantized, variable-block-size transform source coder is presented. The method, mixture block coding (MBC), incorporates variable-rate coding by using a mixture of discrete cosine transform (DCT) source coders. The coder used for any given image region is selected through a threshold-driven distortion criterion. In this paper, MBC is used in two different applications. The base method is concerned with single-pass low-rate image data compression. The second is a natural extension of the base method which allows for low-rate progressive transmission (PT). Since the base method adapts easily to progressive coding, it offers the aesthetic advantage of progressive coding without incurring extensive channel overhead. Image compression rates of approximately 0.5 bit/pel are demonstrated for both monochrome and color images.
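    The threshold-driven selection among DCT coders can be sketched as follows: each candidate coder keeps a different number of low-frequency DCT coefficients, and the cheapest coder meeting a distortion threshold is chosen per block. The block size, coder set and threshold here are illustrative, not those of MBC itself:

    ```python
    import numpy as np

    def dct_matrix(n):
        """Orthonormal DCT-II matrix of size n x n."""
        k = np.arange(n)
        m = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
        m[0] /= np.sqrt(2)
        return m * np.sqrt(2 / n)

    def code_block(block, keep):
        """Reconstruct a block from its `keep` x `keep` lowest-frequency
        DCT coefficients."""
        d = dct_matrix(block.shape[0])
        coeff = d @ block @ d.T
        mask = np.zeros_like(coeff)
        mask[:keep, :keep] = 1.0
        return d.T @ (coeff * mask) @ d

    def mbc_select(block, coders=(1, 2, 4), threshold=1.0):
        """Pick the cheapest coder (fewest kept coefficients) whose mean
        squared error stays under the distortion threshold."""
        for keep in coders:
            recon = code_block(block, keep)
            if np.mean((block - recon) ** 2) <= threshold:
                return keep
        return coders[-1]

    smooth = np.full((4, 4), 7.0)            # flat block: DC alone suffices
    busy = np.arange(16.0).reshape(4, 4)     # gradient block needs more terms
    print(mbc_select(smooth), mbc_select(busy))
    ```

    Smooth regions end up cheap while detailed regions get the higher-rate coder, which is the source of MBC's variable-rate behaviour.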

  11. Statistical context shapes stimulus-specific adaptation in human auditory cortex.

    PubMed

    Herrmann, Björn; Henry, Molly J; Fromboluti, Elisa Kim; McAuley, J Devin; Obleser, Jonas

    2015-04-01

    Stimulus-specific adaptation is the phenomenon whereby neural response magnitude decreases with repeated stimulation. Inconsistencies between recent nonhuman animal recordings and computational modeling suggest dynamic influences on stimulus-specific adaptation. The present human electroencephalography (EEG) study investigates the potential role of statistical context in dynamically modulating stimulus-specific adaptation by examining the auditory cortex-generated N1 and P2 components. As in previous studies of stimulus-specific adaptation, listeners were presented with oddball sequences in which the presentation of a repeated tone was infrequently interrupted by rare spectral changes taking on three different magnitudes. Critically, the statistical context varied with respect to the probability of small versus large spectral changes within oddball sequences (half of the time a small change was most probable; in the other half a large change was most probable). We observed larger N1 and P2 amplitudes (i.e., release from adaptation) for all spectral changes in the small-change compared with the large-change statistical context. The increase in response magnitude also held for responses to tones presented with high probability, indicating that statistical adaptation can overrule stimulus probability per se in its influence on neural responses. Computational modeling showed that the degree of coadaptation in auditory cortex changed depending on the statistical context, which in turn affected stimulus-specific adaptation. Thus the present data demonstrate that stimulus-specific adaptation in human auditory cortex critically depends on statistical context. Finally, the present results challenge the implicit assumption of stationarity of neural response magnitudes that governs the practice of isolating established deviant-detection responses such as the mismatch negativity. Copyright © 2015 the American Physiological Society.

  12. Nyx: Adaptive mesh, massively-parallel, cosmological simulation code

    NASA Astrophysics Data System (ADS)

    Almgren, Ann; Beckner, Vince; Friesen, Brian; Lukic, Zarija; Zhang, Weiqun

    2017-12-01

    The Nyx code solves the equations of compressible hydrodynamics on an adaptive grid hierarchy coupled with an N-body treatment of dark matter. The gas dynamics in Nyx use a finite-volume methodology on an adaptive set of 3-D Eulerian grids; dark matter is represented as discrete particles moving under the influence of gravity. Particles are evolved via a particle-mesh method, using a Cloud-in-Cell deposition/interpolation scheme. Both baryonic and dark matter contribute to the gravitational field. In addition, Nyx includes physics for accurately modeling the intergalactic medium; in the optically thin limit and assuming ionization equilibrium, the code calculates heating and cooling processes of the primordial-composition gas in an ionizing ultraviolet background radiation field.
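    The Cloud-in-Cell deposition mentioned above spreads each particle's mass linearly over its nearest grid cells. Nyx itself operates on 3-D AMR grids; the 1-D numpy toy below only illustrates the weighting, with a periodic box and unit total mass as assumptions:

    ```python
    import numpy as np

    def cic_deposit(positions, masses, n_cells, box_size):
        """Deposit particle masses onto a periodic 1-D grid with Cloud-in-Cell
        (linear) weighting: each particle contributes to its two nearest cells
        in proportion to its distance from their centres."""
        dx = box_size / n_cells
        density = np.zeros(n_cells)
        # Fractional cell coordinate, offset so a particle at a cell centre
        # lands entirely in that cell.
        x = positions / dx - 0.5
        left = np.floor(x).astype(int)
        frac = x - left                          # distance past left centre
        np.add.at(density, left % n_cells, masses * (1.0 - frac))
        np.add.at(density, (left + 1) % n_cells, masses * frac)
        return density / dx                      # mass per unit length

    rng = np.random.default_rng(0)
    pos = rng.uniform(0.0, 1.0, 1000)
    rho = cic_deposit(pos, np.full(1000, 1.0 / 1000), n_cells=16, box_size=1.0)
    print(rho.sum() / 16)   # ≈ 1.0: CiC weights sum to one, so mass is conserved
    ```

    `np.add.at` is used instead of fancy-indexed `+=` so that multiple particles landing in the same cell all accumulate.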

  13. Adaptation and perceptual norms

    NASA Astrophysics Data System (ADS)

    Webster, Michael A.; Yasuda, Maiko; Haber, Sara; Leonard, Deanne; Ballardini, Nicole

    2007-02-01

    We used adaptation to examine the relationship between perceptual norms--the stimuli observers describe as psychologically neutral, and response norms--the stimulus levels that leave visual sensitivity in a neutral or balanced state. Adapting to stimuli on opposite sides of a neutral point (e.g. redder or greener than white) biases appearance in opposite ways. Thus the adapting stimulus can be titrated to find the unique adapting level that does not bias appearance. We compared these response norms to subjectively defined neutral points both within the same observer (at different retinal eccentricities) and between observers. These comparisons were made for visual judgments of color, image focus, and human faces, stimuli that are very different and may depend on very different levels of processing, yet which share the property that for each there is a well defined and perceptually salient norm. In each case the adaptation aftereffects were consistent with an underlying sensitivity basis for the perceptual norm. Specifically, response norms were similar to and thus covaried with the perceptual norm, and under common adaptation differences between subjectively defined norms were reduced. These results are consistent with models of norm-based codes and suggest that these codes underlie an important link between visual coding and visual experience.

  14. Students' explanations in complex learning of disciplinary programming

    NASA Astrophysics Data System (ADS)

    Vieira, Camilo

    Computational Science and Engineering (CSE) has been described as the third pillar of science and as a set of important skills for solving the problems of a global society. Along with the theoretical and the experimental approaches, computation offers a third alternative to solve complex problems that require processing large amounts of data, or representing complex phenomena that are not easy to experiment with. Despite the relevance of CSE, current professionals and scientists are not well prepared to take advantage of this set of tools and methods. Computation is usually taught in isolation from engineering disciplines, and therefore engineers do not know how to exploit CSE affordances. This dissertation introduces computational tools and methods contextualized within the Materials Science and Engineering curriculum. Considering that learning how to program is a complex task, the dissertation explores effective pedagogical practices that can support student disciplinary and computational learning. Two case studies are evaluated to identify the characteristics of effective worked examples in the context of CSE. Specifically, this dissertation explores students' explanations of these worked examples in two engineering courses with different levels of transparency: a programming course in materials science and engineering (glass box) and a thermodynamics course involving computational representations (black box). Results from this study suggest that students benefit in different ways from writing in-code comments. These benefits include, but are not limited to: connecting individual lines of code to the overall problem, getting familiar with the syntax, learning effective algorithm design strategies, and connecting computation with their discipline. Students in the glass-box context generate higher-quality explanations than students in the black-box context. These explanations are related to students' prior experiences. Specifically, students with low programming ability engage in a more thorough explanation process than students with high ability. This dissertation concludes by proposing an adaptation of the instructional principles of worked examples for the context of CSE education.

  15. Dynamic Distribution and Layouting of Model-Based User Interfaces in Smart Environments

    NASA Astrophysics Data System (ADS)

    Roscher, Dirk; Lehmann, Grzegorz; Schwartze, Veit; Blumendorf, Marco; Albayrak, Sahin

    Developments in computer technology over the last decade are changing how computers are used. Emerging smart environments make it possible to build ubiquitous applications that assist users during their everyday life, at any time, in any context. But the variety of contexts-of-use (user, platform and environment) makes the development of such ubiquitous applications for smart environments, and especially of their user interfaces, a challenging and time-consuming task. We propose a model-based approach which allows adapting the user interface at runtime to numerous (also unknown) contexts-of-use. Based on a user interface modelling language defining the fundamentals and constraints of the user interface, a runtime architecture exploits this description to adapt the user interface to the current context-of-use. The architecture provides automatic distribution and layout algorithms for adapting applications even to contexts unforeseen at design time. Designers do not specify predefined adaptations for each specific situation, but adaptation constraints and guidelines. Furthermore, users are provided with a meta user interface to influence the adaptations according to their needs. A smart home energy management system serves as a running example to illustrate the approach.

  16. 43 CFR 11.64 - Injury determination phase-testing and sampling methods.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    .... In developing these objectives, the availability of information from response actions relating to the...), test cases proving the code works, and any alteration of previously documented code made to adapt the... computer code (if any), test cases proving the code works, and any alteration of previously documented code...

  17. 43 CFR 11.64 - Injury determination phase-testing and sampling methods.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    .... In developing these objectives, the availability of information from response actions relating to the...), test cases proving the code works, and any alteration of previously documented code made to adapt the... computer code (if any), test cases proving the code works, and any alteration of previously documented code...

  18. 43 CFR 11.64 - Injury determination phase-testing and sampling methods.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    .... In developing these objectives, the availability of information from response actions relating to the...), test cases proving the code works, and any alteration of previously documented code made to adapt the... computer code (if any), test cases proving the code works, and any alteration of previously documented code...

  19. A fragmented code: The moral and structural context for providing assistance with injection drug use initiation in San Diego, USA.

    PubMed

    Guise, Andy; Melo, Jason; Mittal, Maria Luisa; Rafful, Claudia; Cuevas-Mota, Jazmine; Davidson, Peter; Garfein, Richard S; Werb, Dan

    2018-05-01

    Injection drug use initiation is shaped by social networks and structural contexts, with people who inject drugs often assisting in this process. We sought to explore the norms and contexts linked to assisting others to initiate injection drug use in San Diego, USA, to inform the development of structural interventions to prevent this phenomenon. We undertook qualitative interviews with a purposive sample of people who inject drugs and had reported assisting others to initiate injection (n = 17) and a sub-sample of people who inject drugs (n = 4) who had not reported initiating others to triangulate accounts. We analyzed data thematically and abductively. Respondents' accounts of providing initiation assistance were consistent with themes and motives reported in other contexts: of seeking to reduce harm to the 'initiate', responding to requests for help, fostering pleasure, accessing resources, and claims that initiation assistance was unintentional. We developed analysis of these themes to explore initiation assistance as governed by a 'moral code'. We delineate a fragmented moral code which includes a range of meanings and social contexts that shape initiation assistance. We also show how assistance is happening within a structural context that limits discussion of injection drug use, reflecting a prevailing silence on drug use linked to stigma and criminalization. In San Diego, the assistance of others to initiate injection drug use is governed by a fragmented moral code situated within particular social norms and contexts. Interventions that address the social and structural conditions shaped by and shaping this code may be beneficial, in tandem with efforts to support safe injection and the reduction of injection-related harms. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. Adaptive bit plane quadtree-based block truncation coding for image compression

    NASA Astrophysics Data System (ADS)

    Li, Shenda; Wang, Jin; Zhu, Qing

    2018-04-01

    Block truncation coding (BTC) is a fast image compression technique applied in the spatial domain. Traditional BTC and its variants mainly focus on reducing computational complexity for low bit rate compression, at the cost of lower decoded-image quality, especially for images with rich texture. To solve this problem, this paper proposes a quadtree-based block truncation coding algorithm combined with adaptive bit plane transmission. First, the direction of the edge in each block is detected using the Sobel operator. For blocks of minimal size, an adaptive bit plane is used to optimize the BTC, depending on the MSE loss when the block is encoded by absolute moment block truncation coding (AMBTC). Extensive experimental results show that our method gains 0.85 dB PSNR on average compared to other state-of-the-art BTC variants, making it desirable for real-time image compression applications.
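
    The AMBTC step the method builds on can be sketched for a single block (a minimal illustration of standard AMBTC, not the paper's quadtree variant; function names are ours):

    ```python
    def ambtc_encode(pixels):
        """Absolute Moment BTC for one block: keep the block mean, the first
        absolute central moment, and a one-bit plane marking each pixel as
        above or below the mean."""
        k = len(pixels)
        m = sum(pixels) / k
        alpha = sum(abs(p - m) for p in pixels) / k   # first absolute moment
        bitplane = [p >= m for p in pixels]
        q = sum(bitplane)                             # number of "high" pixels
        if q in (0, k):                               # flat block: one level
            return bitplane, m, m
        low = m - k * alpha / (2 * (k - q))           # reconstruction levels
        high = m + k * alpha / (2 * q)                # preserving mean/moment
        return bitplane, low, high

    def ambtc_decode(bitplane, low, high):
        return [high if b else low for b in bitplane]
    ```

    Each block thus costs two reconstruction levels plus one bit per pixel, which is what makes BTC attractive for real-time use.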

  1. Evaluation of deflectometry for E-ELT optics.

    NASA Astrophysics Data System (ADS)

    Sironi, G.; Canestrari, R.; Civitani, M. M.

    A deflectometric facility was developed at the Italian National Institute for Astrophysics-OAB in the context of the ASTRI project to characterize free-form segments for Cherenkov optics. The test works as an inverse Ronchi test in combination with a ray-tracing code: the surface under test is illuminated by a known light pattern, and the pattern warped by local surface errors is observed. Knowing the geometry of the system, it is possible to retrieve the surface normal vectors. This contribution presents the analysis of the upgrades and configuration modifications required to allow the use of deflectometry in the realization of optical components suitable for the European Extremely Large Telescope and, as a specific case, to support the manufacturing of the Multi-conjugate Adaptive Optics Relay (MAORY) module.

  2. Ramanujan sums for signal processing of low-frequency noise.

    PubMed

    Planat, Michel; Rosu, Haret; Perrine, Serge

    2002-11-01

    An aperiodic (low-frequency) spectrum may originate from the error term in the mean value of an arithmetical function, such as the Möbius function or the Mangoldt function, which are coding sequences for prime numbers. In the discrete Fourier transform the analyzing wave is periodic and not well suited to represent the low-frequency regime. In its place we introduce a different signal processing tool based on the Ramanujan sums c(q)(n), well adapted to the analysis of arithmetical sequences with many resonances p/q. The sums are quasiperiodic versus the time n and aperiodic versus the order q of the resonance. Different results arise from the use of this Ramanujan-Fourier transform in the context of arithmetical and experimental signals.
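
    The Ramanujan sums c(q)(n) can be computed directly from their definition as a sum of roots of unity over residues coprime to q (a brute-force sketch; a production implementation would use the closed form in terms of the Möbius and totient functions):

    ```python
    from math import gcd, cos, pi

    def ramanujan_sum(q, n):
        """c_q(n): sum of exp(2*pi*i*k*n/q) over 1 <= k <= q with
        gcd(k, q) = 1. The imaginary parts cancel in conjugate pairs,
        so the real cosine sum suffices; the result is an integer."""
        return round(sum(cos(2 * pi * k * n / q)
                         for k in range(1, q + 1) if gcd(k, q) == 1))
    ```

    Sanity checks that follow from known identities: c_q(1) equals the Möbius function mu(q), and c_q(q) equals Euler's totient phi(q).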

  3. Ramanujan sums for signal processing of low-frequency noise

    NASA Astrophysics Data System (ADS)

    Planat, Michel; Rosu, Haret; Perrine, Serge

    2002-11-01

    An aperiodic (low-frequency) spectrum may originate from the error term in the mean value of an arithmetical function, such as the Möbius function or the Mangoldt function, which are coding sequences for prime numbers. In the discrete Fourier transform the analyzing wave is periodic and not well suited to represent the low-frequency regime. In its place we introduce a different signal processing tool based on the Ramanujan sums cq(n), well adapted to the analysis of arithmetical sequences with many resonances p/q. The sums are quasiperiodic versus the time n and aperiodic versus the order q of the resonance. Different results arise from the use of this Ramanujan-Fourier transform in the context of arithmetical and experimental signals.

  4. Current status of antisense RNA-mediated gene regulation in Listeria monocytogenes.

    PubMed

    Schultze, Tilman; Izar, Benjamin; Qing, Xiaoxing; Mannala, Gopala K; Hain, Torsten

    2014-01-01

    Listeria monocytogenes is a Gram-positive human-pathogenic bacterium that has served as an experimental model for investigating fundamental processes of adaptive immunity and virulence. Recent novel technologies have allowed the identification of several hundred non-coding RNAs (ncRNAs) in the Listeria genome and provided insight into an unexpectedly complex transcriptional machinery. In this review, we discuss ncRNAs that are encoded on the opposite strand of the target gene and are therefore termed antisense RNAs (asRNAs). We highlight mechanistic and functional concepts of asRNAs in L. monocytogenes and put these in the context of asRNAs in other bacteria. Understanding asRNAs will further broaden our knowledge of RNA-mediated gene regulation and may provide targets for diagnostic and antimicrobial development.

  5. Urn models for response-adaptive randomized designs: a simulation study based on a non-adaptive randomized trial.

    PubMed

    Ghiglietti, Andrea; Scarale, Maria Giovanna; Miceli, Rosalba; Ieva, Francesca; Mariani, Luigi; Gavazzi, Cecilia; Paganoni, Anna Maria; Edefonti, Valeria

    2018-03-22

    Recently, response-adaptive designs have been proposed for randomized clinical trials to achieve ethical and/or cost advantages by using sequential accrual information collected during the trial to dynamically update the probabilities of treatment assignments. In this context, urn models, in which the probability of assigning patients to treatments is interpreted as the proportion of balls of different colors available in a virtual urn, have been used as response-adaptive randomization rules. We propose the use of Randomly Reinforced Urn (RRU) models in a simulation study based on a published randomized clinical trial on the efficacy of home enteral nutrition in cancer patients after major gastrointestinal surgery. We compare results from the RRU design with those previously published with the non-adaptive approach. We also provide code written in R to implement the RRU design in practice. In detail, we simulate 10,000 trials based on the RRU model in three set-ups with different total sample sizes. We report information on the number of patients allocated to the inferior treatment and on the empirical power of the t-test for the treatment coefficient in the ANOVA model. We carry out a sensitivity analysis to assess the effect of different urn compositions. For each sample size, in approximately 75% of the simulation runs, the number of patients allocated to the inferior treatment by the RRU design is lower, as compared to the non-adaptive design. The empirical power of the t-test for the treatment effect is similar in the two designs.
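
    The paper supplies R code; as a language-consistent illustration, the core two-arm RRU allocation rule can also be sketched in Python under assumed Bernoulli responses and a fixed unit reinforcement (all names and parameter values here are ours, not the paper's):

    ```python
    import random

    def rru_trial(n_patients, p_success, initial=(1.0, 1.0),
                  reinforcement=1.0, rng=None):
        """Simulate one two-arm trial with a Randomly Reinforced Urn:
        draw an arm with probability proportional to its ball mass, then
        reinforce the drawn colour when the Bernoulli response succeeds."""
        rng = rng or random.Random()
        urn = list(initial)                 # ball "mass" per treatment arm
        allocations = [0, 0]
        for _ in range(n_patients):
            arm = 0 if rng.random() < urn[0] / (urn[0] + urn[1]) else 1
            allocations[arm] += 1
            if rng.random() < p_success[arm]:
                urn[arm] += reinforcement   # reinforce only the drawn colour
        return allocations
    ```

    Because the superior arm is reinforced more often, the allocation proportion drifts toward it over the trial, which is the ethical advantage the abstract refers to.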

  6. Data Analysis Approaches for the Risk-Informed Safety Margins Characterization Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, Diego; Alfonsi, Andrea; Maljovec, Daniel P.

    2016-09-01

    In the past decades, several numerical simulation codes have been employed to simulate accident dynamics (e.g., RELAP5-3D, RELAP-7, MELCOR, MAAP). In order to evaluate the impact of uncertainties on accident dynamics, several stochastic methodologies have been coupled with these codes. These stochastic methods range from classical Monte-Carlo and Latin Hypercube sampling to stochastic polynomial methods. Similar approaches have been introduced into the risk and safety community, where stochastic methods (such as RAVEN, ADAPT, MCDET, ADS) have been coupled with safety analysis codes in order to evaluate the safety impact of timing and sequencing of events. These approaches are usually called Dynamic PRA or simulation-based PRA methods. These uncertainty and safety methods usually generate a large number of simulation runs (database storage may be on the order of gigabytes or higher). The scope of this paper is to present a broad overview of methods and algorithms that can be used to analyze and extract information from large data sets containing time dependent data. In this context, "extracting information" means constructing input-output correlations, finding commonalities, and identifying outliers. Some of the algorithms presented here have been developed or are under development within the RAVEN statistical framework.

  7. Reviewing the Challenges and Opportunities Presented by Code Switching and Mixing in Bangla

    ERIC Educational Resources Information Center

    Hasan, Md. Kamrul; Akhand, Mohd. Moniruzzaman

    2015-01-01

    This paper investigates the issues related to code-switching/code-mixing in an ESL context. Some preliminary data on Bangla-English code-switching/code-mixing has been analyzed in order to determine which structural pattern of code-switching/code-mixing is predominant in different social strata. This study also explores the relationship of…

  8. Reviewing the Challenges and Opportunities Presented by Code Switching and Mixing in Bangla

    ERIC Educational Resources Information Center

    Hasan, Md. Kamrul; Akhand, Mohd. Moniruzzaman

    2014-01-01

    This paper investigates the issues related to code-switching/code-mixing in an ESL context. Some preliminary data on Bangla-English code-switching/code-mixing has been analyzed in order to determine which structural pattern of code-switching/code-mixing is predominant in different social strata. This study also explores the relationship of…

  9. A generic efficient adaptive grid scheme for rocket propulsion modeling

    NASA Technical Reports Server (NTRS)

    Mo, J. D.; Chow, Alan S.

    1993-01-01

    The objective of this research is to develop an efficient, time-accurate numerical algorithm to discretize the Navier-Stokes equations for the prediction of internal one- and two-dimensional and axisymmetric flows. A generic, efficient, elliptic adaptive grid generator is implicitly coupled with the Lower-Upper factorization scheme in the development of the ALUNS computer code. Calculations of one-dimensional shock tube wave propagation and of two-dimensional shock wave capture, wave-wave interactions, and shock wave-boundary interactions show that the developed scheme is stable, accurate and extremely robust. The adaptive grid generator produced a very favorable grid network by a grid speed technique. This generic adaptive grid generator is also applied in the PARC and FDNS codes, and the computational results for solid rocket nozzle flowfield and crystal growth modeling by those codes will also be presented at the conference. This research work is being supported by NASA/MSFC.

  10. Counter-propagation network with variable degree variable step size LMS for single switch typing recognition.

    PubMed

    Yang, Cheng-Huei; Luo, Ching-Hsing; Yang, Cheng-Hong; Chuang, Li-Yeh

    2004-01-01

    Morse code is now being harnessed for use in rehabilitation applications of augmentative-alternative communication and assistive technology, including mobility, environmental control and adapted worksite access. In this paper, Morse code is selected as a communication adaptive device for disabled persons who suffer from muscle atrophy, cerebral palsy or other severe handicaps. A stable typing rate is strictly required for Morse code to be effective as a communication tool. This restriction is a major hindrance. Therefore, a switch adaptive automatic recognition method with a high recognition rate is needed. The proposed system combines counter-propagation networks with a variable degree variable step size LMS algorithm. It is divided into five stages: space recognition, tone recognition, learning process, adaptive processing, and character recognition. Statistical analyses demonstrated that the proposed method elicited a better recognition rate in comparison to alternative methods in the literature.

  11. Parallel Adaptive Mesh Refinement Library

    NASA Technical Reports Server (NTRS)

    Mac-Neice, Peter; Olson, Kevin

    2005-01-01

    Parallel Adaptive Mesh Refinement Library (PARAMESH) is a package of Fortran 90 subroutines designed to provide a computer programmer with an easy route to extension of (1) a previously written serial code that uses a logically Cartesian structured mesh into (2) a parallel code with adaptive mesh refinement (AMR). Alternatively, in its simplest use, and with minimal effort, PARAMESH can operate as a domain-decomposition tool for users who want to parallelize their serial codes but who do not wish to utilize adaptivity. The package builds a hierarchy of sub-grids to cover the computational domain of a given application program, with spatial resolution varying to satisfy the demands of the application. The sub-grid blocks form the nodes of a tree data structure (a quad-tree in two or an oct-tree in three dimensions). Each grid block has a logically Cartesian mesh. The package supports one-, two- and three-dimensional models.
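
    The block-tree structure PARAMESH builds can be illustrated with a toy 2-D refinement recursion (a schematic sketch, not PARAMESH's actual Fortran 90 interface; the refinement criterion and names are invented for illustration):

    ```python
    def refine(x, y, size, level, needs_refinement, max_level, leaves):
        """Recursively build a quadtree of grid blocks: a block that the
        criterion flags is split into four children; otherwise it becomes
        a leaf block carrying its own logically Cartesian mesh."""
        if level < max_level and needs_refinement(x, y, size):
            half = size / 2
            for dx in (0.0, half):
                for dy in (0.0, half):
                    refine(x + dx, y + dy, half, level + 1,
                           needs_refinement, max_level, leaves)
        else:
            leaves.append((x, y, size, level))

    def touches(x, y, s):
        # Toy criterion: refine any block touching the point (0.5, 0.5).
        return x <= 0.5 <= x + s and y <= 0.5 <= y + s

    leaves = []
    refine(0.0, 0.0, 1.0, 0, touches, 3, leaves)
    ```

    The leaf blocks tile the unit square with resolution concentrated near the feature of interest, mirroring how the library varies spatial resolution to satisfy the application's demands.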

  12. An object-oriented approach for parallel self adaptive mesh refinement on block structured grids

    NASA Technical Reports Server (NTRS)

    Lemke, Max; Witsch, Kristian; Quinlan, Daniel

    1993-01-01

    Self-adaptive mesh refinement dynamically matches the computational demands of a solver for partial differential equations to the activity in the application's domain. In this paper we present two C++ class libraries, P++ and AMR++, which significantly simplify the development of sophisticated adaptive mesh refinement codes on (massively) parallel distributed memory architectures. The development is based on our previous research in this area. The C++ class libraries provide abstractions to separate the issues of developing parallel adaptive mesh refinement applications into those of parallelism, abstracted by P++, and adaptive mesh refinement, abstracted by AMR++. P++ is a parallel array class library to permit efficient development of architecture independent codes for structured grid applications, and AMR++ provides support for self-adaptive mesh refinement on block-structured grids of rectangular non-overlapping blocks. Using these libraries, the application programmers' work is greatly simplified to primarily specifying the serial single grid application and obtaining the parallel and self-adaptive mesh refinement code with minimal effort. Initial results for simple singular perturbation problems solved by self-adaptive multilevel techniques (FAC, AFAC), being implemented on the basis of prototypes of the P++/AMR++ environment, are presented. Singular perturbation problems frequently arise in large applications, e.g. in the area of computational fluid dynamics. They usually have solutions with layers which require adaptive mesh refinement and fast basic solvers in order to be resolved efficiently.

  13. Structured Set Intra Prediction With Discriminative Learning in a Max-Margin Markov Network for High Efficiency Video Coding

    PubMed Central

    Dai, Wenrui; Xiong, Hongkai; Jiang, Xiaoqian; Chen, Chang Wen

    2014-01-01

    This paper proposes a novel model on intra coding for High Efficiency Video Coding (HEVC), which simultaneously predicts blocks of pixels with optimal rate distortion. It utilizes the spatial statistical correlation for the optimal prediction based on 2-D contexts, in addition to formulating the data-driven structural interdependences to make the prediction error coherent with the probability distribution, which is desirable for successful transform and coding. The structured set prediction model incorporates a max-margin Markov network (M3N) to regulate and optimize multiple block predictions. The model parameters are learned by discriminating the actual pixel value from other possible estimates to maximize the margin (i.e., decision boundary bandwidth). Compared to existing methods that focus on minimizing prediction error, the M3N-based model adaptively maintains the coherence for a set of predictions. Specifically, the proposed model concurrently optimizes a set of predictions by associating the loss for individual blocks to the joint distribution of succeeding discrete cosine transform coefficients. When the sample size grows, the prediction error is asymptotically upper bounded by the training error under the decomposable loss function. As an internal step, we optimize the underlying Markov network structure to find states that achieve the maximal energy using expectation propagation. For validation, we integrate the proposed model into HEVC for optimal mode selection on rate-distortion optimization. The proposed prediction model obtains up to 2.85% bit rate reduction and achieves better visual quality in comparison to the HEVC intra coding. PMID:25505829

  14. Multigrid solution of internal flows using unstructured solution adaptive meshes

    NASA Technical Reports Server (NTRS)

    Smith, Wayne A.; Blake, Kenneth R.

    1992-01-01

    This is the final report of the NASA Lewis SBIR Phase 2 Contract Number NAS3-25785, Multigrid Solution of Internal Flows Using Unstructured Solution Adaptive Meshes. The objective of this project, as described in the Statement of Work, is to develop and deliver to NASA a general three-dimensional Navier-Stokes code using unstructured solution-adaptive meshes for accuracy and multigrid techniques for convergence acceleration. The code will primarily be applied, but not necessarily limited, to high speed internal flows in turbomachinery.

  15. Image coding of SAR imagery

    NASA Technical Reports Server (NTRS)

    Chang, C. Y.; Kwok, R.; Curlander, J. C.

    1987-01-01

    Five coding techniques in the spatial and transform domains have been evaluated for SAR image compression: linear three-point predictor (LTPP), block truncation coding (BTC), microadaptive picture sequencing (MAPS), adaptive discrete cosine transform (ADCT), and adaptive Hadamard transform (AHT). These techniques have been tested with Seasat data. Both LTPP and BTC spatial domain coding techniques provide very good performance at rates of 1-2 bits/pixel. The two transform techniques, ADCT and AHT, demonstrate the capability to compress the SAR imagery to less than 0.5 bits/pixel without visible artifacts. Tradeoffs such as the rate distortion performance, the computational complexity, the algorithm flexibility, and the controllability of compression ratios are also discussed.

  16. Adapting HYDRUS-1D to simulate overland flow and reactive transport during sheet flow deviations

    USDA-ARS?s Scientific Manuscript database

    The HYDRUS-1D code is a popular numerical model for solving the Richards equation for variably-saturated water flow and solute transport in porous media. This code was adapted to solve, rather than the Richards equation for subsurface flow, the diffusion wave equation for overland flow at the soil sur...

  17. Layer-based buffer aware rate adaptation design for SHVC video streaming

    NASA Astrophysics Data System (ADS)

    Gudumasu, Srinivas; Hamza, Ahmed; Asbun, Eduardo; He, Yong; Ye, Yan

    2016-09-01

    This paper proposes a layer-based buffer-aware rate adaptation design that avoids abrupt video quality fluctuation, reduces re-buffering latency, and improves bandwidth utilization compared to a conventional simulcast-based adaptive streaming system. The proposed adaptation design schedules DASH segment requests based on the estimated bandwidth, dependencies among video layers, and layer buffer fullness. Scalable HEVC video coding is the latest state-of-the-art video coding technique and can alleviate various issues caused by simulcast-based adaptive video streaming. With scalable coded video streams, the video is encoded once into a number of layers representing different qualities and/or resolutions: a base layer (BL) and one or more enhancement layers (EL), each incrementally enhancing the quality of the lower layers. Such a layer-based coding structure allows fine-granularity rate adaptation for video streaming applications. Two video streaming use cases are presented in this paper. The first use case is to stream HD SHVC video over a wireless network where available bandwidth varies, and a performance comparison between the proposed layer-based streaming approach and the conventional simulcast streaming approach is provided. The second use case is to stream 4K/UHD SHVC video over a hybrid access network that consists of a 5G millimeter wave high-speed wireless link and a conventional wired or WiFi network. The simulation results verify that the proposed layer-based rate adaptation approach utilizes the bandwidth more efficiently. As a result, a more consistent viewing experience with higher quality video content and minimal video quality fluctuations can be presented to the user.
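
    The scheduling idea, requesting enhancement layers only when the bandwidth estimate and the lower layers' buffers permit, can be sketched as follows (a simplified illustration of the general approach, not the paper's algorithm; the threshold and names are ours):

    ```python
    def select_layers(layer_bitrates, est_bandwidth, buffer_levels,
                      min_buffer=4.0):
        """Decide how many scalable layers to request for the next segment:
        take the base layer unconditionally, then add enhancement layers
        while their cumulative bitrate fits the bandwidth estimate and every
        lower layer already holds at least min_buffer seconds of media."""
        n_layers = 1                      # base layer is always requested
        cumulative = layer_bitrates[0]
        for i in range(1, len(layer_bitrates)):
            cumulative += layer_bitrates[i]
            if cumulative <= est_bandwidth and all(
                    b >= min_buffer for b in buffer_levels[:i]):
                n_layers = i + 1          # layer i depends on all lower layers
            else:
                break                     # layers above i are unusable anyway
        return n_layers
    ```

    Dropping or adding one enhancement layer at a time is what yields the fine-granularity, low-fluctuation adaptation the abstract describes.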

  18. Identification and Classification of Orthogonal Frequency Division Multiple Access (OFDMA) Signals Used in Next Generation Wireless Systems

    DTIC Science & Technology

    2012-03-01

    advanced antenna systems AMC adaptive modulation and coding AWGN additive white Gaussian noise BPSK binary phase shift keying BS base station BTC ...QAM-16, and QAM-64, and coding types include convolutional coding (CC), convolutional turbo coding (CTC), block turbo coding (BTC), zero-terminating

  19. Development and application of the dynamic system doctor to nuclear reactor probabilistic risk assessments.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kunsman, David Marvin; Aldemir, Tunc; Rutt, Benjamin

    2008-05-01

    This LDRD project has produced a tool that makes probabilistic risk assessments (PRAs) of nuclear reactors, analyses which are very resource-intensive, more efficient. PRAs of nuclear reactors are being increasingly relied on by the United States Nuclear Regulatory Commission (U.S.N.R.C.) for licensing decisions for current and advanced reactors. Yet, PRAs are produced much as they were 20 years ago. The work here applied a modern systems analysis technique to the accident progression analysis portion of the PRA; the technique was a system-independent multi-task computer driver routine. Initially, the objective of the work was to fuse the accident progression event tree (APET) portion of a PRA to the dynamic system doctor (DSD) created by Ohio State University. Instead, during the initial efforts, it was found that the DSD could be linked directly to a detailed accident progression phenomenological simulation code, the type on which APET construction and analysis relies, albeit indirectly, and thereby directly create and analyze the APET. The expanded DSD computational architecture and infrastructure that was created during this effort is called ADAPT (Analysis of Dynamic Accident Progression Trees). ADAPT is a system software infrastructure that supports execution and analysis of multiple dynamic event-tree simulations on distributed environments. A simulator abstraction layer was developed, and a generic driver was implemented for executing simulators on a distributed environment. As a demonstration of the use of the methodological tool, ADAPT was applied to quantify the likelihood of competing accident progression pathways occurring for a particular accident scenario in a particular reactor type using MELCOR, an integrated severe accident analysis code developed at Sandia. (ADAPT was intentionally created with flexibility, however, and is not limited to interacting with only one code. With minor coding changes to input files, ADAPT can be linked to other such codes.) The results of this demonstration indicate that the approach can significantly reduce the resources required for Level 2 PRAs. From the phenomenological viewpoint, ADAPT can also treat the associated epistemic and aleatory uncertainties. This methodology can also be used for analyses of other complex systems. Any complex system can be analyzed using ADAPT if the workings of that system can be displayed as an event tree, there is a computer code that simulates how those events could progress, and that simulator code has switches to turn on and off system events, phenomena, etc. Using and applying ADAPT to particular problems is not human independent. While the human resources for the creation and analysis of the accident progression are significantly decreased, knowledgeable analysts are still necessary for a given project to apply ADAPT successfully. This research and development effort has met its original goals and then exceeded them.

  20. Data compression using adaptive transform coding. Appendix 1: Item 1. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Rost, Martin Christopher

    1988-01-01

    Adaptive low-rate source coders are described in this dissertation. These coders adapt by adjusting the complexity of the coder to match the local coding difficulty of the image. This is accomplished by using a threshold-driven maximum-distortion criterion to select the specific coder used. The different coders are built using variable-blocksize transform techniques, and the threshold criterion selects small transform blocks to code the more difficult regions and larger blocks to code the less complex regions. A theoretical framework is constructed from which the study of these coders can be explored. An algorithm for selecting the optimal bit allocation for the quantization of transform coefficients is developed. The bit allocation algorithm is developed more fully than those currently used in the literature and can be used to achieve more accurate bit assignments. Some upper and lower bounds for the bit-allocation distortion-rate function are developed. An obtainable distortion-rate function is developed for a particular scalar quantizer mixing method that can be used to code transform coefficients at any rate.
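
    A standard way to compute such an allocation is greedy marginal analysis under the high-resolution distortion model D_i = var_i * 2**(-2*b_i); the sketch below illustrates that generic technique, not the dissertation's specific algorithm (all names are ours):

    ```python
    import heapq

    def greedy_bit_allocation(variances, total_bits):
        """Marginal-analysis bit allocation: repeatedly award one bit to the
        coefficient whose modelled distortion var * 2**(-2*bits) would drop
        the most (one extra bit quarters that distortion)."""
        bits = [0] * len(variances)

        def gain(i):
            d = variances[i] * 2.0 ** (-2 * bits[i])
            return d - d / 4.0

        heap = [(-gain(i), i) for i in range(len(variances))]
        heapq.heapify(heap)
        for _ in range(total_bits):
            _, i = heapq.heappop(heap)   # current best candidate
            bits[i] += 1
            heapq.heappush(heap, (-gain(i), i))   # re-insert with new gain
        return bits
    ```

    High-variance coefficients receive more bits, which is the qualitative behavior any optimal transform-coefficient allocation must exhibit.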

  1. CORALINA: a universal method for the generation of gRNA libraries for CRISPR-based screening.

    PubMed

    Köferle, Anna; Worf, Karolina; Breunig, Christopher; Baumann, Valentin; Herrero, Javier; Wiesbeck, Maximilian; Hutter, Lukas H; Götz, Magdalena; Fuchs, Christiane; Beck, Stephan; Stricker, Stefan H

    2016-11-14

    The bacterial CRISPR system is fast becoming the most popular genetic and epigenetic engineering tool due to its universal applicability and adaptability. The desire to deploy CRISPR-based methods in a large variety of species and contexts has created an urgent need for the development of easy, time- and cost-effective methods enabling large-scale screening approaches. Here we describe CORALINA (comprehensive gRNA library generation through controlled nuclease activity), a method for the generation of comprehensive gRNA libraries for CRISPR-based screens. CORALINA gRNA libraries can be derived from any source of DNA without the need of complex oligonucleotide synthesis. We show the utility of CORALINA for human and mouse genomic DNA, its reproducibility in covering the most relevant genomic features including regulatory, coding and non-coding sequences and confirm the functionality of CORALINA generated gRNAs. The simplicity and cost-effectiveness make CORALINA suitable for any experimental system. The unprecedented sequence complexities obtainable with CORALINA libraries are a necessary pre-requisite for less biased large scale genomic and epigenomic screens.

  2. A Context-Adaptive Teacher Training Model in a Ubiquitous Learning Environment

    ERIC Educational Resources Information Center

    Chen, Min; Chiang, Feng Kuang; Jiang, Ya Na; Yu, Sheng Quan

    2017-01-01

    In view of the discrepancies in teacher training and teaching practice, this paper put forward a context-adaptive teacher training model in a ubiquitous learning (u-learning) environment. The innovative model provides teachers of different subjects with adaptive and personalized learning content in a u-learning environment, implements intra- and…

  3. [Coping strategies in adaptation of higher education students].

    PubMed

    das Neves Mira Freitas, Helena Cristina

    2007-01-01

    The adjustment to higher education can be understood as a multidimensional process, which requires the student to develop adaptive skills for a context that is itself new and dynamic. To meet these challenges, students have to develop effective coping strategies that enable them to adapt to the context. The school plays a key role in the help it can give these young people so that they adapt effectively.

  4. Visual cues that are effective for contextual saccade adaptation

    PubMed Central

    Azadi, Reza

    2014-01-01

    The accuracy of saccades, as maintained by saccade adaptation, has been shown to be context dependent: movements of different amplitude can be made to the same retinal displacement depending on motor contexts such as orbital starting location. There is conflicting evidence as to whether purely visual cues also affect contextual saccade adaptation and, if so, what function this might serve. We tested which visual cues might evoke contextual adaptation. Over 5 experiments, 78 naive subjects made saccades to circularly moving targets, which stepped outward or inward during the saccade depending on target movement direction, speed, or color and shape. To test whether the movement or context postsaccade was critical, we stopped the postsaccade target motion (experiment 4) or neutralized the contexts by equating postsaccade target speed to an intermediate value (experiment 5). We found contextual adaptation in all conditions except those defined by color and shape. We conclude that some, but not all, visual cues before the saccade are sufficient for contextual adaptation. We conjecture that this visual contextuality functions to allow for different motor states for different coordinated movement patterns, such as coordinated saccade and pursuit motor planning. PMID:24647429

  5. Automatic Adaptation to Fast Input Changes in a Time-Invariant Neural Circuit

    PubMed Central

    Bharioke, Arjun; Chklovskii, Dmitri B.

    2015-01-01

    Neurons must faithfully encode signals that can vary over many orders of magnitude despite having only limited dynamic ranges. For a correlated signal, this dynamic range constraint can be relieved by subtracting away components of the signal that can be predicted from the past, a strategy known as predictive coding, which relies on learning the input statistics. However, the statistics of natural input signals can also vary over very short time scales, e.g., following saccades across a visual scene. To maintain a reduced transmission cost for signals with rapidly varying statistics, neuronal circuits implementing predictive coding must also rapidly adapt their properties. Experimentally, in different sensory modalities, sensory neurons have shown such adaptations within 100 ms of an input change. Here, we show first that linear neurons connected in a feedback inhibitory circuit can implement predictive coding. We then show that adding a rectification nonlinearity to such a feedback inhibitory circuit allows it to automatically adapt and approximate the performance of an optimal linear predictive coding network, over a wide range of inputs, while keeping its underlying temporal and synaptic properties unchanged. We demonstrate that the resulting changes to the linearized temporal filters of this nonlinear network match the fast adaptations observed experimentally in different sensory modalities, in different vertebrate species. Therefore, the nonlinear feedback inhibitory network can provide automatic adaptation to fast varying signals, maintaining the dynamic range necessary for accurate neuronal transmission of natural inputs. PMID:26247884
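
    The linear feedback-inhibition idea can be sketched in a few lines (a minimal discrete-time illustration with an assumed scalar input and a single feedback gain, not the paper's model):

    ```python
    import random

    def predictive_code(signal, gain=0.2):
        """Linear predictive-coding unit with feedback inhibition: the unit
        transmits the error between its input and an inhibitory prediction
        that integrates the unit's own past output."""
        prediction, errors = 0.0, []
        for x in signal:
            err = x - prediction        # transmitted signal = prediction error
            errors.append(err)
            prediction += gain * err    # feedback inhibition tracks the input
        return errors

    def variance(values):
        m = sum(values) / len(values)
        return sum((v - m) ** 2 for v in values) / len(values)

    # A correlated input (a random walk) needs far less dynamic range once
    # the predictable component has been subtracted away.
    rng = random.Random(1)
    walk, x = [], 0.0
    for _ in range(2000):
        x += rng.gauss(0.0, 0.1)
        walk.append(x)
    errors = predictive_code(walk)
    ```

    The transmitted error sequence has a much smaller variance than the raw walk, which is the dynamic-range saving that predictive coding buys for correlated signals.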

  6. An adaptable binary entropy coder

    NASA Technical Reports Server (NTRS)

    Kiely, A.; Klimesh, M.

    2001-01-01

    We present a novel entropy coding technique which is based on recursive interleaving of variable-to-variable length binary source codes. We discuss code design and performance estimation methods, as well as practical encoding and decoding algorithms.

  7. Towards Increased Relevance: Context-Adapted Models of the Learning Organization

    ERIC Educational Resources Information Center

    Örtenblad, Anders

    2015-01-01

    Purpose: The purposes of this paper are to take a closer look at the relevance of the idea of the learning organization for organizations in different generalized organizational contexts; to open up for the existence of multiple, context-adapted models of the learning organization; and to suggest a number of such models.…

  8. The tactile speed aftereffect depends on the speed of adapting motion across the skin rather than other spatiotemporal features

    PubMed Central

    Seizova-Cajic, Tatjana; Holcombe, Alex O.

    2015-01-01

    After prolonged exposure to a surface moving across the skin, this felt movement appears slower, a phenomenon known as the tactile speed aftereffect (tSAE). We asked which feature of the adapting motion drives the tSAE: speed, the spacing between texture elements, or the frequency with which they cross the skin. After adapting to a ridged moving surface with one hand, participants compared the speed of test stimuli on adapted and unadapted hands. We used surfaces with different spatial periods (SPs; 3, 6, 12 mm) that produced adapting motion with different combinations of adapting speed (20, 40, 80 mm/s) and temporal frequency (TF; 3.4, 6.7, 13.4 ridges/s). The primary determinant of tSAE magnitude was speed of the adapting motion, not SP or TF. This suggests that adaptation occurs centrally, after speed has been computed from SP and TF, and/or that it reflects a speed cue independent of those features in the first place (e.g., indentation force). In a second experiment, we investigated the properties of the neural code for speed. Speed tuning predicts that adaptation should be greatest for speeds at or near the adapting speed. However, the tSAE was always stronger when the adapting stimulus was faster (242 mm/s) than the test (30–143 mm/s) compared with when the adapting and test speeds were matched. These results give no indication of speed tuning and instead suggest that adaptation occurs at a level where an intensive code dominates. In an intensive code, the faster the stimulus, the more the neurons fire. PMID:26631149

  9. An adaptive distributed data aggregation based on RCPC for wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Hua, Guogang; Chen, Chang Wen

    2006-05-01

    One of the most important design issues in wireless sensor networks is energy efficiency. Data aggregation has a significant impact on the energy efficiency of wireless sensor networks. With massive deployment of sensor nodes and limited energy supply, data aggregation has been considered an essential paradigm for data collection in sensor networks. Recently, distributed source coding has been demonstrated to possess several advantages in data aggregation for wireless sensor networks. Distributed source coding is able to encode sensor data at a lower bit rate without direct communication among sensor nodes. To ensure reliable and high-throughput transmission of the aggregated data, we propose in this research progressive transmission and decoding of Rate-Compatible Punctured Convolutional (RCPC) coded data aggregation with distributed source coding. Our proposed rate-1/2 RSC codes with the Viterbi algorithm for distributed source coding are able to guarantee that, even without any correlation between the data, the decoder can always decode the data correctly without wasting energy. The proposed approach achieves two aspects of adaptive data aggregation for wireless sensor networks. First, the RCPC coding facilitates adaptive compression corresponding to the correlation of the sensor data: when the data correlation is high, a higher compression ratio can be achieved; otherwise, a lower compression ratio is achieved. Second, the data aggregation is adaptively accumulated. There is no waste of energy in transmission: even if there is no correlation among the data, the energy consumed is at the same level as raw data collection. Experimental results have shown that the proposed distributed data aggregation based on RCPC is able to achieve high-throughput and low-energy data collection for wireless sensor networks.
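
    The rate-adaptation mechanism behind RCPC codes — puncturing a low-rate mother code, with the deleted bits held back as incremental redundancy — can be sketched as follows. The generators (7,5 octal) and the puncturing pattern are illustrative assumptions, not the codes used in the paper.

```python
def conv_encode(bits, g1=0b111, g2=0b101):
    """Rate-1/2 mother code: constraint-length-3 convolutional encoder."""
    state, out = 0, []
    for b in bits:
        state = ((state << 1) | b) & 0b111          # shift the new bit in
        out.append(bin(state & g1).count("1") % 2)  # parity over taps g1
        out.append(bin(state & g2).count("1") % 2)  # parity over taps g2
    return out

def puncture(coded, pattern):
    """Drop the bits where the pattern is 0. The kept bits form the
    high-rate first transmission; the dropped ones can be sent later
    as incremental redundancy if decoding fails (the RCPC idea)."""
    reps = pattern * (len(coded) // len(pattern) + 1)
    return [b for b, keep in zip(coded, reps) if keep]

data = [1, 0, 1, 1, 0, 0, 1, 0]
coded = conv_encode(data)                  # 16 bits: rate 1/2
first_tx = puncture(coded, [1, 1, 1, 0])   # 12 bits: effective rate 2/3
```

Progressive decoding then tries the 12-bit transmission first and requests the punctured bits only when needed, which is what lets the aggregation adapt to the actual data correlation.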

  10. Adaptive decoding of convolutional codes

    NASA Astrophysics Data System (ADS)

    Hueske, K.; Geldmacher, J.; Götze, J.

    2007-06-01

    Convolutional codes, which are frequently used as error correction codes in digital transmission systems, are generally decoded using the Viterbi Decoder. On the one hand, the Viterbi Decoder is an optimum maximum likelihood decoder, i.e., the most probable transmitted code sequence is obtained. On the other hand, the computational complexity of the algorithm depends only on the code used, not on the number of transmission errors. To reduce the complexity of the decoding process under good transmission conditions, an alternative syndrome-based decoder is presented. The reduction of complexity is realized by two different approaches: syndrome zero-sequence deactivation and path metric equalization. The two approaches enable an easy adaptation of the decoding complexity to different transmission conditions, which results in a trade-off between decoding complexity and error correction performance.
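
    For reference, the maximum-likelihood baseline that the syndrome decoder is traded off against can be written compactly. The sketch below is a hard-decision Viterbi decoder for an assumed constraint-length-3, rate-1/2 code (generators 7,5 octal); it is not the authors' implementation.

```python
G1, G2 = 0b111, 0b101          # assumed generators (7,5 octal)

def encode(bits):
    """Rate-1/2 convolutional encoder, constraint length 3."""
    state, out = 0, []
    for b in bits:
        state = ((state << 1) | b) & 0b111
        out += [bin(state & G1).count("1") % 2,
                bin(state & G2).count("1") % 2]
    return out

def viterbi(received):
    """Hard-decision Viterbi decoding over the 4-state trellis."""
    INF = float("inf")
    metric = {0: 0, 1: INF, 2: INF, 3: INF}   # encoder starts in state 0
    paths = {s: [] for s in metric}
    for i in range(0, len(received), 2):
        r1, r2 = received[i], received[i + 1]
        new_metric = {s: INF for s in metric}
        new_paths = {}
        for s, m in metric.items():           # s = 2-bit encoder memory
            if m == INF:
                continue
            for b in (0, 1):                  # hypothesised input bit
                full = ((s << 1) | b) & 0b111
                branch = ((bin(full & G1).count("1") % 2 != r1)
                          + (bin(full & G2).count("1") % 2 != r2))
                ns = full & 0b11              # next state
                if m + branch < new_metric[ns]:
                    new_metric[ns] = m + branch
                    new_paths[ns] = paths[s] + [b]
        metric, paths = new_metric, new_paths
    best = min(metric, key=metric.get)        # survivor with lowest metric
    return paths[best]

data = [1, 0, 1, 1, 0, 0, 1, 0]
coded = encode(data)
coded[3] ^= 1                                 # inject one channel error
assert viterbi(coded) == data                 # the error is corrected
```

Note that the decoder does the same amount of work whether the channel injected zero errors or several — the complexity property that motivates the syndrome-based alternative for good channels.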

  11. Behavioral correlates of the distributed coding of spatial context.

    PubMed

    Anderson, Michael I; Killing, Sarah; Morris, Caitlin; O'Donoghue, Alan; Onyiagha, Dikennam; Stevenson, Rosemary; Verriotis, Madeleine; Jeffery, Kathryn J

    2006-01-01

    Hippocampal place cells respond heterogeneously to elemental changes of a compound spatial context, suggesting that they form a distributed code of context, whereby context information is shared across a population of neurons. The question arises as to what this distributed code might be useful for. The present study explored two possibilities: one, that it allows contexts with common elements to be disambiguated, and the other, that it allows a given context to be associated with more than one outcome. We used two naturalistic measures of context processing in rats, rearing and thigmotaxis (boundary-hugging), to explore how rats responded to contextual novelty and to relate this to the behavior of place cells. In experiment 1, rats showed dishabituation of rearing to a novel reconfiguration of familiar context elements, suggesting that they perceived the reconfiguration as novel, a behavior that parallels that of place cells in a similar situation. In experiment 2, rats were trained in a place preference task on an open-field arena. A change in the arena context triggered renewed thigmotaxis, and yet navigation continued unimpaired, indicating simultaneous representation of both the altered contextual and constant spatial cues. Place cells similarly exhibited a dual population of responses, consistent with the hypothesis that their activity underlies spatial behavior. Together, these experiments suggest that heterogeneous context encoding (or "partial remapping") by place cells may function to allow the flexible assignment of associations to contexts, a faculty that could be useful in episodic memory encoding. Copyright (c) 2006 Wiley-Liss, Inc.

  12. Grid-Adapted FUN3D Computations for the Second High Lift Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Lee-Rausch, E. M.; Rumsey, C. L.; Park, M. A.

    2014-01-01

    Contributions of the unstructured Reynolds-averaged Navier-Stokes code FUN3D to the 2nd AIAA CFD High Lift Prediction Workshop are described, and detailed comparisons are made with experimental data. Using workshop-supplied grids, results for the clean wing configuration are compared with results from the structured code CFL3D. Using the same turbulence model, both codes compare reasonably well in terms of total forces and moments, and the maximum lift is similarly over-predicted for both codes compared to experiment. By including more representative geometry features such as slat and flap brackets and slat pressure tube bundles, FUN3D captures the general effects of the Reynolds number variation, but under-predicts maximum lift on workshop-supplied grids in comparison with the experimental data, due to excessive separation. However, when output-based, off-body grid adaptation in FUN3D is employed, results improve considerably. In particular, when the geometry includes both brackets and the pressure tube bundles, grid adaptation results in a more accurate prediction of lift near stall in comparison with the wind-tunnel data. Furthermore, a rotation-corrected turbulence model shows improved pressure predictions on the outboard span when using adapted grids.

  13. Polarization-multiplexed rate-adaptive non-binary-quasi-cyclic-LDPC-coded multilevel modulation with coherent detection for optical transport networks.

    PubMed

    Arabaci, Murat; Djordjevic, Ivan B; Saunders, Ross; Marcoccia, Roberto M

    2010-02-01

    In order to achieve high-speed transmission over optical transport networks (OTNs) and maximize their throughput, we propose using rate-adaptive polarization-multiplexed coded multilevel modulation with coherent detection based on component non-binary quasi-cyclic (QC) LDPC codes. Compared to the prior-art bit-interleaved LDPC-coded modulation (BI-LDPC-CM) scheme, the proposed non-binary LDPC-coded modulation (NB-LDPC-CM) scheme not only reduces latency due to symbol- instead of bit-level processing but also provides either an impressive reduction in computational complexity or striking improvements in coding gain, depending on the constellation size. As the paper shows, compared to its prior-art binary counterpart, the proposed NB-LDPC-CM scheme better addresses the needs of future OTNs: achieving the target BER performance and providing the maximum possible throughput over the entire lifetime of the OTN.

  14. Toward a Probabilistic Automata Model of Some Aspects of Code-Switching.

    ERIC Educational Resources Information Center

    Dearholt, D. W.; Valdes-Fallis, G.

    1978-01-01

    The purpose of the model is to select either Spanish or English as the language to be used; its goals at this stage of development include modeling code-switching for lexical need, apparently random code-switching, dependency of code-switching upon sociolinguistic context, and code-switching within syntactic constraints. (EJS)

  15. Distributed Learning, Recognition, and Prediction by ART and ARTMAP Neural Networks.

    PubMed

    Carpenter, Gail A.

    1997-11-01

    A class of adaptive resonance theory (ART) models for learning, recognition, and prediction with arbitrarily distributed code representations is introduced. Distributed ART neural networks combine the stable fast learning capabilities of winner-take-all ART systems with the noise tolerance and code compression capabilities of multilayer perceptrons. With a winner-take-all code, the unsupervised model dART reduces to fuzzy ART and the supervised model dARTMAP reduces to fuzzy ARTMAP. With a distributed code, these networks automatically apportion learned changes according to the degree of activation of each coding node, which permits fast as well as slow learning without catastrophic forgetting. Distributed ART models replace the traditional neural network path weight with a dynamic weight equal to the rectified difference between coding node activation and an adaptive threshold. Thresholds increase monotonically during learning according to a principle of atrophy due to disuse. However, monotonic change at the synaptic level manifests itself as bidirectional change at the dynamic level, where the result of adaptation resembles long-term potentiation (LTP) for single-pulse or low frequency test inputs but can resemble long-term depression (LTD) for higher frequency test inputs. This paradoxical behavior is traced to dual computational properties of phasic and tonic coding signal components. A parallel distributed match-reset-search process also helps stabilize memory. Without the match-reset-search system, dART becomes a type of distributed competitive learning network.

  16. Parallel evolutionary computation in bioinformatics applications.

    PubMed

    Pinho, Jorge; Sobral, João Luis; Rocha, Miguel

    2013-05-01

    A large number of optimization problems within the field of Bioinformatics require methods able to handle their inherent complexity (e.g. NP-hard problems) and also demand increased computational effort. In this context, the use of parallel architectures is a necessity. In this work, we propose ParJECoLi, a Java based library that offers a large set of metaheuristic methods (such as Evolutionary Algorithms) and also addresses the issue of their efficient execution on a wide range of parallel architectures. The proposed approach focuses on ease of use, making the adaptation to distinct parallel environments (multicore, cluster, grid) transparent to the user. Indeed, this work shows how the development of the optimization library can proceed independently of its adaptation for several architectures, making use of Aspect-Oriented Programming. The pluggable nature of the parallelism-related modules allows the user to easily configure their environment, adding parallelism modules to the base source code when needed. The performance of the platform is validated with two case studies within biological model optimization. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  17. Codon usage bias reveals genomic adaptations to environmental conditions in an acidophilic consortium.

    PubMed

    Hart, Andrew; Cortés, María Paz; Latorre, Mauricio; Martinez, Servet

    2018-01-01

    The analysis of codon usage bias has been widely used to characterize different communities of microorganisms. In this context, the aim of this work was to study the codon usage bias in a natural consortium of five acidophilic bacteria used for biomining. The codon usage bias of the consortium was contrasted with genes from an alternative collection of acidophilic reference strains and metagenome samples. Results indicate that acidophilic bacteria preferentially have low codon usage bias, consistent with both their capacity to live in a wide range of habitats and their slow growth rate, a characteristic probably acquired independently from their phylogenetic relationships. In addition, the analysis showed significant differences in the unique sets of genes from the autotrophic species of the consortium in relation to other acidophilic organisms, principally in genes which code for proteins involved in metal and oxidative stress resistance. The lower values of codon usage bias obtained in this unique set of genes suggest higher transcriptional adaptation to living in extreme conditions, which was probably acquired as a measure for resisting the elevated metal conditions present in the mine.

  18. Computer programming for generating visual stimuli.

    PubMed

    Bukhari, Farhan; Kurylo, Daniel D

    2008-02-01

    Critical to vision research is the generation of visual displays with precise control over stimulus metrics. Generating stimuli often requires adapting commercial software or developing specialized software for specific research applications. In order to facilitate this process, we give here an overview that allows nonexpert users to generate and customize stimuli for vision research. We first give a review of relevant hardware and software considerations, to allow the selection of display hardware, operating system, programming language, and graphics packages most appropriate for specific research applications. We then describe the framework of a generic computer program that can be adapted for use with a broad range of experimental applications. Stimuli are generated in the context of trial events, allowing the display of text messages, the monitoring of subject responses and reaction times, and the inclusion of contingency algorithms. This approach allows direct control and management of computer-generated visual stimuli while utilizing the full capabilities of modern hardware and software systems. The flowchart and source code for the stimulus-generating program may be downloaded from www.psychonomic.org/archive.

  19. Some practical universal noiseless coding techniques, part 3, module PSI14,K+

    NASA Technical Reports Server (NTRS)

    Rice, Robert F.

    1991-01-01

    The algorithmic definitions, performance characterizations, and application notes for a high-performance adaptive noiseless coding module are provided. Subsets of these algorithms are currently under development in custom very large scale integration (VLSI) at three NASA centers. The generality of coding algorithms recently reported is extended. The module incorporates a powerful adaptive noiseless coder for Standard Data Sources (i.e., sources whose symbols can be represented by uncorrelated non-negative integers, where smaller integers are more likely than the larger ones). Coders can be specified to provide performance close to the data entropy over any desired dynamic range (of entropy) above 0.75 bit/sample. This is accomplished by adaptively choosing the best of many efficient variable-length coding options to use on each short block of data (e.g., 16 samples). All code options used for entropies above 1.5 bits/sample are 'Huffman Equivalent', but they require no table lookups to implement. The coding can be performed directly on data that have been preprocessed to exhibit the characteristics of a standard source. Alternatively, a built-in predictive preprocessor can be used where applicable. This built-in preprocessor includes the familiar 1-D predictor followed by a function that maps the prediction error sequences into the desired standard form. Additionally, an external prediction can be substituted if desired. A broad range of issues dealing with the interface between the coding module and the data systems it might serve are further addressed. These issues include: multidimensional prediction, archival access, sensor noise, rate control, code rate improvements outside the module, and the optimality of certain internal code options.
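
    The per-block selection among variable-length code options can be illustrated with Golomb-Rice codes, which are the kind of low-complexity, table-free options this module family is built on. The parameter set and sample block below are assumptions for illustration, not the module's actual option set.

```python
def rice_encode(n, k):
    """Golomb-Rice code for a non-negative integer: the quotient
    n >> k in unary (terminated by a 0), then k binary remainder bits."""
    code = "1" * (n >> k) + "0"
    if k:
        code += format(n & ((1 << k) - 1), f"0{k}b")
    return code

def encode_block(block, k_options=(0, 1, 2, 3)):
    """Adaptive step: try each code option on the short block and keep
    the cheapest, as in the per-block option selection described above."""
    best_k = min(k_options,
                 key=lambda k: sum(len(rice_encode(n, k)) for n in block))
    return best_k, "".join(rice_encode(n, best_k) for n in block)

k, bits = encode_block([3, 1, 4, 1, 5, 9, 2, 6])   # one short block
```

Because the winning parameter is chosen per block, the coder tracks local entropy changes without any table lookups, one concrete realization of the adaptivity the abstract describes.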

  20. Adaptive and reliably acknowledged FSO communications

    NASA Astrophysics Data System (ADS)

    Fitz, Michael P.; Halford, Thomas R.; Kose, Cenk; Cromwell, Jonathan; Gordon, Steven

    2015-05-01

    Atmospheric turbulence causes the receive signal intensity on free space optical (FSO) communication links to vary over time. Scintillation fades can stymie connectivity for milliseconds at a time. To approach the information-theoretic limits of communication in such time-varying channels, it is necessary either to code across extremely long blocks of data - thereby inducing unacceptable delays - or to vary the code rate according to the instantaneous channel conditions. We describe the design, laboratory testing, and over-the-air testing of an FSO modem that employs a protocol with adaptive coded modulation (ACM) and hybrid automatic repeat request. For links with fixed throughput, this protocol provides a 10dB reduction in the required received signal-to-noise ratio (SNR); for links with fixed range, this protocol provides a greater than 3x increase in throughput. Independent U.S. Government tests demonstrate that our protocol effectively adapts the code rate to match the instantaneous channel conditions. The modem is able to provide throughputs in excess of 850 Mbps on links with ranges greater than 15 kilometers.
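
    At its core, adaptive coded modulation of this kind reduces to a threshold rule: pick the highest-throughput mode whose SNR requirement the instantaneous channel meets. The table below is hypothetical — the modem's actual modes and thresholds are not given in the abstract.

```python
# Hypothetical MCS table: (min SNR in dB, code rate, bits per symbol).
# The thresholds and modes are illustrative, not the modem's actual set.
MCS_TABLE = [
    (3.0, 1 / 2, 1),    # most robust: rate-1/2 coded BPSK
    (8.0, 2 / 3, 2),    # rate-2/3 coded QPSK
    (14.0, 3 / 4, 4),   # rate-3/4 coded 16-QAM
]

def select_mcs(snr_db):
    """Pick the highest-throughput mode whose SNR threshold is met."""
    feasible = [m for m in MCS_TABLE if snr_db >= m[0]]
    if not feasible:
        return None     # deep scintillation fade: hold off and rely on ARQ
    return max(feasible, key=lambda m: m[1] * m[2])

mode = select_mcs(10.0)     # selects the rate-2/3 QPSK entry
```

The hybrid-ARQ side of the protocol then covers the residual errors made whenever the channel drops below the selected mode's threshold before the next adaptation.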

  1. GIZMO: Multi-method magneto-hydrodynamics+gravity code

    NASA Astrophysics Data System (ADS)

    Hopkins, Philip F.

    2014-10-01

    GIZMO is a flexible, multi-method magneto-hydrodynamics+gravity code that solves the hydrodynamic equations using a variety of different methods. It introduces new Lagrangian Godunov-type methods that allow solving the fluid equations with a moving particle distribution that is automatically adaptive in resolution and avoids the advection errors, angular momentum conservation errors, and excessive diffusion problems that seriously limit the applicability of “adaptive mesh” (AMR) codes, while simultaneously avoiding the low-order errors inherent to simpler methods like smoothed-particle hydrodynamics (SPH). GIZMO also allows the use of SPH either in “traditional” form or “modern” (more accurate) forms, or use of a mesh. Self-gravity is solved quickly with a BH-Tree (optionally a hybrid PM-Tree for periodic boundaries) and on-the-fly adaptive gravitational softenings. The code is descended from P-GADGET, itself descended from GADGET-2 (ascl:0003.001), and many of the naming conventions remain (for the sake of compatibility with the large library of GADGET work and analysis software).

  2. FUN3D and CFL3D Computations for the First High Lift Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Park, Michael A.; Lee-Rausch, Elizabeth M.; Rumsey, Christopher L.

    2011-01-01

    Two Reynolds-averaged Navier-Stokes codes were used to compute flow over the NASA Trapezoidal Wing at high lift conditions for the 1st AIAA CFD High Lift Prediction Workshop, held in Chicago in June 2010. The unstructured-grid code FUN3D and the structured-grid code CFL3D were applied to several different grid systems. The effects of code, grid system, turbulence model, viscous term treatment, and brackets were studied. The SST model on this configuration predicted lower lift than the Spalart-Allmaras model at high angles of attack; the Spalart-Allmaras model agreed better with experiment. Neglecting viscous cross-derivative terms caused poorer prediction in the wing tip vortex region. Output-based grid adaptation was applied to the unstructured-grid solutions. The adapted grids better resolved wake structures and reduced flap flow separation, which was also observed in uniform grid refinement studies. Limitations of the adaptation method as well as areas for future improvement were identified.

  3. A proposed study of multiple scattering through clouds up to 1 THz

    NASA Technical Reports Server (NTRS)

    Gerace, G. C.; Smith, E. K.

    1992-01-01

    A rigorous computation of the electromagnetic field scattered from an atmospheric liquid water cloud is proposed. The recent development of a fast recursive algorithm (Chew algorithm) for computing the fields scattered from numerous scatterers now makes a rigorous computation feasible. A method is presented for adapting this algorithm to a general case where there are an extremely large number of scatterers. It is also proposed to extend a new binary PAM channel coding technique (El-Khamy coding) to multiple levels with non-square pulse shapes. The Chew algorithm can be used to compute the transfer function of a cloud channel. Then the transfer function can be used to design an optimum El-Khamy code. In principle, these concepts can be applied directly to the realistic case of a time-varying cloud (adaptive channel coding and adaptive equalization). A brief review is included of some preliminary work on cloud dispersive effects on digital communication signals and on cloud liquid water spectra and correlations.

  4. Functions of Code-Switching among Iranian Advanced and Elementary Teachers and Students

    ERIC Educational Resources Information Center

    Momenian, Mohammad; Samar, Reza Ghafar

    2011-01-01

    This paper reports on the findings of a study carried out on the advanced and elementary teachers' and students' functions and patterns of code-switching in Iranian English classrooms. This concept has not been adequately examined in L2 (second language) classroom contexts than in outdoor natural contexts. Therefore, besides reporting on the…

  5. Inter-Sentential Patterns of Code-Switching: A Gender-Based Investigation of Male and Female EFL Teachers

    ERIC Educational Resources Information Center

    Gulzar, Malik Ajmal; Farooq, Muhammad Umar; Umer, Muhammad

    2013-01-01

    This article has sought to contribute to discussions concerning the value of inter-sentential patterns of code-switching (henceforth ISPCS) particularly in the context of EFL classrooms. Through a detailed analysis of recorded data produced in that context, distinctive features in the discourse were discerned which were associated with males' and…

  6. A New Realistic Evaluation Analysis Method: Linked Coding of Context, Mechanism, and Outcome Relationships

    ERIC Educational Resources Information Center

    Jackson, Suzanne F.; Kolla, Gillian

    2012-01-01

    In attempting to use a realistic evaluation approach to explore the role of Community Parents in early parenting programs in Toronto, a novel technique was developed to analyze the links between contexts (C), mechanisms (M) and outcomes (O) directly from experienced practitioner interviews. Rather than coding the interviews into themes in terms of…

  7. Code-Switching in Higher Education in a Multilingual Environment: A Lebanese Exploratory Study

    ERIC Educational Resources Information Center

    Bahous, Rima N.; Nabhani, Mona Baroud; Bacha, Nahla Nola

    2014-01-01

    Research has shown that code-switching (CS) between languages in spoken discourse is prevalent in multilingual contexts and is used for many purposes. More recently, it has become the subject of much concern in academic contexts in negatively affecting students' language use and learning. However, while the concern has been increasing, no rigorous…

  8. Distributed Adaptive Binary Quantization for Fast Nearest Neighbor Search.

    PubMed

    Xianglong Liu; Zhujin Li; Cheng Deng; Dacheng Tao

    2017-11-01

    Hashing has proved to be an attractive technique for fast nearest neighbor search over big data. Compared with projection-based hashing methods, prototype-based ones have stronger power to generate discriminative binary codes for data with complex intrinsic structure. However, existing prototype-based methods, such as spherical hashing and K-means hashing, still suffer from ineffective coding that utilizes the complete binary codes in a hypercube. To address this problem, we propose an adaptive binary quantization (ABQ) method that learns a discriminative hash function with prototypes associated with small unique binary codes. Our alternating optimization adaptively discovers the prototype set and the code set of a varying size in an efficient way, which together robustly approximate the data relations. Our method can be naturally generalized to the product space for long hash codes, and enjoys fast training linear in the number of training data. We further devise a distributed framework for large-scale learning, which can significantly speed up the training of ABQ in the distributed environments that are now widely deployed in many areas. The extensive experiments on four large-scale (up to 80 million) data sets demonstrate that our method significantly outperforms state-of-the-art hashing methods, with relative performance gains of up to 58.84%.
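
    The prototype-with-small-codes idea can be sketched minimally: quantize each point to the binary code of its nearest prototype, then rank stored codes by Hamming distance at query time. The prototypes and codes below are toy assumptions, not the learned ABQ quantizer.

```python
def hamming(a, b):
    """Number of differing bits between two binary codes."""
    return bin(a ^ b).count("1")

# Toy prototypes with small unique binary codes (the ABQ idea: the
# prototype set need not fill every vertex of the Hamming cube).
PROTOTYPES = {0b00: (0.0, 0.0), 0b01: (0.0, 1.0), 0b11: (1.0, 1.0)}

def encode(x):
    """Quantize a point to the code of its nearest prototype."""
    return min(PROTOTYPES,
               key=lambda c: sum((xi - pi) ** 2
                                 for xi, pi in zip(x, PROTOTYPES[c])))

def search(query, database_codes):
    """Approximate nearest-neighbour search: rank stored codes by
    Hamming distance to the query's code."""
    q = encode(query)
    return sorted(database_codes, key=lambda c: hamming(q, c))

points = [(0.1, 0.2), (0.9, 1.1), (0.0, 0.9)]
codes = [encode(p) for p in points]
```

Learning which prototypes to use, and which short codes to assign them, is the part the paper's alternating optimization solves; the lookup structure itself is as simple as above.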

  9. Developing an ethical code for engineers: the discursive approach.

    PubMed

    Lozano, J Félix

    2006-04-01

    From the Hippocratic Oath on, deontological codes and other professional self-regulation mechanisms have been used to legitimize and identify professional groups. New technological challenges and, above all, changes in the socioeconomic environment require adaptable codes which can respond to new demands. We assume that ethical codes for professionals should not simply focus on regulative functions, but must also consider ideological and educative functions. Any adaptations should take into account both contents (values, norms and recommendations) and the drafting process itself. In this article we propose a process for developing a professional ethical code for an official professional association (Colegio Oficial de Ingenieros Industriales de Valencia, COIIV), starting from the philosophical assumptions of discursive ethics but adapting them to critical hermeneutics. Our proposal is based on the Integrity Approach rather than the Compliance Approach. A process aiming to achieve an effective ethical document that fulfils regulative and ideological functions requires a participative, dialogical and reflexive methodology. This process must respond to moral exigencies and demands for efficiency and professional effectiveness. In addition to the methodological proposal we present our experience of producing an ethical code for the industrial engineers' association in Valencia (Spain) where this methodology was applied, and we evaluate the detected problems and future potential.

  10. The Dynamics of Self-Esteem in Cognitive Therapy for Avoidant and Obsessive-Compulsive Personality Disorders: An Adaptive Role of Self-Esteem Variability?

    PubMed Central

    Cummings, Jorden A.; Hayes, Adele M.; Cardaciotto, LeeAnn; Newman, Cory F.

    2011-01-01

    Self-esteem variability is often associated with poor functioning. However, in disorders with entrenched negative views of self and in a context designed to challenge those views, variable self-esteem might represent a marker of change. We examined self-esteem variability in a sample of 27 patients with Avoidant and Obsessive-Compulsive Personality Disorders who received Cognitive Therapy (CT). A therapy coding system was used to rate patients’ positive and negative views of self expressed in the first ten sessions of a 52-week treatment. Ratings of negative (reverse scored) and positive view of self were summed to create a composite score for each session. Self-esteem variability was calculated as the standard deviation of self-esteem scores across sessions. More self-esteem variability predicted more improvement in personality disorder and depression symptoms at the end of treatment, beyond baseline and average self-esteem. Early variability in self-esteem, in this population and context, appeared to be a marker of therapeutic change. PMID:22923855

  11. Challenges in Wireless System Integration as Enablers for Indoor Context Aware Environments

    PubMed Central

    Aguirre, Erik

    2017-01-01

    The advent of fully interactive environments within Smart Cities and Smart Regions requires the use of multiple wireless systems. In the case of user-device interaction, which finds multiple applications such as Ambient Assisted Living, Intelligent Transportation Systems or Smart Grids, among others, large numbers of transceivers are employed in order to achieve anytime, anyplace and any-device connectivity. The resulting combination of heterogeneous wireless networks exhibits fundamental limitations derived from Coverage/Capacity relations, as a function of required Quality of Service parameters, required bit rate, energy restrictions and adaptive modulation and coding schemes. In this context, inherent transceiver density poses challenges in overall system operation, given by multiple node operation which increases overall interference levels. In this work, a deterministic analysis applied to variable-density wireless sensor network operation within complex indoor scenarios is presented, as a function of topological node distribution. The extensive analysis derives interference characterizations, both for conventional transceivers as well as wearables, which provide relevant information in terms of individual node configuration as well as complete network layout. PMID:28704963

  12. Gain-adaptive vector quantization for medium-rate speech coding

    NASA Technical Reports Server (NTRS)

    Chen, J.-H.; Gersho, A.

    1985-01-01

    A class of adaptive vector quantizers (VQs) that can dynamically adjust the 'gain' of codevectors according to the input signal level is introduced. The encoder uses a gain estimator to determine a suitable normalization of each input vector prior to VQ coding. The normalized vectors have reduced dynamic range and can then be more efficiently coded. At the receiver, the VQ decoder output is multiplied by the estimated gain. Both forward and backward adaptation are considered and several different gain estimators are compared and evaluated. An approach to optimizing the design of gain estimators is introduced. Some of the more obvious techniques for achieving gain adaptation are substantially less effective than the use of optimized gain estimators. A novel design technique that is needed to generate the appropriate gain-normalized codebook for the vector quantizer is introduced. Experimental results show that a significant gain in segmental SNR can be obtained over nonadaptive VQ with a negligible increase in complexity.
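
    The forward-adaptive variant described above can be sketched as: estimate a gain, normalize the input, quantize against a gain-normalized codebook, and rescale at the decoder. The RMS gain estimator and the four-entry codebook are illustrative assumptions, not the paper's optimized designs.

```python
import math

# Hypothetical gain-normalized codebook of 2-D codevectors.
CODEBOOK = [(1.0, 1.0), (1.0, -1.0), (-1.0, 1.0), (1.41, 0.0)]

def rms_gain(vector):
    """Forward gain estimate: the RMS level of the input vector."""
    return math.sqrt(sum(x * x for x in vector) / len(vector)) or 1.0

def vq_encode(vector):
    """Normalize by the estimated gain, then pick the nearest
    codevector from the gain-normalized codebook."""
    g = rms_gain(vector)
    unit = [x / g for x in vector]
    idx = min(range(len(CODEBOOK)),
              key=lambda i: sum((u - c) ** 2
                                for u, c in zip(unit, CODEBOOK[i])))
    return g, idx               # transmit the gain and the codevector index

def vq_decode(g, idx):
    """Receiver multiplies the decoded codevector by the received gain."""
    return tuple(g * c for c in CODEBOOK[idx])

g, i = vq_encode((5.0, 5.2))    # a high-level input vector
approx = vq_decode(g, i)        # close to (5.0, 5.2) despite the tiny codebook
```

Because the codebook only has to cover gain-normalized shapes, a small codebook serves inputs of any level — the dynamic-range reduction the abstract attributes to gain adaptation. (Backward adaptation would instead estimate the gain from previously decoded output, avoiding the side information.)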

  13. CosmosDG: An hp -adaptive Discontinuous Galerkin Code for Hyper-resolved Relativistic MHD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anninos, Peter; Lau, Cheuk; Bryant, Colton

    We have extended Cosmos++, a multidimensional unstructured adaptive mesh code for solving the covariant Newtonian and general relativistic radiation magnetohydrodynamic (MHD) equations, to accommodate both discrete finite volume and arbitrarily high-order finite element structures. The new finite element implementation, called CosmosDG, is based on a discontinuous Galerkin (DG) formulation, using both entropy-based artificial viscosity and slope limiting procedures for the regularization of shocks. High-order multistage forward Euler and strong-stability preserving Runge–Kutta time integration options complement high-order spatial discretization. We have also added flexibility in the code infrastructure allowing for both adaptive mesh and adaptive basis order refinement to be performed separately or simultaneously in a local (cell-by-cell) manner. We discuss in this report the DG formulation and present tests demonstrating the robustness, accuracy, and convergence of our numerical methods applied to special and general relativistic MHD, although we note that an equivalent capability currently also exists in CosmosDG for Newtonian systems.

  14. CosmosDG: An hp-adaptive Discontinuous Galerkin Code for Hyper-resolved Relativistic MHD

    NASA Astrophysics Data System (ADS)

    Anninos, Peter; Bryant, Colton; Fragile, P. Chris; Holgado, A. Miguel; Lau, Cheuk; Nemergut, Daniel

    2017-08-01

    We have extended Cosmos++, a multidimensional unstructured adaptive mesh code for solving the covariant Newtonian and general relativistic radiation magnetohydrodynamic (MHD) equations, to accommodate both discrete finite volume and arbitrarily high-order finite element structures. The new finite element implementation, called CosmosDG, is based on a discontinuous Galerkin (DG) formulation, using both entropy-based artificial viscosity and slope limiting procedures for the regularization of shocks. High-order multistage forward Euler and strong-stability preserving Runge-Kutta time integration options complement high-order spatial discretization. We have also added flexibility in the code infrastructure allowing for both adaptive mesh and adaptive basis order refinement to be performed separately or simultaneously in a local (cell-by-cell) manner. We discuss in this report the DG formulation and present tests demonstrating the robustness, accuracy, and convergence of our numerical methods applied to special and general relativistic MHD, although we note that an equivalent capability currently also exists in CosmosDG for Newtonian systems.

  15. An assessment of the adaptive unstructured tetrahedral grid, Euler Flow Solver Code FELISA

    NASA Technical Reports Server (NTRS)

    Djomehri, M. Jahed; Erickson, Larry L.

    1994-01-01

    A three-dimensional solution-adaptive Euler flow solver for unstructured tetrahedral meshes is assessed, and the accuracy and efficiency of the method for predicting sonic boom pressure signatures about simple generic models are demonstrated. Comparison of computational and wind tunnel data and enhancement of numerical solutions by means of grid adaptivity are discussed. The mesh generation is based on the advancing front technique. The FELISA code consists of two solvers, the Taylor-Galerkin and the Runge-Kutta-Galerkin schemes, both of which are spatially discretized by the usual Galerkin weighted residual finite-element methods but with different explicit time-marching schemes to steady state. The solution-adaptive grid procedure is based on either remeshing or mesh refinement techniques. An alternative geometry adaptive procedure is also incorporated.

  16. Simulations of recoiling black holes: adaptive mesh refinement and radiative transfer

    NASA Astrophysics Data System (ADS)

    Meliani, Zakaria; Mizuno, Yosuke; Olivares, Hector; Porth, Oliver; Rezzolla, Luciano; Younsi, Ziri

    2017-02-01

    Context. In many astrophysical phenomena, and especially in those that involve the high-energy regimes that always accompany the astronomical phenomenology of black holes and neutron stars, physical conditions that are achieved are extreme in terms of speeds, temperatures, and gravitational fields. In such relativistic regimes, numerical calculations are the only tool to accurately model the dynamics of the flows and the transport of radiation in the accreting matter. Aims: We here continue our effort of modelling the behaviour of matter when it orbits or is accreted onto a generic black hole by developing a new numerical code that employs advanced techniques geared towards solving the equations of general-relativistic hydrodynamics. Methods: More specifically, the new code employs a number of high-resolution shock-capturing Riemann solvers and reconstruction algorithms, exploiting the enhanced accuracy and the reduced computational cost of adaptive mesh-refinement (AMR) techniques. In addition, the code makes use of sophisticated ray-tracing libraries that, coupled with general-relativistic radiation-transfer calculations, allow us to accurately compute the electromagnetic emissions from such accretion flows. Results: We validate the new code by presenting an extensive series of stationary accretion flows either in spherical or axial symmetry that are performed either in two or three spatial dimensions. In addition, we consider the highly nonlinear scenario of a recoiling black hole produced in the merger of a supermassive black-hole binary interacting with the surrounding circumbinary disc. In this way, we can present for the first time ray-traced images of the shocked fluid and the light curve resulting from consistent general-relativistic radiation-transport calculations from this process. 
Conclusions: The work presented here lays the ground for the development of a generic computational infrastructure employing AMR techniques to accurately and self-consistently calculate general-relativistic accretion flows onto compact objects. In addition to the accurate handling of the matter, we provide a self-consistent electromagnetic emission from these scenarios by solving the associated radiative-transfer problem. While magnetic fields are currently excluded from our analysis, the tools presented here can have a number of applications to study accretion flows onto black holes or neutron stars.

  17. Professional Learning Communities Assessment: Adaptation, Internal Validity, and Multidimensional Model Testing in Turkish Context

    ERIC Educational Resources Information Center

    Dogan, Selçuk; Tatik, R. Samil; Yurtseven, Nihal

    2017-01-01

    The main purpose of this study is to adapt and validate the Professional Learning Communities Assessment Revised (PLCA-R) by Olivier, Hipp, and Huffman within the context of Turkish schools. The instrument was translated and adapted to administer to teachers in Turkey. Internal structure of the Turkish version of PLCA-R was investigated by using…

  18. Visual cues that are effective for contextual saccade adaptation.

    PubMed

    Azadi, Reza; Harwood, Mark R

    2014-06-01

    The accuracy of saccades, as maintained by saccade adaptation, has been shown to be context dependent: able to have different amplitude movements to the same retinal displacement dependent on motor contexts such as orbital starting location. There is conflicting evidence as to whether purely visual cues also affect contextual saccade adaptation and, if so, what function this might serve. We tested what visual cues might evoke contextual adaptation. Over 5 experiments, 78 naive subjects made saccades to circularly moving targets, which stepped outward or inward during the saccade depending on target movement direction, speed, or color and shape. To test if the movement or context postsaccade were critical, we stopped the postsaccade target motion (experiment 4) or neutralized the contexts by equating postsaccade target speed to an intermediate value (experiment 5). We found contextual adaptation in all conditions except those defined by color and shape. We conclude that some, but not all, visual cues before the saccade are sufficient for contextual adaptation. We conjecture that this visual contextuality functions to allow for different motor states for different coordinated movement patterns, such as coordinated saccade and pursuit motor planning. Copyright © 2014 the American Physiological Society.

  19. User's manual for a material transport code on the Octopus Computer Network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naymik, T.G.; Mendez, G.D.

    1978-09-15

    A code to simulate material transport through porous media was developed at Oak Ridge National Laboratory. This code has been modified and adapted for use at Lawrence Livermore Laboratory. This manual, in conjunction with report ORNL-4928, explains the input, output, and execution of the code on the Octopus Computer Network.

  20. Adapting to a US Medical Curriculum in Malaysia: A Qualitative Study on Cultural Dissonance in International Education

    PubMed Central

    Shields, Ryan Y

    2016-01-01

    Context Minimal research has examined the recent exportation of medical curricula to international settings. Johns Hopkins University School of Medicine in Baltimore, USA partnered with Perdana University Graduate School of Medicine in Kuala Lumpur, Malaysia and implemented the same curriculum currently used at Johns Hopkins University to teach medical students at Perdana University. This study aimed to explore the perspectives of first-year medical students at Perdana University, focusing on issues of cultural dissonance during adaptation to a US curriculum. Methods In-depth semi-structured interviews with the inaugural class of first-year students (n=24) were conducted, audio-recorded, and transcribed. Two reviewers independently coded and analyzed the qualitative data for major themes. Results The most prominent themes identified were the transition from a “passive” to an “active” learning environment and the friendliness and openness of the professors. Students noted that “[Perdana University] is a whole new, different culture and now we are adapting to the culture.” Being vocal during classes and taking exams based on conceptual understanding and knowledge application/integration proved to be more challenging for students than having classes taught entirely in English or the amount of material covered. Discussion This study reinforced many cultural education theories as it revealed the major issues of Malaysian graduate students adapting to a US-style medical curriculum. Despite coming from a collectivistic, Confucian-based cultural learning background, the Malaysian students at Perdana University adopted and adapted to, and subsequently supported, the US learning expectations. PMID:27672530

  1. Language Recognition via Sparse Coding

    DTIC Science & Technology

    2016-09-08

    a posteriori (MAP) adaptation scheme that further optimizes the discriminative quality of sparse-coded speech features. We empirically validate the...significantly improve the discriminative quality of sparse-coded speech features. In Section 4, we evaluate the proposed approaches against an i-vector

  2. Dual coding theory, word abstractness, and emotion: a critical review of Kousta et al. (2011).

    PubMed

    Paivio, Allan

    2013-02-01

    Kousta, Vigliocco, Del Campo, Vinson, and Andrews (2011) questioned the adequacy of dual coding theory and the context availability model as explanations of representational and processing differences between concrete and abstract words. They proposed an alternative approach that focuses on the role of emotional content in the processing of abstract concepts. Their dual coding critique is, however, based on impoverished and, in some respects, incorrect interpretations of the theory and its implications. This response corrects those gaps and misinterpretations and summarizes research findings that show predicted variations in the effects of dual coding variables in different tasks and contexts. Especially emphasized is an empirically supported dual coding theory of emotion that goes beyond the Kousta et al. emphasis on emotion in abstract semantics.

  3. Adaptive coded aperture imaging in the infrared: towards a practical implementation

    NASA Astrophysics Data System (ADS)

    Slinger, Chris W.; Gilholm, Kevin; Gordon, Neil; McNie, Mark; Payne, Doug; Ridley, Kevin; Strens, Malcolm; Todd, Mike; De Villiers, Geoff; Watson, Philip; Wilson, Rebecca; Dyer, Gavin; Eismann, Mike; Meola, Joe; Rogers, Stanley

    2008-08-01

    An earlier paper [1] discussed the merits of adaptive coded apertures for use as lensless imaging systems in the thermal infrared and visible. It was shown how diffractive (rather than the more conventional geometric) coding could be used, and that 2D intensity measurements from multiple mask patterns could be combined and decoded to yield enhanced imagery. Initial experimental results in the visible band were presented. Unfortunately, radiosity calculations, also presented in that paper, indicated that the signal to noise performance of systems using this approach was likely to be compromised, especially in the infrared. This paper will discuss how such limitations can be overcome, and some of the tradeoffs involved. Experimental results showing tracking and imaging performance of these modified, diffractive, adaptive coded aperture systems in the visible and infrared will be presented. The subpixel imaging and tracking performance is compared to that of conventional imaging systems and shown to be superior. System size, weight and cost calculations indicate that the coded aperture approach, employing novel photonic MOEMS micro-shutter architectures, has significant merits for a given level of performance in the MWIR when compared to more conventional imaging approaches.

  4. Algorithms for high-speed universal noiseless coding

    NASA Technical Reports Server (NTRS)

    Rice, Robert F.; Yeh, Pen-Shu; Miller, Warner

    1993-01-01

    This paper provides the basic algorithmic definitions and performance characterizations for a high-performance adaptive noiseless (lossless) 'coding module' which is currently under separate development as single-chip microelectronic circuits at two NASA centers. Laboratory tests of one of these implementations recently demonstrated coding rates of up to 900 Mbits/s. A companion 'decoding module' can operate at up to half the coder's rate. The functionality provided by these modules should be applicable to most of NASA's science data. The hardware modules incorporate a powerful adaptive noiseless coder for 'standard form' data sources (i.e., sources whose symbols can be represented by uncorrelated nonnegative integers where the smaller integers are more likely than the larger ones). Performance close to data entropies can be expected over a 'dynamic range' of from 1.5 to 12-15 bits/sample (depending on the implementation). This is accomplished by adaptively choosing the best of many Huffman-equivalent codes to use on each block of 1-16 samples. Because of the extreme simplicity of these codes, no table lookups are actually required in an implementation, thus leading to the very high data rate capabilities already noted.
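The per-block code selection strategy can be illustrated with Golomb-Rice codes, the family of Huffman-equivalent codes for nonnegative integers commonly used in this style of coder. The sketch below is an assumed, simplified illustration of choosing the best code option per block by total length, not the actual module's algorithm:

```python
def rice_length(value, k):
    """Bit length of a nonnegative integer under Rice parameter k:
    unary-coded quotient (value >> k, plus a terminating bit) + k remainder bits."""
    return (value >> k) + 1 + k

def best_rice_code(block, k_max=15):
    """Pick the Rice parameter giving the shortest total length for a block,
    mimicking 'choose the best of many Huffman-equivalent codes per block'."""
    costs = {k: sum(rice_length(v, k) for v in block) for k in range(k_max + 1)}
    k = min(costs, key=costs.get)      # ties resolve to the smallest k
    return k, costs[k]

def rice_encode(value, k):
    """Emit the code as a bit string: unary quotient, then k-bit remainder."""
    q, r = value >> k, value & ((1 << k) - 1)
    bits = "1" * q + "0"               # unary part with '0' terminator
    if k:
        bits += format(r, f"0{k}b")    # fixed-width remainder, no table lookup
    return bits
```

For example, for the block `[3, 0, 7, 1, 12, 2]` the selector picks `k = 2` with a total of 22 bits. Since the winning `k` is just an index into a family of structured codes, encoding needs only shifts and masks, which is consistent with the abstract's point that no table lookups are required.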

  5. Direct collapse to supermassive black hole seeds: comparing the AMR and SPH approaches.

    PubMed

    Luo, Yang; Nagamine, Kentaro; Shlosman, Isaac

    2016-07-01

    We provide detailed comparison between the adaptive mesh refinement (AMR) code enzo-2.4 and the smoothed particle hydrodynamics (SPH)/ N -body code gadget-3 in the context of isolated or cosmological direct baryonic collapse within dark matter (DM) haloes to form supermassive black holes. Gas flow is examined by following evolution of basic parameters of accretion flows. Both codes show an overall agreement in the general features of the collapse; however, many subtle differences exist. For isolated models, the codes increase their spatial and mass resolutions at a different pace, which leads to substantially earlier collapse in SPH than in AMR cases due to higher gravitational resolution in gadget-3. In cosmological runs, the AMR develops a slightly higher baryonic resolution than SPH during halo growth via cold accretion permeated by mergers. Still, both codes agree in the build-up of DM and baryonic structures. However, with the onset of collapse, this difference in mass and spatial resolution is amplified, so evolution of SPH models begins to lag behind. Such a delay can have an effect on the formation/destruction rate of H2 due to UV background, and on basic properties of host haloes. Finally, isolated non-cosmological models in spinning haloes, with spin parameter λ ∼ 0.01-0.07, show delayed collapse for greater λ, but the pace of this increase is faster for AMR. Within our simulation set-up, gadget-3 requires significantly larger computational resources than enzo-2.4 during collapse, and needs similar resources during the pre-collapse, cosmological structure formation phase. Yet it benefits from substantially higher gravitational force and hydrodynamic resolutions, except at the end of collapse.

  6. Direct collapse to supermassive black hole seeds: comparing the AMR and SPH approaches

    NASA Astrophysics Data System (ADS)

    Luo, Yang; Nagamine, Kentaro; Shlosman, Isaac

    2016-07-01

    We provide detailed comparison between the adaptive mesh refinement (AMR) code ENZO-2.4 and the smoothed particle hydrodynamics (SPH)/N-body code GADGET-3 in the context of isolated or cosmological direct baryonic collapse within dark matter (DM) haloes to form supermassive black holes. Gas flow is examined by following evolution of basic parameters of accretion flows. Both codes show an overall agreement in the general features of the collapse; however, many subtle differences exist. For isolated models, the codes increase their spatial and mass resolutions at a different pace, which leads to substantially earlier collapse in SPH than in AMR cases due to higher gravitational resolution in GADGET-3. In cosmological runs, the AMR develops a slightly higher baryonic resolution than SPH during halo growth via cold accretion permeated by mergers. Still, both codes agree in the build-up of DM and baryonic structures. However, with the onset of collapse, this difference in mass and spatial resolution is amplified, so evolution of SPH models begins to lag behind. Such a delay can have an effect on the formation/destruction rate of H2 due to UV background, and on basic properties of host haloes. Finally, isolated non-cosmological models in spinning haloes, with spin parameter λ ˜ 0.01-0.07, show delayed collapse for greater λ, but the pace of this increase is faster for AMR. Within our simulation set-up, GADGET-3 requires significantly larger computational resources than ENZO-2.4 during collapse, and needs similar resources during the pre-collapse, cosmological structure formation phase. Yet it benefits from substantially higher gravitational force and hydrodynamic resolutions, except at the end of collapse.

  7. Semantic enrichment of medical forms - semi-automated coding of ODM-elements via web services.

    PubMed

    Breil, Bernhard; Watermann, Andreas; Haas, Peter; Dziuballe, Philipp; Dugas, Martin

    2012-01-01

    Semantic interoperability is an unsolved problem which occurs while working with medical forms from different information systems or institutions. Standards like ODM or CDA assure structural homogenization, but in order to compare elements from different data models it is necessary to use semantic concepts and codes at the item level of those structures. We developed and implemented a web-based tool which enables a domain expert to perform semi-automated coding of ODM files. For each item it is possible to query web services which return unique concept codes without leaving the context of the document. Although it was not feasible to perform a totally automated coding, we have implemented a dialog-based method to perform an efficient coding of all data elements in the context of the whole document. The proportion of codable items was comparable to results from previous studies.

  8. One Speaker, Two Languages. Cross-Disciplinary Perspectives on Code-Switching.

    ERIC Educational Resources Information Center

    Milroy, Lesley, Ed.; Muysken, Pieter, Ed.

    Fifteen articles review code-switching in the four major areas: policy implications in specific institutional and community settings; perspectives of social theory of code-switching as a form of speech behavior in particular social contexts; the grammatical analysis of code-switching, including factors that constrain switching even within a…

  9. Regional vertical total electron content (VTEC) modeling together with satellite and receiver differential code biases (DCBs) using semi-parametric multivariate adaptive regression B-splines (SP-BMARS)

    NASA Astrophysics Data System (ADS)

    Durmaz, Murat; Karslioglu, Mahmut Onur

    2015-04-01

    There are various global and regional methods that have been proposed for the modeling of ionospheric vertical total electron content (VTEC). Global distribution of VTEC is usually modeled by spherical harmonic expansions, while tensor products of compactly supported univariate B-splines can be used for regional modeling. In these empirical parametric models, the coefficients of the basis functions as well as differential code biases (DCBs) of satellites and receivers can be treated as unknown parameters which can be estimated from geometry-free linear combinations of global positioning system observables. In this work we propose a new semi-parametric multivariate adaptive regression B-splines (SP-BMARS) method for the regional modeling of VTEC together with satellite and receiver DCBs, where the parametric part of the model is related to the DCBs as fixed parameters and the non-parametric part adaptively models the spatio-temporal distribution of VTEC. The latter is based on multivariate adaptive regression B-splines which is a non-parametric modeling technique making use of compactly supported B-spline basis functions that are generated from the observations automatically. This algorithm takes advantage of an adaptive scale-by-scale model building strategy that searches for best-fitting B-splines to the data at each scale. The VTEC maps generated from the proposed method are compared numerically and visually with the global ionosphere maps (GIMs) which are provided by the Center for Orbit Determination in Europe (CODE). The VTEC values from SP-BMARS and CODE GIMs are also compared with VTEC values obtained through calibration using local ionospheric model. The estimated satellite and receiver DCBs from the SP-BMARS model are compared with the CODE distributed DCBs. The results show that the SP-BMARS algorithm can be used to estimate satellite and receiver DCBs while adaptively and flexibly modeling the daily regional VTEC.

  10. Layer-oriented simulation tool.

    PubMed

    Arcidiacono, Carmelo; Diolaiti, Emiliano; Tordi, Massimiliano; Ragazzoni, Roberto; Farinato, Jacopo; Vernet, Elise; Marchetti, Enrico

    2004-08-01

    The Layer-Oriented Simulation Tool (LOST) is a numerical simulation code developed for analysis of the performance of multiconjugate adaptive optics modules following a layer-oriented approach. The LOST code computes the atmospheric layers in terms of phase screens and then propagates the phase delays introduced in the natural guide stars' wave fronts by using geometrical optics approximations. These wave fronts are combined in an optical or numerical way, including the effects of wave-front sensors on measurements in terms of phase noise. The LOST code is described, and two applications to layer-oriented modules are briefly presented. We focus on the Multiconjugate adaptive optics demonstrator to be mounted upon the Very Large Telescope and on the Near-IR-Visible Adaptive Interferometer for Astronomy (NIRVANA) interferometric system to be installed on the combined focus of the Large Binocular Telescope.

  11. Throughput Optimization Via Adaptive MIMO Communications

    DTIC Science & Technology

    2006-05-30

    End-to-end matlab packet simulation platform. * Low density parity check code (LDPCC). * Field trials with Silvus DSP MIMO testbed. * High mobility...incorporate advanced LDPC (low density parity check) codes. Realizing that the power of LDPC codes comes at the price of decoder complexity, we also...Channel Coding: Binary Convolutional Code or LDPC; Packet Length: 0 - 2^16-1 bytes; Coding Rate: 1/2, 2/3, 3/4, 5/6; MIMO Channel Training Length: 0 - 4 symbols

  12. "We would never forget who we are": resettlement, cultural negotiation, and family relationships among Somali Bantu refugees.

    PubMed

    Frounfelker, Rochelle L; Assefa, Mehret T; Smith, Emily; Hussein, Aweis; Betancourt, Theresa S

    2017-11-01

    Somali refugees are resettling in large numbers in the US, but little is known about the Somali Bantu, an ethnic minority within this population. Refugee youth mental health is linked to the functioning of the larger family unit. Understanding how the process of culturally adjusting to life after resettlement relates to family functioning can help identify what kind of interventions might strengthen families and lead to better mental health outcomes for youth. This paper seeks to address the following research questions: (1) How do different groups of Somali Bantu refugees describe their experiences of culturally adapting to life in the US?; and (2) How, if at all, do processes of cultural adaptation in a new country affect Somali Bantu family functioning? We conducted 14 focus groups with a total of 81 Somali Bantu refugees in New England. Authors analyzed focus groups using principles of thematic analysis to develop codes and an overarching theoretical model about the relationship between cultural adaptation, parent-child relationships, and family functioning. Views and expectations of parent-child relationships were compared between Somali Bantu youth and adults. Cultural negotiation was dependent upon broader sociocultural contexts in the United States that were most salient to the experience of the individual. Adult and youth participants had conflicting views around negotiating Somali Bantu culture, which often led to strained parent-child relationships. In contrast, youth sibling relationships were strengthened, as they turned to each other for support in navigating the process of cultural adaptation.

  13. The adaptive significance of adult neurogenesis: an integrative approach

    PubMed Central

    Konefal, Sarah; Elliot, Mick; Crespi, Bernard

    2013-01-01

    Adult neurogenesis in mammals is predominantly restricted to two brain regions, the dentate gyrus (DG) of the hippocampus and the olfactory bulb (OB), suggesting that these two brain regions uniquely share functions that mediate its adaptive significance. Benefits of adult neurogenesis across these two regions appear to converge on increased neuronal and structural plasticity that subserves coding of novel, complex, and fine-grained information, usually with contextual components that include spatial positioning. By contrast, costs of adult neurogenesis appear to center on potential for dysregulation resulting in higher risk of brain cancer or psychological dysfunctions, but such costs have yet to be quantified directly. The three main hypotheses for the proximate functions and adaptive significance of adult neurogenesis, pattern separation, memory consolidation, and olfactory spatial, are not mutually exclusive and can be reconciled into a simple general model amenable to targeted experimental and comparative tests. Comparative analysis of brain region sizes across two major social-ecological groups of primates, gregarious (mainly diurnal haplorhines, visually-oriented, and in large social groups) and solitary (mainly nocturnal, territorial, and highly reliant on olfaction, as in most rodents) suggest that solitary species, but not gregarious species, show positive associations of population densities and home range sizes with sizes of both the hippocampus and OB, implicating their functions in social-territorial systems mediated by olfactory cues. Integrated analyses of the adaptive significance of adult neurogenesis will benefit from experimental studies motivated and structured by ecologically and socially relevant selective contexts. PMID:23882188

  14. Developing a method for specifying the components of behavior change interventions in practice: the example of smoking cessation.

    PubMed

    Lorencatto, Fabiana; West, Robert; Seymour, Natalie; Michie, Susan

    2013-06-01

    There is a difference between interventions as planned and as delivered in practice. Unless we know what was actually delivered, we cannot understand "what worked" in effective interventions. This study aimed to (a) assess whether an established taxonomy of 53 smoking cessation behavior change techniques (BCTs) may be applied or adapted as a method for reliably specifying the content of smoking cessation behavioral support consultations and (b) develop an effective method for training researchers and practitioners in the reliable application of the taxonomy. Fifteen transcripts of audio-recorded consultations delivered by England's Stop Smoking Services were coded into component BCTs using the taxonomy. Interrater reliability and potential adaptations to the taxonomy to improve coding were discussed following 3 coding waves. A coding training manual was developed through expert consensus and piloted on 10 trainees, assessing coding reliability and self-perceived competence before and after training. An average of 33 BCTs from the taxonomy were identified at least once across sessions and coding waves. Consultations contained on average 12 BCTs (range = 8-31). Average interrater reliability was high (88% agreement). The taxonomy was adapted to simplify coding by merging co-occurring BCTs and refining BCT definitions. Coding reliability and self-perceived competence significantly improved posttraining for all trainees. It is possible to apply a taxonomy to reliably identify and classify BCTs in smoking cessation behavioral support delivered in practice, and train inexperienced coders to do so reliably. This method can be used to investigate variability in provision of behavioral support across services, monitor fidelity of delivery, and identify training needs.

  15. Using an innovative mixed method methodology to investigate the appropriateness of a quantitative instrument in an African context: Antiretroviral treatment and quality of life.

    PubMed

    Greeff, Minrie; Chepuka, Lignet M; Chilemba, Winnie; Chimwaza, Angela F; Kululanga, Lucy I; Kgositau, Mabedi; Manyedi, Eva; Shaibu, Sheila; Wright, Susan C D

    2014-01-01

    The relationship between quality of life (QoL) and antiretroviral treatment (ART) has mainly been studied using quantitative scales often not appropriate for use in other contexts and without taking peoples' lived experiences into consideration. Sub-Saharan Africa has the highest incidence of HIV and AIDS, yet there is a paucity of research on QoL. This research report is intended to give an account of the use of a mixed method convergent parallel design as a novel approach to evaluate an instrument's context specificity, appropriateness and usefulness in a context other than the one for which it was designed. Data were collected through a qualitative exploration of the experiences of QoL of people living with HIV or AIDS (PLHA) in Africa since being on ART, as well as the quantitative measurements obtained from the HIV/AIDS-targeted quality of life (HAT-QoL) instrument. This study was conducted in three African countries. Permission and ethical approval to conduct the study were obtained. Purposive voluntary sampling was used to recruit PLHA through mediators working in community-based HIV/AIDS organisations and health clinics. Interviews were analysed through open coding and the quantitative data through descriptive statistics and the Cronbach's alpha coefficient. A much wider range and richness of experiences were expressed than measured by the HAT-QoL instrument. Although an effective instrument for use in the USA, it was found not to be sensitive, appropriate and useful in an African context in its present form. The recommendations focus on adapting the instrument using the data from the in-depth interviews or developing a context-sensitive instrument that could measure QoL of PLHA in Africa.

  16. Optimum Boundaries of Signal-to-Noise Ratio for Adaptive Code Modulations

    DTIC Science & Technology

    2017-11-14

    1510–1521, Feb. 2015. [2]. Pursley, M. B. and Royster, T. C., "Adaptive-rate nonbinary LDPC coding for frequency-hop communications," IEEE...and this can cause a very narrowband noise near the center frequency during USRP signal acquisition and generation. This can cause a high BER...Final Report APPROVED FOR PUBLIC RELEASE; DISTRIBUTION IS UNLIMITED. AIR FORCE RESEARCH LABORATORY Space Vehicles Directorate 3550 Aberdeen Ave

  17. A multiblock/multizone code (PAB 3D-v2) for the three-dimensional Navier-Stokes equations: Preliminary applications

    NASA Technical Reports Server (NTRS)

    Abdol-Hamid, Khaled S.

    1990-01-01

    The development and applications of multiblock/multizone and adaptive grid methodologies for solving the three-dimensional simplified Navier-Stokes equations are described. Adaptive grid and multiblock/multizone approaches are introduced and applied to external and internal flow problems. These new implementations increase the capabilities and flexibility of the PAB3D code in solving flow problems associated with complex geometry.

  18. Categorizing the social context of the wildland urban interface: Adaptive capacity for wildfire and community "archetypes"

    Treesearch

    Tavis B. Paveglio; Cassandra Moseley; Matthew S. Carroll; Daniel R. Williams; Emily Jane Davis; A. Paige Fischer

    2015-01-01

    Understanding the local context that shapes collective response to wildfire risk continues to be a challenge for scientists and policymakers. This study utilizes and expands on a conceptual approach for understanding adaptive capacity to wildfire in a comparison of 18 past case studies. The intent is to determine whether comparison of local social context and community...

  19. Nevada Administrative Code for Special Education Programs.

    ERIC Educational Resources Information Center

    Nevada State Dept. of Education, Carson City. Special Education Branch.

    This document presents excerpts from Chapter 388 of the Nevada Administrative Code, which concerns definitions, eligibility, and programs for students who are disabled or gifted/talented. The first section gathers together 36 relevant definitions from the Code for such concepts as "adaptive behavior," "autism," "gifted and…

  20. CACTI: free, open-source software for the sequential coding of behavioral interactions.

    PubMed

    Glynn, Lisa H; Hallgren, Kevin A; Houck, Jon M; Moyers, Theresa B

    2012-01-01

    The sequential analysis of client and clinician speech in psychotherapy sessions can help to identify and characterize potential mechanisms of treatment and behavior change. Previous studies required coding systems that were time-consuming, expensive, and error-prone. Existing software can be expensive and inflexible, and furthermore, no single package allows for pre-parsing, sequential coding, and assignment of global ratings. We developed a free, open-source, and adaptable program to meet these needs: The CASAA Application for Coding Treatment Interactions (CACTI). Without transcripts, CACTI facilitates the real-time sequential coding of behavioral interactions using WAV-format audio files. Most elements of the interface are user-modifiable through a simple XML file, and can be further adapted using Java through the terms of the GNU Public License. Coding with this software yields interrater reliabilities comparable to previous methods, but at greatly reduced time and expense. CACTI is a flexible research tool that can simplify psychotherapy process research, and has the potential to contribute to the improvement of treatment content and delivery.

  1. Critical roles for a genetic code alteration in the evolution of the genus Candida.

    PubMed

    Silva, Raquel M; Paredes, João A; Moura, Gabriela R; Manadas, Bruno; Lima-Costa, Tatiana; Rocha, Rita; Miranda, Isabel; Gomes, Ana C; Koerkamp, Marian J G; Perrot, Michel; Holstege, Frank C P; Boucherie, Hélian; Santos, Manuel A S

    2007-10-31

    During the last 30 years, several alterations to the standard genetic code have been discovered in various bacterial and eukaryotic species. Sense and nonsense codons have been reassigned or reprogrammed to expand the genetic code to selenocysteine and pyrrolysine. These discoveries highlight unexpected flexibility in the genetic code, but do not elucidate how the organisms survived the proteome chaos generated by codon identity redefinition. In order to shed new light on this question, we have reconstructed a Candida genetic code alteration in Saccharomyces cerevisiae and used a combination of DNA microarrays, proteomics and genetics approaches to evaluate its impact on gene expression, adaptation and sexual reproduction. This genetic manipulation blocked mating, locked yeast in a diploid state, remodelled gene expression and created stress cross-protection that generated adaptive advantages under environmental challenging conditions. This study highlights unanticipated roles for codon identity redefinition during the evolution of the genus Candida, and strongly suggests that genetic code alterations create genetic barriers that speed up speciation.

  2. Adaptive Precoded MIMO for LTE Wireless Communication

    NASA Astrophysics Data System (ADS)

    Nabilla, A. F.; Tiong, T. C.

    2015-04-01

    Long-Term Evolution (LTE) and Long Term Evolution-Advanced (LTE-A) have provided a major step forward in mobile communication capability. The objectives to be achieved are high peak data rates in high spectrum bandwidth and high spectral efficiencies. Technically, pre-coding means that multiple data streams are emitted from the transmit antennas with independent and appropriate weightings such that the link throughput is maximized at the receiver output, thus increasing or equalizing the received signal-to-interference-plus-noise ratio (SINR) across the multiple receiver terminals. However, fixed pre-coding cannot reliably exploit the achievable information transfer rate as channel conditions vary with bandwidth. Thus, adaptive pre-coding is proposed. It applies pre-coding matrix indicator (PMI) channel-state feedback, making it possible to change the pre-coding codebook accordingly and thereby achieve a higher data rate than fixed pre-coding.
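
As an editorial illustration of the PMI mechanism described above, the sketch below selects, from a small codebook, the precoding vector that maximizes received signal power over a known channel. The channel matrix, codebook entries, and function names are invented for illustration and are not the LTE-standard codebook.

```python
# Hypothetical sketch of PMI-style codebook selection: for each candidate
# precoding vector, estimate the received signal power through a known
# channel and report the index (the "PMI") of the best one.

def received_power(H, w):
    """|H w|^2 for a channel matrix H (list of rows) and precoder w."""
    total = 0.0
    for row in H:
        y = sum(h * x for h, x in zip(row, w))
        total += abs(y) ** 2
    return total

def select_pmi(H, codebook):
    """Return (best_index, best_power) over the candidate precoders."""
    powers = [received_power(H, w) for w in codebook]
    best = max(range(len(codebook)), key=lambda i: powers[i])
    return best, powers[best]

# Toy 2x2 channel and a small unit-norm codebook (illustrative values).
H = [[1.0, 0.2], [0.1, 0.8]]
s = 2 ** -0.5
codebook = [[s, s], [s, -s], [s, s * 1j], [s, s * -1j]]

pmi, power = select_pmi(H, codebook)
```

In a real system the receiver would compute this index from its channel estimate and feed it back to the transmitter, which then switches codebook entries accordingly.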

  3. Norm-based coding of facial identity in adults with autism spectrum disorder.

    PubMed

    Walsh, Jennifer A; Maurer, Daphne; Vida, Mark D; Rhodes, Gillian; Jeffery, Linda; Rutherford, M D

    2015-03-01

    It is unclear whether reported deficits in face processing in individuals with autism spectrum disorders (ASD) can be explained by deficits in perceptual face coding mechanisms. In the current study, we examined whether adults with ASD showed evidence of norm-based opponent coding of facial identity, a perceptual process underlying the recognition of facial identity in typical adults. We began with an original face and an averaged face and then created an anti-face that differed from the averaged face in the opposite direction from the original face by a small amount (near adaptor) or a large amount (far adaptor). To test for norm-based coding, we adapted participants on different trials to the near versus far adaptor, then asked them to judge the identity of the averaged face. We varied the size of the test and adapting faces in order to reduce any contribution of low-level adaptation. Consistent with the predictions of norm-based coding, high-functioning adults with ASD (n = 27) and matched typical participants (n = 28) showed identity aftereffects that were larger for the far than near adaptor. Unlike results with children with ASD, the strength of the aftereffects was similar in the two groups. This is the first study to demonstrate norm-based coding of facial identity in adults with ASD. Copyright © 2015 Elsevier Ltd. All rights reserved.
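
The anti-face construction described above can be sketched numerically: an anti-face is a reflection of a face vector through the average face, with the reflection strength giving a "near" or "far" adaptor. The three-dimensional "face space" and the values below are invented for illustration.

```python
def make_anti_face(face, average, strength):
    """Reflect a face vector through the average face.

    A small strength gives a 'near' adaptor (close to the norm),
    a larger strength a 'far' adaptor, per the near/far manipulation
    described in the abstract.
    """
    return [a - strength * (f - a) for f, a in zip(face, average)]

average = [0.5, 0.5, 0.5]    # hypothetical face-space norm
original = [0.9, 0.3, 0.6]   # hypothetical identity vector

near_adaptor = make_anti_face(original, average, 0.4)
far_adaptor = make_anti_face(original, average, 0.8)
```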

  4. ATHENA 3D: A finite element code for ultrasonic wave propagation

    NASA Astrophysics Data System (ADS)

    Rose, C.; Rupin, F.; Fouquet, T.; Chassignole, B.

    2014-04-01

    The understanding of wave propagation phenomena requires robust numerical models. 3D finite element (FE) models are generally prohibitively time-consuming; however, advances in processor speed and memory allow them to be more and more competitive. In this context, EDF R&D developed the 3D version of the well-validated FE code ATHENA2D. The code is dedicated to the simulation of wave propagation in all kinds of elastic media and, in particular, heterogeneous and anisotropic materials such as welds. It is based on solving the elastodynamic equations in the calculation zone expressed in terms of stress and particle velocities. A particular feature of the code is that the discretization of the calculation domain uses a regular Cartesian 3D mesh, while a defect of complex geometry can be described on a separate (2D) mesh via the fictitious domains method. This combines the speed of regular-mesh computation with the capability of modelling arbitrarily shaped defects. Furthermore, the calculation domain is discretized with a quasi-explicit time evolution scheme, so only small local linear systems have to be solved. The final reduction in computation time comes from the fact that ATHENA3D has been parallelized and adapted to the use of HPC resources. In this paper, the validation of the 3D FE model is discussed. A cross-validation of ATHENA 3D and CIVA is proposed for several inspection configurations. The performances in terms of calculation time are also presented for both local computer and computation cluster use.

  5. Adaptation in Coding by Large Populations of Neurons in the Retina

    NASA Astrophysics Data System (ADS)

    Ioffe, Mark L.

    A comprehensive theory of neural computation requires an understanding of the statistical properties of the neural population code. The focus of this work is the experimental study and theoretical analysis of the statistical properties of neural activity in the tiger salamander retina. This is an accessible yet complex system, for which we control the visual input and record from a substantial portion (greater than half) of the ganglion cell population generating the spiking output. Our experiments probe adaptation of the retina to visual statistics: a central feature of sensory systems, which have to adjust their limited dynamic range to a far larger space of possible inputs. In Chapter 1 we place our work in context with a brief overview of the relevant background. In Chapter 2 we describe the experimental methodology of recording from 100+ ganglion cells in the tiger salamander retina. In Chapter 3 we first present the measurements of adaptation of individual cells to changes in stimulation statistics and then investigate whether pairwise correlations in fluctuations of ganglion cell activity change across different stimulation conditions. We then transition to a study of the population-level probability distribution of the retinal response captured with maximum-entropy models. Convergence of the model inference is presented in Chapter 4. In Chapter 5 we first test the empirical presence of a phase transition in such models fitting the retinal response to different experimental conditions, and then proceed to develop other characterizations which are sensitive to complexity in the interaction matrix. This includes an analysis of the dynamics of sampling at finite temperature, which demonstrates a range of subtle attractor-like properties in the energy landscape. These are largely conserved when ambient illumination is varied 1000-fold, a result not necessarily apparent from the measured low-order statistics of the distribution. 
Our results form a consistent picture which is discussed at the end of Chapter 5. We conclude with a few future directions related to this thesis.

  6. FPGA implementation of advanced FEC schemes for intelligent aggregation networks

    NASA Astrophysics Data System (ADS)

    Zou, Ding; Djordjevic, Ivan B.

    2016-02-01

    In state-of-the-art fiber-optic communication systems, fixed forward error correction (FEC) and a fixed constellation size are employed. While it is important to closely approach the Shannon limit by using turbo product codes (TPC) and low-density parity-check (LDPC) codes with soft-decision decoding (SDD), rate-adaptive techniques, which enable increased information rates over short links and reliable transmission over long links, are likely to become more important with ever-increasing network traffic demands. In this invited paper, we describe a rate-adaptive non-binary LDPC coding technique and demonstrate, by FPGA-based emulation, its flexibility and good performance, exhibiting no error floor at BERs down to 10^-15 over the entire code-rate range, making it a viable solution for next-generation high-speed intelligent aggregation networks.
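
The rate-adaptive idea can be illustrated with a minimal sketch: choose the highest code rate whose required SNR the measured link SNR still clears. The rate/threshold table below is an assumption for illustration, not data from the paper; a real system would derive it from decoder performance curves.

```python
# Hedged sketch of FEC rate adaptation: short, clean links get a high
# code rate (less parity overhead), long or noisy links a low one.

RATE_TABLE = [  # (code rate, minimum SNR in dB) -- illustrative values
    (0.50, 1.0),
    (0.67, 2.5),
    (0.75, 3.5),
    (0.83, 4.5),
    (0.90, 6.0),
]

def select_rate(snr_db):
    """Return the highest code rate supported at this SNR (None if below all)."""
    feasible = [rate for rate, threshold in RATE_TABLE if snr_db >= threshold]
    return max(feasible) if feasible else None
```

For example, a link measured at 4.0 dB would be assigned rate 0.75 under this table, while a link below 1.0 dB would be declared unusable.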

  7. Collaborative Software Development in Support of Fast Adaptive AeroSpace Tools (FAAST)

    NASA Technical Reports Server (NTRS)

    Kleb, William L.; Nielsen, Eric J.; Gnoffo, Peter A.; Park, Michael A.; Wood, William A.

    2003-01-01

    A collaborative software development approach is described. The software product is an adaptation of proven computational capabilities combined with new capabilities to form the Agency's next generation aerothermodynamic and aerodynamic analysis and design tools. To efficiently produce a cohesive, robust, and extensible software suite, the approach uses agile software development techniques; specifically, project retrospectives, the Scrum status meeting format, and a subset of Extreme Programming's coding practices are employed. Examples are provided which demonstrate the substantial benefits derived from employing these practices. Also included is a discussion of issues encountered when porting legacy Fortran 77 code to Fortran 95 and a Fortran 95 coding standard.

  8. Context aware adaptive security service model

    NASA Astrophysics Data System (ADS)

    Tunia, Marcin A.

    2015-09-01

    Present systems and devices are usually protected against various threats to digital data processing. The protection mechanisms consume resources, which are either highly limited or intensively utilized by many entities, so optimizing their usage is advantageous. Resources saved through optimization may be utilized by other mechanisms or may last longer. It is usually assumed that protection has to provide a specific quality and attack resistance. By interpreting the context of business services (the users and the services themselves), it is possible to adapt security service parameters to counter the threats associated with the current situation. This approach optimizes resource usage while maintaining a sufficient security level. This paper presents the architecture of an adaptive security service that is context-aware and takes the quality of context data into account.

  9. Current and anticipated uses of thermal-hydraulic codes in NFI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsuda, K.; Takayasu, M.

    1997-07-01

    This paper presents the thermal-hydraulic codes currently used in NFI for LWR fuel development and licensing applications, including transient and design-basis-accident analyses of LWR plants. The current status of the codes is described in terms of code capability, modeling features, and experience with code application in fuel development and licensing. Finally, the anticipated use of future thermal-hydraulic codes in NFI is briefly discussed.

  10. Context-aware adaptive spelling in motor imagery BCI

    NASA Astrophysics Data System (ADS)

    Perdikis, S.; Leeb, R.; Millán, J. d. R.

    2016-06-01

    Objective. This work presents a first motor imagery-based, adaptive brain-computer interface (BCI) speller, which is able to exploit application-derived context for improved, simultaneous classifier adaptation and spelling. Online spelling experiments with ten able-bodied users evaluate the ability of our scheme, first, to alleviate non-stationarity of brain signals for restoring the subject’s performances, second, to guide naive users into BCI control avoiding initial offline BCI calibration and, third, to outperform regular unsupervised adaptation. Approach. Our co-adaptive framework combines the BrainTree speller with smooth-batch linear discriminant analysis adaptation. The latter enjoys contextual assistance through BrainTree’s language model to improve online expectation-maximization maximum-likelihood estimation. Main results. Our results verify the possibility to restore single-sample classification and BCI command accuracy, as well as spelling speed for expert users. Most importantly, context-aware adaptation performs significantly better than its unsupervised equivalent and similar to the supervised one. Although no significant differences are found with respect to the state-of-the-art PMean approach, the proposed algorithm is shown to be advantageous for 30% of the users. Significance. We demonstrate the possibility to circumvent supervised BCI recalibration, saving time without compromising the adaptation quality. On the other hand, we show that this type of classifier adaptation is not as efficient for BCI training purposes.
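
The smooth-batch adaptation mentioned above can be sketched, in simplified form, as blending a running class mean toward each new batch mean. The smoothing factor and feature values below are assumptions for illustration; the actual method also adapts covariance terms and uses the language model's contextual probabilities to weight samples.

```python
# Simplified sketch of smooth-batch mean adaptation for an LDA classifier:
# after each batch of samples assigned to a class, pull the running class
# mean toward the batch mean by a smoothing factor alpha.

def smooth_batch_update(running_mean, batch, alpha=0.3):
    """Blend running_mean toward the mean of `batch` (list of feature lists)."""
    n = len(batch)
    batch_mean = [sum(x[i] for x in batch) / n for i in range(len(running_mean))]
    return [(1 - alpha) * m + alpha * b for m, b in zip(running_mean, batch_mean)]

mu = [0.0, 0.0]                                        # stale class mean
mu = smooth_batch_update(mu, [[1.0, 2.0], [3.0, 2.0]]) # batch mean = [2.0, 2.0]
```

Repeated over sessions, such an update tracks slow non-stationarity of the brain signals without discarding the previously calibrated model.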

  11. Context-aware adaptive spelling in motor imagery BCI.

    PubMed

    Perdikis, S; Leeb, R; Millán, J D R

    2016-06-01

    This work presents a first motor imagery-based, adaptive brain-computer interface (BCI) speller, which is able to exploit application-derived context for improved, simultaneous classifier adaptation and spelling. Online spelling experiments with ten able-bodied users evaluate the ability of our scheme, first, to alleviate non-stationarity of brain signals for restoring the subject's performances, second, to guide naive users into BCI control avoiding initial offline BCI calibration and, third, to outperform regular unsupervised adaptation. Our co-adaptive framework combines the BrainTree speller with smooth-batch linear discriminant analysis adaptation. The latter enjoys contextual assistance through BrainTree's language model to improve online expectation-maximization maximum-likelihood estimation. Our results verify the possibility to restore single-sample classification and BCI command accuracy, as well as spelling speed for expert users. Most importantly, context-aware adaptation performs significantly better than its unsupervised equivalent and similar to the supervised one. Although no significant differences are found with respect to the state-of-the-art PMean approach, the proposed algorithm is shown to be advantageous for 30% of the users. We demonstrate the possibility to circumvent supervised BCI recalibration, saving time without compromising the adaptation quality. On the other hand, we show that this type of classifier adaptation is not as efficient for BCI training purposes.

  12. Do perceived context pictures automatically activate their phonological code?

    PubMed

    Jescheniak, Jörg D; Oppermann, Frank; Hantsch, Ansgar; Wagner, Valentin; Mädebach, Andreas; Schriefers, Herbert

    2009-01-01

    Morsella and Miozzo (Morsella, E., & Miozzo, M. (2002). Evidence for a cascade model of lexical access in speech production. Journal of Experimental Psychology: Learning, Memory, and Cognition, 28, 555-563) reported that to-be-ignored context pictures become phonologically activated when participants name a target picture, and took this finding as support for cascaded models of lexical retrieval in speech production. In a replication and extension of their experiment in German, we failed to obtain priming effects from context pictures phonologically related to a to-be-named target picture. By contrast, corresponding context words (i.e., the names of the respective pictures) and the same context pictures, when used in an identity condition, did reliably facilitate the naming process. This pattern calls into question the generality of the claim advanced by Morsella and Miozzo that perceptual processing of pictures in the context of a naming task automatically leads to the activation of corresponding lexical-phonological codes.

  13. Visual Coding of Human Bodies: Perceptual Aftereffects Reveal Norm-Based, Opponent Coding of Body Identity

    ERIC Educational Resources Information Center

    Rhodes, Gillian; Jeffery, Linda; Boeing, Alexandra; Calder, Andrew J.

    2013-01-01

    Despite the discovery of body-selective neural areas in occipitotemporal cortex, little is known about how bodies are visually coded. We used perceptual adaptation to determine how body identity is coded. Brief exposure to a body (e.g., anti-Rose) biased perception toward an identity with opposite properties (Rose). Moreover, the size of this…

  14. Implementation of context independent code on a new array processor: The Super-65

    NASA Technical Reports Server (NTRS)

    Colbert, R. O.; Bowhill, S. A.

    1981-01-01

    The feasibility of rewriting standard uniprocessor programs into code which contains no context-dependent branches is explored. Context independent code (CIC) would contain no branches that might require different processing elements to branch different ways. In order to investigate the possibilities and restrictions of CIC, several programs were recoded into CIC and a four-element array processor was built. This processor (the Super-65) consisted of three 6502 microprocessors and the Apple II microcomputer. The results obtained were somewhat dependent upon the specific architecture of the Super-65 but within bounds, the throughput of the array processor was found to increase linearly with the number of processing elements (PEs). The slope of throughput versus PEs is highly dependent on the program and varied from 0.33 to 1.00 for the sample programs.
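
The recoding idea above can be illustrated by predication: instead of a context-dependent branch, every processing element executes both arms of the conditional in the same instruction stream, and a 0/1 mask selects the result. The clipping example below is invented for illustration.

```python
# Illustrative recoding of a context-dependent branch into context
# independent form.  In the branchy version, different array elements
# would take different paths; in the CIC version every element executes
# the same operations and a 0/1 mask selects the outcome.

def clip_branchy(values, limit):
    out = []
    for v in values:
        if v > limit:          # context-dependent branch
            out.append(limit)
        else:
            out.append(v)
    return out

def clip_cic(values, limit):
    out = []
    for v in values:
        mask = int(v > limit)  # 1 if over the limit, else 0
        out.append(mask * limit + (1 - mask) * v)  # both arms, one stream
    return out
```

On a lockstep array processor, the second form lets all processing elements proceed together regardless of their data, at the cost of computing both arms everywhere.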

  15. Cultural adaptation of an evidence-based nursing intervention to improve medication adherence among people living with HIV/AIDS (PLWHA) in China.

    PubMed

    Williams, Ann B; Wang, Honghong; Burgess, Jane; Li, Xianhong; Danvers, Karina

    2013-04-01

    Adapting nursing interventions to suit the needs and culture of a new population (cultural adaptation) is an important early step in the process of implementation and dissemination. While the need for cultural adaptation is widely accepted, research-based strategies for doing so are not well articulated. Non-adherence to medications for chronic disease is a global problem and cultural adaptation of existing evidence-based interventions could be useful. This paper aims to describe the cultural adaptation of an evidence-based nursing intervention to improve medication adherence among people living with HIV/AIDS and to offer recommendations for adaptation of interventions across cultures and borders. SITE: The intervention, which demonstrated efficacy in a randomized controlled trial in North America, was adapted for the cultural and social context of Hunan Province, in south central China. The adaptation process was undertaken by intervention stakeholders including the original intervention study team, the proposed adaptation team, and members of a Community Advisory Board, including people living with HIV/AIDS, family members, and health care workers at the target clinical sites. The adaptation process was driven by quantitative and qualitative data describing the new population and context and was guided by principles for cultural adaptation drawn from prevention science research. The primary adaptation to the intervention was the inclusion of family members in intervention activities, in response to the cultural and social importance of the family in rural China. In a pilot test of the adapted intervention, self-reported medication adherence improved significantly in the group receiving the intervention compared to the control group (p=0.01). 
Recommendations for cultural adaptation of nursing interventions include (1) involve stakeholders from the beginning; (2) assess the population, need, and context; (3) evaluate the intervention to be adapted with attention to details of the original studies that demonstrated efficacy; (4) compare important elements of the original intervention with those of the proposed new population and context to identify primary points for adaptation; (5) explicitly identify sources of tension between intervention fidelity and cultural adaptive needs; (6) document the process of adaptation, pilot the adapted intervention, and evaluate its effectiveness before moving to dissemination and implementation on a large scale. Copyright © 2012 Elsevier Ltd. All rights reserved.

  16. Using conversation analytic methods to assess fidelity to a talk-based healthcare intervention for frequently attending patients.

    PubMed

    Barnes, Rebecca K; Jepson, Marcus; Thomas, Clare; Jackson, Sue; Metcalfe, Chris; Kessler, David; Cramer, Helen

    2018-06-01

    The study aim was to assess implementation fidelity (i.e., adherence) to a talk-based primary care intervention using Conversation Analytic (CA) methods. The context was a UK feasibility trial where General Practitioners (GPs) were trained to use "BATHE" (Background, Affect, Trouble, Handling, Empathy) - a technique to screen for psychosocial issues during consultations - with frequently attending patients. 35 GPs received BATHE training between July-October 2015. 15 GPs across six practices self-selected to record a sample of their consultations with study patients at three and six months. 31 consultations were recorded. 21/26 patients in four intervention practices gave permission for analysis. The recordings were transcribed and initially coded for the presence or absence of the five BATHE components. CA methods were applied to assess delivery, focusing on position and composition of each component, and patients' responses. Initial coding showed most of the BATHE components to be present in most contacts. However, the CA analysis revealed unplanned deviations in position and adaptations in composition. Frequently the intervention was initiated too early in the consultation, and the BATHE questions were misunderstood by patients as pertaining to their presenting problems rather than the psychosocial context for their problems. Often these deviations reduced the theoretical fidelity of the intervention as a whole. A CA approach enabled a dynamic assessment of the delivery and receipt of BATHE in situ, revealing common pitfalls in delivery, and provided valuable examples of more and less efficacious implementations. During the trial this evidence was used in top-up trainings to address problems in delivery and to improve GP engagement. Using CA methods enabled a more accurate assessment of implementation fidelity, a fuller description of the intervention itself, and enhanced resources for future training. 
When positioned appropriately, BATHE can be a useful tool for eliciting information about the wider context of the medical visit. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. AFRESh: an adaptive framework for compression of reads and assembled sequences with random access functionality.

    PubMed

    Paridaens, Tom; Van Wallendael, Glenn; De Neve, Wesley; Lambert, Peter

    2017-05-15

    The past decade has seen the introduction of new technologies that have progressively lowered the cost of genomic sequencing. We can even observe that the cost of sequencing is dropping significantly faster than the cost of storage and transmission. The latter motivates a need for continuous improvements in the area of genomic data compression, not only at the level of effectiveness (compression rate), but also at the level of functionality (e.g. random access), configurability (effectiveness versus complexity, coding tool set …) and versatility (support for both sequenced reads and assembled sequences). In that regard, we can point out that current approaches mostly do not support random access, requiring full files to be transmitted, and are restricted to either read or sequence compression. We propose AFRESh, an adaptive framework for no-reference compression of genomic data with random-access functionality, targeting the effective representation of the raw genomic symbol streams of both reads and assembled sequences. AFRESh makes use of a configurable set of prediction and encoding tools, extended by a Context-Adaptive Binary Arithmetic Coding (CABAC) scheme, to compress raw genetic codes. To the best of our knowledge, our paper is the first to describe an effective implementation of CABAC outside of its original application. By applying CABAC, the compression effectiveness improves by up to 19% for assembled sequences and up to 62% for reads. By applying AFRESh to the genomic symbols of the MPEG genomic compression test set for reads, a compression gain is achieved of up to 51% compared to SCALCE, 42% compared to LFQC and 44% compared to ORCOM. When comparing to generic compression approaches, a compression gain is achieved of up to 41% compared to GNU Gzip and 22% compared to 7-Zip at the Ultra setting. 
    Additionally, when compressing assembled sequences of the Human Genome, a compression gain of up to 34% is achieved compared to GNU Gzip and 16% compared to 7-Zip at the Ultra setting. A Windows executable version can be downloaded at https://github.com/tparidae/AFresh . tom.paridaens@ugent.be. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
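
The context-adaptive modelling that CABAC relies on can be sketched in miniature: each context keeps bit counts, supplies a probability estimate, and updates after every bit; summing -log2 p gives the ideal arithmetic-code length. This is an editorial sketch of the adaptive-model component only; the binarization and arithmetic-coding engine of real CABAC are omitted.

```python
import math

# Minimal sketch of context-adaptive binary modelling: a per-context,
# Laplace-smoothed bit-frequency estimate, updated after each bit.
# The sum of -log2(p) over the stream is the ideal arithmetic-code length.

class ContextModel:
    def __init__(self):
        self.counts = {}  # context -> [count of 0s, count of 1s]

    def code_length(self, context, bit):
        zeros, ones = self.counts.setdefault(context, [1, 1])
        p = (ones if bit else zeros) / (zeros + ones)
        self.counts[context][bit] += 1  # adapt the model
        return -math.log2(p)

def ideal_bits(bits, context_fn):
    model = ContextModel()
    return sum(model.code_length(context_fn(i, bits), b)
               for i, b in enumerate(bits))

# Using the previous bit as context, a repetitive stream costs fewer
# ideal bits than the 1 bit/symbol of a memoryless fair-coin model.
stream = [0, 0, 0, 1, 0, 0, 0, 1] * 8
with_ctx = ideal_bits(stream, lambda i, bs: bs[i - 1] if i > 0 else None)
```

The compression gains reported above come from exactly this effect: genomic symbol streams are highly structured, so conditioning the probability model on context sharpens the estimates and shortens the arithmetic code.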

  18. Odor-context effects in free recall after a short retention interval: a new methodology for controlling adaptation.

    PubMed

    Isarida, Takeo; Sakai, Tetsuya; Kubota, Takayuki; Koga, Miho; Katayama, Yu; Isarida, Toshiko K

    2014-04-01

    The present study investigated context effects of incidental odors in free recall after a short retention interval (5 min). With a short retention interval, the results are not confounded by extraneous odors or encounters with the experimental odor and possible rehearsal during a long retention interval. A short study time condition (4 s per item), predicted not to be affected by adaptation to the odor, and a long study time condition (8 s per item) were used. Additionally, we introduced a new method for recovery from adaptation, where a dissimilar odor was briefly presented at the beginning of the retention interval, and we demonstrated the effectiveness of this technique. An incidental learning paradigm was used to prevent overshadowing from confounding the results. In three experiments, undergraduates (N = 200) incidentally studied words presented one-by-one and received a free recall test. Two pairs of odors and a third odor having different semantic-differential characteristics were selected from 14 familiar odors. One of the odors was presented during encoding, and during the test, the same odor (same-context condition) or the other odor within the pair (different-context condition) was presented. Without using a recovery-from-adaptation method, a significant odor-context effect appeared in the 4-s/item condition, but not in the 8-s/item condition. Using the recovery-from-adaptation method, context effects were found for both the 8- and the 4-s/item conditions. The size of the recovered odor-context effect did not change with study time. There were no serial position effects. Implications of the present findings are discussed.

  19. ALEGRA -- A massively parallel h-adaptive code for solid dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Summers, R.M.; Wong, M.K.; Boucheron, E.A.

    1997-12-31

    ALEGRA is a multi-material, arbitrary-Lagrangian-Eulerian (ALE) code for solid dynamics designed to run on massively parallel (MP) computers. It combines the features of modern Eulerian shock codes, such as CTH, with modern Lagrangian structural analysis codes using an unstructured grid. ALEGRA is being developed for use on the teraflop supercomputers to conduct advanced three-dimensional (3D) simulations of shock phenomena important to a variety of systems. ALEGRA was designed with the Single Program Multiple Data (SPMD) paradigm, in which the mesh is decomposed into sub-meshes so that each processor gets a single sub-mesh with approximately the same number of elements. Using this approach the authors have been able to produce a single code that can scale from one processor to thousands of processors. A current major effort is to develop efficient, high precision simulation capabilities for ALEGRA, without the computational cost of using a global highly resolved mesh, through flexible, robust h-adaptivity of finite elements. H-adaptivity is the dynamic refinement of the mesh by subdividing elements, thus changing the characteristic element size and reducing numerical error. The authors are working on several major technical challenges that must be met to make effective use of HAMMER on MP computers.
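    The h-adaptivity idea described in this record can be illustrated with a minimal sketch: elements whose error indicator exceeds a threshold are subdivided, halving the characteristic element size where the solution demands it. This is a hypothetical 1D toy, not ALEGRA's actual refinement scheme; the function names and the jump-based error indicator are invented for illustration.

    ```python
    def refine_mesh(elements, error_indicator, threshold=0.15):
        """Subdivide 1D elements (x_left, x_right) whose estimated error is high."""
        refined = []
        for (xl, xr) in elements:
            if error_indicator(xl, xr) > threshold:
                xm = 0.5 * (xl + xr)
                refined.extend([(xl, xm), (xm, xr)])  # h-refinement: split element
            else:
                refined.append((xl, xr))              # element is fine as-is
        return refined

    # Crude per-element error indicator: the jump of f(x) = x**4 across the
    # element, so refinement concentrates where the gradient is steep.
    def jump_error(xl, xr):
        f = lambda x: x ** 4
        return abs(f(xr) - f(xl))

    mesh = [(i / 4, (i + 1) / 4) for i in range(4)]
    mesh = refine_mesh(mesh, jump_error)   # only the right-hand elements split
    ```

    One pass refines only the two rightmost elements, where x**4 varies fastest; real codes iterate this until the indicator falls below tolerance everywhere.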

  20. An Adaptive Source-Channel Coding with Feedback for Progressive Transmission of Medical Images

    PubMed Central

    Lo, Jen-Lung; Sanei, Saeid; Nazarpour, Kianoush

    2009-01-01

    A novel adaptive source-channel coding with feedback for progressive transmission of medical images is proposed here. In the source coding part, the transmission starts from the region of interest (RoI). The parity length in the channel code varies with respect to both the proximity of the image subblock to the RoI and the channel noise, which is iteratively estimated in the receiver. The overall transmitted data can be controlled by the user (clinician). In the case of medical data transmission, it is vital to keep the distortion level under control, as in most cases certain clinically important regions have to be transmitted without any visible error. The proposed system significantly reduces the transmission time and error. Moreover, the system is very user friendly, since the selection of the RoI, its size, the overall code rate, and a number of test features such as noise level can be set by the users at both ends. A MATLAB-based TCP/IP connection has been established to demonstrate the proposed interactive and adaptive progressive transmission system. The proposed system is simulated for both the binary symmetric channel (BSC) and the Rayleigh channel. The experimental results verify the effectiveness of the design. PMID:19190770
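    The allocation rule described above can be sketched as follows: parity length grows with the estimated channel error rate and shrinks with distance from the RoI, so clinically critical subblocks get the strongest protection. All names, constants, and the weighting formula here are hypothetical; the paper's actual allocation is not specified in this abstract.

    ```python
    def parity_length(dist_to_roi, est_error_rate, base=8, max_parity=64):
        """Illustrative parity-length rule.
        dist_to_roi: 0 for the RoI subblock, larger for peripheral subblocks.
        est_error_rate: channel BER, iteratively estimated at the receiver."""
        proximity_weight = 1.0 / (1 + dist_to_roi)       # closer => more parity
        noise_weight = min(est_error_rate / 0.01, 4.0)   # scaled vs a 1% reference BER
        parity = int(base * (1 + noise_weight) * proximity_weight * 4)
        return max(base, min(parity, max_parity))        # clamp to [base, max_parity]

    # The RoI subblock on a noisy channel receives far more parity than a
    # peripheral subblock on a clean one.
    strong = parity_length(dist_to_roi=0, est_error_rate=0.02)
    weak = parity_length(dist_to_roi=5, est_error_rate=0.001)
    ```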

  1. Optimum Adaptive Modulation and Channel Coding Scheme for Frequency Domain Channel-Dependent Scheduling in OFDM Based Evolved UTRA Downlink

    NASA Astrophysics Data System (ADS)

    Miki, Nobuhiko; Kishiyama, Yoshihisa; Higuchi, Kenichi; Sawahashi, Mamoru; Nakagawa, Masao

    In the Evolved UTRA (UMTS Terrestrial Radio Access) downlink, Orthogonal Frequency Division Multiplexing (OFDM) based radio access was adopted because of its inherent immunity to multipath interference and flexible accommodation of different spectrum arrangements. This paper presents the optimum adaptive modulation and channel coding (AMC) scheme when multiple resource blocks (RBs) are simultaneously assigned to the same user, assuming frequency and time domain channel-dependent scheduling in the downlink OFDMA radio access with single-antenna transmission. We start by presenting selection methods for the modulation and coding scheme (MCS) employing mutual information, both for RB-common and RB-dependent modulation schemes. Simulation results show that, irrespective of the application of power adaptation to RB-dependent modulation, the improvement in the achievable throughput of the RB-dependent modulation scheme compared to that for the RB-common modulation scheme is slight, i.e., 4 to 5%. In addition, the number of required control signaling bits in the RB-dependent modulation scheme becomes greater than that for the RB-common modulation scheme. Therefore, we conclude that the RB-common modulation and channel coding rate scheme is preferred when multiple RBs of the same coded stream are assigned to one user in the case of single-antenna transmission.
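    MI-based MCS selection for an RB-common scheme can be sketched roughly as follows: average the per-RB mutual information over the assigned RBs, then pick the highest-throughput MCS whose required MI is still met. The MCS table values below are illustrative placeholders, not the actual 3GPP tables or the paper's values.

    ```python
    MCS_TABLE = [  # (bits_per_symbol * code_rate, required mutual info per symbol)
        (0.5, 0.6),   # e.g. QPSK, r = 1/4
        (1.0, 1.1),   # e.g. QPSK, r = 1/2
        (2.0, 2.2),   # e.g. 16QAM, r = 1/2
        (3.0, 3.3),   # e.g. 16QAM, r = 3/4
    ]

    def select_mcs(per_rb_mi):
        """Pick the MCS with the highest spectral efficiency supported by the
        mean mutual information across the user's assigned resource blocks."""
        mean_mi = sum(per_rb_mi) / len(per_rb_mi)
        feasible = [(eff, req) for eff, req in MCS_TABLE if req <= mean_mi]
        if not feasible:
            return MCS_TABLE[0]        # fall back to the most robust MCS
        return max(feasible)           # highest efficiency among feasible entries

    mcs = select_mcs([2.8, 2.4, 1.9, 2.5])   # mean MI = 2.4
    ```

    Averaging MI rather than SNR is what makes the RB-common choice robust to frequency-selective variation across the assigned RBs.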

  2. How prevention curricula are taught under real-world conditions

    PubMed Central

    Miller-Day, Michelle; Pettigrew, Jonathan; Hecht, Michael L.; Shin, YoungJu; Graham, John; Krieger, Janice

    2015-01-01

    Purpose As interventions are disseminated widely, issues of fidelity and adaptation become increasingly critical to understand. This study aims to describe the types of adaptations made by teachers delivering a school-based substance use prevention curriculum and their reasons for adapting program content. Design/methodology/approach To determine the degree to which implementers adhere to a prevention curriculum, naturally adapt the curriculum, and the reasons implementers give for making adaptations, the study examined lesson adaptations made by the 31 teachers who implemented the keepin' it REAL drug prevention curriculum in 7th grade classrooms (n = 25 schools). Data were collected from teacher self-reports after each lesson and observer coding of videotaped lessons. From the total sample, 276 lesson videos were randomly selected for observational analysis. Findings Teachers self-reported adapting more than 68 percent of prevention lessons, while independent observers reported more than 97 percent of the observed lessons were adapted in some way. Types of adaptations included: altering the delivery of the lesson by revising the delivery timetable or delivery context; changing content of the lesson by removing, partially covering, revising, or adding content; and altering the designated format of the lesson (such as assigning small group activities to students as individual work). Reasons for adaptation included responding to constraints (time, institutional, personal, and technical), and responding to student needs (students' abilities to process curriculum content, to enhance student engagement with material). Research limitations/implications The study sample was limited to rural schools in the US mid-Atlantic; however, the results suggest that if programs are to be effectively implemented, program developers need a better understanding of the types of adaptations and reasons implementers provide for adapting curricula. 
Practical implications These descriptive data suggest that prevention curricula be developed in shorter teaching modules, developers reconsider the usefulness of homework, and implementer training and ongoing support might benefit from more attention to different implementation styles. Originality/value With nearly half of US public schools implementing some form of evidence-based substance use prevention program, issues of implementation fidelity and adaptation have become paramount in the field of prevention. The findings from this study reveal the complexity of the types of adaptations teachers make naturally in the classroom to evidence-based curricula and provide reasons for these adaptations. This information should prove useful for prevention researchers, program developers, and health educators alike. PMID:26290626

  3. Mixing of the Interstellar and Solar Plasmas at the Heliospheric Interface

    DOE PAGES

    Pogorelov, N. V.; Borovikov, S. N.

    2015-10-12

    From the ideal MHD perspective, the heliopause is a tangential discontinuity that separates the solar wind plasma from the local interstellar medium plasma. There are physical processes, however, that make the heliopause permeable. They can be subdivided into kinetic and MHD categories. Kinetic processes occur on small length and time scales, and cannot be resolved with MHD equations. On the other hand, MHD instabilities of the heliopause have much larger scales and can be easily observed by spacecraft. The heliopause may also be a subject of magnetic reconnection. In this paper, we discuss mechanisms of plasma mixing at the heliopause in the context of Voyager 1 observations. Numerical results are obtained with a Multi-Scale Fluid-Kinetic Simulation Suite (MS-FLUKSS), which is a package of numerical codes capable of performing adaptive mesh refinement simulations of complex plasma flows in the presence of discontinuities and charge exchange between ions and neutral atoms. The flow of the ionized component is described with the ideal MHD equations, while the transport of atoms is governed either by the Boltzmann equation or multiple Euler gas dynamics equations. The code can also treat nonthermal ions and turbulence produced by them.

  4. The global heliosphere: A parametric study

    NASA Technical Reports Server (NTRS)

    McNutt, R. L., Jr.; Lyon, J.; Goodrich, C. C.

    1995-01-01

    As the Pioneer 10 and 11 and Voyager 1 and 2 spacecraft continue their penetration into the outer heliosphere, more attention has been focused on the nature of the solar wind interaction with the Very Local Interstellar Medium (VLISM). Since the initial pioneering concepts of Davis in 1955 and Parker in the early 1960s, both in situ and remote measurements have led to various constraints that do not fit well into a coherent picture. To provide a context for these various observable constraints, we have adapted an explicitly time-dependent, explicitly three-dimensional magnetohydrodynamic (MHD) code to simulate the dependence of the heliospheric configuration and interaction with the VLISM on the properties of the external medium. The code also allows us to study temporal variations brought about by both short- and long-term changes in the solar wind and/or VLISM properties. We will discuss some of the initial results from this new effort and implications for the distances inferred to the termination shock and heliopause boundary. In particular, we will consider the effect of the Very Local Interstellar Magnetic Field (VLIMF) on the configuration and compare it with inferences from observations of outer heliosphere cosmic rays and the Very Low Frequency (VLF) outer heliospheric radio emissions.

  5. Towards measuring the semantic capacity of a physical medium demonstrated with elementary cellular automata.

    PubMed

    Dittrich, Peter

    2018-02-01

    The organic code concept and its operationalization by molecular codes have been introduced to study the semiotic nature of living systems. This contribution develops further the idea that the semantic capacity of a physical medium can be measured by assessing its ability to implement a code as a contingent mapping. For demonstration and evaluation, the approach is applied to a formal medium: elementary cellular automata (ECA). The semantic capacity is measured by counting the number of ways codes can be implemented. Additionally, a link to information theory is established by taking multivariate mutual information for quantifying contingency. It is shown how ECAs differ in their semantic capacities, how this is related to various ECA classifications, and how this depends on how a meaning is defined. Interestingly, if the meaning should persist for a certain while, the highest semantic capacity is found in CAs with apparently simple behavior, i.e., the fixed-point and two-cycle class. Synergy as a predictor for a CA's ability to implement codes can only be used if coding contexts are common. For large context spaces with sparse coding contexts, synergy is a weak predictor. Concluding, the approach presented here can distinguish CA-like systems with respect to their ability to implement contingent mappings. Applying this to physical systems appears straightforward and might lead to a novel physical property indicating how suitable a physical medium is to implement a semiotic system. Copyright © 2017 Elsevier B.V. All rights reserved.
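    The formal medium used in this record, an elementary cellular automaton, can be written in a few lines; rule numbering follows Wolfram's convention. This is only the ECA update itself, not the paper's code-counting procedure, and the seed state is invented for illustration.

    ```python
    def eca_step(cells, rule):
        """One synchronous update of a cyclic elementary CA under rule 0-255."""
        n = len(cells)
        table = [(rule >> i) & 1 for i in range(8)]   # rule bit i = output for
        return [table[(cells[(i - 1) % n] << 2)       # neighborhood with value i
                      | (cells[i] << 1)
                      | cells[(i + 1) % n]]
                for i in range(n)]

    # Rule 110 on a single seeded cell: the kind of trajectory on which one
    # would test whether a contingent mapping (a code) can be implemented.
    state = [0, 0, 0, 1, 0, 0, 0]
    state = eca_step(state, 110)
    ```

    Measuring semantic capacity would then amount to enumerating, over such trajectories, how many distinct context-dependent mappings the dynamics can realize.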

  6. Incorporating Code-Based Software in an Introductory Statistics Course

    ERIC Educational Resources Information Center

    Doehler, Kirsten; Taylor, Laura

    2015-01-01

    This article is based on the experiences of two statistics professors who have taught students to write and effectively utilize code-based software in a college-level introductory statistics course. Advantages of using software and code-based software in this context are discussed. Suggestions are made on how to ease students into using code with…

  7. Adaptive software-defined coded modulation for ultra-high-speed optical transport

    NASA Astrophysics Data System (ADS)

    Djordjevic, Ivan B.; Zhang, Yequn

    2013-10-01

    In optically-routed networks, different wavelength channels carrying the traffic to different destinations can have quite different optical signal-to-noise ratios (OSNRs) and signal is differently impacted by various channel impairments. Regardless of the data destination, an optical transport system (OTS) must provide the target bit-error rate (BER) performance. To provide target BER regardless of the data destination we adjust the forward error correction (FEC) strength. Depending on the information obtained from the monitoring channels, we select the appropriate code rate matching to the OSNR range that current channel OSNR falls into. To avoid frame synchronization issues, we keep the codeword length fixed independent of the FEC code being employed. The common denominator is the employment of quasi-cyclic (QC-) LDPC codes in FEC. For high-speed implementation, low-complexity LDPC decoding algorithms are needed, and some of them will be described in this invited paper. Instead of conventional QAM based modulation schemes, we employ the signal constellations obtained by optimum signal constellation design (OSCD) algorithm. To improve the spectral efficiency, we perform the simultaneous rate adaptation and signal constellation size selection so that the product of number of bits per symbol × code rate is closest to the channel capacity. Further, we describe the advantages of using 4D signaling instead of polarization-division multiplexed (PDM) QAM, by using the 4D MAP detection, combined with LDPC coding, in a turbo equalization fashion. Finally, to solve the problems related to the limited bandwidth of information infrastructure, high energy consumption, and heterogeneity of optical networks, we describe an adaptive energy-efficient hybrid coded-modulation scheme, which in addition to amplitude, phase, and polarization state employs the spatial modes as additional basis functions for multidimensional coded-modulation.
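    The rate-adaptation rule described above can be sketched simply: keep the codeword length fixed (avoiding frame re-synchronization) and choose the QC-LDPC code rate from the OSNR range the monitored channel falls into. The thresholds, rates, and codeword length below are illustrative placeholders, not the paper's actual design values.

    ```python
    CODEWORD_LEN = 16200   # fixed codeword length, independent of the FEC chosen

    RATE_TABLE = [         # (minimum OSNR in dB, code rate), strongest FEC last
        (18.0, 0.9),
        (15.0, 0.8),
        (12.0, 0.75),
        (0.0,  0.6),
    ]

    def select_code_rate(osnr_db):
        """Return (code rate, information bits per codeword) for the
        OSNR reported by the monitoring channels."""
        for min_osnr, rate in RATE_TABLE:
            if osnr_db >= min_osnr:
                return rate, int(CODEWORD_LEN * rate)
        return RATE_TABLE[-1][1], int(CODEWORD_LEN * RATE_TABLE[-1][1])

    rate, k = select_code_rate(13.5)   # falls in the 12-15 dB range
    ```

    Because only the split between information and parity bits changes, every destination sees the same frame structure while worse channels silently receive stronger protection.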

  8. Implications for Language Diversity in Instruction in the Context of Target Language Classrooms: Development of a Preliminary Model of the Effectiveness of Teacher Code-Switching

    ERIC Educational Resources Information Center

    Lee, Jang Ho

    2012-01-01

    This paper concerns the conceptual and pedagogical issues that revolve around target language (TL) only instruction and teacher code-switching in the context of TL classrooms. To this end, I first examine four intertwined ideas (that is, monolingualism, naturalism, native-speakerism, and absolutism) that run through the monolingual approach to TL…

  9. An edge-based solution-adaptive method applied to the AIRPLANE code

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Thomas, Scott D.; Cliff, Susan E.

    1995-01-01

    Computational methods to solve large-scale realistic problems in fluid flow can be made more efficient and cost effective by using them in conjunction with dynamic mesh adaption procedures that perform simultaneous coarsening and refinement to capture flow features of interest. This work couples the tetrahedral mesh adaption scheme, 3D_TAG, with the AIRPLANE code to solve complete aircraft configuration problems in transonic and supersonic flow regimes. Results indicate that the near-field sonic boom pressure signature of a cone-cylinder is improved, the oblique and normal shocks are better resolved on a transonic wing, and the bow shock ahead of an unstarted inlet is better defined.

  10. Using the verona coding definitions of emotional sequences (VR-CoDES) and health provider responses (VR-CoDES-P) in the dental context.

    PubMed

    Wright, Alice; Humphris, Gerry; Wanyonyi, Kristina L; Freeman, Ruth

    2012-10-01

    To show whether cues, concerns and provider responses (defined in the VR-CoDES and VR-CoDES-P manuals) are present, can be reliably coded, and require additional advice for adoption in a dental context. Thirteen patients in a dental practice setting were videoed with either their dentist or hygienist and dental nurse present in routine treatment sessions. All utterances were coded using the Verona systems: VR-CoDES and the VR-CoDES-P. Rates of cues, concerns and provider responses were described and reliability was tested. The VR-CoDES and VR-CoDES-P were successfully applied in the dental context. The intra-rater ICCs for the detection of cues and concerns and provider response were acceptable and above 0.75. A similar satisfactory result was found for the inter-rater reliability. The VR-CoDES and the VR-CoDES-P are applicable in the dental setting with minor supporting guidelines and show evidence of reliable coding. The VR-CoDES and the VR-CoDES-P may be helpful tools for analysing patient cues and concerns and the dental professionals' responses in the dental context. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
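    The reliability check reported above (ICCs above 0.75) can be illustrated with a one-way random-effects intraclass correlation between two raters' cue counts per session. The ICC variant the authors actually used may differ, and the data here are invented.

    ```python
    def icc_oneway(ratings):
        """ICC(1,1) from a list of (rater1, rater2) scores, one pair per subject."""
        k = 2
        n = len(ratings)
        grand = sum(a + b for a, b in ratings) / (n * k)
        subj_means = [(a + b) / k for a, b in ratings]
        # Between-subjects and within-subjects mean squares (one-way ANOVA)
        msb = k * sum((m - grand) ** 2 for m in subj_means) / (n - 1)
        msw = sum((a - m) ** 2 + (b - m) ** 2
                  for (a, b), m in zip(ratings, subj_means)) / (n * (k - 1))
        return (msb - msw) / (msb + (k - 1) * msw)

    # Hypothetical per-session cue counts from two coders, in close agreement.
    pairs = [(4, 5), (2, 2), (7, 6), (0, 1), (3, 3), (5, 5)]
    icc = icc_oneway(pairs)
    ```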

  11. Here Today, Gone Tomorrow – Adaptation to Change in Memory-Guided Visual Search

    PubMed Central

    Zellin, Martina; Conci, Markus; von Mühlenen, Adrian; Müller, Hermann J.

    2013-01-01

    Visual search for a target object can be facilitated by the repeated presentation of an invariant configuration of nontargets (‘contextual cueing’). Here, we tested adaptation of learned contextual associations after a sudden, but permanent, relocation of the target. After an initial learning phase targets were relocated within their invariant contexts and repeatedly presented at new locations, before they returned to the initial locations. Contextual cueing for relocated targets was neither observed after numerous presentations nor after insertion of an overnight break. Further experiments investigated whether learning of additional, previously unseen context-target configurations is comparable to adaptation of existing contextual associations to change. In contrast to the lack of adaptation to changed target locations, contextual cueing developed for additional invariant configurations under identical training conditions. Moreover, across all experiments, presenting relocated targets or additional contexts did not interfere with contextual cueing of initially learned invariant configurations. Overall, the adaptation of contextual memory to changed target locations was severely constrained and unsuccessful in comparison to learning of an additional set of contexts, which suggests that contextual cueing facilitates search for only one repeated target location. PMID:23555038

  12. Benefit of adaptive FEC in shared backup path protected elastic optical network.

    PubMed

    Guo, Hong; Dai, Hua; Wang, Chao; Li, Yongcheng; Bose, Sanjay K; Shen, Gangxiang

    2015-07-27

    We apply an adaptive forward error correction (FEC) allocation strategy to an Elastic Optical Network (EON) operated with shared backup path protection (SBPP). To maximize the protected network capacity that can be carried, an Integer Linear Programming (ILP) model and a spectrum window plane (SWP)-based heuristic algorithm are developed. Simulation results show that the FEC coding overhead required by the adaptive FEC scheme is significantly lower than that needed by a fixed FEC allocation strategy, resulting in higher network capacity for the adaptive strategy. The adaptive FEC allocation strategy can also significantly outperform the fixed FEC allocation strategy both in terms of the spare capacity redundancy and the average FEC coding overhead needed per optical channel. The proposed heuristic algorithm is efficient; it not only performs close to the ILP model but also does much better than the shortest-path algorithm.

  13. A genome-wide signature of positive selection in ancient and recent invasive expansions of the honey bee Apis mellifera

    PubMed Central

    Zayed, Amro; Whitfield, Charles W.

    2008-01-01

    Apis mellifera originated in Africa and extended its range into Eurasia in two or more ancient expansions. In 1956, honey bees of African origin were introduced into South America, their descendents admixing with previously introduced European bees, giving rise to the highly invasive and economically devastating “Africanized” honey bee. Here we ask whether the honey bee's out-of-Africa expansions, both ancient and recent (invasive), were associated with a genome-wide signature of positive selection, detected by contrasting genetic differentiation estimates (FST) between coding and noncoding SNPs. In native populations, SNPs in protein-coding regions had significantly higher FST estimates than those in noncoding regions, indicating adaptive evolution in the genome driven by positive selection. This signal of selection was associated with the expansion of honey bees from Africa into Western and Northern Europe, perhaps reflecting adaptation to temperate environments. We estimate that positive selection acted on a minimum of 852–1,371 genes or ≈10% of the bee's coding genome. We also detected positive selection associated with the invasion of African-derived honey bees in the New World. We found that introgression of European-derived alleles into Africanized bees was significantly greater for coding than noncoding regions. Our findings demonstrate that Africanized bees exploited the genetic diversity present from preexisting introductions in an adaptive way. Finally, we found a significant negative correlation between FST estimates and the local GC content surrounding coding SNPs, suggesting that AT-rich genes play an important role in adaptive evolution in the honey bee. PMID:18299560
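    The core contrast in this record, elevated differentiation at coding versus noncoding SNPs, can be illustrated with per-SNP FST from two populations' allele frequencies. A simple Hudson-style estimator is used here as one common choice; the paper's exact estimator is not specified in the abstract, and all frequencies below are invented.

    ```python
    def fst_hudson(p1, p2, n1, n2):
        """Hudson-style per-SNP FST from allele frequencies and sample sizes."""
        num = ((p1 - p2) ** 2
               - p1 * (1 - p1) / (n1 - 1)
               - p2 * (1 - p2) / (n2 - 1))
        den = p1 * (1 - p2) + p2 * (1 - p1)
        return num / den if den > 0 else 0.0

    def mean_fst(snps, n1=50, n2=50):
        vals = [fst_hudson(p1, p2, n1, n2) for p1, p2 in snps]
        return sum(vals) / len(vals)

    # (pop1 freq, pop2 freq) pairs: differentiated coding SNPs vs. nearly
    # undifferentiated noncoding SNPs, mimicking the signature of selection.
    coding = [(0.9, 0.3), (0.8, 0.2), (0.7, 0.25)]
    noncoding = [(0.5, 0.45), (0.6, 0.55), (0.4, 0.42)]
    excess = mean_fst(coding) - mean_fst(noncoding)
    ```

    A significantly positive excess at coding SNPs, after accounting for sampling noise, is what the authors interpret as genome-wide positive selection.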

  14. A genome-wide signature of positive selection in ancient and recent invasive expansions of the honey bee Apis mellifera.

    PubMed

    Zayed, Amro; Whitfield, Charles W

    2008-03-04

    Apis mellifera originated in Africa and extended its range into Eurasia in two or more ancient expansions. In 1956, honey bees of African origin were introduced into South America, their descendents admixing with previously introduced European bees, giving rise to the highly invasive and economically devastating "Africanized" honey bee. Here we ask whether the honey bee's out-of-Africa expansions, both ancient and recent (invasive), were associated with a genome-wide signature of positive selection, detected by contrasting genetic differentiation estimates (F(ST)) between coding and noncoding SNPs. In native populations, SNPs in protein-coding regions had significantly higher F(ST) estimates than those in noncoding regions, indicating adaptive evolution in the genome driven by positive selection. This signal of selection was associated with the expansion of honey bees from Africa into Western and Northern Europe, perhaps reflecting adaptation to temperate environments. We estimate that positive selection acted on a minimum of 852-1,371 genes or approximately 10% of the bee's coding genome. We also detected positive selection associated with the invasion of African-derived honey bees in the New World. We found that introgression of European-derived alleles into Africanized bees was significantly greater for coding than noncoding regions. Our findings demonstrate that Africanized bees exploited the genetic diversity present from preexisting introductions in an adaptive way. Finally, we found a significant negative correlation between F(ST) estimates and the local GC content surrounding coding SNPs, suggesting that AT-rich genes play an important role in adaptive evolution in the honey bee.

  15. Theoretical Roots and Pedagogical Implications for Contextual Evaluation.

    ERIC Educational Resources Information Center

    Ewald, Helen Rothschild

    There are three types of contexts subject to evaluation of student writing; the textual context that influences grammatical acceptability and the rhetorical effectiveness of a sentence; the coded context or cultural constraints such as generic and stylistic conventions; and pragmatic contexts that unite form, function, and setting in a…

  16. Within Your Control? When Problem Solving May Be Most Helpful.

    PubMed

    Sarfan, Laurel D; Gooch, Peter; Clerkin, Elise M

    2017-08-01

    Emotion regulation strategies have been conceptualized as adaptive or maladaptive, but recent evidence suggests emotion regulation outcomes may be context-dependent. The present study tested whether the adaptiveness of a putatively adaptive emotion regulation strategy, problem solving, varied across contexts of high and low controllability. The present study also tested rumination, suggested to be one of the most putatively maladaptive strategies, which was expected to be associated with negative outcomes regardless of context. Participants completed an in vivo speech task, in which they were randomly assigned to a controllable (n = 65) or an uncontrollable (n = 63) condition. Using moderation analyses, we tested whether controllability interacted with emotion regulation use to predict negative affect, avoidance, and perception of performance. Partially consistent with hypotheses, problem solving was associated with certain positive outcomes (i.e., reduced behavioral avoidance) in the controllable (vs. uncontrollable) condition. Consistent with predictions, rumination was associated with negative outcomes (i.e., desired avoidance, negative affect, negative perception of performance) in both conditions. Overall, findings partially support contextual models of emotion regulation, insofar as the data suggest that the effects of problem solving may be more adaptive in controllable contexts for certain outcomes, whereas rumination may be maladaptive regardless of context.

  17. Intelligent Context-Aware and Adaptive Interface for Mobile LBS

    PubMed Central

    Liu, Yanhong

    2015-01-01

    Context-aware user interface plays an important role in many human-computer interaction tasks of location based services. Although spatial models for context-aware systems have been studied extensively, how to locate specific spatial information for users is still not well resolved, which is important in the mobile environment where location based services users are impeded by device limitations. Better context-aware human-computer interaction models of mobile location based services are needed not just to predict performance outcomes, such as whether people will be able to find the information needed to complete a human-computer interaction task, but also to understand the human processes that interact in spatial query, which will in turn inform the detailed design of better user interfaces in mobile location based services. In this study, a context-aware adaptive model for mobile location based services interface is proposed, which contains three major sections: purpose, adjustment, and adaptation. Based on this model we try to describe the process of user operation and interface adaptation clearly through the dynamic interaction between users and the interface. We then show how the model accommodates users' demands in a complicated environment, and its feasibility is suggested by the experimental results. PMID:26457077

  18. Evaluation of the Xeon phi processor as a technology for the acceleration of real-time control in high-order adaptive optics systems

    NASA Astrophysics Data System (ADS)

    Barr, David; Basden, Alastair; Dipper, Nigel; Schwartz, Noah; Vick, Andy; Schnetler, Hermine

    2014-08-01

    We present wavefront reconstruction acceleration of high-order AO systems using an Intel Xeon Phi processor. The Xeon Phi is a coprocessor providing many integrated cores and designed for accelerating compute intensive, numerical codes. Unlike other accelerator technologies, it allows virtually unchanged C/C++ to be recompiled to run on the Xeon Phi, giving the potential of making development, upgrade and maintenance faster and less complex. We benchmark the Xeon Phi in the context of AO real-time control by running a matrix vector multiply (MVM) algorithm. We investigate variability in execution time and demonstrate a substantial speed-up in loop frequency. We examine the integration of a Xeon Phi into an existing RTC system and show that performance improvements can be achieved with limited development effort.
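    The benchmark described in this record is, at its core, a timed matrix-vector multiply loop (reconstructor matrix times slope vector) whose loop frequency is the figure of merit. The sketch below uses plain Python as a stand-in for the tuned C/C++ kernel actually run on the Xeon Phi; the problem sizes are tiny illustrative values, not the high-order AO dimensions benchmarked.

    ```python
    import time

    def mvm(matrix, vec):
        """Matrix-vector multiply: actuator commands from measured slopes."""
        return [sum(r * v for r, v in zip(row, vec)) for row in matrix]

    n_act, n_slopes, iters = 64, 128, 50
    matrix = [[(i + j) % 7 * 0.01 for j in range(n_slopes)] for i in range(n_act)]
    slopes = [0.5] * n_slopes

    t0 = time.perf_counter()
    for _ in range(iters):
        commands = mvm(matrix, slopes)   # one RTC frame: slopes -> commands
    t1 = time.perf_counter()
    loop_hz = iters / (t1 - t0)          # achievable loop frequency at this size
    ```

    Measuring the spread of per-iteration times, not just the mean, is what the record refers to as execution-time variability, which matters as much as throughput for hard real-time control.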

  19. A COTS-Based Replacement Strategy for Aging Avionics Computers

    DTIC Science & Technology

    2001-12-01

    Communication Control Unit. A COTS-Based Replacement Strategy for Aging Avionics Computers. Index terms: COTS microprocessor, real-time operating system, new native code, native code objects, native code thread, legacy function, virtual component environment, context switch thunk, add-in replacement.

  20. Temporal Code-Driven Stimulation: Definition and Application to Electric Fish Signaling

    PubMed Central

    Lareo, Angel; Forlim, Caroline G.; Pinto, Reynaldo D.; Varona, Pablo; Rodriguez, Francisco de Borja

    2016-01-01

    Closed-loop activity-dependent stimulation is a powerful methodology to assess information processing in biological systems. In this context, the development of novel protocols, their implementation in bioinformatics toolboxes and their application to different description levels open up a wide range of possibilities in the study of biological systems. We developed a methodology for studying biological signals representing them as temporal sequences of binary events. A specific sequence of these events (code) is chosen to deliver a predefined stimulation in a closed-loop manner. The response to this code-driven stimulation can be used to characterize the system. This methodology was implemented in a real time toolbox and tested in the context of electric fish signaling. We show that while there are codes that evoke a response that cannot be distinguished from a control recording without stimulation, other codes evoke a characteristic distinct response. We also compare the code-driven response to open-loop stimulation. The discussed experiments validate the proposed methodology and the software toolbox. PMID:27766078
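    The closed-loop protocol described above reduces, at its simplest, to scanning a binary event sequence for a predefined code word and emitting a stimulation trigger at each detection. This is a hypothetical offline sketch of that step; the real toolbox operates in real time on the live recording, and the event data and code below are invented.

    ```python
    def code_driven_triggers(events, code):
        """Return the end positions at which `code` occurs in `events`,
        i.e., the instants at which stimulation would be delivered."""
        n, m = len(events), len(code)
        return [i + m for i in range(n - m + 1) if events[i:i + m] == code]

    # Binarized inter-pulse intervals from a recording (hypothetical), with
    # target code 1,0,1: each hit would trigger the predefined stimulation.
    events = [0, 1, 0, 1, 1, 0, 1, 0, 0, 1, 0, 1]
    triggers = code_driven_triggers(events, [1, 0, 1])
    ```

    Comparing the system's response after code-driven triggers against a control recording without stimulation is then what distinguishes codes that evoke a characteristic response from those that do not.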

  1. Temporal Code-Driven Stimulation: Definition and Application to Electric Fish Signaling.

    PubMed

    Lareo, Angel; Forlim, Caroline G; Pinto, Reynaldo D; Varona, Pablo; Rodriguez, Francisco de Borja

    2016-01-01

    Closed-loop activity-dependent stimulation is a powerful methodology to assess information processing in biological systems. In this context, the development of novel protocols, their implementation in bioinformatics toolboxes and their application to different description levels open up a wide range of possibilities in the study of biological systems. We developed a methodology for studying biological signals representing them as temporal sequences of binary events. A specific sequence of these events (code) is chosen to deliver a predefined stimulation in a closed-loop manner. The response to this code-driven stimulation can be used to characterize the system. This methodology was implemented in a real time toolbox and tested in the context of electric fish signaling. We show that while there are codes that evoke a response that cannot be distinguished from a control recording without stimulation, other codes evoke a characteristic distinct response. We also compare the code-driven response to open-loop stimulation. The discussed experiments validate the proposed methodology and the software toolbox.

  2. What adaptation to research is needed following crises: a comparative, qualitative study of the health workforce in Sierra Leone and Nepal.

    PubMed

    Raven, Joanna; Baral, Sushil; Wurie, Haja; Witter, Sophie; Samai, Mohamed; Paudel, Pravin; Subedi, Hom Nath; Martineau, Tim; Elsey, Helen; Theobald, Sally

    2018-02-07

    Health workers are critical to the performance of health systems; yet, evidence about their coping strategies and support needs during and post crisis is lacking. There is very limited discussion about how research teams should respond when unexpected crises occur during on-going research. This paper critically presents the approaches and findings of two health systems research projects that explored and evaluated health worker performance and were adapted during crises, and provides lessons learnt on re-orientating research when the unexpected occurs. Health systems research was adapted post crisis to assess health workers' experiences and coping strategies. Qualitative in-depth interviews were conducted with 14 health workers in a heavily affected earthquake district in Nepal and 25 frontline health workers in four districts in Ebola-affected Sierra Leone. All data were transcribed and analysed using the framework approach, which included developing coding frameworks for each study, applying the frameworks, developing charts and describing the themes. A second layer of analysis included analysis across the two contexts, whereas a third layer involved the research teams reflecting on the approaches used to adapt the research during these crises and what was learned as individuals and research teams. In Sierra Leone, health workers were heavily stigmatised by the epidemic, leading to a breakdown of trust. Coping strategies included finding renewed purpose in continuing to serve their community, peer and family support (in some cases), and religion. In Nepal, individual determination, a sense of responsibility to the community and professional duty compelled staff to stay or return to their workplace. The research teams had trusting relationships with policy-makers and practitioners, which brought credibility and legitimacy to the change of research direction as well as the relationships to maximise the opportunity for findings to inform practice. 
In both contexts, health workers demonstrated considerable resilience in continuing to provide services despite limited support. Embedded researchers and institutions are arguably best placed to navigate emerging ethical and social justice challenges and are strategically positioned to support the co-production of knowledge and ensure research findings have impact.

  3. SAGE - MULTIDIMENSIONAL SELF-ADAPTIVE GRID CODE

    NASA Technical Reports Server (NTRS)

    Davies, C. B.

    1994-01-01

    SAGE, Self Adaptive Grid codE, is a flexible tool for adapting and restructuring both 2D and 3D grids. Solution-adaptive grid methods are useful tools for efficient and accurate flow predictions. In supersonic and hypersonic flows, strong gradient regions such as shocks, contact discontinuities, shear layers, etc., require careful distribution of grid points to minimize grid error and produce accurate flow-field predictions. SAGE helps the user obtain more accurate solutions by intelligently redistributing (i.e. adapting) the original grid points based on an initial or interim flow-field solution. The user then computes a new solution using the adapted grid as input to the flow solver. The adaptive-grid methodology poses the problem in an algebraic, unidirectional manner for multi-dimensional adaptations. The procedure is analogous to applying tension and torsion spring forces proportional to the local flow gradient at every grid point and finding the equilibrium position of the resulting system of grid points. The multi-dimensional problem of grid adaption is split into a series of one-dimensional problems along the computational coordinate lines. The reduced one dimensional problem then requires a tridiagonal solver to find the location of grid points along a coordinate line. Multi-directional adaption is achieved by the sequential application of the method in each coordinate direction. The tension forces direct the redistribution of points to the strong gradient region. To maintain smoothness and a measure of orthogonality of grid lines, torsional forces are introduced that relate information between the family of lines adjacent to one another. The smoothness and orthogonality constraints are direction-dependent, since they relate only the coordinate lines that are being adapted to the neighboring lines that have already been adapted. Therefore the solutions are non-unique and depend on the order and direction of adaption. 
Non-uniqueness of the adapted grid is acceptable since it makes possible an overall and local error reduction through grid redistribution. SAGE includes the ability to modify the adaption techniques in boundary regions, which substantially improves the flexibility of the adaptive scheme. The vectorial approach used in the analysis also provides flexibility. The user has complete choice of adaption direction and order of sequential adaptions without concern for the computational data structure. Multiple passes are available with no restraint on stepping directions; for each adaptive pass the user can choose a completely new set of adaptive parameters. This facility, combined with the capability of edge boundary control, enables the code to individually adapt multi-dimensional multiple grids. Zonal grids can be adapted while maintaining continuity along the common boundaries. For patched grids, the multiple-pass capability enables complete adaption. SAGE is written in FORTRAN 77 and is intended to be machine independent; however, it requires a FORTRAN compiler which supports NAMELIST input. It has been successfully implemented on Sun series computers, SGI IRIS's, DEC MicroVAX computers, HP series computers, the Cray YMP, and IBM PC compatibles. Source code is provided, but no sample input and output files are provided. The code reads three datafiles: one that contains the initial grid coordinates (x,y,z), one that contains corresponding flow-field variables, and one that contains the user control parameters. It is assumed that the first two datasets are formatted as defined in the plotting software package PLOT3D. Several machine versions of PLOT3D are available from COSMIC. The amount of main memory is dependent on the size of the matrix. The standard distribution medium for SAGE is a 5.25 inch 360K MS-DOS format diskette. 
It is also available on a .25 inch streaming magnetic tape cartridge in UNIX tar format or on a 9-track 1600 BPI ASCII CARD IMAGE format magnetic tape. SAGE was developed in 1989, first released as a 2D version in 1991 and updated to 3D in 1993.
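The one-dimensional adaptation pass described above (tension weights proportional to the local flow gradient, solved as a tridiagonal system along each coordinate line) can be sketched as follows. This is a minimal Python illustration of the equidistribution idea, not SAGE's FORTRAN implementation; the weight function and the constant `k` are assumptions:

```python
import numpy as np

def thomas(sub, diag, sup, rhs):
    """Solve a tridiagonal system (Thomas algorithm)."""
    n = len(diag)
    b = np.array(diag, dtype=float)
    d = np.array(rhs, dtype=float)
    for i in range(1, n):
        m = sub[i - 1] / b[i - 1]
        b[i] -= m * sup[i - 1]
        d[i] -= m * d[i - 1]
    x = np.empty(n)
    x[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        x[i] = (d[i] - sup[i] * x[i + 1]) / b[i]
    return x

def adapt_grid_1d(x, f, k=5.0):
    """One adaptation pass along a line: spacing shrinks where |df/dx| is
    large, from the spring-equilibrium condition w * dx = const."""
    grad = np.abs(np.gradient(f, x))
    w = 1.0 + k * grad / grad.max()     # tension weight at each point
    wm = 0.5 * (w[:-1] + w[1:])         # weight at each cell midpoint
    diag = -(wm[:-1] + wm[1:])          # interior equilibrium equations
    off = wm[1:-1]                      # couples neighboring unknowns
    rhs = np.zeros(len(diag))
    rhs[0] -= wm[0] * x[0]              # fixed left boundary
    rhs[-1] -= wm[-1] * x[-1]           # fixed right boundary
    interior = thomas(off, diag, off, rhs)
    return np.concatenate([[x[0]], interior, [x[-1]]])
```

Applied to a steep profile such as a tanh "shock", the pass clusters points in the gradient region while keeping the endpoints and the monotone ordering of the line.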

  4. A Distributed Value of Information (VoI)-Based Approach for Mission-Adaptive Context-Aware Information Management and Presentation

    DTIC Science & Technology

    2016-05-16

    metrics involve regulating automation of complex systems, such as aircraft.12 Additionally, adaptive management of content in user interfaces has also...both the user and environmental context would aid in deciding how to present the information to the Warfighter. The prototype system currently...positioning system, and rate sensors can provide user-specific context to disambiguate physiologic data. The consumer "quantified self" market has driven

  5. Network adaptation improves temporal representation of naturalistic stimuli in Drosophila eye: I dynamics.

    PubMed

    Zheng, Lei; Nikolaev, Anton; Wardill, Trevor J; O'Kane, Cahir J; de Polavieja, Gonzalo G; Juusola, Mikko

    2009-01-01

    Because of the limited processing capacity of eyes, retinal networks must adapt constantly to best present the ever-changing visual world to the brain. However, we still know little about how adaptation in retinal networks shapes neural encoding of changing information. To study this question, we recorded voltage responses from photoreceptors (R1-R6) and their output neurons (LMCs) in the Drosophila eye to repeated patterns of contrast values, collected from natural scenes. By analyzing the continuous photoreceptor-to-LMC transformations of these graded-potential neurons, we show that the efficiency of coding is dynamically improved by adaptation. In particular, adaptation enhances both the frequency and amplitude distribution of LMC output by improving sensitivity to under-represented signals within seconds. Moreover, the signal-to-noise ratio of LMC output increases on the same timescale. We suggest that these coding properties can be used to study network adaptation using the genetic tools in Drosophila, as shown in a companion paper (Part II).

  6. Network Adaptation Improves Temporal Representation of Naturalistic Stimuli in Drosophila Eye: I Dynamics

    PubMed Central

    Wardill, Trevor J.; O'Kane, Cahir J.; de Polavieja, Gonzalo G.; Juusola, Mikko

    2009-01-01

    Because of the limited processing capacity of eyes, retinal networks must adapt constantly to best present the ever-changing visual world to the brain. However, we still know little about how adaptation in retinal networks shapes neural encoding of changing information. To study this question, we recorded voltage responses from photoreceptors (R1–R6) and their output neurons (LMCs) in the Drosophila eye to repeated patterns of contrast values, collected from natural scenes. By analyzing the continuous photoreceptor-to-LMC transformations of these graded-potential neurons, we show that the efficiency of coding is dynamically improved by adaptation. In particular, adaptation enhances both the frequency and amplitude distribution of LMC output by improving sensitivity to under-represented signals within seconds. Moreover, the signal-to-noise ratio of LMC output increases on the same timescale. We suggest that these coding properties can be used to study network adaptation using the genetic tools in Drosophila, as shown in a companion paper (Part II). PMID:19180196

  7. Adapting a Navier-Stokes code to the ICL-DAP

    NASA Technical Reports Server (NTRS)

    Grosch, C. E.

    1985-01-01

    The results of an experiment to adapt a Navier-Stokes code, originally developed on a serial computer, to concurrent processing on the ICL Distributed Array Processor (DAP) are reported. The algorithm used in solving the Navier-Stokes equations is briefly described. The architecture of the DAP and DAP FORTRAN are also described. The modifications of the algorithm needed to fit the DAP are given and discussed. Finally, performance results are given and conclusions are drawn.

  8. Multi-optimization Criteria-based Robot Behavioral Adaptability and Motion Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pin, Francois G.

    2002-06-01

    Robotic tasks are typically defined in Task Space (e.g., the 3-D World), whereas robots are controlled in Joint Space (motors). The transformation from Task Space to Joint Space must consider the task objectives (e.g., high precision, strength optimization, torque optimization), the task constraints (e.g., obstacles, joint limits, non-holonomic constraints, contact or tool task constraints), and the robot kinematics configuration (e.g., tools, type of joints, mobile platform, manipulator, modular additions, locked joints). Commercially available robots are optimized for a specific set of tasks, objectives and constraints and, therefore, their control codes are extremely specific to a particular set of conditions. Thus, there exist a multiplicity of codes, each handling a particular set of conditions, but none suitable for use on robots with widely varying tasks, objectives, constraints, or environments. On the other hand, most DOE missions and tasks are typically "batches of one". Attempting to use commercial codes for such work requires significant personnel and schedule costs for re-programming or adding code to the robots whenever a change in task objective, robot configuration, number and type of constraint, etc. occurs. The objective of our project is to develop a "generic code" to implement this Task-Space to Joint-Space transformation that would allow robot behavior adaptation, in real time (at loop rate), to changes in task objectives, number and type of constraints, modes of control, and kinematics configuration (e.g., new tools, added module). Our specific goal is to develop a single code for the general solution of under-specified systems of algebraic equations that is suitable for solving the inverse kinematics of robots, is usable for all types of robots (mobile robots, manipulators, mobile manipulators, etc.) with no limitation on the number of joints and the number of controlled Task-Space variables, can adapt to real-time changes in number and type of constraints and in task objectives, and can adapt to changes in kinematics configurations (change of module, change of tool, joint failure adaptation, etc.).
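A common textbook approach to the under-specified Task-Space-to-Joint-Space transformation described above is a damped least-squares step, sketched here for a planar serial arm. This is illustrative only and not the project's generic code; the arm model, damping, and step size are assumptions:

```python
import numpy as np

def fk(q, lengths):
    """End-effector position of a planar serial arm."""
    angles = np.cumsum(q)
    return np.array([np.sum(lengths * np.cos(angles)),
                     np.sum(lengths * np.sin(angles))])

def jacobian(q, lengths):
    angles = np.cumsum(q)
    J = np.zeros((2, len(q)))
    for i in range(len(q)):
        # joint i sweeps every link from i outward
        J[0, i] = -np.sum(lengths[i:] * np.sin(angles[i:]))
        J[1, i] = np.sum(lengths[i:] * np.cos(angles[i:]))
    return J

def ik_step(q, lengths, target, damping=0.2, step=0.3):
    """Damped least-squares update: 2 task variables, n >= 2 joints, so the
    system is under-specified and the damping regularizes the inversion."""
    J = jacobian(q, lengths)
    err = target - fk(q, lengths)
    dq = J.T @ np.linalg.solve(J @ J.T + damping ** 2 * np.eye(2), err)
    return q + step * dq
```

Iterating `ik_step` drives the end-effector toward the target regardless of how many joints the arm has, which is the flavor of generality the abstract is after.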

  9. Biological Sensitivity to Context: The Interactive Effects of Stress Reactivity and Family Adversity on Socio-Emotional Behavior and School Readiness

    PubMed Central

    Obradović, Jelena; Bush, Nicole R.; Stamperdahl, Juliet; Adler, Nancy E.; Boyce, W. Thomas

    2009-01-01

    This study examined the direct and interactive effects of stress reactivity and family adversity on socio-emotional and cognitive development in 338 five-to-six-year-old children. Neurobiological stress reactivity was measured as respiratory sinus arrhythmia and salivary cortisol responses to social, cognitive, sensory, and emotional challenges. Adaptation was assessed using child, parent, and teacher reports of externalizing symptoms, prosocial behaviors, school engagement, and academic competence. Results revealed significant interactions between reactivity and adversity. High stress reactivity was associated with more maladaptive outcomes in the context of high adversity but with better adaptation in the context of low adversity. The findings corroborate a reconceptualization of stress reactivity as biological sensitivity to context by showing that high reactivity can both hinder and promote adaptive functioning. PMID:20331667

  10. Real-time data compression of broadcast video signals

    NASA Technical Reports Server (NTRS)

    Shalkauser, Mary Jo W. (Inventor); Whyte, Wayne A., Jr. (Inventor); Barnes, Scott P. (Inventor)

    1991-01-01

    A non-adaptive predictor, a nonuniform quantizer, and a multi-level Huffman coder are incorporated into a differential pulse code modulation system for coding and decoding broadcast video signals in real time.
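The coding loop the patent describes can be sketched as follows, with mu-law companding standing in for the (unspecified) nonuniform quantizer and the Huffman entropy-coding stage omitted. All names and parameter values are illustrative, not the patented design:

```python
import numpy as np

MU = 255.0  # mu-law compression constant (assumed, not from the patent)

def quantize(residual, levels=16, vmax=1.0):
    """Nonuniform quantizer: fine steps near zero, where DPCM prediction
    residuals concentrate, coarse steps for rare large residuals."""
    r = np.clip(residual / vmax, -1.0, 1.0)
    c = np.sign(r) * np.log1p(MU * np.abs(r)) / np.log1p(MU)
    return int(np.round((c + 1.0) / 2.0 * (levels - 1)))

def dequantize(idx, levels=16, vmax=1.0):
    c = idx / (levels - 1) * 2.0 - 1.0
    return float(np.sign(c) * np.expm1(abs(c) * np.log1p(MU)) / MU) * vmax

def dpcm_encode(signal, levels=16):
    """Non-adaptive previous-sample predictor; the encoder tracks the
    decoder's reconstruction so quantization error cannot accumulate."""
    codes, prev = [], 0.0
    for s in signal:
        idx = quantize(s - prev, levels)
        codes.append(idx)
        prev += dequantize(idx, levels)
    return codes

def dpcm_decode(codes, levels=16):
    out, prev = [], 0.0
    for idx in codes:
        prev += dequantize(idx, levels)
        out.append(prev)
    return np.array(out)
```

Because the encoder quantizes the difference against its own reconstruction rather than the raw previous sample, the decoder stays in lockstep and the per-sample error is bounded by the quantizer step.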

  11. Review and Implementation of the Emerging CCSDS Recommended Standard for Multispectral and Hyperspectral Lossless Image Coding

    NASA Technical Reports Server (NTRS)

    Sanchez, Jose Enrique; Auge, Estanislau; Santalo, Josep; Blanes, Ian; Serra-Sagrista, Joan; Kiely, Aaron

    2011-01-01

    A new standard for image coding is being developed by the MHDC working group of the CCSDS, targeting onboard compression of multi- and hyper-spectral imagery captured by aircraft and satellites. The proposed standard is based on the "Fast Lossless" adaptive linear predictive compressor, and is adapted to better overcome issues of onboard scenarios. In this paper, we present a review of the state of the art in this field, and provide an experimental comparison of the coding performance of the emerging standard in relation to other state-of-the-art coding techniques. Our own independent implementation of the MHDC Recommended Standard, as well as of some of the other techniques, has been used to provide extensive results over the vast corpus of test images from the CCSDS-MHDC.
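The "Fast Lossless" compressor is built on adaptive linear prediction. A minimal sign-LMS sketch of that idea, not the actual CCSDS algorithm, might look like this (predictor order and learning rate are arbitrary):

```python
def predictive_encode(samples, order=3, lr=1e-4):
    """Adaptive linear prediction: predictor weights are nudged by the sign
    of each error (sign-LMS); residuals are stored exactly, so the scheme
    is lossless."""
    w = [0.0] * order
    hist = [0] * order          # most recent sample first
    residuals = []
    for s in samples:
        pred = int(round(sum(wi * hi for wi, hi in zip(w, hist))))
        e = s - pred
        residuals.append(e)
        sgn = (e > 0) - (e < 0)
        w = [wi + lr * sgn * hi for wi, hi in zip(w, hist)]
        hist = [s] + hist[:-1]
    return residuals

def predictive_decode(residuals, order=3, lr=1e-4):
    """Mirror of the encoder: identical predictor state, so s = pred + e
    reconstructs every sample exactly."""
    w = [0.0] * order
    hist = [0] * order
    out = []
    for e in residuals:
        pred = int(round(sum(wi * hi for wi, hi in zip(w, hist))))
        s = pred + e
        out.append(s)
        sgn = (e > 0) - (e < 0)
        w = [wi + lr * sgn * hi for wi, hi in zip(w, hist)]
        hist = [s] + hist[:-1]
    return out
```

The residual stream is what an entropy coder would then compress; losslessness follows from the encoder and decoder running identical predictor updates.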

  12. High Order Modulation Protograph Codes

    NASA Technical Reports Server (NTRS)

    Nguyen, Thuy V. (Inventor); Nosratinia, Aria (Inventor); Divsalar, Dariush (Inventor)

    2014-01-01

    Digital communication coding methods for designing protograph-based bit-interleaved code modulation that is general and applies to any modulation. The general coding framework can support not only multiple rates but also adaptive modulation. The method is a two stage lifting approach. In the first stage, an original protograph is lifted to a slightly larger intermediate protograph. The intermediate protograph is then lifted via a circulant matrix to the expected codeword length to form a protograph-based low-density parity-check code.
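The second lifting stage, replacing each edge of the intermediate protograph with a circulant permutation block, can be sketched as follows. The base matrix and shift values are made-up examples, not codes from the patent:

```python
import numpy as np

def circulant(Z, shift):
    """Z x Z identity matrix cyclically shifted by `shift` columns."""
    return np.roll(np.eye(Z, dtype=int), shift, axis=1)

def lift_protograph(base, Z, shifts):
    """Replace every 1 in the small base (proto)matrix with a Z x Z
    circulant permutation block and every 0 with a Z x Z zero block,
    yielding the full parity-check matrix of the lifted LDPC code."""
    block_rows = []
    for i, row in enumerate(base):
        blocks = [circulant(Z, shifts[i][j]) if b
                  else np.zeros((Z, Z), dtype=int)
                  for j, b in enumerate(row)]
        block_rows.append(np.hstack(blocks))
    return np.vstack(block_rows)
```

Because each block is a permutation, the lifted matrix preserves the row and column weights (degree distribution) of the protograph while scaling the codeword length by Z.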

  13. CACTI: Free, Open-Source Software for the Sequential Coding of Behavioral Interactions

    PubMed Central

    Glynn, Lisa H.; Hallgren, Kevin A.; Houck, Jon M.; Moyers, Theresa B.

    2012-01-01

    The sequential analysis of client and clinician speech in psychotherapy sessions can help to identify and characterize potential mechanisms of treatment and behavior change. Previous studies required coding systems that were time-consuming, expensive, and error-prone. Existing software can be expensive and inflexible, and furthermore, no single package allows for pre-parsing, sequential coding, and assignment of global ratings. We developed a free, open-source, and adaptable program to meet these needs: The CASAA Application for Coding Treatment Interactions (CACTI). Without transcripts, CACTI facilitates the real-time sequential coding of behavioral interactions using WAV-format audio files. Most elements of the interface are user-modifiable through a simple XML file, and can be further adapted using Java through the terms of the GNU Public License. Coding with this software yields interrater reliabilities comparable to previous methods, but at greatly reduced time and expense. CACTI is a flexible research tool that can simplify psychotherapy process research, and has the potential to contribute to the improvement of treatment content and delivery. PMID:22815713

  14. Atypicalities in Perceptual Adaptation in Autism Do Not Extend to Perceptual Causality

    PubMed Central

    Karaminis, Themelis; Turi, Marco; Neil, Louise; Badcock, Nicholas A.; Burr, David; Pellicano, Elizabeth

    2015-01-01

    A recent study showed that adaptation to causal events (collisions) in adults caused subsequent events to be less likely perceived as causal. In this study, we examined if a similar negative adaptation effect for perceptual causality occurs in children, both typically developing and with autism. Previous studies have reported diminished adaptation for face identity, facial configuration and gaze direction in children with autism. To test whether diminished adaptive coding extends beyond high-level social stimuli (such as faces) and could be a general property of autistic perception, we developed a child-friendly paradigm for adaptation of perceptual causality. We compared the performance of 22 children with autism with 22 typically developing children, individually matched on age and ability (IQ scores). We found significant and equally robust adaptation aftereffects for perceptual causality in both groups. There were also no differences between the two groups in their attention, as revealed by reaction times and accuracy in a change-detection task. These findings suggest that adaptation to perceptual causality in autism is largely similar to typical development and, further, that diminished adaptive coding might not be a general characteristic of autism at low levels of the perceptual hierarchy, constraining existing theories of adaptation in autism. PMID:25774507

  15. Certifying Auto-Generated Flight Code

    NASA Technical Reports Server (NTRS)

    Denney, Ewen

    2008-01-01

    Model-based design and automated code generation are being used increasingly at NASA. Many NASA projects now use MathWorks Simulink and Real-Time Workshop for at least some of their modeling and code development. However, there are substantial obstacles to more widespread adoption of code generators in safety-critical domains. Since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently the generated code still needs to be fully tested and certified. Moreover, the regeneration of code can require complete recertification, which offsets many of the advantages of using a generator. Indeed, manual review of autocode can be more challenging than for hand-written code. Since the direct V&V of code generators is too laborious and complicated due to their complex (and often proprietary) nature, we have developed a generator plug-in to support the certification of the auto-generated code. Specifically, the AutoCert tool supports certification by formally verifying that the generated code is free of different safety violations, by constructing an independently verifiable certificate, and by explaining its analysis in a textual form suitable for code reviews. The generated documentation also contains substantial tracing information, allowing users to trace between model, code, documentation, and V&V artifacts. This enables missions to obtain assurance about the safety and reliability of the code without excessive manual V&V effort and, as a consequence, eases the acceptance of code generators in safety-critical contexts. The generation of explicit certificates and textual reports is particularly well-suited to supporting independent V&V. The primary contribution of this approach is the combination of human-friendly documentation with formal analysis. The key technical idea is to exploit the idiomatic nature of auto-generated code in order to automatically infer logical annotations. 
The annotation inference algorithm itself is generic, and parametrized with respect to a library of coding patterns that depend on the safety policies and the code generator. The patterns characterize the notions of definitions and uses that are specific to the given safety property. For example, for initialization safety, definitions correspond to variable initializations while uses are statements which read a variable, whereas for array bounds safety, definitions are the array declarations, while uses are statements which access an array variable. The inferred annotations are thus highly dependent on the actual program and the properties being proven. The annotations, themselves, need not be trusted, but are crucial to obtain the automatic formal verification of the safety properties without requiring access to the internals of the code generator. The approach has been applied to both in-house and commercial code generators, but is independent of the particular generator used. It is currently being adapted to flight code generated using MathWorks Real-Time Workshop, an automatic code generator that translates from Simulink/Stateflow models into embedded C code.
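The notion of definitions and uses for initialization safety can be illustrated with a toy checker over straight-line code. This is a deliberately minimal sketch of the safety-policy idea, not AutoCert's annotation inference:

```python
def check_init_before_use(statements):
    """Toy 'definitions and uses' safety check over straight-line code:
    statements are ('def', var) or ('use', var) pairs, and a use of a
    variable with no earlier definition is a violation."""
    defined, violations = set(), []
    for kind, var in statements:
        if kind == 'def':
            defined.add(var)
        elif var not in defined:
            violations.append(var)
    return violations
```

A real verifier works on control-flow graphs and discharges the resulting proof obligations formally, but the defs-before-uses discipline it certifies is exactly this one.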

  16. Constructing Noise-Invariant Representations of Sound in the Auditory Pathway

    PubMed Central

    Rabinowitz, Neil C.; Willmore, Ben D. B.; King, Andrew J.; Schnupp, Jan W. H.

    2013-01-01

    Identifying behaviorally relevant sounds in the presence of background noise is one of the most important and poorly understood challenges faced by the auditory system. An elegant solution to this problem would be for the auditory system to represent sounds in a noise-invariant fashion. Since a major effect of background noise is to alter the statistics of the sounds reaching the ear, noise-invariant representations could be promoted by neurons adapting to stimulus statistics. Here we investigated the extent of neuronal adaptation to the mean and contrast of auditory stimulation as one ascends the auditory pathway. We measured these forms of adaptation by presenting complex synthetic and natural sounds, recording neuronal responses in the inferior colliculus and primary fields of the auditory cortex of anaesthetized ferrets, and comparing these responses with a sophisticated model of the auditory nerve. We find that the strength of both forms of adaptation increases as one ascends the auditory pathway. To investigate whether this adaptation to stimulus statistics contributes to the construction of noise-invariant sound representations, we also presented complex, natural sounds embedded in stationary noise, and used a decoding approach to assess the noise tolerance of the neuronal population code. We find that the code for complex sounds in the periphery is affected more by the addition of noise than the cortical code. We also find that noise tolerance is correlated with adaptation to stimulus statistics, so that populations that show the strongest adaptation to stimulus statistics are also the most noise-tolerant. This suggests that the increase in adaptation to sound statistics from auditory nerve to midbrain to cortex is an important stage in the construction of noise-invariant sound representations in the higher auditory brain. PMID:24265596
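Adaptation to the mean and contrast of the stimulus, as measured here, is often modeled as normalization by running statistics. The sketch below is a toy illustration of that idea (the leaky-integrator update and time constant are assumptions, not the authors' model):

```python
def adaptive_normalize(signal, tau=20.0):
    """Re-express each sample relative to leaky running estimates of the
    stimulus mean and variance (contrast), so the output adapts to the
    recent statistics of the input."""
    a = 1.0 / tau
    mean, var = 0.0, 1.0
    out = []
    for x in signal:
        mean += a * (x - mean)
        var += a * ((x - mean) ** 2 - var)
        out.append((x - mean) / (var ** 0.5 + 1e-9))
    return out
```

A sustained offset in the input initially drives a large response that then decays toward zero as the running statistics catch up, which is the qualitative signature of mean adaptation.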

  17. Inclusion of the fitness sharing technique in an evolutionary algorithm to analyze the fitness landscape of the genetic code adaptability.

    PubMed

    Santos, José; Monteagudo, Ángel

    2017-03-27

    The canonical code, although prevailing in complex genomes, is not universal. The canonical genetic code has been shown to be more robust than random codes, but how it evolved towards its current form is not clearly determined. The error minimization theory considers the minimization of the adverse effects of point mutations as the main selection factor in the evolution of the code. We have used simulated evolution in a computer to search for optimized codes, which helps to obtain information about the level of optimization of the canonical code over its evolution. A genetic algorithm searches for efficient codes in a fitness landscape that corresponds to the adaptability of possible hypothetical genetic codes. The lower the effects of errors or mutations in the codon bases of a hypothetical code, the more efficient or optimal that code is. The inclusion of the fitness sharing technique in the evolutionary algorithm allows the extent to which the canonical genetic code lies in an area corresponding to a deep local minimum to be easily determined, even in the high-dimensional spaces considered. The analyses show that the canonical code is not in a deep local minimum and that the fitness landscape is not a multimodal landscape with deep and separated peaks. Moreover, the canonical code is clearly far away from the areas of higher fitness in the landscape. Given the absence of deep local minima in the landscape, although the code could evolve and different forces could shape its structure, the nature of the fitness landscape considered in the error minimization theory does not explain why the canonical code ended its evolution in a location that is not a localized deep minimum of the huge fitness landscape.
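The fitness sharing technique referred to above divides an individual's raw fitness by a niche count, so crowded regions of the landscape are penalized and the search keeps exploring. A minimal sketch (the sharing radius `sigma`, exponent `alpha`, and distance function are illustrative):

```python
import math

def shared_fitness(population, raw_fitness, sigma=1.0, alpha=1.0,
                   distance=math.dist):
    """Fitness sharing: neighbors closer than sigma contribute
    1 - (d / sigma) ** alpha to an individual's niche count, and the raw
    fitness is divided by that count."""
    shared = []
    for ind, fit in zip(population, raw_fitness):
        niche = 0.0
        for other in population:
            d = distance(ind, other)
            if d < sigma:
                niche += 1.0 - (d / sigma) ** alpha
        shared.append(fit / niche)  # niche >= 1: self always contributes
    return shared
```

Two near-identical individuals end up with roughly half the fitness of an equally fit but isolated one, which is what keeps an evolutionary algorithm from collapsing into a single peak.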

  18. The tactile motion aftereffect suggests an intensive code for speed in neurons sensitive to both speed and direction of motion

    PubMed Central

    Birznieks, I.; Vickery, R. M.; Holcombe, A. O.; Seizova-Cajic, T.

    2016-01-01

    Neurophysiological studies in primates have found that direction-sensitive neurons in the primary somatosensory cortex (SI) generally increase their response rate with increasing speed of object motion across the skin and show little evidence of speed tuning. We employed psychophysics to determine whether human perception of motion direction could be explained by features of such neurons and whether evidence can be found for a speed-tuned process. After adaptation to motion across the skin, a subsequently presented dynamic test stimulus yields an impression of motion in the opposite direction. We measured the strength of this tactile motion aftereffect (tMAE) induced with different combinations of adapting and test speeds. Distal-to-proximal or proximal-to-distal adapting motion was applied to participants' index fingers using a tactile array, after which participants reported the perceived direction of a bidirectional test stimulus. An intensive code for speed, like that observed in SI neurons, predicts greater adaptation (and a stronger tMAE) the faster the adapting speed, regardless of the test speed. In contrast, speed tuning of direction-sensitive neurons predicts the greatest tMAE when the adapting and test stimuli have matching speeds. We found that the strength of the tMAE increased monotonically with adapting speed, regardless of the test speed, showing no evidence of speed tuning. Our data are consistent with neurophysiological findings that suggest an intensive code for speed along the motion processing pathways comprising neurons sensitive both to speed and direction of motion. PMID:26823511

  19. SSPARAMA: A Nonlinear, Wave Optics Multipulse (and CW) Steady-State Propagation Code with Adaptive Coordinates

    DTIC Science & Technology

    1977-02-10

    SSPARAMA: A Nonlinear, Wave Optics Multipulse (and CW) Steady-State Propagation Code with Adaptive Coordinates. K. G. Whitney. [Only the report title and author are recoverable; the remainder of the scanned front matter is illegible OCR residue.]

  20. Self-adaptive multimethod optimization applied to a tailored heating forging process

    NASA Astrophysics Data System (ADS)

    Baldan, M.; Steinberg, T.; Baake, E.

    2018-05-01

    The presented paper describes an innovative self-adaptive multi-objective optimization code. The investigation aims to prove the superiority of this code over NSGA-II and to apply it to the design of an inductor for a "tailored" heating forging application. The choice of the frequency and the heating time is followed by the determination of the number of turns and their positions. Finally, a straightforward optimization is performed in order to minimize energy consumption using "optimal control".
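The non-dominated (Pareto) filtering step that any NSGA-style multi-objective optimizer relies on can be sketched in a few lines (minimization is assumed in every objective, and the sample points in the test are made up):

```python
def dominates(q, p):
    """q dominates p when q is no worse in every objective and strictly
    better in at least one (minimization)."""
    return (all(a <= b for a, b in zip(q, p))
            and any(a < b for a, b in zip(q, p)))

def pareto_front(points):
    """Keep only the non-dominated points: the current Pareto front."""
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

Self-adaptive multi-objective codes differ in how they rank and recombine solutions, but all of them repeatedly extract fronts like this from the population.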

  1. Python/Lua Benchmarks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Busby, L.

    This is an adaptation of the pre-existing Scimark benchmark code to a variety of Python and Lua implementations. It also measures performance of the Fparser expression parser and C and C++ code on a variety of simple scientific expressions.

  2. Model and experiments to optimize co-adaptation in a simplified myoelectric control system.

    PubMed

    Couraud, M; Cattaert, D; Paclet, F; Oudeyer, P Y; de Rugy, A

    2018-04-01

    To compensate for a limb lost in an amputation, myoelectric prostheses use surface electromyography (EMG) from the remaining muscles to control the prosthesis. Despite considerable progress, myoelectric controls remain markedly different from the way we normally control movements, and require intense user adaptation. To overcome this, our goal is to explore concurrent machine co-adaptation techniques that are developed in the field of brain-machine interfaces, and that are beginning to be used in myoelectric controls. We combined a simplified myoelectric control with a perturbation for which human adaptation is well characterized and modeled, in order to explore co-adaptation settings in a principled manner. First, we reproduced results obtained in a classical visuomotor rotation paradigm in our simplified myoelectric context, where we rotate the muscle pulling vectors used to reconstruct wrist force from EMG. Then, a model of human adaptation in response to directional error was used to simulate various co-adaptation settings, where perturbations and machine co-adaptation are both applied on muscle pulling vectors. These simulations established that a relatively low gain of machine co-adaptation that minimizes final errors generates slow and incomplete adaptation, while higher gains increase adaptation rate but also errors by amplifying noise. After experimental verification on real subjects, we tested a variable gain that combines the advantages of both, and implemented it with directionally tuned neurons similar to those used to model human adaptation. This enables machine co-adaptation to locally improve myoelectric control, and to absorb more challenging perturbations. The simplified context used here enabled us to explore co-adaptation settings in both simulations and experiments, and to raise important considerations such as the need for a variable gain encoded locally.
The benefits and limits of extending this approach to more complex and functional myoelectric contexts are discussed.

  3. Making children laugh: parent-child dyadic synchrony and preschool attachment.

    PubMed

    Bureau, Jean-François; Yurkowski, Kim; Schmiedel, Sabrina; Martin, Jodi; Moss, Ellen; Pallanca, Dominique

    2014-01-01

    The current study examined whether dyadic synchrony of father-child and mother-child interactions in a playful context were associated with attachment organization in preschool children. One hundred seven children (48 boys, Mage = 46.67 months, SD = 8.57) and their mothers and fathers (counterbalanced order of lab visits) participated in a playful interaction without toys (Laughing Task procedure). Playful interactions were coded based on the degree to which the dyads demonstrated a variety of behavior representing dyadic synchrony and task management. Children's attachment behavior toward fathers and mothers was observed in a modified separation-reunion procedure adapted for the preschool period. Results demonstrate that mothers and fathers are similar in their effort to arouse and engage their child in a playful context, but mothers achieved a greater synchrony with their child. Disorganized attachment to either mother or father is linked with a lack of synchrony in dyadic interaction. Findings are in contrast with prevailing theory, suggesting that despite gender-related differences in parental playful behaviors, dyadic synchrony is equally important in both mother- and father-child relationships for the development of organized social and affectional bonds. © 2014 Michigan Association for Infant Mental Health.

  4. Adapting Scale for Children: A Practical Model for Researchers

    ERIC Educational Resources Information Center

    Aydin, Selami; Harputlu, Leyla; Çelik, Seyda Savran; Ustuk, Özgehan; Güzel, Serhat; Genç, Deniz

    2016-01-01

    Measurement of children's behaviors in an educational and research context is a problematic and complex area. It is also evident that adapting scales to measure children's behaviors in an educational and research context is a complex process for several reasons. First, cultural elements constitute a considerable problem. Second, it is difficult…

  5. Implementing Self-Management within a Group Counseling Context: Effects on Academic Enabling Behaviors

    ERIC Educational Resources Information Center

    Briesch DuBois, Jacquelyn M.; Briesch, Amy M.; Hoffman, Jessica A.; Struzziero, Joan; Toback, Robin

    2017-01-01

    Self-management interventions have been adapted to serve as targeted interventions to increase academic enabling behaviors in groups of students. However, a trade-off exists between adapting these interventions to feasibly fit group contexts and maintaining theoretical intervention components. This study examines the use of self-management within…

  6. Adaptive governance of riverine and wetland ecosystem goods and services

    EPA Science Inventory

    Adaptive governance and adaptive management have developed over the past quarter century in response to institutional and organizational failures, and unforeseen changes in natural resource dynamics. Adaptive governance provides a context for managing known and unknown consequenc...

  7. The advantages and limitations of guideline adaptation frameworks.

    PubMed

    Wang, Zhicheng; Norris, Susan L; Bero, Lisa

    2018-05-29

    The implementation of evidence-based guidelines can improve clinical and public health outcomes by helping health professionals practice in the most effective manner, as well as assisting policy-makers in designing optimal programs. Adaptation of a guideline to suit the context in which it is intended to be applied can be a key step in the implementation process. Without taking the local context into account, certain interventions recommended in evidence-based guidelines may be infeasible under local conditions. Guideline adaptation frameworks provide a systematic way of approaching adaptation, and their use may increase transparency, methodological rigor, and the quality of the adapted guideline. This paper presents a number of adaptation frameworks that are currently available. We aim to compare the advantages and limitations of their processes, methods, and resource implications. These insights into adaptation frameworks can inform the future development of guidelines and systematic methods to optimize their adaptation. Recent adaptation frameworks show an evolution from adapting entire existing guidelines, to adapting specific recommendations extracted from an existing guideline, to constructing evidence tables for each recommendation that needs to be adapted. This is a move towards more recommendation-focused, context-specific processes and considerations. There are still many gaps in knowledge about guideline adaptation. Most of the frameworks reviewed lack any evaluation of the adaptation process and outcomes, including user satisfaction and resources expended. The validity, usability, and health impact of guidelines developed via an adaptation process have not been studied. Lastly, adaptation frameworks have not been evaluated for use in low-income countries. 
Despite the limitations in frameworks, a more systematic approach to adaptation based on a framework is valuable, as it helps to ensure that the recommendations stay true to the evidence while taking local needs into account. The utilization of frameworks in the guideline implementation process can be optimized by increasing the understanding and upfront estimation of resource and time needed, capacity building in adaptation methods, and increasing the adaptability of the source recommendation document.

  8. Modeling Adaptive Educational Methods with IMS Learning Design

    ERIC Educational Resources Information Center

    Specht, Marcus; Burgos, Daniel

    2007-01-01

    The paper describes a classification system for adaptive methods developed in the area of adaptive educational hypermedia based on four dimensions: What components of the educational system are adapted? To what features of the user and the current context does the system adapt? Why does the system adapt? How does the system get the necessary…

  9. In Their Own Words: Teachers' Reflections on Adaptability

    ERIC Educational Resources Information Center

    Vaughn, Margaret; Parsons, Seth A.; Burrowbridge, Sarah Cohen; Weesner, Janice; Taylor, Laurel

    2016-01-01

    Current research explores adaptability by gathering teachers' reflections on their adaptations. However, the field knows little of what the term "adaptability" means to teachers who currently teach in today's educational context. In this article, adaptability is discussed from the perspectives of 3 practicing classroom educators,…

  10. Design of ACM system based on non-greedy punctured LDPC codes

    NASA Astrophysics Data System (ADS)

    Lu, Zijun; Jiang, Zihong; Zhou, Lin; He, Yucheng

    2017-08-01

    In this paper, an adaptive coded modulation (ACM) scheme based on rate-compatible LDPC (RC-LDPC) codes was designed. The RC-LDPC codes were constructed by a non-greedy puncturing method that shows good performance in the high-code-rate region. Moreover, an incremental redundancy scheme for the LDPC-based ACM system over the AWGN channel was proposed. Under this scheme, code rates vary from 2/3 to 5/6 and the complexity of the ACM system is reduced. Simulations show that the proposed ACM system achieves increasingly significant coding gains as throughput grows.
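
    The rate-switching logic of such an ACM system can be sketched in a few lines: the transmitter estimates the channel SNR and picks the least-redundant code whose decoding threshold is met. The code rates below are those named in the abstract (2/3 to 5/6); the SNR thresholds are hypothetical placeholders, not values from the paper.

```python
from fractions import Fraction

# (code rate, minimum SNR in dB for reliable decoding); the thresholds
# are hypothetical placeholders, sorted from the highest rate down.
RATE_TABLE = [
    (Fraction(5, 6), 4.5),
    (Fraction(4, 5), 4.0),
    (Fraction(3, 4), 3.2),
    (Fraction(2, 3), 2.5),
]

def select_rate(snr_db):
    """Pick the least-redundant punctured code the channel can support."""
    for rate, threshold in RATE_TABLE:
        if snr_db >= threshold:
            return rate
    return None  # channel too poor even for the most robust rate

assert select_rate(5.0) == Fraction(5, 6)  # good channel: highest rate
assert select_rate(2.8) == Fraction(2, 3)  # poor channel: most redundancy
assert select_rate(1.0) is None
```

    In a real system the thresholds would come from simulated or measured decoder performance curves for each punctured code.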

  11. Information theory of adaptation in neurons, behavior, and mood.

    PubMed

    Sharpee, Tatyana O; Calhoun, Adam J; Chalasani, Sreekanth H

    2014-04-01

    The ability to make accurate predictions of future stimuli and of the consequences of one's actions is crucial for survival and appropriate decision-making. These predictions are constantly being made at different levels of the nervous system. This is evidenced by adaptation to stimulus parameters in sensory coding, and by learning of an up-to-date model of the environment at the behavioral level. This review will discuss recent findings that actions of neurons and animals are selected based on detailed stimulus history in such a way as to maximize information for achieving the task at hand. Information maximization dictates not only how sensory coding should adapt to various statistical aspects of stimuli, but also that reward function should adapt to match the predictive information from past to future. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. CRASH: A BLOCK-ADAPTIVE-MESH CODE FOR RADIATIVE SHOCK HYDRODYNAMICS-IMPLEMENTATION AND VERIFICATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van der Holst, B.; Toth, G.; Sokolov, I. V.

    We describe the Center for Radiative Shock Hydrodynamics (CRASH) code, a block-adaptive-mesh code for multi-material radiation hydrodynamics. The implementation solves the radiation diffusion model with a gray or multi-group method and uses a flux-limited diffusion approximation to recover the free-streaming limit. Electrons and ions are allowed to have different temperatures and we include flux-limited electron heat conduction. The radiation hydrodynamic equations are solved in the Eulerian frame by means of a conservative finite-volume discretization in either one-, two-, or three-dimensional slab geometry or in two-dimensional cylindrical symmetry. An operator-split method is used to solve these equations in three substeps: (1) an explicit step of a shock-capturing hydrodynamic solver; (2) a linear advection of the radiation in frequency-logarithm space; and (3) an implicit solution of the stiff radiation diffusion, heat conduction, and energy exchange. We present a suite of verification test problems to demonstrate the accuracy and performance of the algorithms. The applications are for astrophysics and laboratory astrophysics. The CRASH code is an extension of the Block-Adaptive Tree Solarwind Roe Upwind Scheme (BATS-R-US) code with a new radiation transfer and heat conduction library and equation-of-state and multi-group opacity solvers. Both CRASH and BATS-R-US are part of the publicly available Space Weather Modeling Framework.

  13. Read-Write-Codes: An Erasure Resilient Encoding System for Flexible Reading and Writing in Storage Networks

    NASA Astrophysics Data System (ADS)

    Mense, Mario; Schindelhauer, Christian

    We introduce the Read-Write-Coding-System (RWC) - a very flexible class of linear block codes that generate efficient and flexible erasure codes for storage networks. In particular, given a message x of k symbols and a codeword y of n symbols, an RW code defines additional parameters k ≤ r,w ≤ n that offer enhanced possibilities to adjust the fault-tolerance capability of the code. More precisely, an RWC provides linear (n, k, d)-codes that have (a) minimum distance d = n - r + 1 for any two codewords, and (b) for each codeword there exists a codeword for each other message with distance of at most w. Furthermore, depending on the values r,w and the code alphabet, different block codes such as parity codes (e.g. RAID 4/5) or Reed-Solomon (RS) codes (if r = k and thus, w = n) can be generated. In storage networks in which I/O accesses are very costly and redundancy is crucial, this flexibility has considerable advantages as r and w can optimally be adapted to read or write intensive applications; only w symbols must be updated if the message x changes completely, which differs from other codes that always need to rewrite y completely as x changes. In this paper, we first state a tight lower bound and basic conditions for all RW codes. Furthermore, we introduce special RW codes in which all mentioned parameters are adjustable even online, that is, those RW codes are adaptive to changing demands. Finally, we point out some useful properties regarding safety and security of the stored data.
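
    The simplest member of the code family mentioned above is the single-parity (RAID-4/5-style) block code: k data blocks are extended with one XOR parity block, and any single erased block can be rebuilt from the survivors. A minimal sketch from first principles, not the paper's RWC construction:

```python
from functools import reduce

def xor_blocks(blocks):
    """Byte-wise XOR of equally sized blocks."""
    return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*blocks))

def encode(data_blocks):
    """n = k + 1 codeword: the data blocks plus their XOR parity."""
    return list(data_blocks) + [xor_blocks(data_blocks)]

def recover(codeword, lost_index):
    """Rebuild the single erased block from the n - 1 surviving blocks."""
    survivors = [b for i, b in enumerate(codeword) if i != lost_index]
    return xor_blocks(survivors)

data = [b"abcd", b"efgh", b"ijkl"]          # k = 3 data blocks
codeword = encode(data)                      # n = 4 blocks on 4 "disks"
assert recover(codeword, 1) == b"efgh"       # lost a data block
assert recover(codeword, 3) == codeword[3]   # lost the parity block
```

    The paper's general construction additionally tunes r and w; the parity code is the extreme where a full rewrite touches all n blocks.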

  14. Real-time range acquisition by adaptive structured light.

    PubMed

    Koninckx, Thomas P; Van Gool, Luc

    2006-03-01

    The goal of this paper is to provide a "self-adaptive" system for real-time range acquisition. Reconstructions are based on a single frame structured light illumination. Instead of using generic, static coding that is supposed to work under all circumstances, system adaptation is proposed. This occurs on-the-fly and renders the system more robust against instant scene variability and creates suitable patterns at startup. A continuous trade-off between speed and quality is made. A weighted combination of different coding cues--based upon pattern color, geometry, and tracking--yields a robust way to solve the correspondence problem. The individual coding cues are automatically adapted within a considered family of patterns. The weights to combine them are based on the average consistency with the result within a small time-window. The integration itself is done by reformulating the problem as a graph cut. Also, the camera-projector configuration is taken into account for generating the projection patterns. The correctness of the range maps is not guaranteed, but an estimation of the uncertainty is provided for each part of the reconstruction. Our prototype is implemented using unmodified consumer hardware only and, therefore, is cheap. Frame rates vary between 10 and 25 fps, dependent on scene complexity.

  15. Coestimation of recombination, substitution and molecular adaptation rates by approximate Bayesian computation.

    PubMed

    Lopes, J S; Arenas, M; Posada, D; Beaumont, M A

    2014-03-01

    The estimation of parameters in molecular evolution may be biased when some processes are not considered. For example, the estimation of selection at the molecular level using codon-substitution models can have an upward bias when recombination is ignored. Here we address the joint estimation of recombination, molecular adaptation and substitution rates from coding sequences using approximate Bayesian computation (ABC). We describe the implementation of a regression-based strategy for choosing subsets of summary statistics for coding data, and show that this approach can accurately infer recombination allowing for intracodon recombination breakpoints, molecular adaptation and codon substitution rates. We demonstrate that our ABC approach can outperform other analytical methods under a variety of evolutionary scenarios. We also show that although the choice of the codon-substitution model is important, our inferences are robust to a moderate degree of model misspecification. In addition, we demonstrate that our approach can accurately choose the evolutionary model that best fits the data, providing an alternative for when the use of full-likelihood methods is impracticable. Finally, we applied our ABC method to co-estimate recombination, substitution and molecular adaptation rates from 24 published human immunodeficiency virus 1 coding data sets.
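
    The core rejection-ABC loop is easy to sketch: draw parameters from a prior, simulate data, and keep draws whose summary statistic lands close to the observed one. The toy below replaces the paper's codon-substitution simulations and sequence summaries with a one-parameter exponential model and a sample-mean summary; all specifics are invented for illustration.

```python
import random

random.seed(7)
TRUE_RATE = 2.0

def summary(rate, n=100):
    """Toy generative model: sample mean of n exponential(rate) draws."""
    return sum(random.expovariate(rate) for _ in range(n)) / n

observed = summary(TRUE_RATE)            # pretend this is the real data

accepted = []
for _ in range(3000):
    candidate = random.uniform(0.1, 5.0)           # draw from a flat prior
    if abs(summary(candidate) - observed) < 0.04:  # keep close simulations
        accepted.append(candidate)

assert accepted                                # some draws were kept
posterior_mean = sum(accepted) / len(accepted)
assert abs(posterior_mean - TRUE_RATE) < 0.6   # crude, but near the truth
```

    The regression-based summary-statistic selection described in the abstract addresses the hard part this toy glosses over: choosing which summaries make the acceptance step informative for multi-parameter models.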

  16. A spatially adaptive spectral re-ordering technique for lossless coding of hyper-spectral images

    NASA Technical Reports Server (NTRS)

    Memon, Nasir D.; Galatsanos, Nikolas

    1995-01-01

    In this paper, we propose a new approach, applicable to lossless compression of hyper-spectral images, that alleviates some limitations of linear prediction as applied to this problem. According to this approach, an adaptive re-ordering of the spectral components of each pixel is performed prior to prediction and encoding. This re-ordering adaptively exploits, on a pixel-by-pixel basis, the presence of inter-band correlations for prediction. Furthermore, the proposed approach takes advantage of spatial correlations, and does not introduce any coding overhead to transmit the order of the spectral bands. This is accomplished by using the assumption that two spatially adjacent pixels are expected to have similar spectral relationships. We thus have a simple technique to exploit spectral and spatial correlations in hyper-spectral data sets, leading to compression performance improvements as compared to our previously reported techniques for lossless compression. We also look at some simple error modeling techniques for further exploiting any structure that remains in the prediction residuals prior to entropy coding.
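
    The central trick, deriving the band ordering from an already-decoded neighboring pixel so that no ordering needs to be transmitted, can be illustrated with a toy two-pixel sketch (band values and the specific ordering rule are invented here for illustration):

```python
def band_order(spectrum):
    """Order bands by increasing value. Coder and decoder both compute
    this from the previous, already-decoded pixel, so no side information
    about the ordering is ever transmitted."""
    return sorted(range(len(spectrum)), key=lambda i: spectrum[i])

def encode_pixel(prev_pixel, cur_pixel):
    """Visit the current pixel's bands in the neighbor-derived order,
    predicting each band from the previously visited band's value."""
    residuals = [0] * len(cur_pixel)
    prediction = 0                        # first visited band predicts 0
    for k in band_order(prev_pixel):
        residuals[k] = cur_pixel[k] - prediction
        prediction = cur_pixel[k]
    return residuals

def decode_pixel(prev_pixel, residuals):
    pixel = [0] * len(residuals)
    prediction = 0
    for k in band_order(prev_pixel):      # decoder re-derives the order
        pixel[k] = residuals[k] + prediction
        prediction = pixel[k]
    return pixel

prev_px = [40, 10, 90, 55]                # neighbor's decoded spectrum
cur_px = [42, 12, 88, 57]                 # spectrally similar pixel
res = encode_pixel(prev_px, cur_px)
assert decode_pixel(prev_px, res) == cur_px   # prediction loop is lossless
```

    Because adjacent pixels tend to share spectral relationships, the order derived from the neighbor tends to visit the current pixel's bands in near-monotone order, keeping residuals small for the entropy coder.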

  17. The design of an adaptive predictive coder using a single-chip digital signal processor

    NASA Astrophysics Data System (ADS)

    Randolph, M. A.

    1985-01-01

    A speech coding processor architecture design study has been performed in which the Texas Instruments TMS32010 was selected from among three commercially available digital signal processing integrated circuits and evaluated in an implementation study of real-time Adaptive Predictive Coding (APC). The TMS32010 was compared with the AT&T Bell Laboratories DSP I and the Nippon Electric Co. µPD7720 and was found to be most suitable for a single-chip implementation of APC. A preliminary system design based on the TMS32010 has been carried out, and several of the hardware and software design issues are discussed. Particular attention was paid to the design of an external memory controller which permits rapid sequential access of external RAM. As a result, it has been determined that a compact hardware implementation of the APC algorithm is feasible based on the TMS32010. Originator-supplied keywords include: vocoders, speech compression, adaptive predictive coding, digital signal processing microcomputers, speech processor architectures, and special purpose processor.

  18. Distribution of cold adaptation proteins in microbial mats in Lake Joyce, Antarctica: Analysis of metagenomic data by using two bioinformatics tools.

    PubMed

    Koo, Hyunmin; Hakim, Joseph A; Fisher, Phillip R E; Grueneberg, Alexander; Andersen, Dale T; Bej, Asim K

    2016-01-01

    In this study, we report the distribution and abundance of cold-adaptation proteins in microbial mat communities in the perennially ice-covered Lake Joyce, located in the McMurdo Dry Valleys, Antarctica. We have used MG-RAST and R code bioinformatics tools on Illumina HiSeq2000 shotgun metagenomic data and compared the filtering efficacy of these two methods on cold-adaptation proteins. Overall, the abundance of cold-shock DEAD-box protein A (CSDA), antifreeze proteins (AFPs), fatty acid desaturase (FAD), trehalose synthase (TS), and cold-shock family of proteins (CSPs) were present in all mat samples at high, moderate, or low levels, whereas the ice nucleation protein (INP) was present only in the ice and bulbous mat samples at insignificant levels. Considering the near homogeneous temperature profile of Lake Joyce (0.08-0.29 °C), the distribution and abundance of these proteins across various mat samples predictively correlated with known functional attributes necessary for microbial communities to thrive in this ecosystem. The comparison of the MG-RAST and the R code methods showed dissimilar occurrences of the cold-adaptation protein sequences, though with insignificant ANOSIM (R = 0.357; p-value = 0.012), ADONIS (R(2) = 0.274; p-value = 0.03) and STAMP (p-values = 0.521-0.984) statistical analyses. Furthermore, filtering targeted sequences using the R code accounted for taxonomic groups by avoiding sequence redundancies, whereas the MG-RAST provided total counts resulting in a higher sequence output. The results from this study revealed for the first time the distribution of cold-adaptation proteins in six different types of microbial mats in Lake Joyce, while suggesting a simpler and more manageable user-defined method of R code, as compared to a web-based MG-RAST pipeline.

  19. Learning of spatio-temporal codes in a coupled oscillator system.

    PubMed

    Orosz, Gábor; Ashwin, Peter; Townley, Stuart

    2009-07-01

    In this paper, we consider a learning strategy that allows one to transmit information between two coupled phase oscillator systems (called teaching and learning systems) via frequency adaptation. The dynamics of these systems can be modeled with reference to a number of partially synchronized cluster states and transitions between them. Forcing the teaching system by steady but spatially nonhomogeneous inputs produces cyclic sequences of transitions between the cluster states, that is, information about inputs is encoded via a "winnerless competition" process into spatio-temporal codes. The large variety of codes can be learned by the learning system that adapts its frequencies to those of the teaching system. We visualize the dynamics using "weighted order parameters (WOPs)" that are analogous to "local field potentials" in neural systems. Since spatio-temporal coding is a mechanism that appears in olfactory systems, the developed learning rules may help to extract information from these neural ensembles.
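
    The frequency-adaptation idea can be illustrated with the smallest possible case: one teaching and one learning phase oscillator with Kuramoto-style coupling, where the learner slowly adjusts its natural frequency toward the teacher's. Gains, step sizes, and the specific adaptation rule are invented for illustration, not taken from the paper.

```python
import math

dt, K, eps = 0.01, 2.0, 0.5        # time step, phase coupling, learning gain
w_teach, w_learn = 3.0, 1.0        # natural frequencies (rad/s)
theta_t = theta_l = 0.0            # oscillator phases

for _ in range(200_000):           # 2000 s of simulated time
    phase_err = math.sin(theta_t - theta_l)
    theta_t += w_teach * dt                     # teacher runs freely
    theta_l += (w_learn + K * phase_err) * dt   # Kuramoto phase coupling
    w_learn += eps * phase_err * dt             # slow frequency adaptation

assert abs(w_learn - w_teach) < 0.05   # learner acquired teacher's frequency
```

    In the paper's setting many such oscillators interact, and the information lies in which partially synchronized cluster states the network visits; the two-oscillator sketch only shows the frequency-learning mechanism itself.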

  20. Advanced propeller noise prediction in the time domain

    NASA Technical Reports Server (NTRS)

    Farassat, F.; Dunn, M. H.; Spence, P. L.

    1992-01-01

    The time domain code ASSPIN gives acousticians a powerful technique for advanced propeller noise prediction. Except for nonlinear effects, the code uses exact solutions of the Ffowcs Williams-Hawkings equation with exact blade geometry and kinematics. The inclusion of nonaxial inflow, periodic loading noise, and adaptive time steps to accelerate execution completes the development of this code.

  1. Generating code adapted for interlinking legacy scalar code and extended vector code

    DOEpatents

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  2. Comparing AMR and SPH Cosmological Simulations. I. Dark Matter and Adiabatic Simulations

    NASA Astrophysics Data System (ADS)

    O'Shea, Brian W.; Nagamine, Kentaro; Springel, Volker; Hernquist, Lars; Norman, Michael L.

    2005-09-01

    We compare two cosmological hydrodynamic simulation codes in the context of hierarchical galaxy formation: the Lagrangian smoothed particle hydrodynamics (SPH) code GADGET, and the Eulerian adaptive mesh refinement (AMR) code Enzo. Both codes represent dark matter with the N-body method but use different gravity solvers and fundamentally different approaches for baryonic hydrodynamics. The SPH method in GADGET uses a recently developed ``entropy conserving'' formulation of SPH, while for the mesh-based Enzo two different formulations of Eulerian hydrodynamics are employed: the piecewise parabolic method (PPM) extended with a dual energy formulation for cosmology, and the artificial viscosity-based scheme used in the magnetohydrodynamics code ZEUS. In this paper we focus on a comparison of cosmological simulations that follow either only dark matter, or also a nonradiative (``adiabatic'') hydrodynamic gaseous component. We perform multiple simulations using both codes with varying spatial and mass resolution with identical initial conditions. The dark matter-only runs agree generally quite well provided Enzo is run with a comparatively fine root grid and a low overdensity threshold for mesh refinement, otherwise the abundance of low-mass halos is suppressed. This can be readily understood as a consequence of the hierarchical particle-mesh algorithm used by Enzo to compute gravitational forces, which tends to deliver lower force resolution than the tree-algorithm of GADGET at early times before any adaptive mesh refinement takes place. At comparable force resolution we find that the latter offers substantially better performance and lower memory consumption than the present gravity solver in Enzo. In simulations that include adiabatic gasdynamics we find general agreement in the distribution functions of temperature, entropy, and density for gas of moderate to high overdensity, as found inside dark matter halos. 
However, there are also some significant differences in the same quantities for gas of lower overdensity. For example, at z=3 the fraction of cosmic gas that has temperature logT>0.5 is ~80% for both Enzo ZEUS and GADGET, while it is 40%-60% for Enzo PPM. We argue that these discrepancies are due to differences in the shock-capturing abilities of the different methods. In particular, we find that the ZEUS implementation of artificial viscosity in Enzo leads to some unphysical heating at early times in preshock regions. While this is apparently a significantly weaker effect in GADGET, its use of an artificial viscosity technique may also make it prone to some excess generation of entropy that should be absent in Enzo PPM. Overall, the hydrodynamical results for GADGET are bracketed by those for Enzo ZEUS and Enzo PPM but are closer to Enzo ZEUS.

  3. Selecting Effective Means to Any End: Futures and Ethics of Persuasion Profiling

    NASA Astrophysics Data System (ADS)

    Kaptein, Maurits; Eckles, Dean

    Interactive persuasive technologies can and do adapt to individuals. Existing systems identify and adapt to user preferences within a specific domain: e.g., a music recommender system adapts its recommended songs to user preferences. This paper is concerned with adaptive persuasive systems that adapt to individual differences in the effectiveness of particular means, rather than selecting different ends. We give special attention to systems that implement persuasion profiling - adapting to individual differences in the effects of influence strategies. We argue that these systems are worth separate consideration and raise unique ethical issues for two reasons: (1) their end-independence implies that systems trained in one context can be used in other, unexpected contexts and (2) they do not rely on - and are generally disadvantaged by - disclosing that they are adapting to individual differences. We use examples of these systems to illustrate some ethically and practically challenging futures that these characteristics make possible.

  4. A Critical Reflection on Codes of Conduct in Vocational Education

    ERIC Educational Resources Information Center

    Bagnall, Richard G.; Nakar, Sonal

    2018-01-01

    The contemporary cultural context may be seen as presenting a moral void in vocational education, sanctioning the ascendency of instrumental epistemology and a proliferation of codes of conduct, to which workplace actions are expected to conform. Important among the purposes of such codes is that of encouraging ethical conduct, but, true to their…

  5. Codes, Code-Switching, and Context: Style and Footing in Peer Group Bilingual Play

    ERIC Educational Resources Information Center

    Kyratzis, Amy; Tang, Ya-Ting; Koymen, S. Bahar

    2009-01-01

    According to Bernstein (A sociolinguistic approach to socialization; with some reference to educability, Basil Blackwell Ltd., 1972), middle-class parents transmit an elaborated code to their children that relies on verbal means, rather than paralinguistic devices or shared assumptions, to express meanings. Bernstein's ideas were used to argue…

  6. Reed-Solomon Codes and the Deep Hole Problem

    NASA Astrophysics Data System (ADS)

    Keti, Matt

    In many types of modern communication, a message is transmitted over a noisy medium. When this is done, there is a chance that the message will be corrupted. An error-correcting code adds redundant information to the message which allows the receiver to detect and correct errors accrued during the transmission. We will study the famous Reed-Solomon code (found in QR codes, compact discs, deep space probes, …) and investigate the limits of its error-correcting capacity. It can be shown that understanding this is related to understanding the "deep hole" problem, which is a question of determining when a received message has, in a sense, incurred the worst possible corruption. We partially resolve this in its traditional context, when the code is based on the finite field F_q or F_q*, as well as new contexts, when it is based on a subgroup of F_q* or the image of a Dickson polynomial. This is a new and important problem that could give insight on the true error-correcting potential of the Reed-Solomon code.
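
    The evaluation-code view of Reed-Solomon underlying this discussion can be made concrete over a toy prime field: a k-symbol message defines a polynomial of degree < k, the codeword is its values at n distinct field points, and any k clean coordinates determine all the rest. A minimal sketch (field size and symbols chosen only for illustration):

```python
P = 13                                 # toy prime field F_13 = Z/13Z

def rs_encode(message, points):
    """Codeword = values of the message polynomial at distinct points."""
    def poly_eval(x):
        acc = 0
        for c in reversed(message):    # Horner's rule, mod P
            acc = (acc * x + c) % P
        return acc
    return [poly_eval(x) for x in points]

def lagrange_eval(pairs, x):
    """Evaluate at x the unique degree-<k polynomial through k pairs."""
    total = 0
    for i, (xi, yi) in enumerate(pairs):
        num = den = 1
        for j, (xj, _) in enumerate(pairs):
            if j != i:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P  # Fermat inverse
    return total

msg = [5, 11, 3]                       # k = 3 message symbols
pts = list(range(7))                   # n = 7 evaluation points
cw = rs_encode(msg, pts)
kept = [(1, cw[1]), (4, cw[4]), (6, cw[6])]   # any k surviving coordinates
for x in pts:
    assert lagrange_eval(kept, x) == cw[x]    # erased symbols recovered
```

    Correcting substitution errors (rather than known erasures) needs more machinery, and the deep hole problem asks which received words lie as far as possible from every such codeword.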

  7. Leadership for Coping with and Adapting to Policy Change in Deprived Contexts: Lessons from School Principals

    ERIC Educational Resources Information Center

    Bhengu, Thamsanqa Thulani; Myende, Phumlani Erasmus

    2016-01-01

    This paper explores what, from school principals' perspectives, constitutes leadership for coping with and adapting to policy change within deprived school contexts. Using qualitative interpretive research, we drew from the practices of five principals that were purposively selected from a broader study, which focused on school principals'…

  8. Career Adaptability, Hope, Optimism, and Life Satisfaction in Italian and Swiss Adolescents

    ERIC Educational Resources Information Center

    Santilli, Sara; Marcionetti, Jenny; Rochat, Shékina; Rossier, Jérôme; Nota, Laura

    2017-01-01

    The consequences of economic crisis are different from one European context to the other. Based on life design (LD) approach, the present study focused on two variables--career adaptability and a positive orientation toward future (hope and optimism)--relevant to coping with the current work context and their role in affecting life satisfaction. A…

  9. Rapid perceptual adaptation to high gravitoinertial force levels Evidence for context-specific adaptation

    NASA Technical Reports Server (NTRS)

    Lackner, J. R.; Graybiel, A.

    1982-01-01

    Subjects exposed to periodic variations in gravitoinertial force (2-G peak) in parabolic flight maneuvers quickly come to perceive the peak force level as having decreased in intensity. By the end of a 40-parabola flight, the decrease in apparent force is approximately 40%. On successive flight days, the apparent intensity of the force loads seems to decrease as well, indicating a cumulative adaptive effect. None of the subjects reported feeling abnormally 'light' for more than a minute or two after return to 1-G background force levels. The pattern of findings suggests a context-specific adaptation to high-force levels.

  10. A lncRNA Perspective into (Re)Building the Heart.

    PubMed

    Frank, Stefan; Aguirre, Aitor; Hescheler, Juergen; Kurian, Leo

    2016-01-01

    Our conception of the human genome, long focused on the 2% that codes for proteins, has profoundly changed since its first draft assembly in 2001. Since then, an unexpectedly expansive functionality and complexity has been attributed to the majority of the genome that is transcribed in a cell-type/context-specific manner into transcripts with no apparent protein-coding ability. While the majority of these transcripts, currently annotated as long non-coding RNAs (lncRNAs), are functionally uncharacterized, their prominent role in embryonic development and tissue homeostasis, especially in the context of the heart, is emerging. In this review, we summarize and discuss the latest advances in understanding the relevance of lncRNAs in (re)building the heart.

  11. A neural mechanism of dynamic gating of task-relevant information by top-down influence in primary visual cortex.

    PubMed

    Kamiyama, Akikazu; Fujita, Kazuhisa; Kashimori, Yoshiki

    2016-12-01

    Visual recognition involves bidirectional information flow, consisting of bottom-up information coding from the retina and top-down information coding from higher visual areas. Recent studies have demonstrated the involvement of early visual areas such as the primary visual area (V1) in recognition and memory formation. V1 neurons are not passive transformers of sensory inputs but work as adaptive processors, changing their function according to behavioral context. Top-down signals affect the tuning properties of V1 neurons and contribute to the gating of sensory information relevant to behavior. However, little is known about the neuronal mechanism underlying the gating of task-relevant information in V1. To address this issue, we focus on task-dependent tuning modulations of V1 neurons in two perceptual learning tasks. We develop a model of V1 that receives feedforward input from the lateral geniculate nucleus and top-down input from a higher visual area. We show here that a change in the balance between excitation and inhibition in V1 connectivity is necessary for gating task-relevant information in V1. The balance change accounts well for the modulations of the tuning characteristics and temporal properties of V1 neuronal responses. We also show that the balance change in V1 connectivity is shaped by top-down signals with temporal correlations reflecting the perceptual strategies of the two tasks. We propose a learning mechanism by which the synaptic balance is modulated. To conclude, top-down signals change the synaptic balance between excitation and inhibition in V1 connectivity, enabling an early visual area such as V1 to gate context-dependent information across multiple task demands. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
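
As an illustration of the excitation/inhibition-balance idea (a minimal sketch, not the authors' model), the following toy contrasts a narrowly tuned excitatory input with broad inhibition whose weight stands in for the top-down signal; all tuning widths and weights are invented for the example:

```python
import numpy as np

def gaussian_tuning(theta, pref, width):
    return np.exp(-0.5 * ((theta - pref) / width) ** 2)

def v1_response(theta, pref, w_inh):
    # Excitation is narrowly tuned, inhibition broadly tuned; the
    # top-down signal is assumed to scale the inhibitory weight w_inh.
    exc = gaussian_tuning(theta, pref, width=15.0)
    inh = gaussian_tuning(theta, pref, width=40.0)
    return np.maximum(0.0, exc - w_inh * inh)

theta = np.linspace(-90, 90, 361)
weak = v1_response(theta, 0.0, w_inh=0.2)    # weak top-down inhibition
strong = v1_response(theta, 0.0, w_inh=0.6)  # strong top-down inhibition

def halfwidth(r, theta):
    above = theta[r >= 0.5 * r.max()]
    return above.max() - above.min()

# Shifting the balance toward inhibition narrows the tuning curve,
# i.e. the neuron becomes more selective for the task-relevant feature.
print(halfwidth(weak, theta) > halfwidth(strong, theta))  # True
```

This reproduces only the qualitative effect (sharper tuning under stronger inhibition), not the temporal dynamics or learning rule discussed in the paper.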

  12. Context-dependent miR-204 and miR-211 affect the biological properties of amelanotic and melanotic melanoma cells

    PubMed Central

    Vitiello, Marianna; Tuccoli, Andrea; D’Aurizio, Romina; Sarti, Samanta; Giannecchini, Laura; Lubrano, Simone; Marranci, Andrea; Evangelista, Monica; Peppicelli, Silvia; Ippolito, Chiara; Barravecchia, Ivana; Guzzolino, Elena; Montagnani, Valentina; Gowen, Michael; Mercoledi, Elisa; Mercatanti, Alberto; Comelli, Laura; Gurrieri, Salvatore; Wu, Lawrence W.; Ope, Omotayo; Flaherty, Keith; Boland, Genevieve M.; Hammond, Marc R.; Kwong, Lawrence; Chiariello, Mario; Stecca, Barbara; Zhang, Gao; Salvetti, Alessandra; Angeloni, Debora; Pitto, Letizia; Calorini, Lido; Chiorino, Giovanna; Pellegrini, Marco; Herlyn, Meenhard; Osman, Iman; Poliseno, Laura

    2017-01-01

    Despite increasing amounts of experimental evidence depicting the involvement of non-coding RNAs in cancer, the study of BRAFV600E-regulated genes has thus far focused mainly on protein-coding ones. Here, we identify and study the microRNAs that BRAFV600E regulates through the ERK pathway. By performing small RNA sequencing on A375 melanoma cells and a vemurafenib-resistant clone that was taken as negative control, we discover miR-204 and miR-211 as the miRNAs most induced by vemurafenib. We also demonstrate that, although belonging to the same family, these two miRNAs have distinctive features. miR-204 is under the control of STAT3 and its expression is induced in amelanotic melanoma cells, where it acts as an effector of vemurafenib's anti-motility activity by targeting AP1S2. Conversely, miR-211, a known transcriptional target of MITF, is induced in melanotic melanoma cells, where it targets EDEM1 and consequently impairs the degradation of TYROSINASE (TYR) through the ER-associated degradation (ERAD) pathway. In doing so, miR-211 serves as an effector of vemurafenib's pro-pigmentation activity. We also show that such an increase in pigmentation in turn represents an adaptive response that needs to be overcome using appropriate inhibitors in order to increase the efficacy of vemurafenib. In summary, we unveil the distinct and context-dependent activities exerted by miR-204 family members in melanoma cells. Our work challenges the widely accepted “same miRNA family = same function” rule and provides a rationale for a novel treatment strategy for melanotic melanomas that is based on the combination of ERK pathway inhibitors with pigmentation inhibitors. PMID:28445987

  13. Enhanced Sensitivity to Rapid Input Fluctuations by Nonlinear Threshold Dynamics in Neocortical Pyramidal Neurons.

    PubMed

    Mensi, Skander; Hagens, Olivier; Gerstner, Wulfram; Pozzorini, Christian

    2016-02-01

    The way in which single neurons transform input into output spike trains has fundamental consequences for network coding. Theories and modeling studies based on standard Integrate-and-Fire models implicitly assume that, in response to increasingly strong inputs, neurons modify their coding strategy by progressively reducing their selective sensitivity to rapid input fluctuations. Combining mathematical modeling with in vitro experiments, we demonstrate that, in L5 pyramidal neurons, the firing threshold dynamics adaptively adjust the effective timescale of somatic integration in order to preserve sensitivity to rapid signals over a broad range of input statistics. For that, a new Generalized Integrate-and-Fire model featuring nonlinear firing threshold dynamics and conductance-based adaptation is introduced that outperforms state-of-the-art neuron models in predicting the spiking activity of neurons responding to a variety of in vivo-like fluctuating currents. Our model allows for efficient parameter extraction and can be analytically mapped to a Generalized Linear Model in which both the input filter--describing somatic integration--and the spike-history filter--accounting for spike-frequency adaptation--dynamically adapt to the input statistics, as experimentally observed. Overall, our results provide new insights on the computational role of different biophysical processes known to underlie adaptive coding in single neurons and support previous theoretical findings indicating that the nonlinear dynamics of the firing threshold due to Na+-channel inactivation regulate the sensitivity to rapid input fluctuations.
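
A toy generalized integrate-and-fire neuron can illustrate the role of a voltage-coupled threshold (all parameters below are invented for the sketch, not fitted to the paper's L5 data): coupling the threshold to the membrane voltage discounts sustained depolarization, while spike-triggered jumps produce spike-frequency adaptation.

```python
import numpy as np

def simulate(i_mean, i_std, couple, t_ms=2000.0, dt=0.1, seed=1):
    """Toy GIF neuron whose firing threshold tracks the membrane
    voltage (hypothetical parameters, illustrative only)."""
    rng = np.random.default_rng(seed)
    v, v_th, tau_m, tau_th, spikes = 0.0, 1.0, 20.0, 100.0, 0
    for _ in range(int(t_ms / dt)):
        noise = i_std * np.sqrt(dt / tau_m) * rng.standard_normal()
        v += dt / tau_m * (i_mean - v) + noise
        # Threshold relaxes toward a baseline plus a voltage-dependent
        # term, mimicking Na+ channel inactivation at depolarization.
        v_th += dt / tau_th * (1.0 + couple * max(v, 0.0) - v_th)
        if v >= v_th:
            spikes += 1
            v = 0.0
            v_th += 0.3  # spike-triggered jump -> spike-frequency adaptation
    return spikes

fixed = simulate(2.0, 0.5, couple=0.0)     # static-baseline threshold
adaptive = simulate(2.0, 0.5, couple=0.8)  # voltage-coupled threshold
print(fixed, adaptive)
```

With the coupled threshold, the response to the strong sustained mean is suppressed, so the fast fluctuations (rather than the DC offset) dominate when the neuron fires; this is the qualitative behaviour the paper attributes to nonlinear threshold dynamics.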

  14. A Network Coding Based Hybrid ARQ Protocol for Underwater Acoustic Sensor Networks

    PubMed Central

    Wang, Hao; Wang, Shilian; Zhang, Eryang; Zou, Jianbin

    2016-01-01

    Underwater Acoustic Sensor Networks (UASNs) have attracted increasing interest in recent years due to their extensive commercial and military applications. However, the harsh underwater channel causes many challenges for the design of reliable underwater data transport protocol. In this paper, we propose an energy efficient data transport protocol based on network coding and hybrid automatic repeat request (NCHARQ) to ensure reliability, efficiency and availability in UASNs. Moreover, an adaptive window length estimation algorithm is designed to optimize the throughput and energy consumption tradeoff. The algorithm can adaptively change the code rate and can be insensitive to the environment change. Extensive simulations and analysis show that NCHARQ significantly reduces energy consumption with short end-to-end delay. PMID:27618044
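
The NCHARQ protocol itself is not specified in the abstract; the toy below shows only the underlying XOR trick of packet-level network coding, by which one coded packet can repair a single loss without a dedicated retransmission (packet contents are invented):

```python
def xor_packets(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length packets (the core of packet-level network coding)."""
    return bytes(x ^ y for x, y in zip(a, b))

# Sender transmits two data packets plus one coded (parity) packet.
p1, p2 = b"sensor-A", b"sensor-B"
coded = xor_packets(p1, p2)

# Suppose the harsh channel drops p2: instead of retransmitting it
# (classic ARQ), the receiver recovers it from the coded packet.
recovered = xor_packets(p1, coded)
print(recovered)  # b'sensor-B'
```

Real NCHARQ additionally adapts the code rate and window length to the channel, which this sketch omits.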

  15. Toward a clearer portrayal of confounding bias in instrumental variable applications.

    PubMed

    Jackson, John W; Swanson, Sonja A

    2015-07-01

    Recommendations for reporting instrumental variable analyses often include presenting the balance of covariates across levels of the proposed instrument and levels of the treatment. However, such presentation can be misleading as relatively small imbalances among covariates across levels of the instrument can result in greater bias because of bias amplification. We introduce bias plots and bias component plots as alternative tools for understanding biases in instrumental variable analyses. Using previously published data on proposed preference-based, geography-based, and distance-based instruments, we demonstrate why presenting covariate balance alone can be problematic, and how bias component plots can provide more accurate context for bias from omitting a covariate from an instrumental variable versus non-instrumental variable analysis. These plots can also provide relevant comparisons of different proposed instruments considered in the same data. Adaptable code is provided for creating the plots.

  16. Public health and church-based constructions of HIV prevention: black Baptist perspective

    PubMed Central

    Roman Isler, Malika; Eng, Eugenia; Maman, Susanne; Adimora, Adaora; Weiner, Bryan

    2014-01-01

    The black church is influential in shaping health behaviors within African-American communities, yet few use evidence-based strategies for HIV prevention (abstinence, monogamy, condoms, voluntary counseling and testing, and prevention with positives). Using principles of grounded theory and interpretive description, we explored the social construction of HIV prevention within black Baptist churches in North Carolina. Data collection included interviews with church leaders (n = 12) and focus groups with congregants (n = 7; 36 participants). Analytic tools included open coding and case-level comparisons. Social constructions of HIV/AIDS prevention were influenced by two worldviews: public health and church-based. Areas of compatibility and incompatibility exist between the two worldviews that inform acceptability and adaptability of current evidence-based strategies. These findings offer insight into ways to increase the compatibility of evidence-based HIV prevention strategies within the black Baptist church context. PMID:24643141

  17. Figuring Out Gas in Galaxies In Enzo (FOGGIE): Resolving the Inner Circumgalactic Medium

    NASA Astrophysics Data System (ADS)

    Corlies, Lauren; Peeples, Molly; Tumlinson, Jason; O'Shea, Brian; Smith, Britton

    2018-01-01

    Cosmological hydrodynamical simulations using every common numerical method have struggled to reproduce the multiphase nature of the circumgalactic medium (CGM) revealed by recent observations. However, to date, resolution in these simulations has been aimed at dense regions — the galactic disk and in-falling satellites — while the diffuse CGM never reaches comparable levels of refinement. Taking advantage of the flexible grid structure of the adaptive mesh refinement code Enzo, we force refinement in a region of the CGM of a Milky Way-like galaxy to the same spatial resolution as that of the disk. In this talk, I will present how the physical and structural distributions of the circumgalactic gas change dramatically as a function of the resolution alone. I will also show the implications these changes have for the observational properties of the gas in the context of the observations.

  18. Imposing a Lagrangian Particle Framework on an Eulerian Hydrodynamics Infrastructure in FLASH

    NASA Technical Reports Server (NTRS)

    Dubey, A.; Daley, C.; ZuHone, J.; Ricker, P. M.; Weide, K.; Graziani, C.

    2012-01-01

    In many astrophysical simulations, both Eulerian and Lagrangian quantities are of interest. For example, in a galaxy cluster merger simulation, the intracluster gas can have Eulerian discretization, while dark matter can be modeled using particles. FLASH, a component-based scientific simulation code, superimposes a Lagrangian framework atop an adaptive mesh refinement Eulerian framework to enable such simulations. The discretization of the field variables is Eulerian, while the Lagrangian entities occur in many different forms including tracer particles, massive particles, charged particles in particle-in-cell mode, and Lagrangian markers to model fluid structure interactions. These widely varying roles for Lagrangian entities are possible because of the highly modular, flexible, and extensible architecture of the Lagrangian framework. In this paper, we describe the Lagrangian framework in FLASH in the context of two very different applications, Type Ia supernovae and galaxy cluster mergers, which use the Lagrangian entities in fundamentally different ways.

  19. A developmentally informed adaptation of minority stress for sexual minority adolescents

    PubMed Central

    Goldbach, Jeremy T.; Gibbs, Jeremy J.

    2017-01-01

    Sexual minority adolescents (lesbian, gay, bisexual) experience disparities in behavioral health outcomes compared to their heterosexual peers, generally attributed to minority stress. Although evidence of the applicability of the minority stress model among adolescents exists, it is based on a primarily adult literature. Developmental and generational differences demand further examination of minority stress to confirm its applicability. Forty-eight life history interviews with sexual minority adolescents in California (age 14–19; M=19.27 SD = 1.38; 39.6% cismale, 35.4% cisfemale, 25% other gender) were completed, recorded, transcribed, and analyzed using thematic analysis in QSR NVivo. Following a consensus model, all transcripts were double coded. Results suggest that minority stress is appropriate for use with adolescents; however, further emphasis should be placed on social context, coping resources, and developmental processes regarding identity development. A conceptual model is provided, as are implications for research and practice. PMID:28033502

  20. CBL-CIPK network for calcium signaling in higher plants

    NASA Astrophysics Data System (ADS)

    Luan, Sheng

    Plants sense their environment through signaling mechanisms involving calcium. Calcium signals are encoded by a complex set of parameters and decoded by a large number of proteins, including the more recently discovered CBL-CIPK network. The calcium-binding CBL proteins specifically interact with a family of protein kinases, the CIPKs, and regulate the activity and subcellular localization of these kinases, leading to the modification of kinase substrates. This represents a paradigm shift compared to the calcium signaling mechanisms of yeast and animals. One example of a CBL-CIPK signaling pathway is the low-potassium response of Arabidopsis roots. When grown in low-K medium, plants develop a stronger K-uptake capacity, adapting to the low-K condition. Recent studies show that the increased K-uptake is caused by activation of a specific K-channel by the CBL-CIPK network. A working model for this regulatory pathway will be discussed in the context of calcium coding and decoding processes.

  1. Imposing a Lagrangian Particle Framework on an Eulerian Hydrodynamics Infrastructure in FLASH

    NASA Astrophysics Data System (ADS)

    Dubey, A.; Daley, C.; ZuHone, J.; Ricker, P. M.; Weide, K.; Graziani, C.

    2012-08-01

    In many astrophysical simulations, both Eulerian and Lagrangian quantities are of interest. For example, in a galaxy cluster merger simulation, the intracluster gas can have Eulerian discretization, while dark matter can be modeled using particles. FLASH, a component-based scientific simulation code, superimposes a Lagrangian framework atop an adaptive mesh refinement Eulerian framework to enable such simulations. The discretization of the field variables is Eulerian, while the Lagrangian entities occur in many different forms including tracer particles, massive particles, charged particles in particle-in-cell mode, and Lagrangian markers to model fluid-structure interactions. These widely varying roles for Lagrangian entities are possible because of the highly modular, flexible, and extensible architecture of the Lagrangian framework. In this paper, we describe the Lagrangian framework in FLASH in the context of two very different applications, Type Ia supernovae and galaxy cluster mergers, which use the Lagrangian entities in fundamentally different ways.

  2. Scalable L-infinite coding of meshes.

    PubMed

    Munteanu, Adrian; Cernea, Dan C; Alecu, Alin; Cornelis, Jan; Schelkens, Peter

    2010-01-01

    The paper investigates the novel concept of local-error control in mesh geometry encoding. In contrast to traditional mesh-coding systems that use the mean-square error as target distortion metric, this paper proposes a new L-infinite mesh-coding approach, for which the target distortion metric is the L-infinite distortion. In this context, a novel wavelet-based L-infinite-constrained coding approach for meshes is proposed, which ensures that the maximum error between the vertex positions in the original and decoded meshes is lower than a given upper bound. Furthermore, the proposed system achieves scalability in L-infinite sense, that is, any decoding of the input stream will correspond to a perfectly predictable L-infinite distortion upper bound. An instantiation of the proposed L-infinite-coding approach is demonstrated for MESHGRID, which is a scalable 3D object encoding system, part of MPEG-4 AFX. In this context, the advantages of scalable L-infinite coding over L-2-oriented coding are experimentally demonstrated. One concludes that the proposed L-infinite mesh-coding approach guarantees an upper bound on the local error in the decoded mesh, it enables a fast real-time implementation of the rate allocation, and it preserves all the scalability features and animation capabilities of the employed scalable mesh codec.
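
A minimal sketch of the distinction between the L-2 (mean-square) and L-infinite distortion metrics, using simple uniform quantization of toy vertex data rather than the paper's wavelet codec: uniform quantization gives exactly the kind of guaranteed per-vertex error bound that L-infinite coding targets.

```python
import numpy as np

rng = np.random.default_rng(0)
vertices = rng.uniform(-1.0, 1.0, size=(1000, 3))  # toy mesh geometry

step = 0.01                                   # uniform quantization step
decoded = np.round(vertices / step) * step    # stand-in for decode

err = np.linalg.norm(decoded - vertices, axis=1)  # per-vertex error
l2 = float(np.sqrt(np.mean(err ** 2)))            # RMS (L2-oriented) metric
linf = float(err.max())                           # L-infinite metric

# Uniform quantization with step s bounds each coordinate error by s/2,
# so the vertex error is at most s*sqrt(3)/2 -- a guaranteed upper bound,
# whereas the RMS value says nothing about the worst vertex.
bound = step * np.sqrt(3) / 2
print(linf <= bound, l2 <= linf)  # True True
```

The paper's contribution is achieving such bounds scalably inside a wavelet mesh codec; this sketch only shows why the two metrics behave differently.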

  3. Shyness and boldness in pumpkinseed sunfish: individual differences are context-specific.

    PubMed

    Coleman; Wilson

    1998-10-01

    Natural selection often promotes a mix of behavioural phenotypes in a population. Adaptive variation in the propensity to take risks might explain individual differences in shyness and boldness in humans and other species. It is often implicitly assumed that shyness and boldness are general personality traits expressed across many situations. From the evolutionary standpoint, however, individual differences that are adaptive in one context (e.g. predator defence) may not be adaptive in other contexts (e.g. exploration of the physical environment or intraspecific social interactions). We measured the context specificity of shyness and boldness in a natural population of juvenile pumpkinseed sunfish, Lepomis gibbosus, by exposing the fish to a potentially threatening stimulus (a red-tipped metrestick extended towards the individual) and a nonthreatening stimulus (a novel food source). We also related these measures of shyness and boldness to behaviours observed during focal observations, both before and after the introduction of a predator (largemouth bass, Micropterus salmoides). Consistent individual differences were found within both contexts, but individual differences did not correlate across contexts. Furthermore, fish that were scored as intermediate in their response to the metrestick behaved most boldly as foragers and in response to the bass predators. These results suggest that shyness and boldness are context-specific and may not exist as a one-dimensional behavioural continuum even within a single context. Copyright 1998 The Association for the Study of Animal Behaviour.

  4. Transcultural adaptation and new proposal for the nursing outcome, Physical condition (2004)

    PubMed Central

    Navarrete, Jessica Rojas; Pérez, Paloma Echevarría; Costa, César Leal

    2018-01-01

    ABSTRACT Objectives: to cross-culturally adapt the nursing outcome, Physical Condition (2004), of the Nursing Outcomes Classification (NOC) to the Spanish context and to make a new proposal for it, for its precise use in clinical practice. Method: a cross-cultural adaptation study and a proposal for the nursing outcome, Physical Condition, were conducted, supported by the opinion of 26 experts. The data were obtained through an electronic form, and a quantitative analysis was conducted using the SPSS software. Results: the version adapted to the Spanish context was obtained, and the proposal for the outcome, Physical Condition, received agreement from the 26 experts, with a mean score greater than 7.6 for the adequacy of the outcome definition and its indicators, and 8.5 for the relevance of the indicators. Conclusions: the version adapted to the Spanish context and a new proposal for Physical Condition were obtained. The results indicate a high level of adequacy and relevance; an instrument of great utility in clinical practice and research was obtained to evaluate interventions directed at improving physical condition. PMID:29791669

  5. Enhanced conflict-driven cognitive control by emotional arousal, not by valence.

    PubMed

    Zeng, Qinghong; Qi, Senqing; Li, Miaoyun; Yao, Shuxia; Ding, Cody; Yang, Dong

    2017-09-01

    Emotion is widely agreed to have two dimensions, valence and arousal. Few studies have explored the effect of emotion on conflict adaptation by considering both of these, which could have dissociable influences. The present study aimed to fill this gap by testing whether emotional valence and arousal exert dissociable influences on conflict adaptation. In the experiments, we included positive, neutral, and negative conditions, with comparable arousal between the positive and negative conditions; both had higher arousal than the neutral one. In Experiment 1, using a two-colour-word Flanker task, we found that conflict adaptation was enhanced in both positive and negative contexts compared to a neutral context. Furthermore, this effect persisted when controlling for stimulus-response repetitions in Experiment 2, which used a four-colour-word Flanker task. The findings suggest that emotional arousal enhances conflict adaptation, regardless of emotional valence. Thus, future studies should consider emotional arousal when studying the effect of emotion on conflict adaptation. Moreover, the unique role of the emotional context in conflict-driven cognitive control is emphasised.

  6. Comparative Study of Neural Network Frameworks for the Next Generation of Adaptive Optics Systems.

    PubMed

    González-Gutiérrez, Carlos; Santos, Jesús Daniel; Martínez-Zarzuela, Mario; Basden, Alistair G; Osborn, James; Díaz-Pernas, Francisco Javier; De Cos Juez, Francisco Javier

    2017-06-02

    Many of the next generation of adaptive optics systems on large and extremely large telescopes require tomographic techniques in order to correct for atmospheric turbulence over a large field of view. Multi-object adaptive optics is one such technique. In this paper, different implementations of a tomographic reconstructor based on a machine learning architecture named "CARMEN" are presented. Basic concepts of adaptive optics are introduced first, with a short explanation of three different control systems used on real telescopes and the sensors utilised. The operation of the reconstructor, along with the three neural network frameworks used, and the developed CUDA code are detailed. Changes to the size of the reconstructor influence the training and execution time of the neural network. The native CUDA code turns out to be the best choice for all the systems, although some of the other frameworks offer good performance under certain circumstances.

  7. Comparative Study of Neural Network Frameworks for the Next Generation of Adaptive Optics Systems

    PubMed Central

    González-Gutiérrez, Carlos; Santos, Jesús Daniel; Martínez-Zarzuela, Mario; Basden, Alistair G.; Osborn, James; Díaz-Pernas, Francisco Javier; De Cos Juez, Francisco Javier

    2017-01-01

    Many of the next generation of adaptive optics systems on large and extremely large telescopes require tomographic techniques in order to correct for atmospheric turbulence over a large field of view. Multi-object adaptive optics is one such technique. In this paper, different implementations of a tomographic reconstructor based on a machine learning architecture named “CARMEN” are presented. Basic concepts of adaptive optics are introduced first, with a short explanation of three different control systems used on real telescopes and the sensors utilised. The operation of the reconstructor, along with the three neural network frameworks used, and the developed CUDA code are detailed. Changes to the size of the reconstructor influence the training and execution time of the neural network. The native CUDA code turns out to be the best choice for all the systems, although some of the other frameworks offer good performance under certain circumstances. PMID:28574426

  8. Parallel Adaptive Simulation of Detonation Waves Using a Weighted Essentially Non-Oscillatory Scheme

    NASA Astrophysics Data System (ADS)

    McMahon, Sean

    The purpose of this thesis was to develop a code that could be used to develop a better understanding of the physics of detonation waves. First, a detonation was simulated in one dimension using ZND theory. Then, using the 1D solution as an initial condition, a detonation was simulated in two dimensions using a weighted essentially non-oscillatory scheme on an adaptive mesh with the smallest lengthscales being equal to 2-3 flamelet lengths. The code development in linking Chemkin for chemical kinetics to the adaptive mesh refinement flow solver was completed. The detonation evolved in a way that, qualitatively, matched the experimental observations, however, the simulation was unable to progress past the formation of the triple point.
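
The thesis code itself is not available here; the sketch below implements a standard third-order WENO reconstruction (a textbook formulation, independent of the actual solver) to show the essentially non-oscillatory behaviour that makes the scheme suitable for detonation fronts:

```python
import numpy as np

def weno3_interface(u):
    """Third-order WENO reconstruction at the right interface i+1/2
    from the three cells i-1, i, i+1 (standard textbook coefficients)."""
    um1, u0, up1 = u
    f0 = -0.5 * um1 + 1.5 * u0      # candidate from stencil {i-1, i}
    f1 = 0.5 * u0 + 0.5 * up1       # candidate from stencil {i, i+1}
    b0 = (u0 - um1) ** 2            # smoothness indicators
    b1 = (up1 - u0) ** 2
    eps = 1e-6
    a0 = (1.0 / 3.0) / (eps + b0) ** 2   # nonlinear weights built from
    a1 = (2.0 / 3.0) / (eps + b1) ** 2   # the linear weights 1/3 and 2/3
    return (a0 * f0 + a1 * f1) / (a0 + a1)

# Smooth data: reproduces the exact interface value of a linear profile.
print(weno3_interface([1.0, 2.0, 3.0]))   # 2.5

# Discontinuous data: the weight of the stencil crossing the jump
# collapses, so the reconstruction stays essentially non-oscillatory.
print(weno3_interface([1.0, 1.0, 10.0]))  # close to 1.0
```

The thesis uses a higher-order weighted essentially non-oscillatory scheme coupled to chemistry and adaptive meshing; this fragment isolates only the reconstruction idea.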

  9. Gyroaveraging operations using adaptive matrix operators

    NASA Astrophysics Data System (ADS)

    Dominski, Julien; Ku, Seung-Hoe; Chang, Choong-Seock

    2018-05-01

    A new adaptive scheme to be used in particle-in-cell codes for carrying out gyroaveraging operations with matrices is presented. This new scheme uses an intermediate velocity grid whose resolution is adapted to the local thermal Larmor radius. The charge density is computed by projecting marker weights in a field-line following manner while preserving the adiabatic magnetic moment μ. These choices permit to improve the accuracy of the gyroaveraging operations performed with matrices even when strong spatial variation of temperature and magnetic field is present. Accuracy of the scheme in different geometries from simple 2D slab geometry to realistic 3D toroidal equilibrium has been studied. A successful implementation in the gyrokinetic code XGC is presented in the delta-f limit.
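
A sketch of the basic gyroaveraging operation that such matrices encode, assuming a fixed Larmor radius and an analytic field; in the actual scheme the radius and the intermediate velocity grid adapt to the local temperature, which this fragment omits:

```python
import numpy as np

def gyroaverage(field, x, y, rho, n_points=8):
    """Average a field over a ring of gyro-points of radius rho around
    the gyrocenter (x, y) -- sampled analytically here rather than via
    the matrix operators described in the paper."""
    angles = 2 * np.pi * np.arange(n_points) / n_points
    return np.mean(field(x + rho * np.cos(angles), y + rho * np.sin(angles)))

# For a quadratic field the ring average exceeds the centre value,
# illustrating the finite-Larmor-radius correction the operation captures.
phi = lambda x, y: x ** 2 + y ** 2
rho = 0.1   # stand-in for the local thermal Larmor radius
print(gyroaverage(phi, 0.0, 0.0, rho))  # approx. rho**2 = 0.01
```

In a particle-in-cell code this average is applied both when gathering fields at marker positions and when depositing charge, which is where the matrix formulation and mu-preserving projection come in.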

  10. CoreTSAR: Core Task-Size Adapting Runtime

    DOE PAGES

    Scogland, Thomas R. W.; Feng, Wu-chun; Rountree, Barry; ...

    2014-10-27

    Heterogeneity continues to increase at all levels of computing, with the rise of accelerators such as GPUs, FPGAs, and other co-processors into everything from desktops to supercomputers. As a consequence, efficiently managing such disparate resources has become increasingly complex. CoreTSAR seeks to reduce this complexity by adaptively worksharing parallel-loop regions across compute resources without requiring any transformation of the code within the loop. Our results show performance improvements of up to three-fold over a current state-of-the-art heterogeneous task scheduler, as well as linear performance scaling from a single GPU to four GPUs for many codes. In addition, CoreTSAR demonstrates a robust ability to adapt to both a variety of workloads and underlying system configurations.
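
CoreTSAR's scheduler is considerably more sophisticated, but the core idea of rate-proportional adaptive worksharing can be sketched as follows (the device names and rates are hypothetical):

```python
def adaptive_split(iterations, measured_rates):
    """Divide loop iterations among devices in proportion to the rates
    (iterations/second) measured on earlier invocations -- a sketch of
    task-size adapting worksharing, not CoreTSAR's actual algorithm."""
    total = sum(measured_rates.values())
    shares = {d: int(iterations * r / total) for d, r in measured_rates.items()}
    # hand any rounding remainder to the fastest device
    fastest = max(measured_rates, key=measured_rates.get)
    shares[fastest] += iterations - sum(shares.values())
    return shares

# Hypothetical rates from a profiling pass: one CPU and two GPUs.
rates = {"cpu": 1.0e6, "gpu0": 4.0e6, "gpu1": 3.0e6}
print(adaptive_split(8000, rates))  # {'cpu': 1000, 'gpu0': 4000, 'gpu1': 3000}
```

Re-measuring the rates after each loop invocation and re-splitting is what lets such a scheduler track both workload changes and system configuration differences.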

  11. Mujeres Fuertes y Corazones Saludables: adaptation of the StrongWomen -healthy hearts program for rural Latinas using an intervention mapping approach.

    PubMed

    Perry, Cynthia K; McCalmont, Jean C; Ward, Judy P; Menelas, Hannah-Dulya K; Jackson, Christie; De Witz, Jazmyne R; Solanki, Emma; Seguin, Rebecca A

    2017-12-28

    To describe our use of intervention mapping as a systematic method to adapt an evidence-based physical activity and nutrition program to reflect the needs of rural Latinas. An intervention mapping process involving six steps guided the adaptation of an evidence-based physical activity and nutrition program, using a community-based participatory research approach. We partnered with a community advisory board of rural Latinas throughout the adaptation process. A needs assessment and logic models were used to ascertain which program was the best fit for adaptation. Once identified, we collaborated with one of the developers of the original program (StrongWomen - Healthy Hearts) during the adaptation process. First, essential theoretical methods and program elements were identified, and additional elements were added or adapted. Next, we reviewed and made changes to reflect the community and cultural context of the practical applications, intervention strategies, program curriculum, materials, and participant information. Finally, we planned for the implementation and evaluation of the adapted program, Mujeres Fuertes y Corazones Saludables, within the context of the rural community. A pilot study will be conducted with overweight, sedentary, middle-aged, Spanish-speaking Latinas. Outcome measures will assess change in weight, physical fitness, physical activity, and nutrition behavior. The intervention mapping process was feasible and provided a systematic approach to balance fit and fidelity in the adaptation of an evidence-based program. Collaboration with community members ensured that the components of the curriculum that were adapted were culturally appropriate and relevant within the local community context.

  12. Coding tools investigation for next generation video coding based on HEVC

    NASA Astrophysics Data System (ADS)

    Chen, Jianle; Chen, Ying; Karczewicz, Marta; Li, Xiang; Liu, Hongbin; Zhang, Li; Zhao, Xin

    2015-09-01

    The new state-of-the-art video coding standard, H.265/HEVC, was finalized in 2013 and achieves roughly 50% bit rate saving compared to its predecessor, H.264/MPEG-4 AVC. This paper provides evidence that there is still potential for further coding efficiency improvements. A brief overview of HEVC is first given in the paper. Then, our improvements to each main module of HEVC are presented. For instance, the recursive quadtree block structure is extended to support larger coding units and transform units. The motion information prediction scheme is improved by advanced temporal motion vector prediction, which inherits the motion information of each small block within a large block from a temporal reference picture. Cross-component prediction with a linear prediction model improves intra prediction, and overlapped block motion compensation improves the efficiency of inter prediction. Furthermore, coding of both intra and inter prediction residual is improved by an adaptive multiple transform technique. Finally, in addition to the deblocking filter and SAO, an adaptive loop filter is applied to further enhance the reconstructed picture quality. This paper describes the above-mentioned techniques in detail and evaluates their coding performance benefits under the common test conditions used during HEVC development. The simulation results show that significant performance improvement over the HEVC standard can be achieved, especially for high resolution video material.
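
Adaptive multiple transform in the paper selects among DCT/DST variants per residual block; the toy below makes the same kind of encoder-side choice, between an orthonormal DCT-II and transform skip, using a crude coefficient-cost proxy (the candidate set and cost model are simplifications, not the paper's):

```python
import numpy as np

def dct2_matrix(n):
    """Orthonormal DCT-II matrix (one of several transforms an
    adaptive-multiple-transform encoder can choose from)."""
    k = np.arange(n)[:, None]
    m = np.arange(n)[None, :]
    c = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * m + 1) * k / (2 * n))
    c[0] /= np.sqrt(2.0)
    return c

def cost(coeffs):
    # crude rate proxy: sum of absolute quantized coefficients
    return np.abs(np.round(coeffs)).sum()

N = 8
candidates = {"dct2": dct2_matrix(N), "skip": np.eye(N)}  # transform skip

def choose_transform(residual):
    """Pick, per block, the transform with the cheapest coefficient cost,
    as an encoder-side rate-distortion search would."""
    return min(candidates, key=lambda name: cost(candidates[name] @ residual))

smooth = np.linspace(10.0, 24.0, N)   # smooth residual: DCT compacts it
spike = np.zeros(N)                   # sparse residual: skip is cheaper
spike[3] = 9.0
print(choose_transform(smooth), choose_transform(spike))  # dct2 skip
```

The real technique searches larger transform sets (e.g. DST-VII/DCT-VIII variants) in both dimensions and signals the choice in the bitstream, but the per-block cost comparison above is the essence of the adaptation.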

  13. A Four-Phase Modulation System for Use with an Adaptive Array.

    DTIC Science & Technology

    1982-07-01

    [Report abstract garbled by OCR; only fragments are recoverable.] The report describes a four-phase modulation system for use with an adaptive (LMS) array. Recoverable fragments state that each bit interval has a duration of Tb seconds and that a(t) is a pseudonoise code, i.e., a maximum-length linear shift register sequence [9].

  14. Time course of dynamic range adaptation in the auditory nerve

    PubMed Central

    Wang, Grace I.; Dean, Isabel; Delgutte, Bertrand

    2012-01-01

    Auditory adaptation to sound-level statistics occurs as early as in the auditory nerve (AN), the first stage of neural auditory processing. In addition to firing rate adaptation characterized by a rate decrement dependent on previous spike activity, AN fibers show dynamic range adaptation, which is characterized by a shift of the rate-level function or dynamic range toward the most frequently occurring levels in a dynamic stimulus, thereby improving the precision of coding of the most common sound levels (Wen B, Wang GI, Dean I, Delgutte B. J Neurosci 29: 13797–13808, 2009). We investigated the time course of dynamic range adaptation by recording from AN fibers with a stimulus in which the sound levels periodically switch from one nonuniform level distribution to another (Dean I, Robinson BL, Harper NS, McAlpine D. J Neurosci 28: 6430–6438, 2008). Dynamic range adaptation occurred rapidly, but its exact time course was difficult to determine directly from the data because of the concomitant firing rate adaptation. To characterize the time course of dynamic range adaptation without the confound of firing rate adaptation, we developed a phenomenological “dual adaptation” model that accounts for both forms of AN adaptation. When fitted to the data, the model predicts that dynamic range adaptation occurs as rapidly as firing rate adaptation, over 100–400 ms, and the time constants of the two forms of adaptation are correlated. These findings suggest that adaptive processing in the auditory periphery in response to changes in mean sound level occurs rapidly enough to have significant impact on the coding of natural sounds. PMID:22457465
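
    As a rough illustration of the reported time course, a single exponential with a time constant in the 100-400 ms range captures how an adapting quantity (e.g., the midpoint of a rate-level function) shifts after a level-distribution switch. The 200 ms value and the dB figures below are assumptions for illustration, not the paper's fitted dual-adaptation parameters.

```python
import math

def adapted_value(t_ms, start, target, tau_ms=200.0):
    """Exponentially adapting quantity, t_ms after a switch."""
    return target + (start - target) * math.exp(-t_ms / tau_ms)

# A rate-level midpoint shifting from 40 toward 70 dB SPL:
# after two time constants (~400 ms) about 86% of the shift is complete.
shift_at_400ms = adapted_value(400.0, 40.0, 70.0)
```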

  15. Single-channel voice-response-system program documentation volume I : system description

    DOT National Transportation Integrated Search

    1977-01-01

    This report documents the design and implementation of a Voice Response System (VRS) using Adaptive Differential Pulse Code Modulation (ADPCM) voice coding. Implemented on a Digital Equipment Corporation PDP-11/20, this VRS supports a single ...
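
    The principle behind ADPCM, predicting each sample and transmitting only an adaptively quantized prediction error, can be sketched as follows. This is a minimal illustration with made-up step-size rules, not the codec documented in the report (real ADPCM variants such as IMA or G.726 use standardized step tables).

```python
def _adapt(step, code):
    # Step-size adaptation driven only by the transmitted code, so the
    # encoder and decoder stay in lockstep (rules are illustrative).
    return min(step * 1.5, 1024.0) if abs(code) >= 2 else max(step * 0.8, 1.0)

def adpcm_encode(samples):
    pred, step, codes = 0.0, 8.0, []
    for s in samples:
        code = max(-3, min(3, round((s - pred) / step)))  # quantized error
        codes.append(code)
        pred += code * step          # mirror of the decoder's update
        step = _adapt(step, code)
    return codes

def adpcm_decode(codes):
    pred, step, out = 0.0, 8.0, []
    for code in codes:
        pred += code * step          # reconstruct the sample
        out.append(pred)
        step = _adapt(step, code)
    return out

# A slow ramp is tracked closely once the step size adapts.
samples = [float(2 * i) for i in range(100)]
decoded = adpcm_decode(adpcm_encode(samples))
```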

  16. In search of an adaptive social-ecological approach to understanding a tropical city

    Treesearch

    A.E. Lugo; C.M. Concepcion; L.E. Santiago-Acevedo; T.A. Munoz-Erickson; J.C. Verdejo Ortiz; R. Santiago-Bartolomei; J. Forero-Montana; C.J. Nytch; H. Manrique; W. Colon-Cortes

    2012-01-01

    This essay describes our effort to develop a practical approach to the integration of the social and ecological sciences in the context of a Latin-American city such as San Juan, Puerto Rico. We describe our adaptive social-ecological approach in the historical context of the developing paradigms of the Anthropocene, new integrative social and ecological sciences, and...

  17. Person Fit Based on Statistical Process Control in an Adaptive Testing Environment. Research Report 98-13.

    ERIC Educational Resources Information Center

    van Krimpen-Stoop, Edith M. L. A.; Meijer, Rob R.

    Person-fit research in the context of paper-and-pencil tests is reviewed, and some specific problems regarding person fit in the context of computerized adaptive testing (CAT) are discussed. Some new methods are proposed to investigate person fit in a CAT environment. These statistics are based on Statistical Process Control (SPC) theory. A…
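
    One SPC-style person-fit statistic of the kind the report builds on is a CUSUM of item-level residuals, accumulated as the examinee answers items. The sketch below assumes a Rasch model and an illustrative decision threshold; these are stand-ins, not the report's actual statistics.

```python
import math

def rasch_p(theta, b):
    """Probability of a correct response under a Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def cusum_flags(responses, difficulties, theta, h=2.0):
    """Two-sided CUSUM on residuals (observed minus expected);
    returns True if either sum ever crosses the threshold h."""
    c_plus = c_minus = 0.0
    flagged = False
    for x, b in zip(responses, difficulties):
        resid = x - rasch_p(theta, b)
        c_plus = max(0.0, c_plus + resid)
        c_minus = min(0.0, c_minus + resid)
        if c_plus > h or c_minus < -h:
            flagged = True
    return flagged

# An average examinee (theta = 0) missing every easy item is aberrant;
# answering them all correctly is not.
easy_items = [-2.0] * 10
aberrant = cusum_flags([0] * 10, easy_items, 0.0)
typical = cusum_flags([1] * 10, easy_items, 0.0)
```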

  18. Code Switching in English Language Teaching (ELT) Teaching Practice in Turkey: Student Teacher Practices, Beliefs and Identity

    ERIC Educational Resources Information Center

    Bilgin, Sezen Seymen

    2016-01-01

    Code switching involves the interplay of two languages and as well as serving linguistic functions, it has social and psychological implications. In the context of English language teaching, these psychological implications reveal themselves as teachers' thought processes. While the nature of code switching in language classrooms has been widely…

  19. Blending Classroom Teaching and Learning with QR Codes

    ERIC Educational Resources Information Center

    Rikala, Jenni; Kankaanranta, Marja

    2014-01-01

    The aim of this case study was to explore the feasibility of the Quick Response (QR) codes and mobile devices in the context of Finnish basic education. The interest was especially to explore how mobile devices and QR codes can enhance and blend teaching and learning. The data were collected with a teacher interview and pupil surveys. The learning…

  20. A co-designed equalization, modulation, and coding scheme

    NASA Technical Reports Server (NTRS)

    Peile, Robert E.

    1992-01-01

    The commercial impact and technical success of Trellis Coded Modulation seem to illustrate that, if Shannon capacity is to be approached, the modulation and coding of an analogue signal ought to be viewed as an integrated process. More recent work has focused on going beyond the gains obtained for Additive White Gaussian Noise channels and has tried to combine the coding/modulation with adaptive equalization. The motive is to gain similar advances on less perfect or idealized channels.

  1. A Novel Approach for Creating Activity-Aware Applications in a Hospital Environment

    NASA Astrophysics Data System (ADS)

    Bardram, Jakob E.

    Context-aware and activity-aware computing has been proposed as a way to adapt the computer to the user's ongoing activity. However, deductively moving from physical context, like location, to establishing human activity has proved difficult. This paper proposes a novel approach to activity-aware computing. Instead of inferring activities, this approach enables the user to explicitly model their activity, and then use sensor-based events to create, manage, and use these computational activities adjusted to a specific context. This approach was crafted through a user-centered design process in collaboration with a hospital department. We propose three strategies for activity-awareness: context-based activity matching, context-based activity creation, and context-based activity adaptation. We present the implementation of these strategies and present an experimental evaluation of them. The experiments demonstrate that rather than considering context as information, context can be a relational property that links 'real-world activities' with their 'computational activities'.

  2. SNP discovery in candidate adaptive genes using exon capture in a free-ranging alpine ungulate

    Treesearch

    Gretchen H. Roffler; Stephen J. Amish; Seth Smith; Ted Cosart; Marty Kardos; Michael K. Schwartz; Gordon Luikart

    2016-01-01

    Identification of genes underlying genomic signatures of natural selection is key to understanding adaptation to local conditions. We used targeted resequencing to identify SNP markers in 5321 candidate adaptive genes associated with known immunological, metabolic and growth functions in ovids and other ungulates. We selectively targeted 8161 exons in protein-coding...

  3. Development of full wave code for modeling RF fields in hot non-uniform plasmas

    NASA Astrophysics Data System (ADS)

    Zhao, Liangji; Svidzinski, Vladimir; Spencer, Andrew; Kim, Jin-Soo

    2016-10-01

    FAR-TECH, Inc. is developing a full wave RF modeling code to model RF fields in fusion devices and in general plasma applications. As an important component of the code, an adaptive meshless technique is introduced to solve the wave equations, which allows resolving plasma resonances efficiently and adapting to the complexity of antenna geometry and device boundary. The computational points are generated using either a point elimination method or a force balancing method based on the monitor function, which is calculated by solving the cold plasma dispersion equation locally. Another part of the code is the conductivity kernel calculation, used for modeling the nonlocal hot plasma dielectric response. The conductivity kernel is calculated on a coarse grid of test points and then interpolated linearly onto the computational points. All the components of the code are parallelized using MPI and OpenMP libraries to optimize the execution speed and memory. The algorithm and the results of our numerical approach to solving 2-D wave equations in a tokamak geometry will be presented. Work is supported by the U.S. DOE SBIR program.

  4. Full Wave Parallel Code for Modeling RF Fields in Hot Plasmas

    NASA Astrophysics Data System (ADS)

    Spencer, Joseph; Svidzinski, Vladimir; Evstatiev, Evstati; Galkin, Sergei; Kim, Jin-Soo

    2015-11-01

    FAR-TECH, Inc. is developing a suite of full wave RF codes in hot plasmas. It is based on a formulation in configuration space with grid adaptation capability. The conductivity kernel (which includes a nonlocal dielectric response) is calculated by integrating the linearized Vlasov equation along unperturbed test particle orbits. For Tokamak applications a 2-D version of the code is being developed. Progress of this work will be reported. This suite of codes has the following advantages over existing spectral codes: 1) It utilizes the localized nature of plasma dielectric response to the RF field and calculates this response numerically without approximations. 2) It uses an adaptive grid to better resolve resonances in plasma and antenna structures. 3) It uses an efficient sparse matrix solver to solve the formulated linear equations. The linear wave equation is formulated using two approaches: for cold plasmas the local cold plasma dielectric tensor is used (resolving resonances by particle collisions), while for hot plasmas the conductivity kernel is calculated. Work is supported by the U.S. DOE SBIR program.

  5. Collection Efficiency and Ice Accretion Characteristics of Two Full Scale and One 1/4 Scale Business Jet Horizontal Tails

    NASA Technical Reports Server (NTRS)

    Bidwell, Colin S.; Papadakis, Michael

    2005-01-01

    Collection efficiency and ice accretion calculations have been made for a series of business jet horizontal tail configurations using a three-dimensional panel code, an adaptive grid code, and the NASA Glenn LEWICE3D grid based ice accretion code. The horizontal tail models included two full scale wing tips and a 25 percent scale model. Flow solutions for the horizontal tails were generated using the PMARC panel code. Grids used in the ice accretion calculations were generated using the adaptive grid code ICEGRID. The LEWICE3D grid based ice accretion program was used to calculate impingement efficiency and ice shapes. Ice shapes typifying rime and mixed icing conditions were generated for a 30 minute hold condition. All calculations were performed on an SGI Octane computer. The results have been compared to experimental flow and impingement data. In general, the calculated flow and collection efficiencies compared well with experiment, and the ice shapes appeared representative of the rime and mixed icing conditions for which they were calculated.

  6. A seismic data compression system using subband coding

    NASA Technical Reports Server (NTRS)

    Kiely, A. B.; Pollara, F.

    1995-01-01

    This article presents a study of seismic data compression techniques and a compression algorithm based on subband coding. The algorithm includes three stages: a decorrelation stage, a quantization stage that introduces a controlled amount of distortion to allow for high compression ratios, and a lossless entropy coding stage based on a simple but efficient arithmetic coding method. Subband coding methods are particularly suited to the decorrelation of nonstationary processes such as seismic events. Adaptivity to the nonstationary behavior of the waveform is achieved by dividing the data into separate blocks that are encoded separately with an adaptive arithmetic encoder. This is done with high efficiency due to the low overhead introduced by the arithmetic encoder in specifying its parameters. The technique could be used as a progressive transmission system, where successive refinements of the data can be requested by the user. This allows seismologists to first examine a coarse version of waveforms with minimal usage of the channel and then decide where refinements are required. Rate-distortion performance results are presented and comparisons are made with two block transform methods.
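
    The decorrelation stage can be illustrated with the simplest subband split, a one-level Haar analysis into an average (low-pass) and a difference (high-pass) band with exact reconstruction. The article's codec uses a more elaborate filter bank; this sketch shows the principle only.

```python
def haar_analysis(x):
    """Split an even-length signal into average and difference bands."""
    assert len(x) % 2 == 0
    low = [(a + b) / 2 for a, b in zip(x[0::2], x[1::2])]
    high = [(a - b) / 2 for a, b in zip(x[0::2], x[1::2])]
    return low, high

def haar_synthesis(low, high):
    """Invert haar_analysis exactly."""
    out = []
    for l, h in zip(low, high):
        out.extend([l + h, l - h])
    return out

# Round trip: analysis followed by synthesis reproduces the input.
x = [3, 1, 4, 1, 5, 9, 2, 6]
rebuilt = haar_synthesis(*haar_analysis(x))
```

    In a full codec the high band, which is mostly near zero for smooth signals, is what the quantizer and adaptive arithmetic coder can compress aggressively.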

  7. Motion-adaptive model-assisted compatible coding with spatiotemporal scalability

    NASA Astrophysics Data System (ADS)

    Lee, JaeBeom; Eleftheriadis, Alexandros

    1997-01-01

    We introduce the concept of motion-adaptive spatio-temporal model-assisted compatible (MA-STMAC) coding, a technique to selectively encode areas of different importance to the human eye in terms of space and time in moving images, taking object motion into consideration. Previous STMAC was proposed based on the fact that human 'eye contact' and 'lip synchronization' are very important in person-to-person communication. Several areas, including the eyes and lips, need different types of quality, since different areas have different perceptual significance to human observers. The approach provides a better rate-distortion tradeoff than conventional image coding techniques based on MPEG-1, MPEG-2, H.261, as well as H.263. STMAC coding is applied on top of an encoder, taking full advantage of its core design. Model motion tracking in our previous STMAC approach was not automatic. The proposed MA-STMAC coding considers the motion of the human face within the STMAC concept using automatic area detection. Experimental results are given using ITU-T H.263, addressing very low bit-rate compression.

  8. Behavior Change Interventions to Improve the Health of Racial and Ethnic Minority Populations: A Tool Kit of Adaptation Approaches

    PubMed Central

    Davidson, Emma M; Liu, Jing Jing; Bhopal, Raj; White, Martin; Johnson, Mark RD; Netto, Gina; Wabnitz, Cecile; Sheikh, Aziz

    2013-01-01

    Context Adapting behavior change interventions to meet the needs of racial and ethnic minority populations has the potential to enhance their effectiveness in the target populations. But because there is little guidance on how best to undertake these adaptations, work in this field has proceeded without any firm foundations. In this article, we present our Tool Kit of Adaptation Approaches as a framework for policymakers, practitioners, and researchers interested in delivering behavior change interventions to ethnically diverse, underserved populations in the United Kingdom. Methods We undertook a mixed-method program of research on interventions for smoking cessation, increasing physical activity, and promoting healthy eating that had been adapted to improve salience and acceptability for African-, Chinese-, and South Asian–origin minority populations. This program included a systematic review (reported using PRISMA criteria), qualitative interviews, and a realist synthesis of data. Findings We compiled a richly informative data set of 161 publications and twenty-six interviews detailing the adaptation of behavior change interventions and the contexts in which they were undertaken. On the basis of these data, we developed our Tool Kit of Adaptation Approaches, which contains (1) a forty-six-item Typology of Adaptation Approaches; (2) a Pathway to Adaptation, which shows how to use the Typology to create a generic behavior change intervention; and (3) RESET, a decision tool that provides practical guidance on which adaptations to use in different contexts. Conclusions Our Tool Kit of Adaptation Approaches provides the first evidence-derived suite of materials to support the development, design, implementation, and reporting of health behavior change interventions for minority groups. The Tool Kit now needs prospective, empirical evaluation in a range of intervention and population settings. PMID:24320170

  9. A Parallel Implementation of Multilevel Recursive Spectral Bisection for Application to Adaptive Unstructured Meshes. Chapter 1

    NASA Technical Reports Server (NTRS)

    Barnard, Stephen T.; Simon, Horst; Lasinski, T. A. (Technical Monitor)

    1994-01-01

    The design of a parallel implementation of multilevel recursive spectral bisection is described. The goal is to implement a code that is fast enough to enable dynamic repartitioning of adaptive meshes.

  10. Toward a virtual building laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klems, J.H.; Finlayson, E.U.; Olsen, T.H.

    1999-03-01

    In order to achieve in a timely manner the large energy and dollar savings technically possible through improvements in building energy efficiency, it will be necessary to solve the problem of design failure risk. The most economical method of doing this would be to learn to calculate building performance with sufficient detail, accuracy and reliability to avoid design failure. Existing building simulation models (BSM) are a large step in this direction, but are still not capable of this level of modeling. Developments in computational fluid dynamics (CFD) techniques now allow one to construct a road map from present BSMs to a complete building physical model. The most useful first step is a building interior model (BIM) that would allow prediction of local conditions affecting occupant health and comfort. To provide reliable prediction a BIM must incorporate the correct physical boundary conditions on a building interior. Doing so raises a number of specific technical problems and research questions. The solution of these within a context useful for building research and design is not likely to result from other research on CFD, which is directed toward the solution of different types of problems. A six-step plan for incorporating the correct boundary conditions within the context of the model problem of a large atrium has been outlined. A promising strategy for constructing a BIM is the overset grid technique for representing a building space in a CFD calculation. This technique promises to adapt well to building design and allows a step-by-step approach. A state-of-the-art CFD computer code using this technique has been adapted to the problem and can form the departure point for this research.

  11. Using Prospect Theory to Investigate Decision-Making Bias Within an Information Security Context

    DTIC Science & Technology

    2005-12-01

    [Fragmentary search-result excerpt.] The legible portions describe the coding of risk behavior: Risk Averse (A) coded as 0, Risk Seeking (B) coded as 1; H0 (indifferent in risk behavior): p = .5; Ha (risk averse, thus significantly below .5): p < .5.

  12. Attraction Effect in Risky Choice Can Be Explained by Subjective Distance Between Choice Alternatives.

    PubMed

    Mohr, Peter N C; Heekeren, Hauke R; Rieskamp, Jörg

    2017-08-21

    Individuals make decisions under risk throughout daily life. Standard models of economic decision making typically assume that people evaluate choice options independently. There is, however, substantial evidence showing that this independence assumption is frequently violated in decision making without risk. The present study extends these findings to the domain of decision making under risk. To explain the independence violations, we adapted a sequential sampling model, namely Multialternative Decision Field Theory (MDFT), to decision making under risk and showed how this model can account for the observed preference shifts. MDFT not only better predicts choices compared with the standard Expected Utility Theory, but it also explains individual differences in the size of the observed context effect. Evidence in favor of the chosen option, as predicted by MDFT, was positively correlated with brain activity in the medial orbitofrontal cortex (mOFC) and negatively correlated with brain activity in the anterior insula (aINS). From a neuroscience perspective, the results of the present study show that specific brain regions, such as the mOFC and aINS, not only code the value or risk of a single choice option but also code the evidence in favor of the best option compared with other available choice options.

  13. Coding visual features extracted from video sequences.

    PubMed

    Baroffio, Luca; Cesana, Matteo; Redondi, Alessandro; Tagliasacchi, Marco; Tubaro, Stefano

    2014-05-01

    Visual features are successfully exploited in several applications (e.g., visual search, object recognition and tracking, etc.) due to their ability to efficiently represent image content. Several visual analysis tasks require features to be transmitted over a bandwidth-limited network, thus calling for coding techniques to reduce the required bit budget, while attaining a target level of efficiency. In this paper, we propose, for the first time, a coding architecture designed for local features (e.g., SIFT, SURF) extracted from video sequences. To achieve high coding efficiency, we exploit both spatial and temporal redundancy by means of intraframe and interframe coding modes. In addition, we propose a coding mode decision based on rate-distortion optimization. The proposed coding scheme can be conveniently adopted to implement the analyze-then-compress (ATC) paradigm in the context of visual sensor networks. That is, sets of visual features are extracted from video frames, encoded at remote nodes, and finally transmitted to a central controller that performs visual analysis. This is in contrast to the traditional compress-then-analyze (CTA) paradigm, in which video sequences acquired at a node are compressed and then sent to a central unit for further processing. In this paper, we compare these coding paradigms using metrics that are routinely adopted to evaluate the suitability of visual features in the context of content-based retrieval, object recognition, and tracking. Experimental results demonstrate that, thanks to the significant coding gains achieved by the proposed coding scheme, ATC outperforms CTA with respect to all evaluation metrics.
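
    The rate-distortion-optimized mode decision mentioned in the abstract follows the standard Lagrangian form: choose the coding mode that minimizes J = D + λR. A minimal sketch, with invented distortion and rate numbers:

```python
def choose_mode(candidates, lam):
    """candidates: (mode_name, distortion, rate_bits) tuples; returns the
    name of the mode minimizing the Lagrangian cost J = D + lam * R."""
    return min(candidates, key=lambda m: m[1] + lam * m[2])[0]

# Intraframe coding here costs more bits but distorts less than
# interframe coding (numbers are illustrative only).
modes = [("intra", 4.0, 120.0), ("inter", 6.0, 40.0)]
cheap_bits = choose_mode(modes, 0.01)   # bits are cheap: distortion dominates
dear_bits = choose_mode(modes, 0.1)     # bits are expensive: rate dominates
```

    Sweeping the multiplier λ trades quality against bit budget, which is how a single decision rule serves both bandwidth-rich and bandwidth-starved operating points.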

  14. Situating adaptation: How governance challenges and perceptions of uncertainty influence adaptation in the Rocky Mountains

    Treesearch

    Carina Wyborn; Laurie Yung; Daniel Murphy; Daniel R. Williams

    2015-01-01

    Adaptation is situated within multiple, interacting social, political, and economic forces. Adaptation pathways envision adaptation as a continual pathway of change and response embedded within this broader sociopolitical context. Pathways emphasize that current decisions are both informed by past actions and shape the landscape of future options. This research...

  15. Model and experiments to optimize co-adaptation in a simplified myoelectric control system

    NASA Astrophysics Data System (ADS)

    Couraud, M.; Cattaert, D.; Paclet, F.; Oudeyer, P. Y.; de Rugy, A.

    2018-04-01

    Objective. To compensate for a limb lost in an amputation, myoelectric prostheses use surface electromyography (EMG) from the remaining muscles to control the prosthesis. Despite considerable progress, myoelectric controls remain markedly different from the way we normally control movements, and require intense user adaptation. To overcome this, our goal is to explore concurrent machine co-adaptation techniques that are developed in the field of brain-machine interfaces, and that are beginning to be used in myoelectric controls. Approach. We combined a simplified myoelectric control with a perturbation for which human adaptation is well characterized and modeled, in order to explore co-adaptation settings in a principled manner. Results. First, we reproduced results obtained in a classical visuomotor rotation paradigm in our simplified myoelectric context, where we rotate the muscle pulling vectors used to reconstruct wrist force from EMG. Then, a model of human adaptation in response to directional error was used to simulate various co-adaptation settings, where perturbations and machine co-adaptation are both applied to muscle pulling vectors. These simulations established that a relatively low gain of machine co-adaptation that minimizes final errors generates slow and incomplete adaptation, while higher gains increase adaptation rate but also errors by amplifying noise. After experimental verification on real subjects, we tested a variable gain that combines the advantages of both, and implemented it with directionally tuned neurons similar to those used to model human adaptation. This enables machine co-adaptation to locally improve myoelectric control, and to absorb more challenging perturbations. Significance. The simplified context used here enabled us to explore co-adaptation settings in both simulations and experiments, and to raise important considerations such as the need for a variable gain encoded locally. The benefits and limits of extending this approach to more complex and functional myoelectric contexts are discussed.
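
    The co-adaptation setting described above can be sketched with two error-driven learners compensating for the same rotation perturbation. The learning rates and the fixed machine gain below are illustrative stand-ins, not the paper's fitted model (which uses directionally tuned neurons and a variable gain).

```python
def simulate(rotation_deg=30.0, trials=60, human_lr=0.2, machine_gain=0.1):
    """Residual directional error over trials when a human model and the
    machine both adapt to the same rotation perturbation."""
    human = machine = 0.0        # compensation learned by each agent
    errors = []
    for _ in range(trials):
        error = rotation_deg - human - machine  # residual directional error
        errors.append(error)
        human += human_lr * error               # human error-driven update
        machine += machine_gain * error         # machine co-adaptation
    return errors

errors = simulate()
```

    With both agents learning, the residual error shrinks geometrically, by a factor of (1 - human_lr - machine_gain) per trial in this noiseless toy; adding noise is what makes high machine gains harmful, as the abstract notes.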

  16. Local intensity adaptive image coding

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.

    1989-01-01

    The objective of preprocessing for machine vision is to extract intrinsic target properties. The most important properties ordinarily are structure and reflectance. Illumination in space, however, is a significant problem as the extreme range of light intensity, stretching from deep shadow to highly reflective surfaces in direct sunlight, impairs the effectiveness of standard approaches to machine vision. To overcome this critical constraint, an image coding scheme is being investigated which combines local intensity adaptivity, image enhancement, and data compression. It is very effective under the highly variant illumination that can exist within a single frame or field of view, and it is very robust to noise at low illuminations. Some of the theory and salient features of the coding scheme are reviewed. Its performance is characterized in a simulated space application, and the research and development activities are described.

  17. Enhancing understanding: the development of a glossary of health technology assessment adaptation terms.

    PubMed

    Rosten, Claire; Chase, Deborah L; Hicks, Nicholas J; Milne, Ruairidh

    2009-12-01

    The way people use health technology assessment (HTA) terms varies considerably across Europe. Such variation can lead to misunderstandings when reading HTA reports from different contexts. This work is one of the outputs of the EUnetHTA Project and was undertaken between 2006 and 2008. The aim of this study was to develop a glossary of HTA adaptation terms to help reduce the misunderstandings of terms used in HTA reports from contexts other than the reader's own. Several HTA glossaries were examined to identify ways in which an additional glossary could offer readers something new and to identify adaptation terms for inclusion. Twenty-eight European HTA organizations provided terms for the glossary and drafted descriptions and examples of how each specific term was used in their particular setting. The organizations then commented on the descriptions provided by the other groups and worked together to draft a single description for certain terms. A glossary of HTA adaptation terms was developed. It provides a comprehensive range of descriptions, examples, and comments for forty-two potentially confusing HTA terms related to adaptation. This glossary will be a valuable resource for European HTA agencies when reading HTA reports produced in different contexts and for adapting HTA reports produced in other countries. The glossary will help improve understanding and help facilitate the adaptation process.

  18. Resilience thinking: integrating resilience, adaptability and transformability

    Treesearch

    Carl Folke; Stephen R. Carpenter; Brian Walker; Marten Scheffer; Terry Chapin; Johan Rockstrom

    2010-01-01

    Resilience thinking addresses the dynamics and development of complex social-ecological systems (SES). Three aspects are central: resilience, adaptability and transformability. These aspects interrelate across multiple scales. Resilience in this context is the capacity of a SES to continually change and adapt yet remain within critical thresholds. Adaptability is part...

  19. To Adapt or Not to Adapt: Navigating an Implementation Conundrum

    ERIC Educational Resources Information Center

    Leko, Melinda M.

    2015-01-01

    Maximizing the effectiveness of evidence-based practices (EBPs) requires an optimal balance of implementation fidelity and adaptation so EBPs fit local contexts and meet the individual learning needs of students with disabilities. The framework for classifying adaptations presented in this article can help educators make decisions about whether…

  20. Impact of Burnout on Organizational Outcomes, the Influence of Legal Demands: The Case of Ecuadorian Physicians.

    PubMed

    Ochoa, Paola

    2018-01-01

    Interest in burnout has developed extensively worldwide, but the literature on the consequences that new legal demands have for burnout and for organizational outcomes in physicians is scarce. The global context of the medical profession has been characterized in recent years by changes in employment patterns, a profound intensification of work, and increased labor flexibility. In this context, the study aims to analyze the influence of burnout on organizational outcomes in physicians, depending on the perception of new legal demands in Ecuador. Regarding the method, the research was cross-sectional; in the first stage, we studied the psychometric characteristics, validity, and reliability of the instrument used to assess burnout through a series of confirmatory factor analyses (CFA). In the second part, we assessed the robustness of the model of causal relations between the burnout dimensions and organizational outcomes, carrying out a series of path analyses (structural equation modeling). The study was conducted in five hospitals and the sample was incidental, comprising 435 physicians from Ecuador. We divided the group into two subcategories: Sample A, composed of participants who considered that the new Criminal Code (COIP) affects them, and Sample B, the group of physicians who believed that the COIP does not affect them. Burnout was assessed with the Spanish adaptation of the Maslach Burnout Inventory (MBI), organizational outcomes were measured with a seven-item self-report questionnaire, and we included an item regarding the influence of the new Criminal Code. We formulated four hypotheses positing that physicians who believed that the COIP affects them experience a greater negative influence of burnout on organizational outcomes. The results indicated that the group of physicians who believed that the COIP affects them (Sample A) experienced a greater negative influence of cynicism on productivity than Sample B. Moreover, the lack-of-efficacy dimension had a more positive influence on turnover in the group that believed that the Criminal Code does not affect their practice. The study is unique because it incorporates new legal demands into the traditional relation between burnout and organizational outcomes in physicians.

  1. Impact of Burnout on Organizational Outcomes, the Influence of Legal Demands: The Case of Ecuadorian Physicians

    PubMed Central

    Ochoa, Paola

    2018-01-01

    Interest in burnout has developed extensively worldwide, but the literature on the consequences that new legal demands have for burnout and for organizational outcomes in physicians is scarce. The global context of the medical profession has been characterized in recent years by changes in employment patterns, a profound intensification of work, and increased labor flexibility. In this context, the study aims to analyze the influence of burnout on organizational outcomes in physicians, depending on the perception of new legal demands in Ecuador. Regarding the method, the research was cross-sectional; in the first stage, we studied the psychometric characteristics, validity, and reliability of the instrument used to assess burnout through a series of confirmatory factor analyses (CFA). In the second part, we assessed the robustness of the model of causal relations between the burnout dimensions and organizational outcomes, carrying out a series of path analyses (structural equation modeling). The study was conducted in five hospitals and the sample was incidental, comprising 435 physicians from Ecuador. We divided the group into two subcategories: Sample A, composed of participants who considered that the new Criminal Code (COIP) affects them, and Sample B, the group of physicians who believed that the COIP does not affect them. Burnout was assessed with the Spanish adaptation of the Maslach Burnout Inventory (MBI), organizational outcomes were measured with a seven-item self-report questionnaire, and we included an item regarding the influence of the new Criminal Code. We formulated four hypotheses positing that physicians who believed that the COIP affects them experience a greater negative influence of burnout on organizational outcomes. The results indicated that the group of physicians who believed that the COIP affects them (Sample A) experienced a greater negative influence of cynicism on productivity than Sample B. Moreover, the lack-of-efficacy dimension had a more positive influence on turnover in the group that believed that the Criminal Code does not affect their practice. The study is unique because it incorporates new legal demands into the traditional relation between burnout and organizational outcomes in physicians. PMID:29780347

  2. User's manual for a two-dimensional, ground-water flow code on the Octopus computer network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naymik, T.G.

    1978-08-30

    A ground-water hydrology computer code, programmed by R.L. Taylor (in Proc. American Society of Civil Engineers, Journal of Hydraulics Division, 93(HY2), pp. 25-33 (1967)), has been adapted to the Octopus computer system at Lawrence Livermore Laboratory. Using an example problem, this manual details the input, output, and execution options of the code.

  3. Cultural Adaptation of a Neurobiologically Informed Intervention in Local and International Contexts.

    PubMed

    Pakulak, Eric; Hampton Wray, Amanda; Longoria, Zayra; Garcia Isaza, Alejandra; Stevens, Courtney; Bell, Theodore; Burlingame, Sarah; Klein, Scott; Berlinski, Samuel; Attanasio, Orazio; Neville, Helen

    2017-12-01

    The relationship between early adversity and numerous negative outcomes across the lifespan is evident in a wide range of societies and cultures (e.g., Pakulak, Stevens, & Neville, 2018). Among the most affected neural systems are those supporting attention, self-regulation, and stress regulation. As such, these systems represent targets for neurobiologically informed interventions addressing early adversity. In prior work with monolingual native English-speaking families, we showed that a two-generation intervention targeting these systems in families improves outcomes across multiple domains including child brain function for selective attention (for detail, see Neville et al., 2013). Here, we discuss the translation and cultural adaptation (CA) of this intervention in local and international contexts, which required systematic consideration of cultural differences that could affect program acceptability. First, we conducted a translation and CA of our program to serve Latino families in the United States using the Cultural Adaptation Process (CAP), a model that works closely with stakeholders in a systematic, iterative process. Second, to implement the adapted program in Medellín, Colombia, we conducted a subsequent adaptation for Colombian culture using the same CAP. Our experience underscores the importance of consideration of cultural differences and a systematic approach to adaptation before assessing the efficacy of neurobiologically informed interventions in different cultural contexts. © 2017 Wiley Periodicals, Inc.

  4. Support for context effects on segmentation and segments depends on the context.

    PubMed

    Heffner, Christopher C; Newman, Rochelle S; Idsardi, William J

    2017-04-01

    Listeners must adapt to differences in speech rate across talkers and situations. Speech rate adaptation effects are strong for adjacent syllables (i.e., proximal syllables). For studies that have assessed adaptation effects on speech rate information more than one syllable removed from a point of ambiguity in speech (i.e., distal syllables), the difference in strength between different types of ambiguity is stark. Studies of word segmentation have shown large shifts in perception as a result of distal rate manipulations, while studies of segmental perception have shown only weak, or even nonexistent, effects. However, no study has standardized methods and materials to study context effects for both types of ambiguity simultaneously. Here, a set of sentences was created that differed as minimally as possible except for whether the sentences were ambiguous to the voicing of a consonant or ambiguous to the location of a word boundary. The sentences were then rate-modified to slow down the distal context speech rate to various extents, dependent on three different definitions of distal context that were adapted from previous experiments, along with a manipulation of proximal context to assess whether proximal effects were comparable across ambiguity types. The results indicate that the definition of distal influenced the extent of distal rate effects strongly for both segments and segmentation. They also establish the presence of distal rate effects on word-final segments for the first time. These results were replicated, with some caveats regarding the perception of individual segments, in an Internet-based sample recruited from Mechanical Turk.

  5. Generalization of Prism Adaptation

    ERIC Educational Resources Information Center

    Redding, Gordon M.; Wallace, Benjamin

    2006-01-01

    Prism exposure produces 2 kinds of adaptive response. Recalibration is ordinary strategic remapping of spatially coded movement commands to rapidly reduce performance error. Realignment is the extraordinary process of transforming spatial maps to bring the origins of coordinate systems into correspondence. Realignment occurs when spatial…

  6. How to Cope with Bias While Adapting for Inclusion in Physical Education and Sports: A Judgment and Decision-Making Perspective

    ERIC Educational Resources Information Center

    Hutzler, Yeshayahu; Bar-Eli, Michael

    2013-01-01

    The purpose of this article is to describe a theoretical model and practice examples of judgment and decision making bias within the context of inclusion in physical education and sports. After presenting the context of adapting for inclusion, the theoretical roots of judgment and decision are described, and are linked to the practice of physical…

  7. Detecting consistent patterns of directional adaptation using differential selection codon models.

    PubMed

    Parto, Sahar; Lartillot, Nicolas

    2017-06-23

    Phylogenetic codon models are often used to characterize the selective regimes acting on protein-coding sequences. Recent methodological developments have led to models explicitly accounting for the interplay between mutation and selection, by modeling the amino acid fitness landscape along the sequence. However, thus far, most of these models have assumed that the fitness landscape is constant over time. Fluctuations of the fitness landscape may often be random or depend on complex and unknown factors. However, some organisms may be subject to systematic changes in selective pressure, resulting in reproducible molecular adaptations across independent lineages subject to similar conditions. Here, we introduce a codon-based differential selection model, which aims to detect and quantify the fine-grained consistent patterns of adaptation at the protein-coding level, as a function of external conditions experienced by the organism under investigation. The model parameterizes the global mutational pressure, as well as the site- and condition-specific amino acid selective preferences. This phylogenetic model is implemented in a Bayesian MCMC framework. After validation with simulations, we applied our method to a dataset of HIV sequences from patients with known HLA genetic background. Our differential selection model detects and characterizes differentially selected coding positions specifically associated with two different HLA alleles. Our differential selection model is able to identify consistent molecular adaptations as a function of repeated changes in the environment of the organism. These models can be applied to many other problems, ranging from viral adaptation to evolution of life-history strategies in plants or animals.
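    The mutation-selection interplay that such codon models build on can be illustrated with the classical Halpern-Bruno form, in which the substitution rate is the mutation rate scaled by a fixation factor depending on the scaled fitness difference between the current and proposed states. This is a generic sketch of that ingredient, not the authors' differential selection model itself; the fitness values are hypothetical:

```python
import math

def fixation_factor(s):
    """Relative fixation probability of a mutant with scaled selection
    coefficient s (Halpern-Bruno form); equals 1.0 when neutral."""
    if abs(s) < 1e-9:
        return 1.0
    return s / (1.0 - math.exp(-s))

def substitution_rate(mu, fit_from, fit_to):
    """Rate = mutation rate x fixation factor, with the scaled selection
    coefficient taken as the log-fitness difference (an assumption)."""
    s = fit_to - fit_from
    return mu * fixation_factor(s)

# Under the same mutational pressure, a beneficial change fixes faster
# than the reverse, deleterious one.
r_up = substitution_rate(1e-3, fit_from=0.0, fit_to=2.0)
r_down = substitution_rate(1e-3, fit_from=2.0, fit_to=0.0)
```

In a condition-specific version, the fitness values themselves would depend on the external condition (here, the HLA background), which is what lets the model pick out differentially selected positions.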

  8. Role-play simulations for climate change adaptation education and engagement

    NASA Astrophysics Data System (ADS)

    Rumore, Danya; Schenk, Todd; Susskind, Lawrence

    2016-08-01

    In order to effectively adapt to climate change, public officials and other stakeholders need to rapidly enhance their understanding of local risks and their ability to collaboratively and adaptively respond to them. We argue that science-based role-play simulation exercises -- a type of 'serious game' involving face-to-face mock decision-making -- have considerable potential as education and engagement tools for enhancing readiness to adapt. Prior research suggests role-play simulations and other serious games can foster public learning and encourage collective action in public policy-making contexts. However, the effectiveness of such exercises in the context of climate change adaptation education and engagement has heretofore been underexplored. We share results from two research projects that demonstrate the effectiveness of role-play simulations in cultivating climate change adaptation literacy, enhancing collaborative capacity and facilitating social learning. Based on our findings, we suggest such exercises should be more widely embraced as part of adaptation professionals' education and engagement toolkits.

  9. An introduction to adaptive management for threatened and endangered species

    USGS Publications Warehouse

    Runge, Michael C.

    2011-01-01

    Management of threatened and endangered species would seem to be a perfect context for adaptive management. Many of the decisions are recurrent and plagued by uncertainty, exactly the conditions that warrant an adaptive approach. But although the potential of adaptive management in these settings has been extolled, there are limited applications in practice. The impediments to practical implementation are manifold and include semantic confusion, institutional inertia, misperceptions about the suitability and utility, and a lack of guiding examples. In this special section of the Journal of Fish and Wildlife Management, we hope to reinvigorate the appropriate application of adaptive management for threatened and endangered species by framing such management in a decision-analytical context, clarifying misperceptions, classifying the types of decisions that might be amenable to an adaptive approach, and providing three fully developed case studies. In this overview paper, I define terms, review the past application of adaptive management, challenge perceived hurdles, and set the stage for the case studies which follow.

  10. Do Librarians Have a Shared Set of Values? A Comparative Study of 36 Codes of Ethics Based on Gorman's "Enduring Values"

    ERIC Educational Resources Information Center

    Foster, Catherine; McMenemy, David

    2012-01-01

    Thirty-six ethical codes from national professional associations were studied, the aim to test whether librarians have global shared values or if political and cultural contexts have significantly influenced the codes' content. Gorman's eight core values of stewardship, service, intellectual freedom, rationalism, literacy and learning, equity of…

  11. Extraordinarily Adaptive Properties of the Genetically Encoded Amino Acids

    PubMed Central

    Ilardo, Melissa; Meringer, Markus; Freeland, Stephen; Rasulev, Bakhtiyor; Cleaves II, H. James

    2015-01-01

    Using novel advances in computational chemistry, we demonstrate that the set of 20 genetically encoded amino acids, used nearly universally to construct all coded terrestrial proteins, has been highly influenced by natural selection. We defined an adaptive set of amino acids as one whose members thoroughly cover relevant physico-chemical properties, or “chemistry space.” Using this metric, we compared the encoded amino acid alphabet to random sets of amino acids. These random sets were drawn from a computationally generated compound library containing 1913 alternative amino acids that lie within the molecular weight range of the encoded amino acids. Sets that cover chemistry space better than the genetically encoded alphabet are extremely rare and energetically costly. Further analysis of more adaptive sets reveals common features and anomalies, and we explore their implications for synthetic biology. We present these computations as evidence that the set of 20 amino acids found within the standard genetic code is the result of considerable natural selection. The amino acids used for constructing coded proteins may represent a largely global optimum, such that any aqueous biochemistry would use a very similar set. PMID:25802223
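    The notion of a set that "thoroughly covers" a property space can be conveyed with a one-dimensional toy metric: score a candidate set by the largest stretch of the property range it leaves unclaimed, so that evenly spread sets score better than clumped ones. The real analysis works in a multi-dimensional chemistry space; this sketch, with made-up values, only illustrates the idea:

```python
def coverage_gap(values, lo, hi):
    """Largest stretch of the property range [lo, hi] not adjacent to
    any set member: a smaller gap means better coverage."""
    pts = sorted(values)
    gaps = [pts[0] - lo] + [b - a for a, b in zip(pts, pts[1:])] + [hi - pts[-1]]
    return max(gaps)

even = [1, 3, 5, 7, 9]         # spread across the range
clumped = [4, 4.5, 5, 5.5, 6]  # bunched in the middle
# The evenly spread set leaves a smaller largest gap (2 vs 4).
```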

  12. Adaptation of the Tool to Estimate Patient Costs Questionnaire into Indonesian Context for Tuberculosis-affected Households.

    PubMed

    Fuady, Ahmad; Houweling, Tanja A; Mansyur, Muchtaruddin; Richardus, Jan H

    2018-01-01

    Indonesia has the second-highest tuberculosis (TB) incidence worldwide. Hence, it urgently requires improvements and innovations beyond the strategies currently being implemented throughout the country. One fundamental step in monitoring its progress is preparing a validated tool to measure total patient costs and catastrophic total costs. The World Health Organization (WHO) recommends using a version of the generic questionnaire that has been adapted to the local cultural context in order to interpret findings correctly. This study aimed to adapt the Tool to Estimate Patient Costs questionnaire, which measures total costs and catastrophic total costs for tuberculosis-affected households, to the Indonesian context. The tool was adapted using best-practice guidelines. On the basis of a pre-test performed in a previous study (referred to as the Phase 1 Study), we refined the adaptation process by comparing it with the generic tool introduced by the WHO. We also held an expert committee review and performed pre-testing by interviewing 30 TB patients. After pre-testing, the tool was provided with complete explanation sheets for finalization. Seventy-two major changes were made during the adaptation process, including changing the answer choices to match the Indonesian context, refining the flow of questions, deleting questions, changing some words and restoring original questions that had been changed in the Phase 1 Study. Participants indicated that most questions were clear and easy to understand. To address recall difficulties by the participants, we made some adaptations to obtain data that might be missing, such as tracking data to medical records, developing a proxy of costs and guiding interviewers to ask for a specific value when participants were uncertain about the estimated market value of property they had sold.
    The adapted Tool to Estimate Patient Costs in Bahasa Indonesia is comprehensive and ready for use in future studies on TB-related catastrophic costs and is suitable for monitoring progress toward the target of the End TB Strategy.

  13. Context-specific adjustment of cognitive control: Transfer of adaptive control sets.

    PubMed

    Surrey, Caroline; Dreisbach, Gesine; Fischer, Rico

    2017-11-01

    Cognitive control protects processing of relevant information from interference by irrelevant information. The level of this processing selectivity can be flexibly adjusted to different control demands (e.g., frequency of conflict) associated with a certain context, leading to the formation of specific context-control associations. In the present study we investigated the robustness and transferability of the acquired context-control demands to new situations. In three experiments, we used a version of the context-specific proportion congruence (CSPC) paradigm, in which each context (e.g., location) is associated with a specific conflict frequency, determining high and low control demands. In a learning phase, associations between context and control demands were established. In a subsequent transfer block, stimulus-response mappings, whole task sets, or context-control demands changed. Results showed an impressive robustness of context-control associations, as context-specific adjustments of control from the learning phase were virtually unaffected by new stimuli and tasks in the transfer block. Only a change of the context-control demand eliminated the context-specific adjustment of control. These findings suggest that context-control associations that have proven to be adaptive in the past are continuously applied despite major changes in the task structure as long as the context-control associations remain the same.

  14. Framework for Designing Context-Aware Learning Systems

    ERIC Educational Resources Information Center

    Tortorella, Richard A. W.; Kinshuk; Chen, Nian-Shing

    2018-01-01

    Today people learn in many diverse locations and contexts, beyond the confines of classical brick and mortar classrooms. This trend is ever increasing, progressing hand-in-hand with the progress of technology. Context-aware learning systems are systems which adapt to the learner's context, providing tailored learning for a particular learning…

  15. SAGE: The Self-Adaptive Grid Code. 3

    NASA Technical Reports Server (NTRS)

    Davies, Carol B.; Venkatapathy, Ethiraj

    1999-01-01

    The multi-dimensional self-adaptive grid code, SAGE, is an important tool in the field of computational fluid dynamics (CFD). It provides an efficient method to improve the accuracy of flow solutions while simultaneously reducing computer processing time. Briefly, SAGE enhances an initial computational grid by redistributing the mesh points into more appropriate locations. The movement of these points is driven by an equal-error-distribution algorithm that utilizes the relationship between high flow gradients and excessive solution errors. The method also provides a balance between clustering points in the high gradient regions and maintaining the smoothness and continuity of the adapted grid. The latest version, Version 3, includes the ability to change the boundaries of a given grid to more efficiently enclose flow structures and provides alternative redistribution algorithms.
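    The equal-error-distribution idea can be illustrated in one dimension: place new grid points so that each interval carries an equal share of a gradient-based weight, which automatically clusters points where the weight (and hence the estimated error) is large. This is a generic 1-D equidistribution sketch with an invented weight function, not SAGE's actual multi-dimensional algorithm:

```python
import numpy as np

def equidistribute(x, w, n):
    """Place n points on [x[0], x[-1]] so that each interval carries an
    equal share of the cumulative weight w (e.g. an error estimate).
    Points cluster where w is large."""
    # Trapezoidal cumulative weight, then invert it at equal increments.
    cw = np.concatenate([[0.0], np.cumsum(0.5 * (w[1:] + w[:-1]) * np.diff(x))])
    targets = np.linspace(0.0, cw[-1], n)
    return np.interp(targets, cw, x)

x = np.linspace(0.0, 1.0, 101)
w = 1.0 + 50.0 * np.exp(-((x - 0.5) / 0.05) ** 2)  # sharp feature at x = 0.5
xa = equidistribute(x, w, 21)
# The adapted spacing is finest near the high-gradient region at 0.5.
```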

  16. Gyroaveraging operations using adaptive matrix operators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dominski, Julien; Ku, Seung -Hoe; Chang, Choong -Seock

    A new adaptive scheme to be used in particle-in-cell codes for carrying out gyroaveraging operations with matrices is presented. This new scheme uses an intermediate velocity grid whose resolution is adapted to the local thermal Larmor radius. The charge density is computed by projecting marker weights in a field-line following manner while preserving the adiabatic magnetic moment μ. These choices permit to improve the accuracy of the gyroaveraging operations performed with matrices even when strong spatial variation of temperature and magnetic field is present. Accuracy of the scheme in different geometries from simple 2D slab geometry to realistic 3D toroidal equilibrium has been studied. As a result, a successful implementation in the gyrokinetic code XGC is presented in the delta-f limit.

  17. Gyroaveraging operations using adaptive matrix operators

    DOE PAGES

    Dominski, Julien; Ku, Seung -Hoe; Chang, Choong -Seock

    2018-05-17

    A new adaptive scheme to be used in particle-in-cell codes for carrying out gyroaveraging operations with matrices is presented. This new scheme uses an intermediate velocity grid whose resolution is adapted to the local thermal Larmor radius. The charge density is computed by projecting marker weights in a field-line following manner while preserving the adiabatic magnetic moment μ. These choices permit to improve the accuracy of the gyroaveraging operations performed with matrices even when strong spatial variation of temperature and magnetic field is present. Accuracy of the scheme in different geometries from simple 2D slab geometry to realistic 3D toroidal equilibrium has been studied. As a result, a successful implementation in the gyrokinetic code XGC is presented in the delta-f limit.
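    The matrix form of gyroaveraging can be sketched simply: each row of the matrix averages the field over points on a ring whose radius is the Larmor radius. The uniform periodic grid, nearest-grid-point sampling, and constant `rho` below are simplifications for illustration; the scheme described in this record uses field-line-following projection and a velocity grid adapted to the local thermal Larmor radius:

```python
import numpy as np

def gyroaverage_matrix(nx, ny, hx, hy, rho, npts=8):
    """Dense matrix that replaces the field at each node of a periodic
    nx-by-ny grid (spacings hx, hy) with its average over npts points
    on a ring of radius rho. Nearest-grid-point sampling is used for
    brevity; production codes interpolate."""
    n = nx * ny
    M = np.zeros((n, n))
    angles = 2.0 * np.pi * np.arange(npts) / npts
    for i in range(nx):
        for j in range(ny):
            row = i * ny + j
            for a in angles:
                ii = int(np.rint(i + rho * np.cos(a) / hx)) % nx
                jj = int(np.rint(j + rho * np.sin(a) / hy)) % ny
                M[row, ii * ny + jj] += 1.0 / npts
    return M

M = gyroaverage_matrix(16, 16, 1.0, 1.0, rho=2.0)
# Each row sums to 1, so a constant field is left unchanged, while
# finite-wavelength perturbations are damped (a J0-like response).
```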

  18. Teaching Decoding Strategies without Destroying Story.

    ERIC Educational Resources Information Center

    Kane, Sharon

    1999-01-01

    Argues that deep coding skills must and can be introduced, taught, practiced, and reinforced within contexts meaningful to students. Shows how teachers can provide these meaningful educational contexts within which decoding strategies make sense to emerging readers. (SR)

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    SmartImport.py is a Python source-code file that implements a replacement for the standard Python module importer. The code is derived from knee.py, a file in the standard Python distribution, and adds functionality to improve the performance of Python module imports in massively parallel contexts.

  20. De-coding Reading at Work: Workplace Reading Competencies.

    ERIC Educational Resources Information Center

    Searle, Jean

    1998-01-01

    Naturalistic observations and interviews with service workers found on-the-job reading was based on knowledge of codes and rules of practice and required problem-solving and metacognitive strategies. Workplace competencies should be considered within their social and cultural context. (SK)

  1. Amy2B copy number variation reveals starch diet adaptations in ancient European dogs.

    PubMed

    Ollivier, Morgane; Tresset, Anne; Bastian, Fabiola; Lagoutte, Laetitia; Axelsson, Erik; Arendt, Maja-Louise; Bălăşescu, Adrian; Marshour, Marjan; Sablin, Mikhail V; Salanova, Laure; Vigne, Jean-Denis; Hitte, Christophe; Hänni, Catherine

    2016-11-01

    Extant dog and wolf DNA indicates that dog domestication was accompanied by the selection of a series of duplications on the Amy2B gene coding for pancreatic amylase. In this study, we used a palaeogenetic approach to investigate the timing and expansion of the Amy2B gene in the ancient dog populations of Western and Eastern Europe and Southwest Asia. Quantitative polymerase chain reaction was used to estimate the copy numbers of this gene for 13 ancient dog samples, dated to between 15 000 and 4000 years before present (cal. BP). This evidenced an increase of Amy2B copies in ancient dogs from as early as the 7th millennium cal. BP in Southeastern Europe. We found that the gene expansion was not fixed across all dogs within this early farming context, with ancient dogs bearing between 2 and 20 diploid copies of the gene. The results also suggested that selection for the increased Amy2B copy number started 7000 years cal. BP, at the latest. This expansion reflects a local adaptation that allowed dogs to thrive on a starch rich diet, especially within early farming societies, and suggests a biocultural coevolution of dog genes and human culture.
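    The quantitative PCR copy-number estimate mentioned here can be illustrated with the standard delta-delta-Ct calculation, in which a sample's target gene is compared against a reference gene and against a calibrator of known copy number. The Ct values below are hypothetical and the study's exact assay design is not given in the abstract; this is only the generic arithmetic:

```python
def copy_number(ct_target, ct_ref, ct_target_cal, ct_ref_cal, cal_copies=2):
    """Estimate the diploid copy number of a target gene from qPCR Ct
    values via the delta-delta-Ct method, relative to a calibrator
    sample carrying cal_copies copies."""
    ddct = (ct_target - ct_ref) - (ct_target_cal - ct_ref_cal)
    return cal_copies * 2.0 ** (-ddct)

# A sample whose target amplifies 2 cycles earlier (relative to the
# reference gene) than a 2-copy calibrator carries ~8 copies.
cn = copy_number(ct_target=22.0, ct_ref=24.0, ct_target_cal=24.0, ct_ref_cal=24.0)
```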

  2. Amy2B copy number variation reveals starch diet adaptations in ancient European dogs

    PubMed Central

    Tresset, Anne; Bastian, Fabiola; Lagoutte, Laetitia; Arendt, Maja-Louise; Bălăşescu, Adrian; Marshour, Marjan; Sablin, Mikhail V.; Salanova, Laure; Vigne, Jean-Denis; Hitte, Christophe; Hänni, Catherine

    2016-01-01

    Extant dog and wolf DNA indicates that dog domestication was accompanied by the selection of a series of duplications on the Amy2B gene coding for pancreatic amylase. In this study, we used a palaeogenetic approach to investigate the timing and expansion of the Amy2B gene in the ancient dog populations of Western and Eastern Europe and Southwest Asia. Quantitative polymerase chain reaction was used to estimate the copy numbers of this gene for 13 ancient dog samples, dated to between 15 000 and 4000 years before present (cal. BP). This evidenced an increase of Amy2B copies in ancient dogs from as early as the 7th millennium cal. BP in Southeastern Europe. We found that the gene expansion was not fixed across all dogs within this early farming context, with ancient dogs bearing between 2 and 20 diploid copies of the gene. The results also suggested that selection for the increased Amy2B copy number started 7000 years cal. BP, at the latest. This expansion reflects a local adaptation that allowed dogs to thrive on a starch rich diet, especially within early farming societies, and suggests a biocultural coevolution of dog genes and human culture. PMID:28018628

  3. Measuring emotion regulation and emotional expression in breast cancer patients: A systematic review.

    PubMed

    Brandão, Tânia; Tavares, Rita; Schulz, Marc S; Matos, Paula Mena

    2016-02-01

    The important role of emotion regulation and expression in adaptation to breast cancer is now widely recognized. Studies have shown that optimal emotion regulation strategies, including less constrained emotional expression, are associated with better adaptation. Our objective was to systematically review measures used to assess the way women with breast cancer regulate their emotions. This systematic review was conducted in accordance with PRISMA guidelines. Nine different databases were searched. Data were independently extracted and assessed by two researchers. English-language articles that used at least one instrument to measure strategies to regulate emotions in women with breast cancer were included. Of the 679 abstracts identified, 59 studies were deemed eligible for inclusion. Studies were coded regarding their objectives, methods, and results. We identified 16 instruments used to measure strategies of emotion regulation and expression. The most frequently employed instrument was the Courtauld Emotional Control Scale. Few psychometric properties other than internal consistency were reported for most instruments. Many studies did not include important information regarding descriptive characteristics and psychometric properties of the instruments used. The instruments used tap different aspects of emotion regulation. Specific instruments should be explored further with regard to content, validity, and reliability in the context of breast cancer. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Orthogonal Procrustes Analysis for Dictionary Learning in Sparse Linear Representation.

    PubMed

    Grossi, Giuliano; Lanzarotti, Raffaella; Lin, Jianyi

    2017-01-01

    In the sparse representation model, the design of overcomplete dictionaries plays a key role in effectiveness and applicability across domains. Recent research has produced several dictionary learning approaches, and dictionaries learnt from data examples have been shown to significantly outperform structured ones, e.g. wavelet transforms. In this context, learning consists in adapting the dictionary atoms to a set of training signals in order to promote a sparse representation that minimizes the reconstruction error. Finding the best fitting dictionary remains a very difficult task, leaving the question still open. A well-established heuristic for tackling this problem is an iterative alternating scheme, adopted for instance in the well-known K-SVD algorithm. Essentially, it consists in repeating two stages: the former promotes sparse coding of the training set and the latter adapts the dictionary to reduce the error. In this paper we present R-SVD, a new method that, while maintaining the alternating scheme, adopts Orthogonal Procrustes analysis to update the dictionary atoms, suitably arranged into groups. Comparative experiments on synthetic data prove the effectiveness of R-SVD with respect to well-known dictionary learning algorithms such as K-SVD, ILS-DLA and the online method OSDL. Moreover, experiments on natural data such as ECG compression, EEG sparse representation, and image modeling confirm R-SVD's robustness and wide applicability.
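    The Orthogonal Procrustes step at the heart of the dictionary update has a classical closed-form solution: the orthogonal matrix that best maps one set of atoms onto another comes from the SVD of their cross-product. A minimal sketch of that step alone (the alternating sparse-coding stage and the grouping of atoms described in the abstract are omitted):

```python
import numpy as np

def procrustes_rotation(X, Y):
    """Orthogonal matrix R minimizing ||Y - R X||_F, obtained from the
    SVD of Y X^T (the classical orthogonal Procrustes solution)."""
    U, _, Vt = np.linalg.svd(Y @ X.T)
    return U @ Vt

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 20))        # 5 atoms observed at 20 signals
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))  # a known rotation
Y = Q @ X
R = procrustes_rotation(X, Y)
# R recovers Q (up to numerical precision), so R @ X matches Y.
```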

  5. The future of 3D and video coding in mobile and the internet

    NASA Astrophysics Data System (ADS)

    Bivolarski, Lazar

    2013-09-01

    The current success of the Internet has already changed our social and economic world and is continuing to revolutionize information exchange. The exponential increase in the amount and types of data currently exchanged on the Internet represents a significant challenge for the design of future architectures and solutions. This paper reviews the current status of, and trends in, the design of solutions and research activities in the future Internet, from the point of view of managing the growth of bandwidth requirements and the complexity of the multimedia being created and shared. It outlines the challenges facing video coding and approaches to the design of standardized media formats and protocols, considering the expected convergence of multimedia formats and exchange interfaces. The rapid growth of connected mobile devices adds to the current and future challenges, in combination with the expected near-future arrival of a multitude of connected devices. The new Internet technologies connecting the Internet of Things with wireless visual sensor networks and 3D virtual worlds require conceptually new approaches to media content handling, from acquisition to presentation, in the 3D Media Internet. Accounting for the properties of the entire transmission system, and enabling real-time adaptation to context and content throughout the media processing path, will be paramount in enabling the new media architectures as well as the new applications and services. The common video coding formats will need to be conceptually redesigned to allow for the implementation of the necessary 3D Media Internet features.

  6. Smartphone tool to collect repeated 24 h dietary recall data in Nepal.

    PubMed

    Harris-Fry, Helen; Beard, B James; Harrisson, Tom; Paudel, Puskar; Shrestha, Niva; Jha, Sonali; Shrestha, Bhim P; Manandhar, Dharma S; Costello, Anthony; Saville, Naomi M

    2018-02-01

    To outline the development of a smartphone-based tool to collect thrice-repeated 24 h dietary recall data in rural Nepal, and to describe energy intakes, common errors and researchers' experiences using the tool. We designed a novel tool to collect multi-pass 24 h dietary recalls in rural Nepal by combining the use of a CommCare questionnaire on smartphones, a paper form, a QR (quick response)-coded list of foods and a photographic atlas of portion sizes. Twenty interviewers collected dietary data on three non-consecutive days per respondent, with three respondents per household. Intakes were converted into nutrients using databases on nutritional composition of foods, recipes and portion sizes. Dhanusha and Mahottari districts, Nepal. Pregnant women, their mothers-in-law and male household heads. Energy intakes assessed in 150 households; data corrections and our experiences reported from 805 households and 6765 individual recalls. Dietary intake estimates gave plausible values, with male household heads appearing to have higher energy intakes (median (25th-75th centile): 12 079 (9293-14 108) kJ/d) than female members (8979 (7234-11 042) kJ/d for pregnant women). Manual editing of data was required when interviewers mistook portions for food codes and for coding items not on the food list. Smartphones enabled quick monitoring of data and interviewer performance, but we initially faced technical challenges with CommCare forms crashing. With sufficient time dedicated to development and pre-testing, this novel smartphone-based tool provides a useful method to collect data. Future work is needed to further validate this tool and adapt it for other contexts.

  7. Predictive coding accelerates word recognition and learning in the early stages of language development.

    PubMed

    Ylinen, Sari; Bosseler, Alexis; Junttila, Katja; Huotilainen, Minna

    2017-11-01

    The ability to predict future events in the environment and learn from them is a fundamental component of adaptive behavior across species. Here we propose that inferring predictions facilitates speech processing and word learning in the early stages of language development. Twelve- and 24-month olds' electrophysiological brain responses to heard syllables are faster and more robust when the preceding word context predicts the ending of a familiar word. For unfamiliar, novel word forms, however, word-expectancy violation generates a prediction error response, the strength of which significantly correlates with children's vocabulary scores at 12 months. These results suggest that predictive coding may accelerate word recognition and support early learning of novel words, including not only the learning of heard word forms but also their mapping to meanings. Prediction error may mediate learning via attention, since infants' attention allocation to the entire learning situation in natural environments could account for the link between prediction error and the understanding of word meanings. On the whole, the present results on predictive coding support the view that principles of brain function reported across domains in humans and non-human animals apply to language and its development in the infant brain. A video abstract of this article can be viewed at: http://hy.fi/unitube/video/e1cbb495-41d8-462e-8660-0864a1abd02c. [Correction added on 27 January 2017, after first online publication: The video abstract link was added.]. © 2016 John Wiley & Sons Ltd.
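    The predictive coding mechanism invoked here can be caricatured with a delta-rule update: an internal prediction is compared with the input, and the resulting prediction error drives learning, shrinking as the input becomes familiar. This scalar toy, with an arbitrary learning rate, is only an analogue of the proposed neural mechanism:

```python
def update_prediction(pred, observed, lr=0.2):
    """One predictive-coding step: compute the prediction error and
    move the internal prediction toward the observation."""
    err = observed - pred
    return pred + lr * err, err

# Repeated exposure to a consistent input shrinks the error signal,
# a toy analogue of facilitated processing of familiar word endings.
p, errs = 0.0, []
for _ in range(20):
    p, e = update_prediction(p, 1.0)
    errs.append(abs(e))
```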

  8. Using the American Board of Internal Medicine's "Elements of Professionalism" for undergraduate ethics education.

    PubMed

    Robins, Lynne S; Braddock, Clarence H; Fryer-Edwards, Kelly A

    2002-06-01

    To examine the feasibility of using the taxonomy of professional and unprofessional behaviors presented in the American Board of Internal Medicine's (ABIM's) Project Professionalism to categorize ethical issues that undergraduate medical students perceive to be salient. Beginning second-year medical students at the University of Washington School of Medicine (n = 120) were asked to respond to three open-ended questions about professional standards of conduct and peer evaluation. Two of the authors read and coded the students' responses according to the ABIM's elements of professionalism (altruism, accountability, excellence, duty, honor and integrity, and respect for others) and the challenges to those elements (abuse of power, arrogance, greed, misrepresentation, impairment, lack of conscientiousness, and conflict of interest). Coding disagreements were resolved through review and revision of the category definitions. New categories were created for students' responses that described behaviors or issues that were not captured in the ABIM's categories. A total of 114 students responded. The ABIM's professional code was adapted for students and teachers, making it context- and learning-stage-specific. One new category of challenges, conflicts of conscience, was added, and one category (abuse of power) was expanded to include abuse of power/negotiating power asymmetries. Using the ABIM's taxonomy to name professional and unprofessional behaviors was particularly useful for examining undergraduate medical students' perceptions of the ethical climate for learning during the first year of medical school, and it holds promise for research into changes in students' perceptions as they move into clinical experiences. Using the framework, students can build a unified professional knowledge-and-skills base.

  9. Virtual Knowledge Brokering: Describing the Roles and Strategies Used by Knowledge Brokers in a Pediatric Physiotherapy Virtual Community of Practice.

    PubMed

    Hurtubise, Karen; Rivard, Lisa; Héguy, Léa; Berbari, Jade; Camden, Chantal

    2016-01-01

    Knowledge transfer in pediatric rehabilitation is challenging and requires active, multifaceted strategies. The use of knowledge brokers (KBs) is one such strategy noted to promote clinician behavior change. The success of using KBs to transfer knowledge relies on their ability to adapt to ever-changing clinical contexts. In addition, with the rapid growth of online platforms as knowledge transfer forums, KBs must become effective in virtual environments. Although the role of KBs has been studied in various clinical contexts, their emerging role in specific online environments designed to support evidence-based behavior change has not yet been described. Our objective is to describe the roles of, and strategies used by, four KBs involved in a virtual community of practice to guide and inform future online KB interventions. A descriptive design guided this study and a thematic content analysis process was used to analyze online KB postings. The Promoting Action on Research in Health Sciences knowledge transfer framework and online andragogical learning theories assisted in the coding. A thematic map was created illustrating the links between KBs' strategies and emerging roles in the virtual environment. We analyzed 95 posts and identified three roles: 1) context architect: promoting a respectful learning environment, 2) knowledge sharing promoter: building capacity, and 3) linkage creator: connecting research-to-practice. Strategies used by KBs reflected invitational, constructivism, and connectivism approaches, with roles and strategies changing over time. This study increases our understanding of the actions of KBs in virtual contexts to foster uptake of research evidence in pediatric physiotherapy. Our results provide valuable information about the knowledge and skills required by individuals to fulfill this role in virtual environments.

  10. 'The stars seem aligned': a qualitative study to understand the effects of context on scale-up of maternal and newborn health innovations in Ethiopia, India and Nigeria.

    PubMed

    Spicer, Neil; Berhanu, Della; Bhattacharya, Dipankar; Tilley-Gyado, Ritgak Dimka; Gautham, Meenakshi; Schellenberg, Joanna; Tamire-Woldemariam, Addis; Umar, Nasir; Wickremasinghe, Deepthi

    2016-11-25

    Donors commonly fund innovative interventions to improve health in the hope that governments of low- and middle-income countries will scale up those that are shown to be effective. Yet innovations can be slow to be adopted by country governments and implemented at scale. Our study explores this problem by identifying key contextual factors influencing scale-up of maternal and newborn health innovations in three low-income settings: Ethiopia, the six states of northeast Nigeria and Uttar Pradesh state in India. We conducted 150 semi-structured interviews in 2012/13 with stakeholders from government, development partner agencies, externally funded implementers including civil society organisations, academic institutions and professional associations to understand scale-up of innovations to improve the health of mothers and newborns in these study settings. We analysed interview data with the aid of a common analytic framework to enable cross-country comparison, using NVivo to code themes. We found that multiple contextual factors enabled and undermined attempts to catalyse scale-up of donor-funded maternal and newborn health innovations. Factors influencing government decisions to accept innovations at scale included: how health policy decisions are made; prioritising and funding maternal and newborn health; and development partner harmonisation. Factors influencing the implementation of innovations at scale included: health systems capacity in the three settings; and security in northeast Nigeria. Contextual factors influencing beneficiary communities' uptake of innovations at scale included: sociocultural contexts; and access to healthcare. We conclude that context is critical: externally funded implementers need to assess and adapt for contexts if they are to successfully position an innovation for scale-up.

  11. A posteriori error determination and grid adaptation for AMR and ALE computational fluid dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lapenta, G. M.

    2002-01-01

    We discuss grid adaptation for application to AMR and ALE codes. Two new contributions are presented. First, a new method to locate the regions where the truncation error is being created due to insufficient accuracy: the operator recovery error origin (OREO) detector. The OREO detector is automatic, reliable, easy to implement and extremely inexpensive. Second, a new grid motion technique is presented for application to ALE codes. The method is based on the Brackbill-Saltzman approach but it is directly linked to the OREO detector and moves the grid automatically to minimize the error.

  12. The geography of sex-specific selection, local adaptation, and sexual dimorphism.

    PubMed

    Connallon, Tim

    2015-09-01

    Local adaptation and sexual dimorphism are iconic evolutionary scenarios of intraspecific adaptive differentiation in the face of gene flow. Although theory has traditionally considered local adaptation and sexual dimorphism as conceptually distinct processes, emerging data suggest that they often act concurrently during evolutionary diversification. Here, I merge theories of local adaptation in space and sex-specific adaptation over time, and show that their confluence yields several new predictions about the roles of context-specific selection, migration, and genetic correlations, in adaptive diversification. I specifically revisit two influential predictions from classical studies of clinal adaptation and sexual dimorphism: (1) that local adaptation should decrease with distance from the species' range center and (2) that opposing directional selection between the sexes (sexual antagonism) should inevitably accompany the evolution of sexual dimorphism. I show that both predictions can break down under clinally varying selection. First, the geography of local adaptation can be sexually dimorphic, with locations of relatively high local adaptation differing profoundly between the sexes. Second, the intensity of sexual antagonism varies across the species' range, with subpopulations near the range center representing hotspots for antagonistic selection. The results highlight the context-dependent roles of migration versus sexual conflict as primary constraints to adaptive diversification. © 2015 The Author(s). Evolution © 2015 The Society for the Study of Evolution.

  13. Reflections on the Adaptive Designs Accelerating Promising Trials Into Treatments (ADAPT-IT) Process—Findings from a Qualitative Study

    PubMed Central

    Guetterman, Timothy C.; Fetters, Michael D.; Legocki, Laurie J.; Mawocha, Samkeliso; Barsan, William G.; Lewis, Roger J.; Berry, Donald A.; Meurer, William J.

    2015-01-01

    Context The context for this study was the Adaptive Designs Advancing Promising Treatments Into Trials (ADAPT-IT) project, which aimed to incorporate flexible adaptive designs into pivotal clinical trials and to conduct an assessment of the trial development process. Little research provides guidance to academic institutions in planning adaptive trials. Objectives The purpose of this qualitative study was to explore the perspectives and experiences of stakeholders as they reflected on the interactive ADAPT-IT adaptive design development process, and to understand their perspectives regarding lessons learned about the design of the trials and trial development. Materials and methods We conducted semi-structured interviews with ten key stakeholders and observations of the process. We employed qualitative thematic text data analysis to reduce the data into themes about the ADAPT-IT project and adaptive clinical trials. Results The qualitative analysis revealed four themes: education of the project participants, how the process evolved with participant feedback, procedures that could enhance the development of other trials, and education of the broader research community. Discussion and conclusions While participants became more likely to consider flexible adaptive designs, additional education is needed to both understand the adaptive methodology and articulate it when planning trials. PMID:26622163

  14. Cultural Adaptation of the Strengthening Families Program 10-14 to Italian Families

    ERIC Educational Resources Information Center

    Ortega, Enrique; Giannotta, Fabrizia; Latina, Delia; Ciairano, Silvia

    2012-01-01

    Background: The family context has proven to be a useful target in which to apply prevention efforts aimed at child and adolescent health risk behaviors. There are currently a variety of cultural adaptation models that serve to guide the international adaptation of intervention programs. Objective: The cultural adaptation process and program…

  15. Adaptive Technologies. Research Report. ETS RR-07-05

    ERIC Educational Resources Information Center

    Shute, Valerie J.; Zapata-Rivera, Diego

    2007-01-01

    This paper describes research and development efforts related to adaptive technologies, which can be combined with other technologies and processes to form an adaptive system. The goal of an adaptive system, in the context of this paper, is to create an instructionally sound and flexible environment that supports learning for students with a range…

  16. PLEXIL-DL: Language and Runtime for Context-Aware Robot Behaviour

    NASA Astrophysics Data System (ADS)

    Moser, Herwig; Reichelt, Toni; Oswald, Norbert; Förster, Stefan

    Faced with the growing complexity of application scenarios social robots are involved with, the perception of environmental circumstances and the sentient reactions are becoming more and more important abilities. Rather than regarding both abilities in isolation, the entire transformation process, from context-awareness to purposive behaviour, forms a robot’s adaptivity. While attaining context-awareness has received much attention in literature so far, translating it into appropriate actions still lacks a comprehensive approach. In this paper, we present PLEXIL-DL, an expressive language allowing complex context expressions as an integral part of constructs that define sophisticated behavioural reactions. Our approach extends NASA’s PLEXIL language by Description Logic queries, both in syntax and formal semantics. A prototypical implementation of a PLEXIL-DL interpreter shows the basic mechanisms facilitating the robot’s adaptivity through context-awareness.

  17. M-type potassium conductance controls the emergence of neural phase codes: a combined experimental and neuron modelling study

    PubMed Central

    Kwag, Jeehyun; Jang, Hyun Jae; Kim, Mincheol; Lee, Sujeong

    2014-01-01

    Rate and phase codes are believed to be important in neural information processing. Hippocampal place cells provide a good example where both coding schemes coexist during spatial information processing. Spike rate increases in the place field, whereas spike phase precesses relative to the ongoing theta oscillation. However, what intrinsic mechanism allows for a single neuron to generate spike output patterns that contain both neural codes is unknown. Using dynamic clamp, we simulate an in vivo-like subthreshold dynamics of place cells to in vitro CA1 pyramidal neurons to establish an in vitro model of spike phase precession. Using this in vitro model, we show that membrane potential oscillation (MPO) dynamics is important in the emergence of spike phase codes: blocking the slowly activating, non-inactivating K+ current (IM), which is known to control subthreshold MPO, disrupts MPO and abolishes spike phase precession. We verify the importance of adaptive IM in the generation of phase codes using both an adaptive integrate-and-fire and a Hodgkin–Huxley (HH) neuron model. Especially, using the HH model, we further show that it is the perisomatically located IM with slow activation kinetics that is crucial for the generation of phase codes. These results suggest an important functional role of IM in single neuron computation, where IM serves as an intrinsic mechanism allowing for dual rate and phase coding in single neurons. PMID:25100320
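The role of a slow adaptation current in spike timing can be sketched with an adaptive integrate-and-fire model of the kind the abstract mentions. The sketch below is only illustrative (all parameter values are arbitrary, not taken from the paper): a spike-triggered adaptation current, playing the role of the slow IM, subtracts from the drive and progressively lengthens inter-spike intervals.

```python
def adaptive_lif(I, dt=0.1, tau_m=10.0, tau_w=100.0, b=1.0,
                 v_th=1.0, v_reset=0.0):
    """Adaptive leaky integrate-and-fire neuron: each spike increments an
    adaptation current w (cf. a slow IM-like current), which decays with
    time constant tau_w and subtracts from the input drive."""
    v, w, spike_times = 0.0, 0.0, []
    for step, i_ext in enumerate(I):
        v += dt / tau_m * (-v + i_ext - w)   # leaky membrane integration
        w += -dt / tau_w * w                 # adaptation current decay
        if v >= v_th:                        # threshold crossing -> spike
            spike_times.append(step * dt)
            v = v_reset
            w += b                           # spike-triggered adaptation
    return spike_times

# under constant drive, adaptation stretches successive inter-spike intervals
spikes = adaptive_lif([2.0] * 5000)
isis = [t2 - t1 for t1, t2 in zip(spikes, spikes[1:])]
assert len(spikes) > 3 and isis[0] < isis[-1]
```

The stretching of inter-spike intervals under constant drive is the signature of spike-frequency adaptation; how such a current shapes spike *phase* relative to an oscillation is the subject of the paper itself.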

  18. Multiple component codes based generalized LDPC codes for high-speed optical transport.

    PubMed

    Djordjevic, Ivan B; Wang, Ting

    2014-07-14

    A class of generalized low-density parity-check (GLDPC) codes suitable for optical communications is proposed, which consists of multiple local codes. It is shown that Hamming, BCH, and Reed-Muller codes can be used as local codes, and that the maximum a posteriori probability (MAP) decoding of these local codes by the Ashikhmin-Lytsin algorithm is feasible in terms of complexity and performance. We demonstrate that record coding gains can be obtained from properly designed GLDPC codes, derived from multiple component codes. We then show that several recently proposed classes of LDPC codes such as convolutional and spatially-coupled codes can be described using the concept of GLDPC coding, which indicates that the GLDPC coding can be used as a unified platform for advanced FEC enabling ultra-high speed optical transport. The proposed class of GLDPC codes is also suitable for code-rate adaptation, to adjust the error correction strength depending on the optical channel conditions.
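As background on the component-code idea, the sketch below shows syndrome decoding of a Hamming(7,4) code, one of the local codes the abstract names as usable inside a GLDPC construction. It is only an illustrative hard-decision single-error corrector, not the MAP (Ashikhmin-Lytsin) decoder the paper uses.

```python
# Hamming(7,4) parity-check matrix; column j is the binary representation
# of j (MSB in the first row), so the syndrome equals the error position.
H = [[0, 0, 0, 1, 1, 1, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [1, 0, 1, 0, 1, 0, 1]]

def syndrome(word):
    """Parity checks of a 7-bit word over GF(2)."""
    return [sum(h * w for h, w in zip(row, word)) % 2 for row in H]

def correct(word):
    """Single-error correction: the syndrome, read as a binary number,
    is the 1-based position of the flipped bit (0 means no error)."""
    s = syndrome(word)
    pos = s[0] * 4 + s[1] * 2 + s[2]
    fixed = word[:]
    if pos:
        fixed[pos - 1] ^= 1
    return fixed

codeword = [1, 0, 1, 0, 1, 0, 1]      # a valid Hamming(7,4) codeword
corrupted = codeword[:]
corrupted[3] ^= 1                     # flip one bit
assert syndrome(codeword) == [0, 0, 0]
assert correct(corrupted) == codeword
```

In a GLDPC code, each row group of the global parity-check matrix imposes such a local code on a subset of the bits, and decoding iterates between the local decoders.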

  19. Context-specific adaptation of pursuit initiation in humans

    NASA Technical Reports Server (NTRS)

    Takagi, M.; Abe, H.; Hasegawa, S.; Usui, T.; Hasebe, H.; Miki, A.; Zee, D. S.; Shelhauser, M. (Principal Investigator)

    2000-01-01

    PURPOSE: To determine if multiple states for the initiation of pursuit, as assessed by acceleration in the "open-loop" period, can be learned and gated by context. METHODS: Four normal subjects were studied. A modified step-ramp paradigm for horizontal pursuit was used to induce adaptation. In an increasing paradigm, target velocity doubled 230 msec after onset; in a decreasing paradigm, it was halved. In the first experiment, vertical eye position (+/-5 degrees ) was used as the context cue, and the training paradigm (increasing or decreasing) changed with vertical eye position. In the second experiment, with vertical position constant, when the target was red, training was decreasing, and when green, increasing. The average eye acceleration in the first 100 msec of tracking was the index of open-loop pursuit performance. RESULTS: With vertical position as the cue, pursuit adaptation differed between up and down gaze. In some cases, the direction of adaptation was in exact accord with the training stimuli. In others, acceleration increased or decreased for both up and down gaze but always in correct relative proportion to the training stimuli. In contrast, multiple adaptive states were not induced with color as the cue. CONCLUSIONS: Multiple values for the relationship between the average eye acceleration during the initiation of pursuit and target velocity could be learned and gated by context. Vertical position was an effective contextual cue but not target color, implying that useful contextual cues must be similar to those occurring naturally, for example, orbital position with eye muscle weakness.

  20. Factor Structure and Incremental Validity of the Enhanced Computer- Administered Tests

    DTIC Science & Technology

    1992-07-01

    Subject terms: aptitude tests, ASVAB (Armed Services Vocational Aptitude Battery), CAT-ASVAB (a computerized adaptive testing version of the ASVAB), the psychomotor portion of the General Aptitude Test Battery (GATB), and performance in the mechanical maintenance specialties.

  1. Information preserving coding for multispectral data

    NASA Technical Reports Server (NTRS)

    Duan, J. R.; Wintz, P. A.

    1973-01-01

    A general formulation of the data compression system is presented. A method of instantaneous expansion of quantization levels by reserving two codewords in the codebook to perform a folding over in quantization is implemented for error-free coding of data with incomplete knowledge of the probability density function. Results for simple DPCM with folding and an adaptive transform coding technique followed by a DPCM technique are compared using ERTS-1 data.
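For context, a minimal DPCM loop looks like the following sketch. It is illustrative only: it uses a plain uniform quantizer and a previous-sample predictor, and does not reproduce the codeword-reserving fold-over described above.

```python
def dpcm_encode(samples, step=4):
    """Simple DPCM: quantize the difference between each sample and the
    decoder's reconstruction of the previous one (closed-loop prediction)."""
    codes, recon_prev = [], 0
    for s in samples:
        diff = s - recon_prev
        q = round(diff / step)      # quantized prediction residual
        codes.append(q)
        recon_prev += q * step      # track the decoder-side reconstruction
    return codes

def dpcm_decode(codes, step=4):
    """Accumulate dequantized residuals to rebuild the signal."""
    out, recon = [], 0
    for q in codes:
        recon += q * step
        out.append(recon)
    return out

signal = [0, 3, 9, 14, 12, 7]
recon = dpcm_decode(dpcm_encode(signal))
# closed-loop prediction bounds the error by half the quantizer step
assert all(abs(a - b) <= 2 for a, b in zip(signal, recon))
```

Encoding against the decoder's own reconstruction (rather than the true previous sample) is what keeps quantization error from accumulating.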

  2. Solution of nonlinear flow equations for complex aerodynamic shapes

    NASA Technical Reports Server (NTRS)

    Djomehri, M. Jahed

    1992-01-01

    Solution-adaptive CFD codes based on unstructured methods for 3-D complex geometries in subsonic to supersonic regimes were investigated, and the computed solution data were analyzed in conjunction with experimental data obtained from wind tunnel measurements in order to assess and validate the predictability of the code. Specifically, the FELISA code was assessed and improved in cooperation with NASA Langley and Imperial College, Swansea, U.K.

  3. Edge equilibrium code for tokamaks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Xujing; Zakharov, Leonid E.; Drozdov, Vladimir V.

    2014-01-15

    The edge equilibrium code (EEC) described in this paper is developed for simulations of the near edge plasma using the finite element method. It solves the Grad-Shafranov equation in toroidal coordinates and uses adaptive grids aligned with magnetic field lines. Hermite finite elements are chosen for the numerical scheme. A fast Newton scheme which is the same as implemented in the equilibrium and stability code (ESC) is applied here to adjust the grids.

  4. Reliable video transmission over fading channels via channel state estimation

    NASA Astrophysics Data System (ADS)

    Kumwilaisak, Wuttipong; Kim, JongWon; Kuo, C.-C. Jay

    2000-04-01

    Transmission of continuous media such as video over time-varying wireless communication channels can benefit from the use of adaptation techniques in both source and channel coding. An adaptive feedback-based wireless video transmission scheme is investigated in this research with special emphasis on feedback-based adaptation. To be more specific, an interactive adaptive transmission scheme is developed by letting the receiver estimate the channel state information and send it back to the transmitter. By utilizing the feedback information, the transmitter is capable of adapting the level of protection by changing the flexible RCPC (rate-compatible punctured convolutional) code ratio depending on the instantaneous channel condition. The wireless channel is modeled as a fading channel, where the long-term and short-term fading effects are modeled as the log-normal fading and the Rayleigh flat fading, respectively. Then, its state (mainly the long-term fading portion) is tracked and predicted by using an adaptive LMS (least mean squares) algorithm. By utilizing the delayed feedback on the channel condition, the adaptation performance of the proposed scheme is first evaluated in terms of the error probability and the throughput. It is then extended to incorporate variable size packets of ITU-T H.263+ video with the error resilience option. Finally, the end-to-end performance of wireless video transmission is compared against several non-adaptive protection schemes.
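The channel-tracking step can be illustrated with a generic one-tap LMS estimator. This is a hedged sketch under a unit-pilot assumption, not the authors' exact predictor: it tracks a slowly varying channel gain from noisy observations by stepping the estimate along the instantaneous error.

```python
def lms_track(observed, mu=0.1):
    """One-tap LMS tracker: estimate a slowly varying channel gain g[n]
    from observations y[n] = g[n] * x[n], assuming a known pilot x[n] = 1."""
    g_hat, estimates = 0.0, []
    for y in observed:
        err = y - g_hat      # innovation: observation minus prediction
        g_hat += mu * err    # LMS update (gradient step, x[n] = 1)
        estimates.append(g_hat)
    return estimates

# a constant channel gain of 2.0, observed without noise: the estimate
# converges geometrically with factor (1 - mu)
est = lms_track([2.0] * 100, mu=0.2)
assert abs(est[-1] - 2.0) < 1e-6
```

The step size mu trades tracking speed against noise sensitivity, which is exactly the knob a scheme like the one above must tune for the long-term fading component.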

  5. An Adaption of Gagné's Instructional Model to Increase the Teaching Effectiveness in the Classroom: The Impact in Romanian Universities

    ERIC Educational Resources Information Center

    Ilie, Marian D.

    2014-01-01

    Gagné's instructional events are more focused on the human internal learning process than on the learning context. This study fills this gap because it presents certain instructional events that are focused on the construction of a positive learning context through the teacher-student relationship. Therefore, it proposes an adaption of Gagné's…

  6. The moving platform after-effect reveals dissociation between what we know and how we walk.

    PubMed

    Reynolds, R; Bronstein, A

    2007-01-01

    Gait adaptation is crucial for coping with varying terrain and biological needs. It is also important that any acquired adaptation is expressed only in the appropriate context. Here we review a recent series of experiments which demonstrate inappropriate expression of gait adaptation. We showed that a brief period of walking onto a platform previously experienced as moving results in a large forward sway despite full awareness of the changing context. The adaptation mechanisms involved in this paradigm are extremely fast: just 1-2 discrete exposures to the moving platform result in a motor after-effect. This after-effect still occurs even if subjects deliberately attempt to suppress it. However, it disappears when the location or method of gait is altered, indicating that after-effect expression is context dependent. Conversely, making gait self-initiated increased sway during the after-effect. This after-effect demonstrates a profound dissociation between knowledge and action. The absence of generalisation suggests a simple form of motor learning. However, persistent expression of gait after-effects may be dependent on an intact cerebral cortex. The fact that the after-effect is greater during self-initiated gait, and is context dependent, would be consistent with the involvement of supraspinal areas.

  7. A second chance: meanings of body weight, diet, and physical activity to women who have experienced cancer.

    PubMed

    Maley, Mary; Warren, Barbour S; Devine, Carol M

    2013-01-01

    To understand the meanings of diet, physical activity, and body weight in the context of women's cancer experiences. Grounded theory using 15 qualitative interviews and 3 focus groups. Grassroots community cancer organizations in the northeastern United States. Thirty-six white women cancer survivors; 86% had experienced breast cancer. Participants' views of the meanings of body weight, diet, and physical activity in the context of the cancer. Procedures adapted from the constant comparative method of qualitative analysis using iterative open coding. Themes emerged along 3 intersecting dimensions: vulnerability and control, stress and living well, and uncertainty and confidence. Diet and body weight were seen as sources of increased vulnerability and distress. Uncertainty about diet heightened distress and lack of control. Physical activity was seen as a way to regain control and reduce distress. Emergent themes of vulnerability-control, stress-living well, and uncertainty-confidence may aid in understanding and promoting health behaviors in the growing population of cancer survivors. Messages that resonated with participants included taking ownership over one's body, physical activity as stress reduction, healthy eating for overall health and quality of life, and a second chance to get it right. Copyright © 2013 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  8. The Cortical Organization of Speech Processing: Feedback Control and Predictive Coding in the Context of a Dual-Stream Model

    ERIC Educational Resources Information Center

    Hickok, Gregory

    2012-01-01

    Speech recognition is an active process that involves some form of predictive coding. This statement is relatively uncontroversial. What is less clear is the source of the prediction. The dual-stream model of speech processing suggests that there are two possible sources of predictive coding in speech perception: the motor speech system and the…

  9. A Pragmatic Approach to the Application of the Code of Ethics in Nursing Education.

    PubMed

    Tinnon, Elizabeth; Masters, Kathleen; Butts, Janie

    The code of ethics for nurses was written for nurses in all settings. However, the language focuses primarily on the nurse in context of the patient relationship, which may make it difficult for nurse educators to internalize the code to inform practice. The purpose of this article is to explore the code of ethics, establish that it can be used to guide nurse educators' practice, and provide a pragmatic approach to application of the provisions.

  10. Coding for Efficient Image Transmission

    NASA Technical Reports Server (NTRS)

    Rice, R. F.; Lee, J. J.

    1986-01-01

    NASA publication second in series on data-coding techniques for noiseless channels. Techniques used even in noisy channels, provided data further processed with Reed-Solomon or other error-correcting code. Techniques discussed in context of transmission of monochrome imagery from Voyager II spacecraft but applicable to other streams of data. Objective of this type coding to "compress" data; that is, to transmit using as few bits as possible by omitting as much as possible of portion of information repeated in subsequent samples (or picture elements).

  11. Separation of Evans and Hiro currents in VDE of tokamak plasma

    NASA Astrophysics Data System (ADS)

    Galkin, Sergei A.; Svidzinski, V. A.; Zakharov, L. E.

    2014-10-01

    Progress on the Disruption Simulation Code (DSC-3D) development and benchmarking will be presented. DSC-3D is a one-fluid, nonlinear, time-dependent MHD code, which utilizes fully 3D toroidal geometry for the first wall, pure vacuum and plasma itself, with adaptation to the moving plasma boundary and accurate resolution of the plasma surface current. Suppression of the fast magnetosonic time scale by neglecting plasma inertia will be demonstrated. Due to the code's adaptive nature, self-consistent plasma surface current modeling during non-linear dynamics of the Vertical Displacement Event (VDE) is accurately provided. Separation of the plasma surface current into Evans and Hiro currents during simulation of a fully developed VDE, when the plasma touches in-vessel tiles, will be discussed. Work is supported by the US DOE SBIR Grant # DE-SC0004487.

  12. Underwater Acoustic Propagation and Communications: A Coupled Research Program

    DTIC Science & Technology

    2015-06-15

    coding technique suitable for both SIMO and MIMO systems. 4. an adaptive OFDM modulation technique, whereby the transmitter acts in response to...timate based adaptation for SIMO and MIMO systems in an interactive turbo-equalization framework were developed and analyzed. MIMO and SISO

  13. Generalized type II hybrid ARQ scheme using punctured convolutional coding

    NASA Astrophysics Data System (ADS)

    Kallel, Samir; Haccoun, David

    1990-11-01

    A method is presented to construct rate-compatible convolutional (RCC) codes from known high-rate punctured convolutional codes, obtained from best-rate 1/2 codes. The construction method is rather simple and straightforward, and still yields good codes. Moreover, low-rate codes can be obtained without any limit on the lowest achievable code rate. Based on the RCC codes, a generalized type-II hybrid ARQ scheme, which combines the benefits of the modified type-II hybrid ARQ strategy of Hagenauer (1988) with the code-combining ARQ strategy of Chase (1985), is proposed and analyzed. With the proposed generalized type-II hybrid ARQ strategy, the throughput increases as the starting coding rate increases, and as the channel degrades, it tends to merge with the throughput of rate 1/2 type-II hybrid ARQ schemes with code combining, thus allowing the system to be flexible and adaptive to channel conditions, even under wide noise variations and severe degradations.
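The puncturing operation underlying rate-compatible convolutional codes is simple to sketch: coded bits from the rate-1/2 mother code are deleted according to a cyclic pattern, raising the code rate. The pattern and bit values below are illustrative, not taken from the paper.

```python
def puncture(coded_bits, pattern):
    """Delete coded bits where the puncturing pattern has a 0.
    `coded_bits` come from a rate-1/2 mother encoder (2 bits per info bit);
    the pattern is applied cyclically across the stream."""
    p = len(pattern)
    return [b for i, b in enumerate(coded_bits) if pattern[i % p]]

# mother code output: 4 information bits -> 8 coded bits at rate 1/2
mother = [1, 0, 1, 1, 0, 0, 1, 0]
# pattern [1,1,1,0] keeps 3 of every 4 coded bits -> rate 4/6 = 2/3
punctured = puncture(mother, [1, 1, 1, 0])
assert len(punctured) == 6
```

Rate compatibility means every bit kept by a higher-rate pattern is also kept by all lower-rate patterns, so in an ARQ round the transmitter need only send the incremental bits that the next-lower-rate pattern adds.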

  14. Progress of IRSN R&D on ITER Safety Assessment

    NASA Astrophysics Data System (ADS)

    Van Dorsselaere, J. P.; Perrault, D.; Barrachin, M.; Bentaib, A.; Gensdarmes, F.; Haeck, W.; Pouvreau, S.; Salat, E.; Seropian, C.; Vendel, J.

    2012-08-01

    The French "Institut de Radioprotection et de Sûreté Nucléaire" (IRSN), in support to the French "Autorité de Sûreté Nucléaire", is analysing the safety of ITER fusion installation on the basis of the ITER operator's safety file. IRSN set up a multi-year R&D program in 2007 to support this safety assessment process. Priority has been given to four technical issues and the main outcomes of the work done in 2010 and 2011 are summarized in this paper: for simulation of accident scenarios in the vacuum vessel, adaptation of the ASTEC system code; for risk of explosion of gas-dust mixtures in the vacuum vessel, adaptation of the TONUS-CFD code for gas distribution, development of DUST code for dust transport, and preparation of IRSN experiments on gas inerting, dust mobilization, and hydrogen-dust mixtures explosion; for evaluation of the efficiency of the detritiation systems, thermo-chemical calculations of tritium speciation during transport in the gas phase and preparation of future experiments to evaluate the most influent factors on detritiation; for material neutron activation, adaptation of the VESTA Monte Carlo depletion code. The first results of these tasks have been used in 2011 for the analysis of the ITER safety file. In the near future, this R&D global programme may be reoriented to account for the feedback of the latter analysis or for new knowledge.

  15. Advanced techniques and technology for efficient data storage, access, and transfer

    NASA Technical Reports Server (NTRS)

    Rice, Robert F.; Miller, Warner

    1991-01-01

    Advanced techniques for efficiently representing most forms of data are being implemented in practical hardware and software form through the joint efforts of three NASA centers. These techniques adapt to local statistical variations to continually provide near optimum code efficiency when representing data without error. Demonstrated in several earlier space applications, these techniques are the basis of initial NASA data compression standards specifications. Since the techniques clearly apply to most NASA science data, NASA invested in the development of both hardware and software implementations for general use. This investment includes high-speed single-chip very large scale integration (VLSI) coding and decoding modules as well as machine-transferrable software routines. The hardware chips were tested in the laboratory at data rates as high as 700 Mbits/s. A coding module's definition includes a predictive preprocessing stage and a powerful adaptive coding stage. The function of the preprocessor is to optimally process incoming data into a standard form data source that the second stage can handle. The built-in preprocessor of the VLSI coder chips is ideal for high-speed sampled data applications such as imaging and high-quality audio, but additionally, the second stage adaptive coder can be used separately with any source that can be externally preprocessed into the 'standard form'. This generic functionality assures that the applicability of these techniques and their recent high-speed implementations should be equally broad outside of NASA.

  16. Preliminary SAGE Simulations of Volcanic Jets Into a Stratified Atmosphere

    NASA Astrophysics Data System (ADS)

    Peterson, A. H.; Wohletz, K. H.; Ogden, D. E.; Gisler, G. R.; Glatzmaier, G. A.

    2007-12-01

    The SAGE (SAIC Adaptive Grid Eulerian) code employs adaptive mesh refinement in solving Eulerian equations of complex fluid flow desirable for simulation of volcanic eruptions. The goal of modeling volcanic eruptions is to better develop a code's predictive capabilities in order to understand the dynamics that govern the overall behavior of real eruption columns. To achieve this goal, we focus on the dynamics of underexpanded jets, one of the fundamental physical processes important to explosive eruptions. Previous simulations of laboratory jets modeled in cylindrical coordinates were benchmarked with simulations in CFDLib (Los Alamos National Laboratory), which solves the full Navier-Stokes equations (includes viscous stress tensor), and showed close agreement, indicating that adaptive mesh refinement used in SAGE may offset the need for explicit calculation of viscous dissipation. We compare gas density contours of these previous simulations with the same initial conditions in cylindrical and Cartesian geometries to laboratory experiments to determine both the validity of the model and the robustness of the code. The SAGE results in both geometries are within several percent of the experiments for position and density of the incident (intercepting) and reflected shocks, slip lines, shear layers, and Mach disk. To expand our study into a volcanic regime, we simulate large-scale jets in a stratified atmosphere to establish the code's ability to model a sustained jet into a stable atmosphere.
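    The core idea of adaptive mesh refinement is to concentrate grid resolution where the solution changes rapidly, such as across shocks. The following hypothetical 1-D Python sketch (not the SAGE code, and with a made-up gradient threshold) shows the basic flag-and-refine step:

```python
import numpy as np

def flag_cells(u, dx, tol):
    """Mark grid points whose local gradient magnitude exceeds tol
    (a simplified refinement criterion)."""
    return np.abs(np.gradient(u, dx)) > tol

def refine(x, u, flags):
    """Insert midpoints next to flagged points; fill values by
    linear interpolation (one refinement level only)."""
    new_x, new_u = [x[0]], [u[0]]
    for i in range(1, len(x)):
        if flags[i - 1] or flags[i]:
            new_x.append(0.5 * (x[i - 1] + x[i]))
            new_u.append(0.5 * (u[i - 1] + u[i]))
        new_x.append(x[i])
        new_u.append(u[i])
    return np.array(new_x), np.array(new_u)

x = np.linspace(0.0, 1.0, 11)
u = np.tanh((x - 0.5) / 0.05)          # steep front near x = 0.5
flags = flag_cells(u, x[1] - x[0], tol=3.0)
x2, u2 = refine(x, u, flags)
print(len(x), len(x2))                  # points added only near the front
```

    In a production AMR code this criterion is applied recursively and in multiple dimensions, but the payoff is the same: fine resolution at shocks and shear layers without refining the whole domain.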

  17. Evidence of translation efficiency adaptation of the coding regions of the bacteriophage lambda.

    PubMed

    Goz, Eli; Mioduser, Oriah; Diament, Alon; Tuller, Tamir

    2017-08-01

    Deciphering the way gene expression regulatory aspects are encoded in viral genomes is a challenging mission with ramifications related to all biomedical disciplines. Here, we aimed to understand how evolution shapes the bacteriophage lambda genes by performing a high resolution analysis of ribosomal profiling data and gene expression related synonymous/silent information encoded in bacteriophage coding regions. We demonstrated evidence of selection for distinct compositions of synonymous codons in early and late viral genes related to the adaptation of translation efficiency to different bacteriophage developmental stages. Specifically, we showed that evolution of viral coding regions is driven, among others, by selection for codons with higher decoding rates; during the initial/progressive stages of infection the decoding rates in early/late genes were found to be superior to those in late/early genes, respectively. Moreover, we argued that selection for translation efficiency could be partially explained by adaptation to the Escherichia coli tRNA pool and the fact that it can change during the bacteriophage life cycle. An analysis of additional aspects related to the expression of viral genes, such as mRNA folding and more complex/longer regulatory signals in the coding regions, is also reported. The reported conclusions are likely to be relevant also to additional viruses. © The Author 2017. Published by Oxford University Press on behalf of Kazusa DNA Research Institute.
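    Bias toward particular synonymous codons, of the kind compared here between early and late genes, is commonly quantified with measures such as relative synonymous codon usage (RSCU). The sketch below (a generic illustration, not the authors' pipeline; the sequence and the Phe family TTT/TTC are arbitrary examples) shows the computation for one synonymous family:

```python
from collections import Counter

def codon_counts(seq):
    """Count in-frame codons of a coding sequence."""
    usable = len(seq) - len(seq) % 3
    return Counter(seq[i:i + 3] for i in range(0, usable, 3))

def rscu(counts, synonymous):
    """Relative synonymous codon usage: observed count of each codon
    divided by the mean count over its synonymous family (1.0 = no bias)."""
    total = sum(counts[c] for c in synonymous)
    mean = total / len(synonymous)
    return {c: (counts[c] / mean if mean else 0.0) for c in synonymous}

# Hypothetical gene fragment; phenylalanine is encoded by TTT or TTC.
early_gene = "TTTTTCTTTTTTGGTTTC"
usage = rscu(codon_counts(early_gene), ["TTT", "TTC"])
print(usage)
```

    Comparing such per-family profiles between early and late genes, and weighting codons by tRNA availability at each infection stage, is the general route by which decoding-rate selection of the kind reported here can be detected.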

  18. A tale of two cities: replication of a study on the acculturation and adaptation of immigrant adolescents from the former Soviet Union in a different community context.

    PubMed

    Birman, Dina; Trickett, Edison; Buchanan, Rebecca M

    2005-03-01

    While a great deal of research has been conducted to understand acculturation and its relationship to adaptation in the new country, surprisingly little attention has been paid to the ways in which the characteristics of the local community impact these processes. The present study addresses this gap in the literature by exploring the potential role of community differences in the acculturation and adaptation processes of 269 refugee and immigrant adolescents from the former Soviet Union who resettled in two different community contexts. Specifically, a prior study on acculturation and adjustment among high school students (D. Birman, E. J. Trickett, & A. Vinokurov, 2002) was replicated with the same émigré population in a contrasting community within the same state. The contrast between these communities allowed us to test hypotheses emerging from an ecological perspective concerning (1) patterns of acculturation, (2) levels of discrimination and its effect on acculturative outcomes, and (3) community differences in the relationship between acculturation and outcomes. In addition to the focus on community differences, the study also employs a multidimensional measure of acculturation and assesses acculturation to both American and Russian culture. Furthermore, adaptation is assessed across different life domains; including peer relationships, family relationships, school adaptation, and psychological adaptation. Findings support the general ecological perspective, suggesting the importance of studying acculturation and adaptation as a reflexive process in which culture and context are very much intertwined.

  19. Performance Benefits Associated with Context-Dependent Arm Pointing Adaptation

    NASA Technical Reports Server (NTRS)

    Seidler, R. D.; Bloomberg, J. J.; Stelmach, George E.

    2000-01-01

    Our previous work has demonstrated that head orientation can be used as a contextual cue to switch between multiple adaptive states. Subjects were assigned to one of three groups: the head orientation group tilted the head towards the right shoulder when drawing under a 0.5 gain of display and towards the left shoulder when drawing under a 1.5 gain of display; the target orientation group had the home and target positions rotated counterclockwise when drawing under the 0.5 gain and clockwise for the 1.5 gain; the arm posture group changed the elbow angle of the arm they were not drawing with from full flexion to full extension with 0.5 and 1.5 gain display changes. The head orientation cue was effectively associated with the multiple gains, in comparison to the control conditions. The purpose of the current investigation was to determine whether this context-dependent adaptation results in any savings in terms of performance measures such as movement duration and movement smoothness when subjects switch between multiple adaptive states. Subjects in the head adaptation group demonstrated reduced movement duration and increased movement smoothness (measured via normalized jerk scores) in comparison to the two control groups when switching between the 0.5 and 1.5 gain of display. This work has demonstrated not only that subjects can acquire context-dependent adaptation, but also that it results in a significant savings of performance upon transfer between adaptive states.

  20. Easy Web Interfaces to IDL Code for NSTX Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    W.M. Davis

    Reusing code is a well-known Software Engineering practice to substantially increase the efficiency of code production, as well as to reduce errors and debugging time. A variety of "Web Tools" for the analysis and display of raw and analyzed physics data are in use on NSTX [1], and new ones can be produced quickly from existing IDL [2] code. A Web Tool with only a few inputs, and which calls an IDL routine written in the proper style, can be created in less than an hour; more typical Web Tools with dozens of inputs, and the need for some adaptation of existing IDL code, can be working in a day or so. Efficiency is also increased for users of Web Tools because of the familiar interface of the web browser, and not needing X-windows, accounts, passwords, etc. Web Tools were adapted for use by PPPL physicists accessing EAST data stored in MDSplus with only a few man-weeks of effort; adapting to additional sites should now be even easier. An overview of Web Tools in use on NSTX, and a list of the most useful features, is also presented.

  1. Coding Location: The View from Toddler Studies

    ERIC Educational Resources Information Center

    Huttenlocher, Janellen

    2008-01-01

    The ability to locate objects in the environment is adaptively important for mobile organisms. Research on location coding reveals that even toddlers have considerable spatial skill. Important information has been obtained using a disorientation task in which children watch a target object being hidden and are then blindfolded and rotated so they…

  2. Empirical evaluation of H.265/HEVC-based dynamic adaptive video streaming over HTTP (HEVC-DASH)

    NASA Astrophysics Data System (ADS)

    Irondi, Iheanyi; Wang, Qi; Grecos, Christos

    2014-05-01

    Real-time HTTP streaming has gained global popularity for delivering video content over the Internet. In particular, the recent MPEG-DASH (Dynamic Adaptive Streaming over HTTP) standard enables on-demand, live, and adaptive Internet streaming in response to network bandwidth fluctuations. Meanwhile, the emerging new-generation video coding standard, H.265/HEVC (High Efficiency Video Coding), promises to reduce the bandwidth requirement by 50% at the same video quality when compared with the current H.264/AVC standard. However, little existing work has addressed the integration of the DASH and HEVC standards, let alone empirical performance evaluation of such systems. This paper presents an experimental HEVC-DASH system, which is a pull-based adaptive streaming solution that delivers HEVC-coded video content through conventional HTTP servers, where the client switches to its desired quality, resolution or bitrate based on the available network bandwidth. Previous studies in DASH have focused on H.264/AVC, whereas we present an empirical evaluation of the HEVC-DASH system by implementing a real-world test bed, which consists of an Apache HTTP Server with GPAC, an MP4Client (GPAC) with an OpenHEVC-based DASH client, and a NETEM box in the middle emulating different network conditions. We investigate and analyze the performance of HEVC-DASH by exploring the impact of various network conditions such as packet loss, bandwidth and delay on video quality. Furthermore, we compare the Intra and Random Access profiles of HEVC coding with the Intra profile of H.264/AVC when the correspondingly encoded video is streamed with DASH. Finally, we explore the correlation among the quality metrics and network conditions, and empirically establish under which conditions the different codecs can provide satisfactory performance.
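    At its core, the client-side switching described above is a rate-selection rule evaluated before each segment request. The sketch below is a generic throughput-based rule with a hypothetical bitrate ladder and safety margin; it is not GPAC's actual adaptation algorithm:

```python
def select_representation(bitrates_kbps, throughput_kbps, safety=0.8):
    """Pick the highest representation whose bitrate fits within a
    safety margin of the measured throughput; fall back to the lowest
    representation when even that does not fit."""
    feasible = [b for b in sorted(bitrates_kbps)
                if b <= safety * throughput_kbps]
    return feasible[-1] if feasible else min(bitrates_kbps)

ladder = [350, 700, 1500, 3000, 6000]   # hypothetical HEVC bitrate ladder (kbps)
for bandwidth in (400, 2100, 9000):
    print(bandwidth, "->", select_representation(ladder, bandwidth))
```

    The safety margin trades off bitrate against rebuffering risk under the bandwidth fluctuations, delay, and packet loss that the NETEM experiments vary; HEVC's ~50% bitrate saving effectively shifts the whole ladder downward for the same visual quality.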

  3. Blind information-theoretic multiuser detection algorithms for DS-CDMA and WCDMA downlink systems.

    PubMed

    Waheed, Khuram; Salem, Fathi M

    2005-07-01

    Code division multiple access (CDMA) is based on the spread-spectrum technology and is a dominant air interface for 2.5G, 3G, and future wireless networks. For the CDMA downlink, the transmitted CDMA signals from the base station (BS) propagate through a noisy multipath fading communication channel before arriving at the receiver of the user equipment/mobile station (UE/MS). Classical CDMA single-user detection (SUD) algorithms implemented in the UE/MS receiver do not provide the required performance for modern high data-rate applications. In contrast, multi-user detection (MUD) approaches require substantial a priori information that is not available to the UE/MS. In this paper, three promising adaptive Riemannian contra-variant (or natural) gradient based user detection approaches, capable of handling highly dynamic wireless environments, are proposed. The first approach, blind multiuser detection (BMUD), is the process of simultaneously estimating multiple symbol sequences associated with all the users in the downlink of a CDMA communication system using only the received wireless data and without any knowledge of the user spreading codes. This approach is applicable to CDMA systems with relatively short spreading codes but becomes impractical for systems using long spreading codes. We also propose two other adaptive approaches, namely, RAKE-blind source recovery (RAKE-BSR) and RAKE-principal component analysis (RAKE-PCA), that fuse an adaptive stage into a standard RAKE receiver. This adaptation results in robust user detection algorithms with performance exceeding that of linear minimum mean squared error (LMMSE) detectors for both Direct Sequence CDMA (DS-CDMA) and wide-band CDMA (WCDMA) systems under conditions of congestion, imprecise channel estimation and unmodeled multiple access interference (MAI).
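    The single-user baseline that these adaptive detectors improve upon is simple correlation despreading. The NumPy sketch below (an idealized, noise-free illustration with Walsh-Hadamard codes, not the paper's algorithms) shows why it works in a synchronous downlink: orthogonal codes make the other user's contribution cancel exactly. Multipath destroys this orthogonality, which is what motivates the adaptive MUD/RAKE-BSR stages.

```python
import numpy as np

def hadamard(n):
    """Walsh-Hadamard matrix; its rows are orthogonal spreading codes."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

N = 8                                   # spreading factor (chips per symbol)
H = hadamard(N)
c1, c2 = H[1], H[2]                     # codes assigned to two users

def spread(symbols, code):
    """Repeat each symbol over a code period and multiply by the chips."""
    return np.repeat(symbols, len(code)) * np.tile(code, len(symbols))

def despread(chips, code):
    """Correlate each symbol-length chip block with the user's code."""
    return chips.reshape(-1, len(code)) @ code / len(code)

s1 = np.array([1, -1, 1, 1])            # desired user's BPSK symbols
s2 = np.array([-1, -1, 1, -1])          # interfering user on another code
rx = spread(s1, c1) + spread(s2, c2)    # synchronous downlink sum
print(despread(rx, c1))                 # interference cancels exactly
```

    Once the channel introduces multipath delays, the shifted codes are no longer orthogonal and the correlator output contains residual MAI, which the blind adaptive stages in the paper are designed to suppress without knowing the spreading codes.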

  4. How Principled Are Guidelines?

    ERIC Educational Resources Information Center

    Homan, Roger

    1998-01-01

    Explores and interprets factors related to the burgeoning of codes, criteria, and guidelines in recent years within three kinds of context: (1) higher degree and dissertation-award processes; (2) guidelines for teachers and university lecturers; and (3) codes for researchers in the social sciences. Discusses the intentions and unforeseen…

  5. Verification and Validation in a Rapid Software Development Process

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Easterbrook, Steve M.

    1997-01-01

    The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in the use of new methods in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing on the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.

  6. The Relations of Employability Skills to Career Adaptability among Technical School Students

    ERIC Educational Resources Information Center

    de Guzman, Allan B.; Choi, Kyoung Ok

    2013-01-01

    This two-pronged study reports the initial validation of the psychometric properties and factor structure of the Career Adapt-Abilities Scale (CAAS) in the context of Papua New Guinea (PNG) and the investigation of the relationship between employability skills and career adaptability. Results of the study revealed that CAAS can be a valid and…

  7. OER Quality and Adaptation in K-12: Comparing Teacher Evaluations of Copyright-Restricted, Open, and Open/Adapted Textbooks

    ERIC Educational Resources Information Center

    Kimmons, Royce

    2015-01-01

    Conducted in conjunction with an institute on open textbook adaptation, this study compares textbook evaluations from practicing K-12 classroom teachers (n = 30) on three different types of textbooks utilized in their contexts: copyright-restricted, open, and open/adapted. Copyright-restricted textbooks consisted of those textbooks already in use…

  8. Identifying Differential Item Functioning in Multi-Stage Computer Adaptive Testing

    ERIC Educational Resources Information Center

    Gierl, Mark J.; Lai, Hollis; Li, Johnson

    2013-01-01

    The purpose of this study is to evaluate the performance of CATSIB (Computer Adaptive Testing-Simultaneous Item Bias Test) for detecting differential item functioning (DIF) when items in the matching and studied subtest are administered adaptively in the context of a realistic multi-stage adaptive test (MST). MST was simulated using a 4-item…

  9. Mechanisms of β-cell functional adaptation to changes in workload

    PubMed Central

    Wortham, Matthew; Sander, Maike

    2016-01-01

    Insulin secretion must be tightly coupled to nutritional state to maintain blood glucose homeostasis. To this end, pancreatic β-cells sense and respond to changes in metabolic conditions, thereby anticipating insulin demands for a given physiological context. This is achieved in part through adjustments of nutrient metabolism, which is controlled at several levels including allosteric regulation, posttranslational modifications, and altered expression of metabolic enzymes. In this review, we discuss mechanisms of β-cell metabolic and functional adaptation in the context of two physiological states that alter glucose-stimulated insulin secretion: fasting and insulin resistance. We review current knowledge of metabolic changes that occur in the β-cell during adaptation and specifically discuss transcriptional mechanisms that underlie β-cell adaptation. A more comprehensive understanding of how β-cells adapt to changes in nutrient state could identify mechanisms to be co-opted for therapeutically modulating insulin secretion in metabolic disease. PMID:27615135

  10. Behavioral and Neural Adaptation in Approach Behavior.

    PubMed

    Wang, Shuo; Falvello, Virginia; Porter, Jenny; Said, Christopher P; Todorov, Alexander

    2018-06-01

    People often make approachability decisions based on perceived facial trustworthiness. However, it remains unclear how people learn trustworthiness from a population of faces and whether this learning influences their approachability decisions. Here we investigated the neural underpinning of approach behavior and tested two important hypotheses: whether the amygdala adapts to different trustworthiness ranges and whether the amygdala is modulated by task instructions and evaluative goals. We showed that participants adapted to the stimulus range of perceived trustworthiness when making approach decisions and that these decisions were further modulated by the social context. The right amygdala showed both linear response and quadratic response to trustworthiness level, as observed in prior studies. Notably, the amygdala's response to trustworthiness was not modulated by stimulus range or social context, a possible neural dynamic adaptation. Together, our data have revealed a robust behavioral adaptation to different trustworthiness ranges as well as a neural substrate underlying approach behavior based on perceived facial trustworthiness.

  11. Application of Non-Kolmogorovian Probability and Quantum Adaptive Dynamics to Unconscious Inference in Visual Perception Process

    NASA Astrophysics Data System (ADS)

    Accardi, Luigi; Khrennikov, Andrei; Ohya, Masanori; Tanaka, Yoshiharu; Yamato, Ichiro

    2016-07-01

    Recently a novel quantum information formalism — quantum adaptive dynamics — was developed and applied to modelling of information processing by bio-systems including cognitive phenomena: from molecular biology (glucose-lactose metabolism for E. coli bacteria, epigenetic evolution) to cognition and psychology. From the foundational point of view quantum adaptive dynamics describes mutual adapting of the information states of two interacting systems (physical or biological) as well as adapting of co-observations performed by the systems. In this paper we apply this formalism to model unconscious inference: the process of transition from sensation to perception. The paper combines theory and experiment. Statistical data collected in an experimental study on recognition of a particular ambiguous figure, the Schröder stairs, support the viability of the quantum(-like) model of unconscious inference, including modelling of biases generated by rotation-contexts. From the probabilistic point of view, we study (for concrete experimental data) the problem of contextuality of probability, its dependence on experimental contexts. Mathematically, contextuality leads to non-Kolmogorovness: probability distributions generated by various rotation contexts cannot be treated in the Kolmogorovian framework. At the same time they can be embedded in a “big Kolmogorov space” as conditional probabilities. However, such a Kolmogorov space has too complex a structure, and the operational quantum formalism in the form of quantum adaptive dynamics simplifies the modelling considerably.
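    Non-Kolmogorovness of the kind described here shows up operationally as a violation of the classical formula of total probability, P(B) = Σᵢ P(Cᵢ)P(B|Cᵢ). The toy computation below uses made-up numbers (not the Schröder-stairs data) to show the structure of the check:

```python
# Classical total probability: P(B) = P(C1)P(B|C1) + P(C2)P(B|C2).
# Hypothetical, illustrative numbers for two rotation contexts:
p_context = {1: 0.5, 2: 0.5}            # the two contexts, equally likely
p_b_given_c = {1: 0.8, 2: 0.4}          # P(one percept | context fixed)
p_b_observed = 0.75                     # measured with context not fixed

classical = sum(p_context[i] * p_b_given_c[i] for i in p_context)
interference = p_b_observed - classical # nonzero: no single Kolmogorov space
print(classical, interference)
```

    A nonzero interference term means the three distributions cannot all be marginals and conditionals of one probability space; the quantum-like formalism absorbs the term into an interference (context) correction instead of forcing the data into a single "big Kolmogorov space".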

  12. A Simulation Testbed for Adaptive Modulation and Coding in Airborne Telemetry (Brief)

    DTIC Science & Technology

    2014-10-01

    Briefing slides describing a MATLAB GUI simulation testbed for adaptive modulation and coding in airborne telemetry. The slides cover SOQPSK modulation; an OFDM mode (802.11a-like) with configurable cyclic prefix length and number of subcarriers; modulations BPSK, QPSK, 16-QAM, and 64-QAM; and LDPC coding at rates 1/2, 2/3, 3/4, and 4/5. October 2014. DISTRIBUTION STATEMENT A. Approved for public release: distribution unlimited.

  13. Bit-wise arithmetic coding for data compression

    NASA Technical Reports Server (NTRS)

    Kiely, A. B.

    1994-01-01

    This article examines the problem of compressing a uniformly quantized independent and identically distributed (IID) source. We present a new compression technique, bit-wise arithmetic coding, that assigns fixed-length codewords to the quantizer output and uses arithmetic coding to compress the codewords, treating the codeword bits as independent. We examine the performance of this method and evaluate the overhead required when used block-adaptively. Simulation results are presented for Gaussian and Laplacian sources. This new technique could be used as the entropy coder in a transform or subband coding system.
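    The key idea above — assign fixed-length codewords, then compress the codeword bits as if each bit position were an independent binary source — has a simple information-theoretic bound: the best achievable size is the sum of per-position binary entropies. The sketch below estimates that bound for a small sample set; it is an illustration of the principle, not the article's coder, and the 3-bit samples are made up:

```python
import math

def ideal_bitwise_length(samples, width):
    """Ideal compressed size (in bits) when each of the `width` codeword
    bit positions is arithmetic-coded independently with its own
    (empirical) bit probability."""
    n = len(samples)
    total = 0.0
    for pos in range(width):
        ones = sum((s >> pos) & 1 for s in samples)
        p = ones / n
        if 0 < p < 1:                   # binary entropy of this position
            total += n * (-p * math.log2(p) - (1 - p) * math.log2(1 - p))
        # p == 0 or 1: the position is constant and costs ~0 bits
    return total

# Hypothetical 3-bit quantizer outputs clustered near zero (Laplacian-like)
samples = [0, 1, 0, 2, 1, 0, 0, 3, 1, 0, 0, 1]
raw_bits = len(samples) * 3
compressed_bits = ideal_bitwise_length(samples, 3)
print(raw_bits, round(compressed_bits, 1))
```

    For peaked (e.g. Laplacian) distributions the high-order bit positions are strongly biased toward zero, so treating bits independently already captures much of the redundancy; an adaptive arithmetic coder approaches this bound by learning each position's probability on the fly.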

  14. KEWPIE: A dynamical cascade code for decaying excited compound nuclei

    NASA Astrophysics Data System (ADS)

    Bouriquet, Bertrand; Abe, Yasuhisa; Boilley, David

    2004-05-01

    A new dynamical cascade code for decaying hot nuclei is proposed and specially adapted to the synthesis of super-heavy nuclei. For such a case, the interesting channel is the tiny fraction that decays through particle emission; thus the code avoids classical Monte-Carlo methods and proposes a new numerical scheme. The time dependence is explicitly taken into account in order to cope with the fact that the fission decay rate might not be constant. The code allows evaluation of both statistical and dynamical observables. Results are successfully compared to experimental data.
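    Avoiding Monte Carlo means integrating the time-dependent populations directly, which keeps full precision even when the surviving channel is a tiny fraction of all decays. The sketch below is a generic two-channel illustration of that idea (neutron emission at a constant rate competing with a fission rate that rises toward its stationary value); all rates and times are hypothetical, and this is not the KEWPIE scheme itself:

```python
import math

def survival_probability(t_end, dt=1e-22):
    """Explicit-Euler integration of a hot nucleus decaying either by
    neutron emission (constant rate) or fission (transient rate), returning
    the fraction that decays via neutron emission by time t_end."""
    lam_n = 1e20                        # neutron-emission rate (1/s), hypothetical
    lam_f0 = 5e20                       # stationary fission rate (1/s)
    tau = 5e-21                         # transient time of the fission rate (s)
    p, survived = 1.0, 0.0              # excited population / decayed by neutron
    t = 0.0
    while t < t_end:
        lam_f = lam_f0 * (1.0 - math.exp(-t / tau))  # non-constant decay rate
        survived += lam_n * p * dt      # flux into the surviving channel
        p -= (lam_n + lam_f) * p * dt   # total depletion of the excited state
        t += dt
    return survived

print(survival_probability(5e-20))
```

    Because fission is suppressed during the transient, the integrated survival fraction exceeds the naive stationary branching ratio λn/(λn+λf0) — the kind of dynamical effect a constant-rate statistical treatment would miss.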

  15. The Refinement-Tree Partition for Parallel Solution of Partial Differential Equations

    PubMed Central

    Mitchell, William F.

    1998-01-01

    Dynamic load balancing is considered in the context of adaptive multilevel methods for partial differential equations on distributed memory multiprocessors. An approach that periodically repartitions the grid is taken. The important properties of a partitioning algorithm are presented and discussed in this context. A partitioning algorithm based on the refinement tree of the adaptive grid is presented and analyzed in terms of these properties. Theoretical and numerical results are given. PMID:28009355
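    A refinement-tree partition exploits the tree that the adaptive refinement history already provides: leaves (the current grid elements) are visited in tree order and cut into contiguous pieces of roughly equal weight. The following Python sketch is a simplified illustration of that idea, not Mitchell's exact algorithm; the tree layout, names, and unit weights are hypothetical:

```python
def partition_leaves(tree, nparts):
    """Split the leaves of a refinement tree into nparts contiguous
    (in tree-traversal order) pieces of roughly equal total weight."""
    leaves = []

    def collect(node):                  # depth-first order induced by the
        if "children" in node:          # refinement history
            for child in node["children"]:
                collect(child)
        else:
            leaves.append(node)

    collect(tree)
    total = sum(leaf["weight"] for leaf in leaves)
    target = total / nparts
    parts, current, acc = [], [], 0.0
    for leaf in leaves:
        current.append(leaf["name"])
        acc += leaf["weight"]
        if acc >= target * (len(parts) + 1) and len(parts) < nparts - 1:
            parts.append(current)       # close this piece at the threshold
            current = []
    parts.append(current)
    return parts

# Hypothetical adaptive grid: a root refined into four elements, one of
# which was refined again (more leaves where the grid is finer).
tree = {"children": [
    {"name": "e1", "weight": 1},
    {"name": "e2", "weight": 1},
    {"children": [{"name": "e3a", "weight": 1}, {"name": "e3b", "weight": 1},
                  {"name": "e3c", "weight": 1}, {"name": "e3d", "weight": 1}]},
    {"name": "e4", "weight": 1},
]}
print(partition_leaves(tree, 2))
```

    Because the traversal follows the tree, each piece stays spatially compact, and after further refinement only the affected subtrees need their weights updated before repartitioning — the properties that make this attractive for periodic dynamic load balancing.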

  16. The Refinement-Tree Partition for Parallel Solution of Partial Differential Equations.

    PubMed

    Mitchell, William F

    1998-01-01

    Dynamic load balancing is considered in the context of adaptive multilevel methods for partial differential equations on distributed memory multiprocessors. An approach that periodically repartitions the grid is taken. The important properties of a partitioning algorithm are presented and discussed in this context. A partitioning algorithm based on the refinement tree of the adaptive grid is presented and analyzed in terms of these properties. Theoretical and numerical results are given.

  17. Talker-specificity and adaptation in quantifier interpretation

    PubMed Central

    Yildirim, Ilker; Degen, Judith; Tanenhaus, Michael K.; Jaeger, T. Florian

    2015-01-01

    Linguistic meaning has long been recognized to be highly context-dependent. Quantifiers like many and some provide a particularly clear example of context-dependence. For example, the interpretation of quantifiers requires listeners to determine the relevant domain and scale. We focus on another type of context-dependence that quantifiers share with other lexical items: talker variability. Different talkers might use quantifiers with different interpretations in mind. We used a web-based crowdsourcing paradigm to study participants’ expectations about the use of many and some based on recent exposure. We first established that the mapping of some and many onto quantities (candies in a bowl) is variable both within and between participants. We then examined whether and how listeners’ expectations about quantifier use adapts with exposure to talkers who use quantifiers in different ways. The results demonstrate that listeners can adapt to talker-specific biases in both how often and with what intended meaning many and some are used. PMID:26858511

  18. MobiDiC: Context Adaptive Digital Signage with Coupons

    NASA Astrophysics Data System (ADS)

    Müller, Jörg; Krüger, Antonio

    In this paper we present a field study of a digital signage system that measures audience response with coupons in order to enable context adaptivity. In the concept for context adaptivity, the signs sense their environment, decide which content to show, and then sense the audience reaction to the content shown. From this audience measurement, the strategies for which content to show in which situation are refined. As one instantiation of audience measurement, we propose a novel, simple couponing system in which customers can photograph the coupons at the signs. Thus, it can be measured whether customers really went to the shop. To investigate the feasibility of this approach, we implemented a prototype of 20 signs in the city center of Münster, Germany. During one year of deployment, we investigated usage of the system through interviews with shop owners and customers. Our experiences show that customer attention towards the signs is a major hurdle to overcome.

  19. Face Adaptation and Attractiveness Aftereffects in 8-Year-Olds and Adults

    ERIC Educational Resources Information Center

    Anzures, Gizelle; Mondloch, Catherine J.; Lackner, Christine

    2009-01-01

    A novel method was used to investigate developmental changes in face processing: attractiveness aftereffects. Consistent with the norm-based coding model, viewing consistently distorted faces shifts adults' attractiveness preferences toward the adapting stimuli. Thus, adults' attractiveness judgments are influenced by a continuously updated face…

  20. Three-Dimensional Numerical Analyses of Earth Penetration Dynamics

    DTIC Science & Technology

    1979-01-31

    Lagrangian formulation based on the HEMP method and has been adapted and validated for treatment of normal-incidence (axisymmetric) impact and...code, is a detailed analysis of the structural response of the EPW. This analysis is generated using a nonlinear dynamic, elastic-plastic finite element...based on the HEMP scheme. Thus, the code has the same material modeling capabilities and abilities to track large scale motion found in the WAVE-L code.

  1. Asymmetric Memory Circuit Would Resist Soft Errors

    NASA Technical Reports Server (NTRS)

    Buehler, Martin G.; Perlman, Marvin

    1990-01-01

    Some nonlinear error-correcting codes are more efficient in the presence of asymmetry. A combination of circuit-design and coding concepts is expected to make integrated-circuit random-access memories more resistant to "soft" errors (temporary bit errors, also called "single-event upsets", due to ionizing radiation). An integrated circuit of the new type is made deliberately more susceptible to one kind of bit error than to the other, and an associated error-correcting code is adapted to exploit this asymmetry in error probabilities.

  2. Edge Equilibrium Code (EEC) For Tokamaks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Xujling

    2014-02-24

    The edge equilibrium code (EEC) described in this paper is developed for simulations of the near edge plasma using the finite element method. It solves the Grad-Shafranov equation in toroidal coordinates and uses adaptive grids aligned with magnetic field lines. Hermite finite elements are chosen for the numerical scheme. A fast Newton scheme, the same as that implemented in the equilibrium and stability code (ESC), is applied here to adjust the grids.

  3. Towards a European code of medical ethics. Ethical and legal issues.

    PubMed

    Patuzzo, Sara; Pulice, Elisabetta

    2017-01-01

    The feasibility of a common European code of medical ethics is discussed, with consideration and evaluation of the difficulties such a project is going to face, from both the legal and ethical points of view. On the one hand, the analysis will underline the limits of a common European code of medical ethics as an instrument for harmonising national professional rules in the European context; on the other hand, we will highlight some of the potentials of this project, which could be increased and strengthened through a proper rulemaking process and through adequate and careful choice of content. We will also stress specific elements and devices that should be taken into consideration during the establishment of the code, from both procedural and content perspectives. Regarding methodological issues, the limits and potentialities of a common European code of medical ethics will be analysed from an ethical point of view and then from a legal perspective. The aim of this paper is to clarify the framework for the potential but controversial role of the code in the European context, showing the difficulties in enforcing and harmonising national ethical rules into a European code of medical ethics. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  4. Observation of children with attention-deficit hyperactivity (ADHD) problems in three natural classroom contexts.

    PubMed

    Lauth, G W; Heubeck, B G; Mackowiak, K

    2006-06-01

    Observation studies of students with attention-deficit hyperactivity disorder (ADHD) problems in natural classroom situations are costly and relatively rare. The study enquired how teacher ratings are anchored in actual student classroom behaviours, and how the behaviour of children with ADHD problems differs from their classmates. The authors attempted to broaden the usual focus on disruptive and inattentive behaviours to elucidate the role of various on-task behaviours, as well as considering differences between classroom contexts. DSM-III-R criteria were used in conjunction with a teacher rating scale to select a sample of 55 students with ADHD problems, and 55 matched controls from a population of 569 primary school students. Students were observed in their natural classrooms using the Munich Observation of Attention Inventory (MAI; Helmke, 1988). Correlations between teacher reports and observation codes were computed, and systematic differences between students with ADHD problems and controls in different classroom contexts were examined using a generalized linear mixed model (GLMM). Global teacher reports showed moderate to strong correlations with observed student behaviours. Expected on-task behaviour demonstrated the strongest relationship (r > -.70) with teacher reports. As hypothesized, the children with ADHD were more disruptive and inattentive than their matched peers. They were also less often inconspicuous on-task as expected by their teachers. However, their behaviour was assigned to two other on-task categories more often than their peers, and this raised their total on-task behaviour to over 66%. Situational differences were found for all codes as well, which mostly affected all students in a similar way, not just students with ADHD. ADHD related behaviours are pervasive across the classroom situations coded. Teachers appear to distinguish between desirable and undesirable on-task behaviours. Nevertheless, assisting students with ADHD problems requires shaping both. Future studies need to include more differentiated codes for various types of on-task behaviours and also need to code the lesson context concurrently.

  5. The diversity of gendered adaptation strategies to climate change of Indian farmers: A feminist intersectional approach.

    PubMed

    Ravera, Federica; Martín-López, Berta; Pascual, Unai; Drucker, Adam

    2016-12-01

    This paper examines climate change adaptation and gender issues through an application of a feminist intersectional approach. This approach permits the identification of diverse adaptation responses arising from the existence of multiple and fragmented dimensions of identity (including gender) that intersect with power relations to shape situation-specific interactions between farmers and ecosystems. Based on results from contrasting research cases in Bihar and Uttarakhand, India, this paper demonstrates, inter alia, that there are geographically determined gendered preferences and adoption strategies regarding adaptation options and that these are influenced by the socio-ecological context and institutional dynamics. Intersecting identities, such as caste, wealth, age and gender, influence decisions and reveal power dynamics and negotiation within the household and the community, as well as barriers to adaptation among groups. Overall, the findings suggest that a feminist intersectional approach does appear to be useful and worth further exploration in the context of climate change adaptation. In particular, future research could benefit from more emphasis on a nuanced analysis of the intra-gender differences that shape adaptive capacity to climate change.

  6. Phylogenetic context determines the role of competition in adaptive radiation

    PubMed Central

    Tan, Jiaqi; Slattery, Matthew R.; Yang, Xian; Jiang, Lin

    2016-01-01

    Understanding ecological mechanisms regulating the evolution of biodiversity is of much interest to ecologists and evolutionary biologists. Adaptive radiation constitutes an important evolutionary process that generates biodiversity. Competition has long been thought to influence adaptive radiation, but the directionality of its effect and associated mechanisms remain ambiguous. Here, we report a rigorous experimental test of the role of competition on adaptive radiation using the rapidly evolving bacterium Pseudomonas fluorescens SBW25 interacting with multiple bacterial species that differed in their phylogenetic distance to the diversifying bacterium. We showed that the inhibitive effect of competitors on the adaptive radiation of P. fluorescens decreased as their phylogenetic distance increased. To explain this phylogenetic dependency of adaptive radiation, we linked the phylogenetic distance between P. fluorescens and its competitors to their niche and competitive fitness differences. Competitive fitness differences, which showed weak phylogenetic signal, reduced P. fluorescens abundance and thus diversification, whereas phylogenetically conserved niche differences promoted diversification. These results demonstrate the context dependency of competitive effects on adaptive radiation, and highlight the importance of past evolutionary history for ongoing evolutionary processes. PMID:27335414

  7. Model-Driven Engineering: Automatic Code Generation and Beyond

    DTIC Science & Technology

    2015-03-01

    and Weblogic as well as cloud environments such as Microsoft Azure and Amazon Web Services®. Finally, while the generated code has dependencies on...code generation in the context of the full system lifecycle from development to sustainment. Acquisition programs in government or large commercial...Acquirers are concerned with the full system lifecycle, and they need confidence that the development methods will enable the system to meet the functional

  8. A Policy Based Approach for the Management of Web Browser Resources to Prevent Anonymity Attacks in Tor

    NASA Astrophysics Data System (ADS)

    Navarro-Arribas, Guillermo; Garcia-Alfaro, Joaquin

    Web browsers are becoming the universal interface to reach applications and services. Different browsing contexts may be required in order to reach them, e.g., use of VPN tunnels, corporate proxies, anonymisers, etc. By browsing context we mean how the user browses the Web, including mainly the concrete configuration of the browser. When the context of the browser changes, its security requirements also change. In this work, we present the use of authorisation policies to automatise the process of controlling the resources of a Web browser when its context changes. Our proposal is oriented towards easing the adaptation to the security requirements of the new context and enforcing them in the browser without the need for user intervention. We present a concrete application of our work as a plug-in for the adaptation of security requirements in the Mozilla/Firefox browser when a context of anonymous navigation through the Tor network is enabled.
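
    The context-to-policy idea described above can be sketched as a small lookup that overlays resource settings whenever the browsing context changes. This is a hypothetical illustration, not the plug-in's actual API; the context names and setting keys are invented for the example.

    ```python
    # Hypothetical sketch of context-driven policy enforcement: when the
    # browsing context changes, the matching policy overrides resource settings.
    POLICIES = {
        "tor": {"javascript": False, "cookies": False, "plugins": False},
        "corporate_proxy": {"javascript": True, "cookies": True, "plugins": False},
    }

    def apply_policy(context, settings):
        """Overlay the policy for the new context onto the current settings."""
        settings = dict(settings)                 # do not mutate the caller's copy
        settings.update(POLICIES.get(context, {}))
        return settings
    ```

    A context switch to anonymous navigation would then disable risky resources without user intervention, e.g. `apply_policy("tor", current_settings)`.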

  9. Concreteness Effects in Text Recall: Dual Coding or Context Availability?

    ERIC Educational Resources Information Center

    Sadoski, Mark; And Others

    1995-01-01

    Extends an earlier study by using different materials, ratings for familiarity, and more stringent experimental controls. Finds concreteness effects in two experiments using undergraduate students. Suggests that familiarity and concreteness contribute separately to recall. Supports a dual coding theory. Discusses implications for text design. (RS)

  10. Switching Codes in the Plurilingual Classroom

    ERIC Educational Resources Information Center

    Corcoll López, Cristina; González-Davies, Maria

    2016-01-01

    The English as a foreign language classroom is a plurilingual setting par excellence since it involves at least two languages. However, plurilingual practices such as code-switching and translation have been consistently discouraged in formal learning contexts, based on the belief that keeping languages compartmentalized helps learning, and…

  11. The Development of Adaptive Expertise in Biotransport

    ERIC Educational Resources Information Center

    Martin, Taylor; Petrosino, Anthony J.; Rivale, Stephanie; Diller, Kenneth R.

    2006-01-01

    This chapter describes a model for continuous development of adaptive expertise, including growth along the dimensions of innovation and knowledge, examined in the context of a biotransport course in biomedical engineering. Students improved on both knowledge and innovation, moving along a continuum toward adaptive expertise. (Contains 5 figures.)

  12. 43 CFR 46.145 - Using adaptive management.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... implementation decisions. The NEPA analysis conducted in the context of an adaptive management approach should identify the range of management options that may be taken in response to the results of monitoring and should analyze the effects of such options. The environmental effects of any adaptive management strategy...

  13. 43 CFR 46.145 - Using adaptive management.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... implementation decisions. The NEPA analysis conducted in the context of an adaptive management approach should identify the range of management options that may be taken in response to the results of monitoring and should analyze the effects of such options. The environmental effects of any adaptive management strategy...

  14. Design of a digital voice data compression technique for orbiter voice channels

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Candidate techniques were investigated for digital voice compression to a transmission rate of 8 kbps. Good voice quality, speaker recognition, and robustness in the presence of error bursts were considered. The technique of delayed-decision adaptive predictive coding is described and compared with conventional adaptive predictive coding. Results include a set of experimental simulations recorded on analog tape. The two FM broadcast segments produced show the delayed-decision technique to be virtually undegraded or minimally degraded at .001 and .01 Viterbi decoder bit error rates. Preliminary estimates of the hardware complexity of this technique indicate potential for implementation in space shuttle orbiters.
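
    The conventional adaptive predictive coding idea the report compares against can be sketched as follows: predict each sample from the previously decoded sample, quantize the residual, and adapt the quantizer step size. This is a toy first-order illustration, not the report's design; delayed-decision APC would additionally search a short tree of future quantizer choices before committing each bit, which this sketch omits.

    ```python
    def apc_encode(samples, pred_coeff=0.9, step=0.1):
        """Toy first-order adaptive predictive coder (illustrative only).

        Each sample is predicted from the previous *decoded* sample, the
        residual is quantized to one bit, and the step size adapts to the
        residual magnitude.
        """
        decoded_prev = 0.0
        codes, recon = [], []
        for x in samples:
            pred = pred_coeff * decoded_prev        # predict from decoded history
            resid = x - pred
            bit = 1 if resid >= 0 else 0            # 1-bit residual quantizer
            decoded = pred + (step if bit else -step)
            # expand the step after large residuals, contract after small ones
            step = min(1.0, step * 1.5) if abs(resid) > step else max(0.01, step * 0.8)
            codes.append(bit)
            recon.append(decoded)
            decoded_prev = decoded
        return codes, recon
    ```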

  15. PHISICS/RELAP5-3D Adaptive Time-Step Method Demonstrated for the HTTR LOFC#1 Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, Robin Ivey; Balestra, Paolo; Strydom, Gerhard

    A collaborative effort between the Japan Atomic Energy Agency (JAEA) and Idaho National Laboratory (INL) as part of the Civil Nuclear Energy Working Group is underway to model the high temperature engineering test reactor (HTTR) loss of forced cooling (LOFC) transient that was performed in December 2010. The coupled version of RELAP5-3D, a thermal fluids code, and PHISICS, a neutronics code, was used to model the transient. The focus of this report is to summarize the changes made to the PHISICS/RELAP5-3D code for implementing an adaptive time step methodology into the code for the first time, and to test it using the full HTTR PHISICS/RELAP5-3D model developed by JAEA and INL and the LOFC simulation. Various adaptive schemes are available based on flux or power convergence criteria that allow significantly larger time steps to be taken by the neutronics module. The report includes a description of the HTTR and the associated PHISICS/RELAP5-3D model test results as well as the University of Rome sub-contractor report documenting the adaptive time step theory and methodology implemented in PHISICS/RELAP5-3D. Two versions of the HTTR model were tested using 8 and 26 energy groups. It was found that most of the new adaptive methods lead to significant improvements in the LOFC simulation time required without significant accuracy penalties in the prediction of the fission power and the fuel temperature. In the best performing 8 group model scenarios, a LOFC simulation of 20 hours could be completed in real-time, or even less than real-time, compared with the previous version of the code that completed the same transient 3-8 times slower than real-time. A few of the user choice combinations between the methodologies available and the tolerance settings did however result in unacceptably high errors or insignificant gains in simulation time. The study is concluded with recommendations on which methods to use for this HTTR model. An important caveat is that these findings are very model-specific and cannot be generalized to other PHISICS/RELAP5-3D models.
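
    The kind of convergence-based step controller the report describes can be sketched as a simple rule: shrink the step when the monitored quantity (e.g. relative power change) exceeds the tolerance, and grow it when the change is comfortably below. The function name, growth/shrink factors, and bounds here are assumptions for illustration, not the PHISICS/RELAP5-3D implementation.

    ```python
    def adapt_dt(dt, rel_change, tol=1e-3, dt_min=1e-4, dt_max=10.0):
        """Adjust the next time step from the relative change in a monitored
        quantity (e.g. fission power) over the last step.

        Shrink when the tolerance is exceeded; grow when the change is well
        under tolerance; otherwise keep the step unchanged.
        """
        if rel_change > tol:
            dt = max(dt_min, dt * 0.5)      # too coarse: halve the step
        elif rel_change < 0.1 * tol:
            dt = min(dt_max, dt * 2.0)      # safely converged: accelerate
        return dt
    ```

    During a slow transient such as an LOFC, a controller like this lets the neutronics module take much larger steps than a fixed-step scheme while still tightening the step through fast excursions.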

  16. Reversible wavelet filter banks with side informationless spatially adaptive low-pass filters

    NASA Astrophysics Data System (ADS)

    Abhayaratne, Charith

    2011-07-01

    Wavelet transforms that have an adaptive low-pass filter are useful in applications that require the signal singularities, sharp transitions, and image edges to be left intact in the low-pass signal. In scalable image coding, the spatial resolution scalability is achieved by reconstructing the low-pass signal subband, which corresponds to the desired resolution level, and discarding other high-frequency wavelet subbands. In such applications, it is vital to have low-pass subbands that are not affected by smoothing artifacts associated with low-pass filtering. We present the mathematical framework for achieving 1-D wavelet transforms that have a spatially adaptive low-pass filter (SALP) using the prediction-first lifting scheme. The adaptivity decisions are computed using the wavelet coefficients, and no bookkeeping is required for the perfect reconstruction. Then, 2-D wavelet transforms that have a spatially adaptive low-pass filter are designed by extending the 1-D SALP framework. Because the 2-D polyphase decompositions are used in this case, the 2-D adaptivity decisions are made nonseparable as opposed to the separable 2-D realization using 1-D transforms. We present examples using the 2-D 5/3 wavelet transform and their lossless image coding and scalable decoding performances in terms of quality and resolution scalability. The proposed 2-D-SALP scheme results in better performance compared to the existing adaptive update lifting schemes.
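
    For reference, the standard (non-adaptive) reversible 5/3 wavelet transform mentioned in the abstract can be realized with the lifting scheme in a few lines. This is a minimal 1-D integer sketch with a simple symmetric boundary extension; the paper's SALP variant additionally adapts the low-pass step using the wavelet coefficients, which this sketch omits.

    ```python
    def lifting_53_forward(x):
        """Reversible 1-D 5/3 wavelet transform via lifting (predict, then update).
        Assumes an even-length integer signal."""
        n = len(x)
        d = []  # high-pass (detail) coefficients
        for i in range(n // 2):
            left = x[2 * i]
            right = x[2 * i + 2] if 2 * i + 2 < n else x[2 * i]  # boundary extension
            d.append(x[2 * i + 1] - (left + right) // 2)          # predict step
        s = []  # low-pass (approximation) coefficients
        for i in range(n // 2):
            dl = d[i - 1] if i > 0 else d[i]
            s.append(x[2 * i] + (dl + d[i] + 2) // 4)             # update step
        return s, d

    def lifting_53_inverse(s, d):
        """Exact inverse: undo the update, then the predict step."""
        n = 2 * len(s)
        x = [0] * n
        for i in range(len(s)):
            dl = d[i - 1] if i > 0 else d[i]
            x[2 * i] = s[i] - (dl + d[i] + 2) // 4
        for i in range(len(d)):
            left = x[2 * i]
            right = x[2 * i + 2] if 2 * i + 2 < n else x[2 * i]
            x[2 * i + 1] = d[i] + (left + right) // 2
        return x
    ```

    Because each lifting step is inverted exactly in integer arithmetic, the round trip is lossless, which is what makes the transform usable for lossless coding and resolution-scalable decoding.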

  17. Failing to learn from negative prediction errors: Obesity is associated with alterations in a fundamental neural learning mechanism.

    PubMed

    Mathar, David; Neumann, Jane; Villringer, Arno; Horstmann, Annette

    2017-10-01

    Prediction errors (PEs) encode the difference between expected and actual action outcomes in the brain via dopaminergic modulation. Integration of these learning signals ensures efficient behavioral adaptation. Obesity has recently been linked to altered dopaminergic fronto-striatal circuits, thus implying impairments in cognitive domains that rely on its integrity. 28 obese and 30 lean human participants performed an implicit stimulus-response learning paradigm inside an fMRI scanner. Computational modeling and psycho-physiological interaction (PPI) analysis was utilized for assessing PE-related learning and associated functional connectivity. We show that human obesity is associated with insufficient incorporation of negative PEs into behavioral adaptation even in a non-food context, suggesting differences in a fundamental neural learning mechanism. Obese subjects were less efficient in using negative PEs to improve implicit learning performance, despite proper coding of PEs in striatum. We further observed lower functional coupling between ventral striatum and supplementary motor area in obese subjects subsequent to negative PEs. Importantly, strength of functional coupling predicted task performance and negative PE utilization. These findings show that obesity is linked to insufficient behavioral adaptation specifically in response to negative PEs, and to associated alterations in function and connectivity within the fronto-striatal system. Recognition of neural differences as a central characteristic of obesity hopefully paves the way to rethink established intervention strategies: Differential behavioral sensitivity to negative and positive PEs should be considered when designing intervention programs. Measures relying on penalization of unwanted behavior may prove less effective in obese subjects than alternative approaches. Copyright © 2017 Elsevier Ltd. All rights reserved.
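
    A generic Rescorla-Wagner-style value update with separate learning rates for positive and negative prediction errors is one common way to model the asymmetry this study reports (reduced incorporation of negative PEs). The sketch below is an illustrative stand-in, not the authors' actual computational model.

    ```python
    def rw_update(value, reward, alpha_pos=0.3, alpha_neg=0.3):
        """One Rescorla-Wagner update of an expected value.

        The prediction error is the gap between outcome and expectation;
        distinct learning rates for positive and negative PEs let the model
        express under-use of negative feedback (e.g. alpha_neg < alpha_pos).
        """
        pe = reward - value                           # prediction error
        alpha = alpha_pos if pe >= 0 else alpha_neg   # valence-specific rate
        return value + alpha * pe
    ```

    Fitting `alpha_pos` and `alpha_neg` per participant and comparing the two is the usual way such asymmetries are quantified behaviourally.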

  18. Feedback-related negativity codes outcome valence, but not outcome expectancy, during reversal learning.

    PubMed

    von Borries, A K L; Verkes, R J; Bulten, B H; Cools, R; de Bruijn, E R A

    2013-12-01

    Optimal behavior depends on the ability to assess the predictive value of events and to adjust behavior accordingly. Outcome processing can be studied by using its electrophysiological signatures--that is, the feedback-related negativity (FRN) and the P300. A prominent reinforcement-learning model predicts an FRN on negative prediction errors, as well as implying a role for the FRN in learning and the adaptation of behavior. However, these predictions have recently been challenged. Notably, studies so far have used tasks in which the outcomes have been contingent on the response. In these paradigms, the need to adapt behavioral responses is present only for negative, not for positive feedback. The goal of the present study was to investigate the effects of positive as well as negative violations of expectancy on FRN amplitudes, without the usual confound of behavioral adjustments. A reversal-learning task was employed in which outcome value and outcome expectancy were orthogonalized; that is, both positive and negative outcomes were equally unexpected. The results revealed a double dissociation, with effects of valence but not expectancy on the FRN and, conversely, effects of expectancy but not valence on the P300. While FRN amplitudes were largest for negative-outcome trials, irrespective of outcome expectancy, P300 amplitudes were largest for unexpected-outcome trials, irrespective of outcome valence. These FRN effects were interpreted to reflect an evaluation along a good-bad dimension, rather than reflecting a negative prediction error or a role in behavioral adaptation. By contrast, the P300 reflects the updating of information relevant for behavior in a changing context.

  19. Detecting and Characterizing Semantic Inconsistencies in Ported Code

    NASA Technical Reports Server (NTRS)

    Ray, Baishakhi; Kim, Miryung; Person, Suzette J.; Rungta, Neha

    2013-01-01

    Adding similar features and bug fixes often requires porting program patches from reference implementations and adapting them to target implementations. Porting errors may result from faulty adaptations or inconsistent updates. This paper investigates (1) the types of porting errors found in practice, and (2) how to detect and characterize potential porting errors. Analyzing version histories, we define five categories of porting errors, including incorrect control- and data-flow, code redundancy, inconsistent identifier renamings, etc. Leveraging this categorization, we design a static control- and data-dependence analysis technique, SPA, to detect and characterize porting inconsistencies. Our evaluation on code from four open-source projects shows that SPA can detect porting inconsistencies with 65% to 73% precision and 90% recall, and identify inconsistency types with 58% to 63% precision and 92% to 100% recall. In a comparison with two existing error detection tools, SPA improves precision by 14 to 17 percentage points.
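
    For reference, the reported precision and recall figures follow the standard definitions (the counts below are illustrative, not taken from the paper):

    ```python
    def precision_recall(true_pos, false_pos, false_neg):
        """Precision: fraction of reported inconsistencies that are real.
        Recall: fraction of real inconsistencies that are reported."""
        precision = true_pos / (true_pos + false_pos)
        recall = true_pos / (true_pos + false_neg)
        return precision, recall
    ```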

Top