Sample records for code execution detection

  1. Detecting Heap-Spraying Code Injection Attacks in Malicious Web Pages Using Runtime Execution

    NASA Astrophysics Data System (ADS)

    Choi, Younghan; Kim, Hyoungchun; Lee, Donghoon

    The growing use of web services is increasing web browser attacks exponentially. Most attacks use a technique called heap spraying because of its high success rate. Heap spraying executes malicious code without indicating the exact address of the code by copying it into many heap objects; as a result, the attack has a high chance of succeeding once a vulnerability is exploited. Attackers have recently favored this technique because JavaScript makes it easy to allocate heap memory. This paper proposes a novel technique that detects heap spraying attacks by executing heap objects in a real environment, irrespective of the version and patch status of the web browser. This runtime execution detects various forms of heap spraying attacks, such as encoded and polymorphic payloads. To reduce the overhead of runtime execution, heap objects are first filtered against patterns of heap spraying attacks, derived from analysis of how a web browser accesses benign web sites. After a heap object is loaded into memory, it is executed forcibly by setting the instruction pointer to its address; the malicious code can thus be executed without regard to the browser's version or patch status. An object is considered to contain malicious code if execution reaches a call instruction that accesses the API of system libraries such as kernel32.dll and ws2_32.dll. A debugger engine is used to change registers and monitor execution flow. A prototype, named HERAD (HEap spRAying Detector), was implemented and evaluated. In experiments, HERAD detects various forms of exploit code that emulation cannot detect, as well as some heap spraying attacks that NOZZLE misses. Although it incurs execution overhead, HERAD produces a low number of false alarms. 
The processing time of several minutes is negligible because our research focuses on
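The pattern-filtering stage described above can be sketched as a heuristic that flags heap objects dominated by long runs of filler bytes (NOP-sled-style spraying). The function name, run-length threshold, and filler-byte set below are illustrative assumptions, not HERAD's actual implementation.

```python
# Hedged sketch of a pre-execution heap-spray filter. Thresholds and
# filler bytes are invented for illustration.

def looks_like_spray(heap_object: bytes,
                     min_run: int = 64,
                     sled_bytes=(0x90, 0x0c, 0x0d)) -> bool:
    """Flag heap objects containing a long run of repeated filler bytes,
    a common signature of NOP-sled style heap spraying."""
    run_len = 1
    for prev, cur in zip(heap_object, heap_object[1:]):
        if cur == prev and cur in sled_bytes:
            run_len += 1
            if run_len >= min_run:
                return True
        else:
            run_len = 1
    return False

benign = bytes(range(256)) * 4                  # varied content, no sled
sprayed = b"\x90" * 4096 + b"\xcc\xcc"          # long NOP sled then stub
```

Objects that pass such a filter would then be executed under the debugger as the abstract describes; the filter only bounds the runtime-execution workload.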

  2. DROP: Detecting Return-Oriented Programming Malicious Code

    NASA Astrophysics Data System (ADS)

    Chen, Ping; Xiao, Hai; Shen, Xiaobin; Yin, Xinchun; Mao, Bing; Xie, Li

    Return-Oriented Programming (ROP) is a technique that lets an attacker construct malicious code from pieces of x86/SPARC executables without any function calls at all. As a result, ROP malicious code contains no injected instructions of its own, which distinguishes it from existing attacks; moreover, it hides within benign code. It thus circumvents defenses that prevent control flow from diverting outside legitimate regions (such as W ⊕ X) and most malicious-code scanning techniques (such as anti-virus scanners). However, ROP has intrinsic features that differ from normal program design: (1) it uses short instruction sequences ending in "ret", called gadgets, and (2) it executes gadgets contiguously in a specific memory region, such as the standard GNU libc. Based on these features of ROP malicious code, this paper presents DROP, a tool focused on dynamically detecting ROP malicious code. Preliminary experimental results show that DROP can efficiently detect ROP malicious code with no false positives or negatives.
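The two features DROP relies on can be sketched as a simple trace heuristic: flag a run when several short "ret"-terminated sequences execute back-to-back at library addresses. The trace format, address range, and thresholds below are assumptions for demonstration, not DROP's implementation.

```python
# Illustrative gadget-chain detector over a dynamic instruction trace.
# LIB_RANGE stands in for a hypothetical libc mapping.

LIB_RANGE = range(0x7000_0000, 0x7100_0000)

def flag_rop(trace, max_gadget_len=5, min_contiguous=3):
    """trace: list of (address, mnemonic) pairs from a dynamic run."""
    contiguous = 0   # count of consecutive short gadgets seen
    seq_len = 0      # instructions since the last "ret"
    for addr, insn in trace:
        seq_len += 1
        if insn == "ret":
            if seq_len <= max_gadget_len and addr in LIB_RANGE:
                contiguous += 1
                if contiguous >= min_contiguous:
                    return True
            else:
                contiguous = 0   # a long/ordinary return breaks the chain
            seq_len = 0
    return False

# Three two-instruction gadgets chained inside the library range:
demo_gadgets = [(0x7000_0000 + 16 * g + off, insn)
                for g in range(3)
                for off, insn in ((0, "pop"), (1, "ret"))]
# An ordinary function: many instructions, one ret, outside the library:
demo_normal = [(0x0040_0000 + i, "mov") for i in range(10)] + [(0x0040_000a, "ret")]
```

A real detector would of course consume an actual instruction stream (e.g. from binary instrumentation) rather than tuples, but the contiguity logic is the same.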

  3. 17 CFR 232.106 - Prohibition against electronic submissions containing executable code.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... executable code will be suspended, unless the executable code is contained only in one or more PDF documents, in which case the submission will be accepted but the PDF document(s) containing executable code will...

  4. System for loading executable code into volatile memory in a downhole tool

    DOEpatents

    Hall, David R.; Bartholomew, David B.; Johnson, Monte L.

    2007-09-25

    A system for loading an executable code into volatile memory in a downhole tool string component comprises a surface control unit comprising executable code. An integrated downhole network comprises data transmission elements in communication with the surface control unit and the volatile memory. The executable code, stored in the surface control unit, is not permanently stored in the downhole tool string component. In a preferred embodiment of the present invention, the downhole tool string component comprises boot memory. In another embodiment, the executable code is an operating system executable code. Preferably, the volatile memory comprises random access memory (RAM). A method for loading executable code to volatile memory in a downhole tool string component comprises sending the code from the surface control unit to a processor in the downhole tool string component over the network. A central processing unit writes the executable code in the volatile memory.

  5. Model-Driven Engineering of Machine Executable Code

    NASA Astrophysics Data System (ADS)

    Eichberg, Michael; Monperrus, Martin; Kloppenburg, Sven; Mezini, Mira

    Implementing static analyses of machine-level executable code is labor intensive and complex. We show how to leverage model-driven engineering to facilitate the design and implementation of programs that perform static analyses. Further, we report important lessons learned about the benefits and drawbacks of the following technologies: using the Scala programming language as the target of code generation, using XML Schema to express a metamodel, and using XSLT to implement (a) transformations and (b) a lint-like tool. Finally, we report on the use of Prolog for writing model transformations.

  6. Mal-Xtract: Hidden Code Extraction using Memory Analysis

    NASA Astrophysics Data System (ADS)

    Lim, Charles; Syailendra Kotualubun, Yohanes; Suryadi; Ramli, Kalamullah

    2017-01-01

    Software packers are used effectively to hide the original code inside a binary executable, making it more difficult for existing signature-based anti-malware software to detect malicious code inside the executable. A new method based on written-and-rewritten memory sections is introduced to detect the exact end time of the unpacking routine and to extract the original code from a packed binary executable using memory analysis in a software-emulated environment. Our experimental results show that at least 97% of the original code could be extracted from binary executables packed with a variety of software packers. The proposed method has also successfully extracted hidden code from recent malware family samples.

  7. A model-guided symbolic execution approach for network protocol implementations and vulnerability detection.

    PubMed

    Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing

    2017-01-01

    Formal techniques have been devoted to analyzing whether network protocol specifications violate security policies; however, these methods cannot detect vulnerabilities in the implementations of the network protocols themselves. Symbolic execution can be used to analyze the paths of the network protocol implementations, but for stateful network protocols, it is difficult to reach the deep states of the protocol. This paper proposes a novel model-guided approach to detect vulnerabilities in network protocol implementations. Our method first abstracts a finite state machine (FSM) model, then utilizes the model to guide the symbolic execution. This approach achieves high coverage of both the code and the protocol states. The proposed method is implemented and applied to test numerous real-world network protocol implementations. The experimental results indicate that the proposed method is more effective than traditional fuzzing methods such as SPIKE at detecting vulnerabilities in the deep states of network protocol implementations.
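The model-guided idea above can be illustrated by enumerating message sequences from an abstract protocol FSM; those sequences would then drive the implementation into deep states before symbolic exploration begins. The three-state handshake below is invented for the example, not any real protocol.

```python
# Minimal sketch: breadth-first enumeration of message sequences that
# reach a target protocol state in a (made-up) FSM.

from collections import deque

FSM = {  # state -> {message: next_state}
    "INIT":        {"HELLO": "NEGOTIATING"},
    "NEGOTIATING": {"KEYEX": "ESTABLISHED"},
    "ESTABLISHED": {"DATA": "ESTABLISHED", "CLOSE": "INIT"},
}

def paths_to(target, start="INIT", limit=4):
    """Return message sequences (up to `limit` messages) reaching `target`."""
    queue = deque([(start, [])])
    found = []
    while queue:
        state, msgs = queue.popleft()
        if state == target:
            found.append(msgs)   # reached the deep state; record the prefix
            continue
        if len(msgs) >= limit:
            continue
        for msg, nxt in FSM[state].items():
            queue.append((nxt, msgs + [msg]))
    return found
```

Each returned sequence is a concrete "state prefix"; a symbolic executor would replay it concretely and then switch to symbolic inputs for the message under test.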

  8. Executive functioning and processing speed in age-related differences in memory: contribution of a coding task.

    PubMed

    Baudouin, Alexia; Clarys, David; Vanneste, Sandrine; Isingrini, Michel

    2009-12-01

    The aim of the present study was to examine executive dysfunctioning and decreased processing speed as potential mediators of age-related differences in episodic memory. We compared the performances of young and elderly adults in a free-recall task. Participants were also given tests to measure executive functions and perceptual processing speed and a coding task (the Digit Symbol Substitution Test, DSST). More precisely, we tested the hypothesis that executive functions would mediate the age-related differences observed in the free-recall task better than perceptual speed. We also tested the assumption that a coding task, assumed to involve both executive processes and perceptual speed, would be the best mediator of age-related differences in memory. Findings first confirmed that the DSST combines executive processes and perceptual speed. Secondly, they showed that executive functions are a significant mediator of age-related differences in memory, and that DSST performance is the best predictor.

  9. 17 CFR 232.106 - Prohibition against electronic submissions containing executable code.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 17 Commodity and Securities Exchanges 2 2011-04-01 2011-04-01 false Prohibition against electronic submissions containing executable code. 232.106 Section 232.106 Commodity and Securities Exchanges SECURITIES... Filer Manual section also may be a violation of the Computer Fraud and Abuse Act of 1986, as amended...

  10. 17 CFR 232.106 - Prohibition against electronic submissions containing executable code.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 17 Commodity and Securities Exchanges 2 2013-04-01 2013-04-01 false Prohibition against electronic submissions containing executable code. 232.106 Section 232.106 Commodity and Securities Exchanges SECURITIES... Filer Manual section also may be a violation of the Computer Fraud and Abuse Act of 1986, as amended...

  11. 17 CFR 232.106 - Prohibition against electronic submissions containing executable code.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 17 Commodity and Securities Exchanges 2 2012-04-01 2012-04-01 false Prohibition against electronic submissions containing executable code. 232.106 Section 232.106 Commodity and Securities Exchanges SECURITIES... Filer Manual section also may be a violation of the Computer Fraud and Abuse Act of 1986, as amended...

  12. Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, L.M.; Hochstedler, R.D.

    1997-02-01

    Execution of the integrated TIGER series (ITS) of coupled electron/photon Monte Carlo radiation transport codes has been accelerated by modifying the FORTRAN source code for more efficient computation. Each member code of ITS was benchmarked and profiled with a specific test case that directed the acceleration effort toward the most computationally intensive subroutines. Techniques for accelerating these subroutines included replacing linear search algorithms with binary versions, replacing the pseudo-random number generator, reducing program memory allocation, and proofing the input files for geometrical redundancies. All techniques produced identical or statistically similar results to the original code. Final benchmark timing of the accelerated code resulted in speed-up factors of 2.00 for TIGER (the one-dimensional slab geometry code), 1.74 for CYLTRAN (the two-dimensional cylindrical geometry code), and 1.90 for ACCEPT (the arbitrary three-dimensional geometry code).
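The first acceleration technique named above, replacing a linear search with a binary one, is easy to sketch. The sorted grid below is invented for illustration; both lookups implement the same "first entry >= value" semantics, so they always agree while the binary version runs in O(log n).

```python
# Linear vs. binary lookup into a sorted grid (values are illustrative).

import bisect

energy_grid = [0.01 * i for i in range(1, 10001)]  # sorted bin edges

def linear_lookup(grid, e):
    """O(n): first index whose edge is >= e (clamped to the last bin)."""
    for i, edge in enumerate(grid):
        if edge >= e:
            return i
    return len(grid) - 1

def binary_lookup(grid, e):
    """O(log n): same semantics via bisect_left, clamped to the last bin."""
    return min(bisect.bisect_left(grid, e), len(grid) - 1)
```

In the ITS context the payoff comes from such lookups sitting inside the innermost transport loops, where they execute millions of times per history batch.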

  13. Runtime Detection of C-Style Errors in UPC Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pirkelbauer, P; Liao, C; Panas, T

    2011-09-29

    Unified Parallel C (UPC) extends the C programming language (ISO C 99) with explicit parallel programming support for the partitioned global address space (PGAS), which provides a global memory space with localized partitions to each thread. Like its ancestor C, UPC is a low-level language that emphasizes code efficiency over safety. The absence of dynamic (and static) safety checks allows programmer oversights and software flaws that can be hard to spot. In this paper, we present an extension of a dynamic analysis tool, ROSE-Code Instrumentation and Runtime Monitor (ROSE-CIRM), for UPC to help programmers find C-style errors involving the global address space. Built on top of the ROSE source-to-source compiler infrastructure, the tool instruments source files with code that monitors operations and keeps track of changes to the system state. The resulting code is linked to a runtime monitor that observes the program execution and finds software defects. We describe the extensions to ROSE-CIRM that were necessary to support UPC. We discuss complications that arise from parallel code and our solutions. We test ROSE-CIRM against a runtime error detection test suite, and present performance results obtained from running error-free codes. ROSE-CIRM is released as part of the ROSE compiler under a BSD-style open source license.

  14. Transformation of Graphical ECA Policies into Executable PonderTalk Code

    NASA Astrophysics Data System (ADS)

    Romeikat, Raphael; Sinsel, Markus; Bauer, Bernhard

    Rules are becoming more and more important in business modeling and systems engineering and are recognized as a high-level programming paradigm. For effective rule development it is desirable to start at a high level, e.g. with graphical rules, and later refine them into code of a particular rule language for implementation. A model-driven approach is presented in this paper to transform graphical rules into executable code in a fully automated way. The focus is on event-condition-action policies as a special rule type; these are modeled graphically and translated into the PonderTalk language. The approach may be extended to integrate other rule types and languages as well.

  15. Coding for parallel execution of hardware-in-the-loop millimeter-wave scene generation models on multicore SIMD processor architectures

    NASA Astrophysics Data System (ADS)

    Olson, Richard F.

    2013-05-01

    Rendering of point scatterer based radar scenes for millimeter wave (mmW) seeker tests in real-time hardware-in-the-loop (HWIL) scene generation requires efficient algorithms and vector-friendly computer architectures for complex signal synthesis. New processor technology from Intel implements an extended 256-bit vector SIMD instruction set (AVX, AVX2) in a multi-core CPU design providing peak execution rates of hundreds of GigaFLOPS (GFLOPS) on one chip. Real world mmW scene generation code can approach peak SIMD execution rates only after careful algorithm and source code design. An effective software design will maintain high computing intensity emphasizing register-to-register SIMD arithmetic operations over data movement between CPU caches or off-chip memories. Engineers at the U.S. Army Aviation and Missile Research, Development and Engineering Center (AMRDEC) applied two basic parallel coding methods to assess new 256-bit SIMD multi-core architectures for mmW scene generation in HWIL. These include use of POSIX threads built on vector library functions and more portable, high-level parallel code based on compiler technology (e.g. OpenMP pragmas and SIMD autovectorization). Since CPU technology is rapidly advancing toward high processor core counts and TeraFLOPS peak SIMD execution rates, it is imperative that coding methods be identified which produce efficient and maintainable parallel code. This paper describes the algorithms used in point scatterer target model rendering, the parallelization of those algorithms, and the execution performance achieved on an AVX multi-core machine using the two basic parallel coding methods. The paper concludes with estimates for scale-up performance on upcoming multi-core technology.

  16. FORTRAN Automated Code Evaluation System (faces) system documentation, version 2, mod 0. [error detection codes/user manuals (computer programs)

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A system is presented which processes FORTRAN based software systems to surface potential problems before they become execution malfunctions. The system complements the diagnostic capabilities of compilers, loaders, and execution monitors rather than duplicating these functions. Also, it emphasizes frequent sources of FORTRAN problems which require inordinate manual effort to identify. The principal value of the system is extracting small sections of unusual code from the bulk of normal sequences. Code structures likely to cause immediate or future problems are brought to the user's attention. These messages stimulate timely corrective action of solid errors and promote identification of 'tricky' code. Corrective action may require recoding or simply extending software documentation to explain the unusual technique.

  17. Directed Incremental Symbolic Execution

    NASA Technical Reports Server (NTRS)

    Person, Suzette; Yang, Guowei; Rungta, Neha; Khurshid, Sarfraz

    2011-01-01

    The last few years have seen a resurgence of interest in the use of symbolic execution -- a program analysis technique developed more than three decades ago to analyze program execution paths. Scaling symbolic execution and other path-sensitive analysis techniques to large systems remains challenging despite recent algorithmic and technological advances. An alternative to solving the problem of scalability is to reduce the scope of the analysis. One approach that is widely studied in the context of regression analysis is to analyze the differences between two related program versions. While such an approach is intuitive in theory, finding efficient and precise ways to identify program differences, and characterize their effects on how the program executes has proved challenging in practice. In this paper, we present Directed Incremental Symbolic Execution (DiSE), a novel technique for detecting and characterizing the effects of program changes. The novelty of DiSE is to combine the efficiencies of static analysis techniques to compute program difference information with the precision of symbolic execution to explore program execution paths and generate path conditions affected by the differences. DiSE is a complementary technique to other reduction or bounding techniques developed to improve symbolic execution. Furthermore, DiSE does not require analysis results to be carried forward as the software evolves -- only the source code for two related program versions is required. A case-study of our implementation of DiSE illustrates its effectiveness at detecting and characterizing the effects of program changes.
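The first step DiSE describes, computing the differences between two program versions to seed the analysis, can be sketched with a textual diff; here `difflib` stands in for the paper's static program-difference analysis, and the two tiny "versions" are invented.

```python
# Toy version-differencing step: find the changed line indices in the new
# version, which would then direct symbolic exploration toward affected paths.

import difflib

v1 = ["x = read()", "if x > 0:", "    y = x + 1", "return y"]
v2 = ["x = read()", "if x >= 0:", "    y = x + 1", "return y"]

def changed_lines(old, new):
    """Indices (into `new`) of lines that differ from `old`."""
    sm = difflib.SequenceMatcher(a=old, b=new)
    return [j for tag, i1, i2, j1, j2 in sm.get_opcodes()
            if tag != "equal" for j in range(j1, j2)]
```

DiSE proper then propagates these change locations through control- and data-dependence information; a raw line diff is only the starting point.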

  18. Error-Detecting Identification Codes for Algebra Students.

    ERIC Educational Resources Information Center

    Sutherland, David C.

    1990-01-01

    Discusses common error-detecting identification codes using linear algebra terminology to provide an interesting application of algebra. Presents examples from the International Standard Book Number, the Universal Product Code, bank identification numbers, and the ZIP code bar code. (YP)
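One of the codes mentioned, the ISBN-10 check digit, makes a compact worked example: the weighted sum of the digits must be divisible by 11, which catches any single-digit error and any adjacent transposition.

```python
# ISBN-10 validation via the standard modulo-11 weighted sum.

def isbn10_valid(isbn: str) -> bool:
    """'X' stands for the value 10 in the check position."""
    digits = [10 if c in "Xx" else int(c) for c in isbn if c not in "- "]
    if len(digits) != 10:
        return False
    # Weights run 10 down to 1 across the ten digits.
    return sum((10 - i) * d for i, d in enumerate(digits)) % 11 == 0
```

For example, "0-306-40615-2" is valid, while changing the check digit or swapping two adjacent digits breaks the divisibility and is detected.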

  19. mGrid: A load-balanced distributed computing environment for the remote execution of the user-defined Matlab code

    PubMed Central

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-01-01

    Background: Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. Results: mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Conclusion: Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. 
Moreover, the web-based infrastructure of mGrid allows for it

  20. mGrid: a load-balanced distributed computing environment for the remote execution of the user-defined Matlab code.

    PubMed

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-03-15

    Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. 
Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over

  1. Using recurrence plot analysis for software execution interpretation and fault detection

    NASA Astrophysics Data System (ADS)

    Mosdorf, M.

    2015-09-01

    This paper presents a method for software execution interpretation and fault detection using recurrence plot analysis. In the proposed approach, recurrence plot analysis is applied to a software execution trace that contains the executed assembly instructions. The results of this analysis are further processed with PCA (Principal Component Analysis), which reduces the number of coefficients used for software execution classification. The method was applied to five algorithms: Bubble Sort, Quick Sort, Median Filter, FIR, and SHA-1. Results show that some of the collected traces could easily be assigned to particular algorithms (logs from the Bubble Sort and FIR algorithms), while others are more difficult to distinguish.
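The core recurrence-plot construction is simple to state: R[i][j] = 1 iff the trace values at positions i and j are within a threshold eps. The sketch below uses small invented integers as stand-ins for encoded instruction values; the paper's actual trace encoding and PCA stage are not reproduced.

```python
# Recurrence plot of a scalar trace, per the standard definition.

def recurrence_plot(x, eps):
    n = len(x)
    return [[1 if abs(x[i] - x[j]) <= eps else 0 for j in range(n)]
            for i in range(n)]

def recurrence_rate(R):
    """Fraction of recurrent points, a common scalar summary of a plot."""
    n = len(R)
    return sum(map(sum, R)) / (n * n)

trace = [1, 2, 1, 2]          # toy stand-in for encoded instructions
R = recurrence_plot(trace, 0)  # eps = 0: exact matches only
```

Features such as the recurrence rate (and others derived from diagonal-line structure) are what the PCA stage would then compress for classification.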

  2. Writing executable assertions to test flight software

    NASA Technical Reports Server (NTRS)

    Mahmood, A.; Andrews, D. M.; Mccluskey, E. J.

    1984-01-01

    An executable assertion is a logical statement about the variables or a block of code. If there is no error during execution, the assertion statement evaluates to true. Executable assertions can be used for dynamic testing of software: they can be employed for validation during the design phase, and for exception handling and error detection during the operation phase. The present investigation is concerned with the problem of writing executable assertions for testing flight software. The digital flight control system and the flight control software are discussed. The considered system provides autopilot and flight director modes of operation for automatic and manual control of the aircraft during all phases of flight. Attention is given to techniques for writing and using assertions to test flight software, an experimental setup to test flight software, and language features to support efficient use of assertions.
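A minimal illustration of the idea: an assertion before a computation validates its preconditions, and one after it detects out-of-range results at runtime. The control-law function, variable names, and bounds below are invented for the example, not taken from the flight software discussed.

```python
# Executable assertions guarding a toy control-law block.

def commanded_pitch(rate_deg_s: float, dt: float) -> float:
    # Design-phase validation: the frame time must be sane.
    assert 0.0 < dt <= 0.1, "frame time out of range"
    pitch = rate_deg_s * dt
    # Operation-phase error detection: the command must stay within limits.
    assert -5.0 <= pitch <= 5.0, "pitch command limit exceeded"
    return pitch
```

In normal execution both assertions hold silently; a fault (here, an implausibly large rate) trips the post-condition instead of propagating a bad command downstream.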

  3. Are You Overpaying Your Academic Executive Team? A Method for Detecting Unmerited Academic Executive Compensation

    ERIC Educational Resources Information Center

    Pearce, Joshua

    2016-01-01

    University tuition fees and student debt have risen in part due to rapid expansion of university administration compensation. This study provides a novel methodology for detecting inappropriate executive compensation within universities. The usefulness of academic ideas is openly ranked using the h-index. By comparing the ratio of academic…

  4. Methodology for fast detection of false sharing in threaded scientific codes

    DOEpatents

    Chung, I-Hsin; Cong, Guojing; Murata, Hiroki; Negishi, Yasushi; Wen, Hui-Fang

    2014-11-25

    A profiling tool identifies a code region with a false sharing potential. A static analysis tool classifies variables and arrays in the identified code region. A mapping detection library correlates memory access instructions in the identified code region with variables and arrays in the identified code region while a processor is running the identified code region. The mapping detection library identifies one or more instructions at risk, in the identified code region, which are subject to an analysis by a false sharing detection library. A false sharing detection library performs a run-time analysis of the one or more instructions at risk while the processor is re-running the identified code region. The false sharing detection library determines, based on the performed run-time analysis, whether two different portions of the cache memory line are accessed by the generated binary code.
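The mapping-detection step above can be sketched as pure address arithmetic: two threads writing distinct addresses that land on the same cache line are candidates for false sharing. The 64-byte line size and the address sets below are assumptions for illustration.

```python
# Flag thread pairs whose distinct write addresses share a cache line.

LINE = 64  # assumed cache-line size in bytes

def false_sharing_pairs(writes):
    """writes: dict thread_id -> set of byte addresses written."""
    risky = []
    threads = sorted(writes)
    for i, a in enumerate(threads):
        for b in threads[i + 1:]:
            lines_a = {addr // LINE for addr in writes[a]}
            lines_b = {addr // LINE for addr in writes[b]}
            # Same line but no identical address => candidate false sharing.
            if (lines_a & lines_b) and not (writes[a] & writes[b]):
                risky.append((a, b))
    return risky
```

Threads 0 and 1 below write different words of the same 64-byte line and are flagged; thread 2 writes a distant line and is not.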

  5. Self-assembled software and method of overriding software execution

    DOEpatents

    Bouchard, Ann M.; Osbourn, Gordon C.

    2013-01-08

    A computer-implemented software self-assembled system and method for providing an external override and monitoring capability to dynamically self-assembling software containing machines that self-assemble execution sequences and data structures. The method provides an external override machine that can be introduced into a system of self-assembling machines while the machines are executing such that the functionality of the executing software can be changed or paused without stopping the code execution and modifying the existing code. Additionally, a monitoring machine can be introduced without stopping code execution that can monitor specified code execution functions by designated machines and communicate the status to an output device.

  6. A Novel Technique to Detect Code for SAC-OCDMA System

    NASA Astrophysics Data System (ADS)

    Bharti, Manisha; Kumar, Manoj; Sharma, Ajay K.

    2018-04-01

    The main task of an optical code division multiple access (OCDMA) system is the detection of the code used by a user in the presence of multiple access interference (MAI). In this paper, a new detection method known as XOR subtraction detection for spectral amplitude coding OCDMA (SAC-OCDMA) based on double-weight codes is proposed and presented. As MAI is the main source of performance deterioration in OCDMA systems, the SAC technique is used here to eliminate the effect of MAI to a large extent. A comparative analysis is then made between the proposed scheme and other conventional detection schemes, such as complementary subtraction detection, AND subtraction detection, and NAND subtraction detection. The system performance is characterized by Q-factor, BER and received optical power (ROP) with respect to input laser power and fiber length. The theoretical and simulation investigations reveal that the proposed detection technique provides better quality factor, security and received power in comparison to other conventional techniques. The wide opening of the eye diagram in the case of the proposed technique also proves its robustness.
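A heavily simplified numeric sketch of the subtraction idea: correlate the received spectral chips with the intended user's code and subtract the correlation with the XOR of the two users' codes, cancelling the interferer's contribution. The toy chip sequences and decision rule below are invented to show the arithmetic only; they are not the paper's code set or exact detector.

```python
# Toy XOR-subtraction decoding over binary spectral chips.

def correlate(r, c):
    return sum(ri * ci for ri, ci in zip(r, c))

def xor_subtraction_decode(received, own, other):
    """Decision statistic for the `own` user's transmitted bit."""
    xor_code = [a ^ b for a, b in zip(own, other)]
    return correlate(received, own) - correlate(received, xor_code)

X = [1, 1, 0, 0]   # user 1 chips (toy code)
Y = [0, 1, 1, 0]   # user 2 chips, one chip overlapping with X
```

With both users sending 1 the received chips are the sum [1, 2, 1, 0]; the statistic for user 1 comes out positive, while it is zero when user 1 sends 0, regardless of user 2's interference on the shared chip.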

  7. Quantum-capacity-approaching codes for the detected-jump channel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grassl, Markus; Wei Zhaohui; Ji Zhengfeng

    2010-12-15

    The quantum-channel capacity gives the ultimate limit for the rate at which quantum data can be reliably transmitted through a noisy quantum channel. Degradable quantum channels are among the few channels whose quantum capacities are known. Given the quantum capacity of a degradable channel, it remains challenging to find a practical coding scheme which approaches capacity. Here we discuss code designs for the detected-jump channel, a degradable channel with practical relevance describing the physics of spontaneous decay of atoms with detected photon emission. We show that this channel can be used to simulate a binary classical channel with both erasures and bit flips. The capacity of the simulated classical channel gives a lower bound on the quantum capacity of the detected-jump channel. When the jump probability is small, it almost equals the quantum capacity. Hence using a classical capacity-approaching code for the simulated classical channel yields a quantum code which approaches the quantum capacity of the detected-jump channel.

  8. The role of the PIRT process in identifying code improvements and executing code development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, G.E.; Boyack, B.E.

    1997-07-01

    In September 1988, the USNRC issued a revised ECCS rule for light water reactors that allows, as an option, the use of best estimate (BE) plus uncertainty methods in safety analysis. The key feature of this licensing option relates to quantification of the uncertainty in the determination that an NPP has a "low" probability of violating the safety criteria specified in 10 CFR 50. To support the 1988 licensing revision, the USNRC and its contractors developed the CSAU evaluation methodology to demonstrate the feasibility of the BE plus uncertainty approach. The PIRT process, Step 3 in the CSAU methodology, was originally formulated to support the BE plus uncertainty licensing option as executed in the CSAU approach to safety analysis. Subsequent work has shown the PIRT process to be a much more powerful tool than conceived in its original form. Through further development and application, the PIRT process has shown itself to be a robust means to establish safety analysis computer code phenomenological requirements in their order of importance to such analyses. Used early in research directed toward these objectives, PIRT results also provide the technical basis and cost effective organization for new experimental programs needed to improve the safety analysis codes for new applications. The primary purpose of this paper is to describe the generic PIRT process, including typical and common illustrations from prior applications. The secondary objective is to provide guidance to future applications of the process to help them focus, in a graded approach, on systems, components, processes and phenomena that have been common in several prior applications.

  9. Coded Excitation Plane Wave Imaging for Shear Wave Motion Detection

    PubMed Central

    Song, Pengfei; Urban, Matthew W.; Manduca, Armando; Greenleaf, James F.; Chen, Shigao

    2015-01-01

    Plane wave imaging has greatly advanced the field of shear wave elastography thanks to its ultrafast imaging frame rate and the large field-of-view (FOV). However, plane wave imaging also has decreased penetration due to lack of transmit focusing, which makes it challenging to use plane waves for shear wave detection in deep tissues and in obese patients. This study investigated the feasibility of implementing coded excitation in plane wave imaging for shear wave detection, with the hypothesis that coded ultrasound signals can provide superior detection penetration and shear wave signal-to-noise-ratio (SNR) compared to conventional ultrasound signals. Both phase encoding (Barker code) and frequency encoding (chirp code) methods were studied. A first phantom experiment showed an approximate penetration gain of 2-4 cm for the coded pulses. Two subsequent phantom studies showed that all coded pulses outperformed the conventional short imaging pulse by providing superior sensitivity to small motion and robustness to weak ultrasound signals. Finally, an in vivo liver case study on an obese subject (Body Mass Index = 40) demonstrated the feasibility of using the proposed method for in vivo applications, and showed that all coded pulses could provide higher SNR shear wave signals than the conventional short pulse. These findings indicate that by using coded excitation shear wave detection, one can benefit from the ultrafast imaging frame rate and large FOV provided by plane wave imaging while preserving good penetration and shear wave signal quality, which is essential for obtaining robust shear elasticity measurements of tissue. PMID:26168181
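The two pulse families named in the abstract can be sketched in a few lines: a Barker-13 phase code (whose aperiodic autocorrelation has a peak of 13 and sidelobes of magnitude at most 1, the property that makes pulse compression work) and a linear chirp for frequency encoding. The chirp parameters below are illustrative, not those used in the study:

```python
import math

# Barker-13 phase code: peak autocorrelation 13, sidelobe magnitude <= 1
BARKER_13 = [+1, +1, +1, +1, +1, -1, -1, +1, +1, -1, +1, -1, +1]

def autocorr(seq, lag):
    """Aperiodic autocorrelation of a real sequence at a given lag."""
    return sum(a * b for a, b in zip(seq, seq[lag:]))

def linear_chirp(n, f0, f1, fs):
    """n samples of a linear frequency sweep from f0 to f1 Hz at
    sampling rate fs (frequency-encoded excitation)."""
    duration = n / fs
    rate = (f1 - f0) / duration  # sweep rate in Hz/s
    return [math.sin(2 * math.pi * (f0 * t + 0.5 * rate * t * t))
            for t in (i / fs for i in range(n))]
```

Matched filtering the received echo with the known code recovers the compressed pulse, which is where the SNR and penetration gains reported above come from.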

  10. Performance Metrics for Monitoring Parallel Program Executions

    NASA Technical Reports Server (NTRS)

    Sarukkai, Sekkar R.; Gotwais, Jacob K.; Yan, Jerry; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

    Existing tools for debugging the performance of parallel programs provide either graphical representations of program execution or profiles of program executions. However, for performance debugging tools to be useful, such information has to be augmented with information that highlights the cause of poor program performance. Identifying the cause of poor performance requires not only determining the significance of various performance problems for the execution time of the program, but also considering the effect of interprocessor communication on individual source-level data structures. In this paper, we present a suite of normalized indices which provide a convenient mechanism for focusing on a region of code with poor performance and highlight the cause of the problem in terms of processors, procedures and data structure interactions. All the indices are generated from trace files augmented with data structure information. Further, we show with the help of examples from the NAS benchmark suite that the indices help in detecting potential causes of poor performance, based on augmented execution traces obtained by monitoring the program.
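The abstract does not define the indices, so the following is a hypothetical illustration of the general idea: normalizing per-(procedure, data structure) communication time by total execution time so that the largest index points at the interaction most worth investigating. The event format is an assumption, not the paper's trace format:

```python
from collections import defaultdict

def normalized_comm_indices(trace_events, total_exec_time):
    """Hypothetical normalized index: for each (procedure, data structure)
    pair, the fraction of total execution time spent in interprocessor
    communication attributed to that pair. trace_events is a list of
    (procedure, data_structure, comm_time) tuples."""
    totals = defaultdict(float)
    for proc, ds, comm_time in trace_events:
        totals[(proc, ds)] += comm_time
    return {key: t / total_exec_time for key, t in totals.items()}
```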

  11. Shared prefetching to reduce execution skew in multi-threaded systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eichenberger, Alexandre E; Gunnels, John A

    Mechanisms are provided for optimizing code to perform prefetching of data into a shared memory of a computing device that is shared by a plurality of threads that execute on the computing device. A memory stream of a portion of code that is shared by the plurality of threads is identified. A set of prefetch instructions is distributed across the plurality of threads. Prefetch instructions are inserted into the instruction sequences of the plurality of threads such that each instruction sequence has a separate sub-portion of the set of prefetch instructions, thereby generating optimized code. Executable code is generated based on the optimized code and stored in a storage device. The executable code, when executed, performs the prefetches associated with the distributed set of prefetch instructions in a shared manner across the plurality of threads.
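The distribution step above can be sketched abstractly: split one memory stream's prefetch targets into disjoint sub-portions, one per thread. A round-robin split is only one possible policy and is an assumption here, not the patent's algorithm:

```python
def distribute_prefetches(prefetch_addrs, n_threads):
    """Partition a shared memory stream's prefetch targets into disjoint
    per-thread sub-portions (round-robin), so that across all threads
    every address is prefetched exactly once into the shared memory."""
    return [prefetch_addrs[t::n_threads] for t in range(n_threads)]
```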

  12. 3 CFR 13490 - Executive Order 13490 of January 21, 2009. Ethics Commitments by Executive Branch Personnel

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 3 The President 1 2010-01-01 2010-01-01 false Executive Order 13490 of January 21, 2009. Ethics... Order 13490 of January 21, 2009 EO 13490 Ethics Commitments by Executive Branch Personnel By the... Code, it is hereby ordered as follows: Section 1. Ethics Pledge. Every appointee in every executive...

  13. Directed Hidden-Code Extractor for Environment-Sensitive Malwares

    NASA Astrophysics Data System (ADS)

    Jia, Chunfu; Wang, Zhi; Lu, Kai; Liu, Xinhai; Liu, Xin

    Malware writers often use packing techniques to hide malicious payloads. A number of dynamic unpacking tools are designed to identify and extract the hidden code in packed malware. However, such unpacking methods are all based on a highly controlled environment that is vulnerable to various anti-unpacking techniques. If the execution environment is suspicious, malware may stay inactive for a long time or stop execution immediately to evade detection. In this paper, we propose a novel approach that automatically reasons about the environment requirements imposed by malware and then directs an unpacking tool to change the controlled environment, extracting the hidden code in the new environment. The experimental results show that our approach significantly increases the resilience of traditional unpacking tools to environment-sensitive malware.

  14. Concurrent error detecting codes for arithmetic processors

    NASA Technical Reports Server (NTRS)

    Lim, R. S.

    1979-01-01

    A method of concurrent error detection for arithmetic processors is described. Low-cost residue codes with check length l and check base m = 2^l - 1 are described for checking the arithmetic operations of addition, subtraction, multiplication, division, complement, shift, and rotate. Of the three number representations, the signed-magnitude representation is preferred for residue checking. Two methods of residue generation are described: the standard method of using modulo-m adders and the method of using a self-testing residue tree. A simple single-bit parity-check code is described for checking the logical operations of XOR, OR, and AND, and also the arithmetic operations of complement, shift, and rotate. For checking complement, shift, and rotate, the single-bit parity-check code is simpler to implement than the residue codes.
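The residue check works because residues modulo m = 2^l - 1 can be computed by summing l-bit digits (end-around carry), and residues are preserved by addition: residue(a + b) must equal the mod-m sum of the operand residues, so a mismatch signals an error. A minimal software sketch of that arithmetic (the hardware uses modulo-m adders, as the abstract notes):

```python
def residue(x, l):
    """Low-cost residue of x modulo m = 2**l - 1, computed by summing
    l-bit digits with end-around carry rather than by division."""
    m = (1 << l) - 1
    r = 0
    while x:
        r += x & m
        x >>= l
    while r > m:                 # fold carries back in
        r = (r & m) + (r >> l)
    return 0 if r == m else r    # m is congruent to 0 mod m

def check_add(a, b, l):
    """Concurrent check of an addition: the residue of the sum must equal
    the mod-m sum of the operand residues."""
    m = (1 << l) - 1
    return residue(a + b, l) == (residue(a, l) + residue(b, l)) % m
```

For example, with l = 3 (m = 7), residue(100, 3) gives 100 mod 7 without a divide.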

  15. Detecting and Characterizing Semantic Inconsistencies in Ported Code

    NASA Technical Reports Server (NTRS)

    Ray, Baishakhi; Kim, Miryung; Person,Suzette; Rungta, Neha

    2013-01-01

    Adding similar features and bug fixes often requires porting program patches from reference implementations and adapting them to target implementations. Porting errors may result from faulty adaptations or inconsistent updates. This paper investigates (1) the types of porting errors found in practice, and (2) how to detect and characterize potential porting errors. Analyzing version histories, we define five categories of porting errors, including incorrect control- and data-flow, code redundancy, inconsistent identifier renamings, etc. Leveraging this categorization, we design a static control- and data-dependence analysis technique, SPA, to detect and characterize porting inconsistencies. Our evaluation on code from four open-source projects shows that SPA can detect porting inconsistencies with 65% to 73% precision and 90% recall, and identify inconsistency types with 58% to 63% precision and 92% to 100% recall. In a comparison with two existing error detection tools, SPA improves precision by 14 to 17 percentage points.

  16. Protection of Mobile Agents Execution Using a Modified Self-Validating Branch-Based Software Watermarking with External Sentinel

    NASA Astrophysics Data System (ADS)

    Tomàs-Buliart, Joan; Fernández, Marcel; Soriano, Miguel

    Critical infrastructures are usually controlled by software entities. To monitor the correct functioning of these entities, a solution based on the use of mobile agents is proposed. Some proposals to detect modifications of mobile agents, such as digital signatures of code, exist, but they are oriented to protecting software against modification or to verifying that an agent has been executed correctly. The aim of our proposal is to guarantee that the software is being executed correctly by a non-trusted host. The way proposed to achieve this objective is by improving the Self-Validating Branch-Based Software Watermarking of Myles et al. The proposed modification is the incorporation of an external element, called a sentinel, which controls branch targets. This technique, applied to mobile agents, can guarantee the correct operation of an agent or, at least, can detect suspicious behaviour of a malicious host during the execution of the agent instead of detecting it only after the execution of the agent has finished.

  17. Detecting and Characterizing Semantic Inconsistencies in Ported Code

    NASA Technical Reports Server (NTRS)

    Ray, Baishakhi; Kim, Miryung; Person, Suzette J.; Rungta, Neha

    2013-01-01

    Adding similar features and bug fixes often requires porting program patches from reference implementations and adapting them to target implementations. Porting errors may result from faulty adaptations or inconsistent updates. This paper investigates (1) the types of porting errors found in practice, and (2) how to detect and characterize potential porting errors. Analyzing version histories, we define five categories of porting errors, including incorrect control- and data-flow, code redundancy, inconsistent identifier renamings, etc. Leveraging this categorization, we design a static control- and data-dependence analysis technique, SPA, to detect and characterize porting inconsistencies. Our evaluation on code from four open-source projects shows that SPA can detect porting inconsistencies with 65% to 73% precision and 90% recall, and identify inconsistency types with 58% to 63% precision and 92% to 100% recall. In a comparison with two existing error detection tools, SPA improves precision by 14 to 17 percentage points.

  18. Multimodal Sparse Coding for Event Detection

    DTIC Science & Technology

    2015-10-13

    classification tasks based on single modality. We present multimodal sparse coding for learning feature representations shared across multiple modalities...The shared representations are applied to multimedia event detection (MED) and evaluated in comparison to unimodal counterparts, as well as other...and video tracks from the same multimedia clip, we can force the two modalities to share a similar sparse representation whose benefit includes robust

  19. When Interference Helps: Increasing Executive Load to Facilitate Deception Detection in the Concealed Information Test

    PubMed Central

    Visu-Petra, George; Varga, Mihai; Miclea, Mircea; Visu-Petra, Laura

    2013-01-01

    The possibility of enhancing the detection efficiency of the Concealed Information Test (CIT) by increasing executive load was investigated, using an interference design. After learning and executing a mock crime scenario, subjects underwent three deception detection tests: an RT-based CIT, an RT-based CIT plus a concurrent memory task (CITMem), and an RT-based CIT plus a concurrent set-shifting task (CITShift). The concealed information effect, consisting of increased RTs and lower response accuracy for probe items compared to irrelevant items, was evidenced across all three conditions. The group analyses indicated a larger difference between RTs to probe and irrelevant items in the dual-task conditions, but this difference did not translate into significantly increased detection efficiency at the individual level. Signal detection parameters based on the comparison with a simulated innocent group showed accurate discrimination for all conditions. Overall response accuracy on the CITMem was highest, and the difference between response accuracy to probes and irrelevants was smallest in this condition. Accuracy on the concurrent tasks (Mem and Shift) was high, and responses on these tasks were significantly influenced by CIT stimulus type (probes vs. irrelevants). The findings are interpreted in relation to the cognitive load/dual-task interference literature, generating important insights for research on the involvement of executive functions in deceptive behavior. PMID:23543918

  20. Efficient coding and detection of ultra-long IDs for visible light positioning systems.

    PubMed

    Zhang, Hualong; Yang, Chuanchuan

    2018-05-14

    Visible light positioning (VLP) is a promising technique to complement Global Navigation Satellite Systems (GNSS) such as the Global Positioning System (GPS) and the BeiDou Navigation Satellite System (BDS), featuring the advantages of low cost and high accuracy. The situation becomes even more crucial for indoor environments, where satellite signals are weak or even unavailable. For large-scale application of VLP, there would be a considerable number of light-emitting diode (LED) IDs, which creates a demand for long LED ID detection. In particular, to provision indoor localization globally, a convenient way is to program a unique ID into each LED during manufacture. This poses a big challenge for image sensors, such as the CMOS camera in everybody's hands, since the long ID spans multiple frames. In this paper, we investigate the detection of ultra-long IDs using rolling shutter cameras. By analyzing the pattern of data loss in each frame, we propose a novel coding technique to improve the efficiency of LED ID detection. We studied the performance of Reed-Solomon (RS) codes in this system and designed a new coding method which considers the trade-off between performance and decoding complexity. The coding technique decreases the number of frames needed in data processing, significantly reduces the detection time, and improves the accuracy of detection. Numerical and experimental results show that the detected LED ID can be much longer with the coding technique. Besides, our proposed coding method is proved to achieve a performance close to that of RS codes while the decoding complexity is much lower.

  1. The detection and extraction of interleaved code segments

    NASA Technical Reports Server (NTRS)

    Rugaber, Spencer; Stirewalt, Kurt; Wills, Linda M.

    1995-01-01

    This project is concerned with a specific difficulty that arises when trying to understand and modify computer programs. In particular, it is concerned with the phenomenon of 'interleaving', in which one section of a program accomplishes several purposes, and disentangling the code responsible for each purpose is difficult. Unraveling interleaved code involves discovering the purpose of each strand of computation, as well as understanding why the programmer decided to interleave the strands. Increased understanding improves the productivity and quality of software maintenance, enhancement, and documentation activities. It is the goal of the project to characterize the phenomenon of interleaving as a prerequisite for building tools to detect and extract interleaved code fragments.

  2. Tactile detection of slip: surface microgeometry and peripheral neural codes.

    PubMed

    Srinivasan, M A; Whitehouse, J M; LaMotte, R H

    1990-06-01

    1. The role of the microgeometry of planar surfaces in the detection of sliding of the surfaces on human and monkey fingerpads was investigated. By the use of a servo-controlled tactile stimulator to press and stroke glass plates on passive fingerpads of human subjects, the ability of humans to discriminate the direction of skin stretch caused by friction and to detect the sliding motion (slip) of the plates with or without micrometer-sized surface features was determined. To identify the associated peripheral neural codes, evoked responses to the same stimuli were recorded from single, low-threshold mechanoreceptive afferent fibers innervating the fingerpads of anesthetized macaque monkeys. 2. Humans could not detect the slip of a smooth glass plate on the fingerpad. However, the direction of skin stretch was perceived based on the information conveyed by the slowly adapting afferents that respond differentially to the stretch directions. Whereas the direction of skin stretch signaled the direction of impending slip, the perception of relative motion between the plate and the finger required the existence of detectable surface features. 3. Barely detectable micrometer-sized protrusions on smooth surfaces led to the detection of slip of these surfaces, because of the exclusive activation of rapidly adapting fibers of either the Meissner (RA) or the Pacinian (PC) type to specific geometries of the microfeatures. The motion of a smooth plate with a very small single raised dot (4 microns high, 550 microns diam) caused the sequential activation of neighboring RAs along the dot path, thus providing a reliable spatiotemporal code. The stroking of the plate with a fine homogeneous texture composed of a matrix of dots (1 microns high, 50 microns diam, and spaced at 100 microns center-to-center) induced vibrations in the fingerpad that activated only the PCs and resulted in an intensive code. 4. The results show that surprisingly small features on smooth surfaces are

  3. Parallelized direct execution simulation of message-passing parallel programs

    NASA Technical Reports Server (NTRS)

    Dickens, Phillip M.; Heidelberger, Philip; Nicol, David M.

    1994-01-01

    As massively parallel computers proliferate, there is growing interest in finding ways by which the performance of massively parallel codes can be efficiently predicted. This problem arises in diverse contexts such as parallelizing compilers, parallel performance monitoring, and parallel algorithm development. In this paper we describe one solution where one directly executes the application code, but uses a discrete-event simulator to model details of the presumed parallel machine, such as operating system and communication network behavior. Because this approach is computationally expensive, we are interested in its own parallelization, specifically the parallelization of the discrete-event simulator. We describe methods suitable for parallelized direct execution simulation of message-passing parallel programs, and report on the performance of such a system, the Large Application Parallel Simulation Environment (LAPSE), which we have built on the Intel Paragon. On all codes measured to date, LAPSE predicts performance well, typically within 10 percent relative error. Depending on the nature of the application code, we have observed low slowdowns (relative to natively executing code) and high relative speedups using up to 64 processors.

  4. Methods, media, and systems for detecting attack on a digital processing device

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stolfo, Salvatore J.; Li, Wei-Jen; Keromytis, Angelos D.

    Methods, media, and systems for detecting attack are provided. In some embodiments, the methods include: comparing at least part of a document to a static detection model; determining whether attacking code is included in the document based on the comparison of the document to the static detection model; executing at least part of the document; determining whether attacking code is included in the document based on the execution of the at least part of the document; and if attacking code is determined to be included in the document based on at least one of the comparison of the document to the static detection model and the execution of the at least part of the document, reporting the presence of an attack. In some embodiments, the methods include: selecting a data segment in at least one portion of an electronic document; determining whether the arbitrarily selected data segment can be altered without causing the electronic document to result in an error when processed by a corresponding program; in response to determining that the arbitrarily selected data segment can be altered, arbitrarily altering the data segment in the at least one portion of the electronic document to produce an altered electronic document; and determining whether the corresponding program produces an error state when the altered electronic document is processed by the corresponding program.

  5. Methods, media, and systems for detecting attack on a digital processing device

    DOEpatents

    Stolfo, Salvatore J.; Li, Wei-Jen; Keromylis, Angelos D.; Androulaki, Elli

    2014-07-22

    Methods, media, and systems for detecting attack are provided. In some embodiments, the methods include: comparing at least part of a document to a static detection model; determining whether attacking code is included in the document based on the comparison of the document to the static detection model; executing at least part of the document; determining whether attacking code is included in the document based on the execution of the at least part of the document; and if attacking code is determined to be included in the document based on at least one of the comparison of the document to the static detection model and the execution of the at least part of the document, reporting the presence of an attack. In some embodiments, the methods include: selecting a data segment in at least one portion of an electronic document; determining whether the arbitrarily selected data segment can be altered without causing the electronic document to result in an error when processed by a corresponding program; in response to determining that the arbitrarily selected data segment can be altered, arbitrarily altering the data segment in the at least one portion of the electronic document to produce an altered electronic document; and determining whether the corresponding program produces an error state when the altered electronic document is processed by the corresponding program.
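The second embodiment's test (alter a selected data segment, then check whether the corresponding program reaches an error state) can be sketched with a toy document format. Everything here is hypothetical: the parser, the document format, and the choice of byte inversion as the "arbitrary alteration" are illustrations, not the patented method:

```python
def toy_parser(doc: bytes) -> int:
    """Stand-in for the 'corresponding program'. Hypothetical format:
    a digit header, then '#' and a free-form tail the parser ignores."""
    head, _, _tail = doc.partition(b"#")
    if not head.isdigit():
        raise ValueError("malformed document")
    return sum(head)

def segment_is_alterable(doc, start, end, parser):
    """Alter the selected data segment (here: invert every byte) and
    report whether the program still processes the document without an
    error state. Alterable regions carry no parsed meaning and are thus
    candidate hiding places for attack code."""
    altered = bytearray(doc)
    for i in range(start, end):
        altered[i] ^= 0xFF  # one arbitrary alteration
    try:
        parser(bytes(altered))
        return True
    except Exception:
        return False
```

On `b"123#payload"`, altering the ignored tail leaves the parse intact (alterable, suspicious), while altering the digit header produces an error state (load-bearing data).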

  6. Rate adaptive multilevel coded modulation with high coding gain in intensity modulation direct detection optical communication

    NASA Astrophysics Data System (ADS)

    Xiao, Fei; Liu, Bo; Zhang, Lijia; Xin, Xiangjun; Zhang, Qi; Tian, Qinghua; Tian, Feng; Wang, Yongjun; Rao, Lan; Ullah, Rahat; Zhao, Feng; Li, Deng'ao

    2018-02-01

    A rate-adaptive multilevel coded modulation (RA-MLC) scheme based on fixed code length and a corresponding decoding scheme is proposed. RA-MLC scheme combines the multilevel coded and modulation technology with the binary linear block code at the transmitter. Bits division, coding, optional interleaving, and modulation are carried out by the preset rule, then transmitted through standard single mode fiber span equal to 100 km. The receiver improves the accuracy of decoding by means of soft information passing through different layers, which enhances the performance. Simulations are carried out in an intensity modulation-direct detection optical communication system using MATLAB®. Results show that the RA-MLC scheme can achieve bit error rate of 1E-5 when optical signal-to-noise ratio is 20.7 dB. It also reduced the number of decoders by 72% and realized 22 rate adaptation without significantly increasing the computing time. The coding gain is increased by 7.3 dB at BER=1E-3.

  7. Interference Cancellation Technique Based on Discovery of Spreading Codes of Interference Signals and Maximum Correlation Detection for DS-CDMA System

    NASA Astrophysics Data System (ADS)

    Hettiarachchi, Ranga; Yokoyama, Mitsuo; Uehara, Hideyuki

    This paper presents a novel interference cancellation (IC) scheme for both synchronous and asynchronous direct-sequence code-division multiple-access (DS-CDMA) wireless channels. In the DS-CDMA system, multiple access interference (MAI) and the near-far problem (NFP) are the two factors which reduce the capacity of the system. In this paper, we propose a new algorithm that is able to detect all interference signals as individual MAI signals by maximum correlation detection. It is based on the discovery of all the unknown spreading codes of the interference signals. Then, all possible MAI patterns, so-called replicas, are generated as summations of interference signals, and the true MAI pattern is found by taking the correlation between the received signal and the replicas. Moreover, the receiver executes MAI cancellation in a successive manner, removing all interference signals in a single stage. Numerical results show that the proposed IC strategy, which alleviates the detrimental effects of MAI and the near-far problem, can significantly improve the system performance. In particular, for the asynchronous system we can obtain almost the same receiving characteristics as in the absence of interference when received powers are equal; for the synchronous system, the same performance can be seen under any received power state.
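The correlation step can be illustrated with orthogonal spreading codes: correlate the received chip sequence against each candidate code and declare a code active when the correlation magnitude is large. This matched-filter sketch is a simplified stand-in for the paper's replica search, with codes and threshold chosen for illustration:

```python
def hadamard(n):
    """n x n Sylvester Hadamard matrix (n a power of two); its rows
    serve as mutually orthogonal spreading codes of +/-1 chips."""
    H = [[1]]
    while len(H) < n:
        H = ([row + row for row in H] +
             [row + [-x for x in row] for row in H])
    return H

def detect_active_codes(received, codes, threshold):
    """Declare a spreading code active (an interferer) when the magnitude
    of its correlation with the received signal exceeds the threshold."""
    active = []
    for k, code in enumerate(codes):
        corr = sum(r * c for r, c in zip(received, code))
        if abs(corr) >= threshold:
            active.append(k)
    return active
```

With length-8 codes, an active user contributes a correlation of magnitude 8 at its own matched filter and 0 at the others, so the active set is recovered exactly in this noiseless, synchronous toy case.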

  8. [SUPPORT, CO-OPERATIVE EDUCATION PROGRAMMES, PRAGMATIC CODE OF ETHICS: A CLINICAL APPROACH OF EXECUTIVE TRAINING].

    PubMed

    Cabaret, Véronique

    2016-01-01

    This article aims at introducing an educational sequence completed at l'Institut de Formation des Cadres de Santé (IFCS) at the CHRU in Lille, France, entitled "training project and educational project", part of the "training duties" module, whose goal is to generate students' knowledge through co-operative education programmes. In creating this innovative sequence, the educational aim is to use the Institut itself as a ground of learning, associated with the various internship grounds, in order to get the most out of co-operative education programmes. Besides, following a pragmatic code of ethics in training, the teaching staff draw their inspiration from a clinical approach to executive training: they regard students as true protagonists in a co-operative plan created for them, wishing to design it with them using their words. Thus, students are brought to critique the IFCS educational project and debate it with the trainers who have built it. Each partner tries to understand the other, being aware of their differences. By contributing every year to rewriting the educational project which directly concerns them, students build their professional positions as health executives. They play an active role in co-operative education programmes, just like IFCS outside partners.

  9. Accuracy comparison among different machine learning techniques for detecting malicious codes

    NASA Astrophysics Data System (ADS)

    Narang, Komal

    2016-03-01

    In this paper, a machine learning-based model for malware detection is proposed. It can detect newly released malware, i.e., zero-day attacks, by analyzing operation codes on the Android operating system. The accuracy of Naïve Bayes, Support Vector Machine (SVM) and Neural Network classifiers for detecting malicious code has been compared for the proposed model. In the experiment, 400 benign files, 100 system files and 500 malicious files were used to construct the model. The model yields the best accuracy of 88.9% when a neural network is used as the classifier, and achieved 95% sensitivity and 82.8% specificity.
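The three figures reported (accuracy, sensitivity, specificity) all derive from the same confusion-matrix counts, which is worth making explicit since the terms are often confused. A minimal helper (the example counts below are illustrative, not the paper's data):

```python
def classification_metrics(tp, tn, fp, fn):
    """Accuracy, sensitivity (true-positive rate on malicious files) and
    specificity (true-negative rate on benign files) from the four
    confusion-matrix counts of a binary malware classifier."""
    total = tp + tn + fp + fn
    return {
        "accuracy": (tp + tn) / total,
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
    }
```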

  10. Data processing with microcode designed with source coding

    DOEpatents

    McCoy, James A; Morrison, Steven E

    2013-05-07

    Programming for a data processor to execute a data processing application is provided using microcode source code. The microcode source code is assembled to produce microcode that includes digital microcode instructions with which to signal the data processor to execute the data processing application.

  11. Translating expert system rules into Ada code with validation and verification

    NASA Technical Reports Server (NTRS)

    Becker, Lee; Duckworth, R. James; Green, Peter; Michalson, Bill; Gosselin, Dave; Nainani, Krishan; Pease, Adam

    1991-01-01

    The purpose of this ongoing research and development program is to develop software tools which enable the rapid development, upgrading, and maintenance of embedded real-time artificial intelligence systems. The goals of this phase of the research were to investigate the feasibility of developing software tools which automatically translate expert system rules into Ada code, and to develop methods for performing validation and verification testing of the resultant expert system. A prototype system was demonstrated which automatically translated rules from an Air Force expert system into Ada code; prototype test software detected errors in the execution of the resultant system. The method and prototype tools for converting AI representations into Ada code, by converting the rules into Ada code modules and then linking them with an Activation Framework based run-time environment to form an executable load module, are discussed. This method is based upon the use of Evidence Flow Graphs, which are a data flow representation for intelligent systems. The development of prototype test generation and evaluation software which was used to test the resultant code is discussed. This testing was performed automatically using Monte-Carlo techniques based upon a constraint-based description of the required performance for the system.

  12. Design of Cyber Attack Precursor Symptom Detection Algorithm through System Base Behavior Analysis and Memory Monitoring

    NASA Astrophysics Data System (ADS)

    Jung, Sungmo; Kim, Jong Hyun; Cagalaban, Giovanni; Lim, Ji-Hoon; Kim, Seoksoo

    More recently, botnet-based cyber attacks, including spam mail and DDoS attacks, have sharply increased, which poses a fatal threat to Internet services. At present, antivirus businesses make it a top priority to detect malicious code in the shortest time possible (Lv.2), based on the graph showing the relation between the spread of malicious code and time; that is, detection happens only after malicious code occurs. Despite early detection, however, it is not possible to prevent malicious code from occurring. Thus, we have developed an algorithm that can detect precursor symptoms at Lv.1 to prevent a cyber attack that uses the evasion method of 'an executing-environment-aware attack', by analyzing system base behaviors and monitoring memory.

  13. Roles for Coincidence Detection in Coding Amplitude-Modulated Sounds

    PubMed Central

    Ashida, Go; Kretzberg, Jutta; Tollin, Daniel J.

    2016-01-01

    Many sensory neurons encode temporal information by detecting coincident arrivals of synaptic inputs. In the mammalian auditory brainstem, binaural neurons of the medial superior olive (MSO) are known to act as coincidence detectors, whereas in the lateral superior olive (LSO) roles of coincidence detection have remained unclear. LSO neurons receive excitatory and inhibitory inputs driven by ipsilateral and contralateral acoustic stimuli, respectively, and vary their output spike rates according to interaural level differences. In addition, LSO neurons are also sensitive to binaural phase differences of low-frequency tones and envelopes of amplitude-modulated (AM) sounds. Previous physiological recordings in vivo found considerable variations in monaural AM-tuning across neurons. To investigate the underlying mechanisms of the observed temporal tuning properties of LSO and their sources of variability, we used a simple coincidence counting model and examined how specific parameters of coincidence detection affect monaural and binaural AM coding. Spike rates and phase-locking of evoked excitatory and spontaneous inhibitory inputs had only minor effects on LSO output to monaural AM inputs. In contrast, the coincidence threshold of the model neuron affected both the overall spike rates and the half-peak positions of the AM-tuning curve, whereas the width of the coincidence window merely influenced the output spike rates. The duration of the refractory period affected only the low-frequency portion of the monaural AM-tuning curve. Unlike monaural AM coding, temporal factors, such as the coincidence window and the effective duration of inhibition, played a major role in determining the trough positions of simulated binaural phase-response curves. In addition, empirically-observed level-dependence of binaural phase-coding was reproduced in the framework of our minimalistic coincidence counting model. These modeling results suggest that coincidence detection of excitatory

  14. An Execution Service for Grid Computing

    NASA Technical Reports Server (NTRS)

    Smith, Warren; Hu, Chaumin

    2004-01-01

    This paper describes the design and implementation of the IPG Execution Service that reliably executes complex jobs on a computational grid. Our Execution Service is part of the IPG service architecture whose goal is to support location-independent computing. In such an environment, once a user ports an application to one or more hardware/software platforms, the user can describe this environment to the grid; the grid can locate instances of this platform, configure the platform as required for the application, and then execute the application. Our Execution Service runs jobs that set up such environments for applications and executes them. These jobs consist of a set of tasks for executing applications and managing data. The tasks have user-defined starting conditions that allow users to specify complex dependencies, including tasks to execute when other tasks fail, a frequent occurrence in a large distributed system, or are cancelled. The execution task provided by our service also configures the application environment exactly as specified by the user and captures the exit code of the application, features that many grid execution services do not support due to difficulties interfacing to local scheduling systems.
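
    The task model described above, with user-defined starting conditions and a cleanup task that fires on failure or cancellation, can be sketched as follows. The job schema, task names, and state names are invented for illustration; they are not the IPG Execution Service's actual format.

```python
# Hypothetical sketch of a grid job as a set of tasks with user-defined
# starting conditions, including a cleanup task that runs when the main
# task fails or is cancelled. Schema and names are illustrative only.
job = {
    "tasks": [
        {"name": "stage_in",  "run_when": {"job": "started"}},
        {"name": "run_app",   "run_when": {"stage_in": "succeeded"}},
        {"name": "stage_out", "run_when": {"run_app": "succeeded"}},
        {"name": "cleanup",   "run_when": {"run_app": ["failed", "cancelled"]}},
    ]
}

def ready_tasks(job, states):
    """Return tasks whose starting conditions are satisfied by current states."""
    ready = []
    for task in job["tasks"]:
        (dep, wanted), = task["run_when"].items()   # single dependency per task
        allowed = wanted if isinstance(wanted, list) else [wanted]
        if states.get(dep) in allowed:
            ready.append(task["name"])
    return ready

print(ready_tasks(job, {"job": "started"}))        # ['stage_in']
print(ready_tasks(job, {"run_app": "failed"}))     # ['cleanup']
```

    A real service would additionally track task completion and re-evaluate readiness as states change, but the lookup above captures the dependency idea.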

  15. A Comparison of Source Code Plagiarism Detection Engines

    NASA Astrophysics Data System (ADS)

    Lancaster, Thomas; Culwin, Fintan

    2004-06-01

    Automated techniques for finding plagiarism in student source code submissions have been in use for over 20 years and there are many available engines and services. This paper reviews the literature on the major modern detection engines, providing a comparison of them based upon the metrics and techniques they deploy. Generally the most common and effective techniques are seen to involve tokenising student submissions then searching pairs of submissions for long common substrings, an example of what is defined to be a paired structural metric. Computing academics are recommended to use one of the two Web-based detection engines, MOSS and JPlag. It is shown that whilst detection is well established there are still places where further research would be useful, particularly where visual support of the investigation process is possible.
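
    The paired structural metric described above, tokenising submissions and then searching pairs for long common substrings, can be sketched in a few lines. The token classes and keyword set below are invented for the example and are not those used by MOSS or JPlag.

```python
# Sketch of a paired structural metric for source-code plagiarism
# detection: tokenise two submissions (collapsing identifiers and
# literals into generic classes, so renaming does not help), then find
# their longest common token substring with dynamic programming.
import re

KEYWORDS = {"def", "if", "else", "while", "for", "in", "return"}

def tokenise(source):
    """Map source text to a stream of generic token classes."""
    tokens = []
    for word in re.findall(r"[A-Za-z_]\w*|\d+|\S", source):
        if word in KEYWORDS:
            tokens.append(word)      # keywords kept verbatim
        elif word.isdigit():
            tokens.append("NUM")     # numeric literals collapse to one class
        elif re.match(r"[A-Za-z_]", word):
            tokens.append("ID")      # renamed variables look identical
        else:
            tokens.append(word)      # operators and punctuation
    return tokens

def longest_common_run(a, b):
    """Length of the longest common substring of two token streams."""
    best = 0
    prev = [0] * (len(b) + 1)
    for i in range(1, len(a) + 1):
        cur = [0] * (len(b) + 1)
        for j in range(1, len(b) + 1):
            if a[i - 1] == b[j - 1]:
                cur[j] = prev[j - 1] + 1
                best = max(best, cur[j])
        prev = cur
    return best

original = "def total(xs):\n    s = 0\n    for x in xs:\n        s = s + x\n    return s"
renamed  = "def add_up(ns):\n    acc = 0\n    for n in ns:\n        acc = acc + n\n    return acc"
print(longest_common_run(tokenise(original), tokenise(renamed)))  # → 21
```

    The renamed copy matches the original over its entire 21-token stream, which is exactly the signal a paired structural metric exploits.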

  16. Development of a Coded Aperture X-Ray Backscatter Imager for Explosive Device Detection

    NASA Astrophysics Data System (ADS)

    Faust, Anthony A.; Rothschild, Richard E.; Leblanc, Philippe; McFee, John Elton

    2009-02-01

    Defence R&D Canada has an active research and development program on detection of explosive devices using nuclear methods. One system under development is a coded aperture-based X-ray backscatter imaging detector designed to provide sufficient speed, contrast and spatial resolution to detect antipersonnel landmines and improvised explosive devices. The successful development of a hand-held imaging detector requires, among other things, a light-weight, ruggedized detector with low power requirements, supplying high spatial resolution. The University of California, San Diego-designed HEXIS detector provides a modern, large area, high-temperature CZT imaging surface, robustly packaged in a light-weight housing with sound mechanical properties. Based on the potential for the HEXIS detector to be incorporated as the detection element of a hand-held imaging detector, the authors initiated a collaborative effort to demonstrate the capability of a coded aperture-based X-ray backscatter imaging detector. This paper will discuss the landmine and IED detection problem and review the coded aperture technique. Results from initial proof-of-principle experiments will then be reported.

  17. Transferring ecosystem simulation codes to supercomputers

    NASA Technical Reports Server (NTRS)

    Skiles, J. W.; Schulbach, C. H.

    1995-01-01

    Many ecosystem simulation computer codes have been developed in the last twenty-five years. This development took place initially on main-frame computers, then mini-computers, and more recently, on micro-computers and workstations. Supercomputing platforms (both parallel and distributed systems) have been largely unused, however, because of the perceived difficulty in accessing and using the machines. Also, significant differences in the system architectures of sequential, scalar computers and parallel and/or vector supercomputers must be considered. We have transferred a grassland simulation model (developed on a VAX) to a Cray Y-MP/C90. We describe porting the model to the Cray and the changes we made to exploit the parallelism in the application and improve code execution. The Cray executed the model 30 times faster than the VAX and 10 times faster than a Unix workstation. We achieved an additional speedup of 30 percent by using the compiler's vectorizing and in-lining capabilities. The code runs at only about 5 percent of the Cray's peak speed because it ineffectively uses the vector and parallel processing capabilities of the Cray. We expect that by restructuring the code, it could execute an additional six to ten times faster.

  18. Multiplexed Detection of Cytokines Based on Dual Bar-Code Strategy and Single-Molecule Counting.

    PubMed

    Li, Wei; Jiang, Wei; Dai, Shuang; Wang, Lei

    2016-02-02

    Cytokines play important roles in the immune system and have been regarded as biomarkers. Because a single cytokine is not specific and accurate enough for strict diagnosis in practice, in this work we constructed a multiplexed detection method for cytokines based on a dual bar-code strategy and single-molecule counting. Taking interferon-γ (IFN-γ) and tumor necrosis factor-α (TNF-α) as model analytes, first, the magnetic nanobead was functionalized with the second antibody and primary bar-code strands, forming a magnetic nanoprobe. Then, through the specific reaction of the second antibody with the antigen fixed by the primary antibody, a sandwich-type immunocomplex was formed on the substrate. Next, the primary bar-code strands as amplification units triggered a multibranched hybridization chain reaction (mHCR), producing nicked double-stranded polymers with multiple branched arms, which served as secondary bar-code strands. Finally, the secondary bar-code strands hybridized with the multimolecule-labeled fluorescence probes, generating enhanced fluorescence signals. The fluorescence dots were counted one by one for quantification with an epi-fluorescence microscope. By integrating the primary and secondary bar-code-based amplification strategy with the multimolecule-labeled fluorescence probes, this method displayed excellent sensitivity, with detection limits of 5 fM for both targets. Unlike the typical bar-code assay, in which the bar-code strands must be released and identified on a microarray, this method is more direct. Moreover, because of the selective immune reaction and the dual bar-code mechanism, the resulting method could detect the two targets simultaneously. Multiplexed analysis in human serum was also performed, suggesting that our strategy is reliable and has great potential for application in early clinical diagnosis.

  19. Process Model Improvement for Source Code Plagiarism Detection in Student Programming Assignments

    ERIC Educational Resources Information Center

    Kermek, Dragutin; Novak, Matija

    2016-01-01

    In programming courses there are various ways in which students attempt to cheat. The most commonly used method is copying source code from other students and making minimal changes in it, like renaming variable names. Several tools like Sherlock, JPlag and Moss have been devised to detect source code plagiarism. However, for larger student…

  20. PCR-free quantitative detection of genetically modified organism from raw materials. An electrochemiluminescence-based bio bar code method.

    PubMed

    Zhu, Debin; Tang, Yabing; Xing, Da; Chen, Wei R

    2008-05-15

    A bio bar code assay based on oligonucleotide-modified gold nanoparticles (Au-NPs) provides a PCR-free method for quantitative detection of nucleic acid targets. However, the current bio bar code assay requires lengthy experimental procedures including the preparation and release of bar code DNA probes from the target-nanoparticle complex and immobilization and hybridization of the probes for quantification. Herein, we report a novel PCR-free electrochemiluminescence (ECL)-based bio bar code assay for the quantitative detection of genetically modified organism (GMO) from raw materials. It consists of tris-(2,2'-bipyridyl) ruthenium (TBR)-labeled bar code DNA, nucleic acid hybridization using Au-NPs and biotin-labeled probes, and selective capture of the hybridization complex by streptavidin-coated paramagnetic beads. The detection of target DNA is realized by direct measurement of ECL emission of TBR. It can quantitatively detect target nucleic acids with high speed and sensitivity. This method can be used to quantitatively detect GMO fragments from real GMO products.

  1. Symbolic Execution Enhanced System Testing

    NASA Technical Reports Server (NTRS)

    Davies, Misty D.; Pasareanu, Corina S.; Raman, Vishwanath

    2012-01-01

    We describe a testing technique that uses information computed by symbolic execution of a program unit to guide the generation of inputs to the system containing the unit, in such a way that the unit's, and hence the system's, coverage is increased. The symbolic execution computes unit constraints at run-time, along program paths obtained by system simulations. We use machine learning techniques, treatment learning and function fitting, to approximate the system input constraints that will lead to the satisfaction of the unit constraints. Execution of system input predictions either uncovers new code regions in the unit under analysis or provides information that can be used to improve the approximation. We have implemented the technique and demonstrated its effectiveness on several examples, including one from the aerospace domain.

  2. Intelligent Rover Execution for Detecting Life in the Atacama Desert

    NASA Technical Reports Server (NTRS)

    Baskaran, Vijayakumar; Muscettola, Nicola; Rijsman, David; Plaunt, Chris; Fry, Chuck

    2006-01-01

    On-board supervisory execution is crucial for the deployment of more capable and autonomous remote explorers. Planetary science is considering robotic explorers operating for long periods of time without ground supervision while interacting with a changing and often hostile environment. Effective and robust operations require on-board supervisory control with a high level of awareness of the principles of functioning of the environment and of the numerous internal subsystems that need to be coordinated. We describe an on-board rover executive that was deployed on a rover as part of the "Limits of Life in the Atacama Desert (LITA)" field campaign sponsored by the NASA ASTEP program. The executive was built using the Intelligent Distributed Execution Architecture (IDEA), an execution framework that uses model-based and plan-based supervisory control as its fundamental computational paradigm. We present the results of the third field experiment conducted in the Atacama desert (Chile) in August - October 2005.

  3. The Environment for Application Software Integration and Execution (EASIE) version 1.0. Volume 1: Executive overview

    NASA Technical Reports Server (NTRS)

    Rowell, Lawrence F.; Davis, John S.

    1989-01-01

    The Environment for Application Software Integration and Execution (EASIE) provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers that face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational database management system. Volume 1, Executive Overview, gives an overview of the functions provided by EASIE and describes their use. Three operational design systems based upon the EASIE software are briefly described.

  4. Demonstration of a quantum error detection code using a square lattice of four superconducting qubits.

    PubMed

    Córcoles, A D; Magesan, Easwar; Srinivasan, Srikanth J; Cross, Andrew W; Steffen, M; Gambetta, Jay M; Chow, Jerry M

    2015-04-29

    The ability to detect and deal with errors when manipulating quantum systems is a fundamental requirement for fault-tolerant quantum computing. Unlike classical bits that are subject to only digital bit-flip errors, quantum bits are susceptible to a much larger spectrum of errors, for which any complete quantum error-correcting code must account. Whilst classical bit-flip detection can be realized via a linear array of qubits, a general fault-tolerant quantum error-correcting code requires extending into a higher-dimensional lattice. Here we present a quantum error detection protocol on a two-by-two planar lattice of superconducting qubits. The protocol detects an arbitrary quantum error on an encoded two-qubit entangled state via quantum non-demolition parity measurements on another pair of error syndrome qubits. This result represents a building block towards larger lattices amenable to fault-tolerant quantum error correction architectures such as the surface code.

  5. A modular telerobotic task execution system

    NASA Technical Reports Server (NTRS)

    Backes, Paul G.; Tso, Kam S.; Hayati, Samad; Lee, Thomas S.

    1990-01-01

    A telerobot task execution system is proposed to provide a general parametrizable task execution capability. The system includes communication with the calling system, e.g., a task planning system, and single- and dual-arm sensor-based task execution with monitoring and reflexing. A specific task is described by specifying the parameters to various available task execution modules including trajectory generation, compliance control, teleoperation, monitoring, and sensor fusion. Reflex action is achieved by finding the corresponding reflex action in a reflex table when an execution event has been detected with a monitor.
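
    The reflex mechanism described above, looking up the corresponding reflex action in a reflex table when a monitor detects an execution event, can be pictured as a simple table lookup. The event and action names below are invented for illustration; they are not from the described system.

```python
# Minimal sketch of a reflex table: a monitor reports an execution
# event and the executive looks up the reflex action to take.
# Event and action names are hypothetical.
reflex_table = {
    "force_limit_exceeded": "retract_arm",
    "joint_limit_reached": "stop_motion",
    "tracking_error_high": "switch_to_teleoperation",
}

def on_monitor_event(event):
    """Return the reflex action for a detected event, or keep executing."""
    return reflex_table.get(event, "continue_task")

print(on_monitor_event("force_limit_exceeded"))  # → retract_arm
print(on_monitor_event("all_nominal"))           # → continue_task
```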

  6. Microfluidic CODES: a scalable multiplexed electronic sensor for orthogonal detection of particles in microfluidic channels.

    PubMed

    Liu, Ruxiu; Wang, Ningquan; Kamili, Farhan; Sarioglu, A Fatih

    2016-04-21

    Numerous biophysical and biochemical assays rely on spatial manipulation of particles/cells as they are processed on lab-on-a-chip devices. Analysis of spatially distributed particles on these devices typically requires microscopy, negating the cost and size advantages of microfluidic assays. In this paper, we introduce a scalable electronic sensor technology, called microfluidic CODES, that utilizes resistive pulse sensing to orthogonally detect particles in multiple microfluidic channels from a single electrical output. Combining techniques from telecommunications and microfluidics, we route three coplanar electrodes on a glass substrate to create multiple Coulter counters producing distinct orthogonal digital codes when they detect particles. We specifically design a digital code set using the mathematical principles of Code Division Multiple Access (CDMA) telecommunication networks and can decode signals from different microfluidic channels with >90% accuracy through computation even if these signals overlap. As a proof of principle, we use this technology to detect human ovarian cancer cells in four different microfluidic channels fabricated using soft lithography. Microfluidic CODES offers a simple, all-electronic interface that is well suited to create integrated, low-cost lab-on-a-chip devices for cell- or particle-based assays in resource-limited settings.

  7. Practices in Code Discoverability: Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, A.; Teuben, P.; Nemiroff, R. J.; Shamir, L.

    2012-09-01

    Here we describe the Astrophysics Source Code Library (ASCL), which takes an active approach to sharing astrophysics source code. ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and adds entries for the found codes to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL now contains over 340 codes and continues to grow. In 2011, the ASCL added an average of 19 codes per month. An advisory committee has been established to provide input and guide the development and expansion of the new site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This paper provides the history and description of the ASCL. It lists the requirements for including codes, examines the advantages of the ASCL, and outlines some of its future plans.

  8. A Hybrid Procedural/Deductive Executive for Autonomous Spacecraft

    NASA Technical Reports Server (NTRS)

    Pell, Barney; Gamble, Edward B.; Gat, Erann; Kessing, Ron; Kurien, James; Millar, William; Nayak, P. Pandurang; Plaunt, Christian; Williams, Brian C.; Lau, Sonie (Technical Monitor)

    1998-01-01

    The New Millennium Remote Agent (NMRA) will be the first AI system to control an actual spacecraft. The spacecraft domain places a strong premium on autonomy and requires dynamic recoveries and robust concurrent execution, all in the presence of tight real-time deadlines, changing goals, scarce resource constraints, and a wide variety of possible failures. To achieve this level of execution robustness, we have integrated a procedural executive based on generic procedures with a deductive model-based executive. A procedural executive provides sophisticated control constructs such as loops, parallel activity, locks, and synchronization which are used for robust schedule execution, hierarchical task decomposition, and routine configuration management. A deductive executive provides algorithms for sophisticated state inference and optimal failure recovery planning. The integrated executive enables designers to code knowledge via a combination of procedures and declarative models, yielding a rich modeling capability suitable to the challenges of real spacecraft control. The interface between the two executives ensures both that recovery sequences are smoothly merged into high-level schedule execution and that a high degree of reactivity is retained to effectively handle additional failures during recovery.

  9. Demonstration of a quantum error detection code using a square lattice of four superconducting qubits

    PubMed Central

    Córcoles, A.D.; Magesan, Easwar; Srinivasan, Srikanth J.; Cross, Andrew W.; Steffen, M.; Gambetta, Jay M.; Chow, Jerry M.

    2015-01-01

    The ability to detect and deal with errors when manipulating quantum systems is a fundamental requirement for fault-tolerant quantum computing. Unlike classical bits that are subject to only digital bit-flip errors, quantum bits are susceptible to a much larger spectrum of errors, for which any complete quantum error-correcting code must account. Whilst classical bit-flip detection can be realized via a linear array of qubits, a general fault-tolerant quantum error-correcting code requires extending into a higher-dimensional lattice. Here we present a quantum error detection protocol on a two-by-two planar lattice of superconducting qubits. The protocol detects an arbitrary quantum error on an encoded two-qubit entangled state via quantum non-demolition parity measurements on another pair of error syndrome qubits. This result represents a building block towards larger lattices amenable to fault-tolerant quantum error correction architectures such as the surface code. PMID:25923200

  10. Bypassing Races in Live Applications with Execution Filters

    DTIC Science & Technology

    2010-01-01

    LOOM creates the needed locks and semaphores on demand. The first time a lock or semaphore is referenced by one of the inserted synchronization ...runtime. LOOM provides a flexible and safe language for developers to write execution filters that explicitly synchronize code. It then uses an...first compile their application with LOOM. At runtime, to work around a race, an application developer writes an execution filter that synchronizes the

  11. PMD compensation in multilevel coded-modulation schemes with coherent detection using BLAST algorithm and iterative polarization cancellation.

    PubMed

    Djordjevic, Ivan B; Xu, Lei; Wang, Ting

    2008-09-15

    We present two PMD compensation schemes suitable for use in multilevel (M ≥ 2) block-coded modulation schemes with coherent detection. The first scheme is based on a BLAST-type polarization-interference cancellation scheme, and the second is based on iterative polarization cancellation. Both schemes use LDPC codes as channel codes. The proposed PMD compensation schemes are evaluated by employing coded-OFDM and coherent detection. When used in combination with girth-10 LDPC codes, those schemes outperform polarization-time-coding-based OFDM by 1 dB at a BER of 10^-9, and provide two times higher spectral efficiency. The proposed schemes perform comparably and are able to compensate even 1200 ps of differential group delay with negligible penalty.

  12. Plagiarism Detection Algorithm for Source Code in Computer Science Education

    ERIC Educational Resources Information Center

    Liu, Xin; Xu, Chan; Ouyang, Boyu

    2015-01-01

    Nowadays, computer programming is getting more necessary in the course of program design in college education. However, the trick of plagiarizing plus a little modification exists among some students' home works. It's not easy for teachers to judge if there's plagiarizing in source code or not. Traditional detection algorithms cannot fit this…

  13. Young Children Detect and Avoid Logically Inconsistent Sources: The Importance of Communicative Context and Executive Function

    PubMed Central

    Doebel, Sabine; Rowell, Shaina F.; Koenig, Melissa A.

    2016-01-01

    The reported research tested the hypothesis that young children detect logical inconsistency in communicative contexts that support the evaluation of speakers’ epistemic reliability. In two experiments (N = 194), 3- to 5-year-olds were presented with two speakers who expressed logically consistent or inconsistent claims. Three-year-olds failed to detect inconsistencies (Experiment 1), 4-year-olds detected inconsistencies when expressed by human speakers but not when read from books, and 5-year-olds detected inconsistencies in both contexts (Experiment 2). In both experiments, children demonstrated skepticism toward testimony from previously inconsistent sources. Executive function and working memory each predicted inconsistency detection. These findings indicate logical inconsistency understanding emerges in early childhood, is supported by social and domain general cognitive skills, and plays a role in adaptive learning from testimony. PMID:27317511

  14. Feasibility study for combination of field-flow fractionation (FFF)-based separation of size-coded particle probes with amplified surface enhanced Raman scattering (SERS) tagging for simultaneous detection of multiple miRNAs.

    PubMed

    Shin, Kayeong; Choi, Jaeyeong; Kim, Yeoju; Lee, Yoonjeong; Kim, Joohoon; Lee, Seungho; Chung, Hoeil

    2018-06-29

    We propose a new analytical scheme in which field-flow fractionation (FFF)-based separation of target-specific polystyrene (PS) particle probes of different sizes is combined with amplified surface-enhanced Raman scattering (SERS) tagging for the simultaneous and sensitive detection of multiple microRNAs (miRNAs). For multiplexed detection, PS particles of three different diameters (15, 10, and 5 μm) were used for size-coding, and a probe single-stranded DNA (ssDNA) complementary to a target miRNA was conjugated to the intended PS particle. After binding of a target miRNA to a PS probe, a polyadenylation reaction was executed to generate a long adenine (A) tail serving as a binding site for thymine (T)-conjugated Au nanoparticles (T-AuNPs) to increase SERS intensity. The three size-coded PS probes bound with T-AuNPs were then separated in a FFF channel. The extinction-based fractograms clearly confirmed separation of the three size-coded PS probes, thereby enabling simultaneous measurement of three miRNAs. Raman intensities of FFF fractions collected at the peak maxima of the 15, 10 and 5 μm PS probes varied fairly quantitatively with the change of miRNA concentrations, and the reproducibility of measurement was acceptable. The proposed method is potentially useful for simultaneous detection of multiple miRNAs with high sensitivity. Copyright © 2018 Elsevier B.V. All rights reserved.

  15. Reliability techniques for computer executive programs

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Computer techniques for increasing the stability and reliability of executive and supervisory systems were studied. Program segmentation characteristics are discussed along with a validation system which is designed to retain the natural top down outlook in coding. An analysis of redundancy techniques and roll back procedures is included.

  16. Detecting the borders between coding and non-coding DNA regions in prokaryotes based on recursive segmentation and nucleotide doublets statistics

    PubMed Central

    2012-01-01

    Background Detecting the borders between coding and non-coding regions is an essential step in the genome annotation. And information entropy measures are useful for describing the signals in genome sequence. However, the accuracies of previous methods of finding borders based on entropy segmentation method still need to be improved. Methods In this study, we first applied a new recursive entropic segmentation method on DNA sequences to get preliminary significant cuts. A 22-symbol alphabet is used to capture the differential composition of nucleotide doublets and stop codon patterns along three phases in both DNA strands. This process requires no prior training datasets. Results Comparing with the previous segmentation methods, the experimental results on three bacteria genomes, Rickettsia prowazekii, Borrelia burgdorferi and E.coli, show that our approach improves the accuracy for finding the borders between coding and non-coding regions in DNA sequences. Conclusions This paper presents a new segmentation method in prokaryotes based on Jensen-Rényi divergence with a 22-symbol alphabet. For three bacteria genomes, comparing to A12_JR method, our method raised the accuracy of finding the borders between protein coding and non-coding regions in DNA sequences. PMID:23282225
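
    The segmentation idea above, cutting a sequence where the compositional difference between the two halves is largest, can be sketched with an entropy-based divergence. The paper uses a Jensen-Rényi divergence over a 22-symbol doublet alphabet; the toy below uses the related Jensen-Shannon divergence over single nucleotides, purely for illustration.

```python
# Illustrative entropy-based segmentation: slide a cut point along a
# sequence and pick the position maximising the Jensen-Shannon
# divergence between the symbol distributions of the two halves.
# (The paper's method uses Jensen-Rényi divergence and a 22-symbol
# doublet alphabet; this toy uses single nucleotides.)
from collections import Counter
from math import log2

def entropy(seq):
    """Shannon entropy (bits) of the symbol distribution of seq."""
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in Counter(seq).values())

def js_divergence(left, right):
    """Jensen-Shannon divergence between the two halves' distributions."""
    n = len(left) + len(right)
    w1, w2 = len(left) / n, len(right) / n
    return entropy(left + right) - w1 * entropy(left) - w2 * entropy(right)

def best_cut(seq):
    """Cut position with maximal divergence between the two segments."""
    cuts = {i: js_divergence(seq[:i], seq[i:]) for i in range(1, len(seq))}
    return max(cuts, key=cuts.get)

# A GC-rich (coding-like) region followed by an AT-rich one:
sequence = "GCGGCCGCGG" + "ATTATAATTA"
print(best_cut(sequence))  # → 10, the true border
```

    A recursive segmentation applies this step again to each half until no cut is statistically significant.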

  17. Capacity, cutoff rate, and coding for a direct-detection optical channel

    NASA Technical Reports Server (NTRS)

    Massey, J. L.

    1980-01-01

    It is shown that Pierce's pulse position modulation scheme with 2 to the L pulse positions used on a self-noise-limited direct detection optical communication channel results in a 2 to the L-ary erasure channel that is equivalent to the parallel combination of L completely correlated binary erasure channels. The capacity of the full channel is the sum of the capacities of the component channels, but the cutoff rate of the full channel is shown to be much smaller than the sum of the cutoff rates. An interpretation of the cutoff rate is given that suggests a complexity advantage in coding separately on the component channels. It is shown that if short-constraint-length convolutional codes with Viterbi decoders are used on the component channels, then the performance and complexity compare favorably with the Reed-Solomon coding system proposed by McEliece for the full channel. The reasons for this unexpectedly fine performance by the convolutional code system are explored in detail, as are various facets of the channel structure.
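
    The capacity additivity and the cutoff-rate gap described above can be checked numerically. The sketch below assumes a Q-ary erasure channel with Q = 2^L, erasure probability p, and uniform inputs, using the standard erasure-channel formulas; the numbers are illustrative, not taken from the paper.

```python
# Numerical check of the decomposition described above: a 2^L-ary
# erasure channel with erasure probability p has capacity (1 - p) * L
# bits per use, equal to the sum of L binary erasure channel
# capacities. The cutoff rate R0, by contrast, does not add up.
from math import log2

def capacity_qary_erasure(L, p):
    """Capacity of a 2^L-ary erasure channel, in bits per channel use."""
    return (1 - p) * L

def cutoff_rate_qary_erasure(L, p):
    """R0 = log2(Q) - log2(1 + (Q - 1) * p) for a Q-ary erasure channel."""
    Q = 2 ** L
    return log2(Q) - log2(1 + (Q - 1) * p)

L, p = 4, 0.5
print(capacity_qary_erasure(L, p))        # → 2.0 bits/use (= 4 * 0.5)
print(L * (1 - log2(1 + p)))              # ≈ 1.66, sum of component cutoff rates
print(cutoff_rate_qary_erasure(L, p))     # ≈ 0.91, well below the component sum
```

    The gap between the last two numbers is the complexity advantage the abstract attributes to coding separately on the component channels.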

  18. Insertion of operation-and-indicate instructions for optimized SIMD code

    DOEpatents

    Eichenberger, Alexander E; Gara, Alan; Gschwind, Michael K

    2013-06-04

    Mechanisms are provided for inserting indicated instructions for tracking and indicating exceptions in the execution of vectorized code. A portion of first code is received for compilation. The portion of first code is analyzed to identify non-speculative instructions performing designated non-speculative operations in the first code that are candidates for replacement by replacement operation-and-indicate instructions that perform the designated non-speculative operations and further perform an indication operation for indicating any exception conditions corresponding to special exception values present in vector register inputs to the replacement operation-and-indicate instructions. The replacement is performed and second code is generated based on the replacement of the at least one non-speculative instruction. The data processing system executing the compiled code is configured to store special exception values in vector output registers, in response to a speculative instruction generating an exception condition, without initiating exception handling.

  19. Dual-channel-coded microbeads for multiplexed detection of biomolecules using assembling of quantum dots and element coding nanoparticles.

    PubMed

    Lu, Bangrong; He, Qinghua; He, Yonghong; Chen, Xuejing; Feng, Guangxia; Liu, Siyu; Ji, Yanhong

    2018-09-18

    To achieve dual-channel (analog and digital) encoding, microbeads assembled with quantum dots (QDs) and element coding nanoparticles (ECNPs) have been prepared. Dual spectra, i.e., fluorescence generated from the QDs and laser-induced breakdown spectra obtained from the plasma of the ECNPs (AgO, MgO and ZnO nanoparticles), have been adopted to provide greater encoding capacity and more accurate dual recognition of encoded microbeads in multiplexed utilization. The experimental results demonstrate that a single microbead can be decoded in two optical channels. Multiplexed analysis and a contrast adsorption experiment with anti-IgG verified the availability and specificity of dual-channel-coded microbeads in bioanalysis. In gradient detection of anti-IgG, we obtained a linear concentration response to target biomolecules from 3.125 × 10^-10 M to 1 × 10^-8 M, and the limit of detection was calculated to be 2.91 × 10^-11 M. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. CODE OF FEDERAL REGULATIONS

    EPA Science Inventory

    The Code of Federal Regulations (CFR) is an annually revised codification of the general and permanent rules published in the Federal Register by the executive departments and agencies of the Federal Government. The CFR is divided into 50 titles which represent broad areas subje...

  1. 12 CFR 1710.14 - Code of conduct and ethics.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 7 2011-01-01 2011-01-01 false Code of conduct and ethics. 1710.14 Section... Code of conduct and ethics. (a) General. An Enterprise shall establish and administer a written code of conduct and ethics that is reasonably designed to assure the ability of board members, executive officers...

  2. 12 CFR 1710.14 - Code of conduct and ethics.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Code of conduct and ethics. 1710.14 Section... Code of conduct and ethics. (a) General. An Enterprise shall establish and administer a written code of conduct and ethics that is reasonably designed to assure the ability of board members, executive officers...

  3. 12 CFR 1710.14 - Code of conduct and ethics.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 9 2012-01-01 2012-01-01 false Code of conduct and ethics. 1710.14 Section... Code of conduct and ethics. (a) General. An Enterprise shall establish and administer a written code of conduct and ethics that is reasonably designed to assure the ability of board members, executive officers...

  4. 12 CFR 1710.14 - Code of conduct and ethics.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 10 2014-01-01 2014-01-01 false Code of conduct and ethics. 1710.14 Section... Code of conduct and ethics. (a) General. An Enterprise shall establish and administer a written code of conduct and ethics that is reasonably designed to assure the ability of board members, executive officers...

  5. 12 CFR 1710.14 - Code of conduct and ethics.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 9 2013-01-01 2013-01-01 false Code of conduct and ethics. 1710.14 Section... Code of conduct and ethics. (a) General. An Enterprise shall establish and administer a written code of conduct and ethics that is reasonably designed to assure the ability of board members, executive officers...

  6. Executable assertions and flight software

    NASA Technical Reports Server (NTRS)

    Mahmood, A.; Andrews, D. M.; Mccluskey, E. J.

    1984-01-01

    Executable assertions are used to test flight control software. The techniques used for testing flight software, however, are different from the techniques used to test other kinds of software, because of the redundant nature of flight software. An experimental setup for testing flight software using executable assertions is described. Techniques for writing and using executable assertions to test flight software are presented. The error detection capability of assertions is studied and many examples of assertions are given. The issues of placement and complexity of assertions and the language features needed to support efficient use of assertions are discussed.
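
The kind of executable assertion the abstract describes can be made concrete with a small sketch; the control variable, limits, and rate logic below are invented for illustration, not taken from the flight software studied:

```python
# Toy executable assertions around a hypothetical flight-control update.
# All names and limits here are invented for illustration.

ELEVATOR_LIMIT_DEG = 25.0      # assumed actuator range
MAX_RATE_DEG_PER_STEP = 5.0    # assumed per-step slew limit

def update_elevator(previous_deg, commanded_deg):
    """Apply a command, with executable assertions checking physical plausibility."""
    # Precondition assertions: inputs must already be physically meaningful.
    assert -ELEVATOR_LIMIT_DEG <= previous_deg <= ELEVATOR_LIMIT_DEG, "bad state"
    assert -ELEVATOR_LIMIT_DEG <= commanded_deg <= ELEVATOR_LIMIT_DEG, "bad command"
    # Rate-limit the command, as redundant flight code typically would.
    delta = commanded_deg - previous_deg
    delta = max(-MAX_RATE_DEG_PER_STEP, min(MAX_RATE_DEG_PER_STEP, delta))
    new_deg = previous_deg + delta
    # Postcondition assertion: output still within the actuator envelope.
    assert -ELEVATOR_LIMIT_DEG <= new_deg <= ELEVATOR_LIMIT_DEG, "range violated"
    return new_deg
```

A violated assertion signals an error during test execution, which is exactly the detection mechanism the abstract evaluates.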

  7. From Verified Models to Verifiable Code

    NASA Technical Reports Server (NTRS)

    Lensink, Leonard; Munoz, Cesar A.; Goodloe, Alwyn E.

    2009-01-01

    Declarative specifications of digital systems often contain parts that can be automatically translated into executable code. Automated code generation may reduce or eliminate the kinds of errors typically introduced through manual code writing. For this approach to be effective, the generated code should be reasonably efficient and, more importantly, verifiable. This paper presents a prototype code generator for the Prototype Verification System (PVS) that translates a subset of PVS functional specifications into an intermediate language and subsequently to multiple target programming languages. Several case studies are presented to illustrate the tool's functionality. The generated code can be analyzed by software verification tools such as verification condition generators, static analyzers, and software model-checkers to increase the confidence that the generated code is correct.
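
The abstract's pipeline (functional specification, intermediate language, target code) is specific to PVS; as a rough illustration of the general idea, the sketch below turns a tiny invented specification format into executable Python source. It stands in for, and is much simpler than, the actual PVS code generator:

```python
# Toy generator: translate a tiny functional "specification" into Python source.
# The spec format below is an invented stand-in for PVS, for illustration only.

SPEC = {
    "name": "fib",
    "arg": "n",
    # (condition, expression) pairs, read top to bottom like an IF/ELSE cascade.
    "cases": [("n <= 1", "n"), ("True", "fib(n - 1) + fib(n - 2)")],
}

def generate(spec):
    """Emit Python source text for a single recursive function from the spec."""
    lines = [f"def {spec['name']}({spec['arg']}):"]
    for cond, expr in spec["cases"]:
        lines.append(f"    if {cond}:")
        lines.append(f"        return {expr}")
    return "\n".join(lines)

namespace = {}
exec(generate(SPEC), namespace)   # compile and load the generated code
fib = namespace["fib"]
```

Because the generated source is ordinary text, it can be handed to the kinds of verification tools the abstract mentions (static analyzers, VC generators) just like hand-written code.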

  8. Heterogeneous VM Replication: A New Approach to Intrusion Detection, Active Response and Recovery in Cloud Data Centers

    DTIC Science & Technology

    2015-08-17

    from the same execution history, and cost-effective active response by proactively setting up standby VM replicas: migration from a compromised VM...the guest OSes' system call code to be reused inside a "shadowed" portion of the context of the out-of-guest inspection program. Besides...by the rootkits in cloud environments. RootkitDet detects rootkits by identifying suspicious code regions in the kernel space of guest OSes through

  9. Ethical guidance in the era of managed care: an analysis of the American College of Healthcare Executives' Code of Ethics.

    PubMed

    Higgins, W

    2000-01-01

    Market competition and the rise of managed care are transforming the healthcare system from a physician-dominated cottage industry into a manager-dominated corporate enterprise. The managed care revolution is also undermining the safeguards offered by medical ethics and raising serious public concerns. These trends highlight the growing importance of ethical standards for managers. The most comprehensive ethical guidance for health service managers is contained in the American College of Healthcare Executives' (ACHE) Code of Ethics. An analysis of the ACHE Code suggests that it does not adequately address several ethical concerns associated with managed care. The ACHE may wish to develop a supplemental statement regarding ethical issues in managed care. A supplemental statement that provides more specific guidance in the areas of financial incentives to reduce utilization, social mission, consumer/patient information, and the health service manager's responsibility to patients could be extremely valuable in today's complex and rapidly changing environment. More specific ethical guidelines would not ensure individual or organizational compliance. However, they would provide professional standards that could guide decision making and help managers evaluate performance in managed care settings.

  10. [Spanish translation and validation of an Executive Battery 25 (EB25) and its shortened version (ABE12) for executive dysfunction screening in dementia].

    PubMed

    Serrani Azcurra, D J L

    2013-10-01

    There is a need for clinically administered instruments capable of detecting executive dysfunction in dementia. The aim was the translation and validation of the Executive Battery 25 (EB25) and a short version for screening of executive dysfunction in dementia. The original battery was translated and validated using convergent and divergent correlation in 66 mild dementia patients (CDR 1) matched with 66 controls. EB25 consists of 25 items which detect executive dysfunction. Convergent correlation was made with 7 tests assessing executive dysfunction, the Frontal Systems Behaviour Scale (FrSBe) and the Disability Fast Assessment Scale. Patients had higher scores than controls, and scores correlated with the Stroop Test, verbal fluency test and Frontal Behaviour Inventory. Only 12 of the 25 items were needed to separate the two groups; these were used to build an abbreviated Executive Battery (AEB12) with equal psychometric properties and discriminative power. The cut-off point for EB25 was 12, and 7 for the abbreviated version. A cut-off point of 12 was able to discriminate between Alzheimer's disease (AD) and frontotemporal lobe dementia (FTLD). EB25 and AEB12 enable executive dysfunction to be detected in mild dementia. Moreover, AEB12 exhibits better psychometric properties than the original battery, allows discrimination between AD and FTLD, and is completed in less time. Copyright © 2010 Sociedad Española de Neurología. Published by Elsevier España. All rights reserved.

  11. Optimizing Tensor Contraction Expressions for Hybrid CPU-GPU Execution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Wenjing; Krishnamoorthy, Sriram; Villa, Oreste

    2013-03-01

    Tensor contractions are generalized multidimensional matrix multiplication operations that widely occur in quantum chemistry. Efficient execution of tensor contractions on Graphics Processing Units (GPUs) requires several challenges to be addressed, including index permutation and small dimension sizes that reduce thread-block utilization. Moreover, to apply the same optimizations to various expressions, we need a code generation tool. In this paper, we present our approach to automatically generate CUDA code to execute tensor contractions on GPUs, including management of data movement between CPU and GPU. To evaluate our tool, GPU-enabled code is generated for the most expensive contractions in CCSD(T), a key coupled cluster method, and incorporated into NWChem, a popular computational chemistry suite. For this method, we demonstrate a speedup of over 8.4 using one GPU (instead of one core per node) and of over 2.6 when utilizing the entire system using a hybrid CPU+GPU solution with 2 GPUs and 5 cores (instead of 7 cores per node). Finally, we analyze the implementation behavior on future GPU systems.
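
The paper's generated kernels are CUDA; as a CPU-side illustration of what a tensor contraction with index permutation involves, the NumPy sketch below contracts two small tensors both directly and via the permute-then-reshape-then-GEMM lowering commonly used on GPUs (the shapes and labels are invented):

```python
# A tensor contraction of the kind described above, written with NumPy's
# einsum as a CPU stand-in for generated CUDA kernels (illustration only).
import numpy as np

rng = np.random.default_rng(0)
# Small mode sizes mimic the "small dimension-sizes" challenge mentioned above.
T = rng.standard_normal((4, 5, 6))      # t[a, b, c]
V = rng.standard_normal((6, 5, 3))      # v[c, b, d]

# Contract over b and c: r[a, d] = sum_{b,c} t[a,b,c] * v[c,b,d].
R = np.einsum("abc,cbd->ad", T, V)

# The same contraction via explicit index permutation + matrix multiply,
# which is how such kernels are often lowered to GEMM calls:
T2 = T.reshape(4, 5 * 6)                           # flatten (b, c) into one axis
V2 = np.transpose(V, (1, 0, 2)).reshape(5 * 6, 3)  # permute to (b, c, d), flatten
R2 = T2 @ V2
```

The transpose step is exactly the index-permutation cost the abstract identifies as a challenge: on a GPU, that data movement competes with the arithmetic itself.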

  12. Quartz crystal microbalance detection of DNA single-base mutation based on monobase-coded cadmium tellurium nanoprobe.

    PubMed

    Zhang, Yuqin; Lin, Fanbo; Zhang, Youyu; Li, Haitao; Zeng, Yue; Tang, Hao; Yao, Shouzhuo

    2011-01-01

    A new method for the detection of point mutation in DNA, based on monobase-coded cadmium tellurium nanoprobes and the quartz crystal microbalance (QCM) technique, is reported. A point-mutation DNA QCM sensor (detecting a single-base adenine, thymine, cytosine or guanine (A, T, C or G) mutation in a DNA strand) was fabricated by immobilizing single-base-mutation DNA-modified magnetic beads onto the electrode surface with an external magnetic field near the electrode. The DNA-modified magnetic beads were obtained from the biotin-avidin affinity reaction of biotinylated DNA and streptavidin-functionalized core/shell Fe(3)O(4)/Au magnetic nanoparticles, followed by a DNA hybridization reaction. Single-base-coded CdTe nanoprobes (A-CdTe, T-CdTe, C-CdTe and G-CdTe, respectively) were used as the detection probes. The mutation site in DNA was distinguished by detecting the decrease in the resonance frequency of the piezoelectric quartz crystal when the coded nanoprobe was added to the test system. This proposed detection strategy for point mutation in DNA proved to be sensitive, simple, repeatable and low-cost; consequently, it has great potential for single nucleotide polymorphism (SNP) detection. 2011 © The Japan Society for Analytical Chemistry

  13. Fixed-point Design of the Lattice-reduction-aided Iterative Detection and Decoding Receiver for Coded MIMO Systems

    DTIC Science & Technology

    2011-01-01

    reliability, e.g., Turbo Codes [2] and Low Density Parity Check (LDPC) codes [3]. The challenge to apply both MIMO and ECC into wireless systems is on...illustrates the performance of coded LR aided detectors.

  14. An improved algorithm for evaluating trellis phase codes

    NASA Technical Reports Server (NTRS)

    Mulligan, M. G.; Wilson, S. G.

    1982-01-01

    A method is described for evaluating the minimum distance parameters of trellis phase codes, including CPFSK, partial response FM, and more importantly, coded CPM (continuous phase modulation) schemes. The algorithm provides dramatically faster execution times and lower memory requirements than previous algorithms. Results of sample calculations and timing comparisons are included.

  15. An improved algorithm for evaluating trellis phase codes

    NASA Technical Reports Server (NTRS)

    Mulligan, M. G.; Wilson, S. G.

    1984-01-01

    A method is described for evaluating the minimum distance parameters of trellis phase codes, including CPFSK, partial response FM, and more importantly, coded CPM (continuous phase modulation) schemes. The algorithm provides dramatically faster execution times and lower memory requirements than previous algorithms. Results of sample calculations and timing comparisons are included.

  16. SecureQEMU: Emulation-Based Software Protection Providing Encrypted Code Execution and Page Granularity Code Signing

    DTIC Science & Technology

    2008-12-01

    SHA256_DIGEST_LENGTH)); peAddSection(&sFile, ".SigStub", dwStubSecSize, dwStubSecSize); peSecure(&sFile, deqAddrSize...deqAuthPageAddrSize.size()/2) * (8 + SHA256_DIGEST_LENGTH)) + 16; bCode[34] = ((char*)&dwSize)[0]; bCode[35] = ((char*)&dwSize)[1]...2) * (8 + SHA256_DIGEST_LENGTH...)); AES_KEY aesKey; unsigned char ivsalt[16], temp_iv[16]; unsigned char *key

  17. Virtual multiple errands test (VMET): a virtual reality-based tool to detect early executive functions deficit in Parkinson's disease.

    PubMed

    Cipresso, Pietro; Albani, Giovanni; Serino, Silvia; Pedroli, Elisa; Pallavicini, Federica; Mauro, Alessandro; Riva, Giuseppe

    2014-01-01

    Several recent studies have pointed out that early impairment of executive functions (EFs) in Parkinson's Disease (PD) may be a crucial marker to detect patients at risk for developing dementia. The main objective of this study was to compare the performances of PD patients with mild cognitive impairment (PD-MCI) with PD patients with normal cognition (PD-NC) and a control group (CG) using a traditional assessment of EFs and the Virtual Multiple Errands Test (VMET), a virtual reality (VR)-based tool. In order to understand which subcomponents of EFs are impaired early, this experimental study aimed to investigate specifically which instrument best discriminates among these three groups. The study included three groups of 15 individuals each (for a total of 45 participants): 15 PD-NC; 15 PD-MCI, and 15 cognitively healthy individuals (CG). To assess the global neuropsychological functioning and the EFs, several tests (including the Mini Mental State Examination (MMSE), Clock Drawing Test, and Tower of London test) were administered to the participants. The VMET was used for a more ecologically valid neuropsychological evaluation of EFs. Findings revealed significant differences in the VMET scores between the PD-NC patients and the controls. In particular, patients made more errors in the tasks of the VMET, and showed a poorer ability to use effective strategies to complete the tasks. This VMET result seems to be more sensitive in the early detection of executive deficits because these two groups did not differ in the traditional assessment of EFs (neuropsychological battery). This study offers initial evidence that a more ecologically valid evaluation of EFs is more likely to lead to detection of subtle executive deficits.

  18. Methods and computer executable instructions for marking a downhole elongate line and detecting same

    DOEpatents

    Watkins, Arthur D.

    2003-05-13

    Methods and computer executable instructions are provided for making an elongate line (22) with a plurality of marks (30) and detecting those marks (30) to determine a distance of the elongate line (22) in a downhole or a physical integrity thereof. In a preferred embodiment, each mark comprises a plurality of particles (44) having a substantially permanent magnetizing capability adhered to an exterior surface of the elongate line (22) at preselected intervals with an epoxy paint. The particles (44) are arranged at each interval as a plurality of bands (40). Thereafter, the particles are oriented into a magnetic signature for that interval by magnetizing the particles to create a magnetic field substantially normal to the exterior surface. This facilitates detection by a Hall effect probe. The magnetic signatures are stored in a computing configuration and, once a mark is detected, a correlation is made to a unique position on the elongate line by comparison with the stored magnetic signatures. Preferred particles include samarium-cobalt and neodymium-iron-boride.
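
The final matching step (comparing a detected signature with the stored ones to recover a unique position on the line) can be sketched as a nearest-match lookup; the numeric signature encoding and depths below are invented for illustration:

```python
# Toy correlation of a detected magnetic signature against stored ones.
# Signatures are modeled as short lists of band field strengths; this
# encoding and the depths are invented, not taken from the patent.

STORED = {
    100.0: [1.0, 0.2, 0.8],   # depth in metres -> signature of the bands there
    200.0: [0.3, 0.9, 0.4],
    300.0: [0.7, 0.7, 0.1],
}

def match_position(detected):
    """Return the stored depth whose signature is closest (least squares) to the reading."""
    def dist(sig):
        return sum((a - b) ** 2 for a, b in zip(sig, detected))
    return min(STORED, key=lambda depth: dist(STORED[depth]))

# A noisy Hall-probe reading near the 200 m mark still resolves correctly.
position = match_position([0.28, 0.93, 0.42])
```

The tolerance to noise in the reading is what makes the correlation step robust to imperfect field measurements downhole.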

  19. An Extended Proof-Carrying Code Framework for Security Enforcement

    NASA Astrophysics Data System (ADS)

    Pirzadeh, Heidar; Dubé, Danny; Hamou-Lhadj, Abdelwahab

    The rapid growth of the Internet has resulted in increased attention to security to protect users from being victims of security threats. In this paper, we focus on security mechanisms that are based on Proof-Carrying Code (PCC) techniques. In a PCC system, a code producer sends a code along with its safety proof to the consumer. The consumer executes the code only if the proof is valid. Although PCC has been shown to be a useful security framework, it suffers from the sheer size of typical proofs: proofs of even small programs can be considerably large. In this paper, we propose an extended PCC framework (EPCC) in which, instead of the proof, a proof generator for the program in question is transmitted. This framework enables the execution of the proof generator and the recovery of the proof on the consumer's side in a secure manner using a newly created virtual machine called the VEP (Virtual Machine for Extended PCC).
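
The PCC control flow itself (consumer validates, then executes) can be sketched in a few lines. Note that the "proof" and checker below are toy stand-ins: real PCC validates a formal safety proof, which this sketch does not attempt:

```python
# Toy illustration of the PCC control flow: the consumer executes code only
# after a safety check succeeds. A real PCC system validates a formal proof;
# the allow-listed mini-language below is only a stand-in for that step.

SAFE_OPS = {"PUSH", "ADD", "MUL"}   # policy: arithmetic only, no I/O

def check(program, claimed_ops):
    """Stand-in for proof validation: the 'proof' here is just a claimed op set."""
    used = {op for op, *_ in program}
    return used == set(claimed_ops) and used <= SAFE_OPS

def execute(program):
    """Interpret the checked stack program."""
    stack = []
    for op, *args in program:
        if op == "PUSH":
            stack.append(args[0])
        elif op == "ADD":
            stack.append(stack.pop() + stack.pop())
        elif op == "MUL":
            stack.append(stack.pop() * stack.pop())
    return stack[-1]

# Producer side: code plus its (toy) safety evidence.
producer_code = [("PUSH", 3), ("PUSH", 4), ("ADD",), ("PUSH", 2), ("MUL",)]
producer_proof = ["PUSH", "ADD", "MUL"]

# Consumer side: execute only if the check passes.
result = execute(producer_code) if check(producer_code, producer_proof) else None
```

The EPCC variant in the abstract changes what is shipped (a proof generator instead of the proof), but the consumer-side gate before execution is the same.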

  20. The interactive roles of parenting, emotion regulation and executive functioning in moral reasoning during middle childhood.

    PubMed

    Hinnant, J Benjamin; Nelson, Jackie A; O'Brien, Marion; Keane, Susan P; Calkins, Susan D

    2013-01-01

    We examined mother-child co-operative behaviour, children's emotion regulation and executive function, as well as combinations of these factors, as predictors of moral reasoning in 89 10-year-old children. Dyadic co-operation was coded from videotaped observations of laboratory puzzle and speech tasks. Emotion regulation was derived from maternal report, and executive functioning was assessed with the Tower of London task. Moral reasoning was coded during mother-child conversations about morally ambiguous, peer-conflict situations. Two significant interactions indicated that children from more co-operative dyads who also had higher executive function skills had higher moral reasoning scores than other children, and children lower in both emotion regulation and executive function had lower moral reasoning scores than other children. The results contribute to the literature on the multiple and interactive levels of influence on moral reasoning in childhood.

  1. The Interactive Roles of Parenting, Emotion Regulation and Executive Functioning in Moral Reasoning during Middle Childhood

    PubMed Central

    Hinnant, J. Benjamin; Nelson, Jackie A.; O’Brien, Marion; Keane, Susan P.; Calkins, Susan D.

    2013-01-01

    We examined mother-child cooperative behavior, children’s emotion regulation and executive function, as well as combinations of these factors, as predictors of moral reasoning in 89 10-year-old children. Dyadic cooperation was coded from videotaped observations of laboratory puzzle and speech tasks. Emotion regulation was derived from maternal report, and executive functioning was assessed with the Tower of London task. Moral reasoning was coded during mother-child conversations about morally ambiguous, peer-conflict situations. Two significant interactions indicated that children from more cooperative dyads who also had higher executive function skills had higher moral reasoning scores than other children, and children lower in both emotion regulation and executive function had lower moral reasoning scores than other children. The results contribute to the literature on the multiple and interactive levels of influence on moral reasoning in childhood. PMID:23650955

  2. Dedicated memory structure holding data for detecting available worker thread(s) and informing available worker thread(s) of task(s) to execute

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chiu, George L.; Eichenberger, Alexandre E.; O'Brien, John K. P.

    The present disclosure relates generally to a dedicated memory structure (that is, a hardware device) holding data for detecting available worker thread(s) and informing available worker thread(s) of task(s) to execute.
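
A software analogy of the claimed structure (detect idle workers, hand them tasks) can be sketched with a thread-safe queue; the hardware device itself obviously has no direct Python equivalent, so this only mirrors the dataflow:

```python
# Software sketch of a shared structure that tracks available worker threads
# and hands them tasks; the patent describes a hardware device, so this is
# only an analogy using ordinary locks, threads, and a queue.
import threading, queue

task_queue = queue.Queue()
results = []
results_lock = threading.Lock()

def worker():
    while True:
        task = task_queue.get()        # block until the structure assigns a task
        if task is None:               # sentinel: no more work
            break
        with results_lock:
            results.append(task * task)
        task_queue.task_done()

threads = [threading.Thread(target=worker) for _ in range(3)]
for t in threads:
    t.start()
for n in range(5):
    task_queue.put(n)                  # "inform an available worker of a task"
task_queue.join()                      # wait until every task is processed
for _ in threads:
    task_queue.put(None)
for t in threads:
    t.join()
```

The queue's blocking `get` plays the role of the availability flag: an idle worker is, by construction, one waiting on the structure.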

  3. Some Problems and Solutions in Transferring Ecosystem Simulation Codes to Supercomputers

    NASA Technical Reports Server (NTRS)

    Skiles, J. W.; Schulbach, C. H.

    1994-01-01

    Many computer codes for the simulation of ecological systems have been developed in the last twenty-five years. This development took place initially on main-frame computers, then mini-computers, and more recently, on micro-computers and workstations. Recent recognition of ecosystem science as a High Performance Computing and Communications Program Grand Challenge area emphasizes supercomputers (both parallel and distributed systems) as the next set of tools for ecological simulation. Transferring ecosystem simulation codes to such systems is not a matter of simply compiling and executing existing code on the supercomputer since there are significant differences in the system architectures of sequential, scalar computers and parallel and/or vector supercomputers. To more appropriately match the application to the architecture (necessary to achieve reasonable performance), the parallelism (if it exists) of the original application must be exploited. We discuss our work in transferring a general grassland simulation model (developed on a VAX in the FORTRAN computer programming language) to a Cray Y-MP. We show the Cray shared-memory vector-architecture, and discuss our rationale for selecting the Cray. We describe porting the model to the Cray and executing and verifying a baseline version, and we discuss the changes we made to exploit the parallelism in the application and to improve code execution. As a result, the Cray executed the model 30 times faster than the VAX 11/785 and 10 times faster than a Sun 4 workstation. We achieved an additional speed-up of approximately 30 percent over the original Cray run by using the compiler's vectorizing capabilities and the machine's ability to put subroutines and functions "in-line" in the code. With the modifications, the code still runs at only about 5% of the Cray's peak speed because it makes ineffective use of the vector processing capabilities of the Cray. We conclude with a discussion and future plans.
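
The kind of restructuring involved in exploiting vector hardware can be shown in miniature; the loop below is a generic invented stand-in, not code from the grassland model:

```python
# Miniature example of the scalar-vs-vector restructuring discussed above.
# The logistic-style update is a generic stand-in, not the grassland model.
import numpy as np

def growth_scalar(biomass, rate, limit):
    """Element-at-a-time loop, the shape a sequential VAX-era code would take."""
    out = []
    for b in biomass:
        out.append(b + rate * b * (1.0 - b / limit))
    return out

def growth_vector(biomass, rate, limit):
    """Same arithmetic as whole-array operations a vector unit can pipeline."""
    b = np.asarray(biomass)
    return b + rate * b * (1.0 - b / limit)

biomass = [10.0, 20.0, 40.0]
scalar = growth_scalar(biomass, rate=0.1, limit=100.0)
vector = growth_vector(biomass, rate=0.1, limit=100.0)
```

On a vector machine like the Cray Y-MP the second form is what the vectorizing compiler tries to recover automatically; loops with cross-iteration dependences defeat it, which is one reason the ported model reached only a few percent of peak.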

  4. The Limits of Coding with Joint Constraints on Detected and Undetected Error Rates

    NASA Technical Reports Server (NTRS)

    Dolinar, Sam; Andrews, Kenneth; Pollara, Fabrizio; Divsalar, Dariush

    2008-01-01

    We develop a remarkably tight upper bound on the performance of a parameterized family of bounded angle maximum-likelihood (BA-ML) incomplete decoders. The new bound for this class of incomplete decoders is calculated from the code's weight enumerator, and is an extension of Poltyrev-type bounds developed for complete ML decoders. This bound can also be applied to bound the average performance of random code ensembles in terms of an ensemble average weight enumerator. We also formulate conditions defining a parameterized family of optimal incomplete decoders, defined to minimize both the total codeword error probability and the undetected error probability for any fixed capability of the decoder to detect errors. We illustrate the gap between optimal and BA-ML incomplete decoding via simulation of a small code.

  5. Performing aggressive code optimization with an ability to rollback changes made by the aggressive optimizations

    DOEpatents

    Gschwind, Michael K

    2013-07-23

    Mechanisms for aggressively optimizing computer code are provided. With these mechanisms, a compiler determines an optimization to apply to a portion of source code and determines if the optimization as applied to the portion of source code will result in unsafe optimized code that introduces a new source of exceptions being generated by the optimized code. In response to a determination that the optimization is an unsafe optimization, the compiler generates an aggressively compiled code version, in which the unsafe optimization is applied, and a conservatively compiled code version in which the unsafe optimization is not applied. The compiler stores both versions and provides them for execution. Mechanisms are provided for switching between these versions during execution in the event of a failure of the aggressively compiled code version. Moreover, predictive mechanisms are provided for predicting whether such a failure is likely.
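
The run-time switching mechanism (execute the aggressive version, roll back to the conservative one on failure) can be sketched as exception handling; both "compiled versions" below are hand-written stand-ins for compiler output:

```python
# Sketch of executing an aggressively optimized version with rollback to a
# conservative version on failure. Both functions are hand-written stand-ins
# for what a compiler would emit; only the switching mechanism is the point.

def mean_aggressive(values):
    """'Optimized' variant: skips the empty-input guard the optimizer removed."""
    return sum(values) / len(values)        # raises ZeroDivisionError on []

def mean_conservative(values):
    """Safe variant: keeps the guard, returning 0.0 for empty input."""
    if not values:
        return 0.0
    return sum(values) / len(values)

def mean(values):
    """Prefer the aggressive version; roll back to the safe one if it faults."""
    try:
        return mean_aggressive(values)
    except ZeroDivisionError:
        return mean_conservative(values)
```

The patent's mechanism operates below the language level (and adds prediction of likely failures), but the shape is the same: the unsafe fast path runs until it introduces a new exception, at which point execution falls back to the conservative compilation.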

  6. Global ISR: Toward a Comprehensive Defense Against Unauthorized Code Execution

    DTIC Science & Technology

    2010-10-01

    implementation using two of the most popular open-source servers: the Apache web server, and the MySQL database server. For Apache, we measure the effect that...utility ab. [Fig. 3 plots total time (sec) for Native, Null, ISR and ISR-MP configurations.] The MySQL test-insert benchmark measures...various SQL operations. The figure draws total execution time as reported by the benchmark utility. Finally, we benchmarked a MySQL database server using

  7. The cooking task: making a meal of executive functions

    PubMed Central

    Doherty, T. A.; Barker, L. A.; Denniss, R.; Jalil, A.; Beer, M. D.

    2015-01-01

    Current standardized neuropsychological tests may fail to accurately capture real-world executive deficits. We developed a computer-based Cooking Task (CT) assessment of executive functions and trialed the measure with a normative group before use with a head-injured population. Forty-six participants completed the computerized CT and subtests from standardized neuropsychological tasks, including the Tower and Sorting Tests of executive function from the Delis-Kaplan Executive Function System (D-KEFS) and the Cambridge prospective memory test (CAMPROMPT), in order to examine whether standardized executive function tasks predicted performance on measurement indices from the CT. Findings showed that verbal comprehension, rule detection and prospective memory contributed to measures of prospective planning accuracy and strategy implementation of the CT. Results also showed that functions necessary for cooking efficacy differ as an effect of task demands (difficulty levels). Performance on rule detection, strategy implementation and flexible thinking executive function measures contributed to accuracy on the CT. These findings raise questions about the functions captured by present standardized tasks, particularly at varying levels of difficulty and during dual-task performance. Our preliminary findings also indicate that CT measures can effectively distinguish between executive function and Full Scale IQ abilities. Results of the present study indicate that the CT shows promise as an ecologically valid measure of executive function for future use with a head-injured population and indexes selective executive functions captured by standardized tests. PMID:25717294

  8. The cooking task: making a meal of executive functions.

    PubMed

    Doherty, T A; Barker, L A; Denniss, R; Jalil, A; Beer, M D

    2015-01-01

    Current standardized neuropsychological tests may fail to accurately capture real-world executive deficits. We developed a computer-based Cooking Task (CT) assessment of executive functions and trialed the measure with a normative group before use with a head-injured population. Forty-six participants completed the computerized CT and subtests from standardized neuropsychological tasks, including the Tower and Sorting Tests of executive function from the Delis-Kaplan Executive Function System (D-KEFS) and the Cambridge prospective memory test (CAMPROMPT), in order to examine whether standardized executive function tasks predicted performance on measurement indices from the CT. Findings showed that verbal comprehension, rule detection and prospective memory contributed to measures of prospective planning accuracy and strategy implementation of the CT. Results also showed that functions necessary for cooking efficacy differ as an effect of task demands (difficulty levels). Performance on rule detection, strategy implementation and flexible thinking executive function measures contributed to accuracy on the CT. These findings raise questions about the functions captured by present standardized tasks, particularly at varying levels of difficulty and during dual-task performance. Our preliminary findings also indicate that CT measures can effectively distinguish between executive function and Full Scale IQ abilities. Results of the present study indicate that the CT shows promise as an ecologically valid measure of executive function for future use with a head-injured population and indexes selective executive functions captured by standardized tests.

  9. Advanced turboprop noise prediction: Development of a code at NASA Langley based on recent theoretical results

    NASA Technical Reports Server (NTRS)

    Farassat, F.; Dunn, M. H.; Padula, S. L.

    1986-01-01

    The development of a high speed propeller noise prediction code at Langley Research Center is described. The code utilizes two recent acoustic formulations in the time domain for subsonic and supersonic sources. The structure and capabilities of the code are discussed. A grid-size study for accuracy and speed of execution on a computer is also presented. The code is tested against an earlier Langley code. Considerable increases in accuracy and speed of execution are observed. Some examples of noise prediction for a high speed propeller for which acoustic test data are available are given. A brief derivation of the formulations used is given in an appendix.

  10. Development of an Efficient Entire-Capsid-Coding-Region Amplification Method for Direct Detection of Poliovirus from Stool Extracts

    PubMed Central

    Kilpatrick, David R.; Nakamura, Tomofumi; Burns, Cara C.; Bukbuk, David; Oderinde, Soji B.; Oberste, M. Steven; Kew, Olen M.; Pallansch, Mark A.; Shimizu, Hiroyuki

    2014-01-01

    Laboratory diagnosis has played a critical role in the Global Polio Eradication Initiative since 1988, by isolating and identifying poliovirus (PV) from stool specimens by using cell culture as a highly sensitive system to detect PV. In the present study, we aimed to develop a molecular method to detect PV directly from stool extracts, with a high efficiency comparable to that of cell culture. We developed a method to efficiently amplify the entire capsid coding region of human enteroviruses (EVs) including PV. cDNAs of the entire capsid coding region (3.9 kb) were obtained from as few as 50 copies of PV genomes. PV was detected from the cDNAs with an improved PV-specific real-time reverse transcription-PCR system and nucleotide sequence analysis of the VP1 coding region. For assay validation, we analyzed 84 stool extracts that were positive for PV in cell culture and detected PV genomes from 100% of the extracts (84/84 samples) with this method in combination with a PV-specific extraction method. PV could be detected in 2/4 stool extract samples that were negative for PV in cell culture. In PV-positive samples, EV species C viruses were also detected with high frequency (27% [23/86 samples]). This method would be useful for direct detection of PV from stool extracts without using cell culture. PMID:25339406

  11. A direct-execution parallel architecture for the Advanced Continuous Simulation Language (ACSL)

    NASA Technical Reports Server (NTRS)

    Carroll, Chester C.; Owen, Jeffrey E.

    1988-01-01

    A direct-execution parallel architecture for the Advanced Continuous Simulation Language (ACSL) is presented which overcomes the traditional disadvantages of simulations executed on a digital computer. The incorporation of parallel processing allows the mapping of simulations into a digital computer to be done in the same inherently parallel manner as they are currently mapped onto an analog computer. The direct-execution format maximizes the efficiency of the executed code since the need for a high level language compiler is eliminated. Resolution is greatly increased over that which is available with an analog computer without the sacrifice in execution speed normally expected with digital computer simulations. Although this report covers all aspects of the new architecture, key emphasis is placed on the processing element configuration and the microprogramming of the ACSL constructs. The execution times for all ACSL constructs are computed using a model of a processing element based on the AMD 29000 CPU and the AMD 29027 FPU. The increase in execution speed provided by parallel processing is exemplified by comparing the derived execution times of two ACSL programs with the execution times for the same programs executed on a similar sequential architecture.
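
The workload such an architecture targets is a continuous-simulation loop of derivative evaluations followed by integration; the sketch below shows that structure with an invented second-order system, with the parallel mapping noted only in comments:

```python
# A small continuous-simulation integration loop of the kind ACSL expresses.
# On the architecture described above, each derivative evaluation would map
# onto its own processing element; here they run sequentially. The spring-damper
# system and its constants are invented for illustration.

def simulate(steps=1000, dt=0.001, k=4.0, c=0.5):
    x, v = 1.0, 0.0                    # initial position and velocity
    for _ in range(steps):
        # Derivative evaluations: mutually independent, hence naturally parallel.
        dx = v
        dv = -k * x - c * v
        # Integration step (Euler), applied after all derivatives are known.
        x += dt * dx
        v += dt * dv
    return x, v

x_final, v_final = simulate()
```

The two-phase structure (evaluate all derivatives, then advance all states) is what makes the analog-computer-style parallel mapping possible: within a phase there are no dependences between elements.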

  12. Decoding of DBEC-TBED Reed-Solomon codes. [Double-Byte-Error-Correcting, Triple-Byte-Error-Detecting]

    NASA Technical Reports Server (NTRS)

    Deng, Robert H.; Costello, Daniel J., Jr.

    1987-01-01

    A problem in designing semiconductor memories is to provide some measure of error control without requiring excessive coding overhead or decoding time. In LSI and VLSI technology, memories are often organized on a multiple bit (or byte) per chip basis. For example, some 256 K bit DRAM's are organized in 32 K x 8 bit-bytes. Byte-oriented codes such as Reed-Solomon (RS) codes can provide efficient low overhead error control for such memories. However, the standard iterative algorithm for decoding RS codes is too slow for these applications. The paper presents a special decoding technique for double-byte-error-correcting, triple-byte-error-detecting RS codes which is capable of high-speed operation. This technique is designed to find the error locations and the error values directly from the syndrome without having to use the iterative algorithm to find the error locator polynomial.
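The direct-from-syndrome idea can be illustrated in much simplified form with a single-byte-error-correcting code over GF(2^8): two syndromes alone yield the error value and location, with no iterative locator-polynomial step. A hypothetical sketch, not the paper's double-byte decoder:

```python
# Illustrative single-byte-error decode over GF(2^8), primitive poly 0x11D.
# The error value and location are read directly from syndromes S0 and S1.

EXP = [0] * 512          # antilog table for GF(2^8)
LOG = [0] * 256
x = 1
for i in range(255):
    EXP[i] = x
    LOG[x] = i
    x <<= 1
    if x & 0x100:
        x ^= 0x11D
for i in range(255, 512):
    EXP[i] = EXP[i - 255]

def gmul(a, b):
    return 0 if 0 in (a, b) else EXP[LOG[a] + LOG[b]]

def poly_mul(p, q):      # polynomial product over GF(2^8), low order first
    out = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] ^= gmul(a, b)
    return out

def poly_eval(p, a):     # Horner evaluation, p low order first
    acc = 0
    for coef in reversed(p):
        acc = gmul(acc, a) ^ coef
    return acc

# generator with roots alpha^0 and alpha^1, so codewords have zero syndromes
gen = poly_mul([1, 1], [EXP[1], 1])
codeword = poly_mul([0x12, 0x34, 0x56], gen)

received = list(codeword)
received[4] ^= 0x5A                    # inject a single-byte error

S0 = poly_eval(received, EXP[0])       # S0 = error value
S1 = poly_eval(received, EXP[1])       # S1 = value * alpha^location
loc = (LOG[S1] - LOG[S0]) % 255        # location, directly from the syndromes
received[loc] ^= S0                    # correct in place
```

The double-byte case the paper addresses follows the same principle, with additional syndromes feeding a small closed-form system rather than a one-line lookup.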

  13. Investigating common coding of observed and executed actions in the monkey brain using cross-modal multi-variate fMRI classification.

    PubMed

    Fiave, Prosper Agbesi; Sharma, Saloni; Jastorff, Jan; Nelissen, Koen

    2018-05-19

Mirror neurons are generally described as a neural substrate hosting shared representations of actions, by simulating or 'mirroring' the actions of others onto the observer's own motor system. Since single neuron recordings are rarely feasible in humans, it has been argued that cross-modal multi-variate pattern analysis (MVPA) of non-invasive fMRI data is a suitable technique to investigate common coding of observed and executed actions, allowing researchers to infer the presence of mirror neurons in the human brain. In an effort to close the gap between monkey electrophysiology and human fMRI data with respect to the mirror neuron system, here we tested this proposal for the first time in the monkey. Rhesus monkeys either performed reach-and-grasp or reach-and-touch motor acts with their right hand in the dark or observed videos of human actors performing similar motor acts. Unimodal decoding showed that both executed and observed motor acts could be decoded from numerous brain regions. Specific portions of rostral parietal, premotor and motor cortices, previously shown to house mirror neurons, in addition to somatosensory regions, yielded significant asymmetric action-specific cross-modal decoding. These results validate the use of cross-modal multi-variate fMRI analyses to probe the representations of own and others' actions in the primate brain and support the proposed mapping of others' actions onto the observer's own motor cortices. Copyright © 2018 Elsevier Inc. All rights reserved.

  14. User's manual for a material transport code on the Octopus Computer Network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naymik, T.G.; Mendez, G.D.

    1978-09-15

    A code to simulate material transport through porous media was developed at Oak Ridge National Laboratory. This code has been modified and adapted for use at Lawrence Livermore Laboratory. This manual, in conjunction with report ORNL-4928, explains the input, output, and execution of the code on the Octopus Computer Network.

  15. Improvement of the predicted aural detection code ICHIN (I Can Hear It Now)

    NASA Technical Reports Server (NTRS)

    Mueller, Arnold W.; Smith, Charles D.; Lemasurier, Phillip

    1993-01-01

    Acoustic tests were conducted to study the far-field sound pressure levels and aural detection ranges associated with a Sikorsky S-76A helicopter in straight and level flight at various advancing blade tip Mach numbers. The flight altitude was nominally 150 meters above ground level. This paper compares the normalized predicted aural detection distances, based on the measured far-field sound pressure levels, to the normalized measured aural detection distances obtained from sound jury response measurements obtained during the same test. Both unmodified and modified versions of the prediction code ICHIN-6 (I Can Hear It Now) were used to produce the results for this study.

  16. PMD compensation in fiber-optic communication systems with direct detection using LDPC-coded OFDM.

    PubMed

    Djordjevic, Ivan B

    2007-04-02

    The possibility of polarization-mode dispersion (PMD) compensation in fiber-optic communication systems with direct detection using a simple channel estimation technique and low-density parity-check (LDPC)-coded orthogonal frequency division multiplexing (OFDM) is demonstrated. It is shown that even for differential group delay (DGD) of 4/BW (BW is the OFDM signal bandwidth), the degradation due to the first-order PMD can be completely compensated for. Two classes of LDPC codes designed based on two different combinatorial objects (difference systems and product of combinatorial designs) suitable for use in PMD compensation are introduced.

  17. Transformation of the neural code for tactile detection from thalamus to cortex.

    PubMed

    Vázquez, Yuriria; Salinas, Emilio; Romo, Ranulfo

    2013-07-09

    To understand how sensory-driven neural activity gives rise to perception, it is essential to characterize how various relay stations in the brain encode stimulus presence. Neurons in the ventral posterior lateral (VPL) nucleus of the somatosensory thalamus and in primary somatosensory cortex (S1) respond to vibrotactile stimulation with relatively slow modulations (∼100 ms) of their firing rate. In addition, faster modulations (∼10 ms) time-locked to the stimulus waveform are observed in both areas, but their contribution to stimulus detection is unknown. Furthermore, it is unclear whether VPL and S1 neurons encode stimulus presence with similar accuracy and via the same response features. To address these questions, we recorded single neurons while trained monkeys judged the presence or absence of a vibrotactile stimulus of variable amplitude, and their activity was analyzed with a unique decoding method that is sensitive to the time scale of the firing rate fluctuations. We found that the maximum detection accuracy of single neurons is similar in VPL and S1. However, VPL relies more heavily on fast rate modulations than S1, and as a consequence, the neural code in S1 is more tolerant: its performance degrades less when the readout method or the time scale of integration is suboptimal. Therefore, S1 neurons implement a more robust code, one less sensitive to the temporal integration window used to infer stimulus presence downstream. The differences between VPL and S1 responses signaling the appearance of a stimulus suggest a transformation of the neural code from thalamus to cortex.

  18. Virtual multiple errands test (VMET): a virtual reality-based tool to detect early executive functions deficit in Parkinson’s disease

    PubMed Central

    Cipresso, Pietro; Albani, Giovanni; Serino, Silvia; Pedroli, Elisa; Pallavicini, Federica; Mauro, Alessandro; Riva, Giuseppe

    2014-01-01

    Introduction: Several recent studies have pointed out that early impairment of executive functions (EFs) in Parkinson’s Disease (PD) may be a crucial marker to detect patients at risk for developing dementia. The main objective of this study was to compare the performances of PD patients with mild cognitive impairment (PD-MCI) with PD patients with normal cognition (PD-NC) and a control group (CG) using a traditional assessment of EFs and the Virtual Multiple Errands Test (VMET), a virtual reality (VR)-based tool. In order to understand which subcomponents of EFs are early impaired, this experimental study aimed to investigate specifically which instrument best discriminates among these three groups. Materials and methods: The study included three groups of 15 individuals each (for a total of 45 participants): 15 PD-NC; 15 PD-MCI, and 15 cognitively healthy individuals (CG). To assess the global neuropsychological functioning and the EFs, several tests (including the Mini Mental State Examination (MMSE), Clock Drawing Test, and Tower of London test) were administered to the participants. The VMET was used for a more ecologically valid neuropsychological evaluation of EFs. Results: Findings revealed significant differences in the VMET scores between the PD-NC patients vs. the controls. In particular, patients made more errors in the tasks of the VMET, and showed a poorer ability to use effective strategies to complete the tasks. This VMET result seems to be more sensitive in the early detection of executive deficits because these two groups did not differ in the traditional assessment of EFs (neuropsychological battery). Conclusion: This study offers initial evidence that a more ecologically valid evaluation of EFs is more likely to lead to detection of subtle executive deficits. PMID:25538578

  19. Relations between Short-term Memory Deficits, Semantic Processing, and Executive Function

    PubMed Central

    Allen, Corinne M.; Martin, Randi C.; Martin, Nadine

    2012-01-01

    Background Previous research has suggested separable short-term memory (STM) buffers for the maintenance of phonological and lexical-semantic information, as some patients with aphasia show better ability to retain semantic than phonological information and others show the reverse. Recently, researchers have proposed that deficits to the maintenance of semantic information in STM are related to executive control abilities. Aims The present study investigated the relationship of executive function abilities with semantic and phonological short-term memory (STM) and semantic processing in such patients, as some previous research has suggested that semantic STM deficits and semantic processing abilities are critically related to specific or general executive function deficits. Method and Procedures 20 patients with aphasia and STM deficits were tested on measures of short-term retention, semantic processing, and both complex and simple executive function tasks. Outcome and Results In correlational analyses, we found no relation between semantic STM and performance on simple or complex executive function tasks. In contrast, phonological STM was related to executive function performance in tasks that had a verbal component, suggesting that performance in some executive function tasks depends on maintaining or rehearsing phonological codes. Although semantic STM was not related to executive function ability, performance on semantic processing tasks was related to executive function, perhaps due to similar executive task requirements in both semantic processing and executive function tasks. Conclusions Implications for treatment and interpretations of executive deficits are discussed. PMID:22736889

  20. Development of an efficient entire-capsid-coding-region amplification method for direct detection of poliovirus from stool extracts.

    PubMed

    Arita, Minetaro; Kilpatrick, David R; Nakamura, Tomofumi; Burns, Cara C; Bukbuk, David; Oderinde, Soji B; Oberste, M Steven; Kew, Olen M; Pallansch, Mark A; Shimizu, Hiroyuki

    2015-01-01

    Laboratory diagnosis has played a critical role in the Global Polio Eradication Initiative since 1988, by isolating and identifying poliovirus (PV) from stool specimens by using cell culture as a highly sensitive system to detect PV. In the present study, we aimed to develop a molecular method to detect PV directly from stool extracts, with a high efficiency comparable to that of cell culture. We developed a method to efficiently amplify the entire capsid coding region of human enteroviruses (EVs) including PV. cDNAs of the entire capsid coding region (3.9 kb) were obtained from as few as 50 copies of PV genomes. PV was detected from the cDNAs with an improved PV-specific real-time reverse transcription-PCR system and nucleotide sequence analysis of the VP1 coding region. For assay validation, we analyzed 84 stool extracts that were positive for PV in cell culture and detected PV genomes from 100% of the extracts (84/84 samples) with this method in combination with a PV-specific extraction method. PV could be detected in 2/4 stool extract samples that were negative for PV in cell culture. In PV-positive samples, EV species C viruses were also detected with high frequency (27% [23/86 samples]). This method would be useful for direct detection of PV from stool extracts without using cell culture. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  1. Executive Functioning and Processing Speed in Age-Related Differences in Memory: Contribution of a Coding Task

    ERIC Educational Resources Information Center

    Baudouin, Alexia; Clarys, David; Vanneste, Sandrine; Isingrini, Michel

    2009-01-01

    The aim of the present study was to examine executive dysfunctioning and decreased processing speed as potential mediators of age-related differences in episodic memory. We compared the performances of young and elderly adults in a free-recall task. Participants were also given tests to measure executive functions and perceptual processing speed…

  2. Ffuzz: Towards full system high coverage fuzz testing on binary executables

    PubMed Central

    2018-01-01

Bugs and vulnerabilities in binary executables threaten cyber security. Current discovery methods, like fuzz testing, symbolic execution and manual analysis, each have advantages and disadvantages when exercising the deeper code areas of binary executables to find more bugs. In this paper, we designed and implemented a hybrid automatic bug finding tool—Ffuzz—on top of fuzz testing and selective symbolic execution. It targets full system software stack testing including both the user space and kernel space. Combining these two mainstream techniques enables us to achieve higher coverage and avoid getting stuck in both fuzz testing and symbolic execution. We also proposed two key optimizations to improve the efficiency of full system testing. We evaluated the efficiency and effectiveness of our method on real-world binary software and 844 memory corruption vulnerable programs in the Juliet test suite. The results show that Ffuzz can discover software bugs in the full system software stack effectively and efficiently. PMID:29791469
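The fuzz-testing half of such a hybrid can be reduced to a toy: a coverage-guided mutation loop keeps any input that reaches a new branch, so a multi-byte magic value guarding a bug is discovered one byte at a time. This hypothetical sketch uses exhaustive one-byte mutations for determinism and omits the symbolic-execution half entirely; it is not Ffuzz's implementation:

```python
def target(data, cov):
    # toy target: records branch coverage and "crashes" on the magic input
    if len(data) > 0 and data[0] == ord("B"):
        cov.add(0)
        if len(data) > 1 and data[1] == ord("U"):
            cov.add(1)
            if len(data) > 2 and data[2] == ord("G"):
                cov.add(2)
                if len(data) > 3 and data[3] == ord("!"):
                    raise RuntimeError("crash")

def fuzz(max_rounds=8):
    corpus = [b"\x00\x00\x00\x00"]   # seed corpus
    seen = set()                     # branches covered so far
    for _ in range(max_rounds):
        for parent in list(corpus):
            for pos in range(len(parent)):
                for val in range(256):
                    child = parent[:pos] + bytes([val]) + parent[pos + 1:]
                    cov = set()
                    try:
                        target(child, cov)
                    except RuntimeError:
                        return child          # crashing input found
                    if not cov <= seen:       # new coverage: keep the input
                        seen |= cov
                        corpus.append(child)
    return None

crash = fuzz()
```

In the hybrid approach the paper describes, selective symbolic execution would replace the brute-force inner loop: when fuzzing stalls on a branch, a solver derives the input bytes needed to flip it.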

  3. Ffuzz: Towards full system high coverage fuzz testing on binary executables.

    PubMed

    Zhang, Bin; Ye, Jiaxi; Bi, Xing; Feng, Chao; Tang, Chaojing

    2018-01-01

Bugs and vulnerabilities in binary executables threaten cyber security. Current discovery methods, like fuzz testing, symbolic execution and manual analysis, each have advantages and disadvantages when exercising the deeper code areas of binary executables to find more bugs. In this paper, we designed and implemented a hybrid automatic bug finding tool, Ffuzz, on top of fuzz testing and selective symbolic execution. It targets full system software stack testing including both the user space and kernel space. Combining these two mainstream techniques enables us to achieve higher coverage and avoid getting stuck in both fuzz testing and symbolic execution. We also proposed two key optimizations to improve the efficiency of full system testing. We evaluated the efficiency and effectiveness of our method on real-world binary software and 844 memory corruption vulnerable programs in the Juliet test suite. The results show that Ffuzz can discover software bugs in the full system software stack effectively and efficiently.

  4. High Frequency Scattering Code in a Distributed Processing Environment

    DTIC Science & Technology

    1991-06-01

    ...use of automated analysis tools is indicated. One tool developed by Pacific-Sierra Research Corporation and marketed by Intel Corporation ... XQ: EXECUTE CODE; EN: END CODE. This input deck differs from that in the manual because the "PP" option is disabled in the modified code.

  5. Securing mobile code.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Link, Hamilton E.; Schroeppel, Richard Crabtree; Neumann, William Douglas

    2004-10-01

If software is designed so that the software can issue functions that will move that software from one computing platform to another, then the software is said to be 'mobile'. There are two general areas of security problems associated with mobile code. The 'secure host' problem involves protecting the host from malicious mobile code. The 'secure mobile code' problem, on the other hand, involves protecting the code from malicious hosts. This report focuses on the latter problem. We have found three distinct camps of opinions regarding how to secure mobile code. There are those who believe special distributed hardware is necessary, those who believe special distributed software is necessary, and those who believe neither is necessary. We examine all three camps, with a focus on the third. In the distributed software camp we examine some commonly proposed techniques including Java, D'Agents and Flask. For the specialized hardware camp, we propose a cryptographic technique for 'tamper-proofing' code over a large portion of the software/hardware life cycle by careful modification of current architectures. This method culminates by decrypting/authenticating each instruction within a physically protected CPU, thereby protecting against subversion by malicious code. Our main focus is on the camp that believes that neither specialized software nor hardware is necessary. We concentrate on methods of code obfuscation to render an entire program or a data segment on which a program depends incomprehensible. The hope is to prevent or at least slow down reverse engineering efforts and to prevent goal-oriented attacks on the software and execution. The field of obfuscation is still in a state of development with the central problem being the lack of a basis for evaluating the protection schemes. We give a brief introduction to some of the main ideas in the field, followed by an in depth analysis of a technique called 'white-boxing'. We put forth some new attacks and

  6. Deep Constrained Siamese Hash Coding Network and Load-Balanced Locality-Sensitive Hashing for Near Duplicate Image Detection.

    PubMed

    Hu, Weiming; Fan, Yabo; Xing, Junliang; Sun, Liang; Cai, Zhaoquan; Maybank, Stephen

    2018-09-01

    We construct a new efficient near duplicate image detection method using a hierarchical hash code learning neural network and load-balanced locality-sensitive hashing (LSH) indexing. We propose a deep constrained siamese hash coding neural network combined with deep feature learning. Our neural network is able to extract effective features for near duplicate image detection. The extracted features are used to construct a LSH-based index. We propose a load-balanced LSH method to produce load-balanced buckets in the hashing process. The load-balanced LSH significantly reduces the query time. Based on the proposed load-balanced LSH, we design an effective and feasible algorithm for near duplicate image detection. Extensive experiments on three benchmark data sets demonstrate the effectiveness of our deep siamese hash encoding network and load-balanced LSH.
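The indexing half can be sketched with classic random-hyperplane LSH. Note that the paper's hash codes are learned by the siamese network and its buckets are explicitly load-balanced, neither of which this hypothetical sketch attempts:

```python
import random

# Hypothetical sketch of locality-sensitive hashing with random hyperplanes:
# each bit of the code records which side of a random plane the vector lies on,
# so nearby feature vectors receive the same code with high probability.
random.seed(7)
DIM, BITS = 16, 8
planes = [[random.gauss(0.0, 1.0) for _ in range(DIM)] for _ in range(BITS)]

def hash_code(vec):
    code = 0
    for b, plane in enumerate(planes):
        if sum(p * v for p, v in zip(plane, vec)) >= 0.0:
            code |= 1 << b
    return code

buckets = {}

def index_image(features, label):
    buckets.setdefault(hash_code(features), []).append(label)

def candidates(features):
    # near-duplicate candidates: items whose code collides with the query's
    return buckets.get(hash_code(features), [])

# identical feature vectors always collide; near ones collide with high prob.
index_image([0.3] * DIM, "photo_a")
```

A verification step would then compare raw features within each candidate bucket; the paper's load-balanced variant additionally reshapes the buckets so that no single bucket dominates query time.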

  7. Embedding Temporal Constraints For Coordinated Execution in Habitat Automation

    NASA Technical Reports Server (NTRS)

    Morris, Paul; Schwabacher, Mark; Dalal, Michael; Fry, Charles

    2013-01-01

    Future NASA plans call for long-duration deep space missions with human crews. Because of light-time delay and other considerations, increased autonomy will be needed. This will necessitate integration of tools in such areas as anomaly detection, diagnosis, planning, and execution. In this paper we investigate an approach that integrates planning and execution by embedding planner-derived temporal constraints in an execution procedure. To avoid the need for propagation, we convert the temporal constraints to dispatchable form. We handle some uncertainty in the durations without it affecting the execution; larger variations may cause activities to be skipped.
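The planner-derived temporal constraints can be pictured as a simple temporal network (STN): all-pairs shortest-path tightening yields the minimal network, from which execution can read event-time bounds locally instead of re-propagating. A hypothetical sketch (true dispatchable form additionally prunes dominated edges, omitted here):

```python
INF = float("inf")

# Simple temporal network over events 0..n-1, with w[i][j] = upper bound on
# (time_j - time_i). Floyd-Warshall tightening produces the minimal network
# (or None if the constraints are inconsistent).
def minimal_network(n, constraints):
    w = [[INF] * n for _ in range(n)]
    for i in range(n):
        w[i][i] = 0
    for i, j, lo, hi in constraints:      # lo <= time_j - time_i <= hi
        w[i][j] = min(w[i][j], hi)
        w[j][i] = min(w[j][i], -lo)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if w[i][k] + w[k][j] < w[i][j]:
                    w[i][j] = w[i][k] + w[k][j]
    for i in range(n):
        if w[i][i] < 0:
            return None                   # negative cycle: inconsistent
    return w

# events (hypothetical names): 0 = start, 1 = heater on, 2 = sample ready
net = minimal_network(3, [(0, 1, 0, 5), (1, 2, 10, 20), (0, 2, 0, 22)])
# execution-time bounds for event 2: [-net[2][0], net[0][2]] = [10, 22]
```

During execution, each dispatched event then simply narrows its neighbors' windows, which is what lets the procedure absorb small duration variations without skipping activities.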

  8. Synthesizing Safety Conditions for Code Certification Using Meta-Level Programming

    NASA Technical Reports Server (NTRS)

    Eusterbrock, Jutta

    2004-01-01

In code certification the code consumer publishes a safety policy and the code producer generates a proof that the produced code is in compliance with the published safety policy. In this paper, a novel viewpoint is taken: a re-use-oriented implementation framework for code certification. It adopts ingredients from Necula's approach for proof-carrying code, but in this work safety properties can be analyzed on a higher code level than assembly language instructions. It consists of three parts: (1) The specification language is extended to include generic pre-conditions that shall ensure safety at all states that can be reached during program execution. Actual safety requirements can be expressed by providing domain-specific definitions for the generic predicates which act as interface to the environment. (2) The Floyd-Hoare inductive assertion method is refined to obtain proof rules that allow the derivation of the proof obligations in terms of the generic safety predicates. (3) A meta-interpreter is designed and experimentally implemented that enables automatic synthesis of proof obligations for submitted programs by applying the modified Floyd-Hoare rules. The proof obligations have two separate conjuncts, one for functional correctness and another for the generic safety obligations. Proof of the generic obligations, having provided the actual safety definitions as context, ensures domain-specific safety of program execution in a particular environment and is simpler than full program verification.

  9. Mobile, hybrid Compton/coded aperture imaging for detection, identification and localization of gamma-ray sources at stand-off distances

    NASA Astrophysics Data System (ADS)

    Tornga, Shawn R.

The Stand-off Radiation Detection System (SORDS) program is an Advanced Technology Demonstration (ATD) project through the Department of Homeland Security's Domestic Nuclear Detection Office (DNDO) with the goal of detection, identification and localization of weak radiological sources in the presence of large dynamic backgrounds. The Raytheon-SORDS Tri-Modal Imager (TMI) is a mobile truck-based, hybrid gamma-ray imaging system able to quickly detect, identify, and localize radiation sources at standoff distances through improved sensitivity while minimizing the false alarm rate. Reconstruction of gamma-ray sources is performed using a combination of two imaging modalities: coded aperture and Compton scatter imaging. The TMI consists of 35 sodium iodide (NaI) crystals, 5×5×2 in³ each, arranged in a random coded aperture mask array (CA), followed by 30 position-sensitive NaI bars, each 24×2.5×3 in³, called the detection array (DA). The CA array acts as both a coded aperture mask and scattering detector for Compton events. The large-area DA array acts as a collection detector for both Compton scattered events and coded aperture events. In this thesis, developed coded aperture, Compton and hybrid imaging algorithms will be described along with their performance. It will be shown that multiple imaging modalities can be fused to improve detection sensitivity over a broader energy range than either alone. Since the TMI is a moving system, peripheral data, such as a Global Positioning System (GPS) and Inertial Navigation System (INS), must also be incorporated. A method of adapting static imaging algorithms to a moving platform has been developed. Also, algorithms were developed in parallel with detector hardware, through the use of extensive simulations performed with the Geometry and Tracking Toolkit v4 (GEANT4). Simulations have been well validated against measured data.
Results of image reconstruction algorithms at various speeds and distances will be presented as well as
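The coded-aperture half of the reconstruction can be shown in one dimension: a point source casts a cyclically shifted copy of the mask onto the detector, and correlating the recording with a matched decoding array collapses that shadow back to a sharp peak at the source position. A hypothetical toy; the TMI's mask is two-dimensional and random, and Compton data is fused on top of this:

```python
# 1-D toy of coded-aperture correlation decoding with a length-7 URA-style
# mask (open positions = quadratic residues mod 7).
P = 7
mask = [0, 1, 1, 0, 1, 0, 0]
decode = [1 if (i == 0 or mask[i]) else -1 for i in range(P)]

def record(source_shift):
    # a point source casts a cyclically shifted mask shadow on the detector
    return [mask[(j - source_shift) % P] for j in range(P)]

def reconstruct(r):
    # cyclic cross-correlation of the recording with the decoding array
    return [sum(r[j] * decode[(j - t) % P] for j in range(P)) for t in range(P)]

img = reconstruct(record(3))
peak = max(range(P), key=img.__getitem__)   # sharp peak at the source position
```

Because the system is linear, a superposition of sources yields a superposition of peaks, which is what lets a single exposure image an extended scene.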

  10. Error probability for RFID SAW tags with pulse position coding and peak-pulse detection.

    PubMed

    Shmaliy, Yuriy S; Plessky, Victor; Cerda-Villafaña, Gustavo; Ibarra-Manzano, Oscar

    2012-11-01

This paper addresses the code reading error probability (EP) in radio-frequency identification (RFID) SAW tags with pulse position coding (PPC) and peak-pulse detection. EP is found in a most general form, assuming M groups of codes with N slots each and allowing individual SNRs in each slot. The basic case of zero signal in all off-pulses and equal signals in all on-pulses is investigated in detail. We show that if a SAW-tag with PPC is designed such that the spurious responses are attenuated by more than 20 dB below on-pulses, then EP can be achieved at the level of 10^-8 (one false read per 10^8 readings) with SNR > 17 dB for any reasonable M and N. The tag reader range is estimated as a function of the transmitted power and EP.

  11. The procedure execution manager and its application to Advanced Photon Source operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borland, M.

    1997-06-01

The Procedure Execution Manager (PEM) combines a complete scripting environment for coding accelerator operation procedures with a manager application for executing and monitoring the procedures. PEM is based on Tcl/Tk, a supporting widget library, and the dp-tcl extension for distributed processing. The scripting environment provides support for distributed, parallel execution of procedures along with join and abort operations. Nesting of procedures is supported, permitting the same code to run as a top-level procedure under operator control or as a subroutine under control of another procedure. The manager application allows an operator to execute one or more procedures in automatic, semi-automatic, or manual modes. It also provides a standard way for operators to interact with procedures. A number of successful applications of PEM to accelerator operations have been made to date. These include start-up, shutdown, and other control of the positron accumulator ring (PAR), low-energy transport (LET) lines, and the booster rf systems. The PAR/LET procedures make nested use of PEM's ability to run parallel procedures. There are also a number of procedures to guide and assist tune-up operations, to make accelerator physics measurements, and to diagnose equipment. Because of the success of the existing procedures, expanded use of PEM is planned.

  12. Deep Long-period Seismicity Beneath the Executive Committee Range, Marie Byrd Land, Antarctica, Studied Using Subspace Detection

    NASA Astrophysics Data System (ADS)

    Aster, R. C.; McMahon, N. D.; Myers, E. K.; Lough, A. C.

    2015-12-01

Lough et al. (2014) first detected deep sub-icecap magmatic events beneath the Executive Committee Range volcanoes of Marie Byrd Land. Here, we extend the identification and analysis of these events in space and time utilizing subspace detection. Subspace detectors provide a highly effective methodology for studying events within seismic swarms that have similar moment tensor and Green's function characteristics and are particularly effective for identifying low signal-to-noise events. Marie Byrd Land (MBL) is an extremely remote continental region that is nearly completely covered by the West Antarctic Ice Sheet (WAIS). The southern extent of Marie Byrd Land lies within the West Antarctic Rift System (WARS), which includes the volcanic Executive Committee Range (ECR). The ECR shows north-to-south progression of volcanism across the WARS during the Holocene. In 2013, analysis of POLENET/ANET seismic data identified two swarms of seismic activity that occurred in 2010 and 2011. These events have been interpreted as deep, long-period (DLP) earthquakes based on depth (25-40 km) and low frequency content. The DLP events in MBL lie beneath an inferred sub-WAIS volcanic edifice imaged with ice penetrating radar and have been interpreted as a present location of magmatic intrusion. The magmatic swarm activity in MBL provides a promising target for advanced subspace detection and temporal, spatial, and event size analysis of an extensive deep long period earthquake swarm using a remote seismographic network. We utilized a catalog of 1,370 traditionally identified DLP events to construct subspace detectors for the six nearest stations and analyzed two years of data spanning 2010-2011. Association of these detections into events resulted in an approximate ten-fold increase in number of locatable earthquakes. In addition to the two previously identified swarms during early 2010 and early 2011, we find sustained activity throughout the two years of study that includes several previously
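A subspace detector's statistic reduces, for a single template, to the fraction of window energy captured by the template's span; this makes the detector insensitive to event amplitude, which is what lets it pull small, similar events out of a swarm. A hypothetical rank-1 sketch, not the study's implementation:

```python
import math

# Rank-1 sketch of a subspace detector. With one unit-norm template u, the
# statistic for a sliding data window x is
#     c = (u . x)^2 / (x . x),
# the fraction of window energy captured by the template's subspace, so the
# detection decision does not depend on event amplitude.
def detect(data, template, threshold=0.8):
    norm = math.sqrt(sum(t * t for t in template))
    u = [t / norm for t in template]
    w, hits = len(u), []
    for start in range(len(data) - w + 1):
        x = data[start:start + w]
        energy = sum(v * v for v in x)
        if energy == 0.0:
            continue                     # empty window: nothing to detect
        proj = sum(a * b for a, b in zip(u, x))
        if proj * proj / energy >= threshold:
            hits.append(start)
    return hits

# a small event buried in an otherwise quiet trace (values invented)
trace = [0.0] * 12
trace[5:8] = [3.0, -6.0, 3.0]
hits = detect(trace, [1.0, -2.0, 1.0])
```

A full subspace detector replaces the single template with an orthonormal basis (typically from the SVD of aligned catalog waveforms), projecting each window onto the whole basis.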

  13. SAP- FORTRAN STATIC SOURCE CODE ANALYZER PROGRAM (IBM VERSION)

    NASA Technical Reports Server (NTRS)

    Manteufel, R.

    1994-01-01

    The FORTRAN Static Source Code Analyzer program, SAP, was developed to automatically gather statistics on the occurrences of statements and structures within a FORTRAN program and to provide for the reporting of those statistics. Provisions have been made for weighting each statistic and to provide an overall figure of complexity. Statistics, as well as figures of complexity, are gathered on a module by module basis. Overall summed statistics are also accumulated for the complete input source file. SAP accepts as input syntactically correct FORTRAN source code written in the FORTRAN 77 standard language. In addition, code written using features in the following languages is also accepted: VAX-11 FORTRAN, IBM S/360 FORTRAN IV Level H Extended; and Structured FORTRAN. The SAP program utilizes two external files in its analysis procedure. A keyword file allows flexibility in classifying statements and in marking a statement as either executable or non-executable. A statistical weight file allows the user to assign weights to all output statistics, thus allowing the user flexibility in defining the figure of complexity. The SAP program is written in FORTRAN IV for batch execution and has been implemented on a DEC VAX series computer under VMS and on an IBM 370 series computer under MVS. The SAP program was developed in 1978 and last updated in 1985.
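SAP's keyword-file-plus-weight-file design can be sketched directly: a keyword table classifies each statement (and marks it executable or not), and per-statistic weights are summed into the figure of complexity. A hypothetical Python sketch (SAP itself is FORTRAN IV, and these keywords and weights are invented):

```python
# Toy static analyzer in SAP's style: a keyword table classifies statements
# and flags them executable/non-executable; a weight table turns the counts
# into a single figure of complexity.
KEYWORDS = {                    # keyword -> (statistic name, executable?)
    "IF":        ("if",          True),
    "DO":        ("do_loop",     True),
    "GOTO":      ("goto",        True),
    "CALL":      ("call",        True),
    "DIMENSION": ("declaration", False),
    "INTEGER":   ("declaration", False),
}
WEIGHTS = {"if": 2.0, "do_loop": 3.0, "goto": 4.0, "call": 1.0,
           "declaration": 0.0, "other": 1.0}

def analyze(source):
    counts, executable = {}, 0
    for line in source.splitlines():
        body = line[6:].strip().upper()          # cols 1-6: label/continuation
        if not body or line[:1].upper() == "C":  # blank line or comment card
            continue
        word = body.replace(" ", "")             # fixed-form FORTRAN ignores blanks
        stat, is_exec = "other", True
        for kw, (name, ex) in KEYWORDS.items():
            if word.startswith(kw):
                stat, is_exec = name, ex
                break
        counts[stat] = counts.get(stat, 0) + 1
        executable += is_exec
    complexity = sum(WEIGHTS.get(s, 1.0) * n for s, n in counts.items())
    return counts, executable, complexity

SRC = """C     TOY INPUT
      INTEGER I
      DO 10 I = 1, 3
      CALL WORK(I)
   10 CONTINUE
      IF (I .GT. 2) GO TO 20
   20 END
"""
counts, n_exec, figure = analyze(SRC)
```

Prefix matching on blank-stripped statements is a real quirk of classifying fixed-form FORTRAN; a production tool needs more care (e.g. DOUBLE PRECISION vs. DO), which is exactly what SAP's external keyword file parameterizes.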

  14. Frontal Assessment Battery (FAB) is a simple tool for detecting executive deficits in chronic cannabis users.

    PubMed

    Fontes, Maria Alice; Bolla, Karen I; Cunha, Paulo Jannuzzi; Almeida, Priscila Previato; Jungerman, Flávia; Laranjeira, Ronaldo Ramos; Bressan, Rodrigo A; Lacerda, Acioly L T

    2011-06-01

Cannabis is the most used illicit drug in the world, and its use has been associated with prefrontal cortex (PFC) dysfunction, including deficits in executive functions (EF). Considering that EF may influence treatment outcome, it would be interesting to have a brief neuropsychological battery to assess EF in chronic cannabis users (CCU). In the present study, the Frontal Assessment Battery (FAB), a brief, easy-to-use neuropsychological instrument aimed at evaluating EF, was used to evaluate cognitive functioning of CCU. We evaluated 107 abstinent CCU with the FAB and compared with 44 controls matched for age, estimated IQ, and years of education. CCU performed poorly as compared to controls (FAB total score = 16.53 vs. 17.09, p < .05). CCU also performed poorly on the Motor Programming subtest (2.47 vs. 2.73, p < .05). This study examined effects of cannabis on executive functioning and showed evidence that the FAB is sensitive enough to detect EF deficits in early abstinent chronic cannabis users. Clinical significance of these findings remains to be investigated in further longitudinal studies. FAB may be useful as a screening instrument to evaluate the necessity for a complete neuropsychological assessment in this population.

  15. Different effects of executive and visuospatial working memory on visual consciousness.

    PubMed

    De Loof, Esther; Poppe, Louise; Cleeremans, Axel; Gevers, Wim; Van Opstal, Filip

    2015-11-01

    Consciousness and working memory are two widely studied cognitive phenomena. Although they have been closely tied on a theoretical and neural level, empirical work that investigates their relation is largely lacking. In this study, the relationship between visual consciousness and different working memory components is investigated by using a dual-task paradigm. More specifically, while participants were performing a visual detection task to measure their visual awareness threshold, they had to concurrently perform either an executive or visuospatial working memory task. We hypothesized that visual consciousness would be hindered depending on the type and the size of the load in working memory. Results showed that maintaining visuospatial content in working memory hinders visual awareness, irrespective of the amount of information maintained. By contrast, the detection threshold was progressively affected under increasing executive load. Interestingly, increasing executive load had a generic effect on detection speed, calling into question whether its obstructing effect is specific to the visual awareness threshold. Together, these results indicate that visual consciousness depends differently on executive and visuospatial working memory.

  16. The role of the insula in intuitive expert bug detection in computer code: an fMRI study.

    PubMed

    Castelhano, Joao; Duarte, Isabel C; Ferreira, Carlos; Duraes, Joao; Madeira, Henrique; Castelo-Branco, Miguel

    2018-05-09

    Software programming is a complex and relatively recent human activity, involving the integration of mathematical, recursive thinking and language processing. The neural correlates of this recent human activity are still poorly understood. Error monitoring during this type of task, requiring the integration of language, logical symbol manipulation and other mathematical skills, is particularly challenging. We therefore aimed to investigate the neural correlates of decision-making during source code understanding and mental manipulation in professional participants with high expertise. The present fMRI study directly addressed error monitoring during source code comprehension, expert bug detection and decision-making. We used C code, which triggers the same sort of processing irrespective of the native language of the programmer. We discovered a distinct role for the insula in bug monitoring and detection and a novel connectivity pattern that goes beyond the expected activation pattern evoked by source code understanding in semantic language and mathematical processing regions. Importantly, insula activity levels were critically related to the quality of error detection, involving intuition, as signalled by reported initial bug suspicion, prior to final decision and bug detection. Activity in this salience network (SN) region evoked by bug suspicion was predictive of bug detection precision, suggesting that it encodes the quality of the behavioral evidence. Connectivity analysis provided evidence for top-down circuit "reutilization" stemming from anterior cingulate cortex (BA32), a core region in the SN that evolved for complex error monitoring such as required for this type of recent human activity. Cingulate (BA32) and anterolateral (BA10) frontal regions causally modulated decision processes in the insula, which in turn was related to activity of math processing regions in early parietal cortex. In other words, earlier brain regions used during evolution for

  17. CAFNA®, coded aperture fast neutron analysis for contraband detection: Preliminary results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, L.; Lanza, R.C.

    1999-12-01

    The authors have developed a near-field coded aperture imaging system for use with fast neutron techniques as a tool for the detection of contraband and hidden explosives through nuclear elemental analysis. The technique relies on the prompt gamma rays produced by fast neutron interactions with the object being examined. The position of the nuclear elements is determined by the location of the gamma emitters. Existing fast neutron techniques are inefficient: in Pulsed Fast Neutron Analysis (PFNA), neutrons are used with very low efficiency, and in Fast Neutron Analysis (FNA), the sensitivity for detection of the signature gamma rays is very low. For the Coded Aperture Fast Neutron Analysis (CAFNA®) the authors have developed, the efficiency of both using the probing fast neutrons and detecting the prompt gamma rays is high. For a probed volume of n³ volume elements (voxels) in a cube of n resolution elements on a side, they can compare the sensitivity with other neutron probing techniques. As compared to PFNA, the improvement in neutron utilization is n², where the total number of voxels in the object being examined is n³. Compared to FNA, the improvement in gamma-ray imaging is proportional to the total open area of the coded aperture plane; a typical value is n²/2, where n² is the number of total detector resolution elements or the number of pixels in an object layer. It should be noted that the actual signal-to-noise ratio of a system also depends on the nature and distribution of background events, and this comparison may somewhat reduce the effective sensitivity of CAFNA. The authors have performed analysis, Monte Carlo simulations, and preliminary experiments using low- and high-energy gamma-ray sources. The results show that a high-sensitivity 3-D contraband imaging and detection system can be realized by using CAFNA.
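
    The quoted sensitivity gains follow directly from the voxel geometry; a quick arithmetic check for an illustrative resolution (n here is an assumed value, not one from the paper):

```python
# Worked example of the quoted sensitivity gains for an object resolved
# into an n x n x n cube of voxels (illustrative n, not from the paper).
n = 32
voxels = n**3                 # total voxels probed
gain_vs_pfna = n**2           # neutron-utilization improvement vs. PFNA
gain_vs_fna = n**2 / 2        # gamma-imaging improvement ~ open aperture area
print(voxels, gain_vs_pfna, gain_vs_fna)   # 32768 1024 512.0
```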

  18. Ultrasonic Array for Obstacle Detection Based on CDMA with Kasami Codes

    PubMed Central

    Diego, Cristina; Hernández, Álvaro; Jiménez, Ana; Álvarez, Fernando J.; Sanz, Rebeca; Aparicio, Joaquín

    2011-01-01

    This paper presents the design of an ultrasonic array for obstacle detection based on Phased Array (PA) techniques, which steer the acoustic beam through the environment electronically rather than by mechanical means. The transmission of every element in the array has been encoded according to Code Division Multiple Access (CDMA), which allows multiple beams to be transmitted simultaneously. Together, these features enable a parallel scanning system that not only improves the image rate but also achieves longer inspection distances in comparison with conventional PA techniques. PMID:22247675
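
    The CDMA idea (each element transmits its own code, and a matched correlator separates the overlapping echoes) can be sketched as follows. The short random ±1 codes and the delays are illustrative stand-ins; true Kasami sequences have better-controlled cross-correlation than random codes.

```python
# Minimal sketch of CDMA-style separation of two simultaneous ultrasonic
# emissions using binary +/-1 codes and cross-correlation. The random
# codes below stand in for Kasami sequences.
import random

def correlate(signal, code):
    """Sliding cross-correlation of `code` against `signal`."""
    m = len(code)
    return [sum(signal[i + j] * code[j] for j in range(m))
            for i in range(len(signal) - m + 1)]

rng = random.Random(1)
code_a = [rng.choice((-1, 1)) for _ in range(63)]
code_b = [rng.choice((-1, 1)) for _ in range(63)]

# Two array elements emit simultaneously; their echoes return with
# different delays (ranges), and the receiver sees the superposition.
rx = [0] * 200
for delay, code in ((20, code_a), (120, code_b)):
    for j, chip in enumerate(code):
        rx[delay + j] += chip

corr_a = correlate(rx, code_a)
corr_b = correlate(rx, code_b)
print(corr_a.index(max(corr_a)), corr_b.index(max(corr_b)))   # 20 120
```

    Each correlator peaks only at its own transmitter's delay, which is what lets multiple beams share the medium at once.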

  19. Automatic Detection of Frontal Face Midline by Chain-coded Merlin-Farber Hough Transform

    NASA Astrophysics Data System (ADS)

    Okamoto, Daichi; Ohyama, Wataru; Wakabayashi, Tetsushi; Kimura, Fumitaka

    We propose a novel approach for detecting the facial midline (facial symmetry axis) from a frontal face image. The facial midline has several applications, for instance reducing the computational cost of facial feature extraction (FFE) and postoperative assessment for cosmetic or dental surgery. The proposed method detects the facial midline of a frontal face from an edge image as the symmetry axis using the Merlin-Farber Hough transform (MFHT). A new performance improvement scheme for midline detection by MFHT is also presented. The main concept of the proposed scheme is suppression of redundant votes in the Hough parameter space by introducing a chain-code representation of the binary edge image. Experimental results on a dataset containing 2409 images from the FERET database indicate that the proposed algorithm improves the accuracy of midline detection from 89.9% to 95.1% for face images with different scales and rotations.
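
    The pairwise-voting idea behind this family of methods can be sketched compactly: every pair of edge points votes for its perpendicular bisector, and the mirror axis of a symmetric point set accumulates the most votes. This toy omits the paper's chain-code vote suppression.

```python
# Toy symmetry-axis detection in the spirit of the Merlin-Farber Hough
# transform: each pair of edge points votes for its perpendicular bisector
# in (theta, rho) form; the dominant bin is the symmetry axis.
from collections import Counter
from math import atan2, cos, sin

def symmetry_axis(points):
    votes = Counter()
    pts = list(points)
    for i in range(len(pts)):
        for j in range(i + 1, len(pts)):
            (x1, y1), (x2, y2) = pts[i], pts[j]
            theta = atan2(y2 - y1, x2 - x1)        # normal of the bisector
            mx, my = (x1 + x2) / 2, (y1 + y2) / 2  # bisector passes midpoint
            rho = mx * cos(theta) + my * sin(theta)
            votes[(round(theta, 2), round(rho, 1))] += 1  # coarse binning
    return votes.most_common(1)[0][0]

# Points mirror-symmetric about the vertical line x = 5.
pts = [(5 - d, y) for d in (1, 2, 3) for y in (0, 1, 2)]
pts += [(5 + d, y) for d in (1, 2, 3) for y in (0, 1, 2)]
theta, rho = symmetry_axis(pts)
print(theta, rho)   # 0.0 5.0  (a vertical axis at x = 5)
```

    The suppression scheme in the paper avoids casting most of the redundant off-axis votes in the first place, which this brute-force version happily wastes time on.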

  20. Investigating the effects of caffeine on executive functions using traditional Stroop and a new ecologically-valid virtual reality task, the Jansari assessment of Executive Functions (JEF(©)).

    PubMed

    Soar, K; Chapman, E; Lavan, N; Jansari, A S; Turner, J J D

    2016-10-01

    Caffeine has been shown to have effects on certain areas of cognition, but research on its effects on executive functioning is limited and inconsistent. One reason could be the need for a more sensitive measure to detect the effects of caffeine on executive function. This study used a new non-immersive virtual reality assessment of executive functions known as JEF(©) (the Jansari Assessment of Executive Function) alongside the 'classic' Stroop Colour-Word task to assess the effects of a normal dose of caffeinated coffee on executive function. Using a double-blind, counterbalanced within-participants procedure, 43 participants were administered either caffeinated or decaffeinated coffee and completed the JEF(©) and Stroop tasks, as well as a subjective mood scale and blood pressure measurements, pre- and post-condition on two separate occasions a week apart. JEF(©) yields measures for eight separate aspects of executive functions, in addition to a total average score. Findings indicate that performance was significantly improved on the planning, creative thinking, event-, time- and action-based prospective memory measures, as well as the total JEF(©) score, following caffeinated coffee relative to decaffeinated coffee. The caffeinated beverage significantly decreased reaction times on the Stroop task, but there was no effect on Stroop interference. The results provide further support for the effects of a caffeinated beverage on cognitive functioning. In particular, the study demonstrated the ability of JEF(©) to detect the effects of caffeine across a number of executive functioning constructs, which were not shown in the Stroop task, suggesting that executive functioning improvements resulting from a 'typical' dose of caffeine may only be detected by the use of more real-world, ecologically valid tasks. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Auditory detection of ultrasonic coded transmitters by seals and sea lions.

    PubMed

    Cunningham, Kane A; Hayes, Sean A; Michelle Wargo Rub, A; Reichmuth, Colleen

    2014-04-01

    Ultrasonic coded transmitters (UCTs) are high-frequency acoustic tags that are often used to conduct survivorship studies of vulnerable fish species. Recent observations of differential mortality in tag control studies suggest that fish instrumented with UCTs may be selectively targeted by marine mammal predators, thereby skewing valuable survivorship data. In order to better understand the ability of pinnipeds to detect UCT outputs, behavioral high-frequency hearing thresholds were obtained from a trained harbor seal (Phoca vitulina) and a trained California sea lion (Zalophus californianus). Thresholds were measured for extended (500 ms) and brief (10 ms) 69 kHz narrowband stimuli, as well as for a stimulus recorded directly from a Vemco V16-3H UCT, which consisted of eight 10 ms, 69 kHz pure-tone pulses. Detection thresholds for the harbor seal were as expected based on existing audiometric data for this species, while the California sea lion was much more sensitive than predicted. Given measured detection thresholds of 113 dB re 1 μPa and 124 dB re 1 μPa, respectively, both species are likely able to detect acoustic outputs of the Vemco V16-3H under water from distances exceeding 200 m in typical natural conditions, suggesting that these species are capable of using UCTs to detect free-ranging fish.

  2. A concatenated coding scheme for error control

    NASA Technical Reports Server (NTRS)

    Lin, S.

    1985-01-01

    A concatenated coding scheme for error control in data communications was analyzed. The inner code is used for both error correction and detection, while the outer code is used only for error detection. A retransmission is requested if either the inner code decoder fails to make a successful decoding or the outer code decoder detects the presence of errors after the inner code decoding. The probability of undetected error of the proposed scheme is derived, and an efficient method for computing this probability is presented. The throughput efficiency of the proposed error control scheme, incorporated with a selective-repeat ARQ retransmission strategy, is analyzed.
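
    The retransmission logic can be illustrated with deliberately weak stand-in codes (a repetition-3 inner code for correction, a single parity bit as the outer detector), not the codes the paper analyzes:

```python
# Toy simulation of the inner/outer retransmission loop: the inner
# repetition-3 code corrects isolated bit flips, the outer parity bit
# detects residual errors, and the frame is resent until the outer
# detector accepts it. Illustrative codes only.
import random

def channel(bits, p, rng):
    """Binary symmetric channel: flip each bit with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

def transmit(payload, p, rng, max_tries=100):
    """Resend the frame until the outer detector accepts it."""
    frame = payload + [sum(payload) % 2]               # outer code: parity bit
    for tries in range(1, max_tries + 1):
        coded = [b for b in frame for _ in range(3)]   # inner code: repeat-3
        received = channel(coded, p, rng)
        decoded = [1 if sum(received[i:i + 3]) >= 2 else 0   # majority vote
                   for i in range(0, len(received), 3)]
        if sum(decoded) % 2 == 0:                      # outer detector passes
            return decoded[:-1], tries
    raise RuntimeError("retransmission limit reached")

data = [1, 0, 1, 1, 0, 0, 1, 0]
out, tries = transmit(data, p=0.05, rng=random.Random(0))
print(out == data, tries)
```

    An even number of residual bit errors slips past the parity check undetected; the probability of exactly that kind of event is what the abstract's analysis derives for the real codes.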

  3. XSECT: A computer code for generating fuselage cross sections - user's manual

    NASA Technical Reports Server (NTRS)

    Ames, K. R.

    1982-01-01

    A computer code, XSECT, has been developed to generate fuselage cross sections from a given area distribution and wing definition. The cross sections are generated to match the wing definition while conforming to the area requirement. An iterative procedure is used to generate each cross section. Fuselage area balancing may be included in this procedure if desired. The code is intended as an aid for engineers who must first design a wing under certain aerodynamic constraints and then design a fuselage for the wing such that the constraints remain satisfied. This report contains the information necessary for accessing and executing the code, which is written in FORTRAN to execute on the Cyber 170 series computers (NOS operating system) and produces graphical output for a Tektronix 4014 CRT. The LRC graphics software is used in combination with the interface between this software and the PLOT 10 software.
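
    The iterative procedure can be illustrated with a one-parameter stand-in: fix the section width where it must meet the wing, then bisect on the height until the enclosed area matches the prescribed distribution (an ellipse stands in for the real cross-section geometry):

```python
# Illustration of the iterative area-matching idea: hold the width fixed
# at the wing intersection and iterate the section height until the
# enclosed area hits the target. An ellipse (area = pi/4 * w * h) stands
# in for XSECT's actual cross-section shapes.
from math import pi

def solve_height(target_area, width, tol=1e-12):
    """Bisect on height h so the elliptical section area matches target."""
    lo, hi = 0.0, 100.0
    while hi - lo > tol:
        h = (lo + hi) / 2
        area = pi / 4 * width * h
        lo, hi = (h, hi) if area < target_area else (lo, h)
    return (lo + hi) / 2

h = solve_height(target_area=12.0, width=4.0)
print(round(pi / 4 * 4.0 * h, 6))   # 12.0
```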

  4. Coding for reliable satellite communications

    NASA Technical Reports Server (NTRS)

    Gaarder, N. T.; Lin, S.

    1986-01-01

    This research project was set up to study various kinds of coding techniques for error control in satellite and space communications for NASA Goddard Space Flight Center. During the project period, researchers investigated the following areas: (1) decoding of Reed-Solomon codes in terms of dual basis; (2) concatenated and cascaded error control coding schemes for satellite and space communications; (3) use of hybrid coding schemes (error correction and detection incorporated with retransmission) to improve system reliability and throughput in satellite communications; (4) good codes for simultaneous error correction and error detection, and (5) error control techniques for ring and star networks.

  5. Development of Web Interfaces for Analysis Codes

    NASA Astrophysics Data System (ADS)

    Emoto, M.; Watanabe, T.; Funaba, H.; Murakami, S.; Nagayama, Y.; Kawahata, K.

    Several codes have been developed for plasma physics analysis. However, most of them were developed to run on supercomputers, so users who typically work on personal computers (PCs) find these codes difficult to use. In order to facilitate their widespread use, a user-friendly interface is required. The authors propose Web interfaces for these codes. To demonstrate the usefulness of this approach, the authors developed Web interfaces for two analysis codes. The first is for FIT, developed by Murakami, which is used to analyze NBI heat deposition and related quantities. Because it requires electron density profiles, electron temperatures, and ion temperatures as polynomial expressions, those unfamiliar with the experiments, especially visitors from other institutes, find it difficult to use. The second is for visualizing the lines of force in the LHD (Large Helical Device), developed by Watanabe. This code is used to analyze the interference caused by the lines of force resulting from the various structures installed in the vacuum vessel of the LHD. It runs on PCs; however, it requires that the necessary parameters be edited manually. Using these Web interfaces, users can execute both codes interactively.

  6. Mining dynamic noteworthy functions in software execution sequences

    PubMed Central

    Huang, Guoyan; Wang, Yuqian; He, Haitao; Ren, Jiadong

    2017-01-01

    As the quality of crucial entities can directly affect that of software, their identification and protection become an important premise for effective software development, management, maintenance and testing, which thus contribute to improving software quality and its attack-defending ability. Most analyses and evaluations of important entities, such as code-based static structure analysis, are divorced from the actual running of the software. In this paper, from the perspective of the software execution process, we propose an approach to mine dynamic noteworthy functions (DNFM) in software execution sequences. First, through software decompiling and tracking of stack changes, execution traces composed of a series of function addresses were acquired. These traces were modeled as execution sequences and then simplified so as to obtain simplified sequences (SFS), followed by the extraction of patterns from the SFS through a pattern extraction (PE) algorithm. After that, the evaluating indicators inner-importance and inter-importance were designed to measure the noteworthiness of functions in the DNFM algorithm. Finally, the functions were sorted by their noteworthiness. Comparisons were conducted against the results of two traditional complex-network-based node mining methods, namely PageRank and DegreeRank. The results show that the DNFM method can mine noteworthy functions in software effectively and precisely. PMID:28278276
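
    A much-simplified sketch of the pipeline: collapse loop repetitions in each trace (the SFS step), mine short call patterns, and rank functions by how often and how widely they occur. The scoring below is illustrative, not the paper's exact inner-/inter-importance formulas.

```python
# Simplified DNFM-style pipeline: simplify traces, mine length-2 call
# patterns, then rank functions by pattern frequency ("inner") and
# pattern spread ("inter"). Illustrative scoring only.
from collections import Counter
from itertools import groupby

def simplify(trace):
    """Collapse consecutive repeats (loops) -- the SFS step."""
    return [f for f, _ in groupby(trace)]

def mine(traces):
    patterns = Counter()
    for t in traces:
        s = simplify(t)
        patterns.update(zip(s, s[1:]))   # length-2 call patterns
    inner = Counter()                     # frequency inside patterns
    inter = Counter()                     # number of distinct patterns
    for pat, n in patterns.items():
        for f in pat:
            inner[f] += n
            inter[f] += 1
    return sorted(inner, key=lambda f: (inner[f], inter[f]), reverse=True)

traces = [
    ["main", "parse", "eval", "eval", "eval", "emit"],
    ["main", "parse", "eval", "emit"],
    ["main", "init", "parse", "eval", "emit"],
]
print(mine(traces))   # ['parse', 'eval', 'main', 'emit', 'init']
```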

  7. Mining dynamic noteworthy functions in software execution sequences.

    PubMed

    Zhang, Bing; Huang, Guoyan; Wang, Yuqian; He, Haitao; Ren, Jiadong

    2017-01-01

    As the quality of crucial entities can directly affect that of software, their identification and protection become an important premise for effective software development, management, maintenance and testing, which thus contribute to improving software quality and its attack-defending ability. Most analyses and evaluations of important entities, such as code-based static structure analysis, are divorced from the actual running of the software. In this paper, from the perspective of the software execution process, we propose an approach to mine dynamic noteworthy functions (DNFM) in software execution sequences. First, through software decompiling and tracking of stack changes, execution traces composed of a series of function addresses were acquired. These traces were modeled as execution sequences and then simplified so as to obtain simplified sequences (SFS), followed by the extraction of patterns from the SFS through a pattern extraction (PE) algorithm. After that, the evaluating indicators inner-importance and inter-importance were designed to measure the noteworthiness of functions in the DNFM algorithm. Finally, the functions were sorted by their noteworthiness. Comparisons were conducted against the results of two traditional complex-network-based node mining methods, namely PageRank and DegreeRank. The results show that the DNFM method can mine noteworthy functions in software effectively and precisely.

  8. SAP- FORTRAN STATIC SOURCE CODE ANALYZER PROGRAM (DEC VAX VERSION)

    NASA Technical Reports Server (NTRS)

    Merwarth, P. D.

    1994-01-01

    The FORTRAN Static Source Code Analyzer program, SAP, was developed to automatically gather statistics on the occurrences of statements and structures within a FORTRAN program and to report those statistics. Provisions have been made for weighting each statistic and for providing an overall figure of complexity. Statistics, as well as figures of complexity, are gathered on a module-by-module basis. Overall summed statistics are also accumulated for the complete input source file. SAP accepts as input syntactically correct FORTRAN source code written in the FORTRAN 77 standard language. In addition, code written using features of the following languages is also accepted: VAX-11 FORTRAN, IBM S/360 FORTRAN IV Level H Extended, and Structured FORTRAN. The SAP program utilizes two external files in its analysis procedure. A keyword file allows flexibility in classifying statements and in marking a statement as either executable or non-executable. A statistical weight file allows the user to assign weights to all output statistics, giving the user flexibility in defining the figure of complexity. The SAP program is written in FORTRAN IV for batch execution and has been implemented on a DEC VAX series computer under VMS and on an IBM 370 series computer under MVS. The SAP program was developed in 1978 and last updated in 1985.

  9. Execution time support for scientific programs on distributed memory machines

    NASA Technical Reports Server (NTRS)

    Berryman, Harry; Saltz, Joel; Scroggs, Jeffrey

    1990-01-01

    Optimizations are considered that are required for efficient execution of code segments that consist of loops over distributed data structures. The PARTI (Parallel Automated Runtime Toolkit at ICASE) execution time primitives are designed to carry out these optimizations and can be used to implement a wide range of scientific algorithms on distributed memory machines. These primitives allow the user to control array mappings in a way that gives the appearance of shared memory. Computations can be based on a global index set. Primitives are used to carry out gather and scatter operations on distributed arrays. Communication patterns are derived at runtime, and the appropriate send and receive messages are automatically generated.
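
    The gather primitive can be sketched for a 1-D block-distributed array: an inspector phase translates global indices into (owner, local offset) pairs, and an executor phase moves the data. Communication is simulated here with plain lists; the names are illustrative, not PARTI's actual API.

```python
# Minimal sketch of a gather primitive for a 1-D block-distributed array.
# The inspector/executor split mirrors the runtime pattern derivation the
# abstract describes; message passing is simulated with list lookups.

def owner_of(gidx, block):
    """Map a global index to (process rank, local offset)."""
    return gidx // block, gidx % block

def gather(global_indices, local_arrays, block):
    """Fetch arbitrarily indexed remote elements into a local buffer."""
    # Inspector phase: derive the communication schedule once...
    schedule = [owner_of(g, block) for g in global_indices]
    # ...executor phase: reuse it for the actual data motion.
    return [local_arrays[rank][off] for rank, off in schedule]

block = 4
local_arrays = [[10, 11, 12, 13], [20, 21, 22, 23], [30, 31, 32, 33]]
print(gather([0, 5, 11, 2], local_arrays, block))   # [10, 21, 33, 12]
```

    Deriving the schedule once and reusing it across loop iterations is what makes the runtime approach pay off for irregular loops.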

  10. ETF system code: composition and applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reid, R.L.; Wu, K.F.

    1980-01-01

    A computer code has been developed for application to ETF tokamak system and conceptual design studies. The code determines cost, performance, configuration, and technology requirements as a function of tokamak parameters. The ETF code is structured in a modular fashion in order to allow independent modeling of each major tokamak component. The primary benefit of modularization is that it allows updating of a component module, such as the TF coil module, without disturbing the remainder of the system code as long as the input/output to the modules remains unchanged. The modules may be run independently to perform specific design studies, such as determining the effect of allowable strain on TF coil structural requirements, or the modules may be executed together as a system to determine global effects, such as defining the impact of aspect ratio on the entire tokamak system.

  11. Operations analysis (study 2.1). Program listing for the LOVES computer code

    NASA Technical Reports Server (NTRS)

    Wray, S. T., Jr.

    1974-01-01

    A listing of the LOVES computer program is presented. The program is coded partially in SIMSCRIPT and FORTRAN. This version of LOVES is compatible with both the CDC 7600 and the UNIVAC 1108 computers. The code has been compiled, loaded, and executed successfully on the EXEC 8 system for the UNIVAC 1108.

  12. A Survey of New Trends in Symbolic Execution for Software Testing and Analysis

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Visser, Willem

    2009-01-01

    Symbolic execution is a well-known program analysis technique which represents the values of program inputs with symbolic values instead of concrete (initialized) data and executes the program by manipulating program expressions involving those symbolic values. Symbolic execution was proposed over three decades ago, but it has recently found renewed interest in the research community, due in part to progress in decision procedures, the availability of powerful computers, and new algorithmic developments. We provide a survey of some of the new research trends in symbolic execution, with particular emphasis on applications to test generation and program analysis. We first describe an approach that handles complex programming constructs such as input data structures, arrays, and multi-threading. We follow with a discussion of abstraction techniques that can be used to limit the (possibly infinite) number of symbolic configurations that need to be analyzed for the symbolic execution of looping programs. Furthermore, we describe recent hybrid techniques that combine concrete and symbolic execution to overcome some of the inherent limitations of symbolic execution, such as handling native code or availability of decision procedures for the application domain. Finally, we give a short survey of interesting new applications, such as predictive testing, invariant inference, program repair, analysis of parallel numerical programs and differential symbolic execution.
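
    The core mechanism is easy to show in miniature: execute a branching program on a symbolic input, forking at each branch and accumulating the path condition; here a brute-force search stands in for the decision procedure a real engine would invoke.

```python
# Toy symbolic executor: enumerate all paths of a branching program over
# one integer input, collecting each path condition, then find a concrete
# witness per path (test generation). Brute force stands in for an SMT
# solver.

def explore(node, path=()):
    """Yield (path_condition, result) pairs for a nested-if program."""
    if isinstance(node, tuple) and node[0] == "if":
        _, cond, then_branch, else_branch = node
        yield from explore(then_branch, path + ((cond, True),))
        yield from explore(else_branch, path + ((cond, False),))
    else:
        yield path, node

def witness(path, domain=range(-100, 100)):
    """Stand-in for a decision procedure: brute-force a satisfying input."""
    for x in domain:
        if all(cond(x) is want for cond, want in path):
            return x
    return None

# Toy program:  if x > 10: (if x < 20: "A" else: "B") else: "C"
prog = ("if", lambda x: x > 10,
        ("if", lambda x: x < 20, "A", "B"),
        "C")

for path, result in explore(prog):
    print(result, witness(path))
```

    Each printed line is one feasible path with a generated test input, which is exactly the test-generation use the survey highlights.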

  13. Executive Functioning Heterogeneity in Pediatric ADHD.

    PubMed

    Kofler, Michael J; Irwin, Lauren N; Soto, Elia F; Groves, Nicole B; Harmon, Sherelle L; Sarver, Dustin E

    2018-04-28

    Neurocognitive heterogeneity is increasingly recognized as a valid phenomenon in ADHD, with most estimates suggesting that executive dysfunction is present in only about 33%-50% of these children. However, recent critiques question the veracity of these estimates because our understanding of executive functioning in ADHD is based, in large part, on data from single tasks developed to detect gross neurological impairment rather than the specific executive processes hypothesized to underlie the ADHD phenotype. The current study is the first to comprehensively assess heterogeneity in all three primary executive functions in ADHD using a criterion battery that includes multiple tests per construct (working memory, inhibitory control, set shifting). Children ages 8-13 (M = 10.37, SD = 1.39) with and without ADHD (N = 136; 64 girls; 62% Caucasian/Non-Hispanic) completed a counterbalanced series of executive function tests. Accounting for task unreliability, results indicated significantly improved sensitivity and specificity relative to prior estimates, with 89% of children with ADHD demonstrating objectively-defined impairment on at least one executive function (62% impaired working memory, 27% impaired inhibitory control, 38% impaired set shifting; 54% impaired on one executive function, 35% impaired on two or all three executive functions). Children with working memory deficits showed higher parent- and teacher-reported ADHD inattentive and hyperactive/impulsive symptoms (BF10 = 5.23 × 10^4), and were slightly younger (BF10 = 11.35) than children without working memory deficits. Children with vs. without set shifting or inhibitory control deficits did not differ on ADHD symptoms, age, gender, IQ, SES, or medication status. Taken together, these findings confirm that ADHD is characterized by neurocognitive heterogeneity, while suggesting that contemporary, cognitively-informed criteria may provide improved precision for identifying a

  14. Supersensitive detection and discrimination of enantiomers by dorsal olfactory receptors: evidence for hierarchical odour coding.

    PubMed

    Sato, Takaaki; Kobayakawa, Reiko; Kobayakawa, Ko; Emura, Makoto; Itohara, Shigeyoshi; Kizumi, Miwako; Hamana, Hiroshi; Tsuboi, Akio; Hirono, Junzo

    2015-09-11

    Enantiomeric pairs of mirror-image molecular structures are difficult to resolve by instrumental analyses. The human olfactory system, however, discriminates (-)-wine lactone from its (+)-form rapidly within seconds. To gain insight into receptor coding of enantiomers, we compared behavioural detection and discrimination thresholds of wild-type mice with those of ΔD mice in which all dorsal olfactory receptors are genetically ablated. Surprisingly, wild-type mice displayed an exquisite "supersensitivity" to enantiomeric pairs of wine lactones and carvones. They were capable of supersensitive discrimination of enantiomers, consistent with their high detection sensitivity. In contrast, ΔD mice showed selective major loss of sensitivity to the (+)-enantiomers. The resulting 10^8-fold differential sensitivity of ΔD mice to (-)- vs. (+)-wine lactone matched that observed in humans. This suggests that humans lack highly sensitive orthologous dorsal receptors for the (+)-enantiomer, similarly to ΔD mice. Moreover, ΔD mice showed >10^10-fold reductions in enantiomer discrimination sensitivity compared to wild-type mice. ΔD mice detected one or both of the (-)- and (+)-enantiomers over a wide concentration range, but were unable to discriminate them. This "enantiomer odour discrimination paradox" indicates that the most sensitive dorsal receptors play a critical role in hierarchical odour coding for enantiomer identification.

  15. Supersensitive detection and discrimination of enantiomers by dorsal olfactory receptors: evidence for hierarchical odour coding

    PubMed Central

    Sato, Takaaki; Kobayakawa, Reiko; Kobayakawa, Ko; Emura, Makoto; Itohara, Shigeyoshi; Kizumi, Miwako; Hamana, Hiroshi; Tsuboi, Akio; Hirono, Junzo

    2015-01-01

    Enantiomeric pairs of mirror-image molecular structures are difficult to resolve by instrumental analyses. The human olfactory system, however, discriminates (−)-wine lactone from its (+)-form rapidly within seconds. To gain insight into receptor coding of enantiomers, we compared behavioural detection and discrimination thresholds of wild-type mice with those of ΔD mice in which all dorsal olfactory receptors are genetically ablated. Surprisingly, wild-type mice displayed an exquisite “supersensitivity” to enantiomeric pairs of wine lactones and carvones. They were capable of supersensitive discrimination of enantiomers, consistent with their high detection sensitivity. In contrast, ΔD mice showed selective major loss of sensitivity to the (+)-enantiomers. The resulting 10^8-fold differential sensitivity of ΔD mice to (−)- vs. (+)-wine lactone matched that observed in humans. This suggests that humans lack highly sensitive orthologous dorsal receptors for the (+)-enantiomer, similarly to ΔD mice. Moreover, ΔD mice showed >10^10-fold reductions in enantiomer discrimination sensitivity compared to wild-type mice. ΔD mice detected one or both of the (−)- and (+)-enantiomers over a wide concentration range, but were unable to discriminate them. This “enantiomer odour discrimination paradox” indicates that the most sensitive dorsal receptors play a critical role in hierarchical odour coding for enantiomer identification. PMID:26361056

  16. Accumulating pyramid spatial-spectral collaborative coding divergence for hyperspectral anomaly detection

    NASA Astrophysics Data System (ADS)

    Sun, Hao; Zou, Huanxin; Zhou, Shilin

    2016-03-01

    Detection of anomalous targets of various sizes in hyperspectral data has received much attention in reconnaissance and surveillance applications. Many anomaly detectors have been proposed in the literature. However, current methods are susceptible to anomalies in the processing window range and often make critical assumptions about the distribution of the background data. Motivated by the fact that anomalous pixels are often distinctive from their local background, in this letter we propose a novel hyperspectral anomaly detection framework for real-time remote sensing applications. The proposed framework consists of four major components: sparse feature learning, pyramid grid window selection, joint spatial-spectral collaborative coding, and multi-level divergence fusion. It exploits the collaborative representation difference in the feature space to locate potential anomalies and is totally unsupervised, without any prior assumptions. Experimental results on airborne hyperspectral data demonstrate that the proposed method adapts to anomalies in a large range of sizes and is well suited for parallel processing.
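
    The collaborative-representation difference at the heart of such detectors can be sketched in a few lines: regress each pixel's spectrum (ridge-regularised) onto a dictionary of background neighbours and score the reconstruction residual. This is the generic collaborative-representation detector only; the paper's pyramid windows and multi-level fusion are omitted.

```python
# Generic collaborative-representation anomaly score: a pixel the local
# background dictionary cannot reconstruct cheaply gets a large residual.
import numpy as np

def cr_score(y, D, lam=1e-2):
    """Residual of the ridge-regularised representation of y by columns of D."""
    G = D.T @ D + lam * np.eye(D.shape[1])
    alpha = np.linalg.solve(G, D.T @ y)
    return float(np.linalg.norm(y - D @ alpha))

base = np.array([1.0, 2.0, 3.0, 2.0, 1.0])       # background spectral shape
# Background dictionary: near-copies of the local background spectrum.
D = np.stack([base,
              base + np.array([0.01, 0, 0, 0, 0]),
              base + np.array([0, 0.01, 0, 0, 0])], axis=1)

background_pixel = 1.02 * base                    # well explained by D
anomaly_pixel = np.array([3.0, -2.0, 1.0, 0.0, 5.0])  # unrelated spectrum

print(cr_score(background_pixel, D) < cr_score(anomaly_pixel, D))   # True
```

    The ridge term keeps the nearly collinear background columns from being combined with huge coefficients, so only spectra genuinely close to the background subspace score low.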

  17. Income, neural executive processes, and preschool children's executive control.

    PubMed

    Ruberry, Erika J; Lengua, Liliana J; Crocker, Leanna Harris; Bruce, Jacqueline; Upshaw, Michaela B; Sommerville, Jessica A

    2017-02-01

    This study aimed to specify the neural mechanisms underlying the link between low household income and diminished executive control in the preschool period. Specifically, we examined whether individual differences in the neural processes associated with executive attention and inhibitory control accounted for income differences observed in performance on a neuropsychological battery of executive control tasks. The study utilized a sample of preschool-aged children (N = 118) whose families represented the full range of income, with 32% of families at/near poverty, 32% lower income, and 36% middle to upper income. Children completed a neuropsychological battery of executive control tasks and then completed two computerized executive control tasks while EEG data were collected. We predicted that differences in the event-related potential (ERP) correlates of executive attention and inhibitory control would account for income differences observed on the executive control battery. Income and ERP measures were related to performance on the executive control battery. However, income was unrelated to ERP measures. The findings suggest that income differences observed in executive control during the preschool period might relate to processes other than executive attention and inhibitory control.

  18. An approach for coupled-code multiphysics core simulations from a common input

    DOE PAGES

    Schmidt, Rodney; Belcourt, Kenneth; Hooper, Russell; ...

    2014-12-10

    This study describes an approach for coupled-code multiphysics reactor core simulations that is being developed by the Virtual Environment for Reactor Applications (VERA) project in the Consortium for Advanced Simulation of Light-Water Reactors (CASL). In this approach a user creates a single problem description, called the “VERAIn” common input file, to define and set up the desired coupled-code reactor core simulation. A preprocessing step accepts the VERAIn file and generates a set of fully consistent input files for the different physics codes being coupled. The problem is then solved using a single-executable coupled-code simulation tool applicable to the problem, which is built using VERA infrastructure software tools and the set of physics codes required for the problem of interest. The approach is demonstrated by performing an eigenvalue and power distribution calculation of a typical three-dimensional 17 × 17 assembly with thermal–hydraulic and fuel temperature feedback. All neutronics aspects of the problem (cross-section calculation, neutron transport, power release) are solved using the Insilico code suite and are fully coupled to a thermal–hydraulic analysis calculated by the Cobra-TF (CTF) code. The single-executable coupled-code (Insilico-CTF) simulation tool is created using several VERA tools, including LIME (Lightweight Integrating Multiphysics Environment for coupling codes), DTK (Data Transfer Kit), Trilinos, and TriBITS. Parallel calculations are performed on the Titan supercomputer at Oak Ridge National Laboratory using 1156 cores, and a synopsis of the solution results and code performance is presented. Finally, ongoing development of this approach is also briefly described.

  19. Performance of Serially Concatenated Convolutional Codes with Binary Modulation in AWGN and Noise Jamming over Rayleigh Fading Channels

    DTIC Science & Technology

    2001-09-01

    "Rate-compatible punctured convolutional codes (RCPC codes) and their applications," IEEE… ABSTRACT: In this dissertation, the bit error rates for serially concatenated convolutional codes (SCCC) for both BPSK and DPSK modulation with… EXECUTIVE SUMMARY: In this dissertation, the bit error rates of serially concatenated convolutional codes

  20. Method and apparatus for executing a shift in a hybrid transmission

    DOEpatents

    Gupta, Pinaki; Kaminsky, Lawrence A; Demirovic, Besim

    2013-09-03

    A method for executing a transmission shift in a hybrid transmission including first and second electric machines includes executing a shift-through-neutral sequence from an initial transmission state to a target transmission state including executing an intermediate shift to neutral. Upon detecting a change in an output torque request while executing the shift-through-neutral sequence, possible recovery shift paths are identified. Available ones of the possible recovery shift paths are identified and a shift cost for each said available recovery shift path is evaluated. The available recovery shift path having a minimum shift cost is selected as a preferred recovery shift path and is executed to achieve a non-neutral transmission state.
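
    The recovery-path selection step described above reduces to a minimum-cost choice over the available paths. A toy sketch, with hypothetical path names and cost values:

```python
# Illustrative sketch of the selection step only: among the available recovery
# shift paths, pick the one with the minimum evaluated shift cost. The path
# names and cost values below are hypothetical.
def select_recovery_path(shift_costs):
    """shift_costs: dict mapping an available recovery shift path to its cost."""
    return min(shift_costs, key=shift_costs.get)

costs = {"neutral->G2": 3.2, "neutral->G3": 1.7, "neutral->EVT": 2.4}
print(select_recovery_path(costs))  # neutral->G3
```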

  1. Executive Values, Executive Functions, and the Humanities.

    ERIC Educational Resources Information Center

    Pichler, Joseph A.

    The benefits of studying the humanities to the business executive are considered. The humanities can help develop both the values and functional skills that are necessary for executive success. Competence in value analysis helps future executives to understand the full implications of the economic system, especially when it is followed by the…

  2. On the linear programming bound for linear Lee codes.

    PubMed

    Astola, Helena; Tabus, Ioan

    2016-01-01

    Based on an invariance-type property of the Lee-compositions of a linear Lee code, additional equality constraints can be introduced to the linear programming problem of linear Lee codes. In this paper, we formulate this property in terms of an action of the multiplicative group of the field [Formula: see text] on the set of Lee-compositions. We show some useful properties of certain sums of Lee-numbers, which are the eigenvalues of the Lee association scheme, appearing in the linear programming problem of linear Lee codes. Using the additional equality constraints, we formulate the linear programming problem of linear Lee codes in a very compact form, leading to fast execution, which allows the bounds to be computed efficiently for large parameter values of the linear codes.

  3. The puzzle of processing speed, memory, and executive function impairments in schizophrenia: fitting the pieces together.

    PubMed

    Knowles, Emma E M; Weiser, Mark; David, Anthony S; Glahn, David C; Davidson, Michael; Reichenberg, Abraham

    2015-12-01

    Substantial impairment in performance on the digit-symbol substitution task in patients with schizophrenia is well established, and this impairment has been widely interpreted as denoting a specific deficit in processing speed. However, other higher order cognitive functions might be more critical to performance on this task. To date, this idea has not been rigorously investigated in patients with schizophrenia. Neuropsychological measures of processing speed, memory, and executive functioning were completed by 125 patients with schizophrenia and 272 control subjects. We implemented a series of confirmatory factor and structural regression models to build an integrated model of processing speed, memory, and executive function with which to deconstruct the digit-symbol substitution task and characterize discrepancies between patients with schizophrenia and control subjects. The overall structure of the processing speed, memory, and executive function model was the same across groups (χ² = 208.86, p > .05), but the contribution of the specific cognitive domains to coding task performance differed significantly. When completing the task, control subjects relied on executive function and, indirectly, on working memory ability, whereas patients with schizophrenia used an alternative set of cognitive operations whereby they relied on the same processes required to complete verbal fluency tasks. Successful coding task performance relies predominantly on executive function, rather than processing speed or memory. Patients with schizophrenia perform poorly on this task because of an apparent lack of appropriate executive function input; they rely instead on an alternative cognitive pathway. Copyright © 2015 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  4. Flexible Generation of Kalman Filter Code

    NASA Technical Reports Server (NTRS)

    Richardson, Julian; Wilson, Edward

    2006-01-01

    Domain-specific program synthesis can automatically generate high quality code in complex domains from succinct specifications, but the range of programs which can be generated by a given synthesis system is typically narrow. Obtaining code which falls outside this narrow scope necessitates either 1) extension of the code generator, which is usually very expensive, or 2) manual modification of the generated code, which is often difficult and which must be redone whenever changes are made to the program specification. In this paper, we describe adaptations and extensions of the AUTOFILTER Kalman filter synthesis system which greatly extend the range of programs which can be generated. Users augment the input specification with a specification of code fragments and how those fragments should interleave with or replace parts of the synthesized filter. This allows users to generate a much wider range of programs without needing to modify the synthesis system or edit generated code. We demonstrate the usefulness of the approach by applying it to the synthesis of a complex state estimator which combines code from several Kalman filters with user-specified code. The work described in this paper allows the complex design decisions necessary for real-world applications to be reflected in the synthesized code. When executed on simulated input data, the generated state estimator was found to produce estimates comparable to those produced by a hand-coded estimator.
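
    For readers unfamiliar with the kind of filter AUTOFILTER synthesizes, here is a generic scalar Kalman filter in textbook form. This is illustrative only, not AUTOFILTER's generated code, and the noise variances and measurements are made up:

```python
# Generic scalar Kalman filter (textbook form; not AUTOFILTER output).
# Model: x_k = x_{k-1} + w,  z_k = x_k + v, with variances q and r (assumed).
def kalman_1d(zs, q=1e-4, r=0.25, x0=0.0, p0=1.0):
    x, p = x0, p0
    estimates = []
    for z in zs:
        p = p + q                # predict: variance grows by process noise
        k = p / (p + r)          # Kalman gain
        x = x + k * (z - x)      # update state with the innovation
        p = (1 - k) * p          # posterior variance shrinks
        estimates.append(x)
    return estimates

noisy = [5.1, 4.8, 5.3, 4.9, 5.0, 5.2, 4.7, 5.1]   # measurements of a level ~5
print(kalman_1d(noisy)[-1])    # final estimate converges toward ~5.0
```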

  5. Aquarius Project: Research in the System Architecture of Accelerators for the High Performance Execution of Logic Programs.

    DTIC Science & Technology

    1991-05-31

    benchmarks … Appendix G: Source code of the Aquarius Prolog compiler … Chapter I Introduction "You're given… notation, a tool that is used throughout the compiler's implementation. Appendix F lists the source code of the C and Prolog benchmarks. Appendix G lists the source code of the compiler.

  6. Production code control system for hydrodynamics simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slone, D.M.

    1997-08-18

    We describe how the Production Code Control System (PCCS), written in Perl, has been used to control and monitor the execution of a large hydrodynamics simulation code in a production environment. We have been able to integrate new, disparate, and often independent, applications into the PCCS framework without the need to modify any of our existing application codes. Both users and code developers see a consistent interface to the simulation code and associated applications regardless of the physical platform, whether an MPP, SMP, server, or desktop workstation. We will also describe our use of Perl to develop a configuration management system for the simulation code, as well as a code usage database and report generator. We used Perl to write a backplane that allows us to plug in preprocessors, the hydrocode, postprocessors, visualization tools, persistent storage requests, and other codes. We need only teach PCCS a minimal amount about any new tool or code to essentially plug it in and make it usable to the hydrocode. PCCS has made it easier to link together disparate codes, since using Perl has removed the need to learn the idiosyncrasies of system or RPC programming. The text handling in Perl makes it easy to teach PCCS about new codes, or changes to existing codes.

  7. Execution time supports for adaptive scientific algorithms on distributed memory machines

    NASA Technical Reports Server (NTRS)

    Berryman, Harry; Saltz, Joel; Scroggs, Jeffrey

    1990-01-01

    Optimizations are considered that are required for efficient execution of code segments that consist of loops over distributed data structures. The PARTI (Parallel Automated Runtime Toolkit at ICASE) execution time primitives are designed to carry out these optimizations and can be used to implement a wide range of scientific algorithms on distributed memory machines. These primitives allow the user to control array mappings in a way that gives an appearance of shared memory. Computations can be based on a global index set. Primitives are used to carry out gather and scatter operations on distributed arrays. Communication patterns are derived at runtime, and the appropriate send and receive messages are automatically generated.
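
    The gather/scatter primitives can be illustrated with a toy block-distributed array. This sketch is far simpler than PARTI, and all names are invented: global indices are translated to (owner, local offset) pairs at "runtime".

```python
# Toy sketch of gather/scatter on a block-distributed array (inspired by, but
# much simpler than, the PARTI primitives).
def owner_of(g, block):
    """Block distribution: return (owning processor, local index)."""
    return g // block, g % block

def gather(local_parts, indices, block):
    """Fetch globally indexed elements from per-processor local arrays."""
    return [local_parts[owner_of(g, block)[0]][owner_of(g, block)[1]]
            for g in indices]

def scatter(local_parts, indices, values, block):
    """Write values back to the arrays of their owning processors."""
    for g, v in zip(indices, values):
        p, i = owner_of(g, block)
        local_parts[p][i] = v

parts = [[0, 1, 2, 3], [4, 5, 6, 7]]       # global array 0..7 on 2 "processors"
print(gather(parts, [1, 6, 3], block=4))   # [1, 6, 3]
scatter(parts, [6], [60], block=4)
print(parts[1])                            # [4, 5, 60, 7]
```

    A real runtime system would batch these index translations into communication schedules so that each send/receive message moves many elements at once.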

  8. Ability of primary auditory cortical neurons to detect amplitude modulation with rate and temporal codes: neurometric analysis

    PubMed Central

    Johnson, Jeffrey S.; Yin, Pingbo; O'Connor, Kevin N.

    2012-01-01

    Amplitude modulation (AM) is a common feature of natural sounds, and its detection is biologically important. Even though most sounds are not fully modulated, the majority of physiological studies have focused on fully modulated (100% modulation depth) sounds. We presented AM noise at a range of modulation depths to awake macaque monkeys while recording from neurons in primary auditory cortex (A1). The ability of neurons to detect partial AM with rate and temporal codes was assessed with signal detection methods. On average, single-cell synchrony was as or more sensitive than spike count in modulation detection. Cells are less sensitive to modulation depth if tested away from their best modulation frequency, particularly for temporal measures. Mean neural modulation detection thresholds in A1 are not as sensitive as behavioral thresholds, but with phase locking the most sensitive neurons are more sensitive, suggesting that for temporal measures the lower-envelope principle cannot account for thresholds. Three methods of preanalysis pooling of spike trains (multiunit, similar to convergence from a cortical column; within cell, similar to convergence of cells with matched response properties; across cell, similar to indiscriminate convergence of cells) all result in an increase in neural sensitivity to modulation depth for both temporal and rate codes. For the across-cell method, pooling of a few dozen cells can result in detection thresholds that approximate those of the behaving animal. With synchrony measures, indiscriminate pooling results in sensitive detection of modulation frequencies between 20 and 60 Hz, suggesting that differences in AM response phase are minor in A1. PMID:22422997

  9. Detection of non-coding RNA in bacteria and archaea using the DETR'PROK Galaxy pipeline.

    PubMed

    Toffano-Nioche, Claire; Luo, Yufei; Kuchly, Claire; Wallon, Claire; Steinbach, Delphine; Zytnicki, Matthias; Jacq, Annick; Gautheret, Daniel

    2013-09-01

    RNA-seq experiments are now routinely used for the large scale sequencing of transcripts. In bacteria or archaea, such deep sequencing experiments typically produce 10-50 million fragments that cover most of the genome, including intergenic regions. In this context, the precise delineation of the non-coding elements is challenging. Non-coding elements include untranslated regions (UTRs) of mRNAs, independent small RNA genes (sRNAs) and transcripts produced from the antisense strand of genes (asRNA). Here we present a computational pipeline (DETR'PROK: detection of ncRNAs in prokaryotes) based on the Galaxy framework that takes as input a mapping of deep sequencing reads and performs successive steps of clustering, comparison with existing annotation and identification of transcribed non-coding fragments classified into putative 5' UTRs, sRNAs and asRNAs. We provide a step-by-step description of the protocol using real-life example data sets from Vibrio splendidus and Escherichia coli. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.

  10. A concatenated coding scheme for error control

    NASA Technical Reports Server (NTRS)

    Kasami, T.; Fujiwara, T.; Lin, S.

    1986-01-01

    In this paper, a concatenated coding scheme for error control in data communications is presented and analyzed. In this scheme, the inner code is used for both error correction and detection; however, the outer code is used only for error detection. A retransmission is requested if either the inner code decoder fails to make a successful decoding or the outer code decoder detects the presence of errors after the inner code decoding. Probability of undetected error (or decoding error) of the proposed scheme is derived. An efficient method for computing this probability is presented. Throughput efficiency of the proposed error control scheme incorporated with a selective-repeat ARQ retransmission strategy is also analyzed. Three specific examples are presented. One of the examples is proposed for error control in the NASA Telecommand System.

  11. Performance Measures of Diagnostic Codes for Detecting Opioid Overdose in the Emergency Department.

    PubMed

    Rowe, Christopher; Vittinghoff, Eric; Santos, Glenn-Milo; Behar, Emily; Turner, Caitlin; Coffin, Phillip O

    2017-04-01

    Opioid overdose mortality has tripled in the United States since 2000 and opioids are responsible for more than half of all drug overdose deaths, which reached an all-time high in 2014. Opioid overdoses resulting in death, however, represent only a small fraction of all opioid overdose events and efforts to improve surveillance of this public health problem should include tracking nonfatal overdose events. International Classification of Disease (ICD) diagnosis codes, increasingly used for the surveillance of nonfatal drug overdose events, have not been rigorously assessed for validity in capturing overdose events. The present study aimed to validate the use of ICD, 9th revision, Clinical Modification (ICD-9-CM) codes in identifying opioid overdose events in the emergency department (ED) by examining multiple performance measures, including sensitivity and specificity. Data on ED visits from January 1, 2012, to December 31, 2014, including clinical determination of whether the visit constituted an opioid overdose event, were abstracted from electronic medical records for patients prescribed long-term opioids for pain from any of six safety net primary care clinics in San Francisco, California. Combinations of ICD-9-CM codes were validated in the detection of overdose events as determined by medical chart review. Both sensitivity and specificity of different combinations of ICD-9-CM codes were calculated. Unadjusted logistic regression models with robust standard errors and accounting for clustering by patient were used to explore whether overdose ED visits with certain characteristics were more or less likely to be assigned an opioid poisoning ICD-9-CM code by the documenting physician. Forty-four (1.4%) of 3,203 ED visits among 804 patients were determined to be opioid overdose events. Opioid-poisoning ICD-9-CM codes (E850.2-E850.2, 965.00-965.09) identified overdose ED visits with a sensitivity of 25.0% (95% confidence interval [CI] = 13.6% to 37.8%) and
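
    Sensitivity and specificity follow directly from confusion-matrix counts. In this sketch the 44 true overdose visits and the 25.0% sensitivity match the abstract (11 of 44 visits carried an opioid-poisoning code), while the false-positive split of the remaining 3,159 visits is hypothetical:

```python
# Illustrative computation of the two performance measures. The 44 overdose
# visits and 25% sensitivity come from the abstract (11 of 44 flagged); the
# false-positive count among the other visits is hypothetical.
def sens_spec(tp, fn, fp, tn):
    sensitivity = tp / (tp + fn)   # coded among true overdose visits
    specificity = tn / (tn + fp)   # uncoded among non-overdose visits
    return sensitivity, specificity

tp, fn = 11, 33      # 44 true overdose ED visits, 11 carried the ICD code
fp, tn = 5, 3154     # hypothetical split of the 3,159 remaining visits
sens, spec = sens_spec(tp, fn, fp, tn)
print(f"sensitivity={sens:.1%} specificity={spec:.1%}")  # sensitivity=25.0% ...
```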

  12. The Astrophysics Source Code Library: An Update

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Nemiroff, R. J.; Shamir, L.; Teuben, P. J.

    2012-01-01

    The Astrophysics Source Code Library (ASCL), founded in 1999, takes an active approach to sharing astrophysical source code. ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and adds entries for the found codes to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL moved to a new location in 2010; it now contains over 300 codes and continues to grow. In 2011, the ASCL (http://asterisk.apod.com/viewforum.php?f=35) has added on average 19 new codes per month; we encourage scientists to submit their codes for inclusion. An advisory committee has been established to provide input and guide the development and expansion of the new site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This presentation covers the history of the ASCL, examines its current state and benefits, describes the means of and requirements for including codes, and outlines future plans.

  13. RRTMGP: A High-Performance Broadband Radiation Code for the Next Decade

    DTIC Science & Technology

    2014-09-30

    Hardware counters were used to measure several performance metrics, including the number of double-precision (DP) floating-point operations (FLOPs)… 0.2 DP FLOPs per CPU cycle. Experience with production science code is that it is possible to achieve execution rates in the range of 0.5 to 1.0… DP FLOPs per cycle. Looking at the ratio of vectorized DP FLOPs to total DP FLOPs, we see (Figure PROF) that for most of the execution time the

  14. Symbolic PathFinder: Symbolic Execution of Java Bytecode

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Rungta, Neha

    2010-01-01

    Symbolic PathFinder (SPF) combines symbolic execution with model checking and constraint solving for automated test case generation and error detection in Java programs with unspecified inputs. In this tool, programs are executed on symbolic inputs representing multiple concrete inputs. Values of variables are represented as constraints generated from the analysis of Java bytecode. The constraints are solved using off-the-shelf solvers to generate test inputs guaranteed to achieve complex coverage criteria. SPF has been used successfully at NASA, in academia, and in industry.
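
    A greatly simplified sketch of the idea: collect the branch conditions along each program path and "solve" them for concrete test inputs. Here the path constraints are plain Python predicates and the solver is brute-force search over a small domain; SPF itself analyzes Java bytecode and hands the constraints to off-the-shelf solvers.

```python
# Toy path-based test generation. Each path through a hypothetical function
# with branches (x > 10) and (x % 2 == 0) is a conjunction of conditions.
PATHS = {
    "x>10 and even": [lambda x: x > 10, lambda x: x % 2 == 0],
    "x>10 and odd":  [lambda x: x > 10, lambda x: x % 2 == 1],
    "x<=10":         [lambda x: x <= 10],
}

def solve(constraints, domain=range(-50, 51)):
    """Brute-force 'constraint solver': first input satisfying all conditions."""
    return next((x for x in domain if all(c(x) for c in constraints)), None)

tests = {label: solve(cs) for label, cs in PATHS.items()}
print(tests)   # one concrete test input per feasible path
```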

  15. Polarization-multiplexed rate-adaptive non-binary-quasi-cyclic-LDPC-coded multilevel modulation with coherent detection for optical transport networks.

    PubMed

    Arabaci, Murat; Djordjevic, Ivan B; Saunders, Ross; Marcoccia, Roberto M

    2010-02-01

    In order to achieve high-speed transmission over optical transport networks (OTNs) and maximize throughput, we propose using a rate-adaptive polarization-multiplexed coded multilevel modulation with coherent detection based on component non-binary quasi-cyclic (QC) LDPC codes. Compared to the prior-art bit-interleaved LDPC-coded modulation (BI-LDPC-CM) scheme, the proposed non-binary LDPC-coded modulation (NB-LDPC-CM) scheme not only reduces latency due to symbol-level instead of bit-level processing but also provides either an impressive reduction in computational complexity or striking improvements in coding gain, depending on the constellation size. As the paper shows, compared to its prior-art binary counterpart, the proposed NB-LDPC-CM scheme better addresses the needs of future OTNs: achieving the target BER performance and providing the maximum possible throughput over the entire lifetime of the OTN.

  16. A general multiblock Euler code for propulsion integration. Volume 3: User guide for the Euler code

    NASA Technical Reports Server (NTRS)

    Chen, H. C.; Su, T. Y.; Kao, T. J.

    1991-01-01

    This manual explains the procedures for using the general multiblock Euler (GMBE) code developed under NASA contract NAS1-18703. The code was developed for the aerodynamic analysis of geometrically complex configurations in either free air or wind tunnel environments (vol. 1). The complete flow field is divided into a number of topologically simple blocks within each of which surface fitted grids and efficient flow solution algorithms can easily be constructed. The multiblock field grid is generated with the BCON procedure described in volume 2. The GMBE utilizes a finite volume formulation with an explicit time stepping scheme to solve the Euler equations. A multiblock version of the multigrid method was developed to accelerate the convergence of the calculations. This user guide provides information on the GMBE code, including input data preparations with sample input files and a sample Unix script for program execution in the UNICOS environment.

  17. Computer Viruses: Prevention, Detection, and Treatment

    DTIC Science & Technology

    1990-03-12

    executed, also carries out its covert function, potentially undetected. This class of attack earned the term "Trojan horse" from the original in Greek mythology, signifying a gift which conceals a malicious purpose. … cause harm. The offending code may be present in a code segment the user "touches," which

  18. A concatenated coding scheme for error control

    NASA Technical Reports Server (NTRS)

    Lin, S.

    1985-01-01

    A concatenated coding scheme for error control in data communications is analyzed. The inner code is used for both error correction and detection; however, the outer code is used only for error detection. A retransmission is requested if the outer code detects the presence of errors after the inner code decoding. The probability of undetected error of the above error control scheme is derived and upper bounded. Two specific examples are analyzed. In the first example, the inner code is a distance-4 shortened Hamming code with generator polynomial (X+1)(X^6+X+1) = X^7+X^6+X^2+1 and the outer code is a distance-4 shortened Hamming code with generator polynomial (X+1)(X^15+X^14+X^13+X^12+X^4+X^3+X^2+X+1) = X^16+X^12+X^5+1, which is the X.25 standard for packet-switched data networks. This example is proposed for error control on NASA telecommand links. In the second example, the inner code is the same as that in the first example but the outer code is a shortened Reed-Solomon code with symbols from GF(2^8) and generator polynomial (X+1)(X+α), where α is a primitive element in GF(2^8).
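
    The outer code's error detection rests on mod-2 polynomial division by the generator X^16+X^12+X^5+1: a received word whose remainder is zero is accepted. A sketch of the raw polynomial arithmetic only (the full X.25 frame check sequence additionally specifies register preset and bit inversion, omitted here; the message bits are arbitrary):

```python
# Mod-2 polynomial arithmetic with the generator X^16 + X^12 + X^5 + 1
# (bit mask 0x11021). Raw polynomial division only, not the complete X.25
# CRC procedure.
GEN = 0x11021

def mod2_rem(v):
    """Remainder of polynomial v modulo GEN over GF(2)."""
    while v.bit_length() > 16:          # while deg(v) >= deg(GEN)
        v ^= GEN << (v.bit_length() - 17)
    return v

def encode(msg):
    """Systematic codeword: message shifted by X^16, plus its remainder."""
    return (msg << 16) | mod2_rem(msg << 16)

cw = encode(0b1011001110001)
print(mod2_rem(cw))       # 0: valid codewords are divisible by GEN
print(mod2_rem(cw ^ 1))   # nonzero: a single-bit error is detected
```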

  19. Performance enhancement of various real-time image processing techniques via speculative execution

    NASA Astrophysics Data System (ADS)

    Younis, Mohamed F.; Sinha, Purnendu; Marlowe, Thomas J.; Stoyenko, Alexander D.

    1996-03-01

    In real-time image processing, an application must satisfy a set of timing constraints while ensuring the semantic correctness of the system. Because of the natural structure of digital data, pure data and task parallelism have been used extensively in real-time image processing to accelerate the handling time of image data. These types of parallelism are based on splitting the execution load performed by a single processor across multiple nodes. However, execution of all parallel threads is mandatory for correctness of the algorithm. On the other hand, speculative execution is an optimistic execution of part(s) of the program based on assumptions on program control flow or variable values. Rollback may be required if the assumptions turn out to be invalid. Speculative execution can enhance average, and sometimes worst-case, execution time. In this paper, we target various image processing techniques to investigate applicability of speculative execution. We identify opportunities for safe and profitable speculative execution in image compression, edge detection, morphological filters, and blob recognition.
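
    Speculative execution with rollback can be sketched as: start the likely branch before its guard value is known, then commit the speculative result or recompute. The pixel threshold standing in for an expensive guard computation is invented for the example:

```python
# Minimal illustration of speculative execution with rollback. The "slow"
# guard and the threshold 128 are made up for the example.
def slow_guard(pixel):
    """Stand-in for an expensive predicate whose result arrives late."""
    return pixel > 128

def speculate(pixel, predicted=True):
    speculative = pixel - 128 if predicted else pixel   # optimistic work
    actual = slow_guard(pixel)                          # guard resolves later
    if actual == predicted:
        return speculative                   # commit the speculative result
    return pixel - 128 if actual else pixel  # rollback: recompute correct path

print(speculate(200))  # 72  (prediction correct, speculative result committed)
print(speculate(50))   # 50  (misprediction, recomputed after rollback)
```

    In a real system the speculative branch runs concurrently with the guard computation, so a correct prediction hides the branch latency entirely.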

  20. Grid Task Execution

    NASA Technical Reports Server (NTRS)

    Hu, Chaumin

    2007-01-01

    IPG Execution Service is a framework that reliably executes complex jobs on a computational grid, and is part of the IPG service architecture designed to support location-independent computing. The new grid service enables users to describe the platform on which they need a job to run, which allows the service to locate the desired platform, configure it for the required application, and execute the job. After a job is submitted, users can monitor it through periodic notifications, or through queries. Each job consists of a set of tasks that performs actions such as executing applications and managing data. Each task is executed based on a starting condition that is an expression of the states of other tasks. This formulation allows tasks to be executed in parallel, and also allows a user to specify tasks to execute when other tasks succeed, fail, or are canceled. The two core components of the Execution Service are the Task Database, which stores tasks that have been submitted for execution, and the Task Manager, which executes tasks in the proper order, based on the user-specified starting conditions, and avoids overloading local and remote resources while executing tasks.
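
    The starting-condition mechanism can be sketched as a small interpreter that repeatedly runs any pending task whose condition over the other tasks' states holds. The task names, states, and conditions here are invented, and real IPG tasks are of course remote grid operations, not Python lambdas:

```python
# Toy version of condition-driven task execution: each task runs once the
# states of the tasks it depends on satisfy its starting condition.
def run(tasks):
    """tasks: name -> (condition(states) -> bool, action() -> final state)."""
    states = {name: "pending" for name in tasks}
    progress = True
    while progress:                 # keep sweeping until nothing new can start
        progress = False
        for name, (cond, action) in tasks.items():
            if states[name] == "pending" and cond(states):
                states[name] = action()
                progress = True
    return states

tasks = {
    "stage_data": (lambda s: True, lambda: "succeeded"),
    "run_app":    (lambda s: s["stage_data"] == "succeeded",
                   lambda: "succeeded"),
    # cleanup starts whether run_app succeeds, fails, or is canceled
    "cleanup":    (lambda s: s["run_app"] in ("succeeded", "failed", "canceled"),
                   lambda: "succeeded"),
}
print(run(tasks))   # every task reaches "succeeded"
```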

  1. Compiling global name-space programs for distributed execution

    NASA Technical Reports Server (NTRS)

    Koelbel, Charles; Mehrotra, Piyush

    1990-01-01

    Distributed memory machines do not provide hardware support for a global address space. Thus programmers are forced to partition the data across the memories of the architecture and use explicit message passing to communicate data between processors. The compiler support required to allow programmers to express their algorithms using a global name-space is examined. A general method is presented for the analysis of a high-level source program and its translation to a set of independently executing tasks communicating via messages. If the compiler has enough information, this translation can be carried out at compile-time. Otherwise, run-time code is generated to implement the required data movement. The analysis required in both situations is described and the performance of the generated code on the Intel iPSC/2 is presented.

  2. An Implementation of Privacy Protection for a Surveillance Camera Using ROI Coding of JPEG2000 with Face Detection

    NASA Astrophysics Data System (ADS)

    Muneyasu, Mitsuji; Odani, Shuhei; Kitaura, Yoshihiro; Namba, Hitoshi

    When surveillance cameras are used, there are cases where privacy protection should be considered. This paper proposes a new privacy protection method that automatically degrades the face region in surveillance images. The proposed method consists of ROI coding of JPEG2000 and a face detection method based on template matching. The experimental results show that the face region can be detected and hidden correctly.

  3. CMCpy: Genetic Code-Message Coevolution Models in Python

    PubMed Central

    Becich, Peter J.; Stark, Brian P.; Bhat, Harish S.; Ardell, David H.

    2013-01-01

    Code-message coevolution (CMC) models represent coevolution of a genetic code and a population of protein-coding genes (“messages”). Formally, CMC models are sets of quasispecies coupled together for fitness through a shared genetic code. Although CMC models display plausible explanations for the origin of multiple genetic code traits by natural selection, useful modern implementations of CMC models are not currently available. To meet this need we present CMCpy, an object-oriented Python API and command-line executable front-end that can reproduce all published results of CMC models. CMCpy implements multiple solvers for leading eigenpairs of quasispecies models. We also present novel analytical results that extend and generalize applications of perturbation theory to quasispecies models and pioneer the application of a homotopy method for quasispecies with non-unique maximally fit genotypes. Our results therefore facilitate the computational and analytical study of a variety of evolutionary systems. CMCpy is free open-source software available from http://pypi.python.org/pypi/CMCpy/. PMID:23532367

  4. Labelled Execution Systems

    DTIC Science & Technology

    2012-05-07

    executions are all the executions of the first except the single infinite execution stuttering around s0. And because of this exception, s0 is not bisimilar… maximal paths in the diagram, and that whose executions are all the executions of the first system except the infinite execution stuttering around s0. …advance to s1, from where on it behaves just like the first one. What sets the behaviour of the two processes apart is, of course, the infinite stuttering

  5. ATLAS, an integrated structural analysis and design system. Volume 3: User's manual, input and execution data

    NASA Technical Reports Server (NTRS)

    Dreisbach, R. L. (Editor)

    1979-01-01

    The input data and execution control statements for the ATLAS integrated structural analysis and design system are described. It is operational on the Control Data Corporation (CDC) 6600/CYBER computers in a batch mode or in a time-shared mode via interactive graphic or text terminals. ATLAS is a modular system of computer codes with common executive and data base management components. The system provides an extensive set of general-purpose technical programs with analytical capabilities including stiffness, stress, loads, mass, substructuring, strength design, unsteady aerodynamics, vibration, and flutter analyses. The sequence and mode of execution of selected program modules are controlled via a common user-oriented language.

  6. A modified carrier-to-code leveling method for retrieving ionospheric observables and detecting short-term temporal variability of receiver differential code biases

    NASA Astrophysics Data System (ADS)

    Zhang, Baocheng; Teunissen, Peter J. G.; Yuan, Yunbin; Zhang, Xiao; Li, Min

    2018-03-01

    Sensing the ionosphere with the global positioning system involves two sequential tasks, namely the ionospheric observable retrieval and the ionospheric parameter estimation. A prominent source of error has long been identified as short-term variability in receiver differential code bias (rDCB). We modify the carrier-to-code leveling (CCL), a method commonly used to accomplish the first task, by assuming the rDCB to be unlinked in time. Aside from the ionospheric observables, which are affected by, among others, the rDCB at one reference epoch, the Modified CCL (MCCL) can also provide the rDCB offsets with respect to the reference epoch as by-products. Two consequences arise. First, MCCL is capable of excluding the effects of time-varying rDCB from the ionospheric observables, which, in turn, improves the quality of ionospheric parameters of interest. Second, MCCL has significant potential as a means to detect between-epoch fluctuations experienced by the rDCB of a single receiver.
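    The leveling step that MCCL modifies can be sketched as follows (a simplified illustration, not the authors' implementation; the observables are reduced to "ionosphere plus bias" toy series and all numbers are assumptions):

```python
# Toy carrier-to-code leveling (CCL): the noisy code observable
# P4 = I + b (b = receiver DCB) levels the precise but ambiguous phase
# observable L4 = I + A (A = arc-constant ambiguity) by matching arc means.

def ccl_level(P4, L4):
    """Shift L4 by the arc mean of (P4 - L4); the shift removes A but
    absorbs the arc-averaged b, so a time-varying rDCB leaks into the
    leveled ionospheric observable (the error MCCL is designed to avoid)."""
    offset = sum(p - l for p, l in zip(P4, L4)) / len(P4)
    return [l + offset for l in L4]
```

    With a constant bias the leveled series equals I + b exactly; if b drifts during the arc, each epoch inherits a residual of b minus its arc mean, which is what motivates estimating per-epoch rDCB offsets in MCCL.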

  7. External-Compression Supersonic Inlet Design Code

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    2011-01-01

    A computer code named SUPIN has been developed to perform aerodynamic design and analysis of external-compression, supersonic inlets. The baseline set of inlets includes axisymmetric pitot, two-dimensional single-duct, axisymmetric outward-turning, and two-dimensional bifurcated-duct inlets. The aerodynamic methods are based on low-fidelity analytical and numerical procedures. The geometric methods are based on planar geometry elements. SUPIN has three modes of operation: 1) generate the inlet geometry from an explicit set of geometry information, 2) size and design the inlet geometry and analyze the aerodynamic performance, and 3) compute the aerodynamic performance of a specified inlet geometry. The aerodynamic performance quantities include inlet flow rates, total pressure recovery, and drag. The geometry output from SUPIN includes inlet dimensions, cross-sectional areas, coordinates of planar profiles, and surface grids suitable for input to grid generators for analysis by computational fluid dynamics (CFD) methods. The input data file for SUPIN and the output file from SUPIN are text (ASCII) files. The surface grid files are output as formatted Plot3D or stereolithography (STL) files. SUPIN executes in batch mode and is available as a Microsoft Windows executable and Fortran95 source code with a makefile for Linux.

  8. Performance analysis of a concatenated coding scheme for error control

    NASA Technical Reports Server (NTRS)

    Costello, D. J., Jr.; Lin, S.; Kasami, T.

    1983-01-01

    A concatenated coding scheme for error control in data communications is analyzed. In this scheme, the inner code is used for both error correction and detection, while the outer code is used only for error detection. A retransmission is requested if the outer code detects the presence of errors after the inner code decoding. The probability of undetected error is derived and bounded. A particular example, proposed for the planetary program, is analyzed.
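    The quantity being bounded can be made concrete with a short sketch (a generic textbook calculation, not the paper's derivation; the (7,4) Hamming code and channel value are illustrative assumptions):

```python
def prob_undetected_error(weights, n, p):
    """On a binary symmetric channel with crossover probability p, an
    error goes undetected exactly when the error pattern equals a nonzero
    codeword, so P_ue = sum over w >= 1 of A_w * p**w * (1-p)**(n-w),
    where A_w counts codewords of Hamming weight w."""
    return sum(A * p**w * (1 - p)**(n - w) for w, A in weights.items() if w > 0)

# Weight distribution of the (7,4) Hamming code: A_0=1, A_3=7, A_4=7, A_7=1.
hamming_weights = {0: 1, 3: 7, 4: 7, 7: 1}
```

    For a concatenated scheme, a bound of this form for the outer code governs the probability that a corrupted inner-decoded block slips through without triggering retransmission.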

  9. Evaluation and Testing of the ADVANTG Code on SNM Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaver, Mark W.; Casella, Andrew M.; Wittman, Richard S.

    2013-09-24

    Pacific Northwest National Laboratory (PNNL) has been tasked with evaluating the effectiveness of ORNL’s new hybrid transport code, ADVANTG, on scenarios of interest to our NA-22 sponsor, specifically detection of diversion of special nuclear material (SNM). PNNL staff have determined that acquisition and installation of ADVANTG was relatively straightforward for a code in its phase of development, but probably not yet sufficient for mass distribution to the general user. PNNL staff also determined that with little effort, ADVANTG generated weight windows that typically worked for the problems and generated results consistent with MCNP. With slightly greater effort of choosing a finer mesh around detectors or sample reaction tally regions, the figure of merit (FOM) could be further improved in most cases. This does take some limited knowledge of deterministic transport methods. The FOM could also be increased by limiting the energy range for a tally to the energy region of greatest interest. It was then found that an MCNP run with the full energy range for the tally showed improved statistics in the region used for the ADVANTG run. The specific case of interest chosen by the sponsor is the CIPN project from Los Alamos National Laboratory (LANL), which is an active interrogation, non-destructive assay (NDA) technique to quantify the fissile content in a spent fuel assembly and is also sensitive to cases of material diversion. Unfortunately, weight windows for the CIPN problem cannot currently be properly generated with ADVANTG due to inadequate accommodations for source definition. ADVANTG requires that a fixed neutron source be defined within the problem and cannot account for neutron multiplication. As such, it is rendered useless in active interrogation scenarios. It is also interesting to note that this is a difficult problem to solve and that the automated weight windows generator in MCNP actually slowed down the problem. Therefore, PNNL had

  10. Performance optimization of spectral amplitude coding OCDMA system using new enhanced multi diagonal code

    NASA Astrophysics Data System (ADS)

    Imtiaz, Waqas A.; Ilyas, M.; Khan, Yousaf

    2016-11-01

    This paper proposes a new code to optimize the performance of spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. The unique two-matrix structure of the proposed enhanced multi diagonal (EMD) code and its effective correlation properties, between intended and interfering subscribers, significantly elevate the performance of the SAC-OCDMA system by negating multiple access interference (MAI) and the associated phase-induced intensity noise (PIIN). Performance of the SAC-OCDMA system based on the proposed code is thoroughly analyzed for two detection techniques through analytic and simulation analysis by referring to bit error rate (BER), signal to noise ratio (SNR) and eye patterns at the receiving end. It is shown that the EMD code with the spectral direct detection (SDD) technique provides high transmission capacity, reduces receiver complexity, and performs better than the complementary subtraction detection (CSD) technique. Furthermore, analysis shows that, for a minimum acceptable BER of 10⁻⁹, the proposed system supports 64 subscribers at data rates of up to 2 Gbps for both uplink and downlink transmission.
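    The BER-versus-SNR relation that such analyses rest on can be sketched with the Gaussian approximation commonly used in SAC-OCDMA papers (an assumed generic form, not this paper's full PIIN/shot/thermal-noise derivation):

```python
import math

def ber_from_snr(snr):
    """Gaussian-approximation BER, in the form commonly assumed in
    SAC-OCDMA analyses: BER = (1/2) * erfc( sqrt(SNR / 8) ).
    Reaching a BER floor of 1e-9 then requires an SNR of roughly
    144 (about 21.6 dB)."""
    return 0.5 * math.erfc(math.sqrt(snr / 8.0))
```

    Under this relation, any code structure that suppresses MAI and PIIN raises the achievable SNR and hence pushes the BER below the acceptance threshold for more simultaneous subscribers.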

  11. Overview of the ArbiTER edge plasma eigenvalue code

    NASA Astrophysics Data System (ADS)

    Baver, Derek; Myra, James; Umansky, Maxim

    2011-10-01

    The Arbitrary Topology Equation Reader, or ArbiTER, is a flexible eigenvalue solver that is currently under development for plasma physics applications. The ArbiTER code builds on the equation parser framework of the existing 2DX code, extending it to include a topology parser. This will give the code the capability to model problems with complicated geometries (such as multiple X-points and scrape-off layers) or model equations with arbitrary numbers of dimensions (e.g. for kinetic analysis). In the equation parser framework, model equations are not included in the program's source code. Instead, an input file contains instructions for building a matrix from profile functions and elementary differential operators. The program then executes these instructions in a sequential manner. These instructions may also be translated into analytic form, thus giving the code transparency as well as flexibility. We will present an overview of how the ArbiTER code is to work, as well as preliminary results from early versions of this code. Work supported by the U.S. DOE.
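    The equation-parser idea, i.e. assembling the matrix by executing a list of instructions rather than hard-coding the model, can be sketched as follows (a hypothetical miniature, not ArbiTER's actual input syntax; the instruction names and the power-iteration solver are assumptions):

```python
# A tiny "instruction list" interpreter: each instruction adds an
# elementary operator (scaled identity, or second difference on a unit
# grid with Dirichlet ends) into a dense matrix, which an eigensolver
# then processes.

def run_instructions(instructions, n):
    M = [[0.0] * n for _ in range(n)]
    for op, coeff in instructions:
        if op == "identity":
            for i in range(n):
                M[i][i] += coeff
        elif op == "d2dx2":
            for i in range(n):
                M[i][i] += -2.0 * coeff
                if i > 0:
                    M[i][i - 1] += coeff
                if i < n - 1:
                    M[i][i + 1] += coeff
        else:
            raise ValueError("unknown operator: " + op)
    return M

def dominant_eigenvalue(M, iters=2000):
    """Plain power iteration followed by a Rayleigh quotient."""
    n = len(M)
    x = [float(i + 1) for i in range(n)]   # deliberately non-symmetric start
    for _ in range(iters):
        y = [sum(M[i][j] * x[j] for j in range(n)) for i in range(n)]
        norm = sum(v * v for v in y) ** 0.5
        x = [v / norm for v in y]
    y = [sum(M[i][j] * x[j] for j in range(n)) for i in range(n)]
    return sum(a * b for a, b in zip(x, y))
```

    For the instruction list [("d2dx2", -1.0)] on four grid points this assembles the operator -d²/dx², whose largest eigenvalue is 2 + 2cos(π/5) ≈ 3.618; because the instructions are data, the same interpreter can also be read back in analytic form, which is the transparency the abstract highlights.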

  12. [EXPERIENCE OF STUDY AND POSSIBLE WAYS OF ELIMINATION OF FALSE POSITIVE AND FALSE NEGATIVE RESULTS DURING EXECUTION OF POLYMERASE CHAIN REACTION ON AN EXAMPLE OF JUNIN VIRUS RNA DETECTION].

    PubMed

    Sizikova, T E; Lebedev, V N; Pantyukhov, V B; Borisevich, S V; Merkulov, V A

    2015-01-01

    MATERIALS AND METHODS: Junin virus, the causative agent of Argentine hemorrhagic fever (AHF), strain XJpR37/5787, was obtained from the State collection of pathogenicity group I causative agents of the 48th Central Research Institute. A reagent kit for detection of Junin virus RNA by RT-PCR was developed in the Institute and consists of 4 sets: for isolation of RNA, execution of the reverse-transcription reaction, execution of PCR, and electrophoretic detection of PCR products. RT-PCR was carried out by a standard technique. Continuous cell cultures of African green monkey Vero B and GMK-AH-1(D) were obtained from the museum of the cell culture department of the Centre. The effect of various factors acting on the sample under investigation ("thawing-freezing", presence of formaldehyde, heparin) on the occurrence of false negative results during Junin virus RNA detection by RT-PCR was studied experimentally. Addition of 0.01% heparin to the samples was shown to completely inhibit PCR. Addition of 0.05% formaldehyde significantly reduces the sensitivity of the method. A possibility of reducing the analysis timeframe from 15 to 5 days was shown during detection of the causative agent in samples with low concentration of the latter, by growing the samples and subsequently analyzing the obtained material by RT-PCR. During detection of the causative agent by RT-PCR, false negative results can appear in the presence of formaldehyde and heparin in the sample. A possibility of eliminating false negative PCR results caused by concentration of the causative agent in the sample below the sensitivity threshold was shown on the example of Junin virus RNA detection, by growing the pathogen in an appropriate accumulation system with subsequent analysis of the

  13. An ultrasound transient elastography system with coded excitation.

    PubMed

    Diao, Xianfen; Zhu, Jing; He, Xiaonian; Chen, Xin; Zhang, Xinyu; Chen, Siping; Liu, Weixiang

    2017-06-28

    Ultrasound transient elastography technology has found its place in elastography because it is safe and easy to operate. However, its application in deep tissue is limited. The aim of this study is to design an ultrasound transient elastography system with coded excitation to obtain greater detection depth. The ultrasound transient elastography system requires tissue vibration to be strictly synchronous with ultrasound detection. Therefore, an ultrasound transient elastography system with coded excitation was designed. A central component of this transient elastography system was an arbitrary waveform generator with multi-channel signal output. This arbitrary waveform generator was used to produce the tissue vibration signal, the ultrasound detection signal and the synchronous triggering signal of the radio frequency data acquisition system. The arbitrary waveform generator can produce different forms of vibration waveform to induce different shear wave propagation in the tissue. Moreover, it can achieve either traditional pulse-echo detection or phase-modulated or frequency-modulated coded excitation. A 7-chip Barker code and traditional pulse-echo detection were programmed on the designed ultrasound transient elastography system to detect the shear wave in the phantom excited by the mechanical vibrator. Then an elasticity QA phantom and sixteen in vitro rat livers were used for performance evaluation of the two detection pulses. The elasticity QA phantom results show that our system is effective, and the rat liver results show the detection depth can be increased by more than 1 cm. In addition, the SNR (signal-to-noise ratio) is increased by 15 dB using the 7-chip Barker coded excitation. Applying the 7-chip Barker coded excitation technique to ultrasound transient elastography can increase the detection depth and SNR. Using coded excitation technology to assess the human liver, especially in obese patients, may be a good choice.
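    The pulse-compression step behind Barker coded excitation can be sketched as follows (an illustrative toy with a noise-free received sequence, not the system's beamforming code):

```python
# 7-chip Barker code: its aperiodic autocorrelation has a peak of 7 and
# sidelobes of magnitude at most 1, so a matched filter compresses the
# long coded pulse into one sharp peak, raising SNR roughly in
# proportion to the code length under white noise.

BARKER7 = [1, 1, 1, -1, -1, 1, -1]

def matched_filter(rx, code):
    """Correlate the received sequence with the transmit code."""
    n, m = len(rx), len(code)
    return [sum(rx[i + j] * code[j] for j in range(m)) for i in range(n - m + 1)]
```

    Embedding the code at a known offset in an otherwise silent trace, the filter output peaks at that offset with all sidelobes no larger than 1, which is why the coded pulse can be long (deep penetration) without losing range resolution.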

  14. The use of self checks and voting in software error detection - An empirical study

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy G.; Cha, Stephen S.; Knight, John C.; Shimeall, Timothy J.

    1990-01-01

    The results of an empirical study of software error detection using self checks and N-version voting are presented. Working independently, each of 24 programmers first prepared a set of self checks using just the requirements specification of an aerospace application, and then each added self checks to an existing implementation of that specification. The modified programs were executed to measure the error-detection performance of the checks and to compare this with error detection using simple voting among multiple versions. The analysis of the checks revealed that there are great differences in the ability of individual programmers to design effective checks. It was found that some checks that might have been effective failed to detect an error because they were badly placed, and there were numerous instances of checks signaling nonexistent errors. In general, specification-based checks alone were not as effective as specification-based checks combined with code-based checks. Self checks made it possible to identify faults that had not been detected previously by voting among 28 versions of the program over a million randomly generated inputs. This appeared to result from the fact that the self checks could examine the internal state of the executing program, whereas voting examines only final results of computations. If internal states had to be identical in N-version voting systems, then there would be no reason to write multiple versions.
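    The contrast the study draws, i.e. voting sees only final outputs while self checks consult the specification directly, can be illustrated with a toy sketch (the three "versions", the seeded fault, and the sorting specification are invented for illustration):

```python
from collections import Counter

def version_a(xs):                 # correct implementation
    return sorted(xs)

def version_b(xs):                 # correct, different code path
    out = list(xs)
    out.sort()
    return out

def version_c(xs):                 # seeded fault: silently drops duplicates
    return sorted(set(xs))

def vote(results):
    """N-version majority vote over final outputs only."""
    keyed = [tuple(r) for r in results]
    return max(set(keyed), key=keyed.count)

def self_check_sorted(result, original):
    """Specification-based self check: the output must be a permutation
    of the input and nondecreasing."""
    return Counter(result) == Counter(original) and \
           all(a <= b for a, b in zip(result, result[1:]))
```

    Here the vote masks the faulty version, while the self check flags it directly; a check placed on internal state can fire even when two faulty versions happen to agree on the final output.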

  15. Authorship Attribution of Source Code

    ERIC Educational Resources Information Center

    Tennyson, Matthew F.

    2013-01-01

    Authorship attribution of source code is the task of deciding who wrote a program, given its source code. Applications include software forensics, plagiarism detection, and determining software ownership. A number of methods for the authorship attribution of source code have been presented in the past. A review of those existing methods is…

  16. The Effect of Repetitive Saccade Execution on the Attention Network Test: Enhancing Executive Function with a Flick of the Eyes

    ERIC Educational Resources Information Center

    Edlin, James M.; Lyle, Keith B.

    2013-01-01

    The simple act of repeatedly looking left and right can enhance subsequent cognition, including divergent thinking, detection of matching letters from visual arrays, and memory retrieval. One hypothesis is that saccade execution enhances subsequent cognition by altering attentional control. To test this hypothesis, we compared performance…

  17. A Review on Spectral Amplitude Coding Optical Code Division Multiple Access

    NASA Astrophysics Data System (ADS)

    Kaur, Navpreet; Goyal, Rakesh; Rani, Monika

    2017-06-01

    This manuscript deals with analysis of Spectral Amplitude Coding Optical Code Division Multiple Access (SAC-OCDMA) systems. The major noise source in optical CDMA is co-channel interference from other users, known as multiple access interference (MAI). The system performance in terms of bit error rate (BER) degrades as a result of increased MAI. The number of users and the type of codes used in the optical system directly decide the performance of the system. MAI can be restricted by efficient design of optical codes and by implementing them with unique architectures to accommodate a larger number of users. Hence, it is necessary to design a technique like spectral direct detection (SDD) with a modified double weight code, which can provide better cardinality and good correlation properties.

  18. Identifying Executable Plans

    NASA Technical Reports Server (NTRS)

    Bedrax-Weiss, Tania; Jonsson, Ari K.; Frank, Jeremy D.; McGann, Conor

    2003-01-01

    Generating plans for execution imposes a different set of requirements on the planning process than those imposed by planning alone. In highly unpredictable execution environments, a fully-grounded plan may become inconsistent frequently when the world fails to behave as expected. Intelligent execution permits making decisions when the most up-to-date information is available, ensuring fewer failures. Planning should acknowledge the capabilities of the execution system, both to ensure robust execution in the face of uncertainty and to relieve the planner of the burden of making premature commitments. We present Plan Identification Functions (PIFs), which formalize what it means for a plan to be executable, and are used in conjunction with a complete model of system behavior to halt the planning process when an executable plan is found. We describe the implementation of plan identification functions for a temporal, constraint-based planner. This particular implementation allows the description of many different plan identification functions. Depending on the characteristics of the execution environment, the best plan to hand to the execution system will contain more or less commitment and information.

  19. Executive Functioning in Schizophrenia

    PubMed Central

    Orellana, Gricel; Slachevsky, Andrea

    2013-01-01

    The executive function (EF) is a set of abilities which allows us to invoke voluntary control of our behavioral responses. These functions enable human beings to develop and carry out plans, make up analogies, obey social rules, solve problems, adapt to unexpected circumstances, do many tasks simultaneously, and locate episodes in time and place. EF includes divided attention and sustained attention, working memory (WM), set-shifting, flexibility, planning, and the regulation of goal-directed behavior, and can be defined as a brain function underlying the human faculty to act or think not only in reaction to external events but also in relation to internal goals and states. EF is mostly associated with the dorsolateral prefrontal cortex (PFC). Besides EF, the PFC is involved in self-regulation of behavior, i.e., the ability to regulate behavior according to internal goals and constraints, particularly in less structured situations. Self-regulation of behavior is subtended by the ventral medial/orbital PFC. Impairment of EF is one of the most commonly observed deficits in schizophrenia through the various disease stages. Impairment in tasks measuring conceptualization, planning, cognitive flexibility, verbal fluency, ability to solve complex problems, and WM occurs in schizophrenia. Disorders detected by executive tests are consistent with evidence from functional neuroimaging, which has shown PFC dysfunction in patients while performing these kinds of tasks. Patients with schizophrenia also exhibit deficits in odor identification, decision-making, and self-regulation of behavior, suggesting dysfunction of the orbital PFC. However, impairment in executive tests is explained mainly by dysfunction of prefronto-striato-thalamic, prefronto-parietal, and prefronto-temporal neural networks. Disorders in EFs may be considered central to schizophrenia, and it has been suggested that negative symptoms may be explained by executive dysfunction. PMID:23805107

  20. Biobar-coded gold nanoparticles and DNAzyme-based dual signal amplification strategy for ultrasensitive detection of protein by electrochemiluminescence.

    PubMed

    Xia, Hui; Li, Lingling; Yin, Zhouyang; Hou, Xiandeng; Zhu, Jun-Jie

    2015-01-14

    A dual signal amplification strategy for an electrochemiluminescence (ECL) aptasensor was designed based on biobar-coded gold nanoparticles (Au NPs) and DNAzyme. CdSeTe@ZnS quantum dots (QDs) were chosen as the ECL signal probes. To verify the proposed ultrasensitive ECL aptasensor for biomolecules, we detected thrombin (Tb) as a proof-of-principle analyte. The hairpin DNA designed for the recognition of protein consists of two parts: the sequences of the catalytic 8-17 DNAzyme and the thrombin aptamer. Only in the presence of thrombin could the hairpin DNA be opened, followed by a recycling cleavage of excess substrates by the catalytic core of the DNAzyme to induce the first-step amplification. One part of the fragments was captured to open the capture DNA modified on the Au electrode, which further connected with the prepared biobar-coded Au NPs-CdSeTe@ZnS QDs to get the final dual-amplified ECL signal. The limit of detection for Tb was 0.28 fM with excellent selectivity, and the proposed method showed good performance in real sample analysis. This design introduces the concept of dual-signal amplification by a biobar-coded system and DNAzyme recycling into ECL determination, and it is promising to be extended to provide a highly sensitive platform for various target biomolecules.

  1. PMD mitigation through interleaving LDPC codes with polarization scramblers

    NASA Astrophysics Data System (ADS)

    Han, Dahai; Chen, Haoran; Xi, Lixia

    2013-09-01

    The combination of forward error correction (FEC) and distributed fast polarization scramblers (D-FPSs) has proven an effective method to mitigate polarization mode dispersion (PMD) in high-speed optical fiber communication systems. In this article, low-density parity-check (LDPC) codes are introduced into the PMD mitigation scheme with D-FPSs as one of the promising FEC codes to achieve better performance. The scrambling speed of the FPS for the LDPC (2040, 1903) code system is discussed, and a reasonable speed of 10 MHz is obtained from the simulation results. For easy application in practical large scale integrated (LSI) circuits, the number of iterations in decoding LDPC codes is also investigated. The PMD tolerance and cut-off optical signal-to-noise ratio (OSNR) of LDPC codes are compared with Reed-Solomon (RS) codes under different conditions. In the simulation, the interleaved LDPC codes bring incremental error-correction performance, and the PMD tolerance is 10 ps at OSNR = 11.4 dB. The results show that LDPC codes are a viable substitute for traditional RS codes with D-FPSs, and all of the executable code files are open to researchers who have a practical LSI platform for PMD mitigation.

  2. Systematic behavioural observation of executive performance after brain injury.

    PubMed

    Lewis, Mark W; Babbage, Duncan R; Leathem, Janet M

    2017-01-01

    To develop an ecologically valid measure of executive functioning (i.e. Planning and Organization, Executive Memory, Initiation, Cognitive Shifting, Impulsivity, Sustained and Directed Attention, Error Detection, Error Correction and Time Management) during a functional chocolate brownie cooking task. In Study 1, the inter-rater reliability of a novel behavioural observation assessment method was assessed with 10 people with traumatic brain injury (TBI). In Study 2, 27 people with TBI and 16 healthy controls completed the functional task along with other measures of executive functioning to assess validity. Intraclass correlation coefficients for six of the nine aspects of executive functioning ranged from .54 to 1.00. Percentage agreements for the remaining aspects ranged from 70% to 90%. Significant and non-significant, moderate, correlations were found between the functional cooking task and standard neuropsychological measures. The healthy control group performed better than the TBI group in six areas (d = 0.56 to 1.23). In this initial trial of a novel assessment method, adequate inter-rater reliability was found. The measure was associated with standard neuropsychological measures, and our healthy control group performed better than the TBI group. The measure appears to be an ecologically valid measure of executive functioning.

  3. RACER: Effective Race Detection Using AspectJ

    NASA Technical Reports Server (NTRS)

    Bodden, Eric; Havelund, Klaus

    2008-01-01

    Programming errors occur frequently in large software systems, and even more so if these systems are concurrent. In the past, researchers have developed specialized programs to aid programmers in detecting concurrent programming errors such as deadlocks, livelocks, starvation and data races. In this work we propose a language extension to the aspect-oriented programming language AspectJ, in the form of three new built-in pointcuts, lock(), unlock() and maybeShared(), which allow programmers to monitor program events where locks are granted or handed back, and where values are accessed that may be shared amongst multiple Java threads. We decide thread-locality using a static thread-local objects analysis developed by others. Using the three new primitive pointcuts, researchers can directly implement efficient monitoring algorithms to detect concurrent programming errors online. As an example, we present a new algorithm which we call RACER, an adaptation of the well-known ERASER algorithm to the memory model of Java. We implemented the new pointcuts as an extension to the AspectBench Compiler, implemented the RACER algorithm using this language extension, and then applied the algorithm to the NASA K9 Rover Executive. Our experiments show the implementation to be very effective: in the Rover Executive, RACER finds 70 data races, only one of which was previously known. We further applied the algorithm to two other multi-threaded programs written by computer science researchers, in which we found races as well.
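    The lockset discipline that ERASER-style detectors monitor can be sketched in a few lines (a simplified model, not the paper's AspectJ implementation; Python stands in for Java here, and the field names and event log are invented):

```python
# Lockset race detection in miniature: each shared field keeps the
# intersection of the lock sets held at its accesses; if that
# intersection becomes empty, no single lock consistently protects the
# field and a potential data race is reported.

class LocksetChecker:
    def __init__(self):
        self.candidates = {}                  # field -> candidate lockset

    def access(self, field, held_locks):
        """Record one access; return False if a potential race is detected."""
        held = set(held_locks)
        if field not in self.candidates:
            self.candidates[field] = held
        else:
            self.candidates[field] &= held
        return bool(self.candidates[field])
```

    The lock()/unlock()/maybeShared() pointcuts in the paper serve exactly to feed such a checker: they expose the events at which the held-lock set and the shared-field accesses can be observed at runtime.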

  4. User's manual for a two-dimensional, ground-water flow code on the Octopus computer network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naymik, T.G.

    1978-08-30

    A ground-water hydrology computer code, programmed by R.L. Taylor (in Proc. American Society of Civil Engineers, Journal of Hydraulics Division, 93(HY2), pp. 25-33 (1967)), has been adapted to the Octopus computer system at Lawrence Livermore Laboratory. Using an example problem, this manual details the input, output, and execution options of the code.

  5. An expert system executive for automated assembly of large space truss structures

    NASA Technical Reports Server (NTRS)

    Allen, Cheryl L.

    1993-01-01

    Langley Research Center developed a unique test bed for investigating the practical problems associated with the assembly of large space truss structures using robotic manipulators. The test bed is the result of an interdisciplinary effort that encompasses the full spectrum of assembly problems - from the design of mechanisms to the development of software. The automated structures assembly test bed and its operation are described, the expert system executive and its development are detailed, and the planned system evolution is discussed. Emphasis is on the expert system implementation of the program executive. The executive program must direct and reliably perform complex assembly tasks with the flexibility to recover from realistic system errors. The employment of an expert system permits information that pertains to the operation of the system to be encapsulated concisely within a knowledge base. This consolidation substantially reduced code, increased flexibility, eased software upgrades, and realized a savings in software maintenance costs.

  6. Probability of undetected error after decoding for a concatenated coding scheme

    NASA Technical Reports Server (NTRS)

    Costello, D. J., Jr.; Lin, S.

    1984-01-01

    A concatenated coding scheme for error control in data communications is analyzed. In this scheme, the inner code is used for both error correction and detection, while the outer code is used only for error detection. A retransmission is requested if the outer code detects the presence of errors after the inner code decoding. The probability of undetected error is derived and bounded. A particular example, proposed for the NASA telecommand system, is analyzed.

  7. Dual-balanced detection scheme with optical hard-limiters in an optical code division multiple access system

    NASA Astrophysics Data System (ADS)

    Liu, Maw-Yang; Hsu, Yi-Kai

    2017-03-01

    A three-arm dual-balanced detection scheme is studied in an optical code division multiple access system. As multiple access interference (MAI) and beat noise are the main sources of performance degradation, we utilize optical hard-limiters to alleviate these channel impairments. In addition, once the channel condition is improved effectively, the proposed two-dimensional error correction code can remarkably enhance the system performance. In our proposed scheme, the optimal thresholds of the optical hard-limiters and decision circuitry are fixed, and they do not change with other system parameters. Our proposed scheme can accommodate a large number of users simultaneously and is suitable for burst traffic with asynchronous transmission. Therefore, it is highly recommended as the platform for broadband optical access networks.

  8. PanCoreGen - Profiling, detecting, annotating protein-coding genes in microbial genomes.

    PubMed

    Paul, Sandip; Bhardwaj, Archana; Bag, Sumit K; Sokurenko, Evgeni V; Chattopadhyay, Sujay

    2015-12-01

    A large amount of genomic data, especially from multiple isolates of a single species, has opened new vistas for microbial genomics analysis. Analyzing the pan-genome (i.e. the sum of genetic repertoire) of microbial species is crucial in understanding the dynamics of molecular evolution, where virulence evolution is of major interest. Here we present PanCoreGen - a standalone application for pan- and core-genomic profiling of microbial protein-coding genes. PanCoreGen overcomes key limitations of the existing pan-genomic analysis tools, and develops an integrated annotation-structure for a species-specific pan-genomic profile. It provides important new features for annotating draft genomes/contigs and detecting unidentified genes in annotated genomes. It also generates user-defined group-specific datasets within the pan-genome. Interestingly, analyzing an example-set of Salmonella genomes, we detect potential footprints of adaptive convergence of horizontally transferred genes in two human-restricted pathogenic serovars - Typhi and Paratyphi A. Overall, PanCoreGen represents a state-of-the-art tool for microbial phylogenomics and pathogenomics study. Copyright © 2015 Elsevier Inc. All rights reserved.
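    The pan/core distinction itself reduces to set operations once genes are clustered into families (a conceptual sketch, not PanCoreGen's clustering or annotation pipeline; the isolate names and gene families are invented):

```python
def pan_core(genomes):
    """Pan-genome = union of gene families across isolates;
    core genome = families present in every isolate."""
    families = [set(g) for g in genomes.values()]
    return set.union(*families), set.intersection(*families)

# Hypothetical three-isolate example.
genomes = {
    "isolate1": {"gyrA", "fimH", "stx1"},
    "isolate2": {"gyrA", "fimH", "invA"},
    "isolate3": {"gyrA", "stx1", "invA"},
}
```

    The tool's harder work lies upstream of this step: deciding which genes across draft and annotated genomes belong to the same family, and attaching consistent annotations to them.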

  9. Malware detection and analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chiang, Ken; Lloyd, Levi; Crussell, Jonathan

    Embodiments of the invention describe systems and methods for malicious software detection and analysis. A binary executable comprising obfuscated malware on a host device may be received, and incident data indicating a time when the binary executable was received and identifying processes operating on the host device may be recorded. The binary executable is analyzed via a scalable plurality of execution environments, including one or more non-virtual execution environments and one or more virtual execution environments, to generate runtime data and deobfuscation data attributable to the binary executable. At least some of the runtime data and deobfuscation data attributable to the binary executable is stored in a shared database, while at least some of the incident data is stored in a private, non-shared database.

  10. Memoized Symbolic Execution

    NASA Technical Reports Server (NTRS)

    Yang, Guowei; Pasareanu, Corina S.; Khurshid, Sarfraz

    2012-01-01

    This paper introduces memoized symbolic execution (Memoise), a novel approach for more efficient application of forward symbolic execution, which is a well-studied technique for systematic exploration of program behaviors based on bounded execution paths. Our key insight is that application of symbolic execution often requires several successive runs of the technique on largely similar underlying problems, e.g., running it once to check a program to find a bug, fixing the bug, and running it again to check the modified program. Memoise introduces a trie-based data structure that stores the key elements of a run of symbolic execution. Maintenance of the trie during successive runs allows re-use of previously computed results of symbolic execution without the need for re-computing them as is traditionally done. Experiments using our prototype embodiment of Memoise show the benefits it holds in various standard scenarios of using symbolic execution, e.g., with iterative deepening of exploration depth, to perform regression analysis, or to enhance coverage.
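
    The trie idea can be illustrated with a minimal sketch, where an execution path is reduced to its sequence of branch decisions and results are served from the cache on re-runs (class and method names here are invented for illustration; Memoise itself stores richer per-node information inside a symbolic executor):

```python
class TrieNode:
    def __init__(self):
        self.children = {}   # branch decision (True/False) -> TrieNode
        self.result = None   # cached outcome of a fully explored path

class PathTrie:
    """Toy version of Memoise's trie: caches results keyed by the
    sequence of branch decisions taken during symbolic execution."""
    def __init__(self):
        self.root = TrieNode()
        self.hits = 0        # number of times a cached result was reused

    def explore(self, path, evaluate):
        """Return the cached result for `path` (a tuple of branch
        decisions) or compute and store it via `evaluate`."""
        node = self.root
        for choice in path:
            node = node.children.setdefault(choice, TrieNode())
        if node.result is None:
            node.result = evaluate(path)
        else:
            self.hits += 1
        return node.result

trie = PathTrie()
trie.explore((True, False), lambda p: "ok")
trie.explore((True, False), lambda p: "ok")   # served from the trie
```

    On the second `explore` of the same path the cached result is returned without calling `evaluate`, which is the saving Memoise exploits across successive, largely similar runs.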

  11. Evaluation of the efficiency and fault density of software generated by code generators

    NASA Technical Reports Server (NTRS)

    Schreur, Barbara

    1993-01-01

    Flight computers and flight software are used for GN&C (guidance, navigation, and control), engine controllers, and avionics during missions. The software development requires the generation of a considerable amount of code. The engineers who generate the code make mistakes, and generating a large body of code with high reliability requires considerable time. Computer-aided software engineering (CASE) tools are available which generate code automatically from inputs supplied through graphical interfaces. These tools are referred to as code generators. In theory, code generators could write highly reliable code quickly and inexpensively. The various code generators offer different levels of reliability checking. Some check only the finished product while some allow checking of individual modules and combined sets of modules as well. Considering NASA's requirement for reliability, a comparison with in-house manually generated code is needed. Furthermore, automatically generated code is reputed to be as efficient as the best manually generated code when executed. In-house verification is warranted.

  12. Scaling Optimization of the SIESTA MHD Code

    NASA Astrophysics Data System (ADS)

    Seal, Sudip; Hirshman, Steven; Perumalla, Kalyan

    2013-10-01

    SIESTA is a parallel three-dimensional plasma equilibrium code capable of resolving magnetic islands at high spatial resolutions for toroidal plasmas. Originally designed to exploit small-scale parallelism, SIESTA has now been scaled to execute efficiently over several thousands of processors P. This scaling improvement was accomplished with minimal intrusion to the execution flow of the original version. First, the efficiency of the iterative solutions was improved by integrating the parallel tridiagonal block solver code BCYCLIC. Krylov-space generation in GMRES was then accelerated using a customized parallel matrix-vector multiplication algorithm. Novel parallel Hessian generation algorithms were integrated and memory access latencies were dramatically reduced through loop nest optimizations and data layout rearrangement. These optimizations sped up equilibria calculations by factors of 30-50. It is possible to compute solutions with granularity N/P near unity on extremely fine radial meshes (N > 1024 points). Grid separation in SIESTA, which manifests itself primarily in the resonant components of the pressure far from rational surfaces, is strongly suppressed by finer meshes. Large problem sizes of up to 300 K simultaneous non-linear coupled equations have been solved on the NERSC supercomputers. Work supported by U.S. DOE under Contract DE-AC05-00OR22725 with UT-Battelle, LLC.

  13. Performance analysis of a cascaded coding scheme with interleaved outer code

    NASA Technical Reports Server (NTRS)

    Lin, S.

    1986-01-01

    A cascaded coding scheme for a random error channel with a given bit-error rate is analyzed. In this scheme, the inner code C1 is an (n1, m1·l) binary linear block code which is designed for simultaneous error correction and detection. The outer code C2 is a linear block code with symbols from the Galois field GF(2^l) which is designed for correcting both symbol errors and erasures, and is interleaved with degree m1. A procedure for computing the probability of correct decoding is presented and an upper bound on the probability of a decoding error is derived. The bound provides much better results than the previous bound for a cascaded coding scheme with an interleaved outer code. Example schemes with inner codes ranging from high rates to very low rates are evaluated. Several schemes provide extremely high reliability even for very high bit-error rates, say 10^-1 to 10^-2.
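
    The role of the interleaving degree m1 can be conveyed with a toy block interleaver (a simplification for illustration; the actual scheme interleaves GF(2^l) symbols of the outer code C2 so that a burst of inner-decoder failures is spread across outer codewords):

```python
def interleave(codewords):
    """Block-interleave m1 outer codewords (rows) by transmitting
    column-wise, so a burst of channel errors is spread across
    codewords and each outer decoder sees isolated symbol errors."""
    m1, n = len(codewords), len(codewords[0])
    return [codewords[i % m1][i // m1] for i in range(m1 * n)]

def deinterleave(stream, m1):
    """Invert the block interleaver on the receive side."""
    n = len(stream) // m1
    return [[stream[j * m1 + i] for j in range(n)] for i in range(m1)]

rows = [[1, 2, 3], [4, 5, 6]]   # two outer codewords, degree m1 = 2
tx = interleave(rows)           # symbols leave in the order 1,4,2,5,3,6
assert deinterleave(tx, 2) == rows
```

    A burst hitting two adjacent transmitted symbols (say the pair 4, 2) lands in two different outer codewords after deinterleaving, which is exactly why interleaving improves the outer code's erasure handling.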

  14. Stimulation at Desert Peak -modeling with the coupled THM code FEHM

    DOE Data Explorer

    kelkar, sharad

    2013-04-30

    Numerical modeling of the 2011 shear stimulation at the Desert Peak well 27-15. This submission contains the FEHM executable code for a 64-bit PC Windows-7 machine, and the input and output files for the results presented in the included paper from ARMA-213 meeting.

  15. Two-stage sparse coding of region covariance via Log-Euclidean kernels to detect saliency.

    PubMed

    Zhang, Ying-Ying; Yang, Cai; Zhang, Ping

    2017-05-01

    In this paper, we present a novel bottom-up saliency detection algorithm from the perspective of covariance matrices on a Riemannian manifold. Each superpixel is described by a region covariance matrix on the manifold. We carry out a two-stage sparse coding scheme via Log-Euclidean kernels to extract salient objects efficiently. In the first stage, given a background dictionary on the image borders, sparse coding of each region covariance via Log-Euclidean kernels is performed. The reconstruction error on the background dictionary is regarded as the initial saliency of each superpixel. In the second stage, an improvement of the initial result is achieved by calculating reconstruction errors of the superpixels on a foreground dictionary, which is extracted from the first-stage saliency map. The sparse coding in the second stage is similar to the first stage, but is able to effectively highlight the salient objects uniformly against the background. Finally, three post-processing methods (a highlight-inhibition function, context-based saliency weighting, and graph cut) are adopted to further refine the saliency map. Experiments on four public benchmark datasets show that the proposed algorithm outperforms the state-of-the-art methods in terms of precision, recall and mean absolute error, and demonstrate the robustness and efficiency of the proposed method. Copyright © 2017 Elsevier Ltd. All rights reserved.
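
    The Log-Euclidean mapping at the heart of such kernels can be sketched as follows (a minimal illustration using a linear kernel on the matrix logarithms; the paper's kernel family and dictionary learning are more elaborate):

```python
import numpy as np

def spd_log(C):
    """Matrix logarithm of a symmetric positive-definite covariance
    matrix via eigendecomposition; this maps the Riemannian manifold
    of SPD matrices into a vector space where ordinary (Euclidean)
    sparse coding machinery applies."""
    w, V = np.linalg.eigh(C)
    return V @ np.diag(np.log(w)) @ V.T

def log_euclidean_kernel(C1, C2):
    """Linear Log-Euclidean kernel: the Frobenius inner product of
    the matrix logs of the two region covariances."""
    return float(np.tensordot(spd_log(C1), spd_log(C2)))

I = np.eye(3)
assert abs(log_euclidean_kernel(I, I)) < 1e-12   # log(I) = 0
```

    Because distances and inner products are taken between matrix logarithms, sparse coding of a region covariance against a dictionary of covariances reduces to standard linear algebra in the log domain.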

  16. How Executive Coaches Assess and Develop Emotional Intelligence in the Executive Suite

    ERIC Educational Resources Information Center

    McNevin, Mary

    2010-01-01

    This qualitative research study explores the connections between executive coaching and emotional intelligence (EI) when working with senior-level executives. The focus is on coaching the senior executives (chief executive officer, chief financial officer, senior vice-presidents) of companies with over $1 billion in revenue. Since research…

  17. Statistical fingerprinting for malware detection and classification

    DOEpatents

    Prowell, Stacy J.; Rathgeb, Christopher T.

    2015-09-15

    A system detects malware in a computing architecture with an unknown pedigree. The system includes a first computing device having a known pedigree and operating free of malware. The first computing device executes a series of instrumented functions that, when executed, provide a statistical baseline that is representative of the time it takes a known software application to run on a computing device having a known pedigree. A second computing device executes a second series of instrumented functions that, when executed, provides an actual time that is representative of the time the known software application runs on the second computing device. The system detects malware when there is a difference in execution times between the first and the second computing devices.
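
    The timing-baseline comparison might look like the following sketch (function names and the tolerance threshold are invented for illustration; the patent does not specify an implementation):

```python
import time

def time_functions(funcs, repeats=5):
    """Time each instrumented function, keeping the minimum of
    several repeats to damp scheduler and cache noise."""
    profile = []
    for f in funcs:
        samples = []
        for _ in range(repeats):
            t0 = time.perf_counter()
            f()
            samples.append(time.perf_counter() - t0)
        profile.append(min(samples))
    return profile

def flag_malware(baseline, observed, tolerance=0.5):
    """Flag the suspect machine when any instrumented function runs
    more than `tolerance` (fractional) slower than the clean-machine
    baseline, suggesting hidden interference with execution."""
    return any(o > b * (1 + tolerance) for b, o in zip(baseline, observed))

workload = [lambda: sum(range(10_000))]
baseline = time_functions(workload)
assert flag_malware(baseline, baseline) is False
```

    In practice the baseline would be collected once on the known-pedigree device and shipped as a statistical fingerprint against which the unknown-pedigree device's timings are compared.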

  18. A New Parallel Approach for Accelerating the GPU-Based Execution of Edge Detection Algorithms.

    PubMed

    Emrani, Zahra; Bateni, Soroosh; Rabbani, Hossein

    2017-01-01

    Real-time image processing is used in a wide variety of applications like those in medical care and industrial processes. This technique in medical care has the ability to display important patient information graphically, which can supplement and help the treatment process. Medical decisions made based on real-time images are more accurate and reliable. According to recent research, graphics processing unit (GPU) programming is a useful method for improving the speed and quality of medical image processing and is one of the ways of achieving real-time image processing. Edge detection is an early stage in most image processing methods for the extraction of features and object segments from a raw image. The Canny method, Sobel and Prewitt filters, and the Roberts' Cross technique are some examples of edge detection algorithms that are widely used in image processing and machine vision. In this work, these algorithms are implemented using the Compute Unified Device Architecture (CUDA), Open Source Computer Vision (OpenCV), and Matrix Laboratory (MATLAB) platforms. An existing parallel method for the Canny approach has been further modified to run in a fully parallel manner. This has been achieved by replacing the breadth-first search procedure with a parallel method. These algorithms have been compared by testing them on a database of optical coherence tomography images. The comparison of results shows that the proposed implementation of the Canny method on GPU using the CUDA platform improves the speed of execution by 2-100× compared to the central processing unit-based implementation using the OpenCV and MATLAB platforms.
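
    As a CPU reference for the kind of per-pixel stencil these GPU implementations parallelize, a minimal Sobel gradient-magnitude filter can be written as (illustrative only; a CUDA port would run the same stencil with one thread per pixel):

```python
import numpy as np

def sobel_edges(img):
    """Sobel operator: convolve with the two 3x3 kernels and return
    the gradient magnitude. Border pixels are left at zero."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    h, w = img.shape
    out = np.zeros((h, w))
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            win = img[y - 1:y + 2, x - 1:x + 2]
            out[y, x] = np.hypot((win * kx).sum(), (win * ky).sum())
    return out

# A vertical step edge is detected along the boundary column.
img = np.zeros((5, 6))
img[:, 3:] = 1.0
edges = sobel_edges(img)
assert edges[2, 2] > 0 and edges[2, 0] == 0
```

    Each output pixel depends only on its 3x3 neighborhood, which is why edge detection maps so naturally onto massively parallel GPU execution.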

  19. Production Level CFD Code Acceleration for Hybrid Many-Core Architectures

    NASA Technical Reports Server (NTRS)

    Duffy, Austen C.; Hammond, Dana P.; Nielsen, Eric J.

    2012-01-01

    In this work, a novel graphics processing unit (GPU) distributed sharing model for hybrid many-core architectures is introduced and employed in the acceleration of a production-level computational fluid dynamics (CFD) code. The latest generation graphics hardware allows multiple processor cores to simultaneously share a single GPU through concurrent kernel execution. This feature has allowed the NASA FUN3D code to be accelerated in parallel with up to four processor cores sharing a single GPU. For codes to scale and fully use resources on these and the next generation machines, codes will need to employ some type of GPU sharing model, as presented in this work. Findings include the effects of GPU sharing on overall performance. A discussion of the inherent challenges that parallel unstructured CFD codes face in accelerator-based computing environments is included, with considerations for future generation architectures. This work was completed by the author in August 2010, and reflects the analysis and results of that time.

  20. New double-byte error-correcting codes for memory systems

    NASA Technical Reports Server (NTRS)

    Feng, Gui-Liang; Wu, Xinen; Rao, T. R. N.

    1996-01-01

    Error-correcting or error-detecting codes have been used in the computer industry to increase reliability, reduce service costs, and maintain data integrity. The single-byte error-correcting and double-byte error-detecting (SbEC-DbED) codes have been successfully used in computer memory subsystems. There are many methods to construct double-byte error-correcting (DBEC) codes. In the present paper we construct a class of double-byte error-correcting codes, which are more efficient than those known to be optimum, and a decoding procedure for our codes is also considered.
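
    The flavor of byte-oriented error control can be conveyed with a toy single-byte-error-correcting checksum over GF(2^8) (far weaker than the SbEC-DbED codes discussed above, and it assumes the check bytes themselves arrive intact; real codes also protect the checks and detect double-byte errors):

```python
# GF(2^8) log/antilog tables, primitive polynomial x^8+x^4+x^3+x^2+1 (0x11d)
EXP, LOG = [0] * 512, [0] * 256
x = 1
for i in range(255):
    EXP[i] = x
    LOG[x] = i
    x <<= 1
    if x & 0x100:
        x ^= 0x11d
for i in range(255, 512):
    EXP[i] = EXP[i - 255]          # double the table to skip mod-255

def gmul(a, b):
    """Multiply two GF(2^8) elements via the log tables."""
    return 0 if 0 in (a, b) else EXP[LOG[a] + LOG[b]]

def encode(data):
    """Append two checks: c0 = sum d_i and c1 = sum alpha^i * d_i."""
    c0 = c1 = 0
    for i, d in enumerate(data):
        c0 ^= d
        c1 ^= gmul(EXP[i], d)
    return data + [c0, c1]

def correct(word):
    """Correct a single corrupted data byte: the syndromes give the
    error value s0 and its position log(s1/s0)."""
    data, c0, c1 = word[:-2], word[-2], word[-1]
    s0, s1 = c0, c1
    for i, d in enumerate(data):
        s0 ^= d
        s1 ^= gmul(EXP[i], d)
    if s0:
        pos = (LOG[s1] - LOG[s0]) % 255
        data[pos] ^= s0
    return data

msg = [10, 20, 30, 40]
word = encode(list(msg))
word[2] ^= 0x5A                    # corrupt one data byte
assert correct(word) == msg
```

    The same syndrome structure, extended with more check symbols, is how byte-organized memory codes correct one byte and detect two.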

  1. Additional extensions to the NASCAP computer code, volume 3

    NASA Technical Reports Server (NTRS)

    Mandell, M. J.; Cooke, D. L.

    1981-01-01

    The ION computer code is designed to calculate charge exchange ion densities, electric potentials, plasma temperatures, and current densities external to a neutralized ion engine in R-Z geometry. The present version assumes the beam ion current and density to be known and specified, and the neutralizing electrons to originate from a hot-wire ring surrounding the beam orifice. The plasma is treated as being resistive, with an electron relaxation time comparable to the plasma frequency. Together with the thermal and electrical boundary conditions described below and other straightforward engine parameters, these assumptions suffice to determine the required quantities. The ION code, written in ASCII FORTRAN for UNIVAC 1100 series computers, is designed to be run interactively, although it can also be run in batch mode. The input is free-format, and the output is mainly graphical, using the machine-independent graphics developed for the NASCAP code. The executive routine calls the code's major subroutines in user-specified order, and the code allows great latitude for restart and parameter change.

  2. The Senior Executive Service

    NASA Technical Reports Server (NTRS)

    1992-01-01

    A major innovation of the Civil Service Reform Act of 1978 was the creation of a Senior Executive Service (SES). The purpose of the SES is both simple and bold: to attract executives of the highest quality into Federal service and to retain them by providing outstanding opportunities for career growth and reward. The SES is intended to: provide greater authority in managing executive resources; attract and retain highly competent executives, and assign them where they will effectively accomplish their missions and best use their talents; provide for systematic development of executives; hold executives accountable for individual and organizational performance; reward outstanding performers and remove poor performers; and provide for an executive merit system free of inappropriate personnel practices and arbitrary actions. This Handbook summarizes the key features of the SES at NASA. It is intended as a special welcome to new appointees and also as a general reference document. It contains an overview of SES management at NASA, including the Executive Resources Board and the Performance Review Board, which are mandated by law to carry out key SES functions. In addition, assistance is provided by a Senior Executive Committee in certain reviews and decisions and by Executive Position Managers in day-to-day administration and oversight.

  3. Execution of a parallel edge-based Navier-Stokes solver on commodity graphics processor units

    NASA Astrophysics Data System (ADS)

    Corral, Roque; Gisbert, Fernando; Pueblas, Jesus

    2017-02-01

    The implementation of an edge-based three-dimensional Reynolds-averaged Navier-Stokes solver for unstructured grids able to run on multiple graphics processing units (GPUs) is presented. Loops over edges, which are the most time-consuming part of the solver, have been written to exploit the massively parallel capabilities of GPUs. Non-blocking communications between parallel processes and between the GPU and the central processor unit (CPU) have been used to enhance code scalability. The code is written using a mixture of C++ and OpenCL, to allow the execution of the source code on GPUs. The Message Passing Interface (MPI) library is used to allow the parallel execution of the solver on multiple GPUs. A comparative study of the solver parallel performance is carried out using a cluster of CPUs and another of GPUs. It is shown that a single GPU is up to 64 times faster than a single CPU core. The parallel scalability of the solver is mainly degraded due to the loss of computing efficiency of the GPU when the size of the case decreases. However, for large enough grid sizes, the scalability is strongly improved. A cluster featuring commodity GPUs and a high bandwidth network is ten times less costly and consumes 33% less energy than a CPU-based cluster with an equivalent computational power.

  4. 5 CFR 842.211 - Senior Executive Service, Defense Intelligence Senior Executive Service, and Senior Cryptologic...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Intelligence Senior Executive Service, and Senior Cryptologic Executive Service. 842.211 Section 842.211... Intelligence Senior Executive Service, and Senior Cryptologic Executive Service. (a) A member of the Senior Executive Service, the Defense Intelligence Senior Executive Service, or the Senior Cryptologic Senior...

  5. 5 CFR 842.211 - Senior Executive Service, Defense Intelligence Senior Executive Service, and Senior Cryptologic...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Intelligence Senior Executive Service, and Senior Cryptologic Executive Service. 842.211 Section 842.211... Intelligence Senior Executive Service, and Senior Cryptologic Executive Service. (a) A member of the Senior Executive Service, the Defense Intelligence Senior Executive Service, or the Senior Cryptologic Senior...

  6. 5 CFR 842.211 - Senior Executive Service, Defense Intelligence Senior Executive Service, and Senior Cryptologic...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Intelligence Senior Executive Service, and Senior Cryptologic Executive Service. 842.211 Section 842.211... Intelligence Senior Executive Service, and Senior Cryptologic Executive Service. (a) A member of the Senior Executive Service, the Defense Intelligence Senior Executive Service, or the Senior Cryptologic Senior...

  7. 5 CFR 842.211 - Senior Executive Service, Defense Intelligence Senior Executive Service, and Senior Cryptologic...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Intelligence Senior Executive Service, and Senior Cryptologic Executive Service. 842.211 Section 842.211... Intelligence Senior Executive Service, and Senior Cryptologic Executive Service. (a) A member of the Senior Executive Service, the Defense Intelligence Senior Executive Service, or the Senior Cryptologic Senior...

  8. 5 CFR 842.211 - Senior Executive Service, Defense Intelligence Senior Executive Service, and Senior Cryptologic...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Intelligence Senior Executive Service, and Senior Cryptologic Executive Service. 842.211 Section 842.211... Intelligence Senior Executive Service, and Senior Cryptologic Executive Service. (a) A member of the Senior Executive Service, the Defense Intelligence Senior Executive Service, or the Senior Cryptologic Senior...

  9. Utilities for master source code distribution: MAX and Friends

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.

    1988-01-01

    MAX is a program for the manipulation of FORTRAN master source code (MSC). This is a technique by which one maintains one and only one master copy of a FORTRAN program under a program developing system, which for MAX is assumed to be VAX/VMS. The master copy is not intended to be directly compiled. Instead it must be pre-processed by MAX to produce compilable instances. These instances may correspond to different code versions (for example, double precision versus single precision), different machines (for example, IBM, CDC, Cray) or different operating systems (i.e., VAX/VMS versus VAX/UNIX). The advantage of using a master source is more pronounced in complex application programs that are developed and maintained over many years and are to be transported and executed on several computer environments. The version lag problem that plagues many such programs is avoided by this approach. MAX is complemented by several auxiliary programs that perform nonessential functions. The ensemble is collectively known as MAX and Friends. All of these programs, including MAX, are executed as foreign VAX/VMS commands and can easily be hidden in customized VMS command procedures.
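
    The master-source idea can be sketched with a toy conditional filter (the *IF/*END directive syntax below is invented for illustration; MAX's actual directive language differs):

```python
def extract_instance(master_lines, active):
    """Toy master-source preprocessor: keep lines guarded by
    *IF key / *END blocks only when `key` is in the active set,
    producing one compilable instance from the single master copy."""
    out, keep = [], [True]
    for line in master_lines:
        if line.startswith("*IF "):
            keep.append(keep[-1] and line.split()[1] in active)
        elif line.startswith("*END"):
            keep.pop()
        elif keep[-1]:
            out.append(line)
    return out

master = [
    "      PROGRAM DEMO",
    "*IF DOUBLE",
    "      DOUBLE PRECISION X",
    "*END",
    "*IF SINGLE",
    "      REAL X",
    "*END",
    "      END",
]
single = extract_instance(master, {"SINGLE"})
assert "      REAL X" in single and "      DOUBLE PRECISION X" not in single
```

    Because every instance is regenerated from the one master copy, a fix made once in the master propagates to all machine and precision variants, which is how the version lag problem is avoided.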

  10. Getting the right grasp on executive function

    PubMed Central

    Gonzalez, Claudia L. R.; Mills, Kelly J.; Genee, Inge; Li, Fangfang; Piquette, Noella; Rosen, Nicole; Gibb, Robbin

    2014-01-01

    Executive Function (EF) refers to important socio-emotional and cognitive skills that are known to be highly correlated with both academic and life success. EF is a blanket term that is considered to include self-regulation, working memory, and planning. Recent studies have shown a relationship between EF and motor control. The emergence of motor control coincides with that of EF, hence understanding the relationship between these two domains could have significant implications for early detection and remediation of later EF deficits. The purpose of the current study was to investigate this relationship in young children. This study incorporated the Behavioral Rating Inventory of Executive Function (BRIEF) and two motor assessments with a focus on precision grasping to test this hypothesis. The BRIEF is comprised of two indices of EF: (1) the Behavioral Regulation Index (BRI) containing three subscales: Inhibit, Shift, and Emotional Control; (2) the Metacognition Index (MI) containing five subscales: Initiate, Working Memory, Plan/Organize, Organization of Materials, and Monitor. A global executive composite (GEC) is derived from the two indices. In this study, right-handed children aged 5–6 and 9–10 were asked to: grasp-to-construct (Lego® models); and grasp-to-place (wooden blocks), while their parents completed the BRIEF questionnaire. Analysis of results indicated significant correlations between the strength of right hand preference for grasping and numerous elements of the BRIEF including the BRI, MI, and GEC. Specifically, the more the right hand was used for grasping the better the EF ratings. In addition, patterns of space-use correlated with the GEC in several subscales of the BRIEF. Finally and remarkably, the results also showed a reciprocal relationship between hand and space use for grasping and EF. These findings are discussed with respect to: (1) the developmental overlap of motor and executive functions; (2) detection of EF deficits through

  11. A Mechanism to Avoid Collusion Attacks Based on Code Passing in Mobile Agent Systems

    NASA Astrophysics Data System (ADS)

    Jaimez, Marc; Esparza, Oscar; Muñoz, Jose L.; Alins-Delgado, Juan J.; Mata-Díaz, Jorge

    Mobile agents are software entities consisting of code, data, state and itinerary that can migrate autonomously from host to host executing their code. Despite its benefits, security issues strongly restrict the use of code mobility. The protection of mobile agents against the attacks of malicious hosts is considered the most difficult security problem to solve in mobile agent systems. In particular, collusion attacks have been barely studied in the literature. This paper presents a mechanism that avoids collusion attacks based on code passing. Our proposal is based on a Multi-Code agent, which contains a different variant of the code for each host. A Trusted Third Party is responsible for providing the information to extract its own variant to the hosts, and for taking trusted timestamps that will be used to verify time coherence.

  12. Parental Guidance and Children's Executive Function: Working Memory and Planning as Moderators during Joint Problem-Solving

    ERIC Educational Resources Information Center

    Eason, Sarah H.; Ramani, Geetha B.

    2017-01-01

    Cognitive aspects of children's executive function (EF) were examined as moderators of the effectiveness of parental guidance on children's learning. Thirty-two 5-year-old children and their parents were observed during joint problem-solving. Forms of guidance geared towards cognitive assistance were coded as directive or elaborative, and…

  13. Autonomous execution of the Precision Immobilization Technique

    NASA Astrophysics Data System (ADS)

    Mascareñas, David D. L.; Stull, Christopher J.; Farrar, Charles R.

    2017-03-01

    Over the course of the last decade great advances have been made in autonomously driving cars. The technology has advanced to the point that driverless car technology is currently being tested on publicly accessed roadways. The introduction of these technologies onto publicly accessed roadways not only raises questions of safety, but also security. Autonomously driving cars are inherently cyber-physical systems and as such will have novel security vulnerabilities that couple both the cyber aspects of the vehicle including the on-board computing and any network data it makes use of, with the physical nature of the vehicle including its sensors, actuators, and the vehicle chassis. Widespread implementation of driverless car technology will require that both the cyber, as well as physical security concerns surrounding these vehicles are addressed. In this work, we specifically developed a control policy to autonomously execute the Precision Immobilization Technique, a.k.a. the PIT maneuver. The PIT maneuver was originally developed by law enforcement to end high-speed vehicular pursuits in a quasi-safe manner. However, there is still a risk of damage/roll-over to both the vehicle executing the PIT maneuver as well as to the vehicle subject to the PIT maneuver. In law enforcement applications, it would be preferable to execute the PIT maneuver using an autonomous vehicle, thus removing the danger to law-enforcement officers. Furthermore, it is entirely possible that unscrupulous individuals could inject code into an autonomously-driving car to use the PIT maneuver to immobilize other vehicles while maintaining anonymity. For these reasons it is useful to know how the PIT maneuver can be implemented on an autonomous car. In this work a simple control policy based on velocity pursuit was developed to autonomously execute the PIT maneuver using only vision and range measurements, both of which are commonly collected by contemporary driverless cars. The ability of this
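
    In its simplest form, a velocity-pursuit control law of the sort described reduces to steering proportionally toward the bearing of an aim point (a hypothetical sketch; the paper's controller, vehicle model, and aim-point selection are more involved):

```python
import math

def pursuit_steering(ego_xy, ego_heading, target_xy, gain=1.0):
    """Proportional velocity-pursuit steering: command a steering
    angle proportional to the bearing error toward the aim point
    (for a PIT, the aim point would be the target vehicle's rear
    quarter panel, estimated from vision and range measurements)."""
    dx = target_xy[0] - ego_xy[0]
    dy = target_xy[1] - ego_xy[1]
    bearing = math.atan2(dy, dx)
    # wrap the heading error into (-pi, pi]
    err = (bearing - ego_heading + math.pi) % (2 * math.pi) - math.pi
    return gain * err              # steering command, radians

# Target dead ahead -> no steering; target to the left -> steer left.
assert abs(pursuit_steering((0, 0), 0.0, (10, 0))) < 1e-12
assert pursuit_steering((0, 0), 0.0, (10, 5)) > 0
```

    Run in a loop against updated range/bearing estimates, such a law closes on the aim point without requiring the exact position of the target vehicle in a global frame.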

  14. Intrusion Prevention and Detection in Grid Computing - The ALICE Case

    NASA Astrophysics Data System (ADS)

    Gomez, Andres; Lara, Camilo; Kebschull, Udo

    2015-12-01

    Grids allow users flexible on-demand usage of computing resources through remote communication networks. A remarkable example of a Grid in High Energy Physics (HEP) research is used in the ALICE experiment at the European Organization for Nuclear Research (CERN). Physicists can submit jobs used to process the huge amount of particle collision data produced by the Large Hadron Collider (LHC). Grids face complex security challenges. They are interesting targets for attackers seeking huge computational resources. Since users can execute arbitrary code in the worker nodes on the Grid sites, special care must be taken in this environment. Automatic tools to harden and monitor this scenario are required. Currently, there is no integrated solution for such requirements. This paper describes a new security framework to allow execution of job payloads in a sandboxed context. It also allows process behavior monitoring to detect intrusions, even when new attack methods or zero-day vulnerabilities are exploited, by a Machine Learning approach. We plan to implement the proposed framework as a software prototype that will be tested as a component of the ALICE Grid middleware.

  15. Real-Time Projection to Verify Plan Success During Execution

    NASA Technical Reports Server (NTRS)

    Wagner, David A.; Dvorak, Daniel L.; Rasmussen, Robert D.; Knight, Russell L.; Morris, John R.; Bennett, Matthew B.; Ingham, Michel D.

    2012-01-01

    The Mission Data System provides a framework for modeling complex systems in terms of system behaviors and goals that express intent. Complex activity plans can be represented as goal networks that express the coordination of goals on different state variables of the system. Real-time projection extends the ability of this system to verify plan achievability (all goals can be satisfied over the entire plan) into the execution domain so that the system is able to continuously re-verify a plan as it is executed, and as the states of the system change in response to goals and the environment. Previous versions were able to detect and respond to goal violations when they actually occur during execution. This new capability enables the prediction of future goal failures; specifically, goals that were previously found to be achievable but are no longer achievable due to unanticipated faults or environmental conditions. Early detection of such situations enables operators or an autonomous fault response capability to deal with the problem at a point that maximizes the available options. For example, this system has been applied to the problem of managing battery energy on a lunar rover as it is used to explore the Moon. Astronauts drive the rover to waypoints and conduct science observations according to a plan that is scheduled and verified to be achievable with the energy resources available. As the astronauts execute this plan, the system uses this new capability to continuously re-verify the plan as energy is consumed to ensure that the battery will never be depleted below safe levels across the entire plan.
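
    The re-verification loop can be illustrated with a toy battery-energy projection (names and numbers are illustrative, not taken from the Mission Data System):

```python
def plan_achievable(battery_wh, activities, floor_wh=20.0):
    """Re-project the remaining activities against the current battery
    state; the plan is achievable only if the projected energy level
    never drops below the safety floor. Returns (achievable, first
    activity projected to fail)."""
    level = battery_wh
    for name, cost_wh in activities:
        level -= cost_wh
        if level < floor_wh:
            return False, name
    return True, None

plan = [("drive-to-waypoint", 30.0), ("science-obs", 15.0)]
assert plan_achievable(100.0, plan) == (True, None)
# After an unanticipated drain, the same plan is no longer achievable:
assert plan_achievable(60.0, plan) == (False, "science-obs")
```

    Repeating this projection as state estimates change is what lets the system predict a future goal failure while there are still options, rather than detecting the violation only when it occurs.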

  16. A New Parallel Approach for Accelerating the GPU-Based Execution of Edge Detection Algorithms

    PubMed Central

    Emrani, Zahra; Bateni, Soroosh; Rabbani, Hossein

    2017-01-01

    Real-time image processing is used in a wide variety of applications like those in medical care and industrial processes. This technique in medical care has the ability to display important patient information graphically, which can supplement and help the treatment process. Medical decisions made based on real-time images are more accurate and reliable. According to recent research, graphics processing unit (GPU) programming is a useful method for improving the speed and quality of medical image processing and is one of the ways of achieving real-time image processing. Edge detection is an early stage in most image processing methods for the extraction of features and object segments from a raw image. The Canny method, Sobel and Prewitt filters, and the Roberts’ Cross technique are some examples of edge detection algorithms that are widely used in image processing and machine vision. In this work, these algorithms are implemented using the Compute Unified Device Architecture (CUDA), Open Source Computer Vision (OpenCV), and Matrix Laboratory (MATLAB) platforms. An existing parallel method for the Canny approach has been further modified to run in a fully parallel manner. This has been achieved by replacing the breadth-first search procedure with a parallel method. These algorithms have been compared by testing them on a database of optical coherence tomography images. The comparison of results shows that the proposed implementation of the Canny method on GPU using the CUDA platform improves the speed of execution by 2–100× compared to the central processing unit-based implementation using the OpenCV and MATLAB platforms. PMID:28487831

  17. Atomicity violation detection using access interleaving invariants

    DOEpatents

    Zhou, Yuanyuan; Lu, Shan; Tucek, Joseph Andrew

    2013-09-10

    During execution of a program, the situation where the atomicity of a pair of instructions that are to be executed atomically is violated is identified, and a bug is detected as occurring in the program at the pair of instructions. The pairs of instructions that are to be executed atomically can be identified in different manners, such as by executing a program multiple times and using the results of those executions to automatically identify the pairs of instructions.
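
    The access-interleaving-invariant idea can be sketched as follows (a simplification: `atomic_pairs` stands in for the variable accesses learned to execute atomically from passing training runs, and a trace is a list of (thread, variable) accesses):

```python
def atomicity_violations(trace, atomic_pairs):
    """Flag an interleaving bug: for each variable whose consecutive
    local accesses were learned to be atomic, report it if a remote
    thread's access lands between two accesses by the same thread,
    i.e. a (T1, x) (T2, x) (T1, x) pattern in the trace."""
    history = {}                 # var -> last two accessing threads
    bugs = set()
    for thread, var in trace:
        two_back, last = history.get(var, (None, None))
        if var in atomic_pairs and two_back == thread and last not in (None, thread):
            bugs.add(var)
        history[var] = (last, thread)
    return bugs

good = [("T1", "x"), ("T1", "x"), ("T2", "x")]   # serialized accesses
bad  = [("T1", "x"), ("T2", "x"), ("T1", "x")]   # atomicity violated
assert atomicity_violations(good, {"x"}) == set()
assert atomicity_violations(bad, {"x"}) == {"x"}
```

    Running the program several times and keeping only pairs that are never split in passing runs yields the invariant set; any later run that splits such a pair is reported as a likely atomicity bug.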

  18. RETRACTED — PMD mitigation through interleaving LDPC codes with polarization scramblers

    NASA Astrophysics Data System (ADS)

    Han, Dahai; Chen, Haoran; Xi, Lixia

    2012-11-01

    The combination of forward error correction (FEC) and distributed fast polarization scramblers (D-FPSs) has been shown to be an effective method for mitigating polarization mode dispersion (PMD) in high-speed optical fiber communication systems. In this paper, low-density parity-check (LDPC) codes, a promising class of FEC codes, are introduced into the PMD mitigation scheme with D-FPSs to achieve better performance. The scrambling speed of the FPS for the LDPC (2040, 1903) code system is discussed, and a speed of 10 MHz is found to be adequate from the simulation results. For easy application in practical large-scale integrated (LSI) circuits, the number of iterations in decoding the LDPC codes is also investigated. The PMD tolerance and cut-off optical signal-to-noise ratio (OSNR) of the LDPC codes are compared with Reed-Solomon (RS) codes under different conditions. In the simulation, interleaving the LDPC codes brings an incremental improvement in error correction, and the PMD tolerance is 10 ps at OSNR = 11.4 dB. The results show that LDPC codes can substitute for traditional RS codes with D-FPSs, and all of the executable code files are open to researchers who have a practical LSI platform for PMD mitigation.

  19. Real-time transmission of digital video using variable-length coding

    NASA Technical Reports Server (NTRS)

    Bizon, Thomas P.; Shalkhauser, Mary Jo; Whyte, Wayne A., Jr.

    1993-01-01

    Huffman coding is a variable-length lossless compression technique where data with a high probability of occurrence is represented with short codewords, while 'not-so-likely' data is assigned longer codewords. Compression is achieved when the high-probability levels occur so frequently that their benefit outweighs any penalty paid when a less likely input occurs. One instance where Huffman coding is extremely effective occurs when data is highly predictable and differential coding can be applied (as with a digital video signal). For that reason, it is desirable to apply this compression technique to digital video transmission; however, special care must be taken in order to implement a communication protocol utilizing Huffman coding. This paper addresses several of the issues relating to the real-time transmission of Huffman-coded digital video over a constant-rate serial channel. Topics discussed include data rate conversion (from variable to a fixed rate), efficient data buffering, channel coding, recovery from communication errors, decoder synchronization, and decoder architectures. A description of the hardware developed to execute Huffman coding and serial transmission is also included. Although this paper focuses on matters relating to Huffman-coded digital video, the techniques discussed can easily be generalized for a variety of applications which require transmission of variable-length data.
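    A minimal software sketch of the Huffman construction described above (the hardware decoder in the paper is table-driven and quite different; this only shows how frequent symbols end up with shorter codewords):

```python
import heapq
from collections import Counter

def huffman_code(data):
    """Build a prefix code in which frequent symbols get short codewords."""
    freq = Counter(data)
    if len(freq) == 1:  # degenerate input with a single distinct symbol
        return {next(iter(freq)): '0'}
    # Heap entries: (frequency, tie-breaker, {symbol: codeword-so-far})
    heap = [(f, i, {s: ''}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)   # two least frequent subtrees
        f2, _, right = heapq.heappop(heap)
        merged = {s: '0' + w for s, w in left.items()}
        merged.update({s: '1' + w for s, w in right.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

def encode(data, code):
    return ''.join(code[s] for s in data)
```

    On the input 'aaaabbc' the frequent symbol 'a' receives a 1-bit codeword while 'b' and 'c' receive 2 bits each, so the 7 input symbols compress to 10 bits; the variable output rate is exactly what makes buffering and rate conversion necessary for transmission over a constant-rate channel.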

  20. Systematic alphanumeric-coded endoscopy versus chromoendoscopy for the detection of precancerous gastric lesions and early gastric cancer in subjects at average risk for gastric cancer.

    PubMed

    Pérez-Mendoza, A; Zárate-Guzmán, Á M; Galvis García, E S; Sobrino Cossío, S; Djamus Birch, J

    Gastric cancer is one of the main causes of cancer worldwide, but there is currently no global screening strategy for the disease. Endoscopy is the screening method of choice in some Asian countries, but no standardized technique has been recognized. Systematic alphanumeric-coded endoscopy can increase gastric lesion detection. The aim of the present article was to compare the usefulness of systematic alphanumeric-coded endoscopy with conventional endoscopy for the detection of premalignant lesions and early gastric cancer in subjects at average risk for gastric cancer. A cross-sectional, comparative, prospective, randomized study was conducted on patients at average risk for gastric cancer (40-50 years of age, no history of H. pylori infection, intestinal metaplasia, gastric atrophy, or gastrointestinal surgery). Before undergoing endoscopy, the patients had gastric preparation (200 mg of oral acetylcysteine or 50 mg of oral dimethicone). Conventional chromoendoscopy was performed with indigo carmine dye for contrast enhancement. Fifty consecutive cases (mean age 44.4 ± 3.34 years, 60% women, BMI 27.6 ± 5.82 kg/m²) were evaluated. Endoscopic imaging quality was satisfactory in all the cases, with no differences between methods (p = 0.817). The detection rate of premalignant lesions and early gastric cancer was 14% (6 cases of intestinal metaplasia and one case of gastric adenocarcinoma). Sensitivity, specificity, positive predictive value, negative predictive value, and diagnostic accuracy were 100, 95, 80, 100 and 96%, respectively, for systematic alphanumeric-coded endoscopy, and 100, 45, 20, 100, and 52%, respectively, for conventional endoscopy. Lesion detection through systematic alphanumeric-coded endoscopy was superior to that of conventional endoscopy (p = 0.003; OR = 12). Both techniques were effective, but systematic alphanumeric-coded endoscopy significantly reduced the false positive rate. Copyright © 2018 Asociación Mexicana de

  1. Cooperative multi-user detection and ranging based on pseudo-random codes

    NASA Astrophysics Data System (ADS)

    Morhart, C.; Biebl, E. M.

    2009-05-01

    We present an improved approach for a round-trip time-of-flight distance measurement system. The system is intended for use in a cooperative localisation system for automotive applications and is therefore designed to address a large number of communication partners per measurement cycle. By using coded signals in a time-division multiple-access scheme, we can detect a large number of pedestrian sensors with just one car sensor. We achieve this by using very short transmit bursts in combination with a real-time correlation algorithm. Furthermore, the correlation approach offers real-time data on the time of arrival that can serve as a trigger impulse for other communication systems. The distance accuracy of the correlation result was further increased by adding a Fourier interpolation filter. The system performance was checked with a prototype at 2.4 GHz. We reached a distance measurement accuracy of 12 cm at ranges up to 450 m.
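    The correlation step behind the time-of-arrival estimate can be sketched as follows (illustrative parameters and a software correlator; the actual system correlates in real time in hardware at 2.4 GHz and refines the peak with the Fourier interpolation filter, which is omitted here):

```python
import numpy as np

rng = np.random.default_rng(0)
code = rng.choice([-1.0, 1.0], size=127)   # pseudo-random +/-1 chip sequence
true_delay = 40                            # round-trip delay in samples

# Simulated receive buffer: the known code echoed back after true_delay
# samples, buried in additive noise
rx = np.zeros(512)
rx[true_delay:true_delay + code.size] = code
rx += 0.2 * rng.standard_normal(rx.size)

# Sliding correlation of the known code against the received samples;
# the correlation peak marks the time of arrival
corr = np.correlate(rx, code, mode='valid')
estimated_delay = int(np.argmax(corr))
```

    Converting the estimated delay to a distance is then d = c · delay / (2 · f_s); the sample rate and speed-of-light constants are omitted here because they depend on the hardware.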

  2. Efficient Proximity Computation Techniques Using ZIP Code Data for Smart Cities †

    PubMed Central

    Murdani, Muhammad Harist; Hong, Bonghee

    2018-01-01

    In this paper, we are interested in computing ZIP code proximity from two perspectives, proximity between two ZIP codes (Ad-Hoc) and neighborhood proximity (Top-K). Such a computation can be used for ZIP code-based target marketing as one of the smart city applications. A naïve approach to this computation is the usage of the distance between ZIP codes. We redefine a distance metric combining the centroid distance with the intersecting road network between ZIP codes by using a weighted sum method. Furthermore, we prove that the results of our combined approach conform to the characteristics of distance measurement. We have proposed a general and heuristic approach for computing Ad-Hoc proximity, while for computing Top-K proximity, we have proposed a general approach only. Our experimental results indicate that our approaches are verifiable and effective in reducing the execution time and search space. PMID:29587366
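    The weighted-sum idea can be sketched as below; the weight and the inverse road-network term are assumptions chosen for illustration (the paper defines the combination precisely and proves it behaves as a distance measure):

```python
def combined_distance(centroid_km, shared_road_km, w=0.6):
    # Weighted sum of two ingredients: the plain centroid distance, plus a
    # term that shrinks as the intersecting road network between the two
    # ZIP codes grows (well-connected areas count as "closer").
    road_term = 1.0 / (1.0 + shared_road_km)
    return w * centroid_km + (1.0 - w) * road_term
```

    With this shape, two ZIP codes with the same centroid separation are ranked closer when they share more road network, which is the intuition behind replacing the naïve centroid-only distance.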

  3. Efficient Proximity Computation Techniques Using ZIP Code Data for Smart Cities †.

    PubMed

    Murdani, Muhammad Harist; Kwon, Joonho; Choi, Yoon-Ho; Hong, Bonghee

    2018-03-24

    In this paper, we are interested in computing ZIP code proximity from two perspectives, proximity between two ZIP codes (Ad-Hoc) and neighborhood proximity (Top-K). Such a computation can be used for ZIP code-based target marketing as one of the smart city applications. A naïve approach to this computation is the usage of the distance between ZIP codes. We redefine a distance metric combining the centroid distance with the intersecting road network between ZIP codes by using a weighted sum method. Furthermore, we prove that the results of our combined approach conform to the characteristics of distance measurement. We have proposed a general and heuristic approach for computing Ad-Hoc proximity, while for computing Top-K proximity, we have proposed a general approach only. Our experimental results indicate that our approaches are verifiable and effective in reducing the execution time and search space.

  4. Build and Execute Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guan, Qiang

    annotate workflows for repeatable and partial re-execution. We will coordinate the physical snapshots of virtual machines with parallel programming constructs, such as barriers, to automate checkpoint and restart. We will also integrate with HPC-specific container runtimes to gain access to accelerators and other specialized hardware to preserve native performance. Containers will link development to continuous integration. When application developers check code in, it will automatically be tested on a suite of different software and hardware architectures.

  5. Higher-Performance Executives: Bringing Executive Development Programs Into Balance

    ERIC Educational Resources Information Center

    Gilad, Benjamin; Chussil, Mark

    2013-01-01

    Executive development programs teach various skills deemed important in future leaders and help shape future leadership and its performance. However, they are often excessively focused on competencies required for dealing with internal issues and relationships. They do a much less admirable job preparing future executives for the unique skills…

  6. C code generation from Petri-net-based logic controller specification

    NASA Astrophysics Data System (ADS)

    Grobelny, Michał; Grobelna, Iwona; Karatkevich, Andrei

    2017-08-01

    The article focuses on the programming of logic controllers. It is important that the program code of a logic controller execute flawlessly according to the primary specification. In the presented approach we generate C code for an AVR microcontroller from a rule-based logical model of a control process derived from a control-interpreted Petri net. The same logical model is also used for formal verification of the specification by means of the model checking technique. The proposed rule-based logical model and formal transformation rules ensure that the obtained implementation is consistent with the already verified specification. The approach is validated by practical experiments.
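    The flavor of such a transformation can be shown with a toy emitter that turns one rule of a rule-based logical model into a C guard (the rule shape and signal names are illustrative assumptions; the paper's transformation rules and AVR specifics are richer):

```python
def rule_to_c(conditions, output):
    """Emit one guarded C assignment from a rule such as x1 & !x2 -> y1.

    conditions: list of (signal_name, required_value) pairs.
    """
    guard = " && ".join(s if v else "!" + s for s, v in conditions)
    return "if (%s) { %s = 1; }" % (guard, output)
```

    Because each rule maps to one self-contained guarded statement, the generated C stays in an obvious one-to-one correspondence with the verified logical model.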

  7. The Role of Executive Functions in the Control of Aggressive Behavior

    PubMed Central

    Krämer, Ulrike M.; Kopyciok, Robert P. J.; Richter, Sylvia; Rodriguez-Fornells, Antoni; Münte, Thomas F.

    2011-01-01

    An extensive literature suggests a link between executive functions and aggressive behavior in humans, pointing mostly to an inverse relationship, i.e., increased tendencies toward aggression in individuals scoring low on executive function tests. This literature is limited, though, in terms of the groups studied and the measures of executive functions. In this paper, we present data from two studies addressing these issues. In a first behavioral study, we asked whether high trait aggressiveness is related to reduced executive functions. A sample of over 600 students performed in an extensive behavioral test battery including paradigms addressing executive functions such as the Eriksen Flanker task, Stroop task, n-back task, and Tower of London (TOL). High trait aggressive participants were found to have a significantly reduced latency score in the TOL, indicating more impulsive behavior compared to low trait aggressive participants. No other differences were detected. In an EEG-study, we assessed neural and behavioral correlates of error monitoring and response inhibition in participants who were characterized based on their laboratory-induced aggressive behavior in a competitive reaction time task. Participants who retaliated more in the aggression paradigm and had reduced frontal activity when being provoked did not, however, show any reduction in behavioral or neural correlates of executive control compared to the less aggressive participants. Our results question a strong relationship between aggression and executive functions at least for healthy, high-functioning people. PMID:21747775

  8. A (72, 36; 15) box code

    NASA Technical Reports Server (NTRS)

    Solomon, G.

    1993-01-01

    A (72,36;15) box code is constructed as a 9 x 8 matrix whose columns add to form an extended BCH-Hamming (8,4;4) code and whose rows sum to odd or even parity. The newly constructed code, due to its matrix form, is easily decodable for all seven-error and many eight-error patterns. The code comes from a slight modification in the parity (eighth) dimension of the Reed-Solomon (8,4;5) code over GF(512). Error correction uses the row sum parity information to detect errors, which then become erasures in a Reed-Solomon correction algorithm.
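    The column constituent code can be checked numerically. Below is a generator matrix of an [8,4] extended-Hamming-type code (an illustrative choice of generator, not necessarily the one used in the construction) together with a brute-force verification that its minimum distance is 4:

```python
import itertools

# Generator matrix of an [8,4,4] extended Hamming-type code:
# 4 information bits followed by 4 parity bits.
G = [
    [1, 0, 0, 0, 0, 1, 1, 1],
    [0, 1, 0, 0, 1, 0, 1, 1],
    [0, 0, 1, 0, 1, 1, 0, 1],
    [0, 0, 0, 1, 1, 1, 1, 0],
]

def codeword(message):
    # GF(2) vector-matrix product: XOR of the rows selected by the message
    word = [0] * 8
    for bit, row in zip(message, G):
        if bit:
            word = [a ^ b for a, b in zip(word, row)]
    return word

# For a linear code, minimum distance = minimum nonzero codeword weight
weights = [sum(codeword(m)) for m in itertools.product([0, 1], repeat=4) if any(m)]
min_distance = min(weights)
```

    Since every nonzero codeword has weight 4 or 8, single columns of the box code already detect up to three errors; the row parities then supply the extra erasure information used by the Reed-Solomon correction step.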

  9. Generic detection of poleroviruses using an RT-PCR assay targeting the RdRp coding sequence.

    PubMed

    Lotos, Leonidas; Efthimiou, Konstantinos; Maliogka, Varvara I; Katis, Nikolaos I

    2014-03-01

    In this study a two-step RT-PCR assay was developed for the generic detection of poleroviruses. The RdRp coding region was selected as the primers' target, since it differs significantly from that of other members in the family Luteoviridae and its sequence can be more informative than other regions in the viral genome. Species specific RT-PCR assays targeting the same region were also developed for the detection of the six most widespread poleroviral species (Beet mild yellowing virus, Beet western yellows virus, Cucurbit aphid-borne virus, Carrot red leaf virus, Potato leafroll virus and Turnip yellows virus) in Greece and the collection of isolates. These isolates along with other characterized ones were used for the evaluation of the generic PCR's detection range. The developed assay efficiently amplified a 593 bp RdRp fragment from 46 isolates of 10 different Polerovirus species. Phylogenetic analysis using the generic PCR's amplicon sequence showed that although it cannot accurately infer evolutionary relationships within the genus it can differentiate poleroviruses at the species level. Overall, the described generic assay could be applied for the reliable detection of Polerovirus infections and, in combination with the specific PCRs, for the identification of new and uncharacterized species in the genus. Copyright © 2013 Elsevier B.V. All rights reserved.

  10. Error coding simulations

    NASA Technical Reports Server (NTRS)

    Noble, Viveca K.

    1993-01-01

    There are various elements such as radio frequency interference (RFI) which may induce errors in data being transmitted via a satellite communication link. When a transmission is affected by interference or other error-causing elements, the transmitted data becomes indecipherable. It becomes necessary to implement techniques to recover from these disturbances. The objective of this research is to develop software which simulates error control circuits and evaluate the performance of these modules in various bit error rate environments. The results of the evaluation provide the engineer with information which helps determine the optimal error control scheme. The Consultative Committee for Space Data Systems (CCSDS) recommends the use of Reed-Solomon (RS) and convolutional encoders and Viterbi and RS decoders for error correction. The use of forward error correction techniques greatly reduces the received signal-to-noise ratio needed for a certain desired bit error rate. The use of concatenated coding, e.g. inner convolutional code and outer RS code, provides even greater coding gain. The 16-bit cyclic redundancy check (CRC) code is recommended by CCSDS for error detection.
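    The CCSDS-recommended 16-bit CRC is the CCITT polynomial x^16 + x^12 + x^5 + 1 with an all-ones preset; a bitwise software sketch:

```python
def crc16(data: bytes, poly=0x1021, init=0xFFFF):
    # Bitwise CRC-16/CCITT-FALSE: polynomial x^16 + x^12 + x^5 + 1,
    # all-ones preset, no reflection, no final XOR.
    crc = init
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            if crc & 0x8000:
                crc = ((crc << 1) ^ poly) & 0xFFFF
            else:
                crc = (crc << 1) & 0xFFFF
    return crc
```

    The standard check value for the ASCII string "123456789" is 0x29B1, and any single-bit corruption of the message changes the CRC, which is how the decoder detects residual errors after the RS/Viterbi correction stages.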

  11. Dynamic wavefront creation for processing units using a hybrid compactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Puthoor, Sooraj; Beckmann, Bradford M.; Yudanov, Dmitri

    A method, a non-transitory computer readable medium, and a processor for repacking dynamic wavefronts during program code execution on a processing unit, each dynamic wavefront including multiple threads, are presented. If a branch instruction is detected, a determination is made whether all wavefronts following a same control path in the program code have reached a compaction point, which is the branch instruction. If no branch instruction is detected in executing the program code, a determination is made whether all wavefronts following the same control path have reached a reconvergence point, which is the beginning of a program code segment to be executed by both a taken branch and a not-taken branch from a previous branch instruction. The dynamic wavefronts are repacked with all threads that follow the same control path if all wavefronts following the same control path have reached the branch instruction or the reconvergence point.

  12. 3 CFR 13488 - Executive Order 13488 of January 16, 2009. Granting Reciprocity on Excepted Service and Federal...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., United States Code, but does not include those positions in any element of the intelligence community as... 3 The President 1 2010-01-01 2010-01-01 false Executive Order 13488 of January 16, 2009. Granting... and Reinvestigating Individuals in Positions of Public Trust By the authority vested in me as...

  13. Genetic Code Analysis Toolkit: A novel tool to explore the coding properties of the genetic code and DNA sequences

    NASA Astrophysics Data System (ADS)

    Kraljić, K.; Strüngmann, L.; Fimmel, E.; Gumbel, M.

    2018-01-01

    The genetic code is degenerated and it is assumed that redundancy provides error detection and correction mechanisms in the translation process. However, the biological meaning of the code's structure is still under current research. This paper presents a Genetic Code Analysis Toolkit (GCAT) which provides workflows and algorithms for the analysis of the structure of nucleotide sequences. In particular, sets or sequences of codons can be transformed and tested for circularity, comma-freeness, dichotomic partitions and others. GCAT comes with a fertile editor custom-built to work with the genetic code and a batch mode for multi-sequence processing. With the ability to read FASTA files or load sequences from GenBank, the tool can be used for the mathematical and statistical analysis of existing sequence data. GCAT is Java-based and provides a plug-in concept for extensibility. Availability: Open source. Homepage: http://www.gcat.bio/

  14. Evaluation of the finite element fuel rod analysis code (FRANCO)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, K.; Feltus, M.A.

    1994-12-31

    Knowledge of the temperature distribution in a nuclear fuel rod is required to predict the behavior of fuel elements during operating conditions. The thermal and mechanical properties and performance characteristics are strongly dependent on the temperature, which can vary greatly inside the fuel rod. A detailed model of fuel rod behavior can be described by various numerical methods, including the finite element approach. The finite element method has been successfully used in many engineering applications, including nuclear piping and reactor component analysis. However, fuel pin analysis has traditionally been carried out with finite difference codes, with the exception of the Electric Power Research Institute's FREY code, which was developed for mainframe execution. This report describes FRANCO, a finite element fuel rod analysis code capable of computing the temperature distribution and mechanical deformation of a single light water reactor fuel rod.

  15. The structure of affective action representations: temporal binding of affective response codes.

    PubMed

    Eder, Andreas B; Müsseler, Jochen; Hommel, Bernhard

    2012-01-01

    Two experiments examined the hypothesis that preparing an action with a specific affective connotation involves the binding of this action to an affective code reflecting this connotation. This integration into an action plan should lead to a temporary occupation of the affective code, which should impair the concurrent representation of affectively congruent events, such as the planning of another action with the same valence. This hypothesis was tested with a dual-task setup that required a speeded choice between approach- and avoidance-type lever movements after having planned and before having executed an evaluative button press. In line with the code-occupation hypothesis, slower lever movements were observed when the lever movement was affectively compatible with the prepared evaluative button press than when the two actions were affectively incompatible. Lever movements related to approach and avoidance and evaluative button presses thus seem to share a code that represents affective meaning. A model of affective action control that is based on the theory of event coding is discussed.

  16. Evaluating Open-Source Full-Text Search Engines for Matching ICD-10 Codes.

    PubMed

    Jurcău, Daniel-Alexandru; Stoicu-Tivadar, Vasile

    2016-01-01

    This research presents the results of evaluating multiple free, open-source engines on matching ICD-10 diagnostic codes via full-text searches. The study investigates what it takes to get an accurate match when searching for a specific diagnostic code. For each code, the evaluation starts by extracting the words that make up its text and continues by building full-text search queries from combinations of these words. The queries are then run against all the ICD-10 codes until one returns the code in question as the match with the highest relative score. This method identifies the minimum number of words that must be provided in order for the search engines to choose the desired entry. The engines analyzed include a popular Java-based full-text search engine, a lightweight engine written in JavaScript which can even execute in the user's browser, and two popular open-source relational database management systems.
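    The word-combination procedure can be sketched against a toy code table (the entries below are illustrative, not real ICD-10 text, and a real engine adds stemming and relevance scoring on top of the plain word match used here):

```python
from itertools import combinations

# Toy diagnostic table; real ICD-10 entries are longer and overlap more.
CODES = {
    "J18.9": "pneumonia unspecified organism",
    "J15.9": "bacterial pneumonia unspecified",
    "J12.9": "viral pneumonia unspecified",
}

def minimal_words(target):
    """Smallest word combination whose match set is exactly the target code."""
    words = CODES[target].split()
    for r in range(1, len(words) + 1):          # try short queries first
        for combo in combinations(words, r):
            hits = [c for c, text in CODES.items()
                    if all(w in text.split() for w in combo)]
            if hits == [target]:
                return combo
    return None
```

    Here "pneumonia" or "unspecified" alone match all three codes, so a single discriminating word ("organism") is the minimum query for J18.9.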

  17. ESAS Deliverable PS 1.1.2.3: Customer Survey on Code Generations in Safety-Critical Applications

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Denney, Ewen

    2006-01-01

    Automated code generators (ACG) are tools that convert a (higher-level) model of a software (sub-)system into executable code without the necessity for a developer to actually implement the code. Although both commercially supported and in-house tools have been used in many industrial applications, little data exists on how these tools are used in safety-critical domains (e.g., spacecraft, aircraft, automotive, nuclear). The aims of the survey, therefore, were threefold: 1) to determine if code generation is primarily used as a tool for prototyping, including design exploration and simulation, or for flight/production code; 2) to determine the verification issues with code generators relating, in particular, to qualification and certification in safety-critical domains; and 3) to determine perceived gaps in functionality of existing tools.

  18. State recovery and lockstep execution restart in a system with multiprocessor pairing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gara, Alan; Gschwind, Michael K; Salapura, Valentina

    System, method and computer program product for a multiprocessing system to offer selective pairing of processor cores for increased processing reliability. A selective pairing facility is provided that selectively connects, i.e., pairs, multiple microprocessor or processor cores to provide one highly reliable thread (or thread group). The paired cores that provide one highly reliable thread connect with system components such as a memory "nest" (or memory hierarchy), an optional system controller, an optional interrupt controller, optional I/O or peripheral devices, etc. The memory nest is attached to the selective pairing facility via a switch or a bus. Each selectively paired processor core includes a transactional execution facility, wherein the system is configured to enable processor rollback to a previous state and reinitialize lockstep execution in order to recover from an incorrect execution when one has been detected by the selective pairing facility.

  19. Executive working memory load induces inattentional blindness.

    PubMed

    Fougnie, Daryl; Marois, René

    2007-02-01

    When attention is engaged in a task, unexpected events in the visual scene may go undetected, a phenomenon known as inattentional blindness (IB). At what stage of information processing must attention be engaged for IB to occur? Although manipulations that tax visuospatial attention can induce IB, the evidence is more equivocal for tasks that engage attention at late, central stages of information processing. Here, we tested whether IB can be specifically induced by central executive processes. An unexpected visual stimulus was presented during the retention interval of a working memory task that involved either simply maintaining verbal material or rearranging the material into alphabetical order. The unexpected stimulus was more likely to be missed during manipulation than during simple maintenance of the verbal information. Thus, the engagement of executive processes impairs the ability to detect unexpected, task-irrelevant stimuli, suggesting that IB can result from central, amodal stages of processing.

  20. Using predicated execution to improve the performance of a dynamically scheduled machine with speculative execution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, P.Y.; Hao, E.; Patt, Y.

    Conditional branches incur a severe performance penalty in wide-issue, deeply pipelined processors. Speculative execution and predicated execution are two mechanisms that have been proposed for reducing this penalty. Speculative execution can completely eliminate the penalty associated with a particular branch, but requires accurate branch prediction to be effective. Predicated execution does not require accurate branch prediction to eliminate the branch penalty, but is not applicable to all branches and can increase the latencies within the program. This paper examines the performance benefit of using both mechanisms to reduce the branch execution penalty. Predicated execution is used to handle the hard-to-predict branches and speculative execution is used to handle the remaining branches. The hard-to-predict branches within the program are determined by profiling. We show that this approach can significantly reduce the branch execution penalty suffered by wide-issue processors.
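    The core idea of predication (compute both paths, commit results under a predicate, no control-flow transfer) can be sketched in software with a branchless per-element select; this is an array-level analogy, whereas hardware predication guards individual scalar instructions:

```python
import numpy as np

def branchless_abs(x):
    # Both "paths" (x and -x) are evaluated unconditionally; the predicate
    # then selects per element, so no data-dependent branch is executed
    # and there is nothing for a branch predictor to mispredict.
    predicate = x < 0
    return np.where(predicate, -x, x)
```

    The cost is exactly the trade-off the paper describes: both paths are always executed, which lengthens the dependence chain, so predication pays off mainly on branches that a predictor would frequently miss.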

  1. Memory, executive, and multidomain subtle cognitive impairment: clinical and biomarker findings.

    PubMed

    Toledo, Jon B; Bjerke, Maria; Chen, Kewei; Rozycki, Martin; Jack, Clifford R; Weiner, Michael W; Arnold, Steven E; Reiman, Eric M; Davatzikos, Christos; Shaw, Leslie M; Trojanowski, John Q

    2015-07-14

    We studied the biomarker signatures and prognoses of 3 different subtle cognitive impairment (SCI) groups (executive, memory, and multidomain) as well as the subjective memory complaints (SMC) group. We studied 522 healthy controls in the Alzheimer's Disease Neuroimaging Initiative (ADNI). Cutoffs for executive, memory, and multidomain SCI were defined using participants who remained cognitively normal (CN) for 7 years. CSF Alzheimer disease (AD) biomarkers, composite and region-of-interest (ROI) MRI, and fluorodeoxyglucose-PET measures were compared in these participants. Using a stringent cutoff (fifth percentile), 27.6% of the ADNI participants were classified as SCI. Most single ROI or global-based measures were not sensitive to detect differences between groups. Only MRI-SPARE-AD (Spatial Pattern of Abnormalities for Recognition of Early AD), a quantitative MRI pattern-based global index, showed differences between all groups, excluding the executive SCI group. Atrophy patterns differed in memory SCI and SMC. The CN and the SMC groups presented a similar distribution of preclinical dementia stages. Fifty percent of the participants with executive, memory, and multidomain SCI progressed to mild cognitive impairment or dementia at 7, 5, and 2 years, respectively. Our results indicate that (1) the different SCI categories have different clinical prognoses and biomarker signatures, (2) longitudinally followed CN subjects are needed to establish clinical cutoffs, (3) subjects with SMC show a frontal pattern of brain atrophy, and (4) pattern-based analyses outperform commonly used single ROI-based neuroimaging biomarkers and are needed to detect initial stages of cognitive impairment. © 2015 American Academy of Neurology.

  2. Plan Execution Interchange Language (PLEXIL)

    NASA Technical Reports Server (NTRS)

    Estlin, Tara; Jonsson, Ari; Pasareanu, Corina; Simmons, Reid; Tso, Kam; Verma, Vandi

    2006-01-01

    Plan execution is a cornerstone of spacecraft operations, irrespective of whether the plans to be executed are generated on board the spacecraft or on the ground. Plan execution frameworks vary greatly, due to both different capabilities of the execution systems, and relations to associated decision-making frameworks. The latter dependency has made the reuse of execution and planning frameworks more difficult, and has all but precluded information sharing between different execution and decision-making systems. As a step in the direction of addressing some of these issues, a general plan execution language, called the Plan Execution Interchange Language (PLEXIL), is being developed. PLEXIL is capable of expressing concepts used by many high-level automated planners and hence provides an interface to multiple planners. PLEXIL includes a domain description that specifies command types, expansions, constraints, etc., as well as feedback to the higher-level decision-making capabilities. This document describes the grammar and semantics of PLEXIL. It includes a graphical depiction of this grammar and illustrative rover scenarios. It also outlines ongoing work on implementing a universal execution system, based on PLEXIL, using state-of-the-art rover functional interfaces and planners as test cases.

  3. 3 CFR 13499 - Executive Order 13499 of February 5, 2009. Further Amendments to Executive Order 12835...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 3 The President 1 2010-01-01 2010-01-01 false Executive Order 13499 of February 5, 2009. Further Amendments to Executive Order 12835, Establishment of the National Economic Council 13499 Order 13499 Presidential Documents Executive Orders Executive Order 13499 of February 5, 2009 EO 13499 Further Amendments to Executive Order 12835, Establishmen...

  4. 3 CFR 13500 - Executive Order 13500 of February 5, 2009. Further Amendments to Executive Order 12859...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 3 The President 1 2010-01-01 2010-01-01 false Executive Order 13500 of February 5, 2009. Further Amendments to Executive Order 12859, Establishment of the Domestic Policy Council 13500 Order 13500 Presidential Documents Executive Orders Executive Order 13500 of February 5, 2009 EO 13500 Further Amendments to Executive Order 12859, Establishment...

  5. Reprint of "Two-stage sparse coding of region covariance via Log-Euclidean kernels to detect saliency".

    PubMed

    Zhang, Ying-Ying; Yang, Cai; Zhang, Ping

    2017-08-01

    In this paper, we present a novel bottom-up saliency detection algorithm from the perspective of covariance matrices on a Riemannian manifold. Each superpixel is described by a region covariance matrix on Riemannian manifolds. We carry out a two-stage sparse coding scheme via Log-Euclidean kernels to extract salient objects efficiently. In the first stage, given a background dictionary on image borders, sparse coding of each region covariance via Log-Euclidean kernels is performed. The reconstruction error on the background dictionary is regarded as the initial saliency of each superpixel. In the second stage, an improvement of the initial result is achieved by calculating reconstruction errors of the superpixels on a foreground dictionary, which is extracted from the first-stage saliency map. The sparse coding in the second stage is similar to the first stage, but is able to effectively highlight the salient objects uniformly from the background. Finally, three post-processing methods (highlight-inhibition function, context-based saliency weighting, and graph cut) are adopted to further refine the saliency map. Experiments on four public benchmark datasets show that the proposed algorithm outperforms the state-of-the-art methods in terms of precision, recall and mean absolute error, and demonstrate the robustness and efficiency of the proposed method. Copyright © 2017 Elsevier Ltd. All rights reserved.
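
The first-stage computation, reconstruction error on a background dictionary used as saliency, can be sketched without the Log-Euclidean kernel machinery. The sketch below uses plain Euclidean orthogonal matching pursuit as a stand-in sparse coder; `omp` and `saliency_from_dictionary` are hypothetical names, and real region-covariance descriptors would first be mapped through the Log-Euclidean kernel:

```python
import numpy as np

def omp(D, x, k):
    # Orthogonal matching pursuit: greedily pick up to k dictionary atoms,
    # refit by least squares, and return the final reconstruction error.
    residual = x.astype(float).copy()
    chosen = []
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in chosen:
            chosen.append(j)
        coef, *_ = np.linalg.lstsq(D[:, chosen], x, rcond=None)
        residual = x - D[:, chosen] @ coef
    return np.linalg.norm(residual)

def saliency_from_dictionary(D_bg, descriptors, k=3):
    # Reconstruction error on the background dictionary = initial saliency:
    # descriptors the background atoms cannot explain score high.
    return np.array([omp(D_bg, d, k) for d in descriptors])
```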

  6. SIERRA Code Coupling Module: Arpeggio User Manual Version 4.44

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Subia, Samuel R.; Overfelt, James R.; Baur, David G.

    2017-04-01

    The SNL Sierra Mechanics code suite is designed to enable simulation of complex multiphysics scenarios. The code suite is composed of several specialized applications which can operate either in standalone mode or coupled with each other. Arpeggio is a supported utility that enables loose coupling of the various Sierra Mechanics applications by providing access to Framework services that facilitate the coupling. More importantly Arpeggio orchestrates the execution of applications that participate in the coupling. This document describes the various components of Arpeggio and their operability. The intent of the document is to provide a fast path for analysts interested in coupled applications via simple examples of its usage.

  7. Multiple-Symbol Noncoherent Decoding of Uncoded and Convolutionally Coded Continuous Phase Modulation

    NASA Technical Reports Server (NTRS)

    Divsalar, D.; Raphaeli, D.

    2000-01-01

    Recently, a method for combined noncoherent detection and decoding of trellis-codes (noncoherent coded modulation) has been proposed, which can practically approach the performance of coherent detection.

  8. Recurrent criminal behavior and executive dysfunction.

    PubMed

    Santos Barbosa, Manuel Fernando; Coelho Monteiro, Luis Manuel

    2008-05-01

    To experimentally test the hypothesis that people who repeatedly participate in forms of non-violent crime exhibit an executive deficit detected in tests of high ecological validity, having changes in prefrontal functioning as a neurophysiologic basis. A battery to assess executive dysfunction, the Behavioural Assessment of the Dysexecutive Syndrome (BADS), was administered to an experimental group of 30 inmates convicted of crimes against property (mean age = 39.3, SD = 9.98), and a control group of 30 (mean age = 32.7, SD = 11.8), all male. The group of recurrent inmates performed significantly worse than the control group in their global scores on the battery, as well as in the majority of subscales. Without removing from consideration the fact that the sample size was not very large and, primarily, alerting ourselves to the dangerous hypothesis of a "frontal criminogenesis," the authors interpret criminal recurrence and resistance to penal measures in terms of the scarcity of control that individuals from the experimental group have over their behavior and its respective consequences.

  9. Commentary: Mentoring the mentor: executive coaching for clinical departmental executive officers.

    PubMed

    Geist, Lois J; Cohen, Michael B

    2010-01-01

    Departmental executive officers (DEOs), department chairs, and department heads in medical schools are often hired on the basis of their accomplishments in research as well as their skills in administration, management, and leadership. These individuals are also expected to be expert in multiple areas, including negotiation, finance and budgeting, mentoring, and personnel management. At the same time, they are expected to maintain and perhaps even enhance their personal academic standing for the purposes of raising the level of departmental and institutional prestige and for recruiting the next generation of physicians and scientists. In the corporate world, employers understand the importance of training new leaders in requisite skill enhancement that will lead to success in their new positions. These individuals are often provided with extensive executive training to develop the necessary competencies to make them successful leaders. Among the tools employed for this purpose are the use of personal coaches or executive training courses. The authors propose that the use of executive coaching in academic medicine may be of benefit for new DEOs. Experience using an executive coach suggests that this was a valuable growth experience for new leaders in the institution.

  10. Performance of concatenated Reed-Solomon trellis-coded modulation over Rician fading channels

    NASA Technical Reports Server (NTRS)

    Moher, Michael L.; Lodge, John H.

    1990-01-01

    A concatenated coding scheme for providing very reliable data over mobile-satellite channels at power levels similar to those used for vocoded speech is described. The outer code is a shortened Reed-Solomon code, which provides error detection as well as error correction capabilities. The inner code is a 1-D 8-state trellis code applied independently to both the inphase and quadrature channels. To achieve the full error correction potential of this inner code, the code symbols are multiplexed with a pilot sequence which is used to provide dynamic channel estimation and coherent detection. The implementation structure of this scheme is discussed and its performance is estimated.
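
The pilot-multiplexing idea, interleaving known symbols so the receiver can estimate the fading channel and detect coherently, can be sketched as follows. This is a simplified model (one pilot per block, complex channel gain constant over a block), not the paper's actual trellis scheme; the function names are invented:

```python
import numpy as np

def tx_multiplex(data, pilot=1 + 0j, block=3):
    # Prepend one known pilot symbol to every block of data symbols.
    blocks = data.reshape(-1, block)
    pilots = np.full((blocks.shape[0], 1), pilot)
    return np.hstack([pilots, blocks]).ravel()

def rx_coherent(received, pilot=1 + 0j, block=3):
    # Estimate the complex channel gain from each received pilot, then
    # derotate/rescale the data symbols for coherent detection.
    frames = received.reshape(-1, block + 1)
    h_est = frames[:, :1] / pilot
    return (frames[:, 1:] / h_est).ravel()
```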

  11. Geosoft eXecutables (GX's) Developed by the U.S. Geological Survey, Version 2.0, with Notes on GX Development from Fortran Code

    USGS Publications Warehouse

    Phillips, Jeffrey D.

    2007-01-01

    Introduction Geosoft executables (GX's) are custom software modules for use with the Geosoft Oasis montaj geophysical data processing system, which currently runs under the Microsoft Windows 2000 or XP operating systems. The U.S. Geological Survey (USGS) uses Oasis montaj primarily for the processing and display of airborne geophysical data. The ability to add custom software modules to the Oasis montaj system is a feature employed by the USGS in order to take advantage of the large number of geophysical algorithms developed by the USGS during the past half century. The main part of this report, along with Appendix 1, describes Version 2.0 GX's developed by the USGS or specifically for the USGS by contractors. These GX's perform both basic and advanced operations. Version 1.0 GX's developed by the USGS were described by Phillips and others (2003), and are included in Version 2.0. Appendix 1 contains the help files for the individual GX's. Appendix 2 describes the new method that was used to create the compiled GX files, starting from legacy Fortran source code. Although the new method shares many steps with the approach presented in the Geosoft GX Developer manual, it differs from that approach in that it uses free, open-source Fortran and C compilers and avoids all Fortran-to-C conversion.

  12. Overview of Particle and Heavy Ion Transport Code System PHITS

    NASA Astrophysics Data System (ADS)

    Sato, Tatsuhiko; Niita, Koji; Matsuda, Norihiro; Hashimoto, Shintaro; Iwamoto, Yosuke; Furuta, Takuya; Noda, Shusaku; Ogawa, Tatsuhiko; Iwase, Hiroshi; Nakashima, Hiroshi; Fukahori, Tokio; Okumura, Keisuke; Kai, Tetsuya; Chiba, Satoshi; Sihver, Lembit

    2014-06-01

    A general purpose Monte Carlo Particle and Heavy Ion Transport code System, PHITS, is being developed through the collaboration of several institutes in Japan and Europe. The Japan Atomic Energy Agency is responsible for managing the entire project. PHITS can deal with the transport of nearly all particles, including neutrons, protons, heavy ions, photons, and electrons, over wide energy ranges using various nuclear reaction models and data libraries. It is written in Fortran language and can be executed on almost all computers. All components of PHITS such as its source, executable and data-library files are assembled in one package and then distributed to many countries via the Research Organization for Information Science and Technology, the Data Bank of the Organization for Economic Co-operation and Development's Nuclear Energy Agency, and the Radiation Safety Information Computational Center. More than 1,000 researchers have been registered as PHITS users, and they apply the code to various research and development fields such as nuclear technology, accelerator design, medical physics, and cosmic-ray research. This paper briefly summarizes the physics models implemented in PHITS, and introduces some important functions useful for specific applications, such as an event generator mode and beam transport functions.

  13. CHEETAH: A next generation thermochemical code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fried, L.; Souers, P.

    1994-11-01

    CHEETAH is an effort to bring the TIGER thermochemical code into the 1990s. A wide variety of improvements have been made in Version 1.0. We have improved the robustness and ease of use of TIGER. All of TIGER's solvers have been replaced by new algorithms. We find that CHEETAH solves a wider variety of problems with no user intervention (e.g. no guesses for the C-J state) than TIGER did. CHEETAH has been made simpler to use than TIGER; typical use of the code occurs with the new standard run command. CHEETAH will make the use of thermochemical codes more attractive to practical explosive formulators. We have also made an extensive effort to improve over the results of TIGER. CHEETAH's version of the BKW equation of state (BKWC) is able to accurately reproduce energies from cylinder tests; something that other BKW parameter sets have been unable to do. Calculations performed with BKWC execute very quickly; typical run times are under 10 seconds on a workstation. In the future we plan to improve the underlying science in CHEETAH. More accurate equations of state will be used in the gas and the condensed phase. A kinetics capability will be added to the code that will predict reaction zone thickness. Further ease of use features will eventually be added; an automatic formulator that adjusts concentrations to match desired properties is planned.

  14. Density-based parallel skin lesion border detection with webCL

    PubMed Central

    2015-01-01

    Background Dermoscopy is a highly effective and noninvasive imaging technique used in diagnosis of melanoma and other pigmented skin lesions. Many aspects of the lesion under consideration are defined in relation to the lesion border. This makes border detection one of the most important steps in dermoscopic image analysis. In current practice, dermatologists often delineate borders through a hand drawn representation based upon visual inspection. Due to the subjective nature of this technique, intra- and inter-observer variations are common. Because of this, the automated assessment of lesion borders in dermoscopic images has become an important area of study. Methods A fast density-based skin lesion border detection method has been implemented in parallel with a new parallel technology called WebCL. WebCL utilizes client-side computing capabilities to use available hardware resources such as multiple cores and GPUs. The developed WebCL-parallel density-based skin lesion border detection method runs efficiently from internet browsers. Results Previous research indicates that one of the highest accuracy rates can be achieved using density-based clustering techniques for skin lesion border detection. While these algorithms do have unfavorable time complexities, this effect could be mitigated when implemented in parallel. In this study, the density-based clustering technique for skin lesion border detection is parallelized and redesigned to run very efficiently on heterogeneous platforms (e.g. tablets, SmartPhones, multi-core CPUs, GPUs, and fully-integrated Accelerated Processing Units) by transforming the technique into a series of independent concurrent operations. Heterogeneous computing is adopted to support accessibility, portability and multi-device use in the clinical settings. For this, we used WebCL, an emerging technology that enables an HTML5 Web browser to execute code in parallel for heterogeneous platforms. 
We depicted WebCL and our parallel algorithm design. In

  15. Density-based parallel skin lesion border detection with webCL.

    PubMed

    Lemon, James; Kockara, Sinan; Halic, Tansel; Mete, Mutlu

    2015-01-01

    Dermoscopy is a highly effective and noninvasive imaging technique used in diagnosis of melanoma and other pigmented skin lesions. Many aspects of the lesion under consideration are defined in relation to the lesion border. This makes border detection one of the most important steps in dermoscopic image analysis. In current practice, dermatologists often delineate borders through a hand drawn representation based upon visual inspection. Due to the subjective nature of this technique, intra- and inter-observer variations are common. Because of this, the automated assessment of lesion borders in dermoscopic images has become an important area of study. A fast density-based skin lesion border detection method has been implemented in parallel with a new parallel technology called WebCL. WebCL utilizes client-side computing capabilities to use available hardware resources such as multiple cores and GPUs. The developed WebCL-parallel density-based skin lesion border detection method runs efficiently from internet browsers. Previous research indicates that one of the highest accuracy rates can be achieved using density-based clustering techniques for skin lesion border detection. While these algorithms do have unfavorable time complexities, this effect could be mitigated when implemented in parallel. In this study, the density-based clustering technique for skin lesion border detection is parallelized and redesigned to run very efficiently on heterogeneous platforms (e.g. tablets, SmartPhones, multi-core CPUs, GPUs, and fully-integrated Accelerated Processing Units) by transforming the technique into a series of independent concurrent operations. Heterogeneous computing is adopted to support accessibility, portability and multi-device use in the clinical settings. For this, we used WebCL, an emerging technology that enables an HTML5 Web browser to execute code in parallel for heterogeneous platforms. We depicted WebCL and our parallel algorithm design. In addition, we tested
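
The core/border classification that density-based clustering uses for lesion boundaries can be sketched sequentially (the papers' contribution is the WebCL parallelization, which is omitted here). `density_border` is a hypothetical, DBSCAN-style sketch:

```python
import numpy as np

def density_border(points, eps=1.5, min_core=5):
    # Pairwise distances between all points (fine for a sketch; a real
    # implementation would parallelize this, as the WebCL version does).
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    neighbours = d <= eps
    # Core points have at least min_core neighbours (excluding themselves);
    # border points are non-core points adjacent to at least one core point.
    core = neighbours.sum(axis=1) - 1 >= min_core
    border = ~core & (neighbours & core[None, :]).any(axis=1)
    return core, border
```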

  16. Generating performance portable geoscientific simulation code with Firedrake (Invited)

    NASA Astrophysics Data System (ADS)

    Ham, D. A.; Bercea, G.; Cotter, C. J.; Kelly, P. H.; Loriant, N.; Luporini, F.; McRae, A. T.; Mitchell, L.; Rathgeber, F.

    2013-12-01

    , can be written as short C kernels operating locally on the underlying mesh, with no explicit parallelism. The executable code is then generated in C, CUDA or OpenCL and executed in parallel on the target architecture. The system also offers features of special relevance to the geosciences. In particular, the large scale separation between the vertical and horizontal directions in many geoscientific processes can be exploited to offer the flexibility of unstructured meshes in the horizontal direction, without the performance penalty usually associated with those methods.

  17. Is intelligence equivalent to executive functions?

    PubMed

    Ardila, Alfredo

    2018-05-01

    Since the mid 19th century, cognitive and behavioral neurosciences have attempted to find the neurological bases of intellectual abilities. During the early 20th century the psychometric concept of "intelligence" was coined, and toward the end of the 20th century the neuropsychological concept of "executive functions" was introduced. Controversies, however, remain about the unity or heterogeneity of so-called executive functions. It is proposed that two major executive functions could be separated: metacognitive (or intellectual) and emotional/motivational. A similar distinction has been suggested by several authors. Standard definitions of intelligence implicitly assume that executive functions represent the fundamental components of intelligence. Research has demonstrated that, if considered as a whole, executive functions only partially correspond to the psychometric concept of intelligence; whereas some specific executive functions clearly correspond to intelligence, some others do not involve intelligence. If using a major distinction between metacognitive (or simply "intellectual") executive functions and emotional/motivational (or simply non-intellectual) executive functions, it becomes evident that general intelligence can be equated with metacognitive executive functions but not with emotional/motivational executive functions.

  18. Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder

    NASA Technical Reports Server (NTRS)

    Staats, Matt

    2009-01-01

    We present work on a prototype tool based on the JavaPathfinder (JPF) model checker for automatically generating tests satisfying the MC/DC code coverage criterion. Using the Eclipse IDE, developers and testers can quickly instrument Java source code with JPF annotations covering all MC/DC coverage obligations, and JPF can then be used to automatically generate tests that satisfy these obligations. The prototype extension to JPF enables various tasks useful in automatic test generation to be performed, such as test suite reduction and execution of generated tests.
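
An MC/DC coverage obligation requires, for each condition in a decision, a pair of test inputs that differ only in that condition and that flip the decision's outcome. A brute-force sketch of enumerating such independence pairs (not JPF's model-checking approach; `mcdc_pairs` is a hypothetical helper):

```python
from itertools import product

def mcdc_pairs(decision, n):
    # For each condition i, search for an "independence pair": two inputs
    # differing only in condition i whose decision outcomes differ.
    pairs = {}
    for i in range(n):
        for bits in product((0, 1), repeat=n):
            flipped = list(bits)
            flipped[i] ^= 1
            if decision(*bits) != decision(*flipped):
                pairs.setdefault(i, (bits, tuple(flipped)))
                break
    return pairs
```

A condition with no pair can never independently affect the outcome, so no test suite can satisfy MC/DC for it; tools like the JPF extension instead search for satisfying tests symbolically rather than by enumeration.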

  19. Executive pay trends and golden parachute tax: a collision on the horizon.

    PubMed

    Johnson, David G

    2004-01-01

    Ironically, many corporations will likely discover that tying equity-based executive compensation more closely to performance will cost millions of dollars when there is a merger or acquisition. The reason: Internal Revenue Code Section 280G, which is designed to discourage "excess" parachute payments, often assesses a significantly higher toll on performance-based compensation than on time-vested equity payments. There is no magic remedy, but advance planning can often help mitigate the impact. This article describes the dilemma and suggests several approaches to the challenge.

  20. Quality optimized medical image information hiding algorithm that employs edge detection and data coding.

    PubMed

    Al-Dmour, Hayat; Al-Ani, Ahmed

    2016-04-01

    The present work has the goal of developing a secure medical imaging information system based on a combined steganography and cryptography technique. It attempts to securely embed patient's confidential information into his/her medical images. The proposed information security scheme conceals coded Electronic Patient Records (EPRs) into medical images in order to protect the EPRs' confidentiality without affecting the image quality and particularly the Region of Interest (ROI), which is essential for diagnosis. The secret EPR data is converted into ciphertext using a private symmetric encryption method. Since the Human Visual System (HVS) is less sensitive to alterations in sharp regions compared to uniform regions, a simple edge detection method has been introduced to identify and embed in edge pixels, which will lead to an improved stego image quality. In order to increase the embedding capacity, the algorithm embeds variable number of bits (up to 3) in edge pixels based on the strength of edges. Moreover, to increase the efficiency, two message coding mechanisms have been utilized to enhance the ±1 steganography. The first one, which is based on Hamming code, is simple and fast, while the other, which is known as the Syndrome Trellis Code (STC), is more sophisticated as it attempts to find a stego image that is close to the cover image through minimizing the embedding impact. The proposed steganography algorithm embeds the secret data bits into the Region of Non-Interest (RONI), while the ROI, due to its importance, is preserved from modifications. The experimental results demonstrate that the proposed method can embed large amount of secret data without leaving a noticeable distortion in the output image. The effectiveness of the proposed algorithm is also proven using one of the efficient steganalysis techniques. The proposed medical imaging information system proved to be capable of concealing EPR data and producing imperceptible stego images with minimal
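
Hamming-code message coding in steganography is usually matrix embedding: with the (7,4) parity-check matrix, 3 message bits are hidden in 7 cover LSBs while flipping at most one of them. A minimal sketch operating on LSB vectors only (the paper's edge selection and STC variant are omitted):

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code: column c (c = 1..7) is the
# binary representation of c, so flipping bit c adds that column to the syndrome.
H = np.array([[(c >> i) & 1 for c in range(1, 8)] for i in range(3)])

def embed(cover_lsbs, msg3):
    # Force the syndrome to equal the 3 message bits by flipping at most one LSB.
    s = (H @ cover_lsbs) % 2
    diff = s ^ msg3
    col = int(diff[0] + 2 * diff[1] + 4 * diff[2])
    stego = cover_lsbs.copy()
    if col:
        stego[col - 1] ^= 1
    return stego

def extract(stego_lsbs):
    # The receiver reads the message back as the syndrome of the stego LSBs.
    return (H @ stego_lsbs) % 2
```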

  1. Principles of Faithful Execution in the implementation of trusted objects.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tarman, Thomas David; Campbell, Philip LaRoche; Pierson, Lyndon George

    2003-09-01

    We begin with the following definitions: Definition: A trusted volume is the computing machinery (including communication lines) within which data is assumed to be physically protected from an adversary. A trusted volume provides both integrity and privacy. Definition: Program integrity consists of the protection necessary to enable the detection of changes in the bits comprising a program as specified by the developer, for the entire time that the program is outside a trusted volume. For ease of discussion we consider program integrity to be the aggregation of two elements: instruction integrity (detection of changes in the bits within an instruction or block of instructions), and sequence integrity (detection of changes in the locations of instructions within a program). Definition: Faithful Execution (FE) is a type of software protection that begins when the software leaves the control of the developer and ends within the trusted volume of a target processor. That is, FE provides program integrity, even while the program is in execution. (As we will show below, FE schemes are a function of trusted volume size.) FE is a necessary quality for computing. Without it we cannot trust computations. In the early days of computing FE came for free since the software never left a trusted volume. At that time the execution environment was the same as the development environment. In some circles that environment was referred to as a "closed shop": all of the software that was used there was developed there. When an organization bought a large computer from a vendor the organization would run its own operating system on that computer, use only its own editors, only its own compilers, only its own debuggers, and so on. However, with the continuing maturity of computing technology, FE becomes increasingly difficult to achieve.
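
One conventional way to obtain both instruction integrity and sequence integrity (not prescribed by the paper) is to MAC each instruction block with its position bound into the MAC, so that both bit changes and reordering are detectable. A sketch with hypothetical `protect`/`verify` helpers:

```python
import hashlib
import hmac

def protect(blocks, key):
    # Instruction integrity: MAC over each block's bits.
    # Sequence integrity: the block's position is bound into the MAC input.
    return [hmac.new(key, i.to_bytes(4, "big") + b, hashlib.sha256).digest()
            for i, b in enumerate(blocks)]

def verify(blocks, tags, key):
    # Recompute the per-position MACs and compare in constant time.
    fresh = protect(blocks, key)
    return len(tags) == len(fresh) and all(
        hmac.compare_digest(t, f) for t, f in zip(tags, fresh))
```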

  2. The multidimensional Self-Adaptive Grid code, SAGE, version 2

    NASA Technical Reports Server (NTRS)

    Davies, Carol B.; Venkatapathy, Ethiraj

    1995-01-01

    This new report on Version 2 of the SAGE code includes all the information in the original publication plus all upgrades and changes to the SAGE code since that time. The two most significant upgrades are the inclusion of a finite-volume option and the ability to adapt and manipulate zonal-matching multiple-grid files. In addition, the original SAGE code has been upgraded to Version 1.1 and includes all options mentioned in this report, with the exception of the multiple grid option and its associated features. Since Version 2 is a larger and more complex code, it is suggested (but not required) that Version 1.1 be used for single-grid applications. This document contains all the information required to run both versions of SAGE. The formulation of the adaption method is described in the first section of this document. The second section is presented in the form of a user guide that explains the input and execution of the code. The third section provides many examples. Successful application of the SAGE code in both two and three dimensions for the solution of various flow problems has proven the code to be robust, portable, and simple to use. Although the basic formulation follows the method of Nakahashi and Deiwert, many modifications have been made to facilitate the use of the self-adaptive grid method for complex grid structures. Modifications to the method and the simple but extensive input options make this a flexible and user-friendly code. The SAGE code can accommodate two-dimensional and three-dimensional, finite-difference and finite-volume, single grid, and zonal-matching multiple grid flow problems.
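
The flavor of self-adaptive gridding can be shown in one dimension: redistribute grid points so that spacing shrinks where the solution varies rapidly. This equidistribution sketch is generic and is not SAGE's actual (Nakahashi-Deiwert-based) formulation; `adapt_grid` is a hypothetical name:

```python
import numpy as np

def adapt_grid(x, f):
    # Weight regions of large solution gradient, accumulate the weighted
    # arc length, and place the same number of points at equal increments
    # of that length, clustering points where f changes fastest.
    w = 1.0 + np.abs(np.gradient(f, x))
    s = np.concatenate(([0.0], np.cumsum(0.5 * (w[1:] + w[:-1]) * np.diff(x))))
    targets = np.linspace(0.0, s[-1], len(x))
    return np.interp(targets, s, x)
```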

  3. Model for mapping settlements

    DOEpatents

    Vatsavai, Ranga Raju; Graesser, Jordan B.; Bhaduri, Budhendra L.

    2016-07-05

    A programmable media includes a graphical processing unit in communication with a memory element. The graphical processing unit is configured to detect one or more settlement regions from a high resolution remote sensed image based on the execution of programming code. The graphical processing unit identifies one or more settlements through the execution of the programming code that executes a multi-instance learning algorithm that models portions of the high resolution remote sensed image. The identification is based on spectral bands transmitted by a satellite and on selected designations of the image patches.
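
The multi-instance learning step can be sketched with the standard MIL assumption: an image tile (bag) is labeled a settlement if any of its patches (instances) scores as settlement. `detect_settlements` and the scorer are invented for illustration, not the patent's actual algorithm:

```python
def detect_settlements(tiles, patch_scorer, threshold=0.5):
    # Multi-instance rule: a tile (bag) is a settlement region if any of
    # its image patches (instances) scores at or above the threshold.
    return [i for i, patches in enumerate(tiles)
            if max(patch_scorer(p) for p in patches) >= threshold]
```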

  4. Executive and arousal vigilance decrement in the context of the attentional networks: The ANTI-Vea task.

    PubMed

    Luna, Fernando Gabriel; Marino, Julián; Roca, Javier; Lupiáñez, Juan

    2018-05-20

    Vigilance is generally understood as the ability to detect infrequent critical events through long time periods. In tasks like the Sustained Attention to Response Task (SART), participants tend to detect fewer events across time, a phenomenon known as "vigilance decrement". However, vigilance might also involve sustaining a tonic arousal level. In the Psychomotor Vigilance Test (PVT), the vigilance decrement corresponds to an increment across time in both mean and variability of reaction time. The present study aimed to develop a single task, the Attentional Networks Test for Interactions and Vigilance - executive and arousal components (ANTI-Vea), to simultaneously assess both components of vigilance (i.e., the executive vigilance as in the SART, and the arousal vigilance as in the PVT), while measuring the classic attentional functions (phasic alertness, orienting, and executive control). In Experiment #1, the executive vigilance decrement was found as an increment in response bias. In Experiment #2, this result was replicated, and the arousal vigilance decrement was simultaneously observed as an increment in reaction time. The ANTI-Vea solves some issues observed in the previous ANTI-V task with the executive vigilance measure (e.g., a low hit rate and no vigilance decrement). Furthermore, the new ANTI-Vea task assesses both components of vigilance together with other typical attentional functions. The new attentional networks test developed here may be useful to provide a better understanding of the human attentional system. The role of sensitivity and response bias in the executive vigilance decrement is discussed. Copyright © 2018 Elsevier B.V. All rights reserved.
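
The arousal vigilance decrement described here, an increase across blocks in both the mean and the variability of reaction time, can be computed directly from blocked RT data. A minimal sketch with a hypothetical `arousal_decrement` helper:

```python
import statistics

def arousal_decrement(rt_blocks):
    # Change from the first to the last block in mean RT and in RT
    # variability; positive values in both indicate an arousal decrement.
    means = [statistics.mean(block) for block in rt_blocks]
    sds = [statistics.stdev(block) for block in rt_blocks]
    return means[-1] - means[0], sds[-1] - sds[0]
```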

  5. Varying execution discipline to increase performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, P.L.; Maccabe, A.B.

    1993-12-22

    This research investigates the relationship between execution discipline and performance. The hypothesis has two parts: 1. Different execution disciplines exhibit different performance for different computations, and 2. These differences can be effectively predicted by heuristics. A machine model is developed that can vary its execution discipline. That is, the model can execute a given program using either the control-driven, data-driven or demand-driven execution discipline. This model is referred to as a "variable-execution-discipline" machine. The instruction set for the model is the Program Dependence Web (PDW). The first part of the hypothesis will be tested by simulating the execution of the machine model on a suite of computations, based on the Livermore Fortran Kernel (LFK) Test (a.k.a. the Livermore Loops), using all three execution disciplines. Heuristics are developed to predict relative performance. These heuristics predict (a) the execution time under each discipline for one iteration of each loop and (b) the number of iterations taken by that loop; then the heuristics use those predictions to develop a prediction for the execution of the entire loop. Similar calculations are performed for branch statements. The second part of the hypothesis will be tested by comparing the results of the simulated execution with the predictions produced by the heuristics. If the hypothesis is supported, then the door is open for the development of machines that can vary execution discipline to increase performance.
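
Heuristics (a) and (b) combine into a simple whole-loop prediction: the estimated one-iteration cost under each discipline times the predicted trip count, with the cheapest discipline selected. A sketch with invented function names and costs:

```python
def predict_loop_time(per_iteration, iterations):
    # Heuristic (a) + (b): whole-loop cost under each discipline is the
    # predicted one-iteration cost times the predicted trip count.
    return {d: t * iterations for d, t in per_iteration.items()}

def choose_discipline(per_iteration, iterations):
    # Pick the discipline with the smallest predicted whole-loop time.
    times = predict_loop_time(per_iteration, iterations)
    return min(times, key=times.get)
```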

  6. Performance of a parallel code for the Euler equations on hypercube computers

    NASA Technical Reports Server (NTRS)

    Barszcz, Eric; Chan, Tony F.; Jesperson, Dennis C.; Tuminaro, Raymond S.

    1990-01-01

    The performance of hypercubes was evaluated on a computational fluid dynamics problem, and the parallel environment issues that must be addressed were considered, such as algorithm changes, implementation choices, programming effort, and programming environment. The evaluation focuses on a widely used fluid dynamics code, FLO52, which solves the two dimensional steady Euler equations describing flow around an airfoil. The code development experience is described, including interacting with the operating system, utilizing the message-passing communication system, and code modifications necessary to increase parallel efficiency. Results from two hypercube parallel computers (a 16-node iPSC/2 and a 512-node NCUBE/ten) are discussed and compared. In addition, a mathematical model of the execution time was developed as a function of several machine and algorithm parameters. This model accurately predicts the actual run times obtained and is used to explore the performance of the code in interesting but yet physically realizable regions of the parameter space. Based on this model, predictions about future hypercubes are made.
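
A mathematical model of execution time as a function of machine and algorithm parameters typically sums per-node computation and interprocessor communication. The parameter values and cost terms below are illustrative assumptions, not FLO52's fitted model:

```python
import math

def predicted_time(n, p, t_flop=1e-8, t_msg=1e-4, t_byte=1e-9):
    # Illustrative model: per-node work on an n x n grid, plus a log-depth
    # exchange of subdomain boundary data (message latency + bytes sent).
    compute = (n * n / p) * t_flop
    if p > 1:
        comm = math.log2(p) * (t_msg + (n / math.sqrt(p)) * 8 * t_byte)
    else:
        comm = 0.0
    return compute + comm
```

Even this toy model reproduces the qualitative behavior the abstract describes: speedup for large problems, and a regime where adding nodes makes communication dominate.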

  7. Path planning and execution monitoring for a planetary rover

    NASA Technical Reports Server (NTRS)

    Gat, Erann; Slack, Marc G.; Miller, David P.; Firby, R. James

    1990-01-01

    A path planner and an execution monitoring planner that will enable the rover to navigate to its various destinations safely and correctly while detecting and avoiding hazards are described. An overview of the complete architecture is given. Implementation and testbeds are described. The robot can detect unforeseen obstacles and take appropriate action. This includes having the rover back away from the hazard and mark the area as untraversable in the rover's internal map. The experiments have consisted of paths roughly 20 m in length. The architecture works with a large variety of rover configurations with different kinematic constraints.

  8. Cell-assembly coding in several memory processes.

    PubMed

    Sakurai, Y

    1998-01-01

    The present paper discusses why the cell assembly, i.e., an ensemble population of neurons with flexible functional connections, is a tenable view of the basic code for information processes in the brain. The main properties indicating the reality of cell-assembly coding are the overlap of neurons among different assemblies and connection dynamics within and among the assemblies. The former can be detected as multiple functions of individual neurons in processing different kinds of information. Individual neurons appear to be involved in multiple information processes. The latter can be detected as changes of functional synaptic connections in processing different kinds of information. Correlations of activity among some of the recorded neurons appear to change in multiple information processes. Recent experiments have compared several different memory processes (tasks) and detected these two main properties, indicating cell-assembly coding of memory in the working brain. The first experiment compared different types of processing of identical stimuli, i.e., working memory and reference memory of auditory stimuli. The second experiment compared identical processes of different types of stimuli, i.e., discriminations of simple auditory, simple visual, and configural auditory-visual stimuli. The third experiment compared identical processes of different types of stimuli with or without temporal processing of stimuli, i.e., discriminations of elemental auditory, configural auditory-visual, and sequential auditory-visual stimuli. Some possible features of cell-assembly coding, especially "dual coding" by individual neurons and cell assemblies, are discussed for future experimental approaches. Copyright 1998 Academic Press.

  9. Investigating executive functions in children with severe speech and movement disorders using structured tasks.

    PubMed

    Stadskleiv, Kristine; von Tetzchner, Stephen; Batorowicz, Beata; van Balkom, Hans; Dahlgren-Sandberg, Annika; Renner, Gregor

    2014-01-01

    Executive functions are the basis for goal-directed activity and include planning, monitoring, and inhibition, and language seems to play a role in the development of these functions. There is a tradition of studying executive function in both typical and atypical populations, and the present study investigates executive functions in children with severe speech and motor impairments who communicate using communication aids with graphic symbols, letters, and/or words. There are few neuropsychological studies of children in this group and little is known about their cognitive functioning, including executive functions. It was hypothesized that aided communication would tax executive functions more than speech. Twenty-nine children using communication aids and 27 naturally speaking children participated. Structured tasks resembling everyday activities, where the action goals had to be reached through communication with a partner, were used to obtain information about executive functions. The children (a) directed the partner to perform actions like building a Lego tower from a model the partner could not see and (b) gave information about an object without naming it to a person who had to guess what object it was. The executive functions of planning, monitoring, and impulse control were coded from the children's on-task behavior. Both groups solved most of the tasks correctly, indicating that aided communicators are able to use language to direct another person to do a complex set of actions. Planning and lack of impulsivity were positively related to task success in both groups. The aided group completed significantly fewer tasks, spent more time, and showed more variation in performance than the comparison group. The aided communicators scored lower on planning and showed more impulsivity than the comparison group, while both groups showed an equal degree of monitoring of the work progress. The results are consistent with the hypothesis that aided language taxes executive functions more than speech.

  10. Investigating executive functions in children with severe speech and movement disorders using structured tasks

    PubMed Central

    Stadskleiv, Kristine; von Tetzchner, Stephen; Batorowicz, Beata; van Balkom, Hans; Dahlgren-Sandberg, Annika; Renner, Gregor

    2014-01-01

    Executive functions are the basis for goal-directed activity and include planning, monitoring, and inhibition, and language seems to play a role in the development of these functions. There is a tradition of studying executive function in both typical and atypical populations, and the present study investigates executive functions in children with severe speech and motor impairments who communicate using communication aids with graphic symbols, letters, and/or words. There are few neuropsychological studies of children in this group and little is known about their cognitive functioning, including executive functions. It was hypothesized that aided communication would tax executive functions more than speech. Twenty-nine children using communication aids and 27 naturally speaking children participated. Structured tasks resembling everyday activities, where the action goals had to be reached through communication with a partner, were used to obtain information about executive functions. The children (a) directed the partner to perform actions like building a Lego tower from a model the partner could not see and (b) gave information about an object without naming it to a person who had to guess what object it was. The executive functions of planning, monitoring, and impulse control were coded from the children's on-task behavior. Both groups solved most of the tasks correctly, indicating that aided communicators are able to use language to direct another person to do a complex set of actions. Planning and lack of impulsivity were positively related to task success in both groups. The aided group completed significantly fewer tasks, spent more time, and showed more variation in performance than the comparison group. The aided communicators scored lower on planning and showed more impulsivity than the comparison group, while both groups showed an equal degree of monitoring of the work progress. The results are consistent with the hypothesis that aided language taxes executive functions more than speech.

  11. Educating Executive Function

    PubMed Central

    Blair, Clancy

    2016-01-01

    Executive functions are thinking skills that assist with reasoning, planning, problem solving, and managing one’s life. The brain areas that underlie these skills are interconnected with and influenced by activity in many different brain areas, some of which are associated with emotion and stress. One consequence of the stress-specific connections is that executive functions, which help us to organize our thinking, tend to be disrupted when stimulation is too high and we are stressed out, or too low when we are bored and lethargic. Given their central role in reasoning and also in managing stress and emotion, scientists have conducted studies, primarily with adults, to determine whether executive functions can be improved by training. By and large, results have shown that they can be, in part through computer-based videogame-like activities. Evidence of wider, more general benefits from such computer-based training, however, is mixed. Accordingly, scientists have reasoned that training will have wider benefits if it is implemented early, with very young children as the neural circuitry of executive functions is developing, and that it will be most effective if embedded in children’s everyday activities. Evidence produced by this research, however, is also mixed. In sum, much remains to be learned about executive function training. Without question, however, continued research on this important topic will yield valuable information about cognitive development. PMID:27906522

  12. Transversal Clifford gates on folded surface codes

    DOE PAGES

    Moussa, Jonathan E.

    2016-10-12

    Surface and color codes are two forms of topological quantum error correction in two spatial dimensions with complementary properties. Surface codes have lower-depth error detection circuits and well-developed decoders to interpret and correct errors, while color codes have transversal Clifford gates and better code efficiency in the number of physical qubits needed to achieve a given code distance. A formal equivalence exists between color codes and folded surface codes, but it does not guarantee the transferability of any of these favorable properties. However, the equivalence does imply the existence of constant-depth circuit implementations of logical Clifford gates on folded surface codes. We achieve and improve this result by constructing two families of folded surface codes with transversal Clifford gates. This construction is presented generally for qudits of any dimension. Lastly, the specific application of these codes to universal quantum computation based on qubit fusion is also discussed.

  13. Prevalence of executive dysfunction in cocaine, heroin and alcohol users enrolled in therapeutic communities.

    PubMed

    Fernández-Serrano, María José; Pérez-García, Miguel; Perales, José C; Verdejo-García, Antonio

    2010-01-10

    Many studies have observed relevant executive alterations in polysubstance users, but no data have been generated on the prevalence of these alterations. Studies of the prevalence of neuropsychological impairment can be useful in the design and implementation of interventional programs for substance abusers. The present study was conducted to estimate the prevalence of neuropsychological impairment in different components of executive functions in polysubstance users enrolled in therapeutic communities. Moreover, we estimated the effect size of the differences in executive performance between polysubstance users and non-users in order to identify which neuropsychological tasks can be useful for detecting alterations in executive functions. Study results showed a high prevalence of executive function impairment in polysubstance users. Working memory was the component with the highest impairment proportion, followed by fluency, shifting, planning, multi-tasking and interference. Comparisons between user groups showed very similar executive impairment prevalence for all the analyzed executive components. The best discriminating task between users and controls was Arithmetic (Wechsler Adult Intelligence Scale, WAIS-III). Moreover, the FAS and Ruff Figural Fluency Tests were discriminating for fluency, the Category Test for shifting, the Stroop Colour-Word Interference Test for interference, Zoo Map (Behavioural Assessment of the Dysexecutive Syndrome, BADS) for planning, and Six Elements (BADS) for multi-tasking. The existence of a significant prevalence of executive impairment in polysubstance users reveals the need to redirect intervention policies in the field of drug dependency towards treatments that address the executive deficits of the participants, which in turn would facilitate the individuals' compliance and final rehabilitation.

  14. CRYPTOCHROME mediates behavioral executive choice in response to UV light

    PubMed Central

    Baik, Lisa S.; Fogle, Keri J.; Roberts, Logan; Galschiodt, Alexis M.; Chevez, Joshua A.; Recinos, Yocelyn; Nguy, Vinh; Holmes, Todd C.

    2017-01-01

    Drosophila melanogaster CRYPTOCHROME (CRY) mediates behavioral and electrophysiological responses to blue light coded by circadian and arousal neurons. However, spectroscopic and biochemical assays of heterologously expressed CRY suggest that CRY may mediate functional responses to UV-A (ultraviolet A) light as well. To determine the relative contributions of distinct phototransduction systems, we tested mutants lacking CRY and mutants with disrupted opsin-based phototransduction for behavioral and electrophysiological responses to UV light. CRY and opsin-based external photoreceptor systems cooperate for UV light-evoked acute responses. CRY mediates behavioral avoidance responses related to executive choice, consistent with its expression in central brain neurons. PMID:28062690

  15. Spaceborne computer executive routine functional design specification. Volume 2: Computer executive design for space station/base

    NASA Technical Reports Server (NTRS)

    Kennedy, J. R.; Fitzpatrick, W. S.

    1971-01-01

    The computer executive functional system design concepts derived from study of the Space Station/Base are presented. Information Management System hardware configuration as directly influencing the executive design is reviewed. The hardware configuration and generic executive design requirements are considered in detail in a previous report (System Configuration and Executive Requirements Specifications for Reusable Shuttle and Space Station/Base, 9/25/70). This report defines basic system primitives and delineates processes and process control. Supervisor states are considered for describing basic multiprogramming and multiprocessing systems. A high-level computer executive including control of scheduling, allocation of resources, system interactions, and real-time supervisory functions is defined. The description is oriented to provide a baseline for a functional simulation of the computer executive system.

  16. Detecting chronic kidney disease in population-based administrative databases using an algorithm of hospital encounter and physician claim codes.

    PubMed

    Fleet, Jamie L; Dixon, Stephanie N; Shariff, Salimah Z; Quinn, Robert R; Nash, Danielle M; Harel, Ziv; Garg, Amit X

    2013-04-05

    Large, population-based administrative healthcare databases can be used to identify patients with chronic kidney disease (CKD) when serum creatinine laboratory results are unavailable. We examined the validity of algorithms that used combined hospital encounter and physician claims database codes for the detection of CKD in Ontario, Canada. We accrued 123,499 patients over the age of 65 from 2007 to 2010. All patients had a baseline serum creatinine value to estimate glomerular filtration rate (eGFR). We developed an algorithm of physician claims and hospital encounter codes to search administrative databases for the presence of CKD. We determined the sensitivity, specificity, positive and negative predictive values of this algorithm to detect our primary threshold of CKD, an eGFR <45 mL/min per 1.73 m² (15.4% of patients). We also assessed serum creatinine and eGFR values in patients with and without CKD codes (algorithm positive and negative, respectively). Our algorithm required evidence of at least one of eleven CKD codes, and 7.7% of patients were algorithm positive. The sensitivity was 32.7% [95% confidence interval (CI): 32.0 to 33.3%]. Sensitivity was lower in women compared to men (25.7 vs. 43.7%; p < 0.001) and in the oldest age category (over 80 vs. 66 to 80; 28.4 vs. 37.6%; p < 0.001). All specificities were over 94%. The positive and negative predictive values were 65.4% (95% CI: 64.4 to 66.3%) and 88.8% (95% CI: 88.6 to 89.0%), respectively. In algorithm positive patients, the median [interquartile range (IQR)] baseline serum creatinine value was 135 μmol/L (106 to 179 μmol/L) compared to 82 μmol/L (69 to 98 μmol/L) for algorithm negative patients. Corresponding eGFR values were 38 mL/min per 1.73 m² (26 to 51 mL/min per 1.73 m²) vs. 69 mL/min per 1.73 m² (56 to 82 mL/min per 1.73 m²), respectively. Patients with CKD as identified by our database algorithm had distinctly higher baseline serum creatinine values and lower eGFR values than algorithm negative patients.
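    The operating characteristics reported above all derive from a single 2x2 confusion matrix. As a consistency check, the sketch below back-calculates approximate cell counts from the reported cohort size, prevalence, algorithm-positive rate, and sensitivity (the counts are reconstructed for illustration, not taken from the paper), then recovers the published PPV and NPV:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 screening-test metrics from confusion-matrix cell counts."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Approximate cell counts back-calculated from the reported percentages.
n = 123_499
ckd = round(0.154 * n)       # eGFR < 45 (true positives + false negatives)
alg_pos = round(0.077 * n)   # patients flagged by the code algorithm
tp = round(0.327 * ckd)      # sensitivity of 32.7% applied to the CKD group
m = diagnostic_metrics(tp, alg_pos - tp, ckd - tp, (n - ckd) - (alg_pos - tp))
```

Running this reproduces the paper's headline figures (PPV ≈ 65.4%, NPV ≈ 88.8%, specificity > 94%), which is a useful sanity check that the reported statistics are mutually consistent.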

  17. Detecting chronic kidney disease in population-based administrative databases using an algorithm of hospital encounter and physician claim codes

    PubMed Central

    2013-01-01

    Background Large, population-based administrative healthcare databases can be used to identify patients with chronic kidney disease (CKD) when serum creatinine laboratory results are unavailable. We examined the validity of algorithms that used combined hospital encounter and physician claims database codes for the detection of CKD in Ontario, Canada. Methods We accrued 123,499 patients over the age of 65 from 2007 to 2010. All patients had a baseline serum creatinine value to estimate glomerular filtration rate (eGFR). We developed an algorithm of physician claims and hospital encounter codes to search administrative databases for the presence of CKD. We determined the sensitivity, specificity, positive and negative predictive values of this algorithm to detect our primary threshold of CKD, an eGFR <45 mL/min per 1.73 m2 (15.4% of patients). We also assessed serum creatinine and eGFR values in patients with and without CKD codes (algorithm positive and negative, respectively). Results Our algorithm required evidence of at least one of eleven CKD codes, and 7.7% of patients were algorithm positive. The sensitivity was 32.7% [95% confidence interval (CI): 32.0 to 33.3%]. Sensitivity was lower in women compared to men (25.7 vs. 43.7%; p < 0.001) and in the oldest age category (over 80 vs. 66 to 80; 28.4 vs. 37.6%; p < 0.001). All specificities were over 94%. The positive and negative predictive values were 65.4% (95% CI: 64.4 to 66.3%) and 88.8% (95% CI: 88.6 to 89.0%), respectively. In algorithm positive patients, the median [interquartile range (IQR)] baseline serum creatinine value was 135 μmol/L (106 to 179 μmol/L) compared to 82 μmol/L (69 to 98 μmol/L) for algorithm negative patients. Corresponding eGFR values were 38 mL/min per 1.73 m2 (26 to 51 mL/min per 1.73 m2) vs. 69 mL/min per 1.73 m2 (56 to 82 mL/min per 1.73 m2), respectively. Conclusions Patients with CKD as identified by our database algorithm had distinctly higher baseline serum creatinine values and lower eGFR values.

  18. Coded excitation speeds up the detection of the fundamental flexural guided wave in coated tubes

    NASA Astrophysics Data System (ADS)

    Song, Xiaojun; Moilanen, Petro; Zhao, Zuomin; Ta, Dean; Pirhonen, Jalmari; Salmi, Ari; Hæggström, Edward; Myllylä, Risto; Timonen, Jussi; Wang, Weiqi

    2016-09-01

    The fundamental flexural guided wave (FFGW) permits ultrasonic assessment of the wall thickness of solid waveguides, such as tubes or, e.g., long cortical bones. Recently, an optical non-contact method was proposed for ultrasound excitation and detection with the aim of facilitating FFGW reception by suppressing the interfering modes from the soft coating. This technique suffers from low SNR and requires iterative physical scanning across the source-receiver distance for 2D-FFT analysis. This means that SNR improvement achieved by temporal averaging becomes time-consuming (several minutes), which reduces the applicability of the technique, especially in time-critical applications such as clinical quantitative ultrasound. To achieve sufficient SNR faster, ultrasonic excitation by a base-sequence-modulated Golay code (BSGC, 64-bit code pair) was used on coated tube samples (1-5 mm wall thickness and 5 mm soft coating layer). This approach improved SNR by 21 dB and sped up the measurement by a factor of 100 compared to using a classical pulse excitation with temporal averaging. The measurement now took seconds instead of minutes, while the ability to determine the wall thickness of the phantoms was maintained. The technique thus allows rapid non-contact assessment of the wall thickness in coated solid tubes, such as the human bone.
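    The gain from Golay-coded excitation rests on the complementary-pair property: the aperiodic autocorrelations of the two codes in a pair sum to an impulse, so pulse compression concentrates the transmitted energy with no range sidelobes. The sketch below builds a 64-bit pair by the standard recursive construction and checks that property; the base-sequence modulation step of the BSGC scheme is omitted, and this is not necessarily the paper's exact sequence pair:

```python
def golay_pair(n):
    """Build a complementary Golay pair of length 2**n by the standard
    recursion: if (a, b) is a pair, so is (a + b, a + negate(b))."""
    a, b = [1], [1]
    for _ in range(n):
        a, b = a + b, a + [-x for x in b]
    return a, b

def acorr(x, k):
    """Aperiodic autocorrelation of sequence x at non-negative lag k."""
    return sum(x[i] * x[i + k] for i in range(len(x) - k))
```

For a 64-bit pair the autocorrelation sum is 2 x 64 = 128 at lag zero and exactly zero at every other lag, which is why coded excitation trades a long, low-amplitude transmission for a single sharp compressed pulse.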

  19. Executive control systems in the engineering design environment. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Hurst, P. W.

    1985-01-01

    An executive control system (ECS) is a software structure for unifying various applications codes into a comprehensive system. It provides a library of applications, a uniform access method through a central user interface, and a data management facility. A survey of twenty-four executive control systems designed to unify various CAD/CAE applications for use in diverse engineering design environments within government and industry was conducted. The goals of this research were to establish system requirements, to survey state-of-the-art architectural design approaches, and to provide an overview of the historical evolution of these systems. Foundations for design are presented and include environmental settings, system requirements, major architectural components, and a system classification scheme based on knowledge of the supported engineering domain(s). An overview of the design approaches used in developing the major architectural components of an ECS is presented with examples taken from the surveyed systems. Attention is drawn to four major areas of ECS development: interdisciplinary usage; standardization; knowledge utilization; and computer science technology transfer.

  20. A common neural code for similar conscious experiences in different individuals

    PubMed Central

    Naci, Lorina; Cusack, Rhodri; Anello, Mimma; Owen, Adrian M.

    2014-01-01

    The interpretation of human consciousness from brain activity, without recourse to speech or action, is one of the most provoking and challenging frontiers of modern neuroscience. We asked whether there is a common neural code that underpins similar conscious experiences, which could be used to decode these experiences in the absence of behavior. To this end, we used richly evocative stimulation (an engaging movie) portraying real-world events to elicit a similar conscious experience in different people. Common neural correlates of conscious experience were quantified and related to measurable, quantitative and qualitative, executive components of the movie through two additional behavioral investigations. The movie’s executive demands drove synchronized brain activity across healthy participants’ frontal and parietal cortices in regions known to support executive function. Moreover, the timing of activity in these regions was predicted by participants’ highly similar qualitative experience of the movie’s moment-to-moment executive demands, suggesting that synchronization of activity across participants underpinned their similar experience. Thus we demonstrate, for the first time to our knowledge, that a neural index based on executive function reliably predicted every healthy individual’s similar conscious experience in response to real-world events unfolding over time. This approach provided strong evidence for the conscious experience of a brain-injured patient, who had remained entirely behaviorally nonresponsive for 16 y. The patient’s executive engagement and moment-to-moment perception of the movie content were highly similar to that of every healthy participant. These findings shed light on the common basis of human consciousness and enable the interpretation of conscious experience in the absence of behavior. PMID:25225384

  1. 22 CFR 901.13 - Executive secretary.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 22 Foreign Relations 2 2013-04-01 2009-04-01 true Executive secretary. 901.13 Section 901.13 Foreign Relations FOREIGN SERVICE GRIEVANCE BOARD GENERAL Meanings of Terms As Used in This Chapter § 901.13 Executive secretary. Executive secretary means the executive secretary of the Board or his or her...

  2. 22 CFR 901.13 - Executive secretary.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 22 Foreign Relations 2 2012-04-01 2009-04-01 true Executive secretary. 901.13 Section 901.13 Foreign Relations FOREIGN SERVICE GRIEVANCE BOARD GENERAL Meanings of Terms As Used in This Chapter § 901.13 Executive secretary. Executive secretary means the executive secretary of the Board or his or her...

  3. 22 CFR 901.13 - Executive secretary.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 22 Foreign Relations 2 2010-04-01 2010-04-01 true Executive secretary. 901.13 Section 901.13 Foreign Relations FOREIGN SERVICE GRIEVANCE BOARD GENERAL Meanings of Terms As Used in This Chapter § 901.13 Executive secretary. Executive secretary means the executive secretary of the Board or his or her...

  4. 22 CFR 901.13 - Executive secretary.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 22 Foreign Relations 2 2011-04-01 2009-04-01 true Executive secretary. 901.13 Section 901.13 Foreign Relations FOREIGN SERVICE GRIEVANCE BOARD GENERAL Meanings of Terms As Used in This Chapter § 901.13 Executive secretary. Executive secretary means the executive secretary of the Board or his or her...

  5. 22 CFR 901.13 - Executive secretary.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 22 Foreign Relations 2 2014-04-01 2014-04-01 false Executive secretary. 901.13 Section 901.13 Foreign Relations FOREIGN SERVICE GRIEVANCE BOARD GENERAL Meanings of Terms As Used in This Chapter § 901.13 Executive secretary. Executive secretary means the executive secretary of the Board or his or her...

  6. A development and integration of database code-system with a compilation of comparator, k0 and absolute methods for INAA using microsoft access

    NASA Astrophysics Data System (ADS)

    Hoh, Siew Sin; Rapie, Nurul Nadiah; Lim, Edwin Suh Wen; Tan, Chun Yuan; Yavar, Alireza; Sarmani, Sukiman; Majid, Amran Ab.; Khoo, Kok Siong

    2013-05-01

    Instrumental Neutron Activation Analysis (INAA) is often used to determine and calculate the elemental concentrations of a sample at The National University of Malaysia (UKM), typically in the Nuclear Science Programme, Faculty of Science and Technology. The objective of this study was to develop a database code-system based on Microsoft Access 2010 which could help INAA users to choose either the comparator method, the k0-method or the absolute method for calculating the elemental concentrations of a sample. This study also integrated k0data, Com-INAA, k0Concent, k0-Westcott and Abs-INAA to execute and complete the ECC-UKM database code-system. After the integration, a study was conducted to test the effectiveness of the ECC-UKM database code-system by comparing the concentrations between the experiments and the code-systems. 'Triple Bare Monitor' Zr-Au and Cr-Mo-Au were used in the k0Concent, k0-Westcott and Abs-INAA code-systems as monitors to determine the thermal-to-epithermal neutron flux ratio (f). Calculations involved in determining the concentration used the net peak area (Np), measurement time (tm), irradiation time (tirr), the k-factor (k), the thermal-to-epithermal neutron flux ratio (f), the epithermal neutron flux distribution parameter (α) and the detection efficiency (ɛp). For the Com-INAA code-system, the certified reference material IAEA-375 Soil was used to calculate the concentrations of elements in a sample. Other CRMs and SRMs were also used in this database code-system. Later, a verification process to examine the effectiveness of the Abs-INAA code-system was carried out by comparing sample concentrations between the code-system and the experiment. The concentrations calculated by the ECC-UKM database code-system agreed with the experimental values with good accuracy.

  7. Automated Coding Software: Development and Use to Enhance Anti-Fraud Activities*

    PubMed Central

    Garvin, Jennifer H.; Watzlaf, Valerie; Moeini, Sohrab

    2006-01-01

    This descriptive research project identified characteristics of automated coding systems that have the potential to detect improper coding and to minimize improper or fraudulent coding practices in the setting of automated coding used with the electronic health record (EHR). Recommendations were also developed for software developers and users of coding products to maximize anti-fraud practices. PMID:17238546

  8. White matter structural network abnormalities underlie executive dysfunction in amyotrophic lateral sclerosis.

    PubMed

    Dimond, Dennis; Ishaque, Abdullah; Chenji, Sneha; Mah, Dennell; Chen, Zhang; Seres, Peter; Beaulieu, Christian; Kalra, Sanjay

    2017-03-01

    Research in amyotrophic lateral sclerosis (ALS) suggests that executive dysfunction, a prevalent cognitive feature of the disease, is associated with abnormal structural connectivity and white matter integrity. In this exploratory study, we investigated the white matter constructs of executive dysfunction, and attempted to detect structural abnormalities specific to cognitively impaired ALS patients. Eighteen ALS patients and 22 age and education matched healthy controls underwent magnetic resonance imaging on a 4.7 Tesla scanner and completed neuropsychometric testing. ALS patients were categorized into ALS cognitively impaired (ALSci, n = 9) and ALS cognitively competent (ALScc, n = 5) groups. Tract-based spatial statistics and connectomics were used to compare white matter integrity and structural connectivity of ALSci and ALScc patients. Executive function performance was correlated with white matter FA and network metrics within the ALS group. Executive function performance in the ALS group correlated with global and local network properties, as well as FA, in regions throughout the brain, with a high predilection for the frontal lobe. ALSci patients displayed altered local connectivity and structural integrity in these same frontal regions that correlated with executive dysfunction. Our results suggest that executive dysfunction in ALS is related to frontal network disconnectivity, which potentially mediates domain-specific, or generalized cognitive impairment, depending on the degree of global network disruption. Furthermore, reported co-localization of decreased network connectivity and diminished white matter integrity suggests white matter pathology underlies this topological disruption. We conclude that executive dysfunction in ALSci is associated with frontal and global network disconnectivity, underlined by diminished white matter integrity. Hum Brain Mapp 38:1249-1268, 2017. © 2016 Wiley Periodicals, Inc.
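    Of the global network properties referred to above, a representative example is global efficiency: the mean inverse shortest-path length over all node pairs, which drops as a network becomes disconnected. The stdlib-only sketch below computes it for toy unweighted graphs (illustrative only; it is unrelated to the study's actual connectomes or metrics):

```python
from collections import deque

def global_efficiency(adj):
    """Mean inverse shortest-path length over all ordered node pairs.
    adj: dict mapping node -> iterable of neighbors (unweighted, undirected)."""
    nodes = list(adj)
    total, pairs = 0.0, 0
    for src in nodes:
        dist = {src: 0}                  # BFS distances from src
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        for dst in nodes:
            if dst != src:
                pairs += 1
                if dst in dist:          # unreachable pairs contribute 0
                    total += 1.0 / dist[dst]
    return total / pairs
```

A fully connected graph scores 1.0; removing edges lowers the score, which is the intuition behind interpreting reduced efficiency as "network disconnectivity."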

  9. Real-time detection of natural objects using AM-coded spectral matching imager

    NASA Astrophysics Data System (ADS)

    Kimachi, Akira

    2004-12-01

    This paper describes the application of the amplitude-modulation (AM)-coded spectral matching imager (SMI) to real-time detection of natural objects such as human beings, animals, vegetables, or geological objects or phenomena, which are much more liable to change with time than artificial products while often exhibiting characteristic spectral functions associated with specific activity states. The AM-SMI produces a correlation between spectral functions of the object and a reference at each pixel of the correlation image sensor (CIS) in every frame, based on orthogonal amplitude modulation (AM) of each spectral channel and simultaneous demodulation of all channels on the CIS. This principle makes the SMI suitable for monitoring the dynamic behavior of natural objects in real time by looking at a particular spectral reflectance or transmittance function. A twelve-channel multispectral light source was developed with improved spatial uniformity of spectral irradiance compared to a previous one. Experimental results of spectral matching imaging of human skin and vegetable leaves are demonstrated, as well as a preliminary feasibility test of imaging a reflective object using a test color chart.

  10. Real-time detection of natural objects using AM-coded spectral matching imager

    NASA Astrophysics Data System (ADS)

    Kimachi, Akira

    2005-01-01

    This paper describes the application of the amplitude-modulation (AM)-coded spectral matching imager (SMI) to real-time detection of natural objects such as human beings, animals, vegetables, or geological objects or phenomena, which are much more liable to change with time than artificial products while often exhibiting characteristic spectral functions associated with specific activity states. The AM-SMI produces a correlation between spectral functions of the object and a reference at each pixel of the correlation image sensor (CIS) in every frame, based on orthogonal amplitude modulation (AM) of each spectral channel and simultaneous demodulation of all channels on the CIS. This principle makes the SMI suitable for monitoring the dynamic behavior of natural objects in real time by looking at a particular spectral reflectance or transmittance function. A twelve-channel multispectral light source was developed with improved spatial uniformity of spectral irradiance compared to a previous one. Experimental results of spectral matching imaging of human skin and vegetable leaves are demonstrated, as well as a preliminary feasibility test of imaging a reflective object using a test color chart.

  11. Accuracy of Administrative Billing Codes to Detect Urinary Tract Infection Hospitalizations

    PubMed Central

    Hall, Matthew; Auger, Katherine A.; Hain, Paul D.; Jerardi, Karen E.; Myers, Angela L.; Rahman, Suraiya S.; Williams, Derek J.; Shah, Samir S.

    2011-01-01

    BACKGROUND: Hospital billing data are frequently used for quality measures and research, but the accuracy of the use of discharge codes to identify urinary tract infections (UTIs) is unknown. OBJECTIVE: To determine the accuracy of International Classification of Diseases, 9th revision (ICD-9) discharge codes to identify children hospitalized with UTIs. METHODS: This multicenter study conducted in 5 children's hospitals included children aged 3 days to 18 years who had been admitted to the hospital, undergone a urinalysis or urine culture, and discharged from the hospital. Data were obtained from the pediatric health information system database and medical record review. With the use of 2 gold-standard methods, the positive predictive value (PPV) was calculated for individual and combined UTI codes and for common UTI identification strategies. PPV was measured for all groupings for which the UTI code was the principal discharge diagnosis. RESULTS: There were 833 patients in the study. The PPV was 50.3% with the use of the gold standard of laboratory-confirmed UTIs but increased to 85% with provider confirmation. Restriction of the study cohort to patients with a principal diagnosis of UTI improved the PPV for laboratory-confirmed UTI (61.2%) and provider-confirmed UTI (93.2%), as well as the ability to benchmark performance. Other common identification strategies did not markedly affect the PPV. CONCLUSIONS: ICD-9 codes can be used to identify patients with UTIs but are most accurate when UTI is the principal discharge diagnosis. The identification strategies reported in this study can be used to improve the accuracy and applicability of benchmarking measures. PMID:21768320
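
    The headline numbers follow from the standard positive predictive value computation; a minimal sketch with counts back-calculated from the reported 50.3% (illustrative, not the study's raw data):

```python
def ppv(true_positives, false_positives):
    """Positive predictive value: TP / (TP + FP)."""
    total = true_positives + false_positives
    return true_positives / total if total else 0.0

# Of 833 code-identified patients, about 419 laboratory-confirmed UTIs
# reproduces the reported 50.3% PPV (counts are illustrative).
lab_ppv = ppv(419, 833 - 419)
```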

  12. Caregiver person-centeredness and behavioral symptoms during mealtime interactions: development and feasibility of a coding scheme.

    PubMed

    Gilmore-Bykovskyi, Andrea L

    2015-01-01

    Mealtime behavioral symptoms are distressing and frequently interrupt eating for the individual experiencing them and others in the environment. A computer-assisted coding scheme was developed to measure caregiver person-centeredness and behavioral symptoms for nursing home residents with dementia during mealtime interactions. The purpose of this pilot study was to determine the feasibility, ease of use, and inter-observer reliability of the coding scheme, and to explore the clinical utility of the coding scheme. Trained observers coded 22 observations. Data collection procedures were acceptable to participants. Overall, the coding scheme proved to be feasible, easy to execute and yielded good to very good inter-observer agreement following observer re-training. The coding scheme captured clinically relevant, modifiable antecedents to mealtime behavioral symptoms, but would be enhanced by the inclusion of measures for resident engagement and consolidation of items for measuring caregiver person-centeredness that co-occurred and were difficult for observers to distinguish. Published by Elsevier Inc.

  13. Antiplagiarism Software Takes on the Honor Code

    ERIC Educational Resources Information Center

    Wasley, Paula

    2008-01-01

    Among the 100-odd colleges with academic honor codes, plagiarism-detection services raise a knotty problem: Is software compatible with a system based on trust? The answer frequently devolves to the size and culture of the university. Colleges with traditional student-run honor codes tend to "forefront" trust, emphasizing it above all else. This…

  14. A P2P Botnet detection scheme based on decision tree and adaptive multilayer neural networks.

    PubMed

    Alauthaman, Mohammad; Aslam, Nauman; Zhang, Li; Alasem, Rafe; Hossain, M A

    2018-01-01

    In recent years, botnets have been adopted as a popular method for carrying and spreading malicious code on the Internet. These malicious codes pave the way to execute many fraudulent activities, including spam mail, distributed denial-of-service attacks and click fraud. While many botnets are set up using a centralized communication architecture, peer-to-peer (P2P) botnets can adopt a decentralized architecture using an overlay network for exchanging command and control data, making their detection even more difficult. This work presents a method of P2P bot detection based on an adaptive multilayer feed-forward neural network in cooperation with decision trees. A classification and regression tree is applied as a feature selection technique to select relevant features. With these features, a multilayer feed-forward neural network training model is created using a resilient back-propagation learning algorithm. A comparison of feature set selection based on the decision tree, principal component analysis and the ReliefF algorithm indicated that the neural network model with feature selection based on the decision tree has better identification accuracy along with lower rates of false positives. The usefulness of the proposed approach is demonstrated by conducting experiments on real network traffic datasets. In these experiments, an average detection rate of 99.08% with a false positive rate of 0.75% was observed.
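
    The resilient back-propagation (Rprop) rule used to train the network adapts a per-weight step size from the sign of the gradient alone. A minimal sketch of that update rule on a toy quadratic (not the paper's network or data; all names and constants are illustrative):

```python
def rprop_minimize(grad, w, steps=50, eta_plus=1.2, eta_minus=0.5,
                   step0=0.1, step_max=1.0, step_min=1e-6):
    """Rprop: per-parameter step sizes adapted from gradient signs only."""
    step = [step0] * len(w)
    prev = [0.0] * len(w)
    for _ in range(steps):
        g = list(grad(w))
        for i in range(len(w)):
            if prev[i] * g[i] > 0:            # same sign: accelerate
                step[i] = min(step[i] * eta_plus, step_max)
            elif prev[i] * g[i] < 0:          # sign flip: back off, skip once
                step[i] = max(step[i] * eta_minus, step_min)
                g[i] = 0.0
            if g[i] > 0:
                w[i] -= step[i]
            elif g[i] < 0:
                w[i] += step[i]
            prev[i] = g[i]
    return w

# Demo: minimize f(w) = (w0 - 3)^2 + (w1 + 1)^2 from the origin.
w = rprop_minimize(lambda w: [2 * (w[0] - 3), 2 * (w[1] + 1)], [0.0, 0.0])
```

    Because only gradient signs are used, Rprop is insensitive to gradient magnitude, which is part of why it trains feed-forward networks robustly.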

  15. Interfacing modules for integrating discipline specific structural mechanics codes

    NASA Technical Reports Server (NTRS)

    Endres, Ned M.

    1989-01-01

    An outline of the organization and capabilities of the Engine Structures Computational Simulator (Simulator) at NASA Lewis Research Center is given. One of the goals of the research at Lewis is to integrate various discipline-specific structural mechanics codes into a software system which can be brought to bear effectively on a wide range of engineering problems. This system must be effective and efficient while still remaining user friendly. The Simulator was initially designed for the finite element simulation of gas jet engine components. Currently, the Simulator is restricted to the analysis of high pressure turbine blades and the accompanying rotor assembly, although the current installation can be expanded for other applications. The Simulator presently assists the user throughout its procedures by performing information management tasks, executing external support tasks, organizing analysis modules, and executing these modules in the user-defined order while maintaining processing continuity.

  16. Toward performance portability of the Albany finite element analysis code using the Kokkos library

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demeshko, Irina; Watkins, Jerry; Tezaur, Irina K.

    Performance portability on heterogeneous high-performance computing (HPC) systems is a major challenge faced today by code developers: parallel code needs to be executed correctly as well as with high performance on machines with different architectures, operating systems, and software libraries. The finite element method (FEM) is a popular and flexible method for discretizing partial differential equations arising in a wide variety of scientific, engineering, and industrial applications that require HPC. This paper presents some preliminary results pertaining to our development of a performance portable implementation of the FEM-based Albany code. Performance portability is achieved using the Kokkos library. We present performance results for the Aeras global atmosphere dynamical core module in Albany. Finally, numerical experiments show that our single code implementation gives reasonable performance across three multicore/many-core architectures: NVIDIA graphics processing units (GPUs), Intel Xeon Phis, and multicore CPUs.

  17. Toward performance portability of the Albany finite element analysis code using the Kokkos library

    DOE PAGES

    Demeshko, Irina; Watkins, Jerry; Tezaur, Irina K.; ...

    2018-02-05

    Performance portability on heterogeneous high-performance computing (HPC) systems is a major challenge faced today by code developers: parallel code needs to be executed correctly as well as with high performance on machines with different architectures, operating systems, and software libraries. The finite element method (FEM) is a popular and flexible method for discretizing partial differential equations arising in a wide variety of scientific, engineering, and industrial applications that require HPC. This paper presents some preliminary results pertaining to our development of a performance portable implementation of the FEM-based Albany code. Performance portability is achieved using the Kokkos library. We present performance results for the Aeras global atmosphere dynamical core module in Albany. Finally, numerical experiments show that our single code implementation gives reasonable performance across three multicore/many-core architectures: NVIDIA graphics processing units (GPUs), Intel Xeon Phis, and multicore CPUs.

  18. The proposed coding standard at GSFC

    NASA Technical Reports Server (NTRS)

    Morakis, J. C.; Helgert, H. J.

    1977-01-01

    As part of the continuing effort to introduce standardization of spacecraft and ground equipment in satellite systems, NASA's Goddard Space Flight Center and other NASA facilities have supported the development of a set of standards for the use of error control coding in telemetry subsystems. These standards are intended to ensure compatibility between spacecraft and ground encoding equipment, while allowing sufficient flexibility to meet all anticipated mission requirements. The standards which have been developed to date cover the application of block codes in error detection and error correction modes, as well as short and long constraint length convolutional codes decoded via the Viterbi and sequential decoding algorithms, respectively. Included are detailed specifications of the codes, and their implementation. Current effort is directed toward the development of standards covering channels with burst noise characteristics, channels with feedback, and code concatenation.
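
    As an illustration of a block code operating in both error-detection and error-correction modes, here is a Hamming(7,4) sketch (a textbook example chosen for brevity, not one of the GSFC-specified codes):

```python
def encode(d):
    """Hamming(7,4): data bits d1..d4 -> codeword positions 1..7."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def syndrome(c):
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # parity over positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # parity over positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # parity over positions 4,5,6,7
    return s1 + 2 * s2 + 4 * s3      # 0 = no error; else 1-based position

def correct(c):
    s = syndrome(c)                  # detection ...
    if s:                            # ... and single-bit correction
        c = list(c)
        c[s - 1] ^= 1
    return c

word = encode([1, 0, 1, 1])
corrupted = list(word)
corrupted[4] ^= 1                    # flip the bit at position 5
```

    In detection-only mode a nonzero syndrome flags the block; in correction mode the syndrome value directly names the flipped bit position.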

  19. Parallelization of Finite Element Analysis Codes Using Heterogeneous Distributed Computing

    NASA Technical Reports Server (NTRS)

    Ozguner, Fusun

    1996-01-01

    Performance gains in computer design are quickly consumed as users seek to analyze larger problems to a higher degree of accuracy. Innovative computational methods, such as parallel and distributed computing, seek to multiply the power of existing hardware technology to satisfy the computational demands of large applications. In the early stages of this project, experiments were performed using two large, coarse-grained applications, CSTEM and METCAN. These applications were parallelized on an Intel iPSC/860 hypercube. It was found that the overall speedup was very low, due to large, inherently sequential code segments present in the applications. The overall execution time T_par of the application is dependent on these sequential segments. If these segments make up a significant fraction of the overall code, the application will have a poor speedup measure.
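
    The speedup limit described here is Amdahl's law: with sequential fraction s and p processors, speedup = 1 / (s + (1 - s)/p). A minimal sketch:

```python
def amdahl_speedup(seq_fraction, processors):
    """Overall speedup when a fraction of the code is inherently serial."""
    return 1.0 / (seq_fraction + (1.0 - seq_fraction) / processors)

# Even on 1024 processors, a 40% sequential portion caps speedup below 2.5,
# which is the effect observed with the large sequential segments above.
cap = amdahl_speedup(0.4, 1024)
```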

  20. From an Executive Network to Executive Control: A Computational Model of the "n"-Back Task

    ERIC Educational Resources Information Center

    Chatham, Christopher H.; Herd, Seth A.; Brant, Angela M.; Hazy, Thomas E.; Miyake, Akira; O'Reilly, Randy; Friedman, Naomi P.

    2011-01-01

    A paradigmatic test of executive control, the n-back task, is known to recruit a widely distributed parietal, frontal, and striatal "executive network," and is thought to require an equally wide array of executive functions. The mapping of functions onto substrates in such a complex task presents a significant challenge to any theoretical…

  1. Intra-procedural Path-insensitive Grams (I-GRAMS) and Disassembly Based Features for Packer Tool Classification and Detection

    DTIC Science & Technology

    2012-06-14

    Determining whether an executable file is packed is a critical step in software security. This research uses machine learning methods to build the Polymorphic and Non-Polymorphic Packer Detection (PNPD) system, which detects whether an executable is packed by either ASPack, UPX, Metasploit's polymorphic msfencode, or is packed in... detect packed executables used in experiments. Overall, it is discovered that i-grams provide the best results, with accuracies above 99.5% and average true...

  2. Educating executive function.

    PubMed

    Blair, Clancy

    2017-01-01

    Executive functions are thinking skills that assist with reasoning, planning, problem solving, and managing one's life. The brain areas that underlie these skills are interconnected with and influenced by activity in many different brain areas, some of which are associated with emotion and stress. One consequence of the stress-specific connections is that executive functions, which help us to organize our thinking, tend to be disrupted when stimulation is too high and we are stressed out, or too low when we are bored and lethargic. Given their central role in reasoning and also in managing stress and emotion, scientists have conducted studies, primarily with adults, to determine whether executive functions can be improved by training. By and large, results have shown that they can be, in part through computer-based videogame-like activities. Evidence of wider, more general benefits from such computer-based training, however, is mixed. Accordingly, scientists have reasoned that training will have wider benefits if it is implemented early, with very young children as the neural circuitry of executive functions is developing, and that it will be most effective if embedded in children's everyday activities. Evidence produced by this research, however, is also mixed. In sum, much remains to be learned about executive function training. Without question, however, continued research on this important topic will yield valuable information about cognitive development. WIREs Cogn Sci 2017, 8:e1403. doi: 10.1002/wcs.1403 For further resources related to this article, please visit the WIREs website. © 2016 Wiley Periodicals, Inc.

  3. Resource conflict detection and removal strategy for nondeterministic emergency response processes using Petri nets

    NASA Astrophysics Data System (ADS)

    Zeng, Qingtian; Liu, Cong; Duan, Hua

    2016-09-01

    Correctness of an emergency response process specification is critical to emergency mission success. Therefore, errors in the specification should be detected and corrected at build-time. In this paper, we propose a resource conflict detection approach and removal strategy for emergency response processes constrained by resources and time. In this kind of emergency response process, two timing functions represent the minimum and maximum execution time for each activity, and many activities require resources to be executed. Based on the RT_ERP_Net, the earliest time to start each activity and the ideal execution time of the process can be obtained. To detect and remove resource conflicts in the process, conflict detection algorithms and a priority-activity-first resolution strategy are given. In this way, the real execution time for each activity is obtained and a conflict-free RT_ERP_Net is constructed by adding virtual activities. Experiments show that the proposed resolution strategy can substantially shorten the execution time of the whole process.
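
    A minimal sketch of resource conflict detection and priority-based removal on interval data (a hypothetical simplification of the Petri-net model: unit-capacity resources, fixed durations, illustrative names):

```python
def find_conflicts(acts):
    """acts: dict name -> (start, duration, resource, priority).
    Two activities conflict when their intervals overlap on one resource."""
    conflicts = []
    names = sorted(acts)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            sa, da, ra, _ = acts[a]
            sb, db, rb, _ = acts[b]
            if ra == rb and sa < sb + db and sb < sa + da:
                conflicts.append((a, b))
    return conflicts

def resolve(acts):
    """Priority-activity-first: delay the lower-priority activity until
    the higher-priority one releases the shared resource."""
    acts = dict(acts)
    while True:
        cs = find_conflicts(acts)
        if not cs:
            return acts
        a, b = cs[0]
        loser, winner = (a, b) if acts[a][3] < acts[b][3] else (b, a)
        s, d, r, p = acts[loser]
        acts[loser] = (acts[winner][0] + acts[winner][1], d, r, p)

acts = {"A": (0, 4, "crane", 2), "B": (2, 3, "crane", 1)}
resolved = resolve(acts)
```

    Delaying an activity is the interval analogue of inserting virtual activities to serialize access to a contested resource.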

  4. Auto Code Generation for Simulink-Based Attitude Determination Control System

    NASA Technical Reports Server (NTRS)

    MolinaFraticelli, Jose Carlos

    2012-01-01

    This paper details the work done to auto generate C code from a Simulink-Based Attitude Determination Control System (ADCS) to be used in target platforms. NASA Marshall Engineers have developed an ADCS Simulink simulation to be used as a component for the flight software of a satellite. This generated code can be used for carrying out hardware-in-the-loop testing of components for a satellite in a convenient manner with easily tunable parameters. Due to the nature of embedded hardware components such as microcontrollers, this simulation code cannot be used directly, as is, on the target platform and must first be converted into C code; this process is known as auto code generation. In order to generate C code from this simulation, it must be modified to follow specific standards set in place by the auto code generation process. Some of these modifications include changing certain simulation models into their atomic representations, which can introduce new complications into the simulation. The execution order of these models can change based on these modifications. Great care must be taken to maintain a working simulation that can also be used for auto code generation. After modifying the ADCS simulation for the auto code generation process, it is shown that the difference between the output data of the former and that of the latter is within acceptable bounds. Thus, the process is a success, since all the output requirements are met. Based on these results, it can be argued that this generated C code can be effectively used by any desired platform as long as it follows the specific memory requirements established in the Simulink model.

  5. A meta-model for computer executable dynamic clinical safety checklists.

    PubMed

    Nan, Shan; Van Gorp, Pieter; Lu, Xudong; Kaymak, Uzay; Korsten, Hendrikus; Vdovjak, Richard; Duan, Huilong

    2017-12-12

    A safety checklist is a type of cognitive tool that reinforces the short-term memory of medical workers, with the purpose of reducing medical errors caused by oversight and ignorance. To facilitate the daily use of safety checklists, computerized systems embedded in the clinical workflow and adapted to patient context are increasingly developed. However, the current hard-coded approach of implementing checklists in these systems increases the cognitive effort of clinical experts and the coding effort for informaticists. This is due to the lack of a formal representation format that is both understandable by clinical experts and executable by computer programs. We developed a dynamic checklist meta-model with a three-step approach. Dynamic checklist modeling requirements were extracted by performing a domain analysis. Then, existing modeling approaches and tools were investigated with the purpose of reusing these languages. Finally, the meta-model was developed by eliciting domain concepts and their hierarchies. The feasibility of using the meta-model was validated by two case studies. The meta-model was mapped to specific modeling languages according to the requirements of hospitals. Using the proposed meta-model, a comprehensive coronary artery bypass graft peri-operative checklist set and a percutaneous coronary intervention peri-operative checklist set have been developed in a Dutch hospital and a Chinese hospital, respectively. The results show that it is feasible to use the meta-model to facilitate the modeling and execution of dynamic checklists. We proposed a novel meta-model for dynamic checklists with the purpose of facilitating their creation. The meta-model is a framework for reusing existing modeling languages and tools to model dynamic checklists. The feasibility of using the meta-model was validated by implementing a use case in the system.

  6. Scalable asynchronous execution of cellular automata

    NASA Astrophysics Data System (ADS)

    Folino, Gianluigi; Giordano, Andrea; Mastroianni, Carlo

    2016-10-01

    The performance and scalability of cellular automata, when executed on parallel/distributed machines, are limited by the necessity of synchronizing all the nodes at each time step, i.e., a node can execute only after the execution of the previous step at all the other nodes. However, these synchronization requirements can be relaxed: a node can execute one step after synchronizing only with the adjacent nodes. In this fashion, different nodes can execute different time steps. This can be notably advantageous in many novel and increasingly popular applications of cellular automata, such as smart city applications, simulation of natural phenomena, etc., in which the execution times can be different and variable, due to the heterogeneity of machines and/or data and/or executed functions. Indeed, a longer execution time at a node does not slow down the execution at all the other nodes but only at the neighboring nodes. This is particularly advantageous when the nodes that act as bottlenecks vary during the application execution. The goal of the paper is to analyze the benefits that can be achieved with the described asynchronous implementation of cellular automata, when compared to the classical all-to-all synchronization pattern. The performance and scalability have been evaluated through a Petri net model, as this model is very useful to represent the synchronization barrier among nodes. We examined the usual case in which the territory is partitioned into a number of regions, and the computation associated with a region is assigned to a computing node. We considered both the cases of mono-dimensional and two-dimensional partitioning. The results show that the advantage obtained through the asynchronous execution, when compared to the all-to-all synchronous approach, is notable, and it can be as large as 90% in terms of speedup.
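
    The relaxed synchronization rule can be sketched as follows, assuming a 1-D ring of computing nodes where each node tracks its own time step (an illustrative model, not the paper's Petri-net formulation):

```python
def can_advance(times, i):
    """A node may execute its next step once both ring neighbours have
    finished the node's current step."""
    n = len(times)
    return times[(i - 1) % n] >= times[i] and times[(i + 1) % n] >= times[i]

def advance_all(times):
    """One scheduling round: every node allowed to advance does so."""
    allowed = [can_advance(times, i) for i in range(len(times))]
    return [t + 1 if ok else t for t, ok in zip(times, allowed)]

# Node 2 lags at step 3: only its neighbours (1 and 3) are held back,
# while the distant nodes 0 and 4 keep running ahead.
times = [5, 5, 3, 5, 5]
nxt = advance_all(times)
```

    Under all-to-all synchronization every node would be stuck at the slow node's step; here the slowdown propagates only to adjacent nodes.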

  7. Code CUGEL: A code to unfold Ge(Li) spectrometer polyenergetic gamma photon experimental distributions

    NASA Technical Reports Server (NTRS)

    Steyn, J. J.; Born, U.

    1970-01-01

    A FORTRAN code was developed for the Univac 1108 digital computer to unfold polyenergetic gamma photon experimental distributions from lithium-drifted germanium semiconductor spectrometers. It was designed to analyze the combined continuous and monoenergetic gamma radiation field of radioisotope volumetric sources. The code generates the detector system response matrix function and applies it to monoenergetic spectral components discretely and to the continuum iteratively. It corrects for system drift, source decay, background, and detection efficiency. Results are presented in digital form for differential and integrated photon number and energy distributions, and for exposure dose.
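
    A minimal sketch of response-matrix unfolding with hypothetical numbers (direct inversion of a small linear system; CUGEL itself applies the response discretely to monoenergetic components and iteratively to the continuum):

```python
# The measured pulse-height spectrum is m = R @ s, where column j of R is
# the detector response (full-energy peak plus downscatter into lower
# bins) to monoenergetic photons in source bin j. Values are illustrative.
import numpy as np

R = np.array([[0.7, 0.2, 0.1],
              [0.0, 0.6, 0.2],
              [0.0, 0.0, 0.5]])

source = np.array([100.0, 50.0, 80.0])   # "true" line intensities
measured = R @ source                    # what the spectrometer records

# Unfolding: recover the source spectrum from the measured distribution.
unfolded = np.linalg.solve(R, measured)
```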

  8. Executive Mind, Timely Action.

    ERIC Educational Resources Information Center

    Torbert, William R.

    1983-01-01

    The idea of "Executive Mind" carries with it the notion of purposeful and effective action. Part I of this paper characterizes three complements to "Executive Mind"--"Observing Mind,""Theorizing Mind," and "Passionate Mind"--and offers historical figures exemplifying all four types. The concluding…

  9. Load power device and system for real-time execution of hierarchical load identification algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Yi; Madane, Mayura Arun; Zambare, Prachi Suresh

    A load power device includes a power input; at least one power output for at least one load; and a plurality of sensors structured to sense voltage and current at the at least one power output. A processor is structured to provide real-time execution of: (a) a plurality of load identification algorithms, and (b) event detection and operating mode detection for the at least one load.

  10. Current Research on Non-Coding Ribonucleic Acid (RNA).

    PubMed

    Wang, Jing; Samuels, David C; Zhao, Shilin; Xiang, Yu; Zhao, Ying-Yong; Guo, Yan

    2017-12-05

    Non-coding ribonucleic acid (RNA) has without a doubt captured the interest of biomedical researchers. The ability to screen the entire human genome with high-throughput sequencing technology has greatly enhanced the identification, annotation and prediction of the functionality of non-coding RNAs. In this review, we discuss the current landscape of non-coding RNA research and quantitative analysis. Non-coding RNA will be categorized into two major groups by size: long non-coding RNAs and small RNAs. In long non-coding RNA, we discuss regular long non-coding RNA, pseudogenes and circular RNA. In small RNA, we discuss miRNA, transfer RNA, piwi-interacting RNA, small nucleolar RNA, small nuclear RNA, Y RNA, signal recognition particle RNA, and 7SK RNA. We elaborate on the origin, detection method, and potential association with disease, putative functional mechanisms, and public resources for these non-coding RNAs. We aim to provide readers with a complete overview of non-coding RNAs and incite additional interest in non-coding RNA research.

  11. Prospective memory in multiple sclerosis: The impact of cue distinctiveness and executive functioning.

    PubMed

    Dagenais, Emmanuelle; Rouleau, Isabelle; Tremblay, Alexandra; Demers, Mélanie; Roger, Élaine; Jobin, Céline; Duquette, Pierre

    2016-11-01

    Prospective memory (PM), the ability to remember to do something at the appropriate time in the future, is crucial in everyday life. One way to improve PM performance is to increase the salience of a cue announcing that it is time to act. Multiple sclerosis (MS) patients often report PM failures and there is growing evidence of PM deficits among this population. However, such deficits are poorly characterized and their relation to cognitive status remains unclear. To better understand PM deficits in MS patients, this study investigated the impact of cue salience on PM, and its relation to retrospective memory (RM) and executive deficits. Thirty-nine (39) MS patients were compared to 18 healthy controls on a PM task modulating cue salience during an ongoing general knowledge test. MS patients performed worse than controls on the PM task, regardless of cue salience. MS patients' executive functions contributed significantly to the variance in PM performance, whereas age, education and RM did not. Interestingly, low- and high-executive patients' performance differed when the cue was not salient, but not when it was, suggesting that low-executive MS patients benefited more from cue salience. These findings add to the growing evidence of PM deficits in MS and highlight the contribution of executive functions to certain aspects of PM. In low-executive MS patients, high cue salience improves PM performance by reducing the detection threshold and need for environmental monitoring. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. A User's Guide to the Zwikker-Kosten Transmission Line Code (ZKTL)

    NASA Technical Reports Server (NTRS)

    Kelly, J. J.; Abu-Khajeel, H.

    1997-01-01

    This user's guide documents updates to the Zwikker-Kosten Transmission Line Code (ZKTL). This code was developed for analyzing new liner concepts developed to provide increased sound absorption. Contiguous arrays of multi-degree-of-freedom (MDOF) liner elements serve as the model for these liner configurations, and Zwikker and Kosten's theory of sound propagation in channels is used to predict the surface impedance. Transmission matrices for the various liner elements incorporate both analytical and semi-empirical methods. This allows standard matrix techniques to be employed in the code to systematically calculate the composite impedance due to the individual liner elements. The ZKTL code consists of four independent subroutines:

      1. Single channel impedance calculation - linear version (SCIC)
      2. Single channel impedance calculation - nonlinear version (SCICNL)
      3. Multi-channel, multi-segment, multi-layer impedance calculation - linear version (MCMSML)
      4. Multi-channel, multi-segment, multi-layer impedance calculation - nonlinear version (MCMSMLNL)

    Detailed examples, comments, and explanations for each liner impedance computation module are included. Also contained in the guide are depictions of the interactive execution, input files and output files.
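
    The matrix bookkeeping described above can be sketched as a cascade of 2x2 transmission (ABCD) matrices whose product yields a composite surface impedance (illustrative values and a simplified termination, not ZKTL's actual element matrices):

```python
def matmul2(a, b):
    """2x2 complex matrix product."""
    return [[a[0][0] * b[0][0] + a[0][1] * b[1][0],
             a[0][0] * b[0][1] + a[0][1] * b[1][1]],
            [a[1][0] * b[0][0] + a[1][1] * b[1][0],
             a[1][0] * b[0][1] + a[1][1] * b[1][1]]]

def surface_impedance(layers, z_termination):
    """Cascade ABCD matrices (outermost layer first) and map the
    termination impedance through: Z = (A*Zt + B) / (C*Zt + D)."""
    t = [[1, 0], [0, 1]]
    for layer in layers:
        t = matmul2(t, layer)
    (A, B), (C, D) = t
    return (A * z_termination + B) / (C * z_termination + D)

# Two illustrative lossless layers in front of a resistive termination.
layers = [[[1, 0.3j], [0.2j, 1]],
          [[1, 0.1j], [0.4j, 1]]]
z = surface_impedance(layers, 1.0 + 0j)
```

    Because each element contributes one matrix factor, adding or swapping a liner segment is just one more multiplication in the cascade.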

  13. A hybrid gyrokinetic ion and isothermal electron fluid code for astrophysical plasma

    NASA Astrophysics Data System (ADS)

    Kawazura, Y.; Barnes, M.

    2018-05-01

    This paper describes a new code for simulating astrophysical plasmas that solves a hybrid model composed of gyrokinetic ions (GKI) and an isothermal electron fluid (ITEF) Schekochihin et al. (2009) [9]. This model captures ion kinetic effects that are important near the ion gyro-radius scale while electron kinetic effects are ordered out by an electron-ion mass ratio expansion. The code is developed by incorporating the ITEF approximation into AstroGK, an Eulerian δf gyrokinetics code specialized to a slab geometry Numata et al. (2010) [41]. The new code treats the linear terms in the ITEF equations implicitly while the nonlinear terms are treated explicitly. We show linear and nonlinear benchmark tests to prove the validity and applicability of the simulation code. Since the fast electron timescale is eliminated by the mass ratio expansion, the Courant-Friedrichs-Lewy condition is much less restrictive than in full gyrokinetic codes; the present hybrid code runs ~2√(m_i/m_e) ≈ 100 times faster than AstroGK with a single ion species and kinetic electrons, where m_i/m_e is the ion-electron mass ratio. The improvement of the computational time makes it feasible to execute ion scale gyrokinetic simulations with a high velocity space resolution and to run multiple simulations to determine the dependence of turbulent dynamics on parameters such as electron-ion temperature ratio and plasma beta.

  14. Introduction of the ASGARD Code

    NASA Technical Reports Server (NTRS)

    Bethge, Christian; Winebarger, Amy; Tiwari, Sanjiv; Fayock, Brian

    2017-01-01

    ASGARD stands for 'Automated Selection and Grouping of events in AIA Regional Data'. The code is a refinement of the event detection method in Ugarte-Urra & Warren (2014). It is intended to automatically detect and group brightenings ('events') in the AIA EUV channels, to record event parameters, and to find related events over multiple channels. Ultimately, the goal is to automatically determine heating and cooling timescales in the corona and to significantly increase statistics in this respect. The code is written in IDL and requires the SolarSoft library. It is parallelized and can run with multiple CPUs. Input files are regions of interest (ROIs) in time series of AIA images from the JSOC cutout service (http://jsoc.stanford.edu/ajax/exportdata.html). The ROIs need to be tracked, co-registered, and limited in time (typically 12 hours).
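
    A minimal sketch of the detect-and-group step in plain Python (a hypothetical simplification: threshold a single frame and flood-fill 4-connected bright pixels into events; ASGARD itself is written in IDL and works on tracked AIA time series across channels):

```python
def label_events(image, threshold):
    """Label 4-connected groups of above-threshold pixels 1, 2, ..."""
    rows, cols = len(image), len(image[0])
    labels = [[0] * cols for _ in range(rows)]
    current = 0
    for r in range(rows):
        for c in range(cols):
            if image[r][c] > threshold and labels[r][c] == 0:
                current += 1
                stack = [(r, c)]              # flood fill from this seed
                while stack:
                    y, x = stack.pop()
                    if (0 <= y < rows and 0 <= x < cols
                            and image[y][x] > threshold
                            and labels[y][x] == 0):
                        labels[y][x] = current
                        stack += [(y + 1, x), (y - 1, x),
                                  (y, x + 1), (y, x - 1)]
    return labels, current

# Two separate brightenings in a toy frame.
img = [[0, 5, 5, 0],
       [0, 5, 0, 0],
       [0, 0, 0, 7]]
labels, n_events = label_events(img, 1)
```

    Extending the grouping across frames and channels then amounts to matching labels whose pixels overlap in space and time.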

  15. Event dependence in U.S. executions

    PubMed Central

    Baumgartner, Frank R.; Box-Steffensmeier, Janet M.

    2018-01-01

    Since 1976, the United States has seen over 1,400 judicial executions, and these have been highly concentrated in only a few states and counties. The number of executions across counties appears to fit a stretched distribution. These distributions are typically reflective of self-reinforcing processes where the probability of observing an event increases for each previous event. To examine these processes, we employ a two-pronged empirical strategy. First, we utilize bootstrapped Kolmogorov-Smirnov tests to determine whether the pattern of executions reflects a stretched distribution, and confirm that it does. Second, we test for event dependence using the Conditional Frailty Model. Our tests estimate the monthly hazard of an execution in a given county, accounting for the number of previous executions, homicides, poverty, and population demographics. Controlling for other factors, we find that the number of prior executions in a county increases the probability of the next execution and accelerates its timing. Once a jurisdiction goes down a given path, the path becomes self-reinforcing, causing the counties to separate out into those never executing (the vast majority of counties) and those which use the punishment frequently. This finding is of great legal and normative concern, and ultimately, may not be consistent with the equal protection clause of the U.S. Constitution. PMID:29293583
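
    A minimal sketch of the first prong: a bootstrapped two-sample Kolmogorov-Smirnov comparison of heavy-tailed counts against a light-tailed null (entirely hypothetical data and null distribution, not the paper's fitted model):

```python
import random

def ks_statistic(a, b):
    """Maximum distance between the empirical CDFs of two samples."""
    xs = sorted(set(a) | set(b))
    cdf = lambda data, x: sum(v <= x for v in data) / len(data)
    return max(abs(cdf(a, x) - cdf(b, x)) for x in xs)

def bootstrap_pvalue(observed, sampler, n_boot=200, seed=0):
    """Fraction of null-distributed resamples at least as far from a
    large null reference sample as the observed data are."""
    random.seed(seed)
    null = [sampler() for _ in range(10 * len(observed))]
    d_obs = ks_statistic(observed, null)
    hits = 0
    for _ in range(n_boot):
        resample = [sampler() for _ in range(len(observed))]
        hits += ks_statistic(resample, null) >= d_obs
    return hits / n_boot

# Most "counties" see no events while a few see very many; this stretched
# pattern is a poor fit to a light-tailed null, so p should be small.
heavy = [0] * 50 + [1] * 10 + [2] * 5 + [20, 40, 120]
p = bootstrap_pvalue(heavy, lambda: random.choice([0, 0, 0, 1, 1, 2]))
```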

  16. Intergenerational Transmission of Neuropsychological Executive Functioning

    PubMed Central

    Jester, Jennifer M.; Nigg, Joel T.; Puttler, Leon I.; Long, Jeffrey C.; Fitzgerald, Hiram E.; Zucker, Robert A.

    2009-01-01

    Relationships between parent and child executive functioning were examined, controlling for the critical potential confound of IQ, in a family study involving 434 children (130 girls, 304 boys) and 376 parents from 204 community recruited families at high risk for the development of substance use disorder. Structural equation modeling found evidence of separate executive functioning and intelligence (IQ) latent variables. Mother’s and father’s executive functioning were associated with child’s executive functioning (beta = 0.34 for father-child, 0.51 for mother-child), independently of parental IQ, which as expected was associated with child’s IQ (beta = 0.52 for father-child, 0.54 for mother-child). Familial correlations also showed a significant relationship of executive functioning between parents and offspring. These findings clarify that key elements of the executive functioning construct are reliably differentiable from IQ, and are transmitted in families. This work supports the utility of the construct of executive function in further study of the mechanisms and etiology of externalizing psychopathologies. PMID:19243871

  17. Executive cognitive tests for the evaluation of patients with Parkinson’s disease

    PubMed Central

    Sobreira, Emmanuelle Silva Tavares; Pena, Marina Ceres Silva; Silva Filho, José Humberto; Souza, Carolina Pinto; Oliveira, Guiomar Nascimento; Tumas, Vitor; do Vale, Francisco de Assis Carvalho

    2008-01-01

    Parkinson’s disease (PD) is characterized by changes in movement, which are later followed by cognitive, behavioral and psychological changes. The objective of the present study was to correlate different tests used to examine executive functions in PD patients followed at a specialized outpatient clinic. Methods: Thirty-five patients with idiopathic PD, aged 63.0 years on average and with mean schooling of 5.5±4.2 years, were examined using the following tests: Mattis Dementia Rating Scale (MDRS), Scales for Outcomes of Parkinson’s Disease-Cognition (SCOPA-COG), Wisconsin Card Sorting Test (WCST), Frontal Assessment Battery (FAB), Digit Span – Inverse Order (IO) (a subtest of the WAIS III) and Verbal Fluency Test (category animals). Results: Significant correlations were detected between FAB and MDRS Conceptualization (0.814), MDRS Initiation/Perseveration (I/P) and SCOPA-COG Executive Function (0.643), FAB and MDRS I/P (0.601), FAB and Verbal Fluency (0.602), MDRS I/P and MDRS Conceptualization (0.558), Verbal Fluency and MDRS I/P (0.529), MDRS Attention and SCOPA-COG Executive Function (0.495), MDRS Conceptualization and SCOPA-COG Executive Function (0.520), FAB and Digit Span (IO) (0.503), Verbal Fluency and MDRS Conceptualization (0.501), WCST perseverative errors and FAB (–0.379), WCST perseverative errors and MDRS Conceptualization (0.445), WCST perseverative errors and MDRS I/P (–0.407), and WCST categories completed and MDRS Conceptualization (0.382). Discussion: The results demonstrated strong correlations between most of the tests applied, but no associations were detected between the WCST and the other tests, a fact that may be explained by the heterogeneity of the patients' scores. A limitation of the present study was the lack of a control group for establishing adequate norms for this population. PMID:29213572

  18. Code Optimization Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MAGEE,GLEN I.

    Computers transfer data in a number of different ways. Whether through a serial port, a parallel port, over a modem, over an ethernet cable, or internally from a hard disk to memory, some data will be lost. To compensate for that loss, numerous error detection and correction algorithms have been developed. One of the most common error correction codes is the Reed-Solomon code, which is a special subset of BCH (Bose-Chaudhuri-Hocquenghem) linear cyclic block codes. In the AURA project, an unmanned aircraft sends the data it collects back to earth so it can be analyzed during flight and possible flight modifications made. To counter possible data corruption during transmission, the data is encoded using a multi-block Reed-Solomon implementation with a possibly shortened final block. In order to maximize the amount of data transmitted, it was necessary to reduce the computation time of a Reed-Solomon encoding to three percent of the processor's time. To achieve such a reduction, many code optimization techniques were employed. This paper outlines the steps taken to reduce the processing time of a Reed-Solomon encoding and the insight into modern optimization techniques gained from the experience.
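    One classic optimization of the kind the paper describes is replacing bit-by-bit Galois-field multiplication, the inner loop of Reed-Solomon encoding, with log/antilog table lookups. A minimal sketch (the 0x11d polynomial is a common 8-bit choice; the paper does not specify AURA's parameters):

    ```python
    PRIM = 0x11d  # a primitive polynomial commonly used for 8-bit Reed-Solomon

    def slow_gf_mul(a, b):
        """Bitwise carry-less multiply with reduction: the 'slow' reference."""
        r = 0
        while b:
            if b & 1:
                r ^= a
            b >>= 1
            a <<= 1
            if a & 0x100:
                a ^= PRIM
        return r

    # Precomputed log/antilog tables: the classic speedup that turns each
    # field multiply into two table reads and one integer addition.
    EXP = [0] * 512          # doubled so LOG[a] + LOG[b] needs no modulo
    LOG = [0] * 256
    x = 1
    for i in range(255):
        EXP[i] = x
        LOG[x] = i
        x = slow_gf_mul(x, 2)    # 2 generates the multiplicative group
    for i in range(255, 512):
        EXP[i] = EXP[i - 255]

    def gf_mul(a, b):
        """Table-driven GF(256) multiply."""
        if a == 0 or b == 0:
            return 0
        return EXP[LOG[a] + LOG[b]]
    ```

    Doubling the antilog table removes the modulo from the hot path, a small example of the constant-factor tuning the paper's three-percent budget demands.
    
    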

  19. Symbolically Modeling Concurrent MCAPI Executions

    NASA Technical Reports Server (NTRS)

    Fischer, Topher; Mercer, Eric; Rungta, Neha

    2011-01-01

    Improper use of Inter-Process Communication (IPC) within concurrent systems often creates data races which can lead to bugs that are challenging to discover. Techniques that use Satisfiability Modulo Theories (SMT) problems to symbolically model possible executions of concurrent software have recently been proposed for use in the formal verification of software. In this work we describe a new technique for modeling executions of concurrent software that use a message passing API called MCAPI. Our technique uses an execution trace to create an SMT problem that symbolically models all possible concurrent executions and follows the same sequence of conditional branch outcomes as the provided execution trace. We check if there exists a satisfying assignment to the SMT problem with respect to specific safety properties. If such an assignment exists, it provides the conditions that lead to the violation of the property. We show how our method models behaviors of MCAPI applications that are ignored in previously published techniques.
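    The space of executions such a symbolic model must cover can be illustrated by explicit enumeration on a toy trace (a brute-force stand-in for the SMT encoding; the events and safety property below are hypothetical):

    ```python
    def interleavings(t1, t2):
        """All merges of two per-process event sequences that preserve each
        process's program order."""
        if not t1:
            yield tuple(t2)
            return
        if not t2:
            yield tuple(t1)
            return
        for rest in interleavings(t1[1:], t2):
            yield (t1[0],) + rest
        for rest in interleavings(t1, t2[1:]):
            yield (t2[0],) + rest

    # Hypothetical traces: process A sends m1 then m2; process B receives both.
    A = [("send", "m1"), ("send", "m2")]
    B = [("recv", "m1"), ("recv", "m2")]

    def violates(order):
        """Safety property: a message must be sent before it is received."""
        sent = set()
        for op, msg in order:
            if op == "send":
                sent.add(msg)
            elif msg not in sent:
                return True
        return False

    bad = [o for o in interleavings(A, B) if violates(o)]
    ```

    An SMT encoding replaces this exponential enumeration with order variables and constraints, asking a solver whether any satisfying assignment reaches a violating order.
    
    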

  20. The role of executive functioning in children's attentional pain control: an experimental analysis.

    PubMed

    Verhoeven, Katrien; Dick, Bruce; Eccleston, Christopher; Goubert, Liesbet; Crombez, Geert

    2014-02-01

    Directing attention away from pain is often used in children's pain treatment programs to control pain. However, empirical evidence concerning its effectiveness is inconclusive. We therefore sought to understand other influencing factors, including executive function and its role in the pain experience. This study investigates the role of executive functioning in the effectiveness of distraction. School children (n=164) completed executive functioning tasks (inhibition, switching, and working memory) and performed a cold-pressor task. One half of the children simultaneously performed a distracting tone-detection task; the other half did not. Results showed that participants in the distraction group were engaged in the distraction task and were reported to pay significantly less attention to pain than controls. Executive functioning influenced distraction task engagement. More specifically, participants with good inhibition and working memory abilities performed the distraction task better; participants with good switching abilities reported having paid more attention to the distraction task. Furthermore, distraction was found to be ineffective in reducing pain intensity and affect. Executive functioning did not influence the effectiveness of distraction. However, a relationship was found between executive functioning and pain affect, indicating that participants with good inhibition and working memory abilities experienced the cold-pressor task as less stressful and unpleasant. Our findings suggest that distraction as a process for managing pain is complex. While it appears that executive function may play a role in adult distraction, in this study it did not direct attention away from pain. It may instead be involved in the overall pain experience. Copyright © 2013 International Association for the Study of Pain. Published by Elsevier B.V. All rights reserved.

  1. Use of diagnosis codes for detection of clinically significant opioid poisoning in the emergency department: A retrospective analysis of a surveillance case definition.

    PubMed

    Reardon, Joseph M; Harmon, Katherine J; Schult, Genevieve C; Staton, Catherine A; Waller, Anna E

    2016-02-08

    Although fatal opioid poisonings tripled from 1999 to 2008, data describing nonfatal poisonings are rare. Public health authorities need tools to track opioid poisonings in near real time. We determined the utility of ICD-9-CM diagnosis codes for identifying clinically significant opioid poisonings in a state-wide emergency department (ED) surveillance system. We sampled visits from four hospitals from July 2009 to June 2012 with diagnosis codes of 965.00, 965.01, 965.02 and 965.09 (poisoning by opiates and related narcotics) and/or an external cause of injury code of E850.0-E850.2 (accidental poisoning by opiates and related narcotics), and developed a novel case definition to determine in which cases opioid poisoning prompted the ED visit. We calculated the percentage of visits coded for opioid poisoning that were clinically significant and compared it to the percentage of visits coded for poisoning by non-opioid agents in which there was actually poisoning by an opioid agent. We created a multivariate regression model to determine whether other collected triage data could improve the positive predictive value of diagnosis codes alone for detecting clinically significant opioid poisoning. 70.1% of visits (standard error 2.4%) coded for opioid poisoning were primarily prompted by opioid poisoning. The remainder represented opioid exposure in the setting of other primary diseases. Among the non-opioid poisoning codes reviewed, up to 36% were reclassified as an opioid poisoning. In multivariate analysis, only naloxone use improved the positive predictive value of ICD-9-CM codes for identifying clinically significant opioid poisoning, but it was associated with a high false-negative rate. This surveillance mechanism identifies many clinically significant opioid overdoses with a high positive predictive value. With further validation, it may help target control measures such as prescriber education and pharmacy monitoring.
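    The headline metrics reduce to simple confusion-matrix arithmetic, sketched below with hypothetical chart-review tallies (not the study's data):

    ```python
    def ppv(tp, fp):
        """Positive predictive value: of the visits the codes flag, the
        fraction that are true clinically significant poisonings."""
        return tp / (tp + fp)

    def sensitivity(tp, fn):
        """Of all true poisonings, the fraction the codes catch."""
        return tp / (tp + fn)

    # Hypothetical tallies: of 1000 flagged visits, 701 are true poisonings
    # (true positives); 120 true cases carried other codes (false negatives).
    tp, fp, fn = 701, 299, 120
    ```

    The study's trade-off appears directly in these formulas: adding naloxone use as a criterion shrinks fp (raising PPV) but grows fn (lowering sensitivity).
    
    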

  2. Fast-neutron, coded-aperture imager

    NASA Astrophysics Data System (ADS)

    Woolf, Richard S.; Phlips, Bernard F.; Hutcheson, Anthony L.; Wulf, Eric A.

    2015-06-01

    This work discusses a large-scale, coded-aperture imager for fast neutrons, building off a proof-of-concept instrument developed at the U.S. Naval Research Laboratory (NRL). The Space Science Division at the NRL has a heritage of developing large-scale, mobile systems, using coded-aperture imaging, for long-range γ-ray detection and localization. The fast-neutron, coded-aperture imaging instrument, designed for a mobile unit (20 ft. ISO container), consists of a 32-element array of 15 cm×15 cm×15 cm liquid scintillation detectors (EJ-309) mounted behind a 12×12 pseudorandom coded aperture. The elements of the aperture are composed of 15 cm×15 cm×10 cm blocks of high-density polyethylene (HDPE). The arrangement of the aperture elements produces a shadow pattern on the detector array behind the mask. By measuring the number of neutron counts per masked and unmasked detector, and with knowledge of the mask pattern, a source image can be deconvolved to obtain a 2-D location. The number of neutrons per detector was obtained by processing the fast signal from each PMT in flash-digitizing electronics. Digital pulse shape discrimination (PSD) was performed to filter out the fast-neutron signal from the γ background. The prototype instrument was tested at an indoor facility at the NRL with 1.8-μCi and 13-μCi 252Cf neutron/γ sources at three standoff distances of 9, 15 and 26 m (the maximum allowed in the facility) over a 15-min integration time. The imaging and detection capabilities of the instrument were tested by moving the source in half- and one-pixel increments across the image plane. We show a representative sample of the results obtained at one-pixel increments for a standoff distance of 9 m. The 1.8-μCi source was not detected at the 26-m standoff. In order to increase the sensitivity of the instrument, we reduced the fast-neutron background by shielding the top, sides and back of the detector array with 10-cm-thick HDPE. This shielding configuration led
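    The decode-by-correlation idea can be shown in a 1-D analogue. Here a maximal-length (m-)sequence plays the role of the pseudorandom mask, because its two-valued autocorrelation lets a point source reconstruct exactly; the instrument's actual 12×12 mask and geometry are not reproduced, and the source position and intensity are made up:

    ```python
    import numpy as np

    def m_sequence(n_bits=4, taps=(4, 3)):
        """Maximal-length LFSR sequence (period 2**n_bits - 1). Its flat
        off-peak autocorrelation is what makes the mask cleanly decodable."""
        state = [1] * n_bits
        seq = []
        for _ in range(2 ** n_bits - 1):
            seq.append(state[-1])
            fb = state[taps[0] - 1] ^ state[taps[1] - 1]
            state = [fb] + state[:-1]
        return np.array(seq)

    mask = m_sequence()                  # 15-element pseudorandom aperture (1 = open)
    decoder = 2 * mask - 1               # +/-1 decoding array
    pos, flux = 6, 100.0                 # hypothetical point-source position, intensity
    counts = flux * np.roll(mask, pos)   # shadow pattern cast on the detector array
    recon = np.array([(counts * np.roll(decoder, j)).sum()
                      for j in range(len(mask))])   # correlate with the mask
    ```

    The reconstruction peaks exactly at the source position with zero sidelobes, which is the property a well-chosen pseudorandom mask buys over a single pinhole.
    
    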

  3. Adaptive Impact-Driven Detection of Silent Data Corruption for HPC Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Di, Sheng; Cappello, Franck

    For exascale HPC applications, silent data corruption (SDC) is one of the most dangerous problems because there is no indication that errors occur during execution. We propose an adaptive, impact-driven method that can detect SDCs dynamically. The key contributions are threefold. (1) We carefully characterize 18 real-world HPC applications and discuss their runtime data features, as well as the impact of SDCs on their execution results. (2) We propose an impact-driven detection model that does not blindly improve prediction accuracy, but instead detects only influential SDCs so as to guarantee user-acceptable execution results. (3) Our solution can adapt to dynamic prediction errors based on local runtime data and can automatically tune detection ranges to guarantee low false-alarm rates. Experiments show that our detector can detect 80-99.99% of SDCs with a false-alarm rate of less than 1% of iterations in most cases. The memory cost and detection overhead are reduced to 15% and 6.3%, respectively, for a large majority of applications.
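    A minimal sketch of an impact-driven detector in this spirit, assuming linear extrapolation as the predictor and a history-scaled detection range (the paper's predictors and tuning are more sophisticated; every parameter here is illustrative):

    ```python
    import math

    def detect_sdc(series, window=8, factor=5.0, floor=1e-6):
        """Predict each value by linear extrapolation from the last two
        accepted values; flag a sample only when its prediction error dwarfs
        the recent error history, then repair it with the prediction so one
        corruption cannot cascade into later predictions."""
        flagged = []
        clean = list(series[:2])
        errs = []
        for i in range(2, len(series)):
            pred = 2 * clean[-1] - clean[-2]        # linear extrapolation
            err = abs(series[i] - pred)
            thresh = factor * max(errs[-window:], default=0.0) + floor
            if errs and err > thresh:
                flagged.append(i)
                clean.append(pred)                  # repair with the prediction
            else:
                clean.append(series[i])
                errs.append(err)                    # only healthy errors tune the range
        return flagged, clean

    series = [math.sin(0.1 * t) for t in range(100)]
    series[50] += 5.0                               # injected silent corruption
    flags, repaired = detect_sdc(series)
    ```

    Scaling the range from recent errors is what makes the detector "adaptive": a smooth field keeps a tight range, a noisy one loosens it, keeping false alarms low without per-application thresholds.
    
    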

  4. Television and children's executive function.

    PubMed

    Lillard, Angeline S; Li, Hui; Boguszewski, Katie

    2015-01-01

    Children spend a lot of time watching television on its many platforms: directly, online, and via videos and DVDs. Many researchers are concerned that some types of television content appear to negatively influence children's executive function. Because (1) executive function predicts key developmental outcomes, (2) executive function appears to be influenced by some television content, and (3) American children watch large quantities of television (including the content of concern), the issues discussed here comprise a crucial public health issue. Further research is needed to reveal exactly what television content is implicated, what underlies television's effect on executive function, how long the effect lasts, and who is affected. © 2015 Elsevier Inc. All rights reserved.

  5. Executive High School Internships

    ERIC Educational Resources Information Center

    Hirsch, Sharlene Pearlman

    1974-01-01

    The Executive High School Internships Program enables juniors and seniors to take a one-semester sabbatical from their studies to serve as special assistants to executives in government, business, non-profit organizations, and civic organizations. They perform a variety of duties, earning full academic credit for their participation. (AG)

  6. Assisting Movement Training and Execution With Visual and Haptic Feedback.

    PubMed

    Ewerton, Marco; Rother, David; Weimar, Jakob; Kollegger, Gerrit; Wiemeyer, Josef; Peters, Jan; Maeda, Guilherme

    2018-01-01

    In the practice of motor skills in general, errors in the execution of movements may go unnoticed when a human instructor is not available. In this case, a computer system or robotic device able to detect movement errors and propose corrections would be of great help. This paper addresses the problem of how to detect such execution errors and how to provide feedback to the human to correct his/her motor skill using a general, principled methodology based on imitation learning. The core idea is to compare the observed skill with a probabilistic model learned from expert demonstrations. The intensity of the feedback is regulated by the likelihood of the model given the observed skill. Based on demonstrations, our system can, for example, detect errors in the writing of characters with multiple strokes. Moreover, by using a haptic device, the Haption Virtuose 6D, we demonstrate a method to generate haptic feedback based on a distribution over trajectories, which could be used as an auxiliary means of communication between an instructor and an apprentice. Additionally, given a performance measurement, the haptic device can help the human discover and perform better movements to solve a given task. In this case, the human first tries a few times to solve the task without assistance. Our framework, in turn, uses a reinforcement learning algorithm to compute haptic feedback, which guides the human toward better solutions.
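    The likelihood-regulated feedback idea can be sketched with a per-timestep Gaussian fitted to demonstrations (a simplification of the paper's probabilistic trajectory model; the demonstration data are made up):

    ```python
    import numpy as np

    def feedback_gains(demos, observed, base_gain=1.0):
        """Feedback magnitude per timestep, scaled by how unlikely the observed
        trajectory point is under a Gaussian fitted to the demonstrations:
        large deviation in standard-deviation units, strong corrective pull."""
        demos = np.asarray(demos, dtype=float)   # shape (n_demos, T)
        mean = demos.mean(axis=0)
        std = demos.std(axis=0) + 1e-9           # avoid division by zero
        z = np.abs(observed - mean) / std        # deviation in std units
        return base_gain * z, mean - observed    # intensity and direction

    # Hypothetical demonstrations of a 4-sample stroke, and one flawed execution.
    demos = [[0.0, 1.0, 2.0, 3.0],
             [0.1, 1.1, 2.1, 3.1],
             [-0.1, 0.9, 1.9, 2.9]]
    observed = np.array([0.0, 1.0, 5.0, 3.0])    # error at the third sample
    gain, correction = feedback_gains(demos, observed)
    ```

    Normalizing by the demonstrated variability means the device pushes hard only where experts agree, and stays gentle where demonstrations themselves diverge.
    
    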

  7. Performance Bounds on Two Concatenated, Interleaved Codes

    NASA Technical Reports Server (NTRS)

    Moision, Bruce; Dolinar, Samuel

    2010-01-01

    A method has been developed of computing bounds on the performance of a code comprised of two linear binary codes generated by two encoders serially concatenated through an interleaver. Originally intended for use in evaluating the performances of some codes proposed for deep-space communication links, the method can also be used in evaluating the performances of short-block-length codes in other applications. The method applies, more specifically, to a communication system in which the following processes take place: At the transmitter, the original binary information that one seeks to transmit is first processed by an encoder into an outer code (Co) characterized by, among other things, a pair of numbers (n, k), where n (n > k) is the total number of code bits associated with k information bits, and n − k bits are used for correcting or at least detecting errors. Next, the outer code is processed through either a block or a convolutional interleaver. In the block interleaver, the words of the outer code are processed in blocks of I words. In the convolutional interleaver, the interleaving operation is performed bit-wise in N rows with delays that are multiples of B bits. The output of the interleaver is processed through a second encoder to obtain an inner code (Ci) characterized by (ni, ki). The output of the inner code is transmitted over an additive-white-Gaussian-noise channel characterized by a symbol signal-to-noise ratio (SNR) Es/No and a bit SNR Eb/No. At the receiver, an inner decoder generates estimates of bits. Depending on whether a block or a convolutional interleaver is used at the transmitter, the sequence of estimated bits is processed through a block or a convolutional de-interleaver, respectively, to obtain estimates of code words.
Then the estimates of the code words are processed through an outer decoder, which generates estimates of the original information along with flags indicating which estimates are presumed to be correct and which are found to
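    The block interleaver described above can be sketched as a simple permutation; `depth` plays the role of the block size I, and the symbol values are illustrative:

    ```python
    def interleave(symbols, depth):
        """Block interleaver: treat the input as `depth` rows of code-word
        symbols and read it out column by column, so a burst of channel
        errors is spread across `depth` different code words."""
        assert len(symbols) % depth == 0
        cols = len(symbols) // depth
        return [symbols[r * cols + c] for c in range(cols) for r in range(depth)]

    def deinterleave(symbols, depth):
        """Inverse permutation: restore row-major order at the receiver."""
        cols = len(symbols) // depth
        return [symbols[c * depth + r] for r in range(depth) for c in range(cols)]

    codewords = list(range(12))          # three hypothetical 4-symbol code words
    sent = interleave(codewords, depth=3)
    ```

    After interleaving, any three consecutive channel symbols belong to three different code words, which is why a burst the outer code could not correct in one word becomes three correctable single errors.
    
    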

  8. Planning and Execution for an Autonomous Aerobot

    NASA Technical Reports Server (NTRS)

    Gaines, Daniel M.; Estlin, Tara A.; Schaffer, Steven R.; Chouinard, Caroline M.

    2010-01-01

    The Aerial Onboard Autonomous Science Investigation System (AerOASIS) system provides autonomous planning and execution capabilities for aerial vehicles (see figure). The system is capable of generating high-quality operations plans that integrate observation requests from ground planning teams, as well as opportunistic science events detected onboard the vehicle while respecting mission and resource constraints. AerOASIS allows an airborne planetary exploration vehicle to summarize and prioritize the most scientifically relevant data; identify and select high-value science sites for additional investigation; and dynamically plan, schedule, and monitor the various science activities being performed, even during extended communications blackout periods with Earth.

  9. Checkpoint-based forward recovery using lookahead execution and rollback validation in parallel and distributed systems. Ph.D. Thesis, 1992

    NASA Technical Reports Server (NTRS)

    Long, Junsheng

    1994-01-01

    This thesis studies a forward recovery strategy using checkpointing and optimistic execution in parallel and distributed systems. The approach uses replicated tasks executing on different processors for forward recovery and checkpoint comparison for error detection. To reduce overall redundancy, this approach employs lower static redundancy in the common error-free situation to detect errors than the standard N-Modular Redundancy (NMR) scheme does to mask errors. For the rare occurrence of an error, the approach uses extra redundancy for recovery. To reduce the run-time recovery overhead, look-ahead processes are used to advance the computation speculatively, and a rollback process is used to produce a diagnosis for correct look-ahead processes without rolling back the whole system. Both analytical and experimental evaluation have shown that this strategy can provide nearly error-free execution time, even under faults, with a lower average redundancy than NMR.
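    The duplication-plus-comparison core of the strategy can be caricatured in a few lines (a drastic simplification: real lookahead and rollback validation involve checkpoints and process coordination, and the fault injection here is artificial):

    ```python
    def step(x):
        """One unit of computation between checkpoints (toy: increment)."""
        return x + 1

    def run_with_forward_recovery(state, n_steps, corrupt_at):
        """Two replicas run each step; checkpoints are compared, and on a
        mismatch a third execution arbitrates, a simplified stand-in for the
        thesis's lookahead/rollback-validation scheme."""
        recoveries = 0
        for i in range(n_steps):
            a = step(state)
            b = step(state) if i != corrupt_at else step(state) ^ 0xFF  # injected fault
            if a == b:
                state = a                       # checkpoints agree: advance
            else:
                recoveries += 1
                c = step(state)                 # tie-breaking re-execution
                state = a if a == c else b
        return state, recoveries

    final, n_recoveries = run_with_forward_recovery(0, n_steps=10, corrupt_at=3)
    ```

    The point of the duplex-plus-arbitration shape is visible even in the toy: only two executions are paid in the error-free common case, and the third is spent only when a mismatch is detected.
    
    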

  10. Development of an integrated thermal-hydraulics capability incorporating RELAP5 and PANTHER neutronics code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Page, R.; Jones, J.R.

    1997-07-01

    Ensuring that safety analysis needs are met in the future is likely to lead to the development of new codes and the further development of existing codes. It is therefore advantageous to define standards for data interfaces and to develop software interfacing techniques which can readily accommodate changes when they are made. Defining interface standards is beneficial but is necessarily restricted in application if future requirements are not known in detail. Code interfacing methods are of particular relevance with the move towards automatic grid frequency response operation, where the integration of plant dynamic, core follow and fault study calculation tools is considered advantageous. This paper describes the background and features of a new code TALINK (Transient Analysis code LINKage program) used to provide a flexible interface to link the RELAP5 thermal hydraulics code with the PANTHER neutron kinetics and the SIBDYM whole plant dynamic modelling codes used by Nuclear Electric. The complete package enables the codes to be executed in parallel and provides an integrated whole plant thermal-hydraulics and neutron kinetics model. In addition the paper discusses the capabilities and pedigree of the component codes used to form the integrated transient analysis package and the details of the calculation of a postulated Sizewell 'B' loss-of-offsite-power fault transient.

  11. 76 FR 66235 - Bar Code Technologies for Drugs and Biological Products; Retrospective Review Under Executive...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-26

    ... costs and benefits of the rule and to identify any relevant changes in technology that have occurred... access to care; Whether the public health benefits of an action have been realized; Whether the public or... reviewing under E.O. 13563 is the Bar Code Final Rule. The Agency plans to reassess its costs and benefits...

  12. Cortical sources of visual evoked potentials during consciousness of executive processes.

    PubMed

    Babiloni, Claudio; Vecchio, Fabrizio; Iacoboni, Marco; Buffo, Paola; Eusebi, Fabrizio; Rossini, Paolo Maria

    2009-03-01

    What is the timing of cortical activation related to consciousness of visuo-spatial executive functions? Electroencephalographic data (128 channels) were recorded in 13 adults. Cue stimulus briefly appeared on right or left (equal probability) monitor side for a period, inducing about 50% of recognitions. It was then masked and followed (2 s) by a central visual go stimulus. Left (right) mouse button had to be clicked after right (left) cue stimulus. This "inverted" response indexed executive processes. Afterward, subjects said "seen" if they had detected the cue stimulus or "not seen" when it was missed. Sources of event-related potentials (ERPs) were estimated by LORETA software. The inverted responses were about 95% in seen trials and about 60% in not seen trials. Cue stimulus evoked frontal-parietooccipital potentials, having the same peak latencies in the seen and not seen data. Maximal difference in amplitude of the seen and not seen ERPs was detected at about +300-ms post-stimulus (P3). P3 sources were higher in amplitude in the seen than not seen trials in dorsolateral prefrontal, premotor and parietooccipital areas. This was true in dorsolateral prefrontal and premotor cortex even when percentage of the inverted responses and reaction time were paired in the seen and not seen trials. These results suggest that, in normal subjects, the primary consciousness enhances the efficacy of visuo-spatial executive processes and is sub-served by a late (100- to 400-ms post-stimulus) enhancement of the neural synchronization in frontal areas.

  13. Ultrasound strain imaging using Barker code

    NASA Astrophysics Data System (ADS)

    Peng, Hui; Tie, Juhong; Guo, Dequan

    2017-01-01

    Ultrasound strain imaging is showing promise as a new way of imaging soft tissue elasticity in order to help clinicians detect lesions or cancers in tissues. In this paper, Barker code is applied to strain imaging to improve its quality. Barker code as a coded excitation signal can be used to improve the echo signal-to-noise ratio (eSNR) in an ultrasound imaging system. For the Barker code of length 13, the sidelobe level of the matched filter output is -22dB, which is unacceptable for ultrasound strain imaging, because a high sidelobe level will cause high decorrelation noise. Instead of using the conventional matched filter, we use the Wiener filter to decode the Barker-coded echo signal to suppress the range sidelobes. We also compare the performance of the Barker code and the conventional short pulse in simulations. The simulation results demonstrate that the performance of the Wiener filter is much better than that of the matched filter, and the Barker code achieves a higher elastographic signal-to-noise ratio (SNRe) than the short pulse in low-eSNR or great-depth conditions due to the increased eSNR.
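    The quoted -22 dB figure follows directly from the Barker-13 autocorrelation, whose peak is 13 and whose sidelobes have magnitude at most 1. A quick check (matched-filter output only; the paper's Wiener-filter decoding is not reproduced here):

    ```python
    import numpy as np

    # Length-13 Barker code as a +/-1 sequence.
    barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1])

    # The matched-filter response to the code itself is its autocorrelation.
    acf = np.correlate(barker13, barker13, mode="full")
    peak = acf.max()                                        # 13, at zero lag
    sidelobe = np.abs(np.delete(acf, len(acf) // 2)).max()  # 1 (Barker property)
    sidelobe_db = 20 * np.log10(sidelobe / peak)            # about -22.3 dB
    ```

    A Wiener (mismatched) filter trades a little mainlobe energy for much deeper sidelobe suppression, which is why the paper prefers it for strain imaging.
    
    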

  14. FLY MPI-2: a parallel tree code for LSS

    NASA Astrophysics Data System (ADS)

    Becciani, U.; Comparato, M.; Antonuccio-Delogu, V.

    2006-04-01

    New version program summary. Program title: FLY 3.1. Catalogue identifier: ADSC_v2_0. Licensing provisions: yes. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADSC_v2_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. No. of lines in distributed program, including test data, etc.: 158 172. No. of bytes in distributed program, including test data, etc.: 4 719 953. Distribution format: tar.gz. Programming language: Fortran 90, C. Computer: Beowulf cluster, PC, MPP systems. Operating system: Linux, Aix. RAM: 100M words. Catalogue identifier of previous version: ADSC_v1_0. Journal reference of previous version: Comput. Phys. Comm. 155 (2003) 159. Does the new version supersede the previous version?: yes. Nature of problem: FLY is a parallel collisionless N-body code for the calculation of the gravitational force. Solution method: FLY is based on the hierarchical oct-tree domain decomposition introduced by Barnes and Hut (1986). Reasons for the new version: The new version of FLY is implemented using the MPI-2 standard; the distributed version 3.1 was developed using the MPICH2 library on a PC Linux cluster. Today the performance of FLY places it among the most powerful parallel codes for tree N-body simulations. Another important new feature is the availability of an interface with hydrodynamical Paramesh-based codes. Simulations must follow a box large enough to accurately represent the power spectrum of fluctuations on very large scales so that the results may be compared meaningfully with real data. The number of particles then sets the mass resolution of the simulation, which we would like to make as fine as possible. The interface between two codes with different and complementary cosmological tasks allows complex cosmological simulations to be executed with FLY, specialized for DM evolution, and a code specialized for hydrodynamical components that uses a Paramesh block

  15. High Speed Research Noise Prediction Code (HSRNOISE) User's and Theoretical Manual

    NASA Technical Reports Server (NTRS)

    Golub, Robert (Technical Monitor); Rawls, John W., Jr.; Yeager, Jessie C.

    2004-01-01

    This report describes a computer program, HSRNOISE, that predicts noise levels for a supersonic aircraft powered by mixed flow turbofan engines with rectangular mixer-ejector nozzles. It fully documents the noise prediction algorithms, provides instructions for executing the HSRNOISE code, and provides predicted noise levels for the High Speed Research (HSR) program Technology Concept (TC) aircraft. The component source noise prediction algorithms were developed jointly by Boeing, General Electric Aircraft Engines (GEAE), NASA and Pratt & Whitney during the course of the NASA HSR program. Modern Technologies Corporation developed an alternative mixer ejector jet noise prediction method under contract to GEAE that has also been incorporated into the HSRNOISE prediction code. Algorithms for determining propagation effects and calculating noise metrics were taken from the NASA Aircraft Noise Prediction Program.

  16. FastDart : a fast, accurate and friendly version of DART code.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rest, J.; Taboada, H.

    2000-11-08

    A new enhanced, visual version of the DART code is presented. DART is a mechanistic-model-based code developed for the performance calculation and assessment of aluminum dispersion fuel. Major features of this new version are a new, time-saving calculation routine able to be run on a PC, a friendly visual input interface and a plotting facility. This version, available for silicide and U-Mo fuels, adds faster execution and visual interfaces to the classical accuracy of DART models for fuel performance prediction. It is part of a collaboration agreement between ANL and CNEA in the area of Low Enriched Uranium Advanced Fuels, held by the Implementation Arrangement for Technical Exchange and Cooperation in the Area of Peaceful Uses of Nuclear Energy.

  17. Executions and scientific anatomy.

    PubMed

    Dolezal, Antonín; Jelen, Karel; Stajnrtova, Olga

    2015-12-01

    The very word "anatomy" tells us about this branch's connection with dissection. Studies of anatomy have taken place for approximately 2,300 years. Anatomy's birthplace lies in Greece and Egypt. Knowledge in this specific field of science was necessary during surgical procedures in ophthalmology and obstetrics. Embalming took place without public disapproval, just like autopsies and the manipulation of relics; thus, anatomical dissection became part of the later forensic sciences. Anatomical studies on humans themselves, which needed to be compared with the knowledge gained through procedures performed on animals, elicited public disapprobation and prohibition. When faced with a shortage of cadavers, anatomists resorted to obtaining the bodies of the executed and of suicide victims, since torture and public display of the mutilated body (including anatomical autopsy) were perceived as an intensification of the death penalty. Decapitation and hanging were the main methods used to carry out death sentences. Anatomists preferred intact bodies for dissection; hence, convicts could avoid torture. This paper lists examples of how this process was resolved. It concerns the manners of killing, vivisection on people in antiquity and the Middle Ages, experiments before and after execution, revival from apparent death, experiments with galvanic electricity on fresh cadavers, evaluation of sensibility after execution by guillotine, and the making of perfect anatomical preparations and publications from fresh bodies of the executed during Nazism.

  18. 45 CFR 1700.5 - Executive Director.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 4 2012-10-01 2012-10-01 false Executive Director. 1700.5 Section 1700.5 Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL COMMISSION ON LIBRARIES AND INFORMATION SCIENCE ORGANIZATION AND FUNCTIONS § 1700.5 Executive Director. (a) The Executive Director serves...

  19. 45 CFR 1700.5 - Executive Director.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Executive Director. 1700.5 Section 1700.5 Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL COMMISSION ON LIBRARIES AND INFORMATION SCIENCE ORGANIZATION AND FUNCTIONS § 1700.5 Executive Director. (a) The Executive Director serves...

  20. 45 CFR 1700.5 - Executive Director.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 4 2014-10-01 2014-10-01 false Executive Director. 1700.5 Section 1700.5 Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL COMMISSION ON LIBRARIES AND INFORMATION SCIENCE ORGANIZATION AND FUNCTIONS § 1700.5 Executive Director. (a) The Executive Director serves...

  1. 45 CFR 1700.5 - Executive Director.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 4 2013-10-01 2013-10-01 false Executive Director. 1700.5 Section 1700.5 Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL COMMISSION ON LIBRARIES AND INFORMATION SCIENCE ORGANIZATION AND FUNCTIONS § 1700.5 Executive Director. (a) The Executive Director serves...

  2. 45 CFR 1700.5 - Executive Director.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 4 2011-10-01 2011-10-01 false Executive Director. 1700.5 Section 1700.5 Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL COMMISSION ON LIBRARIES AND INFORMATION SCIENCE ORGANIZATION AND FUNCTIONS § 1700.5 Executive Director. (a) The Executive Director serves...

  3. Executive Development: Key Factors for Success.

    ERIC Educational Resources Information Center

    Fenwick-Magrath, Julie A.

    1988-01-01

    Reports results from a survey of 12 leading corporations concerning their management of the executive development process. Indicates that involvement of the chief executive officer, a clear policy, a relationship between executive development and business strategies and objectives, annual succession planning, and management responsibility are key…

  4. Dynamic Detection of Malicious Code in COTS Software

    DTIC Science & Technology

    2000-04-01

    Extraction-damaged excerpt from a tool-evaluation table: the report runs a set of documented hostile applets and ActiveX controls against detection tools (some of the tools work only on mobile code such as Java and ActiveX). The recoverable results show eSafe Protect Desktop blocking 9/9 hostile applets and 13/17 ActiveX controls, with Surfinshield Online likewise blocking 9/9 hostile applets and 13/17 ActiveX controls. Exploder is noted as an ActiveX control that performs a clean shutdown of the computer; its interface is described as attractive, although rather complex.

  5. Codes of professional conduct for Australian Defence Force military physicians: envenomating the serpent?

    PubMed

    O'Connor, Mike

    2010-09-01

    The scandal of health professionals' involvement in recent human rights abuses in United States military detention centres has prompted concern that Australian military physicians should be well protected against similar pressures to participate in harsh interrogations. A framework of military health ethics has been proposed. Would a code of professional conduct be a partial solution? This article examines the utility of professional codes: can they transform unethical behaviour or are they only of value to those who already behave ethically? How should such codes be designed, what support mechanisms should be in place and how should complaints be managed? A key recommendation is that codes of professional conduct should be accompanied by publicly transparent procedures for the investigation of serious infractions and appropriate disciplinary action when proven. The training of military physicians should also aim to develop a sound understanding of both humanitarian and human rights law. At present, both civil and military education of physicians generally lacks any component of human rights law. The Australian Defence Force (ADF) seems well placed to add codes of professional conduct to its existing ethical framework because of strong support at the highest executive levels.

  6. LSENS, a general chemical kinetics and sensitivity analysis code for homogeneous gas-phase reactions. 2: Code description and usage

    NASA Technical Reports Server (NTRS)

    Radhakrishnan, Krishnan; Bittker, David A.

    1994-01-01

    LSENS, the Lewis General Chemical Kinetics Analysis Code, has been developed for solving complex, homogeneous, gas-phase chemical kinetics problems and contains sensitivity analysis for a variety of problems, including nonisothermal situations. This report is part 2 of a series of three reference publications that describe LSENS, provide a detailed guide to its usage, and present many example problems. Part 2 describes the code, how to modify it, and its usage, including preparation of the problem data file required to execute LSENS. Code usage is illustrated by several example problems, which further explain preparation of the problem data file and show how to obtain desired accuracy in the computed results. LSENS is a flexible, convenient, accurate, and efficient solver for chemical reaction problems such as static system; steady, one-dimensional, inviscid flow; reaction behind incident shock wave, including boundary layer correction; and perfectly stirred (highly backmixed) reactor. In addition, the chemical equilibrium state can be computed for the following assigned states: temperature and pressure, enthalpy and pressure, temperature and volume, and internal energy and volume. For static problems the code computes the sensitivity coefficients of the dependent variables and their temporal derivatives with respect to the initial values of the dependent variables and/or the three rate coefficient parameters of the chemical reactions. Part 1 (NASA RP-1328) derives the governing equations and describes the numerical solution procedures for the types of problems that can be solved by LSENS. Part 3 (NASA RP-1330) explains the kinetics and kinetics-plus-sensitivity-analysis problems supplied with LSENS and presents sample results.
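    A sensitivity coefficient of the kind LSENS reports is simply a partial derivative of a dependent variable with respect to a parameter. As an illustration of the concept only, the sketch below uses a hypothetical first-order reaction with a known analytic solution; it is not LSENS itself, which handles full reaction mechanisms with dedicated numerical methods.

```python
import math

def concentration(t, k, y0):
    """Analytic solution of dy/dt = -k*y (hypothetical first-order reaction A -> B)."""
    return y0 * math.exp(-k * t)

def sensitivity_fd(t, k, y0, dk=1e-6):
    """Central-difference estimate of the sensitivity coefficient dy/dk."""
    return (concentration(t, k + dk, y0) - concentration(t, k - dk, y0)) / (2.0 * dk)

# Closed-form sensitivity for this toy model: dy/dk = -t * y0 * exp(-k*t)
t, k, y0 = 2.0, 0.5, 1.0
analytic = -t * y0 * math.exp(-k * t)
estimate = sensitivity_fd(t, k, y0)
```

The negative coefficient says what a solver's sensitivity output means in practice: increasing the rate parameter k lowers the reactant concentration at time t.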

  7. Assessing Executive Functioning: A Pragmatic Review

    ERIC Educational Resources Information Center

    Hass, Michael R.; Patterson, Ashlea; Sukraw, Jocelyn; Sullivan, Brianna M.

    2014-01-01

    Despite the common usage of the term "executive functioning" in neuropsychology, several aspects of this concept remain unsettled. In this paper, we will address some of the issues surrounding the notion of executive functioning and how an understanding of executive functioning and its components might assist school-based practitioners…

  8. Achievement Motivation Training and Executive Advancement

    ERIC Educational Resources Information Center

    Aronoff, Joel; Litwin, George H.

    1971-01-01

    Executives who were given a program designed to strengthen their need for achievement were matched with comparable executives chosen to attend the corporation's executive development course during approximately the same period. In a followup study, participants in the motivation training course had performed significantly better than their matched…

  9. SCORE/ACE Counselor Handbook. Service Corps of Retired Executives. Active Corps of Executives.

    ERIC Educational Resources Information Center

    Landsverk, Arvel; And Others

    This counselor handbook is intended to help Service Corps of Retired Executives/Active Corps of Executives (SCORE/ACE) counselors to plan and conduct counseling services more effectively. Included in the introductory section are an overview of the SCORE/ACE counseling program, a discussion of what the counselor does, directions for completing…

  10. Error coding simulations in C

    NASA Technical Reports Server (NTRS)

    Noble, Viveca K.

    1994-01-01

    When data is transmitted through a noisy channel, errors are produced within the data, rendering it indecipherable. Through the use of error control coding techniques, the bit error rate can be reduced to any desired level without sacrificing the transmission data rate. The Astrionics Laboratory at Marshall Space Flight Center has decided to use a modular, end-to-end telemetry data simulator to simulate the transmission of data from flight to ground and various methods of error control. The simulator includes modules for random data generation, data compression, Consultative Committee for Space Data Systems (CCSDS) transfer frame formation, error correction/detection, error generation and error statistics. The simulator utilizes a concatenated coding scheme which includes the CCSDS standard (255,223) Reed-Solomon (RS) code over GF(2(exp 8)) with interleave depth of 5 as the outermost code, a (7, 1/2) convolutional code as an inner code and the CCSDS recommended (n, n-16) cyclic redundancy check (CRC) code as the innermost code, where n is the number of information bits plus 16 parity bits. The received signal-to-noise ratio required for a desired bit error rate is greatly reduced through the use of forward error correction techniques. Even greater coding gain is provided through the use of a concatenated coding scheme. Interleaving/deinterleaving is necessary to randomize burst errors which may appear at the input of the RS decoder. The burst correction capability is increased in proportion to the interleave depth. The modular nature of the simulator allows for inclusion or exclusion of modules as needed. This paper describes the development and operation of the simulator, the verification of a C-language Reed-Solomon code, and the possibility of using Comdisco SPW(tm) as a tool for determining optimal error control schemes.
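    The innermost (n, n-16) CRC in such a scheme is commonly realized as the 16-bit CCITT polynomial 0x1021 with an all-ones initial value. A minimal Python sketch of that variant follows; it is an assumption for illustration, not the simulator's actual C implementation.

```python
def crc16_ccitt(data: bytes, crc: int = 0xFFFF) -> int:
    """Bitwise CRC-16 with polynomial x^16 + x^12 + x^5 + 1 (0x1021), init 0xFFFF."""
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            if crc & 0x8000:
                crc = ((crc << 1) ^ 0x1021) & 0xFFFF
            else:
                crc = (crc << 1) & 0xFFFF
    return crc

# The 16 parity bits are appended to the information bits; a receiver
# recomputing the CRC detects a corrupted frame.
frame = b"CCSDS transfer frame payload"
parity = crc16_ccitt(frame)
```

For this CRC variant the well-known check value over the bytes "123456789" is 0x29B1, which is a convenient self-test when porting the routine.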

  11. Generating code adapted for interlinking legacy scalar code and extended vector code

    DOEpatents

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  12. 43 CFR 3183.3 - Executed agreements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 43 Public Lands: Interior 2 2014-10-01 2014-10-01 false Executed agreements. 3183.3 Section 3183.3..., DEPARTMENT OF THE INTERIOR MINERALS MANAGEMENT (3000) ONSHORE OIL AND GAS UNIT AGREEMENTS: UNPROVEN AREAS Filing and Approval of Documents § 3183.3 Executed agreements. Where a duly executed agreement is...

  13. 43 CFR 3183.3 - Executed agreements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 43 Public Lands: Interior 2 2012-10-01 2012-10-01 false Executed agreements. 3183.3 Section 3183.3..., DEPARTMENT OF THE INTERIOR MINERALS MANAGEMENT (3000) ONSHORE OIL AND GAS UNIT AGREEMENTS: UNPROVEN AREAS Filing and Approval of Documents § 3183.3 Executed agreements. Where a duly executed agreement is...

  14. 43 CFR 3183.3 - Executed agreements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 43 Public Lands: Interior 2 2013-10-01 2013-10-01 false Executed agreements. 3183.3 Section 3183.3..., DEPARTMENT OF THE INTERIOR MINERALS MANAGEMENT (3000) ONSHORE OIL AND GAS UNIT AGREEMENTS: UNPROVEN AREAS Filing and Approval of Documents § 3183.3 Executed agreements. Where a duly executed agreement is...

  15. Partial correlation properties of pseudonoise /PN/ codes in noncoherent synchronization/detection schemes

    NASA Technical Reports Server (NTRS)

    Cartier, D. E.

    1976-01-01

    This concise paper considers the effect on the autocorrelation function of a pseudonoise (PN) code when the acquisition scheme only integrates coherently over part of the code and then noncoherently combines these results. The peak-to-null ratio of the effective PN autocorrelation function is shown to degrade to the square root of n, where n is the number of PN symbols over which coherent integration takes place.
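    A small numerical sketch of the underlying effect, under illustrative assumptions (a length-31 m-sequence split into three coherent sub-blocks whose magnitudes are then combined noncoherently; the paper's analysis is more general):

```python
def mseq31():
    """Length-31 m-sequence from the primitive polynomial x^5 + x^2 + 1."""
    s = [1, 0, 0, 0, 0]
    while len(s) < 31:
        s.append(s[-3] ^ s[-5])          # recurrence s[k] = s[k-3] XOR s[k-5]
    return [2 * b - 1 for b in s]        # map {0,1} -> {-1,+1}

def correlations(code):
    n = len(code)
    coherent, noncoherent = [], []
    bounds = [(0, 11), (11, 21), (21, 31)]   # three coherent sub-blocks
    for shift in range(n):
        shifted = code[shift:] + code[:shift]
        prods = [a * b for a, b in zip(code, shifted)]
        coherent.append(sum(prods))          # full-period coherent correlation
        # noncoherent combining: sum of sub-block correlation magnitudes
        noncoherent.append(sum(abs(sum(prods[lo:hi])) for lo, hi in bounds))
    return coherent, noncoherent

coh, non = correlations(mseq31())
coh_ratio = coh[0] / max(abs(v) for v in coh[1:])   # 31/1 for a full-period m-sequence
non_ratio = non[0] / max(non[1:])                   # degraded peak-to-null ratio
```

Fully coherent correlation gives the two-valued m-sequence profile (peak 31, off-peak -1), while taking magnitudes before combining raises the off-peak floor and shrinks the peak-to-null ratio, which is the degradation the paper quantifies.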

  16. Bridging the gap between motor imagery and motor execution with a brain-robot interface.

    PubMed

    Bauer, Robert; Fels, Meike; Vukelić, Mathias; Ziemann, Ulf; Gharabaghi, Alireza

    2015-03-01

    According to electrophysiological studies, motor imagery and motor execution are associated with perturbations of brain oscillations over spatially similar cortical areas. By contrast, neuroimaging and lesion studies suggest that at least partially distinct cortical networks are involved in motor imagery and execution. We sought to further disentangle this relationship by studying the role of brain-robot interfaces in the context of motor imagery and motor execution networks. Twenty right-handed subjects performed several behavioral tasks as indicators for imagery and execution of movements of the left hand, i.e. kinesthetic imagery, visual imagery, visuomotor integration and tonic contraction. In addition, subjects performed motor imagery supported by haptic/proprioceptive feedback from a brain-robot interface. Principal component analysis was applied to assess the relationship of these indicators. The respective cortical resting state networks in the α-range were investigated by electroencephalography using the phase slope index. We detected two distinct abilities and cortical networks underlying motor control: a motor imagery network connecting the left parietal and motor areas with the right prefrontal cortex, and a motor execution network characterized by transmission from the left to right motor areas. We found that a brain-robot interface might offer a way to bridge the gap between these networks, thereby opening a backdoor to the motor execution system. This knowledge might promote patient screening and may lead to novel treatment strategies, e.g. for the rehabilitation of hemiparesis after stroke. Copyright © 2014 Elsevier Inc. All rights reserved.

  17. Connection anonymity analysis in coded-WDM PONs

    NASA Astrophysics Data System (ADS)

    Sue, Chuan-Ching

    2008-04-01

    A coded wavelength division multiplexing passive optical network (WDM PON) is presented for fiber to the home (FTTH) systems to protect against eavesdropping. The proposed scheme applies spectral amplitude coding (SAC) with a unipolar maximal-length sequence (M-sequence) code matrix to generate a specific signature address (coding) and to retrieve its matching address codeword (decoding) by exploiting the cyclic properties inherent in array waveguide grating (AWG) routers. In addition to ensuring the confidentiality of user data, the proposed coded-WDM scheme is also a suitable candidate for the physical layer with connection anonymity. Under the assumption that the eavesdropper applies a photo-detection strategy, it is shown that the coded WDM PON outperforms the conventional TDM PON and WDM PON schemes in terms of a higher degree of connection anonymity. Additionally, the proposed scheme allows the system operator to partition the optical network units (ONUs) into appropriate groups so as to achieve a better degree of anonymity.
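    The code property that SAC-based decoding exploits can be sketched with a toy example (a length-7 unipolar M-sequence and its cyclic shifts; a deployed coded-WDM PON would use far longer codes routed through AWGs):

```python
def mseq_bits(n=7):
    """Length-7 m-sequence from x^3 + x + 1 (recurrence s[k] = s[k-2] XOR s[k-3])."""
    s = [1, 0, 0]
    while len(s) < n:
        s.append(s[-2] ^ s[-3])
    return s

def cyclic_shift(code, t):
    return code[t:] + code[:t]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

code = mseq_bits()                       # unipolar signature, weight (N+1)/2 = 4
others = [cyclic_shift(code, t) for t in range(1, 7)]

in_phase = dot(code, code)               # matched decoder output: (N+1)/2 = 4
cross = [dot(code, c) for c in others]   # every other user contributes (N+1)/4 = 2
```

Because every interfering user contributes the same constant (N+1)/4, a balanced receiver that also correlates against the code's complement can subtract the interference term away, leaving only the matched user's data.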

  18. Clinical coding of prospectively identified paediatric adverse drug reactions--a retrospective review of patient records.

    PubMed

    Bellis, Jennifer R; Kirkham, Jamie J; Nunn, Anthony J; Pirmohamed, Munir

    2014-12-17

    National Health Service (NHS) hospitals in the UK use a system of coding for patient episodes. The coding system used is the International Classification of Disease (ICD-10). There are ICD-10 codes which may be associated with adverse drug reactions (ADRs) and there is a possibility of using these codes for ADR surveillance. This study aimed to determine whether ADRs prospectively identified in children admitted to a paediatric hospital were coded appropriately using ICD-10. The electronic admission abstract for each patient with at least one ADR was reviewed. A record was made of whether the ADR(s) had been coded using ICD-10. Of 241 ADRs, 76 (31.5%) were coded using at least one ICD-10 ADR code. Of the oncology ADRs, 70/115 (61%) were coded using an ICD-10 ADR code compared with 6/126 (4.8%) non-oncology ADRs (difference in proportions 56%, 95% CI 46.2% to 65.8%; p < 0.001). The majority of ADRs detected in a prospective study at a paediatric centre would not have been identified if the study had relied on ICD-10 codes as a single means of detection. Data derived from administrative healthcare databases are not reliable for identifying ADRs by themselves, but may complement other methods of detection.

  19. How Do South Korean Female Executives' Definitions of Career Success Differ from Those of Male Executives?

    ERIC Educational Resources Information Center

    Cho, Yonjoo; Park, Jiwon; Han, Soo Jeoung; Ju, Boreum; You, Jieun; Ju, Ahreum; Park, Chan Kyun; Park, Hye Young

    2017-01-01

    Purpose: The purpose of this study was to compare South Korean female executives' definitions of career success with those of male executives, identify their career development strategies for success and provide implications for research and practice. Two research questions guiding our inquiry included: How do female executives' definitions of…

  20. Error-correcting codes in computer arithmetic.

    NASA Technical Reports Server (NTRS)

    Massey, J. L.; Garcia, O. N.

    1972-01-01

    Summary of the most important results so far obtained in the theory of coding for the correction and detection of errors in computer arithmetic. Attempts to satisfy the stringent reliability demands upon the arithmetic unit are considered, and special attention is given to attempts to incorporate redundancy into the numbers themselves which are being processed so that erroneous results can be detected and corrected.
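    One classic construction from this theory is the AN code: an operand N is stored as A*N for a check modulus such as A = 3, and any single-bit error, which perturbs the stored value by ±2^i, is detectable because 3 never divides 2^i. A minimal sketch of the idea, not a model of any specific machine discussed in the summary:

```python
def an_encode(n: int) -> int:
    """Encode an operand in the 3N arithmetic code."""
    return 3 * n

def an_check(word: int) -> bool:
    """A valid 3N codeword is always divisible by 3."""
    return word % 3 == 0

def an_decode(word: int) -> int:
    return word // 3

# The redundancy survives arithmetic: 3a + 3b = 3(a + b),
# so sums can be checked without decoding the operands first.
a, b = an_encode(17), an_encode(25)
total_ok = an_check(a + b)
```

A single flipped bit changes the word by ±2^i, leaving a nonzero residue modulo 3, so the check catches it; this is exactly the "redundancy in the numbers themselves" idea the abstract describes.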

  1. Keys and seats: Spatial response coding underlying the joint spatial compatibility effect.

    PubMed

    Dittrich, Kerstin; Dolk, Thomas; Rothe-Wulf, Annelie; Klauer, Karl Christoph; Prinz, Wolfgang

    2013-11-01

    Spatial compatibility effects (SCEs) are typically observed when participants have to execute spatially defined responses to nonspatial stimulus features (e.g., the color red or green) that randomly appear to the left and the right. Whereas a spatial correspondence of stimulus and response features facilitates response execution, a noncorrespondence impairs task performance. Interestingly, the SCE is drastically reduced when a single participant responds to one stimulus feature (e.g., green) by operating only one response key (individual go/no-go task), whereas a full-blown SCE is observed when the task is distributed between two participants (joint go/no-go task). This joint SCE (a.k.a. the social Simon effect) has previously been explained by action/task co-representation, whereas alternative accounts ascribe joint SCEs to spatial components inherent in joint go/no-go tasks that allow participants to code their responses spatially. Although increasing evidence supports the idea that spatial rather than social aspects are responsible for the emergence of joint SCEs, it is still unclear which component(s) the spatial coding refers to: the spatial orientation of the response keys, the spatial orientation of the responding agents, or both. By varying the spatial orientation of the responding agents (Exp. 1) and of the response keys (Exp. 2), independent of the spatial orientation of the stimuli, in the present study we found joint SCEs only when both the seating and the response key alignment matched the stimulus alignment. These results provide evidence that spatial response coding refers not only to the response key arrangement, but also to the often neglected spatial orientation of the responding agents.

  2. Box codes of lengths 48 and 72

    NASA Technical Reports Server (NTRS)

    Solomon, G.; Jin, Y.

    1993-01-01

    A self-dual code of length 48, dimension 24, with Hamming distance essentially equal to 12, is constructed here. There are only six code words of weight eight. All the other code words have weights that are multiples of four and a minimum weight equal to 12. This code may be encoded systematically and arises from a strict binary representation of the (8,4;5) Reed-Solomon (RS) code over GF(64). The code may be considered as six interrelated (8,7;2) codes. The Mattson-Solomon representation of the cyclic decomposition of these codes and their parity sums are used to detect an odd number of errors in any of the six codes. These may then be used in a correction algorithm for hard or soft decision decoding. A (72,36;15) box code was constructed from a (63,35;8) cyclic code. The theoretical justification is presented herein. A second (72,36;15) code is constructed from an inner (63,27;16) Bose-Chaudhuri-Hocquenghem (BCH) code and expanded to length 72 using box code algorithms for extension. This code was simulated and verified to have a minimum distance of 15, with even-weight words congruent to zero modulo four. The decoding for hard and soft decision is still more complex than for the first code constructed above. Finally, an (8,4;5) RS code over GF(512) in the binary representation of the (72,36;15) box code gives rise to a (72,36;16*) code with nine words of weight eight, all the rest having weights greater than or equal to 16.

  3. Highly-sensitive microRNA detection based on bio-bar-code assay and catalytic hairpin assembly two-stage amplification.

    PubMed

    Tang, Songsong; Gu, Yuan; Lu, Huiting; Dong, Haifeng; Zhang, Kai; Dai, Wenhao; Meng, Xiangdan; Yang, Fan; Zhang, Xueji

    2018-04-03

    Herein, a highly sensitive microRNA (miRNA) detection strategy was developed by combining the bio-bar-code assay (BBA) with catalytic hairpin assembly (CHA). In the proposed system, two nanoprobes were designed: magnetic nanoparticles functionalized with DNA probes (MNPs-DNA) and gold nanoparticles carrying numerous barcode DNA strands (AuNPs-DNA). In the presence of target miRNA, the MNP-DNA and AuNP-DNA hybridized with the target miRNA to form a "sandwich" structure. After the "sandwich" structures were separated from the solution by a magnetic field and dehybridized at high temperature, the barcode DNA sequences were released by dissolving the AuNPs. The released barcode DNA sequences triggered the toehold strand displacement assembly of two hairpin probes, leading to recycling of the barcode DNA sequences and producing numerous fluorescent CHA products for miRNA detection. Under the optimal experimental conditions, the proposed two-stage amplification system could sensitively detect target miRNA ranging from 10 pM to 10 aM, with a limit of detection (LOD) down to 97.9 zM. It displayed good capability to discriminate single-base and three-base mismatches due to the unique sandwich structure. Notably, it showed good feasibility for selective multiplexed detection of various combinations of synthetic miRNA sequences and miRNAs extracted from different cell lysates, in agreement with traditional polymerase chain reaction analysis. The two-stage amplification strategy may have significant implications for biological detection and clinical diagnosis. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Reading the Second Code: Mapping Epigenomes to Understand Plant Growth, Development, and Adaptation to the Environment

    PubMed Central

    2012-01-01

    We have entered a new era in agricultural and biomedical science made possible by remarkable advances in DNA sequencing technologies. The complete sequence of an individual’s set of chromosomes (collectively, its genome) provides a primary genetic code for what makes that individual unique, just as the contents of every personal computer reflect the unique attributes of its owner. But a second code, composed of “epigenetic” layers of information, affects the accessibility of the stored information and the execution of specific tasks. Nature’s second code is enigmatic and must be deciphered if we are to fully understand and optimize the genetic potential of crop plants. The goal of the Epigenomics of Plants International Consortium is to crack this second code, and ultimately master its control, to help catalyze a new green revolution. PMID:22751210

  5. Nanoparticle based bio-bar code technology for trace analysis of aflatoxin B1 in Chinese herbs.

    PubMed

    Yu, Yu-Yan; Chen, Yuan-Yuan; Gao, Xuan; Liu, Yuan-Yuan; Zhang, Hong-Yan; Wang, Tong-Ying

    2018-04-01

    A novel and sensitive assay for aflatoxin B1 (AFB1) detection has been developed using the bio-bar code assay (BCA). The method relies on polyclonal antibodies coupled to DNA-encoded gold nanoparticles (NPs) and monoclonal antibodies coupled to magnetic microparticles (MMPs), with subsequent detection of the amplified target in the form of the bio-bar code using a fluorescent quantitative polymerase chain reaction (FQ-PCR) detection method. First, NP probes encoded with DNA unique to AFB1 and MMP probes with monoclonal antibodies that bind AFB1 specifically were prepared. Then, MMP-AFB1-NP sandwich compounds were formed; dehybridization of the oligonucleotides on the nanoparticle surface allows determination of the presence of AFB1 by identifying the oligonucleotide sequence released from the NP through FQ-PCR detection. The bio-bar code system for detecting AFB1 was thus established, with a sensitivity limit of about 10⁻⁸ ng/mL; compared with ELISA assays detecting the same target, this shows that AFB1 can be detected at low attomolar levels with the bio-bar-code amplification approach. This is also the first demonstration of a bio-bar code type assay for the detection of AFB1 in Chinese herbs. Copyright © 2017. Published by Elsevier B.V.

  6. Primer for the Transportable Applications Executive

    NASA Technical Reports Server (NTRS)

    Carlson, P. A.; Emmanuelli, C. A.; Harris, E. L.; Perkins, D. C.

    1984-01-01

    The Transportable Applications Executive (TAE), an interactive multipurpose executive that provides commonly required functions for scientific analysis systems, is discussed. The concept of an executive is discussed and the various components of TAE are presented. These include on-line help information, the use of menus or commands to access analysis programs, and TAE command procedures.

  7. Code query by example

    NASA Astrophysics Data System (ADS)

    Vaucouleur, Sebastien

    2011-02-01

    We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study of practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need for upgrade of ERP systems. We further show how code query by example can be used as a form of lightweight static analysis, to automatically detect potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: programmers working in this field are often not computer science specialists but rather domain experts. Hence, they require a simple language to express custom rules.

  8. Linear chirp phase perturbing approach for finding binary phased codes

    NASA Astrophysics Data System (ADS)

    Li, Bing C.

    2017-05-01

    Binary phased codes have many applications in communication and radar systems. These applications require binary phased codes with low sidelobes in order to reduce interference and false detections. Barker codes satisfy these requirements and have the lowest maximum sidelobes. However, Barker codes have very limited code lengths (equal to or less than 13), while many applications, including low probability of intercept radar and spread spectrum communication, require much longer codes. The conventional techniques for finding binary phased codes in the literature include exhaustive search, neural networks, and evolutionary methods, and they all require very expensive computation for large code lengths. Therefore these techniques are limited to finding binary phased codes with small code lengths (less than 100). In this paper, by analyzing Barker code, linear chirp, and P3 phases, we propose a new approach to finding binary codes. Experiments show that the proposed method is able to find long low-sidelobe binary phased codes (code length >500) with reasonable computational cost.
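    For reference, the longest Barker code (length 13) and its aperiodic autocorrelation can be checked directly; its 13:1 peak-to-peak-sidelobe ratio is the benchmark that search methods like those above try to approach at longer lengths:

```python
# The length-13 Barker code: + + + + + - - + + - + - +
BARKER_13 = [+1, +1, +1, +1, +1, -1, -1, +1, +1, -1, +1, -1, +1]

def aperiodic_autocorr(code):
    """R[k] = sum_i code[i] * code[i+k] (aperiodic, i.e. zero-padded shifts)."""
    n = len(code)
    return [sum(code[i] * code[i + k] for i in range(n - k)) for k in range(n)]

acf = aperiodic_autocorr(BARKER_13)
peak = acf[0]                                 # 13: all chips align at zero shift
max_sidelobe = max(abs(v) for v in acf[1:])   # 1: the defining Barker property
```

Running the same check on candidate binary codes of length >100 is a compact way to score the sidelobe quality of sequences produced by any of the search methods the paper discusses.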

  9. Evolvix BEST Names for semantic reproducibility across code2brain interfaces

    PubMed Central

    Scheuer, Katherine S.; Keel, Seth A.; Vyas, Vaibhav; Liblit, Ben; Hanlon, Bret; Ferris, Michael C.; Yin, John; Dutra, Inês; Pietsch, Anthony; Javid, Christine G.; Moog, Cecilia L.; Meyer, Jocelyn; Dresel, Jerdon; McLoone, Brian; Loberger, Sonya; Movaghar, Arezoo; Gilchrist‐Scott, Morgaine; Sabri, Yazeed; Sescleifer, Dave; Pereda‐Zorrilla, Ivan; Zietlow, Andrew; Smith, Rodrigo; Pietenpol, Samantha; Goldfinger, Jacob; Atzen, Sarah L.; Freiberg, Erika; Waters, Noah P.; Nusbaum, Claire; Nolan, Erik; Hotz, Alyssa; Kliman, Richard M.; Mentewab, Ayalew; Fregien, Nathan; Loewe, Martha

    2016-01-01

    Names in programming are vital for understanding the meaning of code and big data. We define code2brain (C2B) interfaces as maps in compilers and brains between meaning and naming syntax, which help to understand executable code. While working toward an Evolvix syntax for general‐purpose programming that makes accurate modeling easy for biologists, we observed how names affect C2B quality. To protect learning and coding investments, C2B interfaces require long‐term backward compatibility and semantic reproducibility (accurate reproduction of computational meaning from coder‐brains to reader‐brains by code alone). Semantic reproducibility is often assumed until confusing synonyms degrade modeling in biology to deciphering exercises. We highlight empirical naming priorities from diverse individuals and roles of names in different modes of computing to show how naming easily becomes impossibly difficult. We present the Evolvix BEST (Brief, Explicit, Summarizing, Technical) Names concept for reducing naming priority conflicts, test it on a real challenge by naming subfolders for the Project Organization Stabilizing Tool system, and provide naming questionnaires designed to facilitate C2B debugging by improving names used as keywords in a stabilizing programming language. Our experiences inspired us to develop Evolvix using a flipped programming language design approach with some unexpected features and BEST Names at its core. PMID:27918836

  10. Short non-coding RNAs as bacteria species identifiers detected by surface plasmon resonance enhanced common path interferometry

    NASA Astrophysics Data System (ADS)

    Greef, Charles; Petropavlovskikh, Viatcheslav; Nilsen, Oyvind; Khattatov, Boris; Plam, Mikhail; Gardner, Patrick; Hall, John

    2008-04-01

    Small non-coding RNA sequences have recently been discovered as unique identifiers of certain bacterial species, raising the possibility that they can be used as highly specific Biowarfare Agent detection markers in automated field deployable integrated detection systems. Because they are present in high abundance they could allow genomic based bacterial species identification without the need for pre-assay amplification. Further, a direct detection method would obviate the need for chemical labeling, enabling a rapid, efficient, high sensitivity mechanism for bacterial detection. Surface Plasmon Resonance enhanced Common Path Interferometry (SPR-CPI) is a potentially market-disruptive, high sensitivity dual technology that allows real-time direct multiplex measurement of biomolecule interactions, including small molecules, nucleic acids, proteins, and microbes. SPR-CPI measures differences in phase shift of reflected S and P polarized light under Total Internal Reflection (TIR) conditions at a surface, caused by changes in refractive index induced by biomolecular interactions within the evanescent field at the TIR interface. The measurement is performed on a microarray of discrete 2-dimensional areas functionalized with biomolecule capture reagents, allowing simultaneous measurement of up to 100 separate analytes. The optical beam encompasses the entire microarray, allowing a solid state detector system with no scanning requirement. Output consists of simultaneous voltage measurements proportional to the phase differences resulting from the refractive index changes from each microarray feature, and is automatically processed and displayed graphically or delivered to a decision making algorithm, enabling a fully automatic detection system capable of rapid detection and quantification of small nucleic acids at extremely sensitive levels. Proof-of-concept experiments on model systems and cell culture samples have demonstrated utility of the system, and efforts are in

  11. Autism Spectrum Disorder and intact executive functioning.

    PubMed

    Ferrara, R; Ansermet, F; Massoni, F; Petrone, L; Onofri, E; Ricci, P; Archer, T; Ricci, S

    2016-01-01

    The earliest notions concerning autism (Autism Spectrum Disorders, ASD) describe a disturbance in executive functioning. Despite changing definitions, executive functions, expressed as the higher cognitive skills required for complex behaviors and linked to the prefrontal cortex, are held to be defective in autism. Specific difficulties at the level of executive functioning have been identified in children presenting autism or verbal disabilities. Nevertheless, the developmental deficit of executive functioning in autism is highly diversified, with huge individual variation, and may even be absent. The aim of the present study was to examine the current standing of intact executive functioning in ASD. Analysis of ASD populations, whether high-functioning, Asperger's or autism Broad Phenotype, studied over a range of executive functions including response inhibition, planning, cognitive flexibility, cognitive inhibition, and alerting networks, indicates an absence of damage/impairment compared to typically-developed normal control subjects. These findings of intact executive functioning in ASD subjects provide a strong foundation on which to construct applications for growth environments and the rehabilitation of autistic subjects.

  12. Apolipoprotein Eε4: A Biomarker for Executive Dysfunction among Parkinson's Disease Patients with Mild Cognitive Impairment.

    PubMed

    Samat, Nor A; Abdul Murad, Nor A; Mohamad, Khairiyah; Abdul Razak, Mohd R; Mohamed Ibrahim, Norlinah

    2017-01-01

    Background: Cognitive impairment is prevalent in Parkinson's disease (PD), affecting 15-20% of patients at diagnosis. α-synuclein expression and genetic polymorphisms of Apolipoprotein E (ApoE) have been associated with the presence of cognitive impairment in PD, although data have been inconsistent. Objectives: To determine the prevalence of cognitive impairment in patients with PD using the Montreal Cognitive Assessment (MoCA), the Comprehensive Trail Making Test (CTMT) and the Parkinson's Disease Cognitive Rating Scale (PDCRS), and its association with plasma α-synuclein and ApoE genetic polymorphisms. Methods: This was a cross-sectional study involving 46 PD patients. Patients were evaluated using the Montreal Cognitive Assessment (MoCA) and detailed neuropsychological tests. The Parkinson's Disease Cognitive Rating Scale (PDCRS) was used for cognitive function and the Comprehensive Trail Making Test (CTMT) for executive function. Blood was drawn for plasma α-synuclein measurements and ApoE genetic analysis. ApoE polymorphism was detected using MutaGEL ApoE from ImmunDiagnostik. Plasma α-synuclein was detected using the ELISA technique (USCN Life Science Inc.) according to the standard protocol. Results: Based on the MoCA, 26 (56.5%) patients had mild cognitive impairment (PD-MCI) and 20 (43.5%) had normal cognition (PD-NC). Based on the PDCRS, 18 (39.1%) had normal cognition (PDCRS-NC), 17 (37%) had mild cognitive impairment (PDCRS-MCI), and 11 (23.9%) had dementia (PDCRS-PDD). In the PDCRS-MCI group, 5 (25%) patients were from the PD-NC group, and all PDCRS-PDD patients were from the PD-MCI group. CTMT scores were significantly different between patients with MCI and normal cognition on the MoCA (p = 0.003). Twenty-one patients (72.4%) with executive dysfunction were from the PD-MCI group; 17 (77.3%) had severe executive dysfunction and 4 (57.1%) had mild to moderate executive dysfunction. There were no differences in the plasma α-synuclein concentration between the presence or

  13. Physician-executives past, present, and future.

    PubMed

    Smallwood, K G; Wilson, C N

    1992-08-01

    The dramatic changes in the United States' health care system during the last decade have sparked increasing interest in physician-executives. These executives, skilled in both clinical medicine and health care management, can be found in hospitals, managed care organizations, group practices, and government institutions. This paper outlines the physician-executive's roles and the development process. The remarkable growth in the number of physician-executives is expected to continue as they demonstrate their abilities to help health care providers expand ambulatory services, facilitate provider-physician relationships and physician recruitment, and lend expertise in quality improvement and risk management issues.

  14. LSENS, A General Chemical Kinetics and Sensitivity Analysis Code for Homogeneous Gas-Phase Reactions. Part 2; Code Description and Usage

    NASA Technical Reports Server (NTRS)

    Radhakrishnan, Krishnan; Bittker, David A.

    1994-01-01

    LSENS, the Lewis General Chemical Kinetics and Sensitivity Analysis Code, has been developed for solving complex, homogeneous, gas-phase chemical kinetics problems and contains sensitivity analysis for a variety of problems, including nonisothermal situations. This report is part II of a series of three reference publications that describe LSENS, provide a detailed guide to its usage, and present many example problems. Part II describes the code, how to modify it, and its usage, including preparation of the problem data file required to execute LSENS. Code usage is illustrated by several example problems, which further explain preparation of the problem data file and show how to obtain desired accuracy in the computed results. LSENS is a flexible, convenient, accurate, and efficient solver for chemical reaction problems such as static system; steady, one-dimensional, inviscid flow; reaction behind incident shock wave, including boundary layer correction; and perfectly stirred (highly backmixed) reactor. In addition, the chemical equilibrium state can be computed for the following assigned states: temperature and pressure, enthalpy and pressure, temperature and volume, and internal energy and volume. For static problems the code computes the sensitivity coefficients of the dependent variables and their temporal derivatives with respect to the initial values of the dependent variables and/or the three rate coefficient parameters of the chemical reactions. Part I (NASA RP-1328) derives the governing equations and describes the numerical solution procedures for the types of problems that can be solved by LSENS. Part III (NASA RP-1330) explains the kinetics and kinetics-plus-sensitivity-analysis problems supplied with LSENS and presents sample results.

  15. An installed nacelle design code using a multiblock Euler solver. Volume 2: User guide

    NASA Technical Reports Server (NTRS)

    Chen, H. C.

    1992-01-01

    This is a user manual for the general multiblock Euler design (GMBEDS) code. The code is for the design of a nacelle installed on a geometrically complex configuration such as a complete airplane with wing/body/nacelle/pylon. It consists of two major building blocks: a design module developed by LaRC using directive iterative surface curvature (DISC); and a general multiblock Euler (GMBE) flow solver. The flow field surrounding a complex configuration is divided into a number of topologically simple blocks to facilitate surface-fitted grid generation and improve flow solution efficiency. This user guide provides input data formats along with examples of input files and a Unix script for program execution in the UNICOS environment.

  16. Improvements in the MGA Code Provide Flexibility and Better Error Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruhter, W D; Kerr, J

    2005-05-26

    The Multi-Group Analysis (MGA) code is widely used to determine nondestructively the relative isotopic abundances of plutonium by gamma-ray spectrometry. MGA users have expressed concern about the lack of flexibility and transparency in the code. Users often have to ask the code developers for modifications to the code to accommodate new measurement situations, such as additional peaks being present in the plutonium spectrum or expected peaks being absent. We are testing several new improvements to a prototype, general gamma-ray isotopic analysis tool with the intent of either revising or replacing the MGA code. These improvements will give the user the ability to modify, add, or delete the gamma- and x-ray energies and branching intensities used by the code in determining a more precise gain and in the determination of the relative detection efficiency. We have also fully integrated the determination of the relative isotopic abundances with the determination of the relative detection efficiency to provide a more accurate determination of the errors in the relative isotopic abundances. We provide details in this paper on these improvements and a comparison of results obtained with current versions of the MGA code.

  17. Profile of executive deficits in cocaine and heroin polysubstance users: common and differential effects on separate executive components.

    PubMed

    Verdejo-García, Antonio; Pérez-García, Miguel

    2007-03-01

    We examined the structure of executive function and contrasted the performance of substance-dependent individuals (polysubstance users) and control participants on neuropsychological measures assessing the different executive components obtained. Additionally, we contrasted the performance of polysubstance users with a preference for cocaine vs heroin against controls to explore possible differential effects of the main substance abused on executive impairment. Two groups of participants were recruited: abstinent polysubstance users and controls. Polysubstance users were further subdivided based on their drug of choice (cocaine vs heroin). We administered to all participants a comprehensive protocol of executive measures, including tests of fluency, working memory, reasoning, inhibitory control, flexibility, and decision making. Consistent with previous models, the principal component analysis showed that executive functions are organized into four separate components, three of them previously described: updating, inhibition, and shifting; and a fourth component of decision making. Abstinent polysubstance users had clinically significant impairments on measures assessing these four executive components (with effect sizes ranging from 0.5 to 2.2). Cocaine polysubstance users had more severe impairments than heroin users and controls on measures of inhibition (Stroop) and shifting (go/no-go and category test). Greater severity of drug use predicted poorer performance on updating measures. Executive functions can be fractionated into four relatively independent components. Chronic drug use is associated with widespread impairment of these four executive components, with cocaine use inducing more severe deficits in inhibition and shifting. These findings show both common and differential effects of two widely used drugs on different executive components.

  18. Toward Intelligent Software Defect Detection

    NASA Technical Reports Server (NTRS)

    Benson, Markland J.

    2011-01-01

    Source code level software defect detection has gone from state of the art to a software engineering best practice. Automated code analysis tools streamline many of the aspects of formal code inspections but have the drawback of being difficult to construct and either prone to false positives or severely limited in the set of defects that can be detected. Machine learning technology provides the promise of learning software defects by example, easing construction of detectors and broadening the range of defects that can be found. Pinpointing software defects with the same level of granularity as prominent source code analysis tools distinguishes this research from past efforts, which focused on analyzing software engineering metrics data with granularity limited to that of a particular function rather than a line of code.

  19. Studying self-awareness in children: validation of the Questionnaire of Executive Functioning (QEF).

    PubMed

    Geurten, Marie; Catale, Corinne; Geurten, Claire; Wansard, Murielle; Meulemans, Thierry

    2016-05-01

    People with accurate representations of their own cognitive functioning (i.e. cognitive self-awareness) tend to use appropriate strategies to regulate their behavior. Due to the lack of appropriate instruments, few studies have examined the development of this ability among children. This study tested the measurement properties of the self-rating and other-rating forms of the Questionnaire of Executive Functioning (QEF), designed to tap children's knowledge of their executive functioning. Specifically, the construct, convergent, and discriminant validities were investigated and a self-other discrepancy score was computed to assess children's executive self-awareness. Participants were 317 children aged 7-14 years old. Confirmatory factor analyses carried out on the QEF confirmed the eight-factor structure of both versions. There were significant correlations between the QEF and the parent versions of the Behavior Rating Inventory of Executive Function, the Dysexecutive Questionnaire for Children, and the Childhood Executive Functioning Inventory. Both forms of the QEF were able to distinguish between children who had sustained a traumatic brain injury (TBI) and control participants. A statistical difference was observed between the TBI and control groups on this score, suggesting that TBI may trigger self-awareness impairments in children. The good psychometric properties of the two forms of the QEF were established. Furthermore, results of the analyses carried out on the different discrepancy scores seem to indicate that the QEF could help clinicians to detect patients with self-awareness deficits.

  20. Color-coded automated signal intensity curves for detection and characterization of breast lesions: preliminary evaluation of a new software package for integrated magnetic resonance-based breast imaging.

    PubMed

    Pediconi, Federica; Catalano, Carlo; Venditti, Fiammetta; Ercolani, Mauro; Carotenuto, Luigi; Padula, Simona; Moriconi, Enrica; Roselli, Antonella; Giacomelli, Laura; Kirchin, Miles A; Passariello, Roberto

    2005-07-01

    The objective of this study was to evaluate the value of a color-coded automated signal intensity curve software package for contrast-enhanced magnetic resonance mammography (CE-MRM) in patients with suspected breast cancer. Thirty-six women with suspected breast cancer based on mammographic and sonographic examinations were preoperatively evaluated on CE-MRM. CE-MRM was performed on a 1.5-T magnet using a 2D Flash dynamic T1-weighted sequence. A dosage of 0.1 mmol/kg of Gd-BOPTA was administered at a flow rate of 2 mL/s followed by 10 mL of saline. Images were analyzed with the new software package and separately with a standard display method. Statistical comparison was performed of the confidence for lesion detection and characterization with the 2 methods and of the diagnostic accuracy for characterization compared with histopathologic findings. At pathology, 54 malignant lesions and 14 benign lesions were evaluated. All 68 (100%) lesions were detected with both methods and good correlation with histopathologic specimens was obtained. Confidence for both detection and characterization was significantly (P < or = 0.025) better with the color-coded method, although no difference (P > 0.05) between the methods was noted in terms of the sensitivity, specificity, and overall accuracy for lesion characterization. Excellent agreement between the 2 methods was noted for both the determination of lesion size (kappa = 0.77) and determination of SI/T curves (kappa = 0.85). The novel color-coded signal intensity curve software allows lesions to be visualized as false color maps that correspond to conventional signal intensity time curves. Detection and characterization of breast lesions with this method is quick and easily interpretable.

  1. Nurse executive transformational leadership found in participative organizations.

    PubMed

    Dunham-Taylor, J

    2000-05-01

    The study examined a national sample of 396 randomly selected hospital nurse executives to explore transformational leadership, stage of power, and organizational climate. The few previous studies of nurse executives have found them to be transformational leaders. As executives were more transformational, they achieved better staff satisfaction and higher work group effectiveness. This study integrates Bass' transformational leadership model with Hagberg's power stage theory and Likert's organizational climate theory. Nurse executives (396) and staff reporting to them (1,115) rated the nurse executives' leadership style, staff extra effort, staff satisfaction, and work group effectiveness using Bass and Avolio's Multifactor Leadership Questionnaire. Executives' bosses (360) rated executive work group effectiveness. Executives completed Hagberg's Personal Power Profile and ranked their organizational climate using Likert's Profile of Organizational Characteristics. Nurse executives used transformational leadership fairly often; achieved fairly satisfied staff levels; were very effective according to bosses; were most likely at stage 3 (power by achievement) or stage 4 (power by reflection); and rated their hospital as a Likert System 3 Consultative Organization. Staff satisfaction and work group effectiveness decreased as nurse executives were more transactional. Higher transformational scores tended to occur with higher educational degrees and within more participative organizations. Transformational qualities can be enhanced by further education, by achieving higher power stages, and by being within more participative organizations.

  2. 5 CFR 412.401 - Continuing executive development.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Section 412.401 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS SUPERVISORY, MANAGEMENT, AND EXECUTIVE DEVELOPMENT Executive Development § 412.401 Continuing executive... participation in short-term and longer-term experiences, meet organizational needs for leadership, managerial...

  3. Fully automated macular pathology detection in retina optical coherence tomography images using sparse coding and dictionary learning

    NASA Astrophysics Data System (ADS)

    Sun, Yankui; Li, Shan; Sun, Zhongyang

    2017-01-01

    We propose a framework for automated detection of dry age-related macular degeneration (AMD) and diabetic macular edema (DME) from retina optical coherence tomography (OCT) images, based on sparse coding and dictionary learning. The study aims to improve the classification performance of state-of-the-art methods. First, our method presents a general approach to automatically align and crop retina regions; then it obtains global representations of images by using sparse coding and a spatial pyramid; finally, a multiclass linear support vector machine classifier is employed for classification. We apply two datasets for validating our algorithm: Duke spectral domain OCT (SD-OCT) dataset, consisting of volumetric scans acquired from 45 subjects-15 normal subjects, 15 AMD patients, and 15 DME patients; and clinical SD-OCT dataset, consisting of 678 OCT retina scans acquired from clinics in Beijing-168, 297, and 213 OCT images for AMD, DME, and normal retinas, respectively. For the former dataset, our classifier correctly identifies 100%, 100%, and 93.33% of the volumes with DME, AMD, and normal subjects, respectively, and thus performs much better than the conventional method; for the latter dataset, our classifier leads to a correct classification rate of 99.67%, 99.67%, and 100.00% for DME, AMD, and normal images, respectively.

  4. Fully automated macular pathology detection in retina optical coherence tomography images using sparse coding and dictionary learning.

    PubMed

    Sun, Yankui; Li, Shan; Sun, Zhongyang

    2017-01-01

    We propose a framework for automated detection of dry age-related macular degeneration (AMD) and diabetic macular edema (DME) from retina optical coherence tomography (OCT) images, based on sparse coding and dictionary learning. The study aims to improve the classification performance of state-of-the-art methods. First, our method presents a general approach to automatically align and crop retina regions; then it obtains global representations of images by using sparse coding and a spatial pyramid; finally, a multiclass linear support vector machine classifier is employed for classification. We apply two datasets for validating our algorithm: Duke spectral domain OCT (SD-OCT) dataset, consisting of volumetric scans acquired from 45 subjects—15 normal subjects, 15 AMD patients, and 15 DME patients; and clinical SD-OCT dataset, consisting of 678 OCT retina scans acquired from clinics in Beijing—168, 297, and 213 OCT images for AMD, DME, and normal retinas, respectively. For the former dataset, our classifier correctly identifies 100%, 100%, and 93.33% of the volumes with DME, AMD, and normal subjects, respectively, and thus performs much better than the conventional method; for the latter dataset, our classifier leads to a correct classification rate of 99.67%, 99.67%, and 100.00% for DME, AMD, and normal images, respectively.
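The pipeline described in records 3 and 4 (sparse coding of image patches, spatial-pyramid pooling, then a linear classifier) can be sketched in a few lines of numpy. This is a toy illustration, not the authors' implementation: the random dictionary, the hard-thresholding encoder (a crude stand-in for OMP or LASSO sparse coding), the 1-D cell splitting (a stand-in for true 2-D pyramid cells), and all sizes are assumptions; the paper learns its dictionary from data and classifies with a multiclass linear SVM.

```python
import numpy as np

rng = np.random.default_rng(0)

def sparse_code(patches, D, k=3):
    """Encode each patch against dictionary D, keeping only the k largest
    correlations (hard thresholding as a simple stand-in for OMP/LASSO)."""
    corr = patches @ D.T                        # (n_patches, n_atoms)
    codes = np.zeros_like(corr)
    idx = np.argsort(-np.abs(corr), axis=1)[:, :k]
    rows = np.arange(corr.shape[0])[:, None]
    codes[rows, idx] = corr[rows, idx]
    return codes

def pyramid_pool(codes, grid):
    """Max-pool codes over grid*grid cells (sequential split of patch
    indices as a 1-D stand-in for true 2-D spatial pyramid cells)."""
    cells = np.array_split(np.arange(codes.shape[0]), grid * grid)
    return np.concatenate([np.abs(codes[c]).max(axis=0) for c in cells])

def image_descriptor(patches, D):
    codes = sparse_code(patches, D)
    # two pyramid levels: whole image (1x1) plus a 2x2 split
    return np.concatenate([pyramid_pool(codes, 1), pyramid_pool(codes, 2)])

n_atoms, patch_dim = 32, 64
D = rng.standard_normal((n_atoms, patch_dim))
D /= np.linalg.norm(D, axis=1, keepdims=True)    # unit-norm atoms

patches = rng.standard_normal((100, patch_dim))  # patches from one OCT scan
desc = image_descriptor(patches, D)
print(desc.shape)                                # → (160,)
```

The fixed-length descriptor (here 32 atoms x 5 pyramid cells) is what a linear SVM would then be trained on, one descriptor per OCT image.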

  5. Development of full wave code for modeling RF fields in hot non-uniform plasmas

    NASA Astrophysics Data System (ADS)

    Zhao, Liangji; Svidzinski, Vladimir; Spencer, Andrew; Kim, Jin-Soo

    2016-10-01

    FAR-TECH, Inc. is developing a full wave RF modeling code to model RF fields in fusion devices and in general plasma applications. As an important component of the code, an adaptive meshless technique is introduced to solve the wave equations, which allows resolving plasma resonances efficiently and adapting to the complexity of antenna geometry and device boundary. The computational points are generated using either a point elimination method or a force balancing method based on the monitor function, which is calculated by solving the cold plasma dispersion equation locally. Another part of the code is the conductivity kernel calculation, used for modeling the nonlocal hot plasma dielectric response. The conductivity kernel is calculated on a coarse grid of test points and then interpolated linearly onto the computational points. All the components of the code are parallelized using MPI and OpenMP libraries to optimize the execution speed and memory. The algorithm and the results of our numerical approach to solving 2-D wave equations in a tokamak geometry will be presented. Work is supported by the U.S. DOE SBIR program.

  6. The Continual Intercomparison of Radiation Codes: Results from Phase I

    NASA Technical Reports Server (NTRS)

    Oreopoulos, Lazaros; Mlawer, Eli; Delamere, Jennifer; Shippert, Timothy; Cole, Jason; Iacono, Michael; Jin, Zhonghai; Li, Jiangnan; Manners, James; Raisanen, Petri

    2011-01-01

    The computer codes that calculate the energy budget of solar and thermal radiation in Global Climate Models (GCMs), our most advanced tools for predicting climate change, have to be computationally efficient so as not to impose an undue computational burden on climate simulations. By using approximations to gain execution speed, these codes sacrifice accuracy compared to more accurate, but also much slower, alternatives. International efforts to evaluate the approximate schemes have taken place in the past, but they have suffered from the drawback that the accurate standards were not themselves validated for performance. The manuscript summarizes the main results of the first phase of an effort called "Continual Intercomparison of Radiation Codes" (CIRC), where the cases chosen to evaluate the approximate models are based on observations and where we have ensured that the accurate models perform well when compared to solar and thermal radiation measurements. The effort is endorsed by international organizations such as the GEWEX Radiation Panel and the International Radiation Commission and has a dedicated website (i.e., http://circ.gsfc.nasa.gov) where interested scientists can freely download data and obtain more information about the effort's modus operandi and objectives. In a paper published in the March 2010 issue of the Bulletin of the American Meteorological Society only a brief overview of CIRC was provided with some sample results. In this paper the analysis of submissions of 11 solar and 13 thermal infrared codes relative to accurate reference calculations obtained by so-called "line-by-line" radiation codes is much more detailed. We demonstrate that, while performance of the approximate codes continues to improve, significant issues still remain to be addressed for satisfactory performance within GCMs. We hope that by identifying and quantifying shortcomings, the paper will help establish performance standards to objectively assess radiation code quality.

  7. The Ethical Dimensions of Working with Parents: Using the Code of Ethics when Faced with a Difficult Decision

    ERIC Educational Resources Information Center

    Freeman, Nancy K.; Swick, Kevin J.

    2007-01-01

    In 2000 ACEI began an exploration of the potential role that a code of professional ethics might have in the Association. The Public Affairs Committee recommended that the Executive Board appoint an ad hoc Ethics Committee. That committee, under the leadership of Nita Barbour, accepted its charge to provide guidance to colleagues who struggle to…

  8. Coded continuous wave meteor radar

    NASA Astrophysics Data System (ADS)

    Chau, J. L.; Vierinen, J.; Pfeffer, N.; Clahsen, M.; Stober, G.

    2016-12-01

    The concept of a coded continuous wave specular meteor radar (SMR) is described. The radar uses a continuously transmitted pseudorandom phase-modulated waveform, which has several advantages compared to conventional pulsed SMRs. The coding avoids range and Doppler aliasing, which are in some cases problematic with pulsed radars. Continuous transmissions maximize pulse compression gain, allowing operation at lower peak power than a pulsed system. With continuous coding, the temporal and spectral resolution are not dependent on the transmit waveform and they can be fairly flexibly changed after performing a measurement. The low signal-to-noise ratio before pulse compression, combined with independent pseudorandom transmit waveforms, allows multiple geographically separated transmitters to be used in the same frequency band simultaneously without significantly interfering with each other. Because the same frequency band can be used by multiple transmitters, the same interferometric receiver antennas can be used to receive multiple transmitters at the same time. The principles of the signal processing are discussed, in addition to discussion of several practical ways to increase computation speed, and how to optimally detect meteor echoes. Measurements from a campaign performed with a coded continuous wave SMR are shown and compared with two standard pulsed SMR measurements. The type of meteor radar described in this paper would be suited for use in a large-scale multi-static network of meteor radar transmitters and receivers. Such a system would be useful for increasing the number of meteor detections to obtain improved meteor radar data products, such as wind fields. This type of a radar would also be useful for over-the-horizon radar, ionosondes, and observations of field-aligned-irregularities.
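The pulse-compression principle behind the coded continuous-wave scheme can be illustrated with a toy simulation: the receiver correlates the incoming signal against the known pseudorandom transmit code, and the coding gain lifts a weak echo well above the per-sample noise. All numbers here (code length, delay, amplitudes) are illustrative assumptions, not parameters of the radar described.

```python
import numpy as np

rng = np.random.default_rng(1)

N = 4096
code = rng.choice([-1.0, 1.0], size=N)   # pseudorandom binary phase code

true_delay = 137
echo = np.zeros(N + 512)
echo[true_delay:true_delay + N] += 0.2 * code   # weak target return
echo += rng.standard_normal(echo.size)          # noise ~14 dB above the echo

# pulse compression: correlate the received signal with the known code;
# the ~sqrt(N) coding gain makes the echo's lag stand out
corr = np.correlate(echo, code, mode="valid")
print(int(np.argmax(np.abs(corr))))             # → 137
```

The same correlation-based decoding is what lets several transmitters share a band: independent pseudorandom codes are nearly orthogonal, so each receiver's correlator suppresses the other transmitters like noise.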

  9. Image authentication using distributed source coding.

    PubMed

    Lin, Yao-Chung; Varodayan, David; Girod, Bernd

    2012-01-01

    We present a novel approach using distributed source coding for image authentication. The key idea is to provide a Slepian-Wolf encoded quantized image projection as authentication data. This version can be correctly decoded with the help of an authentic image as side information. Distributed source coding provides the desired robustness against legitimate variations while detecting illegitimate modification. The decoder incorporating expectation maximization algorithms can authenticate images which have undergone contrast, brightness, and affine warping adjustments. Our authentication system also offers tampering localization by using the sum-product algorithm.
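The robustness-versus-tampering idea can be illustrated with a much simpler stand-in: a quantized pseudorandom projection of the image serves as the authentication data, a small global adjustment flips few quantized coefficients, while a localized forgery flips many. This toy omits the Slepian-Wolf coding and the expectation-maximization decoder entirely; the projection size, quantization step, and perturbations are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def auth_data(img, P, step=8.0):
    """Quantized pseudorandom projection of an image (a toy stand-in for
    the Slepian-Wolf encoded projection used as authentication data)."""
    return np.round(P @ img.ravel() / step)

img = rng.uniform(0, 255, size=(16, 16))
P = rng.standard_normal((32, 256)) / 256        # shared projection matrix

original = auth_data(img, P)
brightened = auth_data(np.clip(img + 5, 0, 255), P)   # legitimate variation
tampered_img = img.copy()
tampered_img[4:12, 4:12] = 255                        # localized forgery
tampered = auth_data(tampered_img, P)

# a legitimate adjustment perturbs few quantized coefficients;
# tampering perturbs many more
print(np.mean(original != brightened), np.mean(original != tampered))
```

In the actual scheme the gap between these two cases is what the Slepian-Wolf decoder exploits: the projection of an authentic image decodes correctly as side information, while a tampered one does not.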

  10. Memory and Executive Screening for the Detection of Cognitive Impairment in Obstructive Sleep Apnea.

    PubMed

    Mu, Li; Peng, Liping; Zhang, Zhengjiao; Jie, Jing; Jia, Siqi; Yuan, Haibo

    2017-10-01

    Obstructive sleep apnea (OSA) is commonly associated with cognitive dysfunction, which is more apparent in severe OSA and impairs quality of life. However, the clinical screening methods for these impairments in OSA are still limited. In this study, we evaluated the feasibility of using the Memory and Executive Screening (MES) for assessing cognitive performance in OSA. Twenty-four patients with nonsevere OSA and 36 patients with severe OSA participated in this study. All participants underwent comprehensive, laboratory-based polysomnography and completed assessments of cognitive function, which included both the MES and the Beijing version of the Montreal Cognitive Assessment (MoCA-BJ). Both the total MES scores and 5 recall scores of the MES (MES-5R) were significantly lower in the severe OSA group than those in the nonsevere OSA group. The patients with severe OSA performed worse on the memory subtests of the MES-5R, especially on immediate recall. The sensitivity and specificity of the MES for identifying cognitive impairment in patients with OSA were 63.89% and 66.67%, respectively, for a cutoff value of <92 out of 100 points. An optimal cutoff between nonsevere and severe OSA was also set at 45 points (MES-5R) and at 0.94 points (MES ratio). Compared with the MES, the MoCA-BJ had similar sensitivity (61.11%) and specificity (66.67%). The MES is an acceptable tool for detecting cognitive dysfunction in patients with OSA. The sensitivity and specificity of the MES were similar to those of the MoCA-BJ. The MES-5R and total MES scores can assess the presence and severity of cognitive impairment in patients with severe OSA. Copyright © 2017 Southern Society for Clinical Investigation. Published by Elsevier Inc. All rights reserved.
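The cutoff logic reported above (MES scores below 92/100 flag impairment) reduces to a standard sensitivity/specificity computation, sketched here on hypothetical scores; the study's 63.89%/66.67% figures come from its own sample, not from this toy data.

```python
import numpy as np

def sens_spec(scores, impaired, cutoff):
    """Sensitivity and specificity of a screening score where values BELOW
    the cutoff are called impaired (as with the MES <92/100 rule)."""
    positive = scores < cutoff
    sens = float(np.mean(positive[impaired]))    # impaired correctly flagged
    spec = float(np.mean(~positive[~impaired]))  # intact correctly cleared
    return sens, spec

# hypothetical toy scores and impairment labels for illustration only
scores = np.array([85, 90, 95, 88, 97, 94, 89, 99])
impaired = np.array([True, True, False, True, False, True, False, False])
print(sens_spec(scores, impaired, cutoff=92))    # → (0.75, 0.75)
```

Sweeping the cutoff and recomputing this pair of numbers is how an "optimal" threshold such as the 45-point MES-5R value is typically chosen.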

  11. A Statistical Analysis of IrisCode and Its Security Implications.

    PubMed

    Kong, Adams Wai-Kin

    2015-03-01

    IrisCode has been used to gather iris data for 430 million people. Because of the huge impact of IrisCode, it is vital that it is completely understood. This paper first studies the relationship between bit probabilities and a mean of iris images (The mean of iris images is defined as the average of independent iris images.) and then uses the Chi-square statistic, the correlation coefficient and a resampling algorithm to detect statistical dependence between bits. The results show that the statistical dependence forms a graph with a sparse and structural adjacency matrix. A comparison of this graph with a graph whose edges are defined by the inner product of the Gabor filters that produce IrisCodes shows that partial statistical dependence is induced by the filters and propagates through the graph. Using this statistical information, the security risk associated with two patented template protection schemes that have been deployed in commercial systems for producing application-specific IrisCodes is analyzed. To retain high identification speed, they use the same key to lock all IrisCodes in a database. The belief has been that if the key is not compromised, the IrisCodes are secure. This study shows that even without the key, application-specific IrisCodes can be unlocked and that the key can be obtained through the statistical dependence detected.
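    The bit-dependence test described above can be sketched with a 2x2 chi-square statistic between two bit positions over many IrisCodes. This is an illustrative re-implementation of the basic statistic, not the author's code; the correlation-coefficient and resampling steps are omitted:

    ```python
    def chi_square_independence(bits_a, bits_b):
        """2x2 chi-square statistic for dependence between two bit positions,
        computed over paired observations of the two bits."""
        n = len(bits_a)
        counts = [[0, 0], [0, 0]]
        for a, b in zip(bits_a, bits_b):
            counts[a][b] += 1
        chi2 = 0.0
        for i in range(2):
            for j in range(2):
                expected = sum(counts[i]) * (counts[0][j] + counts[1][j]) / n
                if expected:
                    chi2 += (counts[i][j] - expected) ** 2 / expected
        return chi2

    # Independent bit streams give a small statistic; a fully dependent
    # (copied) bit gives roughly chi2 = n.
    a = [(i // 2) % 2 for i in range(400)]  # 0,0,1,1,...
    b = [i % 2 for i in range(400)]         # 0,1,0,1,... independent of a
    assert chi_square_independence(a, b) < 3.84   # below 5% critical value, 1 dof
    assert chi_square_independence(a, a) == 400.0 # perfect dependence
    ```

    Applying such a test to every pair of bit positions yields the adjacency matrix of the dependence graph discussed in the abstract.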

  12. Caregiver Person-Centeredness and Behavioral Symptoms during Mealtime Interactions: Development and Feasibility of a Coding Scheme

    PubMed Central

    Gilmore-Bykovskyi, Andrea L.

    2015-01-01

    Mealtime behavioral symptoms are distressing and frequently interrupt eating for the individual experiencing them and others in the environment. In order to enable identification of potential antecedents to mealtime behavioral symptoms, a computer-assisted coding scheme was developed to measure caregiver person-centeredness and behavioral symptoms for nursing home residents with dementia during mealtime interactions. The purpose of this pilot study was to determine the acceptability and feasibility of procedures for video-capturing naturally-occurring mealtime interactions between caregivers and residents with dementia, to assess the feasibility, ease of use, and inter-observer reliability of the coding scheme, and to explore the clinical utility of the coding scheme. Trained observers coded 22 observations. Data collection procedures were feasible and acceptable to caregivers, residents and their legally authorized representatives. Overall, the coding scheme proved to be feasible, easy to execute and yielded good to very good inter-observer agreement following observer re-training. The coding scheme captured clinically relevant, modifiable antecedents to mealtime behavioral symptoms, but would be enhanced by the inclusion of measures for resident engagement and consolidation of items for measuring caregiver person-centeredness that co-occurred and were difficult for observers to distinguish. PMID:25784080
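    The study reports good to very good inter-observer agreement for the coding scheme. A common chance-corrected agreement statistic for such behavioral coding is Cohen's kappa; the sketch below is illustrative (the observer codes are invented) and is not necessarily the authors' exact reliability measure:

    ```python
    def cohens_kappa(codes_a, codes_b):
        """Chance-corrected agreement between two observers' code sequences."""
        n = len(codes_a)
        labels = set(codes_a) | set(codes_b)
        observed = sum(x == y for x, y in zip(codes_a, codes_b)) / n
        expected = sum(
            (codes_a.count(l) / n) * (codes_b.count(l) / n) for l in labels
        )
        return (observed - expected) / (1 - expected)

    # Two observers coding ten mealtime intervals as person-centered (P)
    # or not (N): they agree on 8 of 10, and kappa corrects for chance.
    obs1 = list("PPPPPNNNNN")
    obs2 = list("PPPPNNNNNP")
    kappa = cohens_kappa(obs1, obs2)  # 0.6 after chance correction
    ```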

  13. Run-time scheduling and execution of loops on message passing machines

    NASA Technical Reports Server (NTRS)

    Crowley, Kay; Saltz, Joel; Mirchandaney, Ravi; Berryman, Harry

    1989-01-01

    Sparse system solvers and general purpose codes for solving partial differential equations are examples of the many types of problems whose irregularity can result in poor performance on distributed memory machines. Often, the data structures used in these problems are very flexible. Crucial details concerning loop dependences are encoded in these structures rather than being explicitly represented in the program. Good methods for parallelizing and partitioning these types of problems require assignment of computations in rather arbitrary ways. Naive implementations of programs on distributed memory machines requiring general loop partitions can be extremely inefficient. Instead, the scheduling mechanism needs to capture the data reference patterns of the loops in order to partition the problem. First, the indices assigned to each processor must be locally numbered. Next, it is necessary to precompute what information is needed by each processor at various points in the computation. The precomputed information is then used to generate an execution template designed to carry out the computation, communication, and partitioning of data in an optimized manner. The design is presented for a general preprocessor and schedule executor, the structures of which do not vary, even though the details of the computation and of the type of information are problem dependent.
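    The precompute-then-execute scheme described above is the classic inspector/executor pattern. A minimal single-process sketch, with a hypothetical `owner` function and index names (a real implementation would overlap the communication step with computation):

    ```python
    def inspect(edges, owner, me):
        """Inspector: scan the loop's index list once and precompute which
        off-processor indices must be fetched (the communication schedule)."""
        return sorted({j for j in edges if owner(j) != me})

    def execute(edges, x_local, fetched, owner, me):
        """Executor: run the irregular loop y[i] = x[edges[i]] using locally
        owned values plus the prefetched remote values."""
        def value(j):
            return x_local[j] if owner(j) == me else fetched[j]
        return [value(j) for j in edges]

    # Processor 0 owns even indices; the inspector finds the remote ones,
    # and a stand-in dict plays the role of the communication step.
    owner = lambda j: j % 2
    edges = [0, 3, 4, 7]
    schedule = inspect(edges, owner, me=0)   # remote indices: [3, 7]
    fetched = {j: j * 10 for j in schedule}  # stand-in for message passing
    y = execute(edges, {0: 0, 4: 40}, fetched, owner, me=0)
    ```

    The inspector runs once per data structure, so its cost is amortized over the many executions of the loop, which is what makes the pattern profitable for iterative solvers.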

  14. Run-time scheduling and execution of loops on message passing machines

    NASA Technical Reports Server (NTRS)

    Saltz, Joel; Crowley, Kathleen; Mirchandaney, Ravi; Berryman, Harry

    1990-01-01

    Sparse system solvers and general purpose codes for solving partial differential equations are examples of the many types of problems whose irregularity can result in poor performance on distributed memory machines. Often, the data structures used in these problems are very flexible. Crucial details concerning loop dependences are encoded in these structures rather than being explicitly represented in the program. Good methods for parallelizing and partitioning these types of problems require assignment of computations in rather arbitrary ways. Naive implementations of programs on distributed memory machines requiring general loop partitions can be extremely inefficient. Instead, the scheduling mechanism needs to capture the data reference patterns of the loops in order to partition the problem. First, the indices assigned to each processor must be locally numbered. Next, it is necessary to precompute what information is needed by each processor at various points in the computation. The precomputed information is then used to generate an execution template designed to carry out the computation, communication, and partitioning of data in an optimized manner. The design is presented for a general preprocessor and schedule executor, the structures of which do not vary, even though the details of the computation and of the type of information are problem dependent.

  15. 78 FR 28441 - Executive Compensation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-14

    ...'' is defined to cover the chief executive officer, chief financial officer, chief operating officer... president that reports to the president or chief operating officer, but instead, should be based only on... compensation provided to executive officers by the Federal National Mortgage Association, the Federal Home Loan...

  16. 2 CFR 170.315 - Executive.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 2 Grants and Agreements 1 2014-01-01 2014-01-01 false Executive. 170.315 Section 170.315 Grants and Agreements Office of Management and Budget Guidance for Grants and Agreements OFFICE OF MANAGEMENT AND BUDGET GOVERNMENTWIDE GUIDANCE FOR GRANTS AND AGREEMENTS Reserved REPORTING SUBAWARD AND EXECUTIVE...

  17. On the Evolutionary Origins of Executive Functions

    ERIC Educational Resources Information Center

    Ardila, Alfredo

    2008-01-01

    In this paper it is proposed that the prefrontal lobe participates in two closely related but different executive function abilities: (1) "metacognitive executive functions": problem solving, planning, concept formation, strategy development and implementation, controlling attention, working memory, and the like; that is, executive functions as…

  18. 48 CFR 702.170-6 - Executive agency.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Executive agency. 702.170-6 Section 702.170-6 Federal Acquisition Regulations System AGENCY FOR INTERNATIONAL DEVELOPMENT GENERAL DEFINITIONS OF WORDS AND TERMS Definitions 702.170-6 Executive agency. Executive agency includes...

  19. Roadblocks to Change: Executive Behaviors Versus Executive Perceptions.

    ERIC Educational Resources Information Center

    Harris, Thomas E.

    A study analyzed the responses of chief executive officers (CEOs) and company presidents to a leadership test and an organizational environment test, to determine whether these individuals' managerial approaches coincided with their characterizations of their organizations' environments. Subjects, CEOs or presidents of 65 randomly selected…

  20. Nurse executive transformational leadership and organizational commitment.

    PubMed

    Leach, Linda Searle

    2005-05-01

    To investigate the relationship between nurse executive leadership and organizational commitment among nurses in acute care hospitals. A key challenge for organizations is to maximize the contributions of all workers by cultivating their commitment. Nurse leaders are in a position to influence organizational commitment among nurses. The theoretical constructs underlying this study are transformational leadership theory and Etzioni's organizational theory. A cross-sectional, field survey of nurse executives, nurse managers, and staff nurses was conducted to assess nurse executive transformational and transactional leadership and their relationship to organizational commitment. Hypotheses were tested using correlational analysis, and univariate statistics were used to describe the sample. An inverse relationship between nurse executive transformational and transactional leadership and alienative (highly negative) organizational commitment was statistically significant. A positive association was demonstrated between nurse executive leadership and nurse manager leadership. This study supports the effect of nurse executive leadership on nurse manager leadership and on organizational commitment among nurses despite role distance. To the extent that transformational leadership is present, alienative organizational commitment is reduced. This relationship shows the importance of nurse executive leadership in organizational involvement among nurses in the dynamic context of contemporary hospital settings.

  1. Evaluating transformational leadership skills of hospice executives.

    PubMed

    Longenecker, Paul D

    2006-01-01

    Health care is a rapidly changing environment requiring a high level of leadership skills by executive level personnel. The hospice industry is experiencing the same rapid changes; however, the changes have been experienced over the brief span of 25 years. Highly skilled hospice executives are a necessity for the growth and long-term survival of hospice care. This descriptive study was conducted to evaluate the leadership skills of hospice executives. The study population consisted of hospice executives who were members of the state hospice organization in Ohio and/or licensed by the state (88 hospice providers). Three questionnaires were utilized for collecting data. These questionnaires collected data on transformational leadership skills of participants, participants' personal demographics, and their employer's organizational demographics. Forty-seven hospice executives responded (53%). Key findings reported were high levels of transformational leadership skills (mean, 3.39), increased use of laissez-faire skills with years of hospice experience (P = .57), and positive reward being a frequent leadership technique utilized (mean, 3.29). In addition, this was the first study of leadership skills of hospice executives and the first formal collection of personal demographic data about hospice executives.

  2. Sexual Harassment and Organizational Outcomes Executive Summary

    DTIC Science & Technology

    2011-10-01

    Executive Summary No. 99-11, by Charlie L. Law, DEFENSE EQUAL... The summary addresses the quid pro quo type of sexual harassment (e.g., sexual coercion) and its organizational outcomes, which should drive organizational efforts to...

  3. Executable Architecture Research at Old Dominion University

    NASA Technical Reports Server (NTRS)

    Tolk, Andreas; Shuman, Edwin A.; Garcia, Johnny J.

    2011-01-01

    Executable Architectures allow the evaluation of system architectures not only regarding their static, but also their dynamic behavior. However, the systems engineering community does not agree on a common formal specification of executable architectures. Closing this gap and identifying the necessary elements of an executable architecture, a modeling language, and a modeling formalism is the topic of ongoing PhD research. In addition, systems are generally defined and applied in an operational context to provide capabilities and enable missions. To maximize the benefits of executable architectures, a second PhD effort introduces the idea of creating an executable context in addition to the executable architecture. The results move the validation of architectures from the current information domain into the knowledge domain and improve the reliability of such validation efforts. The paper presents research and results of both doctoral research efforts and puts them into the common context of state-of-the-art systems engineering methods supporting greater agility.

  4. 4 CFR 9.1 - GAO Senior Executive Service.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 4 Accounts 1 2010-01-01 2010-01-01 false GAO Senior Executive Service. 9.1 Section 9.1 Accounts GOVERNMENT ACCOUNTABILITY OFFICE PERSONNEL SYSTEM SENIOR EXECUTIVE SERVICE § 9.1 GAO Senior Executive Service... Office Senior Executive Service which meets the requirements set forth in section 3131 of title 5, United...

  5. 2008 Munitions Executive Summit

    DTIC Science & Technology

    2008-02-21

    Agenda and program fragments for the 2008 Munitions Executive Summit (February 2008): a 5pm reception and pre-registration on Tuesday, February 19, sponsored by ATK Ammunition Systems Group and Kaman Aerospace's Fuzing...; the MES Awards; administrative remarks (8:00am) and a PEO keynote address (8:05am) on February 20; adjournment at 5:15pm; and hosted receptions, including one from 5:15pm to 7pm sponsored by DSE, Inc. and General Dynamics-OTS and one at 7:00pm.

  6. Ground Operations Aerospace Language (GOAL). Volume 4: Interpretive code translator

    NASA Technical Reports Server (NTRS)

    1973-01-01

    This specification identifies and describes the principal functions and elements of the Interpretive Code Translator which has been developed for use with the GOAL Compiler. This translator enables the user to convert a compiled GOAL program to a highly general binary format which is designed to enable interpretive execution. The translator program provides user controls which are designed to enable the selection of various output types and formats. These controls provide a means for accommodating many of the implementation options which are discussed in the Interpretive Code Guideline document. The technical design approach is given. The relationship between the translator and the GOAL compiler is explained and the principal functions performed by the Translator are described. Specific constraints regarding the use of the Translator are discussed. The control options are described. These options enable the user to select outputs to be generated by the translator and to control various aspects of the translation processing.

  7. A new code for automatic detection and analysis of the lineament patterns for geophysical and geological purposes (ADALGEO)

    NASA Astrophysics Data System (ADS)

    Soto-Pinto, C.; Arellano-Baeza, A.; Sánchez, G.

    2013-08-01

    We present a new numerical method for automatic detection and analysis of changes in lineament patterns caused by seismic and volcanic activities. The method is implemented as a series of modules: (i) normalization of the image contrast, (ii) extraction of small linear features (stripes) through convolution of the part of the image in the vicinity of each pixel with a circular mask or through the Canny algorithm, and (iii) posterior detection of the main lineaments using the Hough transform. We demonstrate that our code reliably detects changes in the lineament patterns related to the stress evolution in the Earth's crust: specifically, a significant number of new lineaments appear approximately one month before an earthquake, while one month after the earthquake the lineament configuration returns to its initial state. Application of our software to the deformations caused by volcanic activity yields the opposite result: the number of lineaments decreases with the onset of microseismicity. This discrepancy can be explained by assuming that plate tectonic earthquakes are caused by the compression and accumulation of stress in the Earth's crust due to the subduction of tectonic plates, whereas in the case of volcanic activity we deal with the inflation of a volcano edifice due to an elevation of pressure and magma intrusion, and the resulting stretching of the surface.
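    Step (iii), the Hough transform, can be sketched in a few lines: each edge pixel votes for every line (rho, theta) passing through it, and lineaments appear as peaks in the vote accumulator. This is a minimal pure-Python sketch of the general technique, not the ADALGEO code:

    ```python
    import math

    def hough_strongest_line(points, n_theta=90):
        """Vote in (rho, theta) space and return the strongest line.
        rho = x*cos(theta) + y*sin(theta), rho rounded to integer bins."""
        votes = {}
        for x, y in points:
            for t in range(n_theta):
                theta = math.pi * t / n_theta
                rho = round(x * math.cos(theta) + y * math.sin(theta))
                votes[(rho, t)] = votes.get((rho, t), 0) + 1
        (rho, t), _ = max(votes.items(), key=lambda kv: kv[1])
        return rho, math.pi * t / n_theta

    # A vertical "lineament" at x = 5 yields a peak at theta = 0, rho = 5.
    points = [(5, y) for y in range(20)]
    rho, theta = hough_strongest_line(points)
    ```

    Counting how many such peaks exceed a vote threshold, before and after a seismic event, gives the lineament counts the abstract compares.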

  8. Evolvix BEST Names for semantic reproducibility across code2brain interfaces.

    PubMed

    Loewe, Laurence; Scheuer, Katherine S; Keel, Seth A; Vyas, Vaibhav; Liblit, Ben; Hanlon, Bret; Ferris, Michael C; Yin, John; Dutra, Inês; Pietsch, Anthony; Javid, Christine G; Moog, Cecilia L; Meyer, Jocelyn; Dresel, Jerdon; McLoone, Brian; Loberger, Sonya; Movaghar, Arezoo; Gilchrist-Scott, Morgaine; Sabri, Yazeed; Sescleifer, Dave; Pereda-Zorrilla, Ivan; Zietlow, Andrew; Smith, Rodrigo; Pietenpol, Samantha; Goldfinger, Jacob; Atzen, Sarah L; Freiberg, Erika; Waters, Noah P; Nusbaum, Claire; Nolan, Erik; Hotz, Alyssa; Kliman, Richard M; Mentewab, Ayalew; Fregien, Nathan; Loewe, Martha

    2017-01-01

    Names in programming are vital for understanding the meaning of code and big data. We define code2brain (C2B) interfaces as maps in compilers and brains between meaning and naming syntax, which help to understand executable code. While working toward an Evolvix syntax for general-purpose programming that makes accurate modeling easy for biologists, we observed how names affect C2B quality. To protect learning and coding investments, C2B interfaces require long-term backward compatibility and semantic reproducibility (accurate reproduction of computational meaning from coder-brains to reader-brains by code alone). Semantic reproducibility is often assumed until confusing synonyms degrade modeling in biology to deciphering exercises. We highlight empirical naming priorities from diverse individuals and roles of names in different modes of computing to show how naming easily becomes impossibly difficult. We present the Evolvix BEST (Brief, Explicit, Summarizing, Technical) Names concept for reducing naming priority conflicts, test it on a real challenge by naming subfolders for the Project Organization Stabilizing Tool system, and provide naming questionnaires designed to facilitate C2B debugging by improving names used as keywords in a stabilizing programming language. Our experiences inspired us to develop Evolvix using a flipped programming language design approach with some unexpected features and BEST Names at its core. © 2016 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.

  9. The MIMIC Code Repository: enabling reproducibility in critical care research.

    PubMed

    Johnson, Alistair Ew; Stone, David J; Celi, Leo A; Pollard, Tom J

    2018-01-01

    Lack of reproducibility in medical studies is a barrier to the generation of a robust knowledge base to support clinical decision-making. In this paper we outline the Medical Information Mart for Intensive Care (MIMIC) Code Repository, a centralized code base for generating reproducible studies on an openly available critical care dataset. Code is provided to load the data into a relational structure, create extractions of the data, and reproduce entire analysis plans including research studies. Concepts extracted include severity of illness scores, comorbid status, administrative definitions of sepsis, physiologic criteria for sepsis, organ failure scores, treatment administration, and more. Executable documents are used for tutorials and reproduce published studies end-to-end, providing a template for future researchers to replicate. The repository's issue tracker enables community discussion about the data and concepts, allowing users to collaboratively improve the resource. The centralized repository provides a platform for users of the data to interact directly with the data generators, facilitating greater understanding of the data. It also provides a location for the community to collaborate on necessary concepts for research progress and share them with a larger audience. Consistent application of the same code for underlying concepts is a key step in ensuring that research studies on the MIMIC database are comparable and reproducible. By providing open source code alongside the freely accessible MIMIC-III database, we enable end-to-end reproducible analysis of electronic health records. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  10. Planning for execution monitoring on a planetary rover

    NASA Technical Reports Server (NTRS)

    Gat, Erann; Firby, R. James; Miller, David P.

    1990-01-01

    A planetary rover will be traversing largely unknown and often unknowable terrain. In addition to geometric obstacles such as cliffs, rocks, and holes, it may also have to deal with non-geometric hazards such as soft soil and surface breakthroughs, which often cannot be detected until the rover is in imminent danger. Therefore, the rover must monitor its progress throughout a traverse, making sure to stay on course and to detect and act on any previously unseen hazards. Its onboard planning system must decide what sensors to monitor, what landmarks to take position readings from, and what actions to take if something should go wrong. The planning systems being developed for the Pathfinder Planetary Rover to perform these execution monitoring tasks are discussed. This system includes a network of planners to perform path planning, expectation generation, path analysis, sensor and reaction selection, and resource allocation.


  11. The contribution of executive control to semantic cognition: Convergent evidence from semantic aphasia and executive dysfunction.

    PubMed

    Thompson, Hannah E; Almaghyuli, Azizah; Noonan, Krist A; Barak, Ohr; Lambon Ralph, Matthew A; Jefferies, Elizabeth

    2018-01-03

    Semantic cognition, as described by the controlled semantic cognition (CSC) framework (Rogers et al., Neuropsychologia, 76, 220), involves two key components: activation of coherent, generalizable concepts within a heteromodal 'hub' in combination with modality-specific features (spokes), and a constraining mechanism that manipulates and gates this knowledge to generate time- and task-appropriate behaviour. Executive-semantic goal representations, largely supported by executive regions such as frontal and parietal cortex, are thought to allow the generation of non-dominant aspects of knowledge when these are appropriate for the task or context. Semantic aphasia (SA) patients have executive-semantic deficits, and these are correlated with general executive impairment. If the CSC proposal is correct, patients with executive impairment should not only exhibit impaired semantic cognition, but should also show characteristics that align with those observed in SA. This possibility remains largely untested, as patients selected on the basis that they show executive impairment (i.e., with 'dysexecutive syndrome') have not been extensively tested on tasks tapping semantic control and have not been previously compared with SA cases. We explored conceptual processing in 12 patients showing symptoms consistent with dysexecutive syndrome (DYS) and 24 SA patients, using a range of multimodal semantic assessments which manipulated control demands. Patients with executive impairments, despite not being selected to show semantic impairments, nevertheless showed parallel patterns to SA cases. They showed strong effects of distractor strength, cues and miscues, and probe-target distance, plus minimal effects of word frequency on comprehension (unlike semantic dementia patients with degradation of conceptual knowledge). This supports a component process account of semantic cognition in which retrieval is shaped by control processes, and confirms that deficits in SA patients reflect…

  12. Culture, executive function, and social understanding.

    PubMed

    Lewis, Charlie; Koyasu, Masuo; Oh, Seungmi; Ogawa, Ayako; Short, Benjamin; Huang, Zhao

    2009-01-01

    Much of the evidence from the West has shown links between children's developing self-control (executive function), their social experiences, and their social understanding (Carpendale & Lewis, 2006, chapters 5 and 6), across a range of cultures including China. This chapter describes four studies conducted in three Oriental cultures, suggesting that the relationships among social interaction, executive function, and social understanding are different in these cultures, implying that social and executive skills are underpinned by key cultural processes.

  13. New Class of Quantum Error-Correcting Codes for a Bosonic Mode

    NASA Astrophysics Data System (ADS)

    Michael, Marios H.; Silveri, Matti; Brierley, R. T.; Albert, Victor V.; Salmilehto, Juha; Jiang, Liang; Girvin, S. M.

    2016-07-01

    We construct a new class of quantum error-correcting codes for a bosonic mode, which are advantageous for applications in quantum memories, communication, and scalable computation. These "binomial quantum codes" are formed from a finite superposition of Fock states weighted with binomial coefficients. The binomial codes can exactly correct errors that are polynomial up to a specific degree in bosonic creation and annihilation operators, including amplitude damping and displacement noise as well as boson addition and dephasing errors. For realistic continuous-time dissipative evolution, the codes can perform approximate quantum error correction to any given order in the time step between error detection measurements. We present an explicit approximate quantum error recovery operation based on projective measurements and unitary operations. The binomial codes are tailored for detecting boson loss and gain errors by means of measurements of the generalized number parity. We discuss optimization of the binomial codes and demonstrate that by relaxing the parity structure, codes with even lower unrecoverable error rates can be achieved. The binomial codes are related to existing two-mode bosonic codes, but offer the advantage of requiring only a single bosonic mode to correct amplitude damping as well as the ability to correct other errors. Our codes are similar in spirit to "cat codes" based on superpositions of the coherent states but offer several advantages such as smaller mean boson number, exact rather than approximate orthonormality of the code words, and an explicit unitary operation for repumping energy into the bosonic mode. The binomial quantum codes are realizable with current superconducting circuit technology, and they should prove useful in other quantum technologies, including bosonic quantum memories, photonic quantum communication, and optical-to-microwave up- and down-conversion.
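    As a concrete example of the construction (our paraphrase; notation may differ from the paper), the smallest binomial code protecting against a single boson loss uses the code words

    ```latex
    |W_\uparrow\rangle = \frac{|0\rangle + |4\rangle}{\sqrt{2}}, \qquad
    |W_\downarrow\rangle = |2\rangle .
    ```

    Both code words have even boson number and the same mean boson number, so a single loss event maps the code space onto Fock states of odd boson number; measuring the generalized number parity therefore detects the error without revealing the logical state, as described above.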

  14. Effectiveness comparison of partially executed t-way test suites generated by existing strategies

    NASA Astrophysics Data System (ADS)

    Othman, Rozmie R.; Ahmad, Mohd Zamri Zahir; Ali, Mohd Shaiful Aziz Rashid; Zakaria, Hasneeza Liza; Rahman, Md. Mostafijur

    2015-05-01

    Consuming 40 to 50 percent of software development cost, software testing is one of the most resource-consuming activities in the software development lifecycle. To ensure an acceptable level of quality and reliability of a typical software product, it is desirable to test every possible combination of input data under various configurations. Due to the combinatorial explosion problem, exhaustive testing is practically impossible. Resource constraints, costing factors, and strict time-to-market deadlines are amongst the main factors that inhibit such consideration. Earlier work suggests that a sampling strategy (i.e. one based on t-way parameter interaction, called t-way testing) can be effective in reducing the number of test cases without affecting the fault detection capability. However, for a very large system, even a t-way strategy will produce a large test suite that needs to be executed. In the end, only part of the planned test suite can be executed in order to meet the aforementioned constraints. Here, test engineers need to measure the effectiveness of a partially executed test suite in order to assess the risk they have to take. Motivated by the abovementioned problem, this paper presents an effectiveness comparison of partially executed t-way test suites generated by existing strategies, using the tuples coverage method. With it, test engineers can predict the effectiveness of the testing process if only part of the original test cases is executed.
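    The tuples-coverage measure can be sketched for t = 2: count what fraction of all parameter-value pairs the executed subset of the suite covers. The parameter domains and test suite below are hypothetical, chosen only to illustrate the computation:

    ```python
    from itertools import combinations, product

    def pairwise_coverage(executed_tests, domains):
        """Fraction of all 2-way value combinations (tuples) covered by
        the executed part of a test suite."""
        n = len(domains)
        all_pairs = {
            (i, j, vi, vj)
            for i, j in combinations(range(n), 2)
            for vi, vj in product(domains[i], domains[j])
        }
        covered = {
            (i, j, t[i], t[j])
            for t in executed_tests
            for i, j in combinations(range(n), 2)
        }
        return len(covered & all_pairs) / len(all_pairs)

    # Three binary parameters: 3 parameter pairs x 4 value pairs = 12 tuples.
    domains = [(0, 1)] * 3
    executed = [(0, 0, 0), (1, 1, 1)]
    half_coverage = pairwise_coverage(executed, domains)  # 0.5
    ```

    Executing only these two tests covers 6 of the 12 pairwise tuples (coverage 0.5), quantifying the risk of the partial run; the full 4-test pairwise suite would score 1.0.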

  15. [Prematurity: longitudinal analysis of executive functions].

    PubMed

    Sastre-Riba, S

    2009-02-27

    Understanding cognitive development requires an interdisciplinary and neuropsychological approach. Executive functions facilitate cognitive activity, and they are related to progressive cerebral configuration during pregnancy and infancy. One of the aims of current neuropsychology is the ontogeny of executive functions and their capacity to explain differential and normative developmental trends, especially because of their consequences for mental flexibility, monitoring, planning, and cognitive control; they are also essential for good performance at school. The incidence of developmental risk factors such as prematurity could affect long-term executive functioning, expressed in learning difficulties or problems with behavioral control. We studied, comparatively and longitudinally, the individual activity on objects displayed by typical babies (n = 25) and preterm babies (n = 10) from 1.5 to 2 years old. Applying systematic observational methodology, the babies' spontaneous activity was registered. A double intra- and inter-group analysis compared the data from the resolution of a non-verbal task through a multifaceted design. The results show a differential pattern of early executive functioning between the groups studied. The growth of executive functioning across the ages studied is also shown for each group.

  16. Components of executive functioning in metamemory.

    PubMed

    Mäntylä, Timo; Rönnlund, Michael; Kliegel, Matthias

    2010-10-01

    This study examined metamemory in relation to three basic executive functions (set shifting, working memory updating, and response inhibition) measured as latent variables. Young adults (Experiment 1) and middle-aged adults (Experiment 2) completed a set of executive functioning tasks and the Prospective and Retrospective Memory Questionnaire (PRMQ). In Experiment 1, source recall and face recognition tasks were included as indicators of objective memory performance. In both experiments, analyses of the executive functioning data yielded a two-factor solution, with the updating and inhibition tasks constituting a common factor and the shifting tasks a separate factor. Self-reported memory problems showed low predictive validity, but subjective and objective memory performance were related to different components of executive functioning. In both experiments, set shifting, but not updating and inhibition, was related to PRMQ, whereas source recall showed the opposite pattern of correlations in Experiment 1. These findings suggest that metamemorial judgments reflect selective effects of executive functioning and that individual differences in mental flexibility contribute to self-beliefs of efficacy.

  17. A very simple, re-executable neuroimaging publication

    PubMed Central

    Ghosh, Satrajit S.; Poline, Jean-Baptiste; Keator, David B.; Halchenko, Yaroslav O.; Thomas, Adam G.; Kessler, Daniel A.; Kennedy, David N.

    2017-01-01

    Reproducible research is a key element of the scientific process. Re-executability of neuroimaging workflows that lead to the conclusions arrived at in the literature has not yet been sufficiently addressed and adopted by the neuroimaging community. In this paper, we document a set of procedures, which include supplemental additions to a manuscript, that unambiguously define the data, workflow, execution environment and results of a neuroimaging analysis, in order to generate a verifiable re-executable publication. Re-executability provides a starting point for examination of the generalizability and reproducibility of a given finding. PMID:28781753

  18. Developmental Changes in Executive Functioning

    ERIC Educational Resources Information Center

    Lee, Kerry; Bull, Rebecca; Ho, Ringo M. H.

    2013-01-01

    Although early studies of executive functioning in children supported Miyake et al.'s (2000) three-factor model, more recent findings supported a variety of undifferentiated or two-factor structures. Using a cohort-sequential design, this study examined whether there were age-related differences in the structure of executive functioning among…

  19. Motor Execution Affects Action Prediction

    ERIC Educational Resources Information Center

    Springer, Anne; Brandstadter, Simone; Liepelt, Roman; Birngruber, Teresa; Giese, Martin; Mechsner, Franz; Prinz, Wolfgang

    2011-01-01

    Previous studies provided evidence of the claim that the prediction of occluded action involves real-time simulation. We report two experiments that aimed to study how real-time simulation is affected by simultaneous action execution under conditions of full, partial or no overlap between observed and executed actions. This overlap was analysed by…

  20. Questionnaire-based assessment of executive functioning: Psychometrics.

    PubMed

    Castellanos, Irina; Kronenberger, William G; Pisoni, David B

    2018-01-01

    The psychometric properties of the Learning, Executive, and Attention Functioning (LEAF) scale were investigated in an outpatient clinical pediatric sample. As a part of clinical testing, the LEAF scale, which broadly measures neuropsychological abilities related to executive functioning and learning, was administered to parents of 118 children and adolescents referred for psychological testing at a pediatric psychology clinic; 85 teachers also completed LEAF scales to assess reliability across different raters and settings. Scores on neuropsychological tests of executive functioning and academic achievement were abstracted from charts. Psychometric analyses of the LEAF scale demonstrated satisfactory internal consistency, parent-teacher inter-rater reliability in the small to large effect size range, and test-retest reliability in the large effect size range, similar to values for other executive functioning checklists. Correlations between corresponding subscales on the LEAF and other behavior checklists were large, while most correlations with neuropsychological tests of executive functioning and achievement were significant but in the small to medium range. Results support the utility of the LEAF as a reliable and valid questionnaire-based assessment of delays and disturbances in executive functioning and learning. Applications and advantages of the LEAF and other questionnaire measures of executive functioning in clinical neuropsychology settings are discussed.

  1. Leap Frog and Time Step Sub-Cycle Scheme for Coupled Neutronics and Thermal-Hydraulic Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, S.

    2002-07-01

    As a result of advancing TCP/IP-based inter-process communication technology, more and more legacy thermal-hydraulic codes have been coupled with neutronics codes to provide best-estimate capabilities for reactivity-related reactor transient analysis. Most of the coupling schemes are based on closely coupled serial or parallel approaches, so the execution of the coupled codes usually requires significant CPU time when a complicated system is analyzed. The Leap Frog scheme has been used to reduce the run time, with the extent of the decoupling usually determined by trial and error for a specific analysis. It is the intent of this paper to develop a set of general criteria that can be used to invoke an automatic Leap Frog algorithm, which not only reduces run time but also preserves accuracy. The criteria will also serve as the basis of an automatic time-step sub-cycle scheme for cases where a sudden reactivity change is introduced while the thermal-hydraulic code is marching with a relatively large time step. (authors)
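The loose-coupling idea described above can be illustrated with a toy Python sketch (not taken from the paper; the model equations, variable names, and coefficients are invented for illustration): a "thermal-hydraulic" variable takes one large explicit-Euler step per coupling interval, while the "neutronics" variable sub-cycles with smaller steps between data exchanges.

```python
def advance_coupled(P, T, dt, n_sub, feedback=-5e-5, heat=0.1, T_sink=300.0):
    """One coupling step of size dt.

    The 'thermal-hydraulic' solver takes a single large explicit-Euler step
    for T using the power P frozen from the last exchange; the 'neutronics'
    solver then sub-cycles n_sub smaller steps for P using the updated T.
    """
    T_new = T + dt * (heat * P - (T - T_sink))   # dT/dt = heat*P - (T - T_sink)
    h = dt / n_sub
    P_new = P
    for _ in range(n_sub):                       # dP/dt = feedback*(T - T_sink)*P
        P_new += h * feedback * (T_new - T_sink) * P_new
    return P_new, T_new

P, T = 100.0, 300.0
for _ in range(50):
    P, T = advance_coupled(P, T, dt=0.1, n_sub=10)
```

Shrinking the sub-cycle step only where the fast variable changes rapidly is the essence of the automatic sub-cycle criteria the paper develops.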

  2. Statistical Symbolic Execution with Informed Sampling

    NASA Technical Reports Server (NTRS)

    Filieri, Antonio; Pasareanu, Corina S.; Visser, Willem; Geldenhuys, Jaco

    2014-01-01

    Symbolic execution techniques have been proposed recently for the probabilistic analysis of programs. These techniques seek to quantify the likelihood of reaching program events of interest, e.g., assert violations. They have many promising applications but have scalability issues due to high computational demand. To address this challenge, we propose a statistical symbolic execution technique that performs Monte Carlo sampling of the symbolic program paths and uses the obtained information for Bayesian estimation and hypothesis testing with respect to the probability of reaching the target events. To speed up the convergence of the statistical analysis, we propose Informed Sampling, an iterative symbolic execution that first explores the paths that have high statistical significance, prunes them from the state space, and guides the execution towards less likely paths. The technique combines Bayesian estimation with a partial exact analysis for the pruned paths, leading to provably improved convergence of the statistical analysis. We have implemented statistical symbolic execution with informed sampling in the Symbolic PathFinder tool. We show experimentally that informed sampling obtains more precise results and converges faster than a purely statistical analysis and may also be more efficient than an exact symbolic analysis. When the latter does not terminate, symbolic execution with informed sampling can still give meaningful results under the same time and memory limits.
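The core statistical idea -- Monte Carlo sampling of executions combined with Bayesian estimation of the probability of reaching a target event -- can be sketched as follows. This is a toy stand-in, not Symbolic PathFinder: the example program and the Beta(1, 1) prior are assumptions for illustration only.

```python
import random

def run_once(rng):
    """Toy program under test: True iff the 'assert violation' is reached."""
    x = rng.randint(0, 99)
    y = rng.randint(0, 99)
    if x > 90:            # rare branch, probability 9/100
        return y < 50     # violation on half of those executions
    return False

def estimate_violation_probability(n_samples, seed=0):
    """Beta(1, 1) prior; return the posterior mean after n_samples trials."""
    rng = random.Random(seed)
    hits = sum(run_once(rng) for _ in range(n_samples))
    a, b = 1 + hits, 1 + n_samples - hits
    return a / (a + b)

p_hat = estimate_violation_probability(20000)   # true value is 0.045
```

The paper's contribution is to sample *symbolic paths* rather than concrete inputs and to replace high-probability sampled paths with an exact analysis, which this concrete-input sketch omits.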

  3. Throughput of Coded Optical CDMA Systems with AND Detectors

    NASA Astrophysics Data System (ADS)

    Memon, Kehkashan A.; Umrani, Fahim A.; Umrani, A. W.; Umrani, Naveed A.

    2012-09-01

    Conventional detection techniques used in optical code-division multiple access (OCDMA) systems are not optimal and result in poor bit error rate performance. This paper analyzes the coded performance of optical CDMA systems with AND detectors for enhanced throughput efficiencies and improved error rate performance. The results show that the use of AND detectors significantly improves the performance of an optical channel.

  4. Toward synthesizing executable models in biology.

    PubMed

    Fisher, Jasmin; Piterman, Nir; Bodik, Rastislav

    2014-01-01

    Over the last decade, executable models of biological behaviors have repeatedly provided new scientific discoveries, uncovered novel insights, and directed new experimental avenues. These models are computer programs whose execution mechanistically simulates aspects of the cell's behaviors. If the observed behavior of the program agrees with the observed biological behavior, then the program explains the phenomena. This approach has proven beneficial for gaining new biological insights and directing new experimental avenues. One advantage of this approach is that techniques for analysis of computer programs can be applied to the analysis of executable models. For example, one can confirm that a model agrees with experiments for all possible executions of the model (corresponding to all environmental conditions), even if there are a huge number of executions. Various formal methods have been adapted for this context, for example, model checking or symbolic analysis of state spaces. To avoid manual construction of executable models, one can apply synthesis, a method to produce programs automatically from high-level specifications. In the context of biological modeling, synthesis would correspond to extracting executable models from experimental data. We survey recent results about the usage of the techniques underlying synthesis of computer programs for the inference of biological models from experimental data. We describe synthesis of biological models from curated mutation experiment data, inferring network connectivity models from phosphoproteomic data, and synthesis of Boolean networks from gene expression data. While much work has been done on automated analysis of similar datasets using machine learning and artificial intelligence, using synthesis techniques provides new opportunities such as efficient computation of disambiguating experiments, as well as the ability to produce different kinds of models automatically from biological data.
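The notion of an executable model that is checked against observations over *all* possible executions can be illustrated with a minimal sketch. The three-gene Boolean network below is hypothetical (not taken from the survey); the exhaustive check over initial states stands in for the model-checking step described above.

```python
from itertools import product

def step(state):
    """One synchronous update of a hypothetical 3-gene Boolean network."""
    a, b, c = state
    return (a,          # gene A is self-sustaining
            a and b,    # gene B persists only while A is on
            b or c)     # gene C is switched on by B and then latches

def reaches_fixed_point(state, max_steps=20):
    for _ in range(max_steps):
        nxt = step(state)
        if nxt == state:
            return True
        state = nxt
    return False

# check the property for all 2**3 "environmental conditions" (initial states)
all_ok = all(reaches_fixed_point(s) for s in product([False, True], repeat=3))
```

Synthesis, in this picture, would run in the other direction: searching the space of update functions for one whose executions match a set of observed trajectories.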

  5. Ratings of Everyday Executive Functioning (REEF): A parent-report measure of preschoolers' executive functioning skills.

    PubMed

    Nilsen, Elizabeth S; Huyder, Vanessa; McAuley, Tara; Liebermann, Dana

    2017-01-01

    Executive functioning (EF) facilitates the development of academic, cognitive, and social-emotional skills and deficits in EF are implicated in a broad range of child psychopathologies. Although EF has clear implications for early development, the few questionnaires that assess EF in preschoolers tend to ask parents for global judgments of executive dysfunction and thus do not cover the full range of EF within the preschool age group. Here we present a new measure of preschoolers' EF-the Ratings of Everyday Executive Functioning (REEF)-that capitalizes on parents' observations of their preschoolers' (i.e., 3- to 5-year-olds) behavior in specific, everyday contexts. Over 4 studies, items comprising the REEF were refined and the measure's reliability and validity were evaluated. Factor analysis of the REEF yielded 1 factor, with items showing strong internal reliability. More important, children's scores on the REEF related to both laboratory measures of EF and another parent-report EF questionnaire. Moreover, reflecting divergent validity, the REEF was more strongly related to measures of EF as opposed to measures of affective styles. The REEF also captured differences in children's executive skills across the preschool years, and norms at 6-month intervals are reported. In summary, the REEF is a new parent-report measure that provides researchers with an efficient, valid, and reliable means of assessing preschoolers' executive functioning. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  6. SEQassembly: A Practical Tools Program for Coding Sequences Splicing

    NASA Astrophysics Data System (ADS)

    Lee, Hongbin; Yang, Hang; Fu, Lei; Qin, Long; Li, Huili; He, Feng; Wang, Bo; Wu, Xiaoming

    A CDS (coding sequence) is the portion of an mRNA sequence composed of a number of exon segments. The construction of the CDS sequence is important for profound genetic analyses such as genotyping. A program in the MATLAB environment is presented that can process batches of sample sequences into code segments under the guidance of reference exon models and splice the code segments from the same sample source into a CDS according to the exon order in a queue file. This program is useful in transcriptional polymorphism detection and gene function studies.
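The splicing step itself is straightforward to sketch. The snippet below (Python rather than MATLAB, with invented exon names) concatenates a sample's exon segments following the order given by a queue file:

```python
def splice_cds(segments, exon_order):
    """segments: dict mapping exon name -> nucleotide string (one sample);
    exon_order: exon names in the order listed in the queue file."""
    missing = [name for name in exon_order if name not in segments]
    if missing:
        raise ValueError(f"sample is missing exons: {missing}")
    return "".join(segments[name] for name in exon_order)

# hypothetical sample: segments arrive unordered, the queue file fixes the order
segments = {"exon2": "GGTAC", "exon1": "ATGCC", "exon3": "TAA"}
cds = splice_cds(segments, ["exon1", "exon2", "exon3"])
# cds == "ATGCCGGTACTAA"
```

The hard part of the tool -- aligning raw sample reads against reference exon models to extract the segments in the first place -- is upstream of this step.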

  7. Questionnaire-based assessment of executive functioning: Case studies.

    PubMed

    Kronenberger, William G; Castellanos, Irina; Pisoni, David B

    2018-01-01

    Delays in the development of executive functioning skills are frequently observed in pediatric neuropsychology populations and can have a broad and significant impact on quality of life. As a result, assessment of executive functioning is often relevant for the development of formulations and recommendations in pediatric neuropsychology clinical work. Questionnaire-based measures of executive functioning behaviors in everyday life have unique advantages and complement traditional neuropsychological measures of executive functioning. Two case studies of children with spina bifida are presented to illustrate the clinical use of a new questionnaire measure of executive and learning-related functioning, the Learning, Executive, and Attention Functioning Scale (LEAF). The LEAF emphasizes clinical utility in assessment by incorporating four characteristics: brevity in administration, breadth of additional relevant content, efficiency of scoring and interpretation, and ease of availability for use. LEAF results were consistent with another executive functioning checklist in documenting everyday behavior problems related to working memory, planning, and organization while offering additional breadth of assessment of domains such as attention, processing speed, and novel problem-solving. These case study results demonstrate the clinical utility of questionnaire-based measurement of executive functioning in pediatric neuropsychology and provide a new measure for accomplishing this goal.

  8. 3 CFR 13524 - Executive Order 13524 of December 16, 2009. Amending Executive Order 12425 Designating Interpol...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Privileges, Exemptions, and Immunities 13524 Order 13524 Presidential Documents Executive Orders Executive... Public International Organization Entitled To Enjoy Certain Privileges, Exemptions, and Immunities By the..., including section 1 of the International Organizations Immunities Act (22 U.S.C. 288), and in order to...

  9. An empirical analysis of executive behaviour with hospital executive information systems in Taiwan.

    PubMed

    Huang, Wei-Min

    2013-01-01

    Existing health information systems largely only support the daily operations of a medical centre, and are unable to generate the information required by executives for decision-making. Building on past research concerning information retrieval behaviour and learning through mental models, this study examines the use of information systems by hospital executives in medical centres. It uses a structural equation model to help find ways hospital executives might use information systems more effectively. The results show that computer self-efficacy directly affects the maintenance of mental models, and that system characteristics directly impact learning styles and information retrieval behaviour. Other results include the significant impact of perceived environmental uncertainty on scan searches; information retrieval behaviour and focused searches on mental models and perceived efficiency; scan searches on mental model building; learning styles and model building on perceived efficiency; and finally the impact of mental model maintenance on perceived efficiency and effectiveness.

  10. Executive dysfunction, brain aging, and political leadership.

    PubMed

    Fisher, Mark; Franklin, David L; Post, Jerrold M

    2014-01-01

    Decision-making is an essential component of executive function, and a critical skill of political leadership. Neuroanatomic localization studies have established the prefrontal cortex as the critical brain site for executive function. In addition to the prefrontal cortex, white matter tracts as well as subcortical brain structures are crucial for optimal executive function. Executive function shows a significant decline beginning at age 60, and this is associated with age-related atrophy of prefrontal cortex, cerebral white matter disease, and cerebral microbleeds. Notably, age-related decline in executive function appears to be a relatively selective cognitive deterioration, generally sparing language and memory function. While an individual may appear to be functioning normally with regard to relatively obvious cognitive functions such as language and memory, that same individual may lack the capacity to integrate these cognitive functions to achieve normal decision-making. From a historical perspective, global decline in cognitive function of political leaders has been alternatively described as a catastrophic event, a slowly progressive deterioration, or a relatively episodic phenomenon. Selective loss of executive function in political leaders is less appreciated, but increased utilization of highly sensitive brain imaging techniques will likely bring greater appreciation to this phenomenon. Former Israeli Prime Minister Ariel Sharon was an example of a political leader with a well-described neurodegenerative condition (cerebral amyloid angiopathy) that creates a neuropathological substrate for executive dysfunction. Based on the known neuroanatomical and neuropathological changes that occur with aging, we should probably assume that a significant proportion of political leaders over the age of 65 have impairment of executive function.

  11. Accelerating phylogenetics computing on the desktop: experiments with executing UPGMA in programmable logic.

    PubMed

    Davis, J P; Akella, S; Waddell, P H

    2004-01-01

    Having greater computational power on the desktop for processing taxa data sets has been a dream of biologists and statisticians involved in phylogenetics data analysis. Many existing algorithms have been highly optimized; one example is Felsenstein's PHYLIP code, written in C, for the UPGMA and neighbor-joining algorithms. However, conventional computers cannot process more than a few tens of taxa in a reasonable amount of time, making it difficult for phylogenetics practitioners to quickly explore data sets, such as might be done from a laptop computer. We discuss the application of custom computing techniques to phylogenetics. In particular, we apply this technology to speed up UPGMA algorithm execution by a factor of a hundred relative to the PHYLIP code running on the same PC. We report on these experiments and discuss how custom computing techniques can be used to accelerate phylogenetics algorithm performance not only on the desktop but also on larger, high-performance computing engines, thus enabling high-speed processing of data sets involving thousands of taxa.
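For reference, UPGMA itself is a simple average-linkage clustering procedure: repeatedly merge the two closest clusters and update distances as size-weighted means. A pure-Python sketch (illustrative only -- PHYLIP's optimized implementation is in C, and branch-length bookkeeping is omitted here) is:

```python
def upgma(labels, dist):
    """labels: taxon names; dist: dict mapping frozenset({x, y}) -> distance.
    Returns the UPGMA tree topology as nested tuples."""
    size = {name: 1 for name in labels}
    tree = {name: name for name in labels}
    d = dict(dist)
    while len(size) > 1:
        a, b = min(d, key=d.get)                  # closest pair of clusters
        merged, n_a, n_b = (tree[a], tree[b]), size[a], size[b]
        new = f"({a}+{b})"
        for m in list(size):                      # size-weighted average linkage
            if m not in (a, b):
                d[frozenset({new, m})] = (n_a * d.pop(frozenset({a, m}))
                                          + n_b * d.pop(frozenset({b, m}))) / (n_a + n_b)
        d.pop(frozenset({a, b}))
        for x in (a, b):
            del size[x], tree[x]
        size[new], tree[new] = n_a + n_b, merged
    return next(iter(tree.values()))

dist = {frozenset({"A", "B"}): 2.0,
        frozenset({"A", "C"}): 8.0,
        frozenset({"B", "C"}): 8.0}
tree = upgma(["A", "B", "C"], dist)   # A and B merge first
```

The inner distance-update loop is exactly the kind of regular, data-parallel work that maps well onto the programmable-logic hardware the paper targets.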

  12. Identifying personal microbiomes using metagenomic codes

    PubMed Central

    Franzosa, Eric A.; Huang, Katherine; Meadow, James F.; Gevers, Dirk; Lemon, Katherine P.; Bohannan, Brendan J. M.; Huttenhower, Curtis

    2015-01-01

    Community composition within the human microbiome varies across individuals, but it remains unknown if this variation is sufficient to uniquely identify individuals within large populations or stable enough to identify them over time. We investigated this by developing a hitting set-based coding algorithm and applying it to the Human Microbiome Project population. Our approach defined body site-specific metagenomic codes: sets of microbial taxa or genes prioritized to uniquely and stably identify individuals. Codes capturing strain variation in clade-specific marker genes were able to distinguish among 100s of individuals at an initial sampling time point. In comparisons with follow-up samples collected 30–300 d later, ∼30% of individuals could still be uniquely pinpointed using metagenomic codes from a typical body site; coincidental (false positive) matches were rare. Codes based on the gut microbiome were exceptionally stable and pinpointed >80% of individuals. The failure of a code to match its owner at a later time point was largely explained by the loss of specific microbial strains (at current limits of detection) and was only weakly associated with the length of the sampling interval. In addition to highlighting patterns of temporal variation in the ecology of the human microbiome, this work demonstrates the feasibility of microbiome-based identifiability—a result with important ethical implications for microbiome study design. The datasets and code used in this work are available for download from huttenhower.sph.harvard.edu/idability. PMID:25964341
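The hitting-set flavor of the coding algorithm can be sketched greedily: repeatedly add the feature of the target individual that rules out the most remaining other individuals, until no one else matches the full code. This is a simplified stand-in for the paper's algorithm, with invented feature names:

```python
def greedy_code(target_features, others):
    """target_features: set of marker features carried by one individual;
    others: list of feature sets, one per other individual.
    Greedily build a code that only the target carries in full."""
    code, remaining = set(), list(others)
    while remaining:
        best = max(target_features - code,
                   key=lambda f: sum(1 for o in remaining if f not in o),
                   default=None)
        if best is None or all(best in o for o in remaining):
            raise ValueError("target cannot be distinguished from everyone")
        code.add(best)
        # keep only the individuals still consistent with the code so far
        remaining = [o for o in remaining if best in o]
    return code

target = {"s1", "s2", "s3"}
others = [{"s1", "s2"}, {"s2", "s3"}, {"s1", "s4"}]
code = greedy_code(target, others)
```

Stability over time -- whether the code still matches its owner months later -- then reduces to whether the target retains every feature in `code`.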

  13. Non-coding cancer driver candidates identified with a sample- and position-specific model of the somatic mutation rate

    PubMed Central

    Juul, Malene; Bertl, Johanna; Guo, Qianyun; Nielsen, Morten Muhlig; Świtnicki, Michał; Hornshøj, Henrik; Madsen, Tobias; Hobolth, Asger; Pedersen, Jakob Skou

    2017-01-01

    Non-coding mutations may drive cancer development. Statistical detection of non-coding driver regions is challenged by a varying mutation rate and uncertainty of functional impact. Here, we develop a statistically founded non-coding driver-detection method, ncdDetect, which includes sample-specific mutational signatures, long-range mutation rate variation, and position-specific impact measures. Using ncdDetect, we screened non-coding regulatory regions of protein-coding genes across a pan-cancer set of whole-genomes (n = 505), which top-ranked known drivers and identified new candidates. For individual candidates, presence of non-coding mutations associates with altered expression or decreased patient survival across an independent pan-cancer sample set (n = 5454). This includes an antigen-presenting gene (CD1A), where 5’UTR mutations correlate significantly with decreased survival in melanoma. Additionally, mutations in a base-excision-repair gene (SMUG1) correlate with a C-to-T mutational-signature. Overall, we find that a rich model of mutational heterogeneity facilitates non-coding driver identification and integrative analysis points to candidates of potential clinical relevance. DOI: http://dx.doi.org/10.7554/eLife.21778.001 PMID:28362259

  14. Simulation of Code Spectrum and Code Flow of Cultured Neuronal Networks.

    PubMed

    Tamura, Shinichi; Nishitani, Yoshi; Hosokawa, Chie; Miyoshi, Tomomitsu; Sawai, Hajime

    2016-01-01

    It has been shown that, in cultured neuronal networks on a multielectrode, pseudorandom-like sequences (codes) are detected, and they flow with some spatial decay constant. Each cultured neuronal network is characterized by a specific spectrum curve. That is, we may consider the spectrum curve as a "signature" of its associated neuronal network that is dependent on the characteristics of neurons and network configuration, including the weight distribution. In the present study, we used an integrate-and-fire model of neurons with intrinsic and instantaneous fluctuations of characteristics to simulate the code spectrum recorded from multielectrodes on a 2D mesh neural network. We showed that it is possible to estimate characteristics of the neurons, such as the distribution of the number of neurons around each electrode and their refractory periods. Although this is an inverse problem whose solutions are not theoretically guaranteed, the estimated parameters seem to be consistent with those of real neurons; that is, the proposed neural network model may adequately reflect the behavior of a cultured neuronal network. Furthermore, we discuss the prospect that code analysis will provide a basis for understanding communication within a neural network, which may in turn form a basis of natural intelligence.
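A minimal leaky integrate-and-fire neuron with instantaneous input fluctuations -- the basic building block of such simulations, here greatly simplified relative to the paper's 2D mesh network and with invented parameter values -- can be sketched as:

```python
import random

def simulate_lif(n_steps, dt=1.0, tau=20.0, v_thresh=1.0, v_reset=0.0,
                 drive=0.06, noise=0.02, refractory=3, seed=1):
    """Leaky integrate-and-fire with Gaussian input fluctuations.
    Returns the list of spike times (in steps)."""
    rng = random.Random(seed)
    v, spikes, ref_left = 0.0, [], 0
    for t in range(n_steps):
        if ref_left > 0:            # absolute refractory period after a spike
            ref_left -= 1
            continue
        # leaky integration of a fluctuating drive
        v += dt * (-v / tau + drive + rng.gauss(0.0, noise))
        if v >= v_thresh:
            spikes.append(t)
            v = v_reset
            ref_left = refractory
    return spikes

spikes = simulate_lif(1000)
```

In the paper's setting, many such units are wired into a 2D mesh and the spike sequences recorded at simulated electrodes are analyzed as codes, whose spectrum is then compared against the cultured network's signature.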

  15. 76 FR 57980 - Senior Executive Service Performance Review Board

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-19

    ... DEFENSE NUCLEAR FACILITIES SAFETY BOARD Senior Executive Service Performance Review Board AGENCY... the Defense Nuclear Facilities Safety Board (DNFSB) Senior Executive Service (SES) Performance Review... summary rating of the senior executive's performance, the executive's response, and the higher level...

  16. Executive Control of Attention in Narcolepsy

    PubMed Central

    Bayard, Sophie; Croisier Langenier, Muriel; Cochen De Cock, Valérie; Scholz, Sabine; Dauvilliers, Yves

    2012-01-01

    Background Narcolepsy with cataplexy (NC) is a disabling sleep disorder characterized by early loss of hypocretin neurons that project to areas involved in the attention network. We characterized the executive control of attention in drug-free patients with NC to determine whether the executive deficits observed in patients with NC are specific to the disease itself or whether they reflect performance changes due to the severity of excessive daytime sleepiness. Methodology Twenty-two patients with NC compared to 22 patients with narcolepsy without cataplexy (NwC) matched for age, gender, intellectual level, objective daytime sleepiness and number of sleep onset REM periods (SOREMPs) were studied. Thirty-two matched healthy controls were included. All participants underwent a standardized interview, completed questionnaires, and neuropsychological tests. All patients underwent a polysomnography followed by multiple sleep latency tests (MSLT), with neuropsychological evaluation performed the same day between MSLT sessions. Principal Findings Irrespective of diagnosis, patients reported higher self-reported attentional complaints associated with the intensity of depressive symptoms. Patients with NC performed slower and more variably on simple reaction time tasks than patients with NwC, who did not differ from controls. Patients with NC and NwC generally performed slower, reacted more variably, and made more errors than controls on executive functioning tests. Individual profile analyses showed a clear heterogeneity of the severity of executive deficit. This severity was related to objective sleepiness, higher number of SOREMPs on the MSLT, and lower intelligence quotient. The nature and severity of the executive deficits were unrelated to NC and NwC diagnosis. Conclusions We demonstrated that drug-free patients with NC and NwC complained of attention deficit, with altered executive control of attention being explained by the severity of objective sleepiness and

  17. Heterodyne detection using spectral line pairing for spectral phase encoding optical code division multiple access and dynamic dispersion compensation.

    PubMed

    Yang, Yi; Foster, Mark; Khurgin, Jacob B; Cooper, A Brinton

    2012-07-30

    A novel coherent optical code-division multiple access (OCDMA) scheme is proposed that uses spectral line pairing to generate signals suitable for heterodyne decoding. Both signal and local reference are transmitted via a single optical fiber, and a simple balanced receiver performs sourceless heterodyne detection, canceling speckle noise and multiple-access interference (MAI). To validate the idea, a 16-user, fully loaded, phase-encoded system is simulated. Effects of fiber dispersion on system performance are studied as well. Both second- and third-order dispersion management is achieved by using a spectral phase encoder to adjust phase shifts of spectral components at the optical network unit (ONU).

  18. Fast Sparse Coding for Range Data Denoising with Sparse Ridges Constraint.

    PubMed

    Gao, Zhi; Lao, Mingjie; Sang, Yongsheng; Wen, Fei; Ramesh, Bharath; Zhai, Ruifang

    2018-05-06

    Light detection and ranging (LiDAR) sensors have been widely deployed on intelligent systems such as unmanned ground vehicles (UGVs) and unmanned aerial vehicles (UAVs) to perform localization, obstacle detection, and navigation tasks. Thus, research into range data processing with competitive performance in terms of both accuracy and efficiency has attracted increasing attention. Sparse coding has revolutionized signal processing and led to state-of-the-art performance in a variety of applications. However, dictionary learning, which plays the central role in sparse coding techniques, is computationally demanding, resulting in its limited applicability in real-time systems. In this study, we propose sparse coding algorithms with a fixed pre-learned ridge dictionary to realize range data denoising via leveraging the regularity of laser range measurements in man-made environments. Experiments on both synthesized data and real data demonstrate that our method obtains accuracy comparable to that of sophisticated sparse coding methods, but with much higher computational efficiency.
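Sparse coding with a fixed dictionary reduces, in its simplest greedy form, to matching pursuit: repeatedly subtract the best-matching atom from the residual. The pure-Python sketch below is an illustrative stand-in -- the paper's pre-learned ridge dictionary and solver are more elaborate -- using a trivial orthonormal dictionary:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matching_pursuit(signal, atoms, n_atoms):
    """atoms: list of unit-norm vectors; returns a sparse approximation of
    signal built from at most n_atoms greedily chosen atoms."""
    residual = list(signal)
    approx = [0.0] * len(signal)
    for _ in range(n_atoms):
        # pick the atom most correlated with the current residual
        k = max(range(len(atoms)), key=lambda i: abs(dot(residual, atoms[i])))
        c = dot(residual, atoms[k])
        for j in range(len(signal)):
            approx[j] += c * atoms[k][j]
            residual[j] -= c * atoms[k][j]
    return approx

# toy example: standard basis atoms, a 2-sparse signal with small noise
atoms = [[1.0, 0.0, 0.0, 0.0], [0.0, 1.0, 0.0, 0.0],
         [0.0, 0.0, 1.0, 0.0], [0.0, 0.0, 0.0, 1.0]]
denoised = matching_pursuit([3.0, 0.1, -2.0, -0.05], atoms, n_atoms=2)
```

Capping the number of atoms is what performs the denoising: low-energy noise components never get an atom of their own. With a *fixed* dictionary, as the paper advocates, the expensive dictionary-learning stage disappears from the runtime path.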

  19. Fast Sparse Coding for Range Data Denoising with Sparse Ridges Constraint

    PubMed Central

    Lao, Mingjie; Sang, Yongsheng; Wen, Fei; Zhai, Ruifang

    2018-01-01

    Light detection and ranging (LiDAR) sensors have been widely deployed on intelligent systems such as unmanned ground vehicles (UGVs) and unmanned aerial vehicles (UAVs) to perform localization, obstacle detection, and navigation tasks. Thus, research into range data processing with competitive performance in terms of both accuracy and efficiency has attracted increasing attention. Sparse coding has revolutionized signal processing and led to state-of-the-art performance in a variety of applications. However, dictionary learning, which plays the central role in sparse coding techniques, is computationally demanding, resulting in its limited applicability in real-time systems. In this study, we propose sparse coding algorithms with a fixed pre-learned ridge dictionary to realize range data denoising via leveraging the regularity of laser range measurements in man-made environments. Experiments on both synthesized data and real data demonstrate that our method obtains accuracy comparable to that of sophisticated sparse coding methods, but with much higher computational efficiency. PMID:29734793

  20. Grid workflow job execution service 'Pilot'

    NASA Astrophysics Data System (ADS)

    Shamardin, Lev; Kryukov, Alexander; Demichev, Andrey; Ilyin, Vyacheslav

    2011-12-01

    'Pilot' is a grid job execution service for workflow jobs. The main goal for the service is to automate computations with multiple stages since they can be expressed as simple workflows. Each job is a directed acyclic graph of tasks and each task is an execution of something on a grid resource (or 'computing element'). Tasks may be submitted to any WS-GRAM (Globus Toolkit 4) service. The target resources for the tasks execution are selected by the Pilot service from the set of available resources which match the specific requirements from the task and/or job definition. Some simple conditional execution logic is also provided. The 'Pilot' service is built on the REST concepts and provides a simple API through authenticated HTTPS. This service is deployed and used in production in a Russian national grid project GridNNN.
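The task-graph execution model can be sketched with the standard library's topological sorter; local function calls stand in for WS-GRAM submissions, and the task names are invented:

```python
from graphlib import TopologicalSorter

def run_workflow(dependencies, actions):
    """dependencies: task -> set of prerequisite tasks;
    actions: task -> zero-argument callable.
    Runs every task after all of its prerequisites."""
    order = []
    for task in TopologicalSorter(dependencies).static_order():
        actions[task]()          # in Pilot, this would submit to a grid resource
        order.append(task)
    return order

log = []
deps = {"stage-in": set(), "compute": {"stage-in"}, "stage-out": {"compute"}}
acts = {t: (lambda t=t: log.append(t)) for t in deps}
order = run_workflow(deps, acts)
# order == ["stage-in", "compute", "stage-out"]
```

A production service additionally needs resource matching, conditional edges, and failure handling, which this sketch omits; the directed-acyclic-graph ordering is the common core.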

  1. Channel modeling, signal processing and coding for perpendicular magnetic recording

    NASA Astrophysics Data System (ADS)

    Wu, Zheng

    With the increasing areal density in magnetic recording systems, perpendicular recording has replaced longitudinal recording to overcome the superparamagnetic limit. Studies on perpendicular recording channels including aspects of channel modeling, signal processing and coding techniques are presented in this dissertation. To optimize a high density perpendicular magnetic recording system, one needs to know the tradeoffs between various components of the system including the read/write transducers, the magnetic medium, and the read channel. We extend the work by Chaichanavong on the parameter optimization for systems via design curves. Different signal processing and coding techniques are studied. Information-theoretic tools are utilized to determine the acceptable region for the channel parameters when optimal detection and linear coding techniques are used. Our results show that a considerable gain can be achieved by the optimal detection and coding techniques. The read-write process in perpendicular magnetic recording channels includes a number of nonlinear effects. Nonlinear transition shift (NLTS) is one of them. The signal distortion induced by NLTS can be reduced by write precompensation during data recording. We numerically evaluate the effect of NLTS on the read-back signal and examine the effectiveness of several write precompensation schemes in combating NLTS in a channel characterized by both transition jitter noise and additive white Gaussian electronics noise. We also present an analytical method to estimate the bit-error-rate and use it to help determine the optimal write precompensation values in multi-level precompensation schemes. We propose a mean-adjusted pattern-dependent noise predictive (PDNP) detection algorithm for use on the channel with NLTS. We show that this detector can offer significant improvements in bit-error-rate (BER) compared to conventional Viterbi and PDNP detectors. 
Moreover, the system performance can be further improved by
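The interplay of NLTS and write precompensation described above can be sketched numerically. This is a hedged toy model, not the dissertation's channel model: the exponential shift law, the constants `d0` and `decay`, and the function names are all illustrative assumptions.

```python
import math

def nlts_shift(prev_gap_bits, d0=0.3, decay=1.0):
    """NLTS pulls a transition earlier the closer the previous transition is.
    Modeled here (an assumption) as an exponential in the gap, in bit periods."""
    return -d0 * math.exp(-decay * (prev_gap_bits - 1))

def written_position(nominal, prev_gap_bits, precomp=0.0):
    """Actual written transition position = nominal + NLTS shift + precomp delay."""
    return nominal + nlts_shift(prev_gap_bits) + precomp

# A transition written one bit period after its predecessor lands early;
# delaying the write by the expected shift cancels the distortion.
shift = nlts_shift(1)
uncorrected = written_position(10.0, 1)
corrected = written_position(10.0, 1, precomp=-shift)
print(round(uncorrected, 3), round(corrected, 3))
```

Multi-level precompensation schemes, as evaluated in the dissertation, would pick a distinct `precomp` value per preceding data pattern rather than a single delay.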

  2. Modelling Limit Order Execution Times from Market Data

    NASA Astrophysics Data System (ADS)

    Kim, Adlar; Farmer, Doyne; Lo, Andrew

    2007-03-01

Although the term ``liquidity'' is widely used in the finance literature, its meaning is loosely defined and there is no quantitative measure for it. Generally, ``liquidity'' means the ability to trade stocks quickly without causing a significant impact on the stock price. From this definition, we identified two facets of liquidity: (1) the execution time of limit orders, and (2) the price impact of market orders. A limit order is an order to transact a prespecified number of shares at a prespecified price, which will not cause an immediate execution. A market order, on the other hand, is an order to transact a prespecified number of shares at the market price, which causes an immediate execution but is subject to price impact. Therefore, when a stock is liquid, market participants experience quick limit order executions and small market order impacts. As a first step toward understanding market liquidity, we studied the facet of liquidity related to limit order executions: execution times. In this talk, we propose a novel approach to modeling limit order execution times and show how they are affected by the size and price of orders. We use the q-Weibull distribution, a generalized form of the Weibull distribution whose tail fatness can be controlled, to model limit order execution times.
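The q-Weibull density can be written down in a few lines. This is a minimal sketch of the distribution family the abstract names; the parameter values below are illustrative assumptions, not fitted values from the talk.

```python
import math

def q_exp(u, q):
    """Tsallis q-exponential; reduces to exp(u) as q -> 1."""
    if abs(q - 1.0) < 1e-9:
        return math.exp(u)
    base = 1.0 + (1.0 - q) * u
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

def q_weibull_pdf(x, q, beta, eta):
    """f(x) = (2-q) * (beta/eta) * (x/eta)**(beta-1) * e_q(-(x/eta)**beta)."""
    return (2.0 - q) * (beta / eta) * (x / eta) ** (beta - 1) * q_exp(-((x / eta) ** beta), q)

# Setting q > 1 fattens the tail relative to the ordinary Weibull (q = 1),
# which is the feature used to capture long limit-order execution times:
tail_weibull = q_weibull_pdf(20.0, 1.0, 1.5, 2.0)
tail_q = q_weibull_pdf(20.0, 1.3, 1.5, 2.0)
print(tail_q > tail_weibull)
```

At q = 1 the q-exponential collapses to the usual exponential, recovering the standard Weibull density as a special case.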

  3. The implementation of an aeronautical CFD flow code onto distributed memory parallel systems

    NASA Astrophysics Data System (ADS)

    Ierotheou, C. S.; Forsey, C. R.; Leatham, M.

    2000-04-01

The parallelization of an industrially important in-house computational fluid dynamics (CFD) code for calculating the airflow over complex aircraft configurations using the Euler or Navier-Stokes equations is presented. The code discussed is the flow solver module of the SAUNA CFD suite. This suite uses a novel grid system that may include block-structured hexahedral or pyramidal grids, unstructured tetrahedral grids or a hybrid combination of both. To assist in the rapid convergence to a solution, a number of convergence acceleration techniques are employed including implicit residual smoothing and a multigrid full approximation storage scheme (FAS). Key features of the parallelization approach are the use of domain decomposition and encapsulated message passing to enable the execution in parallel using a single programme multiple data (SPMD) paradigm. In the case where a hybrid grid is used, a unified grid partitioning scheme is employed to define the decomposition of the mesh. The parallel code has been tested using both structured and hybrid grids on a number of different distributed memory parallel systems and is now routinely used to perform industrial scale aeronautical simulations.
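The domain-decomposition/SPMD pattern the abstract describes can be illustrated in miniature. This is a hedged sketch, not SAUNA's implementation: plain Python lists stand in for the message-passing layer, a 1-D grid stands in for the mesh, and all function names are invented for the example.

```python
def decompose(grid, nparts):
    """Split the global grid into equal blocks, one per 'rank'."""
    n = len(grid) // nparts
    return [grid[i * n:(i + 1) * n] for i in range(nparts)]

def exchange_halos(blocks):
    """Each block receives one ghost cell from each neighbour (fixed ends);
    in a real SPMD code this is the encapsulated message-passing step."""
    halos = []
    for r, blk in enumerate(blocks):
        left = blocks[r - 1][-1] if r > 0 else blk[0]
        right = blocks[r + 1][0] if r < len(blocks) - 1 else blk[-1]
        halos.append((left, right))
    return halos

def jacobi_step(blocks):
    """One smoothing sweep, executed block-by-block as each rank would."""
    halos = exchange_halos(blocks)
    new = []
    for (left, right), blk in zip(halos, blocks):
        ext = [left] + blk + [right]
        new.append([(ext[i - 1] + ext[i + 1]) / 2 for i in range(1, len(ext) - 1)])
    return new

grid = [0.0] * 4 + [8.0] * 4
blocks = jacobi_step(decompose(grid, 2))
flat = [v for blk in blocks for v in blk]
print(flat)
```

Every rank runs the same program on its own block (the SPMD paradigm); only the halo values cross block boundaries, which is what makes the decomposition scale on distributed-memory machines.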

  4. A novel "signal-on/off" sensing platform for selective detection of thrombin based on target-induced ratiometric electrochemical biosensing and bio-bar-coded nanoprobe amplification strategy.

    PubMed

    Wang, Lanlan; Ma, Rongna; Jiang, Liushan; Jia, Liping; Jia, Wenli; Wang, Huaisheng

    2017-06-15

A novel dual-signal ratiometric electrochemical aptasensor for highly sensitive and selective detection of thrombin has been designed on the basis of a signal-on/signal-off strategy. A ferrocene-labeled hairpin probe (Fc-HP), a thrombin aptamer, and methyl-blue-labeled bio-bar-coded AuNPs (MB-P3-AuNPs) were rationally introduced to construct the assay platform, which combines the recognition capability of the aptamer, the amplification of the bio-bar-coded nanoprobe, and a ratiometric signaling readout. In the presence of thrombin, the interaction between thrombin and the aptamer causes the MB-P3-AuNPs to depart from the sensing interface and the single-stranded Fc-HP to fold into a hairpin structure that confines the Fc label near the electrode surface. These conformational changes increase the oxidation current of Fc and decrease that of MB. The recognition event is therefore read out ratiometrically from both the "signal-off" of MB and the "signal-on" of Fc. The proposed strategy showed a wide linear detection range from 0.003 to 30 nM with a detection limit of 1.1 pM. Moreover, it exhibited excellent selectivity, good stability, and acceptable fabrication reproducibility. By changing the recognition probe, this protocol could easily be extended to the detection of other targets, showing promising potential applications in disease diagnostics and bioanalysis.

  5. Expected Utility Distributions for Flexible, Contingent Execution

    NASA Technical Reports Server (NTRS)

    Bresina, John L.; Washington, Richard

    2000-01-01

This paper presents a method for using expected utility distributions in the execution of flexible, contingent plans. A utility distribution maps the possible start times of an action to the expected utility of the plan suffix starting with that action. The contingent plan encodes a tree of possible courses of action and includes flexible temporal constraints and resource constraints. When execution reaches a branch point, the eligible option with the highest expected utility at that point in time is selected. The utility distributions make this selection sensitive to the runtime context, yet still efficient. Our approach uses predictions of action duration uncertainty as well as expectations of resource usage and availability to determine when an action can execute and with what probability. Execution windows and probabilities inevitably change as execution proceeds, but such changes do not invalidate the cached utility distributions; thus, dynamic updating of utility information is minimized.
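The branch-point rule above can be sketched directly: each option carries a function from start time to expected utility, and the executive evaluates them at the current time. The two utility curves and option names below are made-up illustrations, not the paper's domain.

```python
def drive_utility(t):
    # Illustrative assumption: utility decays as the start time slips.
    return max(0.0, 10.0 - 0.5 * t)

def image_utility(t):
    # Illustrative assumption: a later-opening option, worthless before t = 4.
    return 7.0 if t >= 4 else 0.0

def select_option(options, now):
    """At a branch point, pick the eligible option whose utility
    distribution is highest at the current time."""
    return max(options, key=lambda opt: opt[1](now))

options = [("drive", drive_utility), ("image", image_utility)]
print(select_option(options, now=2)[0])  # early in the window
print(select_option(options, now=8)[0])  # late in the window
```

Because the distributions are functions of start time, the same cached curves yield different selections as execution slips, which is the context-sensitivity the abstract emphasizes.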

  6. ATDM LANL FleCSI: Topology and Execution Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bergen, Benjamin Karl

FleCSI is a compile-time configurable C++ framework designed to support multi-physics application development. As such, FleCSI attempts to provide a very general set of infrastructure design patterns that can be specialized and extended to suit the needs of a broad variety of solver and data requirements. This means that FleCSI is potentially useful to many different ECP projects. Current support includes multidimensional mesh topology, mesh geometry, and mesh adjacency information, n-dimensional hashed-tree data structures, graph partitioning interfaces, and dependency closures (to identify data dependencies between distributed-memory address spaces). FleCSI introduces a functional programming model with control, execution, and data abstractions that are consistent with state-of-the-art task-based runtimes such as Legion and Charm++. The model also provides support for fine-grained, data-parallel execution with backend support for runtimes such as OpenMP and C++17. The FleCSI abstraction layer provides the developer with insulation from the underlying runtimes, while allowing support for multiple runtime systems, including conventional models like asynchronous MPI. The intent is to give developers a concrete set of user-friendly programming tools that can be used now, while allowing flexibility in choosing runtime implementations and optimizations that can be applied to architectures and runtimes that arise in the future. This project is essential to the ECP Ristra Next-Generation Code project, part of ASC ATDM, because it provides a hierarchically parallel programming model that is consistent with the design of modern system architectures, but which allows for the straightforward expression of algorithmic parallelism in a portably performant manner.

  7. Executive Functions

    PubMed Central

    Diamond, Adele

    2014-01-01

    Executive functions (EFs) make possible mentally playing with ideas; taking the time to think before acting; meeting novel, unanticipated challenges; resisting temptations; and staying focused. Core EFs are inhibition [response inhibition (self-control—resisting temptations and resisting acting impulsively) and interference control (selective attention and cognitive inhibition)], working memory, and cognitive flexibility (including creatively thinking “outside the box,” seeing anything from different perspectives, and quickly and flexibly adapting to changed circumstances). The developmental progression and representative measures of each are discussed. Controversies are addressed (e.g., the relation between EFs and fluid intelligence, self-regulation, executive attention, and effortful control, and the relation between working memory and inhibition and attention). The importance of social, emotional, and physical health for cognitive health is discussed because stress, lack of sleep, loneliness, or lack of exercise each impair EFs. That EFs are trainable and can be improved with practice is addressed, including diverse methods tried thus far. PMID:23020641

  8. 8 CFR 1003.0 - Executive Office for Immigration Review.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 8 Aliens and Nationality 1 2010-01-01 2010-01-01 false Executive Office for Immigration Review. 1003.0 Section 1003.0 Aliens and Nationality EXECUTIVE OFFICE FOR IMMIGRATION REVIEW, DEPARTMENT OF JUSTICE GENERAL PROVISIONS EXECUTIVE OFFICE FOR IMMIGRATION REVIEW § 1003.0 Executive Office for...

  9. Failure detection in high-performance clusters and computers using chaotic map computations

    DOEpatents

    Rao, Nageswara S.

    2015-09-01

    A programmable media includes a processing unit capable of independent operation in a machine that is capable of executing 10.sup.18 floating point operations per second. The processing unit is in communication with a memory element and an interconnect that couples computing nodes. The programmable media includes a logical unit configured to execute arithmetic functions, comparative functions, and/or logical functions. The processing unit is configured to detect computing component failures, memory element failures and/or interconnect failures by executing programming threads that generate one or more chaotic map trajectories. The central processing unit or graphical processing unit is configured to detect a computing component failure, memory element failure and/or an interconnect failure through an automated comparison of signal trajectories generated by the chaotic maps.
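The patent's detection principle can be demonstrated with a toy chaotic map: identical healthy components iterating the same map from the same seed produce identical trajectories, so any divergence flags a fault. The logistic map, the injected perturbation, and the tolerance below are illustrative assumptions, not the patented implementation.

```python
def logistic_trajectory(x0, steps, r=3.99, fault_at=None):
    """Iterate x -> r*x*(1-x); optionally perturb one step to mimic a
    component fault. Chaos amplifies even a tiny corruption."""
    xs, x = [], x0
    for i in range(steps):
        x = r * x * (1.0 - x)
        if fault_at is not None and i == fault_at:
            x += 1e-12  # a minuscule corruption, e.g. a faulty FPU or link
        xs.append(x)
    return xs

def detect_failure(traj_a, traj_b, tol=1e-9):
    """Compare trajectories; return the first step where they disagree,
    or None when the components agree everywhere."""
    for i, (a, b) in enumerate(zip(traj_a, traj_b)):
        if abs(a - b) > tol:
            return i
    return None

ref = logistic_trajectory(0.123, 100)
ok = logistic_trajectory(0.123, 100)
bad = logistic_trajectory(0.123, 100, fault_at=10)
print(detect_failure(ref, ok), detect_failure(ref, bad) is not None)
```

The exponential sensitivity of chaotic maps is what makes the comparison effective: a 1e-12 error grows past any practical tolerance within a few dozen iterations, so subtle hardware faults become glaring trajectory mismatches.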

  10. The longitudinal development of social and executive functions in late adolescence and early adulthood

    PubMed Central

    Taylor, Sophie J.; Barker, Lynne A.; Heavey, Lisa; McHale, Sue

    2015-01-01

Our earlier work suggests that executive functions and social cognition show protracted development into late adolescence and early adulthood (Taylor et al., 2013). However, it remains unknown whether these functions develop linearly or non-linearly, corresponding to dynamic changes in white matter density across these age ranges. Executive functions are particularly in demand during the transition to independence and autonomy associated with this age range (Ahmed and Miller, 2011). Previous research examining executive function (Romine and Reynolds, 2005) and social cognition (Dumontheil et al., 2010a) in late adolescence has utilized a cross-sectional design. The current study employed a longitudinal design with 58 participants aged 17, 18, and 19 years completing social cognition and executive function tasks, the Wechsler Abbreviated Scale of Intelligence (Wechsler, 1999), the Positive and Negative Affect Schedule (Watson et al., 1988), and the Hospital Anxiety and Depression Scale (Zigmond and Snaith, 1983) at Time 1, with follow-up testing 12–16 months later. Inhibition, rule detection, strategy generation, and planning executive functions, as well as emotion recognition with dynamic stimuli, showed longitudinal development between time points. Self-report empathy and emotion recognition using visual static and auditory stimuli were stable by age 17, whereas concept formation declined between time points. The protracted development of some functions may reflect continued brain maturation into late adolescence and early adulthood, including synaptic pruning (Sowell et al., 2001) and changes to functional connectivity (Stevens et al., 2007), and/or environmental change. Clinical implications, such as assessing the effectiveness of rehabilitation following head injury, are discussed. PMID:26441579

  11. Evaluating executive function in patients with temporal lobe epilepsy using the frontal assessment battery.

    PubMed

    Agah, Elmira; Asgari-Rad, Nasima; Ahmadi, Mona; Tafakhori, Abbas; Aghamollaii, Vajiheh

    2017-07-01

Previous studies have demonstrated executive dysfunction in patients with temporal lobe epilepsy (TLE). The Frontal Assessment Battery (FAB) is a short neuropsychological tool developed for the assessment of frontal lobe function in a clinical setting. The aim of the present study is to evaluate the clinical utility of the FAB for detecting executive dysfunction in TLE patients. Forty-eight TLE patients and 48 sex- and age-matched healthy controls participated in this study. Compared to healthy participants, the total FAB score was significantly lower among the TLE patients. TLE patients performed significantly worse on the mental flexibility, motor programming, sensitivity to interference, and inhibitory control tasks. The duration of time that had passed since the last seizure was the only significant predictor of FAB score, and patients who had had a seizure less than a week before the evaluation had significantly lower FAB scores. The number of antiepileptic drugs (AEDs) did not influence executive function in this study; however, sodium valproate was found to affect mental flexibility. In conclusion, impaired executive function is common in TLE patients, and we suggest that the FAB is a clinically applicable tool to monitor it. Moreover, we found that the time since the last seizure is a significant predictor of executive functioning and that patients' performance may be worse for up to seven days after a seizure. We also recommend that clinicians evaluate the cognitive adverse effects of AEDs, especially sodium valproate, which was found to affect mental flexibility in this study.

  12. Revitalizing executive information systems.

    PubMed

    Crockett, F

    1992-01-01

    As the saying goes, "garbage in, garbage out"--and this is as true for executive information systems as for any other computer system. Crockett presents a methodology he has used with clients to help them develop more useful systems that produce higher quality information. The key is to develop performance measures based on critical success factors and stakeholder expectations and then to link them cross functionally to show how progress is being made on strategic goals. Feedback from the executive information system then informs strategy formulation, business plan development, and operational activities.

  13. Preschool Executive Functioning Abilities Predict Early Mathematics Achievement

    ERIC Educational Resources Information Center

    Clark, Caron A. C.; Pritchard, Verena E.; Woodward, Lianne J.

    2010-01-01

    Impairments in executive function have been documented in school-age children with mathematical learning difficulties. However, the utility and specificity of preschool executive function abilities in predicting later mathematical achievement are poorly understood. This study examined linkages between children's developing executive function…

  14. ls1 mardyn: The Massively Parallel Molecular Dynamics Code for Large Systems.

    PubMed

    Niethammer, Christoph; Becker, Stefan; Bernreuther, Martin; Buchholz, Martin; Eckhardt, Wolfgang; Heinecke, Alexander; Werth, Stephan; Bungartz, Hans-Joachim; Glass, Colin W; Hasse, Hans; Vrabec, Jadran; Horsch, Martin

    2014-10-14

    The molecular dynamics simulation code ls1 mardyn is presented. It is a highly scalable code, optimized for massively parallel execution on supercomputing architectures and currently holds the world record for the largest molecular simulation with over four trillion particles. It enables the application of pair potentials to length and time scales that were previously out of scope for molecular dynamics simulation. With an efficient dynamic load balancing scheme, it delivers high scalability even for challenging heterogeneous configurations. Presently, multicenter rigid potential models based on Lennard-Jones sites, point charges, and higher-order polarities are supported. Due to its modular design, ls1 mardyn can be extended to new physical models, methods, and algorithms, allowing future users to tailor it to suit their respective needs. Possible applications include scenarios with complex geometries, such as fluids at interfaces, as well as nonequilibrium molecular dynamics simulation of heat and mass transfer.
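The Lennard-Jones pair potential at the heart of codes like ls1 mardyn is compact enough to state in full. This is a hedged sketch in reduced units; the naive O(N^2) loop and the 2.5-sigma cutoff are common textbook choices, not ls1 mardyn's (which uses linked cells and domain decomposition to reach trillions of particles).

```python
def lj_potential(r, epsilon=1.0, sigma=1.0):
    """U(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6), reduced units."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 * sr6 - sr6)

def pair_energy(positions, cutoff=2.5):
    """Total energy over all pairs within the cutoff. A naive O(N^2) loop
    for clarity; production codes replace this with cell lists."""
    total = 0.0
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            r = abs(positions[i] - positions[j])
            if r < cutoff:
                total += lj_potential(r)
    return total

# The potential minimum sits at r = 2**(1/6)*sigma with depth -epsilon:
r_min = 2.0 ** (1.0 / 6.0)
print(round(lj_potential(r_min), 6))  # -1.0
```

Multicenter rigid models, as supported by ls1 mardyn, sum such site-site terms (plus point charges and higher-order polarities) over every pair of interaction sites on two molecules.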

  15. Evaluation of coded aperture radiation detectors using a Bayesian approach

    NASA Astrophysics Data System (ADS)

    Miller, Kyle; Huggins, Peter; Labov, Simon; Nelson, Karl; Dubrawski, Artur

    2016-12-01

We investigate tradeoffs arising from the use of coded aperture gamma-ray spectrometry to detect and localize sources of harmful radiation in the presence of noisy background. Using an example application scenario of area monitoring and search, we empirically evaluate weakly supervised spectral, spatial, and hybrid spatio-spectral algorithms for scoring individual observations, and two alternative methods of fusing evidence obtained from multiple observations. Results of our experiments confirm the intuition that directional information provided by spectrometers masked with a coded aperture enables gains in source localization accuracy, but at the expense of reduced probability of detection. Losses in detection performance can, however, be reclaimed to a substantial extent by using our new spatial and spatio-spectral scoring methods, which rely on realistic assumptions regarding masking and its impact on measured photon distributions.
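One simple way to fuse per-observation evidence of the kind the abstract evaluates is to sum log-likelihood ratios across independent observations (a naive-Bayes fusion). This is a generic sketch of that idea, not the paper's method, and the scores below are fabricated for illustration.

```python
def fuse_log_lr(scores):
    """Fuse independent observations by summing their log-likelihood ratios."""
    return sum(scores)

def detect(scores, threshold=0.0):
    """Declare a source present when the fused evidence exceeds a threshold."""
    return fuse_log_lr(scores) > threshold

# Individually weak observations can still fuse into a confident detection:
weak_scores = [0.4, 0.3, 0.5, 0.2]        # each alone below a 1.0 threshold
print(detect(weak_scores, threshold=1.0))
```

Sweeping the threshold traces out the detection/false-alarm tradeoff, which is the axis along which the paper compares its spectral, spatial, and spatio-spectral scoring methods.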

  16. Multi-Sensor Detection with Particle Swarm Optimization for Time-Frequency Coded Cooperative WSNs Based on MC-CDMA for Underground Coal Mines

    PubMed Central

    Xu, Jingjing; Yang, Wei; Zhang, Linyuan; Han, Ruisong; Shao, Xiaotao

    2015-01-01

In this paper, a wireless sensor network (WSN) technology adapted to underground channel conditions is developed, which has important theoretical and practical value for safety monitoring in underground coal mines. Because the space, time, and frequency resources of an underground tunnel are open, we propose constructing wireless sensor nodes based on multicarrier code division multiple access (MC-CDMA) to make full use of these resources. To improve the wireless transmission performance of source sensor nodes, we also propose utilizing cooperative sensors with good channel conditions to the sink node to assist source sensors with poor channel conditions. Moreover, the total power of each source sensor and its cooperative sensors is allocated on the basis of their channel conditions to increase the energy efficiency of the WSN. To address the multiple access interference (MAI) that arises when multiple source sensors transmit monitoring information simultaneously, a multi-sensor detection (MSD) algorithm with particle swarm optimization (PSO), namely D-PSO, is proposed for the time-frequency coded cooperative MC-CDMA WSN. Simulation results show that the average bit error rate (BER) performance of the proposed WSN in an underground coal mine is improved significantly by using wireless sensor nodes based on MC-CDMA and by adopting time-frequency coded cooperative transmission and the D-PSO algorithm. PMID:26343660
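The particle swarm optimization underlying detectors like D-PSO can be shown in its generic form. This is a hedged sketch: it minimizes a 1-D quadratic rather than a multi-sensor detection metric, and the swarm constants (`w`, `c1`, `c2`) are conventional textbook values, not the paper's tuning.

```python
import random

def pso(objective, lo, hi, n_particles=20, iters=60,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Generic PSO: particles track a personal best and the swarm best,
    and velocities blend inertia with pulls toward both."""
    rng = random.Random(seed)
    pos = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = pos[:]                         # each particle's best-seen position
    gbest = min(pbest, key=objective)      # swarm-wide best position
    for _ in range(iters):
        for i in range(n_particles):
            vel[i] = (w * vel[i]
                      + c1 * rng.random() * (pbest[i] - pos[i])
                      + c2 * rng.random() * (gbest - pos[i]))
            pos[i] = min(hi, max(lo, pos[i] + vel[i]))
            if objective(pos[i]) < objective(pbest[i]):
                pbest[i] = pos[i]
                if objective(pos[i]) < objective(gbest):
                    gbest = pos[i]
    return gbest

best = pso(lambda x: (x - 3.0) ** 2, -10.0, 10.0)
print(best)  # close to 3.0
```

In the multiuser-detection setting, each particle instead encodes a candidate vector of transmitted bits and the objective is a likelihood metric over the received MC-CDMA signal; PSO searches that combinatorial space far more cheaply than exhaustive maximum-likelihood detection.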

  17. Multi-Sensor Detection with Particle Swarm Optimization for Time-Frequency Coded Cooperative WSNs Based on MC-CDMA for Underground Coal Mines.

    PubMed

    Xu, Jingjing; Yang, Wei; Zhang, Linyuan; Han, Ruisong; Shao, Xiaotao

    2015-08-27

In this paper, a wireless sensor network (WSN) technology adapted to underground channel conditions is developed, which has important theoretical and practical value for safety monitoring in underground coal mines. Because the space, time, and frequency resources of an underground tunnel are open, we propose constructing wireless sensor nodes based on multicarrier code division multiple access (MC-CDMA) to make full use of these resources. To improve the wireless transmission performance of source sensor nodes, we also propose utilizing cooperative sensors with good channel conditions to the sink node to assist source sensors with poor channel conditions. Moreover, the total power of each source sensor and its cooperative sensors is allocated on the basis of their channel conditions to increase the energy efficiency of the WSN. To address the multiple access interference (MAI) that arises when multiple source sensors transmit monitoring information simultaneously, a multi-sensor detection (MSD) algorithm with particle swarm optimization (PSO), namely D-PSO, is proposed for the time-frequency coded cooperative MC-CDMA WSN. Simulation results show that the average bit error rate (BER) performance of the proposed WSN in an underground coal mine is improved significantly by using wireless sensor nodes based on MC-CDMA and by adopting time-frequency coded cooperative transmission and the D-PSO algorithm.

  18. Concatenated Coding Using Trellis-Coded Modulation

    NASA Technical Reports Server (NTRS)

    Thompson, Michael W.

    1997-01-01

In the late seventies and early eighties a technique known as Trellis Coded Modulation (TCM) was developed for providing spectrally efficient error correction coding. Instead of adding redundant information in the form of parity bits, redundancy is added at the modulation stage, thereby increasing bandwidth efficiency. A digital communications system can be designed to use bandwidth-efficient multilevel/phase modulation such as Amplitude Shift Keying (ASK), Phase Shift Keying (PSK), Differential Phase Shift Keying (DPSK) or Quadrature Amplitude Modulation (QAM). Performance gain can be achieved by increasing the number of signals over the corresponding uncoded system to compensate for the redundancy introduced by the code. A considerable amount of research and development has been devoted toward developing good TCM codes for severely bandlimited applications. More recently, the use of TCM for satellite and deep space communications applications has received increased attention. This report describes the general approach of using a concatenated coding scheme that features TCM and Reed-Solomon (RS) coding. Results have indicated that substantial (6-10 dB) performance gains can be achieved with this approach with comparatively little bandwidth expansion. Since all of the bandwidth expansion is due to the RS code, TCM-based concatenated coding results in roughly 10-50% bandwidth expansion, compared to 70-150% expansion for a similar concatenated scheme that uses convolutional codes. We stress that combined coding and modulation optimization is important for achieving performance gains while maintaining spectral efficiency.
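The source of TCM's coding gain can be made concrete with the classic set-partitioning example: splitting 8-PSK into two QPSK-like subsets greatly enlarges the minimum distance between points the trellis is allowed to confuse. A small sketch, assuming a unit-energy 8-PSK constellation:

```python
import cmath
import math

# Unit-energy 8-PSK constellation points on the complex plane.
points = [cmath.exp(2j * math.pi * k / 8) for k in range(8)]

def min_sq_dist(indices):
    """Minimum squared Euclidean distance within a subset of the constellation."""
    return min(abs(points[a] - points[b]) ** 2
               for a in indices for b in indices if a != b)

full = min_sq_dist(range(8))        # full 8-PSK: 2 - sqrt(2), about 0.586
subset = min_sq_dist([0, 2, 4, 6])  # one set-partitioned subset: 2.0
print(round(full, 3), round(subset, 3))
```

The trellis code assigns subsets to trellis branches so that competing paths differ by the large within-subset distance rather than the small full-constellation distance; that distance gain is the modulation-stage redundancy the report contrasts with parity bits.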

  19. Exact and Approximate Probabilistic Symbolic Execution

    NASA Technical Reports Server (NTRS)

    Luckow, Kasper; Pasareanu, Corina S.; Dwyer, Matthew B.; Filieri, Antonio; Visser, Willem

    2014-01-01

Probabilistic software analysis seeks to quantify the likelihood of reaching a target event under uncertain environments. Recent approaches compute probabilities of execution paths using symbolic execution, but do not support nondeterminism. Nondeterminism arises naturally when no suitable probabilistic model can capture a program behavior, e.g., for multithreading or distributed systems. In this work, we propose a technique, based on symbolic execution, to synthesize schedulers that resolve nondeterminism to maximize the probability of reaching a target event. To scale to large systems, we also introduce approximate algorithms to search for good schedulers, speeding up established random sampling and reinforcement learning results through the quantification of path probabilities based on symbolic execution. We implemented the techniques in Symbolic PathFinder and evaluated them on nondeterministic Java programs. We show that our algorithms significantly improve upon a state-of-the-art statistical model checking algorithm, originally developed for Markov Decision Processes.
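The scheduler-synthesis objective has a simple recursive core: take the maximum over nondeterministic choices and the expectation over probabilistic branches. A hedged toy version on a hand-built tree (the tree and node encoding are fabricated for illustration, not Symbolic PathFinder's representation):

```python
def max_reach_prob(node):
    """Maximum probability of reaching a 'target' leaf, with the scheduler
    resolving each nondeterministic choice optimally."""
    kind = node[0]
    if kind == "leaf":
        return 1.0 if node[1] == "target" else 0.0
    if kind == "chance":   # ("chance", [(prob, child), ...]) -- expectation
        return sum(p * max_reach_prob(child) for p, child in node[1])
    if kind == "choice":   # ("choice", [child, ...]) -- scheduler's max
        return max(max_reach_prob(child) for child in node[1])
    raise ValueError(kind)

tree = ("choice", [
    ("chance", [(0.5, ("leaf", "target")), (0.5, ("leaf", "miss"))]),
    ("chance", [(0.9, ("leaf", "target")), (0.1, ("leaf", "miss"))]),
])
print(max_reach_prob(tree))  # 0.9
```

In the paper's setting the leaves and branch probabilities come from symbolic execution's path conditions rather than a hand-built tree, and the approximate algorithms search this max/expectation structure instead of enumerating it exhaustively.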

  20. Executive Headteachers: What's in a Name? Executive Summary

    ERIC Educational Resources Information Center

    Theobald, Katy; Lord, Pippa

    2016-01-01

Executive headteachers (EHTs) are becoming increasingly prevalent as the self-improving school system matures: there are over 620 EHTs in the school workforce today, and the number recorded in the School Workforce Census (SWC) increased by 240 per cent between 2010 and 2014. The role is still evolving locally and nationally and, as EHTs take…

  1. Automatic Rock Detection and Mapping from HiRISE Imagery

    NASA Technical Reports Server (NTRS)

    Huertas, Andres; Adams, Douglas S.; Cheng, Yang

    2008-01-01

    This system includes a C-code software program and a set of MATLAB software tools for statistical analysis and rock distribution mapping. The major functions include rock detection and rock detection validation. The rock detection code has been evolved into a production tool that can be used by engineers and geologists with minor training.

  2. Biosensors and Bio-Bar Code Assays Based on Biofunctionalized Magnetic Microbeads

    PubMed Central

    Jaffrezic-Renault, Nicole; Martelet, Claude; Chevolot, Yann; Cloarec, Jean-Pierre

    2007-01-01

This review paper reports the applications of magnetic microbeads in biosensors and bio-bar code assays. Affinity biosensors are presented through different types of transducing systems: electrochemical, piezoelectric, or magnetic ones, applied to immunodetection and genodetection. Enzymatic biosensors are based on the biofunctionalization, through magnetic microbeads, of a transducer, most often amperometric, potentiometric, or conductimetric. The bio-bar code assays rely on a sandwich structure based on the specific biological interaction of a magnetic microbead and a nanoparticle with a defined biological molecule. The magnetic particle allows the separation of the reacted target molecules from unreacted ones, while the nanoparticles provide amplification and detection of the target molecule. The bio-bar code assays thus allow the detection of biological molecules at very low concentrations, with sensitivity similar to that of PCR.

  3. Development and Implementation of Dynamic Scripts to Execute Cycled WRF/GSI Forecasts

    NASA Technical Reports Server (NTRS)

    Zavodsky, Bradley; Srikishen, Jayanthi; Berndt, Emily; Li, Quanli; Watson, Leela

    2014-01-01

    Automating the coupling of data assimilation (DA) and modeling systems is a unique challenge in the numerical weather prediction (NWP) research community. In recent years, the Development Testbed Center (DTC) has released well-documented tools such as the Weather Research and Forecasting (WRF) model and the Gridpoint Statistical Interpolation (GSI) DA system that can be easily downloaded, installed, and run by researchers on their local systems. However, developing a coupled system in which the various preprocessing, DA, model, and postprocessing capabilities are all integrated can be labor-intensive if one has little experience with any of these individual systems. Additionally, operational modeling entities generally have specific coupling methodologies that can take time to understand and develop code to implement properly. To better enable collaborating researchers to perform modeling and DA experiments with GSI, the Short-term Prediction Research and Transition (SPoRT) Center has developed a set of Perl scripts that couple GSI and WRF in a cycling methodology consistent with the use of real-time, regional observation data from the National Centers for Environmental Prediction (NCEP)/Environmental Modeling Center (EMC). Because Perl is open source, the code can be easily downloaded and executed regardless of the user's native shell environment. This paper will provide a description of this open-source code and descriptions of a number of the use cases that have been performed by SPoRT collaborators using the scripts on different computing systems.
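The cycling structure such scripts automate alternates an assimilation step with a forecast step, the analysis of one cycle becoming the background of the next. A hedged sketch of that control flow, with a toy scalar "model" and a nudging "analysis" standing in for WRF and GSI (the function names and the 0.5 nudging weight are illustrative assumptions):

```python
def analysis(background, observation, weight=0.5):
    """Toy assimilation step: nudge the background toward the observation
    (a stand-in for the GSI analysis)."""
    return background + weight * (observation - background)

def forecast(state, forcing=1.0):
    """Toy model step: advance the state by a fixed forcing
    (a stand-in for the WRF forecast)."""
    return state + forcing

def run_cycles(initial, observations):
    """One DA cycle per observation time: analysis, then forecast,
    with each forecast becoming the next cycle's background."""
    state, states = initial, []
    for obs in observations:
        state = analysis(state, obs)   # assimilation
        state = forecast(state)        # model advance
        states.append(state)
    return states

print(run_cycles(0.0, [2.0, 4.0, 6.0]))
```

The real scripts add the surrounding plumbing the abstract describes: fetching NCEP/EMC observation files, running the preprocessors, launching GSI and WRF, and handing each cycle's output to the next.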

  4. 8 CFR 3.0 - Executive Office for Immigration Review

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 8 Aliens and Nationality 1 2010-01-01 2010-01-01 false Executive Office for Immigration Review 3.0 Section 3.0 Aliens and Nationality DEPARTMENT OF HOMELAND SECURITY GENERAL PROVISIONS EXECUTIVE OFFICE FOR IMMIGRATION REVIEW § 3.0 Executive Office for Immigration Review Regulations of the Executive Office for...

  5. The Relationship Between the Perceived Executive Management Capabilities of Senior Navy Medical Department Executives and their Reported Managerial Requirements

    DTIC Science & Technology

    1993-06-01

by implication, the areas to potentially target in executive development courses. Sieveking, Nicholas, and Woods provide additional support from their...and the executive skills outlined by Sieveking, Nicholas, and Woods in 1992 and Mann and Standenmeyer in 1990. While technical management issues...25) and the executive skills outlined in Sieveking, Nicholas, and Woods in 1992 and Mann and Standenmeyer in 1990. While a broad spectrum of the

  6. Disjointness of Stabilizer Codes and Limitations on Fault-Tolerant Logical Gates

    NASA Astrophysics Data System (ADS)

    Jochym-O'Connor, Tomas; Kubica, Aleksander; Yoder, Theodore J.

    2018-04-01

    Stabilizer codes are among the most successful quantum error-correcting codes, yet they have important limitations on their ability to fault tolerantly compute. Here, we introduce a new quantity, the disjointness of the stabilizer code, which, roughly speaking, is the number of mostly nonoverlapping representations of any given nontrivial logical Pauli operator. The notion of disjointness proves useful in limiting transversal gates on any error-detecting stabilizer code to a finite level of the Clifford hierarchy. For code families, we can similarly restrict logical operators implemented by constant-depth circuits. For instance, we show that it is impossible, with a constant-depth but possibly geometrically nonlocal circuit, to implement a logical non-Clifford gate on the standard two-dimensional surface code.

  7. 17 CFR 37.12 - Trade execution compliance schedule.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 17 Commodity and Securities Exchanges 1 2014-04-01 2014-04-01 false Trade execution compliance... EXECUTION FACILITIES General Provisions § 37.12 Trade execution compliance schedule. (a) A swap transaction... days after the available-to-trade determination submission or certification for that swap is...

  8. The very real dangers of executive coaching.

    PubMed

    Berglas, Steven

    2002-06-01

    A personal coach to help your most promising executives reach their potential--sounds good, doesn't it? But, according to Steven Berglas, executive coaches can make a bad situation worse. Because of their backgrounds and biases, they ignore psychological problems they don't understand. Companies need to consider psychotherapeutic intervention when the symptoms plaguing an executive are stubborn or severe. Executives with issues that require more than coaching come in many shapes and sizes. Consider Rob Bernstein, an executive vice president of sales at an automotive parts distributor. According to the CEO, Bernstein had just the right touch with clients but caused personnel problems inside the company. The last straw came when Bernstein publicly humiliated a mail clerk who had interrupted a meeting to ask someone to sign for a package. At that point, the CEO assigned Tom Davis to coach Bernstein. Davis, a former corporate lawyer, worked with Bernstein for four years. But Davis only exacerbated the problem by teaching Bernstein techniques for "handling" employees--methods that were condescending at best. While Bernstein appeared to be improving, he was in fact getting worse. Bernstein's real problems went undetected, and when his boss left the company, he was picked as the successor. Soon enough, Bernstein was again in trouble, suspected of embezzlement. This time, the CEO didn't call Davis; instead, he turned to the author, a trained psychotherapist, for help. Berglas soon realized that Bernstein had a serious narcissistic personality disorder and executive coaching could not help him. As that tale and others in the article teach us, executives to be coached should at the very least first receive a psychological evaluation. And company leaders should beware that executive coaches given free rein can end up wreaking personnel havoc.

  9. 40 CFR 68.155 - Executive summary.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 15 2011-07-01 2011-07-01 false Executive summary. 68.155 Section 68.155 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CHEMICAL ACCIDENT PREVENTION PROVISIONS Risk Management Plan § 68.155 Executive summary. The owner or...

  10. 48 CFR 9901.315 - Executive Secretary.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 7 2014-10-01 2014-10-01 false Executive Secretary. 9901.315 Section 9901.315 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF... Executive Secretary. The Board's staff of professional, technical and supporting personnel is directed and...

  11. 48 CFR 9901.315 - Executive Secretary.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 7 2012-10-01 2012-10-01 false Executive Secretary. 9901.315 Section 9901.315 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF... Executive Secretary. The Board's staff of professional, technical and supporting personnel is directed and...

  12. 48 CFR 9901.315 - Executive Secretary.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 7 2011-10-01 2011-10-01 false Executive Secretary. 9901.315 Section 9901.315 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF... Executive Secretary. The Board's staff of professional, technical and supporting personnel is directed and...

  13. 48 CFR 9901.315 - Executive Secretary.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Executive Secretary. 9901.315 Section 9901.315 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF... Executive Secretary. The Board's staff of professional, technical and supporting personnel is directed and...

  14. 48 CFR 9901.315 - Executive Secretary.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 7 2013-10-01 2012-10-01 true Executive Secretary. 9901.315 Section 9901.315 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF... Executive Secretary. The Board's staff of professional, technical and supporting personnel is directed and...

  15. 40 CFR 68.155 - Executive summary.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 15 2010-07-01 2010-07-01 false Executive summary. 68.155 Section 68.155 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CHEMICAL ACCIDENT PREVENTION PROVISIONS Risk Management Plan § 68.155 Executive summary. The owner or...

  16. Biological significance of long non-coding RNA FTX expression in human colorectal cancer

    PubMed Central

    Guo, Xiao-Bo; Hua, Zhu; Li, Chen; Peng, Li-Pan; Wang, Jing-Shen; Wang, Bo; Zhi, Qiao-Ming

    2015-01-01

    The purpose of this study was to determine the expression of long non-coding RNA (lncRNA) FTX and analyze its prognostic and biological significance in colorectal cancer (CRC). Quantitative reverse transcription PCR was performed to detect lncRNA FTX expression in 35 pairs of colorectal cancer and corresponding noncancerous tissues. lncRNA FTX expression was then measured in 187 colorectal cancer tissues, and its correlations with clinicopathological factors were examined. Univariate and multivariate analyses were performed to assess the prognostic significance of lncRNA FTX expression. The effects of lncRNA FTX expression on malignant phenotypes of colorectal cancer cells, and its possible biological significance, were further determined. lncRNA FTX was significantly upregulated in colorectal cancer tissues, and low lncRNA FTX expression was significantly correlated with differentiation grade, lymphovascular invasion, and clinical stage. Patients with high lncRNA FTX expression showed poorer overall survival than those with low expression. Multivariate analyses indicated that lncRNA FTX status was an independent prognostic factor for patients. Functional analyses showed that upregulation of lncRNA FTX significantly promoted growth, migration, invasion, and colony formation in colorectal cancer cells. Therefore, lncRNA FTX may be a potential biomarker for predicting the survival of colorectal cancer patients and a molecular target for the treatment of human colorectal cancer. PMID:26629053

  17. Biological significance of long non-coding RNA FTX expression in human colorectal cancer.

    PubMed

    Guo, Xiao-Bo; Hua, Zhu; Li, Chen; Peng, Li-Pan; Wang, Jing-Shen; Wang, Bo; Zhi, Qiao-Ming

    2015-01-01

    The purpose of this study was to determine the expression of long non-coding RNA (lncRNA) FTX and analyze its prognostic and biological significance in colorectal cancer (CRC). Quantitative reverse transcription PCR was performed to detect lncRNA FTX expression in 35 pairs of colorectal cancer and corresponding noncancerous tissues. lncRNA FTX expression was then measured in 187 colorectal cancer tissues, and its correlations with clinicopathological factors were examined. Univariate and multivariate analyses were performed to assess the prognostic significance of lncRNA FTX expression. The effects of lncRNA FTX expression on malignant phenotypes of colorectal cancer cells, and its possible biological significance, were further determined. lncRNA FTX was significantly upregulated in colorectal cancer tissues, and low lncRNA FTX expression was significantly correlated with differentiation grade, lymphovascular invasion, and clinical stage. Patients with high lncRNA FTX expression showed poorer overall survival than those with low expression. Multivariate analyses indicated that lncRNA FTX status was an independent prognostic factor for patients. Functional analyses showed that upregulation of lncRNA FTX significantly promoted growth, migration, invasion, and colony formation in colorectal cancer cells. Therefore, lncRNA FTX may be a potential biomarker for predicting the survival of colorectal cancer patients and a molecular target for the treatment of human colorectal cancer.

  18. GPU based cloud system for high-performance arrhythmia detection with parallel k-NN algorithm.

    PubMed

    Tae Joon Jun; Hyun Ji Park; Hyuk Yoo; Young-Hak Kim; Daeyoung Kim

    2016-08-01

    In this paper, we propose a GPU-based cloud system for high-performance arrhythmia detection. The Pan-Tompkins algorithm is used for QRS detection, and the beat classification algorithm is optimized with k-nearest neighbor (k-NN). To support high-performance beat classification on the system, we parallelized the classification algorithm with CUDA so that it executes on virtualized GPU devices in the cloud system. The MIT-BIH Arrhythmia Database is used to validate the algorithm. The system achieved a detection rate of about 93.5%, which is comparable to previous studies, while our algorithm executes 2.5 times faster than the CPU-only detection algorithm.
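    The beat-classification step can be sketched as a vectorized k-NN majority vote. This is a minimal NumPy illustration of the data-parallel structure that the paper maps onto CUDA threads, not the authors' implementation; the 2-D feature vectors, labels, and k are hypothetical:

```python
import numpy as np

def knn_classify(train_feats, train_labels, query, k=3):
    # Distance from the query beat to every training beat, computed in
    # one vectorised operation -- the same data-parallel shape a GPU
    # implementation assigns to threads, one distance per thread.
    dists = np.linalg.norm(train_feats - query, axis=1)
    # Indices of the k nearest beats, then a majority vote on their labels.
    nearest = np.argsort(dists)[:k]
    classes, votes = np.unique(train_labels[nearest], return_counts=True)
    return classes[np.argmax(votes)]

# Hypothetical 2-D beat features: label 0 = normal, label 1 = arrhythmic.
feats = np.array([[0.0, 0.1], [0.1, 0.0], [1.0, 1.1], [1.1, 0.9]])
labels = np.array([0, 0, 1, 1])
print(knn_classify(feats, labels, np.array([0.05, 0.05])))  # → 0
print(knn_classify(feats, labels, np.array([1.0, 1.0])))    # → 1
```

    In practice the training set holds many thousands of labelled beats, so the distance computation dominates; it is embarrassingly parallel, which is what makes k-NN a good fit for GPU offload.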

  19. Modern multicore and manycore architectures: Modelling, optimisation and benchmarking a multiblock CFD code

    NASA Astrophysics Data System (ADS)

    Hadade, Ioan; di Mare, Luca

    2016-08-01

    Modern multicore and manycore processors exhibit multiple levels of parallelism through a wide range of architectural features such as SIMD for data-parallel execution or threads for core parallelism. The exploitation of multi-level parallelism is therefore crucial for achieving superior performance on current and future processors. This paper presents the performance tuning of a multiblock CFD solver on Intel SandyBridge and Haswell multicore CPUs and the Intel Xeon Phi Knights Corner coprocessor. Code optimisations have been applied on two computational kernels exhibiting different computational patterns: the update of flow variables and the evaluation of the Roe numerical fluxes. We discuss at great length the code transformations required for achieving efficient SIMD computations for both kernels across the selected devices, including SIMD shuffles and transpositions for flux stencil computations and global memory transformations. Core parallelism is expressed through threading based on a number of domain decomposition techniques together with optimisations pertaining to alleviating NUMA effects found in multi-socket compute nodes. Results are correlated with the Roofline performance model in order to assess their efficiency for each distinct architecture. We report significant speedups for single-thread execution across both kernels: 2-5X on the multicore CPUs and 14-23X on the Xeon Phi coprocessor. Computations at full node and chip concurrency deliver a factor of three speedup on the multicore processors and up to 24X on the Xeon Phi manycore coprocessor.
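    The difference between scalar and data-parallel (SIMD-style) execution of a flow-variable update can be illustrated with a hypothetical 1-D kernel. NumPy's whole-array form stands in for the intrinsics-based vectorisation the authors apply, and the Burgers-type flux is an illustrative assumption, not the paper's Roe flux:

```python
import numpy as np

def update_scalar(u, flux, dt_dx):
    # Flow-variable update written cell by cell: the form a compiler
    # sees before any vectorisation effort.
    out = u.copy()
    for i in range(1, len(u) - 1):
        out[i] = u[i] - dt_dx * (flux[i + 1] - flux[i])
    return out

def update_vectorised(u, flux, dt_dx):
    # The same update over whole array slices; NumPy executes it in
    # SIMD-friendly loops, mirroring the data-parallel form expressed
    # with intrinsics on SandyBridge, Haswell and Xeon Phi.
    out = u.copy()
    out[1:-1] = u[1:-1] - dt_dx * (flux[2:] - flux[1:-1])
    return out

u = np.linspace(0.0, 1.0, 8)   # hypothetical 1-D flow variable
flux = u ** 2 / 2.0            # Burgers-type flux, for illustration only
assert np.allclose(update_scalar(u, flux, 0.1),
                   update_vectorised(u, flux, 0.1))
```

    The two routines are numerically identical; the payoff of the second form is that every output cell is independent of the others, so the update maps directly onto SIMD lanes within a core and onto threads across cores.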

  20. Numerical ‘health check’ for scientific codes: the CADNA approach

    NASA Astrophysics Data System (ADS)

    Scott, N. S.; Jézéquel, F.; Denis, C.; Chesneaux, J.-M.

    2007-04-01

    Scientific computation has unavoidable approximations built into its very fabric. One important source of error that is difficult to detect and control is round-off error propagation which originates from the use of finite precision arithmetic. We propose that there is a need to perform regular numerical 'health checks' on scientific codes in order to detect the cancerous effect of round-off error propagation. This is particularly important in scientific codes that are built on legacy software. We advocate the use of the CADNA library as a suitable numerical screening tool. We present a case study to illustrate the practical use of CADNA in scientific codes that are of interest to the Computer Physics Communications readership. In doing so we hope to stimulate a greater awareness of round-off error propagation and present a practical means by which it can be analyzed and managed.
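    The idea behind CADNA's Discrete Stochastic Arithmetic can be caricatured in a few lines: run a computation several times under tiny perturbations of the operands and count the decimal digits on which the runs agree. The sketch below uses fixed perturbations for reproducibility rather than CADNA's random rounding modes, and its digit estimator is a simplification of the library's:

```python
import math

def perturb(x, s, scale=2.0**-50):
    # Nudge x by s units of a few last-place bits; the runs below use
    # s = -1, 0, +1 as a reproducible stand-in for random rounding.
    return x * (1.0 + s * scale)

def significant_digits(samples):
    # Rough estimate of the decimal digits on which the perturbed runs
    # agree -- the per-result diagnostic CADNA reports (sketch only,
    # not the library's exact estimator).
    mean = sum(samples) / len(samples)
    if mean == 0.0:
        return 0
    spread = max(abs(s - mean) for s in samples)
    if spread == 0.0:
        return 16  # all runs bitwise identical at double precision
    return max(0, int(-math.log10(spread / abs(mean))))

# A numerically stable sum: the perturbations barely move the result.
stable = [perturb(1.0, s) + perturb(2.0, s) for s in (-1, 0, 1)]
# Catastrophic cancellation: (1 + 1e-15) - 1 promotes the same tiny
# perturbations into the leading digits of the result.
shaky = [(perturb(1.0, s) + 1e-15) - 1.0 for s in (-1, 0, 1)]
print(significant_digits(stable))  # → 15 (result trustworthy)
print(significant_digits(shaky))   # → 0  (round-off dominates)
```

    In CADNA proper the perturbation is a random choice of rounding mode applied at every floating-point operation, and each intermediate result carries its own digit estimate; the principle is the one above: agreement across perturbed runs measures how many digits survive round-off.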