Sample records for case execution time

  1. Performance enhancement of various real-time image processing techniques via speculative execution

    NASA Astrophysics Data System (ADS)

    Younis, Mohamed F.; Sinha, Purnendu; Marlowe, Thomas J.; Stoyenko, Alexander D.

    1996-03-01

    In real-time image processing, an application must satisfy a set of timing constraints while ensuring the semantic correctness of the system. Because of the natural structure of digital data, pure data and task parallelism have been used extensively in real-time image processing to accelerate the handling time of image data. These types of parallelism are based on splitting the execution load performed by a single processor across multiple nodes. However, execution of all parallel threads is mandatory for correctness of the algorithm. On the other hand, speculative execution is an optimistic execution of part(s) of the program based on assumptions on program control flow or variable values. Rollback may be required if the assumptions turn out to be invalid. Speculative execution can enhance average, and sometimes worst-case, execution time. In this paper, we target various image processing techniques to investigate applicability of speculative execution. We identify opportunities for safe and profitable speculative execution in image compression, edge detection, morphological filters, and blob recognition.
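    The following minimal Python sketch (an illustration of the general idea, not the paper's image-processing algorithms) shows speculative execution with rollback: the predicted branch starts in a worker thread while the branch predicate is still being evaluated, and its result is discarded if the prediction turns out to be wrong.

    ```python
    # Illustrative sketch of speculative execution with rollback; the block,
    # predicate, and processing paths are stand-ins, not from the paper.
    from concurrent.futures import ThreadPoolExecutor

    def slow_predicate(block):
        # Stand-in for an expensive test, e.g. "is this block mostly background?"
        return sum(block) / len(block) < 16

    def fast_path(block):
        return [0] * len(block)          # e.g. skip detailed processing

    def slow_path(block):
        return [px * 2 for px in block]  # e.g. full per-pixel processing

    def speculative_process(block, predict_fast=True):
        with ThreadPoolExecutor(max_workers=1) as pool:
            # Optimistically start the predicted branch before the predicate is known.
            speculative = pool.submit(fast_path if predict_fast else slow_path, block)
            take_fast = slow_predicate(block)
            if take_fast == predict_fast:
                return speculative.result()   # speculation paid off
            speculative.cancel()              # rollback: discard the speculative work
            return fast_path(block) if take_fast else slow_path(block)

    print(speculative_process([1, 2, 3, 4]))
    ```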

  2. 28 CFR 26.3 - Date, time, place, and method of execution.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... intravenous injection of a lethal substance or substances in a quantity sufficient to cause death, such... execution. 26.3 Section 26.3 Judicial Administration DEPARTMENT OF JUSTICE DEATH SENTENCES PROCEDURES Implementation of Death Sentences in Federal Cases § 26.3 Date, time, place, and method of execution. (a) Except...

  3. Scalable asynchronous execution of cellular automata

    NASA Astrophysics Data System (ADS)

    Folino, Gianluigi; Giordano, Andrea; Mastroianni, Carlo

    2016-10-01

    The performance and scalability of cellular automata, when executed on parallel/distributed machines, are limited by the necessity of synchronizing all the nodes at each time step, i.e., a node can execute only after the execution of the previous step at all the other nodes. However, these synchronization requirements can be relaxed: a node can execute one step after synchronizing only with the adjacent nodes. In this fashion, different nodes can execute different time steps. This can be notably advantageous in many novel and increasingly popular applications of cellular automata, such as smart city applications, simulation of natural phenomena, etc., in which the execution times can be different and variable, due to the heterogeneity of machines and/or data and/or executed functions. Indeed, a longer execution time at a node does not slow down the execution at all the other nodes but only at the neighboring nodes. This is particularly advantageous when the nodes that act as bottlenecks vary during the application execution. The goal of the paper is to analyze the benefits that can be achieved with the described asynchronous implementation of cellular automata, when compared to the classical all-to-all synchronization pattern. The performance and scalability have been evaluated through a Petri net model, as this model is very useful to represent the synchronization barrier among nodes. We examined the usual case in which the territory is partitioned into a number of regions, and the computation associated with a region is assigned to a computing node. We considered both the cases of mono-dimensional and two-dimensional partitioning. The results show that the advantage obtained through the asynchronous execution, when compared to the all-to-all synchronous approach, is notable, and it can be as large as 90% in terms of speedup.
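    A small sketch of the relaxed synchronization rule described above (not the paper's Petri net model): each cell may advance to step t+1 as soon as all of its neighbours have completed step t, so cells drift apart in simulated time.

    ```python
    # Toy 1-D cellular automaton scheduler with neighbour-only synchronization.
    # Random choice of the next ready cell emulates heterogeneous execution times.
    import random

    N_CELLS, MAX_STEP = 6, 5
    step = [0] * N_CELLS                       # completed time step per cell

    def neighbours(i):
        return [j for j in (i - 1, i + 1) if 0 <= j < N_CELLS]

    def can_advance(i):
        # A cell may run step t+1 once every neighbour has finished step t.
        return step[i] < MAX_STEP and all(step[j] >= step[i] for j in neighbours(i))

    while any(s < MAX_STEP for s in step):
        i = random.choice([c for c in range(N_CELLS) if can_advance(c)])
        step[i] += 1                           # cell i executes one local update
        print(f"cell {i} advanced to step {step[i]}, steps = {step}")
    ```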

  4. Implementation of an Ada real-time executive: A case study

    NASA Technical Reports Server (NTRS)

    Laird, James D.; Burton, Bruce A.; Koppes, Mary R.

    1986-01-01

    Current Ada language implementations and runtime environments are immature, unproven, and are a key risk area for real-time embedded computer systems (ECS). A test-case environment is provided in which the concerns of the real-time ECS community are addressed. A priority-driven executive is selected to be implemented in the Ada programming language. The model selected is representative of real-time executives tailored for embedded systems used in missile, spacecraft, and avionics applications. An Ada-based design methodology is utilized, and two designs are considered. The first of these designs requires the use of vendor-supplied runtime and tasking support. An alternative high-level design is also considered for an implementation requiring no vendor-supplied runtime or tasking support. The former approach is carried through to implementation.

  5. Exact-Differential Large-Scale Traffic Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanai, Masatoshi; Suzumura, Toyotaro; Theodoropoulos, Georgios

    2015-01-01

    Analyzing large-scale traffic by simulation requires repeating the execution many times with various patterns of scenarios or parameters. Such repeated execution entails considerable redundancy, because the change from one scenario to the next is very minor in most cases, for example, blocking only one road or changing the speed limit of several roads. In this paper, we propose a new redundancy reduction technique, called exact-differential simulation, which makes it possible to simulate only the changed scenarios in later executions while keeping exactly the same results as a whole simulation. The paper consists of two main efforts: (i) a key idea and algorithm of the exact-differential simulation, and (ii) a method to build large-scale traffic simulation on top of the exact-differential simulation. In experiments on Tokyo traffic simulation, the exact-differential simulation shows a 7.26-fold improvement in elapsed time on average, and a 2.26-fold improvement even in the worst case, compared with the whole simulation.

  6. Run-time parallelization and scheduling of loops

    NASA Technical Reports Server (NTRS)

    Saltz, Joel H.; Mirchandaney, Ravi; Crowley, Kay

    1990-01-01

    Run-time methods are studied to automatically parallelize and schedule iterations of a do loop in certain cases, where compile-time information is inadequate. The methods presented involve execution time preprocessing of the loop. At compile-time, these methods set up the framework for performing a loop dependency analysis. At run-time, wavefronts of concurrently executable loop iterations are identified. Using this wavefront information, loop iterations are reordered for increased parallelism. Symbolic transformation rules are used to produce: inspector procedures that perform execution time preprocessing, and executors or transformed versions of source code loop structures. These transformed loop structures carry out the calculations planned in the inspector procedures. Performance results are presented from experiments conducted on the Encore Multimax. These results illustrate that run-time reordering of loop indices can have a significant impact on performance. Furthermore, the overheads associated with this type of reordering are amortized when the loop is executed several times with the same dependency structure.
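    A hedged sketch of the inspector/executor idea described above (illustrative only, and conservative in treating every access as a dependence): the inspector groups loop iterations into wavefronts from run-time index arrays, and the executor then runs the wavefronts in order.

    ```python
    # Inspector/executor sketch for a loop whose dependences are known only at
    # run time (here: a[idx[i]] += 1). Iterations within a wavefront touch
    # disjoint locations and could be dispatched to parallel workers.
    def inspector(n_iters, touched):
        """touched[i] lists the array locations accessed by iteration i."""
        last_wave = {}                         # location -> wavefront of last access
        wavefronts = []
        for i in range(n_iters):
            deps = [last_wave[x] for x in touched[i] if x in last_wave]
            wave = 1 + max(deps) if deps else 0
            for x in touched[i]:
                last_wave[x] = wave
            while len(wavefronts) <= wave:
                wavefronts.append([])
            wavefronts[wave].append(i)
        return wavefronts

    def executor(wavefronts, body):
        for wave in wavefronts:                # waves run in order...
            for i in wave:                     # ...iterations inside a wave in parallel
                body(i)

    a, idx = [0] * 4, [0, 1, 0, 2, 1]
    waves = inspector(len(idx), [[idx[i]] for i in range(len(idx))])

    def body(i):
        a[idx[i]] += 1

    executor(waves, body)
    print(waves, a)                            # [[0, 1, 3], [2, 4]] [2, 2, 1, 0]
    ```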

  7. Maximizing Total QoS-Provisioning of Image Streams with Limited Energy Budget

    NASA Astrophysics Data System (ADS)

    Lee, Wan Yeon; Kim, Kyong Hoon; Ko, Young Woong

    To fully utilize the limited battery energy of mobile electronic devices, we propose an adaptive adjustment method of processing quality for multiple image stream tasks running with widely varying execution times. This adjustment method completes the worst-case executions of the tasks with a given budget of energy, and maximizes the total reward value of processing quality obtained during their executions by exploiting the probability distribution of task execution times. The proposed method derives the maximum reward value for the tasks being executable with arbitrary processing quality, and near maximum value for the tasks being executable with a finite number of processing qualities. Our evaluation on a prototype system shows that the proposed method achieves larger reward values, by up to 57%, than the previous method.

  8. Time and Memory Efficient Online Piecewise Linear Approximation of Sensor Signals.

    PubMed

    Grützmacher, Florian; Beichler, Benjamin; Hein, Albert; Kirste, Thomas; Haubelt, Christian

    2018-05-23

    Piecewise linear approximation of sensor signals is a well-known technique in the fields of Data Mining and Activity Recognition. In this context, several algorithms have been developed, some of them with the purpose to be performed on resource constrained microcontroller architectures of wireless sensor nodes. While microcontrollers are usually constrained in computational power and memory resources, all state-of-the-art piecewise linear approximation techniques either need to buffer sensor data or have an execution time depending on the segment’s length. In the paper at hand, we propose a novel piecewise linear approximation algorithm, with a constant computational complexity as well as a constant memory complexity. Our proposed algorithm’s worst-case execution time is one to three orders of magnitude smaller and its average execution time is three to seventy times smaller compared to the state-of-the-art Piecewise Linear Approximation (PLA) algorithms in our experiments. In our evaluations, we show that our algorithm is time and memory efficient without sacrificing the approximation quality compared to other state-of-the-art piecewise linear approximation techniques, while providing a maximum error guarantee per segment, a small parameter space of only one parameter, and a maximum latency of one sample period plus its worst-case execution time.

  9. Run-time parallelization and scheduling of loops

    NASA Technical Reports Server (NTRS)

    Saltz, Joel H.; Mirchandaney, Ravi; Crowley, Kay

    1991-01-01

    Run-time methods are studied to automatically parallelize and schedule iterations of a do loop in certain cases where compile-time information is inadequate. The methods presented involve execution time preprocessing of the loop. At compile-time, these methods set up the framework for performing a loop dependency analysis. At run-time, wavefronts of concurrently executable loop iterations are identified. Using this wavefront information, loop iterations are reordered for increased parallelism. Symbolic transformation rules are used to produce: inspector procedures that perform execution time preprocessing, and executors or transformed versions of source code loop structures. These transformed loop structures carry out the calculations planned in the inspector procedures. Performance results are presented from experiments conducted on the Encore Multimax. These results illustrate that run-time reordering of loop indexes can have a significant impact on performance.

  10. NMC denies prioritising new FtP cases to meet government target.

    PubMed

    Osborne, Katie

    2015-01-27

    The Nursing and Midwifery Council's success in reducing the time it takes to handle fitness to practise cases has not been achieved by neglecting older cases, according to its chief executive Jackie Smith.

  11. Mobile Cloud Computing with SOAP and REST Web Services

    NASA Astrophysics Data System (ADS)

    Ali, Mushtaq; Fadli Zolkipli, Mohamad; Mohamad Zain, Jasni; Anwar, Shahid

    2018-05-01

    Mobile computing in conjunction with mobile web services provides a strong approach whereby the limitations of mobile devices may possibly be tackled. Mobile web services are based on two types of technologies, SOAP and REST, which work with existing protocols to develop Web services. Both approaches carry their own distinct features, yet keeping the constrained resources of mobile devices in mind, the better of the two is considered to be the one that minimizes the computation and transmission overhead while offloading. The transfer of load from a mobile device to remote servers for execution is called computational offloading. There are numerous approaches to implementing computational offloading, a viable solution for alleviating the resource constraints of mobile devices, yet a dynamic method of computational offloading is always required for a smooth and simple migration of complex tasks. The intention of this work is to present a distinctive approach which does not engage the mobile resources for a long time. The concept of web services is utilized in our work to delegate computationally intensive tasks for remote execution. We tested both the SOAP and REST Web services approaches for mobile computing. Two parameters were considered in our lab experiments: execution time and energy consumption. The results show that RESTful Web services execution is far better than executing the same application through the SOAP Web services approach, in terms of both execution time and energy consumption. In experiments with the developed prototype matrix multiplication app, REST execution time is about 200% better than the SOAP execution approach, and in the case of energy consumption REST execution is about 250% better than the SOAP execution approach.
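    A minimal sketch of REST-style offloading as described above; the endpoint URL and JSON fields are hypothetical placeholders, not part of the paper's prototype.

    ```python
    # Offload a matrix multiplication to a remote server over plain JSON/HTTP
    # (REST style, no SOAP envelope) and measure the round-trip execution time.
    import time
    import requests

    def offload_matmul(a, b, url="http://offload.example.com/api/matmul"):  # hypothetical endpoint
        start = time.perf_counter()
        response = requests.post(url, json={"a": a, "b": b}, timeout=10)
        response.raise_for_status()
        elapsed = time.perf_counter() - start
        return response.json()["result"], elapsed        # hypothetical response field

    if __name__ == "__main__":
        result, elapsed = offload_matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]])
        print(f"offloaded result {result} in {elapsed:.3f} s")
    ```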

  12. Combining instruction prefetching with partial cache locking to improve WCET in real-time systems.

    PubMed

    Ni, Fan; Long, Xiang; Wan, Han; Gao, Xiaopeng

    2013-01-01

    Caches play an important role in embedded systems to bridge the performance gap between fast processors and slow memory, and prefetching mechanisms have been proposed to further improve cache performance. In real-time systems, however, the use of caches complicates the Worst-Case Execution Time (WCET) analysis due to their unpredictable behavior. Modern embedded processors are often equipped with a locking mechanism to improve the timing predictability of the instruction cache. However, locking the whole cache may degrade cache performance and increase the WCET of the real-time application. In this paper, we propose an instruction-prefetching combined partial cache locking mechanism, which combines an instruction prefetching mechanism (termed BBIP) with partial cache locking to improve the WCET estimates of real-time applications. BBIP is an instruction prefetching mechanism we have already proposed to improve the worst-case cache performance and, in turn, the worst-case execution time. Evaluations on typical real-time applications show that the partial cache locking mechanism yields remarkable WCET improvement over static analysis and full cache locking.

  13. Combining Instruction Prefetching with Partial Cache Locking to Improve WCET in Real-Time Systems

    PubMed Central

    Ni, Fan; Long, Xiang; Wan, Han; Gao, Xiaopeng

    2013-01-01

    Caches play an important role in embedded systems to bridge the performance gap between fast processors and slow memory, and prefetching mechanisms have been proposed to further improve cache performance. In real-time systems, however, the use of caches complicates the Worst-Case Execution Time (WCET) analysis due to their unpredictable behavior. Modern embedded processors are often equipped with a locking mechanism to improve the timing predictability of the instruction cache. However, locking the whole cache may degrade cache performance and increase the WCET of the real-time application. In this paper, we propose an instruction-prefetching combined partial cache locking mechanism, which combines an instruction prefetching mechanism (termed BBIP) with partial cache locking to improve the WCET estimates of real-time applications. BBIP is an instruction prefetching mechanism we have already proposed to improve the worst-case cache performance and, in turn, the worst-case execution time. Evaluations on typical real-time applications show that the partial cache locking mechanism yields remarkable WCET improvement over static analysis and full cache locking. PMID:24386133

  14. Performance comparison analysis library communication cluster system using merge sort

    NASA Astrophysics Data System (ADS)

    Wulandari, D. A. R.; Ramadhan, M. E.

    2018-04-01

    Computing began with a single processor; to increase computing speed, the use of multiple processors was introduced. This second paradigm is known as parallel computing, for example a cluster. The cluster must have a communication protocol for processing; one such protocol is the Message Passing Interface (MPI). MPI has several library implementations, among them OpenMPI and MPICH2. The performance of a cluster machine depends on how well the performance characteristics of the communication library suit the characteristics of the problem, so this study aims to analyze the comparative performance of these libraries in handling parallel computation. The case studies in this research are MPICH2 and OpenMPI. The research executes a sorting problem, using the mergesort method, to assess the performance of the cluster system. The research method is to implement OpenMPI and MPICH2 on a Linux-based cluster of five virtual computers and then analyze system performance under different test scenarios using three parameters to assess MPICH2 and OpenMPI: execution time, speedup, and efficiency. The results of this study show that with each increase in data size, OpenMPI and MPICH2 have average speedup and efficiency that tend to increase, but both decrease at large data sizes. Increased data size does not necessarily increase speedup and efficiency, only execution time, for example at a data size of 100000. OpenMPI has a greater execution time than MPICH2; for example, at a data size of 1000, the average execution time with MPICH2 is 0.009721 and with OpenMPI is 0.003895. OpenMPI can customize communication needs.
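    A hedged sketch of the kind of experiment described above, assuming the mpi4py bindings rather than the study's actual code: chunks are scattered across ranks, sorted locally, then gathered and merged on rank 0, with the parallel region timed.

    ```python
    # Parallel mergesort over MPI; run with e.g.  mpiexec -n 4 python mpi_mergesort.py
    from heapq import merge
    import random
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    if rank == 0:
        data = [random.randint(0, 10**6) for _ in range(100_000)]
        chunks = [data[i::size] for i in range(size)]    # one chunk per rank
    else:
        chunks = None

    comm.Barrier()
    t0 = MPI.Wtime()
    local = sorted(comm.scatter(chunks, root=0))         # local sort on each rank
    sorted_chunks = comm.gather(local, root=0)           # collect sorted runs
    if rank == 0:
        result = list(merge(*sorted_chunks))             # k-way merge on the root
        ok = result == sorted(data)
        print(f"{size} ranks: {MPI.Wtime() - t0:.4f} s, sorted correctly: {ok}")
    ```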

  15. Rt-Space: A Real-Time Stochastically-Provisioned Adaptive Container Environment

    DTIC Science & Technology

    2017-08-04

    This project was directed at component-based soft real-time (SRT) systems implemented on multicore platforms. To facilitate...upon average-case or near-average-case task execution times. The main intellectual contribution of this project was the development of methods for...allocating CPU time to components and associated analysis for validating SRT correctness.

  16. Use of a remote clinical decision support service for a multicenter trial to implement prediction rules for children with minor blunt head trauma.

    PubMed

    Goldberg, Howard S; Paterno, Marilyn D; Grundmeier, Robert W; Rocha, Beatriz H; Hoffman, Jeffrey M; Tham, Eric; Swietlik, Marguerite; Schaeffer, Molly H; Pabbathi, Deepika; Deakyne, Sara J; Kuppermann, Nathan; Dayan, Peter S

    2016-03-01

    To evaluate the architecture, integration requirements, and execution characteristics of a remote clinical decision support (CDS) service used in a multicenter clinical trial. The trial tested the efficacy of implementing brain injury prediction rules for children with minor blunt head trauma. We integrated the Epic® electronic health record (EHR) with the Enterprise Clinical Rules Service (ECRS), a web-based CDS service, at two emergency departments. Patterns of CDS review included either a delayed, near-real-time review, where the physician viewed CDS recommendations generated by the nursing assessment, or a real-time review, where the physician viewed recommendations generated by their own documentation. A backstopping, vendor-based CDS triggered with zero delay when no recommendation was available in the EHR from the web-service. We assessed the execution characteristics of the integrated system and the source of the generated recommendations viewed by physicians. The ECRS mean execution time was 0.74 ± 0.72 s. Overall execution time was substantially different at the two sites, with mean total transaction times of 19.67 and 3.99 s. Of 1930 analyzed transactions from the two sites, 60% (310/521) of all physician documentation-initiated recommendations and 99% (1390/1409) of all nurse documentation-initiated recommendations originated from the remote web service. The remote CDS system was the source of recommendations in more than half of the real-time cases and virtually all the near-real-time cases. Comparisons are limited by allowable variation in user workflow and resolution of the EHR clock. With maturation and adoption of standards for CDS services, remote CDS shows promise to decrease time-to-trial for multicenter evaluations of candidate decision support interventions. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  17. High Velocity Jet Noise Source Location and Reduction. Task 2 Supplement. Computer Program for Calculating the Aeroacoustic Characteristics of Jets from Nozzles of Arbitrary Shape.

    DTIC Science & Technology

    1978-05-01

    controls and executes the jet plume flow field computation. After each axial slice has been evaluated, the MAIN program calls subroutine SLICE to...input data; otherwise the execution is halted. 4.3.2 ARCCOS(X) This is a function subroutine which computes the principal value of the arc cosine of the... execution time available. Each successive case requires a title card (80-character label in columns 1-80), followed by the INPUT NAMELIST. The data from

  18. Flawed Execution: A Case Study on Operational Contract Support

    DTIC Science & Technology

    2016-06-01

    NAVAL POSTGRADUATE SCHOOL, MONTEREY, CALIFORNIA. Joint Applied Project, June 2016: Flawed Execution: A Case Study on Operational Contract Support. Authors: Scott F. Taggart, Captain, United States Marine Corps; Jacob Ledford.

  19. Development and Execution of End-of-Mission Operations Case Study of the UARS and ERBS End-of-Mission Plans

    NASA Technical Reports Server (NTRS)

    Hughes, John; Marius, Julio L.; Montoro, Manuel; Patel, Mehul; Bludworth, David

    2006-01-01

    This paper is a case study of the development and execution of the End-of-Mission plans for the Earth Radiation Budget Satellite (ERBS) and the Upper Atmosphere Research Satellite (UARS). The goals of the End-of-Mission plans are to minimize the time the spacecraft remains on orbit and to minimize the risk of creating orbital debris. Both of these missions predate the NASA Management Instructions (NMI) that direct missions to provide for safe mission termination. Each spacecraft had its own unique challenges, which required assessing End-of-Mission requirements versus spacecraft limitations. Ultimately, the End-of-Mission operations were about risk mitigation. This paper describes the operational challenges and the lessons learned executing these End-of-Mission plans.

  20. Simulator for heterogeneous dataflow architectures

    NASA Technical Reports Server (NTRS)

    Malekpour, Mahyar R.

    1993-01-01

    A new simulator is developed to simulate the execution of an algorithm graph in accordance with the Algorithm to Architecture Mapping Model (ATAMM) rules. ATAMM is a Petri Net model which describes the periodic execution of large-grained, data-independent dataflow graphs and which provides predictable steady state time-optimized performance. This simulator extends the ATAMM simulation capability from a homogeneous set of resources, or functional units, to a more general heterogeneous architecture. Simulation test cases show that the simulator accurately executes the ATAMM rules for both a heterogeneous architecture and a homogeneous architecture, which is the special case of only one processor type. The simulator forms one tool in an ATAMM Integrated Environment which contains other tools for graph entry, graph modification for performance optimization, and playback of simulations for analysis.

  1. PRENATAL INFECTION AND EXECUTIVE DYSFUNCTION IN ADULT SCHIZOPHRENIA

    PubMed Central

    Brown, Alan S.; Vinogradov, Sophia; Kremen, William S.; Poole, John H.; Deicken, Raymond F.; Penner, Justin D.; McKeague, Ian W.; Kochetkova, Anna; Kern, David; Schaefer, Catherine A.

    2010-01-01

    Objective Executive dysfunction is one of the most prominent and functionally important cognitive deficits in schizophrenia. Although strong associations have been identified between executive impairments and structural and functional prefrontal cortical deficits, the etiological factors that contribute to disruption of this important cognitive domain remain unclear. Increasing evidence suggests that schizophrenia has a neurodevelopmental etiology, and several prenatal infections have been associated with risk of this disorder. To date, however, no previous study has examined whether in utero infection is associated with executive dysfunction in patients with schizophrenia. Method In the present study, we assessed the relationship between serologically documented prenatal exposure to influenza and toxoplasmosis and performance on the Wisconsin Card Sorting Test (WCST) and the Trail Making Test, part B (Trails B), as well as other measures of executive function, in 26 patients with schizophrenia from a large and well-characterized birth cohort. Results Cases who were exposed in utero to infection committed significantly more total errors on the WCST and took significantly more time to complete the Trails B than unexposed cases. Exposed cases also exhibited deficits on figural fluency, letter-number sequencing, and backward digit span. Conclusion Prenatal infections previously associated with schizophrenia are related to impaired performance on the WCST and Trails B. The pattern of results suggests that cognitive set-shifting ability may be particularly vulnerable to this gestational exposure. Further work is necessary to elucidate the specificity of prenatal infection to these executive function measures and examine correlates with neuroanatomic and neurophysiologic anomalies. PMID:19369317

  2. Problems and Opportunities in the Design of Entrances to Ports and Harbors. Proceedings of a Symposium held August 13-15, 1980, Fort Belvoir, Virginia.

    DTIC Science & Technology

    1980-01-01

    Louisiana staff: Jack W. Boller, Executive Director; Donald W. Perkins, Assistant Executive Director; Charles A. Bookman, Staff Officer; Aurora M...wave, as in the case of the Amazon River in South America, the system may contain two or more tides at the same time. Thus, the tide may be rising or

  3. Telemanipulator design and optimization software

    NASA Astrophysics Data System (ADS)

    Cote, Jean; Pelletier, Michel

    1995-12-01

    For many years, industrial robots have been used to execute specific repetitive tasks. In those cases, the optimal configuration and location of the manipulator only has to be found once. The optimal configuration or position were often found empirically according to the tasks to be performed. In telemanipulation, the nature of the tasks to be executed is much wider and can be very demanding in terms of dexterity and workspace. The position/orientation of the robot's base could be required to move during the execution of a task. At present, the choice of the initial position of the teleoperator is usually found empirically, which can be sufficient in the case of an easy or repetitive task. In the converse situation, the amount of time wasted moving the teleoperator support platform has to be taken into account during the execution of the task. Automatic optimization of the position/orientation of the platform or a better designed robot configuration could minimize these movements and save time. This paper will present two algorithms. The first algorithm is used to optimize the position and orientation of a given manipulator (or manipulators) with respect to the environment on which a task has to be executed. The second algorithm is used to optimize the position or the kinematic configuration of a robot. For this purpose, the tasks to be executed are digitized using a position/orientation measurement system and a compact representation based on special octrees. Given a digitized task, the optimal position or Denavit-Hartenberg configuration of the manipulator can be obtained numerically. Constraints on the robot design can also be taken into account. A graphical interface has been designed to facilitate the use of the two optimization algorithms.

  4. Multiple Microcomputer Control Algorithm.

    DTIC Science & Technology

    1979-09-01

    discrete and semaphore supervisor calls can be used with tasks in separate processors, in which case they are maintained in shared memory. Operations on ...the source or destination operand specifier of each mode in most cases. However, four of the 16 general register addressing modes and one of the 8 pro...instruction time is based on the specified usage factors and the best case and worst case execution times for the instruc-

  5. Online Planning Algorithm

    NASA Technical Reports Server (NTRS)

    Rabideau, Gregg R.; Chien, Steve A.

    2010-01-01

    AVA v2 software selects goals for execution from a set of goals that oversubscribe shared resources. The term goal refers to a science or engineering request to execute a possibly complex command sequence, such as image targets or ground-station downlinks. Developed as an extension to the Virtual Machine Language (VML) execution system, the software enables onboard and remote goal triggering through the use of an embedded, dynamic goal set that can oversubscribe resources. From the set of conflicting goals, a subset must be chosen that maximizes a given quality metric, which in this case is strict priority selection. A goal can never be pre-empted by a lower priority goal, and high-level goals can be added, removed, or updated at any time, and the "best" goals will be selected for execution. The software addresses the issue of re-planning that must be performed in a short time frame by the embedded system, where computational resources are constrained. In particular, the algorithm addresses problems with well-defined goal requests without temporal flexibility that oversubscribe available resources. By using a fast, incremental algorithm, goal selection can be postponed in a "just-in-time" fashion, allowing requests to be changed or added at the last minute, thereby enabling shorter response times and greater autonomy for the system under control.
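    A minimal sketch of strict-priority goal selection over an oversubscribed resource (an illustration under stated assumptions, not the AVA v2 code): goals are walked in priority order and kept only if their resource demand still fits, so a goal is never displaced by a lower-priority one.

    ```python
    # Greedy strict-priority selection over one shared resource; goal names,
    # priorities, and demands are illustrative.
    def select_goals(goals, capacity):
        """goals: dicts with 'name', 'priority' (lower = higher), 'demand'."""
        selected, remaining = [], capacity
        for goal in sorted(goals, key=lambda g: g["priority"]):
            if goal["demand"] <= remaining:
                selected.append(goal["name"])
                remaining -= goal["demand"]
        return selected

    goals = [
        {"name": "image_target_A", "priority": 1, "demand": 40},
        {"name": "downlink_B",     "priority": 2, "demand": 50},
        {"name": "image_target_C", "priority": 3, "demand": 30},
    ]
    print(select_goals(goals, capacity=100))   # ['image_target_A', 'downlink_B']
    ```

    Re-running the selection whenever the goal set changes gives the "just-in-time" behaviour described above, since a greedy pass like this is cheap enough to repeat at the last minute.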

  6. Technical integration of hippocampus, Basal Ganglia and physical models for spatial navigation.

    PubMed

    Fox, Charles; Humphries, Mark; Mitchinson, Ben; Kiss, Tamas; Somogyvari, Zoltan; Prescott, Tony

    2009-01-01

    Computational neuroscience is increasingly moving beyond modeling individual neurons or neural systems to consider the integration of multiple models, often constructed by different research groups. We report on our preliminary technical integration of recent hippocampal formation, basal ganglia and physical environment models, together with visualisation tools, as a case study in the use of Python across the modelling tool-chain. We do not present new modeling results here. The architecture incorporates leaky-integrator and rate-coded neurons, a 3D environment with collision detection and tactile sensors, 3D graphics and 2D plots. We found Python to be a flexible platform, offering a significant reduction in development time, without a corresponding significant increase in execution time. We illustrate this by implementing a part of the model in various alternative languages and coding styles, and comparing their execution times. For very large-scale system integration, communication with other languages and parallel execution may be required, which we demonstrate using the BRAHMS framework's Python bindings.
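    As a small example of the kind of component being integrated (a generic rate-coded leaky-integrator unit, not the actual hippocampal or basal ganglia models), the sketch below steps the unit with forward Euler; this is the style of code the text says was re-implemented in several languages to compare execution times.

    ```python
    # Rate-coded leaky integrator: tau * da/dt = -a + gain * input
    import numpy as np

    def simulate_leaky_integrator(drive, dt=0.001, tau=0.02, gain=1.0):
        """drive: input per time step; returns the unit's activation trace."""
        a = np.zeros(len(drive))
        for t in range(1, len(drive)):
            a[t] = a[t - 1] + dt * (-a[t - 1] + gain * drive[t]) / tau
        return a

    drive = np.concatenate([np.zeros(100), np.ones(400), np.zeros(500)])
    activation = simulate_leaky_integrator(drive)
    print(f"peak activation {activation.max():.3f} at step {int(activation.argmax())}")
    ```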

  7. 16 CFR 444.3 - Unfair or deceptive cosigner practices.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... creating the cosigner's liability for future charges is executed, of the nature of his or her liability as... case of open end credit shall mean prior to the time that the agreement creating the cosigner's...

  8. An acceleration framework for synthetic aperture radar algorithms

    NASA Astrophysics Data System (ADS)

    Kim, Youngsoo; Gloster, Clay S.; Alexander, Winser E.

    2017-04-01

    Algorithms for radar signal processing, such as Synthetic Aperture Radar (SAR), are computationally intensive and require considerable execution time on a general purpose processor. Reconfigurable logic can be used to off-load the primary computational kernel onto a custom computing machine in order to reduce execution time by an order of magnitude as compared to kernel execution on a general purpose processor. Specifically, Field Programmable Gate Arrays (FPGAs) can be used to accelerate these kernels using hardware-based custom logic implementations. In this paper, we demonstrate a framework for algorithm acceleration. We used SAR as a case study to illustrate the potential for algorithm acceleration offered by FPGAs. Initially, we profiled the SAR algorithm and implemented a homomorphic filter using a hardware implementation of the natural logarithm. Experimental results show a linear speedup obtained by adding reasonably small processing elements in a Field Programmable Gate Array (FPGA) as opposed to using a software implementation running on a typical general purpose processor.
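    A hedged software sketch of the homomorphic-filtering step mentioned above (a NumPy reference for the idea, not the FPGA kernel): the logarithm turns multiplicative speckle into an additive component, a smoothing filter suppresses it, and the exponential restores the intensity domain.

    ```python
    # 1-D homomorphic filter: log -> smooth -> exp; the signal and noise model
    # are synthetic stand-ins for SAR data.
    import numpy as np

    def homomorphic_filter(signal, kernel_size=9, eps=1e-6):
        log_signal = np.log(signal + eps)                # multiplicative -> additive
        kernel = np.ones(kernel_size) / kernel_size      # simple moving-average smoother
        return np.exp(np.convolve(log_signal, kernel, mode="same"))

    rng = np.random.default_rng(0)
    clean = np.linspace(1.0, 4.0, 256)
    speckled = clean * rng.gamma(shape=4.0, scale=0.25, size=clean.size)  # unit-mean speckle
    filtered = homomorphic_filter(speckled)
    print(f"deviation from clean: before {np.std(speckled - clean):.3f}, "
          f"after {np.std(filtered - clean):.3f}")
    ```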

  9. Computer-Aided Software Engineering - An approach to real-time software development

    NASA Technical Reports Server (NTRS)

    Walker, Carrie K.; Turkovich, John J.

    1989-01-01

    A new software engineering discipline is Computer-Aided Software Engineering (CASE), a technology aimed at automating the software development process. This paper explores the development of CASE technology, particularly in the area of real-time/scientific/engineering software, and a history of CASE is given. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.

  10. Effectiveness comparison of partially executed t-way test suite based generated by existing strategies

    NASA Astrophysics Data System (ADS)

    Othman, Rozmie R.; Ahmad, Mohd Zamri Zahir; Ali, Mohd Shaiful Aziz Rashid; Zakaria, Hasneeza Liza; Rahman, Md. Mostafijur

    2015-05-01

    Consuming 40 to 50 percent of software development cost, software testing is one of the most resource-consuming activities in the software development lifecycle. To ensure an acceptable level of quality and reliability of a typical software product, it is desirable to test every possible combination of input data under various configurations. Due to the combinatorial explosion problem, exhaustive testing is practically impossible. Resource constraints, costing factors, as well as strict time-to-market deadlines are amongst the main factors that inhibit such consideration. Earlier work suggests that a sampling strategy (i.e., one based on t-way parameter interaction, referred to as t-way testing) can be effective in reducing the number of test cases without affecting the fault detection capability. However, for a very large system, even a t-way strategy will produce a large test suite that needs to be executed. In the end, only part of the planned test suite can be executed in order to meet the aforementioned constraints. Here, there is a need for test engineers to measure the effectiveness of a partially executed test suite in order to assess the risk they have to take. Motivated by the abovementioned problem, this paper presents an effectiveness comparison of partially executed t-way test suites generated by existing strategies, using the tuples coverage method. Here, test engineers can predict the effectiveness of the testing process if only part of the original test cases is executed.
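    A minimal sketch of the tuples coverage measure used above: the fraction of all t-way parameter-value combinations exercised by a (possibly partial) test suite. The parameter domains and test cases below are illustrative.

    ```python
    # Compute t-way tuple coverage of a partially executed test suite.
    from itertools import combinations, product

    def tway_coverage(test_suite, domains, t=2):
        """domains: list of value lists per parameter; test_suite: list of tuples."""
        required = {(params, values)
                    for params in combinations(range(len(domains)), t)
                    for values in product(*(domains[p] for p in params))}
        covered = {(params, tuple(test[p] for p in params))
                   for test in test_suite
                   for params in combinations(range(len(domains)), t)}
        return len(covered & required) / len(required)

    domains = [[0, 1], [0, 1], ["a", "b", "c"]]        # three parameters
    full_suite = list(product(*domains))               # exhaustive suite: 12 test cases
    partial = full_suite[:6]                           # only half of it gets executed
    print(f"2-way coverage of the partial suite: {tway_coverage(partial, domains):.2f}")
    ```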

  11. Creating a Learning Organization for State, Local, and Tribal Law Enforcement to Combat Violent Extremism

    DTIC Science & Technology

    2016-09-01

    iterations in that time for the student practitioners to work through. When possible, case studies will be selected from actual counter-radicalizations...justify participation in the learning organization. Those cases will be evaluated on a case-by-case basis and the need to expand the CVE mission...interested within the learning organization. The National Fire Academy Executive Fire Officer Program applied research pre-course is an example of

  12. Prefetching in file systems for MIMD multiprocessors

    NASA Technical Reports Server (NTRS)

    Kotz, David F.; Ellis, Carla Schlatter

    1990-01-01

    The question of whether prefetching blocks of a file into the block cache can effectively reduce overall execution time of a parallel computation, even under favorable assumptions, is considered. Experiments have been conducted with an interleaved file system testbed on the Butterfly Plus multiprocessor. Results of these experiments suggest that (1) the hit ratio, the accepted measure in traditional caching studies, may not be an adequate measure of performance when the workload consists of parallel computations and parallel file access patterns, (2) caching with prefetching can significantly improve the hit ratio and the average time to perform an I/O (input/output) operation, and (3) an improvement in overall execution time has been observed in most cases. In spite of these gains, prefetching sometimes results in increased execution times (a negative result, given the optimistic nature of the study). The authors explore why it is not trivial to translate savings on individual I/O requests into consistently better overall performance and identify the key problems that need to be addressed in order to improve the potential of prefetching techniques in the environment.

  13. Decision exploration lab: a visual analytics solution for decision management.

    PubMed

    Broeksema, Bertjan; Baudel, Thomas; Telea, Arthur G; Crisafulli, Paolo

    2013-12-01

    We present a visual analytics solution designed to address prevalent issues in the area of Operational Decision Management (ODM). In ODM, which has its roots in Artificial Intelligence (Expert Systems) and Management Science, it is increasingly important to align business decisions with business goals. In our work, we consider decision models (executable models of the business domain) as ontologies that describe the business domain, and production rules that describe the business logic of decisions to be made over this ontology. Executing a decision model produces an accumulation of decisions made over time for individual cases. We are interested, first, to get insight in the decision logic and the accumulated facts by themselves. Secondly and more importantly, we want to see how the accumulated facts reveal potential divergences between the reality as captured by the decision model, and the reality as captured by the executed decisions. We illustrate the motivation, added value for visual analytics, and our proposed solution and tooling through a business case from the car insurance industry.

  14. The contribution of executive control to semantic cognition: Convergent evidence from semantic aphasia and executive dysfunction.

    PubMed

    Thompson, Hannah E; Almaghyuli, Azizah; Noonan, Krist A; Barak, Ohr; Lambon Ralph, Matthew A; Jefferies, Elizabeth

    2018-01-03

    Semantic cognition, as described by the controlled semantic cognition (CSC) framework (Rogers et al., Neuropsychologia, 76, 220), involves two key components: activation of coherent, generalizable concepts within a heteromodal 'hub' in combination with modality-specific features (spokes), and a constraining mechanism that manipulates and gates this knowledge to generate time- and task-appropriate behaviour. Executive-semantic goal representations, largely supported by executive regions such as frontal and parietal cortex, are thought to allow the generation of non-dominant aspects of knowledge when these are appropriate for the task or context. Semantic aphasia (SA) patients have executive-semantic deficits, and these are correlated with general executive impairment. If the CSC proposal is correct, patients with executive impairment should not only exhibit impaired semantic cognition, but should also show characteristics that align with those observed in SA. This possibility remains largely untested, as patients selected on the basis that they show executive impairment (i.e., with 'dysexecutive syndrome') have not been extensively tested on tasks tapping semantic control and have not been previously compared with SA cases. We explored conceptual processing in 12 patients showing symptoms consistent with dysexecutive syndrome (DYS) and 24 SA patients, using a range of multimodal semantic assessments which manipulated control demands. Patients with executive impairments, despite not being selected to show semantic impairments, nevertheless showed parallel patterns to SA cases. They showed strong effects of distractor strength, cues and miscues, and probe-target distance, plus minimal effects of word frequency on comprehension (unlike semantic dementia patients with degradation of conceptual knowledge). This supports a component process account of semantic cognition in which retrieval is shaped by control processes, and confirms that deficits in SA patients reflect difficulty controlling semantic retrieval. © 2018 The Authors. Journal of Neuropsychology published by John Wiley & Sons Ltd on behalf of British Psychological Society.

  15. Questionnaire-based assessment of executive functioning: Case studies.

    PubMed

    Kronenberger, William G; Castellanos, Irina; Pisoni, David B

    2018-01-01

    Delays in the development of executive functioning skills are frequently observed in pediatric neuropsychology populations and can have a broad and significant impact on quality of life. As a result, assessment of executive functioning is often relevant for the development of formulations and recommendations in pediatric neuropsychology clinical work. Questionnaire-based measures of executive functioning behaviors in everyday life have unique advantages and complement traditional neuropsychological measures of executive functioning. Two case studies of children with spina bifida are presented to illustrate the clinical use of a new questionnaire measure of executive and learning-related functioning, the Learning, Executive, and Attention Functioning Scale (LEAF). The LEAF emphasizes clinical utility in assessment by incorporating four characteristics: brevity in administration, breadth of additional relevant content, efficiency of scoring and interpretation, and ease of availability for use. LEAF results were consistent with another executive functioning checklist in documenting everyday behavior problems related to working memory, planning, and organization while offering additional breadth of assessment of domains such as attention, processing speed, and novel problem-solving. These case study results demonstrate the clinical utility of questionnaire-based measurement of executive functioning in pediatric neuropsychology and provide a new measure for accomplishing this goal.

  16. A distributed version of the NASA Engine Performance Program

    NASA Technical Reports Server (NTRS)

    Cours, Jeffrey T.; Curlett, Brian P.

    1993-01-01

    Distributed NEPP, a version of the NASA Engine Performance Program, uses the original NEPP code but executes it in a distributed computer environment. Multiple workstations connected by a network increase the program's speed and, more importantly, the complexity of the cases it can handle in a reasonable time. Distributed NEPP uses the public domain software package, called Parallel Virtual Machine, allowing it to execute on clusters of machines containing many different architectures. It includes the capability to link with other computers, allowing them to process NEPP jobs in parallel. This paper discusses the design issues and granularity considerations that entered into programming Distributed NEPP and presents the results of timing runs.

  17. Dynamic Control of Plans with Temporal Uncertainty

    NASA Technical Reports Server (NTRS)

    Morris, Paul; Muscettola, Nicola; Vidal, Thierry

    2001-01-01

    Certain planning systems that deal with quantitative time constraints have used an underlying Simple Temporal Problem solver to ensure temporal consistency of plans. However, many applications involve processes of uncertain duration whose timing cannot be controlled by the execution agent. These cases require more complex notions of temporal feasibility. In previous work, various "controllability" properties such as Weak, Strong, and Dynamic Controllability have been defined. The most interesting and useful Controllability property, the Dynamic one, has ironically proved to be the most difficult to analyze. In this paper, we resolve the complexity issue for Dynamic Controllability. Unexpectedly, the problem turns out to be tractable. We also show how to efficiently execute networks whose status has been verified.
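    For reference, a hedged sketch of the underlying Simple Temporal Problem consistency check mentioned above (not the Dynamic Controllability algorithm itself): each constraint t_j - t_i <= w becomes an edge i -> j of weight w, and the network is consistent exactly when the distance graph has no negative cycle.

    ```python
    # STP consistency via Floyd-Warshall negative-cycle detection.
    INF = float("inf")

    def stp_consistent(n, constraints):
        """constraints: (i, j, w) meaning t_j - t_i <= w, over nodes 0..n-1."""
        d = [[0 if i == j else INF for j in range(n)] for i in range(n)]
        for i, j, w in constraints:
            d[i][j] = min(d[i][j], w)
        for k in range(n):
            for i in range(n):
                for j in range(n):
                    if d[i][k] + d[k][j] < d[i][j]:
                        d[i][j] = d[i][k] + d[k][j]
        return all(d[i][i] >= 0 for i in range(n))     # negative diagonal = negative cycle

    # Event 1 occurs 5-10 units after event 0, event 2 at most 3 units after
    # event 1 but at least 15 units after event 0: infeasible (t2 <= 13 < 15).
    constraints = [(0, 1, 10), (1, 0, -5), (1, 2, 3), (2, 0, -15)]
    print(stp_consistent(3, constraints))              # False
    ```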

  18. Cleanroom certification model

    NASA Technical Reports Server (NTRS)

    Currit, P. A.

    1983-01-01

    The Cleanroom software development methodology is designed to take the gamble out of product releases for both suppliers and receivers of the software. The ingredients of this procedure are a life cycle of executable product increments, representative statistical testing, and a standard estimate of the MTTF (Mean Time To Failure) of the product at the time of its release. A statistical approach to software product testing using randomly selected samples of test cases is considered. A statistical model is defined for the certification process which uses the timing data recorded during test. A reasonableness argument for this model is provided that uses previously published data on software product execution. Also included is a derivation of the certification model estimators and a comparison of the proposed least squares technique with the more commonly used maximum likelihood estimators.

  19. A Case Study Understanding Employability through the Lens of Human Resource Executives

    ERIC Educational Resources Information Center

    Stokes, Carmeda L.

    2013-01-01

    The purpose of this qualitative multiple case study was to examine HR executives' perspectives on employability enhancement for employees and how it is operationalized in their workplace. The exploratory questions that guided the study were, What are the perspectives of HR executives regarding employability enhancement for employees, and In what…

  20. A path-level exact parallelization strategy for sequential simulation

    NASA Astrophysics Data System (ADS)

    Peredo, Oscar F.; Baeza, Daniel; Ortiz, Julián M.; Herrero, José R.

    2018-01-01

    Sequential Simulation is a well known method in geostatistical modelling. Following the Bayesian approach for simulation of conditionally dependent random events, Sequential Indicator Simulation (SIS) method draws simulated values for K categories (categorical case) or classes defined by K different thresholds (continuous case). Similarly, Sequential Gaussian Simulation (SGS) method draws simulated values from a multivariate Gaussian field. In this work, a path-level approach to parallelize SIS and SGS methods is presented. A first stage of re-arrangement of the simulation path is performed, followed by a second stage of parallel simulation for non-conflicting nodes. A key advantage of the proposed parallelization method is to generate identical realizations as with the original non-parallelized methods. Case studies are presented using two sequential simulation codes from GSLIB: SISIM and SGSIM. Execution time and speedup results are shown for large-scale domains, with many categories and maximum kriging neighbours in each case, achieving high speedup results in the best scenarios using 16 threads of execution in a single machine.

  1. Navigating Change and Leading an Institution of Higher Education: A Case Study of the Missional Leadership of a University President

    ERIC Educational Resources Information Center

    Bunn, Christopher Edward

    2010-01-01

    The purpose of this qualitative, single case study is to explore key leadership principles and strategies related to the "good to great" pattern of growth at Lee University. In order to accomplish this purpose, this study investigates Dr. Paul Conn's thoughts and navigation through times of change, conflict, and the strategic execution of planned…

  2. Black-Box System Testing of Real-Time Embedded Systems Using Random and Search-Based Testing

    NASA Astrophysics Data System (ADS)

    Arcuri, Andrea; Iqbal, Muhammad Zohaib; Briand, Lionel

    Testing real-time embedded systems (RTES) is in many ways challenging. Thousands of test cases can be potentially executed on an industrial RTES. Given the magnitude of testing at the system level, only a fully automated approach can really scale up to test industrial RTES. In this paper we take a black-box approach and model the RTES environment using the UML/MARTE international standard. Our main motivation is to provide a more practical approach to the model-based testing of RTES by allowing system testers, who are often not familiar with the system design but know the application domain well enough, to model the environment to enable test automation. Environment models can support the automation of three tasks: the code generation of an environment simulator, the selection of test cases, and the evaluation of their expected results (oracles). In this paper, we focus on the second task (test case selection) and investigate three test automation strategies using inputs from UML/MARTE environment models: Random Testing (baseline), Adaptive Random Testing, and Search-Based Testing (using Genetic Algorithms). Based on one industrial case study and three artificial systems, we show how, in general, no technique is better than the others. Which test selection technique to use is determined by the failure rate (testing stage) and the execution time of test cases. Finally, we propose a practical process to combine the use of all three test strategies.
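    A hedged sketch of the Adaptive Random Testing strategy named above, for a simple numeric input space (the RTES environment inputs in the paper are of course richer): each new test case is the random candidate farthest from all previously executed tests, spreading tests more evenly than plain random testing.

    ```python
    # Adaptive Random Testing by the fixed-size-candidate-set scheme.
    import math
    import random

    def adaptive_random_tests(n_tests, dim=2, n_candidates=10, seed=0):
        rng = random.Random(seed)

        def rand_point():
            return [rng.random() for _ in range(dim)]

        executed = [rand_point()]                      # first test is purely random
        while len(executed) < n_tests:
            candidates = [rand_point() for _ in range(n_candidates)]
            # keep the candidate whose nearest executed test is farthest away
            best = max(candidates, key=lambda c: min(math.dist(c, e) for e in executed))
            executed.append(best)
        return executed

    for test in adaptive_random_tests(5):
        print([round(x, 3) for x in test])
    ```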

  3. Managing and Securing Critical Infrastructure - A Semantic Policy and Trust Driven Approach

    DTIC Science & Technology

    2011-08-01

    environmental factors, then it is very likely that the corresponding device has been compromised and controlled by an adversary. In this case, the report... [Figure 7: Policy Execution in Faulty Case; panels (a) Environmental Factors in Faulty Case and (b) Result of Policy Execution in Faulty Case]

  4. Reducing the Use of Force: De-Escalation Training for Police Officers

    DTIC Science & Technology

    2016-09-01

    Christine Nixon and David Bradley, "Policing Us More Gently: An Australian Case Study on the Police Use of Force," Police Executive Research Forum (April 2007): 95, http://www.policeforum.org

  5. 24 CFR 7.36 - Hearing.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... EMPLOYMENT OPPORTUNITY; POLICY, PROCEDURES AND PROGRAMS Equal Employment Opportunity Without Regard to Race... and the time frames for executing the right to request an administrative hearing. Note: Where a mixed... unless the MSPB has dismissed the mixed case complaint or appeal for jurisdictional reasons. (See 29 CFR...

  6. 24 CFR 7.36 - Hearing.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... EMPLOYMENT OPPORTUNITY; POLICY, PROCEDURES AND PROGRAMS Equal Employment Opportunity Without Regard to Race... and the time frames for executing the right to request an administrative hearing. Note: Where a mixed... unless the MSPB has dismissed the mixed case complaint or appeal for jurisdictional reasons. (See 29 CFR...

  7. On-time reliability impacts of advanced traveler information services (ATIS) : Washington, DC case study, executive summary

    DOT National Transportation Integrated Search

    1999-05-01

    This report documents the development and testing of a Surveillance and Delay Advisory System (SDAS) for application in congested rural areas. SDAS included several techniques that could be used on rural highways to give travelers advance information...

  8. Designing a multistage supply chain in cross-stage reverse logistics environments: application of particle swarm optimization algorithms.

    PubMed

    Chiang, Tzu-An; Che, Z H; Cui, Zhihua

    2014-01-01

    This study designed a cross-stage reverse logistics course for defective products so that damaged products generated in downstream partners can be directly returned to upstream partners throughout the stages of a supply chain for rework and maintenance. To solve this reverse supply chain design problem, an optimal cross-stage reverse logistics mathematical model was developed. In addition, we developed a genetic algorithm (GA) and three particle swarm optimization (PSO) algorithms: the inertia weight method (PSOA_IWM), V(Max) method (PSOA_VMM), and constriction factor method (PSOA_CFM), which we employed to find solutions to support this mathematical model. Finally, a real case and five simulative cases with different scopes were used to compare the execution times, convergence times, and objective function values of the four algorithms used to validate the model proposed in this study. Regarding system execution time, the GA consumed more time than the other three PSOs did. Regarding objective function value, the GA, PSOA_IWM, and PSOA_CFM could obtain a lower convergence value than PSOA_VMM could. Finally, PSOA_IWM demonstrated a faster convergence speed than PSOA_VMM, PSOA_CFM, and the GA did.

  9. Designing a Multistage Supply Chain in Cross-Stage Reverse Logistics Environments: Application of Particle Swarm Optimization Algorithms

    PubMed Central

    Chiang, Tzu-An; Che, Z. H.

    2014-01-01

    This study designed a cross-stage reverse logistics course for defective products so that damaged products generated in downstream partners can be directly returned to upstream partners throughout the stages of a supply chain for rework and maintenance. To solve this reverse supply chain design problem, an optimal cross-stage reverse logistics mathematical model was developed. In addition, we developed a genetic algorithm (GA) and three particle swarm optimization (PSO) algorithms: the inertia weight method (PSOA_IWM), V Max method (PSOA_VMM), and constriction factor method (PSOA_CFM), which we employed to find solutions to support this mathematical model. Finally, a real case and five simulative cases with different scopes were used to compare the execution times, convergence times, and objective function values of the four algorithms used to validate the model proposed in this study. Regarding system execution time, the GA consumed more time than the other three PSOs did. Regarding objective function value, the GA, PSOA_IWM, and PSOA_CFM could obtain a lower convergence value than PSOA_VMM could. Finally, PSOA_IWM demonstrated a faster convergence speed than PSOA_VMM, PSOA_CFM, and the GA did. PMID:24772026
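    A hedged sketch of the inertia weight PSO variant (PSOA_IWM in the study's terminology, as described in the two records above) on a toy objective; the supply-chain model itself is not reproduced here, and the parameter values are conventional defaults rather than the study's settings.

    ```python
    # Particle swarm optimization with the inertia weight velocity update.
    import random

    def pso_inertia_weight(objective, dim, n_particles=20, iters=100,
                           w=0.7, c1=1.5, c2=1.5, lo=-10.0, hi=10.0, seed=1):
        rng = random.Random(seed)
        x = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
        v = [[0.0] * dim for _ in range(n_particles)]
        pbest = [xi[:] for xi in x]
        gbest = min(pbest, key=objective)
        for _ in range(iters):
            for i in range(n_particles):
                for d in range(dim):
                    r1, r2 = rng.random(), rng.random()
                    # inertia + cognitive (own best) + social (global best) terms
                    v[i][d] = (w * v[i][d]
                               + c1 * r1 * (pbest[i][d] - x[i][d])
                               + c2 * r2 * (gbest[d] - x[i][d]))
                    x[i][d] = min(hi, max(lo, x[i][d] + v[i][d]))
                if objective(x[i]) < objective(pbest[i]):
                    pbest[i] = x[i][:]
            gbest = min(pbest, key=objective)
        return gbest, objective(gbest)

    def sphere(p):                                     # toy objective to minimize
        return sum(xi * xi for xi in p)

    best, value = pso_inertia_weight(sphere, dim=3)
    print(f"best point {[round(b, 3) for b in best]} with objective {value:.6f}")
    ```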

  10. 8 CFR 1003.18 - Scheduling of cases.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ....18 Aliens and Nationality EXECUTIVE OFFICE FOR IMMIGRATION REVIEW, DEPARTMENT OF JUSTICE GENERAL PROVISIONS EXECUTIVE OFFICE FOR IMMIGRATION REVIEW Immigration Court-Rules of Procedure § 1003.18 Scheduling of cases. (a) The Immigration Court shall be responsible for scheduling cases and providing notice to...

  11. Impact force and time analysis influenced by execution distance in a roundhouse kick to the head in taekwondo.

    PubMed

    Estevan, Isaac; Alvarez, Octavio; Falco, Coral; Molina-García, Javier; Castillo, Isabel

    2011-10-01

    The execution distance is a tactical factor that affects mechanical performance and execution technique in taekwondo. This study analyzes the roundhouse kick to the head by comparing the maximum impact force, execution time, and impact time at 3 distances according to the athletes' competition level. It also analyzes the relationship between impact force and weight in each group, and examines whether the execution distance affects the maximum impact force, execution time, and impact time in each level group or between the 2 competition levels. Participants were 27 male taekwondo players (13 medallists and 14 nonmedallists). The medallists executed the roundhouse kick to the head with greater impact force and in a shorter execution time than the nonmedallists did when they kicked from any distance other than their combat distance. However, the results showed that the execution distance influences the execution time and impact time in the nonmedallist group. It therefore seems appropriate to orient high-level competitors to train offensive actions from any distance similar to the long execution distance, because it offers equal effectiveness and greater security against the opponent. Practitioners should also focus their training on improving time performance, because it is more affected by distance than impact force is.

  12. 17 CFR 232.106 - Prohibition against electronic submissions containing executable code.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... executable code will be suspended, unless the executable code is contained only in one or more PDF documents, in which case the submission will be accepted but the PDF document(s) containing executable code will...

  13. Model Checking Abstract PLEXIL Programs with SMART

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu I.

    2007-01-01

    We describe a method to automatically generate discrete-state models of abstract Plan Execution Interchange Language (PLEXIL) programs that can be analyzed using model checking tools. Starting from a high-level description of a PLEXIL program or a family of programs with common characteristics, the generator lays out the framework that models the principles of program execution. The concrete parts of the program are not generated automatically but must be introduced by hand by the modeler. As a case study, we generate models to verify properties of the PLEXIL macro constructs that are introduced as shorthand notation. After an exhaustive analysis, we conclude that the macro definitions obey the intended semantics and behave as expected, contingent on a few specific requirements on the timing semantics of micro-steps in the concrete executive implementation.

  14. Orion Burn Management, Nominal and Response to Failures

    NASA Technical Reports Server (NTRS)

    Odegard, Ryan; Goodman, John L.; Barrett, Charles P.; Pohlkamp, Kara; Robinson, Shane

    2016-01-01

    An approach for managing Orion on-orbit burn execution is described for nominal and failure response scenarios. The burn management strategy for Orion takes into account per-burn variations in targeting, timing, and execution; crew and ground operator intervention and overrides; defined burn failure triggers and responses; and corresponding on-board software sequencing functionality. Burn-to-burn variations are managed through the identification of specific parameters that may be updated for each progressive burn. Failure triggers and automatic responses during the burn timeframe are defined to provide safety for the crew in the case of vehicle failures, along with override capabilities to ensure operational control of the vehicle. On-board sequencing software provides the timeline coordination for performing the required activities related to targeting, burn execution, and responding to burn failures.

  15. Applications for General Purpose Command Buffers: The Emergency Conjunction Avoidance Maneuver

    USGS Publications Warehouse

    Scheid, Robert J; England, Martin

    2016-01-01

    A case study is presented for the use of Relative Operation Sequence (ROS) command buffers to quickly execute a propulsive maneuver to avoid a collision with space debris. In this process, a ROS is custom-built with a burn time and magnitude, uplinked to the spacecraft, and executed in 15 percent of the time of the previous method. This new process provides three primary benefits. First, the planning cycle can be delayed until it is certain a burn must be performed, reducing team workload. Second, changes can be made to the burn parameters almost up to the point of execution while still allowing the normal uplink product review process, reducing the risk of leaving the operational orbit because of outdated burn parameters, and minimizing the chance of accidents from human error, such as missed commands, in a high-stress situation. Third, the science impacts can be customized and minimized around the burn, and in the event of an abort can be eliminated entirely in some circumstances. The result is a compact burn process that can be executed in as few as four hours and can be aborted seconds before execution. Operational, engineering, planning, and flight dynamics perspectives are presented, as well as a functional overview of the code and workflow required to implement the process. Future expansions and capabilities are also discussed.

  16. Supervising simulations with the Prodiguer Messaging Platform

    NASA Astrophysics Data System (ADS)

    Greenslade, Mark; Carenton, Nicolas; Denvil, Sebastien

    2015-04-01

    At any given moment, researchers affiliated with the Institut Pierre Simon Laplace (IPSL) climate modeling group are running hundreds of global climate simulations. These simulations execute on a heterogeneous set of High Performance Computing (HPC) environments spread throughout France. The IPSL's simulation execution runtime is called libIGCM (library for IPSL Global Climate Modeling group). libIGCM has recently been enhanced to support real-time operational use cases, including simulation monitoring, data publication, environment metrics collection, and automated simulation control. At the core of this enhancement is the Prodiguer messaging platform. libIGCM now emits information, in the form of messages, for remote processing at IPSL servers in Paris. The remote message processing takes several forms, for example: 1. persisting message content to database(s); 2. notifying an operator of changes in a simulation's execution status; 3. launching rollback jobs upon simulation failure; 4. dynamically updating controlled vocabularies; 5. notifying downstream applications such as the Prodiguer web portal. We will describe how the messaging platform has been implemented from a technical perspective and demonstrate the Prodiguer web portal receiving real-time notifications.

  17. Pragmatic and executive functions in traumatic brain injury and right brain damage: An exploratory comparative study

    PubMed Central

    Zimmermann, Nicolle; Gindri, Gigiane; de Oliveira, Camila Rosa; Fonseca, Rochele Paz

    2011-01-01

    Objective To describe the frequency of pragmatic and executive deficits in right brain damaged (RBD) and traumatic brain injury (TBI) patients, and to verify possible dissociations between pragmatic and executive functions in these two groups. Methods The sample comprised 7 cases of TBI and 7 cases of RBD. All participants were assessed by means of tasks from the Montreal Communication Evaluation Battery and executive function tests including the Trail Making Test, Hayling Test, Wisconsin Card Sorting Test, semantic and phonemic verbal fluency tasks, and working memory tasks from the Brazilian Brief Neuropsychological Assessment Battery NEUPSILIN. Z-scores were calculated and a descriptive analysis of the frequency of deficits (Z < -1.5) was carried out. Results RBD patients presented with deficits predominantly on conversational and narrative discursive tasks, while TBI patients showed a more widespread pattern of pragmatic deficits. Regarding executive functions, RBD deficits predominantly involved working memory and verbal initiation, whereas TBI individuals again exhibited a general profile of executive dysfunction, affecting mainly working memory, initiation, inhibition, planning and switching. Pragmatic and executive deficits were generally associated in both the RBD and TBI groups, except for two simple dissociations: two post-TBI cases showed executive deficits in the absence of pragmatic deficits. Discussion Pragmatic and executive deficits can be very frequent following TBI or vascular RBD. There seems to be an association between these abilities, indicating that although they can co-occur, a cause-consequence relationship cannot be the only hypothesis. PMID:29213762

  18. Prototype Development and Redesign: A Case Study

    DTIC Science & Technology

    1990-03-01

    deal with difficult problems of leadership, strategy and management." [Ref. 10:p. 1] Admiral Turner feels that using the case study method "will help...placement officer was a Lieutenant Commander or Commander. Oftentimes they came from leadership positions of executive officer equivalence. They were...ting power. Personnel within the computer organization who are used to manual methods and potential users of the system are resisting the change and

  19. Real time emotion aware applications: a case study employing emotion evocative pictures and neuro-physiological sensing enhanced by Graphic Processor Units.

    PubMed

    Konstantinidis, Evdokimos I; Frantzidis, Christos A; Pappas, Costas; Bamidis, Panagiotis D

    2012-07-01

    In this paper, the feasibility of adopting Graphics Processor Units (GPUs) for real-time emotion-aware computing is investigated, with the aim of boosting the time-consuming computations employed in such applications. The proposed methodology was applied to the analysis of encephalographic and electrodermal data gathered while participants passively viewed emotionally evocative stimuli. The GPU effectiveness when processing electroencephalographic and electrodermal recordings is demonstrated by comparing the execution time of chaos/complexity analysis through nonlinear dynamics (multi-channel correlation dimension/D2) and signal processing algorithms (computation of the skin conductance level/SCL) in various popular programming environments. Apart from the beneficial role of parallel programming, the adoption of special design techniques regarding memory management may further reduce execution time, by a factor of approximately 30 in comparison with ANSI C (single-core sequential execution). Therefore, the use of GPU parallel capabilities offers a reliable and robust solution for real-time sensing of the user's affective state. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
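
    The chaos/complexity measure mentioned above, the correlation dimension D2, can be illustrated with a plain CPU sketch of the Grassberger-Procaccia correlation sum. This is not the GPU implementation from the paper; the embedding dimension, delay and radius grid are illustrative assumptions.

```python
import numpy as np

def correlation_dimension(signal, m=5, tau=2, radii=None):
    """Estimate D2 of a scalar time series via time-delay embedding and the
    Grassberger-Procaccia correlation sum (CPU reference sketch)."""
    signal = np.asarray(signal, dtype=float)
    n = len(signal) - (m - 1) * tau
    # time-delay embedding into m-dimensional state vectors
    emb = np.column_stack([signal[i * tau: i * tau + n] for i in range(m)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    d = d[np.triu_indices(n, k=1)]                  # pairwise distances, i < j
    if radii is None:
        radii = np.geomspace(np.percentile(d, 5), np.percentile(d, 60), 12)
    c = np.array([(d < r).mean() for r in radii])   # correlation sum C(r)
    slope, _ = np.polyfit(np.log(radii), np.log(c), 1)
    return slope                                    # D2 ~ slope of log C(r) vs log r

# Toy example: a noisy sine should yield a low correlation dimension.
t = np.linspace(0, 20 * np.pi, 1000)
print(correlation_dimension(np.sin(t) + 0.01 * np.random.randn(t.size)))
```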

  20. Analysis of loss of time value during road maintenance project

    NASA Astrophysics Data System (ADS)

    Sudarsana, Dewa Ketut; Sanjaya, Putu Ari

    2017-06-01

    Lane closure is frequently required during the execution of road maintenance projects. It has a negative impact on road users, such as increased vehicle operating costs and the loss of time value. Nevertheless, analysis of the loss of time value in Indonesia has not been carried out. The time value for road users was estimated using the city/region minimum wage approach. Vehicle speed before construction was obtained by observation, while the speed during the road maintenance project was predicted by multiplying the pre-construction speed by a speed adjustment factor. For the national road maintenance projects executed on two-lane, two-way urban and interurban roads in Bali province in fiscal year 2015, the loss of time value averaged IDR 12,789,000 per day per road link. The relationship between traffic volume and the road users' loss of time value was obtained with a logarithmic model.

  1. Executive Headteachers: What's in a Name? Case Study Compendium

    ERIC Educational Resources Information Center

    Wespieser, Karen, Ed.

    2016-01-01

    This Case Study Compendium provides an overview of the 12 cases that were investigated as part of the study "Executive Headteachers: What's in a Name?'" (Lord et al., 2016). The case study overviews are based on in-depth analysis and research as described in the full report (ibid) and the Technical Appendix (Harland and Bernardinelli,…

  2. Research on schedulers for astronomical observatories

    NASA Astrophysics Data System (ADS)

    Colome, Josep; Colomer, Pau; Guàrdia, Josep; Ribas, Ignasi; Campreciós, Jordi; Coiffard, Thierry; Gesa, Lluis; Martínez, Francesc; Rodler, Florian

    2012-09-01

    The main task of a scheduler applied to astronomical observatories is the time optimization of the facility and the maximization of the scientific return. Scheduling of astronomical observations is an example of the classical task allocation problem known as the job-shop problem (JSP), where N ideal tasks are assigned to M identical resources while minimizing the total execution time. A problem of higher complexity, called the Flexible JSP (FJSP), arises when the tasks can be executed by different resources, i.e. by different telescopes; it focuses on determining a routing policy (i.e., which machine to assign to each operation) in addition to the traditional scheduling decisions (i.e., determining the starting time of each operation). In most cases there is no single best approach to the planning problem and, therefore, various mathematical algorithms (Genetic Algorithms, Ant Colony Optimization algorithms, Multi-Objective Evolutionary algorithms, etc.) are usually considered to adapt the application to the system configuration and task execution constraints. The scheduling time-cycle is also an important ingredient in determining the best approach. A short-term scheduler, for instance, has to find a good solution with minimum computation time, providing the system with the capability to adapt the selected task to varying execution constraints (i.e., environment conditions). We present in this contribution an analysis of the task allocation problem and the solutions currently in use at different astronomical facilities. We also describe the schedulers for three different projects (CTA, CARMENES and TJO) where the conclusions of this analysis are applied to develop a suitable scheduling routine.
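
    As a toy illustration of the flexible job-shop (FJSP) routing-plus-scheduling decision described above, the sketch below greedily assigns each observation block to whichever telescope can finish it earliest. It is a deliberately simple heuristic for illustration only, not any of the CTA, CARMENES or TJO schedulers; the task durations and telescope names are made up.

```python
from dataclasses import dataclass, field

@dataclass
class Telescope:
    name: str
    free_at: float = 0.0                    # time (hours) the telescope becomes idle
    plan: list = field(default_factory=list)

def greedy_fjsp(tasks, telescopes):
    """Toy FJSP heuristic: route each task to the telescope that finishes it first."""
    for task_name, duration, allowed in tasks:
        candidates = [t for t in telescopes if t.name in allowed]
        best = min(candidates, key=lambda t: t.free_at + duration)
        start = best.free_at
        best.free_at = start + duration
        best.plan.append((task_name, start, best.free_at))
    return telescopes

scopes = [Telescope("T1"), Telescope("T2")]
tasks = [("target-A", 2.0, {"T1", "T2"}),   # (task, hours, telescopes able to run it)
         ("target-B", 1.5, {"T2"}),
         ("target-C", 3.0, {"T1", "T2"})]
for scope in greedy_fjsp(tasks, scopes):
    print(scope.name, scope.plan)
```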

  3. Before you make that big decision...

    PubMed

    Kahneman, Daniel; Lovallo, Dan; Sibony, Olivier

    2011-06-01

    When an executive makes a big bet, he or she typically relies on the judgment of a team that has put together a proposal for a strategic course of action. After all, the team will have delved into the pros and cons much more deeply than the executive has time to do. The problem is, biases invariably creep into any team's reasoning-and often dangerously distort its thinking. A team that has fallen in love with its recommendation, for instance, may subconsciously dismiss evidence that contradicts its theories, give far too much weight to one piece of data, or make faulty comparisons to another business case. That's why, with important decisions, executives need to conduct a careful review not only of the content of recommendations but of the recommendation process. To that end, the authors-Kahneman, who won a Nobel Prize in economics for his work on cognitive biases; Lovallo of the University of Sydney; and Sibony of McKinsey-have put together a 12-question checklist intended to unearth and neutralize defects in teams' thinking. These questions help leaders examine whether a team has explored alternatives appropriately, gathered all the right information, and used well-grounded numbers to support its case. They also highlight considerations such as whether the team might be unduly influenced by self-interest, overconfidence, or attachment to past decisions. By using this practical tool, executives will build decision processes over time that reduce the effects of biases and upgrade the quality of decisions their organizations make. The payoffs can be significant: A recent McKinsey study of more than 1,000 business investments, for instance, showed that when companies worked to reduce the effects of bias, they raised their returns on investment by seven percentage points. Executives need to realize that the judgment of even highly experienced, superbly competent managers can be fallible. A disciplined decision-making process, not individual genius, is the key to good strategy.

  4. An advanced approach to traditional round robin CPU scheduling algorithm to prioritize processes with residual burst time nearest to the specified time quantum

    NASA Astrophysics Data System (ADS)

    Swaraj Pati, Mythili N.; Korde, Pranav; Dey, Pallav

    2017-11-01

    The purpose of this paper is to introduce an optimised variant of the round robin scheduling algorithm. Every algorithm works in its own way and has its own merits and demerits. The proposed algorithm overcomes the shortfalls of existing scheduling algorithms in terms of waiting time, turnaround time, throughput and number of context switches. The algorithm is pre-emptive and works on the priority of the associated processes. The priority is decided on the basis of the remaining burst time of a process: the lower the burst time, the higher the priority, and the higher the burst time, the lower the priority. A time quantum is specified at the start of execution. If the remaining burst time of a process is more than the specified time quantum but less than twice the time quantum, the process is given high priority and is allowed to execute until it finishes, so it does not have to wait for its next burst cycle.
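
    A minimal sketch of the rule described above: processes are ordered by remaining burst time, a process whose remainder is below twice the specified quantum runs to completion, and any other process is pre-empted after one quantum. The process set and quantum value are illustrative.

```python
def schedule(bursts, quantum):
    """Toy simulation of the modified round robin rule: shortest remaining burst
    first; a process whose remainder is less than twice the quantum runs to
    completion, otherwise it is pre-empted after one quantum."""
    remaining = dict(bursts)                      # pid -> remaining burst time
    clock, finish = 0, {}
    while remaining:
        pid = min(remaining, key=remaining.get)   # shortest remaining burst first
        rem = remaining[pid]
        if rem < 2 * quantum:
            clock += rem                          # run to completion
            finish[pid] = clock
            del remaining[pid]
        else:
            clock += quantum                      # pre-empt after one time quantum
            remaining[pid] -= quantum
    return finish                                 # with all arrivals at t=0, finish time == turnaround time

print(schedule({"P1": 9, "P2": 4, "P3": 15}, quantum=5))
```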

  5. Medical simulation in interventional cardiology: "More research is needed".

    PubMed

    Tajti, Peter; Brilakis, Emmanouil S

    2018-05-01

    Medical simulation is being used for training fellows to perform coronary angiography. Medical simulation training was associated with 2 min less fluoroscopy time per case after adjustment. Whether medical simulation really works needs to be evaluated in additional, well-designed and executed clinical studies. © 2018 Wiley Periodicals, Inc.

  6. Executive report : effects of changing HOV lane occupancy requirements : El Monte busway case study.

    DOT National Transportation Integrated Search

    2002-09-01

    In 1999, the California Legislature passed Senate Bill 63, which lowered the vehicle-occupancy requirement on the El Monte Busway on the San Bernardino (I-10) Freeway from three persons per vehicle (3+) to two persons per vehicle (2+) full time. The ...

  7. Leadership profiles of senior nurse executives.

    PubMed

    Hemman, E A

    2000-01-01

    As hospitals reorganize to meet the demand for accessible, cost-effective quality healthcare, nursing's active participation as part of the top management team is vital. The purpose of this study was to describe the leadership profiles of four senior nurse executives and determine their congruence with the theoretical perspectives of the stratified systems theory. A multiple case study methodology was employed to develop individual and group leadership profiles through related experiences obtained during an interview, the organization's expectations based on their job descriptions, and a survey of their self-perceptions of how they spent most of their time. The findings indicated that the executives' leadership behavior was consistent with the theory in that they reported more frequent leadership behaviors at the strategic domain, less activity at the organizational domain, and infrequent activity at the production domain. Individual profiles were uniformly consistent with the group profile.

  8. Executive Development: Meeting the Needs of Top Teams and Boards.

    ERIC Educational Resources Information Center

    Jackson, Sheila; Farndale, Elaine; Kakabadse, Andrew

    2003-01-01

    A literature review and six case studies examined the roles and responsibilities of the chief executive officers and board chairs, the capabilities required for success, and related executive development activity. Findings include the importance of focusing executive development on capability enhancement to ensure that it supports organizational…

  9. Providing web-based tools for time series access and analysis

    NASA Astrophysics Data System (ADS)

    Eberle, Jonas; Hüttich, Christian; Schmullius, Christiane

    2014-05-01

    Time series information is widely used in environmental change analyses and is also essential information for stakeholders and governmental agencies. However, a challenging issue is the processing of raw data and the execution of time series analysis. In most cases, data have to be found, downloaded, processed and even converted into the correct data format prior to executing time series analysis tools, and must be prepared before they can be used in the various existing software packages. Several packages like TIMESAT (Jönnson & Eklundh, 2004) for phenological studies, BFAST (Verbesselt et al., 2010) for breakpoint detection, and GreenBrown (Forkel et al., 2013) for trend calculations are provided as open-source software and can be executed from the command line. This is needed if data pre-processing and time series analysis are to be automated. To bring both parts, automated data access and data analysis, together, a web-based system was developed to provide access to satellite-based time series data and to the above-mentioned analysis tools. Users of the web portal are able to specify a point or a polygon and an available dataset (e.g., Vegetation Indices and Land Surface Temperature datasets from NASA MODIS). The data is then processed and provided as a time series CSV file. Afterwards the user can select an analysis tool that is executed on the server. The final data (CSV, plot images, GeoTIFFs) is visualized in the web portal and can be downloaded for further usage. As a first use case, we built up a complementary web-based system with NASA MODIS products for Germany and parts of Siberia based on the Earth Observation Monitor (www.earth-observation-monitor.net). The aim of this work is to make time series analysis with existing tools as easy as possible so that users can focus on the interpretation of the results. References: Jönnson, P. and L. Eklundh (2004). TIMESAT - a program for analysing time-series of satellite sensor data. Computers and Geosciences 30, 833-845. Verbesselt, J., R. Hyndman, G. Newnham and D. Culvenor (2010). Detecting trend and seasonal changes in satellite image time series. Remote Sensing of Environment, 114, 106-115. DOI: 10.1016/j.rse.2009.08.014 Forkel, M., N. Carvalhais, J. Verbesselt, M. Mahecha, C. Neigh and M. Reichstein (2013). Trend Change Detection in NDVI Time Series: Effects of Inter-Annual Variability and Methodology. Remote Sensing 5, 2113-2144.
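
    As a minimal stand-in for the kind of analysis such a service automates (and far simpler than the TIMESAT/BFAST/GreenBrown tools named above), the sketch below fits an ordinary least-squares trend to an extracted 'date,ndvi' CSV time series; the file name and column names are assumptions.

```python
import csv
import numpy as np

def ndvi_trend(csv_path):
    """Ordinary least-squares trend (NDVI units per year) of a 'date,ndvi' CSV
    time series; a stand-in for a full trend/breakpoint analysis."""
    dates, values = [], []
    with open(csv_path, newline="") as fh:
        for row in csv.DictReader(fh):                # expects 'date' and 'ndvi' columns
            dates.append(np.datetime64(row["date"]))
            values.append(float(row["ndvi"]))
    # years elapsed since the first observation
    t = (np.array(dates) - dates[0]) / np.timedelta64(365, "D")
    slope, intercept = np.polyfit(t, np.array(values), 1)
    return slope, intercept

# slope, intercept = ndvi_trend("modis_ndvi_point.csv")   # hypothetical extracted file
```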

  10. Promising Practices: A Case Study on Public Health Emergency Preparedness at a University

    ERIC Educational Resources Information Center

    Mathes, Amy L.

    2013-01-01

    There is little published literature on operational coordination during a real time disaster regardless of the setting. This study describes a university's emergency management plan and its execution in response to a specific natural disaster, the May 8, 2009 "inland hurricane," which was later classified as a "Super Derecho."…

  11. Determination of the Underlying Task Scheduling Algorithm for an Ada Runtime System

    DTIC Science & Technology

    1989-12-01

    was also curious as to how well I could model the test cases with Ada programs. In particular, I wanted to see whether I could model the equal arrival...parameter relationships required to detect the execution of individual algorithms. These test cases were modeled using Ada programs. Then, the...results were analyzed to determine whether the Ada programs were capable of revealing the task scheduling algorithm used by the Ada run-time system. This

  12. Moles: Tool-Assisted Environment Isolation with Closures

    NASA Astrophysics Data System (ADS)

    de Halleux, Jonathan; Tillmann, Nikolai

    Isolating test cases from environment dependencies is often desirable, as it increases test reliability and reduces test execution time. However, code that calls non-virtual methods or consumes sealed classes is often impossible to test in isolation. Moles is a new lightweight framework which addresses this problem. For any .NET method, Moles allows test-code to provide alternative implementations, given as .NET delegates, for which C# provides very concise syntax while capturing local variables in a closure object. Using code instrumentation, the Moles framework will redirect calls to provided delegates instead of the original methods. The Moles framework is designed to work together with the dynamic symbolic execution tool Pex to enable automated test generation. In a case study, testing code programmed against the Microsoft SharePoint Foundation API, we achieved full code coverage while running tests in isolation without an actual SharePoint server. The Moles framework integrates with .NET and Visual Studio.

  13. Imprecise results: Utilizing partial computations in real-time systems

    NASA Technical Reports Server (NTRS)

    Lin, Kwei-Jay; Natarajan, Swaminathan; Liu, Jane W.-S.

    1987-01-01

    In real-time systems, a computation may not have time to complete its execution because of deadline requirements. In such cases, no result except the approximate results produced by the computations up to that point will be available. It is desirable to utilize these imprecise results if possible. Two approaches are proposed to enable computations to return imprecise results when executions cannot be completed normally. The milestone approach records results periodically, and if a deadline is reached, returns the last recorded result. The sieve approach demarcates sections of code which can be skipped if the time available is insufficient. By using these approaches, the system is able to produce imprecise results when deadlines are reached. The design of the Concord project is described which supports imprecise computations using these techniques. Also presented is a general model of imprecise computations using these techniques, as well as one which takes into account the influence of the environment, showing where the latter approach fits into this model.
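
    A minimal sketch of the milestone idea described above, under the assumption that the computation is an iterative refinement (here a pi series, purely for illustration): the loop records its current imprecise result periodically and returns the last recorded milestone if the deadline expires before normal completion.

```python
import time

def leibniz_pi_with_milestones(deadline_s, milestone_every=10_000):
    """Refine an estimate of pi; record a milestone periodically and return the
    last milestone if the deadline expires before the computation finishes."""
    start = time.monotonic()
    total, milestone = 0.0, 0.0
    k = 0
    while True:
        total += (-1.0) ** k / (2 * k + 1)
        k += 1
        if k % milestone_every == 0:
            milestone = 4.0 * total              # record the current imprecise result
            if time.monotonic() - start > deadline_s:
                return milestone, k              # deadline reached: return last milestone
        if k >= 50_000_000:                      # 'normal' completion bound for the demo
            return 4.0 * total, k

approx_pi, iterations = leibniz_pi_with_milestones(deadline_s=0.05)
print(f"imprecise result after {iterations} terms: {approx_pi:.6f}")
```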

  14. Component Framework for Loosely Coupled High Performance Integrated Plasma Simulations

    NASA Astrophysics Data System (ADS)

    Elwasif, W. R.; Bernholdt, D. E.; Shet, A. G.; Batchelor, D. B.; Foley, S.

    2010-11-01

    We present the design and implementation of a component-based simulation framework for the execution of coupled time-dependent plasma modeling codes. The Integrated Plasma Simulator (IPS) provides a flexible, lightweight component model that streamlines the integration of stand-alone codes into coupled simulations. Stand-alone codes are adapted to the IPS component interface specification using a thin wrapping layer implemented in the Python programming language. The framework provides services for inter-component method invocation, configuration, task and data management, asynchronous event management, simulation monitoring, and checkpoint/restart capabilities. Services are invoked, as needed, by the computational components to coordinate the execution of different aspects of coupled simulations on Massively Parallel Processing (MPP) machines. A common plasma state layer serves as the foundation for inter-component, file-based data exchange. The IPS design principles, implementation details, and execution model will be presented, along with an overview of several use cases.

  15. Meta-cognitive processes in executive control development: The case of reactive and proactive control

    PubMed Central

    Chevalier, Nicolas; Martis, Shaina Bailey; Curran, Tim; Munakata, Yuko

    2015-01-01

    Young children engage cognitive control reactively in response to events, rather than proactively preparing for events. Such limitations in executive control have been explained in terms of fundamental constraints on children’s cognitive capacities. Alternatively, young children might be capable of proactive control but differ from older children in their meta-cognitive decisions regarding when to engage proactive control. We examined these possibilities in three conditions of a task-switching paradigm, varying in whether task cues were available before or after target onset. Reaction times, ERPs, and pupil dilation showed that 5-year-olds did engage in advance preparation, a critical aspect of proactive control, but only when reactive control was made more difficult, whereas 10-year-olds engaged proactive control whenever possible. These findings highlight meta-cognitive processes in children’s cognitive control, an understudied aspect of executive control development. PMID:25603026

  16. Accelerated Monte Carlo Simulation on the Chemical Stage in Water Radiolysis using GPU

    PubMed Central

    Tian, Zhen; Jiang, Steve B.; Jia, Xun

    2018-01-01

    The accurate simulation of water radiolysis is an important step to understand the mechanisms of radiobiology and quantitatively test some hypotheses regarding radiobiological effects. However, the simulation of water radiolysis is highly time consuming, taking hours or even days to be completed by a conventional CPU processor. This time limitation hinders cell-level simulations for a number of research studies. We recently initiated efforts to develop gMicroMC, a GPU-based fast microscopic MC simulation package for water radiolysis. The first step of this project focused on accelerating the simulation of the chemical stage, the most time consuming stage in the entire water radiolysis process. A GPU-friendly parallelization strategy was designed to address the highly correlated many-body simulation problem caused by the mutual competitive chemical reactions between the radiolytic molecules. Two cases were tested, using a 750 keV electron and a 5 MeV proton incident in pure water, respectively. The time-dependent yields of all the radiolytic species during the chemical stage were used to evaluate the accuracy of the simulation. The relative differences between our simulation and the Geant4-DNA simulation were on average 5.3% and 4.4% for the two cases. Our package, executed on an Nvidia Titan black GPU card, successfully completed the chemical stage simulation of the two cases within 599.2 s and 489.0 s. As compared with Geant4-DNA that was executed on an Intel i7-5500U CPU processor and needed 28.6 h and 26.8 h for the two cases using a single CPU core, our package achieved a speed-up factor of 171.1-197.2. PMID:28323637

  17. Accelerated Monte Carlo simulation on the chemical stage in water radiolysis using GPU

    NASA Astrophysics Data System (ADS)

    Tian, Zhen; Jiang, Steve B.; Jia, Xun

    2017-04-01

    The accurate simulation of water radiolysis is an important step to understand the mechanisms of radiobiology and quantitatively test some hypotheses regarding radiobiological effects. However, the simulation of water radiolysis is highly time consuming, taking hours or even days to be completed by a conventional CPU processor. This time limitation hinders cell-level simulations for a number of research studies. We recently initiated efforts to develop gMicroMC, a GPU-based fast microscopic MC simulation package for water radiolysis. The first step of this project focused on accelerating the simulation of the chemical stage, the most time consuming stage in the entire water radiolysis process. A GPU-friendly parallelization strategy was designed to address the highly correlated many-body simulation problem caused by the mutual competitive chemical reactions between the radiolytic molecules. Two cases were tested, using a 750 keV electron and a 5 MeV proton incident in pure water, respectively. The time-dependent yields of all the radiolytic species during the chemical stage were used to evaluate the accuracy of the simulation. The relative differences between our simulation and the Geant4-DNA simulation were on average 5.3% and 4.4% for the two cases. Our package, executed on an Nvidia Titan black GPU card, successfully completed the chemical stage simulation of the two cases within 599.2 s and 489.0 s. As compared with Geant4-DNA that was executed on an Intel i7-5500U CPU processor and needed 28.6 h and 26.8 h for the two cases using a single CPU core, our package achieved a speed-up factor of 171.1-197.2.

  18. Accelerated Monte Carlo simulation on the chemical stage in water radiolysis using GPU.

    PubMed

    Tian, Zhen; Jiang, Steve B; Jia, Xun

    2017-04-21

    The accurate simulation of water radiolysis is an important step to understand the mechanisms of radiobiology and quantitatively test some hypotheses regarding radiobiological effects. However, the simulation of water radiolysis is highly time consuming, taking hours or even days to be completed by a conventional CPU processor. This time limitation hinders cell-level simulations for a number of research studies. We recently initiated efforts to develop gMicroMC, a GPU-based fast microscopic MC simulation package for water radiolysis. The first step of this project focused on accelerating the simulation of the chemical stage, the most time consuming stage in the entire water radiolysis process. A GPU-friendly parallelization strategy was designed to address the highly correlated many-body simulation problem caused by the mutual competitive chemical reactions between the radiolytic molecules. Two cases were tested, using a 750 keV electron and a 5 MeV proton incident in pure water, respectively. The time-dependent yields of all the radiolytic species during the chemical stage were used to evaluate the accuracy of the simulation. The relative differences between our simulation and the Geant4-DNA simulation were on average 5.3% and 4.4% for the two cases. Our package, executed on an Nvidia Titan black GPU card, successfully completed the chemical stage simulation of the two cases within 599.2 s and 489.0 s. As compared with Geant4-DNA that was executed on an Intel i7-5500U CPU processor and needed 28.6 h and 26.8 h for the two cases using a single CPU core, our package achieved a speed-up factor of 171.1-197.2.

  19. DALiuGE: A graph execution framework for harnessing the astronomical data deluge

    NASA Astrophysics Data System (ADS)

    Wu, C.; Tobar, R.; Vinsen, K.; Wicenec, A.; Pallot, D.; Lao, B.; Wang, R.; An, T.; Boulton, M.; Cooper, I.; Dodson, R.; Dolensky, M.; Mei, Y.; Wang, F.

    2017-07-01

    The Data Activated Liu Graph Engine (DALiuGE) is an execution framework for processing large astronomical datasets at a scale required by the Square Kilometre Array Phase 1 (SKA1). It includes an interface for expressing complex data reduction pipelines consisting of both datasets and algorithmic components and an implementation run-time to execute such pipelines on distributed resources. By mapping the logical view of a pipeline to its physical realisation, DALiuGE separates the concerns of multiple stakeholders, allowing them to collectively optimise large-scale data processing solutions in a coherent manner. The execution in DALiuGE is data-activated, where each individual data item autonomously triggers the processing on itself. Such decentralisation also makes the execution framework very scalable and flexible, supporting pipeline sizes ranging from less than ten tasks running on a laptop to tens of millions of concurrent tasks on the second fastest supercomputer in the world. DALiuGE has been used in production for reducing interferometry datasets from the Karl G. Jansky Very Large Array and the Mingantu Ultrawide Spectral Radioheliograph, and is being developed as the execution framework prototype for the Science Data Processor (SDP) consortium of the Square Kilometre Array (SKA) telescope. This paper presents a technical overview of DALiuGE and discusses case studies from the CHILES and MUSER projects that use DALiuGE to execute production pipelines. In a companion paper, we provide in-depth analysis of DALiuGE's scalability to very large numbers of tasks on two supercomputing facilities.

  20. Motor and Executive Control in Repetitive Timing of Brief Intervals

    ERIC Educational Resources Information Center

    Holm, Linus; Ullen, Fredrik; Madison, Guy

    2013-01-01

    We investigated the causal role of executive control functions in the production of brief time intervals by means of a concurrent task paradigm. To isolate the influence of executive functions on timing from motor coordination effects, we dissociated executive load from the number of effectors used in the dual task situation. In 3 experiments,…

  1. Kill a brand, keep a customer.

    PubMed

    Kumar, Nirmalya

    2003-12-01

    Most brands don't make much money. Year after year, businesses generate 80% to 90% of their profits from less than 20% of their brands. Yet most companies tend to ignore loss-making brands, unaware of the hidden costs they incur. That's because executives believe it's easy to erase a brand; they have only to stop investing in it, they assume, and it will die a natural death. But they're wrong. When companies drop brands clumsily, they antagonize loyal customers: Research shows that seven times out of eight, when firms merge two brands, the market share of the new brand never reaches the combined share of the two original ones. It doesn't have to be that way. Smart companies use a four-step process to kill brands methodically. First, CEOs make the case for rationalization by getting groups of senior executives to conduct joint audits of the brand portfolio. These audits make the need to prune brands apparent throughout the organization. In the next stage, executives need to decide how many brands will be retained, which they do either by setting broad parameters that all brands must meet or by identifying the brands they need in order to cater to all the customer segments in their markets. Third, executives must dispose of the brands they've decided to drop, deciding in each case whether it is appropriate to merge, sell, milk, or just eliminate the brand outright. Finally, it's critical that executives invest the resources they've freed to grow the brands they've retained. Done right, dropping brands will result in a company poised for new growth from the source where it's likely to be found--its profitable brands.

  2. When There is Tumult - The Ohio Army National Guard and Civil Disturbance Control, 1965 - 1970

    DTIC Science & Technology

    1982-03-15

    experienced 36 alerts for civil disturbances between 1965 and 1970. In almost all cases in which troops were committed to the streets, they halted...constitution is an instructive case study in the appropriateness and effectiveness of military organizations in executing the laws in a democracy. Three...time of state emergency and on the other to be prepared to become part of the active army upon the order of the President. The latter mission has

  3. A Case Study in Design Thinking Applied Through Aviation Mission Support Tactical Advancements for the Next Generation (TANG)

    DTIC Science & Technology

    2017-12-01

    This is an examination of the research, execution, and follow-on developments supporting the Design Thinking event explored through case study methods. Additionally, the lenses of...total there have been two Naval Postgraduate School (NPS) case study theses on U.S. Navy innovation events as well as other works examining the

  4. 12 CFR 701.14 - Change in official or senior executive officer in credit unions that are newly chartered or are...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Change in official or senior executive officer... OPERATION OF FEDERAL CREDIT UNIONS § 701.14 Change in official or senior executive officer in credit unions... senior executive staff. The regulation only applies in cases of newly chartered credit unions and credit...

  5. Modelling Limit Order Execution Times from Market Data

    NASA Astrophysics Data System (ADS)

    Kim, Adlar; Farmer, Doyne; Lo, Andrew

    2007-03-01

    Although the term ``liquidity'' is widely used in the finance literature, its meaning is only loosely defined and there is no quantitative measure for it. Generally, ``liquidity'' means an ability to quickly trade stocks without causing a significant impact on the stock price. From this definition, we identified two facets of liquidity: 1. the execution time of limit orders, and 2. the price impact of market orders. A limit order is an order to transact a prespecified number of shares at a prespecified price, which will not cause an immediate execution. A market order, on the other hand, is an order to transact a prespecified number of shares at the market price, which will cause an immediate execution but is subject to price impact. Therefore, when a stock is liquid, market participants experience quick limit order executions and small market order impacts. As a first step toward understanding market liquidity, we studied the facet of liquidity related to limit order executions -- execution times. In this talk, we propose a novel approach to modeling limit order execution times and show how they are affected by the size and price of orders. We used the q-Weibull distribution, a generalized form of the Weibull distribution that can control the fatness of the tail, to model limit order execution times.
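
    The q-Weibull density referred to above generalizes the Weibull with a parameter q that controls tail fatness (q -> 1 recovers the ordinary Weibull; 1 < q < 2 gives a power-law tail). The sketch below evaluates that density numerically; the parameter values are illustrative, not fitted to market data.

```python
import numpy as np

def q_weibull_pdf(x, q, beta, eta):
    """q-Weibull density: (2-q)(beta/eta)(x/eta)^(beta-1) [1-(1-q)(x/eta)^beta]^(1/(1-q)).
    For q -> 1 this reduces to the ordinary Weibull; 1 < q < 2 fattens the tail."""
    x = np.asarray(x, dtype=float)
    z = (x / eta) ** beta
    base = 1.0 - (1.0 - q) * z
    pdf = (2.0 - q) * (beta / eta) * (x / eta) ** (beta - 1.0) * base ** (1.0 / (1.0 - q))
    return np.where((x >= 0) & (base > 0), pdf, 0.0)

x = np.linspace(0.01, 50, 500)                               # e.g. execution time in seconds
fat = q_weibull_pdf(x, q=1.3, beta=0.9, eta=2.0)             # heavy-tailed case
ordinary = q_weibull_pdf(x, q=1.0001, beta=0.9, eta=2.0)     # effectively a plain Weibull
print(fat[-1] / ordinary[-1])                                # the q > 1 tail decays far more slowly
```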

  6. Predictive sufficiency and the use of stored internal state

    NASA Technical Reports Server (NTRS)

    Musliner, David J.; Durfee, Edmund H.; Shin, Kang G.

    1994-01-01

    In all embedded computing systems, some delay exists between sensing and acting. By choosing an action based on sensed data, a system is essentially predicting that there will be no significant changes in the world during this delay. However, the dynamic and uncertain nature of the real world can make these predictions incorrect, and thus, a system may execute inappropriate actions. Making systems more reactive by decreasing the gap between sensing and action leaves less time for predictions to err, but still provides no principled assurance that they will be correct. Using the concept of predictive sufficiency described in this paper, a system can prove that its predictions are valid, and that it will never execute inappropriate actions. In the context of our CIRCA system, we also show how predictive sufficiency allows a system to guarantee worst-case response times to changes in its environment. Using predictive sufficiency, CIRCA is able to build real-time reactive control plans which provide a sound basis for performance guarantees that are unavailable with other reactive systems.

  7. Age slowing down in detection and visual discrimination under varying presentation times.

    PubMed

    Moret-Tatay, Carmen; Lemus-Zúñiga, Lenin-Guillermo; Tortosa, Diana Abad; Gamermann, Daniel; Vázquez-Martínez, Andrea; Navarro-Pardo, Esperanza; Conejero, J Alberto

    2017-08-01

    Reaction time has been described as a measure of perception, decision making, and other cognitive processes. The aim of this work is to examine age-related changes in executive functions in terms of demand load under varying presentation times. Two tasks were employed, a signal detection task and a discrimination task, performed by young and older university students. Furthermore, the response time distribution was characterized with an ex-Gaussian fit. The results indicated that the older participants were slower than the younger ones in both signal detection and discrimination. Moreover, the differences between the two processes were larger for the older participants, who also showed a higher distribution average except at the lowest and highest presentation times. The results suggest a general age-related slowdown in both tasks across presentation times, except at the lowest and highest presentation times. Moreover, if these parameters are understood to reflect executive functions, these findings are consistent with the common view that age-related cognitive deficits involve a decline in this function. © 2017 Scandinavian Psychological Associations and John Wiley & Sons Ltd.
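
    The ex-Gaussian referred to above is the convolution of a Gaussian (mu, sigma) with an exponential (tau). Below is a minimal sketch of fitting it to reaction-time data using SciPy's exponnorm parameterization (K = tau/sigma); the simulated parameter values are illustrative.

```python
import numpy as np
from scipy.stats import exponnorm

# Simulate reaction times (seconds) from an ex-Gaussian: Gaussian(mu, sigma) + Exp(tau).
rng = np.random.default_rng(1)
mu, sigma, tau = 0.45, 0.05, 0.12
rts = rng.normal(mu, sigma, 2000) + rng.exponential(tau, 2000)

# SciPy parameterises the ex-Gaussian as exponnorm(K, loc, scale) with K = tau / sigma.
K_hat, loc_hat, scale_hat = exponnorm.fit(rts)
mu_hat, sigma_hat, tau_hat = loc_hat, scale_hat, K_hat * scale_hat
print(f"mu={mu_hat:.3f}  sigma={sigma_hat:.3f}  tau={tau_hat:.3f}")
```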

  8. Access to Presidential Materials.

    ERIC Educational Resources Information Center

    Tyler, John Edward

    The Supreme Court's decision regarding executive privilege in the case of the United States v. Richard Nixon focused on specifics and left the greater issues of executive privilege untouched. This report summarizes the events leading up to Nixon's confrontation with the Supreme Court and examines the future of executive privilege. Questions raised…

  9. Reduced variability and execution time to reach a target with a needle GPS system: Comparison between physicians, residents and nurse anaesthetists.

    PubMed

    Fevre, Marie-Cécile; Vincent, Caroline; Picard, Julien; Vighetti, Arnaud; Chapuis, Claire; Detavernier, Maxime; Allenet, Benoît; Payen, Jean-François; Bosson, Jean-Luc; Albaladejo, Pierre

    2018-02-01

    Ultrasound (US) guided needle positioning is safer than anatomical landmark techniques for central venous access. Hand-eye coordination and execution time depend on the professional's ability, previous training and personal skills. Needle guidance positioning systems (GPS) may theoretically reduce execution time and facilitate needle positioning in specific targets, thus improving patient comfort and safety. Three groups of healthcare professionals (41 anaesthesiologists and intensivists, 41 residents in anaesthesiology and intensive care, 39 nurse anaesthetists) were included and required to perform 3 tasks (positioning the tip of a needle in three different targets in a silicon phantom) using, successively, conventional US-guided needle positioning and a needle GPS. We measured the execution times to perform the tasks, hand-eye coordination and the number of repositioning occurrences or errors in handling the needle or the probe. Without the GPS system, we observed significant inter-individual differences in execution time (P<0.05), hand-eye coordination and the number of errors/needle repositionings between physicians, residents and nurse anaesthetists. US training and video gaming were found to be independent factors associated with a shorter execution time. Use of the GPS attenuated the inter-individual and group variability. We observed a reduced execution time and improved hand-eye coordination in all groups as compared to US without GPS. Neither US training, video gaming nor demographic personal or professional factors were found to be significantly associated with reduced execution time when GPS was used. US associated with GPS systems may improve safety and decrease execution time by reducing inter-individual variability between professionals for needle-handling procedures. Copyright © 2016 Société française d'anesthésie et de réanimation (Sfar). Published by Elsevier Masson SAS. All rights reserved.

  10. Cluster of Sound Speed Fields by an Integral Measure

    DTIC Science & Technology

    2010-06-01

    the same cost in time. Increasing the number of sensor depths does not cause execution time to increase. And finally assume that the time required...to be $P = Z - \int_{0}^{b} \frac{\partial C(\rho, \theta, \lambda)}{\partial \rho} \, \frac{\partial C(\rho, \theta, \lambda)}{\partial \rho} \, d\rho$ (2) where $(\rho, \theta, \lambda)$ are the usual geocentric spherical coordinates, and the limits of integration...but using spherical coordinates requires that the horizontal $(\theta, \lambda)$ terms be normalized by the radius. In the case of geocentric coordinates this

  11. A direct-execution parallel architecture for the Advanced Continuous Simulation Language (ACSL)

    NASA Technical Reports Server (NTRS)

    Carroll, Chester C.; Owen, Jeffrey E.

    1988-01-01

    A direct-execution parallel architecture for the Advanced Continuous Simulation Language (ACSL) is presented which overcomes the traditional disadvantages of simulations executed on a digital computer. The incorporation of parallel processing allows the mapping of simulations into a digital computer to be done in the same inherently parallel manner as they are currently mapped onto an analog computer. The direct-execution format maximizes the efficiency of the executed code since the need for a high level language compiler is eliminated. Resolution is greatly increased over that which is available with an analog computer without the sacrifice in execution speed normally expected with digital computer simulations. Although this report covers all aspects of the new architecture, key emphasis is placed on the processing element configuration and the microprogramming of the ACSL constructs. The execution times for all ACSL constructs are computed using a model of a processing element based on the AMD 29000 CPU and the AMD 29027 FPU. The increase in execution speed provided by parallel processing is exemplified by comparing the derived execution times of two ACSL programs with the execution times for the same programs executed on a similar sequential architecture.

  12. Modeling and executing electronic health records driven phenotyping algorithms using the NQF Quality Data Model and JBoss® Drools Engine.

    PubMed

    Li, Dingcheng; Endle, Cory M; Murthy, Sahana; Stancl, Craig; Suesse, Dale; Sottara, Davide; Huff, Stanley M; Chute, Christopher G; Pathak, Jyotishman

    2012-01-01

    With increasing adoption of electronic health records (EHRs), the need for formal representations for EHR-driven phenotyping algorithms has been recognized for some time. The recently proposed Quality Data Model from the National Quality Forum (NQF) provides an information model and a grammar that is intended to represent data collected during routine clinical care in EHRs as well as the basic logic required to represent the algorithmic criteria for phenotype definitions. The QDM is further aligned with Meaningful Use standards to ensure that the clinical data and algorithmic criteria are represented in a consistent, unambiguous and reproducible manner. However, phenotype definitions represented in QDM, while structured, cannot be executed readily on existing EHRs. Rather, human interpretation, and subsequent implementation is a required step for this process. To address this need, the current study investigates open-source JBoss® Drools rules engine for automatic translation of QDM criteria into rules for execution over EHR data. In particular, using Apache Foundation's Unstructured Information Management Architecture (UIMA) platform, we developed a translator tool for converting QDM defined phenotyping algorithm criteria into executable Drools rules scripts, and demonstrated their execution on real patient data from Mayo Clinic to identify cases for Coronary Artery Disease and Diabetes. To the best of our knowledge, this is the first study illustrating a framework and an approach for executing phenotyping criteria modeled in QDM using the Drools business rules management system.
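
    The Drools scripts themselves are not reproduced here. As a rough, language-agnostic analogue of what an executable phenotype rule does, the sketch below flags patients whose records satisfy a simple diagnosis-or-lab criterion; the codes, threshold and field names are hypothetical and are not the QDM criteria used in the study.

```python
from dataclasses import dataclass

@dataclass
class PatientRecord:
    patient_id: str
    diagnosis_codes: set          # e.g. diagnosis codes found in the EHR
    hba1c_values: list            # lab results, most recent last

def diabetes_rule(rec, dx_codes=frozenset({"E11.9"}), hba1c_threshold=6.5):
    """Illustrative executable phenotype rule (hypothetical criteria): the patient is a
    case if a qualifying diagnosis code is present OR the latest HbA1c meets the
    threshold."""
    has_dx = bool(rec.diagnosis_codes & dx_codes)
    has_lab = bool(rec.hba1c_values) and rec.hba1c_values[-1] >= hba1c_threshold
    return has_dx or has_lab

cohort = [
    PatientRecord("p1", {"E11.9"}, [7.2]),
    PatientRecord("p2", {"I10"}, [5.4]),
]
cases = [rec.patient_id for rec in cohort if diabetes_rule(rec)]
print(cases)      # -> ['p1']
```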

  13. Modeling and Executing Electronic Health Records Driven Phenotyping Algorithms using the NQF Quality Data Model and JBoss® Drools Engine

    PubMed Central

    Li, Dingcheng; Endle, Cory M; Murthy, Sahana; Stancl, Craig; Suesse, Dale; Sottara, Davide; Huff, Stanley M.; Chute, Christopher G.; Pathak, Jyotishman

    2012-01-01

    With increasing adoption of electronic health records (EHRs), the need for formal representations for EHR-driven phenotyping algorithms has been recognized for some time. The recently proposed Quality Data Model from the National Quality Forum (NQF) provides an information model and a grammar that is intended to represent data collected during routine clinical care in EHRs as well as the basic logic required to represent the algorithmic criteria for phenotype definitions. The QDM is further aligned with Meaningful Use standards to ensure that the clinical data and algorithmic criteria are represented in a consistent, unambiguous and reproducible manner. However, phenotype definitions represented in QDM, while structured, cannot be executed readily on existing EHRs. Rather, human interpretation, and subsequent implementation is a required step for this process. To address this need, the current study investigates open-source JBoss® Drools rules engine for automatic translation of QDM criteria into rules for execution over EHR data. In particular, using Apache Foundation’s Unstructured Information Management Architecture (UIMA) platform, we developed a translator tool for converting QDM defined phenotyping algorithm criteria into executable Drools rules scripts, and demonstrated their execution on real patient data from Mayo Clinic to identify cases for Coronary Artery Disease and Diabetes. To the best of our knowledge, this is the first study illustrating a framework and an approach for executing phenotyping criteria modeled in QDM using the Drools business rules management system. PMID:23304325

  14. Statistical fingerprinting for malware detection and classification

    DOEpatents

    Prowell, Stacy J.; Rathgeb, Christopher T.

    2015-09-15

    A system detects malware in a computing architecture with an unknown pedigree. The system includes a first computing device having a known pedigree and operating free of malware. The first computing device executes a series of instrumented functions that, when executed, provide a statistical baseline that is representative of the time it takes the software application to run on a computing device having a known pedigree. A second computing device executes a second series of instrumented functions that, when executed, provides an actual time that is representative of the time the known software application runs on the second computing device. The system detects malware when there is a difference in execution times between the first and the second computing devices.
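
    A minimal sketch of the timing-baseline idea described above: time the same instrumented function repeatedly on the reference machine, then flag the machine under test if its runtime deviates by more than a few baseline standard deviations. The workload, repetition count and threshold are illustrative assumptions, not the patented method's parameters.

```python
import statistics
import timeit

def instrumented_workload():
    """Stand-in for an instrumented function whose runtime is being fingerprinted."""
    return sum(i * i for i in range(50_000))

def timing_baseline(runs=30):
    # Collected on the machine with a known pedigree (assumed malware-free).
    samples = timeit.repeat(instrumented_workload, number=1, repeat=runs)
    return statistics.mean(samples), statistics.stdev(samples)

def looks_anomalous(observed_time, baseline_mean, baseline_std, k=4.0):
    """Flag the machine under test if its runtime deviates by more than k baseline sigmas."""
    return abs(observed_time - baseline_mean) > k * baseline_std

mean_t, std_t = timing_baseline()
suspect_t = timeit.timeit(instrumented_workload, number=1)   # measured on the machine under test
print(looks_anomalous(suspect_t, mean_t, std_t))
```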

  15. The Executive Process, Grade Eight. Resource Unit (Unit III).

    ERIC Educational Resources Information Center

    Minnesota Univ., Minneapolis. Project Social Studies Curriculum Center.

    This resource unit, developed by the University of Minnesota's Project Social Studies, introduces eighth graders to the executive process. The unit uses case studies of presidential decision making such as the decision to drop the atomic bomb on Hiroshima, the Cuba Bay of Pigs and quarantine decisions, and the Little Rock decision. A case study of…

  16. Mobile Innovations, Executive Functions, and Educational Developments in Conflict Zones: A Case Study from Palestine

    ERIC Educational Resources Information Center

    Buckner, Elizabeth; Kim, Paul

    2012-01-01

    Prior research suggests that exposure to conflict can negatively impact the development of executive functioning, which in turn can affect academic performance. Recognizing the need to better understand the potentially widespread executive function deficiencies among Palestinian students and to help develop educational resources targeted to youth…

  17. A performance comparison of the IBM RS/6000 and the Astronautics ZS-1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, W.M.; Abraham, S.G.; Davidson, E.S.

    1991-01-01

    Concurrent uniprocessor architectures, of which vector and superscalar are two examples, are designed to capitalize on fine-grain parallelism. The authors have developed a performance evaluation method for comparing and improving these architectures, and in this article they present the methodology and a detailed case study of two machines. The runtime of many programs is dominated by time spent in loop constructs - for example, Fortran Do-loops. Loops generally comprise two logical processes: The access process generates addresses for memory operations while the execute process operates on floating-point data. Memory access patterns typically can be generated independently of the data in the execute process. This independence allows the access process to slip ahead, thereby hiding memory latency. The IBM 360/91 was designed in 1967 to achieve slip dynamically, at runtime. One CPU unit executes integer operations while another handles floating-point operations. Other machines, including the VAX 9000 and the IBM RS/6000, use a similar approach.

  18. A case-comparison study of executive functions in alcohol-dependent adults with maternal history of alcoholism.

    PubMed

    Cottencin, Olivier; Nandrino, Jean-Louis; Karila, Laurent; Mezerette, Caroline; Danel, Thierry

    2009-04-01

    As executive dysfunctions frequently accompany alcohol dependence, we suggest that reports of executive dysfunction in alcoholics are actually due, in some cases, to a maternal history of alcohol misuse (MHA+). A history of maternal alcohol dependence increases the risk of prenatal alcohol exposure (PAE) for unborn children, and such exposure likely contributes to executive dysfunction in adult alcoholics. To assess this problem, we propose a case-comparison study of alcohol-dependent subjects with and without an MHA. Ten alcohol-dependent subjects with both a maternal history of alcoholism (MHA) and a paternal history of alcoholism (PHA) were matched with 10 alcohol-dependent subjects with only a paternal history of alcoholism (PHA). Executive functions (cancellation, Stroop, and trail-making A and B tests) and the presence of a history of three mental disorders (attention deficit hyperactivity disorder, violent behavior while intoxicated, and suicidal behavior) were evaluated in both populations. Alcohol-dependent subjects with an MHA showed a significant alteration in executive functions and significantly more disorders related to these functions than PHA subjects. The major measure of executive functioning deficit was the time taken to complete each test. Rates of ADHD and suicidality were found to be higher in MHA patients compared to the controls. A history of MHA, because of the high risk of PAE (in spite of potential confounding factors such as environment), must be scrupulously documented when evaluating mental and cognitive disorders in a general population of alcoholics to ensure a better identification of these disorders. It would be helpful to replicate the study with more subjects.

  19. Execution of a parallel edge-based Navier-Stokes solver on commodity graphics processor units

    NASA Astrophysics Data System (ADS)

    Corral, Roque; Gisbert, Fernando; Pueblas, Jesus

    2017-02-01

    The implementation of an edge-based three-dimensional Reynolds-averaged Navier-Stokes solver for unstructured grids able to run on multiple graphics processing units (GPUs) is presented. Loops over edges, which are the most time-consuming part of the solver, have been written to exploit the massively parallel capabilities of GPUs. Non-blocking communications between parallel processes and between the GPU and the central processing unit (CPU) have been used to enhance code scalability. The code is written using a mixture of C++ and OpenCL, to allow the execution of the source code on GPUs. The Message Passing Interface (MPI) library is used to allow the parallel execution of the solver on multiple GPUs. A comparative study of the solver parallel performance is carried out using a cluster of CPUs and another of GPUs. It is shown that a single GPU is up to 64 times faster than a single CPU core. The parallel scalability of the solver is mainly degraded by the loss of computing efficiency of the GPU when the size of the case decreases. However, for large enough grid sizes, the scalability is strongly improved. A cluster featuring commodity GPUs and a high bandwidth network is ten times less costly and consumes 33% less energy than a CPU-based cluster with equivalent computational power.

  20. ADHD symptoms and benefit from extended time testing accommodations.

    PubMed

    Lovett, Benjamin J; Leja, Ashley M

    2015-02-01

    To investigate the relationship between ADHD symptoms, executive functioning problems, and benefit from extended time testing accommodations. College students completed a battery of measures assessing processing speed and reading fluency, reading comprehension (under two different time limits), symptoms of ADHD, executive functioning deficits, and perceptions of need for extended time. Students reporting more symptoms of ADHD and executive functioning deficits actually benefited less from extended time, and students' perceptions of their timing needs did not predict benefit. Students with more ADHD symptoms are less likely to use extended time effectively, possibly because of their associated executive functioning problems. These results suggest there may be little justification for examining a student's ADHD symptoms when making extended time accommodation decisions. © 2013 SAGE Publications.

  1. 75 FR 47057 - Self-Regulatory Organizations; NASDAQ Stock Market LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-04

    ... Rule Change and Amendment Nos. 1 and 2 Thereto Relating to Fees for Routing to Away Markets July 29... assessed for options orders entered into NOM but routed to and executed on away markets (``routing fees... order router. Each time NOS routes to away markets NOS is charged a $0.06 clearing fee and, in the case...

  2. Generalized priority-queue network dynamics: Impact of team and hierarchy

    NASA Astrophysics Data System (ADS)

    Cho, Won-Kuk; Min, Byungjoon; Goh, K.-I.; Kim, I.-M.

    2010-06-01

    We study the effect of team and hierarchy on the waiting-time dynamics of priority-queue networks. To this end, we introduce generalized priority-queue network models incorporating interaction rules based on team-execution and hierarchy in decision making, respectively. It is numerically found that the waiting-time distribution exhibits a power law for long waiting times in both cases, yet with different exponents depending on the team size and the position of queue nodes in the hierarchy, respectively. The observed power-law behaviors have in many cases a corresponding single or pairwise-interacting queue dynamics, suggesting that the pairwise interaction may constitute a major dynamic consequence in the priority-queue networks. It is also found that the reciprocity of influence is a relevant factor for the priority-queue network dynamics.
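
    For readers unfamiliar with this class of models, the sketch below simulates the classic single priority queue (Barabási-style) that such generalized models build on: with probability p the highest-priority task is executed, otherwise a random one, which yields heavy-tailed waiting times. The parameters and the single-queue simplification are assumptions for illustration; the paper's team and hierarchy interaction rules are not reproduced here.

```python
import random

def simulate_priority_queue(steps=100_000, queue_len=2, p=0.999, seed=0):
    """Single fixed-length priority queue: each step, execute the highest-priority
    task with probability p (otherwise a random one) and record its waiting time."""
    rng = random.Random(seed)
    queue = [(rng.random(), 0) for _ in range(queue_len)]  # (priority, arrival step)
    waits = []
    for t in range(1, steps + 1):
        if rng.random() < p:
            idx = max(range(queue_len), key=lambda i: queue[i][0])
        else:
            idx = rng.randrange(queue_len)
        waits.append(t - queue[idx][1])
        queue[idx] = (rng.random(), t)  # the executed task is replaced by a new one
    return waits

if __name__ == "__main__":
    waits = simulate_priority_queue()
    long_waits = sum(1 for w in waits if w > 100)
    print("fraction of waits longer than 100 steps:", long_waits / len(waits))
```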

  3. "Carbon Credits" for Resource-Bounded Computations Using Amortised Analysis

    NASA Astrophysics Data System (ADS)

    Jost, Steffen; Loidl, Hans-Wolfgang; Hammond, Kevin; Scaife, Norman; Hofmann, Martin

    Bounding resource usage is important for a number of areas, notably real-time embedded systems and safety-critical systems. In this paper, we present a fully automatic static type-based analysis for inferring upper bounds on resource usage for programs involving general algebraic datatypes and full recursion. Our method can easily be used to bound any countable resource, without needing to revisit proofs. We apply the analysis to the important metrics of worst-case execution time, stack- and heap-space usage. Our results from several realistic embedded control applications demonstrate good matches between our inferred bounds and measured worst-case costs for heap and stack usage. For time usage we infer good bounds for one application. Where we obtain less tight bounds, this is due to the use of software floating-point libraries.

  4. The Power of ROFO Principle Together with Companywide Training in Executing Lean Production Strategy

    ERIC Educational Resources Information Center

    Goh, Ah Bee; Chakpitak, Nopasit; Sureephong, Pradorn

    2015-01-01

    This paper reports the findings of the case study conducted at Schaffner Thailand (ST) factory regarding the application of the ROFO principle coupled with companywide training on the execution of Lean Production (LP) strategy. The case study was motivated by 3 main objectives: 1) to examine the effectiveness of the ROFO principle and companywide…

  5. The use of executed Nazi victims in anatomy: Findings from the Institute of Anatomy at Gießen University, pre- and post-1945.

    PubMed

    Oehler-Klein, Sigrid; Preuss, Dirk; Roelcke, Volker

    2012-06-01

    There is increasing evidence that both during the time of National Socialism and in the post-World War II period, the corpses of executed victims of the Nazi regime, as well as body parts taken from them, were used for teaching and research purposes in German anatomical institutes. The paper addresses the related issues by looking at the case of the Institute of Anatomy at Gießen University, whose director, Ferdinand Wagenseil, is documented to have had certain political reservations towards the Nazi regime, but at the same time used the situation to get access to more corpses, most likely for teaching purposes. On a second level, new archival sources are used to explore to what extent corpses and body parts of Nazi victims were used in the post-WW II period. One central aim in this context is the reconstruction of the identities of these victims for the purpose of acknowledging the atrocities committed against them, appropriate remembrance, and possibly enabling the respectful burial of the remaining body parts. Further, the case raises the question of how anatomists during and after the Nazi period justified for themselves the use of corpses from executed political prisoners, and what might be potential explanations for their reasoning. The historical evidence documents an attitude and value hierarchy which is aware of the disregard of dignity or human rights in the case of the Nazi victims, but which perceives this disregard as of minor relevance compared to the needs of medical teaching or medical research. It is argued that this mental attitude is not specific to the Nazi period, but that it has been brought to an extreme manifestation in this specific context. Copyright © 2012 Elsevier GmbH. All rights reserved.

  6. Biomedical engineers and participation in judicial executions: capital punishment as a technical problem.

    PubMed

    Doyle, John

    2007-01-01

    This paper discusses the topic of judicial execution from the perspective of the intersection of the technological issues and the professional ethics issues. Although physicians are generally ethically forbidden from any involvement in the judicial execution process, this does not appear to be the case for engineering professionals. This creates an interesting but controversial opportunity for the engineering community (especially biomedical engineers) to improve the humaneness and reliability of the judicial execution process.

  7. Legal consequences for torture in children cases: the Gomez Paquiyauri Brothers vs Peru case.

    PubMed

    Tinta, Monica Feria

    2009-01-01

    The Gomez Paquiyauri Brothers case, before the Inter-American Court of Human Rights, was the first international case concerning the protection of children in the context of armed conflict in which an international court stated the law concerning the duties of States towards children even in the context of war, and provided for reparations. As such it represents a landmark decision. The case arose from the illegal detention, torture and extrajudicial execution of two minors, Emilio and Rafael Gomez Paquiyauri, at the hands of the Peruvian police in 1991, under the Fujimori administration, at a time when the internal war in Peru was at its peak. Unlike most cases coming to the jurisdiction of the Inter-American Court, the case had been subject to domestic criminal investigations that had led to the convictions of two low-ranking policemen. Yet a more subtle pattern of impunity lay at the root of the case. Torture had been denied by the State, and the prosecutions of low-ranking policemen were intended to cover up the responsibility of those who ordered a policy of torture and executions (including the existence of secret codes for the torture and elimination of suspects of "terrorism") during the years of the internal armed conflict in Peru. The joint work of legal and medical experts in the litigation of the case permitted the establishment of the facts and the law, obtaining an award of 740,500 dollars for the victims and a number of measures of reparation, including guarantees of non-repetition and satisfaction, such as the naming of a school after the victims.

  8. Resource utilization model for the algorithm to architecture mapping model

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.; Patel, Rakesh R.

    1993-01-01

    The analytical model for resource utilization and the variable node time and conditional node model for the enhanced ATAMM model for a real-time data flow architecture are presented in this research. The Algorithm To Architecture Mapping Model, ATAMM, is a Petri net based graph theoretic model developed at Old Dominion University, and is capable of modeling the execution of large-grained algorithms on a real-time data flow architecture. Using the resource utilization model, the resource envelope may be obtained directly from a given graph and, consequently, the maximum number of required resources may be evaluated. The node timing diagram for one iteration period may be obtained using the analytical resource envelope. The variable node time model, which describes the change in resource requirement for the execution of an algorithm under node time variation, is useful to expand the applicability of the ATAMM model to heterogeneous architectures. The model also describes a method of detecting the presence of resource limited mode and its subsequent prevention. Graphs with conditional nodes are shown to be reduced to equivalent graphs with time varying nodes and, subsequently, may be analyzed using the variable node time model to determine resource requirements. Case studies are performed on three graphs for the illustration of applicability of the analytical theories.

  9. A Case Study for Executive Leadership

    ERIC Educational Resources Information Center

    Hill, Phyllis J.

    1975-01-01

    A newly appointed woman dean discusses the value of a management development program involving a process of self-analysis and self-determination of leadership style and effectiveness (the University of Illinois "Executive Leadership Seminar"). (JT)

  10. A Study on the Effectiveness of Lockup-Free Caches for a Reduced Instruction Set Computer (RISC) Processor

    DTIC Science & Technology

    1992-09-01

    to acquire or develop effective simulation tools to observe the behavior of a RISC implementation as it executes different types of programs. We choose... Performance: Computer performance is measured by the amount of time required to execute a program. Performance encompasses two types of time, elapsed time... and CPU time. Elapsed time is the time required to execute a program from start to finish. It includes latency of input/output activities such as

  11. Associations between daily physical activity and executive functioning in primary school-aged children.

    PubMed

    van der Niet, Anneke G; Smith, Joanne; Scherder, Erik J A; Oosterlaan, Jaap; Hartman, Esther; Visscher, Chris

    2015-11-01

    While there is some evidence that aerobic fitness is positively associated with executive functioning in children, evidence for a relation between children's daily physical activity and their executive functioning is limited. The objective was to examine associations between objectively measured daily physical activity (total volume, sedentary behavior, moderate to vigorous physical activity) and executive functioning in children. Cross-sectional. Eighty primary school children (36 boys, 44 girls) aged 8-12 years old participated in the study. Physical activity was measured using accelerometers. Executive functions measured included inhibition (Stroop test), working memory (Visual Memory Span test), cognitive flexibility (Trailmaking test), and planning (Tower of London). Total volume of physical activity, time spent in sedentary behavior and moderate to vigorous physical activity were calculated and related to performance on executive functioning. More time spent in sedentary behavior was related to worse inhibition (r = -0.24). A higher total volume of physical activity was associated with better planning ability, as reflected by both a higher score on the Tower of London (r = 0.24) and a shorter total execution time (r = -0.29). Also, a significant moderate correlation was found between time spent in moderate to vigorous physical activity and the total execution time of the Tower of London (r = -0.29). Children should limit time spent in sedentary behavior and increase their total physical activity. Total volume of physical activity, which consisted mostly of light-intensity physical activity, is related to executive functioning. This opens up new possibilities to explore both the quantity and quality of physical activity in relation to cognition in children. Copyright © 2014 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  12. Less-structured time in children's daily lives predicts self-directed executive functioning.

    PubMed

    Barker, Jane E; Semenov, Andrei D; Michaelson, Laura; Provan, Lindsay S; Snyder, Hannah R; Munakata, Yuko

    2014-01-01

    Executive functions (EFs) in childhood predict important life outcomes. Thus, there is great interest in attempts to improve EFs early in life. Many interventions are led by trained adults, including structured training activities in the lab, and less-structured activities implemented in schools. Such programs have yielded gains in children's externally-driven executive functioning, where they are instructed on what goal-directed actions to carry out and when. However, it is less clear how children's experiences relate to their development of self-directed executive functioning, where they must determine on their own what goal-directed actions to carry out and when. We hypothesized that time spent in less-structured activities would give children opportunities to practice self-directed executive functioning, and lead to benefits. To investigate this possibility, we collected information from parents about their 6-7 year-old children's daily, annual, and typical schedules. We categorized children's activities as "structured" or "less-structured" based on categorization schemes from prior studies on child leisure time use. We assessed children's self-directed executive functioning using a well-established verbal fluency task, in which children generate members of a category and can decide on their own when to switch from one subcategory to another. The more time that children spent in less-structured activities, the better their self-directed executive functioning. The opposite was true of structured activities, which predicted poorer self-directed executive functioning. These relationships were robust (holding across increasingly strict classifications of structured and less-structured time) and specific (time use did not predict externally-driven executive functioning). We discuss implications, caveats, and ways in which potential interpretations can be distinguished in future work, to advance an understanding of this fundamental aspect of growing up.

  13. Less-structured time in children's daily lives predicts self-directed executive functioning

    PubMed Central

    Barker, Jane E.; Semenov, Andrei D.; Michaelson, Laura; Provan, Lindsay S.; Snyder, Hannah R.; Munakata, Yuko

    2014-01-01

    Executive functions (EFs) in childhood predict important life outcomes. Thus, there is great interest in attempts to improve EFs early in life. Many interventions are led by trained adults, including structured training activities in the lab, and less-structured activities implemented in schools. Such programs have yielded gains in children's externally-driven executive functioning, where they are instructed on what goal-directed actions to carry out and when. However, it is less clear how children's experiences relate to their development of self-directed executive functioning, where they must determine on their own what goal-directed actions to carry out and when. We hypothesized that time spent in less-structured activities would give children opportunities to practice self-directed executive functioning, and lead to benefits. To investigate this possibility, we collected information from parents about their 6–7 year-old children's daily, annual, and typical schedules. We categorized children's activities as “structured” or “less-structured” based on categorization schemes from prior studies on child leisure time use. We assessed children's self-directed executive functioning using a well-established verbal fluency task, in which children generate members of a category and can decide on their own when to switch from one subcategory to another. The more time that children spent in less-structured activities, the better their self-directed executive functioning. The opposite was true of structured activities, which predicted poorer self-directed executive functioning. These relationships were robust (holding across increasingly strict classifications of structured and less-structured time) and specific (time use did not predict externally-driven executive functioning). We discuss implications, caveats, and ways in which potential interpretations can be distinguished in future work, to advance an understanding of this fundamental aspect of growing up. PMID:25071617

  14. Assisting Movement Training and Execution With Visual and Haptic Feedback.

    PubMed

    Ewerton, Marco; Rother, David; Weimar, Jakob; Kollegger, Gerrit; Wiemeyer, Josef; Peters, Jan; Maeda, Guilherme

    2018-01-01

    In the practice of motor skills in general, errors in the execution of movements may go unnoticed when a human instructor is not available. In this case, a computer system or robotic device able to detect movement errors and propose corrections would be of great help. This paper addresses the problem of how to detect such execution errors and how to provide feedback to the human to correct his/her motor skill using a general, principled methodology based on imitation learning. The core idea is to compare the observed skill with a probabilistic model learned from expert demonstrations. The intensity of the feedback is regulated by the likelihood of the model given the observed skill. Based on demonstrations, our system can, for example, detect errors in the writing of characters with multiple strokes. Moreover, by using a haptic device, the Haption Virtuose 6D, we demonstrate a method to generate haptic feedback based on a distribution over trajectories, which could be used as an auxiliary means of communication between an instructor and an apprentice. Additionally, given a performance measurement, the haptic device can help the human discover and perform better movements to solve a given task. In this case, the human first tries a few times to solve the task without assistance. Our framework, in turn, uses a reinforcement learning algorithm to compute haptic feedback, which guides the human toward better solutions.
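
    The core idea of scaling feedback by the likelihood of the observed movement under a model learned from demonstrations can be sketched as follows. This simplified Python example fits an independent per-timestep Gaussian over demonstrated trajectories (the paper uses richer probabilistic movement models) and returns a corrective signal that grows as the observation becomes less likely; the function names and the one-dimensional setup are assumptions.

```python
import numpy as np

def fit_trajectory_model(demos):
    """demos: array of shape (n_demos, n_timesteps), one DOF of expert trajectories.
    Returns per-timestep mean and standard deviation as a simple probabilistic model."""
    demos = np.asarray(demos, dtype=float)
    return demos.mean(axis=0), demos.std(axis=0) + 1e-6

def corrective_feedback(observed, mean, std, max_gain=5.0):
    """Feedback pointing back toward the demonstrated mean, scaled by how unlikely
    the observation is under the per-timestep Gaussian (larger deviation -> stronger)."""
    observed = np.asarray(observed, dtype=float)
    gain = np.clip(np.abs(observed - mean) / std, 0.0, max_gain)
    return gain * (mean - observed)

if __name__ == "__main__":
    demos = [[0.0, 0.5, 1.0], [0.1, 0.6, 0.9], [0.0, 0.4, 1.1]]
    mean, std = fit_trajectory_model(demos)
    print(corrective_feedback([0.3, 0.5, 0.2], mean, std))
```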

  15. Executive functioning complaints and escitalopram treatment response in late-life depression.

    PubMed

    Manning, Kevin J; Alexopoulos, George S; Banerjee, Samprit; Morimoto, Sarah Shizuko; Seirup, Joanna K; Klimstra, Sibel A; Yuen, Genevieve; Kanellopoulos, Theodora; Gunning-Dixon, Faith

    2015-05-01

    Executive dysfunction may play a key role in the pathophysiology of late-life depression. Executive dysfunction can be assessed with cognitive tests and subjective report of difficulties with executive skills. The present study investigated the association between subjective report of executive functioning complaints and time to escitalopram treatment response in older adults with major depressive disorder (MDD). 100 older adults with MDD (58 with executive functioning complaints and 42 without executive functioning complaints) completed a 12-week trial of escitalopram. Treatment response over 12 weeks, as measured by repeated Hamilton Depression Rating Scale scores, was compared for adults with and without executive complaints using mixed-effects modeling. Mixed effects analysis revealed a significant group × time interaction, F(1, 523.34) = 6.00, p = 0.01. Depressed older adults who reported executive functioning complaints at baseline demonstrated a slower response to escitalopram treatment than those without executive functioning complaints. Self-report of executive functioning difficulties may be a useful prognostic indicator for subsequent speed of response to antidepressant medication. Copyright © 2015 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.

  16. Time-on-task effects in children with and without ADHD: depletion of executive resources or depletion of motivation?

    PubMed

    Dekkers, Tycho J; Agelink van Rentergem, Joost A; Koole, Alette; van den Wildenberg, Wery P M; Popma, Arne; Bexkens, Anika; Stoffelsen, Reino; Diekmann, Anouk; Huizenga, Hilde M

    2017-12-01

    Children with attention-deficit/hyperactivity disorder (ADHD) are characterized by deficits in their executive functioning and motivation. In addition, these children are characterized by a decline in performance as time-on-task increases (i.e., time-on-task effects). However, it is unknown whether these time-on-task effects should be attributed to deficits in executive functioning or to deficits in motivation. Some studies in typically developing (TD) adults indicated that time-on-task effects should be interpreted as depletion of executive resources, but other studies suggested that they represent depletion of motivation. We, therefore, investigated, in children with and without ADHD, whether there were time-on-task effects on executive functions, such as inhibition and (in)attention, and whether these were best explained by depletion of executive resources or depletion of motivation. The stop-signal task (SST), which generates both indices of inhibition (stop-signal reaction time) and attention (reaction time variability and errors), was administered in 96 children (42 ADHD, 54 TD controls; aged 9-13). To differentiate between depletion of resources and depletion of motivation, the SST was administered twice. Half of the participants were reinforced during second task performance, potentially counteracting depletion of motivation. Multilevel analyses indicated that children with ADHD were more affected by time-on-task than controls on two measures of inattention, but not on inhibition. In the ADHD group, reinforcement only improved performance on one index of attention (i.e., reaction time variability). The current findings suggest that time-on-task effects in children with ADHD occur specifically in the attentional domain, and seem to originate in both depletion of executive resources and depletion of motivation. Clinical implications for diagnostics, psycho-education, and intervention are discussed.

  17. Exploring the Effect of Video Used to Enhance the Retrospective Verbal Protocol Analysis: A Multiple Case Study

    ERIC Educational Resources Information Center

    Monroe, Steven D.

    2012-01-01

    The purpose of this study was to explore how the use of video in the cognitive task analysis (CTA) retrospective verbal protocol analysis (RVPA) during a job analysis affects: (a) the quality of performing the CTA, (b) the time to complete the CTA, and (c) the cost to execute the CTA. Research has shown when using the simultaneous VPA during a CTA…

  18. No Longer Children: Case Studies of the Living and Working Conditions of the Youth Who Harvest America's Crops. Executive Summary. Revised Edition.

    ERIC Educational Resources Information Center

    Aguirre International, San Mateo, CA.

    This report examines the living and working conditions of adolescent migrant farmworkers. Interviews were conducted with 216 youth working during peak harvest time in six states, as well as with adult farmworkers, family members of working youth, and farm labor contractors. Most of the youth were 14-17 years old, although a few had begun work as…

  19. Manufacturing Methods & Technology Project Execution Report. First CY 83.

    DTIC Science & Technology

    1983-11-01

    OCCURRENCE. H 83 5180 MMT FOR METAL DEWAR AND UNBONDED LEADS THE GOLD WIRE BONDED CONNECTIONS ARE MADE BY HAND WHICH IS A TEDIOUS AND EXPENSIVE PROCESS. THE...ATTACHMENTS CURRENT FILAMENT WOUND COMPOSITE ROCKET MOTOR CASES REQUIRE FORGED METAL POLE PIECES, NOZZLE CLOSURE ATTACHMENT RINGS, AND OTHER ATTACHMENT RINGS... ELASTOMER INSULATOR PROCESS LARGE TACTICAL ROCKET MOTOR INSULATORS ARE COSTLY, LACK DESIGN CHANGE FLEXIBILITY AND SUFFER LONG LEAD TIMES. CURRENT

  20. Resource conflict detection and removal strategy for nondeterministic emergency response processes using Petri nets

    NASA Astrophysics Data System (ADS)

    Zeng, Qingtian; Liu, Cong; Duan, Hua

    2016-09-01

    Correctness of an emergency response process specification is critical to emergency mission success. Therefore, errors in the specification should be detected and corrected at build-time. In this paper, we propose a resource conflict detection approach and removal strategy for emergency response processes constrained by resources and time. In this kind of emergency response process, there are two timing functions representing the minimum and maximum execution time for each activity, respectively, and many activities require resources to be executed. Based on the RT_ERP_Net, a resource- and time-constrained Petri net model of the emergency response process, the earliest time to start each activity and the ideal execution time of the process can be obtained. To detect and remove the resource conflicts in the process, conflict detection algorithms and a priority-activity-first resolution strategy are given. In this way, the real execution time for each activity is obtained and a conflict-free RT_ERP_Net is constructed by adding virtual activities. Experiments show that the proposed resolution strategy can substantially shorten the execution time of the whole process.
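
    The sketch below illustrates the general build-time idea of detecting and removing resource conflicts, using plain interval arithmetic rather than the paper's RT_ERP_Net Petri net formalism; the activity structure, capacities, and the priority-first delay rule are illustrative assumptions.

```python
from itertools import combinations

def overlaps(a, b):
    """Two activities conflict in time if their planned execution intervals intersect."""
    return a["start"] < b["end"] and b["start"] < a["end"]

def detect_resource_conflicts(activities, capacity):
    """Report pairs of overlapping activities whose combined demand on a shared
    resource exceeds that resource's capacity."""
    conflicts = []
    for a, b in combinations(activities, 2):
        if not overlaps(a, b):
            continue
        for res in set(a["needs"]) & set(b["needs"]):
            if a["needs"][res] + b["needs"][res] > capacity.get(res, 0):
                conflicts.append((a["name"], b["name"], res))
    return conflicts

def resolve_priority_first(activities, conflicts):
    """Priority-activity-first removal: delay the lower-priority activity of each
    conflicting pair until the higher-priority one has finished."""
    by_name = {a["name"]: a for a in activities}
    for n1, n2, _ in conflicts:
        low, high = sorted((by_name[n1], by_name[n2]), key=lambda a: a["priority"])
        shift = high["end"] - low["start"]
        low["start"] += shift
        low["end"] += shift
    return activities

if __name__ == "__main__":
    acts = [
        {"name": "evacuate", "start": 0, "end": 4, "needs": {"truck": 2}, "priority": 2},
        {"name": "deliver",  "start": 2, "end": 6, "needs": {"truck": 2}, "priority": 1},
    ]
    found = detect_resource_conflicts(acts, {"truck": 3})
    print(found)
    print(resolve_priority_first(acts, found))
```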

  1. Estimation Accuracy on Execution Time of Run-Time Tasks in a Heterogeneous Distributed Environment.

    PubMed

    Liu, Qi; Cai, Weidong; Jin, Dandan; Shen, Jian; Fu, Zhangjie; Liu, Xiaodong; Linge, Nigel

    2016-08-30

    Distributed computing has achieved tremendous development since cloud computing was proposed in 2006, and has played a vital role in promoting the rapid growth of data collection and analysis models, e.g., the Internet of Things, Cyber-Physical Systems, Big Data analytics, etc. Hadoop has become a data convergence platform for sensor networks. As one of its core components, MapReduce facilitates allocating, processing and mining of collected large-scale data, where speculative execution strategies help solve straggler problems. However, there is still no efficient solution for accurately estimating the execution time of run-time tasks, which can affect task allocation and distribution in MapReduce. In this paper, task execution data have been collected and employed for the estimation. A two-phase regression (TPR) method is proposed to predict the finishing time of each task accurately. Detailed data on each task have been collected and examined in a detailed analysis report. According to the results, the prediction accuracy of concurrent tasks' execution time can be improved, in particular for some regular jobs.
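
    A hedged sketch of the two-phase idea is given below: fit the relationship between task progress and elapsed time separately for an early and a late phase, and extrapolate the current phase to completion. The 0.5 breakpoint, the linear model, and the input format are assumptions; the paper's actual TPR formulation may differ.

```python
import numpy as np

def fit_phase(times, progress):
    """Least-squares fit of elapsed time as a linear function of task progress."""
    A = np.vstack([progress, np.ones_like(progress)]).T
    slope, intercept = np.linalg.lstsq(A, times, rcond=None)[0]
    return slope, intercept

def predict_finish_time(times, progress, breakpoint=0.5):
    """Two-phase style estimate: prefer a fit over samples past the breakpoint,
    fall back to all samples, then extrapolate to progress = 1.0."""
    times = np.asarray(times, dtype=float)
    progress = np.asarray(progress, dtype=float)
    late = progress >= breakpoint
    if late.sum() >= 2:
        slope, intercept = fit_phase(times[late], progress[late])
    else:
        slope, intercept = fit_phase(times, progress)
    return slope * 1.0 + intercept

if __name__ == "__main__":
    progress = [0.1, 0.2, 0.3, 0.4, 0.55, 0.7, 0.8]
    elapsed = [4.0, 8.0, 12.0, 16.0, 24.0, 33.0, 39.0]  # the task slows down late
    print("predicted finishing time:", round(predict_finish_time(elapsed, progress), 1))
```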

  2. Effective Vectorization with OpenMP 4.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huber, Joseph N.; Hernandez, Oscar R.; Lopez, Matthew Graham

    This paper describes how the Single Instruction Multiple Data (SIMD) model and its extensions in OpenMP work, and how these are implemented in different compilers. Modern processors are highly parallel computational machines which often include multiple processors capable of executing several instructions in parallel. Understanding SIMD and executing instructions in parallel allows the processor to achieve higher performance without increasing the power required to run it. SIMD instructions can significantly reduce the runtime of code by executing a single operation on large groups of data. The SIMD model is so integral to the processor's potential performance that, if SIMD is not utilized, less than half of the processor is ever actually used. Unfortunately, using SIMD instructions is a challenge in higher level languages because most programming languages do not have a way to describe them. Most compilers are capable of vectorizing code by using the SIMD instructions, but there are many code features important for SIMD vectorization that the compiler cannot determine at compile time. OpenMP attempts to solve this by extending the C++/C and Fortran programming languages with compiler directives that express SIMD parallelism. OpenMP is used to pass hints to the compiler about the code to be executed in SIMD. This is a key resource for making optimized code, but it does not change whether or not the code can use SIMD operations. However, in many cases critical functions are limited by a poor understanding of how SIMD instructions are actually implemented, as SIMD can be implemented through vector instructions or simultaneous multi-threading (SMT). We have found that it is often the case that code cannot be vectorized, or is vectorized poorly, because the programmer does not have sufficient knowledge of how SIMD instructions work.

  3. Your scarcest resource.

    PubMed

    Mankins, Michael; Brahm, Chris; Caimi, Gregory

    2014-05-01

    Most companies have elaborate procedures for managing capital. They require a compelling business case for any new capital investment. They set hurdle rates. They delegate authority carefully, prescribing spending limits for each level. An organization's time, by contrast, goes largely unmanaged. Bain & Company, with which all three authors are associated, used innovative people analytics tools to examine the time budgets of 17 large corporations. It discovered that companies are awash in e-communications; meeting time has skyrocketed; real collaboration is limited; dysfunctional meeting behavior is on the rise; formal controls are rare; and the consequences of all this are few. The authors outline eight practices for managing organizational time. Among them are: Make meeting agendas clear and selective; create a zero-based time budget; require business cases for all initiatives; and standardize the decision process. Some forward-thinking companies bring as much discipline to their time budgets as to their capital budgets. As a result, they have liberated countless hours of previously unproductive time for executives and employees, fueling innovation and accelerating profitable growth.

  4. The Development of Metaphor Comprehension and Its Relationship with Relational Verbal Reasoning and Executive Function.

    PubMed

    Carriedo, Nuria; Corral, Antonio; Montoro, Pedro R; Herrero, Laura; Ballestrino, Patricia; Sebastián, Iraia

    2016-01-01

    Our main objective was to analyse the different contributions of relational verbal reasoning (analogical and class inclusion) and executive functioning to metaphor comprehension across development. We postulated that both relational reasoning and executive functioning should predict individual and developmental differences. However, executive functioning would become increasingly involved when metaphor comprehension is highly demanding, either because of the metaphors' high difficulty (relatively novel metaphors in the absence of a context) or because of the individual's special processing difficulties, such as low levels of reading experience or low semantic knowledge. Three groups of participants, 11-year-olds, 15-year-olds and young adults, were assessed in different relational verbal reasoning tasks (analogical and class inclusion) and in executive functioning tasks (updating information in working memory, inhibition, and shifting). The results revealed clear progress in metaphor comprehension between ages 11 and 15 and between ages 15 and 21. However, the importance of executive function in metaphor comprehension was evident by age 15 and was restricted to updating information in working memory and cognitive inhibition. Participants seemed to use two different strategies to interpret metaphors: relational verbal reasoning and executive functioning. This was clearly shown when comparing the performance of the "more efficient" participants in metaphor interpretation with that of the "less efficient" ones. Whereas in the first case none of the executive variables or those associated with relational verbal reasoning were significantly related to metaphor comprehension, in the latter case both groups of variables had a clear predictive effect.

  5. Machine learning based job status prediction in scientific clusters

    DOE PAGES

    Yoo, Wucherl; Sim, Alex; Wu, Kesheng

    2016-09-01

    Large high-performance computing systems are built with an increasing number of components, with more CPU cores, more memory, and more storage space. At the same time, scientific applications have been growing in complexity. Together, they are leading to more frequent unsuccessful job statuses on HPC systems. From measured job statuses, 23.4% of CPU time was spent on unsuccessful jobs. Here, we set out to study whether these unsuccessful job statuses could be anticipated from known job characteristics. To explore this possibility, we have developed a job status prediction method for the execution of jobs on scientific clusters. The Random Forests algorithm was applied to extract and characterize the patterns of unsuccessful job statuses. Experimental results show that our method can predict unsuccessful job statuses from the monitored ongoing job executions in 99.8% of the cases, with 83.6% recall and 94.8% precision. Lastly, this prediction accuracy can be sufficiently high that it can be used in mitigation procedures for predicted failures.
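
    The classification step can be sketched in a few lines with scikit-learn. The features below are synthetic stand-ins for monitored job attributes (the paper's actual feature set and data are not reproduced), so the printed precision and recall are illustrative only.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import precision_score, recall_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in features for monitored jobs (e.g., requested cores, memory,
# queue wait, runtime so far); label 1 marks an unsuccessful job.
rng = np.random.default_rng(0)
X = rng.random((5000, 4))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0.0, 0.1, 5000) > 1.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
predicted = model.predict(X_test)
print("recall:", round(recall_score(y_test, predicted), 3),
      "precision:", round(precision_score(y_test, predicted), 3))
```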

  6. Machine learning based job status prediction in scientific clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Wucherl; Sim, Alex; Wu, Kesheng

    Large high-performance computing systems are built with an increasing number of components, with more CPU cores, more memory, and more storage space. At the same time, scientific applications have been growing in complexity. Together, they are leading to more frequent unsuccessful job statuses on HPC systems. From measured job statuses, 23.4% of CPU time was spent on unsuccessful jobs. Here, we set out to study whether these unsuccessful job statuses could be anticipated from known job characteristics. To explore this possibility, we have developed a job status prediction method for the execution of jobs on scientific clusters. The Random Forests algorithm was applied to extract and characterize the patterns of unsuccessful job statuses. Experimental results show that our method can predict unsuccessful job statuses from the monitored ongoing job executions in 99.8% of the cases, with 83.6% recall and 94.8% precision. Lastly, this prediction accuracy can be sufficiently high that it can be used in mitigation procedures for predicted failures.

  7. A prospective investigation of rumination and executive control in predicting overgeneral autobiographical memory in adolescence.

    PubMed

    Stewart, Tracy M; Hunter, Simon C; Rhodes, Sinéad M

    2018-04-01

    The CaR-FA-X model (Williams et al., 2007), or capture and rumination (CaR), functional avoidance (FA), and impaired executive control (X), is a model of overgeneral autobiographical memory (OGM). Two mechanisms of the model, rumination and executive control, were examined in isolation and in interaction in order to investigate OGM over time. Across two time points, six months apart, a total of 149 adolescents (13-16 years) completed the minimal-instruction autobiographical memory test, a measure of executive control with both emotional and nonemotional stimuli, and measures of brooding rumination and reflective pondering. The results showed that executive control for emotional information was negatively associated with OGM, but only when reflective pondering levels were high. Therefore, in the context of higher levels of reflective pondering, greater switch costs (i.e., lower executive control) when processing emotional information predicted a decrease in OGM over time.

  8. Conceptual Model-Based Systems Biology: Mapping Knowledge and Discovering Gaps in the mRNA Transcription Cycle

    PubMed Central

    Somekh, Judith; Choder, Mordechai; Dori, Dov

    2012-01-01

    We propose a Conceptual Model-based Systems Biology framework for qualitative modeling, executing, and eliciting knowledge gaps in molecular biology systems. The framework is an adaptation of Object-Process Methodology (OPM), a graphical and textual executable modeling language. OPM enables concurrent representation of the system's structure—the objects that comprise the system, and behavior—how processes transform objects over time. Applying a top-down approach of recursively zooming into processes, we model a case in point—the mRNA transcription cycle. Starting with this high level cell function, we model increasingly detailed processes along with participating objects. Our modeling approach is capable of modeling molecular processes such as complex formation, localization and trafficking, molecular binding, enzymatic stimulation, and environmental intervention. At the lowest level, similar to the Gene Ontology, all biological processes boil down to three basic molecular functions: catalysis, binding/dissociation, and transporting. During modeling and execution of the mRNA transcription model, we discovered knowledge gaps, which we present and classify into various types. We also show how model execution enhances coherent model construction. Identifying and pinpointing knowledge gaps is an important feature of the framework, as it suggests where research should focus and whether conjectures about uncertain mechanisms fit into the already verified model. PMID:23308089

  9. Motor Execution Affects Action Prediction

    ERIC Educational Resources Information Center

    Springer, Anne; Brandstadter, Simone; Liepelt, Roman; Birngruber, Teresa; Giese, Martin; Mechsner, Franz; Prinz, Wolfgang

    2011-01-01

    Previous studies provided evidence of the claim that the prediction of occluded action involves real-time simulation. We report two experiments that aimed to study how real-time simulation is affected by simultaneous action execution under conditions of full, partial or no overlap between observed and executed actions. This overlap was analysed by…

  10. A heuristic re-mapping algorithm reducing inter-level communication in SAMR applications.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steensland, Johan; Ray, Jaideep

    2003-07-01

    This paper aims at decreasing execution time for large-scale structured adaptive mesh refinement (SAMR) applications by proposing a new heuristic re-mapping algorithm and experimentally showing its effectiveness in reducing inter-level communication. Tests were done for five different SAMR applications. The overall goal is to engineer a dynamically adaptive meta-partitioner capable of selecting and configuring the most appropriate partitioning strategy at run-time based on current system and application state. Such a meta-partitioner can significantly reduce execution times for general SAMR applications. Computer simulations of physical phenomena are becoming increasingly popular as they constitute an important complement to real-life testing. In many cases, such simulations are based on solving partial differential equations by numerical methods. Adaptive methods are crucial to efficiently utilize computer resources such as memory and CPU. But even with adaption, the simulations are computationally demanding and yield huge data sets. Thus parallelization and the efficient partitioning of data become issues of utmost importance. Adaption causes the workload to change dynamically, calling for dynamic (re-) partitioning to maintain efficient resource utilization. The proposed heuristic algorithm reduced inter-level communication substantially. Since the complexity of the proposed algorithm is low, this decrease comes at a relatively low cost. As a consequence, we draw the conclusion that the proposed re-mapping algorithm would be useful to lower overall execution times for many large SAMR applications. Due to its usefulness and its parameterization, the proposed algorithm would constitute a natural and important component of the meta-partitioner.
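
    The flavour of such a re-mapping heuristic can be conveyed with a small sketch: place each refined patch on the processor that owns its parent coarse patch (so no inter-level data must cross processors) unless that processor is already loaded, in which case fall back to the least-loaded one. This greedy rule, the data structures, and the capacity parameter are illustrative assumptions, not the algorithm evaluated in the paper.

```python
def remap_fine_patches(fine_patches, coarse_owner, n_procs, capacity):
    """Greedy re-mapping: co-locate each fine patch with its parent coarse patch
    to avoid inter-level communication, unless that would overload the processor."""
    load = [0.0] * n_procs
    assignment = {}
    for patch in sorted(fine_patches, key=lambda p: -p["work"]):  # largest work first
        preferred = coarse_owner[patch["parent"]]
        if load[preferred] + patch["work"] <= capacity:
            target = preferred                                    # parent and child stay together
        else:
            target = min(range(n_procs), key=lambda p: load[p])   # load-balance fallback
        assignment[patch["id"]] = target
        load[target] += patch["work"]
    return assignment

if __name__ == "__main__":
    fine = [{"id": "f1", "parent": "c1", "work": 4.0},
            {"id": "f2", "parent": "c1", "work": 3.0},
            {"id": "f3", "parent": "c2", "work": 2.0}]
    print(remap_fine_patches(fine, {"c1": 0, "c2": 1}, n_procs=2, capacity=5.0))
```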

  11. Participation of Part-time Faculty on the Executive Committee of the Academic Senate for California Community Colleges.

    ERIC Educational Resources Information Center

    Academic Senate for California Community Colleges, Sacramento.

    At the 1996 Spring Plenary Session, the Academic Senate for California Community Colleges (ASCCC) passed resolution S961.5, which authorizes the participation of part-time faculty on the Executive Committee. The assurance of participation of part-time faculty on the Executive Committee of the ASCCC at first appeared a simple proposal, but was soon…

  12. Thread scheduling for GPU-based OPC simulation on multi-thread

    NASA Astrophysics Data System (ADS)

    Lee, Heejun; Kim, Sangwook; Hong, Jisuk; Lee, Sooryong; Han, Hwansoo

    2018-03-01

    As semiconductor product development based on shrinkage continues, the accuracy and difficulty required for model-based optical proximity correction (MBOPC) are increasing. OPC simulation time, which is the most time-consuming part of MBOPC, is rapidly increasing due to the high pattern density in a layout and the complex OPC model. To reduce OPC simulation time, we apply graphics processing units (GPUs) to MBOPC because the OPC process is well suited to parallel programming. We address some issues that typically arise during GPU-based OPC simulation in a multi-threaded system, such as "out of memory" and "GPU idle time". To overcome these problems, we propose a thread scheduling method, which manages OPC jobs in multiple threads in such a way that simulation jobs from multiple threads are alternately executed on the GPU while correction jobs are executed at the same time on the CPU cores. It was observed that the amount of GPU peak memory usage decreases by up to 35%, and MBOPC runtime also decreases by 4%. In cases where out-of-memory issues occur in a multi-threaded environment, the thread scheduler improved MBOPC runtime by up to 23%.
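
    The scheduling idea (simulation jobs from different threads take turns on the single GPU while correction jobs run concurrently on CPU cores) can be sketched with a lock, as below. The stand-in simulate/correct functions and the batch layout are assumptions; real OPC kernels and GPU memory handling are not modeled.

```python
import threading

gpu_lock = threading.Lock()  # only one thread's simulation batch occupies the GPU at a time

def simulate_on_gpu(batch):
    """Stand-in for the GPU-side OPC simulation kernel."""
    return [fragment * 2 for fragment in batch]

def correct_on_cpu(sim_results):
    """Stand-in for the per-thread correction step running on a CPU core."""
    return [value + 1 for value in sim_results]

def opc_worker(batches, out):
    for batch in batches:
        with gpu_lock:                      # alternate simulation jobs on the shared GPU
            sim = simulate_on_gpu(batch)
        out.append(correct_on_cpu(sim))     # corrections overlap other threads' GPU work

if __name__ == "__main__":
    results = []
    workers = [threading.Thread(target=opc_worker, args=([[i, i + 1]], results)) for i in range(4)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    print(results)
```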

  13. Application-oriented offloading in heterogeneous networks for mobile cloud computing

    NASA Astrophysics Data System (ADS)

    Tseng, Fan-Hsun; Cho, Hsin-Hung; Chang, Kai-Di; Li, Jheng-Cong; Shih, Timothy K.

    2018-04-01

    Internet applications have become so complicated nowadays that a mobile device needs more computing resources to achieve shorter execution times, but it is restricted by its limited battery capacity. Mobile cloud computing (MCC) has emerged to tackle the finite-resource problem of mobile devices. MCC offloads the tasks and jobs of mobile devices to cloud and fog environments by using an offloading scheme. It is vital to MCC to decide which tasks should be offloaded and how to offload them efficiently. In this paper, we formulate the offloading problem between the mobile device and the cloud data center and propose two application-oriented algorithms for minimum execution time, i.e., the Minimum Offloading Time for Mobile device (MOTM) algorithm and the Minimum Execution Time for Cloud data center (METC) algorithm. The MOTM algorithm minimizes offloading time by selecting appropriate offloading links based on application categories. The METC algorithm minimizes execution time in the cloud data center by selecting virtual and physical machines with corresponding resource requirements of applications. Simulation results show that the proposed mechanism not only minimizes total execution time for mobile devices but also decreases their energy consumption.
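
    A minimal sketch of the decision that such offloading algorithms make is shown below: compare local execution time against transfer-plus-remote-compute time for each candidate link and pick the minimum. The cost model (cycles, bandwidth, remote speed) and the example numbers are assumptions; the MOTM/METC selection rules in the paper are more elaborate.

```python
def best_offload_choice(task, links, local_speed):
    """Pick local execution or the offloading link with the smallest estimated
    completion time: transfer time (bytes / bandwidth) plus remote compute time."""
    best_name, best_time = "local", task["cycles"] / local_speed
    for link in links:
        est = task["input_bytes"] / link["bandwidth"] + task["cycles"] / link["remote_speed"]
        if est < best_time:
            best_name, best_time = link["name"], est
    return best_name, best_time

if __name__ == "__main__":
    task = {"cycles": 2e9, "input_bytes": 5e6}            # hypothetical application task
    links = [
        {"name": "wifi-to-fog", "bandwidth": 2e6, "remote_speed": 8e9},
        {"name": "lte-to-cloud", "bandwidth": 1e6, "remote_speed": 2e10},
    ]
    print(best_offload_choice(task, links, local_speed=1e9))
```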

  14. A Ten Year Retrospective on Environmental Justice: What Have We Learned?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott, Michael J.; Jaksch, John A.; Cort, Katherine A.

    2005-03-01

    Beginning in 1994, Executive Order 12898 has directed federal executive agencies to identify and address, as appropriate, disproportionately high and adverse health or environmental effects of their programs, policies, and activities on minority and low income populations. The policy behind the Executive Order was to prevent minority and low income groups from bearing disproportionate adverse environmental consequences of federal actions. During the last ten years, federal agencies have implemented Executive Order 12898, and some also have developed explicit procedures or guidance for the steps that need to be taken during the preparation of environmental impact statements. Based on the authors’ experience, the paper examines how environmental justice practice has evolved in the ten years since the original Executive Order was issued. This evolution has been both procedural and substantive. The paper examines how the actual practice of environmental justice analysis has progressed in federal agencies that deal with waste management issues. Reference is made to changes in case law and agency practice. The 2000 Census of Population and the ongoing development of geographic information systems in particular have made it easier to identify minority and low-income populations at risk. At the same time, a number of stakeholder groups have taken positions over specific federal actions that have given rise to novel issues and challenges for analysts. The paper discusses how NEPA practice is evolving to deal with these issues and challenges.

  15. Development of a Computer Architecture to Support the Optical Plume Anomaly Detection (OPAD) System

    NASA Technical Reports Server (NTRS)

    Katsinis, Constantine

    1996-01-01

    The NASA OPAD spectrometer system relies heavily on extensive software which repetitively extracts spectral information from the engine plume and reports the amounts of metals which are present in the plume. The development of this software is at a sufficiently advanced stage where it can be used in actual engine tests to provide valuable data on engine operation and health. This activity will continue and, in addition, the OPAD system is planned to be used in flight aboard space vehicles. The two implementations, test-stand and in-flight, may have some differing requirements. For example, the data stored during a test-stand experiment are much more extensive than in the in-flight case. In both cases though, the majority of the requirements are similar. New data from the spectrograph is generated at a rate of once every 0.5 sec or faster. All processing must be completed within this period of time to maintain real-time performance. Every 0.5 sec, the OPAD system must report the amounts of specific metals within the engine plume, given the spectral data. At present, the software in the OPAD system performs this function by solving the inverse problem. It uses powerful physics-based computational models (the SPECTRA code), which receive amounts of metals as inputs to produce the spectral data that would have been observed, had the same metal amounts been present in the engine plume. During the experiment, for every spectrum that is observed, an initial approximation is performed using neural networks to establish an initial metal composition which approximates as accurately as possible the real one. Then, using optimization techniques, the SPECTRA code is repetitively used to produce a fit to the data, by adjusting the metal input amounts until the produced spectrum matches the observed one to within a given level of tolerance. This iterative solution to the original problem of determining the metal composition in the plume requires a relatively long period of time to execute the software in a modern single-processor workstation, and therefore real-time operation is currently not possible. A different number of iterations may be required to perform spectral data fitting per spectral sample. Yet, the OPAD system must be designed to maintain real-time performance in all cases. Although faster single-processor workstations are available for execution of the fitting and SPECTRA software, this option is unattractive due to the excessive cost associated with very fast workstations and also due to the fact that such hardware is not easily expandable to accommodate future versions of the software which may require more processing power. Initial research has already demonstrated that the OPAD software can take advantage of a parallel computer architecture to achieve the necessary speedup. Current work has improved the software by converting it into a form which is easily parallelizable. Timing experiments have been performed to establish the computational complexity and execution speed of major components of the software. This work provides the foundation of future work which will create a fully parallel version of the software executing in a shared-memory multiprocessor system.
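
    The inverse-problem loop described above can be illustrated with a small least-squares sketch: a forward model maps metal amounts to a spectrum, and an optimizer adjusts the amounts until the modeled spectrum matches the observation. The Gaussian-line forward model here is a hypothetical stand-in for the SPECTRA code, and the wavelengths, line centers, and noise level are assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

wavelengths = np.linspace(300.0, 800.0, 200)

def forward_model(metal_amounts, centers=(350.0, 500.0, 650.0), width=15.0):
    """Hypothetical stand-in for SPECTRA: each metal contributes one Gaussian
    emission line whose height scales with the amount of that metal."""
    spectrum = np.zeros_like(wavelengths)
    for amount, center in zip(metal_amounts, centers):
        spectrum += amount * np.exp(-((wavelengths - center) / width) ** 2)
    return spectrum

# Synthetic "observed" plume spectrum for three metals, plus measurement noise.
observed = forward_model([1.0, 0.3, 0.7]) + np.random.default_rng(0).normal(0.0, 0.01, wavelengths.size)

# Inverse problem: adjust metal amounts until the modeled spectrum fits the observation.
fit = least_squares(lambda amounts: forward_model(amounts) - observed,
                    x0=[0.5, 0.5, 0.5], bounds=(0.0, np.inf))
print("estimated metal amounts:", np.round(fit.x, 3))
```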

  16. A bidirectional relationship between physical activity and executive function in older adults

    PubMed Central

    Daly, Michael; McMinn, David; Allan, Julia L.

    2015-01-01

    Physically active lifestyles contribute to better executive function. However, it is unclear whether high levels of executive function lead people to be more active. This study uses a large sample and multi-wave data to identify whether a reciprocal association exists between physical activity and executive function. Participants were 4555 older adults tracked across four waves of the English Longitudinal Study of Aging. In each wave executive function was assessed using a verbal fluency test and a letter cancelation task and participants reported their physical activity levels. Fixed effects regressions showed that changes in executive function corresponded with changes in physical activity. In longitudinal multilevel models low levels of physical activity led to subsequent declines in executive function. Importantly, poor executive function predicted reductions in physical activity over time. This association was found to be over 50% larger in magnitude than the contribution of physical activity to changes in executive function. This is the first study to identify evidence for a robust bidirectional link between executive function and physical activity in a large sample of older adults tracked over time. PMID:25628552

  17. SERENITY in e-Business and Smart Item Scenarios

    NASA Astrophysics Data System (ADS)

    Benameur, Azzedine; Khoury, Paul El; Seguran, Magali; Sinha, Smriti Kumar

    SERENITY artefacts for Security & Dependability (S&D), such as Classes, Patterns, Implementations and Executable Components, in addition to the Serenity Runtime Framework (SRF), are discussed in previous chapters. How to integrate these artefacts with applications in the SERENITY approach is discussed here with two scenarios. The e-Business scenario is a standard loan origination process in a bank. The Smart Item scenario is an ambient-intelligence case study where we take advantage of Smart Items to provide an electronic healthcare infrastructure for remote healthcare assistance. In both cases, we detail how the prototype implementations of the scenarios select proper executable components through the Serenity Runtime Framework and then demonstrate how these executable components of the S&D Patterns are deployed.

  18. Influence of characteristics of time series on short-term forecasting error parameter changes in real time

    NASA Astrophysics Data System (ADS)

    Klevtsov, S. I.

    2018-05-01

    The impact of physical factors, such as temperature, leads to changes in the parameters of a technical object. Monitoring these parameter changes is necessary to prevent a dangerous situation, and the monitoring is carried out in real time. To predict the change in a parameter, a time series is used in this paper. Forecasting makes it possible to detect a potentially dangerous change in a parameter before the moment when the change occurs, so the control system has more time to prevent a dangerous situation. A simple time series model was chosen so that the algorithm remains simple and can be executed in the microprocessor module in the background. The efficiency of using the time series is affected by its characteristics, which must be adjusted. In this work, the influence of these characteristics on the error of real-time prediction of the controlled parameter was studied, taking into account the behavior of the parameter, and suitable values of the forecast lag were determined. The results of the research, when applied, will improve the efficiency of monitoring the technical object during its operation.
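
    A background-friendly version of such a short-term forecast can be as simple as the sliding-window extrapolation sketched below; the window length, lag, and example data are assumptions chosen for illustration, not the scheme evaluated in the paper.

```python
import numpy as np

def forecast(history, window=8, lag=5):
    """Extrapolate the monitored parameter `lag` samples ahead from a straight line
    fitted to the last `window` samples (cheap enough for a background task)."""
    y = np.asarray(history[-window:], dtype=float)
    x = np.arange(y.size)
    slope, intercept = np.polyfit(x, y, 1)
    return slope * (y.size - 1 + lag) + intercept

if __name__ == "__main__":
    temperatures = [20.0, 20.4, 20.9, 21.5, 22.0, 22.8, 23.5, 24.1]
    print("predicted value 5 samples ahead:", round(forecast(temperatures), 2))
```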

  19. User interface enhancement report

    NASA Technical Reports Server (NTRS)

    Badler, N. I.; Gangel, J.; Shields, G.; Fala, G.

    1985-01-01

    The existing user interfaces to TEMPUS, Plaid, and other systems in the OSDS are fundamentally based on only two modes of communication: alphanumeric commands or data input, and graphical interaction. The latter is especially suited to the types of interaction necessary for creating workstation objects with BUILD and for performing body positioning in TEMPUS. Looking toward the future application of TEMPUS, however, the long-term goals of OSDS will include the analysis of extensive tasks in space involving one or more individuals working in concert over a period of time. In this context, the TEMPUS body positioning capability, though extremely useful in creating and validating a small number of particular body positions, will become somewhat tedious to use. The macro facility helps somewhat, since frequently used positions may be easily applied by executing a stored macro. The difference between body positioning and task execution, though subtle, is important. In the case of task execution, the important information at the user's level is what actions are to be performed rather than how the actions are performed. Viewed slightly differently, the what is constant over a set of individuals though the how may vary.

  20. Executive Coaching Practices in the Adult Workplace

    ERIC Educational Resources Information Center

    Campone, Francine

    2015-01-01

    This chapter provides an overview of key principles and practices in executive coaching. Coaching is discussed as a reflective learning opportunity, and the chapter offers theoretical grounding, strategies, and case studies for each of four key elements of a coaching engagement.

  1. Plan Execution Interchange Language (PLEXIL)

    NASA Technical Reports Server (NTRS)

    Estlin, Tara; Jonsson, Ari; Pasareanu, Corina; Simmons, Reid; Tso, Kam; Verma, Vandi

    2006-01-01

    Plan execution is a cornerstone of spacecraft operations, irrespective of whether the plans to be executed are generated on board the spacecraft or on the ground. Plan execution frameworks vary greatly, due to both different capabilities of the execution systems, and relations to associated decision-making frameworks. The latter dependency has made the reuse of execution and planning frameworks more difficult, and has all but precluded information sharing between different execution and decision-making systems. As a step in the direction of addressing some of these issues, a general plan execution language, called the Plan Execution Interchange Language (PLEXIL), is being developed. PLEXIL is capable of expressing concepts used by many high-level automated planners and hence provides an interface to multiple planners. PLEXIL includes a domain description that specifies command types, expansions, constraints, etc., as well as feedback to the higher-level decision-making capabilities. This document describes the grammar and semantics of PLEXIL. It includes a graphical depiction of this grammar and illustrative rover scenarios. It also outlines ongoing work on implementing a universal execution system, based on PLEXIL, using state-of-the-art rover functional interfaces and planners as test cases.
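
    As a rough illustration of the kind of construct such an execution language describes (this is a hedged Python sketch, not actual PLEXIL syntax; all class and field names are hypothetical), the snippet below models a hierarchical plan node with start and end conditions that an executive evaluates before and after running a command or its children.

      # Minimal sketch (not actual PLEXIL syntax): a hierarchical plan node with
      # start/end conditions, as a PLEXIL-style executive might evaluate them.
      class Node:
          def __init__(self, name, start_cond=lambda s: True, end_cond=lambda s: True,
                       command=None, children=None):
              self.name = name
              self.start_cond = start_cond      # gate before the node may execute
              self.end_cond = end_cond          # condition that finishes the node
              self.command = command            # leaf-level action (callable)
              self.children = children or []    # non-leaf nodes run children in order

          def execute(self, state):
              if not self.start_cond(state):
                  print(f"{self.name}: waiting (start condition false)")
                  return
              print(f"{self.name}: executing")
              if self.command:
                  self.command(state)
              for child in self.children:
                  child.execute(state)
              if self.end_cond(state):
                  print(f"{self.name}: finished")

      # Example: a toy rover plan -- drive only after the camera has taken an image
      state = {"image_taken": False, "at_target": False}
      plan = Node("TakeImageThenDrive", children=[
          Node("TakeImage", command=lambda s: s.update(image_taken=True)),
          Node("DriveToTarget",
               start_cond=lambda s: s["image_taken"],
               command=lambda s: s.update(at_target=True)),
      ])
      plan.execute(state)
      print(state)   # {'image_taken': True, 'at_target': True}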

  2. 5 CFR 843.205 - Designation of beneficiary-form and execution.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... execution. 843.205 Section 843.205 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL... One-time Payments § 843.205 Designation of beneficiary—form and execution. (a) A designation of..., corporation, or legal entity may be named as beneficiary. (e) A change of beneficiary may be made at any time...

  3. Dissociable contributions of motor-execution and action-observation to intramanual transfer.

    PubMed

    Hayes, Spencer J; Elliott, Digby; Andrew, Matthew; Roberts, James W; Bennett, Simon J

    2012-09-01

    We examined the hypothesis that different processes and representations are associated with the learning of a movement sequence through motor-execution and action-observation. Following a pre-test in which participants attempted to achieve an absolute, and relative, time goal in a sequential goal-directed aiming movement, participants received either physical or observational practice with feedback. Post-test performance indicated that motor-execution and action-observation participants learned equally well. Participants then transferred to conditions in which the gain between the limb movements and their visual consequences was manipulated. Under both bigger and smaller transfer conditions, motor-execution and action-observation participants exhibited similar intramanual transfer of absolute timing. However, participants in the action-observation group exhibited greater transfer of relative timing than the motor-execution group. These findings suggest that learning via action-observation is underpinned by a visual-spatial representation, while learning via motor-execution depends more on specific force-time planning (feedforward) and afferent processing associated with sensorimotor feedback. These behavioural effects are discussed with reference to neural processes associated with the striatum, cerebellum and motor cortical regions (pre-motor cortex; SMA; pre-SMA).

  4. Application of the Multicontextual Approach in Promoting Learning and Transfer of Strategy Use in an Individual with TBI and Executive Dysfunction.

    PubMed

    Toglia, Joan; Goverover, Yael; Johnston, Mark V; Dain, Barry

    2011-01-01

    The multicontext approach addresses strategy use and self-monitoring skills within activities and contexts that are systematically varied to facilitate transfer of learning. This article illustrates the application of the multicontext approach by presenting a case study of an adult who is 5 years post-traumatic brain injury with executive dysfunction and limited awareness. A single case study design with repeated pre-post measures was used. Methods to monitor strategy generation and specific awareness within intervention are described. Findings suggest improved functional performance and generalization of use of an external strategy despite absence of changes in general self-awareness of deficits. This case describes the multicontext intervention process and provides clinical suggestions for working with individuals with serious deficits in awareness and executive dysfunction following traumatic brain injury. Copyright 2011, SLACK Incorporated.

  5. Distinguishing and Improving Mouse Behavior with Educational Computer Games in Young Children with Autistic Spectrum Disorder or Attention Deficit/Hyperactivity Disorder: An Executive Function-Based Interpretation

    ERIC Educational Resources Information Center

    Veenstra, Baukje; van Geert, Paul L. C.; van der Meulen, Bieuwe F.

    2012-01-01

    In this exploratory multiple case study, it is examined how a computer game focused on improving ineffective learning behavior can be used as a tool to assess, improve, and study real-time mouse behavior (MB) in different types of children: 18 children (3.8-6.3 years) with Autistic Spectrum Disorder (ASD), Attention Deficit/Hyperactivity Disorder…

  6. Women as Chief Information Officers in Higher Education: A Mixed Methods Study of Women Executive Role Attainment in Information Technology Organizations

    ERIC Educational Resources Information Center

    Clark, Elizabeth Ann

    2013-01-01

    The dearth of women in executive positions within the field of information technology (IT) has been studied extensively in the corporate sector. That is not the case within higher education, despite the data collected showing that women attain the top executive role--that of the Chief Information Officer (CIO)--at much better rates than their…

  7. On the Impact of Execution Models: A Case Study in Computational Chemistry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chavarría-Miranda, Daniel; Halappanavar, Mahantesh; Krishnamoorthy, Sriram

    2015-05-25

    Efficient utilization of high-performance computing (HPC) platforms is an important and complex problem. Execution models, abstract descriptions of the dynamic runtime behavior of the execution stack, have significant impact on the utilization of HPC systems. Using a computational chemistry kernel as a case study and a wide variety of execution models combined with load balancing techniques, we explore the impact of execution models on the utilization of an HPC system. We demonstrate a 50 percent improvement in performance by using work stealing relative to a more traditional static scheduling approach. We also use a novel semi-matching technique for load balancing that has comparable performance to a traditional hypergraph-based partitioning implementation, which is computationally expensive. Using this study, we found that execution model design choices and assumptions can limit critical optimizations such as global, dynamic load balancing and finding the correct balance between available work units and different system and runtime overheads. With the emergence of multi- and many-core architectures and the consequent growth in the complexity of HPC platforms, we believe that these lessons will be beneficial to researchers tuning diverse applications on modern HPC platforms, especially on emerging dynamic platforms with energy-induced performance variability.
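
    To illustrate the contrast between static scheduling and work stealing mentioned above, the toy Python sketch below simulates both on irregular task costs (a single-process model with made-up costs; the paper's actual runtime is an HPC execution stack, not this simulation).

      # Minimal single-process sketch of the idea behind work stealing versus static
      # scheduling (hypothetical task costs). With static scheduling each worker keeps
      # its initial block of tasks; with stealing, an idle worker takes a task from the
      # most loaded worker, which evens out irregular task costs.
      from collections import deque
      import random

      def static_makespan(num_workers, task_costs):
          totals = [0.0] * num_workers
          for i, cost in enumerate(task_costs):
              totals[i % num_workers] += cost          # fixed round-robin assignment
          return max(totals)                           # slowest worker determines runtime

      def stealing_makespan(num_workers, task_costs):
          queues = [deque() for _ in range(num_workers)]
          for i, cost in enumerate(task_costs):
              queues[i % num_workers].append(cost)     # same initial assignment
          busy_until = [0.0] * num_workers
          while any(queues):
              w = min(range(num_workers), key=busy_until.__getitem__)   # next idle worker
              if not queues[w]:
                  victim = max(range(num_workers), key=lambda i: len(queues[i]))
                  queues[w].append(queues[victim].pop())                # steal from the tail
              busy_until[w] += queues[w].popleft()
          return max(busy_until)

      random.seed(0)
      costs = [random.expovariate(1.0) for _ in range(200)]   # irregular task costs
      print("static   makespan:", round(static_makespan(4, costs), 2))
      print("stealing makespan:", round(stealing_makespan(4, costs), 2))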

  8. Transcranial LED therapy for cognitive dysfunction in chronic, mild traumatic brain injury: two case reports

    NASA Astrophysics Data System (ADS)

    Naeser, Margaret A.; Saltmarche, Anita; Krengel, Maxine H.; Hamblin, Michael R.; Knight, Jeffrey A.

    2010-02-01

    Two chronic traumatic brain injury (TBI) cases are presented in which cognitive function improved following treatment with transcranial light-emitting diodes (LEDs). At age 59, P1 sustained a closed-head injury in a motor vehicle accident (MVA) without loss of consciousness and with a normal MRI, but was unable to return to work as a development specialist in internet marketing due to cognitive dysfunction. At 7 years post-MVA, she began transcranial LED treatments with cluster heads (2.1" diameter with 61 diodes each - 9x633nm, 52x870nm; 12-15mW per diode; total power, 500mW; 22.2 mW/cm2) on bilateral frontal, temporal, parietal, occipital and midline sagittal areas (13.3 J/cm2 at scalp; estimated 0.4 J/cm2 to brain cortex per area). Prior to transcranial LED, her focused time on the computer was 20 minutes; after 2 months of weekly transcranial LED treatments, this increased to 3 hours. She continues nightly home treatments (now 5 years, age 72); if she stops treating for more than 2 weeks, she regresses. P2 (age 52, female) had a history of closed-head injuries related to sports/military training and a recent fall. MRI showed fronto-parietal cortical atrophy. Pre-LED, she had not been able to work for 6 months and scored below average on attention, memory and executive function. She performed nightly transcranial LED treatments at home (9 months) with a similar LED device on frontal and parietal areas. After 4 months of LED treatments, she returned to work as an executive consultant for an international technology consulting firm. Neuropsychological testing (after 9 months of transcranial LED) showed significant improvement in memory and executive functioning (range, +1 to +2 SD improvement). Case 2 also reported a reduction in PTSD symptoms.

  9. Flow-Centric, Back-in-Time Debugging

    NASA Astrophysics Data System (ADS)

    Lienhard, Adrian; Fierz, Julien; Nierstrasz, Oscar

    Conventional debugging tools present developers with means to explore the run-time context in which an error has occurred. In many cases this is enough to help the developer discover the faulty source code and correct it. However, rather often errors occur due to code that has executed in the past, leaving certain objects in an inconsistent state. The actual run-time error only occurs when these inconsistent objects are used later in the program. So-called back-in-time debuggers help developers step back through earlier states of the program and explore execution contexts not available to conventional debuggers. Nevertheless, even back-in-time debuggers do not help answer the question, "Where did this object come from?" The Object-Flow Virtual Machine, which we have proposed in previous work, tracks the flow of objects to answer precisely such questions, but this VM does not provide dedicated debugging support to explore faulty programs. In this paper we present a novel debugger, called Compass, to navigate between conventional run-time stack-oriented control flow views and object flows. Compass enables a developer to effectively navigate from an object contributing to an error back-in-time through all the code that has touched the object. We present the design and implementation of Compass, and we demonstrate how flow-centric, back-in-time debugging can be used to effectively locate the source of hard-to-find bugs.
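
    The tiny Python sketch below illustrates the underlying idea of object-flow tracking (it is not the Compass or Object-Flow VM implementation, and all names are made up): each time an object is created or passed along, an event is appended to its provenance, so a back-in-time view can later answer where the object came from.

      # Tiny sketch of object-flow tracking: record creation and hand-off events on an
      # object so they can be walked backwards later. Illustrative names only.
      class Tracked:
          def __init__(self, value, origin):
              self.value = value
              self.flow = [("created", origin)]        # provenance trail

          def record(self, event, location):
              self.flow.append((event, location))
              return self

      def parse_config(text):
          cfg = Tracked({"timeout": int(text)}, "parse_config")
          return cfg.record("returned", "parse_config")

      def store(cfg, registry):
          registry["cfg"] = cfg.record("stored in registry", "store")

      registry = {}
      store(parse_config("30"), registry)
      bad = registry["cfg"].record("read by request handler", "handle_request")

      # When the handler later fails, walk the flow backwards to the origin:
      for event, location in reversed(bad.flow):
          print(f"{event:25s} at {location}")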

  10. Estimation Accuracy on Execution Time of Run-Time Tasks in a Heterogeneous Distributed Environment

    PubMed Central

    Liu, Qi; Cai, Weidong; Jin, Dandan; Shen, Jian; Fu, Zhangjie; Liu, Xiaodong; Linge, Nigel

    2016-01-01

    Distributed computing has achieved tremendous development since cloud computing was proposed in 2006, and has played a vital role in promoting the rapid growth of data collecting and analysis models, e.g., the Internet of Things, Cyber-Physical Systems, Big Data analytics, etc. Hadoop has become a data convergence platform for sensor networks. As one of its core components, MapReduce facilitates allocating, processing and mining of collected large-scale data, where speculative execution strategies help solve straggler problems. However, there is still no efficient solution for accurate estimation of the execution time of run-time tasks, which can affect task allocation and distribution in MapReduce. In this paper, task execution data have been collected and employed for the estimation. A two-phase regression (TPR) method is proposed to predict the finishing time of each task accurately. Detailed data from each task were analyzed, and a detailed analysis report was produced. According to the results, the prediction accuracy of concurrent tasks' execution time can be improved, in particular for some regular jobs. PMID:27589753
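
    As a hedged sketch of the general two-phase idea (the paper's exact TPR formulation is not reproduced here), the snippet below fits a line to the later portion of a task's progress-versus-time history and extrapolates it to 100% progress; the breakpoint and the sample data are illustrative assumptions.

      # Hedged sketch of a two-phase (piecewise) regression for finishing-time
      # estimation: fit a line to the second phase of a task's progress history and
      # extrapolate it to progress = 1.0. Not the paper's exact TPR method.
      import numpy as np

      def predict_finish_time(times, progress, breakpoint=0.5):
          times, progress = np.asarray(times, float), np.asarray(progress, float)
          late = progress >= breakpoint                 # samples in the second phase
          if late.sum() < 2:                            # not enough data: fall back to one phase
              late = np.ones_like(progress, dtype=bool)
          slope, intercept = np.polyfit(progress[late], times[late], 1)
          return slope * 1.0 + intercept                # time at which progress reaches 100%

      # Example: a map task that slows down once the shuffle-heavy second half begins
      t = [0, 10, 20, 30, 40, 50]
      p = [0.0, 0.2, 0.4, 0.5, 0.58, 0.66]
      print(round(predict_finish_time(t, p), 1))        # extrapolated finishing time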

  11. Why are they late? Timing abilities and executive control among students with learning disabilities.

    PubMed

    Grinblat, Nufar; Rosenblum, Sara

    2016-12-01

    While a deficient ability to perform daily tasks on time has been reported among students with learning disabilities (LD), the underlying mechanism behind their 'being late' is still unclear. This study aimed to evaluate the organization in time, time estimation abilities, and actual performance time pertaining to specific daily activities, as well as the executive functions of students with LD in comparison to those of controls, and to assess the relationships between these domains in each group. The participants were 27 students with LD, aged 20-30, and 32 gender and age-matched controls who completed the Time Organization and Participation Scale (TOPS) and the Behavioral Rating Inventory of Executive Function-Adult version (BRIEF-A). In addition, their ability to estimate the time needed to complete the task of preparing a cup of coffee as well as their actual performance time were evaluated. The results indicated that in comparison to controls, students with LD showed significantly inferior organization in time (TOPS) and executive function abilities (BRIEF-A). Furthermore, their time estimation abilities were significantly inferior and they required significantly more time to prepare a cup of coffee. Regression analysis identified the variables that predicted organization in time and task performance time in each group. The theoretical and clinical implications of the results are discussed. What this paper adds: This study examines the underlying mechanism of the phenomenon of being late among students with LD. Following a recent call for using ecologically valid assessments, the functional daily ability of students with LD to prepare a cup of coffee and to organize time was investigated. Furthermore, their time estimation and executive control abilities were examined as a possible underlying mechanism for their lateness. Although previous studies have indicated executive control deficits among students with LD, to our knowledge, this is the first analysis of the relationships between their executive control and time estimation deficits and their influence upon their daily function and organization in time abilities. Our findings demonstrate that students with LD need more time in order to execute simple daily activities, such as preparing a cup of coffee. Deficient working memory, retrospective time estimation ability and inhibition predicted their performance time and organization in time abilities. Therefore, this paper sheds light on the mechanism behind daily performance in time among students with LD and emphasizes the need for future development of focused intervention programs to meet their unique needs. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. The Development of Metaphor Comprehension and Its Relationship with Relational Verbal Reasoning and Executive Function

    PubMed Central

    Montoro, Pedro R.; Herrero, Laura; Ballestrino, Patricia; Sebastián, Iraia

    2016-01-01

    Our main objective was to analyse the different contributions of relational verbal reasoning (analogical and class inclusion) and executive functioning to metaphor comprehension across development. We postulated that both relational reasoning and executive functioning should predict individual and developmental differences. However, executive functioning would become increasingly involved when metaphor comprehension is highly demanding, either because of the metaphors' high difficulty (relatively novel metaphors in the absence of a context) or because of the individual's special processing difficulties, such as low levels of reading experience or low semantic knowledge. Three groups of participants, 11-year-olds, 15-year-olds and young adults, were assessed in different relational verbal reasoning tasks—analogical and class-inclusion—and in executive functioning tasks—updating information in working memory, inhibition, and shifting. The results revealed clear progress in metaphor comprehension between ages 11 and 15 and between ages 15 and 21. However, the importance of executive function in metaphor comprehension was evident by age 15 and was restricted to updating information in working memory and cognitive inhibition. Participants seemed to use two different strategies to interpret metaphors: relational verbal reasoning and executive functioning. This was clearly shown when comparing the performance of the "more efficient" participants in metaphor interpretation with that of the "less efficient" ones. Whereas in the first case none of the executive variables or those associated with relational verbal reasoning were significantly related to metaphor comprehension, in the latter case, both groups of variables had a clear predictor effect. PMID:26954501

  13. 75 FR 70259 - Sunshine Act; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-17

    ... Time. Contact Person for More Information: Stephen Llewellyn, Executive Officer, on (202) 663-4070. Dated: November 15, 2010. Stephen Llewellyn, Executive Officer, Executive Secretariat. This Notice...

  14. Cloudgene: A graphical execution platform for MapReduce programs on private and public clouds

    PubMed Central

    2012-01-01

    Background: The MapReduce framework enables scalable processing and analysis of large datasets by distributing the computational load on connected computer nodes, referred to as a cluster. In Bioinformatics, MapReduce has already been adopted for various case scenarios such as mapping next generation sequencing data to a reference genome, finding SNPs from short read data or matching strings in genotype files. Nevertheless, tasks like installing and maintaining MapReduce on a cluster system, importing data into its distributed file system or executing MapReduce programs require advanced knowledge in computer science and could thus prevent scientists from using currently available and useful software solutions. Results: Here we present Cloudgene, a freely available platform to improve the usability of MapReduce programs in Bioinformatics by providing a graphical user interface for the execution, the import and export of data and the reproducibility of workflows on in-house (private clouds) and rented clusters (public clouds). The aim of Cloudgene is to build a standardized graphical execution environment for currently available and future MapReduce programs, which can all be integrated by using its plug-in interface. Since Cloudgene can be executed on private clusters, sensitive datasets can be kept in house at all times and data transfer times are therefore minimized. Conclusions: Our results show that MapReduce programs can be integrated into Cloudgene with little effort and without adding any computational overhead to existing programs. This platform gives developers the opportunity to focus on the actual implementation task and provides scientists a platform with the aim to hide the complexity of MapReduce. In addition to MapReduce programs, Cloudgene can also be used to launch predefined systems (e.g. Cloud BioLinux, RStudio) in public clouds. Currently, five different bioinformatic programs using MapReduce and two systems are integrated and have been successfully deployed. Cloudgene is freely available at http://cloudgene.uibk.ac.at. PMID:22888776

  15. Flexible Description Language for HPC based Processing of Remote Sense Data

    NASA Astrophysics Data System (ADS)

    Nandra, Constantin; Gorgan, Dorian; Bacu, Victor

    2016-04-01

    When talking about Big Data, the most challenging aspect lies in processing them in order to gain new insight, find new patterns and gain knowledge from them. This problem is likely most apparent in the case of Earth Observation (EO) data. With ever higher numbers of data sources and increasing data acquisition rates, dealing with EO data is indeed a challenge [1]. Geoscientists should address this challenge by using flexible and efficient tools and platforms. In response to this trend, the BigEarth project [2] aims to combine the advantages of high performance computing solutions with flexible processing description methodologies in order to reduce both task execution times and task definition time and effort. As a component of the BigEarth platform, WorDeL (Workflow Description Language) [3] is intended to offer a flexible, compact and modular approach to the task definition process. WorDeL, unlike other description alternatives such as Python or shell scripts, is oriented towards describing processing topologies, using them as abstractions for the processing programs. This feature is intended to make it an attractive alternative for users lacking programming experience. By promoting modular designs, WorDeL not only makes processing descriptions more readable and intuitive, but also helps organize the processing tasks into independent sub-tasks, which can be executed in parallel on multi-processor platforms in order to improve execution times. As a BigEarth platform [4] component, WorDeL represents the means by which the user interacts with the system, describing processing algorithms in terms of existing operators and workflows [5], which are ultimately translated into sets of executable commands. The WorDeL language has been designed to help in the definition of compute-intensive, batch tasks which can be distributed and executed on high-performance, cloud or grid-based architectures in order to improve the processing time. Main references for further information: [1] Gorgan, D., "Flexible and Adaptive Processing of Earth Observation Data over High Performance Computation Architectures", International Conference and Exhibition Satellite 2015, August 17-19, Houston, Texas, USA. [2] Bigearth project - flexible processing of big earth data over high performance computing architectures. http://cgis.utcluj.ro/bigearth, (2014) [3] Nandra, C., Gorgan, D., "Workflow Description Language for Defining Big Earth Data Processing Tasks", Proceedings of the Intelligent Computer Communication and Processing (ICCP), IEEE-Press, pp. 461-468, (2015). [4] Bacu, V., Stefan, T., Gorgan, D., "Adaptive Processing of Earth Observation Data on Cloud Infrastructures Based on Workflow Description", Proceedings of the Intelligent Computer Communication and Processing (ICCP), IEEE-Press, pp. 444-454, (2015). [5] Mihon, D., Bacu, V., Colceriu, V., Gorgan, D., "Modeling of Earth Observation Use Cases through the KEOPS System", Proceedings of the Intelligent Computer Communication and Processing (ICCP), IEEE-Press, pp. 455-460, (2015).

  16. Supervising Remote Humanoids Across Intermediate Time Delay

    NASA Technical Reports Server (NTRS)

    Hambuchen, Kimberly; Bluethmann, William; Goza, Michael; Ambrose, Robert; Rabe, Kenneth; Allan, Mark

    2006-01-01

    The President's Vision for Space Exploration, laid out in 2004, relies heavily upon robotic exploration of the lunar surface in early phases of the program. Prior to the arrival of astronauts on the lunar surface, these robots will be required to be controlled across space and time, posing a considerable challenge for traditional telepresence techniques. Because time delays will be measured in seconds, not minutes as is the case for Mars Exploration, uploading the plan for a day seems excessive. An approach for controlling humanoids under intermediate time delay is presented. This approach uses software running within a ground control cockpit to predict an immersed robot supervisor's motions which the remote humanoid autonomously executes. Initial results are presented.

  17. BP-Broker use-cases in the UncertWeb framework

    NASA Astrophysics Data System (ADS)

    Roncella, Roberto; Bigagli, Lorenzo; Schulz, Michael; Stasch, Christoph; Proß, Benjamin; Jones, Richard; Santoro, Mattia

    2013-04-01

    The UncertWeb framework is a distributed, Web-based Information and Communication Technology (ICT) system to support scientific data modeling in presence of uncertainty. We designed and prototyped a core component of the UncertWeb framework: the Business Process Broker. The BP-Broker implements several functionalities, such as: discovery of available processes/BPs, preprocessing of a BP into its executable form (EBP), publication of EBPs and their execution through a workflow-engine. According to the Composition-as-a-Service (CaaS) approach, the BP-Broker supports discovery and chaining of modeling resources (and processing resources in general), providing the necessary interoperability services for creating, validating, editing, storing, publishing, and executing scientific workflows. The UncertWeb project targeted several scenarios, which were used to evaluate and test the BP-Broker. The scenarios cover the following environmental application domains: biodiversity and habitat change, land use and policy modeling, local air quality forecasting, and individual activity in the environment. This work reports on the study of a number of use-cases, by means of the BP-Broker, namely: - eHabitat use-case: implements a Monte Carlo simulation performed on a deterministic ecological model; an extended use-case supports inter-comparison of model outputs; - FERA use-case: is composed of a set of models for predicting land-use and crop yield response to climatic and economic change; - NILU use-case: is composed of a Probabilistic Air Quality Forecasting model for predicting concentrations of air pollutants; - Albatross use-case: includes two model services for simulating activity-travel patterns of individuals in time and space; - Overlay use-case: integrates the NILU scenario with the Albatross scenario to calculate the exposure to air pollutants of individuals. Our aim was to prove the feasibility of describing composite modeling processes with a high-level, abstract notation (i.e. BPMN 2.0), and delegating the resolution of technical issues (e.g. I/O matching) as much as possible to an external service. The results of the experimented solution indicate that this approach facilitates the integration of environmental model workflows into the standard geospatial Web Services framework (e.g. the GEOSS Common Infrastructure), mitigating its inherent complexity. The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement n° 248488.

  18. Change in neurocognition by housing type and substance abuse among formerly homeless seriously mentally ill persons.

    PubMed

    Caplan, Brina; Schutt, Russell K; Turner, Winston M; Goldfinger, Stephen M; Seidman, Larry J

    2006-03-01

    To test the effect of living in group housing rather than independent apartments on executive functioning, verbal memory and sustained attention among formerly homeless persons with serious mental illness, and to determine whether substance abuse modifies this effect. In metropolitan Boston, 112 persons in Department of Mental Health shelters were randomly assigned to group homes ("Evolving Consumer Households", with project facilitator, group meetings, and resident decision-making) or independent apartments. All were case managed. A neuropsychological test battery was administered at baseline, at 18 months (Time 2), with an 81% follow-up rate, and at 48 months (Time 3), with a 59% follow-up rate. Hierarchical Linear Modeling was applied to executive functioning, assessed with the Wisconsin Card Sorting Test (Perseverations); verbal memory, assessed with Logical Memory story recall; and sustained attention, assessed with an auditory Continuous Performance Test (CPT). Subject characteristics were controlled. When moved to group homes, subjects without a lifetime substance abuse history improved on Perseverations, while those who moved to independent apartments deteriorated on Perseverations. Across the two housing conditions, subjects showed no change in Perseverations, but improved on Logical Memory story recall and the CPT. Type of housing placement can influence cognitive functioning; notably, socially isolating housing is associated with weakened executive functioning. Substance abuse significantly diminishes environmental effects. These are important factors to consider in housing placement and subsequent treatment.

  19. Toward cognitive pipelines of medical assistance algorithms.

    PubMed

    Philipp, Patrick; Maleshkova, Maria; Katic, Darko; Weber, Christian; Götz, Michael; Rettinger, Achim; Speidel, Stefanie; Kämpgen, Benedikt; Nolden, Marco; Wekerle, Anna-Laura; Dillmann, Rüdiger; Kenngott, Hannes; Müller, Beat; Studer, Rudi

    2016-09-01

    Assistance algorithms for medical tasks have great potential to support physicians with their daily work. However, medicine is also one of the most demanding domains for computer-based support systems, since medical assistance tasks are complex and the practical experience of the physician is crucial. Recent developments in the area of cognitive computing appear to be well suited to tackle medicine as an application domain. We propose a system based on the idea of cognitive computing, consisting of auto-configurable medical assistance algorithms and their self-adapting combination. The system enables automatic execution of new algorithms, given that they are made available as Medical Cognitive Apps and are registered in a central semantic repository. Learning components can be added to the system to optimize the results in cases where numerous Medical Cognitive Apps are available for the same task. Our prototypical implementation is applied to the areas of surgical phase recognition based on sensor data and image processing for tumor progression mappings. Our results suggest that such assistance algorithms can be automatically configured in execution pipelines, candidate results can be automatically scored and combined, and the system can learn from experience. Furthermore, our evaluation shows that the Medical Cognitive Apps provide the same correct results as local execution and run in a reasonable amount of time. The proposed solution is applicable to a variety of medical use cases and effectively supports the automated and self-adaptive configuration of cognitive pipelines based on medical interpretation algorithms.
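
    The Python sketch below illustrates the general pattern described above, several registered algorithms producing candidate results for the same task, with the candidates then scored and the best one selected. All names, apps and the scoring function are hypothetical stand-ins, not the system's actual semantic repository or Medical Cognitive App interface.

      # Minimal sketch of a pipeline with registered apps, candidate results and scoring.
      REGISTRY = {}                      # task name -> list of registered apps

      def register(task):
          def wrap(func):
              REGISTRY.setdefault(task, []).append(func)
              return func
          return wrap

      @register("phase_recognition")
      def threshold_app(sensor_values):
          return "cutting" if max(sensor_values) > 0.8 else "idle"

      @register("phase_recognition")
      def mean_app(sensor_values):
          return "cutting" if sum(sensor_values) / len(sensor_values) > 0.5 else "idle"

      def run_pipeline(task, data, score):
          candidates = [(app.__name__, app(data)) for app in REGISTRY.get(task, [])]
          return max(candidates, key=lambda c: score(c[1]))   # pick the best-scored result

      # Toy scoring function standing in for a learned combiner
      score = lambda label: {"cutting": 0.9, "idle": 0.4}[label]
      print(run_pipeline("phase_recognition", [0.2, 0.9, 0.4], score))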

  20. A Differential Deficit in Time- versus Event-based Prospective Memory in Parkinson's Disease

    PubMed Central

    Raskin, Sarah A.; Woods, Steven Paul; Poquette, Amelia J.; McTaggart, April B.; Sethna, Jim; Williams, Rebecca C.; Tröster, Alexander I.

    2010-01-01

    Objective The aim of the current study was to clarify the nature and extent of impairment in time- versus event-based prospective memory in Parkinson's disease (PD). Prospective memory is thought to involve cognitive processes that are mediated by prefrontal systems and are executive in nature. Given that individuals with PD frequently show executive dysfunction, it is important to determine whether these individuals may have deficits in prospective memory that could impact daily functions, such as taking medications. Although it has been reported that individuals with PD evidence impairment in prospective memory, it is still unclear whether they show a greater deficit for time- versus event-based cues. Method Fifty-four individuals with PD and 34 demographically similar healthy adults were administered a standardized measure of prospective memory that allows for a direct comparison of time-based and event-based cues. In addition, participants were administered a series of standardized measures of retrospective memory and executive functions. Results Individuals with PD demonstrated impaired prospective memory performance compared to the healthy adults, with a greater impairment demonstrated for the time-based tasks. Time-based prospective memory performance was moderately correlated with measures of executive functioning, but only the Stroop Neuropsychological Screening Test emerged as a unique predictor in a linear regression. Conclusions Findings are interpreted within the context of McDaniel and Einstein's (2000) multi-process theory to suggest that individuals with PD experience particular difficulty executing a future intention when the cue to execute the prescribed intention requires higher levels of executive control. PMID:21090895

  1. Executive functioning and processing speed in age-related differences in time estimation: a comparison of young, old, and very old adults.

    PubMed

    Baudouin, Alexia; Isingrini, Michel; Vanneste, Sandrine

    2018-01-25

    Age-related differences in time estimation were examined by comparing the temporal performance of young, young-old, and old-old adults, in relation to two major theories of cognitive aging: executive decline and cognitive slowing. We tested the hypothesis that processing speed and executive function are differentially involved in timing depending on the temporal task used. We also tested the assumption of greater age-related effects in time estimation in old-old participants. Participants performed two standard temporal tasks: duration production and duration reproduction. They also completed tests measuring executive function and processing speed. Findings supported the view that executive function is the best mediator of reproduction performance and inversely that processing speed is the best mediator of production performance. They also showed that young-old participants provide relatively accurate temporal judgments compared to old-old participants. These findings are discussed in terms of compensation mechanisms in aging.

  2. 78 FR 59766 - Actions Taken Pursuant to Executive Order 13382

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-27

    ... DEPARTMENT OF THE TREASURY Office of Foreign Assets Control [Case ID NPW-3360] Actions Taken Pursuant to Executive Order 13382 AGENCY: Office of Foreign Assets Control, Treasury Department. ACTION: Notice. SUMMARY: The Treasury Department's Office of Foreign Assets Control (``OFAC'') is publishing on...

  3. Will healthcare do time for convictions? Hospital groups seek Medicare relief as industry anticipates impact of latest fraud verdicts.

    PubMed

    Taylor, M

    1999-07-12

    Less than a week after two hospital executives were found guilty of Medicare fraud, hospital groups last week launched an advertising blitz aimed at securing more Medicare money. But that could be a hard sell. Its success will hinge on how much the industry's image has been damaged by the recent convictions in Florida, which came only a few months after convictions in a Kansas City kickback case.

  4. A Simple Algorithm for the Metric Traveling Salesman Problem

    NASA Technical Reports Server (NTRS)

    Grimm, M. J.

    1984-01-01

    An algorithm was designed for a wire list net sort problem; a branch and bound algorithm for the metric traveling salesman problem is presented for this purpose. The algorithm is a best-bound-first recursive descent in which the bound is based on the triangle inequality. The bounded subsets are defined by the relative order of the first K of the N cities (i.e., a K-city subtour). When K equals N, the bound is the length of the tour. The algorithm is implemented as a one-page subroutine written in the C programming language for the VAX 11/750. Average execution times for randomly selected planar points using the Euclidean metric are 0.01, 0.05, 0.42, and 3.13 seconds for ten, fifteen, twenty, and twenty-five cities, respectively. Maximum execution times for a hundred cases are less than eleven times the averages. The speed of the algorithm is due to an initial ordering algorithm that is an N-squared operation. The algorithm also solves the related problem where the tour does not return to the starting city and the starting and/or ending cities may be specified. It is possible to extend the algorithm to solve a nonsymmetric problem satisfying the triangle inequality.
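
    The compact Python sketch below illustrates the general approach described, recursive descent branch and bound for the metric TSP with a lower bound justified by the triangle inequality; it is an illustration of the idea, not the original one-page C subroutine, and the instance data are randomly generated.

      # Branch and bound for the metric TSP. Triangle-inequality bound: for every
      # unvisited city v, the remaining route must be at least d(last, v) + d(v, start).
      import math, random

      def tsp_branch_and_bound(points):
          n = len(points)
          d = [[math.dist(a, b) for b in points] for a in points]
          best = {"len": math.inf, "tour": None}

          def lower_bound(last, length, unvisited):
              if not unvisited:
                  return length + d[last][0]
              return length + max(d[last][v] + d[v][0] for v in unvisited)

          def descend(tour, length, unvisited):
              last = tour[-1]
              if not unvisited:
                  total = length + d[last][0]              # close the tour back to city 0
                  if total < best["len"]:
                      best["len"], best["tour"] = total, tour[:]
                  return
              # Try the most promising (nearest) cities first
              for v in sorted(unvisited, key=lambda c: d[last][c]):
                  new_len = length + d[last][v]
                  if lower_bound(v, new_len, unvisited - {v}) < best["len"]:
                      descend(tour + [v], new_len, unvisited - {v})

          descend([0], 0.0, set(range(1, n)))
          return best["len"], best["tour"]

      random.seed(1)
      pts = [(random.random(), random.random()) for _ in range(10)]
      length, tour = tsp_branch_and_bound(pts)
      print(round(length, 3), tour)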

  5. Optimization of atmospheric transport models on HPC platforms

    NASA Astrophysics Data System (ADS)

    de la Cruz, Raúl; Folch, Arnau; Farré, Pau; Cabezas, Javier; Navarro, Nacho; Cela, José María

    2016-12-01

    The performance and scalability of atmospheric transport models on high performance computing environments is often far from optimal for multiple reasons including, for example, sequential input and output, synchronous communications, work unbalance, memory access latency or lack of task overlapping. We investigate how different software optimizations and porting to non general-purpose hardware architectures improve code scalability and execution times considering, as an example, the FALL3D volcanic ash transport model. To this purpose, we implement the FALL3D model equations in the WARIS framework, a software designed from scratch to solve in a parallel and efficient way different geoscience problems on a wide variety of architectures. In addition, we consider further improvements in WARIS such as hybrid MPI-OMP parallelization, spatial blocking, auto-tuning and thread affinity. Considering all these aspects together, the FALL3D execution times for a realistic test case running on general-purpose cluster architectures (Intel Sandy Bridge) decrease by a factor between 7 and 40 depending on the grid resolution. Finally, we port the application to Intel Xeon Phi (MIC) and NVIDIA GPUs (CUDA) accelerator-based architectures and compare performance, cost and power consumption on all the architectures. Implications on time-constrained operational model configurations are discussed.
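
    As an illustration of one of the optimizations mentioned above, spatial (cache) blocking, the sketch below traverses a grid in small tiles so that data reused by neighbouring points stays in cache; the tile size and the simple 4-point stencil are hypothetical, and WARIS/FALL3D operate on far larger 3-D grids than this toy example.

      # Spatial blocking for a 2-D stencil update: tiling does not change the result,
      # only the memory access pattern.
      import numpy as np

      def blocked_stencil(u, tile=64):
          ny, nx = u.shape
          out = u.copy()
          for j0 in range(1, ny - 1, tile):            # loop over tiles
              for i0 in range(1, nx - 1, tile):
                  j1 = min(j0 + tile, ny - 1)
                  i1 = min(i0 + tile, nx - 1)
                  # update one tile; its four neighbour slices fit in cache together
                  out[j0:j1, i0:i1] = 0.25 * (u[j0-1:j1-1, i0:i1] + u[j0+1:j1+1, i0:i1] +
                                              u[j0:j1, i0-1:i1-1] + u[j0:j1, i0+1:i1+1])
          return out

      u = np.random.rand(512, 512)
      print(np.allclose(blocked_stencil(u),
                        blocked_stencil(u, tile=512)))   # same result regardless of tile size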

  6. Interactive, graphical processing unit-based evaluation of evacuation scenarios at the state scale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perumalla, Kalyan S; Aaby, Brandon G; Yoginath, Srikanth B

    2011-01-01

    In large-scale scenarios, transportation modeling and simulation is severely constrained by simulation time. For example, few real-time simulators scale to evacuation traffic scenarios at the level of an entire state, such as Louisiana (approximately 1 million links) or Florida (2.5 million links). New simulation approaches are needed to overcome severe computational demands of conventional (microscopic or mesoscopic) modeling techniques. Here, a new modeling and execution methodology is explored that holds the potential to provide a tradeoff among the level of behavioral detail, the scale of transportation network, and real-time execution capabilities. A novel, field-based modeling technique and its implementation on graphical processing units are presented. Although additional research with input from domain experts is needed for refining and validating the models, the techniques reported here afford interactive experience at very large scales of multi-million road segments. Illustrative experiments on a few state-scale networks are described based on an implementation of this approach in a software system called GARFIELD. Current modeling capabilities and implementation limitations are described, along with possible use cases and future research.

  7. [Neuropsychological evaluation of a case of organic personality disorder due to penetrating brain injury].

    PubMed

    Sanz de la Torre, J C; Pérez-Ríos, M

    1996-06-01

    In this paper, a case of organic personality disorder due to penetrating brain injury, predominantly localized in the right frontal lobe, is presented. Neuropsychological and neuroimaging (CT scan) studies were performed. We assessed the main cognitive aspects: orientation, attention, memory, intelligence, language, visual-spatial functioning, motor functioning, executive functioning and personality. The results obtained point to disorders in the patient's behavior and in the executive functions. Likewise, other cognitive functions, such as attention, memory, language and visual-spatial functioning, show specific deficits.

  8. Functional Role of Internal and External Visual Imagery: Preliminary Evidences from Pilates

    PubMed Central

    Montuori, Simone; Sorrentino, Pierpaolo; Belloni, Lidia; Sorrentino, Giuseppe

    2018-01-01

    The present study investigates whether there is a functional difference between visualizing a sequence of movements from the first-person (internal, VMI-I) or third-person (external, VMI-E) perspective, which might be relevant for promoting learning. Using a mental chronometry experimental paradigm, we compared the times of execution, of imagination in the VMI-I perspective, and of imagination in the VMI-E perspective for two kinds of Pilates exercises. The analysis was carried out in individuals with different levels of competence (expert, novice, and no-practice individuals). Our results showed that in the Expert group, in the VMI-I perspective, the imagination time was similar to the execution time, while in the VMI-E perspective, the imagination time was significantly lower than the execution time. An opposite pattern was found in the Novice group, in which the time of imagination was similar to that of execution only in the VMI-E perspective, while in the VMI-I perspective, the time of imagination was significantly lower than the time of execution. In the control group, the times of both modalities of imagination were significantly lower than the execution time for each exercise. The present data suggest that, while the VMI-I perspective serves to train an already internalised gesture, the VMI-E perspective could be useful to learn, and then improve, a recently acquired sequence of movements. Moreover, visual imagery is not useful for individuals who lack specific motor experience. The present data offer new insights into the application of mental training techniques, especially in the field of sports. However, further investigations are needed to better understand the functional role of internal and external visual imagery. PMID:29849565

  9. A single aerobic exercise session accelerates movement execution but not central processing.

    PubMed

    Beyer, Kit B; Sage, Michael D; Staines, W Richard; Middleton, Laura E; McIlroy, William E

    2017-03-27

    Previous research has demonstrated that aerobic exercise has disparate effects on speed of processing and movement execution. In simple and choice reaction tasks, aerobic exercise appears to increase speed of movement execution while speed of processing is unaffected. In the flanker task, aerobic exercise has been shown to reduce response time on incongruent trials more than congruent trials, purportedly reflecting a selective influence on speed of processing related to cognitive control. However, it is unclear how changes in speed of processing and movement execution contribute to these exercise-induced changes in response time during the flanker task. This study examined how a single session of aerobic exercise influences speed of processing and movement execution during a flanker task using electromyography to partition response time into reaction time and movement time, respectively. Movement time decreased during aerobic exercise regardless of flanker congruence but returned to pre-exercise levels immediately after exercise. Reaction time during incongruent flanker trials decreased over time in both an aerobic exercise and non-exercise control condition indicating it was not specifically influenced by exercise. This disparate influence of aerobic exercise on movement time and reaction time indicates the importance of partitioning response time when examining the influence of aerobic exercise on speed of processing. The decrease in reaction time over time independent of aerobic exercise indicates that interpreting pre-to-post exercise changes in behavior requires caution. Copyright © 2017 IBRO. Published by Elsevier Ltd. All rights reserved.
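
    The small sketch below illustrates the partitioning described above: with EMG, response time is split into reaction time (stimulus onset to EMG onset, the central-processing component) and movement time (EMG onset to response completion). The timestamps are hypothetical values in milliseconds.

      # Partition response time into reaction time and movement time from EMG onset.
      def partition_response_time(stimulus_onset, emg_onset, response_end):
          reaction_time = emg_onset - stimulus_onset     # central processing component
          movement_time = response_end - emg_onset       # execution component
          response_time = response_end - stimulus_onset
          assert abs(response_time - (reaction_time + movement_time)) < 1e-9
          return reaction_time, movement_time

      rt, mt = partition_response_time(stimulus_onset=0.0, emg_onset=310.0, response_end=470.0)
      print(f"reaction time = {rt} ms, movement time = {mt} ms")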

  10. Automated Generation and Assessment of Autonomous Systems Test Cases

    NASA Technical Reports Server (NTRS)

    Barltrop, Kevin J.; Friberg, Kenneth H.; Horvath, Gregory A.

    2008-01-01

    This slide presentation reviews issues concerning verification and validation testing of autonomous spacecraft, which routinely culminates in the exploration of anomalous or faulted mission-like scenarios, using the work involved in the Dawn mission's tests as an example. Prioritizing which scenarios to develop usually comes down to focusing on the most vulnerable areas and ensuring the best return on investment of test time. Rules-of-thumb strategies often come into play, such as injecting applicable anomalies prior to, during, and after system state changes; or creating cases that ensure good safety-net algorithm coverage. Although experience and judgment in test selection can lead to high levels of confidence about the majority of a system's autonomy, it is likely that important test cases are overlooked. One method to fill in potential test coverage gaps is to automatically generate and execute test cases using algorithms that ensure desirable properties about the coverage. For example, cases can be generated for all possible fault monitors and across all state change boundaries. Of course, the scope of coverage is determined by the test environment capabilities, where a faster-than-real-time, high-fidelity, software-only simulation would allow the broadest coverage. Even real-time systems that can be replicated and run in parallel, and that have reliable set-up and operations features, provide an excellent resource for automated testing. Making detailed predictions for the outcome of such tests can be difficult; when algorithmic means are employed to produce hundreds or even thousands of cases, generating predictions individually is impractical, and generating them with tools requires executable models of the design and environment that themselves require a complete test program. Therefore, evaluating the results of a large number of mission scenario tests poses special challenges. A good approach to address this problem is to automatically score the results based on a range of metrics. Although the specific means of scoring depend highly on the application, the use of formal scoring metrics has high value in identifying and prioritizing anomalies and in presenting an overall picture of the state of the test program. In this paper we present a case study based on automatic generation and assessment of faulted test runs for the Dawn mission, and discuss its role in optimizing the allocation of resources for completing the test program.
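
    The hedged sketch below illustrates the kind of generation-plus-scoring loop described above: test cases are enumerated as the cross product of fault monitors and state-change boundaries, then each (stubbed) run is scored with simple metrics so results can be prioritized. The monitor names, state boundaries and scoring weights are made up, not Dawn's actual fault monitors or metrics.

      # Cross-product test-case generation with automatic scoring of run outcomes.
      import itertools

      FAULT_MONITORS = ["thruster_stuck", "star_tracker_dropout", "battery_undervoltage"]
      STATE_BOUNDARIES = ["pre_slew", "during_slew", "post_slew"]

      def generate_cases():
          for monitor, boundary in itertools.product(FAULT_MONITORS, STATE_BOUNDARIES):
              yield {"inject": monitor, "when": boundary}

      def score_run(telemetry):
          """Toy scoring: reward a safing response, penalize violated constraints."""
          score = 5 if telemetry["entered_safe_mode"] else 0
          score -= 10 * telemetry["constraint_violations"]
          return score

      # Pretend every case was executed in a faster-than-real-time simulation
      results = []
      for case in generate_cases():
          telemetry = {"entered_safe_mode": True, "constraint_violations": 0}  # stub outcome
          results.append((score_run(telemetry), case))

      for score, case in sorted(results, key=lambda r: r[0], reverse=True)[:3]:
          print(score, case)   # highest-priority results first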

  11. Kinematics and Kinetics of Taekwon-do Side Kick

    PubMed Central

    Wąsik, Jacek

    2011-01-01

    The aim of the paper is to present an analysis of the influence of selected kinematic factors on the side kick technique. This issue is especially important in the traditional version of taekwon-do, in which a single strike may reveal the winner. Six taekwon-do (International Taekwon-do Federation) athletes were asked to participate in this case study. Generally accepted criteria of biomechanical analysis of sports technique were adhered to. The athletes executed a side kick (in taekwon-do terminology referred to as yop chagi) three times, in the way they use the kick in board breaking. The obtained data were used to determine the mean velocity changes as a function of the relative extension length of the kicking leg. The maximum knee and foot velocities in the Cartesian coordinate system were determined. The leg lifting time, the duration of kick execution, and the maximum force which the standing foot exerted on the ground were also determined. On the basis of the obtained values, means and standard deviations were calculated. The correlation (r=0.72) shows that greater knee velocity increases the velocity developed by the foot, and that the total time of kick execution depends on the velocities developed by the knee (r = −0.59) and the foot (r = −0.86) in the leg-lifting phase. The maximum speed was obtained, on average, at a leg extension equal to 82% of the maximum length of the fully extended leg; this length can be considered the optimum value for achieving the maximum dynamics of the kick. PMID:23486086

  12. High performance GPU processing for inversion using uniform grid searches

    NASA Astrophysics Data System (ADS)

    Venetis, Ioannis E.; Saltogianni, Vasso; Stiros, Stathis; Gallopoulos, Efstratios

    2017-04-01

    Many geophysical problems are described by redundant systems of highly non-linear ordinary equations with constant terms deriving from measurements and hence representing stochastic variables. Solution (inversion) of such problems is based on numerical optimization methods, based on Monte Carlo sampling or on exhaustive searches in cases of two or even three "free" unknown variables. Recently the TOPological INVersion (TOPINV) algorithm, a grid search-based technique in the R^n space, has been proposed. TOPINV is not based on the minimization of a certain cost function and involves only forward computations, hence avoiding computational errors. The basic concept is to transform observation equations into inequalities on the basis of an optimization parameter k and of their standard errors, and, through repeated "scans" of n-dimensional search grids for decreasing values of k, to identify the optimal clusters of gridpoints which satisfy the observation inequalities and by definition contain the "true" solution. Stochastic optimal solutions and their variance-covariance matrices are then computed as first and second statistical moments. Such exhaustive uniform searches produce an excessive computational load and are extremely time consuming on common CPU-based computers. An alternative is to use a GPU-based computing platform, which nowadays is affordable to the research community and provides much higher computing performance. Using the CUDA programming language to implement TOPINV allows the investigation of the attained speedup in execution time on such a high performance platform. Based on synthetic data we compared the execution time required for two typical geophysical problems, modeling magma sources and seismic faults, described with up to 18 unknown variables, on both CPU/FORTRAN and GPU/CUDA platforms. The same problems for several different sizes of search grids (up to 10^12 gridpoints) and numbers of unknown variables were solved on both platforms, and execution time as a function of the grid dimension for each problem was recorded. Results indicate an average speedup in calculations by a factor of 100 on the GPU platform; for example, problems with 10^12 gridpoints require less than two hours instead of several days on conventional desktop computers. Such a speedup encourages the application of TOPINV on high performance platforms, such as GPUs, in cases where near-real-time decisions are necessary, for example finite fault modeling to identify possible tsunami sources.
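
    The sketch below illustrates the grid-search idea as described above in simplified NumPy form (not the CUDA implementation): observation equations become inequalities |f_i(x) - obs_i| <= k * sigma_i, a uniform grid is scanned, and the cluster of accepted gridpoints yields the mean solution and its covariance. The forward model, a two-parameter line fit, is only a stand-in example.

      # Simplified TOPINV-like uniform grid scan with inequality tests and moments.
      import numpy as np

      def topinv_like(forward, observations, sigmas, grid_axes, k):
          grids = np.meshgrid(*grid_axes, indexing="ij")              # uniform search grid
          params = np.stack([g.ravel() for g in grids], axis=1)       # (num_points, n_params)
          ok = np.ones(len(params), dtype=bool)
          for obs, sigma, f in zip(observations, sigmas, forward):
              ok &= np.abs(f(params) - obs) <= k * sigma              # inequality per observation
          accepted = params[ok]
          if len(accepted) == 0:
              return None, None
          return accepted.mean(axis=0), np.cov(accepted.T)            # first and second moments

      # Stand-in problem: recover (a, b) from noisy y_i = a * x_i + b
      xs = np.array([0.0, 1.0, 2.0, 3.0])
      true_a, true_b = 2.0, 1.0
      obs = true_a * xs + true_b + np.array([0.05, -0.02, 0.03, -0.04])
      forward = [lambda p, x=x: p[:, 0] * x + p[:, 1] for x in xs]
      axes = (np.linspace(0, 4, 201), np.linspace(-1, 3, 201))        # grids for a and b
      mean, cov = topinv_like(forward, obs, sigmas=[0.1] * 4, grid_axes=axes, k=1.0)
      print(mean)          # close to [2.0, 1.0]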

  13. Predicting Operator Execution Times Using CogTool

    NASA Technical Reports Server (NTRS)

    Santiago-Espada, Yamira; Latorella, Kara A.

    2013-01-01

    Researchers and developers of NextGen systems can use predictive human performance modeling tools as an initial approach to obtain skilled user performance times analytically, before system testing with users. This paper describes the CogTool models for a two pilot crew executing two different types of a datalink clearance acceptance tasks, and on two different simulation platforms. The CogTool time estimates for accepting and executing Required Time of Arrival and Interval Management clearances were compared to empirical data observed in video tapes and registered in simulation files. Results indicate no statistically significant difference between empirical data and the CogTool predictions. A population comparison test found no significant differences between the CogTool estimates and the empirical execution times for any of the four test conditions. We discuss modeling caveats and considerations for applying CogTool to crew performance modeling in advanced cockpit environments.

  14. Risk Factors Analysis and Death Prediction in Some Life-Threatening Ailments Using Chi-Square Case-Based Reasoning (χ2 CBR) Model.

    PubMed

    Adeniyi, D A; Wei, Z; Yang, Y

    2018-01-30

    A wealth of data are available within the health care system; however, effective analysis tools for exploring the hidden patterns in these datasets are lacking. To alleviate this limitation, this paper proposes a simple but promising hybrid predictive model obtained by suitably combining the Chi-square distance measurement with the case-based reasoning technique. The study presents the realization of an automated risk calculator and death prediction in some life-threatening ailments using the Chi-square case-based reasoning (χ2 CBR) model. The proposed predictive engine is capable of reducing runtime and speeding up the execution process through the use of a critical χ2 distribution value. This work also showcases the development of a novel feature selection method referred to as the frequent item based rule (FIBR) method. This FIBR method is used for selecting the best features for the proposed χ2 CBR model at the preprocessing stage of the predictive procedure. The implementation of the proposed risk calculator is achieved through the use of an in-house developed PHP program, experimented with the XAMPP/Apache HTTP server as hosting server. The process of data acquisition and case-base development is implemented using the MySQL application. Performance comparison between our system, the NBY, the ED-KNN, the ANN, the SVM, the Random Forest and the traditional CBR techniques shows that the quality of predictions produced by our system outperforms that of the baseline methods studied. The results of our experiments show that the precision rate and predictive quality of our system are in most cases equal to or greater than 70%. Our results also show that the proposed system executes faster than the baseline methods studied. Therefore, the proposed risk calculator is capable of providing useful, consistent, fast, accurate and efficient risk level prediction to both patients and physicians at any time, online and on a real-time basis.
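
    As a hedged sketch of the core retrieval step in a chi-square case-based reasoning model, the snippet below compares a new case to stored cases with a chi-square distance over normalized feature vectors and reuses the outcome of the nearest sufficiently close case. The features, stored cases and threshold are made-up illustrations, not the paper's clinical dataset or its FIBR feature selection.

      # Chi-square distance retrieval over a toy case base.
      def chi_square_distance(a, b, eps=1e-12):
          return 0.5 * sum((x - y) ** 2 / (x + y + eps) for x, y in zip(a, b))

      CASE_BASE = [
          # (normalized feature vector, recorded outcome/risk level)
          ([0.8, 0.6, 0.9, 0.4], "high risk"),
          ([0.2, 0.3, 0.1, 0.5], "low risk"),
          ([0.5, 0.5, 0.6, 0.6], "moderate risk"),
      ]

      def predict(new_case, critical_value=0.25):
          distance, outcome = min(
              (chi_square_distance(new_case, features), outcome)
              for features, outcome in CASE_BASE
          )
          # Only reuse the retrieved case if it is close enough, loosely mirroring the
          # use of a critical chi-square value to reject poor matches quickly
          return outcome if distance <= critical_value else "no confident match"

      print(predict([0.75, 0.55, 0.85, 0.45]))   # expected to retrieve the "high risk" case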

  15. Examination of Regional Transit Service under Contracting : A Case Study in the Greater New Orleans Region : [Executive Summary

    DOT National Transportation Integrated Search

    2011-03-01

    In late 2008, New Orleans Regional Transit Authority (RTA) began to execute a delegated management contract with a multinational private firm, in order to not only increase efficiency and effectiveness in operation and maintenance of public tra...

  16. More reliable protein NMR peak assignment via improved 2-interval scheduling.

    PubMed

    Chen, Zhi-Zhong; Lin, Guohui; Rizzi, Romeo; Wen, Jianjun; Xu, Dong; Xu, Ying; Jiang, Tao

    2005-03-01

    Protein NMR peak assignment refers to the process of assigning a group of "spin systems" obtained experimentally to a protein sequence of amino acids. The automation of this process is still an unsolved and challenging problem in NMR protein structure determination. Recently, protein NMR peak assignment has been formulated as an interval scheduling problem (ISP), where a protein sequence P of amino acids is viewed as a discrete time interval I (the amino acids on P correspond one-to-one to the time units of I), each subset S of spin systems that are known to originate from consecutive amino acids of P is viewed as a "job" j(S), the preference of assigning S to a subsequence P' of consecutive amino acids on P is viewed as the profit of executing job j(S) in the subinterval of I corresponding to P', and the goal is to maximize the total profit of executing the jobs (on a single machine) during I. The interval scheduling problem is MAX SNP-hard in general; but in the real practice of protein NMR peak assignment, each job j(S) usually requires at most 10 consecutive time units, and typically the jobs that require one or two consecutive time units are the most difficult to assign/schedule. In order to solve these most difficult assignments, we present an efficient 13/7-approximation algorithm for the special case of the interval scheduling problem where each job takes one or two consecutive time units. Combining this algorithm with a greedy filtering strategy for handling long jobs (i.e., jobs that need more than two consecutive time units), we obtain a new efficient heuristic for protein NMR peak assignment. Our experimental study shows that the new heuristic produces the best peak assignment in most of the cases, compared with the NMR peak assignment algorithms in the recent literature. The above algorithm is also the first approximation algorithm for a nontrivial case of the well-known interval scheduling problem that breaks the ratio 2 barrier.

  17. Executive Compensation: Is It Better to be Lucky Than Good

    DTIC Science & Technology

    2013-07-30

    are executives paid so much? Executive compensation has for years sparked interest from main street (citizens) to Wall Street (shareholders) to Capitol...from the luck of working for the right firm in the right industry at the right time. To clarify this muddied picture, Jeffrey Brookman at Idaho...luck best explains executive salaries. They ultimately conclude executives are compensated for their skills. Study Design and Method In their

  18. Age-related differences in the motor planning of a lower leg target matching task.

    PubMed

    Davies, Brenda L; Gehringer, James E; Kurz, Max J

    2015-12-01

    While the development and execution of upper extremity motor plans have been well explored, little is known about how individuals plan and execute rapid, goal-directed motor tasks with the lower extremities. Furthermore, the amount of time needed to integrate the proper amount of visual and proprioceptive feedback before being able to accurately execute a goal-directed movement is not well understood, especially in children. Therefore, the purpose of this study was to initially examine how the amount of motor planning time provided to a child before movement execution may influence the preparation and execution of a lower leg goal-directed movement. The results showed that the amount of pre-movement motor planning time provided may influence the reaction time and accuracy of a goal-directed leg movement. All subjects in the study had longer reaction times and less accurate movements when no pre-movement motor planning time was provided. In addition, the children had slower reaction times, slower movements, and less accurate movements than the adults for all the presented targets and motor planning times. These results highlight that children may require more time to successfully plan a goal-directed movement with the lower extremity. This suggests that children may potentially have less robust internal models than adults for these types of motor skills. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Hyperactivity in boys with attention-deficit/hyperactivity disorder (ADHD): The role of executive and non-executive functions.

    PubMed

    Hudec, Kristen L; Alderson, R Matt; Patros, Connor H G; Lea, Sarah E; Tarle, Stephanie J; Kasper, Lisa J

    2015-01-01

    Motor activity of boys (age 8-12 years) with (n=19) and without (n=18) ADHD was objectively measured with actigraphy across experimental conditions that varied with regard to demands on executive functions. Activity exhibited during two n-back (1-back, 2-back) working memory tasks was compared to activity during a choice-reaction time (CRT) task that placed relatively fewer demands on executive processes and during a simple reaction time (SRT) task that required mostly automatic processing with minimal executive demands. Results indicated that children in the ADHD group exhibited greater activity compared to children in the non-ADHD group. Further, both groups exhibited the greatest activity during conditions with high working memory demands, followed by the reaction time and control task conditions, respectively. The findings indicate that large-magnitude increases in motor activity are predominantly associated with increased demands on working memory, though demands on non-executive processes are sufficient to elicit small to moderate increases in motor activity as well. Published by Elsevier Ltd.

  20. Executive and arousal vigilance decrement in the context of the attentional networks: The ANTI-Vea task.

    PubMed

    Luna, Fernando Gabriel; Marino, Julián; Roca, Javier; Lupiáñez, Juan

    2018-05-20

    Vigilance is generally understood as the ability to detect infrequent critical events over long time periods. In tasks like the Sustained Attention to Response Task (SART), participants tend to detect fewer events across time, a phenomenon known as "vigilance decrement". However, vigilance might also involve sustaining a tonic arousal level. In the Psychomotor Vigilance Test (PVT), the vigilance decrement corresponds to an increment across time in both the mean and the variability of reaction time. The present study aimed to develop a single task - the Attentional Networks Test for Interactions and Vigilance - executive and arousal components (ANTI-Vea) - to simultaneously assess both components of vigilance (i.e., the executive vigilance as in the SART, and the arousal vigilance as in the PVT), while measuring the classic attentional functions (phasic alertness, orienting, and executive control). In Experiment #1, the executive vigilance decrement was found as an increment in response bias. In Experiment #2, this result was replicated, and the arousal vigilance decrement was simultaneously observed as an increment in reaction time. The ANTI-Vea solves some issues observed in the previous ANTI-V task with the executive vigilance measure (e.g., a low hit rate and no vigilance decrement). Furthermore, the new ANTI-Vea task assesses both components of vigilance together with other typical attentional functions. The new attentional networks test developed here may be useful to provide a better understanding of the human attentional system. The role of sensitivity and response bias in the executive vigilance decrement is discussed. Copyright © 2018 Elsevier B.V. All rights reserved.

  1. Objectively Measured Physical Activity and Cognitive Function in Older Adults.

    PubMed

    Zhu, Wenfei; Wadley, Virginia G; Howard, Virginia J; Hutto, Brent; Blair, Steven N; Hooker, Steven P

    2017-01-01

    Emerging evidence suggests physical activity (PA) is associated with cognitive function. To overcome limitations of self-report PA measures, this study investigated the association of accelerometer-measured PA with incident cognitive impairment and longitudinal cognition among older adults. Participants were recruited from the cohort study Reasons for Geographic and Racial Differences in Stroke in the United States. Accelerometers provided PA measures, including the percentage of total accelerometer wearing time spent in moderate-to-vigorous-intensity PA (MVPA%), light-intensity PA, and sedentary time for four to seven consecutive days at baseline. Cognitive impairment was defined by the Six-Item Screener. Letter fluency, animal fluency, word list learning, and Montreal Cognitive Assessment (orientation and recall) were conducted to assess executive function and memory. Participants (N = 6452, 69.7 ± 8.5 yr, 55.3% women, 30.5% Black) with usable accelerometer and cognition measures spent extremely limited time in MVPA (1.5% ± 1.9% of accelerometer wearing time). During an average of 3 yr of follow-up, 346 cases of incident cognitive impairment were observed. After adjustments, participants in higher MVPA% quartiles had a lower risk of cognitive impairment (i.e., quartile 2: odds ratio = 0.64, 95% confidence interval = 0.48-0.84) and better maintenance in executive function (≥0.03 z-score units) and memory (≥0.12 z-score units) compared with quartile 1 (P < 0.05). Stratified analyses showed the same association among White adults, but higher MVPA% was associated with better maintenance of only memory among Black adults. No significance was found for light-intensity PA or sedentary time. There was a dose-response relationship between MVPA% and cognitive function in older adults, with higher levels associated with a 36% or lower risk of cognitive impairment and better maintenance of memory and executive function over time, particularly in White adults.

  2. Time-Elastic Generative Model for Acceleration Time Series in Human Activity Recognition

    PubMed Central

    Munoz-Organero, Mario; Ruiz-Blazquez, Ramona

    2017-01-01

    Body-worn sensors in general and accelerometers in particular have been widely used in order to detect human movements and activities. The execution of each type of movement by each particular individual generates sequences of time series of sensed data from which specific movement related patterns can be assessed. Several machine learning algorithms have been used over windowed segments of sensed data in order to detect such patterns in activity recognition based on intermediate features (either hand-crafted or automatically learned from data). The underlying assumption is that the computed features will capture statistical differences that can properly classify different movements and activities after a training phase based on sensed data. In order to achieve high accuracy and recall rates (and guarantee the generalization of the system to new users), the training data have to contain enough information to characterize all possible ways of executing the activity or movement to be detected. This could imply large amounts of data and a complex and time-consuming training phase, which has been shown to be even more relevant when automatically learning the optimal features to be used. In this paper, we present a novel generative model that is able to generate sequences of time series for characterizing a particular movement based on the time elasticity properties of the sensed data. The model is used to train a stack of auto-encoders in order to learn the particular features able to detect human movements. The results of movement detection using a newly generated database with information on five users performing six different movements are presented. The generalization of results using an existing database is also presented in the paper. The results show that the proposed mechanism is able to obtain acceptable recognition rates (F = 0.77) even in the case of using different people executing a different sequence of movements and using different hardware. PMID:28208736
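
    The "time elasticity" of the sensed data can be illustrated, very loosely, by a toy time-warping step that resamples an acceleration window along a stretched or compressed time axis to produce plausible alternative executions of the same movement. This is only a stand-in sketch, not the paper's generative model or auto-encoder pipeline; the synthetic window and warp factors are made up.

      import numpy as np

      def time_warp(signal, factor):
          """Stretch/compress a 1-D window by `factor`, then resample to the original length."""
          n = len(signal)
          warped_axis = np.linspace(0, n - 1, int(round(n * factor)))
          warped = np.interp(warped_axis, np.arange(n), signal)        # elastic version of the window
          return np.interp(np.linspace(0, len(warped) - 1, n), np.arange(len(warped)), warped)

      window = np.sin(np.linspace(0, 2 * np.pi, 50))                   # synthetic accelerometer window
      augmented = [time_warp(window, f) for f in (0.8, 1.0, 1.25)]     # slower / identical / faster executions
      print([a.shape for a in augmented])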

  3. Time-Elastic Generative Model for Acceleration Time Series in Human Activity Recognition.

    PubMed

    Munoz-Organero, Mario; Ruiz-Blazquez, Ramona

    2017-02-08

    Body-worn sensors in general and accelerometers in particular have been widely used in order to detect human movements and activities. The execution of each type of movement by each particular individual generates sequences of time series of sensed data from which specific movement related patterns can be assessed. Several machine learning algorithms have been used over windowed segments of sensed data in order to detect such patterns in activity recognition based on intermediate features (either hand-crafted or automatically learned from data). The underlying assumption is that the computed features will capture statistical differences that can properly classify different movements and activities after a training phase based on sensed data. In order to achieve high accuracy and recall rates (and guarantee the generalization of the system to new users), the training data have to contain enough information to characterize all possible ways of executing the activity or movement to be detected. This could imply large amounts of data and a complex and time-consuming training phase, which has been shown to be even more relevant when automatically learning the optimal features to be used. In this paper, we present a novel generative model that is able to generate sequences of time series for characterizing a particular movement based on the time elasticity properties of the sensed data. The model is used to train a stack of auto-encoders in order to learn the particular features able to detect human movements. The results of movement detection using a newly generated database with information on five users performing six different movements are presented. The generalization of results using an existing database is also presented in the paper. The results show that the proposed mechanism is able to obtain acceptable recognition rates (F = 0.77) even in the case of using different people executing a different sequence of movements and using different hardware.

  4. A Case for Application Oblivious Energy-Efficient MPI Runtime

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Venkatesh, Akshay; Vishnu, Abhinav; Hamidouche, Khaled

    Power has become the major impediment in designing large scale high-end systems. Message Passing Interface (MPI) is the de facto communication interface used as the back-end for designing applications, programming models and runtimes for these systems. Slack, the time spent by an MPI process in a single MPI call, provides a potential for energy and power savings if an appropriate power reduction technique such as core-idling/Dynamic Voltage and Frequency Scaling (DVFS) can be applied without perturbing the application's execution time. Existing techniques that exploit slack for power savings assume that application behavior repeats across iterations/executions. However, an increasing use of adaptive, data-dependent workloads combined with system factors (OS noise, congestion) makes this assumption invalid. This paper proposes and implements Energy Aware MPI (EAM), an application-oblivious energy-efficient MPI runtime. EAM uses a combination of communication models of common MPI primitives (point-to-point, collective, progress, blocking/non-blocking) and an online observation of slack for maximizing energy efficiency. Each power lever incurs time overhead, which must be amortized over slack to minimize degradation. When predicted communication time exceeds a lever overhead, the lever is used as soon as possible to maximize energy efficiency. When mis-prediction occurs, the lever(s) are used automatically at specific intervals for amortization. We implement EAM using MVAPICH2 and evaluate it on ten applications using up to 4096 processes. Our performance evaluation on an InfiniBand cluster indicates that EAM can reduce energy consumption by 5-41% in comparison to the default approach, with negligible (less than 4% in all cases) performance loss.
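
    The slack-amortization rule described above (apply a power lever only when its transition overhead fits inside the predicted slack) can be sketched as follows; the lever table and overhead figures are hypothetical placeholders and are not EAM or MVAPICH2 internals.

      def choose_lever(predicted_slack_us, levers):
          """Pick the most beneficial lever whose overhead can be amortized over the predicted slack."""
          usable = [l for l in levers if l["overhead_us"] < predicted_slack_us]
          # Among the levers that fit, prefer the one with the largest power reduction.
          return max(usable, key=lambda l: l["power_saving_w"], default=None)

      levers = [
          {"name": "core_idle", "overhead_us": 50,  "power_saving_w": 20},
          {"name": "dvfs_low",  "overhead_us": 500, "power_saving_w": 35},
      ]
      print(choose_lever(predicted_slack_us=800, levers=levers))   # enough slack -> dvfs_low
      print(choose_lever(predicted_slack_us=100, levers=levers))   # only core idling fits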

  5. CASPER Version 2.0

    NASA Technical Reports Server (NTRS)

    Chien, Steve; Rabideau, Gregg; Tran, Daniel; Knight, Russell; Chouinard, Caroline; Estlin, Tara; Gaines, Daniel; Clement, Bradley; Barrett, Anthony

    2007-01-01

    CASPER is designed to perform automated planning of interdependent activities within a system subject to requirements, constraints, and limitations on resources. In contradistinction to the traditional concept of batch planning followed by execution, CASPER implements a concept of continuous planning and replanning in response to unanticipated changes (including failures), integrated with execution. Improvements over other, similar software that have been incorporated into CASPER version 2.0 include an enhanced executable interface to facilitate integration with a wide range of execution software systems and supporting software libraries; features to support execution while reasoning about urgency, importance, and impending deadlines; features that enable accommodation to a wide range of computing environments that include various central processing units and random- access-memory capacities; and improved generic time-server and time-control features.

  6. Sarah's Story: One Teacher's Enactment of TPACK+ in a History Classroom

    ERIC Educational Resources Information Center

    Van Vaerenewyck, Leah M.; Shinas, Valerie Harlow; Steckel, Barbara

    2017-01-01

    This article presents a descriptive case study that describes a secondary history teacher's expression of sociocultural-oriented technological pedagogical content knowledge (TPACK) in the classroom, the execution of which we describe as TPACK+. TPACK+ describes sociocultural-oriented teacher knowledge requisite for the dynamic execution of TPACK…

  7. Superintendent-Business Executive Collaboration in Intermediary Organizations: Moral Agency and Democratic Functioning

    ERIC Educational Resources Information Center

    Bennett, Jeffrey V.; McKee, Tiffany; Martin, Staci

    2014-01-01

    This case study describes collaboration between business executives and superintendents to influence local/regional K-12 educational change. Specifically, we examine participant like-mindedness about the ethics and appropriate focus of K-12 intermediary collaboration, the extent of democratic functioning, and key individuals to involve. Data…

  8. 8 CFR 1003.106 - Right to be heard and disposition.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Section 1003.106 Aliens and Nationality EXECUTIVE OFFICE FOR IMMIGRATION REVIEW, DEPARTMENT OF JUSTICE GENERAL PROVISIONS EXECUTIVE OFFICE FOR IMMIGRATION REVIEW Professional Conduct for Practitioners-Rules... in § 1003.103(b)(2)(i)-(iii), then the Board shall refer the case to the Chief Immigration Judge for...

  9. Progress on Evaluating School Buildings in Scotland

    ERIC Educational Resources Information Center

    Thomson, Keith

    2006-01-01

    In June 2004, the Scottish Executive published guidance on evaluating completed school building projects, "Building Our Future: Scotland's School Estate," as part of the School Estate Strategy; the guidance included a case study evaluation at an Edinburgh primary school. The Executive is continuing to support evaluation work on the school estate…

  10. 77 FR 63417 - Senior Executive Service; Departmental Performance Review Board Members

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-16

    ..., Deputy Assistant Secretary for Human Resources and Chief Human Capital Officer Christopher J. Meade.... FOR FURTHER INFORMATION CONTACT: Julia J. Markham, Human Resources Specialist (Executive Resources... least three members. In the case of an appraisal of a career appointee, more than half the members shall...

  11. Executive Functioning Mediates the Effect of Behavioral Problems on Depression in Mothers of Children With Developmental Disabilities

    PubMed Central

    Chan, Wai; Smith, Leann E.; Greenberg, Jan S.; Hong, Jinkuk; Mailick, Marsha R.

    2017-01-01

    The present investigation explored long-term relationships of behavioral symptoms of adolescents and adults with developmental disabilities with the mental health of their mothers. Fragile X premutation carrier mothers of an adolescent or adult child with fragile X syndrome (n = 95), and mothers of a grown child with autism (n = 213) were included. Behavioral symptoms at Time 1 were hypothesized to predict maternal depressive symptoms at Time 3 via maternal executive dysfunction at Time 2. Results provided support for the mediating pathway of executive dysfunction. Additionally, the association of behavioral symptoms with executive dysfunction differed across the two groups, suggesting that premutation carriers may be more susceptible to caregiving stress due to their genotype. PMID:28095060

  12. Extracting Loop Bounds for WCET Analysis Using the Instrumentation Point Graph

    NASA Astrophysics Data System (ADS)

    Betts, A.; Bernat, G.

    2009-05-01

    Every calculation engine proposed in the literature of Worst-Case Execution Time (WCET) analysis requires upper bounds on loop iterations. Existing mechanisms to procure this information are either error prone, because they are gathered from the end-user, or limited in scope, because automatic analyses target very specific loop structures. In this paper, we present a technique that obtains bounds completely automatically for arbitrary loop structures. In particular, we show how to employ the Instrumentation Point Graph (IPG) to parse traces of execution (generated by an instrumented program) in order to extract bounds relative to any loop-nesting level. With this technique, therefore, non-rectangular dependencies between loops can be captured, allowing more accurate WCET estimates to be calculated. We demonstrate the improvement in accuracy by comparing WCET estimates computed through our HMB framework against those computed with state-of-the-art techniques.
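
    A hedged sketch of the trace-parsing idea: given an execution trace with loop enter/iterate/exit events (a simplified stand-in for instrumentation points on the IPG), record the maximum iteration count observed for each loop across all of its entries and use it as the loop bound.

      from collections import defaultdict

      def loop_bounds(trace):
          """trace: sequence of ('enter' | 'iterate' | 'exit', loop_id) events in execution order.
          Returns the maximum observed iteration count per loop across all entries."""
          bounds = defaultdict(int)
          current = defaultdict(int)
          for event, loop in trace:
              if event == "enter":
                  current[loop] = 0
              elif event == "iterate":
                  current[loop] += 1
              elif event == "exit":
                  bounds[loop] = max(bounds[loop], current[loop])
          return dict(bounds)

      # The loop iterates 2, then 3 times across two entries -> extracted bound is 3.
      trace = [("enter", "inner"), ("iterate", "inner"), ("iterate", "inner"), ("exit", "inner"),
               ("enter", "inner"), ("iterate", "inner"), ("iterate", "inner"), ("iterate", "inner"), ("exit", "inner")]
      print(loop_bounds(trace))   # {'inner': 3}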

  13. Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, L.M.; Hochstedler, R.D.

    1997-02-01

    Execution of the integrated TIGER series (ITS) of coupled electron/photon Monte Carlo radiation transport codes has been accelerated by modifying the FORTRAN source code for more efficient computation. Each member code of ITS was benchmarked and profiled with a specific test case that directed the acceleration effort toward the most computationally intensive subroutines. Techniques for accelerating these subroutines included replacing linear search algorithms with binary versions, replacing the pseudo-random number generator, reducing program memory allocation, and proofing the input files for geometrical redundancies. All techniques produced identical or statistically similar results to the original code. Final benchmark timing of the accelerated code resulted in speed-up factors of 2.00 for TIGER (the one-dimensional slab geometry code), 1.74 for CYLTRAN (the two-dimensional cylindrical geometry code), and 1.90 for ACCEPT (the arbitrary three-dimensional geometry code).
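
    One of the techniques listed above, replacing linear search algorithms with binary versions over sorted tables, can be illustrated generically (this is not the ITS FORTRAN code; the grid and lookup below are placeholders):

      import bisect

      def linear_bin(grid, x):
          # Original pattern: O(n) scan for the interval of the sorted grid containing x.
          for i in range(len(grid) - 1):
              if grid[i] <= x < grid[i + 1]:
                  return i
          return len(grid) - 2

      def binary_bin(grid, x):
          # Replacement: O(log n) lookup on the same sorted grid.
          return min(max(bisect.bisect_right(grid, x) - 1, 0), len(grid) - 2)

      grid = [0.0, 0.5, 1.0, 2.0, 5.0]
      assert linear_bin(grid, 1.3) == binary_bin(grid, 1.3) == 2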

  14. Method and associated apparatus for capturing, servicing, and de-orbiting earth satellites using robotics

    NASA Technical Reports Server (NTRS)

    Cepollina, Frank J. (Inventor); Corbo, James E. (Inventor); Burns, Richard D. (Inventor); Jedhrich, Nicholas M. (Inventor); Holz, Jill M. (Inventor)

    2009-01-01

    This invention is a method and supporting apparatus for autonomously capturing, servicing and de-orbiting a free-flying spacecraft, such as a satellite, using robotics. The capture of the spacecraft includes the steps of optically seeking and ranging the satellite using LIDAR, and matching tumble rates, rendezvousing and berthing with the satellite. Servicing of the spacecraft may be done using supervised autonomy, which is allowing a robot to execute a sequence of instructions without intervention from a remote human-occupied location. These instructions may be packaged at the remote station in a script and uplinked to the robot for execution upon remote command giving authority to proceed. Alternately, the instructions may be generated by Artificial Intelligence (AI) logic onboard the robot. In either case, the remote operator maintains the ability to abort an instruction or script at any time as well as the ability to intervene using manual override to teleoperate the robot.

  15. Towards Supervising Remote Dexterous Robots Across Time Delay

    NASA Technical Reports Server (NTRS)

    Hambuchen, Kimberly; Bluethmann, William; Goza, Michael; Ambrose, Robert; Wheeler, Kevin; Rabe, Ken

    2006-01-01

    The President's Vision for Space Exploration, laid out in 2004, relies heavily upon robotic exploration of the lunar surface in early phases of the program. Prior to the arrival of astronauts on the lunar surface, these robots will be required to be controlled across space and time, posing a considerable challenge for traditional telepresence techniques. Because time delays will be measured in seconds, not minutes as is the case for Mars Exploration, uploading the plan for a day seems excessive. An approach for controlling dexterous robots under intermediate time delay is presented, in which software running within a ground control cockpit predicts the intention of an immersed robot supervisor, then the remote robot autonomously executes the supervisor's intended tasks. Initial results are presented.

  16. Symbolic PathFinder: Symbolic Execution of Java Bytecode

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Rungta, Neha

    2010-01-01

    Symbolic Pathfinder (SPF) combines symbolic execution with model checking and constraint solving for automated test case generation and error detection in Java programs with unspecified inputs. In this tool, programs are executed on symbolic inputs representing multiple concrete inputs. Values of variables are represented as constraints generated from the analysis of Java bytecode. The constraints are solved using off-the-shelf solvers to generate test inputs guaranteed to achieve complex coverage criteria. SPF has been used successfully at NASA, in academia, and in industry.
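
    SPF itself works on Java bytecode, but the underlying idea can be sketched with a toy example: enumerate the path constraints of a small branching program and hand each constraint set to an off-the-shelf solver (z3 here, assumed to be installed) to obtain one concrete test input per feasible path.

      from z3 import Int, Solver, sat

      x = Int("x")
      # Paths of:  if x > 10: (if x < 20: A else: B) else: C
      paths = {
          "A": [x > 10, x < 20],
          "B": [x > 10, x >= 20],
          "C": [x <= 10],
      }
      for name, constraints in paths.items():
          solver = Solver()
          solver.add(*constraints)
          if solver.check() == sat:
              print(name, "covered by x =", solver.model()[x])   # one concrete input per feasible path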

  17. Expected Utility Distributions for Flexible, Contingent Execution

    NASA Technical Reports Server (NTRS)

    Bresina, John L.; Washington, Richard

    2000-01-01

    This paper presents a method for using expected utility distributions in the execution of flexible, contingent plans. A utility distribution maps the possible start times of an action to the expected utility of the plan suffix starting with that action. The contingent plan encodes a tree of possible courses of action and includes flexible temporal constraints and resource constraints. When execution reaches a branch point, the eligible option with the highest expected utility at that point in time is selected. The utility distributions make this selection sensitive to the runtime context, yet still efficient. Our approach uses predictions of action duration uncertainty as well as expectations of resource usage and availability to determine when an action can execute and with what probability. Execution windows and probabilities inevitably change as execution proceeds, but such changes do not invalidate the cached utility distributions; thus, dynamic updating of utility information is minimized.
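
    The branch-selection rule can be sketched as follows: each eligible option carries a utility function over start times, and the option with the highest expected utility at the current time is chosen. The option names, time windows and utility curves below are hypothetical placeholders.

      def pick_option(options, now):
          """options: list of (name, earliest, latest, utility_fn); utility_fn maps a
          start time to the expected utility of the plan suffix starting then."""
          eligible = [(name, u(now)) for name, earliest, latest, u in options
                      if earliest <= now <= latest]
          return max(eligible, key=lambda pair: pair[1], default=None)

      options = [
          ("drive_to_rock_A", 0, 50, lambda t: 10.0 - 0.1 * t),   # utility decays for late starts
          ("image_horizon", 20, 80, lambda t: 6.0),               # flat utility, wider window
      ]
      print(pick_option(options, now=30))   # -> ('drive_to_rock_A', 7.0)
      print(pick_option(options, now=60))   # -> ('image_horizon', 6.0)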

  18. Estimating the executive demands of a one-back choice reaction time task by means of the selective interference paradigm.

    PubMed

    Szmalec, Arnaud; Vandierendonck, André

    2007-08-01

    The present study proposes a new executive task, the one-back choice reaction time (RT) task, and implements the selective interference paradigm to estimate the executive demands of the processing components involved in this task. Based on the similarities between a one-back choice RT task and the n-back updating task, it was hypothesized that one-back delaying of a choice reaction involves executive control. In three experiments, framed within Baddeley's (1986) working-memory model, a one-back choice RT task, a choice RT task, articulatory suppression, and matrix tapping were performed concurrently with primary tasks involving verbal, visuospatial, and executive processing. The results demonstrate that one-back delaying of a choice reaction interferes with tasks requiring executive control, while the potential interference at the level of the verbal or visuospatial working memory slave systems remains minimal.

  19. A Genetic Algorithm for UAV Routing Integrated with a Parallel Swarm Simulation

    DTIC Science & Technology

    2005-03-01

    Metrics. 2.3.5.1 Amdahl’s, Gustafson-Barsis’s, and Sun-Ni’s Laws. At the heart of parallel computing is the ratio of communication time to...parallel execution. Three ‘laws’ in particular are of interest with regard to this ratio: Amdahl’s Law, the Gustafson-Barsis’s Law, and Sun-Ni’s Law...Amdahl’s Law makes the case for fixed size speedup. This conjecture states that speedup saturates and efficiency drops as a consequence of holding the

  20. Flexibility of orthographic and graphomotor coordination during a handwritten copy task: effect of time pressure

    PubMed Central

    Sausset, Solen; Lambert, Eric; Olive, Thierry

    2013-01-01

    The coordination of the various processes involved in language production is a subject of keen debate in writing research. Some authors hold that writing processes can be flexibly coordinated according to task demands, whereas others claim that process coordination is entirely inflexible. For instance, orthographic planning has been shown to be resource-dependent during handwriting, but inflexible in typing, even under time pressure. The present study therefore went one step further in studying flexibility in the coordination of orthographic processing and graphomotor execution, by measuring the impact of time pressure during a handwritten copy task. Orthographic and graphomotor processes were observed via syllable processing. Writers copied out two- and three-syllable words three times in a row, with and without time pressure. Latencies and letter measures at syllable boundaries were analyzed. We hypothesized that if coordination is flexible and varies according to task demands, it should be modified by time pressure, affecting both latency before execution and duration of execution. We therefore predicted that the extent of syllable processing before execution would be reduced under time pressure and, as a consequence, syllable effects during execution would be more salient. Results showed, however, that time pressure interacted neither with syllable number nor with syllable structure. Accordingly, syllable processing appears to remain the same regardless of time pressure. The flexibility of process coordination during handwriting is discussed, as is the operationalization of time pressure constraints. PMID:24319435

  1. Ad-Hoc Queries over Document Collections - A Case Study

    NASA Astrophysics Data System (ADS)

    Löser, Alexander; Lutter, Steffen; Düssel, Patrick; Markl, Volker

    We discuss the novel problem of supporting analytical business intelligence queries over web-based textual content, e.g., BI-style reports based on 100,000s of documents from an ad-hoc web search result. Neither conventional search engines nor conventional Business Intelligence and ETL tools address this problem, which lies at the intersection of their capabilities. "Google Squared" and our system GOOLAP.info are examples of these kinds of systems. They execute information extraction methods over one or several document collections at query time and integrate extracted records into a common view or tabular structure. Frequent extraction and object resolution failures cause incomplete records that cannot be joined into a record answering the query. Our focus is the identification of join-reordering heuristics that maximize the number of complete records answering a structured query. With respect to given costs for document extraction, we propose two novel join operations: the multi-way CJ-operator joins records from multiple relationships extracted from a single document, while the two-way join operator DJ ensures data density by removing incomplete records from results. In a preliminary case study we observe that our join-reordering heuristics positively impact result size and record density and lower execution costs.
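
    A rough sketch of the density-ensuring two-way join (the DJ-operator) described above: join two sets of extracted records on a shared key and keep only combined records with no missing attributes. The field names and records are illustrative assumptions, not GOOLAP.info internals.

      def dj_join(left, right, key):
          index = {r[key]: r for r in right if r.get(key) is not None}
          joined = []
          for l in left:
              r = index.get(l.get(key))
              if r is None:
                  continue                          # no partner record -> incomplete, drop
              merged = {**l, **r}
              if all(v is not None for v in merged.values()):
                  joined.append(merged)             # keep only dense (complete) records
          return joined

      companies = [{"company": "Acme", "ceo": "J. Doe"}, {"company": "Foo", "ceo": None}]
      revenues = [{"company": "Acme", "revenue": 1.2e9}]
      print(dj_join(companies, revenues, key="company"))   # only the complete Acme record survives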

  2. Neuropsychological profile and social cognition in congenital central hypoventilation syndrome (CCHS): Correlation with neuroimaging in a clinical case.

    PubMed

    Esteso Orduña, Borja; Seijas Gómez, Raquel; García Esparza, Elena; Briceño, Emily M; Melero Llorente, Javier; Fournier Del Castillo, María de la Concepción

    2018-02-01

    Congenital central hypoventilation syndrome (CCHS) is a rare genetic disorder due to paired-like homeobox gene (PHOX2B) mutations. CCHS patients suffer from dysregulation of the autonomic nervous system characterized by an absent or extremely reduced response to hypercapnia and hypoxia, with neuropsychological deficits. The aim of this exploratory study is to describe the longitudinal neuropsychological profile and its correlations with magnetic resonance imaging (MRI) of a child with CCHS with a PHOX2B mutation. A comprehensive neuropsychological evaluation was conducted serially at age 7 years 4 months and 10 years 3 months, including assessment of intellectual functioning (IQ), motor functioning, perception, attention, executive functions, language, memory, social cognition, academic skills, and psychopathology. Reliable change index (RCI) scores were used to assess changes between assessments. We collected T1-weighted (spin-lattice relaxation time), fluid-attenuated inversion recovery (FLAIR), and T2-weighted (spin-spin relaxation time) images from the child at age 10 years 3 months using a 1.5-tesla MRI scanner. IQ, processing speed index (PSI), social cognition (theory of mind and facial emotion recognition), selective attention, naming, academic skills (reading/comprehension), and manual speed with the right hand declined in the second evaluation relative to the initial evaluation, while visuoconstructional praxis, receptive vocabulary, working memory, and arithmetic skill improved. The patient showed a remarkable global deterioration in executive functions (planning, task flexibility, behavioral regulation, and metacognition) as revealed by parental report and clinical evaluation. MRI revealed gliosis from the head to the tail of the hippocampus and thinning of the parahippocampal gyri. In a clinical case of CCHS, serial evaluation revealed deterioration of executive functions and social cognition over a 3-year interval. These changes corresponded to hippocampal damage as revealed in MRI, which may have affected social cognition through its role in the default mode network. Serial neuropsychological assessment is clinically useful in managing the needs of these patients.

  3. Executive Mind, Timely Action.

    ERIC Educational Resources Information Center

    Torbert, William R.

    1983-01-01

    The idea of "Executive Mind" carries with it the notion of purposeful and effective action. Part I of this paper characterizes three complements to "Executive Mind"--"Observing Mind,""Theorizing Mind," and "Passionate Mind"--and offers historical figures exemplifying all four types. The concluding…

  4. The DaveMLTranslator: An Interface for DAVE-ML Aerodynamic Models

    NASA Technical Reports Server (NTRS)

    Hill, Melissa A.; Jackson, E. Bruce

    2007-01-01

    It can take weeks or months to incorporate a new aerodynamic model into a vehicle simulation and validate the performance of the model. The Dynamic Aerospace Vehicle Exchange Markup Language (DAVE-ML) has been proposed as a means to reduce the time required to accomplish this task by defining a standard format for typical components of a flight dynamic model. The purpose of this paper is to describe an object-oriented C++ implementation of a class that interfaces a vehicle subsystem model specified in DAVE-ML and a vehicle simulation. Using the DaveMLTranslator class, aerodynamic or other subsystem models can be automatically imported and verified at run-time, significantly reducing the elapsed time between receipt of a DAVE-ML model and its integration into a simulation environment. The translator performs variable initializations, data table lookups, and mathematical calculations for the aerodynamic build-up, and executes any embedded static check-cases for verification. The implementation is efficient, enabling real-time execution. Simple interface code for the model inputs and outputs is the only requirement to integrate the DaveMLTranslator as a vehicle aerodynamic model. The translator makes use of existing table-lookup utilities from the Langley Standard Real-Time Simulation in C++ (LaSRS++). The design and operation of the translator class is described and comparisons with existing, conventional, C++ aerodynamic models of the same vehicle are given.

  5. Atomicity violation detection using access interleaving invariants

    DOEpatents

    Zhou, Yuanyuan; Lu, Shan; Tucek, Joseph Andrew

    2013-09-10

    During execution of a program, the situation where the atomicity of a pair of instructions that are to be executed atomically is violated is identified, and a bug is detected as occurring in the program at the pair of instructions. The pairs of instructions that are to be executed atomically can be identified in different manners, such as by executing a program multiple times and using the results of those executions to automatically identify the pairs of instructions.
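
    The abstract leaves the detection details open; the sketch below shows one common way access interleaving invariants are checked (in the spirit of AVIO-style analysis): for each pair of consecutive accesses by a thread to a shared variable, flag any intervening remote access that makes the interleaving unserializable. The trace format is an assumption made for illustration.

      # Local-remote-local patterns that cannot be serialized (read = R, write = W).
      UNSERIALIZABLE = {("R", "W", "R"), ("W", "W", "R"), ("R", "W", "W"), ("W", "R", "W")}

      def find_violations(trace):
          """trace: list of (thread, op, var) with op in {'R', 'W'}, in execution order."""
          violations = []
          for i, (t1, op1, v1) in enumerate(trace):
              # Find this thread's next access to the same variable.
              for j in range(i + 1, len(trace)):
                  t2, op2, v2 = trace[j]
                  if t2 == t1 and v2 == v1:
                      # Is any conflicting remote access interleaved between the pair?
                      for t3, op3, v3 in trace[i + 1:j]:
                          if t3 != t1 and v3 == v1 and (op1, op3, op2) in UNSERIALIZABLE:
                              violations.append((i, j, (op1, op3, op2)))
                      break
          return violations

      trace = [("T1", "R", "x"), ("T2", "W", "x"), ("T1", "R", "x")]   # classic R-W-R atomicity violation
      print(find_violations(trace))   # [(0, 2, ('R', 'W', 'R'))]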

  6. A Developmental Window into Trade-offs in Executive Function: The Case of Task Switching versus Response Inhibition in 6-year-olds

    PubMed Central

    Chatham, Christopher H.; Wiseheart, Melody; Munakata, Yuko

    2014-01-01

    Good executive function has been linked to many positive outcomes in academic performance, health, and social competence. However, some aspects of executive function may interfere with other cognitive processes. Childhood provides a unique test case for investigating such cognitive trade-offs, given the dramatic failures and developments observed during this period. For example, most children categorically switch or perseverate when asked to switch between rules on a card-sorting task. To test potential trade-offs with the development of task switching abilities, we compared 6-year-olds who switched versus perseverated in a card-sorting task on two aspects of inhibitory control: response inhibition (via a stop signal task) and interference control (via a Simon task). Across two studies, switchers showed worse response inhibition than perseverators, consistent with the idea of cognitive trade-offs; however, switchers showed better interference control than perseverators, consistent with prior work documenting benefits associated with the development of executive function. This pattern of positive and negative associations may reflect aspects of working memory (active maintenance of current goals, and clearing of prior goals) that help children focused on a single task-goal but hurt in situations with conflicting goals. Implications for understanding components of executive function and their relationships across development are discussed. PMID:24791710

  7. Two approaches to estimating the effect of parenting on the development of executive function in early childhood.

    PubMed

    Blair, Clancy; Raver, C Cybele; Berry, Daniel J

    2014-02-01

    In the current article, we contrast 2 analytical approaches to estimate the relation of parenting to executive function development in a sample of 1,292 children assessed longitudinally between the ages of 36 and 60 months of age. Children were administered a newly developed and validated battery of 6 executive function tasks tapping inhibitory control, working memory, and attention shifting. Residualized change analysis indicated that higher quality parenting as indicated by higher scores on widely used measures of parenting at both earlier and later time points predicted more positive gain in executive function at 60 months. Latent change score models in which parenting and executive function over time were held to standards of longitudinal measurement invariance provided additional evidence of the association between change in parenting quality and change in executive function. In these models, cross-lagged paths indicated that in addition to parenting predicting change in executive function, executive function bidirectionally predicted change in parenting quality. Results were robust with the addition of covariates, including child sex, race, maternal education, and household income-to-need. Strengths and drawbacks of the 2 analytic approaches are discussed, and the findings are considered in light of emerging methodological innovations for testing the extent to which executive function is malleable and open to the influence of experience.

  8. Remote video auditing with real-time feedback in an academic surgical suite improves safety and efficiency metrics: a cluster randomised study.

    PubMed

    Overdyk, Frank J; Dowling, Oonagh; Newman, Sheldon; Glatt, David; Chester, Michelle; Armellino, Donna; Cole, Brandon; Landis, Gregg S; Schoenfeld, David; DiCapua, John F

    2016-12-01

    Compliance with the surgical safety checklist during operative procedures has been shown to reduce inhospital mortality and complications but proper execution by the surgical team remains elusive. We evaluated the impact of remote video auditing with real-time provider feedback on checklist compliance during sign-in, time-out and sign-out and case turnover times. Prospective, cluster randomised study in a 23-operating room (OR) suite. Surgeons, anaesthesia providers, nurses and support staff. ORs were randomised to receive, or not receive, real-time feedback on safety checklist compliance and efficiency metrics via display boards and text messages, followed by a period during which all ORs received feedback. Checklist compliance (Pass/Fail) during sign-in, time-out and sign-out demonstrated by (1) use of checklist, (2) team attentiveness, (3) required duration, (4) proper sequence and duration of case turnover times. Sign-in, time-out and sign-out PASS rates increased from 25%, 16% and 32% during baseline phase (n=1886) to 64%, 84% and 68% for feedback ORs versus 40%, 77% and 51% for no-feedback ORs (p<0.004) during the intervention phase (n=2693). Pass rates were 91%, 95% and 84% during the all-feedback phase (n=2001). For scheduled cases (n=1406, 71%), feedback reduced mean turnover times by 14% (41.4 min vs 48.1 min, p<0.004), and the improvement was sustained during the all-feedback period. Feedback had no effect on turnover time for unscheduled cases (n=587, 29%). Our data indicate that remote video auditing with feedback improves surgical safety checklist compliance for all cases, and turnover time for scheduled cases, but not for unscheduled cases. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  9. Remote video auditing with real-time feedback in an academic surgical suite improves safety and efficiency metrics: a cluster randomised study

    PubMed Central

    Overdyk, Frank J; Dowling, Oonagh; Newman, Sheldon; Glatt, David; Chester, Michelle; Armellino, Donna; Cole, Brandon; Landis, Gregg S; Schoenfeld, David; DiCapua, John F

    2016-01-01

    Importance Compliance with the surgical safety checklist during operative procedures has been shown to reduce inhospital mortality and complications but proper execution by the surgical team remains elusive. Objective We evaluated the impact of remote video auditing with real-time provider feedback on checklist compliance during sign-in, time-out and sign-out and case turnover times. Design, setting Prospective, cluster randomised study in a 23-operating room (OR) suite. Participants Surgeons, anaesthesia providers, nurses and support staff. Exposure ORs were randomised to receive, or not receive, real-time feedback on safety checklist compliance and efficiency metrics via display boards and text messages, followed by a period during which all ORs received feedback. Main outcome(s) and measure(s) Checklist compliance (Pass/Fail) during sign-in, time-out and sign-out demonstrated by (1) use of checklist, (2) team attentiveness, (3) required duration, (4) proper sequence and duration of case turnover times. Results Sign-in, time-out and sign-out PASS rates increased from 25%, 16% and 32% during baseline phase (n=1886) to 64%, 84% and 68% for feedback ORs versus 40%, 77% and 51% for no-feedback ORs (p<0.004) during the intervention phase (n=2693). Pass rates were 91%, 95% and 84% during the all-feedback phase (n=2001). For scheduled cases (n=1406, 71%), feedback reduced mean turnover times by 14% (41.4 min vs 48.1 min, p<0.004), and the improvement was sustained during the all-feedback period. Feedback had no effect on turnover time for unscheduled cases (n=587, 29%). Conclusions and relevance Our data indicate that remote video auditing with feedback improves surgical safety checklist compliance for all cases, and turnover time for scheduled cases, but not for unscheduled cases. PMID:26658775

  10. Task Decomposition Module For Telerobot Trajectory Generation

    NASA Astrophysics Data System (ADS)

    Wavering, Albert J.; Lumia, Ron

    1988-10-01

    A major consideration in the design of trajectory generation software for a Flight Telerobotic Servicer (FTS) is that the FTS will be called upon to perform tasks which require a diverse range of manipulator behaviors and capabilities. In a hierarchical control system where tasks are decomposed into simpler and simpler subtasks, the task decomposition module which performs trajectory planning and execution should therefore be able to accommodate a wide range of algorithms. In some cases, it will be desirable to plan a trajectory for an entire motion before manipulator motion commences, as when optimizing over the entire trajectory. Many FTS motions, however, will be highly sensory-interactive, such as moving to attain a desired position relative to a non-stationary object whose position is periodically updated by a vision system. In this case, the time-varying nature of the trajectory may be handled either by frequent replanning using updated sensor information, or by using an algorithm which creates a less specific state-dependent plan that determines the manipulator path as the trajectory is executed (rather than a priori). This paper discusses a number of trajectory generation techniques from these categories and how they may be implemented in a task decomposition module of a hierarchical control system. The structure, function, and interfaces of the proposed trajectory generation module are briefly described, followed by several examples of how different algorithms may be performed by the module. The proposed task decomposition module provides a logical structure for trajectory planning and execution, and supports a large number of published trajectory generation techniques.

  11. A methodology for Manufacturing Execution Systems (MES) implementation

    NASA Astrophysics Data System (ADS)

    Govindaraju, Rajesri; Putra, Krisna

    2016-02-01

    A manufacturing execution system (MES) is an information systems (IS) application that bridges the gap between IS at the top level, namely enterprise resource planning (ERP), and IS at the lower levels, namely the automation systems. MES provides a medium for optimizing the manufacturing process as a whole on a real-time basis. By using MES in combination with ERP and other automation systems, a manufacturing company is expected to achieve high competitiveness. In implementing MES, functional integration, i.e. making all the components of the manufacturing system work well together, is the most difficult challenge. For this, there is an industry standard that specifies the sub-systems of a manufacturing execution system and defines the boundaries between ERP systems, MES, and other automation systems. The standard is known as ISA-95. Although the advantages of using MES have been stated in some studies, not much research has been done on how to implement MES effectively. The purpose of this study is to develop a methodology describing how an MES implementation project should be managed, utilising the support of the ISA-95 reference model in the system development process. A proposed methodology was developed based on a general IS development methodology. The methodology was then revisited based on the understanding of the specific characteristics of MES implementation projects found in an implementation case at an Indonesian steel manufacturing company. The case study highlighted the importance of applying an effective requirement elicitation method during the initial system assessment process, managing system interfaces and labor division in the design process, and performing a pilot deployment before putting the whole system into operation.

  12. MATTS- A Step Towards Model Based Testing

    NASA Astrophysics Data System (ADS)

    Herpel, H.-J.; Willich, G.; Li, J.; Xie, J.; Johansen, B.; Kvinnesland, K.; Krueger, S.; Barrios, P.

    2016-08-01

    In this paper we describe a model-based approach to testing of on-board software and compare it with the traditional validation strategy currently applied to satellite software. The major problems that software engineering will face over at least the next two decades are increasing application complexity, driven by the need for autonomy, and serious application robustness. In other words, how do we actually get to declare success when trying to build applications one or two orders of magnitude more complex than today's applications? To solve these problems, the software engineering process has to be improved in at least two aspects: 1) software design and 2) software testing. The software design process has to evolve towards model-based approaches with extensive use of code generators. Today, testing is an essential, but time and resource consuming activity in the software development process. Generating a short but effective test suite usually requires a lot of manual work and expert knowledge. In a model-based process, among other subtasks, test construction and test execution can also be partially automated. The basic idea behind the presented study was to start from a formal model (e.g. state machines), generate abstract test cases, and then convert them to concrete executable test cases (input and expected output pairs). The generated concrete test cases were applied to on-board software. Results were collected and evaluated with respect to applicability, cost-efficiency, effectiveness at fault finding, and scalability.

  13. 5 CFR 317.306 - Conversion of employees under time limited appointments.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 1 2011-01-01 2011-01-01 false Conversion of employees under time... CIVIL SERVICE REGULATIONS EMPLOYMENT IN THE SENIOR EXECUTIVE SERVICE Conversion to the Senior Executive Service § 317.306 Conversion of employees under time limited appointments. (a) Coverage. This section...

  14. 17 CFR 37.900 - Core Principle 9-Timely publication of trading information.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... publication of trading information. 37.900 Section 37.900 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION SWAP EXECUTION FACILITIES Timely Publication of Trading Information § 37.900 Core Principle 9—Timely publication of trading information. (a) In general. The swap execution facility shall...

  15. Putting time into proof outlines

    NASA Technical Reports Server (NTRS)

    Schneider, Fred B.; Bloom, Bard; Marzullo, Keith

    1991-01-01

    A logic for reasoning about timing of concurrent programs is presented. The logic is based on proof outlines and can handle maximal parallelism as well as resource-constrained execution environments. The correctness proof for a mutual exclusion protocol that uses execution timings in a subtle way illustrates the logic in action.

  16. The ethics of the Texas death penalty and its impact on a prolonged appeals process.

    PubMed

    Pearlman, T

    1998-01-01

    Society remains sharply divided as to the deterrent value of capital punishment. Following the reintroduction of the death penalty in the United States, Texas law mandates the affirmative predictability of future dangerousness beyond a reasonable doubt before a jury can impose the ultimate penalty for capital murder. The validity of prediction of dangerousness has been challenged in three Texas landmark cases before the U.S. Supreme Court. The case of Karla Faye Tucker highlights the moral controversy that occurs when execution follows an appeals process stretching over more than a decade, during which time personality growth and the effects of prison rehabilitation may have eliminated or curbed criminal tendencies.

  17. Issues surrounding lethal injection as a means of capital punishment.

    PubMed

    Romanelli, Frank; Whisman, Tyler; Fink, Joseph L

    2008-12-01

    Lethal injection as a method of state-sanctioned capital punishment was initially proposed in the United States in 1977 and used for the first time in 1982. Most lethal injection protocols use a sequential drug combination of sodium thiopental, pancuronium bromide, and potassium chloride. Lethal injection was originally introduced as a more humane form of execution compared with existing mechanical methods such as electrocution, toxic gassing, hanging, or firing squad. Lethal injection has not, however, been without controversy. Several states are considering whether lethal injection meets constitutional scrutiny forbidding cruel and unusual punishment. Recently in the case of Ralph Baze and Thomas C. Bowling, Petitioners, v John D. Rees, Commissioner, Kentucky Department of Corrections et al, the United States Supreme Court upheld the constitutionality of the lethal injection protocol as carried out in the Commonwealth of Kentucky. Most of the debate has surrounded the dosing and procedures used in lethal injection and whether the drug combinations and measures for administering the drugs truly produce a timely, pain-free, and fail-safe death. Many have also raised issues regarding the "medicalization" of execution and the ethics of health care professionals' participation in any part of the lethal injection process. As a result of all these issues, the future of lethal injection as a means of execution in the United States is under significant scrutiny. Outcomes of ongoing legislative and judicial reviews might result in cessation of lethal injection in totality or in alterations involving specific drug combinations or administration procedures.

  18. A Discussion of the Discrete Fourier Transform Execution on a Typical Desktop PC

    NASA Technical Reports Server (NTRS)

    White, Michael J.

    2006-01-01

    This paper will discuss and compare the execution times of three examples of the Discrete Fourier Transform (DFT). The first two examples will demonstrate the direct implementation of the algorithm. In the first example, the Fourier coefficients are generated at the execution of the DFT. In the second example, the coefficients are generated prior to execution and the DFT coefficients are indexed at execution. The last example will demonstrate the Cooley-Tukey algorithm, better known as the Fast Fourier Transform. All examples were written in C and executed on a PC using a Pentium 4 running at 1.7 GHz. As a function of N, the total complex data size, the direct-implementation DFT executes, as expected, at order N^2 and the FFT executes at order N log2 N. At N = 16K, there is an increase in processing time beyond what is expected. This is not caused by the implementation but is a consequence of the effect that machine architecture and memory hierarchy have on implementation. This paper will include a brief overview of digital signal processing, along with a discussion of contemporary work with discrete Fourier processing.
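
    The comparison described above is easy to reproduce in miniature: the sketch below times a direct O(N^2) DFT, with twiddle factors generated at execution, against a library FFT. Absolute timings will of course differ from the paper's 1.7 GHz Pentium 4 figures.

      import time
      import numpy as np

      def direct_dft(x):
          N = len(x)
          n = np.arange(N)
          # Twiddle factors generated at execution time: W_N^(kn) = exp(-2*pi*i*k*n/N).
          W = np.exp(-2j * np.pi * np.outer(n, n) / N)
          return W @ x

      x = np.random.rand(1024) + 1j * np.random.rand(1024)
      t0 = time.perf_counter(); X_direct = direct_dft(x); t1 = time.perf_counter()
      X_fft = np.fft.fft(x);                               t2 = time.perf_counter()
      assert np.allclose(X_direct, X_fft)                  # same transform, very different cost
      print(f"direct DFT: {t1 - t0:.4f} s, FFT: {t2 - t1:.6f} s")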

  19. Parallel program debugging with flowback analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choi, Jongdeok.

    1989-01-01

    This thesis describes the design and implementation of an integrated debugging system for parallel programs running on shared-memory multiprocessors. The goal of the debugging system is to present to the programmer a graphical view of the dynamic program dependences while keeping the execution-time overhead low. The author first describes the use of flowback analysis to provide information on causal relationships between events in a program's execution without re-executing the program for debugging. Execution-time overhead is kept low by recording only a small amount of trace during a program's execution. He uses semantic analysis and a technique called incremental tracing to keep the time and space overhead low. As part of the semantic analysis, he uses a static program dependence graph structure that reduces the amount of work done at compile time and takes advantage of the dynamic information produced during execution time. The cornerstone of the incremental tracing concept is to generate a coarse trace during execution and to fill in incrementally, during the interactive portion of the debugging session, the gap between the information gathered in the coarse trace and the information needed to perform the flowback analysis using the coarse trace. He then describes how to extend the flowback analysis to parallel programs. The flowback analysis can span process boundaries; i.e., the most recent modification to a shared variable might be traced to a different process than the one that contains the current reference. The static and dynamic program dependence graphs of the individual processes are tied together with synchronization and data-dependence information to form complete graphs that represent the entire program.
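
    As a rough, hypothetical illustration (not the thesis's implementation), flowback analysis answers questions of the form "which event most recently modified the value this read observed?", possibly across process boundaries. A coarse trace of timestamped events is enough to sketch the idea in Python; the trace format below is made up.

        # Hypothetical coarse-trace entries: (timestamp, process, operation, variable).
        trace = [
            (1, "P0", "write", "x"),
            (2, "P1", "write", "x"),
            (3, "P0", "read",  "x"),
        ]

        def flowback(trace, read_index):
            """Most recent write to the variable read at read_index, searched across
            all processes, so the dependence may end in a different process."""
            t_read, _, _, var = trace[read_index]
            writes = [e for e in trace
                      if e[3] == var and e[2] == "write" and e[0] < t_read]
            return max(writes, key=lambda e: e[0], default=None)

        print(flowback(trace, 2))   # (2, 'P1', 'write', 'x'): a cross-process dependence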

  20. "Between the Heavens and the Earth": Narrating the Execution of Moses Paul

    ERIC Educational Resources Information Center

    Salyer, Matt

    2012-01-01

    The 1772 execution of the Mohegan sailor Moses Paul served as the occasion for Samson Occom's popular "Sermon," reprinted in numerous editions. Recent work by Ava Chamberlain seeks to recover Paul's version of events from contemporary court records. This article argues that Paul's "firsthand" account of the case and autobiographical narrative…

  1. Are "High Potential" Executives Capable of Building Learning-Oriented Organisations? Reflections on the French Case

    ERIC Educational Resources Information Center

    Belet, Daniel

    2007-01-01

    Purpose: The author's interest in learning organisation development leads him to examine large French companies' practices regarding "high potential" executive policies, to question their selection and development processes, and to question their capability to develop learning-oriented organisations. The author also tries to explain why most…

  2. At the Fundraising Core: Strategic Public Relations in Fundraising Practice. CASE White Paper

    ERIC Educational Resources Information Center

    Satchwell, Carol M.

    2010-01-01

    This white paper reports on a study exploring the views of chief fundraising executives at private colleges and universities about the relationship between public relations and fundraising. The research focused on how fundraising executives define public relations and use public relations tactics and strategies within their institutions'…

  3. The Constitutional Aspects of the War Powers Resolution of November 7, 1973.

    DTIC Science & Technology

    legislation is examined in light of the constitutional separation of powers between the legislative and executive branch of government and its impact on...the executive. This essay contains an examination of the historical constitutional cases and the constitutional convention discussions on the separation of powers to ’make war.’ (Author)

  4. 17 CFR Appendix B to Part 36 - Guidance on, and Acceptable Practices in, Compliance With Core Principles

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... spot-month positions. Spot-month limits should be adopted for significant price discovery contracts to... market or derivatives transaction execution facility should set the spot-month limit for its significant... designated contract market or derivatives transaction execution facility. In this case, the spot-month...

  5. Entrepreneurship Education for Executive MBAs: The Case of a Caribbean Business School

    ERIC Educational Resources Information Center

    Allahar, Haven; Brathwaite, Candace

    2017-01-01

    Entrepreneurship courses are now a feature of the curricula of many tertiary-level business schools. While there is a growing body of research on the subject of entrepreneurship education and learning, studies of the executive master of business administration (EMBA) are relatively sparse. This article offers an example of an entrepreneurship…

  6. Execution-Inspired Murder: A Form of Suicide?

    ERIC Educational Resources Information Center

    Van Wormer, Katherine

    1995-01-01

    Describes each of 20 cases briefly to show the pattern of killers who committed their crime in order to be executed. Concludes that the death penalty itself can serve as an invitation to murder, and asserts that legislators should vote against the implementation of the death penalty in order to save lives. (JPS)

  7. Anthropological analysis of the Second World War skeletal remains from three karst sinkholes located in southern Croatia.

    PubMed

    Jerković, Ivan; Bašić, Željana; Bečić, Kristijan; Jambrešić, Gordana; Grujić, Ivan; Alujević, Antonio; Kružić, Ivana

    2016-11-01

    Although in cases of war crimes the main effort goes into the identification of victims, it is crucial to consider the execution event as a whole. Thus, the goal of the research was to determine the trauma type and probable cause of death from the skeletal remains of civilians executed by partisans during the Second World War and found in three karst sinkholes, and to explain the context in which the injuries occurred. We determined biological profiles, pathological conditions, and traumas, and assessed their lethality. Nineteen skeletons were found; 68.4% had at least one perimortem trauma, classified as lethal, or lethal if untreated, in 69.2% of cases. The type of execution and the violence administered proved to be age- and health-dependent: the elderly and diseased were executed with the intention to kill, by gunshot while facing the victims, whilst more violent behavior towards younger and healthy individuals was indicated by a higher frequency of blunt force trauma. Copyright © 2016 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  8. Multiprogramming performance degradation - Case study on a shared memory multiprocessor

    NASA Technical Reports Server (NTRS)

    Dimpsey, R. T.; Iyer, R. K.

    1989-01-01

    The performance degradation due to multiprogramming overhead is quantified for a parallel-processing machine. Measurements of real workloads were taken, and a moderate correlation was found between the completion time of a program and the amount of system overhead measured during program execution. Experiments in controlled environments were then conducted to calculate a lower bound on the performance degradation of parallel jobs caused by multiprogramming overhead. The results show that the multiprogramming overhead of parallel jobs consumes at least 4 percent of the processor time. When two or more serial jobs are introduced into the system, this amount increases to 5.3 percent.

  9. Malware detection and analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chiang, Ken; Lloyd, Levi; Crussell, Jonathan

    Embodiments of the invention describe systems and methods for malicious software detection and analysis. A binary executable comprising obfuscated malware on a host device may be received, and incident data indicating a time when the binary executable was received and identifying processes operating on the host device may be recorded. The binary executable is analyzed via a scalable plurality of execution environments, including one or more non-virtual execution environments and one or more virtual execution environments, to generate runtime data and deobfuscation data attributable to the binary executable. At least some of the runtime data and deobfuscation data attributable to the binary executable is stored in a shared database, while at least some of the incident data is stored in a private, non-shared database.
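
    A minimal, hedged sketch of the data flow described here: incident data goes to a private store while runtime and deobfuscation results from several execution environments go to a shared store. The sandbox objects and their run() method are hypothetical placeholders, not an interface from this work.

        import sqlite3, time

        shared = sqlite3.connect("shared_results.db")       # runtime and deobfuscation data
        private = sqlite3.connect("private_incidents.db")   # incident data, never shared
        shared.execute("CREATE TABLE IF NOT EXISTS results (sample TEXT, env TEXT, data TEXT)")
        private.execute("CREATE TABLE IF NOT EXISTS incidents (sample TEXT, received REAL, procs TEXT)")

        def analyze(sample_path, host_processes, environments):
            # Record when the binary was received and which processes were running on the host.
            private.execute("INSERT INTO incidents VALUES (?, ?, ?)",
                            (sample_path, time.time(), ",".join(host_processes)))
            # Fan the sample out to a scalable mix of virtual and non-virtual environments.
            for env in environments:
                runtime_data = env.run(sample_path)          # hypothetical sandbox call
                shared.execute("INSERT INTO results VALUES (?, ?, ?)",
                               (sample_path, env.name, runtime_data))
            shared.commit()
            private.commit()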

  10. Software for Automation of Real-Time Agents, Version 2

    NASA Technical Reports Server (NTRS)

    Fisher, Forest; Estlin, Tara; Gaines, Daniel; Schaffer, Steve; Chouinard, Caroline; Engelhardt, Barbara; Wilklow, Colette; Mutz, Darren; Knight, Russell; Rabideau, Gregg

    2005-01-01

    Version 2 of Closed Loop Execution and Recovery (CLEaR) has been developed. CLEaR is an artificial intelligence computer program for use in planning and execution of actions of autonomous agents, including, for example, Deep Space Network (DSN) antenna ground stations, robotic exploratory ground vehicles (rovers), robotic aircraft (UAVs), and robotic spacecraft. CLEaR automates the generation and execution of command sequences, the monitoring of sequence execution, and the modification of the command sequence in response to execution deviations and failures as well as new goals for the agent to achieve. The development of CLEaR has focused on the unification of planning and execution to increase the ability of the autonomous agent to perform under tight resource and time constraints coupled with uncertainty in how many resources and how much time will be required to perform a task. This unification is realized by extending the traditional three-tier robotic control architecture to increase the interaction between the software components that perform deliberative and reactive functions. The increased interaction reduces the need to replan, enables earlier detection of the need to replan, and enables replanning to occur before the agent enters a state of failure.
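
    A hedged sketch of the unified planning/execution loop this record describes; the planner and executive objects and their methods below are illustrative stand-ins, not CLEaR's interfaces. The point is that resource and time margins are checked after every command, so replanning can begin before a failure state is reached.

        def run_agent(planner, executive, goals):
            plan = planner.plan(goals)                  # initial command sequence
            while plan:
                step = plan.pop(0)
                status = executive.execute(step)        # reactive layer runs one command
                # Tight deliberative/reactive interaction: inspect resource and time
                # margins after every step instead of waiting for outright failure.
                if status.deviated or not planner.margins_ok(plan, status):
                    plan = planner.replan(goals, status)    # repair before failure occurs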

  11. Naps Enhance Executive Attention in Preschool-Aged Children.

    PubMed

    Cremone, Amanda; McDermott, Jennifer M; Spencer, Rebecca M C

    2017-09-01

    Executive attention is impaired following sleep loss in school-aged children, adolescents, and adults. Whether naps improve attention relative to nap deprivation in preschool-aged children is unknown. The aim of this study was to compare executive attention in preschool children following a nap and an interval of wake. Sixty-nine children, 35-70 months of age, completed a Flanker task to assess executive attention following a nap and an equivalent interval of wake. Overall, accuracy was greater after the nap compared with the wake interval. Reaction time(s) did not differ between the nap and wake intervals. Results did not differ between children who napped consistently and those who napped inconsistently, suggesting that naps benefit executive attention of preschoolers regardless of nap habituality. These results indicate that naps enhance attention in preschool children. As executive attention supports executive functioning and learning, nap promotion may improve early education outcomes. © The Author 2017. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  12. Timing, sequencing, and executive control in repetitive movement production.

    PubMed

    Krampe, Ralf Th; Mayr, Ulrich; Kliegl, Reinhold

    2005-06-01

    The authors demonstrate that the timing and sequencing of target durations require low-level timing and executive control. Sixteen young (M_age = 19 years) and 16 older (M_age = 70 years) adults participated in 2 experiments. In Experiment 1, individual mean-variance functions for low-level timing (isochronous tapping) and the sequencing of multiple targets (rhythm production) revealed (a) a dissociation of low-level timing and sequencing in both age groups, (b) negligible age differences for low-level timing, and (c) large age differences for sequencing. Experiment 2 supported the distinction between low-level timing and executive functions: Selection against a dominant rhythm and switching between rhythms impaired performances in both age groups and induced pronounced perseveration of the dominant pattern in older adults. ((c) 2005 APA, all rights reserved).

  13. Real-Time Agent-Based Modeling Simulation with in-situ Visualization of Complex Biological Systems: A Case Study on Vocal Fold Inflammation and Healing.

    PubMed

    Seekhao, Nuttiiya; Shung, Caroline; JaJa, Joseph; Mongeau, Luc; Li-Jessen, Nicole Y K

    2016-05-01

    We present an efficient and scalable scheme for implementing agent-based modeling (ABM) simulation with In Situ visualization of large complex systems on heterogeneous computing platforms. The scheme is designed to make optimal use of the resources available on a heterogeneous platform consisting of a multicore CPU and a GPU, resulting in minimal to no resource idle time. Furthermore, the scheme was implemented under a client-server paradigm that enables remote users to visualize and analyze simulation data as it is being generated at each time step of the model. Performance of a simulation case study of vocal fold inflammation and wound healing with 3.8 million agents shows 35× and 7× speedup in execution time over single-core and multi-core CPU respectively. Each iteration of the model took less than 200 ms to simulate, visualize and send the results to the client. This enables users to monitor the simulation in real-time and modify its course as needed.

  14. Employer-provided health insurance and hospital mergers.

    PubMed

    Garmon, Christopher

    2013-07-01

    This paper explores the impact of employer-provided health insurance on hospital competition and hospital mergers. Under employer-provided health insurance, employer executives act as agents for their employees in selecting health insurance options for their firm. The paper investigates whether a merger of hospitals favored by executives will result in a larger price increase than a merger of competing hospitals elsewhere. This is found to be the case even when the executive has the same opportunity cost of travel as her employees and even when the executive is the sole owner of the firm, retaining all profits. This is consistent with the Federal Trade Commission's findings in its challenge of Evanston Northwestern Healthcare's acquisition of Highland Park Hospital. Implications of the model are further tested with executive location data and hospital data from Florida and Texas.

  15. Executive Resources and Item-Context Binding: Exploring the Influence of Concurrent Inhibition, Updating, and Shifting Tasks on Context Memory

    PubMed Central

    Nieznański, Marek; Obidziński, Michał; Zyskowska, Emilia; Niedziałkowska, Daria

    2015-01-01

    Previous research has demonstrated that context memory performance decreases as a result of cognitive load. However, the role of the availability of specific executive resources has not yet been specified. In a dual-task experiment, participants performed three kinds of concurrent tasks engaging inhibition, updating, or shifting operations. In comparison with a no-load single-task condition, a significant decrease in item and context memory was observed, regardless of the kind of executive task. When executive load conditions were compared with non-specific cognitive load conditions, a significant interference effect was observed in the case of the inhibition task. The inhibition process appears to be an aspect of executive control that relies on the same resource as item-context binding does, especially when binding refers to associations retrieved from long-term memory. PMID:26435761

  16. Ten Years of Change in Executive Education.

    ERIC Educational Resources Information Center

    Bolt, James F.

    1993-01-01

    As recently as the 1980s, most companies did not pay much attention to executive education. In the 1990s, many see executive education as a must for revamping competitive strategies, increasing productivity, improving quality, reducing cycle time, and revitalizing corporate culture. (Author/JOW)

  17. Individual differences in executive control relate to metaphor processing: an eye movement study of sentence reading

    PubMed Central

    Columbus, Georgie; Sheikh, Naveed A.; Côté-Lecaldare, Marilena; Häuser, Katja; Baum, Shari R.; Titone, Debra

    2015-01-01

    Metaphors are common elements of language that allow us to creatively stretch the limits of word meaning. However, metaphors vary in their degree of novelty, which determines whether people must create new meanings on-line or retrieve previously known metaphorical meanings from memory. Such variations affect the degree to which general cognitive capacities such as executive control are required for successful comprehension. We investigated whether individual differences in executive control relate to metaphor processing using eye movement measures of reading. Thirty-nine participants read sentences including metaphors or idioms, another form of figurative language that is more likely to rely on meaning retrieval. They also completed the AX-CPT, a domain-general executive control task. In Experiment 1, we examined sentences containing metaphorical or literal uses of verbs, presented with or without prior context. In Experiment 2, we examined sentences containing idioms or literal phrases for the same participants to determine whether the link to executive control was qualitatively similar or different to Experiment 1. When metaphors were low familiar, all people read verbs used as metaphors more slowly than verbs used literally (this difference was smaller for high familiar metaphors). Executive control capacity modulated this pattern in that high executive control readers spent more time reading verbs when a prior context forced a particular interpretation (metaphorical or literal), and they had faster total metaphor reading times when there was a prior context. Interestingly, executive control did not relate to idiom processing for the same readers. Here, all readers had faster total reading times for high familiar idioms than literal phrases. Thus, executive control relates to metaphor but not idiom processing for these readers, and for the particular metaphor and idiom reading manipulations presented. PMID:25628557

  18. Individual Differences in Childhood Sleep Problems Predict Later Cognitive Executive Control

    PubMed Central

    Friedman, Naomi P.; Corley, Robin P.; Hewitt, John K.; Wright, Kenneth P.

    2009-01-01

    Study Objective: To determine whether individual differences in developmental patterns of general sleep problems are associated with 3 executive function abilities—inhibiting, updating working memory, and task shifting—in late adolescence. Participants: 916 twins (465 female, 451 male) and parents from the Colorado Longitudinal Twin Study. Measurements and Results: Parents reported their children's sleep problems at ages 4 years, 5 y, 7 y, and 9–16 y based on a 7-item scale from the Child-Behavior Checklist; a subset of children (n = 568) completed laboratory assessments of executive functions at age 17. Latent variable growth curve analyses were used to model individual differences in longitudinal trajectories of childhood sleep problems. Sleep problems declined over time, with ~70% of children having ≥ 1 problem at age 4 and ~33% of children at age 16. However, significant individual differences in both the initial levels of problems (intercept) and changes across time (slope) were observed. When executive function latent variables were added to the model, the intercept did not significantly correlate with the later executive function latent variables; however, the slope variable significantly (P < 0.05) negatively correlated with inhibiting (r = −0.27) and updating (r = −0.21), but not shifting (r = −0.10) abilities. Further analyses suggested that the slope variable predicted the variance common to the 3 executive functions (r = −0.29). Conclusions: Early levels of sleep problems do not seem to have appreciable implications for later executive functioning. However, individuals whose sleep problems decrease more across time show better general executive control in late adolescence. Citation: Friedman NP; Corley RP; Hewitt JK; Wright KP. Individual differences in childhood sleep problems predict later cognitive executive control. SLEEP 2009;32(3):323-333. PMID:19294952

  19. Individual differences in executive control relate to metaphor processing: an eye movement study of sentence reading.

    PubMed

    Columbus, Georgie; Sheikh, Naveed A; Côté-Lecaldare, Marilena; Häuser, Katja; Baum, Shari R; Titone, Debra

    2014-01-01

    Metaphors are common elements of language that allow us to creatively stretch the limits of word meaning. However, metaphors vary in their degree of novelty, which determines whether people must create new meanings on-line or retrieve previously known metaphorical meanings from memory. Such variations affect the degree to which general cognitive capacities such as executive control are required for successful comprehension. We investigated whether individual differences in executive control relate to metaphor processing using eye movement measures of reading. Thirty-nine participants read sentences including metaphors or idioms, another form of figurative language that is more likely to rely on meaning retrieval. They also completed the AX-CPT, a domain-general executive control task. In Experiment 1, we examined sentences containing metaphorical or literal uses of verbs, presented with or without prior context. In Experiment 2, we examined sentences containing idioms or literal phrases for the same participants to determine whether the link to executive control was qualitatively similar or different to Experiment 1. When metaphors were low familiar, all people read verbs used as metaphors more slowly than verbs used literally (this difference was smaller for high familiar metaphors). Executive control capacity modulated this pattern in that high executive control readers spent more time reading verbs when a prior context forced a particular interpretation (metaphorical or literal), and they had faster total metaphor reading times when there was a prior context. Interestingly, executive control did not relate to idiom processing for the same readers. Here, all readers had faster total reading times for high familiar idioms than literal phrases. Thus, executive control relates to metaphor but not idiom processing for these readers, and for the particular metaphor and idiom reading manipulations presented.

  20. New data model with better functionality for VLab

    NASA Astrophysics Data System (ADS)

    da Silveira, P. R.; Wentzcovitch, R. M.; Karki, B. B.

    2009-12-01

    The VLab infrastructure and architecture were further developed to allow for several new features. First, workflows for first-principles calculations of thermodynamic properties and static elasticity, programmed in Java as Web Services, can now be executed by multiple users. Second, jobs generated by these workflows can now be executed in batch on multiple servers. A simple internal scheduler was implemented to handle hundreds of execution packages generated by multiple users and to avoid overloading the servers. Third, a new data model was implemented to guarantee the integrity of a project (workflow execution) in case of failure. The latter can happen in an execution package or in a workflow phase. By recording all executed steps of a project, its execution can be resumed after dynamic alteration of parameters through the VLab Portal. Fourth, batch jobs can also be monitored through the portal. Better and faster interaction with servers is now achieved using Ajax technology. Finally, plots are now created on the VLab server using Gnuplot 4.2.2. Research supported by NSF grant ATM 0428774 (VLab). VLab is hosted by the Minnesota Supercomputing Institute.
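
    A hedged sketch of the resume-after-failure behavior described above: every executed step of a project is recorded so that execution can be resumed, possibly with altered parameters. The file name and step representation are illustrative assumptions, not VLab's actual data model.

        import json, os

        def run_workflow(steps, params, log_path="project_steps.json"):
            """Run workflow phases in order, recording each completed step so the
            project can be resumed after a failure in a phase or execution package."""
            done = json.load(open(log_path)) if os.path.exists(log_path) else []
            for name, func in steps:
                if name in done:
                    continue          # completed before the failure; skipped on resume
                func(params)          # may raise; an unfinished step is not recorded
                done.append(name)
                with open(log_path, "w") as f:
                    json.dump(done, f)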

  1. Executive functioning independently predicts self-rated health and improvement in self-rated health over time among community-dwelling older adults.

    PubMed

    McHugh, Joanna Edel; Lawlor, Brian A

    2016-01-01

    Self-rated health, as distinct from objective measures of health, is a clinically informative metric among older adults. The purpose of our study was to examine the cognitive and psychosocial factors associated with self-rated health. 624 participants over the age of 60 were assessed at baseline, and of these, 510 were contacted for a follow-up two years later. Measures of executive function and self-rated health were assessed at baseline, and self-rated health was assessed at follow-up. We employed multiple linear regression analyses to investigate the relationship between executive functioning and self-rated health, while controlling for demographic, psychosocial and biological variables. Controlling for other relevant variables, executive functioning independently and solely predicted self-rated health, both at a cross-sectional level, and also over time. Loneliness was also found to cross-sectionally predict self-rated health, although this relationship was not present at a longitudinal level. Older adults' self-rated health may be related to their executive functioning and to their loneliness. Self-rated health appeared to improve over time, and the extent of this improvement was also related to executive functioning at baseline. Self-rated health may be a judgement made of one's functioning, especially executive functioning, which changes with age and therefore may be particularly salient in the reflections of older adults.

  2. "Opening an emotional dimension in me": changes in emotional reactivity and emotion regulation in a case of executive impairment after left fronto-parietal damage.

    PubMed

    Salas, Christian E; Radovic, Darinka; Yuen, Kenneth S L; Yeates, Giles N; Castro, O; Turnbull, Oliver H

    2014-01-01

    Dysexecutive impairment is a common problem after brain injury, particularly after damage to the lateral surface of the frontal lobes. There is a large literature describing the cognitive deficits associated with executive impairment after dorsolateral damage; however, little is known about its impact on emotional functioning. This case study describes changes in a 72-year-old man (Professor F) who became markedly dysexecutive after a left fronto-parietal stroke. Professor F's case is remarkable in that, despite exhibiting typical executive impairments, abstraction and working memory capacities were spared. Such preservation of insight-related capacities allowed him to offer a detailed account of his emotional changes. Quantitative and qualitative tools were used to explore changes in several well-known emotional processes. The results suggest that Professor F's two main emotional changes were in the domain of emotional reactivity (increased experience of both positive and negative emotions) and emotion regulation (down-regulation of sadness). Professor F related both changes to difficulties in his thinking process, especially a difficulty generating and manipulating thoughts during moments of negative arousal. These results are discussed in relation to the literature on executive function and emotion regulation. The relevance of these findings for neuropsychological rehabilitation and for the debate on the neural basis of emotional processes is addressed.

  3. 78 FR 71695 - Self-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Notice of Filing of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-29

    ... inquiries and investigations. The current approach of requiring members to report the reference time instead... proposing amendments to the equity trade reporting rules relating to reporting (i) an additional time field for specified trades, (ii) execution time in milliseconds, (iii) reversals, (iv) trades executed on...

  4. Computer architecture for efficient algorithmic executions in real-time systems: New technology for avionics systems and advanced space vehicles

    NASA Technical Reports Server (NTRS)

    Carroll, Chester C.; Youngblood, John N.; Saha, Aindam

    1987-01-01

    Improvements and advances in the development of computer architecture now provide innovative technology for the recasting of traditional sequential solutions into high-performance, low-cost, parallel system to increase system performance. Research conducted in development of specialized computer architecture for the algorithmic execution of an avionics system, guidance and control problem in real time is described. A comprehensive treatment of both the hardware and software structures of a customized computer which performs real-time computation of guidance commands with updated estimates of target motion and time-to-go is presented. An optimal, real-time allocation algorithm was developed which maps the algorithmic tasks onto the processing elements. This allocation is based on the critical path analysis. The final stage is the design and development of the hardware structures suitable for the efficient execution of the allocated task graph. The processing element is designed for rapid execution of the allocated tasks. Fault tolerance is a key feature of the overall architecture. Parallel numerical integration techniques, tasks definitions, and allocation algorithms are discussed. The parallel implementation is analytically verified and the experimental results are presented. The design of the data-driven computer architecture, customized for the execution of the particular algorithm, is discussed.

  5. Computer architecture for efficient algorithmic executions in real-time systems: new technology for avionics systems and advanced space vehicles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carroll, C.C.; Youngblood, J.N.; Saha, A.

    1987-12-01

    Improvements and advances in the development of computer architecture now provide innovative technology for the recasting of traditional sequential solutions into high-performance, low-cost, parallel system to increase system performance. Research conducted in development of specialized computer architecture for the algorithmic execution of an avionics system, guidance and control problem in real time is described. A comprehensive treatment of both the hardware and software structures of a customized computer which performs real-time computation of guidance commands with updated estimates of target motion and time-to-go is presented. An optimal, real-time allocation algorithm was developed which maps the algorithmic tasks onto the processing elements. This allocation is based on the critical path analysis. The final stage is the design and development of the hardware structures suitable for the efficient execution of the allocated task graph. The processing element is designed for rapid execution of the allocated tasks. Fault tolerance is a key feature of the overall architecture. Parallel numerical integration techniques, tasks definitions, and allocation algorithms are discussed. The parallel implementation is analytically verified and the experimental results are presented. The design of the data-driven computer architecture, customized for the execution of the particular algorithm, is discussed.
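
    A hedged illustration of the critical-path quantity on which the allocation described in the two preceding records is based: the longest chain of task durations through the task graph. The task names and durations below are invented for the example; this is not the report's allocation algorithm itself.

        from functools import lru_cache

        def critical_path_length(durations, successors):
            """Length of the longest path (sum of task durations) through a task DAG."""
            @lru_cache(maxsize=None)
            def longest_from(task):
                return durations[task] + max(
                    (longest_from(s) for s in successors.get(task, ())), default=0)
            return max(longest_from(t) for t in durations)

        durations  = {"sense": 2, "estimate": 5, "guidance": 4, "command": 1}
        successors = {"sense": ("estimate",), "estimate": ("guidance",), "guidance": ("command",)}
        print(critical_path_length(durations, successors))   # -> 12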

  6. Accelerating next generation sequencing data analysis with system level optimizations.

    PubMed

    Kathiresan, Nagarajan; Temanni, Ramzi; Almabrazi, Hakeem; Syed, Najeeb; Jithesh, Puthen V; Al-Ali, Rashid

    2017-08-22

    Next generation sequencing (NGS) data analysis is highly compute intensive. In-memory computing, vectorization, bulk data transfer, and CPU frequency scaling are some of the hardware features in modern computing architectures. To get the best execution time and utilize these hardware features, it is necessary to tune the system-level parameters before running the application. We studied GATK HaplotypeCaller, a part of common NGS workflows that consumes more than 43% of the total execution time. Multiple GATK 3.x versions were benchmarked, and the execution time of HaplotypeCaller was optimized through various system-level parameters, which included: (i) tuning the parallel garbage collection and kernel shared memory to simulate in-memory computing, (ii) architecture-specific tuning in the PairHMM library for vectorization, (iii) including Java 1.8 features through GATK source code compilation and building a runtime environment for parallel sorting and bulk data transfer, and (iv) switching the default 'on-demand' CPU frequency mode to 'performance' mode to accelerate the Java multi-threads. As a result, the HaplotypeCaller execution time was reduced by 82.66% in GATK 3.3 and 42.61% in GATK 3.7. Overall, the execution time of the NGS pipeline was reduced to 70.60% and 34.14% for GATK 3.3 and GATK 3.7, respectively.
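
    Purely as a hedged illustration (the heap size, file names, and exact GATK 3.x invocation below are assumptions, not taken from the paper), two of the system-level knobs mentioned, the CPU frequency governor and parallel JVM garbage collection, could be applied from a small launcher such as this:

        import glob, subprocess

        def set_cpu_governor(governor="performance"):
            # Replace the default on-demand frequency scaling with performance mode (requires root).
            for path in glob.glob("/sys/devices/system/cpu/cpu*/cpufreq/scaling_governor"):
                with open(path, "w") as f:
                    f.write(governor)

        def run_haplotypecaller(gatk_jar, reference, bam, out_vcf):
            # Parallel garbage collection is one of the JVM-level tunings described above.
            cmd = ["java", "-XX:+UseParallelGC", "-Xmx32g", "-jar", gatk_jar,
                   "-T", "HaplotypeCaller", "-R", reference, "-I", bam, "-o", out_vcf]
            subprocess.run(cmd, check=True)

        set_cpu_governor()
        run_haplotypecaller("GenomeAnalysisTK.jar", "ref.fasta", "sample.bam", "sample.vcf")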

  7. Conceptualization and Operationalization of Executive Function

    ERIC Educational Resources Information Center

    Baggetta, Peter; Alexander, Patricia A.

    2016-01-01

    Executive function is comprised of different behavioral and cognitive elements and is considered to play a significant role in learning and academic achievement. Educational researchers frequently study the construct. However, because of its functional complexity, the research on executive function can at times be both confusing and…

  8. A design fix to supervisory control for fault-tolerant scheduling of real-time multiprocessor systems with aperiodic tasks

    NASA Astrophysics Data System (ADS)

    Devaraj, Rajesh; Sarkar, Arnab; Biswas, Santosh

    2015-11-01

    In the article 'Supervisory control for fault-tolerant scheduling of real-time multiprocessor systems with aperiodic tasks', Park and Cho presented a systematic way of computing a largest fault-tolerant and schedulable language that provides information on whether the scheduler (i.e., supervisor) should accept or reject a newly arrived aperiodic task. The computation of such a language mainly depends on the task execution model presented in their paper. However, the task execution model is unable to capture the situation in which the fault of a processor occurs even before the task has arrived. Consequently, under a task execution model that does not capture this fact, a task may be assigned for execution on a faulty processor. This problem is illustrated with an appropriate example. The task execution model of Park and Cho is then modified to strengthen the requirement that no task is assigned for execution on a faulty processor.
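
    A toy, hedged illustration of the strengthened requirement (not the supervisory-control formalism of the paper): a processor whose fault occurred before the aperiodic task even arrived must be excluded from the processors on which the task may execute.

        def admissible_processors(task_arrival_time, fault_times):
            """Processors on which a newly arrived aperiodic task may be scheduled.

            fault_times maps a processor id to the time of its fault, or None if healthy.
            Processors already faulty before the task arrived are excluded, which is
            exactly the situation the original execution model failed to capture."""
            return [p for p, t_fault in sorted(fault_times.items())
                    if t_fault is None or t_fault > task_arrival_time]

        print(admissible_processors(10, {"P1": None, "P2": 7, "P3": 15}))   # -> ['P1', 'P3']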

  9. Executive Functioning Skills in Long-Term Users of Cochlear Implants: A Case Control Study

    PubMed Central

    Pisoni, David B.; Henning, Shirley C.; Colson, Bethany G.

    2013-01-01

    Objective To investigate differences in executive functioning between deaf children with cochlear implants (CIs) and normal-hearing (NH) peers. The cognitive effects of auditory deprivation in childhood may extend beyond speech–language skills to more domain-general areas including executive functioning. Methods Executive functioning skills in a sample of 53 prelingually deaf children, adolescents, and young adults who received CIs prior to age 7 years and who had used their CIs for ≥7 years were compared with those of age- and nonverbal-IQ-matched NH peers and with scale norms. Results Despite having above-average nonverbal IQ, the CI sample scored lower than the NH sample and test norms on several measures of short-term/working memory, fluency–speed, and inhibition–concentration. Executive functioning was unrelated to most demographic and hearing history characteristics. Conclusions Prelingual deafness and long-term use of CIs were associated with increased risk of weaknesses in executive functioning. PMID:23699747

  10. Fast Transformation of Temporal Plans for Efficient Execution

    NASA Technical Reports Server (NTRS)

    Tsamardinos, Ioannis; Muscettola, Nicola; Morris, Paul

    1998-01-01

    Temporal plans permit significant flexibility in specifying the occurrence time of events. Plan execution can make good use of that flexibility. However, the advantage of execution flexibility is counterbalanced by the cost during execution of propagating the time of occurrence of events throughout the flexible plan. To minimize execution latency, this propagation needs to be very efficient. Previous work showed that every temporal plan can be reformulated as a dispatchable plan, i.e., one for which propagation to immediate neighbors is sufficient. A simple algorithm was given that finds a dispatchable plan with a minimum number of edges in cubic time and quadratic space. In this paper, we focus on the efficiency of the reformulation process, and improve on that result. A new algorithm is presented that uses linear space and has time complexity equivalent to Johnson's algorithm for all-pairs shortest-path problems. Experimental evidence confirms the practical effectiveness of the new algorithm. For example, on a large commercial application, the performance is improved by at least two orders of magnitude. We further show that the dispatchable plan, already minimal in the total number of edges, can also be made minimal in the maximum number of edges incoming or outgoing at any node.
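
    A hedged sketch of why dispatchability matters at execution time: once the plan has been reformulated, fixing an event's time only requires tightening the windows of its immediate neighbors in the graph. Here an edge (a, b) with weight d encodes time(b) - time(a) <= d, as in a simple temporal network; the reformulation algorithm itself is not shown.

        def execute_event(event, t, windows, edges):
            """Fix `event` at time t and propagate only to immediate neighbors.

            windows: {event: [lower, upper]} feasible execution times.
            edges:   {(a, b): d} meaning time(b) - time(a) <= d."""
            windows[event] = [t, t]
            for (a, b), d in edges.items():
                if a == event:                               # time(b) <= t + d
                    windows[b][1] = min(windows[b][1], t + d)
                elif b == event:                             # time(a) >= t - d
                    windows[a][0] = max(windows[a][0], t - d)

        windows = {"start": [0, 0], "A": [0, 10], "B": [0, 10]}
        edges = {("start", "A"): 10, ("A", "start"): 0,      # 0 <= A - start <= 10
                 ("A", "B"): 5, ("B", "A"): -2}              # 2 <= B - A <= 5
        execute_event("A", 3, windows, edges)
        print(windows["B"])   # -> [5, 8]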

  11. An enhanced Ada run-time system for real-time embedded processors

    NASA Technical Reports Server (NTRS)

    Sims, J. T.

    1991-01-01

    An enhanced Ada run-time system has been developed to support real-time embedded processor applications. The primary focus of this development effort has been on the tasking system and the memory management facilities of the run-time system. The tasking system has been extended to support efficient and precise periodic task execution as required for control applications. Event-driven task execution providing a means of task-asynchronous control and communication among Ada tasks is supported in this system. Inter-task control is even provided among tasks distributed on separate physical processors. The memory management system has been enhanced to provide object allocation and protected access support for memory shared between disjoint processors, each of which is executing a distinct Ada program.
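
    The periodic-execution requirement is not Ada-specific; as a purely illustrative, hedged Python sketch (not the run-time system's mechanism), releasing a control task at absolute times keeps jitter in one cycle from accumulating into later cycles.

        import time

        def run_periodic(task, period_s, iterations):
            """Release `task` every period_s seconds, anchored to absolute release
            times so a delay in one cycle does not drift into the following cycles."""
            next_release = time.monotonic()
            for _ in range(iterations):
                task()
                next_release += period_s
                time.sleep(max(0.0, next_release - time.monotonic()))

        run_periodic(lambda: print("control step"), period_s=0.1, iterations=5)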

  12. RTSJ Memory Areas and Their Affects on the Performance of a Flight-Like Attitude Control System

    NASA Technical Reports Server (NTRS)

    Niessner, Albert F.; Benowitz, Edward G.

    2003-01-01

    The two most important factors in improving performance in any software system, but especially a real-time, embedded system, are knowing which components are the low performers and knowing what can be done to improve their performance. The word performance with respect to a real-time, embedded system does not necessarily mean fast execution, which is the common definition when discussing non-real-time systems. It also includes meeting all of the specified execution deadlines and executing at the correct time without sacrificing non-real-time performance. Using a Java prototype of an existing control system used on Deep Space 1 [1], the effects from adding memory areas are measured and evaluated with respect to improving performance.

  13. Spaceborne computer executive routine functional design specification. Volume 2: Computer executive design for space station/base

    NASA Technical Reports Server (NTRS)

    Kennedy, J. R.; Fitzpatrick, W. S.

    1971-01-01

    The computer executive functional system design concepts derived from study of the Space Station/Base are presented. Information Management System hardware configuration as directly influencing the executive design is reviewed. The hardware configuration and generic executive design requirements are considered in detail in a previous report (System Configuration and Executive Requirements Specifications for Reusable Shuttle and Space Station/Base, 9/25/70). This report defines basic system primitives and delineates processes and process control. Supervisor states are considered for describing basic multiprogramming and multiprocessing systems. A high-level computer executive including control of scheduling, allocation of resources, system interactions, and real-time supervisory functions is defined. The description is oriented to provide a baseline for a functional simulation of the computer executive system.

  14. Mobile Phone Service Process Hiccups at Cellular Inc.

    ERIC Educational Resources Information Center

    Edgington, Theresa M.

    2010-01-01

    This teaching case documents an actual case of process execution and failure. The case is useful in MIS introductory courses seeking to demonstrate the interdependencies within a business process, and the concept of cascading failure at the process level. This case demonstrates benefits and potential problems with information technology systems,…

  15. The Self-Actualizing Case Method.

    ERIC Educational Resources Information Center

    Gunn, Bruce

    1980-01-01

    Presents a case procedure designed to assist trainees in perfecting their problem-solving skills. Elements of that procedure are the rationale behind this "self-actualizing" case method; the role that the instructor, case leaders, and participants play in its execution; and the closed-loop grading system used for peer evaluation. (CT)

  16. On program restructuring, scheduling, and communication for parallel processor systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polychronopoulos, Constantine D.

    1986-08-01

    This dissertation discusses several software and hardware aspects of program execution on large-scale, high-performance parallel processor systems. The issues covered are program restructuring, partitioning, scheduling and interprocessor communication, synchronization, and hardware design issues of specialized units. All this work was performed focusing on a single goal: to maximize program speedup, or equivalently, to minimize parallel execution time. Parafrase, a Fortran restructuring compiler, was used to transform programs into parallel form and conduct experiments. Two new program restructuring techniques are presented, loop coalescing and subscript blocking. Compile-time and run-time scheduling schemes are covered extensively. Depending on the program construct, these algorithms generate optimal or near-optimal schedules. For the case of arbitrarily nested hybrid loops, two optimal scheduling algorithms for dynamic and static scheduling are presented. Simulation results are given for a new dynamic scheduling algorithm. The performance of this algorithm is compared to that of self-scheduling. Techniques for program partitioning and minimization of interprocessor communication for idealized program models and for real Fortran programs are also discussed. The close relationship between scheduling, interprocessor communication, and synchronization becomes apparent at several points in this work. Finally, the impact of various types of overhead on program speedup and experimental results are presented.
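
    Loop coalescing, one of the two restructuring techniques named above, flattens a nested loop into a single loop whose iterations form one pool that is easier to schedule across processors. The dissertation's setting is Fortran via Parafrase; the sketch below is only an illustration of the transformation in Python.

        def work(i, j):                   # stand-in for the original loop body
            print(i, j)

        N, M = 4, 3

        # Original doubly nested loop over an N x M iteration space.
        for i in range(N):
            for j in range(M):
                work(i, j)

        # Coalesced form: a single loop of N*M iterations; the original indices are
        # recovered from the coalesced index k, so iterations can be handed out to
        # processors from one flat pool (e.g., by self-scheduling).
        for k in range(N * M):
            i, j = divmod(k, M)
            work(i, j)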

  17. Acute effects of moderate aerobic exercise on specific aspects of executive function in different age and fitness groups: A meta-analysis.

    PubMed

    Ludyga, Sebastian; Gerber, Markus; Brand, Serge; Holsboer-Trachsler, Edith; Pühse, Uwe

    2016-11-01

    Whereas a wealth of studies have investigated acute effects of moderate aerobic exercise on executive function, the roles of age, fitness, and the component of executive function in this relationship still remain unclear. Therefore, the present meta-analysis investigates exercise-induced benefits on specific aspects of executive function in different age and aerobic fitness subgroups. Based on data from 40 experimental studies, a small effect of aerobic exercise on time-dependent measures (g = .35) and accuracy (g = .22) in executive function tasks was confirmed. The results further suggest that preadolescent children (g = .54) and older adults (g = .67) compared to other age groups benefit more from aerobic exercise when reaction time is considered as dependent variable. In contrast to age, aerobic fitness and the executive function component had no influence on the obtained effect sizes. Consequently, high aerobic fitness is no prerequisite for temporary improvements of the executive control system, and low- as well as high-fit individuals seem to benefit from exercise in a similar way. However, a higher sensitivity of executive function to acute aerobic exercise was found in individuals undergoing developmental changes. Therefore, preadolescent children and older adults in particular might strategically use a single aerobic exercise session to prepare for a situation demanding high executive control. © 2016 Society for Psychophysiological Research.
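
    The effect sizes quoted here (g = .35, .22, .54, .67) are reported as g, conventionally Hedges' bias-corrected g. As a reminder of the metric only, a minimal sketch of the bias-corrected standardized mean difference for a simple two-group comparison follows; the meta-analytic pooling across 40 studies is of course more involved, and the numbers below are invented.

        from math import sqrt

        def hedges_g(m1, sd1, n1, m2, sd2, n2):
            """Bias-corrected standardized mean difference between two groups."""
            pooled_sd = sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
            d = (m1 - m2) / pooled_sd                  # Cohen's d
            correction = 1 - 3 / (4 * (n1 + n2) - 9)   # small-sample bias correction
            return d * correction

        print(round(hedges_g(102.0, 15.0, 30, 97.0, 15.0, 30), 2))   # -> 0.33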

  18. Two Approaches to Estimating the Effect of Parenting on the Development of Executive Function in Early Childhood

    PubMed Central

    Blair, Clancy; Raver, C. Cybele; Berry, Daniel J.

    2015-01-01

    In the current article, we contrast 2 analytical approaches to estimate the relation of parenting to executive function development in a sample of 1,292 children assessed longitudinally between 36 and 60 months of age. Children were administered a newly developed and validated battery of 6 executive function tasks tapping inhibitory control, working memory, and attention shifting. Residualized change analysis indicated that higher-quality parenting, as indicated by higher scores on widely used measures of parenting at both earlier and later time points, predicted more positive gain in executive function at 60 months. Latent change score models in which parenting and executive function over time were held to standards of longitudinal measurement invariance provided additional evidence of the association between change in parenting quality and change in executive function. In these models, cross-lagged paths indicated that in addition to parenting predicting change in executive function, executive function bidirectionally predicted change in parenting quality. Results were robust with the addition of covariates, including child sex, race, maternal education, and household income-to-need. Strengths and drawbacks of the 2 analytic approaches are discussed, and the findings are considered in light of emerging methodological innovations for testing the extent to which executive function is malleable and open to the influence of experience. PMID:23834294

  19. Integrating Experiential Learning into Business Courses: Using Learning Journals to Create Living Case Studies

    ERIC Educational Resources Information Center

    McHann, James C.; Frost, Laura A.

    2010-01-01

    Research demonstrates that the capacity to implement strategy and to execute plans drives business success (Hrebiniak, 2007) and that businesses' inability to succeed by executing effectively arises from the ubiquitous incapacity of business professionals to overcome the gap between what they know and what they are actually able to do, whether…

  20. Repositioning Your EMBA Program and Reinventing Your Brand: A Case Study Analysis

    ERIC Educational Resources Information Center

    Petit, Francis

    2009-01-01

    The purpose of this research is to illustrate how Fordham University, the Jesuit University of New York, repositioned its Executive MBA Program and reinvented its brand, over a ten year period. More specifically, this research will analyze the current state of the Executive MBA market and will discuss the best practices and frameworks implemented…

  1. Measuring Executive Function in Early Childhood: A Case for Formative Measurement

    ERIC Educational Resources Information Center

    Willoughby, Michael T.; Blair, Clancy B.

    2016-01-01

    This study tested whether individual executive function (EF) tasks were better characterized as formative or reflective indicators of the latent construct of EF. EF data that were collected as part of the Family Life Project (FLP), a prospective longitudinal study of families who were recruited at the birth of a new child (N = 1,292), when…

  2. Empowerment of African American Women Leaders in Higher Education: A Multiple Case Study

    ERIC Educational Resources Information Center

    McDaniel, Sharon L.

    2016-01-01

    The purpose of this study was to gain an understanding of the perspectives on empowerment held by African American women who work in executive positions within higher educational settings. This study also seeks to provide other women with a deeper level of awareness regarding the journey towards executive leadership. Current literature explores…

  3. The Freedom of Information Act, Executive Order 11652 and The Second Session of the Ninety-Second Congress.

    ERIC Educational Resources Information Center

    Tenney, Craig

    In 1967 Congress passed the Freedom of Information Act (FOIA), which included nine exemptions allowing the Executive branch of government the privilege of withholding information from the public in cases where disclosure would constitute a threat to national security. Responding to Congressional charges that the freedoms extended by the FOIA…

  4. Nonverbal Executive Function Is Mediated by Language: A Study of Deaf and Hearing Children

    ERIC Educational Resources Information Center

    Botting, Nicola; Jones, Anna; Marshall, Chloe; Denmark, Tanya; Atkinson, Joanna; Morgan, Gary

    2017-01-01

    Studies have suggested that language and executive function (EF) are strongly associated. Indeed, the two are difficult to separate, and it is particularly difficult to determine whether one skill is more dependent on the other. Deafness provides a unique opportunity to disentangle these skills because in this case, language difficulties have a…

  5. Qvo Vadis Magister Artium? Policy Implications of Executive Master's Programmes in an Israeli Research University

    ERIC Educational Resources Information Center

    Yogev, Abraham

    2010-01-01

    During recent decades master's studies have mainly become professional, but in some countries, like Israel, they still are a stepping stone toward doctorate studies. Changes in that respect may however occur due to recent university marketization processes. Using Tel Aviv University as a case study, we focus on the executive master's programmes…

  6. Real-time Java for flight applications: an update

    NASA Technical Reports Server (NTRS)

    Dvorak, D.

    2003-01-01

    The RTSJ is a specification for supporting real-time execution in the Java programming language. The specification has been shaped by several guiding principles, particularly: predictable execution as the first priority in all tradeoffs, no syntactic extensions to Java, and backward compatibility.

  7. Literacy in the Workplace: The Executive Perspective. A Qualitative Research Study.

    ERIC Educational Resources Information Center

    Omega Group, Inc., Haverford, PA.

    Twenty-eight in-depth interviews were conducted with top executives in Philadelphia to discover issues and concerns about committing organizational resources over time to workplace literacy programs. Participants represented major organizations and institutions, both manufacturing and service. The executives reported that the manifestations of…

  8. Electronic business model for small- and medium-sized manufacturing enterprises (SME): a case study

    NASA Astrophysics Data System (ADS)

    Yuen, Karina; Chung, Walter W.

    2001-10-01

    This paper identifies three essential factors (information infrastructure, an executive information system, and a new manufacturing paradigm) that are used to support the development of a new business model for competitiveness. They facilitate changes in organization structure in support of business transformation. An SME can source a good manufacturing practice using a model of academic-university collaboration to gain competitive advantage in the e-business world. The collaboration enables change agents to use information systems development as a vehicle for increasing the capability of executives to use information and knowledge management to achieve higher responsiveness and customer satisfaction. The case company is used to illustrate the application of a web-based executive information system to interface internal communications with external operations. It explains how a good manufacturing practice may be re-applied by other SMEs to acquire skills as a learning organization grows in an extended enterprise setting.

  9. Developing high-level change and innovation agents: competencies and challenges for executive leadership.

    PubMed

    Malloch, Kathy; Melnyk, Bernadette Mazurek

    2013-01-01

    The work of health care reform and revolution requires leadership competencies that integrate the digital realities of time, space, and media. Leadership skills and behaviors of command, control, and directing from predigital times are no longer effective, given the impacts of the digital changes. Developing leadership competence in evidence-driven processes, facilitation, collaborative teamwork, and instilling a sense of urgency is the work of today's executive leaders. Ten competencies necessary for contemporary executive leadership are presented in this article.

  10. Levinson's Dream Theory and Its Relevance in an Academic Executive Mentoring Program: An Exploratory Study of Executive Mentors' Practice and Individuation

    ERIC Educational Resources Information Center

    Scherer, Douglas Martin

    2010-01-01

    The purpose of this study was to explore the relevance that executive mentors' Dream journeys had for their mentoring practices. Dream journeys are the visions of where young adults see themselves in the future, and how they integrate themselves into the adult world over time. It was anticipated that a better understanding of executive mentors'…

  11. An uncommon case of random fire-setting behavior associated with Todd paralysis: a case report.

    PubMed

    Kanehisa, Masayuki; Morinaga, Katsuhiko; Kohno, Hisae; Maruyama, Yoshihiro; Ninomiya, Taiga; Ishitobi, Yoshinobu; Tanaka, Yoshihiro; Tsuru, Jusen; Hanada, Hiroaki; Yoshikawa, Tomoya; Akiyoshi, Jotaro

    2012-08-31

    The association between fire-setting behavior and psychiatric or medical disorders remains poorly understood. Although a link between fire-setting behavior and various organic brain disorders has been established, associations between fire setting and focal brain lesions have not yet been reported. Here, we describe the case of a 24-year-old first-time arsonist who suffered Todd's paralysis prior to the onset of a bizarre and random fire-setting behavior. A case of a 24-year-old man with a sudden onset of bizarre and random fire-setting behavior is reported. The man, who had been arrested on felony arson charges, complained of difficulties concentrating and of recent memory disturbances with leg weakness. A video-EEG recording demonstrated a close relationship between the focal motor impairment and a clear-cut epileptic ictal discharge involving the bilateral motor cortical areas. The SPECT result was statistically analyzed by comparison with standard SPECT images obtained from our institute (easy Z-score imaging system; eZIS). eZIS revealed hypoperfusion in the cingulate cortex and basal ganglia, and hyperperfusion in the frontal cortex. A neuropsychological test battery revealed lower-than-normal scores for executive function, attention, and memory, consistent with frontal lobe dysfunction. The fire-setting behavior and Todd's paralysis, together with an unremarkable performance on tests measuring executive function fifteen months prior, suggested a causal relationship between this organic brain lesion and the fire-setting behavior. The case describes a rare and as yet unreported association between random, impulse-driven fire-setting behavior and damage to the brain, and suggests a disconnection of frontal lobe structures as a possible pathogenic mechanism.

  12. Development and Implementation of a Telecommuting Evaluation Framework, and Modeling the Executive Telecommuting Adoption Process

    NASA Astrophysics Data System (ADS)

    Vora, V. P.; Mahmassani, H. S.

    2002-02-01

    This work proposes and implements a comprehensive evaluation framework to document the telecommuter, organizational, and societal impacts of telecommuting through telecommuting programs. Evaluation processes and materials within the outlined framework are also proposed and implemented. As the first component of the evaluation process, the executive survey is administered within a public sector agency. The survey data is examined through exploratory analysis and is compared to a previous survey of private sector executives. The ordinal probit, dynamic probit, and dynamic generalized ordinal probit (DGOP) models of telecommuting adoption are calibrated to identify factors which significantly influence executive adoption preferences and to test the robustness of such factors. The public sector DGOP model of executive willingness to support telecommuting under different program scenarios is compared with an equivalent private sector DGOP model. Through the telecommuting program, a case study of telecommuting travel impacts is performed to further substantiate research.
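
    For readers unfamiliar with the model family named above, the sketch below shows how an ordered (ordinal) probit model of a categorical willingness-to-adopt outcome can be fit in Python. It assumes the statsmodels OrderedModel API and uses simulated, hypothetical predictors rather than the study's survey data.

    ```python
    # Minimal sketch of an ordered probit model of willingness to adopt
    # telecommuting; predictors and data are simulated and purely illustrative.
    import numpy as np
    import pandas as pd
    from statsmodels.miscmodels.ordinal_model import OrderedModel

    rng = np.random.default_rng(0)
    n = 200
    df = pd.DataFrame({
        "org_size": rng.integers(10, 500, n),          # hypothetical predictor
        "manager_experience": rng.normal(10, 4, n),    # hypothetical predictor
    })
    # Latent propensity mapped onto an ordered outcome: no / maybe / yes
    latent = 0.002 * df["org_size"] + 0.05 * df["manager_experience"] + rng.normal(size=n)
    df["willingness"] = pd.cut(latent, bins=[-np.inf, 0.8, 1.4, np.inf],
                               labels=["no", "maybe", "yes"])

    model = OrderedModel(df["willingness"],
                         df[["org_size", "manager_experience"]],
                         distr="probit")
    result = model.fit(method="bfgs", disp=False)
    print(result.summary())
    ```

    A dynamic specification in the spirit of the DGOP models would additionally include lagged adoption state among the regressors; that extension is not shown here.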

  13. The politics of smoking in federal buildings: an executive order case study.

    PubMed

    Cook, Daniel M; Bero, Lisa A

    2009-09-01

    Executive orders are important presidential tools for health policymaking that are subject to less public scrutiny than are legislation and regulatory rulemaking. President Bill Clinton banned smoking in federal government buildings by executive order in 1997, after the administration of George H. W. Bush had twice considered and abandoned a similar policy. The 1991 and 1993 Bush proposals drew objections from agency heads and labor unions, many coordinated by the tobacco industry. We analyzed internal tobacco industry documents and found that the industry engaged in extensive executive branch lobbying and other political activity surrounding the Clinton smoking ban. Whereas some level of stakeholder politics might have been expected, this policy also featured jockeying among various agencies and the participation of organized labor.

  14. 76 FR 34281 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing of Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-13

    ...''). The specified time period will commence for an option when a transaction occurs in any series in such... contracts executed among all series during the specified time period that represents an issue percentage... executed among all series during the specified time period that represents an issue percentage equal to the...

  15. Hemodynamic Signal Changes Accompanying Execution and Imagery of Swallowing in Patients with Dysphagia: A Multiple Single-Case Near-Infrared Spectroscopy Study

    PubMed Central

    Kober, Silvia Erika; Bauernfeind, Günther; Woller, Carina; Sampl, Magdalena; Grieshofer, Peter; Neuper, Christa; Wood, Guilherme

    2015-01-01

    In the present multiple case study, we examined hemodynamic changes in the brain in response to motor execution (ME) and motor imagery (MI) of swallowing in dysphagia patients compared to healthy matched controls using near-infrared spectroscopy (NIRS). Two stroke patients with cerebral lesions in the right hemisphere, two stroke patients with lesions in the brainstem, and two neurologically healthy control subjects actively swallowed saliva (ME) and mentally imagined swallowing saliva (MI) in a randomized order while changes in concentration of oxygenated hemoglobin (oxy-Hb) and deoxygenated hemoglobin (deoxy-Hb) were assessed. In line with recent findings in healthy young adults, MI and ME of swallowing led to the strongest NIRS signal change in the inferior frontal gyrus in stroke patients as well as in healthy elderly controls. We found differences in the topographical distribution and time course of the hemodynamic response depending on lesion location. Dysphagia patients with lesions in the brainstem showed bilateral hemodynamic signal changes in the inferior frontal gyrus during active swallowing comparable to healthy controls. In contrast, dysphagia patients with cerebral lesions in the right hemisphere showed more unilateral activation patterns during swallowing. Furthermore, patients with cerebral lesions showed a prolonged time course of the hemodynamic response during MI and ME of swallowing compared to healthy controls and patients with brainstem lesions. Brain activation patterns associated with ME and MI of swallowing were largely comparable, especially for changes in deoxy-Hb. Hence, the present results provide new evidence regarding the timing and topographical distribution of the hemodynamic response during ME and MI of swallowing in dysphagia patients and may have practical impact on future dysphagia treatment. PMID:26217298

  16. Is capacity for pleasure associated with executive career success?

    PubMed

    Clark, D C; Morrison, D E; Fawcett, J

    1984-01-01

    Executives with a low capacity for pleasure were examined to determine if they evidence less occupational and social success than those with normal or high capacity. Data on pleasure capacity and depressive symptoms were collected from 88 senior executive officers, and scores were compared with independent ratings of career success. The 11% of executives with serious work-related or personal problems showed significantly higher pleasure scores than the rest. It is hypothesized that the relatively high pleasure scores of the least successful executives reflect a defensive process of denial or reaction formation rather than an excessively joyful personality trait. A longitudinal study of executives is proposed to clarify whether the high pleasure capacity scores of the least successful executives change situationally over time.

  17. The Effect of Group Therapy With Transactional Analysis Approach on Emotional Intelligence, Executive Functions and Drug Dependency.

    PubMed

    Forghani, Masoomeh; Ghanbari Hashem Abadi, Bahram Ali

    2016-06-01

    The aim of the present study was to evaluate the effect of group psychotherapy with a transactional analysis (TA) approach on emotional intelligence (EI), executive functions and substance dependency among drug addicts at rehabilitation centers in Mashhad city, Iran, in 2013. In this quasi-experimental study with pretest, posttest, case-control stages, 30 patients were selected from a rehabilitation center and randomly divided into two groups. The case group received 12 sessions of group psychotherapy with the transactional analysis approach. Then the effects of the independent variable (group psychotherapy with the TA approach) on EI, executive function and drug dependency were assessed. The Bar-On test was used for EI, the Stroop test for measuring executive function, and morphine, meth-amphetamine and B2 tests for evaluating drug dependency. Data were analyzed using multifactorial covariance analysis, Levene's test, MANCOVA, Student's t-test and Pearson correlation coefficient tests with SPSS software. Our results showed that group psychotherapy with the TA approach was effective in improving EI and executive functions and decreasing drug dependency (P < 0.05). The results of this study showed that group psychotherapy with the TA approach has significant effects on addicts and prevents addiction recurrence by improving the coping capabilities and some mental functions of the subjects. However, this study has some limitations, including follow-up duration and sample size.

  18. The Aircraft Electric Taxi System: A Qualitative Multi Case Study

    NASA Astrophysics Data System (ADS)

    Johnson, Thomas Frank

    The problem this research addresses is the airline industry's apparent unwillingness to adopt ways of taxiing aircraft without using thrust from the main engines. The purpose of the study was to gain a better understanding of the decision-making process of airline executives with respect to investing in cost-saving technology. A qualitative research method is used, drawing on personal interviews with 24 airline executives from two major U.S. airlines, related industry journal articles, and aircraft performance data. The following three research questions are addressed. RQ1. Does the cost of jet fuel influence airline executives' decision to adopt the aircraft electric taxi system (ETS) technology? RQ2. Does the measurable payback period for a return on investment influence airline executives' decision to adopt ETS technology? RQ3. Does the amount of government assistance influence airline executives' decision to adopt ETS technology? A multi-case research study design is used with a triangulation technique. The participants' perceptions indicate a need to reduce operating costs, concerns about investment risk, and support for future government-sponsored performance improvement projects. Based on the framework, findings and implications of this study, a future research paper could focus on the positive environmental effects of the ETS application. A study could be conducted on current airport-area air quality and the effects that taxiing with aircraft main engine thrust has on the surrounding air quality.

  19. Review of ORD Nanomaterial Case Studies Workshop

    EPA Pesticide Factsheets

    The following is a letter report from the Executive Committee of the BOSC concerning the review of the ORD Nanomaterial Case Studies Workshop: Developing a Comprehensive Environmental Assessment Research Strategy for Nanoscale Titanium Dioxide.

  20. This Working Life.

    ERIC Educational Resources Information Center

    Boothe, James W.; And Others

    1994-01-01

    Recent "Executive Educator" survey of 900 out of 6,200 randomly selected school executives found high school principals had the longest work week; 95.3% reported working over 50 hours weekly. Fully 78% of school executives are devoting more time to educational improvement changes. Despite stressors and salary complaints, most are content with…

  1. 78 FR 22225 - Meeting: African Development Foundation, Board of Directors Executive Session Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-15

    ...and investigations, committee meetings, agency decisions and rulings, delegations of authority... Meeting: African Development Foundation, Board of Directors Executive Session Meeting. Time: Tuesday, April 23, 2013, 11:15 a.m. to 2:15 p.m. Place: 1400 Eye Street NW., Suite 1000...

  2. Model-based Executive Control through Reactive Planning for Autonomous Rovers

    NASA Technical Reports Server (NTRS)

    Finzi, Alberto; Ingrand, Felix; Muscettola, Nicola

    2004-01-01

    This paper reports on the design and implementation of a real-time executive for a mobile rover that uses a model-based, declarative approach. The control system is based on the Intelligent Distributed Execution Architecture (IDEA), an approach to planning and execution that provides a unified representational and computational framework for an autonomous agent. The basic hypothesis of IDEA is that a large control system can be structured as a collection of interacting agents, each with the same fundamental structure. We show that planning and real-time response are compatible if the executive minimizes the size of the planning problem. We detail the implementation of this approach on an exploration rover (Gromit an RWI ATRV Junior at NASA Ames) presenting different IDEA controllers of the same domain and comparing them with more classical approaches. We demonstrate that the approach is scalable to complex coordination of functional modules needed for autonomous navigation and exploration.

  3. Method and apparatus for fault tolerance

    NASA Technical Reports Server (NTRS)

    Masson, Gerald M. (Inventor); Sullivan, Gregory F. (Inventor)

    1993-01-01

    A method and apparatus for achieving fault tolerance in a computer system having at least a first central processing unit and a second central processing unit. The method comprises the steps of first executing a first algorithm in the first central processing unit on input which produces a first output as well as a certification trail. Next, executing a second algorithm in the second central processing unit on the input and on at least a portion of the certification trail which produces a second output. The second algorithm has a faster execution time than the first algorithm for a given input. Then, comparing the first and second outputs such that an error result is produced if the first and second outputs are not the same. The step of executing a first algorithm and the step of executing a second algorithm preferably takes place over essentially the same time period.
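
    The certification-trail idea can be illustrated with a toy example: a primary routine produces both an output and a trail, a cheaper secondary routine recomputes the output from the trail, and the two outputs are compared. The sketch below uses sorting and illustrative function names; it is not the patented apparatus.

    ```python
    # Illustrative certification-trail scheme (toy sorting example).
    # The primary routine returns its output plus a trail; the secondary routine
    # uses the trail to recompute the output faster, and the results are compared.

    def primary_sort(data):
        """First algorithm: full sort, also emits a certification trail
        (the permutation of indices that sorts the input)."""
        trail = sorted(range(len(data)), key=lambda i: data[i])
        output = [data[i] for i in trail]
        return output, trail

    def secondary_sort(data, trail):
        """Second algorithm: applies the trail in linear time and verifies
        that the result is actually sorted (cheaper than sorting again)."""
        output = [data[i] for i in trail]
        if any(output[i] > output[i + 1] for i in range(len(output) - 1)):
            raise ValueError("trail does not yield a sorted sequence")
        return output

    def fault_tolerant_sort(data):
        out1, trail = primary_sort(data)       # would run on CPU 1
        out2 = secondary_sort(data, trail)     # would run on CPU 2 over the same period
        if out1 != out2:
            raise RuntimeError("outputs disagree: fault detected")
        return out1

    print(fault_tolerant_sort([5, 3, 8, 1, 2]))
    ```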

  4. Fluid Intelligence as a Mediator of the Relationship between Executive Control and Balanced Time Perspective.

    PubMed

    Zajenkowski, Marcin; Stolarski, Maciej; Witowska, Joanna; Maciantowicz, Oliwia; Łowicki, Paweł

    2016-01-01

    This study examined the cognitive foundations of the balanced time perspective (BTP) proposed by Zimbardo and Boyd (1999). Although BTP is defined as the mental ability to switch effectively between different temporal perspectives, its connection with cognitive functioning has not yet been established. We addressed this by exploring the relationships between time perspectives and both fluid intelligence (measured with Raven's and Cattell's tests) and executive control (Go/No-go and anti-saccade tasks). An investigation conducted among Polish adults (N = 233) revealed that a more balanced TP profile was associated with higher fluid intelligence and higher executive control. Moreover, we found that the relationship between executive control and BTP was completely mediated by fluid intelligence, with an effect size (the ratio of the indirect effect to the total effect) of 0.75, which suggests that cognitive abilities play an important role in the adoption of temporal balance. The findings have relevance to time perspective theory as they provide valuable insight into the mechanisms involved in assigning human experience to certain time frames.
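
    The reported effect size (indirect effect divided by total effect) can be illustrated with a minimal regression-based mediation sketch on simulated data; the variable names and generated numbers are hypothetical, not the study's.

    ```python
    # Minimal mediation sketch: total effect of executive control (EC) on
    # balanced time perspective (BTP), indirect effect through fluid
    # intelligence (Gf), and their ratio as the effect-size measure.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 233
    ec = rng.normal(size=n)                          # executive control
    gf = 0.6 * ec + rng.normal(scale=0.8, size=n)    # fluid intelligence (mediator)
    btp = 0.5 * gf + rng.normal(scale=0.8, size=n)   # balanced TP (fully mediated here)

    def slopes(y, predictors):
        """OLS slopes (with intercept) for the given predictor columns."""
        X = np.column_stack([np.ones(len(y))] + list(predictors))
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return beta[1:]

    c_total = slopes(btp, [ec])[0]        # total effect: BTP ~ EC
    a = slopes(gf, [ec])[0]               # a path: Gf ~ EC
    b = slopes(btp, [gf, ec])[0]          # b path: BTP ~ Gf + EC (slope of Gf)
    indirect = a * b
    # Close to 1 here because the simulated mediation is complete by construction.
    print(f"indirect/total = {indirect / c_total:.2f}")
    ```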

  5. Event-Related Potentials in a Cued Go-NoGo Task Associated with Executive Functions in Adolescents with Autism Spectrum Disorder; A Case-Control Study.

    PubMed

    Høyland, Anne L; Øgrim, Geir; Lydersen, Stian; Hope, Sigrun; Engstrøm, Morten; Torske, Tonje; Nærland, Terje; Andreassen, Ole A

    2017-01-01

    Executive functions are often affected in autism spectrum disorders (ASD). The underlying biology is however not well known. In the DSM-5, ASD is characterized by difficulties in two domains: Social Interaction and Repetitive and Restricted Behavior, RRB. Insistence of Sameness is part of RRB and has been reported related to executive functions. We aimed to identify differences between ASD and typically developing (TD) adolescents in Event Related Potentials (ERPs) associated with response preparation, conflict monitoring and response inhibition using a cued Go-NoGo paradigm. We also studied the effect of age and emotional content of paradigm related to these ERPs. We investigated 49 individuals with ASD and 49 TD aged 12-21 years, split into two groups below (young) and above (old) 16 years of age. ASD characteristics were quantified by the Social Communication Questionnaire (SCQ) and executive functions were assessed with the Behavior Rating Inventory of Executive Function (BRIEF), both parent-rated. Behavioral performance and ERPs were recorded during a cued visual Go-NoGo task which included neutral pictures (VCPT) and pictures of emotional faces (ECPT). The amplitudes of ERPs associated with response preparation, conflict monitoring, and response inhibition were analyzed. The ASD group showed markedly higher scores than TD in both SCQ and BRIEF. Behavioral data showed no case-control differences in either the VCPT or ECPT in the whole group. While there were no significant case-control differences in ERPs from the combined VCPT and ECPT in the whole sample, the Contingent Negative Variation (CNV) was significantly enhanced in the old ASD group ( p = 0.017). When excluding ASD with comorbid ADHD we found a significantly increased N2 NoGo ( p = 0.016) and N2-effect ( p = 0.023) for the whole group. We found no case-control differences in the P3-components. Our findings suggest increased response preparation in adolescents with ASD older than 16 years and enhanced conflict monitoring in ASD without comorbid ADHD during a Go-NoGo task. The current findings may be related to Insistence of Sameness in ASD. The pathophysiological underpinnings of executive dysfunction should be further investigated to learn more about how this phenomenon is related to core characteristics of ASD.

  6. Staying on Task: Age-Related Changes in the Relationship Between Executive Functioning and Response Time Consistency.

    PubMed

    Vasquez, Brandon P; Binns, Malcolm A; Anderson, Nicole D

    2016-03-01

    Little is known about the relationship of executive functioning with age-related increases in response time (RT) distribution indices (intraindividual standard deviation [ISD], and ex-Gaussian parameters mu, sigma, tau). The goals of this study were to (a) replicate findings of age-related changes in response time distribution indices during an engaging touch-screen RT task and (b) investigate age-related changes in the relationship between executive functioning and RT distribution indices. Healthy adults (24 young [aged 18-30], 24 young-old [aged 65-74], and 24 old-old [aged 75-85]) completed a touch-screen attention task and a battery of neuropsychological tests. The relationships between RT performance and executive functions were examined with structural equation modeling (SEM). ISD, mu, and tau, but not sigma, increased with age. SEM revealed tau as the most salient RT index associated with neuropsychological measures of executive functioning. Further analysis demonstrated that correlations between tau and a weighted executive function composite were significant only in the old-old group. Our results replicate findings of greater RT inconsistency in older adults and reveal that executive functioning is related to tau in adults aged 75-85. These results support literature identifying tau as a marker of cognitive control, which deteriorates in old age.
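
    As an illustration of the RT distribution indices named above, the sketch below fits an ex-Gaussian to a simulated reaction-time vector, assuming SciPy's exponnorm parameterization (K = tau/sigma, loc = mu, scale = sigma); it is not the authors' analysis pipeline.

    ```python
    # Sketch: recovering ex-Gaussian parameters (mu, sigma, tau) and the
    # intraindividual standard deviation (ISD) from simulated reaction times.
    import numpy as np
    from scipy.stats import exponnorm

    rng = np.random.default_rng(2)
    # simulated RTs (ms): Gaussian component plus exponential tail
    rt = rng.normal(500, 50, size=400) + rng.exponential(120, size=400)

    K, mu, sigma = exponnorm.fit(rt)      # exponnorm: K = tau / sigma
    tau = K * sigma
    isd = rt.std(ddof=1)                  # intraindividual standard deviation
    print(f"mu={mu:.0f} ms, sigma={sigma:.0f} ms, tau={tau:.0f} ms, ISD={isd:.0f} ms")
    ```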

  7. Run-time parallelization and scheduling of loops

    NASA Technical Reports Server (NTRS)

    Saltz, Joel H.; Mirchandaney, Ravi; Baxter, Doug

    1988-01-01

    The class of problems that can be effectively compiled by parallelizing compilers is discussed. This is accomplished with the doconsider construct which would allow these compilers to parallelize many problems in which substantial loop-level parallelism is available but cannot be detected by standard compile-time analysis. We describe and experimentally analyze mechanisms used to parallelize the work required for these types of loops. In each of these methods, a new loop structure is produced by modifying the loop to be parallelized. We also present the rules by which these loop transformations may be automated in order that they be included in language compilers. The main application area of the research involves problems in scientific computations and engineering. The workload used in our experiment includes a mixture of real problems as well as synthetically generated inputs. From our extensive tests on the Encore Multimax/320, we have reached the conclusion that for the types of workloads we have investigated, self-execution almost always performs better than pre-scheduling. Further, the improvement in performance that accrues as a result of global topological sorting of indices as opposed to the less expensive local sorting, is not very significant in the case of self-execution.
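
    The contrast between pre-scheduling and self-execution can be sketched with plain Python threads: one version assigns fixed iteration blocks before execution, the other lets workers claim the next iteration at run time. This is only an illustration of the scheduling idea (Python's GIL prevents true CPU parallelism here), not the authors' Encore Multimax implementation.

    ```python
    # Toy contrast between pre-scheduling (static blocks fixed before execution)
    # and self-execution (workers claim the next iteration at run time).
    import threading

    N_ITERS, N_WORKERS = 1000, 4
    results = [0.0] * N_ITERS

    def work(i):
        # stand-in for a loop body whose cost depends on the data
        results[i] = float(sum(k * k for k in range(i % 50)))

    def run_block(lo, hi):
        for i in range(lo, hi):
            work(i)

    def pre_scheduled():
        # iterations are split into fixed contiguous blocks ahead of time
        block = N_ITERS // N_WORKERS
        threads = [threading.Thread(target=run_block, args=(w * block, (w + 1) * block))
                   for w in range(N_WORKERS)]
        for t in threads: t.start()
        for t in threads: t.join()

    def self_scheduled():
        # each worker grabs the next unclaimed iteration as soon as it is free
        lock = threading.Lock()
        state = {"next": 0}

        def worker():
            while True:
                with lock:
                    i = state["next"]
                    state["next"] += 1
                if i >= N_ITERS:
                    return
                work(i)

        threads = [threading.Thread(target=worker) for _ in range(N_WORKERS)]
        for t in threads: t.start()
        for t in threads: t.join()

    pre_scheduled()
    self_scheduled()
    print(results[:5])
    ```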

  8. Mouth rinsing with a carbohydrate solution attenuates exercise-induced decline in executive function.

    PubMed

    Konishi, Kana; Kimura, Tetsuya; Yuhaku, Atsushi; Kurihara, Toshiyuki; Fujimoto, Masahiro; Hamaoka, Takafumi; Sanada, Kiyoshi

    2017-01-01

    A decline in executive function could have a negative influence on the control of actions in dynamic situations, such as sports activities. Mouth rinsing with a carbohydrate solution could serve as an effective treatment for preserving executive function during exercise. The purpose of this study was to examine the effects of mouth rinsing with a carbohydrate solution on executive function after sustained moderately high-intensity exercise. Eight young healthy participants completed 65 min of running at 75% V̇O2max under two mouth-rinsing conditions: with a carbohydrate solution (CHO) or with water (CON). Executive function was assessed before and after exercise by using the incongruent task of the Stroop Color and Word Test. The levels of blood glucose and plasma adrenocorticotropic hormone (ACTH), epinephrine, and norepinephrine (NE) were evaluated. A two-way repeated-measures ANOVA, with condition (CHO and CON) and time (pre-exercise and post-exercise) as factors, was used to examine the main and interaction effects on the outcome measures. The reaction time in the incongruent condition of the Stroop test significantly increased after exercise in CON (pre-exercise 529 ± 45 ms vs. post-exercise 547 ± 60 ms, P = 0.029) but not in CHO (pre-exercise 531 ± 54 ms vs. post-exercise 522 ± 80 ms), which resulted in a significant interaction (condition × time) on the reaction time (P = 0.028). The increased reaction time in CON indicates a decline in executive function, which was attenuated in CHO. Increases in plasma epinephrine and NE levels demonstrated a trend toward attenuation with CHO (P < 0.085), which appeared to be associated with the preservation of executive function. The blood glucose concentration showed neither significant interactions nor main effects of condition. These findings indicate that mouth rinsing with a carbohydrate solution attenuated the decline in executive function induced by sustained moderately high-intensity exercise, and that this attenuation seems to be unrelated to the carbohydrate metabolic pathway but rather attributable, in part, to the inhibition of an excessive release of stress hormones.

  9. The SNAPSHOT study protocol: SNAcking, Physical activity, Self-regulation, and Heart rate Over Time.

    PubMed

    McMinn, David; Allan, Julia L

    2014-09-26

    The cognitive processes responsible for effortful behavioural regulation are known as the executive functions, and are implicated in several factors associated with behaviour control, including focussing on tasks, resisting temptations, planning future actions, and inhibiting prepotent responses. Similar to muscles, the executive functions become fatigued following intensive use (e.g. stressful situations, when tired or busy, and when regulating behaviour such as quitting smoking). Therefore, an individual may be more susceptible to engaging in unhealthy behaviours when their executive functions are depleted. In the present study we investigate associations between the executive functions, snack food consumption, and sedentary behaviour in real time. We hypothesise that individuals may be more susceptible to unhealthy snacking and sedentary behaviours during periods when their executive functions are depleted. We test this hypothesis using real-time objective within-person measurements. A sample of approximately 50 Scottish adults from varied socio-economic, working, and cultural backgrounds will participate in the three phases of the SNAcking, Physical activity, Self-regulation, and Heart rate Over Time (SNAPSHOT) study. Phase one will require participants to complete home-based questionnaires concerned with diet, eating behaviour, and physical activity (≈1.5 hours to complete). Phase two will constitute a 2-3 hour psychological laboratory testing session during which trait-level executive function, general intelligence, and diet and physical activity intentions, past behaviour, and automaticity will be measured. The final phase will involve a 7-day ambulatory protocol during which objective repeated assessments of executive function, snacking behaviour, physical activity, mood, heart rate, perceived energy level, current context and location will be measured during participants' daily routines. Multi-level regression analysis, accounting for observations nested within participants, will be used to investigate associations between fluctuations in the executive functions and health behaviours. Data from the SNAPSHOT study will provide ecologically valid information to help better understand the temporal associations between self-regulatory resources (executive functions) and deleterious health behaviours such as snacking and sedentary behaviour. If we can identify particular periods of the day or locations where self-regulatory resources become depleted and produce suboptimal health behaviour, then interventions can be designed and targeted accordingly.

  10. Computer Simulation Results for the Two-Point Probability Function of Composite Media

    NASA Astrophysics Data System (ADS)

    Smith, P.; Torquato, S.

    1988-05-01

    Computer simulation results are reported for the two-point matrix probability function S2 of two-phase random media composed of disks distributed with an arbitrary degree of impenetrability λ. The novel technique employed to sample S2(r) (which gives the probability of finding the endpoints of a line segment of length r in the matrix) is very accurate and has a fast execution time. Results for the limiting cases λ = 0 (fully penetrable disks) and λ = 1 (hard disks), respectively, compare very favorably with theoretical predictions made by Torquato and Beasley and by Torquato and Lado. Results are also reported for several values of λ that lie between these two extremes: cases which heretofore have not been examined.
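
    A minimal Monte Carlo sketch of the sampling idea for the fully penetrable (λ = 0) limit is shown below: random line segments of length r are thrown into a periodic box of randomly placed disks and both endpoints are tested against the matrix phase. Parameters are illustrative and the code is not the authors' technique.

    ```python
    # Monte Carlo sketch of the two-point matrix probability function S2(r)
    # for fully penetrable disks (lambda = 0) in a periodic square box.
    import numpy as np

    rng = np.random.default_rng(3)
    L, radius, n_disks = 10.0, 0.25, 200            # box size, disk radius, disk count
    centers = rng.uniform(0, L, size=(n_disks, 2))

    def in_matrix(points):
        """True where a point lies outside every disk (minimum-image convention)."""
        d = np.abs(points[:, None, :] - centers[None, :, :])
        d = np.minimum(d, L - d)                     # periodic boundary conditions
        return ~np.any((d ** 2).sum(axis=2) < radius ** 2, axis=1)

    def S2(r, n_samples=20000):
        p1 = rng.uniform(0, L, size=(n_samples, 2))
        theta = rng.uniform(0, 2 * np.pi, n_samples)
        p2 = (p1 + r * np.column_stack([np.cos(theta), np.sin(theta)])) % L
        return np.mean(in_matrix(p1) & in_matrix(p2))

    for r in [0.0, 0.25, 0.5, 1.0]:                  # S2(0) estimates the porosity
        print(f"S2({r}) ~= {S2(r):.3f}")
    ```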

  11. In Search of a Pony: Sources, Methods, Outcomes, and Motivated Reasoning.

    PubMed

    Stone, Marc B

    2018-05-01

    It is highly desirable to be able to evaluate the effect of policy interventions. Such evaluations should have expected outcomes based upon sound theory and be carefully planned, objectively evaluated and prospectively executed. In many cases, however, assessments originate with investigators' poorly substantiated beliefs about the effects of a policy. Instead of designing studies that test falsifiable hypotheses, these investigators adopt methods and data sources that serve as little more than descriptions of these beliefs in the guise of analysis. Interrupted time series analysis is one of the most popular forms of analysis used to present these beliefs. It is intuitively appealing but, in most cases, it is based upon false analogies, fallacious assumptions and analytical errors.
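
    For concreteness, the sketch below shows the basic segmented-regression specification commonly used in interrupted time series analysis (level-change and slope-change terms around an intervention date), on simulated data; it illustrates the method being critiqued rather than endorsing it.

    ```python
    # Segmented-regression form of an interrupted time series analysis
    # (simulated monthly data; the intervention occurs at month 36).
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(4)
    n, t0 = 60, 36
    t = np.arange(n)
    post = (t >= t0).astype(float)                  # level-change indicator
    time_since = np.where(t >= t0, t - t0, 0)       # slope-change term
    y = 50 + 0.2 * t - 3.0 * post - 0.1 * time_since + rng.normal(0, 2, n)

    X = sm.add_constant(np.column_stack([t, post, time_since]))
    fit = sm.OLS(y, X).fit()
    print(fit.params)   # [baseline level, pre-trend, level change, slope change]
    ```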

  12. MapReduce in the Cloud: A Use Case Study for Efficient Co-Occurrence Processing of MEDLINE Annotations with MeSH.

    PubMed

    Kreuzthaler, Markus; Miñarro-Giménez, Jose Antonio; Schulz, Stefan

    2016-01-01

    Big data resources are difficult to process without a scaled hardware environment that is specifically adapted to the problem. The emergence of flexible cloud-based virtualization techniques promises solutions to this problem. This paper demonstrates how a billion lines can be processed in a reasonable amount of time in a cloud-based environment. Our use case addresses the accumulation of concept co-occurrence data in MEDLINE annotations as a series of MapReduce jobs, which can be scaled and executed in the cloud. Besides showing an efficient way of solving this problem, we generated an additional resource for the scientific community to be used for advanced text mining approaches.
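
    The co-occurrence accumulation can be sketched as a MapReduce-style pair-counting job: the map step emits one key per unordered pair of MeSH annotations on a citation and the reduce step sums the counts. The sketch below is a single-process toy with hypothetical input lines, not the cloud deployment described in the paper.

    ```python
    # Toy MapReduce-style co-occurrence counter for MeSH annotations.
    from collections import defaultdict
    from itertools import combinations

    medline_lines = [                       # hypothetical "PMID<TAB>terms" records
        "PMID1\tHumans;Neoplasms;Risk Factors",
        "PMID2\tHumans;Neoplasms",
        "PMID3\tHumans;Risk Factors",
    ]

    def map_step(line):
        _, annotations = line.split("\t")
        terms = sorted(set(annotations.split(";")))
        for pair in combinations(terms, 2):          # emit (pair, 1)
            yield pair, 1

    def reduce_step(mapped):
        counts = defaultdict(int)
        for pair, count in mapped:
            counts[pair] += count
        return counts

    mapped = (kv for line in medline_lines for kv in map_step(line))
    for pair, count in sorted(reduce_step(mapped).items()):
        print(pair, count)
    ```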

  13. Comparison of two transonic noise prediction formulations using the aircraft noise prediction program

    NASA Technical Reports Server (NTRS)

    Spence, Peter L.

    1987-01-01

    This paper addresses recently completed work on using Farassat's Formulation 3 noise prediction code with the Aircraft Noise Prediction Program (ANOPP). Software was written to link aerodynamic loading generated by the Propeller Loading (PLD) module within ANOPP with formulation 3. Included are results of comparisons between Formulation 3 with ANOPP's existing noise prediction modules, Subsonic Propeller Noise (SPN) and Transonic Propeller Noise (TPN). Four case studies are investigated. Results of the comparison studies show excellent agreement for the subsonic cases. Differences found in the comparisons made under transonic conditions are strictly numerical and can be explained by the way in which the time derivative is calculated in Formulation 3. Also included is a section on how to execute Formulation 3 with ANOPP.

  14. The Hedwig van Ameringen Executive Leadership in Academic Medicine® Program for Women: An Explanatory Study Regarding Its Development and Persistence

    ERIC Educational Resources Information Center

    Mensel, Ruth

    2010-01-01

    This study was designed to determine which factors contributed to the development and persistence of a women's leadership development program in higher education. The Hedwig van Ameringen Executive Leadership in Academic Medicine® Program for Women was the basis for this single-case study. To speculate about ELAM's development and…

  15. A Multiple Case Study: Gauging the Effects of Poverty on School Readiness amongst Preschoolers

    ERIC Educational Resources Information Center

    Onesto, Melissa J.

    2017-01-01

    The home environment, which includes the level of organization and stability in the home, plays a crucial role in the development of executive function and oral language skills. For children who live in a low-SES environment, executive function and oral language acquisition are inferior compared to that of students living at other economic levels.…

  16. Back from the Brink: How a Bold Vision and a Focus on Resources Can Drive System Improvement. Executive Summary

    ERIC Educational Resources Information Center

    Education Resource Strategies, 2015

    2015-01-01

    This executive summary describes a case study that implemented the framework School System 20/20 to examine how Lawrence Public Schools (Massachusetts) is transforming its policies and structures to better align resources with student and teacher needs. Education Resource Strategies' (ERS') School System 20/20 is a set of conditions and practices…

  17. A Case Study of the Development of African American Women Executives

    ERIC Educational Resources Information Center

    Brooks Greaux, Lisa

    2010-01-01

    Even in an era when the country elected an African American man as President of the United States, there is still a paucity of African American women executives within Fortune 500 companies. Although more African American women have joined the ranks of corporate management over the last two decades, the numbers, when compared to those of White…

  18. Working Memory and Mental Arithmetic: A Case for Dual Central Executive Resources

    ERIC Educational Resources Information Center

    Ketelsen, Kirk; Welsh, Marilyn

    2010-01-01

    The current study was designed to examine the possible existence of two limited-capacity pools of central executive resources: one each for verbal and visuospatial processing. Ninety-one college students (M age = 19.0, SD = 2.2) were administered a verbal working memory task that involved updating numbers in 2-, 3-, and 4-load conditions. The task…

  19. When "Happy" Means "Sad": Neuropsychological Evidence for the Right Prefrontal Cortex Contribution to Executive Semantic Processing

    ERIC Educational Resources Information Center

    Samson, Dana; Connolly, Catherine; Humphreys, Glyn W.

    2007-01-01

    The contribution of the left inferior prefrontal cortex in semantic processing has been widely investigated in the last decade. Converging evidence from functional imaging studies shows that this region is involved in the "executive" or "controlled" aspects of semantic processing. In this study, we report a single case study of a patient, PW, with…

  20. Deficits in executive and memory processes in delusional disorder: a case-control study.

    PubMed

    Ibanez-Casas, Inmaculada; De Portugal, Enrique; Gonzalez, Nieves; McKenney, Kathryn A; Haro, Josep M; Usall, Judith; Perez-Garcia, Miguel; Cervilla, Jorge A

    2013-01-01

    Delusional disorder has been traditionally considered a psychotic syndrome that does not evolve to cognitive deterioration. However, to date, very little empirical research has been done to explore cognitive executive components and memory processes in delusional disorder patients. This study investigated whether patients with delusional disorder are intact in both executive function components (such as flexibility, impulsivity and updating components) and memory processes (such as immediate, short-term and long-term recall, learning and recognition). A large sample of patients with delusional disorder (n = 86) and a group of healthy controls (n = 343) were compared with regard to their performance on a broad battery of neuropsychological tests including the Trail Making Test, Wisconsin Card Sorting Test, Colour-Word Stroop Test, and Complutense Verbal Learning Test (TAVEC). When compared to controls, cases of delusional disorder showed significantly poorer performance in most cognitive tests. Thus, we demonstrate deficits in flexibility, impulsivity and updating components of executive functions as well as in memory processes. These findings remained significant after taking into account sex, age, educational level and premorbid IQ. Our results do not support the traditional notion of patients with delusional disorder being cognitively intact.

  1. Hangman's fracture: a historical and biomechanical perspective.

    PubMed

    Rayes, Mahmoud; Mittal, Monika; Rengachary, Setti S; Mittal, Sandeep

    2011-02-01

    The execution technique of hanging, introduced by the Angle, Saxon, and Jute Germanic tribes during their invasions of the Roman Empire and Britain in the 5th century, has remained largely unchanged over time. The earliest form of a gallows was a tree on which prisoners were hanged. Despite the introduction of several modifications such as a trap door, the main mechanism of death remained asphyxiation. This created the opportunity for attempted revival after the execution, and indeed several well-known cases of survival following judicial hanging have been reported. It was not until the introduction of the standard drop by Dr. Samuel Haughton in 1866, and the so-called long drop by William Marwood in 1872 that hanging became a standard, humane means to achieve instantaneous death. Hangmen, however, fearing knot slippage, started substituting the subaural knot for the traditional submental knot. Subaural knots were not as effective, and cases of decapitation were recorded. Standardization of the long drop was further propagated by John Berry, an executioner who used mathematical calculations to estimate the correct drop length for each individual to be hanged. A British committee on capital sentences, led by Lord Aberdare, studied the execution method, and advocated for the submental knot. However, it was not until Frederic Wood-Jones published his seminal work in 1913 that cervical fractures were identified as the main mechanism of death following hanging in which the long drop and a submental knot were used. Schneider introduced the term "hangman's fracture" in 1965, and reported on the biomechanics and other similarities of the cervical fractures seen following judicial hangings and those caused by motor vehicle accidents.

  2. Executive Function, Survival, and Hospitalization in Chronic Obstructive Pulmonary Disease. A Longitudinal Analysis of the National Emphysema Treatment Trial (NETT).

    PubMed

    Dodd, James W; Novotny, Paul; Sciurba, Frank C; Benzo, Roberto P

    2015-10-01

    Cognitive dysfunction has been demonstrated in chronic obstructive pulmonary disease (COPD), but studies are limited to cross-sectional analyses or incompletely characterized populations. We examined longitudinal changes in sensitive measures of executive function in a well-characterized population of patients with severe COPD. This study was performed on patients enrolled in the National Emphysema Treatment Trial. To assess executive function, we analyzed trail making (TM) A and B times at enrollment in the trial (2,128 patients), and at 12 (731 patients) and 24 months (593 patients) after enrollment, adjusted for surgery, marriage status, age, education, income, depression, PaO2, PaCO2, and smoking. Associations with survival and hospitalizations were examined using Cox regression and linear regression models. The average age of the patients was 66.4 years, and the average FEV1 was 23.9% predicted. At the time of enrolment, 38% had executive dysfunction. Compared with those who did not, these patients were older, less educated, had higher oxygen use, higher PaCO2, worse quality of life as measured by the St. George's Respiratory Quotient, reduced well-being, and lower social function. There was no significant change over 2 years in TM A or B times after adjustment for covariables. Changes in TM B times were modestly associated with survival, but changes in TM B-A times were not. Changes in TM scores were not associated with frequency of hospitalization. Lung function, PaO2, smoking, survival, and hospitalizations were not significantly different in those with executive dysfunction. In this large population of patients with severe emphysema and heavy cigarette smoking exposure, there was no significant decline over 2 years in cognitive executive function as measured by TM tests. There was no association between executive function impairment and frequency of hospitalization, and there was a possible modest association with survival. It is plausible that cerebrovascular comorbidities explain previously described cognitive pathology in COPD.

  3. A Hybrid Procedural/Deductive Executive for Autonomous Spacecraft

    NASA Technical Reports Server (NTRS)

    Pell, Barney; Gamble, Edward B.; Gat, Erann; Kessing, Ron; Kurien, James; Millar, William; Nayak, P. Pandurang; Plaunt, Christian; Williams, Brian C.; Lau, Sonie (Technical Monitor)

    1998-01-01

    The New Millennium Remote Agent (NMRA) will be the first AI system to control an actual spacecraft. The spacecraft domain places a strong premium on autonomy and requires dynamic recoveries and robust concurrent execution, all in the presence of tight real-time deadlines, changing goals, scarce resource constraints, and a wide variety of possible failures. To achieve this level of execution robustness, we have integrated a procedural executive based on generic procedures with a deductive model-based executive. A procedural executive provides sophisticated control constructs such as loops, parallel activity, locks, and synchronization which are used for robust schedule execution, hierarchical task decomposition, and routine configuration management. A deductive executive provides algorithms for sophisticated state inference and optimal failure recovery planning. The integrated executive enables designers to code knowledge via a combination of procedures and declarative models, yielding a rich modeling capability suitable to the challenges of real spacecraft control. The interface between the two executives ensures both that recovery sequences are smoothly merged into high-level schedule execution and that a high degree of reactivity is retained to effectively handle additional failures during recovery.

  4. A study of workstation computational performance for real-time flight simulation

    NASA Technical Reports Server (NTRS)

    Maddalon, Jeffrey M.; Cleveland, Jeff I., II

    1995-01-01

    With recent advances in microprocessor technology, some have suggested that modern workstations provide enough computational power to properly operate a real-time simulation. This paper presents the results of a computational benchmark, based on actual real-time flight simulation code used at Langley Research Center, which was executed on various workstation-class machines. The benchmark was executed on different machines from several companies including: CONVEX Computer Corporation, Cray Research, Digital Equipment Corporation, Hewlett-Packard, Intel, International Business Machines, Silicon Graphics, and Sun Microsystems. The machines are compared by their execution speed, computational accuracy, and porting effort. The results of this study show that the raw computational power needed for real-time simulation is now offered by workstations.

  5. Kidnapping model: an extension of Selten's game.

    PubMed

    Iqbal, Azhar; Masson, Virginie; Abbott, Derek

    2017-12-01

    Selten's game is a kidnapping model where the probability of capturing the kidnapper is independent of whether the hostage has been released or executed. Most often, in view of the elevated sensitivities involved, authorities put greater effort and resources into capturing the kidnapper if the hostage has been executed, in contrast with the case when a ransom is paid to secure the hostage's release. In this paper, we study the asymmetric game when the probability of capturing the kidnapper depends on whether the hostage has been executed or not and find a new uniquely determined perfect equilibrium point in Selten's game.

  6. NPARC v3.1 User's Guide: A Companion to the NPARC v3.0 User's Guide

    NASA Technical Reports Server (NTRS)

    Chung, Joongkee; Slater, John W.; Suresh, Ambady; Townsend, Scott

    1999-01-01

    NPARC v3.1 is a modification to the NPARC v3.0 computer program which expands the capabilities for time-accurate computations through the use of a Newton iterative implicit method, time-varying boundary conditions, and planar dynamic grids. This document discusses some of the changes from NPARC v3.0, specifically: changes to the directory structure and execution, changes to the input format, background on new methods, new boundary conditions, dynamic grids, new options for output, usage concepts, and some test cases to serve as tutorials. This document is intended to be used in conjunction with the NPARC v3.0 user's guide.

  7. A Conceptual Level Design for a Static Scheduler for Hard Real-Time Systems

    DTIC Science & Technology

    1988-03-01

    The design of hard real-time systems is gaining a great deal of attention in the software engineering field as more and more real-world processes are...for these hard real-time systems. PSDL, as an executable design language, is supported by an execution support system consisting of a static scheduler, dynamic scheduler, and translator.

  8. Choosing a software design method for real-time Ada applications: JSD process inversion as a means to tailor a design specification to the performance requirements and target machine

    NASA Technical Reports Server (NTRS)

    Withey, James V.

    1986-01-01

    The validity of real-time software is determined by its ability to execute on a computer within the time constraints of the physical system it is modeling. In many applications the time constraints are so critical that the details of process scheduling are elevated to the requirements analysis phase of the software development cycle. It is not uncommon to find specifications for a real-time cyclic executive program included in or assumed by such requirements. It was found that preliminary designs structured around this implementation obscure the data flow of the real-world system being modeled and that it is consequently difficult and costly to maintain, update and reuse the resulting software. A cyclic executive is a software component that schedules and implicitly synchronizes the real-time software through periodic and repetitive subroutine calls. Therefore a design method is sought that allows the deferral of process scheduling to the later stages of design. The appropriate scheduling paradigm must be chosen given the performance constraints, the target environment and the software's lifecycle. The concept of process inversion is explored with respect to the cyclic executive.

  9. Efficient Helicopter Aerodynamic and Aeroacoustic Predictions on Parallel Computers

    NASA Technical Reports Server (NTRS)

    Wissink, Andrew M.; Lyrintzis, Anastasios S.; Strawn, Roger C.; Oliker, Leonid; Biswas, Rupak

    1996-01-01

    This paper presents parallel implementations of two codes used in a combined CFD/Kirchhoff methodology to predict the aerodynamic and aeroacoustic properties of helicopters. The rotorcraft Navier-Stokes code, TURNS, computes the aerodynamic flowfield near the helicopter blades, and the Kirchhoff acoustics code computes the noise in the far field, using the TURNS solution as input. The overall parallel strategy adds MPI message passing calls to the existing serial codes to allow for communication between processors. As a result, the total code modifications required for parallel execution are relatively small. The biggest bottleneck in running the TURNS code in parallel comes from the LU-SGS algorithm that solves the implicit system of equations. We use a new hybrid domain decomposition implementation of LU-SGS to obtain good parallel performance on the SP-2. TURNS demonstrates excellent parallel speedups for quasi-steady and unsteady three-dimensional calculations of a helicopter blade in forward flight. The execution rate attained by the code on 114 processors is six times faster than the same cases run on one processor of the Cray C-90. The parallel Kirchhoff code also shows excellent parallel speedups and fast execution rates. As a performance demonstration, unsteady acoustic pressures are computed at 1886 far-field observer locations for a sample acoustics problem. The calculation requires over two hundred hours of CPU time on one C-90 processor but takes only a few hours on 80 processors of the SP-2. The resultant far-field acoustic field is analyzed with state-of-the-art audio and video rendering of the propagating acoustic signals.

  10. Life cycle of medical product rules issued by the US Food and Drug Administration.

    PubMed

    Hwang, Thomas J; Avorn, Jerry; Kesselheim, Aaron S

    2014-08-01

    The US Food and Drug Administration (FDA) uses rulemaking as one of its primary tools to protect the public health and implement laws enacted by Congress and the president. Because of the many effects that these rules have on social welfare and the economy, the FDA and other executive agencies receive input from the executive branch, the public, and in some cases, the courts, during the process of rulemaking. In this article, we examine the life cycle of FDA regulations concerning medical products and review notable features of the rulemaking process. The current system grants substantial opportunities for diverse stakeholders to participate in and influence how rules are written and implemented. However, the duration, complexity, and adversarial qualities of the rulemaking process can hinder the FDA's ability to achieve its policy and public health goals. There is considerable variation in the level of transparency at different stages in the process, ranging from freely accessible public comments to undisclosed internal agency deliberations. In addition, significant medical product rules are associated with lengthy times to finalization, in some cases for unclear reasons. We conclude by identifying potential areas for reform on the basis of transparency and efficiency.

  11. Spontaneous abortion in the British semiconductor industry: An HSE investigation. Health and Safety Executive.

    PubMed

    Elliott, R C; Jones, J R; McElvenny, D M; Pennington, M J; Northage, C; Clegg, T A; Clarke, S D; Hodgson, J T; Osman, J

    1999-11-01

    The UK Health and Safety Executive (HSE) conducted a study to examine the risk of spontaneous abortion (SAB) in British female semiconductor industry workers, following reports from the USA which suggested an association between risk of SAB and work in fabrication rooms and/or exposure to ethylene glycol ethers. A nested case-control study based on 2,207 women who had worked at eight manufacturing sites during a 5-year retrospective time frame was established; 36 cases were matched with 80 controls. The overall SAB rate in the industry was 10.0% (65 SABs/651 pregnancies). The crude odds ratio (OR) for fabrication work was 0.65 (95% CI 0.30-1.40). This was essentially unchanged after adjustment for a range of potential confounding factors in the first 3 months of pregnancy and was reduced to 0.58 (95% CI 0.26-1.30) after adjustment for smoking in the previous 12 months. There were no statistically significantly elevated ORs for any work group or any specific chemical or physical exposure in the industry. There is no evidence of an increased risk of SAB in the British semiconductor industry.

  12. Multiresolution analysis of Bursa Malaysia KLCI time series

    NASA Astrophysics Data System (ADS)

    Ismail, Mohd Tahir; Dghais, Amel Abdoullah Ahmed

    2017-05-01

    In general, a time series is simply a sequence of numbers collected at regular intervals over a period. Financial time series data processing is concerned with the theory and practice of processing asset prices over time, such as currency, commodity, and stock market data. The primary aim of this study is to understand the fundamental characteristics of selected financial time series by using time- as well as frequency-domain analysis. After that, prediction can be executed for the desired system for in-sample forecasting. In this study, multiresolution analysis with the assistance of discrete wavelet transforms (DWT) and the maximal overlap discrete wavelet transform (MODWT) will be used to pinpoint special characteristics of Bursa Malaysia KLCI (Kuala Lumpur Composite Index) daily closing prices and return values. In addition, further case study discussions include the modeling of Bursa Malaysia KLCI using linear ARIMA with wavelets to address how the multiresolution approach improves fitting and forecasting results.
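
    A minimal sketch of a discrete wavelet multiresolution decomposition of a return series is shown below, assuming the PyWavelets package; the data are simulated rather than the KLCI series, and the MODWT step is omitted.

    ```python
    # DWT multiresolution sketch for a simulated closing-price series.
    import numpy as np
    import pywt

    rng = np.random.default_rng(5)
    prices = 1500 + np.cumsum(rng.normal(0, 5, size=1024))   # simulated index level
    returns = np.diff(np.log(prices))

    # Split the return series into approximation and detail coefficients
    coeffs = pywt.wavedec(returns, "db4", level=4)
    approx, details = coeffs[0], coeffs[1:]
    print("approximation length:", len(approx))
    print("detail lengths:", [len(d) for d in details])

    # Reconstruct only the smooth (low-frequency) component
    smooth = pywt.waverec([approx] + [np.zeros_like(d) for d in details], "db4")
    print("smooth component length:", len(smooth))
    ```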

  13. Implications of the Turing machine model of computation for processor and programming language design

    NASA Astrophysics Data System (ADS)

    Hunter, Geoffrey

    2004-01-01

    A computational process is classified according to the theoretical model that is capable of executing it; computational processes that require a non-predeterminable amount of intermediate storage for their execution are Turing-machine (TM) processes, while those whose storage is predeterminable are Finite Automaton (FA) processes. Simple processes (such as a traffic light controller) are executable by a Finite Automaton, whereas the most general kind of computation requires a Turing Machine for its execution. This implies that a TM process must have a non-predeterminable amount of memory allocated to it at intermediate instants of its execution; i.e. dynamic memory allocation. Many processes encountered in practice are TM processes. The implication for computational practice is that the hardware (CPU) architecture and its operating system must facilitate dynamic memory allocation, and that the programming language used to specify TM processes must have statements with the semantic attribute of dynamic memory allocation, for in Alan Turing's thesis on computation (1936) the "standard description" of a process is invariant over the most general data that the process is designed to process; i.e. the program describing the process should never have to be modified to allow for differences in the data that is to be processed in different instantiations; i.e. data-invariant programming. Any non-trivial program is partitioned into sub-programs (procedures, subroutines, functions, modules, etc). Examination of the calls/returns between the subprograms reveals that they are nodes in a tree-structure; this tree-structure is independent of the programming language used to encode (define) the process. Each sub-program typically needs some memory for its own use (to store values intermediate between its received data and its computed results); this locally required memory is not needed before the subprogram commences execution, and it is not needed after its execution terminates; it may be allocated as its execution commences, and deallocated as its execution terminates, and if the amount of this local memory is not known until just before execution commencement, then it is essential that it be allocated dynamically as the first action of its execution. This dynamically allocated/deallocated storage of each subprogram's intermediate values conforms with the stack discipline; i.e. last allocated = first to be deallocated, an incidental benefit of which is automatic overlaying of variables. This stack-based dynamic memory allocation was a semantic implication of the nested block structure that originated in the ALGOL-60 programming language. ALGOL-60 was a TM language, because the amount of memory allocated on subprogram (block/procedure) entry (for arrays, etc) was computable at execution time. A more general requirement of a Turing machine process is for code generation at run-time; this mandates access to the source language processor (compiler/interpreter) during execution of the process. This fundamental aspect of computer science is important to the future of system design, because it has been overlooked throughout the 55 years since modern computing began in 1948. The popular computer systems of this first half-century of computing were constrained by compile-time (or even operating system boot-time) memory allocation, and were thus limited to executing FA processes. The practical effect was that the distinction between the data-invariant program and its variable data was blurred; programmers had to make trial-and-error executions, modifying the program's compile-time constants (array dimensions) to iterate towards the values required at run-time by the data being processed. This era of trial-and-error computing still persists; it pervades the culture of current (2003) computing practice.
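
    The stack-discipline point can be illustrated with a small sketch in which each sub-program allocates local workspace whose size is known only from the data it receives at call time and releases it on return; the function names are purely illustrative.

    ```python
    # Illustration: each sub-program allocates run-time-sized local storage on
    # entry and reclaims it on return (stack discipline), so the same
    # data-invariant program handles inputs of any size without editing.
    def subprogram(data):
        workspace = [0] * len(data)          # size known only at run time
        for i, x in enumerate(data):
            workspace[i] = x * x
        return sum(workspace)                # workspace is reclaimed on return

    def process(datasets):
        # the same "standard description" serves every instantiation of the data
        return [subprogram(d) for d in datasets]

    print(process([[1, 2, 3], list(range(1000))]))
    ```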

  14. Multi-INT Complex Event Processing using Approximate, Incremental Graph Pattern Search

    DTIC Science & Technology

    2012-06-01

    graph pattern search and SPARQL queries. Total execution time for 10 executions each of 5 random pattern searches in synthetic data sets... [Figure: initial performance comparison (09/18/11) of the graph pattern algorithm versus SPARQL queries; execution time (secs) plotted against the number of RDF triples (1000 to 100000).]

  15. Siblings, Theory of Mind, and Executive Functioning in Children Aged 3-6 Years: New Longitudinal Evidence

    ERIC Educational Resources Information Center

    McAlister, Anna R.; Peterson, Candida C.

    2013-01-01

    Longitudinal data were obtained from 157 children aged 3 years 3 months to 5 years 6 months at Time 1. At Time 2 these children had aged an average of 12 months. Theory of mind (ToM) and executive functioning (EF) were measured at both time points. Results suggest that Time 1 ToM scores predict Time 2 EF scores. Detailed examination of sibling…

  16. Impact of Elevated Core Body Temperature on Attention Networks.

    PubMed

    Liu, Kai; Jiang, Qingjun; Li, Li; Li, Bo; Yang, Zhen; Qian, Shaowen; Li, Min; Sun, Gang

    2015-12-01

    Cognitive function can be impaired after passive heat exposure and with an elevation in core body temperature (Tcore). This study examined the dynamic correlation among passive heat exposure, Tcore, and cognition. We gave the Attention Network Test of alerting, orienting, and executive control to five groups of five young men who were exposed to a hyperthermic condition (50°C, 40% relative humidity) for 0, 10, 20, 30, or 40 minutes. We used the participants' reaction time, accuracy (correct responses), efficiency (accuracy ÷ reaction time), and Tcore to estimate the curve models giving the best fit to the data. We could not estimate an appropriate curve model for either alerting or orienting with Tcore, change in Tcore, or duration of passive heat exposure. We estimated quadratic models for Tcore and duration (adjusted R² = 0.752), change in Tcore and duration (0.906), executive control score and duration (0.509), and efficiency of executive control and duration (0.293). We estimated linear models for executive control score and Tcore (0.479), efficiency of executive control and Tcore (0.261), executive control score and change in Tcore (0.279), and efficiency of executive control and change in Tcore (0.262). Different attentional abilities had different sensitivities to thermal stress. Executive control of attention deteriorated linearly with a rise in Tcore within the normal physiologic range, but deteriorated nonlinearly with longer passive heat exposure.
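
    As an illustration of the curve-model comparison described above, the sketch below fits linear and quadratic polynomials to simulated temperature-performance data and reports adjusted R²; the data and numbers are not the study's.

    ```python
    # Illustrative linear vs. quadratic curve-model fit with adjusted R^2.
    import numpy as np

    rng = np.random.default_rng(6)
    tcore = np.linspace(36.8, 38.6, 25)                        # core temperature (°C)
    efficiency = 1.2 - 0.15 * (tcore - 36.8) + rng.normal(0, 0.02, tcore.size)

    def adjusted_r2(y, y_hat, n_params):
        ss_res = np.sum((y - y_hat) ** 2)
        ss_tot = np.sum((y - y.mean()) ** 2)
        r2 = 1 - ss_res / ss_tot
        return 1 - (1 - r2) * (len(y) - 1) / (len(y) - n_params - 1)

    for degree in (1, 2):                                      # linear vs. quadratic
        coeffs = np.polyfit(tcore, efficiency, degree)
        y_hat = np.polyval(coeffs, tcore)
        print(f"degree {degree}: adjusted R^2 = {adjusted_r2(efficiency, y_hat, degree):.3f}")
    ```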

  17. Embedding Temporal Constraints For Coordinated Execution in Habitat Automation

    NASA Technical Reports Server (NTRS)

    Morris, Paul; Schwabacher, Mark; Dalal, Michael; Fry, Charles

    2013-01-01

    Future NASA plans call for long-duration deep space missions with human crews. Because of light-time delay and other considerations, increased autonomy will be needed. This will necessitate integration of tools in such areas as anomaly detection, diagnosis, planning, and execution. In this paper we investigate an approach that integrates planning and execution by embedding planner-derived temporal constraints in an execution procedure. To avoid the need for propagation, we convert the temporal constraints to dispatchable form. We handle some uncertainty in the durations without it affecting the execution; larger variations may cause activities to be skipped.

  18. High speed cylindrical roller bearing analysis. SKF computer program CYBEAN. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    Dyba, G. J.; Kleckner, R. J.

    1981-01-01

    CYBEAN (CYlindrical BEaring ANalysis) was created to detail radially loaded, aligned and misaligned cylindrical roller bearing performance under a variety of operating conditions. Emphasis was placed on detailing the effects of high speed, preload and system thermal coupling. Roller tilt, skew, radial, circumferential and axial displacement as well as flange contact were considered. Variable housing and flexible out-of-round outer ring geometries, and both steady state and time transient temperature calculations were enabled. The complete range of elastohydrodynamic contact considerations, employing full and partial film conditions were treated in the computation of raceway and flange contacts. The practical and correct implementation of CYBEAN is discussed. The capability to execute the program at four different levels of complexity was included. In addition, the program was updated to properly direct roller-to-raceway contact load vectors automatically in those cases where roller or ring profiles have small radii of curvature. Input and output architectures containing guidelines for use and two sample executions are detailed.

  19. Walking execution is not affected by divided attention in patients with multiple sclerosis with no disability, but there is a motor planning impairment.

    PubMed

    Nogueira, Leandro Alberto Calazans; Santos, Luciano Teixeira Dos; Sabino, Pollyane Galinari; Alvarenga, Regina Maria Papais; Thuler, Luiz Claudio Santos

    2013-08-01

    We analysed the cognitive influence on walking in multiple sclerosis (MS) patients, in the absence of clinical disability. A case-control study was conducted with 12 MS patients with no disability and 12 matched healthy controls. Subjects were referred for completion of a 10-m timed walk test and a 3D kinematic analysis. Participants were instructed to walk at a comfortable speed in a dual-task (arithmetic task) condition, and motor planning was measured by mental chronometry. Scores of walking speed and cadence showed no statistically significant differences between the groups in the three conditions. The dual-task condition showed an increase in the double support duration in both groups. Motor imagery analysis showed statistically significant differences between real and imagined walking in patients. MS patients with no disability did not show any influence of divided attention on walking execution. However, motor planning was overestimated as compared with real walking.

  20. Analyzing the test process using structural coverage

    NASA Technical Reports Server (NTRS)

    Ramsey, James; Basili, Victor R.

    1985-01-01

    A large, commercially developed FORTRAN program was modified to produce structural coverage metrics. The modified program was executed on a set of functionally generated acceptance tests and a large sample of operational usage cases. The resulting structural coverage metrics are combined with fault and error data to evaluate structural coverage. It was shown that in the software environment the functionally generated tests seem to be a good approximation of operational use. The relative proportions of the exercised statement subclasses change as the structural coverage of the program increases. A method was also proposed for evaluating if two sets of input data exercise a program in a similar manner. Evidence was provided that implies that in this environment, faults revealed in a procedure are independent of the number of times the procedure is executed and that it may be reasonable to use procedure coverage in software models that use statement coverage. Finally, the evidence suggests that it may be possible to use structural coverage to aid in the management of the acceptance test process.

  1. Software as a service approach to sensor simulation software deployment

    NASA Astrophysics Data System (ADS)

    Webster, Steven; Miller, Gordon; Mayott, Gregory

    2012-05-01

    Traditionally, military simulation has been problem domain specific. Executing an exercise currently requires multiple simulation software providers to specialize, deploy, and configure their respective implementations, integrate the collection of software to achieve a specific system behavior, and then execute for the purpose at hand. This approach leads to rigid system integrations which require simulation expertise for each deployment due to changes in location, hardware, and software. Our alternative is Software as a Service (SaaS) predicated on the virtualization of Night Vision Electronic Sensors (NVESD) sensor simulations as an exemplary case. Management middleware elements layer self-provisioning, configuration, and integration services onto the virtualized sensors to present a system of services at run time. Given an Infrastructure as a Service (IaaS) environment, this enabled and managed system of simulations yields a durable SaaS delivery without requiring user simulation expertise. Persistent SaaS simulations would provide on-demand availability to connected users, decrease integration costs and timelines, and let the domain community benefit from immediate deployment of lessons learned.

  2. Spontaneous recovery of memory functions in an untreated case of anti NMDAR encephalitis - a reason to maintain hope.

    PubMed

    McIvor, Katherine; Moore, Perry

    2017-01-01

    Anti N-methyl-D-aspartate receptor (anti-NMDAR) encephalitis is an autoimmune disorder that was only fully discovered recently and neuropsychological outcome data remains sparse. We present the case of BA, a 19-year-old male, which illustrates the cognitive outcome in an untreated case over a time period of over 2½ years. We conducted three cognitive assessments, including tests of memory and executive functioning, over this time period and considered the evidence for reliable change in memory function using the Wechsler Advanced Clinical Solutions (ACS) serial assessment package. Our findings revealed mild memory problems 6 months post-discharge with, at best, static and potentially declining memory functioning at follow-up assessment 12 months post-discharge. However, the results of testing at 30 months post-discharge revealed significant improvements in immediate and delayed memory index performances. Our report of a case of anti-NMDAR encephalitis provides evidence for spontaneous improvements in memory functioning occurring more than 2 years after initial assessment and also demonstrates both the utility and potential limitations of the ACS serial assessment software when used in a relatively typical clinical assessment situation.

  3. Potential Flow Theory and Operation Guide for the Panel Code PMARC. Version 14

    NASA Technical Reports Server (NTRS)

    Ashby, Dale L.

    1999-01-01

    The theoretical basis for PMARC, a low-order panel code for modeling complex three-dimensional bodies in potential flow, is outlined. PMARC can be run on a wide variety of computer platforms, including desktop machines, workstations, and supercomputers. Execution times for PMARC vary tremendously depending on the computer resources used, but typically range from several minutes for simple or moderately complex cases to several hours for very large complex cases. Several of the advanced features currently included in the code, such as internal flow modeling, boundary layer analysis, and time-dependent flow analysis, including problems involving relative motion, are discussed in some detail. The code is written in Fortran77, using adjustable-size arrays so that it can be easily redimensioned to match problem requirements and computer hardware constraints. An overview of the program input is presented. A detailed description of the input parameters is provided in the appendices. PMARC results for several test cases are presented along with analytic or experimental data, where available. The input files for these test cases are given in the appendices. PMARC currently supports plotfile output formats for several commercially available graphics packages. The supported graphics packages are Plot3D, Tecplot, and PmarcViewer.

  4. The Politics of Smoking in Federal Buildings: An Executive Order Case Study

    PubMed Central

    Bero, Lisa A.

    2009-01-01

    Executive orders are important presidential tools for health policymaking that are subject to less public scrutiny than are legislation and regulatory rulemaking. President Bill Clinton banned smoking in federal government buildings by executive order in 1997, after the administration of George H. W. Bush had twice considered and abandoned a similar policy. The 1991 and 1993 Bush proposals drew objections from agency heads and labor unions, many coordinated by the tobacco industry. We analyzed internal tobacco industry documents and found that the industry engaged in extensive executive branch lobbying and other political activity surrounding the Clinton smoking ban. Whereas some level of stakeholder politics might have been expected, this policy also featured jockeying among various agencies and the participation of organized labor. PMID:19608948

  5. Executive Leadership and Physician Well-being: Nine Organizational Strategies to Promote Engagement and Reduce Burnout.

    PubMed

    Shanafelt, Tait D; Noseworthy, John H

    2017-01-01

    These are challenging times for health care executives. The health care field is experiencing unprecedented changes that threaten the survival of many health care organizations. To successfully navigate these challenges, health care executives need committed and productive physicians working in collaboration with organization leaders. Unfortunately, national studies suggest that at least 50% of US physicians are experiencing professional burnout, indicating that most executives face this challenge with a disillusioned physician workforce. Burnout is a syndrome characterized by exhaustion, cynicism, and reduced effectiveness. Physician burnout has been shown to influence quality of care, patient safety, physician turnover, and patient satisfaction. Although burnout is a system issue, most institutions operate under the erroneous framework that burnout and professional satisfaction are solely the responsibility of the individual physician. Engagement is the positive antithesis of burnout and is characterized by vigor, dedication, and absorption in work. There is a strong business case for organizations to invest in efforts to reduce physician burnout and promote engagement. Herein, we summarize 9 organizational strategies to promote physician engagement and describe how we have operationalized some of these approaches at Mayo Clinic. Our experience demonstrates that deliberate, sustained, and comprehensive efforts by the organization to reduce burnout and promote engagement can make a difference. Many effective interventions are relatively inexpensive, and small investments can have a large impact. Leadership and sustained attention from the highest level of the organization are the keys to making progress. Copyright © 2016 Mayo Foundation for Medical Education and Research. Published by Elsevier Inc. All rights reserved.

  6. To What Extent Can Motor Imagery Replace Motor Execution While Learning a Fine Motor Skill?

    PubMed Central

    Sobierajewicz, Jagna; Szarkiewicz, Sylwia; Przekoracka-Krawczyk, Anna; Jaśkowski, Wojciech; van der Lubbe, Rob

    2016-01-01

    Motor imagery is generally thought to share common mechanisms with motor execution. In the present study, we examined to what extent learning a fine motor skill by motor imagery may substitute physical practice. Learning effects were assessed by manipulating the proportion of motor execution and motor imagery trials. Additionally, learning effects were compared between participants with an explicit motor imagery instruction and a control group. A Go/NoGo discrete sequence production (DSP) task was employed, wherein a five-stimulus sequence presented on each trial indicated the required sequence of finger movements after a Go signal. In the case of a NoGo signal, participants either had to imagine carrying out the response sequence (the motor imagery group), or the response sequence had to be withheld (the control group). Two practice days were followed by a final test day on which all sequences had to be executed. Learning effects were assessed by computing response times (RTs) and the percentages of correct responses (PCs). The electroencephalogram (EEG) was additionally measured on this test day to examine whether motor preparation and the involvement of visual short term memory (VSTM) depended on the amount of physical/mental practice. Accuracy data indicated strong learning effects. However, a substantial amount of physical practice was required to reach an optimal speed. EEG results suggest the involvement of VSTM for sequences that had less or no physical practice in both groups. The absence of differences between the motor imagery and the control group underlines the possibility that motor preparation may actually resemble motor imagery. PMID:28154614

  7. To What Extent Can Motor Imagery Replace Motor Execution While Learning a Fine Motor Skill?

    PubMed

    Sobierajewicz, Jagna; Szarkiewicz, Sylwia; Przekoracka-Krawczyk, Anna; Jaśkowski, Wojciech; van der Lubbe, Rob

    2016-01-01

    Motor imagery is generally thought to share common mechanisms with motor execution. In the present study, we examined to what extent learning a fine motor skill by motor imagery may substitute physical practice. Learning effects were assessed by manipulating the proportion of motor execution and motor imagery trials. Additionally, learning effects were compared between participants with an explicit motor imagery instruction and a control group. A Go/NoGo discrete sequence production (DSP) task was employed, wherein a five-stimulus sequence presented on each trial indicated the required sequence of finger movements after a Go signal. In the case of a NoGo signal, participants either had to imagine carrying out the response sequence (the motor imagery group), or the response sequence had to be withheld (the control group). Two practice days were followed by a final test day on which all sequences had to be executed. Learning effects were assessed by computing response times (RTs) and the percentages of correct responses (PCs). The electroencephalogram (EEG) was additionally measured on this test day to examine whether motor preparation and the involvement of visual short term memory (VSTM) depended on the amount of physical/mental practice. Accuracy data indicated strong learning effects. However, a substantial amount of physical practice was required to reach an optimal speed. EEG results suggest the involvement of VSTM for sequences that had less or no physical practice in both groups. The absence of differences between the motor imagery and the control group underlines the possibility that motor preparation may actually resemble motor imagery.

  8. Efficiently Scheduling Multi-core Guest Virtual Machines on Multi-core Hosts in Network Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoginath, Srikanth B; Perumalla, Kalyan S

    2011-01-01

    Virtual machine (VM)-based simulation is a method used by network simulators to incorporate realistic application behaviors by executing actual VMs as high-fidelity surrogates for simulated end-hosts. A critical requirement in such a method is the simulation time-ordered scheduling and execution of the VMs. Prior approaches such as time dilation are less efficient due to the high degree of multiplexing possible when multiple multi-core VMs are simulated on multi-core host systems. We present a new simulation time-ordered scheduler to efficiently schedule multi-core VMs on multi-core real hosts, with a virtual clock realized on each virtual core. The distinguishing features of our approach are: (1) customizable granularity of the VM scheduling time unit on the simulation time axis, (2) ability to take arbitrary leaps in virtual time by VMs to maximize the utilization of host (real) cores when guest virtual cores idle, and (3) empirically determinable optimality in the tradeoff between total execution (real) time and time-ordering accuracy levels. Experiments show that it is possible to get nearly perfect time-ordered execution, with a slight cost in total run time, relative to optimized non-simulation VM schedulers. Interestingly, with our time-ordered scheduler, it is also possible to reduce the time-ordering error from over 50% with a non-simulation scheduler to less than 1% with our scheduler, with almost the same run time efficiency as that of the highly efficient non-simulation VM schedulers.
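
    A conceptual sketch of simulation time-ordered scheduling follows (Python, with hypothetical data structures; this is not the authors' implementation): the virtual core with the smallest virtual clock is always dispatched next, and an idle core may leap forward in virtual time so the host core is not wasted.

        import heapq

        def time_ordered_schedule(vcores, quantum, horizon):
            """Dispatch virtual cores in virtual-clock order (conceptual sketch).

            vcores: dict name -> {"clock": float, "idle_until": float}
            quantum: scheduling granularity on the simulation time axis
            """
            heap = [(vc["clock"], name) for name, vc in vcores.items()]
            heapq.heapify(heap)
            trace = []
            while heap and heap[0][0] < horizon:
                clock, name = heapq.heappop(heap)
                vc = vcores[name]
                if vc["idle_until"] > clock:
                    vc["clock"] = vc["idle_until"]       # leap over idle virtual time
                else:
                    trace.append((name, clock))          # run one quantum at this virtual time
                    vc["clock"] = clock + quantum
                heapq.heappush(heap, (vc["clock"], name))
            return trace

        cores = {"vm1-core0": {"clock": 0.0, "idle_until": 0.0},
                 "vm2-core0": {"clock": 0.0, "idle_until": 3.0}}
        print(time_ordered_schedule(cores, quantum=1.0, horizon=5.0))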

  9. Usability assessment of an electronic health record in a comprehensive dental clinic.

    PubMed

    Suebnukarn, Siriwan; Rittipakorn, Pawornwan; Thongyoi, Budsara; Boonpitak, Kwanwong; Wongsapai, Mansuang; Pakdeesan, Panu

    2013-12-01

    In this paper we present the development and usability of an electronic health record (EHR) system in a comprehensive dental clinic. The graphic user interface of the system was designed to consider the concept of cognitive ergonomics. The cognitive task analysis was used to evaluate the user interface of the EHR by identifying all sub-tasks and classifying them into mental or physical operators, and to predict task execution time required to perform the given task. We randomly selected 30 cases that had oral examinations for routine clinical care in a comprehensive dental clinic. The results were based on the analysis of 4 prototypical tasks performed by ten EHR users. The results showed that on average a user needed to go through 27 steps to complete all tasks for one case. To perform all 4 tasks of 30 cases, they spent about 91 min (independent of system response time) for data entry, of which 51.8 min were spent on more effortful mental operators. In conclusion, the user interface can be improved by reducing the percentage of mental effort required for the tasks.
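
    The operator-based prediction of task execution time described above can be illustrated with a keystroke-level-style sum of operator times. The operator set and durations below are illustrative assumptions, not the values used in the study.

        # Hypothetical operator times (seconds); the study's actual values differ.
        OPERATOR_TIME = {
            "K": 0.28,   # physical: keystroke or click
            "P": 1.10,   # physical: point with the mouse
            "H": 0.40,   # physical: move hand between devices
            "M": 1.35,   # mental: prepare / decide
        }

        def predict_task_time(operators):
            """Predict execution time and the share spent on mental operators."""
            total = sum(OPERATOR_TIME[op] for op in operators)
            mental = sum(OPERATOR_TIME[op] for op in operators if op == "M")
            return total, mental / total

        # One hypothetical charting sub-task: decide, point, click, decide, type 4 keys.
        ops = ["M", "P", "K", "M", "K", "K", "K", "K"]
        total, mental_share = predict_task_time(ops)
        print(f"predicted time: {total:.2f} s, mental share: {mental_share:.0%}")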

  10. Voltage scheduling for low power/energy

    NASA Astrophysics Data System (ADS)

    Manzak, Ali

    2001-07-01

    Power considerations have become an increasingly dominant factor in the design of both portable and desk-top systems. An effective way to reduce power consumption is to lower the supply voltage, since power is quadratically related to voltage. This dissertation considers the problem of lowering the supply voltage at (i) the system level and at (ii) the behavioral level. At the system level, the voltage of the variable voltage processor is dynamically changed with the work load. Processors with limited sized buffers as well as those with very large buffers are considered. Given the task arrival times, deadline times, execution times, periods and switching activities, task scheduling algorithms that minimize energy or peak power are developed for the processors equipped with very large buffers. A relation between the operating voltages of the tasks for minimum energy/power is determined using the Lagrange multiplier method, and an iterative algorithm that utilizes this relation is developed. Experimental results show that the voltage assignment obtained by the proposed algorithm is very close (0.1% error) to that of the optimal energy assignment and the optimal peak power (1% error) assignment. Next, on-line and off-line minimum energy task scheduling algorithms are developed for processors with limited sized buffers. These algorithms have polynomial time complexity and present optimal (off-line) and close-to-optimal (on-line) solutions. A procedure to calculate the minimum buffer size given information about the size of the task (maximum, minimum), execution time (best case, worst case) and deadlines is also presented. At the behavioral level, resources operating at multiple voltages are used to minimize power while maintaining the throughput. Such a scheme has the advantage of allowing modules on the critical paths to be assigned to the highest voltage levels (thus meeting the required timing constraints) while allowing modules on non-critical paths to be assigned to lower voltage levels (thus reducing the power consumption). A polynomial time resource and latency constrained scheduling algorithm is developed to distribute the available slack among the nodes such that power consumption is minimum. The algorithm is iterative and utilizes the slack based on the Lagrange multiplier method.
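
    A toy sketch of the voltage-scaling intuition (not the dissertation's algorithm): with dynamic energy per cycle roughly proportional to V^2 and clock frequency roughly proportional to V (threshold effects ignored), stretching a task to just meet its deadline by lowering the voltage reduces energy quadratically. All constants below are assumed for illustration.

        # Toy model: energy per cycle ~ V^2, frequency ~ V (threshold effects ignored).
        V_MAX = 1.2          # volts, at which the core runs at F_MAX
        F_MAX = 1.0e9        # cycles per second at V_MAX

        def energy(cycles, deadline):
            """Energy (arbitrary units) to run `cycles` within `deadline` seconds."""
            f_needed = cycles / deadline                 # slowest frequency meeting the deadline
            f = min(max(f_needed, 0.0), F_MAX)
            v = V_MAX * f / F_MAX                        # voltage scales ~linearly with frequency
            return cycles * v ** 2

        cycles, deadline = 4.0e8, 1.0                    # task uses 40% of peak throughput
        print("run at full voltage :", energy(cycles, cycles / F_MAX))
        print("stretch to deadline :", energy(cycles, deadline))   # ~0.4^2 of the above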

  11. Assessing Higher Level Learning: Developing Rubrics for Case Analysis

    ERIC Educational Resources Information Center

    Rochford, Linda; Borchert, Patricia S.

    2011-01-01

    Case study analyses allow students to demonstrate proficiency in executing authentic tasks in marketing and management, facilitating faculty evaluation of higher order learning outcomes. Effective and consistent assessment of case analyses depends in large part on the development of sound rubrics. The authors explored the process of rubric…

  12. The effect of federal health policy on occupational medicine.

    PubMed

    McCunney, R J; Cikins, W

    1990-01-01

    All three branches of the federal government affect occupational medicine. Notable examples include: 1) the Department of Transportation ruling (1988) requiring drug testing in diverse areas of the transportation industry (executive branch); 2) the Workplace Drug Act (1988) calling for organizations to have a policy towards drug and alcohol abuse (legislative branch); and 3) the Supreme Court ruling on the constitutionality of drug testing in the transportation industry (1989) and that infectious diseases are a handicap in accordance with the 1973 Federal Rehabilitation Act (1987). The executive branch plays a major role in occupational medicine primarily through the Occupational Safety and Health Administration (OSHA), which issues standards based on a rule making process; the executive branch can also affect occupational medicine indirectly, as evidenced by President Reagan's Executive Order 12291 calling for Office of Management and Budget oversight of regulatory initiatives. The legislative branch enacts laws, conducts hearings, and requests reports on the operations of federal agencies. The judicial branch addresses occupational health issues when people affected by an executive ruling want to challenge the ruling; or in the case of the Supreme Court, when deliberating an issue over which two circuit courts of appeal have come to divergent opinions. The Occupational Medicine profession can participate in the political process through awareness of proposed legislation and by responding accordingly with letters, resolutions, or testimony. Similar options exist within the executive branch by participating in the rule-making process. A representative of the Governmental Affairs Committee, through periodic visits with key Washington representatives, can keep members of the American College of Occupational Medicine informed about federal legislative and regulatory activities. In appropriate cases, the organization can then take a formal position on governmental activities that affect the speciality.

  13. Cognitive and Executive Functions in Colombian School Children with Conduct Disorder: Sex Differences.

    PubMed

    Urazán-Torres, Gina Rocío; Puche-Cabrera, Mario José; Caballero-Forero, Mangelli; Rey-Anacona, César Armando

    2013-12-01

    Most of the studies that have examined cognitive and executive functions in conduct disorders (CD) have been conducted on institutionalized male adolescents. In this research the cognitive and executive functions of non-institutionalized Colombian school children with CD were compared with normal school children, all between 6 and 12 years-old. We used a case-control design. The cases were participants who met the diagnostic criteria for CD (n=39) and controls who did not meet these criteria (n=39), according to reports of a professional of the participants' institution, and a structured interview for childhood psychiatric syndromes. The two groups were selected from educational institutions, and there were no differences in age, school grade, or socioeconomic level. The IQ was reviewed, as well as the presence of other mental disorders, serious physical illnesses, and more serious neurological signs. The cognitive and executive functions were evaluated using a child neuropsychological test battery. We found that participants with CD had significantly lower scores in construction abilities, perceptual abilities (tactile, visual and auditory), verbal memory, visual memory, language (repetition, expression and understanding), meta-linguistic abilities, spatial abilities, visual and auditory attention, conceptual abilities, verbal and graphic fluency, and cognitive flexibility. The same differences were found between males, except in repetition, whereas girls showed fewer differences; thus the cognitive and executive performance was poorer in males with CD than in females, especially in verbal and linguistic-related functions. Children with CD could show generalized cognitive and executive deficits. These deficits seem to be more frequent in boys than in girls with CD. Copyright © 2013 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.

  14. Putting time into proof outlines

    NASA Technical Reports Server (NTRS)

    Schneider, Fred B.; Bloom, Bard; Marzullo, Keith

    1993-01-01

    A logic for reasoning about timing properties of concurrent programs is presented. The logic is based on Hoare-style proof outlines and can handle maximal parallelism as well as certain resource-constrained execution environments. The correctness proof for a mutual exclusion protocol that uses execution timings in a subtle way illustrates the logic in action. A soundness proof using structural operational semantics is outlined in the appendix.
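
    As a hedged illustration only (the notation is assumed, not the paper's exact formalism), a timed proof-outline assertion can bound when a statement completes by extending an ordinary Hoare triple with a clock variable:

        % Illustrative timed Hoare triple: if S starts in a state satisfying P at
        % time t, it terminates in a state satisfying Q no later than t + \Delta.
        \[
        \{\, P \wedge \mathit{now} = t \,\}\; S \;\{\, Q \wedge \mathit{now} \le t + \Delta \,\}
        \]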

  15. Executive Beliefs About the Critical Success Factors In Defining, Designing, Developing and Delivering e-Learning For Adult Professional Development in Corporations

    ERIC Educational Resources Information Center

    Armstrong, Ann W.

    2008-01-01

    The purpose of this study was to gain an understanding of what e-Learning executives believe are the critical success factors for companies to successfully deliver training and education over the world-wide web. The study was a qualitative, multiple-case study design including in-depth, semi-structured interviews that incorporated verbal critical…

  16. Dynamic Test Generation for Large Binary Programs

    DTIC Science & Technology

    2009-11-12

    the fuzzing@whitestar.linuxbox.org mailing list, including Jared DeMott, Disco Jonny, and Ari Takanen, for discussions on fuzzing tradeoffs. Martin...as is the case for large applications where exercising all execution paths is virtually hopeless anyway. This point will be further discussed in...consumes trace files generated by iDNA and virtually re-executes the recorded runs. TruScan offers several features that substantially simplify symbolic

  17. Creating and Implementing an Offshore Graduate Program: A Case Study of Leadership and Development of the Global Executive MBA Program

    ERIC Educational Resources Information Center

    Herrera, Marisa L.

    2013-01-01

    This study applies the literature on leadership framing to the globalization of higher education to understand the development of the Global Executive MBA program at a large university. The purpose of the study was to provide administrators, educators and university leaders an understanding as to how to respond to globalization and, secondly, to…

  18. Evaluating Executive Strategies (Management Strategies and Teaching-Learning Strategies) of Graduate Curriculum: Case Study in Isfahan University

    ERIC Educational Resources Information Center

    Rahmanpour, Muhammad; Ahmadi, Mojtaba; Hatami, Mostafa; Mirzaee, Hamzeh

    2017-01-01

    The present study seeks to evaluate executive strategies in graduate Curriculum of Isfahan University from the point of view of management and teaching-learning strategies. This study is an applied survey. The population comprised BA students and faculty members of the University of Isfahan. In order to do so, 141 professors and 278 students were…

  19. Parallel Execution of Functional Mock-up Units in Buildings Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ozmen, Ozgur; Nutaro, James J.; New, Joshua Ryan

    2016-06-30

    A Functional Mock-up Interface (FMI) defines a standardized interface to be used in computer simulations to develop complex cyber-physical systems. FMI implementation by a software modeling tool enables the creation of a simulation model that can be interconnected, or the creation of a software library called a Functional Mock-up Unit (FMU). This report describes an FMU wrapper implementation that imports FMUs into a C++ environment and uses an Euler solver that executes FMUs in parallel using Open Multi-Processing (OpenMP). The purpose of this report is to elucidate the runtime performance of the solver when a multi-component system is imported as a single FMU (for the whole system) or as multiple FMUs (for different groups of components as sub-systems). This performance comparison is conducted using two test cases: (1) a simple, multi-tank problem; and (2) a more realistic use case based on the Modelica Buildings Library. In both test cases, the performance gains are promising when each FMU consists of a large number of states and state events that are wrapped in a single FMU. Load balancing is demonstrated to be a critical factor in speeding up parallel execution of multiple FMUs.
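
    A simplified sketch of the parallel Euler co-simulation loop follows (Python with a thread pool standing in for the report's C++/OpenMP solver; the ToyFMU class is a hypothetical reduction of FMI's doStep interface, not the real FMI API).

        from concurrent.futures import ThreadPoolExecutor

        class ToyFMU:
            """Hypothetical stand-in for an imported FMU exposing an Euler step."""
            def __init__(self, state, rate):
                self.state, self.rate = state, rate
            def do_step(self, t, dt):
                self.state += self.rate * self.state * dt   # dx/dt = rate * x

        def simulate(fmus, t_end, dt, workers=4):
            t = 0.0
            with ThreadPoolExecutor(max_workers=workers) as pool:
                while t < t_end:
                    # Parallel region: each FMU advances one Euler step independently;
                    # consuming the map acts as a barrier before values are exchanged.
                    list(pool.map(lambda f: f.do_step(t, dt), fmus))
                    t += dt
            return [f.state for f in fmus]

        fmus = [ToyFMU(1.0, -0.5), ToyFMU(2.0, -0.1), ToyFMU(0.5, 0.3)]
        print(simulate(fmus, t_end=1.0, dt=0.01))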

  20. Executive function processes predict mobility outcomes in older adults.

    PubMed

    Gothe, Neha P; Fanning, Jason; Awick, Elizabeth; Chung, David; Wójcicki, Thomas R; Olson, Erin A; Mullen, Sean P; Voss, Michelle; Erickson, Kirk I; Kramer, Arthur F; McAuley, Edward

    2014-02-01

    To examine the relationship between performance on executive function measures and subsequent mobility outcomes in community-dwelling older adults. Randomized controlled clinical trial. Champaign-Urbana, Illinois. Community-dwelling older adults (N = 179; mean age 66.4). A 12-month exercise trial with two arms: an aerobic exercise group and a stretching and strengthening group. Established cognitive tests of executive function (flanker task, task switching, and a dual-task paradigm) and the Wisconsin card sort test. Mobility was assessed using the timed 8-foot up and go test and times to climb up and down a flight of stairs. Participants completed the cognitive tests at baseline and the mobility measures at baseline and after 12 months of the intervention. Multiple regression analyses were conducted to determine whether baseline executive function predicted postintervention functional performance after controlling for age, sex, education, cardiorespiratory fitness, and baseline mobility levels. Selective baseline executive function measurements, particularly performance on the flanker task (β = 0.15-0.17) and the Wisconsin card sort test (β = 0.11-0.16) consistently predicted mobility outcomes at 12 months. The estimates were in the expected direction, such that better baseline performance on the executive function measures predicted better performance on the timed mobility tests independent of intervention. Executive functions of inhibitory control, mental set shifting, and attentional flexibility were predictive of functional mobility. Given the literature associating mobility limitations with disability, morbidity, and mortality, these results are important for understanding the antecedents to poor mobility function that well-designed interventions to improve cognitive performance can attenuate. © 2014, Copyright the Authors Journal compilation © 2014, The American Geriatrics Society.

  1. Leadership Characteristics for Health Care Managers: Perspectives of Chief Executive Officers in US Hospitals.

    PubMed

    Collins, Sandra K; McKinnies, Richard; Collins, Kevin S

    2015-01-01

    A study was conducted to determine the perceptions of chief executive officers in US hospitals regarding the most important characteristics aspiring health care executives should possess. The results of this 2012 study were compared with a previous study conducted in 2007 to determine if the perceptions had changed over time.

  2. TMS for Instantiating a Knowledge Base With Incomplete Data

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    A computer program that belongs to the class known among software experts as output truth-maintenance-systems (output TMSs) has been devised as one of a number of software tools for reducing the size of the knowledge base that must be searched during execution of artificial- intelligence software of the rule-based inference-engine type in a case in which data are missing. This program determines whether the consequences of activation of two or more rules can be combined without causing a logical inconsistency. For example, in a case involving hypothetical scenarios that could lead to turning a given device on or off, the program determines whether a scenario involving a given combination of rules could lead to turning the device both on and off at the same time, in which case that combination of rules would not be included in the scenario.
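
    A minimal sketch of the consistency test described above (Python, hypothetical rule encoding): the consequences of candidate rules are combined and rejected if they assign contradictory values, e.g. turning the same device both on and off.

        def consistent(rule_consequences):
            """Return the combined consequences, or None if they contradict.

            Each rule's consequences map a variable (e.g. "heater") to a value.
            """
            combined = {}
            for consequences in rule_consequences:
                for var, value in consequences.items():
                    if var in combined and combined[var] != value:
                        return None          # e.g. device both "on" and "off"
                    combined[var] = value
            return combined

        rule_a = {"heater": "on", "fan": "on"}
        rule_b = {"fan": "on", "valve": "open"}
        rule_c = {"heater": "off"}

        print(consistent([rule_a, rule_b]))          # merged consequences
        print(consistent([rule_a, rule_b, rule_c]))  # None: heater on and off conflict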

  3. Replacing sedentary time with sleep, light, or moderate-to-vigorous physical activity: effects on self-regulation and executive functioning.

    PubMed

    Fanning, J; Porter, G; Awick, E A; Ehlers, D K; Roberts, S A; Cooke, G; Burzynska, A Z; Voss, M W; Kramer, A F; McAuley, E

    2017-04-01

    Recent attention has highlighted the importance of reducing sedentary time for maintaining health and quality of life. However, it is unclear how changing sedentary behavior may influence executive functions and self-regulatory strategy use, which are vital for the long-term maintenance of a health behavior regimen. The purpose of this cross-sectional study is to examine the estimated self-regulatory and executive functioning effects of substituting 30 min of sedentary behavior with 30 min of light activity, moderate-to-vigorous physical activity (MVPA), or sleep in a sample of older adults. This study reports baseline data collected from low-active healthy older adults (N = 247, mean age 65.4 ± 4.6 years) recruited to participate in a 6 month randomized controlled exercise trial examining the effects of various modes of exercise on brain health and function. Each participant completed assessments of physical activity self-regulatory strategy use (i.e., self-monitoring, goal-setting, social support, reinforcement, time management, and relapse prevention) and executive functioning. Physical activity and sedentary behaviors were measured using accelerometers during waking hours for seven consecutive days at each time point. Isotemporal substitution analyses were conducted to examine the effect on self-regulation and executive functioning should an individual substitute sedentary time with light activity, MVPA, or sleep. The substitution of sedentary time with both sleep and MVPA influenced both self-regulatory strategy use and executive functioning. Sleep was associated with greater self-monitoring (B = .23, p = .02), goal-setting (B = .32, p < .01), and social support (B = .18, p = .01) behaviors. Substitution of sedentary time with MVPA was associated with higher accuracy on 2-item (B = .03, p = .01) and 3-item (B = .02, p = .04) spatial working memory tasks, and with faster reaction times on single (B = -23.12, p = .03) and mixed-repeated task-switching blocks (B = -27.06, p = .04). Substitution of sedentary time with sleep was associated with marginally faster reaction time on mixed-repeated task-switching blocks (B = -12.20, p = .07) and faster reaction time on mixed-switch blocks (B = 17.21, p = .05), as well as reduced global reaction time switch cost (B = -16.86, p = .01). Substitution for light intensity physical activity did not produce significant effects. By replacing sedentary time with sleep and MVPA, individuals may bolster several important domains of self-regulatory behavior and executive functioning. This has important implications for the design of long-lasting health behavior interventions. Trial Registration clinicaltrials.gov identifier NCT00438347.
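
    The isotemporal substitution logic can be sketched with ordinary least squares: the behavior being displaced (sedentary time) is dropped from the model while total time is retained, so each remaining coefficient estimates the effect of reallocating 30 min from sedentary time to that behavior. The data below are synthetic and only illustrate the model structure, not the study's results.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 247
        sleep = rng.normal(420, 45, n)      # minutes/day
        light = rng.normal(300, 60, n)
        mvpa  = rng.normal(25, 12, n)
        sedentary = rng.normal(560, 70, n)
        total = sleep + light + mvpa + sedentary
        outcome = 0.004 * mvpa + 0.002 * sleep + rng.normal(0, 0.5, n)  # synthetic EF score

        # Isotemporal substitution: omit sedentary time, keep total time in the model.
        X = np.column_stack([np.ones(n), sleep / 30, light / 30, mvpa / 30, total / 30])
        beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)

        for name, b in zip(["intercept", "sleep", "light", "MVPA", "total"], beta):
            print(f"{name:9s} {b:+.3f}")   # sleep/light/MVPA: effect of a 30-min swap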

  4. Neuropsychological assessment of executive functions following pediatric traumatic brain injury.

    PubMed

    Gaines, K Drorit; Soper, Henry V

    2018-01-01

    Assessment of executive functions in the adult is best captured at the stage where full maturation of brain development occurs. Assessment of executive functions of children, however, is considerably more complicated. First, assessment of executive functioning in children represents a snapshot of these developing functions at a particular time linked stage, which may have implications for further development. Second, neuropsychological measures available to assess executive functions in children are limited in number and scope and may not be sensitive to the gradual developmental changes. The present article provides an overview of the salient neurodevelopmental stages of executive functioning and discusses the utilization of recently developed neuropsychological measures to assess these stages. Comments on clinical implications of these findings regarding Traumatic Brain Injury will be provided.

  5. A local time stepping algorithm for GPU-accelerated 2D shallow water models

    NASA Astrophysics Data System (ADS)

    Dazzi, Susanna; Vacondio, Renato; Dal Palù, Alessandro; Mignosa, Paolo

    2018-01-01

    In the simulation of flooding events, mesh refinement is often required to capture local bathymetric features and/or to detail areas of interest; however, if an explicit finite volume scheme is adopted, the presence of small cells in the domain can restrict the allowable time step due to the stability condition, thus reducing the computational efficiency. With the aim of overcoming this problem, the paper proposes the application of a Local Time Stepping (LTS) strategy to a GPU-accelerated 2D shallow water numerical model able to handle non-uniform structured meshes. The algorithm is specifically designed to exploit the computational capability of GPUs, minimizing the overheads associated with the LTS implementation. The results of theoretical and field-scale test cases show that the LTS model guarantees appreciable reductions in the execution time compared to the traditional Global Time Stepping strategy, without compromising the solution accuracy.
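
    A schematic of the local time stepping idea (Python, heavily simplified, not the GPU implementation): each cell is assigned a time-step level so that its step is dt_min * 2^level, refined cells are advanced more often, and all cells meet again after one global step. The update rule is a placeholder for the finite-volume flux computation.

        def lts_global_step(cells, dt_min, max_level):
            """Advance all cells by dt_min * 2**max_level using local time stepping.

            cells: list of dicts with "level" (0 = smallest dt) and "state".
            """
            substeps = 2 ** max_level
            for k in range(substeps):
                for c in cells:
                    stride = 2 ** c["level"]          # this cell updates every `stride` substeps
                    if k % stride == 0:
                        dt_local = dt_min * stride
                        c["state"] += dt_local * (-0.1 * c["state"])   # placeholder flux/ODE
            # after `substeps` substeps every cell has advanced the same physical time

        cells = [{"level": 0, "state": 1.0},   # refined cell: stepped 4 times
                 {"level": 1, "state": 1.0},   # stepped twice
                 {"level": 2, "state": 1.0}]   # coarse cell: stepped once
        lts_global_step(cells, dt_min=0.25, max_level=2)
        print([round(c["state"], 4) for c in cells])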

  6. 78 FR 21715 - Sexual Assault Prevention and Response (SAPR) Program Procedures

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-11

    ... high-risk team to monitor cases where the sexual assault victim's life and safety may be in jeopardy... in Military Rule of Evidence 514. (9) Requires the execution of a high-risk team to monitor cases...

  7. Letter Report for the Review of ORD Nanomaterial Case Studies Workshop (August 2010)

    EPA Pesticide Factsheets

    The following is a letter report from the Executive Committee of the BOSC concerning the review of the ORD Nanomaterial Case Studies Workshop: Developing a Comprehensive Environmental Assessment Research Strategy for Nanoscale Titanium Dioxide.

  8. Language and Memory Improvements following tDCS of Left Lateral Prefrontal Cortex.

    PubMed

    Hussey, Erika K; Ward, Nathan; Christianson, Kiel; Kramer, Arthur F

    2015-01-01

    Recent research demonstrates that performance on executive-control measures can be enhanced through brain stimulation of lateral prefrontal regions. Separate psycholinguistic work emphasizes the importance of left lateral prefrontal cortex executive-control resources during sentence processing, especially when readers must override early, incorrect interpretations when faced with temporary ambiguity. Using transcranial direct current stimulation, we tested whether stimulation of left lateral prefrontal cortex had discriminate effects on language and memory conditions that rely on executive-control (versus cases with minimal executive-control demands, even in the face of task difficulty). Participants were randomly assigned to receive Anodal, Cathodal, or Sham stimulation of left lateral prefrontal cortex while they (1) processed ambiguous and unambiguous sentences in a word-by-word self-paced reading task and (2) performed an n-back memory task that, on some trials, contained interference lure items reputed to require executive-control. Across both tasks, we parametrically manipulated executive-control demands and task difficulty. Our results revealed that the Anodal group outperformed the remaining groups on (1) the sentence processing conditions requiring executive-control, and (2) only the most complex n-back conditions, regardless of executive-control demands. Together, these findings add to the mounting evidence for the selective causal role of left lateral prefrontal cortex for executive-control tasks in the language domain. Moreover, we provide the first evidence suggesting that brain stimulation is a promising method to mitigate processing demands encountered during online sentence processing.

  9. Language and Memory Improvements following tDCS of Left Lateral Prefrontal Cortex

    PubMed Central

    Hussey, Erika K.; Ward, Nathan; Christianson, Kiel; Kramer, Arthur F.

    2015-01-01

    Recent research demonstrates that performance on executive-control measures can be enhanced through brain stimulation of lateral prefrontal regions. Separate psycholinguistic work emphasizes the importance of left lateral prefrontal cortex executive-control resources during sentence processing, especially when readers must override early, incorrect interpretations when faced with temporary ambiguity. Using transcranial direct current stimulation, we tested whether stimulation of left lateral prefrontal cortex had discriminate effects on language and memory conditions that rely on executive-control (versus cases with minimal executive-control demands, even in the face of task difficulty). Participants were randomly assigned to receive Anodal, Cathodal, or Sham stimulation of left lateral prefrontal cortex while they (1) processed ambiguous and unambiguous sentences in a word-by-word self-paced reading task and (2) performed an n-back memory task that, on some trials, contained interference lure items reputed to require executive-control. Across both tasks, we parametrically manipulated executive-control demands and task difficulty. Our results revealed that the Anodal group outperformed the remaining groups on (1) the sentence processing conditions requiring executive-control, and (2) only the most complex n-back conditions, regardless of executive-control demands. Together, these findings add to the mounting evidence for the selective causal role of left lateral prefrontal cortex for executive-control tasks in the language domain. Moreover, we provide the first evidence suggesting that brain stimulation is a promising method to mitigate processing demands encountered during online sentence processing. PMID:26528814

  10. Directed Incremental Symbolic Execution

    NASA Technical Reports Server (NTRS)

    Person, Suzette; Yang, Guowei; Rungta, Neha; Khurshid, Sarfraz

    2011-01-01

    The last few years have seen a resurgence of interest in the use of symbolic execution -- a program analysis technique developed more than three decades ago to analyze program execution paths. Scaling symbolic execution and other path-sensitive analysis techniques to large systems remains challenging despite recent algorithmic and technological advances. An alternative to solving the problem of scalability is to reduce the scope of the analysis. One approach that is widely studied in the context of regression analysis is to analyze the differences between two related program versions. While such an approach is intuitive in theory, finding efficient and precise ways to identify program differences, and characterize their effects on how the program executes has proved challenging in practice. In this paper, we present Directed Incremental Symbolic Execution (DiSE), a novel technique for detecting and characterizing the effects of program changes. The novelty of DiSE is to combine the efficiencies of static analysis techniques to compute program difference information with the precision of symbolic execution to explore program execution paths and generate path conditions affected by the differences. DiSE is a complementary technique to other reduction or bounding techniques developed to improve symbolic execution. Furthermore, DiSE does not require analysis results to be carried forward as the software evolves -- only the source code for two related program versions is required. A case-study of our implementation of DiSE illustrates its effectiveness at detecting and characterizing the effects of program changes.
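
    A very rough sketch of the pruning idea behind the technique (Python, hypothetical representation; the real tool operates on bytecode and symbolic path conditions): static diff information identifies changed statements, and only paths whose condition depends on an affected statement are re-explored symbolically.

        def affected_paths(paths, changed_statements):
            """Select paths whose conditions touch a changed statement (sketch).

            paths: dict path_id -> set of statement ids contributing to its path condition
            changed_statements: statement ids reported by a source-level diff
            """
            changed = set(changed_statements)
            return [pid for pid, stmts in paths.items() if stmts & changed]

        # Hypothetical program-difference information.
        paths = {
            "p1": {"s1", "s3", "s7"},
            "p2": {"s2", "s4"},
            "p3": {"s4", "s9"},
        }
        print(affected_paths(paths, ["s4"]))   # only p2 and p3 need symbolic re-exploration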

  11. The Vigilance Decrement in Executive Function Is Attenuated When Individual Chronotypes Perform at Their Optimal Time of Day

    PubMed Central

    Lara, Tania; Madrid, Juan Antonio; Correa, Ángel

    2014-01-01

    Time of day modulates our cognitive functions, especially those related to executive control, such as the ability to inhibit inappropriate responses. However, the impact of individual differences in time of day preferences (i.e. morning vs. evening chronotype) had not been considered by most studies. It was also unclear whether the vigilance decrement (impaired performance with time on task) depends on both time of day and chronotype. In this study, morning-type and evening-type participants performed a task measuring vigilance and response inhibition (the Sustained Attention to Response Task, SART) in morning and evening sessions. The results showed that the vigilance decrement in inhibitory performance was accentuated at non-optimal as compared to optimal times of day. In the morning-type group, inhibition performance decreased linearly with time on task only in the evening session, whereas in the morning session it remained more accurate and stable over time. In contrast, inhibition performance in the evening-type group showed a linear vigilance decrement in the morning session, whereas in the evening session the vigilance decrement was attenuated, following a quadratic trend. Our findings imply that the negative effects of time on task in executive control can be prevented by scheduling cognitive tasks at the optimal time of day according to specific circadian profiles of individuals. Therefore, time of day and chronotype influences should be considered in research and clinical studies as well as real-word situations demanding executive control for response inhibition. PMID:24586404

  12. View from the top: CEO perspectives on executive development and succession planning practices in healthcare organizations.

    PubMed

    Groves, Kevin S

    2006-01-01

    Many healthcare professionals question whether the industry's hospitals and multi-site systems are implementing the necessary executive development and succession planning systems to ensure that high potential managers are prepared and aptly selected to assume key executive roles. Survey data, case studies, and cross-industry comparisons suggest that healthcare organizations may face a leadership crisis as the current generation of chief executive officers (CEOs) nears retirement while traditional means of developing the leadership pipeline, including middle-management positions and graduate programs requiring formal residencies, continue to dissipate. Given the daunting challenges that accompany the healthcare industry's quest to identify, develop, and retain leadership talent, this article provides best practice findings from a qualitative study of 13 healthcare organizations with a record of exemplary executive development and succession planning practices. CEOs from six single-site hospitals, six healthcare systems, and one medical group were interviewed to identify industry best practices so that healthcare practitioners and educators may utilize the findings to enhance the industry's leadership capacity.

  13. Blind Seer: A Scalable Private DBMS

    DTIC Science & Technology

    2014-05-01

    searchable index terms per DB row, in time comparable to (insecure) MySQL (many practical queries can be privately executed with work 1.2-3 times slower than MySQL, although some queries are costlier). We support a rich query set, including searching on arbitrary boolean formulas on keywords and ranges...

  14. Television and children's executive function.

    PubMed

    Lillard, Angeline S; Li, Hui; Boguszewski, Katie

    2015-01-01

    Children spend a lot of time watching television on its many platforms: directly, online, and via videos and DVDs. Many researchers are concerned that some types of television content appear to negatively influence children's executive function. Because (1) executive function predicts key developmental outcomes, (2) executive function appears to be influenced by some television content, and (3) American children watch large quantities of television (including the content of concern), the issues discussed here comprise a crucial public health issue. Further research is needed to reveal exactly what television content is implicated, what underlies television's effect on executive function, how long the effect lasts, and who is affected. © 2015 Elsevier Inc. All rights reserved.

  15. Diagnosing and Resolving Conflict Created by Strategic Plans: Where Outreach Strategies and Execution Meet at an Academic Health Center.

    PubMed

    Edwards, Robert L; Wollner, Samuel B; Weddle, Jessica; Zembrodt, James W; Birdwhistell, Mark D

    2017-01-01

    The imperative for strategic change at academic health centers has never been stronger. Underpinning the success of strategic change is an effective process to implement a strategy. Healthcare organizations, however, often fail to execute on strategy because they do not activate the requisite capabilities and management processes. The University of Kentucky HealthCare recently defined its 2020 strategic plan to adapt to emerging market conditions. The authors outline the strategic importance of strengthening partnership networks and the initial challenges faced in executing their strategy. The findings are a case study in how one academic health center has approached strategy implementation.

  16. Differential and relaxed image foresting transform for graph-cut segmentation of multiple 3D objects.

    PubMed

    Moya, Nikolas; Falcão, Alexandre X; Ciesielski, Krzysztof C; Udupa, Jayaram K

    2014-01-01

    Graph-cut algorithms have been extensively investigated for interactive binary segmentation, when the simultaneous delineation of multiple objects can save considerable user's time. We present an algorithm (named DRIFT) for 3D multiple object segmentation based on seed voxels and Differential Image Foresting Transforms (DIFTs) with relaxation. DRIFT stands behind efficient implementations of some state-of-the-art methods. The user can add/remove markers (seed voxels) along a sequence of executions of the DRIFT algorithm to improve segmentation. Its first execution takes linear time with the image's size, while the subsequent executions for corrections take sublinear time in practice. At each execution, DRIFT first runs the DIFT algorithm, then it applies diffusion filtering to smooth boundaries between objects (and background) and, finally, it corrects possible objects' disconnection occurrences with respect to their seeds. We evaluate DRIFT in 3D CT-images of the thorax for segmenting the arterial system, esophagus, left pleural cavity, right pleural cavity, trachea and bronchi, and the venous system.
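
    A compact sketch of the seed-competition idea underlying the image foresting transform (Python, 4-connected grid, additive intensity-difference path cost; the cost function is an assumption, and this is not the DRIFT implementation, which adds differential recomputation and relaxation).

        import heapq

        def ift_label(image, seeds):
            """Label each pixel with the seed that reaches it along the cheapest path.

            image: 2D list of intensities; seeds: dict (row, col) -> object label.
            """
            rows, cols = len(image), len(image[0])
            cost = [[float("inf")] * cols for _ in range(rows)]
            label = [[0] * cols for _ in range(rows)]
            heap = []
            for (r, c), lab in seeds.items():
                cost[r][c], label[r][c] = 0.0, lab
                heapq.heappush(heap, (0.0, r, c))
            while heap:
                d, r, c = heapq.heappop(heap)
                if d > cost[r][c]:
                    continue                          # stale heap entry
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if 0 <= nr < rows and 0 <= nc < cols:
                        nd = d + abs(image[nr][nc] - image[r][c])
                        if nd < cost[nr][nc]:
                            cost[nr][nc], label[nr][nc] = nd, label[r][c]
                            heapq.heappush(heap, (nd, nr, nc))
            return label

        img = [[0, 0, 9, 9],
               [0, 0, 9, 9]]
        print(ift_label(img, {(0, 0): 1, (0, 3): 2}))   # two objects split at the intensity edge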

  17. Representation and Analysis of Real-Time Control Structures.

    DTIC Science & Technology

    1980-08-01

    external processes which cannot be forced to cooperate with programmed processes through use of a synchronization primitive such as a semaphore [Dijkstra...amounts to each task, but the time slices are synchronized with program execution. The length of the codestrip is determined by the response time...which might be synchronous or asynchronous with respect to the executing task. The notation can represent total and partial orderings among its tasks, and

  18. Mechanical analysis of the roundhouse kick according to height and distance in taekwondo.

    PubMed

    Estevan, I; Falco, C

    2013-12-01

    Competition regulation in taekwondo has experienced several changes during the last few years, for example, kicks to the head score more points than kicks to the chest. In addition, some external factors such as the height of target and execution distance seem to affect the kick performance. The aim of this study was to analyse selected biomechanical parameters (impact force, reaction time, and execution time) according to the height and execution distance in two different male groups (experts (n = 12) and novices (n = 21)). Athletes kicked twice from every execution distance (short, normal and long) and towards two different heights of target (chest and head) in a random order. Novices kicked to the head with a longer reaction time than to the chest (p < 0.05) but experts were able to kick with similar performance for both heights. From short and normal distances experts kicked with similar performance; whereas from the normal distance novices had longer reaction and execution time than from the short distance (p < 0.05). In conclusion, in counterattacking situations, experts should perform the roundhouse kick to the head instead of to the chest, because it produces better scores with similar performance; whereas novice athletes should avoid kicking to the head because they are not able to kick with similar performance. Moreover, it is recommended that during counterattacks higher-level taekwondo athletes should intend to kick from normal distances.

  19. MECHANICAL ANALYSIS OF THE ROUNDHOUSE KICK ACCORDING TO HEIGHT AND DISTANCE IN TAEKWONDO

    PubMed Central

    Falco, C.

    2013-01-01

    Competition regulation in taekwondo has experienced several changes during the last few years, for example, kicks to the head score more points than kicks to the chest. In addition, some external factors such as the height of target and execution distance seem to affect the kick performance. The aim of this study was to analyse selected biomechanical parameters (impact force, reaction time, and execution time) according to the height and execution distance in two different male groups (experts (n = 12) and novices (n = 21)). Athletes kicked twice from every execution distance (short, normal and long) and towards two different heights of target (chest and head) in a random order. Novices kicked to the head with a longer reaction time than to the chest (p < 0.05) but experts were able to kick with similar performance for both heights. From short and normal distances experts kicked with similar performance; whereas from the normal distance novices had longer reaction and execution time than from the short distance (p < 0.05). In conclusion, in counterattacking situations, experts should perform the roundhouse kick to the head instead of to the chest, because it produces better scores with similar performance; whereas novice athletes should avoid kicking to the head because they are not able to kick with similar performance. Moreover, it is recommended that during counterattacks higher-level taekwondo athletes should intend to kick from normal distances. PMID:24744499

  20. Performance Analysis of Polymorphous Computing Architectures

    DTIC Science & Technology

    2001-01-01

    Figure 3. Self-timed execution. ... execution pattern when the application graph in Figure 2 is executed in a self-timed manner ... Figure 10. Self-timed execution with first-iteration actors denoted by T.

  1. Circadian Rhythms in Executive Function during the Transition to Adolescence: The Effect of Synchrony between Chronotype and Time of Day

    ERIC Educational Resources Information Center

    Hahn, Constanze; Cowell, Jason M.; Wiprzycka, Ursula J.; Goldstein, David; Ralph, Martin; Hasher, Lynn; Zelazo, Philip David

    2012-01-01

    To explore the influence of circadian rhythms on executive function during early adolescence, we administered a battery of executive function measures (including a Go-Nogo task, the Iowa Gambling Task, a Self-ordered Pointing task, and an Intra/Extradimensional Shift task) to Morning-preference and Evening-preference participants (N = 80) between…

  2. The Touche Ross Survey of Business Executives on Non-Profit Boards. An Opinion Study Based on Interviews with 309 Business Executives.

    ERIC Educational Resources Information Center

    Research and Forecasts, Inc., New York, NY.

    Three hundred eight business executives from boards of nonprofit educational, social service, or cultural organizations were surveyed for their opinions on their roles. It was found that a single overriding concern is the ever-increasing demand for commitment and accountability. This is reflected particularly in three issues: time, money, and…

  3. Developing an effective business plan.

    PubMed

    Lehman, L B

    1996-06-01

    At some time, virtually all managed care executives, and most physician executives, will be asked to develop business plans. Business plans are thoughtful, comprehensive, and realistic descriptions of the many aspects of the formulation of a new business product or line for market. The author describes what goes into the writing of a business plan and how the physician executive should approach this task.

  4. Effects of deep brain stimulation of the subthalamic nucleus on inhibitory and executive control over prepotent responses in Parkinson's disease

    PubMed Central

    Jahanshahi, Marjan

    2013-01-01

    Inhibition of inappropriate, habitual or prepotent responses is an essential component of executive control and a cornerstone of self-control. Via the hyperdirect pathway, the subthalamic nucleus (STN) receives inputs from frontal areas involved in inhibition and executive control. Evidence is reviewed from our own work and the literature suggesting that in Parkinson's disease (PD), deep brain stimulation (DBS) of the STN has an impact on executive control during attention-demanding tasks or in situations of conflict when habitual or prepotent responses have to be inhibited. These results support a role for the STN in an inter-related set of processes: switching from automatic to controlled processing, inhibitory and executive control, adjusting response thresholds and influencing speed-accuracy trade-offs. Such STN DBS-induced deficits in inhibitory and executive control may contribute to some of the psychiatric problems experienced by a proportion of operated cases after STN DBS surgery in PD. However, as no direct evidence for such a link is currently available, there is a need to provide direct evidence for such a link between STN DBS-induced deficits in inhibitory and executive control and post-surgical psychiatric complications experienced by operated patients. PMID:24399941

  5. Data preprocessing for determining outer/inner parallelization in the nested loop problem using OpenMP

    NASA Astrophysics Data System (ADS)

    Handhika, T.; Bustamam, A.; Ernastuti, Kerami, D.

    2017-07-01

    Multi-thread programming using OpenMP on a shared-memory architecture with hyperthreading technology allows a resource to be accessed by multiple processors simultaneously, and each processor can execute more than one thread over a given period of time. However, the speedup depends on the processor's ability to execute only a limited number of threads, which is especially restrictive for sequential algorithms containing a nested loop in which the number of outer loop iterations is greater than the maximum number of threads the processor can execute. The thread distribution technique found previously could only be applied by high-level programmers. This paper derives a parallelization procedure for low-level programmers dealing with 2-level nested loop problems in which the maximum number of threads that a processor can execute is smaller than the number of outer loop iterations. Data preprocessing related to the numbers of outer and inner loop iterations, the computational time required to execute each iteration, and the maximum number of threads the processor can execute is used as a strategy to determine which parallel region will produce optimal speedup.
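    The decision such a preprocessing step has to make can be illustrated with a minimal C/OpenMP sketch (not the authors' actual procedure): if the outer trip count is at least the number of available threads, parallelizing the outer loop alone suffices; otherwise collapsing both loop levels exposes enough iterations for every thread. The loop bounds and the per-iteration kernel below are hypothetical placeholders.

```c
/* Sketch: choosing outer-only vs. collapsed parallelization for a 2-level
 * nested loop, depending on how the outer trip count compares with the
 * number of threads the processor can actually run. Illustrative only. */
#include <omp.h>
#include <stdio.h>

#define N_OUTER 4      /* hypothetical: fewer outer iterations than threads */
#define N_INNER 1000

static double work(int i, int j)          /* placeholder per-iteration kernel */
{
    return (double)i * j * 1e-6;
}

int main(void)
{
    double sum = 0.0;
    int max_threads = omp_get_max_threads();

    if (N_OUTER >= max_threads) {
        /* Outer loop already provides at least one chunk per thread. */
        #pragma omp parallel for reduction(+:sum) schedule(static)
        for (int i = 0; i < N_OUTER; i++)
            for (int j = 0; j < N_INNER; j++)
                sum += work(i, j);
    } else {
        /* Too few outer iterations: collapse both levels so the
         * N_OUTER * N_INNER iteration space is shared among all threads. */
        #pragma omp parallel for collapse(2) reduction(+:sum) schedule(static)
        for (int i = 0; i < N_OUTER; i++)
            for (int j = 0; j < N_INNER; j++)
                sum += work(i, j);
    }
    printf("threads=%d sum=%f\n", max_threads, sum);
    return 0;
}
```

    With, say, 8 threads and N_OUTER = 4, the collapsed branch is taken, so all 4000 iterations are shared among the threads instead of leaving half of them idle.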

  6. 10 CFR 10.35 - Reconsideration of cases.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Reconsideration of cases. 10.35 Section 10.35 Energy... DATA OR NATIONAL SECURITY INFORMATION OR AN EMPLOYMENT CLEARANCE Procedures § 10.35 Reconsideration of cases. (a) Where, pursuant to the procedures set forth in §§ 10.20 through 10.34, the Deputy Executive...

  7. 10 CFR 10.35 - Reconsideration of cases.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Reconsideration of cases. 10.35 Section 10.35 Energy... DATA OR NATIONAL SECURITY INFORMATION OR AN EMPLOYMENT CLEARANCE Procedures § 10.35 Reconsideration of cases. (a) Where, pursuant to the procedures set forth in §§ 10.20 through 10.34, the Deputy Executive...

  8. 10 CFR 10.35 - Reconsideration of cases.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Reconsideration of cases. 10.35 Section 10.35 Energy... DATA OR NATIONAL SECURITY INFORMATION OR AN EMPLOYMENT CLEARANCE Procedures § 10.35 Reconsideration of cases. (a) Where, pursuant to the procedures set forth in §§ 10.20 through 10.34, the Deputy Executive...

  9. The business case as a strategic tool for change.

    PubMed

    Weaver, Diana J; Sorrells-Jones, Jean

    2007-09-01

    The authors discuss the clinically focused business case, when it is and is not needed, and the knowledge and skills the nurse executive must master to use the business case effectively as a strategic tool. Necessary skills include translating nursing practice proposals into cost-effective change initiatives and marketing those changes to colleagues.

  10. A European Solution to Islamic Extremism in Western Europe

    DTIC Science & Technology

    2006-04-14

    Physically destroying terrorist organizations (“direct action”) is an effective tool. Where freedom of action and freedom of movement exist, there... effectiveness and sometimes duplicate effort. This paper will explain the growing Islamic extremist threat in Western Europe and present a case for why that...native-born youth franchise al-Qa’ida and execute a terrorist attack that effects a change in government. Terrorists executed a planned and deliberate

  11. Varicose veins endoluminal laser ablation from the beginning EVLT till now CELIV

    NASA Astrophysics Data System (ADS)

    Teixeira, Heitor; Teixeira, Carolina

    2018-04-01

    We use endovenous laser ablation (EVLA) as a single, optimized treatment for varicose veins, applied to the different kinds of patients we have treated from November 2001 to March 2017. In total, 3486 documented cases were carried out with this treatment, and the analysis of these cases is the object of this retrospective observational study, which helps doctors understand the context of this procedure on its road to the desired perfection. Comparing the first 8 years of clinical execution with the second 8 years allows us to derive postulates that can serve as guidelines for understanding the main fields involved, whether human, social, economic, or the technical execution needed to obtain the results, and to draw well-founded conclusions about the validity of this method as well as about how to improve or execute it in order to obtain even better results. The aim of our comparison is to understand the clinical and physical content of these contributions and of the knowledge obtained, and to provide doctors with a bridge that spans these different backgrounds.

  12. Cycles of judicial and executive power in irregular migration.

    PubMed

    Marmo, Marinella; Giannacopoulos, Maria

    2017-01-01

    This article argues that power struggles between judiciaries and executives are fuelled by tensions of securitisation, border control and human rights over the issue of irregular migration. The article juxtaposes three paradigm court cases to render the argument concrete, focusing on two Australian High Court decisions ( M70 v Minister for Immigration and Citizenship and CPCF v. Minister for Immigration and Border Protection & Anor ) and one decision from the European Court of Human Rights ( Hirsi Jamaa and Others v. Italy ). An examination of these cases reveals each step of this cycle: the executive attempts to produce a buffer to avoid or minimise migrants' protections and judicial review, yet such manoeuvring is countered by the judges. Following this, new steps of the cycle occur: governments display disappointment to courts' interventions in an effort to discredit the exercise of judicial power while the judiciaries maintain the focus on the rule of law. And so the cycle continues. The key argument of this paper rests on the paradox resulting from the executive's attempts to curb judicial intervention, because such attempts actually empower judiciaries. Comparing different jurisdictions highlights how this cyclical power struggle is a defining element between these two arms of power across distinct legal-geographical boundaries. By tracing this development in Australia and in Europe, this article demonstrates that the argument has global significance.

  13. Varying execution discipline to increase performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, P.L.; Maccabe, A.B.

    1993-12-22

    This research investigates the relationship between execution discipline and performance. The hypothesis has two parts: 1. Different execution disciplines exhibit different performance for different computations, and 2. These differences can be effectively predicted by heuristics. A machine model is developed that can vary its execution discipline. That is, the model can execute a given program using either the control-driven, data-driven or demand-driven execution discipline. This model is referred to as a "variable-execution-discipline" machine. The instruction set for the model is the Program Dependence Web (PDW). The first part of the hypothesis will be tested by simulating the execution of the machine model on a suite of computations, based on the Livermore Fortran Kernel (LFK) Test (a.k.a. the Livermore Loops), using all three execution disciplines. Heuristics are developed to predict relative performance. These heuristics predict (a) the execution time under each discipline for one iteration of each loop and (b) the number of iterations taken by that loop; then the heuristics use those predictions to develop a prediction for the execution of the entire loop. Similar calculations are performed for branch statements. The second part of the hypothesis will be tested by comparing the results of the simulated execution with the predictions produced by the heuristics. If the hypothesis is supported, then the door is open for the development of machines that can vary execution discipline to increase performance.
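    A small C sketch conveys the flavor of such a heuristic (the paper's actual predictors derived from the PDW are not reproduced here): estimate the per-iteration cost under each discipline, multiply by the predicted trip count, and select the discipline with the lowest predicted loop time. The discipline names match the abstract; the cost figures are invented.

```c
/* Sketch: pick the execution discipline with the lowest predicted loop time.
 * Per-iteration estimates and the iteration count are hypothetical inputs;
 * a real heuristic would derive them from the Program Dependence Web. */
#include <stdio.h>

enum discipline { CONTROL_DRIVEN, DATA_DRIVEN, DEMAND_DRIVEN, N_DISCIPLINES };

static const char *names[N_DISCIPLINES] = {
    "control-driven", "data-driven", "demand-driven"
};

int main(void)
{
    /* Predicted cost of one loop iteration under each discipline (arbitrary units). */
    double per_iter[N_DISCIPLINES] = { 12.0, 9.5, 14.0 };
    double predicted_iters = 1000.0;   /* predicted trip count for the loop */

    int best = 0;
    double best_time = per_iter[0] * predicted_iters;
    for (int d = 1; d < N_DISCIPLINES; d++) {
        double t = per_iter[d] * predicted_iters;
        if (t < best_time) { best_time = t; best = d; }
    }
    printf("predicted best discipline: %s (%.1f units)\n", names[best], best_time);
    return 0;
}
```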

  14. 76 FR 71092 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-16

    ... quoted prices in the respective SPX series at the time of execution, each constituent execution will be... Demeterfi, Emanuel Derman, Michael Kamal and Joseph Zou, Goldman Sachs Quantitative Strategies Research... series. The following table shows the SPX option mid-quote prices prevailing at the time of the S&P 500...

  15. Effects of Age, Intelligence and Executive Control Function on Saccadic Reaction Time in Persons with Intellectual Disabilities

    ERIC Educational Resources Information Center

    Haishi, Koichi; Okuzumi, Hideyuki; Kokubun, Mitsuru

    2011-01-01

    The current research aimed to clarify the influence of age, intelligence and executive control function on the central tendency and intraindividual variability of saccadic reaction time in persons with intellectual disabilities. Participants were 44 persons with intellectual disabilities aged between 13 and 57 years whose IQs were between 14 and…

  16. On the Optimization of Aerospace Plane Ascent Trajectory

    NASA Astrophysics Data System (ADS)

    Al-Garni, Ahmed; Kassem, Ayman Hamdy

    A hybrid heuristic optimization technique based on genetic algorithms and particle swarm optimization has been developed and tested for trajectory optimization problems with multiple constraints and a multi-objective cost function. The technique is used to calculate control settings for two types of ascending trajectories (constant dynamic pressure and minimum-fuel-minimum-heat) for a two-dimensional model of an aerospace plane. A thorough statistical analysis is done on the hybrid technique to make comparisons with both basic genetic algorithms and particle swarm optimization techniques with respect to convergence and execution time. Genetic algorithm optimization showed better execution time performance while particle swarm optimization showed better convergence performance. The hybrid optimization technique, benefiting from both techniques, showed superior, robust performance, striking a compromise between convergence trends and execution time.

  17. Executive Function, Survival, and Hospitalization in Chronic Obstructive Pulmonary Disease. A Longitudinal Analysis of the National Emphysema Treatment Trial (NETT)

    PubMed Central

    Dodd, James W.; Novotny, Paul; Sciurba, Frank C.

    2015-01-01

    Rationale: Cognitive dysfunction has been demonstrated in chronic obstructive pulmonary disease (COPD), but studies are limited to cross-sectional analyses or incompletely characterized populations. Objectives: We examined longitudinal changes in sensitive measures of executive function in a well-characterized population of patients with severe COPD. Methods: This study was performed on patients enrolled in the National Emphysema Treatment Trial. To assess executive function, we analyzed trail making (TM) A and B times at enrollment in the trial (2,128 patients), and at 12 (731 patients) and 24 months (593 patients) after enrollment, adjusted for surgery, marriage status, age, education, income, depression, PaO2, PaCO2, and smoking. Associations with survival and hospitalizations were examined using Cox regression and linear regression models. Measurements and Main Results: The average age of the patients was 66.4 years, and the average FEV1 was 23.9% predicted. At the time of enrollment, 38% had executive dysfunction. Compared with those who did not, these patients were older, less educated, had higher oxygen use, higher PaCO2, worse quality of life as measured by the St. George's Respiratory Questionnaire, reduced well-being, and lower social function. There was no significant change over 2 years in TM A or B times after adjustment for covariables. Changes in TM B times were modestly associated with survival, but changes in TM B − A times were not. Changes in TM scores were not associated with frequency of hospitalization. Lung function, PaO2, smoking, survival, and hospitalizations were not significantly different in those with executive dysfunction. Conclusions: In this large population of patients with severe emphysema and heavy cigarette smoking exposure, there was no significant decline over 2 years in cognitive executive function as measured by TM tests. There was no association between executive function impairment and frequency of hospitalization, and there was a possible modest association with survival. It is plausible that cerebrovascular comorbidities explain previously described cognitive pathology in COPD. PMID:26288391

  18. Hyperbolic Rendezvous at Mars: Risk Assessments and Mitigation Strategies

    NASA Technical Reports Server (NTRS)

    Jedrey, Ricky; Landau, Damon; Whitley, Ryan

    2015-01-01

    Given the current interest in the use of flyby trajectories for human Mars exploration, a key requirement is the capability to execute hyperbolic rendezvous. Hyperbolic rendezvous is used to transport crew from a Mars centered orbit, to a transiting Earth bound habitat that does a flyby. Representative cases are taken from future potential missions of this type, and a thorough sensitivity analysis of the hyperbolic rendezvous phase is performed. This includes early engine cutoff, missed burn times, and burn misalignment. A finite burn engine model is applied that assumes the hyperbolic rendezvous phase is done with at least two burns.

  19. Validation of fault-free behavior of a reliable multiprocessor system - FTMP: A case study. [Fault-Tolerant Multi-Processor avionics

    NASA Technical Reports Server (NTRS)

    Clune, E.; Segall, Z.; Siewiorek, D.

    1984-01-01

    A program of experiments has been conducted at NASA-Langley to test the fault-free performance of a Fault-Tolerant Multiprocessor (FTMP) avionics system for next-generation aircraft. Baseline measurements of an operating FTMP system were obtained with respect to the following parameters: instruction execution time, frame size, and the variation of clock ticks. The mechanisms of frame stretching were also investigated. The experimental results are summarized in a table. Areas of interest for future tests are identified, with emphasis given to the implementation of a synthetic workload generation mechanism on FTMP.

  20. Model-Based GUI Testing Using Uppaal at Novo Nordisk

    NASA Astrophysics Data System (ADS)

    Hjort, Ulrik H.; Illum, Jacob; Larsen, Kim G.; Petersen, Michael A.; Skou, Arne

    This paper details a collaboration between Aalborg University and Novo Nordisk in developing an automatic model-based test generation tool for system testing of the graphical user interface of a medical device on an embedded platform. The tool takes as input a UML state machine model and generates a test suite satisfying some testing criterion, such as edge or state coverage, and converts the individual test cases into a scripting language that can be automatically executed against the target. The tool has significantly reduced the time required for test construction and generation, and reduced the number of test scripts while increasing the coverage.

  1. Specific Features of Executive Dysfunction in Alzheimer-Type Mild Dementia Based on Computerized Cambridge Neuropsychological Test Automated Battery (CANTAB) Test Results.

    PubMed

    Kuzmickienė, Jurgita; Kaubrys, Gintaras

    2016-10-08

    BACKGROUND The primary manifestation of Alzheimer's disease (AD) is decline in memory. Dysexecutive symptoms have tremendous impact on functional activities and quality of life. Data regarding frontal-executive dysfunction in mild AD are controversial. The aim of this study was to assess the presence and specific features of executive dysfunction in mild AD based on Cambridge Neuropsychological Test Automated Battery (CANTAB) results. MATERIAL AND METHODS Fifty newly diagnosed, treatment-naïve, mild, late-onset AD patients (MMSE ≥20, AD group) and 25 control subjects (CG group) were recruited in this prospective, cross-sectional study. The CANTAB tests CRT, SOC, PAL, SWM were used for in-depth cognitive assessment. Comparisons were performed using the t test or Mann-Whitney U test, as appropriate. Correlations were evaluated by Pearson r or Spearman R. Statistical significance was set at p<0.05. RESULTS AD and CG groups did not differ according to age, education, gender, or depression. Few differences were found between groups in the SOC test for performance measures: Mean moves (minimum 3 moves): AD (Rank Sum=2227), CG (Rank Sum=623), p<0.001. However, all SOC test time measures differed significantly between groups: SOC Mean subsequent thinking time (4 moves): AD (Rank Sum=2406), CG (Rank Sum=444), p<0.001. Correlations were weak between executive function (SOC) and episodic/working memory (PAL, SWM) (R=0.01-0.38) or attention/psychomotor speed (CRT) (R=0.02-0.37). CONCLUSIONS Frontal-executive functions are impaired in mild AD patients. Executive dysfunction is highly prominent in time measures, but minimal in performance measures. Executive disorders do not correlate with a decline in episodic and working memory or psychomotor speed in mild AD.

  2. Spaceborne computer executive routine functional design specification. Volume 1: Functional design of a flight computer executive program for the reusable shuttle

    NASA Technical Reports Server (NTRS)

    Curran, R. T.

    1971-01-01

    A flight computer functional executive design for the reusable shuttle is presented. The design is given in the form of functional flowcharts and prose description. Techniques utilized in the regulation of process flow to accomplish activation, resource allocation, suspension, termination, and error masking based on process primitives are considered. Preliminary estimates of main storage utilization by the Executive are furnished. Conclusions and recommendations for timely, effective software-hardware integration in the reusable shuttle avionics system are proposed.

  3. Frontal Lobe Dysfunction in a Depressed Patient Who Survived a Suicide Attempt by Jumping from the Bridge on the Han River

    PubMed Central

    Kim, Kiwon

    2017-01-01

    Suicide attempts at the Han river are increasing rapidly, rising 4.11-fold from 2005 to 2015, whereas the rate of completed suicide in South Korea increased 1.07-fold during the same period. However, few studies have been conducted on the issue because many suicide attempters were seriously injured after a fall into the Han river. We present the case of a patient with major depressive disorder (MDD) who attempted suicide and was only minimally injured after jumping from the bridge at the Han river. We could assess his psychological and neurocognitive functions before and immediately after his attempt. From this case, we can identify that a higher-order cognitive aspect of executive dysfunction, especially in the frontal domains of selective attention and inhibition, may be associated with his suicide attempt. In conclusion, we suggest psychiatric treatment for cognitive impulsiveness, and safety barriers at the bridge, to prevent suicide attempts by patients with MDD. PMID:29209400

  4. Frontal Lobe Dysfunction in a Depressed Patient Who Survived a Suicide Attempt by Jumping from the Bridge on the Han River.

    PubMed

    Kim, Kiwon; Jeon, Hong Jin

    2017-11-01

    Suicide attempts at the Han river are increasing rapidly, rising 4.11-fold from 2005 to 2015, whereas the rate of completed suicide in South Korea increased 1.07-fold during the same period. However, few studies have been conducted on the issue because many suicide attempters were seriously injured after a fall into the Han river. We present the case of a patient with major depressive disorder (MDD) who attempted suicide and was only minimally injured after jumping from the bridge at the Han river. We could assess his psychological and neurocognitive functions before and immediately after his attempt. From this case, we can identify that a higher-order cognitive aspect of executive dysfunction, especially in the frontal domains of selective attention and inhibition, may be associated with his suicide attempt. In conclusion, we suggest psychiatric treatment for cognitive impulsiveness, and safety barriers at the bridge, to prevent suicide attempts by patients with MDD.

  5. Open-Source RTOS Space Qualification: An RTEMS Case Study

    NASA Technical Reports Server (NTRS)

    Zemerick, Scott

    2017-01-01

    NASA space-qualification of reusable off-the-shelf real-time operating systems (RTOSs) remains elusive due to several factors notably (1) The diverse nature of RTOSs utilized across NASA, (2) No single NASA space-qualification criteria, lack of verification and validation (V&V) analysis, or test beds, and (3) different RTOS heritages, specifically open-source RTOSs and closed vendor-provided RTOSs. As a leader in simulation test beds, the NASA IV&V Program is poised to help jump-start and lead the space-qualification effort of the open source Real-Time Executive for Multiprocessor Systems (RTEMS) RTOS. RTEMS, as a case-study, can be utilized as an example of how to qualify all RTOSs, particularly the reusable non-commercial (open-source) ones that are gaining usage and popularity across NASA. Qualification will improve the overall safety and mission assurance of RTOSs for NASA-agency wide usage. NASA's involvement in space-qualification of an open-source RTOS such as RTEMS will drive the RTOS industry toward a more qualified and mature open-source RTOS product.

  6. Understanding and Mitigating Protests of Department of Defense Acquisition Contracts

    DTIC Science & Technology

    2010-08-01

    of delivery time that can lock out a rejected offeror from a market. Sixth, more complex contracts, like services versus products, generate more...The engineers, attorneys, or head of a business unit need to explain to the team that spent time working on a bid why the company lost. Executives...agency executives have to explain to their team, who also spent time working on the source solicitation, evaluation, and selection, why the company

  7. Aerobic and Cognitive Exercise (ACE) Pilot Study for Older Adults: Executive Function Improves with Cognitive Challenge While Exergaming.

    PubMed

    Barcelos, Nicole; Shah, Nikita; Cohen, Katherine; Hogan, Michael J; Mulkerrin, Eamon; Arciero, Paul J; Cohen, Brian D; Kramer, Arthur F; Anderson-Hanley, Cay

    2015-11-01

    Dementia cases are increasing worldwide; thus, investigators seek to identify interventions that might prevent or ameliorate cognitive decline in later life. Extensive research confirms the benefits of physical exercise for brain health, yet only a fraction of older adults exercise regularly. Interactive mental and physical exercise, as in aerobic exergaming, not only motivates, but has also been found to yield cognitive benefit above and beyond traditional exercise. This pilot study sought to investigate whether greater cognitive challenge while exergaming would yield differential outcomes in executive function and generalize to everyday functioning. Sixty-four community based older adults (mean age=82) were randomly assigned to pedal a stationary bike, while interactively engaging on-screen with: (1) a low cognitive demand task (bike tour), or (2) a high cognitive demand task (video game). Executive function (indices from Trails, Stroop and Digit Span) was assessed before and after a single-bout and 3-month exercise intervention. Significant group × time interactions were found after a single-bout (Color Trails) and after 3 months of exergaming (Stroop; among 20 adherents). Those in the high cognitive demand group performed better than those in the low cognitive dose condition. Everyday function improved across both exercise conditions. Pilot data indicate that for older adults, cognitive benefit while exergaming increased concomitantly with higher doses of interactive mental challenge.

  8. The Prodiguer Messaging Platform

    NASA Astrophysics Data System (ADS)

    Denvil, S.; Greenslade, M. A.; Carenton, N.; Levavasseur, G.; Raciazek, J.

    2015-12-01

    CONVERGENCE is a French multi-partner national project designed to gather HPC and informatics expertise to innovate in the context of running French global climate models with differing grids and at differing resolutions. Efficient and reliable execution of these models and the management and dissemination of model output are some of the complexities that CONVERGENCE aims to resolve. At any one moment in time, researchers affiliated with the Institut Pierre Simon Laplace (IPSL) climate modeling group are running hundreds of global climate simulations. These simulations execute upon a heterogeneous set of French High Performance Computing (HPC) environments. The IPSL's simulation execution runtime libIGCM (library for IPSL Global Climate Modeling group) has recently been enhanced so as to support hitherto impossible real-time use cases such as simulation monitoring, data publication, metrics collection, simulation control, and visualization. At the core of this enhancement is Prodiguer: an AMQP (Advanced Message Queuing Protocol)-based, event-driven, asynchronous distributed messaging platform. libIGCM now dispatches copious amounts of information, in the form of messages, to the platform for remote processing by Prodiguer software agents at IPSL servers in Paris. Such processing takes several forms: persisting message content to databases; launching rollback jobs upon simulation failure; notifying downstream applications; and automating visualization pipelines. We will describe and/or demonstrate the platform's technical implementation, its inherent ease of scalability, its adaptiveness in supervising simulations, and a web portal that receives simulation notifications in real time.

  9. The neuropsychology of obsessive-compulsive personality disorder: a new analysis.

    PubMed

    Fineberg, Naomi A; Day, Grace A; de Koenigswarter, Nica; Reghunandanan, Samar; Kolli, Sangeetha; Jefferies-Sewell, Kiri; Hranov, Georgi; Laws, Keith R

    2015-10-01

    Obsessive compulsive personality disorder (OCPD) is characterized by perfectionism, need for control, and cognitive rigidity. Currently, little neuropsychological data exist on this condition, though emerging evidence does suggest that disorders marked by compulsivity, including obsessive-compulsive disorder (OCD), are associated with impairment in cognitive flexibility and executive planning on neurocognitive tasks. The current study investigated the neurocognitive profile in a nonclinical community-based sample of people fulfilling diagnostic criteria for OCPD in the absence of major psychiatric comorbidity. Twenty-one nonclinical subjects who fulfilled Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV) criteria for OCPD were compared with 15 healthy controls on selected clinical and neurocognitive tasks. OCPD was measured using the Compulsive Personality Assessment Scale (CPAS). Participants completed tests from the Cambridge Automated Neuropsychological Test Battery including tests of set shifting (Intra-Extra Dimensional [IED] Set Shifting) executive planning (Stockings of Cambridge [SOC]), and decision making (Cambridge Gamble Task [CGT]). The OCPD group made significantly more IED-ED shift errors and total shift errors, and also showed longer mean initial thinking time on the SOC at moderate levels of difficulty. No differences emerged on the CGT. Nonclinical cases of OCPD showed significant cognitive inflexibility coupled with executive planning deficits, whereas decision-making remained intact. This profile of impairment overlaps with that of OCD and implies that common neuropsychological changes affect individuals with these disorders.

  10. Improved Cognitive Function After Transcranial, Light-Emitting Diode Treatments in Chronic, Traumatic Brain Injury: Two Case Reports

    PubMed Central

    Saltmarche, Anita; Krengel, Maxine H.; Hamblin, Michael R.; Knight, Jeffrey A.

    2011-01-01

    Abstract Objective: Two chronic, traumatic brain injury (TBI) cases, where cognition improved following treatment with red and near-infrared light-emitting diodes (LEDs), applied transcranially to forehead and scalp areas, are presented. Background: Significant benefits have been reported following application of transcranial, low-level laser therapy (LLLT) to humans with acute stroke and mice with acute TBI. These are the first case reports documenting improved cognitive function in chronic, TBI patients treated with transcranial LED. Methods: Treatments were applied bilaterally and to midline sagittal areas using LED cluster heads [2.1″ diameter, 61 diodes (9 × 633 nm, 52 × 870 nm); 12–15 mW per diode; total power: 500 mW; 22.2 mW/cm2; 13.3 J/cm2 at scalp (estimated 0.4 J/cm2 to cortex)]. Results: Seven years after closed-head TBI from a motor vehicle accident, Patient 1 began transcranial LED treatments. Pre-LED, her ability for sustained attention (computer work) lasted 20 min. After eight weekly LED treatments, her sustained attention time increased to 3 h. The patient performs nightly home treatments (5 years); if she stops treating for more than 2 weeks, she regresses. Patient 2 had a history of closed-head trauma (sports/military, and recent fall), and magnetic resonance imaging showed frontoparietal atrophy. Pre-LED, she was on medical disability for 5 months. After 4 months of nightly LED treatments at home, medical disability discontinued; she returned to working full-time as an executive consultant with an international technology consulting firm. Neuropsychological testing after 9 months of transcranial LED indicated significant improvement (+1, +2SD) in executive function (inhibition, inhibition accuracy) and memory, as well as reduction in post-traumatic stress disorder. If she stops treating for more than 1 week, she regresses. At the time of this report, both patients are continuing treatment. Conclusions: Transcranial LED may improve cognition, reduce costs in TBI treatment, and be applied at home. Controlled studies are warranted. PMID:21182447

  11. Bioinformatics algorithm based on a parallel implementation of a machine learning approach using transducers

    NASA Astrophysics Data System (ADS)

    Roche-Lima, Abiel; Thulasiram, Ruppa K.

    2012-02-01

    Finite automata in which each transition is augmented with an output label in addition to the familiar input label are known as finite-state transducers. Transducers have been used to analyze some fundamental issues in bioinformatics. Weighted finite-state transducers have been proposed for pairwise alignments of DNA and protein sequences, as well as for developing kernels for computational biology. Machine learning algorithms for conditional transducers have been implemented and used for DNA sequence analysis. Transducer learning algorithms are based on conditional probability computation, which is calculated using techniques such as pair-database creation, normalization (with maximum-likelihood normalization), and parameter optimization (with Expectation-Maximization, EM). These techniques are intrinsically costly to compute, and even more so when applied to bioinformatics, because the database sizes are large. In this work, we describe a parallel implementation of an algorithm to learn conditional transducers using these techniques. The algorithm is oriented to bioinformatics applications such as alignments, phylogenetic trees, and other genome evolution studies. Several experiments were carried out using the parallel and sequential algorithms on WestGrid (specifically, on the Breezy cluster). The results show that our parallel algorithm is scalable, because execution times are reduced considerably as the data size parameter is increased. Another experiment varied the precision parameter; in this case, we obtain smaller execution times using the parallel algorithm. Finally, the number of threads used to execute the parallel algorithm on the Breezy cluster was varied. In this last experiment, the speedup increases considerably when more threads are used, but it converges for 16 or more threads.

  12. Older Adults with Fear of Falling Show Deficits in Motor Imagery of Gait.

    PubMed

    Sakurai, R; Fujiwara, Y; Yasunaga, M; Suzuki, H; Sakuma, N; Imanaka, K; Montero-Odasso, M

    2017-01-01

    Understanding of the underlying mechanisms of Fear of Falling (FoF) could help to expand potential treatments. Given the nature of motor performance, the decline in the planning stage of motor execution may be associated with an expression of FoF. The aim of this study was to assess the planning/prediction accuracy in motor execution in people with FoF using gait-related motor imagery (MI). Cross-sectional case/control study. Three health centers in Japan. Two hundred and eighty-three community-dwelling older adults were recruited and stratified by presence of FoF as FoF group (n=178) or non-FoF group (n=107). Participants were tested for both imagery and execution tasks of a Timed Up and Go (TUG) test. The participants were first asked to imagine the trial (iTUG) and estimate the time it would take, and then perform the actual trial (aTUG). The difference between iTUG and aTUG (Δ TUG) was calculated. The FoF group was significantly slower in aTUG, but iTUG duration was almost identical between the two groups, resulting in significant overestimation in the FoF group. The adjusted logistic regression analysis showed that increased Δ TUG (i.e., tendency to overestimate) was significantly associated with FoF (OR = 1.05; 95% CI = 1.02-1.10). Low frequency of going outdoors was also associated with FoF (OR 2.95; 95% CI: 1.16-7.44). Older adults with FoF overestimate their TUG performance, reflecting impairment in motor planning. Overestimation of physical capabilities can be an additional explanation of the high risk of falls in this population.

  13. Exergaming and older adult cognition: a cluster randomized clinical trial.

    PubMed

    Anderson-Hanley, Cay; Arciero, Paul J; Brickman, Adam M; Nimon, Joseph P; Okuma, Naoko; Westen, Sarah C; Merz, Molly E; Pence, Brandt D; Woods, Jeffrey A; Kramer, Arthur F; Zimmerman, Earl A

    2012-02-01

    Dementia cases may reach 100 million by 2050. Interventions are sought to curb or prevent cognitive decline. Exercise yields cognitive benefits, but few older adults exercise. Virtual reality-enhanced exercise or "exergames" may elicit greater participation. To test the following hypotheses: (1) stationary cycling with virtual reality tours ("cybercycle") will enhance executive function and clinical status more than traditional exercise; (2) exercise effort will explain improvement; and (3) brain-derived neurotrophic growth factor (BDNF) will increase. Multi-site cluster randomized clinical trial (RCT) of the impact of 3 months of cybercycling versus traditional exercise, on cognitive function in older adults. Data were collected in 2008-2010; analyses were conducted in 2010-2011. 102 older adults from eight retirement communities enrolled; 79 were randomized and 63 completed. A recumbent stationary ergometer was utilized; virtual reality tours and competitors were enabled on the cybercycle. Executive function (Color Trails Difference, Stroop C, Digits Backward); clinical status (mild cognitive impairment; MCI); exercise effort/fitness; and plasma BDNF. Intent-to-treat analyses, controlling for age, education, and cluster randomization, revealed a significant group X time interaction for composite executive function (p=0.002). Cybercycling yielded a medium effect over traditional exercise (d=0.50). Cybercyclists had a 23% relative risk reduction in clinical progression to MCI. Exercise effort and fitness were comparable, suggesting another underlying mechanism. A significant group X time interaction for BDNF (p=0.05) indicated enhanced neuroplasticity among cybercyclists. Cybercycling older adults achieved better cognitive function than traditional exercisers, for the same effort, suggesting that simultaneous cognitive and physical exercise has greater potential for preventing cognitive decline. This study is registered at Clinicaltrials.gov NCT01167400. Copyright © 2012 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  14. Teaching case studies on earthquake preparedness efforts in the transportation sector, Los Angeles metropolitan area.

    DOT National Transportation Integrated Search

    2013-01-01

    Through the development of a Harvard Kennedy School case study (intended for : use as curriculum in graduate-level and executive education programs), this project : examines earthquake preparedness and planning processes in the Los Angeles : metropol...

  15. EM-ANIMATE - COMPUTER PROGRAM FOR DISPLAYING AND ANIMATING THE STEADY-STATE TIME-HARMONIC ELECTROMAGNETIC NEAR FIELD AND SURFACE-CURRENT SOLUTIONS

    NASA Technical Reports Server (NTRS)

    Hom, K. W.

    1994-01-01

    The EM-ANIMATE program is a specialized visualization program that displays and animates the near-field and surface-current solutions obtained from an electromagnetics program, in particular, that from MOM3D (LAR-15074). The EM-ANIMATE program is windows based and contains a user-friendly, graphical interface for setting viewing options, case selection, file manipulation, etc. EM-ANIMATE displays the field and surface-current magnitude as smooth shaded color fields (color contours) ranging from a minimum contour value to a maximum contour value for the fields and surface currents. The program can display either the total electric field or the scattered electric field in either time-harmonic animation mode or in the root mean square (RMS) average mode. The default setting is initially set to the minimum and maximum values within the field and surface current data and can be optionally set by the user. The field and surface-current value are animated by calculating and viewing the solution at user selectable radian time increments between 0 and 2pi. The surface currents can also be displayed in either time-harmonic animation mode or in RMS average mode. In RMS mode, the color contours do not vary with time, but show the constant time averaged field and surface-current magnitude solution. The electric field and surface-current directions can be displayed as scaled vector arrows which have a length proportional to the magnitude at each field grid point or surface node point. These vector properties can be viewed separately or concurrently with the field or surface-current magnitudes. Animation speed is improved by turning off the display of the vector arrows. In RMS modes, the direction vectors are still displayed as varying with time since the time averaged direction vectors would be zero length vectors. Other surface properties can optionally be viewed. These include the surface grid, the resistance value assigned to each element of the grid, and the power dissipation of each element which has an assigned resistance value. The EM-ANIMATE program will accept up to 10 different surface current cases each consisting of up to 20,000 node points and 10,000 triangle definitions and will animate one of these cases. The capability is used to compare surface-current distribution due to various initial excitation directions or electric field orientations. The program can accept up to 50 planes of field data consisting of a grid of 100 by 100 field points. These planes of data are user selectable and can be viewed individually or concurrently. With these preset limits, the program requires 55 megabytes of core memory to run. These limits can be changed in the header files to accommodate the available core memory of an individual workstation. An estimate of memory required can be made as follows: approximate memory in bytes equals (number of nodes times number of surfaces times 14 variables times bytes per word, typically 4 bytes per floating point) plus (number of field planes times number of nodes per plane times 21 variables times bytes per word). This gives the approximate memory size required to store the field and surface-current data. The total memory size is approximately 400,000 bytes plus the data memory size. The animation calculations are performed in real time at any user set time step. For Silicon Graphics Workstations that have multiple processors, this program has been optimized to perform these calculations on multiple processors to increase animation rates. 
The optimized program uses the SGI PFA (Power FORTRAN Accelerator) library. On single processor machines, the parallelization directives are seen as comments to the program and will have no effect on compilation or execution. EM-ANIMATE is written in FORTRAN 77 for implementation on SGI IRIS workstations running IRIX 3.0 or later. A minimum of 55Mb of RAM is required for execution of this program; however, the code may be modified to accommodate the available memory of an individual workstation. For program execution, twenty-four bit, double-buffered color capability is suggested, but not required. Sample input and output files and a sample executable are provided on the distribution medium. Electronic documentation is provided in PostScript format and in the form of IRIX man pages. The standard distribution medium for EM-ANIMATE is a .25 inch streaming magnetic IRIX tape cartridge in UNIX tar format. EM-ANIMATE is also available as part of a bundled package, COS-10048 that includes MOM3D, an IRIS program that produces electromagnetic near field and surface current solutions. This program was developed in 1993.
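    The memory estimate quoted in the abstract above is simple arithmetic; the following C sketch merely evaluates that formula for illustrative sizes (the node, surface, and field-plane counts below are placeholders, not values taken from the program's documentation).

```c
/* Sketch: evaluate the EM-ANIMATE memory estimate described in the abstract:
 *   bytes ~= nodes * surfaces * 14 variables * bytes_per_word
 *          + field_planes * nodes_per_plane * 21 variables * bytes_per_word
 * plus roughly 400,000 bytes of fixed program overhead. */
#include <stdio.h>

int main(void)
{
    const long bytes_per_word  = 4;         /* typical single-precision float */
    const long nodes           = 20000;     /* hypothetical surface node count */
    const long surfaces        = 1;         /* hypothetical number of surface cases */
    const long field_planes    = 10;        /* hypothetical number of field planes */
    const long nodes_per_plane = 100 * 100; /* 100 x 100 grid per plane */

    long data_bytes = nodes * surfaces * 14 * bytes_per_word
                    + field_planes * nodes_per_plane * 21 * bytes_per_word;
    long total_bytes = 400000 + data_bytes;

    printf("estimated data memory : %ld bytes\n", data_bytes);
    printf("estimated total memory: %ld bytes (~%.1f MB)\n",
           total_bytes, total_bytes / 1.0e6);
    return 0;
}
```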

  16. Implementing forward recovery using checkpointing in distributed systems

    NASA Technical Reports Server (NTRS)

    Long, Junsheng; Fuchs, W. K.; Abraham, Jacob A.

    1991-01-01

    The paper describes the implementation of a forward recovery scheme using checkpoints and replicated tasks. The implementation is based on the concept of lookahead execution and rollback validation. In the experiment, two tasks are selected for the normal execution and one for rollback validation. It is shown that the recovery strategy has nearly error-free execution time and an average redundancy lower than TMR.
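    A compact C sketch of the general lookahead-execution/rollback-validation pattern, greatly simplified and simulated sequentially rather than on replicated distributed tasks as in the paper: a lookahead result is accepted only if an independently recomputed validation result agrees; otherwise the state rolls back to the last validated checkpoint. The step function and the injected fault are hypothetical.

```c
/* Sketch: lookahead execution with rollback validation, simulated sequentially.
 * A "normal" task runs ahead from each checkpoint while a validation task
 * recomputes the same step; on a mismatch the state rolls back to the last
 * validated checkpoint and the step is redone. Fault injection is artificial. */
#include <stdio.h>

static long step(long state, int inject_fault)
{
    long next = state * 3 + 1;              /* placeholder computation step */
    return inject_fault ? next + 7 : next;  /* a fault corrupts the result */
}

int main(void)
{
    long validated = 1;                     /* last validated checkpoint */
    int  rollbacks = 0;

    for (int i = 0; i < 10; i++) {
        int fault = (i == 4);               /* hypothetical transient fault at step 4 */
        long lookahead = step(validated, fault);   /* optimistic execution */
        long check     = step(validated, 0);       /* rollback-validation replica */

        if (lookahead == check) {
            validated = lookahead;          /* checkpoint accepted, keep going */
        } else {
            rollbacks++;                    /* mismatch: discard lookahead result */
            validated = step(validated, 0); /* redo the step from the checkpoint */
        }
    }
    printf("final state=%ld rollbacks=%d\n", validated, rollbacks);
    return 0;
}
```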

  17. Platform for Automated Real-Time High Performance Analytics on Medical Image Data.

    PubMed

    Allen, William J; Gabr, Refaat E; Tefera, Getaneh B; Pednekar, Amol S; Vaughn, Matthew W; Narayana, Ponnada A

    2018-03-01

    Biomedical data are quickly growing in volume and in variety, providing clinicians an opportunity for better clinical decision support. Here, we demonstrate a robust platform that uses software automation and high performance computing (HPC) resources to achieve real-time analytics of clinical data, specifically magnetic resonance imaging (MRI) data. We used the Agave application programming interface to facilitate communication, data transfer, and job control between an MRI scanner and an off-site HPC resource. In this use case, Agave executed the graphical pipeline tool GRAphical Pipeline Environment (GRAPE) to perform automated, real-time, quantitative analysis of MRI scans. Same-session image processing will open the door for adaptive scanning and real-time quality control, potentially accelerating the discovery of pathologies and minimizing patient callbacks. We envision this platform can be adapted to other medical instruments, HPC resources, and analytics tools.

  18. Image segmentation based upon topological operators: real-time implementation case study

    NASA Astrophysics Data System (ADS)

    Mahmoudi, R.; Akil, M.

    2009-02-01

    In many image processing applications, thinning and crest restoring are of considerable interest. The recommended algorithms for these procedures are those able to act directly on grayscale images while preserving topology, but their heavy time consumption remains the major disadvantage in choosing them. In this paper we present an efficient hardware implementation, on a RISC processor, of two powerful thinning and crest-restoring algorithms developed by our team. The proposed implementation improves execution time. A segmentation chain applied to medical imaging serves as a concrete example to illustrate the improvements brought by optimization techniques at both the algorithmic and architectural levels. In particular, use of the SSE instruction set of x86_32 processors (PIV 3.06 GHz) enables real-time processing: a rate of 33 images (512*512) per second is achieved.
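    The kind of SIMD optimization the abstract refers to can be sketched in C with SSE2 intrinsics, here for a 3x1 grayscale erosion (a typical building block of thinning-style operators) that processes 16 pixels per instruction; this is an illustration of the technique, not the team's actual thinning or crest-restoring code.

```c
/* Sketch: SSE2 3x1 grayscale erosion, out[x] = min(in[x-1], in[x], in[x+1]),
 * processing 16 pixels per iteration over one image row. */
#include <emmintrin.h>   /* SSE2 intrinsics */
#include <stdio.h>

#define W 512

static void erode_row_sse2(const unsigned char *in, unsigned char *out)
{
    int x = 1;
    out[0] = in[0];                      /* leave border pixels unchanged */
    out[W - 1] = in[W - 1];
    for (; x + 16 <= W - 1; x += 16) {   /* 16 interior pixels per iteration */
        __m128i left   = _mm_loadu_si128((const __m128i *)(in + x - 1));
        __m128i center = _mm_loadu_si128((const __m128i *)(in + x));
        __m128i right  = _mm_loadu_si128((const __m128i *)(in + x + 1));
        _mm_storeu_si128((__m128i *)(out + x),
                         _mm_min_epu8(_mm_min_epu8(left, center), right));
    }
    for (; x < W - 1; x++) {             /* scalar tail */
        unsigned char m = in[x - 1] < in[x] ? in[x - 1] : in[x];
        out[x] = m < in[x + 1] ? m : in[x + 1];
    }
}

int main(void)
{
    static unsigned char in[W], out[W];
    for (int i = 0; i < W; i++) in[i] = (unsigned char)(i * 7);  /* dummy row */
    erode_row_sse2(in, out);
    printf("out[10]=%d out[100]=%d\n", out[10], out[100]);
    return 0;
}
```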

  19. A parallel approach of COFFEE objective function to multiple sequence alignment

    NASA Astrophysics Data System (ADS)

    Zafalon, G. F. D.; Visotaky, J. M. V.; Amorim, A. R.; Valêncio, C. R.; Neves, L. A.; de Souza, R. C. G.; Machado, J. M.

    2015-09-01

    Computational tools to assist genomic analyses are increasingly necessary owing to the rapid growth in the amount of available data. Given the high computational cost of deterministic algorithms for sequence alignment, many works concentrate their efforts on the development of heuristic approaches to multiple sequence alignment. However, selecting an approach that offers solutions with good biological significance and feasible execution time is a great challenge. This work therefore presents the parallelization of the processing steps of the MSA-GA tool, using the multithread paradigm in the execution of the COFFEE objective function. The standard objective function implemented in the tool is the Weighted Sum of Pairs (WSP), which produces some distortions in the final alignments when sequence sets with low similarity are aligned. In previous studies we implemented the COFFEE objective function in the tool to smooth these distortions. Although the nature of the COFFEE objective function implies an increase in execution time, the approach contains steps that can be executed in parallel. With the improvements implemented in this work, the execution time of the new approach is 24% faster than the sequential approach with COFFEE. Moreover, the multithreaded COFFEE approach is more efficient than WSP because, besides being slightly faster, its biological results are better.
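    As a rough illustration of the parallelization opportunity described above (the MSA-GA internals and the exact COFFEE weighting are not reproduced), a consistency-style objective can be accumulated over all sequence pairs with an OpenMP reduction, since each pair's contribution is independent. The toy alignment and the per-pair score below are placeholders.

```c
/* Sketch: pairwise-independent scoring of a multiple alignment, accumulated
 * in parallel with OpenMP. The per-pair "consistency" here is a dummy stand-in
 * for COFFEE's library lookups; only the parallel structure is the point. */
#include <omp.h>
#include <stdio.h>

#define NSEQ 6
#define LEN  16

static const char aln[NSEQ][LEN + 1] = {   /* toy aligned sequences with gaps */
    "ACG-TACGTTACG-TA", "ACGGTACG-TACGATA", "ACG-TACGTTAC--TA",
    "A-GGTACGTTACGATA", "ACG-TA-GTTACGATA", "ACGGTACGTTACGAT-"
};

/* Dummy per-pair score: fraction of columns where both rows carry the same,
 * non-gap character (a placeholder for a library-consistency score). */
static double pair_score(int a, int b)
{
    int match = 0, cols = 0;
    for (int c = 0; c < LEN; c++) {
        if (aln[a][c] != '-' && aln[b][c] != '-') {
            cols++;
            if (aln[a][c] == aln[b][c]) match++;
        }
    }
    return cols ? (double)match / cols : 0.0;
}

int main(void)
{
    double total = 0.0;
    int npairs = NSEQ * (NSEQ - 1) / 2;

    /* Each (a,b) pair is independent, so the pair loop parallelizes cleanly. */
    #pragma omp parallel for reduction(+:total) schedule(dynamic)
    for (int a = 0; a < NSEQ; a++)
        for (int b = a + 1; b < NSEQ; b++)
            total += pair_score(a, b);

    printf("objective = %.4f over %d pairs\n", total / npairs, npairs);
    return 0;
}
```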

  20. Exploiting Semantic Web Technologies to Develop OWL-Based Clinical Practice Guideline Execution Engines.

    PubMed

    Jafarpour, Borna; Abidi, Samina Raza; Abidi, Syed Sibte Raza

    2016-01-01

    Computerizing paper-based CPGs and then executing them can provide evidence-informed decision support to physicians at the point of care. Semantic web technologies, especially Web Ontology Language (OWL) ontologies, have been used extensively to represent computerized CPGs. Using semantic web reasoning capabilities to execute OWL-based computerized CPGs unties them from a specific custom-built CPG execution engine and increases their shareability, as any OWL reasoner and triple store can be utilized for CPG execution. However, existing semantic web reasoning-based CPG execution engines suffer from a limited ability to execute CPGs with high levels of expressivity, a high cognitive load when computerizing paper-based CPGs, and difficulty updating their computerized versions. In order to address these limitations, we have developed three CPG execution engines based on OWL 1 DL, OWL 2 DL and OWL 2 DL + semantic web rule language (SWRL). OWL 1 DL serves as the base execution engine capable of executing a wide range of CPG constructs, whereas, for executing highly complex CPGs, the OWL 2 DL and OWL 2 DL + SWRL engines offer additional execution capabilities. We evaluated the technical performance and medical correctness of our execution engines using a range of CPGs. Technical evaluations show the efficiency of our CPG execution engines in terms of CPU time and the validity of the generated recommendations in comparison to existing CPG execution engines. Medical evaluations by domain experts show the validity of the CPG-mediated therapy plans in terms of relevance, safety, and ordering for a wide range of patient scenarios.

  1. OPAD-EDIFIS Real-Time Processing

    NASA Technical Reports Server (NTRS)

    Katsinis, Constantine

    1997-01-01

    The Optical Plume Anomaly Detection (OPAD) detects engine hardware degradation of flight vehicles through identification and quantification of elemental species found in the plume by analyzing the plume emission spectra in a real-time mode. Real-time performance of OPAD relies on extensive software which must report metal amounts in the plume faster than once every 0.5 sec. OPAD software previously written by NASA scientists performed most necessary functions at speeds which were far below what is needed for real-time operation. The research presented in this report improved the execution speed of the software by optimizing the code without changing the algorithms and converting it into a parallelized form which is executed in a shared-memory multiprocessor system. The resulting code was subjected to extensive timing analysis. The report also provides suggestions for further performance improvement by (1) identifying areas of algorithm optimization, (2) recommending commercially available multiprocessor architectures and operating systems to support real-time execution and (3) presenting an initial study of fault-tolerance requirements.

  2. An Evaluation of Database Solutions to Spatial Object Association

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, V S; Kurc, T; Saltz, J

    2008-06-24

    Object association is a common problem encountered in many applications. Spatial object association, also referred to as crossmatch of spatial datasets, is the problem of identifying and comparing objects in two datasets based on their positions in a common spatial coordinate system--one of the datasets may correspond to a catalog of objects observed over time in a multi-dimensional domain; the other dataset may consist of objects observed in a snapshot of the domain at a time point. The use of database management systems to solve the object association problem provides portability across different platforms and also greater flexibility. Increasing dataset sizes in today's applications, however, have made object association a data/compute-intensive problem that requires targeted optimizations for efficient execution. In this work, we investigate how database-based crossmatch algorithms can be deployed on different database system architectures and evaluate the deployments to understand the impact of architectural choices on crossmatch performance and associated trade-offs. We investigate the execution of two crossmatch algorithms on (1) a parallel database system with active disk style processing capabilities, (2) a high-throughput network database (MySQL Cluster), and (3) shared-nothing databases with replication. We have conducted our study in the context of a large-scale astronomy application with real use-case scenarios.
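    For orientation only, the matching criterion behind a crossmatch can be stated in a few lines of C: for every object in one catalog, report objects in the other catalog within a small angular radius. The database deployments studied in the paper exist precisely because this naive O(N*M) form does not scale; the catalogs and match radius below are made up.

```c
/* Sketch: naive O(N*M) positional crossmatch of two tiny catalogs.
 * Real deployments push this join into a (parallel) database with spatial
 * indexing; this sketch only states the matching criterion. */
#include <math.h>
#include <stdio.h>

typedef struct { double ra_deg, dec_deg; } obj_t;   /* sky position in degrees */

/* Angular separation in arcseconds (spherical law of cosines). */
static double sep_arcsec(obj_t a, obj_t b)
{
    const double d2r = 3.14159265358979323846 / 180.0;
    double c = sin(a.dec_deg * d2r) * sin(b.dec_deg * d2r)
             + cos(a.dec_deg * d2r) * cos(b.dec_deg * d2r)
             * cos((a.ra_deg - b.ra_deg) * d2r);
    if (c > 1.0) c = 1.0;                /* guard against rounding past 1 */
    return acos(c) / d2r * 3600.0;
}

int main(void)
{
    /* Two hypothetical miniature catalogs (RA, Dec in degrees). */
    obj_t cat1[] = { {10.6840, 41.2690}, {150.0010, 2.2000}, {200.5000, -10.1000} };
    obj_t cat2[] = { {10.6841, 41.2691}, {150.3000, 2.2000}, {200.5002, -10.0999} };
    double radius_arcsec = 2.0;          /* hypothetical match radius */

    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++) {
            double sep = sep_arcsec(cat1[i], cat2[j]);
            if (sep <= radius_arcsec)
                printf("cat1[%d] <-> cat2[%d], separation %.3f arcsec\n", i, j, sep);
        }
    return 0;
}
```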

  3. Influence of the distance in a roundhouse kick's execution time and impact force in Taekwondo.

    PubMed

    Falco, Coral; Alvarez, Octavio; Castillo, Isabel; Estevan, Isaac; Martos, Julio; Mugarra, Fernando; Iradi, Antonio

    2009-02-09

    Taekwondo, originally a Korean martial art, is well known for its kicks. One of the most frequently used kicks in competition is Bandal Chagui or roundhouse kick. Excellence in Taekwondo relies on the ability to make contact with the opponent's trunk or face with enough force in as little time as possible, while at the same time avoiding being hit. Thus, the distance between contestants is an important variable to be taken into consideration. Thirty-one Taekwondo athletes in two different groups (expert and novice, according to experience in competition) took part in this study. The purpose of this study was to examine both impact force and execution time in a Bandal Chagui or roundhouse kick, and to explore the effect of execution distance in these two variables. A new model was developed in order to measure the force exerted by the body on a load. A force platform and a contact platform were used to measure these variables. The results showed that there are no significant differences in terms of impact force in relation to execution distance in expert competitors. Significant and positive correlations between body mass and impact force (p<.01) seem to mean that novice competitors use their body mass to generate high impact forces. Significant differences were found in competitive experience and execution time for the three different distances of kicking considered in the study. Standing at a certain further distance from the opponent should be an advantage for competitors who are used to kick from a further distance in their training.

  4. Resource-Aware Mobile-Based Health Monitoring.

    PubMed

    Masud, Mohammad M; Adel Serhani, Mohamed; Navaz, Alramzana Nujum

    2017-03-01

    Monitoring heart diseases often requires frequent measurements of electrocardiogram (ECG) signals at different periods of the day and in different situations (e.g., traveling and exercising). This can only be implemented using mobile devices in order to cope with the mobility of patients under monitoring, thus supporting continuous monitoring practices. However, these devices are energy-constrained, have limited computing resources (e.g., CPU speed and memory), and might lose network connectivity, which makes it very challenging to maintain the continuity of a monitoring episode. In this paper, we propose a mobile monitoring solution that copes with these challenges by trading off, on the fly, resource availability, battery level, and network intermittence. To solve this problem, we first divide the whole process into several subtasks such that each subtask can be executed sequentially either on the server or on the mobile device, or in parallel on both. We then develop a mathematical model that considers all the constraints and uses dynamic programming to obtain the best execution path (i.e., which substep should be done where). The solution guarantees an optimum execution time while considering device battery availability, execution and transmission time, and network availability. We conducted a series of experiments to evaluate our proposed approach using key monitoring tasks ranging from preprocessing to classification and prediction. The results show that our approach gives the best (lowest) running time for any combination of factors, including processing speed, input size, and network bandwidth. Compared with several greedy but nonoptimal solutions, our approach executed at least 10 times faster and consumed 90% less energy.
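
    A minimal sketch of the kind of dynamic program described above is shown here, assuming a linear pipeline of subtasks with illustrative execution and transfer costs; the battery-level and network-availability constraints of the full model are omitted for brevity, and every name and number is hypothetical.

    ```python
    # Hedged sketch (not the authors' implementation): choose, for each subtask,
    # whether to run it on the mobile device or offload it to the server so that
    # total time is minimized. Battery and network constraints are omitted here.
    def best_execution_path(local_t, server_t, transfer_t):
        """local_t[i]/server_t[i]: time of subtask i on the mobile/server.
        transfer_t[i]: time to move subtask i's input between the two devices.
        Returns (total_time, placement) with placement a list of 'mobile'/'server'."""
        n = len(local_t)
        INF = float("inf")
        # dp[i][loc]: best time for subtasks 0..i with the result at loc (0=mobile, 1=server)
        dp = [[INF, INF] for _ in range(n)]
        choice = [[None, None] for _ in range(n)]
        dp[0][0] = local_t[0]
        dp[0][1] = transfer_t[0] + server_t[0]
        for i in range(1, n):
            for loc in (0, 1):
                run_t = local_t[i] if loc == 0 else server_t[i]
                for prev in (0, 1):
                    move = 0.0 if prev == loc else transfer_t[i]
                    cand = dp[i - 1][prev] + move + run_t
                    if cand < dp[i][loc]:
                        dp[i][loc], choice[i][loc] = cand, prev
        # Backtrack from the cheaper of the two final locations.
        loc = 0 if dp[n - 1][0] <= dp[n - 1][1] else 1
        total, placement = dp[n - 1][loc], []
        for i in range(n - 1, -1, -1):
            placement.append("mobile" if loc == 0 else "server")
            loc = choice[i][loc] if choice[i][loc] is not None else loc
        return total, list(reversed(placement))

    # Example: three subtasks (preprocessing, feature extraction, classification).
    print(best_execution_path([2.0, 5.0, 1.0], [0.5, 1.0, 0.4], [1.5, 1.5, 0.2]))
    ```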

  5. The Influence of Mesh Density on the Impact Response of a Shuttle Leading-Edge Panel Finite Element Simulation

    NASA Technical Reports Server (NTRS)

    Jackson, Karen E.; Fasanella, Edwin L.; Lyle, Karen H.; Spellman, Regina L.

    2004-01-01

    A study was performed to examine the influence of varying mesh density on an LS-DYNA simulation of a rectangular-shaped foam projectile impacting the space shuttle leading edge Panel 6. The shuttle leading-edge panels are fabricated of reinforced carbon-carbon (RCC) material. During the study, nine cases were executed with all possible combinations of coarse, baseline, and fine meshes of the foam and panel. For each simulation, the same material properties and impact conditions were specified and only the mesh density was varied. In the baseline model, the shell elements representing the RCC panel are approximately 0.2-in. on edge, whereas the foam elements are about 0.5-in. on edge. The element nominal edge-length for the baseline panel was halved to create a fine panel (0.1-in. edge length) mesh and doubled to create a coarse panel (0.4-in. edge length) mesh. In addition, the element nominal edge-length of the baseline foam projectile was halved (0.25-in. edge length) to create a fine foam mesh and doubled (1.0-in. edge length) to create a coarse foam mesh. The initial impact velocity of the foam was 775 ft/s. The simulations were executed in LS-DYNA version 960 for 6 ms of simulation time. Contour plots of resultant panel displacement and effective stress in the foam were compared at five discrete time intervals. Also, time-history responses of internal and kinetic energy of the panel, kinetic and hourglass energy of the foam, and resultant contact force were plotted to determine the influence of mesh density. As a final comparison, the model with a fine panel and fine foam mesh was executed with slightly different material properties for the RCC. For this model, the average degraded properties of the RCC were replaced with the maximum degraded properties. Similar comparisons of panel and foam responses were made for the average and maximum degraded models.

  6. Management of Hip Fractures in Lateral Position without a Fracture Table.

    PubMed

    Pahlavanhosseini, Hamid; Valizadeh, Sima; Banadaky, Seyyed Hossein Saeed; Karbasi, Mohammad H Akhavan; Abrisham, Seyed Mohammad J; Fallahzadeh, Hossein

    2014-09-01

    Hip fracture management in the supine position on a fracture table with biplane fluoroscopic views has some difficulties, which lead to prolongation of surgery and increased X-ray dosage. The purpose of this study was to report the results and complications of hip fracture management in the lateral position on a conventional operating table with only an anteroposterior fluoroscopic view. Forty hip fractures (31 trochanteric and 9 femoral neck fractures) were operated on in the lateral position between Feb 2006 and Oct 2012. Age, gender, fracture classification, operation time, intra-operative blood loss, reduction quality, and complications were extracted from patients' medical records. The mean follow-up time was 30.78±22.73 months (range 4-83). The mean operation time was 76.50 ± 16.88 min (range 50-120 min). The mean intra-operative blood loss was 628.75 ± 275.00 ml (range 250-1300 ml). Anatomic and acceptable reduction was observed in 95% of cases. The most important complications were malunion (one case in the trochanteric group), and avascular necrosis of the femoral head and nonunion (one case each in the femoral neck group). Reduction and fixation of hip fractures in the lateral position with fluoroscopy in only the anteroposterior view appears to be feasible and probably safe for small rural hospitals.

  7. Management of Hip Fractures in Lateral Position without a Fracture Table

    PubMed Central

    Pahlavanhosseini, Hamid; Valizadeh, Sima; Banadaky, Seyyed Hossein Saeed; Karbasi, Mohammad H Akhavan; Abrisham, Seyed Mohammad J; Fallahzadeh, Hossein

    2014-01-01

    Background: Hip fracture management in the supine position on a fracture table with biplane fluoroscopic views has some difficulties, which lead to prolongation of surgery and increased X-ray dosage. The purpose of this study was to report the results and complications of hip fracture management in the lateral position on a conventional operating table with only an anteroposterior fluoroscopic view. Methods: Forty hip fractures (31 trochanteric and 9 femoral neck fractures) were operated on in the lateral position between Feb 2006 and Oct 2012. Age, gender, fracture classification, operation time, intra-operative blood loss, reduction quality, and complications were extracted from patients' medical records. The mean follow-up time was 30.78±22.73 months (range 4-83). Results: The mean operation time was 76.50 ± 16.88 min (range 50-120 min). The mean intra-operative blood loss was 628.75 ± 275.00 ml (range 250-1300 ml). Anatomic and acceptable reduction was observed in 95% of cases. The most important complications were malunion (one case in the trochanteric group), and avascular necrosis of the femoral head and nonunion (one case each in the femoral neck group). Conclusions: Reduction and fixation of hip fractures in the lateral position with fluoroscopy in only the anteroposterior view appears to be feasible and probably safe for small rural hospitals. PMID:25386577

  8. SAFARI digital processing unit: performance analysis of the SpaceWire links in case of a LEON3-FT based CPU

    NASA Astrophysics Data System (ADS)

    Giusi, Giovanni; Liu, Scige J.; Di Giorgio, Anna M.; Galli, Emanuele; Pezzuto, Stefano; Farina, Maria; Spinoglio, Luigi

    2014-08-01

    SAFARI (SpicA FAR infrared Instrument) is a far-infrared imaging Fourier Transform Spectrometer for the SPICA mission. The Digital Processing Unit (DPU) of the instrument implements the functions of controlling the overall instrument and performing the science data compression and packing. The DPU design is based on the use of a LEON family processor. In SAFARI, all instrument components are connected to the central DPU via SpaceWire links. On these links, science data, housekeeping, and command flows are in some cases multiplexed, so the interface control must be able to cope with variable throughput needs. The effective data transfer workload can be an issue for overall system performance and becomes a critical parameter for the on-board software design, both at the application layer level and at lower, more hardware-related levels. To analyze the system behavior in the presence of the expected SAFARI demanding science data flow, we carried out a series of performance tests using the standard GR-CPCI-UT699 LEON3-FT Development Board, provided by Aeroflex/Gaisler, connected to the emulator of the SAFARI science data links in a point-to-point topology. Two different communication protocols were used in the tests: the ECSS-E-ST-50-52C RMAP protocol and an internally defined one, the SAFARI internal data handling protocol. An incremental approach was adopted to measure system performance at different levels of communication protocol complexity. In all cases the performance was evaluated by measuring the CPU workload and the bus latencies. The tests were executed initially in a custom low-level execution environment and finally using the Real-Time Executive for Multiprocessor Systems (RTEMS), which has been selected as the operating system to be used on board SAFARI. The preliminary results of the performance analysis confirmed the possibility of using a LEON3 CPU processor in the SAFARI DPU, but pointed out, in agreement with previous similar studies, the need to carefully design the overall architecture, implementing some of the DPU functionalities on additional processing devices.

  9. System Re-engineering Project Executive Summary

    DTIC Science & Technology

    1991-11-01

    Management Information System (STAMIS) application. This project involved reverse engineering, evaluation of structured design and object-oriented design, and re-implementation of the system in Ada. This executive summary presents the approach to re-engineering the system, the lessons learned while going through the process, and issues to be considered in future tasks of this nature.... Computer-Aided Software Engineering (CASE), Distributed Software, Ada, COBOL, Systems Analysis, Systems Design, Life Cycle Development, Functional Decomposition, Object-Oriented

  10. The Modeling, Simulation and Comparison of Interconnection Networks for Parallel Processing.

    DTIC Science & Technology

    1987-12-01

    performs better at a lower hardware cost than do the single stage cube and mesh networks. As a result, the designer of a parallel processing system is...attempted, and in most cases succeeded, in designing and implementing faster, more powerful systems. Due to design innovations and technological advances...largely to the computational complexity of the algorithms executed. In the von Neumann machine, instructions must be executed in a sequential manner. Design

  11. Improve Performance of Data Warehouse by Query Cache

    NASA Astrophysics Data System (ADS)

    Gour, Vishal; Sarangdevot, S. S.; Sharma, Anand; Choudhary, Vinod

    2010-11-01

    The primary goal of a data warehouse is to free the information locked up in the operational database so that decision makers and business analysts can run queries, analysis, and planning regardless of the data changes in the operational database. Because the number of queries is large, there is in certain cases a reasonable probability that the same query is submitted by one or multiple users at different times. Each time a query is executed, all the data of the warehouse is analyzed to generate the result of that query. In this paper we study how using a query cache improves the performance of a data warehouse and try to identify the common problems faced. These are problems that data warehouse administrators face when trying to minimize response time and improve overall query efficiency, particularly when the data warehouse is updated at regular intervals.
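
    The caching idea itself is simple; a minimal sketch is given below, assuming a hypothetical executor callable and whole-cache invalidation after each periodic warehouse refresh (a real system would also bound the cache size and invalidate selectively).

    ```python
    # Minimal sketch of a query result cache for a data warehouse: repeated
    # identical queries are served from memory, and the cache is cleared when
    # the warehouse is refreshed. The executor and SQL below are illustrative.
    import hashlib

    class QueryCache:
        def __init__(self, execute_fn):
            self._execute = execute_fn   # callable that actually runs SQL on the warehouse
            self._cache = {}

        def _key(self, sql):
            # Normalize whitespace/case so trivially different texts still hit the cache.
            return hashlib.sha256(" ".join(sql.lower().split()).encode()).hexdigest()

        def query(self, sql):
            k = self._key(sql)
            if k not in self._cache:
                self._cache[k] = self._execute(sql)   # cache miss: hit the warehouse
            return self._cache[k]

        def invalidate(self):
            # Call after each periodic warehouse load/refresh.
            self._cache.clear()

    cache = QueryCache(lambda sql: f"result of: {sql}")
    print(cache.query("SELECT region, SUM(sales) FROM facts GROUP BY region"))
    print(cache.query("select region, sum(sales) from facts group by region"))  # served from cache
    cache.invalidate()   # e.g., after the nightly warehouse refresh
    ```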

  12. 77 FR 29397 - Order Granting Application of BOX Options Exchange, LLC for a Limited Exemption From Exchange Act...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-17

    ... for strict price-time priority execution (``Trading System'').\\2\\ The BOX Book and the Exchange Rules...'') as set forth in Exchange Rule 7150 are an exception to the strict price-time priority execution that... in the Application, BOX will operate a fully automated electronic book (``BOX Book'') for orders to...

  13. Airpower Projection in the Anti-Access/Area Denial Environment: Dispersed Operations

    DTIC Science & Technology

    2015-02-01

    Raptor Case Study ... Risks to Dispersed Operations ... project airpower, this paper breaks down a case study of the Rapid Raptor concept. The risks with executing a dispersed model are analyzed and mitigation...will force leaders to look at alternative ways to project power. Alternative Option: Rapid Raptor Case Study The ability to defend forward operating

  14. 76 FR 17762 - Regulations Governing the Performance of Actuarial Services Under the Employee Retirement Income...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-31

    ... following as may be appropriate in the particular case: (1) Normal cost; (2) accrued liability; (3) payment... Joint Board will address on a case-by-case basis situations involving the inability of the Executive..., communication skills, and business and general tax law. 3. Qualifying Program Requirement These regulations do...

  15. Two criteria for the selection of assembly plans - Maximizing the flexibility of sequencing the assembly tasks and minimizing the assembly time through parallel execution of assembly tasks

    NASA Technical Reports Server (NTRS)

    Homem De Mello, Luiz S.; Sanderson, Arthur C.

    1991-01-01

    The authors introduce two criteria for the evaluation and selection of assembly plans. The first criterion is to maximize the number of different sequences in which the assembly tasks can be executed. The second criterion is to minimize the total assembly time through simultaneous execution of assembly tasks. An algorithm that performs a heuristic search for the best assembly plan over the AND/OR graph representation of assembly plans is discussed. Admissible heuristics for each of the two criteria introduced are presented. Some implementation issues that affect the computational efficiency are addressed.

  16. Acute hypoglycemia impairs executive cognitive function in adults with and without type 1 diabetes.

    PubMed

    Graveling, Alex J; Deary, Ian J; Frier, Brian M

    2013-10-01

    Acute hypoglycemia impairs cognitive function in several domains. Executive cognitive function governs organization of thoughts, prioritization of tasks, and time management. This study examined the effect of acute hypoglycemia on executive function in adults with and without diabetes. Thirty-two adults with and without type 1 diabetes with no vascular complications or impaired awareness of hypoglycemia were studied. Two hyperinsulinemic glucose clamps were performed at least 2 weeks apart in a single-blind, counterbalanced order, maintaining blood glucose at 4.5 mmol/L (euglycemia) or 2.5 mmol/L (hypoglycemia). Executive functions were assessed with a validated test suite (Delis-Kaplan Executive Function). A general linear model (repeated-measures ANOVA) was used. Glycemic condition (euglycemia or hypoglycemia) was the within-participant factor. Between-participant factors were order of session (euglycemia-hypoglycemia or hypoglycemia-euglycemia), test battery used, and diabetes status (with or without diabetes). Compared with euglycemia, executive functions (with one exception) were significantly impaired during hypoglycemia; lower test scores were recorded with more time required for completion. Large Cohen d values (>0.8) suggest that hypoglycemia induces decrements in aspects of executive function with large effect sizes. In some tests, the performance of participants with diabetes was more impaired than those without diabetes. Executive cognitive function, which is necessary to carry out many everyday activities, is impaired during hypoglycemia in adults with and without type 1 diabetes. This important aspect of cognition has not received previous systematic study with respect to hypoglycemia. The effect size is large in terms of both accuracy and speed.

  17. Reducing False Positives in Runtime Analysis of Deadlocks

    NASA Technical Reports Server (NTRS)

    Bensalem, Saddek; Havelund, Klaus; Clancy, Daniel (Technical Monitor)

    2002-01-01

    This paper presents an improvement of a standard algorithm for detecting deadlock potentials in multi-threaded programs that reduces the number of false positives. The standard algorithm works as follows. The multi-threaded program under observation is executed while lock and unlock events are observed. A graph of locks is built, with edges between locks symbolizing locking orders. Any cycle in the graph signifies a potential for a deadlock. The typical standard example is the group of dining philosophers sharing forks. The algorithm is interesting because it can catch deadlock potentials even though no deadlocks occur in the examined trace, and at the same time it scales very well in contrast to more formal approaches to deadlock detection. The algorithm, however, can yield false positives (as well as false negatives). The extension of the algorithm described in this paper reduces the number of false positives for three particular cases: when a gate lock protects a cycle, when a single thread introduces a cycle, and when the code segments in different threads that cause the cycle cannot actually execute in parallel. The paper formalizes a theory for dynamic deadlock detection and compares it to model checking and static analysis techniques. It furthermore describes an implementation for analyzing Java programs and its application to two case studies: a planetary rover and a spacecraft altitude control system.
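
    The lock-graph construction and cycle check at the heart of the standard algorithm can be sketched in a few lines; the event format, helper names, and the dining-philosophers trace below are illustrative assumptions, not the paper's implementation (which additionally tracks gate locks, thread identity, and code-segment information to suppress false positives).

    ```python
    # Sketch of the basic lock-order graph: an edge L1 -> L2 means some thread
    # acquired L2 while holding L1; any cycle flags a deadlock potential.
    from collections import defaultdict

    def lock_order_graph(events):
        """events: sequence of (thread, op, lock) with op in {'lock', 'unlock'}."""
        held = defaultdict(list)      # locks currently held, per thread
        edges = defaultdict(set)
        for thread, op, lock in events:
            if op == "lock":
                for outer in held[thread]:
                    edges[outer].add(lock)
                held[thread].append(lock)
            else:
                held[thread].remove(lock)
        return edges

    def has_cycle(edges):
        """Depth-first search for a cycle in the lock-order graph."""
        WHITE, GREY, BLACK = 0, 1, 2
        color = defaultdict(int)
        def visit(node):
            color[node] = GREY
            for nxt in edges.get(node, ()):
                if color[nxt] == GREY or (color[nxt] == WHITE and visit(nxt)):
                    return True
            color[node] = BLACK
            return False
        return any(color[n] == WHITE and visit(n) for n in list(edges))

    # Two philosophers taking their forks in opposite orders: no deadlock occurs
    # in this trace, but the cyclic lock order reveals the potential.
    trace = [("T1", "lock", "fork_A"), ("T1", "lock", "fork_B"),
             ("T1", "unlock", "fork_B"), ("T1", "unlock", "fork_A"),
             ("T2", "lock", "fork_B"), ("T2", "lock", "fork_A"),
             ("T2", "unlock", "fork_A"), ("T2", "unlock", "fork_B")]
    print(has_cycle(lock_order_graph(trace)))   # True
    ```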

  18. Using SRAM Based FPGAs for Power-Aware High Performance Wireless Sensor Networks

    PubMed Central

    Valverde, Juan; Otero, Andres; Lopez, Miguel; Portilla, Jorge; de la Torre, Eduardo; Riesgo, Teresa

    2012-01-01

    While for years traditional wireless sensor nodes have been based on ultra-low power microcontrollers with sufficient but limited computing power, the complexity and number of tasks of today's applications are constantly increasing. Increasing the node duty cycle is not feasible in all cases, so in many cases more computing power is required. This extra computing power may be achieved either by more powerful microcontrollers, at the cost of higher power consumption, or, in general, by any solution capable of accelerating task execution. At this point, the use of hardware-based, and in particular FPGA, solutions might appear as a candidate technology: although power use is higher compared with lower-power devices, execution time is reduced, so energy could be reduced overall. In order to demonstrate this, an innovative WSN node architecture is proposed. This architecture is based on a high-performance, high-capacity, state-of-the-art FPGA, which combines the advantages of the intrinsic acceleration provided by the parallelism of hardware devices, the use of partial reconfiguration capabilities, as well as a careful power-aware management system, to show that energy savings can be achieved for certain higher-end applications. Finally, comprehensive tests have been done to validate the platform in terms of performance and power consumption, to prove that better energy efficiency compared to processor-based solutions can be achieved, for instance, when encryption is imposed by the application requirements. PMID:22736971

  19. Using SRAM based FPGAs for power-aware high performance wireless sensor networks.

    PubMed

    Valverde, Juan; Otero, Andres; Lopez, Miguel; Portilla, Jorge; de la Torre, Eduardo; Riesgo, Teresa

    2012-01-01

    While for years traditional wireless sensor nodes have been based on ultra-low power microcontrollers with sufficient but limited computing power, the complexity and number of tasks of today's applications are constantly increasing. Increasing the node duty cycle is not feasible in all cases, so in many cases more computing power is required. This extra computing power may be achieved either by more powerful microcontrollers, at the cost of higher power consumption, or, in general, by any solution capable of accelerating task execution. At this point, the use of hardware-based, and in particular FPGA, solutions might appear as a candidate technology: although power use is higher compared with lower-power devices, execution time is reduced, so energy could be reduced overall. In order to demonstrate this, an innovative WSN node architecture is proposed. This architecture is based on a high-performance, high-capacity, state-of-the-art FPGA, which combines the advantages of the intrinsic acceleration provided by the parallelism of hardware devices, the use of partial reconfiguration capabilities, as well as a careful power-aware management system, to show that energy savings can be achieved for certain higher-end applications. Finally, comprehensive tests have been done to validate the platform in terms of performance and power consumption, to prove that better energy efficiency compared to processor-based solutions can be achieved, for instance, when encryption is imposed by the application requirements.

  20. Increased Executive Functioning, Attention, and Cortical Thickness in White-Collar Criminals

    PubMed Central

    Raine, Adrian; Laufer, William S.; Yang, Yaling; Narr, Katherine L.; Thompson, Paul; Toga, Arthur W.

    2011-01-01

    Very little is known about white-collar crime and how it differs from other forms of offending. This study tests the hypothesis that white-collar criminals have better executive functioning, enhanced information processing, and structural brain superiorities compared with offender controls. Using a case-control design, executive functioning, orienting, and cortical thickness were assessed in 21 white-collar criminals matched with 21 controls on age, gender, ethnicity, and general level of criminal offending. White-collar criminals had significantly better executive functioning, increased electrodermal orienting, increased arousal, and increased cortical gray matter thickness in the ventromedial prefrontal cortex, inferior frontal gyrus, somatosensory cortex, and the temporal-parietal junction compared with controls. The results, while initial, constitute the first findings on the neurobiological characteristics of white-collar criminals. It is hypothesized that white-collar criminals have information-processing and brain superiorities that give them an advantage in perpetrating criminal offenses in occupational settings. PMID:22002326

  1. Increased executive functioning, attention, and cortical thickness in white-collar criminals.

    PubMed

    Raine, Adrian; Laufer, William S; Yang, Yaling; Narr, Katherine L; Thompson, Paul; Toga, Arthur W

    2012-12-01

    Very little is known about white-collar crime and how it differs from other forms of offending. This study tests the hypothesis that white-collar criminals have better executive functioning, enhanced information processing, and structural brain superiorities compared with offender controls. Using a case-control design, executive functioning, orienting, and cortical thickness were assessed in 21 white-collar criminals matched with 21 controls on age, gender, ethnicity, and general level of criminal offending. White-collar criminals had significantly better executive functioning, increased electrodermal orienting, increased arousal, and increased cortical gray matter thickness in the ventromedial prefrontal cortex, inferior frontal gyrus, somatosensory cortex, and the temporal-parietal junction compared with controls. The results, while initial, constitute the first findings on the neurobiological characteristics of white-collar criminals. It is hypothesized that white-collar criminals have information-processing and brain superiorities that give them an advantage in perpetrating criminal offenses in occupational settings. Copyright © 2011 Wiley Periodicals, Inc.

  2. The RWJ Executive Nurse Fellows Program, Part 2: Mentoring for leadership success.

    PubMed

    Bellack, Janis P; Morjikian, Robin L

    2005-12-01

    This article is the second in a 3-part series describing the RWJ Executive Nurse Fellows Program, an advanced leadership program for nurses in senior executive roles who aspire to help lead and shape the US healthcare system of the future. Part 1 (October 2005) described the program, its core leadership competencies, and the primary components. This article discusses the mentor experience that is a cornerstone of the 3-year fellowship program. Fellows are encouraged to have this experience with senior-level executives outside of healthcare in order to broaden their leadership perspectives. Examples of these mentor experiences are described from the viewpoints of both fellows and mentors, including successes, challenges, and lessons learned. Part 3 (February 2006) will explain how fellows are required to create a business plan for their leadership project because it is so important for nurse leaders to offer a strong business case for proceeding with a new initiative, service, or program.

  3. Investigating the effects of caffeine on executive functions using traditional Stroop and a new ecologically-valid virtual reality task, the Jansari assessment of Executive Functions (JEF(©)).

    PubMed

    Soar, K; Chapman, E; Lavan, N; Jansari, A S; Turner, J J D

    2016-10-01

    Caffeine has been shown to have effects on certain areas of cognition, but in executive functioning the research is limited and also inconsistent. One reason could be the need for a more sensitive measure to detect the effects of caffeine on executive function. This study used a new non-immersive virtual reality assessment of executive functions known as JEF(©) (the Jansari Assessment of Executive Function) alongside the 'classic' Stroop Colour-Word task to assess the effects of a normal dose of caffeinated coffee on executive function. Using a double-blind, counterbalanced, within-participants procedure, 43 participants were administered either a caffeinated or decaffeinated coffee and completed the JEF(©) and Stroop tasks, as well as a subjective mood scale and blood pressure measurements, pre- and post-condition on two separate occasions a week apart. JEF(©) yields measures for eight separate aspects of executive functions, in addition to a total average score. Findings indicate that performance was significantly improved on the planning, creative thinking, and event-, time- and action-based prospective memory measures, as well as the total JEF(©) score, following caffeinated coffee relative to the decaffeinated coffee. The caffeinated beverage significantly decreased reaction times on the Stroop task, but there was no effect on Stroop interference. The results provide further support for the effects of a caffeinated beverage on cognitive functioning. In particular, the study demonstrated the ability of JEF(©) to detect the effects of caffeine across a number of executive functioning constructs, which were not shown in the Stroop task, suggesting that executive functioning improvements resulting from a 'typical' dose of caffeine may only be detected by the use of more real-world, ecologically valid tasks. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Can training in a real-time strategy video game attenuate cognitive decline in older adults?

    PubMed

    Basak, Chandramallika; Boot, Walter R; Voss, Michelle W; Kramer, Arthur F

    2008-12-01

    Declines in various cognitive abilities, particularly executive control functions, are observed in older adults. An important goal of cognitive training is to slow or reverse these age-related declines. However, opinion is divided in the literature regarding whether cognitive training can engender transfer to a variety of cognitive skills in older adults. In the current study, the authors trained older adults in a real-time strategy video game for 23.5 hr in an effort to improve their executive functions. A battery of cognitive tasks, including tasks of executive control and visuospatial skills, were assessed before, during, and after video-game training. The trainees improved significantly in the measures of game performance. They also improved significantly more than the control participants in executive control functions, such as task switching, working memory, visual short-term memory, and reasoning. Individual differences in changes in game performance were correlated with improvements in task switching. The study has implications for the enhancement of executive control processes of older adults. Copyright (c) 2009 APA, all rights reserved.

  5. Supporting Real-Time Operations and Execution through Timeline and Scheduling Aids

    NASA Technical Reports Server (NTRS)

    Marquez, Jessica J.; Pyrzak, Guy; Hashemi, Sam; Ahmed, Samia; McMillin, Kevin Edward; Medwid, Joseph Daniel; Chen, Diana; Hurtle, Esten

    2013-01-01

    Since 2003, the NASA Ames Research Center has been actively involved in researching and advancing the state of the art of planning and scheduling tools for NASA mission operations. Our planning toolkit SPIFe (Scheduling and Planning Interface for Exploration) has supported a variety of missions and field tests, scheduling activities for Mars rovers as well as crew on board the International Space Station and NASA earth analogs. The scheduled plan is the integration of all the activities for the day(s). In turn, the agents (rovers, landers, spaceships, crew) execute from this schedule while the mission support team members (e.g., flight controllers) follow the schedule during execution. Over the last couple of years, our team has begun to research and validate methods that will better support users during real-time operations and execution of scheduled activities. Our team utilizes human-computer interaction principles to research user needs, identify workflow processes, prototype software aids, and user-test them. This paper discusses three specific prototypes developed and user-tested to support real-time operations: Score Mobile, Playbook, and Mobile Assistant for Task Execution (MATE).

  6. Models of resource allocation optimization when solving the control problems in organizational systems

    NASA Astrophysics Data System (ADS)

    Menshikh, V.; Samorokovskiy, A.; Avsentev, O.

    2018-03-01

    A mathematical model is presented for optimizing the allocation of resources to reduce the time needed for management decisions, together with algorithms to solve the general problem of resource allocation. The optimization problem of choosing resources in organizational systems in order to reduce the total execution time of a job is solved. This is a complex three-level combinatorial problem, the solution of which requires solving several specific subproblems: estimating the duration of each action depending on the number of performers within the group that performs it; estimating the total execution time of all actions depending on the quantitative composition of the groups of performers; and finding a distribution of the available pool of performers among groups that minimizes the total execution time of all actions. In addition, algorithms to solve the general problem of resource allocation are proposed.

  7. A novel approach to estimate emissions from large transportation networks: Hierarchical clustering-based link-driving-schedules for EPA-MOVES using dynamic time warping measures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aziz, H. M. Abdul; Ukkusuri, Satish V.

    EPA-MOVES (Motor Vehicle Emission Simulator) is often integrated with traffic simulators to assess emission levels of large-scale urban networks with signalized intersections. High variations in speed profiles exist in the context of congested urban networks with signalized intersections. The traditional average-speed-based emission estimation technique with EPA-MOVES provides faster execution but underestimates the emissions in most cases because it ignores the speed variation at congested networks with signalized intersections. In contrast, the atomic second-by-second speed profile (i.e., the trajectory of each vehicle)-based technique provides accurate emissions at the cost of excessive computational power and time. We addressed this issue by developing a novel method to determine the link-driving-schedules (LDSs) for the EPA-MOVES tool. Our research developed a hierarchical clustering technique with dynamic time warping similarity measures (HC-DTW) to find the LDS for EPA-MOVES that is capable of producing emission estimates better than the average-speed-based technique with execution time faster than the atomic speed profile approach. We applied HC-DTW to sample data from a signalized corridor and found that it can significantly reduce computational time without compromising accuracy. The technique developed in this research can substantially contribute to the EPA-MOVES-based emission estimation process for large-scale urban transportation networks by reducing the computational time with reasonably accurate estimates. The method is highly appropriate for transportation networks with higher variation in speed, such as those with signalized intersections. Lastly, experimental results show error differences ranging from 2% to 8% for most pollutants except PM10.

  8. A novel approach to estimate emissions from large transportation networks: Hierarchical clustering-based link-driving-schedules for EPA-MOVES using dynamic time warping measures

    DOE PAGES

    Aziz, H. M. Abdul; Ukkusuri, Satish V.

    2017-06-29

    EPA-MOVES (Motor Vehicle Emission Simulator) is often integrated with traffic simulators to assess emission levels of large-scale urban networks with signalized intersections. High variations in speed profiles exist in the context of congested urban networks with signalized intersections. The traditional average-speed-based emission estimation technique with EPA-MOVES provides faster execution but underestimates the emissions in most cases because it ignores the speed variation at congested networks with signalized intersections. In contrast, the atomic second-by-second speed profile (i.e., the trajectory of each vehicle)-based technique provides accurate emissions at the cost of excessive computational power and time. We addressed this issue by developing a novel method to determine the link-driving-schedules (LDSs) for the EPA-MOVES tool. Our research developed a hierarchical clustering technique with dynamic time warping similarity measures (HC-DTW) to find the LDS for EPA-MOVES that is capable of producing emission estimates better than the average-speed-based technique with execution time faster than the atomic speed profile approach. We applied HC-DTW to sample data from a signalized corridor and found that it can significantly reduce computational time without compromising accuracy. The technique developed in this research can substantially contribute to the EPA-MOVES-based emission estimation process for large-scale urban transportation networks by reducing the computational time with reasonably accurate estimates. The method is highly appropriate for transportation networks with higher variation in speed, such as those with signalized intersections. Lastly, experimental results show error differences ranging from 2% to 8% for most pollutants except PM10.
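
    The dynamic time warping measure at the core of HC-DTW can be illustrated with the classic quadratic dynamic program below; the speed profiles are invented and the authors' clustering pipeline is not reproduced. A condensed matrix of such pairwise distances could then be handed to an off-the-shelf hierarchical clustering routine (e.g., scipy.cluster.hierarchy.linkage) to group links into a small set of representative driving schedules.

    ```python
    # Hedged sketch of the DTW distance between two second-by-second speed profiles.
    def dtw_distance(a, b):
        """Classic O(n*m) dynamic time warping distance between two sequences."""
        n, m = len(a), len(b)
        INF = float("inf")
        dp = [[INF] * (m + 1) for _ in range(n + 1)]
        dp[0][0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(a[i - 1] - b[j - 1])
                dp[i][j] = cost + min(dp[i - 1][j],      # insertion
                                      dp[i][j - 1],      # deletion
                                      dp[i - 1][j - 1])  # match
        return dp[n][m]

    # Two profiles (km/h) with the same stop-and-go shape, shifted in time:
    stop_and_go   = [0, 0, 10, 30, 50, 50, 20, 0]
    delayed_start = [0, 0, 0, 10, 30, 50, 50, 20]
    print(dtw_distance(stop_and_go, delayed_start))   # small despite the time shift
    print(dtw_distance(stop_and_go, [50] * 8))        # large vs. a constant cruise profile
    ```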

  9. Time spent by Belgian hospital pharmacists on supply disruptions and drug shortages: An exploratory study.

    PubMed

    De Weerdt, Elfi; De Rijdt, Thomas; Simoens, Steven; Casteels, Minne; Huys, Isabelle

    2017-01-01

    Supply problems of drugs are an increasing and worldwide problem, including in Belgium. Hospital pharmacists try to manage drug supply problems to minimize the impact on patient care. This study aims to quantify in a detailed manner how much time employees of 17 Belgian hospital pharmacies spend on drug supply problems. During six months, employees of Belgian hospital pharmacies filled in the daily time spent on drug supply problems using a template containing all steps which can be executed to manage drug supply problems. Additionally, Belgian hospital pharmacists were asked to report the drugs which experienced supply problems together with the solution for each problem. Hospital pharmacists spent a median of 109 minutes a week on drug supply problems, with a minimum of 40 minutes per week and a maximum of 216 minutes per week. Fifty-nine percent of the total time spent on drug supply problems was spent by hospital pharmacists and 27% by pharmacy technicians; the rest was performed by logistic or administrative personnel. About one third of the total time was invested in gathering information on the supply problem. About two thirds of the supply disruptions caused drug shortages, meaning there was a need to switch to another (generic) therapeutic alternative. For most drug shortages, a Belgian generic medicine could be found; however, in some cases the alternative had to be ordered from abroad, and for some drug shortages no alternative was available. These exploratory results on the time spent by hospital pharmacists on drug supply problems in Belgium highlight the economic impact of drug supply problems for hospital pharmacies. A fully reliable, daily updated list on the federal agencies' websites would be a major help to hospital pharmacists.

  10. Time spent by Belgian hospital pharmacists on supply disruptions and drug shortages: An exploratory study

    PubMed Central

    De Weerdt, Elfi; De Rijdt, Thomas; Simoens, Steven; Casteels, Minne; Huys, Isabelle

    2017-01-01

    Introduction: Supply problems of drugs are an increasing and worldwide problem, including in Belgium. Hospital pharmacists try to manage drug supply problems to minimize the impact on patient care. This study aims to quantify in a detailed manner how much time employees of 17 Belgian hospital pharmacies spend on drug supply problems. Methods: During six months, employees of Belgian hospital pharmacies filled in the daily time spent on drug supply problems using a template containing all steps which can be executed to manage drug supply problems. Additionally, Belgian hospital pharmacists were asked to report the drugs which experienced supply problems together with the solution for each problem. Results: Hospital pharmacists spent a median of 109 minutes a week on drug supply problems, with a minimum of 40 minutes per week and a maximum of 216 minutes per week. Fifty-nine percent of the total time spent on drug supply problems was spent by hospital pharmacists and 27% by pharmacy technicians; the rest was performed by logistic or administrative personnel. About one third of the total time was invested in gathering information on the supply problem. About two thirds of the supply disruptions caused drug shortages, meaning there was a need to switch to another (generic) therapeutic alternative. For most drug shortages, a Belgian generic medicine could be found; however, in some cases the alternative had to be ordered from abroad, and for some drug shortages no alternative was available. Conclusion: These exploratory results on the time spent by hospital pharmacists on drug supply problems in Belgium highlight the economic impact of drug supply problems for hospital pharmacies. A fully reliable, daily updated list on the federal agencies' websites would be a major help to hospital pharmacists. PMID:28350827

  11. 24 CFR 886.334 - Execution of housing assistance payments contract.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... DIRECT LOAN PROGRAM, SECTION 202 SUPPORTIVE HOUSING FOR THE ELDERLY PROGRAM AND SECTION 811 SUPPORTIVE... Market Rent or the exception rent provided in § 886.310 in effect at the time of execution of the...

  12. Post-game analysis: An initial experiment for heuristic-based resource management in concurrent systems

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.

    1987-01-01

    In concurrent systems, a major responsibility of the resource management system is to decide how the application program is to be mapped onto the multi-processor. Instead of using abstract program and machine models, a generate-and-test framework known as 'post-game analysis' that is based on data gathered during program execution is proposed. Each iteration consists of (1) (a simulation of) an execution of the program; (2) analysis of the data gathered; and (3) the proposal of a new mapping that would have a smaller execution time. These heuristics are applied to predict execution time changes in response to small perturbations applied to the current mapping. An initial experiment was carried out using simple strategies on 'pipeline-like' applications. The results obtained from four simple strategies demonstrated that for this kind of application, even simple strategies can produce acceptable speed-up with a small number of iterations.
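
    A toy version of the post-game-analysis loop is sketched below, assuming a trivial "simulator" in which the execution time is simply the heaviest per-processor load; the perturbation heuristic and all task costs are illustrative and are not the strategies evaluated in the report.

    ```python
    # Toy post-game analysis: simulate a mapping, then move one task off the most
    # loaded processor when that small perturbation is predicted to cut execution time.
    def simulate(mapping, task_cost, n_procs):
        """Stand-in 'execution': makespan = the heaviest per-processor load."""
        load = [0.0] * n_procs
        for task, proc in mapping.items():
            load[proc] += task_cost[task]
        return max(load), load

    def post_game_analysis(task_cost, n_procs, iterations=10):
        mapping = {t: i % n_procs for i, t in enumerate(task_cost)}   # initial round-robin mapping
        for _ in range(iterations):
            makespan, load = simulate(mapping, task_cost, n_procs)
            busiest, idlest = load.index(max(load)), load.index(min(load))
            best_task, best_pred = None, makespan
            for task, proc in mapping.items():
                if proc != busiest:
                    continue
                # Heuristic prediction of the makespan after moving this task.
                pred = max(load[busiest] - task_cost[task], load[idlest] + task_cost[task])
                if pred < best_pred:
                    best_task, best_pred = task, pred
            if best_task is None:      # no perturbation predicted to help: stop iterating
                break
            mapping[best_task] = idlest
        return mapping, simulate(mapping, task_cost, n_procs)[0]

    costs = {"t1": 4.0, "t2": 3.0, "t3": 3.0, "t4": 2.0, "t5": 2.0, "t6": 1.0}
    print(post_game_analysis(costs, n_procs=2))
    ```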

  13. Executive Functions and Prefrontal Cortex: A Matter of Persistence?

    PubMed Central

    Ball, Gareth; Stokes, Paul R.; Rhodes, Rebecca A.; Bose, Subrata K.; Rezek, Iead; Wink, Alle-Meije; Lord, Louis-David; Mehta, Mitul A.; Grasby, Paul M.; Turkheimer, Federico E.

    2011-01-01

    Executive function is thought to originate from the dynamics of frontal cortical networks. We examined the dynamic properties of the blood oxygen level dependent time-series measured with functional MRI (fMRI) within the prefrontal cortex (PFC) to test the hypothesis that temporally persistent neural activity underlies performance in three tasks of executive function. A numerical estimate of signal persistence, the Hurst exponent, postulated to represent the coherent firing of cortical networks, was determined and correlated with task performance. Increasing persistence in the lateral PFC was shown to correlate with improved performance during an n-back task. Conversely, we observed a correlation between persistence and increasing commission error – indicating a failure to inhibit a prepotent response – during a Go/No-Go task. We propose that persistence within the PFC reflects dynamic network formation, and these findings underline the importance of frequency analysis of fMRI time-series in the study of executive functions. PMID:21286223
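
    For illustration only, a basic rescaled-range (R/S) estimate of the Hurst exponent is sketched below; the study's own estimator and BOLD preprocessing are not reproduced, and the synthetic series are merely meant to show that persistent signals yield H well above the roughly 0.5 expected for white noise.

    ```python
    # Hedged sketch: rescaled-range (R/S) estimate of the Hurst exponent of a time series.
    import numpy as np

    def hurst_rs(x, min_chunk=8):
        """Estimate H by regressing log(R/S) on log(window size)."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        sizes = [s for s in (2 ** k for k in range(3, 16)) if min_chunk <= s <= n // 2]
        log_s, log_rs = [], []
        for s in sizes:
            rs_vals = []
            for start in range(0, n - s + 1, s):
                chunk = x[start:start + s]
                dev = np.cumsum(chunk - chunk.mean())   # cumulative deviation from the mean
                spread = dev.max() - dev.min()          # range of the cumulative deviation
                sd = chunk.std()
                if sd > 0:
                    rs_vals.append(spread / sd)
            if rs_vals:
                log_s.append(np.log(s))
                log_rs.append(np.log(np.mean(rs_vals)))
        slope, _ = np.polyfit(log_s, log_rs, 1)
        return slope    # near 0.5 for white noise, closer to 1 for persistent signals

    rng = np.random.default_rng(0)
    white = rng.standard_normal(4096)
    persistent = np.cumsum(white)    # a random walk is a strongly persistent signal
    print(round(hurst_rs(white), 2), round(hurst_rs(persistent), 2))
    ```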

  14. Acute stress affects prospective memory functions via associative memory processes.

    PubMed

    Szőllősi, Ágnes; Pajkossy, Péter; Demeter, Gyula; Kéri, Szabolcs; Racsmány, Mihály

    2018-01-01

    Recent findings suggest that acute stress can improve the execution of delayed intentions (prospective memory, PM). However, it is unclear whether this improvement can be explained by altered executive control processes or by altered associative memory functioning. To investigate this issue, we used physical-psychosocial stressors to induce acute stress in laboratory settings. Then participants completed event- and time-based PM tasks requiring the different contribution of control processes and a control task (letter fluency) frequently used to measure executive functions. According to our results, acute stress had no impact on ongoing task performance, time-based PM, and verbal fluency, whereas it enhanced event-based PM as measured by response speed for the prospective cues. Our findings indicate that, here, acute stress did not affect executive control processes. We suggest that stress affected event-based PM via associative memory processes. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Power-Aware Compiler Controllable Chip Multiprocessor

    NASA Astrophysics Data System (ADS)

    Shikano, Hiroaki; Shirako, Jun; Wada, Yasutaka; Kimura, Keiji; Kasahara, Hironori

    A power-aware compiler controllable chip multiprocessor (CMP) is presented and its performance and power consumption are evaluated with the optimally scheduled advanced multiprocessor (OSCAR) parallelizing compiler. The CMP is equipped with power control registers that change clock frequency and power supply voltage to functional units including processor cores, memories, and an interconnection network. The OSCAR compiler carries out coarse-grain task parallelization of programs and reduces power consumption using architectural power control support and the compiler's power saving scheme. The performance evaluation shows that MPEG-2 encoding on the proposed CMP with four CPUs results in 82.6% power reduction in real-time execution mode with a deadline constraint on its sequential execution time. Furthermore, MP3 encoding on a heterogeneous CMP with four CPUs and four accelerators results in 53.9% power reduction at 21.1-fold speed-up in performance against its sequential execution in the fastest execution mode.

  16. Checkpoint-based forward recovery using lookahead execution and rollback validation in parallel and distributed systems. Ph.D. Thesis, 1992

    NASA Technical Reports Server (NTRS)

    Long, Junsheng

    1994-01-01

    This thesis studies a forward recovery strategy using checkpointing and optimistic execution in parallel and distributed systems. The approach uses replicated tasks executing on different processors for forward recovery and checkpoint comparison for error detection. To reduce overall redundancy, this approach employs lower static redundancy in the common error-free situation to detect errors than the standard N-Module Redundancy (NMR) scheme does to mask errors. For the rare occurrence of an error, this approach uses some extra redundancy for recovery. To reduce the run-time recovery overhead, look-ahead processes are used to advance computation speculatively and a rollback process is used to produce a diagnosis for correct look-ahead processes without rollback of the whole system. Both analytical and experimental evaluations have shown that this strategy can provide a nearly error-free execution time, even under faults, with a lower average redundancy than NMR.

  17. Bypassing Races in Live Applications with Execution Filters

    DTIC Science & Technology

    2010-01-01

    LOOM creates the needed locks and semaphores on demand. The first time a lock or semaphore is referenced by one of the inserted synchronization ...runtime. LOOM provides a flexible and safe language for developers to write execution filters that explicitly synchronize code. It then uses an...first compile their application with LOOM. At runtime, to work around a race, an application developer writes an execution filter that synchronizes the

  18. Reconfiguration in Robust Distributed Real-Time Systems Based on Global Checkpoints

    DTIC Science & Technology

    1991-12-01

    achieved by utilizing distributed systems in which a single application program executes on multiple processors, connected to a network. The distributed nature of such systems makes it possible to ...resident at every node. However, the responsibility for execution of a particular function is assigned to only one node in this framework. This function

  19. Decision PBL: A 4-year retrospective case study of the use of virtual patients in problem-based learning.

    PubMed

    Ellaway, Rachel H; Poulton, Terry; Jivram, Trupti

    2015-01-01

    In 2009, St George's University of London (SGUL) replaced their paper-based problem-based learning (PBL) cases with virtual patients for intermediate-level undergraduate students. This involved the development of Decision-Problem-Based Learning (D-PBL), a variation on progressive-release PBL that uses virtual patients instead of paper cases, and focuses on patient management decisions and their consequences. Using a case study method, this paper describes four years of developing and running D-PBL at SGUL from individual activities up to the ways in which D-PBL functioned as an educational system. A number of broad issues were identified: the importance of debates and decision-making in making D-PBL activities engaging and rewarding; the complexities of managing small group dynamics; the time taken to complete D-PBL activities; the changing role of the facilitator; and the erosion of the D-PBL process over time. A key point in understanding this work is the construction and execution of the D-PBL activity, as much of the value of this approach arises from the actions and interactions of students, their facilitators and the virtual patients rather than from the design of the virtual patients alone. At a systems level D-PBL needs to be periodically refreshed to retain its effectiveness.

  20. Assessment of executive functions in patients with obsessive compulsive disorder by NeuroVR.

    PubMed

    La Paglia, Filippo; La Cascia, Caterina; Rizzo, Rosalinda; Riva, Giuseppe; La Barbera, Daniele

    2012-01-01

    Executive functions are often impaired in obsessive-compulsive disorder (OCD). We used a Virtual Reality version of the Multiple Errands Test (VMET) - developed using the free NeuroVR software (http://www.neurovr.org) - to evaluate executive functions in daily life in 10 OCD patients and 10 controls. It is performed in a shopping setting where there are items to be bought and information to be obtained. The execution time for the whole task was higher in patients with OCD compared to controls, suggesting that patients with OCD need more time for planning than controls. The same difference was found in the partial errors during the task. Furthermore, the mean rank for and for interpretation failures is higher for controls, while the values for divided attention and self-correction seem to be lower in controls. We think that obsessive patients tend to work with greater diligence and observance of rules than controls. In conclusion, these results provide initial support for the feasibility of VMET as an assessment tool for executive functions. Specifically, the significant correlation found between the VMET and the neuropsychological battery supports the ecological validity of VMET as an instrument for the evaluation of executive functions in patients with OCD.

  1. Cognitive Performance in Suicidal Depressed Elderly: Preliminary Report

    PubMed Central

    Dombrovski, Alexandre Y.; Butters, Meryl A.; Reynolds, Charles F.; Houck, Patricia R.; Clark, Luke; Mazumdar, Sati; Szanto, Katalin

    2009-01-01

    Objective: Deficits in executive functions may play an important role in late-life suicide; however, the association is understudied. This study examined cognitive function in general and executive functioning specifically in depressed elderly with and without suicidal ideation and attempts. Design: Case-control study. Setting: University-affiliated psychiatric hospital. Participants: We compared 32 suicidal depressed participants aged 60 and older with 32 non-suicidal depressed participants equated for age, education, and gender. Measurements: We assessed global cognitive function and executive function with the Dementia Rating Scale (DRS) and the Executive Interview (EXIT25), respectively. Results: Suicidal and non-suicidal depressed groups were comparable in terms of severity of depression and burden of physical illness. Suicidal participants performed worse on the EXIT25 and on the DRS total scale, as well as on the Memory and Attention subscales. The differences were not explained by the presence of dementia, substance use, medication exposure, or brain injury from suicide attempts. Conclusions: Poor performance on tests of executive function, attention, and memory is associated with suicidal behavior in late-life depression. PMID:18239196

  2. RBioCloud: A Light-Weight Framework for Bioconductor and R-based Jobs on the Cloud.

    PubMed

    Varghese, Blesson; Patel, Ishan; Barker, Adam

    2015-01-01

    Large-scale ad hoc analytics of genomic data is popular using the R programming language, supported by over 700 software packages provided by Bioconductor. More recently, analytical jobs are benefitting from on-demand computing and storage, their scalability and their low maintenance cost, all of which are offered by the cloud. While biologists and bioinformaticists can take an analytical job and execute it on their personal workstations, it remains challenging to seamlessly execute the job on the cloud infrastructure without extensive knowledge of the cloud dashboard. This paper explores how analytical jobs can be executed on the cloud with minimum effort, and how both the resources and the data required by the job can be managed. An open-source light-weight framework for executing R scripts using Bioconductor packages, referred to as 'RBioCloud', is designed and developed. RBioCloud offers a set of simple command-line tools for managing the cloud resources, the data and the execution of the job. Three biological test cases validate the feasibility of RBioCloud. The framework is available from http://www.rbiocloud.com.

  3. Dependability analysis of parallel systems using a simulation-based approach. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Sawyer, Darren Charles

    1994-01-01

    The analysis of dependability in large, complex, parallel systems executing real applications or workloads is examined in this thesis. To effectively demonstrate the wide range of dependability problems that can be analyzed through simulation, the analysis of three case studies is presented. For each case, the organization of the simulation model used is outlined, and the results from simulated fault injection experiments are explained, showing the usefulness of this method in dependability modeling of large parallel systems. The simulation models are constructed using DEPEND and C++. Where possible, methods to increase dependability are derived from the experimental results. Another interesting facet of all three cases is the presence of some kind of workload or application executing in the simulation while faults are injected. This provides a completely new dimension to this type of study, not possible to model accurately with analytical approaches.

  4. Preliminary Investigation of Time Remaining Display on the Computer-based Emergency Operating Procedure

    NASA Astrophysics Data System (ADS)

    Suryono, T. J.; Gofuku, A.

    2018-02-01

    One of the important things in the mitigation of nuclear power plant accidents is time management. The accidents should be resolved as soon as possible in order to prevent core melting and the release of radioactive material to the environment. In this case, operators should follow the emergency operating procedure related to the accident, step by step and within the allowable time. Nowadays, advanced main control rooms are equipped with computer-based procedures (CBPs), which make it easier for operators to do their tasks of monitoring and controlling the reactor. However, most CBPs do not include a time remaining display feature, which would inform operators of the time available for them to execute procedure steps and warn them if they reach the time limit. Furthermore, such a feature would increase the awareness of operators of their current situation in the procedure. This paper investigates this issue. A simplified emergency operating procedure (EOP) for a steam generator tube rupture (SGTR) accident in a PWR plant is applied. In addition, the sequence of actions in each step of the procedure is modelled using multilevel flow modelling (MFM) and influence propagation rules. The prediction of the action time for each step is obtained from similar past accident cases and Support Vector Regression. The derived time is then processed and displayed on a CBP user interface.
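
    The time-prediction step can be illustrated with an off-the-shelf Support Vector Regression model fitted to execution times recorded for similar past cases; the features, training values, and query below are invented for the example and are not taken from the paper.

    ```python
    # Illustrative sketch: predict the time needed for a procedure step from past cases.
    import numpy as np
    from sklearn.svm import SVR

    # Hypothetical training data: [step index, number of sub-actions, operator experience (yrs)]
    X = np.array([[1, 3, 5], [1, 3, 12], [2, 5, 5], [2, 5, 12], [3, 2, 8], [4, 6, 3]])
    y = np.array([120.0, 90.0, 300.0, 240.0, 80.0, 420.0])   # observed step times (seconds)

    model = SVR(kernel="rbf", C=100.0, epsilon=5.0).fit(X, y)

    # Predicted time for step 2, five sub-actions, operator with 8 years of experience:
    predicted = model.predict(np.array([[2, 5, 8]]))[0]
    print(f"predicted step time: {predicted:.0f} s")
    ```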

  5. Executive and Perceptual Distraction in Visual Working Memory

    PubMed Central

    2017-01-01

    The contents of visual working memory are likely to reflect the influence of both executive control resources and information present in the environment. We investigated whether executive attention is critical in the ability to exclude unwanted stimuli by introducing concurrent potentially distracting irrelevant items to a visual working memory paradigm, and manipulating executive load using simple or more demanding secondary verbal tasks. Across 7 experiments varying in presentation format, timing, stimulus set, and distractor number, we observed clear disruptive effects of executive load and visual distraction, but relatively minimal evidence supporting an interactive relationship between these factors. These findings are in line with recent evidence using delay-based interference, and suggest that different forms of attentional selection operate relatively independently in visual working memory. PMID:28414499

  6. Bilingualism, executive control, and age at diagnosis among people with early-stage Alzheimer's disease in Wales.

    PubMed

    Clare, Linda; Whitaker, Christopher J; Craik, Fergus I M; Bialystok, Ellen; Martyr, Anthony; Martin-Forbes, Pamela A; Bastable, Alexandra J M; Pye, Kirstie L; Quinn, Catherine; Thomas, Enlli M; Gathercole, Virginia C Mueller; Hindle, John V

    2016-09-01

    The observation of a bilingual advantage in executive control tasks involving inhibition and management of response conflict suggests that being bilingual might contribute to increased cognitive reserve. In support of this, recent evidence indicates that bilinguals develop Alzheimer's disease (AD) later than monolinguals, and may retain an advantage in performance on executive control tasks. We compared age at the time of receiving an AD diagnosis in bilingual Welsh/English speakers (n = 37) and monolingual English speakers (n = 49), and assessed the performance of bilinguals (n = 24) and monolinguals (n = 49) on a range of executive control tasks. There was a non-significant difference in age at the time of diagnosis, with bilinguals being on average 3 years older than monolinguals, but bilinguals were also significantly more cognitively impaired at the time of diagnosis. There were no significant differences between monolinguals and bilinguals in performance on executive function tests, but bilinguals appeared to show relative strengths in the domain of inhibition and response conflict. Bilingual Welsh/English speakers with AD do not show a clear advantage in executive function over monolingual English speakers, but may retain some benefits in inhibition and management of response conflict. There may be a delay in onset of AD in Welsh/English bilinguals, but if so, it is smaller than that found in some other clinical populations. In this Welsh sample, bilinguals with AD came to the attention of services later than monolinguals, and reasons for this pattern could be explored further. © 2014 The British Psychological Society.

  7. The longitudinal development of social and executive functions in late adolescence and early adulthood

    PubMed Central

    Taylor, Sophie J.; Barker, Lynne A.; Heavey, Lisa; McHale, Sue

    2015-01-01

    Our earlier work suggests that executive functions and social cognition show protracted development into late adolescence and early adulthood (Taylor et al., 2013). However, it remains unknown whether these functions develop linearly or non-linearly, corresponding to dynamic changes in white matter density across these age ranges. Executive functions are particularly in demand during the transition to independence and autonomy associated with this age range (Ahmed and Miller, 2011). Previous research examining executive function (Romine and Reynolds, 2005) and social cognition (Dumontheil et al., 2010a) in late adolescence has utilized a cross-sectional design. The current study employed a longitudinal design, with 58 participants aged 17, 18, and 19 years completing social cognition and executive function tasks, the Wechsler Abbreviated Scale of Intelligence (Wechsler, 1999), the Positive and Negative Affect Schedule (Watson et al., 1988), and the Hospital Anxiety and Depression Scale (Zigmond and Snaith, 1983) at Time 1, with follow-up testing 12–16 months later. Inhibition, rule detection, strategy generation and planning executive functions, and emotion recognition with dynamic stimuli, showed longitudinal development between time points. Self-report empathy and emotion recognition using visual static and auditory stimuli were stable by age 17, whereas concept formation declined between time points. The protracted development of some functions may reflect continued brain maturation into late adolescence and early adulthood, including synaptic pruning (Sowell et al., 2001) and changes to functional connectivity (Stevens et al., 2007), and/or environmental change. Clinical implications, such as assessing the effectiveness of rehabilitation following head injury, are discussed. PMID:26441579

  8. Evaluating executive function in patients with temporal lobe epilepsy using the frontal assessment battery.

    PubMed

    Agah, Elmira; Asgari-Rad, Nasima; Ahmadi, Mona; Tafakhori, Abbas; Aghamollaii, Vajiheh

    2017-07-01

    Previous studies have demonstrated executive dysfunction in patients with temporal lobe epilepsy (TLE). The frontal assessment battery (FAB) is a short neuropsychological tool developed for the assessment of frontal lobe function in a clinical setting. The aim of the present study is to evaluate the clinical utility of the FAB for detection of executive dysfunction in TLE patients. Forty-eight TLE patients and 48 sex- and age-matched healthy controls participated in this study. Compared to healthy participants, the total FAB score was significantly lower among the TLE patients. TLE patients performed significantly worse on the mental flexibility, motor programming, sensitivity to interference, and inhibitory control tasks. The duration of time that had passed since the last seizure was the only significant predictor of FAB score, and patients who had a seizure less than a week before the evaluation had significantly lower FAB scores. The number of antiepileptic drugs (AEDs) did not influence executive function in this study; however, sodium valproate was found to affect mental flexibility. In conclusion, impaired executive function is common in TLE patients, and we suggest that the FAB is a clinically applicable tool to monitor it. Moreover, we found that the time since the last seizure is a significant predictor of executive functioning, and that patients' performance may be worse for up to seven days after a seizure. We also recommend that clinicians evaluate the cognitive adverse effects of AEDs, especially sodium valproate, which was found to affect mental flexibility in this study. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. A common neural code for similar conscious experiences in different individuals

    PubMed Central

    Naci, Lorina; Cusack, Rhodri; Anello, Mimma; Owen, Adrian M.

    2014-01-01

    The interpretation of human consciousness from brain activity, without recourse to speech or action, is one of the most provoking and challenging frontiers of modern neuroscience. We asked whether there is a common neural code that underpins similar conscious experiences, which could be used to decode these experiences in the absence of behavior. To this end, we used richly evocative stimulation (an engaging movie) portraying real-world events to elicit a similar conscious experience in different people. Common neural correlates of conscious experience were quantified and related to measurable, quantitative and qualitative, executive components of the movie through two additional behavioral investigations. The movie’s executive demands drove synchronized brain activity across healthy participants’ frontal and parietal cortices in regions known to support executive function. Moreover, the timing of activity in these regions was predicted by participants’ highly similar qualitative experience of the movie’s moment-to-moment executive demands, suggesting that synchronization of activity across participants underpinned their similar experience. Thus we demonstrate, for the first time to our knowledge, that a neural index based on executive function reliably predicted every healthy individual’s similar conscious experience in response to real-world events unfolding over time. This approach provided strong evidence for the conscious experience of a brain-injured patient, who had remained entirely behaviorally nonresponsive for 16 y. The patient’s executive engagement and moment-to-moment perception of the movie content were highly similar to that of every healthy participant. These findings shed light on the common basis of human consciousness and enable the interpretation of conscious experience in the absence of behavior. PMID:25225384

  10. Concurrent and Short-term Prospective Relations among Neurocognitive Functioning, Coping, and Depressive Symptoms in Youth

    PubMed Central

    Evans, Lindsay D.; Kouros, Chrystyna D.; Samanez-Larkin, Silvia; Garber, Judy

    2016-01-01

    Objective The present short-term longitudinal study examined the concurrent and prospective relations among executive functioning (i.e., working memory and cognitive flexibility), coping (primary and secondary control coping), and depressive symptoms in children. Method Participants were 192 children between 9 and 15 years old (mean age = 12.36 years, SD = 1.77) recruited from the community. Youth were individually administered neuropsychological measures of executive functioning and intelligence, and completed self-report measures of executive dysfunction, coping, and depressive symptoms in small groups; the latter two measures were completed again four months later (Time 2). Linear regression analyses were used to examine direct associations among executive functions, coping, and depressive symptoms, and a bootstrapping procedure was used to test indirect effects of executive functioning on depressive symptoms through coping. Results Significant prospective relations were found between working memory measured at Time 1 (T1) and both primary and secondary control coping measured at Time 2 (T2), controlling for T1 coping. T1 cognitive flexibility significantly predicted T2 secondary control coping, controlling for T1 coping. Working memory deficits significantly predicted increases in depressive symptoms four months later, controlling for T1 depressive symptoms. Bootstrap analyses revealed that primary and secondary control coping each partially mediated the relation between working memory and depressive symptoms; secondary control coping partially mediated the relation between cognitive flexibility and depressive symptoms. Conclusion Coping may be one pathway through which deficits in executive functioning contribute to children's symptoms of depression. PMID:25651455

  11. Rule-Based Runtime Verification

    NASA Technical Reports Server (NTRS)

    Barringer, Howard; Goldberg, Allen; Havelund, Klaus; Sen, Koushik

    2003-01-01

    We present a rule-based framework for defining and implementing finite trace monitoring logics, including future and past time temporal logic, extended regular expressions, real-time logics, interval logics, forms of quantified temporal logics, and so on. Our logic, EAGLE, is implemented as a Java library and involves novel techniques for rule definition, manipulation and execution. Monitoring is done on a state-by-state basis, without storing the execution trace.
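    EAGLE itself is a Java library; as a loose, language-agnostic illustration of state-by-state monitoring without storing the execution trace, the sketch below checks a simple past-time property ("every ack must be preceded by a req") while carrying only a constant-size summary state between events. The event names and monitor class are invented for the example.

```python
class PrecededByMonitor:
    """Checks the past-time property: every 'ack' must have been
    preceded (at some earlier point) by a 'req'. Only O(1) state is
    kept, so the execution trace never has to be stored."""

    def __init__(self):
        self.seen_req = False
        self.violated = False

    def step(self, event: str) -> None:
        # Called once per observed state/event, in trace order.
        if event == "req":
            self.seen_req = True
        elif event == "ack" and not self.seen_req:
            self.violated = True

# Example: feed events one at a time, as they are produced.
monitor = PrecededByMonitor()
for event in ["start", "req", "ack", "ack"]:
    monitor.step(event)
print("property violated:", monitor.violated)  # False for this trace
```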

  12. Repetitive thinking, executive functioning, and depressive mood in the elderly.

    PubMed

    Philippot, Pierre; Agrigoroaei, Stefan

    2017-11-01

    Previous findings and the depressive-executive dysfunction hypothesis suggest that the established association between executive functioning and depression is accounted for by repetitive thinking. Investigating the association between executive functioning, repetitive thinking, and depressive mood, the present study empirically tested this mediational model in a sample of older adults, while focusing on both concrete and abstract repetitive thinking. This latter distinction is important given the potential protective role of concrete repetitive thinking, in contrast to the depletive effect of abstract repetitive thinking. A sample of 43 elderly volunteers, between 75 and 95 years of age, completed tests of executive functioning (the Stroop test, the Trail Making test, and the Fluency test) and questionnaires on repetitive thinking and depression. Positive correlations were observed between abstract repetitive thinking and depressive mood, and between concrete repetitive thinking and executive functioning; a negative correlation was observed between depressive mood and executive functioning. Further, mediation analysis showed that the relation between executive functioning and depressive mood was mediated by abstract repetitive thinking. The present data provide, for the first time, empirical support for the depressive-executive dysfunction hypothesis: a lack of executive resources would favor a mode of abstract repetitive thinking, which in turn would deplete mood. This suggests that clinical interventions targeting depression in the elderly should take into consideration repetitive thinking modes and the executive resources needed to disengage from rumination.

  13. Prospective memory in multiple sclerosis: The impact of cue distinctiveness and executive functioning.

    PubMed

    Dagenais, Emmanuelle; Rouleau, Isabelle; Tremblay, Alexandra; Demers, Mélanie; Roger, Élaine; Jobin, Céline; Duquette, Pierre

    2016-11-01

    Prospective memory (PM), the ability to remember to do something at the appropriate time in the future, is crucial in everyday life. One way to improve PM performance is to increase the salience of a cue announcing that it is time to act. Multiple sclerosis (MS) patients often report PM failures and there is growing evidence of PM deficits among this population. However, such deficits are poorly characterized and their relation to cognitive status remains unclear. To better understand PM deficits in MS patients, this study investigated the impact of cue salience on PM, and its relation to retrospective memory (RM) and executive deficits. Thirty-nine (39) MS patients were compared to 18 healthy controls on a PM task modulating cue salience during an ongoing general knowledge test. MS patients performed worse than controls on the PM task, regardless of cue salience. MS patients' executive functions contributed significantly to the variance in PM performance, whereas age, education and RM did not. Interestingly, low- and high-executive patients' performance differed when the cue was not salient, but not when it was, suggesting that low-executive MS patients benefited more from cue salience. These findings add to the growing evidence of PM deficits in MS and highlight the contribution of executive functions to certain aspects of PM. In low-executive MS patients, high cue salience improves PM performance by reducing the detection threshold and need for environmental monitoring. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Realisation of a joint consumer engagement strategy in the Nepean Blue Mountains region.

    PubMed

    Blignault, Ilse; Aspinall, Diana; Reay, Lizz; Hyman, Kay

    2017-02-15

    Ensuring consumer engagement at different levels of the health system - direct care, organisational design and governance and policy - has become a strategic priority. This case study explored, through interviews with six purposively selected 'insiders' and document review, how one Medicare Local (now a Primary Health Network, PHN) and Local Health District worked together with consumers, to establish a common consumer engagement structure and mechanisms to support locally responsive, integrated and consumer-centred services. The two healthcare organisations worked as partners across the health system, sharing ownership and responsibility. Critical success factors included a consumer champion working with other highly motivated consumers concerned with improving the health system, a budget, and ongoing commitment from the Medicare Local or PHN and the Local Health District at executive and board level. Shared boundaries were an enormous advantage. Activities were jointly planned and executed, with consumer participation paramount. Training and mentoring enhanced consumer capacity and confidence. Bringing everyone on board and building on existing structures required time, effort and resources. The initiative produced immediate and lasting benefits, with consumer engagement now embedded in organisational governance and practice.

  15. Executive defensive. As the Scrushy case finally goes before an Alabama jury, the Sarbanes-Oxley Act and its impact are also on trial.

    PubMed

    Mantone, Joseph

    2005-01-31

    With the trial of Richard Scrushy, left, finally under way in Alabama, it isn't just the former HealthSouth executive being scrutinized. It's the first courtroom test for the Sarbanes-Oxley Act of 2002, which holds CEOs accountable for false financial statements. Scrushy's defense attorneys have already begun laying the groundwork to blame underlings for the 2.64 billion dollar accounting fraud.

  16. Toward an Executive Origin for Acquired Phonological Dyslexia: A Case of Specific Deficit of Context-Sensitive Grapheme-to-Phoneme Conversion Rules

    PubMed Central

    Auclair-Ouellet, Noémie; Fossard, Marion; St-Pierre, Marie-Catherine; Macoir, Joël

    2013-01-01

    Phonological dyslexia is a written language disorder characterized by poor reading of nonwords when compared with relatively preserved ability in reading real words. In this study, we report the case of FG, a 74-year-old man with phonological dyslexia. The nature and origin of his reading impairment were assessed using tasks involving activation and explicit manipulation of phonological representations as well as reading of words and nonwords in which the nature and complexity of grapheme-to-phoneme conversion rules (GPC rules) were manipulated. FG also underwent an extensive neuropsychological assessment battery in which he showed impaired performance in tests exploring verbal working memory and executive functions. FG showed no phonological impairment, and his performance was also largely unimpaired for reading words, with no effect of concreteness, grammatical class, morphological complexity, length or nature and complexity of the GPC rules. However, he showed substantial difficulties when asked to read nonwords with contextual GPC rules. The contribution of FG’s executive deficits to his performance in reading is discussed. PMID:22713417

  17. Integration of the Reconfigurable Self-Healing eDNA Architecture in an Embedded System

    NASA Technical Reports Server (NTRS)

    Boesen, Michael Reibel; Keymeulen, Didier; Madsen, Jan; Lu, Thomas; Chao, Tien-Hsin

    2011-01-01

    In this work we describe the first real world case study for the self-healing eDNA (electronic DNA) architecture by implementing the control and data processing of a Fourier Transform Spectrometer (FTS) on an eDNA prototype. For this purpose the eDNA prototype has been ported from a Xilinx Virtex 5 FPGA to an embedded system consisting of a PowerPC and a Xilinx Virtex 5 FPGA. The FTS instrument features a novel liquid crystal waveguide, which consequently eliminates all moving parts from the instrument. The addition of the eDNA architecture to do the control and data processing has resulted in a highly fault-tolerant FTS instrument. The case study has shown that the early stage prototype of the autonomous self-healing eDNA architecture is expensive in terms of execution time.

  18. Analysis of Interactive Graphics Display Equipment for an Automated Photo Interpretation System.

    DTIC Science & Technology

    1982-06-01

    System provides the hardware and software for a range of graphics processor tasks. The IMAGE System employs the RSX-11M real-time operating system in... One hard copy unit serves up to four work stations. The executive program of the IMAGE system is the DEC RSX-11M real-time operating system. In... picture controller. The PDP 11/34 executes programs concurrently under the RSX-11M real-time operating system. Each graphics program consists of a

  19. Intelligent Rover Execution for Detecting Life in the Atacama Desert

    NASA Technical Reports Server (NTRS)

    Baskaran, Vijayakumar; Muscettola, Nicola; Rijsman, David; Plaunt, Chris; Fry, Chuck

    2006-01-01

    On-board supervisory execution is crucial for the deployment of more capable and autonomous remote explorers. Planetary science is considering robotic explorers operating for long periods of time without ground supervision while interacting with a changing and often hostile environment. Effective and robust operations require on-board supervisory control with a high level of awareness of the principles of functioning of the environment and of the numerous internal subsystems that need to be coordinated. We describe an on-board rover executive that was deployed on a rover as part of the "Limits of Life in the Atacama Desert (LITA)" field campaign sponsored by the NASA ASTEP program. The executive was built using the Intelligent Distributed Execution Architecture (IDEA), an execution framework that uses model-based and plan-based supervisory control as its fundamental computational paradigm. We present the results of the third field experiment conducted in the Atacama Desert (Chile) in August-October 2005.

  20. Indochinese Refugees in America: Profiles of Five Communities. A Case Study. Executive Seminar in National and International Affairs (22nd).

    ERIC Educational Resources Information Center

    Gim, Wever; Litwin, Tybel

    Five case studies describe experiences in the resettlement of Indochinese refugees in Albuquerque, New Mexico; San Diego, California; Grand Rapids, Michigan; Minneapolis-Saint Paul, Minnesota; and Des Moines, Iowa. The case studies focus on local government and community attitudes toward the refugees; patterns of resettlement; and the nature and…

  1. The SUCCESS model for laboratory performance and execution of rapid molecular diagnostics in patients with sepsis.

    PubMed

    Dekmezian, Mhair; Beal, Stacy G; Damashek, Mary Jane; Benavides, Raul; Dhiman, Neelam

    2015-04-01

    Successful performance and execution of rapid diagnostics in a clinical laboratory hinges heavily on careful validation, accurate and timely communication of results, and real-time quality monitoring. Laboratories must develop strategies to integrate diagnostics with stewardship and evidence-based clinical practice guidelines. We present a collaborative SUCCESS model for execution and monitoring of rapid sepsis diagnostics to facilitate timely treatment. Six months after execution of the Verigene Gram-Positive Blood Culture (BC-GP) and the AdvanDx PNA-FISH assays, data were collected on 579 and 28 episodes of bacteremia and fungemia, respectively. Clinical testing was executed using a SUCCESS model comprising the following components: stewardship, utilization of resources, core strategies, concierge services, education, support, and surveillance. Stewardship needs were identified by evaluating the specialty services benefiting from new testing. Utilization of resources was optimized by reviewing current treatment strategies and antibiogram and formulary options. Core strategies consisted of input from infectious disease leadership, pharmacy, and laboratory staff. Concierge services included automated Micro-eUpdate and physician-friendly actionable reports. Education modules were user-specific, and support was provided through a dedicated 24/7 microbiology hotline. Surveillance was performed through daily audits by the director. Using the SUCCESS model, the turnaround time for the detailed report with actionable guidelines to the physician was ∼3 hours from the time of culture positivity. The overall correlation between rapid methods and culture was 94% (546/579). Discrepant results were predominantly contaminants such as coagulase-negative staphylococci or viridans streptococci in mixed cultures. SUCCESS is a cost-effective and easily adaptable model for clinical laboratories with limited stewardship resources.

  2. 20 strategies for marketing your managed care plan.

    PubMed

    Firshein, J

    1996-01-01

    In today's fiercely competitive managed care marketplace, healthcare executives must find a way to set their plans apart from the competition and build a sufficient customer base. At the same time, they must confront a growing anti-managed care backlash among a wary and confused public. Healthcare executive magazine talked with managed care experts to gather their views on key strategies to help executives meet both of these challenges. Here's what they suggest.

  3. Star ratings. Stars of wonder.

    PubMed

    Dawes, David

    2002-09-12

    Analysis of trusts that changed their star-rating over the past two years indicates that a change of chief executive was not a significant factor. The length of time in post and the experience of the chief executive were also insignificant. This has serious implications for the theory behind franchising and the evaluation of franchised trusts. Holding chief executives to account for the organisation's performance within their first 12 months is unlikely to be effective.

  4. Mental Rotation of Tactical Instruction Displays Affects Information Processing Demand and Execution Accuracy in Basketball.

    PubMed

    Koopmann, Till; Steggemann-Weinrich, Yvonne; Baumeister, Jochen; Krause, Daniel

    2017-09-01

    In sports games, coaches often use tactic boards to present tactical instructions during time-outs (e.g., 20 s to 60 s in basketball). Instructions should be presented in a way that enables fast and errorless information processing for the players. The aim of this study was to test the effect of different orientations of visual tactical displays on observation time and execution performance. High affordances in visual-spatial transformation (e.g., mental rotation processes) might impede information processing and might decrease execution performance with regard to the instructed playing patterns. In a within-subjects design with 1 factor, 10 novice students were instructed with visual tactical instructions of basketball playing patterns with different orientations either showing the playing pattern with low spatial disparity to the players' on-court perspective (basket on top) or upside down (basket on bottom). The self-chosen time for watching the pattern before execution was significantly shorter and spatial accuracy in pattern execution was significantly higher when the instructional perspective and the real perspective on the basketball court had a congruent orientation. The effects might be explained by interfering mental rotation processes that are necessary to transform the instructional perspective into the players' actual perspective while standing on the court or imagining themselves standing on the court. According to these results, coaches should align their tactic boards to their players' on-court viewing perspective.

  5. Mission Data System Java Edition Version 7

    NASA Technical Reports Server (NTRS)

    Reinholtz, William K.; Wagner, David A.

    2013-01-01

    The Mission Data System framework defines closed-loop control system abstractions from State Analysis including interfaces for state variables, goals, estimators, and controllers that can be adapted to implement a goal-oriented control system. The framework further provides an execution environment that includes a goal scheduler, execution engine, and fault monitor that support the expression of goal network activity plans. Using these frameworks, adapters can build a goal-oriented control system where activity coordination is verified before execution begins (plan time), and continually during execution. Plan failures including violations of safety constraints expressed in the plan can be handled through automatic re-planning. This version optimizes a number of key interfaces and features to minimize dependencies, performance overhead, and improve reliability. Fault diagnosis and real-time projection capabilities are incorporated. This version enhances earlier versions primarily through optimizations and quality improvements that raise the technology readiness level. Goals explicitly constrain system states over explicit time intervals to eliminate ambiguity about intent, as compared to command-oriented control that only implies persistent intent until another command is sent. A goal network scheduling and verification process ensures that all goals in the plan are achievable before starting execution. Goal failures at runtime can be detected (including predicted failures) and handled by adapted response logic. Responses can include plan repairs (try an alternate tactic to achieve the same goal), goal shedding, ignoring the fault, cancelling the plan, or safing the system.
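    The framework described here is JPL's Java-based Mission Data System; the sketch below is only a schematic rendering, in Python, of the core idea the abstract describes: a goal constrains a state variable over an explicit time interval, and the executive monitors those constraints during execution so a violation can trigger re-planning. All class and field names are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Goal:
    """A goal constrains a state variable over an explicit time interval,
    rather than issuing a one-shot command."""
    state_var: str
    predicate: callable      # constraint on the estimated state value
    t_start: float
    t_end: float

def execute(plan, estimate, t_step=1.0, horizon=10.0):
    """Toy execution engine: at each tick, check every goal whose interval
    covers the current time against the current state estimate."""
    t = 0.0
    while t <= horizon:
        state = estimate(t)
        for g in plan:
            if g.t_start <= t <= g.t_end and not g.predicate(state[g.state_var]):
                return f"goal on '{g.state_var}' violated at t={t:.1f}"  # would trigger re-planning
        t += t_step
    return "plan achieved"

# Example: keep battery charge above 0.3 for the whole window,
# and keep the camera powered between t=4 and t=8.
plan = [
    Goal("battery", lambda v: v > 0.3, 0.0, 10.0),
    Goal("camera_power", lambda v: v == "on", 4.0, 8.0),
]
estimate = lambda t: {"battery": 1.0 - 0.05 * t, "camera_power": "on" if t >= 4 else "off"}
print(execute(plan, estimate))
```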

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nishimura, Takahiro, E-mail: t-nishimura@ist.osaka-u.ac.jp; Fujii, Ryo; Ogura, Yusuke

    Molecular logic circuits represent a promising technology for observation and manipulation of biological systems at the molecular level. However, the implementation of molecular logic circuits for temporal and programmable operation remains challenging. In this paper, we demonstrate an optically controllable logic circuit that uses fluorescence resonance energy transfer (FRET) for signaling. The FRET-based signaling process is modulated by both molecular and optical inputs. Based on the distance dependence of FRET, the FRET pathways required to execute molecular logic operations are formed on a DNA nanostructure as a circuit based on its molecular inputs. In addition, the FRET pathways on the DNA nanostructure are controlled optically, using photoswitching fluorescent molecules to instruct the execution of the desired operation and the related timings. The behavior of the circuit can thus be controlled using external optical signals. As an example, a molecular logic circuit capable of executing two different logic operations was studied. The circuit contains functional DNAs and a DNA scaffold to construct two FRET routes for executing Input 1 AND Input 2 and Input 1 AND NOT Input 3 operations on molecular inputs. The circuit produced the correct outputs with all possible combinations of the inputs by following the light signals. Moreover, the operation execution timings were controlled based on light irradiation and the circuit responded to time-dependent inputs. The experimental results demonstrate that the circuit changes the output for the required operations following the input of temporal light signals.

  7. Autonomous Real Time Requirements Tracing

    NASA Technical Reports Server (NTRS)

    Plattsmier, George I.; Stetson, Howard K.

    2014-01-01

    One of the more challenging aspects of software development is the ability to verify and validate the functional software requirements dictated by the Software Requirements Specification (SRS) and the Software Detail Design (SDD). Ensuring that the software has achieved the intended requirements is the responsibility of the Software Quality team and the Software Test team. The utilization of Timeliner-TLX(TM) Auto-Procedures for relocating ground operations positions to ISS automated on-board operations has begun the transition that would be required for manned deep space missions with minimal crew requirements. This transition also moves the auto-procedures from the procedure realm into the flight software arena, and as such the operational requirements and testing will be more structured and rigorous. The auto-procedures would be required to meet NASA software standards as specified in the Software Safety Standard (NASA-STD-8719), the Software Engineering Requirements (NPR 7150), the Software Assurance Standard (NASA-STD-8739) and also the Human Rating Requirements (NPR-8705). The Autonomous Fluid Transfer System (AFTS) test-bed utilizes the Timeliner-TLX(TM) Language for development of autonomous command and control software. The Timeliner-TLX(TM) system has the unique feature of providing the current line of the statement in execution during real-time execution of the software. This internal reporting of the executing line number unlocks the capability of monitoring the execution autonomously by use of a companion Timeliner-TLX(TM) sequence, as the line number reporting is embedded inside the Timeliner-TLX(TM) execution engine. This negates I/O processing of this type of data, as the line number status of executing sequences is built in as a function reference. This paper will outline the design and capabilities of the AFTS Autonomous Requirements Tracker, which traces and logs SRS requirements as they are being met during real-time execution of the targeted system. It is envisioned that real-time requirements tracing will greatly assist the movement of auto-procedures to flight software, enhancing the software assurance of auto-procedures and also their acceptance as reliable commanders.
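    Timeliner-TLX reports the currently executing line number from inside its execution engine; a very loose Python analog of tracing which requirement-tagged lines have actually executed can be built on the standard sys.settrace hook, as sketched below. The requirement IDs and the target function are invented for illustration and are not part of the AFTS system.

```python
import sys

# Hypothetical mapping from source line numbers to requirement IDs.
REQUIREMENT_LINES = {}   # {line_number: "SRS-xxx"}
satisfied = set()

def transfer_fluid(amount):
    opened = True            # SRS-101: valve must be commanded open
    moved = amount * 0.98    # SRS-102: transfer efficiency accounted for
    return opened and moved > 0

# Tag lines of interest by offset from the function's first source line.
first = transfer_fluid.__code__.co_firstlineno
REQUIREMENT_LINES[first + 1] = "SRS-101"
REQUIREMENT_LINES[first + 2] = "SRS-102"

def tracer(frame, event, arg):
    # Called by the interpreter for every executed line of traced frames.
    if event == "line" and frame.f_lineno in REQUIREMENT_LINES:
        satisfied.add(REQUIREMENT_LINES[frame.f_lineno])
    return tracer

sys.settrace(tracer)
transfer_fluid(5.0)
sys.settrace(None)

print("requirements exercised:", sorted(satisfied))  # ['SRS-101', 'SRS-102']
```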

  8. Autonomous Real Time Requirements Tracing

    NASA Technical Reports Server (NTRS)

    Plattsmier, George; Stetson, Howard

    2014-01-01

    One of the more challenging aspects of software development is the ability to verify and validate the functional software requirements dictated by the Software Requirements Specification (SRS) and the Software Detail Design (SDD). Ensuring that the software has achieved the intended requirements is the responsibility of the Software Quality team and the Software Test team. The utilization of Timeliner-TLX(TM) Auto-Procedures for relocating ground operations positions to ISS automated on-board operations has begun the transition that would be required for manned deep space missions with minimal crew requirements. This transition also moves the auto-procedures from the procedure realm into the flight software arena, and as such the operational requirements and testing will be more structured and rigorous. The auto-procedures would be required to meet NASA software standards as specified in the Software Safety Standard (NASA-STD-8719), the Software Engineering Requirements (NPR 7150), the Software Assurance Standard (NASA-STD-8739) and also the Human Rating Requirements (NPR-8705). The Autonomous Fluid Transfer System (AFTS) test-bed utilizes the Timeliner-TLX(TM) Language for development of autonomous command and control software. The Timeliner-TLX(TM) system has the unique feature of providing the current line of the statement in execution during real-time execution of the software. This internal reporting of the executing line number unlocks the capability of monitoring the execution autonomously by use of a companion Timeliner-TLX(TM) sequence, as the line number reporting is embedded inside the Timeliner-TLX(TM) execution engine. This negates I/O processing of this type of data, as the line number status of executing sequences is built in as a function reference. This paper will outline the design and capabilities of the AFTS Autonomous Requirements Tracker, which traces and logs SRS requirements as they are being met during real-time execution of the targeted system. It is envisioned that real-time requirements tracing will greatly assist the movement of auto-procedures to flight software, enhancing the software assurance of auto-procedures and also their acceptance as reliable commanders.

  9. The benefits of Tai Chi and brisk walking for cognitive function and fitness in older adults

    PubMed Central

    Ji, Zhiguang; Feng, Tian; Liu, Xiaolei; You, Yihong; Meng, Fanying; Wang, Ruoqing; Lu, Jialing

    2017-01-01

    The purpose of this study was to investigate the benefits of exercises with different cognitive demands for cognitive functions (executive and non-executive) in healthy older adults. A cross-sectional design was adopted. In total, 84 healthy older adults were enrolled in the study. They were categorized into the Tai Chi group (TG), the brisk walking group (BG) or the control group (CG). Each participant performed the Stroop task and a digit comparison task. The Stroop task included the following three conditions: a naming condition, an inhibition condition and an executive condition. There were two experimental conditions in the digit comparison task: the non-delay condition and the delay condition. The results indicated that participants in the TG and BG showed significantly better performance than the CG in the executive condition of the cognitive tasks and in fitness. There was no significant difference in reaction time (RT) or accuracy between the TG and BG in the inhibition and delay conditions of the cognitive tasks or in fitness. The TG showed shorter reaction times in the naming and executive conditions, and was more accurate in the inhibition condition, than the BG. These findings demonstrate that regular participation in brisk walking and Tai Chi has significant beneficial effects on executive function and fitness. However, due to the high cognitive demands of the exercise, Tai Chi benefits cognitive functions (executive and non-executive) in older adults more than brisk walking does. Further studies should investigate the underlying mechanisms at the behavioural and neuroelectric levels, providing more evidence to explain the effect of high-cognitive-demand exercise on different processing levels of cognition. PMID:29062610

  10. Execution time supports for adaptive scientific algorithms on distributed memory machines

    NASA Technical Reports Server (NTRS)

    Berryman, Harry; Saltz, Joel; Scroggs, Jeffrey

    1990-01-01

    Optimizations are considered that are required for efficient execution of code segments that consists of loops over distributed data structures. The PARTI (Parallel Automated Runtime Toolkit at ICASE) execution time primitives are designed to carry out these optimizations and can be used to implement a wide range of scientific algorithms on distributed memory machines. These primitives allow the user to control array mappings in a way that gives an appearance of shared memory. Computations can be based on a global index set. Primitives are used to carry out gather and scatter operations on distributed arrays. Communications patterns are derived at runtime, and the appropriate send and receive messages are automatically generated.
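    PARTI builds communication schedules at runtime by inspecting which globally indexed elements each process actually touches; the sketch below mimics that inspector/executor split in plain Python (no MPI), with the block distribution, rank, and process count as illustrative assumptions rather than PARTI's actual interface.

```python
def owner(global_index, block_size):
    """Block distribution: process p owns indices [p*block, (p+1)*block)."""
    return global_index // block_size

def inspector(needed_indices, my_rank, block_size):
    """Inspector phase: classify each needed global index as local or
    off-process, and build a per-owner request list (the 'schedule')."""
    schedule = {}
    for g in needed_indices:
        p = owner(g, block_size)
        if p != my_rank:
            schedule.setdefault(p, []).append(g)
    return schedule

def executor(schedule, distributed_array, block_size):
    """Executor phase: 'gather' the off-process elements named in the
    schedule (simulated here by direct access to the other blocks)."""
    ghost = {}
    for p, indices in schedule.items():
        for g in indices:
            ghost[g] = distributed_array[p][g - p * block_size]
    return ghost

# Toy setup: 2 processes, block size 4, global array = 0..7
blocks = {0: [0, 1, 2, 3], 1: [4, 5, 6, 7]}
sched = inspector(needed_indices=[2, 5, 7], my_rank=0, block_size=4)
print(executor(sched, blocks, block_size=4))  # {5: 5, 7: 7}
```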

  11. Execution time support for scientific programs on distributed memory machines

    NASA Technical Reports Server (NTRS)

    Berryman, Harry; Saltz, Joel; Scroggs, Jeffrey

    1990-01-01

    Optimizations are considered that are required for efficient execution of code segments that consists of loops over distributed data structures. The PARTI (Parallel Automated Runtime Toolkit at ICASE) execution time primitives are designed to carry out these optimizations and can be used to implement a wide range of scientific algorithms on distributed memory machines. These primitives allow the user to control array mappings in a way that gives an appearance of shared memory. Computations can be based on a global index set. Primitives are used to carry out gather and scatter operations on distributed arrays. Communications patterns are derived at runtime, and the appropriate send and receive messages are automatically generated.

  12. Design of high-performance parallelized gene predictors in MATLAB.

    PubMed

    Rivard, Sylvain Robert; Mailloux, Jean-Gabriel; Beguenane, Rachid; Bui, Hung Tien

    2012-04-10

    This paper proposes a method of implementing parallel gene prediction algorithms in MATLAB. The proposed designs are based on either Goertzel's algorithm or on FFTs and have been implemented using varying amounts of parallelism on a central processing unit (CPU) and on a graphics processing unit (GPU). Results show that an implementation using a straightforward approach can require over 4.5 h to process 15 million base pairs (bps) whereas a properly designed one could perform the same task in less than five minutes. In the best case, a GPU implementation can yield these results in 57 s. The present work shows how parallelism can be used in MATLAB for gene prediction in very large DNA sequences to produce results that are over 270 times faster than a conventional approach. This is significant as MATLAB is typically overlooked due to its apparent slow processing time even though it offers a convenient environment for bioinformatics. From a practical standpoint, this work proposes two strategies for accelerating genome data processing which rely on different parallelization mechanisms. Using a CPU, the work shows that direct access to the MEX function increases execution speed and that the PARFOR construct should be used in order to take full advantage of the parallelizable Goertzel implementation. When the target is a GPU, the work shows that data needs to be segmented into manageable sizes within the GFOR construct before processing in order to minimize execution time.
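    The paper's implementations are in MATLAB (MEX, PARFOR, GFOR); as a language-neutral illustration of the Goertzel-based approach to gene prediction, the sketch below computes the period-3 spectral power of a DNA window in Python, the signal commonly used to distinguish coding from non-coding regions. The window and its length are arbitrary examples.

```python
import math

def goertzel_power(x, k):
    """Power of DFT bin k of sequence x via Goertzel's recurrence
    (avoids computing a full FFT when only one bin is needed)."""
    n = len(x)
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s_prev, s_prev2 = 0.0, 0.0
    for sample in x:
        s = sample + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev**2 + s_prev2**2 - coeff * s_prev * s_prev2

def period3_power(dna):
    """Sum the period-3 (k = N/3) power of the four binary indicator
    sequences; coding regions tend to show a strong period-3 component."""
    n = len(dna)
    k = n // 3
    return sum(
        goertzel_power([1.0 if b == base else 0.0 for b in dna], k)
        for base in "ACGT"
    )

window = "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGA"  # 36 bp toy window
print(f"period-3 power: {period3_power(window):.2f}")
```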

  13. Rehabilitation of executive functioning in patients with frontal lobe brain damage with goal management training.

    PubMed

    Levine, Brian; Schweizer, Tom A; O'Connor, Charlene; Turner, Gary; Gillingham, Susan; Stuss, Donald T; Manly, Tom; Robertson, Ian H

    2011-01-01

    Executive functioning deficits due to brain disease affecting frontal lobe functions cause significant real-life disability, yet solid evidence in support of executive functioning interventions is lacking. Goal Management Training (GMT), an executive functioning intervention that draws upon theories concerning goal processing and sustained attention, has received empirical support in studies of patients with traumatic brain injury, studies of normal aging, and case studies. GMT promotes a mindful approach to complex real-life tasks that pose problems for patients with executive functioning deficits, with a main goal of periodically stopping ongoing behavior to monitor and adjust goals. In this controlled trial, an expanded version of GMT was compared to an alternative intervention, the Brain Health Workshop, which was matched to GMT on the non-specific characteristics that can affect intervention outcome. Participants included 19 individuals in the chronic phase of recovery from brain disease (predominantly stroke) affecting frontal lobe function. Outcome data indicated specific effects of GMT on the Sustained Attention to Response Task as well as the Tower Test, a visuospatial problem-solving measure that reflected far transfer of training effects. There were no significant effects on self-report questionnaires, likely owing to the complexity of these measures in this heterogeneous patient sample. Overall, these data support the efficacy of GMT in the rehabilitation of executive functioning deficits.

  14. Gunshot wounds (resulting from execution) of exhumed victims of the communist regime in Poland.

    PubMed

    Szleszkowski, Łukasz; Thannhäuser, Agata; Szwagrzyk, Krzysztof; Kawecki, Jerzy; Jurek, Tomasz

    2014-07-01

    This study presents the results of the analysis of the remains of 23 executed male individuals aged between 21 and 63 years, recovered from Osobowicki Cemetery in Wroclaw (Poland), field 83B, in 2012. In 1948 and 1949, prisoners sentenced to death by firing squad--most of them associated with the post-war anti-communist underground independence movement in Poland--were buried there. The aim of the study was to analyse fatal wounds and the method of execution, and to compare the results to data from archival documents. The results were also compared with studies concerning executions during a later period, i.e. 1949-1954. The research on the method of execution during this period of history carried out during the exhumations in Osobowicki Cemetery was the first conducted on such a scale in Poland. Forensic analysis revealed a wide variety of gunshot wounds inflicted during executions, revealing both gunshots to the head, especially single shots to the back of the head, and cases corresponding to the use of a firing squad, probably equipped with machine guns. The results of the research indicate that capital punishment by shooting was carried out in ways both similar to those specified in the regulations and completely different. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  15. 78 FR 66267 - Safety Zone; HITS Triathlon Series; Colorado River; Lake Havasu, AZ

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-05

    ... an NPRM would be impracticable. Logistical details did not present the Coast Guard enough time to... potential costs and benefits under section 6(a)(3) of Executive Order 12866 or under section 1 of Executive...

  16. Empowering Students through Organizational Empathy: Multiple Case Study Methodology

    ERIC Educational Resources Information Center

    Williams, Daniel

    2017-01-01

    This dissertation in practice employed a multiple case study design to better understand how two executive directors within a national network of arts and technology educational organizations defined, nurtured, and measured empathy within their students. Empathy can connect diverse people and improve relationships, and it has been proven to…

  17. 31 CFR 206.9 - Charges.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... OF THE CASH MANAGEMENT IMPROVEMENTS FUND § 206.9 Charges. (a) Within 30 days of the effective date of... noncompliance. In the case of cash management collection noncompliance, an agency will absorb the charge from.... Charges collected from an executive agency in the case of cash management collection noncompliance will be...

  18. Comparison Between Laser Scanning and Automated 3d Modelling Techniques to Reconstruct Complex and Extensive Cultural Heritage Areas

    NASA Astrophysics Data System (ADS)

    Fassi, F.; Fregonese, L.; Ackermann, S.; De Troia, V.

    2013-02-01

    In the Cultural Heritage field, the need to survey objects quickly, with the ability to repeat the measurements several times for deformation or degradation monitoring purposes, is increasing. In this paper, two significant cases, an architectural one and an archaeological one, are presented. Due to different reasons and emergency situations, finding the optimal solution to enable a quick and well-timed survey for a complete digital reconstruction of the object is required. In both cases, two survey methods have been tested and used: a laser scanning approach that allows high-resolution and complete scans to be obtained within a short time, and a photogrammetric one that allows the three-dimensional reconstruction of the object from images. In recent months, several methodologies, including free or low-cost techniques, have emerged. These kinds of software allow the fully automatic three-dimensional reconstruction of objects from images, giving back a dense point cloud and, in some cases, a surfaced mesh model. In this paper some comparisons between the two methodologies mentioned above are presented, using some real case studies as examples. The surveys have been performed by employing both photogrammetry and laser scanner techniques. The methodological and operational choices, depending on the required goal, the difficulties encountered during the survey with these methods, the execution time (which is the key parameter), and finally the obtained results are fully described and examined. On the final 3D model, an analytical comparison has been made to analyse the differences, the tolerances, the possibility of accuracy improvement, and future developments.

  19. Experimental evaluation of certification trails using abstract data type validation

    NASA Technical Reports Server (NTRS)

    Wilson, Dwight S.; Sullivan, Gregory F.; Masson, Gerald M.

    1993-01-01

    Certification trails are a recently introduced and promising approach to fault detection and fault tolerance. Recent experimental work reveals many cases in which a certification-trail approach allows for significantly faster program execution time than a basic time-redundancy approach. Algorithms for answer validation of abstract data types allow a certification trail approach to be used for a wide variety of problems. An attempt to assess the performance of algorithms utilizing certification trails on abstract data types is reported. Specifically, this method was applied to the following problems: heapsort, Huffman tree, shortest path, and skyline. Previous results used certification trails specific to a particular problem and implementation. The approach allows certification trails to be localized to 'data structure modules,' making the use of this technique transparent to the user of such modules.
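    The paper's certification trails are specific to the abstract data types involved; as a generic illustration of the answer-validation idea (checking an output much more cheaply than recomputing it), the sketch below validates a sort result by checking ordering and multiset equality. This is a simplified stand-in, not the authors' priority-queue certification trails.

```python
from collections import Counter

def check_sorted_output(original, claimed_sorted):
    """Answer validation for sorting: verify the claimed output is ordered
    and is a permutation of the input. Runs in O(n), cheaper than re-sorting,
    and catches a faulty primary computation without full re-execution."""
    ordered = all(a <= b for a, b in zip(claimed_sorted, claimed_sorted[1:]))
    same_elements = Counter(original) == Counter(claimed_sorted)
    return ordered and same_elements

data = [5, 3, 8, 1, 3]
print(check_sorted_output(data, sorted(data)))      # True
print(check_sorted_output(data, [1, 3, 3, 5, 9]))   # False (corrupted output)
```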

  20. The Chaplygin Sleigh with Parametric Excitation: Chaotic Dynamics and Nonholonomic Acceleration

    NASA Astrophysics Data System (ADS)

    Bizyaev, Ivan A.; Borisov, Alexey V.; Mamaev, Ivan S.

    2017-12-01

    This paper is concerned with the Chaplygin sleigh with time-varying mass distribution (parametric excitation). The focus is on the case where excitation is induced by a material point that executes periodic oscillations in a direction transverse to the plane of the knife edge of the sleigh. In this case, the problem reduces to investigating a reduced system of two first-order equations with periodic coefficients, which is similar to various nonlinear parametric oscillators. Depending on the parameters in the reduced system, one can observe different types of motion, including those accompanied by strange attractors leading to a chaotic (diffusion) trajectory of the sleigh on the plane. The problem of unbounded acceleration (an analog of Fermi acceleration) of the sleigh is examined in detail. It is shown that such an acceleration arises due to the position of the moving point relative to the line of action of the nonholonomic constraint and the center of mass of the platform. Various special cases of existence of tensor invariants are found.

  1. Antiviral therapeutics for the treatment of Ebola virus infection.

    PubMed

    Cardile, Anthony P; Downey, Lydia G; Wiseman, Perry D; Warren, Travis K; Bavari, Sina

    2016-10-01

    There have been significant developments in Ebola virus therapeutics. While the efficacy of several products was evaluated in the recent West Africa outbreak, a licensed treatment for EBOV disease remains elusive. Factors that negatively impacted the execution of clinical trials included an overall lack of world readiness to conduct clinical trials in an outbreak setting, ethical concerns limiting implementation of the randomized controlled trials in an outbreak setting, and a decline in case numbers by the time resources were mobilized to conduct clinical trials. We summarize relevant therapeutics that underwent clinical trials during the West Africa outbreak and highlight promising candidates under advanced development. Published by Elsevier Ltd.

  2. Executive Function Processes Predict Mobility Outcomes in Older Adults

    PubMed Central

    Gothe, Neha P.; Fanning, Jason; Awick, Elizabeth; Chung, David; Wójcicki, Thomas R.; Olson, Erin A.; Mullen, Sean P.; Voss, Michelle; Erickson, Kirk I.; Kramer, Arthur F.; McAuley, Edward

    2013-01-01

    BACKGROUND: There is growing evidence suggesting an association between cognitive function and physical performance in late life. This study examined the relationship between performance on executive function measures and subsequent mobility outcomes among community-dwelling older adults across a 12-month randomized controlled exercise trial. DESIGN: Randomized controlled clinical trial. SETTING: Champaign-Urbana, Illinois. PARTICIPANTS: Community-dwelling older adults (N = 179; mean age = 66.4). INTERVENTION: A 12-month exercise trial with two arms: an aerobic exercise group and a stretching and strengthening group. MEASUREMENTS: Established cognitive tests of executive function, including the flanker task, task switching, a dual-task paradigm, and the Wisconsin card sort test. Mobility was assessed using the timed 8-foot up-and-go test and times to climb up and down a flight of stairs. METHODS: Participants completed the cognitive measures at baseline and the mobility measures at baseline and after 12 months of the intervention. Multiple regression analyses were conducted to determine whether baseline executive function predicted post-intervention functional performance after controlling for age, sex, education, cardiorespiratory fitness and baseline mobility levels. RESULTS: Our analyses revealed that selective baseline executive function measures, particularly performance on the flanker task (β's = .15 to .17) and the Wisconsin card sort test (β's = .11 to .16), consistently predicted mobility outcomes at month 12. The estimates were in the expected direction, such that better baseline performance on the executive function measures predicted better performance on the timed mobility tests, independent of the intervention group. CONCLUSION: Executive functions of inhibitory control, mental set shifting and attentional flexibility were predictive of functional mobility. Given the literature associating mobility limitations with disability, morbidity, and mortality, these results are important for understanding the antecedents to poor mobility function that can be attenuated by well-designed interventions to improve cognitive performance. PMID:24521364

  3. Renewal and change for health care executives.

    PubMed

    Burke, G C; Bice, M O

    1991-01-01

    Health care executives must consider renewal and change within their own lives if they are to breathe life into their own institutions. Yet numerous barriers to executive renewal exist, including time pressures, fatigue, cultural factors, and trustee attitudes. This essay discusses such barriers and suggests approaches that health care executives may consider for programming renewal into their careers. These include self-assessment for professional and personal goals, career or job change, process vs. outcome considerations, solitude, networking, lifelong education, surrounding oneself with change agents, business travel and sabbaticals, reading outside the field, physical exercise, mentoring, learning from failures, a sense of humor, spiritual reflection, and family and friends. Renewal is a continuous, lifelong process requiring constant learning. Individual executives would do well to develop a framework for renewal in their careers and organizations.

  4. The Jet Propulsion Laboratory shared control architecture and implementation

    NASA Technical Reports Server (NTRS)

    Backes, Paul G.; Hayati, Samad

    1990-01-01

    A hardware and software environment for shared control of telerobot task execution has been implemented. Modes of task execution range from fully teleoperated to fully autonomous, as well as shared, where hand controller inputs from the human operator are mixed with autonomous system inputs in real time. The objective of the shared control environment is to aid the telerobot operator during task execution by merging real-time operator control from hand controllers with autonomous control, simplifying task execution for the operator. The operator is the principal command source and can assign as much autonomy to a task as desired. The shared control hardware environment consists of two PUMA 560 robots, two 6-axis force-reflecting hand controllers, Universal Motor Controllers for each of the robots and hand controllers, a SUN4 computer, and a VME chassis containing 68020 processors and input/output boards. The operator interface for shared control, the User Macro Interface (UMI), is a menu-driven interface used to design a task and assign the levels of teleoperated and autonomous control. The operator also sets up the system monitor, which checks safety limits during task execution. Cartesian-space degrees of freedom for teleoperated and/or autonomous control inputs are selected within UMI, as well as the weightings for the teleoperated and autonomous inputs. These are then used during task execution to determine the mix of teleoperation and autonomous inputs. Some of the autonomous control primitives available to the user are Joint-Guarded-Move, Cartesian-Guarded-Move, Move-To-Touch, Pin-Insertion/Removal, Door/Crank-Turn, Bolt-Turn, and Slide. The operator can execute a task using pure teleoperation or mix control execution from the autonomous primitives with teleoperated inputs. Presently the shared control environment supports single-arm task execution. Work is underway to provide the shared control environment for dual-arm control. Teleoperation during shared control is Cartesian-space control only, and no force reflection is provided. Force-reflecting teleoperation and joint-space operator inputs are planned extensions to the environment.
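    The abstract does not spell out UMI's weighting scheme; the sketch below shows one plausible reading of per-axis mixing of teleoperated and autonomous Cartesian velocity commands under operator-selected weights. The axis ordering, weight values, and command magnitudes are illustrative assumptions.

```python
import numpy as np

def shared_command(teleop_cmd, auto_cmd, weights):
    """Blend operator (hand-controller) and autonomous Cartesian velocity
    commands per axis: weight 1.0 = pure teleoperation, 0.0 = pure autonomy."""
    w = np.clip(np.asarray(weights, dtype=float), 0.0, 1.0)
    return w * np.asarray(teleop_cmd) + (1.0 - w) * np.asarray(auto_cmd)

# 6 Cartesian axes: [vx, vy, vz, wx, wy, wz]
teleop = np.array([0.05, 0.00, 0.00, 0.0, 0.0, 0.1])   # operator hand-controller input
auto   = np.array([0.00, 0.02, -0.01, 0.0, 0.0, 0.0])  # e.g. a move-to-touch primitive
weights = [1.0, 0.5, 0.0, 1.0, 1.0, 1.0]               # operator fully controls x, not z

print(shared_command(teleop, auto, weights))
```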

  5. Communication overhead on the Intel Paragon, IBM SP2 and Meiko CS-2

    NASA Technical Reports Server (NTRS)

    Bokhari, Shahid H.

    1995-01-01

    Interprocessor communication overhead is a crucial measure of the power of parallel computing systems-its impact can severely limit the performance of parallel programs. This report presents measurements of communication overhead on three contemporary commercial multicomputer systems: the Intel Paragon, the IBM SP2 and the Meiko CS-2. In each case the time to communicate between processors is presented as a function of message length. The time for global synchronization and memory access is discussed. The performance of these machines in emulating hypercubes and executing random pairwise exchanges is also investigated. It is shown that the interprocessor communication time depends heavily on the specific communication pattern required. These observations contradict the commonly held belief that communication overhead on contemporary machines is independent of the placement of tasks on processors. The information presented in this report permits the evaluation of the efficiency of parallel algorithm implementations against standard baselines.
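    The report's measurements were taken on the Paragon, SP2 and CS-2; a present-day analog of the basic measurement (round-trip time as a function of message length) can be written with mpi4py as sketched below. Run with two ranks, e.g. `mpiexec -n 2 python pingpong.py`; the message sizes and repetition count are arbitrary choices, not the report's parameters.

```python
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
REPS = 100

for nbytes in [8, 1024, 65536, 1048576]:
    buf = np.zeros(nbytes, dtype=np.uint8)
    comm.Barrier()
    t0 = MPI.Wtime()
    for _ in range(REPS):
        if rank == 0:
            comm.Send(buf, dest=1, tag=0)
            comm.Recv(buf, source=1, tag=0)
        elif rank == 1:
            comm.Recv(buf, source=0, tag=0)
            comm.Send(buf, dest=0, tag=0)
    t1 = MPI.Wtime()
    if rank == 0:
        # one-way latency estimate = round-trip time / 2, averaged over REPS
        print(f"{nbytes:8d} bytes: {1e6 * (t1 - t0) / (2 * REPS):8.1f} us")
```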

  6. Sorting on STAR. [CDC computer algorithm timing comparison]

    NASA Technical Reports Server (NTRS)

    Stone, H. S.

    1978-01-01

    Timing comparisons are given for three sorting algorithms written for the CDC STAR computer. One algorithm is Hoare's (1962) Quicksort, which is the fastest or nearly the fastest sorting algorithm for most computers. A second algorithm is a vector version of Quicksort that takes advantage of the STAR's vector operations. The third algorithm is an adaptation of Batcher's (1968) sorting algorithm, which makes especially good use of vector operations but has a complexity of N(log N)-squared as compared with a complexity of N log N for the Quicksort algorithms. In spite of its worse complexity, Batcher's sorting algorithm is competitive with the serial version of Quicksort for vectors up to the largest that can be treated by STAR. Vector Quicksort outperforms the other two algorithms and is generally preferred. These results indicate that unusual instruction sets can introduce biases in program execution time that counter results predicted by worst-case asymptotic complexity analysis.
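
    Batcher-style sorting networks accept a worse asymptotic comparison count in exchange for comparisons that come in large independent batches, which is what maps well onto STAR's vector operations. Below is a minimal Python sketch of the closely related bitonic sorting network (illustrative, not the STAR implementation):

    ```python
    # Batcher-style bitonic sorting network: O(N (log N)^2) compare-exchanges,
    # but every inner pass is a set of independent compare-exchanges that maps
    # naturally onto vector/SIMD hardware. Input length must be a power of two.
    def bitonic_sort(a):
        n = len(a)
        k = 2
        while k <= n:                       # size of the bitonic sequences being merged
            j = k >> 1
            while j > 0:                    # compare-exchange distance for this pass
                for i in range(n):          # these n comparisons are independent (vectorizable)
                    partner = i ^ j
                    if partner > i:
                        ascending = (i & k) == 0
                        if (a[i] > a[partner]) == ascending:
                            a[i], a[partner] = a[partner], a[i]
                j >>= 1
            k <<= 1
        return a

    print(bitonic_sort([7, 3, 0, 5, 6, 2, 4, 1]))   # [0, 1, 2, 3, 4, 5, 6, 7]
    ```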

  7. An inference engine for embedded diagnostic systems

    NASA Technical Reports Server (NTRS)

    Fox, Barry R.; Brewster, Larry T.

    1987-01-01

    The implementation of an inference engine for embedded diagnostic systems is described. The system consists of two distinct parts. The first is an off-line compiler which accepts a propositional logic statement of the relationship between facts and conclusions and produces the data structures required by the on-line inference engine. The second part consists of the inference engine and interface routines, which accept assertions of fact and return the conclusions that necessarily follow. Given a set of assertions, it will generate exactly the conclusions which logically follow. At the same time, it will detect any inconsistencies which may propagate from an inconsistent set of assertions or a poorly formulated set of rules. The memory requirements are fixed and the worst-case execution times are bounded at compile time. The data structures and inference algorithms are simple and well understood, and both are described in detail. The system has been implemented in Lisp, Pascal, and Modula-2.
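
    The compile-then-infer split described here can be illustrated with a small forward-chaining sketch: an offline step turns propositional rules into fixed tables, and the online step propagates assertions with bounded work per fact. The rule format and names below are illustrative, not the original system:

    ```python
    # Forward-chaining sketch with an offline "compiler" and an online engine.
    from collections import defaultdict

    def compile_rules(rules):
        """Offline step: rules are (premises, conclusion) pairs of proposition names."""
        by_premise = defaultdict(list)   # premise -> indices of rules that use it
        premise_counts = []              # number of premises per rule
        conclusions = []
        for i, (premises, conclusion) in enumerate(rules):
            premise_counts.append(len(premises))
            conclusions.append(conclusion)
            for p in premises:
                by_premise[p].append(i)
        return by_premise, premise_counts, conclusions

    def infer(compiled, assertions):
        """Online step: return every conclusion that necessarily follows."""
        by_premise, premise_counts, conclusions = compiled
        remaining = list(premise_counts)         # leave the compiled tables untouched
        known, queue = set(), list(assertions)
        while queue:
            fact = queue.pop()
            if fact in known:
                continue
            # a fact and its negation (written "not <fact>") signal inconsistency
            negation = fact[4:] if fact.startswith("not ") else "not " + fact
            if negation in known:
                raise ValueError(f"inconsistent assertions involving {fact!r}")
            known.add(fact)
            for i in by_premise.get(fact, ()):
                remaining[i] -= 1
                if remaining[i] == 0:            # all premises satisfied: fire the rule
                    queue.append(conclusions[i])
        return known

    rules = [(["low_voltage", "engine_on"], "alternator_fault"),
             (["alternator_fault"], "replace_alternator")]
    print(infer(compile_rules(rules), ["engine_on", "low_voltage"]))
    ```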

  8. gpuSPHASE-A shared memory caching implementation for 2D SPH using CUDA

    NASA Astrophysics Data System (ADS)

    Winkler, Daniel; Meister, Michael; Rezavand, Massoud; Rauch, Wolfgang

    2017-04-01

    Smoothed particle hydrodynamics (SPH) is a meshless Lagrangian method that has been successfully applied to computational fluid dynamics (CFD), solid mechanics, and many other multi-physics problems. Using the method to solve transport phenomena in process engineering requires simulating several days to weeks of physical time. Given the high computational demand of CFD, such simulations in 3D would require years of computation, so a reduction to a 2D domain is inevitable. In this paper gpuSPHASE, a new open-source 2D SPH solver implementation for graphics devices, is developed. It is optimized for simulations that must be executed at thousands of frames per second to be computed in reasonable time. A novel caching algorithm for Compute Unified Device Architecture (CUDA) shared memory is proposed and implemented. The software is validated and its performance is evaluated on the well-established dam-break test case.
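
    The performance-critical part of an SPH step is the neighbour search, and the caching strategy in a solver like gpuSPHASE revolves around reusing neighbourhood data close to the compute units. The sketch below shows only the underlying cell-binning idea in plain Python; the paper's implementation stages these cell blocks in CUDA shared memory, which is not reproduced here:

    ```python
    # Conceptual 2D SPH neighbour search: bin particles into cells of width h,
    # then test only the 3x3 block of neighbouring cells for each particle.
    from collections import defaultdict
    import math

    def build_cells(positions, h):
        cells = defaultdict(list)
        for idx, (x, y) in enumerate(positions):
            cells[(int(x // h), int(y // h))].append(idx)
        return cells

    def neighbours(i, positions, cells, h):
        cx, cy = int(positions[i][0] // h), int(positions[i][1] // h)
        found = []
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for j in cells.get((cx + dx, cy + dy), ()):
                    if j != i and math.dist(positions[i], positions[j]) < h:
                        found.append(j)
        return found

    pos = [(0.10, 0.10), (0.15, 0.12), (0.90, 0.90)]
    cells = build_cells(pos, h=0.2)
    print(neighbours(0, pos, cells, h=0.2))   # -> [1]
    ```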

  9. OBJ-1, A Study in Executable Algebraic Formal Specification.

    DTIC Science & Technology

    1981-07-01

    natural way; 2. Achievement of a high level of abstraction in a natural way; 3. The possibility of executing test cases; 4. User definition of data types...languages. Goguen has defined a new data type, called symboltree, in OBJ. The purpose of this data type is to provide for fast checking of certain... data type work, is given in Appendix C of this report. K. Parsaye-Ghomi, with A. B. C. Sampaio of UCLA, has written a specification of a hardware

  10. Antiretroviral therapy outcomes among HIV infected clients in Gweru City, Zimbabwe 2006 - 2011: a cohort analysis.

    PubMed

    Shambira, Gerald; Gombe, Notion Tafara; Hall, Casey Daniel; Park, Meeyoung Mattie; Frimpong, Joseph Asamoah

    2017-01-01

    The government of Zimbabwe began providing antiretroviral therapy (ART) to People Living with HIV/AIDS (PLHIV) in public institutions in 2004. In Midlands province, two clinics constituted the most active HIV care service points, with patients followed up through a comprehensive patient monitoring and tracking system that captured specific patient variables and outcomes over time. Data from 2006 to 2011 were analysed to answer specific research questions, and this case study is based on that analysis. The goal of this case study is to build participants' capacity to undertake secondary data analysis and interpretation using a dataset for HIV antiretroviral therapy in Zimbabwe and to draw conclusions which inform recommendations. Case studies in applied epidemiology allow students to practice applying epidemiologic skills in the classroom to address real-world public health problems. Case studies, as a vital component of an applied epidemiology curriculum, are instrumental in reinforcing principles and skills covered in lectures or in background reading. The target audience includes Field Epidemiology and Laboratory Training Programs (FELTPs), university students, district health executives, and health information officers.

  11. Mal-Xtract: Hidden Code Extraction using Memory Analysis

    NASA Astrophysics Data System (ADS)

    Lim, Charles; Syailendra Kotualubun, Yohanes; Suryadi; Ramli, Kalamullah

    2017-01-01

    Software packers have been used effectively to hide the original code inside a binary executable, making it more difficult for existing signature-based anti-malware software to detect malicious code inside the executable. A new method based on written and rewritten memory sections is introduced to detect the exact end time of the unpacking routine and to extract the original code from a packed binary executable using memory analysis in a software-emulated environment. Our experimental results show that at least 97% of the original code could be extracted from binary executables packed with a variety of software packers. The proposed method also successfully extracted hidden code from recent malware family samples.
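
    The written-then-executed heuristic behind this kind of unpacker detection can be expressed as a pair of emulator callbacks: record every address the program writes, and treat a control transfer into previously written memory as the end of the unpacking routine. The hook names below are hypothetical, not a real emulator API:

    ```python
    # Conceptual sketch of detecting the end of an unpacking routine.
    class UnpackMonitor:
        def __init__(self):
            self.written = set()          # addresses written since the process started
            self.unpacked_regions = []    # dumps of code revealed by the unpacker

        def on_memory_write(self, address, size):
            """Hook invoked by the emulator on every memory write."""
            self.written.update(range(address, address + size))

        def on_instruction(self, address, dump_region):
            """Hook invoked before each executed instruction.

            Control transferring into memory the program itself wrote means the
            unpacking routine has finished and the original code is in place.
            """
            if address in self.written:
                self.unpacked_regions.append(dump_region(address))
                self.written.clear()      # allow detection of multi-layer packers
    ```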

  12. Dual compile strategy for parallel heterogeneous execution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Tyler Barratt; Perry, James Thomas

    2012-06-01

    The purpose of the Dual Compile Strategy is to increase our trust in the Compute Engine during its execution of instructions. This is accomplished by introducing a heterogeneous Monitor Engine that checks the execution of the Compute Engine. This leads to the production of a second and custom set of instructions designed for monitoring the execution of the Compute Engine at runtime. This use of multiple engines differs from redundancy in that one engine is working on the application while the other engine is monitoring and checking in parallel, instead of both applications (and engines) performing the same work at the same time.
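
    As a toy illustration of the compute/monitor pairing (purely illustrative; the actual strategy targets hardware engines and compiler-generated check instructions), the sketch below runs a checker concurrently with the computation and validates each result as it is produced:

    ```python
    # Toy compute/monitor pairing: the monitor checks results in parallel
    # rather than redundantly repeating the computation.
    import queue
    import threading

    def compute_engine(inputs, results):
        """The application's work: here, just squaring its inputs."""
        for x in inputs:
            results.put((x, x * x))
        results.put(None)                    # signal completion to the monitor

    def monitor_engine(results):
        """A separately derived checker validating the compute engine's output."""
        while (item := results.get()) is not None:
            x, y = item
            assert y == x * x, f"monitor: unexpected result for input {x}"

    results = queue.Queue()
    monitor = threading.Thread(target=monitor_engine, args=(results,))
    monitor.start()
    compute_engine(range(10), results)
    monitor.join()
    ```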

  13. Assessment of LightSquared Terrestrial Broadband System Effects on GPS Receivers and GPS-dependent Applications

    DOT National Transportation Integrated Search

    2011-06-01

    The Executive Steering Group (ESG) of the National Executive Committee (EXCOM) for Space-Based Positioning, Navigation, and Timing (PNT) directed the National Space-Based PNT Systems Engineering Forum (NPEF) to conduct an assessment of the effect...

  14. Event dependence in U.S. executions

    PubMed Central

    Baumgartner, Frank R.; Box-Steffensmeier, Janet M.

    2018-01-01

    Since 1976, the United States has seen over 1,400 judicial executions, and these have been highly concentrated in only a few states and counties. The number of executions across counties appears to fit a stretched distribution. These distributions are typically reflective of self-reinforcing processes, where the probability of observing an event increases with each previous event. To examine these processes, we employ a two-pronged empirical strategy. First, we utilize bootstrapped Kolmogorov-Smirnov tests to determine whether the pattern of executions reflects a stretched distribution, and confirm that it does. Second, we test for event dependence using the Conditional Frailty Model. Our tests estimate the monthly hazard of an execution in a given county, accounting for the number of previous executions, homicides, poverty, and population demographics. Controlling for other factors, we find that the number of prior executions in a county increases the probability of the next execution and accelerates its timing. Once a jurisdiction goes down a given path, the path becomes self-reinforcing, causing the counties to separate out into those never executing (the vast majority of counties) and those which use the punishment frequently. This finding is of great legal and normative concern and, ultimately, may not be consistent with the equal protection clause of the U.S. Constitution. PMID:29293583
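
    The first prong, a bootstrapped Kolmogorov-Smirnov test against a stretched distribution, can be sketched as follows. The Weibull (stretched-exponential) form and the SciPy/NumPy usage are assumptions for illustration, not the authors' code:

    ```python
    # Bootstrapped KS goodness-of-fit sketch for a stretched (Weibull) distribution.
    import numpy as np
    from scipy import stats

    def bootstrap_ks(counts, n_boot=1000, seed=0):
        rng = np.random.default_rng(seed)
        params = stats.weibull_min.fit(counts, floc=0)        # fit the stretched form
        d_obs = stats.kstest(counts, stats.weibull_min.cdf, args=params).statistic
        exceed = 0
        for _ in range(n_boot):
            # simulate from the fitted distribution, refit, and recompute the KS statistic
            sim = stats.weibull_min.rvs(*params, size=len(counts), random_state=rng)
            sim_params = stats.weibull_min.fit(sim, floc=0)
            d_sim = stats.kstest(sim, stats.weibull_min.cdf, args=sim_params).statistic
            exceed += d_sim >= d_obs
        return d_obs, exceed / n_boot      # a large p-value means the stretched fit is not rejected

    counts = np.array([1, 1, 2, 3, 5, 8, 13, 21, 40, 120], dtype=float)  # toy county counts
    print(bootstrap_ks(counts, n_boot=200))
    ```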

  15. Constant time worker thread allocation via configuration caching

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eichenberger, Alexandre E; O'Brien, John K. P.

    Mechanisms are provided for allocating threads for execution of a parallel region of code. A request for allocation of worker threads to execute the parallel region of code is received from a master thread. Cached thread allocation information identifying prior thread allocations that have been performed for the master thread is accessed. Worker threads are allocated to the master thread based on the cached thread allocation information. The parallel region of code is executed using the allocated worker threads.
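
    The caching idea can be sketched as remembering which workers were handed to a master thread for a given parallel-region configuration, so a repeat request becomes a lookup rather than a fresh search. The names below are illustrative; the actual mechanism lives inside an OpenMP-style runtime:

    ```python
    # Sketch of configuration-cached worker-thread allocation.
    class ThreadAllocator:
        def __init__(self, pool_size):
            self.free = list(range(pool_size))     # ids of idle worker threads
            self.cache = {}                        # (master_id, n_workers) -> tuple of worker ids

        def allocate(self, master_id, n_workers):
            key = (master_id, n_workers)
            cached = self.cache.get(key)
            if cached and all(w in self.free for w in cached):
                workers = list(cached)             # reuse the prior allocation for this master
            else:
                workers = self.free[:n_workers]    # fall back to a fresh allocation
                self.cache[key] = tuple(workers)
            for w in workers:
                self.free.remove(w)
            return workers

        def release(self, workers):
            self.free.extend(workers)

    alloc = ThreadAllocator(pool_size=8)
    team = alloc.allocate(master_id=0, n_workers=4)
    alloc.release(team)
    print(alloc.allocate(master_id=0, n_workers=4))   # reuses the cached team [0, 1, 2, 3]
    ```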

  16. Executive dysfunction in bipolar disorder and borderline personality disorder.

    PubMed

    Gvirts, H Z; Braw, Y; Harari, H; Lozin, M; Bloch, Y; Fefer, K; Levkovitz, Y

    2015-11-01

    The boundary between bipolar disorder (BD) and borderline personality disorder (BPD) is a controversial one. Despite the importance of the topic, few studies have directly compared these patient groups. The aim of the study was to compare the executive functioning profiles of BD and BPD patients. Executive functioning (sustained attention, problem-solving, planning, strategy formation, cognitive flexibility and working memory) was assessed in BD (n=30) and BPD outpatients (n=32) using a computerized assessment battery (Cambridge Neuropsychological Test Automated Battery, CANTAB). The groups were compared to one another as well as to healthy controls. BD patients showed deficits in strategy formation and in planning (indicated by longer execution time in the ToL task) in comparison to BPD patients and healthy controls. BPD patients showed deficits in planning (short deliberation time in the ToL task) in comparison to BD patients and to healthy controls. In comparison to healthy controls, BPD patients also displayed deficits in problem-solving. Differences in executive dysfunction between BD and BPD patients suggest that this cognitive dimension may be relevant for clarifying the boundary between the disorders. Copyright © 2015. Published by Elsevier Masson SAS.

  17. A Rewriting-Based Approach to Trace Analysis

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)

    2002-01-01

    We present a rewriting-based algorithm for efficiently evaluating future-time Linear Temporal Logic (LTL) formulae on finite execution traces online. While the standard models of LTL are infinite traces, finite traces appear naturally when testing and/or monitoring real applications that only run for limited time periods. The presented algorithm is implemented in the Maude executable specification language and essentially consists of a set of equations establishing an executable semantics of LTL using a simple formula-transformation approach. The algorithm is further improved to build automata on-the-fly from formulae, using memoization. The result is a very efficient and small Maude program that can be used to monitor program executions. We furthermore present an alternative algorithm for synthesizing provably minimal observer finite state machines (or automata) from LTL formulae, which can be used to analyze execution traces without the need for a rewriting system, and can hence be used by observers written in conventional programming languages. The presented work is part of an ambitious runtime verification and monitoring project at NASA Ames, called PATHEXPLORER, and demonstrates that rewriting can be a tractable and attractive means for experimenting with and implementing program monitoring logics.
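
    The formula-transformation idea (consume one event at a time, rewriting the formula into what must hold of the rest of the trace) can be sketched outside Maude as well. The Python below is a minimal finite-trace evaluator under common end-of-trace conventions, not the Maude specification itself:

    ```python
    # Finite-trace LTL evaluation by formula rewriting.
    from functools import reduce

    # Formula syntax: a proposition is a string; compound formulas are tuples:
    # ('not', f), ('and', f, g), ('or', f, g), ('next', f),
    # ('until', f, g), ('always', f), ('eventually', f). True/False are literals.

    def simplify(op, *args):
        """Eager boolean simplification keeps the residual formula small."""
        if op == 'not':
            (a,) = args
            return (not a) if isinstance(a, bool) else ('not', a)
        a, b = args
        if op == 'and':
            if a is False or b is False:
                return False
            if a is True:
                return b
            return a if b is True else ('and', a, b)
        if a is True or b is True:          # op == 'or'
            return True
        if a is False:
            return b
        return a if b is False else ('or', a, b)

    def step(f, event):
        """Rewrite formula f against one event (the set of propositions true now)."""
        if isinstance(f, bool):
            return f
        if isinstance(f, str):
            return f in event
        op = f[0]
        if op == 'not':
            return simplify('not', step(f[1], event))
        if op in ('and', 'or'):
            return simplify(op, step(f[1], event), step(f[2], event))
        if op == 'next':
            return f[1]
        if op == 'always':
            return simplify('and', step(f[1], event), f)
        if op == 'eventually':
            return simplify('or', step(f[1], event), f)
        if op == 'until':
            return simplify('or', step(f[2], event),
                            simplify('and', step(f[1], event), f))
        raise ValueError(f"unknown operator {op!r}")

    def finish(f):
        """Evaluate the residual formula once the trace has ended."""
        if isinstance(f, bool):
            return f
        if isinstance(f, str):
            return False
        op = f[0]
        if op == 'not':
            return not finish(f[1])
        if op == 'and':
            return finish(f[1]) and finish(f[2])
        if op == 'or':
            return finish(f[1]) or finish(f[2])
        return op == 'always'   # 'always' holds on the empty suffix; next/eventually/until do not

    def check(formula, trace):
        return finish(reduce(step, trace, formula))

    # "Every request is eventually acknowledged" on two short traces.
    phi = ('always', ('or', ('not', 'request'), ('eventually', 'ack')))
    print(check(phi, [{'request'}, set(), {'ack'}]))   # True
    print(check(phi, [{'request'}, set(), set()]))     # False
    ```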

  18. Executive Function Capacities, Negative Driving Behavior and Crashes in Young Drivers

    PubMed Central

    Winston, Flaura K.

    2017-01-01

    Motor vehicle crashes remain a leading cause of injury and death in adolescents, with teen drivers three times more likely to be in a fatal crash when compared to adults. One potential contributing risk factor is the ongoing development of executive functioning with maturation of the frontal lobe through adolescence and into early adulthood. Atypical development resulting in poor or impaired executive functioning (as in Attention-Deficit/Hyperactivity Disorder) has been associated with risky driving and crash outcomes. However, executive function broadly encompasses a number of capacities and domains (e.g., working memory, inhibition, set-shifting). In this review, we examine the role of various executive function sub-processes in adolescent driver behavior and crash rates. We summarize the state of methods for measuring executive control and driving outcomes and highlight the great heterogeneity in tools with seemingly contradictory findings. Lastly, we offer some suggestions for improved methods and practical ways to compensate for the effects of poor executive function (such as in-vehicle assisted driving devices). Given the key role that executive function plays in safe driving, this review points to an urgent need for systematic research to inform development of more effective training and interventions for safe driving among adolescents. PMID:29143762

  19. Age differences in high frequency phasic heart rate variability and performance response to increased executive function load in three executive function tasks

    PubMed Central

    Byrd, Dana L.; Reuther, Erin T.; McNamara, Joseph P. H.; DeLucca, Teri L.; Berg, William K.

    2015-01-01

    The current study examines the similarity or disparity of a frontally mediated physiological response to mental effort across multiple executive functioning tasks in children and adults. Task performance and phasic heart rate variability (HRV) were recorded in children (6 to 10 years old) and adults in an examination of age differences in executive functioning skills during periods of increased demand. Executive load levels were varied by increasing the difficulty levels of three executive functioning tasks: inhibition (IN), working memory (WM), and planning/problem solving (PL). Behavioral performance decreased in all tasks with increased executive demand in both children and adults. Adults’ phasic high frequency HRV was suppressed during the management of increased IN and WM load. Children’s phasic HRV was suppressed during the management of moderate WM load. HRV was not suppressed during either children’s or adults’ increasing load during the PL task. High frequency phasic HRV may be most sensitive to executive function tasks that have a time-response pressure, and simply requiring performance on a self-paced task requiring frontal lobe activation may not be enough to generate HRV responsivity to increasing demand. PMID:25798113

  20. Automatic Imitation in Rhythmical Actions: Kinematic Fidelity and the Effects of Compatibility, Delay, and Visual Monitoring

    PubMed Central

    Eaves, Daniel L.; Turgeon, Martine; Vogt, Stefan

    2012-01-01

    We demonstrate that observation of everyday rhythmical actions biases subsequent motor execution of the same and of different actions, using a paradigm where the observed actions were irrelevant for action execution. The cycle time of the distractor actions was subtly manipulated across trials, and the cycle time of motor responses served as the main dependent measure. Although distractor frequencies reliably biased response cycle times, this imitation bias was only a small fraction of the modulations in distractor speed, as well as of the modulations produced when participants intentionally imitated the observed rhythms. Importantly, this bias was not only present for compatible actions, but was also found, though numerically reduced, when distractor and executed actions were different (e.g., tooth brushing vs. window wiping), or when the dominant plane of movement was different (horizontal vs. vertical). In addition, these effects were equally pronounced for execution at 0, 4, and 8 s after action observation, a finding that contrasts with the more short-lived effects reported in earlier studies. The imitation bias was also unaffected when vision of the hand was occluded during execution, indicating that this effect most likely resulted from visuomotor interactions during distractor observation, rather than from visual monitoring and guidance during execution. Finally, when the distractor was incompatible in both dimensions (action type and plane) the imitation bias was not reduced further, in an additive way, relative to the single-incompatible conditions. This points to a mechanism whereby the observed action’s impact on motor processing is generally reduced whenever this is not useful for motor planning. We interpret these findings in the framework of biased competition, where intended and distractor actions can be represented as competing and quasi-encapsulated sensorimotor streams. PMID:23071623
