Sample records for performing high complexity

  1. 42 CFR 493.25 - Laboratories performing tests of high complexity.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Laboratories performing tests of high complexity. 493.25 Section 493.25 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND....25 Laboratories performing tests of high complexity. (a) A laboratory must obtain a certificate for...

  2. 42 CFR 493.25 - Laboratories performing tests of high complexity.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Laboratories performing tests of high complexity. 493.25 Section 493.25 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND....25 Laboratories performing tests of high complexity. (a) A laboratory must obtain a certificate for...

  3. Predicting Development of Mathematical Word Problem Solving Across the Intermediate Grades

    PubMed Central

    Tolar, Tammy D.; Fuchs, Lynn; Cirino, Paul T.; Fuchs, Douglas; Hamlett, Carol L.; Fletcher, Jack M.

    2012-01-01

    This study addressed predictors of the development of word problem solving (WPS) across the intermediate grades. At the beginning of 3rd grade, 4 cohorts of students (N = 261) were measured on computation, language, nonverbal reasoning skills, and attentive behavior and were assessed 4 times from the beginning of 3rd through the end of 5th grade on 2 measures of WPS at low and high levels of complexity. Language skills were related to initial performance at both levels of complexity and did not predict growth at either level. Computational skills had an effect on initial performance in low- but not high-complexity problems and did not predict growth at either level of complexity. Attentive behavior did not predict initial performance but did predict growth for low-complexity problems, whereas it predicted initial performance but not growth for high-complexity problems. Nonverbal reasoning predicted initial performance and growth for low-complexity WPS, but only growth for high-complexity WPS. This evidence suggests that although mathematical structure is fixed, different cognitive resources may act as limiting factors in WPS development when the WPS context is varied. PMID:23325985

  4. 42 CFR 493.25 - Laboratories performing tests of high complexity.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 5 2014-10-01 2014-10-01 false Laboratories performing tests of high complexity. 493.25 Section 493.25 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY REQUIREMENTS General Provisions § 493.25 Laboratories performing tests of high...

  5. [Occupational complexity and late-life memory and reasoning abilities].

    PubMed

    Ishioka, Yoshiko; Gondo, Yasuyuki; Masui, Yukie; Nakagawa, Takeshi; Tabuchi, Megumi; Ogawa, Madoka; Kamide, Kei; Ikebe, Kazunori; Arai, Yasumichi; Ishizaki, Tatsuro; Takahashi, Ryutaro

    2015-08-01

    This study examined the associations between the complexity of an individual's primary lifetime occupation and his or her late-life memory and reasoning performance, using data from 824 community-dwelling participants aged 69-72 years. The complexity of work with data, people, and things was evaluated based on the Japanese job complexity score. The associations between occupational complexity and participants' memory and reasoning abilities were examined in multiple regression analyses. An association was found between more complex work with people and higher memory performance, as well as between more complex work with data and higher reasoning performance, after having controlled for gender, school records, and education. Further, an interaction effect was observed between gender and complexity of work with data in relation to reasoning performance: work involving a high degree of complexity with data was associated with high reasoning performance in men. These findings suggest the need to consider late-life cognitive functioning within the context of adulthood experiences, specifically those related to occupation and gender.

  6. FLAME: A platform for high performance computing of complex systems, applied for three case studies

    DOE PAGES

    Kiran, Mariam; Bicak, Mesude; Maleki-Dizaji, Saeedeh; ...

    2011-01-01

    FLAME allows complex models to be automatically parallelised on High Performance Computing (HPC) grids, enabling large numbers of agents to be simulated over short periods of time. Modellers are typically hindered by the complexity of porting models to parallel platforms and by the time taken to run large simulations on a single machine; FLAME overcomes both. Three case studies from different disciplines were modelled using FLAME and are presented along with their performance results on a grid.

  7. 42 CFR 493.1467 - Condition: Laboratories performing high complexity testing; cytology general supervisor.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... testing; cytology general supervisor. 493.1467 Section 493.1467 Public Health CENTERS FOR MEDICARE....1467 Condition: Laboratories performing high complexity testing; cytology general supervisor. For the subspecialty of cytology, the laboratory must have a general supervisor who meets the qualification...

  8. 42 CFR 493.1467 - Condition: Laboratories performing high complexity testing; cytology general supervisor.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... testing; cytology general supervisor. 493.1467 Section 493.1467 Public Health CENTERS FOR MEDICARE....1467 Condition: Laboratories performing high complexity testing; cytology general supervisor. For the subspecialty of cytology, the laboratory must have a general supervisor who meets the qualification...

  9. 42 CFR 493.1467 - Condition: Laboratories performing high complexity testing; cytology general supervisor.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... testing; cytology general supervisor. 493.1467 Section 493.1467 Public Health CENTERS FOR MEDICARE....1467 Condition: Laboratories performing high complexity testing; cytology general supervisor. For the subspecialty of cytology, the laboratory must have a general supervisor who meets the qualification...

  10. 42 CFR 493.1467 - Condition: Laboratories performing high complexity testing; cytology general supervisor.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... testing; cytology general supervisor. 493.1467 Section 493.1467 Public Health CENTERS FOR MEDICARE....1467 Condition: Laboratories performing high complexity testing; cytology general supervisor. For the subspecialty of cytology, the laboratory must have a general supervisor who meets the qualification...

  11. 42 CFR 493.1467 - Condition: Laboratories performing high complexity testing; cytology general supervisor.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... testing; cytology general supervisor. 493.1467 Section 493.1467 Public Health CENTERS FOR MEDICARE....1467 Condition: Laboratories performing high complexity testing; cytology general supervisor. For the subspecialty of cytology, the laboratory must have a general supervisor who meets the qualification...

  12. 42 CFR 493.1481 - Condition: Laboratories performing high complexity testing; cytotechnologist.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Condition: Laboratories performing high complexity testing; cytotechnologist. 493.1481 Section 493.1481 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY...

  13. 42 CFR 493.1481 - Condition: Laboratories performing high complexity testing; cytotechnologist.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Condition: Laboratories performing high complexity testing; cytotechnologist. 493.1481 Section 493.1481 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY...

  14. 42 CFR 493.1453 - Condition: Laboratories performing high complexity testing; clinical consultant.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... testing; clinical consultant. 493.1453 Section 493.1453 Public Health CENTERS FOR MEDICARE & MEDICAID... Condition: Laboratories performing high complexity testing; clinical consultant. The laboratory must have a clinical consultant who meets the requirements of § 493.1455 of this subpart and provides clinical...

  15. 42 CFR 493.1459 - Condition: Laboratories performing high complexity testing; general supervisor.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Condition: Laboratories performing high complexity testing; general supervisor. 493.1459 Section 493.1459 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY...

  16. 42 CFR 493.1441 - Condition: Laboratories performing high complexity testing; laboratory director.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Condition: Laboratories performing high complexity testing; laboratory director. 493.1441 Section 493.1441 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY...

  17. 42 CFR 493.1441 - Condition: Laboratories performing high complexity testing; laboratory director.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Condition: Laboratories performing high complexity testing; laboratory director. 493.1441 Section 493.1441 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY...

  18. 42 CFR 493.1453 - Condition: Laboratories performing high complexity testing; clinical consultant.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Condition: Laboratories performing high complexity testing; clinical consultant. 493.1453 Section 493.1453 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY...

  19. 42 CFR 493.1487 - Condition: Laboratories performing high complexity testing; testing personnel.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Condition: Laboratories performing high complexity testing; testing personnel. 493.1487 Section 493.1487 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY...

  20. 42 CFR 493.1447 - Condition: Laboratories performing high complexity testing; technical supervisor.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Condition: Laboratories performing high complexity testing; technical supervisor. 493.1447 Section 493.1447 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY...

  1. 42 CFR 493.1459 - Condition: Laboratories performing high complexity testing; general supervisor.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Condition: Laboratories performing high complexity testing; general supervisor. 493.1459 Section 493.1459 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY...

  2. 42 CFR 493.1487 - Condition: Laboratories performing high complexity testing; testing personnel.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Condition: Laboratories performing high complexity testing; testing personnel. 493.1487 Section 493.1487 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY...

  3. 42 CFR 493.1447 - Condition: Laboratories performing high complexity testing; technical supervisor.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Condition: Laboratories performing high complexity testing; technical supervisor. 493.1447 Section 493.1447 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY...

  4. Combined effects of altitude and high temperature on complex performance.

    DOT National Transportation Integrated Search

    1971-04-01

    Nine well-trained subjects were tested on a complex performance device designed to assess functions of relevance to aircrew activities. The tests, which involved tracking, monitoring, and mental arithmetic, were performed during exposure to altitude ...

  5. Complex matrix multiplication operations with data pre-conditioning in a high performance computing architecture

    DOEpatents

    Eichenberger, Alexandre E; Gschwind, Michael K; Gunnels, John A

    2014-02-11

    Mechanisms for performing a complex matrix multiplication operation are provided. A vector load operation is performed to load a first vector operand of the complex matrix multiplication operation to a first target vector register. The first vector operand comprises a real and imaginary part of a first complex vector value. A complex load and splat operation is performed to load a second complex vector value of a second vector operand and replicate the second complex vector value within a second target vector register. The second complex vector value has a real and imaginary part. A cross multiply add operation is performed on elements of the first target vector register and elements of the second target vector register to generate a partial product of the complex matrix multiplication operation. The partial product is accumulated with other partial products and a resulting accumulated partial product is stored in a result vector register.
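
    The record above describes the operation in terms of three vector primitives: a vector load, a complex load-and-splat, and a cross multiply add. The NumPy sketch below is only a functional illustration of how such primitives could combine to accumulate partial products for complex data stored as interleaved real/imaginary lanes; the register width, data layout, and helper names are assumptions made for illustration, not the patented hardware mechanism.

    ```python
    import numpy as np

    WIDTH = 4  # assumed number of complex lanes per vector register

    def vector_load(mem, i):
        """Load WIDTH consecutive complex values (interleaved re, im) into a 'register'."""
        return np.array(mem[i:i + 2 * WIDTH], dtype=float)

    def complex_load_splat(mem, i):
        """Load one complex value and replicate (splat) it across all lanes."""
        return np.tile([mem[i], mem[i + 1]], WIDTH).astype(float)

    def cross_multiply_add(acc, va, vb):
        """Accumulate the lane-wise complex products of two interleaved registers."""
        ar, ai = va[0::2], va[1::2]
        br, bi = vb[0::2], vb[1::2]
        acc[0::2] += ar * br - ai * bi   # real parts of the partial products
        acc[1::2] += ar * bi + ai * br   # imaginary parts of the partial products
        return acc

    # Toy demo: four complex values from a row of A times one complex value of B.
    a_row = [1.0, 2.0,  3.0, -1.0,  0.5, 0.0,  2.0, 1.0]   # (1+2j, 3-1j, 0.5, 2+1j)
    b_val = [2.0, -1.0]                                     # 2-1j
    acc = np.zeros(2 * WIDTH)
    acc = cross_multiply_add(acc, vector_load(a_row, 0), complex_load_splat(b_val, 0))
    print(acc[0::2] + 1j * acc[1::2])   # [4+3j, 5-5j, 1-0.5j, 5+0j]
    ```

    Splatting one value of the second operand against a whole vector of first-operand values yields several partial products of one output row per operation, which is what makes the accumulate-and-store loop of the matrix multiplication efficient.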

  6. Progress on high-performance rapid prototype aluminum mirrors

    NASA Astrophysics Data System (ADS)

    Woodard, Kenneth S.; Myrick, Bruce H.

    2017-05-01

    Near net shape parts can be produced using some very old processes (investment casting) and the relatively new direct metal laser sintering (DMLS) process. These processes have significant advantages for complex blank lightweighting and costs but are not inherently suited for producing high performance mirrors. The DMLS process can provide extremely complex lightweight structures but the high residual stresses left in the material result in unstable mirror figure retention. Although not to the extreme intricacy of DMLS, investment casting can also provide complex lightweight structures at considerably lower costs than DMLS and even conventional wrought mirror blanks but the less than 100% density for casting (and also DMLS) limits finishing quality. This paper will cover the progress that has been made to make both the DMLS and investment casting processes into viable near net shape blank options for high performance aluminum mirrors. Finish and figure results will be presented to show performance commensurate with existing conventional processes.

  7. DCL System Using Deep Learning Approaches for Land-Based or Ship-Based Real Time Recognition and Localization of Marine Mammals

    DTIC Science & Technology

    2015-09-30

    Clark (2014), "Using High Performance Computing to Explore Large Complex Bioacoustic Soundscapes : Case Study for Right Whale Acoustics," Procedia...34Using High Performance Computing to Explore Large Complex Bioacoustic Soundscapes : Case Study for Right Whale Acoustics," Procedia Computer Science 20

  8. Using reversed phase high performance liquid chromatography to study the complexation of anthocyanins with β-cyclodextrin

    NASA Astrophysics Data System (ADS)

    Deineka, V. I.; Lapshova, M. S.; Deineka, L. A.

    2014-06-01

    It is shown by means of reversed phase high performance liquid chromatography (RP HPLC) with mobile phases containing additions of β-cyclodextrin that 5-glucosides of cyanidin and pelargonidin form stronger inclusion complexes than 3-glucosides; this is explained by the steric interference of the glucoside radical.

  9. Can Knowledge of the Characteristics of "High Performers" Be Generalised?

    ERIC Educational Resources Information Center

    McKenna, Stephen

    2002-01-01

    Two managers described as high performing constructed complexity maps of their organization/world. The maps suggested that high performance is socially constructed and negotiated in specific contexts and management competencies associated with it are context specific. Development of high performers thus requires personalized coaching more than…

  10. Effect of motion cues during complex curved approach and landing tasks: A piloted simulation study

    NASA Technical Reports Server (NTRS)

    Scanlon, Charles H.

    1987-01-01

    A piloted simulation study was conducted to examine the effect of motion cues using a high fidelity simulation of commercial aircraft during the performance of complex approach and landing tasks in the Microwave Landing System (MLS) signal environment. The data from these tests indicate that in a high complexity MLS approach task with moderate turbulence and wind, the pilot uses motion cues to improve path tracking performance. No significant differences in tracking accuracy were noted for the low and medium complexity tasks, regardless of the presence of motion cues. Higher control input rates were measured for all tasks when motion was used. Pilot eye scan, as measured by instrument dwell time, was faster when motion cues were used regardless of the complexity of the approach tasks. Pilot comments indicated a preference for motion. With motion cues, pilots appeared to work harder in all levels of task complexity and to improve tracking performance in the most complex approach task.

  11. Advances In High Temperature (Viscoelastoplastic) Material Modeling for Thermal Structural Analysis

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Saleeb, Atef F.

    2005-01-01

    Typical high-temperature applications demand high-performance materials: (1) complex thermomechanical loading; (2) complex material response requiring time-dependent/hereditary (viscoelastic/viscoplastic) models; and (3) comprehensive characterization (tensile, creep, relaxation) for a variety of material systems.

  12. High microwave attenuation performance of planar carbonyl iron particles with orientation of shape anisotropy field

    NASA Astrophysics Data System (ADS)

    Guo, Cheng; Yang, Zhihong; Shen, Shile; Liang, Juan; Xu, Guoyue

    2018-05-01

    Planar anisotropy carbonyl iron (PACI) particles were prepared from commercial spherical carbonyl iron particles through a high performance ball-milling technique. The paraffin composites with orientation of shape anisotropy field for these PACI particles were obtained by applying an external magnetic field during the fabrication process. The frequency-dependent complex permeability values of these prepared paraffin composites have been investigated in the frequency range of 1-18 GHz. The results demonstrate that the orientation of shape anisotropy field for these PACI particles can effectively increase the complex permeability and decrease the complex permittivity values. Benefiting from the enhanced complex permeability and reduced complex permittivity, better impedance matching can be obtained, and thus good microwave absorption performance can be achieved for samples with sufficient magnetic-field orientation time.

  13. Early experiences in developing and managing the neuroscience gateway.

    PubMed

    Sivagnanam, Subhashini; Majumdar, Amit; Yoshimoto, Kenneth; Astakhov, Vadim; Bandrowski, Anita; Martone, MaryAnn; Carnevale, Nicholas T

    2015-02-01

    The last few decades have seen the emergence of computational neuroscience as a mature field where researchers are interested in modeling complex and large neuronal systems and require access to high performance computing machines and associated cyber infrastructure to manage computational workflow and data. The neuronal simulation tools used in this research field are also implemented for parallel computers and are suitable for high performance computing machines. But using these tools on complex high performance computing machines remains a challenge because of issues with acquiring computer time on these machines located at national supercomputer centers, dealing with the complex user interfaces of these machines, and dealing with data management and retrieval. The Neuroscience Gateway is being developed to alleviate and/or hide these barriers to entry for computational neuroscientists. It hides or eliminates, from the point of view of the users, all the administrative and technical barriers and makes parallel neuronal simulation tools easily available and accessible on complex high performance computing machines. It handles the running of jobs and data management and retrieval. This paper shares the early experiences in bringing up this gateway and describes the software architecture it is based on, how it is implemented, and how users can use it for computational neuroscience research using high performance computing at the back end. We also look at parallel scaling of some publicly available neuronal models and analyze the recent usage data of the neuroscience gateway.

  14. Early experiences in developing and managing the neuroscience gateway

    PubMed Central

    Sivagnanam, Subhashini; Majumdar, Amit; Yoshimoto, Kenneth; Astakhov, Vadim; Bandrowski, Anita; Martone, MaryAnn; Carnevale, Nicholas T.

    2015-01-01

    SUMMARY The last few decades have seen the emergence of computational neuroscience as a mature field where researchers are interested in modeling complex and large neuronal systems and require access to high performance computing machines and associated cyber infrastructure to manage computational workflow and data. The neuronal simulation tools used in this research field are also implemented for parallel computers and are suitable for high performance computing machines. But using these tools on complex high performance computing machines remains a challenge because of issues with acquiring computer time on these machines located at national supercomputer centers, dealing with the complex user interfaces of these machines, and dealing with data management and retrieval. The Neuroscience Gateway is being developed to alleviate and/or hide these barriers to entry for computational neuroscientists. It hides or eliminates, from the point of view of the users, all the administrative and technical barriers and makes parallel neuronal simulation tools easily available and accessible on complex high performance computing machines. It handles the running of jobs and data management and retrieval. This paper shares the early experiences in bringing up this gateway and describes the software architecture it is based on, how it is implemented, and how users can use it for computational neuroscience research using high performance computing at the back end. We also look at parallel scaling of some publicly available neuronal models and analyze the recent usage data of the neuroscience gateway. PMID:26523124

  15. Subband Image Coding with Jointly Optimized Quantizers

    NASA Technical Reports Server (NTRS)

    Kossentini, Faouzi; Chung, Wilson C.; Smith, Mark J. T.

    1995-01-01

    An iterative design algorithm for the joint design of complexity- and entropy-constrained subband quantizers and associated entropy coders is proposed. Unlike conventional subband design algorithms, the proposed algorithm does not require the use of various bit allocation algorithms. Multistage residual quantizers are employed here because they provide greater control of the complexity-performance tradeoffs, and also because they allow efficient and effective high-order statistical modeling. The resulting subband coder exploits statistical dependencies within subbands, across subbands, and across stages, mainly through complexity-constrained high-order entropy coding. Experimental results demonstrate that the complexity-rate-distortion performance of the new subband coder is exceptional.

  16. Durable pectin/chitosan membranes with self-assembling, water resistance and enhanced mechanical properties.

    PubMed

    Martins, Jéssica G; de Oliveira, Ariel C; Garcia, Patrícia S; Kipper, Matt J; Martins, Alessandro F

    2018-05-15

    Processing water-soluble polysaccharides, like pectin (PT), into materials with desirable stability and mechanical properties has been challenging. Here we report a new method to create water-stable and mechanically resistant polyelectrolyte complex (PEC) membranes from PT and chitosan (CS) assemblies, without covalent crosslinking. This new method overcomes challenges of obtaining stable and durable complexes, by performing the complexation at low pH, enabling complex formation even when using an excess of PT, and when using PT with a high degree of O-methoxylation. By performing the complexation at low pH, the complexes form with a high degree of intermolecular association, instead of forming by electrostatic complexation. This method avoids precipitation, and overcomes the aqueous instability typical of PT/CS complexes. After neutralization, the PEC membranes display features characteristic of a high degree of intermolecular association because of the self-assembly of the polymer chains. The PT/CS ratio can be tuned to enhance the mechanical strength (σ = 39 MPa) of the membranes. These polysaccharide-based materials can demonstrate advantages over synthetic materials for technological applications. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. ASIC For Complex Fixed-Point Arithmetic

    NASA Technical Reports Server (NTRS)

    Petilli, Stephen G.; Grimm, Michael J.; Olson, Erlend M.

    1995-01-01

    Application-specific integrated circuit (ASIC) performs 24-bit, fixed-point arithmetic operations on arrays of complex-valued input data. High-performance, wide-band arithmetic logic unit (ALU) designed for use in computing fast Fourier transforms (FFTs) and for performing digital filtering functions. Other applications include general computations involved in analysis of spectra and digital signal processing.
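
    As a point of reference for what 24-bit fixed-point complex arithmetic involves, the sketch below implements a complex multiply in an assumed Q1.23 format with round-to-nearest and saturation; the word format and rounding behavior are assumptions, not details taken from the ASIC described above.

    ```python
    FRAC_BITS = 23                       # assumed Q1.23 format: 1 sign bit, 23 fraction bits
    MAX_VAL = (1 << FRAC_BITS) - 1
    MIN_VAL = -(1 << FRAC_BITS)

    def saturate(x):
        """Clamp an integer to the representable 24-bit two's-complement range."""
        return max(MIN_VAL, min(MAX_VAL, x))

    def fx_mul(a, b):
        """Fixed-point multiply: full-precision product, round to nearest, saturate."""
        prod = a * b
        prod = (prod + (1 << (FRAC_BITS - 1))) >> FRAC_BITS
        return saturate(prod)

    def fx_cmul(ar, ai, br, bi):
        """Fixed-point complex multiply: (ar + j*ai) * (br + j*bi) -> (re, im)."""
        re = saturate(fx_mul(ar, br) - fx_mul(ai, bi))
        im = saturate(fx_mul(ar, bi) + fx_mul(ai, br))
        return re, im

    def to_fx(x):
        """Quantize a float in [-1, 1) to the fixed-point format."""
        return saturate(int(round(x * (1 << FRAC_BITS))))

    def to_float(x):
        return x / (1 << FRAC_BITS)

    re, im = fx_cmul(to_fx(0.5), to_fx(-0.25), to_fx(0.3), to_fx(0.8))
    print(to_float(re), to_float(im))    # ~0.35, ~0.325 == (0.5-0.25j)*(0.3+0.8j)
    ```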

  18. Complex monitoring performance and the coronary-prone Type A behavior pattern.

    DOT National Transportation Integrated Search

    1986-03-01

    The present study examined the possible relationship of the coronary-prone Type A behavior pattern to performance of a complex monitoring task. The task was designed to functionally simulate the general task characteristics of future, highly automate...

  19. An exploratory investigation of various assessment instruments as correlates of complex visual monitoring performance.

    DOT National Transportation Integrated Search

    1980-10-01

    The present study examined a variety of possible predictors of complex monitoring performance. The criterion task was designed to resemble that of a highly automated air traffic control radar system containing computer-generated alphanumeric displays...

  20. Effects of noise frequency on performance and annoyance for women and men

    NASA Technical Reports Server (NTRS)

    Key, K. F.; Payne, M. C., Jr.

    1981-01-01

    Effects of noise frequencies on both performance on a complex psychomotor task and annoyance were investigated for men (n = 30) and women (n = 30). Each subject performed a complex psychomotor task for 50 min in the presence of low-frequency noise, high-frequency noise, or ambient noise. Women and men learned the task at different rates. Little effect of noise was shown. Annoyance ratings were subsequently obtained from each subject for noises of various frequencies by the method of magnitude estimation. High-frequency noises were more annoying than low-frequency noises regardless of sex and immediate prior exposure to noise. Sex differences in annoyance did not occur. No direct relationship between learning to perform a complex task while exposed to noise and annoyance by that noise was demonstrated.

  1. Implicit Coordination Strategies for Effective Team Communication.

    PubMed

    Butchibabu, Abhizna; Sparano-Huiban, Christopher; Sonenberg, Liz; Shah, Julie

    2016-06-01

    We investigated implicit communication strategies for anticipatory information sharing during team performance of tasks with varying degrees of complexity. We compared the strategies used by teams with the highest level of performance to those used by the lowest-performing teams to evaluate the frequency and methods of communications used as a function of task structure. High-performing teams share information by anticipating the needs of their teammates rather than explicitly requesting the exchange of information. As the complexity of a task increases to involve more interdependence among teammates, the impact of coordination on team performance also increases. This observation motivated us to conduct a study of anticipatory information sharing as a function of task complexity. We conducted an experiment in which 13 teams of four people performed collaborative search-and-deliver tasks with varying degrees of complexity in a simulation environment. We elaborated upon prior characterizations of communication as implicit versus explicit by dividing implicit communication into two subtypes: (a) deliberative/goal information and (b) reactive status updates. We then characterized relationships between task structure, implicit communication, and team performance. We found that the five teams with the fastest task completion times and lowest idle times exhibited higher rates of deliberative communication versus reactive communication during high-complexity tasks compared with the five teams with the slowest completion times and longest idle times (p = .039). Teams in which members proactively communicated information about their next goal to teammates exhibited improved team performance. The findings from our work can inform the design of communication strategies for team training to improve performance of complex tasks. © 2016, Human Factors and Ergonomics Society.

  2. Fabrication of the tetrathiafulvalene–2,3,5,6-tetrafluoro-7,7,8,8-tetracyanoquinodimethane charge transfer complex with high crystallinity by eutectic melting method

    NASA Astrophysics Data System (ADS)

    Kim, Jueun; Kang, Youngjong; Lee, Jaejong

    2018-06-01

    We show that high crystallinity and a charge-transport gain can be obtained in a novel donor–acceptor system (CT complex) composed of the organic complex tetrathiafulvalene–2,3,5,6-tetrafluoro-7,7,8,8-tetracyanoquinodimethane (TTF–F4TCNQ). The complex is a small-gap (less than 1 eV) organic metal or semiconductor, and we predict it to have a high conductivity. We present an approach to fabricating an organic CT complex with high crystallinity by the eutectic melting method. Our process is simple and shows crystal growth with improved crystallinity when combined with soft lithography.

  3. A Study of Complex Deep Learning Networks on High Performance, Neuromorphic, and Quantum Computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Potok, Thomas E; Schuman, Catherine D; Young, Steven R

    Current Deep Learning models use highly optimized convolutional neural networks (CNN) trained on large graphical processing units (GPU)-based computers with a fairly simple layered network topology, i.e., highly connected layers, without intra-layer connections. Complex topologies have been proposed, but are intractable to train on current systems. Building the topologies of the deep learning network requires hand tuning, and implementing the network in hardware is expensive in both cost and power. In this paper, we evaluate deep learning models using three different computing architectures to address these problems: quantum computing to train complex topologies, high performance computing (HPC) to automatically determine network topology, and neuromorphic computing for a low-power hardware implementation. Due to input size limitations of current quantum computers we use the MNIST dataset for our evaluation. The results show the possibility of using the three architectures in tandem to explore complex deep learning networks that are untrainable using a von Neumann architecture. We show that a quantum computer can find high quality values of intra-layer connections and weights, while yielding a tractable time result as the complexity of the network increases; a high performance computer can find optimal layer-based topologies; and a neuromorphic computer can represent the complex topology and weights derived from the other architectures in low power memristive hardware. This represents a new capability that is not feasible with current von Neumann architecture. It potentially enables the ability to solve very complicated problems unsolvable with current computing technologies.

  4. Enhancement of Hydrodynamic Processes in Oil Pipelines Considering Rheologically Complex High-Viscosity Oils

    NASA Astrophysics Data System (ADS)

    Konakhina, I. A.; Khusnutdinova, E. M.; Khamidullina, G. R.; Khamidullina, A. F.

    2016-06-01

    This paper describes a mathematical model of flow-related hydrodynamic processes for rheologically complex high-viscosity bitumen oil and oil-water suspensions and presents methods to improve the design and performance of oil pipelines.

  5. Dribble Files: Methodologies to Evaluate Learning and Performance in Complex Environments

    ERIC Educational Resources Information Center

    Schrader, P. G.; Lawless, Kimberly A.

    2007-01-01

    Research in the area of technology learning environments is tremendously complex. Tasks performed in these contexts are highly cognitive and mostly invisible to the observer. The nature of performance in these contexts is explained not only by the outcome but also by the process. However, evaluating the learning process with respect to tasks…

  6. CN-GELFrEE - Clear Native Gel-eluted Liquid Fraction Entrapment Electrophoresis

    PubMed Central

    Skinner, Owen S.; Do Vale, Luis H. F.; Catherman, Adam D.; Havugimana, Pierre C.; Valle de Sousa, Marcelo; Domont, Gilberto B.; Kelleher, Neil L.; Compton, Philip D.

    2016-01-01

    Protein complexes perform an array of crucial cellular functions. Elucidating their non-covalent interactions and dynamics is paramount for understanding the role of complexes in biological systems. While the direct characterization of biomolecular assemblies has become increasingly important in recent years, native fractionation techniques that are compatible with downstream analysis techniques, including mass spectrometry, are necessary to further expand these studies. Nevertheless, the field lacks a high-throughput, wide-range, high-recovery separation method for native protein assemblies. Here, we present clear native gel-eluted liquid fraction entrapment electrophoresis (CN-GELFrEE), which is a novel separation modality for non-covalent protein assemblies. CN-GELFrEE separation performance was demonstrated by fractionating complexes extracted from mouse heart. Fractions were collected over 2 hr and displayed discrete bands ranging from ~30 to 500 kDa. A consistent pattern of increasing molecular weight bandwidths was observed, each ranging ~100 kDa. Further, subsequent reanalysis of native fractions via SDS-PAGE showed molecular-weight shifts consistent with the denaturation of protein complexes. Therefore, CN-GELFrEE was proved to offer the ability to perform high-resolution and high-recovery native separations on protein complexes from a large molecular weight range, providing fractions that are compatible with downstream protein analyses. PMID:26967310

  7. Contribution of strategy use to performance on complex and simple span tasks.

    PubMed

    Bailey, Heather; Dunlosky, John; Kane, Michael J

    2011-04-01

    Simple and complex span tasks are widely thought to measure related but separable memory constructs. Recently, however, research has demonstrated that simple and complex span tasks may tap, in part, the same construct because both similarly predict performance on measures of fluid intelligence (Gf) when the number of items retrieved from secondary memory (SM) is equated (Unsworth & Engle, Journal of Memory and Language 54:68-80 2006). Two studies (n = 105 and n = 152) evaluated whether retrieval from SM is influenced by individual differences in the use of encoding strategies during span tasks. Results demonstrated that, after equating the number of items retrieved from SM, simple and complex span performance similarly predicted Gf performance, but rates of effective strategy use did not mediate the span-Gf relationships. Moreover, at the level of individual differences, effective strategy use was more highly related to complex span performance than to simple span performance. Thus, even though individual differences in effective strategy use influenced span performance on trials that required retrieval from SM, strategic behavior at encoding cannot account for the similarities between simple and complex span tasks.

  8. Anion-exchange high-performance liquid chromatography of water-soluble chromium (VI) and chromium (III) complexes in biological materials.

    PubMed

    Suzuki, Y

    1987-04-10

    A high-performance anion-exchange liquid chromatograph coupled to visible-range (370 nm) and UV (280 nm) detectors and an atomic-absorption spectrometer allowed the rapid determination of CrVI and/or complexes of CrIII in rat plasma, erythrocyte lysate and liver supernatant treated with CrVI or CrIII in vitro. CrVI in the eluates was determined using both the visible-range detector and atomic-absorption spectrometer (AAS). The detection limits of CrVI in standard solutions using these methods were 2 and 5 ng (signal-to-noise ratio = 2), respectively. Separations of the biological components and of CrIII complexes were monitored by UV and AAS analyses, respectively. Time-related decreases of CrVI accompanied by increases in CrIII complexes were observed, indicating the reduction of CrVI by some of the biological components. The reduction rates were considerably higher in the liver supernatant and erythrocyte lysate than in the plasma. These results indicate that the anion-exchange high-performance liquid chromatographic system is useful for simultaneous determination of CrVI and CrIII complexes in biological materials.

  9. Performance of children with developmental dyslexia on high and low topological entropy artificial grammar learning task.

    PubMed

    Katan, Pesia; Kahta, Shani; Sasson, Ayelet; Schiff, Rachel

    2017-07-01

    Graph complexity as measured by topological entropy has been previously shown to affect performance on artificial grammar learning tasks among typically developing children. The aim of this study was to examine the effect of graph complexity on implicit sequential learning among children with developmental dyslexia. Our goal was to determine whether children's performance depends on the complexity level of the grammar system learned. We conducted two artificial grammar learning experiments that compared performance of children with developmental dyslexia with that of age- and reading level-matched controls. Experiment 1 was a high topological entropy artificial grammar learning task that aimed to establish implicit learning phenomena in children with developmental dyslexia using previously published experimental conditions. Experiment 2 is a lower topological entropy variant of that task. Results indicated that given a high topological entropy grammar system, children with developmental dyslexia who were similar to the reading age-matched control group had substantial difficulty in performing the task as compared to typically developing children, who exhibited intact implicit learning of the grammar. On the other hand, when tested on a lower topological entropy grammar system, all groups performed above chance level, indicating that children with developmental dyslexia were able to identify rules from a given grammar system. The results reinforced the significance of graph complexity when experimenting with artificial grammar learning tasks, particularly with dyslexic participants.
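
    Topological entropy, the graph-complexity measure used in both experiments above, is commonly computed for a finite-state grammar as the logarithm of the spectral radius of its transition-graph adjacency matrix. The sketch below applies that textbook definition to two small made-up grammars; the example grammars are assumptions and are not those used in the study.

    ```python
    import numpy as np

    def topological_entropy(adjacency):
        """Natural log of the spectral radius of a grammar's transition graph."""
        eigenvalues = np.linalg.eigvals(np.asarray(adjacency, dtype=float))
        return float(np.log(np.max(np.abs(eigenvalues))))

    # Two hypothetical 4-state grammars: allowing more transitions out of each
    # state (a denser adjacency matrix) raises the topological entropy.
    low_entropy_grammar = [[0, 1, 0, 0],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1],
                           [1, 0, 0, 0]]   # a single deterministic cycle: entropy ~0

    high_entropy_grammar = [[0, 1, 1, 0],
                            [1, 0, 1, 1],
                            [0, 1, 0, 1],
                            [1, 0, 1, 0]]

    print(topological_entropy(low_entropy_grammar))    # ~0.0
    print(topological_entropy(high_entropy_grammar))   # > 0: the more complex grammar
    ```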

  10. Robust Timing Synchronization in Aeronautical Mobile Communication Systems

    NASA Technical Reports Server (NTRS)

    Xiong, Fu-Qin; Pinchak, Stanley

    2004-01-01

    This work details a study of robust synchronization schemes suitable for satellite to mobile aeronautical applications. A new scheme, the Modified Sliding Window Synchronizer (MSWS), is devised and compared with existing schemes, including the traditional Early-Late Gate Synchronizer (ELGS), the Gardner Zero-Crossing Detector (GZCD), and the Sliding Window Synchronizer (SWS). Performance of the synchronization schemes is evaluated by a set of metrics that indicate performance in digital communications systems. The metrics are convergence time, mean square phase error (or root mean-square phase error), lowest SNR for locking, initial frequency offset performance, midstream frequency offset performance, and system complexity. The performance of the synchronizers is evaluated by means of Matlab simulation models. A simulation platform is devised to model the satellite to mobile aeronautical channel, consisting of a Quadrature Phase Shift Keying modulator, an additive white Gaussian noise channel, and a demodulator front end. Simulation results show that the MSWS provides the most robust performance at the cost of system complexity. The GZCD provides a good tradeoff between robustness and system complexity for communication systems that require high symbol rates or low overall system costs. The ELGS has a high system complexity despite its average performance. Overall, the SWS, originally designed for multi-carrier systems, performs very poorly in single-carrier communications systems. Table 5.1 in Section 5 provides a ranking of each of the synchronization schemes in terms of the metrics set forth in Section 4.1. Details of comparison are given in Section 5. Based on the results presented in Table 5, it is safe to say that the most robust synchronization scheme examined in this work is the high-sample-rate Modified Sliding Window Synchronizer. A close second is its low-sample-rate cousin. The tradeoff between complexity and lowest mean-square phase error determines the rankings of the Gardner Zero-Crossing Detector and both versions of the Early-Late Gate Synchronizer. The least robust models are the high and low-sample-rate Sliding Window Synchronizers. Consequently, the recommended replacement synchronizer for NASA's Advanced Air Transportation Technologies mobile aeronautical communications system is the high-sample-rate Modified Sliding Window Synchronizer. By incorporating this synchronizer into their system, NASA can be assured that their system will be operational in extremely adverse conditions. The quick convergence time of the MSWS should allow the use of high-level protocols. However, if NASA feels that reduced system complexity is the most important aspect of their replacement synchronizer, the Gardner Zero-Crossing Detector would be the best choice.
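
    Of the synchronizers compared above, the Gardner Zero-Crossing Detector has a particularly compact core: a non-data-aided timing error computed from on-time and mid-symbol samples. The sketch below shows only that error computation, under an assumed 2-samples-per-symbol complex baseband signal; it is a generic textbook form of the detector, not the Modified Sliding Window Synchronizer devised in the report.

    ```python
    import numpy as np

    def gardner_ted(signal):
        """Gardner timing error detector for a 2-samples-per-symbol complex signal.

        signal[0::2] are on-time (symbol) samples, signal[1::2] are mid-symbol
        samples.  Returns one non-data-aided timing-error estimate per symbol.
        """
        on_time = signal[0::2]
        midway = signal[1::2]
        errors = [np.real(np.conj(midway[k - 1]) * (on_time[k] - on_time[k - 1]))
                  for k in range(1, len(on_time))]
        return np.array(errors)

    # Toy check: QPSK symbols with the mid-sample taken exactly half-way along each
    # (assumed linear) transition, i.e. perfect timing, give zero detector output.
    symbols = np.array([1+1j, -1+1j, -1-1j, 1-1j, 1+1j]) / np.sqrt(2)
    signal = np.empty(2 * len(symbols) - 1, dtype=complex)
    signal[0::2] = symbols
    signal[1::2] = (symbols[:-1] + symbols[1:]) / 2
    print(np.allclose(gardner_ted(signal), 0.0))   # True
    ```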

  11. Center for the Built Environment: Research on Controls and Information

    Science.gov Websites

    Research areas: Sustainability, Whole Building Energy, and commercial building energy use. Kresge Foundation Complex case study: analyzing performance of LEED Platinum criteria for high performance buildings. The first in-depth case study was ...

  12. Quality Management and Key Performance Indicators in Oncologic Esophageal Surgery.

    PubMed

    Gockel, Ines; Ahlbrand, Constantin Johannes; Arras, Michael; Schreiber, Elke Maria; Lang, Hauke

    2015-12-01

    Ranking systems and comparisons of quality and performance indicators will be of increasing relevance for complex "high-risk" procedures such as esophageal cancer surgery. The identification of evidence-based standards relevant for key performance indicators in esophageal surgery is essential for establishing monitoring systems and furthermore a requirement to enhance treatment quality. In the course of this review, we analyze the key performance indicators case volume, radicality of resection, and postoperative morbidity and mortality, leading to continuous quality improvement. Ranking systems established on this basis will gain increased relevance in highly complex procedures within the national and international comparison and furthermore improve the treatment of patients with esophageal carcinoma.

  13. Team Creative Environment as a Mediator Between CWX and R&D Team Performance and Moderating Boundary Conditions.

    PubMed

    Bornay-Barrachina, Mar; Herrero, Inés

    2018-01-01

    The purpose of this study was to investigate how high-quality dyadic co-worker relationships (CWXs) favour or hinder team performance. Specifically, we examine the role played by CWX, team creative environment, job complexity and task interdependence to achieve higher levels of team performance. We analyse data from 410 individuals belonging to 81 R&D teams in technology sciences to examine the quality of the dyadic relationships between team members under the same supervisor (co-workers) and team performance measured by the number of publications as their research output. Higher levels of team average CWX relationships are positively related to the establishment of a favourable creative team environment, resulting in higher levels of team performance. Specifically, the role played by team average CWX in this relationship is stronger when job complexity and task interdependence are also high. A team's output not only depends on the leader and his/her relationships with subordinates but also on quality relationships among team members. CWXs contribute to creative team environments, but they are essential where jobs are complex and tasks are highly dependent. This study provides evidence of the important role played by CWXs in determining a creative environment, irrespective of their leaders. Previous research has provided information about how a leader's role affects team outcomes, but the role of dyadic co-worker relationships in a team still remains relatively unknown. Considering job complexity and task interdependence variables, the study provides a better understanding about how and when high-quality CWXs should be promoted to achieve higher team performance.

  14. Handling Qualities of Model Reference Adaptive Controllers with Varying Complexity for Pitch-Roll Coupled Failures

    NASA Technical Reports Server (NTRS)

    Schaefer, Jacob; Hanson, Curt; Johnson, Marcus A.; Nguyen, Nhan

    2011-01-01

    Three model reference adaptive controllers (MRAC) with varying levels of complexity were evaluated on a high performance jet aircraft and compared along with a baseline nonlinear dynamic inversion controller. The handling qualities and performance of the controllers were examined during failure conditions that induce coupling between the pitch and roll axes. Results from flight tests showed that, with a roll-to-pitch input coupling failure, the handling qualities went from Level 2 with the baseline controller to Level 1 with the most complex MRAC tested. A failure scenario with the left stabilator frozen also showed improvement with the MRAC. Improvement in performance and handling qualities was generally seen as complexity was incrementally added; however, added complexity usually corresponds to increased verification and validation effort required for certification. The tradeoff between complexity and performance is thus important to a controls system designer when implementing an adaptive controller on an aircraft. This paper investigates this relation through flight testing of several controllers of varying complexity.
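
    For readers unfamiliar with the controller family being flight tested above, the sketch below simulates the simplest textbook model reference adaptive controller: a scalar feedforward gain adapted with the MIT rule on an assumed first-order plant. It illustrates the MRAC structure only; the plant, gains, and reference command are invented for the example and have no connection to the specific controllers or failure scenarios in the study.

    ```python
    # Assumed first-order plant y' = -a*y + k*u with unknown gain k, reference
    # model ym' = -a*ym + k0*r, control law u = theta*r, and MIT-rule adaptation
    # d(theta)/dt = -gamma * e * ym, where e = y - ym is the tracking error.
    a, k, k0 = 1.0, 2.0, 1.0
    gamma = 0.5
    dt, t_end = 0.001, 60.0

    y = ym = theta = 0.0
    for n in range(int(t_end / dt)):
        t = n * dt
        r = 1.0 if (t % 20.0) < 10.0 else -1.0    # square-wave reference command
        u = theta * r                              # adjustable feedforward gain
        e = y - ym                                 # tracking error
        y += dt * (-a * y + k * u)                 # plant (Euler integration)
        ym += dt * (-a * ym + k0 * r)              # reference model
        theta += dt * (-gamma * e * ym)            # MIT rule

    # theta should approach (though not exactly equal) the ideal gain k0/k.
    print("adapted gain:", round(theta, 3), " ideal gain k0/k:", k0 / k)
    ```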

  15. A low-complexity and high performance concatenated coding scheme for high-speed satellite communications

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Rhee, Dojun; Rajpal, Sandeep

    1993-01-01

    This report presents a low-complexity and high-performance concatenated coding scheme for high-speed satellite communications. In this proposed scheme, the NASA Standard Reed-Solomon (RS) code over GF(2^8) is used as the outer code and the second-order Reed-Muller (RM) code of Hamming distance 8 is used as the inner code. The RM inner code has a very simple trellis structure and is decoded with the soft-decision Viterbi decoding algorithm. It is shown that the proposed concatenated coding scheme achieves an error performance which is comparable to that of the NASA TDRS concatenated coding scheme in which the NASA Standard rate-1/2 convolutional code of constraint length 7 and dfree = 10 is used as the inner code. However, the proposed RM inner code has much smaller decoding complexity, less decoding delay, and much higher decoding speed. Consequently, the proposed concatenated coding scheme is suitable for reliable high-speed satellite communications, and it may be considered as an alternate coding scheme for the NASA TDRS system.
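
    Reading the inner code as RM(2,5), the second-order Reed-Muller code with minimum Hamming distance 8 (a (32,16) code; this parameter reading is an inference from the abstract, which does not state the length explicitly), the sketch below constructs its generator matrix and encodes a toy outer-code byte stream 16 bits at a time. The Reed-Solomon outer encoder and the soft-decision Viterbi trellis decoder mentioned in the abstract are omitted.

    ```python
    import itertools
    import numpy as np

    def rm_generator(r, m):
        """Generator matrix of RM(r, m): rows are evaluations, over all 2**m binary
        points, of the monomials in m variables of total degree <= r."""
        points = np.array(list(itertools.product([0, 1], repeat=m)), dtype=int)
        rows = []
        for degree in range(r + 1):
            for monomial in itertools.combinations(range(m), degree):
                row = np.ones(len(points), dtype=int)
                for v in monomial:
                    row = row * points[:, v]
                rows.append(row)
        return np.array(rows)

    def rm_encode(message_bits, G):
        """Encode a k-bit message over GF(2): codeword = message * G (mod 2)."""
        return np.mod(np.asarray(message_bits, dtype=int) @ G, 2)

    G = rm_generator(2, 5)          # RM(2,5): k = 16, n = 32, distance 2**(5-2) = 8
    k, n = G.shape
    print(k, n)                     # 16 32

    # Feed a toy outer-code byte stream to the inner encoder, 16 bits at a time.
    outer_bytes = bytes([0x3A, 0xC5, 0x00, 0xFF])
    bits = np.unpackbits(np.frombuffer(outer_bytes, dtype=np.uint8))
    inner_codewords = [rm_encode(bits[i:i + k], G) for i in range(0, len(bits), k)]
    print(inner_codewords[0])       # one 32-bit inner codeword
    ```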

  16. A Low-Complexity and High-Performance 2D Look-Up Table for LDPC Hardware Implementation

    NASA Astrophysics Data System (ADS)

    Chen, Jung-Chieh; Yang, Po-Hui; Lain, Jenn-Kaie; Chung, Tzu-Wen

    In this paper, we propose a low-complexity, high-efficiency two-dimensional look-up table (2D LUT) for carrying out the sum-product algorithm in the decoding of low-density parity-check (LDPC) codes. Instead of employing adders for the core operation when updating check node messages, in the proposed scheme, the main term and correction factor of the core operation are successfully merged into a compact 2D LUT. Simulation results indicate that the proposed 2D LUT not only attains close-to-optimal bit error rate performance but also enjoys a low complexity advantage that is suitable for hardware implementation.
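
    The core operation being tabulated is the pairwise check-node update (the "box-plus" of two log-likelihood ratios), normally computed as a min-based main term plus a correction factor. The sketch below shows the general idea of replacing that computation with a single look-up in a pre-computed 2D table indexed by the quantized input magnitudes; the quantizer step, table size, and sign handling are assumptions for illustration, not the specific design proposed in the paper.

    ```python
    import numpy as np

    STEP, LEVELS = 0.25, 64          # assumed uniform |LLR| quantizer: 64 levels of 0.25

    def boxplus_exact(a, b):
        """Exact pairwise check-node update: min-based main term plus correction factor."""
        main = np.sign(a) * np.sign(b) * min(abs(a), abs(b))
        correction = np.log1p(np.exp(-abs(a + b))) - np.log1p(np.exp(-abs(a - b)))
        return main + correction

    # Pre-compute the 2D look-up table over quantized magnitudes; the product of
    # the input signs is applied outside the table (it factors out of the box-plus).
    magnitudes = np.arange(LEVELS) * STEP
    LUT = np.array([[boxplus_exact(x, y) for y in magnitudes] for x in magnitudes])

    def boxplus_lut(a, b):
        """Approximate box-plus with a single table look-up (no adders, no log/exp)."""
        i = min(int(abs(a) / STEP), LEVELS - 1)
        j = min(int(abs(b) / STEP), LEVELS - 1)
        return np.sign(a) * np.sign(b) * LUT[i, j]

    for a, b in [(2.3, -0.7), (-4.1, -1.9), (0.4, 5.0)]:
        print(round(boxplus_exact(a, b), 4), round(boxplus_lut(a, b), 4))
    ```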

  17. The organization of perception and action in complex control skills

    NASA Technical Reports Server (NTRS)

    Miller, Richard A.; Jagacinski, Richard J.

    1989-01-01

    An attempt was made to describe the perceptual, cognitive, and action processes that account for highly skilled human performance in complex task environments. In order to study such performance in a controlled setting, a laboratory task was constructed and three experiments were performed using human subjects. A general framework was developed for describing the organization of perceptual, cognitive, and action processes.

  18. Effect of Chunk Strength on the Performance of Children with Developmental Dyslexia on Artificial Grammar Learning Task May Be Related to Complexity

    ERIC Educational Resources Information Center

    Schiff, Rachel; Katan, Pesia; Sasson, Ayelet; Kahta, Shani

    2017-01-01

    There is a long held view that chunks play a crucial role in artificial grammar learning performance. We compared chunk strength influences on performance, in high and low topological entropy (a measure of complexity) grammar systems, with dyslexic children, age-matched and reading-level-matched control participants. Findings show that age-matched…

  19. Human Factors Tools for Improving Simulation Activities in Continuing Medical Education

    ERIC Educational Resources Information Center

    Seagull, F. Jacob

    2012-01-01

    Human factors (HF) is a discipline often drawn upon when there is a need to train people to perform complex, high-stakes tasks and effectively assess their performance. Complex tasks often present unique challenges for training and assessment. HF has developed specialized techniques that have been effective in overcoming several of these…

  20. Remarkable high efficiency of red emitters using Eu(iii) ternary complexes.

    PubMed

    Kalyakina, Alena S; Utochnikova, Valentina V; Zimmer, Manuel; Dietrich, Fabian; Kaczmarek, Anna M; Van Deun, Rik; Vashchenko, Andrey A; Goloveshkin, Alexander S; Nieger, Martin; Gerhards, Markus; Schepers, Ute; Bräse, Stefan

    2018-05-17

    We have synthesized Eu(iii) ternary complexes possessing record photoluminescence yields up to 90%. This high luminescence performance resulted from the absence of quenching moieties in the Eu coordination environment and an efficient energy transfer between ligands, combined with a particular symmetry of the coordination environment.

  1. Simulator validation results and proposed reporting format from flight testing a software model of a complex, high-performance airplane.

    DOT National Transportation Integrated Search

    2008-01-01

    Computer simulations are often used in aviation studies. These simulation tools may require complex, high-fidelity aircraft models. Since many of the flight models used are third-party developed products, independent validation is desired prior to im...

  2. Deepthi Vaidhynathan | NREL

    Science.gov Websites

    Complex Systems Simulation and Optimization Group, working on performance analysis and benchmarking. Research interests: High Performance Computing | Embedded Systems | Microprocessors & Microcontrollers

  3. The disruptive effects of pain on multitasking in a virtual errands task.

    PubMed

    Moore, David J; Law, Anna S

    2017-07-01

    Pain is known to have a disruptive effect on cognitive performance, but prior studies have used highly constrained laboratory tasks that lack ecological validity. In everyday life people are required to complete more complex sets of tasks, prioritising task completion and recalling lists of tasks which need to be completed, and these tasks continue to be attempted during episodes or states of pain. The present study therefore examined the impact of thermal induced pain on a simulated errand task. Fifty-five healthy adults (36 female) performed the Edinburgh Virtual Errands Task (EVET) either during a painful thermal sensation or with no concurrent pain. Participants also completed the Experience of Cognitive Intrusion of Pain (ECIP) questionnaire to measure their self-reported cognitive impact of pain in general life. Participants who completed the EVET task in pain and who self-reported high intrusion of pain made significantly more errors than those who reported lower intrusion on the ECIP. Findings here support the growing literature that suggests that pain has a significant impact on cognitive performance. Furthermore, these findings support the developing literature suggesting that this relationship is complex when considering real world cognition, and that self-report on the ECIP relates well to performance on a task designed to reflect the complexities of everyday living. If extrapolated to chronic pain populations, these data suggest that pain during complex multitasking performance may have a significant impact on the number of errors made. For people highly vulnerable to cognitive intrusion by pain, this may result in errors such as selecting the wrong location or item to perform tasks, or forgetting to perform these tasks at the correct time. If these findings are shown to extend to chronic pain populations then occupational support to manage complex task performance, using for example diaries/electronic reminders, may help to improve everyday abilities. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.

  4. Classifying High-noise EEG in Complex Environments for Brain-computer Interaction Technologies

    DTIC Science & Technology

    2012-02-01

    ... differentiation in the brain signal that our classification approach seeks to identify despite the noise in the recorded EEG signal and the complexity of... performed two offline classifications, one using BCILab (1), the other using LibSVM (2). Distinct classifiers were trained for each individual in... order to improve individual classifier performance (3). The highest classification performance results were obtained using individual frequency bands

  5. Discovering functional interdependence relationship in PPI networks for protein complex identification.

    PubMed

    Lam, Winnie W M; Chan, Keith C C

    2012-04-01

    Protein molecules interact with each other in protein complexes to perform many vital functions, and different computational techniques have been developed to identify protein complexes in protein-protein interaction (PPI) networks. These techniques search for subgraphs of high connectivity in PPI networks under the assumption that the proteins in a protein complex are highly interconnected. While these techniques have been shown to be quite effective, the matching rate between the protein complexes they discover and those previously determined experimentally can be relatively low, and the "false-alarm" rate can be relatively high. This is especially the case when the assumption that proteins in protein complexes are highly interconnected does not hold. To increase the matching rate and reduce the false-alarm rate, we have developed a technique that works effectively without this assumption. The technique, called protein complex identification by discovering functional interdependence (PCIFI), searches for protein complexes in PPI networks by taking into consideration both the functional interdependence relationship between protein molecules and the topology of the network. PCIFI works in several steps. The first step is to construct a multiple-function protein network graph by labeling each vertex with one or more of the molecular functions it performs. The second step is to filter out protein interactions between protein pairs that are not functionally interdependent of each other in the statistical sense. The third step is to make use of an information-theoretic measure to determine the strength of the functional interdependence between all remaining interacting protein pairs. Finally, the last step is to form protein complexes based on the strength of functional interdependence and the connectivity between proteins. For performance evaluation, PCIFI was used to identify protein complexes in real PPI network data, and the protein complexes it found were matched against those previously known in MIPS. The results show that PCIFI can be an effective technique for the identification of protein complexes. The protein complexes it found matched more known protein complexes with a smaller false-alarm rate and can provide useful insights into the understanding of the functional interdependence relationships between proteins in protein complexes.
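
    The four-step pipeline described above can be summarized in a short sketch. The Python fragment below is an illustrative outline only: the function names, the Jaccard-style overlap used as a stand-in for the paper's statistical filtering and information-theoretic interdependence measure, and the greedy merging rule are assumptions, not the published PCIFI implementation.

      from collections import defaultdict

      def pcifi_sketch(interactions, protein_functions, score_threshold=0.1):
          """Illustrative outline of the PCIFI steps (not the published implementation).

          interactions      : iterable of (protein_a, protein_b) pairs from the PPI network
          protein_functions : dict protein -> set of molecular-function labels (e.g. GO terms)
          """
          # Step 1: multiple-function protein network -- every vertex keeps its function labels.
          graph = defaultdict(set)
          for a, b in interactions:
              graph[a].add(b)
              graph[b].add(a)

          # Steps 2-3: keep only pairs whose function sets look interdependent and give each
          # surviving edge a strength; a Jaccard-style overlap stands in for the paper's
          # statistical test and information-theoretic measure.
          def interdependence(a, b):
              fa, fb = protein_functions.get(a, set()), protein_functions.get(b, set())
              if not fa or not fb:
                  return 0.0
              return len(fa & fb) / len(fa | fb)

          scored_edges = {}
          for a in graph:
              for b in graph[a]:
                  if a < b:
                      s = interdependence(a, b)
                      if s >= score_threshold:
                          scored_edges[(a, b)] = s

          # Step 4: greedily grow complexes from the strongest remaining edges.
          complexes, assigned = [], set()
          for (a, b), s in sorted(scored_edges.items(), key=lambda kv: -kv[1]):
              if a in assigned or b in assigned:
                  continue
              members = {a, b}
              for c in graph[a] | graph[b]:       # absorb strongly interdependent neighbours
                  if c not in members and interdependence(a, c) >= score_threshold:
                      members.add(c)
              assigned |= members
              complexes.append(members)
          return complexes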

  6. Some Observations on the Current Status of Performing Finite Element Analyses

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Knight, Norman F., Jr; Shivakumar, Kunigal N.

    2015-01-01

    Aerospace structures are complex high-performance structures. Advances in reliable and efficient computing and modeling tools are enabling analysts to consider complex configurations, build complex finite element models, and perform analysis rapidly. Many of today's early-career engineers are very proficient in the use of modern computers, computing engines, complex software systems, and visualization tools. These young engineers are becoming increasingly efficient in building complex 3D models of complicated aerospace components. However, current trends demonstrate a blind acceptance of finite element analysis results. This paper is aimed at raising awareness of this situation. Examples of commonly encountered issues are presented. To overcome the current trends, some guidelines and suggestions for analysts, senior engineers, and educators are offered.

  7. Electrochemically fabricated polypyrrole-cobalt-oxygen coordination complex as high-performance lithium-storage materials.

    PubMed

    Guo, Bingkun; Kong, Qingyu; Zhu, Ying; Mao, Ya; Wang, Zhaoxiang; Wan, Meixiang; Chen, Liquan

    2011-12-23

    Current lithium-ion battery (LIB) technologies are all based on inorganic electrode materials, though organic materials have been used as electrodes for years. Disadvantages such as limited thermal stability and low specific capacity hinder their applications. On the other hand, the transition metal oxides that provide high lithium-storage capacity by way of electrochemical conversion reaction suffer from poor cycling stability. Here we report a novel high-performance, organic, lithium-storage material, a polypyrrole-cobalt-oxygen (PPy-Co-O) coordination complex, with high lithium-storage capacity and excellent cycling stability. Extended X-ray absorption fine structure and Raman spectroscopy and other physical and electrochemical characterizations demonstrate that this coordination complex can be electrochemically fabricated by cycling PPy-coated Co(3)O(4) between 0.0 V and 3.0 V versus Li(+)/Li. Density functional theory (DFT) calculations indicate that each cobalt atom coordinates with two nitrogen atoms within the PPy-Co coordination layer and the layers are connected with oxygen atoms between them. Coordination weakens the C-H bonds on PPy and makes the complex a novel lithium-storage material with high capacity and high cycling stability. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Stage call: Cardiovascular reactivity to audition stress in musicians

    PubMed Central

    Chanwimalueang, Theerasak; Aufegger, Lisa; Adjei, Tricia; Wasley, David; Cruder, Cinzia; Mandic, Danilo P.

    2017-01-01

    Auditioning is at the very center of educational and professional life in music and is associated with significant psychophysical demands. Knowledge of how these demands affect cardiovascular responses to psychosocial pressure is essential for developing strategies to both manage stress and understand optimal performance states. To this end, we recorded the electrocardiograms (ECGs) of 16 musicians (11 violinists and 5 flutists) before and during performances in both low- and high-stress conditions: with no audience and in front of an audition panel, respectively. The analysis consisted of the detection of R-peaks in the ECGs to extract heart rate variability (HRV) from the notoriously noisy real-world ECGs. Our data analysis approach spanned both standard (temporal and spectral) and advanced (structural complexity) techniques. The complexity science approaches—namely, multiscale sample entropy and multiscale fuzzy entropy—indicated a statistically significant decrease in structural complexity in HRV from the low- to the high-stress condition and an increase in structural complexity from the pre-performance to performance period, thus confirming the complexity loss theory and a loss in degrees of freedom due to stress. Results from the spectral analyses also suggest that the stress responses in the female participants were more parasympathetically driven than those of the male participants. In conclusion, our findings suggest that interventions to manage stress are best targeted at the sensitive pre-performance period, before an audition begins. PMID:28437466
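
    For reference, the structural complexity measures named above can be computed from an RR-interval series with the minimal multiscale sample entropy sketch below; the template length m, tolerance r, and maximum scale are generic defaults rather than the settings of this study, and the fuzzy-entropy variant is not shown.

      import numpy as np

      def sample_entropy(x, m=2, r=0.2):
          """SampEn(m, r) of a 1-D series; r is given as a fraction of the series SD."""
          x = np.asarray(x, dtype=float)
          tol = r * x.std()
          n = len(x)

          def count_matches(length):
              templates = np.array([x[i:i + length] for i in range(n - length + 1)])
              count = 0
              for i in range(len(templates)):
                  dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
                  count += np.sum(dist <= tol)
              return count

          b = count_matches(m)        # template matches of length m
          a = count_matches(m + 1)    # template matches of length m + 1
          return -np.log(a / b) if a > 0 and b > 0 else np.inf

      def multiscale_sample_entropy(rr, max_scale=5, m=2, r=0.2):
          """Coarse-grain the series at scales 1..max_scale and return SampEn per scale."""
          rr = np.asarray(rr, dtype=float)
          entropies = []
          for tau in range(1, max_scale + 1):
              n = len(rr) // tau
              coarse = rr[:n * tau].reshape(n, tau).mean(axis=1)   # non-overlapping means
              entropies.append(sample_entropy(coarse, m, r))
          return entropies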

  9. Linguistic and auditory temporal processing in children with specific language impairment.

    PubMed

    Fortunato-Tavares, Talita; Rocha, Caroline Nunes; Andrade, Claudia Regina Furquim de; Befi-Lopes, Débora Maria; Schochat, Eliane; Hestvik, Arild; Schwartz, Richard G

    2009-01-01

    Several studies suggest an association between specific language impairment (SLI) and deficits in auditory processing. Evidence indicates that children with SLI present deficits in the discrimination of brief stimuli. Such a deficit would lead to difficulties in developing the phonological abilities necessary to map phonemes and to effectively and automatically code and decode words and sentences. However, the correlation between temporal processing (TP) and specific deficits in language disorders, such as syntactic comprehension abilities, has received little or no attention. The aim was to analyze the correlation between TP (assessed through the Frequency Pattern Test, FPT) and syntactic complexity comprehension (assessed through a sentence comprehension task). Sixteen children with typical language development (8;9 +/- 1;1 years) and seven children with SLI (8;1 +/- 1;2 years) participated in the study. Accuracy in both groups decreased as syntactic complexity increased (both p < 0.01). In the between-groups comparison, the performance difference on the Test of Syntactic Complexity Comprehension (TSCC) was statistically significant (p = 0.02). As expected, children with SLI presented FPT performance outside reference values. In the SLI group, correlations between TSCC and FPT were positive and higher for high syntactic complexity (r = 0.97) than for low syntactic complexity (r = 0.51). Results suggest that FPT performance is positively correlated with syntactic complexity comprehension abilities. Low performance on the FPT could serve as an additional indicator of deficits in complex linguistic processing. Future studies should consider, besides larger samples, longitudinal designs that investigate the effect of frequency pattern auditory training on performance in high syntactic complexity comprehension tasks.

  10. Multispectral Image Compression Based on DSC Combined with CCSDS-IDC

    PubMed Central

    Li, Jin; Xing, Fei; Sun, Ting; You, Zheng

    2014-01-01

    Remote sensing multispectral image compression encoders require low complexity, high robustness, and high performance because they usually work on board satellites, where resources such as power, memory, and processing capacity are limited. For multispectral images, compression algorithms based on 3D transforms (like the 3D DWT and 3D DCT) are too complex to be implemented in space missions. In this paper, we propose a compression algorithm for multispectral images based on distributed source coding (DSC) combined with the image data compression (IDC) approach recommended by CCSDS, which has low complexity, high robustness, and high performance. First, each band is sparsely represented by the DWT to obtain wavelet coefficients. Then, the wavelet coefficients are encoded by a bit plane encoder (BPE). Finally, the BPE is merged with a DSC strategy based on Slepian-Wolf (SW) coding with QC-LDPC codes in a deeply coupled way to remove the residual redundancy between adjacent bands. A series of multispectral images is used to test our algorithm. Experimental results show that the proposed DSC combined with CCSDS-IDC (DSC-CCSDS)-based algorithm has better compression performance than traditional compression approaches. PMID:25110741

  11. Multispectral image compression based on DSC combined with CCSDS-IDC.

    PubMed

    Li, Jin; Xing, Fei; Sun, Ting; You, Zheng

    2014-01-01

    Remote sensing multispectral image compression encoders require low complexity, high robustness, and high performance because they usually work on board satellites, where resources such as power, memory, and processing capacity are limited. For multispectral images, compression algorithms based on 3D transforms (like the 3D DWT and 3D DCT) are too complex to be implemented in space missions. In this paper, we propose a compression algorithm for multispectral images based on distributed source coding (DSC) combined with the image data compression (IDC) approach recommended by CCSDS, which has low complexity, high robustness, and high performance. First, each band is sparsely represented by the DWT to obtain wavelet coefficients. Then, the wavelet coefficients are encoded by a bit plane encoder (BPE). Finally, the BPE is merged with a DSC strategy based on Slepian-Wolf (SW) coding with QC-LDPC codes in a deeply coupled way to remove the residual redundancy between adjacent bands. A series of multispectral images is used to test our algorithm. Experimental results show that the proposed DSC combined with CCSDS-IDC (DSC-CCSDS)-based algorithm has better compression performance than traditional compression approaches.
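
    The per-band DWT plus bit-plane front end described above can be sketched briefly. The fragment below uses PyWavelets for the 2-D DWT of one band and emits plain significance bit planes; the Slepian-Wolf / QC-LDPC inter-band stage is omitted, and the wavelet choice, quantization, and function names are illustrative assumptions rather than the CCSDS BPE itself.

      import numpy as np
      import pywt  # PyWavelets

      def encode_band_bitplanes(band, wavelet="bior4.4", levels=3):
          """Per-band transform + bit-plane pass, loosely following the DWT+BPE stage.

          Returns the coefficient sign map and a list of bit planes (MSB first) of the
          quantized wavelet coefficients of one spectral band."""
          coeffs = pywt.wavedec2(np.asarray(band, dtype=float), wavelet, level=levels)
          flat, _ = pywt.coeffs_to_array(coeffs)       # all subbands in one 2-D array
          q = np.round(flat).astype(np.int64)          # simple uniform quantization
          sign = (q < 0).astype(np.uint8)
          mag = np.abs(q)
          num_planes = max(int(mag.max()).bit_length(), 1)
          planes = [((mag >> p) & 1).astype(np.uint8)  # MSB -> LSB significance passes
                    for p in range(num_planes - 1, -1, -1)]
          return sign, planes

      # Hypothetical usage on a (bands, rows, cols) multispectral cube:
      # cube = np.random.randint(0, 256, size=(4, 128, 128))
      # encoded = [encode_band_bitplanes(cube[b]) for b in range(cube.shape[0])]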

  12. How to use MPI communication in highly parallel climate simulations more easily and more efficiently.

    NASA Astrophysics Data System (ADS)

    Behrens, Jörg; Hanke, Moritz; Jahns, Thomas

    2014-05-01

    In this talk we present a way to facilitate efficient use of MPI communication for developers of climate models. Exploitation of the performance potential of today's highly parallel supercomputers with real world simulations is a complex task. This is partly caused by the low level nature of the MPI communication library which is the dominant communication tool at least for inter-node communication. In order to manage the complexity of the task, climate simulations with non-trivial communication patterns often use an internal abstraction layer above MPI without exploiting the benefits of communication aggregation or MPI-datatypes. The solution for the complexity and performance problem we propose is the communication library YAXT. This library is built on top of MPI and takes high level descriptions of arbitrary domain decompositions and automatically derives an efficient collective data exchange. Several exchanges can be aggregated in order to reduce latency costs. Examples are given which demonstrate the simplicity and the performance gains for selected climate applications.
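
    YAXT itself is a C library and its interface is not reproduced here; the mpi4py fragment below only illustrates the underlying idea of the talk, replacing many point-to-point messages with a single aggregated collective exchange derived once from the decomposition. The toy decomposition, counts, and variable names are assumptions for illustration.

      import numpy as np
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      # Toy stand-in for a domain decomposition: each rank owns a slice of a field
      # and exposes one boundary value to every other rank.
      local = np.full(8, float(rank))
      send_buf = np.repeat(local[:1], size)            # the value sent to each rank
      recv_buf = np.empty(size, dtype=float)

      # Describe the whole exchange once (counts and displacements per rank), then
      # let one collective move all of it instead of many Send/Recv pairs.
      counts = np.ones(size, dtype=int)
      displs = np.arange(size, dtype=int)
      comm.Alltoallv([send_buf, (counts, displs), MPI.DOUBLE],
                     [recv_buf, (counts, displs), MPI.DOUBLE])

      if rank == 0:
          print("rank 0 received:", recv_buf)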

  13. Building and measuring a high performance network architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kramer, William T.C.; Toole, Timothy; Fisher, Chuck

    2001-04-20

    Once a year, the SC conferences present a unique opportunity to create and build one of the most complex and highest performance networks in the world. At SC2000, large-scale and complex local and wide area networking connections were demonstrated, including large-scale distributed applications running on different architectures. This project was designed to use the unique opportunity presented at SC2000 to create a testbed network environment and then use that network to demonstrate and evaluate high performance computational and communication applications. This testbed was designed to incorporate many interoperable systems and services and was designed for measurement from the very beginning. The end results were key insights into how to use novel, high performance networking technologies and to accumulate measurements that will give insights into the networks of the future.

  14. Performance of Children with Developmental Dyslexia on High and Low Topological Entropy Artificial Grammar Learning Task

    ERIC Educational Resources Information Center

    Katan, Pesia; Kahta, Shani; Sasson, Ayelet; Schiff, Rachel

    2017-01-01

    Graph complexity as measured by topological entropy has been previously shown to affect performance on artificial grammar learning tasks among typically developing children. The aim of this study was to examine the effect of graph complexity on implicit sequential learning among children with developmental dyslexia. Our goal was to determine…

  15. HSTLBO: A hybrid algorithm based on Harmony Search and Teaching-Learning-Based Optimization for complex high-dimensional optimization problems

    PubMed Central

    Tuo, Shouheng; Yong, Longquan; Deng, Fang’an; Li, Yanhai; Lin, Yong; Lu, Qiuju

    2017-01-01

    Harmony Search (HS) and Teaching-Learning-Based Optimization (TLBO), as new swarm intelligence optimization algorithms, have received much attention in recent years. Both have shown outstanding performance in solving NP-hard optimization problems, but they also suffer dramatic performance degradation on some complex high-dimensional optimization problems. Extensive experiments show that HS and TLBO are strongly complementary: HS has strong global exploration power but slow convergence, whereas TLBO converges much faster but is easily trapped in local search. In this work, we propose a hybrid search algorithm named HSTLBO that merges the two algorithms to synergistically solve complex optimization problems using a self-adaptive selection strategy. In HSTLBO, both HS and TLBO are modified with the aim of balancing global exploration and exploitation, where HS mainly explores unknown regions and TLBO rapidly exploits high-precision solutions in known regions. Our experimental results demonstrate better performance and faster speed than five state-of-the-art HS variants and better exploration power than five good TLBO variants with similar run time, which illustrates that our method is promising for solving complex high-dimensional optimization problems. Experiments on portfolio optimization problems also demonstrate that HSTLBO is effective in solving complex real-world applications. PMID:28403224

  16. HSTLBO: A hybrid algorithm based on Harmony Search and Teaching-Learning-Based Optimization for complex high-dimensional optimization problems.

    PubMed

    Tuo, Shouheng; Yong, Longquan; Deng, Fang'an; Li, Yanhai; Lin, Yong; Lu, Qiuju

    2017-01-01

    Harmony Search (HS) and Teaching-Learning-Based Optimization (TLBO), as new swarm intelligence optimization algorithms, have received much attention in recent years. Both have shown outstanding performance in solving NP-hard optimization problems, but they also suffer dramatic performance degradation on some complex high-dimensional optimization problems. Extensive experiments show that HS and TLBO are strongly complementary: HS has strong global exploration power but slow convergence, whereas TLBO converges much faster but is easily trapped in local search. In this work, we propose a hybrid search algorithm named HSTLBO that merges the two algorithms to synergistically solve complex optimization problems using a self-adaptive selection strategy. In HSTLBO, both HS and TLBO are modified with the aim of balancing global exploration and exploitation, where HS mainly explores unknown regions and TLBO rapidly exploits high-precision solutions in known regions. Our experimental results demonstrate better performance and faster speed than five state-of-the-art HS variants and better exploration power than five good TLBO variants with similar run time, which illustrates that our method is promising for solving complex high-dimensional optimization problems. Experiments on portfolio optimization problems also demonstrate that HSTLBO is effective in solving complex real-world applications.
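
    A minimal hybrid in the spirit of HSTLBO is sketched below; the harmony-memory and pitch-adjustment rates, the teacher-phase move, and the success-based rule for choosing between the two operators are generic textbook choices assumed for illustration, not the self-adaptive strategy of the published algorithm.

      import numpy as np

      def hstlbo_sketch(fitness, dim, bounds, pop_size=30, iters=500, seed=0):
          """Toy hybrid of Harmony Search and TLBO moves (not the published HSTLBO).

          fitness : callable mapping a 1-D numpy vector to a scalar to be minimized
          bounds  : (low, high) applied to every dimension"""
          rng = np.random.default_rng(seed)
          low, high = bounds
          pop = rng.uniform(low, high, size=(pop_size, dim))
          fit = np.array([fitness(x) for x in pop])
          p_hs = 0.5                                     # probability of picking the HS move

          for _ in range(iters):
              if rng.random() < p_hs:                    # Harmony Search style improvisation
                  new = np.empty(dim)
                  for j in range(dim):
                      if rng.random() < 0.9:             # harmony memory consideration
                          new[j] = pop[rng.integers(pop_size), j]
                          if rng.random() < 0.3:         # pitch adjustment
                              new[j] += rng.normal(0.0, 0.01 * (high - low))
                      else:
                          new[j] = rng.uniform(low, high)
                  used_hs = True
              else:                                      # TLBO teacher-phase move
                  teacher = pop[np.argmin(fit)]
                  tf = rng.integers(1, 3)                # teaching factor in {1, 2}
                  learner = pop[rng.integers(pop_size)]
                  new = learner + rng.random(dim) * (teacher - tf * pop.mean(axis=0))
                  used_hs = False

              new = np.clip(new, low, high)
              f_new = fitness(new)
              worst = np.argmax(fit)
              if f_new < fit[worst]:                     # accept if better than the worst member
                  pop[worst], fit[worst] = new, f_new
                  p_hs = float(np.clip(p_hs + (0.02 if used_hs else -0.02), 0.1, 0.9))
          best = np.argmin(fit)
          return pop[best], fit[best]

      # e.g. hstlbo_sketch(lambda x: float(np.sum(x**2)), dim=10, bounds=(-5.0, 5.0))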

  17. Reduced complexity of multi-track joint 2-D Viterbi detectors for bit-patterned media recording channel

    NASA Astrophysics Data System (ADS)

    Myint, L. M. M.; Warisarn, C.

    2017-05-01

    Two-dimensional (2-D) interference is one of the prominent challenges in ultra-high density recording systems such as bit-patterned media recording (BPMR). The multi-track joint 2-D detection technique, with the help of array-head reading, can tackle this problem effectively by jointly processing multiple readback signals from adjacent tracks. Moreover, it can robustly alleviate the impairments due to track mis-registration (TMR) and media noise. However, the computational complexity of such detectors is normally too high to implement in practice, even for a few tracks. Therefore, in this paper, we focus mainly on reducing the complexity of the multi-track joint 2-D Viterbi detector without paying a large penalty in performance. We propose a simplified multi-track joint 2-D Viterbi detector with a manageable complexity level for the BPMR multi-track multi-head (MTMH) system. In the proposed method, the complexity of the detector's trellis is reduced with the help of a joint-track equalization method which employs 1-D equalizers and a 2-D generalized partial response (GPR) target. Moreover, we also examine the performance of a full-fledged multi-track joint 2-D detector and conventional 2-D detection. The results show that the simplified detector can perform close to the full-fledged detector, especially when the system faces high media noise, with significantly lower complexity.
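
    As background for the trellis recursion discussed above, the fragment below is a plain single-track (1-D) Viterbi detector for an ISI channel with a short partial-response target; the paper's multi-track joint 2-D detector applies the same recursion to vectors of samples across tracks. The target, symbol alphabet, and function name are illustrative assumptions.

      import numpy as np
      from itertools import product

      def viterbi_isi(received, target, symbols=(-1.0, 1.0)):
          """Single-track Viterbi detection over an ISI channel with response `target`."""
          mem = len(target) - 1                            # channel memory
          states = list(product(symbols, repeat=mem))      # last `mem` symbols, newest first
          cost = {s: 0.0 for s in states}
          paths = {s: [] for s in states}
          for y in received:
              new_cost = {s: float("inf") for s in states}
              new_paths = {}
              for s in states:
                  for a in symbols:                        # hypothesized current symbol
                      taps = (a,) + s                      # current symbol plus memory
                      expected = float(np.dot(target, taps))
                      metric = cost[s] + (y - expected) ** 2   # Euclidean branch metric
                      nxt = taps[:mem]                     # updated memory contents
                      if metric < new_cost[nxt]:
                          new_cost[nxt] = metric
                          new_paths[nxt] = paths[s] + [a]
              cost, paths = new_cost, new_paths
          return paths[min(cost, key=cost.get)]            # survivor with the lowest metric

      # Example with a hypothetical PR target [1, 0.5]:
      # x = np.array([1.0, -1.0, 1.0, 1.0])
      # y = np.convolve(x, [1.0, 0.5])[:len(x)] + 0.05 * np.random.randn(len(x))
      # print(viterbi_isi(y, target=[1.0, 0.5]))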

  18. Brain Modularity Mediates the Relation between Task Complexity and Performance

    NASA Astrophysics Data System (ADS)

    Ye, Fengdan; Yue, Qiuhai; Martin, Randi; Fischer-Baum, Simon; Ramos-Nuñez, Aurora; Deem, Michael

    Recent work in cognitive neuroscience has focused on analyzing the brain as a network, rather than a collection of independent regions. Prior studies taking this approach have found that individual differences in the degree of modularity of the brain network relate to performance on cognitive tasks. However, inconsistent results concerning the direction of this relationship have been obtained, with some tasks showing better performance as modularity increases, and other tasks showing worse performance. A recent theoretical model suggests that these inconsistencies may be explained on the grounds that high-modularity networks favor performance on simple tasks whereas low-modularity networks favor performance on complex tasks. The current study tests these predictions by relating modularity from resting-state fMRI to performance on a set of behavioral tasks. Complex and simple tasks were defined on the basis of whether they drew on executive attention. Consistent with predictions, we found a negative correlation between individuals' modularity and their performance on the complex tasks but a positive correlation with performance on the simple tasks. The results presented here provide a framework for linking measures of whole brain organization to cognitive processing.
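
    A generic way to reproduce the kind of analysis described above is sketched below: compute Newman modularity of a thresholded functional connectivity graph per subject and correlate it with task scores. The threshold, the greedy community detection, and the variable names are assumptions for illustration, not the study's exact pipeline.

      import numpy as np
      import networkx as nx
      from networkx.algorithms import community
      from scipy.stats import pearsonr

      def modularity_from_connectivity(conn, threshold=0.3):
          """Modularity Q of a greedy community partition of a thresholded
          (regions x regions) functional connectivity matrix."""
          adj = (np.abs(conn) >= threshold) & ~np.eye(conn.shape[0], dtype=bool)
          g = nx.from_numpy_array(adj.astype(int))
          parts = community.greedy_modularity_communities(g)
          return community.modularity(g, parts)

      # Hypothetical group-level check of the reported direction of effects:
      # q = np.array([modularity_from_connectivity(c) for c in subject_conn_matrices])
      # r_simple, _ = pearsonr(q, simple_task_scores)     # expected positive per the abstract
      # r_complex, _ = pearsonr(q, complex_task_scores)   # expected negative per the abstract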

  19. HPLC-Orbitrap analysis for identification of organic molecules in complex material

    NASA Astrophysics Data System (ADS)

    Gautier, T.; Schmitz-Afonso, I.; Carrasco, N.; Touboul, D.; Szopa, C.; Buch, A.; Pernot, P.

    2015-10-01

    We performed High Performance Liquid Chromatography (HPLC) coupled to Orbitrap High Resolution Mass Spectrometry (OHR MS) analysis of Titan's tholins. This analysis allowed us to determine the exact composition and structure of some of the major components of tholins.

  20. Optimation and Determination of Fe-Oxinate Complex by Using High Performance Liquid Chromatography

    NASA Astrophysics Data System (ADS)

    Oktavia, B.; Nasra, E.; Sary, R. C.

    2018-04-01

    The growing need for iron drives industrial processes that use iron as a raw material, so control of industrial iron waste is very important. One method of iron analysis is the indirect analysis of iron(III) ions by complexing them with 8-hydroxyquinoline (oxine). In this research, qualitative and quantitative tests of iron(III) ions were carried out in the form of the complex with oxine. The analysis was performed using HPLC at a wavelength of 470 nm with an ODS C18 column. Three methods of analysis were performed: (1) Fe-oxinate complexes were prepared in an ethanol solvent, so no further separation was needed; (2) Fe-oxinate complexes were made in chloroform, so that a solvent extraction was required before the complex was injected into the column; and (3) the complex was formed in the column, wherein the eluent contains the oxine and the metal ions are then injected. The resulting chromatograms show that the third method provides a better chromatogram for iron analysis.

  1. A novel method for identifying disease associated protein complexes based on functional similarity protein complex networks.

    PubMed

    Le, Duc-Hau

    2015-01-01

    Protein complexes formed by non-covalent interactions among proteins play important roles in cellular functions. Computational and purification methods have been used to identify many protein complexes and their cellular functions. However, their roles in causing disease have not yet been well characterized. Only a few studies have addressed the identification of disease-associated protein complexes, and they mostly utilize complicated heterogeneous networks constructed from an out-of-date database of phenotype similarity collected from the literature. In addition, they apply only to diseases for which tissue-specific data exist. In this study, we propose a method to identify novel disease-protein complex associations. First, we introduce a framework to construct functional similarity protein complex networks, where two protein complexes are functionally connected by shared protein elements, shared annotating GO terms, or protein interactions between elements of each complex. Second, we propose a simple but effective neighborhood-based algorithm, which yields a local similarity measure, to rank disease candidate protein complexes. Comparing the predictive performance of our proposed algorithm with that of two state-of-the-art network propagation algorithms, including one we used in our previous study, we found that it performed statistically significantly better than these two algorithms for all the constructed functional similarity protein complex networks. In addition, it ran about 32 times faster than these two algorithms. Moreover, our proposed method always achieved high performance in terms of AUC values, irrespective of how the functional similarity protein complex networks were constructed and which algorithms were used. The performance of our method was also higher than that reported for some existing methods based on complicated heterogeneous networks. Finally, we tested our method on prostate cancer and selected the top 100 highly ranked candidate protein complexes. Interestingly, 69 of them were supported by evidence, since at least one of their protein elements is known to be associated with prostate cancer. Our proposed method, including the framework to construct functional similarity protein complex networks and the neighborhood-based algorithm on these networks, could be used for the identification of novel disease-protein complex associations.
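
    The flavour of the neighborhood-based ranking described above can be conveyed in a few lines; the scoring rule below (sum of similarity weights to directly connected, already-known disease complexes) and all names are illustrative assumptions, not the paper's exact local similarity measure.

      from collections import defaultdict

      def rank_candidate_complexes(similarity_edges, known_disease_complexes):
          """Neighborhood-style ranking sketch over a functional similarity network.

          similarity_edges        : iterable of (complex_a, complex_b, weight)
          known_disease_complexes : set of complex identifiers used as seeds"""
          neighbours = defaultdict(list)
          for a, b, w in similarity_edges:
              neighbours[a].append((b, w))
              neighbours[b].append((a, w))

          scores = {}
          for c, nbrs in neighbours.items():
              if c in known_disease_complexes:
                  continue                  # rank only the unlabelled candidates
              scores[c] = sum(w for n, w in nbrs if n in known_disease_complexes)
          return sorted(scores.items(), key=lambda kv: -kv[1])

      # edges = [("C1", "C2", 0.8), ("C2", "C3", 0.4), ("C1", "C3", 0.1)]
      # print(rank_candidate_complexes(edges, known_disease_complexes={"C1"}))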

  2. Detection of protein complex from protein-protein interaction network using Markov clustering

    NASA Astrophysics Data System (ADS)

    Ochieng, P. J.; Kusuma, W. A.; Haryanto, T.

    2017-05-01

    Detection of complexes, or groups of functionally related proteins, is an important challenge in analysing biological networks. However, existing algorithms to identify protein complexes are insufficient when applied to dense networks of experimentally derived interaction data. Therefore, we introduced a graph clustering method based on the Markov clustering algorithm to identify protein complexes within highly interconnected protein-protein interaction networks. A protein-protein interaction network was first constructed to develop a geometrical network, which was then partitioned using Markov clustering to detect protein complexes. The interest of the proposed method is illustrated by its application to human proteins associated with type II diabetes mellitus. Flow simulation of the MCL algorithm was first performed, and topological properties of the resultant network were analysed for the detection of protein complexes. The results indicated that the proposed method successfully detected a total of 34 complexes, with 11 complexes consisting of overlapping modules and 20 of non-overlapping modules. The major complex consisted of 102 proteins and 521 interactions, with cluster modularity and density of 0.745 and 0.101, respectively. The comparative analysis revealed that MCL outperforms the AP, MCODE and SCPS algorithms, with a high clustering coefficient (0.751), network density and modularity index (0.630). This demonstrated that MCL is the most reliable and efficient graph clustering algorithm for the detection of protein complexes from PPI networks.
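
    For reference, the core Markov clustering (MCL) iteration used above is sketched below in plain numpy; the expansion and inflation parameters are common defaults, and the pre- and post-processing steps of the paper are omitted.

      import numpy as np

      def markov_cluster(adj, expansion=2, inflation=2.0, iters=50, tol=1e-6):
          """Plain MCL on an adjacency matrix: alternate expansion and inflation of a
          column-stochastic flow matrix until it stabilizes, then read clusters off
          the attractor rows."""
          m = adj.astype(float) + np.eye(adj.shape[0])        # add self-loops
          m /= m.sum(axis=0, keepdims=True)                   # column-stochastic
          for _ in range(iters):
              expanded = np.linalg.matrix_power(m, expansion) # expansion: spread flow
              inflated = expanded ** inflation                # inflation: favour strong flow
              inflated /= inflated.sum(axis=0, keepdims=True)
              if np.abs(inflated - m).max() < tol:
                  m = inflated
                  break
              m = inflated
          clusters = []
          for row in range(m.shape[0]):                       # attractor rows define clusters
              members = set(np.nonzero(m[row] > 1e-4)[0])
              if members and members not in clusters:
                  clusters.append(members)
          return clusters

      # a = np.array([[0,1,1,0,0],[1,0,1,0,0],[1,1,0,0,0],[0,0,0,0,1],[0,0,0,1,0]])
      # print(markov_cluster(a))   # two groups: {0, 1, 2} and {3, 4}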

  3. How Stimulus and Task Complexity Affect Monitoring in High-Functioning Adults with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Koolen, Sophieke; Vissers, Constance Th. W. M.; Egger, Jos I. M.; Verhoeven, Ludo

    2014-01-01

    The present study examined whether individuals with autism spectrum disorder (ASD) are able to update and monitor working memory representations of visual input, and whether performance is influenced by stimulus and task complexity. 15 high-functioning adults with ASD and 15 controls were asked to allocate either elements of abstract figures or…

  4. Working memory subsystems and task complexity in young boys with Fragile X syndrome.

    PubMed

    Baker, S; Hooper, S; Skinner, M; Hatton, D; Schaaf, J; Ornstein, P; Bailey, D

    2011-01-01

    Working memory problems have been targeted as core deficits in individuals with Fragile X syndrome (FXS); however, there have been few studies that have examined working memory in young boys with FXS, and even fewer studies that have studied the working memory performance of young boys with FXS across different degrees of complexity. The purpose of this study was to investigate the phonological loop and visual-spatial working memory in young boys with FXS, in comparison to mental age-matched typical boys, and to examine the impact of complexity of the working memory tasks on performance. The performance of young boys (7 to 13-years-old) with FXS (n = 40) was compared with that of mental age and race matched typically developing boys (n = 40) on measures designed to test the phonological loop and the visuospatial sketchpad across low, moderate and high degrees of complexity. Multivariate analyses were used to examine group differences across the specific working memory systems and degrees of complexity. Results suggested that boys with FXS showed deficits in phonological loop and visual-spatial working memory tasks when compared with typically developing mental age-matched boys. For the boys with FXS, the phonological loop was significantly lower than the visual-spatial sketchpad; however, there was no significant difference in performance across the low, moderate and high degrees of complexity in the working memory tasks. Reverse tasks from both the phonological loop and visual-spatial sketchpad appeared to be the most challenging for both groups, but particularly for the boys with FXS. These findings implicate a generalised deficit in working memory in young boys with FXS, with a specific disproportionate impairment in the phonological loop. Given the lack of differentiation on the low versus high complexity tasks, simple span tasks may provide an adequate estimate of working memory until greater involvement of the central executive is achieved. © 2010 The Authors. Journal of Intellectual Disability Research © 2010 Blackwell Publishing Ltd.

  5. Working memory subsystems and task complexity in young boys with Fragile X syndrome

    PubMed Central

    Baker, S.; Hooper, S.; Skinner, M.; Hatton, D.; Schaaf, J.; Ornstein, P.; Bailey, D.

    2011-01-01

    Background Working memory problems have been targeted as core deficits in individuals with Fragile X syndrome (FXS); however, there have been few studies that have examined working memory in young boys with FXS, and even fewer studies that have studied the working memory performance of young boys with FXS across different degrees of complexity. The purpose of this study was to investigate the phonological loop and visual–spatial working memory in young boys with FXS, in comparison to mental age-matched typical boys, and to examine the impact of complexity of the working memory tasks on performance. Methods The performance of young boys (7 to 13-years-old) with FXS (n = 40) was compared with that of mental age and race matched typically developing boys (n = 40) on measures designed to test the phonological loop and the visuospatial sketchpad across low, moderate and high degrees of complexity. Multivariate analyses were used to examine group differences across the specific working memory systems and degrees of complexity. Results Results suggested that boys with FXS showed deficits in phonological loop and visual–spatial working memory tasks when compared with typically developing mental age-matched boys. For the boys with FXS, the phonological loop was significantly lower than the visual–spatial sketchpad; however, there was no significant difference in performance across the low, moderate and high degrees of complexity in the working memory tasks. Reverse tasks from both the phonological loop and visual–spatial sketchpad appeared to be the most challenging for both groups, but particularly for the boys with FXS. Conclusions These findings implicate a generalised deficit in working memory in young boys with FXS, with a specific disproportionate impairment in the phonological loop. Given the lack of differentiation on the low versus high complexity tasks, simple span tasks may provide an adequate estimate of working memory until greater involvement of the central executive is achieved. PMID:21121991

  6. A high throughput architecture for a low complexity soft-output demapping algorithm

    NASA Astrophysics Data System (ADS)

    Ali, I.; Wasenmüller, U.; Wehn, N.

    2015-11-01

    Iterative channel decoders such as Turbo-code and LDPC decoders show exceptional performance and are therefore part of many wireless communication receivers nowadays. These decoders require a soft input, i.e., the log-likelihood ratio (LLR) of the received bits, with a typical quantization of 4 to 6 bits. For computing the LLR values from a received complex symbol, a soft demapper is employed in the receiver. The implementation cost of traditional soft-output demapping methods is relatively large in high-order modulation systems, and therefore low-complexity demapping algorithms are indispensable in low-power receivers. In the presence of multiple wireless communication standards, where each standard defines multiple modulation schemes, there is a need for an efficient demapper architecture covering all the flexibility requirements of these standards. Another challenge associated with hardware implementation of the demapper is to achieve a very high throughput in doubly iterative systems, for instance MIMO and Code-Aided Synchronization. In this paper, we present a comprehensive communication and hardware performance evaluation of low-complexity soft-output demapping algorithms to select the best algorithm for implementation. The main goal of this work is to design a high-throughput, flexible, and area-efficient architecture. We describe architectures to execute the investigated algorithms and implement these architectures on an FPGA device to evaluate their hardware performance. The work has resulted in a hardware architecture based on the best low-complexity algorithm identified, delivering a high throughput of 166 Msymbols/second for Gray-mapped 16-QAM modulation on a Virtex-5. This efficient architecture occupies only 127 slice registers, 248 slice LUTs and 2 DSP48Es.
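
    For context, the snippet below is a textbook max-log-MAP soft demapper for Gray-mapped 16-QAM, representative of the class of low-complexity algorithms the paper evaluates; the labelling, normalization, and LLR sign convention are common assumptions, and this is not the specific hardware-friendly algorithm selected in the paper.

      import numpy as np

      def maxlog_llrs_16qam(symbol, noise_var):
          """Max-log LLRs (bit order MSB-first, LLR > 0 means bit 0 more likely)
          for one received complex symbol under Gray-mapped, unit-energy 16-QAM."""
          levels = np.array([-3.0, -1.0, 1.0, 3.0]) / np.sqrt(10.0)   # unit average energy
          gray = [0b00, 0b01, 0b11, 0b10]                             # Gray labels per level
          points, bits = [], []
          for i, re in enumerate(levels):
              for q, im in enumerate(levels):
                  points.append(re + 1j * im)
                  label = (gray[i] << 2) | gray[q]                    # [b3 b2 | b1 b0]
                  bits.append([(label >> k) & 1 for k in (3, 2, 1, 0)])
          points, bits = np.array(points), np.array(bits)

          d2 = np.abs(symbol - points) ** 2                           # squared distances
          llrs = []
          for k in range(4):
              d0 = d2[bits[:, k] == 0].min()                          # best hypothesis with bit k = 0
              d1 = d2[bits[:, k] == 1].min()                          # best hypothesis with bit k = 1
              llrs.append((d1 - d0) / noise_var)                      # max-log approximation
          return llrs

      # print(maxlog_llrs_16qam(0.95 + 0.31j, noise_var=0.1))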

  7. The Role of Occupational Complexity in Trajectories of Cognitive Aging Before and After Retirement

    PubMed Central

    Finkel, Deborah; Andel, Ross; Gatz, Margaret; Pedersen, Nancy L.

    2009-01-01

    We examined the association between complexity of the main lifetime occupation and changes in cognitive ability in later life. Data on complexity of work with data, people, and things and on four cognitive factors (verbal, spatial, memory, and speed) were available from 462 individuals in the longitudinal Swedish Adoption/Twin Study of Aging. Mean age at the first measurement wave was 64.3 (s.d. = 7.2) and 65% of the sample had at least 3 waves of data. Occupational complexity with people and data were both correlated with cognitive performance. Individuals with more complex work demonstrated higher mean performance on the verbal, spatial, and speed factors. Latent growth curve analyses indicated that, after correcting for education, only complexity with people was associated with differences in cognitive performance and rate of cognitive change. Continued engagement as a result of occupational complexity with people helped to facilitate verbal function before retirement, while a previous high level of complexity of work with people was associated with faster decline after retirement on the spatial factor. PMID:19739912

  8. A complex valued radial basis function network for equalization of fast time varying channels.

    PubMed

    Gan, Q; Saratchandran, P; Sundararajan, N; Subramanian, K R

    1999-01-01

    This paper presents a complex valued radial basis function (RBF) network for equalization of fast time varying channels. A new method for calculating the centers of the RBF network is given. The method allows the number of RBF centers to be fixed even as the equalizer order is increased, so that good performance is obtained by a high-order RBF equalizer with a small number of centers. Simulations are performed on time varying channels using a Rayleigh fading channel model to compare the performance of our RBF equalizer with an adaptive maximum-likelihood sequence estimator (MLSE) consisting of a channel estimator and an MLSE implemented by the Viterbi algorithm. The results show that the RBF equalizer produces superior performance with less computational complexity.
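
    The forward evaluation of such a complex-valued Gaussian RBF equalizer can be written in a few lines; the centers, weights, and kernel width below are assumed to have been chosen already (the paper's contribution is precisely how the centers are fixed), and the names are illustrative.

      import numpy as np

      def rbf_equalizer_output(x, centers, weights, width):
          """Response of a complex-valued Gaussian RBF equalizer for one input vector x.

          x       : complex input vector (received samples), shape (d,)
          centers : complex centers, shape (K, d); weights : complex, shape (K,)"""
          d2 = np.sum(np.abs(x - centers) ** 2, axis=1)   # squared distances to the centers
          phi = np.exp(-d2 / (2.0 * width ** 2))          # real-valued Gaussian activations
          return np.dot(weights, phi)                     # complex output before the decision

      # BPSK-style decision on the equalizer output (illustrative):
      # symbol = 1.0 if rbf_equalizer_output(x, centers, weights, width).real > 0 else -1.0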

  9. Modeling two-phase flow in three-dimensional complex flow-fields of proton exchange membrane fuel cells

    NASA Astrophysics Data System (ADS)

    Kim, Jinyong; Luo, Gang; Wang, Chao-Yang

    2017-10-01

    3D fine-mesh flow-fields recently developed by Toyota Mirai improved water management and mass transport in proton exchange membrane (PEM) fuel cell stacks, suggesting their potential value for robust and high-power PEM fuel cell stack performance. In such complex flow-fields, Forchheimer's inertial effect is dominant at high current density. In this work, a two-phase flow model of 3D complex flow-fields of PEMFCs is developed by accounting for Forchheimer's inertial effect, for the first time, to elucidate the underlying mechanism of liquid water behavior and mass transport inside 3D complex flow-fields and their adjacent gas diffusion layers (GDL). It is found that Forchheimer's inertial effect enhances liquid water removal from flow-fields and adds additional flow resistance around baffles, which improves interfacial liquid water and mass transport. As a result, substantial improvements in high current density cell performance and operational stability are expected in PEMFCs with 3D complex flow-fields, compared to PEMFCs with conventional flow-fields. Higher current density operation required to further reduce PEMFC stack cost per kW in the future will necessitate optimizing complex flow-field designs using the present model, in order to efficiently remove a large amount of product water and hence minimize the mass transport voltage loss.
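
    For reference, the Forchheimer-extended Darcy relation invoked above is commonly written (in LaTeX notation) as

      \nabla p = -\frac{\mu}{K}\,\mathbf{u} - \beta \rho \lvert \mathbf{u} \rvert \mathbf{u},

    where \mu is the viscosity, K the permeability, \beta the Forchheimer (inertial) coefficient, \rho the density, and \mathbf{u} the superficial velocity; the linear Darcy term dominates at low velocity, while the quadratic inertial term becomes significant at the high flow rates reached at high current density. The exact two-phase closure used in the paper's model may differ from this single-phase form.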

  10. Effect of the ortho-Hydroxyl Groups on a Bipyridine Ligand of Iridium Complexes for the High-Pressure Gas Generation from the Catalytic Decomposition of Formic Acid.

    PubMed

    Iguchi, Masayuki; Zhong, Heng; Himeda, Yuichiro; Kawanami, Hajime

    2017-12-14

    The hydroxyl groups of a 2,2'-bipyridine (bpy) ligand near the metal center enhanced the catalytic performance of the Ir complex for the dehydrogenation of formic acid at high pressure. The position of the hydroxyl groups on the ligand affected the catalytic durability for high-pressure H2 generation through the decomposition of formic acid. The Ir complex with a bipyridine ligand functionalized with para-hydroxyl groups shows good durability, with constant catalytic activity during the reaction even under high-pressure conditions, whereas deactivation was observed for an Ir complex with a bipyridine ligand bearing ortho-hydroxyl groups (2). In the presence of high-pressure H2, complex 2 decomposed into the ligand and an Ir trihydride complex through isomerization of the bpy ligand. This work contributes to the development of a durable catalyst for high-pressure H2 production from formic acid. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Robust Fixed-Structure Controller Synthesis

    NASA Technical Reports Server (NTRS)

    Corrado, Joseph R.; Haddad, Wassim M.; Gupta, Kajal (Technical Monitor)

    2000-01-01

    The ability to develop an integrated control system design methodology for robust, high-performance controllers satisfying multiple design criteria and real-world hardware constraints constitutes a challenging task. The increasingly stringent performance specifications required for controlling such systems necessitate a trade-off between controller complexity and robustness. The principal challenge of minimal-complexity robust control design is to arrive at a tractable control design formulation in spite of the extreme complexity of such systems. Hence, the design of minimal-complexity robust controllers for systems in the face of modeling errors has been a major preoccupation of system and control theorists and practitioners for the past several decades.

  12. High-level ab initio studies of the complex formed between CO and O2

    NASA Astrophysics Data System (ADS)

    Grein, Friedrich

    2017-05-01

    The explicitly correlated CCSD(T)-F12 method with VXZ-F12 basis sets was used to find the most stable structures of the van der Waals CO-O2 complexes. With geometry optimizations performed up to the quadruple-zeta level and basis set extrapolation, the calculated interaction energies for the triplet complexes are 123 cm-1 for the H complex in Cs symmetry (slipped near-parallel structure), 118 cm-1 for the X complex, also in Cs symmetry (perpendicular alignment) and 116 cm-1 for the CO-O2 T complex in C2v symmetry. The corresponding CCSD(T)-F12 results using the aug-cc-pVXZ basis sets are nearly the same. Similar calculations were performed for the CO-O2 singlet complexes, which are shown to have much higher stabilization energies, the highest being 206 cm-1 for the X complex.

  13. High-efficiency high-reliability optical components for a large, high-average-power visible laser system

    NASA Astrophysics Data System (ADS)

    Taylor, John R.; Stolz, Christopher J.

    1993-08-01

    Laser system performance and reliability depend on the related performance and reliability of the optical components which define the cavity and transport subsystems. High average power and long transport lengths impose specific requirements on component performance. The complexity of the manufacturing process for optical components requires a high degree of process control and verification. Qualification has proven effective in ensuring confidence in the procurement process for these optical components. Issues related to component reliability have been studied and provide useful information to better understand the long-term performance and reliability of the laser system.

  14. High-efficiency high-reliability optical components for a large, high-average-power visible laser system

    NASA Astrophysics Data System (ADS)

    Taylor, J. R.; Stolz, C. J.

    1992-12-01

    Laser system performance and reliability depend on the related performance and reliability of the optical components which define the cavity and transport subsystems. High average power and long transport lengths impose specific requirements on component performance. The complexity of the manufacturing process for optical components requires a high degree of process control and verification. Qualification has proven effective in ensuring confidence in the procurement process for these optical components. Issues related to component reliability have been studied and provide useful information to better understand the long-term performance and reliability of the laser system.

  15. Energetic lanthanide complexes: coordination chemistry and explosives applications

    NASA Astrophysics Data System (ADS)

    Manner, V. W.; Barker, B. J.; Sanders, V. E.; Laintz, K. E.; Scott, B. L.; Preston, D. N.; Sandstrom, M.; Reardon, B. L.

    2014-05-01

    Metals are generally added to organic molecular explosives in a heterogeneous composite to improve overall heat and energy release. In order to avoid creating a mixture that can vary in homogeneity, energetic organic molecules can be directly bonded to high molecular weight metals, forming a single metal complex with Angstrom-scale separation between the metal and the explosive. To probe the relationship between the structural properties of metal complexes and explosive performance, a new series of energetic lanthanide complexes has been prepared using energetic ligands such as NTO (5-nitro-2,4-dihydro-1,2,4-triazole-3-one). These are the first examples of lanthanide NTO complexes where no water is coordinated to the metal, demonstrating novel control of the coordination environment. The complexes have been characterized by X-ray crystallography, NMR and IR spectroscopies, photoluminescence, and sensitivity testing. The structural and energetic properties are discussed in the context of enhanced blast effects and detection. Cheetah calculations have been performed to fine-tune physical properties, creating a systematic method for producing explosives with 'tailor made' characteristics. These new complexes will be benchmarks for further study in the field of metalized high explosives.

  16. Energetic Lanthanide Complexes: Coordination Chemistry and Explosives Applications

    NASA Astrophysics Data System (ADS)

    Manner, Virginia; Barker, Beau; Sanders, Eric; Laintz, Kenneth; Scott, Brian; Preston, Daniel; Sandstrom, Mary; Reardon, Bettina

    2013-06-01

    Metals are generally added to organic molecular explosives in a heterogeneous composite to improve overall heat and energy release. In order to avoid creating a mixture that can vary in homogeneity, energetic organic molecules can be directly bonded to high molecular weight metals, forming a single metal complex with Angstrom-scale separation between the metal and the explosive. To probe the relationship between the structural properties of metal complexes and explosive performance, a new series of energetic lanthanide complexes has been prepared using energetic ligands such as NTO (5-nitro-2,4-dihydro-1,2,4-triazole-3-one). These are the first examples of lanthanide NTO complexes where no water is coordinated to the metal, demonstrating novel control of the coordination environment. The complexes have been characterized by X-ray crystallography, NMR and IR spectroscopies, photoluminescence, and sensitivity testing. The structural and energetic properties are discussed in the context of enhanced blast effects and detection. Cheetah calculations have been performed to fine-tune physical properties, creating a systematic method for producing explosives with ``tailor made'' characteristics. These new complexes will be benchmarks for further study in the field of metalized high explosives.

  17. Surgical management of congenital heart disease: contribution of the Aristotle complexity score to planning and budgeting in the German diagnosis-related groups system.

    PubMed

    Sinzobahamvya, Nicodème; Photiadis, Joachim; Kopp, Thorsten; Arenz, Claudia; Haun, Christoph; Schindler, Ehrenfried; Hraska, Viktor; Asfour, Boulos

    2012-01-01

    Planning and budgeting for congenital heart surgery depend primarily on how closely reimbursement matches costs and on the number and complexity of the surgical procedures. Aristotle complexity scores for the year 2010 were correlated with hospital costs and with reimbursement according to the German diagnosis-related groups (DRG) system. Unit surgical performance was estimated as surgical performance (complexity score × hospital survival) times the number of primary procedures. This study investigated how this performance evolved during years 2006 to 2010. Hospital costs and reimbursements correlated highly with Aristotle comprehensive complexity levels (Spearman r = 1). Mean costs and reimbursement reached 35,050 ± 32,665 and 31,283 ± 34,732, respectively, for an underfunding of 10.7%. Basic and comprehensive unit surgical performances were respectively 3036 ± 1009 and 3891 ± 1591 points in 2006. Both performances increased in sigmoid fashion to reach 3883 ± 1344 and 5335 ± 1314 points, respectively, in 2010. Top performances would be achieved in year 2011, and extrapolated costs would comprise about 19,434,094.92 (95% confidence interval, 11,961,491.22-22,495,764.42). The current underfunding of congenital heart surgery needs correction. The Aristotle score can help to adjust reimbursement according to complexity of procedures. Unit surgical performance allows accurate budgeting in the current German DRG system.
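
    As a purely hypothetical illustration of the measure defined above (the numbers are invented and not taken from the study): a unit performing 500 primary procedures in a year at a mean comprehensive complexity score of 8.0 with 97% hospital survival would have a surgical performance of 8.0 × 0.97 = 7.76 points and a unit surgical performance of 7.76 × 500 = 3880 points.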

  18. Multiuser receiver for DS-CDMA signals in multipath channels: an enhanced multisurface method.

    PubMed

    Mahendra, Chetan; Puthusserypady, Sadasivan

    2006-11-01

    This paper deals with the problem of multiuser detection in direct-sequence code-division multiple-access (DS-CDMA) systems in multipath environments. The existing multiuser detectors can be divided into two categories: (1) low-complexity poor-performance linear detectors and (2) high-complexity good-performance nonlinear detectors. In particular, in channels where the orthogonality of the code sequences is destroyed by multipath, detectors with linear complexity perform much worse than the nonlinear detectors. In this paper, we propose an enhanced multisurface method (EMSM) for multiuser detection in multipath channels. EMSM is an intermediate piecewise linear detection scheme with a run-time complexity linear in the number of users. Its bit error rate performance is compared with existing linear detectors, a nonlinear radial basis function detector trained by the new support vector learning algorithm, and Verdu's optimal detector. Simulations in multipath channels, for both synchronous and asynchronous cases, indicate that it always outperforms all other linear detectors, performing nearly as well as nonlinear detectors.

  19. An evaluation of the effects of high visual taskload on the separate behaviors involved in complex monitoring performance.

    DOT National Transportation Integrated Search

    1988-01-01

    Operational monitoring situations, in contrast to typical laboratory vigilance tasks, generally involve more than just stimulus detection and recognition. They frequently involve complex multidimensional discriminations, interpretations of significan...

  20. Complex performance during exposure to high temperatures.

    DOT National Transportation Integrated Search

    1969-06-01

    The effects of high temperature on psychomotor performance and physiological function were studied on male pilots (age 30-51) holding a current medical certificate. A total of 41 runs were made at neutral (23.8C (75F), or hot (60.0C (140F), 71.1C (16...

  1. Towards high-performance materials for road construction

    NASA Astrophysics Data System (ADS)

    Gladkikh, V.; Korolev, E.; Smirnov, V.

    2017-10-01

    Due to the constant increase in traffic, modern road construction is in need of high-performance pavement materials. The operational performance of such materials can be characterized by many properties; nevertheless, the most important ones are resistance to rutting and resistance to dynamic loads. It was proposed earlier to use sulfur extended asphalt concrete in road construction practice. To reduce the emission of sulfur dioxide and hydrogen sulfide during the concrete mix preparation and pavement production stages, it is beneficial to make such a concrete on the basis of a complex sulfur modifier. In the present work the influence of the complex modifier on the mechanical properties of sulfur extended asphalt concrete was examined. It was shown that sulfur extended asphalt concrete has high mechanical properties. It was also revealed that there are anomalous negative correlations between strain capacity, fatigue life and fracture toughness.

  2. GaAs VLSI technology and circuit elements for DSP

    NASA Astrophysics Data System (ADS)

    Mikkelson, James M.

    1990-10-01

    Recent progress in digital GaAs circuit performance and complexity is presented to demonstrate the current capabilities of GaAs components. High density GaAs process technology and circuit design techniques are described, and critical issues for achieving favorable complexity, speed, power and cost tradeoffs are reviewed. Some DSP building blocks are described to provide examples of what types of DSP systems could be implemented with present GaAs technology. DIGITAL GaAs CIRCUIT CAPABILITIES: In the past few years the capabilities of digital GaAs circuits have dramatically increased to the VLSI level. Major gains in circuit complexity and power-delay products have been achieved by the use of silicon-like process technologies and simple circuit topologies. The very high speed and low power consumption of digital GaAs VLSI circuits have made GaAs a desirable alternative to high performance silicon in hardware-intensive, high speed system applications. An example of the performance and integration complexity available with GaAs VLSI circuits is the 64x64 crosspoint switch shown in figure 1. This switch, which is the most complex GaAs circuit currently available, is designed on a 30 gate GaAs gate array. It operates at 200 MHz and dissipates only 8 watts of power. The reasons for increasing the level of integration of GaAs circuits are similar to the reasons for the continued increase of silicon circuit complexity. The market factors driving GaAs VLSI are system design methodology, system cost, power and reliability. System designers are hesitant or unwilling to go backwards to previous design techniques and lower levels of integration. A more highly integrated system in a lower performance technology can often approach the performance of a system in a higher performance technology at a lower level of integration. Higher levels of integration also lower the system component count, which reduces the system cost, size and power consumption while improving the system reliability. For large gate count circuits the power per gate must be minimized to prevent reliability and cooling problems. The technical factors which favor increasing GaAs circuit complexity are primarily related to reducing the speed and power penalties incurred when crossing chip boundaries. Because the internal GaAs chip logic levels are not compatible with standard silicon I/O levels, input receivers and output drivers are needed to convert levels. These I/O circuits add significant delay to logic paths, consume large amounts of power and use an appreciable portion of the die area. The effects of these I/O penalties can be reduced by increasing the ratio of core logic to I/O on a chip. DSP operations which have a large number of logic stages between the input and the output are ideal candidates to take advantage of the performance of GaAs digital circuits. Figure 2 is a schematic representation of the I/O penalties encountered when converting from ECL levels to GaAs

  3. Performance comparison of leading image codecs: H.264/AVC Intra, JPEG2000, and Microsoft HD Photo

    NASA Astrophysics Data System (ADS)

    Tran, Trac D.; Liu, Lijie; Topiwala, Pankaj

    2007-09-01

    This paper provides a detailed rate-distortion performance comparison between JPEG2000, Microsoft HD Photo, and H.264/AVC High Profile 4:4:4 I-frame coding for high-resolution still images and high-definition (HD) 1080p video sequences. This work is an extension of our previous comparative study published in earlier SPIE conferences [1, 2]. Here we further optimize all three codecs for compression performance. Coding simulations are performed on a set of large-format color images captured from mainstream digital cameras and 1080p HD video sequences commonly used for H.264/AVC standardization work. Overall, our experimental results show that all three codecs offer very similar coding performance at the high-quality, high-resolution setting. Differences tend to be data-dependent: JPEG2000 with its wavelet technology tends to be the best performer with smooth spatial data; H.264/AVC High-Profile with advanced spatial prediction modes tends to cope best with more complex visual content; Microsoft HD Photo tends to be the most consistent across the board. For the still-image data sets, JPEG2000 offers the best R-D performance gains (around 0.2 to 1 dB in peak signal-to-noise ratio) over H.264/AVC High-Profile intra coding and Microsoft HD Photo. For the 1080p video data set, all three codecs offer very similar coding performance. As in [1, 2], we consider neither scalability nor complexity in this study (JPEG2000 is operating in non-scalable, but optimal performance mode).

  4. Diminished heart rate complexity in adolescent girls: a sign of vulnerability to anxiety disorders?

    PubMed

    Fiol-Veny, Aina; De la Torre-Luque, Alejandro; Balle, Maria; Bornas, Xavier

    2018-07-01

    Diminished heart rate variability has been found to be associated with high anxiety symptomatology. Since adolescence is the period of onset for many anxiety disorders, this study aimed to determine sex- and anxiety-related differences in heart rate variability and complexity in adolescents. We created four groups according to sex and anxiety symptomatology: high-anxiety girls (n = 24) and boys (n = 25), and low-anxiety girls (n = 22) and boys (n = 24), and recorded their cardiac function while they performed regular school activities. A series of two-way (sex and anxiety) MANOVAs was performed on time-domain variability, frequency-domain variability, and non-linear complexity. We obtained no multivariate interaction effects between sex and anxiety, but highly anxious participants had lower heart rate variability than the low-anxiety group. Regarding sex, girls showed lower heart rate variability and complexity than boys. The results suggest that adolescent girls have a less flexible cardiac system, which could be a marker of vulnerability to developing anxiety disorders.
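
    Illustrative sketch (not from the record): the study contrasts time-domain heart rate variability with non-linear complexity. The Python snippet below computes two standard time-domain indices (SDNN and RMSSD) from a hypothetical RR-interval series; it is a generic sketch, not the authors' analysis pipeline.

        import numpy as np

        def sdnn(rr_ms):
            """Standard deviation of RR intervals (overall variability), in ms."""
            return float(np.std(rr_ms, ddof=1))

        def rmssd(rr_ms):
            """Root mean square of successive RR differences (short-term variability), in ms."""
            return float(np.sqrt(np.mean(np.diff(rr_ms) ** 2)))

        # Hypothetical RR intervals (ms); the study used ECG recorded during school activities.
        rr = np.random.normal(800, 50, 300)
        print(f"SDNN:  {sdnn(rr):.1f} ms")
        print(f"RMSSD: {rmssd(rr):.1f} ms")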

  5. Human-Robot Interaction in High Vulnerability Domains

    NASA Technical Reports Server (NTRS)

    Gore, Brian F.

    2016-01-01

    Future NASA missions will require successful integration of the human with highly complex systems. Highly complex systems are likely to involve humans, automation, and some level of robotic assistance. These complex environments will require successful integration of the human with automation, with robots, and with human-automation-robot teams to accomplish mission-critical goals. Many challenges exist for the human performing in these types of operational environments with these kinds of systems. Systems must be designed to optimally integrate various levels of inputs and outputs based on the roles and responsibilities of the human, the automation, and the robots, ranging from direct manual control, to shared human-robotic control, to no active human control (i.e., human supervisory control). It is assumed that the human will remain involved at some level. Technologies that vary based on contextual demands and on operator characteristics (workload, situation awareness) will be needed when the human integrates into these systems. Predictive models that estimate the impact of these technologies on system performance and on the human operator are also needed to meet the challenges associated with such future complex human-automation-robot systems in extreme environments.

  6. Spectral-Element Seismic Wave Propagation Codes for both Forward Modeling in Complex Media and Adjoint Tomography

    NASA Astrophysics Data System (ADS)

    Smith, J. A.; Peter, D. B.; Tromp, J.; Komatitsch, D.; Lefebvre, M. P.

    2015-12-01

    We present both SPECFEM3D_Cartesian and SPECFEM3D_GLOBE open-source codes, representing high-performance numerical wave solvers simulating seismic wave propagation for local-, regional-, and global-scale application. These codes are suitable for both forward propagation in complex media and tomographic imaging. Both solvers compute highly accurate seismic wave fields using the continuous Galerkin spectral-element method on unstructured meshes. Lateral variations in compressional- and shear-wave speeds, density, as well as 3D attenuation Q models, topography and fluid-solid coupling are all readily included in both codes. For global simulations, effects due to rotation, ellipticity, the oceans, 3D crustal models, and self-gravitation are additionally included. Both packages provide forward and adjoint functionality suitable for adjoint tomography on high-performance computing architectures. We highlight the most recent release of the global version which includes improved performance, simultaneous MPI runs, OpenCL and CUDA support via an automatic source-to-source transformation library (BOAST), parallel I/O readers and writers for databases using ADIOS and seismograms using the recently developed Adaptable Seismic Data Format (ASDF) with built-in provenance. This makes our spectral-element solvers current state-of-the-art, open-source community codes for high-performance seismic wave propagation on arbitrarily complex 3D models. Together with these solvers, we provide full-waveform inversion tools to image the Earth's interior at unprecedented resolution.

  7. Observations and Measurements of Wing Parameters of the Selected Beetle Species and the Design of a Mechanism Structure Implementing a Complex Wing Movement

    NASA Astrophysics Data System (ADS)

    Geisler, T.

    2016-12-01

    Beetle wings perform a flapping movement consisting of rotation about two axes. This paper presents the results of observations and measurements of wing operating parameters in different planes for selected beetle species. High-speed photos and videos were used. A concept for a mechanism performing this complex wing movement was proposed and developed.

  8. The Effects of Modality and Multimedia Comprehension on the Performance of Students with Varied Multimedia Comprehension Abilities when Exposed to High Complexity, Self-Paced Multimedia Instructional Materials

    ERIC Educational Resources Information Center

    Al-Abbasi, Daniah

    2012-01-01

    Poor multimedia comprehenders suffer from a decreased ability in comprehending complex textual and pictorial materials (Maki & Maki, 2002). This deficit will lead to an overloaded working memory and consequently decreased performance (Carretti, Borella, Cornoldi, & De Beni, 2009). The purpose of this research study was to examine the effects of…

  9. One-step manufacturing of innovative flat-knitted 3D net-shape preforms for composite applications

    NASA Astrophysics Data System (ADS)

    Bollengier, Quentin; Wieczorek, Florian; Hellmann, Sven; Trümper, Wolfgang; Cherif, Chokri

    2017-10-01

    Mostly due to cost-intensive, manually performed processing operations, the production of complex-shaped fibre-reinforced plastic composites (FRPC) is currently very expensive and therefore restricted either to sectors with high added value or to small-batch applications (e.g. in the aerospace or automotive industry). Previous works suggest that the successful integration of conventional textile manufacturing processes into the FRPC process chain is the key to cost-efficient manufacturing of complex three-dimensional (3D) FRPC components with stress-oriented fibre arrangement. Therefore, this work focuses on the development of the multilayer weft knitting technology for the one-step manufacturing of complex 3D net-shaped preforms for high-performance FRPC applications. In order to highlight the advantages of net-shaped multilayer weft-knitted fabrics for the production of complex FRPC parts, seamless preforms such as 3D skin-stringer structures and tubular fabrics with load-oriented fibre arrangement are realised. In this paper, the development of the textile bindings and the technical modifications made to flat knitting machines are presented. The results show that the multilayer weft knitting technology perfectly meets the requirements for a fully automated and reproducible manufacturing of complex 3D textile preforms with stress-oriented fibre arrangement.

  10. A method for surface topography measurement using a new focus function based on dual-tree complex wavelet transform

    NASA Astrophysics Data System (ADS)

    Li, Shimiao; Guo, Tong; Yuan, Lin; Chen, Jinping

    2018-01-01

    Surface topography measurement is an important tool widely used in many fields to determine the characteristics and functionality of a part or material. Among existing methods for this purpose, the focus variation method has demonstrated high performance, particularly in large-slope scenarios. However, its performance depends largely on the effectiveness of the focus function. This paper presents a method for surface topography measurement using a new focus measurement function based on the dual-tree complex wavelet transform. Experiments were conducted on simulated defocused images to demonstrate its performance in comparison with traditional approaches. The results showed that the new algorithm has better unimodality and sharpness. The method was also verified by measuring a MEMS micro-resonator structure.
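
    Illustrative sketch (not from the record): a wavelet-based focus measure scores an image by the energy of its high-frequency subbands, which peaks for the best-focused frame. The Python sketch below uses an ordinary discrete wavelet transform (PyWavelets) as a stand-in for the paper's dual-tree complex wavelet transform, so it illustrates the idea rather than the published focus function.

        import numpy as np
        import pywt

        def wavelet_focus_measure(image):
            """Focus score: energy of the detail subbands of a one-level 2-D DWT
            (a simplified stand-in for the dual-tree complex wavelet transform)."""
            _, (cH, cV, cD) = pywt.dwt2(image.astype(np.float64), "db2")
            return float((cH ** 2).sum() + (cV ** 2).sum() + (cD ** 2).sum())

        # Synthetic demo: a sharper (less blurred) image scores higher.
        sharp = np.random.rand(128, 128)
        blurred = 0.25 * (sharp + np.roll(sharp, 1, 0) + np.roll(sharp, 1, 1) + np.roll(sharp, 1, (0, 1)))
        print(wavelet_focus_measure(sharp) > wavelet_focus_measure(blurred))  # expected: True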

  11. The Curvilinear Relationship between State Neuroticism and Momentary Task Performance

    PubMed Central

    Debusscher, Jonas; Hofmans, Joeri; De Fruyt, Filip

    2014-01-01

    A daily diary and two experience sampling studies were carried out to investigate curvilinearity of the within-person relationship between state neuroticism and task performance, as well as the moderating effects of within-person variation in momentary job demands (i.e., work pressure and task complexity). In one, results showed that under high work pressure, the state neuroticism–task performance relationship was best described by an exponentially decreasing curve, whereas an inverted U-shaped curve was found for tasks low in work pressure, while in another study, a similar trend was visible for task complexity. In the final study, the state neuroticism–momentary task performance relationship was a linear one, and this relationship was moderated by momentary task complexity. Together, results from all three studies showed that it is important to take into account the moderating effects of momentary job demands because within-person variation in job demands affects the way in which state neuroticism relates to momentary levels of task performance. Specifically, we found that experiencing low levels of state neuroticism may be most beneficial in high demanding tasks, whereas more moderate levels of state neuroticism are optimal under low momentary job demands. PMID:25238547

  12. Measurements of complex impedance in microwave high power systems with a new bluetooth integrated circuit.

    PubMed

    Roussy, Georges; Dichtel, Bernard; Chaabane, Haykel

    2003-01-01

    By using a new integrated circuit, which is marketed for bluetooth applications, it is possible to simplify the method of measuring the complex impedance, complex reflection coefficient and complex transmission coefficient in an industrial microwave setup. The Analog Devices circuit AD 8302, which measures gain and phase up to 2.7 GHz, operates with variable level input signals and is less sensitive to both amplitude and frequency fluctuations of the industrial magnetrons than are mixers and AM crystal detectors. Therefore, accurate gain and phase measurements can be performed with low stability generators. A mechanical setup with an AD 8302 is described; the calibration procedure and its performance are presented.
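
    Illustrative sketch (not from the record): once the AD8302 reports a gain (in dB) and a relative phase for the reflected wave, the complex reflection coefficient and load impedance follow from textbook transmission-line relations. The Python sketch below uses assumed readings and a 50-ohm reference; the values are placeholders, not measurements from the paper's setup.

        import numpy as np

        gain_db = -9.5     # assumed measured |reflected/incident| ratio, in dB
        phase_deg = 42.0   # assumed measured relative phase, in degrees
        z0 = 50.0          # assumed reference impedance of the line, in ohms

        # Complex reflection coefficient from magnitude (dB) and phase.
        gamma = 10 ** (gain_db / 20.0) * np.exp(1j * np.deg2rad(phase_deg))

        # Complex load impedance from the reflection coefficient.
        z_load = z0 * (1 + gamma) / (1 - gamma)
        print(f"Gamma = {gamma:.3f}, Z = {z_load.real:.1f} {z_load.imag:+.1f}j ohms")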

  13. Does the Aristotle Score predict outcome in congenital heart surgery?

    PubMed

    Kang, Nicholas; Tsang, Victor T; Elliott, Martin J; de Leval, Marc R; Cole, Timothy J

    2006-06-01

    The Aristotle Score has been proposed as a measure of 'complexity' in congenital heart surgery, and a tool for comparing performance amongst different centres. To date, however, it remains unvalidated. We examined whether the Basic Aristotle Score was a useful predictor of mortality following open-heart surgery, and compared it to the Risk Adjustment in Congenital Heart Surgery (RACHS-1) system. We also examined the ability of the Aristotle Score to measure performance. The Basic Aristotle Score and RACHS-1 risk categories were assigned retrospectively to 1085 operations involving cardiopulmonary bypass in children less than 18 years of age. Multiple logistic regression analysis was used to determine the significance of the Aristotle Score and RACHS-1 category as independent predictors of in-hospital mortality. Operative performance was calculated using the Aristotle equation: performance = complexity x survival. Multiple logistic regression identified RACHS-1 category to be a powerful predictor of mortality (Wald 17.7, p < 0.0001), whereas Aristotle Score was only weakly associated with mortality (Wald 4.8, p = 0.03). Age at operation and bypass time were also highly significant predictors of postoperative death (Wald 13.7 and 33.8, respectively, p < 0.0001 for both). Operative performance was measured at 7.52 units. The Basic Aristotle Score was only weakly associated with postoperative mortality in this series. Operative performance appeared to be inflated by the fact that the overall complexity of cases was relatively high in this series. An alternative equation (performance = complexity/mortality) is proposed as a fairer and more logical method of risk-adjustment.
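
    Illustrative sketch (not from the record): the abstract quotes the Aristotle equation, performance = complexity x survival, and proposes performance = complexity / mortality as an alternative. The Python lines below simply evaluate both formulas on placeholder numbers to show how differently they respond to mortality; the inputs are not the study's data.

        def performance_aristotle(mean_complexity, survival_rate):
            """Aristotle equation quoted in the abstract: complexity x survival."""
            return mean_complexity * survival_rate

        def performance_alternative(mean_complexity, mortality_rate):
            """Alternative proposed in the abstract: complexity / mortality."""
            return mean_complexity / mortality_rate

        complexity = 8.0   # hypothetical mean Basic Aristotle Score
        mortality = 0.04   # hypothetical in-hospital mortality rate
        print(performance_aristotle(complexity, 1 - mortality))   # barely moves as mortality changes
        print(performance_alternative(complexity, mortality))     # penalizes mortality much more sharply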

  14. A Parallel Rendering Algorithm for MIMD Architectures

    NASA Technical Reports Server (NTRS)

    Crockett, Thomas W.; Orloff, Tobias

    1991-01-01

    Applications such as animation and scientific visualization demand high performance rendering of complex three dimensional scenes. To deliver the necessary rendering rates, highly parallel hardware architectures are required. The challenge is then to design algorithms and software which effectively use the hardware parallelism. A rendering algorithm targeted to distributed memory MIMD architectures is described. For maximum performance, the algorithm exploits both object-level and pixel-level parallelism. The behavior of the algorithm is examined both analytically and experimentally. Its performance for large numbers of processors is found to be limited primarily by communication overheads. An experimental implementation for the Intel iPSC/860 shows increasing performance from 1 to 128 processors across a wide range of scene complexities. It is shown that minimal modifications to the algorithm will adapt it for use on shared memory architectures as well.
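
    Illustrative sketch (not from the record): the abstract reports that speedup at large processor counts is limited primarily by communication overheads. A toy fixed-size speedup model in Python, with assumed constants rather than iPSC/860 measurements, shows how a per-processor communication cost caps the achievable speedup.

        def speedup(p, work=1.0, comm_per_proc=0.002):
            """Fixed-size speedup when per-frame work divides across p processors
            but communication overhead grows linearly with p (assumed model)."""
            return work / (work / p + comm_per_proc * p)

        for p in (1, 16, 64, 128, 256):
            print(f"p={p:4d}  speedup={speedup(p):6.1f}")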

  15. An Analysis of Performance Enhancement Techniques for Overset Grid Applications

    NASA Technical Reports Server (NTRS)

    Djomehri, J. J.; Biswas, R.; Potsdam, M.; Strawn, R. C.; Biegel, Bryan (Technical Monitor)

    2002-01-01

    The overset grid methodology has significantly reduced time-to-solution of high-fidelity computational fluid dynamics (CFD) simulations about complex aerospace configurations. The solution process resolves the geometrical complexity of the problem domain by using separately generated but overlapping structured discretization grids that periodically exchange information through interpolation. However, high performance computations of such large-scale realistic applications must be handled efficiently on state-of-the-art parallel supercomputers. This paper analyzes the effects of various performance enhancement techniques on the parallel efficiency of an overset grid Navier-Stokes CFD application running on an SGI Origin2000 machine. Specifically, the role of asynchronous communication, grid splitting, and grid grouping strategies are presented and discussed. Results indicate that performance depends critically on the level of latency hiding and the quality of load balancing across the processors.

  16. 42 CFR 493.825 - Standard; Mycobacteriology.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard; Mycobacteriology. 493.825 Section 493.825 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... Performing Tests of Moderate Complexity (including the Subcategory), High Complexity, Or Any Combination of...

  17. 42 CFR 493.849 - Condition: Hematology.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Condition: Hematology. 493.849 Section 493.849 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... Performing Tests of Moderate Complexity (including the Subcategory), High Complexity, Or Any Combination of...

  18. 42 CFR 493.845 - Standard; Toxicology.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Standard; Toxicology. 493.845 Section 493.845 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... Performing Tests of Moderate Complexity (including the Subcategory), High Complexity, Or Any Combination of...

  19. 42 CFR 493.849 - Condition: Hematology.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Condition: Hematology. 493.849 Section 493.849 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... Performing Tests of Moderate Complexity (including the Subcategory), High Complexity, Or Any Combination of...

  20. 42 CFR 493.853 - Condition: Pathology.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Condition: Pathology. 493.853 Section 493.853 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... Performing Tests of Moderate Complexity (including the Subcategory), High Complexity, Or Any Combination of...

  1. 42 CFR 493.823 - Standard; Bacteriology.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Standard; Bacteriology. 493.823 Section 493.823 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... Performing Tests of Moderate Complexity (including the Subcategory), High Complexity, Or Any Combination of...

  2. 42 CFR 493.845 - Standard; Toxicology.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard; Toxicology. 493.845 Section 493.845 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... Performing Tests of Moderate Complexity (including the Subcategory), High Complexity, Or Any Combination of...

  3. 42 CFR 493.857 - Condition: Immunohematology.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Condition: Immunohematology. 493.857 Section 493.857 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... Performing Tests of Moderate Complexity (including the Subcategory), High Complexity, Or Any Combination of...

  4. 42 CFR 493.825 - Standard; Mycobacteriology.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Standard; Mycobacteriology. 493.825 Section 493.825 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... Performing Tests of Moderate Complexity (including the Subcategory), High Complexity, Or Any Combination of...

  5. 42 CFR 493.851 - Standard; Hematology.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Standard; Hematology. 493.851 Section 493.851 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... Performing Tests of Moderate Complexity (including the Subcategory), High Complexity, Or Any Combination of...

  6. 42 CFR 493.865 - Standard; Antibody identification.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Standard; Antibody identification. 493.865 Section 493.865 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN... for Laboratories Performing Tests of Moderate Complexity (including the Subcategory), High Complexity...

  7. 42 CFR 493.829 - Standard; Parasitology.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard; Parasitology. 493.829 Section 493.829 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... Performing Tests of Moderate Complexity (including the Subcategory), High Complexity, Or Any Combination of...

  8. 42 CFR 493.863 - Standard; Compatibility testing.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard; Compatibility testing. 493.863 Section 493.863 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN... for Laboratories Performing Tests of Moderate Complexity (including the Subcategory), High Complexity...

  9. 42 CFR 493.863 - Standard; Compatibility testing.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Standard; Compatibility testing. 493.863 Section 493.863 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN... for Laboratories Performing Tests of Moderate Complexity (including the Subcategory), High Complexity...

  10. 42 CFR 493.857 - Condition: Immunohematology.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Condition: Immunohematology. 493.857 Section 493.857 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... Performing Tests of Moderate Complexity (including the Subcategory), High Complexity, Or Any Combination of...

  11. 42 CFR 493.833 - Condition: Diagnostic immunology.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Condition: Diagnostic immunology. 493.833 Section 493.833 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN... for Laboratories Performing Tests of Moderate Complexity (including the Subcategory), High Complexity...

  12. 42 CFR 493.833 - Condition: Diagnostic immunology.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Condition: Diagnostic immunology. 493.833 Section 493.833 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN... for Laboratories Performing Tests of Moderate Complexity (including the Subcategory), High Complexity...

  13. 42 CFR 493.843 - Standard; Endocrinology.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Standard; Endocrinology. 493.843 Section 493.843 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... Performing Tests of Moderate Complexity (including the Subcategory), High Complexity, Or Any Combination of...

  14. 42 CFR 493.823 - Standard; Bacteriology.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard; Bacteriology. 493.823 Section 493.823 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... Performing Tests of Moderate Complexity (including the Subcategory), High Complexity, Or Any Combination of...

  15. 42 CFR 493.853 - Condition: Pathology.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Condition: Pathology. 493.853 Section 493.853 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... Performing Tests of Moderate Complexity (including the Subcategory), High Complexity, Or Any Combination of...

  16. 42 CFR 493.851 - Standard; Hematology.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard; Hematology. 493.851 Section 493.851 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... Performing Tests of Moderate Complexity (including the Subcategory), High Complexity, Or Any Combination of...

  17. 42 CFR 493.865 - Standard; Antibody identification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard; Antibody identification. 493.865 Section 493.865 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN... for Laboratories Performing Tests of Moderate Complexity (including the Subcategory), High Complexity...

  18. 42 CFR 493.829 - Standard; Parasitology.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Standard; Parasitology. 493.829 Section 493.829 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... Performing Tests of Moderate Complexity (including the Subcategory), High Complexity, Or Any Combination of...

  19. General Synthesis of Transition-Metal Oxide Hollow Nanospheres/Nitrogen-Doped Graphene Hybrids by Metal-Ammine Complex Chemistry for High-Performance Lithium-Ion Batteries.

    PubMed

    Chen, Jiayuan; Wu, Xiaofeng; Gong, Yan; Wang, Pengfei; Li, Wenhui; Mo, Shengpeng; Peng, Shengpan; Tan, Qiangqiang; Chen, Yunfa

    2018-02-09

    We present a general and facile synthesis strategy, on the basis of metal-ammine complex chemistry, for synthesizing hollow transition-metal oxides (Co3O4, NiO, CuO-Cu2O, and ZnO)/nitrogen-doped graphene hybrids, potentially applied in high-performance lithium-ion batteries. The oxygen-containing functional groups of graphene oxide play a prerequisite role in the formation of hollow transition-metal oxides on graphene nanosheets, and a significant hollowing process occurs only when forming metal (Co2+, Ni2+, Cu2+, or Zn2+)-ammine complex ions. Moreover, the hollowing process is well correlated with the complexing capacity between metal ions and NH3 molecules. The significant hollowing process occurs for strong metal-ammine complex ions including Co2+, Ni2+, Cu2+, and Zn2+ ions, and no hollow structures formed for weak and/or noncomplex Mn2+ and Fe3+ ions. Simultaneously, this novel strategy can also achieve the direct doping of nitrogen atoms into the graphene framework. The electrochemical performance of two typical hollow Co3O4 or NiO/nitrogen-doped graphene hybrids was evaluated by their use as anodic materials. It was demonstrated that these unique nanostructured hybrids, in contrast with the bare counterparts, solid transition-metal oxides/nitrogen-doped graphene hybrids, perform with significantly improved specific capacity, superior rate capability, and excellent capacity retention. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Iridium Complexes with Proton-Responsive Azole-Type Ligands as Effective Catalysts for CO2 Hydrogenation

    DOE PAGES

    Ertem, Mehmed Zahid; Suna, Yuki; Himeda, Yuichiro; ...

    2017-10-06

    Pentamethylcyclopentadienyl iridium (Cp*Ir) complexes with bidentate ligands consisting of a pyridine ring and an electron-rich diazole ring were prepared. Their catalytic activity towards CO2 hydrogenation in 2.0 M KHCO3 aqueous solutions (pH 8.5) at 50 °C, under 1.0 MPa CO2/H2 (1:1), has been reported as an alternative to photo- and electrochemical CO2 reduction. Bidentate ligands incorporating an electron-rich diazole ring improved the catalytic performance of the Ir complexes compared to the bipyridine ligand. Complexes 2, 4 and 6, possessing both a hydroxy group and an uncoordinated NH group, which are proton-responsive and capable of generating pendent bases in basic media, recorded high initial TOF values of 1300 h^-1, 1550 h^-1, and 2000 h^-1, respectively. Here, spectroscopic and computational investigations revealed that the reversible deprotonation changes the electronic properties of the complexes and causes interactions between the pendent base and substrate and/or solvent water molecules, resulting in the high catalytic performance in basic media.

  1. The Telecommunications Review

    DOT National Transportation Integrated Search

    1991-01-01

    TELECOMMUNICATION NETWORKS ARE BECOMING INCREASINGLY COMPLEX. THEY MUST MEET THE NEEDS OF SOPHISTICATED USERS WHO DEMAND A HIGH LEVEL OF FUNCTIONALITY, HIGH PERFORMANCE, AND LOW COST. TELECOMMUNICATIONS TECHNOLOGY IS CHANGING RAPIDLY TO MEET THA...

  2. Molecular identification and in vitro antifungal susceptibility of Scedosporium complex isolates from high-human-activity sites in Mexico.

    PubMed

    Elizondo-Zertuche, Mariana; de J Treviño-Rangel, Rogelio; Robledo-Leal, Efrén; Luna-Rodríguez, Carolina E; Martínez-Fierro, Margarita L; Rodríguez-Sánchez, Iram P; González, Gloria M

    2017-01-01

    The genus Scedosporium is a complex of ubiquitous moulds associated with a wide spectrum of clinical entities, with high mortality principally in immunocompromised hosts. Ecology of these microorganisms has been studied performing isolations from environmental sources, showing a preference for human-impacted environments. This study aimed to evaluate the presence and antifungal susceptibility of Scedosporium complex species in soil samples collected in high-human-activity sites of Mexico. A total of 97 soil samples from 25 Mexican states were collected. Identifications were performed by microscopic morphology and confirmed by sequencing of the rDNA (internal transcribed spacer [ITS], D1/D2) and β-tubulin partial loci. Antifungal susceptibility testing was performed according to the Clinical and Laboratory Standards Institute (CLSI) protocols. Soil samples of urban gardens and industrial parks constituted the best sources for isolation of Scedosporium complex species. S. apiospermum sensu stricto was the most prevalent species (69%), followed by S. boydii (16%). Voriconazole (minimal inhibitory concentration [MIC] geometric mean ≤2.08 µg/mL), followed by posaconazole (MIC geometric mean ≤2.64 µg/mL), exhibited excellent in vitro activity for most species. Amphotericin B and fluconazole demonstrated limited antifungal activity, and all of the strains were resistant to echinocandins. This is the first report in Mexico of environmental distribution and antifungal in vitro susceptibility of these emergent pathogens.

  3. Optimized design of embedded DSP system hardware supporting complex algorithms

    NASA Astrophysics Data System (ADS)

    Li, Yanhua; Wang, Xiangjun; Zhou, Xinling

    2003-09-01

    The paper presents an optimized design method for a flexible and economical embedded DSP system that can implement complex processing algorithms such as biometric recognition, real-time image processing, etc. It consists of a floating-point DSP, 512 Kbytes of data RAM, 1 Mbyte of FLASH program memory, a CPLD for achieving flexible logic control of the input channel, and an RS-485 transceiver for local network communication. Because the design employs the high performance-price-ratio DSP TMS320C6712 and a large FLASH, this system permits loading and performing complex algorithms with little algorithm optimization and code reduction. The CPLD provides flexible logic control for the whole DSP board, especially in the input channel, and allows convenient interfacing between different sensors and the DSP system. The transceiver circuit can transfer data between the DSP and a host computer. In the paper, some key technologies are also introduced which make the whole system work efficiently. Because of the characteristics referred to above, the hardware is a perfect platform for multi-channel data collection, image processing, and other signal processing with high performance and adaptability. The application section of this paper presents how this hardware is adapted for a biometric identification system with high identification precision. The result reveals that this hardware is easy to interface with a CMOS imager and is capable of carrying out complex biometric identification algorithms, which require real-time processing.

  4. 42 CFR 493.835 - Standard; Syphilis serology.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard; Syphilis serology. 493.835 Section 493.835 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... Performing Tests of Moderate Complexity (including the Subcategory), High Complexity, Or Any Combination of...

  5. 42 CFR 493.861 - Standard; Unexpected antibody detection.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard; Unexpected antibody detection. 493.861 Section 493.861 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN... for Laboratories Performing Tests of Moderate Complexity (including the Subcategory), High Complexity...

  6. 42 CFR 493.861 - Standard; Unexpected antibody detection.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Standard; Unexpected antibody detection. 493.861 Section 493.861 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN... for Laboratories Performing Tests of Moderate Complexity (including the Subcategory), High Complexity...

  7. 42 CFR 493.837 - Standard; General immunology.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Standard; General immunology. 493.837 Section 493.837 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... Performing Tests of Moderate Complexity (including the Subcategory), High Complexity, Or Any Combination of...

  8. 42 CFR 493.837 - Standard; General immunology.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard; General immunology. 493.837 Section 493.837 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... Performing Tests of Moderate Complexity (including the Subcategory), High Complexity, Or Any Combination of...

  9. 42 CFR 493.835 - Standard; Syphilis serology.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Standard; Syphilis serology. 493.835 Section 493.835 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... Performing Tests of Moderate Complexity (including the Subcategory), High Complexity, Or Any Combination of...

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ertem, Mehmed Zahid; Suna, Yuki; Himeda, Yuichiro

    Pentamethylcyclopentadienyl iridium (Cp*Ir) complexes with bidentate ligands consisting of a pyridine ring and an electron-rich diazole ring were prepared. Their catalytic activity towards CO2 hydrogenation in 2.0 M KHCO3 aqueous solutions (pH 8.5) at 50 °C, under 1.0 MPa CO2/H2 (1:1), has been reported as an alternative to photo- and electrochemical CO2 reduction. Bidentate ligands incorporating an electron-rich diazole ring improved the catalytic performance of the Ir complexes compared to the bipyridine ligand. Complexes 2, 4 and 6, possessing both a hydroxy group and an uncoordinated NH group, which are proton-responsive and capable of generating pendent bases in basic media, recorded high initial TOF values of 1300 h^-1, 1550 h^-1, and 2000 h^-1, respectively. Here, spectroscopic and computational investigations revealed that the reversible deprotonation changes the electronic properties of the complexes and causes interactions between the pendent base and substrate and/or solvent water molecules, resulting in the high catalytic performance in basic media.

  11. High-performance liquid chromatography coupled with post-column dual-bioactivity assay for simultaneous screening of xanthine oxidase inhibitors and free radical scavengers from complex mixture.

    PubMed

    Li, D Q; Zhao, J; Li, S P

    2014-06-06

    Xanthine oxidase (XO) can catalyze hypoxanthine and xanthine to generate uric acid and reactive oxygen species (ROS), including the superoxide anion radical (O₂(•-)) and hydrogen peroxide. XO inhibitors and free radical scavengers are beneficial for the treatment of gout and many related diseases. In the present study, an on-line high-performance liquid chromatography (HPLC) method coupled with a post-column dual-bioactivity assay was established and successfully applied to the simultaneous screening of XO inhibitors and free radical scavengers from a complex mixture, an Oroxylum indicum extract. The integrated system of HPLC separation, bioactivity screening, and mass spectrometry identification proved simple and effective for rapid and sensitive screening of individual bioactive compounds in complex mixtures. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. Analysis of Proteins, Protein Complexes, and Organellar Proteomes Using Sheathless Capillary Zone Electrophoresis - Native Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Belov, Arseniy M.; Viner, Rosa; Santos, Marcia R.; Horn, David M.; Bern, Marshall; Karger, Barry L.; Ivanov, Alexander R.

    2017-12-01

    Native mass spectrometry (MS) is a rapidly advancing field in the analysis of proteins, protein complexes, and macromolecular species of various types. The majority of native MS experiments reported to date have been conducted using direct infusion of purified analytes into a mass spectrometer. In this study, capillary zone electrophoresis (CZE) was coupled online to Orbitrap mass spectrometers using a commercial sheathless interface to enable high-performance separation, identification, and structural characterization of limited amounts of purified proteins and protein complexes, the latter with preserved non-covalent associations under native conditions. The performance of both bare fused-silica and polyacrylamide-coated capillaries was assessed using mixtures of protein standards known to form non-covalent protein-protein and protein-ligand complexes. High-efficiency separation of native complexes is demonstrated using both capillary types, while the neutral polyacrylamide-coated capillary showed better reproducibility and higher efficiency for more complex samples. The platform was then evaluated for the determination of monoclonal antibody aggregation and for analysis of proteomes of limited complexity using a ribosomal isolate from E. coli. Native CZE-MS, using accurate single-stage and tandem-MS measurements, enabled identification of proteoforms and non-covalent complexes at femtomole levels. This study demonstrates that native CZE-MS can serve as an orthogonal and complementary technique to conventional native MS methodologies with the advantages of low sample consumption, minimal sample processing and losses, and high throughput and sensitivity. This study presents a novel platform for analysis of ribosomes and other macromolecular complexes and organelles, with the potential for discovery of novel structural features defining cellular phenotypes (e.g., specialized ribosomes).

  13. High Resolution Numerical Simulations of Primary Atomization in Diesel Sprays with Single Component Reference Fuels

    DTIC Science & Technology

    2015-09-01

    A high-resolution numerical simulation of jet breakup and spray formation from a complex diesel fuel injector at diesel engine type conditions has been performed. A full understanding of the primary atomization process in diesel fuel ... In diesel liquid sprays the complexity is further compounded by the physical attributes present, including nozzle turbulence and large density ratios.

  14. Rapid identifying high-influence nodes in complex networks

    NASA Astrophysics Data System (ADS)

    Song, Bo; Jiang, Guo-Ping; Song, Yu-Rong; Xia, Ling-Ling

    2015-10-01

    A tiny fraction of influential individuals play a critical role in the dynamics of complex systems. Identifying the influential nodes in complex networks has theoretical and practical significance. Considering the uncertainties of network scale and topology, and the timeliness of dynamic behaviors in real networks, we propose a rapid identifying method (RIM) to find the fraction of high-influence nodes. Instead of ranking all nodes, our method aims at ranking only a small number of nodes in the network. We set the high-influence nodes as initial spreaders and evaluate the performance of RIM with the susceptible-infected-recovered (SIR) model. The simulations show that, in different networks, RIM performs well at rapidly identifying high-influence nodes, as verified against typical ranking methods such as degree, closeness, betweenness, and eigenvector centrality. Project supported by the National Natural Science Foundation of China (Grant Nos. 61374180 and 61373136), the Ministry of Education Research in the Humanities and Social Sciences Planning Fund Project, China (Grant No. 12YJAZH120), and the Six Projects Sponsoring Talent Summits of Jiangsu Province, China (Grant No. RLD201212).
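
    Illustrative sketch (not from the record): the abstract evaluates rankings by seeding an SIR spread from the selected high-influence nodes and comparing outbreak sizes. The Python sketch below reproduces that evaluation protocol with NetworkX, using plain degree centrality (one of the benchmark rankings named above) to pick the seeds; it is not the RIM algorithm itself, whose details are not given in the abstract.

        import random
        import networkx as nx

        def top_degree_seeds(G, fraction=0.05):
            """Select the highest-degree nodes as initial spreaders."""
            k = max(1, int(fraction * G.number_of_nodes()))
            return [n for n, _ in sorted(G.degree, key=lambda x: x[1], reverse=True)[:k]]

        def sir_outbreak_size(G, seeds, beta=0.1):
            """One run of a discrete-time SIR process with recovery after one step;
            returns the final number of recovered nodes."""
            infected, recovered = set(seeds), set()
            while infected:
                newly = set()
                for u in infected:
                    for v in G.neighbors(u):
                        if v not in infected and v not in recovered and random.random() < beta:
                            newly.add(v)
                recovered |= infected
                infected = newly - recovered
            return len(recovered)

        G = nx.barabasi_albert_graph(1000, 3)
        seeds = top_degree_seeds(G)
        print(sum(sir_outbreak_size(G, seeds) for _ in range(50)) / 50)  # mean outbreak size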

  15. Robust Design of a Particle-Free Silver-Organo-Complex Ink with High Conductivity and Inkjet Stability for Flexible Electronics.

    PubMed

    Vaseem, Mohammad; McKerricher, Garret; Shamim, Atif

    2016-01-13

    Currently, silver-nanoparticle-based inkjet ink is commercially available. This type of ink has several serious problems such as a complex synthesis protocol, high cost, high sintering temperatures (∼200 °C), particle aggregation, nozzle clogging, poor shelf life, and jetting instability. For the emerging field of printed electronics, these shortcomings in conductive inks are barriers for their widespread use in practical applications. Formulating particle-free silver inks has potential to solve these issues and requires careful design of the silver complexation. The ink complex must meet various requirements, such as in situ reduction, optimum viscosity, storage and jetting stability, smooth uniform sintered films, excellent adhesion, and high conductivity. This study presents a robust formulation of silver-organo-complex (SOC) ink, where complexing molecules act as reducing agents. The 17 wt % silver loaded ink was printed and sintered on a wide range of substrates with uniform surface morphology and excellent adhesion. The jetting stability was monitored for 5 months to confirm that the ink was robust and highly stable with consistent jetting performance. Radio frequency inductors, which are highly sensitive to metal quality, were demonstrated as a proof of concept on flexible PEN substrate. This is a major step toward producing high-quality electronic components with a robust inkjet printing process.

  16. Highly Active N,O Zinc Guanidine Catalysts for the Ring-Opening Polymerization of Lactide.

    PubMed

    Schäfer, Pascal M; Fuchs, Martin; Ohligschläger, Andreas; Rittinghaus, Ruth; McKeown, Paul; Akin, Enver; Schmidt, Maximilian; Hoffmann, Alexander; Liauw, Marcel A; Jones, Matthew D; Herres-Pawlis, Sonja

    2017-09-22

    New zinc guanidine complexes with N,O donor functionalities were prepared, characterized by X-ray crystallography, and examined for their catalytic activity in the solvent-free ring-opening polymerization (ROP) of technical-grade rac-lactide at 150 °C. All complexes showed high activity. The fastest complex, [ZnCl2(DMEGasme)] (C1), produced colorless poly(lactide) (PLA) after 90 min with a conversion of 52% and high molar masses (Mw = 69100, polydispersity = 1.4). The complexes were tested with different monomer-to-initiator ratios to determine the rate constant kp. Furthermore, a polymerization with the most active complex C1 was monitored by in situ Raman spectroscopy. Overall, conversion of up to 90% can be obtained. End-group analysis was performed to clarify the mechanism. All four complexes combine robustness against impurities in the lactide with high polymerization rates, and they represent the fastest robust lactide ROP catalysts to date, opening new avenues to a sustainable ROP catalyst family for industrial use. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Explaining high and low performers in complex intervention trials: a new model based on diffusion of innovations theory.

    PubMed

    McMullen, Heather; Griffiths, Chris; Leber, Werner; Greenhalgh, Trisha

    2015-05-31

    Complex intervention trials may require health care organisations to implement new service models. In a recent cluster randomised controlled trial, some participating organisations achieved high recruitment, whereas others found it difficult to assimilate the intervention and were low recruiters. We sought to explain this variation and develop a model to inform organisational participation in future complex intervention trials. The trial included 40 general practices in a London borough with high HIV prevalence. The intervention was offering a rapid HIV test as part of the New Patient Health Check. The primary outcome was mean CD4 cell count at diagnosis. The process evaluation consisted of several hundred hours of ethnographic observation, 21 semi-structured interviews and analysis of routine documents (e.g., patient leaflets, clinical protocols) and trial documents (e.g., inclusion criteria, recruitment statistics). Qualitative data were analysed thematically using--and, where necessary, extending--Greenhalgh et al.'s model of diffusion of innovations. Narrative synthesis was used to prepare case studies of four practices representing maximum variety in clinicians' interest in HIV (assessed by level of serological testing prior to the trial) and performance in the trial (high vs. low recruiters). High-recruiting practices were, in general though not invariably, also innovative practices. They were characterised by strong leadership, good managerial relations, readiness for change, a culture of staff training and available staff time ('slack resources'). Their front-line staff believed that patients might benefit from the rapid HIV test ('relative advantage'), were emotionally comfortable administering it ('compatibility'), skilled in performing it ('task issues') and made creative adaptations to embed the test in local working practices ('reinvention'). Early experience of a positive HIV test ('observability') appeared to reinforce staff commitment to recruiting more participants. Low-performing practices typically had less good managerial relations, significant resource constraints, staff discomfort with the test and no positive results early in the trial. An adaptation of the diffusion of innovations model was an effective analytical tool for retrospectively explaining high and low-performing practices in a complex intervention research trial. Whether the model will work prospectively to predict performance (and hence shape the design of future trials) is unknown. ISRCTN Registry number: ISRCTN63473710. Date assigned: 22 April 2010.

  18. Complex systems in metabolic engineering.

    PubMed

    Winkler, James D; Erickson, Keesha; Choudhury, Alaksh; Halweg-Edwards, Andrea L; Gill, Ryan T

    2015-12-01

    Metabolic engineers manipulate intricate biological networks to build efficient biological machines. The inherent complexity of this task, derived from the extensive and often unknown interconnectivity between and within these networks, often prevents researchers from achieving desired performance. Other fields have developed methods to tackle the issue of complexity for their unique subset of engineering problems, but to date, there has not been extensive and comprehensive examination of how metabolic engineers use existing tools to ameliorate this effect on their own research projects. In this review, we examine how complexity affects engineering at the protein, pathway, and genome levels within an organism, and the tools for handling these issues to achieve high-performing strain designs. Quantitative complexity metrics and their applications to metabolic engineering versus traditional engineering fields are also discussed. We conclude by predicting how metabolic engineering practices may advance in light of an explicit consideration of design complexity. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Lattice-cell orientation disorder in complex spinel oxides

    DOE PAGES

    Chen, Yan; Cheng, Yongqiang; Li, Juchuan; ...

    2016-11-07

    Transition metal (TM) substitution has been widely applied to change the crystal structures of complex oxides to create high-energy-density electrode materials for high-performance rechargeable lithium-ion batteries. The complex local structure imparted to the oxides by the TM arrangement often impacts their electrochemical behavior by influencing the diffusion and intercalation of lithium. Here, a major discrepancy is demonstrated between the global and local structures of the promising high-energy-density, high-voltage LiNi0.5Mn1.5O4 spinel cathode material that contradicts the existing structural models. A new single-phase lattice-cell orientation disorder model is proposed as the mechanism for the local ordering that explains how the inhomogeneous local distortions and the coherent connection give rise to the global structure in the complex oxide. As a result, the single-phase model is consistent with the observed electrochemical behavior of the material.

  20. Development of Fully-Integrated Micromagnetic Actuator Technologies

    DTIC Science & Technology

    2015-07-13

    nonexistent because of certain design and fabrication challenges, primarily the inability to integrate high-performance, permanent-magnet (magnetically ... efficiency necessary for certain applications. To enable the development of high-performance magnetic actuator technologies, the original research plan ... developed permanent-magnet materials in more complex microfabrication process flows. Objective 2: Design, model, and optimize a novel multi-magnet

  1. Processing bulk natural wood into a high-performance structural material

    Treesearch

    Jianwei Song; Chaoji Chen; Shuze Zhu; Mingwei Zhu; Jiaqi Dai; Upamanyu Ray; Yiju Li; Yudi Kuang; Yongfeng Li; Nelson Quispe; Yonggang Yao; Amy Gong; Ulrich H. Leiste; Hugh A. Bruck; J. Y. Zhu; Azhar Vellore; Heng Li; Marilyn L. Minus; Zheng Jia; Ashlie Martini; Teng Li; Liangbing Hu

    2018-01-01

    Synthetic structural materials with exceptional mechanical performance suffer from either large weight and adverse environmental impact (for example, steels and alloys) or complex manufacturing processes and thus high cost (for example, polymer-based and biomimetic composites)1–8. Natural wood is a low-cost and abundant material and has been used...

  2. Extreme-scale Algorithms and Solver Resilience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dongarra, Jack

    A widening gap exists between the peak performance of high-performance computers and the performance achieved by complex applications running on these platforms. Over the next decade, extreme-scale systems will present major new challenges to algorithm development that could amplify this mismatch in such a way that it prevents the productive use of future DOE Leadership computers, due to the following: extreme levels of parallelism due to multicore processors; an increase in system fault rates requiring algorithms to be resilient beyond just checkpoint/restart; complex memory hierarchies and costly data movement in both energy and performance; heterogeneous system architectures (mixing CPUs, GPUs, etc.); and conflicting goals of performance, resilience, and power requirements.

  3. In Vitro/In Vivo Evaluation of Radiolabeled [(99m)Tc(CO)3](+)-Hydroxyurea and Fluorescein Isothiocyanate-Hydroxyurea.

    PubMed

    Yilmaz, Baris; Teksoz, Serap; Kilcar, Ayfer Yurt; Ucar, Eser; Ichedef, Cigdem; Medine, Emin Ilker; Ari, Kadir

    2016-02-01

    The aim of the current study is to examine hydroxyurea (HU), an antineoplastic drug used for the treatment of leukemia, sickle-cell disease, HIV, psoriasis, thrombocythemia, and various neoplastic diseases, in two aspects. The active ingredient hydroxyurea was obtained by purification of the capsule-form drug, commercially named HYDREA. Then, [(99m)Tc(CO)3]+ core radiolabeling of HU was performed as the first aspect. Quality control studies of the (99m)Tc(CO)3-HU complex were performed by thin-layer radiochromatography and high-performance liquid radiochromatography methods. The results demonstrated that the radiolabeling yield was quite high (98.43% ± 2.29%). The (99m)Tc(CO)3-HU complex also showed good stability over a 24-hour period. The biological behavior of the (99m)Tc(CO)3-HU complex was evaluated by biodistribution studies on Wistar Albino rats. Fluorescein isothiocyanate (FITC) labeling of HU was performed as the second aspect. Fluorometric evaluation of binding efficacy and fluorescence imaging studies on MCF7 and HeLa cell lines were carried out. It was thought that the knowledge gained in this study would contribute to using the (99m)Tc(CO)3-HU complex as an imaging agent that selectively inhibits DNA synthesis by inhibiting the ribonucleotide reductase enzyme. It was observed that FITC-HU showed noteworthy incorporation in both cell lines.

  4. Task versus relationship conflict, team performance, and team member satisfaction: a meta-analysis.

    PubMed

    De Dreu, Carsten K W; Weingart, Laurie R

    2003-08-01

    This study provides a meta-analysis of research on the associations between relationship conflict, task conflict, team performance, and team member satisfaction. Consistent with past theorizing, results revealed strong and negative correlations between relationship conflict, team performance, and team member satisfaction. In contrast to what has been suggested in both academic research and introductory textbooks, however, results also revealed strong and negative (instead of the predicted positive) correlations between task conflict, team performance, and team member satisfaction. As predicted, conflict had stronger negative relations with team performance in highly complex (decision making, project, mixed) than in less complex (production) tasks. Finally, task conflict was less negatively related to team performance when task conflict and relationship conflict were weakly, rather than strongly, correlated.

  5. Novel nano-semiconductor film layer supported nano-Pd Complex Nanostructured Catalyst Pd/Ⓕ-MeOx/AC for High Efficient Selective Hydrogenation of Phenol to Cyclohexanone.

    PubMed

    Si, Jiaqi; Ouyang, Wenbing; Zhang, Yanji; Xu, Wentao; Zhou, Jicheng

    2017-04-28

    Supported metal catalysts are the most widely used type of heterogeneous catalyst in industrial processes. High dispersion of the metal particles of a supported catalyst is a key factor in determining the performance of such catalysts. Here we report a novel catalyst, Pd/Ⓕ-MeOx/AC, with a complex nanostructure of Pd nanoparticles supported on a plate-like nano-semiconductor film/activated carbon, prepared by a photocatalytic reduction method, which exhibited highly efficient catalytic performance for the selective hydrogenation of phenol to cyclohexanone. Conversion of phenol reached more than 99% within 2 h at 70 °C with a low mole ratio (0.5%) of the active Pd component to phenol. The synergistic effect of the metal nanoparticles and the nano-semiconductor support layer, together with the greatly increased contact interface between nano-metal and nano-semiconductor, may be responsible for the high efficiency. This work provides a clear demonstration that complex nanostructured catalysts with nano-metal and a nano-semiconductor film layer supported on high-specific-surface AC can yield enhanced catalytic activity and afford a promising approach for developing new supported catalysts.

  6. In situ coating nickel organic complexes on free-standing nickel wire films for volumetric-energy-dense supercapacitors.

    PubMed

    Hong, Min; Xu, Shusheng; Yao, Lu; Zhou, Chao; Hu, Nantao; Yang, Zhi; Hu, Jing; Zhang, Liying; Zhou, Zhihua; Wei, Hao; Zhang, Yafei

    2018-07-06

    A free-standing core-sheath structured hybrid membrane electrode based on nickel and nickel-based metal-organic complexes (Ni@Ni-OC) was designed and constructed for high-volumetric-performance supercapacitors. The self-standing Ni@Ni-OC film electrode had a high volumetric specific capacity of 1225.5 C cm-3 at 0.3 A cm-3 and an excellent rate capability. Moreover, when paired with a graphene-carbon nanotube (G-CNT) film electrode, the as-assembled Ni@Ni-OC//G-CNT hybrid supercapacitor device delivered an extraordinary volumetric capacitance of 85 F cm-3 at 0.5 A cm-3 and an outstanding energy density of 33.8 at 483 mW cm-3. Furthermore, the hybrid supercapacitor showed no capacitance loss after 10 000 cycles at 2 A cm-3, indicating its excellent cycle stability. These fascinating performances can be ascribed to its unique core-sheath structure, in which high-capacity nanoporous nickel-based metal-organic complexes (Ni-OC) are coated in situ on highly conductive Ni wires. The impressive results presented here may pave the way to constructing self-standing membrane electrodes for applications in high-volumetric-performance energy storage.

  7. Forward osmosis for oily wastewater reclamation: Multi-charged oxalic acid complexes as draw solutes.

    PubMed

    Ge, Qingchun; Amy, Gary Lee; Chung, Tai-Shung

    2017-10-01

    Forward osmosis (FO) has demonstrated its merits in hybrid FO seawater desalination. However, FO may have potential for other applications if suitable draw solutes are available. In this study, a series of novel draw solutes based on oxalic acid (OA)-transition metal complexes is presented. Factors influencing FO performance have been systematically investigated by varying the transition metals and the cations of the complex draw solutes as well as the experimental conditions. Compared to NaCl and other recently synthesized draw solutes, the OA complexes show superior FO performance, with high water fluxes of up to 27.5 and 89.1 LMH under the respective FO and PRO (pressure retarded osmosis) modes, both with negligible reverse solute fluxes. The features of octahedral geometry, abundant hydrophilic groups and ionic species are crucial for the OA complexes to serve as appropriate draw solutes with satisfactory FO performance. Among the synthesized OA complexes, the ammonium salt of the chromic complex (NH4-Cr-OA) outperforms the others due to the presence of more ionic species in its complex system. NH4-Cr-OA also performs better than the typical NaCl draw solute in FO oily wastewater treatment, with higher water recovery and negligible reverse fluxes. Dilute solutions of the OA complexes have been reconcentrated through membrane distillation (MD) and reused in new rounds of FO processes. The OA complexes have demonstrated their suitability and superiority as a novel class of draw solutes for the FO process in this study. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. A Systematic Approach for Obtaining Performance on Matrix-Like Operations

    NASA Astrophysics Data System (ADS)

    Veras, Richard Michael

    Scientific Computation plays a critical role in the scientific process because it allows us to ask complex queries and test predictions that would otherwise be infeasible to perform experimentally. Because of its power, Scientific Computing has helped drive advances in many fields, ranging from Engineering and Physics to Biology and Sociology to Economics and Drug Development and even to Machine Learning and Artificial Intelligence. Common among these domains is the desire for timely computational results, so a considerable amount of human expert effort is spent on obtaining performance for these scientific codes. However, this is no easy task because each of these domains presents its own unique set of challenges to software developers, such as domain-specific operations, structurally complex data and ever-growing datasets. Compounding these problems are the myriad constantly changing, complex and unique hardware platforms that an expert must target. Unfortunately, an expert is typically forced to reproduce their effort across multiple problem domains and hardware platforms. In this thesis, we demonstrate the automatic generation of expert-level high-performance scientific codes for Dense Linear Algebra (DLA), Structured Mesh (Stencil), Sparse Linear Algebra and Graph Analytics. In particular, this thesis seeks to address the issue of obtaining performance on many complex platforms for a certain class of matrix-like operations that span many scientific, engineering and social fields. We do this by automating a method used for obtaining high performance in DLA and extending it to structured, sparse and scale-free domains. We argue that it is the use of the underlying structure found in the data from these domains that enables this process. Thus, obtaining performance for most operations does not occur in isolation from the data being operated on, but instead depends significantly on the structure of the data.
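
    As a side note, the kind of matrix-like operation this work targets can be illustrated with a minimal sketch: a cache-blocked dense matrix multiply in Python/NumPy. The block size and matrix sizes below are illustrative assumptions, not tuned values or code from the thesis.

      # Minimal sketch of a cache-blocked dense matrix multiply, the prototypical
      # "matrix-like" DLA operation discussed above. The block size BS is an
      # illustrative assumption, not a tuned value from the thesis.
      import numpy as np

      def blocked_matmul(A, B, BS=64):
          """Compute C = A @ B by looping over BS x BS tiles."""
          m, k = A.shape
          k2, n = B.shape
          assert k == k2, "inner dimensions must match"
          C = np.zeros((m, n), dtype=A.dtype)
          for i in range(0, m, BS):
              for j in range(0, n, BS):
                  for p in range(0, k, BS):
                      # Accumulate the contribution of one tile pair.
                      C[i:i+BS, j:j+BS] += A[i:i+BS, p:p+BS] @ B[p:p+BS, j:j+BS]
          return C

      if __name__ == "__main__":
          A = np.random.rand(256, 256)
          B = np.random.rand(256, 256)
          assert np.allclose(blocked_matmul(A, B), A @ B)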

  9. 42 CFR 493.859 - Standard; ABO group and D (Rho) typing.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Standard; ABO group and D (Rho) typing. 493.859 Section 493.859 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN... for Laboratories Performing Tests of Moderate Complexity (including the Subcategory), High Complexity...

  10. Conceptualizing and Assessing Higher-Order Thinking in Reading

    ERIC Educational Resources Information Center

    Afflerbach, Peter; Cho, Byeong-Young; Kim, Jong-Yun

    2015-01-01

    Students engage in higher-order thinking as they read complex texts and perform complex reading-related tasks. However, the most consequential assessments, high-stakes tests, are currently limited in providing information about students' higher-order thinking. In this article, we describe higher-order thinking in relation to reading. We provide a…

  11. GEANT4 and Secondary Particle Production

    NASA Technical Reports Server (NTRS)

    Patterson, Jeff

    2004-01-01

    GEANT4 is a Monte Carlo tool set developed by the High Energy Physics community (CERN, SLAC, etc.) to perform simulations of complex particle detectors. GEANT4 is the ideal tool to study radiation transport and should be applied to space environments and the complex geometries of modern-day spacecraft.

  12. Evaluating Multi-Input/Multi-Output Digital Control Systems

    NASA Technical Reports Server (NTRS)

    Pototzky, Anthony S.; Wieseman, Carol D.; Hoadley, Sherwood T.; Mukhopadhyay, Vivek

    1994-01-01

    Controller-performance-evaluation (CPE) methodology for multi-input/multi-output (MIMO) digital control systems developed. Procedures identify potentially destabilizing controllers and confirm satisfactory performance of stabilizing ones. Methodology generic and used in many types of multi-loop digital-controller applications, including digital flight-control systems, digitally controlled spacecraft structures, and actively controlled wind-tunnel models. Also applicable to other complex, highly dynamic digital controllers, such as those in high-performance robot systems.

  13. Efficient estimation of ideal-observer performance in classification tasks involving high-dimensional complex backgrounds

    PubMed Central

    Park, Subok; Clarkson, Eric

    2010-01-01

    The Bayesian ideal observer is optimal among all observers and sets an absolute upper bound for the performance of any observer in classification tasks [Van Trees, Detection, Estimation, and Modulation Theory, Part I (Academic, 1968)]. Therefore, the ideal observer should be used for objective image quality assessment whenever possible. However, computation of ideal-observer performance is difficult in practice because this observer requires the full description of unknown, statistical properties of the high-dimensional, complex data arising in real-life problems. Previously, Markov-chain Monte Carlo (MCMC) methods were developed by Kupinski et al. [J. Opt. Soc. Am. A 20, 430 (2003)] and by Park et al. [J. Opt. Soc. Am. A 24, B136 (2007) and IEEE Trans. Med. Imaging 28, 657 (2009)] to estimate the performance of the ideal observer and the channelized ideal observer (CIO), respectively, in classification tasks involving non-Gaussian random backgrounds. However, both algorithms had the disadvantage of long computation times. We propose a fast MCMC for real-time estimation of the likelihood ratio for the CIO. Our simulation results show that our method has the potential to speed up the estimation of ideal-observer performance in tasks involving complex data when efficient channels are used for the CIO. PMID:19884916
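
    The abstract does not reproduce the proposed fast MCMC itself; as background, the sketch below shows the generic Metropolis-Hastings machinery that such likelihood-ratio estimators build on, with a toy Gaussian target and an assumed proposal width.

      # Generic Metropolis-Hastings sampler; the estimators cited above build on
      # this kind of MCMC machinery. The Gaussian toy target and the proposal
      # step size are illustrative assumptions only.
      import numpy as np

      def metropolis_hastings(log_target, x0, n_samples, step=0.5, rng=None):
          rng = np.random.default_rng() if rng is None else rng
          x = np.asarray(x0, dtype=float)
          samples = []
          log_p = log_target(x)
          for _ in range(n_samples):
              proposal = x + step * rng.standard_normal(x.shape)
              log_p_new = log_target(proposal)
              if np.log(rng.random()) < log_p_new - log_p:   # accept/reject step
                  x, log_p = proposal, log_p_new
              samples.append(x.copy())
          return np.array(samples)

      if __name__ == "__main__":
          # Toy target: standard normal in 2 dimensions.
          log_target = lambda x: -0.5 * np.sum(x**2)
          chain = metropolis_hastings(log_target, x0=np.zeros(2), n_samples=5000)
          print("posterior mean estimate:", chain[1000:].mean(axis=0))  # discard burn-in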

  14. Low complexity 1D IDCT for 16-bit parallel architectures

    NASA Astrophysics Data System (ADS)

    Bivolarski, Lazar

    2007-09-01

    This paper shows that, using the Loeffler, Ligtenberg, and Moschytz factorization of the 8-point one-dimensional (1-D) IDCT [2] as a fast approximation of the Discrete Cosine Transform (DCT) and using only 16-bit numbers, it is possible to create an IEEE 1180-1990 compliant, multiplierless algorithm with low computational complexity. Owing to its structure, this algorithm can be implemented efficiently on parallel high-performance architectures, and its low complexity makes it suitable for a wide range of other architectures. An additional constraint on this work was the requirement of compliance with the existing MPEG standards. Hardware implementation complexity and low resource usage were also part of the design criteria for this algorithm. The implementation is also compliant with the precision requirements described in the MPEG IDCT precision specification ISO/IEC 23002-1. Complexity analysis is performed as an extension to the simple measure of shifts and adds for the multiplierless algorithm: additional operations are included in the complexity measure to better describe the actual transform implementation complexity.
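
    The shift-and-add complexity measure mentioned above can be illustrated with a small sketch (this is not the Loeffler/Ligtenberg/Moschytz factorization itself): multiplying by a fixed-point constant using only shifts and adds while counting both operation types.

      # Illustration of the "multiplierless" idea: multiplying by a fixed constant
      # using only shifts and adds. This is not the IDCT factorization itself,
      # just the arithmetic building block whose shift/add count the complexity
      # analysis refers to.

      def mul_by_constant_shift_add(x, c):
          """Multiply integer x by constant c using shifts and adds only.

          Returns (product, number_of_adds, number_of_shifts)."""
          result, adds, shifts = 0, 0, 0
          bit = 0
          while c >> bit:
              if (c >> bit) & 1:
                  result += x << bit          # one shift + one add per set bit
                  shifts += 1
                  adds += 1
              bit += 1
          return result, adds, shifts

      if __name__ == "__main__":
          # 181/256 approximates 1/sqrt(2), a constant typical of fast fixed-point DCT/IDCTs.
          x = 1000
          product, adds, shifts = mul_by_constant_shift_add(x, 181)
          assert product == x * 181
          print(product >> 8, "adds:", adds, "shifts:", shifts)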

  15. Comparison of Laminar and Linear Eddy Model Closures for Combustion Instability Simulations

    DTIC Science & Technology

    2015-07-01

    Unstable liquid rocket engines can produce highly complex dynamic flowfields with features such as rapid changes in temperature and ... In the present study, the linear eddy model (LEM) is applied to an unstable single-element liquid rocket engine to assess its performance and applicability.

  16. PyGirl: Generating Whole-System VMs from High-Level Prototypes Using PyPy

    NASA Astrophysics Data System (ADS)

    Bruni, Camillo; Verwaest, Toon

    Virtual machines (VMs) emulating hardware devices are generally implemented in low-level languages for performance reasons. This results in unmaintainable systems that are difficult to understand. In this paper we report on our experience using the PyPy toolchain to improve the portability and reduce the complexity of whole-system VM implementations. As a case study we implement a VM prototype for a Nintendo Game Boy, called PyGirl, in which the high-level model is separated from low-level VM implementation issues. We shed light on the process of refactoring from a low-level VM implementation in Java to a high-level model in RPython. We show that our whole-system VM written with PyPy is significantly less complex than standard implementations, without substantial loss in performance.
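
    A whole-system VM of this kind is, at its core, a fetch-decode-execute loop. The sketch below is a hypothetical, greatly simplified high-level model written in Python; it is not the actual PyGirl code and does not use the Game Boy instruction set.

      # Minimal sketch of a high-level fetch-decode-execute loop, the kind of model
      # a whole-system VM keeps separate from low-level implementation concerns.
      # The opcodes and CPU state here are hypothetical illustrations.
      class TinyCPU:
          def __init__(self, program):
              self.memory = list(program)
              self.pc = 0          # program counter
              self.acc = 0         # accumulator register
              self.running = True

          def step(self):
              opcode = self.memory[self.pc]          # fetch
              self.pc += 1
              if opcode == 0x01:                     # LOAD immediate
                  self.acc = self.memory[self.pc]
                  self.pc += 1
              elif opcode == 0x02:                   # ADD immediate (8-bit wraparound)
                  self.acc = (self.acc + self.memory[self.pc]) & 0xFF
                  self.pc += 1
              elif opcode == 0xFF:                   # HALT
                  self.running = False
              else:
                  raise ValueError("unknown opcode %#x" % opcode)

          def run(self):
              while self.running:
                  self.step()
              return self.acc

      if __name__ == "__main__":
          # LOAD 40; ADD 2; HALT  ->  accumulator ends at 42
          print(TinyCPU([0x01, 40, 0x02, 2, 0xFF]).run())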

  17. Automated mask and wafer defect classification using a novel method for generalized CD variation measurements

    NASA Astrophysics Data System (ADS)

    Verechagin, V.; Kris, R.; Schwarzband, I.; Milstein, A.; Cohen, B.; Shkalim, A.; Levy, S.; Price, D.; Bal, E.

    2018-03-01

    Over the years, mask and wafer defect dispositioning has become an increasingly challenging and time-consuming task. With design rules getting smaller, OPC getting more complex and scanner illumination taking on free-form shapes, the probability that a user can perform accurate and repeatable classification of defects detected by mask inspection tools into pass/fail bins is decreasing. The critical challenges of mask defect metrology for small nodes (<30 nm) were reviewed in [1]. While Critical Dimension (CD) variation measurement is still the method of choice for determining a mask defect's future impact on wafer, the high complexity of OPCs combined with high variability in pattern shapes poses a challenge for any automated CD variation measurement method. In this study, a novel approach for measurement generalization is presented. CD variation assessment performance is evaluated on multiple different complex-shape patterns and is benchmarked against an existing qualified measurement methodology.

  18. Gymnastic judges benefit from their own motor experience as gymnasts.

    PubMed

    Pizzera, Alexandra

    2012-12-01

    Gymnastic judges have the difficult task of evaluating highly complex skills. My purpose in the current study was to examine evidence that judges use their sensorimotor experiences to enhance their perceptual judgments. In a video test, 58 judges rated 31 gymnasts performing a balance beam skill. I compared decision quality between judges who could perform the skill themselves on the balance beam (specific motor experience = SME) and those who could not. Those with SME showed better performance than those without SME. These data suggest that judges use their personal experiences as information to accurately assess complex gymnastic skills.

  19. SU-E-T-766: Treatment Planning Comparison Study On Two Different Multileaf Collimators Delivered with Volumetric Modulated Arc Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, R; Xiaomei, F; Bai, W

    2015-06-15

    Purpose: To compare and evaluate the performance of two different multileaf collimators (MLCi2 and Agility) delivered with volumetric modulated arc therapy techniques. Methods: Treatment plans were graded into four levels (Low, Moderate, Moderate-High and High complexity) according to complexity. This included 1 Low complexity (brain metastasis), 2 Moderate complexity (lung and liver), 1 Moderate-High complexity (prostate) and 1 High complexity (head and neck) case. A total dose of 60 Gy was given for all plans. For each case, two VMAT plans were designed, one with MLCi2 (group A) and the other with Agility (group B). All plans were created on an Elekta VMAT system with the Monaco treatment planning system and were generated with 6 MV X-rays for both plan A and plan B. Plans were evaluated on their ability to meet the dose volume histogram, radiation conformity index, estimated radiation delivery time, dose homogeneity index (HI) and monitor units (MU) needed to deliver the prescribed dose. Results: Plans of group B achieved the best HI (HI = 1.05 vs. 1.06) for the Low complexity cases, while plans of group A were slightly better for the High complexity cases (HI = 1.12 vs. 1.14). VMAT plan delivery was faster with Agility than with MLCi2 as plan complexity increased (Low complexity: 52 s vs. 52 s; Moderate complexity: 58 s vs. 55 s; Moderate-High complexity: 171 s vs. 152 s; High complexity: 326 s vs. 202 s); for the most complex cases, delivery time could be decreased by 38%. No significant differences were observed between the group B and group A plans in terms of healthy tissue mean dose and MU. Both plans respected the planning objectives for all organs at risk. Conclusion: The study concludes that VMAT plans with the novel Agility MLC can significantly decrease delivery time for high complexity cases, while a slight compromise in the dose homogeneity index should be noted. This work was supported by The Medical Science Foundation of The Health Department of Hebei Province (No. 20130253).

  20. Thermal performance of complex fenestration systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carpenter, S.C.; Elmahdy, A.H.

    1994-12-31

    The thermal performance (i.e., U-factor) of four complex fenestration systems is examined using computer simulation tools and guarded hot box testing. The systems include a flat glazed skylight, a domed or bubble skylight, a greenhouse window, and a curtain wall. The extra care required in performing simulation and testing of these complex products is described. There was good agreement (within 10%) between test and simulation for two of the four products. The agreement was slightly poorer (maximum difference of 16%) for the two high-heat-transfer products: the domed skylight and the greenhouse window. Possible causes for the larger discrepancy in these projecting window products are uncertainties in the inside and outside film coefficients and lower warm-side air temperatures because of stagnant airflow.

  1. Performance Engineering Research Institute SciDAC-2 Enabling Technologies Institute Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lucas, Robert

    2013-04-20

    Enhancing the performance of SciDAC applications on petascale systems had high priority within DOE SC at the start of the second phase of the SciDAC program, SciDAC-2, as it continues to do so today. Achieving expected levels of performance on high-end computing (HEC) systems is growing ever more challenging due to enormous scale, increasing architectural complexity, and increasing application complexity. To address these challenges, the University of Southern California's Information Sciences Institute organized the Performance Engineering Research Institute (PERI). PERI implemented a unified, tripartite research plan encompassing: (1) performance modeling and prediction; (2) automatic performance tuning; and (3) performance engineering of high-profile applications. Within PERI, USC's primary research activity was automatic tuning (autotuning) of scientific software. This activity was spurred by the strong user preference for automatic tools and was based on previous successful activities such as ATLAS, which automatically tuned components of the LAPACK linear algebra library, and other recent work on autotuning domain-specific libraries. Our other major component was application engagement, to which we devoted approximately 30% of our effort to work directly with SciDAC-2 applications. This report is a summary of the overall results of the USC PERI effort.
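
    The autotuning idea described above can be sketched in a few lines: empirically time a tunable kernel over a small set of candidate configurations and keep the fastest. The toy kernel and candidate block sizes below are illustrative assumptions, not PERI or ATLAS code.

      # Minimal sketch of empirical autotuning in the spirit described above:
      # time one tunable kernel over candidate block sizes and keep the fastest.
      import time
      import numpy as np

      def blocked_sum(a, block):
          """Toy tunable kernel: sum an array in chunks of `block` elements."""
          total = 0.0
          for i in range(0, a.size, block):
              total += a[i:i+block].sum()
          return total

      def autotune(kernel, arg, candidates, repeats=3):
          best_cfg, best_time = None, float("inf")
          for cfg in candidates:
              t0 = time.perf_counter()
              for _ in range(repeats):
                  kernel(arg, cfg)
              elapsed = (time.perf_counter() - t0) / repeats
              if elapsed < best_time:
                  best_cfg, best_time = cfg, elapsed
          return best_cfg, best_time

      if __name__ == "__main__":
          data = np.random.rand(1_000_000)
          cfg, t = autotune(blocked_sum, data, candidates=[256, 1024, 4096, 16384])
          print("best block size:", cfg, "avg time (s):", round(t, 6))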

  2. Diverse, high-quality test set for the validation of protein-ligand docking performance.

    PubMed

    Hartshorn, Michael J; Verdonk, Marcel L; Chessari, Gianni; Brewerton, Suzanne C; Mooij, Wijnand T M; Mortenson, Paul N; Murray, Christopher W

    2007-02-22

    A procedure for analyzing and classifying publicly available crystal structures has been developed. It has been used to identify high-resolution protein-ligand complexes that can be assessed by reconstructing the electron density for the ligand using the deposited structure factors. The complexes have been clustered according to the protein sequences, and clusters have been discarded if they do not represent proteins thought to be of direct interest to the pharmaceutical or agrochemical industry. Rules have been used to exclude complexes containing non-drug-like ligands. One complex from each cluster has been selected where a structure of sufficient quality was available. The final Astex diverse set contains 85 diverse, relevant protein-ligand complexes, which have been prepared in a format suitable for docking and are to be made freely available to the entire research community (http://www.ccdc.cam.ac.uk). The performance of the docking program GOLD against the new set is assessed using a variety of protocols. Relatively unbiased protocols give success rates of approximately 80% for redocking into native structures, but it is possible to get success rates of over 90% with some protocols.

  3. Improved methods of performing coherent optical correlation

    NASA Technical Reports Server (NTRS)

    Husain-Abidi, A. S.

    1972-01-01

    Coherent optical correlators are described in which complex spatial filters are recorded by a quasi-Fourier transform method. The high-pass spatial filtering effects (due to the dynamic range of photographic films) normally encountered in Vander Lugt type complex filters are not present in this system. Experimental results for both transmissive and reflective objects are presented. Experiments are also performed by illuminating the object with diffused light. A correlator using paraboloidal mirror segments as the Fourier-transforming element is also described.

  4. Queueing Network Models for Parallel Processing of Task Systems: an Operational Approach

    NASA Technical Reports Server (NTRS)

    Mak, Victor W. K.

    1986-01-01

    Computer performance modeling of possibly complex computations running on highly concurrent systems is considered. Earlier works in this area either dealt with a very simple program structure or resulted in methods with exponential complexity. An efficient procedure is developed to compute the performance measures for series-parallel-reducible task systems using queueing network models. The procedure is based on the concept of hierarchical decomposition and a new operational approach. Numerical results for three test cases are presented and compared to those of simulations.
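
    For readers unfamiliar with the operational approach, the sketch below illustrates two of the operational laws such analyses build on (the Utilization Law and Little's Law); it is background arithmetic with hypothetical measurements, not the paper's series-parallel reduction procedure.

      # Small illustration of the operational laws that operational queueing
      # analysis builds on (Utilization Law and Little's Law). This is background
      # arithmetic only, not the series-parallel reduction procedure of the paper.

      def utilization(completions, busy_time, observation_time):
          throughput = completions / observation_time        # X = C / T
          service_demand = busy_time / completions           # S = B / C
          return throughput * service_demand                 # U = X * S (= B / T)

      def mean_population(throughput, mean_response_time):
          return throughput * mean_response_time             # Little's Law: N = X * R

      if __name__ == "__main__":
          # Hypothetical measurements: 500 jobs completed in 100 s, device busy 60 s.
          U = utilization(completions=500, busy_time=60.0, observation_time=100.0)
          N = mean_population(throughput=5.0, mean_response_time=0.4)
          print("utilization:", U, "mean jobs in system:", N)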

  5. Jungle Computing: Distributed Supercomputing Beyond Clusters, Grids, and Clouds

    NASA Astrophysics Data System (ADS)

    Seinstra, Frank J.; Maassen, Jason; van Nieuwpoort, Rob V.; Drost, Niels; van Kessel, Timo; van Werkhoven, Ben; Urbani, Jacopo; Jacobs, Ceriel; Kielmann, Thilo; Bal, Henri E.

    In recent years, the application of high-performance and distributed computing in scientific practice has become increasingly widespread. Among the most widely available platforms to scientists are clusters, grids, and cloud systems. Such infrastructures are currently undergoing revolutionary change due to the integration of many-core technologies, providing orders-of-magnitude speed improvements for selected compute kernels. With high-performance and distributed computing systems thus becoming more heterogeneous and hierarchical, programming complexity is vastly increased. Further complexities arise because the urgent desire for scalability and issues including data distribution, software heterogeneity, and ad hoc hardware availability commonly force scientists into simultaneous use of multiple platforms (e.g., clusters, grids, and clouds used concurrently). A true computing jungle.

  6. High throughput computing: a solution for scientific analysis

    USGS Publications Warehouse

    O'Donnell, M.

    2011-01-01

    handle job failures due to hardware, software, or network interruptions (obviating the need to manually resubmit the job after each stoppage); be affordable; and most importantly, allow us to complete very large, complex analyses that otherwise would not even be possible. In short, we envisioned a job-management system that would take advantage of unused FORT CPUs within a local area network (LAN) to effectively distribute and run highly complex analytical processes. What we found was a solution that uses High Throughput Computing (HTC) and High Performance Computing (HPC) systems to do exactly that (Figure 1).

  7. Large aspheric optics for high-power, high-energy laser

    NASA Astrophysics Data System (ADS)

    Geyl, Roland; Houbre, Francois

    2001-12-01

    SAGEM, within its REOSC high performance optics product line, has developed over the years specific knowledge in large plano, spherical and aspherical optics for high energy or high power lasers. This paper aims to illustrate the application of aspheric optics to such laser applications with several examples of increasing optical surface complexity.

  8. Fair performance comparison of different carbon blacks in lithium-sulfur batteries with practical mass loadings - Simple design competes with complex cathode architecture

    NASA Astrophysics Data System (ADS)

    Jozwiuk, Anna; Sommer, Heino; Janek, Jürgen; Brezesinski, Torsten

    2015-11-01

    The lithium-sulfur system is one of the most promising next generation battery systems, as elemental sulfur is cheap, abundant and has a high theoretical specific capacity. Although much research is conducted on complex sulfur/carbon composites and architectures, it is difficult to compare the performance of the cathodes to one another. Factors such as electrolyte composition and cell components strongly affect the cyclability of the battery. Here, we show the importance of optimizing "standard" conditions to allow for a fair performance comparison of different carbon blacks. Our optimal electrolyte-to-sulfur ratio is 11 μL mgsulfur-1, and high concentrations of LiNO3 (>0.6 M) are needed because nitrate is consumed continuously during cycling. Utilizing these standard conditions, we tested the cycling behavior of four types of cathodes with individual carbon blacks having different specific surface areas, namely Printex-A, Super C65, Printex XE-2 and Ketjenblack EC-600JD. Both the specific capacity and polysulfide adsorption capability clearly correlate with the surface area of the carbon being used. High specific capacities (>1000 mAh gsulfur-1 at C/5) are achieved with high surface area carbons. We also demonstrate that a simple cathode using Ketjenblack EC-600JD as the conductive matrix material can well compete with those having complex architectures or additives.

  9. Artificial Intelligence Applications to High-Technology Training.

    ERIC Educational Resources Information Center

    Dede, Christopher

    1987-01-01

    Discusses the use of artificial intelligence to improve occupational instruction in complex subjects with high performance goals, such as those required for high-technology jobs. Highlights include intelligent computer assisted instruction, examples in space technology training, intelligent simulation environments, and the need for adult training…

  10. Cost and Ecological Feasibility of using UHPC in Highway Bridges

    DOT National Transportation Integrated Search

    2017-11-15

    There is a growing interest in expanding the use of Ultra-high performance concrete (UHPC) from bridge deck joints for accelerated bridge construction to complex architectural and advanced structural applications. The high costs currently associated ...

  11. System-Level Radiation Hardening

    NASA Technical Reports Server (NTRS)

    Ladbury, Ray

    2014-01-01

    Although system-level radiation hardening can enable the use of high-performance components and enhance the capabilities of a spacecraft, hardening techniques can be costly and can compromise the very performance designers sought from the high-performance components. Moreover, such techniques often result in a complicated design, especially if several complex commercial microcircuits are used, each posing its own hardening challenges. The latter risk is particularly acute for Commercial-Off-The-Shelf components since high-performance parts (e.g. double-data-rate synchronous dynamic random access memories - DDR SDRAMs) may require other high-performance commercial parts (e.g. processors) to support their operation. For these reasons, it is essential that system-level radiation hardening be a coordinated effort, from setting requirements through testing up to and including validation.

  12. High speed rail as a complex sociotechnical system.

    DOT National Transportation Integrated Search

    2015-10-01

    The primary goal of this research is to study the impact of Amtrak's performance in the Northeast Corridor (NEC). This report focuses on Amtrak's Acela and Regional passengers, as well as the travel time performance of these services in the last ...

  13. Reference Proteome Extracts for Mass Spec Instrument Performance Validation and Method Development

    PubMed Central

    Rosenblatt, Mike; Urh, Marjeta; Saveliev, Sergei

    2014-01-01

    Biological samples of high complexity are required to test protein mass spec sample preparation procedures and validate mass spec instrument performance. Total cell protein extracts provide the needed sample complexity. However, to be compatible with mass spec applications, such extracts should meet a number of design requirements: (1) compatibility with LC/MS (free of detergents, etc.); (2) high protein integrity (minimal level of protein degradation and non-biological PTMs); (3) compatibility with common sample preparation methods such as proteolysis, PTM enrichment and mass-tag labeling; and (4) lot-to-lot reproducibility. Here we describe total protein extracts from yeast and human cells that meet the above criteria. Two extract formats have been developed: intact protein extracts, with primary use for sample preparation method development and optimization, and pre-digested extracts (peptides), with primary use for instrument validation and performance monitoring.

  14. A deep learning approach for fetal QRS complex detection.

    PubMed

    Zhong, Wei; Liao, Lijuan; Guo, Xuemei; Wang, Guoli

    2018-04-20

    Non-invasive foetal electrocardiography (NI-FECG) has the potential to provide additional clinical information for detecting and diagnosing fetal diseases. We propose and demonstrate a deep learning approach for fetal QRS complex detection from raw NI-FECG signals using a convolutional neural network (CNN) model. The main objective is to investigate whether reliable fetal QRS complex detection performance can still be obtained from features of single-channel NI-FECG signals, without canceling maternal ECG (MECG) signals. A deep learning method is proposed for recognizing fetal QRS complexes. Firstly, we collect data from set-a of the PhysioNet/Computing in Cardiology Challenge database. The sample entropy method is used for signal quality assessment, and part of the poor-quality signals is excluded from further analysis. Secondly, in the proposed method, the features of raw NI-FECG signals are normalized before they are fed to a CNN classifier to perform fetal QRS complex detection. We use precision, recall, F-measure and accuracy as the evaluation metrics to assess the performance of fetal QRS complex detection. The proposed deep learning method achieves relatively high precision (75.33%), recall (80.54%), and F-measure scores (77.85%) compared with three other well-known pattern classification methods, namely KNN, naive Bayes and SVM. The proposed deep learning method can thus attain reliable fetal QRS complex detection performance from raw NI-FECG signals without canceling MECG signals. In addition, the influence of different activation functions and of the signal quality assessment step on classification performance is evaluated; results show that ReLU outperforms Sigmoid and Tanh on this particular task, and that better classification performance is obtained with the signal quality assessment step in this study.
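
    The abstract does not specify the CNN architecture; the sketch below is a minimal PyTorch 1-D CNN over fixed-length, normalized single-channel ECG windows in the spirit of the approach, with assumed window length, layer sizes and dummy data.

      # Minimal PyTorch sketch of a 1-D CNN classifier over fixed-length single-channel
      # ECG windows. The window length, layer sizes and training details are
      # assumptions; the paper's exact architecture is not given in the abstract.
      import torch
      import torch.nn as nn

      class QRSWindowClassifier(nn.Module):
          def __init__(self, window_len=200):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(2),
                  nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
              )
              self.classifier = nn.Linear(32 * (window_len // 4), 2)  # QRS vs. non-QRS

          def forward(self, x):                 # x: (batch, 1, window_len)
              h = self.features(x)
              return self.classifier(h.flatten(1))

      if __name__ == "__main__":
          model = QRSWindowClassifier()
          windows = torch.randn(8, 1, 200)      # 8 normalized ECG windows (dummy data)
          logits = model(windows)
          loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 2, (8,)))
          loss.backward()                       # one illustrative training step
          print(logits.shape, float(loss))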

  15. Effects of probiotic supplementation in different nutrient density diets on growth performance, nutrient digestibility, blood profiles, fecal microflora and noxious gas emission in weaning pig.

    PubMed

    Lan, Ruixia; Tran, Hoainam; Kim, Inho

    2017-03-01

    Probiotics can serve as alternatives to antibiotics to increase the performance of weaning pigs, and the intake of probiotics is affected by dietary nutrient density. The objective of this study was to evaluate the effects of a probiotic complex in different nutrient density diets on growth performance, digestibility, blood profiles, fecal microflora and noxious gas emission in weaning pigs. From day 22 to day 42, both high-nutrient-density and probiotic complex supplementation diets increased (P < 0.05) the average daily gain. On day 42, the apparent total tract digestibility (ATTD) of dry matter, nitrogen and gross energy (GE), blood urea nitrogen concentration and NH3 and H2S emissions were increased (P < 0.05) in pigs fed high-nutrient-density diets. Pigs fed probiotic complex supplementation diets had higher (P < 0.05) ATTD of GE than pigs fed non-supplemented diets. Fecal Lactobacillus counts were increased whereas Escherichia coli counts and NH3 and H2S emissions were decreased (P < 0.05) in pigs fed probiotic complex supplementation diets. Interactive effects on average daily feed intake (ADFI) were observed from day 22 to day 42 and overall, where the probiotic complex improved ADFI more dramatically in low-nutrient-density diets. The beneficial effects of probiotic complex (Bacillus coagulans, Bacillus licheniformis, Bacillus subtilis and Clostridium butyricum) supplementation on ADFI are more dramatic with low-nutrient-density diets. © 2016 Society of Chemical Industry.

  16. A complexity-scalable software-based MPEG-2 video encoder.

    PubMed

    Chen, Guo-bin; Lu, Xin-ning; Wang, Xing-guo; Liu, Ji-lin

    2004-05-01

    With the development of general-purpose processors (GPP) and video signal processing algorithms, it is possible to implement a software-based real-time video encoder on a GPP, and its low cost and easy upgrade attract developers' interest in transferring video encoding from specialized hardware to more flexible software. In this paper, the encoding structure is first set up to support complexity scalability; then high-performance algorithms are applied to the key time-consuming modules in the coding process; finally, at the programming level, processor characteristics are considered to improve data access efficiency and processing parallelism. Other programming methods, such as lookup tables, are adopted to reduce the computational complexity. Simulation results showed that these ideas could not only improve the global performance of video coding, but also provide great flexibility in complexity regulation.
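
    The lookup-table idea mentioned above can be shown in a few lines: replace a per-sample computation (here, clamping reconstructed pixel values to 0..255) with a precomputed table. The table range is an illustrative assumption, not the encoder's actual table.

      # Tiny illustration of the lookup-table idea: replace per-sample clamping
      # arithmetic with a precomputed table indexed by the raw value.
      OFFSET = 512
      CLIP_TABLE = [min(max(v - OFFSET, 0), 255) for v in range(2 * OFFSET + 256)]

      def clip_pixel(value):
          """Clamp a reconstructed pixel value using one table lookup."""
          return CLIP_TABLE[value + OFFSET]

      if __name__ == "__main__":
          for v in (-300, -1, 0, 128, 255, 300):
              assert clip_pixel(v) == min(max(v, 0), 255)
          print("lookup-table clamp matches direct arithmetic")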

  17. Complex Problem Solving in Teams: The Impact of Collective Orientation on Team Process Demands.

    PubMed

    Hagemann, Vera; Kluge, Annette

    2017-01-01

    Complex problem solving is challenging and a high-level cognitive process for individuals. When analyzing complex problem solving in teams, an additional, new dimension has to be considered, as teamwork processes increase the requirements already placed on individual team members. After introducing an idealized teamwork process model that complex problem solving teams pass through, integrating the relevant teamwork skills for interdependently working teams into the model, and combining it with the four kinds of team processes (transition, action, interpersonal, and learning processes), the paper demonstrates the importance of fulfilling team process demands for successful complex problem solving within teams. To this end, results from a controlled team study within complex situations are presented. The study focused on emergent states such as collective orientation, cohesion, and trust, which influence action processes such as coordination and dynamically enable effective teamwork in complex situations. Before conducting the experiments, participants were divided by median split into two-person teams with either high (n = 58) or low (n = 58) collective orientation values. The study was conducted with the microworld C3Fire, which simulates dynamic decision making and acting in complex situations within a teamwork context. The microworld includes interdependent tasks such as extinguishing forest fires or protecting houses. Two firefighting scenarios were developed, each taking a maximum of 15 min, and all teams worked on both scenarios. Coordination within the team and the resulting team performance were calculated based on a log-file analysis. The results show that no relationships exist between trust and action processes or team performance; likewise, no relationships were found for cohesion. Only the collective orientation of team members positively influences team performance in complex environments, mediated by action processes such as coordination within the team. The results are discussed in relation to previous empirical findings and to learning processes within the team, with a focus on feedback strategies.

  18. Complex Problem Solving in Teams: The Impact of Collective Orientation on Team Process Demands

    PubMed Central

    Hagemann, Vera; Kluge, Annette

    2017-01-01

    Complex problem solving is challenging and a high-level cognitive process for individuals. When analyzing complex problem solving in teams, an additional, new dimension has to be considered, as teamwork processes increase the requirements already placed on individual team members. After introducing an idealized teamwork process model that complex problem solving teams pass through, integrating the relevant teamwork skills for interdependently working teams into the model, and combining it with the four kinds of team processes (transition, action, interpersonal, and learning processes), the paper demonstrates the importance of fulfilling team process demands for successful complex problem solving within teams. To this end, results from a controlled team study within complex situations are presented. The study focused on emergent states such as collective orientation, cohesion, and trust, which influence action processes such as coordination and dynamically enable effective teamwork in complex situations. Before conducting the experiments, participants were divided by median split into two-person teams with either high (n = 58) or low (n = 58) collective orientation values. The study was conducted with the microworld C3Fire, which simulates dynamic decision making and acting in complex situations within a teamwork context. The microworld includes interdependent tasks such as extinguishing forest fires or protecting houses. Two firefighting scenarios were developed, each taking a maximum of 15 min, and all teams worked on both scenarios. Coordination within the team and the resulting team performance were calculated based on a log-file analysis. The results show that no relationships exist between trust and action processes or team performance; likewise, no relationships were found for cohesion. Only the collective orientation of team members positively influences team performance in complex environments, mediated by action processes such as coordination within the team. The results are discussed in relation to previous empirical findings and to learning processes within the team, with a focus on feedback strategies. PMID:29033886

  19. Field programmable gate array based fuzzy neural signal processing system for differential diagnosis of QRS complex tachycardia and tachyarrhythmia in noisy ECG signals.

    PubMed

    Chowdhury, Shubhajit Roy

    2012-04-01

    The paper reports a Field Programmable Gate Array (FPGA)-based embedded system for detection of the QRS complex in noisy electrocardiogram (ECG) signals and, thereafter, differential diagnosis of tachycardia and tachyarrhythmia. The QRS complex is detected by applying an entropy measure of fuzziness to build a detection function of the ECG signal, which has previously been filtered to remove power line interference and baseline wander. Using the detected QRS complexes, differential diagnosis of tachycardia and tachyarrhythmia is performed. The entire algorithm has been realized in hardware on an FPGA. Tested against the standard CSE ECG database, the algorithm performed highly effectively: a QRS detection sensitivity (Se) of 99.74% and an accuracy of 99.5% are achieved when tested using single-channel ECG with the entropy criteria. The performance of the QRS detection system has been compared with and found to be better than most of the QRS detection systems available in the literature. Using the system, 200 patients have been diagnosed with an accuracy of 98.5%.
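
    The paper's entropy measure of fuzziness is not detailed in the abstract; as a stand-in, the sketch below builds a simple windowed-energy detection function over a dummy filtered ECG and thresholds it for QRS candidates, conveying only the overall detect-then-threshold structure, not the paper's actual measure.

      # Generic sketch of building a detection function over a filtered ECG and
      # thresholding it for QRS candidates. The windowed-energy measure is an
      # illustrative stand-in for the paper's entropy measure of fuzziness.
      import numpy as np

      def detection_function(ecg, window=25):
          ecg = np.asarray(ecg, dtype=float)
          det = np.zeros(ecg.size)
          for i in range(ecg.size - window):
              det[i + window // 2] = np.sum(ecg[i:i + window] ** 2)   # windowed energy
          return det

      def pick_qrs_candidates(det, threshold, refractory=50):
          peaks, last = [], -refractory
          for i in range(1, det.size - 1):
              if det[i] > threshold and det[i] >= det[i - 1] and det[i] >= det[i + 1]:
                  if i - last >= refractory:        # skip peaks too close to the previous one
                      peaks.append(i)
                      last = i
          return peaks

      if __name__ == "__main__":
          ecg = 0.05 * np.random.randn(2500)        # 10 s of dummy signal at 250 Hz
          ecg[::250] += 1.0                          # fake R peaks once per second
          det = detection_function(ecg)
          print("candidate beats:", len(pick_qrs_candidates(det, threshold=det.mean())))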

  20. Filtering Gene Ontology semantic similarity for identifying protein complexes in large protein interaction networks.

    PubMed

    Wang, Jian; Xie, Dong; Lin, Hongfei; Yang, Zhihao; Zhang, Yijia

    2012-06-21

    Many biological processes depend on protein complexes, and various computational approaches have been developed to identify complexes from protein-protein interaction (PPI) networks. However, the high false-positive rate of PPIs makes identification challenging. A protein semantic similarity measure is proposed in this study, based on the ontology structure of Gene Ontology (GO) terms and GO annotations, to estimate the reliability of interactions in PPI networks. Interaction pairs with low GO semantic similarity are removed from the network as unreliable interactions. Then, a cluster-expanding algorithm is used to detect complexes with core-attachment structure on the filtered network. Our method was applied to three different yeast PPI networks, and its effectiveness was examined on two benchmark complex datasets. Experimental results show that our method performed better than other state-of-the-art approaches on most evaluation metrics. The method detects protein complexes from large-scale PPI networks by filtering on GO semantic similarity; removing interactions with low GO similarity significantly improves the performance of complex identification. The expanding strategy is also effective in identifying attachment proteins of complexes.
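
    The filtering step itself is straightforward to sketch: drop PPI edges whose GO-based semantic similarity falls below a threshold before clustering. The similarity values, the 0.4 cutoff and the example protein pairs below are illustrative assumptions, not the measure, threshold or data used in the paper.

      # Minimal sketch of the filtering step described above: keep only PPI edges
      # whose GO-based semantic similarity is at or above a threshold.
      def filter_ppi_network(edges, similarity, threshold=0.4):
          """edges: iterable of (protein_a, protein_b); similarity: dict keyed by frozenset."""
          kept = []
          for a, b in edges:
              if similarity.get(frozenset((a, b)), 0.0) >= threshold:
                  kept.append((a, b))          # treat low-similarity pairs as unreliable
          return kept

      if __name__ == "__main__":
          # Hypothetical yeast protein pairs with made-up similarity scores.
          edges = [("YAL001C", "YBR123W"), ("YAL001C", "YGR218C"), ("YBR123W", "YGR218C")]
          sim = {frozenset(e): s for e, s in zip(edges, (0.82, 0.15, 0.55))}
          print(filter_ppi_network(edges, sim))   # keeps the two high-similarity edges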

  1. Crystal structures and catalytic performance of three new methoxy substituted salen type nickel(II) Schiff base complexes derived from meso-1,2-diphenyl-1,2-ethylenediamine

    NASA Astrophysics Data System (ADS)

    Ghaffari, Abolfazl; Behzad, Mahdi; Pooyan, Mahsa; Amiri Rudbari, Hadi; Bruno, Giuseppe

    2014-04-01

    Three new nickel(II) complexes of a series of methoxy-substituted salen-type Schiff base ligands were synthesized and characterized by IR, UV-Vis and 1H NMR spectroscopy and elemental analysis. The ligands were synthesized from the condensation of meso-1,2-diphenyl-1,2-ethylenediamine with n-methoxysalicylaldehyde (n = 3, 4 and 5). Crystal structures of these complexes were determined. The electrochemical behavior of the complexes was studied by means of cyclic voltammetry in DMSO solutions. The catalytic performance of the complexes was studied in the epoxidation of cyclooctene using tert-butylhydroperoxide (TBHP) as the oxidant under various conditions to find the optimum operating parameters. Low catalytic activity with moderate epoxide selectivity was observed under in-solvent conditions, but under solvent-free conditions enhanced catalytic activity with high epoxide selectivity was achieved.

  2. Synthetic vision display evaluation studies

    NASA Technical Reports Server (NTRS)

    Regal, David M.; Whittington, David H.

    1994-01-01

    The goal of this research was to help us understand the display requirements for a synthetic vision system for the High Speed Civil Transport (HSCT). Four experiments were conducted to examine the effects of different levels of perceptual cue complexity in displays used by pilots in a flare and landing task. Increased levels of texture mapping of terrain and runway produced mixed results, including harder but shorter landings and a lower flare initiation altitude. Under higher workload conditions, increased texture resulted in an improvement in performance. An increase in familiar size cues did not result in improved performance. Only a small difference was found between displays using two patterns of high resolution texture mapping. The effects of increased perceptual cue complexity on performance were not as strong as would be predicted from the pilot's subjective reports or from related literature. A description of the role of a synthetic vision system in the High Speed Civil Transport is provided along with a literature review covering applied research related to perceptual cue usage in aircraft displays.

  3. Navigating the Turbulent Waters of School Reform Guided by Complexity Theory

    ERIC Educational Resources Information Center

    White, David G.; Levin, James A.

    2016-01-01

    The goal of this research study has been to develop, implement, and evaluate a school reform design experiment at a continuation high school with low-income, low-performing underrepresented minority students. The complexity sciences served as a theoretical framework for this design experiment. Treating an innovative college preparatory program as…

  4. EEG Patterns Related to Cognitive Tasks of Varying Complexity.

    ERIC Educational Resources Information Center

    Dunn, Denise A.; And Others

    A study was conducted that attempted to show changes in electroencephalographic (EEG) patterns (identified using topographic EEG mapping) when children were required to perform the relatively simple task of button pressing during an eyes-open baseline session of low cognitive demand and a complex reaction time (RT) task of high cognitive demand.…

  5. Olympic Sports Coaching Education: An International Coach's Perspective

    ERIC Educational Resources Information Center

    Kiosoglous, Cameron

    2016-01-01

    The profession of high performance sports coaching is a complex process focused on performance improvement with the goal of producing international sporting success. Rising demand for top-level coaches has been matched with the increasing amount of resources allocated to producing world-class performances. This includes creating and sustaining a…

  6. Iridium Complexes with Proton-Responsive Azole-Type Ligands as Effective Catalysts for CO2 Hydrogenation.

    PubMed

    Suna, Yuki; Himeda, Yuichiro; Fujita, Etsuko; Muckerman, James T; Ertem, Mehmed Z

    2017-11-23

    Pentamethylcyclopentadienyl iridium (Cp*Ir) complexes with bidentate ligands consisting of a pyridine ring and an electron-rich diazole ring were prepared. Their catalytic activity toward CO2 hydrogenation in 2.0 m KHCO3 aqueous solutions (pH 8.5) at 50 °C under 1.0 MPa CO2/H2 (1:1) is reported as an alternative to photo- and electrochemical CO2 reduction. Bidentate ligands incorporating an electron-rich diazole ring improved the catalytic performance of the Ir complexes compared to the bipyridine ligand. Complexes 2, 4, and 6, possessing both a hydroxy group and an uncoordinated NH group, which are proton-responsive and capable of generating pendent bases in basic media, recorded high initial turnover frequency values of 1300, 1550, and 2000 h-1, respectively. Spectroscopic and computational investigations revealed that the reversible deprotonation changes the electronic properties of the complexes and causes interactions between the pendent base and substrate and/or solvent water molecules, resulting in high catalytic performance in basic media. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. MemAxes: Visualization and Analytics for Characterizing Complex Memory Performance Behaviors.

    PubMed

    Gimenez, Alfredo; Gamblin, Todd; Jusufi, Ilir; Bhatele, Abhinav; Schulz, Martin; Bremer, Peer-Timo; Hamann, Bernd

    2018-07-01

    Memory performance is often a major bottleneck for high-performance computing (HPC) applications. Deepening memory hierarchies, complex memory management, and non-uniform access times have made memory performance behavior difficult to characterize, and users require novel, sophisticated tools to analyze and optimize this aspect of their codes. Existing tools target only specific factors of memory performance, such as hardware layout, allocations, or access instructions. However, today's tools do not suffice to characterize the complex relationships between these factors. Further, they require advanced expertise to be used effectively. We present MemAxes, a tool based on a novel approach for analytic-driven visualization of memory performance data. MemAxes uniquely allows users to analyze the different aspects related to memory performance by providing multiple visual contexts for a centralized dataset. We define mappings of sampled memory access data to new and existing visual metaphors, each of which enables a user to perform different analysis tasks. We present methods to guide user interaction by scoring subsets of the data based on known performance problems. This scoring is used to provide visual cues and automatically extract clusters of interest. We designed MemAxes in collaboration with experts in HPC and demonstrate its effectiveness in case studies.

  8. A methodology towards virtualisation-based high performance simulation platform supporting multidisciplinary design of complex products

    NASA Astrophysics Data System (ADS)

    Ren, Lei; Zhang, Lin; Tao, Fei; (Luke) Zhang, Xiaolong; Luo, Yongliang; Zhang, Yabin

    2012-08-01

    Multidisciplinary design of complex products leads to an increasing demand for high performance simulation (HPS) platforms. One great challenge is how to achieve highly efficient utilisation of large-scale simulation resources in distributed and heterogeneous environments. This article reports a virtualisation-based methodology to realise an HPS platform. This research is driven by issues concerning large-scale simulation resource deployment and complex simulation environment construction, efficient and transparent utilisation of fine-grained simulation resources, and highly reliable simulation with fault tolerance. A framework of a virtualisation-based simulation platform (VSIM) is first proposed. Then the article investigates and discusses key approaches in VSIM, including simulation resource modelling, a method for automatically deploying simulation resources for dynamic construction of the system environment, and a live migration mechanism in case of faults in run-time simulation. Furthermore, the proposed methodology is applied to a multidisciplinary design system for aircraft virtual prototyping and some experiments are conducted. The experimental results show that the proposed methodology can (1) significantly improve the utilisation of fine-grained simulation resources, (2) result in a great reduction in deployment time and an increased flexibility for simulation environment construction and (3) achieve fault-tolerant simulation.

  9. Overexpression of MutSα Complex Proteins Predicts Poor Prognosis in Oral Squamous Cell Carcinoma.

    PubMed

    Wagner, Vivian Petersen; Webber, Liana Preto; Salvadori, Gabriela; Meurer, Luise; Fonseca, Felipe Paiva; Castilho, Rogério Moraes; Squarize, Cristiane Helena; Vargas, Pablo Agustin; Martins, Manoela Domingues

    2016-05-01

    The DNA mismatch repair (MMR) system is responsible for the detection and correction of errors created during DNA replication, thereby avoiding the incorporation of mutations in dividing cells. The prognostic value of alterations in the MMR system has not previously been analyzed in oral squamous cell carcinoma (OSCC). The study comprised 115 cases of OSCC diagnosed between 1996 and 2010. The specimens collected were constructed into tissue microarray blocks. Immunohistochemical staining for the MutSα complex proteins hMSH2 and hMSH6 was performed. The slides were subsequently scanned into high-resolution images, and nuclear staining of hMSH2 and hMSH6 was analyzed using the Nuclear V9 algorithm. Univariable and multivariable Cox proportional hazard regression models were used to evaluate the prognostic value of hMSH2 and hMSH6 in OSCC. All cases in the present cohort were positive for hMSH2 and hMSH6, and a direct correlation was found between the expression of the two proteins (P < 0.05). The mean number of positive cells for hMSH2 and hMSH6 was 64.44 ± 15.21 and 31.46 ± 22.38, respectively; these values were used as cutoff points to determine high protein expression. Cases with high expression of both proteins simultaneously were classified as having high MutSα complex expression. In the multivariable analysis, high expression of the MutSα complex was an independent prognostic factor for poor overall survival (hazard ratio: 2.75, P = 0.02). This study provides a first insight into the prognostic value of alterations in the MMR system in OSCC. We found that the MutSα complex may constitute a molecular marker for poor prognosis of OSCC.
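
    As an illustration of the survival analysis used conceptually here, the sketch below fits a Cox proportional hazards model with the Python lifelines library on hypothetical per-patient data (a single binary covariate for simplicity); the study's actual software, covariates and data are not given in the abstract.

      # Minimal sketch of fitting a Cox proportional hazards model for overall
      # survival. The lifelines library and the column names are illustrative
      # choices; the data below are hypothetical, not from the study.
      import pandas as pd
      from lifelines import CoxPHFitter

      # Hypothetical per-patient data: follow-up months, death event, and a binary
      # indicator for high MutS-alpha (hMSH2 and hMSH6 both above the cutoff).
      df = pd.DataFrame({
          "followup_months": [12, 48, 36, 7, 60, 25, 18, 55],
          "death":           [1,  0,  1,  1, 1,  0,  0,  1],
          "high_mutsalpha":  [1,  0,  1,  1, 0,  0,  1,  0],
      })

      cph = CoxPHFitter()
      cph.fit(df, duration_col="followup_months", event_col="death")
      cph.print_summary()   # hazard ratio for high_mutsalpha with confidence interval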

  10. 42 CFR 493.1471 - Standard: Cytology general supervisor responsibilities.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard: Cytology general supervisor... Nonwaived Testing Laboratories Performing High Complexity Testing § 493.1471 Standard: Cytology general supervisor responsibilities. The technical supervisor of cytology may perform the duties of the cytology...

  11. 42 CFR 493.1471 - Standard: Cytology general supervisor responsibilities.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 5 2014-10-01 2014-10-01 false Standard: Cytology general supervisor... Nonwaived Testing Laboratories Performing High Complexity Testing § 493.1471 Standard: Cytology general supervisor responsibilities. The technical supervisor of cytology may perform the duties of the cytology...

  12. 42 CFR 493.1471 - Standard: Cytology general supervisor responsibilities.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 5 2013-10-01 2013-10-01 false Standard: Cytology general supervisor... Nonwaived Testing Laboratories Performing High Complexity Testing § 493.1471 Standard: Cytology general supervisor responsibilities. The technical supervisor of cytology may perform the duties of the cytology...

  13. 42 CFR 493.1471 - Standard: Cytology general supervisor responsibilities.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Standard: Cytology general supervisor... Nonwaived Testing Laboratories Performing High Complexity Testing § 493.1471 Standard: Cytology general supervisor responsibilities. The technical supervisor of cytology may perform the duties of the cytology...

  14. 42 CFR 493.1471 - Standard: Cytology general supervisor responsibilities.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 5 2012-10-01 2012-10-01 false Standard: Cytology general supervisor... Nonwaived Testing Laboratories Performing High Complexity Testing § 493.1471 Standard: Cytology general supervisor responsibilities. The technical supervisor of cytology may perform the duties of the cytology...

  15. 42 CFR 493.1449 - Standard; Technical supervisor qualifications.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... service in which the laboratory performs high complexity tests or procedures. The director of a laboratory... may perform anatomic and clinical laboratory procedures and tests in all specialties and... tests in the subspecialty of bacteriology, the individual functioning as the technical supervisor must...

  16. Aggregate absorption in HMA mixtures.

    DOT National Transportation Integrated Search

    2013-12-01

    Designing hot mix asphalt (HMA) that will perform for many years is a complex balancing act of selecting an : appropriate design asphalt binder content that is sufficiently high to provide durability but not so high as to lead : to rutting problems. ...

  17. Mining Very High Resolution INSAR Data Based On Complex-GMRF Cues And Relevance Feedback

    NASA Astrophysics Data System (ADS)

    Singh, Jagmal; Popescu, Anca; Soccorsi, Matteo; Datcu, Mihai

    2012-01-01

    With the increase in the number of remote sensing satellites, the number of image-data scenes in our repositories is also increasing, and a large quantity of these scenes are never retrieved and used. Thus, automatic retrieval of desired image data using query by image content, to fully utilize the huge repository volume, is becoming of great interest. Generally, different users are interested in scenes containing different kinds of objects and structures, so it is important to analyze all the image information mining (IIM) methods so that it is easier for a user to select a method depending upon his/her requirements. We concentrate our study only on high-resolution SAR images, and we propose to use InSAR observations instead of single look complex (SLC) images alone for mining scenes containing coherent objects such as high-rise buildings. However, in the case of objects with less coherence, such as areas with vegetation cover, SLC images exhibit better performance. We demonstrate an IIM performance comparison using complex-Gauss Markov Random Fields as the texture descriptor for image patches and SVM relevance feedback.
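
    The relevance-feedback loop can be sketched as follows: train an SVM on the few patches a user has labeled and rank the remaining patches by predicted relevance for the next round. The random placeholder features (standing in for complex-GMRF texture parameters), the labels and the patch counts below are illustrative assumptions.

      # Minimal sketch of an SVM relevance-feedback round over patch texture features.
      import numpy as np
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      features = rng.normal(size=(200, 16))        # one texture descriptor per image patch

      # User marks a few patches as relevant (1) or irrelevant (0) in the first round.
      labeled_idx = np.array([0, 3, 7, 11, 20, 33, 47, 59])
      labels = np.array([1, 1, 0, 0, 1, 0, 1, 0])

      clf = SVC(kernel="rbf", probability=True)
      clf.fit(features[labeled_idx], labels)

      # Rank the remaining patches by predicted relevance and return the top hits,
      # which the user would inspect and label in the next feedback round.
      unlabeled = np.setdiff1d(np.arange(len(features)), labeled_idx)
      scores = clf.predict_proba(features[unlabeled])[:, 1]
      top = unlabeled[np.argsort(scores)[::-1][:10]]
      print("patches to show the user next:", top.tolist())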

  18. High-performance liquid chromatography/high-resolution multiple stage tandem mass spectrometry using negative-ion-mode hydroxide-doped electrospray ionization for the characterization of lignin degradation products.

    PubMed

    Owen, Benjamin C; Haupert, Laura J; Jarrell, Tiffany M; Marcum, Christopher L; Parsell, Trenton H; Abu-Omar, Mahdi M; Bozell, Joseph J; Black, Stuart K; Kenttämaa, Hilkka I

    2012-07-17

    In the search for a replacement for fossil fuel and the valuable chemicals currently obtained from crude oil, lignocellulosic biomass has become a promising candidate as an alternative biorenewable source for crude oil. Hence, many research efforts focus on the extraction, degradation, and catalytic transformation of lignin, hemicellulose, and cellulose. Unfortunately, these processes result in the production of very complex mixtures. Further, while methods have been developed for the analysis of mixtures of oligosaccharides, this is not true for the complex mixtures generated upon degradation of lignin. For example, high-performance liquid chromatography/multiple stage tandem mass spectrometry (HPLC/MS(n)), a tool proven to be invaluable in the analysis of complex mixtures derived from many other biopolymers, such as proteins and DNA, has not been implemented for lignin degradation products. In this study, we have developed an HPLC separation method for lignin degradation products that is amenable to negative-ion-mode electrospray ionization (ESI doped with NaOH), the best method identified thus far for ionization of lignin-related model compounds without fragmentation. The separated and ionized compounds are then analyzed by MS(3) experiments to obtain detailed structural information while simultaneously performing high-resolution measurements to determine their elemental compositions in the two parts of a commercial linear quadrupole ion trap/Fourier-transform ion cyclotron resonance mass spectrometer. A lignin degradation product mixture was analyzed using this method, and molecular structures were proposed for some components. This methodology significantly improves the ability to analyze complex product mixtures that result from degraded lignin.

  19. Enhancing the Photovoltaic Performance of Perovskite Solar Cells with a Down-Conversion Eu-Complex.

    PubMed

    Jiang, Ling; Chen, Wangchao; Zheng, Jiawei; Zhu, Liangzheng; Mo, Li'e; Li, Zhaoqian; Hu, Linhua; Hayat, Tasawar; Alsaedi, Ahmed; Zhang, Changneng; Dai, Songyuan

    2017-08-16

    Organometal halide perovskite solar cells (PSCs) have shown high photovoltaic performance but poor utilization of ultraviolet (UV) irradiation. Lanthanide complexes have a wide absorption range in the UV region and they can down-convert the absorbed UV light into visible light, which provides a possibility for PSCs to utilize UV light for higher photocurrent, efficiency, and stability. In this study, we use a transparent luminescent down-converting layer (LDL) of Eu-4,7-diphenyl-1,10-phenanthroline (Eu-complex) to improve the light utilization efficiency of PSCs. Compared with the uncoated PSC, the PSC coated with Eu-complex LDL on the reverse of the fluorine-doped tin oxide glass displayed an enhancement of 11.8% in short-circuit current density (Jsc) and 15.3% in efficiency due to the Eu-complex LDL re-emitting UV light (300-380 nm) in the visible range. These results indicate that the Eu-complex LDL both enhances the power conversion efficiency and reduces UV degradation of PSCs.

  20. [Evaluation of the clinical efficacy of a new composition of tizol with triamcinolon in complex treatment of patients with erosive ulcerous form of lichen planus of the oral mucosa].

    PubMed

    Ron', G I; Akmalova, G M; Emel'yanova, I V

    2015-01-01

    Among the causal and pathogenetic options, local conservative treatment is the most significant primary stage of complex therapy for oral lichen planus (OLP). The aim of the study was to evaluate the clinical efficacy of local use of the new TIZOL-with-triamcinolone composition in the complex therapy of erosive-ulcerous forms of OLP of the oral mucosa. The study included 47 patients aged 24 to 70 years with the erosive-ulcerous form of OLP, whose diagnosis was confirmed histologically. The first group comprised 25 patients whose complex treatment included local application of the TIZOL-with-triamcinolone composition. The second group comprised 22 patients whose complex treatment included local application of 0.5% prednisone ointment. Topical TIZOL with triamcinolone proved highly effective in the complex therapy of erosive-ulcerous forms of OLP, as confirmed by positive clinical dynamics in all patients (100%), high patient self-assessment (84% positive ratings), and a shorter time to complete epithelialization of erosions.

  1. The Structure of Working Memory Abilities across the Adult Life Span

    PubMed Central

    Hale, Sandra; Rose, Nathan S.; Myerson, Joel; Strube, Michael J; Sommers, Mitchell; Tye-Murray, Nancy; Spehar, Brent

    2010-01-01

    The present study addresses three questions regarding age differences in working memory: (1) whether performance on complex span tasks decreases as a function of age at a faster rate than performance on simple span tasks; (2) whether spatial working memory decreases at a faster rate than verbal working memory; and (3) whether the structure of working memory abilities is different for different age groups. Adults, ages 20–89 (n=388), performed three simple and three complex verbal span tasks and three simple and three complex spatial memory tasks. Performance on the spatial tasks decreased at faster rates as a function of age than performance on the verbal tasks, but within each domain, performance on complex and simple span tasks decreased at the same rates. Confirmatory factor analyses revealed that domain-differentiated models yielded better fits than models involving domain-general constructs, providing further evidence of the need to distinguish verbal and spatial working memory abilities. Regardless of which domain-differentiated model was examined, and despite the faster rates of decrease in the spatial domain, age group comparisons revealed that the factor structure of working memory abilities was highly similar in younger and older adults and showed no evidence of age-related dedifferentiation. PMID:21299306

  2. Team Modelling: Literature Review

    DTIC Science & Technology

    2006-08-01

    elucidating this complex topic. The report concludes that four areas of future team research are particularly germane to the needs of the Canadian Forces... provide a shield for team members that are not performing at high levels (Bowers, Pharmer and Salas, 2000). Some researchers have suggested that... mid level, performance is high. Other research has explored the relationship between team turnover and the accumulation of knowledge within teams

  3. Method 447.0 - Determination of Chlorophylls a and b and Identification of Other Pigments of Interest in Marine and Freshwater Algae Using High Performance Liquid Chromatography with Visible Wavelength Detection

    EPA Science Inventory

    This method provides a procedure for determination of chlorophylls a (chl a) and b (chl b) found in marine and freshwater phytoplankton. Reversed phase high performance liquid chromatography (HPLC) with detection at 440 nm is used to separate the pigments from a complex pigment ...

  4. Costs and benefits of integrating information between the cerebral hemispheres: a computational perspective.

    PubMed

    Belger, A; Banich, M T

    1998-07-01

    Because interaction of the cerebral hemispheres has been found to aid task performance under demanding conditions, the present study examined how this effect is moderated by computational complexity, the degree of lateralization for a task, and individual differences in asymmetric hemispheric activation (AHA). Computational complexity was manipulated across tasks either by increasing the number of inputs to be processed or by increasing the number of steps to a decision. Comparison of within- and across-hemisphere trials indicated that the size of the between-hemisphere advantage increased as a function of task complexity, except for a highly lateralized rhyme decision task that can only be performed by the left hemisphere. Measures of individual differences in AHA revealed that when task demands and an individual's AHA both load on the same hemisphere, the ability to divide the processing between the hemispheres is limited. Thus, interhemispheric division of processing improves performance at higher levels of computational complexity only when the required operations can be divided between the hemispheres.

  5. A low complexity, low spur digital IF conversion circuit for high-fidelity GNSS signal playback

    NASA Astrophysics Data System (ADS)

    Su, Fei; Ying, Rendong

    2016-01-01

    A low-complexity, high-efficiency, and low-spur digital intermediate frequency (IF) conversion circuit is discussed in this paper. The circuit is a key element in a high-fidelity GNSS signal playback instrument. We analyze the spur performance of a finite-state-machine (FSM) based numerically controlled oscillator (NCO); by optimizing the control algorithm, an FSM-based NCO with 3 quantization stages achieves 65 dB SFDR up to the seventh harmonic. Compared with a traditional lookup-table-based NCO design with the same spurious-free dynamic range (SFDR) performance, the logic resources required to implement the NCO are reduced to one third. The proposed design method can be extended to IF conversion systems requiring good SFDR over higher harmonic components by increasing the number of quantization stages.
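
    A minimal phase-accumulator NCO sketch in Python illustrates the general principle behind such circuits; the paper's FSM-based architecture and its 3-stage quantization are not reproduced here, and the table size, phase width, and GPS-like sample rates are purely illustrative assumptions.

```python
# Illustrative phase-accumulator NCO with phase truncation and a quantized sine
# table, plus a rough SFDR estimate. Not the FSM-based design from the paper.
import numpy as np

def nco(f_out, f_clk, n_samples, phase_bits=32, table_bits=10):
    """Generate an IF carrier with a phase accumulator and a coarse sine table."""
    fcw = int(round(f_out / f_clk * 2**phase_bits))          # frequency control word
    acc = np.cumsum(np.full(n_samples, fcw, dtype=np.uint64)) % 2**phase_bits
    index = (acc >> (phase_bits - table_bits)).astype(int)    # phase truncation
    table = np.sin(2 * np.pi * np.arange(2**table_bits) / 2**table_bits)
    return table[index]

def sfdr_db(x):
    """Estimate SFDR as the gap between the carrier and the largest spur (dB)."""
    spec = np.abs(np.fft.rfft(x * np.hanning(len(x))))
    carrier = spec.argmax()
    spur_mask = np.ones_like(spec, dtype=bool)
    spur_mask[max(0, carrier - 2):carrier + 3] = False         # exclude carrier bins
    return 20 * np.log10(spec[carrier] / spec[spur_mask].max())

if __name__ == "__main__":
    # GPS-like IF and clock values are assumed for the demo only.
    y = nco(f_out=4.092e6, f_clk=16.368e6, n_samples=2**14)
    print(f"SFDR estimate: {sfdr_db(y):.1f} dB")
```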

  6. An Optimizing Compiler for Petascale I/O on Leadership Class Architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choudhary, Alok; Kandemir, Mahmut

    In high-performance computing systems, parallel I/O architectures usually have very complex hierarchies with multiple layers that collectively constitute an I/O stack, including high-level I/O libraries such as PnetCDF and HDF5, I/O middleware such as MPI-IO, and parallel file systems such as PVFS and Lustre. Our project explored automated instrumentation and compiler support for I/O-intensive applications. The project made significant progress towards understanding the complex I/O hierarchies of high-performance storage systems (including storage caches, HDDs, and SSDs), and towards designing and implementing state-of-the-art compiler/runtime system technology targeting I/O-intensive HPC applications that run on leadership-class machines. This final report summarizes the major achievements of the project and also points out promising future directions.

  7. Reactome graph database: Efficient access to complex pathway data

    PubMed Central

    Korninger, Florian; Viteri, Guilherme; Marin-Garcia, Pablo; Ping, Peipei; Wu, Guanming; Stein, Lincoln; D’Eustachio, Peter

    2018-01-01

    Reactome is a free, open-source, open-data, curated and peer-reviewed knowledgebase of biomolecular pathways. One of its main priorities is to provide easy and efficient access to its high quality curated data. At present, biological pathway databases typically store their contents in relational databases. This limits access efficiency because there are performance issues associated with queries traversing highly interconnected data. The same data in a graph database can be queried more efficiently. Here we present the rationale behind the adoption of a graph database (Neo4j) as well as the new ContentService (REST API) that provides access to these data. The Neo4j graph database and its query language, Cypher, provide efficient access to the complex Reactome data model, facilitating easy traversal and knowledge discovery. The adoption of this technology greatly improved query efficiency, reducing the average query time by 93%. The web service built on top of the graph database provides programmatic access to Reactome data by object oriented queries, but also supports more complex queries that take advantage of the new underlying graph-based data storage. By adopting graph database technology we are providing a high performance pathway data resource to the community. The Reactome graph database use case shows the power of NoSQL database engines for complex biological data types. PMID:29377902

  8. Reactome graph database: Efficient access to complex pathway data.

    PubMed

    Fabregat, Antonio; Korninger, Florian; Viteri, Guilherme; Sidiropoulos, Konstantinos; Marin-Garcia, Pablo; Ping, Peipei; Wu, Guanming; Stein, Lincoln; D'Eustachio, Peter; Hermjakob, Henning

    2018-01-01

    Reactome is a free, open-source, open-data, curated and peer-reviewed knowledgebase of biomolecular pathways. One of its main priorities is to provide easy and efficient access to its high quality curated data. At present, biological pathway databases typically store their contents in relational databases. This limits access efficiency because there are performance issues associated with queries traversing highly interconnected data. The same data in a graph database can be queried more efficiently. Here we present the rationale behind the adoption of a graph database (Neo4j) as well as the new ContentService (REST API) that provides access to these data. The Neo4j graph database and its query language, Cypher, provide efficient access to the complex Reactome data model, facilitating easy traversal and knowledge discovery. The adoption of this technology greatly improved query efficiency, reducing the average query time by 93%. The web service built on top of the graph database provides programmatic access to Reactome data by object oriented queries, but also supports more complex queries that take advantage of the new underlying graph-based data storage. By adopting graph database technology we are providing a high performance pathway data resource to the community. The Reactome graph database use case shows the power of NoSQL database engines for complex biological data types.
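
    The following sketch shows how such graph queries might look from Python against a local Reactome Neo4j instance using the official driver; the connection settings and the Pathway/hasEvent/ReactionLikeEvent/displayName schema names are assumptions for illustration and should be checked against the Reactome graph-database documentation.

```python
# Sketch of a Cypher query against a locally installed Reactome Neo4j database.
# URI, credentials, and schema names below are assumed, not taken from the paper.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

# Count reactions reachable from human pathways whose name contains a keyword.
cypher = """
MATCH (p:Pathway {speciesName: 'Homo sapiens'})-[:hasEvent*]->(r:ReactionLikeEvent)
WHERE p.displayName CONTAINS $name
RETURN p.displayName AS pathway, count(r) AS reactions
ORDER BY reactions DESC LIMIT 5
"""

with driver.session() as session:
    for record in session.run(cypher, name="Signaling"):
        print(record["pathway"], record["reactions"])

driver.close()
```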

  9. Advances in ultra-high performance liquid chromatography coupled to tandem mass spectrometry for sensitive detection of several food allergens in complex and processed foodstuffs.

    PubMed

    Planque, M; Arnould, T; Dieu, M; Delahaut, P; Renard, P; Gillard, N

    2016-09-16

    Sensitive detection of food allergens is affected by food processing and foodstuff complexity. It is therefore a challenge to detect cross-contamination in food production that could endanger an allergic customer's life. Here we used ultra-high performance liquid chromatography coupled to tandem mass spectrometry for simultaneous detection of traces of milk (casein, whey protein), egg (yolk, white), soybean, and peanut allergens in different complex and/or heat-processed foodstuffs. The method is based on a single protocol (extraction, trypsin digestion, and purification) applicable to the different tested foodstuffs: chocolate, ice cream, tomato sauce, and processed cookies. The determined limits of quantitation, expressed in total milk, egg, peanut, or soy proteins (and not soluble proteins) per kilogram of food, are: 0.5 mg/kg for milk (detection of caseins), 5 mg/kg for milk (detection of whey), 2.5 mg/kg for peanut, 5 mg/kg for soy, 3.4 mg/kg for egg (detection of egg white), and 30.8 mg/kg for egg (detection of egg yolk). The main advantage is the ability of the method to detect four major food allergens simultaneously in processed and complex matrices with very high sensitivity and specificity. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Enhanced electrical stability of nitrate ligand-based hexaaqua complexes solution-processed ultrathin a-IGZO transistors

    NASA Astrophysics Data System (ADS)

    Choi, C.; Baek, Y.; Lee, B. M.; Kim, K. H.; Rim, Y. S.

    2017-12-01

    We report solution-processed, amorphous indium-gallium-zinc-oxide-based (a-IGZO-based) thin-film transistors (TFTs). Our proposed solution-processed a-IGZO films, using a simple spin-coating method, were formed through nitrate ligand-based metal complexes, and they were annealed at low temperature (250 °C) to achieve high-quality oxide films and devices. We investigated solution-processed a-IGZO TFTs with various thicknesses, ranging from 4 to 16 nm. The 4 nm-thick TFT films had smooth morphology and high density, and they exhibited excellent performance, i.e. a high saturation mobility of 7.73 ± 0.44 cm2 V-1 s-1, a sub-threshold swing of 0.27 V dec-1, an on/off ratio of ~10^8, and a low threshold voltage of 3.10 ± 0.30 V. However, the performance of the TFTs degraded as the film thickness was increased. We further performed positive and negative bias stress tests to examine their electrical stability, and it was noted that the operating behavior of the devices was highly stable. Despite a small number of free charges, the high performance of the ultrathin a-IGZO TFTs was attributed to the small effect of the thickness of the channel, low bulk resistance, the quality of the a-IGZO/SiO2 interface, and high film density.

  11. Bacterial complexes of a high moor related to different elements of microrelief

    NASA Astrophysics Data System (ADS)

    Dobrovol'skaya, T. G.; Golovchenko, A. V.; Yakushev, A. V.; Yurchenko, E. N.; Manucharov, N. A.; Chernov, I. Yu.

    2017-04-01

    The analysis of bacterial complexes, including the number, taxonomic composition, physiological state, and proportion of ecological trophic groups was performed in a high moorland related to different elements of the microrelief. The abundance of bacteria, their ability for hydrolysis of polymers and the share of r-strategists were found to be higher in the sphagnum hillocks than on the flat surfaces. The total prokaryote biomass was 4 times greater in the sphagnum samples from microhighs (hillocks). On these elements of the microrelief, the density of actinomycetal mycelium was higher. Bacteria of the hydrolytic complex ( Cytophaga and Chitinophaga genera) were found only in microhigh samples.

  12. An enhanced data visualization method for diesel engine malfunction classification using multi-sensor signals.

    PubMed

    Li, Yiqing; Wang, Yu; Zi, Yanyang; Zhang, Mingquan

    2015-10-21

    The various multi-sensor signal features from a diesel engine constitute a complex high-dimensional dataset. The non-linear dimensionality reduction method, t-distributed stochastic neighbor embedding (t-SNE), provides an effective way to implement data visualization for complex high-dimensional data. However, irrelevant features can deteriorate the performance of data visualization, and thus, should be eliminated a priori. This paper proposes a feature subset score based t-SNE (FSS-t-SNE) data visualization method to deal with the high-dimensional data that are collected from multi-sensor signals. In this method, the optimal feature subset is constructed by a feature subset score criterion. Then the high-dimensional data are visualized in 2-dimension space. According to the UCI dataset test, FSS-t-SNE can effectively improve the classification accuracy. An experiment was performed with a large power marine diesel engine to validate the proposed method for diesel engine malfunction classification. Multi-sensor signals were collected by a cylinder vibration sensor and a cylinder pressure sensor. Compared with other conventional data visualization methods, the proposed method shows good visualization performance and high classification accuracy in multi-malfunction classification of a diesel engine.

  13. An Enhanced Data Visualization Method for Diesel Engine Malfunction Classification Using Multi-Sensor Signals

    PubMed Central

    Li, Yiqing; Wang, Yu; Zi, Yanyang; Zhang, Mingquan

    2015-01-01

    The various multi-sensor signal features from a diesel engine constitute a complex high-dimensional dataset. The non-linear dimensionality reduction method, t-distributed stochastic neighbor embedding (t-SNE), provides an effective way to implement data visualization for complex high-dimensional data. However, irrelevant features can deteriorate the performance of data visualization, and thus, should be eliminated a priori. This paper proposes a feature subset score based t-SNE (FSS-t-SNE) data visualization method to deal with the high-dimensional data that are collected from multi-sensor signals. In this method, the optimal feature subset is constructed by a feature subset score criterion. Then the high-dimensional data are visualized in 2-dimension space. According to the UCI dataset test, FSS-t-SNE can effectively improve the classification accuracy. An experiment was performed with a large power marine diesel engine to validate the proposed method for diesel engine malfunction classification. Multi-sensor signals were collected by a cylinder vibration sensor and a cylinder pressure sensor. Compared with other conventional data visualization methods, the proposed method shows good visualization performance and high classification accuracy in multi-malfunction classification of a diesel engine. PMID:26506347
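
    A rough sketch of the general workflow (score features, keep a subset, embed with t-SNE) is shown below; the actual feature-subset score criterion of FSS-t-SNE is not reproduced, and a standard ANOVA F-score plus a toy dataset stand in for the multi-sensor diesel-engine features.

```python
# Illustrative feature-subset selection followed by t-SNE visualization.
# The scoring criterion and dataset are stand-ins, not those used in the paper.
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.manifold import TSNE

X, y = load_iris(return_X_y=True)                     # stand-in for sensor features
X_subset = SelectKBest(f_classif, k=3).fit_transform(X, y)  # drop low-scoring features

embedding = TSNE(n_components=2, perplexity=30,
                 random_state=0).fit_transform(X_subset)

plt.scatter(embedding[:, 0], embedding[:, 1], c=y, s=15)
plt.title("t-SNE on a scored feature subset (illustrative)")
plt.show()
```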

  14. A performance improvement case study in aircraft maintenance and its implications for hazard identification.

    PubMed

    Ward, Marie; McDonald, Nick; Morrison, Rabea; Gaynor, Des; Nugent, Tony

    2010-02-01

    Aircraft maintenance is a highly regulated, safety critical, complex and competitive industry. There is a need to develop innovative solutions to address process efficiency without compromising safety and quality. This paper presents the case that in order to improve a highly complex system such as aircraft maintenance, it is necessary to develop a comprehensive and ecologically valid model of the operational system, which represents not just what is meant to happen, but what normally happens. This model then provides the backdrop against which to change or improve the system. A performance report, the Blocker Report, specific to aircraft maintenance and related to the model was developed gathering data on anything that 'blocks' task or check performance. A Blocker Resolution Process was designed to resolve blockers and improve the current check system. Significant results were obtained for the company in the first trial and implications for safety management systems and hazard identification are discussed. Statement of Relevance: Aircraft maintenance is a safety critical, complex, competitive industry with a need to develop innovative solutions to address process and safety efficiency. This research addresses this through the development of a comprehensive and ecologically valid model of the system linked with a performance reporting and resolution system.

  15. Student Performance along Axes of Scenario Novelty and Complexity in Introductory Biology: Lessons from a Unique Factorial Approach to Assessment.

    PubMed

    Deane-Coe, Kirsten K; Sarvary, Mark A; Owens, Thomas G

    2017-01-01

    In an undergraduate introductory biology laboratory course, we used a summative assessment to directly test the learning objective that students will be able to apply course material to increasingly novel and complex situations. Using a factorial framework, we developed multiple true-false questions to fall along axes of novelty and complexity, which resulted in four categories of questions: familiar content and low complexity (category A); novel content and low complexity (category B); familiar content and high complexity (category C); and novel content and high complexity (category D). On average, students scored more than 70% on all questions, indicating that the course largely met this learning objective. However, students scored highest on questions in category A, likely because they were most similar to course content, and lowest on questions in categories C and D. While we anticipated students would score equally on questions for which either novelty or complexity was altered (but not both), we observed that student scores in category C were lower than in category B. Furthermore, students performed equally poorly on all questions for which complexity was higher (categories C and D), even those containing familiar content, suggesting that application of course material to increasingly complex situations is particularly challenging to students. © 2017 K. K. Deane-Coe et al. CBE—Life Sciences Education © 2017 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  16. Photonic crystal fiber technology for high-performance all-fiber monolithic ultrafast fiber amplifiers

    NASA Astrophysics Data System (ADS)

    Papior, Sidsel R.; Weirich, Johannes; Johansen, Mette M.; Jakobsen, Christian; Michieletto, Mattia; Triches, Marco; Kristensen, Torben; Olesen, Anders S.; Petersen, Christian; Andersen, Thomas V.; Maack, Martin D.; Alkeskjold, Thomas T.

    2018-02-01

    Photonic crystal fiber (PCF) technology for ultrafast fiber amplifiers traditionally uses air holes as key elements for large mode area (LMA) fiber designs. These air holes are crucial for the performance of high-end LMA PCFs, but they make splicing and interfacing more complex. To reduce this complexity in mid-range amplifiers, we present single-mode polarization-maintaining Yb-doped LMA PCFs without air holes for easier splicing into monolithic all-fiber amplifier designs. A 30 μm core all-solid spliceable PCF is presented, and amplification of 1064 nm light above 50 W with an optical-to-optical efficiency of 80 % is demonstrated. Furthermore, to demonstrate the excellent reliability of PCF-based monolithic amplifiers, we present ultra-long-term performance data of > 35 khrs on a 14 μm core step-index type PCF amplifier with a low long-term power degradation slope of < 1.5 % / 10,000 h.

  17. Propulsion and Power Rapid Response R&D Support Delivery Order 0041: Power Dense Solid Oxide Fuel Cell Systems: High Performance, High Power Density Solid Oxide Fuel Cells - Materials and Load Control

    DTIC Science & Technology

    2008-12-01

    respectively. 2.3.1.2 Brushless DC Motor Brushless direct current (BLDC) motors feature high efficiency, ease of control, and astonishingly high power... modeling purposes, we ignore the modeling complexity of the BLDC controller and treat the motor and controller "as commutated", i.e. we assume the... High Performance, High Power Density Solid Oxide Fuel Cells - Materials and Load Control. Stephen W. Sofie, Steven R. Shaw, Peter A. Lindahl, and Lee H

  18. Adaptive Missile Flight Control for Complex Aerodynamic Phenomena

    DTIC Science & Technology

    2017-08-09

    at high maneuvering conditions motivate guidance approaches that can accommodate uncertainty. Flight control algorithms are one component... performance, but system uncertainty is not directly addressed. Linear, parameter-varying approaches for munitions expand on optimal control by... post-canard stall. We propose to model these complex aerodynamic mechanisms and use these models in formulating flight controllers within the

  19. Web Exclusive--Is the Sky the Limit to Educational Improvement?

    ERIC Educational Resources Information Center

    Schleicher, Andreas

    2012-01-01

    Today, education systems need to enable people to become lifelong learners, to manage complex ways of thinking and complex ways of working that computers can't take over easily. The task for educators and policy makers is to ensure that countries rise to this challenge. High performing education systems like Finland's and Singapore's tend to…

  20. On the structure and spin states of Fe(III)-EDDHA complexes.

    PubMed

    Gómez-Gallego, Mar; Fernández, Israel; Pellico, Daniel; Gutiérrez, Angel; Sierra, Miguel A; Lucena, Juan J

    2006-07-10

    DFT methods are suitable for predicting both the geometries and spin states of EDDHA-Fe(III) complexes. Thus, extensive DFT computational studies have shown that the racemic-Fe(III) EDDHA complex is more stable than the meso isomer, regardless of the spin state of the central iron atom. A comparison of the energy values obtained for the complexes under study has also shown that high-spin (S = 5/2) complexes are more stable than low-spin (S = 1/2) ones. These computational results matched the experimental results of the magnetic susceptibility values of both isomers. In both cases, their behavior has been fitted as being due to isolated high-spin Fe(III) in a distorted octahedral environment. The study of the correlation diagram also confirms the high-spin iron in complex 2b. The geometry optimization of these complexes performed with the standard 3-21G* basis set for hydrogen, carbon, oxygen, and nitrogen and the Hay-Wadt small-core effective core potential (ECP) including a double-zeta valence basis set for iron, followed by single-point energy refinement with the 6-31G* basis set, is suitable for predicting both the geometries and the spin states of EDDHA-Fe(III) complexes. The presence of a high-spin iron in Fe(III)-EDDHA complexes could be the key to understanding their lack of reactivity in electron-transfer processes, either chemically or electrochemically induced, and their resistance to photodegradation.

  1. Large-Area High-Performance Flexible Pressure Sensor with Carbon Nanotube Active Matrix for Electronic Skin.

    PubMed

    Nela, Luca; Tang, Jianshi; Cao, Qing; Tulevski, George; Han, Shu-Jen

    2018-03-14

    Artificial "electronic skin" is of great interest for mimicking the functionality of human skin, such as tactile pressure sensing. Several important performance metrics include mechanical flexibility, operation voltage, sensitivity, and accuracy, as well as response speed. In this Letter, we demonstrate a large-area high-performance flexible pressure sensor built on an active matrix of 16 × 16 carbon nanotube thin-film transistors (CNT TFTs). Made from highly purified solution tubes, the active matrix exhibits superior flexible TFT performance with high mobility and large current density, along with a high device yield of nearly 99% over 4 inch sample area. The fully integrated flexible pressure sensor operates within a small voltage range of 3 V and shows superb performance featuring high spatial resolution of 4 mm, faster response than human skin (<30 ms), and excellent accuracy in sensing complex objects on both flat and curved surfaces. This work may pave the road for future integration of high-performance electronic skin in smart robotics and prosthetic solutions.

  2. Combining high performance simulation, data acquisition, and graphics display computers

    NASA Technical Reports Server (NTRS)

    Hickman, Robert J.

    1989-01-01

    Issues involved in the continuing development of an advanced simulation complex are discussed. This approach provides the capability to perform the majority of tests on advanced systems, non-destructively. The controlled test environments can be replicated to examine the response of the systems under test to alternative treatments of the system control design, or test the function and qualification of specific hardware. Field tests verify that the elements simulated in the laboratories are sufficient. The digital computer is hosted by a Digital Equipment Corp. MicroVAX computer with an Aptec Computer Systems Model 24 I/O computer performing the communication function. An Applied Dynamics International AD100 performs the high speed simulation computing and an Evans and Sutherland PS350 performs on-line graphics display. A Scientific Computer Systems SCS40 acts as a high performance FORTRAN program processor to support the complex, by generating numerous large files from programs coded in FORTRAN that are required for the real time processing. Four programming languages are involved in the process: FORTRAN, ADSIM, ADRIO, and STAPLE. FORTRAN is employed on the MicroVAX host to initialize and terminate the simulation runs on the system. The generation of the data files on the SCS40 also is performed with FORTRAN programs. ADSIM and ADRIO are used to program the processing elements of the AD100 and its IOCP processor. STAPLE is used to program the Aptec DIP and DIA processors.

  3. Performance Engineering Research Institute SciDAC-2 Enabling Technologies Institute Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hall, Mary

    2014-09-19

    Enhancing the performance of SciDAC applications on petascale systems has high priority within DOE SC. As we look to the future, achieving expected levels of performance on high-end computing (HEC) systems is growing ever more challenging due to enormous scale, increasing architectural complexity, and increasing application complexity. To address these challenges, PERI has implemented a unified, tripartite research plan encompassing: (1) performance modeling and prediction; (2) automatic performance tuning; and (3) performance engineering of high profile applications. The PERI performance modeling and prediction activity is developing and refining performance models, significantly reducing the cost of collecting the data upon which the models are based, and increasing model fidelity, speed and generality. Our primary research activity is automatic tuning (autotuning) of scientific software. This activity is spurred by the strong user preference for automatic tools and is based on previous successful activities such as ATLAS, which has automatically tuned components of the LAPACK linear algebra library, and other recent work on autotuning domain-specific libraries. Our third major component is application engagement, to which we are devoting approximately 30% of our effort to work directly with SciDAC-2 applications. This last activity not only helps DOE scientists meet their near-term performance goals, but also helps keep PERI research focused on the real challenges facing DOE computational scientists as they enter the Petascale Era.

  4. Secondary benzylation with benzyl alcohols catalyzed by a high-valent heterobimetallic Ir-Sn complex.

    PubMed

    Podder, Susmita; Choudhury, Joyanta; Roy, Sujit

    2007-04-13

    A highly efficient secondary benzylation procedure has been demonstrated using a high-valent heterobimetallic complex [Ir2(COD)2(SnCl3)2(Cl)2(mu-Cl)2] 1 as the catalyst in 1,2-dichloroethane to afford the corresponding benzylated products in moderate to excellent yields. The reaction was performed not only with carbon nucleophiles (arenes and heteroarenes) but also with oxygen (alcohol), nitrogen (amide and sulfonamide), and sulfur (thiol) nucleophiles. Mechanistic investigation showed the intermediacy of the ether in this reaction. An electrophilic mechanism is proposed from Hammett correlation.

  5. Studying the Effect of a Composition of the Cluster Core in High-Radiopacity Cluster Complexes of Rhenium on Their Acute Toxicity In Vivo.

    PubMed

    Pozmogova, T N; Krasil'nikova, A A; Ivanov, A A; Shestopalov, M A; Gyrylova, S N; Shestopalova, L V; Shestopaloiv, A M; Shkurupy, V A

    2016-05-01

    An in vivo study was performed to evaluate how the acute toxicity of high-radiopacity, luminescent octahedral cluster complexes of rhenium after intravenous injection depends on the composition of the cluster core. Changes in mouse body weight, water and food consumption, degree of intoxication, and morphological changes in the visceral organs were studied after intravenous injection of the following cluster complexes with various internal ligands (S, Se, or Te): Na4[{Re6Te8}(CN)6], Na4[{Re6Se8}(CN)6], and Na4[{Re6S8}(CN)6]. The Na4[{Re6S8}(CN)6] cluster complex was shown to be the safest for animals.

  6. Information processing using a single dynamical node as complex system

    PubMed Central

    Appeltant, L.; Soriano, M.C.; Van der Sande, G.; Danckaert, J.; Massar, S.; Dambre, J.; Schrauwen, B.; Mirasso, C.R.; Fischer, I.

    2011-01-01

    Novel methods for information processing are highly desired in our information-driven society. Inspired by the brain's ability to process information, the recently introduced paradigm known as 'reservoir computing' shows that complex networks can efficiently perform computation. Here we introduce a novel architecture that reduces the usually required large number of elements to a single nonlinear node with delayed feedback. Through an electronic implementation, we experimentally and numerically demonstrate excellent performance in a speech recognition benchmark. Complementary numerical studies also show excellent performance for a time series prediction benchmark. These results prove that delay-dynamical systems, even in their simplest manifestation, can perform efficient information processing. This finding paves the way to feasible and resource-efficient technological implementations of reservoir computing. PMID:21915110
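
    The following minimal Python sketch illustrates the delay-based reservoir idea of a single nonlinear node time-multiplexed over virtual nodes; the tanh nonlinearity, scaling parameters, and the toy memory task are illustrative assumptions rather than the electronic implementation described above.

```python
# Minimal single-node delay reservoir: the delayed output is time-multiplexed
# over N "virtual nodes", and a linear readout is trained on the states.
import numpy as np

rng = np.random.default_rng(0)
N = 50                        # virtual nodes per delay loop (assumed)
eta, gamma = 0.5, 0.05        # feedback and input scaling (assumed)

def reservoir_states(u):
    """Drive the delay loop with input sequence u; return one state row per step."""
    mask = rng.uniform(-1, 1, N)     # fixed input mask distributing u over virtual nodes
    x = np.zeros(N)                  # states stored along the delay line
    states = []
    for u_t in u:
        for i in range(N):
            x[i] = np.tanh(eta * x[i] + gamma * mask[i] * u_t)
        states.append(x.copy())
    return np.array(states)

# Toy memory task: train a linear readout to recall the previous input sample.
u = rng.uniform(-1, 1, 500)
S = reservoir_states(u)
target = np.roll(u, 1)
w, *_ = np.linalg.lstsq(S, target, rcond=None)
print("training MSE:", np.mean((S @ w - target) ** 2))
```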

  7. Synthesis of Copper Oxide/Graphite Composite for High-Performance Rechargeable Battery Anode.

    PubMed

    Cho, Sanghun; Ahn, Yong-Keon; Yin, Zhenxing; You, Duck-Jae; Kim, Hyunjin; Piao, Yuanzhe; Yoo, Jeeyoung; Kim, Youn Sang

    2017-08-25

    A novel copper oxide/graphite composite (GCuO) anode with high capacity and long cycle stability is proposed. A simple, one-step synthesis method is used to prepare the GCuO, through heat treatment of the Cu ion complex and pristine graphite. The gases generated during thermal decomposition of the Cu ion complex (H2 and CO2) induce interlayer expansion of the graphite planes, which assists effective ion intercalation. Copper oxide is formed simultaneously as a high-capacity anode material through thermal reduction of the Cu ion complex. Material analyses reveal the formation of Cu oxide nanoparticles and the expansion of the gaps between the graphite layers from 0.34 to 0.40 nm, which is enough to alleviate layer stress for reversible ion intercalation for Li or Na batteries. The GCuO cell exhibits excellent Li-ion battery half-cell performance, with a capacity of 532 mAh g-1 at 0.2 C (C-rate) and capacity retention of 83 % after 250 cycles. Moreover, the LiFePO4/GCuO full cell is fabricated to verify the high performance of GCuO in practical applications. This cell has a capacity of 70 mAh g-1 and a coulombic efficiency of 99 %. The GCuO composite is therefore a promising candidate for use as an anode material in advanced Li- or Na-ion batteries. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Identifying protein complex by integrating characteristic of core-attachment into dynamic PPI network.

    PubMed

    Shen, Xianjun; Yi, Li; Jiang, Xingpeng; He, Tingting; Yang, Jincai; Xie, Wei; Hu, Po; Hu, Xiaohua

    2017-01-01

    How to identify protein complexes is an important and challenging task in proteomics, and it would make a great contribution to our knowledge of the molecular mechanisms underlying cellular activities. However, the inherent organization and dynamic characteristics of the cell system have rarely been incorporated into existing algorithms for detecting protein complexes, because of the limitations of protein-protein interaction (PPI) data produced by high-throughput techniques. The availability of time-course gene expression profiles enables us to uncover the dynamics of molecular networks and improve the detection of protein complexes. To achieve this goal, this paper proposes a novel algorithm, DCA (Dynamic Core-Attachment). It detects protein-complex cores comprising continually expressed and highly connected proteins in a dynamic PPI network, and each protein complex is then formed by adding to the core the attachments with high adhesion. The integration of the core-attachment feature into the dynamic PPI network is responsible for the superiority of our algorithm. DCA has been applied to two different yeast dynamic PPI networks, and the experimental results show that it performs significantly better than state-of-the-art techniques in terms of prediction accuracy, hF-measure and statistical significance in biology. In addition, the identified complexes with strong biological significance provide potential candidate complexes for biologists to validate.
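
    As a rough illustration of the core-attachment idea only (not the DCA algorithm itself, which additionally builds a dynamic network from time-course expression data), the following networkx sketch grows complexes from dense cores by attaching high-adhesion neighbours; the thresholds and the toy graph are arbitrary.

```python
# Core-attachment style heuristic on a static PPI-like graph (illustrative only).
import networkx as nx

def core_attachment_complexes(G, min_core_size=3, adhesion=0.5):
    """Grow candidate complexes: maximal cliques as cores, plus high-adhesion neighbours."""
    complexes = []
    for core in nx.find_cliques(G):                     # dense cores: maximal cliques
        if len(core) < min_core_size:
            continue
        members = set(core)
        attachments = set()
        for v in set(G) - members:                      # attach well-connected neighbours
            linked = sum(1 for u in members if G.has_edge(u, v))
            if linked / len(members) >= adhesion:
                attachments.add(v)
        complexes.append(members | attachments)
    return complexes

# Toy demonstration on a small built-in graph.
G = nx.karate_club_graph()
for cpx in core_attachment_complexes(G)[:3]:
    print(sorted(cpx))
```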

  9. Enantioseparation of mandelic acid derivatives by high performance liquid chromatography with substituted β-cyclodextrin as chiral mobile phase additive and evaluation of inclusion complex formation

    PubMed Central

    Tong, Shengqiang; Zhang, Hu; Shen, Mangmang

    2014-01-01

    The enantioseparation of ten mandelic acid derivatives was performed by reverse phase high performance liquid chromatography with hydroxypropyl-β-cyclodextrin (HP-β-CD) or sulfobutyl ether-β-cyclodextrin (SBE-β-CD) as chiral mobile phase additives, in which inclusion complex formation between cyclodextrins and enantiomers was evaluated. The effects of various factors such as the composition of mobile phase, concentration of cyclodextrins and column temperature on retention and enantioselectivity were studied. The peak resolutions and retention time of the enantiomers were strongly affected by the pH, the organic modifier and the type of β-cyclodextrin in the mobile phase, while the concentration of buffer solution and temperature had a relatively low effect on resolutions. Enantioseparations were successfully achieved on a Shimpack CLC-ODS column (150×4.6 mm i.d., 5 μm). The mobile phase was a mixture of acetonitrile and 0.10 mol L-1 of phosphate buffer at pH 2.68 containing 20 mmol L-1 of HP-β-CD or SBE-β-CD. Semi-preparative enantioseparation of about 10 mg of α-cyclohexylmandelic acid and α-cyclopentylmandelic acid was established individually. Cyclodextrin-enantiomer complex stoichiometries as well as binding constants were investigated. Results showed that the stoichiometries for all the cyclodextrin-enantiomer inclusion complexes were 1:1. PMID:24893270

  10. The influence of creative process engagement on employee creative performance and overall job performance: a curvilinear assessment.

    PubMed

    Zhang, Xiaomeng; Bartol, Kathryn M

    2010-09-01

    Integrating theories addressing attention and activation with creativity literature, we found an inverted U-shaped relationship between creative process engagement and overall job performance among professionals in complex jobs in an information technology firm. Work experience moderated the curvilinear relationship, with low-experience employees generally exhibiting higher levels of overall job performance at low to moderate levels of creative process engagement and high-experience employees demonstrating higher overall performance at moderate to high levels of creative process engagement. Creative performance partially mediated the relationship between creative process engagement and job performance. These relationships were tested within a moderated mediation framework. Copyright 2010 APA, all rights reserved

  11. Transvenous Embolization of a Spontaneous Femoral AVF 5 Years After an Incomplete Treatment with Arterial Stent-Grafts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peynircioglu, Bora; Ozkan, Murat; Dogan, Omer Faruk

    2008-03-15

    A complex left femoral arterio-venous fistula (AVF) in a 66-year-old man was first diagnosed after a deep venous thrombosis incident approximately 5 years ago. Partial treatment was performed by means of endografts along the superficial femoral artery, which remained patent for 5 years. The patient had been doing well until a couple of months ago, when he developed severe venous stasis and ulcers of the left cruris due to a high-flow nonhealing complex AVF with additional iliac vein occlusion. Therefore, the definitive treatment was performed by a unique endovascular technique combined with surgical venous bypass (femoro-femoral crossover saphenous bypass, the Palma operation). A novel percutaneous transvenous technique for occlusion of a complex high-flow AVF is reported with a review of the literature. The case is unique with spontaneous AVF, transvenous embolization with detachable coils and ONYX, and the hybrid treatment technique as well as the long-term patency of superficial femoral artery stent-grafts.

  12. A neurobehavioral examination of individuals with high-functioning autism and Asperger's disorder using a fronto-striatal model of dysfunction.

    PubMed

    Rinehart, Nicole J; Bradshaw, John L; Tonge, Bruce J; Brereton, Avril V; Bellgrove, Mark A

    2002-06-01

    The repetitive, stereotyped, and obsessive behaviors that characterize autism may in part be attributable to disruption of the region of the fronto-striatal system, which mediates executive abilities. Neuropsychological testing has shown that children with autism exhibit set-shifting deficiencies on tests such as the Wisconsin Card Sorting task but show normal inhibitory ability on variants of the Stroop color-word test. According to Minshew and Goldstein's multiple primary deficit theory, the complexity of the executive functioning task is important in determining the performance of individuals with autism. This study employed a visual-spatial task (with a Stroop-type component) to examine the integrity of executive functioning, in particular inhibition, in autism (n = 12) and Asperger's disorder (n = 12) under increasing levels of cognitive complexity. Whereas the Asperger's disorder group performed similarly to age- and IQ-matched control participants, even at the higher levels of cognitive complexity, the high-functioning autism group displayed inhibitory deficits specifically associated with increasing cognitive load.

  13. Probabilistic performance assessment of complex energy process systems - The case of a self-sustained sanitation system.

    PubMed

    Kolios, Athanasios; Jiang, Ying; Somorin, Tosin; Sowale, Ayodeji; Anastasopoulou, Aikaterini; Anthony, Edward J; Fidalgo, Beatriz; Parker, Alison; McAdam, Ewan; Williams, Leon; Collins, Matt; Tyrrel, Sean

    2018-05-01

    A probabilistic modelling approach was developed and applied to investigate the energy and environmental performance of an innovative sanitation system, the "Nano-membrane Toilet" (NMT). The system treats human excreta via an advanced energy and water recovery island with the aim of addressing current and future sanitation demands. Due to the complex design and inherent characteristics of the system's input material, there are a number of stochastic variables which may significantly affect the system's performance. The non-intrusive probabilistic approach adopted in this study combines a finite number of deterministic thermodynamic process simulations with an artificial neural network (ANN) approximation model and Monte Carlo simulations (MCS) to assess the effect of system uncertainties on the predicted performance of the NMT system. The joint probability distributions of the process performance indicators suggest a Stirling Engine (SE) power output in the range of 61.5-73 W with a high confidence interval (CI) of 95%. In addition, there is high probability (with 95% CI) that the NMT system can achieve positive net power output between 15.8 and 35 W. A sensitivity study reveals the system power performance is mostly affected by SE heater temperature. Investigation into the environmental performance of the NMT design, including water recovery and CO2/NOx emissions, suggests significant environmental benefits compared to conventional systems. Results of the probabilistic analysis can better inform future improvements on the system design and operational strategy and this probabilistic assessment framework can also be applied to similar complex engineering systems.
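
    The non-intrusive idea of combining a few deterministic simulations, a cheap surrogate model, and Monte Carlo sampling can be sketched as below; the scikit-learn MLP surrogate, the input ranges, and the toy power relation are invented for illustration and are not NMT design values.

```python
# Surrogate-assisted Monte Carlo uncertainty propagation (illustrative sketch).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# Pretend these samples came from a handful of deterministic process simulations:
# inputs are [heater temperature (K), feed moisture fraction]; output is net power (W).
X_design = rng.uniform([500, 0.6], [700, 0.9], size=(200, 2))
y_design = 0.15 * X_design[:, 0] - 60 * X_design[:, 1] - 10 + rng.normal(0, 1, 200)

surrogate = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                         random_state=0).fit(X_design, y_design)

# Monte Carlo propagation of the stochastic inputs through the surrogate.
samples = rng.uniform([500, 0.6], [700, 0.9], size=(100_000, 2))
power = surrogate.predict(samples)
lo, hi = np.percentile(power, [2.5, 97.5])
print(f"net power 95% interval: {lo:.1f} to {hi:.1f} W (illustrative)")
```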

  14. 42 CFR 493.1495 - Standard; Testing personnel responsibilities.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... experience, and technical abilities. (b) Each individual performing high complexity testing must— (1) Follow the laboratory's procedures for specimen handling and processing, test analyses, reporting and...

  15. A novel method based on selective laser sintering for preparing high-performance carbon fibres/polyamide12/epoxy ternary composites

    NASA Astrophysics Data System (ADS)

    Zhu, Wei; Yan, Chunze; Shi, Yunsong; Wen, Shifeng; Liu, Jie; Wei, Qingsong; Shi, Yusheng

    2016-09-01

    A novel method based on selective laser sintering (SLS) process is proposed for the first time to prepare complex and high-performance carbon fibres/polyamide12/epoxy (CF/PA12/EP) ternary composites. The procedures are briefly described as follows: prepare polyamide12 (PA12) coated carbon fibre (CF) composite powder; build porous green parts by SLS; infiltrate the green parts with high-performance thermosetting epoxy (EP) resin; and finally cure the resin at high temperature. The obtained composites are a ternary composite system consisting of the matrix of novolac EP resin, the reinforcement of CFs and the transition thin layer of PA12 with a thickness of 595 nm. The SEM images and micro-CT analysis prove that the ternary system is a three-dimensional co-continuous structure and the reinforcement of CFs are well dispersed in the matrix of EP with the volume fraction of 31%. Mechanical tests show that the composites fabricated by this method yield an ultimate tensile strength of 101.03 MPa and a flexural strength of 153.43 MPa, which are higher than those of most of the previously reported SLS materials. Therefore, the process proposed in this paper shows great potential for manufacturing complex, lightweight and high-performance CF reinforced composite components in aerospace, automotive industries and other areas.

  16. A novel method based on selective laser sintering for preparing high-performance carbon fibres/polyamide12/epoxy ternary composites

    PubMed Central

    Zhu, Wei; Yan, Chunze; Shi, Yunsong; Wen, Shifeng; Liu, Jie; Wei, Qingsong; Shi, Yusheng

    2016-01-01

    A novel method based on selective laser sintering (SLS) process is proposed for the first time to prepare complex and high-performance carbon fibres/polyamide12/epoxy (CF/PA12/EP) ternary composites. The procedures are briefly described as follows: prepare polyamide12 (PA12) coated carbon fibre (CF) composite powder; build porous green parts by SLS; infiltrate the green parts with high-performance thermosetting epoxy (EP) resin; and finally cure the resin at high temperature. The obtained composites are a ternary composite system consisting of the matrix of novolac EP resin, the reinforcement of CFs and the transition thin layer of PA12 with a thickness of 595 nm. The SEM images and micro-CT analysis prove that the ternary system is a three-dimensional co-continuous structure and the reinforcement of CFs are well dispersed in the matrix of EP with the volume fraction of 31%. Mechanical tests show that the composites fabricated by this method yield an ultimate tensile strength of 101.03 MPa and a flexural strength of 153.43 MPa, which are higher than those of most of the previously reported SLS materials. Therefore, the process proposed in this paper shows great potential for manufacturing complex, lightweight and high-performance CF reinforced composite components in aerospace, automotive industries and other areas. PMID:27650254

  17. Because difficulty is not the same for everyone: the impact of complexity in working memory is associated with cannabinoid 1 receptor genetic variation in young adults.

    PubMed

    Ruiz-Contreras, Alejandra E; Román-López, Talía V; Caballero-Sánchez, Ulises; Rosas-Escobar, Cintia B; Ortega-Mora, E Ivett; Barrera-Tlapa, Miguel A; Romero-Hidalgo, Sandra; Carrillo-Sánchez, Karol; Hernández-Morales, Salvador; Vadillo-Ortega, Felipe; González-Barrios, Juan Antonio; Méndez-Díaz, Mónica; Prospéro-García, Oscar

    2017-03-01

    Individual differences in working memory ability are mainly revealed when a demanding challenge is imposed. Here, we have associated the cannabinoid 1 (CB1) receptor genetic variation rs2180619 (AA, AG, GG), which is located in a potential CNR1 regulatory sequence, with performance in working memory. Two hundred and nine Mexican-mestizo healthy young participants (89 women, 120 men; mean age 23.26 years, SD = 2.85) were challenged to solve a medium-difficulty (2-back) and a high-difficulty (3-back) N-back task. All subjects responded as expected: performance was better on the medium- than on the high-demand task version, but no differences were found among genotypes on either working memory (WM) task. However, the cost of the level of complexity in the N-back paradigm was twice as large for GG subjects as for AA subjects. Notably, an additive allele-dosage relation was found for the G allele in terms of the cost of the level of complexity. These results support the involvement of the endocannabinoid system, evaluated through the rs2180619 polymorphism, in WM ability in humans.

  18. Strong Influence of Humidity on Low-Temperature Thin-Film Fabrication via Metal Aqua Complex for High Performance Oxide Semiconductor Thin-Film Transistors.

    PubMed

    Lim, Keon-Hee; Huh, Jae-Eun; Lee, Jinwon; Cho, Nam-Kwang; Park, Jun-Woo; Nam, Bu-Il; Lee, Eungkyu; Kim, Youn Sang

    2017-01-11

    Oxide semiconductor thin-film transistors (OS TFTs) with good transparency and electrical performance have great potential for future display technology. In particular, solution-processed OS TFTs have attracted much attention due to advantages such as continuous, large-scale, and low-cost processability. Recently, OS TFTs fabricated with a metal aqua complex have drawn particular interest because they offer low-temperature processability for deposition on flexible substrates as well as high field-effect mobility for advanced display applications. However, despite some remarkable results, the factors needed to optimize their electrical performance with reproducibility and uniformity have not yet been established. Here, we introduce the strong effect of humidity on the electrical performance of OS TFTs fabricated with the metal aqua complex. By controlling humidity during the spin-coating and annealing processes, we successfully demonstrate solution-processed InOx/SiO2 TFTs with a good electrical uniformity of ~5% standard deviation, showing high average field-effect mobilities of 2.76 cm2 V-1 s-1 and 15.28 cm2 V-1 s-1 when fabricated at 200 and 250 °C, respectively. On the basis of systematic analyses, we also propose a mechanism for the change in electrical properties of InOx TFTs depending on humidity control. Finally, based on this mechanism, we extended humidity control to the fabrication of the AlOx insulator and successfully achieved humidity-controlled InOx/AlOx TFTs fabricated at 200 °C with a high average field-effect mobility of 9.5 cm2 V-1 s-1.

  19. [The impact of nutrient prosperity of the body on the alimentary status of junior high school pupils within the framework of the implementation of health-saving technologies].

    PubMed

    Setko, A G; Trishina, S P; Timoshenko, E P

    2014-01-01

    This work assessed the actual nutrition of high-school pupils, its impact on their nutritional status, and the efficiency of introducing certified vitamin-mineral complexes into the modern educational process as a health-saving component of diet optimization. Introducing additional vitamin-mineral complexes into the food of high-school pupils was found to optimize the intake of most macro- and micronutrients, which in turn increased the number of children with an adequate supply of vitamins, increased the proportion of students with satisfactory adaptation by 44.3% and those with sufficient performance reserves by 48.4%, and decreased the number of children with sharply reduced functional reserves by a factor of 4.

  20. An Optimizing Compiler for Petascale I/O on Leadership-Class Architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kandemir, Mahmut Taylan; Choudary, Alok; Thakur, Rajeev

    In high-performance computing (HPC), parallel I/O architectures usually have very complex hierarchies with multiple layers that collectively constitute an I/O stack, including high-level I/O libraries such as PnetCDF and HDF5, I/O middleware such as MPI-IO, and parallel file systems such as PVFS and Lustre. Our DOE project explored automated instrumentation and compiler support for I/O-intensive applications. The project made significant progress towards understanding the complex I/O hierarchies of high-performance storage systems (including storage caches, HDDs, and SSDs), and towards designing and implementing state-of-the-art compiler/runtime system technology targeting I/O-intensive HPC applications that run on leadership-class machines. This final report summarizes the major achievements of the project and also points out promising future directions. Two new sections in this report, compared with the previous report, cover IOGenie and SSD/NVM-specific optimizations.

  1. High-speed low-complexity video coding with EDiCTius: a DCT coding proposal for JPEG XS

    NASA Astrophysics Data System (ADS)

    Richter, Thomas; Fößel, Siegfried; Keinert, Joachim; Scherl, Christian

    2017-09-01

    At its 71st meeting, the JPEG committee issued a call for low-complexity, high-speed image coding designed to address the needs of low-cost video-over-IP applications. In answer to this call, Fraunhofer IIS and the Computing Center of the University of Stuttgart jointly developed an embedded DCT image codec requiring only minimal resources while maximizing throughput on FPGA and GPU implementations. Objective and subjective tests performed for the 73rd meeting confirmed its excellent performance and suitability for its purpose, and it was selected as one of the two key contributions for the development of a joint test model. In this paper, its authors describe the design principles of the codec, provide a high-level overview of the encoder and decoder chain, and present evaluation results on the test corpus selected by the JPEG committee.
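
    As a toy illustration of the block-DCT building block underlying low-complexity codecs of this kind (not the EDiCTius design itself), the sketch below applies a forward 8x8 DCT, coarse quantisation, and an inverse DCT; the block size and quantisation step are arbitrary assumptions.

```python
# Forward/inverse 8x8 block DCT with uniform quantisation (illustrative only).
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(0)
block = rng.integers(0, 256, (8, 8)).astype(float)    # stand-in for an image block

coeffs = dctn(block, norm="ortho")                    # forward 2-D DCT
q = 16.0                                              # arbitrary quantisation step
recon = idctn(np.round(coeffs / q) * q, norm="ortho") # dequantise and invert

print("max reconstruction error:", np.abs(block - recon).max())
```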

  2. Using AberOWL for fast and scalable reasoning over BioPortal ontologies.

    PubMed

    Slater, Luke; Gkoutos, Georgios V; Schofield, Paul N; Hoehndorf, Robert

    2016-08-08

    Reasoning over biomedical ontologies using their OWL semantics has traditionally been a challenging task due to the high theoretical complexity of OWL-based automated reasoning. As a consequence, ontology repositories, as well as most other tools utilizing ontologies, either provide access to ontologies without use of automated reasoning, or limit the number of ontologies for which automated reasoning-based access is provided. We apply the AberOWL infrastructure to provide automated reasoning-based access to all accessible and consistent ontologies in BioPortal (368 ontologies). We perform an extensive performance evaluation to determine query times, both for queries of different complexity and for queries that are performed in parallel over the ontologies. We demonstrate that, with the exception of a few ontologies, even complex and parallel queries can now be answered in milliseconds, therefore allowing automated reasoning to be used on a large scale, to run in parallel, and with rapid response times.
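
    A small owlready2 sketch below shows the kind of OWL reasoning task (classify an ontology, then query inferred subclasses) that AberOWL performs at scale; the ontology IRI and class name are placeholders, and AberOWL's own REST interface is not shown.

```python
# OWL reasoning sketch with owlready2 (requires a local Java runtime for the reasoner).
# The ontology IRI and the class name searched for are hypothetical placeholders.
from owlready2 import get_ontology, sync_reasoner

onto = get_ontology("http://example.org/my_ontology.owl").load()  # placeholder IRI
with onto:
    sync_reasoner()          # runs the bundled reasoner and materialises inferences

some_class = onto.search_one(iri="*SomeClass")        # hypothetical class name
if some_class is not None:
    print(list(some_class.subclasses()))              # inferred direct subclasses
```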

  3. Striving for Excellence Sometimes Hinders High Achievers: Performance-Approach Goals Deplete Arithmetical Performance in Students with High Working Memory Capacity

    PubMed Central

    Crouzevialle, Marie; Smeding, Annique; Butera, Fabrizio

    2015-01-01

    We tested whether the goal to attain normative superiority over other students, referred to as performance-approach goals, is particularly distractive for high-Working Memory Capacity (WMC) students—that is, those who are used to being high achievers. Indeed, WMC is positively related to high-order cognitive performance and academic success, a record of success that confers benefits on high-WMC as compared to low-WMC students. We tested whether such benefits may turn out to be a burden under performance-approach goal pursuit. Indeed, for high achievers, aiming to rise above others may represent an opportunity to reaffirm their positive status—a stake liable to trigger disruptive outcome concerns that interfere with task processing. Results revealed that with performance-approach goals—as compared to goals with no emphasis on social comparison—the higher the students' WMC, the lower their performance on a complex arithmetic task (Experiment 1). Crucially, this pattern appeared to be driven by uncertainty regarding the chances of outclassing others (Experiment 2). Moreover, an accessibility measure suggested the mediational role played by status-related concerns in the observed disruption of performance. We discuss why high-stakes situations can paradoxically lead high achievers to perform sub-optimally when high-order cognitive performance is at play. PMID:26407097

  4. Performance Assessment of Two Whole-Lake Acoustic Positional Telemetry Systems - Is Reality Mining of Free-Ranging Aquatic Animals Technologically Possible?

    PubMed Central

    Baktoft, Henrik; Zajicek, Petr; Klefoth, Thomas; Svendsen, Jon C.; Jacobsen, Lene; Pedersen, Martin Wæver; March Morla, David; Skov, Christian; Nakayama, Shinnosuke; Arlinghaus, Robert

    2015-01-01

    Acoustic positional telemetry systems (APTs) represent a novel approach to studying the behaviour of free-ranging aquatic animals in the wild at unprecedented detail. System manufacturers promise remarkably high temporal and spatial resolution. However, the performance of APTs has rarely been rigorously tested at the level of entire ecosystems. Moreover, the effect of habitat structure on system performance has only been poorly documented. Two APTs were deployed to cover two small lakes, and a series of standardized stationary tests was conducted to assess system performance. Furthermore, a number of tow tests were conducted to simulate moving fish. Based on these data, we quantified system performance in terms of data yield, accuracy and precision as a function of structural complexity in relation to vegetation. Mean data yield of the two systems was 40% (Lake1) and 60% (Lake2). Average system accuracy (acc) and precision (prec) were Lake1: acc = 3.1 m, prec = 1.1 m; Lake2: acc = 1.0 m, prec = 0.2 m. System performance was negatively affected by structural complexity, i.e., open water habitats yielded far better performance than structurally complex vegetated habitats. Post-processing greatly improved data quality, and sub-meter accuracy and precision were, on average, regularly achieved in Lake2 but remained the exception in the larger and structurally more complex Lake1. Moving transmitters were tracked well by both systems. Whereas overestimation of moved distance is inevitable for stationary transmitters due to accumulation of small tracking errors, moving transmitters can result in both over- and underestimation of distances depending on circumstances. Both deployed APTs were capable of providing high-resolution positional data at the scale of entire lakes and are suitable systems to mine the reality of free-ranging fish in their natural environment. This opens important opportunities to advance several fields of study, such as movement ecology and animal social networks in the wild. It is recommended that thorough performance tests are conducted in any study utilizing APTs. The APTs tested here appear best suited for studies in structurally simple ecosystems or for studying pelagic species. In such situations, the data quality provided by the APTs is exceptionally high. PMID:26000459
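
    As a concrete illustration of the reported metrics, the sketch below computes accuracy (mean distance of fixes from a known reference position) and precision (scatter of the fixes) for one stationary test station; the data layout and numbers are assumptions, not the study's dataset.

```python
# Minimal sketch (assumed data layout, synthetic numbers): accuracy as the mean
# horizontal error against a known reference position, precision as the scatter
# of repeated fixes for one stationary test station.
import numpy as np

def station_metrics(fixes: np.ndarray, reference: np.ndarray) -> dict:
    # fixes: (n, 2) estimated positions in metres; reference: (2,) true position
    err = np.linalg.norm(fixes - reference, axis=1)
    return {
        "n_fixes": len(fixes),                           # raw data yield
        "accuracy_m": float(err.mean()),                 # mean distance to truth
        "precision_m": float(fixes.std(axis=0).mean()),  # spread of the fixes
    }

rng = np.random.default_rng(0)
truth = np.array([100.0, 250.0])
fixes = truth + rng.normal(0.0, 1.0, size=(500, 2))      # synthetic test data
print(station_metrics(fixes, truth))
```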

  5. Organic Donor-Acceptor Complexes as Novel Organic Semiconductors.

    PubMed

    Zhang, Jing; Xu, Wei; Sheng, Peng; Zhao, Guangyao; Zhu, Daoben

    2017-07-18

    Organic donor-acceptor (DA) complexes have attracted wide attention in recent decades, resulting in the rapid development of organic binary system electronics. The design and synthesis of organic DA complexes with a variety of component structures have mainly focused on metallicity (or even superconductivity), emission, or ferroelectricity studies. Further efforts have been made in high-performance electronic investigations. The chemical versatility of organic semiconductors provides DA complexes with a great number of possibilities for semiconducting applications. Organic DA complexes extend the semiconductor family and promote charge separation and transport in organic field-effect transistors (OFETs) and organic photovoltaics (OPVs). In OFETs, the organic complex serves as an active layer across extraordinary charge pathways, ensuring the efficient transport of induced charges. Although an increasing number of organic semiconductors have been reported to exhibit good p- or n-type properties (mobilities higher than 1 or even 10 cm^2 V^-1 s^-1), critical scientific challenges remain in utilizing the advantages of existing semiconductor materials for more and wider applications while maintaining less complicated synthetic or device fabrication processes. DA complex materials have revealed new insight: their unique molecular packing and structure-property relationships. The combination of donors and acceptors could offer practical advantages compared with their unimolecular materials. First, growing crystals of DA complexes with densely packed structures will reduce impurities and traps from the self-assembly process. Second, complexes based on the original structural components could form superior mixture stacking, which can facilitate charge transport depending on the driving force in the coassembly process. Third, the effective use of organic semiconductors can lead to tunable band structures, allowing the operation mode (p- or n-type) of the transistor to be systematically controlled by changing the components. Finally, theoretical calculations based on cocrystals with unique stacking could widen our understanding of structure-property relationships and in turn help us design high-performance semiconductors based on DA complexes. In this Account, we focus on discussing organic DA complexes as a new class of semiconducting materials, including their design, growth methods, packing modes, charge-transport properties, and structure-property relationships. We have also fabricated and investigated devices based on these binary crystals. This interdisciplinary work combines techniques from the fields of self-assembly, crystallography, condensed-matter physics, and theoretical chemistry. Researchers have designed new complex systems, including donor and acceptor compounds that self-assemble in feasible ways into highly ordered cocrystals. We demonstrate that using this crystallization method can easily realize ambipolar or unipolar transport. To further improve device performance, we propose several design strategies, such as using new kinds of donors and acceptors, modulating the energy alignment of the donor (ionization potential, IP) and acceptor (electron affinity, EA) components, and extending the π-conjugated backbones. In addition, we have found that when we use molecular "doping" (2:1 cocrystallization), the charge-transport nature of organic semiconductors can be switched from hole-transport-dominated to electron-transport-dominated. We expect that the formation of cocrystals through the complexation of organic donor and acceptor species will serve as a new strategy to develop semiconductors for organic electronics with superior performances over their corresponding individual components.

  6. 76 FR 80905 - TRICARE Evaluation of Centers for Medicare & Medicaid Services Approved Laboratory Developed...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-27

    ... potential high utilization and potential high clinical impact on TRICARE beneficiaries. If no submission is... reviewed in numerical order beginning with the test listed as having the highest priority. Those selected... laboratories that use LDTs as well as FDA approved tests. Laboratories performing moderate or high complexity...

  7. Structured analysis and modeling of complex systems

    NASA Technical Reports Server (NTRS)

    Strome, David R.; Dalrymple, Mathieu A.

    1992-01-01

    The Aircrew Evaluation Sustained Operations Performance (AESOP) facility at Brooks AFB, Texas, combines the realism of an operational environment with the control of a research laboratory. In recent studies we collected extensive data from the Airborne Warning and Control Systems (AWACS) Weapons Directors subjected to high and low workload Defensive Counter Air Scenarios. A critical and complex task in this environment involves committing a friendly fighter against a hostile fighter. Structured Analysis and Design techniques and computer modeling systems were applied to this task as tools for analyzing subject performance and workload. This technology is being transferred to the Man-Systems Division of NASA Johnson Space Center for application to complex mission related tasks, such as manipulating the Shuttle grappler arm.

  8. Lasercom system architecture with reduced complexity

    NASA Technical Reports Server (NTRS)

    Lesh, James R. (Inventor); Chen, Chien-Chung (Inventor); Ansari, Homayoon (Inventor)

    1994-01-01

    Spatial acquisition and precision beam pointing functions are critical to spaceborne laser communication systems. In the present invention, a single high bandwidth CCD detector is used to perform both spatial acquisition and tracking functions. Compared to previous lasercom hardware design, the array tracking concept offers reduced system complexity by reducing the number of optical elements in the design. Specifically, the design requires only one detector and one beam steering mechanism. It also provides the means to optically close the point-ahead control loop. The technology required for high bandwidth array tracking was examined and shown to be consistent with current state of the art. The single detector design can lead to a significantly reduced system complexity and a lower system cost.
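
    The patent text describes the tracking function only at a system level; the sketch below shows the basic per-frame operation such an array tracker performs, intensity-weighted centroiding of the beam spot, under assumed frame dimensions. It is an illustration of the concept, not the patented design.

```python
# Minimal sketch (not the patented design): estimate the beam position on a CCD
# frame by intensity-weighted centroiding, the core per-frame step of array tracking.
import numpy as np

def spot_centroid(frame: np.ndarray, background: float = 0.0) -> tuple:
    """Return the (row, col) centroid of the beam spot in a 2D intensity frame."""
    img = np.clip(frame - background, 0.0, None)     # remove detector bias/background
    total = img.sum()
    if total == 0:
        raise ValueError("no signal above background")
    rows, cols = np.indices(img.shape)
    return (rows * img).sum() / total, (cols * img).sum() / total

# The pointing error fed to the beam-steering mechanism is the offset of the
# centroid from the desired track point (e.g., detector centre plus point-ahead).
frame = np.zeros((64, 64))
frame[30:33, 40:43] = 100.0                          # synthetic spot
err_row, err_col = np.subtract(spot_centroid(frame), (31.5, 31.5))
```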

  9. LaserCom System Architecture With Reduced Complexity

    NASA Technical Reports Server (NTRS)

    Lesh, James R. (Inventor); Chen, Chien-Chung (Inventor); Ansari, Homa-Yoon (Inventor)

    1996-01-01

    Spatial acquisition and precision beam pointing functions are critical to spaceborne laser communication systems. In the present invention a single high bandwidth CCD detector is used to perform both spatial acquisition and tracking functions. Compared to previous lasercom hardware design, the array tracking concept offers reduced system complexity by reducing the number of optical elements in the design. Specifically, the design requires only one detector and one beam steering mechanism. It also provides means to optically close the point-ahead control loop. The technology required for high bandwidth array tracking was examined and shown to be consistent with current state of the art. The single detector design can lead to a significantly reduced system complexity and a lower system cost.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yan; Cheng, Yongqiang; Li, Juchuan

    Transition metal (TM) substitution has been widely applied to modify the crystal structure of complex oxides and create high-energy-density electrode materials for high-performance rechargeable lithium-ion batteries. The complex local structure imparted to the oxides by the TM arrangement often affects their electrochemical behavior by influencing the diffusion and intercalation of lithium. Here, a major discrepancy is demonstrated between the global and local structures of the promising high-energy-density, high-voltage LiNi0.5Mn1.5O4 spinel cathode material that contradicts the existing structural models. A new single-phase lattice-cell orientation disorder model is proposed as the mechanism for the local ordering, explaining how the inhomogeneous local distortions and their coherent connection give rise to the global structure of the complex oxide. As a result, the single-phase model is consistent with the observed electrochemical behavior of the material.

  11. Exhaust emission reduction for intermittent combustion aircraft engines

    NASA Technical Reports Server (NTRS)

    Moffett, R. N.

    1979-01-01

    Three concepts for optimizing the performance, increasing the fuel economy, and reducing exhaust emission of the piston aircraft engine were investigated. High energy-multiple spark discharge and spark plug tip penetration, ultrasonic fuel vaporization, and variable valve timing were evaluated individually. Ultrasonic fuel vaporization did not demonstrate sufficient improvement in distribution to offset the performance loss caused by the additional manifold restriction. High energy ignition and revised spark plug tip location provided no change in performance or emissions. Variable valve timing provided some performance benefit; however, even greater performance improvement was obtained through induction system tuning which could be accomplished with far less complexity.

  12. User's guide for a computer program for calculating the zero-lift wave drag of complex aircraft configurations

    NASA Technical Reports Server (NTRS)

    Craidon, C. B.

    1983-01-01

    A computer program was developed to extend the geometry input capabilities of previous versions of a supersonic zero lift wave drag computer program. The arbitrary geometry input description is flexible enough to describe almost any complex aircraft concept, so that highly accurate wave drag analysis can now be performed because complex geometries can be represented accurately and do not have to be modified to meet the requirements of a restricted input format.

  13. Radiationless Transitions and Excited-State Absorption of Low-Field Chromium Complexes in Solids

    DTIC Science & Technology

    1989-07-20

    Final report (1/1/86-5/31/89) on radiationless transitions and excited-state absorption of low-field chromium complexes in solids. Keywords: chromium complexes; tunable lasers; high pressure; photoluminescence. The surviving abstract fragment refers to host-lattice modes and, for the scandium compound with 5% chromium concentration, to the localized a1g and t2g modes.

  14. Optimizing Irregular Applications for Energy and Performance on the Tilera Many-core Architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chavarría-Miranda, Daniel; Panyala, Ajay R.; Halappanavar, Mahantesh

    Optimizing applications simultaneously for energy and performance is a complex problem. High-performance, parallel, irregular applications are notoriously hard to optimize due to their data-dependent memory accesses, lack of structured locality, and complex data structures and code patterns. Irregular kernels are growing in importance in applications such as machine learning, graph analytics, and combinatorial scientific computing. Performance- and energy-efficient implementation of these kernels on modern, energy-efficient multicore and many-core platforms is therefore an important and challenging problem. We present results from optimizing two irregular applications, the Louvain method for community detection (Grappolo) and high-performance conjugate gradient (HPCCG), on the Tilera many-core system. We have significantly extended MIT's OpenTuner auto-tuning framework to conduct a detailed study of platform-independent and platform-specific optimizations to improve performance as well as reduce total energy consumption. We explore the optimization design space along three dimensions: memory layout schemes, compiler-based code transformations, and optimization of parallel loop schedules. Using auto-tuning, we demonstrate whole-node energy savings of up to 41% relative to a baseline instantiation, and up to 31% relative to manually optimized variants.
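
    To make the auto-tuning idea concrete, the sketch below runs a simple random search over a small configuration space (memory layout, unroll factor, loop schedule, thread count) and keeps the lowest-energy configuration that meets a runtime bound. The knobs and the measurement model are illustrative assumptions; the actual study used an extended OpenTuner framework.

```python
# Minimal sketch of the auto-tuning idea (random search, not OpenTuner): minimize
# measured energy subject to a runtime bound. The configuration space and the
# run_benchmark() measurement model are illustrative assumptions only.
import random

SPACE = {
    "layout":   ["AoS", "SoA"],
    "unroll":   [1, 2, 4, 8],
    "schedule": ["static", "dynamic", "guided"],
    "threads":  [16, 32, 64],
}

def run_benchmark(config: dict) -> tuple:
    """Assumed hook: build/run the kernel, return (runtime_s, energy_j).
    Replaced here by a synthetic stand-in so the sketch is runnable."""
    runtime = 30.0 / config["threads"] * (1.0 + 0.1 / config["unroll"])
    energy = runtime * (50.0 if config["layout"] == "SoA" else 60.0)
    return runtime, energy

def tune(budget: int = 50, max_runtime_s: float = 60.0, seed: int = 0):
    rng = random.Random(seed)
    best = None
    for _ in range(budget):
        config = {k: rng.choice(v) for k, v in SPACE.items()}
        runtime, energy = run_benchmark(config)
        if runtime <= max_runtime_s and (best is None or energy < best[0]):
            best = (energy, runtime, config)
    return best

print(tune())
```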

  15. Novice Principals in Chile Mobilizing Change for the First Time: Challenges and Opportunities Associated with a School's Readiness for Change

    ERIC Educational Resources Information Center

    Galdames, Sergio; Montecinos, Carmen; Campos, Fabián; Ahumada, Luis; Leiva, María Verónica

    2018-01-01

    In a context characterised by triple sources of accountability demands, principals in Chile are required to mobilise change to raise performance indicators. School improvement is a complex endeavour--a complexity that is intensified for newly appointed principals, particularly when placed in a high-poverty, ineffective school. This article…

  16. R&D100: Lightweight Distributed Metric Service

    ScienceCinema

    Gentile, Ann; Brandt, Jim; Tucker, Tom; Showerman, Mike

    2018-06-12

    On today's High Performance Computing platforms, the complexity of applications and configurations makes efficient use of resources difficult. The Lightweight Distributed Metric Service (LDMS) is monitoring software developed by Sandia National Laboratories to provide detailed metrics of system performance. LDMS provides collection, transport, and storage of data from extreme-scale systems at fidelities and timescales to provide understanding of application and system performance with no statistically significant impact on application performance.

  17. R&D100: Lightweight Distributed Metric Service

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gentile, Ann; Brandt, Jim; Tucker, Tom

    2015-11-19

    On today's High Performance Computing platforms, the complexity of applications and configurations makes efficient use of resources difficult. The Lightweight Distributed Metric Service (LDMS) is monitoring software developed by Sandia National Laboratories to provide detailed metrics of system performance. LDMS provides collection, transport, and storage of data from extreme-scale systems at fidelities and timescales to provide understanding of application and system performance with no statistically significant impact on application performance.

  18. Optical filters for UV to near IR space applications

    NASA Astrophysics Data System (ADS)

    Begou, T.; Krol, H.; Hecquet, Christophe; Bondet, C.; Lumeau, J.; Grèzes-Besset, C.; Lequime, M.

    2017-11-01

    We present hereafter the results on the fabrication of complex optical filters within the Institut Fresnel in close collaboration with CILAS. Bandpass optical filters dedicated to astronomy and space applications, with central wavelengths ranging from ultraviolet to near infrared, were deposited on both sides of glass substrates with performances in very good congruence with theoretical designs. For these applications, the required functions are particularly complex as they must present a very narrow bandwidth as well as a high level of rejection over a broad spectral range. In addition to those severe optical performances, insensitivity to environmental conditions is necessary. For this purpose, robust solutions with particularly stable performances have to be proposed.

  19. Enhanced lithium storage performance of hierarchical CuO nanomaterials with surface fractal characteristics

    NASA Astrophysics Data System (ADS)

    Li, Ang; He, Renyue; Bian, Zhuo; Song, Huaihe; Chen, Xiaohong; Zhou, Jisheng

    2018-06-01

    Self-assembled hierarchical CuO nanostructures with fractal features were prepared by a mild method and exhibited excellent lithium storage properties, some of which demonstrated a high reversible capacity of 827 mAh g^-1 at a rate of 0.1 C. Interestingly, the electrochemical performance was observed to vary with structural complexity: products with higher surface fractal dimensions exhibited higher capacity and better cyclability. Structural and electrochemical analysis methods were used to explore the lithiation kinetics of the samples and the reasons for the outstanding electrochemical performance related to the complexity of the hierarchical nanostructures and the irregularity of their surface and mass distribution.

  20. Complexity of physiological responses decreases in high-stress musical performance.

    PubMed

    Williamon, Aaron; Aufegger, Lisa; Wasley, David; Looney, David; Mandic, Danilo P

    2013-12-06

    For musicians, performing in front of an audience can cause considerable apprehension; indeed, performance anxiety is felt throughout the profession, with wide ranging symptoms arising irrespective of age, skill level and amount of practice. A key indicator of stress is frequency-specific fluctuations in the dynamics of heart rate known as heart rate variability (HRV). Recent developments in sensor technology have made possible the measurement of physiological parameters reflecting HRV non-invasively and outside of the laboratory, opening research avenues for real-time performer feedback to help improve stress management. However, the study of stress using standard algorithms has led to conflicting and inconsistent results. Here, we present an innovative and rigorous approach which combines: (i) a controlled and repeatable experiment in which the physiological response of an expert musician was evaluated in a low-stress performance and a high-stress recital for an audience of 400 people, (ii) a piece of music with varying physical and cognitive demands, and (iii) dynamic stress level assessment with standard and state-of-the-art HRV analysis algorithms such as those within the domain of complexity science which account for higher order stress signatures. We show that this offers new scope for interpreting the autonomic nervous system response to stress in real-world scenarios, with the evolution of stress levels being consistent with the difficulty of the music being played, superimposed on the stress caused by performing in front of an audience. For an emerging class of algorithms that can analyse HRV independent of absolute data scaling, it is shown that complexity science performs a more accurate assessment of average stress levels, thus providing greater insight into the degree of physiological change experienced by musicians when performing in public.
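
    The paper refers generically to complexity-science HRV algorithms; one widely used measure of this kind is sample entropy, sketched below for an RR-interval series. Treat it as an illustration of "complexity decreasing under stress" rather than the authors' exact pipeline.

```python
# Minimal sketch (a common complexity measure, not necessarily the authors' exact
# algorithm): sample entropy of an RR-interval series; lower values indicate
# reduced complexity, the direction reported under high-stress performance.
import numpy as np

def sample_entropy(rr, m: int = 2, r_frac: float = 0.2) -> float:
    """SampEn(m, r) of a 1-D series, with r given as a fraction of the SD."""
    rr = np.asarray(rr, dtype=float)
    r = r_frac * rr.std()

    def match_pairs(length: int) -> int:
        # Overlapping templates of the given length; count pairs whose Chebyshev
        # distance is within r (self-matches excluded).
        t = np.array([rr[i:i + length] for i in range(len(rr) - length + 1)])
        pairs = 0
        for i in range(len(t) - 1):
            dist = np.max(np.abs(t[i + 1:] - t[i]), axis=1)
            pairs += int(np.sum(dist <= r))
        return pairs

    b, a = match_pairs(m), match_pairs(m + 1)
    return float(-np.log(a / b)) if a > 0 and b > 0 else float("inf")
```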

  1. Complexity of physiological responses decreases in high-stress musical performance

    PubMed Central

    Williamon, Aaron; Aufegger, Lisa; Wasley, David; Looney, David; Mandic, Danilo P.

    2013-01-01

    For musicians, performing in front of an audience can cause considerable apprehension; indeed, performance anxiety is felt throughout the profession, with wide ranging symptoms arising irrespective of age, skill level and amount of practice. A key indicator of stress is frequency-specific fluctuations in the dynamics of heart rate known as heart rate variability (HRV). Recent developments in sensor technology have made possible the measurement of physiological parameters reflecting HRV non-invasively and outside of the laboratory, opening research avenues for real-time performer feedback to help improve stress management. However, the study of stress using standard algorithms has led to conflicting and inconsistent results. Here, we present an innovative and rigorous approach which combines: (i) a controlled and repeatable experiment in which the physiological response of an expert musician was evaluated in a low-stress performance and a high-stress recital for an audience of 400 people, (ii) a piece of music with varying physical and cognitive demands, and (iii) dynamic stress level assessment with standard and state-of-the-art HRV analysis algorithms such as those within the domain of complexity science which account for higher order stress signatures. We show that this offers new scope for interpreting the autonomic nervous system response to stress in real-world scenarios, with the evolution of stress levels being consistent with the difficulty of the music being played, superimposed on the stress caused by performing in front of an audience. For an emerging class of algorithms that can analyse HRV independent of absolute data scaling, it is shown that complexity science performs a more accurate assessment of average stress levels, thus providing greater insight into the degree of physiological change experienced by musicians when performing in public. PMID:24068177

  2. Reaction of CO2 with propylene oxide and styrene oxide catalyzed by a chromium(III) amine-bis(phenolate) complex.

    PubMed

    Dean, Rebecca K; Devaine-Pressing, Katalin; Dawe, Louise N; Kozak, Christopher M

    2013-07-07

    A diamine-bis(phenolate) chromium(III) complex, {CrCl[O2NN'](BuBu)}2 catalyzes the copolymerization of propylene oxide with carbon dioxide. The synthesis of this metal complex is straightforward and it can be obtained in high yields. This catalyst incorporates a tripodal amine-bis(phenolate) ligand, which differs from the salen or salan ligands typically used with Cr and Co complexes that have been employed as catalysts for the synthesis of such polycarbonates. The catalyst reported herein yields low molecular weight polymers with narrow polydispersities when the reaction is performed at room temperature. Performing the reaction at elevated temperatures causes the selective synthesis of propylene carbonate. The copolymerization activity for propylene oxide and carbon dioxide, as well as the coupling of carbon dioxide and styrene oxide to give styrene carbonate are presented.

  3. High temperature solar thermal receiver

    NASA Technical Reports Server (NTRS)

    1979-01-01

    A design concept for a high temperature solar thermal receiver to operate at 3 atmospheres pressure and 2500 F outlet was developed. The performance and complexity of windowed matrix, tube-header, and extended surface receivers were evaluated. The windowed matrix receiver proved to offer substantial cost and performance benefits. An efficient and cost effective hardware design was evaluated for a receiver which can be readily interfaced to fuel and chemical processes or to heat engines for power generation.

  4. Use of Continuous Integration Tools for Application Performance Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vergara Larrea, Veronica G; Joubert, Wayne; Fuson, Christopher B

    High performance computing systems are becoming increasingly complex, both in node architecture and in the multiple layers of software stack required to compile and run applications. As a consequence, the likelihood is increasing for application performance regressions to occur as a result of routine upgrades of system software components which interact in complex ways. The purpose of this study is to evaluate the effectiveness of continuous integration tools for application performance monitoring on HPC systems. In addition, this paper describes a prototype system for application performance monitoring based on Jenkins, a Java-based continuous integration tool. The monitoring system described leverages several features in Jenkins to track application performance results over time. Preliminary results and lessons learned from monitoring applications on Cray systems at the Oak Ridge Leadership Computing Facility are presented.
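
    A toy version of the monitoring logic is sketched below: compare the latest measured runtime against the history collected by earlier CI runs and flag a regression when it exceeds the historical mean by a chosen number of standard deviations. The JSON history file and the threshold rule are assumptions, not the ORNL prototype.

```python
# Minimal sketch (not the Jenkins-based prototype): flag a performance regression
# when the latest runtime exceeds the historical mean by more than k standard
# deviations; history is assumed to be a JSON list written by earlier CI runs.
import json
import statistics
from pathlib import Path

def check_regression(history_file: Path, new_runtime_s: float, k: float = 3.0) -> bool:
    runtimes = json.loads(history_file.read_text()) if history_file.exists() else []
    regression = False
    if len(runtimes) >= 5:                    # need enough history to be meaningful
        mean = statistics.fmean(runtimes)
        stdev = statistics.stdev(runtimes)
        regression = new_runtime_s > mean + k * stdev
    runtimes.append(new_runtime_s)
    history_file.write_text(json.dumps(runtimes))
    return regression                         # a CI job would fail or alert on True
```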

  5. RTD fluxgate performance for application in magnetic label-based bioassay: preliminary results.

    PubMed

    Ando, B; Ascia, A; Baglio, S; Bulsara, A R; Trigona, C; In, V

    2006-01-01

    Magnetic bioassay is becoming of great interest in several applications, including magnetic separation, drug delivery, hyperthermia treatments, magnetic resonance imaging (MRI) and magnetic labelling. The latter can be used to localize bio-entities (e.g. cancer tissues) by using magnetic markers and highly sensitive detectors. SQUIDs can be adopted for this purpose; however, this results in a rather sophisticated and complex method involving high cost and a complex set-up. In this paper, the possibility of adopting RTD fluxgate magnetometers as an alternative low-cost solution to perform magnetic bio-sensing is investigated. Experimental results are presented that encourage pursuing this approach in order to obtain simple devices that can detect a certain number of magnetic particles accumulated onto a small surface, so as to be useful for diagnostic purposes.

  6. Anti-aliasing filter design on spaceborne digital receiver

    NASA Astrophysics Data System (ADS)

    Yu, Danru; Zhao, Chonghui

    2009-12-01

    In recent years, with the development of satellite observation technologies, more and more active remote sensing technologies have been adopted in spaceborne systems. Spaceborne precipitation radar depends heavily on high-performance digital processing to collect meaningful rain-echo data, which increases the complexity of the spaceborne system and requires a high-performance, reliable digital receiver. This paper analyzes the frequency aliasing in intermediate-frequency signal sampling during digital down conversion (DDC) in spaceborne radar and presents an effective digital filter. By analysis and calculation, we choose reasonable parameters for the half-band filters to suppress frequency aliasing in the DDC. Compared with a traditional filter, the FPGA resource cost in our system is reduced by over 50%. This effectively reduces the complexity of the spaceborne digital receiver and improves the reliability of the system.
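
    The half-band idea can be illustrated with a short sketch: a lowpass FIR with its cutoff at half the Nyquist frequency, applied before discarding every other sample. The filter length below is illustrative, not the paper's design value.

```python
# Minimal sketch (illustrative parameters, not the paper's design): a near-half-band
# lowpass FIR used as the anti-aliasing filter before decimating by 2 in a DDC chain.
import numpy as np
from scipy import signal

taps = signal.firwin(63, 0.5)   # cutoff at half Nyquist -> half-band-like response

def decimate_by_2(x: np.ndarray) -> np.ndarray:
    """Filter, then keep every other sample, suppressing aliasing into the new band."""
    return signal.lfilter(taps, 1.0, x)[::2]

# Nearly half of the taps of an odd-length half-band design are (close to) zero,
# which is what makes the structure cheap in FPGA logic.
print(int(np.sum(np.abs(taps) < 1e-3)), "of", len(taps), "taps are near zero")
```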

  7. Lifetime Occupation and Late-Life Cognitive Performance Among Women.

    PubMed

    Ribeiro, Pricila Cristina Correa; Lourenço, Roberto Alves

    2015-01-01

    We examined whether women who had regular jobs throughout life performed better cognitively than older adult housewives. Linear regression was used to compare global cognitive performance scores of housewives (G1) and women exposed to work of low (G2) and high (G3) complexity. The sample comprised 477 older adult Brazilian women, 430 (90.4%) of whom had performed lifelong jobs. In work with data, the G2 group's cognitive performance scores were 1.73 points higher (p =.03), and the G3 group scored 1.76 points (p =.02) higher, than the G1. In work with things and with people, the G3 scored, respectively, 2.04 (p <.01) and 2.21 (p <.01) cognitive test points higher than the G1. Based on our findings we suggest occupation of greater complexity is associated with better cognitive performance in women later in life.

  8. Design of AN Intelligent Individual Evacuation Model for High Rise Building Fires Based on Neural Network Within the Scope of 3d GIS

    NASA Astrophysics Data System (ADS)

    Atila, U.; Karas, I. R.; Turan, M. K.; Rahman, A. A.

    2013-09-01

    Fire is without doubt one of the most dangerous disasters threatening the high-rise and complex buildings of today's world, which hold thousands of occupants. Considering the high occupancy and the complexity of such buildings, it is clear that performing a rapid and safe evacuation is difficult, and disasters such as the World Trade Center attacks of 9/11 remain painful memories. Therefore, it is very important to design knowledge-based, real-time, interactive evacuation methods instead of classical strategies, which lack flexibility. This paper presents a 3D-GIS implementation which simulates the behaviour of an intelligent indoor pedestrian navigation model proposed for the self-evacuation of a person in case of fire. The model is based on the Multilayer Perceptron (MLP), one of the most widely used artificial neural network architectures for classification and prediction problems. A sample fire scenario following predefined instructions was run on the 3D model of the Corporation Complex in Putrajaya (Malaysia), and the intelligent evacuation process was realized within the proposed 3D-GIS-based simulation.
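
    As an illustration of the modelling idea (not the paper's trained network or data), the sketch below fits a small MLP that maps a pedestrian's local state to a discrete next action, the kind of decision the navigation model issues at each step of the evacuation.

```python
# Minimal sketch (synthetic features and labels, not the paper's dataset): an MLP
# mapping a pedestrian's local state to the next evacuation action.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
# Assumed features: [distance_to_exit, smoke_density, corridor_width, floor_level]
X = rng.random((2000, 4))
# Assumed actions: 0 = continue along corridor, 1 = take stairs, 2 = reroute
y = rng.integers(0, 3, size=2000)

model = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=1)
model.fit(X, y)

state = np.array([[0.4, 0.8, 0.2, 0.6]])      # current situation of one occupant
next_action = model.predict(state)[0]
```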

  9. High performance network and channel-based storage

    NASA Technical Reports Server (NTRS)

    Katz, Randy H.

    1991-01-01

    In the traditional mainframe-centered view of a computer system, storage devices are coupled to the system through complex hardware subsystems called input/output (I/O) channels. With the dramatic shift towards workstation-based computing, and its associated client/server model of computation, storage facilities are now found attached to file servers and distributed throughout the network. We discuss the underlying technology trends that are leading to high performance network-based storage, namely advances in networks, storage devices, and I/O controller and server architectures. We review several commercial systems and research prototypes that are leading to a new approach to high performance computing based on network-attached storage.

  10. Chirality sensing with stereodynamic biphenolate zinc complexes.

    PubMed

    Bentley, Keith W; de Los Santos, Zeus A; Weiss, Mary J; Wolf, Christian

    2015-10-01

    Two bidentate ligands consisting of a fluxional polyarylacetylene framework with terminal phenol groups were synthesized. Reaction with diethylzinc gives stereodynamic complexes that undergo distinct asymmetric transformation of the first kind upon binding of chiral amines and amino alcohols. The substrate-to-ligand chirality imprinting at the zinc coordination sphere results in characteristic circular dichroism signals that can be used for direct enantiomeric excess (ee) analysis. This chemosensing approach bears potential for high-throughput ee screening with small sample amounts and reduced solvent waste compared to traditional high-performance liquid chromatography methods. © 2015 Wiley Periodicals, Inc.

  11. Multipurpose electroslag remelting furnace for modern energy and heavy engineering industry

    NASA Astrophysics Data System (ADS)

    Dub, A. V.; Dub, V. S.; Kriger, Yu. N.; Levkov, L. Ya.; Shurygin, D. A.; Kissel'man, M. A.; Nekhamin, C. M.; Chernyak, A. I.; Bessonov, A. V.; Kamantsev, S. V.; Sokolov, S. O.

    2012-12-01

    In 2011, a unique complex based on a multipurpose unit-type electroslag remelting (ESR) furnace was created to meet the demand for large, high-quality solid and hollow billets for the products of the power, atomic, petrochemical, and heavy machine engineering industries. This complex has modern low-frequency power supplies with a new control level that ensure high homogeneity and quality of the billets and an increase in the engineering and economic performance of the production. A unique pilot ESR furnace was erected to adjust technological conditions and the main control system elements.

  12. Heterogeneous Monolithic Integration of Single-Crystal Organic Materials.

    PubMed

    Park, Kyung Sun; Baek, Jangmi; Park, Yoonkyung; Lee, Lynn; Hyon, Jinho; Koo Lee, Yong-Eun; Shrestha, Nabeen K; Kang, Youngjong; Sung, Myung Mo

    2017-02-01

    Manufacturing high-performance organic electronic circuits requires the effective heterogeneous integration of different nanoscale organic materials with uniform morphology and high crystallinity in a desired arrangement. In particular, the development of high-performance organic electronic and optoelectronic devices relies on high-quality single crystals that show optimal intrinsic charge-transport properties and electrical performance. Moreover, the heterogeneous integration of organic materials on a single substrate in a monolithic way is highly demanded for the production of fundamental organic electronic components as well as complex integrated circuits. Many of the various methods that have been designed to pattern multiple heterogeneous organic materials on a substrate and the heterogeneous integration of organic single crystals with their crystal growth are described here. Critical issues that have been encountered in the development of high-performance organic integrated electronics are also addressed. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Complexity control algorithm based on adaptive mode selection for interframe coding in high efficiency video coding

    NASA Astrophysics Data System (ADS)

    Chen, Gang; Yang, Bing; Zhang, Xiaoyun; Gao, Zhiyong

    2017-07-01

    The latest high efficiency video coding (HEVC) standard significantly increases encoding complexity in order to improve coding efficiency. Due to the limited computational capability of handheld devices, complexity-constrained video coding has drawn great attention in recent years. A complexity control algorithm based on adaptive mode selection is proposed for interframe coding in HEVC. Given the direct proportionality between encoding time and computational complexity, computational complexity is measured in terms of encoding time. First, complexity is mapped to a target in terms of prediction modes. Then, an adaptive mode selection algorithm is proposed for the mode decision process. Specifically, the optimal mode combination scheme, chosen through offline statistics, is developed at low complexity. If the complexity budget has not been used up, an adaptive mode sorting method is employed to further improve coding efficiency. The experimental results show that the proposed algorithm achieves a very large complexity control range (as low as 10%) for the HEVC encoder while maintaining good rate-distortion performance. For the low-delay P condition, compared with the direct resource allocation method and the state-of-the-art method, an average gain of 0.63 and 0.17 dB in BD-PSNR is observed for 18 sequences when the target complexity is around 40%.
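
    The sketch below illustrates one way such a controller could be structured (not the paper's algorithm): spread the remaining share of the target encoding time over the remaining coding units, then enable prediction modes, cheapest first, until that per-CU share is used. The relative mode costs are assumptions.

```python
# Minimal sketch of adaptive mode selection (illustrative costs, not the paper's
# algorithm): budget the remaining target time over remaining CUs, then greedily
# enable modes until the per-CU share of a full mode search is exhausted.
MODE_COST = {"SKIP/MERGE": 0.10, "INTER_2Nx2N": 0.25, "INTER_Nx2N": 0.20,
             "INTER_2NxN": 0.20, "INTRA": 0.25}   # fractions of a full search (assumed)

def per_cu_share(target: float, full_cu_time: float,
                 time_spent: float, cus_done: int, cus_total: int) -> float:
    """Fraction of a full mode search allowed for the next CU."""
    total_target = target * full_cu_time * cus_total
    remaining_cus = max(cus_total - cus_done, 1)
    return max(total_target - time_spent, 0.0) / (remaining_cus * full_cu_time)

def select_modes(share: float) -> list:
    """Always keep SKIP/MERGE; add cheaper modes first while the share allows."""
    modes, used = ["SKIP/MERGE"], MODE_COST["SKIP/MERGE"]
    for mode, cost in sorted(MODE_COST.items(), key=lambda kv: kv[1]):
        if mode != "SKIP/MERGE" and used + cost <= share:
            modes.append(mode)
            used += cost
    return modes

print(select_modes(per_cu_share(0.4, 1.0, 30.0, 100, 200)))
```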

  14. A design of spectrophotometric microfluidic chip sensor for analyzing silicate in seawater

    NASA Astrophysics Data System (ADS)

    Cao, X.; Zhang, S. W.; Chu, D. Z.; Wu, N.; Ma, H. K.; Liu, Y.

    2017-08-01

    High-quality and continuous in situ silicate data are required to investigate the mechanisms of biogeochemical cycles and the formation of red tides. There is an urgent and growing need for autonomous in situ silicate instruments that perform determination on various platforms. However, due to high reagent and power consumption, as well as high system complexity leading to low reliability and robustness, the performance of commercially available silicate sensors is not satisfactory. To address these problems, we present a new generation of microfluidic continuous-flow-analysis silicate sensor with sufficient analytical performance and robustness for in situ determination of soluble silicate in seawater. The sensor is based on the reaction of silicate with ammonium molybdate to form a yellow silicomolybdate complex, followed by reduction to silicomolybdenum blue by ascorbic acid. The limit of detection was 45.1 nmol L^-1, and the linear determination range of the sensor is 0-400 μmol L^-1. The recovery rate for real water samples is between 98.1% and 104.0%, and the analysis cycle of the sensor is about 5 minutes. The sensor offers high accuracy, high integration, low water consumption, and strong anti-interference ability. It has been successfully applied to measuring silicate in seawater in Jiaozhou Bay.
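
    The colorimetric determination ultimately reduces to a calibration line relating absorbance of the silicomolybdenum blue complex to silicate concentration. The sketch below fits and inverts such a line; the standards and absorbance values are synthetic, not the sensor's calibration data.

```python
# Minimal sketch (synthetic calibration data): fit a linear absorbance-vs-concentration
# calibration for the silicomolybdenum-blue method and invert it for samples.
import numpy as np

standards_umol_l = np.array([0.0, 50.0, 100.0, 200.0, 400.0])   # assumed standards
absorbance = np.array([0.002, 0.061, 0.118, 0.239, 0.476])      # assumed readings

slope, intercept = np.polyfit(standards_umol_l, absorbance, 1)  # A = slope*C + intercept

def silicate_umol_per_l(sample_absorbance: float) -> float:
    """Invert the calibration line to recover concentration from absorbance."""
    return (sample_absorbance - intercept) / slope

print(round(silicate_umol_per_l(0.150), 1))   # ~125 umol/L for A = 0.150 in this example
```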

  15. Tailoring Thermodynamics and Kinetics for Hydrogen Storage in Complex Hydrides towards Applications.

    PubMed

    Liu, Yongfeng; Yang, Yaxiong; Gao, Mingxia; Pan, Hongge

    2016-02-01

    Solid-state hydrogen storage using various materials is expected to provide the ultimate solution for safe and efficient on-board storage. Complex hydrides have attracted increasing attention over the past two decades due to their high gravimetric and volumetric hydrogen densities. In this account, we review studies from our lab on tailoring the thermodynamics and kinetics for hydrogen storage in complex hydrides, including metal alanates, borohydrides and amides. By changing the material composition and structure, developing feasible preparation methods, doping high-performance catalysts, optimizing multifunctional additives, creating nanostructures and understanding the interaction mechanisms with hydrogen, the operating temperatures for hydrogen storage in metal amides, alanates and borohydrides are remarkably reduced. This temperature reduction is associated with enhanced reaction kinetics and improved reversibility. The examples discussed in this review are expected to provide new inspiration for the development of complex hydrides with high hydrogen capacity and appropriate thermodynamics and kinetics for hydrogen storage. © 2015 The Chemical Society of Japan & Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
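
    The thermodynamic tailoring described here is usually reasoned about with the van 't Hoff relation, ln(p_eq/p0) = -ΔH/(RT) + ΔS/R, which links the reaction enthalpy and entropy to the temperature at which the equilibrium hydrogen pressure reaches 1 bar. The sketch below evaluates that relation for illustrative values, not figures taken from the review.

```python
# Minimal sketch of the van 't Hoff reasoning (illustrative values, not from the
# review): ln(p_eq / 1 bar) = -dH/(R*T) + dS/R, so p_eq = 1 bar at T = dH / dS.
from math import exp

R = 8.314  # J mol^-1 K^-1

def t_at_1bar(delta_h_kj_per_mol: float, delta_s_j_per_mol_k: float) -> float:
    """Temperature (K) at which the equilibrium hydrogen pressure reaches 1 bar."""
    return delta_h_kj_per_mol * 1000.0 / delta_s_j_per_mol_k

def p_eq_bar(delta_h_kj_per_mol: float, delta_s_j_per_mol_k: float, t_k: float) -> float:
    """Equilibrium pressure (bar) at temperature t_k for a desorption reaction."""
    return exp(-delta_h_kj_per_mol * 1000.0 / (R * t_k) + delta_s_j_per_mol_k / R)

# Lowering dH from ~60 to ~40 kJ/mol H2 (with dS ~ 130 J/(mol K)) moves the 1-bar
# desorption temperature from about 460 K down to about 310 K.
print(round(t_at_1bar(60, 130)), round(t_at_1bar(40, 130)))
```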

  16. Interlaboratory Study Characterizing a Yeast Performance Standard for Benchmarking LC-MS Platform Performance*

    PubMed Central

    Paulovich, Amanda G.; Billheimer, Dean; Ham, Amy-Joan L.; Vega-Montoto, Lorenzo; Rudnick, Paul A.; Tabb, David L.; Wang, Pei; Blackman, Ronald K.; Bunk, David M.; Cardasis, Helene L.; Clauser, Karl R.; Kinsinger, Christopher R.; Schilling, Birgit; Tegeler, Tony J.; Variyath, Asokan Mulayath; Wang, Mu; Whiteaker, Jeffrey R.; Zimmerman, Lisa J.; Fenyo, David; Carr, Steven A.; Fisher, Susan J.; Gibson, Bradford W.; Mesri, Mehdi; Neubert, Thomas A.; Regnier, Fred E.; Rodriguez, Henry; Spiegelman, Cliff; Stein, Stephen E.; Tempst, Paul; Liebler, Daniel C.

    2010-01-01

    Optimal performance of LC-MS/MS platforms is critical to generating high quality proteomics data. Although individual laboratories have developed quality control samples, there is no widely available performance standard of biological complexity (and associated reference data sets) for benchmarking of platform performance for analysis of complex biological proteomes across different laboratories in the community. Individual preparations of the yeast Saccharomyces cerevisiae proteome have been used extensively by laboratories in the proteomics community to characterize LC-MS platform performance. The yeast proteome is uniquely attractive as a performance standard because it is the most extensively characterized complex biological proteome and the only one associated with several large scale studies estimating the abundance of all detectable proteins. In this study, we describe a standard operating protocol for large scale production of the yeast performance standard and offer aliquots to the community through the National Institute of Standards and Technology where the yeast proteome is under development as a certified reference material to meet the long term needs of the community. Using a series of metrics that characterize LC-MS performance, we provide a reference data set demonstrating typical performance of commonly used ion trap instrument platforms in expert laboratories; the results provide a basis for laboratories to benchmark their own performance, to improve upon current methods, and to evaluate new technologies. Additionally, we demonstrate how the yeast reference, spiked with human proteins, can be used to benchmark the power of proteomics platforms for detection of differentially expressed proteins at different levels of concentration in a complex matrix, thereby providing a metric to evaluate and minimize preanalytical and analytical variation in comparative proteomics experiments. PMID:19858499

  17. Brain and effort: brain activation and effort-related working memory in healthy participants and patients with working memory deficits.

    PubMed

    Engström, Maria; Landtblom, Anne-Marie; Karlsson, Thomas

    2013-01-01

    Despite the interest in the neuroimaging of working memory, little is still known about the neurobiology of complex working memory in tasks that require simultaneous manipulation and storage of information. In addition to the central executive network, we assumed that the recently described salience network [involving the anterior insular cortex (AIC) and the anterior cingulate cortex (ACC)] might be of particular importance to working memory tasks that require complex, effortful processing. Healthy participants (n = 26) and participants suffering from working memory problems related to the Kleine-Levin syndrome (KLS) (a specific form of periodic idiopathic hypersomnia; n = 18) participated in the study. Participants were further divided into a high- and low-capacity group, according to performance on a working memory task (listening span). In a functional magnetic resonance imaging (fMRI) study, participants were administered the reading span complex working memory task tapping cognitive effort. The fMRI-derived blood oxygen level dependent (BOLD) signal was modulated by (1) effort in both the central executive and the salience network and (2) capacity in the salience network in that high performers evidenced a weaker BOLD signal than low performers. In the salience network there was a dichotomy between the left and the right hemisphere; the right hemisphere elicited a steeper increase of the BOLD signal as a function of increasing effort. There was also a stronger functional connectivity within the central executive network because of increased task difficulty. The ability to allocate cognitive effort in complex working memory is contingent upon focused resources in the executive and in particular the salience network. Individual capacity during the complex working memory task is related to activity in the salience (but not the executive) network so that high-capacity participants evidence a lower signal and possibly hence a larger dynamic response.

  18. Brain and effort: brain activation and effort-related working memory in healthy participants and patients with working memory deficits

    PubMed Central

    Engström, Maria; Landtblom, Anne-Marie; Karlsson, Thomas

    2013-01-01

    Despite the interest in the neuroimaging of working memory, little is still known about the neurobiology of complex working memory in tasks that require simultaneous manipulation and storage of information. In addition to the central executive network, we assumed that the recently described salience network [involving the anterior insular cortex (AIC) and the anterior cingulate cortex (ACC)] might be of particular importance to working memory tasks that require complex, effortful processing. Method: Healthy participants (n = 26) and participants suffering from working memory problems related to the Kleine–Levin syndrome (KLS) (a specific form of periodic idiopathic hypersomnia; n = 18) participated in the study. Participants were further divided into a high- and low-capacity group, according to performance on a working memory task (listening span). In a functional magnetic resonance imaging (fMRI) study, participants were administered the reading span complex working memory task tapping cognitive effort. Principal findings: The fMRI-derived blood oxygen level dependent (BOLD) signal was modulated by (1) effort in both the central executive and the salience network and (2) capacity in the salience network in that high performers evidenced a weaker BOLD signal than low performers. In the salience network there was a dichotomy between the left and the right hemisphere; the right hemisphere elicited a steeper increase of the BOLD signal as a function of increasing effort. There was also a stronger functional connectivity within the central executive network because of increased task difficulty. Conclusion: The ability to allocate cognitive effort in complex working memory is contingent upon focused resources in the executive and in particular the salience network. Individual capacity during the complex working memory task is related to activity in the salience (but not the executive) network so that high-capacity participants evidence a lower signal and possibly hence a larger dynamic response. PMID:23616756

  19. Large Eddy Simulation of High Reynolds Number Complex Flows

    NASA Astrophysics Data System (ADS)

    Verma, Aman

    Marine configurations are subject to a variety of complex hydrodynamic phenomena affecting the overall performance of the vessel. The turbulent flow affects the hydrodynamic drag, propulsor performance and structural integrity, control-surface effectiveness, and acoustic signature of the marine vessel. Due to advances in massively parallel computers and numerical techniques, an unsteady numerical simulation methodology such as Large Eddy Simulation (LES) is well suited to study such complex turbulent flows, whose Reynolds numbers (Re) are typically on the order of 10^6. LES also promises increased accuracy over RANS-based methods in predicting unsteady phenomena such as cavitation and noise production. This dissertation develops the capability to enable LES of high-Re flows in complex geometries (e.g. a marine vessel) on unstructured grids and provides physical insight into the turbulent flow. LES is performed to investigate the geometry-induced separated flow past a marine propeller attached to a hull, in an off-design condition called crashback. LES shows good quantitative agreement with experiments and provides a physical mechanism to explain the increase in side-force on the propeller blades below an advance ratio of J = -0.7. Fundamental developments in the dynamic subgrid-scale model for LES are pursued to improve the LES predictions, especially for complex flows on unstructured grids. A dynamic procedure is proposed to estimate a Lagrangian time scale based on a surrogate correlation without any adjustable parameter. The proposed model is applied to turbulent channel, cylinder and marine propeller flows and predicts improved results over other model variants due to a physically consistent Lagrangian time scale. A wall model is proposed for application to LES of high Reynolds number wall-bounded flows. The wall model is formulated as the minimization of a generalized constraint in the dynamic model for LES and applied to LES of turbulent channel flow at various Reynolds numbers up to Reτ = 10000 and coarse grid resolutions, obtaining significant improvement.

  20. High performance MPEG-audio decoder IC

    NASA Technical Reports Server (NTRS)

    Thorn, M.; Benbassat, G.; Cyr, K.; Li, S.; Gill, M.; Kam, D.; Walker, K.; Look, P.; Eldridge, C.; Ng, P.

    1993-01-01

    The emerging digital audio and video compression technology brings both an opportunity and a new challenge to IC design. The pervasive application of compression technology to consumer electronics will require high volume, low cost IC's and fast time to market of the prototypes and production units. At the same time, the algorithms used in the compression technology result in complex VLSI IC's. The conflicting challenges of algorithm complexity, low cost, and fast time to market have an impact on device architecture and design methodology. The work presented in this paper is about the design of a dedicated, high precision, Motion Picture Expert Group (MPEG) audio decoder.

  1. Pilot workload, performance and aircraft control automation

    NASA Technical Reports Server (NTRS)

    Hart, S. G.; Sheridan, T. B.

    1984-01-01

    Conceptual and practical issues associated with the design, operation, and performance of advanced systems and the impact of such systems on the human operators are reviewed. The development of highly automated systems is driven by the availability of new technology and the requirement that operators safely and economically perform more and more activities in increasingly difficult and hostile environments. It is noted that the operator's workload may become a major area of concern in future design considerations. Little research has been done to determine how automation and workload relate to each other, although it is assumed that the abstract, supervisory, or management roles performed by operators of highly automated systems will impose increased mental workload. The relationship between performance and workload is discussed in relation to highly complex and automated environments.

  2. On the impact of communication complexity in the design of parallel numerical algorithms

    NASA Technical Reports Server (NTRS)

    Gannon, D.; Vanrosendale, J.

    1984-01-01

    This paper describes two models of the cost of data movement in parallel numerical algorithms. One model is a generalization of an approach due to Hockney, and is suitable for shared memory multiprocessors where each processor has vector capabilities. The other model is applicable to highly parallel nonshared memory MIMD systems. In the second model, algorithm performance is characterized in terms of the communication network design. Techniques used in VLSI complexity theory are also brought in, and algorithm independent upper bounds on system performance are derived for several problems that are important to scientific computation.
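
    For the shared-memory case, the Hockney-style model in its generic form expresses transfer time as a startup latency plus message size over asymptotic bandwidth. The sketch below evaluates that form and the derived half-performance message size; the parameterization is the textbook one, not necessarily the paper's exact generalization.

```python
# Minimal sketch of a Hockney-style cost model (generic textbook form, not the
# paper's exact generalization): t(n) = t0 + n / r_inf.
def transfer_time(n_bytes: float, t0_s: float, r_inf_bytes_per_s: float) -> float:
    """Time to move an n-byte message: startup latency plus bandwidth term."""
    return t0_s + n_bytes / r_inf_bytes_per_s

def n_half(t0_s: float, r_inf_bytes_per_s: float) -> float:
    """Message size at which half the asymptotic bandwidth is achieved."""
    return t0_s * r_inf_bytes_per_s

# Example: a 1 us startup latency on a 10 GB/s link. An 8 KB message is still
# latency-bound and reaches only about 45% of the peak effective bandwidth.
t = transfer_time(8192, 1e-6, 10e9)
print(8192 / t / 10e9)   # effective fraction of peak bandwidth, ~0.45
```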

  3. Field programmable gate array-assigned complex-valued computation and its limits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernard-Schwarz, Maria; Zwick, Wolfgang

    We discuss how leveraging Field Programmable Gate Array (FPGA) technology as part of a high performance computing platform reduces latency to meet the demanding real-time constraints of a quantum optics simulation. Implementations of complex-valued operations using fixed-point arithmetic on a Virtex-5 FPGA compare favorably to more conventional solutions on a central processing unit. Our investigation explores the performance of multiple fixed-point options along with a traditional 64-bit floating-point version. With this information, the lowest execution times can be estimated. Relative error is examined to ensure simulation accuracy is maintained.
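
    As an illustration of the comparison the abstract describes, the sketch below performs a complex multiplication in a signed Q1.15 fixed-point format and measures the relative error against double precision; the word length is an assumed example, not the paper's chosen format.

```python
# Minimal sketch (illustrative Q1.15 word length, not the paper's format): a
# fixed-point complex multiply and its relative error against float64.
FRAC_BITS = 15
SCALE = 1 << FRAC_BITS

def to_fixed(x: float) -> int:
    return int(round(x * SCALE))

def fixed_complex_mul(a, b):
    """(ar + j*ai) * (br + j*bi) with products rescaled back to Q1.15."""
    ar, ai = a
    br, bi = b
    real = (ar * br - ai * bi) >> FRAC_BITS
    imag = (ar * bi + ai * br) >> FRAC_BITS
    return real, imag

x, y = 0.4 + 0.3j, -0.7 + 0.2j
fr, fi = fixed_complex_mul((to_fixed(x.real), to_fixed(x.imag)),
                           (to_fixed(y.real), to_fixed(y.imag)))
approx = complex(fr / SCALE, fi / SCALE)
print(abs(approx - x * y) / abs(x * y))   # relative error, small while values fit Q1.15
```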

  4. On the impact of communication complexity on the design of parallel numerical algorithms

    NASA Technical Reports Server (NTRS)

    Gannon, D. B.; Van Rosendale, J.

    1984-01-01

    This paper describes two models of the cost of data movement in parallel numerical algorithms. One model is a generalization of an approach due to Hockney, and is suitable for shared memory multiprocessors where each processor has vector capabilities. The other model is applicable to highly parallel nonshared memory MIMD systems. In this second model, algorithm performance is characterized in terms of the communication network design. Techniques used in VLSI complexity theory are also brought in, and algorithm-independent upper bounds on system performance are derived for several problems that are important to scientific computation.

  5. Manipulation complexity in primates coevolved with brain size and terrestriality

    PubMed Central

    Heldstab, Sandra A.; Kosonen, Zaida K.; Koski, Sonja E.; Burkart, Judith M.; van Schaik, Carel P.; Isler, Karin

    2016-01-01

    Humans occupy by far the most complex foraging niche of all mammals, built around sophisticated technology, and at the same time exhibit unusually large brains. To examine the evolutionary processes underlying these features, we investigated how manipulation complexity is related to brain size, cognitive test performance, terrestriality, and diet quality in a sample of 36 non-human primate species. We categorized manipulation bouts in food-related contexts into unimanual and bimanual actions, and asynchronous or synchronous hand and finger use, and established levels of manipulative complexity using Guttman scaling. Manipulation categories followed a cumulative ranking. They were particularly high in species that use cognitively challenging food acquisition techniques, such as extractive foraging and tool use. Manipulation complexity was also consistently positively correlated with brain size and cognitive test performance. Terrestriality had a positive effect on this relationship, but diet quality did not affect it. Unlike a previous study on carnivores, we found that, among primates, brain size and complex manipulations to acquire food underwent correlated evolution, which may have been influenced by terrestriality. Accordingly, our results support the idea of an evolutionary feedback loop between manipulation complexity and cognition in the human lineage, which may have been enhanced by increasingly terrestrial habits. PMID:27075921

  6. Driver’s Cognitive Workload and Driving Performance under Traffic Sign Information Exposure in Complex Environments: A Case Study of the Highways in China

    PubMed Central

    Lyu, Nengchao; Xie, Lian; Wu, Chaozhong; Fu, Qiang; Deng, Chao

    2017-01-01

    Complex traffic situations and high driving workload are the leading contributing factors to traffic crashes. There is a strong correlation between driving performance and driving workload, such as the visual workload from traffic signs on highway off-ramps. This study aimed to evaluate traffic safety by analyzing drivers’ behavior and performance under cognitive workload in complex environments. First, the driving workload of drivers was tested based on traffic signs with different quantities of information. Forty-four drivers were recruited to conduct a traffic sign cognition experiment under static controlled environment conditions. Different complex traffic signs were used to impose the cognitive workload. The static experiment results reveal that workload is highly related to the amount of information on traffic signs and that reaction time increases with the information grade, while the effects of driving experience and gender are not significant. This shows that the cognitive workload of the subsequent driving experiments can be controlled by the amount of information on traffic signs. Second, driving characteristics and driving performance were analyzed under different secondary-task driving workload levels using a driving simulator. Drivers were required to drive at the required speed on a designed highway off-ramp scene. The cognitive workload was controlled by reading traffic signs with different information, which were divided into four levels. Drivers had to make choices by pushing buttons after reading the traffic signs. Meanwhile, the driving performance information was recorded. Questionnaires on objective workload were collected right after each driving task. The results show that speed maintenance and lane deviations are significantly different under different levels of cognitive workload, and the effects of driving experience and gender groups are significant. The research results can be used to analyze traffic safety in highway environments while taking drivers’ cognition and driving performance into account. PMID:28218696

  7. Power optimization of digital baseband WCDMA receiver components on algorithmic and architectural level

    NASA Astrophysics Data System (ADS)

    Schämann, M.; Bücker, M.; Hessel, S.; Langmann, U.

    2008-05-01

    High data rates combined with high mobility represent a challenge for the design of cellular devices. Advanced algorithms are required which result in higher complexity, more chip area and increased power consumption. However, this contrasts with the limited power supply of mobile devices. This presentation discusses an HSDPA receiver that has been optimized for power consumption, with the focus on the algorithmic and architectural levels. On the algorithmic level, the Rake combiner, Prefilter-Rake equalizer and MMSE equalizer are compared regarding their BER performance. Both equalizer approaches provide a significant increase in performance for high data rates compared to the Rake combiner, which is commonly used for lower data rates. For both equalizer approaches several adaptive algorithms are available which differ in complexity and convergence properties. To identify the algorithm that achieves the required performance with the lowest power consumption, the algorithms have been investigated using SystemC models with respect to their performance and arithmetic complexity. Additionally, for the Prefilter-Rake equalizer, the power estimates of a modified Griffith (LMS) and a Levinson (RLS) algorithm have been compared with the tool ORINOCO supplied by ChipVision. The accuracy of this tool has been verified with a scalable architecture of the UMTS channel estimation described both in SystemC and VHDL targeting a 130 nm CMOS standard cell library. An architecture combining all three approaches with an adaptive control unit is presented. The control unit monitors the current condition of the propagation channel and adjusts receiver parameters such as filter size and oversampling ratio to minimize the power consumption while maintaining the required performance. The optimization strategies result in a reduction of the number of arithmetic operations of up to 70% for single components, which leads to an estimated power reduction of up to 40% while the BER performance is not affected. This work utilizes SystemC and ORINOCO for a first estimation of power consumption at an early step of the design flow. Thereby, algorithms can be compared in different operating modes, including the effects of control units. Here, an algorithm with higher peak complexity and power consumption but more flexibility showed lower consumption in normal operating modes than the algorithm optimized for peak performance.
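
    As a reference point for the complexity/convergence trade-off of the LMS family discussed above, a textbook LMS adaptive filter is sketched below in plain Python. It is not the paper's fixed-point Griffith/Levinson implementation; the tap count, step size and toy channel are assumptions.

```python
# Minimal LMS adaptive equalizer sketch: adapt an FIR filter so that filtering
# the channel output x approximates the desired (transmitted) signal d.
# Each sample costs O(n_taps) operations, the property traded against convergence.
import numpy as np

def lms_equalize(x, d, n_taps=8, mu=0.01):
    w = np.zeros(n_taps)
    y = np.zeros_like(d)
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]   # most recent sample first
        y[n] = w @ u                         # filter output
        e = d[n] - y[n]                      # error signal
        w += mu * e * u                      # LMS weight update
    return w, y

# toy usage: recover a BPSK-like signal passed through a simple 2-tap channel
rng = np.random.default_rng(1)
s = rng.choice([-1.0, 1.0], 5000)
x = np.convolve(s, [1.0, 0.4])[:len(s)]      # channel-distorted observation
w, y = lms_equalize(x, s)
print("final equalizer taps:", np.round(w, 3))
```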

  8. Electrocatalytic oxidation of 2-mercaptoethanol using modified glassy carbon electrode by MWCNT in combination with unsymmetrical manganese (II) Schiff base complexes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mohebbi, Sajjad, E-mail: smohebbi@uok.ac.ir; Eslami, Saadat

    2015-06-15

    Highlights: • High electrocatalytic efficiency and stability of the modified hybrid electrode GC/MWCNTs/MnSaloph. • Direct reflection of the catalytic activity of manganese complexes on the electrocatalytic oxidation of 2-ME. • Decreased overpotential and increased catalytic peak current toward oxidation of 2-ME. • Deposition of a range of novel substituted N₂O₂ Saloph complexes of manganese(II) on GCE/MWCNT. • Enhancement of electrocatalytic oxidation activity upon electron donating substitutions on the Saloph. - Abstract: The performance of a modified hybrid glassy carbon electrode with a composite of carbon nanotubes and manganese complexes for the electrocatalytic oxidation of 2-mercaptoethanol is developed. The GC electrode was modified using MWCNT and new N₂O₂ unsymmetrical tetradentate Schiff base complexes of manganese, namely Manganese Saloph complexes 1-5, with general formula Mn[(5-x-4-y-Sal)(5-x′-4-y′-Sal) Ph], where x, x′ = H, Br, NO₂ and y, y′ = H, MeO. Direct immobilization of CNT on the surface of the GCE is performed by abrasive immobilization, followed by modification with the manganese(II) complexes via a direct deposition method. These novel modified electrodes clearly demonstrate the necessity of modifying bare carbon electrodes to endow them with the desired behavior and were characterized by HRTEM. The complexes were also characterized by elemental analyses, MS, UV–vis and IR spectroscopy. The modified hybrid GC/MWCNT/MnSaloph electrode exhibits strong and stable electrocatalytic activity towards the electrooxidation of 2-mercaptoethanol molecules in comparison with the bare glassy carbon electrode, with the advantages of very low overpotential and high catalytic current. Such ability promotes the thiol’s electron transfer reaction. An electron withdrawing substituent on the Saloph also enhanced the electrocatalytic oxidation activity.

  9. PISA, Policy and Persuasion: Translating Complex Conditions into Education "Best Practice"

    ERIC Educational Resources Information Center

    Auld, Euan; Morris, Paul

    2016-01-01

    Education reform is increasingly portrayed as a means to improve a nation's global competitiveness as measured by its performance in international league tables of pupil achievement. This has created a demand for comparative research which identifies "what works" in high-performing school systems. A diverse array of consultancies, thinks…

  10. A Conceptual Framework for Assessing Performance in Games and Simulations. CRESST Report 771

    ERIC Educational Resources Information Center

    Koenig, Alan D.; Lee, John J.; Iseli, Markus; Wainess, Richard

    2010-01-01

    The military's need for high-fidelity games and simulations is substantial, as these environments can be valuable for demonstration of essential knowledge, skills, and abilities required in complex tasks. However assessing performance in these settings can be difficult--particularly in non-linear simulations where more than one pathway to success…

  11. Ready, Aim, Perform! Targeted Micro-Training for Performance Intervention

    ERIC Educational Resources Information Center

    Carpenter, Julia; Forde, Dahlia S.; Stevens, Denise R.; Flango, Vincent; Babcock, Lisa K.

    2016-01-01

    The Department of Veterans Affairs has an immediate problem at hand. Tens of thousands of employees are working in a high-stress work environment where fast-paced daily production requirements are critical. Employees are faced with a tremendous backlog of veterans' claims. Unfortunately, not only are the claims extremely complex, but there is…

  12. 2013 R&D 100 Award: ‘Miniapps’ Bolster High Performance Computing

    ScienceCinema

    Belak, Jim; Richards, David

    2018-06-12

    Two Livermore computer scientists served on a Sandia National Laboratories-led team that developed Mantevo Suite 1.0, the first integrated suite of small software programs, also called "miniapps," to be made available to the high performance computing (HPC) community. These miniapps facilitate the development of new HPC systems and the applications that run on them. Miniapps (miniature applications) serve as stripped down surrogates for complex, full-scale applications that can require a great deal of time and effort to port to a new HPC system because they often consist of hundreds of thousands of lines of code. The miniapps are a prototype that contains some or all of the essentials of the real application but with many fewer lines of code, making the miniapp more versatile for experimentation. This allows researchers to more rapidly explore options and optimize system design, greatly improving the chances the full-scale application will perform successfully. These miniapps have become essential tools for exploring complex design spaces because they can reliably predict the performance of full applications.

  13. A Project Manager’s Personal Attributes as Predictors for Success

    DTIC Science & Technology

    2007-03-01

    Northouse (2004) explains that leadership is a highly researched topic with much written about it. Yet, a definitive description of this phenomenon is difficult to...express because of its complexity. Even though leadership has varied descriptions and conceptualizations, Northouse states that the concept of...characteristic of leadership is not an accurate predictor of performance. Leadership is a complex, multi-faceted attribute (Northouse, 2004) and specific

  14. The Trainability of Adolescent Soccer Players to Brief Periodized Complex Training.

    PubMed

    Chatzinikolaou, Athanasios; Michaloglou, Konstantinos; Avloniti, Alexandra; Leontsini, Diamanda; Deli, Chariklia K; Vlachopoulos, Dimitris; Gracia-Marco, Luis; Arsenis, Sotirios; Athanailidis, Ioannis; Draganidis, Dimitrios; Jamurtas, Athanasios Z; Williams, Craig A; Fatouros, Ioannis G

    2018-05-01

    To investigate the effect of a complex, short-term strength/power training protocol on performance and body composition of elite early adolescent soccer players. Twenty-two players (14-15 y) were randomly assigned to (1) an experimental group (N = 12; participated in a 5-wk training protocol with traditional multijoint power resistance exercises, Olympic-style lifts, plyometric drills, and speed work; 4 times per week) or (2) a control group (N = 10). Strength and power performance (jumping, speed, change of direction, repeated sprint ability, endurance, isokinetic strength of knee flexors and extensors, maximal strength in various lifts, and speed-endurance) were evaluated pretraining and posttraining. Cessation of training for 5 weeks in the control group induced a marked performance deterioration (∼5%-20%). Training not only prevented strength performance deterioration but also increased it (∼2%-30%). Endurance and repeated sprint ability declined to a smaller extent in experimental group compared with control group (15% vs 7.5%). Isometric strength and body composition remained unaltered in both groups. Results demonstrate that (1) young players exhibit a high level of trainability of their strength/power performance (but not endurance) in response to a short-term complex training protocol during early adolescence, (2) Olympic-style lifts are characterized by increased safety in this age group and appear to be highly effective, (3) lifts incorporating a hip thrust result in increased strength of both knee extensors and flexors, (4) cessation of training for only 5 weeks results in marked deterioration of strength/power and endurance performance, and (5) improvement of strength/power performance may be related to neural-based adaptation as body composition remained unaffected.

  15. Task complexity modulates pilot electroencephalographic activity during real flights.

    PubMed

    Di Stasi, Leandro L; Diaz-Piedra, Carolina; Suárez, Juan; McCamy, Michael B; Martinez-Conde, Susana; Roca-Dorda, Joaquín; Catena, Andrés

    2015-07-01

    Most research connecting task performance and neural activity to date has been conducted in laboratory conditions. Thus, field studies remain scarce, especially in extreme conditions such as during real flights. Here, we investigated the effects of flight procedures of varied complexity on the in-flight EEG activity of military helicopter pilots. Flight procedural complexity modulated the EEG power spectrum: highly demanding procedures (i.e., takeoff and landing) were associated with higher EEG power in the higher frequency bands, whereas less demanding procedures (i.e., flight exercises) were associated with lower EEG power over the same frequency bands. These results suggest that EEG recordings may help to evaluate an operator's cognitive performance in challenging real-life scenarios, and thus could aid in the prevention of catastrophic events. © 2015 Society for Psychophysiological Research.
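
    A minimal, hedged illustration of the kind of band-power comparison the abstract refers to is sketched below in Python, using synthetic signals, an assumed 256 Hz sampling rate and conventional band edges; it is not the authors' processing pipeline.

```python
# Sketch: compare EEG power in standard frequency bands between a "demanding"
# and a "less demanding" epoch using Welch's PSD estimate from SciPy.
import numpy as np
from scipy.signal import welch

fs = 256  # sampling rate in Hz (assumed)
bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_powers(eeg, fs):
    f, pxx = welch(eeg, fs=fs, nperseg=fs * 2)
    df = f[1] - f[0]
    return {name: pxx[(f >= lo) & (f < hi)].sum() * df   # integrate PSD over the band
            for name, (lo, hi) in bands.items()}

rng = np.random.default_rng(2)
epoch_landing = rng.normal(size=fs * 30)   # placeholder for a demanding-procedure epoch
epoch_exercise = rng.normal(size=fs * 30)  # placeholder for a low-demand epoch
p_landing = band_powers(epoch_landing, fs)
p_exercise = band_powers(epoch_exercise, fs)
for name in bands:
    print(f"{name}: landing {p_landing[name]:.4f} vs exercise {p_exercise[name]:.4f}")
```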

  16. Structural Characterization of a Thrombin-Aptamer Complex by High Resolution Native Top-Down Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Zhang, Jiang; Loo, Rachel R. Ogorzalek; Loo, Joseph A.

    2017-09-01

    Native mass spectrometry (MS) with electrospray ionization (ESI) has evolved as an invaluable tool for the characterization of intact native proteins and non-covalently bound protein complexes. Here we report the structural characterization by high resolution native top-down MS of human thrombin and its complex with the Bock thrombin binding aptamer (TBA), a 15-nucleotide DNA with high specificity and affinity for thrombin. Accurate mass measurements revealed that the predominant form of native human α-thrombin contains a glycosylation mass of 2205 Da, corresponding to a sialylated symmetric biantennary oligosaccharide structure without fucosylation. Native MS showed that thrombin and TBA predominantly form a 1:1 complex under near physiological conditions (pH 6.8, 200 mM NH4OAc), but the binding stoichiometry is influenced by the solution ionic strength. In 20 mM ammonium acetate solution, up to two TBAs were bound to thrombin, whereas increasing the solution ionic strength destabilized the thrombin-TBA complex and 1 M NH4OAc nearly completely dissociated the complex. This observation is consistent with the mediation of thrombin-aptamer binding through electrostatic interactions and it is further consistent with the human thrombin structure that contains two anion binding sites on the surface. Electron capture dissociation (ECD) top-down MS of the thrombin-TBA complex performed with a high resolution 15 Tesla Fourier transform ion cyclotron resonance (FTICR) mass spectrometer showed the primary binding site to be at exosite I located near the N-terminal sequence of the heavy chain, consistent with crystallographic data. High resolution native top-down MS is complementary to traditional structural biology methods for structurally characterizing native proteins and protein-DNA complexes.

  17. Detection of LGI1 and CASPR2 antibodies with a commercial cell-based assay in patients with very high VGKC-complex antibody levels.

    PubMed

    Yeo, T; Chen, Z; Chai, J Y H; Tan, K

    2017-07-15

    The presence of VGKC-complex antibodies, without LGI1/CASPR2 antibodies, as a standalone marker for neurological autoimmunity remains controversial. Additionally, the lack of an unequivocal VGKC-complex antibody cut-off level defining neurological autoimmunity makes it important to test for monospecific antibodies. We aim to determine the performance characteristics of a commercial assay (Euroimmun, Lübeck, Germany) for LGI1/CASPR2 antibody detection in patients with very high VGKC-complex antibody levels and report their clinico-serological associations. We identified 8 patients in our cohort with the highest VGKC-complex antibody levels (median 2663.5pM, range 933-6730pM) with VGKC-complex antibody related syndromes (Group A). Two other groups were identified; 1 group with suspected neuronal surface antibody syndromes and negative for VGKC-complex antibodies (Group B, n=8), and another group with cerebellar ataxia and negative for onconeuronal antibodies (Group C, n=8). Seven out of 8 patients (87.5%) in Group A had LGI1 and/or CASPR2 antibodies. One Group B patient had LGI1 antibodies but was negative on re-testing with a live cell assay. No Group C patients had monospecific antibodies. Inter-rater reliability was high; combining Groups A and B patients, the kappa statistic was 0.87 and 1.0 for LGI1 and CASPR2 antibodies respectively. We demonstrated that a high proportion of patients with very high VGKC-complex antibody levels and relevant clinical syndromes have LGI1 and/or CASPR2 antibodies detected by the commercial assay. Our findings lend support to the use of the assay for rapid and reliable detection of LGI1 and CASPR2 antibodies. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Performance of Improved High-Order Filter Schemes for Turbulent Flows with Shocks

    NASA Technical Reports Server (NTRS)

    Kotov, Dmitry Vladimirovich; Yee, Helen M C.

    2013-01-01

    The performance of the filter scheme with an improved dissipation control parameter has been demonstrated for different flow types. The scheme with a locally adapted parameter is shown to obtain more accurate results than its counterparts with a global or constant parameter. At the same time, no additional tuning is needed to achieve high accuracy of the method when using the local technique. However, further improvement of the method might be needed for even more complex and/or extreme flows.

  19. Fast and simultaneous determination of 12 polyphenols in apple peel and pulp by using chemometrics-assisted high-performance liquid chromatography with diode array detection.

    PubMed

    Wang, Tong; Wu, Hai-Long; Xie, Li-Xia; Zhu, Li; Liu, Zhi; Sun, Xiao-Dong; Xiao, Rong; Yu, Ru-Qin

    2017-04-01

    In this work, a smart chemometrics-enhanced strategy, high-performance liquid chromatography with diode array detection coupled with a second-order calibration method based on the alternating trilinear decomposition algorithm, was proposed to simultaneously quantify 12 polyphenols in different kinds of apple peel and pulp samples. The proposed strategy proved to be a powerful tool for solving the problems of coelution, unknown interferences, and chromatographic shifts in high-performance liquid chromatography analysis, making it possible to determine the 12 polyphenols in complex apple matrices within 10 min under simple elution conditions. The average recoveries with standard deviations, and figures of merit including sensitivity, selectivity, limit of detection, and limit of quantitation, were calculated to validate the accuracy of the proposed method. Compared to the quantitative analysis results from the classic high-performance liquid chromatography method, the statistical and graphical analysis showed that our proposed strategy obtained more reliable results. All results indicated that our proposed method for the quantitative analysis of apple polyphenols is accurate, fast, universal, simple, and green, and it is expected to be developed as an attractive alternative for the simultaneous determination of multitargeted analytes in complex matrices. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. High-performance ternary blend polymer solar cells involving both energy transfer and hole relay processes.

    PubMed

    Lu, Luyao; Chen, Wei; Xu, Tao; Yu, Luping

    2015-06-04

    The integration of multiple materials with complementary absorptions into a single junction device is regarded as an efficient way to enhance the power conversion efficiency (PCE) of organic solar cells (OSCs). However, because of increased complexity with one more component, only limited high-performance ternary systems have been demonstrated previously. Here we report an efficient ternary blend OSC with a PCE of 9.2%. We show that the third component can reduce surface trap densities in the ternary blend. Detailed studies unravel that the improved performance results from synergistic effects of enlarged open circuit voltage, suppressed trap-assisted recombination, enhanced light absorption, increased hole extraction, efficient energy transfer and better morphology. The working mechanism and high device performance demonstrate new insights and design guidelines for high-performance ternary blend solar cells and suggest that ternary structure is a promising platform to boost the efficiency of OSCs.

  1. Assessment of the Suitability of High Resolution Numerical Weather Model Outputs for Hydrological Modelling in Mountainous Cold Regions

    NASA Astrophysics Data System (ADS)

    Rasouli, K.; Pomeroy, J. W.; Hayashi, M.; Fang, X.; Gutmann, E. D.; Li, Y.

    2017-12-01

    The hydrology of mountainous cold regions has a large spatial variability that is driven both by climate variability and near-surface process variability associated with complex terrain and patterns of vegetation, soils, and hydrogeology. There is a need to downscale large-scale atmospheric circulations towards the fine scales that cold regions hydrological processes operate at to assess their spatial variability in complex terrain and quantify uncertainties by comparison to field observations. In this research, three high resolution numerical weather prediction models, namely, the Intermediate Complexity Atmosphere Research (ICAR), Weather Research and Forecasting (WRF), and Global Environmental Multiscale (GEM) models are used to represent spatial and temporal patterns of atmospheric conditions appropriate for hydrological modelling. An area covering high mountains and foothills of the Canadian Rockies was selected to assess and compare high resolution ICAR (1 km × 1 km), WRF (4 km × 4 km), and GEM (2.5 km × 2.5 km) model outputs with station-based meteorological measurements. ICAR with very low computational cost was run with different initial and boundary conditions and with finer spatial resolution, which allowed an assessment of modelling uncertainty and scaling that was difficult with WRF. Results show that ICAR, when compared with WRF and GEM, performs very well in precipitation and air temperature modelling in the Canadian Rockies, while all three models show a fair performance in simulating wind and humidity fields. Representation of local-scale atmospheric dynamics leading to realistic fields of temperature and precipitation by ICAR, WRF, and GEM makes these models suitable for high resolution cold regions hydrological predictions in complex terrain, which is a key factor in estimating water security in western Canada.

  2. Water Flow Test at Launch Complex 39B

    NASA Image and Video Library

    2017-12-20

    Water flowed during a test at Launch Complex 39B at NASA’s Kennedy Space Center in Florida. About 450,000 gallons of water flowed at high speed from a holding tank through new and modified piping and valves, the flame trench, flame deflector nozzles and mobile launcher interface risers during a wet flow test at Launch Complex 39B. At peak flow, the water reached about 100 feet in the air above the pad surface. The test was a milestone to confirm and baseline the performance of the Ignition Overpressure/Sound Suppression system. During launch of NASA's Space Launch System rocket and Orion spacecraft, the high-speed water flow will help protect the vehicle from the extreme acoustic and temperature environment during ignition and liftoff.

  3. High-Performance Algorithms and Complex Fluids | Computational Science |

    Science.gov Websites

    Website record (fragmentary description). Capabilities include non-Newtonian as well as traditional internal and external flows, enabled by combining experimental data with simulation. Contact: Ray Grout.

  4. High-contrast imager for Complex Aperture Telescopes (HiCAT). 4. Status and wavefront control development

    NASA Astrophysics Data System (ADS)

    Leboulleux, Lucie; N'Diaye, Mamadou; Riggs, A. J. E.; Egron, Sylvain; Mazoyer, Johan; Pueyo, Laurent; Choquet, Elodie; Perrin, Marshall D.; Kasdin, Jeremy; Sauvage, Jean-François; Fusco, Thierry; Soummer, Rémi

    2016-07-01

    Segmented telescopes are a possible approach to enable large-aperture space telescopes for the direct imaging and spectroscopy of habitable worlds. However, the increased complexity of their aperture geometry, due to their central obstruction, support structures and segment gaps, makes high-contrast imaging very challenging. The High-contrast imager for Complex Aperture Telescopes (HiCAT) was designed to study and develop solutions for such telescope pupils using wavefront control and starlight suppression. The testbed design has the flexibility to enable studies with increasing complexity for telescope aperture geometries: starting with off-axis telescopes, then on-axis telescopes with central obstruction and support structures (e.g. the Wide Field Infrared Survey Telescope [WFIRST]), up to on-axis segmented telescopes, e.g. including various concepts for a Large UV, Optical, IR telescope (LUVOIR), such as the High Definition Space Telescope (HDST). We completed optical alignment in the summer of 2014, and a first deformable mirror was successfully integrated in the testbed, with a total wavefront error of 13 nm RMS over an 18 mm diameter circular pupil in open loop. HiCAT will also be provided with a segmented mirror conjugated with a shaped pupil representing the HDST configuration, to directly study wavefront control in the presence of segment gaps, central obstruction and spider. We recently applied a focal plane wavefront control method combined with a classical Lyot coronagraph on HiCAT, and we found limitations on contrast performance due to vibration effects. In this communication, we analyze this instability and study its impact on the performance of wavefront control algorithms. We present our Speckle Nulling code, which controls and corrects for wavefront errors both in simulation mode and on the testbed. This routine is first tested in simulation mode without instability to validate the code. We then add simulated vibrations to study the degradation of contrast performance in the presence of these effects.

  5. Stepping into the omics era: Opportunities and challenges for biomaterials science and engineering

    PubMed Central

    Rabitz, Herschel; Welsh, William J.; Kohn, Joachim; de Boer, Jan

    2016-01-01

    The research paradigm in biomaterials science and engineering is evolving from using low-throughput and iterative experimental designs towards high-throughput experimental designs for materials optimization and the evaluation of materials properties. Computational science plays an important role in this transition. With the emergence of the omics approach in the biomaterials field, referred to as materiomics, high-throughput approaches hold the promise of tackling the complexity of materials and understanding correlations between material properties and their effects on complex biological systems. The intrinsic complexity of biological systems is an important factor that is often oversimplified when characterizing biological responses to materials and establishing property-activity relationships. Indeed, in vitro tests designed to predict in vivo performance of a given biomaterial are largely lacking as we are not able to capture the biological complexity of whole tissues in an in vitro model. In this opinion paper, we explain how we reached our opinion that converging genomics and materiomics into a new field would enable a significant acceleration of the development of new and improved medical devices. The use of computational modeling to correlate high-throughput gene expression profiling with high throughput combinatorial material design strategies would add power to the analysis of biological effects induced by material properties. We believe that this extra layer of complexity on top of high-throughput material experimentation is necessary to tackle the biological complexity and further advance the biomaterials field. PMID:26876875

  6. ComplexQuant: high-throughput computational pipeline for the global quantitative analysis of endogenous soluble protein complexes using high resolution protein HPLC and precision label-free LC/MS/MS.

    PubMed

    Wan, Cuihong; Liu, Jian; Fong, Vincent; Lugowski, Andrew; Stoilova, Snejana; Bethune-Waddell, Dylan; Borgeson, Blake; Havugimana, Pierre C; Marcotte, Edward M; Emili, Andrew

    2013-04-09

    The experimental isolation and characterization of stable multi-protein complexes are essential to understanding the molecular systems biology of a cell. To this end, we have developed a high-throughput proteomic platform for the systematic identification of native protein complexes based on extensive fractionation of soluble protein extracts by multi-bed ion exchange high performance liquid chromatography (IEX-HPLC) combined with exhaustive label-free LC/MS/MS shotgun profiling. To support these studies, we have built a companion data analysis software pipeline, termed ComplexQuant. Proteins present in the hundreds of fractions typically collected per experiment are first identified by exhaustively interrogating MS/MS spectra using multiple database search engines within an integrative probabilistic framework, while accounting for possible post-translation modifications. Protein abundance is then measured across the fractions based on normalized total spectral counts and precursor ion intensities using a dedicated tool, PepQuant. This analysis allows co-complex membership to be inferred based on the similarity of extracted protein co-elution profiles. Each computational step has been optimized for processing large-scale biochemical fractionation datasets, and the reliability of the integrated pipeline has been benchmarked extensively. This article is part of a Special Issue entitled: From protein structures to clinical applications. Copyright © 2012 Elsevier B.V. All rights reserved.
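
    The co-elution idea behind this kind of pipeline can be illustrated compactly. The Python sketch below (toy spectral-count profiles, not the ComplexQuant code) normalizes each protein's abundance profile across fractions and scores pairs by Pearson correlation, so that candidate co-complex members stand out.

```python
# Sketch: proteins whose abundance profiles across chromatographic fractions
# are highly correlated are candidate members of the same complex.
import numpy as np

def coelution_scores(profiles):
    """profiles: dict protein -> abundance vector across fractions (e.g. spectral counts)."""
    names = list(profiles)
    mat = np.array([profiles[n] for n in names], dtype=float)
    mat = mat / mat.sum(axis=1, keepdims=True)   # compare profile shape, not total abundance
    corr = np.corrcoef(mat)                      # pairwise Pearson correlation
    return {(names[i], names[j]): corr[i, j]
            for i in range(len(names)) for j in range(i + 1, len(names))}

profiles = {
    "subunitA": [0, 2, 15, 40, 12, 1, 0, 0],
    "subunitB": [0, 1, 12, 35, 10, 2, 0, 0],
    "unrelated": [8, 9, 1, 0, 0, 3, 10, 7],
}
for pair, r in coelution_scores(profiles).items():
    print(pair, round(r, 2))   # the two subunits correlate strongly; the unrelated protein does not
```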

  7. An XML-Based Protocol for Distributed Event Services

    NASA Technical Reports Server (NTRS)

    Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)

    2001-01-01

    A recent trend in distributed computing is the construction of high-performance distributed systems called computational grids. One difficulty we have encountered is that there is no standard format for the representation of performance information and no standard protocol for transmitting this information. This limits the types of performance analysis that can be undertaken in complex distributed systems. To address this problem, we present an XML-based protocol for transmitting performance events in distributed systems and evaluate the performance of this protocol.
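
    To make the idea concrete, a hypothetical example of an XML-encoded performance event and a minimal consumer is sketched below; the element and attribute names are invented for illustration and are not the schema defined in the paper.

```python
# Hypothetical sketch only: shows the general shape of an XML performance event
# and how a consumer might parse it with the Python standard library.
import xml.etree.ElementTree as ET

event_xml = """
<perfEvent type="job.start" timestamp="2001-05-14T12:30:05Z">
  <source host="node17.grid.example.org" component="scheduler"/>
  <data name="jobId" value="4711"/>
  <data name="requestedCpus" value="32"/>
</perfEvent>
"""

root = ET.fromstring(event_xml)
print(root.get("type"), root.get("timestamp"))
for item in root.findall("data"):
    print(" ", item.get("name"), "=", item.get("value"))
```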

  8. Good distractions: Testing the effects of listening to an audiobook on driving performance in simple and complex road environments.

    PubMed

    Nowosielski, Robert J; Trick, Lana M; Toxopeus, Ryan

    2018-02-01

    Distracted driving (driving while performing a secondary task) causes many collisions. Most research on distracted driving has focused on operating a cell-phone, but distracted driving can include eating while driving, conversing with passengers, or listening to music or audiobooks. Although the research has focused on the deleterious effects of distraction, there may be situations where distraction improves driving performance. Fatigue and boredom are also associated with collision risk, and it is possible that secondary tasks can help alleviate the effects of fatigue and boredom. Furthermore, it has been found that individuals with high levels of executive functioning as measured by the OSPAN (Operation Span) task show better driving while multitasking. In this study, licensed drivers were tested in a driving simulator (a car body surrounded by screens) that simulated simple or complex roads. Road complexity was manipulated by increasing traffic, scenery, and the number of curves in the drive. Participants either drove, or drove while listening to an audiobook. Driving performance was measured in terms of braking response time to hazards (HRT), the time required to brake in response to pedestrians or vehicles that suddenly emerged from the periphery into the path of the vehicle, as well as speed, standard deviation of speed, and standard deviation of lateral position (SDLP). Overall, braking times to hazards were higher on the complex drive than the simple one, though the effects of secondary tasks such as audiobooks were especially deleterious on the complex drive. In contrast, on the simple drive, driving while listening to an audiobook led to faster HRTs. We found evidence that individuals with high OSPAN scores had faster HRTs when listening to an audiobook. These results suggest that there are environmental and individual factors behind differences in the allocation of attention when listening to audiobooks while driving. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Capacity building and predictors of success for HIV-1 drug resistance testing in the Asia-Pacific region and Africa

    PubMed Central

    Land, Sally; Zhou, Julian; Cunningham, Philip; Sohn, Annette H; Singtoroj, Thida; Katzenstein, David; Mann, Marita; Sayer, David; Kantor, Rami

    2013-01-01

    Background The TREAT Asia Quality Assessment Scheme (TAQAS) was developed as a quality assessment programme through expert education and training, for laboratories in the Asia-Pacific and Africa that perform HIV drug-resistance (HIVDR) genotyping. We evaluated the programme performance and factors associated with high-quality HIVDR genotyping. Methods Laboratories used their standard protocols to test panels of human immunodeficiency virus (HIV)-positive plasma samples or electropherograms. Protocols were documented and performance was evaluated according to a newly developed scoring system, agreement with panel-specific consensus sequence, and detection of drug-resistance mutations (DRMs) and mixtures of wild-type and resistant virus (mixtures). High-quality performance was defined as detection of ≥95% DRMs. Results Over 4.5 years, 23 participating laboratories in 13 countries tested 45 samples (30 HIV-1 subtype B; 15 non-B subtypes) in nine panels. Median detection of DRMs was 88–98% in plasma panels and 90–97% in electropherogram panels. Laboratories were supported to amend and improve their test outcomes as appropriate. Three laboratories that detected <80% DRMs in early panels demonstrated subsequent improvement. Sample complexity factors – number of DRMs (p<0.001) and number of DRMs as mixtures (p<0.001); and laboratory performance factors – detection of mixtures (p<0.001) and agreement with consensus sequence (p<0.001), were associated with high performance; sample format (plasma or electropherogram), subtype and genotyping protocol were not. Conclusion High-quality HIVDR genotyping was achieved in the TAQAS collaborative laboratory network. Sample complexity and detection of mixtures were associated with performance quality. Laboratories conducting HIVDR genotyping are encouraged to participate in quality assessment programmes. PMID:23845227

  10. On the interconnection of stable protein complexes: inter-complex hubs and their conservation in Saccharomyces cerevisiae and Homo sapiens networks.

    PubMed

    Guerra, Concettina

    2015-01-01

    Protein complexes are key molecular entities that perform a variety of essential cellular functions. The connectivity of proteins within a complex has been widely investigated with both experimental and computational techniques. We developed a computational approach to identify and characterise proteins that play a role in interconnecting complexes. We computed a measure of inter-complex centrality, the crossroad index, based on disjoint paths connecting proteins in distinct complexes and identified inter-complex hubs as proteins with a high value of the crossroad index. We applied the approach to a set of stable complexes in Saccharomyces cerevisiae and in Homo sapiens. Just as done for hubs, we evaluated the topological and biological properties of inter-complex hubs addressing the following questions. Do inter-complex hubs tend to be evolutionary conserved? What is the relation between crossroad index and essentiality? We found a good correlation between inter-complex hubs and both evolutionary conservation and essentiality.
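
    The flavor of an inter-complex centrality can be shown on a toy network. The sketch below is a simplified proxy (shortest-path based, rather than the paper's disjoint-path crossroad index) that scores each protein by how often it lies on shortest paths between members of different complexes; the graph and complex assignments are invented.

```python
# Simplified proxy for inter-complex centrality: count how often each protein
# sits on a shortest path between proteins assigned to two *different* complexes.
import itertools
import networkx as nx

G = nx.Graph([("a1", "a2"), ("a2", "hub"), ("hub", "b1"), ("b1", "b2"), ("hub", "c1")])
complexes = {"A": {"a1", "a2"}, "B": {"b1", "b2"}, "C": {"c1"}}

score = {n: 0 for n in G}
for (_, members_x), (_, members_y) in itertools.combinations(complexes.items(), 2):
    for s, t in itertools.product(members_x, members_y):
        if nx.has_path(G, s, t):
            for path in nx.all_shortest_paths(G, s, t):
                for node in path[1:-1]:          # interior nodes only
                    score[node] += 1

print(sorted(score.items(), key=lambda kv: -kv[1]))  # 'hub' gets the highest score
```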

  11. Effects of noise frequency on performance and annoyance. M.S. Thesis - Georgia Inst. of Tech.

    NASA Technical Reports Server (NTRS)

    Key, K. F.

    1979-01-01

    Using a complex psychomotor task performed for 50 minutes in the presence of low frequency noise, high frequency noise, or ambient noise, annoyance ratings were obtained for noises of various frequencies by the method of magnitude estimation. The results suggest that high frequency noise affects female performance to a greater extent than male performance. Contrasted to these performance effects, the sexes did not differ in their annoyance ratings. A monotonically increasing relationship between annoyance and noise frequency was found (except for a decrease in annoyance at 8,000 Hz). It is concluded that both performance and annoyance responses may need to be assessed in certain situations to adequately describe human reaction to noise.

  12. Design of convolutional tornado code

    NASA Astrophysics Data System (ADS)

    Zhou, Hui; Yang, Yao; Gao, Hongmin; Tan, Lu

    2017-09-01

    As a linear block code, the traditional tornado (tTN) code is inefficient in burst-erasure environments, and its multi-level structure may lead to high encoding/decoding complexity. This paper presents a convolutional tornado (cTN) code which is able to improve the burst-erasure protection capability by applying the convolution property to the tTN code, and to reduce computational complexity by removing the multi-level structure. The simulation results show that the cTN code can provide better packet loss protection performance with lower computational complexity than the tTN code.

  13. Single Image Super-Resolution Using Global Regression Based on Multiple Local Linear Mappings.

    PubMed

    Choi, Jae-Seok; Kim, Munchurl

    2017-03-01

    Super-resolution (SR) has become more vital because of its capability to generate high-quality ultra-high definition (UHD) high-resolution (HR) images from low-resolution (LR) input images. Conventional SR methods entail high computational complexity, which makes them difficult to implement for up-scaling full-high-definition input images into UHD-resolution images. Nevertheless, our previous super-interpolation (SI) method showed a good compromise between peak signal-to-noise ratio (PSNR) performance and computational complexity. However, since SI only utilizes simple linear mappings, it may fail to precisely reconstruct HR patches with complex texture. In this paper, we present a novel SR method, which inherits the large-to-small patch conversion scheme from SI but uses global regression based on local linear mappings (GLM). Thus, our new SR method is called GLM-SI. In GLM-SI, each LR input patch is divided into 25 overlapped subpatches. Next, based on the local properties of these subpatches, 25 different local linear mappings are applied to the current LR input patch to generate 25 HR patch candidates, which are then regressed into one final HR patch using a global regressor. The local linear mappings are learned cluster-wise in our off-line training phase. The main contribution of this paper is as follows: previously, linear-mapping-based conventional SR methods, including SI, applied only one simple yet coarse linear mapping to each patch to reconstruct its HR version. On the contrary, for each LR input patch, our GLM-SI is the first to apply a combination of multiple local linear mappings, where each local linear mapping is found according to local properties of the current LR patch. Therefore, it can better approximate nonlinear LR-to-HR mappings for HR patches with complex texture. Experimental results show that the proposed GLM-SI method outperforms most of the state-of-the-art methods, and shows comparable PSNR performance with much lower computational complexity when compared with a super-resolution method based on convolutional neural nets (SRCNN15). Compared with the previous SI method, which is limited to a scale factor of 2, GLM-SI shows superior performance, with PSNR on average 0.79 dB higher, and can be used for scale factors of 3 or higher.
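
    The combination of several local linear mappings with a global blend can be reduced to a few lines. The sketch below is a very stripped-down illustration of that structure only (random placeholder mappings, assumed patch sizes, a uniform global blend), not the authors' training procedure or code.

```python
# Very reduced sketch of the GLM-SI idea: several local linear mappings each
# produce an HR-patch candidate from an LR patch, and a global regressor blends
# the candidates into the final HR patch. All parameters here are placeholders.
import numpy as np

rng = np.random.default_rng(3)
lr_dim, hr_dim, n_mappings = 9, 36, 5            # e.g. 3x3 LR patch -> 6x6 HR patch

# pretend these were learned cluster-wise in an offline training phase
local_maps = [rng.normal(scale=0.1, size=(hr_dim, lr_dim)) for _ in range(n_mappings)]
global_weights = np.full(n_mappings, 1.0 / n_mappings)   # stand-in for the learned global regressor

def super_resolve(lr_patch):
    candidates = np.stack([M @ lr_patch for M in local_maps])   # n_mappings x hr_dim
    return global_weights @ candidates                          # blended HR patch

hr_patch = super_resolve(rng.normal(size=lr_dim))
print(hr_patch.shape)   # (36,) -> would be reshaped to 6x6 in a real pipeline
```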

  14. Structural features of Aspergillus niger β-galactosidase define its activity against glycoside linkages.

    PubMed

    Rico-Díaz, Agustín; Ramírez-Escudero, Mercedes; Vizoso-Vázquez, Ángel; Cerdán, M Esperanza; Becerra, Manuel; Sanz-Aparicio, Julia

    2017-06-01

    β-Galactosidases are biotechnologically interesting enzymes that catalyze the hydrolysis or transgalactosylation of β-galactosides. Among them, the Aspergillus niger β-galactosidase (AnβGal) belongs to the glycoside hydrolase family 35 (GH35) and is widely used in the industry due to its high hydrolytic activity which is used to degrade lactose. We present here its three-dimensional structure in complex with different oligosaccharides, to illustrate the structural determinants of the broad specificity of the enzyme against different glycoside linkages. Remarkably, the residues Phe264, Tyr304, and Trp806 make a dynamic hydrophobic platform that accommodates the sugar at subsite +1 suggesting a main role on the recognition of structurally different substrates. Moreover, complexes with the trisaccharides show two potential subsites +2 depending on the substrate type. This feature and the peculiar shape of its wide cavity suggest that AnβGal might accommodate branched substrates from the complex net of polysaccharides composing the plant material in its natural environment. Relevant residues were selected and mutagenesis analyses were performed to evaluate their role in the catalytic performance and the hydrolase/transferase ratio of AnβGal. Thus, we generated mutants with improved transgalactosylation activity. In particular, the variant Y304F/Y355H/N357G/W806F displays a higher level of galacto-oligosaccharides production than the Aspergillus oryzae β-galactosidase, which is the preferred enzyme in the industry owing to its high transferase activity. Our results provide new knowledge on the determinants modulating specificity and the catalytic performance of fungal GH35 β-galactosidases. In turn, this fundamental background gives novel tools for the future improvement of these enzymes, which represent an interesting target for rational design. Structural data are available in PDB database under the accession numbers 5IFP (native form), 5IHR (in complex with 6GalGlu), 5IFT (in complex with 3GalGlu), 5JUV (in complex with 6GalGal), 5MGC (in complex with 4GalLac), and 5MGD (in complex with 6GalLac). © 2017 Federation of European Biochemical Societies.

  15. Performance-scalable volumetric data classification for online industrial inspection

    NASA Astrophysics Data System (ADS)

    Abraham, Aby J.; Sadki, Mustapha; Lea, R. M.

    2002-03-01

    Non-intrusive inspection and non-destructive testing of manufactured objects with complex internal structures typically requires the enhancement, analysis and visualization of high-resolution volumetric data. Given the increasing availability of fast 3D scanning technology (e.g. cone-beam CT), enabling on-line detection and accurate discrimination of components or sub-structures, the inherent complexity of classification algorithms inevitably leads to throughput bottlenecks. Indeed, whereas typical inspection throughput requirements range from 1 to 1000 volumes per hour, depending on density and resolution, current computational capability is one to two orders-of-magnitude less. Accordingly, speeding up classification algorithms requires both reduction of algorithm complexity and acceleration of computer performance. A shape-based classification algorithm, offering algorithm complexity reduction, by using ellipses as generic descriptors of solids-of-revolution, and supporting performance-scalability, by exploiting the inherent parallelism of volumetric data, is presented. A two-stage variant of the classical Hough transform is used for ellipse detection and correlation of the detected ellipses facilitates position-, scale- and orientation-invariant component classification. Performance-scalability is achieved cost-effectively by accelerating a PC host with one or more COTS (Commercial-Off-The-Shelf) PCI multiprocessor cards. Experimental results are reported to demonstrate the feasibility and cost-effectiveness of the data-parallel classification algorithm for on-line industrial inspection applications.
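
    A minimal illustration of ellipse detection with a classical Hough transform is sketched below using scikit-image on a synthetic edge map; it is a single-stage toy example, not the paper's two-stage variant or its parallel implementation, and the parameter values are assumptions.

```python
# Sketch: detect one ellipse in a synthetic edge image with the classical
# Hough transform for ellipses provided by scikit-image.
import numpy as np
from skimage.draw import ellipse_perimeter
from skimage.transform import hough_ellipse

edges = np.zeros((80, 80), dtype=bool)
rr, cc = ellipse_perimeter(40, 40, 15, 25, orientation=0.3)
edges[rr, cc] = True                          # synthetic edge map of one ellipse

result = hough_ellipse(edges, accuracy=10, threshold=4, min_size=20)
result.sort(order="accumulator")              # strongest candidate last
best = result[-1]
print("centre=(%.0f, %.0f), semi-axes=(%.0f, %.0f)"
      % (best["yc"], best["xc"], best["a"], best["b"]))
```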

  16. Raised visual detection thresholds depend on the level of complexity of cognitive foveal loading.

    PubMed

    Plainis, S; Murray, I J; Chauhan, K

    2001-01-01

    The objective of the study was to measure the interactions between visual thresholds for a simple light (the secondary task) presented peripherally and a simultaneously performed cognitive task (the primary task) presented foveally The primary task was highly visible but varied according to its cognitive complexity. Interactions between the tasks were determined by measuring detection thresholds for the peripheral task and accuracy of performance of the foveal task. Effects were measured for 5, 10, 20, and 30 deg eccentricity of the peripherally presented light and for three levels of cognitive complexity. Mesopic conditions (0.5 lx) were used. As expected, the concurrent presentation of the foveal cognitive task reduced peripheral sensitivity. Moreover, performance of the foveal task was adversely affected when conducting the peripheral task. Performance on both tasks was reduced as the level of complexity of the cognitive task increased. There were qualitative differences in task interactions between the central 10 deg and at greater eccentricities. Within 10 deg there was a disproportionate effect of eccentricity, previously interpreted as the 'tunnel-vision' model of visual field narrowing. Interactions outside 10 deg were less affected by eccentricity. These results are discussed in terms of the known neurophysiological characteristics of the primary visual pathway.

  17. Characterizing informative sequence descriptors and predicting binding affinities of heterodimeric protein complexes.

    PubMed

    Srinivasulu, Yerukala Sathipati; Wang, Jyun-Rong; Hsu, Kai-Ti; Tsai, Ming-Ju; Charoenkwan, Phasit; Huang, Wen-Lin; Huang, Hui-Ling; Ho, Shinn-Ying

    2015-01-01

    Protein-protein interactions (PPIs) are involved in various biological processes, and underlying mechanism of the interactions plays a crucial role in therapeutics and protein engineering. Most machine learning approaches have been developed for predicting the binding affinity of protein-protein complexes based on structure and functional information. This work aims to predict the binding affinity of heterodimeric protein complexes from sequences only. This work proposes a support vector machine (SVM) based binding affinity classifier, called SVM-BAC, to classify heterodimeric protein complexes based on the prediction of their binding affinity. SVM-BAC identified 14 of 580 sequence descriptors (physicochemical, energetic and conformational properties of the 20 amino acids) to classify 216 heterodimeric protein complexes into low and high binding affinity. SVM-BAC yielded the training accuracy, sensitivity, specificity, AUC and test accuracy of 85.80%, 0.89, 0.83, 0.86 and 83.33%, respectively, better than existing machine learning algorithms. The 14 features and support vector regression were further used to estimate the binding affinities (Pkd) of 200 heterodimeric protein complexes. Prediction performance of a Jackknife test was the correlation coefficient of 0.34 and mean absolute error of 1.4. We further analyze three informative physicochemical properties according to their contribution to prediction performance. Results reveal that the following properties are effective in predicting the binding affinity of heterodimeric protein complexes: apparent partition energy based on buried molar fractions, relations between chemical structure and biological activity in principal component analysis IV, and normalized frequency of beta turn. The proposed sequence-based prediction method SVM-BAC uses an optimal feature selection method to identify 14 informative features to classify and predict binding affinity of heterodimeric protein complexes. The characterization analysis revealed that the average numbers of beta turns and hydrogen bonds at protein-protein interfaces in high binding affinity complexes are more than those in low binding affinity complexes.
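
    The overall workflow described here (select a small number of informative descriptors, then classify complexes into low and high affinity with an SVM) can be sketched generically. The Python example below uses synthetic data and standard scikit-learn components; it is not SVM-BAC's actual feature set, selection method or tuning.

```python
# Generic sketch of the classification workflow: keep 14 of 580 candidate
# descriptors, then classify 216 complexes into low/high binding affinity.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(4)
X = rng.normal(size=(216, 580))          # 216 complexes x 580 candidate descriptors (synthetic)
y = rng.integers(0, 2, size=216)         # 0 = low affinity, 1 = high affinity (synthetic labels)

clf = make_pipeline(StandardScaler(),
                    SelectKBest(f_classif, k=14),   # keep 14 descriptors
                    SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```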

  18. Characterizing informative sequence descriptors and predicting binding affinities of heterodimeric protein complexes

    PubMed Central

    2015-01-01

    Background Protein-protein interactions (PPIs) are involved in various biological processes, and underlying mechanism of the interactions plays a crucial role in therapeutics and protein engineering. Most machine learning approaches have been developed for predicting the binding affinity of protein-protein complexes based on structure and functional information. This work aims to predict the binding affinity of heterodimeric protein complexes from sequences only. Results This work proposes a support vector machine (SVM) based binding affinity classifier, called SVM-BAC, to classify heterodimeric protein complexes based on the prediction of their binding affinity. SVM-BAC identified 14 of 580 sequence descriptors (physicochemical, energetic and conformational properties of the 20 amino acids) to classify 216 heterodimeric protein complexes into low and high binding affinity. SVM-BAC yielded the training accuracy, sensitivity, specificity, AUC and test accuracy of 85.80%, 0.89, 0.83, 0.86 and 83.33%, respectively, better than existing machine learning algorithms. The 14 features and support vector regression were further used to estimate the binding affinities (Pkd) of 200 heterodimeric protein complexes. Prediction performance of a Jackknife test was the correlation coefficient of 0.34 and mean absolute error of 1.4. We further analyze three informative physicochemical properties according to their contribution to prediction performance. Results reveal that the following properties are effective in predicting the binding affinity of heterodimeric protein complexes: apparent partition energy based on buried molar fractions, relations between chemical structure and biological activity in principal component analysis IV, and normalized frequency of beta turn. Conclusions The proposed sequence-based prediction method SVM-BAC uses an optimal feature selection method to identify 14 informative features to classify and predict binding affinity of heterodimeric protein complexes. The characterization analysis revealed that the average numbers of beta turns and hydrogen bonds at protein-protein interfaces in high binding affinity complexes are more than those in low binding affinity complexes. PMID:26681483

  19. The effects of using screencasting as a multimedia pre-training tool to manage the intrinsic cognitive load of chemical equilibrium instruction for advanced high school chemistry students

    NASA Astrophysics Data System (ADS)

    Musallam, Ramsey

    Chemistry is a complex knowledge domain. Specifically, research notes that Chemical Equilibrium presents greater cognitive challenges than other topics in chemistry. Cognitive Load Theory describes the impact a subject, and the learning environment, have on working memory. Intrinsic load is the facet of Cognitive Load Theory that explains the complexity innate to complex subjects. The purpose of this study was to build on the limited research into intrinsic cognitive load by examining the effects of using multimedia screencasts as a pre-training technique to manage the intrinsic cognitive load of chemical equilibrium instruction for advanced high school chemistry students. A convenience sample of 62 fourth-year high school students enrolled in an advanced chemistry course at a co-ed high school in urban San Francisco was given a chemical equilibrium concept pre-test. Upon conclusion of the pre-test, students were randomly assigned to two groups: pre-training and no pre-training. The pre-training group received a 10 minute and 52 second pre-training screencast that provided definitions, concepts and an overview of chemical equilibrium. After pre-training, both groups received the same 50-minute instructional lecture. After instruction, all students were given a chemical equilibrium concept post-test. Independent sample t-tests were conducted to examine differences in performance and intrinsic load. No significant differences in performance or intrinsic load, as measured by ratings of mental effort, were observed on the pre-test. Significant differences in performance, t(60)=3.70, p=.0005, and intrinsic load, t(60)=5.34, p=.0001, were observed on the post-test. A significant correlation between total performance scores and total mental effort ratings was also observed, r(60)=-0.44, p=.0003. Because no significant differences in prior knowledge were observed, it can be concluded that pre-training was successful at reducing intrinsic load. Moreover, the significant correlation between performance and mental effort strengthens the argument that performance measures can be used to approximate intrinsic cognitive load.
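
    The group comparison reported above is an ordinary independent-samples t-test; a worked sketch with SciPy is shown below. The scores are synthetic placeholders, not the study's data, and the group sizes are assumed.

```python
# Worked sketch: independent-samples t-test on post-test scores for the
# pre-training and no-pre-training groups (synthetic numbers for illustration).
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
pretraining = rng.normal(loc=78, scale=10, size=31)      # hypothetical post-test scores
no_pretraining = rng.normal(loc=68, scale=10, size=31)

t, p = stats.ttest_ind(pretraining, no_pretraining)
df = len(pretraining) + len(no_pretraining) - 2
print(f"t({df}) = {t:.2f}, p = {p:.4f}")
```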

  20. Behavior of new complexes of tetrakis(4-methoxylphenyl)porphyrin with heavy rare earth elements in reversed-phase high performance liquid chromatography.

    PubMed

    Zhang, Jun-Feng; Wang, Hong; Hou, An-Xin; Wang, Chang-Fa; Zhang, Hua-Shan

    2004-08-01

    An HPLC method has been developed for the separation of new complexes of tetrakis(4-methoxylphenyl)porphyrin (TMOPP) with four heavy rare earth elements (RE = Y, Er, Tm, and Yb). The function of amine and acid in the mobile phase has been investigated and a reasonable explanation is presented. Successful separation of the RE-TMOPP-Cl complexes is accomplished in 10 min with a mobile phase consisting of methanol-water-acetic acid-triethanolamine. The detection limits (S/N= 3) for the four complexes are 0.01 microg/mL. This method is rapid, sensitive, and simple.

  1. Separation of V(V)-4-(2-pyridylazo)resorcinolato complex from a large excess reagent using an ODS cartridge for high-performance liquid chromatography.

    PubMed

    Takahashi, Toru; Kaneko, Emiko; Yotsuyanagi, Takao

    2006-12-01

    A selective off-line preconcentration technique for the V(V) complex with 4-(2-pyridylazo)resorcinol has been developed and successfully applied to the determination of V(V) in an air-borne sample. The target complex was separated from excess reagent using an ODS cartridge and water as the eluent. The complex was then concentrated on another ODS cartridge using tetrabutylammonium bromide and eluted with methanol; the eluate was applied to a one-drop concentration/HPLC. A detection limit as low as (6.05 +/- 0.82)x 10(-11) M (5 ppt) was achieved.

  2. Electrolyte salts for nonaqueous electrolytes

    DOEpatents

    Amine, Khalil; Zhang, Zhengcheng; Chen, Zonghai

    2012-10-09

    Metal complex salts may be used in lithium ion batteries. Such metal complex salts not only perform as electrolyte salts in lithium ion batteries, with high solubility and conductivity, but can also act as redox shuttles that provide overcharge protection of individual cells in a battery pack and/or as electrolyte additives that provide other mechanisms of overcharge protection to lithium ion batteries. The metal complex salts have at least one aromatic ring. The aromatic moiety may be reversibly oxidized/reduced at a potential slightly higher than the working potential of the positive electrode in the lithium ion battery. The metal complex salts may also be known as overcharge protection salts.

  3. FPGA implementation of low complexity LDPC iterative decoder

    NASA Astrophysics Data System (ADS)

    Verma, Shivani; Sharma, Sanjay

    2016-07-01

    Low-density parity-check (LDPC) codes, proposed by Gallager, emerged as a class of codes which can yield very good performance on the additive white Gaussian noise channel as well as on the binary symmetric channel. LDPC codes have gained considerable importance due to their capacity-achieving property and excellent performance in noisy channels. The belief propagation (BP) algorithm and its approximations, most notably min-sum, are popular iterative decoding algorithms used for LDPC and turbo codes. The trade-off between hardware complexity and decoding throughput is a critical factor in the implementation of a practical decoder. This article presents an introduction to LDPC codes and their various decoding algorithms, followed by the realisation of an LDPC decoder using a simplified message-passing algorithm and a partially parallel decoder architecture. The simplified message-passing algorithm is proposed as a trade-off between low decoding complexity and decoder performance; it greatly reduces the routing and check-node complexity of the decoder. The partially parallel decoder architecture possesses high speed and reduced complexity. The improved design of the decoder achieves a maximum symbol throughput of 92.95 Mbps with a maximum of 18 decoding iterations. The article presents the implementation of a 9216-bit, rate-1/2, (3, 6) LDPC decoder on the Xilinx XC3D3400A device from the Spartan-3A DSP family.
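
    To make the decoding principle concrete, the sketch below shows a min-sum check-node update for a single parity check; it illustrates the algorithm family the article builds on, not the authors' FPGA design, and the log-likelihood ratios are hypothetical.

```python
# Min-sum check-node update for one parity check. `llrs` holds the incoming
# variable-to-check log-likelihood ratios; the outgoing message on edge i is
# the product of the signs of the *other* edges times the minimum magnitude
# over the *other* edges (hence the min1/min2 bookkeeping).
import numpy as np

def min_sum_check_update(llrs):
    llrs = np.asarray(llrs, dtype=float)
    signs = np.where(llrs < 0, -1.0, 1.0)
    mags = np.abs(llrs)
    order = np.argsort(mags)
    min1, min2 = mags[order[0]], mags[order[1]]        # smallest, second smallest
    out_mag = np.where(np.arange(len(llrs)) == order[0], min2, min1)
    out_sign = np.prod(signs) * signs                  # sign product excluding each edge's own sign
    return out_sign * out_mag

print(min_sum_check_update([-1.2, 0.4, 2.3, -0.9]))    # one outgoing message per edge
```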

  4. Data flow modeling techniques

    NASA Technical Reports Server (NTRS)

    Kavi, K. M.

    1984-01-01

    There have been a number of simulation packages developed for the purpose of designing, testing and validating computer systems, digital systems and software systems. Complex analytical tools based on Markov and semi-Markov processes have been designed to estimate the reliability and performance of simulated systems. Petri nets have received wide acceptance for modeling complex and highly parallel computers. In this research, data flow models for computer systems are investigated. Data flow models can be used to simulate both software and hardware in a uniform manner. Data flow simulation techniques provide the computer systems designer with a CAD environment which enables highly parallel complex systems to be defined, evaluated at all levels and finally implemented in either hardware or software. Inherent in the data flow concept is the hierarchical handling of complex systems. In this paper we describe how data flow can be used to model computer systems.

  5. Computationally Efficient Power Allocation Algorithm in Multicarrier-Based Cognitive Radio Networks: OFDM and FBMC Systems

    NASA Astrophysics Data System (ADS)

    Shaat, Musbah; Bader, Faouzi

    2010-12-01

    Cognitive Radio (CR) systems have been proposed to increase spectrum utilization by opportunistically accessing the unused spectrum. Multicarrier communication systems are promising candidates for CR systems. Due to its high spectral efficiency, filter bank multicarrier (FBMC) can be considered as an alternative to conventional orthogonal frequency division multiplexing (OFDM) for transmission over CR networks. This paper addresses the problem of resource allocation in multicarrier-based CR networks. The objective is to maximize the downlink capacity of the network under constraints on both the total power and the interference introduced to the primary users (PUs). The optimal solution has high computational complexity, which makes it unsuitable for practical applications, and hence a low-complexity suboptimal solution is proposed. The proposed algorithm utilizes the spectrum holes in the PU bands as well as the active PU bands. The performance of the proposed algorithm is investigated for OFDM- and FBMC-based CR systems. Simulation results illustrate that the proposed resource allocation algorithm achieves near-optimal performance with low computational complexity and demonstrate the efficiency of using FBMC in the CR context.
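
    A common way to approximate this kind of constrained capacity maximization is water-filling with per-subcarrier power caps derived from the interference limits; the sketch below illustrates that general idea with hypothetical channel gains and caps, and is not the authors' algorithm.

```python
# Capped water-filling: maximize sum-rate under a total power budget while
# respecting per-subcarrier caps (e.g. derived from PU interference limits).
# Bisection is run on the water level; all numbers are hypothetical.
import numpy as np

def capped_waterfilling(gains, p_total, p_caps, iters=60):
    lo, hi = 0.0, p_total + 1.0 / gains.min()          # bracket for the water level
    for _ in range(iters):
        mu = 0.5 * (lo + hi)
        p = np.clip(mu - 1.0 / gains, 0.0, p_caps)     # per-subcarrier power at this level
        lo, hi = (mu, hi) if p.sum() <= p_total else (lo, mu)
    return np.clip(lo - 1.0 / gains, 0.0, p_caps)

gains = np.array([2.0, 1.5, 0.8, 0.3])                 # channel-gain-to-noise ratios (hypothetical)
caps = np.array([0.5, 1.0, 1.0, 0.2])                  # interference-derived power caps (hypothetical)
p = capped_waterfilling(gains, p_total=1.5, p_caps=caps)
print(np.round(p, 3), round(p.sum(), 3))
```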

  6. Overview of Intelligent Systems and Operations Development

    NASA Technical Reports Server (NTRS)

    Pallix, Joan; Dorais, Greg; Penix, John

    2004-01-01

    To achieve NASA's ambitious mission objectives for the future, aircraft and spacecraft will need intelligence to take the correct action in a variety of circumstances. Vehicle intelligence can be defined as the ability to "do the right thing" when faced with a complex decision-making situation. It will be necessary to implement integrated autonomous operations and low-level adaptive flight control technologies to direct actions that enhance the safety and success of complex missions despite component failures, degraded performance, operator errors, and environment uncertainty. This paper will describe the array of technologies required to meet these complex objectives. This includes the integration of high-level reasoning and autonomous capabilities with multiple subsystem controllers for robust performance. Future intelligent systems will use models of the system, its environment, and the other intelligent agents with which the system interacts. They will also require planners, reasoning engines, and adaptive controllers that can recommend or execute commands enabling the system to respond intelligently. The presentation will also address the development of highly dependable software, which is a key component in ensuring the reliability of intelligent systems.

  7. Effect of chunk strength on the performance of children with developmental dyslexia on artificial grammar learning task may be related to complexity.

    PubMed

    Schiff, Rachel; Katan, Pesia; Sasson, Ayelet; Kahta, Shani

    2017-07-01

    There is a long-held view that chunks play a crucial role in artificial grammar learning performance. We compared the influence of chunk strength on performance in high and low topological entropy (a measure of complexity) grammar systems among dyslexic children, age-matched controls, and reading-level-matched controls. Findings show that age-matched control participants' performance reflected an equivalent influence of chunk strength in the two topological entropy conditions, as typically found in artificial grammar learning experiments. By contrast, the performance of dyslexic children and reading-level-matched controls reflected knowledge of chunk strength only under the low topological entropy condition. In the high topological entropy grammar system, they appeared completely unable to utilize chunk strength to make appropriate test item selections. In line with previous research, this study suggests that for typically developing children it is the chunks that are attended to during artificial grammar learning, creating a foundation on which implicit associative learning mechanisms operate, and that these chunks are unitized to different strengths. However, for children with dyslexia, it is complexity that may influence the subsequent memorability of chunks, independently of their strength.

  8. Self-templated formation of uniform NiCo2O4 hollow spheres with complex interior structures for lithium-ion batteries and supercapacitors.

    PubMed

    Shen, Laifa; Yu, Le; Yu, Xin-Yao; Zhang, Xiaogang; Lou, Xiong Wen David

    2015-02-02

    Despite the significant advancement in preparing metal oxide hollow structures, most approaches rely on template-based multistep procedures for tailoring the interior structure. In this work, we develop a new generally applicable strategy toward the synthesis of mixed-metal-oxide complex hollow spheres. Starting with metal glycerate solid spheres, we show that subsequent thermal annealing in air leads to the formation of complex hollow spheres of the resulting metal oxide. We demonstrate the concept by synthesizing highly uniform NiCo2O4 hollow spheres with a complex interior structure. With the small primary building nanoparticles, high structural integrity, complex interior architectures, and enlarged surface area, these unique NiCo2O4 hollow spheres exhibit superior electrochemical performances as advanced electrode materials for both lithium-ion batteries and supercapacitors. This approach can be an efficient self-templated strategy for the preparation of mixed-metal-oxide hollow spheres with complex interior structures and functionalities. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Coevolution at protein complex interfaces can be detected by the complementarity trace with important impact for predictive docking

    PubMed Central

    Madaoui, Hocine; Guerois, Raphaël

    2008-01-01

    Protein surfaces are under significant selection pressure to maintain interactions with their partners throughout evolution. Capturing how selection pressure acts at the interfaces of protein–protein complexes is a fundamental issue with high interest for the structural prediction of macromolecular assemblies. We tackled this issue under the assumption that, throughout evolution, mutations should minimally disrupt the physicochemical compatibility between specific clusters of interacting residues. This constraint drove the development of the so-called Surface COmplementarity Trace in Complex History score (SCOTCH), which was found to discriminate with high efficiency the structure of biological complexes. SCOTCH performances were assessed not only with respect to other evolution-based approaches, such as conservation and coevolution analyses, but also with respect to statistically based scoring methods. Validated on a set of 129 complexes of known structure exhibiting both permanent and transient intermolecular interactions, SCOTCH appears as a robust strategy to guide the prediction of protein–protein complex structures. Of particular interest, it also provides a basic framework to efficiently track how protein surfaces could evolve while keeping their partners in contact. PMID:18511568

  10. Assessing the density functional theory-based multireference configuration interaction (DFT/MRCI) method for transition metal complexes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Escudero, Daniel, E-mail: escudero@kofo.mpg.de; Thiel, Walter, E-mail: thiel@kofo.mpg.de

    2014-05-21

    We report an assessment of the performance of density functional theory-based multireference configuration interaction (DFT/MRCI) calculations for a set of 3d- and 4d-transition metal (TM) complexes. The DFT/MRCI results are compared to published reference data from reliable high-level multi-configurational ab initio studies. The assessment covers the relative energies of different ground-state minima of the highly correlated CrF₆ complex, the singlet and triplet electronically excited states of seven typical TM complexes (MnO₄⁻, Cr(CO)₆, [Fe(CN)₆]⁴⁻, four larger Fe and Ru complexes), and the corresponding electronic spectra (vertical excitation energies and oscillator strengths). It includes comparisons with results from different flavors of time-dependent DFT (TD-DFT) calculations using pure, hybrid, and long-range corrected functionals. The DFT/MRCI method is found to be superior to the tested TD-DFT approaches and is thus recommended for exploring the excited-state properties of TM complexes.

  11. Digitally enhanced homodyne interferometry.

    PubMed

    Sutton, Andrew J; Gerberding, Oliver; Heinzel, Gerhard; Shaddock, Daniel A

    2012-09-24

    We present two variations of a novel interferometry technique capable of simultaneously measuring multiple targets with high sensitivity. The technique performs a homodyne phase measurement by application of a four-point phase-shifting algorithm, with pseudo-random switching between points to allow multiplexed measurement based upon propagation delay alone. By multiplexing measurements and shifting complexity into signal processing, both variants realise significant complexity reductions over comparable methods. The first variant performs a typical coherent detection with a dedicated reference field and achieves a displacement noise floor of 0.8 pm/√Hz above 50 Hz. The second allows for removal of the dedicated reference, resulting in further simplifications and improved low-frequency performance, with a 1 pm/√Hz noise floor measured down to 20 Hz. These results represent the most sensitive measurement performed using this style of interferometry whilst simultaneously reducing the electro-optic footprint.
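
    For reference, the conventional four-step phase-shifting calculation that such homodyne schemes build on recovers the interferometric phase from intensity samples taken at offsets of 0, π/2, π and 3π/2; the sketch below uses hypothetical fringe samples and only illustrates that textbook relation, not the pseudo-random switching scheme itself.

```python
# Standard four-step phase-shifting relation: with samples I1..I4 at phase
# offsets 0, pi/2, pi, 3*pi/2, the phase is atan2(I4 - I2, I1 - I3).
import numpy as np

def four_step_phase(i1, i2, i3, i4):
    return np.arctan2(i4 - i2, i1 - i3)

true_phase = 0.7                                        # radians (hypothetical test value)
offsets = np.array([0.0, np.pi / 2, np.pi, 3 * np.pi / 2])
samples = 1.0 + 0.5 * np.cos(true_phase + offsets)      # ideal homodyne fringe samples
print(four_step_phase(*samples))                        # recovers ~0.7 rad
```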

  12. Characterization of the chemical composition of white chrysanthemum flowers of Hangzhou by using high-performance ion trap mass spectrometry.

    PubMed

    Zhou, Xiahui; Chen, Xiaocheng; Wu, Xin; Cao, Gang; Zhang, Junjie

    2016-04-01

    In this study, high-performance liquid chromatography coupled with amaZon SL high-performance ion trap mass spectrometry was used to analyze the target components in white chrysanthemum flowers of Hangzhou. Twenty-one components were detected and identified in both white chrysanthemum flower samples from Hangzhou by using target compound analysis. Furthermore, seven new compounds in the white chrysanthemum flowers of Hangzhou were found and identified by analyzing fragment ion behavior in the mass spectra. The established method can be readily applied to the global quality investigation of complex components in herbal medicines and food. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. A feasibility study of a hypersonic real-gas facility

    NASA Technical Reports Server (NTRS)

    Gully, J. H.; Driga, M. D.; Weldon, W. F.

    1987-01-01

    A four month feasibility study of a hypersonic real-gas free flight test facility for NASA Langley Research Center (LARC) was performed. The feasibility of using a high-energy electromagnetic launcher (EML) to accelerate complex models (lifting and nonlifting) in the hypersonic, real-gas facility was examined. Issues addressed include: design and performance of the accelerator; design and performance of the power supply; design and operation of the sabot and payload during acceleration and separation; effects of high current, magnetic fields, temperature, and stress on the sabot and payload; and survivability of payload instrumentation during acceleration, flight, and soft catch.

  14. Congenital heart surgery: expected versus observed surgical performance according to the Aristotle complexity score.

    PubMed

    Photiadis, J; Sinzobahamvya, N; Arenz, C; Sata, S; Haun, C; Schindler, E; Asfour, B; Hraska, V

    2011-08-01

    The Aristotle score quantifies the complexity involved in congenital heart surgery. It defines surgical performance as the complexity score times hospital survival. We studied how expected and observed surgical performance evolved over time. A total of 2312 main procedures carried out between 2006 and 2010 were analyzed. The Aristotle basic score, the corresponding hospital survival, and the related observed surgical performance were estimated. Expected survival was based on the mortality risks published by O'Brien and coauthors. Observed performance divided by expected performance was called the standardized ratio of performance; this should trend towards a figure above 100%. Survival rates and performance are given with 95% confidence intervals. The mean Aristotle basic score was 7.88 ± 2.68. Fifty-one patients died: observed hospital survival was 97.8% (97.1%-98.3%). A total of 115 deaths were anticipated: expected survival was 95.2% (93.5%-96.3%). Observed and expected surgical performance reached 7.71 (7.65-7.75) and 7.49 (7.37-7.59), respectively. Therefore the overall standardized ratio of performance was 102.94%. The ratio increased from 2006 (101.60%) to 2009 (103.92%) and was 103.42% in 2010. Performance was high for the repair of congenitally corrected transposition of the great arteries and ventricular septal defect (VSD) by atrial switch and Rastelli procedure, the Norwood procedure, repair of truncus arteriosus, aortic arch repair and VSD closure, and the Ross-Konno procedure, with corresponding standardized ratios of 123.30%, 116.83%, 112.99%, 110.86% and 110.38%, respectively. With a ratio of 82.87%, performance was low for repair of Ebstein's anomaly. The standardized ratio of surgical performance integrates three factors into a single value: procedure complexity, postoperative observed survival, and comparison with expected survival. It constitutes an excellent instrument for quality monitoring of congenital heart surgery programs over time. It allows an accurate comparison of surgical performance across institutions with different case mixes. © Georg Thieme Verlag KG Stuttgart · New York.
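
    The score arithmetic described above can be written out directly; the short sketch below reproduces the overall figures from the abstract's own numbers (small rounding differences aside).

```python
# Surgical performance = mean complexity score * hospital survival;
# standardized ratio of performance = observed / expected performance.
mean_basic_score = 7.88
observed_survival = 0.978        # 97.8% observed hospital survival
expected_survival = 0.952        # 95.2% expected survival (O'Brien risks)

observed_performance = mean_basic_score * observed_survival    # ~7.71
expected_performance = mean_basic_score * expected_survival    # ~7.50
ratio = observed_performance / expected_performance            # ~1.03, i.e. ~103%
print(round(observed_performance, 2), round(expected_performance, 2), f"{ratio:.2%}")
```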

  15. Computer-Assisted Monitoring Of A Complex System

    NASA Technical Reports Server (NTRS)

    Beil, Bob J.; Mickelson, Eric M.; Sterritt, John M.; Costantino, Rob W.; Houvener, Bob C.; Super, Mike A.

    1995-01-01

    Propulsion System Advisor (PSA) computer-based system assists engineers and technicians in analyzing masses of sensory data indicative of operating conditions of space shuttle propulsion system during pre-launch and launch activities. Designed solely for monitoring; does not perform any control functions. Although PSA developed for highly specialized application, serves as prototype of noncontrolling, computer-based subsystems for monitoring other complex systems like electric-power-distribution networks and factories.

  16. Method for separating contaminants from solution employing an organic-stabilized metal-hydroxy gel

    DOEpatents

    Alexander, Donald H.

    1996-01-01

    Metals and organics are extracted from solution by co-precipitating them with a gel comprising aluminum hydroxide and a complexing agent such as EDTA. After the gel is processed to remove the metals and organics, it can be recycled for further use by dissolving it in a high-pH solution, leaving no secondary waste stream. A number of alternative complexing agents perform better than EDTA.

  17. Characterization of children's decision making: sensitivity to punishment frequency, not task complexity.

    PubMed

    Crone, Eveline A; Bunge, Silvia A; Latenstein, Heleen; van der Molen, Maurits W

    2005-06-01

    On a gambling task that models real-life decision making, children between ages 7 and 12 perform like patients with bilateral lesions of the ventromedial prefrontal cortex (VMPFC), opting for choices that yield high immediate gains in spite of higher future losses (Crone & Van der Molen, 2004). The current study set out to characterize developmental changes in decision making by varying task complexity and punishment frequency. Three age groups (7-9 years, 10-12 years, 13-15 years) performed two versions of a computerized variant of the original Iowa gambling task. Task complexity was manipulated by varying the number of choices participants could make. Punishment frequency was manipulated by varying the frequency of delayed punishment. Results showed a developmental increase in the sensitivity to future consequences, which was present only when the punishment was presented infrequently. These results could not be explained by differential sensitivity to task complexity, hypersensitivity to reward, or failure to switch response set after receiving punishment. There was a general pattern of boys outperforming girls by making more advantageous choices over the course of the task. In conclusion, 7-12-year-old children--like VMPFC patients--appear myopic about the future except when the potential for future punishment is high.

  18. Mission possible: Building an effective business continuity team in seven steps.

    PubMed

    Porter, David

    2016-01-01

    Several books and studies exist on the creation, development and benefits of high-performing teams; many others offer insights into the business continuity management (BCM) discipline, crisis response and planning. Very rarely, however, do they cover both. This paper will explore the seven main development areas that helped build the foundation for a successful and high-performing BCM team in the Australian Taxation Office. Practical, actionable advice will be provided, recognising that the task for those starting out can be quite daunting and complex.

  19. Role of HPC in Advancing Computational Aeroelasticity

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.

    2004-01-01

    On behalf of the High Performance Computing and Modernization Program (HPCMP) and NASA Advanced Supercomputing Division (NAS) a study is conducted to assess the role of supercomputers on computational aeroelasticity of aerospace vehicles. The study is mostly based on the responses to a web based questionnaire that was designed to capture the nuances of high performance computational aeroelasticity, particularly on parallel computers. A procedure is presented to assign a fidelity-complexity index to each application. Case studies based on major applications using HPCMP resources are presented.

  20. High-performance liquid chromatography analysis of plant saponins: An update 2005-2010

    PubMed Central

    Negi, Jagmohan S.; Singh, Pramod; Pant, Geeta Joshi Nee; Rawat, M. S. M.

    2011-01-01

    Saponins are widely distributed in the plant kingdom. In view of their wide range of biological activities and their occurrence as complex mixtures, saponins have been purified and separated by high-performance liquid chromatography using reverse-phase columns at low wavelengths. Most saponins are not detected by ultraviolet detectors because they lack chromophores. Electrospray ionization mass spectrometry, diode array detection, evaporative light scattering detection, and charged aerosol detection have been used to overcome the detection problem of saponins. PMID:22303089

  1. A high speed sequential decoder

    NASA Technical Reports Server (NTRS)

    Lum, H., Jr.

    1972-01-01

    The performance and theory of operation for the High Speed Hard Decision Sequential Decoder are delineated. The decoder is a forward error correction system which is capable of accepting data from binary-phase-shift-keyed and quadriphase-shift-keyed modems at input data rates up to 30 megabits per second. Test results show that the decoder is capable of maintaining a composite error rate of 0.00001 at an input E_b/N_0 of 5.6 dB. This performance has been obtained with minimum circuit complexity.

  2. The inhibitory spillover effect: Controlling the bladder makes better liars

    PubMed Central

    Fenn, Elise; Blandón-Gitlin, Iris; Coons, Jennifer; Pineda, Catherine; Echon, Reinalyn

    2015-01-01

    The Inhibitory-Spillover-Effect (ISE) on a deception task was investigated. The ISE occurs when performance in one self-control task facilitates performance in another (simultaneously conducted) self-control task. Deceiving requires increased access to inhibitory control. We hypothesized that inducing liars to control urination urgency (physical inhibition) would facilitate control during deceptive interviews (cognitive inhibition). Participants drank small (low-control) or large (high-control) amounts of water. Next, they lied or told the truth to an interviewer. Third-party observers assessed the presence of behavioral cues and made true/lie judgments. In the high-control, but not the low-control condition, liars displayed significantly fewer behavioral cues to deception, more behavioral cues signaling truth, and provided longer and more complex accounts than truth-tellers. Accuracy detecting liars in the high-control condition was significantly impaired; observers revealed bias toward perceiving liars as truth-tellers. The ISE can operate in complex behaviors. Acts of deception can be facilitated by covert manipulations of self-control. PMID:26366466

  3. Design consideration in constructing high performance embedded Knowledge-Based Systems (KBS)

    NASA Technical Reports Server (NTRS)

    Dalton, Shelly D.; Daley, Philip C.

    1988-01-01

    As the hardware trends for artificial intelligence (AI) involve more and more complexity, the process of optimizing the computer system design for a particular problem will also increase in complexity. Space applications of knowledge based systems (KBS) will often require an ability to perform both numerically intensive vector computations and real time symbolic computations. Although parallel machines can theoretically achieve the speeds necessary for most of these problems, if the application itself is not highly parallel, the machine's power cannot be utilized. A scheme is presented which will provide the computer systems engineer with a tool for analyzing machines with various configurations of array, symbolic, scalar, and multiprocessors. High speed networks and interconnections make customized, distributed, intelligent systems feasible for the application of AI in space. The method presented can be used to optimize such AI system configurations and to make comparisons between existing computer systems. It is an open question whether or not, for a given mission requirement, a suitable computer system design can be constructed for any amount of money.

  4. Removal of central obscuration and spiders for coronagraphy

    NASA Astrophysics Data System (ADS)

    Abe, L.; Nishikawa, J.; Murakami, N.; Tamura, M.

    2006-06-01

    We present a method to remove the central obscuration and spiders, or any kind of geometry inside a telescope pupil. The technique relies on the combination of a first focal plane diffracting mask, and a complex amplitude pupil mask. In this combination, the central obscuration and eventual spider arms patterns in the re-imaged pupil (after the diffracting mask) are filled with coherent light. Adding an appropriate complex amplitude pupil mask allows virtually any kind of pupil shaping (in both amplitude and/or phase). We show that the obtained output pupil can feed a high efficiency coronagraph (any kind) with a very reasonable overall throughput and good performance even when considering pointing errors. In this paper, we specifically assess the performance of this technique when using apodized entrance pupils. This technique is relevant for ground based telescopes foreseeing the advent of higher order (so called ExAO) adaptive optics systems providing very high Strehl ratios. Some feasibility points are also discussed.

  5. High-contrast Imager for Complex Aperture Telescopes (HICAT): II. Design overview and first light results

    NASA Astrophysics Data System (ADS)

    N'Diaye, Mamadou; Choquet, Elodie; Egron, Sylvain; Pueyo, Laurent; Leboulleux, Lucie; Levecq, Olivier; Perrin, Marshall D.; Elliot, Erin; Wallace, J. Kent; Hugot, Emmanuel; Marcos, Michel; Ferrari, Marc; Long, Chris A.; Anderson, Rachel; DiFelice, Audrey; Soummer, Rémi

    2014-08-01

    We present a new high-contrast imaging testbed designed to provide complete solutions in wavefront sensing, control and starlight suppression with complex aperture telescopes. The testbed was designed to enable a wide range of studies of the effects of such telescope geometries, with primary mirror segmentation, central obstruction, and spiders. The associated diffraction features in the point spread function make high-contrast imaging more challenging. In particular the testbed will be compatible with both AFTA-like and ATLAST-like aperture shapes, respectively on-axis monolithic, and on-axis segmented telescopes. The testbed optical design was developed using a novel approach to define the layout and surface error requirements to minimize amplitude-induced errors at the target contrast level performance. In this communication we compare the as-built surface errors for each optic to their specifications based on end-to-end Fresnel modelling of the testbed. We also report on the testbed optical and optomechanical alignment performance, coronagraph design and manufacturing, and preliminary first light results.

  6. In situ Raman cell for high pressure and temperature studies of metal and complex hydrides.

    PubMed

    Domènech-Ferrer, Roger; Ziegs, Frank; Klod, Sabrina; Lindemann, Inge; Voigtländer, Ralf; Dunsch, Lothar; Gutfleisch, Oliver

    2011-04-15

    A novel cell for in situ Raman studies at hydrogen pressures up to 200 bar and at temperatures as high as 400 °C is presented. This device permits in situ monitoring of the formation and decomposition of chemical structures under high pressure via Raman scattering. The performance of the cell under extreme conditions is stable as the design of this device compensates much of the thermal expansion during heating which avoids defocusing of the laser beam. Several complex and metal hydrides were analyzed to demonstrate the advantageous use of this in situ cell. Temperature calibration was performed by monitoring the structural phase transformation and melting point of LiBH₄. The feasibility of the cell in hydrogen atmosphere was confirmed by in situ studies of the decomposition of NaAlH₄ with added TiCl₃ at different hydrogen pressures and the decomposition and rehydrogenation of MgH₂ and LiNH₂. © 2011 American Chemical Society

  7. Alliance Helps States Map New Terrain in Educator Evaluation. REL West Research Digest

    ERIC Educational Resources Information Center

    Regional Educational Laboratory West, 2014

    2014-01-01

    About five years ago, states across the country took on the huge, complex task of developing and implementing new systems to evaluate teacher and principal performance in public schools. In response to a federal mandate aimed at improving student achievement, especially in the lowest performing schools, state boards of education drafted high-level…

  8. From Poor Performance to Success under Stress: Working Memory, Strategy Selection, and Mathematical Problem Solving under Pressure

    ERIC Educational Resources Information Center

    Beilock, Sian L.; DeCaro, Marci S.

    2007-01-01

    Two experiments demonstrate how individual differences in working memory (WM) impact the strategies used to solve complex math problems and how consequential testing situations alter strategy use. In Experiment 1, individuals performed multistep math problems under low- or high-pressure conditions and reported their problem-solving strategies.…

  9. A Novel Method for Assessing Task Complexity in Outpatient Clinical-Performance Measures.

    PubMed

    Hysong, Sylvia J; Amspoker, Amber B; Petersen, Laura A

    2016-04-01

    Clinical-performance measurement has helped improve the quality of health care, yet success in attaining high levels of quality across multiple domains simultaneously still varies considerably. Although many sources of variability in care quality have been studied, the difficulty required to complete the clinical work itself has received little attention. We present a task-based methodology for evaluating the difficulty of clinical-performance measures (CPMs) by assessing the complexity of their component requisite tasks. Using Functional Job Analysis (FJA), subject-matter experts (SMEs) generated task lists for 17 CPMs; task lists were rated on ten dimensions of complexity and then aggregated into difficulty composites. Participants were eleven outpatient work SMEs; the setting was 133 VA Medical Centers nationwide. Clinical performance was assessed with 17 outpatient CPMs (2000-2008) at the 133 VA Medical Centers. Measure difficulty was defined, for each CPM, as the number of component requisite tasks and the average rating across the ten FJA complexity scales for the set of tasks comprising the measure. Measures varied considerably in the number of component tasks (M = 10.56, SD = 6.25, min = 5, max = 25). Measures of chronic care following acute myocardial infarction exhibited significantly higher difficulty ratings compared to diabetes or screening measures, but not to immunization measures (M = 0.45, -0.04, -0.05, and -0.06, respectively; F(3, 186) = 3.57, p = 0.015). Measure difficulty ratings were not significantly correlated with the number of component tasks (r = -0.30, p = 0.23). Evaluating the difficulty of achieving recommended CPM performance levels requires more than simply counting the tasks involved; using FJA to assess the complexity of CPMs' component tasks presents an alternate means of assessing the difficulty of primary-care CPMs and accounting for performance variation among measures and performers. This in turn could be used in designing performance reward programs, or to match workflow to clinician time and effort.
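
    The aggregation step described (rating each component task on ten FJA complexity dimensions and averaging over a measure's task set) can be sketched in a few lines; the ratings and scale below are hypothetical, not the study's data.

```python
# Difficulty composite for one CPM: average of SME ratings over its component
# tasks and the ten FJA complexity dimensions. Ratings here are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n_tasks, n_dims = 8, 10                                # 8 component tasks, 10 FJA scales (assumed)
task_ratings = rng.uniform(1, 7, (n_tasks, n_dims))    # assumed 1-7 rating scale

n_component_tasks = n_tasks                            # first descriptor of the measure
measure_difficulty = task_ratings.mean()               # second descriptor: mean complexity rating
print(n_component_tasks, round(measure_difficulty, 2))
```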

  10. Regional Differences in Brain Volume Predict the Acquisition of Skill in a Complex Real-Time Strategy Videogame

    PubMed Central

    Basak, Chandramallika; Voss, Michelle W.; Erickson, Kirk I.; Boot, Walter R.; Kramer, Arthur F.

    2015-01-01

    Previous studies have found that differences in brain volume among older adults predict performance in laboratory tasks of executive control, memory, and motor learning. In the present study we asked whether regional differences in brain volume as assessed by the application of a voxel-based morphometry technique on high resolution MRI would also be useful in predicting the acquisition of skill in complex tasks, such as strategy-based video games. Twenty older adults were trained for over 20 hours to play Rise of Nations, a complex real-time strategy game. These adults showed substantial improvements over the training period in game performance. MRI scans obtained prior to training revealed that the volume of a number of brain regions, which have been previously associated with subsets of the trained skills, predicted a substantial amount of variance in learning on the complex game. Thus, regional differences in brain volume can predict learning in complex tasks that entail the use of a variety of perceptual, cognitive and motor processes. PMID:21546146

  11. Regional differences in brain volume predict the acquisition of skill in a complex real-time strategy videogame.

    PubMed

    Basak, Chandramallika; Voss, Michelle W; Erickson, Kirk I; Boot, Walter R; Kramer, Arthur F

    2011-08-01

    Previous studies have found that differences in brain volume among older adults predict performance in laboratory tasks of executive control, memory, and motor learning. In the present study we asked whether regional differences in brain volume as assessed by the application of a voxel-based morphometry technique on high resolution MRI would also be useful in predicting the acquisition of skill in complex tasks, such as strategy-based video games. Twenty older adults were trained for over 20 h to play Rise of Nations, a complex real-time strategy game. These adults showed substantial improvements over the training period in game performance. MRI scans obtained prior to training revealed that the volume of a number of brain regions, which have been previously associated with subsets of the trained skills, predicted a substantial amount of variance in learning on the complex game. Thus, regional differences in brain volume can predict learning in complex tasks that entail the use of a variety of perceptual, cognitive and motor processes. Copyright © 2011 Elsevier Inc. All rights reserved.

  12. Optically controlled phased-array antenna technology for space communication systems

    NASA Technical Reports Server (NTRS)

    Kunath, Richard R.; Bhasin, Kul B.

    1988-01-01

    Using MMICs in phased-array applications above 20 GHz requires complex RF and control signal distribution systems. Conventional waveguide, coaxial cable, and microstrip methods are undesirable due to their high weight, high loss, limited mechanical flexibility and large volume. An attractive alternative to these transmission media, for RF and control signal distribution in MMIC phased-array antennas, is optical fiber. Presented are potential system architectures and their associated characteristics. The status of high frequency opto-electronic components needed to realize the potential system architectures is also discussed. It is concluded that an optical fiber network will reduce weight and complexity, and increase reliability and performance, but may require higher power.

  13. A high performance data parallel tensor contraction framework: Application to coupled electro-mechanics

    NASA Astrophysics Data System (ADS)

    Poya, Roman; Gil, Antonio J.; Ortigosa, Rogelio

    2017-07-01

    The paper presents implementation aspects of a new high performance tensor contraction framework for the numerical analysis of coupled and multi-physics problems on streaming architectures. In addition to explicit SIMD instructions and smart expression templates, the framework introduces domain specific constructs for the tensor cross product and its associated algebra recently rediscovered by Bonet et al. (2015, 2016) in the context of solid mechanics. The two key ingredients of the presented expression template engine are as follows. The first is the capability to mathematically transform complex chains of operations into simpler equivalent expressions, while potentially avoiding routes with higher levels of computational complexity; the second is the ability to perform a compile-time depth-first or breadth-first search to find the optimal contraction indices of a large tensor network in order to minimise the number of floating point operations. For optimisations of tensor contraction such as loop transformation, loop fusion and data locality optimisations, the framework relies heavily on compile time technologies rather than source-to-source translation or JIT techniques. Every aspect of the framework is examined through relevant performance benchmarks, including the impact of data parallelism on the performance of isomorphic and nonisomorphic tensor products, the FLOP and memory I/O optimality in the evaluation of tensor networks, the compilation cost and memory footprint of the framework and the performance of tensor cross product kernels. The framework is then applied to finite element analysis of coupled electro-mechanical problems to assess the speed-ups achieved in kernel-based numerical integration of complex electroelastic energy functionals. In this context, domain-aware expression templates combined with SIMD instructions are shown to provide a significant speed-up over classical low-level programming techniques.
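
    The contraction-ordering idea (searching for the sequence of pairwise contractions that minimises floating point operations) can be illustrated at a high level with numpy's einsum path search; this only mirrors the concept and is not the authors' compile-time expression-template engine.

```python
# Search for a FLOP-cheap contraction order of a small tensor network, then
# evaluate the network along that path. Shapes are arbitrary examples.
import numpy as np

A = np.random.rand(30, 40)
B = np.random.rand(40, 50)
C = np.random.rand(50, 20)

path, info = np.einsum_path("ij,jk,kl->il", A, B, C, optimize="optimal")
print(info)                                            # reports the chosen pairwise order and FLOP estimate
result = np.einsum("ij,jk,kl->il", A, B, C, optimize=path)
print(result.shape)                                    # (30, 20)
```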

  14. Computing Systems | High-Performance Computing | NREL

    Science.gov Websites

    NREL's high-performance computing systems enable researchers to investigate, build, and test models of complex phenomena or entire integrated systems that cannot be directly observed or manipulated in the lab, or that would be too expensive or time consuming to study otherwise.

  15. Reduced interhemispheric interaction in non-autistic individuals with normal but high levels of autism traits.

    PubMed

    O'Keefe, Natalie; Lindell, Annukka K

    2013-11-01

    People with autism spectrum disorder (ASD) show superior performance on tasks requiring detail-focused processing. Atypical neural connectivity and reduced interhemispheric communication are posited to underlie this cognitive advantage. Given the recent conceptualization of autism as a continuum, we sought to investigate whether people with normal but high levels of autism-like traits (AQ) also exhibit reduced hemispheric interaction. Sixty right-handed participants completed the AQ questionnaire (Baron-Cohen, Wheelwright, Skinner, Martin, & Clubley, 2001) and a lateralised letter matching task that assessed unilateral and bilateral performance in response to simple (physical) and complex (identity) matches. Whereas people with low self-rated AQ scores showed a bilateral advantage for the more complex task, indicating normal interhemispheric interaction, people in the high AQ group failed to show a bilateral gain for the computationally demanding stimuli. This finding of disrupted interhemispheric interaction converges with a dimensional conceptualisation of ASD, suggesting that the structural anomalies of ASD extend to non-autistic individuals with high levels of autism traits. Copyright © 2013 Elsevier Inc. All rights reserved.

  16. Motion adaptive Kalman filter for super-resolution

    NASA Astrophysics Data System (ADS)

    Richter, Martin; Nasse, Fabian; Schröder, Hartmut

    2011-01-01

    Superresolution is a sophisticated strategy for enhancing the image quality of both low and high resolution video, performing tasks like artifact reduction, scaling and sharpness enhancement in one algorithm, all of which reconstruct high frequency components (above the Nyquist frequency) in some way. Recursive superresolution algorithms in particular can achieve high quality because they control the video output using a feedback loop and adapt the result in the next iteration. In addition to excellent output quality, temporal recursive methods are very hardware efficient and therefore attractive even for real-time video processing. A very promising approach is the utilization of Kalman filters as proposed by Farsiu et al. Reliable motion estimation is crucial for the performance of superresolution. Robust global motion models are therefore mainly used, but this also limits the applicability of superresolution algorithms; handling sequences with complex object motion is essential for a wider field of application. Hence, this paper proposes improvements by extending the Kalman filter approach using motion-adaptive variance estimation and segmentation techniques. Experiments confirm the potential of our proposal for ideal and real video sequences with complex motion and further compare its performance to state-of-the-art methods like trainable filters.
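
    A minimal per-pixel temporal Kalman update of the kind such recursive schemes build on is sketched below, with a static-scene model for simplicity; a motion-adaptive variant would modulate the process noise q per pixel, as the paper proposes. All values are hypothetical.

```python
# Per-pixel temporal Kalman update: each new registered frame refines a
# running estimate, weighted by predicted and observation variances.
import numpy as np

def kalman_temporal_update(x_est, p_var, z, r_var, q_var):
    p_pred = p_var + q_var                  # predict: static scene, variance inflated by q
    k = p_pred / (p_pred + r_var)           # per-pixel Kalman gain
    x_new = x_est + k * (z - x_est)         # blend the new observation into the estimate
    p_new = (1.0 - k) * p_pred              # updated error variance
    return x_new, p_new

rng = np.random.default_rng(2)
truth = rng.uniform(0, 1, (4, 4))           # hypothetical high-resolution patch
x, p = np.zeros_like(truth), np.ones_like(truth)
for _ in range(20):                         # feed noisy registered observations
    z = truth + rng.normal(0, 0.1, truth.shape)
    x, p = kalman_temporal_update(x, p, z, r_var=0.01, q_var=1e-4)
print(round(float(np.abs(x - truth).mean()), 4))   # mean error shrinks toward zero
```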

  17. Characterization of mature maize (Zea mays L.) root system architecture and complexity in a diverse set of Ex-PVP inbreds and hybrids.

    PubMed

    Hauck, Andrew L; Novais, Joana; Grift, Tony E; Bohn, Martin O

    2015-01-01

    The mature root system is a vital plant organ, which is critical to plant performance. Commercial maize (Zea mays L.) breeding has resulted in a steady increase in plant performance over time, along with noticeable changes in above ground vegetative traits, but the corresponding changes in the root system are not presently known. In this study, roughly 2500 core root systems from field trials of a set of 10 diverse elite inbreds formerly protected by Plant Variety Protection, plus B73 and Mo17, and the 66 diallel intercrosses among them were evaluated for root traits using high throughput image-based phenotyping. Overall root architecture was modeled by root angle (RA) and stem diameter (SD), while root complexity, the amount of root branching, was quantified using fractal analysis to obtain values for fractal dimension (FD) and fractal abundance (FA). For each trait, per se line effects were highly significant and the most important contributor to trait performance. Mid-parent heterosis and specific combining ability were also highly significant for FD, FA, and RA, while none of the traits showed significant general combining ability. The interaction between the environment and the additive line effect was also significant for all traits. Within the inbred and hybrid generations, FD and FA were highly correlated (rp ≥ 0.74), SD was moderately correlated to FD and FA (0.69 ≥ rp ≥ 0.48), while the correlation between RA and other traits was low (0.13 ≥ rp ≥ -0.40). Inbreds with contrasting effects on complexity and architecture traits were observed, suggesting that root complexity and architecture traits are inherited independently. A more comprehensive understanding of the maize root system and the way it interacts with the environment will be useful for defining adaptation to nutrient acquisition and tolerance to stress from drought and high plant densities, critical factors in the yield gains of modern hybrids.
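
    The fractal descriptors mentioned (FD, FA) are typically obtained by box counting on a binarized root image; the sketch below shows that generic procedure on a synthetic binary image and is not the authors' imaging pipeline.

```python
# Box-counting estimate of fractal dimension: count boxes containing root
# pixels at several box sizes and fit log(count) against log(1/size).
import numpy as np

def box_count_dimension(img, sizes=(2, 4, 8, 16, 32)):
    counts = []
    for s in sizes:
        h, w = (img.shape[0] // s) * s, (img.shape[1] // s) * s
        blocks = img[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(int((blocks.sum(axis=(1, 3)) > 0).sum()))   # occupied boxes at this size
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope                                                  # fractal dimension estimate

rng = np.random.default_rng(3)
img = (rng.random((256, 256)) < 0.02).astype(int)                 # synthetic stand-in for a thresholded root image
print(round(box_count_dimension(img), 2))
```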

  18. The dependence of crowding on flanker complexity and target-flanker similarity

    PubMed Central

    Bernard, Jean-Baptiste; Chung, Susana T.L.

    2013-01-01

    We examined the effects of the spatial complexity of flankers and target-flanker similarity on the performance of identifying crowded letters. On each trial, observers identified the middle character of random strings of three characters (“trigrams”) briefly presented at 10° below fixation. We tested the 26 lowercase letters of the Times-Roman and Courier fonts, a set of 79 characters (letters and non-letters) of the Times-Roman font, and the uppercase letters of two highly complex ornamental fonts, Edwardian and Aristocrat. Spatial complexity of characters was quantified by the length of the morphological skeleton of each character, and target-flanker similarity was defined based on a psychometric similarity matrix. Our results showed that (1) letter identification error rate increases with flanker complexity up to a certain value, beyond which error rate becomes independent of flanker complexity; (2) the increase of error rate is slower for high-complexity target letters; (3) error rate increases with target-flanker similarity; and (4) mislocation error rate increases with target-flanker similarity. These findings, combined with the current understanding of the faulty feature integration account of crowding, provide some constraints on how the feature integration process could cause perceptual errors. PMID:21730225
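
    The complexity metric used here (length of a character's morphological skeleton) can be approximated with a standard image-processing library; the blocky glyph below is a hypothetical stand-in for a rendered font character, not the stimuli from the study.

```python
# Approximate a character's spatial complexity by the pixel length of its
# morphological skeleton (a crude rendered "T" stands in for a real glyph).
import numpy as np
from skimage.morphology import skeletonize

glyph = np.zeros((40, 40), dtype=bool)
glyph[5:10, 5:35] = True          # horizontal bar
glyph[5:35, 17:23] = True         # vertical stem

skeleton = skeletonize(glyph)
complexity = int(skeleton.sum())  # skeleton length in pixels
print(complexity)
```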

  19. Syntactic and Story Structure Complexity in the Narratives of High- and Low-Language Ability Children with Autism Spectrum Disorder

    PubMed Central

    Peristeri, Eleni; Andreou, Maria; Tsimpli, Ianthi M.

    2017-01-01

    Although language impairment is commonly associated with the autism spectrum disorder (ASD), the Diagnostic Statistical Manual no longer includes language impairment as a necessary component of an ASD diagnosis (American Psychiatric Association, 2013). However, children with ASD and no comorbid intellectual disability struggle with some aspects of language whose precise nature is still outstanding. Narratives have been extensively used as a tool to examine lexical and syntactic abilities, as well as pragmatic skills in children with ASD. This study contributes to this literature by investigating the narrative skills of 30 Greek-speaking children with ASD and normal non-verbal IQ, 16 with language skills in the upper end of the normal range (ASD-HL), and 14 in the lower end of the normal range (ASD-LL). The control group consisted of 15 age-matched typically-developing (TD) children. Narrative performance was measured in terms of both microstructural and macrostructural properties. Microstructural properties included lexical and syntactic measures of complexity such as subordinate vs. coordinate clauses and types of subordinate clauses. Macrostructure was measured in terms of the diversity in the use of internal state terms (ISTs) and story structure complexity, i.e., children's ability to produce important units of information that involve the setting, characters, events, and outcomes of the story, as well as the characters' thoughts and feelings. The findings demonstrate that high language ability and syntactic complexity pattern together in ASD children's narrative performance and that language ability compensates for autistic children's pragmatic deficit associated with the production of Theory of Mind-related ISTs. Nevertheless, both groups of children with ASD (high and low language ability) scored lower than the TD controls in the production of Theory of Mind-unrelated ISTs, modifier clauses and story structure complexity. PMID:29209258

  20. Syntactic and Story Structure Complexity in the Narratives of High- and Low-Language Ability Children with Autism Spectrum Disorder.

    PubMed

    Peristeri, Eleni; Andreou, Maria; Tsimpli, Ianthi M

    2017-01-01

    Although language impairment is commonly associated with the autism spectrum disorder (ASD), the Diagnostic Statistical Manual no longer includes language impairment as a necessary component of an ASD diagnosis (American Psychiatric Association, 2013). However, children with ASD and no comorbid intellectual disability struggle with some aspects of language whose precise nature is still outstanding. Narratives have been extensively used as a tool to examine lexical and syntactic abilities, as well as pragmatic skills in children with ASD. This study contributes to this literature by investigating the narrative skills of 30 Greek-speaking children with ASD and normal non-verbal IQ, 16 with language skills in the upper end of the normal range (ASD-HL), and 14 in the lower end of the normal range (ASD-LL). The control group consisted of 15 age-matched typically-developing (TD) children. Narrative performance was measured in terms of both microstructural and macrostructural properties. Microstructural properties included lexical and syntactic measures of complexity such as subordinate vs. coordinate clauses and types of subordinate clauses. Macrostructure was measured in terms of the diversity in the use of internal state terms (ISTs) and story structure complexity, i.e., children's ability to produce important units of information that involve the setting, characters, events, and outcomes of the story, as well as the characters' thoughts and feelings. The findings demonstrate that high language ability and syntactic complexity pattern together in ASD children's narrative performance and that language ability compensates for autistic children's pragmatic deficit associated with the production of Theory of Mind-related ISTs. Nevertheless, both groups of children with ASD (high and low language ability) scored lower than the TD controls in the production of Theory of Mind-unrelated ISTs, modifier clauses and story structure complexity.

  1. Surgical volume-to-outcome relationship and monitoring of technical performance in pediatric cardiac surgery.

    PubMed

    Kalfa, David; Chai, Paul; Bacha, Emile

    2014-08-01

    A significant inverse relationship of surgical institutional and surgeon volumes to outcome has been demonstrated in many high-stakes surgical specialties. By and large, the same results were found in pediatric cardiac surgery, for which a more thorough analysis has shown that this relationship depends on case complexity and type of surgical procedures. Lower-volume programs tend to underperform larger-volume programs as case complexity increases. High-volume pediatric cardiac surgeons also tend to have better results than low-volume surgeons, especially at the more complex end of the surgery spectrum (e.g., the Norwood procedure). Nevertheless, this trend for lower mortality rates at larger centers is not universal. All larger programs do not perform better than all smaller programs. Moreover, surgical volume seems to account for only a small proportion of the overall between-center variation in outcome. Intraoperative technical performance is one of the most important parts, if not the most important part, of the therapeutic process and a critical component of postoperative outcome. Thus, the use of center-specific, risk-adjusted outcome as a tool for quality assessment together with monitoring of technical performance using a specific score may be more reliable than relying on volume alone. However, the relationship between surgical volume and outcome in pediatric cardiac surgery is strong enough that it ought to support adapted and well-balanced health care strategies that take advantage of the positive influence that higher center and surgeon volumes have on outcome.

  2. Enduring the shipboard stressor complex: a systems approach.

    PubMed

    Comperatore, Carlos A; Rivera, Pik Kwan; Kingsley, Leonard

    2005-06-01

    A high incidence of physiological and psychological stressors characterizes the maritime work environment in many segments of the commercial maritime industry and in the military. Traditionally, crewmembers work embedded in a complex of stressors. Stressors rarely act independently because most occur concurrently, simultaneously taxing physical and mental resources. Stressors such as extreme environmental temperatures, long work hours, heavy mental and physical workload, authoritative leadership, isolation from family and loved ones, lack of exercise, and unhealthy diets often combine to degrade crewmember health and performance, particularly on long voyages. This complex system of interacting stressors affects the ability of maritime crewmembers to maintain adequate levels of alertness and performance. An analytical systems approach methodology is described here as a viable method to identify workplace stressors and track their systemic interactions. A systems-based program for managing the stressor complex is then offered, together with the empirical research supporting its efficacy. Included is an example implementation of a stressor-control program aboard a U.S. Coast Guard cutter.

  3. Low-complexity R-peak detection for ambulatory fetal monitoring.

    PubMed

    Rooijakkers, Michael J; Rabotti, Chiara; Oei, S Guid; Mischi, Massimo

    2012-07-01

    Non-invasive fetal health monitoring during pregnancy is becoming increasingly important because of the growing number of high-risk pregnancies. Despite recent advances in signal-processing technology, which have enabled fetal monitoring during pregnancy using abdominal electrocardiogram (ECG) recordings, ubiquitous fetal health monitoring is still unfeasible due to the computational complexity of noise-robust solutions. In this paper, an ECG R-peak detection algorithm for ambulatory R-peak detection is proposed as part of a fetal ECG detection algorithm. The proposed algorithm is optimized to reduce computational complexity without reducing the R-peak detection performance compared to existing R-peak detection schemes. Validation of the algorithm is performed on three manually annotated datasets. With detection error rates of 0.23%, 1.32% and 9.42% on the MIT/BIH Arrhythmia database and the in-house maternal and fetal databases, respectively, the detection rate of the proposed algorithm is comparable to the best state-of-the-art algorithms, at a reduced computational complexity.
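
    For orientation, a typical low-complexity R-peak detector follows a derivative, squaring, moving-window integration and thresholding chain; the sketch below illustrates that generic family on a synthetic impulse train and is not the authors' algorithm.

```python
# Generic low-complexity R-peak detector: differentiate, square, integrate
# over a short window, then pick local maxima above a threshold while
# enforcing a refractory period. The synthetic trace stands in for an ECG.
import numpy as np

def detect_r_peaks(ecg, fs, win_ms=120, refractory_ms=250):
    diff = np.diff(ecg, prepend=ecg[0])                  # emphasise steep QRS slopes
    energy = np.convolve(diff ** 2, np.ones(int(fs * win_ms / 1000)), mode="same")
    threshold = 0.5 * energy.max()
    refractory = int(fs * refractory_ms / 1000)
    peaks, last = [], -refractory
    for i in range(1, len(energy) - 1):
        is_local_max = energy[i] >= energy[i - 1] and energy[i] > energy[i + 1]
        if energy[i] > threshold and is_local_max and i - last >= refractory:
            peaks.append(i)
            last = i
    return np.array(peaks)

fs = 250
ecg = np.zeros(10 * fs)
ecg[::int(0.8 * fs)] = 1.0                               # impulse "R waves" every 0.8 s (hypothetical)
print(detect_r_peaks(ecg, fs))
```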

  4. The highly intelligent virtual agents for modeling financial markets

    NASA Astrophysics Data System (ADS)

    Yang, G.; Chen, Y.; Huang, J. P.

    2016-02-01

    Researchers have borrowed many theories from statistical physics, such as ensemble theory and the Ising model, to study complex adaptive systems through agent-based modeling. However, one fundamental difference between entities (such as spins) in physics and micro-units in complex adaptive systems is that the latter usually possess high intelligence, such as investors in financial markets. Although highly intelligent virtual agents are essential for agent-based modeling to play a full role in the study of complex adaptive systems, how to create such agents is still an open question. Hence, we propose three principles for designing high artificial intelligence in financial markets and then build a specific class of agents called iAgents based on these three principles. Finally, we evaluate the intelligence of iAgents through virtual index trading in two different stock markets. For comparison, we also include three other types of agents in this contest, namely, random traders, agents from the wealth game (a modification of the famous minority game), and agents from an upgraded wealth game. As a result, iAgents perform the best, which gives strong support for the three principles. This work offers a general framework for the further development of agent-based modeling for various kinds of complex adaptive systems.

  5. Feedforward object-vision models only tolerate small image variations compared to human

    PubMed Central

    Ghodrati, Masoud; Farzmahdi, Amirhossein; Rajaei, Karim; Ebrahimpour, Reza; Khaligh-Razavi, Seyed-Mahdi

    2014-01-01

    Invariant object recognition is a remarkable ability of the primate visual system whose underlying mechanism has long been under intense investigation. Computational modeling is a valuable tool toward understanding the processes involved in invariant object recognition. Although recent computational models have shown outstanding performances on challenging image databases, they fail to perform well in image categorization under more complex image variations. Studies have shown that making sparse representations of objects by extracting more informative visual features through a feedforward sweep can lead to higher recognition performance. Here, however, we show that when the complexity of image variations is high, even this approach results in poor performance compared to humans. To assess the performance of models and humans in invariant object recognition tasks, we built a parametrically controlled image database consisting of several object categories varied in different dimensions and levels, rendered from 3D planes. Comparing the performance of several object recognition models with human observers shows that the models perform similarly to humans in categorization tasks only under low-level image variations. Furthermore, the results of our behavioral experiments demonstrate that, even under difficult experimental conditions (i.e., briefly presented masked stimuli with complex image variations), human observers performed outstandingly well, suggesting that the models are still far from resembling humans in invariant object recognition. Taken together, we suggest that learning sparse informative visual features, although desirable, is not a complete solution for future progress in object-vision modeling. We show that this approach is not of significant help in solving the computational crux of object recognition (i.e., invariant object recognition) when the identity-preserving image variations become more complex. PMID:25100986

  6. Predictive model of complexity in early palliative care: a cohort of advanced cancer patients (PALCOM study).

    PubMed

    Tuca, Albert; Gómez-Martínez, Mónica; Prat, Aleix

    2018-01-01

    The model of early palliative care (PC) integrated in oncology is based on shared care from diagnosis to the end of life and is mainly focused on patients with greater complexity. However, there are no definitions or tools to evaluate PC complexity. The objectives of the study were to identify the factors influencing level determination of complexity, propose predictive models, and build a complexity scale of PC. We performed a prospective, observational, multicenter study in a cohort of advanced cancer patients with an estimated prognosis ≤ 6 months. An ad hoc structured evaluation including socio-demographic and clinical data, symptom burden, functional and cognitive status, psychosocial problems, and existential-ethic dilemmas was recorded systematically. According to this multidimensional evaluation, investigators classified patients as having high, medium, or low palliative complexity, associated with the need for basic or specialized PC. Logistic regression was used to identify the variables influencing determination of the level of PC complexity and to explore predictive models. We included 324 patients; 41% were classified as having high PC complexity and 42.9% as medium, both levels being associated with specialized PC. Variables influencing determination of PC complexity were as follows: high symptom burden (OR 3.19 95%CI: 1.72-6.17), difficult pain (OR 2.81 95%CI:1.64-4.9), functional status (OR 0.99 95%CI:0.98-0.9), and social-ethical existential risk factors (OR 3.11 95%CI:1.73-5.77). Logistic analysis of these variables allowed construction of a complexity model and structured scales (PALCOM 1 and 2) with high predictive value (AUC ROC 76%). This study provides a new model and tools to assess complexity in palliative care, which may be very useful for managing referral to specialized PC services and agreeing on the intensity of their intervention in a model of early shared care integrated in oncology.
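
    To make the modeling step concrete, a minimal sketch of this kind of logistic complexity model is shown below; the predictors mirror those reported above, but the data, coefficients, and feature encodings are synthetic placeholders rather than the PALCOM dataset:

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(1)
      n = 324
      # Hypothetical predictors: high symptom burden, difficult pain,
      # functional status score (0-100), social/ethical-existential risk.
      X = np.column_stack([
          rng.integers(0, 2, n),      # high symptom burden (yes/no)
          rng.integers(0, 2, n),      # difficult pain (yes/no)
          rng.uniform(20, 100, n),    # functional status score
          rng.integers(0, 2, n),      # psychosocial/existential risk (yes/no)
      ])
      # Synthetic outcome: 1 = high/medium complexity (needs specialized PC).
      linear = 1.2 * X[:, 0] + 1.0 * X[:, 1] - 0.03 * X[:, 2] + 1.1 * X[:, 3]
      y = (linear + rng.logistic(size=n) > 0).astype(int)

      model = LogisticRegression(max_iter=1000).fit(X, y)
      odds_ratios = np.exp(model.coef_[0])          # comparable in spirit to the reported ORs
      auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
      print("odds ratios:", odds_ratios.round(2), " AUC:", round(auc, 2))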

  7. Hydrothermal preparation and physicochemical studies of new copper nano-complexes for antitumor application

    NASA Astrophysics Data System (ADS)

    Saif, M.; El-Shafiy, Hoda F.; Mashaly, Mahmoud M.; Eid, Mohamed F.; Nabeel, A. I.; Fouad, R.

    2018-03-01

    Two novel nano-complexes [(Cu)2(L) (NO3)2(OH2)] (CuH) and [Cu(HL) (OH2)2(NO3)] (CuCTH) were synthesized by a hydrothermal method at 200 °C for 48 h in the absence and presence of surfactant (CTAB), respectively. Introducing the surfactant (CTAB) changes the stoichiometric metal/ligand ratio from the binuclear (CuH) to the mononuclear (CuCTH) nano-complex. CuH shows an irregular nano-flake shape, while CuCTH consists of separate, uniform nano-spheres. Thermal analysis revealed that CuCTH is thermally stable in comparison with the CuH nano-complex. The CuCTH absorption peak shifted to shorter wavelength (blue shift), and the sharpness of the peak also decreased in the presence of CTAB. The role of CTAB in the crystal growth is discussed. The CuH and CuCTH nano-complexes were tested for their in vitro cytotoxicity against the Ehrlich Ascites Carcinoma cell line (E.A.C.). Both nano-complexes effectively inhibited E.A.C. growth, with IC50 values of 37 and 25 μM for CuH and CuCTH, respectively. The high antitumor activity of CuCTH was attributed to several factors such as spherical morphology, smaller size, chemical structure, and geometry. The LD50 of the highly cytotoxic CuCTH nano-complex in mice was found to be 100 mg/kg, with strong abscess formation in the abdomen as a side effect. To overcome this side effect, different molar ratios of CuCTH and previously prepared Zn nano-complexes were tested for their in vitro cytotoxicity and in vivo toxicity. The obtained results show that a 2:8 molar ratio of CuCTH to Zn nano-complexes gives very low toxicity without any side effects. Also, geometric optimization and conformational analysis were performed using the semi-empirical PM3 method. The energy gap (ΔE), dipole moment, and structure-activity relationship were calculated and discussed.

  8. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    DOE PAGES

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors present Synergia's design principles and its performance on HPC platforms.

  9. Automatic Information Processing and High Performance Skills: Individual Differences and Mechanisms of Performance Improvement in Search-Detection and Complex Task

    DTIC Science & Technology

    1992-09-01

    abilities is fit along with the autoregressive process. Initially, the influences on search performance of within-group age and sex were included as control... Results: Performance/Ability Structure. Measurement Model: Ability Structure. The correlations between all the ability measures, age, and sex are... subsequent analyses for young adults. Age and sex were included as control variables. There was an age range of 15 years; this range is sufficiently large that

  10. Rational and Efficient Preparative Isolation of Natural Products by MPLC-UV-ELSD based on HPLC to MPLC Gradient Transfer.

    PubMed

    Challal, Soura; Queiroz, Emerson Ferreira; Debrus, Benjamin; Kloeti, Werner; Guillarme, Davy; Gupta, Mahabir Prashad; Wolfender, Jean-Luc

    2015-11-01

    In natural product research, the isolation of biomarkers or bioactive compounds from complex natural extracts represents an essential step for de novo identification and bioactivity assessment. When pure natural products have to be obtained in milligram quantities, the chromatographic steps are generally laborious and time-consuming. In this respect, an efficient method has been developed for the reversed-phase gradient transfer from high-performance liquid chromatography to medium-performance liquid chromatography for the isolation of pure natural products at the level of tens of milligrams from complex crude natural extracts. The proposed method provides a rational way to predict retention behaviour and resolution at the analytical scale prior to medium-performance liquid chromatography, and guarantees similar performances at both analytical and preparative scales. The optimisation of the high-performance liquid chromatography separation and system characterisation allows for the prediction of the gradient at the medium-performance liquid chromatography scale by using identical stationary phase chemistries. The samples were introduced in medium-performance liquid chromatography using a pressure-resistant aluminium dry load cell especially designed for this study to allow high sample loading while maintaining a maximum achievable flow rate for the separation. The method has been validated with a mixture of eight natural product standards. Ultraviolet and evaporative light scattering detections were used in parallel for a comprehensive monitoring. In addition, post-chromatographic mass spectrometry detection was provided by high-throughput ultrahigh-performance liquid chromatography time-of-flight mass spectrometry analyses of all fractions. The processing of all liquid chromatography-mass spectrometry data in the form of a medium-performance liquid chromatography × ultrahigh-performance liquid chromatography time-of-flight mass spectrometry matrix enabled an efficient localisation of the compounds of interest in the generated fractions. The methodology was successfully applied for the separation of three different plant extracts that contain many diverse secondary metabolites. The advantages and limitations of this approach and the theoretical chromatographic background that governs such liquid chromatography gradient transfers are presented from a practical viewpoint. Georg Thieme Verlag KG Stuttgart · New York.

  11. Stepping into the omics era: Opportunities and challenges for biomaterials science and engineering.

    PubMed

    Groen, Nathalie; Guvendiren, Murat; Rabitz, Herschel; Welsh, William J; Kohn, Joachim; de Boer, Jan

    2016-04-01

    The research paradigm in biomaterials science and engineering is evolving from using low-throughput and iterative experimental designs towards high-throughput experimental designs for materials optimization and the evaluation of materials properties. Computational science plays an important role in this transition. With the emergence of the omics approach in the biomaterials field, referred to as materiomics, high-throughput approaches hold the promise of tackling the complexity of materials and understanding correlations between material properties and their effects on complex biological systems. The intrinsic complexity of biological systems is an important factor that is often oversimplified when characterizing biological responses to materials and establishing property-activity relationships. Indeed, in vitro tests designed to predict in vivo performance of a given biomaterial are largely lacking as we are not able to capture the biological complexity of whole tissues in an in vitro model. In this opinion paper, we explain how we reached our opinion that converging genomics and materiomics into a new field would enable a significant acceleration of the development of new and improved medical devices. The use of computational modeling to correlate high-throughput gene expression profiling with high throughput combinatorial material design strategies would add power to the analysis of biological effects induced by material properties. We believe that this extra layer of complexity on top of high-throughput material experimentation is necessary to tackle the biological complexity and further advance the biomaterials field. Copyright © 2016. Published by Elsevier Ltd.

  12. Delivery of the high-mobility group box 1 box A peptide using heparin in the acute lung injury animal models.

    PubMed

    Song, Ji Hyun; Kim, Ji Yeon; Piao, Chunxian; Lee, Seonyeong; Kim, Bora; Song, Su Jeong; Choi, Joon Sig; Lee, Minhyung

    2016-07-28

    In this study, the efficacy of the high-mobility group box-1 box A (HMGB1A)/heparin complex was evaluated for the treatment of acute lung injury (ALI). HMGB1A is an antagonist against wild-type high-mobility group box-1 (wtHMGB1), a pro-inflammatory cytokine that is involved in ALIs. HMGB1A has positive charges and can be captured in the mucus layer after intratracheal administration. To enhance the delivery and therapeutic efficiency of HMGB1A, the HMGB1A/heparin complex was produced using electrostatic interactions, with the expectation that the nano-sized complex with a negative surface charge could efficiently penetrate the mucus layer. Additionally, heparin itself had an anti-inflammatory effect. Complex formation with HMGB1A and heparin was confirmed by atomic force microscopy. The particle size and surface charge of the HMGB1A/heparin complex at a 1:1 weight ratio were 113nm and -25mV, respectively. Intratracheal administration of the complex was performed into an ALI animal model. The results showed that the HMGB1A/heparin complex reduced pro-inflammatory cytokines, including tumor necrosis factor-α (TNF-α), interleukin-6 (IL-6), and IL-1β, more effectively than HMGB1A or heparin alone. Hematoxylin and eosin staining confirmed the decreased inflammatory reaction in the lungs after delivery of the HMGB1A/heparin complex. In conclusion, the HMGB1A/heparin complex might be useful to treat ALI. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Predicting High School Completion Using Student Performance in High School Algebra: A Mixed Methods Research Study

    ERIC Educational Resources Information Center

    Chiado, Wendy S.

    2012-01-01

    Too many of our nation's youth have failed to complete high school. Determining why so many of our nation's students fail to graduate is a complex, multi-faceted problem and beyond the scope of any one study. The study presented herein utilized a thirteen-step mixed methods model developed by Leech and Onwuegbuzie (2007) to demonstrate within a…

  14. A high performance, ad-hoc, fuzzy query processing system for relational databases

    NASA Technical Reports Server (NTRS)

    Mansfield, William H., Jr.; Fleischman, Robert M.

    1992-01-01

    Database queries involving imprecise or fuzzy predicates are currently an evolving area of academic and industrial research. Such queries place severe stress on the indexing and I/O subsystems of conventional database environments since they involve the search of large numbers of records. The Datacycle architecture and research prototype is a database environment that uses filtering technology to perform an efficient, exhaustive search of an entire database. It has recently been modified to include fuzzy predicates in its query processing. The approach obviates the need for complex index structures, provides unlimited query throughput, permits the use of ad-hoc fuzzy membership functions, and provides a deterministic response time largely independent of query complexity and load. This paper describes the Datacycle prototype implementation of fuzzy queries and some recent performance results.
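
    A small sketch of the underlying idea of a fuzzy predicate, evaluated here in plain Python over an in-memory list of records rather than in the Datacycle filtering hardware; the membership function, its breakpoints, and the record fields are illustrative assumptions:

      def trapezoid(x, a, b, c, d):
          """Trapezoidal membership: 0 below a, rises to 1 on [b, c], falls to 0 at d."""
          if x <= a or x >= d:
              return 0.0
          if b <= x <= c:
              return 1.0
          return (x - a) / (b - a) if x < b else (d - x) / (d - c)

      records = [
          {"id": 1, "price": 95,  "in_stock": True},
          {"id": 2, "price": 140, "in_stock": True},
          {"id": 3, "price": 102, "in_stock": False},
          {"id": 4, "price": 111, "in_stock": True},
      ]

      # Fuzzy query: "price is about 100 AND in stock", using min() as the fuzzy AND.
      results = []
      for rec in records:
          mu = min(trapezoid(rec["price"], 80, 95, 105, 120), 1.0 if rec["in_stock"] else 0.0)
          if mu > 0:
              results.append((mu, rec["id"]))

      for mu, rid in sorted(results, reverse=True):
          print(f"record {rid}: membership {mu:.2f}")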

  15. Poly-Omic Prediction of Complex Traits: OmicKriging

    PubMed Central

    Wheeler, Heather E.; Aquino-Michaels, Keston; Gamazon, Eric R.; Trubetskoy, Vassily V.; Dolan, M. Eileen; Huang, R. Stephanie; Cox, Nancy J.; Im, Hae Kyung

    2014-01-01

    High-confidence prediction of complex traits such as disease risk or drug response is an ultimate goal of personalized medicine. Although genome-wide association studies have discovered thousands of well-replicated polymorphisms associated with a broad spectrum of complex traits, the combined predictive power of these associations for any given trait is generally too low to be of clinical relevance. We propose a novel systems approach to complex trait prediction, which leverages and integrates similarity in genetic, transcriptomic, or other omics-level data. We translate the omic similarity into phenotypic similarity using a method called Kriging, commonly used in geostatistics and machine learning. Our method called OmicKriging emphasizes the use of a wide variety of systems-level data, such as those increasingly made available by comprehensive surveys of the genome, transcriptome, and epigenome, for complex trait prediction. Furthermore, our OmicKriging framework allows easy integration of prior information on the function of subsets of omics-level data from heterogeneous sources without the sometimes heavy computational burden of Bayesian approaches. Using seven disease datasets from the Wellcome Trust Case Control Consortium (WTCCC), we show that OmicKriging allows simple integration of sparse and highly polygenic components yielding comparable performance at a fraction of the computing time of a recently published Bayesian sparse linear mixed model method. Using a cellular growth phenotype, we show that integrating mRNA and microRNA expression data substantially increases performance over either dataset alone. Using clinical statin response, we show improved prediction over existing methods. PMID:24799323
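
    The kriging step can be pictured with a minimal sketch: build a similarity (kinship-like) matrix from omic data and weight the training phenotypes by each test sample's similarity to the training samples. The data below are synthetic and the ridge-style regularization is an assumption; this is not the OmicKriging package itself:

      import numpy as np

      rng = np.random.default_rng(2)
      n, p = 200, 500
      G = rng.standard_normal((n, p))                      # e.g., standardized genotypes or expression
      y = G[:, :20].sum(axis=1) + rng.standard_normal(n)   # polygenic synthetic phenotype

      K = G @ G.T / p                                      # omic similarity matrix
      train, test = np.arange(150), np.arange(150, n)

      # Kriging/GBLUP-style prediction: similarity-weighted combination of training phenotypes.
      lam = 1.0                                            # regularization (noise-to-signal ratio)
      alpha = np.linalg.solve(K[np.ix_(train, train)] + lam * np.eye(len(train)),
                              y[train] - y[train].mean())
      y_hat = y[train].mean() + K[np.ix_(test, train)] @ alpha

      corr = np.corrcoef(y_hat, y[test])[0, 1]
      print(f"prediction correlation on held-out samples: {corr:.2f}")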

  16. Malaria Parasite CLAG3, a Protein Linked to Nutrient Channels, Participates in High Molecular Weight Membrane-Associated Complexes in the Infected Erythrocyte

    PubMed Central

    Zainabadi, Kayvan

    2016-01-01

    Malaria infected erythrocytes show increased permeability to a number of solutes important for parasite growth as mediated by the Plasmodial Surface Anion Channel (PSAC). The P. falciparum clag3 genes have recently been identified as key determinants of PSAC, though exactly how they contribute to channel function and whether additional host/parasite proteins are required remain unknown. To begin to answer these questions, I have taken a biochemical approach. Here I have used an epitope-tagged CLAG3 parasite to perform co-immunoprecipitation experiments using membrane fractions of infected erythrocytes. Native PAGE and mass spectrometry studies reveal that CLAG3 participates in at least three different high molecular weight complexes: a ~720 kDa complex consisting of CLAG3, RHOPH2 and RHOPH3; a ~620 kDa complex consisting of CLAG3 and RHOPH2; and a ~480 kDa complex composed solely of CLAG3. Importantly, these complexes can be found throughout the parasite lifecycle but are absent in untransfected controls. Extracellular biotin labeling and protease susceptibility studies localize the 480 kDa complex to the erythrocyte membrane. This complex, likely composed of a homo-oligomer of 160 kDa CLAG3, may represent a functional subunit, possibly the pore, of PSAC. PMID:27299521

  17. WRF simulation over complex terrain during a southern California wildfire event

    NASA Astrophysics Data System (ADS)

    Lu, W.; Zhong, S.; Charney, J. J.; Bian, X.; Liu, S.

    2012-03-01

    In October 2007, the largest wildfire-related evacuation in California's history occurred as severe wildfires broke out across southern California. Smoke from these wildfires contributed to elevated pollutant concentrations in the atmosphere, affecting air quality in a vast region of the western United States. High-resolution numerical simulations were performed using the Weather Research and Forecast (WRF) model to understand the atmospheric conditions during the wildfire episode and how the complex circulation patterns might affect smoke transport and dispersion. The simulated meteorological fields were validated using surface and upper air observations in California and Nevada. To distinguish the performance of the WRF in different geographic regions, the surface stations were grouped into coastal sites, valley and basin sites, and mountain sites, and the results for the three categories were analyzed and intercompared. For temperature and moisture, the mountain category had the best agreement with the observations, while the coastal category had the worst. For wind, the model performance for the three categories was very similar. The flow patterns over complex terrain were also analyzed under different synoptic conditions, and the possible impact of the terrain on smoke and pollutant pathways was assessed by employing a Lagrangian Particle Dispersion Model. When high mountains prevent the smoke from moving inland, the mountain passes act as active pathways for smoke transport; meanwhile, the chimney effect helps inject pollutants to higher levels, where they are transported regionally. The results highlight the role of complex topography in the assessment of the possible smoke transport patterns in the region.

  18. Topological Properties of Some Integrated Circuits for Very Large Scale Integration Chip Designs

    NASA Astrophysics Data System (ADS)

    Swanson, S.; Lanzerotti, M.; Vernizzi, G.; Kujawski, J.; Weatherwax, A.

    2015-03-01

    This talk presents topological properties of integrated circuits for Very Large Scale Integration chip designs. These circuits can be implemented in very large scale integrated circuits, such as those in high performance microprocessors. Prior work considered basic combinational logic functions and produced a mathematical framework based on algebraic topology for integrated circuits composed of logic gates. Prior work also produced an historically-equivalent interpretation of Mr. E. F. Rent's work for today's complex circuitry in modern high performance microprocessors, where a heuristic linear relationship was observed between the number of connections and number of logic gates. This talk will examine topological properties and connectivity of more complex functionally-equivalent integrated circuits. The views expressed in this article are those of the author and do not reflect the official policy or position of the United States Air Force, Department of Defense or the U.S. Government.

  19. Compression of hyper-spectral images using an accelerated nonnegative tensor decomposition

    NASA Astrophysics Data System (ADS)

    Li, Jin; Liu, Zilong

    2017-12-01

    Nonnegative tensor Tucker decomposition (NTD) in a transform domain (e.g., 2D-DWT) has been used in the compression of hyper-spectral images because it can remove redundancies between spectrum bands and also exploit spatial correlations of each band. However, the use of an NTD has a very high computational cost. In this paper, we propose a low-complexity NTD-based compression method for hyper-spectral images. This method is based on a pair-wise multilevel grouping approach for the NTD to overcome its high computational cost. The proposed method has low complexity with only a slight decrease in coding performance compared to the conventional NTD. Experimental results confirm that the proposed method requires less processing time and maintains better coding performance than the case in which the NTD is not used. The proposed approach has a potential application in the lossy compression of hyper-spectral or multi-spectral images.
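
    The core idea of Tucker-based compression can be sketched in a few lines: store a small core tensor plus one factor matrix per mode instead of the full cube. The simplified sketch below uses an unconstrained truncated HOSVD on a synthetic cube, without the nonnegativity constraint, transform-domain step, or pair-wise grouping described above; all sizes and ranks are illustrative:

      import numpy as np

      def unfold(T, mode):
          """Mode-n unfolding: move the given mode to the front and flatten the rest."""
          return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

      def mode_multiply(T, M, mode):
          """Multiply tensor T by matrix M along the given mode."""
          return np.moveaxis(np.tensordot(M, np.moveaxis(T, mode, 0), axes=1), 0, mode)

      def tucker_compress(T, ranks):
          """Truncated HOSVD: per-mode factor matrices plus a small core tensor."""
          factors = [np.linalg.svd(unfold(T, m), full_matrices=False)[0][:, :r]
                     for m, r in enumerate(ranks)]
          core = T
          for m, U in enumerate(factors):
              core = mode_multiply(core, U.T, m)
          return core, factors

      def tucker_reconstruct(core, factors):
          T = core
          for m, U in enumerate(factors):
              T = mode_multiply(T, U, m)
          return T

      # Synthetic low-rank "hyper-spectral cube": rows x cols x bands.
      rng = np.random.default_rng(3)
      cube = sum(np.einsum('i,j,k->ijk', rng.random(64), rng.random(64), rng.random(32))
                 for _ in range(5))
      core, factors = tucker_compress(cube, ranks=(8, 8, 4))
      err = np.linalg.norm(cube - tucker_reconstruct(core, factors)) / np.linalg.norm(cube)
      ratio = cube.size / (core.size + sum(U.size for U in factors))
      print(f"relative error {err:.3f}, compression ratio {ratio:.1f}x")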

  20. The computational challenges of Earth-system science.

    PubMed

    O'Neill, Alan; Steenman-Clark, Lois

    2002-06-15

    The Earth system--comprising atmosphere, ocean, land, cryosphere and biosphere--is an immensely complex system, involving processes and interactions on a wide range of space- and time-scales. To understand and predict the evolution of the Earth system is one of the greatest challenges of modern science, with success likely to bring enormous societal benefits. High-performance computing, along with the wealth of new observational data, is revolutionizing our ability to simulate the Earth system with computer models that link the different components of the system together. There are, however, considerable scientific and technical challenges to be overcome. This paper will consider four of them: complexity, spatial resolution, inherent uncertainty and time-scales. Meeting these challenges requires a significant increase in the power of high-performance computers. The benefits of being able to make reliable predictions about the evolution of the Earth system should, on their own, amply repay this investment.

  1. Microwave absorption properties of flake-shaped Co particles composites at elevated temperature (293-673 K) in X band

    NASA Astrophysics Data System (ADS)

    Wang, Guowu; Li, Xiling; Wang, Peng; Zhang, Junming; Wang, Dian; Qiao, Liang; Wang, Tao; Li, Fashen

    2018-06-01

    The complex permeability and permittivity of the easy-plane anisotropic Co/polyimide composite at high temperature (293-673 K) in X band were measured. The results show that both the complex permeability and permittivity increase with the increase of temperature in the measured temperature range. The calculated absorption properties display that the intensity of the reflection loss (RL) peak first increases and then decreases with the increase of temperature, and reaches the maximum (-52 dB) at 523 K. At each temperature, the composite can achieve the RL exceeding -10 dB in the whole X band. The composite can even work stably for more than 20 min with the excellent absorption performance under 673 K. In addition, the RL performance of the composite at high temperature is better than that at room temperature.

  2. High-speed separation and characterization of major constituents in Radix Paeoniae Rubra by fast high-performance liquid chromatography coupled with diode-array detection and time-of-flight mass spectrometry.

    PubMed

    Liu, E-Hu; Qi, Lian-Wen; Li, Bin; Peng, Yong-Bo; Li, Ping; Li, Chang-Yin; Cao, Jun

    2009-01-01

    A fast high-performance liquid chromatography (HPLC) method coupled with diode-array detection (DAD) and electrospray ionization time-of-flight mass spectrometry (ESI-TOFMS) has been developed for rapid separation and sensitive identification of major constituents in Radix Paeoniae Rubra (RPR). The total analysis time on a short column packed with 1.8-microm porous particles was about 20 min without a loss in resolution, six times faster than the performance of a conventional column analysis (115 min). The MS fragmentation behavior and structural characterization of major compounds in RPR were investigated here for the first time. The targets were rapidly screened from RPR matrix using a narrow mass window of 0.01 Da to restructure extracted ion chromatograms. Accurate mass measurements (less than 5 ppm error) for both the deprotonated molecule and characteristic fragment ions represent reliable identification criteria for these compounds in complex matrices with similar if not even better performance compared with tandem mass spectrometry. A total of 26 components were screened and identified in RPR including 11 monoterpene glycosides, 11 galloyl glucoses and 4 other phenolic compounds. From the point of time savings, resolving power, accurate mass measurement capability and full spectral sensitivity, the established fast HPLC/DAD/TOFMS method turns out to be a highly useful technique to identify constituents in complex herbal medicines. (c) 2008 John Wiley & Sons, Ltd.

  3. Computational Characterization of Electromagnetic Field Propagation in Complex Structures

    DTIC Science & Technology

    1998-04-10

    "Computational characterization of electromagnetic field propagation in complex structures", DAAH01-91-D-ROOS D.O. 59. Dr. Michael Scalora performed the... Development, and Engineering Center, Bldg. 7804, Room 242, Redstone Arsenal, Alabama 35898-5248 USA. Dr. Michael Scalora, Quantum Optics Group, Tel: (205... scheduled to appear. They are: (1) M. Scalora, J.P. Dowling, A.S. Manka, C.M. Bowden, and J.W. Haus, Pulse Propagation Near Highly Reflective

  4. USSR and Eastern Europe Scientific Abstracts, Cybernetics, Computers, and Automation Technology, Number 26

    DTIC Science & Technology

    1977-01-26

    Sisteme Matematicheskogo Obespecheniya YeS EVM [Applied Programs in the Software System for the Unified System of Computers], by A. Ye. Fateyev, A. I... computerized systems are most effective in large production complexes, in which the level of utilization of computers can be as high as 500,000... performance of these tasks could be furthered by the complex introduction of electronic computers in automated control systems. The creation of ASU

  5. Efficient Symbolic Task Planning for Multiple Mobile Robots

    DTIC Science & Technology

    2016-12-13

    Efficient Symbolic Task Planning for Multiple Mobile Robots Yuqian Jiang December 13, 2016 Abstract Symbolic task planning enables a robot to make... high-level decisions toward a complex goal by computing a sequence of actions with minimum expected costs. This thesis builds on a single-robot... time complexity of optimal planning for multiple mobile robots. In this thesis we first investigate the performance of the state-of-the-art solvers of

  6. AstroGrid: Taverna in the Virtual Observatory .

    NASA Astrophysics Data System (ADS)

    Benson, K. M.; Walton, N. A.

    This paper reports on the implementation of the Taverna workbench by AstroGrid, a tool for designing and executing workflows of tasks in the Virtual Observatory. The workflow approach helps astronomers perform complex task sequences with little technical effort. The visual approach to workflow construction streamlines highly complex analyses over public and private data while requiring computational resources as modest as a desktop computer. Some integration issues and future work are discussed in this article.

  7. Gaze entropy reflects surgical task load.

    PubMed

    Di Stasi, Leandro L; Diaz-Piedra, Carolina; Rieiro, Héctor; Sánchez Carrión, José M; Martin Berrido, Mercedes; Olivares, Gonzalo; Catena, Andrés

    2016-11-01

    Task (over-)load imposed on surgeons is a main contributing factor to surgical errors. Recent research has shown that gaze metrics represent a valid and objective index to assess operator task load in non-surgical scenarios. Thus, gaze metrics have the potential to improve workplace safety by providing accurate measurements of task load variations. However, the direct relationship between gaze metrics and surgical task load has not been investigated yet. We studied the effects of surgical task complexity on the gaze metrics of surgical trainees. We recorded the eye movements of 18 surgical residents, using a mobile eye tracker system, during the performance of three high-fidelity virtual simulations of laparoscopic exercises of increasing complexity level: Clip Applying exercise, Cutting Big exercise, and Translocation of Objects exercise. We also measured performance accuracy and subjective rating of complexity. Gaze entropy and velocity linearly increased with increased task complexity: the visual exploration pattern became less stereotyped (i.e., more random) and faster during the more complex exercises. Residents performed the Clip Applying and Cutting Big exercises better than the Translocation of Objects exercise, and their perceived task complexity differed accordingly. Our data show that gaze metrics are a valid and reliable surgical task load index. These findings have the potential to improve patient safety by providing accurate measurements of surgeon task (over-)load and might provide future indices to assess residents' learning curves, independently of expensive virtual simulators or time-consuming expert evaluation.

  8. Aiding the Detection of QRS Complex in ECG Signals by Detecting S Peaks Independently.

    PubMed

    Sabherwal, Pooja; Singh, Latika; Agrawal, Monika

    2018-03-30

    In this paper, a novel algorithm for the accurate detection of the QRS complex, which combines the independent detection of R and S peaks through a fusion algorithm, is proposed. R-peak detection has been extensively studied and is commonly used to detect the QRS complex, whereas S peaks, which are also part of the QRS complex, can be independently detected to aid QRS detection. In this paper, we suggest a method to first estimate S peaks from the raw ECG signal and then use them to aid the detection of the QRS complex. The amplitude of the S peak in an ECG signal is relatively weak compared with the corresponding R peak, which is traditionally used for QRS detection; therefore, an appropriate digital filter is designed to enhance the S peaks. These enhanced S peaks are then detected by adaptive thresholding. The algorithm is validated on all the signals of the MIT-BIH Arrhythmia Database and the noise stress database taken from physionet.org. The algorithm performs reasonably well even for signals highly corrupted by noise. The algorithm's performance is confirmed by a sensitivity and positive predictivity of 99.99% and a detection accuracy of 99.98% for QRS complex detection. The numbers of false positives and false negatives obtained during the analysis were drastically reduced to 80 and 42, respectively, against 98 and 84 for the best results reported so far.
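
    A toy sketch of the fusion idea follows: find R candidates on the raw signal and S candidates on the inverted signal, then keep only those R peaks that are followed by an S trough within a short window. It relies on scipy's generic peak finder and a synthetic beat shape rather than the dedicated S-enhancing filter and adaptive thresholds described above; all parameters are illustrative:

      import numpy as np
      from scipy.signal import find_peaks

      def fuse_qrs(ecg, fs, max_rs_ms=80):
          """Confirm a QRS complex when an S trough follows an R peak within max_rs_ms."""
          # R candidates: prominent positive peaks; S candidates: prominent troughs (peaks of -ecg).
          r_cand, _ = find_peaks(ecg, prominence=0.4 * np.ptp(ecg))
          s_cand, _ = find_peaks(-ecg, prominence=0.2 * np.ptp(ecg))
          max_gap = int(fs * max_rs_ms / 1000)
          qrs = [r for r in r_cand if np.any((s_cand > r) & (s_cand <= r + max_gap))]
          return np.asarray(qrs)

      # Toy beat: an R upstroke followed ~55 ms later by an S downstroke, once per second.
      fs = 360
      ecg = 0.02 * np.random.randn(5 * fs)
      for b in range(0, ecg.size, fs):
          ecg[b] += 1.0                      # R peak
          if b + 20 < ecg.size:
              ecg[b + 20] -= 0.5             # S trough
      print(fuse_qrs(ecg, fs))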

  9. Smart Swarms of Bacteria-Inspired Agents with Performance Adaptable Interactions

    PubMed Central

    Shklarsh, Adi; Ariel, Gil; Schneidman, Elad; Ben-Jacob, Eshel

    2011-01-01

    Collective navigation and swarming have been studied in animal groups, such as fish schools, bird flocks, bacteria, and slime molds. Computer modeling has shown that collective behavior of simple agents can result from simple interactions between the agents, which include short range repulsion, intermediate range alignment, and long range attraction. Here we study collective navigation of bacteria-inspired smart agents in complex terrains, with adaptive interactions that depend on performance. More specifically, each agent adjusts its interactions with the other agents according to its local environment – by decreasing the peers' influence while navigating in a beneficial direction, and increasing it otherwise. We show that inclusion of such performance dependent adaptable interactions significantly improves the collective swarming performance, leading to highly efficient navigation, especially in complex terrains. Notably, to afford such adaptable interactions, each modeled agent requires only simple computational capabilities with short-term memory, which can easily be implemented in simple swarming robots. PMID:21980274
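
    The performance-dependent rule can be illustrated with a minimal sketch in which each agent blends a noisy private cue with the mean heading of nearby peers and lowers the peers' weight whenever its last step improved a synthetic terrain cost (raising it otherwise). The terrain, radii, step sizes, and adaptation rates are all illustrative assumptions, not the authors' model:

      import numpy as np

      rng = np.random.default_rng(4)
      n, steps, target = 30, 200, np.array([8.0, 8.0])
      pos = rng.uniform(-5, 5, (n, 2))
      heading = rng.standard_normal((n, 2))
      heading /= np.linalg.norm(heading, axis=1, keepdims=True)
      peer_w = np.full(n, 0.5)                 # adaptable weight given to peer headings

      def terrain(p):
          """Synthetic cost surface: distance to a target location (lower is better)."""
          return np.linalg.norm(p - target, axis=-1)

      prev_cost = terrain(pos)
      for _ in range(steps):
          # Mean heading of peers within an interaction radius.
          d = np.linalg.norm(pos[:, None] - pos[None, :], axis=-1)
          near = (d < 2.0) & (d > 0)
          peer_dir = np.where(near[:, :, None], heading[None, :, :], 0.0).sum(axis=1)
          norms = np.linalg.norm(peer_dir, axis=1, keepdims=True)
          peer_dir = np.divide(peer_dir, norms, out=np.zeros_like(peer_dir), where=norms > 0)

          # Noisy private cue (a stand-in for local gradient sensing).
          own_dir = target - pos
          own_dir /= np.linalg.norm(own_dir, axis=1, keepdims=True)
          own_dir += 0.8 * rng.standard_normal((n, 2))

          heading = (1 - peer_w)[:, None] * own_dir + peer_w[:, None] * peer_dir
          heading /= np.linalg.norm(heading, axis=1, keepdims=True)
          pos += 0.1 * heading

          cost = terrain(pos)
          # Adaptation: trust peers less while improving, more while getting worse.
          peer_w = np.clip(np.where(cost < prev_cost, peer_w - 0.05, peer_w + 0.05), 0.0, 1.0)
          prev_cost = cost

      print("mean distance to target after navigation:", round(terrain(pos).mean(), 2))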

  10. Smart swarms of bacteria-inspired agents with performance adaptable interactions.

    PubMed

    Shklarsh, Adi; Ariel, Gil; Schneidman, Elad; Ben-Jacob, Eshel

    2011-09-01

    Collective navigation and swarming have been studied in animal groups, such as fish schools, bird flocks, bacteria, and slime molds. Computer modeling has shown that collective behavior of simple agents can result from simple interactions between the agents, which include short range repulsion, intermediate range alignment, and long range attraction. Here we study collective navigation of bacteria-inspired smart agents in complex terrains, with adaptive interactions that depend on performance. More specifically, each agent adjusts its interactions with the other agents according to its local environment--by decreasing the peers' influence while navigating in a beneficial direction, and increasing it otherwise. We show that inclusion of such performance dependent adaptable interactions significantly improves the collective swarming performance, leading to highly efficient navigation, especially in complex terrains. Notably, to afford such adaptable interactions, each modeled agent requires only simple computational capabilities with short-term memory, which can easily be implemented in simple swarming robots.

  11. Performance Assessment of Refractory Concrete Used on the Space Shuttle's Launch Pad

    NASA Technical Reports Server (NTRS)

    Trejo, David; Calle, Luz Marina; Halman, Ceki

    2005-01-01

    The John F. Kennedy Space Center (KSC) maintains several facilities for launching space vehicles. During recent launches it has been observed that the refractory concrete materials that protect the steel-framed flame duct are breaking away from this base structure and are being projected at high velocities. There is significant concern that these projected pieces can strike the launch complex or space vehicle during the launch, jeopardizing the safety of the mission. A qualification program is in place to evaluate the performance of different refractory concretes and data from these tests have been used to assess the performance of the refractory concretes. However, there is significant variation in the test results, possibly making the existing qualification test program unreliable. This paper will evaluate data from past qualification tests, identify potential key performance indicators for the launch complex, and will recommend a new qualification test program that can be used to better qualify refractory concrete.

  12. Cognitive-linguistic performances of multilingual university students suspected of dyslexia.

    PubMed

    Lindgrén, Signe-Anita; Laine, Matti

    2011-05-01

    High-performing adults with compensated dyslexia pose particular challenges to dyslexia diagnostics. We compared the performance of 20 multilingual Finnish university students with suspected dyslexia with 20 age-matched and education-matched controls on an extensive test battery. The battery tapped various aspects of reading, writing, word retrieval, phonological processing and other cognitive functions relevant for dyslexia. Reading and writing were examined in the two domestic languages, Swedish and Finnish. The most prominent group differences in reading and writing emerged on accuracy measures in both languages (reading text aloud, proofreading, writing to dictation, free writing). The dyslexia group also performed less well on speeded segmentation of written input, complex speeded naming and complex phoneme manipulation. The pattern of results fits the phonological deficit hypothesis of dyslexia and indicates the presence of pervasive underlying defects in compensated dyslexia. Copyright © 2010 John Wiley & Sons, Ltd.

  13. Performance Enhancement Strategies for Multi-Block Overset Grid CFD Applications

    NASA Technical Reports Server (NTRS)

    Djomehri, M. Jahed; Biswas, Rupak

    2003-01-01

    The overset grid methodology has significantly reduced time-to-solution of high-fidelity computational fluid dynamics (CFD) simulations about complex aerospace configurations. The solution process resolves the geometrical complexity of the problem domain by using separately generated but overlapping structured discretization grids that periodically exchange information through interpolation. However, high performance computations of such large-scale realistic applications must be handled efficiently on state-of-the-art parallel supercomputers. This paper analyzes the effects of various performance enhancement strategies on the parallel efficiency of an overset grid Navier-Stokes CFD application running on an SGI Origin2000 machine. Specifically, the roles of asynchronous communication, grid splitting, and grid grouping strategies are presented and discussed. Details of a sophisticated graph partitioning technique for grid grouping are also provided. Results indicate that performance depends critically on the level of latency hiding and the quality of load balancing across the processors.
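
    As a toy stand-in for the grid grouping step (not the paper's graph-partitioning technique), the sketch below greedily assigns grids to processor groups in order of decreasing size so that group workloads stay roughly balanced; the grid names and point counts are hypothetical:

      import heapq

      def group_grids(grid_sizes, n_groups):
          """Greedy longest-processing-time assignment of grids to processor groups."""
          # Min-heap of (current load, group index): always place the next grid on the lightest group.
          heap = [(0.0, g) for g in range(n_groups)]
          heapq.heapify(heap)
          groups = {g: [] for g in range(n_groups)}
          for name, size in sorted(grid_sizes.items(), key=lambda kv: -kv[1]):
              load, g = heapq.heappop(heap)
              groups[g].append(name)
              heapq.heappush(heap, (load + size, g))
          return groups

      # Hypothetical overset grids with their point counts.
      grids = {"fuselage": 2.1e6, "wing": 1.6e6, "flap": 0.9e6, "collar": 0.7e6,
               "box": 0.7e6, "wake": 0.4e6, "tip": 0.3e6}
      for g, names in group_grids(grids, 3).items():
          print(f"group {g}: {names}, load = {sum(grids[n] for n in names):.1e}")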

  14. The Use of Novel Camtasia Videos to Improve Performance of At-Risk Students in Undergraduate Physiology Courses

    ERIC Educational Resources Information Center

    Miller, Cynthia J.

    2014-01-01

    Students in undergraduate physiology courses often have difficulty understanding complex, multi-step processes, and these concepts consume a large portion of class time. For this pilot study, it was hypothesized that online multimedia resources may improve student performance in a high-risk population and reduce the in-class workload. A narrated…

  15. pFlogger: The Parallel Fortran Logging Utility

    NASA Technical Reports Server (NTRS)

    Clune, Tom; Cruz, Carlos A.

    2017-01-01

    In the context of high performance computing (HPC), software investments in support of text-based diagnostics, which monitor a running application, are typically limited compared to those for other types of IO. Examples of such diagnostics include reiteration of configuration parameters, progress indicators, simple metrics (e.g., mass conservation, convergence of solvers, etc.), and timers. To some degree, this difference in priority is justifiable as other forms of output are the primary products of a scientific model and, due to their large data volume, much more likely to be a significant performance concern. In contrast, text-based diagnostic content is generally not shared beyond the individual or group running an application and is most often used to troubleshoot when something goes wrong. We suggest that a more systematic approach enabled by a logging facility (or 'logger') similar to those routinely used by many communities would provide significant value to complex scientific applications. In the context of high-performance computing, an appropriate logger would provide specialized support for distributed and shared-memory parallelism and have low performance overhead. In this paper, we present our prototype implementation of pFlogger - a parallel Fortran-based logging framework, and assess its suitability for use in a complex scientific application.
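
    To make the idea concrete, here is a minimal Python analogue of such rank-aware, severity-filtered logging built on the standard logging module; it is not pFlogger's API, and the environment-variable rank lookup is an assumption about the MPI launcher (with a serial fallback of rank 0):

      import logging
      import os

      # Rank from common MPI launcher environment variables (an assumption); 0 for serial runs.
      rank = int(os.environ.get("OMPI_COMM_WORLD_RANK", os.environ.get("PMI_RANK", "0")))

      logging.basicConfig(
          # The root rank reports progress; other ranks only report problems, keeping output readable.
          level=logging.INFO if rank == 0 else logging.WARNING,
          format=f"%(asctime)s [rank {rank:4d}] %(name)s %(levelname)s: %(message)s",
      )

      log = logging.getLogger("model.dynamics.solver")
      log.info("configuration: dt=%g s, grid=%dx%d", 300.0, 720, 360)   # progress/config diagnostics
      log.warning("mass conservation drift = %.2e", 3.1e-9)             # emitted on every rank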

  16. Dendrites are dispensable for basic motoneuron function but essential for fine tuning of behavior.

    PubMed

    Ryglewski, Stefanie; Kadas, Dimitrios; Hutchinson, Katie; Schuetzler, Natalie; Vonhoff, Fernando; Duch, Carsten

    2014-12-16

    Dendrites are highly complex 3D structures that define neuronal morphology and connectivity and are the predominant sites for synaptic input. Defects in dendritic structure are highly consistent correlates of brain diseases. However, the precise consequences of dendritic structure defects for neuronal function and behavioral performance remain unknown. Here we probe dendritic function by using genetic tools to selectively abolish dendrites in identified Drosophila wing motoneurons without affecting other neuronal properties. We find that these motoneuron dendrites are unexpectedly dispensable for synaptic targeting, qualitatively normal neuronal activity patterns during behavior, and basic behavioral performance. However, significant performance deficits in sophisticated motor behaviors, such as flight altitude control and switching between discrete courtship song elements, scale with the degree of dendritic defect. To our knowledge, our observations provide the first direct evidence that complex dendrite architecture is critically required for fine-tuning and adaptability within robust, evolutionarily constrained behavioral programs that are vital for mating success and survival. We speculate that the observed scaling of performance deficits with the degree of structural defect is consistent with gradual increases in intellectual disability during continuously advancing structural deficiencies in progressive neurological disorders.

  17. DOE High Performance Computing Operational Review (HPCOR): Enabling Data-Driven Scientific Discovery at HPC Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Richard; Allcock, William; Beggio, Chris

    2014-10-17

    U.S. Department of Energy (DOE) High Performance Computing (HPC) facilities are on the verge of a paradigm shift in the way they deliver systems and services to science and engineering teams. Research projects are producing a wide variety of data at unprecedented scale and level of complexity, with community-specific services that are part of the data collection and analysis workflow. On June 18-19, 2014 representatives from six DOE HPC centers met in Oakland, CA at the DOE High Performance Operational Review (HPCOR) to discuss how they can best provide facilities and services to enable large-scale data-driven scientific discovery at the DOE national laboratories. The report contains findings from that review.

  18. High energy PIXE: A tool to characterize multi-layer thick samples

    NASA Astrophysics Data System (ADS)

    Subercaze, A.; Koumeir, C.; Métivier, V.; Servagent, N.; Guertin, A.; Haddad, F.

    2018-02-01

    High energy PIXE is a useful and non-destructive tool to characterize multi-layer thick samples such as cultural heritage objects. In a previous work, we demonstrated the possibility of performing quantitative analysis of simple multi-layer samples using high energy PIXE, without any assumption on their composition. In this work, an in-depth study of the parameters involved in the previously published method is presented. Its extension to more complex samples with a repeated layer is also presented. Experiments have been performed at the ARRONAX cyclotron using 68 MeV protons. The thicknesses and sequences of a multi-layer sample including two different layers of the same element have been determined. Performances and limits of this method are presented and discussed.

  19. Understanding GPU Power. A Survey of Profiling, Modeling, and Simulation Methods

    DOE PAGES

    Bridges, Robert A.; Imam, Neena; Mintz, Tiffany M.

    2016-09-01

    Modern graphics processing units (GPUs) have complex architectures that admit exceptional performance and energy efficiency for high throughput applications. Though GPUs consume large amounts of power, their use for high throughput applications facilitates state-of-the-art energy efficiency and performance. Consequently, continued development relies on understanding their power consumption. Our work is a survey of GPU power modeling and profiling methods with increased detail on noteworthy efforts. Moreover, as direct measurement of GPU power is necessary for model evaluation and parameter initiation, internal and external power sensors are discussed. Hardware counters, which are low-level tallies of hardware events, share strong correlation to power use and performance. Statistical correlation between power and performance counters has yielded worthwhile GPU power models, yet the complexity inherent to GPU architectures presents new hurdles for power modeling. Developments and challenges of counter-based GPU power modeling are discussed. Often building on the counter-based models, research efforts for GPU power simulation, which make power predictions from input code and hardware knowledge, provide opportunities for optimization in programming or architectural design. Noteworthy strides in power simulations for GPUs are included along with their performance or functional simulator counterparts when appropriate. Lastly, possible directions for future research are discussed.

  20. A high-performance liquid chromatography method for the serotonin release assay is equivalent to the radioactive method.

    PubMed

    Sono-Koree, N K; Crist, R A; Frank, E L; Rodgers, G M; Smock, K J

    2016-02-01

    The serotonin release assay (SRA) is considered the gold standard laboratory test for heparin-induced thrombocytopenia (HIT). The historic SRA method uses platelets loaded with radiolabeled serotonin to evaluate platelet activation by HIT immune complexes. However, a nonradioactive method is desirable. We report the performance characteristics of a high-performance liquid chromatography (HPLC) SRA method. We validated the performance characteristics of an HPLC-SRA method, including correlation with a reference laboratory using the radioactive method. Serotonin released from reagent platelets was quantified by HPLC using fluorescent detection. Results were expressed as % release and classified as positive, negative, or indeterminate based on previously published cutoffs. Serum samples from 250 subjects with suspected HIT were tested in the HPLC-SRA and with the radioactive method. Concordant classifications were observed in 230 samples (92%). Sera from 41 healthy individuals tested negative. Between-run imprecision studies showed standard deviation of <6 (% release) for positive, weak positive, and negative serum pools. Stability studies demonstrated stability after two freeze-thaw cycles or up to a week of refrigeration. The HPLC-SRA has robust performance characteristics, equivalent to the historic radioactive method, but avoids the complexities of working with radioactivity. © 2015 John Wiley & Sons Ltd.

  1. Understanding GPU Power. A Survey of Profiling, Modeling, and Simulation Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bridges, Robert A.; Imam, Neena; Mintz, Tiffany M.

    Modern graphics processing units (GPUs) have complex architectures that admit exceptional performance and energy efficiency for high throughput applications. Though GPUs consume large amounts of power, their use for high throughput applications facilitates state-of-the-art energy efficiency and performance. Consequently, continued development relies on understanding their power consumption. Our work is a survey of GPU power modeling and profiling methods with increased detail on noteworthy efforts. Moreover, as direct measurement of GPU power is necessary for model evaluation and parameter initiation, internal and external power sensors are discussed. Hardware counters, which are low-level tallies of hardware events, share strong correlation to power use and performance. Statistical correlation between power and performance counters has yielded worthwhile GPU power models, yet the complexity inherent to GPU architectures presents new hurdles for power modeling. Developments and challenges of counter-based GPU power modeling are discussed. Often building on the counter-based models, research efforts for GPU power simulation, which make power predictions from input code and hardware knowledge, provide opportunities for optimization in programming or architectural design. Noteworthy strides in power simulations for GPUs are included along with their performance or functional simulator counterparts when appropriate. Lastly, possible directions for future research are discussed.

  2. A Descriptive Study of Working Memory, Phonological Awareness and Literacy Performance of People Who Use AAC

    ERIC Educational Resources Information Center

    Taibo, Maria Luisa Gomez; Iglesias, Pilar Vieiro; Mendez, Maria Sotillo; del Salvador, Maria Gonzalez Raposo

    2009-01-01

    Ten cerebral palsied adolescents and young adults with complex communicative needs who use augmentative and alternative communication were studied. They were classified according to their high versus low working memory capacity and according to their high versus low phonological skills into two groups of participants. These groups were compared on…

  3. Effects of Peer-Tutor Competences on Learner Cognitive Load and Learning Performance during Knowledge Sharing

    ERIC Educational Resources Information Center

    Hsiao, Ya-Ping; Brouns, Francis; van Bruggen, Jan; Sloep, Peter B.

    2012-01-01

    In Learning Networks, learners need to share knowledge with others to build knowledge. In particular, when working on complex tasks, they often need to acquire extra cognitive resources from others to process a high task load. However, without support high task load and organizing knowledge sharing themselves might easily overload learners'…

  4. The hard fall effect: high working memory capacity leads to a higher, but less robust short-term memory performance.

    PubMed

    Thomassin, Noémylle; Gonthier, Corentin; Guerraz, Michel; Roulin, Jean-Luc

    2015-01-01

    Participants with a high working memory span tend to perform better than low spans in a variety of tasks. However, their performance is paradoxically more impaired when they have to perform two tasks at once, a phenomenon that could be labeled the "hard fall effect." The present study tested whether this effect exists in a short-term memory task, and investigated the proposal that the effect is due to high spans using efficient facilitative strategies under simple task conditions. Ninety-eight participants performed a spatial short-term memory task under simple and dual task conditions; stimuli presentation times either allowed for the use of complex facilitative strategies or not. High spans outperformed low spans only under simple task conditions when presentation times allowed for the use of facilitative strategies. These results indicate that the hard fall effect exists on a short-term memory task and may be caused by individual differences in strategy use.

  5. High-performance ternary blend polymer solar cells involving both energy transfer and hole relay processes

    DOE PAGES

    Lu, Luyao; Chen, Wei; Xu, Tao; ...

    2015-06-04

    The integration of multiple materials with complementary absorptions into a single junction device is regarded as an efficient way to enhance the power conversion efficiency (PCE) of organic solar cells (OSCs). However, because of increased complexity with one more component, only limited high-performance ternary systems have been demonstrated previously. Here we report an efficient ternary blend OSC with a PCE of 9.2%. We show that the third component can reduce surface trap densities in the ternary blend. Detailed studies unravel that the improved performance results from synergistic effects of enlarged open circuit voltage, suppressed trap-assisted recombination, enhanced light absorption, increased hole extraction, efficient energy transfer and better morphology. As a result, the working mechanism and high device performance demonstrate new insights and design guidelines for high-performance ternary blend solar cells and suggest that ternary structure is a promising platform to boost the efficiency of OSCs.

  6. Chromatin immunoprecipitation with fixed animal tissues and preparation for high-throughput sequencing.

    PubMed

    Cotney, Justin L; Noonan, James P

    2015-02-02

    Chromatin immunoprecipitation coupled with high-throughput sequencing (ChIP-Seq) is a powerful method used to identify genome-wide binding patterns of transcription factors and distribution of various histone modifications associated with different chromatin states. In most published studies, ChIP-Seq has been performed on cultured cells grown under controlled conditions, allowing generation of large amounts of material in a homogeneous biological state. Although such studies have provided great insight into the dynamic landscapes of animal genomes, they do not allow the examination of transcription factor binding and chromatin states in adult tissues, developing embryonic structures, or tumors. Such knowledge is critical to understanding the information required to create and maintain a complex biological tissue and to identify noncoding regions of the genome directly involved in tissues affected by complex diseases such as autism. Studying these tissue types with ChIP-Seq can be challenging due to the limited availability of tissues and the lack of complex biological states able to be achieved in culture. These inherent differences require alterations of standard cross-linking and chromatin extraction typically used in cell culture. Here we describe a general approach for using small amounts of animal tissue to perform ChIP-Seq directed at histone modifications and transcription factors. Tissue is homogenized before treatment with formaldehyde to ensure proper cross-linking, and a two-step nuclear isolation is performed to increase extraction of soluble chromatin. Small amounts of soluble chromatin are then used for immunoprecipitation (IP) and prepared for multiplexed high-throughput sequencing. © 2015 Cold Spring Harbor Laboratory Press.

  7. JMS: An Open Source Workflow Management System and Web-Based Cluster Front-End for High Performance Computing.

    PubMed

    Brown, David K; Penkler, David L; Musyoka, Thommas M; Bishop, Özlem Tastan

    2015-01-01

    Complex computational pipelines are becoming a staple of modern scientific research. Often these pipelines are resource intensive and require days of computing time. In such cases, it makes sense to run them over high performance computing (HPC) clusters where they can take advantage of the aggregated resources of many powerful computers. In addition to this, researchers often want to integrate their workflows into their own web servers. In these cases, software is needed to manage the submission of jobs from the web interface to the cluster and then return the results once the job has finished executing. We have developed the Job Management System (JMS), a workflow management system and web interface for high performance computing (HPC). JMS provides users with a user-friendly web interface for creating complex workflows with multiple stages. It integrates this workflow functionality with the resource manager, a tool that is used to control and manage batch jobs on HPC clusters. As such, JMS combines workflow management functionality with cluster administration functionality. In addition, JMS provides developer tools including a code editor and the ability to version tools and scripts. JMS can be used by researchers from any field to build and run complex computational pipelines and provides functionality to include these pipelines in external interfaces. JMS is currently being used to house a number of bioinformatics pipelines at the Research Unit in Bioinformatics (RUBi) at Rhodes University. JMS is an open-source project and is freely available at https://github.com/RUBi-ZA/JMS.

  8. Functionalized nanoparticles based solid-phase membrane micro-tip extraction and high-performance liquid chromatography analyses of vitamin B complex in human plasma.

    PubMed

    Ali, Imran; Kulsum, Umma; Al-Othman, Zeid A; Alwarthan, Abdulrahman; Saleem, Kishwar

    2016-07-01

    Iron nanoparticles were prepared by a green method and functionalized using 1-butyl-3-methylimidazolium bromide. The 1-butyl-3-methylimidazolium iron nanoparticles were characterized using FTIR spectroscopy, energy dispersive X-ray fluorescence, X-ray diffraction, scanning electron microscopy and transmission electron microscopy. The nanoparticles were used in solid-phase membrane micro-tip extraction to separate the vitamin B complex from plasma before high-performance liquid chromatography. The optimum conditions were sorbent amount (15 mg), agitation time (30 min), pH (9.0), desorbing solvent [water (5 mL) + methanol (5 mL) + sodium hydroxide (0.1 N) + acetic acid (d = 1.05 kg/L, pH 5.5)], desorbing volume (10 mL) and desorption time (30 min). The percentage recoveries of all eight B-complex vitamins ranged from 60 to 83%. A high-performance liquid chromatography method was developed using a PhE column (250 × 4.6 mm, 5.0 μm) and a water/acetonitrile (95:5, v/v; pH 4.0 with 0.1% formic acid) mobile phase. The flow rate was 1.0 mL/min with detection at 270 and 210 nm. The values of the capacity, separation and resolution factors were 0.57-39.47, 1.12-6.00 and 1.84-26.26, respectively. The developed sample preparation and chromatographic methods were fast, selective, inexpensive and reproducible. The developed method can be applied for analyzing these vitamins in biological and environmental matrices. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. JMS: An Open Source Workflow Management System and Web-Based Cluster Front-End for High Performance Computing

    PubMed Central

    Brown, David K.; Penkler, David L.; Musyoka, Thommas M.; Bishop, Özlem Tastan

    2015-01-01

    Complex computational pipelines are becoming a staple of modern scientific research. Often these pipelines are resource intensive and require days of computing time. In such cases, it makes sense to run them over high performance computing (HPC) clusters where they can take advantage of the aggregated resources of many powerful computers. In addition to this, researchers often want to integrate their workflows into their own web servers. In these cases, software is needed to manage the submission of jobs from the web interface to the cluster and then return the results once the job has finished executing. We have developed the Job Management System (JMS), a workflow management system and web interface for high performance computing (HPC). JMS provides users with a user-friendly web interface for creating complex workflows with multiple stages. It integrates this workflow functionality with the resource manager, a tool that is used to control and manage batch jobs on HPC clusters. As such, JMS combines workflow management functionality with cluster administration functionality. In addition, JMS provides developer tools including a code editor and the ability to version tools and scripts. JMS can be used by researchers from any field to build and run complex computational pipelines and provides functionality to include these pipelines in external interfaces. JMS is currently being used to house a number of bioinformatics pipelines at the Research Unit in Bioinformatics (RUBi) at Rhodes University. JMS is an open-source project and is freely available at https://github.com/RUBi-ZA/JMS. PMID:26280450

  10. Spectral Entropies as Information-Theoretic Tools for Complex Network Comparison

    NASA Astrophysics Data System (ADS)

    De Domenico, Manlio; Biamonte, Jacob

    2016-10-01

    Any physical system can be viewed from the perspective that information is implicitly represented in its state. However, the quantification of this information when it comes to complex networks has remained largely elusive. In this work, we use techniques inspired by quantum statistical mechanics to define an entropy measure for complex networks and to develop a set of information-theoretic tools, based on network spectral properties, such as Rényi q entropy, generalized Kullback-Leibler and Jensen-Shannon divergences, the latter allowing us to define a natural distance measure between complex networks. First, we show that by minimizing the Kullback-Leibler divergence between an observed network and a parametric network model, inference of model parameter(s) by means of maximum-likelihood estimation can be achieved and model selection can be performed with appropriate information criteria. Second, we show that the information-theoretic metric quantifies the distance between pairs of networks and we can use it, for instance, to cluster the layers of a multilayer system. By applying this framework to networks corresponding to sites of the human microbiome, we perform hierarchical cluster analysis and recover with high accuracy existing community-based associations. Our results imply that spectral-based statistical inference in complex networks results in demonstrably superior performance as well as a conceptual backbone, filling a gap towards a network information theory.
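
    A minimal sketch of the kind of spectral calculation described above, assuming the density-matrix formulation in which the normalized exponential of the graph Laplacian plays the role of a quantum state; the networkx/numpy usage, the parameter beta, and the function names are illustrative choices, not the authors' code.

```python
# Sketch: spectral (von Neumann / Renyi) entropy of a network and a
# Jensen-Shannon-style distance between two graphs. Illustrative only.
import numpy as np
import networkx as nx

def density_matrix(G, beta=1.0):
    """Return rho = exp(-beta*L) / Tr exp(-beta*L) and its eigenvalues (probabilities)."""
    L = nx.laplacian_matrix(G).toarray().astype(float)
    w, V = np.linalg.eigh(L)
    e = np.exp(-beta * w)
    probs = e / e.sum()                       # eigenvalues of rho
    return (V * probs) @ V.T, probs

def von_neumann_entropy(probs):
    p = probs[probs > 1e-12]
    return float(-(p * np.log2(p)).sum())

def renyi_entropy(probs, q=2.0):
    p = probs[probs > 1e-12]
    return float(np.log2((p ** q).sum()) / (1.0 - q))

def js_distance(G1, G2, beta=1.0):
    """Square root of the Jensen-Shannon divergence between two graph density matrices."""
    r1, p1 = density_matrix(G1, beta)
    r2, p2 = density_matrix(G2, beta)
    pm = np.linalg.eigvalsh(0.5 * (r1 + r2))
    jsd = von_neumann_entropy(pm) - 0.5 * (von_neumann_entropy(p1) + von_neumann_entropy(p2))
    return float(np.sqrt(max(jsd, 0.0)))

if __name__ == "__main__":
    G1 = nx.erdos_renyi_graph(50, 0.1, seed=1)
    G2 = nx.barabasi_albert_graph(50, 3, seed=1)
    _, probs = density_matrix(G1)
    print(von_neumann_entropy(probs), renyi_entropy(probs, q=2), js_distance(G1, G2))
```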

  11. Reducing Communication in Algebraic Multigrid Using Additive Variants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vassilevski, Panayot S.; Yang, Ulrike Meier

    Algebraic multigrid (AMG) has proven to be an effective scalable solver on many high performance computers. However, its increasing communication complexity on coarser levels has been shown to seriously impact its performance on computers with high communication cost. Additive AMG variants provide increased parallelism and fewer messages per cycle, but generally exhibit slower convergence. Here we present several new additive variants with convergence rates that are significantly improved compared to the classical additive algebraic multigrid method, and we investigate their potential for decreased communication and improved communication-computation overlap, features that are essential for good performance on future exascale architectures.

  12. Reducing Communication in Algebraic Multigrid Using Additive Variants

    DOE PAGES

    Vassilevski, Panayot S.; Yang, Ulrike Meier

    2014-02-12

    Algebraic multigrid (AMG) has proven to be an effective scalable solver on many high performance computers. However, its increasing communication complexity on coarser levels has been shown to seriously impact its performance on computers with high communication cost. Additive AMG variants provide increased parallelism and fewer messages per cycle, but generally exhibit slower convergence. Here we present several new additive variants with convergence rates that are significantly improved compared to the classical additive algebraic multigrid method, and we investigate their potential for decreased communication and improved communication-computation overlap, features that are essential for good performance on future exascale architectures.

  13. High performance frame synchronization for continuous variable quantum key distribution systems.

    PubMed

    Lin, Dakai; Huang, Peng; Huang, Duan; Wang, Chao; Peng, Jinye; Zeng, Guihua

    2015-08-24

    Considering a practical continuous-variable quantum key distribution (CVQKD) system, synchronization is of significant importance as it is hardly possible to extract secret keys from unsynchronized strings. In this paper, we propose a high-performance frame synchronization method for CVQKD systems which is capable of operating under low signal-to-noise ratios (SNRs) and is compatible with the random phase shift induced by the quantum channel. A practical implementation of this method with low complexity is presented and its performance is analysed. By adjusting the length of the synchronization frame, the method works well over a large range of SNR values, which paves the way for longer-distance CVQKD.
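
    A hedged sketch of one way such synchronization could work: a known synchronization frame is slid across the received quadratures and the offset with the largest correlation magnitude is chosen; correlating complex values and taking the magnitude makes the estimate insensitive to a global phase rotation. The frame length, amplitude, and function names are illustrative, and this is not the authors' algorithm.

```python
import numpy as np

def find_frame_offset(rx, sync_frame):
    """Return the offset where |<sync_frame, rx segment>| is largest.

    rx and sync_frame are complex arrays (x + 1j*p quadratures); the magnitude
    of the inner product removes any unknown global phase rotation.
    """
    n = len(sync_frame)
    best_offset, best_score = 0, -np.inf
    for k in range(len(rx) - n + 1):
        score = abs(np.vdot(sync_frame, rx[k:k + n]))
        if score > best_score:
            best_offset, best_score = k, score
    return best_offset

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sync = rng.standard_normal(256) + 1j * rng.standard_normal(256)   # known sync frame
    rx = rng.standard_normal(4096) + 1j * rng.standard_normal(4096)   # channel noise
    true_offset, phase, amplitude = 1000, np.exp(1j * 1.3), 0.7       # weak, phase-rotated copy
    rx[true_offset:true_offset + 256] += amplitude * phase * sync
    print(find_frame_offset(rx, sync), "expected", true_offset)
```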

  14. A comprehensive study of MPI parallelism in three-dimensional discrete element method (DEM) simulation of complex-shaped granular particles

    NASA Astrophysics Data System (ADS)

    Yan, Beichuan; Regueiro, Richard A.

    2018-02-01

    A three-dimensional (3D) DEM code for simulating complex-shaped granular particles is parallelized using the message-passing interface (MPI). The concepts of link-block, ghost/border layer, and migration layer are put forward for the design of the parallel algorithm, and a theoretical scalability function for 3-D DEM scalability and memory usage is derived. Many performance-critical implementation details are managed optimally to achieve high performance and scalability, such as minimizing communication overhead, maintaining dynamic load balance, handling particle migrations across block borders, transmitting C++ dynamic objects of particles between MPI processes efficiently, and eliminating redundant contact information between adjacent MPI processes. The code executes on multiple US Department of Defense (DoD) supercomputers and is tested on up to 2048 compute nodes for simulating 10 million three-axis ellipsoidal particles. Performance analyses of the code, including speedup, efficiency, scalability, and granularity across five orders of magnitude of simulation scale (number of particles), are provided and demonstrate high speedup and excellent scalability. It is also observed that communication time is a decreasing function of the number of compute nodes in strong scaling measurements. The code's capability of simulating a large number of complex-shaped particles on modern supercomputers will be of value in both laboratory studies of the micromechanical properties of granular materials and many realistic engineering applications involving granular materials.
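
    A hedged mpi4py sketch of the ghost/border-layer idea in one dimension (not the actual DEM code): each rank owns the particles in its block of the domain and, each step, sends the particles within one cutoff distance of a block face to the neighbouring rank, where they become that rank's ghost layer.

```python
# Run with e.g.:  mpiexec -n 4 python dem_ghost_sketch.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

domain, cutoff = 100.0, 2.5                       # global x extent and contact cutoff
block = domain / size
xlo, xhi = rank * block, (rank + 1) * block

rng = np.random.default_rng(rank)
particles = xlo + (xhi - xlo) * rng.random(1000)  # x positions owned by this rank

# Particles close to each face form the border layer sent to that neighbour.
left_border = particles[particles < xlo + cutoff]
right_border = particles[particles > xhi - cutoff]

left_rank = rank - 1 if rank > 0 else MPI.PROC_NULL
right_rank = rank + 1 if rank < size - 1 else MPI.PROC_NULL

# Exchange border layers; what arrives becomes this rank's ghost layer.
ghosts_from_right = comm.sendrecv(left_border, dest=left_rank, source=right_rank)
ghosts_from_left = comm.sendrecv(right_border, dest=right_rank, source=left_rank)

n_ghosts = sum(len(g) for g in (ghosts_from_left, ghosts_from_right) if g is not None)
print(f"rank {rank}: {len(particles)} owned particles, {n_ghosts} ghost particles")
```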

  15. Managing Scientific Software Complexity with Bocca and CCA

    DOE PAGES

    Allan, Benjamin A.; Norris, Boyana; Elwasif, Wael R.; ...

    2008-01-01

    In high-performance scientific software development, the emphasis is often on short time to first solution. Even when the development of new components mostly reuses existing components or libraries and only small amounts of new code must be created, dealing with the component glue code and software build processes to obtain complete applications is still tedious and error-prone. Component-based software meant to reduce complexity at the application level increases complexity to the extent that the user must learn and remember the interfaces and conventions of the component model itself. To address these needs, we introduce Bocca, the first tool to enable application developers to perform rapid component prototyping while maintaining robust software-engineering practices suitable to HPC environments. Bocca provides project management and a comprehensive build environment for creating and managing applications composed of Common Component Architecture components. Of critical importance for high-performance computing (HPC) applications, Bocca is designed to operate in a language-agnostic way, simultaneously handling components written in any of the languages commonly used in scientific applications: C, C++, Fortran, Python and Java. Bocca automates the tasks related to the component glue code, freeing the user to focus on the scientific aspects of the application. Bocca embraces the philosophy pioneered by Ruby on Rails for web applications: start with something that works, and evolve it to the user's purpose.

  16. Ontology for the Intelligence Analyst

    DTIC Science & Technology

    2012-12-01

    Intelligence and Information Warfare Directorate Shouvik Bardhan , High Performance Technologies, Incorporated Jamie Johnson, EOIR Technologies...to soldiers. E-mail: kesny.parent@us.army.mil Shouvik Bardhan has more than 25 years of experience in the field of complex software design and

  17. Task Assignment Heuristics for Distributed CFD Applications

    NASA Technical Reports Server (NTRS)

    Lopez-Benitez, N.; Djomehri, M. J.; Biswas, R.; Biegel, Bryan (Technical Monitor)

    2001-01-01

    CFD applications require high-performance computational platforms: (1) complex physics and domain configuration demand strongly coupled solutions; (2) applications are CPU and memory intensive; and (3) huge resource requirements can only be satisfied by teraflop-scale machines or distributed computing.

  18. Improving driver decisions and performance in high-speed, multilane, complex conditions.

    DOT National Transportation Integrated Search

    2009-01-01

    In an effort to reduce fatalities resulting from traffic collisions, California's Strategic Highway Safety Plan identified 16 Challenge Areas under the State Highway Safety Plan. Improper driving decisions about the right of way and turning bec...

  19. Multistate Lempel-Ziv (MLZ) index interpretation as a measure of amplitude and complexity changes.

    PubMed

    Sarlabous, Leonardo; Torres, Abel; Fiz, Jose A; Gea, Joaquim; Galdiz, Juan B; Jane, Raimon

    2009-01-01

    The Lempel-Ziv (LZ) complexity has been widely used to evaluate the randomness of finite sequences, and in particular to quantify the degree of complexity present in biomedical signals. The LZ complexity, however, cannot discern between signals with different amplitude variations but similar random components. Conversely, amplitude parameters such as the root mean square (RMS) cannot discern between signals with similar power distributions but different random components. In this work, we present a novel method to quantify both amplitude and complexity variations in biomedical signals by computing the LZ coefficient with more than two quantization states, using thresholds that are fixed and independent of the dynamic range or standard deviation of the analyzed signal: the Multistate Lempel-Ziv (MLZ) index. Our results indicate that the MLZ index with few quantization levels evaluates only the complexity changes of the signal, with a high number of levels only the amplitude variations, and with an intermediate number of levels it reflects both amplitude and complexity variations. The study performed on diaphragmatic mechanomyographic signals shows that the amplitude variations of this signal are more correlated with respiratory effort than the complexity variations. Furthermore, the MLZ index with a high number of levels is practically unaffected by impulsive, sinusoidal, constant and Gaussian noises, in contrast with the RMS amplitude parameter.
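
    A minimal sketch of the multistate idea: the signal is quantized with fixed, signal-independent thresholds and the Lempel-Ziv complexity of the resulting symbol sequence is computed. The LZ78-style phrase count, the threshold grid, and the normalization are illustrative choices, not the published definition of the MLZ index.

```python
import numpy as np

def quantize(signal, thresholds):
    """Map each sample to a state 0..len(thresholds) using fixed thresholds."""
    return np.digitize(signal, thresholds)

def lz_complexity(symbols):
    """Count distinct phrases in an LZ78-style incremental parsing of the sequence."""
    s = "".join(chr(65 + int(v)) for v in symbols)   # one letter per state
    phrases, i, count = set(), 0, 0
    while i < len(s):
        j = i + 1
        while j <= len(s) and s[i:j] in phrases:
            j += 1
        phrases.add(s[i:j])
        count += 1
        i = j
    return count

def mlz_index(signal, thresholds):
    """LZ phrase count normalized by the asymptotic bound n / log_k(n) for k states."""
    n, k = len(signal), len(thresholds) + 1
    c = lz_complexity(quantize(signal, thresholds))
    return c * np.log(n) / (n * np.log(k))

if __name__ == "__main__":
    t = np.linspace(0, 10, 2000)
    noise = np.random.default_rng(0).standard_normal(t.size)
    low_amp = 0.2 * np.sin(2 * np.pi * t) + 0.05 * noise
    high_amp = 2.0 * np.sin(2 * np.pi * t) + 0.05 * noise
    thresholds = np.linspace(-3, 3, 9)   # 10 fixed states, independent of the signal
    print(mlz_index(low_amp, thresholds), mlz_index(high_amp, thresholds))
```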

  20. Synthesis and spectral characterization of 2-((2-hydroxybenzylidene)amino)-2-methylpropane-1,3-diol derived complexes: Molecular docking and antimicrobial studies

    NASA Astrophysics Data System (ADS)

    Ansari, Istikhar A.; Sama, Farasha; Raizada, Mukul; Shahid, M.; Rajpoot, Ravi Kant; Siddiqi, Zafar A.

    2017-01-01

    A series of four homo-dinuclear transition metal complexes with stoichiometry [M2(HL)2(H2O)2] [M = Fe (1), Co (2), Ni (3) and Cu (4); H3L = 2-((2-hydroxybenzylidene)amino)-2-methylpropane-1,3-diol] has been prepared. The ligand (H3L) was obtained by the condensation of 2-amino-2-methyl-1,3-propanediol (H2ampd) with salicylaldehyde. The complexes (1-4) were characterized employing elemental analysis, FTIR, ESI mass spectrometry, 1H and 13C NMR, EPR, UV-Visible, TGA, cyclic voltammetry, and magnetic studies. Spectral data ascertained the bonding features and the geometry of the complexes and revealed that all the complexes adopt a distorted octahedral geometry with a high-spin state of the metal ions. Thermal and ESI mass data confirmed the proposed stoichiometry of the complexes. Cyclic voltammetric (CV) studies ascertain the formation of MII/MIII quasi-reversible redox couples in solution. The antimicrobial activities of the complexes have been examined against several bacteria (E. coli, B. subtilis, S. aureus and S. typhymurium) and fungi (C. albicans, A. fumigatus and P. marneffei), suggesting that the present compounds show moderate to high antimicrobial properties. Among all the compounds tested, complex (4) exhibited the highest antibacterial as well as antifungal activity. Molecular docking studies of the free ligand and the complexes were performed with B-DNA.

  1. Optical interconnection networks for high-performance computing systems

    NASA Astrophysics Data System (ADS)

    Biberman, Aleksandr; Bergman, Keren

    2012-04-01

    Enabled by silicon photonic technology, optical interconnection networks have the potential to be a key disruptive technology in computing and communication industries. The enduring pursuit of performance gains in computing, combined with stringent power constraints, has fostered the ever-growing computational parallelism associated with chip multiprocessors, memory systems, high-performance computing systems and data centers. Sustaining these parallelism growths introduces unique challenges for on- and off-chip communications, shifting the focus toward novel and fundamentally different communication approaches. Chip-scale photonic interconnection networks, enabled by high-performance silicon photonic devices, offer unprecedented bandwidth scalability with reduced power consumption. We demonstrate that the silicon photonic platforms have already produced all the high-performance photonic devices required to realize these types of networks. Through extensive empirical characterization in much of our work, we demonstrate such feasibility of waveguides, modulators, switches and photodetectors. We also demonstrate systems that simultaneously combine many functionalities to achieve more complex building blocks. We propose novel silicon photonic devices, subsystems, network topologies and architectures to enable unprecedented performance of these photonic interconnection networks. Furthermore, the advantages of photonic interconnection networks extend far beyond the chip, offering advanced communication environments for memory systems, high-performance computing systems, and data centers.

  2. Petascale computation performance of lightweight multiscale cardiac models using hybrid programming models.

    PubMed

    Pope, Bernard J; Fitch, Blake G; Pitman, Michael C; Rice, John J; Reumann, Matthias

    2011-01-01

    Future multiscale and multiphysics models must use the power of high performance computing (HPC) systems to enable research into human disease, translational medical science, and treatment. Previously we showed that computationally efficient multiscale models will require the use of sophisticated hybrid programming models, mixing distributed message passing processes (e.g. the message passing interface (MPI)) with multithreading (e.g. OpenMP, POSIX pthreads). The objective of this work is to compare the performance of such hybrid programming models when applied to the simulation of a lightweight multiscale cardiac model. Our results show that the hybrid models do not perform favourably when compared to an implementation using only MPI, which is in contrast to our results using complex physiological models. Thus, with regard to lightweight multiscale cardiac models, the user may not need to increase programming complexity by using a hybrid programming approach. However, considering that model complexity will increase, as will the HPC system size in both node count and number of cores per node, it is still foreseeable that we will achieve faster-than-real-time multiscale cardiac simulations on these systems using hybrid programming models.

  3. A fuzzy logic control in adjustable autonomy of a multi-agent system for an automated elderly movement monitoring application.

    PubMed

    Mostafa, Salama A; Mustapha, Aida; Mohammed, Mazin Abed; Ahmad, Mohd Sharifuddin; Mahmoud, Moamin A

    2018-04-01

    Autonomous agents are being widely used in many systems, such as ambient assisted-living systems, to perform tasks on behalf of humans. However, these systems usually operate in complex environments that entail uncertain, highly dynamic, or irregular workload. In such environments, autonomous agents tend to make decisions that lead to undesirable outcomes. In this paper, we propose a fuzzy-logic-based adjustable autonomy (FLAA) model to manage the autonomy of multi-agent systems that are operating in complex environments. This model aims to facilitate the autonomy management of agents and help them make competent autonomous decisions. The FLAA model employs fuzzy logic to quantitatively measure and distribute autonomy among several agents based on their performance. We implement and test this model in the Automated Elderly Movements Monitoring (AEMM-Care) system, which uses agents to monitor the daily movement activities of elderly users and perform fall detection and prevention tasks in a complex environment. The test results show that the FLAA model improves the accuracy and performance of these agents in detecting and preventing falls. Copyright © 2018 Elsevier B.V. All rights reserved.
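
    A hedged sketch of how fuzzy membership functions could turn per-agent performance scores into autonomy weights; the membership shapes, rule centroids, and normalization are invented for illustration and are not the published FLAA model.

```python
import numpy as np

def mu_low(x):   return float(np.clip((0.5 - x) / 0.5, 0.0, 1.0))   # 1 at 0, falls to 0 at 0.5
def mu_med(x):   return float(np.clip(1.0 - abs(x - 0.5) / 0.3, 0.0, 1.0))
def mu_high(x):  return float(np.clip((x - 0.5) / 0.5, 0.0, 1.0))   # 0 at 0.5, rises to 1 at 1

def autonomy_level(performance):
    """Fuzzify a performance score in [0, 1] and defuzzify to an autonomy level."""
    low, med, high = mu_low(performance), mu_med(performance), mu_high(performance)
    # Rule consequents (illustrative centroids): low performance -> supervised (0.2),
    # medium -> shared control (0.5), high -> mostly autonomous (0.9).
    return (low * 0.2 + med * 0.5 + high * 0.9) / (low + med + high + 1e-12)

def distribute_autonomy(scores):
    """Normalize autonomy levels across agents into shares of autonomous decisions."""
    levels = np.array([autonomy_level(s) for s in scores])
    return levels / levels.sum()

if __name__ == "__main__":
    print(distribute_autonomy([0.9, 0.55, 0.2]))   # the best-performing agent gets the largest share
```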

  4. Efficient biprediction decision scheme for fast high efficiency video coding encoding

    NASA Astrophysics Data System (ADS)

    Park, Sang-hyo; Lee, Seung-ho; Jang, Euee S.; Jun, Dongsan; Kang, Jung-Won

    2016-11-01

    An efficient biprediction decision scheme for high efficiency video coding (HEVC) is proposed for fast-encoding applications. For low-delay video applications, bidirectional prediction can be used to increase compression performance efficiently with previous reference frames. At the same time, however, the computational complexity of the HEVC encoder is significantly increased due to the additional biprediction search. Although some research has attempted to reduce this complexity, whether the prediction is strongly related to both motion complexity and prediction modes in a coding unit has not yet been investigated. A method that avoids most compression-inefficient search points is proposed so that the computational complexity of the motion estimation process can be dramatically decreased. To determine whether biprediction is critical, the proposed method exploits the stochastic correlation of the context of prediction units (PUs): the direction of a PU and the accuracy of a motion vector. Experimental results show that the time complexity of biprediction can be reduced to 30% on average, outperforming existing methods in terms of encoding time, number of function calls, and memory access.
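
    A hedged illustration of an early-termination rule in the spirit of the abstract (skip the biprediction search when the unidirectional predictions suggest simple, consistent motion); the specific features, threshold, and names below are made up for illustration and are not the authors' decision scheme.

```python
from dataclasses import dataclass

@dataclass
class UniPrediction:
    cost: float      # rate-distortion cost of the unidirectional prediction
    mv: tuple        # motion vector in quarter-pel units (x, y)

def is_subpel(mv):
    """True if the motion vector uses fractional (sub-integer) pel accuracy."""
    return (mv[0] % 4 != 0) or (mv[1] % 4 != 0)

def try_biprediction(list0: UniPrediction, list1: UniPrediction,
                     cost_ratio_thresh: float = 1.1) -> bool:
    """Decide whether the expensive biprediction search is worth running."""
    complex_motion = is_subpel(list0.mv) or is_subpel(list1.mv)
    balanced = max(list0.cost, list1.cost) <= cost_ratio_thresh * min(list0.cost, list1.cost)
    # Run biprediction only when motion looks complex or one reference list is clearly worse.
    return complex_motion or not balanced

if __name__ == "__main__":
    print(try_biprediction(UniPrediction(1000, (4, 0)), UniPrediction(1020, (4, 0))))   # skip search
    print(try_biprediction(UniPrediction(1000, (5, 2)), UniPrediction(1500, (9, -3))))  # run search
```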

  5. Competitive pressures affect sexual signal complexity in Kurixalus odontotarsus: insights into the evolution of compound calls

    PubMed Central

    2017-01-01

    Male-male vocal competition in anuran species is critical for mating success; however, it is also energetically demanding and highly time-consuming. Thus, we hypothesized that males may change signal elaboration in response to competition in real time. Male serrate-legged small treefrogs (Kurixalus odontotarsus) produce compound calls that contain two kinds of notes, harmonic sounds called ‘A notes’ and short broadband sounds called ‘B notes’. Using male evoked vocal response experiments, we found that competition influences the temporal structure and complexity of vocal signals produced by males. Males produce calls with a higher ratio of notes per call, and more compound calls including more A notes but fewer B notes, as contests escalate. In doing so, males minimize the energy costs and maximize the benefits of competition when the level of competition is high. This means that the evolution of sexual signal complexity in frogs may be susceptible to selection for plasticity related to adjusting performance to the pressures of competition, and supports the idea that more complex social contexts can lead to greater vocal complexity. PMID:29175862

  6. Understanding Portability of a High-Level Programming Model on Contemporary Heterogeneous Architectures

    DOE PAGES

    Sabne, Amit J.; Sakdhnagool, Putt; Lee, Seyong; ...

    2015-07-13

    Accelerator-based heterogeneous computing is gaining momentum in the high-performance computing arena. However, the increased complexity of heterogeneous architectures demands more generic, high-level programming models. OpenACC is one such attempt to tackle this problem. Although the abstraction provided by OpenACC offers productivity, it raises questions concerning both functional and performance portability. In this article, the authors propose HeteroIR, a high-level, architecture-independent intermediate representation, to map high-level programming models, such as OpenACC, to heterogeneous architectures. They present a compiler approach that translates OpenACC programs into HeteroIR and accelerator kernels to obtain OpenACC functional portability. They then evaluate the performance portability obtained by OpenACC with their approach on 12 OpenACC programs on Nvidia CUDA, AMD GCN, and Intel Xeon Phi architectures. They study the effects of various compiler optimizations and OpenACC program settings on these architectures to provide insights into the achieved performance portability.

  7. Working Memory and Reasoning Benefit from Different Modes of Large-scale Brain Dynamics in Healthy Older Adults.

    PubMed

    Lebedev, Alexander V; Nilsson, Jonna; Lövdén, Martin

    2018-07-01

    Researchers have proposed that solving complex reasoning problems, a key indicator of fluid intelligence, involves the same cognitive processes as solving working memory tasks. This proposal is supported by an overlap of the functional brain activations associated with the two types of tasks and by high correlations between interindividual differences in performance. We replicated these findings in 53 older participants but also showed that solving reasoning and working memory problems benefits from different configurations of the functional connectome and that this dissimilarity increases with a higher difficulty load. Specifically, superior performance in a typical working memory paradigm (n-back) was associated with upregulation of modularity (increased between-network segregation), whereas performance in the reasoning task was associated with effective downregulation of modularity. We also showed that working memory training promotes task-invariant increases in modularity. Because superior reasoning performance is associated with downregulation of modular dynamics, training may thus have fostered an inefficient way of solving the reasoning tasks. This could help explain why working memory training does little to promote complex reasoning performance. The study concludes that complex reasoning abilities cannot be reduced to working memory and suggests the need to reconsider the feasibility of using working memory training interventions to attempt to achieve effects that transfer to broader cognition.

  8. Difficult Decisions Made Easier

    NASA Technical Reports Server (NTRS)

    2006-01-01

    NASA missions are extremely complex and prone to sudden, catastrophic failure if equipment falters or if an unforeseen event occurs. For these reasons, NASA trains to expect the unexpected. It tests its equipment and systems in extreme conditions, and it develops risk-analysis tests to foresee any possible problems. The Space Agency recently worked with an industry partner to develop reliability analysis software capable of modeling complex, highly dynamic systems, taking into account variations in input parameters and the evolution of the system over the course of a mission. The goal of this research was multifold. It included performance and risk analyses of complex, multiphase missions, like the insertion of the Mars Reconnaissance Orbiter; reliability analyses of systems with redundant and/or repairable components; optimization analyses of system configurations with respect to cost and reliability; and sensitivity analyses to identify optimal areas for uncertainty reduction or performance enhancement.

  9. A Critical Assessment of the Performance of Protein-ligand Scoring Functions Based on NMR Chemical Shift Perturbations

    PubMed Central

    Wang, Bing; Westerhoff, Lance M.; Merz, Kenneth M.

    2008-01-01

    We have generated docking poses for the FKBP-GPI complex using eight docking programs and compared their scoring functions with scoring based on NMR chemical shift perturbations (NMRScore). Because the chemical shift perturbation (CSP) is exquisitely sensitive to the orientation of the ligand inside the binding pocket, NMRScore offers an accurate and straightforward approach to score different poses. All scoring functions were assessed by their ability to rank native-like structures highly and to separate them from decoy poses generated for a protein-ligand complex. The overall performance of NMRScore is much better than that of the energy-based scoring functions associated with the docking programs in both respects. In summary, we find that the combination of docking programs with NMRScore results in an approach that can robustly determine the binding site structure for a protein-ligand complex, thereby providing a new tool facilitating the structure-based drug discovery process. PMID:17867664
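
    A minimal sketch of CSP-based pose scoring, assuming that per-residue CSPs predicted for each pose are already available from a separate chemical-shift calculation; the RMSD criterion, function names, and example values are illustrative, not the published NMRScore implementation.

```python
import numpy as np

def nmr_score(predicted_csp, observed_csp):
    """Lower is better: RMSD between predicted and observed perturbations (ppm)."""
    d = np.asarray(predicted_csp) - np.asarray(observed_csp)
    return float(np.sqrt(np.mean(d ** 2)))

def rank_poses(pose_predictions, observed_csp):
    """Return pose indices sorted from best (lowest RMSD) to worst, plus all scores."""
    scores = [nmr_score(p, observed_csp) for p in pose_predictions]
    return sorted(range(len(scores)), key=scores.__getitem__), scores

if __name__ == "__main__":
    observed = [0.12, 0.05, 0.30, 0.02, 0.18]          # CSPs for five binding-site residues
    poses = [[0.11, 0.06, 0.28, 0.03, 0.20],           # native-like pose
             [0.30, 0.01, 0.05, 0.15, 0.02]]           # decoy pose
    order, scores = rank_poses(poses, observed)
    print(order, [round(s, 3) for s in scores])
```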

  10. On the reduced-complexity of LDPC decoders for ultra-high-speed optical transmission.

    PubMed

    Djordjevic, Ivan B; Xu, Lei; Wang, Ting

    2010-10-25

    We propose two reduced-complexity (RC) LDPC decoders, which can be used in combination with large-girth LDPC codes to enable ultra-high-speed serial optical transmission. We show that the optimally attenuated RC min-sum algorithm performs only 0.46 dB (at a BER of 10^-9) worse than the conventional sum-product algorithm, while having lower storage memory requirements and much lower latency. We further study the use of RC LDPC decoding algorithms in multilevel coded modulation with coherent detection and show that with RC decoding algorithms we can achieve a net coding gain larger than 11 dB at BERs below 10^-9.
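
    A minimal sketch of the attenuated (scaled) min-sum check-node update that replaces the tanh-based sum-product rule; the attenuation factor of 0.8 is an illustrative value, not the optimum reported in the paper.

```python
import numpy as np

def check_node_update(llrs, attenuation=0.8):
    """Min-sum update for one check node.

    llrs: incoming log-likelihood-ratio messages from the connected variable nodes.
    Returns, for each edge, the attenuated minimum magnitude of all *other*
    incoming messages, with the product of their signs.
    """
    llrs = np.asarray(llrs, dtype=float)
    signs = np.sign(llrs)
    mags = np.abs(llrs)
    order = np.argsort(mags)
    min1, min2 = mags[order[0]], mags[order[1]]       # two smallest magnitudes
    total_sign = np.prod(signs)
    out = np.empty_like(llrs)
    for i in range(len(llrs)):
        other_min = min2 if i == order[0] else min1   # exclude edge i itself
        out[i] = attenuation * total_sign * signs[i] * other_min
    return out

if __name__ == "__main__":
    print(check_node_update([+1.2, -0.4, +2.5, -0.9]))
```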

  11. Characterization of Early Partial Seizure Onset: Frequency, Complexity and Entropy

    PubMed Central

    Jouny, Christophe C.; Bergey, Gregory K.

    2011-01-01

    Objective: A clear classification of partial seizure onset features is not yet established. Complexity and entropy have been very widely used to describe dynamical systems, but a systematic evaluation of these measures to characterize partial seizures has never been performed. Methods: Eighteen different measures, including power in frequency bands up to 300 Hz, Gabor atom density (GAD), Higuchi fractal dimension (HFD), Lempel-Ziv complexity, Shannon entropy, sample entropy, and permutation entropy, were selected to test sensitivity to partial seizure onset. Intracranial recordings from forty-five patients with mesial temporal, neocortical temporal and neocortical extratemporal seizure foci were included (331 partial seizures). Results: GAD, Lempel-Ziv complexity, HFD, high frequency activity, and sample entropy were the most reliable measures to assess early seizure onset. Conclusions: Increases in complexity and occurrence of high-frequency components appear to be commonly associated with early stages of partial seizure evolution from all regions. The type of measure (frequency-based, complexity or entropy) does not predict the efficiency of the method to detect seizure onset. Significance: Differences between measures such as GAD and HFD highlight the multimodal nature of partial seizure onsets. Improved methods for early seizure detection may be achieved from a better understanding of these underlying dynamics. PMID:21872526
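
    A minimal sketch of one of the measures named above, the Higuchi fractal dimension, estimated from the slope of log curve length versus log(1/k); kmax is a common but arbitrary choice and the implementation is illustrative.

```python
import numpy as np

def higuchi_fd(x, kmax=10):
    """Estimate the Higuchi fractal dimension of a 1-D signal."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    mean_lengths = []
    for k in range(1, kmax + 1):
        lk = []
        for m in range(k):
            idx = np.arange(m, n, k)
            if len(idx) < 2:
                continue
            num_intervals = len(idx) - 1
            # Normalized length of the curve subsampled every k points, starting at m.
            lk.append(np.abs(np.diff(x[idx])).sum() * (n - 1) / (num_intervals * k * k))
        mean_lengths.append(np.mean(lk))
    k_vals = np.arange(1, kmax + 1)
    slope, _ = np.polyfit(np.log(1.0 / k_vals), np.log(mean_lengths), 1)
    return float(slope)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0, 4 * np.pi, 2000)
    print(higuchi_fd(np.sin(t)))                  # smooth signal: HFD close to 1
    print(higuchi_fd(rng.standard_normal(2000)))  # white noise: HFD close to 2
```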

  12. How can surgical training benefit from theories of skilled motor development, musical skill acquisition and performance psychology?

    PubMed

    McCaskie, Andrew W; Kenny, Dianna T; Deshmukh, Sandeep

    2011-05-02

    Trainee surgeons must acquire expert status in the context of reduced hours, reduced operating room time and the need to learn complex skills involving screen-mediated techniques, computers and robotics. Ever more sophisticated surgical simulation strategies have been helpful in providing surgeons with the opportunity to practise, but not all of these strategies are widely available. Similarities in the motor skills required in skilled musical performance and surgery suggest that models of music learning, and particularly skilled motor development, may be applicable in training surgeons. More attention should be paid to factors associated with optimal arousal and optimal performance in surgical training - lessons learned from helping anxious musicians optimise performance and manage anxiety may also be transferable to trainee surgeons. The ways in which the trainee surgeon moves from novice to expert need to be better understood so that this process can be expedited using current knowledge in other disciplines requiring the performance of complex fine motor tasks with high cognitive load under pressure.

  13. Improved prescribed performance control for air-breathing hypersonic vehicles with unknown deadzone input nonlinearity.

    PubMed

    Wang, Yingyang; Hu, Jianbo

    2018-05-19

    An improved prescribed performance controller is proposed for the longitudinal model of an air-breathing hypersonic vehicle (AHV) subject to uncertain dynamics and input nonlinearity. Different from the traditional non-affine model requiring non-affine functions to be differentiable, this paper utilizes a semi-decomposed non-affine model with non-affine functions being locally semi-bounded and possibly non-differentiable. A new error transformation combined with novel prescribed performance functions is proposed to bypass the complex deductions caused by conventional error constraint approaches and to circumvent high-frequency chattering in the control inputs. On the basis of the backstepping technique, an improved prescribed performance controller with low structural and computational complexity is designed. The methodology keeps the altitude and velocity tracking errors within transient and steady-state performance envelopes and presents excellent robustness against uncertain dynamics and deadzone input nonlinearity. Simulation results demonstrate the efficacy of the proposed method. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
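
    A hedged numeric sketch of the general prescribed-performance idea: the tracking error is kept inside a shrinking envelope rho(t), and the control acts on a transformed error that grows without bound near the envelope boundary. The envelope parameters, transformation, and toy error dynamics are generic textbook forms, not the specific controller proposed in the paper.

```python
import numpy as np

def rho(t, rho0=2.0, rho_inf=0.1, decay=1.0):
    """Performance envelope: starts at rho0 and decays exponentially to rho_inf."""
    return (rho0 - rho_inf) * np.exp(-decay * t) + rho_inf

def transformed_error(e, t):
    """Map the constrained error e in (-rho, rho) to an unconstrained variable."""
    z = np.clip(e / rho(t), -0.999, 0.999)   # normalized error, kept off the boundary
    return np.log((1 + z) / (1 - z))         # blows up as the envelope boundary is approached

if __name__ == "__main__":
    # Toy first-order error dynamics driven by the transformed error: e_dot = -k * eps(e, t).
    dt, k, e = 0.001, 2.0, 1.5
    for step in range(5000):
        t = step * dt
        e += dt * (-k * transformed_error(e, t))
    print("tracking error after 5 s:", e, "envelope:", rho(5.0))
```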

  14. Comparison of High Performance Network Options: EDR InfiniBand vs. 100Gb RDMA Capable Ethernet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kachelmeier, Luke Anthony; Van Wig, Faith Virginia; Erickson, Kari Natania

    These are the slides for a presentation at the HPC Mini Showcase comparing two high-performance network options: EDR InfiniBand and 100 Gb RDMA-capable Ethernet. The comparison concludes that both show good potential in the direct results; that 100 Gb technology is still new and not standardized, so deployment is complex for both options; that equipment from different companies is not necessarily compatible; and that to reach 100 Gb/s, all components must come from a single vendor.

  15. High Power LaB6 Plasma Source Performance for the Lockheed Martin Compact Fusion Reactor Experiment

    NASA Astrophysics Data System (ADS)

    Heinrich, Jonathon

    2016-10-01

    Lockheed Martin's Compact Fusion Reactor (CFR) concept is a linear encapsulated ring cusp. Due to the complex field geometry, plasma injection into the device requires careful consideration. A high power thermionic plasma source (>0.25 MW; >10 A/cm2) has been developed with consideration of phase space for optimal coupling. We present the performance of the plasma source, a comparison with alternative plasma sources, and plasma coupling with the CFR field configuration. ©2016 Lockheed Martin Corporation. All Rights Reserved.

  16. High-energy physics software parallelization using database techniques

    NASA Astrophysics Data System (ADS)

    Argante, E.; van der Stok, P. D. V.; Willers, I.

    1997-02-01

    A programming model for software parallelization, called CoCa, is introduced that copes with problems caused by typical features of high-energy physics software. Because CoCa is based on the database transaction paradigm, the complexity induced by the parallelization is to a large extent transparent to the programmer, resulting in a higher level of abstraction than native message passing software. CoCa is implemented on a Meiko CS-2 and on a SUN SPARCcenter 2000 parallel computer. On the CS-2, the performance is comparable with that of native PVM and MPI.

  17. Wind tunnel investigation of a high lift system with pneumatic flow control

    NASA Astrophysics Data System (ADS)

    Victor, Pricop Mihai; Mircea, Boscoianu; Daniel-Eugeniu, Crunteanu

    2016-06-01

    Next-generation passenger aircraft require more efficient high-lift systems under size and mass constraints in order to achieve better fuel efficiency. This can be obtained in various ways: improving or maintaining aerodynamic performance while simplifying the mechanical design of the high-lift system by moving to a single-slotted flap, maintaining complexity and improving the aerodynamics further, and so on. Laminar wings have less efficient leading-edge high-lift systems, if any, requiring more performance from the trailing-edge flap. Pulsed-blowing active flow control (AFC) in the gap of a single-element flap is investigated for a relatively large model. The wind tunnel model, test campaign, results and conclusions are presented.

  18. Mutation screening of 75 candidate genes in 152 complex I deficiency cases identifies pathogenic variants in 16 genes including NDUFB9.

    PubMed

    Haack, Tobias B; Madignier, Florence; Herzer, Martina; Lamantea, Eleonora; Danhauser, Katharina; Invernizzi, Federica; Koch, Johannes; Freitag, Martin; Drost, Rene; Hillier, Ingo; Haberberger, Birgit; Mayr, Johannes A; Ahting, Uwe; Tiranti, Valeria; Rötig, Agnes; Iuso, Arcangela; Horvath, Rita; Tesarova, Marketa; Baric, Ivo; Uziel, Graziella; Rolinski, Boris; Sperl, Wolfgang; Meitinger, Thomas; Zeviani, Massimo; Freisinger, Peter; Prokisch, Holger

    2012-02-01

    Mitochondrial complex I deficiency is the most common cause of mitochondrial disease in childhood. Identification of the molecular basis is difficult given the clinical and genetic heterogeneity. Most patients lack a molecular definition in routine diagnostics. A large-scale mutation screen of 75 candidate genes in 152 patients with complex I deficiency was performed by high-resolution melting curve analysis and Sanger sequencing. The causal role of a new disease allele was confirmed by functional complementation assays. The clinical phenotype of patients carrying mutations was documented using a standardised questionnaire. Causative mutations were detected in 16 genes, 15 of which had previously been associated with complex I deficiency: three mitochondrial DNA genes encoding complex I subunits, two mitochondrial tRNA genes and nuclear DNA genes encoding six complex I subunits and four assembly factors. For the first time, a causal mutation is described in NDUFB9, coding for a complex I subunit, resulting in reduction in NDUFB9 protein and both amount and activity of complex I. These features were rescued by expression of wild-type NDUFB9 in patient-derived fibroblasts. Mutant NDUFB9 is a new cause of complex I deficiency. A molecular diagnosis related to complex I deficiency was established in 18% of patients. However, most patients are likely to carry mutations in genes so far not associated with complex I function. The authors conclude that the high degree of genetic heterogeneity in complex I disorders warrants the implementation of unbiased genome-wide strategies for the complete molecular dissection of mitochondrial complex I deficiency.

  19. A tree-like Bayesian structure learning algorithm for small-sample datasets from complex biological model systems.

    PubMed

    Yin, Weiwei; Garimalla, Swetha; Moreno, Alberto; Galinski, Mary R; Styczynski, Mark P

    2015-08-28

    There are increasing efforts to bring high-throughput systems biology techniques to bear on complex animal model systems, often with a goal of learning about underlying regulatory network structures (e.g., gene regulatory networks). However, complex animal model systems typically have significant limitations on cohort sizes, number of samples, and the ability to perform follow-up and validation experiments. These constraints are particularly problematic for many current network learning approaches, which require large numbers of samples and may predict many more regulatory relationships than actually exist. Here, we test the idea that by leveraging the accuracy and efficiency of classifiers, we can construct high-quality networks that capture important interactions between variables in datasets with few samples. We start from a previously-developed tree-like Bayesian classifier and generalize its network learning approach to allow for arbitrary depth and complexity of tree-like networks. Using four diverse sample networks, we demonstrate that this approach performs consistently better at low sample sizes than the Sparse Candidate Algorithm, a representative approach for comparison because it is known to generate Bayesian networks with high positive predictive value. We develop and demonstrate a resampling-based approach to enable the identification of a viable root for the learned tree-like network, important for cases where the root of a network is not known a priori. We also develop and demonstrate an integrated resampling-based approach to the reduction of variable space for the learning of the network. Finally, we demonstrate the utility of this approach via the analysis of a transcriptional dataset of a malaria challenge in a non-human primate model system, Macaca mulatta, suggesting the potential to capture indicators of the earliest stages of cellular differentiation during leukopoiesis. We demonstrate that by starting from effective and efficient approaches for creating classifiers, we can identify interesting tree-like network structures with significant ability to capture the relationships in the training data. This approach represents a promising strategy for inferring networks with high positive predictive value under the constraint of small numbers of samples, meeting a need that will only continue to grow as more high-throughput studies are applied to complex model systems.
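
    A hedged sketch in a similar spirit, showing the classic Chow-Liu approach of learning a tree over pairwise mutual information with a maximum spanning tree; this is not the authors' classifier-based algorithm, but it illustrates why restricting the structure to a tree keeps the number of estimated parameters small when samples are scarce.

```python
import numpy as np
from itertools import combinations

def mutual_information(x, y):
    """MI between two discrete variables from their empirical joint distribution."""
    mi, n = 0.0, len(x)
    for a in np.unique(x):
        for b in np.unique(y):
            pxy = np.mean((x == a) & (y == b))
            if pxy > 0:
                mi += pxy * np.log(pxy / (np.mean(x == a) * np.mean(y == b)))
    return mi

def chow_liu_tree(data):
    """Return tree edges (i, j) maximizing total pairwise MI (Kruskal's algorithm)."""
    d = data.shape[1]
    scored = sorted(((mutual_information(data[:, i], data[:, j]), i, j)
                     for i, j in combinations(range(d), 2)), reverse=True)
    parent = list(range(d))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    edges = []
    for mi, i, j in scored:
        ri, rj = find(i), find(j)
        if ri != rj:                 # adding this edge keeps the graph acyclic
            parent[ri] = rj
            edges.append((i, j))
    return edges

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    a = rng.integers(0, 2, 40)                        # only 40 samples, mimicking a small cohort
    b = np.where(rng.random(40) < 0.15, 1 - a, a)     # b mostly copies a
    c = rng.integers(0, 2, 40)                        # independent variable
    print(chow_liu_tree(np.column_stack([a, b, c])))
```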

  20. Enhancing NMR of insensitive nuclei by transfer of SABRE spin hyperpolarization

    NASA Astrophysics Data System (ADS)

    Pravdivtsev, Andrey N.; Yurkovskaya, Alexandra V.; Zimmermann, Herbert; Vieth, Hans-Martin; Ivanov, Konstantin L.

    2016-09-01

    We describe the performance of methods for enhancing NMR (Nuclear Magnetic Resonance) signals of "insensitive", but important NMR nuclei, which are based on the SABRE (Signal Amplification By Reversible Exchange) technique, i.e., on spin order transfer from parahydrogen (H2 molecule in its nuclear singlet spin state) to a substrate in a transient organometallic complex. Here such transfer is performed at high magnetic fields by INEPT-type NMR pulse sequences, modified for SABRE. Signal enhancements up to three orders of magnitude are obtained for 15N nuclei; the possibility of sensitive detection of 2D-NMR 1H-15N spectra of SABRE complexes and substrates is demonstrated.

  1. Silver complexation and tandem mass spectrometry for differentiation of isomeric flavonoid diglycosides.

    PubMed

    Zhang, Junmei; Brodbelt, Jennifer S

    2005-03-15

    For detection and differentiation of isomeric flavonoids, electrospray ionization mass spectrometry is used to generate silver complexes of the type (Ag + flavonoid)+. Collisionally activated dissociation (CAD) of the resulting 1:1 silver/flavonoid complexes allows isomer differentiation of flavonoids. Eighteen flavonoid diglycosides constituting seven isomeric series are distinguishable from each other based on the CAD patterns of their silver complexes. Characteristic dissociation pathways allow identification of the site of glycosylation, the type of disaccharide (rutinose versus neohesperidose), and the type of aglycon (flavonol versus flavone versus flavanone). This silver complexation method is more universal than previous metal complexation methods, as intense silver complexes are observed even for flavonoids that lack the typical metal chelation sites. To demonstrate the feasibility of using silver complexation and tandem mass spectrometry to characterize flavonoids in complex mixtures, flavonoids extracted from grapefruit juice are separated by high-performance liquid chromatography and analyzed via a postcolumn complexation ESI-MS/MS strategy. Diagnostic fragmentation pathways of the silver complexes of the individual eluting flavonoids allow successful identification of the six flavonoids in the extract.

  2. Immune complexes in serum of rats during infection with Plasmodium berghei.

    PubMed

    Alder, J D; Kreier, J P

    1989-01-01

    Large amounts of immune complexes were present in the serum of infected rats early in infection, when parasitemias were low. As the infection progressed and parasitemia increased and then decreased, the amounts of immune complexes in the serum also fell. This result suggests that increased efficiency of complex clearance was an important factor in determining the levels of immune complexes in the serum. In high performance liquid chromatography (HPLC), the complexes in the serum migrated as a peak with material of 350 kDa and greater in mass. They sedimented in a sucrose gradient as a band with a sedimentation coefficient of 22 S, which was calculated to yield a mass of approximately 1100 kDa. Immunoelectrophoresis and radial immunodiffusion showed that IgG was the major immunoglobulin in the complexes. As the IgG content of the complexes increased, the levels of complexes in the serum generally decreased. HPLC analysis of precipitated complexes suggested that they contained loosely bound albumin. Serum proteins were affected by the infection. A depletion of free immunoglobulin was observed during the initial period of immune complex formation.

  3. High Density Hydrogen Storage System Demonstration Using NaAlH4 Based Complex Compound Hydrides

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daniel A. Mosher; Xia Tang; Ronald J. Brown

    2007-07-27

    This final report describes the motivations, activities and results of the hydrogen storage independent project "High Density Hydrogen Storage System Demonstration Using NaAlH4 Based Complex Compound Hydrides" performed by the United Technologies Research Center under the Department of Energy Hydrogen Program, contract # DE-FC36-02AL67610. The objectives of the project were to identify and address the key systems technologies associated with applying complex hydride materials, particularly ones which differ from those for conventional metal hydride based storage. This involved the design, fabrication and testing of two prototype systems based on the hydrogen storage material NaAlH4. Safety testing, catalysis studies, heat exchanger optimization, reaction kinetics modeling, thermochemical finite element analysis, powder densification development and material neutralization were elements included in the effort.

  4. Mammalian gravity receptors: Structure and metabolism

    NASA Technical Reports Server (NTRS)

    Ross, M.

    1984-01-01

    High performance liquid chromatography (HPLC) instrumentation was used for amino acid analysis of rat otoconial complexes. The amino acids of otoconial complexes pooled by origin from only 10 rats were analyzed. The results indicate that it should be possible to analyze complexes from only three rats, and perhaps fewer, which means that the method should be applicable to material from space-flown rats. It is suggested that the organic otoconial phase is comparable in its complement of acidic amino acids to other calcium carbonate-containing materials such as fish otoliths and certain mollusk shells. The organic material is high in acidic amino acids, and the relative proportions of aspartate, glutamate, threonine and serine appear to be similar to those found in neogastropod shells. Its significance to the evolution of biomineralization processes occurring in the animal kingdom is emphasized.

  5. Cost to Set up Common Languages

    NASA Astrophysics Data System (ADS)

    Latora, Vito

    Complexity is a highly interdisciplinary science. Although there are drawbacks for researchers to work at the interface of different fields, such as the cost to set up common languages, and the risks associated with not being recognized by any of the well-established scientific communities, some of my recent work indicates that interdisciplinarity can be extremely rewarding. Drawing on large data sets on scientific production during several decades, we have shown that highly interdisciplinary scholars can outperform specialized ones, and that scientists can enhance their performance by seeking collaborators with expertise in various fields. My vision for complexity is based on the added value of its interdisciplinary nature. I list below three research directions that I am personally eager to explore, and that I think will be among the main challenges of complexity in the next 10 years...

  6. [Predicting individual risk of high healthcare cost to identify complex chronic patients].

    PubMed

    Coderch, Jordi; Sánchez-Pérez, Inma; Ibern, Pere; Carreras, Marc; Pérez-Berruezo, Xavier; Inoriza, José M

    2014-01-01

    To develop a predictive model for the risk of high consumption of healthcare resources, and to assess the ability of the model to identify complex chronic patients. A cross-sectional study was performed within a healthcare management organization using individual data from 2 consecutive years (88,795 people). The dependent variable consisted of healthcare costs above the 95th percentile (P95), including all services provided by the organization and pharmaceutical consumption outside of the institution. The predictive variables were age, sex, morbidity (based on clinical risk groups, CRG) and selected data from previous utilization (use of hospitalization, use of high-cost drugs in ambulatory care, pharmaceutical expenditure). A univariate descriptive analysis was performed. We constructed a logistic regression model with a 95% confidence level and analyzed sensitivity, specificity, positive predictive values (PPV), and the area under the ROC curve (AUC). Individuals incurring costs >P95 accumulated 44% of total healthcare costs and were concentrated in ACRG3 (aggregated CRG level 3) categories related to multiple chronic diseases. All variables were statistically significant except for sex. The model had a sensitivity of 48.4% (CI: 46.9%-49.8%), specificity of 97.2% (CI: 97.0%-97.3%), PPV of 46.5% (CI: 45.0%-47.9%), and an AUC of 0.897 (CI: 0.892-0.902). High consumption of healthcare resources is associated with complex chronic morbidity. A model based on age, morbidity, and prior utilization is able to predict high-cost risk and identify a target population requiring proactive care. Copyright © 2013 SESPAS. Published by Elsevier España. All rights reserved.
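
    A hedged sketch of the kind of model described (logistic regression on age, morbidity burden, and prior utilization predicting cost above the 95th percentile); the synthetic data, feature names, and coefficients below are illustrative stand-ins, not the study's variables or results.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 20000
age = rng.integers(0, 95, n).astype(float)
morbidity = rng.poisson(1.5, n).astype(float)        # stand-in for a CRG-like burden score
prior_hosp = rng.binomial(1, 0.08, n).astype(float)  # any hospitalization in the previous year
prior_pharma = rng.gamma(2.0, 0.3, n)                # previous-year pharmacy spend (thousands)

# Synthetic "true" risk combining the predictors, used only to label the example data.
logit = -6 + 0.02 * age + 0.9 * morbidity + 1.2 * prior_hosp + 0.8 * prior_pharma
high_cost = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([age, morbidity, prior_hosp, prior_pharma])
model = LogisticRegression(max_iter=2000).fit(X[:15000], high_cost[:15000])
probs = model.predict_proba(X[15000:])[:, 1]
print("held-out AUC:", round(roc_auc_score(high_cost[15000:], probs), 3))
```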

  7. Intellectual development is positively related to intrinsic motivation and course grades for female but not male students.

    PubMed

    Cortright, Ronald N; Lujan, Heidi L; Cox, Julie H; Cortright, Maria A; Langworthy, Brandon M; Petta, Lorene M; Tanner, Charles J; DiCarlo, Stephen E

    2015-09-01

    We hypothesized that the intellectual development of students, i.e., their beliefs about the nature of knowledge and learning, affects their intrinsic motivation and class performance. Specifically, we hypothesized that students with low intellectual development (i.e., the naive beliefs that knowledge is simple, absolute, and certain) have low intrinsic motivation and low class performance, whereas students with high intellectual development (i.e., more sophisticated beliefs that knowledge is complex, tentative, and evolving) have high intrinsic motivation and class performance. To test this hypothesis, we administered the Learning Context Questionnaire to measure intellectual development. In addition, we administered the Intrinsic Motivation Inventory to assess our students' intrinsic motivation. Furthermore, we performed regression analyses between intellectual development with both intrinsic motivation and class performance. The results document a positive relationship among intellectual development, intrinsic motivation, and class performance for female students only. In sharp contrast, there was a negative relationship between intellectual development, intrinsic motivation, and class performance for male students. The slope comparisons documented significant differences in the slopes relating intellectual development, intrinsic motivation, and class performance between female and male students. Thus, female students with more sophisticated beliefs that knowledge is personally constructed, complex, and evolving had higher intrinsic motivation and class performance. In contrast, male students with the naive beliefs that the structure of knowledge is simple, absolute, and certain had higher levels of intrinsic motivation and class performance. The results suggest that sex influences intellectual development, which has an effect on intrinsic motivation for learning a specific topic. Copyright © 2015 The American Physiological Society.

  8. Dynamic Target Acquisition: Empirical Models of Operator Performance.

    DTIC Science & Technology

    1980-08-01

    [Table residue only: ordered mean values for the Signature x Scene Complexity and Signature x Speed interactions at 30,000 ft initial slant range; no abstract text recovered.]

  9. User's guide for ENSAERO: A multidisciplinary program for fluid/structural/control interaction studies of aircraft (release 1)

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.

    1994-01-01

    Strong interactions can occur between the flow about an aerospace vehicle and its structural components resulting in several important aeroelastic phenomena. These aeroelastic phenomena can significantly influence the performance of the vehicle. At present, closed-form solutions are available for aeroelastic computations when flows are in either the linear subsonic or supersonic range. However, for aeroelasticity involving complex nonlinear flows with shock waves, vortices, flow separations, and aerodynamic heating, computational methods are still under development. These complex aeroelastic interactions can be dangerous and limit the performance of aircraft. Examples of these detrimental effects are aircraft with highly swept wings experiencing vortex-induced aeroelastic oscillations, transonic regime at which the flutter speed is low, aerothermoelastic loads that play a critical role in the design of high-speed vehicles, and flow separations that often lead to buffeting with undesirable structural oscillations. The simulation of these complex aeroelastic phenomena requires an integrated analysis of fluids and structures. This report presents a summary of the development, applications, and procedures to use the multidisciplinary computer code ENSAERO. This code is based on the Euler/Navier-Stokes flow equations and modal/finite-element structural equations.

  10. Here and now: the intersection of computational science, quantum-mechanical simulations, and materials science

    NASA Astrophysics Data System (ADS)

    Marzari, Nicola

    The last 30 years have seen the steady and exhilarating development of powerful quantum-simulation engines for extended systems, dedicated to the solution of the Kohn-Sham equations of density-functional theory, often augmented by density-functional perturbation theory, many-body perturbation theory, time-dependent density-functional theory, dynamical mean-field theory, and quantum Monte Carlo. Their implementation on massively parallel architectures, now leveraging also GPUs and accelerators, has started a massive effort in the prediction from first principles of many or complex materials properties, leading the way to the exascale through the combination of HPC (high-performance computing) and HTC (high-throughput computing). Challenges and opportunities abound: complementing hardware and software investments and design; developing the materials' informatics infrastructure needed to encode knowledge into complex protocols and workflows of calculations; managing and curating data; resisting the complacency that we have already reached the predictive accuracy needed for materials design, or a robust level of verification of the different quantum engines. In this talk I will provide an overview of these challenges, with the ultimate prize being the computational understanding, prediction, and design of properties and performance for novel or complex materials and devices.

  11. Medication Management: The Macrocognitive Workflow of Older Adults With Heart Failure

    PubMed Central

    2016-01-01

    Background: Older adults with chronic disease struggle to manage complex medication regimens. Health information technology has the potential to improve medication management, but only if it is based on a thorough understanding of the complexity of medication management workflow as it occurs in natural settings. Prior research reveals that patient work related to medication management is complex, cognitive, and collaborative. Macrocognitive processes are theorized as how people individually and collaboratively think in complex, adaptive, and messy nonlaboratory settings supported by artifacts. Objective: The objective of this research was to describe and analyze the work of medication management by older adults with heart failure, using a macrocognitive workflow framework. Methods: We interviewed and observed 61 older patients along with 30 informal caregivers about self-care practices including medication management. Descriptive qualitative content analysis methods were used to develop categories, subcategories, and themes about macrocognitive processes used in medication management workflow. Results: We identified 5 high-level macrocognitive processes affecting medication management—sensemaking, planning, coordination, monitoring, and decision making—and 15 subprocesses. Data revealed workflow as occurring in a highly collaborative, fragile system of interacting people, artifacts, time, and space. Process breakdowns were common and patients had little support for macrocognitive workflow from current tools. Conclusions: Macrocognitive processes affected medication management performance. Describing and analyzing this performance produced recommendations for technology supporting collaboration and sensemaking, decision making and problem detection, and planning and implementation. PMID:27733331

  12. Magnetic navigation facilitates percutaneous coronary intervention for complex lesions.

    PubMed

    Sandhu, Gurpreet S; Sanon, Saurabh; Holmes, David R; Gulati, Rajiv; Brilakis, Emmanouil S; Lennon, Ryan J; Rihal, Charanjit S

    2014-10-01

    We sought to determine the utility of a magnetic navigation system (MNS) in treating a variety of coronary artery lesions including those that could not be revascularized with standard percutaneous coronary intervention (PCI). MNS may add value in the treatment of tortuous arteries and bifurcation lesions; however, its widespread adoption has lagged because of cost and a lack of clear advantage over conventional PCI. We performed a retrospective analysis to determine whether MNS improved procedural success for highly complex lesions. One hundred and forty-eight patients underwent treatment with MNS at Mayo Clinic, Rochester. Clinical data, angiographic and procedural characteristics, lesion crossing success and outcomes were reviewed. Overall, 133 patients underwent successful revascularization with 87% (143) of 164 lesions crossed using MNS alone. Another six lesions required a combination of MNS and conventional devices, resulting in an overall success rate of 91% (149/164). Eighteen complex lesions had previously failed PCI and 12 (67%) were successfully treated with MNS. Success after failed PCI was higher (88%) when a frequent user operated MNS, but occasional users also noted incremental success (30%). Twenty-five chronic total occlusions were included amongst these 164 lesions, with observed antegrade MNS lesion crossing rates of 78% for regular and 14% for occasional users. MNS is a useful adjunct to the performance of PCI. This specialized technology has a clear learning curve and can facilitate treatment of highly complex lesions. © 2013 Wiley Periodicals, Inc.

  13. Rapid identification of a novel complex I MT-ND3 m.10134C>A mutation in a Leigh syndrome patient.

    PubMed

    Miller, David K; Menezes, Minal J; Simons, Cas; Riley, Lisa G; Cooper, Sandra T; Grimmond, Sean M; Thorburn, David R; Christodoulou, John; Taft, Ryan J

    2014-01-01

    Leigh syndrome (LS) is a rare progressive multi-system neurodegenerative disorder, the genetics of which is frequently difficult to resolve. Rapid determination of the genetic etiology of LS in a 5-year-old girl facilitated inclusion in Edison Pharmaceutical's phase 2B clinical trial of EPI-743. SNP-arrays and high-coverage whole exome sequencing were performed on the proband, both parents and three unaffected siblings. Subsequent multi-tissue targeted high-depth mitochondrial sequencing was performed using custom long-range PCR amplicons. Tissue-specific mutant load was also assessed by qPCR. Complex I was interrogated by spectrophotometric enzyme assays and Western Blot. No putatively causal mutations were identified in nuclear-encoded genes. Analysis of low-coverage off-target mitochondrial reads revealed a previously unreported mitochondrial mutation in the proband in MT-ND3 (m.10134C>A, p.Q26K), a Complex I mitochondrial gene previously associated with LS. Targeted investigations demonstrated that this mutation was 1% heteroplasmic in the mother's blood and homoplasmic in the proband's blood, fibroblasts, liver and muscle. Enzyme assays revealed decreased Complex I activity. The identification of this novel LS MT-ND3 variant, the genomics of which was accomplished in less than 3.5 weeks, indicates that rapid genomic approaches may prove useful in time-sensitive cases with an unresolved genetic diagnosis.

  14. Medication Management: The Macrocognitive Workflow of Older Adults With Heart Failure.

    PubMed

    Mickelson, Robin S; Unertl, Kim M; Holden, Richard J

    2016-10-12

    Older adults with chronic disease struggle to manage complex medication regimens. Health information technology has the potential to improve medication management, but only if it is based on a thorough understanding of the complexity of medication management workflow as it occurs in natural settings. Prior research reveals that patient work related to medication management is complex, cognitive, and collaborative. Macrocognitive processes are theorized as how people individually and collaboratively think in complex, adaptive, and messy nonlaboratory settings supported by artifacts. The objective of this research was to describe and analyze the work of medication management by older adults with heart failure, using a macrocognitive workflow framework. We interviewed and observed 61 older patients along with 30 informal caregivers about self-care practices including medication management. Descriptive qualitative content analysis methods were used to develop categories, subcategories, and themes about macrocognitive processes used in medication management workflow. We identified 5 high-level macrocognitive processes affecting medication management (sensemaking, planning, coordination, monitoring, and decision making) and 15 subprocesses. Data revealed workflow as occurring in a highly collaborative, fragile system of interacting people, artifacts, time, and space. Process breakdowns were common and patients had little support for macrocognitive workflow from current tools. Macrocognitive processes affected medication management performance. Describing and analyzing this performance produced recommendations for technology supporting collaboration and sensemaking, decision making and problem detection, and planning and implementation.

  15. Electroreduction of CO 2 Catalyzed by a Heterogenized Zn–Porphyrin Complex with a Redox-Innocent Metal Center

    DOE PAGES

    Wu, Yueshen; Jiang, Jianbing; Weng, Zhe; ...

    2017-07-26

    Transition-metal-based molecular complexes are a class of catalyst materials for electrochemical CO2 reduction to CO that can be rationally designed to deliver high catalytic performance. One common mechanistic feature of these electrocatalysts developed thus far is an electrogenerated reduced metal center associated with catalytic CO2 reduction. Here in this paper, we report a heterogenized zinc–porphyrin complex (zinc(II) 5,10,15,20-tetramesitylporphyrin) as an electrocatalyst that delivers a turnover frequency as high as 14.4 site–1 s–1 and a Faradaic efficiency as high as 95% for CO2 electroreduction to CO at -1.7 V vs the standard hydrogen electrode in an organic/water mixed electrolyte. While the Zn center is critical to the observed catalysis, in situ and operando X-ray absorption spectroscopic studies reveal that it is redox-innocent throughout the potential range. Cyclic voltammetry indicates that the porphyrin ligand may act as a redox mediator. Chemical reduction of the zinc–porphyrin complex further confirms that the reduction is ligand-based and the reduced species can react with CO2. This represents the first example of a transition-metal complex for CO2 electroreduction catalysis with its metal center being redox-innocent under working conditions.

  16. Electroreduction of CO2 Catalyzed by a Heterogenized Zn–Porphyrin Complex with a Redox-Innocent Metal Center

    PubMed Central

    2017-01-01

    Transition-metal-based molecular complexes are a class of catalyst materials for electrochemical CO2 reduction to CO that can be rationally designed to deliver high catalytic performance. One common mechanistic feature of these electrocatalysts developed thus far is an electrogenerated reduced metal center associated with catalytic CO2 reduction. Here we report a heterogenized zinc–porphyrin complex (zinc(II) 5,10,15,20-tetramesitylporphyrin) as an electrocatalyst that delivers a turnover frequency as high as 14.4 site–1 s–1 and a Faradaic efficiency as high as 95% for CO2 electroreduction to CO at −1.7 V vs the standard hydrogen electrode in an organic/water mixed electrolyte. While the Zn center is critical to the observed catalysis, in situ and operando X-ray absorption spectroscopic studies reveal that it is redox-innocent throughout the potential range. Cyclic voltammetry indicates that the porphyrin ligand may act as a redox mediator. Chemical reduction of the zinc–porphyrin complex further confirms that the reduction is ligand-based and the reduced species can react with CO2. This represents the first example of a transition-metal complex for CO2 electroreduction catalysis with its metal center being redox-innocent under working conditions. PMID:28852698
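
    For orientation, the two figures of merit quoted above (Faradaic efficiency for CO and turnover frequency per site) are commonly computed from the charge passed and the amount of CO detected. The sketch below uses placeholder numbers, not data from this study.

        # Rough sketch of the usual definitions of Faradaic efficiency (FE) and
        # turnover frequency (TOF) for CO2-to-CO electroreduction; all values are
        # hypothetical placeholders.
        F = 96485.0        # Faraday constant, C per mole of electrons
        n_e = 2            # electrons per CO molecule (CO2 + 2 H+ + 2 e- -> CO + H2O)

        total_charge_C = 10.0      # charge passed during electrolysis (hypothetical)
        mol_CO = 4.9e-5            # moles of CO detected, e.g. by GC (hypothetical)
        mol_sites = 2.0e-9         # moles of accessible catalytic sites (hypothetical)
        duration_s = 1800.0        # electrolysis time in seconds (hypothetical)

        fe_CO = n_e * F * mol_CO / total_charge_C      # fraction of charge ending up in CO
        tof = mol_CO / (mol_sites * duration_s)        # CO molecules per site per second

        print(f"FE(CO) = {fe_CO:.1%}, TOF = {tof:.1f} site^-1 s^-1")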

  17. Electroreduction of CO 2 Catalyzed by a Heterogenized Zn–Porphyrin Complex with a Redox-Innocent Metal Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Yueshen; Jiang, Jianbing; Weng, Zhe

    Transition-metal-based molecular complexes are a class of catalyst materials for electrochemical CO2 reduction to CO that can be rationally designed to deliver high catalytic performance. One common mechanistic feature of these electrocatalysts developed thus far is an electrogenerated reduced metal center associated with catalytic CO2 reduction. Here in this paper, we report a heterogenized zinc–porphyrin complex (zinc(II) 5,10,15,20-tetramesitylporphyrin) as an electrocatalyst that delivers a turnover frequency as high as 14.4 site–1 s–1 and a Faradaic efficiency as high as 95% for CO2 electroreduction to CO at -1.7 V vs the standard hydrogen electrode in an organic/water mixed electrolyte. While the Zn center is critical to the observed catalysis, in situ and operando X-ray absorption spectroscopic studies reveal that it is redox-innocent throughout the potential range. Cyclic voltammetry indicates that the porphyrin ligand may act as a redox mediator. Chemical reduction of the zinc–porphyrin complex further confirms that the reduction is ligand-based and the reduced species can react with CO2. This represents the first example of a transition-metal complex for CO2 electroreduction catalysis with its metal center being redox-innocent under working conditions.

  18. Electroreduction of CO 2 Catalyzed by a Heterogenized Zn–Porphyrin Complex with a Redox-Innocent Metal Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Yueshen; Jiang, Jianbing; Weng, Zhe

    Transition-metal-based molecular complexes are a class of catalyst materials for electrochemical CO2 reduction to CO that can be rationally designed to deliver high catalytic performance. One common mechanistic feature of these electrocatalysts developed thus far is an electrogenerated reduced metal center associated with catalytic CO2 reduction. Here we report a heterogenized zinc–porphyrin complex (zinc(II) 5,10,15,20-tetramesitylporphyrin) as an electrocatalyst that delivers a turnover frequency as high as 14.4 site–1 s–1 and a Faradaic efficiency as high as 95% for CO2 electroreduction to CO at -1.7 V vs the standard hydrogen electrode in an organic/water mixed electrolyte. While the Zn center is critical to the observed catalysis, in situ and operando X-ray absorption spectroscopic studies reveal that it is redox-innocent throughout the potential range. Cyclic voltammetry indicates that the porphyrin ligand may act as a redox mediator. Chemical reduction of the zinc–porphyrin complex further confirms that the reduction is ligand-based and the reduced species can react with CO2. This represents the first example of a transition-metal complex for CO2 electroreduction catalysis with its metal center being redox-innocent under working conditions.

  19. Comparison of fresh-frozen cadaver and high-fidelity virtual reality simulator as methods of laparoscopic training.

    PubMed

    Sharma, Mitesh; Horgan, Alan

    2012-08-01

    The aim of this study was to compare fresh-frozen cadavers (FFC) with a high-fidelity virtual reality simulator (VRS) as training tools in minimal access surgery for complex and relatively simple procedures. A prospective comparative face validity study between FFC and VRS (LAP Mentor™) was performed. Surgeons were recruited to perform tasks on both FFC and VRS appropriately paired to their experience level. Group A (senior) performed a laparoscopic sigmoid colectomy, Group B (intermediate) performed a laparoscopic incisional hernia repair, and Group C (junior) performed basic laparoscopic tasks (BLT) (camera manipulation, hand-eye coordination, tissue dissection and hand-transferring skills). Each subject completed a 5-point Likert-type questionnaire rating the training modalities in nine domains. Data were analysed using nonparametric tests. Forty-five surgeons were recruited to participate (15 per skill group). Median scores for subjects in Group A were significantly higher for evaluation of FFC in all nine domains compared to VRS (p < 0.01). Group B scored FFC significantly better (p < 0.05) in all domains except task replication (p = 0.06). Group C scored FFC significantly better (p < 0.01) in eight domains but not on performance feedback (p = 0.09). When compared across groups, juniors accepted VRS as a training model more than did intermediate and senior groups on most domains (p < 0.01) except team work. The fresh-frozen cadaver is perceived as a significantly better overall model for laparoscopic training than the high-fidelity VRS by all training grades, irrespective of the complexity of the operative procedure performed. VRS is still useful when training junior trainees in BLT.
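
    As an illustration of the kind of nonparametric comparison reported above (each surgeon rates both modalities on 5-point Likert items), a paired Wilcoxon signed-rank test can be sketched as follows; the ratings are invented for the example.

        # Hedged sketch of a paired nonparametric comparison of Likert ratings;
        # the numbers below are made up and stand for a single rating domain.
        from scipy.stats import wilcoxon

        ffc = [5, 4, 5, 5, 4, 5, 3, 5, 4, 5, 5, 4, 5, 4, 5]   # fresh-frozen cadaver ratings
        vrs = [3, 3, 4, 2, 3, 3, 3, 4, 2, 3, 4, 3, 3, 2, 3]   # virtual reality simulator ratings

        stat, p = wilcoxon(ffc, vrs)   # paired, two-sided by default
        print(f"Wilcoxon signed-rank: statistic = {stat:.1f}, p = {p:.4f}")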

  20. Formation of core-shell structured complex microparticles during fabrication of magnetorheological elastomers and their magnetorheological behavior

    NASA Astrophysics Data System (ADS)

    Wang, Yonghong; Zhang, Xinru; Chung, Kyungho; Liu, Chengcen; Choi, Seung-Bok; Choi, Hyoung Jin

    2016-11-01

    To improve mechanical and magnetorheological properties of magnetorheological elastomers (MREs), a facile method was used to fabricate high-performance MREs which consisted of the core-shell complex microparticles with an organic-inorganic network structure dispersed in an ethylene propylene diene rubber. In this work, the proposed magnetic complex microparticles were in situ formed during MREs fabrication as a result of strong interaction between the matrix and carbonyl iron particles (CIPs), using carbon black as a connecting point. The morphology of both isotropic (i-MREs) and anisotropic MREs (a-MREs) was observed by scanning electron microscope (SEM). The effects of CIP volume content on mechanical properties and hysteresis loss of MREs were investigated. The effects of CIP volume content on the shear storage modulus, MR effect and loss tangent were studied using a modified dynamic mechanical analyzer under applied magnetic field strengths. The results showed that the orientation effect became more pronounced with increasing CIPs in the a-MREs, whereas CIPs distributed uniformly in the i-MREs. The tensile strength, tear strength and elongation at break decreased with increasing CIP content up to 40 vol.%, while the hardness increased. It is worth noting that i-MREs and a-MREs containing 40 vol.% CIPs still retained high tensile strength as a result of good compatibility between complex microparticles and rubber matrix. The MR performance of shear storage modulus and damping properties of MREs increased remarkably with CIP content due to strong dipole-dipole interaction of complex microparticles. In addition, the hysteresis loss increased with increasing CIP content as a result of magnetic field induced interfacial sliding between complex microparticles.

  1. Defect states of complexes involving a vacancy on the boron site in boronitrene

    NASA Astrophysics Data System (ADS)

    Ngwenya, T. B.; Ukpong, A. M.; Chetty, N.

    2011-12-01

    First principles calculations have been performed to investigate the ground state properties of freestanding monolayer hexagonal boronitrene (h-BN). We have considered monolayers that contain native point defects and their complexes, which form when the point defects bind with the boron vacancy on the nearest-neighbor position. The changes in the electronic structure are analyzed to show the extent of localization of the defect-induced midgap states. The variations in formation energies suggest that defective h-BN monolayers that contain carbon substitutional impurities are the most stable structures, irrespective of the changes in growth conditions. The high energies of formation of the boron vacancy complexes suggest that they are less stable, and their creation by ion bombardment would require high-energy ions compared to point defects. Using the relative positions of the derived midgap levels for the double vacancy complex, it is shown that the quasi-donor-acceptor pair interpretation of optical transitions is consistent with stimulated transitions between electron and hole states in boronitrene.

  2. Natural radionuclides in the rocks of the Valle del Cervo Pluton in Piedmont.

    PubMed

    Sesana, Lucia; Fumagalli, Marco; Carnevale, Mauro; Polla, Giancarla; Facchini, Ugo; Colombo, Annita; Tunesi, Annalisa; De Capitani, Luisa; Rusconi, Rosella

    2006-01-01

    Monitoring of the gamma radiation in Valle del Cervo Pluton was performed by determining U and Th contents in the main rock types cropping out over the entire area and pertaining to the granitic complex, syenitic complex and monzonitic complex. In particular, syenitic rocks were largely used as building and ornamental materials (e.g. Sienite della Balma). All the samples are fresh and do not present joints or fractures filled with U minerals. In the crushed samples the activity of uranium varies from 346 to 764 Bq/kg. Concentration of thorium varies from 202 to 478 Bq/kg. For all the analysed rocks, uranium activity is higher than that of thorium. The lowest activity concentrations are found in rocks of the granitic complex. The most active rocks are syenites. The data confirm the high activities of Valle del Cervo rock types, strongly connected with high K content of the source magma (geochemical signature); by contrast, the activity appears to be unrelated to the location of the samples.

  3. Rational Design and Facile Synthesis of Boranophosphate Ionic Liquids as Hypergolic Rocket Fuels.

    PubMed

    Liu, Tianlin; Qi, Xiujuan; Wang, Binshen; Jin, Yunhe; Yan, Chao; Wang, Yi; Zhang, Qinghua

    2018-05-14

    The design and synthesis of new hypergolic ionic liquids (HILs) as replacements for toxic hydrazine derivatives have been the focus of current academic research in the field of liquid bipropellant fuels. In most cases, however, the requirements of excellent ignition performances, good hydrolytic stabilities, and low synthetic costs are often contradictory, which makes the development of high-performance HILs an enormous challenge. Here, we show how a fuel-rich boranophosphate ion was rationally designed and used to synthesize a series of high-performance HILs with excellent comprehensive properties. In the design strategy, we introduced the {BH3} moiety into the boranophosphate ion for improving the self-ignition property, whereas the complexation of boron and phosphite was used to improve the hydrolytic activity of the borohydride species. As a result, these boranophosphate HILs exhibited wide liquid operating ranges (>220 °C), high densities (1.00-1.10 g cm⁻³), good hydrolytic stabilities, and short ignition delay times (2.3-9.7 milliseconds) with white fuming nitric acid (WFNA) as the oxidizer. More importantly, these boranophosphate HILs could be readily prepared in high yields from commercial phosphite esters, avoiding complex and time-consuming synthetic routes. This work offers an effective strategy of designing boranophosphate HILs towards safer and greener hypergolic fuels for liquid bipropellant applications. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. High Resolution Aerospace Applications using the NASA Columbia Supercomputer

    NASA Technical Reports Server (NTRS)

    Mavriplis, Dimitri J.; Aftosmis, Michael J.; Berger, Marsha

    2005-01-01

    This paper focuses on the parallel performance of two high-performance aerodynamic simulation packages on the newly installed NASA Columbia supercomputer. These packages include both a high-fidelity, unstructured, Reynolds-averaged Navier-Stokes solver, and a fully-automated inviscid flow package for cut-cell Cartesian grids. The complementary combination of these two simulation codes enables high-fidelity characterization of aerospace vehicle design performance over the entire flight envelope through extensive parametric analysis and detailed simulation of critical regions of the flight envelope. Both packages are industrial-level codes designed for complex geometry and incorporate customized multigrid solution algorithms. The performance of these codes on Columbia is examined using both MPI and OpenMP and using both the NUMAlink and InfiniBand interconnect fabrics. Numerical results demonstrate good scalability on up to 2016 CPUs using the NUMAlink4 interconnect, with measured computational rates in the vicinity of 3 TFLOP/s, while InfiniBand showed some performance degradation at high CPU counts, particularly with multigrid. Nonetheless, the results are encouraging enough to indicate that larger test cases using combined MPI/OpenMP communication should scale well on even more processors.

  5. New strategies for the synthesis of naphthoquinones employing Cu(II) complexes: Crystal structures and cytotoxicity

    NASA Astrophysics Data System (ADS)

    Azeredo, Nathália F. B.; Souza, Fabrícia P.; Demidoff, Felipe C.; Netto, Chaquip D.; Resende, Jackson A. L. C.; Franco, Roberto W. A.; Colepicolo, Pio; Ferreira, Ana M. C.; Fernandes, Christiane

    2018-01-01

    The syntheses, physico-chemical characterization and cytotoxicity toward three human cell lines (standard and resistant sarcoma cells, and fibroblast) of a new copper(II) complex [Cu(HBPA)(L1)Cl]·3H2O (2) are reported. Complex 2 was obtained through the reaction between the ligand stilbene-quinone (HL1) and Cu[HBPA]Cl2 (1), where HBPA = 2-hydroxybenzyl-2pyridylmethylamine. The synthesis of HL1 was performed in high yield through a Heck reaction on PEG-400. X-ray diffraction and solution studies (UV-Vis, EPR, ESI(+)-MS and ESI(+)-MS/MS) were performed for complex 2, in which the copper(II) center is coordinated to the quinone in its deprotonated form, to the ligand HBPA and to a chloro ligand. A similar reaction employing CuCl2·2H2O instead of Cu[HBPA]Cl2 (1) and HL1 yielded a furano-o-naphthoquinone (L2) with 99% selectivity, suggesting a new methodology to cyclize the ligand HL1. In order to obtain the analogous para-isomer (L3), and to evaluate the influence of isomerism on cytotoxic activity, a cyclization reaction of HL1 with NBS (N-bromosuccinimide) was also performed, which yielded L2 (8%) and L3 (13%). X-ray diffraction studies were performed for L2 and complex 2, and their structures elucidated. Results from the MTT assay revealed that complex 2 is more active against sarcoma cell lines (MES-SA/Dx5 and MES-SA) than both the free ligand HL1 and complex 1, reducing cell viability to less than 50 μmol L-1. L2 was the most active in the series, presenting cytotoxicity against resistant MES-SA/Dx5 and its standard MES-SA cell line, respectively, three and ten times higher than the current drug doxorubicin.

  6. Multi-threading performance of Geant4, MCNP6, and PHITS Monte Carlo codes for tetrahedral-mesh geometry.

    PubMed

    Han, Min Cheol; Yeom, Yeon Soo; Lee, Hyun Su; Shin, Bangho; Kim, Chan Hyeong; Furuta, Takuya

    2018-05-04

    In this study, the multi-threading performance of the Geant4, MCNP6, and PHITS codes was evaluated as a function of the number of threads (N) and the complexity of the tetrahedral-mesh phantom. For this, three tetrahedral-mesh phantoms of varying complexity (simple, moderately complex, and highly complex) were prepared and implemented in the three different Monte Carlo codes, in photon and neutron transport simulations. Subsequently, for each case, the initialization time, calculation time, and memory usage were measured as a function of the number of threads used in the simulation. It was found that for all codes, the initialization time significantly increased with the complexity of the phantom, but not with the number of threads. Geant4 exhibited much longer initialization time than the other codes, especially for the complex phantom (MRCP). The improvement of computation speed due to the use of a multi-threaded code was calculated as the speed-up factor, the ratio of the computation speed on a multi-threaded code to the computation speed on a single-threaded code. Geant4 showed the best multi-threading performance among the codes considered in this study, with the speed-up factor almost linearly increasing with the number of threads, reaching ~30 when N = 40. PHITS and MCNP6 showed a much smaller increase of the speed-up factor with the number of threads. For PHITS, the speed-up factors were low when N = 40. For MCNP6, the increase of the speed-up factors was better, but they were still less than ~10 when N = 40. As for memory usage, Geant4 was found to use more memory than the other codes. In addition, compared to that of the other codes, the memory usage of Geant4 more rapidly increased with the number of threads, reaching as high as ~74 GB when N = 40 for the complex phantom (MRCP). It is notable that compared to that of the other codes, the memory usage of PHITS was much lower, regardless of both the complexity of the phantom and the number of threads, hardly increasing with the number of threads for the MRCP.
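
    The speed-up factor defined above is simply the single-thread calculation time divided by the N-thread calculation time. A minimal sketch, with placeholder timings rather than the measured values, is:

        # Minimal sketch of the speed-up factor and parallel efficiency; the timings
        # are hypothetical placeholders, not measurements from the study.
        single_thread_time_s = 3600.0                            # wall-clock time, 1 thread
        multi_thread_time_s = {8: 470.0, 20: 195.0, 40: 121.0}   # hypothetical times for N threads

        for n, t in sorted(multi_thread_time_s.items()):
            speed_up = single_thread_time_s / t       # ratio of computation speeds
            efficiency = speed_up / n                 # fraction of ideal linear scaling
            print(f"N = {n:2d}: speed-up = {speed_up:4.1f}, efficiency = {efficiency:.0%}")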

  7. Single-photon test of hyper-complex quantum theories using a metamaterial.

    PubMed

    Procopio, Lorenzo M; Rozema, Lee A; Wong, Zi Jing; Hamel, Deny R; O'Brien, Kevin; Zhang, Xiang; Dakić, Borivoje; Walther, Philip

    2017-04-21

    In standard quantum mechanics, complex numbers are used to describe the wavefunction. Although this has so far proven sufficient to predict experimental results, there is no theoretical reason to choose them over real numbers or generalizations of complex numbers, that is, hyper-complex numbers. Experiments performed to date have proven that real numbers are insufficient, but the need for hyper-complex numbers remains an open question. Here we experimentally probe hyper-complex quantum theories, studying one of their deviations from complex quantum theory: the non-commutativity of phases. We do so by passing single photons through a Sagnac interferometer containing both a metamaterial with a negative refractive index, and a positive phase shifter. To accomplish this we engineered a fishnet metamaterial to have a negative refractive index at 780 nm. We show that the metamaterial phase commutes with other phases with high precision, allowing us to place limits on a particular prediction of hyper-complex quantum theories.
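
    As a worked statement of the property being tested, written here in the usual textbook form rather than taken from the paper's formalism: in standard complex quantum mechanics the phases acquired in the two interferometer arms commute,

        e^{i\varphi_A} \, e^{i\varphi_B} = e^{i(\varphi_A + \varphi_B)} = e^{i\varphi_B} \, e^{i\varphi_A},

    whereas for hyper-complex (for example quaternionic) phases one has, in general,

        e^{q_A} \, e^{q_B} \neq e^{q_B} \, e^{q_A},

    so any measurable dependence of the interference on the ordering of the metamaterial phase and the positive phase shifter would signal a departure from complex quantum theory.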

  8. Complement activating properties of complexes containing rheumatoid factor in synovial fluids and sera from patients with rheumatoid arthritis.

    PubMed Central

    Elson, C J; Carter, S D; Cottrell, B J; Scott, D G; Bacon, P A; Wallington, T B

    1985-01-01

    The relationship between complexes containing rheumatoid factor and complexes activating complement was examined in synovial fluids and sera from patients with rheumatoid arthritis (RA). In each case this was performed by quantifying the amount of rheumatoid factor bound by solid phase Fab'2 anti-C3 and/or solid phase conglutinin. Both anti-C3 coated and conglutinin coated microtitre plates bound high levels of complexes containing rheumatoid factor from sera of RA patients with vasculitis. Unexpectedly, these complexes were detected in synovial fluids from only a minority of RA patients with synovitis. However, RA synovial fluids did contain other complexes as shown by the presence of complement consuming activity, C1q binding material and immunoglobulin attaching to conglutinin. It is considered that in RA synovial fluids the complexes containing RF and those activating complement are not necessarily the same whilst in vasculitic sera the complexes containing rheumatoid factor also activate complement. PMID:3978872

  9. Single-photon test of hyper-complex quantum theories using a metamaterial

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Procopio, Lorenzo M.; Rozema, Lee A.; Wong, Zi Jing

    In standard quantum mechanics, complex numbers are used to describe the wavefunction. Although this has so far proven sufficient to predict experimental results, there is no theoretical reason to choose them over real numbers or generalizations of complex numbers, that is, hyper-complex numbers. Experiments performed to date have proven that real numbers are insufficient, but the need for hyper-complex numbers remains an open question. Here we experimentally probe hyper-complex quantum theories, studying one of their deviations from complex quantum theory: the non-commutativity of phases. We do so by passing single photons through a Sagnac interferometer containing both a metamaterial with a negative refractive index, and a positive phase shifter. In order to accomplish this we engineered a fishnet metamaterial to have a negative refractive index at 780 nm. Here, we show that the metamaterial phase commutes with other phases with high precision, allowing us to place limits on a particular prediction of hyper-complex quantum theories.

  10. Single-photon test of hyper-complex quantum theories using a metamaterial

    DOE PAGES

    Procopio, Lorenzo M.; Rozema, Lee A.; Wong, Zi Jing; ...

    2017-04-21

    In standard quantum mechanics, complex numbers are used to describe the wavefunction. Although this has so far proven sufficient to predict experimental results, there is no theoretical reason to choose them over real numbers or generalizations of complex numbers, that is, hyper-complex numbers. Experiments performed to date have proven that real numbers are insufficient, but the need for hyper-complex numbers remains an open question. Here we experimentally probe hyper-complex quantum theories, studying one of their deviations from complex quantum theory: the non-commutativity of phases. We do so by passing single photons through a Sagnac interferometer containing both a metamaterial with a negative refractive index, and a positive phase shifter. In order to accomplish this we engineered a fishnet metamaterial to have a negative refractive index at 780 nm. Here, we show that the metamaterial phase commutes with other phases with high precision, allowing us to place limits on a particular prediction of hyper-complex quantum theories.

  11. Single-photon test of hyper-complex quantum theories using a metamaterial

    PubMed Central

    Procopio, Lorenzo M.; Rozema, Lee A.; Wong, Zi Jing; Hamel, Deny R.; O'Brien, Kevin; Zhang, Xiang; Dakić, Borivoje; Walther, Philip

    2017-01-01

    In standard quantum mechanics, complex numbers are used to describe the wavefunction. Although this has so far proven sufficient to predict experimental results, there is no theoretical reason to choose them over real numbers or generalizations of complex numbers, that is, hyper-complex numbers. Experiments performed to date have proven that real numbers are insufficient, but the need for hyper-complex numbers remains an open question. Here we experimentally probe hyper-complex quantum theories, studying one of their deviations from complex quantum theory: the non-commutativity of phases. We do so by passing single photons through a Sagnac interferometer containing both a metamaterial with a negative refractive index, and a positive phase shifter. To accomplish this we engineered a fishnet metamaterial to have a negative refractive index at 780 nm. We show that the metamaterial phase commutes with other phases with high precision, allowing us to place limits on a particular prediction of hyper-complex quantum theories. PMID:28429711

  12. An Adaptive Cooperative Strategy for Underlay MIMO Cognitive Radio Networks: An Opportunistic and Low-Complexity Approach

    NASA Astrophysics Data System (ADS)

    Mazoochi, M.; Pourmina, M. A.; Bakhshi, H.

    2015-03-01

    The core aim of this work is the maximization of the achievable data rate of the secondary user pairs (SU pairs), while ensuring the QoS of primary users (PUs). All users are assumed to be equipped with multiple antennas. It is assumed that when PUs are present, direct communication between SU pairs introduces intolerable interference to the PUs; the SUs therefore transmit using the cooperation of other SUs and avoid the direct channel. In brief, an adaptive cooperative strategy for multiple-input/multiple-output (MIMO) cognitive radio networks is proposed. In the presence of PUs, the issue of joint relay selection and power allocation in Underlay MIMO Cooperative Cognitive Radio Networks (U-MIMO-CCRN) is addressed. The optimal approach for determining the power allocation and the cooperating SU is proposed. In addition, the outage probability of the proposed communication protocol is further derived. Due to the high complexity of the optimal approach, a low-complexity approach is further proposed and its performance is evaluated using simulations. The simulation results reveal that the performance loss due to the low-complexity approach is only about 14%, while the complexity is greatly reduced.
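
    As a simplified, hypothetical sketch of the general idea (opportunistic relay selection with transmit power capped by the interference the primary user can tolerate), rather than the paper's joint optimization over MIMO antennas and precoders:

        # Hypothetical sketch: pick the relay that maximizes the two-hop secondary
        # rate while keeping interference at the primary user below a threshold.
        # Channel gains, powers, and the rate model are illustrative placeholders.
        import numpy as np

        rng = np.random.default_rng(0)
        num_relays = 6
        noise = 1e-3          # noise power (arbitrary units)
        p_max = 1.0           # relay transmit-power budget
        i_max = 0.05          # interference threshold at the primary user

        g_sr = rng.exponential(1.0, num_relays)   # source -> relay gains
        g_rd = rng.exponential(1.0, num_relays)   # relay -> destination gains
        g_rp = rng.exponential(0.2, num_relays)   # relay -> primary-user gains

        best_rate, best_relay = -np.inf, None
        for k in range(num_relays):
            p_k = min(p_max, i_max / g_rp[k])                  # power capped by the PU constraint
            snr = min(p_max * g_sr[k], p_k * g_rd[k]) / noise  # weaker hop limits a decode-and-forward relay
            rate = 0.5 * np.log2(1.0 + snr)                    # half pre-log factor for two hops
            if rate > best_rate:
                best_rate, best_relay = rate, k

        print(f"selected relay {best_relay}, rate ~ {best_rate:.2f} bit/s/Hz")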

  13. High-efficiency emitting materials based on phenylquinoline/carbazole-based compounds for organic light emitting diode applications

    NASA Astrophysics Data System (ADS)

    Jin, Sung-Ho

    2009-08-01

    Highly efficient light-emitting materials based on a phenylquinoline-carbazole derivative have been synthesized for organic light-emitting diodes (OLEDs). The materials form high quality amorphous thin films by thermal evaporation and the energy levels can be easily adjusted by the introduction of different electron donating and electron withdrawing groups on carbazoylphenylquinoline. Non-doped deep-blue OLEDs using Et-CVz-PhQ as the emitter show bright emission (CIE coordinates, x=0.156, y=0.093) with an external quantum efficiency of 2.45 %. Furthermore, the material works as an excellent host material for BCzVBi to obtain high-performance OLEDs with excellent deep-blue CIE coordinates (x=0.155, y=0.157), high power efficiency (5.98 lm/W), and high external quantum efficiency (5.22 %). Cyclometalated Ir(III) μ-chloride bridged dimers were synthesized from iridium trichloride hydrate with an excess of our developed deep-blue emitter, Et-CVz-PhQ. The Ir(III) complexes were prepared from the dimers with the corresponding ancillary ligands. The chloride bridged diiridium complexes can be easily converted to mononuclear Ir(III) complexes by replacing the two bridging chlorides with bidentate monoanionic ancillary ligands. Among the various types of ancillary ligands, we first used picolinic acid N-oxide, along with picolinic acid and acetylacetone, as ancillary ligands for Ir(III) complexes. The PhOLEDs also show a reasonably high brightness and good luminance efficiency of 20,000 cd/m2 and 12 cd/A, respectively.

  14. FY07 NRL DoD High Performance Computing Modernization Program Annual Reports

    DTIC Science & Technology

    2008-09-05

    performed. Implicit and explicit solution methods are used as appropriate. The primary finite element codes used are ABAQUS and ANSYS. User subroutines ...geometric complexities, loading path dependence, rate dependence, and interaction between loading types (electrical, thermal and mechanical). Work is not...are used for specialized material constitutive response. Coupled material responses, such as electrical-thermal for capacitor materials or electrical

  15. Fuel Tank Technology

    DTIC Science & Technology

    1989-11-01

    the high risk of fuel cells damaging as a consequence of the unfolding and refolding operations. - Difficulties in performing acceptance inspection tests...corners sometimes present in the structures. (See FIG. 6, 7, 8). - Additional installation costs and risk of damaging due to fuel cells anchoring...performed manually by very complex tying operations. (See FIG. 9). - Risk of damaging of the thicker reinforced zones of the flexible fuel cells where

  16. High-performance execution of psychophysical tasks with complex visual stimuli in MATLAB

    PubMed Central

    Asaad, Wael F.; Santhanam, Navaneethan; McClellan, Steven

    2013-01-01

    Behavioral, psychological, and physiological experiments often require the ability to present sensory stimuli, monitor and record subjects' responses, interface with a wide range of devices, and precisely control the timing of events within a behavioral task. Here, we describe our recent progress developing an accessible and full-featured software system for controlling such studies using the MATLAB environment. Compared with earlier reports on this software, key new features have been implemented to allow the presentation of more complex visual stimuli, increase temporal precision, and enhance user interaction. These features greatly improve the performance of the system and broaden its applicability to a wider range of possible experiments. This report describes these new features and improvements, current limitations, and quantifies the performance of the system in a real-world experimental setting. PMID:23034363

  17. 42 CFR 493.1457 - Standard; Clinical consultant responsibilities.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard; Clinical consultant responsibilities. 493... Testing Laboratories Performing High Complexity Testing § 493.1457 Standard; Clinical consultant responsibilities. The clinical consultant provides consultation regarding the appropriateness of the testing...

  18. 42 CFR 493.1485 - Standard; Cytotechnologist responsibilities.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard; Cytotechnologist responsibilities. 493.1485 Section 493.1485 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND... Testing Laboratories Performing High Complexity Testing § 493.1485 Standard; Cytotechnologist...

  19. 42 CFR 493.1485 - Standard; Cytotechnologist responsibilities.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Standard; Cytotechnologist responsibilities. 493.1485 Section 493.1485 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND... Testing Laboratories Performing High Complexity Testing § 493.1485 Standard; Cytotechnologist...

  20. Multiplexed in vivo His-tagging of enzyme pathways for in vitro single-pot multienzyme catalysis.

    PubMed

    Wang, Harris H; Huang, Po-Yi; Xu, George; Haas, Wilhelm; Marblestone, Adam; Li, Jun; Gygi, Steven P; Forster, Anthony C; Jewett, Michael C; Church, George M

    2012-02-17

    Protein pathways are dynamic and highly coordinated spatially and temporally, capable of performing a diverse range of complex chemistries and enzymatic reactions with precision and at high efficiency. Biotechnology aims to harvest these natural systems to construct more advanced in vitro reactions, capable of new chemistries and operating at high yield. Here, we present an efficient Multiplex Automated Genome Engineering (MAGE) strategy to simultaneously modify and co-purify large protein complexes and pathways from the model organism Escherichia coli to reconstitute functional synthetic proteomes in vitro. By application of over 110 MAGE cycles, we successfully inserted hexa-histidine sequences into 38 essential genes in vivo that encode for the entire translation machinery. Streamlined co-purification and reconstitution of the translation protein complex enabled protein synthesis in vitro. Our approach can be applied to a growing area of applications in in vitro one-pot multienzyme catalysis (MEC) to manipulate or enhance in vitro pathways such as natural product or carbohydrate biosynthesis.
