Sample records for receiver functions computed

  1. Administering truncated receive functions in a parallel messaging interface

    DOEpatents

    Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E

    2014-12-09

    Administering truncated receive functions in a parallel messaging interface (`PMI`) of a parallel computer comprising a plurality of compute nodes coupled for data communications through the PMI and through a data communications network, including: sending, through the PMI on a source compute node, a quantity of data from the source compute node to a destination compute node; specifying, by an application on the destination compute node, a portion of the quantity of data to be received by the application on the destination compute node and a portion of the quantity of data to be discarded; receiving, by the PMI on the destination compute node, all of the quantity of data; providing, by the PMI on the destination compute node to the application on the destination compute node, only the portion of the quantity of data to be received by the application; and discarding, by the PMI on the destination compute node, the portion of the quantity of data to be discarded.
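    The truncated-receive behavior this abstract describes can be sketched in a few lines of Python (the function and names here are hypothetical illustrations; the actual PMI is a systems-level messaging library for compute nodes, not Python):

```python
def truncated_receive(incoming: bytes, accept_len: int) -> bytes:
    """Sketch of the truncated-receive idea: the messaging layer
    receives ALL of the transmitted data, hands the application only
    the portion it asked for, and discards the remainder."""
    received = bytes(incoming)          # the messaging layer receives the full quantity of data
    delivered = received[:accept_len]   # portion the application asked to receive
    # received[accept_len:] is discarded inside the messaging layer,
    # so the application never needs a buffer sized for the full message.
    return delivered
```

    For example, `truncated_receive(b"0123456789", 4)` returns `b"0123"` and silently drops the remaining six bytes.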

  2. Auto covariance computer

    NASA Technical Reports Server (NTRS)

    Hepner, T. E.; Meyers, J. F. (Inventor)

    1985-01-01

    A laser velocimeter covariance processor which calculates the auto covariance and cross covariance functions for a turbulent flow field based on Poisson sampled measurements in time from a laser velocimeter is described. The device will process a block of data that is up to 4096 data points in length and return a 512 point covariance function with 48-bit resolution along with a 512 point histogram of the interarrival times which is used to normalize the covariance function. The device is designed to interface and be controlled by a minicomputer from which the data is received and the results returned. A typical 4096 point computation takes approximately 1.5 seconds to receive the data, compute the covariance function, and return the results to the computer.
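    The normalization the abstract describes, dividing a lag-binned covariance accumulator by a histogram of interarrival times, is the standard "slotted" estimator for randomly (Poisson) sampled data. A minimal NumPy sketch with illustrative names (the hardware device computes this in 48-bit fixed point, not floating point):

```python
import numpy as np

def slotted_autocovariance(t, v, dt, n_lags):
    """Slotted autocovariance for randomly sampled velocity data
    (t sorted ascending): accumulate products of fluctuation pairs
    into lag bins of width dt, then normalize each bin by its pair
    count -- the histogram of interarrival times."""
    t, v = np.asarray(t), np.asarray(v)
    u = v - v.mean()                    # velocity fluctuations about the mean
    acc = np.zeros(n_lags)              # covariance accumulator per lag bin
    hist = np.zeros(n_lags)             # pair-count histogram per lag bin
    for i in range(len(t)):
        lags = t[i:] - t[i]             # all forward time separations
        bins = (lags / dt).astype(int)
        ok = bins < n_lags
        np.add.at(acc, bins[ok], u[i] * u[i:][ok])
        np.add.at(hist, bins[ok], 1.0)
    # empty bins stay zero instead of dividing by zero
    return np.divide(acc, hist, out=np.zeros(n_lags), where=hist > 0)
```

    A constant velocity record yields an identically zero autocovariance, since the fluctuations vanish.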

  3. Network Coding for Function Computation

    ERIC Educational Resources Information Center

    Appuswamy, Rathinakumar

    2011-01-01

    In this dissertation, the following "network computing problem" is considered. Source nodes in a directed acyclic network generate independent messages and a single receiver node computes a target function f of the messages. The objective is to maximize the average number of times f can be computed per network usage, i.e., the "computing…

  4. A micro-computer based system to compute magnetic variation

    NASA Technical Reports Server (NTRS)

    Kaul, R.

    1984-01-01

A mathematical model of magnetic variation in the continental United States (COT48) was implemented in the Ohio University LORAN C receiver. The model is based on a least squares fit of a polynomial function. The implementation on the microprocessor-based LORAN C receiver is possible with the help of a math chip, the Am9511, which performs 32-bit floating point mathematical operations. A Peripheral Interface Adapter (M6520) is used to communicate between the 6502-based microcomputer and the 9511 math chip. The implementation provides magnetic variation data to the pilot as a function of latitude and longitude. The model and the real-time implementation in the receiver are described.
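    The least-squares polynomial idea can be illustrated with a small NumPy sketch. The degree and term set below are illustrative assumptions; the abstract does not give the actual COT48 polynomial:

```python
import numpy as np

def fit_variation(lat, lon, var):
    """Least-squares fit of magnetic variation as a low-order bivariate
    polynomial in latitude and longitude (illustrative degree-2 terms)."""
    A = np.column_stack([np.ones_like(lat), lat, lon,
                         lat * lon, lat**2, lon**2])
    coeffs, *_ = np.linalg.lstsq(A, var, rcond=None)
    return coeffs

def eval_variation(c, lat, lon):
    """Evaluate the fitted model at a given latitude/longitude."""
    return (c[0] + c[1] * lat + c[2] * lon
            + c[3] * lat * lon + c[4] * lat**2 + c[5] * lon**2)
```

    The receiver would store only the fitted coefficients and evaluate the polynomial at the current position, which is why a floating-point math chip suffices.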

  5. Anisotropic structure of the mantle wedge beneath the Ryukyu arc from teleseismic receiver function analysis

    NASA Astrophysics Data System (ADS)

    McCormack, K. A.; Wirth, E. A.; Long, M. D.

    2011-12-01

The recycling of oceanic plates back into the mantle through subduction is an important process taking place within our planet. However, many fundamental aspects of subduction systems, such as the dynamics of mantle flow, have yet to be completely understood. Subducting slabs transport water down into the mantle, but how and where that water is released, as well as how it affects mantle flow, is still an open question. In this study, we focus on the Ryukyu subduction zone in southwestern Japan and use anisotropic receiver function analysis to characterize the structure of the mantle wedge. We compute radial and transverse P-to-S receiver functions for eight stations of the broadband F-net array using a multitaper receiver function estimator. We observe coherent P-to-SV converted energy in the radial receiver functions at ~6 sec for most of the stations analyzed, consistent with conversions originating at the top of the slab. We also observe conversions on the transverse receiver functions that are consistent with the presence of multiple anisotropic and/or dipping layers. The character of the transverse receiver functions varies significantly along strike, with the northernmost three stations exhibiting markedly different behavior than stations located in the center of the Ryukyu arc. We compute synthetic receiver functions using a forward modeling scheme that can handle dipping interfaces and anisotropic layers to create models for the depths, thicknesses, and strengths of anisotropic layers in the mantle wedge beneath Ryukyu.

  6. Bragg-cell receiver study

    NASA Technical Reports Server (NTRS)

    Wilson, Lonnie A.

    1987-01-01

Bragg-cell receivers are employed in specialized Electronic Warfare (EW) applications for the measurement of frequency. Bragg-cell receiver characteristics are fully characterized for simple RF emitter signals. This receiver is early in its development cycle compared with the IFM receiver. Functional mathematical models for the Bragg-cell receiver are derived and presented in this report. Theoretical analysis and digital computer signal-processing results are presented for the Bragg-cell receiver. Probability density function analyses are performed on the output frequency, and the distributions are observed to depart from the assumed distributions for wideband and complex RF signals. This analysis is significant for high-resolution, fine-grain EW Bragg-cell receiver systems.

  7. A Functional Analytic Approach to Computer-Interactive Mathematics

    ERIC Educational Resources Information Center

    Ninness, Chris; Rumph, Robin; McCuller, Glen; Harrison, Carol; Ford, Angela M.; Ninness, Sharon K.

    2005-01-01

    Following a pretest, 11 participants who were naive with regard to various algebraic and trigonometric transformations received an introductory lecture regarding the fundamentals of the rectangular coordinate system. Following the lecture, they took part in a computer-interactive matching-to-sample procedure in which they received training on…

  8. System and methods for determining masking signals for applying empirical mode decomposition (EMD) and for demodulating intrinsic mode functions obtained from application of EMD

    DOEpatents

Senroy, Nilanjan [New Delhi, IN]; Suryanarayanan, Siddharth [Littleton, CO]

    2011-03-15

A computer-implemented method of signal processing is provided. The method includes generating one or more masking signals based upon a computed Fourier transform of a received signal. The method further includes determining one or more intrinsic mode functions (IMFs) of the received signal by performing a masking-signal-based empirical mode decomposition (EMD) using the one or more masking signals.

  9. A micro-computer-based system to compute magnetic variation

    NASA Technical Reports Server (NTRS)

    Kaul, Rajan

    1987-01-01

A mathematical model of magnetic variation in the continental United States was implemented in the Ohio University Loran-C receiver. The model is based on a least squares fit of a polynomial function. The implementation on the microprocessor-based Loran-C receiver is possible with the help of a math chip which performs 32-bit floating point mathematical operations. A Peripheral Interface Adapter is used to communicate between the 6502-based microcomputer and the 9511 math chip. The implementation provides magnetic variation data to the pilot as a function of latitude and longitude. The model and the real-time implementation in the receiver are described.

  10. Engineering studies related to Skylab program. [assessment of automatic gain control data

    NASA Technical Reports Server (NTRS)

    Hayne, G. S.

    1973-01-01

The relationship between the S-193 Automatic Gain Control data and the magnitude of received signal power was studied in order to characterize performance parameters for Skylab equipment. The r-factor was used for the assessment and is defined to be less than unity, and a function of off-nadir angle, ocean surface roughness, and receiver signal-to-noise ratio. A digital computer simulation was also used to assess the effects of additive receiver noise, or white noise. The system model for the digital simulation is described, along with the intermediate frequency and video impulse response functions used, details of the input waveforms, and results to date. Specific discussion of the digital computer programs used is also provided.

  11. Pelvic Floor Dyssynergia

    MedlinePlus

... It is a painless process that uses a computer and a video monitor to display bodily functions ... or as line graphs we can see on a computer screen. In this way, we receive information (feedback) ...

  12. Use of inpatient continuous passive motion versus no CPM in computer-assisted total knee arthroplasty.

    PubMed

    Alkire, Martha R; Swank, Michael L

    2010-01-01

Continuous passive motion (CPM) has shown positive effects on tissue healing, edema, hemarthrosis, and joint function (L. Brosseau et al., 2004). CPM has also been shown to increase short-term early flexion and decrease length of stay (LOS) (L. Brosseau et al., 2004; C. M. Chiarello, C. M. S. Gundersen, & T. O'Halloran, 2004). The benefits of CPM for the population of patients undergoing computer-assisted total knee arthroplasty (TKA) have not been examined. The primary objective of this study was to determine whether the use of CPM following computer-assisted TKA resulted in differences in range of motion, edema/drainage, functional ability, and pain. This was an experimental, prospective, randomized study of patients undergoing unilateral, computer-assisted TKA. The experimental group received CPM thrice daily and physical therapy (PT) twice daily during their hospitalization. The control group received PT twice daily and no CPM during the hospital stay. Both groups received PT after discharge. Measurement included Knee Society scores, Western Ontario McMaster Osteoarthritis Index values, range of motion, knee circumference, and HemoVac drainage. Data were collected at various intervals from preoperatively through 3 months. Although the control group was found to be higher functioning preoperatively, there was no statistically significant difference in flexion, edema or drainage, function, or pain between groups through the 3-month study period.

  13. Verifying the error bound of numerical computation implemented in computer systems

    DOEpatents

    Sawada, Jun

    2013-03-12

A verification tool receives a finite precision definition for an approximation of an infinite precision numerical function implemented in a processor in the form of a polynomial of bounded functions. The verification tool receives a domain for verifying outputs of segments associated with the infinite precision numerical function. The verification tool splits the domain into at least two segments, wherein each segment is non-overlapping with any other segment, and converts, for each segment, the polynomial of bounded functions to a simplified formula comprising a polynomial, an inequality, and a constant for the selected segment. The verification tool calculates upper bounds of the polynomial for the at least two segments, beginning with the selected segment, and reports the segments that violate a bounding condition.
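    The split-and-bound strategy can be sketched as a toy Python example. It assumes a nonnegative domain, so each monomial is monotone and endpoint evaluation gives a valid upper bound; the patent's actual formula manipulation is more involved:

```python
def poly_upper_bound(coeffs, lo, hi):
    """Upper bound of sum(c_k * x**k) on [lo, hi], assuming lo >= 0:
    each monomial is monotone there, so its maximum is at an endpoint."""
    return sum(max(c * lo**k, c * hi**k) for k, c in enumerate(coeffs))

def verify_error_bound(coeffs, domain, eps, n_segments):
    """Split the domain into non-overlapping segments, bound the error
    polynomial on each, and report the segments violating the bound eps."""
    lo, hi = domain
    width = (hi - lo) / n_segments
    violations = []
    for i in range(n_segments):
        a, b = lo + i * width, lo + (i + 1) * width
        if poly_upper_bound(coeffs, a, b) > eps:
            violations.append((a, b))
    return violations
```

    Splitting tightens the bound: for the error polynomial x**2 on [0, 1], one segment gives the bound 1 everywhere, while two segments certify that the left half already satisfies eps = 0.5.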

  14. Receive Mode Analysis and Design of Microstrip Reflectarrays

    NASA Technical Reports Server (NTRS)

    Rengarajan, Sembiam

    2011-01-01

    Traditionally microstrip or printed reflectarrays are designed using the transmit mode technique. In this method, the size of each printed element is chosen so as to provide the required value of the reflection phase such that a collimated beam results along a given direction. The reflection phase of each printed element is approximated using an infinite array model. The infinite array model is an excellent engineering approximation for a large microstrip array since the size or orientation of elements exhibits a slow spatial variation. In this model, the reflection phase from a given printed element is approximated by that of an infinite array of elements of the same size and orientation when illuminated by a local plane wave. Thus the reflection phase is a function of the size (or orientation) of the element, the elevation and azimuth angles of incidence of a local plane wave, and polarization. Typically, one computes the reflection phase of the infinite array as a function of several parameters such as size/orientation, elevation and azimuth angles of incidence, and in some cases for vertical and horizontal polarization. The design requires the selection of the size/orientation of the printed element to realize the required phase by interpolating or curve fitting all the computed data. This is a substantially complicated problem, especially in applications requiring a computationally intensive commercial code to determine the reflection phase. In dual polarization applications requiring rectangular patches, one needs to determine the reflection phase as a function of five parameters (dimensions of the rectangular patch, elevation and azimuth angles of incidence, and polarization). This is an extremely complex problem. The new method employs the reciprocity principle and reaction concept, two well-known concepts in electromagnetics to derive the receive mode analysis and design techniques. 
In the "receive mode design" technique, the reflection phase is computed for a plane wave incident on the reflectarray from the direction of the beam peak. In antenna applications with a single collimated beam, this method is extremely simple since all printed elements see the same angles of incidence. Thus the number of parameters is reduced by two when compared to the transmit mode design. The reflection phase computation as a function of five parameters in the rectangular patch array discussed previously is reduced to a computational problem with three parameters in the receive mode. Furthermore, if the beam peak is in the broadside direction, the receive mode design is polarization independent and the reflection phase computation is a function of two parameters only. For a square patch array, it is a function of the size, one parameter only, thus making it extremely simple.

  15. Modified timing module for Loran-C receiver

    NASA Technical Reports Server (NTRS)

    Lilley, R. W.

    1983-01-01

    Full hardware documentation is provided for the circuit card implementing the Loran-C timing loop, and the receiver event-mark and re-track functions. This documentation is to be combined with overall receiver drawings to form the as-built record for this device. Computer software to support this module is integrated with the remainder of the receiver software, in the LORPROM program.

  16. Estimating Effects of Multipath Propagation on GPS Signals

    NASA Technical Reports Server (NTRS)

    Byun, Sung; Hajj, George; Young, Lawrence

    2005-01-01

    Multipath Simulator Taking into Account Reflection and Diffraction (MUSTARD) is a computer program that simulates effects of multipath propagation on received Global Positioning System (GPS) signals. MUSTARD is a very efficient means of estimating multipath-induced position and phase errors as functions of time, given the positions and orientations of GPS satellites, the GPS receiver, and any structures near the receiver as functions of time. MUSTARD traces each signal from a GPS satellite to the receiver, accounting for all possible paths the signal can take, including all paths that include reflection and/or diffraction from surfaces of structures near the receiver and on the satellite. Reflection and diffraction are modeled by use of the geometrical theory of diffraction. The multipath signals are added to the direct signal after accounting for the gain of the receiving antenna. Then, in a simulation of a delay-lock tracking loop in the receiver, the multipath-induced range and phase errors as measured by the receiver are estimated. All of these computations are performed for both right circular polarization and left circular polarization of both the L1 (1.57542-GHz) and L2 (1.2276-GHz) GPS signals.

  17. Automatic Thermal Infrared Panoramic Imaging Sensor

    DTIC Science & Technology

    2006-11-01

hibernation, in which power supply to the server computer, the wireless network hardware, the GPS receiver, and the electronic compass/tilt sensor...prototype. At the operator’s command on the client laptop, the receiver wakeup device on the server side will switch on the ATX power supply at the...server, to resume the power supply to all the APTIS components. The embedded computer will resume all of the functions it was performing when put
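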

  18. PRROC: computing and visualizing precision-recall and receiver operating characteristic curves in R.

    PubMed

    Grau, Jan; Grosse, Ivo; Keilwagen, Jens

    2015-08-01

    Precision-recall (PR) and receiver operating characteristic (ROC) curves are valuable measures of classifier performance. Here, we present the R-package PRROC, which allows for computing and visualizing both PR and ROC curves. In contrast to available R-packages, PRROC allows for computing PR and ROC curves and areas under these curves for soft-labeled data using a continuous interpolation between the points of PR curves. In addition, PRROC provides a generic plot function for generating publication-quality graphics of PR and ROC curves. © The Author 2015. Published by Oxford University Press.
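    PRROC itself is an R package, but the core ROC computation it wraps is compact. A Python/NumPy sketch of the threshold sweep and trapezoidal area, for hard 0/1 labels (the package's continuous interpolation for soft-labeled data is not reproduced here):

```python
import numpy as np

def roc_points(scores, labels):
    """ROC curve points (FPR, TPR), sweeping the decision threshold
    down through the classifier scores."""
    order = np.argsort(-np.asarray(scores, dtype=float))
    y = np.asarray(labels)[order]
    tps = np.cumsum(y)                  # true positives at each threshold
    fps = np.cumsum(1 - y)              # false positives at each threshold
    tpr = np.concatenate(([0.0], tps / y.sum()))
    fpr = np.concatenate(([0.0], fps / (len(y) - y.sum())))
    return fpr, tpr

def auc(fpr, tpr):
    """Area under the curve by the trapezoidal rule."""
    return float(np.sum((fpr[1:] - fpr[:-1]) * (tpr[1:] + tpr[:-1]) / 2.0))
```

    A perfectly separating classifier scores an AUC of 1.0, a perfectly inverted one 0.0.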

  19. Sensor sentinel computing device

    DOEpatents

    Damico, Joseph P.

    2016-08-02

    Technologies pertaining to authenticating data output by sensors in an industrial environment are described herein. A sensor sentinel computing device receives time-series data from a sensor by way of a wireline connection. The sensor sentinel computing device generates a validation signal that is a function of the time-series signal. The sensor sentinel computing device then transmits the validation signal to a programmable logic controller in the industrial environment.
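    The abstract does not say which function of the time series the validation signal is; one plausible sketch uses a keyed hash (HMAC-SHA256, an illustrative assumption, not the patented function):

```python
import hmac
import hashlib

def validation_signal(key: bytes, samples: bytes) -> bytes:
    """Derive a validation signal as a keyed function of the raw
    time-series bytes. A controller holding the same key can recompute
    the signal and reject sensor data that was altered in transit."""
    return hmac.new(key, samples, hashlib.sha256).digest()
```

    Any change to the sample bytes changes the signal, while the same key and data always reproduce it, which is the property an authenticating sentinel device needs.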

  20. P- and S-wave Receiver Function Imaging with Scattering Kernels

    NASA Astrophysics Data System (ADS)

    Hansen, S. M.; Schmandt, B.

    2017-12-01

    Full waveform inversion provides a flexible approach to the seismic parameter estimation problem and can account for the full physics of wave propagation using numeric simulations. However, this approach requires significant computational resources due to the demanding nature of solving the forward and adjoint problems. This issue is particularly acute for temporary passive-source seismic experiments (e.g. PASSCAL) that have traditionally relied on teleseismic earthquakes as sources resulting in a global scale forward problem. Various approximation strategies have been proposed to reduce the computational burden such as hybrid methods that embed a heterogeneous regional scale model in a 1D global model. In this study, we focus specifically on the problem of scattered wave imaging (migration) using both P- and S-wave receiver function data. The proposed method relies on body-wave scattering kernels that are derived from the adjoint data sensitivity kernels which are typically used for full waveform inversion. The forward problem is approximated using ray theory yielding a computationally efficient imaging algorithm that can resolve dipping and discontinuous velocity interfaces in 3D. From the imaging perspective, this approach is closely related to elastic reverse time migration. An energy stable finite-difference method is used to simulate elastic wave propagation in a 2D hypothetical subduction zone model. The resulting synthetic P- and S-wave receiver function datasets are used to validate the imaging method. The kernel images are compared with those generated by the Generalized Radon Transform (GRT) and Common Conversion Point stacking (CCP) methods. These results demonstrate the potential of the kernel imaging approach to constrain lithospheric structure in complex geologic environments with sufficiently dense recordings of teleseismic data. 
This is demonstrated using a receiver function dataset from the Central California Seismic Experiment which shows several dipping interfaces related to the tectonic assembly of this region. Figure 1. Scattering kernel examples for three receiver function phases. A) direct P-to-s (Ps), B) direct S-to-p and C) free-surface PP-to-s (PPs).

  1. A computer assisted tutorial for applications of computer spreadsheets in nursing financial management.

    PubMed

    Edwardson, S R; Pejsa, J

    1993-01-01

    A computer-based tutorial for teaching nursing financial management concepts was developed using the macro function of a commercially available spreadsheet program. The goals of the tutorial were to provide students with an experience with spreadsheets as a computer tool and to teach selected financial management concepts. Preliminary results show the tutorial was well received by students. Suggestions are made for overcoming the general lack of computer sophistication among students.

  2. Low cost omega navigation receiver

    NASA Technical Reports Server (NTRS)

    Lilley, R. W.

    1974-01-01

    The development of a low cost Omega navigation receiver is discussed. Emphasis is placed on the completion and testing of a modular, multipurpose Omega receiver which utilizes a digital memory-aided, phase-locked loop to provide phase measurement data to a variety of applications interfaces. The functional units contained in the prototype device are described. The receiver is capable of receiving and storing phase measurements for up to eight Omega signals and computes two switch-selectable lines of position, displaying this navigation data in chart-recorded form.

  3. M18. Lack of Generalization From a High-Dose, Well-Powered Randomized Controlled Trial of Working Memory-Focused Training for Schizophrenia

    PubMed Central

    Nienow, Tasha; MacDonald, Angus

    2017-01-01

    Abstract Background: Cognitive deficits contribute to the functional disability associated with schizophrenia. Cognitive training has shown promise as a method of intervention; however, there is considerable variability in the implementation of this approach. The aim of this study was to test the efficacy of a high dose of cognitive training that targeted working memory-related functions. Methods: A randomized, double blind, active placebo-controlled, clinical trial was conducted with 80 outpatients with schizophrenia (mean age 46.44 years, 25% female). Patients were randomized to either working memory-based cognitive training or a computer skills training course that taught computer applications. In both conditions, participants received an average of 3 hours of training weekly for 16 weeks. Cognitive and functional outcomes were assessed with the MATRICS Consensus Cognitive Battery, N-Back performance, 2 measures of functional capacity (UPSA and SSPA) and a measure of community functioning, the Social Functioning Scale. Results: An intent-to-treat analysis found that patients who received cognitive training demonstrated significantly greater change on a trained task (Word N-Back), F(78) = 21.69, P < .0001, and a novel version of a trained task (Picture N-Back) as compared to those in the comparison condition, F(78) = 13.59, P = .002. However, only very modest support was found for generalization of training gains. A trend for an interaction was found on the MCCB Attention Domain score, F(78) = 2.56, P = .12. Participants who received cognitive training demonstrated significantly improved performance, t(39) = 3.79, P = .001, while those in computer skills did not, t(39) = 1.07, P = .37. Conclusion: A well-powered, high-dose, working memory focused, computer-based, cognitive training protocol produced only a small effect in patients with schizophrenia. 
Results indicate the importance of measuring generalization from training tasks in cognitive remediation studies. Computer-based training was not an effective method of producing change in cognition in patients with schizophrenia.

  4. Signal-processing analysis of the MC2823 radar fuze: an addendum concerning clutter effects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jelinek, D.A.

    1978-07-01

A detailed analysis of the signal processing of the MC2823 radar fuze was published by Thompson in 1976 which enabled the computation of dud probability versus signal-to-noise ratio where the noise was receiver noise. An addendum to Thompson's work was published by Williams in 1978 that modified the weighting function used by Thompson. The analysis presented herein extends the work of Thompson to include the effects of clutter (the non-signal portion of the echo from a terrain) using the new weighting function. This extension enables computation of dud probability versus signal-to-total-noise ratio where total noise is the sum of the receiver-noise power and the clutter power.

  5. Effects of brain-computer interface-based functional electrical stimulation on brain activation in stroke patients: a pilot randomized controlled trial.

    PubMed

    Chung, EunJung; Kim, Jung-Hee; Park, Dae-Sung; Lee, Byoung-Hee

    2015-03-01

[Purpose] This study sought to determine the effects of brain-computer interface-based functional electrical stimulation (BCI-FES) on brain activation in patients with stroke. [Subjects] The subjects were randomized to a BCI-FES group (n=5) and a functional electrical stimulation (FES) group (n=5). [Methods] Patients in the BCI-FES group received ankle dorsiflexion training with FES for 30 minutes per day, 5 times, under the brain-computer interface-based program. The FES group received ankle dorsiflexion training with FES for the same amount of time. [Results] The BCI-FES group demonstrated significant differences in the frontopolar regions 1 and 2 attention indexes, and frontopolar 1 activation index. The FES group demonstrated no significant differences. There were significant differences in the frontopolar 1 region activation index between the two groups after the interventions. [Conclusion] The results of this study suggest that BCI-FES training may be more effective in stimulating brain activation than FES training alone in patients recovering from stroke.

  6. CONC/11: A computer program for calculating the performance of dish-type solar thermal collectors and power systems

    NASA Technical Reports Server (NTRS)

    Jaffe, L. D.

    1984-01-01

The CONC/11 computer program, designed for calculating the performance of dish-type solar thermal collectors and power systems, is discussed. This program is intended to aid the system or collector designer in evaluating the performance to be expected with possible design alternatives. From design or test data on the characteristics of the various subsystems, CONC/11 calculates the efficiencies of the collector and the overall power system as functions of the receiver temperature for a specified insolation. If desired, CONC/11 will also determine the receiver aperture and the receiver temperature that provide the highest efficiencies at a given insolation. The program handles both simple and compound concentrators. CONC/11 is written in Athena Extended FORTRAN (similar to FORTRAN 77) to operate primarily in an interactive mode on a Sperry 1100/81 computer. It could also be used on many small computers. A user's manual is also provided for this program.

  7. Data processing techniques used with MST radars: A review

    NASA Technical Reports Server (NTRS)

    Rastogi, P. K.

    1983-01-01

The data processing methods used in high power radar probing of the middle atmosphere are examined. The radar acts as a spatial filter on the small scale refractivity fluctuations in the medium. The characteristics of the received signals are related to the statistical properties of these fluctuations. A functional outline of the components of a radar system is given. Most computation intensive tasks are carried out by the processor. The processor computes a statistical function of the received signals, simultaneously for a large number of ranges. The slow fading of atmospheric signals is used to reduce the data input rate to the processor by coherent integration. The inherent range resolution of the radar experiments can be improved significantly with the use of pseudonoise phase codes to modulate the transmitted pulses and a corresponding decoding operation on the received signals. Commutability of the decoding and coherent integration operations is used to obtain a significant reduction in computations. The limitations of the processors are outlined. At the next level of data reduction, the measured function is parameterized by a few spectral moments that can be related to physical processes in the medium. The problems encountered in estimating the spectral moments in the presence of strong ground clutter, external interference, and noise are discussed. The graphical and statistical analysis of the inferred parameters are outlined. The requirements for special purpose processors for MST radars are discussed.
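    The coherent-integration step described in this abstract amounts to averaging successive complex echo samples per range gate; a minimal NumPy sketch:

```python
import numpy as np

def coherent_integration(samples, n):
    """Average n successive (complex) echo samples from one range gate,
    exploiting the slow fading of atmospheric signals to reduce the
    data rate into the spectral processor by a factor of n."""
    s = np.asarray(samples)
    usable = (len(s) // n) * n          # drop any ragged tail
    return s[:usable].reshape(-1, n).mean(axis=1)
```

    Because averaging is linear, it commutes with the (also linear) phase-code decoding, which is the commutability the abstract exploits to cut the computation.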

  8. Large scale propagation intermittency in the atmosphere

    NASA Astrophysics Data System (ADS)

    Mehrabi, Ali

    2000-11-01

    Long-term (several minutes to hours) amplitude variations observed in outdoor sound propagation experiments at Disneyland, California, in February 1998 are explained in terms of a time varying index of refraction. The experimentally propagated acoustic signals were received and recorded at several locations ranging from 300 meters to 2,800 meters. Meteorological data was taken as a function of altitude simultaneously with the received signal levels. There were many barriers along the path of acoustic propagation that affected the received signal levels, especially at short ranges. In a downward refraction situation, there could be a random change of amplitude in the predicted signals. A computer model based on the Fast Field Program (FFP) was used to compute the signal loss at the different receiving locations and to verify that the variations in the received signal levels can be predicted numerically. The calculations agree with experimental data with the same trend variations in average amplitude.

  9. Effects of brain-computer interface-based functional electrical stimulation on balance and gait function in patients with stroke: preliminary results

    PubMed Central

    Chung, EunJung; Park, Sang-In; Jang, Yun-Yung; Lee, Byoung-Hee

    2015-01-01

    [Purpose] The purpose of this study was to determine the effects of brain-computer interface (BCI)-based functional electrical stimulation (FES) on balance and gait function in patients with stroke. [Subjects] Subjects were randomly allocated to a BCI-FES group (n=5) and a FES group (n=5). [Methods] The BCI-FES group received ankle dorsiflexion training with FES according to a BCI-based program for 30 minutes per day for 5 days. The FES group received ankle dorsiflexion training with FES for the same duration. [Results] Following the intervention, the BCI-FES group showed significant differences in Timed Up and Go test value, cadence, and step length on the affected side. The FES group showed no significant differences after the intervention. However, there were no significant differences between the 2 groups after the intervention. [Conclusion] The results of this study suggest that BCI-based FES training is a more effective exercise for balance and gait function than FES training alone in patients with stroke. PMID:25729205

  10. Effects of brain-computer interface-based functional electrical stimulation on balance and gait function in patients with stroke: preliminary results.

    PubMed

    Chung, EunJung; Park, Sang-In; Jang, Yun-Yung; Lee, Byoung-Hee

    2015-02-01

    [Purpose] The purpose of this study was to determine the effects of brain-computer interface (BCI)-based functional electrical stimulation (FES) on balance and gait function in patients with stroke. [Subjects] Subjects were randomly allocated to a BCI-FES group (n=5) and an FES group (n=5). [Methods] The BCI-FES group received ankle dorsiflexion training with FES according to a BCI-based program for 30 minutes per day for 5 days. The FES group received ankle dorsiflexion training with FES for the same duration. [Results] Following the intervention, the BCI-FES group showed significant differences in Timed Up and Go test value, cadence, and step length on the affected side. The FES group showed no significant differences after the intervention. However, there were no significant differences between the 2 groups after the intervention. [Conclusion] The results of this study suggest that BCI-based FES training is a more effective exercise for balance and gait function than FES training alone in patients with stroke.

  11. Reduction of accounts receivable through total quality management.

    PubMed

    LaFleur, N

    1994-01-01

    On October 1, 1990, The Miriam Hospital in Providence, R.I., converted to a new computer system for patient accounting applications and on-line registration functions. The new system automated the hospital's patient accounting, registration, and medical records functions and interfaced registration with patient accounts for billing purposes.

  12. Electronic spreadsheet vs. manual payroll.

    PubMed

    Kiley, M M

    1991-01-01

    Medical groups with direct employees must employ someone or contract with a company to compute payroll, writes Michael Kiley, Ph.D., M.P.H. However, many medical groups, including small ones, own a personal or minicomputer to handle accounts receivable. Kiley explains, in detail, how this same computer and a spreadsheet program also can be used to perform payroll functions.

  13. The Crustal Structure of the Central Anatolia (Turkey) Using Receiver Functions

    NASA Astrophysics Data System (ADS)

    Yelkenci, S.; Benoit, M.; Kuleli, H.; Gurbuz, C.

    2005-12-01

    Central Anatolia lies in a transitional region between the extensional tectonics of western Anatolia and the complex transpressional tectonics of Eastern Anatolia, and has a complicated thermal and structural history. Few studies of the crustal structure of Anatolia have been performed; however, studies of the crustal structure of Eastern Anatolia showed that crustal thicknesses were thinner than previously thought. To further investigate the crustal structure in Central Anatolia, we present results from receiver function analysis using new data from broad-band instruments. The stations were equipped with 7 broadband three-component STS-2 sensors and 13 short-period three-component S-13 sensors. These stations operated for a period of one and a half months between October and November 2002, and yielded data for ~40 high-quality receiver functions. Additionally, receiver functions were computed using data from the permanent stations MALT, ISP, and ANTO. We applied the H-k stacking technique of Zhu and Kanamori (2000) to the receiver functions to obtain crustal thickness and Vp/Vs ratios. Furthermore, we applied a waveform modeling technique to investigate mid-crustal discontinuities previously imaged in the region. Our results compare well with refraction-based crustal thicknesses in overlapping areas.
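
    The H-k stacking referenced above grid-searches crustal thickness H and the Vp/Vs ratio k by stacking receiver-function amplitude at the predicted arrival times of the Ps conversion and its crustal multiples. A minimal sketch (Python; velocities, weights, and grids are illustrative values, not the authors' settings):

```python
import numpy as np

def hk_stack(rf, dt, p, vp, H_grid, k_grid, w=(0.7, 0.2, 0.1)):
    """Stack receiver-function amplitude at predicted Ps, PpPs, and
    PpSs+PsPs delay times for each (H, k) grid node."""
    stack = np.zeros((len(H_grid), len(k_grid)))
    eta_p = np.sqrt(1.0 / vp**2 - p**2)              # P vertical slowness
    for i, H in enumerate(H_grid):
        for j, k in enumerate(k_grid):
            eta_s = np.sqrt((k / vp)**2 - p**2)      # S vertical slowness (Vs = Vp/k)
            times = (H * (eta_s - eta_p),            # Ps conversion
                     H * (eta_s + eta_p),            # PpPs multiple
                     2.0 * H * eta_s)                # PpSs + PsPs multiple
            signs = (1.0, 1.0, -1.0)                 # last multiple flips polarity
            amp = 0.0
            for wt, sg, tt in zip(w, signs, times):
                idx = int(round(tt / dt))
                if 0 <= idx < len(rf):
                    amp += sg * wt * rf[idx]
            stack[i, j] = amp
    return stack
```

    The (H, k) node with the largest stack amplitude is taken as the crustal-thickness and Vp/Vs estimate.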

  14. Atmosphere Explorer control system software (version 2.0)

    NASA Technical Reports Server (NTRS)

    Mocarsky, W.; Villasenor, A.

    1973-01-01

    The Atmosphere Explorer Control System (AECS) was developed to provide automatic computer control of the Atmosphere Explorer spacecraft and experiments. The software performs several vital functions, such as issuing commands to the spacecraft and experiments, receiving and processing telemetry data, and allowing for extensive data processing by experiment analysis programs. The AECS was written for a 48K XEROX Data System Sigma 5 computer, and coexists in core with the XDS Real-time Batch Monitor (RBM) executive system. RBM is a flexible operating system designed for a real-time foreground/background environment, and hence is ideally suited for this application. Existing capabilities of RBM have been used as much as possible by AECS to minimize programming redundancy. The most important functions of the AECS are to send commands to the spacecraft and experiments, and to receive, process, and display telemetry data.

  15. Revision of an automated microseismic location algorithm for DAS - 3C geophone hybrid array

    NASA Astrophysics Data System (ADS)

    Mizuno, T.; LeCalvez, J.; Raymer, D.

    2017-12-01

    Application of distributed acoustic sensing (DAS) has been studied in several areas of seismology. One of these areas is microseismic reservoir monitoring (e.g., Molteni et al., 2017, First Break). Considering the present limitations of DAS, which include a relatively low signal-to-noise ratio (SNR) and no 3C polarization measurements, a DAS - 3C geophone hybrid array is a practical option when using a single monitoring well. Given the large volume of data from distributed sensing, microseismic event detection and location using a source-scanning type algorithm is a reasonable choice, especially for real-time monitoring. The algorithm must handle both strain rate along the borehole axis for DAS and particle velocity for 3C geophones. Only a small number of high-SNR events will be detected across the full aperture of the hybrid array; the aperture therefore must be optimized dynamically to eliminate noisy channels for the majority of events. For such a hybrid array, coalescence microseismic mapping (CMM) (Drew et al., 2005, SPE) was revised. CMM forms a likelihood function of event location and origin time. At each receiver, a time function of event-arrival likelihood is inferred from an SNR function and migrated in time and space to determine the hypocenter and origin-time likelihood. This algorithm was revised to dynamically optimize such a hybrid array by identifying receivers where a microseismic signal is likely detected and using only those receivers to compute the likelihood function. Currently, peak SNR is used to select receivers. To prevent false results due to a small aperture, a minimum-aperture threshold is employed. The algorithm then refines the location likelihood using 3C geophone polarization. We tested this algorithm on a ray-based synthetic dataset: the method of Leaney (2014, PhD thesis, UBC) is used to compute particle velocity at the receivers, and strain rate along the borehole axis is computed from the particle velocity to form the DAS microseismic synthetics. The likelihood function formed from both DAS and geophones behaves as expected, with the aperture dynamically selected depending on the SNR of the event. We conclude that this algorithm can be successfully applied to such hybrid arrays to monitor microseismic activity. A study using a recently acquired dataset is planned.
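
    The dynamic-aperture source scanning described above can be sketched as follows (Python; a 1-D constant-velocity toy with hypothetical selection rules, not the revised CMM implementation): per-receiver arrival-likelihood functions are migrated back by predicted travel times and stacked over candidate locations and origin times, using only receivers whose peak SNR passes a threshold, subject to a minimum-aperture safeguard.

```python
import numpy as np

def scan_locate(arr_like, rec_x, dt, v, x_grid, snr, snr_min=3.0, min_nrec=4):
    """Return (stacked likelihood, location, origin time) maximizing the
    migrated stack of per-receiver arrival-likelihood functions."""
    sel = np.flatnonzero(snr >= snr_min)       # dynamic aperture by peak SNR
    if len(sel) < min_nrec:
        sel = np.argsort(snr)[-min_nrec:]      # enforce a minimum aperture
    n_t = arr_like.shape[1]
    best = (-np.inf, None, None)
    for x in x_grid:
        stack = np.zeros(n_t)
        for r in sel:
            # migrate the arrival likelihood back by the predicted travel time
            shift = int(round(abs(x - rec_x[r]) / v / dt))
            if shift < n_t:
                stack[:n_t - shift] += arr_like[r, shift:]
        i0 = int(np.argmax(stack))
        if stack[i0] > best[0]:
            best = (stack[i0], x, i0 * dt)
    return best
```

    Receivers whose likelihood peaks stack coherently at the true location and origin time dominate the scan; low-SNR channels are simply excluded from the sum.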

  16. Investigating Segmentation in Cascadia: Anisotropic Crustal Structure and Mantle Wedge Serpentinization from Receiver Functions

    NASA Astrophysics Data System (ADS)

    Krueger, Hannah E.; Wirth, Erin A.

    2017-10-01

    The Cascadia subduction zone exhibits along-strike segmentation in structure, processes, and seismogenic behavior. While characterization of seismic anisotropy can constrain deformation processes at depth, the character of seismic anisotropy in Cascadia remains poorly understood. This is primarily due to a lack of seismicity in the subducting Juan de Fuca slab, which limits shear wave splitting and other seismological analyses that interrogate the fine-scale anisotropic structure of the crust and mantle wedge. We investigate lower crustal anisotropy and mantle wedge structure by computing P-to-S receiver functions at 12 broadband seismic stations along the Cascadia subduction zone. We observe P-to-SV converted energy consistent with previously estimated Moho depths. Several stations exhibit evidence of an "inverted Moho" (i.e., a downward velocity decrease across the crust-mantle boundary), indicative of a serpentinized mantle wedge. Stations with an underlying hydrated mantle wedge appear prevalent from northern Washington to central Oregon, but sparse in southern Oregon and northern California. Transverse component receiver functions are complex, suggesting anisotropic and/or dipping crustal structure. To constrain the orientation of crustal anisotropy we compute synthetic receiver functions using manual forward modeling. We determine that the lower crust shows variable orientations of anisotropy along-strike, with highly complex anisotropy in northern Cascadia, and generally NW-SE and NE-SW orientations of slow-axis anisotropy in central and southern Cascadia, respectively. The orientations of anisotropy from this work generally agree with those inferred from shear wave splitting of tremor studies at similar locations, lending confidence to this relatively new method of inferring seismic anisotropy from slow earthquakes.

  17. DRAGON score predicts functional outcomes in acute ischemic stroke patients receiving both intravenous tissue plasminogen activator and endovascular therapy.

    PubMed

    Wang, Arthur; Pednekar, Noorie; Lehrer, Rachel; Todo, Akira; Sahni, Ramandeep; Marks, Stephen; Stiefel, Michael F

    2017-01-01

    The DRAGON score, which includes clinical and computed tomographic (CT) scan parameters, predicts functional outcomes in ischemic stroke patients treated with intravenous tissue plasminogen activator (IV tPA). We assessed the utility of the DRAGON score in predicting functional outcome in stroke patients receiving both IV tPA and endovascular therapy. A retrospective chart review of patients treated at our institution from February 2009 to October 2015 was conducted. All patients with computed tomography angiography (CTA)-proven large vessel occlusions (LVO) who underwent intravenous thrombolysis and endovascular therapy were included. Baseline DRAGON scores and the modified Rankin Score (mRS) at the time of hospital discharge were calculated. Good outcome was defined as mRS ≤3. Fifty-eight patients with LVO of the anterior circulation were studied. The mean DRAGON score of patients on admission was 5.3 (range, 3-8). All patients received IV tPA and endovascular therapy. Multivariate analysis demonstrated that DRAGON scores ≥7 were associated with higher mRS (P < 0.006) and higher mortality (P < 0.0001) compared with DRAGON scores ≤6. Patients with DRAGON scores of 7 and 8 on admission had mortality rates of 3.8% and 40%, respectively. The DRAGON score can help predict functional outcomes in ischemic stroke patients receiving both IV tPA and endovascular therapy. These data support the use of the DRAGON score in selecting patients who could potentially benefit from more invasive therapies such as endovascular treatment. Larger prospective studies are warranted to further validate these results.

  18. Cellular computational platform and neurally inspired elements thereof

    DOEpatents

    Okandan, Murat

    2016-11-22

    A cellular computational platform is disclosed that includes a multiplicity of functionally identical, repeating computational hardware units that are interconnected electrically and optically. Each computational hardware unit includes a reprogrammable local memory and has interconnections to other such units that have reconfigurable weights. Each computational hardware unit is configured to transmit signals into the network for broadcast in a protocol-less manner to other such units in the network, and to respond to protocol-less broadcast messages that it receives from the network. Each computational hardware unit is further configured to reprogram the local memory in response to incoming electrical and/or optical signals.

  19. Determination of acoustical transfer functions using an impulse method

    NASA Astrophysics Data System (ADS)

    MacPherson, J.

    1985-02-01

    The Transfer Function of a system may be defined as the relationship of the output response to the input of a system. Whilst recent advances in digital processing systems have enabled Impulse Transfer Functions to be determined by computation of the Fast Fourier Transform, there has been little work done in applying these techniques to room acoustics. Acoustical Transfer Functions have been determined for auditoria, using an impulse method. The technique is based on the computation of the Fast Fourier Transform (FFT) of a non-ideal impulsive source, both at the source and at the receiver point. The Impulse Transfer Function (ITF) is obtained by dividing the FFT at the receiver position by the FFT of the source. This quantity is presented both as linear frequency scale plots and also as synthesized one-third octave band data. The technique enables a considerable quantity of data to be obtained from a small number of impulsive signals recorded in the field, thereby minimizing the time and effort required on site. As the characteristics of the source are taken into account in the calculation, the choice of impulsive source is non-critical. The digital analysis equipment required for the analysis is readily available commercially.
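
    The computation described, dividing the FFT at the receiver position by the FFT of the source, can be sketched as follows (Python; the water-level stabilization is an added assumption, since a raw spectral division blows up wherever the source spectrum has nulls):

```python
import numpy as np

def impulse_transfer_function(source, received, water_level=1e-6):
    """ITF = FFT(received) / FFT(source), with a small spectral floor
    (water level) so near-zero source components do not blow up."""
    n = len(source)
    S = np.fft.rfft(source, n)
    R = np.fft.rfft(received, n)
    floor = water_level * np.max(np.abs(S))
    S_stab = np.where(np.abs(S) < floor, floor, S)
    return R / S_stab

def impulse_response(source, received):
    """Time-domain impulse response via the inverse FFT of the ITF."""
    return np.fft.irfft(impulse_transfer_function(source, received), len(source))
```

    Because the source spectrum is divided out, the choice of impulsive source is non-critical, exactly as noted above; the same ITF can then be band-integrated into one-third octave values.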

  20. Optimal Energy Efficiency Fairness of Nodes in Wireless Powered Communication Networks.

    PubMed

    Zhang, Jing; Zhou, Qingjie; Ng, Derrick Wing Kwan; Jo, Minho

    2017-09-15

    In wireless powered communication networks (WPCNs), it is essential to study energy efficiency fairness in order to evaluate how nodes balance receiving information and harvesting energy. In this paper, we propose an efficient iterative algorithm for optimal energy efficiency proportional fairness in WPCNs. The main idea is to use stochastic geometry to derive the mean proportional-fairness utility function with respect to user association probability and receive threshold. Subsequently, we prove that the relaxed proportional-fairness utility function is concave in the user association probability and in the receive threshold, respectively. We then propose a sub-optimal algorithm exploiting an alternating optimization approach. Through numerical simulations, we demonstrate that our sub-optimal algorithm obtains a result close to optimal energy efficiency proportional fairness with a significant reduction in computational complexity.
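
    The alternating-optimization step can be sketched generically (Python; the utility below is an illustrative stand-in, not the paper's WPCN model): because the utility is concave in each variable separately, each pass maximizes over one variable while holding the other fixed.

```python
import numpy as np

def alternating_opt(U, x_grid, y_grid, iters=20):
    """Block-coordinate ascent: maximize U over x with y fixed, then over
    y with x fixed, and repeat until the iterates settle."""
    x, y = x_grid[0], y_grid[0]
    for _ in range(iters):
        x = x_grid[np.argmax([U(xx, y) for xx in x_grid])]
        y = y_grid[np.argmax([U(x, yy) for yy in y_grid])]
    return x, y, U(x, y)

def fairness_utility(a, t):
    """Illustrative proportional-fairness utility: log-sum of two users'
    rates as functions of association probability a and threshold t."""
    r1 = a * np.log1p(10.0 * t)
    r2 = (1.0 - a) * np.log1p(10.0 * (1.0 - t))
    return np.log(r1 + 1e-12) + np.log(r2 + 1e-12)
```

    Each coordinate step solves a concave one-dimensional problem, so the iterates climb monotonically toward a fair operating point.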

  1. Optimal Energy Efficiency Fairness of Nodes in Wireless Powered Communication Networks

    PubMed Central

    Zhou, Qingjie; Ng, Derrick Wing Kwan; Jo, Minho

    2017-01-01

    In wireless powered communication networks (WPCNs), it is essential to study energy efficiency fairness in order to evaluate how nodes balance receiving information and harvesting energy. In this paper, we propose an efficient iterative algorithm for optimal energy efficiency proportional fairness in WPCNs. The main idea is to use stochastic geometry to derive the mean proportional-fairness utility function with respect to user association probability and receive threshold. Subsequently, we prove that the relaxed proportional-fairness utility function is concave in the user association probability and in the receive threshold, respectively. We then propose a sub-optimal algorithm exploiting an alternating optimization approach. Through numerical simulations, we demonstrate that our sub-optimal algorithm obtains a result close to optimal energy efficiency proportional fairness with a significant reduction in computational complexity. PMID:28914818

  2. Autonomous Integrated Receive System (AIRS) requirements definition. Volume 2: Design and development

    NASA Technical Reports Server (NTRS)

    Chie, C. M.; White, M. A.; Lindsey, W. C.; Davarian, F.; Dixon, R. C.

    1984-01-01

    Functional requirements and specifications are defined for an autonomous integrated receive system (AIRS) to be used as an improvement in the current tracking and data relay satellite system (TDRSS), and as a receiving system in the future tracking and data acquisition system (TDAS). The AIRS provides improved acquisition, tracking, bit error rate (BER), RFI mitigation techniques, and data operations performance compared to the current TDRSS ground segment receive system. A computer model of the AIRS is used to provide simulation results predicting the performance of AIRS. Cost and technology assessments are included.

  3. High-Degree Neurons Feed Cortical Computations

    PubMed Central

    Timme, Nicholas M.; Ito, Shinya; Shimono, Masanori; Yeh, Fang-Chin; Litke, Alan M.; Beggs, John M.

    2016-01-01

    Recent work has shown that functional connectivity among cortical neurons is highly varied, with a small percentage of neurons having many more connections than others. Also, recent theoretical developments now make it possible to quantify how neurons modify information from the connections they receive. Therefore, it is now possible to investigate how information modification, or computation, depends on the number of connections a neuron receives (in-degree) or sends out (out-degree). To do this, we recorded the simultaneous spiking activity of hundreds of neurons in cortico-hippocampal slice cultures using a high-density 512-electrode array. This preparation and recording method combination produced large numbers of neurons recorded at temporal and spatial resolutions that are not currently available in any in vivo recording system. We utilized transfer entropy (a well-established method for detecting linear and nonlinear interactions in time series) and the partial information decomposition (a powerful, recently developed tool for dissecting multivariate information processing into distinct parts) to quantify computation between neurons where information flows converged. We found that computations did not occur equally in all neurons throughout the networks. Surprisingly, neurons that computed large amounts of information tended to receive connections from high out-degree neurons. However, the in-degree of a neuron was not related to the amount of information it computed. To gain insight into these findings, we developed a simple feedforward network model. We found that a degree-modified Hebbian wiring rule best reproduced the pattern of computation and degree correlation results seen in the real data. Interestingly, this rule also maximized signal propagation in the presence of network-wide correlations, suggesting a mechanism by which cortex could deal with common random background input. 
These are the first results to show that the extent to which a neuron modifies incoming information streams depends on its topological location in the surrounding functional network. PMID:27159884
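
    The transfer entropy at the core of this analysis can be sketched for binarized spike trains (Python; a plug-in estimator with history length 1, much simpler than the delay-resolved estimators used on the actual recordings):

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """TE(x -> y) in bits: how much the past of x reduces uncertainty about
    the next value of y beyond y's own past (history length 1)."""
    x, y = list(map(int, x)), list(map(int, y))
    trip = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_next, y_past, x_past)
    yy = Counter(zip(y[1:], y[:-1]))
    yx = Counter(zip(y[:-1], x[:-1]))
    y0c = Counter(y[:-1])
    n = len(x) - 1
    te = 0.0
    for (y1, y0, x0), c in trip.items():
        p = c / n
        p_cond_both = c / yx[(y0, x0)]           # p(y_next | y_past, x_past)
        p_cond_self = yy[(y1, y0)] / y0c[y0]     # p(y_next | y_past)
        te += p * np.log2(p_cond_both / p_cond_self)
    return te
```

    A driven series that simply copies its driver with a one-step lag transfers about one bit per step, while independent series transfer essentially none.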

  4. Space Handbook

    DTIC Science & Technology

    1970-07-01

    communications problem is confined to transfer between the vehicle and the surface and to recording the data in a form suitable for processing and analysis at a...information available for later analysis would be of greater significance. Thus, computers used to receive and record data may be used to perform preliminary...basic design of boosters and payloads to the reduction and analysis of the data after mission completion. Computers function to schedule experiments

  5. 3D receiver function Kirchhoff depth migration image of Cascadia subduction slab weak zone

    NASA Astrophysics Data System (ADS)

    Cheng, C.; Allen, R. M.; Bodin, T.; Tauzin, B.

    2016-12-01

    We have developed a computationally efficient algorithm for applying 3D Kirchhoff depth migration to teleseismic receiver function data. By combining the primary Ps arrival with later multiple arrivals, we obtain improved knowledge of the Earth's discontinuity structure (transmission and reflection). Compared with the traditional CCP method, this approach is especially useful when dipping structures, such as subducting slabs, are encountered during imaging. We apply our method to regional Cascadia subduction zone receiver function data and obtain a high-resolution 3D migration image, for both the primary arrival and the multiples. The image shows a clear slab weak zone (slab hole) in the upper plate boundary beneath Northern California and all of Oregon. Compared with previous 2D receiver function images from 2D arrays (CAFE and CASC93), the position of the weak zone shows interesting coherence. The weak zone is also coherent with an absence of local seismicity and with rising heat, which leads us to consider, and compare with, the oceanic plate structure and the hydraulic fluid processes at work during the formation and migration of the subducting slab.

  6. On algorithmic optimization of histogramming functions for GEM systems

    NASA Astrophysics Data System (ADS)

    Krawczyk, Rafał D.; Czarski, Tomasz; Kolasinski, Piotr; Poźniak, Krzysztof T.; Linczuk, Maciej; Byszuk, Adrian; Chernyshova, Maryna; Juszczyk, Bartlomiej; Kasprowicz, Grzegorz; Wojenski, Andrzej; Zabolotny, Wojciech

    2015-09-01

    This article concerns optimization methods of data analysis for the X-ray GEM detector system. The offline analysis of collected samples was optimized for MATLAB computation, with compiled C functions used via the MEX interface. Significant speedups were achieved both for the ordering/preprocessing of samples and for their histogramming. The techniques used and the results obtained are presented.

  7. JANUS: Joint Academic Network Using Satellite. Brief Description of Project. IET Papers on Broadcasting: No. 287.

    ERIC Educational Resources Information Center

    Bates, A. W.

    The JANUS (Joint Academic Network Using Satellite) satellite network is being planned to link European institutions wishing to jointly produce distance teaching materials. Earth stations with capabilities for transmit/receive functions, voice/data functions, two 64 kbs channels, and connection to local telephone exchange and computer networks will…

  8. The Precambrian crustal structure of East Africa

    NASA Astrophysics Data System (ADS)

    Young, A. J.; Tugume, F.; Nyblade, A.; Julia, J.; Mulibo, G.

    2011-12-01

    We present new results on crustal structure from East Africa from analyzing P wave receiver functions. The data for this study come from temporary AfricaArray broadband seismic stations deployed between 2007 and 2011 in Uganda, Tanzania and Zambia. Receiver functions have been computed using an iterative deconvolution method. Crustal structure has been imaged using the H-k stacking method and by jointly inverting the receiver functions and surface wave phase and group velocities. The results show remarkably uniform crust throughout the Archean and Proterozoic terrains that comprise the Precambrian tectonic framework of the region. Crustal thickness for most terrains is between 37 and 40 km, and Poisson's ratio is between 0.25 and 0.27. Results from the joint inversion yield average crustal Vs values of 3.6 to 3.7 km/s. For most terrains, a thin (1-5 km thick) high-velocity layer (Vs > 4.0 km/s) is found at the base of the crust.
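
    The iterative deconvolution used to compute the receiver functions can be sketched as follows (Python; a bare time-domain spike-fitting loop in the spirit of such methods, not the authors' production code): the receiver function is built up as a spike train by repeatedly finding the lag at which the source wavelet best matches the residual.

```python
import numpy as np

def iterative_deconvolution(numerator, denominator, n_iter=50):
    """Deconvolve (numerator = rf * denominator) by greedy spike fitting:
    at each step add a spike where the wavelet best explains the residual."""
    n = len(numerator)
    rf = np.zeros(n)
    resid = numerator.astype(float).copy()
    power = float(np.dot(denominator, denominator))
    for _ in range(n_iter):
        # cross-correlation of residual and wavelet at non-negative lags
        xc = np.correlate(resid, denominator, mode='full')[n - 1:]
        lag = int(np.argmax(np.abs(xc)))
        amp = xc[lag] / power
        rf[lag] += amp
        resid[lag:] -= amp * denominator[:n - lag]   # remove its contribution
    return rf
```

    For well-separated arrivals the loop converges in a few iterations, leaving a sparse spike train whose amplitudes and lags are the receiver function.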

  9. Method, systems, and computer program products for implementing function-parallel network firewall

    DOEpatents

    Fulp, Errin W [Winston-Salem, NC; Farley, Ryan J [Winston-Salem, NC

    2011-10-11

    Methods, systems, and computer program products for providing function-parallel firewalls are disclosed. According to one aspect, a function-parallel firewall includes a first firewall node for filtering received packets using a first portion of a rule set including a plurality of rules. The first portion includes less than all of the rules in the rule set. At least one second firewall node filters packets using a second portion of the rule set. The second portion includes at least one rule in the rule set that is not present in the first portion. The first and second portions together include all of the rules in the rule set.
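
    A minimal sketch of the scheme (Python; the rule fields and default-deny policy are illustrative assumptions, not the patent's specification): each node matches a packet against only its portion of the rule set, and the per-node winners are combined so that first-match semantics over the full ordered rule set are preserved.

```python
from dataclasses import dataclass

@dataclass
class Rule:
    priority: int   # position in the original ordered rule set
    prefix: str     # source-address prefix; '' matches anything
    action: str     # 'accept' or 'deny'

def node_filter(portion, src):
    """One firewall node: best (lowest-numbered) match within its portion."""
    hits = [r for r in portion if src.startswith(r.prefix)]
    return min(hits, key=lambda r: r.priority, default=None)

def parallel_filter(portions, src):
    """Combine per-node winners; the lowest rule number over all nodes wins."""
    winners = [m for m in (node_filter(p, src) for p in portions) if m]
    best = min(winners, key=lambda r: r.priority, default=None)
    return best.action if best else 'deny'    # assumed default-deny policy
```

    Because each node sees fewer rules, per-packet matching work is divided across nodes while every packet's fate remains identical to sequential first-match evaluation of the whole set.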

  10. A digital-receiver for the Murchison Widefield Array

    NASA Astrophysics Data System (ADS)

    Prabu, Thiagaraj; Srivani, K. S.; Roshi, D. Anish; Kamini, P. A.; Madhavi, S.; Emrich, David; Crosse, Brian; Williams, Andrew J.; Waterson, Mark; Deshpande, Avinash A.; Shankar, N. Udaya; Subrahmanyan, Ravi; Briggs, Frank H.; Goeke, Robert F.; Tingay, Steven J.; Johnston-Hollitt, Melanie; R, Gopalakrishna M.; Morgan, Edward H.; Pathikulangara, Joseph; Bunton, John D.; Hampson, Grant; Williams, Christopher; Ord, Stephen M.; Wayth, Randall B.; Kumar, Deepak; Morales, Miguel F.; deSouza, Ludi; Kratzenberg, Eric; Pallot, D.; McWhirter, Russell; Hazelton, Bryna J.; Arcus, Wayne; Barnes, David G.; Bernardi, Gianni; Booler, T.; Bowman, Judd D.; Cappallo, Roger J.; Corey, Brian E.; Greenhill, Lincoln J.; Herne, David; Hewitt, Jacqueline N.; Kaplan, David L.; Kasper, Justin C.; Kincaid, Barton B.; Koenig, Ronald; Lonsdale, Colin J.; Lynch, Mervyn J.; Mitchell, Daniel A.; Oberoi, Divya; Remillard, Ronald A.; Rogers, Alan E.; Salah, Joseph E.; Sault, Robert J.; Stevens, Jamie B.; Tremblay, S.; Webster, Rachel L.; Whitney, Alan R.; Wyithe, Stuart B.

    2015-03-01

    An FPGA-based digital-receiver has been developed for a low-frequency imaging radio interferometer, the Murchison Widefield Array (MWA). The MWA, located at the Murchison Radio-astronomy Observatory (MRO) in Western Australia, consists of 128 dual-polarized aperture-array elements (tiles) operating between 80 and 300 MHz, with a total processed bandwidth of 30.72 MHz for each polarization. Radio-frequency signals from the tiles are amplified and band limited using analog signal conditioning units, then sampled and channelized by the digital-receivers. The signals from eight tiles are processed by a single digital-receiver, thus requiring 16 digital-receivers for the MWA. The main function of the digital-receivers is to digitize the broad-band signals from each tile, channelize them to form the sky-band, and transport it through optical fibers to a centrally located correlator for further processing. The digital-receiver firmware also implements functions to measure the signal power, perform power equalization across the band, detect interference-like events, and invoke diagnostic modes. The digital-receiver is controlled by high-level programs running on a single-board-computer. This paper presents the digital-receiver design, implementation, current status, and plans for future enhancements.

  11. Finite-frequency tomography using adjoint methods-Methodology and examples using membrane surface waves

    NASA Astrophysics Data System (ADS)

    Tape, Carl; Liu, Qinya; Tromp, Jeroen

    2007-03-01

    We employ adjoint methods in a series of synthetic seismic tomography experiments to recover surface wave phase-speed models of southern California. Our approach involves computing the Fréchet derivative for tomographic inversions via the interaction between a forward wavefield, propagating from the source to the receivers, and an `adjoint' wavefield, propagating from the receivers back to the source. The forward wavefield is computed using a 2-D spectral-element method (SEM) and a phase-speed model for southern California. A `target' phase-speed model is used to generate the `data' at the receivers. We specify an objective or misfit function that defines a measure of misfit between data and synthetics. For a given receiver, the remaining differences between data and synthetics are time-reversed and used as the source of the adjoint wavefield. For each earthquake, the interaction between the regular and adjoint wavefields is used to construct finite-frequency sensitivity kernels, which we call event kernels. An event kernel may be thought of as a weighted sum of phase-specific (e.g. P) banana-doughnut kernels, with weights determined by the measurements. The overall sensitivity is simply the sum of event kernels, which defines the misfit kernel. The misfit kernel is multiplied by convenient orthonormal basis functions that are embedded in the SEM code, resulting in the gradient of the misfit function, that is, the Fréchet derivative. A non-linear conjugate gradient algorithm is used to iteratively improve the model while reducing the misfit function. We illustrate the construction of the gradient and the minimization algorithm, and consider various tomographic experiments, including source inversions, structural inversions and joint source-structure inversions. Finally, we draw connections between classical Hessian-based tomography and gradient-based adjoint tomography.
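
    For a linear forward operator the adjoint recipe above collapses to something compact: the gradient of the least-squares misfit is the transpose (adjoint) of the forward operator applied to the data residual, the discrete analogue of propagating time-reversed residuals from the receivers back toward the source. A toy sketch (Python; plain steepest descent stands in for the paper's conjugate-gradient scheme, and the operator is an assumed random matrix rather than a wave simulator):

```python
import numpy as np

def adjoint_gradient_descent(G, d, m0, step, iters=500):
    """Minimize F(m) = 0.5 * ||G m - d||^2 using the adjoint-derived
    gradient g = G^T (G m - d)."""
    m = m0.copy()
    for _ in range(iters):
        resid = G @ m - d        # residual at the "receivers"
        g = G.T @ resid          # adjoint back-projection = gradient of F
        m -= step * g            # model update
    return m

rng = np.random.default_rng(1)
G = rng.standard_normal((60, 20))     # toy linear forward operator
m_target = rng.standard_normal(20)    # "target" model
d = G @ m_target                      # noise-free synthetic "data"
step = 1.0 / np.linalg.norm(G, 2) ** 2
m_rec = adjoint_gradient_descent(G, d, np.zeros(20), step)
```

    In full-waveform adjoint tomography the matrix-vector products are replaced by forward and adjoint wavefield simulations, but the update logic is the same.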

  12. A brain-computer interface controlled mail client.

    PubMed

    Yu, Tianyou; Li, Yuanqing; Long, Jinyi; Wang, Cong

    2013-01-01

    In this paper, we propose a brain-computer interface (BCI) based mail client. This system is controlled by hybrid features extracted from scalp-recorded electroencephalographic (EEG) signals. We emulate the computer mouse by the motor imagery-based mu rhythm and the P300 potential. Furthermore, an adaptive P300 speller is included to provide a text input function. With this BCI mail client, users can receive, read, and write mails, as well as attach files when writing mail. The system has been tested on 3 subjects. Experimental results show that mail communication with this system is feasible.

  13. Neural and computational processes underlying dynamic changes in self-esteem

    PubMed Central

    Rutledge, Robb B; Moutoussis, Michael; Dolan, Raymond J

    2017-01-01

    Self-esteem is shaped by the appraisals we receive from others. Here, we characterize neural and computational mechanisms underlying this form of social influence. We introduce a computational model that captures fluctuations in self-esteem engendered by prediction errors that quantify the difference between expected and received social feedback. Using functional MRI, we show these social prediction errors correlate with activity in ventral striatum/subgenual anterior cingulate cortex, while updates in self-esteem resulting from these errors co-varied with activity in ventromedial prefrontal cortex (vmPFC). We linked computational parameters to psychiatric symptoms using canonical correlation analysis to identify an ‘interpersonal vulnerability’ dimension. Vulnerability modulated the expression of prediction error responses in anterior insula and insula-vmPFC connectivity during self-esteem updates. Our findings indicate that updating of self-evaluative beliefs relies on learning mechanisms akin to those used in learning about others. Enhanced insula-vmPFC connectivity during updating of those beliefs may represent a marker for psychiatric vulnerability. PMID:29061228
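
    The class of model described can be sketched as a delta-rule learner (Python; parameter names and values are illustrative, not the paper's fitted parameterization): expected feedback is updated by each social prediction error, and momentary self-esteem tracks a decaying average of recent errors.

```python
def simulate_self_esteem(feedback, lr=0.3, decay=0.9, weight=0.5):
    """feedback: sequence of received social approval values in [0, 1]."""
    expectation = 0.5    # prior expectation of social feedback (assumed)
    esteem = 0.0
    trace = []
    for f in feedback:
        pe = f - expectation            # social prediction error
        expectation += lr * pe          # delta-rule update of expectation
        esteem = decay * esteem + weight * pe
        trace.append(esteem)
    return trace
```

    A run of uniformly positive feedback first boosts self-esteem, which then habituates as expectations catch up, so the same feedback eventually produces near-zero prediction errors.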

  14. Neural and computational processes underlying dynamic changes in self-esteem.

    PubMed

    Will, Geert-Jan; Rutledge, Robb B; Moutoussis, Michael; Dolan, Raymond J

    2017-10-24

    Self-esteem is shaped by the appraisals we receive from others. Here, we characterize neural and computational mechanisms underlying this form of social influence. We introduce a computational model that captures fluctuations in self-esteem engendered by prediction errors that quantify the difference between expected and received social feedback. Using functional MRI, we show these social prediction errors correlate with activity in ventral striatum/subgenual anterior cingulate cortex, while updates in self-esteem resulting from these errors co-varied with activity in ventromedial prefrontal cortex (vmPFC). We linked computational parameters to psychiatric symptoms using canonical correlation analysis to identify an 'interpersonal vulnerability' dimension. Vulnerability modulated the expression of prediction error responses in anterior insula and insula-vmPFC connectivity during self-esteem updates. Our findings indicate that updating of self-evaluative beliefs relies on learning mechanisms akin to those used in learning about others. Enhanced insula-vmPFC connectivity during updating of those beliefs may represent a marker for psychiatric vulnerability.

  15. Fast-Acquisition/Weak-Signal-Tracking GPS Receiver for HEO

    NASA Technical Reports Server (NTRS)

    Winternitz, Luke; Boegner, Greg; Sirotzky, Steve

    2004-01-01

    A report discusses the technical background and design of the Navigator Global Positioning System (GPS) receiver, a radiation-hardened receiver intended for use aboard spacecraft. Navigator is capable of weak-signal acquisition and tracking as well as much faster acquisition of strong or weak signals with no a priori knowledge or external aiding. Weak-signal acquisition and tracking enables GPS use in high Earth orbits (HEO), and fast acquisition allows the receiver to remain unpowered until needed in any orbit. Signal acquisition and signal tracking are, respectively, the processes of finding and demodulating a signal. Acquisition is the more computationally difficult process. Previous GPS receivers employ the method of sequentially searching the two-dimensional signal parameter space (code phase and Doppler). Navigator exploits properties of the Fourier transform in a massively parallel search for the GPS signal. This method results in far faster acquisition times [in the lab, 12 GPS satellites have been acquired with no a priori knowledge in a low-Earth-orbit (LEO) scenario in less than one second]. Modeling has shown that Navigator will be capable of acquiring signals down to 25 dB-Hz, appropriate for HEO missions. Navigator is built around the radiation-hardened ColdFire microprocessor, with the most computationally intense functions housed in dedicated field-programmable gate arrays. The high performance of the algorithm and of the receiver as a whole is made possible by optimizing computational efficiency and carefully weighing tradeoffs among the sampling rate, data format, and data-path bit width.
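Navigator's FFT-based parallel acquisition can be illustrated with a short sketch: circular correlation over all code phases at once, per Doppler bin. A random ±1 sequence stands in for a real C/A code, and every parameter below is chosen for the demo rather than taken from Navigator.

```python
import numpy as np

def acquire(rx, code, doppler_bins, fs):
    """Search all code phases at once per Doppler bin via FFT circular correlation."""
    n = len(code)
    t = np.arange(n) / fs
    code_fft = np.conj(np.fft.fft(code))
    best = (0.0, None, None)
    for fd in doppler_bins:
        wiped = rx * np.exp(-2j * np.pi * fd * t)          # Doppler wipe-off
        corr = np.abs(np.fft.ifft(np.fft.fft(wiped) * code_fft))
        k = int(np.argmax(corr))
        if corr[k] > best[0]:
            best = (corr[k], k, fd)
    return best[1], best[2]                                 # code-phase sample, Doppler

# demo: synthesize a received signal with a known code shift and Doppler
rng = np.random.default_rng(0)
n, fs = 1024, 1.024e6                                       # 1 ms of samples
code = rng.choice([-1.0, 1.0], n)                           # stand-in for a C/A code
true_shift, true_fd = 300, 2000.0
t = np.arange(n) / fs
rx = np.roll(code, true_shift) * np.exp(2j * np.pi * true_fd * t)
shift, fd = acquire(rx, code, np.arange(-5000.0, 5001.0, 1000.0), fs)
```

The peak of the correlation surface gives code phase and Doppler jointly; a sequential receiver would instead test each code phase separately, which is why the FFT approach is so much faster.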

  16. X-wing fly-by-wire vehicle management system

    NASA Technical Reports Server (NTRS)

    Fischer, Jr., William C. (Inventor)

    1990-01-01

    A complete, computer-based vehicle management system (VMS) for X-Wing aircraft using digital fly-by-wire technology, controlling many subsystems and providing functions beyond the classical aircraft flight control system. The vehicle management system receives input signals from a multiplicity of sensors and provides commands to a large number of actuators controlling many subsystems. The VMS includes: segregating flight-critical and mission-critical factors and providing a greater level of back-up or redundancy for the former; centralizing the computation of functions utilized by several subsystems (e.g., air data, rotor speed, etc.); and integrating the control of the flight control functions, the compressor control, the rotor conversion control, vibration alleviation by higher harmonic control, engine power anticipation, and self-test, all in the same flight control computer (FCC) hardware units. The VMS uses equivalent redundancy techniques to attain quadruple equivalency levels; includes alternate modes of operation and recovery means to back up any functions which fail; and uses back-up control software for software redundancy.

  17. A computationally efficient technique to model depth, orientation and alignment via ray tracing in acoustic power transfer systems

    NASA Astrophysics Data System (ADS)

    Christensen, David B.; Basaeri, Hamid; Roundy, Shad

    2017-12-01

    In acoustic power transfer systems, a receiver is displaced from a transmitter by an axial depth, a lateral offset (alignment), and a rotation angle (orientation). In systems where the receiver's position is not fixed, such as a receiver implanted in biological tissue, slight variations in depth, orientation, or alignment can cause significant variations in the received voltage and power. To address this concern, this paper presents a computationally efficient technique to model the effects of depth, orientation, and alignment via ray tracing (DOART) on received voltage and power in acoustic power transfer systems. DOART combines transducer equivalent-circuit models, a modified version of Huygens' principle, and ray tracing to simulate pressure wave propagation and reflection between a transmitter and a receiver in a homogeneous medium. A reflected grid method is introduced to calculate propagation distances, reflection coefficients, and initial vectors between a point on the transmitter and a point on the receiver for an arbitrary number of reflections. DOART convergence and simulation time per data point are discussed as a function of the number of reflections and elements chosen. Finally, experimental data are compared to DOART simulation data in terms of magnitude and shape of the received voltage signal.
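The reflected grid method is described above only at a high level, but the underlying idea of enumerating reflection paths can be sketched with the classic image-source construction for an aligned two-plane geometry. This is a simplification of DOART, not DOART itself; each round trip between the transducer planes adds 2·d of axial travel and two bounces, and all names and numbers are illustrative.

```python
import math

def echo_paths(d, offset, R, n_max):
    """Image-source sketch: path length and relative amplitude of each echo
    between two parallel transducer planes separated by depth d."""
    paths = []
    for k in range(n_max + 1):
        L = math.hypot((2 * k + 1) * d, offset)   # path after 2k reflections
        amp = R ** (2 * k) / L                    # reflection loss * spherical spreading
        paths.append((L, amp))
    return paths

# demo: 5 cm depth, 1 cm lateral offset, pressure reflection coefficient 0.6
paths = echo_paths(d=0.05, offset=0.01, R=0.6, n_max=3)
```

Summing such delayed, attenuated contributions at the receiver is what produces the multi-arrival shape of the received voltage signal that DOART models.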

  18. Dispatching function calls across accelerator devices

    DOEpatents

    Jacob, Arpith C.; Sallenave, Olivier H.

    2017-01-10

    In one embodiment, a computer-implemented method for dispatching a function call includes receiving, at a supervisor processing element (PE) and from an origin PE, an identifier of a target device, a stack frame of the origin PE, and an address of a function called from the origin PE. The supervisor PE allocates a target PE of the target device. The supervisor PE copies the stack frame of the origin PE to a new stack frame on a call stack of the target PE. The supervisor PE instructs the target PE to execute the function. The supervisor PE receives a notification that execution of the function is complete. The supervisor PE copies the stack frame of the target PE to the stack frame of the origin PE. The supervisor PE releases the target PE of the target device. The supervisor PE instructs the origin PE to resume execution of the program.
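The claimed sequence (allocate a target PE, copy the stack frame over, execute, copy back, release, resume) can be walked through in a toy simulation. The dictionary "stack frames", device pool, and function table below are invented for illustration and are not part of the patent.

```python
def dispatch(supervisor, origin_frame, func_name, target_device):
    """Toy walk-through of the dispatch sequence described in the abstract."""
    target = supervisor['devices'][target_device].pop()          # allocate a target PE
    target_frame = dict(origin_frame)                            # copy origin stack frame
    result_frame = supervisor['functions'][func_name](target_frame)  # target executes
    origin_frame.update(result_frame)                            # copy frame back to origin
    supervisor['devices'][target_device].append(target)          # release the target PE
    return origin_frame                                          # origin resumes

# demo setup: one accelerator with two PEs and one registered function
supervisor = {
    'devices': {'gpu0': ['pe0', 'pe1']},
    'functions': {'double_x': lambda f: {**f, 'x': f['x'] * 2}},
}
frame = {'x': 21}
dispatch(supervisor, frame, 'double_x', 'gpu0')
```

The point of the round-trip frame copy is that the origin PE sees the function's side effects exactly as if it had executed the call locally.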

  19. Dispatching function calls across accelerator devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacob, Arpith C.; Sallenave, Olivier H.

    In one embodiment, a computer-implemented method for dispatching a function call includes receiving, at a supervisor processing element (PE) and from an origin PE, an identifier of a target device, a stack frame of the origin PE, and an address of a function called from the origin PE. The supervisor PE allocates a target PE of the target device. The supervisor PE copies the stack frame of the origin PE to a new stack frame on a call stack of the target PE. The supervisor PE instructs the target PE to execute the function. The supervisor PE receives a notification that execution of the function is complete. The supervisor PE copies the stack frame of the target PE to the stack frame of the origin PE. The supervisor PE releases the target PE of the target device. The supervisor PE instructs the origin PE to resume execution of the program.

  20. Method for utilizing properties of the sinc(x) function for phase retrieval on nyquist-under-sampled data

    NASA Technical Reports Server (NTRS)

    Dean, Bruce H. (Inventor); Smith, Jeffrey Scott (Inventor); Aronstein, David L. (Inventor)

    2012-01-01

    Disclosed herein are systems, methods, and non-transitory computer-readable storage media for simulating propagation of an electromagnetic field, performing phase retrieval, or sampling a band-limited function. A system practicing the method generates transformed data using a discrete Fourier transform which samples a band-limited function f(x) without interpolating or modifying received data associated with the function f(x), wherein an interval between repeated copies in a periodic extension of the function f(x) obtained from the discrete Fourier transform is associated with a sampling ratio Q, defined as a ratio of a sampling frequency to a band-limited frequency, and wherein Q is assigned a value between 1 and 2 such that substantially no aliasing occurs in the transformed data, and retrieves a phase in the received data based on the transformed data, wherein the phase is used as feedback to an optical system.

  1. Preparing an autonomous, low-cost GNSS positioning and timing function on board a GEO telecom mission: a study case

    NASA Astrophysics Data System (ADS)

    Zin, A.; Scotti, M.; Mangolini, E.; Cappelluti, I.; Fiordiponti, R.; Amalric, J.; Flament, P.; Brouillard, E.; Kowaltschek, S.

    2015-06-01

    The purpose of this paper is to present a viable solution for a low-cost, autonomous GNSS positioning and timing function integrated in the avionics of a GEO telecom satellite. This paper is based on a study currently carried out by Thales Alenia Space under an ARTES contract with ESA, funded by the Italian Space Agency. The availability of an autonomous means of positioning on a GEO platform is essential in reducing the constraints on ground control. This is even more true during transfer to GEO, where the current trend is toward low-thrust electric propulsion and this phase can last for months: the GNSS function can autonomously feed the on-board trajectory propagator with spacecraft position fixes, in order to constrain the state estimate of the on-board filter. In order to achieve high availability of position and timing for the hosting platform, tracking of the GNSS antenna side lobes is a critical feature. Based on the recent experiences of SGR GEO, additional insights on the side-lobe levels were made available to the GNSS community, and this allows better targeting of a receiver architecture whose performance can be determined more precisely for the study case. This paper will show the findings in terms of GNSS function architecture for the GEO application, as well as the integration of this function inside the avionic computer. The focus is on a single-frequency receiver, targeting at least GPS L1 and Galileo E1 signals. Optimizations in terms of sharing of HW resources with the on-board computer (oscillator, DC/DC converter) and the overall redundancy strategy are presented. Extensions and implications for covering the GTO case are presented as well. The expected receiver performances (position and timing accuracy, autonomy) are provided through in-orbit simulations, including a realistic receiver model.

  2. The Hospital-Based Drug Information Center.

    ERIC Educational Resources Information Center

    Hopkins, Leigh

    1982-01-01

    Discusses the rise of drug information centers in hospitals and medical centers, highlighting staffing, functions, typical categories of questions received by centers, and sources used. An appendix of drug information sources included in texts, updated services, journals, and computer databases is provided. Thirteen references are listed. (EJS)

  3. United States Air Force Computer-Aided Acquisition and Logistics Support (CALS) Evolution of Computer Integrated Manufacturing (CIM) Technologies

    DTIC Science & Technology

    1988-11-01

    Manufacturing System 22 4. Similar Parts Based Shape or Manufacturing Process 24 5. Projected Annual Unit Robot Sales and Installed Base Through 1992 30 6. U.S...effort needed to perform personnel, product design, marketing, advertising, and finance tasks of the firm. Level III controls the resource...planning and accounting functions of the firm. Systems at this level support purchasing, accounts payable, accounts receivable, master scheduling and sales

  4. NREL Receives Editors' Choice Awards for Supercomputer Research | News |

    Science.gov Websites

    function," Beckham said. "We followed up these molecular simulations with experimental work to Award. The awards recognize outstanding research in computational molecular science and engineering Mechanisms of Cellulose-Active Enzymes Using Molecular Simulation" at the AIChE 2014 Annual Meeting

  5. Unraveling the Origin of the Bermuda Rise Using Receiver Functions: Insights from Mantle Discontinuity Structure

    NASA Astrophysics Data System (ADS)

    Burky, A.; Irving, J. C. E.; Simons, F.

    2017-12-01

    The Bermuda Rise is an enigmatic intraplate bathymetric feature which is considered a candidate hotspot in some catalogs, but remains a poor candidate due to the lack of an associated seamount chain and the absence of any present-day volcanism. Tomographic models of the seismic P and S wave velocity structure in the upper mantle and transition zone beneath Bermuda and the surrounding seafloor consistently resolve low velocity structures, but the magnitude, lateral dimensions, and position of these low velocity structures vary considerably between models. Due to these discrepancies, it remains difficult to attribute the observed velocity anomalies to thermal or chemical heterogeneity in this region. In addition to tomographic modeling, previous studies investigated the mantle transition zone structure beneath Bermuda by calculating receiver functions for GSN station BBSR, and suggested thinning of the transition zone as well as depressed discontinuity topography. In this study, we expand upon those studies by including the wealth of newly available data, and by incorporating a suite of three-dimensional velocity models. We calculate radial receiver functions in multiple frequency bands for the highest quality seismograms selected from over 5,000 waveforms recorded at station BBSR between October 2008 and August 2017 using the iterative deconvolution technique. We use various one- and three-dimensional velocity models to depth-convert our receiver functions to find the depths of the mantle transition zone discontinuities responsible for the signals in our receiver functions. The observed discontinuity topography is interpreted in the context of candidate mineralogical phase transitions and mantle temperature. To gain a more comprehensive understanding of our observations, we also calculate synthetic seismograms using AxiSEM, compute radial receiver functions for these synthetic data, and compare the results to the real receiver functions. Lastly, we discuss our results in the context of the geologic and geodynamic history of the Bermuda Rise.
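The iterative deconvolution technique referenced above is commonly implemented along the following lines, building the receiver function as a sparse spike train by repeatedly cross-correlating the residual radial trace with the vertical trace. This is a minimal sketch on synthetic spike data, in the spirit of the standard time-domain method, not the authors' own processing chain.

```python
import numpy as np

def iterative_decon(num, den, n_iter=10):
    """Time-domain iterative deconvolution: at each step, add the spike
    (lag, amplitude) that best explains the remaining residual."""
    n = len(num)
    rf = np.zeros(n)
    resid = num.copy()
    den_energy = np.dot(den, den)
    for _ in range(n_iter):
        xc = np.correlate(resid, den, mode='full')[n - 1:]   # lags >= 0
        lag = int(np.argmax(np.abs(xc)))
        rf[lag] += xc[lag] / den_energy                      # add best-fit spike
        resid = num - np.convolve(rf, den)[:n]               # update residual
    return rf

# demo: radial = vertical convolved with direct P (t=0) plus a Ps conversion
n = 256
vertical = np.zeros(n); vertical[10] = 1.0; vertical[11] = 0.5
true_rf = np.zeros(n); true_rf[0] = 1.0; true_rf[40] = 0.3
radial = np.convolve(true_rf, vertical)[:n]
rf = iterative_decon(radial, vertical)
```

On noise-free data the spike train is recovered exactly; on real seismograms one would low-pass the result and stop iterating once the residual reduction stalls.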

  6. Receiver Functions From Regional and Near-Teleseismic P Waves

    NASA Astrophysics Data System (ADS)

    Park, J.; Levin, V.

    2001-05-01

    P waves from regional-distance earthquakes are complex and reverberatory, as would be expected from a combination of head waves, post-critical crustal reflections and shallow-incident P from the upper mantle. Although developed to analyze steeply-incident teleseismic P waves, receiver function analysis can also retrieve information about crustal structure from regional and near-teleseismic P. Using a new method to estimate receiver functions, based on multiple-taper spectral analysis, regional-distance RFs for GSN stations RAYN and ANTO show broad agreement with teleseismic RFs. At RAYN the moveout of the Moho-converted Ps phase, relative to direct P, follows well the predictions of the IASP91 earth model. The Moho-converted Ps phase shows complexity associated with the transition-zone triplication near Δ = 20° and constant delay (zero moveout) as Δ → 0, consistent with conversion from Pn. Similar behavior is seen for ANTO for events that arrive from the west. For eastern backazimuths the ANTO RFs show features whose moveout is negative as Δ → 0. This moveout is poorly fit by reverberations in flat layers or by direct scattering from a dipping interface, but is consistent with a topographic scatterer 20--30 km eastward of the ANTO site. Regional receiver functions may therefore be useful in judging whether teleseismic RFs at a particular station are suitable candidates for a 1-D velocity structure inversion. Synthetic seismograms of regional P phases, computed with a locked-mode reflectivity approach, confirm broad features of the RAYN and ANTO regional receiver functions.

  7. A pilot single-blind multicentre randomized controlled trial to evaluate the potential benefits of computer-assisted arm rehabilitation gaming technology on the arm function of children with spastic cerebral palsy.

    PubMed

    Preston, Nick; Weightman, Andrew; Gallagher, Justin; Levesley, Martin; Mon-Williams, Mark; Clarke, Mike; O'Connor, Rory J

    2016-10-01

    To evaluate the potential benefits of computer-assisted arm rehabilitation gaming technology on arm function of children with spastic cerebral palsy. A single-blind randomized controlled trial design. Power calculations indicated that 58 children would be required to demonstrate a clinically important difference. Intervention was home-based; recruitment took place in regional spasticity clinics. A total of 15 children with cerebral palsy aged five to 12 years were recruited; eight to the device group. Both study groups received 'usual follow-up treatment' following spasticity treatment with botulinum toxin; the intervention group also received a rehabilitation gaming device. ABILHAND-kids and Canadian Occupational Performance Measure were performed by blinded assessors at baseline, six and 12 weeks. An analysis of covariance showed no group differences in mean ABILHAND-kids scores between time points. A non-parametric analysis of variance on Canadian Occupational Performance Measure scores showed a statistically significant improvement across time points (χ²(2, 15) = 6.778, p = 0.031), but this improvement did not reach the minimal clinically important difference. Mean daily device use was seven minutes. Recruitment did not reach target owing to unanticipated staff shortages in clinical services. Feedback from children and their families indicated that the games were not engaging enough to promote the amount of use likely to result in functional benefits. This study suggests that computer-assisted arm rehabilitation gaming does not benefit arm function, but a Type II error cannot be ruled out. © The Author(s) 2015.

  8. Effect of virtual reality on cognition in stroke patients.

    PubMed

    Kim, Bo Ryun; Chun, Min Ho; Kim, Lee Suk; Park, Ji Young

    2011-08-01

    To investigate the effect of virtual reality on the recovery of cognitive impairment in stroke patients. Twenty-eight patients (11 males and 17 females, mean age 64.2) with cognitive impairment following stroke were recruited for this study. All patients were randomly assigned to one of two groups, the virtual reality (VR) group (n=15) or the control group (n=13). The VR group received both virtual reality training and computer-based cognitive rehabilitation, whereas the control group received only computer-based cognitive rehabilitation. To assess activities of daily living and cognitive and motor functions, the following tools were used: a computerized neuropsychological test and the Tower of London (TOL) test for cognitive function, the Korean-Modified Barthel Index (K-MBI) for functional status, and the Motricity Index (MI) for motor function. All recruited patients underwent these evaluations before rehabilitation and four weeks after rehabilitation. The VR group showed significant improvement in the K-MMSE, visual and auditory continuous performance tests (CPT), forward digit span test (DST), forward and backward visual span tests (VST), visual and verbal learning tests, TOL, K-MBI, and MI scores, while the control group showed significant improvement in the K-MMSE, forward DST, visual and verbal learning tests, trail-making test-type A, TOL, K-MBI, and MI scores after rehabilitation. The changes in the visual CPT and backward VST in the VR group after rehabilitation were significantly higher than those in the control group. Our findings suggest that virtual reality training combined with computer-based cognitive rehabilitation may be of additional benefit for treating cognitive impairment in stroke patients.

  9. PREDICTS

    NASA Technical Reports Server (NTRS)

    Zhou, Hanying

    2007-01-01

    PREDICTS is a computer program that predicts the frequencies, as functions of time, of signals to be received by a radio science receiver, in this case a special-purpose digital receiver dedicated to analysis of signals received by an antenna in NASA's Deep Space Network (DSN). Unlike other software used in the DSN, PREDICTS does not use interpolation early in the calculations; as a consequence, PREDICTS is more precise and more stable. The precision afforded by the other DSN software is sufficient for telemetry; the greater precision afforded by PREDICTS is needed for radio-science experiments. In addition to frequencies as functions of time, PREDICTS yields the rates of change and interpolation coefficients for the frequencies and the beginning and ending times of reception, transmission, and occultation. PREDICTS is applicable to S-, X-, and Ka-band signals and can accommodate the following link configurations: (1) one-way (spacecraft to ground), (2) two-way (from a ground station to a spacecraft and back to the same ground station), and (3) three-way (from a ground transmitting station to a spacecraft to a different ground receiving station).
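The core of such a frequency predict is first-order Doppler bookkeeping per link leg. The sketch below captures only that idea; the transponder turnaround ratio and all numeric values are illustrative, and real DSN predicts additionally include relativistic terms and media corrections omitted here.

```python
C = 299_792_458.0  # speed of light, m/s

def received_frequency(f_tx, range_rate_up=None, range_rate_down=0.0, turnaround=1.0):
    """First-order Doppler sketch: each link leg scales the frequency by
    (1 - rdot/c); `turnaround` models a spacecraft transponder ratio."""
    f = f_tx
    if range_rate_up is not None:          # uplink leg (two-way / three-way only)
        f *= (1.0 - range_rate_up / C)
    f *= turnaround                        # spacecraft transponder
    f *= (1.0 - range_rate_down / C)       # downlink leg
    return f

# one-way example: spacecraft oscillator at 8.4 GHz, receding at 10 km/s
f1 = received_frequency(8.4e9, range_rate_up=None, range_rate_down=10e3)
```

For two-way and three-way links the same function is called with an uplink range rate, which is how the three configurations listed above differ in the computation.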

  10. Modelling protein functional domains in signal transduction using Maude

    NASA Technical Reports Server (NTRS)

    Sriram, M. G.

    2003-01-01

    Modelling of protein-protein interactions in signal transduction is receiving increased attention in computational biology. This paper describes recent research in the application of Maude, a symbolic language founded on rewriting logic, to the modelling of functional domains within signalling proteins. Protein functional domains (PFDs) are a critical focus of modern signal transduction research. In general, Maude models can simulate biological signalling networks and produce specific testable hypotheses at various levels of abstraction. Developing symbolic models of signalling proteins containing functional domains is important because of the potential to generate analyses of complex signalling networks based on structure-function relationships.

  11. Evidence of Dynamic Crustal Deformation in Tohoku, Japan, From Time-Varying Receiver Functions

    NASA Astrophysics Data System (ADS)

    Porritt, R. W.; Yoshioka, S.

    2017-10-01

    Temporal variation of crustal structure is key to our understanding of Earth processes on human timescales. Often, we expect that the most significant structural variations are caused by strong ground shaking associated with large earthquakes, and recent studies seem to confirm this. Here we test the possibility of using P receiver functions (PRF) to isolate structural variations over time. Synthetic receiver function tests indicate that structural variation could produce PRF changes on the same order of magnitude as random noise or contamination by local earthquakes. Nonetheless, we find significant variability in observed receiver functions over time at several stations located in northeastern Honshu. Immediately following the Tohoku-oki earthquake, we observe high PRF variation clustering spatially, especially in two regions near the beginning and end of the rupture plane. Due to the depth sensitivity of PRF and the timescales over which this variability is observed, we infer this effect is primarily due to fluid migration in volcanic regions and shear stress/strength reorganization. While the noise levels in PRF are high for this type of analysis, by sampling small data sets, the computational cost is lower than other methods, such as ambient noise, thereby making PRF a useful tool for estimating temporal variations in crustal structure.

  12. A theoretical study on the impact of particle scattering on the channel characteristics of underwater optical communication system

    NASA Astrophysics Data System (ADS)

    Sahu, Sanjay Kumar; Shanmugam, Palanisamy

    2018-02-01

    Scattering by water molecules and particulate matter determines the path and distance of photon propagation in an underwater medium. Consequently, the photon angle of scattering (given by the scattering phase function) must be considered, in addition to the extinction coefficient of the aquatic medium governed by the absorption and scattering coefficients, when characterizing the channel for an underwater wireless optical communication (UWOC) system. This study focuses on analyzing the received signal power and impulse response of the UWOC channel based on Monte-Carlo simulations for different water types, link distances, link geometries and transceiver parameters. A newly developed scattering phase function (referred to as the SS phase function), which, like the Petzold phase function, represents real water types accurately, is considered for quantification of the channel characteristics along with the effects of the absorption and scattering coefficients. A comparison between the results simulated using various phase function models and the experimental measurements of Petzold revealed that the SS phase function model predicts values closely matching the actual values of the Petzold phase function, which further establishes the importance of using a correct scattering phase function model when estimating the channel capacity of a UWOC system in terms of received power and channel impulse response. Results further demonstrate a great advantage of considering the nonzero probability of receiving scattered photons in estimating channel capacity, rather than considering the reception of only ballistic photons as in Beer's law, which severely underestimates the received power and affects the range of communication, especially in a scattering water column. The received power computed by the Monte-Carlo method, considering receiver aperture sizes and fields of view in different water types, is further analyzed and discussed. These results are essential for evaluating the underwater link budget and choosing system and design parameters for a UWOC system.
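Since the SS phase function itself is not reproduced in the abstract, a Monte-Carlo channel sketch can use the widely known Henyey-Greenstein phase function as a stand-in. The geometry is deliberately simplified (only the polar angle is tracked, and absorption is handled by weight attenuation), so this illustrates the method, including the scattered-photon contribution beyond Beer's law, not the paper's model.

```python
import math, random

def hg_angle(g, rng):
    """Sample a scattering angle from the Henyey-Greenstein phase function
    (a stand-in here for the paper's SS phase function)."""
    u = rng.random()
    if abs(g) < 1e-6:
        return math.acos(2.0 * u - 1.0)
    s = (1.0 - g * g) / (1.0 - g + 2.0 * g * u)
    return math.acos(max(-1.0, min(1.0, (1.0 + g * g - s * s) / (2.0 * g))))

def mc_received_fraction(a, b, link, aperture, g=0.92, n_photons=20000, seed=1):
    """Toy Monte-Carlo link model: count weighted photons reaching a receiver
    disk at z = link in a medium with absorption a and scattering b (1/m)."""
    rng = random.Random(seed)
    c = a + b                                     # extinction coefficient
    received = 0.0
    for _ in range(n_photons):
        z, r, mu, w = 0.0, 0.0, 1.0, 1.0
        for _ in range(100):                      # cap scattering events per photon
            step = -math.log(1.0 - rng.random()) / c     # exponential free path
            z += mu * step
            r += math.sqrt(max(0.0, 1.0 - mu * mu)) * step
            if z >= link:                         # crossed the receiver plane
                if r <= aperture:
                    received += w
                break
            w *= b / c                            # survive the scattering event
            mu = math.cos(math.acos(max(-1.0, min(1.0, mu))) + hg_angle(g, rng))
    return received / n_photons

frac = mc_received_fraction(a=0.05, b=0.3, link=5.0, aperture=0.2)
```

The received fraction exceeds the ballistic-only (Beer's law) prediction exp(-c·link) whenever forward-scattered photons still land inside the aperture, which is exactly the effect the paper quantifies.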

  13. High-Rate Digital Receiver Board

    NASA Technical Reports Server (NTRS)

    Ghuman, Parminder; Bialas, Thomas; Brambora, Clifford; Fisher, David

    2004-01-01

    A high-rate digital receiver (HRDR) implemented as a peripheral component interface (PCI) board has been developed as a prototype of compact, general-purpose, inexpensive, potentially mass-producible data-acquisition interfaces between telemetry systems and personal computers. The installation of this board in a personal computer together with an analog preprocessor enables the computer to function as a versatile, high-rate telemetry-data-acquisition and demodulator system. The prototype HRDR PCI board can handle data at rates as high as 600 megabits per second, in a variety of telemetry formats, transmitted by diverse phase-modulation schemes that include binary phase-shift keying and various forms of quadrature phase-shift keying. Costing less than $25,000 (as of year 2003), the prototype HRDR PCI board supplants multiple racks of older equipment that, when new, cost over $500,000. Just as the development of standard network-interface chips has contributed to the proliferation of networked computers, it is anticipated that the development of standard chips based on the HRDR could contribute to reductions in size and cost and increases in performance of telemetry systems.

  14. A portable fNIRS system with eight channels

    NASA Astrophysics Data System (ADS)

    Si, Juanning; Zhao, Ruirui; Zhang, Yujin; Zuo, Nianming; Zhang, Xin; Jiang, Tianzi

    2015-03-01

    Abundant studies of the hemodynamic response of the brain have driven advances in the technologies for measuring it, and functional near-infrared spectroscopy (fNIRS) has benefited most. A variety of devices have been developed for different applications. Because portable fNIRS systems are better suited to measuring responses either of special subjects or in natural environments, several kinds of portable fNIRS systems have been reported. However, they all required a computer for receiving data. The extra computer increases the cost of an fNIRS system, and, more noticeably, the space required to locate the computer reduces the portability of even a nominally portable system. We therefore designed a self-contained eight-channel fNIRS system that does not require a computer to receive data or display it on a monitor. Instead, the system is centered on an ARM-core CPU, which organizes and saves the data and displays it on a touch screen. The system has also been validated by experiments on phantoms and on subjects performing tasks.

  15. Clinical Pilot Study and Computational Modeling of Bitemporal Transcranial Direct Current Stimulation, and Safety of Repeated Courses of Treatment, in Major Depression.

    PubMed

    Ho, Kerrie-Anne; Bai, Siwei; Martin, Donel; Alonzo, Angelo; Dokos, Socrates; Loo, Colleen K

    2015-12-01

    This study aimed to examine a bitemporal (BT) transcranial direct current stimulation (tDCS) electrode montage for the treatment of depression through a clinical pilot study and computational modeling. The safety of repeated courses of stimulation was also examined. Four participants with depression who had previously received multiple courses of tDCS received a 4-week course of BT tDCS. Mood and neuropsychological function were assessed. The results were compared with previous courses of tDCS given to the same participants using different electrode montages. Computational modeling examined the electric field maps produced by the different montages. Three participants showed clinical improvement with BT tDCS (mean [SD] improvement, 49.6% [33.7%]). There were no adverse neuropsychological effects. Computational modeling showed that the BT montage activates the anterior cingulate cortices and brainstem, which are deep brain regions that are important for depression. However, a fronto-extracephalic montage stimulated these areas more effectively. No adverse effects were found in participants receiving up to 6 courses of tDCS. Bitemporal tDCS was safe and led to clinically meaningful efficacy in 3 of 4 participants. However, computational modeling suggests that the BT montage may not activate key brain regions in depression more effectively than another novel montage, fronto-extracephalic tDCS. There is also preliminary evidence to support the safety of up to 6 repeated courses of tDCS.

  16. Capacity of noncoherent MFSK channels

    NASA Technical Reports Server (NTRS)

    Bar-David, I.; Butman, S. A.; Klass, M. J.; Levitt, B. K.; Lyon, R. F.

    1974-01-01

    Performance limits theoretically achievable over noncoherent channels perturbed by additive Gaussian noise in hard decision, optimal, and soft decision receivers are computed as functions of the number of orthogonal signals and the predetection signal-to-noise ratio. Equations are derived for orthogonal signal capacity, the ultimate MFSK capacity, and the convolutional coding and decoding limit. It is shown that performance improves as the signal-to-noise ratio increases, provided the bandwidth can be increased, that the optimum number of signals is not infinite (except for the optimal receiver), and that the optimum number decreases as the signal-to-noise ratio decreases, but is never less than 7 for even the hard decision receiver.
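For the hard-decision receiver, the computation can be reconstructed from standard results: the exact symbol-error probability of noncoherent orthogonal MFSK in additive Gaussian noise, fed into the capacity of the resulting M-ary symmetric channel. This is a sketch of the textbook relations, not the report's own derivation or code.

```python
import math

def mfsk_symbol_error(M, es_n0):
    """Exact symbol-error probability of noncoherent orthogonal MFSK in AWGN,
    with es_n0 the predetection symbol SNR Es/N0 (linear, not dB)."""
    return sum((-1) ** (k + 1) * math.comb(M - 1, k) / (k + 1)
               * math.exp(-k * es_n0 / (k + 1))
               for k in range(1, M))

def hard_decision_capacity(M, es_n0):
    """Capacity in bits/symbol of the M-ary symmetric channel induced by
    hard-decision noncoherent MFSK detection."""
    p = mfsk_symbol_error(M, es_n0)
    if p <= 0.0:
        return math.log2(M)
    return (math.log2(M) + (1.0 - p) * math.log2(1.0 - p)
            + p * math.log2(p / (M - 1)))

# demo: 8-FSK at Es/N0 = 10 (about 10 dB)
p8 = mfsk_symbol_error(8, 10.0)
cap = hard_decision_capacity(8, 10.0)
```

Sweeping M and Es/N0 with these two functions reproduces the kind of tradeoff the paper tabulates for the hard-decision receiver; the optimal and soft-decision receivers require integrating over the decision statistics instead.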

  17. Zonal and tesseral harmonic coefficients for the geopotential function, from zero to 18th order

    NASA Technical Reports Server (NTRS)

    Kirkpatrick, J. C.

    1976-01-01

    Zonal and tesseral harmonic coefficients for the geopotential function are usually tabulated in normalized form to provide immediate information as to the relative significance of the coefficients in the gravity model. The normalized form of the geopotential coefficients cannot be used for computational purposes unless the gravity model has been modified to receive them. This modification is usually not done because the absolute or unnormalized form of the coefficients can be obtained from the simple mathematical relationship that relates the two forms. This computation can be quite tedious for hand calculation, especially for the higher order terms, and can be costly in terms of storage and execution time for machine computation. In this report, zonal and tesseral harmonic coefficients for the geopotential function are tabulated in absolute or unnormalized form. The report is designed to be used as a ready reference for both hand and machine calculation to save the user time and effort.
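The "simple mathematical relationship" between the two forms is the standard normalization factor, which is straightforward to apply in code. In this sketch the normalized J2 value is a typical published figure used only as a sanity check, not a coefficient taken from the report's tables.

```python
import math

def denormalize(n, m, cbar):
    """Convert a fully normalized geopotential coefficient of degree n and
    order m to its unnormalized (absolute) form."""
    delta = 1 if m == 0 else 0    # Kronecker delta for m = 0
    factor = math.sqrt((2 - delta) * (2 * n + 1)
                       * math.factorial(n - m) / math.factorial(n + m))
    return cbar * factor

# sanity check: the normalized C(2,0) term of a typical gravity model
c20_bar = -484.165e-6
c20 = denormalize(2, 0, c20_bar)   # roughly -1082.6e-6, i.e. -J2
```

For degree 2, order 0 the factor reduces to sqrt(5), which recovers the familiar unnormalized J2 magnitude; for high degree and order the factorial ratio is exactly the tedious hand computation the report spares its readers.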

  18. PPM Receiver Implemented in Software

    NASA Technical Reports Server (NTRS)

    Gray, Andrew; Kang, Edward; Lay, Norman; Vilnrotter, Victor; Srinivasan, Meera; Lee, Clement

    2010-01-01

A computer program has been written as a tool for developing optical pulse-position-modulation (PPM) receivers in which photodetector outputs are fed to analog-to-digital converters (ADCs) and all subsequent signal processing is performed digitally. The program can be used, for example, to simulate an all-digital version of the PPM receiver described in "Parallel Processing of Broad-Band PPM Signals" (NPO-40711), which appears elsewhere in this issue of NASA Tech Briefs. The program can also be translated into a design for digital PPM receiver hardware. The most notable innovation embodied in the software and the underlying PPM-reception concept is a digital processing subsystem that performs synchronization of PPM time slots, even though the digital processing is, itself, asynchronous in the sense that no attempt is made to synchronize it with the incoming optical signal a priori and there is no feedback to analog signal processing subsystems or ADCs. Functions performed by the software receiver include time-slot synchronization, symbol synchronization, coding preprocessing, and diagnostic functions. The program is written in the MATLAB and Simulink software system. The software receiver is highly parameterized and, hence, programmable: for example, slot- and symbol-synchronization filters have programmable bandwidths.
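As an illustration of the final decision stage of such a receiver, the sketch below sums ADC samples slot by slot and picks the largest. It assumes slot timing has already been recovered, whereas the receiver described above additionally performs that synchronization digitally and asynchronously:

```python
import numpy as np

def ppm_detect(samples, n_slots, samples_per_slot):
    """Decide one PPM symbol from ADC samples of a symbol interval by
    summing the samples in each time slot and choosing the largest.
    Slot timing is assumed already recovered (an assumption; the real
    receiver derives it digitally from the signal itself)."""
    slots = samples[: n_slots * samples_per_slot].reshape(n_slots,
                                                          samples_per_slot)
    return int(np.argmax(slots.sum(axis=1)))

# Toy symbol: pulse in slot 5 of a 16-ary PPM symbol, 4 samples/slot.
sig = np.zeros(64)
sig[5 * 4 : 6 * 4] = 1.0
sig += 0.05 * np.random.default_rng(0).standard_normal(64)
print(ppm_detect(sig, 16, 4))  # → 5
```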

  19. Concentrated solar-flux measurements at the IEA-SSPS solar-central-receiver power plant, Tabernas - Almeria (Spain)

    NASA Astrophysics Data System (ADS)

    Vontobel, G.; Schelders, C.; Real, M.

    A flux analyzing system (F.A.S.) was installed at the central receiver system of the SSPS project to determine the relative flux distribution of the heliostat field and to measure the entire optical solar flux reflected from the heliostat field into the receiver cavity. The functional principles of the F.A.S. are described. The raw data and the evaluation of the measurements of the entire heliostat field are given, and an approach to determine the actual fluxes which hit the receiver tube bundle is presented. A method is described to qualify the performance of each heliostat using a computer code. Measurements of the direct radiation are also presented.

  20. Derivation of the Cramér-Rao Bound in the GNSS-Reflectometry Context for Static, Ground-Based Receivers in Scenarios with Coherent Reflection

    PubMed Central

    Ribot, Miguel Angel; Botteron, Cyril; Farine, Pierre-André

    2016-01-01

    The use of the reflected Global Navigation Satellite Systems’ (GNSS) signals in Earth observation applications, referred to as GNSS reflectometry (GNSS-R), has already been studied for more than two decades. However, the estimation precision that can be achieved by GNSS-R sensors in some particular scenarios is still not fully understood. In an effort to partially fill this gap, in this paper, we compute the Cramér–Rao bound (CRB) for the specific case of static ground-based GNSS-R receivers and scenarios where the coherent component of the reflected signal is dominant. We compute the CRB for GNSS signals with different modulations, GPS L1 C/A and GPS L5 I/Q, which use binary phase-shift keying, and Galileo E1 B/C and E5, using the binary offset carrier. The CRB for these signals is evaluated as a function of the receiver bandwidth and different scenario parameters, such as the height of the receiver or the properties of the reflection surface. The CRB computation presented considers observation times of up to several tens of seconds, in which the satellite elevation angle observed changes significantly. Finally, the results obtained show the theoretical benefit of using modern GNSS signals with GNSS-R techniques using long observation times, such as the interference pattern technique. PMID:27929388

  1. Derivation of the Cramér-Rao Bound in the GNSS-Reflectometry Context for Static, Ground-Based Receivers in Scenarios with Coherent Reflection.

    PubMed

    Ribot, Miguel Angel; Botteron, Cyril; Farine, Pierre-André

    2016-12-05

    The use of the reflected Global Navigation Satellite Systems' (GNSS) signals in Earth observation applications, referred to as GNSS reflectometry (GNSS-R), has already been studied for more than two decades. However, the estimation precision that can be achieved by GNSS-R sensors in some particular scenarios is still not fully understood. In an effort to partially fill this gap, in this paper, we compute the Cramér-Rao bound (CRB) for the specific case of static ground-based GNSS-R receivers and scenarios where the coherent component of the reflected signal is dominant. We compute the CRB for GNSS signals with different modulations, GPS L1 C/A and GPS L5 I/Q, which use binary phase-shift keying, and Galileo E1 B/C and E5, using the binary offset carrier. The CRB for these signals is evaluated as a function of the receiver bandwidth and different scenario parameters, such as the height of the receiver or the properties of the reflection surface. The CRB computation presented considers observation times of up to several tens of seconds, in which the satellite elevation angle observed changes significantly. Finally, the results obtained show the theoretical benefit of using modern GNSS signals with GNSS-R techniques using long observation times, such as the interference pattern technique.
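For readers unfamiliar with the bound, a minimal numerical CRB for a single parameter of a deterministic signal in white Gaussian noise can be sketched as follows. This generic recipe is only an illustration, not the paper's multi-parameter GNSS-R derivation:

```python
import numpy as np

def crb_scalar(signal_fn, theta, sigma, eps=1e-6):
    """Numerical Cramér-Rao bound for one deterministic parameter of a
    signal observed in white Gaussian noise of standard deviation sigma:
    CRB = sigma^2 / sum_k (ds_k/dtheta)^2.  The derivative is taken by
    central finite differences."""
    ds = (signal_fn(theta + eps) - signal_fn(theta - eps)) / (2 * eps)
    return sigma ** 2 / np.sum(ds ** 2)

# Example: phase of a sinusoid; the analytic CRB is 2*sigma^2/(N*A^2).
N, A, sigma = 1000, 1.0, 0.5
t = np.arange(N)
s = lambda phi: A * np.sin(0.1 * t + phi)
print(crb_scalar(s, 0.3, sigma))  # close to 2*sigma^2/(N*A^2) = 5e-4
```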

  2. Gravity-supported exercise with computer gaming improves arm function in chronic stroke.

    PubMed

    Jordan, Kimberlee; Sampson, Michael; King, Marcus

    2014-08-01

    To investigate the effect of 4 to 6 weeks of exergaming with a computer mouse embedded within an arm skate on upper limb function in survivors of chronic stroke. Intervention study with a 4-week postintervention follow-up. In home. Survivors (N=13) of chronic (≥6 mo) stroke with hemiparesis of the upper limb with stable baseline Fugl-Meyer assessment scores received the intervention. One participant withdrew, and 2 participants were not reassessed at the 4-week follow-up. No participants withdrew as a result of adverse effects. Four to 6 weeks of exergaming using the arm skate where participants received either 9 (n=5) or 16 (n=7) hours of game play. Upper limb component of the Fugl-Meyer assessment. There was an average increase in the Fugl-Meyer upper limb assessment score from the beginning to end of the intervention of 4.9 points. At the end of the 4-week period after the intervention, the increase was 4.4 points. A 4- to 6-week intervention using the arm skate significantly improved arm function in survivors of chronic stroke by an average of 4.9 Fugl-Meyer upper limb assessment points. This research shows that a larger-scale randomized trial of this device is warranted and highlights the potential value of using virtual reality technology (eg, computer games) in a rehabilitation setting. Copyright © 2014 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  3. MACMULTIVIEW 5.1

    NASA Technical Reports Server (NTRS)

    Norikane, L.

    1994-01-01

    MacMultiview is an interactive tool for the Macintosh II family which allows one to display and make computations utilizing polarimetric radar data collected by the Jet Propulsion Laboratory's imaging SAR (synthetic aperture radar) polarimeter system. The system includes the single-frequency L-band sensor mounted on the NASA CV990 aircraft and its replacement, the multi-frequency P-, L-, and C-band sensors mounted on the NASA DC-8. MacMultiview provides two basic functions: computation of synthesized polarimetric images and computation of polarization signatures. The radar data can be used to compute a variety of images. The total power image displays the sum of the polarized and unpolarized components of the backscatter for each pixel. The magnitude/phase difference image displays the HH (horizontal transmit and horizontal receive polarization) to VV (vertical transmit and vertical receive polarization) phase difference using color. Magnitude is displayed using intensity. The user may also select any transmit and receive polarization combination from which an image is synthesized. This image displays the backscatter which would have been observed had the sensor been configured using the selected transmit and receive polarizations. MacMultiview can also be used to compute polarization signatures, three dimensional plots of backscatter versus transmit and receive polarizations. The standard co-polarization signatures (transmit and receive polarizations are the same) and cross-polarization signatures (transmit and receive polarizations are orthogonal) can be plotted for any rectangular subset of pixels within a radar data set. In addition, the ratio of co- and cross-polarization signatures computed from different subsets within the same data set can also be computed. 
Computed images can be saved in a variety of formats: byte format (headerless format which saves the image as a string of byte values), MacMultiview (a byte image preceded by an ASCII header), and PICT2 format (standard format readable by MacMultiview and other image processing programs for the Macintosh). Images can also be printed on PostScript output devices. Polarization signatures can be saved in either a PICT format or as a text file containing PostScript commands and printed on any QuickDraw output device. The associated Stokes matrices can be stored in a text file. MacMultiview is written in C-language for Macintosh II series computers. MacMultiview will only run on Macintosh II series computers with 8-bit video displays (gray shades or color). The program also requires a minimum configuration of System 6.0, Finder 6.1, and 1Mb of RAM. MacMultiview is NOT compatible with System 7.0. It requires 32-Bit QuickDraw. Note: MacMultiview may not be fully compatible with preliminary versions of 32-Bit QuickDraw. Macintosh Programmer's Workshop and Macintosh Programmer's Workshop C (version 3.0) are required for recompiling and relinking. The standard distribution medium for this package is a set of three 800K 3.5 inch diskettes in Macintosh format. This program was developed in 1989 and updated in 1991. MacMultiview is a copyrighted work with all copyright vested in NASA. QuickDraw, Finder, Macintosh, and System 7 are trademarks of Apple Computer, Inc.
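The polarization-synthesis computation at the heart of MacMultiview can be sketched as follows. This assumes the standard Stokes/Kennaugh formalism, P = s_r · (M s_t), and does not reproduce the program's exact scaling conventions:

```python
import numpy as np

def stokes_vector(psi, chi):
    """Unit-power Stokes vector for a polarization ellipse with
    orientation angle psi and ellipticity angle chi (radians)."""
    return np.array([1.0,
                     np.cos(2 * psi) * np.cos(2 * chi),
                     np.sin(2 * psi) * np.cos(2 * chi),
                     np.sin(2 * chi)])

def synthesized_power(kennaugh, psi_t, chi_t, psi_r, chi_r):
    """Backscattered power for arbitrary transmit/receive polarizations
    from a pixel's 4x4 Stokes (Kennaugh) matrix: P = s_r . (M s_t).
    A sketch of the synthesis step only."""
    return stokes_vector(psi_r, chi_r) @ kennaugh @ stokes_vector(psi_t, chi_t)

# Example: HH power (psi=0, chi=0 at both ends) for an identity matrix.
print(synthesized_power(np.eye(4), 0, 0, 0, 0))  # → 2.0
```

Sweeping psi and chi over a grid and plotting the resulting powers gives exactly the kind of three-dimensional polarization signature the program renders.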

  4. Hybrid Quantum-Classical Approach to Quantum Optimal Control.

    PubMed

    Li, Jun; Yang, Xiaodong; Peng, Xinhua; Sun, Chang-Pu

    2017-04-14

    A central challenge in quantum computing is to identify more computational problems for which utilization of quantum resources can offer significant speedup. Here, we propose a hybrid quantum-classical scheme to tackle the quantum optimal control problem. We show that the most computationally demanding part of gradient-based algorithms, namely, computing the fitness function and its gradient for a control input, can be accomplished by the process of evolution and measurement on a quantum simulator. By posing queries to and receiving answers from the quantum simulator, classical computing devices update the control parameters until an optimal control solution is found. To demonstrate the quantum-classical scheme in experiment, we use a seven-qubit nuclear magnetic resonance system, on which we have succeeded in optimizing state preparation without involving classical computation of the large Hilbert space evolution.
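A one-qubit toy version of the hybrid loop, with the quantum query replaced by exact simulation, might look like this. The seven-qubit NMR experiment and the authors' exact algorithm are not reproduced; this only illustrates the query-update scheme:

```python
import numpy as np

def fidelity(u, T=1.0):
    """The 'quantum query': prepare |0>, evolve under H = u*sigma_x for
    time T, measure overlap with the target |1>.  For one qubit,
    exp(-i*u*T*sigma_x)|0> has amplitude -i*sin(u*T) on |1>, so the
    fitness is sin^2(u*T); a real device would estimate this by
    repeated measurement."""
    return np.sin(u * T) ** 2

# Classical outer loop: finite-difference gradient ascent on the
# fitness returned by the (simulated) quantum evaluations.
u, step, eps = 0.3, 0.5, 1e-4
for _ in range(200):
    grad = (fidelity(u + eps) - fidelity(u - eps)) / (2 * eps)
    u += step * grad
print(round(fidelity(u), 4))  # → 1.0 (u converges toward pi/2)
```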

  5. Scaffolding Executive Function Capabilities via Play-&-Learn Software for Preschoolers

    ERIC Educational Resources Information Center

    Axelsson, Anton; Andersson, Richard; Gulz, Agneta

    2016-01-01

    Educational software in the form of games, or so-called "computer-assisted intervention," for young children has become increasingly common, receiving growing interest and support. Currently there are, for instance, more than 1,000 iPad apps tagged for preschool. Thus, it has become increasingly important to empirically investigate…

  6. The effects of computer-assisted cognitive rehabilitation on Alzheimer's dementia patients' memories.

    PubMed

    Hwang, Jung-Ha; Cha, Hyun-Gyu; Cho, Young-Seok; Kim, Tae-Sue; Cho, Hyuk-Shin

    2015-09-01

    [Purpose] The purpose of the present study was to conduct Computer-Assisted Cognitive Rehabilitation (COMCOG) to examine the effects of COMCOG on Alzheimer's dementia patients' memories. [Subjects] Thirty-five patients diagnosed with Alzheimer's dementia received COMCOG for 30 minutes per day, five days per week for four weeks. [Methods] Before and after the COMCOG intervention, subjects' cognitive functions were evaluated using the Cognitive Assessment Reference Diagnosis System (CARDS) and Mini-Mental State Examination-Korea (MMSE-K) test. [Results] According to the results of the evaluation, among the CARDS scores of the subjects who received COMCOG, the scores of the delayed 10-word list, delayed 10-object list, recognition 10-object, and recent memory significantly increased while the scores of recognition 10-word significantly decreased after intervention compared to before intervention. In addition, among the MMSE-K items, the orientation, registration, and recall showed significant increases. [Conclusion] Based on these results, delay in the progress of memory deterioration can be expected when COMCOG is conducted for Alzheimer's dementia patients who show declines in cognitive functions.

  7. Design of Remote GPRS-based Gas Data Monitoring System

    NASA Astrophysics Data System (ADS)

    Yan, Xiyue; Yang, Jianhua; Lu, Wei

    2018-01-01

    In order to solve the problem of remote data transmission from gas flowmeters and to enable unattended on-site operation, this paper presents a GPRS-based remote monitoring system for gas data. The slave computer of this system uses an embedded microprocessor to read data from the gas flowmeter over an RS-232 bus and transfers it to the host computer through a DTU. On the host computer, a VB program dynamically binds the Winsock control to receive and parse the data. Using dynamic data exchange, the Kingview configuration software provides history trend curves, real-time trend curves, alarms, printing, web browsing, and other functions.
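The receive-and-parse step on the host side might look like the sketch below. The flowmeter's actual frame format is not given in the paper, so the comma-separated ASCII frame used here is purely hypothetical:

```python
def parse_flow_frame(frame):
    """Parse one ASCII data frame from the flowmeter link.  The real
    protocol is not specified in the paper; a hypothetical frame
    'FLOW,<meter_id>,<rate_m3h>,<total_m3>' is assumed purely for
    illustration."""
    fields = frame.strip().split(",")
    if len(fields) != 4 or fields[0] != "FLOW":
        raise ValueError("malformed frame: %r" % frame)
    return {"meter_id": fields[1],
            "rate_m3h": float(fields[2]),
            "total_m3": float(fields[3])}

print(parse_flow_frame("FLOW,GF01,12.5,10342.7"))
```

In the described system this parsing is done by the VB/Winsock program; the sketch only conveys the shape of the task.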

  8. Determination of Shear Wave Velocity Structure in the Rio Grande Rift Through Receiver Function and Surface Wave Analysis. Appendix B

    DTIC Science & Technology

    1991-08-01

    source and receiver responses for constant ray parameter, Bull. Seism. Soc. Am. 67, 1029-1050, 1977. Langston, C. A., Structure under Mount Rainier ...the petrologic processes taking place within the rift. APPENDIX: LIST OF COMPUTER PROGRAMS USED IN THESIS. PROGRAM: RAY3D. AUTHOR: Dr. T.J... Lab. Rep., LA-8676-T, 218 pp., 1981. Baldridge, W. S., Petrology and petrogenesis of Plio-Pleistocene basaltic rocks from the central Rio Grande

  9. Permittivity and conductivity parameter estimations using full waveform inversion

    NASA Astrophysics Data System (ADS)

    Serrano, Jheyston O.; Ramirez, Ana B.; Abreo, Sergio A.; Sadler, Brian M.

    2018-04-01

    Full waveform inversion of Ground Penetrating Radar (GPR) data is a promising strategy for estimating quantitative characteristics of the subsurface such as permittivity and conductivity. In this paper, we propose a methodology that uses Full Waveform Inversion (FWI) in the time domain of 2D GPR data to obtain highly resolved images of the permittivity and conductivity parameters of the subsurface. FWI is an iterative method that requires a cost function to measure the misfit between observed and modeled data, a wave propagator to compute the modeled data, and an initial velocity model that is updated at each iteration until an acceptable decrease of the cost function is reached. FWI with GPR is computationally expensive because it is based on the computation of full electromagnetic wave propagation. Also, the commercially available acquisition systems use only one transmitter and one receiver antenna at zero offset, requiring a large number of shots to scan a single line.
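The iterative structure of FWI (forward model, misfit, gradient update) can be illustrated with a deliberately trivial scalar toy. The forward operator below is a stand-in for the (much costlier) electromagnetic propagator, not an actual GPR simulator:

```python
import numpy as np

rng = np.random.default_rng(1)
wavelet = rng.standard_normal(128)      # stand-in source wavelet

def forward(m):
    """Toy forward model: a single medium parameter m scales the
    wavelet.  In real FWI this would be full wave propagation."""
    return m * wavelet

def misfit(m, d_obs):
    """L2 cost between modeled and observed traces."""
    r = forward(m) - d_obs
    return 0.5 * np.dot(r, r)

m_true = 2.0
d_obs = forward(m_true)                 # 'observed' data
m = 0.5                                 # initial model
for _ in range(200):                    # gradient descent iterations
    grad = np.dot(forward(m) - d_obs, wavelet)   # dJ/dm for this toy
    m -= 0.005 * grad
print(round(m, 3))  # → 2.0
```

Real FWI replaces the scalar m with gridded permittivity/conductivity fields and computes the gradient with adjoint-state methods rather than this direct formula.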

  10. Computing the Dynamic Response of a Stratified Elastic Half Space Using Diffuse Field Theory

    NASA Astrophysics Data System (ADS)

    Sanchez-Sesma, F. J.; Perton, M.; Molina Villegas, J. C.

    2015-12-01

    The analytical solution for the dynamic response of an elastic half-space to a normal point load at the free surface is due to Lamb (1904). For a tangential force, we have Chao's (1960) formulae. For an arbitrary load at any depth within a stratified elastic half-space, the resulting elastic field can be given in the same fashion, by using an integral representation in the radial wavenumber domain. Typically, computations use the discrete wave number (DWN) formalism, and Fourier analysis allows for solution in the space and time domain. Experimentally, these elastic Green's functions might be retrieved from ambient vibration correlations when assuming a diffuse field. In practice, the field may not be totally diffuse, and only parts of the Green's functions, associated with surface or body waves, are retrieved. In this communication, we explore the computation of Green's functions for a layered medium on top of a half-space using a set of equipartitioned elastic plane waves. Our formalism includes body and surface waves (Rayleigh and Love waves). These latter waves correspond to the classical representations in terms of normal modes in the asymptotic case of large separation distance between source and receiver. This approach allows computing Green's functions faster than DWN and separating the surface and body wave contributions in order to better represent Green's functions experimentally retrieved.

  11. Effect of Virtual Reality on Cognition in Stroke Patients

    PubMed Central

    Kim, Bo Ryun; Kim, Lee Suk; Park, Ji Young

    2011-01-01

    Objective To investigate the effect of virtual reality on the recovery of cognitive impairment in stroke patients. Method Twenty-eight patients (11 males and 17 females, mean age 64.2) with cognitive impairment following stroke were recruited for this study. All patients were randomly assigned to one of two groups, the virtual reality (VR) group (n=15) or the control group (n=13). The VR group received both virtual reality training and computer-based cognitive rehabilitation, whereas the control group received only computer-based cognitive rehabilitation. To measure activities of daily living and cognitive and motor functions, the following assessment tools were used: the computerized neuropsychological test and the Tower of London (TOL) test for cognitive function assessment, the Korean-Modified Barthel index (K-MBI) for functional status evaluation, and the motricity index (MI) for motor function assessment. All recruited patients underwent these evaluations before rehabilitation and four weeks after rehabilitation. Results The VR group showed significant improvement in the K-MMSE, visual and auditory continuous performance tests (CPT), forward digit span test (DST), forward and backward visual span tests (VST), visual and verbal learning tests, TOL, K-MBI, and MI scores, while the control group showed significant improvement in the K-MMSE, forward DST, visual and verbal learning tests, trail-making test-type A, TOL, K-MBI, and MI scores after rehabilitation. The changes in the visual CPT and backward VST in the VR group after rehabilitation were significantly higher than those in the control group. Conclusion Our findings suggest that virtual reality training combined with computer-based cognitive rehabilitation may be of additional benefit for treating cognitive impairment in stroke patients. PMID:22506159

  12. Relationships between the generalized functional method and other methods of nonimaging optical design.

    PubMed

    Bortz, John; Shatz, Narkis

    2011-04-01

    The recently developed generalized functional method provides a means of designing nonimaging concentrators and luminaires for use with extended sources and receivers. We explore the mathematical relationships between optical designs produced using the generalized functional method and edge-ray, aplanatic, and simultaneous multiple surface (SMS) designs. Edge-ray and dual-surface aplanatic designs are shown to be special cases of generalized functional designs. In addition, it is shown that dual-surface SMS designs are closely related to generalized functional designs and that certain computational advantages accrue when the two design methods are combined. A number of examples are provided. © 2011 Optical Society of America

  13. Rehabilitation of hand in subacute tetraplegic patients based on brain computer interface and functional electrical stimulation: a randomised pilot study

    NASA Astrophysics Data System (ADS)

    Osuagwu, Bethel C. A.; Wallace, Leslie; Fraser, Mathew; Vuckovic, Aleksandra

    2016-12-01

    Objective. To compare neurological and functional outcomes between two groups of hospitalised patients with subacute tetraplegia. Approach. Seven patients received 20 sessions of brain computer interface (BCI) controlled functional electrical stimulation (FES) while five patients received the same number of sessions of passive FES for both hands. The neurological assessment measures were event related desynchronization (ERD) during movement attempt and somatosensory evoked potential (SSEP) of the ulnar and median nerve; assessment of hand function involved the range of motion (ROM) of the wrist and the manual muscle test. Main results. Patients in both groups initially had intense ERD during movement attempt that was not restricted to the sensory-motor cortex. Following the treatment, ERD cortical activity was restored towards the activity seen in able-bodied people in the BCI-FES group only, remaining wide-spread in the FES group. Likewise, SSEP returned in 3 patients in the BCI-FES group, with no changes in the FES group. The ROM of the wrist improved in both groups. Muscle strength significantly improved for both hands in the BCI-FES group. For the FES group, a significant improvement was noticed for right hand flexor muscles only. Significance. Combined BCI-FES therapy results in better neurological recovery and better improvement of muscle strength than FES alone. For spinal cord injured patients, BCI-FES should be considered as a therapeutic tool rather than solely a long-term assistive device for the restoration of a lost function.

  14. Crustal Structure of Iraq from Receiver Functions and Surface Wave Dispersion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gok, R; Mahdi, H; Al-Shukri, H

    2006-08-31

    We report the crustal structure of Iraq, located in the northeastern Arabian plate, estimated by joint inversion of P-wave receiver functions and surface wave group velocity dispersion. Receiver functions were computed from teleseismic recordings at two temporary broadband seismic stations in Mosul (MSL) and Baghdad (BHD), separated by approximately 360 km. Group velocity dispersion curves at the sites were derived from continental-scale tomography of Pasyanos (2006). The inversion results show that the crustal thicknesses are 39 km at MSL and 43 km at BHD. Both sites reveal low velocity surface layers consistent with sedimentary thickness of about 3 km at station MSL and 7 km at BHD, agreeing well with the existing models. Ignoring the sediments, the crustal velocities and thicknesses are remarkably similar between the two stations, suggesting that the crustal structure of the proto-Arabian Platform in northern Iraq was uniform before subsidence and deposition of the sediments in the Cenozoic. Deeper low velocity sediments at BHD are expected to result in higher ground motions for earthquakes.
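A common way to compute receiver functions like these is frequency-domain water-level deconvolution of the radial component by the vertical. The minimal sketch below illustrates that idea only, not the study's actual processing chain or its joint inversion with dispersion curves:

```python
import numpy as np

def receiver_function(radial, vertical, water_level=0.01):
    """Water-level deconvolution of the radial by the vertical
    component: R(w) * conj(Z(w)) / max(|Z|^2, wl * max|Z|^2).
    The water level stabilizes division at frequencies where the
    vertical spectrum is weak."""
    R, Z = np.fft.fft(radial), np.fft.fft(vertical)
    denom = np.maximum(np.abs(Z) ** 2,
                       water_level * np.max(np.abs(Z) ** 2))
    return np.real(np.fft.ifft(R * np.conj(Z) / denom))

# Synthetic check: radial = vertical delayed 10 samples and halved,
# so the receiver function should peak at lag 10.
z = np.exp(-0.5 * ((np.arange(256) - 50.0) / 3.0) ** 2)
r = 0.5 * np.roll(z, 10)
print(int(np.argmax(receiver_function(r, z))))  # → 10
```

In practice a low-pass (e.g. Gaussian) filter is also applied, and the peak lags map to converted-phase delays that constrain Moho depth.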

  15. Time delay and distance measurement

    NASA Technical Reports Server (NTRS)

    Abshire, James B. (Inventor); Sun, Xiaoli (Inventor)

    2011-01-01

    A method for measuring time delay and distance may include providing an electromagnetic radiation carrier frequency and modulating one or more of amplitude, phase, frequency, polarization, and pointing angle of the carrier frequency with a return to zero (RZ) pseudo random noise (PN) code. The RZ PN code may have a constant bit period and a pulse duration that is less than the bit period. A receiver may detect the electromagnetic radiation and calculate the scattering profile versus time (or range) by computing a cross correlation function between the recorded received signal and a three-state RZ PN code kernel in the receiver. The method also may be used for pulse delay time (i.e., PPM) communications.
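The correlation-based delay estimate can be sketched as follows. The patent does not spell out the three-state kernel, so the encoding assumed here (+1 during the pulse portion of a '1' bit, -1 during the pulse portion of a '0' bit, 0 in the return-to-zero portion) is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(7)
bits = rng.integers(0, 2, 255)              # PN-like bit sequence

samples_per_bit = 4                         # pulse duration < bit period
kernel = np.zeros(bits.size * samples_per_bit)
kernel[0::samples_per_bit] = 2 * bits - 1   # three-state RZ kernel:
kernel[1::samples_per_bit] = 2 * bits - 1   # +/-1 in pulse half, 0 after

tx = np.clip(kernel, 0, None)               # transmitted: pulses on '1' bits
delay = 37
rx = np.roll(tx, delay)                     # received, delayed copy

# Cross-correlate received signal with the three-state kernel; the
# peak lag estimates the delay (hence range).
corr = np.real(np.fft.ifft(np.fft.fft(rx) * np.conj(np.fft.fft(kernel))))
print(int(np.argmax(corr)))  # → 37
```

With noise and scattering, the full correlation trace approximates the scattering profile versus range rather than a single clean peak.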

  16. High-speed free-space based reconfigurable card-to-card optical interconnects with broadcast capability.

    PubMed

    Wang, Ke; Nirmalathas, Ampalavanapillai; Lim, Christina; Skafidas, Efstratios; Alameh, Kamal

    2013-07-01

    In this paper, we propose and experimentally demonstrate a free-space based high-speed reconfigurable card-to-card optical interconnect architecture with broadcast capability, which is required for control functionalities and efficient parallel computing applications. Experimental results show that 10 Gb/s data can be broadcast to all receiving channels for up to 30 cm with a worst-case receiver sensitivity better than -12.20 dBm. In addition, arbitrary multicasting with the same architecture is also investigated. 10 Gb/s reconfigurable point-to-point link and multicast channels are simultaneously demonstrated with a measured receiver sensitivity power penalty of ~1.3 dB due to crosstalk.

  17. Performance of synchronous optical receivers using atmospheric compensation techniques.

    PubMed

    Belmonte, Aniceto; Kahn, Joseph

    2008-09-01

    We model the impact of atmospheric turbulence-induced phase and amplitude fluctuations on free-space optical links using synchronous detection. We derive exact expressions for the probability density function of the signal-to-noise ratio in the presence of turbulence. We consider the effects of log-normal amplitude fluctuations and Gaussian phase fluctuations, in addition to local oscillator shot noise, for both passive receivers and those employing active modal compensation of wave-front phase distortion. We compute error probabilities for M-ary phase-shift keying, and evaluate the impact of various parameters, including the ratio of receiver aperture diameter to the wave-front coherence diameter, and the number of modes compensated.
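The flavor of such error-probability computations can be conveyed by a Monte Carlo sketch that averages the BPSK error rate over log-normal intensity fluctuations only. The paper's analysis additionally treats Gaussian phase errors, modal compensation, and M-ary PSK, none of which are modeled here:

```python
import numpy as np
from math import erfc, sqrt

def ber_lognormal(gamma_bar, sigma_x=0.3, n=100_000, seed=3):
    """Average BPSK error probability over log-normal intensity
    scintillation.  Log-amplitude chi ~ N(-sigma_x^2, sigma_x^2) so
    that the mean intensity is 1; gamma_bar is the mean SNR."""
    rng = np.random.default_rng(seed)
    chi = rng.normal(-sigma_x ** 2, sigma_x, n)
    gammas = gamma_bar * np.exp(2 * chi)
    return float(np.mean([0.5 * erfc(sqrt(g)) for g in gammas]))

print(ber_lognormal(10.0))  # fading raises the BER well above the
                            # unfaded 0.5*erfc(sqrt(10)) ≈ 3.9e-6
```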

  18. Functional recovery in upper limb function in stroke survivors by using brain-computer interface A single case A-B-A-B design.

    PubMed

    Ono, Takashi; Mukaino, Masahiko; Ushiba, Junichi

    2013-01-01

    Recent studies suggest that brain-computer interface (BCI) training for chronic stroke patients is useful for improving the motor function of the paretic hand. However, these studies do not clearly show the extent of the BCI's contribution, because they combined BCI with other rehabilitation systems, e.g. an orthosis, a robotic intervention, or electrical stimulation. We therefore compared neurological effects between interventions with neuromuscular electrical stimulation (NMES) with motor imagery and BCI-driven NMES, employing an ABAB experimental design. In epoch A, the subject received NMES on the paretic extensor digitorum communis (EDC). The subject was asked to attempt finger extension simultaneously. In epoch B, the subject received NMES when the BCI system detected a motor-related electroencephalogram change while attempting motor imagery. Both epochs were carried out for 60 min per day, 5 days per week. As a result, EMG activity of the EDC was enhanced by BCI-driven NMES and significant cortico-muscular coherence was observed at the final evaluation. These results indicate that training by BCI-driven NMES is effective even compared to motor imagery combined with NMES, suggesting the superiority of closed-loop training with BCI-driven NMES over open-loop NMES for chronic stroke patients.

  19. The Lithospheric Structure Beneath Canary Islands from Receiver Function Analysis

    NASA Astrophysics Data System (ADS)

    Martinez-Arevalo, C.; Mancilla, F.; Helffrich, G. R.; Garcia, A.

    2009-12-01

    The Canary Archipelago is located a few hundred kilometers off the western Moroccan coast, extending 450 km west-to-east. It is composed of seven main islands. All but one have been active in the last million years. The origin of the Canary Islands is not well established, and local and regional geological features cannot be completely explained by the current models. The main aim of this study is to provide new data that help us to understand and constrain the archipelago's origin and tectonic evolution. The crustal structure under each station is obtained by applying the P-receiver function technique to the teleseismic P arrivals recorded by the broadband seismic network installed in the Canary Islands by the Instituto Geográfico Nacional (IGN) and two temporary stations (MIDSEA and IRIS). We computed receiver functions using the Extended-Time Multitaper Frequency Domain Cross-Correlation Receiver Function (ET-MTRF) method. The results show that the crust is thicker, around 22 km, in the eastern islands (Fuerteventura and Lanzarote) than in the western ones (El Hierro, La Palma, Tenerife), around 17 km, with the exception of La Gomera island. This island, located in the west, exhibits a crustal structure similar to that of Fuerteventura and Lanzarote. A discontinuity at 70-80 km, possibly the LAB (Lithosphere-Asthenosphere Boundary), is clearly observed in all the stations. It appears that Moho depths do not track the LAB discontinuity.

  20. C-Speak Aphasia Alternative Communication Program for People with Severe Aphasia: Importance of Executive Functioning and Semantic Knowledge

    PubMed Central

    Nicholas, Marjorie; Sinotte, Michele P.; Helm-Estabrooks, Nancy

    2011-01-01

    Learning how to use a computer-based communication system can be challenging for people with severe aphasia even if the system is not word-based. This study explored cognitive and linguistic factors relative to how they affected individual patients’ ability to communicate expressively using C-Speak Aphasia, (CSA), an alternative communication computer program that is primarily picture-based. Ten individuals with severe non-fluent aphasia received at least six months of training with CSA. To assess carryover of training, untrained functional communication tasks (i.e., answering autobiographical questions, describing pictures, making telephone calls, describing a short video, and two writing tasks) were repeatedly probed in two conditions: 1) using CSA in addition to natural forms of communication, and 2) using only natural forms of communication, e.g., speaking, writing, gesturing, drawing. Four of the ten participants communicated more information on selected probe tasks using CSA than they did without the computer. Response to treatment also was examined in relation to baseline measures of non-linguistic executive function skills, pictorial semantic abilities, and auditory comprehension. Only nonlinguistic executive function skills were significantly correlated with treatment response. PMID:21506045

  1. Automatic system of collection of parameters and control of receiving equipment of the radiotelescope of VLBI complex "Quasar "

    NASA Astrophysics Data System (ADS)

    Syrovoy, Sergey

    At present, very long baseline interferometry (VLBI) is becoming increasingly globalized, turning into a worldwide network of observation posts, so the inclusion of the developing Russian "Quasar" system into the world VLBI community is of great importance to us. An important part of operating a radio telescope within a VLBI network is ensuring optimal interaction of its subsystems, which can only be achieved by automating the whole observation process. The possibility of RTF-32 participating in international VLBI observing sessions is taken into account in the system development; these observations follow the established experimental technology based on the Mark-IV Field System. In this paper, the description and the structural and functional schemes of the system for automatic collection of parameters and control of the receiving complex of the RTF-32 radio telescope are given. This system is intended to solve the stated problem. The most important tasks of the system being developed are to provide remote monitoring and control of the following radio telescope systems: 1. the receiver system, consisting of five dual-channel radiometers for the 21-18 cm, 13 cm, 6 cm, 3.5 cm, and 1.35 cm bands; 2. the radio telescope pointing system; 3. the frequency-time synchronization system, consisting of a hydrogen frequency standard, a system of ultrahigh-frequency oscillators, and generators of picosecond impulses; 4. the signal transformation system; 5. the signal registration system; 6. the system for measuring the electrical characteristics of the atmosphere; 7. the power supply system. The part of the automatic system providing remote monitoring and control of the radio telescope pointing system, both in local mode and when working under the control of the Field System computer, has been put into operation and is currently functioning.
Now the part of the automatic system ensuring the checking and control of receiving system of radiotelescope is being developed. The functional scheme has been designed. The experimental model of the device of connection of control PC with the terminal has been produced. The algorithms of receiver control in the different modes of observation have been developed. The questions of interaction with the computer Field System have been solved. The radiotelescope RTF-32 is capable of functioning in two modes such as radio-astronomical and radio-interferometrical. The control of the transformation signal system and the registration signal system in these modes is different and is entrusted with the Field System computer. The automation of collection of the meteorological data and parameters of the power supply system of the radiotelescope is last stage of the development of the presented system.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pantelis, Evaggelos, E-mail: vpantelis@phys.uoa.g; Medical Physics Laboratory, Medical School, University of Athens, Athens; Papadakis, Nikolaos

    Purpose: To study the efficacy of the integration of functional magnetic resonance imaging (fMRI) and diffusion tensor imaging tractography data into stereotactic radiosurgery clinical practice. Methods and Materials: fMRI and tractography data sets were acquired and fused with corresponding anatomical MR and computed tomography images of patients with arteriovenous malformation (AVM), astrocytoma, brain metastasis, or hemangioma and referred for stereotactic radiosurgery. The acquired data sets were imported into a CyberKnife stereotactic radiosurgery system and used to delineate the target, organs at risk, and nearby functional structures and fiber tracts. Treatment plans with and without the incorporation of the functional structures and the fiber tracts into the optimization process were developed and compared. Results: The nearby functional structures and fiber tracts could receive doses of >50% of the maximum dose if they were excluded from the planning process. In the AVM case, the doses received by the Brodmann-17 structure and the optic tract were reduced to 700 cGy from 1,400 cGy and to 1,200 cGy from 2,000 cGy, respectively, upon inclusion into the optimization process. In the metastasis case, the motor cortex received 850 cGy instead of 1,400 cGy; and in the hemangioma case, the pyramidal tracts received 780 cGy instead of 990 cGy. In the astrocytoma case, the dose to the motor cortex bordering the lesion was reduced to 1,900 cGy from 2,100 cGy, and therefore, the biologically equivalent dose in three fractions was delivered instead. Conclusions: Functional structures and fiber tracts could receive high doses if they were not considered during treatment planning. With the aid of fMRI and tractography images, they can be delineated and spared.

  3. Deterrence of device counterfeiting, cloning, and subversion by substitution using hardware fingerprinting

    DOEpatents

    Hamlet, Jason R; Bauer, Todd M; Pierson, Lyndon G

    2014-09-30

    Deterrence of device subversion by substitution may be achieved by including a cryptographic fingerprint unit within a computing device for authenticating a hardware platform of the computing device. The cryptographic fingerprint unit includes a physically unclonable function ("PUF") circuit disposed in or on the hardware platform. The PUF circuit is used to generate a PUF value. A key generator is coupled to generate a private key and a public key based on the PUF value while a decryptor is coupled to receive an authentication challenge posed to the computing device and encrypted with the public key and coupled to output a response to the authentication challenge decrypted with the private key.
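The PUF-based challenge-response flow above can be sketched in a few lines. This is a toy model only: the hardware PUF is simulated with a hash of a "device secret," and a single symmetric HMAC key stands in for the public/private key pair the patent derives (all names here are hypothetical):

```python
import hashlib
import hmac

def puf_value(device_secret: bytes) -> bytes:
    # Stand-in for the PUF circuit: on real hardware this value arises
    # from unclonable manufacturing variation, not from stored data.
    return hashlib.sha256(b"puf:" + device_secret).digest()

def derive_key(puf: bytes) -> bytes:
    # The patent derives an asymmetric key pair from the PUF value;
    # a symmetric key is used here to keep the sketch short.
    return hashlib.sha256(b"key:" + puf).digest()

def respond(device_secret: bytes, challenge: bytes) -> bytes:
    # Device side: regenerate the key on the fly from the PUF and
    # answer the authentication challenge with a keyed MAC.
    key = derive_key(puf_value(device_secret))
    return hmac.new(key, challenge, hashlib.sha256).digest()

# Verifier side holds the key recorded at enrollment time.
enrolled = derive_key(puf_value(b"genuine-die"))
challenge = b"\x01" * 16
expected = hmac.new(enrolled, challenge, hashlib.sha256).digest()

assert hmac.compare_digest(respond(b"genuine-die", challenge), expected)
# A cloned or substituted die yields a different PUF value and fails:
assert not hmac.compare_digest(respond(b"cloned-die", challenge), expected)
```

The point of the scheme is the last assertion: a substituted device cannot reproduce the PUF value, so it cannot answer the challenge.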

  4. The computational neurobiology of learning and reward.

    PubMed

    Daw, Nathaniel D; Doya, Kenji

    2006-04-01

    Following the suggestion that midbrain dopaminergic neurons encode a signal, known as a 'reward prediction error', used by artificial intelligence algorithms for learning to choose advantageous actions, the study of the neural substrates for reward-based learning has been strongly influenced by computational theories. In recent work, such theories have been increasingly integrated into experimental design and analysis. Such hybrid approaches have offered detailed new insights into the function of a number of brain areas, especially the cortex and basal ganglia. In part this is because these approaches enable the study of neural correlates of subjective factors (such as a participant's beliefs about the reward to be received for performing some action) that the computational theories purport to quantify.

  5. Associations of Ischemic Lesion Volume With Functional Outcome in Patients With Acute Ischemic Stroke: 24-Hour Versus 1-Week Imaging.

    PubMed

    Bucker, Amber; Boers, Anna M; Bot, Joseph C J; Berkhemer, Olvert A; Lingsma, Hester F; Yoo, Albert J; van Zwam, Wim H; van Oostenbrugge, Robert J; van der Lugt, Aad; Dippel, Diederik W J; Roos, Yvo B W E M; Majoie, Charles B L M; Marquering, Henk A

    2017-05-01

    Ischemic lesion volume (ILV) on noncontrast computed tomography at 1 week can be used as a secondary outcome measure in patients with acute ischemic stroke. Twenty-four-hour ILV on noncontrast computed tomography has greater availability and potentially allows earlier estimation of functional outcome. We aimed to assess lesion growth 24 hours after stroke onset and compare the associations of 24-hour and 1-week ILV with functional outcome. We included 228 patients from the MR CLEAN trial (Multicenter Randomized Clinical Trial of Endovascular Treatment for Acute Ischemic Stroke in the Netherlands), who received noncontrast computed tomography at 24-hour and 1-week follow-up on which ILV was measured. Relative and absolute lesion growth was determined. Logistic regression models were constructed including either the 24-hour or the 1-week ILV. Ordinal and dichotomous (0-2 and 3-6) modified Rankin scale scores were, respectively, used as primary and secondary outcome measures. Median ILV was 42 mL (interquartile range, 21-95 mL) and 64 mL (interquartile range, 30-120 mL) at 24 hours and 1 week, respectively. Relative lesion growth exceeding 30% occurred in 121 patients (53%) and absolute lesion growth exceeding 20 mL occurred in 83 patients (36%). Both the 24-hour and 1-week ILVs were significantly associated with functional outcome (both P <0.001). In the logistic analyses, the areas under the curve of the receiver-operating characteristic curves were similar: 0.85 (95% confidence interval, 0.80-0.90) and 0.87 (95% confidence interval, 0.82-0.91) for the models including the 24-hour and 1-week ILV, respectively. Growth of ILV after 24 hours poststroke onset is common. Nevertheless, the 24-hour ILV proved to be a valuable secondary outcome measure, as it is as strongly associated with functional outcome as the 1-week ILV. URL: http://www.isrctn.com. Unique identifier: ISRCTN10888758. © 2017 American Heart Association, Inc.
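The area under the receiver-operating characteristic curve reported here has a simple rank interpretation: it is the probability that a randomly chosen patient with a poor outcome has a higher predicted risk than one with a good outcome. A minimal sketch, with made-up ILV values in place of the study's data:

```python
def roc_auc(scores_pos, scores_neg):
    """AUC as the Mann-Whitney statistic: the fraction of
    (positive, negative) pairs where the positive case scores
    higher (ties count as 0.5)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical ILVs (mL): poor-outcome vs good-outcome patients
auc = roc_auc([95, 120, 64, 30], [21, 42, 80, 15])
# auc = 0.8125: 13 of the 16 pairs are correctly ordered
```

An AUC of 0.85 as in the study means 85% of such pairs are ordered correctly by the model's predicted risk.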

  6. The application of simulation modeling to the cost and performance ranking of solar thermal power plants

    NASA Technical Reports Server (NTRS)

    Rosenberg, L. S.; Revere, W. R.; Selcuk, M. K.

    1981-01-01

    A computer simulation code was employed to evaluate several generic types of solar power systems (up to 10 MWe). Details of the simulation methodology and the solar plant concepts are given along with cost and performance results. The Solar Energy Simulation computer code (SESII) was used, which optimizes the size of the collector field and energy storage subsystem for given engine-generator and energy-transport characteristics. Nine plant types were examined which employed combinations of different technology options, such as: distributed or central receivers with one- or two-axis tracking or no tracking; point- or line-focusing concentrators; central or distributed power conversion; Rankine, Brayton, or Stirling thermodynamic cycles; and thermal or electrical storage. Optimal cost curves were plotted as a function of levelized busbar energy cost and annualized plant capacity. Point-focusing distributed receiver systems were found to be most efficient (17-26 percent).

  7. Implementation of an experimental program to investigate the performance characteristics of OMEGA navigation

    NASA Technical Reports Server (NTRS)

    Baxa, E. G., Jr.

    1974-01-01

    A theoretical formulation of differential and composite OMEGA error is presented to establish hypotheses about the functional relationships between various parameters and OMEGA navigational errors. Computer software developed to provide for extensive statistical analysis of the phase data is described. Results from the regression analysis used to conduct parameter sensitivity studies on differential OMEGA error tend to validate the theoretically based hypothesis concerning the relationship between uncorrected differential OMEGA error and receiver separation range and azimuth. Limited results of measurement of receiver repeatability error and line of position measurement error are also presented.

  8. Quantitative relation between server motion and receiver anticipation in tennis: implications of responses to computer-simulated motions.

    PubMed

    Ida, Hirofumi; Fukuhara, Kazunobu; Sawada, Misako; Ishii, Motonobu

    2011-01-01

    The purpose of this study was to determine the quantitative relationships between the server's motion and the receiver's anticipation using a computer graphic animation of tennis serves. The test motions were determined by capturing the motion of a model player and estimating the computational perturbations caused by modulating the rotation of the player's elbow and forearm joints. Eight experienced and eight novice players rated their anticipation of the speed, direction, and spin of the ball on a visual analogue scale. The experienced players significantly altered some of their anticipatory judgments depending on the percentage of both the forearm and elbow modulations, while the novice players indicated no significant changes. Multiple regression analyses, including the racket's kinematic parameters immediately before racket-ball impact as independent variables, showed that the experienced players demonstrated a higher coefficient of determination than the novice players in their anticipatory judgment of the ball direction. The results have implications for the understanding of the functional relation between a player's motion and the opponent's anticipatory judgment during real play.

  9. Adaptation of a Control Center Development Environment for Industrial Process Control

    NASA Technical Reports Server (NTRS)

    Killough, Ronnie L.; Malik, James M.

    1994-01-01

    In the control center, raw telemetry data is received for storage, display, and analysis. This raw data must be combined and manipulated in various ways by mathematical computations to facilitate analysis, provide diversified fault detection mechanisms, and enhance display readability. A development tool called the Graphical Computation Builder (GCB) has been implemented which provides flight controllers with the capability to implement computations for use in the control center. The GCB provides a language that contains both general programming constructs and language elements specifically tailored for the control center environment. The GCB concept allows staff who are not skilled in computer programming to author and maintain computer programs. The GCB user is isolated from the details of external subsystem interfaces and has access to high-level functions such as matrix operators, trigonometric functions, and unit conversion macros. The GCB provides a high level of feedback during computation development that improves upon the often cryptic errors produced by computer language compilers. An equivalent need can be identified in the industrial data acquisition and process control domain: that of an integrated graphical development tool tailored to the application to hide the operating system, computer language, and data acquisition interface details. The GCB features a modular design which makes it suitable for technology transfer without significant rework. Control center-specific language elements can be replaced by elements specific to industrial process control.

  10. Older men who use computers have lower risk of dementia.

    PubMed

    Almeida, Osvaldo P; Yeap, Bu B; Alfonso, Helman; Hankey, Graeme J; Flicker, Leon; Norman, Paul E

    2012-01-01

    To determine if older men who use computers have lower risk of developing dementia. Cohort study of 5506 community-dwelling men aged 69 to 87 years followed for up to 8.5 years. Use of computers was measured as daily, weekly, less than weekly, or never. Participants also reported their use of email, internet, word processors, games, or other computer activities. The primary outcome was the incidence of an ICD-10 diagnosis of dementia as recorded by the Western Australian Data Linkage System. 1857/5506 (33.7%) men reported using computers and 347 (6.3%) received a diagnosis of dementia during an average follow-up of 6.0 years (range: 6 months to 8.5 years). The hazard ratio (HR) of dementia was lower among computer users than non-users (HR = 0.62, 95%CI = 0.47-0.81, after adjustment for age, educational attainment, size of social network, and presence of depression or of significant clinical morbidity). The HR of dementia appeared to decrease with increasing frequency of computer use: 0.68 (95%CI = 0.41-1.13), 0.61 (95%CI = 0.39-0.94) and 0.59 (95%CI = 0.40-0.87) for less than weekly, at least weekly, and daily use, respectively. The HR of dementia was 0.66 (95%CI = 0.50-0.86) after the analysis was further adjusted for baseline cognitive function, as measured by the Mini-Mental State Examination. Older men who use computers have lower risk of receiving a diagnosis of dementia up to 8.5 years later. Randomised trials are required to determine if the observed associations are causal.

  11. Method for transferring data from an unsecured computer to a secured computer

    DOEpatents

    Nilsen, Curt A.

    1997-01-01

    A method is described for transferring data from an unsecured computer to a secured computer. The method includes transmitting the data and then receiving the data. Next, the data is retransmitted and rereceived. Then, it is determined if errors were introduced when the data was transmitted by the unsecured computer or received by the secured computer. Similarly, it is determined if errors were introduced when the data was retransmitted by the unsecured computer or rereceived by the secured computer. A warning signal is emitted from a warning device coupled to the secured computer if (i) an error was introduced when the data was transmitted or received, and (ii) an error was introduced when the data was retransmitted or rereceived.
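The transmit/retransmit comparison described in this patent record can be sketched as follows. This is a minimal model only: the patent does not specify an error-detection mechanism, so CRC-32 framing (names `frame`/`receive` hypothetical) stands in for whatever check the real method uses:

```python
import zlib

def frame(data: bytes):
    # Sender side (unsecured computer): attach a CRC so the receiver
    # can tell whether errors were introduced in transit.
    return data, zlib.crc32(data)

def receive(copy1, copy2):
    # Secured-computer side: the data is transmitted and then
    # retransmitted; accept any copy whose CRC verifies, and raise
    # the warning signal only when BOTH copies arrived corrupted.
    ok1 = zlib.crc32(copy1[0]) == copy1[1]
    ok2 = zlib.crc32(copy2[0]) == copy2[1]
    if ok1:
        return copy1[0], False   # (data, warning)
    if ok2:
        return copy2[0], False
    return None, True            # emit warning signal

# First copy corrupted in transit, retransmission arrives intact:
data, warn = receive((b"telemetrX", zlib.crc32(b"telemetry")),
                     frame(b"telemetry"))
assert data == b"telemetry" and warn is False
```

Requiring errors on both the transmission and the retransmission before warning is what distinguishes a transient channel fault from a systematic problem.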

  12. Low latency, high bandwidth data communications between compute nodes in a parallel computer

    DOEpatents

    Archer, Charles J.; Blocksome, Michael A.; Ratterman, Joseph D.; Smith, Brian E.

    2010-11-02

    Methods, parallel computers, and computer program products are disclosed for low latency, high bandwidth data communications between compute nodes in a parallel computer. Embodiments include receiving, by an origin direct memory access (`DMA`) engine of an origin compute node, data for transfer to a target compute node; sending, by the origin DMA engine of the origin compute node to a target DMA engine on the target compute node, a request to send (`RTS`) message; transferring, by the origin DMA engine, a predetermined portion of the data to the target compute node using a memory FIFO operation; determining, by the origin DMA engine, whether an acknowledgement of the RTS message has been received from the target DMA engine; if an acknowledgement of the RTS message has not been received, transferring, by the origin DMA engine, another predetermined portion of the data to the target compute node using a memory FIFO operation; and if the acknowledgement of the RTS message has been received by the origin DMA engine, transferring, by the origin DMA engine, any remaining portion of the data to the target compute node using a direct put operation.
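The hand-off described above (stream eager FIFO chunks until the RTS acknowledgement arrives, then finish with one direct put) can be sketched as a toy state machine. All names are hypothetical; the real protocol runs inside the DMA engines, and network latency is modeled here by the number of chunks sent before the ack is observed:

```python
CHUNK = 4  # bytes per memory-FIFO chunk (arbitrary for the sketch)

def transfer(data: bytes, ack_after: int):
    """Origin-side sketch: send RTS, stream FIFO chunks until the
    target's ack arrives (after `ack_after` chunks), then send any
    remainder as a single direct put. Returns the operation log."""
    log = [("rts", None)]
    sent = 0
    chunks = 0
    while sent < len(data) and chunks < ack_after:
        log.append(("fifo", data[sent:sent + CHUNK]))
        sent += CHUNK
        chunks += 1
    if sent < len(data):
        log.append(("direct_put", data[sent:]))
    return log

log = transfer(b"0123456789AB", ack_after=2)
# Two eager FIFO chunks cover the ack latency, then the remainder
# goes in one direct put:
assert log == [("rts", None), ("fifo", b"0123"),
               ("fifo", b"4567"), ("direct_put", b"89AB")]
```

The design point is latency hiding: useful data flows during the RTS round trip instead of the origin idling until the acknowledgement arrives.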

  13. Clinical usefulness of brain-computer interface-controlled functional electrical stimulation for improving brain activity in children with spastic cerebral palsy: a pilot randomized controlled trial.

    PubMed

    Kim, Tae-Woo; Lee, Byoung-Hee

    2016-09-01

    [Purpose] Evaluating the effect of brain-computer interface (BCI)-based functional electrical stimulation (FES) training on brain activity in children with spastic cerebral palsy (CP) was the aim of this study. [Subjects and Methods] Subjects were randomized into a BCI-FES group (n=9) and a functional electrical stimulation (FES) control group (n=9). Subjects in the BCI-FES group received wrist and hand extension training with FES for 30 minutes per day, 5 times per week for 6 weeks under the BCI-based program. The FES group received wrist and hand extension training with FES for the same amount of time. Sensorimotor rhythms (SMR) and middle beta waves (M-beta) were measured in frontopolar regions 1 and 2 (Fp1, Fp2) to determine the effects of BCI-FES training. [Results] Significant improvements in the SMR and M-beta of Fp1 and Fp2 were seen in the BCI-FES group. In contrast, significant improvement was only seen in the SMR and M-beta of Fp2 in the control group. [Conclusion] The results of the present study suggest that BCI-controlled FES training may be helpful in improving brain activity in patients with cerebral palsy and may be applied as effectively as traditional FES training.

  14. Seismic Velocity Assessment In The Kachchh Region, India, From Multiple Waveform Functionals

    NASA Astrophysics Data System (ADS)

    Ghosh, R.; Sen, M. K.; Mandal, P.; Pulliam, J.; Agrawal, M.

    2014-12-01

    The primary goal of this study is to estimate a well-constrained crust and upper mantle seismic velocity structure in the Kachchh region of Gujarat, India, an area of active interest for earthquake monitoring purposes. Several models based on stand-alone surface wave dispersion and receiver function modeling exist in this area. Here we jointly model receiver functions, surface wave dispersion, and the S and shear-coupled PL wavetrains using broadband seismograms of deep (150-700 km), moderate-to-large-magnitude (5.5-6.8) earthquakes recorded teleseismically at semi-permanent seismograph stations in the Kachchh region. While surface wave dispersion and receiver function modeling is computationally fast, full waveform modeling makes use of reflectivity synthetic seismograms. An objective function that measures the misfit to all three data types is minimized using a very fast simulated annealing (VFSA) approach. The surface wave and receiver function data help reduce the model search space, which is then explored extensively for detailed waveform fitting. Our estimated crustal and lithospheric thicknesses in this region vary from 32 to 41 km and 70 to 80 km, respectively, while crustal P and S velocities from the surface to the Moho discontinuity vary from 4.7 to 7.0 km/s and 2.7 to 4.1 km/s, respectively. Our modeling clearly reveals a zone of crustal as well as asthenospheric upwarping underlying the Kachchh rift zone relative to the surrounding unrifted area. We believe that this feature plays a key role in the seismogenesis of lower crustal earthquakes in the region through the emanation of volatile CO2, liberated by the crystallization of carbonatite melts in the asthenosphere, into the hypocentral zones. Such a crust-mantle structure might be related to plume-lithosphere interaction during the Deccan/Reunion plume episode (~65 Ma).
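The very fast simulated annealing (VFSA) minimization used here can be sketched generically. The misfit below is a two-parameter toy (a Vs and a Moho depth) standing in for the study's joint receiver-function/dispersion/waveform objective; the perturbation and cooling schedule follow the standard VFSA form, but all constants are illustrative:

```python
import math
import random

def vfsa(misfit, lo, hi, iters=3000, t0=1.0, c=1.0, seed=7):
    """VFSA sketch: temperature-dependent Cauchy-like perturbations
    with cooling schedule T_k = t0 * exp(-c * k**(1/ndim))."""
    rng = random.Random(seed)
    ndim = len(lo)
    x = [rng.uniform(l, h) for l, h in zip(lo, hi)]
    fx = misfit(x)
    best, fbest = x[:], fx
    for k in range(1, iters + 1):
        t = t0 * math.exp(-c * k ** (1.0 / ndim))
        cand = []
        for xi, l, h in zip(x, lo, hi):
            u = rng.random()
            # VFSA step: mostly small moves, occasional long jumps
            y = math.copysign(t * ((1 + 1 / t) ** abs(2 * u - 1) - 1),
                              u - 0.5)
            cand.append(min(h, max(l, xi + y * (h - l))))
        fc = misfit(cand)
        # Metropolis acceptance at the current temperature
        if fc < fx or rng.random() < math.exp(-(fc - fx) / max(t, 1e-12)):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x[:], fx
    return best, fbest

# Toy misfit with minimum at Vs = 3.5 km/s, Moho = 35 km (hypothetical)
def misfit(m):
    vs, moho = m
    return (vs - 3.5) ** 2 + 0.01 * (moho - 35.0) ** 2

model, err = vfsa(misfit, lo=[2.0, 20.0], hi=[5.0, 50.0])
```

In the study, each misfit evaluation requires computing synthetic receiver functions, dispersion curves, and reflectivity seismograms for the candidate model, which is what makes the search-space reduction from the faster data types valuable.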

  15. Computer-assisted design/computer-assisted manufacturing zirconia implant fixed complete prostheses: clinical results and technical complications up to 4 years of function.

    PubMed

    Papaspyridakos, Panos; Lal, Kunal

    2013-06-01

    To report the clinical results and technical complications with computer-assisted design/computer-assisted manufacturing (CAD/CAM) zirconia implant fixed complete dental prostheses (IFCDPs) after 2-4 years in function. Fourteen consecutive edentulous patients (16 edentulous arches) were included in this study. Ten of the patients were women and four were men, with an average age of 58 years (range: 35-71). Ten mandibular and six maxillary arches were restored with porcelain-fused-to-zirconia (PFZ) IFCDPs. Of the 16 arches, 14 received one-piece and 2 received segmented two-piece IFCDPs. The mean clinical follow-up period was 3 years (range: 2-4). At the last recall appointment, biological and technical parameters of dental implant treatment were evaluated. The implant and prosthesis survival rate following prosthesis insertion was 100% up to 4-year follow-up. The prostheses in 11 of the 16 restored arches were structurally sound and exhibited favorable soft tissue response, esthetics, and patient satisfaction. Five IFCDPs (31.25%) in four patients exhibited porcelain veneer chipping. Chipping was minor in three prostheses (three patients) and was addressed intraorally with polishing (one prosthesis) or composite resin (two prostheses). One patient with maxillary and mandibular zirconia IFCDPs exhibited major porcelain chipping fractures that had to be repaired in the laboratory. Function, esthetics, and patient satisfaction were not affected in three of the four fracture incidents. Median crestal bone loss was 0.1 mm (0.01-0.2 mm). The presence of parafunctional activity, an IFCDP as opposing dentition, and the absence of an occlusal night guard were associated with all the incidents of ceramic chipping. CAD/CAM zirconia IFCDPs are a viable prosthetic treatment after 2-4 years in function, but not without complications. Porcelain chipping/fracture was the most frequent technical complication, with a 31.25% chipping rate at the prosthesis level.
Despite the technical complications, increased patient satisfaction was noted. © 2012 John Wiley & Sons A/S.

  16. High-resolution receiver function imaging reveals Colorado Plateau lithospheric architecture and mantle-supported topography

    USGS Publications Warehouse

    Domingo, Dorothy L.; Aster, R.; Grand, S.; Ni, J.; Baldridge, W. S.; Wilson, David C.

    2010-01-01

    After maintaining elevations near sea level for over 500 million years, the Colorado Plateau (CP) has a present average elevation of 2 km. We compute new receiver function images from the first dense seismic transect to cross the plateau that reveal a central CP crustal thickness of 42–50 km thinning to 30–35 km at the CP margins. Isostatic calculations show that only approximately 20% of central CP elevations can be explained by thickened crust alone, with the CP edges requiring nearly total mantle compensation. We calculate an uplift budget showing that CP buoyancy arises from a combination of crustal thickening, heating and alteration of the lithospheric root, dynamic support from mantle upwelling, and significant buoyant edge effects produced by small-scale convecting asthenosphere at its margins.
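The isostatic calculation referenced above follows the standard Airy formula: elevation supported by a crustal root is the thickness excess scaled by the density contrast. A minimal sketch with assumed densities and an assumed sea-level reference thickness (the study's own values may differ, which is why its crust-alone fraction is ~20%):

```python
RHO_CRUST = 2800.0   # kg/m^3, assumed crustal density
RHO_MANTLE = 3300.0  # kg/m^3, assumed mantle density

def airy_elevation(crust_km, ref_crust_km):
    """Elevation (km) supported in Airy isostasy by crust thicker
    than a sea-level reference column:
    e = (t - t_ref) * (rho_m - rho_c) / rho_m."""
    return (crust_km - ref_crust_km) * (RHO_MANTLE - RHO_CRUST) / RHO_MANTLE

# Hypothetical numbers: 46 km central CP crust vs. a 40 km reference
predicted = airy_elevation(46.0, 40.0)   # ~0.91 km of crust-supported elevation
# Fraction of the observed ~2 km elevation explained by crust alone:
frac = predicted / 2.0
```

Whatever the assumed constants, the shortfall between the crust-supported elevation and the observed 2 km is the part that must be compensated in the mantle, which is the paper's argument.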

  17. Efficient Evaluation Functions for Multi-Rover Systems

    NASA Technical Reports Server (NTRS)

    Agogino, Adrian; Tumer, Kagan

    2004-01-01

    Evolutionary computation can be a powerful tool for creating a control policy for a single agent receiving local continuous input. This paper extends single-agent evolutionary computation to multi-agent systems, where a collection of agents strives to maximize a global fitness evaluation function that rates the performance of the entire system. This problem is solved in a distributed manner, where each agent evolves its own population of neural networks that are used as the control policies for the agent. Each agent evolves its population using its own agent-specific fitness evaluation function. We propose to create these agent-specific evaluation functions using the theory of collectives to avoid the coordination problem in which each agent evolves a population that maximizes its own fitness function, yet the system as a whole achieves low values of the global fitness function. Instead, we ensure that each fitness evaluation function is both "aligned" with the global evaluation function and "learnable," i.e., the agents can readily see how their behavior affects their evaluation function. We then show how these agent-specific evaluation functions outperform global evaluation methods by up to 600% in a domain where a set of rovers attempts to maximize the amount of information observed while navigating through a simulated environment.
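A common way to build "aligned" and "learnable" agent-specific evaluations in the theory of collectives is the difference evaluation: the global fitness minus the global fitness with the agent's contribution replaced by a null counterfactual. A minimal sketch for the rover domain, with a toy global fitness (hypothetical; the paper's actual evaluation functions are more elaborate):

```python
def global_eval(observations):
    """Toy global fitness G: number of unique points of interest
    observed by the whole rover team."""
    seen = set()
    for obs in observations:
        seen |= obs
    return len(seen)

def difference_eval(observations, i, counterfactual=frozenset()):
    """Agent-specific evaluation D_i = G(z) - G(z with rover i's
    observations replaced by a null counterfactual). Aligned with G
    (improving D_i never hurts G) and learnable (credit goes only to
    points rover i alone contributed)."""
    actual = global_eval(observations)
    alt = observations[:i] + [set(counterfactual)] + observations[i + 1:]
    return actual - global_eval(alt)

obs = [{1, 2}, {2, 3}, {3}]       # points seen by rovers 0, 1, 2
assert global_eval(obs) == 3
assert difference_eval(obs, 0) == 1   # point 1 seen only by rover 0
assert difference_eval(obs, 2) == 0   # rover 2 adds nothing unique
```

Rover 2 receives zero credit because point 3 is already covered, exactly the signal that steers agents away from redundant behavior that a shared global fitness cannot provide.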

  18. Lithospheric structure of the southern French Alps inferred from broadband analysis

    NASA Astrophysics Data System (ADS)

    Bertrand, E.; Deschamps, A.

    2000-11-01

    Broadband receiver function analysis is commonly used to evaluate the fine-scale S-velocity structure of the lithosphere. We analyse teleseismic P-waves and their coda from 30 selected teleseismic events recorded at three seismological stations of the French TGRS network in the Alpes Maritimes. Receiver functions are computed in the time domain using an SVD matrix inversion method. Dipping Moho and lateral heterogeneities beneath the array are inferred from the amplitude, arrival time, and polarity of locally generated Ps phases. We propose that the Moho dips 11° towards 25°±10°N below station CALF, in the outer part of the Alpine belt. At this station, we determine a Moho depth of about 20±2 km; the same depth is suggested below station SAOF, also located in the fold-thrust belt. Beneath station STET, located in the inner part of the Alpine belt, the Moho depth increases to 30 km and dips towards the N-NW. Moreover, 1D modelling of the summed receiver function from station STET constrains a crustal structure significantly different from that observed at stations located in the outer part of the Alps. Indeed, beneath the CALF and SAOF stations a 2 km thick shallow low-velocity layer is needed to best fit the observed receiver functions, whereas this layer does not appear to be present beneath station STET. Because recent P-coda studies have shown that near-receiver scattering can dominate teleseismic P-wave recordings in tectonically complicated areas, we account for the effect of scattered energy in our records using array measurements. As the array aperture is wide relative to the heterogeneity scale length in the area, the array analysis produces only a smooth image of scatterers beneath the stations.
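Time-domain receiver function computation by SVD matrix inversion can be sketched as a linear deconvolution: build a convolution matrix from the vertical-component waveform and solve for the series that convolves with it to reproduce the radial component, using the SVD-based pseudo-inverse. A minimal synthetic sketch (the paper's implementation details, such as damping and windowing, are not specified here):

```python
import numpy as np

def toeplitz_conv_matrix(src, n):
    """Convolution matrix A whose columns are shifted copies of the
    source wavelet, so that A @ rf equals convolve(src, rf)."""
    a = np.zeros((len(src) + n - 1, n))
    for j in range(n):
        a[j:j + len(src), j] = src
    return a

def receiver_function(vertical, radial, n, rcond=1e-6):
    """Time-domain receiver function via SVD (pseudo-inverse) matrix
    inversion; rcond truncates small singular values for stability."""
    a = toeplitz_conv_matrix(vertical, n)
    b = np.zeros(a.shape[0])
    b[:len(radial)] = radial
    return np.linalg.pinv(a, rcond=rcond) @ b

# Synthetic check: radial = vertical convolved with a known spike train
vertical = np.array([0.0, 1.0, 0.5, 0.2, 0.0])
true_rf = np.array([1.0, 0.0, 0.4, 0.0])
radial = np.convolve(vertical, true_rf)
rf = receiver_function(vertical, radial, n=4)
# rf recovers true_rf for this noise-free example
```

On real data the singular-value truncation (here `rcond`) controls the trade-off between resolution and noise amplification, which is the practical reason for using the SVD rather than a direct inverse.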

  19. Analysis of the Optimum Receiver Design Problem Using Interactive Computer Graphics.

    DTIC Science & Technology

    1981-12-01

    Report AD-A115 498. Air Force Institute of Technology, Wright-Patterson AFB, OH, School of Engineering. Thesis AFIT/GE/EE/81D-39: Analysis of the Optimum Receiver Design Problem Using Interactive Computer Graphics. Michael R. Mazzuechi, Cpt, USA. Approved for public release; distribution unlimited.

  20. 48 CFR 552.216-73 - Ordering Information.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... transmission or □ computer-to-computer Electronic Data Interchange (EDI). (b) An offeror electing to receive computer-to-computer EDI is requested to indicate below the name, address, and telephone number of the representative to be contacted regarding establishment of an EDI interface. (c) An offeror electing to receive...

  1. Wrist display concept demonstration based on 2-in. color AMOLED

    NASA Astrophysics Data System (ADS)

    Meyer, Frederick M.; Longo, Sam J.; Hopper, Darrel G.

    2004-09-01

    The wrist watch needs an upgrade. Recent advances in optoelectronics, microelectronics, and communication theory have established a technology base that now makes the multimedia Dick Tracy watch attainable during the next decade. As a first step towards fitting the functionality of an entire personal computer (PC) and television receiver under a watch face, we have set a goal of providing wrist video capability to warfighters. Commercial-sector work on the wrist form factor already includes all the functionality of a personal digital assistant (PDA) and a full PC operating system. Our strategy is to leverage these commercial developments. In this paper we describe our use of a 2.2 in. diagonal color active matrix organic light-emitting diode (AMOLED) device as a wrist-mounted display (WMD) to present either full-motion video or computer-generated graphical image formats.

  2. GPS application to mapping, charting and geodesy

    NASA Technical Reports Server (NTRS)

    Senus, W. J.; Hill, R. W.

    1981-01-01

    GPSPAC, a receiver being developed for space applications by the Defense Mapping Agency and NASA, will use signals from the GPS constellation to generate real-time values of host vehicle position and velocity. The GPSPAC has an L-band antenna and preamplifier capable of receiving the 1575 MHz and 1227 MHz spread-spectrum signals; its stable oscillator at 5.115 MHz provides the basic frequency reference, resulting in a long-term drift of less than one part in 10^10 per day. The GPSPAC performs many functions on board the spacecraft that were previously relegated to large-scale ground-based computer/receiver systems. A positional accuracy of better than 8 can be achieved for those periods when four or more NAVSTAR satellites are visible to the host satellite. The GPS geodetic receiver development, which will provide prototype receivers for use in terrestrial surveying operations, has the potential to significantly enhance the accuracy of point geodetic surveys over the current user hardware capability.

  3. Computational approaches to understand cardiac electrophysiology and arrhythmias

    PubMed Central

    Roberts, Byron N.; Yang, Pei-Chi; Behrens, Steven B.; Moreno, Jonathan D.

    2012-01-01

    Cardiac rhythms arise from electrical activity generated by precisely timed opening and closing of ion channels in individual cardiac myocytes. These impulses spread throughout the cardiac muscle to manifest as electrical waves in the whole heart. Regularity of electrical waves is critically important since they signal the heart muscle to contract, driving the primary function of the heart to act as a pump and deliver blood to the brain and vital organs. When electrical activity goes awry during a cardiac arrhythmia, the pump does not function, the brain does not receive oxygenated blood, and death ensues. For more than 50 years, mathematically based models of cardiac electrical activity have been used to improve understanding of basic mechanisms of normal and abnormal cardiac electrical function. Computer-based modeling approaches to understand cardiac activity are uniquely helpful because they allow for distillation of complex emergent behaviors into the key contributing components underlying them. Here we review the latest advances and novel concepts in the field as they relate to understanding the complex interplay between electrical, mechanical, structural, and genetic mechanisms during arrhythmia development at the level of ion channels, cells, and tissues. We also discuss the latest computational approaches to guiding arrhythmia therapy. PMID:22886409

  4. Enoxaparin-induced spontaneous massive retroperitoneal hematoma with fatal outcome.

    PubMed

    Salemis, Nikolaos S; Oikonomakis, Ioannis; Lagoudianakis, Emanuel; Boubousis, Georgios; Tsakalakis, Christos; Sourlas, Sotirios; Gourgiotis, Stavros

    2014-12-01

    Spontaneous retroperitoneal hematoma (SRH) is a severe and potentially fatal complication of anticoagulation therapy. We describe a case of fatal spontaneous massive retroperitoneal hematoma in a female patient receiving bridging therapy with enoxaparin for atrial fibrillation. Physicians should be cautious when prescribing enoxaparin in elderly patients, in patients with impaired renal function, and in patients receiving concomitant oral anticoagulants. Emergency physicians should always consider SRH in the differential diagnosis in patients under enoxaparin therapy presenting with abdominal pain. Computed tomographic scan is the imaging modality of choice for evaluating SRH. Early diagnosis and aggressive treatment are of paramount importance as SRH is associated with high mortality and morbidity rates.

  5. How to fly an aircraft with control theory and splines

    NASA Technical Reports Server (NTRS)

    Karlsson, Anders

    1994-01-01

    When trying to fly an aircraft as smoothly as possible, it is a good idea to use the derivatives of the pilot command instead of the actual control. This idea was implemented with splines and control theory in a system that models an aircraft. Computer calculations in Matlab show that sufficiently smooth control signals cannot be obtained in this way. This is because the splines try to approximate not only the test function but also its derivatives. Perfect tracking is achieved, but at the cost of very peaky control signals and accelerations.

  6. Neurophysiological substrates of stroke patients with motor imagery-based Brain-Computer Interface training.

    PubMed

    Li, Mingfen; Liu, Ye; Wu, Yi; Liu, Sirao; Jia, Jie; Zhang, Liqing

    2014-06-01

    We investigated the efficacy of motor imagery-based Brain-Computer Interface (MI-based BCI) training for eight stroke patients with severe upper extremity paralysis using longitudinal clinical assessments. The results were compared with those of a control group (n = 7) that received only FES (Functional Electrical Stimulation) treatment besides conventional therapies. During rehabilitation training, changes in the motor function of the upper extremity and in neurophysiological electroencephalographic (EEG) activity were observed for the two groups. After 8 weeks of training, a significant improvement in the motor function of the upper extremity for the BCI group was confirmed (p < 0.05 for ARAT), simultaneously with the activation of bilateral cerebral hemispheres. Additionally, event-related desynchronization (ERD) of the affected sensorimotor cortexes (SMCs) was significantly enhanced when compared to the pretraining course, which was only observed in the BCI group (p < 0.05). Furthermore, the activation of the affected SMC and parietal lobe was determined to contribute to motor function recovery (p < 0.05). In brief, our findings demonstrate that MI-based BCI training can enhance the motor function of the upper extremity for stroke patients by inducing optimal cerebral motor functional reorganization.

  7. Older Men Who Use Computers Have Lower Risk of Dementia

    PubMed Central

    Almeida, Osvaldo P.; Yeap, Bu B.; Alfonso, Helman; Hankey, Graeme J.; Flicker, Leon; Norman, Paul E.

    2012-01-01

    Objective To determine if older men who use computers have lower risk of developing dementia. Methods Cohort study of 5506 community-dwelling men aged 69 to 87 years followed for up to 8.5 years. Use of computers was measured as daily, weekly, less than weekly and never. Participants also reported their use of email, internet, word processors, games or other computer activities. The primary outcome was the incidence of ICD-10 diagnosis of dementia as recorded by the Western Australian Data Linkage System. Results 1857/5506 (33.7%) men reported using computers and 347 (6.3%) received a diagnosis of dementia during an average follow-up of 6.0 years (range: 6 months to 8.5 years). The hazard ratio (HR) of dementia was lower among computer users than non-users (HR = 0.62, 95%CI = 0.47–0.81, after adjustment for age, educational attainment, size of social network, and presence of depression or of significant clinical morbidity). The HR of dementia appeared to decrease with increasing frequency of computer use: 0.68 (95%CI = 0.41–1.13), 0.61 (95%CI = 0.39–0.94) and 0.59 (95%CI = 0.40–0.87) for less than weekly, at least weekly and daily use, respectively. The HR of dementia was 0.66 (95%CI = 0.50–0.86) after the analysis was further adjusted for baseline cognitive function, as measured by the Mini-Mental State Examination. Conclusion Older men who use computers have lower risk of receiving a diagnosis of dementia up to 8.5 years later. Randomised trials are required to determine if the observed associations are causal. PMID:22937167

  8. Adaptive reconfigurable V-BLAST type equalizer for cognitive MIMO-OFDM radios

    NASA Astrophysics Data System (ADS)

    Ozden, Mehmet Tahir

    2015-12-01

    An adaptive channel shortening equalizer design for multiple input multiple output-orthogonal frequency division multiplexing (MIMO-OFDM) radio receivers is considered in this presentation. The proposed receiver has desirable features for cognitive and software defined radio implementations. It consists of two sections: a MIMO decision feedback equalizer (MIMO-DFE) and adaptive multiple Viterbi detection. In the MIMO-DFE section, a complete modified Gram-Schmidt orthogonalization of the multichannel input data is accomplished using sequential processing multichannel Givens lattice stages, so that a Vertical Bell Laboratories Layered Space Time (V-BLAST) type MIMO-DFE is realized at the front-end section of the channel shortening equalizer. Matrix operations, a major bottleneck for receiver operations, are accordingly avoided, and only scalar operations are used. A highly modular and regular radio receiver architecture is achieved, with a structure suitable for digital signal processing (DSP) chip and field programmable gate array (FPGA) implementations, which are important for software defined radio realizations. The MIMO-DFE section of the proposed receiver can also be reconfigured for spectrum sensing and positioning functions, which are important tasks for cognitive radio applications. In the adaptive multiple Viterbi detection section, a systolic array implementation for each channel is performed so that a receiver architecture with high computational concurrency is attained. The total computational complexity is given in terms of equalizer and desired response filter lengths, alphabet size, and number of antennas. The performance of the proposed receiver is presented for the two-channel case by means of mean squared error (MSE) and probability of error evaluations, conducted for time-invariant and time-variant channel conditions, orthogonal and nonorthogonal transmissions, and two different modulation schemes.
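
    The complete modified Gram-Schmidt orthogonalization underpinning the MIMO-DFE can be illustrated with a plain QR factorization. The numpy sketch below shows only the modified Gram-Schmidt recursion itself; it is not the paper's Givens-lattice implementation, and all names are illustrative.

```python
import numpy as np

def mgs_qr(a):
    """QR factorization by modified Gram-Schmidt: orthogonalize the
    columns one at a time, deflating the remaining columns immediately
    so that each update is a simple vector-scalar operation."""
    a = np.array(a, dtype=float)
    m, n = a.shape
    q = np.zeros((m, n))
    r = np.zeros((n, n))
    for k in range(n):
        r[k, k] = np.linalg.norm(a[:, k])
        q[:, k] = a[:, k] / r[k, k]
        for j in range(k + 1, n):
            r[k, j] = q[:, k] @ a[:, j]
            a[:, j] -= r[k, j] * q[:, k]   # the "modified" step: deflate now
    return q, r

# Q has orthonormal columns and Q @ R reproduces the input.
x = np.array([[2.0, 1.0],
              [0.0, 1.0],
              [1.0, 1.0]])
q, r = mgs_qr(x)
```

    Deflating the remaining columns immediately after each new orthonormal vector is produced is what distinguishes the modified from the classical variant and gives it better numerical behavior, which matters when the recursion is mapped onto sequential lattice stages.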

  9. Emergency repair of upper extremity large soft tissue and vascular injuries with flow-through anterolateral thigh free flaps.

    PubMed

    Zhan, Yi; Fu, Guo; Zhou, Xiang; He, Bo; Yan, Li-Wei; Zhu, Qing-Tang; Gu, Li-Qiang; Liu, Xiao-Lin; Qi, Jian

    2017-12-01

    Complex extremity trauma commonly involves both soft tissue and vascular injuries. Traditional two-stage surgical repair may delay rehabilitation and functional recovery, as well as increase the risk of infections. We report a single-stage reconstructive surgical method that repairs soft tissue defects and vascular injuries with flow-through free flaps to improve functional outcomes. Between March 2010 and December 2016 in our hospital, 5 patients with severe upper extremity trauma received single-stage reconstructive surgery, in which a flow-through anterolateral thigh free flap was applied to repair soft tissue defects and vascular injuries simultaneously. Cases of injured artery were reconstructed with the distal trunk of the descending branch of the lateral circumflex femoral artery. A segment of adjacent vein was used if there was a second artery injury. Patients were followed to evaluate their functional recoveries, and received computed tomography angiography examinations to assess peripheral circulation. Two patients had post-operative thumb necrosis; one required amputation, and the other was healed after debridement and abdominal pedicle flap repair. The other 3 patients had no major complications (infection, necrosis) to the recipient or donor sites after surgery. All the patients had achieved satisfactory functional recovery by the end of the follow-up period. Computed tomography angiography showed adequate circulation in the peripheral vessels. The success of these cases shows that one-step reconstructive surgery with flow-through anterolateral thigh free flaps can be a safe and effective treatment option for patients with complex upper extremity trauma with soft tissue defects and vascular injuries. Copyright © 2017. Published by Elsevier Ltd.

  10. On Fast Post-Processing of Global Positioning System Simulator Truth Data and Receiver Measurements and Solutions Data

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Day, John H. (Technical Monitor)

    2000-01-01

    Post-processing of data related to a GPS receiver test in a GPS simulator and test facility is an important step towards qualifying a receiver for space flight. Although the GPS simulator provides all the parameters needed to analyze a simulation, as well as excellent analysis tools on the simulator workstation, post-processing is not a GPS simulator or receiver function alone, and it must be planned as a separate pre-flight test program requirement. A GPS simulator is a critical resource, and it is desirable to move the pertinent test data off the simulator as soon as a test is completed. The receiver and simulator databases are used to extract the test data files for post-processing. These files are then usually moved from the simulator and receiver systems to a personal computer (PC) platform, where post-processing is typically done using PC-based commercial software languages and tools. Because of the generality of commercial software systems, their functions are notoriously slow and more often than not are the bottleneck even for short-duration simulator-based tests. There is a need to do post-processing faster and within an hour after test completion, including all required operations on the simulator and receiver to prepare and move off the post-processing files. This is especially significant in order to use feedback from the previous test for the next simulation setup or to run near back-to-back simulation scenarios. Solving the post-processing timing problem is critical for pre-flight test program success. Towards this goal an approach was developed that speeds up post-processing by an order of magnitude. It is based on improving the algorithm of the post-processing bottleneck function using a priori information that is specific to a GPS simulation application, and on using only the necessary volume of truth data. The presented post-processing scheme was used in support of a few successful space flight missions carrying GPS receivers.

  11. Time Series Analysis of the Quasar PKS 1749+096

    NASA Astrophysics Data System (ADS)

    Lam, Michael T.; Balonek, T. J.

    2011-01-01

    Multiple timescales of variability are observed in quasars at a variety of wavelengths, the nature of which is not fully understood. In 2007 and 2008, the quasar 1749+096 underwent two unprecedented optical outbursts, reaching a brightness never before seen in our twenty years of monitoring. Much lower level activity had been seen prior to these two outbursts. We present an analysis of the timescales of variability over the two regimes using a variety of statistical techniques. An IDL software package developed at Colgate University over the summer of 2010, the Quasar User Interface (QUI), provides effective computation of four time series functions for analyzing underlying trends present in generic, discretely sampled data sets. Using the Autocorrelation Function, Structure Function, and Power Spectrum, we are able to quickly identify possible variability timescales. QUI is also capable of computing the Cross-Correlation Function for comparing variability at different wavelengths. We apply these algorithms to 1749+096 and present our analysis of the timescales for this object. Funding for this project was received from Colgate University, the Justus and Jayne Schlichting Student Research Fund, and the NASA / New York Space Grant.
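
    Two of the time series functions mentioned above, the Autocorrelation Function and the Structure Function, can be sketched for the evenly sampled case as follows. This is an illustrative numpy sketch, not QUI's IDL code; QUI also computes the Power Spectrum and Cross-Correlation Function and handles discretely sampled data more generally.

```python
import numpy as np

def autocorrelation(x):
    """Normalized autocorrelation of an evenly sampled series."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    acf = np.correlate(x, x, mode="full")[n - 1:]  # lags 0..n-1
    return acf / acf[0]

def structure_function(x):
    """First-order structure function: mean squared difference
    between samples separated by each lag (lags 1..n-1)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    return np.array([np.mean((x[k:] - x[:n - k]) ** 2) for k in range(1, n)])

# For a periodic light curve the ACF peaks again at one full period
# and the structure function dips there.
t = np.arange(256)
flux = np.sin(2 * np.pi * t / 32.0)   # period of 32 samples
acf = autocorrelation(flux)
sf = structure_function(flux)
```

    Reading characteristic variability timescales off these curves (repeated ACF peaks, structure-function plateaus and dips) is how underlying trends are identified in otherwise irregular quasar light curves.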

  12. Development of wide band digital receiver for atmospheric radars using COTS board based SDR

    NASA Astrophysics Data System (ADS)

    Yasodha, Polisetti; Jayaraman, Achuthan; Thriveni, A.

    2016-07-01

    A digital receiver extracts the received echo signal information and is a potential subsystem for atmospheric radar, also referred to as wind profiling radar (WPR), which provides the vertical profiles of the 3-dimensional wind vector in the atmosphere. This paper presents the development of a digital receiver using a COTS board based software defined radio technique, which can be used for atmospheric radars. The developmental work is being carried out at the National Atmospheric Research Laboratory (NARL), Gadanki. The digital receiver consists of a commercially available software defined radio (SDR) board called the universal software radio peripheral B210 (USRP B210) and a personal computer. The USRP B210 operates over a wide frequency range from 70 MHz to 6 GHz and hence can be used for a variety of radars, such as Doppler weather radars operating in S/C bands, in addition to wind profiling radars operating in VHF, UHF and L bands. Due to the flexibility and re-configurability of SDR, where the component functionalities are implemented in software, it is easy to modify the software to receive the echoes and process them as required for the type of radar intended. Hence, the USRP B210 board along with the computer forms a versatile digital receiver from 70 MHz to 6 GHz. It has an inbuilt direct conversion transceiver with two transmit and two receive channels, which can be operated in fully coherent 2x2 MIMO fashion, and thus it can be used as a two-channel receiver. Multiple USRP B210 boards can be synchronized using the pulse per second (PPS) input provided on the board, to configure a multi-channel digital receiver system. The RF gain of the transceiver can be varied from 0 to 70 dB. The board can be controlled from the computer via the USB 3.0 interface through the USRP hardware driver (UHD), which is an open source cross platform driver. The USRP B210 board is connected to the personal computer through USB 3.0.
Reference (10 MHz) clock signal from the radar master oscillator is used to lock the board, which is essential for deriving Doppler information. Input from the radar analog receiver is given to one channel of the USRP B210, which down-converts it to baseband. The 12-bit ADC present on the board digitizes the signal and produces I (in-phase) and Q (quadrature-phase) data. The maximum sampling rate possible is about 61 MSPS. The I and Q (time series) data are sent to the PC via USB 3.0, where the signal processing is carried out. The online processing steps include decimation, range gating, decoding, coherent integration and FFT computation (optional). The processed data are then stored on the hard disk. The C++ programming language is used for developing the real-time signal processing. Shared memory along with multithreading is used to collect and process data simultaneously. Before implementing the real-time operation, a stand-alone test of the board was carried out through the GNU Radio software, and the baseband output data obtained were found satisfactory. Later the board was integrated with the existing Lower Atmospheric Wind Profiling radar at NARL. The radar receive IF output at 70 MHz was given to the board and real-time radar data were collected. The data were processed off-line and the range-Doppler spectrum was obtained. Online processing software is in progress.

  13. Training Teachers to Use Computers: A Case Study of the Summer Training Component of the IBM/ETS Secondary School Computer Education Program. Research Report.

    ERIC Educational Resources Information Center

    Stecher, Brian

    A training program in computer education tested in 89 secondary schools focused on the use of computers as tools in all subject areas. Each school received enough computers and software from IBM to equip a full computer laboratory. The schools were organized into local networks in eight regions and received training and continuing support in these…

  14. A Functional Description of a Digital Flight Test System for Navigation and Guidance Research in the Terminal Area

    NASA Technical Reports Server (NTRS)

    Hegarty, D. M.

    1974-01-01

    A guidance, navigation, and control system, the Simulated Shuttle Flight Test System (SS-FTS), when interfaced with existing aircraft systems, provides a research facility for studying concepts for landing the space shuttle orbiter and conventional jet aircraft. The SS-FTS, which includes a general-purpose computer, performs all computations for precisely following a prescribed approach trajectory while properly managing the vehicle energy to allow safe arrival at the runway and landing within prescribed dispersions. The system contains hardware and software provisions for navigation with several combinations of possible navigation aids that have been suggested for the shuttle. The SS-FTS can be reconfigured to study different guidance and navigation concepts by changing only the computer software, and adapted to receive different radio navigation information through minimum hardware changes. All control laws, logic, and mode interlocks reside solely in the computer software.

  15. Optimal algorithm for automatic detection of microaneurysms based on receiver operating characteristic curve

    NASA Astrophysics Data System (ADS)

    Xu, Lili; Luo, Shuqian

    2010-11-01

    Microaneurysms (MAs) are the first manifestations of diabetic retinopathy (DR) as well as an indicator for its progression. Their automatic detection plays a key role for both mass screening and monitoring and is therefore at the core of any system for computer-assisted diagnosis of DR. The algorithm basically comprises the following stages: candidate detection, aiming at extracting the patterns possibly corresponding to MAs based on the mathematical morphological black top hat; feature extraction, to characterize these candidates; and classification, based on a support vector machine (SVM), to validate MAs. The choice of feature vector and SVM kernel function is very important to the algorithm. We use the receiver operating characteristic (ROC) curve to evaluate the discriminating performance of different feature vectors and different SVM kernel functions. The ROC analysis indicates that the quadratic polynomial SVM with a combination of features as the input shows the best discriminating performance.
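
    The ROC evaluation reduces to sweeping a decision threshold over the classifier scores and tracing the true positive rate against the false positive rate. The sketch below shows only this generic ROC/AUC computation on toy scores; the paper's SVM classifiers and retinal features are not reproduced, and all names and values are illustrative.

```python
import numpy as np

def roc_points(scores, labels):
    """True/false positive rates as the decision threshold sweeps
    from high to low over the classifier scores (no tied scores)."""
    order = np.argsort(-np.asarray(scores))
    labels = np.asarray(labels)[order]
    tpr = np.cumsum(labels) / labels.sum()
    fpr = np.cumsum(1 - labels) / (1 - labels).sum()
    return np.concatenate(([0.0], fpr)), np.concatenate(([0.0], tpr))

def auc(fpr, tpr):
    """Area under the ROC curve by trapezoidal integration."""
    return float(np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2.0))

# Toy comparison of two candidate classifiers on the same labels:
labels = np.array([1, 1, 1, 0, 0, 0, 1, 0])
good = np.array([0.9, 0.8, 0.7, 0.3, 0.2, 0.1, 0.6, 0.4])  # ranks positives first
weak = np.array([0.6, 0.4, 0.9, 0.5, 0.3, 0.8, 0.2, 0.7])  # mixes the classes
auc_good = auc(*roc_points(good, labels))
auc_weak = auc(*roc_points(weak, labels))
```

    A larger area under the curve means the candidate feature vector or kernel ranks true MAs above spurious candidates more consistently, independent of any single operating threshold.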

  16. Optimal algorithm for automatic detection of microaneurysms based on receiver operating characteristic curve.

    PubMed

    Xu, Lili; Luo, Shuqian

    2010-01-01

    Microaneurysms (MAs) are the first manifestations of diabetic retinopathy (DR) as well as an indicator for its progression. Their automatic detection plays a key role for both mass screening and monitoring and is therefore at the core of any system for computer-assisted diagnosis of DR. The algorithm basically comprises the following stages: candidate detection, aiming at extracting the patterns possibly corresponding to MAs based on the mathematical morphological black top hat; feature extraction, to characterize these candidates; and classification, based on a support vector machine (SVM), to validate MAs. The choice of feature vector and SVM kernel function is very important to the algorithm. We use the receiver operating characteristic (ROC) curve to evaluate the discriminating performance of different feature vectors and different SVM kernel functions. The ROC analysis indicates that the quadratic polynomial SVM with a combination of features as the input shows the best discriminating performance.

  17. System and technique for ultrasonic characterization of settling suspensions

    DOEpatents

    Greenwood, Margaret S [Richland, WA; Panetta, Paul D [Richland, WA; Bamberger, Judith A [Richland, WA; Pappas, Richard A [Richland, WA

    2006-11-28

    A system for determining properties of settling suspensions includes a settling container, a mixer, and devices for ultrasonic interrogation transverse to the settling direction. A computer system controls operation of the mixer and the interrogation devices and records the response to the interrogating as a function of settling time, which is then used to determine suspension properties. Attenuation versus settling time for dilute suspensions, such as dilute wood pulp suspension, exhibits a peak at different settling times for suspensions having different properties, and the location of this peak is used as one mechanism for characterizing suspensions. Alternatively or in addition, a plurality of ultrasound receivers are arranged at different angles to a common transmitter to receive scattering responses at a variety of angles during particle settling. Angular differences in scattering as a function of settling time are also used to characterize the suspension.

  18. Development of computer games for assessment and training in post-stroke arm telerehabilitation.

    PubMed

    Rodriguez-de-Pablo, Cristina; Perry, Joel C; Cavallaro, Francesca I; Zabaleta, Haritz; Keller, Thierry

    2012-01-01

    Stroke is the leading cause of long term disability among adults in industrialized nations. The majority of these disabilities include deficiencies in arm function, which can make independent living very difficult. Research shows that better results in rehabilitation are obtained when patients receive more intensive therapy. However this intensive therapy is currently too expensive to be provided by the public health system, and at home few patients perform the repetitive exercises recommended by their therapists. Computer games can provide an affordable, enjoyable, and effective way to intensify treatment, while keeping the patient as well as their therapists informed about their progress. This paper presents the study, design, implementation and user-testing of a set of computer games for at-home assessment and training of upper-limb motor impairment after stroke.

  19. How Does One Manage Information? Making Sense of the Information Being Received

    DTIC Science & Technology

    2012-12-01

    …to manage. …in choosing the right application. Application software is written to perform a specific task or function, and it becomes increasingly difficult … common data, virtualizing machines for all software (using one computer/server, but dividing it into logical segments), and standardizing…

  20. The radiometer transfer function for the AAFE composite two-frequency radiometer scatterometer. M.S. Thesis - Pennsylvania Univ.

    NASA Technical Reports Server (NTRS)

    Moore, J. H.

    1973-01-01

    A model was developed for the switching radiometer utilizing a continuous method of calibration. Sources of system degradation were identified, including losses and voltage standing wave ratios in front of the receiver input. For the three modes of operation, expressions were developed for the normalized radiometer output, the minimum detectable signal (normalized RMS temperature fluctuation), sensitivity, and accuracy correction factors.

  1. Computer-associated health complaints and sources of ergonomic instructions in computer-related issues among Finnish adolescents: a cross-sectional study.

    PubMed

    Hakala, Paula T; Saarni, Lea A; Ketola, Ritva L; Rahkola, Erja T; Salminen, Jouko J; Rimpelä, Arja H

    2010-01-11

    The use of computers has increased among adolescents, as have musculoskeletal symptoms. There is evidence that these symptoms can be reduced through an ergonomics approach and through education. The purpose of this study was to examine where adolescents had received ergonomic instructions related to computer use, and whether receiving these instructions was associated with a reduced prevalence of computer-associated health complaints. Mailed survey with nationally representative sample of 12 to 18-year-old Finns in 2001 (n = 7292, response rate 70%). In total, 6961 youths reported using a computer. We tested the associations of computer use time and received ergonomic instructions (predictor variables) with computer-associated health complaints (outcome variables) using logistic regression analysis. To prevent computer-associated complaints, 61.2% reported having been instructed to arrange their desk/chair/screen in the right position, 71.5% to take rest breaks. The older age group (16-18 years) reported receiving instructions or being self-instructed more often than the 12- to 14-year-olds (p < 0.001). Among both age groups the sources of instructions included school (33.1%), family (28.6%), self (self-instructed) (12.5%), ICT-related (8.6%), friends (1.5%) and health professionals (0.8%). Receiving instructions was not related to lower prevalence of computer-associated health complaints. This report shows that ergonomic instructions on how to prevent computer-related musculoskeletal problems fail to reach a substantial number of children. Furthermore, the reported sources of instructions vary greatly in terms of reliability.

  2. Computer-associated health complaints and sources of ergonomic instructions in computer-related issues among Finnish adolescents: A cross-sectional study

    PubMed Central

    2010-01-01

    Background The use of computers has increased among adolescents, as have musculoskeletal symptoms. There is evidence that these symptoms can be reduced through an ergonomics approach and through education. The purpose of this study was to examine where adolescents had received ergonomic instructions related to computer use, and whether receiving these instructions was associated with a reduced prevalence of computer-associated health complaints. Methods Mailed survey with nationally representative sample of 12 to 18-year-old Finns in 2001 (n = 7292, response rate 70%). In total, 6961 youths reported using a computer. We tested the associations of computer use time and received ergonomic instructions (predictor variables) with computer-associated health complaints (outcome variables) using logistic regression analysis. Results To prevent computer-associated complaints, 61.2% reported having been instructed to arrange their desk/chair/screen in the right position, 71.5% to take rest breaks. The older age group (16-18 years) reported receiving instructions or being self-instructed more often than the 12- to 14-year-olds (p < 0.001). Among both age groups the sources of instructions included school (33.1%), family (28.6%), self (self-instructed) (12.5%), ICT-related (8.6%), friends (1.5%) and health professionals (0.8%). Receiving instructions was not related to lower prevalence of computer-associated health complaints. Conclusions This report shows that ergonomic instructions on how to prevent computer-related musculoskeletal problems fail to reach a substantial number of children. Furthermore, the reported sources of instructions vary greatly in terms of reliability. PMID:20064250

  3. 38 CFR 3.271 - Computation of income.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... income. Recurring income means income which is received or anticipated in equal amounts and at regular... computation purposes. (2) Irregular income. Irregular income means income which is received or anticipated during a 12-month annualization period, but which is received in unequal amounts or at irregular...

  4. Updates to FuncLab, a Matlab based GUI for handling receiver functions

    NASA Astrophysics Data System (ADS)

    Porritt, Robert W.; Miller, Meghan S.

    2018-02-01

    Receiver functions are a versatile tool commonly used in seismic imaging. Depending on how they are processed, they can be used to image discontinuity structure within the crust or mantle, or they can be inverted for seismic velocity, either directly or jointly with complementary datasets. However, modern studies generally require large datasets which can be challenging to handle; therefore, FuncLab was originally written as an interactive Matlab GUI to assist in handling these large datasets. This software uses a project database to allow interactive trace editing, data visualization, H-κ stacking for crustal thickness and Vp/Vs ratio, and common conversion point stacking while minimizing computational costs. Since its initial release, significant advances have been made in the implementation of web services, and changes in the underlying Matlab platform have necessitated a significant revision to the software. Here, we present revisions to the software, including new features such as data downloading via irisFetch.m, receiver function calculations via processRFmatlab, on-the-fly cross-section tools, interface picking, and more. In describing the tools, we present their application to a test dataset in Michigan, Wisconsin, and neighboring areas following the passage of the USArray Transportable Array. The software is made available online at https://robporritt.wordpress.com/software.
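
    The H-κ stacking step grid-searches crustal thickness H and the Vp/Vs ratio κ by summing receiver-function amplitude at the delay times predicted for the Ps, PpPs and PpSs+PsPs converted phases. The numpy sketch below uses assumed values for Vp, the ray parameter, the phase weights and a synthetic receiver function; it is not FuncLab's Matlab code.

```python
import numpy as np

def hk_stack(rf, dt, p, vp=6.5, weights=(0.7, 0.2, 0.1),
             h_grid=np.arange(20.0, 60.0, 0.5),
             k_grid=np.arange(1.6, 2.0, 0.01)):
    """Grid-search crustal thickness H (km) and Vp/Vs ratio kappa by
    stacking receiver-function amplitude at the predicted Ps, PpPs and
    PpSs+PsPs delay times for each (H, kappa) pair."""
    w1, w2, w3 = weights
    s = np.zeros((len(h_grid), len(k_grid)))
    eta_p = np.sqrt(1.0 / vp**2 - p**2)           # P vertical slowness
    for i, h in enumerate(h_grid):
        for j, k in enumerate(k_grid):
            eta_s = np.sqrt(k**2 / vp**2 - p**2)  # S vertical slowness
            t1 = h * (eta_s - eta_p)              # Ps delay
            t2 = h * (eta_s + eta_p)              # PpPs delay
            t3 = 2.0 * h * eta_s                  # PpSs + PsPs delay
            amp = lambda t: rf[min(int(round(t / dt)), len(rf) - 1)]
            # PpSs+PsPs arrives with reversed polarity, hence the minus.
            s[i, j] = w1 * amp(t1) + w2 * amp(t2) - w3 * amp(t3)
    i, j = np.unravel_index(np.argmax(s), s.shape)
    return h_grid[i], k_grid[j]

# Synthetic radial receiver function with pulses at the delay times
# predicted for H = 35 km, kappa = 1.80, at their expected polarities.
dt, p, vp, h0, k0 = 0.1, 0.06, 6.5, 35.0, 1.80
eta_p = np.sqrt(1 / vp**2 - p**2)
eta_s = np.sqrt(k0**2 / vp**2 - p**2)
rf = np.zeros(1200)
for t, a in ((h0 * (eta_s - eta_p), 1.0),
             (h0 * (eta_s + eta_p), 0.5),
             (2 * h0 * eta_s, -0.5)):
    rf[int(round(t / dt))] = a
h_est, k_est = hk_stack(rf, dt, p, vp=vp)
```

    Because the three phases trade off H against κ differently, stacking all of them localizes a single maximum in the (H, κ) plane rather than a ridge, which is the point of the weighted sum.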

  5. Reliability, synchrony and noise

    PubMed Central

    Ermentrout, G. Bard; Galán, Roberto F.; Urban, Nathaniel N.

    2008-01-01

    The brain is noisy. Neurons receive tens of thousands of highly fluctuating inputs and generate spike trains that appear highly irregular. Much of this activity is spontaneous—uncoupled to overt stimuli or motor outputs—leading to questions about the functional impact of this noise. Although noise is most often thought of as disrupting patterned activity and interfering with the encoding of stimuli, recent theoretical and experimental work has shown that noise can play a constructive role—leading to increased reliability or regularity of neuronal firing in single neurons and across populations. These results raise fundamental questions about how noise can influence neural function and computation. PMID:18603311

  6. School children's use of computers and teachers' education in computer ergonomics.

    PubMed

    Dockrell, S; Fallon, E; Kelly, M; Masterson, B; Shields, N

    2007-10-01

    A national survey to investigate the education of teachers in computer-related ergonomics was carried out by postal questionnaire. The use of computers by primary school children (age 4-12 years) was also investigated. Data were collected from a random sample of 25% (n = 830) of primary schools in the Republic of Ireland. Questionnaires (n = 1863) were returned from 416 schools giving a response rate of 50.1%. Almost all schools (99.7%) had computers for children's use. The computers were most often (69.8%) used in the classroom. The majority (56.3%) of children worked in pairs. Most teachers (89.6%) had received computer training, but few (17.6%) had received ergonomics information during the training. Respondents were not satisfied with their current knowledge of ergonomics. Over 90% stated that they would like to receive further information by printed format or during a training course, rather than by computer (web or CD-ROM).

  7. Adaptive voting computer system

    NASA Technical Reports Server (NTRS)

    Koczela, L. J.; Wilgus, D. S. (Inventor)

    1974-01-01

    A computer system is reported that uses adaptive voting to tolerate failures and operates in a fail-operational, fail-safe manner. Each of four computers is individually connected to one of four external input/output (I/O) busses which interface with external subsystems. Each computer is connected to receive input data and commands from the other three computers and to furnish output data and commands to the other three computers. An adaptive control apparatus including a voter-comparator-switch (VCS) is provided for each computer; it receives signals from each of the computers and permits adaptive voting among them, enabling fail-operational, fail-safe operation.
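
    The voting logic described in the abstract can be sketched in a few lines of Python. This is an illustrative reading of the voter-comparator-switch idea (majority vote among trusted computers, with dissenters dropped from the pool), not the patented circuit; all names are hypothetical.

```python
from collections import Counter

def adaptive_vote(outputs, active):
    """Majority-vote among the outputs of the currently trusted computers.

    outputs: dict mapping computer id -> output value
    active: set of ids still trusted by the voter
    Returns (voted_value, ids_that_disagreed).
    """
    votes = Counter(outputs[i] for i in active)
    value, _count = votes.most_common(1)[0]
    disagree = {i for i in active if outputs[i] != value}
    return value, disagree

def step(outputs, active):
    """One voting cycle: vote, then adapt by dropping dissenting computers."""
    value, disagree = adaptive_vote(outputs, active)
    return value, active - disagree
```

    With four computers, one faulty unit is outvoted and removed from the pool, and the remaining three continue to operate (fail-operational); the same logic degrades gracefully as further units fail.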

  8. OCIS: 15 years' experience with patient-centered computing.

    PubMed

    Enterline, J P; Lenhard, R E; Blum, B I; Majidi, F M; Stuart, G J

    1994-01-01

    In the mid-1970s, the medical and administrative staff of the Oncology Center at Johns Hopkins Hospital recognized a need for a computer-based clinical decision-support system that organized patients' information according to the care continuum, rather than as a series of event-specific data. This is especially important in cancer patients, because of the long periods in which they receive complex medical treatment and the enormous amounts of data generated by extremely ill patients with multiple interrelated diseases. During development of the Oncology Clinical Information System (OCIS), it became apparent that administrative services, research systems, ancillary functions (such as drug and blood product ordering), and financial processes should be integrated with the basic patient-oriented database. With the structured approach used in applications development, new modules were added as the need for additional functions arose. The system has since been moved to a modern network environment with the capacity for client-server processing.

  9. HEAP: Heat Energy Analysis Program, a computer model simulating solar receivers. [solving the heat transfer problem

    NASA Technical Reports Server (NTRS)

    Lansing, F. L.

    1979-01-01

    A computer program which can distinguish between different receiver designs, and predict transient performance under variable solar flux, or ambient temperatures, etc. has a basic structure that fits a general heat transfer problem, but with specific features that are custom-made for solar receivers. The code is written in MBASIC computer language. The methodology followed in solving the heat transfer problem is explained. A program flow chart, an explanation of input and output tables, and an example of the simulation of a cavity-type solar receiver are included.

  10. Completion processing for data communications instructions

    DOEpatents

    Blocksome, Michael A.; Kumar, Sameer; Parker, Jeffrey J.

    2014-06-03

    Completion processing of data communications instructions in a distributed computing environment, including receiving, in an active messaging interface (`AMI`) data communications instructions, at least one instruction specifying a callback function; injecting into an injection FIFO buffer of a data communication adapter, an injection descriptor, each slot in the injection FIFO buffer having a corresponding slot in a pending callback list; listing in the pending callback list any callback function specified by an instruction, incrementing a pending callback counter for each listed callback function; transferring payload data as per each injection descriptor, incrementing a transfer counter upon completion of each transfer; determining from counter values whether the pending callback list presently includes callback functions whose data transfers have been completed; calling by the AMI any such callback functions from the pending callback list, decrementing the pending callback counter for each callback function called.
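
    The counter-based bookkeeping in the claim can be sketched as follows. This is an illustrative Python model of the idea (injection FIFO slots paired with pending-callback-list entries, a pending callback counter, and a transfer counter), not the patented implementation; class and method names are hypothetical, and FIFO wrap-around safeguards are omitted.

```python
class Slot:
    """One injection FIFO slot and its pending-callback-list entry."""
    def __init__(self):
        self.callback = None
        self.transferred = False

class AMI:
    def __init__(self, nslots):
        self.slots = [Slot() for _ in range(nslots)]
        self.pending_callbacks = 0   # pending callback counter
        self.transfers_done = 0      # transfer counter
        self.next = 0

    def inject(self, payload, callback=None):
        """Inject a descriptor; list any callback in the pending callback list."""
        slot = self.slots[self.next % len(self.slots)]
        self.next += 1
        slot.callback = callback
        slot.transferred = False
        if callback is not None:
            self.pending_callbacks += 1
        # ... the adapter transfers `payload` asynchronously ...
        return slot

    def on_transfer_complete(self, slot):
        """Adapter signals that a descriptor's payload transfer finished."""
        slot.transferred = True
        self.transfers_done += 1

    def process_completions(self):
        """Call callbacks whose data transfers have completed; decrement counter."""
        called = 0
        for slot in self.slots:
            if slot.transferred and slot.callback is not None:
                slot.callback()
                slot.callback = None
                self.pending_callbacks -= 1
                called += 1
        return called
```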

  11. A Frequency-Domain Multipath Parameter Estimation and Mitigation Method for BOC-Modulated GNSS Signals

    PubMed Central

    Sun, Chao; Feng, Wenquan; Du, Songlin

    2018-01-01

    As multipath is one of the dominating error sources for high accuracy Global Navigation Satellite System (GNSS) applications, multipath mitigation approaches are employed to minimize this hazardous error in receivers. Binary offset carrier modulation (BOC), as a modernized signal structure, is adopted to achieve significant enhancement. However, because of its multi-peak autocorrelation function, conventional multipath mitigation techniques for binary phase shift keying (BPSK) signals would not be optimal. Currently, non-parametric and parametric approaches have been studied specifically aiming at multipath mitigation for BOC signals. Non-parametric techniques, such as Code Correlation Reference Waveforms (CCRW), usually have good feasibility with simple structures, but suffer from low universal applicability for different BOC signals. Parametric approaches can thoroughly eliminate multipath error by estimating multipath parameters. The problems with this category are high computational complexity and vulnerability to noise. To tackle these problems, we present a practical parametric multipath estimation method in the frequency domain for BOC signals. The received signal is transformed to the frequency domain to separate out the multipath channel transfer function for multipath parameter estimation. During this process, we apply segmentation and averaging to reduce both the noise effect and the computational load. The performance of the proposed method is evaluated and compared with previous work in three scenarios. Results indicate that the proposed averaging-Fast Fourier Transform (averaging-FFT) method achieves good robustness in severe multipath environments with lower computational load for both low-order and high-order BOC signals. PMID:29495589
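
    The segmentation-and-averaging step can be illustrated with a least-squares frequency-domain channel estimate: FFT each segment of the received signal and of the local replica, accumulate cross- and auto-spectra, and divide. This is a sketch of the general idea under simplifying assumptions (baseband samples, circular multipath delays, known replica), not the authors' exact algorithm; all names are hypothetical.

```python
import numpy as np

def estimate_channel(received, replica, nseg):
    """Estimate the multipath channel transfer function H(f).

    Segments the signals, FFTs each segment, and averages the
    cross-spectrum and the replica power spectrum before dividing;
    averaging over segments reduces both noise and per-FFT cost.
    """
    seglen = len(received) // nseg
    num = np.zeros(seglen, dtype=complex)
    den = np.zeros(seglen)
    for k in range(nseg):
        r = np.fft.fft(received[k * seglen:(k + 1) * seglen])
        s = np.fft.fft(replica[k * seglen:(k + 1) * seglen])
        num += r * np.conj(s)      # averaged cross-spectrum
        den += np.abs(s) ** 2      # averaged replica power spectrum
    return num / np.maximum(den, 1e-12)
```

    The inverse FFT of the estimated transfer function gives the channel impulse response, from which the delays and amplitudes of the multipath components can be read off.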

  12. Design and construction of a desktop AC susceptometer using an Arduino and a Bluetooth for serial interface

    NASA Astrophysics Data System (ADS)

    Pérez, Israel; Ángel Hernández Cuevas, José; Trinidad Elizalde Galindo, José

    2018-05-01

    We designed and developed a desktop AC susceptometer for the characterization of materials. The system consists of a lock-in amplifier, an AC function generator, a couple of coils, a sample holder, a computer system running custom software written in C++ and released as freeware, and an Arduino card coupled to a Bluetooth module. The Arduino/Bluetooth serial interface allows the user to have a connection to almost any computer and thus avoids the problem of connectivity between the computer and the peripherals, such as the lock-in amplifier and the function generator. The Bluetooth transmitter/receiver used is a commercial device which is robust and fast. These new features reduce the size and increase the versatility of the susceptometer, for it can be used with a simple laptop. To test our instrument, we performed measurements on magnetic materials and show that the system is reliable at both room temperature and cryogenic temperatures (77 K). The instrument is suitable for any physics or engineering laboratory either for research or academic purposes.

  13. Including Short Period Constraints In the Construction of Full Waveform Tomographic Models

    NASA Astrophysics Data System (ADS)

    Roy, C.; Calo, M.; Bodin, T.; Romanowicz, B. A.

    2015-12-01

    Thanks to the introduction of the Spectral Element Method (SEM) in seismology, which allows accurate computation of the seismic wavefield in complex media, the resolution of regional and global tomographic models has improved in recent years. However, due to computational costs, only long period waveforms are considered, and only long wavelength structure can be constrained. Thus, the resulting 3D models are smooth, and only represent a small volumetric perturbation around a smooth reference model that does not include upper-mantle discontinuities (e.g. MLD, LAB). Extending the computations to shorter periods, necessary for the resolution of smaller scale features, is computationally challenging. In order to overcome these limitations and to account for layered structure in the upper mantle in our full waveform tomography, we include information provided by short period seismic observables (receiver functions and surface wave dispersion), sensitive to sharp boundaries and anisotropic structure respectively. In a first step, receiver functions and dispersion curves are used to generate a number of 1D radially anisotropic shear velocity profiles using a trans-dimensional Markov-chain Monte Carlo (MCMC) algorithm. These 1D profiles include both isotropic and anisotropic discontinuities in the upper mantle (above 300 km depth) beneath selected stations and are then used to build a 3D starting model for the full waveform tomographic inversion. This model is built after 1) interpolation between the available 1D profiles, and 2) homogenization of the layered 1D models to obtain an equivalent smooth 3D starting model in the period range of interest for waveform inversion. The waveforms used in the inversion are collected for paths contained in the region of study and filtered at periods longer than 40 s. 
We use the spectral element code "RegSEM" (Cupillard et al., 2012) for forward computations and a quasi-Newton inversion approach in which kernels are computed using normal mode perturbation theory. We present here the first results of such an approach after successive iterations of a full waveform tomography of the North American continent.

  14. Apparatus and method for fusion of compute and switching functions of exascale system into a single component by using configurable network-on-chip fabric with distributed dual mode input-output ports and programmable network interfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khare, Surhud; Somasekhar, Dinesh; More, Ankit

    Described is an apparatus which comprises: a Network-On-Chip fabric using crossbar switches, having distributed ingress and egress ports; and a dual-mode network interface coupled to at least one crossbar switch, the dual-mode network interface is to include: a dual-mode circuitry; a controller operable to: configure the dual-mode circuitry to transmit and receive differential signals via the egress and ingress ports, respectively, and configure the dual-mode circuitry to transmit and receive single-ended signals via the egress and ingress ports, respectively.

  15. Emergent Literacy Development and Computer Assisted Instruction

    ERIC Educational Resources Information Center

    Trotti, Judy; Hendricks, Randy; Bledsoe, Christie

    2017-01-01

    In this mixed-methods study, researchers examined the literacy development of prekindergarten students (N = 162) randomly placed in one of two treatment groups with each receiving 15 minutes of computer-assisted literacy instruction for four months. Literacy development of a control group of children not receiving computer-assisted instruction was…

  16. Imaging strategies using focusing functions with applications to a North Sea field

    NASA Astrophysics Data System (ADS)

    da Costa Filho, C. A.; Meles, G. A.; Curtis, A.; Ravasi, M.; Kritski, A.

    2018-04-01

    Seismic methods are used in a wide variety of contexts to investigate subsurface Earth structures, and to explore and monitor resources and waste-storage reservoirs in the upper ˜100 km of the Earth's subsurface. Reverse-time migration (RTM) is one widely used seismic method which constructs high-frequency images of subsurface structures. Unfortunately, RTM has certain disadvantages shared with other conventional single-scattering-based methods, such as not being able to correctly migrate multiply scattered arrivals. In principle, the recently developed Marchenko methods can be used to migrate all orders of multiples correctly. In practice, however, using Marchenko methods is costlier to compute than RTM: for a single imaging location, the cost of performing the Marchenko method is several times that of standard RTM, and performing RTM itself requires dedicated use of some of the largest computers in the world for individual data sets. A different imaging strategy is therefore required. We propose a new set of imaging methods which use so-called focusing functions to obtain images with few artifacts from multiply scattered waves, while greatly reducing the number of points across the image at which the Marchenko method need be applied. Focusing functions are outputs of the Marchenko scheme: they are solutions of wave equations that focus in time and space at particular surface or subsurface locations. However, they are mathematical rather than physical entities, being defined only in reference media that equal the true Earth above their focusing depths but are homogeneous below. Here, we use these focusing functions as virtual source/receiver surface seismic surveys, the upgoing focusing function being the virtual received wavefield that is created when the downgoing focusing function acts as a spatially distributed source. These source/receiver wavefields are used in three imaging schemes: one allows specific individual reflectors to be selected and imaged. 
The other two schemes provide either targeted or complete images with distinct advantages over current RTM methods, such as fewer artifacts and artifacts that occur in different locations. The latter property allows the recently published `combined imaging' method to remove almost all artifacts. We show several examples to demonstrate the methods: acoustic 1-D and 2-D synthetic examples, and a 2-D line from an ocean bottom cable field data set. We discuss an extension to elastic media, which is illustrated by a 1.5-D elastic synthetic example.

  17. Seismic velocity structure of the crust and upper mantle beneath the Texas-Gulf of Mexico margin from joint inversion of Ps and Sp receiver functions and surface wave dispersion

    NASA Astrophysics Data System (ADS)

    Agrawal, M.; Pulliam, J.; Sen, M. K.

    2013-12-01

    The seismic structure beneath the Texas Gulf Coast Plain (GCP) is determined via velocity analysis of stacked common conversion point (CCP) Ps and Sp receiver functions and surface wave dispersion. The GCP is a portion of an ocean-continent transition zone, or 'passive margin', where seismic imaging of lithospheric Earth structure via passive seismic techniques has been rare. Seismic data from a temporary array of 22 broadband stations, spaced 16-20 km apart, on a ~380-km-long profile from Matagorda Island, a barrier island in the Gulf of Mexico, to Johnson City, Texas, were employed to construct a coherent image of the crust and uppermost mantle. CCP stacking was applied to data from teleseismic earthquakes to enhance the signal-to-noise ratios of converted phases, such as Ps phases. An inaccurate velocity model, used for time-to-depth conversion in CCP stacking, may produce large errors, especially in a region of substantial lateral velocity variations. An accurate velocity model is therefore essential to constructing high quality depth-domain images. To find accurate P- and S-wave velocity models, we applied a joint modeling approach that searches for best-fitting models via simulated annealing. This joint inversion approach, which we call 'multi-objective optimization in seismology' (MOOS), simultaneously models Ps receiver functions, Sp receiver functions and group velocity surface wave dispersion curves after assigning relative weights to each objective function. Weights are computed from the standard deviations of the data. Statistical tools such as the posterior parameter correlation matrix and posterior probability density (PPD) function are used to evaluate the constraints that each data type places on model parameters. They allow us to identify portions of the model that are well or poorly constrained.
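
    The weighting described (relative weights computed from the standard deviations of each data type) can be illustrated with a simple combined misfit. This is a generic sketch of a weighted multi-objective function with hypothetical names, not the actual MOOS code:

```python
import numpy as np

def combined_misfit(residuals, sigmas):
    """Weighted multi-objective misfit.

    residuals: list of residual vectors, one per data type
               (e.g. Ps receiver functions, Sp receiver functions,
               group velocity dispersion)
    sigmas: list of data standard deviations, one per data type
    Each term is a normalized chi-square, so data types with larger
    uncertainty automatically receive less weight.
    """
    total = 0.0
    for r, s in zip(residuals, sigmas):
        r = np.asarray(r, dtype=float)
        total += np.sum((r / s) ** 2) / r.size
    return total
```

    A global search such as simulated annealing would then minimize this single scalar over candidate velocity models, with each data type contributing on a comparable scale.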

  18. Using Adjoint Methods to Improve 3-D Velocity Models of Southern California

    NASA Astrophysics Data System (ADS)

    Liu, Q.; Tape, C.; Maggi, A.; Tromp, J.

    2006-12-01

    We use adjoint methods popular in climate and ocean dynamics to calculate Fréchet derivatives for tomographic inversions in southern California. The Fréchet derivative of an objective function χ(m), where m denotes the Earth model, may be written in the generic form δχ=int Km(x) δln m(x) d3x, where δln m=δ m/m denotes the relative model perturbation. For illustrative purposes, we construct the 3-D finite-frequency banana-doughnut kernel Km, corresponding to the misfit of a single traveltime measurement, by simultaneously computing the 'adjoint' wave field s† forward in time and reconstructing the regular wave field s backward in time. The adjoint wave field is produced by using the time-reversed velocity at the receiver as a fictitious source, while the regular wave field is reconstructed on the fly by propagating the last frame of the wave field saved by a previous forward simulation backward in time. The approach is based upon the spectral-element method, and only two simulations are needed to produce density, shear-wave, and compressional-wave sensitivity kernels. This method is applied to the SCEC southern California velocity model. Various density, shear-wave, and compressional-wave sensitivity kernels are presented for different phases in the seismograms. We also generate 'event' kernels for Pnl, S and surface waves, which are the Fréchet kernels of misfit functions that measure the P, S or surface wave traveltime residuals at all the receivers simultaneously for one particular event. Effectively, an event kernel is a sum of weighted Fréchet kernels, with weights determined by the associated traveltime anomalies. By the nature of the 3-D simulation, every event kernel is also computed based upon just two simulations, i.e., its construction costs the same amount of computation time as an individual banana-doughnut kernel. 
One can think of the sum of the event kernels for all available earthquakes, called the 'misfit' kernel, as a graphical representation of the gradient of the misfit function. With the capability of computing both the value of the misfit function and its gradient, which assimilates the traveltime anomalies, we are ready to use a non-linear conjugate gradient algorithm to iteratively improve velocity models of southern California.

  19. Data Format Classification for Autonomous Software Defined Radios

    NASA Technical Reports Server (NTRS)

    Simon, Marvin; Divsalar, Dariush

    2005-01-01

    We present maximum-likelihood (ML) coherent and noncoherent classifiers for discriminating between NRZ and Manchester coded (biphase-L) data formats for binary phase-shift-keying (BPSK) modulation. Such classification of the data format is an essential element of so-called autonomous software defined radio (SDR) receivers (similar to so-called cognitive SDR receivers in military applications), where the receiver must perform each of its functions by extracting the appropriate knowledge from the received signal, with as little prior information about the other signal parameters as possible. Small and large SNR approximations to the ML classifiers are also proposed that lead to simpler implementation with comparable performance in their respective SNR regions. Numerical performance results obtained by a combination of computer simulation and, wherever possible, theoretical analyses, are presented and comparisons are made among the various configurations based on the probability of misclassification as a performance criterion. Extensions to other modulations such as QPSK are readily accomplished using the same methods described in the paper.
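
    The flavor of such a classifier can be conveyed by a coherent ML-style sketch: match-filter each symbol against both candidate pulse shapes, accumulate ln cosh of the outputs (the per-symbol log-likelihood for BPSK with unknown equiprobable data), and pick the larger total. This is an illustrative sketch under simplifying assumptions (known symbol timing, unit noise variance), not the paper's exact derivation; all names are hypothetical.

```python
import numpy as np

def classify_nrz_manchester(samples, sps, snr_scale=1.0):
    """Decide between NRZ and Manchester (biphase-L) data formats.

    samples: real baseband samples; sps: samples per symbol (even).
    For each format, accumulate ln cosh of the matched-filter output
    per symbol and choose the format with the larger total.
    """
    nsym = len(samples) // sps
    x = np.asarray(samples[:nsym * sps]).reshape(nsym, sps)
    half = sps // 2
    z_nrz = x.sum(axis=1)                                      # NRZ pulse
    z_man = x[:, :half].sum(axis=1) - x[:, half:].sum(axis=1)  # biphase-L pulse
    # ln cosh(z) = logaddexp(z, -z) - ln 2; the ln 2 terms cancel in the difference
    llr = (np.sum(np.logaddexp(snr_scale * z_nrz, -snr_scale * z_nrz))
           - np.sum(np.logaddexp(snr_scale * z_man, -snr_scale * z_man)))
    return "NRZ" if llr > 0 else "Manchester"
```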

  20. Receiver function structure beneath a broad-band seismic station in south Sumatra

    NASA Astrophysics Data System (ADS)

    MacPherson, K. A.; Hidayat, D.; Goh, S.

    2010-12-01

    We estimated the one-dimensional velocity structure beneath a broad-band station in south Sumatra by the forward modeling and inversion of receiver functions. Station PMBI belongs to the GEOFON seismic network maintained by GFZ-Potsdam, and at a longitude of 104.77° and latitude of -2.93°, sits atop the south Sumatran basin. This station is of interest to researchers at the Earth Observatory of Singapore, as data from it and other stations in Sumatra and Singapore will be incorporated into a regional velocity model for use in seismic hazard analyses. Three-component records from 193 events at teleseismic distances and Mw ≥ 5.0 were examined for this study and 67 records were deemed to have sufficient signal to noise characteristics to be retained for analysis. Observations are primarily from source zones in the Bougainville trench with back-azimuths to the east-south-east, the Japan and Kurile trenches with back-azimuths to the northeast, and a scattering of observations from other azimuths. Due to the level of noise present in even the higher-quality records, the usual frequency-domain deconvolution method of computing receiver functions was ineffective, and a time-domain iterative deconvolution was employed to obtain usable waveforms. Receiver functions with similar back-azimuths were stacked in order to improve their signal to noise ratios. The resulting waveforms are relatively complex, with significant energy being present in the tangential components, indicating heterogeneity in the underlying structure. A dip analysis was undertaken but no clear pattern was observed. However, it is apparent that polarities of the tangential components were generally reversed for records that sample the Sunda trench. Forward modeling of the receiver functions indicates the presence of a near-surface low-velocity layer (Vp≈1.9 km/s) and a Moho depth of ~31 km. 
Details of the crustal structure were investigated by employing time-domain inversions of the receiver functions. General features of those velocity models providing a good fit to the waveform include an approximately one kilometer thick near-surface low-velocity zone, a high-velocity layer over a velocity inversion at mid-crustal depths, and a crust-mantle transition at depths between 30 km and 34 km.
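
    The time-domain iterative deconvolution mentioned above (in the spirit of the widely used Ligorria and Ammon approach) can be sketched compactly: repeatedly cross-correlate the residual radial trace with the vertical trace, place a spike at the lag of the largest peak, and subtract that spike's predicted contribution. This is an illustrative sketch (no Gaussian low-pass filtering or convergence test), not the study's exact code.

```python
import numpy as np

def iterative_deconv(radial, vertical, niter=50):
    """Build a receiver function as a sparse spike train whose
    convolution with the vertical trace best matches the radial trace."""
    n = len(radial)
    rf = np.zeros(n)
    resid = radial.astype(float).copy()
    autocorr0 = np.dot(vertical, vertical)   # zero-lag autocorrelation
    for _ in range(niter):
        # cross-correlation at non-negative lags
        xc = np.correlate(resid, vertical, mode='full')[n - 1:]
        lag = int(np.argmax(np.abs(xc)))
        amp = xc[lag] / autocorr0
        rf[lag] += amp
        # subtract the predicted contribution of this spike
        pred = np.zeros(n)
        pred[lag:] = amp * vertical[:n - lag]
        resid -= pred
    return rf
```

    Because each iteration only adds one spike, the method is robust to noise that defeats spectral-division deconvolution, which is why it was preferred for the noisy records described above.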

  1. Burst mode PCS of EPON

    NASA Astrophysics Data System (ADS)

    Du, Xiao

    2005-02-01

    Normal Gigabit Ethernet continuously transmits or receives 8B/10B codes, including data codes, idle codes, and configuration information. In an Ethernet network, each computer links to one port of a switch through Cat5 cable, and that is sufficient. EPON is different: all ONUs share one fiber in the upstream direction, so if the Gigabit Ethernet PHY were inherited unchanged, collisions would occur. If all ONUs transmitted 8B/10B codes continuously, the optical signals would overlay and the OLT would receive corrupted information. A novel EPON PHY is therefore needed in place of the Ethernet PHY, while keeping its major functions compatible with Ethernet's. In this article, the function of the PCS sublayer is first discussed, and a novel PCS module is presented that can be used not only in EPON systems but also in Gigabit Ethernet systems. The PCS design is based on 1000BASE-X PCS technology: the functions of the 1000BASE-X PCS are implemented first, and the design is then modified to meet the requirements of the EPON system. In the new design, auto-negotiation and synchronization are the same as in 1000BASE-X.

  2. A hybrid brain-computer interface-based mail client.

    PubMed

    Yu, Tianyou; Li, Yuanqing; Long, Jinyi; Li, Feng

    2013-01-01

    Brain-computer interface-based communication plays an important role in brain-computer interface (BCI) applications; electronic mail is one of the most common communication tools. In this study, we propose a hybrid BCI-based mail client that implements electronic mail communication by means of real-time classification of multimodal features extracted from scalp electroencephalography (EEG). With this BCI mail client, users can receive, read, write, and attach files to their mail. Using a BCI mouse that utilizes hybrid brain signals, that is, motor imagery and P300 potential, the user can select and activate the function keys and links on the mail client graphical user interface (GUI). An adaptive P300 speller is employed for text input. The system has been tested with 6 subjects, and the experimental results validate the efficacy of the proposed method.

  3. A Hybrid Brain-Computer Interface-Based Mail Client

    PubMed Central

    Yu, Tianyou; Li, Yuanqing; Long, Jinyi; Li, Feng

    2013-01-01

    Brain-computer interface-based communication plays an important role in brain-computer interface (BCI) applications; electronic mail is one of the most common communication tools. In this study, we propose a hybrid BCI-based mail client that implements electronic mail communication by means of real-time classification of multimodal features extracted from scalp electroencephalography (EEG). With this BCI mail client, users can receive, read, write, and attach files to their mail. Using a BCI mouse that utilizes hybrid brain signals, that is, motor imagery and P300 potential, the user can select and activate the function keys and links on the mail client graphical user interface (GUI). An adaptive P300 speller is employed for text input. The system has been tested with 6 subjects, and the experimental results validate the efficacy of the proposed method. PMID:23690880

  4. Realization of Intelligent Measurement and Control System for Limb Rehabilitation Based on PLC and Touch Screen

    NASA Astrophysics Data System (ADS)

    Liu, Xiangquan

    According to the treatment needs of patients with limb movement disorders, and building on a limb rehabilitation training prototype, the functions of the measurement and control system are analyzed and the design of the system hardware and software is completed. The touch screen, adopted as the host computer and the man-machine interaction window, is responsible for sending commands and displaying training information. The PLC, adopted as the slave computer, is responsible for receiving control commands from the touch screen, collecting sensor data, and regulating the torque and speed of the motor via analog output according to the selected training mode, ultimately realizing both active and passive training for limb rehabilitation therapy.

  5. Design criteria for noncoherent Gaussian channels with MFSK signaling and coding

    NASA Technical Reports Server (NTRS)

    Butman, S. A.; Levitt, B. K.; Bar-David, I.; Lyon, R. F.; Klass, M. J.

    1976-01-01

    This paper presents data and criteria to assess and guide the design of modems for coded noncoherent communication systems subject to practical system constraints of power, bandwidth, noise spectral density, coherence time, and number of orthogonal signals M. Three basic receiver types are analyzed for the noncoherent multifrequency-shift keying (MFSK) additive white Gaussian noise channel: hard decision, unquantized (optimum), and quantized (soft decision). Channel capacity and computational cutoff rate are computed for each type and presented as functions of the predetection signal-to-noise ratio and the number of orthogonal signals. This relates the channel constraints of power, bandwidth, coherence time, and noise power to the optimum choice of signal duration and signal number.
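
    For the hard-decision receiver, the detector output forms an M-ary symmetric channel, and both quantities named in the abstract have closed forms in the symbol error probability p (itself a function of the predetection SNR; computing p for noncoherent MFSK detection is not shown here). A sketch, with hypothetical function names:

```python
import math

def msc_capacity(M, p):
    """Capacity in bits/symbol of the M-ary symmetric channel induced by
    hard-decision MFSK detection with symbol error probability p."""
    if p == 0.0:
        return math.log2(M)
    return math.log2(M) + (1 - p) * math.log2(1 - p) + p * math.log2(p / (M - 1))

def msc_cutoff_rate(M, p):
    """Computational cutoff rate R0 in bits/symbol of the same channel."""
    return math.log2(M) - 2 * math.log2(math.sqrt(1 - p) + math.sqrt(p * (M - 1)))
```

    Since R0 is always below capacity, plotting both as functions of SNR and M brackets the achievable coding region and guides the choice of signal duration and signal number.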

  6. Computing angle of arrival of radio signals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borchardt, John J.; Steele, David K.

    Various technologies pertaining to computing angle of arrival of radio signals are described. A system that is configured for computing the angle of arrival of a radio signal includes a cylindrical sheath wrapped around a cylindrical object, where the cylindrical sheath acts as a ground plane. The system further includes a plurality of antennas that are positioned about an exterior surface of the cylindrical sheath, and receivers respectively coupled to the antennas. The receivers output measurements pertaining to the radio signal. A processing circuit receives the measurements and computes the angle of arrival of the radio signal based upon the measurements.
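
    As an illustration of the underlying geometry only (not the patented multi-antenna processing), a two-element phase interferometer recovers the arrival angle from the phase difference measured between adjacent antennas:

```python
import math

def angle_of_arrival(phase_diff, d, wavelength):
    """Arrival angle of a plane wave, in radians relative to broadside.

    phase_diff: measured phase difference between the two antennas (radians)
    d: antenna separation (meters); wavelength: carrier wavelength (meters)
    Inverts phase_diff = 2*pi*d*sin(theta)/wavelength, clipping to [-1, 1]
    to guard against measurement noise.
    """
    s = phase_diff * wavelength / (2 * math.pi * d)
    return math.asin(max(-1.0, min(1.0, s)))
```

    With several antennas distributed around a cylinder, each adjacent pair yields such an estimate over its own field of view, and combining the pairwise measurements resolves the full 360° of azimuth.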

  7. Stuck in the Shallow End: Education, Race, and Computing. Updated Edition

    ERIC Educational Resources Information Center

    Margolis, Jane

    2017-01-01

    The number of African Americans and Latino/as receiving undergraduate and advanced degrees in computer science is disproportionately low. And relatively few African American and Latino/a high school students receive the kind of institutional encouragement, educational opportunities, and preparation needed for them to choose computer science as a…

  8. Ionic mechanisms in peripheral pain.

    PubMed

    Fransén, Erik

    2014-01-01

    Chronic pain constitutes an important and growing problem in society with large unmet needs with respect to treatment and clear implications for quality of life. Computational modeling is used to complement experimental studies to elucidate mechanisms involved in pain states. Models representing the peripheral nerve ending often address questions related to sensitization or reduction in pain detection threshold. In models of the axon or the cell body of the unmyelinated C-fiber, a large body of work concerns the role of particular sodium channels and mutations of these. Furthermore, in central structures (the spinal cord or higher structures), sensitization often refers not only to enhanced synaptic efficacy but also to elevated intrinsic neuronal excitability. One of the recent developments in computational neuroscience is the emergence of computational neuropharmacology. In this area, computational modeling is used to study mechanisms of pathology with the objective of finding the means of restoring healthy function. This research has received increased attention from the pharmaceutical industry as ion channels have gained increased interest as drug targets. Computational modeling has several advantages, notably the ability to provide mechanistic links between molecular and cellular levels on the one hand and functions at the systems level on the other hand. These characteristics make computational modeling an additional tool to be used in the process of selecting pharmaceutical targets. Furthermore, large-scale simulations can provide a framework to systematically study the effects of several interacting disease parameters or effects from combinations of drugs. © 2014 Elsevier Inc. All rights reserved.

  9. Is longer treatment better? A comparison study of 3 versus 6 months cognitive remediation in schizophrenia.

    PubMed

    Buonocore, Mariachiara; Bosia, Marta; Bechi, Margherita; Spangaro, Marco; Cavedoni, Silvia; Cocchi, Federica; Guglielmino, Carmelo; Bianchi, Laura; Mastromatteo, Antonella Rita; Cavallaro, Roberto

    2017-05-01

    Despite its extensive use for treating cognitive deficits in schizophrenia, computer-assisted cognitive remediation (CACR) currently lacks a standardized protocol. Duration is an important feature to be defined, as it may contribute to heterogeneous outcome. This study compares 2 treatment durations, 3 versus 6 months, to analyze their effects on both cognition and daily functioning. Fifty-seven outpatients with schizophrenia received 3 months of CACR and 41 received 6 months of CACR. All patients were assessed at baseline and after 3 and 6 months with the Brief Assessment for Cognition in Schizophrenia and with the Quality of Life Scale (QLS). Repeated measures ANOVA showed significant improvements in all cognitive domains after 3 months. A significant effect of treatment duration was observed only for executive functions, with significantly higher scores among patients treated for 6 months. Significant improvements in QLS were also observed after 6 months in both groups, with a significant time by treatment interaction for QLS Total Score. Results confirm the efficacy of 3-months CACR in terms of both cognitive and functional improvements, suggesting that an extended intervention may lead to further benefits in executive functions and daily functioning. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  10. Completion processing for data communications instructions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blocksome, Michael A; Kumar, Sameer; Parker, Jeffrey J

    Completion processing of data communications instructions in a distributed computing environment, including: receiving, in an active messaging interface (`AMI`), data communications instructions, at least one instruction specifying a callback function; injecting into an injection FIFO buffer of a data communication adapter an injection descriptor, each slot in the injection FIFO buffer having a corresponding slot in a pending callback list; listing in the pending callback list any callback function specified by an instruction, incrementing a pending callback counter for each listed callback function; transferring payload data as per each injection descriptor, incrementing a transfer counter upon completion of each transfer; determining from counter values whether the pending callback list presently includes callback functions whose data transfers have been completed; and calling by the AMI any such callback functions from the pending callback list, decrementing the pending callback counter for each callback function called.
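    The slot-and-counter bookkeeping described in this abstract can be sketched as follows; all class and method names are illustrative, not taken from the actual AMI implementation.

    ```python
    # Illustrative sketch of pending-callback bookkeeping: each FIFO slot has a
    # matching slot in a pending callback list, with counters tracking listed
    # callbacks and completed transfers.

    class AmiSketch:
        def __init__(self, fifo_slots):
            self.pending_callbacks = [None] * fifo_slots  # one per FIFO slot
            self.pending_count = 0    # callbacks listed but not yet called
            self.transfers_done = 0   # completed payload transfers

        def inject(self, slot, callback=None):
            """Inject a descriptor; list its callback in the matching slot."""
            if callback is not None:
                self.pending_callbacks[slot] = callback
                self.pending_count += 1

        def on_transfer_complete(self, slot):
            """Adapter reports a finished transfer; call that slot's callback."""
            self.transfers_done += 1
            cb = self.pending_callbacks[slot]
            if cb is not None:
                self.pending_callbacks[slot] = None
                self.pending_count -= 1
                cb()

    ami = AmiSketch(fifo_slots=4)
    done = []
    ami.inject(0, callback=lambda: done.append("msg0"))
    ami.on_transfer_complete(0)
    print(done)  # → ['msg0']
    ```

    Comparing the transfer counter against the listed callbacks is what lets the AMI decide, at any moment, which callbacks are safe to call.
    
    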

  11. [Technological advances in neurorehabilitation].

    PubMed

    Gutiérrez-Martínez, Josefina; Núñez-Gaona, Marco Antonio; Carrillo-Mora, Paul

    2014-07-01

    Neurological rehabilitation emerged as a formal discipline in the 1960s for the therapeutic treatment of patients with stroke or spinal cord injury, who develop severe sequelae that affect their motor and sensory abilities. Although the central nervous system has plasticity mechanisms for spontaneous recovery, a high percentage of patients must receive specialized therapies to regain motor function, such as constraint-induced movement therapy or upright physical therapy. Neurorehabilitation has undergone drastic changes over the last two decades with the incorporation of computerized and robotic electronic devices designed to produce positive changes in the cortical excitability of the damaged cerebral hemisphere and thus improve neuroplasticity. Such equipment includes electrotherapy devices, apparatus for transcranial magnetic stimulation, robotic lower-limb orthoses, robots for upper-limb training, systems for functional electrical stimulation, neuroprostheses, and brain-computer interfaces. These devices have generated controversy regarding their application and the benefits reported in the literature. The aim of neurorehabilitation technologies is to take advantage of the preserved functional neuromuscular structures so that they compensate for, or re-learn, the functions previously performed by the damaged areas. The purpose of this article is to review some clinical applications and the benefits these technologies offer to patients with neuronal injury.

  12. Giving and Receiving Advice in Computer-Mediated Peer Response Activities

    ERIC Educational Resources Information Center

    Tsai, Mei-Hsing; Kinginger, Celeste

    2015-01-01

    In synchronous computer-mediated contexts, peer-to-peer interaction at the microlevel has received little scrutiny. In applying a conversation analysis approach, this study scrutinizes the precise nature of peer-to-peer advice giving and receiving. In this process, an advice giver can be viewed at certain moments as more competent to evaluate a…

  13. Combining ray tracing and CFD in the thermal analysis of a parabolic dish tubular cavity receiver

    NASA Astrophysics Data System (ADS)

    Craig, Ken J.; Marsberg, Justin; Meyer, Josua P.

    2016-05-01

    This paper describes the numerical evaluation of a tubular receiver used in a dish Brayton cycle. In previous work considering the use of Computational Fluid Dynamics (CFD) to perform the calculation of the absorbed radiation from the parabolic dish into the cavity as well as the resulting conjugate heat transfer, it was shown that an axi-symmetric model of the dish and receiver absorbing surfaces was useful in reducing the computational cost required for a full 3-D discrete ordinates solution, but concerns remained about its accuracy. To increase the accuracy, the Monte Carlo ray tracer SolTrace is used to perform the calculation of the absorbed radiation profile to be used in the conjugate heat transfer CFD simulation. The paper describes an approach for incorporating a complex geometry like a tubular receiver generated using CFD software into SolTrace. The results illustrate the variation of CFD mesh density that translates into the number of elements in SolTrace as well as the number of rays used in the Monte Carlo approach and their effect on obtaining a resolution-independent solution. The conjugate heat transfer CFD simulation illustrates the effect of applying the SolTrace surface heat flux profile solution as a volumetric heat source to heat up the air inside the tube. Heat losses due to convection and thermal re-radiation are also determined as a function of different tube absorptivities.

  14. Targeted, activity-dependent spinal stimulation produces long-lasting motor recovery in chronic cervical spinal cord injury

    PubMed Central

    McPherson, Jacob G.; Miller, Robert R.; Perlmutter, Steve I.

    2015-01-01

    Use-dependent movement therapies can lead to partial recovery of motor function after neurological injury. We attempted to improve recovery by developing a neuroprosthetic intervention that enhances movement therapy by directing spike timing-dependent plasticity in spared motor pathways. Using a recurrent neural–computer interface in rats with a cervical contusion of the spinal cord, we synchronized intraspinal microstimulation below the injury with the arrival of functionally related volitional motor commands signaled by muscle activity in the impaired forelimb. Stimulation was delivered during physical retraining of a forelimb behavior and throughout the day for 3 mo. Rats receiving this targeted, activity-dependent spinal stimulation (TADSS) exhibited markedly enhanced recovery compared with animals receiving targeted but open-loop spinal stimulation and rats receiving physical retraining alone. On a forelimb reach and grasp task, TADSS animals recovered 63% of their preinjury ability, more than two times the performance level achieved by the other therapy groups. Therapeutic gains were maintained for 3 additional wk without stimulation. The results suggest that activity-dependent spinal stimulation can induce neural plasticity that improves behavioral recovery after spinal cord injury. PMID:26371306

  15. Impact of thermal energy storage properties on solar dynamic space power conversion system mass

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert J.; Coles-Hamilton, Carolyn E.; Lacy, Dovie E.

    1987-01-01

    A 16 parameter solar concentrator/heat receiver mass model is used in conjunction with Stirling and Brayton Power Conversion System (PCS) performance and mass computer codes to determine the effect of thermal energy storage (TES) material property changes on overall PCS mass as a function of steady state electrical power output. Included in the PCS mass model are component masses as a function of thermal power for: concentrator, heat receiver, heat exchangers (source unless integral with heat receiver, heat sink, regenerator), heat engine units with optional parallel redundancy, power conditioning and control (PC and C), PC and C radiator, main radiator, and structure. Critical TES properties are: melting temperature, heat of fusion, density of the liquid phase, and the ratio of solid-to-liquid density. Preliminary results indicate that even though overall system efficiency increases with TES melting temperature up to 1400 K for concentrator surface accuracies of 1 mrad or better, reductions in the overall system mass beyond that achievable with lithium fluoride (LiF) can be accomplished only if the heat of fusion is at least 800 kJ/kg and the liquid density is comparable to that of LiF (1800 kg/cu m).

  17. Extending the Stabilized Supralinear Network model for binocular image processing.

    PubMed

    Selby, Ben; Tripp, Bryan

    2017-06-01

    The visual cortex is both extensive and intricate. Computational models are needed to clarify the relationships between its local mechanisms and high-level functions. The Stabilized Supralinear Network (SSN) model was recently shown to account for many receptive field phenomena in V1, and also to predict subtle receptive field properties that were subsequently confirmed in vivo. In this study, we performed a preliminary exploration of whether the SSN is suitable for incorporation into large, functional models of the visual cortex, considering both its extensibility and computational tractability. First, whereas the SSN receives abstract orientation signals as input, we extended it to receive images (through a linear-nonlinear stage), and found that the extended version behaved similarly. Secondly, whereas the SSN had previously been studied in a monocular context, we found that it could also reproduce data on interocular transfer of surround suppression. Finally, we reformulated the SSN as a convolutional neural network, and found that it scaled well on parallel hardware. These results provide additional support for the plausibility of the SSN as a model of lateral interactions in V1, and suggest that the SSN is well suited as a component of complex vision models. Future work will use the SSN to explore relationships between local network interactions and sophisticated vision processes in large networks. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Multi-Mode 3D Kirchhoff Migration of Receiver Functions at Continental Scale With Applications to USArray

    NASA Astrophysics Data System (ADS)

    Millet, F.; Bodin, T.; Rondenay, S.

    2017-12-01

    The teleseismic scattered seismic wavefield contains valuable information about heterogeneities and discontinuities inside the Earth. By using fast Receiver Function (RF) migration techniques such as classic Common Conversion Point (CCP) stacks, one can easily interpret structural features down to a few hundred kilometers in the mantle. However, strong simplifying 1D assumptions limit the scope of these methods to structures that are relatively planar and sub-horizontal at local-to-regional scales, such as the Lithosphere-Asthenosphere Boundary and the Mantle Transition Zone discontinuities. Other more robust 2D and 2.5D methods rely on fewer assumptions but require considerable, sometimes prohibitive, computation time. Following the ideas of Cheng (2017), we have implemented a simple fully 3D Prestack Kirchhoff RF migration scheme which uses the FM3D fast Eikonal solver to compute travel times and scattering angles. The method accounts for 3D elastic point scattering and includes free-surface multiples, resulting in enhanced images of laterally varying dipping structures, such as subducted slabs. The method is tested for subduction structures using 2.5D synthetics generated with Raysum and 3D synthetics generated with SPECFEM3D. Results show that dip angles, depths and lateral variations can be recovered almost perfectly. The approach is ideally suited for applications to dense regional datasets, including those collected across the Cascadia and Alaska subduction zones by USArray.
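    A drastically simplified, 1-D illustration of the Kirchhoff-style stacking idea: each trial scatterer depth predicts a P-to-S conversion delay, and the receiver functions are stacked at that delay. The velocities, vertical-ray geometry, and function names below are illustrative assumptions only, not the 3-D eikonal-based scheme of the abstract.

    ```python
    # Toy 1-D migration kernel: stack receiver functions at the vertical-incidence
    # Ps delay predicted by each candidate depth (assumed constant Vp, Vs).

    import numpy as np

    def migrate(rfs, dt, depths, vp=6.5, vs=3.7):
        """Stack receiver functions rfs (traces x samples, sample interval dt s)
        at the Ps - P delay predicted for each trial depth (km)."""
        image = np.zeros(len(depths))
        for i, z in enumerate(depths):
            delay = z * (1.0 / vs - 1.0 / vp)   # Ps minus P time, vertical rays
            k = int(round(delay / dt))
            if k < rfs.shape[1]:
                image[i] = rfs[:, k].sum()       # stack over traces
        return image

    rfs = np.zeros((5, 100))
    rfs[:, 35] = 1.0                             # synthetic Ps arrival in 5 traces
    z = np.arange(0, 60.0)
    print(z[np.argmax(migrate(rfs, 0.1, z))])    # → 30.0
    ```

    The full 3-D method replaces the vertical-ray delay with eikonal travel times and weights each sample by 3-D scattering angles, but the stack-at-predicted-time principle is the same.
    
    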

  19. Membrane hydrophone phase characteristics through nonlinear acoustics measurements.

    PubMed

    Bloomfield, Philip E; Gandhi, Gaurav; Lewin, Peter A

    2011-11-01

    This work considers the need for both the amplitude and phase to fully characterize polyvinylidene fluoride (PVDF) membrane hydrophones and presents a comprehensive discussion of the nonlinear acoustic measurements utilized to extract the phase information and the experimental results taken with two widely used PVDF membrane hydrophones up to 100 MHz. A semi-empirical computer model utilized the hyperbolic propagation operator to predict the nonlinear pressure field and provide the complex frequency response of the corresponding source transducer. The PVDF hydrophone phase characteristics, which were obtained directly from the difference between the computer-modeled nonlinear field simulation and the corresponding measured harmonic frequency phase values, agree to within 10% with the phase predictions obtained from receive-transfer-function simulations based on software modeling of the membrane's physical properties. Cable loading effects and membrane hydrophone resonances were distinguished and identified through a series of impedance measurements and receive transfer function simulations on the hydrophones including their hard-wired coaxial cables. The results obtained indicate that the PVDF membrane hydrophone's phase versus frequency plot exhibits oscillations about a monotonically decreasing line. The maxima and minima inflection point slopes occur at the membrane thickness resonances and antiresonances, respectively. A cable resonance was seen at 100 MHz for the hydrophone with a 1-m cable attached, but not seen for the hydrophone with a shorter 0.65-m cable.

  20. Crustal and Upper Mantle S-velocity Structure From Receiver Function Analysis Around Terra Nova Bay Base, Antarctica

    NASA Astrophysics Data System (ADS)

    Piana Agostinetti, N.; Amato, A.; Cattaneo, M.; de Gori, P.; di Bona, M.

    In the framework of the Italian PNRA (Progetto Nazionale di Ricerche in Antartide), we have started to re-analyze teleseismic waveforms recorded with three-component seismometers (equipped with 5-second sensors, Lennartz 3D-5s) during five summer campaigns, from 1993 to 2000. Seismic stations were deployed around the Terra Nova Bay (TNB) Italian base, from the sea to the interior of the Transantarctic Mountains (TAM), the most striking example of a noncontractional mountain belt. During the last campaign (1999-2000), seismic stations were deployed deep into Northern Victoria Land to reach the Rennick and Lillie Glaciers area and the George V coast region, the northernmost part of the TAM. Our main goals were: to compute, using the frequency-domain deconvolution method of Di Bona [1998], receiver functions covering the whole area around the TNB Italian Antarctic base; to map Moho depth and intracrustal S-wave velocity discontinuities from 1-D velocity models computed using Sambridge's inversion scheme [Sambridge, 1999]; to analyze new teleseismic waveforms recorded near the TNB base (continuous recording from 1999 to the present permits more accurate modelling of the S-velocity crustal structure in this critical area, situated at the edge of the hypothesized rift [Stern and ten Brink, 1989; Stump and Fitzgerald, 1992; ten Brink et al., 1997]); and to present final results from the BACKTAM expedition.

  1. Interior view, looking northeast in computer room Over-the-Horizon Backscatter ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view, looking northeast in computer room - Over-the-Horizon Backscatter Radar Network, Tulelake Radar Site Receive Sector Five Receiver Building, Unnamed Road West of Double Head Road, Tulelake, Siskiyou County, CA

  2. Interior view, looking south in computer room Over-the-Horizon Backscatter ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view, looking south in computer room - Over-the-Horizon Backscatter Radar Network, Tulelake Radar Site Receive Sector Six Receiver Building, Unnamed Road West of Double Head Road, Tulelake, Siskiyou County, CA

  3. Pacing a data transfer operation between compute nodes on a parallel computer

    DOEpatents

    Blocksome, Michael A [Rochester, MN

    2011-09-13

    Methods, systems, and products are disclosed for pacing a data transfer between compute nodes on a parallel computer that include: transferring, by an origin compute node, a chunk of an application message to a target compute node; sending, by the origin compute node, a pacing request to a target direct memory access (`DMA`) engine on the target compute node using a remote get DMA operation; determining, by the origin compute node, whether a pacing response to the pacing request has been received from the target DMA engine; and transferring, by the origin compute node, a next chunk of the application message if the pacing response to the pacing request has been received from the target DMA engine.
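    The paced transfer loop described in these claims can be sketched schematically; the function and parameter names here are hypothetical, and the real mechanism uses DMA remote-get operations rather than Python callables.

    ```python
    # Sketch of chunked, paced transfer: send a chunk, then wait for a pacing
    # response from the target before sending the next chunk.

    def paced_transfer(message, chunk_size, send_chunk, await_pacing_response):
        """Send message in chunks, pacing between chunks via the target."""
        offset = 0
        while offset < len(message):
            send_chunk(message[offset:offset + chunk_size])
            offset += chunk_size
            if offset < len(message):
                # in the patent, a remote-get DMA pacing request; here a stub
                await_pacing_response()

    sent = []
    paced_transfer(b"abcdefgh", 3, sent.append, lambda: None)
    print(sent)  # → [b'abc', b'def', b'gh']
    ```

    Blocking on the pacing response lets the target throttle the origin, so the origin never floods the target's DMA engine with unconsumed chunks.
    
    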

  4. Cost-Benefit Analysis for ECIA Chapter 1 and State DPPF Programs Comparing Groups Receiving Regular Program Instruction and Groups Receiving Computer Assisted Instruction/Computer Management System (CAI/CMS). 1986-87.

    ERIC Educational Resources Information Center

    Chamberlain, Ed

    A cost benefit study was conducted to determine the effectiveness of a computer assisted instruction/computer management system (CAI/CMS) as an alternative to conventional methods of teaching reading within Chapter 1 and DPPF funded programs of the Columbus (Ohio) Public Schools. The Chapter 1 funded Compensatory Language Experiences and Reading…

  5. Evidence for the contemporary magmatic system beneath Long Valley Caldera from local earthquake tomography and receiver function analysis

    USGS Publications Warehouse

    Seccia, D.; Chiarabba, C.; De Gori, P.; Bianchi, I.; Hill, D.P.

    2011-01-01

    We present a new P wave and S wave velocity model for the upper crust beneath Long Valley Caldera obtained using local earthquake tomography and receiver function analysis. We computed the tomographic model using both a graded inversion scheme and a traditional approach. We complement the tomographic Vp model with a teleseismic receiver function model based on data from broadband seismic stations (MLAC and MKV) located on the SE and SW margins of the resurgent dome inside the caldera. The inversions resolve (1) a shallow, high-velocity P wave anomaly associated with the structural uplift of a resurgent dome; (2) an elongated, WNW striking low-velocity anomaly (8%–10% reduction in Vp) at a depth of 6 km (4 km below mean sea level) beneath the southern section of the resurgent dome; and (3) a broad, low-velocity volume (5% reduction in Vp and as much as 40% reduction in Vs) in the depth interval 8–14 km (6–12 km below mean sea level) beneath the central section of the caldera. The two low-velocity volumes partially overlap the geodetically inferred inflation sources that drove uplift of the resurgent dome associated with caldera unrest between 1980 and 2000, and they likely reflect the ascent path for magma or magmatic fluids into the upper crust beneath the caldera.

  6. Slope tomography based on eikonal solvers and the adjoint-state method

    NASA Astrophysics Data System (ADS)

    Tavakoli F., B.; Operto, S.; Ribodetti, A.; Virieux, J.

    2017-06-01

    Velocity macromodel building is a crucial step in the seismic imaging workflow as it provides the necessary background model for migration or full waveform inversion. In this study, we present a new formulation of stereotomography that can handle more efficiently long-offset acquisition, complex geological structures and large-scale data sets. Stereotomography is a slope tomographic method based upon a semi-automatic picking of local coherent events. Each local coherent event, characterized by its two-way traveltime and two slopes in common-shot and common-receiver gathers, is tied to a scatterer or a reflector segment in the subsurface. Ray tracing provides a natural forward engine to compute traveltime and slopes but can suffer from non-uniform ray sampling in presence of complex media and long-offset acquisitions. Moreover, most implementations of stereotomography explicitly build a sensitivity matrix, leading to the resolution of large systems of linear equations, which can be cumbersome when large-scale data sets are considered. Overcoming these issues comes with a new matrix-free formulation of stereotomography: a factored eikonal solver based on the fast sweeping method to compute first-arrival traveltimes and an adjoint-state formulation to compute the gradient of the misfit function. By solving eikonal equation from sources and receivers, we make the computational cost proportional to the number of sources and receivers while it is independent of picked events density in each shot and receiver gather. The model space involves the subsurface velocities and the scatterer coordinates, while the dips of the reflector segments are implicitly represented by the spatial support of the adjoint sources and are updated through the joint localization of nearby scatterers. We present an application on the complex Marmousi model for a towed-streamer acquisition and a realistic distribution of local events. 
We show that the estimated model, built without any prior knowledge of the velocities, provides a reliable initial model for frequency-domain FWI of long-offset data for a starting frequency of 4 Hz, although some artefacts at the reservoir level result from a deficit of illumination. This formulation of slope tomography provides a computationally efficient alternative to waveform inversion method such as reflection waveform inversion or differential-semblance optimization to build an initial model for pre-stack depth migration and conventional FWI.

  7. Parallel goal-oriented adaptive finite element modeling for 3D electromagnetic exploration

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Key, K.; Ovall, J.; Holst, M.

    2014-12-01

    We present a parallel goal-oriented adaptive finite element method for accurate and efficient electromagnetic (EM) modeling of complex 3D structures. An unstructured tetrahedral mesh allows this approach to accommodate arbitrarily complex 3D conductivity variations and a priori known boundaries. The total electric field is approximated by the lowest order linear curl-conforming shape functions and the discretized finite element equations are solved by a sparse LU factorization. Accuracy of the finite element solution is achieved through adaptive mesh refinement that is performed iteratively until the solution converges to the desired accuracy tolerance. Refinement is guided by a goal-oriented error estimator that uses a dual-weighted residual method to optimize the mesh for accurate EM responses at the locations of the EM receivers. As a result, the mesh refinement is highly efficient since it only targets the elements where the inaccuracy of the solution corrupts the response at the possibly distant locations of the EM receivers. We compare the accuracy and efficiency of two approaches for estimating the primary residual error required at the core of this method: one uses local element and inter-element residuals and the other relies on solving a global residual system using a hierarchical basis. For computational efficiency our method follows the Bank-Holst algorithm for parallelization, where solutions are computed in subdomains of the original model. To resolve the load-balancing problem, this approach applies a spectral bisection method to divide the entire model into subdomains that have approximately equal error and the same number of receivers. The finite element solutions are then computed in parallel with each subdomain carrying out goal-oriented adaptive mesh refinement independently. 
We validate the newly developed algorithm by comparison with controlled-source EM solutions for 1D layered models and with 2D results from our earlier 2D goal oriented adaptive refinement code named MARE2DEM. We demonstrate the performance and parallel scaling of this algorithm on a medium-scale computing cluster with a marine controlled-source EM example that includes a 3D array of receivers located over a 3D model that includes significant seafloor bathymetry variations and a heterogeneous subsurface.

  8. Online mentalising investigated with functional MRI.

    PubMed

    Kircher, Tilo; Blümel, Isabelle; Marjoram, Dominic; Lataster, Tineke; Krabbendam, Lydia; Weber, Jochen; van Os, Jim; Krach, Sören

    2009-05-01

    For successful interpersonal communication, inferring the intentions, goals, or desires of others is highly advantageous. Increasingly, humans also interact with computers or robots. In this study, we sought to determine to what degree an interactive task, which involves receiving feedback from social partners that can be used to infer intent, engaged the medial prefrontal cortex, a region previously associated with, among others, Theory of Mind processes. Participants were scanned using fMRI as they played an adapted version of the Prisoner's Dilemma Game with alleged human and computer partners who were outside the scanner. The medial frontal cortex was activated when playing both the human and the computer partner, while the direct contrast revealed significantly stronger signal change during the human-human interaction. The results suggest a link between activity in the medial prefrontal cortex and the partner played in a mentalising task. This signal change was also present for the computer partner. Implying agency or a will to non-human actors might be an innate human resource that could lead to an evolutionary advantage.

  9. Computational substrates of social value in interpersonal collaboration.

    PubMed

    Fareri, Dominic S; Chang, Luke J; Delgado, Mauricio R

    2015-05-27

    Decisions to engage in collaborative interactions require enduring considerable risk, yet provide the foundation for building and maintaining relationships. Here, we investigate the mechanisms underlying this process and test a computational model of social value to predict collaborative decision making. Twenty-six participants played an iterated trust game and chose to invest more frequently with their friends compared with a confederate or computer despite equal reinforcement rates. This behavior was predicted by our model, which posits that people receive a social value reward signal from reciprocation of collaborative decisions conditional on the closeness of the relationship. This social value signal was associated with increased activity in the ventral striatum and medial prefrontal cortex, which significantly predicted the reward parameters from the social value model. Therefore, we demonstrate that the computation of social value drives collaborative behavior in repeated interactions and provide a mechanistic account of reward circuit function instantiating this process. Copyright © 2015 the authors.

  10. Study of pseudo noise CW diode laser for ranging applications

    NASA Technical Reports Server (NTRS)

    Lee, Hyo S.; Ramaswami, Ravi

    1992-01-01

    A new Pseudo Random Noise (PN) modulated CW diode laser radar system is being developed for real-time ranging of targets at both close and large distances (greater than 10 km) to satisfy a wide range of applications, from robotics to future space applications. Results from computer modeling and statistical analysis, along with some preliminary data obtained from a prototype system, are presented. The received signal is averaged for a short time to recover the target response function. It is found that even with uncooperative targets, based on the design parameters used (200-mW laser and 20-cm receiver), accurate ranging is possible up to about 15 km, beyond which the signal-to-noise ratio (SNR) becomes too small for real-time analog detection.
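    The core of PN ranging (generic to the technique, not specific to this system) is cross-correlating the received signal with the transmitted pseudo-random code; the correlation peak marks the round-trip delay and hence the range. A minimal sketch with illustrative parameters:

    ```python
    # Recover a round-trip delay by correlating a noisy, delayed copy of a
    # pseudo-random ±1 code against the transmitted code at every lag.

    import numpy as np

    rng = np.random.default_rng(0)
    code = rng.choice([-1.0, 1.0], size=1024)    # pseudo-random ±1 chip sequence
    true_delay = 137                             # delay in chips
    received = np.roll(code, true_delay) + 0.5 * rng.standard_normal(1024)

    # circular cross-correlation: matched filter evaluated at every lag
    corr = np.array([np.dot(received, np.roll(code, lag)) for lag in range(1024)])
    est_delay = int(np.argmax(corr))
    print(est_delay)  # → 137
    ```

    Averaging the received signal over many code periods, as the abstract describes, raises the correlation peak above the noise floor for weak returns.
    
    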

  11. Buffer thermal energy storage for an air Brayton solar engine

    NASA Technical Reports Server (NTRS)

    Strumpf, H. J.; Barr, K. P.

    1981-01-01

    The application of latent-heat buffer thermal energy storage to a point-focusing solar receiver equipped with an air Brayton engine was studied. To demonstrate the effect of buffer thermal energy storage on engine operation, a computer program was written which models the recuperator, receiver, and thermal storage device as finite-element thermal masses. Actual operating or predicted performance data are used for all components, including the rotating equipment. Based on insolation input and a specified control scheme, the program predicts the Brayton engine operation, including flows, temperatures, and pressures for the various components, along with the engine output power. An economic parametric study indicates that the economic viability of buffer thermal energy storage is largely a function of the achievable engine life.
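    The latent-heat buffering that motivates such storage can be illustrated with a simple enthalpy-temperature relation for a phase-change element: temperature rises with stored enthalpy except during melting, where it plateaus at the melting point. This is a hedged sketch; the property values below are arbitrary placeholders, not the study's LiF data.

    ```python
    # Temperature of a lumped phase-change element as a function of stored
    # enthalpy: sensible heating, a latent plateau at t_melt, then sensible
    # heating of the liquid.

    def temperature(enthalpy, mass, cp, h_fus, t0, t_melt):
        """Temperature (K) given total enthalpy (J) added above state t0 (K)."""
        h_to_melt = mass * cp * (t_melt - t0)        # sensible heat up to melting
        if enthalpy <= h_to_melt:
            return t0 + enthalpy / (mass * cp)
        if enthalpy <= h_to_melt + mass * h_fus:
            return t_melt                             # latent plateau while melting
        return t_melt + (enthalpy - h_to_melt - mass * h_fus) / (mass * cp)

    # halfway through melting, the element still sits at the melting point,
    # which is what buffers the engine against insolation transients
    print(temperature(2.0e6, mass=1.0, cp=2000.0, h_fus=1.0e6, t0=300.0, t_melt=1121.0))
    ```
    
    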

  12. High-speed on-chip windowed centroiding using photodiode-based CMOS imager

    NASA Technical Reports Server (NTRS)

    Pain, Bedabrata (Inventor); Sun, Chao (Inventor); Yang, Guang (Inventor); Cunningham, Thomas J. (Inventor); Hancock, Bruce (Inventor)

    2003-01-01

    A centroid computation system is disclosed. The system has an imager array, a switching network, computation elements, and a divider circuit. The imager array has columns and rows of pixels. The switching network is adapted to receive pixel signals from the imager array. The plurality of computation elements operates to compute inner products for at least x and y centroids. The plurality of computation elements has only passive elements to provide inner products of pixel signals from the switching network. The divider circuit is adapted to receive the inner products and compute the x and y centroids.

  13. High-speed on-chip windowed centroiding using photodiode-based CMOS imager

    NASA Technical Reports Server (NTRS)

    Pain, Bedabrata (Inventor); Sun, Chao (Inventor); Yang, Guang (Inventor); Cunningham, Thomas J. (Inventor); Hancock, Bruce (Inventor)

    2004-01-01

    A centroid computation system is disclosed. The system has an imager array, a switching network, computation elements, and a divider circuit. The imager array has columns and rows of pixels. The switching network is adapted to receive pixel signals from the imager array. The plurality of computation elements operates to compute inner products for at least x and y centroids. The plurality of computation elements has only passive elements to provide inner products of pixel signals from the switching network. The divider circuit is adapted to receive the inner products and compute the x and y centroids.
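    The windowed centroid itself is the intensity-weighted mean pixel position: inner products sum(x·I) and sum(y·I), divided by the total intensity sum(I). A NumPy sketch of the arithmetic (the patent computes these quantities with passive on-chip elements and a divider circuit, not software):

    ```python
    # Centroid of a 2-D pixel-intensity window as ratios of inner products.

    import numpy as np

    def centroid(window):
        """Return (x_c, y_c) of a 2-D intensity window."""
        total = window.sum()
        ys, xs = np.indices(window.shape)       # row (y) and column (x) indices
        return (xs * window).sum() / total, (ys * window).sum() / total

    img = np.zeros((5, 5))
    img[2, 3] = 1.0                             # single bright pixel: column 3, row 2
    print(centroid(img))  # → (3.0, 2.0)
    ```
    
    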

  14. Precise Clock Solutions Using Carrier Phase from GPS Receivers in the International GPS Service

    NASA Technical Reports Server (NTRS)

    Zumberge, J. F.; Jefferson, D. C.; Stowers, D. A.; Tjoelker, R. L.; Young, L. E.

    1999-01-01

    As one of its activities as an Analysis Center in the International GPS Service (IGS), the Jet Propulsion Laboratory (JPL) uses data from a globally distributed network of geodetic-quality GPS receivers to estimate precise clock solutions, relative to a chosen reference, for both the GPS satellites and GPS receiver internal clocks, every day. The GPS constellation and ground network provide geometrical strength resulting in formal errors of about 100 psec for these estimates. Some of the receivers in the global IGS network contain high-quality frequency references, such as hydrogen masers. The clock solutions for such receivers are smooth at the 20-psec level on time scales of a few minutes. There are occasional (daily to weekly) shifts at the microsecond level, symptomatic of receiver resets, and 200-psec-level discontinuities at midnight due to 1-day processing boundaries. Relative clock solutions among 22 IGS sites proposed as "fiducial" in the IGS/BIPM pilot project have been examined over a recent 4-week period. This allows a quantitative measure of receiver reset frequency as a function of site. For days and sites without resets, the Allan deviation of the relative clock solutions is also computed for subdaily values of tau.
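    The overlapping Allan deviation used in such stability analyses can be sketched in a few lines. A minimal Python version, assuming evenly sampled clock (time-error) solutions; the function name and sampling interval are illustrative, not part of the JPL/IGS processing software:

```python
import math

def allan_deviation(x, tau0, m):
    """Overlapping Allan deviation of phase (time-error) samples x,
    taken every tau0 seconds, at averaging time tau = m * tau0."""
    n = len(x)
    if n < 2 * m + 1:
        raise ValueError("not enough samples for this averaging time")
    tau = m * tau0
    # second differences of the phase over the averaging time
    s = sum((x[i + 2 * m] - 2 * x[i + m] + x[i]) ** 2 for i in range(n - 2 * m))
    return math.sqrt(s / (2.0 * tau ** 2 * (n - 2 * m)))
```

    A clock with a pure frequency offset (a linear phase ramp) gives essentially zero Allan deviation, which is what makes the statistic insensitive to deterministic offsets while exposing noise.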

  15. The physics of bat biosonar

    NASA Astrophysics Data System (ADS)

    Müller, Rolf

    2011-10-01

    Bats have evolved one of the most capable and at the same time parsimonious sensory systems found in nature. Using active and passive biosonar as a major - and often sufficient - far sense, different bat species are able to master a wide variety of sensory tasks under very dissimilar sets of constraints. Given the limited computational resources of the bat's brain, this performance is unlikely to be explained as the result of brute-force, black-box-style computations. Instead, the animals must rely heavily on in-built physics knowledge in order to ensure that all required information is encoded reliably into the acoustic signals received at the ear drum. To this end, bats can manipulate the emitted and received signals in the physical domain: By diffracting the outgoing and incoming ultrasonic waves with intricate baffle shapes (i.e., noseleaves and outer ears), the animals can generate selectivity filters that are joint functions of space and frequency. To achieve this, bats employ structural features such as resonance cavities and diffracting ridges. In addition, some bat species can dynamically adjust the shape of their selectivity filters through muscular actuation.

  16. Neural control of computer cursor velocity by decoding motor cortical spiking activity in humans with tetraplegia

    NASA Astrophysics Data System (ADS)

    Kim, Sung-Phil; Simeral, John D.; Hochberg, Leigh R.; Donoghue, John P.; Black, Michael J.

    2008-12-01

    Computer-mediated connections between human motor cortical neurons and assistive devices promise to improve or restore lost function in people with paralysis. Recently, a pilot clinical study of an intracortical neural interface system demonstrated that a tetraplegic human was able to obtain continuous two-dimensional control of a computer cursor using neural activity recorded from his motor cortex. This control, however, was not sufficiently accurate for reliable use in many common computer control tasks. Here, we studied several central design choices for such a system including the kinematic representation for cursor movement, the decoding method that translates neuronal ensemble spiking activity into a control signal and the cursor control task used during training for optimizing the parameters of the decoding method. In two tetraplegic participants, we found that controlling a cursor's velocity resulted in more accurate closed-loop control than controlling its position directly and that cursor velocity control was achieved more rapidly than position control. Control quality was further improved over conventional linear filters by using a probabilistic method, the Kalman filter, to decode human motor cortical activity. Performance assessment based on standard metrics used for the evaluation of a wide range of pointing devices demonstrated significantly improved cursor control with velocity rather than position decoding. Disclosure. JPD is the Chief Scientific Officer and a director of Cyberkinetics Neurotechnology Systems (CYKN); he holds stock and receives compensation. JDS has been a consultant for CYKN. LRH receives clinical trial support from CYKN.
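    The Kalman-filter decoding step described above follows the standard predict/update recursion. A deliberately minimal scalar sketch (one velocity component; the dynamics and observation parameters are hypothetical, not the neuronal-ensemble model fitted in the study):

```python
def kalman_velocity_decoder(observations, A=1.0, W=0.1, H=2.0, Q=1.0):
    """Scalar Kalman filter: latent cursor velocity v_t with random-walk
    dynamics v_t = A*v_{t-1} + w (variance W), observed through a linear
    firing-rate model z_t = H*v_t + q (variance Q)."""
    v, P = 0.0, 1.0            # initial state estimate and variance
    estimates = []
    for z in observations:
        # predict step
        v_pred = A * v
        P_pred = A * P * A + W
        # update step
        K = P_pred * H / (H * P_pred * H + Q)   # Kalman gain
        v = v_pred + K * (z - H * v_pred)
        P = (1.0 - K * H) * P_pred
        estimates.append(v)
    return estimates
```

    In the actual decoder the state is the two-dimensional cursor velocity and the observation is a vector of binned firing rates, so A, W, H and Q become matrices, but the recursion is identical.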

  17. Prediction of 5-year overall survival in cervical cancer patients treated with radical hysterectomy using computational intelligence methods.

    PubMed

    Obrzut, Bogdan; Kusy, Maciej; Semczuk, Andrzej; Obrzut, Marzanna; Kluska, Jacek

    2017-12-12

    Computational intelligence methods, including non-linear classification algorithms, can be used in medical research and practice as a decision-making tool. This study aimed to evaluate the usefulness of artificial intelligence models for 5-year overall survival prediction in patients with cervical cancer treated by radical hysterectomy. The data set was collected from 102 patients with cervical cancer FIGO stage IA2-IIB who underwent primary surgical treatment. Twenty-three demographic and tumor-related parameters, as well as selected perioperative data, were collected for each patient. The simulations involved six computational intelligence methods: the probabilistic neural network (PNN), multilayer perceptron network, gene expression programming classifier, support vector machines algorithm, radial basis function neural network and k-Means algorithm. The prediction ability of the models was determined based on the accuracy, sensitivity and specificity, as well as the area under the receiver operating characteristic curve. The results of the computational intelligence methods were compared with the results of linear regression analysis as a reference model. The best results were obtained by the PNN model. This neural network provided very high prediction ability, with an accuracy of 0.892 and sensitivity of 0.975. The area under the receiver operating characteristic curve of PNN was also high, 0.818. The outcomes obtained by other classifiers were markedly worse. The PNN model is an effective tool for predicting 5-year overall survival in cervical cancer patients treated with radical hysterectomy.
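    The evaluation metrics named above (accuracy, sensitivity, specificity and area under the ROC curve) can all be computed directly from classifier scores. A small self-contained sketch; the threshold and example data are illustrative, not from the study:

```python
def evaluate_binary(scores, labels, threshold=0.5):
    """Accuracy, sensitivity, specificity and ROC AUC for binary labels
    (1 = positive, 0 = negative) given real-valued classifier scores."""
    tp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 1)
    tn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 0)
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 1)
    accuracy = (tp + tn) / len(labels)
    sensitivity = tp / (tp + fn) if tp + fn else 0.0
    specificity = tn / (tn + fp) if tn + fp else 0.0
    # AUC via the Mann-Whitney U statistic: the probability that a random
    # positive case is scored above a random negative case (ties count 1/2)
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p in pos for n in neg)
    auc = wins / (len(pos) * len(neg))
    return accuracy, sensitivity, specificity, auc
```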

  18. Peritumoral Artery Scoring System: a Novel Scoring System to Predict Renal Function Outcome after Laparoscopic Partial Nephrectomy.

    PubMed

    Zhang, Ruiyun; Wu, Guangyu; Huang, Jiwei; Shi, Oumin; Kong, Wen; Chen, Yonghui; Xu, Jianrong; Xue, Wei; Zhang, Jin; Huang, Yiran

    2017-06-06

    The present study aimed to assess the impact of peritumoral artery characteristics on renal function outcome prediction using a novel Peritumoral Artery Scoring System based on computed tomography arteriography. Peritumoral artery characteristics and renal function were evaluated in 220 patients who underwent laparoscopic partial nephrectomy, and then validated in 51 patients with split and total glomerular filtration rate (GFR) measurements. In particular, peritumoral artery classification and diameter were measured to assign arteries into low, moderate, and high Peritumoral Artery Scoring System risk categories. Univariable and multivariable logistic regression analyses were then used to determine risk factors for major renal functional decline. The Peritumoral Artery Scoring System and four other nephrometry systems were compared using receiver operating characteristic curve analysis. The Peritumoral Artery Scoring System was significantly superior to the other systems for predicting postoperative renal function decline (p < 0.001). In receiver operating characteristic analysis, our category system was a superior independent predictor of estimated glomerular filtration rate (eGFR) decline (area under the curve = 0.865, p < 0.001), total GFR decline (area under the curve = 0.796, p < 0.001), and split GFR decline (area under the curve = 0.841, p < 0.001). Peritumoral artery characteristics were independent predictors of renal function outcome after laparoscopic partial nephrectomy.

  19. Virtual plane-wave imaging via Marchenko redatuming

    NASA Astrophysics Data System (ADS)

    Meles, Giovanni Angelo; Wapenaar, Kees; Thorbecke, Jan

    2018-04-01

    Marchenko redatuming is a novel scheme used to retrieve up- and down-going Green's functions in an unknown medium. Marchenko equations are based on reciprocity theorems and are derived on the assumption of the existence of functions exhibiting space-time focusing properties once injected in the subsurface. In contrast to interferometry but similarly to standard migration methods, Marchenko redatuming only requires an estimate of the direct wave from the virtual source (or to the virtual receiver), illumination from only one side of the medium, and no physical sources (or receivers) inside the medium. In this contribution we consider a different time-focusing condition within the frame of Marchenko redatuming that leads to the retrieval of virtual plane-wave responses. As a result, it allows multiple-free imaging using only a one-dimensional sampling of the targeted model at a fraction of the computational cost of standard Marchenko schemes. The potential of the new method is demonstrated on 2D synthetic models.

  20. Modularized seismic full waveform inversion based on waveform sensitivity kernels - The software package ASKI

    NASA Astrophysics Data System (ADS)

    Schumacher, Florian; Friederich, Wolfgang; Lamara, Samir; Gutt, Phillip; Paffrath, Marcel

    2015-04-01

    We present a seismic full waveform inversion concept for applications ranging from seismological to engineering contexts, based on sensitivity kernels for full waveforms. The kernels are derived from Born scattering theory as the Fréchet derivatives of linearized frequency-domain full waveform data functionals, quantifying the influence of elastic earth model parameters and density on the data values. For a specific source-receiver combination, the kernel is computed from the displacement and strain field spectrum originating from the source evaluated throughout the inversion domain, as well as the Green function spectrum and its strains originating from the receiver. By storing the wavefield spectra of specific sources/receivers, they can be re-used for kernel computation for different specific source-receiver combinations, optimizing the total number of required forward simulations. In the iterative inversion procedure, the solution of the forward problem, the computation of sensitivity kernels and the derivation of a model update are kept completely separate. In particular, the model description for the forward problem and the description of the inverted model update are kept independent. Hence, the resolution of the inverted model as well as the complexity of solving the forward problem can be iteratively increased (with increasing frequency content of the inverted data subset). This may regularize the overall inverse problem and optimizes the computational effort of both solving the forward problem and computing the model update. The required interconnection of arbitrary unstructured volume and point grids is realized by generalized high-order integration rules and 3D unstructured interpolation methods. The model update is inferred by solving a minimization problem in a least-squares sense, resulting in Gauss-Newton convergence of the overall inversion process.
    The inversion method was implemented in the modularized software package ASKI (Analysis of Sensitivity and Kernel Inversion), which provides a generalized interface to arbitrary external forward modelling codes. So far, the 3D spectral-element code SPECFEM3D (Tromp, Komatitsch and Liu, 2008) and the 1D semi-analytical code GEMINI (Friederich and Dalkolmo, 1995), in both Cartesian and spherical frameworks, are supported. The creation of interfaces to further forward codes is planned for the near future. ASKI is freely available under the terms of the GPL at www.rub.de/aski. Since the independent modules of ASKI must communicate via file output/input, large storage capacities need to be conveniently accessible. Storing the complete sensitivity matrix to file, however, gives the scientist full manual control over each step in a customized procedure of sensitivity/resolution analysis and full waveform inversion. In the presentation, we will show some aspects of the theory behind the full waveform inversion method and its practical realization by the software package ASKI, as well as synthetic and real-data applications from different scales and geometries.
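    In its simplest dense form, the Gauss-Newton model update described above reduces to a linear least-squares solve against the sensitivity-kernel matrix. A toy Python sketch (plain normal equations and Gaussian elimination, for illustration only; ASKI works with kernels stored to file and a regularized solver):

```python
def gauss_newton_update(K, residual):
    """Least-squares model update: solve (K^T K) dm = K^T r for dm, where
    K is the (n_data x n_model) sensitivity-kernel matrix and r the data
    residual. Fine for tiny demos, not for real inversions."""
    n = len(K[0])
    # form the normal matrix and right-hand side
    A = [[sum(K[i][a] * K[i][b] for i in range(len(K))) for b in range(n)]
         for a in range(n)]
    rhs = [sum(K[i][a] * residual[i] for i in range(len(K))) for a in range(n)]
    # Gaussian elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda row: abs(A[row][col]))
        A[col], A[piv] = A[piv], A[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for row in range(col + 1, n):
            f = A[row][col] / A[col][col]
            for c in range(col, n):
                A[row][c] -= f * A[col][c]
            rhs[row] -= f * rhs[col]
    # back substitution
    dm = [0.0] * n
    for row in reversed(range(n)):
        dm[row] = (rhs[row] - sum(A[row][c] * dm[c] for c in range(row + 1, n))) / A[row][row]
    return dm
```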

  1. Noncoherent DTTLs for Symbol Synchronization

    NASA Technical Reports Server (NTRS)

    Simon, Marvin; Tkacenko, Andre

    2007-01-01

    Noncoherent data-transition tracking loops (DTTLs) have been proposed for use as symbol synchronizers in digital communication receivers. [Communication-receiver subsystems that can perform their assigned functions in the absence of synchronization with the phases of their carrier signals (carrier synchronization) are denoted by the term noncoherent, while receiver subsystems that cannot function without carrier synchronization are said to be coherent.] The proposal applies, more specifically, to receivers of binary phase-shift-keying (BPSK) signals generated by directly phase-modulating binary non-return-to-zero (NRZ) data streams onto carrier signals having known frequencies but unknown phases. The proposed noncoherent DTTLs would be modified versions of traditional DTTLs, which are coherent. The symbol-synchronization problem is essentially the problem of recovering symbol timing from a received signal. In the traditional, coherent approach to symbol synchronization, it is necessary to establish carrier synchronization in order to recover symbol timing. A traditional DTTL effects an iterative process in which it first generates an estimate of the carrier phase in the absence of symbol-synchronization information, then uses the carrier-phase estimate to obtain an estimate of the symbol-synchronization information, then feeds the symbol-synchronization estimate back to the carrier-phase-estimation subprocess. In a noncoherent symbol-synchronization process, there is no need for carrier synchronization and, hence, no need for iteration between carrier-synchronization and symbol-synchronization subprocesses. The proposed noncoherent symbol-synchronization process is justified theoretically by a mathematical derivation that starts from a maximum a posteriori (MAP) method of estimation of symbol timing utilized in traditional, coherent DTTLs.
    In that MAP method, one chooses the value of a variable of interest (in this case, the offset in the estimated symbol timing) that causes a likelihood function of symbol estimates over some number of symbol periods to assume a maximum value. In terms that are necessarily oversimplified to fit within the space available for this article, it can be said that the mathematical derivation involves a modified interpretation of the likelihood function that lends itself to noncoherent DTTLs. The proposal encompasses both linear and nonlinear noncoherent DTTLs. The performances of both have been computationally simulated; for comparison, the performances of linear and nonlinear coherent DTTLs have also been computationally simulated. The results of these simulations show that, among other things, the expected mean-square timing errors of coherent and noncoherent DTTLs are relatively insensitive to window width. The results also show that at high signal-to-noise ratios (SNRs), the performances of the noncoherent DTTLs approach those of their coherent counterparts, while at low SNRs, the noncoherent DTTLs incur penalties of the order of 1.5 to 2 dB.

  2. Elementary Green function as an integral superposition of Gaussian beams in inhomogeneous anisotropic layered structures in Cartesian coordinates

    NASA Astrophysics Data System (ADS)

    Červený, Vlastislav; Pšenčík, Ivan

    2017-08-01

    Integral superposition of Gaussian beams is a useful generalization of the standard ray theory. It removes some of the deficiencies of the ray theory like its failure to describe properly behaviour of waves in caustic regions. It also leads to a more efficient computation of seismic wavefields since it does not require the time-consuming two-point ray tracing. We present the formula for a high-frequency elementary Green function expressed in terms of the integral superposition of Gaussian beams for inhomogeneous, isotropic or anisotropic, layered structures, based on the dynamic ray tracing (DRT) in Cartesian coordinates. For the evaluation of the superposition formula, it is sufficient to solve the DRT in Cartesian coordinates just for the point-source initial conditions. Moreover, instead of seeking 3 × 3 paraxial matrices in Cartesian coordinates, it is sufficient to seek just 3 × 2 parts of these matrices. The presented formulae can be used for the computation of the elementary Green function corresponding to an arbitrary direct, multiply reflected/transmitted, unconverted or converted, independently propagating elementary wave of any of the three modes, P, S1 and S2. Receivers distributed along or in a vicinity of a target surface may be situated at an arbitrary part of the medium, including ray-theory shadow regions. The elementary Green function formula can be used as a basis for the computation of wavefields generated by various types of point sources (explosive, moment tensor).

  3. Computer programming in the UK undergraduate mathematics curriculum

    NASA Astrophysics Data System (ADS)

    Sangwin, Christopher J.; O'Toole, Claire

    2017-11-01

    This paper reports a study which investigated the extent to which undergraduate mathematics students in the United Kingdom are currently taught to programme a computer as a core part of their mathematics degree programme. We undertook an online survey, with significant follow-up correspondence, to gather data on current curricula and received replies from 46 (63%) of the departments who teach a BSc mathematics degree. We found that 78% of BSc degree courses in mathematics included computer programming in a compulsory module but 11% of mathematics degree programmes do not teach programming to all their undergraduate mathematics students. In 2016, programming is most commonly taught to undergraduate mathematics students through imperative languages, notably MATLAB, using numerical analysis as the underlying (or parallel) mathematical subject matter. Statistics is a very popular choice in optional courses, using the package R. Computer algebra systems appear to be significantly less popular for compulsory first-year courses than a decade ago, and there was no mention of logic programming, functional programming or automatic theorem proving software. The modal form of assessment of computing modules is entirely by coursework (i.e. no examination).

  4. Utilising handheld computers to monitor and support patients receiving chemotherapy: results of a UK-based feasibility study.

    PubMed

    Kearney, N; Kidd, L; Miller, M; Sage, M; Khorrami, J; McGee, M; Cassidy, J; Niven, K; Gray, P

    2006-07-01

    Recent changes in cancer service provision mean that many patients spend a limited time in hospital and therefore experience and must cope with and manage treatment-related side effects at home. Information technology can provide innovative solutions in promoting patient care through information provision, enhancing communication, monitoring treatment-related side effects and promoting self-care. The aim of this feasibility study was to evaluate the acceptability of using handheld computers as a symptom assessment and management tool for patients receiving chemotherapy for cancer. A convenience sample of patients (n = 18) and health professionals (n = 9) at one Scottish cancer centre was recruited. Patients used the handheld computer to record and send daily symptom reports to the cancer centre and receive instant, tailored symptom management advice during two treatment cycles. Both patients' and health professionals' perceptions of the handheld computer system were evaluated at baseline and at the end of the project. Patients believed the handheld computer had improved their symptom management and felt comfortable in using it. The health professionals also found the handheld computer to be helpful in assessing and managing patients' symptoms. This project suggests that a handheld-computer-based symptom management tool is feasible and acceptable to both patients and health professionals in complementing the care of patients receiving chemotherapy.

  5. Computational model for perception of objects and motions.

    PubMed

    Yang, WenLu; Zhang, LiQing; Ma, LiBo

    2008-06-01

    Perception of objects and motions in the visual scene is one of the basic problems in the visual system. There exist 'What' and 'Where' pathways in the superior visual cortex, starting from the simple cells in the primary visual cortex. The former is able to perceive objects such as forms, color, and texture, and the latter perceives 'where', for example, velocity and direction of spatial movement of objects. This paper explores brain-like computational architectures of visual information processing. We propose a visual perceptual model and a computational mechanism for training it. The computational model is a three-layer network. The first layer is the input layer, which receives the stimuli from natural environments. The second layer is designed for representing the internal neural information. The connections between the first layer and the second layer, called the receptive fields of neurons, are self-adaptively learned based on the principle of sparse neural representation. To this end, we introduce the Kullback-Leibler divergence as the measure of independence between neural responses and derive the learning algorithm based on minimizing the cost function. The proposed algorithm is applied to train the basis functions, namely receptive fields, which are localized, oriented, and bandpassed. The resultant receptive fields of neurons in the second layer have characteristics resembling those of simple cells in the primary visual cortex. Based on these basis functions, we further construct the third layer for perception of what and where in the superior visual cortex. The proposed model is able to perceive objects and their motions with high accuracy and strong robustness against additive noise. Computer simulation results in the final section show the feasibility of the proposed perceptual model and the high efficiency of the learning algorithm.
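    The Kullback-Leibler divergence used here as an independence measure has a one-line discrete form. A minimal sketch (discrete distributions only; the paper applies the measure to continuous neural response statistics):

```python
import math

def kl_divergence(p, q):
    """D_KL(p || q) for two discrete probability distributions given as
    equal-length sequences; terms with p_i = 0 contribute nothing."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
```

    The divergence is zero exactly when the two distributions coincide, which is what makes it usable as a cost to be minimized.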

  6. The study on servo-control system in the large aperture telescope

    NASA Astrophysics Data System (ADS)

    Hu, Wei; Zhenchao, Zhang; Daxing, Wang

    2008-08-01

    Servo tracking is one of the crucial technologies that must be solved in the research and manufacture of large and extremely large astronomical telescopes. Addressing the control characteristics of such telescopes, this paper designs a servo tracking control system for a large astronomical telescope. The system is organized as a master-slave distributed control system: the host computer sends steering instructions and receives the slave computer's operating status, while the slave computer implements the control algorithm and executes real-time control. The servo control uses a direct-drive motor and adopts DSP technology to implement a direct torque control algorithm. Such a design not only improves control system performance but also greatly reduces the volume and cost of the control system, which is significant. The design scheme is shown to be reasonable by calculation and simulation, and the system can be applied to large astronomical telescopes.

  7. Explanation of the computer listings of Faraday factors for INTASAT users

    NASA Technical Reports Server (NTRS)

    Nesterczuk, G.; Llewellyn, S. K.; Bent, R. B.; Schmid, P. E.

    1974-01-01

    Using a simplified form of the Appleton-Hartree formula for the phase refractive index, a relationship was obtained between the Faraday rotation angle along the angular path and the total electron content along the vertical path intersecting the angular path at the height of maximum electron density. Using the second mean value theorem of integration, the function B cos(theta) sec(chi) was removed from under the integral sign and replaced by a 'mean' value. The mean value factors were printed on the computer listing for 39 stations receiving signals from the INTASAT satellite during the specified time period. The data are presented by station and date. Graphs are included to demonstrate the variation of the Faraday factor with local time and season, and with magnetic latitude, elevation and azimuth angles. Other topics discussed include a description of the Bent ionospheric model, the earth's magnetic field model, and the sample computer listing.

  8. A computer program for calculation of approximate embryo/fetus radiation dose in nuclear medicine applications.

    PubMed

    Bayram, Tuncay; Sönmez, Bircan

    2012-04-01

    In this study, we aimed to develop a computer program that calculates the approximate radiation dose received by the embryo/fetus in nuclear medicine applications. Radiation dose values per MBq received by the embryo/fetus in nuclear medicine applications were gathered from the literature for various stages of pregnancy. These values were embedded in the computer code, which was written in the Fortran 90 programming language. The computer program, called nmfdose, covers almost all radiopharmaceuticals used in nuclear medicine applications. The approximate radiation dose received by the embryo/fetus can be calculated easily in a few steps using this program. Although there are some constraints on using the program for some special cases, nmfdose is useful and provides a practical solution for calculation of the approximate dose to the embryo/fetus in nuclear medicine applications.
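    At its core, the calculation nmfdose performs is a table lookup followed by a multiplication: dose = administered activity x literature dose coefficient for the given radiopharmaceutical and stage of pregnancy. A sketch in Python rather than Fortran 90; the coefficient values below are placeholders for illustration, not the literature values embedded in nmfdose:

```python
# Hypothetical dose coefficients (mGy per MBq administered), keyed by
# radiopharmaceutical and stage of pregnancy -- illustrative numbers only;
# real values must be taken from the published literature.
DOSE_COEFF_MGY_PER_MBQ = {
    ("Tc-99m MDP", "early"): 6.1e-3,
    ("Tc-99m MDP", "3 months"): 5.4e-3,
}

def fetal_dose(radiopharmaceutical, stage, administered_mbq):
    """Approximate embryo/fetus dose in mGy:
    administered activity (MBq) x dose coefficient (mGy/MBq)."""
    coeff = DOSE_COEFF_MGY_PER_MBQ[(radiopharmaceutical, stage)]
    return administered_mbq * coeff
```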

  9. The estimation of pointing angle and normalized surface scattering cross section from GEOS-3 radar altimeter measurements

    NASA Technical Reports Server (NTRS)

    Brown, G. S.; Curry, W. J.

    1977-01-01

    The statistical error of the pointing angle estimation technique is determined as a function of the effective receiver signal-to-noise ratio. Other sources of error are addressed and evaluated, with inadequate calibration being of major concern. The impact of pointing error on the computation of the normalized surface scattering cross section (sigma) from radar data and on the waveform-attitude-induced altitude bias is considered, and quantitative results are presented. Pointing angle and sigma processing algorithms are presented along with some initial data. The intensive-mode clean vs. clutter AGC calibration problem is analytically resolved. The use of clutter AGC data in the intensive mode is confirmed as the correct calibration set for the sigma computations.

  10. Experimental and theoretical studies on solar energy for energy conversion

    NASA Technical Reports Server (NTRS)

    Thomas, A. P.; Thekaekara, M. P.

    1976-01-01

    This paper presents the results of investigations made experimentally and theoretically to evaluate the various parameters that affect the amount of solar energy received on a collector surface. Measurements were made over a long period of time using both a pyranometer and a pyrheliometer. Computations of spectral and total irradiance at ground level have been made for a large variety of combinations of atmospheric parameters for ozone density, precipitable water vapor, turbidity coefficients and air mass. A study of the air mass as a function of irradiance measured at GSFC, and comparison of the data with the computed values of total direct solar irradiance for various parameters, indicate that turbidity changes with time of day; atmospheric opacity is less in the afternoon than in the morning.

  11. Online sequential Monte Carlo smoother for partially observed diffusion processes

    NASA Astrophysics Data System (ADS)

    Gloaguen, Pierre; Étienne, Marie-Pierre; Le Corff, Sylvain

    2018-12-01

    This paper introduces a new algorithm to approximate smoothed additive functionals of partially observed diffusion processes. The method relies on a new sequential Monte Carlo method which allows such approximations to be computed online, i.e., as the observations are received, with a computational complexity growing linearly with the number of Monte Carlo samples. The original algorithm cannot be used in the case of partially observed stochastic differential equations, since the transition density of the latent data is usually unknown. We prove that it may be extended to partially observed continuous processes by replacing this unknown quantity with an unbiased estimator obtained, for instance, using general Poisson estimators. This estimator is proved to be consistent and its performance is illustrated using data from two models.
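    The linear-in-samples cost mentioned above is characteristic of particle methods: each observation triggers one propagation, one reweighting and one resampling pass over the N samples. A minimal bootstrap particle filter for a Gaussian random walk (not the unbiased transition-density estimators of the paper; all parameters are illustrative):

```python
import math, random

def bootstrap_particle_filter(observations, n_particles=500,
                              sigma_x=0.5, sigma_y=0.5, seed=0):
    """Minimal bootstrap particle filter for a Gaussian random-walk state
    observed in Gaussian noise; cost per observation is O(n_particles)."""
    rng = random.Random(seed)
    particles = [0.0] * n_particles
    filtered_means = []
    for y in observations:
        # propagate each particle through the (here: known) transition density
        particles = [x + rng.gauss(0.0, sigma_x) for x in particles]
        # reweight by the observation likelihood
        weights = [math.exp(-0.5 * ((y - x) / sigma_y) ** 2) for x in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        filtered_means.append(sum(w * x for w, x in zip(weights, particles)))
        # multinomial resampling to combat weight degeneracy
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return filtered_means
```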

  12. Leisure activities in Prader-Willi syndrome: implications for health, cognition and adaptive functioning.

    PubMed

    Dykens, Elisabeth M

    2014-02-01

    Although hyperphagia and compulsivity in Prader-Willi syndrome (PWS) are well described, recreation and adaptive skills are relatively unexplored. Parents of 123 participants with PWS (4-48 years) completed measures of their child's adaptive, recreation, and problem behaviors. Offspring received cognitive testing. Watching TV was the most frequent recreational activity, and was associated with compulsivity and skin picking. BMIs were negatively correlated with physical play, and highest in those who watched TV and played computer games. Computer games and physical activities were associated with higher IQ and adaptive scores. People with PWS and other disabilities need to watch less TV and be more engaged in physical activities, games, and leisure pursuits that are fun, and may bring cognitive or adaptive advantages.

  13. Distributed Computation and TENEX-Related Activities

    DTIC Science & Technology

    1978-01-01

    (IPCF) which provides the inter-job communication functions required by MSG. MSG will be modified to use the IPCF primitives when running under TOPS... [BBN Report No. 3752, Bolt Beranek and Newman Inc.] ... primitive (e.g. ... from a process to MSG when a communication primitive is executed, and from MSG to a process when a pending event (e.g., outstanding receive operation

  14. Apparatus for monitoring high temperature ultrasonic characterization

    DOEpatents

    Lanagan, Michael T.; Kupperman, David S.; Yaconi, George A.

    1998-01-01

    A method and an apparatus for nondestructively detecting and evaluating changes in the microstructural properties of a material by employing one or more magnetostrictive transducers linked to the material by means of one or more sonic signal conductors. The magnetostrictive transducer or transducers are connected to a pulser/receiver, which in turn is connected to an oscilloscope. The oscilloscope is connected to a computer, which employs an algorithm to evaluate changes in the velocity of a signal transmitted to the material sample as a function of time and temperature.

  15. An Advanced Computational Approach to System of Systems Analysis & Architecting Using Agent-Based Behavioral Model

    DTIC Science & Technology

    2012-09-30

    [Diagram: the SoS Agent receives and provides capabilities within the SoS architecture and evaluates a fitness, or objective, function. Figure 10 depicts the SoS Agent structure: a fuzzy inference engine (FAM) operating on an initial SoS architecture, with Affordability, Flexibility, Performance, and Robustness as input datatypes.]

  16. Chemical application of diffusion quantum Monte Carlo

    NASA Technical Reports Server (NTRS)

    Reynolds, P. J.; Lester, W. A., Jr.

    1984-01-01

    The diffusion quantum Monte Carlo (QMC) method gives a stochastic solution to the Schroedinger equation. This approach is receiving increasing attention in chemical applications as a result of its high accuracy. However, reducing statistical uncertainty remains a priority because chemical effects are often obtained as small differences of large numbers. As an example, the singlet-triplet splitting of the energy of the methylene molecule CH2 is given. The QMC algorithm was implemented on the CYBER 205, first as a direct transcription of the algorithm running on the VAX 11/780, and second by explicitly writing vector code for all loops longer than a crossover length C. The speeds of the codes relative to one another as a function of C, and relative to the VAX, are discussed. The computational time versus the number of basis functions is discussed and compared with that of traditional quantum chemistry codes on traditional computer architectures.
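    As a toy illustration of the diffusion QMC idea (a stochastic imaginary-time solution of the Schroedinger equation), the sketch below estimates the ground-state energy of a 1-D harmonic oscillator, whose exact value is 0.5 in units hbar = m = omega = 1. It is a deliberately minimal scalar example with invented parameters, not the vectorized CYBER 205 code discussed in the record.

```python
import numpy as np

rng = np.random.default_rng(0)
dt, n_target, n_steps = 0.01, 500, 4000
x = rng.normal(size=n_target)            # walker positions
e_ref = 0.5                              # reference (trial) energy
v_tail = []
for step in range(n_steps):
    # diffusion step (kinetic-energy operator)
    x = x + rng.normal(scale=np.sqrt(dt), size=x.size)
    # branching weights from the potential V(x) = x^2 / 2
    w = np.exp(-(0.5 * x**2 - e_ref) * dt)
    # stochastic birth/death: replicate each walker int(w + u) times
    x = np.repeat(x, (w + rng.random(x.size)).astype(int))
    v_mean = np.mean(0.5 * x**2)
    # population control nudges e_ref toward the ground-state energy
    e_ref = v_mean + np.log(n_target / x.size)
    if step >= n_steps // 2:
        v_tail.append(v_mean)
e0 = float(np.mean(v_tail))              # mixed estimator of E0 (exact: 0.5)
```

The statistical scatter of `e0` around 0.5 shows exactly the uncertainty-reduction problem the record raises: chemical answers come from small differences of such noisy estimates.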

  17. Computation of transmitted and received B1 fields in magnetic resonance imaging.

    PubMed

    Milles, Julien; Zhu, Yue Min; Chen, Nan-Kuei; Panych, Lawrence P; Gimenez, Gérard; Guttmann, Charles R G

    2006-05-01

    Computation of B1 fields is a key issue for determination and correction of intensity nonuniformity in magnetic resonance images. This paper presents a new method for computing transmitted and received B1 fields. Our method combines a modified MRI acquisition protocol and an estimation technique based on the Levenberg-Marquardt algorithm and spatial filtering. It enables accurate estimation of transmitted and received B1 fields for both homogeneous and heterogeneous objects. The method is validated using numerical simulations and experimental data from phantom and human scans. The experimental results are in agreement with theoretical expectations.
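    A hedged sketch of the estimation step only: recovering a transmit-B1-related flip angle from a double-angle-style acquisition pair with the Levenberg-Marquardt algorithm. The signal model S(alpha) = M sin(alpha), S(2 alpha) = M sin(2 alpha) and all numbers are illustrative assumptions, not the paper's exact acquisition protocol.

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(p, s1, s2):
    # p = (signal magnitude, nominal flip angle in radians)
    m, alpha = p
    return [m * np.sin(alpha) - s1, m * np.sin(2.0 * alpha) - s2]

true_m, true_alpha = 100.0, np.deg2rad(50.0)   # hypothetical ground truth
s1 = true_m * np.sin(true_alpha)               # single-angle acquisition
s2 = true_m * np.sin(2.0 * true_alpha)         # double-angle acquisition

# Levenberg-Marquardt fit of (magnitude, flip angle) to the two signals
fit = least_squares(residuals, x0=[80.0, np.deg2rad(40.0)],
                    args=(s1, s2), method='lm')
m_hat, alpha_hat = fit.x
```

In a real mapping pipeline this fit would run per voxel, with the spatial filtering mentioned in the abstract applied to the resulting B1 maps.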

  18. Quality of human-computer interaction - results of a national usability survey of hospital-IT in Germany

    PubMed Central

    2011-01-01

    Background Due to the increasing functionality of medical information systems, it is hard to imagine day-to-day work in hospitals without IT support. Therefore, the design of dialogues between humans and information systems is one of the most important issues to be addressed in health care. This survey presents an analysis of the current quality level of human-computer interaction of healthcare-IT in German hospitals, focused on the users' point of view. Methods To evaluate the usability of clinical IT according to the design principles of EN ISO 9241-10, the IsoMetrics Inventory, an assessment tool, was used. The focus of this paper has been put on suitability for task, training effort and conformity with user expectations, differentiated by information systems. Effectiveness has been evaluated with the focus on interoperability and functionality of different IT systems. Results 4521 persons from 371 hospitals visited the start page of the study, while 1003 persons from 158 hospitals completed the questionnaire. The results show relevant variations between different information systems. Conclusions Specialised information systems with defined functionality received better assessments than clinical information systems in general. This could be attributed to the improved customisation of these specialised systems for specific working environments. The results can be used as reference data for evaluation and benchmarking of human-computer engineering in the clinical health IT context for future studies. PMID:22070880

  19. Bad data packet capture device

    DOEpatents

    Chen, Dong; Gara, Alan; Heidelberger, Philip; Vranas, Pavlos

    2010-04-20

    An apparatus and method for capturing data packets for analysis on a network computing system includes a sending node and a receiving node connected by a bi-directional communication link. The sending node sends a data transmission to the receiving node on the bi-directional communication link, and the receiving node receives the data transmission and verifies the data transmission to determine valid data and invalid data and verify retransmissions of invalid data as corresponding valid data. A memory device communicates with the receiving node for storing the invalid data and the corresponding valid data. A computing node communicates with the memory device and receives and performs an analysis of the invalid data and the corresponding valid data received from the memory device.
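    The control flow of the patent can be sketched in software: verify each packet with a checksum, keep invalid payloads, and pair each one with the valid retransmission that later arrives under the same sequence number. This is a minimal illustrative model (CRC-32 over a plain Python class), not the patented hardware.

```python
import zlib

class CaptureReceiver:
    """Receiving node that captures bad packets alongside their
    corresponding valid retransmissions for later analysis."""
    def __init__(self):
        self.bad = {}        # seq -> invalid payload, awaiting retransmit
        self.pairs = []      # (invalid, valid) pairs for the analysis node

    def receive(self, seq, payload, crc):
        ok = zlib.crc32(payload) == crc
        if not ok:
            self.bad[seq] = payload                  # store invalid data
        elif seq in self.bad:
            # retransmission verified: pair it with the bad copy
            self.pairs.append((self.bad.pop(seq), payload))
        return ok

rx = CaptureReceiver()
good = b"payload"
rx.receive(7, b"paXload", zlib.crc32(good))   # corrupted in flight -> stored
rx.receive(7, good, zlib.crc32(good))         # retransmission -> paired
```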

  20. Coquina Elementary students enjoy gift of computers

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Children at Coquina Elementary School, Titusville, Fla., 'practice' using a computer keyboard, part of equipment donated by Kennedy Space Center. Coquina is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the NASA K-12 Education Services Office at KSC. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated.

  1. Coquina Elementary students enjoy gift of computers

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Children at Coquina Elementary School, Titusville, Fla., look with curiosity at the wrapped computer equipment donated by Kennedy Space Center. Coquina is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the NASA K-12 Education Services Office at KSC. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated.

  2. Audubon Elementary students enjoy gift of computers

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Children at Audubon Elementary School, Merritt Island, Fla., eagerly unwrap computer equipment donated by Kennedy Space Center. Audubon is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the NASA K-12 Education Services Office at KSC. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated.

  3. Coquina Elementary students enjoy gift of computers

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Children at Coquina Elementary School, Titusville, Fla., eagerly tear into the wrapped computer equipment donated by Kennedy Space Center. Coquina is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the NASA K-12 Education Services Office at KSC. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated.

  4. Coquina Elementary students enjoy gift of computers

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Children at Coquina Elementary School, Titusville, Fla., excitedly tear into the wrapped computer equipment donated by Kennedy Space Center. Coquina is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the NASA K-12 Education Services Office at KSC. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated.

  5. A Markov computer simulation model of the economics of neuromuscular blockade in patients with acute respiratory distress syndrome

    PubMed Central

    Macario, Alex; Chow, John L; Dexter, Franklin

    2006-01-01

    Background Management of acute respiratory distress syndrome (ARDS) in the intensive care unit (ICU) is clinically challenging and costly. Neuromuscular blocking agents may facilitate mechanical ventilation and improve oxygenation, but may result in prolonged recovery of neuromuscular function and acute quadriplegic myopathy syndrome (AQMS). The goal of this study was to address a hypothetical question via computer modeling: Would a reduction in intubation time of 6 hours and/or a reduction in the incidence of AQMS from 25% to 21% provide enough benefit to justify a drug with an additional expenditure of $267 (the difference in acquisition cost between a generic and brand name neuromuscular blocker)? Methods The base case was a 55-year-old man in the ICU with ARDS who receives neuromuscular blockade for 3.5 days. A Markov model was designed with hypothetical patients in 1 of 6 mutually exclusive health states: ICU-intubated, ICU-extubated, hospital ward, long-term care, home, or death, over a period of 6 months. The net monetary benefit was computed. Results Our computer simulation modeling predicted the mean cost for ARDS patients receiving standard care for 6 months to be $62,238 (5th-95th percentiles $42,259-$83,766), with an overall 6-month mortality of 39%. Assuming a ceiling ratio of $35,000, even if a drug (that cost $267 more) hypothetically reduced AQMS from 25% to 21% and decreased intubation time by 6 hours, the net monetary benefit would only equal $137. Conclusion ARDS patients receiving a neuromuscular blocker have high mortality and unpredictable outcomes, which result in large variability in costs per case. If a patient dies, there is no benefit to any drug that reduces ventilation time or AQMS incidence. A prospective, randomized pharmacoeconomic study of neuromuscular blockers in the ICU to assess AQMS or intubation times is impractical because of the highly variable clinical course of patients with ARDS. PMID:16539706
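    The study design above maps naturally onto a Markov cohort simulation: six mutually exclusive health states advanced over six monthly cycles. The sketch below shows the mechanics only; the transition probabilities and costs are invented placeholders, not the published model inputs.

```python
import numpy as np

states = ["ICU-intubated", "ICU-extubated", "ward",
          "long-term care", "home", "death"]
# Monthly transition matrix (rows: from-state, cols: to-state), hypothetical:
P = np.array([
    [0.20, 0.30, 0.20, 0.05, 0.05, 0.20],
    [0.05, 0.15, 0.45, 0.10, 0.15, 0.10],
    [0.01, 0.02, 0.30, 0.12, 0.50, 0.05],
    [0.00, 0.00, 0.05, 0.70, 0.20, 0.05],
    [0.00, 0.00, 0.02, 0.01, 0.96, 0.01],
    [0.00, 0.00, 0.00, 0.00, 0.00, 1.00],   # death is absorbing
])
monthly_cost = np.array([40000.0, 15000.0, 3000.0, 6000.0, 0.0, 0.0])

dist = np.array([1.0, 0, 0, 0, 0, 0])    # whole cohort starts intubated
total_cost = 0.0
for month in range(6):                   # 6 monthly cycles
    total_cost += dist @ monthly_cost    # expected cost accrued this cycle
    dist = dist @ P                      # advance the cohort one cycle
mortality = dist[-1]                     # fraction dead at 6 months
```

A net-monetary-benefit comparison would run this twice, once per drug scenario, and difference the expected costs and outcomes.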

  6. Finite-fault source inversion using adjoint methods in 3D heterogeneous media

    NASA Astrophysics Data System (ADS)

    Somala, Surendra Nadh; Ampuero, Jean-Paul; Lapusta, Nadia

    2018-04-01

    Accounting for lateral heterogeneities in the 3D velocity structure of the crust is known to improve earthquake source inversion, compared to results based on 1D velocity models which are routinely assumed to derive finite-fault slip models. The conventional approach to include known 3D heterogeneity in source inversion involves pre-computing 3D Green's functions, which requires a number of 3D wave propagation simulations proportional to the number of stations or to the number of fault cells. The computational cost of such an approach is prohibitive for the dense datasets that could be provided by future earthquake observation systems. Here, we propose an adjoint-based optimization technique to invert for the spatio-temporal evolution of slip velocity. The approach does not require pre-computed Green's functions. The adjoint method provides the gradient of the cost function, which is used to improve the model iteratively with a gradient-based minimization method. The adjoint approach is shown to be computationally more efficient than the conventional approach based on pre-computed Green's functions in a broad range of situations. We consider data up to 1 Hz from a Haskell source scenario (a steady pulse-like rupture) on a vertical strike-slip fault embedded in an elastic 3D heterogeneous velocity model. The velocity model comprises a uniform background and a 3D stochastic perturbation with the von Karman correlation function. Source inversions based on the 3D velocity model are performed for two different station configurations, a dense and a sparse network with 1 km and 20 km station spacing, respectively. These reference inversions show that our inversion scheme adequately retrieves the rise time when the velocity model is exactly known, and illustrate how dense coverage improves the inference of peak slip velocities. 
We investigate the effects of uncertainties in the velocity model by performing source inversions based on an incorrect, homogeneous velocity model. We find that, for velocity uncertainties that have standard deviation and correlation length typical of available 3D crustal models, the inverted sources can be severely contaminated by spurious features even if the station density is high. When data from a thousand or more receivers are used in source inversions in 3D heterogeneous media, the computational cost of the method proposed in this work is at least two orders of magnitude lower than source inversion based on pre-computed Green's functions.
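    The iterative skeleton of such an adjoint inversion can be shown on a toy linear problem d = G m. In the paper the gradient comes from adjoint wave simulations rather than an explicit operator; the matrix G below is a purely illustrative stand-in for the forward wave propagation.

```python
import numpy as np

rng = np.random.default_rng(1)
G = rng.normal(size=(60, 20))            # stand-in linear forward operator
m_true = rng.normal(size=20)             # "slip history" to recover
d = G @ m_true                           # synthetic observations

m = np.zeros(20)                         # initial model
step = 1.0 / np.linalg.norm(G, 2) ** 2   # stable step from the spectral norm
for _ in range(1000):
    residual = G @ m - d                 # data misfit
    grad = G.T @ residual                # adjoint applied to the residual
    m -= step * grad                     # gradient-descent model update
misfit = float(np.linalg.norm(G @ m - d))
```

The key point mirrored here is that each iteration needs only forward and adjoint applications, never the pre-computed columns of G (the Green's functions).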

  7. Finite-fault source inversion using adjoint methods in 3-D heterogeneous media

    NASA Astrophysics Data System (ADS)

    Somala, Surendra Nadh; Ampuero, Jean-Paul; Lapusta, Nadia

    2018-07-01

    Accounting for lateral heterogeneities in the 3-D velocity structure of the crust is known to improve earthquake source inversion, compared to results based on 1-D velocity models which are routinely assumed to derive finite-fault slip models. The conventional approach to include known 3-D heterogeneity in source inversion involves pre-computing 3-D Green's functions, which requires a number of 3-D wave propagation simulations proportional to the number of stations or to the number of fault cells. The computational cost of such an approach is prohibitive for the dense data sets that could be provided by future earthquake observation systems. Here, we propose an adjoint-based optimization technique to invert for the spatio-temporal evolution of slip velocity. The approach does not require pre-computed Green's functions. The adjoint method provides the gradient of the cost function, which is used to improve the model iteratively with a gradient-based minimization method. The adjoint approach is shown to be computationally more efficient than the conventional approach based on pre-computed Green's functions in a broad range of situations. We consider data up to 1 Hz from a Haskell source scenario (a steady pulse-like rupture) on a vertical strike-slip fault embedded in an elastic 3-D heterogeneous velocity model. The velocity model comprises a uniform background and a 3-D stochastic perturbation with the von Karman correlation function. Source inversions based on the 3-D velocity model are performed for two different station configurations, a dense and a sparse network with 1 and 20 km station spacing, respectively. These reference inversions show that our inversion scheme adequately retrieves the rise time when the velocity model is exactly known, and illustrate how dense coverage improves the inference of peak slip velocities. 
We investigate the effects of uncertainties in the velocity model by performing source inversions based on an incorrect, homogeneous velocity model. We find that, for velocity uncertainties that have standard deviation and correlation length typical of available 3-D crustal models, the inverted sources can be severely contaminated by spurious features even if the station density is high. When data from a thousand or more receivers are used in source inversions in 3-D heterogeneous media, the computational cost of the method proposed in this work is at least two orders of magnitude lower than source inversion based on pre-computed Green's functions.

  8. Efficacy of a short cognitive training program in patients with multiple sclerosis

    PubMed Central

    Pérez-Martín, María Yaiza; González-Platas, Montserrat; Eguía-del Río, Pablo; Croissier-Elías, Cristina; Jiménez Sosa, Alejandro

    2017-01-01

    Background Cognitive impairment is a common feature in multiple sclerosis (MS) and may have a substantial impact on quality of life. Evidence about the effectiveness of neuropsychological rehabilitation is still limited, but current data suggest that computer-assisted cognitive training improves cognitive performance. Objective The objective of this study was to evaluate the efficacy of combined computer-assisted training supported by home-based neuropsychological training to improve attention, processing speed, memory and executive functions during 3 consecutive months. Methods In this evaluator-blinded, randomized controlled study, 62 MS patients with clinically stable disease and mild-to-moderate levels of cognitive impairment were randomized to receive a computer-assisted neuropsychological training program (n=30) or no intervention (control group [CG]; n=32). The cognitive assessment included the Brief Repeatable Battery of Neuropsychological Test. Other secondary measures included subjective cognitive impairment, anxiety and depression, fatigue and quality of life measures. Results The treatment group (TG) showed significant improvements in measures of verbal memory, working memory and phonetic fluency after intervention, and repeated measures analysis of covariance revealed a positive effect in most of the functions. The CG showed no changes. The TG showed a significant reduction in anxiety symptoms and significant improvement in quality of life. There were no improvements in fatigue levels and depressive symptoms. Conclusion Cognitive intervention with computer-assisted training supported by home training between face-to-face sessions is a useful tool to treat patients with MS and improve functions such as verbal memory, working memory and phonetic fluency. PMID:28223806

  9. A parameter estimation algorithm for LFM/BPSK hybrid modulated signal intercepted by Nyquist folding receiver

    NASA Astrophysics Data System (ADS)

    Qiu, Zhaoyang; Wang, Pei; Zhu, Jun; Tang, Bin

    2016-12-01

    The Nyquist folding receiver (NYFR) is a novel ultra-wideband receiver architecture that achieves wideband reception with little hardware. The linear frequency modulated/binary phase shift keying (LFM/BPSK) hybrid modulated signal is a novel kind of low-probability-of-intercept signal with wide bandwidth. The NYFR is an effective architecture for intercepting the LFM/BPSK signal, and an LFM/BPSK signal intercepted by the NYFR acquires the receiver's local oscillator modulation. A parameter estimation algorithm for the NYFR output signal is proposed. Using the NYFR prior information, the chirp singular value ratio spectrum is proposed to estimate the chirp rate. Then, based on the output's self-characteristic, a matching component function is designed to estimate the Nyquist zone (NZ) index. Finally, matching code and subspace methods are employed to estimate the phase change points and code length. Compared with the existing methods, the proposed algorithm has better performance. It also needs no multi-channel structure, so the computational complexity of the NZ index estimation is small. The simulation results demonstrate the efficacy of the proposed algorithm.

  10. KSC-99pp1227

    NASA Image and Video Library

    1999-10-06

    Children at Audubon Elementary School, Merritt Island, Fla., eagerly unwrap computer equipment donated by Kennedy Space Center. Audubon is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the NASA K-12 Education Services Office at KSC. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated.

  11. An analysis of carrier phase jitter in an MPSK receiver utilizing map estimation. Ph.D. Thesis Semiannual Status Report, Jul. 1993 - Jan. 1994

    NASA Technical Reports Server (NTRS)

    Osborne, William P.

    1994-01-01

    The use of 8 and 16 PSK TCM to support satellite communications in an effort to achieve more bandwidth efficiency in a power-limited channel has been proposed. This project addresses the problem of carrier phase jitter in an M-PSK receiver utilizing the high-SNR approximation to the maximum a posteriori estimation of carrier phase. In particular, numerical solutions for the 8- and 16-PSK self-noise and phase detector gain in the carrier tracking loop are presented. The effect of changing SNR on the loop noise bandwidth is also discussed. These data are then used to compute the variance of phase error as a function of SNR. Simulation and hardware data are used to verify these calculations. The results show that there is a threshold in the variance of phase error versus SNR curves that is a strong function of SNR and a weak function of loop bandwidth. The M-PSK variance thresholds occur at SNRs in the range of practical interest for the use of 8- and 16-PSK TCM. This suggests that phase error variance is an important consideration in the design of these systems.
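    The SNR dependence of phase-error variance can be demonstrated numerically. The sketch below uses the classic M-th power (nonlinearity) carrier phase estimator for 8-PSK rather than the report's MAP tracking loop, so the numbers are only qualitatively comparable; block length and trial count are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)
M, n_sym, n_trials = 8, 512, 200          # 8-PSK, symbols per estimate

def phase_error_var(snr_db):
    """Monte Carlo variance of the M-th power carrier phase estimate."""
    snr = 10.0 ** (snr_db / 10.0)
    sigma = np.sqrt(1.0 / (2.0 * snr))    # per-dimension noise std (Es = 1)
    estimates = []
    for _ in range(n_trials):
        symbols = rng.integers(0, M, n_sym)
        s = np.exp(2j * np.pi * symbols / M)      # true carrier phase = 0
        noise = sigma * (rng.normal(size=n_sym)
                         + 1j * rng.normal(size=n_sym))
        r = s + noise
        # raising to the M-th power strips the PSK modulation
        estimates.append(np.angle(np.mean(r ** M)) / M)
    return float(np.var(estimates))

v_20db, v_10db = phase_error_var(20.0), phase_error_var(10.0)
# variance grows sharply as SNR drops toward the threshold region
```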

  12. The usefulness of sLORETA in evaluating the effect of high-dose ARA-C on brain connectivity in patients with acute myeloid leukemia: an exploratory study

    PubMed Central

    Zarabla, Alessia; Ungania, Sara; Cacciatore, Alessandra; Maialetti, Andrea; Petreri, Gianluca; Mengarelli, Andrea; Spadea, Antonio; Marchesi, Francesco; Renzi, Daniela; Gumenyuk, Svitlana; Strigari, Lidia; Maschio, Marta

    2017-01-01

    Summary Cytosine arabinoside (Ara-C) is one of the key drugs for treating acute myeloid leukemia (AML). High intravenous doses may produce a number of central nervous system (CNS) toxicities and contribute to modifications in brain functional connectivity. sLORETA is a software tool for localizing brain electrical activity and functional connectivity. The aim of this study was to apply sLORETA in the evaluation of possible effects of Ara-C on brain connectivity in patients with AML without CNS involvement. We studied eight patients with AML; four were administered standard doses of Ara-C while the other four received high doses. sLORETA was computed from computerized EEG data before treatment and after six months of treatment. Three regions of interest, corresponding to specific combinations of Brodmann areas, were defined. In the patients receiving high-dose Ara-C, a statistically significant reduction in functional connectivity was observed in the frontoparietal network, which literature data suggest is involved in attentional processes. Our data highlight the possibility of using novel techniques to study the potential CNS toxicity of cancer therapy.

  13. Contributed Review: Source-localization algorithms and applications using time of arrival and time difference of arrival measurements

    NASA Astrophysics Data System (ADS)

    Li, Xinya; Deng, Zhiqun Daniel; Rauchenstein, Lynn T.; Carlson, Thomas J.

    2016-04-01

    Locating the position of fixed or mobile sources (i.e., transmitters) based on measurements obtained from sensors (i.e., receivers) is an important research area that is attracting much interest. In this paper, we review several representative localization algorithms that use times of arrival (TOAs) and time differences of arrival (TDOAs) to achieve high signal source position estimation accuracy when a transmitter is in the line-of-sight of a receiver. Circular (TOA) and hyperbolic (TDOA) position estimation approaches both use nonlinear equations that relate the known locations of receivers and unknown locations of transmitters. Estimation of the location of transmitters using the standard nonlinear equations may not be very accurate because of receiver location errors, receiver measurement errors, and the high computational burden of solving them. Least squares and maximum likelihood based algorithms have become the most popular computational approaches to transmitter location estimation. In this paper, we summarize the computational characteristics and position estimation accuracies of various positioning algorithms. By improving methods for estimating the time of arrival of transmissions at receivers and transmitter location estimation algorithms, transmitter location estimation may be applied across a range of applications and technologies such as radar, sonar, the Global Positioning System, wireless sensor networks, underwater animal tracking, mobile communications, and multimedia.
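    A minimal sketch of hyperbolic (TDOA) position estimation by nonlinear least squares: known receiver locations, range differences measured against a reference receiver, and an iterative solve. The geometry, propagation speed, and noiseless measurements are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

c = 1500.0                                        # propagation speed, m/s
rx = np.array([[0.0, 0.0], [1000.0, 0.0],
               [0.0, 1000.0], [1000.0, 1000.0]])  # known receiver positions
src = np.array([420.0, 310.0])                    # hypothetical transmitter

ranges = np.linalg.norm(rx - src, axis=1)
tdoa = (ranges[1:] - ranges[0]) / c               # measured vs. receiver 0

def residuals(p):
    # hyperbolic constraints: predicted minus measured time differences
    r = np.linalg.norm(rx - p, axis=1)
    return (r[1:] - r[0]) / c - tdoa

est = least_squares(residuals, x0=np.array([500.0, 500.0])).x
```

With noisy measurements the same residual function feeds a weighted least-squares or maximum-likelihood solve, which is where the accuracy comparisons in the review apply.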

  14. The 6th International Conference on Computer Science and Computational Mathematics (ICCSCM 2017)

    NASA Astrophysics Data System (ADS)

    2017-09-01

    The ICCSCM 2017 (The 6th International Conference on Computer Science and Computational Mathematics) aimed to provide a platform to discuss computer science and mathematics related issues including Algebraic Geometry, Algebraic Topology, Approximation Theory, Calculus of Variations, Category Theory; Homological Algebra, Coding Theory, Combinatorics, Control Theory, Cryptology, Geometry, Difference and Functional Equations, Discrete Mathematics, Dynamical Systems and Ergodic Theory, Field Theory and Polynomials, Fluid Mechanics and Solid Mechanics, Fourier Analysis, Functional Analysis, Functions of a Complex Variable, Fuzzy Mathematics, Game Theory, General Algebraic Systems, Graph Theory, Group Theory and Generalizations, Image Processing, Signal Processing and Tomography, Information Fusion, Integral Equations, Lattices, Algebraic Structures, Linear and Multilinear Algebra; Matrix Theory, Mathematical Biology and Other Natural Sciences, Mathematical Economics and Financial Mathematics, Mathematical Physics, Measure Theory and Integration, Neutrosophic Mathematics, Number Theory, Numerical Analysis, Operations Research, Optimization, Operator Theory, Ordinary and Partial Differential Equations, Potential Theory, Real Functions, Rings and Algebras, Statistical Mechanics, Structure of Matter, Topological Groups, Wavelets and Wavelet Transforms, 3G/4G Network Evolutions, Ad-Hoc, Mobile, Wireless Networks and Mobile Computing, Agent Computing & Multi-Agent Systems, all topics related to Image/Signal Processing, Computer Networks, the ISO SC-27 and SC-17 standards, PKI (Public Key Infrastructures), Artificial Intelligence (A.I.) & Pattern/Image Recognition, Authentication/Authorization Issues, Biometric authentication and algorithms, CDMA/GSM Communication Protocols, Combinatorics, Graph Theory, and Analysis of Algorithms, Cryptography and Foundations of Computer Security, Database (D.B.) 
Management & Information Retrievals, Data Mining, Web Image Mining, & Applications, Defining Spectrum Rights and Open Spectrum Solutions, E-Commerce, Ubiquitous, RFID, Applications, Fingerprint/Hand/Biometrics Recognitions and Technologies, Foundations of High-performance Computing, IC-card Security, OTP, and Key Management Issues, IDS/Firewall, Anti-Spam mail, Anti-virus issues, Mobile Computing for E-Commerce, Network Security Applications, Neural Networks and Biomedical Simulations, Quality of Services and Communication Protocols, Quantum Computing, Coding, and Error Controls, Satellite and Optical Communication Systems, Theory of Parallel Processing and Distributed Computing, Virtual Visions, 3-D Object Retrievals, & Virtual Simulations, Wireless Access Security, etc. The success of ICCSCM 2017 is reflected in the papers received from authors around the world, allowing a highly multinational and multicultural exchange of ideas and experience. The accepted papers of ICCSCM 2017 are published in this book. Please check http://www.iccscm.com for further news. A conference such as ICCSCM 2017 can only succeed through a team effort, so we thank the International Technical Committee and the Reviewers for their efforts in the review process as well as their valuable advice. We are thankful to all those who contributed to the success of ICCSCM 2017. The Secretary

  15. Remote direct memory access

    DOEpatents

    Archer, Charles J.; Blocksome, Michael A.

    2012-12-11

    Methods, parallel computers, and computer program products are disclosed for remote direct memory access. Embodiments include transmitting, from an origin DMA engine on an origin compute node to a plurality of target DMA engines on target compute nodes, a request to send message, the request to send message specifying data to be transferred from the origin DMA engine to data storage on each target compute node; receiving, by each target DMA engine on each target compute node, the request to send message; preparing, by each target DMA engine, to store data according to the data storage reference and the data length, including assigning a base storage address for the data storage reference; sending, by one or more of the target DMA engines, an acknowledgment message acknowledging that all the target DMA engines are prepared to receive a data transmission from the origin DMA engine; receiving, by the origin DMA engine, the acknowledgment message from the one or more of the target DMA engines; and transferring, by the origin DMA engine, data to data storage on each of the target compute nodes according to the data storage reference using a single direct put operation.
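    The handshake's control flow can be sketched as a toy software simulation: request-to-send fan-out, per-target preparation and acknowledgment, then a direct put of the data to every prepared target. This is an illustrative model of the message sequence only, not DMA hardware or the patented implementation.

```python
class TargetNode:
    """Target DMA engine stand-in: prepares storage, acknowledges."""
    def __init__(self):
        self.base = None
        self.memory = None

    def on_request_to_send(self, length):
        self.base = 0                    # assign a base storage address
        self.memory = bytearray(length)  # reserve the data storage
        return "ack"                     # prepared to receive

class OriginNode:
    """Origin DMA engine stand-in: waits for all acks, then puts data."""
    def send(self, data, targets):
        acks = [t.on_request_to_send(len(data)) for t in targets]
        if all(a == "ack" for a in acks):        # every target prepared
            for t in targets:                    # single direct put each
                t.memory[t.base:t.base + len(data)] = data
            return True
        return False

targets = [TargetNode(), TargetNode()]
ok = OriginNode().send(b"payload", targets)
```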

  16. Informational Needs of Head and Neck Cancer Patients.

    PubMed

    Fang, Carolyn Y; Longacre, Margaret L; Manne, Sharon L; Ridge, John A; Lango, Miriam N; Burtness, Barbara A

    2012-04-01

    Treatment for head and neck squamous cell carcinoma (HNSCC) can lead to considerable functional impairment. As a result, HNSCC patients experience significant decrements in quality of life, high levels of emotional distress, deteriorations in interpersonal relations, and increased social isolation. Studies suggest that HNSCC patients may have extensive informational and psychosocial needs that are not being adequately addressed. However, few programs have been developed to address the needs of HNSCC patients. Therefore, we conducted a pilot study of HNSCC patients to: 1) characterize patients' informational needs; and 2) describe preferred formats and time points for receiving such information. The majority of participants desired additional information regarding treatment options, managing changes in swallowing and speaking, and staying healthy after treatment. Overall, patients with early-stage disease reported more informational needs compared to patients with advanced disease. Female patients were more likely to desire information about coping with emotional stress and anxiety than male patients. Younger patients (29-49 years) were more interested in receiving information about sexuality after cancer compared to their older (50+) counterparts. Although information was requested throughout the cancer trajectory, most patients preferred to receive such information at diagnosis or within 1-3 months post-treatment. The majority of patients reported having computer and Internet access, and they were most receptive to receiving information delivered via the Internet, from a DVD, or from pamphlets and booklets. The relatively high percentage of patients with computer and Internet access reflects a growing trend in the United States and supports the feasibility of disseminating health information to this patient population via Internet-based programs.

  17. Modeling methods for merging computational and experimental aerodynamic pressure data

    NASA Astrophysics Data System (ADS)

    Haderlie, Jacob C.

    This research describes a process to model surface pressure data sets as a function of wing geometry from computational and wind tunnel sources and then merge them into a single predicted value. The described merging process will enable engineers to integrate these data sets with the goal of utilizing the advantages of each data source while overcoming the limitations of both; this provides a single, combined data set to support analysis and design. The main challenge with this process is accurately representing each data source everywhere on the wing. Additionally, this effort demonstrates methods to model wind tunnel pressure data as a function of angle of attack as an initial step towards a merging process that uses both location on the wing and flow conditions (e.g., angle of attack, flow velocity or Reynolds number) as independent variables. This surrogate model of pressure as a function of angle of attack can be useful for engineers who need to predict the location of zero-order discontinuities, e.g., flow separation or normal shocks. Because, to the author's best knowledge, there is no published, well-established merging method for aerodynamic pressure data (here, the coefficient of pressure Cp), this work identifies promising modeling and merging methods, and then makes a critical comparison of these methods. Surrogate models represent the pressure data for both data sets. Cubic B-spline surrogate models represent the computational simulation results. Machine learning and multi-fidelity surrogate models represent the experimental data. This research compares three surrogates for the experimental data (sequential--a.k.a. online--Gaussian processes, batch Gaussian processes, and multi-fidelity additive corrector) on the merits of accuracy and computational cost.
The Gaussian process (GP) methods employ cubic B-spline CFD surrogates as a model basis function to build a surrogate model of the WT data, and this use of the CFD surrogate in building the WT model could serve as a "merging" because the resulting WT pressure prediction uses information from both sources. In the GP approach, this model basis function concept seems to place more "weight" on the Cp values from the wind tunnel (WT) because the GP surrogate uses the CFD to approximate the WT data values. Conversely, the computationally inexpensive additive corrector method uses the CFD B-spline surrogate to define the shape of the spanwise distribution of the Cp while minimizing prediction error at all spanwise locations for a given arc length position; this, too, combines information from both sources to make a prediction of the 2-D WT-based Cp distribution, but the additive corrector approach gives more weight to the CFD prediction than to the WT data. Three surrogate models of the experimental data as a function of angle of attack are also compared for accuracy and computational cost. These surrogates are a single Gaussian process model (a single "expert"), product of experts, and generalized product of experts. The merging approach provides a single pressure distribution that combines experimental and computational data. The batch Gaussian process method provides a relatively accurate surrogate that is computationally acceptable, and can receive wind tunnel data from port locations that are not necessarily aligned with a single direction. On the other hand, the sequential Gaussian process and additive corrector methods must receive a sufficient number of data points aligned with one direction, e.g., from pressure port bands (tap rows) aligned with the freestream. The generalized product of experts best represents wind tunnel pressure as a function of angle of attack, but at higher computational cost than the single expert approach.
The format of the application data from computational and experimental sources in this work precluded the merging process from including flow condition variables (e.g., angle of attack) in the independent variables, so the merging process is only conducted in the wing geometry variables of arc length and span. The merging process of Cp data allows a more "hands-off" approach to aircraft design and analysis (i.e., fewer engineers are needed to debate the Cp distribution shape) and generates Cp predictions at any location on the wing. However, these benefits come at the cost of engineer time (learning how to build surrogates), computational time in constructing the surrogates, and surrogate accuracy (surrogates introduce error into data predictions). This dissertation effort used the Trap Wing / First AIAA CFD High-Lift Prediction Workshop as a relevant transonic wing with a multi-element high-lift system, and this work identified that the batch GP model for the WT data and the B-spline surrogate for the CFD might best be combined using expert belief weights to describe Cp as a function of location on the wing element surface. (Abstract shortened by ProQuest.).
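
    The additive corrector idea described above can be sketched in a few lines: a dense "CFD" surrogate supplies the shape of the Cp distribution, and a corrector fitted at sparse "wind tunnel" taps shifts it toward the measured level. This is a minimal sketch, not the dissertation's implementation: all data are invented, a linear interpolant stands in for the cubic B-spline surrogate, and a single constant corrector is a simplification of the method.

```python
import numpy as np

# Dense "CFD" samples of a Cp-like curve along the span (invented, not workshop data).
span = np.linspace(0.0, 1.0, 50)
cp_cfd = -1.5 * np.sin(np.pi * span)           # stand-in for a CFD Cp distribution

# Sparse "wind tunnel" taps with a systematic offset plus noise.
rng = np.random.default_rng(0)
taps = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
cp_wt = -1.5 * np.sin(np.pi * taps) + 0.2 + rng.normal(0.0, 0.01, taps.size)

# A linear interpolant stands in for the cubic B-spline CFD surrogate and
# defines the shape of the spanwise Cp distribution.
def cfd_surrogate(x):
    return np.interp(x, span, cp_cfd)

# Additive corrector: shift the CFD shape by the mean CFD-vs-WT mismatch at the taps.
delta = np.mean(cp_wt - cfd_surrogate(taps))

def merged_cp(x):
    return cfd_surrogate(x) + delta

print(round(float(delta), 3))
```

    The merged prediction tracks the wind tunnel level (the recovered offset is close to the 0.2 built into the synthetic taps) while keeping the CFD shape everywhere on the span, which is the sense in which the corrector gives more weight to the CFD prediction.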

  18. Methods and compositions for protection of cells and tissues from computed tomography radiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grdina, David J.

    Described are methods for preventing or inhibiting genomic instability in cells affected by diagnostic radiology procedures employing ionizing radiation. Embodiments include methods of preventing or inhibiting genomic instability in cells affected by computed tomography (CT) radiation. Subjects receiving ionizing radiation may be those persons suspected of having cancer, or cancer patients having received or currently receiving cancer therapy, and/or those patients having received previous ionizing radiation, including those who are approaching or have exceeded the recommended total radiation dose for a person.

  19. Functional and performance requirements of the next NOAA-Kansas City computer system

    NASA Technical Reports Server (NTRS)

    Mosher, F. R.

    1985-01-01

    The development of the Advanced Weather Interactive Processing System for the 1990s (AWIPS-90) will result in more timely and accurate forecasts with improved cost effectiveness. As part of the AWIPS-90 initiative, the National Meteorological Center (NMC), the National Severe Storms Forecast Center (NSSFC), and the National Hurricane Center (NHC) are to receive upgrades of interactive processing systems. This National Center Upgrade program will support the specialized inter-center communications, data acquisition, and processing needs of these centers. The missions, current capabilities, and general functional requirements for the upgrade to the NSSFC are addressed. System capabilities are discussed along with the requirements for the upgraded system.

  20. Brain-computer interface training combined with transcranial direct current stimulation in patients with chronic severe hemiparesis: Proof of concept study.

    PubMed

    Kasashima-Shindo, Yuko; Fujiwara, Toshiyuki; Ushiba, Junichi; Matsushika, Yayoi; Kamatani, Daiki; Oto, Misa; Ono, Takashi; Nishimoto, Atsuko; Shindo, Keiichiro; Kawakami, Michiyuki; Tsuji, Tetsuya; Liu, Meigen

    2015-04-01

    Brain-computer interface technology has been applied to stroke patients to improve their motor function. Event-related desynchronization during motor imagery, which is used as a brain-computer interface trigger, is sometimes difficult to detect in stroke patients. Anodal transcranial direct current stimulation (tDCS) is known to increase event-related desynchronization. This study investigated the adjunctive effect of anodal tDCS for brain-computer interface training in patients with severe hemiparesis. Eighteen patients with chronic stroke. A non-randomized controlled study. Subjects were divided between a brain-computer interface group and a tDCS-brain-computer interface group and participated in 10 days of brain-computer interface training. Event-related desynchronization was detected in the affected hemisphere during motor imagery of the affected fingers. The tDCS-brain-computer interface group received anodal tDCS before brain-computer interface training. Event-related desynchronization was evaluated before and after the intervention. The Fugl-Meyer Assessment upper extremity motor score (FM-U) was assessed before, immediately after, and 3 months after the intervention. Event-related desynchronization was significantly increased in the tDCS-brain-computer interface group. The FM-U was significantly increased in both groups. The FM-U improvement was maintained at 3 months in the tDCS-brain-computer interface group. Anodal tDCS can be a conditioning tool for brain-computer interface training in patients with severe hemiparetic stroke.

  1. Link failure detection in a parallel computer

    DOEpatents

    Archer, Charles J.; Blocksome, Michael A.; Megerian, Mark G.; Smith, Brian E.

    2010-11-09

    Methods, apparatus, and products are disclosed for link failure detection in a parallel computer including compute nodes connected in a rectangular mesh network, each pair of adjacent compute nodes in the rectangular mesh network connected together using a pair of links, that includes: assigning each compute node to either a first group or a second group such that adjacent compute nodes in the rectangular mesh network are assigned to different groups; sending, by each of the compute nodes assigned to the first group, a first test message to each adjacent compute node assigned to the second group; determining, by each of the compute nodes assigned to the second group, whether the first test message was received from each adjacent compute node assigned to the first group; and notifying a user, by each of the compute nodes assigned to the second group, whether the first test message was received.
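
    The scheme above amounts to two-coloring the mesh like a checkerboard so that every link connects a first-group node to a second-group node, then testing every link with one round of messages. A rough serial simulation on a small 2D mesh (not the patented parallel implementation; the mesh size and the failed link below are invented):

```python
# Serial sketch of the two-group link test on a toy 4x4 2D mesh.
W, H = 4, 4
nodes = [(x, y) for x in range(W) for y in range(H)]

# Checkerboard assignment guarantees adjacent nodes land in different groups.
group = {n: (n[0] + n[1]) % 2 for n in nodes}

failed_links = {((1, 1), (1, 2))}         # pretend this link drops messages

def delivered(a, b):
    return (a, b) not in failed_links and (b, a) not in failed_links

def neighbors(n):
    x, y = n
    cand = [(x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)]
    return [c for c in cand if 0 <= c[0] < W and 0 <= c[1] < H]

# Group-0 nodes send a test message on every link; group-1 nodes report misses.
missing = []
for n in nodes:
    if group[n] == 1:
        for m in neighbors(n):            # every neighbor of a group-1 node is in group 0
            if not delivered(m, n):
                missing.append((m, n))

print(missing)
```

    Because adjacent nodes are always in different groups, a single send round from the first group exercises every link in the mesh exactly once, and the second group can report any failed link to the user.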

  2. Comparison of arterial input functions measured from ultra-fast dynamic contrast enhanced MRI and dynamic contrast enhanced computed tomography in prostate cancer patients

    NASA Astrophysics Data System (ADS)

    Wang, Shiyang; Lu, Zhengfeng; Fan, Xiaobing; Medved, Milica; Jiang, Xia; Sammet, Steffen; Yousuf, Ambereen; Pineda, Federico; Oto, Aytekin; Karczmar, Gregory S.

    2018-02-01

    The purpose of this study was to evaluate the accuracy of arterial input functions (AIFs) measured from dynamic contrast enhanced (DCE) MRI following a low dose of contrast media injection. The AIFs measured from DCE computed tomography (CT) were used as the 'gold standard'. A total of twenty patients received CT and MRI scans on the same day. Patients received 120 ml of Iohexol for DCE-CT and a low dose (0.015 mmol kg-1) of gadobenate dimeglumine for DCE-MRI. The AIFs were measured in the iliac artery and normalized to the CT and MRI contrast agent doses. To correct for the different temporal resolution and sampling periods of CT and MRI, an empirical mathematical model (EMM) was first used to fit the AIFs. Then numerical AIFs (AIFCT and AIFMRI) were calculated based on the fitting parameters. The AIFMRI was convolved with a 'contrast agent injection' function (AIFMRICON) to correct for the difference between MRI and CT contrast agent injection times (~1.5 s versus 30 s). The results show that the EMMs accurately fitted AIFs measured from CT and MRI. There was no significant difference (p > 0.05) between the maximum peak amplitude of AIFs from CT (22.1 ± 4.1 mM/dose) and MRI after convolution (22.3 ± 5.2 mM/dose). The shapes of the AIFCT and AIFMRICON were very similar. Our results demonstrated that AIFs can be accurately measured by MRI following low dose contrast agent injection.
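
    The convolution step used to reconcile the ~1.5 s MRI bolus with the 30 s CT injection can be illustrated with a gamma-variate stand-in for the AIF and a unit-area boxcar injection function. The shapes and parameters below are invented for illustration and are not the paper's fitted EMM values:

```python
import numpy as np

dt = 0.1                                   # s, sampling interval
t = np.arange(0.0, 120.0, dt)

# Gamma-variate stand-in for the MRI (fast-bolus) AIF; parameters invented.
def gamma_variate(t, t0=10.0, alpha=3.0, beta=2.0):
    s = np.clip(t - t0, 0.0, None)
    return (s ** alpha) * np.exp(-s / beta)

aif_fast = gamma_variate(t)
aif_fast /= aif_fast.max()                 # normalize peak amplitude to 1

# 30 s boxcar injection function with unit area, so convolution preserves dose.
inj = np.ones(int(30.0 / dt))
inj /= inj.sum()

# Convolving the fast-injection AIF with the slow-injection profile broadens
# and delays the peak, mimicking the 30 s CT injection protocol.
aif_slow = np.convolve(aif_fast, inj)[: t.size]

print(t[np.argmax(aif_fast)], t[np.argmax(aif_slow)])
```

    Because the injection function has unit area, the convolved curve encloses the same dose while its peak drops and shifts later, which is the behavior being corrected for before comparing MRI and CT peak amplitudes.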

  3. Cognitive computer training in children with attention deficit hyperactivity disorder (ADHD) versus no intervention: study protocol for a randomized controlled trial.

    PubMed

    Bikic, Aida; Leckman, James F; Lindschou, Jane; Christensen, Torben Ø; Dalsgaard, Søren

    2015-10-24

    Attention Deficit Hyperactivity Disorder (ADHD) is a common neurodevelopmental disorder characterized by symptoms of inattention and impulsivity and/or hyperactivity and a range of cognitive dysfunctions. Pharmacological treatment may be beneficial; however, many affected individuals continue to have difficulties with cognitive functions despite medical treatment, and up to 30% do not respond to pharmacological treatment. Inadequate medical compliance and the long-term effects of treatment make it necessary to explore nonpharmacological and supplementary treatments for ADHD. Treatment of cognitive dysfunctions may prove particularly important because of the impact of these dysfunctions on the ability to cope with everyday life. Lately, several trials have shown promising results for cognitive computer training, often referred to as cognitive training, which focuses on particular parts of cognition, mostly working memory or attention, but with poor generalization of training effects to other cognitive functions and functional outcomes. Children with ADHD have a variety of cognitive dysfunctions, and it is important that cognitive training target multiple cognitive functions. This multicenter randomized clinical superiority trial aims to investigate the effect of "ACTIVATE™," a computer program designed to improve a range of cognitive skills and ADHD symptoms. A total of 122 children with ADHD, aged 6 to 13 years, will be randomized to an intervention or a control group. The intervention group will be asked to use ACTIVATE™ at home 40 minutes per day, 6 days per week for 8 weeks. Both the intervention and control groups will receive treatment as usual. Outcome measures will assess cognitive functions, symptoms, and behavioral and functional measures before and after the 8 weeks of training and at 12- and 24-week follow-ups. Results of this trial will provide useful information on the effectiveness of computer training focusing on several cognitive functions. 
Cognitive training has the potential to reduce cognitive dysfunctions and to become a new treatment option, which can promote a more normal neural development in young children with ADHD and thus reduce cognitive dysfunctions and symptoms. This could help children with ADHD to perform better in everyday life and school. ClinicalTrials.gov: NCT01752530, date of registration: 10 December 2012.

  4. Receiver function HV ratio: a new measurement for reducing non-uniqueness of receiver function waveform inversion

    NASA Astrophysics Data System (ADS)

    Chong, Jiajun; Chu, Risheng; Ni, Sidao; Meng, Qingjun; Guo, Aizhi

    2018-02-01

    It is known that a receiver function places a relatively weak constraint on absolute seismic wave velocity, and joint inversion of the receiver function with surface wave dispersion has therefore been widely applied to reduce the trade-off of velocity with interface depth. However, some studies indicate that the receiver function itself is capable of determining the absolute shear-wave velocity. In this study, we propose to measure the receiver function HV ratio, which takes advantage of the amplitude information of the receiver function to constrain the shear-wave velocity. Numerical analysis indicates that the receiver function HV ratio is sensitive to the average shear-wave velocity in the depth range it samples, and can help to reduce the non-uniqueness of receiver function waveform inversion. A joint inversion scheme has been developed, and both synthetic tests and a real data application demonstrated the feasibility of the joint inversion.

  5. The effects of a computer skill training programme adopting social comparison and self-efficacy enhancement strategies on self-concept and skill outcome in trainees with physical disabilities.

    PubMed

    Tam, S F

    2000-10-15

    The aim of this controlled, quasi-experimental study was to evaluate the effects of both self-efficacy enhancement and social comparison training strategy on computer skills learning and self-concept outcome of trainees with physical disabilities. The self-efficacy enhancement group comprised 16 trainees, the tutorial training group comprised 15 trainees, and there were 25 subjects in the control group. Both the self-efficacy enhancement group and the tutorial training group received a 15-week computer skills training course, including generic Chinese computer operation, Chinese word processing and Chinese desktop publishing skills. The self-efficacy enhancement group received training with tutorial instructions that incorporated self-efficacy enhancement strategies and experienced self-enhancing social comparisons. The tutorial training group received behavioural learning-based tutorials only, and the control group did not receive any training. The following measurements were employed to evaluate the outcomes: the Self-Concept Questionnaire for the Physically Disabled Hong Kong Chinese (SCQPD), the computer self-efficacy rating scale and the computer performance rating scale. The self-efficacy enhancement group showed significantly better computer skills learning outcome, total self-concept, and social self-concept than the tutorial training group. The self-efficacy enhancement group did not show significant changes in their computer self-efficacy; however, the tutorial training group showed a significant lowering of their computer self-efficacy. The training strategy that incorporated self-efficacy enhancement and positive social comparison experiences maintained the computer self-efficacy of trainees with physical disabilities. This strategy was more effective in improving the learning outcome (p = 0.01) and self-concept (p = 0.05) of the trainees than the conventional tutorial-based training strategy.

  6. An arch-shaped intraoral tongue drive system with built-in tongue-computer interfacing SoC.

    PubMed

    Park, Hangue; Ghovanloo, Maysam

    2014-11-14

    We present a new arch-shaped intraoral Tongue Drive System (iTDS) designed to occupy the buccal shelf in the user's mouth. The new arch-shaped iTDS, which will be referred to as the iTDS-2, incorporates a system-on-a-chip (SoC) that amplifies and digitizes the raw magnetic sensor data and sends it wirelessly to an external TDS universal interface (TDS-UI) via an inductive coil or a planar inverted-F antenna. A built-in transmitter (Tx) employs a dual-band radio that operates at either 27 MHz or 432 MHz band, according to the wireless link quality. A built-in super-regenerative receiver (SR-Rx) monitors the wireless link quality and switches the band if the link quality is below a predetermined threshold. An accompanying ultra-low power FPGA generates data packets for the Tx and handles digital control functions. The custom-designed TDS-UI receives raw magnetic sensor data from the iTDS-2, recognizes the intended user commands by the sensor signal processing (SSP) algorithm running in a smartphone, and delivers the classified commands to the target devices, such as a personal computer or a powered wheelchair. We evaluated the iTDS-2 prototype using center-out and maze navigation tasks on two human subjects, which proved its functionality. The subjects' performance with the iTDS-2 was improved by 22% over its predecessor, reported in our earlier publication.

  7. Comparison of high pressure transient PVT measurements and model predictions. Part I.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Felver, Todd G.; Paradiso, Nicholas Joseph; Evans, Gregory Herbert

    2010-07-01

    A series of experiments consisting of vessel-to-vessel transfers of pressurized gas using Transient PVT methodology has been conducted to provide a data set for optimizing heat transfer correlations in high pressure flow systems. In rapid expansions such as these, the heat transfer conditions are neither adiabatic nor isothermal. Compressible flow tools exist, such as NETFLOW, that can accurately calculate the pressure and other dynamical mechanical properties of such a system as a function of time. However, to properly evaluate the mass that has transferred as a function of time, these computational tools rely on heat transfer correlations that must be confirmed experimentally. In this work new data sets using helium gas are used to evaluate the accuracy of these correlations for receiver vessel sizes ranging from 0.090 L to 13 L and initial supply pressures ranging from 2 MPa to 40 MPa. The comparisons show that the correlations developed in the 1980s from sparse data sets perform well for the supply vessels but are not accurate for the receivers, particularly at early time during the transfers. This report focuses on the experiments used to obtain high quality data sets that can be used to validate computational models. Part II of this report discusses how these data were used to gain insight into the physics of gas transfer and to improve vessel heat transfer correlations. Network flow modeling and CFD modeling are also discussed.
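
    Why the thermal assumption matters can be seen with back-of-envelope ideal-gas arithmetic: at the same measured receiver pressure, adiabatic filling of an initially evacuated vessel heats the gas to roughly gamma times the supply temperature, so the inferred mass drops by a factor of 1/gamma relative to the isothermal assumption. All numbers below are illustrative, not the report's measurements:

```python
# Ideal-gas mass inferred from a receiver pressure reading under two thermal
# assumptions (helium; pressure, volume, and temperature are invented examples).
R = 8.314           # J/(mol K), universal gas constant
M = 4.0026e-3       # kg/mol, helium molar mass
gamma = 5.0 / 3.0   # heat capacity ratio for a monatomic gas

P = 2e6             # Pa, measured receiver pressure after transfer
V = 1.0e-3          # m^3, a 1 L receiver vessel
T_supply = 300.0    # K, supply gas temperature

def mass(P, V, T):
    return P * V * M / (R * T)

# Isothermal assumption: receiver gas stays at the supply temperature.
m_iso = mass(P, V, T_supply)

# Adiabatic filling of an evacuated vessel heats the gas to ~gamma * T_supply.
m_adi = mass(P, V, gamma * T_supply)

print(m_iso * 1e3, m_adi * 1e3, m_adi / m_iso)
```

    The same pressure reading implies 40% less transferred mass under the adiabatic assumption, which is why validated heat transfer correlations are needed to interpret transient PVT data.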

  8. Identifying a largest logical plane from a plurality of logical planes formed of compute nodes of a subcommunicator in a parallel computer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Kristan D.; Faraj, Daniel A.

    In a parallel computer, a largest logical plane from a plurality of logical planes formed of compute nodes of a subcommunicator may be identified by: identifying, by each compute node of the subcommunicator, all logical planes that include the compute node; calculating, by each compute node for each identified logical plane that includes the compute node, an area of the identified logical plane; initiating, by a root node of the subcommunicator, a gather operation; receiving, by the root node from each compute node of the subcommunicator, each node's calculated areas as contribution data to the gather operation; and identifying, by the root node in dependence upon the received calculated areas, a logical plane of the subcommunicator having the greatest area.
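
    A serial sketch of the same idea on a toy 2D node set (coordinates invented; the real method distributes the per-node plane search across the compute nodes and collects the areas with a gather operation rather than a Python loop):

```python
from itertools import combinations

# Toy subcommunicator: a 3x3 block of nodes with one corner missing.
nodes = {(0, 0), (0, 1), (0, 2), (1, 0), (1, 1), (1, 2), (2, 0), (2, 1)}

def plane_areas_containing(node):
    """Areas of all fully populated axis-aligned rectangles ('logical planes')
    that include `node`, found by brute force over corner pairs."""
    x0, y0 = node
    areas = []
    for (ax, ay), (bx, by) in combinations(sorted(nodes), 2):
        xlo, xhi = min(ax, bx), max(ax, bx)
        ylo, yhi = min(ay, by), max(ay, by)
        if not (xlo <= x0 <= xhi and ylo <= y0 <= yhi):
            continue
        cells = {(x, y) for x in range(xlo, xhi + 1) for y in range(ylo, yhi + 1)}
        if cells <= nodes:                      # plane is fully populated
            areas.append((xhi - xlo + 1) * (yhi - ylo + 1))
    return areas

# "Gather" step: the root collects every node's areas and takes the maximum.
largest = max(a for n in nodes for a in plane_areas_containing(n))
print(largest)
```

    With the corner node (2, 2) absent, the full 3x3 plane is disqualified and the root identifies a 2x3 (or 3x2) plane of area 6 as the largest.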

  9. Cognitive remediation therapy during treatment for alcohol dependence.

    PubMed

    Rupp, Claudia I; Kemmler, Georg; Kurz, Martin; Hinterhuber, Hartmann; Fleischhacker, W Wolfgang

    2012-07-01

    Cognitive impairments in individuals with alcohol dependence may interfere with the progress of treatment and contribute to the progression of the disease. This study aimed to determine whether cognitive remediation (CR) therapy applied during treatment for alcohol dependence improves cognitive functioning in alcohol-dependent inpatients. A secondary aim was to evaluate whether the benefits of CR generalize to noncognitive clinically meaningful outcomes at the end of inpatient treatment. Forty-one alcohol-dependent patients entering inpatient treatment for alcohol dependence were randomly assigned to receive conventional treatment (n = 21) or an additional 12 sessions of computer-assisted CR focusing on cognitive enhancement in attention/executive function and memory domains (n = 20). Assessments of cognitive abilities in these domains as well as of psychological well-being and alcohol craving were conducted at baseline (at the beginning of inpatient treatment) and after CR (at the end of treatment). Results indicated that, relative to patients completing conventional treatment, those who received supplemental CR showed significant improvement in attention/executive function and memory domains, particularly in attention (alertness, divided attention), working memory, and delayed memory (recall). In addition, patients receiving CR during alcohol-dependence treatment showed significantly greater improvements in psychological well-being (Symptom Checklist-90-Revised) and in the compulsion aspect of craving (Obsessive Compulsive Drinking Scale-German version). CR during inpatient treatment for alcohol dependence is effective in improving cognitive impairments in alcohol-dependent patients. The benefits generalize to noncognitive outcomes, demonstrating that CR may be an efficacious adjunctive intervention for the treatment of alcohol dependence.

  10. Towards Full-Waveform Ambient Noise Inversion

    NASA Astrophysics Data System (ADS)

    Sager, K.; Ermert, L. A.; Boehm, C.; Fichtner, A.

    2016-12-01

    Noise tomography usually works under the assumption that the inter-station ambient noise correlation is equal to a scaled version of the Green function between the two receivers. This assumption, however, is only met under specific conditions, e.g. wavefield diffusivity and equipartitioning, or the isotropic distribution of both mono- and dipolar uncorrelated noise sources. These assumptions are typically not satisfied in the Earth. This inconsistency inhibits the exploitation of the full waveform information contained in noise correlations in order to constrain Earth structure and noise generation. To overcome this limitation, we attempt to develop a method that consistently accounts for the distribution of noise sources, 3D heterogeneous Earth structure and the full seismic wave propagation physics. This is intended to improve the resolution of tomographic images, to refine noise source location, and thereby to contribute to a better understanding of noise generation. We introduce an operator-based formulation for the computation of correlation functions and apply the continuous adjoint method, which allows us to compute first and second derivatives of misfit functionals with respect to source distribution and Earth structure efficiently. Based on these developments we design an inversion scheme using a 2D finite-difference code. To enable a joint inversion for noise sources and Earth structure, we investigate the following aspects: (1) the capability of different misfit functionals to image wave speed anomalies and the source distribution, and (2) possible source-structure trade-offs, especially the extent to which unresolvable structure can be mapped into the inverted noise source distribution and vice versa. In anticipation of real-data applications, we present an extension of the open-source waveform modelling and inversion package Salvus, which allows us to compute correlation functions in 3D media with heterogeneous noise sources at the surface.
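
    The basic inter-station correlation idea can be illustrated in a few lines: two receivers recording the same random wavefield with a travel-time offset produce a cross-correlation that peaks at that offset. The single one-sided source used here is itself an example of the non-isotropic source distribution that violates the Green-function assumption; all numbers are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
dt = 0.01                        # s, sampling interval
delay = 200                      # samples, i.e. a 2 s inter-station travel time
src = rng.normal(size=4000)      # continuous random noise source

rec_near = src[delay:]           # station close to the source
rec_far = src[:-delay]           # same wavefield, arriving 2 s later

# Full cross-correlation; the peak index gives the travel-time lag.
corr = np.correlate(rec_far, rec_near, mode="full")
lag = (np.argmax(corr) - (rec_near.size - 1)) * dt
print(lag)
```

    The recovered 2 s lag is the travel-time information that noise tomography exploits; with realistic, unevenly distributed sources the correlation is biased away from the true Green function, which is the motivation for inverting jointly for sources and structure.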

  11. Ultra-low power high precision magnetotelluric receiver array based customized computer and wireless sensor network

    NASA Astrophysics Data System (ADS)

    Chen, R.; Xi, X.; Zhao, X.; He, L.; Yao, H.; Shen, R.

    2016-12-01

    Dense 3D magnetotelluric (MT) data acquisition has the benefit of suppressing static shift and topography effects, and can achieve high-precision, high-resolution inversion of underground structure. This method may play an important role in mineral exploration, geothermal resources exploration, and hydrocarbon exploration. For large-scale 3D MT data acquisition, it is necessary to greatly reduce the power consumption of the MT signal receiver while using a wireless sensor network to monitor the data quality of deployed receivers. We adopted a series of technologies to realize this goal. First, we designed a low-power embedded computer that couples tightly with the other parts of the MT receiver and supports a wireless sensor network; its power consumption is less than 1 watt. We then designed a 4-channel data acquisition subsystem that supports 24-bit analog-to-digital conversion, GPS synchronization, and real-time digital signal processing, and we developed the power supply and power management subsystem for the receiver. Finally, we developed a suite of software supporting data acquisition, calibration, the wireless sensor network, and testing. The software, which runs on a personal computer, can monitor and control over 100 MT receivers in the field for data acquisition and quality control. The total power consumption of the receiver is about 2 watts at full operation, and standby power consumption is less than 0.1 watt. Our testing showed that the MT receiver can acquire good quality data at the surface with an electric dipole length of 3 m. Over 100 MT receivers were built and used for large-scale geothermal exploration in China with great success.
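
    The quoted power figures translate directly into field endurance. A quick arithmetic sketch, assuming for illustration a 100 Wh battery pack and an 8 h/day acquisition schedule (neither is stated in the abstract):

```python
# Battery-life arithmetic from the stated power figures; the battery capacity
# and duty cycle are assumed example values, not from the abstract.
p_active = 2.0        # W, full acquisition
p_standby = 0.1       # W, standby (stated upper bound)
battery_wh = 100.0    # Wh, assumed battery capacity

hours_active = battery_wh / p_active            # continuous acquisition
daily_wh = 8 * p_active + 16 * p_standby        # Wh per day, mixed duty cycle
days_mixed = battery_wh / daily_wh

print(hours_active, round(days_mixed, 1))
```

    Continuous acquisition gives about 50 hours on such a pack, while the mixed duty cycle stretches it to roughly 5.7 days, illustrating why sub-watt standby consumption matters for long deployments.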

  12. Influence of global heterogeneities on regional imaging based upon full waveform inversion of teleseismic wavefield

    NASA Astrophysics Data System (ADS)

    Monteiller, Vadim; Beller, Stephen; Operto, Stephane; Virieux, Jean

    2015-04-01

    The current development of dense seismic arrays and high-performance computing makes the application of full-waveform inversion (FWI) to teleseismic data for high-resolution lithospheric imaging feasible today. In the teleseismic configuration, the source is often considered, to first order, as a planar wave that impinges on the base of the lithospheric target located below the receiver array. Recently, injection methods coupling global propagation in 1D or axisymmetric earth models with regional 3D methods (discontinuous Galerkin finite element methods, spectral element methods, or finite differences) have allowed us to consider more realistic teleseismic phases. These teleseismic phases can be propagated inside the 3D regional model in order to exploit not only the forward-scattered waves propagating up to the receiver but also second-order arrivals that are back-scattered from the free surface and from reflectors before being recorded at the surface. However, these computations are performed assuming a simple global model. In this presentation, we review some key specifications that might be considered for mitigating the effect on FWI of heterogeneities situated outside the regional domain. We consider synthetic models and data computed using our recently developed hybrid AxiSEM/SEM method. The global simulation is done by the AxiSEM code, which allows us to consider axisymmetric anomalies; the 3D regional computation is performed by the spectral element method. We investigate the effect of external anomalies on the regional model obtained by FWI when they are neglected by considering only 1D global propagation. We also investigate the effect of the source time function and the focal mechanism on the results of the FWI approach.

  13. KSC-99pp1225

    NASA Image and Video Library

    1999-10-06

    Children at Coquina Elementary School, Titusville, Fla., excitedly tear into the wrapped computer equipment donated by Kennedy Space Center. Coquina is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the NASA K-12 Education Services Office at KSC. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated.

  14. KSC-99pp1224

    NASA Image and Video Library

    1999-10-06

    Children at Coquina Elementary School, Titusville, Fla., eagerly tear into the wrapped computer equipment donated by Kennedy Space Center. Coquina is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the NASA K-12 Education Services Office at KSC. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated.

  15. KSC-99pp1222

    NASA Image and Video Library

    1999-10-06

    Children at Coquina Elementary School, Titusville, Fla., look with curiosity at the wrapped computer equipment donated by Kennedy Space Center. Coquina is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the NASA K-12 Education Services Office at KSC. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated.

  16. KSC-99pp1223

    NASA Image and Video Library

    1999-10-06

    Children at Coquina Elementary School, Titusville, Fla., "practice" using a computer keyboard, part of equipment donated by Kennedy Space Center. Coquina is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the NASA K-12 Education Services Office at KSC. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated.

  17. Crustal structure across the lateral edge of the Southern Tyrrhenian slab

    NASA Astrophysics Data System (ADS)

    Pio Lucente, Francesco; Piana Agostinetti, Nicola; Di Bona, Massimo; Govoni, Aladino; Bianchi, Irene

    2015-04-01

    In the southeastern corner of the Tyrrhenian basin, in the central Mediterranean Sea, a tight alignment of earthquakes along a well-defined Benioff zone reveals the presence of one of the narrowest active trenches worldwide, where one of the last fragments of the former Tethys ocean is consumed. Seismic tomography furnishes snapshot images of the present-day position and shape of this slab. Through receiver function analysis we investigate the layered structures overlying the slab. We compute receiver functions from the P-coda of teleseismic events at 13 temporary stations deployed during the "Messina 1908-2008" research project (Margheriti, 2008), each operating for an average period of 15 months. The crustal and uppermost mantle structure has been investigated using the trans-dimensional McMC algorithm developed by Piana Agostinetti and Malinverno (2010), obtaining a 1D S-wave velocity profile for each station. At three of the stations, which operated for a longer period, the number and azimuthal distribution of teleseisms allowed us to stack the RF data set by back azimuth and to compute its harmonic expansion. The analysis of the back-azimuthal harmonics gave us insight into the presence of dipping interfaces and anisotropic layers at depth. The strike and dip of interfaces and the anisotropic parameters have been quantified using the Neighbourhood Algorithm (Sambridge, 1999). Preliminary results highlight: (1) a clear differentiation of the isotropic S-wave velocity structure passing through the slab edge, from the tip of the Calabrian arc to the Peloritani Range, and (2) the presence of crustal complexities, such as dipping interfaces and anisotropic layers, in both the upper and lower crust. Margheriti, L. (2008), Understanding Crust Dynamics and Subduction in Southern Italy, Eos Trans. AGU, 89(25), 225-226, doi:10.1029/2008EO250002. Piana Agostinetti, N. and A. Malinverno (2010), Receiver Function inversion by trans-dimensional Monte Carlo sampling, Geophys. J. Int., 181(2), 858-872, doi:10.1111/j.1365-246X.2010.04530.x. Sambridge, M. (1999), Geophysical inversion with a neighbourhood algorithm-I. Searching a parameter space, Geophys. J. Int., 138, 479-494, doi:10.1046/j.1365-246X.1999.00876.x.
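
    The back-azimuthal harmonic expansion described above can be sketched as a least-squares fit of degree-0, 1, and 2 harmonics to receiver-function amplitudes. This is an illustrative reconstruction under assumed synthetic data, not the authors' code; the function and variable names are invented here.

```python
import numpy as np

def harmonic_decompose(baz_deg, amps, max_degree=2):
    """Least-squares back-azimuthal harmonic expansion of RF amplitudes.

    Models amp(baz) = c0 + sum_k [a_k cos(k*baz) + b_k sin(k*baz)] for
    k = 1..max_degree; degree-1 terms are sensitive to dipping interfaces
    and plunging anisotropy, degree-2 terms to horizontal-axis anisotropy.
    """
    baz = np.radians(np.asarray(baz_deg, dtype=float))
    columns = [np.ones_like(baz)]
    for k in range(1, max_degree + 1):
        columns.append(np.cos(k * baz))
        columns.append(np.sin(k * baz))
    design = np.column_stack(columns)
    coeffs, *_ = np.linalg.lstsq(design, np.asarray(amps, dtype=float), rcond=None)
    return coeffs  # [c0, a1, b1, a2, b2]

# Synthetic check: a constant plus a pure degree-1 (dipping-interface) term.
baz = np.arange(0.0, 360.0, 15.0)
amps = 0.3 + 0.1 * np.cos(np.radians(baz))
coeffs = harmonic_decompose(baz, amps)
```

    With even azimuthal coverage the harmonic basis is orthogonal, so the fit recovers the constant and the degree-1 cosine coefficient exactly.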

  18. Computer-based attention training in the schools for children with attention deficit/hyperactivity disorder: a preliminary trial.

    PubMed

    Steiner, Naomi J; Sheldrick, Radley Christopher; Gotthelf, David; Perrin, Ellen C

    2011-07-01

    Objective. This study examined the efficacy of 2 computer-based training systems to teach children with attention deficit/hyperactivity disorder (ADHD) to attend more effectively. Design/methods. A total of 41 children with ADHD from 2 middle schools were randomly assigned to receive 2 sessions a week at school of either neurofeedback (NF) or attention training through a standard computer format (SCF), either immediately or after a 6-month wait (waitlist control group). Parents, children, and teachers completed questionnaires pre- and postintervention. Results. Primary parents in the NF condition reported significant (P < .05) change on Conners's Rating Scales-Revised (CRS-R) and Behavior Assessment Scales for Children (BASC) subscales; and in the SCF condition, they reported significant (P < .05) change on the CRS-R Inattention scale and ADHD index, the BASC Attention Problems Scale, and on the Behavioral Rating Inventory of Executive Functioning (BRIEF). Conclusion. This randomized control trial provides preliminary evidence of the effectiveness of computer-based interventions for ADHD and supports the feasibility of offering them in a school setting.

  19. On the use of inexact, pruned hardware in atmospheric modelling

    PubMed Central

    Düben, Peter D.; Joven, Jaume; Lingamneni, Avinash; McNamara, Hugh; De Micheli, Giovanni; Palem, Krishna V.; Palmer, T. N.

    2014-01-01

    Inexact hardware design, which advocates trading the accuracy of computations in exchange for significant savings in area, power and/or performance of computing hardware, has received increasing prominence in several error-tolerant application domains, particularly those involving perceptual or statistical end-users. In this paper, we evaluate inexact hardware for its applicability in weather and climate modelling. We expand previous studies on inexact techniques, in particular probabilistic pruning, to floating point arithmetic units and derive several simulated set-ups of pruned hardware with reasonable levels of error for applications in atmospheric modelling. The set-up is tested on the Lorenz ‘96 model, a toy model for atmospheric dynamics, using software emulation for the proposed hardware. The results show that large parts of the computation tolerate the use of pruned hardware blocks without major changes in the quality of short- and long-time diagnostics, such as forecast errors and probability density functions. This could open the door to significant savings in computational cost and to higher resolution simulations with weather and climate models. PMID:24842031
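
    The emulation approach can be illustrated with a small sketch: integrate the Lorenz '96 model once in full double precision and once with every state update rounded to a reduced number of mantissa bits, a crude software stand-in for pruned arithmetic units. The truncation scheme, time step, and parameter values below are assumptions for illustration, not those of the paper.

```python
import numpy as np

def lorenz96_tendency(x, forcing=8.0):
    """Lorenz '96 tendency: dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def truncate(x, mantissa_bits):
    """Crude software emulation of inexact hardware: round every value to
    `mantissa_bits` bits of mantissa (a stand-in for pruned logic blocks)."""
    mantissa, exponent = np.frexp(x)
    scale = 2.0 ** mantissa_bits
    return np.ldexp(np.round(mantissa * scale) / scale, exponent)

def integrate(x0, dt=0.005, steps=400, mantissa_bits=None):
    """Forward-Euler integration (kept simple here), optionally truncating
    the state after every step to emulate low-precision hardware."""
    x = x0.copy()
    for _ in range(steps):
        x = x + dt * lorenz96_tendency(x)
        if mantissa_bits is not None:
            x = truncate(x, mantissa_bits)
    return x

rng = np.random.default_rng(0)
x0 = 8.0 + 0.01 * rng.standard_normal(40)    # 40 variables near the attractor
exact = integrate(x0)                        # full double precision
inexact = integrate(x0, mantissa_bits=12)    # emulated reduced precision
divergence = float(np.abs(exact - inexact).max())
```

    Comparing the two trajectories (and, in a fuller study, their long-run statistics) is the kind of diagnostic the paper uses to judge whether pruned hardware degrades forecast quality.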

  20. Home-Based Computer Gaming in Vestibular Rehabilitation of Gaze and Balance Impairment.

    PubMed

    Szturm, Tony; Reimer, Karen M; Hochman, Jordan

    2015-06-01

    Disease or damage of the vestibular sense organs causes a range of distressing symptoms and functional problems that can include loss of balance, gaze instability, disorientation, and dizziness. A novel computer-based rehabilitation system with a therapeutic gaming application has been developed. This method allows different gaze and head movement exercises to be coupled to a wide range of inexpensive, commercial computer games. It can be used in standing, so graded balance demands using a sponge pad can be incorporated into the program. A case series pre- and postintervention study was conducted of nine adults diagnosed with peripheral vestibular dysfunction who received a 12-week home rehabilitation program. The feasibility and usability of the home computer-based therapeutic program were established. Study findings revealed that using head rotation to interact with computer games, when coupled to demanding balance conditions, resulted in significant improvements in standing balance, dynamic visual acuity, gaze control, and walking performance. Perception of dizziness as measured by the Dizziness Handicap Inventory also decreased significantly. These preliminary findings provide support that a low-cost home game-based exercise program is well suited to train standing balance and gaze control (with active and passive head motion).

  1. De novo self-assembling collagen heterotrimers using explicit positive and negative design.

    PubMed

    Xu, Fei; Zhang, Lei; Koder, Ronald L; Nanda, Vikas

    2010-03-23

    We sought to computationally design model collagen peptides that specifically associate as heterotrimers. Computational design has been successfully applied to the creation of new protein folds and functions. Despite the high abundance of collagen and its key role in numerous biological processes, fibrous proteins have received little attention as computational design targets. Collagens are composed of three polypeptide chains that wind into triple helices. We developed a discrete computational model to design heterotrimer-forming collagen-like peptides. Stability and specificity of oligomerization were concurrently targeted using a combined positive and negative design approach. The sequences of three 30-residue peptides, A, B, and C, were optimized to favor charge-pair interactions in an ABC heterotrimer, while disfavoring the 26 competing oligomers (e.g., AAA, ABB, BCA). Peptides were synthesized and characterized for thermal stability and triple-helical structure by circular dichroism and NMR. A unique A:B:C-type species was not achieved. Negative design was partially successful, with only A + B and B + C competing mixtures formed. Analysis of computed versus experimental stabilities helps to clarify the role of electrostatics and secondary-structure propensities in determining collagen stability and provides important insight into how subsequent designs can be improved.

  2. Evaluating the risk of appendiceal perforation when using ultrasound as the initial diagnostic imaging modality in children with suspected appendicitis.

    PubMed

    Alerhand, Stephen; Meltzer, James; Tay, Ee Tein

    2017-08-01

    Ultrasound scan has gained attention for diagnosing appendicitis due to its avoidance of ionizing radiation. However, studies show that ultrasound scan carries inferior sensitivity to computed tomography scan. A non-diagnostic ultrasound scan could increase the time to diagnosis and appendicectomy, particularly if follow-up computed tomography scan is needed. Some studies suggest that delaying appendicectomy increases the risk of perforation. To investigate the risk of appendiceal perforation when using ultrasound scan as the initial diagnostic imaging modality in children with suspected appendicitis. We retrospectively reviewed 1411 charts of children ≤17 years old diagnosed with appendicitis at two urban academic medical centers. Patients who underwent ultrasound scan first were compared to those who underwent computed tomography scan first. In the sub-group analysis, patients who only received ultrasound scan were compared to those who received initial ultrasound scan followed by computed tomography scan. Main outcome measures were appendiceal perforation rate and time from triage to appendicectomy. In 720 children eligible for analysis, there was no significant difference in perforation rate between those who had initial ultrasound scan and those who had initial computed tomography scan (7.3% vs. 8.9%, p = 0.44), nor in those who had ultrasound scan only and those who had initial ultrasound scan followed by computed tomography scan (8.0% vs. 5.6%, p = 0.42). Those patients who had ultrasound scan first had a shorter triage-to-incision time than those who had computed tomography scan first (9.2 (IQR: 5.9, 14.0) vs. 10.2 (IQR: 7.3, 14.3) hours, p = 0.03), whereas those who had ultrasound scan followed by computed tomography scan took longer than those who had ultrasound scan only (7.8 (IQR: 5.3, 11.6) vs. 15.1 (IQR: 10.6, 20.6), p < 0.001). 
Children < 12 years old receiving ultrasound scan first had lower perforation rate (p = 0.01) and shorter triage-to-incision time (p = 0.003). Children with suspected appendicitis receiving ultrasound scan as the initial diagnostic imaging modality do not have increased risk of perforation compared to those receiving computed tomography scan first. We recommend that children <12 years of age receive ultrasound scan first.

  3. Less-Complex Method of Classifying MPSK

    NASA Technical Reports Server (NTRS)

    Hamkins, Jon

    2006-01-01

    An alternative to an optimal method of automated classification of signals modulated with M-ary phase-shift-keying (M-ary PSK or MPSK) has been derived. The alternative method is approximate, but it offers nearly optimal performance and entails much less complexity, which translates to much less computation time. Modulation classification is becoming increasingly important in radio-communication systems that utilize multiple data modulation schemes and include software-defined or software-controlled receivers. Such a receiver may "know" little a priori about an incoming signal but may be required to correctly classify its data rate, modulation type, and forward error-correction code before properly configuring itself to acquire and track the symbol timing, carrier frequency, and phase, and ultimately produce decoded bits. Modulation classification has long been an important component of military interception of initially unknown radio signals transmitted by adversaries. Modulation classification may also be useful for enabling cellular telephones to automatically recognize different signal types and configure themselves accordingly. The concept of modulation classification as outlined in the preceding paragraph is quite general. However, at the present early stage of development, and for the purpose of describing the present alternative method, the term "modulation classification" or simply "classification" signifies, more specifically, a distinction between M-ary and M'-ary PSK, where M and M' represent two different integer multiples of 2. Both the prior optimal method and the present alternative method require the acquisition of magnitude and phase values of a number (N) of consecutive baseband samples of the incoming signal + noise. 
    The prior optimal method is based on a maximum-likelihood (ML) classification rule that requires a calculation of likelihood functions for the M and M' hypotheses: Each likelihood function is an integral, over a full cycle of carrier phase, of a complicated sum of functions of the baseband sample values, the carrier phase, the carrier-signal and noise magnitudes, and M or M'. Then the likelihood ratio, defined as the ratio between the likelihood functions, is computed, leading to the choice of whichever hypothesis, M or M', is more likely. In the alternative method, the integral in each likelihood function is approximated by a sum over values of the integrand sampled at a number, l, of equally spaced values of carrier phase. Used in this way, l is a parameter that can be adjusted to trade computational complexity against the probability of misclassification. In the limit as l approaches infinity, one obtains the integral form of the likelihood function and thus recovers the ML classification. The present approximate method has been tested in comparison with the ML method by means of computational simulations. The results of the simulations have shown that the performance (as quantified by probability of misclassification) of the approximate method is nearly indistinguishable from that of the ML method (see figure).
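
    The discretized-phase approximation can be sketched as follows. The constellation handling, SNR convention, and parameter values here are illustrative assumptions, not the article's exact formulation; the phase integral is simply replaced by a log-mean-exp over a small number of candidate carrier phases.

```python
import numpy as np

def mpsk_loglik(r, M, snr, n_phases=16):
    """Approximate log-likelihood that baseband samples r are M-ary PSK.

    The exact ML rule integrates the likelihood over the unknown carrier
    phase; here that integral is replaced by an average over `n_phases`
    equally spaced phases (the adjustable parameter l in the text).
    Unit symbol energy and an SNR-scaled correlation metric are assumed.
    """
    thetas = 2 * np.pi * np.arange(n_phases) / n_phases
    syms = np.exp(2j * np.pi * np.arange(M) / M)      # MPSK constellation
    logliks = []
    for th in thetas:
        # Correlation of each derotated sample with each candidate symbol.
        z = np.real((r * np.exp(-1j * th))[:, None] * np.conj(syms)[None, :])
        # Log of the per-sample mixture over M equiprobable symbols.
        logliks.append(np.log(np.mean(np.exp(2.0 * snr * z), axis=1)).sum())
    logliks = np.array(logliks)
    peak = logliks.max()
    return peak + np.log(np.mean(np.exp(logliks - peak)))  # log-mean-exp

def classify(r, snr, orders=(2, 4)):
    """Choose the PSK order with the larger approximate likelihood."""
    return max(orders, key=lambda M: mpsk_loglik(r, M, snr))

# BPSK test signal with an unknown carrier phase at Es/N0 = 4 (about 6 dB).
rng = np.random.default_rng(1)
bits = rng.integers(0, 2, 200)
snr = 4.0
noise = np.sqrt(0.5 / snr) * (rng.standard_normal(200) + 1j * rng.standard_normal(200))
r = np.exp(1j * (np.pi * bits + 0.7)) + noise
guess = classify(r, snr)
```

    Raising `n_phases` tightens the approximation to the phase integral at proportionally higher cost, which is exactly the complexity/accuracy trade-off the parameter l controls.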

  4. NASA Tech Briefs, May 2004

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Topics include: Embedded Heaters for Joining or Separating Plastic Parts; Curing Composite Materials Using Lower-Energy Electron Beams; Aluminum-Alloy-Matrix/Alumina-Reinforcement Composites; Fibrous-Ceramic/Aerogel Composite Insulating Tiles; Urethane/Silicone Adhesives for Bonding Flexing Metal Parts; Scalable Architecture for Multihop Wireless ad Hoc Networks; Improved Thermoplastic/Iron-Particle Transformer Cores; Cooperative Lander-Surface/Aerial Microflyer Missions for Mars Exploration; Dual-Frequency Airborne Scanning Rain Radar Antenna System; Eight-Channel Continuous Timer; Reduction of Phase Ambiguity in an Offset-QPSK Receiver; Ambient-Light-Canceling Camera Using Subtraction of Frames; Lightweight, Flexible, Thin, Integrated Solar-Power Packs; Windows(Registered Trademark)-Based Software Models Cyclic Oxidation Behavior; Software for Analyzing Sequences of Flow-Related Images; Improved Ball-and-Socket Docking Mechanism; Two-Stage Solenoid; Ordered Nanostructures Made Using Chaperonin Polypeptides; Low-Temperature Plasma Functionalization of Carbon Nanotubes; Improved Cryostat for Cooling a Wide Panel; Current Pulses Momentarily Enhance Thermoelectric Cooling; Hand-Held Color Meters Based on Interference Filters; Calculating Mass Diffusion in High-Pressure Binary Fluids; Fresnel Lenses for Wide-Aperture Optical Receivers; Increasing Accuracy in Computed Inviscid Boundary Conditions; Higher-Order Finite Elements for Computing Thermal Radiation; Radar for Monitoring Hurricanes from Geostationary Orbit; Time-Transfer System for Two Orbiting Spacecraft.

  5. Imaging Crustal Structure with Waveform and HV Ratio of Body-wave Receiver Function

    NASA Astrophysics Data System (ADS)

    Chong, J.; Chu, R.; Ni, S.; Meng, Q.; Guo, A.

    2017-12-01

    It is known that the receiver function places little constraint on absolute velocity, and joint inversion of receiver functions and surface wave dispersion has been widely applied to reduce the non-uniqueness between velocity and interface depth. However, some studies indicate that the receiver function itself is capable of determining the absolute shear wave velocity. In this study, we propose to measure the receiver function HV ratio, which takes advantage of the amplitude information of the radial and vertical receiver functions to constrain the shear-wave velocity. Numerical analysis indicates that the receiver function HV ratio is sensitive to the average shear wave velocity in the depth range it samples, and can help to reduce the non-uniqueness of receiver function waveform inversion. A joint inversion scheme has been developed, and both synthetic tests and real data applications proved the feasibility of the joint inversion. The method has been applied to the dense seismic array of the ChinArray program in SE Tibet during the time period from August 2011 to August 2012 (ChinArray-Himalaya, 2011). The measurements of receiver function HV ratio reveal the lateral variation of the tectonics in the study region, and the main features of the velocity structure imaged by the new joint inversion method are consistent with previous studies. KEYWORDS: receiver function HV ratio, receiver function waveform inversion, crustal structure. References: ChinArray-Himalaya. 2011. China Seismic Array waveform data of Himalaya Project. Institute of Geophysics, China Earthquake Administration. doi:10.12001/ChinArray.Data.Himalaya. Jiajun Chong, Risheng Chu*, Sidao Ni, Qingjun Meng, Aizhi Guo, 2017. Receiver Function HV Ratio, a New Measurement for Reducing Non-uniqueness of Receiver Function Waveform Inversion. (under revision)

  6. South Lake Elementary students enjoy gift of computers

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Nancy Nichols, principal of South Lake Elementary School, Titusville, Fla., joins students in teacher Michelle Butler's sixth grade class who are unwrapping computer equipment donated by Kennedy Space Center. South Lake is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the NASA K-12 Education Services Office at KSC. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated.

  7. Cambridge Elementary students enjoy gift of computers

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Children at Cambridge Elementary School, Cocoa, Fla., eagerly unwrap computer equipment donated by Kennedy Space Center. Cambridge is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the NASA K-12 Education Services Office at KSC. Behind the children is Jim Thurston, a school volunteer and retired employee of USBI, who shared in the project. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated.

  8. Spreading Sequence System for Full Connectivity Relay Network

    NASA Technical Reports Server (NTRS)

    Kwon, Hyuck M. (Inventor); Pham, Khanh D. (Inventor); Yang, Jie (Inventor)

    2018-01-01

    Fully connected uplink and downlink relay network systems using pseudo-noise (PN) spreading and despreading sequences chosen to maximize the signal-to-interference-plus-noise ratio. The relay network systems comprise one or more transmitting units, relays, and receiving units connected via a communication network. The transmitting units, relays, and receiving units each may include a computer for performing the methods and steps described herein and transceivers for transmitting and/or receiving signals. The computer encodes and/or decodes communication signals via optimum adaptive PN sequences found by employing Cholesky decompositions and singular value decompositions (SVD). The PN sequences employ channel state information (CSI) to compute the optimal sequences more effectively and more securely.
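
    The SVD step can be illustrated in miniature: for a known channel matrix, the unit-energy spreading sequence that maximizes received signal energy (and hence SINR in the noise-limited case) is the dominant right singular vector. This sketch assumes a generic random channel; the patent's Cholesky-based construction and relay topology are not reproduced, and the dimensions are invented.

```python
import numpy as np

def best_sequence(H):
    """Unit-energy spreading sequence s maximizing ||H s||^2.

    The maximizer is the right singular vector of the channel matrix H
    belonging to its largest singular value (NumPy returns singular
    values in descending order, so it is the first row of Vh, conjugated).
    """
    _, _, Vh = np.linalg.svd(H)
    return Vh[0].conj()

rng = np.random.default_rng(2)
# Hypothetical 8-antenna receiver, 16-chip sequences, random complex CSI.
H = rng.standard_normal((8, 16)) + 1j * rng.standard_normal((8, 16))
s = best_sequence(H)
svd_gain = float(np.linalg.norm(H @ s) ** 2)
# A plain random +/-1 PN chip sequence of the same energy, for comparison.
pn = rng.choice([-1.0, 1.0], 16) / np.sqrt(16)
pn_gain = float(np.linalg.norm(H @ pn) ** 2)
```

    The adaptive sequence attains the channel's largest squared singular value, which no fixed unit-energy PN sequence can exceed.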

  9. Multi-input and binary reproducible, high bandwidth floating point adder in a collective network

    DOEpatents

    Chen, Dong; Eisley, Noel A.; Heidelberger, Philip; Steinmacher-Burow, Burkhard

    2016-11-15

    To add floating point numbers in a parallel computing system, a collective logic device receives the floating point numbers from computing nodes. The collective logic device converts the floating point numbers to integer numbers, adds the integer numbers, and generates a summation of the integer numbers. The collective logic device then converts the summation back to a floating point number. The collective logic device performs the receiving, the converting of the floating point numbers, the adding, the generating, and the converting of the summation in one pass. One pass indicates that the computing nodes send inputs only once to the collective logic device and receive outputs only once from the collective logic device.
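
    The float-to-integer trick generalizes to software: converting each addend to a fixed-point integer makes the addition exact and therefore independent of operand order, which is what gives such a collective network binary-reproducible results. The scaling scheme below is a simplified sketch, not the patented hardware's format, and `scale_bits` is an invented parameter.

```python
import random

def reproducible_sum(values, scale_bits=40):
    """Order-independent summation: scale each float to a fixed-point
    integer, add the integers exactly (Python ints never overflow),
    then scale the total back. Any permutation of `values` yields the
    same result bit-for-bit."""
    scale = 1 << scale_bits
    total = sum(round(v * scale) for v in values)  # exact integer addition
    return total / scale

rng = random.Random(42)
xs = [rng.uniform(-1.0, 1.0) for _ in range(1000)]
forward = reproducible_sum(xs)
backward = reproducible_sum(list(reversed(xs)))
# forward == backward exactly, whereas naive float sums of the two
# orderings may differ in the last bits: float addition is not associative.
```

    The cost is a bounded quantization error per addend (here at most 2**-41), traded for exact reproducibility across operand orderings.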

  10. Asynchronous broadcast for ordered delivery between compute nodes in a parallel computing system where packet header space is limited

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, Sameer

    Disclosed is a mechanism on receiving processors in a parallel computing system for providing order to data packets received from a broadcast call and for distinguishing data packets received at nodes from several incoming asynchronous broadcast messages where header space is limited. In the present invention, processors at lower leaves of a tree do not need to obtain a broadcast message by directly accessing the data in a root processor's buffer. Instead, each subsequent intermediate node's rank id information is squeezed into the software header of packet headers. In turn, the entire broadcast message is not transferred from the root processor to each processor in a communicator but instead is replicated on several intermediate nodes, which then replicate the message to nodes in lower leaves. Hence, the intermediate compute nodes become "virtual root compute nodes" for the purpose of replicating the broadcast message to lower levels of a tree.

  11. The application of rapid prototyping technique in chin augmentation.

    PubMed

    Li, Min; Lin, Xin; Xu, Yongchen

    2010-04-01

    This article discusses the application of computer-aided design and rapid prototyping techniques in prosthetic chin augmentation for mild microgenia. Nine cases of mild microgenia underwent an electron beam computed tomography scan. Then we performed three-dimensional reconstruction and operative design using computer software. According to the design, we determined the shape and size of the prostheses and made an individualized prosthesis for each chin augmentation with the rapid prototyping technique. With the application of computer-aided design and a rapid prototyping technique, we could determine the shape, size, and embedding location accurately. Prefabricating the individual prosthesis model is useful in improving the accuracy of treatment. In the nine cases of mild microgenia, three received a silicone implant, four received an ePTFE implant, and two received a Medpor implant. All patients were satisfied with the results. During follow-up at 6-12 months, all patients remained satisfied. The application of computer-aided design and rapid prototyping techniques can offer surgeons the ability to design an individualized ideal prosthesis for each patient.

  12. Embedding global barrier and collective in torus network with each node combining input from receivers according to class map for output to senders

    DOEpatents

    Chen, Dong; Coteus, Paul W; Eisley, Noel A; Gara, Alan; Heidelberger, Philip; Senger, Robert M; Salapura, Valentina; Steinmacher-Burow, Burkhard; Sugawara, Yutaka; Takken, Todd E

    2013-08-27

    Embodiments of the invention provide a method, system and computer program product for embedding a global barrier and global interrupt network in a parallel computer system organized as a torus network. The computer system includes a multitude of nodes. In one embodiment, the method comprises taking inputs from a set of receivers of the nodes, dividing the inputs from the receivers into a plurality of classes, combining the inputs of each of the classes to obtain a result, and sending said result to a set of senders of the nodes. Embodiments of the invention provide a method, system and computer program product for embedding a collective network in a parallel computer system organized as a torus network. In one embodiment, the method comprises adding to a torus network a central collective logic to route messages among at least a group of nodes in a tree structure.

  13. A Software Defined Radio Based Airplane Communication Navigation Simulation System

    NASA Astrophysics Data System (ADS)

    He, L.; Zhong, H. T.; Song, D.

    2018-01-01

    Radio communication and navigation systems play an important role in ensuring the safety of a civil airplane in flight. Function and performance should be tested before these systems are installed on board. Conventionally, a set of transmitters and receivers is needed for each system, so all the equipment occupies a lot of space and is costly. In this paper, software defined radio technology is applied to design a common-hardware communication and navigation ground simulation system, which can host multiple airplane systems with different operating frequencies, such as HF, VHF, VOR, ILS, ADF, etc. We use a broadband analog front-end hardware platform, the universal software radio peripheral (USRP), to transmit/receive signals in different frequency bands. The software is developed in LabVIEW on a computer, which interfaces with the USRP through Ethernet and is responsible for communication and navigation signal processing and system control. An integrated testing system is established to perform functional tests and performance verification of the simulated signals, which demonstrates the feasibility of our design. The system is a low-cost, common hardware platform for multiple airplane systems, which provides a helpful reference for integrated avionics design.

  14. Control Software for Advanced Video Guidance Sensor

    NASA Technical Reports Server (NTRS)

    Howard, Richard T.; Book, Michael L.; Bryan, Thomas C.

    2006-01-01

    Embedded software has been developed specifically for controlling an Advanced Video Guidance Sensor (AVGS). A Video Guidance Sensor is an optoelectronic system that provides guidance for automated docking of two vehicles. Such a system includes pulsed laser diodes and a video camera, the output of which is digitized. From the positions of digitized target images and known geometric relationships, the relative position and orientation of the vehicles are computed. The present software consists of two subprograms running in two processors that are parts of the AVGS. The subprogram in the first processor receives commands from an external source, checks the commands for correctness, performs commanded non-image-data-processing control functions, and sends image data processing parts of commands to the second processor. The subprogram in the second processor processes image data as commanded. Upon power-up, the software performs basic tests of functionality, then effects a transition to a standby mode. When a command is received, the software goes into one of several operational modes (e.g. acquisition or tracking). The software then returns, to the external source, the data appropriate to the command.

  15. Systems and methods for detecting and processing

    DOEpatents

    Johnson, Michael M [Livermore, CA; Yoshimura, Ann S [Tracy, CA

    2006-03-28

    Embodiments of the present invention provide systems and methods for detecting. Sensing modules are provided in communication with one or more detectors. In some embodiments, detectors are provided that are sensitive to chemical, biological, or radiological agents. Embodiments of sensing modules include processing capabilities to analyze, perform computations on, and/or run models to predict or interpret data received from one or more detectors. Embodiments of sensing modules form various network configurations with one another and/or with one or more data aggregation devices. Some embodiments of sensing modules include power management functionalities.

  16. Apparatus for monitoring high temperature ultrasonic characterization

    DOEpatents

    Lanagan, M.T.; Kupperman, D.S.; Yaconi, G.A.

    1998-03-24

    A method and an apparatus for nondestructively detecting and evaluating changes in the microstructural properties of a material by employing one or more magnetostrictive transducers linked to the material by means of one or more sonic signal conductors. The magnetostrictive transducer or transducers are connected to a pulser/receiver, which in turn is connected to an oscilloscope. The oscilloscope is connected to a computer, which employs an algorithm to evaluate changes in the velocity of a signal transmitted to the material sample as a function of time and temperature. 6 figs.

  17. Ballistic-Electron-Emission Microscope

    NASA Technical Reports Server (NTRS)

    Kaiser, William J.; Bell, L. Douglas

    1990-01-01

    Ballistic-electron-emission microscope (BEEM) employs scanning tunneling-microscopy (STM) methods for nondestructive, direct electrical investigation of buried interfaces, such as interface between semiconductor and thin metal film. In BEEM, there are at least three electrodes: emitting tip, biasing electrode, and collecting electrode, receiving current crossing interface under investigation. Signal-processing device amplifies electrode signals and converts them into form usable by computer. Produces spatial images of surface by scanning tip; in addition, provides high-resolution images of buried interface under investigation. Spectroscopic information extracted by measuring collecting-electrode current as function of one of interelectrode voltages.

  18. FUZZY COMPUTATIONAL MODELS TO EVALUATE THE EFFECTS OF AIR POLLUTION ON CHILDREN.

    PubMed

    David, Gleise Silva; Rizol, Paloma Maria Silva Rocha; Nascimento, Luiz Fernando Costa

    2018-01-01

    To build a fuzzy computational model to estimate the number of hospitalizations of children aged up to 10 years due to respiratory conditions, based on pollutants and climatic factors, in the city of São José do Rio Preto, Brazil. A computational model was constructed using fuzzy logic. The model has 4 inputs, each with 2 membership functions, generating 16 rules, and an output with 5 membership functions, based on Mamdani's method, to estimate the association between the pollutants and the number of hospitalizations. Hospitalization data from 2011-2013 were obtained from DATASUS, and concentrations of the pollutants particulate matter (PM10) and nitrogen dioxide (NO2), wind speed, and temperature were obtained from the Environmental Company of São Paulo State (Cetesb). A total of 1,161 children were hospitalized in the period, and the mean pollutant concentrations were 36 and 51 µg/m³ for PM10 and NO2, respectively. The best values of the Pearson correlation (0.34) and of accuracy measured by the Receiver Operating Characteristic (ROC) curve (NO2: 96.7%; PM10: 90.4%) were for hospitalizations on the same day as exposure. The model was effective in predicting the number of hospitalizations of children and could be used as a tool in hospital management in the studied region.
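
    Mamdani inference of the kind described can be sketched in miniature with 2 inputs and 2 membership functions each (4 rules instead of the paper's 16). All ranges, membership functions, and rule consequents below are invented for illustration and are not the study's calibrated model.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def mamdani(pm10, no2):
    """Tiny Mamdani model: 2 inputs x 2 membership functions -> 4 rules,
    min for AND, clipping for implication, centroid defuzzification."""
    low_pm, high_pm = tri(pm10, -50, 0, 60), tri(pm10, 40, 100, 150)
    low_no2, high_no2 = tri(no2, -50, 0, 60), tri(no2, 40, 100, 150)
    y = np.linspace(0.0, 50.0, 501)        # hospitalizations/day (assumed scale)
    few, many = tri(y, -10, 0, 25), tri(y, 15, 40, 60)
    aggregate = np.zeros_like(y)
    for strength, consequent in [(min(low_pm, low_no2), few),
                                 (min(low_pm, high_no2), many),
                                 (min(high_pm, low_no2), many),
                                 (min(high_pm, high_no2), many)]:
        # Clip each consequent by its rule's firing strength, then take max.
        aggregate = np.maximum(aggregate, np.minimum(strength, consequent))
    return float(np.sum(y * aggregate) / np.sum(aggregate))  # centroid

clean = mamdani(10, 10)       # low PM10, low NO2
polluted = mamdani(90, 90)    # high PM10, high NO2
```

    Scaling this pattern to 4 inputs with 2 membership functions each yields the paper's 16-rule base; only the rule table and membership shapes change.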

  19. Noise reduction of coincidence detector output by the inferior colliculus of the barn owl.

    PubMed

    Christianson, G Björn; Peña, José Luis

    2006-05-31

    A recurring theme in theoretical work is that integration over populations of similarly tuned neurons can reduce neural noise. However, there are relatively few demonstrations of an explicit noise reduction mechanism in a neural network. Here we demonstrate that the brainstem of the barn owl includes a stage of processing apparently devoted to increasing the signal-to-noise ratio in the encoding of the interaural time difference (ITD), one of two primary binaural cues used to compute the position of a sound source in space. In the barn owl, the ITD is processed in a dedicated neural pathway that terminates at the core of the inferior colliculus (ICcc). The actual locus of the computation of the ITD is before ICcc in the nucleus laminaris (NL), and ICcc receives no inputs carrying information that did not originate in NL. Unlike in NL, the rate-ITD functions of ICcc neurons require as little as a single stimulus presentation per ITD to show coherent ITD tuning. ICcc neurons also displayed a greater dynamic range with a maximal difference in ITD response rates approximately double that seen in NL. These results indicate that ICcc neurons perform a computation functionally analogous to averaging across a population of similarly tuned NL neurons.
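
    The noise-reduction-by-pooling computation attributed to ICcc can be illustrated with an idealized numerical sketch (the tuning curve, noise model, and population size below are invented, not the owl data): averaging N independently noisy copies of the same rate-ITD function cuts the trial-to-trial error by roughly sqrt(N).

```python
import numpy as np

rng = np.random.default_rng(0)

# Idealized rate-ITD tuning curve shared by a population of NL-like neurons
itd = np.linspace(-200, 200, 81)           # microseconds
signal = np.cos(2 * np.pi * itd / 200.0)   # periodic ITD tuning

def noisy_response(n_trials):
    """Responses as the tuning curve plus independent per-trial noise."""
    return signal + rng.normal(0, 1.0, size=(n_trials, itd.size))

# A downstream (ICcc-like) stage that averages 100 similarly tuned inputs
single = noisy_response(1).mean(axis=0)
pooled = noisy_response(100).mean(axis=0)

err_single = np.std(single - signal)
err_pooled = np.std(pooled - signal)
print(err_single, err_pooled)  # pooled error roughly 10x smaller
```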

  20. An executable specification for the message processor in a simple combining network

    NASA Technical Reports Server (NTRS)

    Middleton, David

    1995-01-01

    While the primary function of the network in a parallel computer is to communicate data between processors, it is often useful if the network can also perform rudimentary calculations. That is, some simple processing ability in the network itself, particularly for performing parallel prefix computations, can reduce both the volume of data being communicated and the computational load on the processors proper. Unfortunately, typical implementations of such networks require a large fraction of the hardware budget, and so combining networks are viewed as being impractical. The FFP Machine has such a combining network, and various characteristics of the machine allow a good deal of simplification in the network design. Despite being simple in construction, however, the network relies on many subtle details to work correctly. This paper describes an executable model of the network which will serve several purposes. It provides a complete and detailed description of the network which can substantiate its ability to support necessary functions. It provides an environment in which algorithms to be run on the network can be designed and debugged more easily than they would be on physical hardware. Finally, it provides the foundation for exploring the design of the message receiving facility which connects the network to the individual processors.
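
    Parallel prefix, the computation a combining network is designed to accelerate, can be sketched with the classic two-sweep (Blelloch) scan. In hardware each tree level runs concurrently; this sketch simulates the levels sequentially.

```python
def prefix_sum(values, op=lambda a, b: a + b):
    """Work-efficient parallel prefix (Blelloch scan), simulated sequentially.

    Returns the exclusive prefix: out[i] = op-fold of values[:i].
    In a combining network each 'step' level would execute in parallel.
    """
    n = len(values)
    assert n and (n & (n - 1)) == 0, "power-of-two length for simplicity"
    a = list(values)
    # Up-sweep: build partial sums up the tree
    step = 1
    while step < n:
        for i in range(2 * step - 1, n, 2 * step):
            a[i] = op(a[i - step], a[i])
        step *= 2
    # Down-sweep: distribute prefixes back down the tree
    a[n - 1] = 0
    step = n // 2
    while step >= 1:
        for i in range(2 * step - 1, n, 2 * step):
            left = a[i - step]
            a[i - step] = a[i]
            a[i] = op(left, a[i])
        step //= 2
    return a

print(prefix_sum([1, 2, 3, 4]))  # → [0, 1, 3, 6]
```

    Passing a different associative `op` (max, logical OR, etc.) yields the other combining operations such a network might support.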

  1. The software for automatic creation of the formal grammars used by speech recognition, computer vision, editable text conversion systems, and some new functions

    NASA Astrophysics Data System (ADS)

    Kardava, Irakli; Tadyszak, Krzysztof; Gulua, Nana; Jurga, Stefan

    2017-02-01

    For more flexible environmental perception by artificial intelligence, supporting software modules are needed that can automate the creation of a specific language syntax and perform further analysis for relevant decisions based on semantic functions. With the proposed approach, pairs of formal rules can be created for given sentences (in the case of natural languages) or statements (in the case of special languages) with the help of computer vision, speech recognition, or an editable-text conversion system, and then automatically improved. In other words, we have developed an approach that significantly improves the automation of the training process of artificial intelligence, which as a result yields a higher level of self-developing skills independent of the user. On the basis of this approach we have developed a software demo version, which includes the algorithm and software code implementing all of the above-mentioned components (computer vision, speech recognition, and an editable-text conversion system). The program can work in multi-stream mode and simultaneously create a syntax based on information received from several sources.

  2. Cognitive Predictors of Work Among Social Security Disability Insurance Beneficiaries With Psychiatric Disorders Enrolled in IPS Supported Employment.

    PubMed

    McGurk, Susan R; Drake, Robert E; Xie, Haiyi; Riley, Jarnee; Milfort, Roline; Hale, Thomas W; Frey, William

    2018-01-13

    Impaired cognitive functioning is a significant predictor of work dysfunction in schizophrenia. Less is known, however, about the relationship between cognition and work in people with less severe disorders and relatively normal cognitive functioning. This secondary analysis evaluated cognitive predictors of work in Social Security Disability Insurance (SSDI) beneficiaries with a recent work history who were randomized to receive mental health services, supported employment, and freedom from work disincentives over a 2-year study period in the Mental Health Treatment Study. Of the 1045 participants randomized to the treatment package, 945 (90.4%) received a cognitive assessment at study entry. Competitive work activity was evaluated using a computer-assisted timeline follow-back calendar at baseline and quarterly for 24 months. Mood disorders were the most common psychiatric diagnoses (64.9%), followed by schizophrenia or schizoaffective disorder (35.1%). Tobit regression analyses predicting the average number of hours worked per week, controlling for demographic characteristics, diagnosis, and work history, indicated that the cognitive composite score (P < .01) and verbal learning subscale scores (P < .001) were associated with fewer hours of weekly work over the study period. Cognitive functioning predicted work over 2 years in SSDI beneficiaries with mood or schizophrenia-spectrum disorders who were receiving supported employment and mental health interventions, despite a relative absence of cognitive impairment in the study participants. The findings suggest cognitive functioning contributes to competitive work outcomes in persons with psychiatric disorders who have relatively unimpaired cognitive abilities, even under optimal conditions of treatment and vocational support. © The Author(s) 2017. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  3. Stabilized NADH as a Countermeasure for Jet Lag

    NASA Technical Reports Server (NTRS)

    Kay, Gary G.; Viirre, Erik; Clark, Jonathan

    2001-01-01

    Current remedies for jet lag (phototherapy, melatonin, stimulant, and sedative medications) are limited in efficacy and practicality. The efficacy of a stabilized, sublingual form of reduced nicotinamide adenine dinucleotide (NADH, ENADAlert, Menuco Corp.) as a countermeasure for jet lag was examined. Because NADH increases cellular production of ATP and facilitates dopamine synthesis, it may counteract the effects of jet lag on cognitive functioning and sleepiness. Thirty-five healthy, employed subjects participated in this double-blind, placebo-controlled study. Training and baseline testing were conducted on the West Coast before subjects flew overnight to the East Coast, where they would experience a 3-hour time difference. Upon arrival, individuals were randomly assigned to receive either 20 mg of sublingual stabilized NADH (n=18) or identical placebo tablets (n=17). All participants completed computer-administered tests (including CogScreen) to assess changes in cognitive functioning, mood, and sleepiness in the morning and afternoon. Jet lag resulted in increased sleepiness for over half the participants and deterioration of cognitive functioning for approximately one third. The morning following the flight, subjects experienced lapses of attention in addition to disruptions in working memory, divided attention, and visual perceptual speed. Individuals who received NADH performed significantly better on 5 of 8 cognitive and psychomotor test measures (P ≤ .05) and showed a trend for better performance on the other three measures (P ≤ .10). Subjects also reported less sleepiness compared with those who received placebo. No adverse effects were observed with NADH treatment. Stabilized NADH significantly reduced jet lag-induced disruptions of cognitive functioning, was easily administered, and was found to have no adverse side effects.

  4. 45 CFR 205.56 - Requirements governing the use of income and eligibility information.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...) of the Social Security Act must provide that: (a) The State agency will use the information obtained... received from the Internal Revenue Service, and earnings information received from the Social Security... Federal computer matching program that is subject to the requirements in the Computer Matching and Privacy...

  5. 45 CFR 205.56 - Requirements governing the use of income and eligibility information.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...) of the Social Security Act must provide that: (a) The State agency will use the information obtained... received from the Internal Revenue Service, and earnings information received from the Social Security... Federal computer matching program that is subject to the requirements in the Computer Matching and Privacy...

  6. 45 CFR 205.56 - Requirements governing the use of income and eligibility information.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...) of the Social Security Act must provide that: (a) The State agency will use the information obtained... received from the Internal Revenue Service, and earnings information received from the Social Security... Federal computer matching program that is subject to the requirements in the Computer Matching and Privacy...

  7. 45 CFR 205.56 - Requirements governing the use of income and eligibility information.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...) of the Social Security Act must provide that: (a) The State agency will use the information obtained... received from the Internal Revenue Service, and earnings information received from the Social Security... Federal computer matching program that is subject to the requirements in the Computer Matching and Privacy...

  8. 45 CFR 205.56 - Requirements governing the use of income and eligibility information.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...) of the Social Security Act must provide that: (a) The State agency will use the information obtained... received from the Internal Revenue Service, and earnings information received from the Social Security... Federal computer matching program that is subject to the requirements in the Computer Matching and Privacy...

  9. Achieving effective learning effects in the blended course: a combined approach of online self-regulated learning and collaborative learning with initiation.

    PubMed

    Tsai, Chia-Wen

    2011-09-01

    In many countries, undergraduates are required to take at least one introductory computer course to enhance their computer literacy and computing skills. However, application software education in Taiwan can hardly be deemed effective in developing students' practical computing skills. The author applied online self-regulated learning (SRL) and collaborative learning (CL) with initiation in a blended computing course and examined the effects of different combinations on enhancing students' computing skills. Four classes, comprising 221 students, participated in this study. The online SRL and CL with initiation (G1, n = 53), online CL with initiation (G2, n = 68), and online CL without initiation (G3, n = 68) were experimental groups, and the last class, receiving traditional lecture (G4, n = 32), was the control group. The results of this study show that students who received the intervention of online SRL and CL with initiation attained significantly better grades for practical computing skills than the other groups, whereas those who received the traditional lectures had the statistically poorest grades among the four classes. The implications for schools and educators who plan to provide online or blended learning for their students, particularly in computing courses, are also provided in this study.

  10. Early intensive hand rehabilitation after spinal cord injury ("Hands On"): a protocol for a randomised controlled trial.

    PubMed

    Harvey, Lisa A; Dunlop, Sarah A; Churilov, Leonid; Hsueh, Ya-Seng Arthur; Galea, Mary P

    2011-01-17

    Loss of hand function is one of the most devastating consequences of spinal cord injury. Intensive hand training provided on an instrumented exercise workstation in conjunction with functional electrical stimulation may enhance neural recovery and hand function. The aim of this trial is to compare usual care with an 8-week program of intensive hand training and functional electrical stimulation. A multicentre randomised controlled trial will be undertaken. Seventy-eight participants with recent tetraplegia (C2 to T1 motor complete or incomplete) undergoing inpatient rehabilitation will be recruited from seven spinal cord injury units in Australia and New Zealand and will be randomised to a control or experimental group. Control participants will receive usual care. Experimental participants will receive usual care and an 8-week program of intensive unilateral hand training using an instrumented exercise workstation and functional electrical stimulation. Participants will drive the functional electrical stimulation of their target hands via a behind-the-ear Bluetooth device, which is sensitive to tooth clicks. The Bluetooth device will enable the use of various manipulanda to practice functional activities embedded within computer-based games and activities. Training will be provided for one hour, 5 days per week, during the 8-week intervention period. The primary outcome is the Action Research Arm Test. Secondary outcomes include measurements of strength, sensation, function, quality of life and cost effectiveness. All outcomes will be taken at baseline, 8 weeks, 6 months and 12 months by assessors blinded to group allocation. Recruitment commenced in December 2009. The results of this trial will determine the effectiveness of an 8-week program of intensive hand training with functional electrical stimulation. Trial registration: NCT01086930 (12th March 2010); ACTRN12609000695202 (12th August 2009).

  11. The calculation of electromagnetic fields in the Fresnel and Fraunhofer regions using numerical integration methods

    NASA Technical Reports Server (NTRS)

    Schmidt, R. F.

    1971-01-01

    Some results obtained with a digital computer program written at Goddard Space Flight Center to obtain electromagnetic fields scattered by perfectly reflecting surfaces are presented. For purposes of illustration a paraboloidal reflector was illuminated at radio frequencies in the simulation for both receiving and transmitting modes of operation. Fields were computed in the Fresnel and Fraunhofer regions. A dual-reflector system (Cassegrain) was also simulated for the transmitting case, and fields were computed in the Fraunhofer region. Appended results include derivations which show that the vector Kirchhoff-Kottler formulation has an equivalent form requiring only incident magnetic fields as a driving function. Satisfaction of the radiation conditions at infinity by the equivalent form is demonstrated by a conversion from Cartesian to spherical vector operators. A subsequent development presents the formulation by which Fresnel or Fraunhofer patterns are obtainable for dual-reflector systems. A discussion of the time-average Poynting vector is also appended.
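
    As a minimal numerical-integration analogue of the reflector computations described above, the sketch below evaluates a scalar Fraunhofer integral for a uniformly illuminated slit by direct quadrature and checks it against the closed-form sinc pattern. The geometry and frequency are invented; the actual program handled vector fields over paraboloidal and Cassegrain surfaces.

```python
import numpy as np

# Fraunhofer far field of a uniformly illuminated slit of width a, by
# direct numerical integration of E(theta) = ∫ exp(-j k x sin(theta)) dx
wavelength = 0.03          # meters (10 GHz, illustrative)
k = 2 * np.pi / wavelength
a = 0.3                    # aperture (slit) width, meters

x = np.linspace(-a / 2, a / 2, 2001)   # aperture sample points
dx = x[1] - x[0]
theta = np.linspace(-0.3, 0.3, 241)    # observation angles, radians

phase = np.exp(-1j * k * np.outer(np.sin(theta), x))
E = phase.sum(axis=1) * dx             # rectangle-rule quadrature

# Analytic check: |E| = a * |sinc(a sin(theta) / wavelength)|
analytic = a * np.abs(np.sinc(a * np.sin(theta) / wavelength))
err = np.max(np.abs(np.abs(E) - analytic))
print(err)  # small discretization error
```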

  12. KSC-99pp1226

    NASA Image and Video Library

    1999-10-06

    Nancy Nichols, principal of South Lake Elementary School, Titusville, Fla., joins students in teacher Michelle Butler's sixth grade class who are unwrapping computer equipment donated by Kennedy Space Center. South Lake is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the Nasa k-12 Education Services Office at ksc. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated

  13. Detection probability of EBPSK-MODEM system

    NASA Astrophysics Data System (ADS)

    Yao, Yu; Wu, Lenan

    2016-07-01

    Since the impacting filter-based receiver is able to transform phase modulation into amplitude peaks, a simple threshold decision can detect the Extended Binary Phase Shift Keying (EBPSK) modulated ranging signal in a noisy environment. In this paper, an analysis of the EBPSK-MODEM system output gives the probability density function for EBPSK modulated signals plus noise. The equation of detection probability (Pd) for fluctuating and non-fluctuating targets is deduced. A comparison of the Pd for the EBPSK-MODEM system and a pulse radar receiver is made, and some results are plotted. Moreover, the probability curves of the system with several modulation parameters are analysed. When the modulation parameter is at least 6, the detection performance of the EBPSK-MODEM system exceeds that of a traditional radar system. In addition to theoretical considerations, computer simulations are provided to illustrate the performance.
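
    A threshold decision on an amplitude peak in Gaussian noise can be sketched as follows. Note this uses the textbook Gaussian statistic for a non-fluctuating target, not the specific density the paper derives for the impacting-filter output.

```python
from math import erfc, sqrt

def Q(z):
    """Gaussian tail probability Q(z) = P(N(0,1) > z)."""
    return 0.5 * erfc(z / sqrt(2.0))

def detection_probability(snr_db, pfa=1e-4):
    """Pd of a simple amplitude-threshold detector for a non-fluctuating
    target in unit-variance Gaussian noise (illustrative only)."""
    sigma = 1.0
    amp = sigma * 10 ** (snr_db / 20.0)
    # Set the threshold T from the desired false-alarm rate: pfa = Q(T/sigma);
    # invert numerically by bisection since Q has no closed-form inverse here.
    lo, hi = 0.0, 20.0
    for _ in range(80):
        mid = (lo + hi) / 2
        if Q(mid / sigma) > pfa:
            lo = mid
        else:
            hi = mid
    T = (lo + hi) / 2
    # Pd: probability that signal-plus-noise amplitude exceeds T
    return Q((T - amp) / sigma)

print(detection_probability(5), detection_probability(15))
```

    As expected, Pd rises steeply with SNR for a fixed false-alarm rate.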

  14. On-board Attitude Determination System (OADS). [for advanced spacecraft missions

    NASA Technical Reports Server (NTRS)

    Carney, P.; Milillo, M.; Tate, V.; Wilson, J.; Yong, K.

    1978-01-01

    The requirements, capabilities and system design for an on-board attitude determination system (OADS) to be flown on advanced spacecraft missions were determined. Based upon the OADS requirements and system performance evaluation, a preliminary on-board attitude determination system is proposed. The proposed OADS system consists of one NASA Standard IRU (DRIRU-2) as the primary attitude determination sensor, two improved NASA Standard star trackers (SST) for periodic updates of attitude information, a GPS receiver to provide on-board space vehicle position and velocity vector information, and a multiple-microcomputer system for data processing and attitude determination functions. The functional block diagram of the proposed OADS system is shown. The computational requirements are evaluated based upon this proposed OADS system.

  15. Adaptive mixed reality rehabilitation improves quality of reaching movements more than traditional reaching therapy following stroke.

    PubMed

    Duff, Margaret; Chen, Yinpeng; Cheng, Long; Liu, Sheng-Min; Blake, Paul; Wolf, Steven L; Rikakis, Thanassis

    2013-05-01

    Adaptive mixed reality rehabilitation (AMRR) is a novel integration of motion capture technology and high-level media computing that provides precise kinematic measurements and engaging multimodal feedback for self-assessment during a therapeutic task. We describe the first proof-of-concept study to compare outcomes of AMRR and traditional upper-extremity physical therapy. Two groups of participants with chronic stroke received either a month of AMRR therapy (n = 11) or matched dosing of traditional repetitive task therapy (n = 10). Participants were right handed, between 35 and 85 years old, and could independently reach to and at least partially grasp an object in front of them. Upper-extremity clinical scale scores and kinematic performances were measured before and after treatment. Both groups showed increased function after therapy, demonstrated by statistically significant improvements in Wolf Motor Function Test and upper-extremity Fugl-Meyer Assessment (FMA) scores, with the traditional therapy group improving significantly more on the FMA. However, only participants who received AMRR therapy showed a consistent improvement in kinematic measurements, both for the trained task of reaching to grasp a cone and the untrained task of reaching to push a lighted button. AMRR may be useful in improving both functionality and the kinematics of reaching. Further study is needed to determine if AMRR therapy induces long-term changes in movement quality that foster better functional recovery.

  16. Controlling user access to electronic resources without password

    DOEpatents

    Smith, Fred Hewitt

    2015-06-16

    Described herein are devices and techniques for remotely controlling user access to a restricted computer resource. The process includes pre-determining an association of the restricted computer resource and computer-resource-proximal environmental information. Indicia of user-proximal environmental information are received from a user requesting access to the restricted computer resource. Received indicia of user-proximal environmental information are compared to associated computer-resource-proximal environmental information. User access to the restricted computer resource is selectively granted responsive to a favorable comparison in which the user-proximal environmental information is sufficiently similar to the computer-resource proximal environmental information. In at least some embodiments, the process further includes comparing user-supplied biometric measure and comparing it with a predetermined association of at least one biometric measure of an authorized user. Access to the restricted computer resource is granted in response to a favorable comparison.
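
    A software sketch of the comparison step might look like the following, with all environmental features, names, and thresholds invented for illustration (the patent does not specify them):

```python
# Hypothetical "computer-resource-proximal" environmental fingerprint,
# pre-associated with the restricted resource.
RESOURCE_ENV = {"wifi_ssids": {"lab-ap-1", "lab-ap-2", "guest"},
                "ambient_db": 42.0, "grid_freq_hz": 60.02}

def access_granted(user_env, min_ssid_overlap=2, max_db_delta=6.0,
                   max_freq_delta=0.05):
    """Grant access only if the user-proximal environmental indicia are
    sufficiently similar to the resource-proximal ones (illustrative)."""
    ssid_overlap = len(RESOURCE_ENV["wifi_ssids"] & user_env["wifi_ssids"])
    return (ssid_overlap >= min_ssid_overlap
            and abs(user_env["ambient_db"] - RESOURCE_ENV["ambient_db"]) <= max_db_delta
            and abs(user_env["grid_freq_hz"] - RESOURCE_ENV["grid_freq_hz"]) <= max_freq_delta)

near = {"wifi_ssids": {"lab-ap-1", "lab-ap-2"}, "ambient_db": 40.0,
        "grid_freq_hz": 60.00}
far = {"wifi_ssids": {"cafe-ap"}, "ambient_db": 70.0, "grid_freq_hz": 59.90}
print(access_granted(near), access_granted(far))  # → True False
```

    A real system would add the biometric comparison the patent also describes; this sketch covers only the environmental-similarity gate.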

  17. The influence of the uplink noise on the performance of satellite data transmission systems

    NASA Astrophysics Data System (ADS)

    Dewal, Vrinda P.

    The problem of transmission of binary phase shift keying (BPSK) modulated digital data through a bandlimited nonlinear satellite channel in the presence of uplink and downlink Gaussian noise and intersymbol interference is examined. The satellite transponder is represented by a zero-memory bandpass nonlinearity with AM/AM conversion. The proposed optimum linear receiver structure consists of tapped-delay lines followed by a decision device. The linear receiver is designed to minimize the mean square error, which is a function of the intersymbol interference, the uplink noise, and the downlink noise. The minimum mean square error (MMSE) equalizer is derived using the Wiener-Kolmogorov theory. In this receiver, the decision about the transmitted signal is made by taking into account the received sequence of the present sample and the interfering past and future samples, which represent the intersymbol interference (ISI). Illustrative examples of the receiver structures are considered for nonlinear channels with symmetrical and asymmetrical frequency responses of the transmitter filter. The transponder nonlinearity is simulated by a polynomial using only the first- and third-order terms. A computer simulation determines the tap gain coefficients of the MMSE equalizer, which adapt to the various uplink and downlink noise levels. The performance of the MMSE equalizer is evaluated in terms of an estimate of the average probability of error.
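
    The Wiener/MMSE tap-gain computation described above, w = R⁻¹p with R the received-signal correlation matrix and p the cross-correlation with the desired symbol, can be sketched for a simple linear ISI channel. The channel taps, noise level, and equalizer length below are arbitrary, and the nonlinearity is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

# Linear channel with precursor and postcursor ISI (illustrative taps)
h = np.array([0.2, 1.0, 0.3])
n_sym, sigma = 5000, 0.1
s = rng.choice([-1.0, 1.0], n_sym)                   # BPSK symbols
r = np.convolve(s, h)[:n_sym] + rng.normal(0, sigma, n_sym)

# Tapped-delay-line MMSE equalizer: w = R^{-1} p, estimated from data;
# 'delay' aligns the equalizer output with the channel's main tap.
n_taps, delay = 11, 6
X = np.array([r[i:i + n_taps][::-1] for i in range(n_sym - n_taps)])
d = s[np.arange(len(X)) + n_taps - 1 - delay]        # desired symbols
R = X.T @ X / len(X)                                 # correlation matrix
p = X.T @ d / len(X)                                 # cross-correlation
w = np.linalg.solve(R, p)

mse_eq = np.mean((X @ w - d) ** 2)
mse_raw = np.mean((r[1:] - s[:-1]) ** 2)             # no equalization
print(mse_raw, mse_eq)  # equalizer cuts the mean square error
```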

  18. On Fast Post-Processing of Global Positioning System Simulator Truth Data and Receiver Measurements and Solutions Data

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Day, John H. (Technical Monitor)

    2000-01-01

    Post-Processing of data related to a Global Positioning System (GPS) simulation is an important activity in qualification of a GPS receiver for space flight. Because a GPS simulator is a critical resource, it is desirable to move the pertinent simulation data off the simulator as soon as a test is completed. The simulator data files are usually moved to a Personal Computer (PC), where the post-processing of the receiver-logged measurements and solutions data and the simulated data is performed. Typically post-processing is accomplished using PC-based commercial software languages and tools. Because of their generality, commercial software systems' general-purpose functions are notoriously slow and are more often than not the bottleneck, even for short-duration experiments. For example, it may take 8 hours to post-process data from a 6-hour simulation. There is a need to do post-processing faster, especially in order to use the previous test results as feedback for the next simulation setup. This paper demonstrates that a fast software linear interpolation algorithm is applicable to a large class of engineering problems, like GPS simulation data post-processing, where computational time is a critical resource and one of the most important considerations. An approach is developed that speeds up post-processing by an order of magnitude. It is based on improving the bottleneck interpolation algorithm using a priori information that is specific to the GPS simulation application. The presented post-processing scheme was used in support of several successful space flight missions carrying GPS receivers. A future approach to solving the post-processing performance problem using Field Programmable Gate Array (FPGA) technology is described.
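
    One kind of a priori information a GPS simulation offers is that truth data are logged at a fixed rate, so the bracketing interval for linear interpolation becomes a direct index computation instead of a per-query search. A sketch of that idea (array sizes and rates invented, not from the paper):

```python
import numpy as np

def uniform_interp(t_query, t0, dt, values):
    """Linear interpolation for samples known to be uniformly spaced.

    A generic interpolator must search for the bracketing interval per
    query; with a fixed logging rate the interval index is O(1).
    """
    idx = np.clip(((t_query - t0) / dt).astype(int), 0, len(values) - 2)
    frac = (t_query - (t0 + idx * dt)) / dt
    return values[idx] + frac * (values[idx + 1] - values[idx])

# Example: truth trajectory sampled at 1 Hz, queried at receiver epochs
t0, dt = 0.0, 1.0
truth = np.cumsum(np.random.default_rng(2).normal(size=1000))
tq = np.array([3.25, 10.5, 500.75, 998.9])
fast = uniform_interp(tq, t0, dt, truth)
ref = np.interp(tq, t0 + dt * np.arange(len(truth)), truth)
print(np.max(np.abs(fast - ref)))  # agrees with generic np.interp
```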

  19. Send-side matching of data communications messages

    DOEpatents

    Archer, Charles J.; Blocksome, Michael A.; Ratterman, Joseph D.; Smith, Brian E.

    2014-06-17

    Send-side matching of data communications messages in a distributed computing system comprising a plurality of compute nodes, including: issuing by a receiving node to source nodes a receive message that specifies receipt of a single message to be sent from any source node, the receive message including message matching information, a specification of a hardware-level mutual exclusion device, and an identification of a receive buffer; matching by two or more of the source nodes the receive message with pending send messages in the two or more source nodes; operating by one of the source nodes having a matching send message the mutual exclusion device, excluding messages from other source nodes with matching send messages and identifying to the receiving node the source node operating the mutual exclusion device; and sending to the receiving node from the source node operating the mutual exclusion device a matched pending message.
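
    The race to operate the mutual exclusion device can be mimicked in software with a non-blocking lock standing in for the hardware-level device (class and field names are illustrative, not from the patent):

```python
import threading

class ReceiveTicket:
    """Sketch of the receive message: matching info, a mutual exclusion
    device, and a receive buffer (names are illustrative)."""
    def __init__(self, tag):
        self.tag = tag
        self.mutex = threading.Lock()   # stands in for the hardware device
        self.buffer = None
        self.winner = None

def try_send(ticket, source_id, pending):
    """A source node matches the ticket against its pending sends and, on
    a match, races to operate the mutual exclusion device; losers are
    excluded and do not send."""
    match = next((m for t, m in pending if t == ticket.tag), None)
    if match is None:
        return
    if ticket.mutex.acquire(blocking=False):   # only one source wins
        ticket.winner = source_id
        ticket.buffer = match                  # deliver the matched message

ticket = ReceiveTicket(tag=42)
threads = [threading.Thread(target=try_send,
                            args=(ticket, i, [(42, f"payload-from-{i}")]))
           for i in range(4)]
for th in threads: th.start()
for th in threads: th.join()
print(ticket.winner, ticket.buffer)  # exactly one source delivered
```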

  20. Future remnant liver function as predictive factor for the hypertrophy response after portal vein embolization.

    PubMed

    Cieslak, Kasia P; Huisman, Floor; Bais, Thomas; Bennink, Roelof J; van Lienden, Krijn P; Verheij, Joanne; Besselink, Marc G; Busch, Olivier R C; van Gulik, Thomas M

    2017-07-01

    Preoperative portal vein embolization is widely used to increase the future remnant liver. Identification of nonresponders to portal vein embolization is essential because these patients may benefit from associating liver partition and portal vein ligation for staged hepatectomy (ALPPS), which induces a more powerful hypertrophy response. 99mTc-mebrofenin hepatobiliary scintigraphy is a quantitative method for assessment of future remnant liver function with a calculated cutoff value for the prediction of postoperative liver failure. The aim of this study was to analyze future remnant liver function before portal vein embolization to predict sufficient functional hypertrophy response after portal vein embolization. Sixty-three patients who underwent preoperative portal vein embolization and computed tomography imaging were included. Hepatobiliary scintigraphy was performed to determine pre-portal vein embolization and post-portal vein embolization future remnant liver function. Receiver operating characteristic analysis of pre-portal vein embolization future remnant liver function was performed to identify patients who would meet the post-portal vein embolization cutoff value for sufficient function (ie, 2.7%/min/m2). Mean pre-portal vein embolization future remnant liver function was 1.80% ± 0.45%/min/m2 and increased to 2.89% ± 0.97%/min/m2 post-portal vein embolization. Receiver operating characteristic analysis in 33 patients who did not receive chemotherapy revealed that a pre-portal vein embolization future remnant liver function of ≥1.72%/min/m2 was able to identify patients who would meet the safe future remnant liver function cutoff value 3 weeks after portal vein embolization (area under the curve = 0.820). The predictive value was less pronounced in 30 patients treated with neoadjuvant chemotherapy (area under the curve = 0.618). A total of 45 of 63 patients underwent liver resection, of whom 5 of 45 developed postoperative liver failure; 4 of 5 patients had a post-portal vein embolization future remnant liver function below the cutoff value for safe resection. When selecting patients for portal vein embolization, future remnant liver function assessed with hepatobiliary scintigraphy can be used as a predictor of insufficient functional hypertrophy after portal vein embolization, especially in nonchemotherapy patients. These patients are potential candidates for ALPPS. Copyright © 2017 Elsevier Inc. All rights reserved.
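
    A common way to derive such a threshold from a receiver operating characteristic analysis is Youden's J statistic; the sketch below applies it to small synthetic data (the paper's actual cutoff derivation may differ):

```python
def youden_cutoff(values, labels):
    """Pick the ROC cutoff maximizing sensitivity + specificity - 1
    (Youden's J); labels are 1 for 'met the post-PVE cutoff'."""
    pos = [v for v, y in zip(values, labels) if y]
    neg = [v for v, y in zip(values, labels) if not y]
    best_j, best_cut = -1.0, None
    for cut in sorted(set(values)):
        sens = sum(v >= cut for v in pos) / len(pos)
        spec = sum(v < cut for v in neg) / len(neg)
        j = sens + spec - 1
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j

# Synthetic pre-PVE function values (%/min/m2) and outcome labels
vals = [1.2, 1.4, 1.5, 1.6, 1.7, 1.8, 1.9, 2.0, 2.1, 2.3]
lab = [0, 0, 0, 0, 1, 1, 1, 1, 1, 1]
print(youden_cutoff(vals, lab))  # → (1.7, 1.0)
```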

  1. TH-A-9A-04: Incorporating Liver Functionality in Radiation Therapy Treatment Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, V; Epelman, M; Feng, M

    2014-06-15

    Purpose: Liver SBRT patients have both variable pretreatment liver function (e.g., due to degree of cirrhosis and/or prior treatments) and sensitivity to radiation, leading to high variability in potential liver toxicity with similar doses. This work aims to explicitly incorporate liver perfusion into treatment planning to redistribute dose to preserve well-functioning areas without compromising target coverage. Methods: Voxel-based liver perfusion, a measure of functionality, was computed from dynamic contrast-enhanced MRI. Two optimization models with different cost functions subject to the same dose constraints (e.g., minimum target EUD and maximum critical structure EUDs) were compared. The cost functions minimized were EUD (standard model) and functionality-weighted EUD (functional model) to the liver. The resulting treatment plans delivering the same target EUD were compared with respect to their DVHs, their dose wash difference, the average dose delivered to voxels of a particular perfusion level, and the change in number of high-/low-functioning voxels receiving a particular dose. Two-dimensional synthetic and three-dimensional clinical examples were studied. Results: The DVHs of all structures of plans from each model were comparable. In contrast, in plans obtained with the functional model, the average dose delivered to high-/low-functioning voxels was lower/higher than in plans obtained with its standard counterpart. The number of high-/low-functioning voxels receiving high/low dose was lower in the plans that considered perfusion in the cost function than in the plans that did not. Redistribution of dose can be observed in the dose wash differences. Conclusion: Liver perfusion can be used during treatment planning potentially to minimize the risk of toxicity during liver SBRT, resulting in better global liver function. The functional model redistributes the dose of the standard model from higher- to lower-functioning voxels, while achieving the same target EUD and satisfying dose limits to critical structures. This project is funded by MCubed and grant R01-CA132834.
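
    The EUD and functionality-weighted EUD cost terms can be sketched as follows; the weighting by perfusion is illustrative, since the abstract does not give the exact functional form:

```python
def eud(doses, a):
    """Generalized equivalent uniform dose: EUD = (mean of d_i^a)^(1/a)."""
    return (sum(d ** a for d in doses) / len(doses)) ** (1.0 / a)

def functional_eud(doses, perfusion, a):
    """Functionality-weighted EUD sketch: voxels weighted by relative
    perfusion, so dose to well-functioning tissue costs more."""
    wsum = sum(perfusion)
    return (sum(f * d ** a for f, d in zip(perfusion, doses)) / wsum) ** (1.0 / a)

# Toy liver: high dose landing on the well-perfused voxels
doses = [30.0, 30.0, 10.0, 10.0]
perf = [1.0, 1.0, 0.2, 0.2]
a = 1.0  # a = 1 reduces EUD to a (weighted) mean dose
print(eud(doses, a), functional_eud(doses, perf, a))
```

    With identical dose distributions, the weighted cost is higher when dose falls on well-perfused voxels, which is exactly the pressure that pushes the optimizer to redistribute dose toward poorly functioning tissue.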

  2. Joint Transmit and Receive Filter Optimization for Sub-Nyquist Delay-Doppler Estimation

    NASA Astrophysics Data System (ADS)

    Lenz, Andreas; Stein, Manuel S.; Swindlehurst, A. Lee

    2018-05-01

    In this article, a framework is presented for the joint optimization of the analog transmit and receive filter with respect to a parameter estimation problem. At the receiver, conventional signal processing systems restrict the two-sided bandwidth of the analog pre-filter $B$ to the rate of the analog-to-digital converter $f_s$ to comply with the well-known Nyquist-Shannon sampling theorem. In contrast, here we consider a transceiver that by design violates the common paradigm $B \leq f_s$. To this end, at the receiver, we allow for a higher pre-filter bandwidth $B > f_s$ and study the achievable parameter estimation accuracy under a fixed sampling rate when the transmit and receive filter are jointly optimized with respect to the Bayesian Cramér-Rao lower bound. For the case of delay-Doppler estimation, we propose to approximate the required Fisher information matrix and solve the transceiver design problem by an alternating optimization algorithm. The presented approach allows us to explore the Pareto-optimal region spanned by transmit and receive filters which are favorable under a weighted mean squared error criterion. We also discuss the computational complexity of the obtained transceiver design by visualizing the resulting ambiguity function. Finally, we verify the performance of the optimized designs by Monte-Carlo simulations of a likelihood-based estimator.
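    For intuition on why a wider pre-filter bandwidth helps at a fixed sampling rate, the classical delay-only bound can be evaluated numerically; the flat spectra and SNR below are illustrative stand-ins, not the article's optimized designs:

```python
import numpy as np

def delay_crlb(freqs, spectrum, snr):
    """CRLB for delay estimation of a known signal in white noise:
    var(tau) >= 1 / (snr * (2*pi)^2 * beta2), beta2 = mean-square bandwidth."""
    p = np.abs(np.asarray(spectrum)) ** 2
    beta2 = np.sum(freqs**2 * p) / np.sum(p)
    return 1.0 / (snr * (2.0 * np.pi) ** 2 * beta2)

f = np.linspace(-2e6, 2e6, 4001)             # Hz, uniform grid
narrow = (np.abs(f) <= 0.5e6).astype(float)  # flat spectrum, B = 1 MHz
wide = (np.abs(f) <= 2e6).astype(float)      # flat spectrum, B = 4 MHz

# for a flat spectrum beta2 = B^2 / 12, so 4x the bandwidth -> 16x lower bound
ratio = delay_crlb(f, narrow, snr=100.0) / delay_crlb(f, wide, snr=100.0)
```

The quadratic dependence on effective bandwidth is what makes a pre-filter with $B > f_s$ attractive even when the converter rate stays fixed.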

  3. Solar tower cavity receiver aperture optimization based on transient optical and thermo-hydraulic modeling

    NASA Astrophysics Data System (ADS)

    Schöttl, Peter; Bern, Gregor; van Rooyen, De Wet; Heimsath, Anna; Fluri, Thomas; Nitz, Peter

    2017-06-01

    A transient simulation methodology for cavity receivers for Solar Tower Central Receiver Systems with molten salt as heat transfer fluid is described. Absorbed solar radiation is modeled with ray tracing and a sky discretization approach to reduce computational effort. Solar radiation re-distribution in the cavity as well as thermal radiation exchange are modeled based on view factors, which are also calculated with ray tracing. An analytical approach is used to represent convective heat transfer in the cavity. Heat transfer fluid flow is simulated with a discrete tube model, where the boundary conditions at the outer tube surface mainly depend on inputs from the previously mentioned modeling aspects. Specific focus is placed on the integration of the optical and thermo-hydraulic models. Furthermore, the aiming point and control strategies used during the transient performance assessment are described. Finally, the developed simulation methodology is used to optimize the aperture opening size of a PS10-like reference scenario with cavity receiver and heliostat field. The objective function is based on the cumulative gain of one representative day. Results include the optimized aperture opening size, transient receiver characteristics, and the benefits of the implemented aiming point strategy compared to a single aiming point approach. Future work will include annual simulations, cost assessment and optimization of a larger range of receiver parameters.
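    The view-factor-by-ray-tracing step can be illustrated with a minimal Monte Carlo sketch (not the authors' tool); for two equal coaxial parallel disks the estimate can be checked against the analytic value F = (3 − √5)/2 when the radius equals the spacing:

```python
import numpy as np

def mc_view_factor_disks(radius, distance, n=200_000, seed=0):
    """View factor between two equal coaxial parallel disks by Monte Carlo:
    cosine-weighted rays from random points on disk 1, count hits on disk 2."""
    rng = np.random.default_rng(seed)
    r = radius * np.sqrt(rng.random(n))          # uniform points on disk 1
    th = 2.0 * np.pi * rng.random(n)
    x0, y0 = r * np.cos(th), r * np.sin(th)
    u = np.sqrt(rng.random(n))                   # cosine-weighted directions
    ph = 2.0 * np.pi * rng.random(n)             # (Malley's method)
    dx, dy, dz = u * np.cos(ph), u * np.sin(ph), np.sqrt(1.0 - u**2)
    t = distance / dz                            # hit point on plane of disk 2
    xh, yh = x0 + t * dx, y0 + t * dy
    return float(np.mean(xh**2 + yh**2 <= radius**2))

# analytic value for radius == spacing: F12 = (3 - sqrt(5)) / 2 ~= 0.382
est = mc_view_factor_disks(radius=1.0, distance=1.0)
```

The same hit-counting idea generalizes to the tube banks and aperture surfaces of a cavity receiver, where no closed-form view factors exist.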

  4. The use of interactive computer vision and robot hand controllers for enhancing manufacturing safety

    NASA Technical Reports Server (NTRS)

    Marzwell, Neville I.; Jacobus, Charles J.; Peurach, Thomas M.; Mitchell, Brian T.

    1994-01-01

    Currently available robotic systems provide limited support for CAD-based model-driven visualization, sensing algorithm development and integration, and automated graphical planning systems. This paper describes ongoing work which provides the functionality necessary to apply advanced robotics to automated manufacturing and assembly operations. An interface has been built which incorporates 6-DOF tactile manipulation, displays for three dimensional graphical models, and automated tracking functions which depend on automated machine vision. A set of tools for single and multiple focal plane sensor image processing and understanding has been demonstrated which utilizes object recognition models. The resulting tool will enable sensing and planning from computationally simple graphical objects. A synergistic interplay between human and operator vision is created from programmable feedback received from the controller. This approach can be used as the basis for implementing enhanced safety in automated robotics manufacturing, assembly, repair and inspection tasks in both ground and space applications. Thus, an interactive capability has been developed to match the modeled environment to the real task environment for safe and predictable task execution.

  5. The use of interactive computer vision and robot hand controllers for enhancing manufacturing safety

    NASA Astrophysics Data System (ADS)

    Marzwell, Neville I.; Jacobus, Charles J.; Peurach, Thomas M.; Mitchell, Brian T.

    1994-02-01

    Currently available robotic systems provide limited support for CAD-based model-driven visualization, sensing algorithm development and integration, and automated graphical planning systems. This paper describes ongoing work which provides the functionality necessary to apply advanced robotics to automated manufacturing and assembly operations. An interface has been built which incorporates 6-DOF tactile manipulation, displays for three dimensional graphical models, and automated tracking functions which depend on automated machine vision. A set of tools for single and multiple focal plane sensor image processing and understanding has been demonstrated which utilizes object recognition models. The resulting tool will enable sensing and planning from computationally simple graphical objects. A synergistic interplay between human and operator vision is created from programmable feedback received from the controller. This approach can be used as the basis for implementing enhanced safety in automated robotics manufacturing, assembly, repair and inspection tasks in both ground and space applications. Thus, an interactive capability has been developed to match the modeled environment to the real task environment for safe and predictable task execution.

  6. Radiative Heat Transfer and Turbulence-Radiation Interactions in a Heavy-Duty Diesel Engine

    NASA Astrophysics Data System (ADS)

    Paul, C.; Sircar, A.; Ferreyro, S.; Imren, A.; Haworth, D. C.; Roy, S.; Ge, W.; Modest, M. F.

    2016-11-01

    Radiation in piston engines has received relatively little attention to date. Recently, however, it has been revisited in light of current trends towards higher operating pressures and higher levels of exhaust-gas recirculation, both of which enhance molecular gas radiation. Advanced high-efficiency engines also are expected to function closer to the limits of stable operation, where even small perturbations to the energy balance can have a large influence on system behavior. Here several different spectral radiation property models and radiative transfer equation (RTE) solvers have been implemented in an OpenFOAM-based engine CFD code, and simulations have been performed for a heavy-duty diesel engine. Differences in computed temperature fields, NO and soot levels, and wall heat transfer rates are shown for different combinations of spectral models and RTE solvers. The relative importance of molecular gas radiation versus soot radiation is examined. Finally, the influence of turbulence-radiation interactions is determined by comparing results obtained using local mean values of composition and temperature to compute radiative emission and absorption with those obtained using a particle-based transported probability density function method. Supported by DOE and NSF.
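    For a sense of scale, the optically thin emission source term 4·κ_P·σ·T⁴ that such RTE solvers balance can be evaluated directly; the Planck-mean absorption coefficients below are placeholders for illustration, not engine-validated values:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiant_emission(kappa_planck, temperature):
    """Optically thin emission per unit volume: 4 * kappa_P * sigma * T^4 (W/m^3)."""
    return 4.0 * kappa_planck * SIGMA * temperature**4

# placeholder Planck-mean absorption coefficients (1/m), not engine-validated
kappa_gas = 0.3   # CO2/H2O mixture at elevated pressure
kappa_soot = 1.2  # soot cloud in the flame region
T = 1800.0        # K, representative flame-zone temperature

gas = radiant_emission(kappa_gas, T)
soot = radiant_emission(kappa_soot, T)  # scales linearly with kappa at fixed T
```

Because emission scales with T⁴, even modest local temperature fluctuations bias the mean emission upward, which is why the turbulence-radiation interaction comparison in the study matters.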

  7. Radiative Heat Transfer modelling in a Heavy-Duty Diesel Engine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paul, Chandan; Sircar, Arpan; Ferreyro-Fernandez, Sebastian

    Detailed radiation modelling in piston engines has received relatively little attention to date. Recently, however, it has been revisited in light of current trends towards higher operating pressures and higher levels of exhaust-gas recirculation, both of which enhance molecular gas radiation. Advanced high-efficiency engines also are expected to function closer to the limits of stable operation, where even small perturbations to the energy balance can have a large influence on system behavior. Here several different spectral radiation property models and radiative transfer equation (RTE) solvers have been implemented in an OpenFOAM-based engine CFD code, and simulations have been performed for a heavy-duty diesel engine. Differences in computed temperature fields, NO and soot levels, and wall heat transfer rates are shown for different combinations of spectral models and RTE solvers. The relative importance of molecular gas radiation versus soot radiation is examined. Finally, the influence of turbulence-radiation interactions is determined by comparing results obtained using local mean values of composition and temperature to compute radiative emission and absorption with those obtained using a particle-based transported probability density function method.

  8. Single-lens computed tomography imaging spectrometer and method of capturing spatial and spectral information

    NASA Technical Reports Server (NTRS)

    Wilson, Daniel W. (Inventor); Johnson, William R. (Inventor); Bearman, Gregory H. (Inventor)

    2011-01-01

    Computed tomography imaging spectrometers ("CTISs") employing a single lens are provided. The CTISs may be either transmissive or reflective, and the single lens is either configured to transmit and receive uncollimated light (in transmissive systems), or is configured to reflect and receive uncollimated light (in reflective systems). An exemplary transmissive CTIS includes a focal plane array detector, a single lens configured to transmit and receive uncollimated light, a two-dimensional grating, and a field stop aperture. An exemplary reflective CTIS includes a focal plane array detector, a single mirror configured to reflect and receive uncollimated light, a two-dimensional grating, and a field stop aperture.

  9. System and method for generating attitude determinations using GPS

    NASA Technical Reports Server (NTRS)

    Cohen, Clark E. (Inventor)

    1996-01-01

    A GPS attitude receiver for determining the attitude of a moving vehicle in conjunction with a first, a second, a third, and a fourth antenna mounted to the moving vehicle. Each of the antennas receives a plurality of GPS signals that each include a carrier component. For each of the carrier components of the received GPS signals there is an integer ambiguity associated with the first and fourth antennas, an integer ambiguity associated with the second and fourth antennas, and an integer ambiguity associated with the third and fourth antennas. The GPS attitude receiver measures phase values for the carrier components of the GPS signals received from each of the antennas at a plurality of measurement epochs during an initialization period and at a measurement epoch after the initialization period. In response to the phase values measured at the measurement epochs during the initialization period, the GPS attitude receiver computes integer ambiguity resolution values representing resolution of the integer ambiguities. Then, in response to the computed integer ambiguity resolution values and the phase values measured at the measurement epoch after the initialization period, it computes values defining the attitude of the moving vehicle at the measurement epoch after the initialization period.
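    Once the integer ambiguities are resolved, recovering a baseline vector (and, from several baselines, the attitude) reduces to a linear least-squares fit. A single-baseline sketch with an illustrative satellite geometry, not the patent's algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)
lam = 0.1903  # GPS L1 carrier wavelength, m

# approximate unit line-of-sight vectors to four satellites (illustrative)
E = np.array([[ 0.3,  0.4, 0.866],
              [-0.5,  0.2, 0.843],
              [ 0.1, -0.7, 0.707],
              [ 0.6,  0.6, 0.529]])
b_true = np.array([0.8, -0.3, 0.1])   # antenna baseline vector, m
N = np.array([5.0, -3.0, 7.0, 2.0])   # resolved integer ambiguities, cycles

# measured carrier phases: geometry term plus ambiguity plus small noise
phase = E @ b_true / lam + N + rng.normal(0.0, 1e-3, 4)

# with the ambiguities resolved, the baseline is a linear least-squares fit
b_est, *_ = np.linalg.lstsq(E, lam * (phase - N), rcond=None)
```

The hard part, which the patent addresses with the multi-epoch initialization period, is resolving the integers N in the first place; afterwards each epoch costs only one small solve.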

  10. Radiology's Achilles' heel: error and variation in the interpretation of the Röntgen image.

    PubMed

    Robinson, P J

    1997-11-01

    The performance of the human eye and brain has failed to keep pace with the enormous technical progress in the first full century of radiology. Errors and variations in interpretation now represent the weakest aspect of clinical imaging. Those interpretations which differ from the consensus view of a panel of "experts" may be regarded as errors; where experts fail to achieve consensus, differing reports are regarded as "observer variation". Errors arise from poor technique, failures of perception, lack of knowledge and misjudgments. Observer variation is substantial and should be taken into account when different diagnostic methods are compared; in many cases the difference between observers outweighs the difference between techniques. Strategies for reducing error include attention to viewing conditions, training of the observers, availability of previous films and relevant clinical data, dual or multiple reporting, standardization of terminology and report format, and assistance from computers. Digital acquisition and display will probably not affect observer variation but the performance of radiologists, as measured by receiver operating characteristic (ROC) analysis, may be improved by computer-directed search for specific image features. Other current developments show that where image features can be comprehensively described, computer analysis can replace the perception function of the observer, whilst the function of interpretation can in some cases be performed better by artificial neural networks. However, computer-assisted diagnosis is still in its infancy and complete replacement of the human observer is as yet a remote possibility.
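    The ROC analysis mentioned above reduces, for a single observer, to how well confidence scores rank diseased cases above normal ones; the area under the curve can be computed with the rank-sum identity. The scores below are hypothetical reader confidence ratings:

```python
import numpy as np

def roc_auc(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity."""
    labels = np.asarray(labels)
    scores = np.asarray(scores, dtype=float)
    pos, neg = scores[labels == 1], scores[labels == 0]
    diff = pos[:, None] - neg[None, :]
    # fraction of (diseased, normal) pairs ranked correctly; ties count half
    return (np.sum(diff > 0) + 0.5 * np.sum(diff == 0)) / (len(pos) * len(neg))

# hypothetical reader confidence scores: 1 = diseased case, 0 = normal case
labels = np.array([1, 1, 1, 0, 0, 0, 1, 0])
scores = np.array([0.9, 0.8, 0.6, 0.55, 0.5, 0.3, 0.4, 0.2])
auc = roc_auc(labels, scores)  # 14 of 16 pairs ranked correctly
```

An AUC of 0.5 corresponds to guessing and 1.0 to perfect discrimination, which is why ROC comparisons can separate observer performance from the imaging technique itself.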

  11. On the design of computer-based models for integrated environmental science.

    PubMed

    McIntosh, Brian S; Jeffrey, Paul; Lemon, Mark; Winder, Nick

    2005-06-01

    The current research agenda in environmental science is dominated by calls to integrate science and policy to better understand and manage links between social (human) and natural (nonhuman) processes. Freshwater resource management is one area where such calls can be heard. Designing computer-based models for integrated environmental science poses special challenges to the research community. At present it is not clear whether such tools, or their outputs, receive much practical policy or planning application. It is argued that this is a result of (1) a lack of appreciation within the research modeling community of the characteristics of different decision-making processes, including policy, planning, and participation, (2) a lack of appreciation of the characteristics of different decision-making contexts, (3) the technical difficulties in implementing the necessary support tool functionality, and (4) the socio-technical demands of designing tools to be of practical use. This article presents a critical synthesis of ideas from each of these areas and interprets them in terms of design requirements for computer-based models being developed to provide scientific information support for policy and planning. Illustrative examples are given from the field of freshwater resources management. Although computer-based diagramming and modeling tools can facilitate processes of dialogue, they lack adequate simulation capabilities. Component-based models and modeling frameworks provide such functionality and may be suited to supporting problematic or messy decision contexts. However, significant technical (implementation) and socio-technical (use) challenges need to be addressed before such ambition can be realized.

  12. Design guidelines for the use of audio cues in computer interfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sumikawa, D.A.; Blattner, M.M.; Joy, K.I.

    1985-07-01

    A logical next step in the evolution of the computer-user interface is the incorporation of sound, thereby engaging our sense of hearing in our communication with the computer. This allows our visual and auditory capacities to work in unison, leading to a more effective and efficient interpretation of information received from the computer than by sight alone. In this paper we examine earcons, which are audio cues used in the computer-user interface to provide information and feedback to the user about computer entities (these include messages and functions, as well as states and labels). The material in this paper is part of a larger study that recommends guidelines for the design and use of audio cues in the computer-user interface. The complete work examines the disciplines of music, psychology, communication theory, advertising, and psychoacoustics to discover how sound is utilized and analyzed in those areas. The resulting information is organized according to the theory of semiotics, the theory of signs, into the syntax, semantics, and pragmatics of communication by sound. Here we present design guidelines for the syntax of earcons. Earcons are constructed from motives, short sequences of notes with a specific rhythm and pitch, embellished by timbre, dynamics, and register. Compound earcons and family earcons are introduced. These are related motives that serve to identify a family of related cues. Examples of earcons are given.
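    The motive/compound/family construction can be made concrete with a small data-structure sketch; the `Note` type, function names, and specific pitches are illustrative inventions, not notation from the study:

```python
from dataclasses import dataclass

# names and fields are illustrative; the study's own notation differs
@dataclass(frozen=True)
class Note:
    pitch: int     # semitones above middle C
    beats: float   # duration in beats

def transpose(motive, semitones):
    """Derive a family member: shift register, keep rhythm intact."""
    return [Note(n.pitch + semitones, n.beats) for n in motive]

file_motive = [Note(0, 0.5), Note(4, 0.5), Note(7, 1.0)]    # cue for "file"
delete_motive = [Note(2, 0.25), Note(2, 0.25)]              # cue for "delete"

# compound earcon: play "delete" then "file" in sequence
delete_file = delete_motive + file_motive

# family earcon: the same motive an octave up for a related cue
file_variant = transpose(file_motive, 12)
```

Keeping rhythm fixed while varying register or timbre is what lets a listener recognize related cues as members of one family.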

  13. 75 FR 64996 - Takes of Marine Mammals Incidental to Specified Activities; Marine Geophysical Survey in the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-21

    ... cruises. A laptop computer is located on the observer platform for ease of data entry. The computer is... lines, the receiving systems will receive the returning acoustic signals. The study (e.g., equipment...-board assistance by the scientists who have proposed the study. The Chief Scientist is Dr. Franco...

  14. Multi-input and binary reproducible, high bandwidth floating point adder in a collective network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Dong; Eisley, Noel A; Heidelberger, Philip

    To add floating point numbers in a parallel computing system, a collective logic device receives the floating point numbers from computing nodes. The collective logic device converts the floating point numbers to integer numbers. The collective logic device adds the integer numbers and generates a summation of the integer numbers. The collective logic device converts the summation to a floating point number. The collective logic device performs the receiving, the converting of the floating point numbers, the adding, the generating and the converting of the summation in one pass. One pass indicates that the computing nodes send inputs only once to the collective logic device and receive outputs only once from the collective logic device.
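    The key idea — integer addition is exact and order-independent, so converting floats to integers before a collective reduction makes the result binary reproducible — can be sketched in software (a simplification of the patented hardware, assuming inputs fit a chosen fixed-point range):

```python
import random

SCALE = 1 << 32  # fixed-point scale; assumes inputs fit the chosen range

def reproducible_sum(values):
    """Order-independent summation: scale to integers, add exactly, scale back."""
    total = sum(int(round(v * SCALE)) for v in values)  # exact integer adds
    return total / SCALE

xs = [0.1, 1e-9, 2.5, -0.7, 3.3333333, 42.0, -1e-9]
a = reproducible_sum(xs)

random.seed(0)
shuffled = xs[:]
random.shuffle(shuffled)
b = reproducible_sum(shuffled)  # bit-identical regardless of arrival order
```

Naive floating-point summation is not associative, so a collective whose message arrival order varies from run to run can return different bits; the integer detour removes that nondeterminism at the cost of a bounded quantization error.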

  15. Concomitant Use of Transcranial Direct Current Stimulation and Computer-Assisted Training for the Rehabilitation of Attention in Traumatic Brain Injured Patients: Behavioral and Neuroimaging Results.

    PubMed

    Sacco, Katiuscia; Galetto, Valentina; Dimitri, Danilo; Geda, Elisabetta; Perotti, Francesca; Zettin, Marina; Geminiani, Giuliano C

    2016-01-01

    Divided attention (DA), the ability to distribute cognitive resources among two or more simultaneous tasks, may be severely compromised after traumatic brain injury (TBI), resulting in problems with numerous activities involved with daily living. So far, no research has investigated whether the use of non-invasive brain stimulation associated with neuropsychological rehabilitation might contribute to the recovery of such cognitive function. The main purpose of this study was to assess the effectiveness of 10 transcranial direct current stimulation (tDCS) sessions combined with computer-assisted training; it also aimed to explore the neural modifications induced by the treatment. Thirty-two patients with severe TBI participated in the study: 16 were part of the experimental group, and 16 part of the control group. The treatment included 20' of tDCS, administered twice a day for 5 days. The electrodes were placed on the dorso-lateral prefrontal cortex. Their location varied across patients and it depended on each participant's specific area of damage. The control group received sham tDCS. After each tDCS session, the patient received computer-assisted cognitive training on DA for 40'. The results showed that the experimental group significantly improved in DA performance between pre- and post-treatment, showing faster reaction times (RTs), and fewer omissions. No improvement was detected between the baseline assessment (i.e., 1 month before treatment) and the pre-training assessment, or within the control group. Functional magnetic resonance imaging (fMRI) data, obtained on the experimental group during a DA task, showed post-treatment lower cerebral activations in the right superior temporal gyrus (BA 42), right and left middle frontal gyrus (BA 6), right postcentral gyrus (BA 3) and left inferior frontal gyrus (BA 9). We interpreted such neural changes as normalization of previously abnormal hyperactivations.

  16. The effect of Vaccinium uliginosum extract on tablet computer-induced asthenopia: randomized placebo-controlled study.

    PubMed

    Park, Choul Yong; Gu, Namyi; Lim, Chi-Yeon; Oh, Jong-Hyun; Chang, Minwook; Kim, Martha; Rhee, Moo-Yong

    2016-08-18

    To investigate the alleviation effect of Vaccinium uliginosum extract (DA9301) on tablet computer-induced asthenopia. This was a randomized, placebo-controlled, double-blind and parallel study (Trial registration number: 2013-95). A total of 60 volunteers were randomized into DA9301 (n = 30) and control (n = 30) groups. The DA9301 group received DA9301 oral pills (1000 mg/day) for 4 weeks and the control group received placebo. Asthenopia was evaluated by administering a questionnaire containing 10 questions (responses were scored on a scale of 0-6; total score: 60) regarding ocular symptoms before (baseline) and 4 weeks after receiving pills (DA9301 or placebo). The participants completed the questionnaire before and after tablet computer (iPad Air, Apple Inc.) watching at each visit. The change in total asthenopia score (TAS) was calculated and compared between the groups. At baseline, TAS increased significantly after tablet computer watching in the DA9301 group (from 20.35 to 23.88; p = 0.031); however, after receiving DA9301 for 4 weeks, TAS remained stable after tablet computer watching. In the control group, TAS changes induced by tablet computer watching were not significant either at baseline or at 4 weeks after receiving placebo. Further analysis revealed the scores for "tired eyes" (p = 0.001), "sore/aching eyes" (p = 0.038), "irritated eyes" (p = 0.010), "watery eyes" (p = 0.005), "dry eyes" (p = 0.003), "eye strain" (p = 0.006), "blurred vision" (p = 0.034), and "visual discomfort" (p = 0.018) significantly improved in the DA9301 group. We found that oral intake of DA9301 (1000 mg/day for 4 weeks) was effective in alleviating asthenopia symptoms induced by tablet computer watching. The study is registered at www.clinicaltrials.gov (registration number: NCT02641470, date of registration December 30, 2015).

  17. Lithospheric structure below seismic stations in Cuba from the joint inversion of Rayleigh surface waves dispersion and receiver functions

    NASA Astrophysics Data System (ADS)

    González, O'Leary; Moreno, Bladimir; Romanelli, Fabio; Panza, Giuliano F.

    2012-05-01

    The joint inversion of Rayleigh wave group velocity dispersion and receiver functions has been used to study the crust and upper mantle structure at eight seismic stations in Cuba. Receiver functions have been computed from teleseismic recordings of earthquakes at epicentral (angular) distances in the range from 30° to 90° and Rayleigh wave group velocity dispersion relations have been taken from earlier surface wave tomographic studies in the Caribbean area. The thickest crust (˜30 km) below Cuban stations is found at Cascorro (CCC) and Maisí (MAS) whereas the thinnest crust (˜18 km) is found at stations Río Carpintero (RCC) and Guantánamo Bay (GTBY), in the southeastern part of Cuba; this result is in agreement with the southward gradual thinning of the crust revealed by previous studies. In the crystalline crust, the S-wave velocity varies between ˜2.8 and ˜3.9 km s-1 and, at the crust-mantle transition zone, the shear wave velocity varies from ˜4.0 to ˜4.3 km s-1. The lithospheric thickness varies from ˜65 km, in the youngest lithosphere, to ˜150 km in the northeastern part of the Cuban island, below Maisí (MAS) and Moa (MOA) stations. Evidence of a subducted slab possibly belonging to the Caribbean plate is present below the stations Las Mercedes (LMG), RCC and GTBY whereas earlier subducted slabs could explain the results obtained below the Soroa (SOR), Manicaragua (MGV) and Cascorro (CCC) stations.
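    Receiver functions of the kind used here are commonly estimated by deconvolving the vertical component from the radial one; a minimal water-level deconvolution sketch (a standard estimator, not the authors' exact processing chain) recovers a delayed spike from synthetic data:

```python
import numpy as np

def receiver_function(radial, vertical, water_level=0.01):
    """Frequency-domain water-level deconvolution of the vertical component
    from the radial component (a common receiver-function estimator)."""
    R, Z = np.fft.rfft(radial), np.fft.rfft(vertical)
    denom = np.abs(Z) ** 2
    denom = np.maximum(denom, water_level * denom.max())
    return np.fft.irfft(R * np.conj(Z) / denom, n=len(radial))

# synthetic test: the radial trace is the vertical pulse delayed by 15 samples,
# so the receiver function should be a band-limited spike at lag 15
n, lag = 256, 15
t = np.arange(n)
vertical = np.exp(-0.5 * ((t - 50) / 4.0) ** 2)   # Gaussian source pulse
radial = 0.5 * np.roll(vertical, lag)

rf = receiver_function(radial, vertical)
```

In real data the spike lags correspond to P-to-S conversion times at the Moho and other discontinuities, which is the information the joint inversion combines with the dispersion curves.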

  18. Computational analysis of sequence selection mechanisms.

    PubMed

    Meyerguz, Leonid; Grasso, Catherine; Kleinberg, Jon; Elber, Ron

    2004-04-01

    Mechanisms leading to gene variations are responsible for the diversity of species and are important components of the theory of evolution. One constraint on gene evolution is that of protein foldability; the three-dimensional shapes of proteins must be thermodynamically stable. We explore the impact of this constraint and calculate properties of foldable sequences using 3660 structures from the Protein Data Bank. We seek a selection function that receives sequences as input, and outputs survival probability based on sequence fitness to structure. We compute the number of sequences that match a particular protein structure with energy lower than the native sequence, the density of the number of sequences, the entropy, and the "selection" temperature. The mechanism of structure selection for sequences longer than 200 amino acids is approximately universal. For shorter sequences, it is not. We speculate on concrete evolutionary mechanisms that show this behavior.

  19. LANDSAT-D band 6 data evaluation

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A filter, fabricated to match the spectral response of the LANDSAT band 6 sensors, was received and the combined system response function computed. The half power points for the aircraft system are 10.5 micrometer and 11.55 micrometer compared to the 10.4 and 11.6 micrometer values for the satellite. These discrepancies are considered acceptable; their effect on the apparent temperature observed at the satellite is being evaluated. The filter was installed in the infrared line scanner and the line scanner was installed in the aircraft and field checked. A daytime underflight of the satellite is scheduled for the next clear overpass and the feasibility of a nighttime overpass is being discussed with NASA. The LOWTRAN 5 computer code was obtained from the Air Force Geophysical Laboratory and is being implemented for use on this effort.
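    Half-power points like those quoted above can be extracted from a sampled response curve by linear interpolation; the Gaussian band below is synthetic, standing in for the measured combined system response:

```python
import numpy as np

def half_power_points(wavelengths, response):
    """Wavelengths where the response crosses half its peak (linear interpolation)."""
    r = np.asarray(response, dtype=float)
    half = 0.5 * r.max()
    above = r >= half
    points = []
    for i in np.flatnonzero(above[1:] != above[:-1]):
        frac = (half - r[i]) / (r[i + 1] - r[i])   # crossing between i and i+1
        points.append(wavelengths[i] + frac * (wavelengths[i + 1] - wavelengths[i]))
    return points

# synthetic stand-in for the combined system response of a thermal-IR band
wl = np.linspace(9.0, 13.0, 401)                  # micrometers
resp = np.exp(-0.5 * ((wl - 11.0) / 0.6) ** 2)    # Gaussian response (illustrative)
lo, hi = half_power_points(wl, resp)
```

For a Gaussian the crossings sit at the center ± σ√(2 ln 2), so the sketch can be checked against 11.0 ± 0.706 micrometers.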

  20. A survey of signal processing algorithms in brain-computer interfaces based on electrical brain signals.

    PubMed

    Bashashati, Ali; Fatourechi, Mehrdad; Ward, Rabab K; Birch, Gary E

    2007-06-01

    Brain-computer interfaces (BCIs) aim at providing a non-muscular channel for sending commands to the external world using the electroencephalographic activity or other electrophysiological measures of the brain function. An essential factor in the successful operation of BCI systems is the methods used to process the brain signals. In the BCI literature, however, there is no comprehensive review of the signal processing techniques used. This work presents the first such comprehensive survey of all BCI designs using electrical signal recordings published prior to January 2006. Detailed results from this survey are presented and discussed. The following key research questions are addressed: (1) what are the key signal processing components of a BCI, (2) what signal processing algorithms have been used in BCIs and (3) which signal processing techniques have received more attention?

  1. TOPICAL REVIEW: A survey of signal processing algorithms in brain computer interfaces based on electrical brain signals

    NASA Astrophysics Data System (ADS)

    Bashashati, Ali; Fatourechi, Mehrdad; Ward, Rabab K.; Birch, Gary E.

    2007-06-01

    Brain computer interfaces (BCIs) aim at providing a non-muscular channel for sending commands to the external world using the electroencephalographic activity or other electrophysiological measures of the brain function. An essential factor in the successful operation of BCI systems is the methods used to process the brain signals. In the BCI literature, however, there is no comprehensive review of the signal processing techniques used. This work presents the first such comprehensive survey of all BCI designs using electrical signal recordings published prior to January 2006. Detailed results from this survey are presented and discussed. The following key research questions are addressed: (1) what are the key signal processing components of a BCI, (2) what signal processing algorithms have been used in BCIs and (3) which signal processing techniques have received more attention?

  2. A modularized pulse programmer for NMR spectroscopy

    NASA Astrophysics Data System (ADS)

    Mao, Wenping; Bao, Qingjia; Yang, Liang; Chen, Yiqun; Liu, Chaoyang; Qiu, Jianqing; Ye, Chaohui

    2011-02-01

    A modularized pulse programmer for an NMR spectrometer is described. It consists of a networked PCI-104 single-board computer and a field programmable gate array (FPGA). The PCI-104 is dedicated to translate the pulse sequence elements from the host computer into 48-bit binary words and download these words to the FPGA, while the FPGA functions as a sequencer to execute these binary words. High-resolution NMR spectra obtained on a home-built spectrometer with four pulse programmers working concurrently demonstrate the effectiveness of the pulse programmer. Advantages of the module include (1) once designed it can be duplicated and used to construct a scalable NMR/MRI system with multiple transmitter and receiver channels, (2) it is a totally programmable system in which all specific applications are determined by software, and (3) it provides enough reserve for possible new pulse sequences.
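    The abstract does not give the 48-bit word encoding, but the translation step can be illustrated with a hypothetical field layout (opcode, channel flags, duration); every name and bit width below is an assumption for illustration only:

```python
# hypothetical 48-bit word layout (the paper does not publish its encoding):
# [47:40] opcode | [39:32] channel flags | [31:0] duration in clock ticks
def pack_word(opcode, flags, duration_ticks):
    assert 0 <= opcode < 1 << 8
    assert 0 <= flags < 1 << 8
    assert 0 <= duration_ticks < 1 << 32
    return (opcode << 40) | (flags << 32) | duration_ticks

def unpack_word(word):
    return word >> 40, (word >> 32) & 0xFF, word & 0xFFFFFFFF

word = pack_word(opcode=0x12, flags=0b0000_0101, duration_ticks=250_000)
```

Packing the timing into a wide fixed-width word is what lets the FPGA sequencer step through elements deterministically, independent of the host computer's latency.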

  3. Optimized 4-bit Quantum Reversible Arithmetic Logic Unit

    NASA Astrophysics Data System (ADS)

    Ayyoub, Slimani; Achour, Benslama

    2017-08-01

    Reversible logic has received great attention in recent years due to its ability to reduce power dissipation. The main purposes of designing reversible logic are to decrease the quantum cost, the depth of the circuits, and the number of garbage outputs. The arithmetic logic unit (ALU) is an important part of the central processing unit (CPU) as the execution unit. This paper presents a complete design of a new reversible arithmetic logic unit (ALU) that can be part of a programmable reversible computing device such as a quantum computer. The proposed ALU is based on a reversible low-power control unit and a full adder with small performance parameters, built from double Peres gates. The presented ALU can produce the largest number (28) of arithmetic and logic functions and has the lowest quantum cost and delay compared with existing designs.
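    The Peres gate at the heart of such designs maps (a, b, c) to (a, a⊕b, ab⊕c); a few lines suffice to check that it is reversible (a bijection on the 8 input states) and yields the half-adder sum and carry used in adder constructions:

```python
from itertools import product

def peres(a, b, c):
    """Peres gate: (a, b, c) -> (a, a XOR b, (a AND b) XOR c)."""
    return a, a ^ b, (a & b) ^ c

# reversibility check: the gate must permute the 8 possible input states
outputs = {peres(*bits) for bits in product((0, 1), repeat=3)}
assert len(outputs) == 8

# with c = 0 the gate yields the half-adder of a and b: sum a^b, carry a&b
_, s, carry = peres(1, 1, 0)
```

Reversibility (no two inputs mapping to one output) is exactly the property that, by Landauer's principle, allows computation without the energy cost of erasing information.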

  4. Prediction Error Representation in Individuals With Generalized Anxiety Disorder During Passive Avoidance.

    PubMed

    White, Stuart F; Geraci, Marilla; Lewis, Elizabeth; Leshin, Joseph; Teng, Cindy; Averbeck, Bruno; Meffert, Harma; Ernst, Monique; Blair, James R; Grillon, Christian; Blair, Karina S

    2017-02-01

    Deficits in reinforcement-based decision making have been reported in generalized anxiety disorder. However, the pathophysiology of these deficits is largely unknown; published studies have mainly examined adolescents, and the integrity of core functional processes underpinning decision making remains undetermined. In particular, it is unclear whether the representation of reinforcement prediction error (PE) (the difference between received and expected reinforcement) is disrupted in generalized anxiety disorder. This study addresses these issues in adults with the disorder. Forty-six unmedicated individuals with generalized anxiety disorder and 32 healthy comparison subjects group-matched on IQ, gender, and age performed a passive avoidance task while undergoing functional MRI. Data analyses were performed using a computational modeling approach. Behaviorally, individuals with generalized anxiety disorder showed impaired reinforcement-based decision making. Imaging results revealed that during feedback, individuals with generalized anxiety disorder relative to healthy subjects showed a reduced correlation between PE and activity within the ventromedial prefrontal cortex, ventral striatum, and other structures implicated in decision making. In addition, individuals with generalized anxiety disorder relative to healthy participants showed a reduced correlation between punishment PEs, but not reward PEs, and activity within the left and right lentiform nucleus/putamen. This is the first study to identify computational impairments during decision making in generalized anxiety disorder. PE signaling is significantly disrupted in individuals with the disorder and may lead to their decision-making deficits and excessive worry about everyday problems by disrupting the online updating ("reality check") of the current relationship between the expected values of current response options and the actual received rewards and punishments.
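    The computational-modeling approach rests on the prediction error δ = received − expected; a minimal delta-rule sketch (illustrative only, not the study's fitted model) shows punishment PEs shrinking as the expected value converges:

```python
def update(value, received, alpha=0.2):
    """Delta-rule update: move the expectation toward the received outcome."""
    prediction_error = received - value
    return value + alpha * prediction_error, prediction_error

# passive-avoidance style trials: responding to this stimulus is punished (-1)
value, errors = 0.0, []
for _ in range(30):
    value, pe = update(value, received=-1.0)
    errors.append(pe)
# the expected value converges to -1 and the punishment PEs shrink toward zero
```

In model-based fMRI analyses like the one above, the trial-by-trial `prediction_error` series is regressed against BOLD activity; a reduced correlation in striatal regions is what the study reports for punishment PEs in generalized anxiety disorder.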

  5. The "proactive" model of learning: Integrative framework for model-free and model-based reinforcement learning utilizing the associative learning-based proactive brain concept.

    PubMed

    Zsuga, Judit; Biro, Klara; Papp, Csaba; Tajti, Gabor; Gesztelyi, Rudolf

    2016-02-01

    Reinforcement learning (RL) is a powerful concept underlying forms of associative learning governed by the use of a scalar reward signal, with learning taking place if expectations are violated. RL may be assessed using model-based and model-free approaches. Model-based reinforcement learning involves the amygdala, the hippocampus, and the orbitofrontal cortex (OFC). The model-free system involves the pedunculopontine-tegmental nucleus (PPTgN), the ventral tegmental area (VTA), and the ventral striatum (VS). Based on the functional connectivity of the VS, both model-free and model-based RL systems center on the VS, which computes value by integrating model-free signals (received as reward prediction error) with model-based, reward-related input. Using the concept of a reinforcement learning agent, we propose that the VS serves as the value function component of the RL agent. Regarding the model utilized for model-based computations, we turned to the proactive brain concept, which offers a ubiquitous function for the default network based on its great functional overlap with contextual associative areas. Hence, by means of the default network the brain continuously organizes its environment into context frames, enabling the formulation of analogy-based associations that are turned into predictions of what to expect. The OFC integrates reward-related information into context frames upon computing reward expectation by compiling stimulus-reward and context-reward information offered by the amygdala and hippocampus, respectively. Furthermore, we suggest that the integration of model-based expectations regarding reward into the value signal is further supported by efferents of the OFC that reach structures canonical for model-free learning (e.g., the PPTgN, VTA, and VS). (c) 2016 APA, all rights reserved.

  6. VoxelStats: A MATLAB Package for Multi-Modal Voxel-Wise Brain Image Analysis.

    PubMed

    Mathotaarachchi, Sulantha; Wang, Seqian; Shin, Monica; Pascoal, Tharick A; Benedet, Andrea L; Kang, Min Su; Beaudry, Thomas; Fonov, Vladimir S; Gauthier, Serge; Labbe, Aurélie; Rosa-Neto, Pedro

    2016-01-01

    In healthy individuals, behavioral outcomes are highly associated with variability in regional brain structure or neurochemical phenotypes. Similarly, in the context of neurodegenerative conditions, neuroimaging reveals that cognitive decline is linked to the magnitude of atrophy, neurochemical declines, or concentrations of abnormal protein aggregates across brain regions. However, modeling the effects of multiple regional abnormalities as determinants of cognitive decline at the voxel level remains largely unexplored by multimodal imaging research, given the high computational cost of estimating regression models for every single voxel from various imaging modalities. VoxelStats is a voxel-wise computational framework designed to overcome these computational limitations and to perform statistical operations on multiple scalar variables and imaging modalities at the voxel level. The VoxelStats package has been developed in MATLAB® and supports imaging formats such as NIfTI-1, ANALYZE, and MINC v2. Prebuilt functions in VoxelStats enable the user to perform voxel-wise general and generalized linear models and mixed-effects models with multiple volumetric covariates. Importantly, VoxelStats can recognize scalar values or image volumes as response variables and can accommodate volumetric statistical covariates as well as their interaction effects with other variables. Furthermore, this package includes built-in functionality to perform voxel-wise receiver operating characteristic analysis and paired and unpaired group contrast analysis. Validation of VoxelStats was conducted by comparing its linear regression functionality with existing toolboxes such as glim_image and RMINC. The validation results were identical to those of existing methods, and the additional functionality was demonstrated by generating feature case assessments (t-statistics, odds ratio, and true positive rate maps).
In summary, VoxelStats expands the current methods for multimodal imaging analysis by allowing the estimation of advanced regional association metrics at the voxel level.
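The voxel-wise regression idea described above can be sketched in a few lines (a simplified illustration, not VoxelStats code; function and variable names are hypothetical): fit one linear model per voxel, relating a scalar response to that voxel's covariate values across subjects.

```python
# Sketch of voxel-wise linear modeling: one ordinary-least-squares fit per
# voxel, returning the t-statistic of the covariate's slope. Illustrative
# only; VoxelStats itself supports far richer models (GLMs, mixed effects).
import numpy as np

def voxelwise_tvalues(response, covariate_volumes):
    """response: (n_subjects,); covariate_volumes: (n_subjects, n_voxels)."""
    n, n_voxels = covariate_volumes.shape
    t = np.empty(n_voxels)
    for j in range(n_voxels):
        # Design matrix: intercept + this voxel's covariate values.
        X = np.column_stack([np.ones(n), covariate_volumes[:, j]])
        beta, *_ = np.linalg.lstsq(X, response, rcond=None)
        resid = response - X @ beta
        sigma2 = resid @ resid / (n - 2)                  # residual variance
        se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
        t[j] = beta[1] / se
    return t
```

The computational burden the abstract mentions comes from repeating such a fit at every voxel of every modality, which is why an optimized framework is needed.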

  7. Optimization of MLS receivers for multipath environments

    NASA Technical Reports Server (NTRS)

    Mcalpine, G. A.; Highfill, J. H., III; Tzeng, C. P. J.; Koleyni, G.

    1978-01-01

    An analysis of reduced-order (suboptimal) receivers in multipath environments is presented. The origin and objective of the MLS are described briefly. Signal modeling in the MLS and the optimum receiver are also covered, and a description is provided of a computer-oriented technique used in the simulation study of the suboptimal receiver. Results and conclusions obtained from the research on the suboptimal receiver are reported.

  8. Determining when a set of compute nodes participating in a barrier operation on a parallel computer are ready to exit the barrier operation

    DOEpatents

    Blocksome, Michael A [Rochester, MN

    2011-12-20

    Methods, apparatus, and products are disclosed for determining when a set of compute nodes participating in a barrier operation on a parallel computer are ready to exit the barrier operation that includes, for each compute node in the set: initializing a barrier counter with no counter underflow interrupt; configuring, upon entering the barrier operation, the barrier counter with a value in dependence upon a number of compute nodes in the set; broadcasting, by a DMA engine on the compute node to each of the other compute nodes upon entering the barrier operation, a barrier control packet; receiving, by the DMA engine from each of the other compute nodes, a barrier control packet; modifying, by the DMA engine, the value for the barrier counter in dependence upon each of the received barrier control packets; exiting the barrier operation if the value for the barrier counter matches the exit value.
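The counting scheme described in the claim can be sketched as a single-process simulation (assumed semantics, not the patented implementation: here each node's counter starts at the set size minus one, each received control packet decrements it, and a node may exit when the counter reaches zero):

```python
# Simplified sketch of a counter-based barrier: every node broadcasts one
# control packet to each other node on entry, and each received packet
# modifies (here: decrements) the receiver's barrier counter.

EXIT_VALUE = 0

def run_barrier(num_nodes):
    """Simulate one barrier round; True if every node may exit."""
    # On entering the barrier, each node configures its counter from the set size.
    counters = {node: num_nodes - 1 for node in range(num_nodes)}
    # Each node broadcasts a barrier control packet to every other node ...
    for sender in range(num_nodes):
        for receiver in range(num_nodes):
            # ... and each received packet decrements the receiver's counter.
            if receiver != sender:
                counters[receiver] -= 1
    # A node exits the barrier once its counter matches the exit value.
    return all(c == EXIT_VALUE for c in counters.values())
```

In the patented design this bookkeeping is performed by each node's DMA engine, so the barrier completes without processor involvement.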

  9. An Array of Optical Receivers for Deep-Space Communications

    NASA Technical Reports Server (NTRS)

    Vilnrotter, Chi-Wung; Srinivasan, Meera; Andrews, Kenneth

    2007-01-01

    An array of small optical receivers is proposed as an alternative to a single large optical receiver for high-data-rate communications in NASA's Deep Space Network (DSN). Because the telescope for a single receiver capable of satisfying DSN requirements must be greater than 10 m in diameter, the design, building, and testing of the telescope would be very difficult and expensive. The proposed array would utilize commercially available telescopes of 1-m or smaller diameter and, therefore, could be developed and verified with considerably less difficulty and expense. The essential difference between a single-aperture optical-communications receiver and an optical-array receiver is that a single-aperture receiver focuses all of the light energy it collects onto the surface of an optical detector, whereas an array receiver focuses portions of the total collected energy onto separate detectors, optically detects each fractional energy component, then combines the electrical signals from the array of detector outputs to form the observable, or "decision statistic," used to decode the transmitted data. A conceptual block diagram identifying the key components of an optical-array receiver suitable for deep-space telemetry reception is shown in the figure. The most conspicuous feature of the receiver is the large number of small- to medium-size telescopes, with individual apertures and number of telescopes selected to make up the desired total collecting area. This array of telescopes is envisioned to be fully computer-controlled via the user interface and prediction-driven to achieve rough pointing and tracking of the desired spacecraft. Fine-pointing and tracking functions then take over to keep each telescope pointed toward the source, despite imperfect pointing predictions, telescope-drive errors, and vibration caused by wind.
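The combining step can be illustrated with a toy sketch (all numbers and names hypothetical, not the DSN design): per-telescope detector counts are summed into one decision statistic, as if a single large aperture had collected the light, and a symbol decision is made on the sum.

```python
# Toy sketch of array combining: sum per-detector counts into a single
# decision statistic, then decide each slot (on-off keying assumed here
# purely for illustration) by comparing against a threshold.

def decision_statistic(detector_counts):
    """Combine the outputs of all detectors in the array."""
    return sum(detector_counts)

def decode_on_off_keying(slot_counts, threshold):
    """Decide 1 if the combined count in a slot exceeds the threshold."""
    return [1 if decision_statistic(counts) > threshold else 0
            for counts in slot_counts]
```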

  10. Current Trend Towards Using Soft Computing Approaches to Phase Synchronization in Communication Systems

    NASA Technical Reports Server (NTRS)

    Drake, Jeffrey T.; Prasad, Nadipuram R.

    1999-01-01

    This paper surveys recent advances in communications that utilize soft computing approaches to phase synchronization. Soft computing, as opposed to hard computing, is a collection of complementary methodologies that act together to produce the most desirable control, decision, or estimation strategies. Recently, the communications area has explored the use of the principal constituents of soft computing, namely, fuzzy logic, neural networks, and genetic algorithms, for modeling, control, and most recently for the estimation of phase in phase-coherent communications. If the receiver in a digital communications system is phase-coherent, as is often the case, phase synchronization is required. Synchronization thus requires estimation and/or control at the receiver of an unknown or random phase offset.

  11. Effecting a broadcast with an allreduce operation on a parallel computer

    DOEpatents

    Almasi, Gheorghe; Archer, Charles J.; Ratterman, Joseph D.; Smith, Brian E.

    2010-11-02

    A parallel computer comprises a plurality of compute nodes organized into at least one operational group for collective parallel operations. Each compute node is assigned a unique rank and is coupled for data communications through a global combining network. One compute node is assigned to be a logical root. A send buffer and a receive buffer are configured. Each element of the logical root's contribution is placed in the send buffer; one or more zeros corresponding to the size of the element are injected by the other nodes. An allreduce operation with a bitwise OR is performed using the element and the injected zeros, and the result of the allreduce operation is determined and stored in each receive buffer.
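The trick behind this patent can be sketched compactly (helper names are hypothetical; a real implementation would use the combining-network hardware, e.g. MPI_Allreduce with MPI_BOR): because `x | 0 == x`, an OR-allreduce in which only the root contributes data and every other rank contributes zeros delivers the root's data to every rank, effecting a broadcast.

```python
# Sketch: broadcast effected as an allreduce with bitwise OR.
# The logical root contributes its element; all other ranks inject zeros.

def allreduce_or(contributions):
    """Bitwise-OR reduction over one buffer element contributed per rank."""
    result = 0
    for value in contributions:
        result |= value
    return result

def broadcast_via_allreduce(root_value, num_ranks, root=0):
    """Deliver root_value to every rank using only an OR-allreduce."""
    contributions = [root_value if rank == root else 0
                     for rank in range(num_ranks)]
    result = allreduce_or(contributions)   # x | 0 == x, so the root's data survives
    return [result] * num_ranks            # each rank's receive buffer
```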

  12. KSC-99pp1228

    NASA Image and Video Library

    1999-10-06

    Children at Cambridge Elementary School, Cocoa, Fla., eagerly unwrap computer equipment donated by Kennedy Space Center. Cambridge is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the NASA K-12 Education Services Office at KSC. Behind the children is Jim Thurston, a school volunteer and retired employee of USBI, who shared in the project. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated.

  13. University students' notebook computer use.

    PubMed

    Jacobs, Karen; Johnson, Peter; Dennerlein, Jack; Peterson, Denise; Kaufman, Justin; Gold, Joshua; Williams, Sarah; Richmond, Nancy; Karban, Stephanie; Firn, Emily; Ansong, Elizabeth; Hudak, Sarah; Tung, Katherine; Hall, Victoria; Pencina, Karol; Pencina, Michael

    2009-05-01

    Recent evidence suggests that university students self-report experiencing musculoskeletal discomfort with computer use at levels similar to those reported by adult workers. The objective of this study was to determine how university students use notebook computers and what ergonomic strategies might be effective in reducing self-reported musculoskeletal discomfort in this population. Two hundred and eighty-nine university students, randomly assigned to one of three towers by the university's Office of Housing, participated in this study. The results of this investigation showed a significant reduction in self-reported notebook computer-related discomfort from pre- to post-survey in participants who received notebook computer accessories and in those who received accessories and participatory ergonomics training. A significant increase in post-survey rest breaks was seen. There was a significant correlation between self-reported computer usage and the amount measured using computer usage software (odometer). More research is needed, however, to determine the most effective ergonomics intervention for university students.

  14. Towards Seismic Tomography Based Upon Adjoint Methods

    NASA Astrophysics Data System (ADS)

    Tromp, J.; Liu, Q.; Tape, C.; Maggi, A.

    2006-12-01

    We outline the theory behind tomographic inversions based on 3D reference models, fully numerical 3D wave propagation, and adjoint methods. Our approach involves computing the Fréchet derivatives for tomographic inversions via the interaction between a forward wavefield, propagating from the source to the receivers, and an `adjoint' wavefield, propagating from the receivers back to the source. The forward wavefield is computed using a spectral-element method (SEM) and a heterogeneous wave-speed model, and stored as synthetic seismograms at particular receivers for which there are data. We specify an objective or misfit function that defines a measure of misfit between data and synthetics. For a given receiver, the differences between the data and the synthetics are time-reversed and used as the source of the adjoint wavefield. For each earthquake, the interaction between the regular and adjoint wavefields is used to construct finite-frequency sensitivity kernels, which we call event kernels. These kernels may be thought of as weighted sums of measurement-specific banana-donut kernels, with weights determined by the measurements. The overall sensitivity is simply the sum of the event kernels, which defines the misfit kernel. The misfit kernel is multiplied by convenient orthonormal basis functions that are embedded in the SEM code, resulting in the gradient of the misfit function, i.e., the Fréchet derivatives. A conjugate gradient algorithm is used to iteratively improve the model while reducing the misfit function. 
Using 2D examples for Rayleigh wave phase-speed maps of southern California, we illustrate the construction of the gradient and the minimization algorithm, and consider various tomographic experiments, including source inversions, structural inversions, and joint source-structure inversions. We also illustrate the characteristics of these 3D finite-frequency kernels based upon adjoint simulations for a variety of global arrivals, e.g., Pdiff, P'P', and SKS, and we illustrate how the approach may be used to investigate body- and surface-wave anisotropy. In adjoint tomography any time segment in which the data and synthetics match reasonably well is suitable for measurement, and this implies that a much greater number of phases per seismogram can be used compared to classical tomography, in which the sensitivity of the measurements is determined analytically for specific arrivals, e.g., P. We use an automated picking algorithm based upon short-term/long-term averages and strict phase and amplitude anomaly criteria to determine arrivals and time windows suitable for measurement. For shallow global events the algorithm typically identifies of the order of 1000 windows suitable for measurement, whereas for a deep event the number can reach 4000. For southern California earthquakes the number of phases is of the order of 100 for a magnitude 4.0 event and up to 450 for a magnitude 5.0 event. We will show examples of event kernels for both global and regional earthquakes. These event kernels form the basis of adjoint tomography.
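The short-term/long-term average (STA/LTA) picking idea mentioned above can be sketched as follows (window lengths, threshold, and the synthetic trace are illustrative, not the authors' settings): the ratio of a short-term average of signal energy to a long-term average rises sharply at the onset of an arrival.

```python
# Minimal causal STA/LTA sketch: both windows end at the current sample;
# an arrival is picked where the ratio first exceeds a threshold.
import numpy as np

def sta_lta_ratio(trace, n_sta, n_lta):
    """Ratio of short-term to long-term average signal energy."""
    energy = trace ** 2
    csum = np.concatenate([[0.0], np.cumsum(energy)])
    ratio = np.zeros(len(trace))
    for i in range(n_lta - 1, len(trace)):
        sta = (csum[i + 1] - csum[i + 1 - n_sta]) / n_sta
        lta = (csum[i + 1] - csum[i + 1 - n_lta]) / n_lta
        ratio[i] = sta / lta
    return ratio

def pick_onset(trace, n_sta=10, n_lta=50, threshold=3.0):
    """Index of the first sample where the STA/LTA ratio exceeds the threshold."""
    above = np.nonzero(sta_lta_ratio(trace, n_sta, n_lta) > threshold)[0]
    return int(above[0]) if above.size else None
```

In the workflow described above, each picked window then yields one measurement contributing to an event kernel.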

  15. Computer-assisted versus non-computer-assisted preoperative planning of corrective osteotomy for extra-articular distal radius malunions: a randomized controlled trial.

    PubMed

    Leong, Natalie L; Buijze, Geert A; Fu, Eric C; Stockmans, Filip; Jupiter, Jesse B

    2010-12-14

    Malunion is the most common complication of distal radius fracture. It has previously been demonstrated that there is a correlation between the quality of anatomical correction and overall wrist function. However, surgical correction can be difficult because of the often complex anatomy associated with this condition. Computer assisted surgical planning, combined with patient-specific surgical guides, has the potential to improve pre-operative understanding of patient anatomy as well as intra-operative accuracy. For patients with malunion of the distal radius fracture, this technology could significantly improve clinical outcomes that largely depend on the quality of restoration of normal anatomy. Therefore, the objective of this study is to compare patient outcomes after corrective osteotomy for distal radius malunion with and without preoperative computer-assisted planning and peri-operative patient-specific surgical guides. This study is a multi-center randomized controlled trial of conventional planning versus computer-assisted planning for surgical correction of distal radius malunion. Adult patients with extra-articular malunion of the distal radius will be invited to enroll in our study. After providing informed consent, subjects will be randomized to two groups: one group will receive corrective surgery with conventional preoperative planning, while the other will receive corrective surgery with computer-assisted pre-operative planning and peri-operative patient specific surgical guides. In the computer-assisted planning group, a CT scan of the affected forearm as well as the normal, contralateral forearm will be obtained. The images will be used to construct a 3D anatomical model of the defect and patient-specific surgical guides will be manufactured. Outcome will be measured by DASH and PRWE scores, grip strength, radiographic measurements, and patient satisfaction at 3, 6, and 12 months postoperatively. 
Computer-assisted surgical planning, combined with patient-specific surgical guides, is a powerful new technology that has the potential to improve the accuracy and consistency of orthopaedic surgery. To date, the role of this technology in upper extremity surgery has not been adequately investigated, and it is unclear whether its use provides any significant clinical benefit over traditional preoperative imaging protocols. Our study will represent the first randomized controlled trial investigating the use of computer-assisted surgery in corrective osteotomy for distal radius malunions. NCT01193010.

  16. Psychometric properties of the PROMIS Physical Function item bank in patients receiving physical therapy.

    PubMed

    Crins, Martine H P; van der Wees, Philip J; Klausch, Thomas; van Dulmen, Simone A; Roorda, Leo D; Terwee, Caroline B

    2018-01-01

    The Patient-Reported Outcomes Measurement Information System (PROMIS) is a universally applicable set of instruments, including item banks, short forms and computer adaptive tests (CATs), measuring patient-reported health across different patient populations. PROMIS CATs are highly efficient, and their use in practice is considered feasible with little administration time, offering standardized and routine patient monitoring. Before an item bank can be used as a CAT, its psychometric properties have to be examined. Therefore, the objective was to assess the psychometric properties of the Dutch-Flemish PROMIS Physical Function item bank (DF-PROMIS-PF) in Dutch patients receiving physical therapy. Cross-sectional study. 805 patients >18 years, who received any kind of physical therapy in primary care in the past year, completed the full DF-PROMIS-PF (121 items). Unidimensionality was examined by confirmatory factor analysis, and local dependence and monotonicity were evaluated. A graded response model was fitted. Construct validity was examined with correlations between DF-PROMIS-PF T-scores and scores on two legacy instruments (SF-36 Health Survey Physical Functioning scale [SF36-PF10] and the Health Assessment Questionnaire Disability Index [HAQ-DI]). Reliability (standard errors of theta) was assessed. The results for unidimensionality were mixed (scaled CFI = 0.924, TLI = 0.923, RMSEA = 0.045; the first factor explained 61.5% of variance). Some local dependence was found (8.2% of item pairs). The item bank showed broad coverage of the physical function construct (threshold parameters ranging from -4.28 to 2.33) and good construct validity (correlation with SF36-PF10 = 0.84 and HAQ-DI = -0.85). Furthermore, the DF-PROMIS-PF showed greater reliability over a broader score range than the SF36-PF10 and HAQ-DI. The psychometric properties of the DF-PROMIS-PF item bank are sufficient. 
The DF-PROMIS-PF can now be used as short forms or CAT to measure the level of physical function of physiotherapy patients.

  17. Suspension properties of whole blood and its components under glucose influence studied in patients with acute coronary syndrome

    NASA Astrophysics Data System (ADS)

    Malinova, Lidia I.; Simonenko, Georgy V.; Denisova, Tatyana P.; Dovgalevsky, Pavel Y.; Tuchin, Valery V.

    2004-05-01

    The protocol of our study included men with acute myocardial infarction, stable angina pectoris of functional classes II and III, and unstable angina pectoris. Patients with arterial hypertension or disorders of carbohydrate metabolism were excluded from the study. Blood samples, taken under standardized conditions, were stabilized with 3.8% sodium citrate (1:9). Erythrocyte and platelet aggregation activity under glucose influence (in vitro) was studied by means of a computer-aided microphotometer (a visual analyzer). Erythrocytes and platelets were treated as a special subsystem of whole blood, and the temporal and functional characteristics of their aggregation were analyzed by constructing fragments of phase patterns. The data obtained indicate an interrelation between erythrocyte and platelet aggregation processes as the glucose concentration of the incubation medium increases; their temporal and functional characteristics may be used for the diagnosis and prognosis of destabilized coronary blood flow in acute coronary syndrome.

  18. Intranasal Nerve Growth Factor administration improves cerebral functions in a child with severe traumatic brain injury: A case report.

    PubMed

    Chiaretti, Antonio; Conti, Giorgio; Falsini, Benedetto; Buonsenso, Danilo; Crasti, Matteo; Manni, Luigi; Soligo, Marzia; Fantacci, Claudia; Genovese, Orazio; Calcagni, Maria Lucia; Di Giuda, Daniela; Mattoli, Maria Vittoria; Cocciolillo, Fabrizio; Ferrara, Pietro; Ruggiero, Antonio; Staccioli, Susanna; Colafati, Giovanna Stefania; Riccardi, Riccardo

    2017-01-01

    Nerve growth factor (NGF) promotes neural recovery after experimental traumatic brain injury (TBI) supporting neuronal growth, differentiation and survival of brain cells and up-regulating the neurogenesis-associated protein Doublecortin (DCX). Only a few studies reported NGF administration in paediatric patients with severe TBI. A four-year-old boy in a persistent unresponsive wakefulness syndrome (UWS) was treated with intranasal murine NGF administration 6 months after severe TBI. The patient received four cycles of intranasal NGF (0.1 mg/kg, twice a day for 10 consecutive days). NGF administration improved functional [Positron Emission Tomography/Computed Tomography (PET/CT); Single photon emission/Computed Tomography (SPECT/CT) and Magnetic Resonance Imaging (MRI)] assessment, electrophysiological [Electroencephalogram (EEG) and Visual Evoked Potential (VEP)] studies and clinical conditions. He showed improvements in voluntary movements, facial mimicry, phonation, attention and verbal comprehension, ability to cry, cough reflex, oral motility, feeding capacity, and bowel and urinary functions. After NGF administration, raised levels of both NGF and DCX were found in the cerebrospinal fluid of the patient. No side effects were reported. Although further studies are needed for better understanding the neuroprotective role of this neurotrophin, intranasal NGF administration appears to be a promising and safe rescuing strategy treatment in children with neurological impairment after TBI.

  19. Efficacy of a Computer-Assisted Cognitive Rehabilitation Intervention in Relapsing-Remitting Multiple Sclerosis Patients: A Multicenter Randomized Controlled Trial

    PubMed Central

    Kosmidis, Mary H.; Zampakis, Petros; Malefaki, Sonia; Ntoskou, Katerina; Nousia, Anastasia; Bakirtzis, Christos; Papathanasopoulos, Panagiotis

    2017-01-01

    Cognitive impairment is frequently encountered in multiple sclerosis (MS), affecting 40–65% of individuals irrespective of disease duration and severity of physical disability. In the present multicenter randomized controlled trial, fifty-eight clinically stable relapsing-remitting MS (RRMS) patients with mild to moderate cognitive impairment and relatively low disability status were randomized to receive either computer-assisted (RehaCom) functional cognitive training with an emphasis on episodic memory, information processing speed/attention, and executive functions for 10 weeks (IG; n = 32) or standard clinical care (CG; n = 26). Outcome measures included a flexible comprehensive neuropsychological battery of tests sensitive to MS patient deficits and feedback regarding personal benefit gained from the intervention on four verbal questions. Only the IG showed significant improvements in verbal and visuospatial episodic memory, processing speed/attention, and executive functioning from pre- to post-assessment. Moreover, the improvement in attention was retained over 6 months, providing evidence of the long-term benefits of this intervention. Group-by-time interactions revealed significant improvements in composite cognitive domain scores in the IG relative to the demographically and clinically matched CG for verbal episodic memory, processing speed, verbal fluency, and attention. Treated patients rated the intervention positively and were more confident about their cognitive abilities following treatment. PMID:29463950

  20. Stanford/NASA-Ames Center of Excellence in model-based human performance

    NASA Technical Reports Server (NTRS)

    Wandell, Brian A.

    1990-01-01

    The human operator plays a critical role in many aeronautic and astronautic missions. The Stanford/NASA-Ames Center of Excellence in Model-Based Human Performance (COE) was initiated in 1985 to further our understanding of the performance capabilities and performance limits of the human component of aeronautic and astronautic projects. Support from the COE is devoted to those areas of experimental and theoretical work designed to summarize and explain human performance by developing computable performance models. The ultimate goal is to make these computable models available to other scientists for use in design and evaluation of aeronautic and astronautic instrumentation. Within vision science, two topics have received particular attention. First, researchers did extensive work analyzing the human ability to recognize object color relatively independent of the spectral power distribution of the ambient lighting (color constancy). The COE has supported a number of research papers in this area, as well as the development of a substantial database of surface reflectance functions, ambient illumination functions, and an associated software package for rendering and analyzing image data with respect to these spectral functions. Second, the COE supported new empirical studies on the problem of selecting colors for visual display equipment to enhance human performance in discrimination and recognition tasks.

  1. A Carrier Estimation Method Based on MLE and KF for Weak GNSS Signals.

    PubMed

    Zhang, Hongyang; Xu, Luping; Yan, Bo; Zhang, Hua; Luo, Liyan

    2017-06-22

    Maximum likelihood estimation (MLE) has been researched for acquisition and tracking applications of global navigation satellite system (GNSS) receivers and shows high performance. However, all current methods are derived from and operated on the sampling data, which results in a large computation burden. This paper proposes a low-complexity MLE carrier tracking loop for weak GNSS signals that processes the coherent integration results instead of the sampling data. First, the cost function of the MLE of signal parameters such as signal amplitude, carrier phase, and Doppler frequency is used to derive an MLE discriminator function. The optimal value of the cost function is searched iteratively by an efficient Levenberg-Marquardt (LM) method. Its performance, including the Cramér-Rao bound (CRB), dynamic characteristics, and computation burden, is analyzed by numerical techniques. Second, an adaptive Kalman filter is designed for the MLE discriminator to obtain smooth estimates of carrier phase and frequency. The performance of the proposed loop, in terms of sensitivity, accuracy, and bit error rate, is compared with that of conventional methods by Monte Carlo (MC) simulations in both pedestrian-level and vehicle-level dynamic circumstances. Finally, an optimal loop that combines the proposed and conventional methods is designed to achieve optimal performance in both weak and strong signal circumstances.
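The filtering stage can be illustrated with a generic sketch (not the paper's adaptive design; the state model, noise variances, and function name are assumptions): a two-state Kalman filter with state [phase, Doppler frequency] smooths noisy phase measurements under a constant-frequency dynamics model.

```python
# Generic two-state Kalman filter sketch for carrier phase/frequency
# smoothing. Parameters are illustrative, not from the paper.
import numpy as np

def track_phase(measurements, dt, meas_var=0.1, process_var=1e-4):
    """Return [phase, frequency] estimates for each phase measurement."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # phase advances by frequency*dt
    H = np.array([[1.0, 0.0]])              # we observe phase only
    Q = process_var * np.eye(2)             # process noise covariance
    x = np.zeros(2)                         # initial [phase, frequency]
    P = np.eye(2)                           # initial state covariance
    estimates = []
    for z in measurements:
        x = F @ x                           # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + meas_var          # innovation covariance
        K = (P @ H.T) / S                   # Kalman gain, shape (2, 1)
        x = x + (K * (z - H @ x)).ravel()   # correct with the measurement
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x.copy())
    return np.array(estimates)
```

In the proposed loop, the role of the raw measurement here would be played by the MLE discriminator output computed from coherent integration results.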

  2. Evaluation of the Effectiveness of the Storage and Distribution Entry-Level Computer-Based Training (CBT) Program

    DTIC Science & Technology

    1990-09-01

    learning occurs when this final link is made into long-term memory (13:79). Cognitive scientists realize the role of the trainee as a passive receiver of...of property on the computer, and when they did, this piece of paperwork printed out on their printer. Someone from the receiving section brought this

  3. Unified algorithm of cone optics to compute solar flux on central receiver

    NASA Astrophysics Data System (ADS)

    Grigoriev, Victor; Corsi, Clotilde

    2017-06-01

    Analytical algorithms to compute the flux distribution on a central receiver are considered a faster alternative to ray tracing. Many modifications of these algorithms exist, with HFLCAL and UNIZAR being the most recognized and verified. In this work, a generalized algorithm is presented that is valid for an arbitrary sun shape of radial symmetry. Heliostat mirrors can have a nonrectangular profile, and the effects of shading and blocking, strong defocusing, and astigmatism can be taken into account. The algorithm is suitable for parallel computing and can benefit from hardware acceleration of polygon texturing.

  4. Climate@Home: Crowdsourcing Climate Change Research

    NASA Astrophysics Data System (ADS)

    Xu, C.; Yang, C.; Li, J.; Sun, M.; Bambacus, M.

    2011-12-01

    Climate change deeply impacts human wellbeing. Significant amounts of resources have been invested in building super-computers that are capable of running advanced climate models, which help scientists understand climate change mechanisms and predict its trend. Although climate change influences all human beings, the general public is largely excluded from the research. On the other hand, scientists are eagerly seeking communication mediums for effectively enlightening the public on climate change and its consequences. The Climate@Home project is devoted to connecting the two ends with an innovative solution: crowdsourcing climate computing to the general public by harvesting volunteered computing resources from the participants. A distributed web-based computing platform will be built to support climate computing, and the general public can 'plug in' their personal computers to participate in the research. People contribute the spare computing power of their computers to run a climate model that is used by scientists to predict climate change. Traditionally, only super-computers could handle such a large processing load. By orchestrating massive numbers of personal computers to perform atomized data processing tasks, investments in new super-computers, the energy they consume, and the carbon they release are reduced. Meanwhile, the platform forms a social network of climate researchers and the general public, which may be leveraged to raise climate awareness among the participants. A portal is to be built as the gateway to the Climate@Home project. Three types of roles and the corresponding functionalities are designed and supported. The end users include citizen participants, climate scientists, and project managers. Citizen participants connect their computing resources to the platform by downloading and installing a computing engine on their personal computers. Climate models are defined at the server side. 
Climate scientists configure computer model parameters through the portal user interface. After model configuration, scientists then launch the computing task. Next, data is atomized and distributed to computing engines that are running on citizen participants' computers. Scientists will receive notifications on the completion of computing tasks, and examine modeling results via visualization modules of the portal. Computing tasks, computing resources, and participants are managed by project managers via portal tools. A portal prototype has been built for proof of concept. Three forums have been setup for different groups of users to share information on science aspect, technology aspect, and educational outreach aspect. A facebook account has been setup to distribute messages via the most popular social networking platform. New treads are synchronized from the forums to facebook. A mapping tool displays geographic locations of the participants and the status of tasks on each client node. A group of users have been invited to test functions such as forums, blogs, and computing resource monitoring.
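    The task-atomization scheme described above can be sketched as a coordinator that splits a job into work units and hands them to volunteered engines. The class and function names below are hypothetical illustrations, not the Climate@Home API; the "engine" here just averages its chunk as a stand-in for a climate model step.

    ```python
    # Illustrative sketch of atomized task distribution to volunteer engines.
    # Names and the chunk-averaging workload are assumptions for illustration.
    from queue import Queue

    class Coordinator:
        """Splits a modeling job into atomized work units and collects results."""
        def __init__(self, data, chunk_size):
            self.tasks = Queue()
            for i in range(0, len(data), chunk_size):
                self.tasks.put(data[i:i + chunk_size])
            self.results = []

        def run_with_volunteers(self, engines):
            # Each volunteer engine claims the next work unit, processes it
            # locally, and reports the partial result back.
            while not self.tasks.empty():
                for engine in engines:
                    if self.tasks.empty():
                        break
                    self.results.append(engine(self.tasks.get()))
            return self.results

    # A stand-in "computing engine": here it just averages its chunk.
    def engine(chunk):
        return sum(chunk) / len(chunk)

    coordinator = Coordinator(data=list(range(100)), chunk_size=25)
    partials = coordinator.run_with_volunteers([engine, engine])
    print(partials)  # one partial result per 25-element chunk
    ```

    A real deployment would distribute the units over the network and tolerate engines disappearing mid-task; this sketch only shows the atomize-dispatch-collect shape of the design.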

  5. Fusing literature and full network data improves disease similarity computation.

    PubMed

    Li, Ping; Nie, Yaling; Yu, Jingkai

    2016-08-30

    Identifying relatedness among diseases could help deepen understanding of the underlying pathogenic mechanisms of diseases and facilitate drug repositioning projects. A number of methods for computing disease similarity have been developed; however, none of them was designed to utilize information from the entire protein interaction network, using instead only those interactions involving disease-causing genes. Most previously published methods required gene-disease association data; unfortunately, many diseases still have few or no associated genes, which has impeded broad adoption of those methods. In this study, we propose a new method (MedNetSim) for computing disease similarity by integrating medical literature and the protein interaction network. MedNetSim consists of a network-based method (NetSim), which employs the entire protein interaction network, and a MEDLINE-based method (MedSim), which computes disease similarity by mining the biomedical literature. Among function-based methods, NetSim achieved the best performance; its average AUC (area under the receiver operating characteristic curve) reached 95.2 %. MedSim, whose performance was comparable even to some function-based methods, acquired the highest average AUC among all semantic-based methods. Integration of MedSim and NetSim (MedNetSim) further improved the average AUC to 96.4 %. We further studied the effectiveness of different data sources. It was found that the quality of protein interaction data was more important than its volume. In contrast, a higher volume of gene-disease association data was more beneficial, even at lower reliability. Utilizing a higher volume of disease-related gene data further improved the average AUC of MedNetSim and NetSim to 97.5 % and 96.7 %, respectively. Integrating biomedical literature and the protein interaction network can be an effective way to compute disease similarity.
When sufficient disease-related gene data are lacking, literature-based methods such as MedSim can be a valuable complement to function-based algorithms. It may be beneficial to steer more resources toward studying gene-disease associations and improving the quality of protein interaction data. Disease similarities can be computed using the proposed methods at http://www.digintelli.com:8000/ .
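    The fusion-and-evaluation idea above can be illustrated with a small sketch: combine a literature-based and a network-based similarity score, then measure each by AUC. The simple averaging fusion rule and the toy scores are assumptions for illustration, not the published MedNetSim method.

    ```python
    # Toy sketch: fuse two disease-similarity scores and compare AUCs.
    # The average-based fusion and the example data are illustrative only.

    def auc(scores, labels):
        """Rank-based AUC (Mann-Whitney U divided by n_pos * n_neg)."""
        pos = [s for s, y in zip(scores, labels) if y == 1]
        neg = [s for s, y in zip(scores, labels) if y == 0]
        wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
        return wins / (len(pos) * len(neg))

    def fuse(med_sim, net_sim):
        """Combine literature-based and network-based similarity scores."""
        return [(m + n) / 2 for m, n in zip(med_sim, net_sim)]

    # Toy disease pairs: label 1 = known-related, 0 = unrelated.
    labels  = [1, 1, 0, 0, 1, 0]
    med_sim = [0.9, 0.4, 0.3, 0.2, 0.7, 0.5]
    net_sim = [0.8, 0.7, 0.2, 0.4, 0.6, 0.1]
    fused   = fuse(med_sim, net_sim)
    print(auc(med_sim, labels), auc(net_sim, labels), auc(fused, labels))
    ```

    In this toy example the fused score ranks every related pair above every unrelated one, mirroring the qualitative finding that integration improves AUC.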

  6. Privacy authentication using key attribute-based encryption in mobile cloud computing

    NASA Astrophysics Data System (ADS)

    Mohan Kumar, M.; Vijayan, R.

    2017-11-01

    Mobile cloud computing is becoming more popular as the number of smartphone users grows, so the security level of cloud computing has to be increased. Privacy authentication using key-attribute-based encryption helps users share data with organizations through the cloud in a secure manner, supporting business development. With privacy authentication, the sender of the data grants access to a chosen list of receivers; for all others, access is denied. In the sender application, the user chooses the file to be sent to the receivers, and the data is then encrypted using key-attribute-based encryption with the AES algorithm. The resulting ciphertext is stored in the Amazon cloud along with the key value and the receiver list.
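    The sender-side flow described above (encrypt, store with key and receiver list, deny access to anyone not listed) can be sketched as follows. To stay self-contained, this uses a toy SHA-256 keystream cipher as a stand-in for AES, and the record layout is an assumption; a real system would use an actual AES implementation and a proper attribute-based encryption library.

    ```python
    # Conceptual sketch of receiver-list-gated encryption.
    # The keystream cipher below is a TOY stand-in for AES, not real AES.
    import hashlib

    def keystream_xor(key: bytes, data: bytes) -> bytes:
        """Symmetric XOR with a SHA-256-derived keystream (toy cipher)."""
        out, counter = bytearray(), 0
        while len(out) < len(data):
            out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
            counter += 1
        return bytes(b ^ k for b, k in zip(data, out))

    def send(plaintext: bytes, key: bytes, receivers: set) -> dict:
        # The "cloud record": ciphertext stored together with the key value
        # and the receiver list, mirroring the storage layout described above.
        return {"cipher": keystream_xor(key, plaintext),
                "key": key, "receivers": receivers}

    def receive(record: dict, user: str) -> bytes:
        if user not in record["receivers"]:
            raise PermissionError("access denied: not on the receiver list")
        return keystream_xor(record["key"], record["cipher"])

    record = send(b"quarterly report", b"shared-secret", {"alice", "bob"})
    print(receive(record, "alice"))  # decrypts for a listed receiver
    ```

    Note that storing the key beside the ciphertext, as sketched here for simplicity, is only safe if the store itself enforces the access check; attribute-based encryption exists precisely to enforce it cryptographically instead.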

  7. Multi-ball and one-ball geolocation and location verification

    NASA Astrophysics Data System (ADS)

    Nelson, D. J.; Townsend, J. L.

    2017-05-01

    We present analysis methods that may be used to geolocate emitters using one or more moving receivers. While some of the methods we present may apply to a broader class of signals, our primary interest is locating and tracking ships from short pulsed transmissions, such as the maritime Automatic Identification System (AIS). The AIS signal is difficult to process and track since the pulse duration is only 25 milliseconds, and the pulses may only be transmitted every six to ten seconds. Several fundamental problems are addressed, including demodulation of AIS/GMSK signals, verification of the emitter location, accurate frequency and delay estimation, and identification of pulse trains from the same emitter. In particular, we present several new correlation methods, including cross-cross correlation, which greatly improves correlation accuracy over conventional methods, and cross-TDOA and cross-FDOA functions that make it possible to estimate time and frequency delay without the need to compute a two-dimensional cross-ambiguity surface. By isolating pulses from the same emitter and accurately tracking the received signal frequency, we are able to accurately estimate the emitter location from the received Doppler characteristics.
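    The building block behind the TDOA techniques discussed above is delay estimation by cross-correlation. The sketch below shows that basic step on a toy sampled signal; the paper's cross-cross and cross-TDOA refinements are not reproduced here, and the tone signal is an illustrative assumption.

    ```python
    # Minimal delay estimation by cross-correlation (TDOA building block).
    import math

    def estimate_delay(ref, rcv, max_lag):
        """Return the lag (in samples) maximizing the cross-correlation
        of the received signal rcv against the reference ref."""
        def corr(lag):
            return sum(ref[n] * rcv[n + lag]
                       for n in range(len(ref))
                       if 0 <= n + lag < len(rcv))
        return max(range(-max_lag, max_lag + 1), key=corr)

    ref = [math.sin(0.3 * n) for n in range(200)]
    delay = 7
    rcv = [0.0] * delay + ref  # received copy arrives 7 samples late
    print(estimate_delay(ref, rcv, max_lag=20))  # 7
    ```

    With two receivers, the difference of two such delays against a common reference gives the TDOA; sub-sample accuracy would require interpolating around the correlation peak.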

  8. Doppler-shift estimation of flat underwater channel using data-aided least-square approach

    NASA Astrophysics Data System (ADS)

    Pan, Weiqiang; Liu, Ping; Chen, Fangjiong; Ji, Fei; Feng, Jing

    2015-06-01

    In this paper we propose a data-aided Doppler estimation method for underwater acoustic communication. The training sequence is not dedicated, hence it can be designed for Doppler estimation as well as channel equalization. We assume the channel has been equalized and consider only a flat-fading channel. First, the theoretical received sequence is composed from the training symbols. Next, the least-squares principle is applied to build the objective function, which minimizes the error between the composed and the actual received signal. An iterative approach is then applied to solve the least-squares problem. The proposed approach involves an outer loop and an inner loop, which resolve the channel gain and the Doppler coefficient, respectively. The theoretical performance bound, i.e., the Cramer-Rao lower bound (CRLB) of the estimation, is also derived. Computer simulation results show that the proposed algorithm achieves the CRLB in medium- to high-SNR cases.
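    The least-squares idea above can be sketched in a few lines: for each candidate Doppler coefficient, solve the channel gain in closed form and keep the candidate that minimizes the squared error between the composed and received sequences. A simple grid search stands in for the paper's nested iterative solver, and the single-tone signal model is an illustrative assumption.

    ```python
    # Sketch of data-aided Doppler estimation by least squares.
    # Grid search replaces the paper's iterative outer/inner loops.
    import math

    def compose(doppler, n_samples, f=0.05):
        """Theoretical received sequence for a tone under a Doppler scaling."""
        return [math.cos(2 * math.pi * f * (1 + doppler) * n)
                for n in range(n_samples)]

    def estimate(received, candidates):
        best = None
        for d in candidates:
            c = compose(d, len(received))
            # Closed-form least-squares channel gain for this candidate.
            gain = sum(r * x for r, x in zip(received, c)) / sum(x * x for x in c)
            err = sum((r - gain * x) ** 2 for r, x in zip(received, c))
            if best is None or err < best[0]:
                best = (err, d, gain)
        return best[1], best[2]  # (Doppler estimate, gain estimate)

    true_d, true_gain = 1e-3, 0.8
    rx = [true_gain * x for x in compose(true_d, 400)]
    cands = [k * 1e-4 for k in range(-20, 21)]
    d_hat, g_hat = estimate(rx, cands)
    print(d_hat, g_hat)
    ```

    In a noisy setting the minimum-error candidate only approximates the true coefficient, which is where the CRLB comparison in the abstract becomes meaningful.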

  9. Costas loop lock detection in the advanced receiver

    NASA Technical Reports Server (NTRS)

    Mileant, A.; Hinedi, S.

    1989-01-01

    The advanced receiver currently being developed uses a Costas digital loop to demodulate the subcarrier. Previous analyses of lock detector algorithms for Costas loops have ignored the effects of the inherent correlation between the samples of the phase-error process. Accounting for this correlation is necessary to achieve the desired lock-detection probability for a given false-alarm rate. Both analysis and simulations are used to quantify the effects of phase correlation on lock detection for the square-law and the absolute-value type detectors. Results are obtained which depict the lock-detection probability as a function of loop signal-to-noise ratio for a given false-alarm rate. The mathematical model and computer simulation show that the square-law detector experiences less degradation due to phase jitter than the absolute-value detector and that the degradation in detector signal-to-noise ratio is more pronounced for square-wave than for sine-wave signals.
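    The two detector statistics compared above can be illustrated directly from I/Q samples. This sketch models only the basic in-lock versus out-of-lock behavior with independent phase jitter; the correlated phase-error process that the paper analyzes is deliberately not modeled, and the sample counts and jitter level are assumptions.

    ```python
    # Illustrative square-law vs absolute-value Costas lock detectors.
    # Phase-error samples here are independent; the paper's point is that
    # real samples are correlated, which this toy model omits.
    import math, random

    def square_law(iq):       # statistic ~ E[I^2 - Q^2]
        return sum(i * i - q * q for i, q in iq) / len(iq)

    def absolute_value(iq):   # statistic ~ E[|I| - |Q|]
        return sum(abs(i) - abs(q) for i, q in iq) / len(iq)

    random.seed(1)

    def samples(phase_std, n=5000):
        # In lock the phase error is small: I ~ cos(err) near 1,
        # Q ~ sin(err) near 0.
        return [(math.cos(e), math.sin(e))
                for e in (random.gauss(0.0, phase_std) for _ in range(n))]

    in_lock = samples(phase_std=0.1)
    no_lock = [(math.cos(p), math.sin(p))
               for p in (random.uniform(-math.pi, math.pi) for _ in range(5000))]
    print(square_law(in_lock), square_law(no_lock))  # near 1 vs near 0
    ```

    Thresholding either statistic gives a lock decision; the paper's contribution is quantifying how phase-sample correlation shifts the detection probability for a fixed false-alarm rate, which this independent-sample sketch cannot show.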

  10. Musculoskeletal symptoms in workers of a Telecom Company.

    PubMed

    Antunes, Evelise Dias; de Araújo, Célia Regina Alves; Abage, Zilda

    2012-01-01

    Millions of people work with computers every day. Modern work tools provide comfort and ease in performing tasks, but they also favor incorrect postures. Among the employees of a telecom company, all use computers and remain in a seated position, leading to musculoskeletal symptoms. This is a quantitative study conducted in a telecom company in the city of Curitiba. Twenty-seven analysts working in the engineering department, who agreed to participate and to receive guidelines regarding sitting posture, were interviewed. The study instrument consisted of a structured questionnaire and the Nordic Musculoskeletal Questionnaire, with its musculoskeletal symptom indicators. After answering the questionnaire, the subjects were guided through a folder on correct posture and positioning in front of the computer. Of the employees, 74% were male, and 100% were computer users who remained in a sitting posture during the working day. Concerning breaks, 74% reported taking them frequently; the average working day was 8 hours. Regarding the frequency of musculoskeletal symptoms, 70% of employees reported some symptoms during the last 12 months, but only one case led to sick leave. An analysis of the work situation is necessary, evaluating and correcting inadequacies and the risks inherent in the function, with a view to prevention.

  11. Thinking about a reader's mind: fostering communicative clarity in the compositions of youth with autism spectrum disorders.

    PubMed

    Grossman, Michael; Peskin, Joan; San Juan, Valerie

    2013-10-01

    A critical component of effective communication is the ability to consider the knowledge state of one's audience, yet individuals with autism spectrum disorders (ASD) have difficulty representing the mental states of others. In the present study, youth with high-functioning ASD were trained to consider their reader's knowledge states in their compositions using a novel computer-based task. After two training trials, participants who received visual feedback from a confederate demonstrated significantly greater communicative clarity on the training measure compared to a control group. The improvements from training transferred to similar and very different tasks, and were maintained approximately 6 weeks post-intervention. These results provide support for the sustained efficacy of a rapid and motivating communication intervention for youth with high-functioning ASD.

  12. Support vector machine multiuser receiver for DS-CDMA signals in multipath channels.

    PubMed

    Chen, S; Samingan, A K; Hanzo, L

    2001-01-01

    The problem of constructing an adaptive multiuser detector (MUD) is considered for direct sequence code division multiple access (DS-CDMA) signals transmitted through multipath channels. The emerging learning technique, called support vector machines (SVM), is proposed as a method of obtaining a nonlinear MUD from a relatively small training data block. Computer simulation is used to study this SVM MUD, and the results show that it can closely match the performance of the optimal Bayesian one-shot detector. Comparisons with an adaptive radial basis function (RBF) MUD trained by an unsupervised clustering algorithm are discussed.

  13. Shuttle GPS R/PA configuration and specification study

    NASA Technical Reports Server (NTRS)

    Booth, R. W. D.

    1979-01-01

    Changes in the technical specifications for a global positioning system (GPS) receiving system dedicated to space shuttle use are presented. Various hardware functions including acquisition, tracking, and measurement are emphasized. The anti-jam performance of the baseline GPS systems is evaluated. Other topics addressed include: the impact on R/PA design of the use of ground-based transmitters; problems involved with the use of single-channel test sets; the utility of various R/PA antenna interconnection topologies; the choice of the averaging interval for delta-range measurements; and the use of interferometry techniques for the computation of orbiter attitude.

  14. Rim inertial measuring system

    NASA Technical Reports Server (NTRS)

    Groom, N. J.; Anderson, W. W.; Phillips, W. H. (Inventor)

    1981-01-01

    The invention includes an angular momentum control device (AMCD) having a rim and several magnetic bearing stations. The AMCD is in a strapped down position on a spacecraft. Each magnetic bearing station comprises means, including an axial position sensor, for controlling the position of the rim in the axial direction; and means, including a radial position sensor, for controlling the position of the rim in the radial direction. A first computer receives the signals from all the axial position sensors and computes the angular rates about first and second mutually perpendicular axes in the plane of the rim and computes the linear acceleration along a third axis perpendicular to the first and second axes. A second computer receives the signals from all the radial position sensors and computes the linear accelerations along the first and second axes.
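    The sensor-fusion idea above, combining all axial position sensors to separate rim tilt about the two in-plane axes from common-mode axial motion, can be sketched geometrically. The four-sensor layout and small-angle model below are illustrative assumptions, not the patented design; the angular rates the patent computes would follow by differencing such tilt estimates over time.

    ```python
    # Geometric sketch: axial sensors around a rim separate tilt from
    # axial translation. Layout and small-angle model are assumptions.
    import math

    def decompose_axial(readings, radius):
        """readings: axial displacements at sensors placed at angles
        0, 90, 180, 270 degrees around the rim."""
        z0, z90, z180, z270 = readings
        tilt_about_y = (z0 - z180) / (2 * radius)   # tilt about one in-plane axis
        tilt_about_x = (z90 - z270) / (2 * radius)  # tilt about the other
        translation_z = sum(readings) / 4           # common-mode axial motion
        return tilt_about_x, tilt_about_y, translation_z

    # Rim tilted 1 milliradian about x, plus 0.2 units of axial translation:
    r = 0.5
    zs = [0.2 + r * math.sin(a) * 1e-3
          for a in (0.0, math.pi / 2, math.pi, 3 * math.pi / 2)]
    print(decompose_axial(zs, r))
    ```

    The radial sensors would be combined analogously to recover in-plane accelerations, as the second computer in the invention does.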

  15. Handheld computers in critical care.

    PubMed

    Lapinsky, S E; Weshler, J; Mehta, S; Varkul, M; Hallett, D; Stewart, T E

    2001-08-01

    Computing technology has the potential to improve health care management but is often underutilized. Handheld computers are versatile and relatively inexpensive, bringing the benefits of computers to the bedside. We evaluated the role of this technology for managing patient data and accessing medical reference information, in an academic intensive-care unit (ICU). Palm III series handheld devices were given to the ICU team, each installed with medical reference information, schedules, and contact numbers. Users underwent a 1-hour training session introducing the hardware and software. Various patient data management applications were assessed during the study period. Qualitative assessment of the benefits, drawbacks, and suggestions was performed by an independent company, using focus groups. An objective comparison between a paper and electronic handheld textbook was achieved using clinical scenario tests. During the 6-month study period, the 20 physicians and 6 paramedical staff who used the handheld devices found them convenient and functional but suggested more comprehensive training and improved search facilities. Comparison of the handheld computer with the conventional paper text revealed equivalence. Access to computerized patient information improved communication, particularly with regard to long-stay patients, but changes to the software and the process were suggested. The introduction of this technology was well received despite differences in users' familiarity with the devices. Handheld computers have potential in the ICU, but systems need to be developed specifically for the critical-care environment.

  16. Handheld computers in critical care

    PubMed Central

    Lapinsky, Stephen E; Weshler, Jason; Mehta, Sangeeta; Varkul, Mark; Hallett, Dave; Stewart, Thomas E

    2001-01-01

    Background Computing technology has the potential to improve health care management but is often underutilized. Handheld computers are versatile and relatively inexpensive, bringing the benefits of computers to the bedside. We evaluated the role of this technology for managing patient data and accessing medical reference information, in an academic intensive-care unit (ICU). Methods Palm III series handheld devices were given to the ICU team, each installed with medical reference information, schedules, and contact numbers. Users underwent a 1-hour training session introducing the hardware and software. Various patient data management applications were assessed during the study period. Qualitative assessment of the benefits, drawbacks, and suggestions was performed by an independent company, using focus groups. An objective comparison between a paper and electronic handheld textbook was achieved using clinical scenario tests. Results During the 6-month study period, the 20 physicians and 6 paramedical staff who used the handheld devices found them convenient and functional but suggested more comprehensive training and improved search facilities. Comparison of the handheld computer with the conventional paper text revealed equivalence. Access to computerized patient information improved communication, particularly with regard to long-stay patients, but changes to the software and the process were suggested. Conclusions The introduction of this technology was well received despite differences in users' familiarity with the devices. Handheld computers have potential in the ICU, but systems need to be developed specifically for the critical-care environment. PMID:11511337

  17. Simulation and control of a 20 kHz spacecraft power system

    NASA Technical Reports Server (NTRS)

    Wasynczuk, O.; Krause, P. C.

    1988-01-01

    A detailed computer representation of four Mapham inverters connected in a series-parallel arrangement has been implemented. System performance is illustrated by computer traces for the four Mapham inverters connected to a Litz cable with parallel resistance and dc receiver loads at the receiving end of the transmission cable. Methods of voltage control and load sharing between the inverters are demonstrated. Also, the detailed computer representation is used to design and to demonstrate the advantages of a feed-forward voltage control strategy. It is illustrated that with a computer simulation of this type, the performance and control of spacecraft power systems may be investigated with relative ease.

  18. Performing an allreduce operation using shared memory

    DOEpatents

    Archer, Charles J [Rochester, MN; Dozsa, Gabor [Ardsley, NY; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN

    2012-04-17

    Methods, apparatus, and products are disclosed for performing an allreduce operation using shared memory that include: receiving, by at least one of a plurality of processing cores on a compute node, an instruction to perform an allreduce operation; establishing, by the core that received the instruction, a job status object for specifying a plurality of shared memory allreduce work units, the plurality of shared memory allreduce work units together performing the allreduce operation on the compute node; determining, by an available core on the compute node, a next shared memory allreduce work unit in the job status object; and performing, by that available core on the compute node, that next shared memory allreduce work unit.
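    The claimed scheme can be sketched with threads standing in for processing cores: the core that receives the allreduce instruction posts a job status object listing shared-memory work units, and whichever cores become available claim and perform the next unit. The one-element-slice work-unit granularity and the sum reduction below are illustrative choices, not the patented implementation.

    ```python
    # Threaded sketch of a shared-memory allreduce via claimable work units.
    # Threads stand in for processing cores on one compute node.
    import threading

    class JobStatus:
        """Job status object: pending work units plus a claim lock."""
        def __init__(self, units):
            self.units = list(units)
            self.lock = threading.Lock()

        def next_unit(self):
            with self.lock:
                return self.units.pop(0) if self.units else None

    def allreduce_sum(per_core_data):
        n = len(per_core_data[0])
        result = [0] * n          # shared buffer all cores read afterwards
        job = JobStatus(range(n)) # one work unit per element slice

        def core_worker():
            # An available core claims the next unit and reduces that slice.
            while (i := job.next_unit()) is not None:
                result[i] = sum(data[i] for data in per_core_data)

        threads = [threading.Thread(target=core_worker) for _ in range(4)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        return result

    print(allreduce_sum([[1, 2, 3], [10, 20, 30], [100, 200, 300]]))
    ```

    Because each work unit writes a distinct slice of the shared result, no lock is needed on the buffer itself, only on the unit-claiming step.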

  19. Performing an allreduce operation using shared memory

    DOEpatents

    Archer, Charles J; Dozsa, Gabor; Ratterman, Joseph D; Smith, Brian E

    2014-06-10

    Methods, apparatus, and products are disclosed for performing an allreduce operation using shared memory that include: receiving, by at least one of a plurality of processing cores on a compute node, an instruction to perform an allreduce operation; establishing, by the core that received the instruction, a job status object for specifying a plurality of shared memory allreduce work units, the plurality of shared memory allreduce work units together performing the allreduce operation on the compute node; determining, by an available core on the compute node, a next shared memory allreduce work unit in the job status object; and performing, by that available core on the compute node, that next shared memory allreduce work unit.

  20. How Receptive Are Patients With Late Stage Cancer to Rehabilitation Services and What Are the Sources of Their Resistance?

    PubMed

    Cheville, Andrea L; Rhudy, Lori; Basford, Jeffrey R; Griffin, Joan M; Flores, Ann Marie

    2017-02-01

    To describe the proportion and characteristics of patients with late stage cancer that are and are not receptive to receiving rehabilitation services, and the rationale for their level of interest. Prospective mixed-methods study. Comprehensive cancer center in a quaternary medical center. Adults with stage IIIC or IV non-small cell or extensive stage small cell lung cancer (N=311). Not applicable. Telephone-acquired responses to the administration of (1) the Activity Measure for Post Acute Care Computer Adaptive Test (AM-PAC-CAT); (2) numerical rating scales for pain, dyspnea, fatigue, general emotional distress, and distress associated with functional limitations; (3) a query regarding receptivity to receipt of rehabilitation services, and (4) a query about rationale for nonreceptivity. Overall, 99 (31.8%) of the study's 311 participants expressed interest in receiving rehabilitation services: 38 at the time of enrollment and an additional 61 during at least 1 subsequent contact. Participants expressing interest were more likely to have a child as primary caregiver (18.18% vs 9.91%, P=.04) and a musculoskeletal comorbidity (42.4% vs 31.6%, P=.05). Function-related distress was highly associated with receptivity, as were lower AM-PAC-CAT scores. Reasons provided for lack of interest in receiving services included a perception of their limited benefit, being too busy, and prioritization below more pressing tasks/concerns. One-third of patients with late stage lung cancer are likely to be interested in receiving rehabilitation services despite high levels of disability and related distress. These findings suggest that patient misperception of the role of rehabilitation services may be a barrier to improved function and quality of life. Efforts to educate patients on the benefits of rehabilitation and to more formally integrate rehabilitation as part of comprehensive care may curb these missed opportunities. Copyright © 2016. Published by Elsevier Inc.

  1. How receptive are patients with late stage cancer to rehabilitation services and what are the sources of their resistance?

    PubMed Central

    Cheville, Andrea L.; Rhudy, Lori; Basford, Jeffrey R.; Griffin, Joan M.; Flores, Ann Marie

    2017-01-01

    Objective To describe the proportion and characteristics of patients with late stage cancer that are and are not receptive to receiving rehabilitation services, as well as the rationale for their level of interest. Setting A comprehensive cancer center in a Northcentral US quaternary medical center. Design A prospective mixed methods study. Participants 311 adults with Stage IIIC or IV non-small cell or extensive stage small cell lung cancer. Interventions Not applicable. Main Outcome Measures Telephone acquired responses to the administration of: 1) the Activity Measure for Post Acute Care Computer Adaptive Test (AM-PAC-CAT); 2) numerical rating scales for pain, dyspnea, fatigue, general emotional distress, and distress associated with functional limitations; 3) a query regarding receptivity to receipt of rehabilitation services; and 4) a query about rationale for non-receptivity. Results Overall 99 (31.8%) of the study’s 311 participants expressed interest in receiving rehabilitation services; 38 at the time of enrollment and an additional 61 during at least one subsequent contact. Participants expressing interest were more likely to have a child as primary caregiver (18.18% vs. 9.91%, p = 0.04) and a musculoskeletal comorbidity (42.4% vs. 31.6%, p = 0.05). Function-related distress was highly associated with receptivity, as were lower AM-PAC-CAT scores. Reasons provided for lack of interest in receiving services included a perception of their limited benefit, being too busy, and prioritization below more pressing tasks/concerns. Conclusions One-third of patients with late stage lung cancer are likely to be interested in receiving rehabilitation services despite high levels of disability and related distress. These findings suggest that patient misperception of the role of rehabilitation services may be a barrier to improved function and quality of life.
Efforts to educate patients on the benefits of rehabilitation and to more formally integrate rehabilitation as part of comprehensive care may curb these missed opportunities. PMID:27592401

  2. Effectiveness of a Computer-Based Training Program of Attention and Memory in Patients with Acquired Brain Damage

    PubMed Central

    Fernandez, Elizabeth; Bergado Rosado, Jorge A.; Rodriguez Perez, Daymi; Salazar Santana, Sonia; Torres Aguilar, Maydane; Bringas, Maria Luisa

    2017-01-01

    Many training programs have been designed using modern software to restore impaired cognitive functions in patients with acquired brain damage (ABD). The objective of this study was to evaluate the effectiveness of a computer-based training program of attention and memory in patients with ABD, using a two-armed parallel group design in which the experimental group (n = 50) received cognitive stimulation using RehaCom software and the control group (n = 30) received standard (non-computerized) cognitive stimulation for eight weeks. In order to assess possible cognitive changes after the treatment, a pre-post experimental design was employed using the following neuropsychological tests: the Wechsler Memory Scale (WMS) and Trail Making Test A and B. The effectiveness of the training procedure was statistically significant (p < 0.05) when performance on these scales was compared before and after the training period, both within each patient and between the two groups. The training group showed statistically significant (p < 0.001) changes in focused attention (Trail A), two subtests (digit span and logical memory), and the overall score of the WMS. Finally, we discuss the advantages of computerized training rehabilitation and further directions for this line of work. PMID:29301194

  3. A Functional Analytic Approach To Computer-Interactive Mathematics

    PubMed Central

    2005-01-01

    Following a pretest, 11 participants who were naive with regard to various algebraic and trigonometric transformations received an introductory lecture regarding the fundamentals of the rectangular coordinate system. Following the lecture, they took part in a computer-interactive matching-to-sample procedure in which they received training on particular formula-to-formula and formula-to-graph relations as these formulas pertain to reflections and vertical and horizontal shifts. In training A-B, standard formulas served as samples and factored formulas served as comparisons. In training B-C, factored formulas served as samples and graphs served as comparisons. Subsequently, the program assessed for mutually entailed B-A and C-B relations as well as combinatorially entailed C-A and A-C relations. After all participants demonstrated mutual entailment and combinatorial entailment, we employed a test of novel relations to assess 40 different and complex variations of the original training formulas and their respective graphs. Six of 10 participants who completed training demonstrated perfect or near-perfect performance in identifying novel formula-to-graph relations. Three of the 4 participants who made more than three incorrect responses during the assessment of novel relations showed some commonality among their error patterns. Derived transfer of stimulus control using mathematical relations is discussed. PMID:15898471

  4. Router Agent Technology for Policy-Based Network Management

    NASA Technical Reports Server (NTRS)

    Chow, Edward T.; Sudhir, Gurusham; Chang, Hsin-Ping; James, Mark; Liu, Yih-Chiao J.; Chiang, Winston

    2011-01-01

    This innovation can be run as a standalone network application on any computer in a networked environment. This design can be configured to control one or more routers (one instance per router), and can also be configured to listen to a policy server over the network to receive new policies based on the policy-based network management technology. The Router Agent Technology transforms the received policies into suitable Access Control List syntax for the routers it is configured to control. It commits the newly generated access control lists to the routers and provides feedback regarding any errors encountered. The innovation also automatically generates a time-stamped log file recording all updates to the router it is configured to control. This technology, once installed on a local network computer and started, is autonomous: it keeps listening for new policies from the policy server, transforms those policies into router-compliant access lists, and commits those access lists to a specified interface on the specified router on the network, with error feedback on the commit process. The stand-alone application is named RouterAgent and is currently realized as a fully functional (version 1) implementation for the Windows operating system and for CISCO routers.
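    The policy-to-ACL transformation described above can be sketched as a small translator from an abstract policy structure to access-list text. Both the policy schema and the generated IOS-style syntax below are simplified illustrations, not the actual RouterAgent formats.

    ```python
    # Toy sketch of transforming abstract policies into ACL-style syntax.
    # The policy dict schema and output format are illustrative assumptions.

    def policy_to_acl(name, rules):
        """Translate abstract policy rules into extended-ACL style lines."""
        lines = [f"ip access-list extended {name}"]
        for rule in rules:
            lines.append(" {action} {proto} {src} {dst} eq {port}".format(**rule))
        lines.append(" deny ip any any")  # explicit default-deny tail
        return "\n".join(lines)

    policy = [
        {"action": "permit", "proto": "tcp", "src": "10.0.0.0 0.0.0.255",
         "dst": "any", "port": 443},
    ]
    print(policy_to_acl("WEB-POLICY", policy))
    ```

    The real agent would additionally push the generated lists to the router over a management session and log each commit with a timestamp, as the abstract describes.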

  5. Three Dimensional Computer Graphics Federates for the 2012 Smackdown Simulation

    NASA Technical Reports Server (NTRS)

    Fordyce, Crystal; Govindaiah, Swetha; Muratet, Sean; O'Neil, Daniel A.; Schricker, Bradley C.

    2012-01-01

    The Simulation Interoperability Standards Organization (SISO) Smackdown is a two-year-old annual event held at the 2012 Spring Simulation Interoperability Workshop (SIW). A primary objective of the Smackdown event is to provide college students with hands-on experience in developing distributed simulations using the High Level Architecture (HLA). Participating for the second time, the University of Alabama in Huntsville (UAHuntsville) deployed four federates: two simulated a communications server and a lunar communications satellite with a radio, and the other two generated 3D computer graphics displays for the communication satellite constellation and for the surface-based lunar resupply mission. Using the Light-Weight Java Graphics Library, the satellite display federate presented a lunar-texture-mapped sphere of the moon and four Telemetry Data Relay Satellites (TDRS), which received object attributes from the lunar communications satellite federate to drive their motion. The surface mission display federate was an enhanced version of the federate developed by ForwardSim, Inc. for the 2011 Smackdown simulation. Enhancements included a dead-reckoning algorithm and a visual indication of which communication satellite was in line of sight of Hadley Rille. This paper concentrates on these two federates by describing the functions, algorithms, HLA object attributes received from other federates, development experiences, and recommendations for future participating Smackdown teams.

  6. A functional analytic approach to computer-interactive mathematics.

    PubMed

    Ninness, Chris; Rumph, Robin; McCuller, Glen; Harrison, Carol; Ford, Angela M; Ninness, Sharon K

    2005-01-01

    Following a pretest, 11 participants who were naive with regard to various algebraic and trigonometric transformations received an introductory lecture regarding the fundamentals of the rectangular coordinate system. Following the lecture, they took part in a computer-interactive matching-to-sample procedure in which they received training on particular formula-to-formula and formula-to-graph relations as these formulas pertain to reflections and vertical and horizontal shifts. In training A-B, standard formulas served as samples and factored formulas served as comparisons. In training B-C, factored formulas served as samples and graphs served as comparisons. Subsequently, the program assessed for mutually entailed B-A and C-B relations as well as combinatorially entailed C-A and A-C relations. After all participants demonstrated mutual entailment and combinatorial entailment, we employed a test of novel relations to assess 40 different and complex variations of the original training formulas and their respective graphs. Six of 10 participants who completed training demonstrated perfect or near-perfect performance in identifying novel formula-to-graph relations. Three of the 4 participants who made more than three incorrect responses during the assessment of novel relations showed some commonality among their error patterns. Derived transfer of stimulus control using mathematical relations is discussed.

  7. Training Older Adults to Use Tablet Computers: Does It Enhance Cognitive Function?

    PubMed

    Chan, Micaela Y; Haber, Sara; Drew, Linda M; Park, Denise C

    2016-06-01

    Recent evidence shows that engaging in learning new skills improves episodic memory in older adults. In this study, older adults who were computer novices were trained to use a tablet computer and associated software applications. We hypothesize that sustained engagement in this mentally challenging training would yield a dual benefit of improved cognition and enhancement of everyday function by introducing useful skills. A total of 54 older adults (age 60-90) committed 15 hr/week for 3 months. Eighteen participants received extensive iPad training, learning a broad range of practical applications. The iPad group was compared with 2 separate controls: a Placebo group that engaged in passive tasks requiring little new learning; and a Social group that had regular social interaction, but no active skill acquisition. All participants completed the same cognitive battery pre- and post-engagement. Compared with both controls, the iPad group showed greater improvements in episodic memory and processing speed but did not differ in mental control or visuospatial processing. iPad training improved cognition relative to engaging in social or nonchallenging activities. Mastering relevant technological devices has the added advantage of providing older adults with technological skills useful in facilitating everyday activities (e.g., banking). This work informs the selection of targeted activities for future interventions and community programs. © The Author 2014. Published by Oxford University Press on behalf of The Gerontological Society of America.

  8. Integrated Computer Controlled Glow Discharge Tube

    NASA Astrophysics Data System (ADS)

    Kaiser, Erik; Post-Zwicker, Andrew

    2002-11-01

    An "Interactive Plasma Display" was created for the Princeton Plasma Physics Laboratory to demonstrate the characteristics of plasma to various science education outreach programs. From high school students and teachers to undergraduate students and visitors to the lab, the plasma device will be a key component in advancing the public's basic knowledge of plasma physics. The device is fully computer controlled using LabVIEW, a touchscreen Graphical User Interface (GUI), and a GPIB interface. Utilizing a feedback loop, the display is fully autonomous in controlling pressure, as well as in monitoring the safety aspects of the apparatus. With a digital convectron gauge continuously monitoring pressure, the computer interface analyzes the input signals while making changes to a digital flow controller. This function works independently of the GUI, allowing the user to simply input and receive a desired pressure quickly, easily, and intuitively. The discharge tube is a 36" × 4" id glass cylinder with a 3" side port. A 3000 V, 10 mA power supply is used to break down the plasma. A 300-turn solenoid was created to demonstrate the magnetic pinching of a plasma. All primary functions of the device are controlled through the GUI digital controllers. This configuration allows operators to safely control the pressure (100 mTorr-1 Torr), the magnetic field (0-90 Gauss, 7 A, 10 V), and the voltage applied across the electrodes (0-3000 V, 10 mA).

  9. Training Older Adults to Use Tablet Computers: Does It Enhance Cognitive Function?

    PubMed Central

    Chan, Micaela Y.; Haber, Sara; Drew, Linda M.; Park, Denise C.

    2016-01-01

    Purpose of the Study: Recent evidence shows that engaging in learning new skills improves episodic memory in older adults. In this study, older adults who were computer novices were trained to use a tablet computer and associated software applications. We hypothesize that sustained engagement in this mentally challenging training would yield a dual benefit of improved cognition and enhancement of everyday function by introducing useful skills. Design and Methods: A total of 54 older adults (age 60-90) committed 15 hr/week for 3 months. Eighteen participants received extensive iPad training, learning a broad range of practical applications. The iPad group was compared with 2 separate controls: a Placebo group that engaged in passive tasks requiring little new learning; and a Social group that had regular social interaction, but no active skill acquisition. All participants completed the same cognitive battery pre- and post-engagement. Results: Compared with both controls, the iPad group showed greater improvements in episodic memory and processing speed but did not differ in mental control or visuospatial processing. Implications: iPad training improved cognition relative to engaging in social or nonchallenging activities. Mastering relevant technological devices has the added advantage of providing older adults with technological skills useful in facilitating everyday activities (e.g., banking). This work informs the selection of targeted activities for future interventions and community programs. PMID:24928557

  10. Imaging Ruptured Lithosphere Beneath the Arabian Peninsula Using S-wave Receiver Functions

    NASA Astrophysics Data System (ADS)

    Hansen, S. E.; Rodgers, A. J.; Schwartz, S. Y.; Al-Amri, A. M.

    2006-12-01

    The lithospheric thickness beneath the Arabian Peninsula has important implications for understanding the tectonic processes associated with continental rifting along the Red Sea. However, estimates of the lithospheric thickness are limited by the lack of high-resolution seismic observations sampling the lithosphere-asthenosphere boundary (LAB). The S-wave receiver function technique allows point determinations of Moho and LAB depths by identifying S-to-P conversions from these discontinuities beneath individual seismic stations. This method is superior to P-wave receiver functions for identifying the LAB because P-to-S multiple reverberations from shallower discontinuities (such as the Moho) often mask the direct conversion from the LAB, while S-to-P boundary conversions arrive earlier than the direct S phase and all multiples arrive later. We interpret crustal and lithospheric structure across the entire Arabian Peninsula from S-wave receiver functions computed at 29 stations from four different seismic networks. Generally, both the Moho and the LAB are shallowest near the Red Sea and become deeper towards the Arabian interior. Near the coast, the Moho depth increases from about 12 to 35 km, with a few exceptions showing a deeper Moho beneath stations that are situated on higher topography in the Asir Province. The crustal thickening continues until an average depth of about 40-45 km is reached over both the central Arabian Shield and Platform. The LAB near the coast is at a depth of about 50 km, increases rapidly, and reaches an average maximum depth of about 120 km beneath the Arabian Shield. At the Shield-Platform boundary, a distinct step is observed in the lithospheric thickness where the LAB depth increases to about 160 km. This step may reflect remnant lithospheric thickening associated with the Shield's accretion onto the Platform and has an important role in guiding asthenospheric flow beneath the eastern margin of the Red Sea.
This work was performed in part under the auspices of the U.S. Department of Energy by University of California, Lawrence Livermore National Laboratory under contract W-7405-Eng-48.
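    Receiver functions of the kind described are typically obtained by deconvolving one component of ground motion by another to isolate the boundary conversions. A generic water-level deconvolution sketch follows; this is a textbook simplification under assumed names, not the processing chain used in the study:

    ```python
    import numpy as np

    def receiver_function(numerator, denominator, water_level=0.001):
        """Frequency-domain deconvolution with water-level regularization.

        Returns r such that numerator ~= r (*) denominator (circular
        convolution). For an S-wave receiver function the P component
        would be deconvolved by the S component to isolate S-to-P
        conversions (illustrative sketch only).
        """
        n = len(numerator)
        N = np.fft.rfft(numerator)
        D = np.fft.rfft(denominator)
        power = np.abs(D) ** 2
        floor = water_level * power.max()   # fill spectral holes
        return np.fft.irfft(N * np.conj(D) / np.maximum(power, floor), n)
    ```

    The water level keeps near-zero spectral bins of the denominator from amplifying noise; lower values sharpen the result at the cost of stability.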

  11. Receiver-Coupling Schemes Based On Optimal-Estimation Theory

    NASA Technical Reports Server (NTRS)

    Kumar, Rajendra

    1992-01-01

    Two schemes for reception of weak radio signals conveying digital data via phase modulation provide for mutual coupling of multiple receivers and coherent combination of the receiver outputs. In both schemes, the optimal mutual-coupling weights are computed according to Kalman-filter theory; the schemes differ in the manner in which the receiver outputs are transmitted and combined.

  12. Optical Communication with Semiconductor Laser Diode. Interim Progress Report. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Davidson, Frederic; Sun, Xiaoli

    1989-01-01

    Theoretical and experimental performance limits of a free-space direct detection optical communication system were studied using a semiconductor laser diode as the optical transmitter and a silicon avalanche photodiode (APD) as the receiver photodetector. Optical systems using these components are under consideration as replacements for microwave satellite communication links. Optical pulse position modulation (PPM) was chosen as the signal format. An experimental system was constructed that used an aluminum gallium arsenide semiconductor laser diode as the transmitter and a silicon avalanche photodiode photodetector. The system used Q=4 PPM signaling at a source data rate of 25 megabits per second. The PPM signal format requires regeneration of PPM slot clock and word clock waveforms in the receiver. A nearly exact computational procedure was developed to compute receiver bit error rate without using the Gaussian approximation. A transition detector slot clock recovery system using a phase lock loop was developed and implemented. A novel word clock recovery system was also developed. It was found that the results of the nearly exact computational procedure agreed well with actual measurements of receiver performance. The receiver sensitivity achieved was the closest to the quantum limit yet reported for an optical communication system of this type.
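    To illustrate what an exact (non-Gaussian) PPM error computation looks like, the sketch below evaluates the word-error probability of an ideal Poisson photon-counting receiver. This is an assumption-laden simplification: it omits the APD gain and excess-noise statistics the report actually modelled, and all names are illustrative.

    ```python
    from math import comb, exp

    def poisson_pmf_list(nmax, mean):
        """P(K = 0..nmax-1) for K ~ Poisson(mean), computed iteratively
        to avoid overflowing factorials."""
        p = exp(-mean)
        out = [p]
        for k in range(1, nmax):
            p *= mean / k
            out.append(p)
        return out

    def ppm_word_error(Q, Ks, Kb, nmax=200):
        """Exact Q-ary PPM word-error probability for an ideal Poisson
        photon-counting receiver: mean Ks signal photons in the pulsed
        slot, Kb background photons per slot, ties broken at random.
        Illustrative only; not the report's APD-based model.
        """
        ps = poisson_pmf_list(nmax, Ks + Kb)   # pulsed-slot counts
        pb = poisson_pmf_list(nmax, Kb)        # empty-slot counts
        cdf_below, p_correct = 0.0, 0.0
        for n in range(nmax):
            tie = pb[n]
            # j of the Q-1 empty slots tie the pulsed slot's count n;
            # the receiver then guesses correctly with prob 1/(j+1).
            p_correct += ps[n] * sum(
                comb(Q - 1, j) * tie**j * cdf_below**(Q - 1 - j) / (j + 1)
                for j in range(Q)
            )
            cdf_below += pb[n]
        return 1.0 - p_correct
    ```

    With no background light (Kb = 0), an error can only occur when zero signal photons arrive and the random tie-break picks a wrong slot, giving the classic quantum-limit expression exp(-Ks)·(1 - 1/Q).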

  13. Impact of field number and beam angle on functional image-guided lung cancer radiotherapy planning

    NASA Astrophysics Data System (ADS)

    Tahir, Bilal A.; Bragg, Chris M.; Wild, Jim M.; Swinscoe, James A.; Lawless, Sarah E.; Hart, Kerry A.; Hatton, Matthew Q.; Ireland, Rob H.

    2017-09-01

    To investigate the effect of beam angles and field number on functionally-guided intensity modulated radiotherapy (IMRT) normal lung avoidance treatment plans that incorporate hyperpolarised helium-3 magnetic resonance imaging (3He MRI) ventilation data. Eight non-small cell lung cancer patients had pre-treatment 3He MRI that was registered to inspiration breath-hold radiotherapy planning computed tomography. IMRT plans that minimised the volume of total lung receiving ⩾20 Gy (V20) were compared with plans that minimised 3He MRI defined functional lung receiving ⩾20 Gy (fV20). Coplanar IMRT plans using 5-field manually optimised beam angles and 9-field equidistant plans were also evaluated. For each pair of plans, the Wilcoxon signed ranks test was used to compare fV20 and the percentage of planning target volume (PTV) receiving 90% of the prescription dose (PTV90). Incorporation of 3He MRI led to median reductions in fV20 of 1.3% (range: 0.2-9.3%; p = 0.04) and 0.2% (range: 0-4.1%; p = 0.012) for 5- and 9-field arrangements, respectively. There was no clinically significant difference in target coverage. Functionally-guided IMRT plans incorporating hyperpolarised 3He MRI information can reduce the dose received by ventilated lung without compromising PTV coverage. The effect was greater for optimised beam angles than for uniformly spaced fields.

  14. Evaluation of a Mobile Phone for Aircraft GPS Interference

    NASA Technical Reports Server (NTRS)

    Nguyen, Truong X.

    2004-01-01

    Measurements of spurious emissions from a mobile phone are conducted in a reverberation chamber for the Global Positioning System (GPS) radio frequency band. This phone model was previously determined to have caused interference to several aircraft GPS receivers. Interference path loss (IPL) factors are applied to the emission data, and the outcome is compared against GPS receiver susceptibility. The resulting negative safety margins indicate there are risks to aircraft GPS systems. The maximum emission level from the phone is also shown to be comparable with the emissions of some laptop computers, implying that laptop computers can pose similar risks to aircraft GPS receivers.

  15. Using cross correlations of turbulent flow-induced ambient vibrations to estimate the structural impulse response. Application to structural health monitoring.

    PubMed

    Sabra, Karim G; Winkel, Eric S; Bourgoyne, Dwayne A; Elbing, Brian R; Ceccio, Steve L; Perlin, Marc; Dowling, David R

    2007-04-01

    It has been demonstrated theoretically and experimentally that an estimate of the impulse response (or Green's function) between two receivers can be obtained from the cross correlation of diffuse wave fields at these two receivers in various environments and frequency ranges: ultrasonics, civil engineering, underwater acoustics, and seismology. This result provides a means for structural monitoring using ambient structure-borne noise only, without the use of active sources. This paper presents experimental results obtained from flow-induced random vibration data recorded by pairs of accelerometers mounted within a flat plate or hydrofoil in the test section of the U.S. Navy's William B. Morgan Large Cavitation Channel. The experiments were conducted at high Reynolds number (Re > 50 million) with the primary excitation source being turbulent boundary layer pressure fluctuations on the upper and lower surfaces of the plate or foil. Identical deterministic time signatures emerge from the noise cross-correlation function computed via robust and simple processing of noise measured on different days by a pair of passive sensors. These time signatures are used to determine and/or monitor the structural response of the test models from a few hundred to a few thousand Hertz.
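    The processing this record describes, robust averaging of segment-wise cross-correlations of simultaneous noise records, can be sketched as follows. The function and segmenting scheme are illustrative assumptions, not the authors' exact pipeline:

    ```python
    import numpy as np

    def noise_cross_correlation(a, b, seg_len, max_lag):
        """Estimate the deterministic (Green's-function-like) part of the
        cross-correlation between two sensors by averaging segment-wise
        cross-correlations of ambient noise records a and b."""
        n_seg = len(a) // seg_len
        lags = np.arange(-max_lag, max_lag + 1)
        acc = np.zeros(len(lags))
        for i in range(n_seg):
            sa = a[i * seg_len:(i + 1) * seg_len]
            sb = b[i * seg_len:(i + 1) * seg_len]
            sa = sa - sa.mean()
            sb = sb - sb.mean()
            full = np.correlate(sa, sb, mode="full")  # lags -(L-1)..L-1
            mid = seg_len - 1                          # zero-lag index
            acc += full[mid - max_lag: mid + max_lag + 1]
        return lags, acc / n_seg
    ```

    As the number of averaged segments grows, random noise contributions cancel while the propagation time signature between the two sensor positions emerges, which is what allows monitoring without active sources.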

  16. Cognitive side-effects of electroconvulsive therapy in elderly depressed patients.

    PubMed

    Dybedal, Gro Strømnes; Tanum, Lars; Sundet, Kjetil; Gaarden, Torfinn Lødøen; Bjølseth, Tor Magne

    2014-01-01

    Knowledge about cognitive side-effects induced by electroconvulsive therapy (ECT) in depressed elderly patients is sparse. In this study we investigated changes in the cognitive functioning of non-demented elderly depressed patients receiving ECT (n = 62) compared with healthy elderly people (n = 17). Neuropsychological tests were administered at the start of treatment and again within 1 week after treatment. We computed reliable change indices (RCIs) using simple regression methods. RCIs are statistical methods for analyzing change in individuals that have not yet been used in studies of the acute cognitive side-effects of ECT. At the group level, only letter fluency performance was found to be significantly reduced in the ECT group compared with the controls, whereas both groups demonstrated stable or improved performance on all other measures. At the individual level, however, 11% of patients showed retrograde amnesia for public facts post-ECT and 40% of the patients showed a significant decline in neuropsychological functioning. Decline on a measure of delayed verbal anterograde memory was most common. Our findings indicate that there are mild neurocognitive impairments in the acute phase for a substantial minority of elderly patients receiving ECT. Analysis of reliable change facilitated the illumination of cognitive side-effects in our sample.
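    A regression-based reliable change index of the general kind described can be sketched as below. This is an illustrative form (regress posttest on pretest in the comparison group, then standardize each patient's residual); the study's exact model and cutoffs are not reproduced here:

    ```python
    import numpy as np

    def regression_rci(ctrl_pre, ctrl_post, pre, post):
        """Regression-based reliable change index (illustrative sketch).

        A linear regression of posttest on pretest scores in the control
        group predicts the patient's expected posttest score; the RCI is
        the patient's residual divided by the standard error of estimate.
        |RCI| > 1.96 is conventionally taken as reliable change (5% level).
        """
        slope, intercept = np.polyfit(ctrl_pre, ctrl_post, 1)
        predicted_ctrl = slope * np.asarray(ctrl_pre) + intercept
        resid = np.asarray(ctrl_post) - predicted_ctrl
        see = np.sqrt(np.sum(resid**2) / (len(ctrl_pre) - 2))
        return (post - (slope * pre + intercept)) / see
    ```

    Because the prediction comes from the healthy comparison group, practice effects and regression to the mean are absorbed into the expected posttest score rather than being misread as decline.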

  17. On pp wave limit for η deformed superstrings

    NASA Astrophysics Data System (ADS)

    Roychowdhury, Dibakar

    2018-05-01

    In this paper, based on the notion of plane-wave string/gauge theory duality, we explore the pp-wave limit associated with the bosonic sector of η-deformed superstrings propagating in (AdS5 × S5)η. Our analysis reveals that in the presence of NS-NS and RR fluxes, the pp-wave limit associated to the full ABF background satisfies the type IIB equations in their standard form. However, the beta functions as well as the string Hamiltonian start receiving nontrivial curvature corrections as one probes beyond the pp-wave limit, which takes the solutions away from the standard type IIB form. Furthermore, using uniform gauge, we also explore the BMN dynamics associated with short strings and compute the corresponding Hamiltonian density. Finally, we explore the Penrose limit associated with the HT background and compute the corresponding stringy spectrum for the bosonic sector.

  18. Flashing characters with famous faces improves ERP-based brain-computer interface performance

    NASA Astrophysics Data System (ADS)

    Kaufmann, T.; Schulz, S. M.; Grünzinger, C.; Kübler, A.

    2011-10-01

    Currently, the event-related potential (ERP)-based spelling device, often referred to as P300-Speller, is the most commonly used brain-computer interface (BCI) for enhancing communication of patients with impaired speech or motor function. Among numerous improvements, a most central feature has received little attention, namely optimizing the stimulus used for eliciting ERPs. Therefore we compared P300-Speller performance with the standard stimulus (flashing characters) against performance with stimuli known for eliciting particularly strong ERPs due to their psychological salience, i.e. flashing familiar faces transparently superimposed on characters. Our results not only indicate remarkably increased ERPs in response to familiar faces but also improved P300-Speller performance due to a significant reduction of stimulus sequences needed for correct character classification. These findings demonstrate a promising new approach for improving the speed and thus fluency of BCI-enhanced communication with the widely used P300-Speller.

  19. The benefits of the Atlas of Human Cardiac Anatomy website for the design of cardiac devices.

    PubMed

    Spencer, Julianne H; Quill, Jason L; Bateman, Michael G; Eggen, Michael D; Howard, Stephen A; Goff, Ryan P; Howard, Brian T; Quallich, Stephen G; Iaizzo, Paul A

    2013-11-01

    This paper describes how the Atlas of Human Cardiac Anatomy website can be used to improve cardiac device design throughout the process of development. The Atlas is a free-access website featuring novel images of both functional and fixed human cardiac anatomy from over 250 human heart specimens. This website provides numerous educational tutorials on anatomy, physiology and various imaging modalities. For instance, the 'device tutorial' provides examples of devices that were either present at the time of in vitro reanimation or were subsequently delivered, including leads, catheters, valves, annuloplasty rings and stents. Another section of the website displays 3D models of the vasculature, blood volumes and/or tissue volumes reconstructed from computed tomography and magnetic resonance images of various heart specimens. The website shares library images, video clips and computed tomography and MRI DICOM files in honor of the generous gifts received from donors and their families.

  20. Superior mesenteric artery syndrome in a young military basic trainee.

    PubMed

    Schauer, Steven G; Thompson, Andrew J; Bebarta, Vikhyat S

    2013-03-01

    We report the case of a 19-year-old military trainee that presented to the emergency department with a 3-week history of diffuse abdominal pain, 1 to 2 hours postprandially. The timing, onset, quality, and location of her pain was concerning for intestinal angina. Her serum chemistry, hematology, and liver function tests were normal. The radiologist's interpretation of the computed tomography angiogram of the abdomen was an abnormally narrow takeoff angle of the superior mesenteric artery (SMA) from the aorta near the third portion of the duodenum. She was diagnosed with SMA syndrome and received additional evaluation and treatment by her gastroenterologist and surgeon. SMA syndrome is rare and can cause bowel obstruction, perforation, gastric wall pneumatosis, and portal venous gas formation. Computed tomography angiography can be used to promptly diagnose this syndrome in the emergency department. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.

  1. Fully Decentralized Semi-supervised Learning via Privacy-preserving Matrix Completion.

    PubMed

    Fierimonte, Roberto; Scardapane, Simone; Uncini, Aurelio; Panella, Massimo

    2016-08-26

    Distributed learning refers to the problem of inferring a function when the training data are distributed among different nodes. While significant work has been done in the contexts of supervised and unsupervised learning, the intermediate case of semi-supervised learning in the distributed setting has received less attention. In this paper, we propose an algorithm for this class of problems, by extending the framework of manifold regularization. The main component of the proposed algorithm consists of a fully distributed computation of the adjacency matrix of the training patterns. To this end, we propose a novel algorithm for low-rank distributed matrix completion, based on the framework of diffusion adaptation. Overall, the distributed semi-supervised algorithm is efficient and scalable, and it can preserve privacy by the inclusion of flexible privacy-preserving mechanisms for similarity computation. The experimental results and comparison on a wide range of standard semi-supervised benchmarks validate our proposal.
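    For context, the adjacency matrix at the heart of manifold regularization is, in its centralized form, a simple kernel computation; the paper's contribution is computing it in a distributed, privacy-preserving way, which this assumed-names sketch deliberately omits:

    ```python
    import numpy as np

    def rbf_adjacency(X, sigma=1.0):
        """Gaussian-kernel adjacency matrix over training patterns X
        (shape: n_samples x n_features). Centralized illustrative
        version; the paper computes this distributively via low-rank
        matrix completion and diffusion adaptation."""
        sq = np.sum(X**2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T  # pairwise squared distances
        W = np.exp(-np.maximum(d2, 0) / (2 * sigma**2))
        np.fill_diagonal(W, 0.0)                       # no self-edges
        return W
    ```

    The graph Laplacian derived from W then penalizes label functions that vary rapidly between nearby patterns, which is how unlabeled data inform the learned function.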

  2. A scalable silicon photonic chip-scale optical switch for high performance computing systems.

    PubMed

    Yu, Runxiang; Cheung, Stanley; Li, Yuliang; Okamoto, Katsunari; Proietti, Roberto; Yin, Yawei; Yoo, S J B

    2013-12-30

    This paper discusses the architecture and provides performance studies of a silicon photonic chip-scale optical switch for scalable interconnect network in high performance computing systems. The proposed switch exploits optical wavelength parallelism and wavelength routing characteristics of an Arrayed Waveguide Grating Router (AWGR) to allow contention resolution in the wavelength domain. Simulation results from a cycle-accurate network simulator indicate that, even with only two transmitter/receiver pairs per node, the switch exhibits lower end-to-end latency and higher throughput at high (>90%) input loads compared with electronic switches. On the device integration level, we propose to integrate all the components (ring modulators, photodetectors and AWGR) on a CMOS-compatible silicon photonic platform to ensure a compact, energy efficient and cost-effective device. We successfully demonstrate proof-of-concept routing functions on an 8 × 8 prototype fabricated using foundry services provided by OpSIS-IME.

  3. Extended write combining using a write continuation hint flag

    DOEpatents

    Chen, Dong; Gara, Alan; Heidelberger, Philip; Ohmacht, Martin; Vranas, Pavlos

    2013-06-04

    A computing apparatus for reducing the amount of processing in a network computing system, which includes a network system device of a receiving node for receiving electronic messages comprising data. The electronic messages are transmitted from a sending node. The network system device determines when more data of a specific electronic message is being transmitted. A memory device stores the electronic message data and communicates with the network system device. A memory subsystem communicates with the memory device. The memory subsystem stores a portion of the electronic message when more data of the specific message will be received, and a buffer combines the portion with later-received data and moves the data to the memory device for accessible storage.

  4. Method to Measure Total Noise Temperature of a Wireless Receiver During Operation

    NASA Technical Reports Server (NTRS)

    Young, Lawrence E. (Inventor); Esterhuizen, Stephan X. (Inventor); Turbiner, Dmitry (Inventor)

    2014-01-01

    An electromagnetic signal receiver and methods for determining the noise level and signal power in a signal of interest while the receiver is operating. In some embodiments, the signal of interest is a GPS signal. The receiver includes a noise source that provides a noise signal of known power during intervals while the signal of interest is observed. By measuring a signal-to-noise ratio for the signal of interest and the noise power in the signal of interest, the noise level and signal power of the signal of interest can be computed. Various methods of making the measurements and computing the power of the signal of interest are described. Applications of the system and method are described.
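    The two-measurement algebra implied by this record, observing the signal's SNR with a calibration noise source of known power switched off and then on, can be sketched as follows. This is one illustrative reading of the idea, not the patent's claimed procedure:

    ```python
    def receiver_noise_and_signal(snr_off, snr_on, injected_noise_w):
        """Solve for receiver noise power N and signal power S from two
        linear SNR measurements:
            snr_off = S / N                       (noise source off)
            snr_on  = S / (N + injected_noise_w)  (noise source on)
        Powers in watts; illustrative sketch only."""
        noise = injected_noise_w * snr_on / (snr_off - snr_on)
        signal = snr_off * noise
        return noise, signal
    ```

    Because the injected noise power is known, the drop in measured SNR pins down the receiver's own noise level, and the signal power follows, all while the receiver keeps operating on the signal of interest.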

  5. Computational Multiqubit Tunnelling in Programmable Quantum Annealers

    DTIC Science & Technology

    2016-08-25

    ARTICLE | Received 3 Jun 2015 | Accepted 26 Nov 2015 | Published 7 Jan 2016. Quantum tunnelling has been hypothesized as an advantageous physical resource for optimization in quantum annealing. However, computational ... qubit tunnelling plays a computational role in a currently available programmable quantum annealer. We devise a probe for tunnelling, a computational ...

  6. Preparing Future Secondary Computer Science Educators

    ERIC Educational Resources Information Center

    Ajwa, Iyad

    2007-01-01

    Although nearly every college offers a major in computer science, many computer science teachers at the secondary level have received little formal training. This paper presents details of a project that could make a significant contribution to national efforts to improve computer science education by combining teacher education and professional…

  7. Constructing a starting 3D shear velocity model with sharp interfaces for SEM-based upper mantle tomography in North America

    NASA Astrophysics Data System (ADS)

    Calo, M.; Bodin, T.; Yuan, H.; Romanowicz, B. A.; Larmat, C. S.; Maceira, M.

    2013-12-01

    Seismic tomography is currently evolving towards 3D earth models that satisfy full seismic waveforms at increasingly high frequencies. This evolution is possible thanks to the advent of powerful numerical methods such as the Spectral Element Method (SEM) that allow accurate computation of the seismic wavefield in complex media, and the drastic increase of computational resources. However, the production of such models requires handling complex misfit functions with more than one local minimum. Standard linearized inversion methods (such as gradient methods) have two main drawbacks: 1) they produce solution models highly dependent on the starting model; 2) they do not provide a means of estimating true model uncertainties. However, these issues can be addressed with stochastic methods that can sample the space of possible solutions efficiently. Such methods are prohibitively challenging computationally in 3D, but increasingly accessible in 1D. In previous work (Yuan and Romanowicz, 2010; Yuan et al., 2011), we developed a continental-scale anisotropic upper mantle model of North America based on a combination of long-period seismic waveforms and SKS splitting measurements, showing the pervasive presence of layering of anisotropy in the cratonic lithosphere with significant variations in depth of the mid-lithospheric boundary. The radial anisotropy part of the model has been recently updated using the spectral element method for forward wavefield computations and waveform data from the latest deployments of USArray (Yuan and Romanowicz, 2013). However, the long-period waveforms (periods > 40s) themselves only provide a relatively smooth view of the mantle if the starting model is smooth, and the mantle discontinuities necessary for geodynamical interpretation are not imaged.
    Increasing the frequency of the computations to constrain smaller scale features is possible, but challenging computationally, and at the risk of falling into local minima of the misfit function. In this work we propose instead to directly tackle the non-linearity of the inverse problem by using stochastic methods to construct a 3D starting model with a good estimate of the depths of the main layering interfaces. We present preliminary results of the construction of such a starting 3D model based on: (1) regionalizing the study area to define provinces within which lateral variations are smooth; (2) applying trans-dimensional stochastic inversion (Bodin et al., 2012) to obtain accurate 1D models in each province as well as the corresponding error distribution, constrained by receiver function and surface wave dispersion data as well as the previously constructed 3D model (name); and (3) connecting these models laterally using data-driven smoothing operators to obtain a starting 3D model with errors. References: Bodin, T., et al., 2012, Transdimensional inversion of receiver functions and surface wave dispersion, J. Geophys. Res., 117, B02301, doi:10.1029/2011JB008560. Yuan, H. and Romanowicz, B., 2013, in revision. Yuan, H., et al., 2011, 3-D shear wave radially and azimuthally anisotropic velocity model of the North American upper mantle, Geophysical Journal International, 184: 1237-1260, doi:10.1111/j.1365-246X.2010.04901.x. Yuan, H. & Romanowicz, B., 2010, Lithospheric layering in the North American Craton, Nature, 466, 1063-1068.

  8. Digital receiver study and implementation

    NASA Technical Reports Server (NTRS)

    Fogle, D. A.; Lee, G. M.; Massey, J. C.

    1972-01-01

    Computer software was developed which makes it possible to use any general purpose computer with A/D conversion capability as a PSK receiver for low data rate telemetry processing. Carrier tracking, bit synchronization, and matched filter detection are all performed digitally. To aid in the implementation of optimum computer processors, a study of general digital processing techniques was performed which emphasized various techniques for digitizing general analog systems. In particular, the phase-locked loop was extensively analyzed as a typical non-linear communication element. Bayesian estimation techniques for PSK demodulation were studied. A hardware implementation of the digital Costas loop was developed.
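    The record mentions a digital Costas loop for carrier tracking. A minimal complex-baseband BPSK sketch is shown below; the loop gains, signal model, and function name are illustrative assumptions, not the report's actual design:

    ```python
    import numpy as np

    def costas_loop(samples, k_p=0.1, k_i=0.001):
        """Minimal digital Costas loop tracking the carrier phase of a
        BPSK signal at complex baseband. Returns phase-corrected samples
        (illustrative sketch; second-order loop with a PI filter)."""
        phase = 0.0
        freq = 0.0
        out = np.empty_like(samples)
        for n, x in enumerate(samples):
            y = x * np.exp(-1j * phase)        # de-rotate by NCO phase
            out[n] = y
            err = np.sign(y.real) * y.imag     # decision-directed phase error
            freq += k_i * err                  # integrator tracks frequency
            phase += freq + k_p * err          # NCO phase update
        return out
    ```

    Once locked, the corrected constellation collapses onto the real axis, so a simple sign decision recovers the data, with the usual 180° phase ambiguity that Costas loops leave to be resolved by framing or differential coding.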

  9. Computational Ghost Imaging for Remote Sensing

    NASA Technical Reports Server (NTRS)

    Erkmen, Baris I.

    2012-01-01

    This work relates to the generic problem of remote active imaging; that is, a source illuminates a target of interest and a receiver collects the scattered light off the target to obtain an image. Conventional imaging systems consist of an imaging lens and a high-resolution detector array [e.g., a CCD (charge coupled device) array] to register the image. However, conventional imaging systems for remote sensing require high-quality optics and need to support large detector arrays and associated electronics. This results in suboptimal size, weight, and power consumption. Computational ghost imaging (CGI) is a computational alternative to this traditional imaging concept that has a very simple receiver structure. In CGI, the transmitter illuminates the target with a modulated light source. A single-pixel (bucket) detector collects the scattered light. Then, via computation (i.e., postprocessing), the receiver can reconstruct the image using the knowledge of the modulation that was projected onto the target by the transmitter. This way, one can construct a very simple receiver that, in principle, requires no lens to image a target. Ghost imaging is a transverse imaging modality that has been receiving much attention owing to a rich interconnection of novel physical characteristics and novel signal processing algorithms suitable for active computational imaging. The original ghost imaging experiments consisted of two correlated optical beams traversing distinct paths and impinging on two spatially-separated photodetectors: one beam interacts with the target and then impinges on a single-pixel (bucket) detector that provides no spatial resolution, whereas the other beam traverses an independent path and impinges on a high-resolution camera without any interaction with the target.
The term ghost imaging was coined soon after the initial experiments were reported, to emphasize the fact that by cross-correlating two photocurrents, one generates an image of the target. In CGI, the measurement obtained from the reference arm (with the high-resolution detector) is replaced by a computational derivation of the measurement-plane intensity profile of the reference-arm beam. The algorithms applied to computational ghost imaging have diversified beyond simple correlation measurements, and now include modern reconstruction algorithms based on compressive sensing.
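    The computational reconstruction step can be sketched as a correlation between the bucket-detector samples and the known modulation patterns. The 16x16 target, the random patterns, and the number of measurements below are arbitrary examples, not parameters from this work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 16x16 target reflectivity: a bright square on a dark background.
N = 16
target = np.zeros((N, N))
target[5:11, 5:11] = 1.0

# The transmitter projects M known random modulation patterns onto the target.
M = 4000
patterns = rng.random((M, N * N))

# Single-pixel (bucket) detector: one total-intensity value per pattern.
bucket = patterns @ target.ravel()

# Reconstruction by computation: cross-correlate the bucket measurements with
# the known (mean-removed) patterns, in place of a physical reference arm.
image = (bucket - bucket.mean()) @ (patterns - patterns.mean(axis=0)) / M
image = image.reshape(N, N)

inside = image[5:11, 5:11].mean()    # should be bright
outside = image[target == 0].mean()  # should be near zero
```

Each reconstructed pixel is proportional to the covariance between the bucket signal and that pixel's illumination, which is why the mean-removal step matters.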

  10. A handheld computer as part of a portable in vivo knee joint load monitoring system

    PubMed Central

    Szivek, JA; Nandakumar, VS; Geffre, CP; Townsend, CP

    2009-01-01

    In vivo measurement of loads and pressures acting on articular cartilage in the knee joint during various activities and rehabilitative therapies following focal defect repair will provide a means of designing activities that encourage faster and more complete healing of focal defects. It was the goal of this study to develop a totally portable monitoring system that could be used during various activities and allow continuous monitoring of forces acting on the knee. In order to make the monitoring system portable, a handheld computer with custom software, a USB-powered miniature wireless receiver, and a battery-powered coil were developed to replace the currently used computer, AC-powered benchtop receiver, and power supply. A Dell handheld running the Windows Mobile operating system (OS), programmed using LabVIEW, was used to collect strain measurements. Measurements collected by the handheld-based system connected to the miniature wireless receiver were compared with the measurements collected by a hardwired system and a computer-based system during benchtop testing and in vivo testing. The newly developed handheld-based system had a maximum accuracy of 99% when compared to the computer-based system. PMID:19789715

  11. Clustering P-Wave Receiver Functions To Constrain Subsurface Seismic Structure

    NASA Astrophysics Data System (ADS)

    Chai, C.; Larmat, C. S.; Maceira, M.; Ammon, C. J.; He, R.; Zhang, H.

    2017-12-01

    The acquisition of high-quality data from permanent and temporary dense seismic networks provides the opportunity to apply statistical and machine learning techniques to a broad range of geophysical observations. Lekic and Romanowicz (2011) used clustering analysis on tomographic velocity models of the western United States to perform tectonic regionalization, and the velocity-profile clusters agree well with known geomorphic provinces. A complementary and somewhat less restrictive approach is to apply cluster analysis directly to geophysical observations. In this presentation, we apply clustering analysis to teleseismic P-wave receiver functions (RFs), continuing the efforts of Larmat et al. (2015) and Maceira et al. (2015). These earlier studies validated the approach with surface waves and stacked EARS RFs from the USArray stations. In this study, we experiment with both the K-means and hierarchical clustering algorithms. We also test different distance metrics defined in the vector space of RFs following Lekic and Romanowicz (2011). We cluster data from two distinct data sets. The first, corresponding to the western US, was generated by smoothing/interpolation of the receiver-function wavefield (Chai et al., 2015). Spatial coherence and agreement with geologic regions increase with this simpler, spatially smoothed set of observations. The second data set is composed of RFs for more than 800 stations of the China Digital Seismic Network (CSN). Preliminary results show a first-order agreement between clusters and tectonic region, and each region cluster includes a distinct Ps arrival, which probably reflects differences in crustal thickness. Regionalization remains an important step to characterize a model prior to application of full waveform and/or stochastic imaging techniques because of the computational expense of these types of studies.
Machine learning techniques can provide valuable information that can be used to design and characterize formal geophysical inversion, providing information on spatial variability in the subsurface geology.
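    The K-means branch of such an analysis can be sketched on synthetic receiver functions. The Gaussian-pulse waveforms, Ps delays, noise level, and deterministic seeding below are invented stand-ins for real RFs and for the study's actual clustering setup.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 30.0, 301)  # time after the direct P arrival (s)

def synthetic_rf(ps_delay, noise=0.05):
    """Toy receiver function: direct-P pulse plus a Moho Ps conversion."""
    rf = (np.exp(-0.5 * (t / 0.8) ** 2)
          + 0.4 * np.exp(-0.5 * ((t - ps_delay) / 0.8) ** 2))
    return rf + noise * rng.standard_normal(t.size)

# Two synthetic "regions": thin crust (Ps near 4 s) vs thick crust (Ps near 7 s).
data = np.array([synthetic_rf(4.0) for _ in range(20)]
                + [synthetic_rf(7.0) for _ in range(20)])

def kmeans(X, init, iters=25):
    """Minimal K-means with Euclidean distance in the vector space of RFs."""
    centers = X[list(init)].copy()
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        centers = np.array([X[labels == j].mean(axis=0)
                            for j in range(len(centers))])
    return labels

# Seed one center from each end of the data set (deterministic for the sketch).
labels = kmeans(data, init=(0, -1))
```

With well-separated Ps delays the two waveform families fall into distinct clusters, mirroring how RF clusters can track crustal-thickness differences.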

  12. Nondynamic Tracking Using The Global Positioning System

    NASA Technical Reports Server (NTRS)

    Yunck, T. P.; Wu, Sien-Chong

    1988-01-01

    Report describes technique for using Global Positioning System (GPS) to determine position of low Earth orbiter without need for dynamic models. Differential observing strategy requires GPS receiver on user vehicle and network of six ground receivers. Computationally efficient technique delivers decimeter accuracy on orbits down to lowest altitudes. New technique is nondynamic long-arc strategy with potential for accuracy of best dynamic techniques while retaining much of computational simplicity of geometric techniques.

  13. Low Computational Signal Acquisition for GNSS Receivers Using a Resampling Strategy and Variable Circular Correlation Time

    PubMed Central

    Zhang, Yeqing; Wang, Meiling; Li, Yafeng

    2018-01-01

    To substantially reduce the computational complexity and time consumption of signal acquisition, this paper explores a resampling strategy and variable circular correlation time strategy specific to broadband multi-frequency GNSS receivers. In broadband GNSS receivers, the resampling strategy is established to work on conventional acquisition algorithms by resampling the main lobe of received broadband signals with a much lower frequency. Variable circular correlation time is designed to adapt to different signal strength conditions and thereby increase the operation flexibility of GNSS signal acquisition. The acquisition threshold is defined as the ratio of the highest to the second-highest correlation result in the search space of carrier frequency and code phase. Moreover, the computational complexity of signal acquisition is quantified by the number of multiplication and summation operations in the acquisition process. Comparative experiments and performance analysis are conducted on four sets of real GPS L2C signals with different sampling frequencies. The results indicate that the resampling strategy can effectively decrease computation and time cost by nearly 90–94% with only a slight loss of acquisition sensitivity. With circular correlation time varying from 10 ms to 20 ms, the time cost of signal acquisition increases by about 2.7–5.6% per millisecond, with most satellites acquired successfully. PMID:29495301
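    The acquisition metric described above (ratio of the highest to the second-highest correlation peak) can be sketched as follows. The ±1 code, noise level, and threshold value are illustrative assumptions, and only a single carrier-frequency bin is searched rather than the full frequency/code-phase space.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 1023-chip pseudorandom code standing in for a GNSS ranging code.
code = rng.choice([-1.0, 1.0], size=1023)

true_shift = 357
received = np.roll(code, true_shift) + 0.5 * rng.standard_normal(code.size)

# Circular correlation over all code phases via FFT (one frequency bin shown).
corr = np.abs(np.fft.ifft(np.fft.fft(received) * np.conj(np.fft.fft(code))))

# Acquisition metric: ratio of the highest to the second-highest peak,
# excluding the immediate neighbors of the main peak.
n = corr.size
peak_idx = int(np.argmax(corr))
mask = np.ones(n, bool)
mask[[(peak_idx - 1) % n, peak_idx, (peak_idx + 1) % n]] = False
ratio = corr[peak_idx] / corr[mask].max()

acquired = ratio > 1.8  # example threshold; real thresholds are design-tuned
```

A resampling strategy in this framework would shorten `received` and `code` before the FFTs, which is where the bulk of the multiplication count lies.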

  14. Low Computational Signal Acquisition for GNSS Receivers Using a Resampling Strategy and Variable Circular Correlation Time.

    PubMed

    Zhang, Yeqing; Wang, Meiling; Li, Yafeng

    2018-02-24

    To substantially reduce the computational complexity and time consumption of signal acquisition, this paper explores a resampling strategy and variable circular correlation time strategy specific to broadband multi-frequency GNSS receivers. In broadband GNSS receivers, the resampling strategy is established to work on conventional acquisition algorithms by resampling the main lobe of received broadband signals with a much lower frequency. Variable circular correlation time is designed to adapt to different signal strength conditions and thereby increase the operation flexibility of GNSS signal acquisition. The acquisition threshold is defined as the ratio of the highest to the second-highest correlation result in the search space of carrier frequency and code phase. Moreover, the computational complexity of signal acquisition is quantified by the number of multiplication and summation operations in the acquisition process. Comparative experiments and performance analysis are conducted on four sets of real GPS L2C signals with different sampling frequencies. The results indicate that the resampling strategy can effectively decrease computation and time cost by nearly 90-94% with only a slight loss of acquisition sensitivity. With circular correlation time varying from 10 ms to 20 ms, the time cost of signal acquisition increases by about 2.7-5.6% per millisecond, with most satellites acquired successfully.

  15. Calibration of the BEV GPS Receiver by Using TWSTFT

    DTIC Science & Technology

    2008-12-01

    40th Annual Precise Time and Time Interval (PTTI) Meeting 543 CALIBRATION OF THE BEV GPS RECEIVER BY USING TWSTFT A. Niessner1, W...a calibration of the BEV reference GPS time receiver by using Two-way Satellite Time and Frequency Transfer (TWSTFT). Due to antenna changes, a new...calibration of the BEV receiver was necessary. This receiver is the first GPS receiver with calibration through TWSTFT and used for UTC computation

  16. Multidisciplinary Shape Optimization of a Composite Blended Wing Body Aircraft

    NASA Astrophysics Data System (ADS)

    Boozer, Charles Maxwell

    A multidisciplinary shape optimization tool coupling aerodynamics, structure, and performance was developed for battery-powered aircraft. Utilizing high-fidelity computational fluid dynamics analysis tools and a structural wing weight tool, coupled under the multidisciplinary feasible optimization architecture, the tool modifies aircraft geometry in the optimization of the aircraft's range or endurance. The developed tool is applied to three geometries: a hybrid blended wing body delta wing UAS, the ONERA M6 wing, and a modified ONERA M6 wing. First, the optimization problem is presented with the objective function, constraints, and design vector. Next, the tool's architecture and the analysis tools that are utilized are described. Finally, various optimizations are described and their results analyzed for all test subjects. Results show that less computationally expensive inviscid optimizations yield positive performance improvements using planform, airfoil, and three-dimensional degrees of freedom. From the results obtained through a series of optimizations, it is concluded that the newly developed tool is both effective at improving performance and serves as a platform ready to receive additional performance modules, further improving its computational design support potential.

  17. Efficient Processing of Data for Locating Lightning Strikes

    NASA Technical Reports Server (NTRS)

    Medelius, Pedro J.; Starr, Stan

    2003-01-01

    Two algorithms have been devised to increase the efficiency of processing of data in lightning detection and ranging (LDAR) systems so as to enable the accurate location of lightning strikes in real time. In LDAR, the location of a lightning strike is calculated by solving equations for the differences among the times of arrival (DTOAs) of the lightning signals at multiple antennas as functions of the locations of the antennas and the speed of light. The most difficult part of the problem is computing the DTOAs from digitized versions of the signals received by the various antennas. One way (a time-domain approach) to determine the DTOAs is to compute cross-correlations among variously differentially delayed replicas of the digitized signals and to select, as the DTOAs, those differential delays that yield the maximum correlations. Another way (a frequency-domain approach) to determine the DTOAs involves the computation of cross-correlations among Fourier transforms of variously differentially phased replicas of the digitized signals, along with utilization of the relationship among phase difference, time delay, and frequency.
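    The time-domain approach can be sketched as follows. The waveform, the 25-sample delay, and the 10 ns sampling interval are invented examples, and the correlation here is circular for brevity rather than the windowed correlation a real LDAR processor would use.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical broadband lightning waveform received at two antennas,
# the second delayed by 25 samples relative to the first.
n = 2048
signal = np.convolve(rng.standard_normal(n), np.ones(5) / 5, mode="same")
true_delay = 25
ant_a = signal + 0.1 * rng.standard_normal(n)
ant_b = np.roll(signal, true_delay) + 0.1 * rng.standard_normal(n)

# Time-domain approach: the DTOA is the lag that maximizes the circular
# cross-correlation between the two digitized records (computed via FFT,
# which is the frequency-domain route to the same quantity).
corr = np.fft.ifft(np.fft.fft(ant_b) * np.conj(np.fft.fft(ant_a))).real
dtoa_samples = int(np.argmax(corr))

# Convert to time with the digitizer's sampling interval, e.g. 10 ns:
dtoa_seconds = dtoa_samples * 10e-9
```

The resulting DTOAs, one per antenna pair, feed the system of equations that locates the strike given the antenna positions and the speed of light.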

  18. Cognitive cooperation groups mediated by computers and internet present significant improvement of cognitive status in older adults with memory complaints: a controlled prospective study.

    PubMed

    Krug, Rodrigo de Rosso; Silva, Anna Quialheiro Abreu da; Schneider, Ione Jayce Ceola; Ramos, Luiz Roberto; d'Orsi, Eleonora; Xavier, André Junqueira

    2017-04-01

    To estimate the effect of participating in cognitive cooperation groups, mediated by computers and the internet, on the Mini-Mental State Examination (MMSE) percent variation of outpatients with memory complaints attending two memory clinics. A prospective controlled intervention study carried out from 2006 to 2013 with 293 elders. The intervention group (n = 160) attended a cognitive cooperation group (20 sessions of 1.5 hours each). The control group (n = 133) received routine medical care. Outcome was the percent variation in the MMSE. Control variables included gender, age, marital status, schooling, hypertension, diabetes, dyslipidaemia, hypothyroidism, depression, vascular diseases, polymedication, use of benzodiazepines, exposure to tobacco, sedentary lifestyle, obesity and functional capacity. The final model was obtained by multivariate linear regression. The intervention group obtained an independent positive variation of 24.39% (95% CI: 14.86–33.91) in the MMSE compared to the control group. The results suggested that cognitive cooperation groups, mediated by computers and the internet, are associated with cognitive status improvement of older adults in memory clinics.

  19. Interactive wiimote gaze stabilization exercise training system for patients with vestibular hypofunction

    PubMed Central

    2012-01-01

    Background Peripheral vestibular hypofunction is a major cause of dizziness. When complicated with postural imbalance, this condition can lead to an increased incidence of falls. In traditional clinical practice, gaze stabilization exercise is commonly used to rehabilitate patients. In this study, we established a computer-aided vestibular rehabilitation system by coupling infrared LEDs to an infrared receiver. This system enabled the subjects’ head-turning actions to be quantified, and the training was performed using vestibular exercise combined with computer games and interactive video games that simulate daily life activities. Methods Three unilateral and one bilateral vestibular hypofunction patients volunteered to participate in this study. The participants received 30 minutes of computer-aided vestibular rehabilitation training 2 days per week for 6 weeks. Pre-training and post-training assessments were completed, and a follow-up assessment was completed 1 month after the end of the training period. Results After 6 weeks of training, significant improvements in balance and dynamic visual acuity (DVA) were observed in the four participants. Self-reports of dizziness, anxiety and depressed mood all decreased significantly. Significant improvements in self-confidence and physical performance were also observed. The effectiveness of this training was maintained for at least 1 month after the end of the training period. Conclusion Real-time monitoring of training performance can be achieved using this rehabilitation platform. Patients demonstrated a reduction in dizziness symptoms after 6 weeks of training with this short-term interactive game approach. This treatment paradigm also improved the patients’ balance function. This system could provide a convenient, safe and affordable treatment option for clinical practitioners. PMID:23043886

  20. Interactive wiimote gaze stabilization exercise training system for patients with vestibular hypofunction.

    PubMed

    Chen, Po-Yin; Hsieh, Wan-Ling; Wei, Shun-Hwa; Kao, Chung-Lan

    2012-10-09

    Peripheral vestibular hypofunction is a major cause of dizziness. When complicated with postural imbalance, this condition can lead to an increased incidence of falls. In traditional clinical practice, gaze stabilization exercise is commonly used to rehabilitate patients. In this study, we established a computer-aided vestibular rehabilitation system by coupling infrared LEDs to an infrared receiver. This system enabled the subjects' head-turning actions to be quantified, and the training was performed using vestibular exercise combined with computer games and interactive video games that simulate daily life activities. Three unilateral and one bilateral vestibular hypofunction patients volunteered to participate in this study. The participants received 30 minutes of computer-aided vestibular rehabilitation training 2 days per week for 6 weeks. Pre-training and post-training assessments were completed, and a follow-up assessment was completed 1 month after the end of the training period. After 6 weeks of training, significant improvements in balance and dynamic visual acuity (DVA) were observed in the four participants. Self-reports of dizziness, anxiety and depressed mood all decreased significantly. Significant improvements in self-confidence and physical performance were also observed. The effectiveness of this training was maintained for at least 1 month after the end of the training period. Real-time monitoring of training performance can be achieved using this rehabilitation platform. Patients demonstrated a reduction in dizziness symptoms after 6 weeks of training with this short-term interactive game approach. This treatment paradigm also improved the patients' balance function. This system could provide a convenient, safe and affordable treatment option for clinical practitioners.

  1. Comparison of computer based instruction to behavior skills training for teaching staff implementation of discrete-trial instruction with an adult with autism.

    PubMed

    Nosik, Melissa R; Williams, W Larry; Garrido, Natalia; Lee, Sarah

    2013-01-01

    In the current study, behavior skills training (BST) is compared to a computer-based training package for teaching staff to implement discrete-trial instruction with an adult with autism. The computer-based training package consisted of instructions, video modeling, and feedback. BST consisted of instructions, modeling, rehearsal, and feedback. Following training, participants were evaluated in terms of their accuracy in completing critical skills for running a discrete-trial program. Six participants completed training; three received behavior skills training and three received the computer-based training. Participants in the BST group performed better overall after training and during six-week probes than those in the computer-based training group. There were differences across both groups between research-assistant and natural-environment competency levels.

  2. Shielding and activity estimator for template-based nuclide identification methods

    DOEpatents

    Nelson, Karl Einar

    2013-04-09

    According to one embodiment, a method for estimating an activity of one or more radio-nuclides includes receiving one or more templates, the one or more templates corresponding to one or more radio-nuclides which contribute to a probable solution, receiving one or more weighting factors, each weighting factor representing a contribution of one radio-nuclide to the probable solution, computing an effective areal density for each of the one or more radio-nuclides, computing an effective atomic number (Z) for each of the one or more radio-nuclides, computing an effective metric for each of the one or more radio-nuclides, and computing an estimated activity for each of the one or more radio-nuclides. In other embodiments, computer program products, systems, and other methods are presented for estimating an activity of one or more radio-nuclides.

  3. 21 CFR 870.1435 - Single-function, preprogrammed diagnostic computer.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Single-function, preprogrammed diagnostic computer... Single-function, preprogrammed diagnostic computer. (a) Identification. A single-function, preprogrammed diagnostic computer is a hard-wired computer that calculates a specific physiological or blood-flow parameter...

  4. 21 CFR 870.1435 - Single-function, preprogrammed diagnostic computer.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Single-function, preprogrammed diagnostic computer... Single-function, preprogrammed diagnostic computer. (a) Identification. A single-function, preprogrammed diagnostic computer is a hard-wired computer that calculates a specific physiological or blood-flow parameter...

  5. 21 CFR 870.1435 - Single-function, preprogrammed diagnostic computer.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Single-function, preprogrammed diagnostic computer... Single-function, preprogrammed diagnostic computer. (a) Identification. A single-function, preprogrammed diagnostic computer is a hard-wired computer that calculates a specific physiological or blood-flow parameter...

  6. 21 CFR 870.1435 - Single-function, preprogrammed diagnostic computer.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Single-function, preprogrammed diagnostic computer... Single-function, preprogrammed diagnostic computer. (a) Identification. A single-function, preprogrammed diagnostic computer is a hard-wired computer that calculates a specific physiological or blood-flow parameter...

  7. Lithospheric architecture of NE China from joint Inversions of receiver functions and surface wave dispersion through Bayesian optimisation

    NASA Astrophysics Data System (ADS)

    Sebastian, Nita; Kim, Seongryong; Tkalčić, Hrvoje; Sippl, Christian

    2017-04-01

    The purpose of this study is to develop an integrated inference on the lithospheric structure of NE China using three passive seismic networks comprising 92 stations. The NE China plain consists of complex lithospheric domains characterised by the co-existence of complex geodynamic processes such as crustal thinning, active intraplate Cenozoic volcanism and low velocity anomalies. To estimate lithospheric structures with greater detail, we chose to perform the joint inversion of independent data sets such as receiver functions and surface wave dispersion curves (group and phase velocity). We perform a joint inversion based on principles of Bayesian transdimensional optimisation techniques (Kim et al., 2016). Unlike in previous studies of NE China, the complexity of the model is determined from the data in the first stage of the inversion, and the data uncertainty is computed based on Bayesian statistics in the second stage of the inversion. The computed crustal properties are retrieved from an ensemble of probable models. We obtain major structural inferences with well constrained absolute velocity estimates, which are vital for inferring properties of the lithosphere and bulk crustal Vp/Vs ratio. The Vp/Vs estimate obtained from joint inversions confirms the high Vp/Vs ratio (~1.98) obtained using the H-Kappa method beneath some stations. Moreover, we could confirm the existence of a lower crustal velocity beneath several stations (e.g., station SHS) within the NE China plain. Based on these findings we attempt to identify a plausible origin for structural complexity. We compile a high-resolution 3D image of the lithospheric architecture of the NE China plain.
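    The link between crustal properties and the observed Moho Ps arrival can be illustrated with the simplified vertical-incidence delay-time relation t_Ps ≈ H(1/Vs - 1/Vp). This ignores the ray parameter used in actual H-Kappa stacking, and the crustal P velocity and delay time below are assumed example values, not results from this study.

```python
# Crustal thickness H implied by a Moho Ps-P delay time at vertical incidence:
#   t_Ps ≈ H * (1/Vs - 1/Vp)  =>  H = t_Ps / (1/Vs - 1/Vp)
vp = 6.3        # assumed average crustal P velocity, km/s
vp_vs = 1.98    # high Vp/Vs ratio of the kind reported beneath some stations
vs = vp / vp_vs
t_ps = 5.0      # hypothetical Ps-P delay, seconds

H = t_ps / (1.0 / vs - 1.0 / vp)   # crustal thickness, km
```

With these example numbers H comes out near 32 km; holding t_Ps fixed, a higher Vp/Vs (slower Vs) implies a thinner crust, which is the trade-off H-Kappa stacking resolves.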

  8. Clients' outcomes of home health nursing in Taiwan.

    PubMed

    Yeh, L; Wen, M J

    2001-09-01

    The home health nursing movement is expanding rapidly. Home health nursing agencies (HHNAs) are expected to demonstrate that the care provided does make a difference for the client receiving the services. The purpose of this study was to explore client outcomes from home health nursing. Outcome indicators include: services utilized (emergency services, re-hospitalization), physiological status (catheter indwelling status, consciousness level, wound severity-number and wound stages) and functional status (reflected by the Barthel Index). A prospective research design was used to collect the data. Five hospital-based HHNAs were invited to participate in this research. Clients newly admitted to HHNAs and diagnosed as non-cancer patients were recruited, and the researchers gathered outcome indicators over a six-month period. Data were analyzed using SPSS 8.0 computer software. There were 75 clients in this study. Results showed that most of the clients (64.0%) received service for more than 180 days. The client characteristics were dominated by elderly (66.6% above age 70), female (53.3%) and married (74.7%) clients. The three leading care needs were NG tubing service (84.0%), Foley tubing service (45.3%) and wound care (38.7%). The Kruskal-Wallis test revealed that there was no difference in emergency service frequency and re-hospitalization between clients who received service for more than 180 days and those who received service for less than 180 days. The Wilcoxon signed-rank test showed that within one half-year, catheter indwelling status, functional status, and wound severity were not significantly different, with the only exception of consciousness level (p = .001). The results of this study can be viewed as preliminary data to assist in shaping home health nursing services in Taiwan.

  9. Uric acid therapy improves the outcomes of stroke patients treated with intravenous tissue plasminogen activator and mechanical thrombectomy.

    PubMed

    Chamorro, Ángel; Amaro, Sergio; Castellanos, Mar; Gomis, Meritxell; Urra, Xabier; Blasco, Jordi; Arenillas, Juan F; Román, Luis S; Muñoz, Roberto; Macho, Juan; Cánovas, David; Marti-Fabregas, Joan; Leira, Enrique C; Planas, Anna M

    2017-06-01

    Background: Numerous neuroprotective drugs have failed to show benefit in the treatment of acute ischemic stroke, making the search for new treatments imperative. Uric acid is an endogenous antioxidant, making it a drug candidate to improve stroke outcomes. Aim: To report the effects of uric acid therapy in stroke patients receiving intravenous thrombolysis and mechanical thrombectomy. Methods: Forty-five patients with proximal vessel occlusions enrolled in the URICO-ICTUS trial received intravenous recombinant tissue plasminogen activator within 4.5 h after stroke onset and were randomized to intravenous 1000 mg uric acid or placebo (NCT00860366). These patients also received mechanical thrombectomy because a brain computed tomography angiography confirmed the lack of proximal recanalization at the end of systemic thrombolysis. The primary outcome was good functional outcome at 90 days (modified Rankin Scale score 0-2). Safety outcomes included mortality, symptomatic intracerebral bleeding, and gout attacks. Results: The rate of successful revascularization was >80% in both the uric acid and the placebo groups, but good functional outcome was observed in 16 out of 24 (67%) patients treated with uric acid and 10 out of 21 (48%) treated with placebo (adjusted odds ratio, 6.12 (95% CI 1.08-34.56)). Mortality was observed in two out of 24 (8.3%) patients treated with uric acid and one out of 21 (4.8%) treated with placebo (adjusted odds ratio, 3.74 (95% CI 0.06-226.29)). Symptomatic cerebral bleeding and gout attacks were similar in both groups. Conclusions: Uric acid therapy was safe and improved stroke outcomes in stroke patients receiving intravenous thrombolysis followed by thrombectomy. Validation of this simple strategy in a larger trial is urgent.

  10. Broadcasting a message in a parallel computer

    DOEpatents

    Archer, Charles J; Faraj, Ahmad A

    2013-04-16

    Methods, systems, and products are disclosed for broadcasting a message in a parallel computer that includes: transmitting, by the logical root to all of the nodes directly connected to the logical root, a message; and for each node except the logical root: receiving the message; if that node is the physical root, then transmitting the message to all of the child nodes except the child node from which the message was received; if that node received the message from a parent node and if that node is not a leaf node, then transmitting the message to all of the child nodes; and if that node received the message from a child node and if that node is not the physical root, then transmitting the message to all of the child nodes except the child node from which the message was received and transmitting the message to the parent node.
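    The per-node forwarding rules in this abstract can be sketched as a simulation on a small hypothetical tree; the node numbering and topology below are invented for illustration, and node 0 plays the physical root while any node may be the logical root.

```python
from collections import deque

# Hypothetical 8-node tree; physical root is 0 (the only node with no parent).
children = {0: [1, 2], 1: [3, 4], 2: [5, 6], 3: [], 4: [7], 5: [], 6: [], 7: []}
parent = {c: p for p, kids in children.items() for c in kids}

def broadcast(logical_root):
    """Apply the forwarding rules from the abstract; return reception counts."""
    received = {node: 0 for node in children}
    # Logical root sends to every directly connected node (children + parent).
    neighbors = list(children[logical_root])
    if logical_root in parent:
        neighbors.append(parent[logical_root])
    queue = deque((logical_root, n) for n in neighbors)
    while queue:
        sender, node = queue.popleft()
        received[node] += 1
        if node not in parent:                       # physical root
            queue.extend((node, c) for c in children[node] if c != sender)
        elif parent[node] == sender:                 # message came from parent
            queue.extend((node, c) for c in children[node])
        else:                                        # came from a child
            queue.extend((node, c) for c in children[node] if c != sender)
            queue.append((node, parent[node]))
    return received

counts = broadcast(4)  # logical root deep in the tree
```

The rules guarantee that the message reaches every node exactly once, whichever node is chosen as the logical root.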

  11. Broadcasting a message in a parallel computer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Methods, systems, and products are disclosed for broadcasting a message in a parallel computer that includes: transmitting, by the logical root to all of the nodes directly connected to the logical root, a message; and for each node except the logical root: receiving the message; if that node is the physical root, then transmitting the message to all of the child nodes except the child node from which the message was received; if that node received the message from a parent node and if that node is not a leaf node, then transmitting the message to all of the child nodes; and if that node received the message from a child node and if that node is not the physical root, then transmitting the message to all of the child nodes except the child node from which the message was received and transmitting the message to the parent node.

  12. Situation awareness system for Canada

    NASA Astrophysics Data System (ADS)

    Hill, Andrew

    1999-07-01

    Situation awareness encompasses a knowledge of orders, plans and current knowledge of friendly force actions. Knowing where you are and being able to transmit that information in near real-time to other friendly forces provides the ability to exercise precise command and control over those forces. With respect to current command and control using voice methods, between 40 percent and 60 percent of Combat Net Radio traffic relates to location reporting of some sort. Commanders at Battle Group and below spend, on average, 40 percent of their total time performing position and navigation related functions. The need to rapidly transfer own force location information throughout a force and to process the received information quickly, accurately and reliably provides the rationale for the requirement for an automated situation awareness system. This paper describes the Situation Awareness System (SAS) being developed by Computing Devices Canada for the Canadian Department of National Defence as a component of the Position Determination and Navigation for Land Forces program. The SAS is being integrated with the Iris Tactical Command, Control, Communications System, which is also being developed by Computing Devices. The SAS software provides a core operating environment onto which command and control functionality can be easily added to produce general and specialist battlefield management systems.

  13. Communication Limits Due to Photon-Detector Jitter

    NASA Technical Reports Server (NTRS)

    Moision, Bruce E.; Farr, William H.

    2008-01-01

    A theoretical and experimental study was conducted of the limit imposed by photon-detector jitter on the capacity of a pulse-position-modulated optical communication system in which the receiver operates in a photon-counting (weak-signal) regime. Photon-detector jitter is a random delay between impingement of a photon and generation of an electrical pulse by the detector. In the study, jitter statistics were computed from jitter measurements made on several photon detectors. The probability density of jitter was mathematically modeled by use of a weighted sum of Gaussian functions. Parameters of the model were adjusted to fit histograms representing the measured-jitter statistics. Likelihoods of assigning detector-output pulses to correct pulse time slots in the presence of jitter were derived and used to compute channel capacities and corresponding losses due to jitter. It was found that the loss, expressed as the ratio between the signal power needed to achieve a specified capacity in the presence of jitter and that needed to obtain the same capacity in the absence of jitter, is well approximated as a quadratic function of the standard deviation of the jitter in units of pulse-time-slot duration.
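    The effect of jitter on slot assignment can be illustrated with a toy calculation assuming a single zero-mean Gaussian jitter term and a pulse centered in a unit-width slot, rather than the paper's weighted-sum-of-Gaussians model and capacity analysis.

```python
import math

def p_correct_slot(sigma):
    """Probability that a detector output pulse lands in the correct PPM slot.

    Assumes Gaussian jitter with standard deviation sigma, expressed in
    slot-duration units, and a photon arriving at the center of the slot:
    the pulse stays in-slot when the jitter delay lies in [-0.5, +0.5].
    """
    return math.erf(0.5 / (sigma * math.sqrt(2.0)))

# Slot-assignment likelihood degrades as jitter grows relative to the slot.
for sigma in (0.1, 0.3, 0.5):
    print(f"sigma = {sigma:.1f} slot -> P(correct slot) = {p_correct_slot(sigma):.3f}")
```

At sigma equal to half a slot the pulse is mis-slotted roughly a third of the time, consistent with the intuition that the loss grows rapidly once jitter becomes comparable to the slot duration.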

  14. Modeling Radiative Heat Transfer and Turbulence-Radiation Interactions in Engines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paul, Chandan; Sircar, Arpan; Ferreyro-Fernandez, Sebastian

    Detailed radiation modelling in piston engines has received relatively little attention to date. Recently, however, it has been revisited in light of current trends towards higher operating pressures and higher levels of exhaust-gas recirculation, both of which enhance molecular gas radiation. Advanced high-efficiency engines are also expected to function closer to the limits of stable operation, where even small perturbations to the energy balance can have a large influence on system behavior. Here several different spectral radiation property models and radiative transfer equation (RTE) solvers have been implemented in an OpenFOAM-based engine CFD code, and simulations have been performed for a full-load (peak pressure ~200 bar) heavy-duty diesel engine. Differences in computed temperature fields, NO and soot levels, and wall heat transfer rates are shown for different combinations of spectral models and RTE solvers. The relative importance of molecular gas radiation versus soot radiation is examined, and the influence of turbulence-radiation interactions is determined by comparing results obtained using local mean values of composition and temperature to compute radiative emission and absorption with those obtained using a particle-based transported probability density function method.

  15. Computer-aided classification of patients with dementia of Alzheimer's type based on cerebral blood flow determined with arterial spin labeling technique

    NASA Astrophysics Data System (ADS)

    Yamashita, Yasuo; Arimura, Hidetaka; Yoshiura, Takashi; Tokunaga, Chiaki; Magome, Taiki; Monji, Akira; Noguchi, Tomoyuki; Toyofuku, Fukai; Oki, Masafumi; Nakamura, Yasuhiko; Honda, Hiroshi

    2010-03-01

    Arterial spin labeling (ASL) is one of the promising non-invasive magnetic resonance (MR) imaging techniques for diagnosis of Alzheimer's disease (AD) by measuring cerebral blood flow (CBF). The aim of this study was to develop a computer-aided classification system for AD patients based on CBFs measured by the ASL technique. The average CBFs in cortical regions were determined as functional image features based on the CBF map image, which was non-linearly transformed to a Talairach brain atlas by using a free-form deformation. An artificial neural network (ANN) was trained with the CBF functional features in 10 cortical regions and was employed for distinguishing patients with AD from control subjects. For evaluation of the method, we applied the proposed method to 20 cases, comprising ten AD patients and ten control subjects, who were scanned with a 3.0-Tesla MR unit. The area under the receiver operating characteristic curve obtained by the proposed method was 0.893, based on a leave-one-out-by-case test, in identifying AD cases among the 20 cases. The proposed method would be feasible for classification of patients with AD.
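The leave-one-out evaluation above reports an area under the ROC curve; that statistic can be computed directly from classifier scores via the Mann-Whitney formulation. A minimal sketch follows; the labels and scores are made-up illustration data, not the study's.

```python
def roc_auc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability that a randomly chosen positive case receives a higher
    score than a randomly chosen negative case (ties count 0.5)."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical AD-vs-control outputs from a trained classifier.
labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.7, 0.4, 0.6, 0.3, 0.1]
auc = roc_auc(labels, scores)  # 8/9: one of nine pos/neg pairs is misordered
```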

  16. The effectiveness of an interactive computer program versus traditional lecture in athletic training education.

    PubMed

    Wiksten, D L; Patterson, P; Antonio, K; De La Cruz, D; Buxton, B P

    1998-07-01

    To evaluate the effectiveness of an interactive athletic training educational curriculum (IATEC) computer program as compared with traditional lecture instruction. Instructions on assessment of the quadriceps angle (Q-angle) were compared. Dependent measures consisted of cognitive knowledge, practical skill assessment, and attitudes toward the 2 methods of instruction. Sixty-six subjects were selected and then randomly assigned to 3 different groups: traditional lecture, IATEC, and control. The traditional lecture group (n = 22) received a 50-minute lecture/demonstration covering the same instructional content as the Q-angle module of the IATEC program. The IATEC group (n = 20; 2 subjects were dropped from this group due to scheduling conflicts) worked independently for 50 to 65 minutes using the Q-angle module of the IATEC program. The control group (n = 22) received no instruction. Subjects were recruited from an undergraduate athletic training education program and were screened for prior knowledge of the Q-angle. A 9-point multiple choice examination was used to determine cognitive knowledge of the Q-angle. A 12-point yes-no checklist was used to determine whether or not the subjects were able to correctly measure the Q-angle. The Allen Attitude Toward Computer-Assisted Instruction Semantic Differential Survey was used to assess student attitudes toward the 2 methods of instruction. The survey examined overall attitudes, in addition to 3 subscales: comfort, creativity, and function. The survey was scored from 1 to 7, with 7 being the most favorable and 1 being the least favorable. Results of a 1-way ANOVA on cognitive knowledge of the Q-angle revealed that the traditional lecture and IATEC groups performed significantly better than the control group, and the traditional lecture group performed significantly better than the IATEC group. 
Results of a 1-way ANOVA on practical skill performance revealed that the traditional lecture and IATEC groups performed significantly better than the control group, but there were no significant differences between the traditional lecture and IATEC groups on practical skill performance. Results of a t test indicated significantly more favorable attitudes (P < .05) for the traditional lecture group when compared with the IATEC group for comfort, creativity, and function. Our results suggest that use of the IATEC computer module is an effective means of instruction; however, use of the IATEC program alone may not be sufficient for educating students in cognitive knowledge. Further research is needed to determine the effectiveness of the IATEC computer program as a supplement to traditional lecture instruction in athletic training education.

  17. The Effectiveness of an Interactive Computer Program Versus Traditional Lecture in Athletic Training Education

    PubMed Central

    Wiksten, Denise Lebsack; Patterson, Patricia; Antonio, Kimberly; De La Cruz, Daniel; Buxton, Barton P.

    1998-01-01

    Objective: To evaluate the effectiveness of an interactive athletic training educational curriculum (IATEC) computer program as compared with traditional lecture instruction. Instructions on assessment of the quadriceps angle (Q-angle) were compared. Dependent measures consisted of cognitive knowledge, practical skill assessment, and attitudes toward the 2 methods of instruction. Design and Setting: Sixty-six subjects were selected and then randomly assigned to 3 different groups: traditional lecture, IATEC, and control. The traditional lecture group (n = 22) received a 50-minute lecture/demonstration covering the same instructional content as the Q-angle module of the IATEC program. The IATEC group (n = 20; 2 subjects were dropped from this group due to scheduling conflicts) worked independently for 50 to 65 minutes using the Q-angle module of the IATEC program. The control group (n = 22) received no instruction. Subjects: Subjects were recruited from an undergraduate athletic training education program and were screened for prior knowledge of the Q-angle. Measurements: A 9-point multiple choice examination was used to determine cognitive knowledge of the Q-angle. A 12-point yes-no checklist was used to determine whether or not the subjects were able to correctly measure the Q-angle. The Allen Attitude Toward Computer-Assisted Instruction Semantic Differential Survey was used to assess student attitudes toward the 2 methods of instruction. The survey examined overall attitudes, in addition to 3 subscales: comfort, creativity, and function. The survey was scored from 1 to 7, with 7 being the most favorable and 1 being the least favorable. Results: Results of a 1-way ANOVA on cognitive knowledge of the Q-angle revealed that the traditional lecture and IATEC groups performed significantly better than the control group, and the traditional lecture group performed significantly better than the IATEC group. 
Results of a 1-way ANOVA on practical skill performance revealed that the traditional lecture and IATEC groups performed significantly better than the control group, but there were no significant differences between the traditional lecture and IATEC groups on practical skill performance. Results of a t test indicated significantly more favorable attitudes (P < .05) for the traditional lecture group when compared with the IATEC group for comfort, creativity, and function. Conclusions: Our results suggest that use of the IATEC computer module is an effective means of instruction; however, use of the IATEC program alone may not be sufficient for educating students in cognitive knowledge. Further research is needed to determine the effectiveness of the IATEC computer program as a supplement to traditional lecture instruction in athletic training education. PMID:16558517

  18. A Decade of Mobile Computing for Students

    ERIC Educational Resources Information Center

    Jenny, Frederick J.

    2005-01-01

    This paper describes mobile computing at Grove City College, a small, private, liberal arts institution in Western Pennsylvania. The college has entered its second decade of mobile computing for students at a school of about 2,200. Each incoming freshman receives a laptop computer and inkjet printer during fall orientation, all a benefit of…

  19. Computer Games for the Math Achievement of Diverse Students

    ERIC Educational Resources Information Center

    Kim, Sunha; Chang, Mido

    2010-01-01

    Although computer games as a way to improve students' learning have received attention from many educational researchers, no consensus has been reached on the effects of computer games on student achievement. Moreover, there is a lack of empirical research on the differential effects of computer games on diverse learners. In response, this study…

  20. 8 CFR 293.1 - Computation of interest.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    8 Aliens and Nationality (2010-01-01), INTEREST ON CASH RECEIVED TO SECURE IMMIGRATION BONDS, § 293.1 Computation of interest: Interest shall be... The simple interest table in § 293.3 shall be utilized in the computation of interest under this part.
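The simple (non-compounding) interest computation the regulation refers to can be sketched as follows; the rate and day-count convention here are illustrative assumptions, not values taken from the table in § 293.3.

```python
def simple_interest(principal, annual_rate, days, year_days=365):
    """Simple interest I = P * r * t, with t expressed in years."""
    return principal * annual_rate * (days / year_days)

# e.g. a $1,000 cash bond at an assumed 3% annual rate, held one year:
interest = simple_interest(1000.0, 0.03, 365)  # about $30.00
```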

  1. Effect of Wind Flow on Convective Heat Losses from Scheffler Solar Concentrator Receivers

    NASA Astrophysics Data System (ADS)

    Nene, Anita Arvind; Ramachandran, S.; Suyambazhahan, S.

    2018-05-01

    The receiver is an important element of a solar concentrator system. In a Scheffler concentrator, solar rays are concentrated at the focus of a parabolic dish. While radiation losses are relatively predictable and calculable, since they are strongly related to receiver temperature, convective losses had been difficult to estimate prior to the current work because of additional factors such as wind direction, wind speed, and receiver geometry. An experimental investigation was carried out on two receiver geometries, cylindrical and conical, with a 2.7 m2 Scheffler to find the tilt condition giving the best efficiency. Experimental results showed that, compared with the cylindrical receiver, the conical receiver gave maximum efficiency at a 45° tilt angle. However, the effects of factors such as wind speed and wind direction on convective losses could not be isolated experimentally. The current work therefore investigated the same two geometries using computational fluid dynamics (FLUENT) to compute convective losses while varying the receiver tilt angle, wind velocity, and wind direction. For the cylindrical receiver, the directional heat transfer coefficient (HTC) is strongly sensitive to tilt, leading to higher convective heat losses. For the conical receiver, the directional average HTC is much less sensitive to tilt, leading to lower convective heat loss.
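Once a heat transfer coefficient is known from the CFD runs, the convective loss itself follows Newton's law of cooling. A minimal sketch; the values for h, area, and temperatures below are assumed for illustration and are not taken from the study.

```python
def convective_loss(h, area, t_surface, t_ambient):
    """Convective heat loss in watts, Q = h * A * (Ts - Tinf), where h
    (W/m^2.K) would come from the directional HTC results."""
    return h * area * (t_surface - t_ambient)

# Assumed values: h = 12 W/m^2.K, 0.5 m^2 receiver, 250 K excess temperature.
q = convective_loss(12.0, 0.5, 550.0, 300.0)  # 1500.0 W
```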

  2. Low latency, high bandwidth data communications between compute nodes in a parallel computer

    DOEpatents

    Blocksome, Michael A

    2014-04-01

    Methods, systems, and products are disclosed for data transfers between nodes in a parallel computer that include: receiving, by an origin DMA on an origin node, a buffer identifier for a buffer containing data for transfer to a target node; sending, by the origin DMA to the target node, an RTS message; transferring, by the origin DMA, a data portion to the target node using a memory FIFO operation that specifies one end of the buffer from which to begin transferring the data; receiving, by the origin DMA, an acknowledgement of the RTS message from the target node; and transferring, by the origin DMA in response to receiving the acknowledgement, any remaining data portion to the target node using a direct put operation that specifies the other end of the buffer from which to begin transferring the data, including initiating the direct put operation without invoking an origin processing core.
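The claimed two-phase transfer can be pictured as streaming from one end of the buffer until the acknowledgement arrives, then delivering the untouched remainder from the other end in a single direct-put operation. A toy sketch follows; the chunk size and ACK timing are hypothetical, and real DMA hardware is of course not modeled.

```python
def two_phase_transfer(buffer, chunk_size, chunks_before_ack):
    """Phase 1: the memory-FIFO path sends fixed-size chunks from the
    front of the buffer until the RTS acknowledgement arrives.
    Phase 2: the untouched tail goes out in one direct-put operation.
    The two regions are disjoint, so reassembly is lossless."""
    offset = 0
    fifo_chunks = []
    for _ in range(chunks_before_ack):      # FIFO phase, pre-ACK
        if offset >= len(buffer):
            break
        fifo_chunks.append(buffer[offset:offset + chunk_size])
        offset += chunk_size
    direct_put = buffer[offset:]            # remainder, post-ACK
    # What the target reassembles from both phases:
    return b"".join(fifo_chunks) + direct_put
```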

  3. Low latency, high bandwidth data communications between compute nodes in a parallel computer

    DOEpatents

    Blocksome, Michael A

    2014-04-22

    Methods, systems, and products are disclosed for data transfers between nodes in a parallel computer that include: receiving, by an origin DMA on an origin node, a buffer identifier for a buffer containing data for transfer to a target node; sending, by the origin DMA to the target node, an RTS message; transferring, by the origin DMA, a data portion to the target node using a memory FIFO operation that specifies one end of the buffer from which to begin transferring the data; receiving, by the origin DMA, an acknowledgement of the RTS message from the target node; and transferring, by the origin DMA in response to receiving the acknowledgement, any remaining data portion to the target node using a direct put operation that specifies the other end of the buffer from which to begin transferring the data, including initiating the direct put operation without invoking an origin processing core.

  4. Low latency, high bandwidth data communications between compute nodes in a parallel computer

    DOEpatents

    Blocksome, Michael A

    2013-07-02

    Methods, systems, and products are disclosed for data transfers between nodes in a parallel computer that include: receiving, by an origin DMA on an origin node, a buffer identifier for a buffer containing data for transfer to a target node; sending, by the origin DMA to the target node, an RTS message; transferring, by the origin DMA, a data portion to the target node using a memory FIFO operation that specifies one end of the buffer from which to begin transferring the data; receiving, by the origin DMA, an acknowledgement of the RTS message from the target node; and transferring, by the origin DMA in response to receiving the acknowledgement, any remaining data portion to the target node using a direct put operation that specifies the other end of the buffer from which to begin transferring the data, including initiating the direct put operation without invoking an origin processing core.

  5. Should regional ventilation function be considered during radiation treatment planning to prevent radiation-induced complications?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lan, Fujun; Jeudy, Jean; D’Souza, Warren

    Purpose: To investigate the incorporation of pretherapy regional ventilation function in predicting radiation fibrosis (RF) in stage III non-small cell lung cancer (NSCLC) patients treated with concurrent thoracic chemoradiotherapy. Methods: Thirty-seven patients with stage III NSCLC were retrospectively studied. Patients received one cycle of cisplatin–gemcitabine, followed by two to three cycles of cisplatin–etoposide concurrently with involved-field thoracic radiotherapy (46–66 Gy; 2 Gy/fraction). Pretherapy regional ventilation images of the lung were derived from 4D computed tomography via a density change–based algorithm with mass correction. In addition to the conventional dose–volume metrics (V20, V30, V40, and mean lung dose), dose–function metrics (fV20, fV30, fV40, and functional mean lung dose) were generated by combining regional ventilation and radiation dose. A new class of metrics was derived and referred to as dose–subvolume metrics (sV20, sV30, sV40, and subvolume mean lung dose); these were defined as the conventional dose–volume metrics computed on the functional lung. Area under the receiver operating characteristic curve (AUC) values and logistic regression analyses were used to evaluate these metrics in predicting hallmark characteristics of RF (lung consolidation, volume loss, and airway dilation). Results: AUC values for the dose–volume metrics in predicting lung consolidation, volume loss, and airway dilation were 0.65–0.69, 0.57–0.70, and 0.69–0.76, respectively. The respective ranges for dose–function metrics were 0.63–0.66, 0.61–0.71, and 0.72–0.80, and for dose–subvolume metrics were 0.50–0.65, 0.65–0.75, and 0.73–0.85.
Using an AUC value of 0.70 as the cutoff suggested that at least one of each type of metric (dose–volume, dose–function, dose–subvolume) was predictive for volume loss and airway dilation, whereas lung consolidation could not be accurately predicted by any of the metrics. Logistic regression analyses showed that dose–function and dose–subvolume metrics were significant (P values ≤ 0.02) in predicting volume loss and airway dilation. The likelihood ratio test showed that combining dose–function and/or dose–subvolume metrics with dose–volume metrics yielded significant improvements in prediction accuracy for volume loss and airway dilation (P values ≤ 0.04). Conclusions: The authors' results demonstrated that the inclusion of regional ventilation function improved accuracy in predicting RF. In particular, dose–subvolume metrics provided a promising method for preventing radiation-induced pulmonary complications.
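The distinction between a conventional dose-volume metric and its ventilation-weighted dose-function counterpart can be sketched per voxel. The dose and ventilation arrays below are invented toy data; real metrics are computed over segmented lung volumes.

```python
def v_metric(voxel_doses, threshold_gy):
    """Conventional Vx: fraction of lung voxels receiving >= x Gy."""
    return sum(d >= threshold_gy for d in voxel_doses) / len(voxel_doses)

def fv_metric(voxel_doses, ventilation, threshold_gy):
    """Dose-function fVx: ventilation-weighted fraction of the lung
    receiving >= x Gy (voxel ventilation assumed normalized to sum to 1)."""
    return sum(v for d, v in zip(voxel_doses, ventilation) if d >= threshold_gy)

doses = [10.0, 25.0, 30.0, 5.0]   # Gy, toy values
vent  = [0.1, 0.4, 0.3, 0.2]      # normalized regional ventilation
# Here V20 = 0.5 but fV20 = 0.7: the irradiated voxels are better ventilated,
# so the functional burden of the same dose distribution is higher.
```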

  6. A new fifth parameter for transverse isotropy III: reflection and transmission coefficients

    NASA Astrophysics Data System (ADS)

    Kawakatsu, Hitoshi

    2018-04-01

    The effect of the newly defined fifth parameter, ηκ, of transverse anisotropy on the reflection and transmission coefficients, especially the P-to-S and S-to-P conversion coefficients, is examined. While ηκ systematically affects the P-to-S and S-to-P conversions, in the incidence-angle range of practical interest for receiver function studies the effect may be asymmetric, in the sense that the P-wave receiver function is affected more than the S-receiver function in terms of amplitude. This asymmetry may help resolve ηκ via extensive receiver function analysis. It is also found that P-wave anisotropy significantly influences the P-to-S and S-to-P conversion coefficients, which complicates the interpretation of receiver functions because, for isotropic media, the primary receiver function signals are typically attributed to S-wave velocity changes rather than to P-wave changes.

  7. Comparison of continuously acquired resting state and extracted analogues from active tasks.

    PubMed

    Ganger, Sebastian; Hahn, Andreas; Küblböck, Martin; Kranz, Georg S; Spies, Marie; Vanicek, Thomas; Seiger, René; Sladky, Ronald; Windischberger, Christian; Kasper, Siegfried; Lanzenberger, Rupert

    2015-10-01

    Functional connectivity analysis of brain networks has become an important tool for investigating human brain function. Although functional connectivity computations are usually based on resting-state data, their application to task-specific fMRI has received growing attention. Three major methods for extracting resting-state data from task-related signal have been proposed: (1) use of unmanipulated task data for functional connectivity; (2) regression against task effects, subsequently using the residuals; and (3) concatenation of baseline blocks located in-between task blocks. Despite widespread application in current research, consensus on which method best resembles resting state seems to be missing. We therefore evaluated these techniques in a sample of 26 healthy controls measured at 7 Tesla. In addition to continuous resting-state, two different task paradigms were assessed (emotion discrimination and right finger-tapping) and five well-described networks were analyzed (default mode, thalamus, cuneus, sensorimotor, and auditory). Investigating the similarity to continuous resting-state (Dice, intraclass correlation coefficient (ICC), R(2)) showed that regression against task effects yields the functional connectivity networks most alike to resting state. However, all methods exhibited significant differences when compared to continuous resting-state, and similarity metrics were lower than test-retest of two resting-state scans. Omitting global signal regression did not change these findings. Visually, the networks are highly similar, but on closer investigation marked differences can be found. Therefore, our data do not support referring to resting state when extracting signals from task designs, although functional connectivity computed from task-specific data may indeed yield interesting information. © 2015 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.
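Method (2) above, regression against task effects, amounts to removing the fitted task response from each voxel's time series and correlating the residuals. A minimal single-regressor sketch with toy time series; real pipelines use full design matrices and HRF convolution.

```python
def residualize(y, x):
    """Regress y against a single task regressor x (plus intercept) by
    ordinary least squares and return the residuals; by construction the
    residuals are orthogonal to the task regressor."""
    n = len(y)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    beta = sxy / sxx
    return [yi - (my + beta * (xi - mx)) for xi, yi in zip(x, y)]

def pearson(a, b):
    """Pearson correlation of two equal-length series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    va = sum((ai - ma) ** 2 for ai in a)
    vb = sum((bi - mb) ** 2 for bi in b)
    return cov / (va * vb) ** 0.5

# Toy block design (x) and a voxel time series (y) with a task response:
x = [0, 1, 0, 1, 0, 1]
y = [1, 3, 2, 4, 0, 5]
residuals = residualize(y, x)  # task effect removed before connectivity
```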

  8. Comparison of continuously acquired resting state and extracted analogues from active tasks

    PubMed Central

    Ganger, Sebastian; Hahn, Andreas; Küblböck, Martin; Kranz, Georg S.; Spies, Marie; Vanicek, Thomas; Seiger, René; Sladky, Ronald; Windischberger, Christian; Kasper, Siegfried

    2015-01-01

    Abstract Functional connectivity analysis of brain networks has become an important tool for investigation of human brain function. Although functional connectivity computations are usually based on resting‐state data, the application to task‐specific fMRI has received growing attention. Three major methods for extraction of resting‐state data from task‐related signal have been proposed (1) usage of unmanipulated task data for functional connectivity; (2) regression against task effects, subsequently using the residuals; and (3) concatenation of baseline blocks located in‐between task blocks. Despite widespread application in current research, consensus on which method best resembles resting‐state seems to be missing. We, therefore, evaluated these techniques in a sample of 26 healthy controls measured at 7 Tesla. In addition to continuous resting‐state, two different task paradigms were assessed (emotion discrimination and right finger‐tapping) and five well‐described networks were analyzed (default mode, thalamus, cuneus, sensorimotor, and auditory). Investigating the similarity to continuous resting‐state (Dice, Intraclass correlation coefficient (ICC), R 2) showed that regression against task effects yields functional connectivity networks most alike to resting‐state. However, all methods exhibited significant differences when compared to continuous resting‐state and similarity metrics were lower than test‐retest of two resting‐state scans. Omitting global signal regression did not change these findings. Visually, the networks are highly similar, but through further investigation marked differences can be found. Therefore, our data does not support referring to resting‐state when extracting signals from task designs, although functional connectivity computed from task‐specific data may indeed yield interesting information. Hum Brain Mapp 36:4053–4063, 2015. © 2015 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc. PMID:26178250

  9. Functional Connectivity Between Anterior Insula and Key Nodes of Frontoparietal Executive Control and Salience Networks Distinguish Bipolar Depression From Unipolar Depression and Healthy Control Subjects.

    PubMed

    Ellard, Kristen K; Zimmerman, Jared P; Kaur, Navneet; Van Dijk, Koene R A; Roffman, Joshua L; Nierenberg, Andrew A; Dougherty, Darin D; Deckersbach, Thilo; Camprodon, Joan A

    2018-05-01

    Patients with bipolar depression are characterized by dysregulation across the full spectrum of mood, differentiating them from patients with unipolar depression. The ability to switch neural resources among the default mode network, salience network, and executive control network (ECN) has been proposed as a key mechanism for adaptive mood regulation. The anterior insula is implicated in the modulation of functional network switching. Differential connectivity between anterior insula and functional networks may provide insights into pathophysiological differences between bipolar and unipolar mood disorders, with implications for diagnosis and treatment. Resting-state functional magnetic resonance imaging data were collected from 98 subjects (35 unipolar, 24 bipolar, and 39 healthy control subjects). Pearson correlations were computed between bilateral insula seed regions and a priori defined target regions from the default mode network, salience network, and ECN. After r-to-z transformation, a one-way multivariate analysis of covariance was conducted to identify significant differences in connectivity between groups. Post hoc pairwise comparisons were conducted and Bonferroni corrections were applied. Receiver-operating characteristics were computed to assess diagnostic sensitivity. Patients with bipolar depression evidenced significantly altered right anterior insula functional connectivity with the inferior parietal lobule of the ECN relative to patients with unipolar depression and control subjects. Right anterior insula-inferior parietal lobule connectivity significantly discriminated patients with bipolar depression. Impaired functional connectivity between the anterior insula and the inferior parietal lobule of the ECN distinguishes patients with bipolar depression from those with unipolar depression and healthy control subjects. 
This finding highlights a pathophysiological mechanism with potential as a therapeutic target and a clinical biomarker for bipolar disorder, exhibiting reasonable sensitivity and specificity. Copyright © 2018 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
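The r-to-z step mentioned above is the Fisher transform, z = atanh(r), which makes sample correlation coefficients approximately normally distributed so that group statistics (here, the MANCOVA) can be applied. A minimal sketch:

```python
import math

def fisher_z(r):
    """Fisher r-to-z transform of a correlation coefficient,
    equivalently 0.5 * ln((1 + r) / (1 - r))."""
    return math.atanh(r)

# e.g. a hypothetical seed-to-target correlation of r = 0.5:
z = fisher_z(0.5)  # approximately 0.549
```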

  10. Spectral determinants for twist field correlators

    NASA Astrophysics Data System (ADS)

    Belitsky, A. V.

    2018-04-01

    Twist fields were introduced a few decades ago as a quantum counterpart to classical kink configurations and disorder variables in low-dimensional field theories. In recent years they have received a new incarnation within the framework of geometric entropy and the strong coupling limit of four-dimensional scattering amplitudes. In this paper, we study their two-point correlation functions in a free massless scalar theory, namely, twist-twist and twist-antitwist correlators. In spite of the simplicity of the model in question, the properties of the latter are far from trivial. The problem is reduced, within the formalism of the path integral, to the study of spectral determinants on surfaces with conical points, which are then computed exactly making use of the zeta function regularization. We also provide an insight into twist correlators for a massive complex scalar by means of the Lifshitz-Krein trace formula.
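The zeta-function regularization invoked here assigns a finite value to an otherwise divergent spectral determinant. For an operator $A$ with eigenvalues $\lambda_n$, one defines the spectral zeta function for $\operatorname{Re} s$ large and analytically continues it to $s = 0$:

```latex
\zeta_A(s) = \sum_n \lambda_n^{-s},
\qquad
\det A := \exp\bigl(-\zeta_A'(0)\bigr).
```

Formally $-\zeta_A'(0) = \sum_n \ln \lambda_n$, so the definition reproduces the naive determinant whenever the spectrum is finite.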

  11. MHSS: a material handling system simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pomernacki, L.; Hollstien, R.B.

    1976-04-07

    A Material Handling System Simulator (MHSS) program is described that provides specialized functional blocks for modeling and simulation of nuclear material handling systems. Models of nuclear fuel fabrication plants may be built using functional blocks that simulate material receiving, storage, transport, inventory, processing, and shipping operations as well as the control and reporting tasks of operators or on-line computers. Blocks are also provided that allow the user to observe and gather statistical information on the dynamic behavior of simulated plants over single or replicated runs. Although it is currently being developed for the nuclear materials handling application, MHSS can be adapted to other industries in which material accountability is important. In this paper, emphasis is on the simulation methodology of the MHSS program with application to the nuclear material safeguards problem.

  12. Evaluation of computer-based computer tomography stratification against outcome models in connective tissue disease-related interstitial lung disease: a patient outcome study.

    PubMed

    Jacob, Joseph; Bartholmai, Brian J; Rajagopalan, Srinivasan; Brun, Anne Laure; Egashira, Ryoko; Karwoski, Ronald; Kokosi, Maria; Wells, Athol U; Hansell, David M

    2016-11-23

    To evaluate computer-based computer tomography (CT) analysis (CALIPER) against visual CT scoring and pulmonary function tests (PFTs) when predicting mortality in patients with connective tissue disease-related interstitial lung disease (CTD-ILD). To identify outcome differences between distinct CTD-ILD groups derived following automated stratification of CALIPER variables. A total of 203 consecutive patients with assorted CTD-ILDs had CT parenchymal patterns evaluated by CALIPER and visual CT scoring: honeycombing, reticular pattern, ground glass opacities, pulmonary vessel volume, emphysema, and traction bronchiectasis. CT scores were evaluated against pulmonary function tests: forced vital capacity, diffusing capacity for carbon monoxide, carbon monoxide transfer coefficient, and composite physiologic index for mortality analysis. Automated stratification of CALIPER-CT variables was evaluated in place of and alongside forced vital capacity and diffusing capacity for carbon monoxide in the ILD gender, age physiology (ILD-GAP) model using receiver operating characteristic curve analysis. Cox regression analyses identified four independent predictors of mortality: patient age (P < 0.0001), smoking history (P = 0.0003), carbon monoxide transfer coefficient (P = 0.003), and pulmonary vessel volume (P < 0.0001). Automated stratification of CALIPER variables identified three morphologically distinct groups which were stronger predictors of mortality than all CT and functional indices. The Stratified-CT model substituted automated stratified groups for functional indices in the ILD-GAP model and maintained model strength (area under curve (AUC) = 0.74, P < 0.0001), ILD-GAP (AUC = 0.72, P < 0.0001). Combining automated stratified groups with the ILD-GAP model (stratified CT-GAP model) strengthened predictions of 1- and 2-year mortality: ILD-GAP (AUC = 0.87 and 0.86, respectively); stratified CT-GAP (AUC = 0.89 and 0.88, respectively). 
CALIPER-derived pulmonary vessel volume is an independent predictor of mortality across all CTD-ILD patients. Furthermore, automated stratification of CALIPER CT variables represents a novel method of prognostication at least as robust as PFTs in CTD-ILD patients.

  13. How the cerebellum may monitor sensory information for spatial representation

    PubMed Central

    Rondi-Reig, Laure; Paradis, Anne-Lise; Lefort, Julie M.; Babayan, Benedicte M.; Tobin, Christine

    2014-01-01

    The cerebellum has already been shown to participate in the navigation function. We propose here that this structure is involved in maintaining a sense of direction and location during self-motion by monitoring sensory information and interacting with navigation circuits to update the mental representation of space. To better understand the processing performed by the cerebellum in the navigation function, we have reviewed: the anatomical pathways that convey self-motion information to the cerebellum; the computational algorithm(s) thought to be performed by the cerebellum on these multi-source inputs; the cerebellar outputs directed toward navigation circuits; and the influence of self-motion information on space-modulated cells receiving cerebellar outputs. This review highlights that the cerebellum is adequately wired to combine the diversity of sensory signals to be monitored during self-motion and to fuel the navigation circuits. The direct anatomical projections of the cerebellum toward the head-direction cell system and the parietal cortex make those structures possible relays of the cerebellum's influence on the hippocampal spatial map. We describe computational models of cerebellar function showing that the cerebellum can filter out the components of the sensory signals that are predictable, and provides a novelty output. We finally speculate that this novelty output is taken into account by the navigation structures, which implement an update over time of position and stabilize perception during navigation. PMID:25408638
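The "novelty output" idea, filtering out the predictable component of sensation, reduces at its simplest to a subtraction between the incoming signal and a forward-model prediction. A deliberately minimal sketch; in the models reviewed, the prediction would come from a learned internal model rather than being supplied by hand.

```python
def novelty(signal, prediction):
    """Cerebellar-style novelty output: the part of the sensory signal
    not accounted for by the forward-model prediction."""
    return [s - p for s, p in zip(signal, prediction)]

# A perfectly predicted, self-generated movement yields zero novelty;
# an unexpected perturbation at the last sample survives the filter.
residual = novelty([1.0, 2.0, 3.0], [1.0, 2.0, 2.5])  # [0.0, 0.0, 0.5]
```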

  14. Lateral and feedforward inhibition suppress asynchronous activity in a large, biophysically-detailed computational model of the striatal network

    PubMed Central

    Moyer, Jason T.; Halterman, Benjamin L.; Finkel, Leif H.; Wolf, John A.

    2014-01-01

    Striatal medium spiny neurons (MSNs) receive lateral inhibitory projections from other MSNs and feedforward inhibitory projections from fast-spiking, parvalbumin-containing striatal interneurons (FSIs). The functional roles of these connections are unknown and difficult to study in an experimental preparation. We therefore investigated the functionality of both lateral (MSN-MSN) and feedforward (FSI-MSN) inhibition using a large-scale computational model of the striatal network. The model consists of 2744 MSNs of 189 compartments each and 121 FSIs of 148 compartments each, with dendrites explicitly represented and almost all known ionic currents included and strictly constrained by biological data where appropriate. Our analysis of the model indicates that both lateral inhibition and feedforward inhibition function at the population level to limit non-ensemble MSN spiking while preserving ensemble MSN spiking. Specifically, lateral inhibition enables large ensembles of MSNs firing synchronously to strongly suppress non-ensemble MSNs over a short time-scale (10–30 ms). Feedforward inhibition enables FSIs to strongly inhibit weakly activated, non-ensemble MSNs while moderately inhibiting activated ensemble MSNs. Importantly, FSIs appear to inhibit MSNs more effectively when the FSIs fire asynchronously. Both types of inhibition would increase the signal-to-noise ratio of responding MSN ensembles and contribute to the formation and dissolution of MSN ensembles in the striatal network. PMID:25505406

  15. Effects of Brain-Computer Interface-controlled Functional Electrical Stimulation Training on Shoulder Subluxation for Patients with Stroke: A Randomized Controlled Trial.

    PubMed

    Jang, Yun Young; Kim, Tae Hoon; Lee, Byoung Hee

    2016-06-01

    The purpose of this study was to investigate the effects of brain-computer interface (BCI)-controlled functional electrical stimulation (FES) training on shoulder subluxation of patients with stroke. Twenty subjects were randomly divided into two groups: the BCI-FES group (n = 10) and the FES group (n = 10). Patients in the BCI-FES group were administered conventional therapy with BCI-FES on the shoulder subluxation area of the paretic upper extremity, five times per week for 6 weeks, while the FES group received conventional therapy with FES only. All patients were assessed for shoulder subluxation (vertical distance, VD; horizontal distance, HD), pain (visual analogue scale, VAS) and the Manual Function Test (MFT) at the time of recruitment to the study and after 6 weeks of the intervention. The BCI-FES group demonstrated significant improvements in VD, HD, VAS and MFT after the intervention period, while the FES group demonstrated significant improvements in HD, VAS and MFT. There were also significant differences in VD and two items (shoulder flexion and abduction) of the MFT between the two groups. The results of this study suggest that BCI-FES training may be effective in improving shoulder subluxation of patients with stroke by facilitating motor recovery. Copyright © 2016 John Wiley & Sons, Ltd.

  16. An efficient formulation of Krylov's prediction model for train induced vibrations based on the dynamic reciprocity theorem.

    PubMed

    Degrande, G; Lombaert, G

    2001-09-01

    In Krylov's analytical prediction model, the free field vibration response during the passage of a train is written as the superposition of the effect of all sleeper forces, using Lamb's approximate solution for the Green's function of a halfspace. When this formulation is extended with the Green's functions of a layered soil, considerable computational effort is required if these Green's functions are needed in a wide range of source-receiver distances and frequencies. It is demonstrated in this paper how the free field response can alternatively be computed, using the dynamic reciprocity theorem, applied to moving loads. The formulation is based on the response of the soil due to the moving load distribution for a single axle load. The equations are written in the wave-number-frequency domain, accounting for the invariance of the geometry in the direction of the track. The approach allows for a very efficient calculation of the free field vibration response, distinguishing the quasistatic contribution from the effect of the sleeper passage frequency and its higher harmonics. The methodology is validated by means of in situ vibration measurements during the passage of a Thalys high-speed train on the track between Brussels and Paris. It is shown that the model has good predictive capabilities in the near field at low and high frequencies, but underestimates the response in the midfrequency band.

  17. Controlling data transfers from an origin compute node to a target compute node

    DOEpatents

    Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN

    2011-06-21

    Methods, apparatus, and products are disclosed for controlling data transfers from an origin compute node to a target compute node that include: receiving, by an application messaging module on the target compute node, an indication of a data transfer from an origin compute node to the target compute node; and administering, by the application messaging module on the target compute node, the data transfer using one or more messaging primitives of a system messaging module in dependence upon the indication.

  18. 17 CFR 240.15c3-1b - Adjustments to net worth and aggregate indebtedness for certain commodities transactions...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...) Deduct all unsecured receivables, advances and loans except for: (A) Management fees receivable from... broker or dealer in computing “net capital” and which are not receivable from (A) a futures commission... other form of receivable shall not be considered “secured” for the purposes of paragraph (a)(3) of this...

  19. 17 CFR 240.15c3-1b - Adjustments to net worth and aggregate indebtedness for certain commodities transactions...

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...) Deduct all unsecured receivables, advances and loans except for: (A) Management fees receivable from... broker or dealer in computing “net capital” and which are not receivable from (A) a futures commission... other form of receivable shall not be considered “secured” for the purposes of paragraph (a)(3) of this...

  20. 17 CFR 240.15c3-1b - Adjustments to net worth and aggregate indebtedness for certain commodities transactions...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...) Deduct all unsecured receivables, advances and loans except for: (A) Management fees receivable from... broker or dealer in computing “net capital” and which are not receivable from (A) a futures commission... other form of receivable shall not be considered “secured” for the purposes of paragraph (a)(3) of this...

  1. 17 CFR 240.15c3-1b - Adjustments to net worth and aggregate indebtedness for certain commodities transactions...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...) Deduct all unsecured receivables, advances and loans except for: (A) Management fees receivable from... broker or dealer in computing “net capital” and which are not receivable from (A) a futures commission... other form of receivable shall not be considered “secured” for the purposes of paragraph (a)(3) of this...

  2. Exploiting on-node heterogeneity for in-situ analytics of climate simulations via a functional partitioning framework

    NASA Astrophysics Data System (ADS)

    Sapra, Karan; Gupta, Saurabh; Atchley, Scott; Anantharaj, Valentine; Miller, Ross; Vazhkudai, Sudharshan

    2016-04-01

    Efficient resource utilization is critical for improving the end-to-end computing workflow of scientific applications. Heterogeneous node architectures, such as the GPU-enabled Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), present us with further challenges. In many HPC applications on Titan, the accelerators are the primary compute engines while the CPUs orchestrate the offloading of work onto the accelerators and move the output back to main memory. In applications that do not exploit GPUs, on the other hand, CPU usage is dominant while the GPUs sit idle. We utilized the Heterogeneous Functional Partitioning (HFP) runtime framework, which can optimize the usage of resources on a compute node to expedite an application's end-to-end workflow. This approach differs from existing techniques for in-situ analysis in that it provides a framework for on-the-fly, on-node analysis by dynamically exploiting under-utilized resources. We have implemented in the Community Earth System Model (CESM) a new concurrent diagnostic processing capability enabled by the HFP framework. Various single-variate statistics, such as means and distributions, are computed in-situ by launching HFP tasks on the GPU via the node-local HFP daemon. Since our current configuration of CESM does not use GPU resources heavily, we can move these tasks to the GPU using the HFP framework. Each rank running the atmospheric model in CESM pushes the variables of interest via HFP function calls to the HFP daemon. This node-local daemon is responsible for receiving the data from the main program and launching the designated analytics tasks on the GPU. We have implemented these analytics tasks in C and use OpenACC directives to enable GPU acceleration. This methodology is also advantageous when executing GPU-enabled configurations of CESM, since the CPUs will be idle during portions of the runtime. Our results demonstrate that it is more efficient to use the HFP framework to offload these tasks to GPUs than to perform them in the main application: we observe increased resource utilization and overall productivity when the HFP framework manages the end-to-end workflow.
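
    As a rough illustration of the single-variate statistics described above (the class and method names are hypothetical, not the actual HFP API), a node-local collector might accumulate a streaming mean and a coarse histogram as ranks push values:

```python
# Hypothetical sketch of the node-local analytics idea: model ranks push
# variables to a local collector, which maintains single-variate
# statistics (a streaming mean and a coarse histogram) off the critical
# path. Class and method names are illustrative, not the actual HFP API.

from collections import defaultdict

class InSituStats:
    def __init__(self, bins=10, lo=0.0, hi=1.0):
        self.bins, self.lo, self.hi = bins, lo, hi
        self.n = defaultdict(int)
        self.total = defaultdict(float)
        self.hist = defaultdict(lambda: [0] * bins)

    def push(self, name, value):         # called as each rank sends data
        self.n[name] += 1
        self.total[name] += value
        span = self.hi - self.lo
        i = int((value - self.lo) / span * self.bins)
        self.hist[name][min(self.bins - 1, max(0, i))] += 1

    def mean(self, name):
        return self.total[name] / self.n[name]

daemon = InSituStats()
for t in range(100):                      # e.g. a pushed temperature series
    daemon.push("ts", (t % 10) / 10.0)
print(round(daemon.mean("ts"), 2))  # -> 0.45
```

    In the actual system the accumulation would run as a GPU task launched by the daemon, so the simulation ranks only pay for the push.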

  3. Hardware based redundant multi-threading inside a GPU for improved reliability

    DOEpatents

    Sridharan, Vilas; Gurumurthi, Sudhanva

    2015-05-05

    A system and method for verifying computation output using computer hardware are provided. Instances of a computation are generated and processed on hardware-based processors. As the instances of computation are processed, each instance receives a load accessible to the other instances. Instances of output are generated by processing the instances of computation, and these outputs are verified against each other in a hardware-based processor to ensure accuracy of the output.

  4. RATIO COMPUTER

    DOEpatents

    Post, R.F.

    1958-11-11

    An electronic computer circuit is described for producing an output voltage proportional to the product or quotient of the voltages of a pair of input signals. In essence, the disclosed invention provides a computer having two channels adapted to receive separate input signals, each having amplifiers with like fixed amplification factors and like negative-feedback amplifiers. One of the channels receives a constant signal for comparison purposes, whereby a difference signal is produced to control the amplification factors of the variable feedback amplifiers. The output of the other channel is thereby proportional to the product or quotient of the input signals, depending upon the relation of input to fixed signals in the first-mentioned channel.
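
    The shared-gain servo idea in the patent can be sketched numerically (an illustrative analogy, not the actual circuit): a feedback loop drives a gain g until g*x matches the constant reference c, and applying the same g to the second channel's input yields an output proportional to the quotient.

```python
# Numerical analogy (not the patent circuit) for the shared-gain servo:
# a feedback loop drives gain g until g*x matches the constant reference
# c; the same g applied to the second channel's input y gives an output
# proportional to the quotient y/x.

def ratio_computer(x, y, c=1.0, lr=0.05, steps=2000):
    g = 0.0
    for _ in range(steps):
        error = c - g * x        # difference signal in the reference channel
        g += lr * error          # servo adjusts the shared gain
    return g * y                 # second-channel output ~ c * y / x

print(round(ratio_computer(4.0, 8.0), 3))  # -> 2.0
```

    Feeding the product of two signals into x instead would make the same loop a divider-based multiplier, matching the product-or-quotient claim.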

  5. An algorithm for automatic target recognition using passive radar and an EKF for estimating aircraft orientation

    NASA Astrophysics Data System (ADS)

    Ehrman, Lisa M.

    2005-07-01

    Rather than emitting pulses, passive radar systems rely on "illuminators of opportunity," such as TV and FM radio, to illuminate potential targets. These systems are attractive since they allow receivers to operate without emitting energy, rendering them covert. Until recently, most of the research regarding passive radar has focused on detecting and tracking targets. This dissertation focuses on extending the capabilities of passive radar systems to include automatic target recognition. The target recognition algorithm described in this dissertation uses the radar cross section (RCS) of potential targets, collected over a short period of time, as the key information for target recognition. To make the simulated RCS as accurate as possible, the received signal model accounts for aircraft position and orientation, propagation losses, and antenna gain patterns. An extended Kalman filter (EKF) estimates the target's orientation (and uncertainty in the estimate) from velocity measurements obtained from the passive radar tracker. Coupling the aircraft orientation and state with the known antenna locations permits computation of the incident and observed azimuth and elevation angles. The Fast Illinois Solver Code (FISC) simulates the RCS of potential target classes as a function of these angles. Thus, the approximated incident and observed angles allow the appropriate RCS to be extracted from a database of FISC results. Using this process, the RCS of each aircraft in the target class is simulated as though each is executing the same maneuver as the target detected by the system. Two additional scaling processes are required to transform the RCS into a power profile (magnitude only) simulating the signal in the receiver. First, the RCS is scaled by the Advanced Refractive Effects Prediction System (AREPS) code to account for propagation losses that occur as functions of altitude and range. Then, the Numerical Electromagnetic Code (NEC2) computes the antenna gain pattern, further scaling the RCS. A Rician likelihood model compares the scaled RCS of the illuminated aircraft with those of the potential targets. To improve the robustness of the result, the algorithm jointly optimizes over feasible orientation profiles and target types via dynamic programming.
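
    A much-simplified version of the orientation step (assuming, for illustration only, that the aircraft body axis aligns with its velocity vector, and omitting the EKF's uncertainty propagation) recovers azimuth and pitch angles from the tracker's velocity estimate:

```python
# Simplified sketch of the orientation step: assume (for illustration
# only) that the aircraft body axis aligns with its velocity vector, so
# heading and pitch follow directly from the tracked velocity components.
# The dissertation's EKF additionally estimates the uncertainty; that is
# omitted here.

import math

def orientation_from_velocity(vx, vy, vz):
    heading = math.degrees(math.atan2(vy, vx))                # azimuth
    pitch = math.degrees(math.atan2(vz, math.hypot(vx, vy)))  # elevation
    return heading, pitch

h, p = orientation_from_velocity(100.0, 100.0, 10.0)
print(round(h, 1), round(p, 1))  # -> 45.0 4.0
```

    These angles, combined with the known receiver geometry, are what index the precomputed RCS database in the described pipeline.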

  6. Note on: 'EMLCLLER-A program for computing the EM response of a large loop source over a layered earth model' by N.P. Singh and T. Mogi, Computers & Geosciences 29 (2003) 1301-1307

    NASA Astrophysics Data System (ADS)

    Jamie, Majid

    2016-11-01

    Singh and Mogi (2003) presented a forward modeling (FWD) program, coded in FORTRAN 77 and called "EMLCLLER", which computes the frequency-domain electromagnetic (EM) response of a large circular loop, in terms of the vertical magnetic component (Hz), over 1D layered earth models; computations in this program can be performed for variable transmitter-receiver configurations and incorporate both conduction and displacement currents. The integral equations in this program are evaluated through digital linear filters based on the Hankel transforms, together with analytic solutions based on hypergeometric functions. Despite the capabilities of EMLCLLER, there are mistakes in this program that make its FWD results unreliable. The mistakes in EMLCLLER arise from using an incorrect algorithm for computing the reflection coefficient of the EM wave in TE-mode (rTE), and from flawed algorithms for computing the phase and normalized phase values of Hz; in this paper the corrected forms of these mistakes are presented. Moreover, to illustrate how these mistakes affect FWD results, EMLCLLER and the corrected version of the program presented in this paper, titled "EMLCLLER_Corr", are run on different two- and three-layered earth models; their FWD results in terms of the real and imaginary parts of Hz, its normalized amplitude, and the corresponding normalized phase curves are then plotted versus frequency and compared to each other. In addition, Singh and Mogi (2003) also presented extra derivations for computing the radial component of the magnetic field (Hr) and the angular component of the electric field (Eϕ), where the numerical solution presented for Hr is incorrect; in this paper the correct numerical solution for this derivation is also presented.

  7. Investigative clinical study on prostate cancer part IV: exploring functional relationships of total testosterone predicting free testosterone and total prostate-specific antigen in operated prostate cancer patients.

    PubMed

    Porcaro, Antonio B; Petrozziello, Aldo; Migliorini, Filippo; Lacola, Vincenzo; Romano, Mario; Sava, Teodoro; Ghimenton, Claudio; Caruso, Beatrice; Zecchini Antoniolli, Stefano; Rubilotta, Emanuele; Monaco, Carmelo; Comunale, Luigi

    2011-01-01

    To explore, in operated prostate cancer patients, functional relationships of total testosterone (tt) predicting free testosterone (ft) and total PSA. 128 operated prostate cancer patients were simultaneously investigated for tt, ft and PSA before surgery. Patients were not receiving 5α-reductase inhibitors, LH-releasing hormone analogues and testosterone replacement treatment. Scatter plots including ft and PSA versus tt were computed in order to assess the functional relationship of the variables. Linear regression analysis of tt predicting ft and PSA was computed. tt was a significant predictor of the response variable (ft) and different subsets of the patient population were assessed according to the ft to tt ratio. PSA was related to tt according to a nonlinear law. tt was a significant predictor of PSA according to an inversely nonlinear law and different significant clusters of the patient population were assessed according to the different constant of proportionality computed from experimental data. In our prostate cancer population, ft was significantly predicted by tt according to a linear law, and the ft/tt ratio was a significant parameter for assessing the different clusters. Also, tt was a significant variable predicting PSA by a nonlinear law and different clusters of the patient population were assessed by the different constants of proportionality. As a theory, we explain the nonlinear relation of tt in predicting PSA as follows: (a) the number of androgen-independent prostate cancer cells increases as tumor volume and PSA serum levels rise, (b) the prevalence of androgen-independent cells producing a substance which inhibits serum LH, and (c) as a result lower levels of serum tt are detected. Copyright © 2011 S. Karger AG, Basel.

  8. Pupils, Teachers & Palmtop Computers.

    ERIC Educational Resources Information Center

    Robertson, S. I.; And Others

    1996-01-01

    To examine the effects of introducing portable computers into secondary schools, a study was conducted regarding information technology skills and attitudes of staff and eighth grade students prior to and after receiving individual portable computers. Knowledge and use of word processing, spreadsheets, and database applications increased for both…

  9. 7 CFR 3203.9 - Accountability and recordkeeping.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... recordkeeping. (a) USDA requires all excess computers or other technical equipment received by an eligible... review the reports for accuracy, as well as fair and equitable distribution of the excess computers or... PROPERTY MANAGEMENT, DEPARTMENT OF AGRICULTURE GUIDELINES FOR THE TRANSFER OF EXCESS COMPUTERS OR OTHER...

  10. 7 CFR 3203.9 - Accountability and recordkeeping.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... recordkeeping. (a) USDA requires all excess computers or other technical equipment received by an eligible... review the reports for accuracy, as well as fair and equitable distribution of the excess computers or... PROPERTY MANAGEMENT, DEPARTMENT OF AGRICULTURE GUIDELINES FOR THE TRANSFER OF EXCESS COMPUTERS OR OTHER...

  11. Inflight IFR procedures simulator

    NASA Technical Reports Server (NTRS)

    Parker, L. C. (Inventor)

    1984-01-01

    An inflight IFR procedures simulator for generating signals and commands to conventional instruments provided in an airplane is described. The simulator includes a signal synthesizer which, upon being activated, generates predetermined simulated signals corresponding to signals normally received from remote sources. A computer is connected to the signal synthesizer and causes it to produce simulated signals responsive to programs fed into the computer. A switching network is connected to the signal synthesizer, the antenna of the aircraft, and the navigational instruments and communication devices, for selectively connecting the instruments and devices to the synthesizer and disconnecting the antenna from the navigational instruments and communication devices. Pressure transducers are connected to the altimeter and speed indicator for supplying electrical signals to the computer indicating the altitude and speed of the aircraft. A compass is connected to supply electrical signals to the computer indicating the heading of the airplane. The computer, upon receiving signals from the pressure transducers and compass, computes the signals that are fed to the signal synthesizer which, in turn, generates simulated navigational signals.

  12. xDSL connection monitor

    DOEpatents

    Horton, John J.

    2006-04-11

    A system and method of maintaining communication between a computer and a server, the server being in communication with the computer via xDSL service or dial-up modem service, with xDSL service being the default mode of communication, the method including sending a request to the server via xDSL service to which the server should respond and determining if a response has been received. If no response has been received, displaying on the computer a message (i) indicating that xDSL service has failed and (ii) offering to establish communication between the computer and the server via the dial-up modem, and thereafter changing the default mode of communication between the computer and the server to dial-up modem service. In a preferred embodiment, an xDSL service provider monitors dial-up modem communications and determines if the computer dialing in normally establishes communication with the server via xDSL service. The xDSL service provider can thus quickly and easily detect xDSL failures.
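
    The described monitoring-and-fallback logic might be sketched as follows (function names are illustrative; the patent does not specify an API):

```python
# Minimal sketch of the monitoring-and-fallback logic described above.
# Function names are illustrative; the patent does not specify an API.

def maintain_link(probe_xdsl, connect_dialup, notify):
    """Try xDSL first; on failure notify the user and fall back."""
    if probe_xdsl():                  # server responded to the xDSL request
        return "xdsl"
    notify("xDSL service failed; offering dial-up fallback")
    connect_dialup()                  # dial-up becomes the new default mode
    return "dialup"

messages = []
mode = maintain_link(lambda: False, lambda: None, messages.append)
print(mode)  # -> dialup
```

    The provider-side detection described in the preferred embodiment is the mirror image: a dial-in from a customer who normally connects via xDSL is itself the failure signal.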

  13. Improving the Capture and Re-Use of Data with Wearable Computers

    NASA Technical Reports Server (NTRS)

    Pfarr, Barbara; Fating, Curtis C.; Green, Daniel; Powers, Edward I. (Technical Monitor)

    2001-01-01

    At the Goddard Space Flight Center, members of the Real-Time Software Engineering Branch are developing a wearable, wireless, voice-activated computer for use in a wide range of crosscutting space applications that would benefit from having instant Internet, network, and computer access with complete mobility and hands-free operations. These applications span many fields and disciplines, including spacecraft fabrication, integration and testing (including environmental testing), and astronaut on-orbit control and monitoring of experiments with ground-based experimenters. To satisfy the needs of NASA customers, this wearable computer needs to be connected to a wireless network, to transmit and receive real-time video over the network, and to receive updated documents via the Internet or NASA servers. The voice-activated computer, with a unique vocabulary, will allow users to access documentation in a hands-free environment and interact in real-time with remote users. We will discuss wearable computer development, hardware and software issues, wireless network limitations, video/audio solutions, and difficulties in language development.

  14. Neural Network Classification of Receiver Functions as a Step Towards Automatic Crustal Parameter Determination

    NASA Astrophysics Data System (ADS)

    Jemberie, A.; Dugda, M. T.; Reusch, D.; Nyblade, A.

    2006-12-01

    Neural networks are decision-making mathematical/engineering tools which, if trained properly, can perform automatically (and objectively) jobs that normally require particular expertise and/or tedious repetition. Here we explore two techniques from the field of artificial neural networks (ANNs) that seek to reduce the time requirements and increase the objectivity of quality control (QC) and event identification (EI) on seismic datasets. We apply multilayer Feed-Forward (FF) Artificial Neural Networks (ANNs) and Self-Organizing Maps (SOMs) in combination with Hk stacking of receiver functions to test the usefulness of automatic classification of receiver functions for crustal parameter determination. Feed-forward ANNs (FFNNs) are a supervised classification tool, while self-organizing maps (SOMs) provide unsupervised classification of large, complex geophysical data sets into a fixed number of distinct generalized patterns or modes. Hk stacking is a methodology that stacks receiver functions based on the relative arrival times of the P-to-S converted phase and the next two reverberations to determine crustal thickness (H) and the Vp-to-Vs ratio (k). We use receiver functions from teleseismic events recorded by the 2000-2002 Ethiopia Broadband Seismic Experiment. Preliminary results of applying the FFNN and Hk stacking of receiver functions for automatic receiver-function classification, as a step toward automatic crustal parameter determination, look encouraging. After training, the FFNN could separate the best receiver functions from bad ones with a success rate of about 75 to 95%. Applying Hk stacking to the receiver functions classified by this FFNN as the best, we obtained a crustal thickness and Vp/Vs ratio of 31±4 km and 1.75±0.05, respectively, for the crust beneath station ARBA in the Main Ethiopian Rift. For comparison, we applied Hk stacking to the receiver functions that we ourselves classified as the best set and found a crustal thickness and Vp/Vs ratio of 31±2 km and 1.75±0.02, respectively.
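
    The Hk stacking step can be sketched as a grid search over trial crustal models, following the standard Zhu-Kanamori formulation (the Vp, ray parameter, spike amplitudes, and stacking weights below are illustrative assumptions, not values from the study):

```python
# Sketch of Hk stacking (after Zhu & Kanamori, 2000): stack receiver-
# function amplitudes at the predicted Ps, PpPs, and PpSs+PsPs arrival
# times over a grid of trial (H, k) models. Vp, ray parameter, spike
# amplitudes, and weights below are illustrative assumptions.

import math

VP, P = 6.5, 0.06                    # crustal Vp (km/s), ray parameter (s/km)

def arrival_times(H, k):
    qs = math.sqrt((k / VP) ** 2 - P ** 2)   # S-wave vertical slowness
    qp = math.sqrt((1 / VP) ** 2 - P ** 2)   # P-wave vertical slowness
    return H * (qs - qp), H * (qs + qp), 2 * H * qs  # Ps, PpPs, PpSs+PsPs

def stack(rf, dt, H, k, w=(0.6, 0.3, 0.1)):
    def amp(t):
        return rf[int(round(t / dt))]
    tPs, tPpPs, tPpSs = arrival_times(H, k)
    return w[0] * amp(tPs) + w[1] * amp(tPpPs) - w[2] * amp(tPpSs)

# Synthetic receiver function for a true model H = 31 km, k = 1.75:
dt = 0.1
rf = [0.0] * 400
for t, a in zip(arrival_times(31.0, 1.75), (1.0, 0.5, -0.4)):
    rf[int(round(t / dt))] = a

# Grid search: H in 25-40 km (0.1 km step), k in 1.60-1.89 (0.01 step).
best = max((stack(rf, dt, H / 10, k / 100), H / 10, k / 100)
           for H in range(250, 401) for k in range(160, 190))
print(best[1], best[2])  # -> 31.0 1.75
```

    The classifier's role in the workflow above is simply to select which receiver functions enter this stack.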

  15. Optimization of MLS receivers for multipath environments

    NASA Technical Reports Server (NTRS)

    Mcalpine, G. A.; Highfill, J. H., III

    1979-01-01

    The angle tracking problems in microwave landing system receivers were studied, along with a receiver design capable of optimal performance in the multipath environments found in air terminal areas. The work included various theoretical and evaluative studies: (1) signal model development; (2) derivation of optimal receiver structures; and (3) development and use of computer simulations for receiver algorithm evaluation. The development of an experimental receiver for flight testing is presented, together with an overview of the work and a summary of the principal results and conclusions.

  16. Machine Learning Classification to Identify the Stage of Brain-Computer Interface Therapy for Stroke Rehabilitation Using Functional Connectivity.

    PubMed

    Mohanty, Rosaleena; Sinha, Anita M; Remsik, Alexander B; Dodd, Keith C; Young, Brittany M; Jacobson, Tyler; McMillan, Matthew; Thoma, Jaclyn; Advani, Hemali; Nair, Veena A; Kang, Theresa J; Caldera, Kristin; Edwards, Dorothy F; Williams, Justin C; Prabhakaran, Vivek

    2018-01-01

    Interventional therapy using brain-computer interface (BCI) technology has shown promise in facilitating motor recovery in stroke survivors; however, the impact of this form of intervention on functional networks outside of the motor network specifically is not well-understood. Here, we investigated resting-state functional connectivity (rs-FC) in stroke participants undergoing BCI therapy across stages, namely pre- and post-intervention, to identify discriminative functional changes using a machine learning classifier, with the goal of categorizing participants into one of the two therapy stages. Twenty chronic stroke participants with persistent upper-extremity motor impairment received neuromodulatory training using a closed-loop neurofeedback BCI device, and resting-state functional MRI (rs-fMRI) scans were collected at four time points: pre-, mid-, post-, and 1 month post-therapy. To evaluate the peak effects of this intervention, rs-FC was analyzed from two specific stages, namely pre- and post-therapy. In total, rs-FC for 236 seeds spanning both motor and non-motor regions of the brain was computed at each stage. Univariate feature selection was applied to reduce the number of features, followed by a principal component-based data transformation used by a linear binary support vector machine (SVM) classifier to classify each participant into a therapy stage. The SVM classifier achieved a cross-validation accuracy of 92.5% using a leave-one-out method. Outside of the motor network, seeds from the fronto-parietal task control, default mode, subcortical, and visual networks emerged as important contributors to the classification. Furthermore, more functional connections were observed to strengthen from the pre- to the post-therapy stage than to weaken, with both sets involving motor and non-motor regions of the brain. These findings may provide new evidence to support the potential clinical utility of BCI therapy as a form of stroke rehabilitation that not only benefits motor recovery but also facilitates recovery in other brain networks. Moreover, delineation of the stronger and weaker changes may inform more optimal designs of BCI interventional therapy, so as to facilitate strengthening changes and suppress weakening ones in the recovery process.
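
    The leave-one-out evaluation described above can be sketched with a toy stand-in (a nearest-centroid rule and synthetic features replace the paper's feature-selection + PCA + SVM pipeline; this only illustrates the cross-validation accounting):

```python
# Toy stand-in for the evaluation scheme: leave-one-out cross-validation
# of a two-stage (pre vs. post) classifier. A nearest-centroid rule and
# synthetic features replace the paper's feature-selection + PCA + SVM
# pipeline; this only illustrates the LOO accounting.

def nearest_centroid_loo(X, y):
    correct = 0
    for i in range(len(X)):
        dists = {}
        for label in set(y):
            pts = [x for j, (x, lab) in enumerate(zip(X, y))
                   if lab == label and j != i]     # hold out sample i
            centroid = [sum(col) / len(pts) for col in zip(*pts)]
            dists[label] = sum((a - b) ** 2 for a, b in zip(X[i], centroid))
        correct += min(dists, key=dists.get) == y[i]
    return correct / len(X)

# Synthetic "rs-FC" features: pre-therapy near 0, post-therapy near 1.
X = [[0.1, 0.2], [0.0, 0.1], [0.2, 0.0], [1.0, 0.9], [0.9, 1.1], [1.1, 1.0]]
y = ["pre", "pre", "pre", "post", "post", "post"]
print(nearest_centroid_loo(X, y))  # -> 1.0
```

    Crucially, any feature selection or dimensionality reduction must be refit inside each leave-one-out fold (as sketched here by recomputing centroids per fold), or the reported accuracy is optimistically biased.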

  17. Tibial rotation kinematics subsequent to knee arthroplasty

    PubMed Central

    Collins, Duane J.; Khatib, Yasser H.; Parker, David A.; Jenkin, Deanne E.; Molnar, Robert B.

    2015-01-01

    Background The use of computer-assisted joint replacement has facilitated precise intraoperative measurement of knee kinematics. The change in the “screw home mechanism” (SHM) resulting from Total Knee Arthroplasty (TKA) with different prostheses and constraints has not yet been accurately described. Methods A pilot study was first completed. Intraoperative kinematic data were collected from two groups of 15 patients receiving different prostheses. Results On average, patients lost 5.3° of external rotation (SD = 6.1°). There was no significant difference between the prostheses or the different prosthetic constraints. Conclusions There was a significant loss of SHM after TKA. Further research is required to understand its impact on patient function. PMID:25829754

  18. 75 FR 41093 - General Services Administration Acquisition Regulation; Rewrite of GSAR Part 516, Types of Contracts

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-15

    ...) will place all orders by EDI using computer-to-computer EDI. If computer-to-computer EDI is not possible, FAS will use an alternative EDI method allowing the Contractor to receive orders by facsimile transmission. Subject to the Contractor's agreement, other agencies may place orders by EDI. * * * * * (g) The...

  19. Easy, Collaborative and Engaging--The Use of Cloud Computing in the Design of Management Classrooms

    ERIC Educational Resources Information Center

    Schneckenberg, Dirk

    2014-01-01

    Background: Cloud computing has recently received interest in information systems research and practice as a new way to organise information with the help of an increasingly ubiquitous computer infrastructure. However, the use of cloud computing in higher education institutions and business schools, as well as its potential to create novel…

  20. Dense, Efficient Chip-to-Chip Communication at the Extremes of Computing

    ERIC Educational Resources Information Center

    Loh, Matthew

    2013-01-01

    The scalability of CMOS technology has driven computation into a diverse range of applications across the power consumption, performance and size spectra. Communication is a necessary adjunct to computation, and whether this is to push data from node-to-node in a high-performance computing cluster or from the receiver of a wireless link to a neural…

  1. 29 CFR 779.421 - Basic rate for computing overtime compensation of nonexempt employees receiving commissions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false Basic rate for computing overtime compensation of nonexempt... Principally by Commissions § 779.421 Basic rate for computing overtime compensation of nonexempt employees... not meet the exemption requirements of section 7(i) may be computed under the provisions of section 7...

  2. 76 FR 14669 - Privacy Act of 1974; CMS Computer Match No. 2011-02; HHS Computer Match No. 1007

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-17

    ... (CMS); and Department of Defense (DoD), Manpower Data Center (DMDC), Defense Enrollment and Eligibility... the results of the computer match and provide the information to TMA for use in its matching program... under TRICARE. DEERS will receive the results of the computer match and provide the information provided...

  3. Computers and the Learning of Biological Concepts: Attitudes and Achievement of Nigerian Students.

    ERIC Educational Resources Information Center

    Jegede, Olugbemiro J.

    1991-01-01

    Compared attitudes toward computer use and achievement in biology for three groups of Nigerian students (n=64): (1) working alone with a computer; (2) working in groups of three on the computer; (3) a control group that received normal instruction (lecture). Students in the second group had the highest scores on attitude. No significant…

  4. Prospective study of functional bone marrow-sparing intensity modulated radiation therapy with concurrent chemotherapy for pelvic malignancies.

    PubMed

    Liang, Yun; Bydder, Mark; Yashar, Catheryn M; Rose, Brent S; Cornell, Mariel; Hoh, Carl K; Lawson, Joshua D; Einck, John; Saenz, Cheryl; Fanta, Paul; Mundt, Arno J; Bydder, Graeme M; Mell, Loren K

    2013-02-01

    To test the hypothesis that intensity modulated radiation therapy (IMRT) can reduce radiation dose to functional bone marrow (BM) in patients with pelvic malignancies (phase IA) and estimate the clinical feasibility and acute toxicity associated with this technique (phase IB). We enrolled 31 subjects (19 with gynecologic cancer and 12 with anal cancer) in an institutional review board-approved prospective trial (6 in the pilot study, 10 in phase IA, and 15 in phase IB). The mean age was 52 years; 8 of 31 patients (26%) were men. Twenty-one subjects completed (18)F-fluorodeoxyglucose (FDG)-positron emission tomography (PET)/computed tomography (CT) simulation and magnetic resonance imaging by use of quantitative IDEAL (IDEAL IQ; GE Healthcare, Waukesha, WI). The PET/CT and IDEAL IQ were registered, and BM subvolumes were segmented above the mean standardized uptake value and below the mean fat fraction within the pelvis and lumbar spine; their intersection was designated as functional BM for IMRT planning. Functional BM-sparing vs total BM-sparing IMRT plans were compared in 12 subjects; 10 were treated with functional BM-sparing pelvic IMRT per protocol. In gynecologic cancer patients, the mean functional BM V(10) (volume receiving ≥10 Gy) and V(20) (volume receiving ≥20 Gy) were 85% vs 94% (P<.0001) and 70% vs 82% (P<.0001), respectively, for functional BM-sparing IMRT vs total BM-sparing IMRT. In anal cancer patients, the corresponding values were 75% vs 77% (P=.06) and 62% vs 67% (P=.002), respectively. Of 10 subjects treated with functional BM-sparing pelvic IMRT, 3 (30%) had acute grade 3 hematologic toxicity or greater. IMRT can reduce dose to BM subregions identified by (18)F-fluorodeoxyglucose-PET/CT and IDEAL IQ. The efficacy of BM-sparing IMRT is being tested in a phase II trial. Copyright © 2013 Elsevier Inc. All rights reserved.
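
    The V(10) and V(20) values reported above are dose-volume histogram quantities: the percentage of a structure's volume receiving at least a threshold dose. A minimal sketch of that computation, assuming equal voxel volumes (the function name is illustrative, not from the study's planning software):

```python
def v_dose(doses_gy, threshold_gy):
    """Percentage of structure volume receiving >= threshold_gy.

    doses_gy: per-voxel dose values (Gy) for the structure, assuming
              equal voxel volumes so voxel counts stand in for volume.
    """
    if not doses_gy:
        raise ValueError("empty dose list")
    n_at_or_above = sum(1 for d in doses_gy if d >= threshold_gy)
    return 100.0 * n_at_or_above / len(doses_gy)
```

    For example, `v_dose([5, 12, 25, 30], 10)` evaluates V(10) for a four-voxel structure, and the same call with a threshold of 20 gives V(20).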

  5. Efficient Vocational Skills Training for People with Cognitive Disabilities: An Exploratory Study Comparing Computer-Assisted Instruction to One-on-One Tutoring.

    PubMed

    Larson, James R; Juszczak, Andrew; Engel, Kathryn

    2016-03-01

    This study compared the effectiveness of computer-assisted instruction to that of one-on-one tutoring for teaching people with mild and moderate cognitive disabilities when both training methods are designed to take account of the specific mental deficits most commonly found in cognitive disability populations. Fifteen participants (age 22-71) received either computer-assisted instruction or one-on-one tutoring in three content domains that were of functional and daily relevance to them: behavioural limits, rights and responsibilities (two modules) and alphabetical sorting. Learning was assessed by means of a series of pretests and four learning cycle post-tests. Both instructional conditions maintained time-on-task and teaching material equivalence, and both incorporated a set of best-practices and empirically supported teaching techniques designed to address attentional deficits, stimulus processing inefficiencies and cognitive load limitations. Strong evidence of learning was found in both instructional method conditions. Moreover, in all content domains the two methods yielded approximately equivalent rates of learning and learning attainment. These findings offer tentative evidence that a repetitive, computer-assisted training program can produce learning outcomes in people with mild and moderate cognitive disabilities that are comparable to those achieved by high-quality one-on-one tutoring. © 2015 John Wiley & Sons Ltd.

  6. RAINLINK: Retrieval algorithm for rainfall monitoring employing microwave links from a cellular communication network

    NASA Astrophysics Data System (ADS)

    Uijlenhoet, R.; Overeem, A.; Leijnse, H.; Rios Gaona, M. F.

    2017-12-01

    The basic principle of rainfall estimation using microwave links is as follows. Rainfall attenuates the electromagnetic signals transmitted from one telephone tower to another. By measuring the received power at one end of a microwave link as a function of time, the path-integrated attenuation due to rainfall can be calculated, which can be converted to average rainfall intensities over the length of a link. Microwave links from cellular communication networks have been proposed as a promising new rainfall measurement technique for a decade. They are particularly interesting for those countries where few surface rainfall observations are available. Yet to date no operational (real-time) link-based rainfall products are available. To advance the process towards operational application and upscaling of this technique, there is a need for freely available, user-friendly computer code for microwave link data processing and rainfall mapping. Such software is now available as R package "RAINLINK" on GitHub (https://github.com/overeem11/RAINLINK). It contains a working example to compute link-based 15-min rainfall maps for the entire surface area of The Netherlands for 40 hours from real microwave link data. This is the first working example using actual data from an extensive network of commercial microwave links; it allows users to test their own algorithms and compare their results with ours. The package consists of modular functions, which facilitates running only part of the algorithm. The main processing steps are: 1) Preprocessing of link data (initial quality and consistency checks); 2) Wet-dry classification using link data; 3) Reference signal determination; 4) Removal of outliers; 5) Correction of received signal powers; 6) Computation of mean path-averaged rainfall intensities; 7) Interpolation of rainfall intensities; 8) Rainfall map visualisation.
Some applications of RAINLINK will be shown based on microwave link data from a temperate climate (the Netherlands), and from a subtropical climate (Brazil). We hope that RAINLINK will promote the application of rainfall monitoring using microwave links in poorly gauged regions around the world. We invite researchers to contribute to RAINLINK to make the code more generally applicable to data from different networks and climates.
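
    The conversion in step 6, from path-integrated attenuation to rainfall intensity, typically relies on the power-law relation k = aR^b between specific attenuation k (dB/km) and rain rate R (mm/h). A minimal Python sketch of that conversion (the function name and the a, b coefficients are illustrative assumptions, not RAINLINK defaults, which are frequency dependent):

```python
def rain_rate_from_attenuation(p_ref_dbm, p_rx_dbm, length_km, a=0.0335, b=1.13):
    """Estimate path-averaged rain rate (mm/h) from a link's power loss.

    p_ref_dbm: reference (dry-weather) received power, dBm
    p_rx_dbm:  currently received power, dBm
    length_km: path length of the link, km
    a, b:      power-law coefficients in k = a * R**b (illustrative values)
    """
    attenuation_db = max(p_ref_dbm - p_rx_dbm, 0.0)  # path-integrated attenuation
    k = attenuation_db / length_km                   # specific attenuation, dB/km
    return (k / a) ** (1.0 / b)                      # invert k = a * R**b
```

    A dry period (received power at the reference level) yields zero rain rate, and a larger power drop over the same path yields a larger estimated intensity.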

  7. Increasing exercise capacity and quality of life of patients with heart failure through Wii gaming: the rationale, design and methodology of the HF-Wii study; a multicentre randomized controlled trial.

    PubMed

    Jaarsma, Tiny; Klompstra, Leonie; Ben Gal, Tuvia; Boyne, Josiane; Vellone, Ercole; Bäck, Maria; Dickstein, Kenneth; Fridlund, Bengt; Hoes, Arno; Piepoli, Massimo F; Chialà, Oronzo; Mårtensson, Jan; Strömberg, Anna

    2015-07-01

    Exercise is known to be beneficial for patients with heart failure (HF), and these patients should therefore be routinely advised to exercise and to be or to become physically active. Despite the beneficial effects of exercise such as improved functional capacity and favourable clinical outcomes, the level of daily physical activity in most patients with HF is low. Exergaming may be a promising new approach to increase the physical activity of patients with HF at home. The aim of this study is to determine the effectiveness of the structured introduction and access to a Wii game computer in patients with HF to improve exercise capacity and level of daily physical activity, to decrease healthcare resource use, and to improve self-care and health-related quality of life. A multicentre randomized controlled study with two treatment groups will include 600 patients with HF. In each centre, patients will be randomized to either motivational support only (control) or structured access to a Wii game computer (Wii). Patients in the control group will receive advice on physical activity and will be contacted by four telephone calls. Patients in the Wii group also will receive advice on physical activity along with a Wii game computer, with instructions and training. The primary endpoint will be exercise capacity at 3 months as measured by the 6 min walk test. Secondary endpoints include exercise capacity at 6 and 12 months, level of daily physical activity, muscle function, health-related quality of life, and hospitalization or death during the 12 months follow-up. The HF-Wii study is a randomized study that will evaluate the effect of exergaming in patients with HF. The findings can be useful to healthcare professionals and improve our understanding of the potential role of exergaming in the treatment and management of patients with HF. NCT01785121. © 2015 The Authors. European Journal of Heart Failure © 2015 European Society of Cardiology.

  8. Clinical and cost effectiveness of computer treatment for aphasia post stroke (Big CACTUS): study protocol for a randomised controlled trial.

    PubMed

    Palmer, Rebecca; Cooper, Cindy; Enderby, Pam; Brady, Marian; Julious, Steven; Bowen, Audrey; Latimer, Nicholas

    2015-01-27

    Aphasia affects the ability to speak, comprehend spoken language, read and write. One third of stroke survivors experience aphasia. Evidence suggests that aphasia can continue to improve after the first few months with intensive speech and language therapy, which is frequently beyond what resources allow. The development of computer software for language practice provides an opportunity for self-managed therapy. This pragmatic randomised controlled trial will investigate the clinical and cost effectiveness of a computerised approach to long-term aphasia therapy post stroke. A total of 285 adults with aphasia at least four months post stroke will be randomly allocated to either usual care, computerised intervention in addition to usual care or attention and activity control in addition to usual care. Those in the intervention group will receive six months of self-managed word finding practice on their home computer with monthly face-to-face support from a volunteer/assistant. Those in the attention control group will receive puzzle activities, supplemented by monthly telephone calls. Study delivery will be coordinated by 20 speech and language therapy departments across the United Kingdom. Outcome measures will be made at baseline, six, nine and 12 months after randomisation by blinded speech and language therapist assessors. Primary outcomes are the change in number of words (of personal relevance) named correctly at six months and improvement in functional conversation. Primary outcomes will be analysed using a Hochberg testing procedure. Significance will be declared if differences in both word retrieval and functional conversation at six months are significant at the 5% level, or if either comparison is significant at 2.5%. A cost utility analysis will be undertaken from the NHS and personal social service perspective. 
Differences between costs and quality-adjusted life years in the three groups will be described and the incremental cost effectiveness ratio will be calculated. Treatment fidelity will be monitored. This is the first fully powered trial of the clinical and cost effectiveness of computerised aphasia therapy. Specific challenges in designing the protocol are considered. Registered with Current Controlled Trials ISRCTN68798818 on 18 February 2014.
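
    The Hochberg testing procedure for the two co-primary endpoints, as described above, reduces to a simple step-up decision rule. A hedged sketch of that rule (the function and key names are hypothetical, for illustration only):

```python
def hochberg_two_endpoints(p_words, p_conversation, alpha=0.05):
    """Hochberg step-up decision for two co-primary endpoints:
    declare both significant if both p-values are below alpha;
    otherwise declare an endpoint significant only if its p-value
    is below alpha / 2."""
    p_hi = max(p_words, p_conversation)
    if p_hi < alpha:  # larger p-value below 5%: reject both hypotheses
        return {"words": True, "conversation": True}
    return {"words": p_words < alpha / 2,
            "conversation": p_conversation < alpha / 2}
```

    For instance, p-values of 0.04 and 0.03 lead to significance on both endpoints, whereas 0.03 paired with 0.30 leads to significance on neither, since 0.03 exceeds the 2.5% threshold.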

  9. A principled dimension-reduction method for the population density approach to modeling networks of neurons with synaptic dynamics.

    PubMed

    Ly, Cheng

    2013-10-01

    The population density approach to neural network modeling has been utilized in a variety of contexts. The idea is to group many similar noisy neurons into populations and track the probability density function for each population that encompasses the proportion of neurons with a particular state rather than simulating individual neurons (i.e., Monte Carlo). It is commonly used for both analytic insight and as a time-saving computational tool. The main shortcoming of this method is that when realistic attributes are incorporated in the underlying neuron model, the dimension of the probability density function increases, leading to intractable equations or, at best, computationally intensive simulations. Thus, developing principled dimension-reduction methods is essential for the robustness of these powerful methods. As a more pragmatic tool, it would be of great value for the larger theoretical neuroscience community. For exposition of this method, we consider a single uncoupled population of leaky integrate-and-fire neurons receiving external excitatory synaptic input only. We present a dimension-reduction method that reduces a two-dimensional partial differential-integral equation to a computationally efficient one-dimensional system and gives qualitatively accurate results in both the steady-state and nonequilibrium regimes. The method, termed modified mean-field method, is based entirely on the governing equations and not on any auxiliary variables or parameters, and it does not require fine-tuning. The principles of the modified mean-field method have potential applicability to more realistic (i.e., higher-dimensional) neural networks.
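
    The contrast drawn above is between tracking a probability density over neuron states and simulating individual neurons (Monte Carlo). A Monte Carlo sketch of the idea, histogramming the membrane potentials of many identical noisy leaky integrate-and-fire neurons to approximate the population density (all parameter values are illustrative, not taken from the paper):

```python
import random

def lif_population_histogram(n_neurons=1000, steps=500, dt=0.001, tau=0.02,
                             mu=1.2, sigma=0.3, v_th=1.0, v_reset=0.0,
                             n_bins=10, seed=0):
    """Simulate a population of noisy LIF neurons and return a normalized
    histogram of membrane potentials, a Monte Carlo stand-in for the
    population density function."""
    rng = random.Random(seed)
    v = [v_reset] * n_neurons
    for _ in range(steps):
        for i in range(n_neurons):
            # Euler-Maruyama step for dv = (mu - v)/tau dt + noise
            v[i] += dt / tau * (mu - v[i]) \
                    + sigma * (dt / tau) ** 0.5 * rng.gauss(0.0, 1.0)
            if v[i] >= v_th:
                v[i] = v_reset  # spike: reset the membrane potential
    counts = [0] * n_bins
    for vi in v:
        b = int((vi - v_reset) / (v_th - v_reset) * n_bins)
        counts[min(max(b, 0), n_bins - 1)] += 1
    return [c / n_neurons for c in counts]
```

    The dimension-reduction problem the paper addresses arises when each neuron carries extra state (e.g., synaptic variables), so the density lives in two or more dimensions rather than over the membrane potential alone.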

  10. Preserving the Finger Lakes for the Future: A Prototype Decision Support System for Water Resource Management, Open Space, and Agricultural Protection

    NASA Technical Reports Server (NTRS)

    Brower, Robert

    2003-01-01

    As described herein, this project has progressed well, with the initiation or completion of a number of program facets at programmatic, technical, and inter-agency levels. The concept of the Virtual Management Operations Center has taken shape, grown, and has been well received by parties from a wide variety of agencies and organizations in the Finger Lakes region and beyond. As it has evolved in design and functionality, and to better illustrate its current focus for this project, it has been given the expanded name of Watershed Virtual Management Operations Center (W-VMOC). It offers the advanced, compelling functionality of interactive 3D visualization interfaced with 2D mapping, all accessed via Internet or virtually any kind of distributed computer network. This strong foundation will allow the development of a Decision Support System (DSS) with anticipated enhanced functionality to be applied to the myriad issues involved in the wise management of the Finger Lakes region.

  11. System and method of self-properties for an autonomous and automatic computer environment

    NASA Technical Reports Server (NTRS)

    Sterritt, Roy (Inventor); Hinchey, Michael G. (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which in some embodiments self health/urgency data and environment health/urgency data may be transmitted externally from an autonomic element. Other embodiments may include transmitting the self health/urgency data and environment health/urgency data together on a regular basis similar to the lub-dub of a heartbeat. Yet other embodiments may include a method for managing a system based on the functioning state and operating status of the system, wherein the method may include processing received signals from the system indicative of the functioning state and the operating status to obtain an analysis of the condition of the system, generating one or more stay alive signals based on the functioning status and the operating state of the system, transmitting the stay-alive signal, transmitting self health/urgency data, and transmitting environment health/urgency data. Still other embodiments may include an autonomic element that includes a self monitor, a self adjuster, an environment monitor, and an autonomic manager.

  12. WCALive: broadcasting a major medical conference on the Internet.

    PubMed

    Palmer, T E; Cumpston, P H; Ruskin, K; Jones, R D

    1997-11-01

    Live video and sound from the 11th World Congress of Anaesthesiology in Sydney, Australia were broadcast over the Internet using the CuSeeme software package as part of an ongoing evaluation of Internet-based telecommunication in the delivery of Continuing Medical Education (CME). This was the first time such a broadcast had been attempted from a medical convention. The broadcast lasted for four days, during which a functioning combination of computer hardware and software was established. Technical issues relating to broadcast of these real time signals over ISDN links and the Internet itself were addressed. Over 200 anaesthetists from around the world were able to 'attend' the plenary sessions via the Internet. Feedback received indicated that audio reception was quite good. Video reception was less successful for those receiving the broadcast via a modem based Internet connection. The received signal in such circumstances was adequate to provide a video presence of the speaker but inadequate to allow details of 35 mm slides to be visualised. We conclude that this technology will be of use in the delivery of CME materials to remote areas provided simultaneous viewing of high resolution still images is possible using another medium, such as the World Wide Web.

  13. Basic concepts and development of an all-purpose computer interface for ROC/FROC observer study.

    PubMed

    Shiraishi, Junji; Fukuoka, Daisuke; Hara, Takeshi; Abe, Hiroyuki

    2013-01-01

    In this study, we initially investigated various aspects of requirements for a computer interface employed in receiver operating characteristic (ROC) and free-response ROC (FROC) observer studies which involve digital images and ratings obtained by observers (radiologists). Secondly, by taking into account these aspects, an all-purpose computer interface utilized for these observer performance studies was developed. Basically, the observer studies can be classified into three paradigms, such as one rating for one case without an identification of a signal location, one rating for one case with an identification of a signal location, and multiple ratings for one case with identification of signal locations. For these paradigms, display modes on the computer interface can be used for single/multiple views of a static image, continuous viewing with cascade images (i.e., CT, MRI), and dynamic viewing of movies (i.e., DSA, ultrasound). Various functions on these display modes, which include windowing (contrast/level), magnifications, and annotations, are needed to be selected by an experimenter corresponding to the purpose of the research. In addition, the rules of judgment for distinguishing between true positives and false positives are an important factor for estimating diagnostic accuracy in an observer study. We developed a computer interface which runs on a Windows operating system by taking into account all aspects required for various observer studies. This computer interface requires experimenters to have sufficient knowledge about ROC/FROC observer studies, but allows its use for any purpose of the observer studies. This computer interface will be distributed publicly in the near future.

  14. Measuring radio-signal power accurately

    NASA Technical Reports Server (NTRS)

    Goldstein, R. M.; Newton, J. W.; Winkelstein, R. A.

    1979-01-01

    Absolute value of signal power in weak radio signals is determined by computer-aided measurements. Equipment operates by averaging the received signal over a several-minute period and comparing the average value with the previously calibrated noise level of the receiver.
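
    The measurement principle amounts to subtracting a calibrated noise floor from a long-term average of total received power. A minimal sketch under that assumption (the function name, units, and values are illustrative, not from the NASA system):

```python
import math

def signal_power_dbm(total_avg_power_w, noise_power_w):
    """Estimate absolute signal power (dBm) from the long-term average of
    total received power, given the receiver's previously calibrated
    noise-floor power (both in watts)."""
    signal_w = total_avg_power_w - noise_power_w  # remove calibrated noise
    if signal_w <= 0:
        raise ValueError("average power does not exceed the noise floor")
    return 10.0 * math.log10(signal_w / 1e-3)     # watts -> dBm
```

    With a 1 mW noise floor and a 2 mW average, the recovered signal is 1 mW, i.e., 0 dBm.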

  15. A robust algorithm for automated target recognition using precomputed radar cross sections

    NASA Astrophysics Data System (ADS)

    Ehrman, Lisa M.; Lanterman, Aaron D.

    2004-09-01

    Passive radar is an emerging technology that offers a number of unique benefits, including covert operation. Many such systems are already capable of detecting and tracking aircraft. The goal of this work is to develop a robust algorithm for adding automated target recognition (ATR) capabilities to existing passive radar systems. In previous papers, we proposed conducting ATR by comparing the precomputed RCS of known targets to that of detected targets. To make the precomputed RCS as accurate as possible, a coordinated flight model is used to estimate aircraft orientation. Once the aircraft's position and orientation are known, it is possible to determine the incident and observed angles on the aircraft, relative to the transmitter and receiver. This makes it possible to extract the appropriate radar cross section (RCS) from our simulated database. This RCS is then scaled to account for propagation losses and the receiver's antenna gain. A Rician likelihood model compares these expected signals from different targets to the received target profile. We have previously employed Monte Carlo runs to gauge the probability of error in the ATR algorithm; however, generation of a statistically significant set of Monte Carlo runs is computationally intensive. As an alternative to Monte Carlo runs, we derive the relative entropy (also known as Kullback-Leibler distance) between two Rician distributions. Since the probability of Type II error in our hypothesis testing problem can be expressed as a function of the relative entropy via Stein's Lemma, this provides us with a computationally efficient method for determining an upper bound on our algorithm's performance. It also provides great insight into the types of classification errors we can expect from our algorithm. This paper compares the numerically approximated probability of Type II error with the results obtained from a set of Monte Carlo runs.
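
    The paper derives the relative entropy between two Rician distributions in closed form; as a hedged numerical illustration only, the same quantity can be estimated by Monte Carlo sampling from the Rician density (the series-based Bessel helper, function names, and parameter values below are illustrative assumptions, not the paper's derivation):

```python
import math
import random

def _i0(x):
    """Modified Bessel function I0 via its power series (fine for moderate x)."""
    term, total, k = 1.0, 1.0, 0
    while term > 1e-12 * total:
        k += 1
        term *= (x / 2.0) ** 2 / (k * k)
        total += term
    return total

def rician_pdf(x, nu, sigma):
    """Rician density with noncentrality nu and scale sigma."""
    s2 = sigma * sigma
    return (x / s2) * math.exp(-(x * x + nu * nu) / (2 * s2)) * _i0(x * nu / s2)

def rician_kl_mc(nu_p, nu_q, sigma, n=20000, seed=1):
    """Monte Carlo estimate of KL(p || q) for two Ricians sharing sigma."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        # Rician draw: magnitude of a complex Gaussian with mean nu_p
        x = math.hypot(nu_p + sigma * rng.gauss(0, 1), sigma * rng.gauss(0, 1))
        total += math.log(rician_pdf(x, nu_p, sigma) / rician_pdf(x, nu_q, sigma))
    return total / n
```

    The estimate is zero when the two distributions coincide and positive otherwise, consistent with relative entropy; the closed form in the paper avoids this sampling cost entirely.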

  16. Experimental Design and Data Analysis in Receiver Operating Characteristic Studies: Lessons Learned from Reports in Radiology from 1997 to 20061

    PubMed Central

    Shiraishi, Junji; Pesce, Lorenzo L.; Metz, Charles E.; Doi, Kunio

    2009-01-01

    Purpose: To provide a broad perspective concerning the recent use of receiver operating characteristic (ROC) analysis in medical imaging by reviewing ROC studies published in Radiology between 1997 and 2006 for experimental design, imaging modality, medical condition, and ROC paradigm. Materials and Methods: Two hundred ninety-five studies were obtained by conducting a literature search with PubMed with two criteria: publication in Radiology between 1997 and 2006 and occurrence of the phrase “receiver operating characteristic.” Studies returned by the query that were not diagnostic imaging procedure performance evaluations were excluded. Characteristics of the remaining studies were tabulated. Results: Two hundred thirty-three (79.0%) of the 295 studies reported findings based on observers' diagnostic judgments or objective measurements. Forty-three (14.6%) did not include human observers, with most of these reporting an evaluation of a computer-aided diagnosis system or functional data obtained with computed tomography (CT) or magnetic resonance (MR) imaging. The remaining 19 (6.4%) studies were classified as reviews or meta-analyses and were excluded from our subsequent analysis. Among the various imaging modalities, MR imaging (46.0%) and CT (25.7%) were investigated most frequently. Approximately 60% (144 of 233) of ROC studies with human observers published in Radiology included three or fewer observers. Conclusion: ROC analysis is widely used in radiologic research, confirming its fundamental role in assessing diagnostic performance. However, the ROC studies reported in Radiology were not always adequate to support clear and clinically relevant conclusions. © RSNA, 2009 Supplemental material: http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.2533081632/-/DC1 PMID:19864510

  17. Automated target recognition using passive radar and coordinated flight models

    NASA Astrophysics Data System (ADS)

    Ehrman, Lisa M.; Lanterman, Aaron D.

    2003-09-01

    Rather than emitting pulses, passive radar systems rely on illuminators of opportunity, such as TV and FM radio, to illuminate potential targets. These systems are particularly attractive since they allow receivers to operate without emitting energy, rendering them covert. Many existing passive radar systems estimate the locations and velocities of targets. This paper focuses on adding an automatic target recognition (ATR) component to such systems. Our approach to ATR compares the Radar Cross Section (RCS) of targets detected by a passive radar system to the simulated RCS of known targets. To make the comparison as accurate as possible, the received signal model accounts for aircraft position and orientation, propagation losses, and antenna gain patterns. The estimated positions become inputs for an algorithm that uses a coordinated flight model to compute probable aircraft orientation angles. The Fast Illinois Solver Code (FISC) simulates the RCS of several potential target classes as they execute the estimated maneuvers. The RCS is then scaled by the Advanced Refractive Effects Prediction System (AREPS) code to account for propagation losses that occur as functions of altitude and range. The Numerical Electromagnetic Code (NEC2) computes the antenna gain pattern, so that the RCS can be further scaled. The Rician model compares the RCS of the illuminated aircraft with those of the potential targets. This comparison results in target identification.

  18. Computer-Mediated Social Support for Physical Activity: A Content Analysis.

    PubMed

    Stragier, Jeroen; Mechant, Peter; De Marez, Lieven; Cardon, Greet

    2018-02-01

    Online fitness communities are a recent phenomenon experiencing growing user bases. They can be considered as online social networks in which recording, monitoring, and sharing of physical activity (PA) are the most prevalent practices. They have added a new dimension to the social experience of PA in which online peers function as virtual PA partners or supporters. However, research into seeking and receiving computer-mediated social support for PA is scarce. Our aim was to study to what extent using online fitness communities and sharing physical activities with online social networks results in receiving various types of online social support. Two databases, one containing physical activities logged with Strava and one containing physical activities logged with RunKeeper and shared on Twitter, were investigated for occurrence and type of social support, by means of a deductive content analysis. Results indicate that social support delivered through Twitter is not particularly extensive. On Strava, social support is significantly more prevalent. Especially esteem support, expressed as compliments for the accomplishment of an activity, is provided on both Strava and Twitter. The results demonstrate that social media have potential as a platform used for providing social support for PA, but differences among various social network sites can be substantial. Especially esteem support can be expected, in contrast to online health communities, where information support is more common.

  19. Evaluation of Coronary Artery Stenosis by Quantitative Flow Ratio During Invasive Coronary Angiography: The WIFI II Study (Wire-Free Functional Imaging II).

    PubMed

    Westra, Jelmer; Tu, Shengxian; Winther, Simon; Nissen, Louise; Vestergaard, Mai-Britt; Andersen, Birgitte Krogsgaard; Holck, Emil Nielsen; Fox Maule, Camilla; Johansen, Jane Kirk; Andreasen, Lene Nyhus; Simonsen, Jo Krogsgaard; Zhang, Yimin; Kristensen, Steen Dalby; Maeng, Michael; Kaltoft, Anne; Terkelsen, Christian Juhl; Krusell, Lars Romer; Jakobsen, Lars; Reiber, Johan H C; Lassen, Jens Flensted; Bøttcher, Morten; Bøtker, Hans Erik; Christiansen, Evald Høj; Holm, Niels Ramsing

    2018-03-01

    Quantitative flow ratio (QFR) is a novel diagnostic modality for functional testing of coronary artery stenosis without the use of pressure wires and induction of hyperemia. QFR is based on computation of standard invasive coronary angiographic imaging. The purpose of WIFI II (Wire-Free Functional Imaging II) was to evaluate the feasibility and diagnostic performance of QFR in unselected consecutive patients. WIFI II was a predefined substudy to the Dan-NICAD study (Danish Study of Non-Invasive Diagnostic Testing in Coronary Artery Disease), referring 362 consecutive patients with suspected coronary artery disease on coronary computed tomographic angiography for diagnostic invasive coronary angiography. Fractional flow reserve (FFR) was measured in all segments with 30% to 90% diameter stenosis. Blinded observers calculated QFR (Medis Medical Imaging bv, The Netherlands) for comparison with FFR. FFR was measured in 292 lesions from 191 patients. Ten (5%) and 9 patients (5%) were excluded because of FFR and angiographic core laboratory criteria, respectively. QFR was successfully computed in 240 out of 255 lesions (94%) with a mean diameter stenosis of 50±12%. Mean difference between FFR and QFR was 0.01±0.08. QFR correctly classified 83% of the lesions using FFR with cutoff at 0.80 as reference standard. The area under the receiver operating characteristic curve was 0.86 (95% confidence interval, 0.81-0.91) with a sensitivity, specificity, negative predictive value, and positive predictive value of 77%, 86%, 75%, and 87%, respectively. A QFR-FFR hybrid approach based on the present results enables wire-free and adenosine-free procedures in 68% of cases. Functional lesion evaluation by QFR assessment showed good agreement and diagnostic accuracy compared with FFR. Studies comparing clinical outcome after QFR- and FFR-based diagnostic strategies are required. URL: https://www.clinicaltrials.gov. Unique identifier: NCT02264717. © 2018 The Authors.
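
    The sensitivity, specificity, NPV, and PPV quoted above follow from a standard 2x2 contingency table against the FFR cutoff of 0.80. A minimal sketch of those metrics (the counts used in the test are hypothetical, chosen only to reproduce round percentages, and are not the study's actual cell counts):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 diagnostic-accuracy metrics from confusion counts:
    tp/fp/fn/tn relative to the reference standard (here, FFR <= 0.80)."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }
```

    Note that PPV and NPV, unlike sensitivity and specificity, depend on the prevalence of positive lesions in the cohort, which is why they are reported separately.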

  20. Adaptive critic autopilot design of bank-to-turn missiles using fuzzy basis function networks.

    PubMed

    Lin, Chuan-Kai

    2005-04-01

    A new adaptive critic autopilot design for bank-to-turn missiles is presented. The architecture of the adaptive critic learning scheme contains a fuzzy-basis-function-network-based associative search element (ASE), which approximates the nonlinear and complex dynamics of bank-to-turn missiles, and an adaptive critic element (ACE) that generates the reinforcement signal used to tune the ASE. In the adaptive critic autopilot, the control law combines signals from a fixed-gain controller, the ASE, and an adaptive robust element that eliminates approximation errors and disturbances. Traditional adaptive critic reinforcement learning requires an agent to learn behavior through trial-and-error interactions with a dynamic environment; the proposed tuning algorithm, however, can significantly shorten the learning time by tuning all parameters of the fuzzy basis functions and the weights of the ASE and ACE online. Moreover, the weight-updating law, derived from Lyapunov stability theory, guarantees both tracking performance and stability. Computer simulation results confirm the effectiveness of the proposed adaptive critic autopilot.
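    The ASE/ACE interplay described above can be sketched with a generic TD-style actor-critic step over normalized Gaussian fuzzy basis functions. This is a simplified illustration under assumed forms; the paper's actual Lyapunov-derived update law, basis parameters, and reward are not reproduced here:

```python
import numpy as np

def fbf(x, centers, widths):
    """Normalized Gaussian fuzzy basis functions (one common FBF network form)."""
    act = np.exp(-((x - centers) ** 2) / (2 * widths ** 2))
    return act / act.sum()

# Hypothetical 1-D setup: 5 basis functions; ASE (actor) and ACE (critic) weights.
centers = np.linspace(-1.0, 1.0, 5)
widths = np.full(5, 0.4)
w_ase = np.zeros(5)
w_ace = np.zeros(5)
gamma, lr_actor, lr_critic = 0.95, 0.05, 0.1

# One online tuning step for a hypothetical transition (x -> x_next, reward r):
x, x_next, r = 0.3, 0.1, -0.1                  # reward penalizes tracking error
phi, phi_next = fbf(x, centers, widths), fbf(x_next, centers, widths)
td_error = r + gamma * (w_ace @ phi_next) - w_ace @ phi   # reinforcement signal
w_ace += lr_critic * td_error * phi            # critic (ACE) update
w_ase += lr_actor * td_error * phi             # actor (ASE) tuned by the signal
control = w_ase @ phi                          # ASE contribution to the control law
```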

  1. 75 FR 30915 - Computer Matching Program Between the Department of Veterans Affairs (VA) and the Department of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-02

    ..., and reservists, and eligible dependents who have applied for or who are receiving education benefit... Leader, Education Service, Veterans Benefits Administration, Department of Veterans Affairs, 810 Vermont... receiving, or have received education benefit payments under the Post-9/11 GI Bill. These benefit records...

  2. Integration of Monte-Carlo ray tracing with a stochastic optimisation method: application to the design of solar receiver geometry.

    PubMed

    Asselineau, Charles-Alexis; Zapata, Jose; Pye, John

    2015-06-01

    A stochastic optimisation method adapted to illumination and radiative heat transfer problems involving Monte-Carlo ray-tracing is presented. A solar receiver shape optimisation case study illustrates the advantages of the method and its potential: efficient receivers are identified using a moderate computational cost.
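    The idea of driving a stochastic optimiser with a noisy Monte-Carlo ray-tracing objective can be sketched as follows. The receiver model, cost trade-off, and (1+1)-style search below are hypothetical stand-ins, not the paper's method:

```python
import math
import random

def mc_intercept(radius, n_rays=2000, rng=random):
    """Toy Monte-Carlo estimate of the fraction of rays hitting a circular
    receiver of the given radius, with Gaussian aiming error (illustrative
    only -- not the paper's receiver or optics model)."""
    hits = sum(1 for _ in range(n_rays)
               if math.hypot(rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)) <= radius)
    return hits / n_rays

def cost(radius):
    # Hypothetical trade-off: bigger receivers intercept more flux
    # but incur larger thermal losses (modelled as a linear penalty).
    return -mc_intercept(radius) + 0.15 * radius

# Simple stochastic (1+1) search that tolerates the noisy MC objective:
random.seed(0)
best_r, best_c = 1.0, cost(1.0)
for _ in range(200):
    cand = max(0.1, best_r + random.gauss(0.0, 0.2))
    c = cost(cand)
    if c < best_c:
        best_r, best_c = cand, c
```

    Because each objective evaluation is itself a Monte-Carlo estimate, comparison-based stochastic searches of this kind tolerate the sampling noise better than gradient-based methods.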

  3. A user's manual for DELSOL3: A computer code for calculating the optical performance and optimal system design for solar thermal central receiver plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kistler, B.L.

    DELSOL3 is a revised and updated version of the DELSOL2 computer program (SAND81-8237) for calculating collector field performance and layout and optimal system design for solar thermal central receiver plants. The code consists of a detailed model of the optical performance, a simpler model of the non-optical performance, an algorithm for field layout, and a searching algorithm to find the best system design based on energy cost. The latter two features are coupled to a cost model of central receiver components and an economic model for calculating energy costs. The code can handle flat, focused and/or canted heliostats, and external cylindrical, multi-aperture cavity, and flat plate receivers. The program optimizes the tower height, receiver size, field layout, heliostat spacings, and tower position at user-specified power levels subject to flux limits on the receiver and land constraints for field layout. DELSOL3 maintains the advantages of speed and accuracy which are characteristics of DELSOL2.
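    Ranking candidate designs "based on energy cost" typically means comparing a levelized cost per unit of delivered energy. A generic sketch of that ranking step; the cost formula, fixed charge rate, and the candidate numbers are hypothetical, not DELSOL3's internal models:

```python
def levelized_energy_cost(capital_cost, annual_om, annual_energy_mwh,
                          fixed_charge_rate=0.08):
    """Levelized energy cost: annualized capital plus O&M per MWh delivered.
    A generic formulation, not DELSOL3's cost or economic model."""
    return (capital_cost * fixed_charge_rate + annual_om) / annual_energy_mwh

# Hypothetical candidates (tower height m, receiver size m) -> $/MWh; pick the cheapest:
candidates = {
    (150, 10): levelized_energy_cost(90e6, 2.0e6, 180_000),
    (180, 12): levelized_energy_cost(105e6, 2.2e6, 210_000),
    (210, 14): levelized_energy_cost(125e6, 2.4e6, 225_000),
}
best_design = min(candidates, key=candidates.get)
```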

  4. Receiver-Assisted Congestion Control to Achieve High Throughput in Lossy Wireless Networks

    NASA Astrophysics Data System (ADS)

    Shi, Kai; Shu, Yantai; Yang, Oliver; Luo, Jiarong

    2010-04-01

    Many applications nowadays require fast data transfer in high-speed wireless networks. However, due to its conservative congestion control algorithm, Transmission Control Protocol (TCP) cannot effectively utilize the network capacity in lossy wireless networks. In this paper, we propose a receiver-assisted congestion control mechanism (RACC) in which the sender performs loss-based control while the receiver performs delay-based control. The receiver measures the network bandwidth based on the packet interarrival interval and uses it to compute a congestion window size deemed appropriate for the sender. After receiving this advertised value as feedback from the receiver, the sender uses the additive increase and multiplicative decrease (AIMD) mechanism to compute the congestion window size actually used. By integrating loss-based and delay-based congestion control, our mechanism can mitigate the effect of wireless losses, alleviate the timeout effect, and therefore make better use of network bandwidth. Simulation and experiment results in various scenarios show that our mechanism outperforms conventional TCP in high-speed, lossy wireless environments.
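    The division of labor described above can be sketched in two small functions: the receiver turns its interarrival-based bandwidth estimate into an advertised window, and the sender runs AIMD capped by that advertisement. Function names, the bandwidth-delay-product conversion, and the AIMD constants are assumptions for illustration, not RACC's exact rules:

```python
def receiver_advertised_window(pkt_size_bytes, interarrival_s, rtt_s):
    """Receiver side: estimate bandwidth from the packet interarrival interval,
    then convert it to a window (bandwidth-delay product) to advertise."""
    bandwidth = pkt_size_bytes / interarrival_s              # bytes per second
    return max(1, int(bandwidth * rtt_s / pkt_size_bytes))   # window in packets

def sender_aimd(cwnd, advertised, loss, alpha=1, beta=0.5):
    """Sender side: additive increase, multiplicative decrease on loss,
    with the result capped by the receiver's advertised window."""
    cwnd = cwnd * beta if loss else cwnd + alpha
    return min(cwnd, advertised)
```

    For example, 1500-byte packets arriving 1 ms apart over a 50 ms RTT yield a 50-packet advertised window, so the sender's additive growth stops at 50 even without losses.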

  5. Skilled Metro Workers Get Highest Payoffs for Using a Computer at Work.

    ERIC Educational Resources Information Center

    Kusmin, Lorin D.

    2000-01-01

    Workers who use computers on the job receive higher wages, reflecting both computer-specific and broader skills. This accounts for a small portion of the metro-nonmetro wage gap. The payoff for using a computer on the job is higher for college graduates and more-experienced workers than their counterparts and is higher for rural than urban…

  6. A Pilot Study of the Use of Emerging Computer Technologies to Improve the Effectiveness of Reading and Writing Therapies in Children with Down Syndrome

    ERIC Educational Resources Information Center

    Felix, Vanessa G.; Mena, Luis J.; Ostos, Rodolfo; Maestre, Gladys E.

    2017-01-01

    Despite the potential benefits that computer approaches could provide for children with cognitive disabilities, research and implementation of emerging approaches to learning supported by computing technology has not received adequate attention. We conducted a pilot study to assess the effectiveness of a computer-assisted learning tool, named…

  7. 20 CFR 404.250 - Special computation rules for people who had a period of disability.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Special computation rules for people who had... Computation Rules for People Who Had A Period of Disability § 404.250 Special computation rules for people who had a period of disability. If you were disabled at some time in your life, received disability...

  8. SU-F-J-91: Sparing Lung Function in Treatment Planning Using Dual Energy Tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lapointe, A; Bahig, H; Zerouali, K

    2016-06-15

    Purpose: To propose an alternate treatment plan that minimizes the dose to functional lung tissues. In clinical practice, the evaluation of lung functionality is typically derived from perfusion scintigraphy. However, this technique has spatial and temporal resolutions generally inferior to those of a CT scan. Alternatively, it is possible to evaluate pulmonary function by analysing the iodine concentration determined via a contrast-enhanced dual-energy CT (DECT) scan. Methods: Five lung cancer patients underwent scintigraphy and a contrast-enhanced DECT scan (SOMATOM Definition Flash, Siemens). The iodine concentration was evaluated using the two-material decomposition method to produce a functional map of the lung. The approach was validated by comparing the differential function computed by DECT and by scintigraphy. The functional map is then used to redefine the V5 (volume of the organ that received more than 5 Gy during a radiotherapy treatment) as a novel functional parameter, the V5f. The V5f, which weights volume by its function level, can assist in evaluating optimal beam entry points for a specific treatment plan. Results: The results show that the differential functions obtained by scintigraphy and DECT are in good agreement, with a mean difference of 6%. In specific cases, we are able to visually correlate low iodine concentration with abnormal pulmonary regions or cancerous tumors. The comparison between V5f and V5 has shown that some entry points can be better exploited and that new ones become accessible, 2.34 times more on average, without increasing the V5f, thus allowing easier optimization of other planning objectives. Conclusion: In addition to the high-resolution DECT images, the iodine map provides local information used to detect potential functional heterogeneities in 3D space. We propose that this information be used to calculate new functional dose parameters such as the V5f. The presenting author, Andreanne Lapointe, received a Canadian scholarship from MITACS. Part of the funding is from the company Siemens.
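    The contrast between V5 and the proposed V5f reduces to a simple weighting: V5 counts voxels above the dose threshold, while V5f weights each such voxel by its function level (e.g. the DECT-derived iodine concentration). A minimal sketch with hypothetical voxel values:

```python
def v5(doses, threshold=5.0):
    """Classic V5: fraction of voxels receiving more than `threshold` Gy."""
    return sum(1 for d in doses if d > threshold) / len(doses)

def v5f(doses, function, threshold=5.0):
    """Functional V5f: same threshold, but each voxel is weighted by its
    per-voxel function level (e.g. DECT iodine concentration)."""
    total = sum(function)
    return sum(f for d, f in zip(doses, function) if d > threshold) / total

# Hypothetical 4-voxel example: high-dose voxels happen to be poorly functional,
# so V5f is much lower than V5 -- the plan spares functional tissue.
doses = [10.0, 2.0, 6.0, 1.0]       # Gy
function = [0.1, 0.9, 0.2, 0.8]     # relative function level per voxel
```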

  9. 12 CFR 1235.2 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ..., or stored by electronic means. E-mail means a document created or received on a computer network for... conduct of the business of a regulated entity or the Office of Finance (which business, in the case of the... is stored or located, including network servers, desktop or laptop computers and handheld computers...

  10. 12 CFR 1235.2 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., or stored by electronic means. E-mail means a document created or received on a computer network for... conduct of the business of a regulated entity or the Office of Finance (which business, in the case of the... is stored or located, including network servers, desktop or laptop computers and handheld computers...

  11. 78 FR 48169 - Privacy Act of 1974; CMS Computer Match No. 2013-02; HHS Computer Match No. 1306; DoD-DMDC Match...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-07

    ...), Defense Manpower Data Center (DMDC) and the Office of the Assistant Secretary of Defense (Health Affairs.../TRICARE. DMDC will receive the results of the computer match and provide the information to TMA for use in...

  12. 26 CFR 301.6222(b)-3 - Partner receiving incorrect schedule.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Revenue Service office issuing the notice of computational adjustment within 30 days after the notice is... the partnership and of the notice of computational adjustment. The partner need not enclose a copy of the notice of computational adjustment, however, if the partner clearly identifies the notice of...

  13. 26 CFR 301.6222(b)-3 - Partner receiving incorrect schedule.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Revenue Service office issuing the notice of computational adjustment within 30 days after the notice is... the partnership and of the notice of computational adjustment. The partner need not enclose a copy of the notice of computational adjustment, however, if the partner clearly identifies the notice of...

  14. 26 CFR 301.6222(b)-3 - Partner receiving incorrect schedule.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Revenue Service office issuing the notice of computational adjustment within 30 days after the notice is... the partnership and of the notice of computational adjustment. The partner need not enclose a copy of the notice of computational adjustment, however, if the partner clearly identifies the notice of...

  15. 26 CFR 301.6222(b)-3 - Partner receiving incorrect schedule.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Revenue Service office issuing the notice of computational adjustment within 30 days after the notice is... the partnership and of the notice of computational adjustment. The partner need not enclose a copy of the notice of computational adjustment, however, if the partner clearly identifies the notice of...

  16. 26 CFR 301.6222(b)-3 - Partner receiving incorrect schedule.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Revenue Service office issuing the notice of computational adjustment within 30 days after the notice is... the partnership and of the notice of computational adjustment. The partner need not enclose a copy of the notice of computational adjustment, however, if the partner clearly identifies the notice of...

  17. 40 CFR 721.91 - Computation of estimated surface water concentrations: Instructions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... shall be computed for each site using the stream flow rate appropriate for the site according to... computing the equation, the number of kilograms released, and receiving stream flow. (a) Number of kilograms... chemical changes and/or changes in location, temperature, pressure, physical state, or similar...

  18. 40 CFR 721.91 - Computation of estimated surface water concentrations: Instructions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... shall be computed for each site using the stream flow rate appropriate for the site according to... computing the equation, the number of kilograms released, and receiving stream flow. (a) Number of kilograms... diagram which describes each manufacturing, processing, or use operation involving the substance. The...

  19. 40 CFR 721.91 - Computation of estimated surface water concentrations: Instructions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... shall be computed for each site using the stream flow rate appropriate for the site according to... computing the equation, the number of kilograms released, and receiving stream flow. (a) Number of kilograms... diagram which describes each manufacturing, processing, or use operation involving the substance. The...

  20. Optimized microsystems-enabled photovoltaics

    DOEpatents

    Cruz-Campa, Jose Luis; Nielson, Gregory N.; Young, Ralph W.; Resnick, Paul J.; Okandan, Murat; Gupta, Vipin P.

    2015-09-22

    Technologies pertaining to designing microsystems-enabled photovoltaic (MEPV) cells are described herein. A first restriction for a first parameter of an MEPV cell is received. Subsequently, a selection of a second parameter of the MEPV cell is received. Values for a plurality of parameters of the MEPV cell are computed such that the MEPV cell is optimized with respect to the second parameter, wherein the values for the plurality of parameters are computed based at least in part upon the restriction for the first parameter.
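    The workflow the patent describes (receive a restriction on one parameter, then compute the remaining parameters to optimize a selected figure of merit) can be sketched as a constrained grid search. The parameter names, grids, and objective below are hypothetical placeholders, not the patent's actual design variables:

```python
import itertools

def optimize_mepv(objective, grids, restriction):
    """Grid-search sketch: enumerate parameter combinations, enforce the
    received restriction, and keep the design maximizing the objective."""
    best = None
    for combo in itertools.product(*grids.values()):
        params = dict(zip(grids.keys(), combo))
        if not restriction(params):
            continue                       # enforce the first-parameter restriction
        score = objective(params)
        if best is None or score > best[0]:
            best = (score, params)
    return best

# Hypothetical: maximize an efficiency proxy subject to cell thickness <= 20 um.
grids = {"thickness_um": [5, 10, 20, 40], "contact_pitch_um": [50, 100, 200]}
restriction = lambda p: p["thickness_um"] <= 20
objective = lambda p: p["thickness_um"] * 0.01 - p["contact_pitch_um"] * 1e-4
score, design = optimize_mepv(objective, grids, restriction)
```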
