Sample records for timing analysis performed

  1. Apollo 15 time and motion study

    NASA Technical Reports Server (NTRS)

    Kubis, J. F.; Elrod, J. T.; Rusnak, R.; Barnes, J. E.

    1972-01-01

    A time and motion study of Apollo 15 lunar surface activity led to examination of four distinct areas of crewmen activity. These areas are: an analysis of lunar mobility, a comparative analysis of tasks performed in 1-g training and lunar EVA, an analysis of the metabolic cost of two activities that are performed in several EVAs, and a fall/near-fall analysis. An analysis of mobility showed that the crewmen used three basic mobility patterns (modified walk, hop, side step) while on the lunar surface. These mobility patterns were utilized as adaptive modes to compensate for the uneven terrain and varied soil conditions that the crewmen encountered. A comparison of the time required to perform tasks at the final 1-g lunar EVA training sessions and the time required to perform the same task on the lunar surface indicates that, in almost all cases, it took significantly more time (on the order of 40%) to perform tasks on the moon. This increased time was observed even after extraneous factors (e.g., hardware difficulties) were factored out.

  2. Dispersion analysis for baseline reference mission 2

    NASA Technical Reports Server (NTRS)

    Snow, L. S.

    1975-01-01

    A dispersion analysis considering uncertainties (or perturbations) in platform, vehicle, and environmental parameters was performed for baseline reference mission (BRM) 2. The dispersion analysis is based on the nominal trajectory for BRM 2. The analysis was performed to determine state vector and performance dispersions (or variations) which result from the indicated uncertainties. The dispersions are determined at major mission events and fixed times from liftoff (time slices). The dispersion results will be used to evaluate the capability of the vehicle to perform the mission within a specified level of confidence and to determine flight performance reserves.
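A dispersion analysis of this kind can be sketched as a Monte Carlo propagation. The toy 1-D dynamics, parameter values, and uncertainty magnitudes below are hypothetical stand-ins, not the BRM 2 trajectory model: sample the uncertain parameters, propagate each sample, and take the spread of the state at fixed time slices.

```python
import numpy as np

rng = np.random.default_rng(0)

def propagate(v0, a, t):
    # toy 1-D kinematics: position at time t for initial speed v0, accel a
    return v0 * t + 0.5 * a * t ** 2

t_slices = np.array([10.0, 60.0, 120.0])        # seconds after liftoff
v0 = 100.0 + rng.normal(0, 2.0, size=5000)      # assumed +/- 2 m/s uncertainty
a = 9.0 + rng.normal(0, 0.1, size=5000)         # assumed +/- 0.1 m/s^2 uncertainty

# propagate every sample to every time slice, then take the 1-sigma spread
positions = propagate(v0[:, None], a[:, None], t_slices[None, :])
dispersion = positions.std(axis=0)              # state dispersion at each slice
print(dispersion.round(1))
```

The dispersions grow with time from liftoff, which is why the analysis reports them both at major mission events and at fixed time slices.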

  3. A performance analysis method for distributed real-time robotic systems: A case study of remote teleoperation

    NASA Technical Reports Server (NTRS)

    Lefebvre, D. R.; Sanderson, A. C.

    1994-01-01

    Robot coordination and control systems for remote teleoperation applications are by necessity implemented on distributed computers. Modeling and performance analysis of these distributed robotic systems is difficult but important for economic system design. Performance analysis methods originally developed for conventional distributed computer systems are often unsatisfactory for evaluating real-time systems. The paper introduces a formal model of distributed robotic control systems and a performance analysis method, based on scheduling theory, that can handle concurrent hard-real-time response specifications. Use of the method is illustrated by a remote teleoperation case study that assesses the effect of communication delays and the allocation of robot control functions on control system hardware requirements.
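As a concrete taste of the scheduling-theory style of analysis the abstract refers to (this is the classic Liu and Layland rate-monotonic utilization bound, not the paper's own method), a set of periodic hard-real-time tasks is guaranteed schedulable under rate-monotonic priorities if the total utilization stays below n(2^(1/n) - 1):

```python
def rm_schedulable(tasks):
    """tasks: list of (compute_time, period) pairs in the same time unit.

    Sufficient (not necessary) test: utilization <= n * (2**(1/n) - 1).
    """
    n = len(tasks)
    utilization = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1 / n) - 1)
    return utilization <= bound

# three hypothetical control loops: (cost, period) in milliseconds
print(rm_schedulable([(1, 10), (2, 20), (3, 40)]))  # utilization 0.275 -> True
```

For three tasks the bound is about 0.78, so the example set (utilization 0.275) passes with room to spare.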

  4. Performance bounds on parallel self-initiating discrete-event simulations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1990-01-01

    The use of massively parallel architectures to execute discrete-event simulations of what are termed self-initiating models is considered. A logical process in a self-initiating model schedules its own state re-evaluation times, independently of any other logical process, and sends its new state to other logical processes following the re-evaluation. The interest here is in the effects of that communication on synchronization. The performance of various synchronization protocols is considered by deriving upper and lower bounds on optimal performance, upper bounds on Time Warp's performance, and lower bounds on the performance of a new conservative protocol. The analysis of Time Warp includes the overhead costs of state-saving and rollback. The analysis points out sufficient conditions for the conservative protocol to outperform Time Warp, and quantifies the sensitivity of performance to message fan-out, lookahead ability, and the probability distributions underlying the simulation.

  5. Paramedir: A Tool for Programmable Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers in order to be transferred to novice users.

  6. The reliability of an instrumented start block analysis system.

    PubMed

    Tor, Elaine; Pease, David L; Ball, Kevin A

    2015-02-01

    The swimming start is highly influential to overall competition performance. Therefore, it is paramount to develop reliable methods for accurate biomechanical analysis of start performance for training and research. The Wetplate Analysis System is a custom-made force plate system developed by the Australian Institute of Sport--Aquatic Testing, Training and Research Unit (AIS ATTRU). This sophisticated system combines force data and 2D digitization to measure a number of kinetic and kinematic parameter values in order to evaluate start performance. Fourteen elite swimmers performed two maximal effort dives (performance was defined as time from start signal to 15 m) over two separate testing sessions. Intraclass correlation coefficients (ICCs) were used to determine each parameter's reliability. The kinetic parameters all had ICCs greater than 0.9 except the time of peak vertical force (0.742). This may have been due to variations in movement initiation after the starting signal between trials. The kinematic and time parameters also had ICCs greater than 0.9, except for the time of maximum depth (0.719). This parameter was lower because the swimmers varied their depth between trials. Based on the high ICC scores for all parameters, the Wetplate Analysis System is suitable for biomechanical analysis of swimming starts.
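The test-retest statistic used here can be sketched from two-way ANOVA mean squares. This is the standard single-measure consistency form ICC(3,1); the abstract does not state which ICC form the study used, and the start times below are hypothetical:

```python
import numpy as np

def icc_3_1(data):
    """ICC(3,1): rows are subjects, columns are repeated sessions."""
    n, k = data.shape
    grand = data.mean()
    ss_rows = k * ((data.mean(axis=1) - grand) ** 2).sum()   # between subjects
    ss_cols = n * ((data.mean(axis=0) - grand) ** 2).sum()   # between sessions
    ss_total = ((data - grand) ** 2).sum()
    ms_rows = ss_rows / (n - 1)
    ms_error = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_error) / (ms_rows + (k - 1) * ms_error)

# hypothetical 15 m times (s) for 5 swimmers over 2 sessions
times = np.array([[6.52, 6.55], [6.80, 6.77], [7.01, 7.05],
                  [6.30, 6.32], [6.95, 6.90]])
print(round(icc_3_1(times), 3))
```

Values above 0.9, as reported for most Wetplate parameters, indicate that between-subject differences dominate session-to-session noise.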

  7. Modeling Canadian Quality Control Test Program for Steroid Hormone Receptors in Breast Cancer: Diagnostic Accuracy Study.

    PubMed

    Pérez, Teresa; Makrestsov, Nikita; Garatt, John; Torlakovic, Emina; Gilks, C Blake; Mallett, Susan

    The Canadian Immunohistochemistry Quality Control program monitors clinical laboratory performance for the estrogen receptor and progesterone receptor tests used in breast cancer treatment management in Canada. Current methods assess sensitivity and specificity at each time point, compared with a reference standard. We investigate alternative performance analysis methods to enhance the quality assessment. We used 3 methods of analysis: meta-analysis of sensitivity and specificity of each laboratory across all time points; sensitivity and specificity at each time point for each laboratory; and fitting models for repeated measurements to examine differences between laboratories adjusted by test and time point. Results show that 88 laboratories participated in quality control at up to 13 time points, typically using 37 to 54 histology samples. In the meta-analysis across all time points, no laboratory had sensitivity or specificity below 80%. Current methods, presenting sensitivity and specificity separately for each run, result in wide 95% confidence intervals, typically spanning 15% to 30%. Models of a single diagnostic outcome demonstrated that 82% to 100% of laboratories had no difference from the reference standard for estrogen receptor and 75% to 100% for progesterone receptor, with the exception of 1 progesterone receptor run. Laboratories with significant differences from the reference standard identified with Generalized Estimating Equation modeling also had reduced performance by meta-analysis across all time points. The Canadian Immunohistochemistry Quality Control program has a good design, and with this modeling approach it has sufficient precision to measure performance at each time point and allow laboratories with significantly lower performance to be targeted for advice.

  8. EEG Correlates of Fluctuation in Cognitive Performance in an Air Traffic Control Task

    DTIC Science & Technology

    2014-11-01

    using non-parametric statistical analysis to identify neurophysiological patterns due to the time-on-task effect. Significant changes in EEG power...EEG, Cognitive Performance, Power Spectral Analysis, Non-Parametric Analysis. Document is available to the public through the Internet...

  9. Performance Analysis of Live-Virtual-Constructive and Distributed Virtual Simulations: Defining Requirements in Terms of Temporal Consistency

    DTIC Science & Technology

    2009-12-01

    events. Work associated with aperiodic tasks has the same statistical behavior and the same timing requirements. The timing deadlines are soft. • Sporadic...answers, but it is possible to calculate how precise the estimates are. Simulation-based performance analysis of a model includes a statistical ...to evaluate all possible states in a timely manner. This is the principal reason for resorting to simulation and statistical analysis to evaluate

  10. Comparative analysis of techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Hitt, E. F.; Bridgman, M. S.; Robinson, A. C.

    1981-01-01

    Performability analysis is a technique developed for evaluating the effectiveness of fault-tolerant computing systems in multiphase missions. Performability was evaluated for its accuracy, practical usefulness, and relative cost. The evaluation was performed by applying performability and the fault tree method to a set of sample problems ranging from simple to moderately complex. The problems involved as many as five outcomes, two to five mission phases, permanent faults, and some functional dependencies. Transient faults and software errors were not considered. A different analyst was responsible for each technique. Significantly more time and effort were required to learn performability analysis than the fault tree method. Performability is inherently as accurate as fault tree analysis. For the sample problems, fault trees were more practical and less time consuming to apply, while performability required less ingenuity and was more checkable. Performability offers some advantages for evaluating very complex problems.

  11. SPA- STATISTICAL PACKAGE FOR TIME AND FREQUENCY DOMAIN ANALYSIS

    NASA Technical Reports Server (NTRS)

    Brownlow, J. D.

    1994-01-01

    The need for statistical analysis often arises when data is in the form of a time series. This type of data is usually a collection of numerical observations made at specified time intervals. Two kinds of analysis may be performed on the data. First, the time series may be treated as a set of independent observations using a time domain analysis to derive the usual statistical properties including the mean, variance, and distribution form. Secondly, the order and time intervals of the observations may be used in a frequency domain analysis to examine the time series for periodicities. In almost all practical applications, the collected data is actually a mixture of the desired signal and a noise signal which is collected over a finite time period with a finite precision. Therefore, any statistical calculations and analyses are actually estimates. The Spectrum Analysis (SPA) program was developed to perform a wide range of statistical estimation functions. SPA can provide the data analyst with a rigorous tool for performing time and frequency domain studies. In a time domain statistical analysis the SPA program will compute the mean, variance, standard deviation, mean square, and root mean square. It also lists the data maximum, data minimum, and the number of observations included in the sample. In addition, a histogram of the time domain data is generated, a normal curve is fit to the histogram, and a goodness-of-fit test is performed. These time domain calculations may be performed on both raw and filtered data. For a frequency domain statistical analysis the SPA program computes the power spectrum, cross spectrum, coherence, phase angle, amplitude ratio, and transfer function. The estimates of the frequency domain parameters may be smoothed with the use of Hann-Tukey, Hamming, Bartlett, or moving average windows. Various digital filters are available to isolate data frequency components. 
Frequency components with periods longer than the data collection interval are removed by least-squares detrending. As many as ten channels of data may be analyzed at one time. Both tabular and plotted output may be generated by the SPA program. This program is written in FORTRAN IV and has been implemented on a CDC 6000 series computer with a central memory requirement of approximately 142K (octal) of 60 bit words. This core requirement can be reduced by segmentation of the program. The SPA program was developed in 1978.
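SPA's two basic modes, time-domain summary statistics and a frequency-domain search for periodicities, can be re-expressed in a few lines of modern numpy. This is an assumed sketch of the same calculations, not the original FORTRAN IV code; the 5 Hz test signal is invented:

```python
import numpy as np

fs = 100.0                                   # samples per second
t = np.arange(0, 10, 1 / fs)
x = np.sin(2 * np.pi * 5 * t) + 0.1 * np.random.default_rng(1).normal(size=t.size)

# time-domain analysis: treat the series as independent observations
stats = {"mean": x.mean(), "variance": x.var(),
         "rms": np.sqrt((x ** 2).mean()),
         "min": x.min(), "max": x.max(), "n": x.size}

# frequency-domain analysis: periodogram estimate of the power spectrum
X = np.fft.rfft(x - x.mean())
power = (np.abs(X) ** 2) / x.size
freqs = np.fft.rfftfreq(x.size, 1 / fs)
peak_hz = freqs[power.argmax()]
print(round(peak_hz, 1))                     # dominant periodicity, near 5 Hz
```

As the abstract notes, both outputs are estimates: the signal is noisy, finite in length, and sampled with finite precision, which is exactly why SPA pairs the raw periodogram with smoothing windows.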

  12. PERTS: A Prototyping Environment for Real-Time Systems

    NASA Technical Reports Server (NTRS)

    Liu, Jane W. S.; Lin, Kwei-Jay; Liu, C. L.

    1993-01-01

    PERTS is a prototyping environment for real-time systems. It is being built incrementally and will contain basic building blocks of operating systems for time-critical applications; tools and performance models for the analysis, evaluation and measurement of real-time systems; and a simulation/emulation environment. It is designed to support the use and evaluation of new design approaches, experimentation with alternative system building blocks, and the analysis and performance profiling of prototype real-time systems.

  13. Model Performance Evaluation and Scenario Analysis ...

    EPA Pesticide Factsheets

    This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit measures that capture magnitude-only, sequence-only, and combined magnitude and sequence errors. The performance measures include error analysis, the coefficient of determination, Nash-Sutcliffe efficiency, and a new weighted rank method. These performance metrics only provide useful information about overall model performance. Note that MPESA is based on the separation of observed and simulated time series into magnitude and sequence components. The separation of time series into magnitude and sequence components and the reconstruction back to time series provides diagnostic insights to modelers. For example, traditional approaches lack the capability to identify whether the source of uncertainty in the simulated data is the quality of the input data or the way the analyst adjusted the model parameters. This report presents a suite of model diagnostics that identify whether mismatches between observed and simulated data result from magnitude- or sequence-related errors. MPESA offers graphical and statistical options that allow HSPF users to compare observed and simulated time series and identify the parameter values to adjust or the input data to modify. The scenario analysis part of the tool…

  14. Validating the performance of one-time decomposition for fMRI analysis using ICA with automatic target generation process.

    PubMed

    Yao, Shengnan; Zeng, Weiming; Wang, Nizhuan; Chen, Lei

    2013-07-01

    Independent component analysis (ICA) has been proven effective for functional magnetic resonance imaging (fMRI) data analysis. However, ICA decomposition requires iterative optimization of the unmixing matrix, whose initial values are generated randomly. The randomness of the initialization thus leads to different ICA decomposition results, so a single one-time decomposition for fMRI data analysis is not usually reliable. Under this circumstance, several methods for repeated decompositions with ICA (RDICA) were proposed to reveal the stability of ICA decomposition. Although utilizing RDICA has achieved satisfying results in validating the performance of ICA decomposition, RDICA costs much computing time. To mitigate the problem, in this paper we propose a method, named ATGP-ICA, for fMRI data analysis. This method generates fixed initial values with an automatic target generation process (ATGP) instead of producing them randomly. We performed experimental tests on both hybrid data and fMRI data to show the effectiveness of the new method, and made a performance comparison of the traditional one-time decomposition with ICA (ODICA), RDICA and ATGP-ICA. The proposed method not only eliminates the randomness of ICA decomposition, but also saves much computing time compared to RDICA. Furthermore, ROC (Receiver Operating Characteristic) power analysis also indicated better signal reconstruction performance for ATGP-ICA than for RDICA. Copyright © 2013 Elsevier Inc. All rights reserved.
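The core idea, that a deterministic starting point makes ICA reproducible, can be shown with a minimal numpy FastICA (symmetric decorrelation, tanh nonlinearity). ATGP itself is not implemented here; a fixed identity matrix simply stands in for its deterministic initial unmixing matrix, and the two toy sources are invented:

```python
import numpy as np

def whiten(X):
    # center and whiten so components are uncorrelated with unit variance
    X = X - X.mean(axis=1, keepdims=True)
    cov = X @ X.T / X.shape[1]
    d, E = np.linalg.eigh(cov)
    return (E @ np.diag(d ** -0.5) @ E.T) @ X

def fastica(X, W0, n_iter=100):
    # symmetric FastICA: fixed-point update followed by decorrelation
    W = W0
    for _ in range(n_iter):
        g = np.tanh(W @ X)
        W_new = g @ X.T / X.shape[1] - np.diag((1 - g ** 2).mean(axis=1)) @ W
        U, _, Vt = np.linalg.svd(W_new)   # W <- (W W^T)^(-1/2) W
        W = U @ Vt
    return W

t = np.linspace(0, 8, 2000)
S = np.vstack([np.sin(7 * t), np.sign(np.sin(3 * t))])   # two toy sources
X = whiten(np.array([[1.0, 0.6], [0.4, 1.0]]) @ S)       # mixed, then whitened

W_fixed = np.eye(2)                # deterministic start, as ATGP would supply
run1 = fastica(X, W_fixed)
run2 = fastica(X, W_fixed)
print(np.allclose(run1, run2))     # identical decomposition on every run
```

With a random `W0` the two runs could converge to different sign/order permutations of the components, which is exactly the instability RDICA averages over and ATGP-ICA avoids.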

  15. Instability of a solidifying binary mixture

    NASA Technical Reports Server (NTRS)

    Antar, B. N.

    1982-01-01

    An analysis is performed on the stability of a solidifying binary mixture due to surface tension variation of the free liquid surface. The basic state solution is obtained numerically as a nonstationary function of time. Due to the time dependence of the basic state, the stability analysis is of the global type which utilizes a variational technique. Also due to the fact that the basic state is a complex function of both space and time, the stability analysis is performed through numerical means.

  16. Model Performance Evaluation and Scenario Analysis (MPESA)

    EPA Pesticide Factsheets

    Model Performance Evaluation and Scenario Analysis (MPESA) assesses the performance with which models predict time series data. The tool was developed for the Hydrological Simulation Program-Fortran (HSPF) and the Stormwater Management Model (SWMM).

  17. A novel tool for continuous fracture aftercare - Clinical feasibility and first results of a new telemetric gait analysis insole.

    PubMed

    Braun, Benedikt J; Bushuven, Eva; Hell, Rebecca; Veith, Nils T; Buschbaum, Jan; Holstein, Joerg H; Pohlemann, Tim

    2016-02-01

    Weight bearing after lower extremity fractures remains a highly controversial issue. Even in ankle fractures, the most common lower extremity injury, no standard aftercare protocol has been established. Average non-weight-bearing times range from 0 to 7 weeks, with standardised radiological healing controls at fixed time intervals. Recent literature calls for patient-adapted aftercare protocols based on individual fracture and load scenarios. We show the clinical feasibility and first results of a new, insole-embedded gait analysis tool for continuous monitoring of gait, load and activity. Ten patients were monitored with a new, independent gait analysis insole for up to 3 months postoperatively. Strict 20 kg partial weight bearing was ordered for 6 weeks. Overall activity, load spectrum, ground reaction forces, clinical scoring and general health data were recorded and correlated. Statistical analysis with power analysis, t-test and Spearman correlation was performed. Only one patient completely adhered to the set weight bearing limit. The average time over the limit was 374 min. Based on the parameters load, activity, gait time over 20 kg weight bearing and maximum ground reaction force, high and low performers were defined after 3 weeks. A significant difference in time to painless full weight bearing between high and low performers was shown. Correlation analysis revealed a significant correlation between weight bearing and clinical scoring as well as pain (American Orthopaedic Foot and Ankle Society (AOFAS) Score rs=0.74; Olerud-Molander Score rs=0.93; VAS pain rs=-0.95). Early, continuous gait analysis is able to define aftercare performers with significant differences in time to full painless weight bearing where clinical or radiographic controls could not. Patient compliance with standardised weight bearing limits and protocols is low. Highly individual rehabilitation patterns were seen in all patients. 
Aftercare protocols should be adjusted to real-time patient conditions, rather than fixed intervals and limits. With a real-time measuring device high performers could be identified and influenced towards optimal healing conditions early, while low performers are recognised and missing healing influences could be corrected according to patient condition. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. No Evidence of Reaction Time Slowing in Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Ferraro, F. Richard

    2016-01-01

    A total of 32 studies comprising 238 simple reaction time and choice reaction time conditions were examined in individuals with autism spectrum disorder (n = 964) and controls (n = 1032). A Brinley plot/multiple regression analysis was performed on mean reaction times, regressing autism spectrum disorder performance onto the control performance as…
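A Brinley analysis of this kind reduces to a simple regression: each point is one condition's mean reaction time for the control group (x) and the clinical group (y), and a slope near 1 with an intercept near 0 indicates no generalized slowing. The RT values below are hypothetical, not the meta-analysis data:

```python
import numpy as np

# hypothetical per-condition mean RTs in milliseconds
control = np.array([320.0, 410.0, 505.0, 640.0, 780.0])
asd = np.array([318.0, 415.0, 498.0, 655.0, 772.0])

# Brinley plot regression: clinical RT as a linear function of control RT
slope, intercept = np.polyfit(control, asd, 1)
print(round(slope, 2), round(intercept, 1))
```

A slope reliably greater than 1 would have indicated proportional slowing that grows with task difficulty; the paper's finding of no slowing corresponds to a slope near unity.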

  19. Nuclear reactor descriptions for space power systems analysis

    NASA Technical Reports Server (NTRS)

    Mccauley, E. W.; Brown, N. J.

    1972-01-01

    For the small, high performance reactors required for space electric applications, adequate neutronic analysis is of crucial importance, but in terms of computational time consumed, nuclear calculations probably yield the least amount of detail for mission analysis study. It has been found possible, after generation of only a few designs of a reactor family in elaborate thermomechanical and nuclear detail, to use simple curve-fitting techniques to assure desired neutronic performance while still performing the thermomechanical analysis in explicit detail. The resulting speed-up in computation time permits a broad, detailed examination of constraints by the mission analyst.

  20. Link Performance Analysis and monitoring - A unified approach to divergent requirements

    NASA Astrophysics Data System (ADS)

    Thom, G. A.

    Link Performance Analysis and real-time monitoring are generally covered by a wide range of equipment. Bit Error Rate testers provide digital link performance measurements but are not useful during real-time data flows. Real-time performance monitors utilize the fixed overhead content but vary widely from format to format. Link quality information is also present from signal reconstruction equipment in the form of receiver AGC, bit synchronizer AGC, and bit synchronizer soft decision level outputs, but no general approach to utilizing this information exists. This paper presents an approach to link tests, real-time data quality monitoring, and results presentation that utilizes a set of general purpose modules in a flexible architectural environment. The system operates over a wide range of bit rates (up to 150 Mb/s) and employs several measurement techniques, including P/N code errors or fixed PCM format errors, derived real-time BER from frame sync errors, and Data Quality Analysis derived by counting significant sync status changes. The architecture performs with a minimum of elements in place to permit a phased update of the user's unit in accordance with user needs.

  1. NASA trend analysis procedures

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This publication is primarily intended for use by NASA personnel engaged in managing or implementing trend analysis programs. 'Trend analysis' refers to the observation of current activity in the context of the past in order to infer the expected level of future activity. NASA trend analysis was divided into 5 categories: problem, performance, supportability, programmatic, and reliability. Problem trend analysis uncovers multiple occurrences of historical hardware or software problems or failures in order to focus future corrective action. Performance trend analysis observes changing levels of real-time or historical flight vehicle performance parameters such as temperatures, pressures, and flow rates as compared to specification or 'safe' limits. Supportability trend analysis assesses the adequacy of the spaceflight logistics system; example indicators are repair-turn-around time and parts stockage levels. Programmatic trend analysis uses quantitative indicators to evaluate the 'health' of NASA programs of all types. Finally, reliability trend analysis attempts to evaluate the growth of system reliability based on a decreasing rate of occurrence of hardware problems over time. Procedures for conducting all five types of trend analysis are provided in this publication, prepared through the joint efforts of the NASA Trend Analysis Working Group.

  2. A Study of ATLAS Grid Performance for Distributed Analysis

    NASA Astrophysics Data System (ADS)

    Panitkin, Sergey; Fine, Valery; Wenaus, Torre

    2012-12-01

    In the past two years the ATLAS Collaboration at the LHC has collected a large volume of data and published a number of ground-breaking papers. The Grid-based ATLAS distributed computing infrastructure played a crucial role in enabling timely analysis of the data. We present a study of the performance and usage of the ATLAS Grid as a platform for physics analysis in 2011. This includes studies of general properties as well as timing properties of user jobs (wait time, run time, etc.). These studies are based on mining of data archived by the PanDA workload management system.

  3. Design and Analysis of Scheduling Policies for Real-Time Computer Systems

    DTIC Science & Technology

    1992-01-01

    C. M. Krishna, "The Impact of Workload on the Reliability of Real-Time Processor Triads," to appear in Micro. Rel. [17] J.F. Kurose, "Performance... Processor Triads", to appear in Micro. Rel. * J.F. Kurose, "Performance Analysis of Minimum Laxity Scheduling in Discrete Time Queueing Systems", to...exponentially distributed service times and deadlines. A similar model was developed for the ED policy for a single processor system under identical

  4. Cost analysis of injection laryngoplasty performed under local anaesthesia versus general anaesthesia: an Australian perspective.

    PubMed

    Chandran, D; Woods, C M; Schar, M; Ma, N; Ooi, E H; Athanasiadis, T

    2018-02-01

    To conduct a cost analysis of injection laryngoplasty performed in the operating theatre under local anaesthesia and general anaesthesia. The retrospective study included patients who had undergone injection laryngoplasty as day cases between July 2013 and March 2016. Cost data were obtained, along with patient demographics, anaesthetic details, type of injectant, American Society of Anesthesiologists score, length of stay, total operating theatre time and surgeon procedure time. A total of 20 cases (general anaesthesia = 6, local anaesthesia = 14) were included in the cost analysis. The mean total cost under general anaesthesia (AU$2865.96 ± 756.29) was significantly higher than that under local anaesthesia (AU$1731.61 ± 290.29) (p < 0.001). The mean operating theatre time, surgeon procedure time and length of stay were all significantly lower under local anaesthesia compared to general anaesthesia. Time variables such as operating theatre time and length of stay were the most significant predictors of the total costs. Procedures performed under local anaesthesia in the operating theatre are associated with shorter operating theatre time and length of stay in the hospital, and provide significant cost savings. Further savings could be achieved if local anaesthesia procedures were performed in the office setting.

  5. Supportive Measures: An Analysis of the Trio Program--Student Support Services at East Tennessee State University from 2001-2004

    ERIC Educational Resources Information Center

    Strode, Christopher N.

    2013-01-01

    The purpose of this study was to examine the academic performance of the first-time, full-time, traditional-aged students in the Student Support Services program at East Tennessee State University. This was accomplished by comparing their academic performance with the academic performance of first-time, full-time, traditional-aged non-SSS…

  6. Time-frequency analysis of human motion during rhythmic exercises.

    PubMed

    Omkar, S N; Vyas, Khushi; Vikranth, H N

    2011-01-01

    Biomechanical signals due to human movements during exercise are represented in the time-frequency domain using the Wigner Distribution Function (WDF). Analysis based on the WDF reveals instantaneous spectral and power changes during a rhythmic exercise. Investigations were carried out on 11 healthy subjects who performed 5 cycles of sun salutation, with a body-mounted Inertial Measurement Unit (IMU) as a motion sensor. The variances of Instantaneous Frequency (IF) and Instantaneous Power (IP) for performance analysis of the subjects were estimated using a one-way ANOVA model. Results reveal that joint time-frequency analysis of biomechanical signals during motion facilitates a better understanding of grace and consistency during rhythmic exercise.
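One of the quantities tracked above, instantaneous frequency, can be estimated from the analytic signal. This is a numpy stand-in for illustration; the paper itself uses the Wigner Distribution Function, and the 1.5 Hz "movement" signal is invented:

```python
import numpy as np

def analytic_signal(x):
    # FFT-based analytic signal: zero negative frequencies, double positives
    N = x.size
    h = np.zeros(N)
    h[0] = 1.0
    h[1:(N + 1) // 2] = 2.0
    if N % 2 == 0:
        h[N // 2] = 1.0
    return np.fft.ifft(np.fft.fft(x) * h)

fs = 200.0
t = np.arange(0, 4, 1 / fs)
x = np.cos(2 * np.pi * 1.5 * t)                  # a 1.5 Hz rhythmic movement
phase = np.unwrap(np.angle(analytic_signal(x)))
inst_freq = np.diff(phase) * fs / (2 * np.pi)    # Hz at each sample
print(round(float(np.median(inst_freq)), 2))     # → 1.5
```

For a steady rhythmic exercise the instantaneous frequency stays flat; its variance across a session is exactly the consistency measure the ANOVA model compares across subjects.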

  7. More on Time Series Designs: A Reanalysis of Mayer and Kozlow's Data.

    ERIC Educational Resources Information Center

    Willson, Victor L.

    1982-01-01

    Differentiating between time-series design and time-series analysis, examines design considerations and reanalyzes data previously reported by Mayer and Kozlow in this journal. The current analysis supports the analysis performed by Mayer and Kozlow but puts the results on a somewhat firmer statistical footing. (Author/JN)

  8. High-rate dead-time corrections in a general purpose digital pulse processing system

    PubMed Central

    Abbene, Leonardo; Gerardi, Gaetano

    2015-01-01

    Dead-time losses are well recognized and studied drawbacks in counting and spectroscopic systems. In this work the dead-time correction capabilities of a real-time digital pulse processing (DPP) system for high-rate, high-resolution radiation measurements are presented. The DPP system, through a fast and a slow analysis of the output waveform from radiation detectors, is able to perform multi-parameter analysis (arrival time, pulse width, pulse height, pulse shape, etc.) at high input counting rates (ICRs), allowing accurate counting loss corrections even for variable or transient radiation. The fast analysis is used to obtain both the ICR and energy spectra with high throughput, while the slow analysis is used to obtain high-resolution energy spectra. A complete characterization of the counting capabilities, through both theoretical and experimental approaches, was performed. The dead-time modeling, the throughput curves, the experimental time-interval distributions (TIDs) and the counting uncertainty of the recorded events of both the fast and the slow channels, measured with a planar CdTe (cadmium telluride) detector, are presented. The throughput formula for a series of two types of dead-time is also derived. The results of dead-time corrections, performed through different methods, are reported and discussed, pointing out the error on ICR estimation and the simplicity of the procedure. Accurate ICR estimations (nonlinearity < 0.5%) were performed by using the time widths and the TIDs (with a 10 ns time bin width) of the detected pulses up to 2.2 Mcps. The digital system allows, after a simple parameter setting, different and sophisticated procedures for dead-time correction, traditionally implemented in complex/dedicated systems and time-consuming set-ups. PMID:26289270
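The simplest model behind such corrections is the classic non-paralyzable dead-time formula from the textbook literature (a sketch, not the paper's full DPP procedure): with measured rate m (counts/s) and dead time tau (s), the true input counting rate is n = m / (1 - m*tau). The 50 ns dead time below is an assumed example value:

```python
def true_rate(measured_cps, dead_time_s):
    """Non-paralyzable dead-time correction: n = m / (1 - m * tau)."""
    return measured_cps / (1.0 - measured_cps * dead_time_s)

# 2.0 Mcps measured with an assumed 50 ns dead time per event:
m = 2.0e6
print(round(true_rate(m, 50e-9)))  # → 2222222, i.e. ~10% of events were lost
```

At the multi-Mcps rates quoted in the abstract the correction is far from negligible, which is why the paper validates it against measured time-interval distributions rather than relying on the model alone.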

  9. Analysis of Time-on-Task, Behavior Experiences, and Performance in Two Online Courses with Different Authentic Learning Tasks

    ERIC Educational Resources Information Center

    Park, Sanghoon

    2017-01-01

    This paper reports the findings of a comparative analysis of online learner behavioral interactions, time-on-task, attendance, and performance at different points throughout a semester (beginning, during, and end) based on two online courses: one course offering authentic discussion-based learning activities and the other course offering authentic…

  10. Performance Analysis of Scientific and Engineering Applications Using MPInside and TAU

    NASA Technical Reports Server (NTRS)

    Saini, Subhash; Mehrotra, Piyush; Taylor, Kenichi Jun Haeng; Shende, Sameer Suresh; Biswas, Rupak

    2010-01-01

    In this paper, we present performance analysis of two NASA applications using performance tools like Tuning and Analysis Utilities (TAU) and SGI MPInside. MITgcmUV and OVERFLOW are two production-quality applications used extensively by scientists and engineers at NASA. MITgcmUV is a global ocean simulation model, developed by the Estimating the Circulation and Climate of the Ocean (ECCO) Consortium, for solving the fluid equations of motion using the hydrostatic approximation. OVERFLOW is a general-purpose Navier-Stokes solver for computational fluid dynamics (CFD) problems. Using these tools, we analyze the MPI functions (MPI_Sendrecv, MPI_Bcast, MPI_Reduce, MPI_Allreduce, MPI_Barrier, etc.) with respect to message size of each rank, time consumed by each function, and how ranks communicate. MPI communication is further analyzed by studying the performance of MPI functions used in these two applications as a function of message size and number of cores. Finally, we present the compute time, communication time, and I/O time as a function of the number of cores.

  11. Dispersion analysis for baseline reference mission 1. [flight simulation and trajectory analysis for space shuttle orbiter

    NASA Technical Reports Server (NTRS)

    Kuhn, A. E.

    1975-01-01

    A dispersion analysis considering 3 sigma uncertainties (or perturbations) in platform, vehicle, and environmental parameters was performed for the baseline reference mission (BRM) 1 of the space shuttle orbiter. The dispersion analysis is based on the nominal trajectory for the BRM 1. State vector and performance dispersions (or variations) which result from the indicated 3 sigma uncertainties were studied. The dispersions were determined at major mission events and fixed times from lift-off (time slices) and the results will be used to evaluate the capability of the vehicle to perform the mission within a 3 sigma level of confidence and to determine flight performance reserves. A computer program is given that was used for dynamic flight simulations of the space shuttle orbiter.
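
    Dispersion analyses of this kind typically combine the individual 3-sigma perturbation effects at each time slice into a total dispersion; the sketch below assumes independent error sources combined by root-sum-square, with hypothetical dispersion values rather than BRM 1 data:

```python
import math

# Hypothetical 3-sigma altitude dispersions (m) at one fixed time slice,
# one per perturbed parameter (platform, vehicle, environment).
individual_dispersions = {
    "platform_misalignment": 120.0,
    "thrust_uncertainty": 300.0,
    "density_uncertainty": 180.0,
}

# Assuming independent error sources, combine by root-sum-square (RSS).
total_3sigma = math.sqrt(sum(d ** 2 for d in individual_dispersions.values()))
print(f"total 3-sigma altitude dispersion: {total_3sigma:.1f} m")
```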

  12. Discrete retardance second harmonic generation ellipsometry.

    PubMed

    Dehen, Christopher J; Everly, R Michael; Plocinik, Ryan M; Hedderich, Hartmut G; Simpson, Garth J

    2007-01-01

    A new instrument was constructed to perform discrete retardance nonlinear optical ellipsometry (DR-NOE). The focus of the design was to perform second harmonic generation NOE while maximizing sample and application flexibility and minimizing data acquisition time. The discrete retardance configuration results in relatively simple computational algorithms for performing nonlinear optical ellipsometric analysis. NOE analysis of a disperse red 19 monolayer yielded results that were consistent with previously reported values for the same surface system, but with significantly reduced acquisition times.

  13. Low-Latency Embedded Vision Processor (LLEVS)

    DTIC Science & Technology

    2016-03-01

    3.2.3 Task 3: Projected Performance Analysis of FPGA-based Vision Processor; 3.2.3.1 Algorithms Latency Analysis; ...Programmable Gate Array Custom Hardware for Real-Time Multiresolution Analysis. ...conduct data analysis for performance projections. The data acquired through measurements, simulation and estimation provide the requisite platform for

  14. Comparative analysis of the modified enclosed energy metric for self-focusing holograms from digital lensless holographic microscopy.

    PubMed

    Trujillo, Carlos; Garcia-Sucerquia, Jorge

    2015-06-01

    A comparative analysis of the performance of the modified enclosed energy (MEE) method for self-focusing holograms recorded with digital lensless holographic microscopy is presented. Notwithstanding the MEE analysis previously published, no extended analysis of its performance has been reported. We have tested the MEE in terms of the minimum axial distance allowed between the set of reconstructed holograms to search for the focal plane and the elapsed time to obtain the focused image. These parameters have been compared with those for some of the already reported methods in the literature. The MEE achieves better results in terms of self-focusing quality but at a higher computational cost. Despite its longer processing time, the method remains within a time frame to be technologically attractive. Modeled and experimental holograms have been utilized in this work to perform the comparative study.

  15. Collection of human reaction times and supporting health related data for analysis of cognitive and physical performance.

    PubMed

    Brůha, Petr; Mouček, Roman; Vacek, Vítězslav; Šnejdar, Pavel; Černá, Kateřina; Řehoř, Petr

    2018-04-01

    Smoking, excessive drinking, overeating and physical inactivity are well-established risk factors decreasing human physical performance. Moreover, epidemiological work has identified modifiable lifestyle factors, such as poor diet and physical and cognitive inactivity, that are associated with the risk of reduced cognitive performance. Definition, collection and annotation of human reaction times and suitable health-related data and metadata provide researchers with a necessary source for further analysis of human physical and cognitive performance. The collection of human reaction times and supporting health-related data was obtained from two groups comprising 349 people of all ages: visitors of the Days of Science and Technology 2016 held on the Pilsen central square, and members of Mensa Czech Republic visiting the neuroinformatics lab at the University of West Bohemia. Each provided dataset contains a complete or partial set of data obtained from the following measurements: hand and leg reaction times, color vision, spirometry, electrocardiography, blood pressure, blood glucose, body proportions and flexibility. It also provides a sufficient set of metadata (age, gender and a summary of the participant's current lifestyle and health) to allow researchers to perform further analysis. This article has two main aims. The first is to provide a well-annotated collection of human reaction times and health-related data that is suitable for further analysis of lifestyle and human cognitive and physical performance. This data collection is complemented with a preliminary statistical evaluation. The second is to present a procedure for efficient acquisition of human reaction times and supporting health-related data in non-lab and lab conditions.

  16. The timing resolution of scintillation-detector systems: Monte Carlo analysis

    NASA Astrophysics Data System (ADS)

    Choong, Woon-Seng

    2009-11-01

    Recent advancements in fast scintillating materials and fast photomultiplier tubes (PMTs) have stimulated renewed interest in time-of-flight (TOF) positron emission tomography (PET). It is well known that the improvement in the timing resolution in PET can significantly reduce the noise variance in the reconstructed image resulting in improved image quality. In order to evaluate the timing performance of scintillation detectors used in TOF PET, we use Monte Carlo analysis to model the physical processes (crystal geometry, crystal surface finish, scintillator rise time, scintillator decay time, photoelectron yield, PMT transit time spread, PMT single-electron response, amplifier response and time pick-off method) that can contribute to the timing resolution of scintillation-detector systems. In the Monte Carlo analysis, the photoelectron emissions are modeled by a rate function, which is used to generate the photoelectron time points. The rate function, which is simulated using Geant4, represents the combined intrinsic light emissions of the scintillator and the subsequent light transport through the crystal. The PMT output signal is determined by the superposition of the PMT single-electron response resulting from the photoelectron emissions. The transit time spread and the single-electron gain variation of the PMT are modeled in the analysis. Three practical time pick-off methods are considered in the analysis. Statistically, the best timing resolution is achieved with the first photoelectron timing. The calculated timing resolution suggests that a leading edge discriminator gives better timing performance than a constant fraction discriminator and produces comparable results when a two-threshold or three-threshold discriminator is used. For a typical PMT, the effect of detector noise on the timing resolution is negligible. 
The calculated timing resolution is found to improve with increasing mean photoelectron yield, decreasing scintillator decay time and decreasing transit time spread. However, only substantial improvement in the timing resolution is obtained with improved transit time spread if the first photoelectron timing is less than the transit time spread. While the calculated timing performance does not seem to be affected by the pixel size of the crystal, it improves for an etched crystal compared to a polished crystal. In addition, the calculated timing resolution degrades with increasing crystal length. These observations can be explained by studying the initial photoelectron rate. Experimental measurements provide reasonably good agreement with the calculated timing resolution. The Monte Carlo analysis developed in this work will allow us to optimize the scintillation detectors for timing and to understand the physical factors limiting their performance.
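
    The first-photoelectron timing described above can be illustrated with a toy Monte Carlo: sample photoelectron arrival times from an exponential scintillation decay, add a Gaussian transit-time spread, and time-stamp each event with its earliest photoelectron. The decay constant, yield, and spread below are illustrative values, not the paper's simulated detector parameters:

```python
import random
import statistics

def first_pe_time(n_pe, decay_ns, tts_sigma_ns, rng):
    """Earliest photoelectron time for one event: exponential
    scintillation decay plus Gaussian PMT transit-time spread."""
    return min(rng.expovariate(1.0 / decay_ns) + rng.gauss(0.0, tts_sigma_ns)
               for _ in range(n_pe))

rng = random.Random(42)
samples = [first_pe_time(n_pe=1000, decay_ns=40.0, tts_sigma_ns=0.12, rng=rng)
           for _ in range(2000)]
sigma = statistics.stdev(samples)
print(f"first-photoelectron timing FWHM ~ {2.355 * sigma * 1000:.0f} ps")
```

    Repeating the run with a larger photoelectron yield or a smaller transit-time spread shrinks the spread of the first-photoelectron times, matching the trends reported in the abstract.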

  17. The timing resolution of scintillation-detector systems: Monte Carlo analysis.

    PubMed

    Choong, Woon-Seng

    2009-11-07

    Recent advancements in fast scintillating materials and fast photomultiplier tubes (PMTs) have stimulated renewed interest in time-of-flight (TOF) positron emission tomography (PET). It is well known that the improvement in the timing resolution in PET can significantly reduce the noise variance in the reconstructed image resulting in improved image quality. In order to evaluate the timing performance of scintillation detectors used in TOF PET, we use Monte Carlo analysis to model the physical processes (crystal geometry, crystal surface finish, scintillator rise time, scintillator decay time, photoelectron yield, PMT transit time spread, PMT single-electron response, amplifier response and time pick-off method) that can contribute to the timing resolution of scintillation-detector systems. In the Monte Carlo analysis, the photoelectron emissions are modeled by a rate function, which is used to generate the photoelectron time points. The rate function, which is simulated using Geant4, represents the combined intrinsic light emissions of the scintillator and the subsequent light transport through the crystal. The PMT output signal is determined by the superposition of the PMT single-electron response resulting from the photoelectron emissions. The transit time spread and the single-electron gain variation of the PMT are modeled in the analysis. Three practical time pick-off methods are considered in the analysis. Statistically, the best timing resolution is achieved with the first photoelectron timing. The calculated timing resolution suggests that a leading edge discriminator gives better timing performance than a constant fraction discriminator and produces comparable results when a two-threshold or three-threshold discriminator is used. For a typical PMT, the effect of detector noise on the timing resolution is negligible. 
The calculated timing resolution is found to improve with increasing mean photoelectron yield, decreasing scintillator decay time and decreasing transit time spread. However, only substantial improvement in the timing resolution is obtained with improved transit time spread if the first photoelectron timing is less than the transit time spread. While the calculated timing performance does not seem to be affected by the pixel size of the crystal, it improves for an etched crystal compared to a polished crystal. In addition, the calculated timing resolution degrades with increasing crystal length. These observations can be explained by studying the initial photoelectron rate. Experimental measurements provide reasonably good agreement with the calculated timing resolution. The Monte Carlo analysis developed in this work will allow us to optimize the scintillation detectors for timing and to understand the physical factors limiting their performance.

  18. Separation of Intercepted Multi-Radar Signals Based on Parameterized Time-Frequency Analysis

    NASA Astrophysics Data System (ADS)

    Lu, W. L.; Xie, J. W.; Wang, H. M.; Sheng, C.

    2016-09-01

    Modern radars use complex waveforms to obtain high detection performance and low probabilities of interception and identification. Signals intercepted from multiple radars overlap considerably in both the time and frequency domains and are difficult to separate with primary time parameters. Time-frequency analysis (TFA), as a key signal-processing tool, can provide better insight into the signal than conventional methods. In particular, among the various types of TFA, parameterized time-frequency analysis (PTFA) has shown great potential to investigate the time-frequency features of such non-stationary signals. In this paper, we propose a procedure for PTFA to separate overlapped radar signals; it includes five steps: initiation, parameterized time-frequency analysis, demodulating the signal of interest, adaptive filtering and recovering the signal. The effectiveness of the method was verified with simulated data and an intercepted radar signal received in a microwave laboratory. The results show that the proposed method has good performance and has potential in electronic reconnaissance applications, such as electronic intelligence, electronic warfare support measures, and radar warning.
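
    The demodulation step of the procedure can be illustrated for two overlapped linear-FM (chirp) pulses: multiplying by the conjugate of the estimated phase law of the signal of interest collapses it to DC, where it can be isolated by low-pass filtering. This is a minimal sketch with arbitrary chirp parameters, not the paper's PTFA algorithm:

```python
import numpy as np

fs = 1000.0
t = np.arange(0.0, 1.0, 1.0 / fs)

# Two overlapped linear-FM (chirp) pulses standing in for intercepted
# multi-radar signals; parameters here are arbitrary.
s1 = np.exp(1j * 2 * np.pi * (50 * t + 100 * t**2 / 2))
s2 = np.exp(1j * 2 * np.pi * (200 * t - 80 * t**2 / 2))
x = s1 + s2

# Demodulate with chirp 1's (assumed estimated) phase law: s1 collapses
# to DC, while s2 stays frequency-modulated and could be removed by a
# low-pass filter before re-modulating to recover s1.
demod = x * np.exp(-1j * 2 * np.pi * (50 * t + 100 * t**2 / 2))
spec = np.abs(np.fft.fft(demod))
print(f"dominant FFT bin after demodulation: {np.argmax(spec)}")
```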

  19. Real-time fMRI processing with physiological noise correction - Comparison with off-line analysis.

    PubMed

    Misaki, Masaya; Barzigar, Nafise; Zotev, Vadim; Phillips, Raquel; Cheng, Samuel; Bodurka, Jerzy

    2015-12-30

    While applications of real-time functional magnetic resonance imaging (rtfMRI) are growing rapidly, there are still limitations in real-time data processing compared to off-line analysis. We developed a proof-of-concept real-time fMRI processing (rtfMRIp) system utilizing a personal computer (PC) with a dedicated graphics processing unit (GPU) to demonstrate that it is now possible to perform intensive whole-brain fMRI data processing in real time. The rtfMRIp performs slice-timing correction, motion correction, spatial smoothing, signal scaling, and general linear model (GLM) analysis with multiple noise regressors, including physiological noise modeled with cardiac (RETROICOR) and respiration volume per time (RVT) regressors. The whole-brain data analysis, with more than 100,000 voxels and more than 250 volumes, is completed in less than 300 ms, much faster than the time required to acquire the fMRI volume. Real-time processing implementation cannot be identical to off-line analysis when time-course information is used, such as in slice-timing correction, signal scaling, and GLM analysis. We verified that reduced slice-timing correction for real-time analysis produced output comparable with off-line analysis. The real-time GLM analysis, however, showed over-fitting when the number of sampled volumes was small. Our system implemented real-time RETROICOR and RVT physiological noise corrections for the first time, and it is capable of processing these steps on all available data at a given time, without need for recursive algorithms. Comprehensive data processing in rtfMRI is possible with a PC, while the number of samples should be considered in real-time GLM.

  20. Developing corridor-level truck travel time estimates and other freight performance measures from archived ITS data.

    DOT National Transportation Integrated Search

    2009-08-01

    The objectives of this research were to retrospectively study the feasibility for using truck transponder data to produce freight corridor performance measures (travel times) and real-time traveler information. To support this analysis, weigh-in-moti...

  1. 76 FR 25345 - Annual Assessment of the Status of Competition in the Market for the Delivery of Video Programming

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-04

    ... as of June 30 of the relevant year to monitor trends on an annual basis. To continue our time-series... video programming? 24. MVPD Performance. We seek comment on the information and time- series data we... Television Performance. We seek information and time- series data for the analysis of various performance...

  2. A computer program for estimating the power-density spectrum of advanced continuous simulation language generated time histories

    NASA Technical Reports Server (NTRS)

    Dunn, H. J.

    1981-01-01

    A computer program for performing frequency analysis of time history data is presented. The program uses circular convolution and the fast Fourier transform to calculate power density spectrum (PDS) of time history data. The program interfaces with the advanced continuous simulation language (ACSL) so that a frequency analysis may be performed on ACSL generated simulation variables. An example of the calculation of the PDS of a Van de Pol oscillator is presented.
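
    The core computation the program performs, an FFT-based power density spectrum of a time history, can be sketched in a few lines; a pure 5 Hz test tone stands in here for an ACSL-generated simulation variable:

```python
import numpy as np

# Time history sampled at fs Hz; a 5 Hz tone stands in for the data.
fs = 100.0
t = np.arange(0.0, 10.0, 1.0 / fs)
x = np.sin(2 * np.pi * 5.0 * t)

# One-sided power density spectrum via the FFT (periodogram estimate).
X = np.fft.rfft(x)
psd = (np.abs(X) ** 2) / (fs * len(x))
psd[1:-1] *= 2  # fold negative frequencies into the one-sided estimate
freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)

peak = freqs[np.argmax(psd)]
print(f"PDS peak at {peak:.1f} Hz")  # 5.0 Hz
```

    For a self-sustained oscillator like the Van der Pol example, the same computation would show the fundamental frequency plus odd harmonics produced by the nonlinearity.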

  3. Combining instruction prefetching with partial cache locking to improve WCET in real-time systems.

    PubMed

    Ni, Fan; Long, Xiang; Wan, Han; Gao, Xiaopeng

    2013-01-01

    Caches play an important role in embedded systems to bridge the performance gap between fast processor and slow memory. And prefetching mechanisms are proposed to further improve the cache performance. While in real-time systems, the application of caches complicates the Worst-Case Execution Time (WCET) analysis due to its unpredictable behavior. Modern embedded processors often equip locking mechanism to improve timing predictability of the instruction cache. However, locking the whole cache may degrade the cache performance and increase the WCET of the real-time application. In this paper, we proposed an instruction-prefetching combined partial cache locking mechanism, which combines an instruction prefetching mechanism (termed as BBIP) with partial cache locking to improve the WCET estimates of real-time applications. BBIP is an instruction prefetching mechanism we have already proposed to improve the worst-case cache performance and in turn the worst-case execution time. The estimations on typical real-time applications show that the partial cache locking mechanism shows remarkable WCET improvement over static analysis and full cache locking.

  4. Combining Instruction Prefetching with Partial Cache Locking to Improve WCET in Real-Time Systems

    PubMed Central

    Ni, Fan; Long, Xiang; Wan, Han; Gao, Xiaopeng

    2013-01-01

    Caches play an important role in embedded systems to bridge the performance gap between fast processor and slow memory. And prefetching mechanisms are proposed to further improve the cache performance. While in real-time systems, the application of caches complicates the Worst-Case Execution Time (WCET) analysis due to its unpredictable behavior. Modern embedded processors often equip locking mechanism to improve timing predictability of the instruction cache. However, locking the whole cache may degrade the cache performance and increase the WCET of the real-time application. In this paper, we proposed an instruction-prefetching combined partial cache locking mechanism, which combines an instruction prefetching mechanism (termed as BBIP) with partial cache locking to improve the WCET estimates of real-time applications. BBIP is an instruction prefetching mechanism we have already proposed to improve the worst-case cache performance and in turn the worst-case execution time. The estimations on typical real-time applications show that the partial cache locking mechanism shows remarkable WCET improvement over static analysis and full cache locking. PMID:24386133

  5. Body sway, aim point fluctuation and performance in rifle shooters: inter- and intra-individual analysis.

    PubMed

    Ball, Kevin A; Best, Russell J; Wrigley, Tim V

    2003-07-01

    In this study, we examined the relationships between body sway, aim point fluctuation and performance in rifle shooting on an inter- and intra-individual basis. Six elite shooters performed 20 shots under competition conditions. For each shot, body sway parameters and four aim point fluctuation parameters were quantified for the time periods 5 s to shot, 3 s to shot and 1 s to shot. Three parameters were used to indicate performance. An AMTI LG6-4 force plate was used to measure body sway parameters, while a SCATT shooting analysis system was used to measure aim point fluctuation and shooting performance. Multiple regression analysis indicated that body sway was related to performance for four shooters. Also, body sway was related to aim point fluctuation for all shooters. These relationships were specific to the individual, with the strength of association, parameters of importance and time period of importance different for different shooters. Correlation analysis of significant regressions indicated that, as body sway increased, performance decreased and aim point fluctuation increased for most relationships. We conclude that body sway and aim point fluctuation are important in elite rifle shooting and performance errors are highly individual-specific at this standard. Individual analysis should be a priority when examining elite sports performance.

  6. Participation and Performance Trends in Triple Iron Ultra-triathlon – a Cross-sectional and Longitudinal Data Analysis

    PubMed Central

    Rüst, Christoph Alexander; Knechtle, Beat; Knechtle, Patrizia; Rosemann, Thomas; Lepers, Romuald

    2012-01-01

    Purpose The aims of the present study were to investigate (i) the changes in participation and performance and (ii) the gender difference in Triple Iron ultra-triathlon (11.4 km swimming, 540 km cycling and 126.6 km running) from 1988 to 2011. Methods For the cross-sectional data analysis, the association between overall race times and split times was investigated using simple linear regression analyses and analysis of variance. For the longitudinal data analysis, the changes in race times for the five men and five women with the highest number of participations were analysed using simple linear regression analyses. Results During the studied period, the numbers of finishers were 824 (71.4%) for men and 80 (78.4%) for women. Participation increased for men (r²=0.27, P<0.01) while it remained stable for women (8%). Total race times were 2,146 ± 127.3 min for men and 2,615 ± 327.2 min for women (P<0.001). Total race time decreased for men (r²=0.17; P=0.043), while it increased for women (r²=0.49; P=0.001) across years. The gender difference in overall race time for winners increased from 10% in 1992 to 42% in 2011 (r²=0.63; P<0.001). The longitudinal analysis of the five women and five men with the highest number of participations showed that performance decreased in one woman (r²=0.45; P=0.01). The four other women, as well as all five men, showed no change in overall race times across years. Conclusions Participation increased and performance improved for male Triple Iron ultra-triathletes, while participation remained unchanged and performance decreased for females between 1988 and 2011. The reasons for the widening gap between female and male Triple Iron ultra-triathletes need further investigation. PMID:23012633

  7. Identification of human operator performance models utilizing time series analysis

    NASA Technical Reports Server (NTRS)

    Holden, F. M.; Shinners, S. M.

    1973-01-01

    The results of an effort performed by Sperry Systems Management Division for AMRL in applying time series analysis as a tool for modeling the human operator are presented. This technique is utilized for determining the variation of the human transfer function under various levels of stress. The human operator's model is determined based on actual input and output data from a tracking experiment.

  8. Comparing the performance of FA, DFA and DMA using different synthetic long-range correlated time series

    PubMed Central

    Shao, Ying-Hui; Gu, Gao-Feng; Jiang, Zhi-Qiang; Zhou, Wei-Xing; Sornette, Didier

    2012-01-01

    Notwithstanding the significant efforts to develop estimators of long-range correlations (LRC) and to compare their performance, no clear consensus exists on which method is best and under which conditions. In addition, synthetic tests suggest that the performance of LRC estimators varies when different generators of LRC time series are used. Here, we compare the performances of four estimators [Fluctuation Analysis (FA), Detrended Fluctuation Analysis (DFA), Backward Detrending Moving Average (BDMA), and Centred Detrending Moving Average (CDMA)]. We use three different generators [Fractional Gaussian Noises, and two ways of generating Fractional Brownian Motions]. We find that CDMA has the best performance and DFA is only slightly worse in some situations, while FA performs the worst. In addition, CDMA and DFA are less sensitive to the scaling range than FA. Hence, CDMA and DFA remain “The Methods of Choice” in determining the Hurst index of time series. PMID:23150785
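
    DFA, one of the estimators compared above, is short enough to sketch directly: integrate the mean-subtracted series, detrend it in windows, and read the Hurst index off the log-log slope of the fluctuation function. Applied to uncorrelated Gaussian noise it should recover an exponent near 0.5 (this is a minimal first-order DFA, not the paper's benchmark code):

```python
import numpy as np

def dfa(x, scales):
    """First-order DFA: fluctuation F(s) for each window scale s."""
    y = np.cumsum(x - np.mean(x))          # integrated profile
    F = []
    for s in scales:
        n_seg = len(y) // s
        ms = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)   # linear detrend (DFA-1)
            ms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(ms)))
    return np.array(F)

rng = np.random.default_rng(0)
x = rng.standard_normal(4096)              # uncorrelated noise: H ~ 0.5
scales = np.array([16, 32, 64, 128, 256])
F = dfa(x, scales)
hurst = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(f"estimated Hurst exponent: {hurst:.2f}")
```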

  9. Alignment of high-throughput sequencing data inside in-memory databases.

    PubMed

    Firnkorn, Daniel; Knaup-Gregori, Petra; Lorenzo Bermejo, Justo; Ganzinger, Matthias

    2014-01-01

    With the rise of high-throughput DNA sequencing techniques, high-performance analysis of DNA sequences is of great importance. Computer-supported DNA analysis is still an intensive, time-consuming task. In this paper we explore the potential of a new in-memory database technology by using SAP's High Performance Analytic Appliance (HANA). We focus on read alignment as one of the first steps in DNA sequence analysis. In particular, we examined the widely used Burrows-Wheeler Aligner (BWA) and implemented stored procedures in both HANA and the free database system MySQL to compare execution time and memory management. To ensure that the results are comparable, MySQL was run in memory as well, utilizing its integrated memory engine for database table creation. We implemented stored procedures containing exact and inexact searching of DNA reads within the reference genome GRCh37. Due to technical restrictions in SAP HANA concerning recursion, the inexact matching problem could not be implemented on this platform. Hence, the performance comparison between HANA and MySQL was made using the execution times of the exact search procedures. Here, HANA was approximately 27 times faster than MySQL, which means that there is high potential within the new in-memory concepts, leading to further developments of DNA analysis procedures in the future.

  10. Sport, time pressure, and cognitive performance.

    PubMed

    Chiu, Chia N; Chen, Chiao-Yun; Muggleton, Neil G

    2017-01-01

    Sport participation, fitness, and expertise have been associated with a range of cognitive benefits across various populations, but both the factors that confer such benefits and the nature of the resulting changes are relatively unclear. Additionally, the interaction between time pressure and cognitive performance in these groups is little studied. Using a flanker task, which measures the ability to selectively process information, with different time limits for responding, we investigated the differences in performance for participants in (1) an unpredictable, open-skill sport (volleyball), (2) an exercise group engaged in predictable, closed-skill sports (running, swimming), and (3) nonsporting controls. A drift diffusion analysis of response times was used to characterize the nature of any differences. Volleyball players were more accurate than controls and the exercise group, particularly for shorter response time limits, and also tended to respond more quickly. Drift diffusion model analysis suggested that the better performance of the volleyball group was due to factors such as stimulus encoding or motor programming and execution rather than decision making. Trends in the data also suggest less noisy cognitive processing (rather than greater efficiency) and should be further investigated.
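
    The drift diffusion model used in analyses like the one above can be simulated as an Euler random walk between two decision bounds; response time is the first-passage time plus a non-decision component. The parameter values here are illustrative, not fitted to the study's data:

```python
import math
import random

def ddm_trial(drift, threshold, noise, dt, non_decision, rng):
    """One drift-diffusion trial: accumulate noisy evidence until it
    crosses +threshold (correct) or -threshold (error)."""
    x, t = 0.0, 0.0
    while abs(x) < threshold:
        x += drift * dt + noise * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        t += dt
    return non_decision + t, x > 0

rng = random.Random(1)
trials = [ddm_trial(drift=0.3, threshold=1.0, noise=1.0, dt=0.001,
                    non_decision=0.3, rng=rng) for _ in range(1000)]
accuracy = sum(ok for _, ok in trials) / len(trials)
mean_rt = sum(rt for rt, _ in trials) / len(trials)
print(f"accuracy {accuracy:.2f}, mean RT {mean_rt:.2f} s")
```

    Fitting such a model separates decision-stage parameters (drift, threshold) from the non-decision time covering encoding and motor execution, which is the distinction the study's conclusion rests on.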

  11. Time-Gated Raman Spectroscopy for Quantitative Determination of Solid-State Forms of Fluorescent Pharmaceuticals.

    PubMed

    Lipiäinen, Tiina; Pessi, Jenni; Movahedi, Parisa; Koivistoinen, Juha; Kurki, Lauri; Tenhunen, Mari; Yliruusi, Jouko; Juppo, Anne M; Heikkonen, Jukka; Pahikkala, Tapio; Strachan, Clare J

    2018-04-03

    Raman spectroscopy is widely used for quantitative pharmaceutical analysis, but a common obstacle to its use is sample fluorescence masking the Raman signal. Time-gating provides an instrument-based method for rejecting fluorescence through temporal resolution of the spectral signal and allows Raman spectra of fluorescent materials to be obtained. An additional practical advantage is that analysis is possible in ambient lighting. This study assesses the efficacy of time-gated Raman spectroscopy for the quantitative measurement of fluorescent pharmaceuticals. Time-gated Raman spectroscopy with a 128 × (2) × 4 CMOS SPAD detector was applied for quantitative analysis of ternary mixtures of solid-state forms of the model drug, piroxicam (PRX). Partial least-squares (PLS) regression allowed quantification, with Raman-active time domain selection (based on visual inspection) improving performance. Model performance was further improved by using kernel-based regularized least-squares (RLS) regression with greedy feature selection in which the data use in both the Raman shift and time dimensions was statistically optimized. Overall, time-gated Raman spectroscopy, especially with optimized data analysis in both the spectral and time dimensions, shows potential for sensitive and relatively routine quantitative analysis of photoluminescent pharmaceuticals during drug development and manufacturing.

  12. Analysis of musical expression in audio signals

    NASA Astrophysics Data System (ADS)

    Dixon, Simon

    2003-01-01

    In western art music, composers communicate their work to performers via a standard notation which specifies the musical pitches and relative timings of notes. This notation may also include some higher-level information such as variations in the dynamics, tempo and timing. Famous performers are characterised by their expressive interpretation, the ability to convey structural and emotive information within the given framework. The majority of work on audio content analysis focusses on retrieving score-level information; this paper reports on the extraction of parameters describing the performance, a task which requires a much higher degree of accuracy. Two systems are presented: BeatRoot, an off-line beat tracking system which finds the times of musical beats and tracks changes in tempo throughout a performance, and the Performance Worm, a system which provides a real-time visualisation of the two most important expressive dimensions, tempo and dynamics. Both of these systems are being used to process data for a large-scale study of musical expression in classical and romantic piano performance, which uses artificial intelligence (machine learning) techniques to discover fundamental patterns or principles governing expressive performance.

  13. Time Spent Walking and Risk of Diabetes in Japanese Adults: The Japan Public Health Center-Based Prospective Diabetes Study.

    PubMed

    Kabeya, Yusuke; Goto, Atsushi; Kato, Masayuki; Matsushita, Yumi; Takahashi, Yoshihiko; Isogawa, Akihiro; Inoue, Manami; Mizoue, Tetsuya; Tsugane, Shoichiro; Kadowaki, Takashi; Noda, Mitsuhiko

    2016-01-01

    The association between time spent walking and risk of diabetes was investigated in a Japanese population-based cohort. Data from the Japan Public Health Center-based Prospective Diabetes cohort were analyzed. The surveys of diabetes were performed at baseline and at the 5-year follow-up. Time spent walking per day was assessed using a self-reported questionnaire (<30 minutes, 30 minutes to <1 hour, 1 to <2 hours, or ≥2 hours). A cross-sectional analysis was performed among 26 488 adults in the baseline survey. Logistic regression was used to examine the association between time spent walking and the presence of unrecognized diabetes. We then performed a longitudinal analysis that was restricted to 11 101 non-diabetic adults who participated in both the baseline and 5-year surveys. The association between time spent walking and the incidence of diabetes during the 5 years was examined. In the cross-sectional analysis, 1058 participants had unrecognized diabetes. Those with time spent walking of <30 minutes per day had increased odds of having diabetes in relation to those with time spent walking of ≥2 hours (adjusted odds ratio [OR] 1.23; 95% CI, 1.02-1.48). In the longitudinal analysis, 612 participants developed diabetes during the 5 years of follow-up. However, a significant association between time spent walking and the incidence of diabetes was not observed. Increased risk of diabetes was implied in those with time spent walking of <30 minutes per day, although the longitudinal analysis failed to show a significant result.
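
    The adjusted odds ratio above (OR 1.23; 95% CI, 1.02-1.48) comes from logistic regression, but the underlying quantity can be illustrated with a simple Wald-type estimate from a 2×2 table. The counts below are hypothetical, not taken from the study.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Wald odds ratio and 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# hypothetical counts: diabetes cases among <30-min vs >=2-h walkers
or_, lo, hi = odds_ratio_ci(120, 4880, 100, 4900)
```

    A regression model does the same comparison while adjusting for covariates, which is why the study reports an adjusted rather than a crude OR.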

  14. Programmable neural processing on a smartdust for brain-computer interfaces.

    PubMed

    Yuwen Sun; Shimeng Huang; Oresko, Joseph J; Cheng, Allen C

    2010-10-01

    Brain-computer interfaces (BCIs) offer tremendous promise for improving the quality of life for disabled individuals. BCIs use spike sorting to identify the source of each neural firing. To date, spike sorting has been performed by either using off-chip analysis, which requires a wired connection penetrating the skull to a bulky external power/processing unit, or via custom application-specific integrated circuits that lack the programmability to perform different algorithms and upgrades. In this research, we propose and test the feasibility of performing on-chip, real-time spike sorting on a programmable smartdust, including feature extraction, classification, compression, and wireless transmission. A detailed power/performance tradeoff analysis using DVFS is presented. Our experimental results show that the execution time and power density meet the requirements to perform real-time spike sorting and wireless transmission on a single neural channel.

  15. Double Fourier analysis for Emotion Identification in Voiced Speech

    NASA Astrophysics Data System (ADS)

    Sierra-Sosa, D.; Bastidas, M.; Ortiz P., D.; Quintero, O. L.

    2016-04-01

    We propose a novel analysis alternative, based on two Fourier Transforms, for emotion recognition from speech. Fourier analysis allows different signals to be displayed and synthesized in terms of power spectral density distributions. A spectrogram of the voice signal is obtained by performing a short-time Fourier Transform with Gaussian windows; this spectrogram portrays frequency-related features, such as vocal tract resonances and quasi-periodic excitations during voiced sounds. Emotions induce such characteristics in speech, which become apparent in the spectrogram's time-frequency distribution. The time-frequency representation from the spectrogram is then treated as an image and processed through a 2-dimensional Fourier Transform to perform spatial Fourier analysis on it. Finally, features related to emotions in voiced speech are extracted and presented.
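
    The two-stage transform can be sketched as follows, using a synthetic test tone in place of real speech (the window length and overlap are illustrative choices, not the paper's parameters): a Gaussian-window STFT produces the spectrogram, which is then treated as an image and passed through a 2-D FFT.

```python
import numpy as np
from scipy.signal import stft
from scipy.signal.windows import gaussian

fs = 8000
t = np.arange(0, 1.0, 1 / fs)
# synthetic "voiced" tone with slow amplitude modulation
x = np.sin(2 * np.pi * 440 * t) * (1 + 0.3 * np.sin(2 * np.pi * 5 * t))

# stage 1: short-time Fourier transform with a Gaussian window
win = gaussian(256, std=32)
f, tt, Z = stft(x, fs=fs, window=win, nperseg=256, noverlap=192)
S = np.abs(Z)  # spectrogram magnitude (freq x time)

# stage 2: treat the spectrogram as an image, take its 2-D Fourier transform
F2 = np.fft.fftshift(np.fft.fft2(S))
features = np.abs(F2)  # spatial-frequency magnitudes, from which features are drawn
```

    The 2-D FFT responds to periodic structure across both the time and frequency axes of the spectrogram, which is how modulation patterns induced by emotion become compact features.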

  16. The `TTIME' Package: Performance Evaluation in a Cluster Computing Environment

    NASA Astrophysics Data System (ADS)

    Howe, Marico; Berleant, Daniel; Everett, Albert

    2011-06-01

    The objective of translating developmental event time across mammalian species is to gain an understanding of the timing of human developmental events based on known time of those events in animals. The potential benefits include improvements to diagnostic and intervention capabilities. The CRAN `ttime' package provides the functionality to infer unknown event timings and investigate phylogenetic proximity utilizing hierarchical clustering of both known and predicted event timings. The original generic mammalian model included nine eutherian mammals: Felis domestica (cat), Mustela putorius furo (ferret), Mesocricetus auratus (hamster), Macaca mulatta (monkey), Homo sapiens (humans), Mus musculus (mouse), Oryctolagus cuniculus (rabbit), Rattus norvegicus (rat), and Acomys cahirinus (spiny mouse). However, the data for this model is expected to grow as more data about developmental events is identified and incorporated into the analysis. Performance evaluation of the `ttime' package across a cluster computing environment versus a comparative analysis in a serial computing environment provides an important computational performance assessment. A theoretical analysis is the first stage of a process in which the second stage, if justified by the theoretical analysis, is to investigate an actual implementation of the `ttime' package in a cluster computing environment and to understand the parallelization process that underlies implementation.
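
    The hierarchical-clustering step can be sketched in Python (the `ttime` package itself is written in R, so this is a stand-in, and the species-by-event timing matrix below is hypothetical, not the package's data).

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# hypothetical event timings (days post-conception): 4 species x 5 events
timings = np.array([
    [10.0, 14.0, 21.0, 28.0, 40.0],   # species A
    [10.5, 14.5, 22.0, 29.0, 41.0],   # species B (close to A)
    [ 8.0, 11.0, 16.0, 20.0, 30.0],   # species C
    [ 8.2, 11.3, 16.5, 21.0, 31.0],   # species D (close to C)
])

# agglomerative clustering of species by their event-timing profiles
Z = linkage(timings, method="average", metric="euclidean")
clusters = fcluster(Z, t=2, criterion="maxclust")
```

    In `ttime`, predicted timings for species with missing events are clustered alongside known timings, so the dendrogram doubles as a check on phylogenetic proximity.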

  17. Evaluation of the microsoft kinect skeletal versus depth data analysis for timed-up and go and figure of 8 walk tests.

    PubMed

    Hotrabhavananda, Benjamin; Mishra, Anup K; Skubic, Marjorie; Hotrabhavananda, Nijaporn; Abbott, Carmen

    2016-08-01

    We compared the performance of the Kinect skeletal data with the Kinect depth data in capturing different gait parameters during the Timed-Up and Go Test (TUG) and Figure of 8 Walk Test (F8W). The gait parameters considered were stride length, stride time, and walking speed for the TUG, and number of steps and completion time for the F8W. A marker-based Vicon motion capture system was used for the ground-truth measurements. Five healthy participants were recruited for the experiment and were asked to perform three trials of each task. Results show that depth data analysis yields stride length and stride time measures with significantly lower percentile errors compared with the skeletal data analysis. However, the skeletal and depth data performed similarly, with less than 3% absolute mean percentile error, in determining the walking speed for the TUG and both parameters of the F8W. The results show the potential of Kinect depth data analysis for computing many gait parameters, whereas the Kinect skeletal data can also be used for walking speed in the TUG and the F8W gait parameters.

  18. Performance analysis of static locking in replicated distributed database systems

    NASA Technical Reports Server (NTRS)

    Kuang, Yinghong; Mukkamala, Ravi

    1991-01-01

    Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects because they are difficult to evaluate through analysis and time-consuming to evaluate through simulation. A technique is used that combines simulation and analysis to closely illustrate the impact of deadlock and to evaluate the performance of replicated distributed databases with both shared and exclusive locks.

  19. Toward Capturing Momentary Changes of Heart Rate Variability by a Dynamic Analysis Method

    PubMed Central

    Zhang, Haoshi; Zhu, Mingxing; Zheng, Yue; Li, Guanglin

    2015-01-01

    The analysis of heart rate variability (HRV) has been performed on long-term electrocardiography (ECG) recordings (12~24 hours) and short-term recordings (2~5 minutes), which may not capture momentary changes of HRV. In this study, we present a new method to analyze the momentary HRV (mHRV). The ECG recordings were segmented into a series of overlapped HRV analysis windows with a window length of 5 minutes and different time increments. The performance of the proposed method in delineating the dynamics of momentary HRV measurement was evaluated with four commonly used time courses of HRV measures on both synthetic time series and real ECG recordings from human subjects and dogs. Our results showed that a smaller time increment could capture more dynamical information on transient changes. Since a too-short increment such as 10 s would cause indented time courses of the four measures, a 1-min time increment (4-min overlap) was suggested for the analysis of mHRV in this study. ECG recordings from human subjects and dogs were used to further assess the effectiveness of the proposed method. The pilot study demonstrated that the proposed analysis of mHRV could provide a more accurate assessment of the dynamical changes in cardiac activity than the conventional measures of HRV (without time overlapping). The proposed method may provide an efficient means of delineating the dynamics of momentary HRV, and further investigation would be worthwhile. PMID:26172953
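
    The overlapped-window scheme described above (5-min windows advanced by a chosen increment) can be sketched as follows. SDNN and RMSSD stand in for the paper's four HRV measures, which the abstract does not name; the RR series in the demo is synthetic.

```python
import numpy as np

def momentary_hrv(rr_ms, t_s, win_s=300.0, step_s=60.0):
    """Compute SDNN and RMSSD over overlapping 5-min windows that
    advance by `step_s` (1-min step => 4-min overlap between windows).
    rr_ms: RR intervals in ms; t_s: beat times in seconds."""
    out = []
    start = 0.0
    while start + win_s <= t_s[-1]:
        rr = rr_ms[(t_s >= start) & (t_s < start + win_s)]
        sdnn = rr.std(ddof=1)                         # overall variability
        rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))    # beat-to-beat variability
        out.append((start, sdnn, rmssd))
        start += step_s
    return out

# synthetic, perfectly regular heartbeat: 800-ms RR intervals for ~16 min
rr = np.full(1200, 800.0)
t = np.cumsum(rr) / 1000.0
series = momentary_hrv(rr, t)
```

    With a 1-min step each beat contributes to five consecutive windows, which is what smooths the time course relative to a 10-s step while still tracking transients.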

  20. Effects of System Timing Parameters on Operator Performance in a Personnel Records Task

    DTIC Science & Technology

    1981-03-01

    work sampling, embedded performance measures, and operator satisfaction ratings) are needed to provide a complete analysis of the effects of the four... HFL-81-1/NPRDC-81-1, March 1981. EFFECTS OF SYSTEM TIMING PARAMETERS ON OPERATOR PERFORMANCE IN A PERSONNEL RECORDS TASK. Robert C. Williges; Beverly H... Final report.

  1. Integrated Sensitivity Analysis Workflow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman-Hill, Ernest J.; Hoffman, Edward L.; Gibson, Marcus J.

    2014-08-01

    Sensitivity analysis is a crucial element of rigorous engineering analysis, but performing such an analysis on a complex model is difficult and time consuming. The mission of the DART Workbench team at Sandia National Laboratories is to lower the barriers to adoption of advanced analysis tools through software integration. The integrated environment guides the engineer in the use of these integrated tools and greatly reduces the cycle time for engineering analysis.

  2. Performance optimisations for distributed analysis in ALICE

    NASA Astrophysics Data System (ADS)

    Betev, L.; Gheata, A.; Gheata, M.; Grigoras, C.; Hristov, P.

    2014-06-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlation or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system still has to be improved by an important factor to satisfy the analysis needs. We have instrumented all analysis jobs with "sensors" collecting comprehensive monitoring information on the job running conditions and performance in order to identify bottlenecks in the data processing flow. These data are collected by the MonALISA-based ALICE Grid monitoring system and are used to steer and improve the job submission and management policy, to identify operational problems in real time and to perform automatic corrective actions. In parallel with an upgrade of our production system, we are aiming for low-level improvements related to data format, data management and merging of results to allow for better-performing ALICE analysis.

  3. PERTS: A Prototyping Environment for Real-Time Systems

    NASA Technical Reports Server (NTRS)

    Liu, Jane W. S.; Lin, Kwei-Jay; Liu, C. L.

    1991-01-01

    We discuss an ongoing project to build a Prototyping Environment for Real-Time Systems, called PERTS. PERTS is a unique prototyping environment in that it has (1) tools and performance models for the analysis and evaluation of real-time prototype systems, (2) building blocks for flexible real-time programs and the support system software, (3) basic building blocks of distributed and intelligent real time applications, and (4) an execution environment. PERTS will make the recent and future theoretical advances in real-time system design and engineering readily usable to practitioners. In particular, it will provide an environment for the use and evaluation of new design approaches, for experimentation with alternative system building blocks and for the analysis and performance profiling of prototype real-time systems.

  4. Independent component analysis algorithm FPGA design to perform real-time blind source separation

    NASA Astrophysics Data System (ADS)

    Meyer-Baese, Uwe; Odom, Crispin; Botella, Guillermo; Meyer-Baese, Anke

    2015-05-01

    The conditions that arise in the Cocktail Party Problem prevail across many fields, creating a need for Blind Source Separation (BSS). These fields include array processing, communications, medical and speech signal processing, wireless communication, audio, acoustics and biomedical engineering. The concept of the cocktail party problem and BSS led to the development of Independent Component Analysis (ICA) algorithms. ICA proves useful for applications needing real-time signal processing. The goal of this research was to perform an extensive study of the ability and efficiency of Independent Component Analysis algorithms to perform blind source separation on mixed signals in software, and of their implementation in hardware with a Field Programmable Gate Array (FPGA). The Algebraic ICA (A-ICA), Fast ICA, and Equivariant Adaptive Separation via Independence (EASI) ICA were examined and compared. The best algorithm would require the least complexity and fewest resources while effectively separating mixed sources; by this criterion, the best algorithm was the EASI algorithm. The EASI ICA was implemented on hardware with FPGAs to analyze its performance in real time.
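
    A software sketch of the serial EASI update (in the Cardoso-Laheld form) is shown below; the step size, nonlinearity, initialization, and two-source demo mixture are illustrative choices, not details from this study or its FPGA implementation.

```python
import numpy as np

def easi_separate(X, mu=1e-3, g=np.tanh, seed=0):
    """Serial EASI adaptive separation over mixed samples X (n_sources x n_samples).
    Per-sample update: W <- W - mu * [(y y^T - I) + (g(y) y^T - y g(y)^T)] W,
    where y = W x. Returns the unmixing matrix W and Y = W X."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    W = np.eye(n) + 0.01 * rng.standard_normal((n, n))  # near-identity init
    I = np.eye(n)
    for x in X.T:
        y = (W @ x)[:, None]  # current output, as a column vector
        H = (y @ y.T - I) + (g(y) @ y.T - y @ g(y).T)
        W -= mu * (H @ W)
    return W, W @ X

# demo: two synthetic sources mixed by a fixed matrix
t = np.linspace(0, 10, 5000)
S = np.vstack([np.sin(2 * np.pi * 3 * t),
               np.sign(np.sin(2 * np.pi * 5 * t))])
A = np.array([[1.0, 0.6], [0.4, 1.0]])
W, Y = easi_separate(A @ S)
```

    The appeal of EASI for hardware is visible in the update itself: it is a per-sample rule built from matrix-vector products and one elementwise nonlinearity, with no batch statistics to store.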

  5. Regression analysis for bivariate gap time with missing first gap time data.

    PubMed

    Huang, Chia-Hui; Chen, Yi-Hau

    2017-01-01

    We consider ordered bivariate gap times where data on the first gap time are unobservable. This study is motivated by the HIV infection and AIDS study, where the initial HIV contracting time is unavailable, but the diagnosis times for HIV and AIDS are available. We are interested in studying the risk factors for the gap time between initial HIV contraction and HIV diagnosis, and the gap time between HIV and AIDS diagnoses. The association between the two gap times is also of interest. Accordingly, in the data analysis we are faced with a two-fold complexity: data on the first gap time are completely missing, and the second gap time is subject to induced informative censoring due to dependence between the two gap times. We propose a modeling framework for regression analysis of bivariate gap time under this complexity of the data. The estimating equations for the covariate effects on, as well as the association between, the two gap times are derived through maximum likelihood and suitable counting processes. Large-sample properties of the resulting estimators are developed by martingale theory. Simulations are performed to examine the performance of the proposed analysis procedure. An application to data from the HIV and AIDS study mentioned above is reported for illustration.

  6. Identification of Time-Varying Pilot Control Behavior in Multi-Axis Control Tasks

    NASA Technical Reports Server (NTRS)

    Zaal, Peter M. T.; Sweet, Barbara T.

    2012-01-01

    Recent developments in fly-by-wire control architectures for rotorcraft have introduced new interest in the identification of time-varying pilot control behavior in multi-axis control tasks. In this paper a maximum likelihood estimation method is used to estimate the parameters of a pilot model with time-dependent sigmoid functions to characterize time-varying human control behavior. An experiment was performed by 9 general aviation pilots who had to perform a simultaneous roll and pitch control task with time-varying aircraft dynamics. In 8 different conditions, the axis containing the time-varying dynamics and the growth factor of the dynamics were varied, allowing for an analysis of the performance of the estimation method when estimating time-dependent parameter functions. In addition, a detailed analysis of pilots' adaptation to the time-varying aircraft dynamics in both the roll and pitch axes could be performed. Pilot control behavior in both axes was significantly affected by the time-varying aircraft dynamics in roll and pitch, and by the growth factor. The main effect was found in the axis that contained the time-varying dynamics. However, pilot control behavior also changed over time in the axis not containing the time-varying aircraft dynamics. This indicates that some cross coupling exists in the perception and control processes between the roll and pitch axes.

  7. Statistical Evaluation of Time Series Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Benignus, V. A.

    1973-01-01

    The performance of a modified version of NASA's multivariate spectrum analysis program is discussed. A multiple regression model was used to make the revisions. Performance improvements were documented and compared to the standard fast Fourier transform by Monte Carlo techniques.

  8. Complete daVinci versus laparoscopic pyeloplasty: cost analysis.

    PubMed

    Bhayani, Sam B; Link, Richard E; Varkarakis, John M; Kavoussi, Louis R

    2005-04-01

    Computer-assisted pyeloplasty with the daVinci system is an emerging technique to treat ureteropelvic junction (UPJ) obstruction. A relative cost analysis was performed assessing this technology in comparison with purely laparoscopic pyeloplasty. Eight patients underwent computer-assisted (daVinci) dismembered pyeloplasty (CP) via a transperitoneal four-port approach. They were compared with 13 patients who underwent purely laparoscopic pyeloplasty (LP). All patients had a primary UPJ obstruction and were matched for age, sex, and body mass index. The cost of equipment and capital depreciation for both procedures, as well as assessment of room set-up time, takedown time, and personnel were analyzed. Surgeons and nursing staff for both groups were experienced in both laparoscopy and daVinci procedures. One- and two-way financial analysis was performed to assess relative costs. The mean set-up and takedown time was 71 minutes for CP and 49 minutes for LP. The mean length of stay was 2.3 days for CP and 2.5 days for LP. The mean operating room (OR) times for CP and LP were 176 and 210 minutes, respectively. There were no complications in either group. One-way cost analysis with an economic model showed that LP is more cost effective than CP at our hospital if LP OR time is <338 minutes. With adjustment to a volume of 500 daVinci cases/year, CP is still not as cost effective as LP. Two-way sensitivity analysis shows that in-room time must still be <130 minutes and yearly cases must be >500 to obtain cost equivalence for CP. Perioperative parameters for CP are encouraging. However, the costs are a clear disadvantage. In our hospital, it is more cost effective to teach and perform LP than to perform CP.
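
    The one-way sensitivity logic reported above (e.g., "LP is more cost effective than CP if LP OR time is <338 minutes") amounts to solving a linear cost model for a break-even point. The sketch below illustrates that calculation; every rate and cost figure in it is hypothetical, not the hospital's actual data, so it reproduces the method rather than the paper's 338-minute threshold.

```python
def total_cost(or_minutes, or_rate, equipment, stay_days, daily_rate):
    """Linear per-case cost model: OR time + equipment/capital share + hospital stay."""
    return or_minutes * or_rate + equipment + stay_days * daily_rate

def breakeven_or_minutes(rival_cost, or_rate, equipment, stay_days, daily_rate):
    """OR minutes at which this procedure's total cost equals the rival's."""
    return (rival_cost - equipment - stay_days * daily_rate) / or_rate

# hypothetical figures: CP carries a much larger equipment/capital share than LP
cp_cost = total_cost(or_minutes=176, or_rate=20.0, equipment=3000.0,
                     stay_days=2.3, daily_rate=800.0)
# LP stays cheaper than CP as long as its OR time is below this threshold
lp_limit = breakeven_or_minutes(cp_cost, or_rate=20.0, equipment=500.0,
                                stay_days=2.5, daily_rate=800.0)
```

    Two-way sensitivity analysis repeats this over a grid of two parameters at once (here, in-room time and annual case volume), tracing the frontier where the two procedures cost the same.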

  9. Comparative analysis of different weight matrices in subspace system identification for structural health monitoring

    NASA Astrophysics Data System (ADS)

    Shokravi, H.; Bakhary, NH

    2017-11-01

    Subspace System Identification (SSI) is considered one of the most reliable tools for identification of system parameters. The performance of an SSI scheme is considerably affected by the structure of the associated identification algorithm. The weight matrix is a variable in SSI that is used to reduce the dimensionality of the state-space equation. Generally, one of the weight matrices of Principal Component (PC), Unweighted Principal Component (UPC) or Canonical Variate Analysis (CVA) is used in the structure of an SSI algorithm. An increasing number of studies in the field of structural health monitoring use SSI for damage identification. However, studies that evaluate the performance of the weight matrices, particularly with respect to accuracy, noise resistance, and time complexity, are very limited. In this study, the accuracy, noise-robustness, and time-efficiency of the weight matrices are compared using different qualitative and quantitative metrics. Three evaluation metrics of pole analysis, fit values and elapsed time are used in the assessment process. A numerical model of a mass-spring-dashpot and operational data are used in this research paper. It is observed that the principal components obtained using PC algorithms are more robust against noise uncertainty and give more stable results for the pole distribution. Furthermore, higher estimation accuracy is achieved using the UPC algorithm. CVA had the worst performance for pole analysis and time efficiency. The superior performance of the UPC algorithm in elapsed time is attributed to its use of unit weight matrices. The obtained results demonstrate that the process of reducing dimensionality in CVA and PC did not enhance time efficiency but yielded improved modal identification in PC.

  10. Fractal analysis of time varying data

    DOEpatents

    Vo-Dinh, Tuan; Sadana, Ajit

    2002-01-01

    Characteristics of time varying data, such as an electrical signal, are analyzed by converting the data from a temporal domain into a spatial domain pattern. Fractal analysis is performed on the spatial domain pattern, thereby producing a fractal dimension D.sub.F. The fractal dimension indicates the regularity of the time varying data.
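
    The patent's temporal-to-spatial transform is not detailed here, so as a stand-in the idea of assigning a fractal dimension D_F to time-varying data can be illustrated with the standard Higuchi estimator applied directly to a 1-D series (an assumption; this is not the patented method). A regular signal gives D_F near 1, while white noise gives D_F near 2.

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Higuchi estimate of the fractal dimension of a 1-D series:
    average curve lengths L(k) over subsampled copies of the series,
    then fit log L(k) against log(1/k); the slope is D_F."""
    x = np.asarray(x, float)
    N = len(x)
    L = []
    for k in range(1, kmax + 1):
        Lk = []
        for m in range(k):
            idx = np.arange(m, N, k)               # every k-th sample from offset m
            d = np.abs(np.diff(x[idx])).sum()      # curve length at scale k
            Lk.append(d * (N - 1) / ((len(idx) - 1) * k * k))
        L.append(np.mean(Lk))
    slope, _ = np.polyfit(np.log(1.0 / np.arange(1, kmax + 1)), np.log(L), 1)
    return slope

rng = np.random.default_rng(0)
fd_line = higuchi_fd(np.arange(1000.0))        # smooth, regular data -> ~1
fd_noise = higuchi_fd(rng.standard_normal(2000))  # white noise -> ~2
```

    The estimate's use as a regularity indicator mirrors the patent's claim: the closer D_F is to 1, the more regular the underlying time-varying data.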

  11. Performance analysis of static locking in replicated distributed database systems

    NASA Technical Reports Server (NTRS)

    Kuang, Yinghong; Mukkamala, Ravi

    1991-01-01

    Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects because they are difficult to evaluate through analysis and time-consuming to evaluate through simulation. Here, a technique is discussed that combines simulation and analysis to closely illustrate the impact of deadlock and to evaluate the performance of replicated distributed databases with both shared and exclusive locks.

  12. Real-time implementation of an interactive jazz accompaniment system

    NASA Astrophysics Data System (ADS)

    Deshpande, Nikhil

    Modern computational algorithms and digital signal processing (DSP) are able to combine with human performers without forced or predetermined structure in order to create dynamic and real-time accompaniment systems. With modern computing power and intelligent algorithm layout and design, it is possible to achieve more detailed auditory analysis of live music. Using this information, computer code can follow and predict how a human's musical performance evolves, and use this to react in a musical manner. This project builds a real-time accompaniment system to perform together with live musicians, with a focus on live jazz performance and improvisation. The system utilizes a new polyphonic pitch detector and embeds it in an Ableton Live system - combined with Max for Live - to perform elements of audio analysis, generation, and triggering. The system also relies on tension curves and information rate calculations from the Creative Artificially Intuitive and Reasoning Agent (CAIRA) system to help understand and predict human improvisation. These metrics are vital to the core system and allow for extrapolated audio analysis. The system is able to react dynamically to a human performer, and can successfully accompany the human as an entire rhythm section.

  13. Impact of scatterometer wind (ASCAT-A/B) data assimilation on semi real-time forecast system at KIAPS

    NASA Astrophysics Data System (ADS)

    Han, H. J.; Kang, J. H.

    2016-12-01

    Since July 2015, KIAPS (Korea Institute of Atmospheric Prediction Systems) has been running a semi real-time forecast system to assess the performance of its forecast system as an NWP model. KPOP (KIAPS Protocol for Observation Processing) is part of the KIAPS data assimilation system and has been performing well in the semi real-time forecast system. In this study, since KPOP can now process scatterometer wind data, we analyze the effect of scatterometer wind (ASCAT-A/B) on the KIAPS semi real-time forecast system. The O-B global distribution and statistics of scatterometer wind show that the differences between the background field and observations are not excessive and that KPOP processes the scatterometer wind data well. The resulting changes in the analysis increment appear most prominently in the lower atmosphere. The scatterometer wind data also cover wide ocean areas where observations would otherwise be sparse. The performance of the scatterometer wind data can be checked through the vertical error reduction against IFS between the background and analysis fields, and through the vertical statistics of O-A. These results indicate that scatterometer wind data have a positive effect on the lower-level performance of the KIAPS semi real-time forecast system. Long-term results on the effect of scatterometer wind data will be analyzed subsequently.

  14. LMI-based stability and performance conditions for continuous-time nonlinear systems in Takagi-Sugeno's form.

    PubMed

    Lam, H K; Leung, Frank H F

    2007-10-01

    This correspondence presents the stability analysis and performance design of the continuous-time fuzzy-model-based control systems. The idea of the nonparallel-distributed-compensation (non-PDC) control laws is extended to the continuous-time fuzzy-model-based control systems. A nonlinear controller with non-PDC control laws is proposed to stabilize the continuous-time nonlinear systems in Takagi-Sugeno's form. To produce the stability-analysis result, a parameter-dependent Lyapunov function (PDLF) is employed. However, two difficulties are usually encountered: 1) the time-derivative terms produced by the PDLF will complicate the stability analysis and 2) the stability conditions are not in the form of linear-matrix inequalities (LMIs) that aid the design of feedback gains. To tackle the first difficulty, the time-derivative terms are represented by some weighted-sum terms in some existing approaches, which will increase the number of stability conditions significantly. In view of the second difficulty, some positive-definitive terms are added in order to cast the stability conditions into LMIs. In this correspondence, the favorable properties of the membership functions and nonlinear control laws, which allow the introduction of some free matrices, are employed to alleviate the two difficulties while retaining the favorable properties of PDLF-based approach. LMI-based stability conditions are derived to ensure the system stability. Furthermore, based on a common scalar performance index, LMI-based performance conditions are derived to guarantee the system performance. Simulation examples are given to illustrate the effectiveness of the proposed approach.

  15. High-performance liquid chromatography with fluorescence detection for the rapid analysis of pheophytins and pyropheophytins in virgin olive oil.

    PubMed

    Li, Xueqi; Woodman, Michael; Wang, Selina C

    2015-08-01

    Pheophytins and pyropheophytins are degradation products of chlorophyll pigments, and their ratios can be used as a sensitive indicator of stress during the manufacturing and storage of olive oil. They increase over time depending on the storage conditions and on whether the oil is exposed to heat treatments during the refining process. The traditional analysis method includes solvent- and time-consuming steps of solid-phase extraction followed by analysis by high-performance liquid chromatography with ultraviolet detection. We developed an improved dilute/fluorescence method in which the multi-step sample preparation was replaced by a simple isopropanol dilution before the high-performance liquid chromatography injection. A quaternary solvent gradient method was used to include a fourth strong solvent wash on a quaternary gradient pump, which avoided the need to premix any solvents and greatly reduced the oil residues left on the column from previous analyses. This new method not only reduces analysis cost and time but also shows reliability, repeatability, and improved sensitivity, which is especially important for low-level samples. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. On-orbit frequency stability analysis of the GPS NAVSTAR-1 quartz clock and the NAVSTARs-6 and -8 rubidium clocks

    NASA Technical Reports Server (NTRS)

    Mccaskill, T. B.; Buisson, J. A.; Reid, W. G.

    1984-01-01

    An on-orbit frequency stability performance analysis of the GPS NAVSTAR-1 quartz clock and the NAVSTARs-6 and -8 rubidium clocks is presented. The clock offsets were obtained from measurements taken at the GPS monitor stations which use high performance cesium standards as a reference. Clock performance is characterized through the use of the Allan variance, which is evaluated for sample times of 15 minutes to two hours, and from one day to 10 days. The quartz and rubidium clocks' offsets were corrected for aging rate before computing the frequency stability. The effect of small errors in aging rate is presented for the NAVSTAR-8 rubidium clock's stability analysis. The analysis includes presentation of time and frequency residuals with respect to linear and quadratic models, which aid in obtaining aging rate values and identifying systematic and random effects. The frequency stability values were further processed with a time domain noise process analysis, which is used to classify random noise process and modulation type.
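
    The Allan variance used above to characterize clock stability can be sketched for evenly sampled fractional-frequency data; the white-FM test signal below is synthetic, and this is the plain non-overlapped estimator rather than whatever specific processing the GPS analysis used.

```python
import numpy as np

def allan_variance(y, m):
    """Non-overlapped Allan variance of fractional-frequency samples y
    at averaging factor m (i.e., tau = m * tau0): half the mean squared
    difference of successive tau-averaged frequency values."""
    y = np.asarray(y, float)
    n = len(y) // m
    yb = y[: n * m].reshape(n, m).mean(axis=1)   # tau-averaged frequency
    return 0.5 * np.mean(np.diff(yb) ** 2)

# white frequency noise: the Allan variance should fall as 1/tau
rng = np.random.default_rng(1)
y = rng.standard_normal(100_000)
avar1 = allan_variance(y, 1)
avar10 = allan_variance(y, 10)
```

    Because it differences successive averages, the Allan variance converges for the drifting oscillator noises (flicker, random walk) where the ordinary variance does not; the slope of the resulting log-log curve versus tau is what the noise-process analysis in the abstract uses to classify the random noise type.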

  17. Cerebral Small Vessel Disease Burden Is Associated with Motor Performance of Lower and Upper Extremities in Community-Dwelling Populations

    PubMed Central

    Su, Ning; Zhai, Fei-Fei; Zhou, Li-Xin; Ni, Jun; Yao, Ming; Li, Ming-Li; Jin, Zheng-Yu; Gong, Gao-Lang; Zhang, Shu-Yang; Cui, Li-Ying; Tian, Feng; Zhu, Yi-Cheng

    2017-01-01

    Objective: To investigate the correlation between cerebral small vessel disease (CSVD) burden and motor performance of lower and upper extremities in community-dwelling populations. Methods: We performed a cross-sectional analysis on 770 participants enrolled in the Shunyi study, which is a population-based cohort study. CSVD burden, including white matter hyperintensities (WMH), lacunes, cerebral microbleeds (CMBs), perivascular spaces (PVS), and brain atrophy were measured using 3T magnetic resonance imaging. All participants underwent quantitative motor assessment of lower and upper extremities, which included 3-m walking speed, 5-repeat chair-stand time, 10-repeat pronation–supination time, and 10-repeat finger-tapping time. Data on demographic characteristics, vascular risk factors, and cognitive functions were collected. General linear model analysis was performed to identify potential correlations between motor performance measures and imaging markers of CSVD after controlling for confounding factors. Results: For motor performance of the lower extremities, WMH was negatively associated with gait speed (standardized β = -0.092, p = 0.022) and positively associated with chair-stand time (standardized β = 0.153, p < 0.0001, surviving FDR correction). For motor performance of the upper extremities, pronation–supination time was positively associated with WMH (standardized β = 0.155, p < 0.0001, surviving FDR correction) and negatively with brain parenchymal fraction (BPF; standardized β = -0.125, p = 0.011, surviving FDR correction). Only BPF was found to be negatively associated with finger-tapping time (standardized β = -0.123, p = 0.012). However, lacunes, CMBs, or PVS were not found to be associated with motor performance of lower or upper extremities in multivariable analysis. Conclusion: Our findings suggest that cerebral microstructural changes related to CSVD may affect motor performance of both lower and upper extremities. 
WMH and brain atrophy are most strongly associated with motor function deterioration in community-dwelling populations. PMID:29021757

  18. Cerebral Small Vessel Disease Burden Is Associated with Motor Performance of Lower and Upper Extremities in Community-Dwelling Populations.

    PubMed

    Su, Ning; Zhai, Fei-Fei; Zhou, Li-Xin; Ni, Jun; Yao, Ming; Li, Ming-Li; Jin, Zheng-Yu; Gong, Gao-Lang; Zhang, Shu-Yang; Cui, Li-Ying; Tian, Feng; Zhu, Yi-Cheng

    2017-01-01

    Objective: To investigate the correlation between cerebral small vessel disease (CSVD) burden and motor performance of lower and upper extremities in community-dwelling populations. Methods: We performed a cross-sectional analysis on 770 participants enrolled in the Shunyi study, which is a population-based cohort study. CSVD burden, including white matter hyperintensities (WMH), lacunes, cerebral microbleeds (CMBs), perivascular spaces (PVS), and brain atrophy were measured using 3T magnetic resonance imaging. All participants underwent quantitative motor assessment of lower and upper extremities, which included 3-m walking speed, 5-repeat chair-stand time, 10-repeat pronation-supination time, and 10-repeat finger-tapping time. Data on demographic characteristics, vascular risk factors, and cognitive functions were collected. General linear model analysis was performed to identify potential correlations between motor performance measures and imaging markers of CSVD after controlling for confounding factors. Results: For motor performance of the lower extremities, WMH was negatively associated with gait speed (standardized β = -0.092, p = 0.022) and positively associated with chair-stand time (standardized β = 0.153, p < 0.0001, surviving FDR correction). For motor performance of the upper extremities, pronation-supination time was positively associated with WMH (standardized β = 0.155, p < 0.0001, surviving FDR correction) and negatively with brain parenchymal fraction (BPF; standardized β = -0.125, p = 0.011, surviving FDR correction). Only BPF was found to be negatively associated with finger-tapping time (standardized β = -0.123, p = 0.012). However, lacunes, CMBs, or PVS were not found to be associated with motor performance of lower or upper extremities in multivariable analysis. Conclusion: Our findings suggest that cerebral microstructural changes related to CSVD may affect motor performance of both lower and upper extremities. 
WMH and brain atrophy are most strongly associated with motor function deterioration in community-dwelling populations.

  19. Evaluation of Counter-Based Dynamic Load Balancing Schemes for Massive Contingency Analysis on Over 10,000 Cores

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Huang, Zhenyu; Rice, Mark J.

    Contingency analysis studies are necessary to assess the impact of possible power system component failures. The results of the contingency analysis are used to ensure grid reliability, and in power market operation for the feasibility test of market solutions. Currently, these studies are performed in real time based on the current operating conditions of the grid with a pre-selected contingency list, which might result in overlooking some critical contingencies caused by variable system status. To have a complete picture of a power grid, more contingencies need to be studied to improve grid reliability. High-performance computing techniques hold the promise of being able to perform the analysis for more contingency cases within a much shorter time frame. This paper evaluates the performance of counter-based dynamic load balancing schemes for a massive contingency analysis program on 10,000+ cores. One million N-2 contingency analysis cases with a Western Electricity Coordinating Council power grid model have been used to demonstrate the performance. Speedups of 3964 with 4096 cores and 7877 with 10,240 cores are obtained. This paper reports the performance of the load balancing scheme with a single counter and with two counters, describes disk I/O issues, and discusses other potential techniques for further improving the performance.
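
    The speedup figures quoted above translate directly into parallel efficiencies (speedup divided by core count); the helper below is a generic sketch of that arithmetic, not code from the paper.

```python
def parallel_efficiency(speedup, cores):
    """Parallel efficiency = achieved speedup / core count (1.0 is ideal)."""
    return speedup / cores

# Figures reported for the massive contingency analysis runs above
eff_4096 = parallel_efficiency(3964, 4096)    # ~0.97
eff_10240 = parallel_efficiency(7877, 10240)  # ~0.77
```

    Efficiency dropping from roughly 97% to 77% as the core count grows is exactly the scaling loss that the counter-based load balancing scheme aims to limit.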

  20. Time Spent Walking and Risk of Diabetes in Japanese Adults: The Japan Public Health Center-Based Prospective Diabetes Study

    PubMed Central

    Kabeya, Yusuke; Goto, Atsushi; Kato, Masayuki; Matsushita, Yumi; Takahashi, Yoshihiko; Isogawa, Akihiro; Inoue, Manami; Mizoue, Tetsuya; Tsugane, Shoichiro; Kadowaki, Takashi; Noda, Mitsuhiko

    2016-01-01

    Background: The association between time spent walking and risk of diabetes was investigated in a Japanese population-based cohort. Methods: Data from the Japan Public Health Center-based Prospective Diabetes cohort were analyzed. The surveys of diabetes were performed at baseline and at the 5-year follow-up. Time spent walking per day was assessed using a self-reported questionnaire (<30 minutes, 30 minutes to <1 hour, 1 to <2 hours, or ≥2 hours). A cross-sectional analysis was performed among 26 488 adults in the baseline survey. Logistic regression was used to examine the association between time spent walking and the presence of unrecognized diabetes. We then performed a longitudinal analysis that was restricted to 11 101 non-diabetic adults who participated in both the baseline and 5-year surveys. The association between time spent walking and the incidence of diabetes during the 5 years was examined. Results: In the cross-sectional analysis, 1058 participants had unrecognized diabetes. Those with time spent walking of <30 minutes per day had increased odds of having diabetes in relation to those with time spent walking of ≥2 hours (adjusted odds ratio [OR] 1.23; 95% CI, 1.02–1.48). In the longitudinal analysis, 612 participants developed diabetes during the 5 years of follow-up. However, a significant association between time spent walking and the incidence of diabetes was not observed. Conclusions: Increased risk of diabetes was implied in those with time spent walking of <30 minutes per day, although the longitudinal analysis failed to show a significant result. PMID:26725285
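
    For readers unfamiliar with how such adjusted odds ratios are reported, the sketch below shows the standard back-transformation from a logistic-regression coefficient to an OR with a Wald 95% CI. The coefficient and standard error are hypothetical values chosen to reproduce the OR of 1.23 (1.02-1.48) quoted above, not numbers from the study.

```python
import math

def odds_ratio_with_ci(beta, se, z=1.96):
    """Odds ratio and Wald 95% CI from a logistic-regression coefficient."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Hypothetical coefficient/SE consistent with the reported OR 1.23 (1.02-1.48)
or_, lo, hi = odds_ratio_with_ci(0.207, 0.095)
```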

  1. Comparative performance assessment of commercially available automatic external defibrillators: A simulation and real-life measurement study of hands-off time.

    PubMed

    Savastano, Simone; Vanni, Vincenzo; Burkart, Roman; Raimondi, Maurizio; Canevari, Fabrizio; Molinari, Simone; Baldi, Enrico; Danza, Aurora I; Caputo, Maria Luce; Mauri, Romano; Regoli, Francois; Conte, Giulio; Benvenuti, Claudio; Auricchio, Angelo

    2017-01-01

    Early and good-quality cardiopulmonary resuscitation (CPR) and the use of automated external defibrillators (AEDs) improve cardiac arrest patients' survival. However, AED peri- and post-shock/analysis pauses may reduce CPR effectiveness. The time performance of 12 different commercially available AEDs was tested in a manikin-based scenario; then the AED recordings from the same tested models following clinical use in both Pavia and Ticino were analyzed to evaluate the post-shock and post-analysis time. None of the AEDs was able to complete the analysis and to charge the capacitors in less than 10 s, and the mean post-shock pause was 6.7±2.4 s. For non-shockable rhythms, the mean analysis time was 10.3±2 s and the mean post-analysis time was 6.2±2.2 s. We analyzed 154 AED records [104 by Emergency Medical Service (EMS) rescuers; 50 by lay rescuers]. EMS rescuers were faster in resuming CPR than lay rescuers [5.3 s (95% CI 5-5.7) vs 8.6 s (95% CI 7.3-10)]. AEDs showed different performances that may reduce CPR quality, mostly for those rescuers following AED instructions. Both technological improvements and better lay-rescuer training might be needed. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  2. Analysis method comparison of on-time and on-budget data.

    DOT National Transportation Integrated Search

    2007-02-01

    New Mexico Department of Transportation (NMDOT) results for On-Time and On-Budget performance measures as reported in (AASHTO/SCoQ) NCHRP 20-24(37) Project Measuring Performance Among State DOTs (Phase I) are lower than construction personnel kno...

  3. Sleep-deprivation effect on human performance: a meta-analysis approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Candice D. Griffith; Sankaran Mahadevan

    Human fatigue is hard to define since there is no direct measure of fatigue, much like stress. Instead, fatigue must be inferred from measures that are affected by fatigue. One such measurable output affected by fatigue is reaction time. In this study the relationship of reaction time to sleep deprivation is studied. These variables were selected because reaction time and hours of sleep deprivation are straightforward characteristics of fatigue with which to begin the investigation of fatigue effects on performance. Meta-analysis, a widely used procedure in medical and psychological studies, is applied in this study to the variety of fatigue literature collected from various fields. Meta-analysis establishes a procedure for coding and analyzing information from various studies to compute an effect size. In this research the effect size reported is the difference between standardized means, and is found to be -0.6341, implying a strong relationship between sleep deprivation and performance degradation.
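
    The effect size reported above is a standardized mean difference: the difference in group means divided by the pooled standard deviation. A minimal sketch, using hypothetical reaction-time numbers (not data from the studies coded in the meta-analysis):

```python
import math

def standardized_mean_difference(m1, sd1, n1, m2, sd2, n2):
    """Cohen's d: mean difference over the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2)
                          / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical mean reaction times (ms): rested vs sleep-deprived groups
d = standardized_mean_difference(310, 40, 30, 335, 39, 30)  # ~ -0.63
```

    A negative d of this magnitude, like the reported -0.6341, indicates the deprived group responds substantially slower than the rested group.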

  4. Clinical usefulness and feasibility of time-frequency analysis of chemosensory event-related potentials.

    PubMed

    Huart, C; Rombaux, Ph; Hummel, T; Mouraux, A

    2013-09-01

    The clinical usefulness of olfactory event-related brain potentials (OERPs) to assess olfactory function is limited by the relatively low signal-to-noise ratio of the responses identified using conventional time-domain averaging. Recently, it was shown that time-frequency analysis of the obtained EEG signals can markedly improve the signal-to-noise ratio of OERPs in healthy controls, because it enhances both phase-locked and non-phase-locked EEG responses. The aim of the present study was to investigate the clinical usefulness of this approach and evaluate its feasibility in a clinical setting. We retrospectively analysed EEG recordings obtained from 45 patients (15 anosmic, 15 hyposmic and 15 normosmic). The responses to olfactory stimulation were analysed using conventional time-domain analysis and joint time-frequency analysis. The ability of the two methods to discriminate between anosmic, hyposmic and normosmic patients was assessed using a Receiver Operating Characteristic analysis. The discrimination performance of OERPs identified using conventional time-domain averaging was poor. In contrast, the discrimination performance of the EEG response identified in the time-frequency domain was relatively high. Furthermore, we found a significant correlation between the magnitude of this response and the psychophysical olfactory score. Time-frequency analysis of the EEG responses to olfactory stimulation could be used as an effective and reliable diagnostic tool for the objective clinical evaluation of olfactory function in patients.
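
    The advantage described above — time-frequency averaging preserves non-phase-locked activity that cancels out in conventional time-domain averaging — can be demonstrated with a small numpy simulation (synthetic trials, not EEG data from the study):

```python
import numpy as np

rng = np.random.default_rng(0)
fs, f0, n = 256, 10.0, 256  # sample rate (Hz), response frequency, samples
t = np.arange(n) / fs

# 40 simulated trials: a 10 Hz response whose phase jitters across trials
# (non-phase-locked, like "induced" EEG activity)
trials = np.array([np.sin(2 * np.pi * f0 * t + rng.uniform(0, 2 * np.pi))
                   for _ in range(40)])

def amplitude_at(x, f):
    """Amplitude of the DFT of x at frequency f (rectangular window)."""
    k = int(round(f * len(x) / fs))
    return np.abs(np.fft.rfft(x))[k] * 2 / len(x)

# Time-domain averaging first: phase jitter cancels the response
phase_locked_avg = amplitude_at(trials.mean(axis=0), f0)
# Averaging per-trial spectral magnitudes: the response survives
induced_avg = np.mean([amplitude_at(x, f0) for x in trials])
```

    With random phases, `phase_locked_avg` is near zero while `induced_avg` stays near the true single-trial amplitude of 1.0, which is why the time-frequency approach recovers responses that conventional averaging destroys.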

  5. Modeling and performance analysis using extended fuzzy-timing Petri nets for networked virtual environments.

    PubMed

    Zhou, Y; Murata, T; Defanti, T A

    2000-01-01

    Despite their attractive properties, networked virtual environments (net-VEs) are notoriously difficult to design, implement, and test due to the concurrency, real-time and networking features in these systems. Net-VEs demand high quality-of-service (QoS) requirements on the network to maintain natural and real-time interactions among users. The current practice for net-VE design is basically trial and error, empirical, and totally lacks formal methods. This paper proposes to apply a Petri net formal modeling technique to a net-VE, NICE (narrative immersive constructionist/collaborative environment), predict the net-VE performance based on simulation, and improve the net-VE performance. NICE is essentially a network of collaborative virtual reality systems called the CAVE (CAVE automatic virtual environment). First, we introduce extended fuzzy-timing Petri net (EFTN) modeling and analysis techniques. Then, we present EFTN models of the CAVE, NICE, and the transport layer protocol used in NICE: transmission control protocol (TCP). We show the possibility analysis based on the EFTN model for the CAVE. Then, by using these models and Design/CPN as the simulation tool, we conducted various simulations to study real-time behavior, network effects and performance (latencies and jitters) of NICE. Our simulation results are consistent with experimental data.

  6. Numerical Modeling of Pulse Detonation Rocket Engine Gasdynamics and Performance

    NASA Technical Reports Server (NTRS)

    2003-01-01

    This paper presents viewgraphs on the numerical modeling of pulse detonation rocket engines (PDRE), with an emphasis on the gasdynamics and performance analysis of these engines. The topics include: 1) Performance Analysis of PDREs; 2) Simplified PDRE Cycle; 3) Comparison of PDRE and Steady-State Rocket Engine (SSRE) Performance; 4) Numerical Modeling of Quasi 1-D Rocket Flows; 5) Specific PDRE Geometries Studied; 6) Time-Accurate Thrust Calculations; 7) PDRE Performance (Geometries A, B, C, and D); 8) PDRE Blowdown Gasdynamics (Geom. A, B, C, and D); 9) PDRE Geometry Performance Comparison; 10) PDRE Blowdown Time (Geom. A, B, C, and D); 11) Specific SSRE Geometry Studied; 12) Effect of F-R Chemistry on SSRE Performance; 13) PDRE/SSRE Performance Comparison; 14) PDRE Performance Study; 15) Grid Resolution Study; and 16) Effect of F-R Chemistry on SSRE Exit Species Mole Fractions.

  7. Aeroelastic Stability of Idling Wind Turbines

    NASA Astrophysics Data System (ADS)

    Wang, Kai; Riziotis, Vasilis A.; Voutsinas, Spyros G.

    2016-09-01

    Wind turbine rotors in idling operation mode can experience high angles of attack within the post-stall region that are capable of triggering stall-induced vibrations. In the present paper, rotor stability in slow idling operation is assessed on the basis of non-linear time-domain and linear eigenvalue analysis. The analysis is performed for a 10 MW conceptual wind turbine designed by DTU. First, the flow conditions that are likely to favour stall-induced instabilities are identified through non-linear time-domain aeroelastic analysis. Next, for the above specified conditions, eigenvalue stability simulations are performed aiming at identifying the low-damped modes of the turbine. Finally, the results of the eigenvalue analysis are evaluated through computations of the work of the aerodynamic forces by imposing harmonic vibrations following the shape and frequency of the various modes. Eigenvalue analysis indicates that the asymmetric and symmetric out-of-plane modes have the lowest damping. The results of the eigenvalue analysis agree well with those of the time-domain analysis.

  8. Relationship between radiation treatment time and overall survival after induction chemotherapy for locally advanced head-and-neck carcinoma: a subset analysis of TAX 324.

    PubMed

    Sher, David J; Posner, Marshall R; Tishler, Roy B; Sarlis, Nicholas J; Haddad, Robert I; Holupka, Edward J; Devlin, Phillip M

    2011-12-01

    To analyze the relationship between overall survival (OS) and radiation treatment time (RTT) and overall treatment time (OTT) in a well-described sequential therapy paradigm for locally advanced head-and-neck carcinoma (LAHNC). TAX 324 is a Phase III study comparing TPF (docetaxel, cisplatin, and fluorouracil) with PF (cisplatin and fluorouracil) induction chemotherapy (IC) in LAHNC patients; both arms were followed by carboplatin-based chemoradiotherapy (CRT). Prospective radiotherapy quality assurance was performed. This analysis includes all patients who received three cycles of IC and a radiation dose of ≥70 Gy. Radiotherapy treatment time was analyzed as binary (≤8 weeks vs. longer) and continuous (number of days beyond 8 weeks) functions. The primary analysis assessed the relationship between RTT, OTT, and OS, and the secondary analysis explored the association between treatment times and locoregional recurrence (LRR). A total of 333 (of 501) TAX 324 patients met the criteria for inclusion in this analysis. There were no significant differences between the treatment arms in baseline or treatment characteristics. On multivariable analysis, PF IC, World Health Organization performance status of 1, non-oropharynx site, T3/4 stage, N3 status, and prolonged RTT (hazard ratio 1.63, p=0.006) were associated with significantly inferior survival. Performance status, T3/4 disease, and prolonged RTT (odds ratio 1.68, p=0.047) were independently and negatively related to LRR on multivariable analysis, whereas PF was not. Overall treatment time was not independently associated with either OS or LRR. In this secondary analysis of the TAX 324 trial, TPF IC remains superior to PF IC after controlling for radiotherapy delivery time. Even with optimal IC and concurrent chemotherapy, a non-prolonged RTT is a crucial determinant of treatment success. Appropriate delivery of radiotherapy after IC remains essential for optimizing OS in LAHNC. Copyright © 2011 Elsevier Inc. 
All rights reserved.

  9. Relationship Between Radiation Treatment Time and Overall Survival After Induction Chemotherapy for Locally Advanced Head-and-Neck Carcinoma: A Subset Analysis of TAX 324

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sher, David J., E-mail: dsher@partners.org; Posner, Marshall R.; Tishler, Roy B.

    2011-12-01

    Purpose: To analyze the relationship between overall survival (OS) and radiation treatment time (RTT) and overall treatment time (OTT) in a well-described sequential therapy paradigm for locally advanced head-and-neck carcinoma (LAHNC). Methods and Materials: TAX 324 is a Phase III study comparing TPF (docetaxel, cisplatin, and fluorouracil) with PF (cisplatin and fluorouracil) induction chemotherapy (IC) in LAHNC patients; both arms were followed by carboplatin-based chemoradiotherapy (CRT). Prospective radiotherapy quality assurance was performed. This analysis includes all patients who received three cycles of IC and a radiation dose of ≥70 Gy. Radiotherapy treatment time was analyzed as binary (≤8 weeks vs. longer) and continuous (number of days beyond 8 weeks) functions. The primary analysis assessed the relationship between RTT, OTT, and OS, and the secondary analysis explored the association between treatment times and locoregional recurrence (LRR). Results: A total of 333 (of 501) TAX 324 patients met the criteria for inclusion in this analysis. There were no significant differences between the treatment arms in baseline or treatment characteristics. On multivariable analysis, PF IC, World Health Organization performance status of 1, non-oropharynx site, T3/4 stage, N3 status, and prolonged RTT (hazard ratio 1.63, p = 0.006) were associated with significantly inferior survival. Performance status, T3/4 disease, and prolonged RTT (odds ratio 1.68, p = 0.047) were independently and negatively related to LRR on multivariable analysis, whereas PF was not. Overall treatment time was not independently associated with either OS or LRR. Conclusions: In this secondary analysis of the TAX 324 trial, TPF IC remains superior to PF IC after controlling for radiotherapy delivery time. Even with optimal IC and concurrent chemotherapy, a non-prolonged RTT is a crucial determinant of treatment success. Appropriate delivery of radiotherapy after IC remains essential for optimizing OS in LAHNC.

  10. Evaluation of different time domain peak models using extreme learning machine-based peak detection for EEG signal.

    PubMed

    Adam, Asrul; Ibrahim, Zuwairie; Mokhtar, Norrima; Shapiai, Mohd Ibrahim; Cumming, Paul; Mubin, Marizan

    2016-01-01

    Various peak models have been introduced to detect and analyze peaks in the time-domain analysis of electroencephalogram (EEG) signals. In general, a peak model in the time-domain analysis consists of a set of signal parameters, such as amplitude, width, and slope. Models including those proposed by Dumpala, Acir, Liu, and Dingle are routinely used to detect peaks in EEG signals acquired in clinical studies of epilepsy or eye blink. The optimal peak model is the one with the most reliable peak detection performance in a particular application. A fair measure of performance of different models requires a common and unbiased platform. In this study, we evaluate the performance of the four different peak models using the extreme learning machine (ELM)-based peak detection algorithm. We found that the Dingle model gave the best performance, with 72% accuracy in the analysis of real EEG data. Statistical analysis confirmed that the Dingle model afforded significantly better mean testing accuracy than did the Acir and Liu models, which were in the range 37-52%. Meanwhile, the Dingle model showed no significant difference compared to the Dumpala model.
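
    As context for the classifier used above: an extreme learning machine is a single-hidden-layer network whose input weights stay random and fixed, with only the output weights solved in closed form. The toy sketch below uses synthetic features standing in for peak-model parameters (amplitude, width, slope); it is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

def elm_train(X, y, hidden=50):
    """ELM training: random fixed input weights, least-squares output weights."""
    W = rng.normal(size=(X.shape[1], hidden))  # random, never trained
    b = rng.normal(size=hidden)
    H = np.tanh(X @ W + b)                     # hidden-layer activations
    beta = np.linalg.pinv(H) @ y               # closed-form output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Synthetic peak/no-peak data from three hypothetical features
X = rng.normal(size=(200, 3))
y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(float)  # synthetic labels
W, b, beta = elm_train(X, y)
acc = np.mean((elm_predict(X, W, b, beta) > 0.5) == y)
```

    Because only `beta` is fitted, and via a pseudoinverse rather than iterative backpropagation, training is a single linear solve — the main appeal of ELMs for this kind of comparison platform.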

  11. Factors Affecting Accuracy and Time Requirements of a Glucose Oxidase-Peroxidase Assay for Determination of Glucose

    USDA-ARS?s Scientific Manuscript database

    Accurate and rapid assays for glucose are desirable for analysis of glucose and starch in food and feedstuffs. An established colorimetric glucose oxidase-peroxidase method for glucose was modified to reduce analysis time, and evaluated for factors that affected accuracy. Time required to perform t...

  12. Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0

    DOE PAGES

    Huck, Kevin A.; Malony, Allen D.; Shende, Sameer; ...

    2008-01-01

    The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments presents a challenge to manage and process the information. Simply to characterize the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we will discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering, and correlation analysis of individual trials of large dimensions, and can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We will give examples of large-scale analysis results, and discuss the future development of the framework, including the encoding and processing of expert performance rules, and the increasing use of performance metadata.

  13. Freeway performance measurement system : an operational analysis tool

    DOT National Transportation Integrated Search

    2001-07-30

    PeMS is a freeway performance measurement system for all of California. It processes 2 GB/day of 30-second loop detector data in real time to produce useful information. Managers at any time can have a uniform and comprehensive assessment of fre...

  14. Modelling lecturer performance index of private university in Tulungagung by using survival analysis with multivariate adaptive regression spline

    NASA Astrophysics Data System (ADS)

    Hasyim, M.; Prastyo, D. D.

    2018-03-01

    Survival analysis models the relationship between independent variables and survival time as the dependent variable. In practice, not all survival data can be recorded completely, for various reasons; in such situations, the data are called censored data. Moreover, several models for survival analysis require assumptions. One of the approaches in survival analysis is the nonparametric approach, which imposes more relaxed assumptions. In this research, the nonparametric approach employed is Multivariate Adaptive Regression Spline (MARS). This study aims to measure the performance of private university lecturers. The survival time in this study is the duration needed by a lecturer to obtain their professional certificate. The results show that research activity is a significant factor, along with developing course materials, publication in international or national journals, and involvement in research collaboration.

  15. Characterizing the performance of ecosystem models across time scales: A spectral analysis of the North American Carbon Program site-level synthesis

    Treesearch

    Michael C. Dietze; Rodrigo Vargas; Andrew D. Richardson; Paul C. Stoy; Alan G. Barr; Ryan S. Anderson; M. Altaf Arain; Ian T. Baker; T. Andrew Black; Jing M. Chen; Philippe Ciais; Lawrence B. Flanagan; Christopher M. Gough; Robert F. Grant; David Hollinger; R. Cesar Izaurralde; Christopher J. Kucharik; Peter Lafleur; Shugang Liu; Erandathie Lokupitiya; Yiqi Luo; J. William Munger; Changhui Peng; Benjamin Poulter; David T. Price; Daniel M. Ricciuto; William J. Riley; Alok Kumar Sahoo; Kevin Schaefer; Andrew E. Suyker; Hanqin Tian; Christina Tonitto; Hans Verbeeck; Shashi B. Verma; Weifeng Wang; Ensheng Weng

    2011-01-01

    Ecosystem models are important tools for diagnosing the carbon cycle and projecting its behavior across space and time. Despite the fact that ecosystems respond to drivers at multiple time scales, most assessments of model performance do not discriminate different time scales. Spectral methods, such as wavelet analyses, present an alternative approach that enables the...

  16. A break-even analysis of major ear surgery.

    PubMed

    Wasson, J D; Phillips, J S

    2015-10-01

    To determine variables which affect cost and profit for major ear surgery and perform a break-even analysis. Retrospective financial analysis. UK teaching hospital. Patients who underwent major ear surgery under general anaesthesia performed by the senior author in main theatre over a 2-year period between 07 September 2010 and 07 September 2012. Income, cost and profit for each major ear patient spell. Variables that affect major ear surgery profitability. Seventy-six patients met inclusion criteria. Wide variation in earnings, with a median net loss of £1345.50, was observed. Income was relatively uniform across all patient spells; however, theatre time for major ear surgery, at a cost of £953.24 per hour, varied between patients and was the main determinant of cost and profit for the patient spell. Bivariate linear regression of earnings on theatre time showed that 94% of the variation in earnings was due to variation in theatre time (r = -0.969; P < 0.0001) and derived a break-even time for major ear surgery of 110.6 min. Theatre time was dependent on complexity of procedure and number of OPCS4 procedures performed, with a significant increase in theatre time when three or more procedures were performed during major ear surgery (P = 0.015). For major ear surgery to either break even or return a profit, total theatre time should not exceed 110 min and 36 s. © 2015 John Wiley & Sons Ltd.
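
    The break-even arithmetic above follows from a roughly fixed income per patient spell and an hourly theatre cost. The sketch below back-derives the implied income (about £1757 — an assumption inferred from the reported figures, not a number stated in the paper) from the £953.24/hour rate and the 110.6-minute break-even point.

```python
def break_even_minutes(income_per_spell, theatre_cost_per_hour):
    """Theatre time (minutes) at which theatre cost equals the spell's income."""
    return income_per_spell / theatre_cost_per_hour * 60

# ~£1757 income per spell, back-derived from the reported break-even point
t_even = break_even_minutes(1757.1, 953.24)  # ~110.6 minutes
```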

  17. Pre-hospital electrocardiogram triage with telemedicine near halves time to treatment in STEMI: A meta-analysis and meta-regression analysis of non-randomized studies.

    PubMed

    Brunetti, Natale Daniele; De Gennaro, Luisa; Correale, Michele; Santoro, Francesco; Caldarola, Pasquale; Gaglione, Antonio; Di Biase, Matteo

    2017-04-01

    A shorter time to treatment has been shown to be associated with lower mortality rates in acute myocardial infarction (AMI). Several strategies have been adopted with the aim of reducing any delay in the diagnosis of AMI: pre-hospital triage with telemedicine is one such strategy. We therefore aimed to measure the real effect of pre-hospital triage with telemedicine in case of AMI in a meta-analysis study. We performed a meta-analysis of non-randomized studies with the aim to quantify the exact reduction of time to treatment achieved by pre-hospital triage with telemedicine. Data were pooled and compared by relative time reduction and 95% C.I.s. A meta-regression analysis was performed in order to find possible predictors of shorter time to treatment. Eleven studies were selected and finally evaluated in the study. The overall relative reduction of time to treatment with pre-hospital triage and telemedicine was -38% to -40% (p<0.001). Absolute time reduction was significantly correlated to time to treatment in the control groups (p<0.001), while relative time reduction was independent. A non-significant trend toward shorter relative time reductions was observed over the years. Pre-hospital triage with telemedicine is associated with a nearly halved time to treatment in AMI. The benefit is larger in terms of absolute time-to-treatment reduction in populations with larger delays to treatment. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. GNSS global real-time augmentation positioning: Real-time precise satellite clock estimation, prototype system construction and performance analysis

    NASA Astrophysics Data System (ADS)

    Chen, Liang; Zhao, Qile; Hu, Zhigang; Jiang, Xinyuan; Geng, Changjiang; Ge, Maorong; Shi, Chuang

    2018-01-01

    The large number of ambiguities in the un-differenced (UD) model lowers calculation efficiency, which is not suitable for high-frequency (e.g., 1 Hz) real-time GNSS clock estimation. A mixed differenced model fusing UD pseudo-range and epoch-differenced (ED) phase observations has been introduced into real-time clock estimation. In this contribution, we extend the mixed differenced model to realize multi-GNSS real-time high-frequency clock updating, and a rigorous comparison and analysis under the same conditions is performed to achieve the best real-time clock estimation performance, taking efficiency, accuracy, consistency and reliability into consideration. Based on the multi-GNSS real-time data streams provided by the multi-GNSS Experiment (MGEX) and Wuhan University, a GPS + BeiDou + Galileo global real-time augmentation positioning prototype system is designed and constructed, including real-time precise orbit determination, real-time precise clock estimation, real-time Precise Point Positioning (RT-PPP) and real-time Standard Point Positioning (RT-SPP). The statistical analysis of the 6 h-predicted real-time orbits shows that the root mean square (RMS) in the radial direction is about 1-5 cm for GPS, BeiDou MEO and Galileo satellites and about 10 cm for BeiDou GEO and IGSO satellites. Using the mixed differenced estimation model, the prototype system can realize highly efficient real-time satellite absolute clock estimation with no constant clock bias and can be used for high-frequency augmentation message updating (such as 1 Hz). The real-time augmentation message signal-in-space ranging error (SISRE), a comprehensive accuracy measure of orbit and clock that affects the users' actual positioning performance, is introduced to evaluate and analyze the performance of the GPS + BeiDou + Galileo global real-time augmentation positioning system. 
The statistical analysis of the real-time augmentation message SISRE is about 4-7 cm for GPS, while about 10 cm for BeiDou IGSO/MEO and Galileo, and about 30 cm for BeiDou GEO satellites. The real-time positioning results prove that GPS + BeiDou + Galileo RT-PPP, compared to GPS-only, can effectively accelerate convergence time by about 60%, improve positioning accuracy by about 30%, and obtain an averaged RMS of 4 cm in horizontal and 6 cm in vertical; additionally, RT-SPP in the prototype system can achieve a positioning accuracy of about 1 m averaged RMS in horizontal and 1.5-2 m in vertical, improvements of 60% and 70%, respectively, over SPP based on the broadcast ephemeris.
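
    For reference, SISRE as used above combines orbit and clock errors into a single range-domain figure. One common formulation weights the radial component by about 0.98 and the along-/cross-track components by 1/7 for GPS altitudes; these weights are an assumption here, since the record does not state the exact definition used.

```python
import math

def sisre_gps(dr, da, dc, dt):
    """Approximate GPS SISRE (metres): dr/da/dc are radial/along-/cross-track
    orbit errors, dt is the clock error, all already expressed in metres."""
    return math.sqrt((0.98 * dr - dt) ** 2 + (da ** 2 + dc ** 2) / 49)

# e.g. 3 cm radial, 10 cm along-track, 8 cm cross-track, 2 cm clock error
err = sisre_gps(0.03, 0.10, 0.08, 0.02)  # ~2 cm
```

    The heavy radial weighting reflects that radial orbit error and clock error project almost directly onto the user's range, while along- and cross-track errors contribute only weakly.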

  19. Interpreting the Need for Initial Support to Perform Tandem Stance Tests of Balance

    PubMed Central

    Brach, Jennifer S.; Perera, Subashan; Wert, David M.; VanSwearingen, Jessie M.; Studenski, Stephanie A.

    2012-01-01

    Background Geriatric rehabilitation reimbursement increasingly requires documented deficits on standardized measures. Tandem stance performance can characterize balance, but protocols are not standardized. Objective The purpose of this study was to explore the impact of: (1) initial support to stabilize in position and (2) maximum hold time on tandem stance tests of balance in older adults. Design A cross-sectional secondary analysis of observational cohort data was conducted. Methods One hundred seventeen community-dwelling older adults (71% female, 12% black) were assigned to 1 of 3 groups based on the need for initial support to perform tandem stance: (1) unable even with support, (2) able only with support, and (3) able without support. The able without support group was further stratified on hold time in seconds: (1) <10 (low), (2) 10 to 29 (medium), and (3) ≥30 (high). Groups were compared on primary outcomes (gait speed, Timed “Up & Go” Test performance, and balance confidence) using analysis of variance. Results Twelve participants were unable to perform tandem stance, 14 performed tandem stance only with support, and 91 performed tandem stance without support. Compared with the able without support group, the able with support group had statistically or clinically worse performance and balance confidence. No significant differences were found between the able with support group and the unable even with support group on these same measures. Extending the hold time to 30 seconds in a protocol without initial support eliminated ceiling effects for 16% of the study sample. Limitations Small comparison groups, use of a secondary analysis, and limited generalizability of results were limitations of the study. Conclusions Requiring initial support to stabilize in tandem stance appears to reflect meaningful deficits in balance-related mobility measures, so failing to consider support may inflate balance estimates and confound hold-time comparisons.
Additionally, a 10-second maximum hold time limits discrimination of balance in adults with a higher level of function. For community-dwelling older adults, we recommend timing for at least 30 seconds and documenting initial support for consideration when interpreting performance. PMID:22745198

  20. Arcjet thruster research and technology

    NASA Technical Reports Server (NTRS)

    Makel, Darby B.; Cann, Gordon L.

    1988-01-01

    The design, analysis, and performance testing of an advanced low-power arcjet are described. A high-impedance, vortex-stabilized 1-kW class arcjet has been studied. A baseline research thruster has been built and endurance- and performance-tested. This advanced arcjet has demonstrated long-lifetime characteristics, but lower than expected performance. Analysis of the specific design has identified modifications which should improve performance while maintaining the long lifetime shown by the arcjet.

  1. Design analysis and computer-aided performance evaluation of shuttle orbiter electrical power system. Volume 1: Summary

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Studies were conducted to develop appropriate space shuttle electrical power distribution and control (EPDC) subsystem simulation models and to apply the computer simulations to systems analysis of the EPDC. A previously developed software program (SYSTID) was adapted for this purpose. The following objectives were attained: (1) significant enhancement of the SYSTID time domain simulation software, (2) generation of functionally useful shuttle EPDC element models, and (3) illustrative simulation results in the analysis of EPDC performance, under the conditions of fault, current pulse injection due to lightning, and circuit protection sizing and reaction times.

  2. Run-time parallelization and scheduling of loops

    NASA Technical Reports Server (NTRS)

    Saltz, Joel H.; Mirchandaney, Ravi; Crowley, Kay

    1991-01-01

    Run-time methods are studied to automatically parallelize and schedule iterations of a do loop in certain cases where compile-time information is inadequate. The methods presented involve execution time preprocessing of the loop. At compile-time, these methods set up the framework for performing a loop dependency analysis. At run-time, wavefronts of concurrently executable loop iterations are identified. Using this wavefront information, loop iterations are reordered for increased parallelism. Symbolic transformation rules are used to produce: inspector procedures that perform execution time preprocessing, and executors or transformed versions of source code loop structures. These transformed loop structures carry out the calculations planned in the inspector procedures. Performance results are presented from experiments conducted on the Encore Multimax. These results illustrate that run-time reordering of loop indexes can have a significant impact on performance.
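
    The inspector/executor split described above can be sketched as follows. The dependency rule used here (an iteration joins the wavefront after the last earlier iteration that wrote any location it touches) is a deliberate simplification of the paper's dependency analysis, and all names are illustrative.

```python
def inspector(reads, writes):
    """Inspector: assign each loop iteration a wavefront number based on
    which memory locations it reads/writes (simplified dependency rule)."""
    last_writer_wave = {}  # location -> wavefront of its latest writer
    waves = []
    for r, w in zip(reads, writes):
        wave = max([last_writer_wave.get(loc, -1) for loc in r + w] + [-1]) + 1
        waves.append(wave)
        for loc in w:
            last_writer_wave[loc] = wave
    return waves

def executor(waves, body):
    """Executor: run wavefronts in order; iterations sharing a wavefront
    number are mutually independent and could run in parallel."""
    for wave in range(max(waves) + 1):
        for i, w in enumerate(waves):
            if w == wave:
                body(i)
```

    For example, three iterations reading/writing locations [0]→[1], [0]→[2], [1]→[3] yield wavefronts [0, 0, 1]: the first two can run concurrently, while the third must wait for the write to location 1.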

  3. Analysis on burnup step effect for evaluating reactor criticality and fuel breeding ratio

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saputra, Geby; Purnama, Aditya Rizki; Permana, Sidik

    The criticality condition of a reactor is one of the important factors in evaluating reactor operation, and the nuclear fuel breeding ratio is another factor that indicates nuclear fuel sustainability. This study analyzes the effect of the burnup step and the cycle operation step on the evaluated criticality condition of the reactor as well as on the nuclear fuel breeding performance, or breeding ratio (BR). The burnup step is varied on a day basis from 10 days up to 800 days, and the cycle operation from 1 up to 8 reactor operation cycles. In addition, calculation efficiency as a function of the number of computer processors used to run the analysis (time efficiency of the calculation) has been investigated. The reactor design analysis, which used a large fast breeder reactor as the reference case, was performed with the established reactor design code JOINT-FR. The results show that the criticality becomes higher, and the breeding ratio lower, for smaller burnup steps (in days). Some nuclides contribute to better criticality at smaller burnup steps because of their individual half-lives. The calculation time for different burnup steps correlates with the finer subdivision of the steps, although the time consumed is not directly proportional to the number of burnup time steps.

  4. A parallelization scheme of the periodic signals tracking algorithm for isochronous mass spectrometry on GPUs

    NASA Astrophysics Data System (ADS)

    Chen, R. J.; Wang, M.; Yan, X. L.; Yang, Q.; Lam, Y. H.; Yang, L.; Zhang, Y. H.

    2017-12-01

    The periodic signals tracking algorithm has been used to determine the revolution times of ions stored in storage rings in isochronous mass spectrometry (IMS) experiments. It has been a challenge to perform real-time data analysis using the periodic signals tracking algorithm in IMS experiments. In this paper, a parallelization scheme for the periodic signals tracking algorithm is introduced and a new program is developed. The computing time of the data analysis can be reduced by factors of ∼71 and ∼346 by running our new program on Tesla C1060 and Tesla K20c GPUs, respectively, compared to the old program on a Xeon E5540 CPU. We succeed in performing real-time data analysis for the IMS experiments by using the new program on the Tesla K20c GPU.

  5. Quantitative EEG analysis using error reduction ratio-causality test; validation on simulated and real EEG data.

    PubMed

    Sarrigiannis, Ptolemaios G; Zhao, Yifan; Wei, Hua-Liang; Billings, Stephen A; Fotheringham, Jayne; Hadjivassiliou, Marios

    2014-01-01

    To introduce a new method of quantitative EEG analysis in the time domain, the error reduction ratio (ERR)-causality test. To compare performance against cross-correlation and coherence with phase measures. A simulation example was used as a gold standard to assess the performance of ERR-causality, against cross-correlation and coherence. The methods were then applied to real EEG data. Analysis of both simulated and real EEG data demonstrates that ERR-causality successfully detects dynamically evolving changes between two signals, with very high time resolution, dependent on the sampling rate of the data. Our method can properly detect both linear and non-linear effects, encountered during analysis of focal and generalised seizures. We introduce a new quantitative EEG method of analysis. It detects real time levels of synchronisation in the linear and non-linear domains. It computes directionality of information flow with corresponding time lags. This novel dynamic real time EEG signal analysis unveils hidden neural network interactions with a very high time resolution. These interactions cannot be adequately resolved by the traditional methods of coherence and cross-correlation, which provide limited results in the presence of non-linear effects and lack fidelity for changes appearing over small periods of time. Copyright © 2013 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
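
    The error reduction ratio at the heart of the ERR-causality test measures the fraction of the output signal's energy explained by a single candidate regressor. A minimal sketch of that core quantity follows; the full method orthogonalises many lagged candidate terms and tracks time-varying directionality, which is omitted here.

```python
import numpy as np

def err(candidate, y):
    """Error reduction ratio of one candidate regressor against output y:
    the fraction of the output energy explained by that term alone."""
    candidate = np.asarray(candidate, float)
    y = np.asarray(y, float)
    return float(np.dot(candidate, y) ** 2
                 / (np.dot(candidate, candidate) * np.dot(y, y)))
```

    An ERR near 1 means the candidate (e.g., a lagged sample of the other EEG channel) explains the output almost completely; near 0, it contributes nothing.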

  6. Class start times, sleep, and academic performance in college: a path analysis.

    PubMed

    Onyper, Serge V; Thacher, Pamela V; Gilbert, Jack W; Gradess, Samuel G

    2012-04-01

    Path analysis was used to examine the relationship between class start times, sleep, circadian preference, and academic performance in college-aged adults. Consistent with observations in middle and high school students, college students with later class start times slept longer, experienced less daytime sleepiness, and were less likely to miss class. Chronotype was an important moderator of sleep schedules and daytime functioning; those with morning preference went to bed and woke up earlier and functioned better throughout the day. The benefits of taking later classes did not extend to academic performance, however; grades were somewhat lower in students with predominantly late class schedules. Furthermore, students taking later classes were at greater risk for increased alcohol consumption, and among all the factors affecting academic performance, alcohol misuse exerted the strongest effect. Thus, these results indicate that later class start times in college, while allowing for more sleep, also increase the likelihood of alcohol misuse, ultimately impeding academic success.

  7. Tracking performance under time sharing conditions with a digit processing task: A feedback control theory analysis. [attention sharing effect on operator performance

    NASA Technical Reports Server (NTRS)

    Gopher, D.; Wickens, C. D.

    1975-01-01

    A one-dimensional compensatory tracking task and a digit-processing reaction time task were combined in a three-phase experiment designed to investigate tracking performance under time sharing. Adaptive techniques, elaborate feedback devices, and on-line standardization procedures were used to adjust task difficulty to the ability of each individual subject and to manipulate time-sharing demands. Feedback control analysis techniques were employed in the description of tracking performance. The experimental results show that when the dynamics of a system are constrained such that man-machine system stability is no longer a major concern of the operator, he tends to adopt a first-order control describing function, even with tracking systems of higher order. Attention diversion to a concurrent task leads to an increase in remnant level, or nonlinear power. This decrease in linearity is reflected both in the output magnitude spectra of the subjects and in the linear fit of the amplitude ratio functions.

  8. Impact of sentinel lymph node biopsy on immediate breast reconstruction after mastectomy.

    PubMed

    Wood, Benjamin C; David, Lisa R; Defranzo, Anthony J; Stewart, John H; Shen, Perry; Geisinger, Kim R; Marks, Malcolm W; Levine, Edward A

    2009-07-01

    Traditionally, sentinel lymph node biopsy (SLNB) is performed at the time of mastectomy and reconstruction. However, several groups have advocated SLNB as a separate outpatient procedure before mastectomy, when immediate reconstruction is planned, to allow for complete pathologic evaluation. The purpose of this study was to determine the impact of intraoperative analysis of SLNB on the reconstructive plan when performed at the same time as definitive surgery. A retrospective review was conducted of all mastectomy cases performed at a single institution between September 1998 and November 2007. Of the 747 mastectomy cases reviewed, SLNB was conducted in 344 cases, with immediate breast reconstruction in 193 of those cases. There were 27 (7.8%) false negative and three (0.9%) false positive intraoperative analyses of SLNB. Touch preparation analysis from the SLNB changed the reconstructive plan in four (2.1%) cases. In our experience, SLNB can be performed at the time of mastectomy with minimal impact on the reconstructive plan. A staged approach incurs significant additional expense, delays the initiation of systemic therapy, and increases the propensity for procedure-related morbidity; therefore, SLNB should not be performed as a separate procedure before definitive surgery with immediate breast reconstruction.

  9. Documentation and user's guide for interactive spectral analysis and filter program package useful in the processing of seismic reflection data

    USGS Publications Warehouse

    Miller, J.J.

    1982-01-01

    The spectral analysis and filter program package is written in the BASIC language for the HP-9845T desktop computer. The program's main purpose is to perform spectral analyses on digitized time-domain data. In addition, band-pass filtering of the data can be performed in the time domain. Various other processes, such as autocorrelation, can be applied to the time-domain data in order to precondition them for spectral analysis. The frequency-domain data can also be transformed back into the time domain if desired. Any data can be displayed on the CRT in graphic form using a variety of plot routines, and a hard copy can be obtained immediately using the internal thermal printer. Data can also be displayed in tabular form on the CRT or internal thermal printer, or stored permanently on a mass storage device such as tape or disk. A list of the processes performed, in the order in which they occurred, can be displayed at any time.
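
    The package's central operations (spectrum of a digitized time series, band-pass filtering, and transforming back to the time domain) can be sketched in modern terms. Note this sketch filters by zeroing frequency-domain bins, an idealised zero-phase filter, whereas the original program filtered directly in the time domain.

```python
import numpy as np

def power_spectrum(x, dt):
    """Single-sided amplitude spectrum of a real series sampled every dt s."""
    freqs = np.fft.rfftfreq(len(x), d=dt)
    amps = np.abs(np.fft.rfft(x)) * 2.0 / len(x)
    return freqs, amps

def band_pass(x, dt, f_lo, f_hi):
    """Keep only components with f_lo <= f <= f_hi, then invert the FFT."""
    spec = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=dt)
    spec[(freqs < f_lo) | (freqs > f_hi)] = 0.0
    return np.fft.irfft(spec, n=len(x))
```

    For a 1 s record sampled at 1 kHz containing 5 Hz and 50 Hz tones, a 40-60 Hz band-pass returns just the 50 Hz component.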

  10. Quantitative Analysis of Color Differences within High Contrast, Low Power Reversible Electrophoretic Displays

    DOE PAGES

    Giera, Brian; Bukosky, Scott; Lee, Elaine; ...

    2018-01-23

    Here, quantitative color analysis is performed on videos of high contrast, low power reversible electrophoretic deposition (EPD)-based displays operated under different applied voltages. This analysis is coded in an open-source software, relies on a color differentiation metric, ΔE * 00, derived from digital video, and provides an intuitive relationship between the operating conditions of the devices and their performance. Time-dependent ΔE * 00 color analysis reveals color relaxation behavior, recoverability for different voltage sequences, and operating conditions that can lead to optimal performance.
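
    Frame-by-frame colour differencing of the kind described can be sketched as below. The paper uses the CIEDE2000 (ΔE*00) formula; this simplified stand-in uses the older Euclidean CIE76 distance on CIELAB triples, which captures the same time-dependent relaxation idea with far less code.

```python
import math

def delta_e_76(lab1, lab2):
    """Euclidean CIELAB colour difference (CIE76), a simplified stand-in
    for the CIEDE2000 metric used in the study."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

def color_relaxation(lab_frames):
    """Colour difference of every video frame relative to the first one,
    giving a time series that reveals relaxation and recoverability."""
    return [delta_e_76(lab_frames[0], frame) for frame in lab_frames]
```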

  12. Positioning performance analysis of the time sum of arrival algorithm with error features

    NASA Astrophysics Data System (ADS)

    Gong, Feng-xun; Ma, Yan-qiu

    2018-03-01

    The theoretical positioning accuracy of multilateration (MLAT) with the time difference of arrival (TDOA) algorithm is very high. However, there are some problems in practical applications. Here we analyze the localization performance of the time sum of arrival (TSOA) algorithm in terms of the root mean square error (RMSE) and the geometric dilution of precision (GDOP) in an additive white Gaussian noise (AWGN) environment. The TSOA localization model is constructed, and the distribution of the location ambiguity region is presented for 4 base stations. The location performance analysis then starts from the 4-base-station case, calculating the variation of RMSE and GDOP. Subsequently, as the location parameters are changed (number of base stations, base station layout, and so on), the changing performance patterns of the TSOA location algorithm are shown, revealing the TSOA location characteristics and performance. The trends of the RMSE and GDOP demonstrate the anti-noise performance and robustness of the TSOA localization algorithm. This anti-noise performance can be used to reduce the blind zone and the false location rate of MLAT systems.
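
    The GDOP figure used in the analysis can be sketched for a 2-D station layout. Building the geometry matrix from unit line-of-sight vectors plus a clock column and taking sqrt(trace((H'H)^-1)) is the textbook construction; it is an assumption here, not necessarily the paper's exact formulation.

```python
import numpy as np

def gdop(stations, target):
    """Geometric dilution of precision for a 2-D station layout around a
    target, with a clock-bias column (textbook TDOA/TSOA-style model)."""
    rows = []
    for s in np.asarray(stations, float):
        d = s - np.asarray(target, float)
        u = d / np.linalg.norm(d)        # unit line-of-sight vector
        rows.append([u[0], u[1], 1.0])   # last column: clock bias
    H = np.array(rows)
    return float(np.sqrt(np.trace(np.linalg.inv(H.T @ H))))
```

    A symmetric ring of stations around the target gives low GDOP; near-collinear layouts make H'H ill-conditioned and GDOP blows up, which matches the blind-zone behaviour discussed above.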

  13. Machine learning of swimming data via wisdom of crowd and regression analysis.

    PubMed

    Xie, Jiang; Xu, Junfu; Nie, Celine; Nie, Qing

    2017-04-01

    Every performance, in an officially sanctioned meet, by a registered USA swimmer is recorded into an online database with times dating back to 1980. For the first time, statistical analysis and machine learning methods are systematically applied to 4,022,631 swim records. In this study, we investigate performance features for all strokes as a function of age and gender. The variances in performance of males and females for different ages and strokes were studied, and the correlations of performances for different ages were estimated using the Pearson correlation. Regression analysis shows the performance trends for both males and females at different ages and suggests critical ages for peak training. Moreover, we assess twelve popular machine learning methods to predict or classify swimmer performance. Each method exhibited different strengths or weaknesses in different cases, indicating no one method could predict well for all strokes. To address this problem, we propose a new method that combines multiple inference methods to derive a Wisdom of Crowd Classifier (WoCC). Our simulation experiments demonstrate that the WoCC is a consistent method with better overall prediction accuracy. Our study reveals several new age-dependent trends in swimming and provides an accurate method for classifying and predicting swimming times.
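
    One plausible reading of the Wisdom of Crowd Classifier is a per-sample majority vote over the individual methods' predictions; the paper's actual combination rule may be more elaborate, so treat this as an illustrative sketch.

```python
from collections import Counter

def wisdom_of_crowd(predictions):
    """Combine several classifiers' label predictions (one list per
    classifier, aligned by sample) with a simple majority vote."""
    combined = []
    for votes in zip(*predictions):
        combined.append(Counter(votes).most_common(1)[0][0])
    return combined
```

    Even when each individual classifier errs on some samples, the vote can recover the majority-supported label for every sample.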

  14. Mendelian randomization analysis of a time-varying exposure for binary disease outcomes using functional data analysis methods.

    PubMed

    Cao, Ying; Rajan, Suja S; Wei, Peng

    2016-12-01

    A Mendelian randomization (MR) analysis is performed to analyze the causal effect of an exposure variable on a disease outcome in observational studies, by using genetic variants that affect the disease outcome only through the exposure variable. This method has recently gained popularity among epidemiologists given the success of genetic association studies. Many exposure variables of interest in epidemiological studies are time varying, for example, body mass index (BMI). Although longitudinal data have been collected in many cohort studies, current MR studies only use one measurement of a time-varying exposure variable, which cannot adequately capture the long-term time-varying information. We propose using the functional principal component analysis method to recover the underlying individual trajectory of the time-varying exposure from the sparsely and irregularly observed longitudinal data, and then conduct MR analysis using the recovered curves. We further propose two MR analysis methods. The first assumes a cumulative effect of the time-varying exposure variable on the disease risk, while the second assumes a time-varying genetic effect and employs functional regression models. We focus on statistical testing for a causal effect. Our simulation studies mimicking the real data show that the proposed functional data analysis based methods incorporating longitudinal data have substantial power gains compared to standard MR analysis using only one measurement. We used the Framingham Heart Study data to demonstrate the promising performance of the new methods as well as inconsistent results produced by the standard MR analysis that relies on a single measurement of the exposure at some arbitrary time point. © 2016 WILEY PERIODICALS, INC.

  15. Performance evaluation using SYSTID time domain simulation. [computer-aid design and analysis for communication systems

    NASA Technical Reports Server (NTRS)

    Tranter, W. H.; Ziemer, R. E.; Fashano, M. J.

    1975-01-01

    This paper reviews the SYSTID technique for performance evaluation of communication systems using time-domain computer simulation. An example program illustrates the language. The inclusion of both Gaussian and impulse noise models makes accurate simulation possible in a wide variety of environments. A very flexible postprocessor makes accurate and efficient performance evaluation possible.

  16. General purpose pulse shape analysis for fast scintillators implemented in digital readout electronics

    NASA Astrophysics Data System (ADS)

    Asztalos, Stephen J.; Hennig, Wolfgang; Warburton, William K.

    2016-01-01

    Pulse shape discrimination applied to certain fast scintillators is usually performed offline. In sufficiently high-event rate environments data transfer and storage become problematic, which suggests a different analysis approach. In response, we have implemented a general purpose pulse shape analysis algorithm in the XIA Pixie-500 and Pixie-500 Express digital spectrometers. In this implementation waveforms are processed in real time, reducing the pulse characteristics to a few pulse shape analysis parameters and eliminating time-consuming waveform transfer and storage. We discuss implementation of these features, their advantages, necessary trade-offs and performance. Measurements from bench top and experimental setups using fast scintillators and XIA processors are presented.
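
    A classic pulse shape analysis statistic for fast scintillators is the charge-comparison (tail-to-total) ratio, which separates particle types by how much of the pulse's charge arrives late. The sketch below illustrates the general idea only; the algorithm actually implemented in the Pixie-500 firmware may differ.

```python
def psd_parameter(waveform, start, short_gate, long_gate):
    """Tail-to-total charge ratio: integrate the pulse over a short
    (prompt) gate and a long (total) gate from the pulse start, and
    return the late fraction. Gates are in samples."""
    total = sum(waveform[start:start + long_gate])
    prompt = sum(waveform[start:start + short_gate])
    return (total - prompt) / total
```

    Reducing each waveform to one such scalar on-board is exactly what removes the need to transfer and store full traces at high event rates.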

  17. Motor Phenotype of Decline in Cognitive Performance among Community-Dwellers without Dementia: Population-Based Study and Meta-Analysis

    PubMed Central

    Beauchet, Olivier; Allali, Gilles; Montero-Odasso, Manuel; Sejdić, Ervin; Fantino, Bruno; Annweiler, Cédric

    2014-01-01

    Background Decline in cognitive performance is associated with gait deterioration. Our objectives were: 1) to determine, from an original study in older community-dwellers without a diagnosis of dementia, which gait parameters, among slower gait speed, higher stride time variability (STV) and Timed Up & Go test (TUG) delta time, were most strongly associated with lower performance in two cognitive domains (i.e., episodic memory and executive function); and 2) to quantitatively synthesize, with a systematic review and meta-analysis, the association between gait performance and cognitive decline (i.e., mild cognitive impairment (MCI) and dementia). Methods Based on a cross-sectional design, 934 older community-dwellers without dementia (mean±standard deviation, 70.3±4.9 years; 52.1% female) were recruited. A score of 5 on the Short Mini-Mental State Examination defined low episodic memory performance. Low executive performance was defined by clock-drawing test errors. STV and gait speed were measured using the GAITRite system. TUG delta time was calculated as the difference between the times needed to perform and to imagine the TUG. Then, a systematic Medline search was conducted in November 2013 using the Medical Subject Heading terms “Delirium,” “Dementia,” “Amnestic,” “Cognitive disorders” combined with “Gait” OR “Gait disorders, Neurologic” and “Variability.” Findings A total of 294 (31.5%) participants presented a decline in cognitive performance. Higher STV, higher TUG delta time, and slower gait speed were associated with decline in episodic memory and executive performance (all P-values <0.001). The highest magnitude of association was found for higher STV (effect size = −0.74 [95% Confidence Interval (CI): −1.05; −0.43], among participants combining decline in episodic memory and executive performance).
The meta-analysis underscored that higher STV represented a gait biomarker in patients with MCI (effect size = 0.48 [95% CI: 0.30; 0.65]) and dementia (effect size = 1.06 [95% CI: 0.40; 1.72]). Conclusion Higher STV appears to be a motor phenotype of cognitive decline. PMID:24911155

  18. Productivity improvement through cycle time analysis

    NASA Astrophysics Data System (ADS)

    Bonal, Javier; Rios, Luis; Ortega, Carlos; Aparicio, Santiago; Fernandez, Manuel; Rosendo, Maria; Sanchez, Alejandro; Malvar, Sergio

    1996-09-01

    A cycle time (CT) reduction methodology has been developed at the Lucent Technologies facility (former AT&T) in Madrid, Spain. It is based on comparing the contribution of each process step in each technology with a target generated by a cycle time model. These target cycle times are obtained using capacity data for the machines processing those steps, queuing theory, and theory of constraints (TOC) principles (buffers to protect the bottleneck and low cycle time/inventory everywhere else). Overall equipment efficiency (OEE)-like analysis is done on the machine groups with major differences between their target cycle times and their real values. Comparisons between the current values of the parameters that govern their capacity (process times, availability, idles, reworks, etc.) and the engineering standards are made to detect the causes of the excess contribution to cycle time. Several friendly, graphical tools have been developed to track and analyze those capacity parameters. Two tools have proved especially important: ASAP (analysis of scheduling, arrivals and performance) and Performer, which analyzes interrelation problems among machines, procedures, and direct labor. Performer is designed for a detailed, daily analysis of an isolated machine. The extensive use of this tool by the whole labor force has produced impressive results in eliminating multiple small inefficiencies, with direct positive implications for OEE. As for ASAP, it shows the lots in process/queue for different machines at the same time. ASAP is a powerful tool for analyzing product flow management and assigned capacity for interdependent operations such as cleaning and oxidation/diffusion. Additional tools have been developed to track, analyze, and improve process times and availability.

  19. Predictive factors in patients with hepatocellular carcinoma receiving sorafenib therapy using time-dependent receiver operating characteristic analysis.

    PubMed

    Nishikawa, Hiroki; Nishijima, Norihiro; Enomoto, Hirayuki; Sakamoto, Azusa; Nasu, Akihiro; Komekado, Hideyuki; Nishimura, Takashi; Kita, Ryuichi; Kimura, Toru; Iijima, Hiroko; Nishiguchi, Shuhei; Osaki, Yukio

    2017-01-01

    To investigate the influence of pre-treatment variables on clinical outcomes in hepatocellular carcinoma (HCC) patients receiving sorafenib, and to further assess and compare the predictive performance of continuous parameters using time-dependent receiver operating characteristic (ROC) analysis. A total of 225 HCC patients were analyzed. We retrospectively examined factors related to overall survival (OS) and progression-free survival (PFS) using univariate and multivariate analyses. Subsequently, we performed time-dependent ROC analysis of the continuous parameters that were significant in the multivariate analyses of OS and PFS. The total sum of the areas under the ROC curves over all time points (defined as the TAAT score) was calculated in each case. Our cohort included 175 male and 50 female patients (median age, 72 years), with 158 Child-Pugh A and 67 Child-Pugh B patients. The median OS was 0.68 years, and the median PFS was 0.24 years. On multivariate analysis, gender, body mass index (BMI), Child-Pugh classification, extrahepatic metastases, tumor burden, aspartate aminotransferase (AST) and alpha-fetoprotein (AFP) were identified as significant predictors of OS, and ECOG performance status, Child-Pugh classification and extrahepatic metastases were identified as significant predictors of PFS. Among the three continuous variables (i.e., BMI, AST and AFP), AFP had the highest TAAT score for the entire cohort; in subgroup analyses, AFP had the highest TAAT score except in the Child-Pugh B and female subgroups. Among the continuous variables, AFP could have the highest predictive accuracy for survival in HCC patients undergoing sorafenib therapy.
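
    The TAAT score described above sums time-dependent AUCs over a grid of time points. A simplified sketch follows; it ignores censoring, which a real time-dependent ROC estimator for survival data must handle, so it is only an illustration of the scoring idea.

```python
def auc(scores, labels):
    """Rank-based AUC: probability that a positive case outranks a
    negative one, counting ties as half."""
    pos = [s for s, l in zip(scores, labels) if l]
    neg = [s for s, l in zip(scores, labels) if not l]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def taat_score(marker, event_times, horizons):
    """Sum the AUC of a baseline marker over several time horizons;
    at horizon t, 'cases' are patients with an event by time t
    (censoring is ignored in this sketch)."""
    return sum(auc(marker, [e <= t for e in event_times]) for t in horizons)
```

    A marker that perfectly orders patients by event time attains the maximum TAAT score (the number of horizons), which is the sense in which a higher TAAT score means better predictive accuracy.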

  20. Simulation for analysis and control of superplastic forming. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zacharia, T.; Aramayo, G.A.; Simunovic, S.

    1996-08-01

    A joint study was conducted by Oak Ridge National Laboratory (ORNL) and the Pacific Northwest Laboratory (PNL) for the U.S. Department of Energy Lightweight Materials (DOE-LWM) Program. The purpose of the study was to assess and benchmark current modeling capabilities with respect to accuracy of predictions and simulation time. Two simulation platforms were considered in this study: the LS-DYNA3D code installed on ORNL's high-performance computers and the finite element code MARC used at PNL. Both ORNL and PNL performed superplastic forming (SPF) analysis on a standard butter-tray geometry, defined by PNL, to better understand the capabilities of the respective models. The specific geometry was selected and formed at PNL, and the experimental results, such as forming time and thickness at specific locations, were provided for comparison with the numerical predictions. Furthermore, comparisons were performed between the ORNL simulation results, using elasto-plastic analysis, and PNL's results, using rigid-plastic flow analysis.

  1. Ultra-high-performance supercritical fluid chromatography with quadrupole-time-of-flight mass spectrometry (UHPSFC/QTOF-MS) for analysis of lignin-derived monomeric compounds in processed lignin samples.

    PubMed

    Prothmann, Jens; Sun, Mingzhe; Spégel, Peter; Sandahl, Margareta; Turner, Charlotta

    2017-12-01

    The conversion of lignin to potentially high-value low-molecular-weight compounds often results in complex mixtures of monomeric and oligomeric compounds. In this study, a method for the quantitative and qualitative analysis of 40 lignin-derived compounds using ultra-high-performance supercritical fluid chromatography coupled to quadrupole time-of-flight mass spectrometry (UHPSFC/QTOF-MS) has been developed. Seven different columns were explored for maximum selectivity. The makeup solvent composition and ion source settings were optimised using a D-optimal design of experiments (DoE). Differently processed lignin samples were analysed and used for the method validation. The new UHPSFC/QTOF-MS method showed good separation of the 40 compounds within a retention time of only 6 min, and 36 of these showed high ionisation efficiency in negative electrospray ionisation mode. Graphical abstract: A rapid and selective method for the quantitative and qualitative analysis of 40 lignin-derived compounds using ultra-high-performance supercritical fluid chromatography coupled to quadrupole time-of-flight mass spectrometry (UHPSFC/QTOF-MS).

  2. Display format, highlight validity, and highlight method: Their effects on search performance

    NASA Technical Reports Server (NTRS)

    Donner, Kimberly A.; Mckay, Tim D.; Obrien, Kevin M.; Rudisill, Marianne

    1991-01-01

    Display format and highlight validity have been shown to affect visual display search performance; however, previous studies used small, artificial displays of alphanumeric stimuli. A study manipulating these variables was conducted using realistic, complex Space Shuttle information displays. A 2x2x3 within-subjects analysis of variance found that search times were faster for items in reformatted displays than in current displays. Responses to validly highlighted items were significantly faster than responses to non-highlighted or invalidly highlighted items. The significant format by highlight validity interaction showed that there was little difference in response time between current and reformatted displays when highlighting was valid; however, under the non-highlighted or invalid highlight conditions, search times were faster with reformatted displays. A separate within-subjects analysis of variance of display format, highlight validity, and several highlight methods did not reveal a main effect of highlight method. In addition, observed display search times were compared to search times predicted by Tullis' Display Analysis Program. The benefits of highlighting and reformatting displays to enhance search, and the necessity of considering highlight validity and format characteristics in tandem when predicting search performance, are discussed.

  3. A Proposed Study Program for the Enhancement of Performance of Clocks in the DCS Timing system.

    DTIC Science & Technology

    1982-08-31

    INTRODUCTION This technical note presents a proposed program of test and analysis with a goal of using prediction techniques to enhance the...future digital DCS and methods of satisfying them so that the reader will understand: (1) what is needed from this program to enhance the performance...over a number of years, using both simulation and analysis , have recommended that all major nodes of the DCS be referenced to Coordinated Universal Time

  4. Initial Design and Construction of a Mobile Regenerative Fuel Cell System

    NASA Technical Reports Server (NTRS)

    Colozza, Anthony J.; Maloney, Thomas; Hoberecht, Mark (Technical Monitor)

    2003-01-01

    The design and initial construction of a mobile regenerative power system are described. The main components of the power system consist of a photovoltaic array, a regenerative fuel cell, and an electrolyzer. The system is mounted on a modified landscape trailer and is completely self-contained. An operational analysis is also presented that shows predicted performance for the system at various times of the year. The operational analysis consists of performing an energy balance on the system based on array output and total desired operational time.
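The energy-balance idea can be sketched roughly as follows; all figures (array output, load power, round-trip efficiency) are hypothetical illustrations, not values from the study.

```python
# Daily energy balance for a solar-powered regenerative fuel cell system.
# All figures (array output, load, round-trip efficiency) are hypothetical.

def daily_energy_balance(array_output_wh, load_w, op_hours, round_trip_eff):
    """Return the day's energy surplus (+) or deficit (-) in watt-hours.

    Load served directly by the array carries no conversion loss; any
    surplus routed through the electrolyzer/fuel cell loop is discounted
    by the round-trip efficiency before it can serve future load.
    """
    demand_wh = load_w * op_hours
    direct_wh = min(array_output_wh, demand_wh)   # load met directly by array
    shortfall_wh = demand_wh - direct_wh          # must be met from storage
    surplus_wh = array_output_wh - direct_wh      # available for storage
    return surplus_wh * round_trip_eff - shortfall_wh

# Example: 4 kWh/day array output, a 300 W load run for 10 h, 45% round trip
balance = daily_energy_balance(4000, 300, 10, 0.45)
```

A positive balance indicates the array can both carry the load and bank deliverable energy; a negative balance indicates the stated operational time cannot be sustained.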

  5. Segmenting the Stream of Consciousness: The Psychological Correlates of Temporal Structures in the Time Series Data of a Continuous Performance Task

    ERIC Educational Resources Information Center

    Smallwood, Jonathan; McSpadden, Merrill; Luus, Bryan; Schooler, Jonathan

    2008-01-01

    Using principal component analysis, we examined whether structural properties in the time series of response time would identify different mental states during a continuous performance task. We examined whether it was possible to identify regular patterns which were present in blocks classified as lacking controlled processing, either…

  6. The diagnostic performance of dental maturity for identification of the circumpubertal growth phases: a meta-analysis.

    PubMed

    Perinetti, Giuseppe; Westphalen, Graziela H; Biasotto, Matteo; Salgarello, Stefano; Contardo, Luca

    2013-05-23

    The present meta-analysis initially evaluates the reliability of dental maturation in the identification of the circumpubertal growth phases, essentially for determining treatment timing in orthodontics. A literature survey was performed using the Medline, LILACS and SciELO databases, and the Cochrane Library (2000 to 2011). Studies of the correlation between dental and cervical vertebral maturation methods were considered. The mandibular canine, the first and second premolars, and the second molar were investigated. After the selection, six articles qualified for the final analysis. The overall correlation coefficients were all significant, ranging from 0.57 to 0.73. Five of these studies suggested the use of dental maturation as an indicator of the growth phase. However, the diagnostic performance analysis uncovered limited reliability only for the identification of the pre-pubertal growth phase. The determination of dental maturity for the assessment of treatment timing in orthodontics is not recommended.

  7. Ultra‐high performance supercritical fluid chromatography of lignin‐derived phenols from alkaline cupric oxide oxidation

    PubMed Central

    Sun, Mingzhe; Lidén, Gunnar; Sandahl, Margareta

    2016-01-01

    Traditional chromatographic methods for the analysis of lignin‐derived phenolic compounds in environmental samples are generally time consuming. In this work, an ultra‐high performance supercritical fluid chromatography method with a diode array detector for the analysis of major lignin‐derived phenolic compounds produced by alkaline cupric oxide oxidation was developed. In an analysis of a collection of 11 representative monomeric lignin phenolic compounds, all compounds were clearly separated within 6 min with excellent peak shapes, with a limit of detection of 0.5–2.5 μM, a limit of quantification of 2.5–5.0 μM, and a dynamic range of 5.0–2.0 mM (R² > 0.997). The new ultra‐high performance supercritical fluid chromatography method was also applied for the qualitative and quantitative analysis of lignin‐derived phenolic compounds obtained upon alkaline cupric oxide oxidation of a commercial humic acid. Ten out of the previous eleven model compounds could be quantified in the oxidized humic acid sample. The high separation power and short analysis time obtained demonstrate for the first time that supercritical fluid chromatography is a fast and reliable technique for the analysis of lignin‐derived phenols in complex environmental samples. PMID:27452148

  8. Analysis of statistical and standard algorithms for detecting muscle onset with surface electromyography.

    PubMed

    Tenan, Matthew S; Tweedell, Andrew J; Haynes, Courtney A

    2017-01-01

    The timing of muscle activity is a commonly applied analytic method to understand how the nervous system controls movement. This study systematically evaluates six classes of standard and statistical algorithms to determine muscle onset in both experimental surface electromyography (EMG) and simulated EMG with a known onset time. Eighteen participants had EMG collected from the biceps brachii and vastus lateralis while performing a biceps curl or knee extension, respectively. Three established methods and three statistical methods for EMG onset were evaluated. Linear envelope, Teager-Kaiser energy operator + linear envelope and sample entropy were the established methods evaluated while general time series mean/variance, sequential and batch processing of parametric and nonparametric tools, and Bayesian changepoint analysis were the statistical techniques used. Visual EMG onset (experimental data) and objective EMG onset (simulated data) were compared with algorithmic EMG onset via root mean square error and linear regression models for stepwise elimination of inferior algorithms. The top algorithms for both data types were analyzed for their mean agreement with the gold standard onset and evaluation of 95% confidence intervals. The top algorithms were all Bayesian changepoint analysis iterations where the parameter of the prior (p0) was zero. The best performing Bayesian algorithms were p0 = 0 and a posterior probability for onset determination at 60-90%. While existing algorithms performed reasonably, the Bayesian changepoint analysis methodology provides greater reliability and accuracy when determining the singular onset of EMG activity in a time series. Further research is needed to determine if this class of algorithms perform equally well when the time series has multiple bursts of muscle activity.
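As a rough illustration of the simplest class of algorithm evaluated here, the sketch below detects onset by thresholding a rectified, smoothed signal (a basic linear-envelope approach). The parameters and the synthetic trial are invented; this is not the study's Bayesian changepoint implementation.

```python
# Threshold-on-envelope EMG onset detection (a simple baseline approach,
# not the study's Bayesian changepoint method). Parameters are illustrative.
import numpy as np

def envelope_onset(emg, fs, baseline_s=0.5, k=5.0, win_s=0.05):
    """Return the onset sample index, or None if no onset is detected.

    The envelope is a moving average of the rectified signal; onset is the
    first sample whose envelope exceeds baseline mean + k * baseline SD.
    """
    rect = np.abs(emg)
    win = max(1, int(win_s * fs))
    env = np.convolve(rect, np.ones(win) / win, mode="same")
    nb = int(baseline_s * fs)
    thr = env[:nb].mean() + k * env[:nb].std()
    above = np.nonzero(env > thr)[0]
    return int(above[0]) if above.size else None

# Synthetic trial: quiet baseline for 1 s, then a burst of "muscle activity"
fs = 1000
rng = np.random.default_rng(0)
sig = rng.normal(0, 0.05, 2 * fs)
sig[fs:] = rng.normal(0, 1.0, fs)
onset = envelope_onset(sig, fs)   # expected near sample 1000
```

The smoothing window delays and blurs the detected onset, which is precisely the kind of bias the study quantifies against simulated signals with a known onset time.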

  9. Neural correlates of belief-bias reasoning under time pressure: a near-infrared spectroscopy study.

    PubMed

    Tsujii, Takeo; Watanabe, Shigeru

    2010-04-15

    The dual-process theory of reasoning explains the belief-bias effect, the tendency for human reasoning to be erroneously biased when logical conclusions are incongruent with beliefs about the world, by proposing a belief-based fast heuristic system and a logic-based slow analytic system. Although the claims were supported by behavioral findings that the belief-bias effect was enhanced when subjects were not given sufficient time for reasoning, the neural correlates were still unknown. The present study therefore examined the relationship between the time-pressure effect and activity in the inferior frontal cortex (IFC) during belief-bias reasoning using near-infrared spectroscopy (NIRS). Forty-eight subjects performed congruent and incongruent reasoning tasks, involving long-span (20 s) and short-span trials (10 s). Behavioral analysis found that only incongruent reasoning performance was impaired by the time-pressure of short-span trials. NIRS analysis found that the time-pressure decreased right IFC activity during incongruent trials. Correlation analysis showed that subjects with enhanced right IFC activity performed better in incongruent trials, while subjects whose right IFC activity was impaired by the time-pressure could not maintain better reasoning performance. These findings suggest that the right IFC may be responsible for the time-pressure effect in conflicting reasoning processes. When right IFC activity was impaired in the short-span trials, in which subjects were not given sufficient time for reasoning, the subjects may have relied on the fast heuristic system, resulting in belief-biased responses. We therefore offer the first demonstration of the neural correlates of the time-pressure effect on IFC activity in belief-bias reasoning. Copyright 2009 Elsevier Inc. All rights reserved.

  10. Model-Based Linkage Analysis of a Quantitative Trait.

    PubMed

    Song, Yeunjoo E; Song, Sunah; Schnell, Audrey H

    2017-01-01

    Linkage analysis is a family-based method of analysis to examine whether any typed genetic markers cosegregate with a given trait, in this case a quantitative trait. If linkage exists, this is taken as evidence in support of a genetic basis for the trait. Historically, linkage analysis was performed using a binary disease trait, but it has been extended to include quantitative disease measures. Quantitative traits are desirable as they provide more information than binary traits. Linkage analysis can be performed using single-marker methods (one marker at a time) or multipoint methods (using multiple markers simultaneously). In model-based linkage analysis the genetic model for the trait of interest is specified. There are many software options for performing linkage analysis. Here, we use the program package Statistical Analysis for Genetic Epidemiology (S.A.G.E.). S.A.G.E. was chosen because it also includes programs to perform data cleaning procedures and to generate and test genetic models for a quantitative trait, in addition to performing linkage analysis. We demonstrate in detail the process of running the program LODLINK to perform single-marker analysis, and MLOD to perform multipoint analysis using output from SEGREG, where SEGREG was used to determine the best fitting statistical model for the trait.

  11. Initial Data Analysis Results for ATD-2 ISAS HITL Simulation

    NASA Technical Reports Server (NTRS)

    Lee, Hanbong

    2017-01-01

    To evaluate the operational procedures and information requirements for the core functional capabilities of the ATD-2 project, such as the tactical surface metering tool, the APREQ-CFR procedure, and data element exchanges between ramp and tower, human-in-the-loop (HITL) simulations were performed in March 2017. This presentation shows the initial data analysis results from the HITL simulations. With respect to the different runway configurations and metering values in the tactical surface scheduler, various airport performance metrics were analyzed and compared. These metrics include gate holding time, taxi-out time, runway throughput, queue size and wait time in queue, and TMI flight compliance. In addition to the metering value, other factors affecting airport performance in the HITL simulation, including run duration, runway changes, and TMI constraints, are also discussed.

  12. Structural performance analysis and redesign

    NASA Technical Reports Server (NTRS)

    Whetstone, W. D.

    1978-01-01

    Program performs stress, buckling, and vibrational analysis of large, linear, finite-element systems in excess of 50,000 degrees of freedom. Cost, execution time, and storage requirements are kept reasonable through the use of sparse matrix solution techniques and other computational and data management procedures designed for problems of very large size.

  13. Probabilistic risk assessment for a loss of coolant accident in McMaster Nuclear Reactor and application of reliability physics model for modeling human reliability

    NASA Astrophysics Data System (ADS)

    Ha, Taesung

    A probabilistic risk assessment (PRA) was conducted for a loss of coolant accident (LOCA) in the McMaster Nuclear Reactor (MNR). A level 1 PRA was completed including event sequence modeling, system modeling, and quantification. To support the quantification of the accident sequences identified, data analysis using the Bayesian method and human reliability analysis (HRA) using the accident sequence evaluation procedure (ASEP) approach were performed. Since human performance in research reactors differs significantly from that in power reactors, a time-oriented HRA model (reliability physics model) was applied for the human error probability (HEP) estimation of the core relocation. This model is based on two competing random variables: phenomenological time and performance time. The response surface method and direct Monte Carlo simulation with Latin hypercube sampling were applied to estimate the phenomenological time, whereas the performance time was obtained from interviews with operators. An appropriate probability distribution for the phenomenological time was assigned by statistical goodness-of-fit tests. The HEP for the core relocation was estimated from these two competing quantities, and the sensitivity of each probability distribution in the human reliability estimate was investigated. In order to quantify the uncertainty in the predicted HEPs, a Bayesian approach was selected due to its capability of incorporating uncertainties in the model itself and in the parameters of that model. The HEP from the current time-oriented model was compared with that from the ASEP approach, and both results were used to evaluate the sensitivity of alternative human reliability modeling for the manual core relocation in the LOCA risk model. This exercise demonstrated the applicability of a reliability physics model supplemented with a Bayesian approach for modeling human reliability, and its potential usefulness for quantifying model uncertainty as a sensitivity analysis in the PRA model.
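The competing-random-variables idea can be sketched with a small Monte Carlo estimate: the HEP is the probability that the operators' performance time exceeds the phenomenological time available. The distributions and parameters below are hypothetical illustrations, not the study's fitted values.

```python
# Monte Carlo sketch: HEP = P(performance time > phenomenological time).
# The lognormal/normal choices and their parameters are hypothetical,
# not the distributions fitted in the study.
import random

def estimate_hep(n_trials=100_000, seed=42):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_trials):
        # Phenomenological time: minutes until core damage is unavoidable
        phenom = rng.lognormvariate(3.4, 0.3)      # median ~30 min
        # Performance time: minutes the operators need to relocate the core
        perform = rng.normalvariate(20.0, 5.0)
        if perform > phenom:
            failures += 1
    return failures / n_trials

hep = estimate_hep()   # rough HEP estimate under the assumed distributions
```

Swapping in alternative distributions for either quantity and re-running the estimate is exactly the kind of sensitivity study the abstract describes.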

  14. Nontechnical skills performance and care processes in the management of the acute trauma patient.

    PubMed

    Pucher, Philip H; Aggarwal, Rajesh; Batrick, Nicola; Jenkins, Michael; Darzi, Ara

    2014-05-01

    Acute trauma management is a complex process, with effective cooperation among multiple clinicians critical to success. Despite this, the effect of nontechnical skills on performance and outcomes has not been investigated previously in trauma. Trauma calls in an urban, level 1 trauma center were observed directly. Nontechnical performance was measured using T-NOTECHS. Times to disposition and completion of assessment care processes were recorded, as well as any delays or errors. Statistical analysis assessed the effect of T-NOTECHS on performance and outcomes, accounting for Injury Severity Scores (ISS) and time of day as potential confounding factors. Meta-analysis was performed for incidence of delays. Fifty trauma calls were observed, with an ISS of 13 (interquartile range [IQR], 5-25); duration of stay 1 (IQR, 1-8) days; T-NOTECHS, 20.5 (IQR, 18-23); time to disposition, 24 minutes (IQR, 18-42). Trauma calls with low T-NOTECHS scores had a greater time to disposition: 35 minutes (IQR, 23-53) versus 20 (IQR, 16-25; P = .046). ISS showed a significant correlation to duration of stay (r = 0.736; P < .001), but not to T-NOTECHS (r = 0.201; P = .219) or time to disposition (r = 0.113; P = .494). There was no difference between "in-hours" and "out-of-hours" trauma calls for T-NOTECHS scores (21 [IQR, 18-22] vs 20 [IQR, 20-23]; P = .361), or time to disposition (34 minutes [IQR, 24-52] vs 17 [IQR, 15-27]; P = .419). Regression analysis revealed T-NOTECHS as the only factor associated with delays (odds ratio [OR], 0.24; 95% confidence interval [CI], 0.06-0.95). Better teamwork and nontechnical performance are associated with significant decreases in disposition time, an important marker of quality in acute trauma care. Addressing team and nontechnical skills has the potential to improve patient assessment, treatment, and outcomes. Copyright © 2014 Mosby, Inc. All rights reserved.

  15. Performance Analysis of IEEE 802.15.6 CSMA/CA Protocol for WBAN Medical Scenario through DTMC Model.

    PubMed

    Kumar, Vivek; Gupta, Bharat

    2016-12-01

    The newly drafted IEEE 802.15.6 standard for Wireless Body Area Networks (WBAN) targets numerous medical and non-medical applications. This short-range wireless communication standard offers ultra-low power consumption with data rates from a few kbps to Mbps in, on, or around the human body. In this paper, a performance analysis of the carrier sense multiple access with collision avoidance (CSMA/CA) scheme of the IEEE 802.15.6 standard is presented in terms of throughput, reliability, clear channel assessment (CCA) failure probability, packet drop probability, and end-to-end delay. We have developed a discrete-time Markov chain (DTMC) to evaluate the performance of IEEE 802.15.6 CSMA/CA under non-ideal channel conditions with saturated traffic, including node wait time and service time. We also show that as the payload length increases, the CCA failure probability increases, which lowers the node's reliability. In addition, we have calculated the end-to-end delay in order to characterize the node wait time caused by backoff and retransmission. A user priority (UP) wise DTMC analysis has been performed to show the importance of the standard, especially for the medical scenario.
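As a toy illustration of the DTMC machinery behind such an analysis, the following computes the steady-state probabilities of a small three-state chain from its one-step transition matrix. The states and probabilities are invented; the standard's actual chain is far larger.

```python
# Toy steady-state analysis of a 3-state discrete-time Markov chain
# (idle, CCA, transmit). The entries of P are invented for illustration.
import numpy as np

# Rows sum to 1: P[i, j] = probability of moving from state i to state j.
P = np.array([
    [0.2, 0.8, 0.0],   # idle: start a CCA with probability 0.8
    [0.3, 0.0, 0.7],   # CCA: channel busy 0.3, clear (go transmit) 0.7
    [1.0, 0.0, 0.0],   # transmit: return to idle
])

# Stationary distribution: solve pi P = pi with sum(pi) = 1, via the left
# eigenvector of P associated with eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi = pi / pi.sum()

throughput_proxy = pi[2]   # long-run fraction of slots spent transmitting
```

Performance metrics such as throughput and CCA failure probability fall out of the stationary distribution as long-run fractions of time spent in the relevant states.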

  16. Finite time step and spatial grid effects in δf simulation of warm plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sturdevant, Benjamin J., E-mail: benjamin.j.sturdevant@gmail.com; Department of Applied Mathematics, University of Colorado at Boulder, Boulder, CO 80309; Parker, Scott E.

    2016-01-15

    This paper introduces a technique for analyzing time integration methods used with the particle weight equations in δf method particle-in-cell (PIC) schemes. The analysis applies to the simulation of warm, uniform, periodic or infinite plasmas in the linear regime and considers the collective behavior similar to the analysis performed by Langdon for full-f PIC schemes [1,2]. We perform both a time integration analysis and spatial grid analysis for a kinetic ion, adiabatic electron model of ion acoustic waves. An implicit time integration scheme is studied in detail for δf simulations using our weight equation analysis and for full-f simulations using the method of Langdon. It is found that the δf method exhibits a CFL-like stability condition for low temperature ions, which is independent of the parameter characterizing the implicitness of the scheme. The accuracy of the real frequency and damping rate due to the discrete time and spatial schemes is also derived using a perturbative method. The theoretical analysis of numerical error presented here may be useful for the verification of simulations and for providing intuition for the design of new implicit time integration schemes for the δf method, as well as understanding differences between δf and full-f approaches to plasma simulation.

  17. Robust Control Systems.

    DTIC Science & Technology

    1981-12-01

    time control system algorithms that will perform adequately (i.e., at least maintain closed-loop system stability) when uncertain parameters in the...system design models vary significantly. Such a control algorithm is said to have stability robustness, or more simply is said to be "robust". This...cases above, the performance is analyzed using a covariance analysis. The development of all the controllers and the performance analysis algorithms is

  18. Investigation of the 16-year and 18-year ZTD Time Series Derived from GPS Data Processing

    NASA Astrophysics Data System (ADS)

    Bałdysz, Zofia; Nykiel, Grzegorz; Figurski, Mariusz; Szafranek, Karolina; KroszczyńSki, Krzysztof

    2015-08-01

    The GPS system can play an important role in activities related to the monitoring of climate. The long time series, coherent strategy, and very high quality of the tropospheric parameter Zenith Tropospheric Delay (ZTD) estimated on the basis of GPS data analysis allow its usefulness for climate research to be investigated as a direct GPS product. This paper presents results of an analysis of 16-year time series derived from the EUREF Permanent Network (EPN) reprocessing performed by the Military University of Technology. For 58 stations, Lomb-Scargle periodograms were computed in order to obtain information about the oscillations in the ZTD time series. Seasonal components and a linear trend were estimated using Least Squares Estimation (LSE), and the Mann-Kendall trend test was used to confirm the presence of the linear trend designated by the LSE method. In order to verify the impact of the length of the time series on the trend value, a comparison between the 16-year and 18-year series was performed.
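The periodogram step can be sketched on synthetic data; the series below is invented with a known annual cycle, whereas the study used 16-18 years of EPN station ZTD estimates.

```python
# Lomb-Scargle periodogram of an unevenly sampled synthetic series with a
# known annual cycle (a stand-in for a ZTD time series; data are synthetic).
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 16 * 365.25, 2000))   # observation days, uneven
annual = 2 * np.pi / 365.25                      # angular frequency, rad/day
y = 10.0 * np.sin(annual * t) + rng.normal(0, 2.0, t.size)

periods = np.linspace(100, 500, 800)             # candidate periods in days
freqs = 2 * np.pi / periods                      # angular frequencies
pgram = lombscargle(t, y - y.mean(), freqs)      # center the series first

best_period = periods[np.argmax(pgram)]          # should recover ~365 days
```

The Lomb-Scargle formulation is the natural choice here because GPS-derived series have gaps and uneven sampling, which rule out a plain FFT periodogram.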

  19. Lanczos eigensolution method for high-performance computers

    NASA Technical Reports Server (NTRS)

    Bostic, Susan W.

    1991-01-01

    The theory, computational analysis, and applications of a Lanczos algorithm on high-performance computers are presented. The computationally intensive steps of the algorithm are identified as the matrix factorization, the forward/backward equation solution, and the matrix-vector multiplies. These computational steps are optimized to exploit the vector and parallel capabilities of high-performance computers. The savings in computational time from applying optimization techniques such as variable-band and sparse data storage and access, loop unrolling, use of local memory, and compiler directives are presented. Two large-scale structural analysis applications are described: the buckling of a composite blade-stiffened panel with a cutout, and the vibration analysis of a high-speed civil transport. The sequential computational time for the panel problem, 181.6 seconds on a CONVEX computer, was decreased to 14.1 seconds with the optimized vector algorithm. The best computational time for the transport problem, with 17,000 degrees of freedom, was 23 seconds on the Cray Y-MP using an average of 3.63 processors.
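The basic iteration behind such an eigensolver can be sketched as follows; this plain NumPy version has no reorthogonalization, so it is reliable only for a modest number of steps, and it stands in for (not reproduces) the optimized production code described above.

```python
# Plain-NumPy Lanczos iteration: build an m x m tridiagonal matrix T whose
# extreme eigenvalues (Ritz values) approximate those of a large symmetric
# matrix A. No reorthogonalization, so only modest m is reliable.
import numpy as np

def lanczos(A, v0, m):
    """Return (alpha, beta): the diagonal and off-diagonal of T."""
    n = A.shape[0]
    V = np.zeros((n, m + 1))
    alpha, beta = np.zeros(m), np.zeros(m)
    V[:, 0] = v0 / np.linalg.norm(v0)
    for j in range(m):
        w = A @ V[:, j]                 # the dominant cost: matrix-vector multiply
        alpha[j] = V[:, j] @ w
        w -= alpha[j] * V[:, j]
        if j > 0:
            w -= beta[j - 1] * V[:, j - 1]
        beta[j] = np.linalg.norm(w)
        if beta[j] == 0:                # invariant subspace found; stop early
            break
        V[:, j + 1] = w / beta[j]
    return alpha, beta

# Example: approximate the largest eigenvalue of a 200 x 200 symmetric matrix
rng = np.random.default_rng(0)
B = rng.normal(size=(200, 200))
A = (B + B.T) / 2
alpha, beta = lanczos(A, rng.normal(size=200), 30)
T = np.diag(alpha) + np.diag(beta[:-1], 1) + np.diag(beta[:-1], -1)
ritz_max = np.linalg.eigvalsh(T).max()
exact_max = np.linalg.eigvalsh(A).max()   # for comparison only
```

The matrix-vector multiply inside the loop is exactly the step the paper targets with sparse storage and vectorization.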

  20. Mixed time integration methods for transient thermal analysis of structures, appendix 5

    NASA Technical Reports Server (NTRS)

    Liu, W. K.

    1982-01-01

    Mixed time integration methods for transient thermal analysis of structures are studied. An efficient solution procedure for predicting the thermal behavior of aerospace vehicle structures was developed. A 2D finite element computer program incorporating these methodologies is being implemented. The performance of these mixed time finite element algorithms can then be evaluated employing the proposed example problem.

  1. Reconstruction and feature selection for desorption electrospray ionization mass spectroscopy imagery

    NASA Astrophysics Data System (ADS)

    Gao, Yi; Zhu, Liangjia; Norton, Isaiah; Agar, Nathalie Y. R.; Tannenbaum, Allen

    2014-03-01

    Desorption electrospray ionization mass spectrometry (DESI-MS) provides a highly sensitive imaging technique for differentiating normal and cancerous tissue at the molecular level. This can be very useful, especially under intra-operative conditions where the surgeon has to make crucial decisions about the tumor boundary. In such situations, the time it takes for imaging and data analysis becomes a critical factor. Therefore, in this work we utilize compressive sensing to perform sparse sampling of the tissue, which halves the scanning time. Furthermore, sparse feature selection is performed, which reduces the dimension of the data from about 10⁴ to fewer than 50 and thus significantly shortens the analysis time. This procedure also identifies biochemically important molecules for further pathological analysis. The methods are validated on brain and breast tumor data sets.

  2. "Fast" Is Not "Real-Time": Designing Effective Real-Time AI Systems

    NASA Astrophysics Data System (ADS)

    O'Reilly, Cindy A.; Cromarty, Andrew S.

    1985-04-01

    Realistic practical problem domains (such as robotics, process control, and certain kinds of signal processing) stand to benefit greatly from the application of artificial intelligence techniques. These problem domains are of special interest because they are typified by complex dynamic environments in which the ability to select and initiate a proper response to environmental events in real time is a strict prerequisite to effective environmental interaction. Artificial intelligence systems developed to date have been sheltered from this real-time requirement, however, largely by virtue of their use of simplified problem domains or problem representations. The plethora of colloquial and (in general) mutually inconsistent interpretations of the term "real-time" employed by workers in each of these domains further exacerbates the difficulties in effectively applying state-of-the-art problem solving techniques to time-critical problems. Indeed, the intellectual waters are by now sufficiently muddied that the pursuit of a rigorous treatment of intelligent real-time performance mandates the redevelopment of proper problem perspective on what "real-time" means, starting from first principles. We present a simple but nonetheless formal definition of real-time performance. We then undertake an analysis of both conventional techniques and AI technology with respect to their ability to meet substantive real-time performance criteria. This analysis provides a basis for specification of problem-independent design requirements for systems that would claim real-time performance. Finally, we discuss the application of these design principles to a pragmatic problem in real-time signal understanding.

  3. Detecting a periodic signal in the terrestrial cratering record

    NASA Technical Reports Server (NTRS)

    Grieve, Richard A. F.; Rupert, James D.; Goodacre, Alan K.; Sharpton, Virgil L.

    1988-01-01

    A time-series analysis of model periodic data, where the period and phase are known, has been performed in order to investigate whether a significant period can be detected consistently from a mix of random and periodic impacts. Special attention is given to the effect of age uncertainties and random ages in the detection of a periodic signal. An equivalent analysis is performed with observed data on crater ages and compared with the model data, and the effects of the temporal distribution of crater ages on the results from the time-series analysis are studied. Evidence for a consistent 30-m.y. period is found to be weak.

  4. An efficient approach to identify different chemical markers between fibrous root and rhizome of Anemarrhena asphodeloides by ultra high-performance liquid chromatography quadrupole time-of-flight tandem mass spectrometry with multivariate statistical analysis.

    PubMed

    Wang, Fang-Xu; Yuan, Jian-Chao; Kang, Li-Ping; Pang, Xu; Yan, Ren-Yi; Zhao, Yang; Zhang, Jie; Sun, Xin-Guang; Ma, Bai-Ping

    2016-09-10

    An ultra high-performance liquid chromatography quadrupole time-of-flight tandem mass spectrometry approach coupled with multivariate statistical analysis was established and applied to rapidly distinguish the chemical differences between the fibrous root and rhizome of Anemarrhena asphodeloides. The datasets of tR-m/z pairs, ion intensity, and sample code were processed by principal component analysis and orthogonal partial least squares discriminant analysis. Chemical markers could be identified based on their exact mass data, fragmentation characteristics, and retention times. The new compounds among the chemical markers could then be rapidly isolated, guided by the ultra high-performance liquid chromatography quadrupole time-of-flight tandem mass spectrometry, and their definitive structures further elucidated by NMR spectra. Using this approach, twenty-four markers were identified online, including nine new saponins, and five of the new steroidal saponins were obtained in pure form. The study validated the proposed approach as a suitable method for identifying the chemical differences between various medicinal parts, in order to expand the usable medicinal parts and increase the utilization rate of resources. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Data management system performance modeling

    NASA Technical Reports Server (NTRS)

    Kiser, Larry M.

    1993-01-01

    This paper discusses analytical techniques that have been used to gain a better understanding of the Space Station Freedom's (SSF's) Data Management System (DMS). The DMS is a complex, distributed, real-time computer system that has been redesigned numerous times. The implications of these redesigns have not been fully analyzed. This paper discusses the advantages and disadvantages of static analytical techniques such as Rate Monotonic Analysis (RMA) and also provides a rationale for dynamic modeling. Factors such as system architecture, processor utilization, bus architecture, queuing, etc. are well suited for analysis with a dynamic model. The significance of performance measures for a real-time system is discussed.
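One such static check is the Liu-Layland rate-monotonic utilization-bound test, sketched below with hypothetical task parameters (not SSF DMS figures). Note the test is sufficient but not necessary: a set failing the bound may still be schedulable under exact analysis.

```python
# Liu-Layland rate-monotonic schedulability test (a sufficient, not
# necessary, condition). Task parameters here are hypothetical.
def rma_schedulable(tasks):
    """tasks: list of (execution_time, period) pairs in the same time unit.
    Passes if total utilization U <= n * (2**(1/n) - 1)."""
    n = len(tasks)
    utilization = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1 / n) - 1)
    return utilization <= bound

# Three periodic tasks, (C, T) in milliseconds: U = 0.26, bound ~ 0.7798
ok = rma_schedulable([(1, 10), (2, 20), (3, 50)])          # passes the bound
overloaded = rma_schedulable([(5, 10), (5, 20), (5, 50)])  # fails the bound
```

This is the kind of quick static answer RMA gives; behaviors driven by bus contention and queuing, as the paper argues, need a dynamic model instead.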

  6. Run-time parallelization and scheduling of loops

    NASA Technical Reports Server (NTRS)

    Saltz, Joel H.; Mirchandaney, Ravi; Crowley, Kay

    1990-01-01

    Run-time methods are studied to automatically parallelize and schedule iterations of a do loop in certain cases where compile-time information is inadequate. The methods presented involve execution-time preprocessing of the loop. At compile time, these methods set up the framework for performing a loop dependency analysis. At run time, wavefronts of concurrently executable loop iterations are identified. Using this wavefront information, loop iterations are reordered for increased parallelism. Symbolic transformation rules are used to produce inspector procedures, which perform execution-time preprocessing, and executors, which are transformed versions of the source code loop structures. These executors carry out the calculations planned in the inspector procedures. Performance results are presented from experiments conducted on the Encore Multimax. These results illustrate that run-time reordering of loop indices can have a significant impact on performance. Furthermore, the overheads associated with this type of reordering are amortized when the loop is executed several times with the same dependency structure.
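The inspector/executor split can be sketched as follows; the dependence model (flow dependences through written locations only) and the example loop are simplified illustrations, not the paper's implementation.

```python
# Toy inspector/executor sketch: the inspector analyzes the loop's data
# dependences at run time and groups iterations into wavefronts; the
# executor then runs each wavefront (conceptually in parallel). Only
# dependences through written locations are modeled here.

def inspector(reads, writes, n):
    """Assign each iteration to a wavefront: it must follow the latest
    wavefront containing an iteration that wrote a location it accesses."""
    last_write_wave = {}          # location -> wavefront of its last writer
    wavefronts = []
    for i in range(n):
        deps = [last_write_wave.get(loc, -1) for loc in reads[i] + writes[i]]
        wave = max(deps, default=-1) + 1
        for loc in writes[i]:
            last_write_wave[loc] = wave
        if wave == len(wavefronts):
            wavefronts.append([])
        wavefronts[wave].append(i)
    return wavefronts

def executor(wavefronts, body):
    """Run the loop body wavefront by wavefront; iterations within one
    wavefront are mutually independent and could execute in parallel."""
    for wave in wavefronts:
        for i in wave:
            body(i)

# Example loop with a loop-carried dependence of distance 2: x[i] = x[i-2] + 1
n = 6
reads = [[i - 2] if i >= 2 else [] for i in range(n)]
writes = [[i] for i in range(n)]
waves = inspector(reads, writes, n)

order = []
executor(waves, order.append)   # records the execution order
```

Because the dependence structure is discovered only once, the inspector's cost is amortized across repeated executions of the loop, which is the amortization effect the abstract reports.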

  7. Assessing performance of an Electronic Health Record (EHR) using Cognitive Task Analysis.

    PubMed

    Saitwal, Himali; Feng, Xuan; Walji, Muhammad; Patel, Vimla; Zhang, Jiajie

    2010-07-01

    Many Electronic Health Record (EHR) systems fail to provide user-friendly interfaces because human-centered computing issues were not systematically considered. Such interfaces can be improved to give users EHR systems that are easy to use, easy to learn, and error resistant. The objective of this study was to evaluate the usability of an EHR system and to suggest areas of improvement in the user interface. The user interface of the AHLTA (Armed Forces Health Longitudinal Technology Application) was analyzed using the Cognitive Task Analysis (CTA) method called GOMS (Goals, Operators, Methods, and Selection rules) and an associated technique called KLM (Keystroke-Level Model). The GOMS method was used to evaluate the AHLTA user interface by classifying each step of a given task as a mental (internal) or physical (external) operator. This analysis was performed by two analysts independently, and the inter-rater reliability was computed to verify the reliability of the GOMS method. Further evaluation was performed using KLM to estimate the execution time required to perform the given task through application of its standard set of operators. The results are based on the analysis of 14 prototypical tasks performed by AHLTA users. They show that, on average, a user needs to go through 106 steps to complete a task. To perform all 14 tasks, a user would spend about 22 min (independent of system response time) on data entry, of which 11 min are spent on the more effortful mental operators. The inter-rater reliability for all 14 tasks was 0.8 (kappa), indicating good reliability of the method. This paper empirically identifies the following findings related to the performance of AHLTA: (1) a large average number of steps to complete common tasks, (2) high average execution time, and (3) a large percentage of mental operators. The user interface can be improved by reducing (a) the total number of steps and (b) the percentage of mental effort required for the tasks.
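
The KLM execution-time estimate used in this study can be sketched directly: each operator in a task sequence contributes a standard time. The operator times below are the commonly cited Card, Moran and Newell values and the task sequence is hypothetical; neither is taken from the study itself.

```python
# Illustrative Keystroke-Level Model (KLM) time estimate. Operator times
# are the commonly cited Card, Moran & Newell values (assumptions here).
KLM_TIMES = {
    "K": 0.28,  # keystroke (average skilled typist), seconds
    "P": 1.10,  # point with the mouse
    "B": 0.10,  # mouse button press or release
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation (the "effortful" operator above)
}

def klm_time(sequence):
    """Total predicted execution time for a string of KLM operators."""
    return sum(KLM_TIMES[op] for op in sequence)

# Hypothetical step: think, point at a field, click, move to keyboard,
# think again, then type a 4-character code.
task = "MPBB" + "H" + "M" + "KKKK"
print(round(klm_time(task), 2))   # 5.52 seconds

# Share of time spent on mental operators, as in the paper's analysis:
mental = klm_time([op for op in task if op == "M"])
print(round(mental / klm_time(task), 2))
```

Summing the mental operators separately, as in the last two lines, is how a KLM analysis surfaces the "large percentage of mental operators" finding reported above.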

  8. Algorithm Summary and Evaluation: Automatic Implementation of Ringdown Analysis for Electromechanical Mode Identification from Phasor Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Ning; Huang, Zhenyu; Tuffner, Francis K.

    2010-02-28

    Small-signal stability problems are one of the major threats to grid stability and reliability. Prony analysis has been successfully applied to ringdown data to monitor the electromechanical modes of a power system using phasor measurement unit (PMU) data. To facilitate on-line application of mode estimation, this paper develops a recursive algorithm for implementing Prony analysis and proposes an oscillation detection method to detect ringdown data in real time. By automatically detecting ringdown data, the proposed method helps guarantee that Prony analysis is applied properly and in a timely manner, so that mode estimation can be performed reliably and promptly. The proposed method is tested using Monte Carlo simulations based on a 17-machine model and is shown to properly identify the oscillation data for on-line application of Prony analysis. In addition, the proposed method is applied to field measurement data from WECC to show the performance of the proposed algorithm.
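
The abstract does not give the paper's detection statistic; as a hypothetical stand-in, a sliding-window RMS detector illustrates the idea of gating Prony analysis on detected ringdowns: flag a window when its RMS deviation jumps well above a quiet-period baseline.

```python
import math

# Hypothetical ringdown detector (not the statistic proposed in the paper):
# trigger when a window's RMS deviation exceeds `threshold` times the RMS
# of the first window, which is assumed to be quiet ambient data.

def detect_ringdown(signal, window=20, threshold=3.0):
    """Return start indices of windows whose RMS deviation exceeds the
    threshold relative to the first (baseline) window."""
    def rms_dev(seg):
        mean = sum(seg) / len(seg)
        return math.sqrt(sum((s - mean) ** 2 for s in seg) / len(seg))
    baseline = rms_dev(signal[:window]) or 1e-12
    hits = []
    for start in range(len(signal) - window + 1):
        if rms_dev(signal[start:start + window]) > threshold * baseline:
            hits.append(start)
    return hits

# Quiet pseudo-noise followed by a decaying 0.5 Hz oscillation (ringdown):
dt = 0.1
quiet = [0.01 * math.sin(7.0 * k) for k in range(100)]
ring = [5.0 * math.exp(-0.05 * k * dt) * math.sin(2 * math.pi * 0.5 * k * dt)
        for k in range(100)]
hits = detect_ringdown(quiet + ring)
print(hits[0])   # first flagged window overlaps the start of the ringdown
```

In an on-line setting, the first flagged window would start a Prony (or recursive Prony) fit on the ringdown segment, which is the gating behavior the paper's automatic detection provides.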

  9. An evaluation of superminicomputers for thermal analysis

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Vidal, J. B.; Jones, G. K.

    1962-01-01

    The feasibility and cost effectiveness of solving thermal analysis problems on superminicomputers is demonstrated. Conventional thermal analysis and the changing computer environment, computer hardware and software used, six thermal analysis test problems, performance of superminicomputers (CPU time, accuracy, turnaround, and cost) and comparison with large computers are considered. Although the CPU times for superminicomputers were 15 to 30 times greater than the fastest mainframe computer, the minimum cost to obtain the solutions on superminicomputers was from 11 percent to 59 percent of the cost of mainframe solutions. The turnaround (elapsed) time is highly dependent on the computer load, but for large problems, superminicomputers produced results in less elapsed time than a typically loaded mainframe computer.

  10. Response Time Analysis and Test of Protection System Instrument Channels for APR1400 and OPR1000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Chang Jae; Han, Seung; Yun, Jae Hee

    2015-07-01

    Safety limits are required to maintain the integrity of the physical barriers designed to prevent the uncontrolled release of radioactive materials in nuclear power plants. The safety analysis establishes two critical constraints: an analytical limit in terms of a measured or calculated variable, and a specific time after the analytical limit is reached at which protective action must begin. In keeping with nuclear regulations and industry standards, satisfying these two requirements ensures that the safety limit will not be exceeded during a design basis event, whether an anticipated operational occurrence or a postulated accident. Various studies on setpoint determination methodology for safety-related instrumentation have been actively performed to ensure that the analytical limit requirement is satisfied. In particular, the protection setpoint methodology for the advanced power reactor 1400 (APR1400) and the optimized power reactor 1000 (OPR1000) has recently been developed to cover both design basis events and beyond-design-basis events, and has been quantitatively validated using specific computer programs and setpoint calculations. However, the safety of nuclear power plants cannot be fully guaranteed by satisfying the analytical limit requirement alone. Despite the response time verification requirements of nuclear regulations and industry standards, studies on a systematically integrated methodology for response time evaluation are hard to find. For APR1400 and OPR1000, the response time analysis for the plant protection system is partially included in the setpoint calculation, and the response time test is performed separately via a specific plant procedure. The test technique has the drawback that completeness of the timing test is difficult to demonstrate, while the analysis technique has the demerit of yielding extreme times that are not actually possible. Thus, a systematic response time evaluation methodology is needed to justify conformance to the response time requirement used in the safety analysis. This paper proposes a response time evaluation methodology for APR1400 and OPR1000 that uses a combined analysis and test technique to confirm that the plant protection system can meet the analytical response time assumed in the safety analysis; the results of the quantitative evaluation performed for APR1400 and OPR1000 are also presented. The proposed response time analysis technique consists of defining the response time requirement, determining the critical signal path for the trip parameter, allocating an individual response time to each component on the signal path, and analyzing the total response time for the trip parameter, and it demonstrates that the total analyzed response time does not exceed the response time requirement. The proposed response time test technique consists of defining the response time requirement, determining the critical signal path for the trip parameter, determining the test method for each component on the signal path, and performing the response time test, and it demonstrates that the total test result does not exceed the response time requirement. The total response time should be tested in a single test that covers the instrument channel from the sensor to the final actuation device. When the total channel is not tested in a single test, separate tests on groups of components or on single components spanning the total instrument channel shall be combined to verify the total channel response. For APR1400 and OPR1000, the ramp test technique is used for the pressure and differential pressure transmitters, and the step function test technique is applied to the signal processing equipment and final actuation device. As a result, it can be demonstrated that the response time requirement is satisfied by the combined analysis and test technique. Therefore, the methodology proposed in this paper plays a crucial role in systematically guaranteeing the safety of nuclear power plants by satisfying one of the two critical requirements from the safety analysis.

  11. National Combustion Code: Parallel Implementation and Performance

    NASA Technical Reports Server (NTRS)

    Quealy, A.; Ryder, R.; Norris, A.; Liu, N.-S.

    2000-01-01

    The National Combustion Code (NCC) is being developed by an industry-government team for the design and analysis of combustion systems. CORSAIR-CCD is the current baseline reacting flow solver for NCC. This is a parallel, unstructured grid code which uses a distributed memory, message passing model for its parallel implementation. The focus of the present effort has been to improve the performance of the NCC flow solver to meet combustor designer requirements for model accuracy and analysis turnaround time. Improving the performance of this code contributes significantly to the overall reduction in time and cost of the combustor design cycle. This paper describes the parallel implementation of the NCC flow solver and summarizes its current parallel performance on an SGI Origin 2000. Earlier parallel performance results on an IBM SP-2 are also included. The performance improvements which have enabled a turnaround of less than 15 hours for a 1.3 million element fully reacting combustion simulation are described.

  12. Time-dependent inertia analysis of vehicle mechanisms

    NASA Astrophysics Data System (ADS)

    Salmon, James Lee

    Two methods for performing transient inertia analysis of vehicle hardware systems are developed in this dissertation. The analysis techniques can be used to predict the response of vehicle mechanism systems to the accelerations associated with vehicle impacts. General analytical methods for evaluating translational or rotational system dynamics are derived and evaluated for various system characteristics. The utility of the derived techniques is demonstrated by applying the generalized methods to two vehicle systems. Time-dependent accelerations measured during a vehicle-to-vehicle impact are used as input to perform a dynamic analysis of an automobile liftgate latch and outside door handle. Generalized Lagrange equations for a non-conservative system are used to formulate a second-order nonlinear differential equation defining the response of the components to the transient input. The differential equation is solved by employing the fourth-order Runge-Kutta method. The events are then analyzed using commercially available two-dimensional rigid body dynamic analysis software. The results of the two analytical techniques are compared to experimental data generated by high-speed film analysis of tests of the two components performed on a high-G acceleration sled at Ford Motor Company.
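
The numerical step described above can be sketched generically: a second-order equation of motion x'' = f(t, x, x') is recast as a first-order system and integrated with the classical fourth-order Runge-Kutta method. The oscillator test problem is illustrative only and is not the dissertation's latch model.

```python
import math

# Recast x'' = f(t, x, v) with v = x' as the system (x' = v, v' = f),
# then advance both states with classical RK4.

def rk4_second_order(f, t0, x0, v0, dt, steps):
    """Integrate x'' = f(t, x, v) from (x0, v0); return final (x, v)."""
    def deriv(t, x, v):
        return v, f(t, x, v)
    t, x, v = t0, x0, v0
    for _ in range(steps):
        k1x, k1v = deriv(t, x, v)
        k2x, k2v = deriv(t + dt/2, x + dt/2*k1x, v + dt/2*k1v)
        k3x, k3v = deriv(t + dt/2, x + dt/2*k2x, v + dt/2*k2v)
        k4x, k4v = deriv(t + dt, x + dt*k3x, v + dt*k3v)
        x += dt/6 * (k1x + 2*k2x + 2*k3x + k4x)
        v += dt/6 * (k1v + 2*k2v + 2*k3v + k4v)
        t += dt
    return x, v

# Undamped oscillator x'' = -x with x(0)=1, x'(0)=0: exact x(t) = cos(t).
x, v = rk4_second_order(lambda t, x, v: -x, 0.0, 1.0, 0.0, 0.01, 100)
print(abs(x - math.cos(1.0)) < 1e-8)   # True: RK4 tracks cos(t) closely
```

In the dissertation's setting, f would be the nonlinear restoring term from the Lagrange formulation, with the measured crash acceleration entering as the time-dependent forcing.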

  13. Performance analysis and prediction in triathlon.

    PubMed

    Ofoghi, Bahadorreza; Zeleznikow, John; Macmahon, Clare; Rehula, Jan; Dwyer, Dan B

    2016-01-01

    Performance in triathlon is dependent upon factors that include somatotype, physiological capacity, technical proficiency and race strategy. Given the multidisciplinary nature of triathlon and the interaction between each of the three race components, the identification of target split times that can be used to inform the design of training plans and race pacing strategies is a complex task. The present study uses machine learning techniques to analyse a large database of performances in Olympic distance triathlons (2008-2012). The analysis reveals patterns of performance in five components of triathlon (three race "legs" and two transitions) and the complex relationships between performance in each component and overall performance in a race. The results provide three perspectives on the relationship between performance in each component of triathlon and the final placing in a race. These perspectives allow the identification of target split times that are required to achieve a certain final place in a race and the opportunity to make evidence-based decisions about race tactics in order to optimise performance.

  14. They saw a movie: long-term memory for an extended audiovisual narrative.

    PubMed

    Furman, Orit; Dorfman, Nimrod; Hasson, Uri; Davachi, Lila; Dudai, Yadin

    2007-06-01

    We measured long-term memory for a narrative film. During the study session, participants watched a 27-min movie episode, without instructions to remember it. During the test session, administered at a delay ranging from 3 h to 9 mo after the study session, long-term memory for the movie was probed using a computerized questionnaire that assessed cued recall, recognition, and metamemory of movie events sampled approximately 20 sec apart. The performance of each group of participants was measured at a single time point only. The participants remembered many events in the movie even months after watching it. Analysis of performance, using multiple measures, indicates differences between recent (weeks) and remote (months) memory. While high-confidence recognition performance was a reliable index of memory throughout the measured time span, cued recall accuracy was higher for relatively recent information. Analysis of different content elements in the movie revealed differential memory performance profiles according to time since encoding. We also used the data to propose lower limits on the capacity of long-term memory. This experimental paradigm is useful not only for the analysis of behavioral performance that results from encoding episodes in a continuous real-life-like situation, but is also suitable for studying brain substrates and processes of real-life memory using functional brain imaging.

  15. They saw a movie: Long-term memory for an extended audiovisual narrative

    PubMed Central

    Furman, Orit; Dorfman, Nimrod; Hasson, Uri; Davachi, Lila; Dudai, Yadin

    2007-01-01

    We measured long-term memory for a narrative film. During the study session, participants watched a 27-min movie episode, without instructions to remember it. During the test session, administered at a delay ranging from 3 h to 9 mo after the study session, long-term memory for the movie was probed using a computerized questionnaire that assessed cued recall, recognition, and metamemory of movie events sampled ∼20 sec apart. The performance of each group of participants was measured at a single time point only. The participants remembered many events in the movie even months after watching it. Analysis of performance, using multiple measures, indicates differences between recent (weeks) and remote (months) memory. While high-confidence recognition performance was a reliable index of memory throughout the measured time span, cued recall accuracy was higher for relatively recent information. Analysis of different content elements in the movie revealed differential memory performance profiles according to time since encoding. We also used the data to propose lower limits on the capacity of long-term memory. This experimental paradigm is useful not only for the analysis of behavioral performance that results from encoding episodes in a continuous real-life-like situation, but is also suitable for studying brain substrates and processes of real-life memory using functional brain imaging. PMID:17562897

  16. Defense Planning In A Time Of Conflict: A Comparative Analysis Of The 2001-2014 Quadrennial Defense Reviews, And Implications For The Army--Executive Summary

    DTIC Science & Technology

    2018-01-01

    Defense Planning in a Time of Conflict A Comparative Analysis of the 2001– 2014 Quadrennial Defense Reviews, and Implications for...The purpose of the project was to perform a comparative historical review of the four Quadrennial Defense Reviews (QDRs) conducted since the first...Planning in a Time of Conflict: A Comparative Analysis of the 2001–2014 Quadrennial Defense Reviews, and Implications for the Army—that documented

  17. Multi-ingredients determination and fingerprint analysis of leaves from Ilex latifolia using ultra-performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry.

    PubMed

    Fan, Chunlin; Deng, Jiewei; Yang, Yunyun; Liu, Junshan; Wang, Ying; Zhang, Xiaoqi; Fai, Kuokchiu; Zhang, Qingwen; Ye, Wencai

    2013-10-01

    An ultra-performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry (UPLC-QTOF-MS) method integrating multi-ingredient determination and fingerprint analysis has been established for quality assessment and control of leaves from Ilex latifolia. The method is fast, efficient, and accurate, and allows multi-ingredient determination and fingerprint analysis in one chromatographic run within 13 min. Multi-ingredient determination was performed on the extracted ion chromatograms of the exact pseudo-molecular ions (with a 0.01 Da window), and fingerprint analysis was performed on the base peak chromatograms, both obtained by negative-ion electrospray ionization QTOF-MS. The validation results demonstrated that the developed method possesses the desired specificity, linearity, precision, and accuracy. The method was used to analyze 22 I. latifolia samples from different origins. Quality assessment was achieved using both similarity analysis (SA) and principal component analysis (PCA), and the results from SA were consistent with those from PCA. These experimental results demonstrate that the strategy of integrating multi-ingredient determination and fingerprint analysis with the UPLC-QTOF-MS technique is a useful approach for rapid pharmaceutical analysis, with promising prospects for differentiating origin, determining authenticity, and assessing the overall quality of herbal medicines.

  18. Noise modeling and analysis of an IMU-based attitude sensor: improvement of performance by filtering and sensor fusion

    NASA Astrophysics Data System (ADS)

    K., Nirmal; A. G., Sreejith; Mathew, Joice; Sarpotdar, Mayuresh; Suresh, Ambily; Prakash, Ajin; Safonova, Margarita; Murthy, Jayant

    2016-07-01

    We describe the characterization and removal of noise present in the Inertial Measurement Unit (IMU) MPU-6050, which was initially used in an attitude sensor and later in the development of a pointing system for small balloon-borne astronomical payloads. We found that the performance of the IMU degraded with time because of the accumulation of different errors. Using the Allan variance analysis method, we identified the different components of noise present in the IMU, and verified the results by power spectral density (PSD) analysis. We tried to remove the high-frequency noise using smoothing filters such as the moving average filter and the Savitzky-Golay (SG) filter. Even though we managed to filter out some high-frequency noise, the performance of these filters was not satisfactory for our application. We determined the distribution of the random noise present in the IMU using probability density analysis and identified the noise as white Gaussian in nature. Hence, we used a Kalman filter to remove the noise, which gave us good real-time performance.
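
The noise-identification step above rests on the Allan variance. A minimal sketch follows; the non-overlapping estimator, single cluster sizes, and synthetic white-noise data are simplifying assumptions (a real analysis sweeps the cluster time over decades and typically uses the overlapping estimator).

```python
import random

def allan_variance(rates, tau_samples):
    """Non-overlapping Allan variance of a rate signal for one cluster
    size: average the signal in bins of tau_samples samples, then take
    half the mean squared difference of successive bin averages."""
    nbins = len(rates) // tau_samples
    means = [sum(rates[i*tau_samples:(i+1)*tau_samples]) / tau_samples
             for i in range(nbins)]
    diffs = [(means[i+1] - means[i]) ** 2 for i in range(nbins - 1)]
    return 0.5 * sum(diffs) / len(diffs)

# For white (angle-random-walk) rate noise, the Allan variance falls as
# 1/tau -- the signature used to identify that noise component.
random.seed(0)
gyro = [random.gauss(0.0, 1.0) for _ in range(100000)]   # synthetic rates
av10, av100 = allan_variance(gyro, 10), allan_variance(gyro, 100)
print(av10 / av100)   # roughly 10 for white noise
```

Plotting such estimates against cluster time on log-log axes gives the familiar Allan deviation slopes (-1/2 for white noise, 0 for bias instability, +1/2 for rate random walk) used to separate the IMU's noise components.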

  19. Ethics, Nanobiosensors and Elite Sport: The Need for a New Governance Framework.

    PubMed

    Evans, Robert; McNamee, Michael; Guy, Owen

    2017-12-01

    Individual athletes, coaches and sports teams seek continuously for ways to improve performance and accomplishment in elite competition. New techniques of performance analysis are a crucial part of the drive for athletic perfection. This paper discusses the ethical importance of one aspect of the future potential of performance analysis in sport, combining the field of biomedicine, sports engineering and nanotechnology in the form of 'Nanobiosensors'. This innovative technology has the potential to revolutionise sport, enabling real time biological data to be collected from athletes that can be electronically distributed. Enabling precise real time performance analysis is not without ethical problems. Arguments concerning (1) data ownership and privacy; (2) data confidentiality; and (3) athlete welfare are presented alongside a discussion of the use of the Precautionary Principle in making ethical evaluations. We conclude, that although the future potential use of Nanobiosensors in sports analysis offers many potential benefits, there is also a fear that it could be abused at a sporting system level. Hence, it is essential for sporting bodies to consider the development of a robust ethically informed governance framework in advance of their proliferated use.

  20. Meta-Analysis of Functional Neuroimaging and Cognitive Control Studies in Schizophrenia: Preliminary Elucidation of a Core Dysfunctional Timing Network

    PubMed Central

    Alústiza, Irene; Radua, Joaquim; Albajes-Eizagirre, Anton; Domínguez, Manuel; Aubá, Enrique; Ortuño, Felipe

    2016-01-01

    Timing and other cognitive processes demanding cognitive control become interlinked when there is an increase in the level of difficulty or effort required. Both functions are interrelated and share neuroanatomical bases. A previous meta-analysis of neuroimaging studies found that people with schizophrenia had significantly lower activation, relative to normal controls, of most right hemisphere regions of the time circuit. This finding suggests that a pattern of disconnectivity of this circuit, particularly in the supplementary motor area, is a trait of this mental disease. We hypothesize that a dysfunctional temporal/cognitive control network underlies both cognitive and psychiatric symptoms of schizophrenia and that timing dysfunction is at the root of the cognitive deficits observed. The goal of our study was to look, in schizophrenia patients, for brain structures activated both by execution of cognitive tasks requiring increased effort and by performance of time perception tasks. We conducted a signed differential mapping (SDM) meta-analysis of functional neuroimaging studies in schizophrenia patients assessing the brain response to increasing levels of cognitive difficulty. Then, we performed a multimodal meta-analysis to identify common brain regions in the findings of that SDM meta-analysis and our previously-published activation likelihood estimate (ALE) meta-analysis of neuroimaging of time perception in schizophrenia patients. The current study supports the hypothesis that there exists an overlap between neural structures engaged by both timing tasks and non-temporal cognitive tasks of escalating difficulty in schizophrenia. The implication is that a deficit in timing can be considered as a trait marker of the schizophrenia cognitive profile. PMID:26925013

  1. [Construction and analysis of a monitoring system with remote real-time multiple physiological parameters based on cloud computing].

    PubMed

    Zhu, Lingyun; Li, Lianjie; Meng, Chunyan

    2014-12-01

    Existing multiple-physiological-parameter real-time monitoring systems have problems such as insufficient server capacity for physiological data storage and analysis, so that data consistency cannot be guaranteed, poor real-time performance, and other issues caused by the growing scale of data. We therefore proposed a new solution for multiple physiological parameters, with clustered background data storage and processing based on cloud computing. In our studies, batch processing for longitudinal analysis of patients' historical data was introduced. The work covered resource virtualization in the IaaS layer of the cloud platform, construction of a real-time computing platform in the PaaS layer, reception and analysis of the data stream in the SaaS layer, and the bottleneck problem of multi-parameter data transmission. The result was real-time transmission, storage, and analysis of a large amount of physiological information. Simulation test results showed that the remote multiple-physiological-parameter monitoring system based on the cloud platform had obvious advantages in processing time and load balancing over the traditional server model. This architecture solves problems that exist in traditional remote medical services, including long turnaround time, poor real-time analysis performance, and lack of extensibility, and it provides technical support for a "wearable wireless sensor plus mobile wireless transmission plus cloud computing service" mode of home health monitoring with multiple-physiological-parameter wireless monitoring.

  2. Is Implicit Sequence Learning Impaired in Schizophrenia? A Meta-Analysis

    ERIC Educational Resources Information Center

    Siegert, Richard J.; Weatherall, Mark; Bell, Elliot M.

    2008-01-01

    Cognition in schizophrenia seems to be characterized by impaired performance on most tests of explicit or declarative learning contrasting with relatively intact performance on most tests of implicit or procedural learning. At the same time there have been conflicting results for studies that have used the Serial Reaction Time (SRT) task to…

  3. A Hybrid Secure Scheme for Wireless Sensor Networks against Timing Attacks Using Continuous-Time Markov Chain and Queueing Model.

    PubMed

    Meng, Tianhui; Li, Xiaofan; Zhang, Sha; Zhao, Yubin

    2016-09-28

    Wireless sensor networks (WSNs) have recently gained popularity for a wide spectrum of applications. Monitoring tasks can be performed in various environments. This may be beneficial in many scenarios, but it certainly exhibits new challenges in terms of security due to increased data transmission over the wireless channel with potentially unknown threats. Among possible security issues are timing attacks, which are not prevented by traditional cryptographic security. Moreover, the limited energy and memory resources prohibit the use of complex security mechanisms in such systems. Therefore, balancing between security and the associated energy consumption becomes a crucial challenge. This paper proposes a secure scheme for WSNs while maintaining the requirement of the security-performance tradeoff. In order to proceed to a quantitative treatment of this problem, a hybrid continuous-time Markov chain (CTMC) and queueing model are put forward, and the tradeoff analysis of the security and performance attributes is carried out. By extending and transforming this model, the mean time to security attributes failure is evaluated. Through tradeoff analysis, we show that our scheme can enhance the security of WSNs, and the optimal rekeying rate of the performance and security tradeoff can be obtained.
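
The paper's hybrid CTMC/queueing model is not specified in the abstract. As a toy illustration of the "mean time to security attributes failure" idea, the sketch below uses an assumed three-state CTMC for a sensor node, where periodic rekeying repairs a compromised node; the mean time to the absorbing failure state solves a small linear system over the transient states.

```python
# Toy model (an assumption, not the paper's): states 0 = secure,
# 1 = compromised, 2 = security failure (absorbing). Mean times to
# failure T satisfy Q_T * T = -1, with Q_T the generator restricted
# to the transient states {0, 1}.

def solve2(a, b):
    """Solve a 2x2 linear system a @ x = b by Cramer's rule."""
    det = a[0][0]*a[1][1] - a[0][1]*a[1][0]
    return [(b[0]*a[1][1] - a[0][1]*b[1]) / det,
            (a[0][0]*b[1] - b[0]*a[1][0]) / det]

def mttsf(attack, escalate, rekey):
    """Mean time to security failure from the secure state.
    Transitions: 0->1 at rate `attack`, 1->2 at `escalate`,
    and 1->0 at `rekey` (rekeying repairs the node)."""
    q_t = [[-attack, attack],
           [rekey, -(rekey + escalate)]]
    return solve2(q_t, [-1.0, -1.0])[0]

# A higher rekeying rate lengthens the expected time to failure, at the
# cost of the extra energy the paper's tradeoff analysis accounts for:
print(mttsf(0.1, 0.05, 1.0))   # ~230 time units
print(mttsf(0.1, 0.05, 2.0))   # ~430 time units
```

For this chain the closed form is (attack + escalate + rekey) / (attack * escalate), which makes the security-performance tradeoff explicit: mean time to failure grows linearly in the rekeying rate.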

  4. A Hybrid Secure Scheme for Wireless Sensor Networks against Timing Attacks Using Continuous-Time Markov Chain and Queueing Model

    PubMed Central

    Meng, Tianhui; Li, Xiaofan; Zhang, Sha; Zhao, Yubin

    2016-01-01

    Wireless sensor networks (WSNs) have recently gained popularity for a wide spectrum of applications. Monitoring tasks can be performed in various environments. This may be beneficial in many scenarios, but it certainly exhibits new challenges in terms of security due to increased data transmission over the wireless channel with potentially unknown threats. Among possible security issues are timing attacks, which are not prevented by traditional cryptographic security. Moreover, the limited energy and memory resources prohibit the use of complex security mechanisms in such systems. Therefore, balancing between security and the associated energy consumption becomes a crucial challenge. This paper proposes a secure scheme for WSNs while maintaining the requirement of the security-performance tradeoff. In order to proceed to a quantitative treatment of this problem, a hybrid continuous-time Markov chain (CTMC) and queueing model are put forward, and the tradeoff analysis of the security and performance attributes is carried out. By extending and transforming this model, the mean time to security attributes failure is evaluated. Through tradeoff analysis, we show that our scheme can enhance the security of WSNs, and the optimal rekeying rate of the performance and security tradeoff can be obtained. PMID:27690042

  5. A 20-year period of orthotopic liver transplantation activity in a single center: a time series analysis performed using the R Statistical Software.

    PubMed

    Santori, G; Andorno, E; Morelli, N; Casaccia, M; Bottino, G; Di Domenico, S; Valente, U

    2009-05-01

    In many Western countries a "minimum volume rule" policy has been adopted as a quality measure for complex surgical procedures. In Italy, the National Transplant Centre set the minimum number of orthotopic liver transplantation (OLT) procedures per year at 25 per center. OLT procedures performed in a single center over a reasonably large period may be treated as a time series to evaluate trend, seasonal cycles, and nonsystematic fluctuations. Between January 1, 1987 and December 31, 2006, we performed 563 cadaveric-donor OLTs in adult recipients. During 2007, there were another 28 procedures. The greatest numbers of OLTs per year were performed in 2001 (n = 51), 2005 (n = 50), and 2004 (n = 49). A time series analysis performed using the R Statistical Software (R Foundation for Statistical Computing, Vienna, Austria), a free software environment for statistical computing and graphics, showed an incremental trend after exponential smoothing as well as after seasonal decomposition. The predicted OLTs per month for 2007, calculated with Holt-Winters exponential smoothing applied to the period 1987-2006, helped to identify the months with a major difference between predicted and performed procedures. The time series approach may be helpful in establishing a minimum volume per year at the single-center level.
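
The Holt-Winters forecasting step can be sketched as follows. This is a bare-bones additive implementation in Python rather than R's HoltWinters(), with fixed rather than fitted smoothing parameters and synthetic data, so it only illustrates the kind of monthly forecast the authors produced.

```python
# Additive Holt-Winters smoothing: maintain a level, a trend, and one
# seasonal term per month, then extrapolate level + trend + seasonality.

def holt_winters_forecast(y, m, h, alpha=0.3, beta=0.05, gamma=0.2):
    """Forecast h steps ahead from series y with season length m."""
    season_mean = sum(y[:m]) / m
    level, trend = season_mean, 0.0
    seasonal = [y[i] - season_mean for i in range(m)]  # initial seasonals
    for t in range(m, len(y)):
        last_level = level
        level = alpha * (y[t] - seasonal[t % m]) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        seasonal[t % m] = gamma * (y[t] - level) + (1 - gamma) * seasonal[t % m]
    return [level + (k + 1) * trend + seasonal[(len(y) + k) % m]
            for k in range(h)]

# Synthetic monthly procedure counts: mild upward trend plus a recurring
# peak in one month of each year (illustrative, not the center's data).
months = 10 * 12
series = [2 + 0.01 * t + (1 if t % 12 == 5 else 0) for t in range(months)]
forecast = holt_winters_forecast(series, m=12, h=12)
print([round(f, 1) for f in forecast])   # next year's expected counts
```

Comparing such a 12-month forecast with the counts actually performed is what lets the authors flag months that fall well below prediction, the same logic behind the minimum-volume monitoring discussed above.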

  6. Combined analysis of cortical (EEG) and nerve stump signals improves robotic hand control.

    PubMed

    Tombini, Mario; Rigosa, Jacopo; Zappasodi, Filippo; Porcaro, Camillo; Citi, Luca; Carpaneto, Jacopo; Rossini, Paolo Maria; Micera, Silvestro

    2012-01-01

    Interfacing an amputee's upper-extremity stump nerves to control a robotic hand requires training of the individual and algorithms to process interactions between cortical and peripheral signals. The aim was to evaluate, for the first time, whether EEG-driven analysis of peripheral neural signals recorded as an amputee practices could improve the classification of motor commands. Four thin-film longitudinal intrafascicular electrodes (tf-LIFE-4) were implanted in the median and ulnar nerves of the stump in the distal upper arm for 4 weeks. Artificial intelligence classifiers were implemented to analyze LIFE signals recorded while the participant tried to perform 3 different hand and finger movements as pictures representing these tasks were randomly presented on a screen. In the final week, the participant was trained to perform the same movements with a robotic hand prosthesis through modulation of tf-LIFE-4 signals. To improve classification performance, an event-related desynchronization/synchronization (ERD/ERS) procedure was applied to the EEG data to identify the exact timing of each motor command. Real-time control of neural (motor) output was achieved by the participant. By focusing the electroneurographic (ENG) signal analysis in an EEG-driven time window, movement classification performance improved. After training, the participant regained normal modulation of background rhythms for movement preparation (α/β band desynchronization) in the sensorimotor area contralateral to the missing limb. Moreover, coherence analysis found a restored α-band synchronization of the Rolandic area with frontal and parietal ipsilateral regions, similar to that observed in the opposite hemisphere for movement of the intact hand. Of note, phantom limb pain (PLP) resolved for several months.
Combining information from both cortical (EEG) and stump nerve (ENG) signals improved the classification performance compared with tf-LIFE signals processing alone; training led to cortical reorganization and mitigation of PLP.

  7. Cost and Time Analysis of Monograph Cataloging in Hospital Libraries: A Preliminary Study.

    ERIC Educational Resources Information Center

    Angold, Linda

    The purpose of this paper is: (1) to propose models to be used in evaluating relative time and cost factors involved in monograph cataloging within a hospital library, and (2) to test the models by performing a cost and time analysis of each cataloging method studied. To establish as complete a list of cataloging work units as possible, several…

  8. 77 FR 47383 - Annual Assessment of the Status of Competition in the Market for the Delivery of Video Programming

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-08

    ... monitor trends on an annual basis. To continue our time-series analysis, we request data as of June 30... information and time- series data we should collect for the analysis of various MVPD performance metrics. In... revenues, cash flows, and margins. To the extent possible, we seek five-year time-series data to allow us...

  9. QC-ART: A tool for real-time quality control assessment of mass spectrometry-based proteomics data.

    PubMed

    Stanfill, Bryan A; Nakayasu, Ernesto S; Bramer, Lisa M; Thompson, Allison M; Ansong, Charles K; Clauss, Therese; Gritsenko, Marina A; Monroe, Matthew E; Moore, Ronald J; Orton, Daniel J; Piehowski, Paul D; Schepmoes, Athena A; Smith, Richard D; Webb-Robertson, Bobbie-Jo; Metz, Thomas O

    2018-04-17

    Liquid chromatography-mass spectrometry (LC-MS)-based proteomics studies of large sample cohorts can easily require from months to years to complete. Acquiring consistent, high-quality data in such large-scale studies is challenging because of normal variations in instrumentation performance over time, as well as artifacts introduced by the samples themselves, such as those due to collection, storage and processing. Existing quality control methods for proteomics data primarily focus on post-hoc analysis to remove low-quality data that would degrade downstream statistics; they are not designed to evaluate the data in near real-time, which would allow for interventions as soon as deviations in data quality are detected. In addition to flagging analyses that demonstrate outlier behavior, evaluating how the data structure changes over time can aid in understanding typical instrument performance or identify issues such as a degradation in data quality due to the need for instrument cleaning and/or re-calibration. To address this gap for proteomics, we developed Quality Control Analysis in Real-Time (QC-ART), a tool for evaluating data as they are acquired in order to dynamically flag potential issues with instrument performance or sample quality. QC-ART has accuracy similar to that of standard post-hoc analysis methods, with the additional benefit of real-time analysis. We demonstrate the utility and performance of QC-ART in identifying deviations in data quality due to both instrument and sample issues in near real-time for LC-MS-based plasma proteomics analyses of a sample subset of The Environmental Determinants of Diabetes in the Young cohort. We also present a case where QC-ART facilitated the identification of oxidative modifications, which are often underappreciated in proteomic experiments. Published under license by The American Society for Biochemistry and Molecular Biology, Inc.
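    The near real-time flagging idea in the QC-ART record can be illustrated with a much simpler stand-in: a sliding-window robust z-score over a single QC metric. The metric, window size, and threshold below are assumptions chosen for illustration, not QC-ART's actual statistical model.

```python
from collections import deque
from statistics import median

def qc_stream(values, window=20, threshold=3.5, warmup=5):
    """Flag each incoming QC metric whose modified z-score, computed against
    the median and MAD of a sliding window of recent runs, exceeds the
    threshold; a flagged run would trigger an intervention."""
    history = deque(maxlen=window)
    flags = []
    for v in values:
        if len(history) >= warmup:
            m = median(history)
            mad = median(abs(x - m) for x in history) or 1e-9
            flags.append(abs(0.6745 * (v - m) / mad) > threshold)
        else:
            flags.append(False)  # not enough baseline runs yet
        history.append(v)
    return flags

# Monitoring a made-up QC metric (e.g., a chromatographic peak width) run by run:
flags = qc_stream([10.0, 10.2, 9.8, 10.1, 9.9, 10.0, 10.3,
                   9.7, 10.1, 9.9, 10.0, 25.0])
```

    Each run is scored against only the runs already acquired, so a gross outlier can be flagged as soon as it arrives rather than in a post-hoc pass.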

  10. [Qualitative and quantitative analysis of amygdalin and its metabolite prunasin in plasma by ultra-high performance liquid chromatography-tandem quadrupole time of flight mass spectrometry and ultra-high performance liquid chromatography-tandem triple quadrupole mass spectrometry].

    PubMed

    Gao, Meng; Wang, Yuesheng; Wei, Huizhen; Ouyang, Hui; He, Mingzhen; Zeng, Lianqing; Shen, Fengyun; Guo, Qiang; Rao, Yi

    2014-06-01

    A method was developed for the determination of amygdalin and its metabolite prunasin in rat plasma after intragastric administration of Maxing shigan decoction. The analytes were identified by ultra-high performance liquid chromatography-tandem quadrupole time of flight mass spectrometry and quantitatively determined by ultra-high performance liquid chromatography-tandem triple quadrupole mass spectrometry. After purification by liquid-liquid extraction, the qualitative analysis of amygdalin and prunasin in the plasma sample was performed on a Shim-pack XR-ODS III HPLC column (75 mm x 2.0 mm, 1.6 microm), using acetonitrile-0.1% (v/v) formic acid aqueous solution. The detection was performed on a Triple TOF 5600 quadrupole time of flight mass spectrometer. The quantitative analysis of amygdalin and prunasin in the plasma sample was performed by separation on an Agilent C18 HPLC column (50 mm x 2.1 mm, 1.7 microm), using acetonitrile-0.1% (v/v) formic acid aqueous solution. The detection was performed on an AB Q-TRAP 4500 triple quadrupole mass spectrometer utilizing an electrospray ionization (ESI) interface operated in negative ion mode and multiple-reaction monitoring (MRM) mode. The qualitative analysis results showed that amygdalin and its metabolite prunasin were detected in the plasma sample. The quantitative analysis results showed that the linear range of amygdalin was 1.05-4200 ng/mL with a correlation coefficient of 0.9990, and the linear range of prunasin was 1.25-2490 ng/mL with a correlation coefficient of 0.9970. The method had good precision, with relative standard deviations (RSDs) lower than 9.20%, and the overall recoveries varied from 82.33% to 95.25%. The limits of detection (LODs) of amygdalin and prunasin were 0.50 ng/mL. With good reproducibility, the method is simple, fast and effective for the qualitative and quantitative analysis of amygdalin and prunasin in plasma samples of rats administered Maxing shigan decoction.

  11. On Extended Dissipativity of Discrete-Time Neural Networks With Time Delay.

    PubMed

    Feng, Zhiguang; Zheng, Wei Xing

    2015-12-01

    In this brief, the problem of extended dissipativity analysis for discrete-time neural networks with time-varying delay is investigated. The definition of extended dissipativity of discrete-time neural networks is proposed, which unifies several performance measures, such as the H∞ performance, passivity, l2 - l∞ performance, and dissipativity. By introducing a triple-summable term in the Lyapunov function, the reciprocally convex approach is utilized to bound the forward difference of the triple-summable term, and the extended dissipativity criterion for discrete-time neural networks with time-varying delay is then established. The derived condition guarantees not only the extended dissipativity but also the stability of the neural networks. Two numerical examples are given to demonstrate the reduced conservatism and effectiveness of the obtained results.
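    The "extended dissipativity" notion that unifies these measures is commonly written in the literature as a single weighted performance inequality; the sketch below follows that common form, so the precise assumptions on the weighting matrices in this particular brief may differ.

```latex
% Extended dissipativity (standard form in the literature): for output z(k),
% disturbance w(k), and given matrices \Psi_1 \le 0, \Psi_2, \Psi_3 > 0,
% \Psi_4 \ge 0, require for all T \ge 0
\[
  \sum_{k=0}^{T} \big( z^{T}(k)\,\Psi_1\, z(k) + 2\, z^{T}(k)\,\Psi_2\, w(k)
      + w^{T}(k)\,\Psi_3\, w(k) \big)
  \;\ge\; \sup_{0 \le k \le T} z^{T}(k)\,\Psi_4\, z(k).
\]
% Special cases by choice of (\Psi_1, \Psi_2, \Psi_3, \Psi_4):
%   H-infinity performance:     (-I,\; 0,\; \gamma^{2} I,\; 0)
%   l_2\text{--}l_\infty:        (0,\; 0,\; \gamma^{2} I,\; I)
%   Passivity:                   (0,\; I,\; \gamma I,\; 0)
%   (Q,S,R)-dissipativity:       (Q,\; S,\; R - \alpha I,\; 0)
```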

  12. Simulation of Transcritical CO2 Refrigeration System with Booster Hot Gas Bypass in Tropical Climate

    NASA Astrophysics Data System (ADS)

    Santosa, I. D. M. C.; Sudirman; Waisnawa, IGNS; Sunu, PW; Temaja, IW

    2018-01-01

    Computer simulation is important for performance analysis, because building an experimental rig demands a large allocation of cost and time, especially for a CO2 refrigeration system, and modifying the rig requires additional cost and time as well. One simulation program well suited to refrigeration systems is the Engineering Equation Solver (EES). CO2 refrigeration has become a development priority for environmental reasons, since carbon dioxide (CO2) is a natural, clean refrigerant. This study aims to analyze the effectiveness of an EES simulation of a transcritical CO2 refrigeration system with booster hot gas bypass at high outdoor temperatures. The research was carried out through theoretical study and numerical analysis of the refrigeration system using the EES program. Data input and simulation validation were obtained from experimental and secondary data. The results showed that the coefficient of performance (COP) decreased gradually as the outdoor temperature increased, and that the program can calculate the performance of the refrigeration system accurately with a quick running time. The simulation therefore offers a useful preliminary reference for improving CO2 refrigeration system design for hot climates.
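    The COP trend reported above comes from the EES model's energy balances; the same quantity can be sketched from cycle enthalpies alone. The enthalpy values below are hypothetical placeholders, not outputs of the paper's simulation.

```python
def simple_cop(h1, h2, h3):
    """COP of an idealized single-stage cycle from specific enthalpies in
    kJ/kg: h1 = evaporator outlet, h2 = compressor outlet, h3 = gas-cooler
    outlet (isenthalpic throttling, so h4 = h3)."""
    q_evap = h1 - h3   # specific refrigerating effect
    w_comp = h2 - h1   # specific compressor work
    return q_evap / w_comp

# Hypothetical enthalpies for illustration (not from the paper's EES model):
cop = simple_cop(435.0, 480.0, 300.0)
```

    As the gas-cooler outlet enthalpy h3 rises with outdoor temperature, q_evap shrinks while w_comp grows, which is the mechanism behind the simulated COP falling at higher outdoor temperatures.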

  13. Application of Ultra-performance Liquid Chromatography with Time-of-Flight Mass Spectrometry for the Rapid Analysis of Constituents and Metabolites from the Extracts of Acanthopanax senticosus Harms Leaf

    PubMed Central

    Zhang, Yingzhi; Zhang, Aihua; Zhang, Ying; Sun, Hui; Meng, Xiangcai; Yan, Guangli; Wang, Xijun

    2016-01-01

    Acanthopanax senticosus (Rupr and Maxim) Harms (AS), a member of the Araliaceae family, is a typical folk medicinal herb that is widely distributed in the northeastern part of China. Because extensive use of its root has made the resource scarce, this work studied the chemical constituents of the plant's leaves with the purpose of finding an alternative resource. A fast, optimized ultra-performance liquid chromatography method with quadrupole time-of-flight mass spectrometry (UPLC-QTOF-MS) was developed for the analysis of constituents in leaf extracts. A total of 131 compounds were identified or tentatively characterized, including triterpenoid saponins, phenols, flavonoids, lignans, coumarins, polysaccharides, and other compounds, based on their fragmentation behaviors. In addition, a total of 21 metabolites were identified in rat serum after oral administration, comprising 12 prototypes and 9 metabolites formed through the metabolic pathways of reduction, methylation, sulfate conjugation, sulfoxide-to-thioether conversion, and deglycosylation. UPLC-QTOF-MS enabled the in-depth characterization of the leaf extracts of AS both in vitro and in vivo on the basis of retention time, mass accuracy, and tandem MS/MS spectra. It was concluded that this analytical tool is very valuable in the study of complex compounds in medicinal herbs. HIGHLIGHT OF PAPER: A fast UPLC-QTOF-MS method was developed for analysis of constituents in leaf extracts. A total of 131 compounds were identified in leaf extracts. A total of 21 metabolites, comprising 12 prototypes and 9 metabolites, were identified in vivo. SUMMARY: Constituent analysis of Acanthopanax senticosus Harms leaf by ultra-performance liquid chromatography with quadrupole time-of-flight mass spectrometry.
Abbreviations used: AS: Acanthopanax senticosus (Rupr and Maxim) Harms; TCHM: Traditional Chinese herbal medicine; UPLC-QTOF-MS: Ultra-performance liquid chromatography with quadrupole time-of-flight mass spectrometry; MS/MS: Tandem mass spectrometry; PCA: Principal component analysis; PLS-DA: Partial least squares discriminant analysis; OPLS-DA: Orthogonal projection to latent structure-discriminant analysis. PMID:27076752

  14. Runtime Speculative Software-Only Fault Tolerance

    DTIC Science & Technology

    2012-06-01

    reliability of RSFT, an in-depth analysis of its window of vulnerability is also discussed and measured via simulated fault injection. The performance...propagation of faults through the entire program. For optimal performance, these techniques have to use heroic alias analysis to find the minimum set of...affect program output. No program source code or alias analysis is needed to analyze the fault propagation ahead of time. 2.3 Limitations of Existing

  15. Analysis of statistical and standard algorithms for detecting muscle onset with surface electromyography

    PubMed Central

    Tweedell, Andrew J.; Haynes, Courtney A.

    2017-01-01

    The timing of muscle activity is a commonly applied analytic method to understand how the nervous system controls movement. This study systematically evaluates six classes of standard and statistical algorithms to determine muscle onset in both experimental surface electromyography (EMG) and simulated EMG with a known onset time. Eighteen participants had EMG collected from the biceps brachii and vastus lateralis while performing a biceps curl or knee extension, respectively. Three established methods and three statistical methods for EMG onset were evaluated. Linear envelope, Teager-Kaiser energy operator + linear envelope and sample entropy were the established methods evaluated while general time series mean/variance, sequential and batch processing of parametric and nonparametric tools, and Bayesian changepoint analysis were the statistical techniques used. Visual EMG onset (experimental data) and objective EMG onset (simulated data) were compared with algorithmic EMG onset via root mean square error and linear regression models for stepwise elimination of inferior algorithms. The top algorithms for both data types were analyzed for their mean agreement with the gold standard onset and evaluation of 95% confidence intervals. The top algorithms were all Bayesian changepoint analysis iterations where the parameter of the prior (p0) was zero. The best performing Bayesian algorithms were p0 = 0 and a posterior probability for onset determination at 60–90%. While existing algorithms performed reasonably, the Bayesian changepoint analysis methodology provides greater reliability and accuracy when determining the singular onset of EMG activity in a time series. Further research is needed to determine if this class of algorithms perform equally well when the time series has multiple bursts of muscle activity. PMID:28489897
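    The Bayesian changepoint approach that performed best above can be sketched for the single-onset case: score every possible split of the series under a two-segment Gaussian model and normalize the scores into a posterior. This is a simplified profile-likelihood sketch with a flat prior over split points, not the authors' exact algorithm or their prior parameter p0.

```python
import math

def changepoint_posterior(x):
    """Posterior over the index of a single mean-shift changepoint, using a
    Gaussian profile likelihood and a flat prior over split points."""
    n = len(x)
    logliks = []
    for t in range(1, n):                     # split the series as x[:t] | x[t:]
        left, right = x[:t], x[t:]
        m1 = sum(left) / t
        m2 = sum(right) / (n - t)
        rss = (sum((v - m1) ** 2 for v in left)
               + sum((v - m2) ** 2 for v in right))
        logliks.append(-0.5 * n * math.log(rss / n + 1e-12))
    mx = max(logliks)                         # softmax-normalize stably
    weights = [math.exp(v - mx) for v in logliks]
    z = sum(weights)
    return [w / z for w in weights]           # posterior[t-1] = P(change at t)

# Quiet baseline followed by a burst of activity; the posterior should peak
# at the onset (sample index 10 in this toy signal).
signal = [0.1, -0.2, 0.0, 0.2, -0.1, 0.1, 0.0, -0.1, 0.2, 0.0,
          2.1, 1.8, 2.2, 1.9, 2.0, 2.1, 1.7, 2.3, 2.0, 1.9]
post = changepoint_posterior(signal)
onset = post.index(max(post)) + 1
```

    Picking the maximum-posterior split (or thresholding the posterior at, say, 60-90% as the paper's best iterations did) yields the estimated muscle onset.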

  16. Laser Microprobe Mass Spectrometry 1: Basic Principles and Performance Characteristics.

    ERIC Educational Resources Information Center

    Denoyer, Eric; And Others

    1982-01-01

    Describes the historical development, performance characteristics (sample requirements, analysis time, ionization characteristics, speciation capabilities, and figures of merit), and applications of laser microprobe mass spectrometry. (JN)

  17. Real-time Medical Emergency Response System: Exploiting IoT and Big Data for Public Health.

    PubMed

    Rathore, M Mazhar; Ahmad, Awais; Paul, Anand; Wan, Jiafu; Zhang, Daqiang

    2016-12-01

    Healthy people are important for any nation's development. Use of the Internet of Things (IoT)-based body area networks (BANs) is increasing for continuous monitoring and medical healthcare in order to perform real-time actions in case of emergencies. However, in the case of monitoring the health of all citizens or people in a country, the millions of sensors attached to human bodies generate massive volume of heterogeneous data, called "Big Data." Processing Big Data and performing real-time actions in critical situations is a challenging task. Therefore, in order to address such issues, we propose a Real-time Medical Emergency Response System that involves IoT-based medical sensors deployed on the human body. Moreover, the proposed system consists of the data analysis building, called "Intelligent Building," depicted by the proposed layered architecture and implementation model, and it is responsible for analysis and decision-making. The data collected from millions of body-attached sensors is forwarded to Intelligent Building for processing and for performing necessary actions using various units such as collection, Hadoop Processing (HPU), and analysis and decision. The feasibility and efficiency of the proposed system are evaluated by implementing the system on Hadoop using an Ubuntu 14.04 LTS Core i5 machine. Various medical sensory datasets and real-time network traffic are considered for evaluating the efficiency of the system. The results show that the proposed system has the capability of efficiently processing WBAN sensory data from millions of users in order to perform real-time responses in case of emergencies.

  18. A multiprocessing architecture for real-time monitoring

    NASA Technical Reports Server (NTRS)

    Schmidt, James L.; Kao, Simon M.; Read, Jackson Y.; Weitzenkamp, Scott M.; Laffey, Thomas J.

    1988-01-01

    A multitasking architecture for performing real-time monitoring and analysis using knowledge-based problem solving techniques is described. To handle asynchronous inputs and perform in real time, the system consists of three or more distributed processes which run concurrently and communicate via a message passing scheme. The Data Management Process acquires, compresses, and routes the incoming sensor data to other processes. The Inference Process consists of a high performance inference engine that performs a real-time analysis on the state and health of the physical system. The I/O Process receives sensor data from the Data Management Process and status messages and recommendations from the Inference Process, updates its graphical displays in real time, and acts as the interface to the console operator. The distributed architecture has been interfaced to an actual spacecraft (NASA's Hubble Space Telescope) and is able to process the incoming telemetry in real-time (i.e., several hundred data changes per second). The system is being used in two locations for different purposes: (1) in Sunnyvale, California at the Space Telescope Test Control Center it is used in the preflight testing of the vehicle; and (2) in Greenbelt, Maryland at NASA/Goddard it is being used on an experimental basis in flight operations for health and safety monitoring.
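    The three-process message-passing scheme described in this record can be sketched with queues; threads stand in here for the distributed processes, and the message formats and the toy inference rule are invented for illustration, not taken from the actual system.

```python
import queue
import threading

def data_management(raw, to_inference, to_io):
    """Acquire sensor samples and route them to the other processes."""
    for sample in raw:
        to_inference.put(sample)
        to_io.put(("data", sample))          # raw telemetry for the displays
    to_inference.put(None)                   # shutdown sentinel

def inference(to_inference, to_io):
    """Stand-in for the inference engine: classify each sample's health."""
    while (sample := to_inference.get()) is not None:
        status = "ALERT" if sample > 0.9 else "ok"
        to_io.put(("status", status))
    to_io.put(None)                          # propagate shutdown

def io_process(to_io, log):
    """Stand-in for the operator interface: record everything it receives."""
    while (msg := to_io.get()) is not None:
        log.append(msg)                      # a real system updates displays

q_inf, q_io, log = queue.Queue(), queue.Queue(), []
threads = [
    threading.Thread(target=data_management, args=([0.2, 0.95, 0.4], q_inf, q_io)),
    threading.Thread(target=inference, args=(q_inf, q_io)),
    threading.Thread(target=io_process, args=(q_io, log)),
]
for t in threads: t.start()
for t in threads: t.join()
```

    Because each stage owns its own queue, the stages run concurrently and tolerate asynchronous input rates, which is the point of the architecture.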

  19. A Preliminary Model for Spacecraft Propulsion Performance Analysis Based on Nuclear Gain and Subsystem Mass-Power Balances

    NASA Technical Reports Server (NTRS)

    Chakrabarti, Suman; Schmidt, George R.; Thio, Y. C.; Hurst, Chantelle M.

    1999-01-01

    A preliminary model for spacecraft propulsion performance analysis based on nuclear gain and subsystem mass-power balances is presented in viewgraph form. For very fast missions with straight-line trajectories, it has been shown that mission trip time is proportional to the cube root of alpha. Analysis of spacecraft power systems via a power balance and examination of gain vs. mass-power ratio has shown: 1) A minimum gain is needed to have enough power for thruster and driver operation; and 2) Increases in gain result in decreases in overall mass-power ratio, which in turn leads to greater achievable accelerations. However, subsystem mass-power ratios and efficiencies are crucial: less efficient values for these can partially offset the effect of nuclear gain. Therefore, it is of interest to monitor the progress of gain-limited subsystem technologies, and it is also possible that power-limited systems with sufficiently low alpha may be competitive for such ambitious missions. Topics include Space flight requirements; Spacecraft energy gain; Control theory for performance; Mission assumptions; Round trips: Time and distance; Trip times; Vehicle acceleration; and Minimizing trip times.

  20. Efficiency of bowel preparation for capsule endoscopy examination: a meta-analysis.

    PubMed

    Niv, Yaron

    2008-03-07

    Good preparation before endoscopic procedures is essential for successful visualization. The small bowel is difficult to evaluate because of its length and complex configuration. A meta-analysis was conducted of studies comparing small bowel visualization by capsule endoscopy with and without preparation. Medical databases were searched for all studies investigating the preparation for capsule endoscopy of the small bowel up to July 31, 2007. Studies that scored bowel cleanness and measured gastric and small bowel transit time and rate of cecum visualization were included. The primary endpoint was the quality of bowel visualization. The secondary endpoints were transit times and proportion of examinations that demonstrated the cecum, with and without preparation. Meta-analysis was performed with StatsDirect statistical software, version 2.6.1 (http://statsdirect.com). Eight studies met the inclusion criteria. Bowel visualization was scored as "good" in 78% of the examinations performed with preparation and 49% performed without (P<0.0001). There were no significant differences in transit times or in the proportion of examinations that demonstrated the cecum with and without preparation. Capsule endoscopy preparation improves the quality of small bowel visualization, but has no effect on transit times, or demonstration of the cecum.
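    The pooling step of such a meta-analysis can be sketched generically. The code below performs fixed-effect inverse-variance pooling of risk differences on made-up study counts; it is not the model StatsDirect applied, nor the paper's actual data.

```python
import math

def pooled_risk_difference(studies):
    """Fixed-effect inverse-variance pooling of risk differences.
    Each study is a tuple (events_prep, n_prep, events_no_prep, n_no_prep)."""
    num = den = 0.0
    for e1, n1, e0, n0 in studies:
        p1, p0 = e1 / n1, e0 / n0
        var = p1 * (1 - p1) / n1 + p0 * (1 - p0) / n0  # variance of (p1 - p0)
        w = 1.0 / var                                  # inverse-variance weight
        num += w * (p1 - p0)
        den += w
    rd = num / den
    se = math.sqrt(1.0 / den)
    return rd, (rd - 1.96 * se, rd + 1.96 * se)        # pooled RD and 95% CI

# Hypothetical study counts (not the eight trials from this meta-analysis):
rd, ci = pooled_risk_difference([(39, 50, 25, 50), (60, 80, 41, 80)])
```

    Larger, less variable studies get proportionally more weight, and the pooled confidence interval narrows as studies accumulate.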

  1. Efficiency of bowel preparation for capsule endoscopy examination: A meta-analysis

    PubMed Central

    Niv, Yaron

    2008-01-01

    Good preparation before endoscopic procedures is essential for successful visualization. The small bowel is difficult to evaluate because of its length and complex configuration. A meta-analysis was conducted of studies comparing small bowel visualization by capsule endoscopy with and without preparation. Medical databases were searched for all studies investigating the preparation for capsule endoscopy of the small bowel up to July 31, 2007. Studies that scored bowel cleanness and measured gastric and small bowel transit time and rate of cecum visualization were included. The primary endpoint was the quality of bowel visualization. The secondary endpoints were transit times and proportion of examinations that demonstrated the cecum, with and without preparation. Meta-analysis was performed with StatsDirect statistical software, version 2.6.1 (http://statsdirect.com). Eight studies met the inclusion criteria. Bowel visualization was scored as “good” in 78% of the examinations performed with preparation and 49% performed without (P < 0.0001). There were no significant differences in transit times or in the proportion of examinations that demonstrated the cecum with and without preparation. Capsule endoscopy preparation improves the quality of small bowel visualization, but has no effect on transit times, or demonstration of the cecum. PMID:18322940

  2. Screen media usage, sleep time and academic performance in adolescents: clustering a self-organizing maps analysis.

    PubMed

    Peiró-Velert, Carmen; Valencia-Peris, Alexandra; González, Luis M; García-Massó, Xavier; Serra-Añó, Pilar; Devís-Devís, José

    2014-01-01

    Screen media usage, sleep time and socio-demographic features are related to adolescents' academic performance, but interrelations are little explored. This paper describes these interrelations and behavioral profiles clustered in low and high academic performance. A nationally representative sample of 3,095 Spanish adolescents, aged 12 to 18, was surveyed on 15 variables linked to the purpose of the study. A Self-Organizing Maps analysis established non-linear interrelationships among these variables and identified behavior patterns in subsequent cluster analyses. Topological interrelationships established from the 15 emerging maps indicated that boys used more passive videogames and computers for playing than girls, who tended to use mobile phones to communicate with others. Adolescents with the highest academic performance were the youngest. They slept more and spent less time using sedentary screen media when compared to those with the lowest performance, and they also showed topological relationships with higher socioeconomic status adolescents. Cluster 1 grouped boys who spent more than 5.5 hours daily using sedentary screen media. Their academic performance was low and they slept an average of 8 hours daily. Cluster 2 gathered girls with an excellent academic performance, who slept nearly 9 hours per day, and devoted less time daily to sedentary screen media. Academic performance was directly related to sleep time and socioeconomic status, but inversely related to overall sedentary screen media usage. Profiles from the two clusters were strongly differentiated by gender, age, sedentary screen media usage, sleep time and academic achievement. Girls with the highest academic results had a medium socioeconomic status in Cluster 2. 
Findings may contribute to establishing recommendations about the timing and duration of screen media usage in adolescents and the sleep time needed to successfully meet the demands of school academics, and to improving interventions aimed at behavioral change.

  3. Screen Media Usage, Sleep Time and Academic Performance in Adolescents: Clustering a Self-Organizing Maps Analysis

    PubMed Central

    Peiró-Velert, Carmen; Valencia-Peris, Alexandra; González, Luis M.; García-Massó, Xavier; Serra-Añó, Pilar; Devís-Devís, José

    2014-01-01

    Screen media usage, sleep time and socio-demographic features are related to adolescents' academic performance, but interrelations are little explored. This paper describes these interrelations and behavioral profiles clustered in low and high academic performance. A nationally representative sample of 3,095 Spanish adolescents, aged 12 to 18, was surveyed on 15 variables linked to the purpose of the study. A Self-Organizing Maps analysis established non-linear interrelationships among these variables and identified behavior patterns in subsequent cluster analyses. Topological interrelationships established from the 15 emerging maps indicated that boys used more passive videogames and computers for playing than girls, who tended to use mobile phones to communicate with others. Adolescents with the highest academic performance were the youngest. They slept more and spent less time using sedentary screen media when compared to those with the lowest performance, and they also showed topological relationships with higher socioeconomic status adolescents. Cluster 1 grouped boys who spent more than 5.5 hours daily using sedentary screen media. Their academic performance was low and they slept an average of 8 hours daily. Cluster 2 gathered girls with an excellent academic performance, who slept nearly 9 hours per day, and devoted less time daily to sedentary screen media. Academic performance was directly related to sleep time and socioeconomic status, but inversely related to overall sedentary screen media usage. Profiles from the two clusters were strongly differentiated by gender, age, sedentary screen media usage, sleep time and academic achievement. Girls with the highest academic results had a medium socioeconomic status in Cluster 2. 
Findings may contribute to establishing recommendations about the timing and duration of screen media usage in adolescents and the sleep time needed to successfully meet the demands of school academics, and to improving interventions aimed at behavioral change. PMID:24941009

  4. Ultra-high performance supercritical fluid chromatography of lignin-derived phenols from alkaline cupric oxide oxidation.

    PubMed

    Sun, Mingzhe; Lidén, Gunnar; Sandahl, Margareta; Turner, Charlotta

    2016-08-01

    Traditional chromatographic methods for the analysis of lignin-derived phenolic compounds in environmental samples are generally time-consuming. In this work, an ultra-high performance supercritical fluid chromatography method with a diode array detector for the analysis of major lignin-derived phenolic compounds produced by alkaline cupric oxide oxidation was developed. In an analysis of a collection of 11 representative monomeric lignin phenolic compounds, all compounds were clearly separated within 6 min with excellent peak shapes, with a limit of detection of 0.5-2.5 μM, a limit of quantification of 2.5-5.0 μM, and a dynamic range of 5.0 μM-2.0 mM (R² > 0.997). The new ultra-high performance supercritical fluid chromatography method was also applied for the qualitative and quantitative analysis of lignin-derived phenolic compounds obtained upon alkaline cupric oxide oxidation of a commercial humic acid. Ten out of the previous eleven model compounds could be quantified in the oxidized humic acid sample. The high separation power and short analysis time obtained demonstrate for the first time that supercritical fluid chromatography is a fast and reliable technique for the analysis of lignin-derived phenols in complex environmental samples. © 2016 The Authors, Journal of Separation Science Published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Analysis of anthocyanins in commercial fruit juices by using nano-liquid chromatography-electrospray-mass spectrometry and high-performance liquid chromatography with UV-vis detector.

    PubMed

    Fanali, Chiara; Dugo, Laura; D'Orazio, Giovanni; Lirangi, Melania; Dachà, Marina; Dugo, Paola; Mondello, Luigi

    2011-01-01

    Nano-LC and conventional HPLC techniques were applied for the analysis of anthocyanins present in commercial fruit juices using a capillary column of 100 μm id and a 2.1 mm id narrow-bore C(18) column. Analytes were detected by UV-Vis at 518 nm and ESI-ion trap MS with HPLC and nano-LC, respectively. Commercial blueberry juice (14 anthocyanins detected) was used to optimize chromatographic separation of analytes and other analysis parameters. Qualitative identification of anthocyanins was performed by comparing the recorded mass spectral data with those of published papers. The use of the same mobile phase composition in both techniques revealed that the miniaturized method exhibited shorter analysis time and higher sensitivity than narrow-bore chromatography. Good intra-day and day-to-day precision of retention time was obtained in both methods with values of RSD less than 3.4 and 0.8% for nano-LC and HPLC, respectively. Quantitative analysis was performed by external standard curve calibration of cyanidin-3-O-glucoside standard. Calibration curves were linear in the concentration ranges studied, 0.1-50 and 6-50 μg/mL for HPLC-UV/Vis and nano-LC-MS, respectively. LOD and LOQ values were good for both methods. In addition to commercial blueberry juice, qualitative and quantitative analysis of other juices (e.g. raspberry, sweet cherry and pomegranate) was performed. The optimized nano-LC-MS method allowed an easy and selective identification and quantification of anthocyanins in commercial fruit juices; it offered good results, shorter analysis time and reduced mobile phase volume with respect to narrow-bore HPLC. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
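    The external-standard calibration used for quantification works as follows: fit peak area against standard concentration by least squares, then invert the line for an unknown. The standard concentrations and areas below are hypothetical numbers, not the paper's cyanidin-3-O-glucoside calibration data.

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept for an external calibration
    curve (concentration x vs. detector response y)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical standards (µg/mL) and peak areas (arbitrary units):
conc = [0.1, 1.0, 5.0, 10.0, 25.0, 50.0]
area = [12.0, 105.0, 516.0, 1030.0, 2540.0, 5090.0]
slope, intercept = fit_line(conc, area)

# Quantify an unknown juice sample from its measured peak area:
unknown = (980.0 - intercept) / slope
```

    In practice the fit is only trusted inside the calibrated range (here 0.1-50 µg/mL), matching how the paper reports a linear range for each method.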

  6. MOVANAID: An Interactive Aid for Analysis of Movement Capabilities.

    ERIC Educational Resources Information Center

    Cooper, George E.; And Others

    A computer-driven interactive aid for movement analysis, called MOVANAID, has been developed to be of assistance in the performance of certain Army intelligence processing tasks in a tactical environment. It can compute fastest travel times and paths through road networks for military units of various types, as well as fastest times in which…
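    Fastest travel times through a road network, as MOVANAID computes, are a classic shortest-path problem. The record does not say which algorithm the aid uses, so the Dijkstra sketch below is only one plausible approach, run on an invented toy network.

```python
import heapq

def fastest_time(graph, start, goal):
    """Dijkstra's algorithm over a road network whose edge weights are
    travel times; returns the minimum travel time from start to goal."""
    dist = {start: 0.0}
    pq = [(0.0, start)]                      # (time so far, junction)
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            return d
        if d > dist.get(node, float("inf")):
            continue                         # stale queue entry
        for nxt, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(pq, (nd, nxt))
    return float("inf")

# Toy road network: travel times in minutes between junctions.
roads = {"A": [("B", 10), ("C", 25)],
         "B": [("C", 10), ("D", 40)],
         "C": [("D", 20)]}
```

    Here fastest_time(roads, "A", "D") follows A-B-C-D in 40 minutes, beating the direct A-B-D route; the same weighting could encode unit-type-specific speeds.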

  7. Load Balancing Using Time Series Analysis for Soft Real Time Systems with Statistically Periodic Loads

    NASA Technical Reports Server (NTRS)

    Hailperin, M.

    1993-01-01

    This thesis provides design and analysis of techniques for global load balancing on ensemble architectures running soft-real-time object-oriented applications with statistically periodic loads. It focuses on estimating the instantaneous average load over all the processing elements. The major contribution is the use of explicit stochastic process models for both the loading and the averaging itself. These models are exploited via statistical time-series analysis and Bayesian inference to provide improved average load estimates, and thus to facilitate global load balancing. This thesis explains the distributed algorithms used and provides some optimality results. It also describes the algorithms' implementation and gives performance results from simulation. These results show that the authors' techniques allow more accurate estimation of the global system loading, resulting in fewer object migrations than local methods. The authors' method is shown to provide superior performance, relative not only to static load-balancing schemes but also to many adaptive load-balancing methods. Results from a preliminary analysis of another system and from simulation with a synthetic load provide some evidence of more general applicability.
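    The thesis estimates the instantaneous average load with explicit stochastic-process models and Bayesian inference; as a far simpler stand-in for the same estimation task, an exponentially weighted moving average over per-processor load snapshots looks like this (the smoothing factor and sample data are arbitrary).

```python
def smoothed_average_load(samples, alpha=0.3):
    """Exponentially weighted estimate of the instantaneous average load over
    all processing elements; samples is a sequence of per-PE load snapshots,
    one snapshot per time step."""
    est = None
    history = []
    for per_pe_loads in samples:
        mean_load = sum(per_pe_loads) / len(per_pe_loads)
        est = mean_load if est is None else alpha * mean_load + (1 - alpha) * est
        history.append(est)
    return history

# Two processing elements sampled at three time steps:
ests = smoothed_average_load([[2, 4], [3, 5], [10, 2]])
```

    A migration policy would then move work from processors loaded well above the current estimate; the thesis's stochastic-process models play the role this EWMA plays here, but with principled uncertainty handling.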

  8. Aging analysis of high performance FinFET flip-flop under Dynamic NBTI simulation configuration

    NASA Astrophysics Data System (ADS)

    Zainudin, M. F.; Hussin, H.; Halim, A. K.; Karim, J.

    2018-03-01

    Negative-bias temperature instability (NBTI) is a mechanism that degrades the main electrical parameters of a circuit, especially its performance. Circuit designs available at present focus only on high performance, without considering circuit reliability and robustness. In this paper, the main performance metrics of a high-performance FinFET flip-flop, such as delay time and power, were studied in the presence of NBTI degradation. The aging analysis was verified using a 16 nm High Performance Predictive Technology Model (PTM) with different commands available in Synopsys HSPICE. The results show that longer dynamic NBTI simulation produces the largest increase in gate delay and the largest reduction in average power, relative to a fresh simulation, over the aged stress time under nominal conditions. In addition, the circuit performance under varied stress conditions, such as temperature and negative gate stress bias, was also studied.

  9. Distributed intelligent data analysis in diabetic patient management.

    PubMed Central

    Bellazzi, R.; Larizza, C.; Riva, A.; Mira, A.; Fiocchi, S.; Stefanelli, M.

    1996-01-01

    This paper outlines the methodologies that can be used to perform an intelligent analysis of diabetic patients' data, realized in a distributed management context. We present a decision-support system architecture based on two modules, a Patient Unit and a Medical Unit, connected by telecommunication services. We stress the necessity to resort to temporal abstraction techniques, combined with time series analysis, in order to provide useful advice to patients; finally, we outline how data analysis and interpretation can be cooperatively performed by the two modules. PMID:8947655

  10. Cost-Effectiveness Analysis of Microscopic and Endoscopic Transsphenoidal Surgery Versus Medical Therapy in the Management of Microprolactinoma in the United States.

    PubMed

    Jethwa, Pinakin R; Patel, Tapan D; Hajart, Aaron F; Eloy, Jean Anderson; Couldwell, William T; Liu, James K

    2016-03-01

    Although prolactinomas are treated effectively with dopamine agonists, some have proposed curative surgical resection for select cases of microprolactinomas to avoid life-long medical therapy. We performed a cost-effectiveness analysis comparing transsphenoidal surgery (either microsurgical or endoscopic) and medical therapy (either bromocriptine or cabergoline) with decision analysis modeling. A 2-armed decision tree was created with TreeAge Pro Suite 2012 to compare upfront transsphenoidal surgery versus medical therapy. The economic perspective was that of the health care third-party payer. On the basis of a literature review, we assigned plausible distributions for costs and utilities to each potential outcome, taking into account medical and surgical costs and complications. Base-case analysis, sensitivity analysis, and Monte Carlo simulations were performed to determine the cost-effectiveness of each strategy at 5-year and 10-year time horizons. In the base-case scenario, microscopic transsphenoidal surgery was the most cost-effective option at 5 years from the time of diagnosis; however, by the 10-year time horizon, endoscopic transsphenoidal surgery became the most cost-effective option. At both time horizons, both medical therapies (bromocriptine and cabergoline) were found to be more costly and less effective than transsphenoidal surgery (i.e., the medical arm was dominated by the surgical arm in this model). Two-way sensitivity analysis demonstrated that endoscopic resection would be the most cost-effective strategy if the cure rate from endoscopic surgery was greater than 90% and the complication rate was less than 1%. Monte Carlo simulation was performed for endoscopic surgery versus microscopic surgery at both time horizons.
This analysis produced an incremental cost-effectiveness ratio of $80,235 per quality-adjusted life years at 5 years and $40,737 per quality-adjusted life years at 10 years, implying that with increasing time intervals, endoscopic transsphenoidal surgery is the more cost-effective treatment strategy. On the basis of the results of our model, transsphenoidal surgical resection of microprolactinomas, either microsurgical or endoscopic, appears to be more cost-effective than life-long medical therapy in young patients with life expectancy greater than 10 years. We caution that surgical resection for microprolactinomas be performed only in select cases by experienced pituitary surgeons at high-volume centers with high biochemical cure rates and low complication rates. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. Combined cumulative sum (CUSUM) and chronological environmental analysis as a tool to improve the learning environment for linear-probe endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA) trainees: a pilot study.

    PubMed

    Norisue, Yasuhiro; Tokuda, Yasuharu; Juarez, Mayrol; Uchimido, Ryo; Fujitani, Shigeki; Stoeckel, David A

    2017-02-07

    Cumulative sum (CUSUM) analysis can be used to continuously monitor the performance of an individual or process and detect deviations from a preset or standard level of achievement. However, no previous study has evaluated the utility of CUSUM analysis in facilitating timely environmental assessment and interventions to improve performance of linear-probe endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA). The aim of this study was to evaluate the usefulness of combined CUSUM and chronological environmental analysis as a tool to improve the learning environment for EBUS-TBNA trainees. This study was an observational chart review. To determine if performance was acceptable, CUSUM analysis was used to track procedural outcomes of trainees in EBUS-TBNA. To investigate chronological changes in the learning environment, multivariate logistic regression analysis was used to compare several indices before and after time points when significant changes occurred in proficiency. Presence of an additional attending bronchoscopist was inversely associated with nonproficiency (odds ratio, 0.117; 95% confidence interval, 0-0.749; P = 0.019). Other factors, including presence of an on-site cytopathologist and dose of sedatives used, were not significantly associated with duration of nonproficiency. Combined CUSUM and chronological environmental analysis may be useful in hastening interventions that improve performance of EBUS-TBNA.
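    The CUSUM monitoring the authors applied can be illustrated with a generic failure-count CUSUM; the function name, scoring rule, and threshold below are illustrative assumptions, not the study's exact formulation.

```python
def cusum(outcomes, p0, h):
    """Track a trainee's failure-count CUSUM.

    outcomes: sequence of 0 (success) / 1 (failure), one per procedure
    p0: acceptable (preset standard) failure rate
    h:  decision threshold; crossing it signals nonproficiency
    Returns the running scores and the indices where h is crossed.
    """
    scores, crossings = [], []
    s = 0.0
    for i, x in enumerate(outcomes):
        s = max(0.0, s + (x - p0))  # failures push the score up, successes pull it down
        scores.append(s)
        if s > h:
            crossings.append(i)
    return scores, crossings
```

Plotting `scores` against the threshold gives a learning curve of the kind the authors track; a crossing would prompt a chronological review of what changed in the learning environment around that procedure.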

  12. Analysis of ultra-triathlon performances

    PubMed Central

    Lepers, Romuald; Knechtle, Beat; Knechtle, Patrizia; Rosemann, Thomas

    2011-01-01

    Despite increased interest in ultra-endurance events, little research has examined ultra-triathlon performance. The aims of this study were: (i) to compare swimming, cycling, running, and overall performances in three ultra-distance triathlons, the double Ironman distance triathlon (2IMT) (7.6 km swimming, 360 km cycling, and 84.4 km running), the triple Ironman distance triathlon (3IMT) (11.4 km, 540 km, and 126.6 km), and the deca Ironman distance triathlon (10IMT) (38 km, 1800 km, and 420 km), and (ii) to examine the relationships between the 2IMT, 3IMT, and 10IMT performances to create predictive equations for 10IMT performances. Race results from 1985 through 2009 were examined to identify triathletes who had performed all three ultra-distances. In total, 73 triathletes (68 men and 5 women) were identified. The contribution of swimming to overall ultra-triathlon performance was lower than that of cycling and running. Running performance was more important to overall performance for the 2IMT and 3IMT than for the 10IMT. The 2IMT and 3IMT performances were significantly correlated with 10IMT performances for swimming and cycling, but not for running. 10IMT total time might be predicted by the following equation: 10IMT race time (minutes) = 5885 + 3.69 × 3IMT race time (minutes). This analysis of human performance during ultra-distance triathlons represents a unique data set in the field of ultra-endurance events. Additional studies are required to determine the physiological and psychological factors associated with ultra-triathlon performance. PMID:24198579
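    The reported regression can be applied directly to estimate a deca Ironman finishing time from a triple Ironman result; the helper name below is hypothetical, and the coefficients are taken from the abstract.

```python
def predict_10imt_minutes(t_3imt_minutes):
    """Predicted 10IMT race time (minutes) from a 3IMT race time (minutes),
    using the regression reported in the abstract."""
    return 5885 + 3.69 * t_3imt_minutes

# e.g. a 3IMT finished in 2000 minutes (~33 h) predicts roughly
# 5885 + 7380 = 13265 minutes (~221 h) for the 10IMT
```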

  13. NASCOM network: Ground communications reliability report

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A reliability performance analysis of the NASCOM Network circuits is reported. A narrative summary of network performance covers significant changes in circuit configurations, current figures, and trends in each trouble category, with notable circuit totals specified. Tables of lost time and interruptions list the circuits affected by outages and their totals by category. A special analysis of circuits with low reliabilities is presented, with tables depicting performance and graphs of individual reliabilities.

  14. Statistical performance and information content of time lag analysis and redundancy analysis in time series modeling.

    PubMed

    Angeler, David G; Viedma, Olga; Moreno, José M

    2009-11-01

    Time lag analysis (TLA) is a distance-based approach used to study temporal dynamics of ecological communities by measuring community dissimilarity over increasing time lags. Despite its increased use in recent years, its performance in comparison with other more direct methods (i.e., canonical ordination) has not been evaluated. This study fills this gap using extensive simulations and real data sets from experimental temporary ponds (true zooplankton communities) and landscape studies (landscape categories as pseudo-communities) that differ in community structure and anthropogenic stress history. Modeling time with a principal coordinate of neighborhood matrices (PCNM) approach, the canonical ordination technique (redundancy analysis; RDA) consistently outperformed the other statistical tests (i.e., TLAs, Mantel test, and RDA based on linear time trends) using all real data. In addition, the RDA-PCNM revealed different patterns of temporal change, and the strength of each individual time pattern, in terms of adjusted variance explained, could be evaluated. It also identified species contributions to these patterns of temporal change. This additional information is not provided by distance-based methods. The simulation study revealed better Type I error properties of the canonical ordination techniques compared with the distance-based approaches when no deterministic component of change was imposed on the communities. The simulation also revealed that strong emphasis on uniform deterministic change and low variability at other temporal scales is needed to result in decreased statistical power of the RDA-PCNM approach relative to the other methods. Based on the statistical performance of and information content provided by RDA-PCNM models, this technique serves ecologists as a powerful tool for modeling temporal change of ecological (pseudo-) communities.
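    The core of TLA is easy to sketch: compute a community dissimilarity for every pair of samples separated by a given lag, then average per lag; regressing that curve on lag summarizes temporal change. Bray-Curtis is used below as one common dissimilarity measure, which may differ from the metric used in the study.

```python
def bray_curtis(a, b):
    """Bray-Curtis dissimilarity between two abundance vectors."""
    num = sum(abs(x - y) for x, y in zip(a, b))
    den = sum(x + y for x, y in zip(a, b))
    return num / den if den else 0.0

def time_lag_analysis(samples):
    """Mean dissimilarity for each time lag 1..n-1 over a series of
    community samples (one abundance vector per time step)."""
    n = len(samples)
    return {
        lag: sum(bray_curtis(samples[t], samples[t + lag])
                 for t in range(n - lag)) / (n - lag)
        for lag in range(1, n)
    }
```

A community drifting steadily in one direction yields dissimilarity that rises with lag; a flat curve indicates fluctuation without directional change.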

  15. Nano-JASMINE and small-JASMINE data analysis

    NASA Astrophysics Data System (ADS)

    Yamada, Yoshiyuki; Shirasaki, Yuji; Nishi, Ryoichi

    2018-04-01

    The space astrometry missions Nano-JASMINE and small-JASMINE are planned in Japan. Data analysis tasks have been performed over a long period under the Gaia-JASMINE collaboration. The expected astrometric accuracy is 3 mas for Nano-JASMINE and 20 microarcseconds for small-JASMINE. Publication of and instruction on Gaia DR1 are provided by NAOJ and Niigata University.

  16. Improving queuing service at McDonald's

    NASA Astrophysics Data System (ADS)

    Koh, Hock Lye; Teh, Su Yean; Wong, Chin Keat; Lim, Hooi Kie; Migin, Melissa W.

    2014-07-01

    Fast food restaurants are popular among price-sensitive youths and working adults who value the conducive environment and convenient services. McDonald's chains of restaurants promote their sales during lunch hours by offering package meals which are perceived to be inexpensive. These promotional lunch meals attract good response, resulting in occasional long queues and inconvenient waiting times. A study is conducted to monitor the distribution of waiting time, queue length, and customer arrival and departure patterns at a McDonald's restaurant located in Kuala Lumpur. A customer survey is conducted to gauge customers' satisfaction regarding waiting time and queue length. An Android app named Que is developed to perform onsite queuing analysis and report key performance indices. The queuing theory in Que is based upon the Poisson distribution. In this paper, Que is utilized to perform queuing analysis at this McDonald's restaurant with the aim of improving customer service, with particular reference to reducing queuing time and shortening queue length. Some results will be presented.
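    The internals of Que are not given in the abstract; as an illustration of Poisson-based queuing analysis, the textbook M/M/1 model (Poisson arrivals, exponential service, single server) yields the key performance indices from just two measured rates.

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state M/M/1 metrics from customers-per-minute rates."""
    lam, mu = arrival_rate, service_rate
    if lam >= mu:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    rho = lam / mu               # server utilization
    l_q = rho ** 2 / (1 - rho)   # mean number of customers waiting in queue
    w_q = l_q / lam              # mean wait in queue (Little's law)
    return {"utilization": rho, "mean_queue_length": l_q, "mean_wait": w_q}

# e.g. 1.5 arrivals/min against a 2.0/min counter gives 75% utilization,
# an average queue of 2.25 customers, and a 1.5-minute average wait
```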

  17. Performance-based, cost- and time-effective pcb analytical methodology.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alvarado, J. S.

    1998-06-11

    Laboratory applications for the analysis of PCBs (polychlorinated biphenyls) in environmental matrices such as soil/sediment/sludge and oil/waste oil were evaluated for potential reduction in waste, source reduction, and alternative techniques for final determination. As a consequence, new procedures were studied for solvent substitution, miniaturization of extraction and cleanups, minimization of reagent consumption, reduction of cost per analysis, and reduction of time. These new procedures provide adequate data that meet all the performance requirements for the determination of PCBs. Use of the new procedures reduced costs for all sample preparation techniques. Time and cost were also reduced by combining the new sample preparation procedures with the power of fast gas chromatography. Separation of Aroclor 1254 was achieved in less than 6 min by using DB-1 and SPB-608 columns. With the greatly shortened run times, reproducibility can be tested quickly and consequently with low cost. With performance-based methodology, the applications presented here can be applied now, without waiting for regulatory approval.

  18. Independent component analysis for onset detection in piano trills

    NASA Astrophysics Data System (ADS)

    Brown, Judith C.; Todd, Jeremy G.; Smaragdis, Paris

    2002-05-01

    The detection of onsets in piano music is difficult due to the presence of many notes simultaneously and their long decay times from pedaling. This is even more difficult for trills, where the rapid note changes make it difficult to observe a decrease in amplitude for individual notes in either the temporal wave form or the time-dependent Fourier components. Occasionally one note of the trill has a much lower amplitude than the other, making an unambiguous determination of its onset virtually impossible. We have analyzed a number of trills from CDs of performances by Horowitz, Ashkenazy, and Goode, choosing the same trill and different performances where possible. The Fourier transform was calculated as a function of time, and the magnitude coefficients served as input for a calculation using the method of independent component analysis. In most cases this gave a more definitive determination of the onset times, as can be demonstrated graphically. For comparison, identical calculations have been carried out on recordings of MIDI-generated performances on a Yamaha Disklavier piano.

  19. Personal best marathon time and longest training run, not anthropometry, predict performance in recreational 24-hour ultrarunners.

    PubMed

    Knechtle, Beat; Knechtle, Patrizia; Rosemann, Thomas; Lepers, Romuald

    2011-08-01

    In recent studies, a relationship between both low body fat and low thicknesses of selected skinfolds and running performance has been demonstrated for distances from 100 m to the marathon, but not for ultramarathons. We investigated the association of anthropometric and training characteristics with race performance in 63 male recreational ultrarunners in a 24-hour run using bivariate and multivariate analysis. The athletes achieved an average distance of 146.1 (43.1) km. In the bivariate analysis, body mass (r = -0.25), the sum of 9 skinfolds (r = -0.32), the sum of upper body skinfolds (r = -0.34), body fat percentage (r = -0.32), weekly kilometers run (r = 0.31), longest training session before the 24-hour run (r = 0.56), and personal best marathon time (r = -0.58) were related to race performance. Stepwise multiple regression showed that both the longest training session before the 24-hour run (p = 0.0013) and the personal best marathon time (p = 0.0015) had the best correlation with race performance. Performance in these 24-hour runners may be predicted (r2 = 0.46) by the following equation: (performance in a 24-hour run, km) = 234.7 + 0.481 × (longest training session before the 24-hour run, km) - 0.594 × (personal best marathon time, minutes). For practical applications, training variables such as volume and intensity were associated with performance, but anthropometric variables were not. To achieve maximum kilometers in a 24-hour run, recreational ultrarunners should have a personal best marathon time of ∼3 hours 20 minutes and complete a long training run of ∼60 km before the race, whereas anthropometric characteristics such as low body fat or low skinfold thicknesses showed no association with performance.
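    The abstract's regression is directly usable; a minimal sketch, with an illustrative function name and the coefficients taken from the abstract:

```python
def predict_24h_km(longest_training_run_km, marathon_pb_minutes):
    """Predicted 24-hour run distance (km) from the two training
    variables retained by the stepwise regression (r2 = 0.46)."""
    return 234.7 + 0.481 * longest_training_run_km - 0.594 * marathon_pb_minutes

# e.g. a 60 km longest run and a 200-minute (3 h 20 min) marathon PB
# predict about 144.8 km, close to the sample mean of 146.1 km
```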

  20. Children's Sleep and Cognitive Performance: A Cross-Domain Analysis of Change over Time

    ERIC Educational Resources Information Center

    Bub, Kristen L.; Buckhalt, Joseph A.; El-Sheikh, Mona

    2011-01-01

    Relations between changes in children's cognitive performance and changes in sleep problems were examined over a 3-year period, and family socioeconomic status, child race/ethnicity, and gender were assessed as moderators of these associations. Participants were 250 second- and third-grade (8-9 years old at Time 1) boys and girls. At each…

  1. The Endurance of Children's Working Memory: A Recall Time Analysis

    ERIC Educational Resources Information Center

    Towse, John N.; Hitch, Graham J.; Hamilton, Z.; Pirrie, Sarah

    2008-01-01

    We analyze the timing of recall as a source of information about children's performance in complex working memory tasks. A group of 8-year-olds performed a traditional operation span task in which sequence length increased across trials and an operation period task in which processing requirements were extended across trials of constant sequence…

  2. Colonic lesion characterization in inflammatory bowel disease: A systematic review and meta-analysis

    PubMed Central

    Lord, Richard; Burr, Nicholas E; Mohammed, Noor; Subramanian, Venkataraman

    2018-01-01

    AIM To perform a systematic review and meta-analysis for the diagnostic accuracy of in vivo lesion characterization in colonic inflammatory bowel disease (IBD), using optical imaging techniques, including virtual chromoendoscopy (VCE), dye-based chromoendoscopy (DBC), magnification endoscopy and confocal laser endomicroscopy (CLE). METHODS We searched Medline, Embase and the Cochrane library. We performed a bivariate meta-analysis to calculate the pooled estimate sensitivities, specificities, positive and negative likelihood ratios (+LHR, -LHR), diagnostic odds ratios (DOR), and area under the SROC curve (AUSROC) for each technology group. A subgroup analysis was performed to investigate differences in real-time non-magnified Kudo pit patterns (with VCE and DBC) and real-time CLE. RESULTS We included 22 studies [1491 patients; 4674 polyps, of which 539 (11.5%) were neoplastic]. Real-time CLE had a pooled sensitivity of 91% (95%CI: 66%-98%), specificity of 97% (95%CI: 94%-98%), and an AUSROC of 0.98 (95%CI: 0.97-0.99). Magnification endoscopy had a pooled sensitivity of 90% (95%CI: 77%-96%) and specificity of 87% (95%CI: 81%-91%). VCE had a pooled sensitivity of 86% (95%CI: 62%-95%) and specificity of 87% (95%CI: 72%-95%). DBC had a pooled sensitivity of 67% (95%CI: 44%-84%) and specificity of 86% (95%CI: 72%-94%). CONCLUSION Real-time CLE is a highly accurate technology for differentiating neoplastic from non-neoplastic lesions in patients with colonic IBD. However, most CLE studies were performed by single expert users within tertiary centres, potentially confounding these results. PMID:29563760

  3. Analysis of travel time reliability on Indiana interstates.

    DOT National Transportation Integrated Search

    2009-09-15

    Travel-time reliability is a key performance measure in any transportation system. It is a measure of the quality of travel time experienced by transportation system users and reflects the efficiency of the transportation system to serve citizens, bu...

  4. Time-frequency analysis-based time-windowing algorithm for the inverse synthetic aperture radar imaging of ships

    NASA Astrophysics Data System (ADS)

    Zhou, Peng; Zhang, Xi; Sun, Weifeng; Dai, Yongshou; Wan, Yong

    2018-01-01

    An algorithm based on time-frequency analysis is proposed to select an imaging time window for the inverse synthetic aperture radar imaging of ships. An appropriate range bin is selected to perform the time-frequency analysis after radial motion compensation. The selected range bin is that with the maximum mean amplitude among the range bins whose echoes are confirmed to be contributed by a dominant scatter. The criterion for judging whether the echoes of a range bin are contributed by a dominant scatter is key to the proposed algorithm and is therefore described in detail. When the first range bin that satisfies the judgment criterion is found, a sequence composed of the frequencies that have the largest amplitudes in every moment's time-frequency spectrum corresponding to this range bin is employed to calculate the length and the center moment of the optimal imaging time window. Experiments performed with simulation data and real data show the effectiveness of the proposed algorithm, and comparisons between the proposed algorithm and the image contrast-based algorithm (ICBA) are provided. Similar image contrast and lower entropy are acquired using the proposed algorithm as compared with those values when using the ICBA.

  5. Involvement of the anterior cingulate cortex in time-based prospective memory task monitoring: An EEG analysis of brain sources using Independent Component and Measure Projection Analysis

    PubMed Central

    Burgos, Pablo; Kilborn, Kerry; Evans, Jonathan J.

    2017-01-01

    Objective Time-based prospective memory (PM), remembering to do something at a particular moment in the future, is considered to depend upon self-initiated strategic monitoring, involving a retrieval mode (sustained maintenance of the intention) plus target checking (intermittent time checks). The present experiment was designed to explore what brain regions and brain activity are associated with these components of strategic monitoring in time-based PM tasks. Method 24 participants were asked to reset a clock every four minutes, while performing a foreground ongoing word categorisation task. EEG activity was recorded and data were decomposed into source-resolved activity using Independent Component Analysis. Common brain regions across participants, associated with retrieval mode and target checking, were found using Measure Projection Analysis. Results Participants decreased their performance on the ongoing task when concurrently performed with the time-based PM task, reflecting an active retrieval mode that relied on withdrawal of limited resources from the ongoing task. Brain activity, with its source in or near the anterior cingulate cortex (ACC), showed changes associated with an active retrieval mode including greater negative ERP deflections, decreased theta synchronization, and increased alpha suppression for events locked to the ongoing task while maintaining a time-based intention. Activity in the ACC was also associated with time-checks and found consistently across participants; however, we did not find an association with time perception processing per se. Conclusion The involvement of the ACC in both aspects of time-based PM monitoring may be related to different functions that have been attributed to it: strategic control of attention during the retrieval mode (distributing attentional resources between the ongoing task and the time-based task) and anticipatory/decision making processing associated with clock-checks. PMID:28863146

  6. Instrumental variables analysis using multiple databases: an example of antidepressant use and risk of hip fracture.

    PubMed

    Uddin, Md Jamal; Groenwold, Rolf H H; de Boer, Anthonius; Gardarsdottir, Helga; Martin, Elisa; Candore, Gianmario; Belitser, Svetlana V; Hoes, Arno W; Roes, Kit C B; Klungel, Olaf H

    2016-03-01

    Instrumental variable (IV) analysis can control for unmeasured confounding, yet it has not been widely used in pharmacoepidemiology. We aimed to assess the performance of IV analysis using different IVs in multiple databases in a study of antidepressant use and hip fracture. Information on adults with at least one prescription of a selective serotonin reuptake inhibitor (SSRI) or tricyclic antidepressant (TCA) during 2001-2009 was extracted from the THIN (UK), BIFAP (Spain), and Mondriaan (Netherlands) databases. IVs were created using the proportion of SSRI prescriptions per practice or using the one, five, or ten previous prescriptions by a physician. Data were analysed using conventional Cox regression and two-stage IV models. In the conventional analysis, SSRI (vs. TCA) was associated with an increased risk of hip fracture, which was consistently found across databases: the adjusted hazard ratio (HR) was approximately 1.35 for time-fixed and 1.50 to 2.49 for time-varying SSRI use, while the IV analysis based on the IVs that appeared to satisfy the IV assumptions showed conflicting results, e.g. the adjusted HRs ranged from 0.55 to 2.75 for time-fixed exposure. IVs for time-varying exposure violated at least one IV assumption and were therefore invalid. This multiple database study shows that the performance of IV analysis varied across the databases for time-fixed and time-varying exposures and strongly depends on the definition of IVs. It remains challenging to obtain valid IVs in pharmacoepidemiological studies, particularly for time-varying exposure, and IV analysis should therefore be interpreted cautiously. Copyright © 2016 John Wiley & Sons, Ltd.
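    The IV idea can be shown in its simplest case, a single instrument and a linear outcome, where the estimate reduces to a ratio of covariances (the Wald estimator). This is only a sketch of the principle; the study itself fitted two-stage models with Cox regression.

```python
def iv_estimate(z, x, y):
    """Single-instrument IV (Wald) estimator of the effect of x on y:
    cov(z, y) / cov(z, x). Valid only if z affects y solely through x
    and is independent of the unmeasured confounders."""
    n = len(z)
    mz, mx, my = sum(z) / n, sum(x) / n, sum(y) / n
    cov_zy = sum((zi - mz) * (yi - my) for zi, yi in zip(z, y)) / n
    cov_zx = sum((zi - mz) * (xi - mx) for zi, xi in zip(z, x)) / n
    return cov_zy / cov_zx
```

A weak instrument makes cov(z, x) small and the ratio unstable, which is one reason estimates can vary widely across different IV definitions, as the paper reports.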

  7. Coupled Solid Rocket Motor Ballistics and Trajectory Modeling for Higher Fidelity Launch Vehicle Design

    NASA Technical Reports Server (NTRS)

    Ables, Brett

    2014-01-01

    Multi-stage launch vehicles with solid rocket motors (SRMs) face design optimization challenges, especially when the mission scope changes frequently. Significant performance benefits can be realized if the solid rocket motors are optimized to the changing requirements. While SRMs represent a fixed performance at launch, rapid design iterations enable flexibility at design time, yielding significant performance gains. The streamlining and integration of SRM design and analysis can be achieved with improved analysis tools. While powerful and versatile, the Solid Performance Program (SPP) is not conducive to rapid design iteration. Performing a design iteration with SPP and a trajectory solver is a labor intensive process. To enable a better workflow, SPP, the Program to Optimize Simulated Trajectories (POST), and the interfaces between them have been improved and automated, and a graphical user interface (GUI) has been developed. The GUI enables real-time visual feedback of grain and nozzle design inputs, enforces parameter dependencies, removes redundancies, and simplifies manipulation of SPP and POST's numerous options. Automating the analysis also simplifies batch analyses and trade studies. Finally, the GUI provides post-processing, visualization, and comparison of results. Wrapping legacy high-fidelity analysis codes with modern software provides the improved interface necessary to enable rapid coupled SRM ballistics and vehicle trajectory analysis. Low cost trade studies demonstrate the sensitivities of flight performance metrics to propulsion characteristics. Incorporating high fidelity analysis from SPP into vehicle design reduces performance margins and improves reliability. By flying an SRM designed with the same assumptions as the rest of the vehicle, accurate comparisons can be made between competing architectures. 
In summary, this flexible workflow is a critical component to designing a versatile launch vehicle model that can accommodate a volatile mission scope.

  8. Optimal Parameter Design of Coarse Alignment for Fiber Optic Gyro Inertial Navigation System.

    PubMed

    Lu, Baofeng; Wang, Qiuying; Yu, Chunmei; Gao, Wei

    2015-06-25

    Two different coarse alignment algorithms for a Fiber Optic Gyro (FOG) Inertial Navigation System (INS), based on the inertial reference frame, are discussed in this paper. Both are based on gravity vector integration; therefore, the performance of these algorithms is determined by the integration time. In previous works, the integration time was selected by experience. To give a criterion for the selection process, and to make the selection of the integration time more accurate, optimal parameter design of these algorithms for FOG INS is performed in this paper. The design process is accomplished based on an analysis of the error characteristics of these two coarse alignment algorithms. Moreover, this analysis and optimal parameter design allow us to select the most accurate algorithm for FOG INS according to the actual operational conditions. The analysis and simulation results show that the parameter provided by this work is the optimal value, and indicate that under different operational conditions, different coarse alignment algorithms should be adopted for FOG INS in order to achieve better performance. Lastly, the experiment results validate the effectiveness of the proposed algorithm.

  9. Aerocapture Systems Analysis for a Neptune Mission

    NASA Technical Reports Server (NTRS)

    Lockwood, Mary Kae; Edquist, Karl T.; Starr, Brett R.; Hollis, Brian R.; Hrinda, Glenn A.; Bailey, Robert W.; Hall, Jeffery L.; Spilker, Thomas R.; Noca, Muriel A.; O'Kongo, N.

    2006-01-01

    A systems analysis was completed to determine the feasibility, benefit, and risk of an aeroshell aerocapture system for Neptune and to identify technology gaps and technology performance goals. The systems analysis includes the following disciplines: science; mission design; aeroshell configuration; interplanetary navigation analyses; atmosphere modeling; computational fluid dynamics for aerodynamic performance and aeroheating environment; stability analyses; guidance development; atmospheric flight simulation; thermal protection system design; mass properties; structures; spacecraft design and packaging; and mass sensitivities. Results show that aerocapture is feasible and performance is adequate for the Neptune mission. Aerocapture can deliver 1.4 times more mass to Neptune orbit than an all-propulsive system for the same launch vehicle and results in a 3-4 year reduction in trip time compared to all-propulsive systems. Enabling technologies for this mission include TPS manufacturing and aerothermodynamic methods for determining coupled 3-D convection, radiation, and ablation aeroheating rates and loads.

  10. Load Balancing Using Time Series Analysis for Soft Real Time Systems with Statistically Periodic Loads

    NASA Technical Reports Server (NTRS)

    Hailperin, Max

    1993-01-01

    This thesis provides design and analysis of techniques for global load balancing on ensemble architectures running soft-real-time object-oriented applications with statistically periodic loads. It focuses on estimating the instantaneous average load over all the processing elements. The major contribution is the use of explicit stochastic process models for both the loading and the averaging itself. These models are exploited via statistical time-series analysis and Bayesian inference to provide improved average load estimates, and thus to facilitate global load balancing. This thesis explains the distributed algorithms used and provides some optimality results. It also describes the algorithms' implementation and gives performance results from simulation. These results show that our techniques allow more accurate estimation of the global system loading, resulting in fewer object migrations than local methods. Our method is shown to provide superior performance, relative not only to static load-balancing schemes but also to many adaptive methods.
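
    The averaging idea lends itself to a compact illustration. Below is a minimal one-dimensional Kalman-style filter tracking a drifting average load from noisy observations; it is a sketch of the general Bayesian-estimation approach, not the thesis's actual stochastic process models, and all parameter values are assumptions:

```python
import random

def bayesian_load_estimate(observations, process_var=0.01, obs_var=4.0):
    """Track a slowly drifting average load from noisy per-sample
    observations with a 1-D Kalman-style filter (illustrative only)."""
    est, var = observations[0], obs_var
    for z in observations[1:]:
        var += process_var            # predict: the true load drifts a little
        gain = var / (var + obs_var)  # weight the new observation by certainty
        est += gain * (z - est)       # correct the estimate
        var *= 1.0 - gain             # shrink the posterior variance
    return est

random.seed(0)
obs = [10.0 + random.gauss(0.0, 2.0) for _ in range(200)]  # true load = 10
print(round(bayesian_load_estimate(obs), 2))
```

A real system would run this incrementally as samples arrive and feed the estimate to the migration policy.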

  11. Digital signal processing for velocity measurements in dynamical material's behaviour studies.

    PubMed

    Devlaminck, Julien; Luc, Jérôme; Chanal, Pierre-Yves

    2014-03-01

    In this work, we describe different configurations of optical fiber interferometers (Michelson and Mach-Zehnder types) used to measure velocities in studies of the dynamic behaviour of materials. We detail the processing algorithms developed and optimized to improve the performance of these interferometers, especially in terms of time and frequency resolution. Three methods of analysis of interferometric signals were studied. For Michelson interferometers, time-frequency analysis of signals by the Short-Time Fourier Transform (STFT) is compared to time-frequency analysis by the Continuous Wavelet Transform (CWT). The results show that the CWT is more suitable than the STFT for signals with low signal-to-noise ratios and for regions of low velocity and high acceleration. For Mach-Zehnder interferometers, the measurement is carried out by analyzing the phase shift between three interferometric signals (triature processing). These three methods of digital signal processing were evaluated, their measurement uncertainties estimated, and their restrictions or operational limitations specified from experimental results obtained on a pulsed power machine.
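
    The core of the STFT approach can be sketched in a few lines: a windowed Fourier transform of a simulated interferometric beat signal whose peak frequency maps back to velocity via the Doppler relation f = 2v/λ. The sample rate, wavelength and velocity below are assumed values, not the instrument's:

```python
import numpy as np

np.random.seed(0)
fs = 1e9            # sample rate in Hz (assumed)
lam = 1550e-9       # laser wavelength in m (assumed)
v_true = 350.0      # target velocity in m/s (assumed)
f_beat = 2.0 * v_true / lam  # Doppler beat frequency, ~452 MHz

t = np.arange(4096) / fs
sig = np.sin(2 * np.pi * f_beat * t) + 0.2 * np.random.randn(t.size)

# One STFT window: Hann-windowed FFT, peak bin -> beat frequency -> velocity.
win = np.hanning(t.size)
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
spec = np.abs(np.fft.rfft(sig * win))
v_est = freqs[np.argmax(spec)] * lam / 2.0
print(f"estimated velocity: {v_est:.1f} m/s")
```

Sliding this window along the record and repeating the peak search gives the velocity history; the window length sets the time/frequency resolution trade-off that motivates the CWT comparison.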

  12. Design and analysis of coherent OCDM en/decoder based on photonic crystal

    NASA Astrophysics Data System (ADS)

    Zhang, Chongfu; Qiu, Kun

    2008-08-01

    The design and performance analysis of a new coherent optical en/decoder based on photonic crystal (PhC) for optical code-division multiplexing (OCDM) are presented in this paper. In this scheme, the optical pulse phase and time delay can be flexibly controlled by a photonic crystal phase shifter and time delayer through appropriate design of the fabrication. According to the PhC transmission matrix theorem, a combined calculation of the impurity and normal period layers is applied, and the performance of the PhC-based optical en/decoder is analyzed. The reflection, transmission, time delay characteristics and optical spectrum of the en/decoded pulse are studied for waves tuned within the photonic band-gap by numerical calculation. Theoretical analysis and numerical results indicate that proper phase modulation and time delay of the optical pulse are achieved, and an auto-correlation to cross-correlation ratio of about 8 dB is obtained, which demonstrates the applicability of true pulse phase modulation in a number of applications.

  13. What is associated with race performance in male 100-km ultra-marathoners--anthropometry, training or marathon best time?

    PubMed

    Knechtle, Beat; Knechtle, Patrizia; Rosemann, Thomas; Senn, Oliver

    2011-03-01

    We investigated the associations of anthropometry, training, and pre-race experience with race time in 93 recreational male ultra-marathoners (mean age 44.6 years, s = 10.0; body mass 74.0 kg, s = 9.0; height 1.77 m, s = 0.06; body mass index 23.4 kg · m(-2), s = 2.0) in a 100-km ultra-marathon using bivariate and multivariate analysis. In the bivariate analysis, body mass index (r = 0.24), the sum of eight skinfolds (r = 0.55), percent body fat (r = 0.57), weekly running hours (r = -0.29), weekly running kilometres (r = -0.49), running speed during training (r = -0.50), and personal best time in a marathon (r = 0.72) were associated with race time. Results of the multiple regression analysis revealed an independent and negative association of weekly running kilometres and average speed in training with race time, as well as a significant positive association between the sum of eight skinfold thicknesses and race time. There was a significant positive association between 100-km race time and personal best time in a marathon. We conclude that both training and anthropometry were independently associated with race performance. These characteristics remained relevant even when controlling for personal best time in a marathon.
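
    The multivariate step reported above is ordinary multiple regression. A hedged sketch on synthetic data (the variable scales and effect sizes are invented for illustration, not taken from the study; only the sample size and the signs of the associations match the abstract):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 93                                 # matches the study's sample size
skinfolds = rng.normal(80.0, 20.0, n)  # sum of eight skinfolds, mm (assumed)
weekly_km = rng.normal(60.0, 15.0, n)  # weekly running volume, km (assumed)
# Positive skinfold effect, negative training-volume effect, as in the study:
race_min = 700.0 + 1.5 * skinfolds - 2.0 * weekly_km + rng.normal(0.0, 10.0, n)

X = np.column_stack([np.ones(n), skinfolds, weekly_km])
beta, *_ = np.linalg.lstsq(X, race_min, rcond=None)
print(beta)  # [intercept, skinfold coefficient (+), weekly-km coefficient (-)]
```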

  14. Implementation of a data packet generator using pattern matching for wearable ECG monitoring systems.

    PubMed

    Noh, Yun Hong; Jeong, Do Un

    2014-07-15

    In this paper, a packet generator using a pattern matching algorithm for real-time abnormal heartbeat detection is proposed. The packet generator creates a very small data packet which conveys sufficient crucial information for health condition analysis. The data packet envelopes real-time ECG signals and transmits them to a smartphone via Bluetooth. An Android application was developed specifically to decode the packet and extract ECG information for health condition analysis. Several graphical presentations are displayed on the smartphone. We evaluated the abnormal heartbeat detection accuracy using the MIT/BIH Arrhythmia Database and real-time experiments. The experimental results confirm that abnormal heartbeat detection is practically possible. We also performed data compression ratio and signal restoration performance evaluations to establish the usefulness of the proposed packet generator, and the results were excellent.
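
    The pattern-matching idea can be illustrated with a toy beat classifier: compare each beat against a normal-beat template by normalized cross-correlation and flag low-similarity beats. The waveforms and threshold below are assumptions for illustration, not the authors' algorithm:

```python
import numpy as np

def is_abnormal(beat, template, threshold=0.95):
    """Flag a beat whose normalized correlation with a normal-beat
    template falls below an assumed threshold."""
    b = (beat - beat.mean()) / beat.std()
    t = (template - template.mean()) / template.std()
    return float(np.dot(b, t)) / len(b) < threshold  # Pearson correlation

x = np.linspace(0.0, 1.0, 100)
normal = np.exp(-((x - 0.5) ** 2) / 0.005)   # idealized narrow QRS-like pulse
widened = np.exp(-((x - 0.5) ** 2) / 0.02)   # broadened, PVC-like beat
print(is_abnormal(normal, normal), is_abnormal(widened, normal))
```

Only beats flagged this way would need to be packetized at full detail, which is one way a pattern-matching front end can shrink the transmitted data.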

  15. Effects of Sequences of Cognitions on Group Performance Over Time

    PubMed Central

    Molenaar, Inge; Chiu, Ming Ming

    2017-01-01

    Extending past research showing that sequences of low cognitions (low-level processing of information) and high cognitions (high-level processing of information through questions and elaborations) influence the likelihoods of subsequent high and low cognitions, this study examines whether sequences of cognitions are related to group performance over time; 54 primary school students (18 triads) discussed and wrote an essay about living in another country (32,375 turns of talk). Content analysis and statistical discourse analysis showed that within each lesson, groups with more low cognitions or more sequences of low cognition followed by high cognition added more essay words. Groups with more high cognitions, sequences of low cognition followed by low cognition, or sequences of high cognition followed by an action followed by low cognition, showed different words and sequences, suggestive of new ideas. The links between cognition sequences and group performance over time can inform facilitation and assessment of student discussions. PMID:28490854

  16. Effects of Sequences of Cognitions on Group Performance Over Time.

    PubMed

    Molenaar, Inge; Chiu, Ming Ming

    2017-04-01

    Extending past research showing that sequences of low cognitions (low-level processing of information) and high cognitions (high-level processing of information through questions and elaborations) influence the likelihoods of subsequent high and low cognitions, this study examines whether sequences of cognitions are related to group performance over time; 54 primary school students (18 triads) discussed and wrote an essay about living in another country (32,375 turns of talk). Content analysis and statistical discourse analysis showed that within each lesson, groups with more low cognitions or more sequences of low cognition followed by high cognition added more essay words. Groups with more high cognitions, sequences of low cognition followed by low cognition, or sequences of high cognition followed by an action followed by low cognition, showed different words and sequences, suggestive of new ideas. The links between cognition sequences and group performance over time can inform facilitation and assessment of student discussions.

  17. Recoding low-level simulator data into a record of meaningful task performance: the integrated task modeling environment (ITME).

    PubMed

    King, Robert; Parker, Simon; Mouzakis, Kon; Fletcher, Winston; Fitzgerald, Patrick

    2007-11-01

    The Integrated Task Modeling Environment (ITME) is a user-friendly software tool that has been developed to automatically recode low-level data into an empirical record of meaningful task performance. The present research investigated and validated the performance of the ITME software package by conducting complex simulation missions and comparing the task analyses produced by ITME with task analyses produced by experienced video analysts. A very high interrater reliability (> or = .94) existed between experienced video analysts and the ITME for the task analyses produced for each mission. The mean session time:analysis time ratio was 1:24 using video analysis techniques and 1:5 using the ITME. It was concluded that the ITME produced task analyses that were as reliable as those produced by experienced video analysts, and significantly reduced the time cost associated with these analyses.

  18. Performance analysis of a laser propelled interorbital transfer vehicle

    NASA Technical Reports Server (NTRS)

    Minovitch, M. A.

    1976-01-01

    The performance capabilities of a laser-propelled interorbital transfer vehicle receiving propulsive power from one ground-based transmitter were investigated. The laser transmits propulsive energy to the vehicle during successive station fly-overs. By applying a series of these propulsive maneuvers, large payloads can be economically transferred between low earth orbits and synchronous orbits. Operations involving the injection of large payloads onto escape trajectories are also studied. The duration of each successive engine burn must be carefully timed so that the vehicle reappears over the laser station to receive additional propulsive power within the shortest possible time. The analytical solution for determining these time intervals is presented, as is a solution to the problem of determining maximum injection payloads. A parametric computer analysis based on these optimization studies is presented. The results show that relatively low beam powers, on the order of 50 MW to 60 MW, produce significant performance capabilities.

  19. Determining team cognition from delay analysis using cross recurrence plot.

    PubMed

    Hajari, Nasim; Cheng, Irene; Bin Zheng; Basu, Anup

    2016-08-01

    Team cognition is an important factor in evaluating and determining team performance. Forming a team with good shared cognition is even more crucial for laparoscopic surgery applications. In this study, we analyzed the eye tracking data of two surgeons during a laparoscopic simulation operation, then performed Cross Recurrence Analysis (CRA) on the recorded data to study the delay behaviour of good performer and poor performer teams. Dual eye tracking data for twenty-two dyad teams were recorded during a laparoscopic task, and the teams were divided into good performer and poor performer teams based on their task times. We then studied the delay between the two team members for good and poor performer teams. The results indicated that the good performer teams show a smaller delay compared to poor performer teams. This finding is consistent with gaze overlap analysis between team members and therefore provides good evidence of shared cognition between team members.
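
    As a simplified stand-in for the delay analysis (plain cross-correlation rather than a full cross recurrence plot), the lag between a leader and follower gaze signal can be estimated as follows; the signals here are synthetic:

```python
import numpy as np

rng = np.random.default_rng(3)
leader = rng.standard_normal(500)                    # leader's gaze signal
follower = np.roll(leader, 7) + 0.1 * rng.standard_normal(500)  # delayed copy

# The peak of the cross-correlation gives the delay of follower behind leader.
xc = np.correlate(follower - follower.mean(), leader - leader.mean(), "full")
lag = int(xc.argmax()) - (len(leader) - 1)
print(lag)  # a smaller |lag| would suggest tighter shared cognition
```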

  20. Time-frequency analysis of acoustic signals in the audio-frequency range generated during Hadfield's steel friction

    NASA Astrophysics Data System (ADS)

    Dobrynin, S. A.; Kolubaev, E. A.; Smolin, A. Yu.; Dmitriev, A. I.; Psakhie, S. G.

    2010-07-01

    Time-frequency analysis of sound waves detected by a microphone during the friction of Hadfield’s steel has been performed using wavelet transform and window Fourier transform methods. This approach reveals a relationship between the appearance of quasi-periodic intensity outbursts in the acoustic response signals and the processes responsible for the formation of wear products. It is shown that the time-frequency analysis of acoustic emission in a tribosystem can be applied, along with traditional approaches, to studying features in the wear and friction process.

  1. Reducing adaptive optics latency using Xeon Phi many-core processors

    NASA Astrophysics Data System (ADS)

    Barr, David; Basden, Alastair; Dipper, Nigel; Schwartz, Noah

    2015-11-01

    The next generation of Extremely Large Telescopes (ELTs) for astronomy will rely heavily on the performance of their adaptive optics (AO) systems. Real-time control is at the heart of the critical technologies that will enable telescopes to deliver the best possible science and will require a very significant extrapolation from current AO hardware existing for 4-10 m telescopes. Investigating novel real-time computing architectures and testing their eligibility against anticipated challenges is one of the main priorities of technology development for the ELTs. This paper investigates the suitability of the Intel Xeon Phi, which is a commercial off-the-shelf hardware accelerator. We focus on wavefront reconstruction performance, implementing a straightforward matrix-vector multiplication (MVM) algorithm. We present benchmarking results of the Xeon Phi on a real-time Linux platform, both as a standalone processor and integrated into an existing real-time controller (RTC). The performance of single and multiple Xeon Phis is investigated. We show that this technology has the potential of greatly reducing the mean latency and variations in execution time (jitter) of large AO systems. We present both a detailed performance analysis of the Xeon Phi for a typical E-ELT first-light instrument along with a more general approach that enables us to extend to any AO system size. We show that systematic and detailed performance analysis is an essential part of testing novel real-time control hardware to guarantee optimal science results.
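
    For context, the wavefront-reconstruction step being benchmarked is a single matrix-vector product, so a latency/jitter measurement can be sketched in a few lines. The sizes here are assumptions, far smaller than ELT scale, and NumPy stands in for the tuned MVM kernels:

```python
import time
import numpy as np

n_slopes, n_actuators = 5000, 2500      # assumed system dimensions
cm = np.random.randn(n_actuators, n_slopes).astype(np.float32)  # control matrix
slopes = np.random.randn(n_slopes).astype(np.float32)           # WFS slopes

timings = []
for _ in range(50):
    t0 = time.perf_counter()
    commands = cm @ slopes              # the reconstruction: one MVM per frame
    timings.append(time.perf_counter() - t0)

mean_ms = 1e3 * float(np.mean(timings))
jitter_ms = 1e3 * float(np.std(timings))  # execution-time variation ("jitter")
print(f"mean latency {mean_ms:.3f} ms, jitter {jitter_ms:.3f} ms")
```

A real-time controller cares about the tail of this timing distribution, not just the mean, which is why jitter is reported alongside latency.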

  2. Relationship between masticatory performance using a gummy jelly and masticatory movement.

    PubMed

    Uesugi, Hanako; Shiga, Hiroshi

    2017-10-01

    The purpose of this study was to clarify the relationship between masticatory performance using a gummy jelly and masticatory movement. Thirty healthy males were asked to chew a gummy jelly on their habitual chewing side for 20 s, and the parameters of masticatory performance and masticatory movement were calculated as follows. For evaluating the masticatory performance, the amount of glucose extraction during chewing of a gummy jelly was measured. For evaluating the masticatory movement, the movement of the mandibular incisal point was recorded using the MKG K6-I, and ten parameters of the movement path (opening distance and masticatory width), movement rhythm (opening time, closing time, occluding time, and cycle time), stability of movement (stability of path and stability of rhythm), and movement velocity (opening maximum velocity and closing maximum velocity) were calculated from 10 cycles of chewing beginning with the fifth cycle. The relationship between the amount of glucose extraction and parameters representing masticatory movement was investigated, and then stepwise multiple linear regression analysis was performed. The amount of glucose extraction was associated with 7 parameters representing the masticatory movement. Stepwise multiple linear regression analysis showed that the opening distance, closing time, stability of rhythm, and closing maximum velocity were the most important factors affecting the glucose extraction. From these results it was suggested that there was a close relation between masticatory performance and masticatory movement, and that the masticatory performance could be increased by rhythmic, rapid and stable mastication with a large opening distance. Copyright © 2017 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.

  3. Appendectomy does not decrease the risk of future colectomy in UC: results from a large cohort and meta-analysis.

    PubMed

    Parian, Alyssa; Limketkai, Berkeley; Koh, Joyce; Brant, Steven R; Bitton, Alain; Cho, Judy H; Duerr, Richard H; McGovern, Dermot P; Proctor, Deborah D; Regueiro, Miguel D; Rioux, John D; Schumm, Phil; Taylor, Kent D; Silverberg, Mark S; Steinhart, A Hillary; Hernaez, Ruben; Lazarev, Mark

    2017-08-01

    Early appendectomy is inversely associated with the development of UC. However, the impact of appendectomy on the clinical course of UC is controversial, generally favouring a milder disease course. We aim to describe the effect appendectomy has on the disease course of UC with focus on the timing of appendectomy in relation to UC diagnosis. Using the National Institute of Diabetes and Digestive and Kidney Diseases Inflammatory Bowel Disease Genetics Consortium database of patients with UC, the risk of colectomy was compared between patients who did and did not undergo appendectomy. In addition, we performed a meta-analysis of studies that examined the association between appendectomy and colectomy. 2980 patients with UC were initially included. 111 (4.4%) patients with UC had an appendectomy; of which 63 were performed prior to UC diagnosis and 48 after diagnosis. In multivariable analysis, appendectomy performed at any time was an independent risk factor for colectomy (OR 1.9, 95% CI 1.1 to 3.1), with appendectomy performed after UC diagnosis most strongly associated with colectomy (OR 2.2, 95% CI 1.1 to 4.5). An updated meta-analysis showed appendectomy performed either prior to or after UC diagnosis had no effect on colectomy rates. Appendectomy performed at any time in relation to UC diagnosis was not associated with a decrease in severity of disease. In fact, appendectomy after UC diagnosis may be associated with a higher risk of colectomy. These findings question the proposed use of appendectomy as treatment for UC. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  4. Evaluation of shear wave elastography for differential diagnosis of breast lesions: A new qualitative analysis versus conventional quantitative analysis.

    PubMed

    Ren, Wei-Wei; Li, Xiao-Long; Wang, Dan; Liu, Bo-Ji; Zhao, Chong-Ke; Xu, Hui-Xiong

    2018-04-13

    To evaluate a special kind of ultrasound (US) shear wave elastography for differential diagnosis of breast lesions, using a new qualitative analysis (i.e. the elasticity score in the travel time map) compared with conventional quantitative analysis. From June 2014 to July 2015, 266 pathologically proven breast lesions were enrolled in this study. The maximum, mean, median, minimum, and standard deviation of shear wave speed (SWS) values (m/s) were assessed. The elasticity score, a new qualitative feature, was evaluated in the travel time map. Areas under the receiver operating characteristic curve (AUROC) were plotted to evaluate the diagnostic performance of both qualitative and quantitative analyses for differentiation of breast lesions. Among all quantitative parameters, SWS-max showed the highest AUROC (0.805; 95% CI: 0.752, 0.851) compared with SWS-mean (0.786; 95% CI: 0.732, 0.834; P = 0.094), SWS-median (0.775; 95% CI: 0.720, 0.824; P = 0.046), SWS-min (0.675; 95% CI: 0.615, 0.731; P = 0.000), and SWS-SD (0.768; 95% CI: 0.712, 0.817; P = 0.074). The qualitative analysis in this study achieved the best diagnostic performance (AUROC 0.871; 95% CI: 0.825, 0.909; P = 0.011 compared with SWS-max, the best quantitative parameter). The new qualitative analysis of shear wave travel time showed superior diagnostic performance in the differentiation of breast lesions in comparison with conventional quantitative analysis.
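
    AUROC values like those above can in principle be computed from raw scores with the rank-based (Mann-Whitney) estimator. The lesion counts and SWS distributions below are assumptions for illustration, not the study's data:

```python
import numpy as np

def auroc(pos_scores, neg_scores):
    """AUROC via the Mann-Whitney U statistic: the probability that a
    randomly chosen positive case scores above a negative one (no tie
    correction, which is fine for continuous measurements)."""
    s = np.concatenate([pos_scores, neg_scores])
    ranks = s.argsort().argsort() + 1.0          # 1-based ranks
    n_pos, n_neg = len(pos_scores), len(neg_scores)
    u = ranks[:n_pos].sum() - n_pos * (n_pos + 1) / 2.0
    return u / (n_pos * n_neg)

rng = np.random.default_rng(0)
malignant = rng.normal(6.0, 1.5, 100)  # stiffer lesions: higher SWS (assumed)
benign = rng.normal(4.0, 1.5, 166)
print(round(auroc(malignant, benign), 3))
```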

  5. Singular-Arc Time-Optimal Trajectory of Aircraft in Two-Dimensional Wind Field

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan

    2006-01-01

    This paper presents a study of a minimum time-to-climb trajectory analysis for aircraft flying in a two-dimensional altitude-dependent wind field. The time optimal control problem possesses a singular control structure when the lift coefficient is taken as a control variable. A singular arc analysis is performed to obtain an optimal control solution on the singular arc. Using a time-scale separation with the flight path angle treated as a fast state, the dimensionality of the optimal control solution is reduced by eliminating the lift coefficient control. A further singular arc analysis is used to decompose the original optimal control solution into the flight path angle solution and a trajectory solution as a function of the airspeed and altitude. The optimal control solutions for the initial and final climb segments are computed using a shooting method with known starting values on the singular arc. The numerical results of the shooting method show that the optimal flight path angles on the initial and final climb segments are constant. The analytical approach provides a rapid means for analyzing a time optimal trajectory for aircraft performance.

  6. Racial Earnings Differentials and Performance Pay

    ERIC Educational Resources Information Center

    Heywood, John S.; O'Halloran, Patrick L.

    2005-01-01

    A comparative analysis between output-based payment and time rates payment is presented. It is observed that racial or gender earnings discrimination is more likely in time rates payment and supervisory evaluations.

  7. Solenoid valve performance characteristics studied

    NASA Technical Reports Server (NTRS)

    Abe, J. T.; Blackburn, S.

    1970-01-01

    Current and voltage waveforms of a solenoid coil are recorded as the valve opens and closes. Analysis of the waveforms with respect to time and the phase of the valve cycle accurately describes valve performance.

  8. Television Viewing and Its Association with Sedentary Behaviors, Self-Rated Health and Academic Performance among Secondary School Students in Peru.

    PubMed

    Sharma, Bimala; Cosme Chavez, Rosemary; Jeong, Ae Suk; Nam, Eun Woo

    2017-04-05

    The study assessed television viewing >2 h a day and its association with sedentary behaviors, self-rated health, and academic performance among secondary school adolescents. A cross-sectional survey was conducted among randomly selected students in Lima in 2015. We measured self-reported responses of students using a standard questionnaire, and conducted in-depth interviews with 10 parents and 10 teachers. Chi-square test, correlation and multivariate logistic regression analysis were performed among 1234 students, and thematic analysis technique was used for qualitative information. A total of 23.1% adolescents reported watching television >2 h a day. Qualitative findings also show that adolescents spend most of their leisure time watching television, playing video games or using the Internet. Television viewing had a significant positive correlation with video game use in males and older adolescents, with Internet use in both sexes, and a negative correlation with self-rated health and academic performance in females. Multivariate logistic regression analysis shows that television viewing >2 h a day, independent of physical activity was associated with video games use >2 h a day, Internet use >2 h a day, poor/fair self-rated health and poor self-reported academic performance. Television viewing time and sex had a significant interaction effect on both video game use >2 h a day and Internet use >2 h a day. Reducing television viewing time may be an effective strategy for improving health and academic performance in adolescents.

  9. Television Viewing and Its Association with Sedentary Behaviors, Self-Rated Health and Academic Performance among Secondary School Students in Peru

    PubMed Central

    Sharma, Bimala; Cosme Chavez, Rosemary; Jeong, Ae Suk; Nam, Eun Woo

    2017-01-01

    The study assessed television viewing >2 h a day and its association with sedentary behaviors, self-rated health, and academic performance among secondary school adolescents. A cross-sectional survey was conducted among randomly selected students in Lima in 2015. We measured self-reported responses of students using a standard questionnaire, and conducted in-depth interviews with 10 parents and 10 teachers. Chi-square test, correlation and multivariate logistic regression analysis were performed among 1234 students, and thematic analysis technique was used for qualitative information. A total of 23.1% adolescents reported watching television >2 h a day. Qualitative findings also show that adolescents spend most of their leisure time watching television, playing video games or using the Internet. Television viewing had a significant positive correlation with video game use in males and older adolescents, with Internet use in both sexes, and a negative correlation with self-rated health and academic performance in females. Multivariate logistic regression analysis shows that television viewing >2 h a day, independent of physical activity was associated with video games use >2 h a day, Internet use >2 h a day, poor/fair self-rated health and poor self-reported academic performance. Television viewing time and sex had a significant interaction effect on both video game use >2 h a day and Internet use >2 h a day. Reducing television viewing time may be an effective strategy for improving health and academic performance in adolescents. PMID:28379202

  10. The LTS timing analysis program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Armstrong, Darrell Jewell; Schwarz, Jens

    The LTS Timing Analysis program described in this report uses signals from the Tempest Lasers, Pulse Forming Lines, and Laser Spark Detectors to carry out calculations that quantify and monitor the performance of the Z-Accelerator's laser-triggered SF6 switches. The program analyzes Z-shots beginning with Z2457, when Laser Spark Detector data became available for all lines.

  11. Multi-constituent determination and fingerprint analysis of Scutellaria indica L. using ultra high performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry.

    PubMed

    Liang, Xianrui; Zhao, Cui; Su, Weike

    2015-11-01

    An ultra-performance liquid chromatography method coupled with quadrupole time-of-flight mass spectrometry, integrating multi-constituent determination and fingerprint analysis, has been established for quality assessment and control of Scutellaria indica L. The optimized method has the advantages of speed and efficiency, and allows multi-constituent determination and fingerprint analysis in one chromatographic run within 11 min. 36 compounds were detected, and 23 of them were unequivocally identified or tentatively assigned. The established fingerprint method was applied to the analysis of ten S. indica samples from different geographic locations. The quality assessment was achieved by using principal component analysis. The proposed method is useful and reliable for the characterization of multi-constituents in a complex chemical system and the overall quality assessment of S. indica. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
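
    The principal component analysis step used for quality assessment is straightforward to sketch. Here, synthetic peak-area fingerprints stand in for the ten samples' chromatographic data (all values invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)
peak_areas = rng.random((10, 23))   # 10 samples x 23 identified compounds

X = peak_areas - peak_areas.mean(axis=0)   # center each compound's areas
U, s, Vt = np.linalg.svd(X, full_matrices=False)
scores = X @ Vt[:2].T               # sample coordinates in the PC1-PC2 plane
explained = float((s[:2] ** 2).sum() / (s ** 2).sum())
print(scores.shape, round(explained, 2))
```

Samples from the same geographic origin would be expected to cluster together in the resulting score plot.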

  12. An efficient General Transit Feed Specification (GTFS) enabled algorithm for dynamic transit accessibility analysis.

    PubMed

    Fayyaz S, S Kiavash; Liu, Xiaoyue Cathy; Zhang, Guohui

    2017-01-01

    The social functions of urbanized areas are highly dependent on and supported by convenient access to public transportation systems, particularly for the less privileged populations who have restrained auto ownership. To accurately evaluate public transit accessibility, it is critical to capture the spatiotemporal variation of transit services. This can be achieved by measuring the shortest paths or minimum travel time between origin-destination (OD) pairs at each time-of-day (e.g. every minute). In recent years, General Transit Feed Specification (GTFS) data has been gaining popularity for between-station travel time estimation due to its interoperability in spatiotemporal analytics. Many software packages, such as ArcGIS, have developed toolboxes to enable travel time estimation with GTFS. They perform reasonably well in calculating travel time between OD pairs for a specific time-of-day (e.g. 8:00 AM), yet can become computationally inefficient and impractical with the increase of data dimensions (e.g. all times-of-day and large networks). In this paper, we introduce a new algorithm that is computationally elegant and mathematically efficient to address this issue. An open-source toolbox written in C++ is developed to implement the algorithm. We implemented the algorithm on the City of St. George's transit network to showcase the accessibility analysis enabled by the toolbox. The experimental evidence shows a significant reduction in computational time. The proposed algorithm and toolbox are easily transferable to other transit networks, allowing transit agencies and researchers to perform high-resolution transit performance analysis.
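
    The essential difference from static shortest paths is that an edge's cost depends on when you arrive at its tail, because you must wait for the next scheduled departure. A toy earliest-arrival search over a hypothetical three-stop timetable (a sketch of the general idea, not the paper's algorithm) illustrates this:

```python
import heapq

# (from_stop, to_stop) -> scheduled (departure, arrival) pairs, minutes past midnight
timetable = {
    ("A", "B"): [(480, 490), (500, 510)],
    ("B", "C"): [(495, 505), (515, 525)],
    ("A", "C"): [(485, 530)],       # direct but slow
}

def earliest_arrival(origin, dest, start):
    """Dijkstra-style search where relaxing an edge means boarding the
    first run that departs at or after the current arrival time."""
    best = {origin: start}
    pq = [(start, origin)]
    while pq:
        t, stop = heapq.heappop(pq)
        if stop == dest:
            return t
        if t > best.get(stop, float("inf")):
            continue                # stale queue entry
        for (u, v), runs in timetable.items():
            if u != stop:
                continue
            arr = min((a for d, a in runs if d >= t), default=None)
            if arr is not None and arr < best.get(v, float("inf")):
                best[v] = arr
                heapq.heappush(pq, (arr, v))
    return None

print(earliest_arrival("A", "C", 480))  # transferring at B beats the direct run
```

Repeating this query for every departure minute and every OD pair is what blows up the naive approach, and is the dimension along which the paper's algorithm gains its efficiency.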

  13. An efficient General Transit Feed Specification (GTFS) enabled algorithm for dynamic transit accessibility analysis

    PubMed Central

    Fayyaz S., S. Kiavash; Zhang, Guohui

    2017-01-01

    The social functions of urbanized areas are highly dependent on and supported by convenient access to public transportation systems, particularly for the less privileged populations who have restrained auto ownership. To accurately evaluate public transit accessibility, it is critical to capture the spatiotemporal variation of transit services. This can be achieved by measuring the shortest paths or minimum travel time between origin-destination (OD) pairs at each time-of-day (e.g. every minute). In recent years, General Transit Feed Specification (GTFS) data has been gaining popularity for between-station travel time estimation due to its interoperability in spatiotemporal analytics. Many software packages, such as ArcGIS, have developed toolboxes to enable travel time estimation with GTFS. They perform reasonably well in calculating travel time between OD pairs for a specific time-of-day (e.g. 8:00 AM), yet can become computationally inefficient and impractical with the increase of data dimensions (e.g. all times-of-day and large networks). In this paper, we introduce a new algorithm that is computationally elegant and mathematically efficient to address this issue. An open-source toolbox written in C++ is developed to implement the algorithm. We implemented the algorithm on the City of St. George's transit network to showcase the accessibility analysis enabled by the toolbox. The experimental evidence shows a significant reduction in computational time. The proposed algorithm and toolbox are easily transferable to other transit networks, allowing transit agencies and researchers to perform high-resolution transit performance analysis. PMID:28981544

  14. APPLICATION OF TRAVEL TIME RELIABILITY FOR PERFORMANCE ORIENTED OPERATIONAL PLANNING OF EXPRESSWAYS

    NASA Astrophysics Data System (ADS)

    Mehran, Babak; Nakamura, Hideki

    Evaluation of impacts of congestion improvement schemes on travel time reliability is very significant for road authorities since travel time reliability represents operational performance of expressway segments. In this paper, a methodology is presented to estimate travel time reliability prior to implementation of congestion relief schemes based on travel time variation modeling as a function of demand, capacity, weather conditions and road accidents. For subject expressway segments, traffic conditions are modeled over a whole year considering demand and capacity as random variables. Patterns of demand and capacity are generated for each five-minute interval by applying the Monte-Carlo simulation technique, and accidents are randomly generated based on a model that links accident rate to traffic conditions. A whole-year analysis is performed by comparing demand and available capacity for each scenario, and queue length is estimated through shockwave analysis for each time interval. Travel times are estimated from refined speed-flow relationships developed for intercity expressways, and the buffer time index is estimated consequently as a measure of travel time reliability. For validation, estimated reliability indices are compared with measured values from empirical data, and it is shown that the proposed method is suitable for operational evaluation and planning purposes.
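
    The reliability estimate described above can be illustrated with a toy Monte-Carlo sketch. The demand/capacity distributions, the BPR-style delay curve, and the deterministic queue carry-over below are all illustrative assumptions, not the paper's calibrated models; the buffer time index is taken as (p95 - mean) / mean of travel time, one common definition.

```python
# Monte-Carlo sketch of a buffer time index (BTI) estimate. Demand and
# capacity are drawn per interval; excess demand carries over as a queue
# (a crude stand-in for shockwave analysis), and travel time grows with
# the volume/capacity ratio via a BPR-style curve. All numbers illustrative.
import random
random.seed(42)

def simulate_travel_times(n_intervals=10000, free_flow=10.0):
    times = []
    queue = 0.0
    for _ in range(n_intervals):
        demand = random.gauss(1800, 200)        # veh/h, assumed distribution
        capacity = random.gauss(2000, 150)      # veh/h, assumed distribution
        arriving = max(demand, 0.0) + queue
        queue = max(arriving - capacity, 0.0)   # carry-over proxy for queuing
        vc = min(arriving / max(capacity, 1.0), 1.5)
        times.append(free_flow * (1 + 0.15 * vc ** 4))  # BPR-style delay
    return times

def buffer_time_index(times):
    s = sorted(times)
    p95 = s[int(0.95 * (len(s) - 1))]
    mean = sum(s) / len(s)
    return (p95 - mean) / mean

tt = simulate_travel_times()
print(round(buffer_time_index(tt), 3))
```

    Running many such simulated years per scenario yields the distribution of reliability indices that the paper compares against empirical measurements.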

  15. FPGA-based multi-channel fluorescence lifetime analysis of Fourier multiplexed frequency-sweeping lifetime imaging

    PubMed Central

    Zhao, Ming; Li, Yu; Peng, Leilei

    2014-01-01

    We report a fast non-iterative lifetime data analysis method for the Fourier multiplexed frequency-sweeping confocal FLIM (Fm-FLIM) system [Opt. Express 22, 10221 (2014)]. The new method, named the R-method, allows fast multi-channel lifetime image analysis in the system’s FPGA data processing board. Experimental tests proved that the performance of the R-method is equivalent to that of single-exponential iterative fitting, and its sensitivity is well suited for time-lapse FLIM-FRET imaging of live cells, for example, cyclic adenosine monophosphate (cAMP) level imaging with GFP-Epac-mCherry sensors. With the R-method and its FPGA implementation, multi-channel lifetime images can now be generated in real time on the multi-channel frequency-sweeping FLIM system, and live readout of FRET sensors can be performed during time-lapse imaging. PMID:25321778

  16. The effect of differences in rainfall data duration and time period on the assessment of rainwater harvesting system performance for domestic water use

    NASA Astrophysics Data System (ADS)

    Juliana, Imroatul C.; Kusuma, M. Syahril Badri; Cahyono, M.; Martokusumo, Widjaja; Kuntoro, Arno Adi

    2017-11-01

    One of the attempts to tackle problems in water resources is to exploit the potential volume of rainwater with a rainwater harvesting (RWH) system. A substantial amount of rainfall data is required for analyzing RWH system performance, yet such data are occasionally difficult to obtain. The main objective of this study is to investigate the effect of differences in rainfall data duration and time period on the assessment of RWH system performance. An analysis was conducted on the rainfall data based on rainfall data duration and time period, considering 15, 5, 3, and 2 years, as well as the average year, wet year, and dry year, for Palembang city in South Sumatera. The RWH system performance is calculated based on the concept of the yield-before-spillage algorithm. A number of scenarios were constructed by varying the tank capacity, roof area, and rainwater demand. It was observed that the use of data with a smaller duration produces a significant difference, especially for high rainwater demand. In addition, the use of daily rainfall data describes the behavior of the system more thoroughly. As for the time step, the use of monthly rainfall data is only sufficient for low rainwater demand and bigger tank capacity.
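
    The yield-before-spillage balance named in the abstract has a standard formulation: in each time step the yield is drawn from storage before any spill is computed. The sketch below implements that convention with an illustrative rainfall series, roof area, demand, and runoff coefficient (none of these values come from the study).

```python
# Yield-before-spillage tank balance: inflow is added, the yield is taken
# first, and only then is the excess above tank capacity spilled.
# Time reliability = fraction of steps in which demand is fully met.
def rwh_reliability(rain_mm, roof_m2, demand_l, tank_l, runoff_coef=0.85):
    storage = 0.0
    supplied = 0
    for r in rain_mm:
        inflow = r * roof_m2 * runoff_coef      # 1 mm on 1 m^2 = 1 litre
        storage += inflow
        yield_t = min(demand_l, storage)        # yield before spillage
        storage -= yield_t
        storage = min(storage, tank_l)          # excess spills
        if yield_t >= demand_l:
            supplied += 1
    return supplied / len(rain_mm)

daily_rain = [0, 12, 0, 0, 30, 5, 0, 18, 0, 0]  # mm/day, illustrative
print(rwh_reliability(daily_rain, roof_m2=100, demand_l=400, tank_l=2000))  # 0.8
```

    Sweeping `tank_l`, `roof_m2`, and `demand_l` over the scenario grid, and feeding the same loop daily versus monthly totals, reproduces the kind of comparison the study performs.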

  17. Drunk driving detection based on classification of multivariate time series.

    PubMed

    Li, Zhenlong; Jin, Xue; Zhao, Xiaohua

    2015-09-01

    This paper addresses the problem of detecting drunk driving based on classification of multivariate time series. First, driving performance measures were collected from a test in a driving simulator located in the Traffic Research Center, Beijing University of Technology. Lateral position and steering angle were used to detect drunk driving. Second, multivariate time series analysis was performed to extract the features. A piecewise linear representation was used to represent the multivariate time series, and a bottom-up algorithm was employed to segment them. The slope and time interval of each segment were extracted as the features for classification. Third, a support vector machine classifier was used to classify the driver's state into two classes (normal or drunk) according to the extracted features. The proposed approach achieved an accuracy of 80.0%. Drunk driving detection based on the analysis of multivariate time series is feasible and effective, and the approach has implications for drunk driving detection. Copyright © 2015 Elsevier Ltd and National Safety Council. All rights reserved.
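
    The feature-extraction step named in the abstract (bottom-up piecewise linear segmentation, then slope and duration per segment) can be sketched as follows. The merging criterion, target segment count, and the toy lateral-position trace are illustrative assumptions, not the paper's parameters.

```python
# Bottom-up piecewise linear segmentation: start from two-point segments,
# repeatedly merge the adjacent pair whose merged least-squares line is
# cheapest, stop at a target segment count, then emit (slope, duration)
# features for each segment, as described in the abstract.
def fit_cost(xs, ys):
    # squared residual and slope of a least-squares line through (xs, ys)
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx if sxx else 0.0
    return sum((y - (my + b * (x - mx))) ** 2 for x, y in zip(xs, ys)), b

def bottom_up(ys, n_segments):
    segs = [list(range(i, min(i + 2, len(ys)))) for i in range(0, len(ys), 2)]
    while len(segs) > n_segments:
        costs = [fit_cost(segs[i] + segs[i + 1],
                          [ys[j] for j in segs[i] + segs[i + 1]])[0]
                 for i in range(len(segs) - 1)]
        i = costs.index(min(costs))           # cheapest adjacent merge
        segs[i:i + 2] = [segs[i] + segs[i + 1]]
    # (slope, duration) feature pair per segment
    return [(fit_cost(s, [ys[j] for j in s])[1], len(s)) for s in segs]

lateral_pos = [0.0, 0.1, 0.1, 0.2, 1.0, 1.8, 2.6, 2.7, 2.6, 2.7]  # toy trace
features = bottom_up(lateral_pos, 3)
print(features)
```

    The resulting (slope, duration) pairs from lateral position and steering angle would then be fed to an SVM classifier, which is outside this sketch.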

  18. [A Case of Hereditary Medullary Thyroid Cancer (MEN2A/FMTC) Diagnosed at the Time of Recurrence].

    PubMed

    Enomoto, Keisuke; Shimizu, Kotaro; Hirose, Masayuki; Miyabe, Haruka; Morizane, Natsue; Takenaka, Yukinori; Shimazu, Kohki; Fushimi, Hiroaki; Uno, Atsuhiko

    2015-03-01

    We report a 42-year-old man with hereditary medullary thyroid cancer (multiple endocrine neoplasia, MEN2A/familial medullary thyroid carcinoma, FMTC), which was diagnosed at the time of tumor recurrence. He had a past history of a left thyroidectomy with neck dissection 7 years previously. A RET gene analysis revealed a point mutation (codon 618), and we diagnosed him as having hereditary medullary thyroid cancer. We resected the recurrent tumor in the right thyroid lobe together with performing a right lateral and central neck dissection. A RET gene analysis should be performed for patients with medullary thyroid cancer. When a RET gene mutation is present, a total thyroidectomy must be performed for the medullary thyroid cancer.

  19. Polarization-Analyzing CMOS Image Sensor With Monolithically Embedded Polarizer for Microchemistry Systems.

    PubMed

    Tokuda, T; Yamada, H; Sasagawa, K; Ohta, J

    2009-10-01

    This paper proposes and demonstrates a polarization-analyzing CMOS sensor based on image sensor architecture. The sensor was designed targeting applications for chiral analysis in a microchemistry system. The sensor features a monolithically embedded polarizer. Embedded polarizers with different angles were implemented to realize a real-time absolute measurement of the incident polarization angle. Although the pixel-level performance was confirmed to be limited, estimation schemes based on the variation of the polarizer angle provided a promising performance for real-time polarization measurements. An estimation scheme using 180 pixels in a 1° step provided an estimation accuracy of 0.04°. Polarimetric measurements of chiral solutions were also successfully performed to demonstrate the applicability of the sensor to optical chiral analysis.

  20. National Combustion Code Parallel Performance Enhancements

    NASA Technical Reports Server (NTRS)

    Quealy, Angela; Benyo, Theresa (Technical Monitor)

    2002-01-01

    The National Combustion Code (NCC) is being developed by an industry-government team for the design and analysis of combustion systems. The unstructured grid, reacting flow code uses a distributed memory, message passing model for its parallel implementation. The focus of the present effort has been to improve the performance of the NCC code to meet combustor designer requirements for model accuracy and analysis turnaround time. Improving the performance of this code contributes significantly to the overall reduction in time and cost of the combustor design cycle. This report describes recent parallel processing modifications to NCC that have improved the parallel scalability of the code, enabling a two hour turnaround for a 1.3 million element fully reacting combustion simulation on an SGI Origin 2000.

  1. Determination of dasatinib in the tablet dosage form by ultra high performance liquid chromatography, capillary zone electrophoresis, and sequential injection analysis.

    PubMed

    Gonzalez, Aroa Garcia; Taraba, Lukáš; Hraníček, Jakub; Kozlík, Petr; Coufal, Pavel

    2017-01-01

    Dasatinib is a novel oral prescription drug proposed for treating adult patients with chronic myeloid leukemia. Three analytical methods, namely ultra high performance liquid chromatography, capillary zone electrophoresis, and sequential injection analysis, were developed, validated, and compared for determination of the drug in the tablet dosage form. The total analysis time of the optimized ultra high performance liquid chromatography and capillary zone electrophoresis methods was 2.0 and 2.2 min, respectively. Direct ultraviolet detection with a detection wavelength of 322 nm was employed in both cases. The optimized sequential injection analysis method was based on spectrophotometric detection of dasatinib after a simple colorimetric reaction with Folin-Ciocalteu reagent forming a blue-colored complex with an absorbance maximum at 745 nm. The total analysis time was 2.5 min. The ultra high performance liquid chromatography method provided the lowest detection and quantitation limits and the most precise and accurate results. All three newly developed methods were demonstrated to be specific, linear, sensitive, precise, and accurate, providing results satisfactorily meeting the requirements of the pharmaceutical industry, and can be employed for the routine determination of the active pharmaceutical ingredient in the tablet dosage form. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. An Investigation of Lost Time and Utilization in a Sample of First-Term Male and Female Soldiers

    DTIC Science & Technology

    1982-10-01

    Fitzgibbons, D., & Moch, M. Employee absenteeism: A multivariate analysis with replication. Organizational Behavior and Human Performance, 1980, 26, 349... LOST TIME AND UTILIZATION IN A SAMPLE OF FIRST-TERM MALE AND FEMALE SOLDIERS (Technical Report, April 1981-October 1982)... AUTHOR(s): Joel M. Savell, Carlos K. Rigby, and Andrew A. Zbikowski

  3. A time series analysis performed on a 25-year period of kidney transplantation activity in a single center.

    PubMed

    Santori, G; Fontana, I; Bertocchi, M; Gasloli, G; Valente, U

    2010-05-01

    Following the example of many Western countries, where a "minimum volume rule" policy has been adopted as a quality parameter for complex surgical procedures, the Italian National Transplant Centre set the minimum number of kidney transplantation procedures/y at 30/center. The number of procedures performed in a single center over a long period may be treated as a time series to evaluate trends, seasonal cycles, and nonsystematic fluctuations. Between January 1, 1983, and December 31, 2007, we performed 1376 procedures in adult or pediatric recipients from living or cadaveric donors. The greatest number of cases/y was performed in 1998 (n = 86), followed by 2004 (n = 82), 1996 (n = 75), and 2003 (n = 73). A time series analysis performed using R (R Foundation for Statistical Computing, Vienna, Austria), a free software environment for statistical computing and graphics, showed an overall incremental trend after exponential smoothing as well as after seasonal decomposition. However, starting from 2005, we observed a decreasing trend in the series. The number of kidney transplants expected to be performed in 2008, obtained by applying Holt-Winters exponential smoothing to the period 1983 to 2007, was 58 procedures, while in that year there were 52. The time series approach may be helpful to establish a minimum volume/y at a single-center level. Copyright (c) 2010 Elsevier Inc. All rights reserved.
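
    The forecasting idea above can be sketched with Holt's linear (double) exponential smoothing, the trend-only core of the Holt-Winters family; the smoothing weights and the toy annual series below are illustrative assumptions, not the center's data (the study itself used R on monthly-decomposable data).

```python
# Holt's linear exponential smoothing: maintain a level and a trend
# component, update both for each observation, and forecast the next
# period as level + trend. Weights and data are illustrative.
def holt_forecast(series, alpha=0.5, beta=0.3):
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        last_level = level
        level = alpha * y + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
    return level + trend   # one-step-ahead forecast

transplants_per_year = [40, 45, 52, 60, 75, 86, 70, 73, 82, 65, 60, 55]
print(round(holt_forecast(transplants_per_year), 1))
```

    With alpha = beta = 1 the recursion degenerates to "last value plus last difference", a useful sanity check on the update equations.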

  4. Transcriptional and Chromatin Dynamics of Muscle Regeneration After Severe Trauma

    DTIC Science & Technology

    2016-10-12

    performed pathway analysis of the time-clustered RNA-Seq data16 and showed an initial burst of pro-inflammatory and immune-response transcripts in the...143 showed dynamic behavior (See Methods) and analysis of the dynamic miRNAs reinforced many of the results observed from the RNA-Seq datasets...excellent agreement was viewed. Hierarchical clustering of the datasets through time revealed 5 clusters, and gene ontology (GO) analysis of the

  5. Analysis of Galaxy 15 Satellite Images from a Small-Aperture Telescope

    DTIC Science & Technology

    2011-09-01

    December 2010) during which it did not respond to commands from the ground. During this time period, the satellite drifted eastward causing...and 2) aberration. The light speed correction reflects the motion of the satellite along the orbit during the time Δt it takes for the signal to... time (or phase angle) with a separate photometric analysis performed at Oceanit. To obtain the photometry, we used AstroGraph software (Fig. 3

  6. Effects of Computer Support, Collaboration, and Time Lag on Performance Self-Efficacy and Transfer of Training: A Longitudinal Meta-Analysis

    ERIC Educational Resources Information Center

    Gegenfurtner, Andreas; Veermans, Koen; Vauras, Marja

    2013-01-01

    This meta-analysis (29 studies, k = 33, N = 4158) examined the longitudinal development of the relationship between performance self-efficacy and transfer before and after training. A specific focus was on training programs that afforded varying degrees of computer-supported collaborative learning (CSCL). Consistent with social cognitive theory,…

  7. Microgrid Enabled Distributed Energy Solutions (MEDES) Fort Bliss Military Reservation

    DTIC Science & Technology

    2014-02-01

    Logic Controller PF Power Factor PO Performance Objectives PPA Power Purchase Agreements PV Photovoltaic R&D Research and Development RDSI...controller, algorithms perform power flow analysis, short term optimization, and long-term forecasted planning. The power flow analysis ensures...renewable photovoltaic power and energy storage in this microgrid configuration, the available mission operational time of the backup generator can be

  8. School Expenditure and School Performance: Evidence from New South Wales Schools Using a Dynamic Panel Analysis

    ERIC Educational Resources Information Center

    Pugh, G.; Mangan, J.; Blackburn, V.; Radicic, D.

    2015-01-01

    This article estimates the effects of school expenditure on school performance in government secondary schools in New South Wales, Australia over the period 2006-2010. It uses dynamic panel analysis to exploit time series data on individual schools that only recently has become available. We find a significant but small effect of expenditure on…

  9. Effects of and Preference for Pay for Performance: An Analogue Analysis

    ERIC Educational Resources Information Center

    Long, Robert D., III; Wilder, David A.; Betz, Alison; Dutta, Ami

    2012-01-01

    We examined the effects of 2 payment systems on the rate of check processing and time spent on task by participants in a simulated work setting. Three participants experienced individual pay-for-performance (PFP) without base pay and pay-for-time (PFT) conditions. In the last phase, we asked participants to choose which system they preferred. For…

  10. An exact computational method for performance analysis of sequential test algorithms for detecting network intrusions

    NASA Astrophysics Data System (ADS)

    Chen, Xinjia; Lacy, Fred; Carriere, Patrick

    2015-05-01

    Sequential test algorithms are playing increasingly important roles in quickly detecting network intrusions such as portscanners. In view of the fact that such algorithms are usually analyzed based on intuitive approximation or asymptotic analysis, we develop an exact computational method for the performance analysis of such algorithms. Our method can be used to calculate the probability of false alarm and average detection time up to arbitrarily pre-specified accuracy.
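
    The abstract does not specify the sequential test being analyzed. The sketch below applies the same idea, exact dynamic programming over the score distribution rather than a Wald-style approximation, to a toy sequential counting test: +1 on a "suspicious" event (probability p), -1 otherwise, clamped at 0, alarming at a threshold. The test model and parameters are illustrative assumptions.

```python
# Exact false-alarm probability and mean detection time for a toy
# sequential test, computed by propagating the exact score distribution
# step by step with rational arithmetic (no Monte-Carlo, no approximation).
from fractions import Fraction

def alarm_stats(p, T, horizon):
    p = Fraction(p)
    dist = {0: Fraction(1)}           # score -> probability (not yet alarmed)
    p_alarm = Fraction(0)
    expected_time = Fraction(0)
    for t in range(1, horizon + 1):
        nxt = {}
        for s, pr in dist.items():
            up = s + 1
            if up >= T:               # alarm fires at step t
                p_alarm += pr * p
                expected_time += pr * p * t
            else:
                nxt[up] = nxt.get(up, Fraction(0)) + pr * p
            down = max(s - 1, 0)      # score clamped at zero
            nxt[down] = nxt.get(down, Fraction(0)) + pr * (1 - p)
        dist = nxt
    cond_mean = expected_time / p_alarm if p_alarm else None
    return float(p_alarm), float(cond_mean) if cond_mean is not None else None

pa, et = alarm_stats(p=Fraction(1, 10), T=3, horizon=50)
print(pa, et)
```

    Under benign traffic (small p), `pa` is the false alarm probability over the horizon; rerunning with the "intrusion" value of p gives the detection probability and average detection time, which is the pair of quantities the paper computes exactly.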

  11. Determination of thermally induced effects and design guidelines of optomechanical accelerometers

    NASA Astrophysics Data System (ADS)

    Lu, Qianbo; Bai, Jian; Wang, Kaiwei; Jiao, Xufen; Han, Dandan; Chen, Peiwen; Liu, Dong; Yang, Yongying; Yang, Guoguang

    2017-11-01

    Thermal effects, including thermally induced deformation and warm-up time, are ubiquitous problems for sensors, especially for inertial measurement units such as accelerometers. Optomechanical accelerometers, which contain light sources that can be regarded as heat sources, involve a different thermal phenomenon in terms of their specific optical readout, and the phenomenon has not been investigated systematically. This paper proposes a model to evaluate the temperature difference, rise time and thermally induced deformation of optomechanical accelerometers, and then constructs design guidelines which can diminish these thermal effects without compromising other mechanical performances, based on the analysis of the interplay of thermal and mechanical performances. In the model, the irradiation of the micromachined structure by a laser source is considered a dominant factor. The experimental data obtained using a prototype of an optomechanical accelerometer approximately confirm the validity of the model for the rise time and response tendency. Moreover, design guidelines that adopt suspensions with a flat cross-section and a short length are demonstrated with reference to the analysis. The guidelines can reduce the thermally induced deformation and rise time or achieve higher mechanical performances with similar thermal effects, which paves the way for the design of temperature-tolerant, robust, high-performance devices.

  12. Digital microfluidic platform for multiplexing enzyme assays: implications for lysosomal storage disease screening in newborns.

    PubMed

    Sista, Ramakrishna S; Eckhardt, Allen E; Wang, Tong; Graham, Carrie; Rouse, Jeremy L; Norton, Scott M; Srinivasan, Vijay; Pollack, Michael G; Tolun, Adviye A; Bali, Deeksha; Millington, David S; Pamula, Vamsee K

    2011-10-01

    Newborn screening for lysosomal storage diseases (LSDs) has been gaining considerable interest owing to the availability of enzyme replacement therapies. We present a digital microfluidic platform to perform rapid, multiplexed enzymatic analysis of acid α-glucosidase (GAA) and acid α-galactosidase to screen for Pompe and Fabry disorders. The results were compared with those obtained using standard fluorometric methods. We performed bench-based, fluorometric enzymatic analysis on 60 deidentified newborn dried blood spots (DBSs), plus 10 Pompe-affected and 11 Fabry-affected samples, at Duke Biochemical Genetics Laboratory using a 3-mm punch for each assay and an incubation time of 20 h. We used a digital microfluidic platform to automate fluorometric enzymatic assays at Advanced Liquid Logic Inc. using extract from a single punch for both assays, with an incubation time of 6 h. Assays were also performed with an incubation time of 1 h. Assay results were generally comparable, although mean enzymatic activity for GAA using microfluidics was approximately 3 times higher than that obtained using bench-based methods, which could be attributed to higher substrate concentration. Clear separation was observed between the normal and affected samples at both 6- and 1-h incubation times using digital microfluidics. A digital microfluidic platform compared favorably with a clinical reference laboratory to perform enzymatic analysis in DBSs for Pompe and Fabry disorders. This platform presents a new technology for a newborn screening laboratory to screen LSDs by fully automating all the liquid-handling operations in an inexpensive system, providing rapid results.

  13. Reprint of “Performance analysis of a model-sized superconducting DC transmission system based VSC-HVDC transmission technologies using RTDS”

    NASA Astrophysics Data System (ADS)

    Dinh, Minh-Chau; Ju, Chang-Hyeon; Kim, Sung-Kyu; Kim, Jin-Geun; Park, Minwon; Yu, In-Keun

    2013-01-01

    The combination of a high temperature superconducting DC power cable and a voltage source converter based HVDC (VSC-HVDC) creates a new option for transmitting power with multiple collection and distribution points for long distance and bulk power transmissions. It offers greater advantages compared with HVAC or conventional HVDC transmission systems, and it is well suited for the grid integration of renewable energy sources in existing distribution or transmission systems. For this reason, a superconducting DC transmission system based on VSC-HVDC transmission technologies is planned to be set up in the Jeju power system, Korea. Before applying this system to a real power system on Jeju Island, system analysis should be performed through a real-time test. In this paper, a model-sized superconducting VSC-HVDC system, which consists of a small model-sized VSC-HVDC connected to a 2 m YBCO HTS DC model cable, is implemented. The authors have applied a real-time simulation method that incorporates the model-sized superconducting VSC-HVDC system into the simulated Jeju power system using a Real Time Digital Simulator (RTDS). The performance of the superconducting VSC-HVDC system has been verified on the proposed test platform and the results are discussed in detail.

  14. Performance analysis of a model-sized superconducting DC transmission system based VSC-HVDC transmission technologies using RTDS

    NASA Astrophysics Data System (ADS)

    Dinh, Minh-Chau; Ju, Chang-Hyeon; Kim, Sung-Kyu; Kim, Jin-Geun; Park, Minwon; Yu, In-Keun

    2012-08-01

    The combination of a high temperature superconducting DC power cable and a voltage source converter based HVDC (VSC-HVDC) creates a new option for transmitting power with multiple collection and distribution points for long distance and bulk power transmissions. It offers greater advantages compared with HVAC or conventional HVDC transmission systems, and it is well suited for the grid integration of renewable energy sources in existing distribution or transmission systems. For this reason, a superconducting DC transmission system based on VSC-HVDC transmission technologies is planned to be set up in the Jeju power system, Korea. Before applying this system to a real power system on Jeju Island, system analysis should be performed through a real-time test. In this paper, a model-sized superconducting VSC-HVDC system, which consists of a small model-sized VSC-HVDC connected to a 2 m YBCO HTS DC model cable, is implemented. The authors have applied a real-time simulation method that incorporates the model-sized superconducting VSC-HVDC system into the simulated Jeju power system using a Real Time Digital Simulator (RTDS). The performance of the superconducting VSC-HVDC system has been verified on the proposed test platform and the results are discussed in detail.

  15. A tribute to John Gibbon.

    PubMed

    Church, Russell M.

    2002-04-28

    This article provides an overview of the published research of John Gibbon. It describes his experimental research on scalar timing and his development of scalar timing theory. It also describes his methods of research which included mathematical analysis, conditioning methods, psychophysical methods and secondary data analysis. Finally, it describes his application of scalar timing theory to avoidance and punishment, autoshaping, temporal perception and timed behavior, foraging, circadian rhythms, human timing, and the effect of drugs on timed perception and timed performance of Parkinson's patients. The research of Gibbon has shown the essential role of timing in perception, classical conditioning, instrumental learning, behavior in natural environments and in neuropsychology.

  16. 'G.A.T.E': Gap analysis for TTX evaluation

    NASA Astrophysics Data System (ADS)

    Cacciotti, Ilaria; Di Giovanni, Daniele; Pergolini, Alessandro; Malizia, Andrea; Carestia, Mariachiara; Palombi, Leonardo; Bellecci, Carlo; Gaudio, Pasquale

    2016-06-01

    A Table Top Exercise (TTX) gap analysis tool was developed with the aim of providing a complete, systematic and objective evaluation of TTXs organized in the safety and security fields. A TTX is a discussion-based emergency management exercise, organized around a simulated emergency scenario, involving groups of players who are subjected to a set of solicitations ('injects') in order to evaluate their emergency response abilities. This kind of exercise is devoted to identifying strengths and shortfalls and to proposing potential and promising changes in the approach to a particular situation. To manage the collection and analysis of TTX-derived data, a gap analysis tool is useful for identifying specific areas and actions for improvement, since a gap analysis compares actual performance against optimal/expected performance. In this context, a TTX gap analysis tool was designed with the objective of evaluating team players' competences and performances as well as TTX organization and structure. The influence of both the players' expertise and the reaction time (the difference between the expected time and the time actually needed to complete the injects) on the final evaluation of the inject responses was also taken into account.

  17. Cloud Computing: A model Construct of Real-Time Monitoring for Big Dataset Analytics Using Apache Spark

    NASA Astrophysics Data System (ADS)

    Alkasem, Ameen; Liu, Hongwei; Zuo, Decheng; Algarash, Basheer

    2018-01-01

    The volume of data being collected, analyzed, and stored has exploded in recent years, in particular in relation to activity on the cloud, and large-scale data processing, analysis, and storage platforms such as cloud computing are increasingly common. Today, the major challenge is to address how to monitor and control these massive amounts of data and perform analysis in real-time at scale. Traditional methods and model systems are unable to cope with these quantities of data in real-time. Here we present a new methodology for constructing a model for optimizing the performance of real-time monitoring of big datasets, which combines machine learning algorithms with Apache Spark Streaming to accomplish fine-grained fault diagnosis and repair of big datasets. As a case study, we use the failure of Virtual Machines (VMs) to start up. The methodology ensures that the most sensible action is carried out during the procedure of fine-grained monitoring and generates the highest efficacy and cost-saving fault repair through three construction control steps: (I) data collection; (II) analysis engine; and (III) decision engine. We found that running this methodology can save a considerable amount of time compared to the Hadoop model, without sacrificing classification accuracy or performance optimization. The accuracy of the proposed method (92.13%) is an improvement on traditional approaches.
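
    The three-step collect/analyze/decide loop described above can be echoed in a toy micro-batch monitor. This is a plain-Python sketch, not Spark Streaming, and the rolling-mean detector, thresholds, and latency data are illustrative assumptions rather than the paper's machine learning model.

```python
# Toy micro-batch monitor echoing the collect -> analyze -> decide loop:
# each batch of VM start-up latencies is summarized, scored against the
# rolling history of batch means, and the decision engine flags batches
# whose mean deviates by more than k standard deviations.
import statistics

def monitor(batches, k=3.0, history=None):
    history = list(history or [])
    decisions = []
    for batch in batches:                          # (I) data collection
        m = statistics.fmean(batch)
        if len(history) >= 2:                      # (II) analysis engine
            mu = statistics.fmean(history)
            sd = statistics.pstdev(history) or 1e-9
            flagged = abs(m - mu) > k * sd
        else:
            flagged = False                        # not enough history yet
        decisions.append("repair" if flagged else "ok")  # (III) decision
        history.append(m)
    return decisions

batches = [[1.0, 1.2], [1.1, 0.9], [1.0, 1.1], [9.5, 10.2]]  # seconds
print(monitor(batches))  # ['ok', 'ok', 'ok', 'repair']
```

    In a Spark Streaming deployment each micro-batch would arrive from a DStream and the "repair" decision would trigger the fault-repair action; the control flow is the same.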

  18. Bank-firm credit network in Japan: an analysis of a bipartite network.

    PubMed

    Marotta, Luca; Miccichè, Salvatore; Fujiwara, Yoshi; Iyetomi, Hiroshi; Aoyama, Hideaki; Gallegati, Mauro; Mantegna, Rosario N

    2015-01-01

    We investigate the networked nature of the Japanese credit market. Our investigation is performed with tools of network science. We perform community detection with an algorithm that identifies communities composed of both banks and firms. We show that the communities obtained by directly working on the bipartite network carry information about the networked nature of the Japanese credit market. Our analysis is performed for each calendar year during the time period from 1980 to 2011. To investigate the time evolution of the networked structure of the credit market, we introduce a new statistical method to track the time evolution of detected communities. We then characterize the time evolution of communities by detecting, for each time-evolving set of communities, the over-expression of attributes of firms and banks. Specifically, we consider as attributes the economic sector and the geographical location of firms and the type of banks. In our 32-year-long analysis we detect a persistence of the over-expression of attributes of communities of banks and firms, together with slow dynamics of change from some specific attributes to new ones. Our empirical observations show that the credit market in Japan is a networked market where the type of banks, geographical location of firms and banks, and economic sector of the firm play a role in shaping the credit relationships between banks and firms.

  19. Bank-Firm Credit Network in Japan: An Analysis of a Bipartite Network

    PubMed Central

    Marotta, Luca; Miccichè, Salvatore; Fujiwara, Yoshi; Iyetomi, Hiroshi; Aoyama, Hideaki; Gallegati, Mauro; Mantegna, Rosario N.

    2015-01-01

    We investigate the networked nature of the Japanese credit market. Our investigation is performed with tools of network science. We perform community detection with an algorithm that identifies communities composed of both banks and firms. We show that the communities obtained by directly working on the bipartite network carry information about the networked nature of the Japanese credit market. Our analysis is performed for each calendar year during the time period from 1980 to 2011. To investigate the time evolution of the networked structure of the credit market, we introduce a new statistical method to track the time evolution of detected communities. We then characterize the time evolution of communities by detecting, for each time-evolving set of communities, the over-expression of attributes of firms and banks. Specifically, we consider as attributes the economic sector and the geographical location of firms and the type of banks. In our 32-year-long analysis we detect a persistence of the over-expression of attributes of communities of banks and firms, together with slow dynamics of change from some specific attributes to new ones. Our empirical observations show that the credit market in Japan is a networked market where the type of banks, geographical location of firms and banks, and economic sector of the firm play a role in shaping the credit relationships between banks and firms. PMID:25933413
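
    The abstract does not describe the authors' tracking statistic. A common baseline for tracking communities across yearly snapshots is to link each community to its best match in the next year by Jaccard overlap of memberships; the sketch below uses that baseline with made-up bank/firm memberships.

```python
# Track detected communities across years by maximum Jaccard overlap:
# each year-t community is linked to the year-t+1 community it shares
# the most members with, provided the overlap exceeds a cutoff.
# Memberships and the cutoff below are illustrative.
def jaccard(a, b):
    return len(a & b) / len(a | b)

def match_communities(year_t, year_t1, cutoff=0.3):
    links = []
    for i, c in enumerate(year_t):
        scores = [(jaccard(c, d), j) for j, d in enumerate(year_t1)]
        best, j = max(scores)
        if best >= cutoff:
            links.append((i, j, round(best, 2)))
    return links

comms_1990 = [{"bankA", "firm1", "firm2"}, {"bankB", "firm3", "firm4", "firm5"}]
comms_1991 = [{"bankA", "firm1", "firm2", "firm6"}, {"bankB", "firm4", "firm5"}]
print(match_communities(comms_1990, comms_1991))  # [(0, 0, 0.75), (1, 1, 0.75)]
```

    Chaining such links over consecutive years yields the time-evolving sets of communities on which attribute over-expression (e.g. economic sector, region, bank type) can then be tested.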

  20. An x ray archive on your desk: The Einstein CD-ROMs

    NASA Technical Reports Server (NTRS)

    Prestwich, A.; Mcdowell, J.; Plummer, D.; Manning, K.; Garcia, M.

    1992-01-01

    Data from the Einstein Observatory imaging proportional counter (IPC) and high resolution imager (HRI) were released on several CD-ROM sets. The sets released so far include pointed IPC and HRI observations in both simple image and detailed photon event list format, as well as the IPC slew survey. With the data on these CD-ROMs, the user can perform spatial analysis (e.g., surface brightness distributions), spectral analysis (with the IPC event lists), and timing analysis (with the IPC and HRI event lists). The next CD-ROM set will contain IPC unscreened data, allowing the user to perform custom screening to recover, for instance, data taken during times of lost aspect data or high particle background rates.

  1. Mean platelet volume (MPV) predicts middle distance running performance.

    PubMed

    Lippi, Giuseppe; Salvagno, Gian Luca; Danese, Elisa; Skafidas, Spyros; Tarperi, Cantor; Guidi, Gian Cesare; Schena, Federico

    2014-01-01

    Running economy and performance in middle distance running depend on several physiological factors, which include anthropometric variables, functional characteristics, and training volume and intensity. Since little information is available about hematological predictors of middle distance running time, we investigated whether some hematological parameters may be associated with middle distance running performance in a large sample of recreational runners. The study population consisted of 43 amateur runners (15 females, 28 males; median age 47 years), who successfully completed a 21.1 km half-marathon at 75-85% of their maximal aerobic power (VO2max). Whole blood was collected 10 min before the run started and immediately thereafter, and hematological testing was completed within 2 hours after sample collection. The values of lymphocytes and eosinophils exhibited a significant decrease compared to pre-run values, whereas those of mean corpuscular volume (MCV), platelets, mean platelet volume (MPV), white blood cells (WBCs), neutrophils and monocytes were significantly increased after the run. In univariate analysis, significant associations with running time were found for pre-run values of hematocrit, hemoglobin, mean corpuscular hemoglobin (MCH), red blood cell distribution width (RDW), MPV, reticulocyte hemoglobin concentration (RetCHR), and post-run values of MCH, RDW, MPV, monocytes and RetCHR. In multivariate analysis, in which running time was entered as the dependent variable and age, sex, blood lactate, body mass index, VO2max, mean training regimen and the hematological parameters significantly associated with running performance in univariate analysis were entered as independent variables, only MPV values before and after the trial remained significantly associated with running time. After adjustment for platelet count, the MPV value before the run (p = 0.042), but not thereafter (p = 0.247), remained significantly associated with running performance. 
The significant association between baseline MPV and running time suggests that hyperactive platelets may exert some pleiotropic effects on endurance performance.

  2. Remember to do: insomnia versus control groups in a prospective memory task.

    PubMed

    Fabbri, Marco; Tonetti, Lorenzo; Martoni, Monica; Natale, Vincenzo

    2015-01-01

    Primary insomnia is characterized by difficulty in falling asleep and/or remaining asleep, by early morning awakening and/or nonrestorative sleep, and resultant daytime dysfunction in the absence of specific physical, mental, or substance-related causes. However, the studies on daytime cognitive functioning of insomnia patients report inconclusive results. This retrospective study aimed to compare the performance of insomnia patients (N = 54) to that of controls (N = 113) in a naturalistic prospective memory task. Task performance was defined by the percentage of times the event-marker button of an actigraph was pressed, at lights-off time and at wake-up time. The performance pattern in the prospective memory task was similar in both groups. In addition, the task was performed better at lights-off time than at wake-up time regardless of group. Post-hoc subgroup analysis showed that there were more insomnia patients who performed the task perfectly (i.e., 100%) than controls. Performance at wake-up time was significantly correlated to objective sleep quality (i.e., sleep efficiency) only in insomnia patients.

  3. High-speed counter-current chromatography coupled online to high performance liquid chromatography-diode array detector-mass spectrometry for purification, analysis and identification of target compounds from natural products.

    PubMed

    Liang, Xuejuan; Zhang, Yuping; Chen, Wei; Cai, Ping; Zhang, Shuihan; Chen, Xiaoqin; Shi, Shuyun

    2015-03-13

    A challenge in coupling high-speed counter-current chromatography (HSCCC) online with high performance liquid chromatography (HPLC) for purity analysis was their time incompatibility. Consequently, HSCCC-HPLC was conducted by either controlling the HPLC analysis time and HSCCC flow rate or using a stop-and-go scheme. For natural products containing compounds with a wide range of polarities, the former constrains the optimization of experimental conditions, while the latter requires more time. Here, a novel HSCCC-HPLC-diode array detector-mass spectrometry (HSCCC-HPLC-DAD-MS) system was developed for uninterrupted purification, analysis and identification of multiple compounds from natural products. Two six-port injection valves and a six-port switching valve were used as the interface for alternately collecting key HSCCC effluents for HPLC-DAD-MS analysis and identification. The ethyl acetate extract of Malus doumeri was processed on the hyphenated system to verify its efficacy. Five main flavonoids, 3-hydroxyphloridzin (1), phloridzin (2), 4',6'-dihydroxyhydrochalcone-2'-O-β-D-glucopyranoside (3, first found in M. doumeri), phloretin (4), and chrysin (5), were purified with purities over 99% by extrusion elution and/or stepwise elution mode in two-step HSCCC, and a 25 mM ammonium acetate solution was selected instead of water to suppress emulsification in the first HSCCC. The online system greatly shortened manipulation time compared with the off-line analysis procedure and the stop-and-go scheme. The results indicated that the present method could serve as a simple, rapid and effective way to obtain target compounds with high purity from natural products. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Analysis of BigFoot HDC SymCap experiment N161205 on NIF

    NASA Astrophysics Data System (ADS)

    Dittrich, T. R.; Baker, K. L.; Thomas, C. A.; Berzak Hopkins, L. F.; Harte, J. A.; Zimmerman, G. B.; Woods, D. T.; Kritcher, A. L.; Ho, D. D.; Weber, C. R.; Kyrala, G.

    2017-10-01

    Analysis of NIF implosion experiment N161205 provides insight into both hohlraum and capsule performance. This experiment used an undoped High Density Carbon (HDC) ablator driven by a BigFoot x-ray profile in a Au hohlraum. Observations from this experiment include DT fusion yield, bang time, DSR, Tion and time-resolved x-ray emission images around bang time. These observations are all consistent with an x-ray spectrum having significantly reduced Au m-band emission relative to that present in a standard hohlraum simulation. Attempts to explain the observations using several other simulation modifications will also be presented. This work was performed under the auspices of the Department of Energy by Lawrence Livermore National Laboratory under contract DE-AC52-07NA27344.

  5. Shock timing measurements and analysis in deuterium-tritium-ice layered capsule implosions on NIF

    NASA Astrophysics Data System (ADS)

    Robey, H. F.; Celliers, P. M.; Moody, J. D.; Sater, J.; Parham, T.; Kozioziemski, B.; Dylla-Spears, R.; Ross, J. S.; LePape, S.; Ralph, J. E.; Hohenberger, M.; Dewald, E. L.; Berzak Hopkins, L.; Kroll, J. J.; Yoxall, B. E.; Hamza, A. V.; Boehly, T. R.; Nikroo, A.; Landen, O. L.; Edwards, M. J.

    2014-02-01

    Recent advances in shock timing experiments and analysis techniques now enable shock measurements to be performed in cryogenic deuterium-tritium (DT) ice layered capsule implosions on the National Ignition Facility (NIF). Previous measurements of shock timing in inertial confinement fusion implosions [Boehly et al., Phys. Rev. Lett. 106, 195005 (2011); Robey et al., Phys. Rev. Lett. 108, 215004 (2012)] were performed in surrogate targets, where the solid DT ice shell and central DT gas were replaced with a continuous liquid deuterium (D2) fill. These previous experiments pose two surrogacy issues: a material surrogacy due to the difference in species (D2 vs. DT) and densities of the materials used, and a geometric surrogacy due to the presence of an additional interface (ice/gas) that was absent in the liquid-filled targets. This report presents experimental data and a new analysis method for validating the assumptions underlying this surrogate technique. Comparison of the data with simulation shows good agreement for the timing of the first three shocks, but reveals a considerable discrepancy in the timing of the 4th shock in DT ice layered implosions. Electron preheat is examined as a potential cause of the observed discrepancy in the 4th shock timing.

  6. Computational simulation and aerodynamic sensitivity analysis of film-cooled turbines

    NASA Astrophysics Data System (ADS)

    Massa, Luca

    A computational tool is developed for the time-accurate sensitivity analysis of the stage performance of hot-gas, unsteady turbine components. An existing turbomachinery internal flow solver is adapted to the high-temperature environment typical of the hot section of jet engines. A real gas model and film cooling capabilities are successfully incorporated in the software. The modifications to the existing algorithm are described; both the theoretical model and the numerical implementation are validated. The accuracy of the code in evaluating turbine stage performance is tested using a turbine geometry typical of the last stage of aeronautical jet engines. The results of the performance analysis show that the predictions differ from the experimental data by less than 3%. A reliable grid generator, applicable to the domain discretization of the internal flow field of axial-flow turbines, is developed. A sensitivity analysis capability is added to the flow solver, enabling it to accurately evaluate the derivatives of time-varying output functions. The complex Taylor's series expansion (CTSE) technique is reviewed, and two implementations of it are used to demonstrate the accuracy and time dependency of the differentiation process. The results are compared with finite difference (FD) approximations. The CTSE is more accurate than the FD, but less efficient. A "black box" differentiation of the source code, resulting from the automated application of the CTSE, generates high-fidelity sensitivity algorithms, but with low computational efficiency and high memory requirements. New formulations of the CTSE are therefore proposed and applied. Selective differentiation of the method for solving the non-linear implicit residual equation leads to sensitivity algorithms with the same accuracy but improved run time. The time-dependent sensitivity derivatives are computed in run times comparable to those required by the FD approach.
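    The CTSE referred to above is widely known as complex-step differentiation: because f(x + ih) = f(x) + ih f'(x) + O(h²) for real-analytic f, the derivative Im(f(x + ih))/h involves no subtractive cancellation and h can be taken tiny. A minimal sketch (not the solver-specific implementation described in the thesis):

```python
import cmath

def complex_step_derivative(f, x, h=1e-20):
    """Complex Taylor's series expansion (complex-step) derivative:
    f'(x) ~= Im(f(x + i*h)) / h. Unlike finite differences, there is
    no subtraction of nearly equal numbers, so h can be ~1e-20."""
    return f(x + 1j * h).imag / h

# Compare against the analytic derivative of f(x) = exp(x) * sin(x)
f = lambda z: cmath.exp(z) * cmath.sin(z)
x0 = 0.7
exact = (cmath.exp(x0) * (cmath.sin(x0) + cmath.cos(x0))).real
print(abs(complex_step_derivative(f, x0) - exact))  # error near machine precision
```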

  7. Analysis of dispatching rules in a stochastic dynamic job shop manufacturing system with sequence-dependent setup times

    NASA Astrophysics Data System (ADS)

    Sharma, Pankaj; Jain, Ajai

    2014-12-01

    Stochastic dynamic job shop scheduling problems with sequence-dependent setup times are among the most difficult classes of scheduling problems. This paper assesses the performance of nine dispatching rules in such a shop in terms of makespan, mean flow time, maximum flow time, mean tardiness, maximum tardiness, number of tardy jobs, total setups and mean setup time. A discrete event simulation model of a stochastic dynamic job shop manufacturing system is developed for the investigation. Nine dispatching rules identified from the literature are incorporated in the simulation model. The simulation experiments are conducted under a due date tightness factor of 3, a shop utilization of 90% and setup times less than processing times. Results indicate that the shortest setup time (SIMSET) rule provides the best performance on the mean flow time and number of tardy jobs measures. The job with similar setup and modified earliest due date (JMEDD) rule provides the best performance on the makespan, maximum flow time, mean tardiness, maximum tardiness, total setups and mean setup time measures.
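    As an illustrative sketch (not the paper's simulation model), the core of the SIMSET rule is a greedy selection over a sequence-dependent setup-time matrix: of the jobs currently queued, dispatch next the one with the shortest setup time after the job just completed. The matrix below is invented for illustration:

```python
def simset_sequence(setup, start=0, queue=None):
    """Order jobs by the SIMSET rule: always dispatch next the queued job
    with the shortest setup time after the job just completed.

    setup[i][j] is the sequence-dependent setup time incurred when job j
    follows job i (static toy; a real shop would be simulated dynamically
    with stochastic arrivals and processing times).
    """
    if queue is None:
        queue = set(range(len(setup))) - {start}
    order, current = [start], start
    while queue:
        nxt = min(queue, key=lambda j: setup[current][j])
        order.append(nxt)
        queue.remove(nxt)
        current = nxt
    return order

setup = [
    [0, 5, 2, 9],
    [5, 0, 4, 1],
    [2, 4, 0, 7],
    [9, 1, 7, 0],
]
print(simset_sequence(setup))
```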

  8. How bootstrap can help in forecasting time series with more than one seasonal pattern

    NASA Astrophysics Data System (ADS)

    Cordeiro, Clara; Neves, M. Manuela

    2012-09-01

    The search for the future is an appealing challenge in time series analysis. The diversity of forecasting methodologies is inevitable and still expanding. Exponential smoothing methods are the launch platform for modelling and forecasting in time series analysis. Recently this methodology has been combined with bootstrapping, revealing good performance. The Boot.EXPOS algorithm, which combines exponential smoothing with bootstrap methodologies, has shown promising results for forecasting time series with one seasonal pattern. For series with more than one seasonal pattern, the double seasonal Holt-Winters methods and the corresponding exponential smoothing methods were developed. A new challenge was to combine these seasonal methods with the bootstrap and carry over a resampling scheme similar to the one used in the Boot.EXPOS procedure. The performance of this partnership is illustrated on some well-known data sets available in software.
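    The abstract gives no implementation details of Boot.EXPOS; a stdlib-only sketch of the general exponential-smoothing-plus-bootstrap idea (simple smoothing, no seasonality, so a simplification of the actual procedure) resamples in-sample residuals to attach an empirical interval to the point forecast:

```python
import random

def ses_forecast(x, alpha=0.3):
    """Simple exponential smoothing: return (one-step fitted values, next forecast)."""
    level, fitted = x[0], []
    for obs in x:
        fitted.append(level)              # forecast made before seeing obs
        level = alpha * obs + (1 - alpha) * level
    return fitted, level

def bootstrap_forecast(x, alpha=0.3, n_boot=500, seed=1):
    """Resample SES residuals to build an empirical 90% interval
    around the next-step point forecast."""
    random.seed(seed)
    fitted, point = ses_forecast(x, alpha)
    resid = [obs - f for obs, f in zip(x, fitted)]
    draws = sorted(point + random.choice(resid) for _ in range(n_boot))
    return point, (draws[int(0.05 * n_boot)], draws[int(0.95 * n_boot)])

series = [12, 13, 12, 14, 15, 14, 16, 15, 17, 16]
print(bootstrap_forecast(series))
```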

  9. Analysing playing using the note-time playing path.

    PubMed

    de Graaff, Deborah L E; Schubert, Emery

    2011-03-01

    This article introduces a new method of data analysis that represents the playing of written music as a graph. The method, inspired by Miklaszewski, charts low-level note timings from a sound recording of a single-line instrument using high-precision audio-to-MIDI conversion software. Note onset times of pitch sequences are then plotted against the score-predicted timings to produce a Note-Time Playing Path (NTPP). The score-predicted onset time of each sequentially performed note (horizontal axis) unfolds in performed time down the page (vertical axis). NTPPs provide a visualisation that shows (1) tempo variations, (2) repetitive practice behaviours, (3) segmenting of material, (4) precise note time positions, and (5) time spent on playing or not playing. The NTPP can provide significant new insights into behaviour and cognition of music performance and may also be used to complement established traditional approaches such as think-alouds, interviews, and video coding.
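    A minimal sketch of how NTPP coordinates are assembled once onsets have been extracted (the audio-to-MIDI step is outside the scope of this sketch; the onset values below are invented): score-predicted onset on one axis, performed onset on the other, so repeated practice of a passage appears as the score coordinate jumping backwards while performed time keeps increasing.

```python
def ntpp_points(score_onsets, performed_onsets):
    """Note-Time Playing Path: pair each performed note's score-predicted
    onset (horizontal axis) with its performed onset (vertical axis)."""
    return list(zip(score_onsets, performed_onsets))

# Toy run: the player repeats the first two notes before moving on
score = [0.0, 0.5, 0.0, 0.5, 1.0]      # score-predicted onsets (beats)
performed = [0.0, 0.6, 1.4, 1.9, 2.5]  # performed onsets (seconds)
print(ntpp_points(score, performed))
```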

  10. Temperature and time variations during osteotomies performed with different piezosurgical devices: an in vitro study.

    PubMed

    Delgado-Ruiz, R A; Sacks, D; Palermo, A; Calvo-Guirado, J L; Perez-Albacete, C; Romanos, G E

    2016-09-01

    The aim of this experimental in vitro study was to evaluate the effects of the piezoelectric device on temperature and time variations in standardized osteotomies performed with similar tip inserts in bovine bone blocks. Two different piezosurgical devices were used: the OE-F15(®) (Osada Inc., Los Angeles, California, USA) and the Surgybone(®) (Silfradent Inc., Sofia, Forli Cesena, Italy). Serrated inserts with similar geometry were coupled with each device (ST94 insert/test A and P0700 insert/test B). Osteotomies 10 mm long and 3 mm deep were performed in bone blocks resembling type II (dense) and type IV (soft) bone densities, with and without irrigation. Thermal changes and time variations were recorded. The effects of bone density, irrigation, and device on temperature changes and on the time necessary to accomplish the osteotomies were analyzed. Thermal analysis showed significantly higher temperatures during piezosurgery osteotomies in hard bone without irrigation (P < 0.05). The type of piezosurgical device did not influence thermal variations (P > 0.05). Time analysis showed that the mean time necessary to perform osteotomies was shorter in soft bone than in dense bone (P < 0.05). Within the limitations of this in vitro study, it may be concluded that the temperature increases more in piezosurgery osteotomies in dense bone without irrigation; that the time to perform the osteotomy with piezosurgery is shorter in soft bone than in hard bone; and that the piezosurgical device has a minimal influence on the temperature and time variations when a similar tip design is used during piezosurgery osteotomies. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  11. Combined Vocal Exercises for Rehabilitation After Supracricoid Laryngectomy: Evaluation of Different Execution Times.

    PubMed

    Silveira, Hevely Saray Lima; Simões-Zenari, Marcia; Kulcsar, Marco Aurélio; Cernea, Claudio Roberto; Nemr, Kátia

    2017-10-27

    Supracricoid partial laryngectomy allows the preservation of laryngeal functions with good local cancer control. The aim was to assess laryngeal configuration and voice analysis data following the performance of a combination of two vocal exercises, the prolonged /b/ exercise combined with the vowel /e/ using chest and arm pushing, over different execution times among individuals who have undergone supracricoid laryngectomy. Eleven patients who had undergone supracricoid partial laryngectomy with cricohyoidoepiglottopexy (CHEP) were evaluated using voice recordings. Four judges separately performed an auditory-perceptual analysis of the voices, with samples presented in random order. For the analysis of intrajudge reliability, 70% of the voices were rated twice. The intraclass correlation coefficient was used to analyze the reliability of the judges. For each judge, the Friedman test, with a significance level of 5%, was used to compare baseline (time point 0) with the recordings after the first series of exercises (time point 1), the second series (time point 2), the third series (time point 3), the fourth series (time point 4), and the fifth and final series (time point 5). The data on laryngeal configuration were subjected to a descriptive analysis. The evaluation considered the results of judge 1, who showed the greatest reliability. There was an improvement in the overall grade of vocal deviation, roughness, and breathiness from time point 4 (T4) onward. The prolonged /b/ exercise, combined with the vowel /e/ using chest- and arm-pushing exercises, was associated with an improvement in the overall grade of vocal deviation, roughness, and breathiness starting at minute 4 among patients who had undergone supracricoid laryngectomy with CHEP reconstruction. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  12. Integrated analysis of large space systems

    NASA Technical Reports Server (NTRS)

    Young, J. P.

    1980-01-01

    Based on the belief that actual flight hardware development of large space systems will necessitate a formalized method of integrating the various engineering discipline analyses, an efficient, highly user-oriented software system capable of performing interdisciplinary design analyses with tolerable solution turnaround time is planned. Specific analysis capability goals were set forth, with initial emphasis given to sequential and quasi-static thermal/structural analysis and fully coupled structural/control system analysis. Subsequently, the IAC would be expanded to include fully coupled thermal/structural/control system, electromagnetic radiation, and optical performance analyses.

  13. ConceFT for Time-Varying Heart Rate Variability Analysis as a Measure of Noxious Stimulation During General Anesthesia.

    PubMed

    Lin, Yu-Ting; Wu, Hau-Tieng

    2017-01-01

    Heart rate variability (HRV) offers a noninvasive way to peek into the physiological status of the human body. When this physiological status is dynamic, traditional HRV indices calculated from the power spectrum do not resolve the dynamic situation due to the issue of nonstationarity. Clinical anesthesia is a typically dynamic situation that calls for time-varying HRV analysis. Concentration of frequency and time (ConceFT) is a nonlinear time-frequency (TF) analysis generalizing the multitaper technique and the synchrosqueezing transform. The result is a sharp TF representation capturing the dynamics inside HRV. Companion indices of the commonly applied HRV indices, including time-varying low-frequency power (tvLF), time-varying high-frequency power, and the time-varying low-high ratio, are considered as measures of noxious stimulation. To evaluate the feasibility of the proposed indices, we apply them to two different types of noxious stimulation, endotracheal intubation and surgical skin incision, under general anesthesia. The performance was compared with traditional HRV indices, the heart rate reading, and indices from electroencephalography. The results indicate that the tvLF index performs best, outperforming not only the traditional HRV index but also the commonly used heart rate reading. With the help of ConceFT, the proposed HRV indices have the potential to provide a better quantification of the dynamic change of the autonomic nervous system. Our proposed scheme of time-varying HRV analysis could contribute to the clinical assessment of analgesia under general anesthesia.
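    ConceFT itself relies on multitaper synchrosqueezing; as a much simpler stand-in that conveys what a time-varying band-power index measures, one can slide a windowed spectrum along an evenly resampled RR series and integrate the standard HRV bands (LF 0.04-0.15 Hz, HF 0.15-0.4 Hz). Window and step lengths below are illustrative assumptions:

```python
import numpy as np

def sliding_band_power(rr, fs, band, win_s=60.0, step_s=5.0):
    """Time-varying band power of an evenly resampled RR series.

    A crude STFT stand-in for ConceFT (which sharpens the TF plane with
    multitaper synchrosqueezing): summed power in `band` (Hz) per window.
    """
    win, step = int(win_s * fs), int(step_s * fs)
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs < band[1])
    out = []
    for start in range(0, len(rr) - win + 1, step):
        seg = rr[start:start + win]
        seg = seg - seg.mean()
        spec = np.abs(np.fft.rfft(seg * np.hanning(win))) ** 2
        out.append(spec[mask].sum())
    return np.array(out)

# Synthetic RR series resampled at 4 Hz with a 0.1 Hz (LF) oscillation
fs = 4.0
t = np.arange(0, 300, 1 / fs)
rr = 0.8 + 0.05 * np.sin(2 * np.pi * 0.1 * t)
tvlf = sliding_band_power(rr, fs, band=(0.04, 0.15))
```

    On this synthetic series the LF track dominates its HF counterpart, as expected for a 0.1 Hz modulation.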

  14. Capability of the Gas Analysis and Testing Laboratory at the NASA Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Broerman, Craig; Jimenez, Javier; Sweterlitsch, Jeff

    2012-01-01

    The Gas Analysis and Testing Laboratory is an integral part of the testing performed at the NASA Johnson Space Center. The Gas Analysis and Testing Laboratory is a high performance laboratory providing real time analytical instruments to support manned and unmanned testing. The lab utilizes precision gas chromatographs, gas analyzers and spectrophotometers to support the technology development programs within the NASA community. The Gas Analysis and Testing Laboratory works with a wide variety of customers and provides engineering support for user-specified applications in compressed gas, chemical analysis, general and research laboratory.

  15. Capability of the Gas Analysis and Testing Laboratory at the NASA Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Broerman, Craig; Jimenez, Javier; Sweterlitsch, Jeff

    2011-01-01

    The Gas Analysis and Testing Laboratory is an integral part of the testing performed at the NASA Johnson Space Center. The Gas Analysis and Testing Laboratory is a high performance laboratory providing real time analytical instruments to support manned and unmanned testing. The lab utilizes precision gas chromatographs, gas analyzers and spectrophotometers to support the technology development programs within the NASA community. The Gas Analysis and Testing Laboratory works with a wide variety of customers and provides engineering support for user-specified applications in compressed gas, chemical analysis, general and research laboratory.

  16. Does Vitamin D Supplementation Enhance Musculoskeletal Performance in Individuals Identified as Vitamin D Deficient through Blood Spot Testing?

    NASA Astrophysics Data System (ADS)

    Murphy, Kellie A.

    This thesis investigated possible changes in performance after one month of vitamin D supplementation in individuals found to be vitamin D deficient or insufficient through blood spot testing. Thirty-two males, ages 18-32, participated. Each subject visited the lab three times in one month, completing four performance tests each session: an isometric mid-thigh pull and a vertical jump on a force plate, an isometric 90-degree elbow flexion test using a load cell, and a psychomotor vigilance test on a palm pilot. The initial lab session included blood spot tests to determine vitamin D levels. In a single-blind manner, 16 subjects were assigned vitamin D and 16 the placebo. Repeated measures ANOVA analysis did not reveal any main effects for time (F=2.626, p=0.364), treatment (vitamin D3 vs placebo; F=1.282, p=0.999), or interaction effects for treatment by time (F=0.304, p=0.999) for maximum force production during an isometric mid-thigh pull. Repeated measures ANOVA analysis did not reveal any main effects for time (F=1.323, p=0.999), treatment (vitamin D3 vs placebo; F=0.510, p=0.999), or interaction effects for treatment by time (F=1.625, p=0.860) for rate of force production during a vertical jump. Repeated measures ANOVA analysis did not reveal any main effects for time (F=0.194, p=0.999), treatment (vitamin D3 vs placebo; F=2.452, p=0.513), or interaction effects for treatment by time (F=1.179, p=0.999) for maximal force production during a 90-degree isometric elbow flexion. Repeated measures ANOVA analysis did not reveal any main effects for time (F=1.710, p=0.804), treatment (vitamin D3 vs placebo; F=1.471, p=0.94), or interaction effects for treatment by time (F=0.293, p=0.999) for mean reaction time to random stimuli during the psychomotor vigilance test. 
Repeated measures ANOVA analysis did not reveal any main effects for time (F=0.530, p=0.999), treatment (vitamin D3 vs placebo; F=0.141, p=0.999), or interaction effects for treatment by time (F=0.784 p=0.999) for incidence of minor lapses during the psychomotor vigilance test.

  17. Performance Analysis of a De-correlated Modified Code Tracking Loop for Synchronous DS-CDMA System under Multiuser Environment

    NASA Astrophysics Data System (ADS)

    Wu, Ya-Ting; Wong, Wai-Ki; Leung, Shu-Hung; Zhu, Yue-Sheng

    This paper presents the performance analysis of a De-correlated Modified Code Tracking Loop (D-MCTL) for synchronous direct-sequence code-division multiple-access (DS-CDMA) systems in a multiuser environment. Previous studies have shown that the imbalance of multiple access interference (MAI) in the time-lead and time-lag portions of the signal causes tracking bias or instability problems in traditional correlating tracking loops such as the delay lock loop (DLL) or the modified code tracking loop (MCTL). In this paper, we exploit the de-correlating technique to combat the MAI at the on-time code position of the MCTL. Unlike applying the same technique to the DLL, which requires an extensive search algorithm to compensate for the noise imbalance and may introduce a small tracking bias under low signal-to-noise ratio (SNR), the proposed D-MCTL has much lower computational complexity and exhibits zero tracking bias over the whole range of SNR, regardless of the number of interfering users. Furthermore, performance analysis and simulations based on Gold codes show that the proposed scheme has better mean square tracking error, mean time to lose lock and near-far resistance than the other tracking schemes, including the traditional DLL (T-DLL), the traditional MCTL (T-MCTL) and the modified de-correlated DLL (MD-DLL).

  18. Stability analysis of implicit time discretizations for the Compton-scattering Fokker-Planck equation

    NASA Astrophysics Data System (ADS)

    Densmore, Jeffery D.; Warsa, James S.; Lowrie, Robert B.; Morel, Jim E.

    2009-09-01

    The Fokker-Planck equation is a widely used approximation for modeling the Compton scattering of photons in high energy density applications. In this paper, we perform a stability analysis of three implicit time discretizations for the Compton-Scattering Fokker-Planck equation. Specifically, we examine (i) a Semi-Implicit (SI) scheme that employs backward-Euler differencing but evaluates temperature-dependent coefficients at their beginning-of-time-step values, (ii) a Fully Implicit (FI) discretization that instead evaluates temperature-dependent coefficients at their end-of-time-step values, and (iii) a Linearized Implicit (LI) scheme, which is developed by linearizing the temperature dependence of the FI discretization within each time step. Our stability analysis shows that the FI and LI schemes are unconditionally stable and cannot generate oscillatory solutions regardless of time-step size, whereas the SI discretization can suffer from instabilities and nonphysical oscillations for sufficiently large time steps. With the results of this analysis, we present time-step limits for the SI scheme that prevent undesirable behavior. We test the validity of our stability analysis and time-step limits with a set of numerical examples.
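    The SI-versus-FI distinction can be illustrated on a toy scalar cooling law (an invented analogue, not the Fokker-Planck system itself): take dT/dt = S - T⁵ with equilibrium T = 1, and in the SI scheme lag the sink coefficient T⁴ at its beginning-of-step value while treating the remaining factor implicitly. For large time steps the lagged coefficient drives nonphysical oscillations, whereas the fully implicit update stays near equilibrium:

```python
def si_step(T, dt, S=1.0):
    """Semi-implicit: sink coefficient c(T) = T^4 frozen at the start of
    the step, backward Euler otherwise: T_new = (T + dt*S)/(1 + dt*T^4)."""
    return (T + dt * S) / (1.0 + dt * T**4)

def fi_step(T, dt, S=1.0):
    """Fully implicit: solve T_new + dt*T_new^5 = T + dt*S by bisection
    (the left-hand side is monotone increasing in T_new)."""
    target = T + dt * S
    lo, hi = 0.0, target
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mid + dt * mid**5 < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Large step (dt = 20) from T = 0.5; equilibrium is T = 1
dt, T_si, T_fi = 20.0, 0.5, 0.5
si_hist, fi_hist = [T_si], [T_fi]
for _ in range(10):
    T_si, T_fi = si_step(T_si, dt), fi_step(T_fi, dt)
    si_hist.append(T_si)
    fi_hist.append(T_fi)
```

    The SI trajectory ping-pongs between overheated and overcooled states, while the FI trajectory lands on the equilibrium in one step, mirroring the paper's conclusion that evaluating temperature-dependent coefficients at end-of-step values removes the time-step restriction.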

  19. Independent component analysis decomposition of hospital emergency department throughput measures

    NASA Astrophysics Data System (ADS)

    He, Qiang; Chu, Henry

    2016-05-01

    We present a method adapted from medical sensor data analysis, viz. independent component analysis of electroencephalography data, to health system analysis. Timely and effective care in a hospital emergency department is measured by throughput measures such as the median time patients spent before they were admitted as an inpatient, before they were sent home, and before they were seen by a healthcare professional. We consider a set of five such measures collected at 3,086 hospitals distributed across the U.S. One model of the performance of an emergency department is that these correlated throughput measures are linear combinations of some underlying sources. The independent component analysis decomposition of the data set can thus be viewed as transforming a set of performance measures collected at a site into a collection of outputs of spatial filters applied to the whole multi-measure data. We compare the independent component sources with the output of conventional principal component analysis to show that the independent components are more suitable for understanding the data sets through visualizations.
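    As a hypothetical illustration of the decomposition step (a minimal FastICA on synthetic mixtures, not the authors' pipeline; a real analysis would use an established implementation such as scikit-learn's FastICA):

```python
import numpy as np

def fastica(X, n_iter=200, seed=0):
    """Minimal symmetric FastICA with a tanh nonlinearity.

    X: (n_signals, n_samples) mixed observations, rows are signals.
    Returns estimated sources (up to permutation, sign, and scale).
    Sketch only: fixed iteration count, no convergence check.
    """
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))          # whiten to identity covariance
    Z = (E @ np.diag(d ** -0.5) @ E.T) @ X
    n, m = Z.shape
    W = rng.standard_normal((n, n))
    for _ in range(n_iter):
        G = np.tanh(W @ Z)
        W = (G @ Z.T) / m - np.diag((1.0 - G**2).mean(axis=1)) @ W
        U, _, Vt = np.linalg.svd(W)           # symmetric decorrelation
        W = U @ Vt
    return W @ Z

# Two independent non-Gaussian sources, linearly mixed
t = np.linspace(0.0, 8.0, 2000)
S = np.vstack([np.sin(2 * np.pi * t), 2.0 * ((1.3 * t) % 1.0) - 1.0])
A = np.array([[1.0, 0.5], [0.5, 1.0]])
S_est = fastica(A @ S)
```

    The recovered rows match the original sine and sawtooth sources up to sign and scale, which is the sense in which ICA "unmixes" the correlated throughput measures in the record above.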

  20. IoGET: Internet of Geophysical and Environmental Things

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mudunuru, Maruti Kumar

    The objective of this project is to provide novel and fast reduced-order models for onboard computation at sensor nodes for real-time analysis. The approach will require that LANL perform high-fidelity numerical simulations, construct simple reduced-order models (ROMs) using machine learning and signal processing algorithms, and use real-time data analysis for ROMs and compressive sensing at sensor nodes.

  1. The Contribution of Human Factors in Military System Development: Methodological Considerations

    DTIC Science & Technology

    1980-07-01

    Risk/Uncertainty Analysis - Project Scoring - Utility Scales - Relevance Tree Techniques (Reverse Factor Analysis) 2. Computer Simulation Simulation...effectiveness of mathematical models for R&D project selection. Management Science, April 1973, 18. 6-43 .1~ *.-. Souder, W.E. h scoring methodology for...per some interval PROFICIENCY test scores (written) RADIATION radiation effects aircrew performance on radiation environments REACTION TIME 1) (time

  2. Real-Time Optical Image Processing Techniques

    DTIC Science & Technology

    1988-10-31

    pursued through the analysis, design, and fabrication of pulse frequency modulated halftone screens and the modification of micro-chan- nel spatial...required for non-linear operation. Real-time nonlinear processing was performed using the halftone screen and MSLM. The experiments showed the effectiveness...pulse frequency modulation has been pursued through the analysis, design, and fabrication of pulse frequency modulated halftone screens and the

  3. Transcriptomic and bioinformatics analysis of the early time-course of the response to prostaglandin F2 alpha in the bovine corpus luteum

    USDA-ARS?s Scientific Manuscript database

    RNA expression analysis was performed on the corpus luteum tissue at five time points after prostaglandin F2 alpha treatment of midcycle cows using an Affymetrix Bovine Gene v1 Array. The normalized linear microarray data was uploaded to the NCBI GEO repository (GSE94069). Subsequent statistical ana...

  4. Spectral Unmixing Analysis of Time Series Landsat 8 Images

    NASA Astrophysics Data System (ADS)

    Zhuo, R.; Xu, L.; Peng, J.; Chen, Y.

    2018-05-01

    Temporal analysis of Landsat 8 images opens up new opportunities in the unmixing procedure. Although spectral analysis of time series Landsat imagery has its own advantages, it has rarely been studied. Nevertheless, using the temporal information can provide improved unmixing performance when compared to independent image analyses. Moreover, different land cover types may demonstrate different temporal patterns, which can aid their discrimination. Therefore, this letter presents time series K-P-Means, a new solution to the problem of unmixing time series Landsat imagery. The proposed approach obtains "purified" pixels in order to achieve optimal unmixing performance. The vertex component analysis (VCA) is used to extract endmembers for initialization. First, nonnegative least squares (NNLS) is used to estimate abundance maps from the endmembers. Then, each endmember is re-estimated as the mean value of its "purified" pixels, where a purified pixel is the residual of the mixed pixel after excluding the contributions of all nondominant endmembers. Assembling the two main steps (abundance estimation and endmember update) into an iterative optimization framework yields the complete algorithm. Experiments using both simulated and real Landsat 8 images show that the proposed "joint unmixing" approach provides more accurate endmember and abundance estimates than the "separate unmixing" approach.
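    A minimal sketch of the NNLS abundance-estimation step (projected gradient descent standing in for a library NNLS routine such as scipy.optimize.nnls; the endmember spectra below are invented for illustration):

```python
import numpy as np

def nnls_pg(E, x, n_iter=5000):
    """Nonnegative least squares by projected gradient descent:
    minimize ||E a - x||^2 subject to a >= 0, with step 1/L where
    L is the largest eigenvalue of E^T E."""
    step = 1.0 / np.linalg.norm(E.T @ E, 2)
    a = np.zeros(E.shape[1])
    for _ in range(n_iter):
        a = np.maximum(0.0, a - step * (E.T @ (E @ a - x)))
    return a

# Toy endmember matrix: columns are spectra of 3 materials over 5 bands
E = np.array([[0.9, 0.1, 0.3],
              [0.8, 0.2, 0.4],
              [0.3, 0.9, 0.5],
              [0.2, 0.8, 0.6],
              [0.1, 0.7, 0.9]])
true_a = np.array([0.6, 0.3, 0.1])   # true abundances of a mixed pixel
x = E @ true_a                        # noiseless mixed spectrum
a_hat = nnls_pg(E, x)
```

    In the K-P-Means loop, this per-pixel solve alternates with the endmember update over the "purified" pixels until convergence.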

  5. Watching the clock

    PubMed Central

    Fetterman, J. Gregor; Killeen, Peter R.; Hall, Scott

    2008-01-01

    Four rats and four pigeons were monitored while performing retrospective timing tasks. All animals displayed collateral behaviors that could have mediated their temporal judgements. Statistical analysis provided good evidence for such mediation for the two pigeons performing on a spatially differentiated response, but not for the two responding on a color-differentiated response. For the rats, all of which performed on a spatially differentiated task, prediction of their temporal judgements was always better when based on collateral activity than when based on the passage of time. PMID:19701487

  6. Characterizing Time Series Data Diversity for Wind Forecasting: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hodge, Brian S; Chartan, Erol Kevin; Feng, Cong

    Wind forecasting plays an important role in integrating variable and uncertain wind power into the power grid. Various forecasting models have been developed to improve forecasting accuracy. However, it is challenging to accurately compare the true forecasting performance of different methods and forecasters due to the lack of diversity in forecasting test datasets. This paper proposes a time series characteristic analysis approach to visualize and quantify wind time series diversity. The developed method first calculates six time series characteristic indices from various perspectives. Then principal component analysis is performed to reduce the data dimension while preserving the important information. The diversity of the time series dataset is visualized by the geometric distribution of the newly constructed principal component space. The volume of the 3-dimensional (3D) convex polytope (or the length of the 1D number axis, or the area of the 2D convex polygon) is used to quantify the time series data diversity. The method is tested with five datasets with various degrees of diversity.
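    A minimal sketch of the diversity metric described above, assuming a hypothetical table of six characteristic indices per wind time series: PCA reduces the indices to three components, and the convex-hull volume in that space quantifies diversity. The data and index count are stand-ins, not the paper's datasets.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.spatial import ConvexHull

rng = np.random.default_rng(1)

# Hypothetical characteristic table: one row per wind time series, six indices
# (e.g., mean, variance, skewness, kurtosis, autocorrelation, spectral entropy).
X = rng.standard_normal((40, 6))

# Reduce to 3 principal components while preserving most of the variance.
pca = PCA(n_components=3)
Z = pca.fit_transform(X)

# Diversity = volume of the 3-D convex hull of the dataset in PC space.
diversity = ConvexHull(Z).volume
print(f"explained variance: {pca.explained_variance_ratio_.sum():.2f}, "
      f"hull volume: {diversity:.2f}")
```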

  7. Two-dimensional statistical linear discriminant analysis for real-time robust vehicle-type recognition

    NASA Astrophysics Data System (ADS)

    Zafar, I.; Edirisinghe, E. A.; Acar, S.; Bez, H. E.

    2007-02-01

    Automatic vehicle Make and Model Recognition (MMR) systems provide useful performance enhancements to vehicle recognition systems that are based solely on Automatic License Plate Recognition (ALPR). Several car MMR systems have been proposed in the literature. However, these approaches rely on feature detection algorithms that can perform sub-optimally under adverse lighting and/or occlusion conditions. In this paper we propose a real-time, appearance-based car MMR approach using Two-Dimensional Linear Discriminant Analysis (2D-LDA) that is capable of addressing this limitation. We provide experimental results to analyse the proposed algorithm's robustness under varying illumination and occlusion conditions. We show that the best performance with the proposed 2D-LDA based car MMR approach is obtained when the eigenvectors of lower significance are ignored. For the given database of 200 car images covering 25 different make-model classifications, a best accuracy of 91% was obtained with the 2D-LDA approach. We use a direct Principal Component Analysis (PCA) based approach as a benchmark to compare and contrast the performance of the proposed 2D-LDA approach. We conclude that, in general, the 2D-LDA based algorithm outperforms the PCA based approach.
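    The 2D-LDA projection the abstract relies on can be sketched roughly as follows, using randomly generated stand-in "images". The class count, image size, and retained dimension d are all illustrative; the real system's preprocessing and classifier are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical data: 3 classes of 16x16 "vehicle" images, 10 samples each.
m, n, per_class, n_classes = 16, 16, 10, 3
protos = rng.standard_normal((n_classes, m, n))
images = np.array([protos[c] + 0.3 * rng.standard_normal((m, n))
                   for c in range(n_classes) for _ in range(per_class)])
labels = np.repeat(np.arange(n_classes), per_class)

M = images.mean(axis=0)                           # global mean image
Sb = np.zeros((n, n))
Sw = np.zeros((n, n))
for c in range(n_classes):
    Ac = images[labels == c]
    Mc = Ac.mean(axis=0)
    Sb += len(Ac) * (Mc - M).T @ (Mc - M)         # between-class image scatter
    Sw += sum((A - Mc).T @ (A - Mc) for A in Ac)  # within-class image scatter

# Discriminant directions: eigenvectors of Sw^-1 Sb, most significant first.
vals, vecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
order = np.argsort(vals.real)[::-1]
d = 4                                             # ignore low-significance eigenvectors
W = vecs.real[:, order[:d]]

features = images @ W                             # projected (samples, m, d) features
print(features.shape)
```

    Keeping only the top d columns of W mirrors the abstract's observation that discarding the eigenvectors of lower significance gives the best performance.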

  8. The Development of a Handbook for Astrobee F Performance and Stability Analysis

    NASA Technical Reports Server (NTRS)

    Wolf, R. S.

    1982-01-01

    An Astrobee F performance and stability analysis is presented for use by the NASA Sounding Rocket Division. The performance analysis provides information regarding altitude, Mach number, dynamic pressure, and velocity as functions of time since launch. It is found that payload weight has the greatest effect on performance, and performance prediction accuracy was calculated to remain within 1%. In addition, to assure sufficient flight stability, a predicted rigid-body static margin of at least 8% of the total vehicle length is required. Finally, fin cant angle predictions are given in order to achieve a 2.5 cycle-per-second burnout roll rate, based on obtaining 75% of the steady roll rate. It is noted that this method can be used by flight performance engineers to create a similar handbook for any sounding rocket series.

  9. Some dynamical aspects of interacting quintessence model

    NASA Astrophysics Data System (ADS)

    Choudhury, Binayak S.; Mondal, Himadri Shekhar; Chatterjee, Devosmita

    2018-04-01

    In this paper, we consider a particular form of coupling, namely B = σ(ρ̇_m − ρ̇_φ), in spatially flat (k = 0) Friedmann-Lemaitre-Robertson-Walker (FLRW) space-time. We perform phase-space analysis for this interacting quintessence (dark energy) and dark matter model for different numerical values of the parameters. We also show the phase-space analysis for the `best-fit Universe' or concordance model. In our analysis, we observe the existence of late-time scaling attractors.

  10. An evidence accumulation model for conflict detection performance in a simulated air traffic control task.

    PubMed

    Neal, Andrew; Kwantes, Peter J

    2009-04-01

    The aim of this article is to develop a formal model of conflict detection performance. Our model assumes that participants iteratively sample evidence regarding the state of the world and accumulate it over time. A decision is made when the evidence reaches a threshold that changes over time in response to the increasing urgency of the task. Two experiments were conducted to examine the effects of conflict geometry and timing on response proportions and response time. The model is able to predict the observed pattern of response times, including a nonmonotonic relationship between distance at point of closest approach and response time, as well as effects of angle of approach and relative velocity. The results demonstrate that evidence accumulation models provide a good account of performance on a conflict detection task. Evidence accumulation models are a form of dynamic signal detection theory, allowing for the analysis of response times as well as response proportions, and can be used for simulating human performance on dynamic decision tasks.

  11. Effect of education on listening comprehension of sentences on healthy elderly: analysis of number of correct responses and task execution time.

    PubMed

    Silagi, Marcela Lima; Rabelo, Camila Maia; Schochat, Eliane; Mansur, Letícia Lessa

    2017-11-13

    To analyze the effect of education on sentence listening comprehension on cognitively healthy elderly. A total of 111 healthy elderly, aged 60-80 years of both genders were divided into two groups according to educational level: low education (0-8 years of formal education) and high education (≥9 years of formal education). The participants were assessed using the Revised Token Test, an instrument that supports the evaluation of auditory comprehension of orders with different working memory and syntactic complexity demands. The indicators used for performance analysis were the number of correct responses (accuracy analysis) and task execution time (temporal analysis) in the different blocks. The low educated group had a lower number of correct responses than the high educated group on all blocks of the test. In the temporal analysis, participants with low education had longer execution time for commands on the first four blocks related to working memory. However, the two groups had similar execution time for blocks more related to syntactic comprehension. Education influenced sentence listening comprehension on elderly. Temporal analysis allowed to infer over the relationship between comprehension and other cognitive abilities, and to observe that the low educated elderly did not use effective compensation strategies to improve their performances on the task. Therefore, low educational level, associated with aging, may potentialize the risks for language decline.

  12. Development and Implementation of a Generic Analysis Template for Structural-Thermal-Optical-Performance Modeling

    NASA Technical Reports Server (NTRS)

    Scola, Salvatore; Stavely, Rebecca; Jackson, Trevor; Boyer, Charlie; Osmundsen, Jim; Turczynski, Craig; Stimson, Chad

    2016-01-01

    Performance-related effects of system level temperature changes can be a key consideration in the design of many types of optical instruments. This is especially true for space-based imagers, which may require complex thermal control systems to maintain alignment of the optical components. Structural-Thermal-Optical-Performance (STOP) analysis is a multi-disciplinary process that can be used to assess the performance of these optical systems when subjected to the expected design environment. This type of analysis can be very time consuming, which makes it difficult to use as a trade study tool early in the project life cycle. In many cases, only one or two iterations can be performed over the course of a project. This limits the design space to best practices since it may be too difficult, or take too long, to test new concepts analytically. In order to overcome this challenge, automation, and a standard procedure for performing these studies is essential. A methodology was developed within the framework of the Comet software tool that captures the basic inputs, outputs, and processes used in most STOP analyses. This resulted in a generic, reusable analysis template that can be used for design trades for a variety of optical systems. The template captures much of the upfront setup such as meshing, boundary conditions, data transfer, naming conventions, and post-processing, and therefore saves time for each subsequent project. A description of the methodology and the analysis template is presented, and results are described for a simple telescope optical system.

  13. Development and implementation of a generic analysis template for structural-thermal-optical-performance modeling

    NASA Astrophysics Data System (ADS)

    Scola, Salvatore; Stavely, Rebecca; Jackson, Trevor; Boyer, Charlie; Osmundsen, Jim; Turczynski, Craig; Stimson, Chad

    2016-09-01

    Performance-related effects of system level temperature changes can be a key consideration in the design of many types of optical instruments. This is especially true for space-based imagers, which may require complex thermal control systems to maintain alignment of the optical components. Structural-Thermal-Optical-Performance (STOP) analysis is a multi-disciplinary process that can be used to assess the performance of these optical systems when subjected to the expected design environment. This type of analysis can be very time consuming, which makes it difficult to use as a trade study tool early in the project life cycle. In many cases, only one or two iterations can be performed over the course of a project. This limits the design space to best practices since it may be too difficult, or take too long, to test new concepts analytically. In order to overcome this challenge, automation, and a standard procedure for performing these studies is essential. A methodology was developed within the framework of the Comet software tool that captures the basic inputs, outputs, and processes used in most STOP analyses. This resulted in a generic, reusable analysis template that can be used for design trades for a variety of optical systems. The template captures much of the upfront setup such as meshing, boundary conditions, data transfer, naming conventions, and post-processing, and therefore saves time for each subsequent project. A description of the methodology and the analysis template is presented, and results are described for a simple telescope optical system.

  14. Analysis of Thermal and Reaction Times for Hydrogen Reduction of Lunar Regolith

    NASA Technical Reports Server (NTRS)

    Hegde, U.; Balasubramaniam, R.; Gokoglu, S.

    2008-01-01

    System analysis of oxygen production by hydrogen reduction of lunar regolith has shown the importance of the relative time scales for regolith heating and chemical reaction to overall performance. These values determine the sizing and power requirements of the system and also impact the number and operational phasing of reaction chambers. In this paper, a Nusselt number correlation analysis is performed to determine the heat transfer rates and regolith heat-up times in a fluidized bed reactor heated by a central heating element (e.g., a resistively heated rod, or a solar concentrator heat pipe). A coupled chemical and transport model has also been developed for the chemical reduction of regolith by a continuous flow of hydrogen. The regolith conversion occurs on the surfaces of and within the regolith particles. Several important quantities are identified as a result of the above analyses. Reactor scale parameters include the void fraction (i.e., the fraction of the reactor volume not occupied by the regolith particles) and the residence time of hydrogen in the reactor. Particle scale quantities include the particle Reynolds number, the Archimedes number, and the time needed for hydrogen to diffuse into the pores of the regolith particles. The analysis is used to determine the heat-up and reaction times, and its application to NASA's oxygen production system modeling tool is noted.
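    The dimensionless groups named in the abstract can be illustrated with a back-of-the-envelope calculation. All property values below are assumptions chosen for illustration, not figures from the paper, and the Nusselt correlation is a generic Ranz-Marshall-type form rather than the authors' specific correlation.

```python
# Illustrative dimensionless-group estimates for a hydrogen-fluidized regolith bed.
rho_g = 0.08        # hydrogen density at reactor conditions, kg/m^3 (assumed)
mu_g  = 1.8e-5      # hydrogen dynamic viscosity, Pa*s (assumed)
k_g   = 0.3         # hydrogen thermal conductivity, W/(m*K) (assumed)
Pr    = 0.7         # gas Prandtl number (assumed)
rho_p = 3100.0      # regolith particle density, kg/m^3 (assumed)
cp_p  = 840.0       # regolith specific heat, J/(kg*K) (assumed)
d_p   = 100e-6      # particle diameter, m (assumed)
u     = 0.5         # superficial gas velocity, m/s (assumed)
g     = 1.62        # lunar gravity, m/s^2

Re_p = rho_g * u * d_p / mu_g                          # particle Reynolds number
Ar   = rho_g * (rho_p - rho_g) * g * d_p**3 / mu_g**2  # Archimedes number
Nu   = 2.0 + 0.6 * Re_p**0.5 * Pr**(1.0 / 3.0)         # Ranz-Marshall-type Nusselt number
h    = Nu * k_g / d_p                                  # gas-particle heat transfer coefficient
tau  = rho_p * cp_p * d_p / (6.0 * h)                  # lumped-particle heat-up time scale, s
print(f"Re_p={Re_p:.3f}, Ar={Ar:.3e}, Nu={Nu:.2f}, tau={tau:.4f} s")
```

    With these numbers a single 100 µm particle equilibrates almost instantly; the system-level heat-up time in the paper is instead governed by bed-scale heat transfer from the central element.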

  15. Analysis of Thermal and Reaction Times for Hydrogen Reduction of Lunar Regolith

    NASA Technical Reports Server (NTRS)

    Hegde, U.; Balasubramaniam, R.; Gokoglu, S.

    2009-01-01

    System analysis of oxygen production by hydrogen reduction of lunar regolith has shown the importance of the relative time scales for regolith heating and chemical reaction to overall performance. These values determine the sizing and power requirements of the system and also impact the number and operational phasing of reaction chambers. In this paper, a Nusselt number correlation analysis is performed to determine the heat transfer rates and regolith heat-up times in a fluidized bed reactor heated by a central heating element (e.g., a resistively heated rod, or a solar concentrator heat pipe). A coupled chemical and transport model has also been developed for the chemical reduction of regolith by a continuous flow of hydrogen. The regolith conversion occurs on the surfaces of and within the regolith particles. Several important quantities are identified as a result of the above analyses. Reactor scale parameters include the void fraction (i.e., the fraction of the reactor volume not occupied by the regolith particles) and the residence time of hydrogen in the reactor. Particle scale quantities include the particle Reynolds number, the Archimedes number, and the time needed for hydrogen to diffuse into the pores of the regolith particles. The analysis is used to determine the heat-up and reaction times, and its application to NASA's oxygen production system modeling tool is noted.

  16. Comparison of Seismic Responses for Reinforced Concrete Buildings with Mass and Stiffness Irregularities Using Pushover and Nonlinear Time History Analysis

    NASA Astrophysics Data System (ADS)

    Teruna, D. R.

    2017-03-01

    Pushover analysis, also known as the nonlinear static procedure (NSP), has been recognized in recent years as a practical tool for evaluating seismic demands and for structural design, by estimating a building's structural capacities and deformation demands. By comparing these demands and capacities at the performance level of interest, the seismic performance of a building can be evaluated. However, the accuracy of NSP for assessing irregular buildings is not yet fully satisfactory, since irregularities influence the dynamic responses of a building. The objective of the study presented herein is to understand the nonlinear behaviour of a six-story RC building with mass irregularities at different floors and a stiffness irregularity at the first story (soft story) using NSP. For comparison with the performance levels obtained with NSP, nonlinear time history analyses (THA) were also performed under ground motion excitations compatible with the design response spectrum. Finally, the formation of plastic hinges and their progressive development from the elastic level to collapse prevention are presented and discussed.

  17. Academic Performance and Lifestyle Behaviors in Australian School Children: A Cluster Analysis

    ERIC Educational Resources Information Center

    Dumuid, Dorothea; Olds, Timothy; Martín-Fernández, Josep-Antoni; Lewis, Lucy K.; Cassidy, Leah; Maher, Carol

    2017-01-01

    Poor academic performance has been linked with particular lifestyle behaviors, such as unhealthy diet, short sleep duration, high screen time, and low physical activity. However, little is known about how lifestyle behavior patterns (or combinations of behaviors) contribute to children's academic performance. We aimed to compare academic…

  18. Quantitative analysis of regional myocardial performance in coronary artery disease

    NASA Technical Reports Server (NTRS)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings from a group of subjects with significant coronary artery stenosis and from a group of controls are presented, determined by use of a quantitative method for the study of regional myocardial performance based on frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions and the timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  19. Big Data Toolsets to Pharmacometrics: Application of Machine Learning for Time-to-Event Analysis.

    PubMed

    Gong, Xiajing; Hu, Meng; Zhao, Liang

    2018-05-01

    Additional value can be potentially created by applying big data tools to address pharmacometric problems. The performances of machine learning (ML) methods and the Cox regression model were evaluated based on simulated time-to-event data synthesized under various preset scenarios, i.e., with linear vs. nonlinear and dependent vs. independent predictors in the proportional hazard function, or with high-dimensional data featured by a large number of predictor variables. Our results showed that ML-based methods outperformed the Cox model in prediction performance as assessed by concordance index and in identifying the preset influential variables for high-dimensional data. The prediction performances of ML-based methods are also less sensitive to data size and censoring rates than the Cox regression model. In conclusion, ML-based methods provide a powerful tool for time-to-event analysis, with a built-in capacity for high-dimensional data and better performance when the predictor variables assume nonlinear relationships in the hazard function. © 2018 The Authors. Clinical and Translational Science published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
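    The concordance index used above as the evaluation metric can be computed directly. The following self-contained sketch implements Harrell's C on simulated time-to-event data; the data-generating model and censoring scheme are invented for illustration.

```python
import numpy as np

def concordance_index(time, event, risk):
    """Harrell's C: fraction of comparable pairs whose predicted risk ordering
    matches the observed event ordering (ties in risk count as 0.5)."""
    concordant, comparable = 0.0, 0
    n = len(time)
    for i in range(n):
        for j in range(n):
            # Pair (i, j) is comparable if subject i had an event before time j.
            if event[i] and time[i] < time[j]:
                comparable += 1
                if risk[i] > risk[j]:
                    concordant += 1.0
                elif risk[i] == risk[j]:
                    concordant += 0.5
    return concordant / comparable

rng = np.random.default_rng(3)
risk = rng.standard_normal(200)            # linear predictor for 200 subjects
time = rng.exponential(np.exp(-risk))      # higher risk -> earlier event (toy model)
event = rng.random(200) < 0.7              # ~30% of observations flagged censored
print(f"C-index: {concordance_index(time, event, risk):.3f}")
```

    A C-index of 0.5 corresponds to random ranking and 1.0 to perfect ranking; the abstract's ML-vs-Cox comparison is based on exactly this kind of pairwise concordance.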

  20. Engaging the Workforce - 12347

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaden, Michael D.; Wastren Advantage Inc.

    2012-07-01

    Likert, Covey, and a number of others studying and researching highly effective organizations have found that performing functions such as problem-solving, decision-making, safety analysis, planning, and continuous improvement as close to the working-floor level as possible results in greater buy-in, feelings of ownership by the workers, and more effective use of resources. Empowering the workforce does several things: 1) people put more effort and thought into work for which they feel ownership, 2) the information they use for planning, analysis, problem-solving, and decision-making is more accurate, 3) these functions are performed in a more timely manner, and 4) the results of these functions have more credibility with those who must implement them. This act of delegation and empowerment also allows management more time to perform functions they are uniquely trained and qualified to perform, such as strategic planning, staff development, succession planning, and organizational improvement. To achieve this state in an organization, however, requires a very open, transparent culture in which accurate, timely, relevant, candid, and inoffensive communication flourishes, a situation that does not currently exist in a majority of organizations. (authors)

  1. A comparative analysis of alternative approaches for quantifying nonlinear dynamics in cardiovascular system.

    PubMed

    Chen, Yun; Yang, Hui

    2013-01-01

    Heart rate variability (HRV) analysis has emerged as an important research topic for evaluating autonomic cardiac function. However, traditional time- and frequency-domain analyses characterize and quantify only linear and stationary phenomena. In the present investigation, we made a comparative analysis of three alternative approaches (i.e., wavelet multifractal analysis, Lyapunov exponents, and multiscale entropy analysis) for quantifying nonlinear dynamics in heart rate time series. These extracted nonlinear features provide information about nonlinear scaling behaviors and the complexity of cardiac systems. To evaluate performance, we used 24-hour HRV recordings from 54 healthy subjects and 29 heart failure patients, available in PhysioNet. The three nonlinear methods were evaluated not only individually but also in combination, using three classification algorithms: linear discriminant analysis, quadratic discriminant analysis, and k-nearest neighbors. Experimental results show that the three nonlinear methods capture nonlinear dynamics from different perspectives and that the combined feature set achieves the best performance, i.e., sensitivity 97.7% and specificity 91.5%. Collectively, nonlinear HRV features show promise for identifying disorders in autonomic cardiovascular function.
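    Of the three nonlinear features, multiscale entropy is the most self-contained to sketch: coarse-grain the series at each scale, then compute sample entropy. The toy implementation below is illustrative only (the RR-interval series is simulated, and the parameters m and r follow common defaults rather than the authors' settings).

```python
import numpy as np

def sample_entropy(x, m=2, r_frac=0.2):
    """SampEn(m, r): -ln of the conditional probability that sequences matching
    for m points (Chebyshev distance <= r) also match for m+1 points."""
    x = np.asarray(x, float)
    r = r_frac * x.std()
    def match_pairs(mm):
        templates = np.lib.stride_tricks.sliding_window_view(x, mm)
        d = np.max(np.abs(templates[:, None] - templates[None, :]), axis=-1)
        n = len(templates)
        return (np.sum(d <= r) - n) / 2      # matching pairs, self-matches excluded
    B, A = match_pairs(m), match_pairs(m + 1)
    return -np.log(A / B)

def multiscale_entropy(x, scales=(1, 2, 3, 4)):
    """Coarse-grain the series by averaging blocks of length s, then take SampEn."""
    out = []
    for s in scales:
        g = np.asarray(x)[: len(x) // s * s].reshape(-1, s).mean(axis=1)
        out.append(sample_entropy(g))
    return out

rng = np.random.default_rng(4)
rr = np.cumsum(rng.standard_normal(600)) * 0.01 + 0.8   # toy RR-interval series, s
print([round(v, 2) for v in multiscale_entropy(rr)])
```

    The resulting entropy-vs-scale curve is the complexity signature that, per the abstract, discriminates healthy from heart-failure recordings.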

  2. Starting Performance Analysis for Universal Motors by FEM

    NASA Astrophysics Data System (ADS)

    Kurihara, Kazumi; Sakamoto, Shin-Ichi

    This paper presents a novel transient analysis of universal motors that takes into account the time-varying brush-contact resistance and mechanical loss. The transient current, torque, and speed during the starting process are computed by solving the electromagnetic, circuit, and dynamic motion equations simultaneously. The computed performance has been validated by tests on a 500-W, 2-pole, 50-Hz, 100-V universal motor.

  3. A balanced perspective: using nonfinancial measures to assess financial performance.

    PubMed

    Watkins, Ann L

    2003-11-01

    Assessments of hospitals' financial performance have traditionally been based exclusively on analysis of a concise set of key financial ratios. One study, however, demonstrates that analysis of a hospital's financial condition can be significantly enhanced with the addition of several nonfinancial measures, including case-mix adjusted admissions, case-mix adjusted admissions per full-time equivalent, and case-mix adjusted admissions per beds in service.

  4. Improving Department of Defense Global Distribution Performance Through Network Analysis

    DTIC Science & Technology

    2016-06-01

    Subject terms: supply chain metrics, distribution networks, requisition shipping time, strategic distribution database. USTRANSCOM's Metrics and Analysis Branch defines, develops, tracks, and maintains outcomes-based supply chain metrics (2014a, p. 8). The Joint Staff defines a TDD standard as the maximum number of days the supply chain can take to deliver requisitioned materiel.

  5. Kinetic performance comparison of fully and superficially porous particles with a particle size of 5 µm: intrinsic evaluation and application to the impurity analysis of griseofulvin.

    PubMed

    Kahsay, Getu; Broeckhoven, Ken; Adams, Erwin; Desmet, Gert; Cabooter, Deirdre

    2014-05-01

    After the great commercial success of sub-3 µm superficially porous particles, vendors are now also starting to commercialize 5 µm superficially porous particles as an alternative to their fully porous counterparts, which are routinely used in pharmaceutical analysis. In this study, the performance of 5 µm superficially porous particles was compared to that of fully porous 5 µm particles in terms of efficiency, separation performance, and loadability on a conventional HPLC instrument. Van Deemter and kinetic plots were first used to evaluate the efficiency and performance of both particle types using alkylphenones as a test mixture. These plots showed that the superficially porous particles provide superior kinetic performance compared to the fully porous particles over the entire relevant range of separation conditions when both support types are evaluated at the same operating pressure. The same observations were made for both isocratic and gradient analysis. The superior performance was further demonstrated for the separation of a pharmaceutical compound (griseofulvin) and its impurities, where a gain in analysis time of around a factor of 2 could be obtained using the superficially porous particles. Finally, both particle types were evaluated in terms of loadability by plotting the resolution of the active pharmaceutical ingredient and its closest impurity as a function of the signal-to-noise ratio obtained for the smallest impurity. It was demonstrated that the superficially porous particles show better separation performance for griseofulvin and its impurities without significantly compromising sensitivity due to loadability issues, in comparison with their fully porous counterparts. Moreover, these columns can be used on conventional equipment without modifications to obtain a significant improvement in analysis time. Copyright © 2014 Elsevier B.V. All rights reserved.
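    The van Deemter comparison can be illustrated with the plate-height equation H(u) = A + B/u + C·u. The coefficients below are invented for illustration, chosen so that the superficially porous ("shell") particles have smaller eddy-dispersion (A) and mass-transfer (C) terms, consistent with the abstract's conclusion.

```python
import numpy as np

def plate_height(u, A, B, C):
    """Van Deemter equation: H(u) = A + B/u + C*u."""
    return A + B / u + C * u

u = np.linspace(0.2, 5.0, 200)               # mobile-phase velocity (arbitrary units)
H_fully = plate_height(u, 2.0, 3.0, 1.0)     # fully porous 5 um (assumed coefficients)
H_shell = plate_height(u, 1.2, 2.5, 0.6)     # superficially porous 5 um (assumed)

u_opt = np.sqrt(2.5 / 0.6)                   # analytic optimum: u* = sqrt(B/C)
print(f"optimal velocity (shell): {u_opt:.2f}; "
      f"minimum plate height: {plate_height(u_opt, 1.2, 2.5, 0.6):.2f}")
```

    Because every term is smaller, the shell-particle curve lies below the fully porous one at all velocities, which is the qualitative shape of the van Deemter comparison reported above.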

  6. Parameter Transient Behavior Analysis on Fault Tolerant Control System

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine (Technical Monitor); Shin, Jong-Yeob

    2003-01-01

    In a fault tolerant control (FTC) system, a parameter-varying FTC law is reconfigured based on fault parameters estimated by fault detection and isolation (FDI) modules. FDI modules require some time to detect fault occurrences in aero-vehicle dynamics. This paper illustrates analysis of an FTC system based on the transient behavior of the estimated fault parameters, which may include false fault detections during a short time interval. Using Lyapunov function analysis, the upper bound of an induced-L2 norm of the FTC system performance is calculated as a function of the fault detection time and the exponential decay rate of the Lyapunov function.

  7. Trends in Mathematics and Science Performance in 18 Countries: Multiple Regression Analysis of the Cohort Effects of TIMSS 1995-2007

    ERIC Educational Resources Information Center

    Hong, Hee Kyung

    2012-01-01

    The purpose of this study was to simultaneously examine relationships between teacher quality and instructional time and mathematics and science achievement of 8th grade cohorts in 18 advanced and developing economies. In addition, the study examined changes in mathematics and science performance across the two groups of economies over time using…

  8. Multifractal detrending moving-average cross-correlation analysis

    NASA Astrophysics Data System (ADS)

    Jiang, Zhi-Qiang; Zhou, Wei-Xing

    2011-07-01

    There are a number of situations in which several signals are simultaneously recorded in complex systems that exhibit long-term power-law cross correlations. Multifractal detrended cross-correlation analysis (MFDCCA) approaches can be used to quantify such cross correlations, for example the MFDCCA method based on detrended fluctuation analysis (MFXDFA). In this work we develop a class of MFDCCA algorithms based on detrending moving-average analysis, called MFXDMA. The performance of the proposed MFXDMA algorithms is compared with the MFXDFA method through extensive numerical experiments on pairs of time series generated from bivariate fractional Brownian motions, two-component autoregressive fractionally integrated moving-average processes, and binomial measures, all of which have theoretical expressions for their multifractal nature. In all cases, the scaling exponents hxy extracted from the MFXDMA and MFXDFA algorithms are very close to the theoretical values. For bivariate fractional Brownian motions, the scaling exponent of the cross correlation is independent of the cross-correlation coefficient between the two time series, and the MFXDFA and centered MFXDMA algorithms have comparable performance, outperforming the forward and backward MFXDMA algorithms. For two-component autoregressive fractionally integrated moving-average processes, we also find that the MFXDFA and centered MFXDMA algorithms have comparable performance, while the forward and backward MFXDMA algorithms perform slightly worse. For binomial measures, the forward MFXDMA algorithm exhibits the best performance, the centered MFXDMA algorithm performs worst, and the backward MFXDMA algorithm outperforms the MFXDFA algorithm when the moment order q<0 and underperforms it when q>0. We apply these algorithms to the return time series of two stock market indexes and to their volatilities. For the returns, the centered MFXDMA algorithm gives the best estimates of hxy(q), since its hxy(2) is closest to 0.5 as expected, and the MFXDFA algorithm has the second-best performance. For the volatilities, the forward and backward MFXDMA algorithms give similar results, while the centered MFXDMA and MFXDFA algorithms fail to extract a rational multifractal nature.
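    The detrending moving-average building block behind MFXDMA can be sketched for a single series. This toy version uses the backward moving average and estimates only the ordinary (q = 2) scaling exponent, not the full multifractal cross-correlation machinery of the paper.

```python
import numpy as np

def dma_fluctuation(x, window_sizes):
    """Backward detrending moving-average fluctuation function F(n): detrend the
    cumulative profile by its moving average, then take the RMS residual."""
    y = np.cumsum(x - np.mean(x))                      # profile of the series
    F = []
    for n in window_sizes:
        kernel = np.ones(n) / n
        ma = np.convolve(y, kernel, mode="full")[: len(y)]  # backward moving average
        resid = y[n - 1:] - ma[n - 1:]                 # skip the partially filled edge
        F.append(np.sqrt(np.mean(resid ** 2)))
    return np.array(F)

rng = np.random.default_rng(5)
x = rng.standard_normal(4000)                          # uncorrelated noise, exponent ~ 0.5
ns = np.array([8, 16, 32, 64, 128])
F = dma_fluctuation(x, ns)
H = np.polyfit(np.log(ns), np.log(F), 1)[0]            # slope of the log-log fit
print(f"estimated scaling exponent: {H:.2f}")
```

    For white noise the fitted slope should sit near 0.5; MFXDMA generalizes this by detrending two series jointly and weighting the residuals by a moment order q.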

  9. Structural neural correlates of multitasking: A voxel-based morphometry study.

    PubMed

    Zhang, Rui-Ting; Yang, Tian-Xiao; Wang, Yi; Sui, Yuxiu; Yao, Jingjing; Zhang, Chen-Yuan; Cheung, Eric F C; Chan, Raymond C K

    2016-12-01

    Multitasking refers to the ability to organize assorted tasks efficiently in a short period of time, which plays an important role in daily life. However, the structural neural correlates of multitasking performance remain unclear. The present study aimed at exploring the brain regions associated with multitasking performance using global correlation analysis. Twenty-six healthy participants first underwent structural brain scans and then performed the modified Six Element Test, which required participants to attempt six subtasks in 10 min while obeying a specific rule. Voxel-based morphometry of the whole brain was used to detect the structural correlates of multitasking ability. Grey matter volume of the anterior cingulate cortex (ACC) was positively correlated with the overall performance and time monitoring in multitasking. In addition, white matter volume of the anterior thalamic radiation (ATR) was also positively correlated with time monitoring during multitasking. Other related brain regions associated with multitasking included the superior frontal gyrus, the inferior occipital gyrus, the lingual gyrus, and the inferior longitudinal fasciculus. No significant correlation was found between grey matter volume of the prefrontal cortex (Brodmann Area 10) and multitasking performance. Using a global correlation analysis to examine various aspects of multitasking performance, this study provided new insights into the structural neural correlates of multitasking ability. In particular, the ACC was identified as an important brain region that played both a general and a specific time-monitoring role in multitasking, extending the role of the ACC from lesioned populations to healthy populations. The present findings also support the view that the ATR may influence multitasking performance by affecting time-monitoring abilities. © 2016 The Institute of Psychology, Chinese Academy of Sciences and John Wiley & Sons Australia, Ltd.

  10. Study of target and non-target interplay in spatial attention task.

    PubMed

    Sweeti; Joshi, Deepak; Panigrahi, B K; Anand, Sneh; Santhosh, Jayasree

    2018-02-01

Selective visual attention is the ability to attend to targets while inhibiting distractors. This paper studies the interplay between targets and non-targets in a spatial attention task in which the subject attends to a target object in one visual hemifield and ignores a distractor in the other hemifield. Averaged evoked response potential (ERP) analysis and time-frequency analysis were performed. The ERP analysis supports left-hemisphere superiority in the late potentials for targets presented in the right visual hemifield. The time-frequency analysis used two measures, event-related spectral perturbation (ERSP) and inter-trial coherence (ITC). These measures behave the same for targets in either visual hemifield but differ when activity for targets is compared with activity for non-targets. The study thus helps to visualise the differences between targets presented in the left and right visual hemifields, as well as between targets and non-targets in each hemifield. These results could be used to monitor subjects' performance in brain-computer interface (BCI) and neurorehabilitation applications.
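Of the two time-frequency measures named in the abstract, ITC has a compact standard definition: the magnitude of the mean of unit-length phase vectors across trials at a given time-frequency point. A minimal pure-Python sketch of that definition (illustrative only, not the authors' code; the phase values are hypothetical):

```python
import cmath

def inter_trial_coherence(phases):
    """ITC at one time-frequency point: magnitude of the mean of
    unit phase vectors across trials. 1.0 means perfectly phase-locked
    trials; values near 0 mean random phase across trials."""
    vectors = [cmath.exp(1j * p) for p in phases]
    mean_vector = sum(vectors) / len(vectors)
    return abs(mean_vector)

# Identical phases across trials -> ITC close to 1.0;
# opposite phases cancel -> ITC close to 0.0
locked = inter_trial_coherence([0.5, 0.5, 0.5])
random_like = inter_trial_coherence([0.0, 3.141592653589793])
```

ERSP is analogous but operates on spectral power changes relative to baseline rather than on phase.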

  11. Factors that influence standard automated perimetry test results in glaucoma: test reliability, technician experience, time of day, and season.

    PubMed

    Junoy Montolio, Francisco G; Wesselink, Christiaan; Gordijn, Marijke; Jansonius, Nomdo M

    2012-10-09

    To determine the influence of several factors on standard automated perimetry test results in glaucoma. Longitudinal Humphrey field analyzer 30-2 Swedish interactive threshold algorithm data from 160 eyes of 160 glaucoma patients were used. The influence of technician experience, time of day, and season on the mean deviation (MD) was determined by performing linear regression analysis of MD against time on a series of visual fields and subsequently performing a multiple linear regression analysis with the MD residuals as dependent variable and the factors mentioned above as independent variables. Analyses were performed with and without adjustment for the test reliability (fixation losses and false-positive and false-negative answers) and with and without stratification according to disease stage (baseline MD). Mean follow-up was 9.4 years, with on average 10.8 tests per patient. Technician experience, time of day, and season were associated with the MD. Approximately 0.2 dB lower MD values were found for inexperienced technicians (P < 0.001), tests performed after lunch (P < 0.001), and tests performed in the summer or autumn (P < 0.001). The effects of time of day and season appeared to depend on disease stage. Independent of these effects, the percentage of false-positive answers strongly influenced the MD with a 1 dB increase in MD per 10% increase in false-positive answers. Technician experience, time of day, season, and the percentage of false-positive answers have a significant influence on the MD of standard automated perimetry.
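The two-stage procedure described above (regress MD against time per eye, then relate the residuals to the factors) can be sketched in a few lines. This is a toy illustration with hypothetical numbers, not the study's analysis; the real work used multiple regression across 160 eyes:

```python
def ols_fit(xs, ys):
    """Closed-form simple linear regression: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def md_residuals(times, mds):
    """Residuals of MD around the eye's own linear trend over time."""
    slope, intercept = ols_fit(times, mds)
    return [md - (slope * t + intercept) for t, md in zip(times, mds)]

# Hypothetical eye: MD (dB) declines slowly; tests 3 and 5 taken after lunch
times = [0, 1, 2, 3, 4]
mds = [-3.0, -3.1, -3.4, -3.3, -3.6]
after_lunch = [False, False, True, False, True]

res = md_residuals(times, mds)
# Mean residual after lunch minus mean residual otherwise:
# a negative value mirrors the reported "lower MD after lunch" effect
lunch_effect = (sum(r for r, a in zip(res, after_lunch) if a) / 2
                - sum(r for r, a in zip(res, after_lunch) if not a) / 3)
```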

  12. Automatic segmentation of time-lapse microscopy images depicting a live Dharma embryo.

    PubMed

    Zacharia, Eleni; Bondesson, Maria; Riu, Anne; Ducharme, Nicole A; Gustafsson, Jan-Åke; Kakadiaris, Ioannis A

    2011-01-01

    Biological inferences about the toxicity of chemicals reached during experiments on the zebrafish Dharma embryo can be greatly affected by the analysis of the time-lapse microscopy images depicting the embryo. Among the stages of image analysis, automatic and accurate segmentation of the Dharma embryo is the most crucial and challenging. In this paper, an accurate and automatic approach for the segmentation of Dharma embryo data obtained by fluorescent time-lapse microscopy is proposed. Experiments performed on four stacks of 3D images over time have shown promising results.

  13. Monte Carlo analysis of the Titan III/Transfer Orbit Stage guidance system for the Mars Observer mission

    NASA Astrophysics Data System (ADS)

    Bell, Stephen C.; Ginsburg, Marc A.; Rao, Prabhakara P.

    An important part of space launch vehicle mission planning for a planetary mission is the integrated analysis of guidance and performance dispersions for both booster and upper stage vehicles. For the Mars Observer mission, an integrated trajectory analysis was used to maximize the scientific payload and to minimize injection errors by optimizing the energy management of both vehicles. This was accomplished by designing the Titan III booster vehicle to inject into a hyperbolic departure plane, and the Transfer Orbit Stage (TOS) to correct any booster dispersions. An integrated Monte Carlo analysis of the performance and guidance dispersions of both vehicles provided sensitivities, an evaluation of their guidance schemes, and an injection error covariance matrix. The polynomial guidance schemes used for the Titan III variable flight azimuth computations and the TOS solid rocket motor ignition time and burn direction derivations accounted for a wide variation of launch times, performance dispersions, and target conditions. The Mars Observer spacecraft was launched on 25 September 1992 on the Titan III/TOS vehicle. The post-flight analysis indicated that a near perfect park orbit injection was achieved, followed by a trans-Mars injection with less than 2-sigma errors.
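The core of a Monte Carlo dispersion analysis like the one described is simple: sample the uncertain inputs, propagate each sample through the trajectory model, and form the sample covariance of the resulting injection errors. A pure-Python sketch with a stand-in linear propagation (the coefficients and noise levels are hypothetical; a real analysis would run the full trajectory simulation per sample):

```python
import random

def sample_covariance(samples):
    """Unbiased sample covariance matrix of a list of equal-length vectors."""
    n, d = len(samples), len(samples[0])
    means = [sum(s[i] for s in samples) / n for i in range(d)]
    cov = [[0.0] * d for _ in range(d)]
    for s in samples:
        dev = [s[i] - means[i] for i in range(d)]
        for i in range(d):
            for j in range(d):
                cov[i][j] += dev[i] * dev[j] / (n - 1)
    return cov

def injection_state(dv_error, az_error):
    """Toy propagation: maps guidance dispersions to (velocity error,
    flight-path error) at injection; stands in for the full simulation."""
    return (1.0 * dv_error + 0.2 * az_error,
            0.1 * dv_error + 1.0 * az_error)

random.seed(0)
runs = [injection_state(random.gauss(0, 1.0), random.gauss(0, 0.5))
        for _ in range(5000)]
cov = sample_covariance(runs)  # the injection error covariance matrix
```

The resulting matrix is what a downstream stage (here, the TOS correction) would be sized against.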

  14. Design and implementation of software for automated quality control and data analysis for a complex LC/MS/MS assay for urine opiates and metabolites.

    PubMed

    Dickerson, Jane A; Schmeling, Michael; Hoofnagle, Andrew N; Hoffman, Noah G

    2013-01-16

    Mass spectrometry provides a powerful platform for performing quantitative, multiplexed assays in the clinical laboratory, but at the cost of increased complexity of analysis and quality assurance calculations compared to other methodologies. Here we describe the design and implementation of a software application that performs quality control calculations for a complex, multiplexed, mass spectrometric analysis of opioids and opioid metabolites. The development and implementation of this application improved our data analysis and quality assurance processes in several ways. First, use of the software significantly improved the procedural consistency for performing quality control calculations. Second, it reduced the amount of time technologists spent preparing and reviewing the data, saving on average over four hours per run, and in some cases improving turnaround time by a day. Third, it provides a mechanism for coupling procedural and software changes with the results of each analysis. We describe several key details of the implementation including the use of version control software and automated unit tests. These generally useful software engineering principles should be considered for any software development project in the clinical lab. Copyright © 2012 Elsevier B.V. All rights reserved.
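The abstract does not specify the quality control calculations, but typical automated QC checks for an LC/MS/MS assay include control-limit rules and ion-ratio checks, and they are exactly the kind of small, pure functions that benefit from the automated unit tests the authors mention. A hedged sketch (function names and tolerances are assumptions, not the paper's implementation):

```python
def qc_within_limits(value, target_mean, target_sd, n_sd=3.0):
    """Flag a control result outside target_mean +/- n_sd * target_sd,
    analogous to the classic Westgard 1-3s control rule."""
    return abs(value - target_mean) <= n_sd * target_sd

def ion_ratio_ok(quantifier_area, qualifier_area, expected_ratio, tol=0.2):
    """Check a qualifier/quantifier ion ratio against its expected value
    within a relative tolerance, a common LC/MS/MS identity criterion."""
    ratio = qualifier_area / quantifier_area
    return abs(ratio - expected_ratio) <= tol * expected_ratio

# Hypothetical control with mean 100, SD 2: 105 passes, 107 fails
in_control = qc_within_limits(105.0, 100.0, 2.0)
out_of_control = qc_within_limits(107.0, 100.0, 2.0)
```

Encoding such rules as functions, rather than spreadsheet formulas, is what lets them be version-controlled and unit-tested as the authors describe.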

  15. Mixed Criticality Scheduling for Industrial Wireless Sensor Networks

    PubMed Central

    Jin, Xi; Xia, Changqing; Xu, Huiting; Wang, Jintao; Zeng, Peng

    2016-01-01

    Wireless sensor networks (WSNs) have been widely used in industrial systems. Their real-time performance and reliability are fundamental to industrial production. Many works have studied these two aspects, but they focus only on single-criticality WSNs. Mixed criticality requirements exist in many advanced applications in which different data flows have different levels of importance (or criticality). In this paper, first, we propose a scheduling algorithm that guarantees the real-time performance and reliability requirements of data flows with different levels of criticality. The algorithm supports centralized optimization and adaptive adjustment, and is able to improve both scheduling performance and flexibility. Then, we provide the schedulability test through rigorous theoretical analysis. We conduct extensive simulations, and the results demonstrate that the proposed scheduling algorithm and analysis significantly outperform existing ones. PMID:27589741

  16. Interactive design and analysis of future large spacecraft concepts

    NASA Technical Reports Server (NTRS)

    Garrett, L. B.

    1981-01-01

    An interactive computer-aided design program used to perform systems-level design and analysis of large spacecraft concepts is presented. Emphasis is on rapid design, analysis of integrated spacecraft, and automatic spacecraft modeling for lattice structures. Capabilities and performance of multidiscipline applications modules, the executive and data management software, and graphics display features are reviewed. A single user at an interactive terminal can create, design, analyze, and conduct parametric studies of Earth-orbiting spacecraft with relative ease. Data generated in the design, analysis, and performance evaluation of an Earth-orbiting large diameter antenna satellite are used to illustrate current capabilities. Computer run time statistics for the individual modules quantify the speed at which modeling, analysis, and design evaluation of integrated spacecraft concepts are accomplished in a user-interactive computing environment.

  17. Large-scale network integration in the human brain tracks temporal fluctuations in memory encoding performance.

    PubMed

    Keerativittayayut, Ruedeerat; Aoki, Ryuta; Sarabi, Mitra Taghizadeh; Jimura, Koji; Nakahara, Kiyoshi

    2018-06-18

    Although activation/deactivation of specific brain regions have been shown to be predictive of successful memory encoding, the relationship between time-varying large-scale brain networks and fluctuations of memory encoding performance remains unclear. Here we investigated time-varying functional connectivity patterns across the human brain in periods of 30-40 s, which have recently been implicated in various cognitive functions. During functional magnetic resonance imaging, participants performed a memory encoding task, and their performance was assessed with a subsequent surprise memory test. A graph analysis of functional connectivity patterns revealed that increased integration of the subcortical, default-mode, salience, and visual subnetworks with other subnetworks is a hallmark of successful memory encoding. Moreover, multivariate analysis using the graph metrics of integration reliably classified the brain network states into the period of high (vs. low) memory encoding performance. Our findings suggest that a diverse set of brain systems dynamically interact to support successful memory encoding. © 2018, Keerativittayayut et al.

  18. Effects of noise on the performance of a memory decision response task

    NASA Technical Reports Server (NTRS)

    Lawton, B. W.

    1972-01-01

    An investigation has been made to determine the effects of noise on human performance. Fourteen subjects performed a memory-decision-response task in relative quiet and while listening to tape recorded noises. Analysis of the data obtained indicates that performance was degraded in the presence of noise. Significant increases in problem solution times were found for impulsive noise conditions as compared with times found for the no-noise condition. Performance accuracy was also degraded. Significantly more error responses occurred at higher noise levels; a direct or positive relation was found between error responses and noise level experienced by the subjects.

  19. Comparison of the analytical and clinical performances of Abbott RealTime High Risk HPV, Hybrid Capture 2, and DNA Chip assays in gynecology patients.

    PubMed

    Park, Seungman; Kang, Youjin; Kim, Dong Geun; Kim, Eui-Chong; Park, Sung Sup; Seong, Moon-Woo

    2013-08-01

    The detection of high-risk (HR) HPV in cervical cancer screening is important for early diagnosis of cervical cancer or pre-cancerous lesions. We evaluated the analytical and clinical performances of 3 HR HPV assays in gynecology patients. A total of 991 specimens were included in this study: 787 specimens for use with a Hybrid Capture 2 (HC2) and 204 specimens for a HPV DNA microarray (DNA Chip). All specimens were tested using an Abbott RealTime High Risk HPV assay (Real-time HR), PGMY PCR, and sequence analysis. Clinical sensitivities for severe abnormal cytology (high-grade squamous intraepithelial lesion or worse) were 81.8% for Real-time HR, 77.3% for HC2, and 66.7% for DNA Chip, and clinical sensitivities for severe abnormal histology (cervical intraepithelial neoplasia grade 2+) were 91.7% for HC2, 87.5% for Real-time HR, and 73.3% for DNA Chip. As compared to results of the sequence analysis, HC2, Real-time HR, and DNA Chip showed concordance rates of 94.3% (115/122), 90.0% (117/130), and 61.5% (16/26), respectively. The HC2 assay and Real-time HR assay showed comparable results to each other in both clinical and analytical performances, while the DNA Chip assay showed poor clinical and analytical performances. The Real-time HR assay can be a good alternative option for HR HPV testing with advantages of allowing full automation and simultaneous genotyping of HR types 16 and 18. Copyright © 2013 Elsevier Inc. All rights reserved.

  20. Functional network mediates age-related differences in reaction time: a replication and extension study

    PubMed Central

    Gazes, Yunglin; Habeck, Christian; O'Shea, Deirdre; Razlighi, Qolamreza R; Steffener, Jason; Stern, Yaakov

    2015-01-01

    Introduction A functional activation (i.e., ordinal trend) pattern was previously identified in both young and older adults during task-switching performance, the expression of which correlated with reaction time. The current study aimed to (1) replicate this functional activation pattern in a new group of fMRI activation data, and (2) extend the previous study by specifically examining whether the effect of aging on reaction time can be explained by differences in the activation of the functional activation pattern. Method A total of 47 young and 50 older participants were included in the extension analysis. Participants performed task-switching as the activation task and were cued by the color of the stimulus for the task to be performed in each block. To test for replication, two approaches were implemented. The first approach tested the replicability of the predictive power of the previously identified functional activation pattern by forward applying the pattern to the Study II data and the second approach was rederivation of the activation pattern in the Study II data. Results Both approaches showed successful replication in the new data set. Using mediation analysis, expression of the pattern from the first approach was found to partially mediate age-related effects on reaction time such that older age was associated with greater activation of the brain pattern and longer reaction time, suggesting that brain activation efficiency (defined as “the rate of activation increase with increasing task difficulty” in Neuropsychologia 47, 2009, 2015) of the regions in the Ordinal trend pattern directly accounts for age-related differences in task performance. 
Discussion The successful replication of the functional activation pattern demonstrates the versatility of the Ordinal Trend Canonical Variates Analysis, and the ability to summarize each participant's brain activation map into one number provides a useful metric in multimodal analysis as well as cross-study comparisons. PMID:25874162

  1. Highly comparative time-series analysis: the empirical structure of time series and their methods.

    PubMed

    Fulcher, Ben D; Little, Max A; Jones, Nick S

    2013-06-06

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
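The "reduced representations" the abstract describes amount to summarizing each time series by a vector of interpretable features, then comparing series (and methods) in that feature space. A tiny pure-Python sketch of the idea, using three classic features where the real library computes thousands (illustrative only, not the authors' code):

```python
def ts_features(x):
    """A tiny reduced representation of a time series:
    (mean, standard deviation, lag-1 autocorrelation)."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    sd = var ** 0.5
    ac1 = (sum((x[i] - mean) * (x[i + 1] - mean) for i in range(n - 1))
           / (n * var)) if var > 0 else 0.0
    return (mean, sd, ac1)

# A smooth ramp is strongly autocorrelated; an alternating
# sequence is strongly anti-correlated at lag 1
ramp = [float(i) for i in range(100)]
alternating = [(-1.0) ** i for i in range(100)]
```

Once every series is a fixed-length vector, standard clustering and classification tools can organize datasets and select analysis methods, which is the automation the paper demonstrates at scale.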

  2. Highly comparative time-series analysis: the empirical structure of time series and their methods

    PubMed Central

    Fulcher, Ben D.; Little, Max A.; Jones, Nick S.

    2013-01-01

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines. PMID:23554344

  3. FMAj: a tool for high content analysis of muscle dynamics in Drosophila metamorphosis.

    PubMed

    Kuleesha, Yadav; Puah, Wee Choo; Lin, Feng; Wasser, Martin

    2014-01-01

    During metamorphosis in Drosophila melanogaster, larval muscles undergo two different developmental fates; one population is removed by cell death, while the other persistent subset undergoes morphological remodeling and survives to adulthood. Thanks to the ability to perform live imaging of muscle development in transparent pupae and the power of genetics, metamorphosis in Drosophila can be used as a model to study the regulation of skeletal muscle mass. However, time-lapse microscopy generates sizeable image data that require new tools for high throughput image analysis. We performed targeted gene perturbation in muscles and acquired 3D time-series images of muscles in metamorphosis using laser scanning confocal microscopy. To quantify the phenotypic effects of gene perturbations, we designed the Fly Muscle Analysis tool (FMAj) which is based on the ImageJ and MySQL frameworks for image processing and data storage, respectively. The image analysis pipeline of FMAj contains three modules. The first module assists in adding annotations to time-lapse datasets, such as genotypes, experimental parameters and temporal reference points, which are used to compare different datasets. The second module performs segmentation and feature extraction of muscle cells and nuclei. Users can provide annotations to the detected objects, such as muscle identities and anatomical information. The third module performs comparative quantitative analysis of muscle phenotypes. We applied our tool to the phenotypic characterization of two atrophy related genes that were silenced by RNA interference. Reduction of Drosophila Tor (Target of Rapamycin) expression resulted in enhanced atrophy compared to control, while inhibition of the autophagy factor Atg9 caused suppression of atrophy and enlarged muscle fibers of abnormal morphology. FMAj enabled us to monitor the progression of atrophic and hypertrophic phenotypes of individual muscles throughout metamorphosis. 
We designed a new tool to visualize and quantify morphological changes of muscles in time-lapse images of Drosophila metamorphosis. Our in vivo imaging experiments revealed that evolutionarily conserved genes involved in Tor signalling and autophagy, perform similar functions in regulating muscle mass in mammals and Drosophila. Extending our approach to a genome-wide scale has the potential to identify new genes involved in muscle size regulation.

  4. FMAj: a tool for high content analysis of muscle dynamics in Drosophila metamorphosis

    PubMed Central

    2014-01-01

    Background During metamorphosis in Drosophila melanogaster, larval muscles undergo two different developmental fates; one population is removed by cell death, while the other persistent subset undergoes morphological remodeling and survives to adulthood. Thanks to the ability to perform live imaging of muscle development in transparent pupae and the power of genetics, metamorphosis in Drosophila can be used as a model to study the regulation of skeletal muscle mass. However, time-lapse microscopy generates sizeable image data that require new tools for high throughput image analysis. Results We performed targeted gene perturbation in muscles and acquired 3D time-series images of muscles in metamorphosis using laser scanning confocal microscopy. To quantify the phenotypic effects of gene perturbations, we designed the Fly Muscle Analysis tool (FMAj) which is based on the ImageJ and MySQL frameworks for image processing and data storage, respectively. The image analysis pipeline of FMAj contains three modules. The first module assists in adding annotations to time-lapse datasets, such as genotypes, experimental parameters and temporal reference points, which are used to compare different datasets. The second module performs segmentation and feature extraction of muscle cells and nuclei. Users can provide annotations to the detected objects, such as muscle identities and anatomical information. The third module performs comparative quantitative analysis of muscle phenotypes. We applied our tool to the phenotypic characterization of two atrophy related genes that were silenced by RNA interference. Reduction of Drosophila Tor (Target of Rapamycin) expression resulted in enhanced atrophy compared to control, while inhibition of the autophagy factor Atg9 caused suppression of atrophy and enlarged muscle fibers of abnormal morphology. FMAj enabled us to monitor the progression of atrophic and hypertrophic phenotypes of individual muscles throughout metamorphosis. 
Conclusions We designed a new tool to visualize and quantify morphological changes of muscles in time-lapse images of Drosophila metamorphosis. Our in vivo imaging experiments revealed that evolutionarily conserved genes involved in Tor signalling and autophagy, perform similar functions in regulating muscle mass in mammals and Drosophila. Extending our approach to a genome-wide scale has the potential to identify new genes involved in muscle size regulation. PMID:25521203

  5. Motor current signature analysis for gearbox condition monitoring under transient speeds using wavelet analysis and dual-level time synchronous averaging

    NASA Astrophysics Data System (ADS)

    Bravo-Imaz, Inaki; Davari Ardakani, Hossein; Liu, Zongchang; García-Arribas, Alfredo; Arnaiz, Aitor; Lee, Jay

    2017-09-01

    This paper focuses on analyzing the motor current signature for fault diagnosis of gearboxes operating under transient speed regimes. Two different strategies are evaluated, extensively tested, and compared for analyzing the motor current signature in order to implement a condition monitoring system for gearboxes in industrial machinery. A specially designed test bench, thoroughly monitored to fully characterize the experiments, is used to test gears in different health states. The measured signals are analyzed using discrete wavelet decomposition at different decomposition levels with a range of mother wavelets. In addition, a dual-level time synchronous averaging analysis is performed on the same signals to compare the performance of the two methods. From both analyses, the relevant features of the signals are extracted and cataloged using a self-organizing map, which allows for easy detection and classification of the diverse health states of the gears. The results demonstrate the effectiveness of both methods for diagnosing gearbox faults, with slightly better performance observed for the dual-level time synchronous averaging method. Based on the obtained results, the proposed methods can be used as effective and reliable procedures for gearbox condition monitoring using only the motor current signature.
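The core idea of time synchronous averaging is worth a short sketch: resample or segment the signal by shaft revolution and average the segments, so components synchronous with the shaft (gear mesh, local faults) reinforce while asynchronous noise cancels as 1/sqrt(N). A minimal single-level illustration with a synthetic signal (the sampling setup is hypothetical; the paper's dual-level variant and wavelet analysis are not reproduced here):

```python
import math
import random

def time_synchronous_average(signal, samples_per_rev):
    """Average a signal over complete revolutions: shaft-synchronous
    components reinforce, asynchronous noise averages toward zero."""
    revs = len(signal) // samples_per_rev
    return [sum(signal[r * samples_per_rev + i] for r in range(revs)) / revs
            for i in range(samples_per_rev)]

random.seed(1)
spr = 32  # hypothetical samples per shaft revolution
# A 3-per-revolution "gear mesh" tone buried in unit-variance noise
sig = [math.sin(2 * math.pi * 3 * i / spr) + random.gauss(0, 1.0)
       for i in range(spr * 200)]
tsa = time_synchronous_average(sig, spr)  # recovers the clean tone
```

Averaging 200 revolutions cuts the noise standard deviation by a factor of about 14, which is why the periodic fault signature emerges clearly in the averaged cycle.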

  6. Metabolic profiling of rat hair and screening biomarkers using ultra performance liquid chromatography with electrospray ionization time-of-flight mass spectrometry.

    PubMed

    Inagaki, Shinsuke; Noda, Takumi; Min, Jun Zhe; Toyo'oka, Toshimasa

    2007-12-28

    An exhaustive analysis of metabolites in hair samples has been performed for the first time using ultra performance liquid chromatography with electrospray ionization time-of-flight mass spectrometry (UPLC-ESI-TOF-MS). The hair samples were collected from spontaneously hypertensive model rats (SHR/Izm), stroke-prone SHR (SHRSP/Izm), and Wistar Kyoto (WKY/Izm) rats, and were analyzed by UPLC-ESI-TOF-MS; a multivariate statistical analysis method, principal component analysis (PCA), was then used for screening the biomarkers. From the samples derived from the SHRSP/Izm group at weeks 10, 18, 26, and 34, we successfully detected a potential biomarker of stroke, present at much higher concentrations than in the other groups. However, no significant difference was found before week 7, that is, before the rats were subjected to stroke and hypertension. In addition, the present method was applicable to screening not only disease markers but also markers related to aging. The method utilizing hair samples is expected to be quite useful for screening biomarkers of many other diseases, not limited to stroke and hypertension.

  7. Placement of central venous port catheters and peripherally inserted central catheters in the routine clinical setting of a radiology department: analysis of costs and intervention duration learning curve.

    PubMed

    Rotzinger, Roman; Gebauer, Bernhard; Schnapauff, Dirk; Streitparth, Florian; Wieners, Gero; Grieser, Christian; Freyhardt, Patrick; Hamm, Bernd; Maurer, Martin H

    2017-12-01

    Background Placement of central venous port catheters (CVPS) and peripherally inserted central catheters (PICC) is an integral component of state-of-the-art patient care. In the era of increasing cost awareness, it is desirable to have more information to comprehensively assess both procedures. Purpose To perform a retrospective analysis of interventional radiologic implantation of CVPS and PICC lines in a large patient population, including a cost analysis of both methods and an investigation of the learning curve in terms of intervention duration. Material and Methods All CVPS- and PICC line-related interventions performed in an interventional radiology department during a three-year period from January 2011 to December 2013 were examined. Documented patient data included sex, venous access site, and indication for CVPS or PICC placement. A cost analysis including intervention times was performed based on the prorated costs of equipment use, staff costs, and expenditures for disposables. The decrease in intervention duration over time defined the learning curve. Results In total, 2987 interventions were performed by 16 radiologists: 1777 CVPS and 791 PICC lines. An average implantation took 22.5 ± 0.6 min (CVPS) and 10.1 ± 0.9 min (PICC lines). For CVPS, this average time was achieved by seven radiologists newly learning the procedure after performing 20 CVPS implantations. Total costs per implantation were €242 (CVPS) and €201 (PICC lines). Conclusion Interventional radiologic implantations of CVPS and PICC lines are well-established procedures, easy for residents to learn, and can be performed at low cost.

  8. Lightning Jump Algorithm Development for the GOES-R Geostationary Lightning Mapper

    NASA Technical Reports Server (NTRS)

    Schultz, E.; Schultz, C.; Chronis, T.; Stough, S.; Carey, L.; Calhoun, K.; Ortega, K.; Stano, G.; Cecil, D.; Bateman, M.

    2014-01-01

    Current work on the lightning jump algorithm to be used in GOES-R Geostationary Lightning Mapper (GLM)'s data stream is multifaceted due to the intricate interplay between the storm tracking, GLM proxy data, and the performance of the lightning jump itself. This work outlines the progress of the last year, where analysis and performance of the lightning jump algorithm with automated storm tracking and GLM proxy data were assessed using over 700 storms from North Alabama. The cases analyzed coincide with previous semi-objective work performed using total lightning mapping array (LMA) measurements in Schultz et al. (2011). Analysis shows that key components of the algorithm (flash rate and sigma thresholds) have the greatest influence on the performance of the algorithm when validating using severe storm reports. Automated objective analysis using the GLM proxy data has shown probability of detection (POD) values around 60% with false alarm rates (FAR) around 73% using similar methodology to Schultz et al. (2011). However, when applying verification methods similar to those employed by the National Weather Service, POD values increase slightly (69%) and FAR values decrease (63%). The relationship between storm tracking and lightning jump has also been tested in a real-time framework at NSSL. This system includes fully automated tracking by radar alone, real-time LMA and radar observations and the lightning jump. Results indicate that the POD is strong at 65%. However, the FAR is significantly higher than in Schultz et al. (2011) (50-80% depending on various tracking/lightning jump parameters) when using storm reports for verification. Given known issues with Storm Data, the performance of the real-time jump algorithm is also being tested with high density radar and surface observations from the NSSL Severe Hazards Analysis & Verification Experiment (SHAVE).
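The POD and FAR figures quoted above come from the standard 2x2 contingency table used in severe-weather verification. For readers unfamiliar with the definitions, a minimal sketch (the counts below are hypothetical, chosen only to reproduce the ballpark percentages in the abstract):

```python
def pod_far(hits, misses, false_alarms):
    """Probability of detection and false alarm ratio from a
    2x2 verification contingency table:
      POD = hits / (hits + misses)
      FAR = false alarms / (hits + false alarms)"""
    pod = hits / (hits + misses)
    far = false_alarms / (hits + false_alarms)
    return pod, far

# Hypothetical counts roughly matching the reported ~60% POD / ~73% FAR
pod, far = pod_far(hits=60, misses=40, false_alarms=162)
```

Note that POD and FAR trade off against each other through the algorithm's thresholds (flash rate and sigma), which is why tuning those thresholds dominates the performance discussion above.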

  9. NAS-Wide Fast-Time Simulation Study for Evaluating Performance of UAS Detect-and-Avoid Alerting and Guidance Systems

    NASA Technical Reports Server (NTRS)

    Lee, Seung Man; Park, Chunki; Cone, Andrew Clayton; Thipphavong, David P.; Santiago, Confesor

    2016-01-01

    This presentation contains the analysis results of NAS-wide fast-time simulations with UAS and VFR traffic for a single day, for evaluating the performance of Detect-and-Avoid (DAA) alerting and guidance systems. The purpose of this study was to help refine and validate MOPS alerting and guidance requirements. In this study, we generated plots of all performance metrics specified by the RTCA SC-228 Minimum Operational Performance Standards (MOPS): 1) to evaluate the sensitivity of the performance metrics of each DAA alert type (Preventive, Corrective, and Warning alerts) to the alerting parameters, and 2) to evaluate the effect of sensor uncertainty on DAA alerting and guidance performance.

  10. Methodology for fast detection of false sharing in threaded scientific codes

    DOEpatents

    Chung, I-Hsin; Cong, Guojing; Murata, Hiroki; Negishi, Yasushi; Wen, Hui-Fang

    2014-11-25

    A profiling tool identifies a code region with a false sharing potential. A static analysis tool classifies variables and arrays in the identified code region. A mapping detection library correlates memory access instructions in the identified code region with variables and arrays in the identified code region while a processor is running the identified code region. The mapping detection library identifies one or more instructions at risk, in the identified code region, which are subject to an analysis by a false sharing detection library. A false sharing detection library performs a run-time analysis of the one or more instructions at risk while the processor is re-running the identified code region. The false sharing detection library determines, based on the performed run-time analysis, whether two different portions of the cache memory line are accessed by the generated binary code.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gundlach-Graham, Alexander W.; Dennis, Elise; Ray, Steven J.

    An inductively coupled plasma distance-of-flight mass spectrometer (ICP-DOFMS) has been coupled with laser-ablation (LA) sample introduction for the elemental analysis of solids. ICP-DOFMS is well suited for the analysis of laser-generated aerosols because it offers both high-speed mass analysis and simultaneous multi-elemental detection. Here, we evaluate the analytical performance of the LA-ICP-DOFMS instrument, equipped with a microchannel plate-based imaging detector, for the measurement of steady-state LA signals, as well as transient signals produced from single LA events. Steady-state detection limits are 1 mg g⁻¹, and absolute single-pulse LA detection limits are 200 fg for uranium; the system is shown capable of performing time-resolved single-pulse LA analysis. By leveraging the benefits of simultaneous multi-elemental detection, we also attain a good shot-to-shot reproducibility of 6% relative standard deviation (RSD) and isotope-ratio precision of 0.3% RSD with a 10 s integration time.

  12. Modeling error analysis of stationary linear discrete-time filters

    NASA Technical Reports Server (NTRS)

    Patel, R.; Toda, M.

    1977-01-01

    The performance of Kalman-type, linear, discrete-time filters in the presence of modeling errors is considered. The discussion is limited to stationary performance, and bounds are obtained for the performance index, i.e., the mean-squared error of the estimates, for suboptimal and optimal (Kalman) filters. The computation of these bounds requires information on only the model matrices and the range of errors for these matrices. Consequently, a designer can easily compare the performance of a suboptimal filter with that of the optimal filter when only the range of errors in the elements of the model matrices is available.

  13. Caffeine and Bicarbonate for Speed. A Meta-Analysis of Legal Supplements Potential for Improving Intense Endurance Exercise Performance

    PubMed Central

    Christensen, Peter M.; Shirai, Yusuke; Ritz, Christian; Nordsborg, Nikolai B.

    2017-01-01

    A 1% change in average speed is enough to affect medal rankings in intense Olympic endurance events lasting ~45 s to 8 min, which for example include 100 m swimming and 400 m running (~1 min), 1,500 m running and 4,000 m track cycling (~4 min), and 2,000 m rowing (~6-8 min). To maximize the likelihood of winning, athletes utilize legal supplements with or without scientifically documented beneficial effects on performance. Therefore, a continued systematic evidence-based evaluation of the possible ergogenic effects is of high importance. A meta-analysis was conducted with a strict focus on closed-end performance tests in humans in the time domain from 45 s to 8 min. These tests include time trials or total work done in a given time. This selection criterion results in high relevance for athletic performance. Only peer-reviewed placebo-controlled studies were included. The often applied and potentially ergogenic supplements beta-alanine, bicarbonate, caffeine and nitrate were selected for analysis. Following a systematic search in PubMed and SPORTDiscus combined with evaluation of cross-references, a total of 7 (beta-alanine), 25 (bicarbonate), 9 (caffeine), and 5 (nitrate) studies were included in the meta-analysis. For each study, performance was converted to an average speed (km/h) from which an effect size (ES; Cohen's d with 95% confidence intervals) was calculated. A small but significant performance improvement relative to placebo was observed for caffeine (ES: 0.41 [0.15–0.68], P = 0.002) and bicarbonate (ES: 0.40 [0.27–0.54], P < 0.001). Trivial and non-significant effects on performance were observed for nitrate (ES: 0.19 [−0.03–0.40], P = 0.09) and beta-alanine (ES: 0.17 [−0.12–0.46], P = 0.24). Thus, the ergogenic effects of caffeine and bicarbonate are clearly documented for intense endurance performance. Importantly, for all supplements an individualized approach may improve the ergogenic effect on performance. PMID:28536531
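    The effect-size calculation described above (Cohen's d on average speeds, with a 95% confidence interval) can be sketched as follows, assuming the standard pooled-SD two-group form and its large-sample variance approximation; the speed values are hypothetical, not data from the meta-analysis:

```python
import math

def cohens_d(mean1, mean2, sd1, sd2, n1, n2):
    # Cohen's d with a pooled standard deviation, plus an approximate
    # 95% CI from the standard large-sample variance of d.
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / sp
    se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return d, (d - 1.96 * se, d + 1.96 * se)

# Hypothetical average speeds (km/h) for supplement vs. placebo trials:
d, (lo, hi) = cohens_d(30.5, 30.2, 0.8, 0.8, 12, 12)
print(round(d, 2))  # 0.38
```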

  14. Acoustic Performance of a Real-Time Three-Dimensional Sound-Reproduction System

    NASA Technical Reports Server (NTRS)

    Faller, Kenneth J., II; Rizzi, Stephen A.; Aumann, Aric R.

    2013-01-01

    The Exterior Effects Room (EER) is a 39-seat auditorium at the NASA Langley Research Center and was built to support psychoacoustic studies of aircraft community noise. The EER has a real-time simulation environment which includes a three-dimensional sound-reproduction system. This system requires real-time application of equalization filters to compensate for spectral coloration of the sound reproduction due to installation and room effects. This paper describes the efforts taken to develop the equalization filters for use in the real-time sound-reproduction system and the subsequent analysis of the system's acoustic performance. The acoustic performance of the compensated and uncompensated sound-reproduction system is assessed for its crossover performance, its performance under stationary and dynamic conditions, the maximum spatialized sound pressure level it can produce from a single virtual source, and for the spatial uniformity of a generated sound field. Additionally, application examples are given to illustrate the compensated sound-reproduction system performance using recorded aircraft flyovers.

  15. Differences in kata performance time and distance from a marker for experienced Shotokan karateka under normal sighted and blindfolded conditions.

    PubMed

    Layton, Clive; Avenell, Leon

    2002-08-01

    Ten experienced Shotokan karateka were tested on performance time and distance from a marker on the five Heian kata under normal sighted and blindfolded conditions. Whilst each kata's line of movement is different, the intention is to start and finish at the same location. Analysis showed that, despite an average of 16.8 yr. of training, although timing was not significantly affected on four of the kata when subjects were deprived of the visual sense, the group's mean change in distance from the original marker was significant for performances on three of the kata.

  16. Statistical mechanics of broadcast channels using low-density parity-check codes.

    PubMed

    Nakamura, Kazutaka; Kabashima, Yoshiyuki; Morelos-Zaragoza, Robert; Saad, David

    2003-03-01

    We investigate the use of Gallager's low-density parity-check (LDPC) codes in a degraded broadcast channel, one of the fundamental models in network information theory. Combining linear codes is a standard technique in practical network communication schemes and is known to provide better performance than simple time sharing methods when algebraic codes are used. The statistical physics based analysis shows that the practical performance of the suggested method, achieved by employing the belief propagation algorithm, is superior to that of LDPC based time sharing codes while the best performance, when received transmissions are optimally decoded, is bounded by the time sharing limit.

  17. Comparative neurobehavioral study of a polybrominated biphenyl-exposed population in Michigan and a nonexposed group in Wisconsin.

    PubMed Central

    Valciukas, J A; Lilis, R; Wolff, M S; Anderson, H A

    1978-01-01

    An analysis of findings regarding the prevalence and time course of symptoms and the results of neurobehavioral testing among Michigan and Wisconsin dairy farmers is reported. Reviewed are: (1) differences in the prevalence of neurological symptoms at the time of examination; (2) differences in the incidence and time course of symptoms for the period 1972-1976; (3) differences among populations and subgroups (sex and age) regarding performance test scores; (4) correlations between performance test scores and neurological symptoms; and (5) correlations between serum PBB levels, as indicators of exposure, and performance tests and neurological symptoms. PMID:209977

  18. Laparoendoscopic single-site surgery varicocelectomy versus conventional laparoscopic varicocele ligation: A meta-analysis

    PubMed Central

    Li, Mingchao; Wang, Zhengyun

    2016-01-01

    Objective To perform a meta-analysis of data from available published studies comparing laparoendoscopic single-site surgery varicocelectomy (LESSV) with conventional transperitoneal laparoscopic varicocele ligation. Methods A comprehensive data search was performed in PubMed and Embase to identify randomized controlled trials and comparative studies that compared the two surgical approaches for the treatment of varicoceles. Results Six studies were included in the meta-analysis. LESSV required a significantly longer operative time than conventional laparoscopic varicocelectomy but was associated with significantly less postoperative pain at 6 h and 24 h, a shorter recovery time and greater patient satisfaction with the cosmetic outcome. There was no difference between the two surgical approaches in terms of postoperative semen quality or the incidence of complications. Conclusion These data suggest that LESSV offers a well tolerated and efficient alternative to conventional laparoscopic varicocelectomy, with less pain, a shorter recovery time and better cosmetic satisfaction. Further well-designed studies are required to confirm these findings and update the results of this meta-analysis. PMID:27688686

  19. Improving Efficiency with Work Sampling.

    ERIC Educational Resources Information Center

    Friedman, Mark; Hertz, Paul

    1982-01-01

    Work sampling is a managerial accounting technique which provides information about the efficiency of an operation. This analysis determines what tasks are being performed during a period of time to ascertain if time and effort are being allocated efficiently. (SK)

  20. The incorrect usage of singular spectral analysis and discrete wavelet transform in hybrid models to predict hydrological time series

    NASA Astrophysics Data System (ADS)

    Du, Kongchang; Zhao, Ying; Lei, Jiaqiang

    2017-09-01

    In hydrological time series prediction, singular spectrum analysis (SSA) and discrete wavelet transform (DWT) are widely used as preprocessing techniques for artificial neural network (ANN) and support vector machine (SVM) predictors. These hybrid or ensemble models seem to largely reduce the prediction error. In the current literature, researchers apply these techniques to the whole observed time series and then obtain a set of reconstructed or decomposed time series as inputs to the ANN or SVM. However, through two comparative experiments and mathematical deduction we found that this usage of SSA and DWT in building hybrid models is incorrect. Since SSA and DWT adopt 'future' values to perform the calculation, the series generated by SSA reconstruction or DWT decomposition contain information about 'future' values. These hybrid models therefore report misleadingly 'high' prediction performance and may cause large errors in practice.
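    The leakage the abstract describes can be demonstrated with a minimal SSA reconstruction in NumPy: the reconstructed value at an early time step changes once later observations are appended, showing that SSA-preprocessed inputs encode "future" information. This is a sketch under simplified assumptions (rank-1 reconstruction, synthetic data), not the paper's experiment:

```python
import numpy as np

def ssa_reconstruct(x, L=5, r=1):
    # Basic SSA: embed the series into an L x K trajectory (Hankel) matrix,
    # keep the top-r singular components, then diagonal-average back to a series.
    N = len(x)
    K = N - L + 1
    X = np.column_stack([x[i:i + L] for i in range(K)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = (U[:, :r] * s[:r]) @ Vt[:r]
    out = np.zeros(N)
    cnt = np.zeros(N)
    for j in range(K):          # diagonal averaging (Hankelization)
        out[j:j + L] += Xr[:, j]
        cnt[j:j + L] += 1
    return out / cnt

rng = np.random.default_rng(0)
x = np.sin(np.arange(40) / 3.0) + 0.1 * rng.standard_normal(40)

full = ssa_reconstruct(x)        # uses the whole series, incl. "future" points
prefix = ssa_reconstruct(x[:30]) # uses only data available up to t = 29

# Early reconstructed values differ once future points are included --
# evidence that SSA-preprocessed inputs leak future information.
print(abs(full[10] - prefix[10]))
```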

  1. Accepting multiple simultaneous liver offers does not negatively impact transplant outcomes.

    PubMed

    Eldeen, Firas Zahr; Mourad, Moustafa Mabrouk; Bhandari, Mayank; Roll, Garrett; Gunson, Bridget; Mergental, Hynek; Bramhall, Simon; Isaac, John; Muiesan, Paolo; Mirza, Darius F; Perera, M Thamara P R

    2016-02-01

    Impact of performing multiple liver transplants (LT) in a short period of time is unknown. Consecutively performed LT potentially increase complication rates through team fatigue, overutilization of resources, and increased ischemia times. We analyzed the impact of undertaking consecutive LT (consecutive liver transplant, CLT; an LT preceded by another transplant performed not more than 12 h before, both transplants grouped together) on outcomes. Of 1702 LT performed, 314 (18.4%) were CLT. Outcome data were compared with solitary LT (SLT; not more than one LT in a 12-h period). Recipient, donor, and graft characteristics were evenly matched between SLT and CLT; the second LT of the CLT group utilized younger donor grafts with longer cold ischemic times (P = 0.015). Implantation and operative times were significantly lower in CLT recipients on intergroup analysis (P = 0.0001 and 0.002, respectively). Early hepatic artery thrombosis (E-HAT) was higher in CLT versus SLT (P = 0.038), despite the absolute number of E-HAT being low in all groups. Intragroup analysis demonstrated a trend toward more frequent E-HAT in the first LT compared to subsequent transplants; however, the difference did not reach statistical significance (P = 0.135). In an era of organ scarcity, CLT performed at a high-volume center is safe and allows pragmatic utilization of organs, potentially reducing the number of discarded grafts and reducing waiting list mortality. © 2015 Steunstichting ESOT.

  2. High-performance, multi-faceted research sonar electronics

    NASA Astrophysics Data System (ADS)

    Moseley, Julian W.

    This thesis describes the design, implementation and testing of a research sonar system capable of performing complex applications such as coherent Doppler measurement and synthetic aperture imaging. Specifically, this thesis presents an approach to improve the precision of the timing control and increase the signal-to-noise ratio of an existing research sonar. A dedicated timing control subsystem and hardware drivers are designed to improve the efficiency of the old sonar's timing operations. A low-noise preamplifier is designed to reduce the noise component in the received signal arriving at the input of the system's data acquisition board. Noise analysis, frequency response, and timing simulation data are generated in order to predict the functionality and performance improvements expected when the subsystems are implemented. Experimental data, gathered using these subsystems, are presented and are shown to closely match the simulation results, thus verifying performance.

  3. An analysis of relational complexity in an air traffic control conflict detection task.

    PubMed

    Boag, Christine; Neal, Andrew; Loft, Shayne; Halford, Graeme S

    2006-11-15

    Theoretical analyses of air traffic complexity were carried out using the Method for the Analysis of Relational Complexity. Twenty-two air traffic controllers examined static air traffic displays and were required to detect and resolve conflicts. Objective measures of performance included conflict detection time and accuracy. Subjective perceptions of mental workload were assessed by a complexity-sorting task and subjective ratings of the difficulty of different aspects of the task. A metric quantifying the complexity of pair-wise relations among aircraft was able to account for a substantial portion of the variance in the perceived complexity and difficulty of conflict detection problems, as well as reaction time. Other variables that influenced performance included the mean minimum separation between aircraft pairs and the amount of time that aircraft spent in conflict.

  4. Advances in shock timing experiments on the National Ignition Facility

    NASA Astrophysics Data System (ADS)

    Robey, H. F.; Celliers, P. M.; Moody, J. D.; Sater, J.; Parham, T.; Kozioziemski, B.; Dylla-Spears, R.; Ross, J. S.; LePape, S.; Ralph, J. E.; Hohenberger, M.; Dewald, E. L.; Berzak Hopkins, L.; Kroll, J. J.; Yoxall, B. E.; Hamza, A. V.; Boehly, T. R.; Nikroo, A.; Landen, O. L.; Edwards, M. J.

    2016-03-01

    Recent advances in shock timing experiments and analysis techniques now enable shock measurements to be performed in cryogenic deuterium-tritium (DT) ice layered capsule implosions on the National Ignition Facility (NIF). Previous measurements of shock timing in inertial confinement fusion (ICF) implosions were performed in surrogate targets, where the solid DT ice shell and central DT gas were replaced with a continuous liquid deuterium (D2) fill. These previous experiments pose two surrogacy issues: a material surrogacy due to the difference of species (D2 vs. DT) and densities of the materials used, and a geometric surrogacy due to the presence of an additional interface (ice/gas) previously absent in the liquid-filled targets. This report presents experimental data and a new analysis method for validating the assumptions underlying this surrogate technique.

  5. Influence of storage conditions on the stability of monomeric anthocyanins studied by reversed-phase high-performance liquid chromatography.

    PubMed

    Morais, Helena; Ramos, Cristina; Forgács, Esther; Cserháti, Tibor; Oliviera, José

    2002-04-25

    The effect of light, storage time and temperature on the decomposition rate of monomeric anthocyanin pigments extracted from skins of grape (Vitis vinifera var. Red globe) was determined by reversed-phase high-performance liquid chromatography (RP-HPLC). The impact of various storage conditions on the pigment stability was assessed by stepwise regression analysis. RP-HPLC separated well the five anthocyanins identified and proved the presence of other unidentified pigments at lower concentrations. Stepwise regression analysis confirmed that the overall decomposition rate of monomeric anthocyanins, peonidin-3-glucoside and malvidin-3-glucoside significantly depended on the time and temperature of storage, the effect of storage time being the most important. The presence or absence of light exerted a negligible impact on the decomposition rate.

  6. Posttest analysis of the FFTF inherent safety tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Padilla, A. Jr.; Claybrook, S.W.

    Inherent safety tests were performed during 1986 in the 400-MW (thermal) Fast Flux Test Facility (FFTF) reactor to demonstrate the effectiveness of an inherent shutdown device called the gas expansion module (GEM). The GEM device provided a strong negative reactivity feedback during loss-of-flow conditions by increasing the neutron leakage as a result of an expanding gas bubble. The best-estimate pretest calculations for these tests were performed using the IANUS plant analysis code (Westinghouse Electric Corporation proprietary code) and the MELT/SIEX3 core analysis code. These two codes were also used to perform the required operational safety analyses for the FFTF reactor and plant. Although it was intended to also use the SASSYS systems (core and plant) analysis code, the calibration of the SASSYS code for FFTF core and plant analysis was not completed in time to perform pretest analyses. The purpose of this paper is to present the results of the posttest analysis of the 1986 FFTF inherent safety tests using the SASSYS code.

  7. Reliability of skeletal maturity analysis using the cervical vertebrae maturation method on dedicated software.

    PubMed

    Padalino, Saverio; Sfondrini, Maria Francesca; Chenuil, Laura; Scudeller, Luigia; Gandini, Paola

    2014-12-01

    The aim of this study was to assess the feasibility of skeletal maturation analysis using the Cervical Vertebrae Maturation (CVM) method by means of dedicated software, developed in collaboration with Outside Format (Paullo-Milan), as compared with manual analysis. From a sample of patients aged 7-21 years, we gathered 100 lateral cephalograms, 20 for each of the five CVM stages. For each cephalogram, we traced cervical vertebrae C2, C3 and C4 both by hand, using a lead pencil and an acetate sheet, and with the dedicated software. All the tracings were made by an experienced operator (a dentofacial orthopedics resident) and by an inexperienced operator (a student in dental surgery). Each operator recorded the time needed to make each tracing in order to document differences in the times taken. Concordance between the manual analysis and the analysis performed using the dedicated software was 94% for the resident and 93% for the student. Interobserver concordance was 99%. Hand-tracing was quicker than tracing with the software (by 28 seconds on average). The cervical vertebrae analysis software offers excellent clinical performance, even if the method takes longer than the manual technique. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  8. An optimal design of wind turbine and ship structure based on neuro-response surface method

    NASA Astrophysics Data System (ADS)

    Lee, Jae-Chul; Shin, Sung-Chul; Kim, Soo-Young

    2015-07-01

    The geometry of engineering systems affects their performance. For this reason, the shape of engineering systems needs to be optimized in the initial design stage. However, engineering system design problems consist of multi-objective optimization, and performance analysis using commercial codes or numerical analysis is generally time-consuming. To solve these problems, many engineers perform the optimization using an approximation model (response surface). The Response Surface Method (RSM) is generally used to predict system performance in the engineering research field, but RSM presents some prediction errors for highly nonlinear systems. The major objective of this research is to establish an optimal design method for multi-objective problems and confirm its applicability. The proposed process is composed of three parts: definition of geometry, generation of the response surface, and the optimization process. To reduce the time for performance analysis and minimize the prediction errors, the approximation model is generated using a Backpropagation Artificial Neural Network (BPANN), which is considered a Neuro-Response Surface Method (NRSM). The optimization is performed on the generated response surface by the non-dominated sorting genetic algorithm II (NSGA-II). Through case studies of a marine system and a ship structure (the substructure of a floating offshore wind turbine, considering hydrodynamic performance, and bulk carrier bottom stiffened panels, considering structural performance), we have confirmed the applicability of the proposed method for multi-objective side-constraint optimization problems.

  9. Real Time Text Analysis

    NASA Astrophysics Data System (ADS)

    Senthilkumar, K.; Ruchika Mehra Vijayan, E.

    2017-11-01

    This paper aims to illustrate real-time analysis of large-scale data. For practical implementation, we perform sentiment analysis on live Twitter feeds for each individual tweet. To analyze sentiments, we train our data model on SentiWordNet, a polarity-assigned WordNet sample by Princeton University. Our main objective is to efficiently analyze large-scale data on the fly using distributed computation. The Apache Spark and Apache Hadoop ecosystem is used as the distributed computation platform, with Java as the development language.

  10. Does Active Management Benefit Endowment Returns? An Analysis of the NACUBO-Commonfund Study of Endowments (NCSE) Data

    ERIC Educational Resources Information Center

    Belmont, David; Odisharia, Irakli

    2014-01-01

    We conduct a longitudinal analysis of the NACUBO-Commonfund Study of Endowments (NCSE) results from 2006-2013 to evaluate if active management is related to higher endowment returns in U.S. equities over time. We also analyze the data to evaluate the endowment characteristics that are related to higher levels of performance over time. We find that…

  11. Molecular simultaneous detection of Cherry necrotic rusty mottle virus and Cherry green ring mottle virus by real-time RT-PCR and high resolution melting analysis

    USDA-ARS?s Scientific Manuscript database

    In this study, real-time RT-PCR assays were combined with high resolution melting (HRM) analysis for the simultaneous detection of Cherry necrotic rusty mottle virus (CNRMV) and Cherry green ring mottle virus (CGRMV) infection in sweet cherry trees. Detection of CNRMV and CGRMV was performed using a...

  12. Frequency stability of on-orbit GPS Block-I and Block-II Navstar clocks

    NASA Astrophysics Data System (ADS)

    McCaskill, Thomas B.; Reid, Wilson G.; Buisson, James A.

    On-orbit analysis of the Global Positioning System (GPS) Block-I and Block-II Navstar clocks has been performed by the Naval Research Laboratory using a multi-year database. The Navstar clock phase-offset measurements were computed from pseudorange measurements made by the five GPS monitor sites and from the U.S. Naval Observatory precise-time site using single or dual frequency GPS receivers. Orbital data was obtained from the Navstar broadcast ephemeris and from the best-fit, postprocessed orbital ephemerides supplied by the Naval Surface Weapons Center or by the Defense Mapping Agency. Clock performance in the time domain is characterized using frequency-stability profiles with sample times that vary from 1 to 100 days. Composite plots of Navstar frequency stability and time-prediction uncertainty are included as a summary of clock analysis results. The analysis includes plots of the clock phase offset and frequency offset histories with the eclipse seasons superimposed on selected plots to demonstrate the temperature sensitivity of one of the Block-I Navstar rubidium clocks. The potential impact on navigation and on transferring precise time of the degradation in the long-term frequency stability of the rubidium clocks is discussed.

  13. Performance analysis of wireless sensor networks in geophysical sensing applications

    NASA Astrophysics Data System (ADS)

    Uligere Narasimhamurthy, Adithya

    Performance is an important criterion to consider before switching from a wired network to a wireless sensing network. Performance is especially important in geophysical sensing, where the quality of the sensing system is measured by the precision of the acquired signal. Can a wireless sensing network maintain the same reliability and quality metrics that a wired system provides? Our work focuses on evaluating the wireless GeoMote sensor motes that were developed by previous computer science graduate students at Mines. Specifically, we conducted a set of experiments, namely WalkAway and Linear Array experiments, to characterize the performance of the wireless motes. The motes were also equipped with the Sticking Heartbeat Aperture Resynchronization Protocol (SHARP), a time synchronization protocol developed by a previous computer science graduate student at Mines. This protocol should automatically synchronize the motes' internal clocks and reduce time synchronization errors. We also collected passive data to evaluate the response of GeoMotes to various frequency components associated with seismic waves. With the data collected from these experiments, we evaluated the performance of the SHARP protocol and compared the performance of our GeoMote wireless system against the industry-standard wired seismograph system (Geometrics Geode). Using arrival time analysis and seismic velocity calculations, we set out to answer the following question: can our wireless sensing system (GeoMotes) perform similarly to a traditional wired system in a realistic scenario?

  14. Fast EEG spike detection via eigenvalue analysis and clustering of spatial amplitude distribution

    NASA Astrophysics Data System (ADS)

    Fukami, Tadanori; Shimada, Takamasa; Ishikawa, Bunnoshin

    2018-06-01

    Objective. In the current study, we tested a proposed method for fast spike detection in electroencephalography (EEG). Approach. We performed eigenvalue analysis in two-dimensional space spanned by gradients calculated from two neighboring samples to detect high-amplitude negative peaks. We extracted the spike candidates by imposing restrictions on parameters regarding spike shape and eigenvalues reflecting detection characteristics of individual medical doctors. We subsequently performed clustering, classifying detected peaks by considering the amplitude distribution at 19 scalp electrodes. Clusters with a small number of candidates were excluded. We then defined a score for eliminating spike candidates for which the pattern of detected electrodes differed from the overall pattern in a cluster. Spikes were detected by setting the score threshold. Main results. Based on visual inspection by a psychiatrist experienced in EEG, we evaluated the proposed method using two statistical measures of precision and recall with respect to detection performance. We found that precision and recall exhibited a trade-off relationship. The average recall value was 0.708 in eight subjects with the score threshold that maximized the F-measure, with 58.6  ±  36.2 spikes per subject. Under this condition, the average precision was 0.390, corresponding to a false positive rate 2.09 times higher than the true positive rate. Analysis of the required processing time revealed that, using a general-purpose computer, our method could be used to perform spike detection in 12.1% of the recording time. The process of narrowing down spike candidates based on shape occupied most of the processing time. Significance. Although the average recall value was comparable with that of other studies, the proposed method significantly shortened the processing time.
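    The threshold selection described above maximizes the F-measure, the harmonic mean of precision and recall; a minimal check using the precision and recall values reported in the abstract:

```python
def f_measure(precision, recall):
    # Harmonic mean of precision and recall (F1), used to pick the
    # score threshold that balances the two measures.
    return 2 * precision * recall / (precision + recall)

# Average values reported in the abstract at the F-maximizing threshold:
print(round(f_measure(0.390, 0.708), 3))  # 0.503
```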

  15. Specialized data analysis of SSME and advanced propulsion system vibration measurements

    NASA Technical Reports Server (NTRS)

    Coffin, Thomas; Swanson, Wayne L.; Jong, Yen-Yi

    1993-01-01

    The basic objectives of this contract were to perform detailed analysis and evaluation of dynamic data obtained during Space Shuttle Main Engine (SSME) test and flight operations, including analytical/statistical assessment of component dynamic performance, and to continue the development and implementation of analytical/statistical models to effectively define nominal component dynamic characteristics, detect anomalous behavior, and assess machinery operational conditions. This study was to provide timely assessment of engine component operational status, identify probable causes of malfunction, and define feasible engineering solutions. The work was performed under three broad tasks: (1) Analysis, Evaluation, and Documentation of SSME Dynamic Test Results; (2) Data Base and Analytical Model Development and Application; and (3) Development and Application of Vibration Signature Analysis Techniques.

  16. 78 FR 75571 - Independent Assessment of the Process for the Review of Device Submissions; High Priority...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-12

    ... of performing the technical analysis, management assessment, and program evaluation tasks required to.... Analysis of elements of the review process (including the presubmission process, and investigational device... time to facilitate a more efficient process. This includes analysis of root causes for inefficiencies...

  17. Compressive buckling analysis of hat-stiffened panel

    NASA Technical Reports Server (NTRS)

    Ko, William L.; Jackson, Raymond H.

    1991-01-01

    Buckling analysis was performed on a hat-stiffened panel subjected to uniaxial compression. Both local buckling and global buckling were analyzed. It was found that the global buckling load was several times higher than the local buckling load. The predicted local buckling loads compared favorably with both experimental data and finite-element analysis.

  18. Over the hill at 24: persistent age-related cognitive-motor decline in reaction times in an ecologically valid video game task begins in early adulthood.

    PubMed

    Thompson, Joseph J; Blair, Mark R; Henrey, Andrew J

    2014-01-01

    Typically, studies of the effects of aging on cognitive-motor performance emphasize changes in elderly populations. Although some research is directly concerned with when age-related decline actually begins, studies are often based on relatively simple reaction time tasks, making it impossible to gauge the impact of experience in compensating for this decline in a real-world task. The present study investigates age-related changes in cognitive-motor performance through adolescence and adulthood in a complex real-world task, the real-time strategy video game StarCraft 2. In this paper we analyze the influence of age on performance using a dataset of 3,305 players, aged 16-44, collected by Thompson, Blair, Chen & Henrey [1]. Using a piecewise regression analysis, we find that age-related slowing of within-game, self-initiated response times begins at 24 years of age. We find no evidence for the common belief that expertise should attenuate domain-specific cognitive decline. Domain-specific response time declines appear to persist regardless of skill level. A second analysis of dual-task performance finds no evidence of a corresponding age-related decline. Finally, an exploratory analysis of other age-related differences suggests that older participants may have been compensating for a loss in response speed through the use of game mechanics that reduce cognitive load.
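    A piecewise (broken-stick) regression of the kind used to locate the onset of slowing can be sketched as a grid search over candidate breakpoints; the data, grid, and variable names below are illustrative, not the study's:

```python
# Sketch of a continuous two-segment ("hinge") regression: for each
# candidate breakpoint b, fit intercept + slope*x + slope2*max(0, x - b)
# by least squares and keep the b with the smallest squared error.
# Synthetic data only; a flat response before age 24, slowing afterwards.
import numpy as np

def fit_breakpoint(x, y, candidates):
    """Grid-search the breakpoint of a continuous two-segment fit."""
    best = None
    for b in candidates:
        # Design matrix: intercept, x, and the hinge term max(0, x - b)
        X = np.column_stack([np.ones_like(x), x, np.maximum(0, x - b)])
        beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = np.sum((y - X @ beta) ** 2)
        if best is None or sse < best[0]:
            best = (sse, b, beta)
    return best[1], best[2]

rng = np.random.default_rng(0)
age = rng.uniform(16, 44, 300)
# Assumed toy model: flat 500 ms before 24, then ~4 ms/year slowing, plus noise
rt = 500 + 4.0 * np.maximum(0, age - 24) + rng.normal(0, 5, age.size)
bp, coef = fit_breakpoint(age, rt, np.arange(18, 40, 0.5))
```

With data of this shape, the recovered breakpoint lands near the true change point and the hinge coefficient gives the post-breakpoint slowing rate.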

  19. Over the Hill at 24: Persistent Age-Related Cognitive-Motor Decline in Reaction Times in an Ecologically Valid Video Game Task Begins in Early Adulthood

    PubMed Central

    Thompson, Joseph J.; Blair, Mark R.; Henrey, Andrew J.

    2014-01-01

    Typically, studies of the effects of aging on cognitive-motor performance emphasize changes in elderly populations. Although some research is directly concerned with when age-related decline actually begins, studies are often based on relatively simple reaction time tasks, making it impossible to gauge the impact of experience in compensating for this decline in a real-world task. The present study investigates age-related changes in cognitive-motor performance through adolescence and adulthood in a complex real-world task, the real-time strategy video game StarCraft 2. In this paper we analyze the influence of age on performance using a dataset of 3,305 players, aged 16-44, collected by Thompson, Blair, Chen & Henrey [1]. Using a piecewise regression analysis, we find that age-related slowing of within-game, self-initiated response times begins at 24 years of age. We find no evidence for the common belief that expertise should attenuate domain-specific cognitive decline. Domain-specific response time declines appear to persist regardless of skill level. A second analysis of dual-task performance finds no evidence of a corresponding age-related decline. Finally, an exploratory analysis of other age-related differences suggests that older participants may have been compensating for a loss in response speed through the use of game mechanics that reduce cognitive load. PMID:24718593

  20. Non performing loans (NPLs) in a crisis economy: Long-run equilibrium analysis with a real time VEC model for Greece (2001-2015)

    NASA Astrophysics Data System (ADS)

    Konstantakis, Konstantinos N.; Michaelides, Panayotis G.; Vouldis, Angelos T.

    2016-06-01

    As a result of domestic and international factors, the Greek economy faced a severe crisis which is directly comparable only to the Great Recession. A prominent victim of this crisis was the country's banking system. This paper attempts to shed light on the determining factors of non-performing loans in the Greek banking sector. The analysis presents empirical evidence from the Greek economy, using aggregate data on a quarterly basis over the period 2001-2015, fully capturing the recent recession. In this work, we use a relevant econometric framework based on a real time Vector Autoregressive (VAR)-Vector Error Correction (VEC) model, which captures the dynamic interdependencies among the variables used. Consistent with international evidence, the empirical findings show that both macroeconomic and financial factors have a significant impact on non-performing loans in the country. Meanwhile, the deteriorating credit quality feeds back into the economy, leading to a self-reinforcing negative loop.

  1. Rapid simultaneous high-resolution mapping of myelin water fraction and relaxation times in human brain using BMC-mcDESPOT.

    PubMed

    Bouhrara, Mustapha; Spencer, Richard G

    2017-02-15

    A number of central nervous system (CNS) diseases exhibit changes in myelin content and in the magnetic resonance longitudinal (T1) and transverse (T2) relaxation times, which therefore represent important biomarkers of CNS pathology. Among the methods applied for measurement of myelin water fraction (MWF) and relaxation times, the multicomponent driven equilibrium single pulse observation of T1 and T2 (mcDESPOT) approach is of particular interest. mcDESPOT permits whole-brain mapping of multicomponent T1 and T2, with data acquisition accomplished within a clinically realistic acquisition time. Unfortunately, previous studies have indicated the limited performance of mcDESPOT in the modest signal-to-noise range of high-resolution mapping, which is required for the depiction of small structures and to reduce partial volume effects. Recently, we showed that a new Bayesian Monte Carlo (BMC) analysis substantially improved determination of MWF from mcDESPOT imaging data. However, our previous study was limited in that it did not address determination of relaxation times. Here, we extend the BMC analysis to the simultaneous determination of whole-brain MWF and relaxation times using the two-component mcDESPOT signal model. Simulation analyses and in-vivo human brain studies indicate the overall greater performance of this approach compared to the stochastic region contraction (SRC) algorithm conventionally used to derive parameter estimates from mcDESPOT data. SRC estimates of the transverse relaxation time of the long-T2 fraction, T2,l, and the longitudinal relaxation time of the short-T1 fraction, T1,s, clustered towards the lower and upper parameter search space limits, respectively, indicating failure of the fitting procedure. We demonstrate that this effect is absent in the BMC analysis. Our results also showed improved parameter estimation for BMC as compared to SRC for high-resolution mapping. Overall, we find that the combination of BMC analysis and mcDESPOT, BMC-mcDESPOT, shows excellent performance for accurate high-resolution whole-brain mapping of MWF and bi-component transverse and longitudinal relaxation times within a clinically realistic acquisition time. Published by Elsevier Inc.

  2. On reliable time-frequency characterization and delay estimation of stimulus frequency otoacoustic emissions

    NASA Astrophysics Data System (ADS)

    Biswal, Milan; Mishra, Srikanta

    2018-05-01

    The limited information on the origin and nature of stimulus frequency otoacoustic emissions (SFOAEs) necessitates a thorough reexamination of SFOAE analysis procedures, which will lead to a better understanding of SFOAE generation. The SFOAE response waveform in the time domain can be interpreted as a summation of amplitude-modulated and frequency-modulated component waveforms, and the efficiency of a technique in segregating these components is critical for describing the nature of SFOAEs. Recent advances in robust time-frequency analysis algorithms claim more accurate extraction of such components from composite signals buried in noise. However, their potential has not been fully explored for SFOAE analysis. Because these techniques differ in the information they emphasize, the choice of analysis method may affect the scientific conclusions. This paper attempts to bridge this gap in the literature by evaluating the performance of three linear time-frequency analysis algorithms, the short-time Fourier transform (STFT), the continuous wavelet transform (CWT), and the S-transform (ST), and two nonlinear algorithms, the Hilbert-Huang transform (HHT) and the synchrosqueezed wavelet transform (SWT). We revisit the extraction of constituent components and the estimation of their magnitude and delay, carefully evaluating the impact of variation in analysis parameters. The performance of HHT and SWT in time-frequency filtering and delay estimation was found to be relatively less efficient for analyzing SFOAEs. The intrinsic mode functions of HHT do not completely characterize the reflection components, and hence IMF-based filtering alone is not recommended for segregating the principal emission from multiple reflection components. We found STFT, CWT, and ST to be suitable for canceling multiple internal reflection components while only marginally altering the SFOAE.
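    The simplest of the compared methods, the STFT, can be sketched directly: slide a tapered window over the signal and take an FFT per frame, yielding a time-frequency magnitude map. Window length, hop size, and the test tone below are illustrative, not the study's parameters:

```python
# Minimal STFT sketch: Hann-windowed frames, one real FFT per frame.
# Peaks and ridges in the resulting magnitude map are what component-
# extraction and delay-estimation procedures operate on.
import numpy as np

def stft(x, fs, win_len=256, hop=64):
    """Return (frame times, frequencies, |STFT|) for a real 1-D signal."""
    window = np.hanning(win_len)
    n_frames = 1 + (len(x) - win_len) // hop
    frames = np.stack([x[i * hop:i * hop + win_len] * window
                       for i in range(n_frames)])
    mag = np.abs(np.fft.rfft(frames, axis=1))
    times = (np.arange(n_frames) * hop + win_len / 2) / fs
    freqs = np.fft.rfftfreq(win_len, 1 / fs)
    return times, freqs, mag

fs = 8000
t = np.arange(0, 0.5, 1 / fs)
sig = np.sin(2 * np.pi * 1000 * t)          # 1 kHz test tone
times, freqs, mag = stft(sig, fs)
peak_freq = freqs[np.argmax(mag[0])]        # dominant bin of the first frame
```

The frequency resolution is fs/win_len per bin, which is the kind of analysis-parameter trade-off the paper examines across methods.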

  3. A statistical approach to evaluate the performance of cardiac biomarkers in predicting death due to acute myocardial infarction: time-dependent ROC curve

    PubMed

    Karaismailoğlu, Eda; Dikmen, Zeliha Günnur; Akbıyık, Filiz; Karaağaoğlu, Ahmet Ergun

    2018-04-30

    Background/aim: Myoglobin, cardiac troponin T, B-type natriuretic peptide (BNP), and creatine kinase isoenzyme MB (CK-MB) are frequently used biomarkers for evaluating risk of patients admitted to an emergency department with chest pain. Recently, time-dependent receiver operating characteristic (ROC) analysis has been used to evaluate the predictive power of biomarkers where disease status can change over time. We aimed to determine the best set of biomarkers that estimate cardiac death during follow-up time. We also obtained optimal cut-off values of these biomarkers, which differentiate between patients with and without risk of death. A web tool was developed to estimate time intervals in risk. Materials and methods: A total of 410 patients admitted to the emergency department with chest pain and shortness of breath were included. Cox regression analysis was used to determine an optimal set of biomarkers that can be used for estimating cardiac death and to combine the significant biomarkers. Time-dependent ROC analysis was performed for evaluating performances of significant biomarkers and a combined biomarker during 240 h. The bootstrap method was used to compare statistical significance and the Youden index was used to determine optimal cut-off values. Results: Myoglobin and BNP were significant by multivariate Cox regression analysis. Areas under the time-dependent ROC curves of myoglobin and BNP were about 0.80 during 240 h, and that of the combined biomarker (myoglobin + BNP) increased to 0.90 during the first 180 h. Conclusion: Although myoglobin is not clinically specific to a cardiac event, in our study both myoglobin and BNP were found to be statistically significant for estimating cardiac death. Using this combined biomarker may increase the power of prediction. Our web tool can be useful for evaluating the risk status of new patients and helping clinicians in making decisions.
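    The Youden-index cut-off selection mentioned above can be sketched in a few lines; the biomarker values are synthetic and imply nothing about the study's actual data:

```python
# Sketch of choosing a biomarker cut-off with the Youden index
# J = sensitivity + specificity - 1, maximized over observed values.
import numpy as np

def youden_cutoff(values, outcome):
    """Return the candidate cut-off maximizing J, and J itself."""
    best_j, best_c = -1.0, None
    for c in np.unique(values):
        pred = values >= c                       # "positive" call at this cut-off
        sens = np.mean(pred[outcome == 1])       # true-positive rate
        spec = np.mean(~pred[outcome == 0])      # true-negative rate
        j = sens + spec - 1
        if j > best_j:
            best_j, best_c = j, c
    return best_c, best_j

rng = np.random.default_rng(1)
# Synthetic biomarker: higher on average in patients with the outcome
controls = rng.normal(50, 10, 200)
cases = rng.normal(90, 10, 100)
values = np.concatenate([controls, cases])
outcome = np.concatenate([np.zeros(200, int), np.ones(100, int)])
cutoff, j = youden_cutoff(values, outcome)
```

In a time-dependent ROC setting the outcome labels are recomputed at each follow-up time, but the per-time cut-off logic is the same.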

  4. Impact of temporal resolution of inputs on hydrological model performance: An analysis based on 2400 flood events

    NASA Astrophysics Data System (ADS)

    Ficchì, Andrea; Perrin, Charles; Andréassian, Vazken

    2016-07-01

    Hydro-climatic data at short time steps are considered essential to model the rainfall-runoff relationship, especially for short-duration hydrological events, typically flash floods. Also, using fine time step information may be beneficial when using or analysing model outputs at larger aggregated time scales. However, the actual gain in prediction efficiency using short time-step data is not well understood or quantified. In this paper, we investigate the extent to which the performance of hydrological modelling is improved by short time-step data, using a large set of 240 French catchments, for which 2400 flood events were selected. Six-minute rain gauge data were available and the GR4 rainfall-runoff model was run with precipitation inputs at eight different time steps ranging from 6 min to 1 day. Then model outputs were aggregated at seven different reference time scales ranging from sub-hourly to daily for a comparative evaluation of simulations at different target time steps. Three classes of model performance behaviour were found for the 240 test catchments: (i) significant improvement of performance with shorter time steps; (ii) performance insensitivity to the modelling time step; (iii) performance degradation as the time step becomes shorter. The differences between these groups were analysed based on a number of catchment and event characteristics. A statistical test highlighted the most influential explanatory variables for model performance evolution at different time steps, including flow auto-correlation, flood and storm duration, flood hydrograph peakedness, rainfall-runoff lag time and precipitation temporal variability.
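    The aggregation of fine-time-step outputs to coarser reference scales, central to the comparison above, reduces to block sums; the 6-min-to-hourly ratio matches the study's finest input, while the data themselves are synthetic:

```python
# Sketch of aggregating a fine-time-step series (6-min totals) to a
# coarser reference scale (hourly) before model evaluation.
import numpy as np

def aggregate(series, ratio, how="sum"):
    """Aggregate a 1-D series by non-overlapping blocks of `ratio` steps."""
    trimmed = series[:len(series) // ratio * ratio]   # drop any partial block
    blocks = trimmed.reshape(-1, ratio)
    return blocks.sum(axis=1) if how == "sum" else blocks.mean(axis=1)

six_min_rain = np.ones(240)           # 24 h of 6-min totals, 1 mm each
hourly = aggregate(six_min_rain, 10)  # 10 six-min steps per hour
```

Totals (e.g. precipitation depth) are summed; state variables (e.g. flow rate) would use `how="mean"` instead.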

  5. Collaborative real-time motion video analysis by human observer and image exploitation algorithms

    NASA Astrophysics Data System (ADS)

    Hild, Jutta; Krüger, Wolfgang; Brüstle, Stefan; Trantelle, Patrick; Unmüßig, Gabriel; Heinze, Norbert; Peinsipp-Byma, Elisabeth; Beyerer, Jürgen

    2015-05-01

    Motion video analysis is a challenging task, especially in real-time applications. In most safety and security critical applications, a human observer is an obligatory part of the overall analysis system. Over the last years, substantial progress has been made in the development of automated image exploitation algorithms. Hence, we investigate how the benefits of automated video analysis can be integrated suitably into the current video exploitation systems. In this paper, a system design is introduced which strives to combine both the qualities of the human observer's perception and the automated algorithms, thus aiming to improve the overall performance of a real-time video analysis system. The system design builds on prior work where we showed the benefits for the human observer by means of a user interface which utilizes the human visual focus of attention revealed by the eye gaze direction for interaction with the image exploitation system; eye tracker-based interaction allows much faster, more convenient, and equally precise moving target acquisition in video images than traditional computer mouse selection. The system design also builds on prior work we did on automated target detection, segmentation, and tracking algorithms. Beside the system design, a first pilot study is presented, where we investigated how the participants (all non-experts in video analysis) performed in initializing an object tracking subsystem by selecting a target for tracking. Preliminary results show that the gaze + key press technique is an effective, efficient, and easy to use interaction technique when performing selection operations on moving targets in videos in order to initialize an object tracking function.

  6. Gene expression analysis of immunostained endothelial cells isolated from formaldehyde-fixated paraffin embedded tumors using laser capture microdissection--a technical report.

    PubMed

    Kaneko, Tomoatsu; Okiji, Takashi; Kaneko, Reika; Suda, Hideaki; Nör, Jacques E

    2009-12-01

    Laser capture microdissection (LCM) allows microscopic procurement of specific cell types from tissue sections that can then be used for gene expression analysis. In conventional LCM, frozen tissues stained with hematoxylin are normally used for the molecular analysis. Recent studies suggested that it is possible to carry out gene expression analysis of formaldehyde-fixated paraffin embedded (FFPE) tissues stained with hematoxylin. However, it is still unclear whether quantitative gene expression analyses can be performed on LCM cells from FFPE tissues that were subjected to immunostaining to enhance identification of target cells. In this proof-of-principle study, we analyzed by reverse transcription-PCR (RT-PCR) and real-time PCR the expression of genes in factor VIII-immunostained human endothelial cells that were dissected from FFPE tissues by LCM. We observed that immunostaining should be performed at 4 degrees C to preserve the mRNA of the cells. The expression of Bcl-2 in the endothelial cells was evaluated by RT-PCR and by real-time PCR. Glyceraldehyde-3-phosphate dehydrogenase and 18S were used as housekeeping genes for RT-PCR and real-time PCR, respectively. This report unveils a method for quantitative gene expression analysis in cells that were identified by immunostaining and retrieved by LCM from FFPE tissues. This method is ideally suited for the analysis of relatively rare cell types within a tissue and should improve our ability to perform differential diagnosis of pathologies as compared to conventional LCM.

  7. No evidence of reaction time slowing in autism spectrum disorder.

    PubMed

    Ferraro, F Richard

    2016-01-01

    A total of 32 studies comprising 238 simple reaction time and choice reaction time conditions were examined in individuals with autism spectrum disorder (n = 964) and controls (n = 1032). A Brinley plot/multiple regression analysis was performed on mean reaction times, regressing autism spectrum disorder performance onto control performance as a way to examine any generalized simple reaction time/choice reaction time slowing exhibited by the autism spectrum disorder group. The resulting regression equation was Y (autism spectrum disorder) = 0.99 × (control) + 87.93, which accounted for 92.3% of the variance. These results suggest that there is little if any simple reaction time/choice reaction time slowing in this sample of individuals with autism spectrum disorder, in comparison with controls. While many cognitive and information processing domains are compromised in autism spectrum disorder, it appears that simple reaction time/choice reaction time remain relatively unaffected in autism spectrum disorder. © The Author(s) 2014.
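    A Brinley-plot analysis is an ordinary least-squares fit of one group's condition means onto the other's, with a slope near 1 arguing against generalized slowing; a sketch on synthetic condition means (not the study's 238 conditions):

```python
# Sketch of the Brinley regression: group mean RTs regressed onto matched
# control condition means. Synthetic data generated to resemble the
# reported equation Y = 0.99*(control) + 87.93 in shape only.
import numpy as np

rng = np.random.default_rng(2)
control_means = rng.uniform(250, 800, 40)                 # ms, per condition
# Assumed toy model: slope ~1 with an additive offset, plus noise
group_means = 1.0 * control_means + 88 + rng.normal(0, 10, 40)
slope, intercept = np.polyfit(control_means, group_means, 1)
```

A slope of ~1 with a positive intercept, as here, indicates an additive rather than multiplicative (proportional) difference between groups.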

  8. Comparative Analysis of the Clinical Significance of Oscillatory Components in the Rhythmic Structure of Pulse Signal in the Diagnostics of Psychosomatic Disorders in School Age Children.

    PubMed

    Desova, A A; Dorofeyuk, A A; Anokhin, A M

    2017-01-01

    We performed a comparative analysis of the types of spectral density typical of various parameters of pulse signal. The experimental material was obtained during the examination of school age children with various psychosomatic disorders. We also performed a typological analysis of the spectral density functions corresponding to the time series of different parameters of a single oscillation of pulse signals; the results of their comparative analysis are presented. We determined the most significant spectral components for two disorders in children: arterial hypertension and mitral valve prolapse.

  9. Data analysis techniques used at the Oak Ridge Y-12 plant flywheel evaluation laboratory

    NASA Astrophysics Data System (ADS)

    Steels, R. S., Jr.; Babelay, E. F., Jr.

    1980-07-01

    Some of the more advanced data analysis techniques applied to the problem of experimentally evaluating the performance of high-performance composite flywheels are presented. Real-time applications include polar plots of runout with interruptions relating to balance and relative motions between parts, radial growth measurements, and temperature of the spinning part. The technique used to measure torque applied to a containment housing during flywheel failure is also presented. The discussion of pre- and post-test analysis techniques includes resonant frequency determination with modal analysis, waterfall charts, and runout signals at failure.

  10. High-speed separation and characterization of major constituents in Radix Paeoniae Rubra by fast high-performance liquid chromatography coupled with diode-array detection and time-of-flight mass spectrometry.

    PubMed

    Liu, E-Hu; Qi, Lian-Wen; Li, Bin; Peng, Yong-Bo; Li, Ping; Li, Chang-Yin; Cao, Jun

    2009-01-01

    A fast high-performance liquid chromatography (HPLC) method coupled with diode-array detection (DAD) and electrospray ionization time-of-flight mass spectrometry (ESI-TOFMS) has been developed for rapid separation and sensitive identification of major constituents in Radix Paeoniae Rubra (RPR). The total analysis time on a short column packed with 1.8-microm porous particles was about 20 min without a loss in resolution, six times faster than the performance of a conventional column analysis (115 min). The MS fragmentation behavior and structural characterization of major compounds in RPR were investigated here for the first time. The targets were rapidly screened from RPR matrix using a narrow mass window of 0.01 Da to restructure extracted ion chromatograms. Accurate mass measurements (less than 5 ppm error) for both the deprotonated molecule and characteristic fragment ions represent reliable identification criteria for these compounds in complex matrices with similar if not even better performance compared with tandem mass spectrometry. A total of 26 components were screened and identified in RPR including 11 monoterpene glycosides, 11 galloyl glucoses and 4 other phenolic compounds. From the point of time savings, resolving power, accurate mass measurement capability and full spectral sensitivity, the established fast HPLC/DAD/TOFMS method turns out to be a highly useful technique to identify constituents in complex herbal medicines. (c) 2008 John Wiley & Sons, Ltd.

  11. Analysis of Classical Time-Trial Performance and Technique-Specific Physiological Determinants in Elite Female Cross-Country Skiers.

    PubMed

    Sandbakk, Øyvind; Losnegard, Thomas; Skattebo, Øyvind; Hegge, Ann M; Tønnessen, Espen; Kocbach, Jan

    2016-01-01

    The present study investigated the contribution of performance on uphill, flat, and downhill sections to overall performance in an international 10-km classical time-trial in elite female cross-country skiers, as well as the relationships between performance on snow and laboratory-measured physiological variables in the double poling (DP) and diagonal (DIA) techniques. Ten elite female cross-country skiers were continuously measured by a global positioning system device during an international 10-km cross-country skiing time-trial in the classical technique. One month prior to the race, all skiers performed a 5-min submaximal and 3-min self-paced performance test while roller skiing on a treadmill, both in the DP and DIA techniques. The time spent on uphill (r = 0.98) and flat (r = 0.91) sections of the race correlated most strongly with the overall 10-km performance (both p < 0.05). Approximately 56% of the racing time was spent uphill, and stepwise multiple regression revealed that uphill time explained 95.5% of the variance in overall performance (p < 0.001). Distance covered during the 3-min roller-skiing test and body-mass normalized peak oxygen uptake (VO2peak) in both techniques showed the strongest correlations with overall time-trial performance (r = 0.66-0.78), with DP capacity tending to have greatest impact on the flat and DIA capacity on uphill terrain (all p < 0.05). Our present findings reveal that the time spent uphill most strongly determines classical time-trial performance, and that the major portion of the performance differences among elite female cross-country skiers can be explained by variations in technique-specific aerobic power.

  12. Analysis of Classical Time-Trial Performance and Technique-Specific Physiological Determinants in Elite Female Cross-Country Skiers

    PubMed Central

    Sandbakk, Øyvind; Losnegard, Thomas; Skattebo, Øyvind; Hegge, Ann M.; Tønnessen, Espen; Kocbach, Jan

    2016-01-01

    The present study investigated the contribution of performance on uphill, flat, and downhill sections to overall performance in an international 10-km classical time-trial in elite female cross-country skiers, as well as the relationships between performance on snow and laboratory-measured physiological variables in the double poling (DP) and diagonal (DIA) techniques. Ten elite female cross-country skiers were continuously measured by a global positioning system device during an international 10-km cross-country skiing time-trial in the classical technique. One month prior to the race, all skiers performed a 5-min submaximal and 3-min self-paced performance test while roller skiing on a treadmill, both in the DP and DIA techniques. The time spent on uphill (r = 0.98) and flat (r = 0.91) sections of the race correlated most strongly with the overall 10-km performance (both p < 0.05). Approximately 56% of the racing time was spent uphill, and stepwise multiple regression revealed that uphill time explained 95.5% of the variance in overall performance (p < 0.001). Distance covered during the 3-min roller-skiing test and body-mass normalized peak oxygen uptake (VO2peak) in both techniques showed the strongest correlations with overall time-trial performance (r = 0.66–0.78), with DP capacity tending to have greatest impact on the flat and DIA capacity on uphill terrain (all p < 0.05). Our present findings reveal that the time spent uphill most strongly determines classical time-trial performance, and that the major portion of the performance differences among elite female cross-country skiers can be explained by variations in technique-specific aerobic power. PMID:27536245

  13. Use of a Survival Analysis Technique in Understanding Game Performance in Instructional Games. CRESST Report 812

    ERIC Educational Resources Information Center

    Kim, Jinok; Chung, Gregory K. W. K.

    2012-01-01

    In this study we compared the effects of two math game designs on math and game performance, using discrete-time survival analysis (DTSA) to model players' risk of not advancing to the next level in the game. 137 students were randomly assigned to two game conditions. The game covered the concept of a unit and the addition of like-sized fractional…
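    Discrete-time survival analysis works on a person-period data set: one row per player per level attempted, with a binary event indicator in the final observed period, after which an ordinary logistic regression models the per-level hazard. A sketch of that expansion on assumed toy records (function and data are illustrative, not the report's):

```python
# Sketch of the person-period expansion underlying discrete-time
# survival analysis (DTSA). Each subject record is
# (id, number of periods observed, whether the event occurred).

def person_period(subjects):
    """Expand (id, periods_observed, event) records into person-period rows."""
    rows = []
    for sid, n_periods, event in subjects:
        for t in range(1, n_periods + 1):
            # the event can only fall in the last observed period
            rows.append((sid, t, int(event and t == n_periods)))
    return rows

# Two toy players: one fails to advance at level 3, one clears all 4 levels
rows = person_period([("p1", 3, True), ("p2", 4, False)])
```

Fitting a logistic regression of the event column on period (and condition) indicators then yields the per-level risk of not advancing that the DTSA models.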

  14. 76 FR 72493 - ITS Joint Program Office Webinar on Alternative Organizational Structures for a Certificate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-23

    ... over time. (This study is an institutional analysis only, not a technical analysis, and it is not... Adam Hopps at (202) 680-0091. The ITS JPO will present results from an early analysis of organizational models. This analysis will describe the functions that need to be performed by a CME; identify key...

  15. Timing of Occurrence Is the Most Important Characteristic of Spot Sign.

    PubMed

    Wang, Binli; Yan, Shenqiang; Xu, Mengjun; Zhang, Sheng; Liu, Keqin; Hu, Haitao; Selim, Magdy; Lou, Min

    2016-05-01

    Most previous studies have used single-phase computed tomographic angiography to detect the spot sign, a marker for hematoma expansion (HE) in spontaneous intracerebral hemorrhage. We investigated whether defining the spot sign based on timing on perfusion computed tomography (CTP) would improve its specificity for predicting HE. We prospectively enrolled supratentorial spontaneous intracerebral hemorrhage patients who underwent CTP within 6 hours of onset. Logistic regression was performed to assess the risk factors for HE and poor outcome. Predictive performance of individual CTP spot sign characteristics were examined with receiver operating characteristic analysis. Sixty-two men and 21 women with spontaneous intracerebral hemorrhage were included in this analysis. Spot sign was detected in 46% (38/83) of patients. Receiver operating characteristic analysis indicated that the timing of spot sign occurrence on CTP had the greatest area under receiver operating characteristic curve for HE (0.794; 95% confidence interval, 0.630-0.958; P=0.007); the cutoff time was 23.13 seconds. On multivariable analysis, the presence of early-occurring spot sign (ie, spot sign before 23.13 seconds) was an independent predictor not only of HE (odds ratio=28.835; 95% confidence interval, 6.960-119.458; P<0.001), but also of mortality at 3 months (odds ratio=22.377; 95% confidence interval, 1.773-282.334; P=0.016). Moreover, the predictive performance showed that the redefined early-occurring spot sign maintained a higher specificity for HE compared with spot sign (91% versus 74%). Redefining the spot sign based on timing of contrast leakage on CTP to determine early-occurring spot sign improves the specificity for predicting HE and 3-month mortality. The use of early-occurring spot sign could improve the selection of ICH patients for potential hemostatic therapy. © 2016 American Heart Association, Inc.

  16. Timing of occurrence is the most important characteristic of spot sign

    PubMed Central

    Xu, Mengjun; Zhang, Sheng; Liu, Keqin; Hu, Haitao; Selim, Magdy; Lou, Min

    2016-01-01

    Background and Purpose Most previous studies have used single-phase CT angiography (CTA) to detect the spot sign, a marker for hematoma expansion (HE) in spontaneous intracerebral hemorrhage (SICH). We investigated whether defining the spot sign based on timing on perfusion CT (CTP) would improve its specificity for predicting HE. Methods We prospectively enrolled supratentorial SICH patients who underwent CTP within 6 h of onset. Logistic regression was performed to assess the risk factors for HE and poor outcome. Predictive performance of individual CTP spot sign characteristics were examined with receiver operating characteristic (ROC) analysis. Results Sixty-two men and 21 women with SICH were included in this analysis. Spot sign was detected in 46% (38/83) of patients. ROC analysis indicated that the timing of spot sign occurrence on CTP had the greatest AUC for HE (0.794; 95% CI, 0.630-0.958; P=0.007); the cutoff time was 23.13 seconds. On multivariable analysis, the presence of early-occurring spot sign (EOSS; i.e. spot sign before 23.13 seconds) was an independent predictor, not only of HE (OR=28.835; 95% CI, 6.960-119.458; P<0.001), but also of mortality at 3 months (OR=22.377; 95% CI, 1.773-282.334; P=0.016). Moreover, the predictive performance showed that the redefined EOSS maintained a higher specificity for HE compared to spot sign (91% vs 74%). Conclusions Redefining the spot sign based on timing of contrast leakage on CTP to determine EOSS improves the specificity for predicting HE and 3-month mortality. The use of EOSS could improve the selection of ICH patients for potential hemostatic therapy. PMID:27026627

  17. Performance Evaluation of Reliable Multicast Protocol for Checkout and Launch Control Systems

    NASA Technical Reports Server (NTRS)

    Shu, Wei Wennie; Porter, John

    2000-01-01

    The overall objective of this project is to study reliability and performance of Real Time Critical Network (RTCN) for checkout and launch control systems (CLCS). The major tasks include reliability and performance evaluation of Reliable Multicast (RM) package and fault tolerance analysis and design of dual redundant network architecture.

  18. Market Earnings and Household Work: New Tests of Gender Performance Theory

    ERIC Educational Resources Information Center

    Schneider, Daniel

    2011-01-01

    I examine the contested finding that men and women engage in gender performance through housework. Prior scholarship has found a curvilinear association between earnings share and housework that has been interpreted as evidence of gender performance. I reexamine these findings by conducting the first such analysis to use high-quality time diary…

  19. A Study of Performance Support in Higher Education

    ERIC Educational Resources Information Center

    Lion, Robert W.

    2011-01-01

    Successful performance improvement efforts are closely tied to the strength and integrity of the performance analysis process. During a time when higher education institutions are facing increasing budget cuts, the ability to recruit and retain students is extremely important. For some institutions, web-based courses have been viewed as a way to…

  20. What can eye movements tell us about Symbol Digit substitution by patients with schizophrenia?

    PubMed

    Elahipanah, Ava; Christensen, Bruce K; Reingold, Eyal M

    2011-04-01

    Substitution tests are sensitive to cognitive impairment and reliably discriminate patients with schizophrenia from healthy individuals better than most other neuropsychological instruments. However, due to their multifaceted nature, substitution test scores cannot pinpoint the specific cognitive deficits that lead to poor performance. The current study investigated eye movements during performance on a substitution test in order to better understand what aspect of substitution test performance underlies schizophrenia-related impairment. Twenty-five patients with schizophrenia and 25 healthy individuals performed a computerized version of the Symbol Digit Modalities Test while their eye movements were monitored. As expected, patients achieved lower overall performance scores. Moreover, analysis of participants' eye movements revealed that patients spent more time searching for the target symbol every time they visited the key area. Patients also made more visits to the key area for each response that they made. Regression analysis suggested that patients' impaired performance on substitution tasks is primarily related to a less efficient visual search and, secondarily, to impaired memory. Copyright © 2010 Elsevier B.V. All rights reserved.

  1. Why are they late? Timing abilities and executive control among students with learning disabilities.

    PubMed

    Grinblat, Nufar; Rosenblum, Sara

    2016-12-01

    While a deficient ability to perform daily tasks on time has been reported among students with learning disabilities (LD), the underlying mechanism behind their 'being late' is still unclear. This study aimed to evaluate the organization in time, time estimation abilities, actual performance time pertaining to specific daily activities, as well as the executive functions of students with LD in comparison to those of controls, and to assess the relationships between these domains among each group. The participants were 27 students with LD, aged 20-30, and 32 gender- and age-matched controls who completed the Time Organization and Participation Scale (TOPS) and the Behavioral Rating Inventory of Executive Function-Adult version (BRIEF-A). In addition, their ability to estimate the time needed to complete the task of preparing a cup of coffee as well as their actual performance time were evaluated. The results indicated that in comparison to controls, students with LD showed significantly inferior organization in time (TOPS) and executive function abilities (BRIEF-A). Furthermore, their time estimation abilities were significantly inferior and they required significantly more time to prepare a cup of coffee. Regression analysis identified the variables that predicted organization in time and task performance time among each group. The significance of the results for both theoretical and clinical implications is discussed. What this paper adds? This study examines the underlying mechanism of the phenomena of being late among students with LD. Following a recent call for using ecologically valid assessments, the functional daily ability of students with LD to prepare a cup of coffee and to organize time were investigated. Furthermore, their time estimation and executive control abilities were examined as a possible underlying mechanism for their lateness. 
Although previous studies have indicated executive control deficits among students with LD, to our knowledge, this is the first analysis of the relationships between their executive control and time estimation deficits and their influence upon their daily function and organization in time abilities. Our findings demonstrate that students with LD need more time in order to execute simple daily activities, such as preparing a cup of coffee. Deficient working memory, retrospective time estimation ability and inhibition predicted their performance time and organization in time abilities. Therefore, this paper sheds light on the mechanism behind daily performance in time among students with LD and emphasizes the need for future development of focused intervention programs to meet their unique needs. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Neural mechanisms of coarse-to-fine discrimination in the visual cortex.

    PubMed

    Purushothaman, Gopathy; Chen, Xin; Yampolsky, Dmitry; Casagrande, Vivien A

    2014-12-01

    Vision is a dynamic process that refines the spatial scale of analysis over time, as evidenced by a progressive improvement in the ability to detect and discriminate finer details. To understand coarse-to-fine discrimination, we studied the dynamics of spatial frequency (SF) response using reverse correlation in the primary visual cortex (V1) of the primate. In a majority of V1 cells studied, preferred SF either increased monotonically with time (group 1) or changed nonmonotonically, with an initial increase followed by a decrease (group 2). Monotonic shift in preferred SF occurred with or without an early suppression at low SFs. Late suppression at high SFs always accompanied nonmonotonic SF dynamics. Bayesian analysis showed that SF discrimination performance and best discriminable SF frequencies changed with time in different ways in the two groups of neurons. In group 1 neurons, SF discrimination performance peaked on both left and right flanks of the SF tuning curve at about the same time. In group 2 neurons, peak discrimination occurred on the right flank (high SFs) later than on the left flank (low SFs). Group 2 neurons were also better discriminators of high SFs. We examined the relationship between the time at which SF discrimination performance peaked on either flank of the SF tuning curve and the corresponding best discriminable SFs in both neuronal groups. This analysis showed that the population best discriminable SF increased with time in V1. These results suggest neural mechanisms for coarse-to-fine discrimination behavior and that this process originates in V1 or earlier. Copyright © 2014 the American Physiological Society.

  3. Higher cost of single incision laparoscopic cholecystectomy due to longer operating time. A study of opportunity cost based on meta-analysis

    PubMed Central

    GIRABENT-FARRÉS, M.

    2018-01-01

    Background We aimed to calculate the opportunity cost of the operating time to demonstrate that single incision laparoscopic cholecystectomy (SILC) is more expensive than classic laparoscopic cholecystectomy (CLC). Methods We identified studies comparing use of both techniques during the period 2008–2016, and to calculate the opportunity cost, we performed another search in the same period of time with an economic evaluation of classic laparoscopy. We performed a meta-analysis of the items selected in the first review considering the cost of surgery and surgical time, and we analyzed their differences. We subsequently calculated the opportunity cost of these time differences based on the design of a cost/time variable using the data from the second literature review. Results Twenty-seven articles were selected from the first review: 26 for operating time (3,138 patients) and 3 for the cost of surgery (831 patients), and 3 articles from the second review. Both techniques have similar operating costs. Single incision laparoscopy surgery takes longer (16.90 min) to perform (p < 0.00001) and this difference represents an opportunity cost of 755.97 € (cost/time unit factor of 44.73 €/min). Conclusions SILC costs the same as CLC, but the surgery takes longer to perform, and this difference involves an opportunity cost that increases the total cost of SILC. The value of the opportunity cost of the operating time can vary the total cost of a surgical technique and it should be included in the economic evaluation to support the decision to adopt a new surgical technique. PMID:29549678
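    The opportunity-cost arithmetic can be reproduced directly from the two figures reported in the abstract (a cost/time factor of 44.73 €/min and a 16.90-min operating-time difference); the small gap to the published 755.97 € comes from rounding of the inputs:

    ```python
    cost_per_min = 44.73    # €/min, cost/time unit factor from the second review
    extra_minutes = 16.90   # additional operating time of SILC vs. CLC (meta-analysis)

    # Opportunity cost of the longer SILC operating time
    opportunity_cost = cost_per_min * extra_minutes
    print(f"{opportunity_cost:.2f} €")  # 755.94 €, vs. 755.97 € in the paper (input rounding)
    ```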

  4. Applications of Graph-Theoretic Tests to Online Change Detection

    DTIC Science & Technology

    2014-05-09

    ...assessment, crime investigation, and environmental field analysis. Our work offers a new tool for change detection that can be employed in real-time in very...this paper such MSTs and bipartite matchings. Ruth (2009) reports run times for MNBM ensembles created using Derigs' (1998) algorithm on the order of...

  5. Methodology for CFD Design Analysis of National Launch System Nozzle Manifold

    NASA Technical Reports Server (NTRS)

    Haire, Scot L.

    1993-01-01

    The current design environment dictates that high technology CFD (Computational Fluid Dynamics) analysis produce quality results in a timely manner if it is to be integrated into the design process. The design methodology outlined describes the CFD analysis of an NLS (National Launch System) nozzle film cooling manifold. The objective of the analysis was to obtain a qualitative estimate for the flow distribution within the manifold. A complex, 3D, multiple zone, structured grid was generated from a 3D CAD file of the geometry. A Euler solution was computed with a fully implicit compressible flow solver. Post processing consisted of full 3D color graphics and mass averaged performance. The result was a qualitative CFD solution that provided the design team with relevant information concerning the flow distribution in and performance characteristics of the film cooling manifold within an effective time frame. Also, this design methodology was the foundation for a quick turnaround CFD analysis of the next iteration in the manifold design.

  6. A new approach to harmonic elimination based on a real-time comparison method

    NASA Astrophysics Data System (ADS)

    Gourisetti, Sri Nikhil Gupta

    Undesired harmonics are responsible for noise in a transmission channel and for power loss in power electronics and motor control. Selective Harmonic Elimination (SHE) is a well-known method used to eliminate or suppress the unwanted harmonics between the fundamental and the carrier frequency harmonic/component, but SHE has the disadvantage that it cannot be used in real-time applications. A novel reference-carrier comparative method has been developed that can be used to generate an SPWM signal for real-time systems. A modified carrier signal is designed and tested for different carrier frequencies based on the FFT of the generated SPWM. The carrier signal may change for different fundamental-to-carrier ratios, which requires solving the equations each time. An analysis to find all possible solutions for a particular carrier frequency and fundamental amplitude is performed. It shows that there is no single global maximum; instead, several local maxima exist for a particular condition set, which makes this method less sensitive. Additionally, an attempt is made to find a universal solution that is valid for any carrier signal with a predefined fundamental amplitude. A uniform-distribution Monte Carlo sensitivity analysis is performed to measure the window, i.e., the best and worst possible solutions. The simulations are performed using MATLAB and are validated with experimental results.
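    The reference-carrier comparison at the heart of SPWM generation can be sketched as below; the modulation index, fundamental, and carrier frequency are illustrative choices, not the study's parameters:

    ```python
    import math

    def spwm(modulation_index=0.8, f_fund=50.0, f_carrier=1000.0,
             n_samples=20000, t_end=0.02):
        """Generate one fundamental period of an SPWM signal by comparing
        a sinusoidal reference against a triangular carrier (sketch only)."""
        out = []
        for i in range(n_samples):
            t = t_end * i / n_samples
            ref = modulation_index * math.sin(2 * math.pi * f_fund * t)
            # Triangular carrier sweeping [-1, 1] once per carrier period
            phase = (f_carrier * t) % 1.0
            carrier = 4 * phase - 1 if phase < 0.5 else 3 - 4 * phase
            out.append(1 if ref > carrier else 0)
        return out

    pwm = spwm()
    duty = sum(pwm) / len(pwm)  # average duty cycle over a full fundamental period
    ```

    By symmetry of the sine and triangle over a whole period, the average duty cycle sits near 0.5; the local pulse widths track the instantaneous reference amplitude.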

  7. Propulsion Powertrain Real-Time Simulation Using Hardware-in-the-Loop (HIL) for Aircraft Electric Propulsion System

    NASA Technical Reports Server (NTRS)

    Choi, Benjamin B.; Brown, Gerald V.

    2017-01-01

    It is essential to design a propulsion powertrain real-time simulator using the hardware-in-the-loop (HIL) system that emulates an electrified aircraft propulsion (EAP) systems power grid. This simulator would enable us to facilitate in-depth understanding of the system principles, to validate system model analysis and performance prediction, and to demonstrate the proof-of-concept of the EAP electrical system. This paper describes how subscale electrical machines with their controllers can mimic the power components in an EAP powertrain. In particular, three powertrain emulations are presented to mimic 1) a gas turbo-shaft engine driving a generator, consisting of two permanent magnet (PM) motors with brushless motor drives, coupled by a shaft, 2) a motor driving a propulsive fan, and 3) a turbo-shaft engine driven fan (turbofan engine) operation. As a first step towards the demonstration, experimental dynamic characterization of the two motor drive systems, coupled by a mechanical shaft, were performed. The previously developed analytical motor models [1] were then replaced with the experimental motor models to perform the real-time demonstration in the predefined flight path profiles. This technique can convert the plain motor system into a unique EAP power grid emulator that enables rapid analysis and real-time simulation performance using hardware-in-the-loop (HIL).

  8. Analysis of UAS DAA Surveillance in Fast-Time Simulations without DAA Mitigation

    NASA Technical Reports Server (NTRS)

    Thipphavong, David P.; Santiago, Confesor; Isaacson, David R.; Lee, Seung Man; Refai, Mohamad Said; Snow, James William

    2015-01-01

    Realization of the expected proliferation of Unmanned Aircraft System (UAS) operations in the National Airspace System (NAS) depends on the development and validation of performance standards for UAS Detect and Avoid (DAA) Systems. The RTCA Special Committee 228 is charged with leading the development of draft Minimum Operational Performance Standards (MOPS) for UAS DAA Systems. NASA, as a participating member of RTCA SC-228, is committed to supporting the development and validation of draft requirements for DAA surveillance system performance. A recent study conducted using NASA's ACES (Airspace Concept Evaluation System) simulation capability begins to address questions surrounding the development of draft MOPS for DAA surveillance systems. ACES simulations were conducted to study the performance of sensor systems proposed by the SC-228 DAA Surveillance sub-group. Analysis included but was not limited to: 1) number of intruders (both IFR and VFR) detected by all sensors as a function of UAS flight time, 2) number of intruders (both IFR and VFR) detected by radar alone as a function of UAS flight time, and 3) number of VFR intruders detected by all sensors as a function of UAS flight time. The results will be used by SC-228 to inform decisions about the surveillance standards of UAS DAA systems and future requirements development and validation efforts.

  9. U.S. Army physical demands study: Prevalence and frequency of performing physically demanding tasks in deployed and non-deployed settings.

    PubMed

    Boye, Michael W; Cohen, Bruce S; Sharp, Marilyn A; Canino, Maria C; Foulis, Stephen A; Larcom, Kathleen; Smith, Laurel

    2017-11-01

    To compare percentages of on-duty time spent performing physically demanding soldier tasks in non-deployed and deployed settings, and secondarily examine the number of physically demanding tasks performed among five Army combat arms occupational specialties. Job task analysis. Soldiers (n=1295; over 99% serving on active duty) across five Army jobs completed one of three questionnaires developed using reviews of job and task related documents, input from subject matter experts, observation of task performance, and conduct of focus groups. Soldiers reported estimates of the total on-duty time spent performing physically demanding tasks in both deployed and non-deployed settings. One-way analyses of variance and Duncan post-hoc tests were used to compare percentage time differences by job. Two-tailed t-tests were used to evaluate differences by setting. Frequency analyses were used to present supplementary findings. Soldiers reported performing physically demanding job-specific tasks 17.7% of the time while non-deployed and 19.6% of the time while deployed. There were significant differences in time spent on job-specific tasks across settings (p<0.05) for three of five occupational specialties. When categories of physically demanding tasks were grouped, all soldiers reported spending more time on physically demanding tasks when deployed (p<0.001). Twenty-five percent reported performing less than half the physically demanding tasks represented on the questionnaire in the last two years. Soldiers spent more time performing physically demanding tasks while deployed compared to non-deployed but spent similar amounts of time performing job-specific tasks. Published by Elsevier Ltd.

  10. Performance comparison of ISAR imaging method based on time frequency transforms

    NASA Astrophysics Data System (ADS)

    Xie, Chunjian; Guo, Chenjiang; Xu, Jiadong

    2013-03-01

    Inverse synthetic aperture radar (ISAR) can image moving targets, especially airborne targets, so it is important in air defence and missile defence systems. Time-frequency transforms have been widely applied to the ISAR imaging process. Several time-frequency transforms are introduced. Noise jamming methods are analysed; when noise jamming is added to the echo at the ISAR receiver, the image can become blurred or even unidentifiable, but the effect differs across the different time-frequency analyses. Simulation results show the performance comparison of the methods.

  11. RAPID ON-SITE METHODS OF CHEMICAL ANALYSIS

    EPA Science Inventory

    The analysis of potentially hazardous air, water and soil samples collected and shipped to service laboratories off-site is time consuming and expensive. This Chapter addresses the practical alternative of performing the requisite analytical services on-site. The most significant...

  12. Shock timing measurements and analysis in deuterium-tritium-ice layered capsule implosions on NIF

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robey, H. F.; Celliers, P. M.; Moody, J. D.

    2014-02-15

    Recent advances in shock timing experiments and analysis techniques now enable shock measurements to be performed in cryogenic deuterium-tritium (DT) ice layered capsule implosions on the National Ignition Facility (NIF). Previous measurements of shock timing in inertial confinement fusion implosions [Boehly et al., Phys. Rev. Lett. 106, 195005 (2011); Robey et al., Phys. Rev. Lett. 108, 215004 (2012)] were performed in surrogate targets, where the solid DT ice shell and central DT gas were replaced with a continuous liquid deuterium (D2) fill. These previous experiments pose two surrogacy issues: a material surrogacy due to the difference of species (D2 vs. DT) and densities of the materials used and a geometric surrogacy due to the presence of an additional interface (ice/gas) previously absent in the liquid-filled targets. This report presents experimental data and a new analysis method for validating the assumptions underlying this surrogate technique. Comparison of the data with simulation shows good agreement for the timing of the first three shocks, but reveals a considerable discrepancy in the timing of the 4th shock in DT ice layered implosions. Electron preheat is examined as a potential cause of the observed discrepancy in the 4th shock timing.

  13. Free wake analysis of hover performance using a new influence coefficient method

    NASA Technical Reports Server (NTRS)

    Quackenbush, Todd R.; Bliss, Donald B.; Ong, Ching Cho

    1990-01-01

    A new approach to the prediction of helicopter rotor performance using a free wake analysis was developed. This new method uses a relaxation process that does not suffer from the convergence problems associated with previous time marching simulations. This wake relaxation procedure was coupled to a vortex-lattice, lifting surface loads analysis to produce a novel, self-contained performance prediction code: EHPIC (Evaluation of Helicopter Performance using Influence Coefficients). The major technical features of the EHPIC code are described and a substantial amount of background information on the capabilities and proper operation of the code is supplied. Sample problems were undertaken to demonstrate the robustness and flexibility of the basic approach. Also, a performance correlation study was carried out to establish the breadth of applicability of the code, with very favorable results.

  14. A conflict analysis of 4D descent strategies in a metered, multiple-arrival route environment

    NASA Technical Reports Server (NTRS)

    Izumi, K. H.; Harris, C. S.

    1990-01-01

    A conflict analysis was performed on multiple arrival traffic at a typical metered airport. The Flow Management Evaluation Model (FMEM) was used to simulate arrival operations using Denver Stapleton's arrival route structure. Sensitivities of conflict performance to three different 4-D descent strategies (clean-idle Mach/Constant AirSpeed (CAS), constant descent angle Mach/CAS and energy optimal) were examined for three traffic mixes represented by those found at Denver Stapleton, John F. Kennedy and typical en route metering (ERM) airports. The Monte Carlo technique was used to generate simulation entry point times. Analysis results indicate that the clean-idle descent strategy offers the best compromise in overall performance. Performance measures primarily include susceptibility to conflict and conflict severity. Fuel usage performance is extrapolated from previous descent strategy studies.

  15. The BCD of response time analysis in experimental economics.

    PubMed

    Spiliopoulos, Leonidas; Ortmann, Andreas

    2018-01-01

    For decisions in the wild, time is of the essence. Available decision time is often cut short through natural or artificial constraints, or is impinged upon by the opportunity cost of time. Experimental economists have only recently begun to conduct experiments with time constraints and to analyze response time (RT) data, in contrast to experimental psychologists. RT analysis has proven valuable for the identification of individual and strategic decision processes including identification of social preferences in the latter case, model comparison/selection, and the investigation of heuristics that combine speed and performance by exploiting environmental regularities. Here we focus on the benefits, challenges, and desiderata of RT analysis in strategic decision making. We argue that unlocking the potential of RT analysis requires the adoption of process-based models instead of outcome-based models, and discuss how RT in the wild can be captured by time-constrained experiments in the lab. We conclude that RT analysis holds considerable potential for experimental economics, deserves greater attention as a methodological tool, and promises important insights on strategic decision making in naturally occurring environments.

  16. Time/frequency systems.

    DOT National Transportation Integrated Search

    1971-06-01

    The report summarizes the work performed at DOT/TSC on the Time/Frequency ATC System study project. Principal emphasis in this report is given to the evaluation and analysis of the technological risk areas. A survey and description of proposed T/F sy...

  17. Time-series analysis to study the impact of an intersection on dispersion along a street canyon.

    PubMed

    Richmond-Bryant, Jennifer; Eisner, Alfred D; Hahn, Intaek; Fortune, Christopher R; Drake-Richman, Zora E; Brixey, Laurie A; Talih, M; Wiener, Russell W; Ellenson, William D

    2009-12-01

    This paper presents data analysis from the Brooklyn Traffic Real-Time Ambient Pollutant Penetration and Environmental Dispersion (B-TRAPPED) study to assess the transport of ultrafine particulate matter (PM) across urban intersections. Experiments were performed in a street canyon perpendicular to a highway in Brooklyn, NY, USA. Real-time ultrafine PM samplers were positioned on either side of an intersection at multiple locations along a street to collect time-series number concentration data. Meteorology equipment was positioned within the street canyon and at an upstream background site to measure wind speed and direction. Time-series analysis was performed on the PM data to compute a transport velocity along the direction of the street for the cases where background winds were parallel and perpendicular to the street. The data were analyzed for sampler pairs located (1) on opposite sides of the intersection and (2) on the same block. The time-series analysis demonstrated along-street transport, including across the intersection when background winds were parallel to the street canyon and there was minimal transport and no communication across the intersection when background winds were perpendicular to the street canyon. Low but significant values of the cross-correlation function (CCF) underscore the turbulent nature of plume transport along the street canyon. The low correlations suggest that flow switching around corners or traffic-induced turbulence at the intersection may have aided dilution of the PM plume from the highway. This observation supports similar findings in the literature. Furthermore, the time-series analysis methodology applied in this study is introduced as a technique for studying spatiotemporal variation in the urban microscale environment.
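    The along-street transport described above rests on locating the lag that maximizes the cross-correlation function (CCF) between two samplers' concentration series; a minimal sketch with synthetic data (the sampler spacing, time step, and signals are invented, not the B-TRAPPED data):

    ```python
    import random

    def ccf_lag(upwind, downwind, max_lag):
        """Return the lag (in samples) at which the cross-correlation
        between the upwind and downwind concentration series peaks."""
        n = len(upwind)
        def corr(lag):
            a = upwind[:n - lag]
            b = downwind[lag:lag + len(a)]
            ma, mb = sum(a) / len(a), sum(b) / len(b)
            num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
            da = sum((x - ma) ** 2 for x in a) ** 0.5
            db = sum((y - mb) ** 2 for y in b) ** 0.5
            return num / (da * db) if da and db else 0.0
        return max(range(max_lag + 1), key=corr)

    # Synthetic plume: the downwind sampler sees the same signal 5 samples later.
    random.seed(1)
    signal = [random.random() for _ in range(220)]
    upwind = signal[20:220]     # 200 samples at the upwind sampler
    downwind = signal[15:215]   # identical series delayed by 5 samples
    lag = ccf_lag(upwind, downwind, max_lag=10)

    sampler_spacing_m, dt_s = 30.0, 1.0   # hypothetical geometry and sampling step
    transport_velocity = sampler_spacing_m / (lag * dt_s)   # m/s along the street
    ```

    On real street-canyon data the peak CCF is low (as the paper notes, plume transport is turbulent), so the peak location, not its magnitude, carries the transport information.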

  18. In vivo Real-Time Mass Spectrometry for Guided Surgery Application

    NASA Astrophysics Data System (ADS)

    Fatou, Benoit; Saudemont, Philippe; Leblanc, Eric; Vinatier, Denis; Mesdag, Violette; Wisztorski, Maxence; Focsa, Cristian; Salzet, Michel; Ziskind, Michael; Fournier, Isabelle

    2016-05-01

    Here we describe a new instrument (SpiderMass) designed for in vivo and real-time analysis. In this instrument, ion production is performed remotely from the MS instrument and the generated ions are transported in real-time to the MS analyzer. Ion production is promoted by Resonant Infrared Laser Ablation (RIR-LA), based on the highly effective excitation of O-H bonds in water molecules naturally present in most biological samples. The retrieved molecular patterns are specific to the cell phenotypes, and benign versus cancer regions of patient biopsies can be easily differentiated. We also demonstrate by analysis of human skin that SpiderMass can be used under in vivo conditions with minimal damage and pain. Furthermore, SpiderMass can also be used for real-time drug metabolism and pharmacokinetic (DMPK) analysis or food safety topics. SpiderMass is thus the first MS-based system designed for in vivo real-time analysis under minimally invasive conditions.

  19. Guidance simulation and test support for differential GPS flight experiment

    NASA Technical Reports Server (NTRS)

    Geier, G. J.; Loomis, P. V. W.; Cabak, A.

    1987-01-01

    Three separate tasks which supported the test preparation, test operations, and post test analysis of the NASA Ames flight test evaluation of the differential Global Positioning System (GPS) are presented. Task 1 consisted of a navigation filter design, coding, and testing to optimally make use of GPS in a differential mode. The filter can be configured to accept inputs from external sensors such as an accelerometer and a barometric or radar altimeter. The filter runs in real time onboard a NASA helicopter. It processes raw pseudo- and delta-range measurements from a single channel sequential GPS receiver. The Kalman filter software interfaces are described in detail, followed by a description of the filter algorithm, including the basic propagation and measurement update equations. The performance during flight tests is reviewed and discussed. Task 2 describes a refinement performed on the lateral and vertical steering algorithms developed on a previous contract. The refinements include modification of the internal logic to allow more diverse inflight initialization procedures, further data smoothing and compensation for system induced time delays. Task 3 describes the TAU Corp participation in the analysis of the real time Kalman navigation filter. The performance was compared to that of the Z-set filter in flight and to the laser tracker position data during post test analysis. This analysis allowed a more optimum selection of the parameters of the filter.
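    The propagation and measurement update equations mentioned for Task 1 follow the standard Kalman cycle; a deliberately scalar sketch (the real filter is multi-state and processes pseudo- and delta-range, and all numbers here are invented):

    ```python
    def kf_step(x, p, z, q, r):
        """One scalar Kalman cycle: propagate the state estimate x with
        process noise variance q, then update with measurement z of
        variance r."""
        # Propagation (identity dynamics for this sketch)
        p = p + q
        # Measurement update
        k = p / (p + r)          # Kalman gain
        x = x + k * (z - x)      # blend prediction and measurement
        p = (1 - k) * p          # posterior variance shrinks
        return x, p

    x, p = 0.0, 100.0                    # poor initial 1-D position estimate
    for z in [10.2, 9.8, 10.1, 10.0]:    # noisy range-derived positions
        x, p = kf_step(x, p, z, q=0.01, r=1.0)
    ```

    After a few measurements the estimate converges near the measured positions and the posterior variance falls well below the measurement variance.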

  20. Parallelizing Timed Petri Net simulations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1993-01-01

    The possibility of using parallel processing to accelerate the simulation of Timed Petri Nets (TPN's) was studied. It was recognized that complex system development tools often transform system descriptions into TPN's or TPN-like models, which are then simulated to obtain information about system behavior. Viewed this way, it was important that the parallelization of TPN's be as automatic as possible, to admit the possibility of the parallelization being embedded in the system design tool. Later years of the grant were devoted to examining the problem of joint performance and reliability analysis, to explore whether both types of analysis could be accomplished within a single framework. In this final report, the results of our studies are summarized. We believe that the problem of parallelizing TPN's automatically for MIMD architectures has been almost completely solved for a large and important class of problems. Our initial investigations into joint performance/reliability analysis are two-fold; it was shown that Monte Carlo simulation, with importance sampling, offers promise of joint analysis in the context of a single tool, and methods for the parallel simulation of general Continuous Time Markov Chains, a model framework within which joint performance/reliability models can be cast, were developed. However, very much more work is needed to determine the scope and generality of these approaches. The results obtained in our two studies, future directions for this type of work, and a list of publications are included.
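    A Continuous Time Markov Chain of the kind mentioned for joint performance/reliability modeling can be simulated by sampling exponential holding times; a minimal single-trajectory sketch (the two-state repair model and its rates are invented, and none of the report's parallelization or importance sampling is shown):

    ```python
    import random

    def simulate_ctmc(rates, state, t_end, rng):
        """Simulate one trajectory of a continuous-time Markov chain.
        `rates[s]` maps state s to {next_state: transition_rate}.
        Returns the state occupied at time t_end."""
        t = 0.0
        while True:
            out = rates[state]
            total = sum(out.values())
            if total == 0:                 # absorbing state
                return state
            t += rng.expovariate(total)    # exponential holding time
            if t >= t_end:
                return state
            # Choose the next state proportionally to its rate
            r = rng.random() * total
            for nxt, rate in out.items():
                r -= rate
                if r <= 0:
                    state = nxt
                    break

    # Two-state repair model: 0 = up (fails at rate 0.1), 1 = down (repaired at rate 1.0)
    rates = {0: {1: 0.1}, 1: {0: 1.0}}
    rng = random.Random(42)
    up = sum(simulate_ctmc(rates, 0, 100.0, rng) == 0 for _ in range(2000))
    availability = up / 2000   # should approach 1.0 / (1.0 + 0.1) ≈ 0.91
    ```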

  1. Surgical task analysis of simulated laparoscopic cholecystectomy with a navigation system.

    PubMed

    Sugino, T; Kawahira, H; Nakamura, R

    2014-09-01

    Advanced surgical procedures, which have become complex and difficult, increase the burden of surgeons. Quantitative analysis of surgical procedures can improve training, reduce variability, and enable optimization of surgical procedures. To this end, a surgical task analysis system was developed that uses only surgical navigation information. Division of the surgical procedure, task progress analysis, and task efficiency analysis were done. First, the procedure was divided into five stages. Second, the operating time and progress rate were recorded to document task progress during specific stages, including the dissecting task. Third, the speed of the surgical instrument motion (mean velocity and acceleration), as well as the size and overlap ratio of the approximate ellipse of the location log data distribution, was computed to estimate the task efficiency during each stage. These analysis methods were evaluated based on experimental validation with two groups of surgeons, i.e., skilled and "other" surgeons. The performance metrics and analytical parameters included incidents during the operation, the surgical environment, and the surgeon's skills or habits. Comparison of groups revealed that skilled surgeons tended to perform the procedure in less time and involved smaller regions; they also manipulated the surgical instruments more gently. Surgical task analysis developed for quantitative assessment of surgical procedures and surgical performance may provide practical methods and metrics for objective evaluation of surgical expertise.
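    The task-efficiency metrics described (mean instrument velocity and an approximate ellipse of the location-log distribution) can be estimated from navigation log positions roughly as follows; the axis-aligned 2-sigma ellipse is a simplifying assumption, since the abstract does not specify the ellipse construction:

    ```python
    import math

    def motion_metrics(positions, dt):
        """Mean instrument-tip speed and the area of an axis-aligned
        2-sigma ellipse of the tip-location distribution (sketch only)."""
        # Mean speed from successive tip positions
        speeds = [math.hypot(x1 - x0, y1 - y0) / dt
                  for (x0, y0), (x1, y1) in zip(positions, positions[1:])]
        mean_speed = sum(speeds) / len(speeds)
        # Axis-aligned 2-sigma ellipse of the location distribution
        xs = [p[0] for p in positions]
        ys = [p[1] for p in positions]
        mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
        sx = (sum((x - mx) ** 2 for x in xs) / len(xs)) ** 0.5
        sy = (sum((y - my) ** 2 for y in ys) / len(ys)) ** 0.5
        area = math.pi * (2 * sx) * (2 * sy)
        return mean_speed, area

    # Tiny synthetic log: steady 1 unit/s motion along a straight line
    positions = [(0, 0), (1, 0), (2, 0), (3, 0)]
    mean_speed, area = motion_metrics(positions, dt=1.0)
    ```

    A smaller ellipse area and gentler speeds would then correspond to the tighter, calmer instrument handling the study attributes to skilled surgeons.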

  2. Optimization of Region of Interest Drawing for Quantitative Analysis: Differentiation Between Benign and Malignant Breast Lesions on Contrast-Enhanced Sonography.

    PubMed

    Nakata, Norio; Ohta, Tomoyuki; Nishioka, Makiko; Takeyama, Hiroshi; Toriumi, Yasuo; Kato, Kumiko; Nogi, Hiroko; Kamio, Makiko; Fukuda, Kunihiko

    2015-11-01

    This study was performed to evaluate the diagnostic utility of quantitative analysis of benign and malignant breast lesions using contrast-enhanced sonography. Contrast-enhanced sonography using the perflubutane-based contrast agent Sonazoid (Daiichi Sankyo, Tokyo, Japan) was performed in 94 pathologically proven palpable breast mass lesions, which could be depicted with B-mode sonography. Quantitative analyses using the time-intensity curve on contrast-enhanced sonography were performed in 5 region of interest (ROI) types (manually traced ROI and circular ROIs of 5, 10, 15, and 20 mm in diameter). The peak signal intensity, initial slope, time to peak, positive enhancement integral, and wash-out ratio were investigated in each ROI. There were significant differences between benign and malignant lesions in the time to peak (P < .05), initial slope (P < .001), and positive enhancement integral (P < .05) for the manual ROI. Significant differences were found between benign and malignant lesions in the time to peak (P < .05) for the 5-mm ROI; the time to peak (P < .05) and initial slope (P < .05) for the 10-mm ROI; absolute values of the peak signal intensity (P < .05), time to peak (P < .01), and initial slope (P < .005) for the 15-mm ROI; and the time to peak (P < .05) and initial slope (P < .05) for the 20-mm ROI. There were no statistically significant differences in any wash-out ratio values for the 5 ROI types. Kinetic analysis using contrast-enhanced sonography is useful for differentiation between benign and malignant breast lesions. © 2015 by the American Institute of Ultrasound in Medicine.
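    The time-intensity-curve (TIC) parameters compared above have standard definitions that can be sketched directly. The following is an illustrative computation only, assuming a sampled per-ROI intensity curve; the function name and the exact definitions (e.g., integrating enhancement over the whole curve) are assumptions, not the study's software.

```python
def tic_metrics(times, intensities):
    """Common TIC metrics: peak intensity, time to peak, initial
    slope, positive enhancement integral, and wash-out ratio."""
    baseline = intensities[0]
    peak = max(intensities)
    t_peak = times[intensities.index(peak)]
    # Initial slope: mean rate of rise from baseline to peak.
    slope = (peak - baseline) / (t_peak - times[0]) if t_peak > times[0] else 0.0
    # Positive enhancement integral: trapezoidal area above baseline.
    enh = [max(v - baseline, 0.0) for v in intensities]
    pei = sum((enh[i] + enh[i + 1]) / 2 * (times[i + 1] - times[i])
              for i in range(len(times) - 1))
    # Wash-out ratio: fractional fall from peak to the final sample.
    washout = (peak - intensities[-1]) / peak if peak else 0.0
    return {"peak": peak, "time_to_peak": t_peak, "initial_slope": slope,
            "pei": pei, "washout_ratio": washout}
```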

  3. Numerical flow analysis of axial flow compressor for steady and unsteady flow cases

    NASA Astrophysics Data System (ADS)

    Prabhudev, B. M.; Satish kumar, S.; Rajanna, D.

    2017-07-01

    Performance of a jet engine depends on the performance of its compressor. This paper presents a numerical study of the performance characteristics of an axial compressor. The test rig is located at the CSIR laboratory, Bangalore. Flow domains are meshed and the fluid dynamic equations are solved using the ANSYS package. Analysis is done for six different speeds and for operating conditions such as choke, maximum efficiency, and the point just before stall. Different plots are compared and results are discussed. Shock displacement, vortex flows, and leakage patterns are presented, along with an unsteady FFT plot and a time-step plot.

  4. Working parameters affecting earth-air heat exchanger (EAHE) system performance for passive cooling: A review

    NASA Astrophysics Data System (ADS)

    Darius, D.; Misaran, M. S.; Rahman, Md. M.; Ismail, M. A.; Amaludin, A.

    2017-07-01

    The study of the effect of working parameters such as pipe material, pipe length, pipe diameter, burial depth of the pipe, air flow rate, and soil type on the thermal performance of earth-air heat exchanger (EAHE) systems is crucial to ensure that thermal comfort can be achieved. In past decades, researchers have developed numerical models for the analysis of EAHE systems. Two-dimensional models replaced the early numerical models of the 1990s, and in recent times more advanced three-dimensional analysis, specifically Computational Fluid Dynamics (CFD) simulation, has been applied to EAHE systems. This paper reviews the models previously used to analyse EAHE systems and the working parameters that affect EAHE thermal performance, as of February 2017. Recent findings on the parameters affecting EAHE performance are also presented and discussed. With the advent of CFD methods, investigational work has shifted toward modelling and simulation, which saves time and cost. Comprehension of the EAHE working parameters and their effect on system performance is largely established. However, studies of soil type and soil characteristics in relation to EAHE performance are surprisingly scarce. Future studies should therefore focus on the effect of soil characteristics, such as moisture content, soil density, and soil type, on the thermal performance of EAHE systems.

  5. A statistical analysis of the daily streamflow hydrograph

    NASA Astrophysics Data System (ADS)

    Kavvas, M. L.; Delleur, J. W.

    1984-03-01

    In this study a periodic statistical analysis of daily streamflow data in Indiana, U.S.A., was performed to gain new insight into the stochastic structure that describes the daily streamflow process. This analysis was performed through the periodic mean and covariance functions of the daily streamflows, the time- and peak-discharge-dependent recession limb of the daily streamflow hydrograph, the time- and discharge-exceedance-level (DEL)-dependent probability distribution of the hydrograph peak interarrival time, and the time-dependent probability distribution of the time to peak discharge. Some new statistical estimators were developed and used in this study. In general, this study has shown that: (a) the persistence properties of daily flows depend on the storage state of the basin at the specified time origin of the flow process; (b) the daily streamflow process is time irreversible; (c) the probability distribution of the daily hydrograph peak interarrival time depends both on the occurrence time of the peak from which the interarrival time originates and on the discharge exceedance level; and (d) if the daily streamflow process is modeled as the release from a linear watershed storage, this release should depend on the state of the storage and on the time of the release, as the persistence properties and the recession limb decay rates were observed to change with the state of the watershed storage and time. Therefore, a time-varying reservoir system needs to be considered if the daily streamflow process is to be modeled as the release from a linear watershed storage.
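    The periodic mean and covariance functions mentioned above are computed phase by phase: each day-of-cycle gets its own statistics, estimated across all cycles. A minimal sketch, not the study's estimators, assuming a complete series of whole cycles and illustrative function names:

```python
def periodic_mean(series, period):
    """Mean of each phase (e.g., each day of the year) across cycles."""
    n = len(series) // period
    return [sum(series[c * period + p] for c in range(n)) / n
            for p in range(period)]

def periodic_lag_cov(series, period, lag=1):
    """Lag-`lag` covariance computed separately for each phase, so the
    dependence structure may vary with the time of year."""
    n = len(series) // period
    mu = periodic_mean(series, period)
    covs = []
    for p in range(period):
        prods = []
        for c in range(n):
            i = c * period + p
            j = i - lag
            if j >= 0:
                prods.append((series[i] - mu[p]) * (series[j] - mu[j % period]))
        covs.append(sum(prods) / len(prods))
    return covs
```

    A phase-dependent covariance of this kind is what distinguishes the periodic analysis from a single stationary autocovariance estimate.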

  6. Virtual reality, ultrasound-guided liver biopsy simulator: development and performance discrimination.

    PubMed

    Johnson, S J; Hunt, C M; Woolnough, H M; Crawshaw, M; Kilkenny, C; Gould, D A; England, A; Sinha, A; Villard, P F

    2012-05-01

    The aim of this article was to identify and prospectively investigate simulated ultrasound-guided targeted liver biopsy performance metrics as differentiators between levels of expertise in interventional radiology. Task analysis produced detailed procedural step documentation allowing identification of critical procedure steps and performance metrics for use in a virtual reality ultrasound-guided targeted liver biopsy procedure. Consultant (n=14; male=11, female=3) and trainee (n=26; male=19, female=7) scores on the performance metrics were compared. Ethical approval was granted by the Liverpool Research Ethics Committee (UK). Independent t-tests and analysis of variance (ANOVA) investigated differences between groups. Independent t-tests revealed significant differences between trainees and consultants on three performance metrics: targeting, p=0.018, t=-2.487 (-2.040 to -0.207); probe usage time, p=0.040, t=2.132 (11.064 to 427.983); mean needle length in beam, p=0.029, t=-2.272 (-0.028 to -0.002). ANOVA reported significant differences across years of experience (0-1, 1-2, 3+ years) on seven performance metrics: no-go area touched, p=0.012; targeting, p=0.025; length of session, p=0.024; probe usage time, p=0.025; total needle distance moved, p=0.038; number of skin contacts, p<0.001; total time in no-go area, p=0.008. More experienced participants consistently received better performance scores on all 19 performance metrics. It is possible to measure and monitor performance using simulation, with performance metrics providing feedback on skill level and differentiating levels of expertise. However, a transfer of training study is required.

  7. PLATSIM: A Simulation and Analysis Package for Large-Order Flexible Systems. Version 2.0

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Kenny, Sean P.; Giesy, Daniel P.

    1997-01-01

    The software package PLATSIM provides efficient time and frequency domain analysis of large-order generic space platforms. PLATSIM can perform open-loop analysis or closed-loop analysis with linear or nonlinear control system models. PLATSIM exploits the particular form of sparsity of the plant matrices for very efficient linear and nonlinear time domain analysis, as well as frequency domain analysis. A new, original algorithm for the efficient computation of open-loop and closed-loop frequency response functions for large-order systems has been developed and is implemented within the package. Furthermore, a novel and efficient jitter analysis routine which determines jitter and stability values from time simulations in a very efficient manner has been developed and is incorporated in the PLATSIM package. In the time domain analysis, PLATSIM simulates the response of the space platform to disturbances and calculates the jitter and stability values from the response time histories. In the frequency domain analysis, PLATSIM calculates frequency response function matrices and provides the corresponding Bode plots. The PLATSIM software package is written in MATLAB script language. A graphical user interface is developed in the package to provide convenient access to its various features.

  8. Microwave-immobilized polybutadiene stationary phase for reversed-phase high-performance liquid chromatography.

    PubMed

    Lopes, Nilva P; Collins, Kenneth E; Jardim, Isabel C S F

    2004-03-19

    Polybutadiene (PBD) has been immobilized on high-performance liquid chromatography (HPLC) silica by microwave radiation at various power levels (52-663 W) and actuation times (3-60 min). Columns prepared from these reversed-phase HPLC materials, as well as from similar non-irradiated materials, were tested with standard sample mixtures and characterized by elemental analysis (%C) and infrared spectroscopy. A microwave irradiation of 20 min at 663 W gives a layer of immobilized PBD that presented good performance. Longer irradiation times give thicker immobilized layers having less favorable chromatographic properties.

  9. Solar Total Energy Project (STEP) Performance Analysis of High Temperature Energy Storage Subsystem

    NASA Technical Reports Server (NTRS)

    Moore, D. M.

    1984-01-01

    The 1982 milestones and lessons learned; performance in 1983; a typical day's operation; collector field performance and thermal losses; and formal testing are highlighted. An initial test characterizing the high temperature storage (HTS) subsystem is emphasized. Its primary element is an 11,000-gallon storage tank that provides energy to the steam generator during transient solar conditions or to extend operating time. Overnight thermal losses were analyzed. The length of time the system can operate at various levels of cogeneration using stored energy is reviewed.

  10. Comparison of detrending methods for fluctuation analysis in hydrology

    NASA Astrophysics Data System (ADS)

    Zhang, Qiang; Zhou, Yu; Singh, Vijay P.; Chen, Yongqin David

    2011-03-01

    Trends within a hydrologic time series can significantly influence the scaling results of fluctuation analysis, such as rescaled range (RS) analysis and (multifractal) detrended fluctuation analysis (MF-DFA). Therefore, removal of trends is important in the study of scaling properties of the time series. In this study, three detrending methods, the adaptive detrending algorithm (ADA), a Fourier-based method, and the average removing technique, were evaluated by analyzing numerically generated series and observed streamflow series with an obvious, relatively regular periodic trend. Results indicated that: (1) the Fourier-based detrending method and ADA are similar in detrending practice and, given proper parameters, can produce similarly satisfactory results; (2) series detrended by the Fourier-based method and ADA lose fluctuation information at larger time scales, and the location of crossover points is heavily affected by the chosen parameters of these two methods; and (3) the average removing method has an advantage over the other two methods in that the fluctuation information at larger time scales is kept well, an indication of relatively reliable detrending performance. In addition, the average removing method performed reasonably well in detrending a time series with regular periods or trends. In this sense, the average removing method should be preferred in the study of scaling properties of hydrometeorological series with a relatively regular periodic trend using MF-DFA.
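    The average removing technique favored above amounts to subtracting, from each observation, the mean of all observations at the same phase of the periodic cycle. A minimal sketch under the assumption of a known integer period and a series containing whole cycles (the function name is illustrative):

```python
def average_removing_detrend(series, period):
    """Remove a regular periodic trend by subtracting the mean of
    each phase (e.g., each calendar day) taken across all cycles."""
    n = len(series) // period
    # Mean of the series at each phase of the cycle.
    phase_means = [sum(series[c * period + p] for c in range(n)) / n
                   for p in range(period)]
    # Residual series: observation minus its phase mean.
    return [x - phase_means[i % period] for i, x in enumerate(series)]
```

    Unlike Fourier filtering, this operation touches only the periodic component, which is why the residual retains fluctuation information at larger time scales.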

  11. Detection of myocardial ischemia by automated, motion-corrected, color-encoded perfusion maps compared with visual analysis of adenosine stress cardiovascular magnetic resonance imaging at 3 T: a pilot study.

    PubMed

    Doesch, Christina; Papavassiliu, Theano; Michaely, Henrik J; Attenberger, Ulrike I; Glielmi, Christopher; Süselbeck, Tim; Fink, Christian; Borggrefe, Martin; Schoenberg, Stefan O

    2013-09-01

    The purpose of this study was to compare automated, motion-corrected, color-encoded (AMC) perfusion maps with qualitative visual analysis of adenosine stress cardiovascular magnetic resonance imaging for detection of flow-limiting stenoses. Myocardial perfusion measurements applying the standard adenosine stress imaging protocol and a saturation-recovery temporal generalized autocalibrating partially parallel acquisition (t-GRAPPA) turbo fast low angle shot (Turbo FLASH) magnetic resonance imaging sequence were performed in 25 patients using a 3.0-T MAGNETOM Skyra (Siemens Healthcare Sector, Erlangen, Germany). Perfusion studies were analyzed using AMC perfusion maps and qualitative visual analysis. Angiographically detected coronary artery (CA) stenoses greater than 75% or 50% or more with a myocardial perfusion reserve index less than 1.5 were considered as hemodynamically relevant. Diagnostic performance and time requirement for both methods were compared. Interobserver and intraobserver reliability were also assessed. A total of 29 CA stenoses were included in the analysis. Sensitivity, specificity, positive predictive value, negative predictive value, and accuracy for detection of ischemia on a per-patient basis were comparable using the AMC perfusion maps compared to visual analysis. On a per-CA territory basis, the attribution of an ischemia to the respective vessel was facilitated using the AMC perfusion maps. Interobserver and intraobserver reliability were better for the AMC perfusion maps (concordance correlation coefficient, 0.94 and 0.93, respectively) compared to visual analysis (concordance correlation coefficient, 0.73 and 0.79, respectively). In addition, in comparison to visual analysis, the AMC perfusion maps were able to significantly reduce analysis time from 7.7 (3.1) to 3.2 (1.9) minutes (P < 0.0001). 
The AMC perfusion maps yielded a diagnostic performance on a per-patient and on a per-CA territory basis comparable with the visual analysis. Furthermore, this approach demonstrated higher interobserver and intraobserver reliability as well as a better time efficiency when compared to visual analysis.

  12. Performance of the new automated Abbott RealTime MTB assay for rapid detection of Mycobacterium tuberculosis complex in respiratory specimens.

    PubMed

    Chen, J H K; She, K K K; Kwong, T-C; Wong, O-Y; Siu, G K H; Leung, C-C; Chang, K-C; Tam, C-M; Ho, P-L; Cheng, V C C; Yuen, K-Y; Yam, W-C

    2015-09-01

    The automated high-throughput Abbott RealTime MTB real-time PCR assay has recently been launched for Mycobacterium tuberculosis complex (MTBC) clinical diagnosis. This study evaluated its performance. We first compared its diagnostic performance with that of the Roche Cobas TaqMan MTB assay on 214 clinical respiratory specimens. Prospective analysis of a total of 520 specimens was then performed to further evaluate the Abbott assay. The Abbott assay showed a lower limit of detection of 22.5 AFB/ml, which was more sensitive than the Cobas assay (167.5 AFB/ml). The two assays demonstrated a significant difference in diagnostic performance (McNemar's test; P = 0.0034), with the Abbott assay presenting a significantly higher area under the curve (AUC) than the Cobas assay (1.000 vs 0.880; P = 0.0002). The Abbott assay demonstrated extremely low PCR inhibition on clinical respiratory specimens and required only a very short manual handling time (0.5 h), which could help improve laboratory management. In the prospective analysis, the overall estimates of sensitivity and specificity of the Abbott assay were both 100% among smear-positive specimens, whereas among smear-negative specimens they were 96.7% and 96.1%, respectively. No cross-reactivity with non-tuberculosis mycobacterial species was observed. The superior sensitivity of the Abbott assay for detecting MTBC in smear-negative specimens could further minimize the risk of false-negative MTBC detection. The new Abbott RealTime MTB assay has good diagnostic performance and can be a useful tool for rapid MTBC detection in clinical laboratories.
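    The sensitivity and specificity estimates quoted above follow from standard 2x2 contingency-table definitions. A generic sketch for illustration (not tied to this study's data; the function name is assumed):

```python
def diagnostic_performance(tp, fp, fn, tn):
    """Standard diagnostic metrics from a 2x2 table of true/false
    positives and negatives against a reference standard."""
    return {
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }
```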

  13. Modeling reciprocal team cohesion-performance relationships, as impacted by shared leadership and members' competence.

    PubMed

    Mathieu, John E; Kukenberger, Michael R; D'Innocenzo, Lauren; Reilly, Greg

    2015-05-01

    Despite the lengthy history of team cohesion-performance research, little is known about their reciprocal relationships over time. Using meta-analysis, we synthesize findings from 17 CLP design studies, and analyze their results using SEM. Results support that team cohesion and performance are related reciprocally with each other over time. We then used longitudinal data from 205 members of 57 student teams who competed in a complex business simulation over 10 weeks, to test: (a) whether team cohesion and performance were related reciprocally over multiple time periods, (b) the relative magnitude of those relationships, and (c) whether they were stable over time. We also considered the influence of team members' academic competence and degree of shared leadership on these dynamics. As anticipated, cohesion and performance were related positively, and reciprocally, over time. However, the cohesion → performance relationship was significantly higher than the performance → cohesion relationship. Moreover, the cohesion → performance relationship grew stronger over time whereas the performance → cohesion relationship remained fairly consistent over time. As expected, shared leadership related positively to team cohesion but not directly to their performance; whereas average team member academic competence related positively to team performance but was unrelated to team cohesion. Finally, we conducted and report a replication using a second sample of students competing in a business simulation. Our earlier substantive relationships were mostly replicated, and we illustrated the dynamic temporal properties of shared leadership. We discuss these findings in terms of theoretical importance, applied implications, and directions for future research. (c) 2015 APA, all rights reserved.

  14. Smart Sampling and HPC-based Probabilistic Look-ahead Contingency Analysis Implementation and its Evaluation with Real-world Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Etingov, Pavel V.; Ren, Huiying

    This paper describes a probabilistic look-ahead contingency analysis application that incorporates smart sampling and high-performance computing (HPC) techniques. Smart sampling techniques are implemented to effectively represent the structure and statistical characteristics of uncertainty introduced by different sources in the power system. They can significantly reduce the data set size required for multiple look-ahead contingency analyses, and therefore reduce the time required to compute them. HPC techniques are used to further reduce computational time. These two techniques enable a predictive capability that forecasts the impact of various uncertainties on potential transmission limit violations. The developed package has been tested with real-world data from the Bonneville Power Administration. Case study results are presented to demonstrate the performance of the applications developed.

  15. VTOL shipboard letdown guidance system analysis

    NASA Technical Reports Server (NTRS)

    Phatak, A. V.; Karmali, M. S.

    1983-01-01

    Alternative letdown guidance strategies are examined for landing a VTOL aircraft aboard a small aviation ship under adverse environmental conditions. Off-line computer simulation of the shipboard landing task is used to assess the relative merits of the proposed guidance schemes. The touchdown performance of a nominal constant rate of descent (CROD) letdown strategy serves as a benchmark for ranking the performance of the alternative letdown schemes. Analysis of ship motion time histories indicates the existence of an alternating sequence of quiescent and rough motions, called lulls and swells. A real-time algorithm for lull/swell classification based upon ship motion pattern features is developed. The classification algorithm is used to issue a go/no-go signal indicating the initiation and termination of an acceptable landing window. Simulation results show that such a go/no-go, pattern-based letdown guidance strategy improves touchdown performance.

  16. High stress, lack of sleep, low school performance, and suicide attempts are associated with high energy drink intake in adolescents.

    PubMed

    Kim, So Young; Sim, Songyong; Choi, Hyo Geun

    2017-01-01

    Although an association between energy drinks and suicide has been suggested, few prior studies have considered the role of emotional factors including stress, sleep, and school performance in adolescents. This study aimed to evaluate the association of energy drinks with suicide, independent of possible confounders including stress, sleep, and school performance. In total, 121,106 adolescents aged 13 to 18 years from the 2014 and 2015 Korea Youth Risk Behavior Web-based Survey were surveyed for age, sex, region of residence, economic level, paternal and maternal education level, sleep time, stress level, school performance, frequency of energy drink intake, and suicide attempts. Subjective stress levels were classified into severe, moderate, mild, a little, and no stress. Sleep time was divided into 6 groups: < 6 h; 6 ≤ h < 7; 7 ≤ h < 8; 8 ≤ h < 9; and ≥ 9 h. School performance was classified into 5 levels: A (highest), B (middle, high), C (middle), D (middle, low), and E (lowest). Frequency of energy drink consumption was divided into 3 groups: ≥ 3, 1-2, and 0 times a week. The associations of sleep time, stress level, and school performance with suicide attempts and the frequency of energy drink intake were analyzed using multiple and ordinal logistic regression analysis, respectively, with complex sampling. The relationship between frequency of energy drink intake and suicide attempts was analyzed using multiple logistic regression analysis with complex sampling. Higher stress levels, lack of sleep, and low school performance were significantly associated with suicide attempts (each P < 0.001). These variables of high stress level, abnormal sleep time, and low school performance were also proportionally related with higher energy drink intake (P < 0.001). 
Frequent energy drink intake was significantly associated with suicide attempts in multiple logistic regression analyses (AOR for frequency of energy intake ≥ 3 times a week = 3.03, 95% CI = 2.64-3.49, P < 0.001). Severe stress, inadequate sleep, and low school performance were related with more energy drink intake and suicide attempts in Korean adolescents. Frequent energy drink intake was positively related with suicide attempts, even after adjusting for stress, sleep time, and school performance.

  18. Wavelet-based multiscale performance analysis: An approach to assess and improve hydrological models

    NASA Astrophysics Data System (ADS)

    Rathinasamy, Maheswaran; Khosa, Rakesh; Adamowski, Jan; ch, Sudheer; Partheepan, G.; Anand, Jatin; Narsimlu, Boini

    2014-12-01

    The temporal dynamics of hydrological processes are spread across different time scales and, as such, the performance of hydrological models cannot be estimated reliably from global performance measures that assign a single number to the fit of a simulated time series to an observed reference series. Accordingly, it is important to analyze model performance at different time scales. Wavelets have been used extensively in the area of hydrological modeling for multiscale analysis, and have been shown to be very reliable and useful in understanding dynamics across time scales and as these evolve in time. In this paper, a wavelet-based multiscale performance measure for hydrological models is proposed and tested (i.e., Multiscale Nash-Sutcliffe Criteria and Multiscale Normalized Root Mean Square Error). The main advantage of this method is that it provides a quantitative measure of model performance across different time scales. In the proposed approach, model and observed time series are decomposed using the Discrete Wavelet Transform (known as the à trous wavelet transform), and performance measures of the model are obtained at each time scale. The applicability of the proposed method was explored using various case studies, both real and synthetic. The synthetic case studies included various kinds of errors (e.g., timing error, under and over prediction of high and low flows) in outputs from a hydrologic model. The real case studies included simulation results of both the process-based Soil Water Assessment Tool (SWAT) model, as well as statistical models, namely the Coupled Wavelet-Volterra (WVC), Artificial Neural Network (ANN), and Auto Regressive Moving Average (ARMA) methods. For the SWAT model, data from the Wainganga and Sind Basins (India) were used, while for the Wavelet-Volterra, ANN, and ARMA models, data from the Cauvery River Basin (India) and Fraser River (Canada) were used. 
The study also explored the effect of the choice of the wavelets in multiscale model evaluation. It was found that the proposed wavelet-based performance measures, namely the MNSC (Multiscale Nash-Sutcliffe Criteria) and MNRMSE (Multiscale Normalized Root Mean Square Error), are a more reliable measure than traditional performance measures such as the Nash-Sutcliffe Criteria (NSC), Root Mean Square Error (RMSE), and Normalized Root Mean Square Error (NRMSE). Further, the proposed methodology can be used to: i) compare different hydrological models (both physical and statistical models), and ii) help in model calibration.
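    The idea of scoring a model separately at each time scale can be sketched without the wavelet machinery: below, non-overlapping block averages stand in for the à trous decomposition used in the paper, and the Nash-Sutcliffe criterion is evaluated on each aggregated pair of series. This is an illustrative simplification under that stated assumption, not the proposed MNSC itself.

```python
def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 minus error variance relative to
    the variance of the observations about their mean."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1 - num / den

def multiscale_nse(obs, sim, scales=(1, 2, 4)):
    """NSE evaluated on block averages at several time scales, a
    simple aggregation stand-in for a wavelet decomposition."""
    def aggregate(x, k):
        usable = len(x) - len(x) % k
        return [sum(x[i:i + k]) / k for i in range(0, usable, k)]
    return {k: nse(aggregate(obs, k), aggregate(sim, k)) for k in scales}
```

    A model that looks adequate at the daily scale can score poorly once aggregated, which is exactly the kind of scale-dependent deficiency a single global NSE value hides.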

  19. Cost Analysis and Performance Assessment of Partner Services for Human Immunodeficiency Virus and Sexually Transmitted Diseases, New York State, 2014.

    PubMed

    Johnson, Britney L; Tesoriero, James; Feng, Wenhui; Qian, Feng; Martin, Erika G

    2017-12-01

    To estimate the programmatic costs of partner services for HIV, syphilis, gonorrhea, and chlamydial infection. New York State and local health departments conducting partner services activities in 2014. A cost analysis estimated, from the state perspective, total program costs and cost per case assignment, patient interview, partner notification, and disease-specific key performance indicator. Data came from contracts, a time study of staff effort, and statewide surveillance systems. Disease-specific costs per case assignment (mean: $580; range: $502-$1,111), patient interview ($703; $608-$1,609), partner notification ($1,169; $950-$1,936), and key performance indicator ($2,697; $1,666-$20,255) varied across diseases. Most costs (79 percent) were devoted to gonorrhea and chlamydial infection investigations. Cost analysis complements cost-effectiveness analysis in evaluating program performance and guiding improvements. © Health Research and Educational Trust.

  20. The Optimal Timing of Stage-2-Palliation After the Norwood Operation.

    PubMed

    Meza, James M; Hickey, Edward; McCrindle, Brian; Blackstone, Eugene; Anderson, Brett; Overman, David; Kirklin, James K; Karamlou, Tara; Caldarone, Christopher; Kim, Richard; DeCampli, William; Jacobs, Marshall; Guleserian, Kristine; Jacobs, Jeffrey P; Jaquiss, Robert

    2018-01-01

    The effect of the timing of stage-2-palliation (S2P) on survival through single ventricle palliation remains unknown. This study investigated the optimal timing of S2P that minimizes pre-S2P attrition and maximizes post-S2P survival. The Congenital Heart Surgeons' Society's critical left ventricular outflow tract obstruction cohort was used. Survival analysis was performed using multiphase parametric hazard analysis. Separate risk factors for death after the Norwood and after S2P were identified. Based on the multivariable models, infants were stratified as low, intermediate, or high risk. Cumulative 2-year, post-Norwood survival was predicted. Optimal timing was determined using conditional survival analysis and plotted as 2-year, post-Norwood survival versus age at S2P. A Norwood operation was performed in 534 neonates from 21 institutions. The S2P was performed in 71%, at a median age of 5.1 months (IQR: 4.3 to 6.0), and 22% died after Norwood. By 5 years after S2P, 10% of infants had died. For low- and intermediate-risk infants, performing S2P after age 3 months was associated with 89% ± 3% and 82% ± 3% 2-year survival, respectively. Undergoing an interval cardiac reoperation or moderate-severe right ventricular dysfunction before S2P were high-risk features. Among high-risk infants, 2-year survival was 63% ± 5%, and even lower when S2P was performed before age 6 months. Performing S2P after age 3 months may optimize survival of low- and intermediate-risk infants. High-risk infants are unlikely to complete three-stage palliation, and early S2P may increase their risk of mortality. We infer that early referral for cardiac transplantation may increase their chance of survival. Copyright © 2018 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  1. Statistical analysis of CCSN/SS7 traffic data from working CCS subnetworks

    NASA Astrophysics Data System (ADS)

    Duffy, Diane E.; McIntosh, Allen A.; Rosenstein, Mark; Willinger, Walter

    1994-04-01

    In this paper, we report on an ongoing statistical analysis of actual CCSN traffic data. The data consist of approximately 170 million signaling messages collected from a variety of different working CCS subnetworks. The key findings from our analysis concern: (1) the characteristics of both the telephone call arrival process and the signaling message arrival process; (2) the tail behavior of the call holding time distribution; and (3) the observed performance of the CCSN with respect to a variety of performance and reliability measurements.
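Tail behavior of a holding-time distribution is often summarized by a tail index. A minimal sketch using the Hill estimator on synthetic Pareto data; the estimator choice, sample sizes, and tail index are illustrative assumptions, not the paper's method or data:

```python
import numpy as np

def hill_estimator(sample, k):
    """Hill estimator of the tail index alpha from the k largest order
    statistics; small alpha indicates a heavy (power-law) tail."""
    x = np.sort(np.asarray(sample, dtype=float))[::-1]   # descending
    logs = np.log(x[:k]) - np.log(x[k])
    return 1.0 / logs.mean()

rng = np.random.default_rng(0)
pareto = rng.pareto(1.5, size=50_000) + 1.0   # classical Pareto, alpha = 1.5
alpha_hat = hill_estimator(pareto, k=2_000)
```

For exponential (light-tailed) holding times the same estimator drifts upward with k instead of stabilizing near a finite alpha.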

  2. Simulator study of the effect of visual-motion time delays on pilot tracking performance with an audio side task

    NASA Technical Reports Server (NTRS)

    Riley, D. R.; Miller, G. K., Jr.

    1978-01-01

    The effect of time delay in the visual and motion cues of a flight simulator on pilot performance was determined for tracking a target aircraft oscillating sinusoidally in altitude only. An audio side task was used to ensure the subject was fully occupied at all times. The results indicate that, within the test grid employed, about the same acceptable time delay (250 msec) was obtained for a single aircraft (fighter type) by each of two subjects for both fixed-base and motion-base conditions. Acceptable time delay is defined as the largest amount of delay that can be inserted simultaneously into the visual and motion cues before performance degradation occurs. A statistical analysis of the data was made to establish this value of time delay. The audio side task provided quantitative data documenting the subject's work level.
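As a hypothetical illustration of how an inserted cue delay can be quantified, cross-correlating a command signal with its delayed copy recovers the delay; the signal, sample rate, and delay value below are assumptions, not the study's apparatus:

```python
import numpy as np

fs = 1000                                    # Hz, assumed sample rate
rng = np.random.default_rng(1)
command = rng.standard_normal(5 * fs)        # broadband input signal
shift = 250                                  # inserted delay in samples (250 ms)
response = np.concatenate([np.zeros(shift), command[:-shift]])

# The peak of the cross-correlation locates the inserted delay.
xcorr = np.correlate(response, command, mode="full")
lag = int(np.argmax(xcorr)) - (len(command) - 1)
est_delay_ms = lag * 1000 / fs
```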

  3. Comparison of estimators for rolling samples using Forest Inventory and Analysis data

    Treesearch

    Devin S. Johnson; Michael S. Williams; Raymond L. Czaplewski

    2003-01-01

    The performance of three classes of weighted average estimators is studied for an annual inventory design similar to the Forest Inventory and Analysis program of the United States. The first class is based on an ARIMA(0,1,1) time series model. The equal weight, simple moving average is a member of this class. The second class is based on an ARIMA(0,2,2) time series...
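For context, the optimal one-step predictor under an ARIMA(0,1,1) model weights past panels geometrically (exponential smoothing), and the equal-weight simple moving average is the limiting member of that class as the smoothing constant goes to zero. A small numeric sketch; the theta value and panel count are illustrative assumptions:

```python
import numpy as np

def ewma_weights(theta, n):
    """Weights implied by an ARIMA(0,1,1) model: exponential smoothing with
    smoothing constant alpha = 1 - theta, renormalized over the n panels
    actually observed."""
    alpha = 1.0 - theta
    w = alpha * (1 - alpha) ** np.arange(n)
    return w / w.sum()

n_panels = 5
w_exp = ewma_weights(theta=0.7, n=n_panels)   # geometric decay
w_sma = np.full(n_panels, 1 / n_panels)       # equal-weight member
```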

  4. Development, validation and operating room-transfer of a six-step laparoscopic training program for the vesicourethral anastomosis.

    PubMed

    Klein, Jan; Teber, Dogu; Frede, Tom; Stock, Christian; Hruza, Marcel; Gözen, Ali; Seemann, Othmar; Schulze, Michael; Rassweiler, Jens

    2013-03-01

    Development and full validation of a laparoscopic training program for stepwise learning of a reproducible, standardized laparoscopic anastomosis technique, and its integration into the clinical course. The training of vesicourethral anastomosis (VUA) was divided into six simple standardized steps. To fix the objective criteria, four experienced surgeons performed the stepwise training protocol. Thirty-eight participants with no previous laparoscopic experience were investigated in their training performance. The time needed to manage each training step and the total training time were recorded. The integration into the clinical course was investigated, and the training results and the corresponding steps during laparoscopic radical prostatectomy (LRP) were analyzed. Data analysis of corresponding operating room (OR) sections of 793 LRPs was performed, and validity criteria were determined on this basis. In the laboratory section, a significant reduction of OR time for every step was seen in all participants: coordination, 62%; longitudinal incision, 52%; inverted U-shape incision, 43%; plexus, 47%; anastomosis catheter model, 38%; VUA, 38%. The laboratory section required a total time of 29 hours (minimum 16 hours; maximum 42 hours). All participants had shorter execution times in the laboratory than under real conditions; the best match was found with the VUA model, and performing an anastomosis under real conditions required 25% more time. By using the training protocol, the performance of the VUA is comparable to that of a surgeon with experience of about 50 laparoscopic VUAs. Data analysis proved content, construct, and prognostic validity. The use of stepwise training approaches enables a surgeon to learn and reproduce complex reconstructive surgical tasks, e.g., the VUA, in a safe environment. The validity of the designed system is given at all levels, and it should be used as a standard in clinical surgical training in laparoscopic reconstructive urology.

  5. Change and Diversity in Grandparenting Experience.

    ERIC Educational Resources Information Center

    Thomas, Jeanne L.; Datan, Nancy

    In this study, change over time in grandparenting experience, sex differences in grandparenting, and differences among relationships with different grandchildren were explored. Thirteen grandmothers and six grandfathers were interviewed; content analysis and thematic analysis of interview transcripts were performed. Grandparents described changes…

  6. Exact reconstruction analysis/synthesis filter banks with time-varying filters

    NASA Technical Reports Server (NTRS)

    Arrowood, J. L., Jr.; Smith, M. J. T.

    1993-01-01

    This paper examines some of the analysis/synthesis issues associated with FIR time-varying filter banks where the filter bank coefficients are allowed to change in response to the input signal. Several issues are identified as being important in order to realize performance gains from time-varying filter banks in image coding applications. These issues relate to the behavior of the filters during the transition from one filter bank to another. Lattice structure formulations for the time-varying filter bank problem are introduced and discussed in terms of their properties and transition characteristics.
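The exact-reconstruction property in the time-invariant case can be checked with a minimal two-channel (Haar) orthogonal filter bank sketch. The time-varying transitions the paper studies are beyond this toy, and the indexing conventions here are illustrative assumptions:

```python
import numpy as np

h0 = np.array([1.0, 1.0]) / np.sqrt(2.0)    # Haar lowpass analysis filter
h1 = np.array([1.0, -1.0]) / np.sqrt(2.0)   # Haar highpass analysis filter

def analyze(x):
    """Filter, then downsample by 2 in each channel."""
    lo = np.convolve(x, h0)[1::2]
    hi = np.convolve(x, h1)[1::2]
    return lo, hi

def synthesize(lo, hi, n):
    """Upsample by 2, filter with the time-reversed filters, and sum."""
    up0 = np.zeros(2 * lo.size); up0[::2] = lo
    up1 = np.zeros(2 * hi.size); up1[::2] = hi
    y = np.convolve(up0, h0[::-1]) + np.convolve(up1, h1[::-1])
    return y[:n]

x = np.arange(16, dtype=float)
lo, hi = analyze(x)
x_hat = synthesize(lo, hi, x.size)          # exact reconstruction of x
```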

  7. Effects of and preference for pay for performance: an analogue analysis.

    PubMed

    Long, Robert D; Wilder, David A; Betz, Alison; Dutta, Ami

    2012-01-01

    We examined the effects of 2 payment systems on the rate of check processing and time spent on task by participants in a simulated work setting. Three participants experienced individual pay-for-performance (PFP) without base pay and pay-for-time (PFT) conditions. In the last phase, we asked participants to choose which system they preferred. For all participants, the PFP condition produced higher rates of check processing and more time spent on task than did the PFT condition, but choice of payment system varied both within and across participants.

  8. Evidence That Bimanual Motor Timing Performance Is Not a Significant Factor in Developmental Stuttering.

    PubMed

    Hilger, Allison I; Zelaznik, Howard; Smith, Anne

    2016-08-01

    Stuttering involves a breakdown in the speech motor system. We address whether stuttering in its early stage is specific to the speech motor system or whether its impact is observable across motor systems. As an extension of Olander, Smith, and Zelaznik (2010), we measured bimanual motor timing performance in 115 children: 70 children who stutter (CWS) and 45 children who do not stutter (CWNS). The children repeated the clapping task yearly for up to 5 years. We used a synchronization-continuation rhythmic timing paradigm. Two analyses were completed: a cross-sectional analysis of data from the children in the initial year of the study (ages 4;0 [years;months] to 5;11) compared clapping performance between CWS and CWNS. A second, multiyear analysis assessed clapping behavior across the ages 3;5-9;5 to examine any potential relationship between clapping performance and eventual persistence or recovery of stuttering. Preschool CWS were not different from CWNS on rates of clapping or variability in interclap interval. In addition, no relationship was found between bimanual motor timing performance and eventual persistence in or recovery from stuttering. The disparity between the present findings for preschoolers and those of Olander et al. (2010) most likely arises from the smaller sample size used in the earlier study. From the current findings, on the basis of data from relatively large samples of stuttering and nonstuttering children tested over multiple years, we conclude that a bimanual motor timing deficit is not a core feature of early developmental stuttering.

  9. Effect of respiratory muscle training on exercise performance in healthy individuals: a systematic review and meta-analysis.

    PubMed

    Illi, Sabine K; Held, Ulrike; Frank, Irène; Spengler, Christina M

    2012-08-01

    Two distinct types of specific respiratory muscle training (RMT), i.e. respiratory muscle strength (resistive/threshold) and endurance (hyperpnoea) training, have been established to improve the endurance performance of healthy individuals. We performed a systematic review and meta-analysis in order to determine the factors that affect the change in endurance performance after RMT in healthy subjects. A computerized search was performed without language restriction in MEDLINE, EMBASE and CINAHL and references of original studies and reviews were searched for further relevant studies. RMT studies with healthy individuals assessing changes in endurance exercise performance by maximal tests (constant load, time trial, intermittent incremental, conventional [non-intermittent] incremental) were screened and abstracted by two independent investigators. A multiple linear regression model was used to identify effects of subjects' fitness, type of RMT (inspiratory or combined inspiratory/expiratory muscle strength training, respiratory muscle endurance training), type of exercise test, test duration and type of sport (rowing, running, swimming, cycling) on changes in performance after RMT. In addition, a meta-analysis was performed to determine the effect of RMT on endurance performance in those studies providing the necessary data. The multiple linear regression analysis including 46 original studies revealed that less fit subjects benefit more from RMT than highly trained athletes (6.0% per 10 mL · kg⁻¹ · min⁻¹ decrease in maximal oxygen uptake, 95% confidence interval [CI] 1.8, 10.2%; p = 0.005) and that improvements do not differ significantly between inspiratory muscle strength and respiratory muscle endurance training (p = 0.208), while combined inspiratory and expiratory muscle strength training seems to be superior in improving performance, although based on only 6 studies (+12.8% compared with inspiratory muscle strength training, 95% CI 3.6, 22.0%; p = 0.006). 
Furthermore, constant load tests (+16%, 95% CI 10.2, 22.9%) and intermittent incremental tests (+18.5%, 95% CI 10.8, 26.3%) detect changes in endurance performance better than conventional incremental tests (both p < 0.001), with no difference between time trials and conventional incremental tests (p = 0.286). With increasing test duration, improvements in performance are greater (+0.4% per minute test duration, 95% CI 0.1, 0.6%; p = 0.011) and the type of sport does not influence the magnitude of improvements (all p > 0.05). The meta-analysis, performed on eight controlled trials, revealed a significant improvement in performance after RMT, which was detected by constant load tests, time trials and intermittent incremental tests, but not by conventional incremental tests. RMT improves endurance exercise performance in healthy individuals with greater improvements in less fit individuals and in sports of longer durations. The two most common types of RMT (inspiratory muscle strength and respiratory muscle endurance training) do not differ significantly in their effect, while combined inspiratory/expiratory strength training might be superior. Improvements are similar between different types of sports. Changes in performance can be detected by constant load tests, time trials and intermittent incremental tests only. Thus, all types of RMT can be used to improve exercise performance in healthy subjects but care must be taken regarding the test used to investigate the improvements.
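The inverse-variance pooling behind such a meta-analysis can be sketched with the DerSimonian-Laird random-effects estimator. The per-study numbers below are made up for illustration and are not taken from the review:

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate: inverse-variance weights inflated
    by the DerSimonian-Laird between-study variance tau^2."""
    e, v = np.asarray(effects, float), np.asarray(variances, float)
    w = 1.0 / v
    fixed = (w * e).sum() / w.sum()                 # fixed-effect estimate
    q = (w * (e - fixed) ** 2).sum()                # Cochran's Q
    c = w.sum() - (w ** 2).sum() / w.sum()
    tau2 = max(0.0, (q - (len(e) - 1)) / c)         # between-study variance
    w_re = 1.0 / (v + tau2)
    pooled = (w_re * e).sum() / w_re.sum()
    se = np.sqrt(1.0 / w_re.sum())
    return pooled, se, tau2

# Hypothetical per-study % improvements after RMT, with their variances
pooled, se, tau2 = dersimonian_laird([12.0, 8.0, 15.0, 5.0],
                                     [4.0, 2.0, 6.0, 3.0])
```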

  10. Item response theory analysis of the Amyotrophic Lateral Sclerosis Functional Rating Scale-Revised in the Pooled Resource Open-Access ALS Clinical Trials Database.

    PubMed

    Bacci, Elizabeth D; Staniewska, Dorota; Coyne, Karin S; Boyer, Stacey; White, Leigh Ann; Zach, Neta; Cedarbaum, Jesse M

    2016-01-01

    Our objective was to examine dimensionality and item-level performance of the Amyotrophic Lateral Sclerosis Functional Rating Scale-Revised (ALSFRS-R) across time using classical and modern test theory approaches. Confirmatory factor analysis (CFA) and Item Response Theory (IRT) analyses were conducted using data from patients with amyotrophic lateral sclerosis (ALS) in the Pooled Resource Open-Access ALS Clinical Trials (PRO-ACT) database with complete ALSFRS-R data (n = 888) at three time-points (Time 0, Time 1 (6 months), Time 2 (1 year)). In this population of 888 patients, mean age was 54.6 years, 64.4% were male, and 93.7% were Caucasian. The CFA supported a four-factor structure (bulbar, gross motor, fine motor, and respiratory domains). IRT analysis within each domain revealed misfitting items and overlapping item response category thresholds at all time-points, particularly in the gross motor and respiratory domain items. Results indicate that many items of the ALSFRS-R may sub-optimally distinguish among varying levels of disability assessed by each domain, particularly in patients with less severe disability. Measure performance improved across time as patient disability severity increased. In conclusion, modifications to select ALSFRS-R items may improve the instrument's specificity to disability level and sensitivity to treatment effects.
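The item-level behavior IRT examines can be sketched with the two-parameter logistic item characteristic curve. Parameter values here are illustrative assumptions, and the ALSFRS-R analysis itself uses polytomous (multi-category) models rather than this binary form:

```python
import numpy as np

def icc_2pl(theta, a, b):
    """Two-parameter logistic item characteristic curve: probability of
    endorsing an item given latent trait theta, discrimination a, and
    difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

theta = np.linspace(-3, 3, 7)
p_easy = icc_2pl(theta, a=1.5, b=-1.0)   # low-difficulty item
p_hard = icc_2pl(theta, a=1.5, b=1.0)    # high-difficulty item
```

Items whose curves overlap heavily across the trait range, as reported here, discriminate poorly between adjacent disability levels.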

  11. Predicting analysis time in events-driven clinical trials using accumulating time-to-event surrogate information.

    PubMed

    Wang, Jianming; Ke, Chunlei; Yu, Zhinuan; Fu, Lei; Dornseif, Bruce

    2016-05-01

    For clinical trials with time-to-event endpoints, predicting the accrual of the events of interest with precision is critical in determining the timing of interim and final analyses. For example, overall survival (OS) is often chosen as the primary efficacy endpoint in oncology studies, with planned interim and final analyses at a pre-specified number of deaths. Often, correlated surrogate information, such as time-to-progression (TTP) and progression-free survival, are also collected as secondary efficacy endpoints. It would be appealing to borrow strength from the surrogate information to improve the precision of the analysis time prediction. Currently available methods in the literature for predicting analysis timings do not consider utilizing the surrogate information. In this article, using OS and TTP as an example, a general parametric model for OS and TTP is proposed, with the assumption that disease progression could change the course of the overall survival. Progression-free survival, related both to OS and TTP, will be handled separately, as it can be derived from OS and TTP. The authors seek to develop a prediction procedure using a Bayesian method and provide detailed implementation strategies under certain assumptions. Simulations are performed to evaluate the performance of the proposed method. An application to a real study is also provided. Copyright © 2015 John Wiley & Sons, Ltd.
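The core bookkeeping in predicting an events-driven analysis time is locating the calendar time of the target-th event. A minimal simulation sketch under an assumed exponential survival model; all rates, counts, and names are hypothetical, and the paper's Bayesian use of surrogate TTP information is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(2)

n_enrolled, target_events = 300, 100
enroll = rng.uniform(0, 12, n_enrolled)               # enrollment month
surv = rng.exponential(scale=30, size=n_enrolled)     # survival time (months)
death_time = enroll + surv                            # calendar time of death

def predict_analysis_time(death_time, target):
    """Calendar time at which the target-th event occurs."""
    return np.sort(death_time)[target - 1]

t_final = predict_analysis_time(death_time, target_events)
```

In practice the unobserved portion of each survival time would be repeatedly imputed from the posterior of the model (informed by TTP), giving a predictive distribution for t_final rather than a single value.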

  12. Gifted Ethnic Minority Students and Academic Achievement: A Meta-Analysis

    ERIC Educational Resources Information Center

    Henfield, Malik S.; Woo, Hongryun; Bang, Na Mi

    2017-01-01

    We conducted a meta-analysis exploring ethnic minority students enrolled in gifted/advanced programs with an emphasis on their academic achievement outcomes. A comprehensive search based on the Transparent Reporting of Systematic Reviews and Meta-Analysis checklist, was performed to retrieve articles within a 30-year time period (1983-2014), which…

  13. Real time automatic detection of bearing fault in induction machine using kurtogram analysis.

    PubMed

    Tafinine, Farid; Mokrani, Karim

    2012-11-01

    A signal processing technique for incipient bearing fault detection in real time, based on kurtogram analysis, is presented in this paper. The kurtogram is a fourth-order spectral analysis tool introduced for detecting and characterizing non-stationarities in a signal. The technique starts by investigating resonance signatures over selected frequency bands to extract representative features. Traditional spectral analysis is not appropriate for non-stationary vibration signals or for real-time diagnosis. The performance of the proposed technique is examined by a series of experimental tests corresponding to different bearing conditions. Test results show that this signal processing technique is an effective method for automatic bearing fault detection and gives a good basis for an integrated induction machine condition monitor.

  14. Total cost of ownership: the role of clinical engineering.

    PubMed

    Hockel, Dale; Kintner, Michael

    2014-06-01

    Hospitals often incur substantial hidden costs associated with service agreements that they enter into with original equipment manufacturers at the time of equipment purchase. Hospitals should perform an analysis of the total cost of ownership (TCO) of their organizations' medical equipment to identify opportunities for performance improvement and savings. The findings of the TCO analysis can point to areas where clinical engineering service management can be improved through investments in technology, training, and teamwork.

  15. A Data Envelopment Analysis Model for Selecting Material Handling System Designs

    NASA Astrophysics Data System (ADS)

    Liu, Fuh-Hwa Franklin; Kuo, Wan-Ting

    The material handling system under design is an unmanned job shop with an automated guided vehicle that transports loads among the processing machines. The engineering task is to select design alternatives that are combinations of four design factors: the ratio of production time to transportation time, the mean job arrival rate to the system, the input/output buffer capacities at each processing machine, and the vehicle control strategy. Each design alternative is simulated to collect the upper and lower bounds of five performance indices. We develop a Data Envelopment Analysis (DEA) model to assess the 180 designs with imprecise data on the five indices. A three-way factorial analysis of the assessment results indicates that buffer capacity, and the interaction of job arrival rate and buffer capacity, significantly affect performance.
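A minimal input-oriented CCR DEA sketch, solved in multiplier form as a linear program. The toy data and the single input/output are illustrative assumptions and do not reproduce the paper's imprecise-data formulation:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j):
    """Input-oriented CCR efficiency of DMU j.
    X: (n_dmu, n_in) inputs, Y: (n_dmu, n_out) outputs.
    LP: max u.y_j  s.t.  v.x_j = 1,  u.Y_i - v.X_i <= 0 for all i,  u, v >= 0.
    """
    n, m = X.shape
    _, s = Y.shape
    c = np.concatenate([-Y[j], np.zeros(m)])        # linprog minimizes
    A_ub = np.hstack([Y, -X])                       # u.Y_i - v.X_i <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[j]]).reshape(1, -1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=(0, None))
    return -res.fun

# Toy designs: one input (transport time), one output (throughput)
X = np.array([[2.0], [4.0], [3.0]])
Y = np.array([[4.0], [4.0], [6.0]])
eff = [ccr_efficiency(X, Y, j) for j in range(3)]
```

With one input and one output, CCR efficiency reduces to each design's output/input ratio scaled by the best ratio, which makes the toy answer easy to check by hand.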

  16. An analysis of a candidate control algorithm for a ride quality augmentation system

    NASA Technical Reports Server (NTRS)

    Suikat, Reiner; Donaldson, Kent; Downing, David R.

    1987-01-01

    This paper presents a detailed analysis of a candidate algorithm for a ride quality augmentation system. The algorithm consists of a full-state feedback control law based on optimal control output weighting, estimators for angle of attack and sideslip, and a maneuvering algorithm. The control law is shown to perform well by both frequency and time domain analysis. The rms vertical acceleration is reduced by about 40 percent over the whole mission flight envelope. The estimators for the angle of attack and sideslip avoid the often inaccurate or costly direct measurement of those angles. The maneuvering algorithm will allow the augmented airplane to respond to pilot inputs. The design characteristics and performance are documented by the closed-loop eigenvalues; rms levels of vertical, lateral, and longitudinal acceleration; and representative time histories and frequency response.

  17. Hardware accelerator design for tracking in smart camera

    NASA Astrophysics Data System (ADS)

    Singh, Sanjay; Dunga, Srinivasa Murali; Saini, Ravi; Mandal, A. S.; Shekhar, Chandra; Vohra, Anil

    2011-10-01

    Smart cameras are important components in video analysis. A smart camera needs to detect interesting moving objects, track such objects from frame to frame, and analyze object tracks in real time; real-time tracking is therefore prominent in smart cameras. A software implementation of a tracking algorithm on a general-purpose processor (such as a PowerPC) achieves a low frame rate, far from real-time requirements. This paper presents a SIMD-based hardware accelerator designed for real-time tracking of objects in a scene. The system is designed and simulated in VHDL and implemented on a Xilinx XUP Virtex-II Pro FPGA. The resulting frame rate is 30 frames per second for 250x200-resolution gray-scale video.

  18. Grab a coffee: your aerial images are already analyzed

    NASA Astrophysics Data System (ADS)

    Garetto, Anthony; Rademacher, Thomas; Schulz, Kristian

    2015-07-01

    For over two decades the AIM™ platform has been utilized in mask shops as the standard for actinic review of photomask sites in order to perform defect disposition and repair review. Throughout this time the measurement throughput of the systems has been improved to keep pace with the requirements of a manufacturing environment; however, the analysis of the captured sites has seen little improvement and has remained a manual process. This manual analysis of aerial images is time consuming, subject to error and unreliability, and contributes to longer turn-around time (TAT) and slower process flow in a manufacturing environment. AutoAnalysis, the first application available for the FAVOR® platform, offers a solution to these problems by providing fully automated data transfer and analysis of AIM™ aerial images. The data are automatically output in a customizable format that can be tailored to internal needs and the requests of customers. Savings in operator time arise from the automated analysis, which no longer needs to be performed manually. Reliability is improved as human error is eliminated, ensuring the most defective region is always and consistently captured. Finally, the TAT is shortened and the process flow for the back end of the line improved, as the analysis is fast and runs in parallel to the measurements. In this paper the concept and approach of AutoAnalysis are presented, as well as an update on the status of the project. The benefits arising from the automation and the customizable approach of the solution are also shown.

  19. HENDRICS: High ENergy Data Reduction Interface from the Command Shell

    NASA Astrophysics Data System (ADS)

    Bachetti, Matteo

    2018-05-01

    HENDRICS, a rewrite of and update to MaLTPyNT (ascl:1502.021), contains command-line scripts based on Stingray (ascl:1608.001) to perform a quick-look (spectral-)timing analysis of X-ray data, properly treating gaps in the data due, e.g., to occultation by the Earth or passages through the SAA. Despite its original focus on NuSTAR, HENDRICS can perform standard aperiodic timing analysis on X-ray data from, in principle, any other satellite. Its features include power density and cross spectra, time lags, pulsar searches with epoch folding and the Z_n^2 statistic, and color-color and color-intensity diagrams. The periodograms produced by HENDRICS (such as a power density spectrum or a cospectrum) can be saved in a format compatible with XSPEC (ascl:9910.005) or ISIS (ascl:1302.002).
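The Z_n^2 statistic used for pulsar searches has a compact definition: the summed Rayleigh power of the first n harmonics of the photon phases at a trial frequency. A sketch on synthetic, sinusoidally modulated arrival times; all simulation parameters are assumptions for illustration:

```python
import numpy as np

def z2_n(times, freq, n=2):
    """Z^2_n periodicity statistic for event arrival times at a trial
    frequency (harmonics 1..n).  Under pure noise it follows a chi^2
    distribution with 2n degrees of freedom; a pulsation inflates it."""
    phases = 2 * np.pi * np.mod(times * freq, 1.0)
    z = 0.0
    for k in range(1, n + 1):
        z += np.cos(k * phases).sum() ** 2 + np.sin(k * phases).sum() ** 2
    return 2.0 * z / len(times)

rng = np.random.default_rng(4)
f_true = 0.5  # Hz, assumed pulse frequency
# Sinusoidally modulated arrival times generated by thinning
t = np.sort(rng.uniform(0, 1000, 20_000))
keep = rng.uniform(size=t.size) < 0.5 * (1 + 0.8 * np.sin(2 * np.pi * f_true * t))
pulsed = t[keep]

z_on = z2_n(pulsed, f_true)     # at the true frequency: large
z_off = z2_n(pulsed, 0.37)      # off-pulse trial frequency: ~chi^2_4
```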

  20. Analytical Tools for Affordability Analysis

    DTIC Science & Technology

    2015-05-01

    Analytical Tools for Affordability Analysis. David Tate, Cost Analysis and Research Division, Institute for Defense Analyses, 4850 Mark Center Drive, Alexandria, VA 22311-1882. Topics include unit cost as a function of learning and rate (Womer) and learning with forgetting (Benkard), in which learning depreciates over time.

  1. Simultaneous separation by reversed-phase high-performance liquid chromatography and mass spectral identification of anthocyanins and flavonols in Shiraz grape skin.

    PubMed

    Downey, Mark O; Rochfort, Simone

    2008-08-01

    A limitation of large-scale viticultural trials is the time and cost of comprehensive compositional analysis of the fruit by high-performance liquid chromatography (HPLC). In addition, separate methods have generally been required to identify and quantify different classes of metabolites. To address these shortcomings a reversed-phase HPLC method was developed to simultaneously separate the anthocyanins and flavonols present in grape skins. The method employs a methanol and water gradient acidified with 10% formic acid with a run-time of 48 min including re-equilibration. Identity of anthocyanins and flavonols in Shiraz (Vitis vinifera L.) skin was confirmed by mass spectral analysis.

  2. Requirements analysis for a hardware, discrete-event, simulation engine accelerator

    NASA Astrophysics Data System (ADS)

    Taylor, Paul J., Jr.

    1991-12-01

    An analysis of a general Discrete Event Simulation (DES), executing on the distributed architecture of an eight-node Intel iPSC/2 hypercube, was performed. The most time-consuming portions of the general DES algorithm were determined to be the functions associated with passing required simulation data between the processing nodes of the hypercube architecture. A behavioral description, using the IEEE standard VHSIC Hardware Description and Design Language (VHDL), of a general DES hardware accelerator is presented. The behavioral description specifies the operational requirements for a DES coprocessor to augment the hypercube's execution of DES simulations. The DES coprocessor design implements the functions necessary to perform distributed discrete-event simulations using a conservative time-synchronization protocol.
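The event-list mechanics such an accelerator targets can be sketched as a minimal sequential DES kernel; the class and method names are illustrative, and the paper's actual design is a VHDL coprocessor, not software:

```python
import heapq

class DES:
    """Minimal discrete-event simulation kernel: a time-ordered event list
    processed strictly in timestamp order."""
    def __init__(self):
        self.clock = 0.0
        self._queue = []
        self._seq = 0   # tie-breaker so the heap never compares callables

    def schedule(self, delay, action):
        heapq.heappush(self._queue, (self.clock + delay, self._seq, action))
        self._seq += 1

    def run(self):
        while self._queue:
            self.clock, _, action = heapq.heappop(self._queue)
            action()

sim = DES()
log = []
sim.schedule(5.0, lambda: log.append(("depart", sim.clock)))
sim.schedule(2.0, lambda: log.append(("arrive", sim.clock)))
sim.run()   # events fire in time order regardless of scheduling order
```

A distributed, conservative implementation adds per-channel timestamps and blocks a node until every incoming channel guarantees no earlier event can arrive, which is exactly the message-passing overhead the abstract identifies.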

  3. Low Gravity Rapid Thermal Analysis of Glass

    NASA Technical Reports Server (NTRS)

    Tucker, Dennis S.; Ethridge, Edwin C.; Smith, Guy A.

    2004-01-01

    It has been observed by two research groups that crystallization of ZrF4-BaF2-LaF3-AlF3-NaF (ZBLAN) glass is suppressed in microgravity. The mechanism for this phenomenon is unknown at the present time. In order to better understand the mechanism, an experiment was performed on NASA's KC-135 reduced-gravity aircraft to obtain quantitative crystallization data. An apparatus was designed and constructed for performing rapid thermal analysis of milligram quantities of ZBLAN glass. The apparatus employs an ellipsoidal furnace allowing rapid heating and cooling. Using this apparatus, nucleation and crystallization kinetic data were obtained, leading to the construction of time-temperature-transformation curves for ZBLAN in microgravity and unit gravity.

  4. Excited state absorption spectra of dissolved and aggregated distyrylbenzene: A TD-DFT state and vibronic analysis

    NASA Astrophysics Data System (ADS)

    Oliveira, Eliezer Fernando; Shi, Junqing; Lavarda, Francisco Carlos; Lüer, Larry; Milián-Medina, Begoña; Gierschner, Johannes

    2017-07-01

    A time-dependent density functional theory study is performed to reveal the excited state absorption (ESA) features of distyrylbenzene (DSB), a prototype π-conjugated organic oligomer. Starting with a didactic insight to ESA based on simple molecular orbital and configuration considerations, the performance of various density functional theory functionals is tested to reveal the full vibronic ESA features of DSB at short and long probe delay times.

  5. Comparative Study of Motor Performance of Deaf and Hard of Hearing Students in Reaction Time, Visual-Motor Control and Upper Limb Speed and Dexterity Abilities

    ERIC Educational Resources Information Center

    Gkouvatzi, Anastasia N.; Mantis, Konstantinos; Kambas, Antonis

    2010-01-01

    Using the Bruininks-Oseretsky Test, the motor performance of 34 deaf and hard-of-hearing pupils aged 6-14 years was evaluated in reaction time, visual-motor control, and upper limb speed and dexterity. A two-way ANOVA for the two independent variables (group, age) and post hoc Scheffé tests for multiple comparisons were used. The…

  6. Determinants of ambulance response time: A study in Sabah, Malaysia

    NASA Astrophysics Data System (ADS)

    Chin, Su Na; Cheah, Phee Kheng; Arifin, Muhamad Yaakub; Wong, Boh Leng; Omar, Zaturrawiah; Yassin, Fouziah Md; Gabda, Darmesah

    2017-04-01

    Ambulance response time (ART) is one of the standard key performance indicators (KPIs) for measuring the performance of emergency medical services (EMS) delivery. When the mean ART of an EMS system meets the KPI target, the EMS system is performing well. This paper considers the determinants of ART, using data sampled from 967 ambulance runs in a government hospital in Sabah. Multiple regression analysis with backward elimination was used to identify significant factors. Among the underlying factors, travel distance, patient age, type of treatment, and peak hours were found to significantly affect ART. Identifying the factors that influence ART helps the development of strategic improvement plans for reducing it.
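Backward elimination as used here can be sketched in a few lines of OLS bookkeeping: fit, drop the least significant predictor, refit, and stop when everything remaining is significant. The synthetic data, predictor names, coefficients, and threshold below are assumptions, not the study's values:

```python
import numpy as np
from scipy import stats

def backward_eliminate(X, y, names, alpha=0.05):
    """OLS with backward elimination: repeatedly drop the predictor with
    the largest p-value until every remaining p-value is below alpha."""
    names = list(names)
    while names:
        Xd = np.column_stack([np.ones(len(y))] + [X[n] for n in names])
        beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
        resid = y - Xd @ beta
        dof = len(y) - Xd.shape[1]
        sigma2 = resid @ resid / dof
        se = np.sqrt(np.diag(sigma2 * np.linalg.inv(Xd.T @ Xd)))
        pvals = (2 * stats.t.sf(np.abs(beta / se), dof))[1:]  # skip intercept
        worst = int(np.argmax(pvals))
        if pvals[worst] < alpha:
            return names
        names.pop(worst)
    return names

rng = np.random.default_rng(5)
n = 500
X = {"distance": rng.uniform(1, 30, n),
     "age": rng.uniform(1, 90, n),
     "peak_hour": rng.integers(0, 2, n).astype(float)}
# Hypothetical response: driven by distance and peak hour only
y = 5 + 1.2 * X["distance"] + 4.0 * X["peak_hour"] + rng.normal(0, 2.0, n)
kept = backward_eliminate(X, y, X.keys())
```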

  7. Movie Exposure to Alcohol Cues and Adolescent Alcohol Problems: A Longitudinal Analysis in a National Sample

    PubMed Central

    Wills, Thomas A.; Sargent, James D.; Gibbons, Frederick X.; Gerrard, Meg; Stoolmiller, Mike

    2009-01-01

    The authors tested a theoretical model of how exposure to alcohol cues in movies predicts level of alcohol use (ever use plus ever and recent binge drinking) and alcohol-related problems. A national sample of younger adolescents was interviewed by telephone with 4 repeated assessments spaced at 8-month intervals. A structural equation modeling analysis performed for ever-drinkers at Time 3 (N = 961) indicated that, controlling for a number of covariates, movie alcohol exposure at Time 1 was related to increases in peer alcohol use and adolescent alcohol use at Time 2. Movie exposure had indirect effects to alcohol use and problems at Times 3 and 4 through these pathways, with direct effects to problems from Time 1 rebelliousness and Time 2 movie exposure also found. Prospective risk-promoting effects were also found for alcohol expectancies, peer alcohol use, and availability of alcohol in the home; protective effects were found for mother’s responsiveness and for adolescent’s school performance and self-control. Theoretical and practical implications are discussed. PMID:19290687

  8. Movie exposure to alcohol cues and adolescent alcohol problems: a longitudinal analysis in a national sample.

    PubMed

    Wills, Thomas A; Sargent, James D; Gibbons, Frederick X; Gerrard, Meg; Stoolmiller, Mike

    2009-03-01

    The authors tested a theoretical model of how exposure to alcohol cues in movies predicts level of alcohol use (ever use plus ever and recent binge drinking) and alcohol-related problems. A national sample of younger adolescents was interviewed by telephone with 4 repeated assessments spaced at 8-month intervals. A structural equation modeling analysis performed for ever-drinkers at Time 3 (N = 961) indicated that, controlling for a number of covariates, movie alcohol exposure at Time 1 was related to increases in peer alcohol use and adolescent alcohol use at Time 2. Movie exposure had indirect effects to alcohol use and problems at Times 3 and 4 through these pathways, with direct effects to problems from Time 1 rebelliousness and Time 2 movie exposure also found. Prospective risk-promoting effects were also found for alcohol expectancies, peer alcohol use, and availability of alcohol in the home; protective effects were found for mother's responsiveness and for adolescent's school performance and self-control. Theoretical and practical implications are discussed. (PsycINFO Database Record (c) 2009 APA, all rights reserved).

  9. Capabilities needed for the next generation of thermo-hydraulic codes for use in real time applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arndt, S.A.

    1997-07-01

The real-time reactor simulation field is currently at a crossroads in terms of the capability to perform real-time analysis using the most sophisticated computer codes. Current generation safety analysis codes are being modified to replace simplified codes that were specifically designed to meet the competing requirement for real-time applications. The next generation of thermo-hydraulic codes will need to have included in their specifications the specific requirement for use in a real-time environment. Use of the codes in real-time applications imposes much stricter requirements on robustness, reliability and repeatability than do design and analysis applications. In addition, the need for code use by a variety of users is a critical issue for real-time users, trainers and emergency planners who currently use real-time simulation, and PRA practitioners who will increasingly use real-time simulation for evaluating PRA success criteria in near real-time to validate PRA results for specific configurations and plant system unavailabilities.

  10. Comparison of fluorescent tags for analysis of mannose-6-phosphate glycans.

    PubMed

    Kang, Ji-Yeon; Kwon, Ohsuk; Gil, Jin Young; Oh, Doo-Byoung

    2016-05-15

    Mannose-6-phosphate (M-6-P) glycan analysis is important for quality control of therapeutic enzymes for lysosomal storage diseases. Here, we found that the analysis of glycans containing two M-6-Ps was highly affected by the hydrophilicity of the elution solvent used in high-performance liquid chromatography (HPLC). In addition, the performances of three fluorescent tags--2-aminobenzoic acid (2-AA), 2-aminobenzamide (2-AB), and 3-(acetyl-amino)-6-aminoacridine (AA-Ac)--were compared with each other for M-6-P glycan analysis using HPLC and matrix-assisted laser desorption/ionization time-of-flight mass spectrometry. The best performance for analyzing M-6-P glycans was shown by 2-AA labeling in both analyses. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. A Multifaceted Approach to Investigating Pre-Task Planning Effects on Paired Oral Test Performance

    ERIC Educational Resources Information Center

    Nitta, Ryo; Nakatsuhara, Fumiyo

    2014-01-01

    Despite the growing popularity of paired format speaking assessments, the effects of pre-task planning time on performance in these formats are not yet well understood. For example, some studies have revealed the benefits of planning but others have not. Using a multifaceted approach including analysis of the process of speaking performance, the…

  12. Relative performance of academic departments using DEA with sensitivity analysis.

    PubMed

    Tyagi, Preeti; Yadav, Shiv Prasad; Singh, S P

    2009-05-01

    The process of liberalization and globalization of the Indian economy has brought new opportunities and challenges in all areas of human endeavor, including education. Educational institutions have to adopt new strategies to make the best use of the opportunities and counter the challenges. One of these challenges is how to assess the performance of academic programs based on multiple criteria. Keeping this in view, this paper attempts to evaluate the performance efficiencies of 19 academic departments of IIT Roorkee (India) through the data envelopment analysis (DEA) technique. The technique has been used to assess the performance of academic institutions in a number of countries, such as the USA, UK, and Australia, but to the best of our knowledge this is the first time it has been applied in the Indian context. Applying DEA models, we calculate technical, pure technical and scale efficiencies and identify the reference sets for inefficient departments. Input and output projections are also suggested for inefficient departments to reach the frontier. Overall performance, research performance and teaching performance are assessed separately using sensitivity analysis.
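The CCR variant of DEA described above reduces, for each department (decision-making unit), to a small linear program. The following is a minimal input-oriented CCR sketch on hypothetical single-input/single-output data, not the IIT Roorkee dataset; it assumes scipy is available.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y):
    """Input-oriented CCR efficiency score for each DMU (department).

    X: (n_inputs, n_dmus), Y: (n_outputs, n_dmus). For DMU j, solve
    min theta  s.t.  X @ lam <= theta * x_j,  Y @ lam >= y_j,  lam >= 0.
    """
    n_in, n = X.shape
    n_out = Y.shape[0]
    scores = []
    for j in range(n):
        c = np.r_[1.0, np.zeros(n)]                     # variables: [theta, lam_1..lam_n]
        A_in = np.hstack([-X[:, [j]], X])               # X @ lam - theta * x_j <= 0
        A_out = np.hstack([np.zeros((n_out, 1)), -Y])   # y_j - Y @ lam <= 0
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.r_[np.zeros(n_in), -Y[:, j]]
        bounds = [(None, None)] + [(0, None)] * n
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
        scores.append(res.x[0])
    return np.array(scores)

# toy data: one input (faculty size) and one output (publications) for 3 departments
X = np.array([[2.0, 4.0, 5.0]])
Y = np.array([[2.0, 2.0, 5.0]])
scores = ccr_efficiency(X, Y)   # departments 1 and 3 are efficient; department 2 is not
```

An inefficient department's score gives the uniform input contraction needed to reach the frontier spanned by the efficient peers (its reference set).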

  13. Application of the Analysis Phase of the Instructional System Development to the MK-105 Magnetic Minesweeping Mission of the MH-53E Helicopter.

    DTIC Science & Technology

    1987-09-01

    Visual Communication. Although this task is performed several times, the task is performed at different points during the mission. In addition, the...Perform visual communication Give thumbs-up signal when ready for takeoff; check lights on pri-fly B. Perform takeoff and Aircraft operating clear ship...FM c. Operate ICS 2. Perform visual communication 3. Operate IFF transponder B. Maintain mission and fuel logs C. Perform checklists 1. Perform AMCM

  14. Pharmaceutical identifier confirmation via DART-TOF.

    PubMed

    Easter, Jacob L; Steiner, Robert R

    2014-07-01

    Pharmaceutical analysis comprises a large amount of the casework in forensic controlled substances laboratories. In order to reduce the time of analysis for pharmaceuticals, a Direct Analysis in Real Time ion source coupled with an accurate mass time-of-flight (DART-TOF) mass spectrometer was used to confirm identity. DART-TOF spectral data for pharmaceutical samples were analyzed and evaluated by comparison to standard spectra. Identical mass pharmaceuticals were differentiated using collision-induced dissociation fragmentation, present/absent ions, and abundance comparison box plots; principal component analysis (PCA) and linear discriminant analysis (LDA) were used for differentiation of identical mass mixed drug spectra. Mass assignment reproducibility and robustness tests were performed on the DART-TOF spectra. Impacts on the forensic science community include a decrease in analysis time over the traditional gas chromatography/mass spectrometry (GC/MS) confirmations, better laboratory efficiency, and simpler sample preparation. Using physical identifiers and the DART-TOF to confirm pharmaceutical identity will eliminate the use of GC/MS and effectively reduce analysis time while still complying with accepted analysis protocols. This will prove helpful in laboratories with large backlogs and will simplify the confirmation process. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  15. A systematic review and meta-analysis of carbohydrate benefits associated with randomized controlled competition-based performance trials.

    PubMed

    Pöchmüller, Martin; Schwingshackl, Lukas; Colombani, Paolo C; Hoffmann, Georg

    2016-01-01

    Carbohydrate supplements are widely used by athletes as an ergogenic aid before and during sports events. The present systematic review and meta-analysis aimed at synthesizing all available data from randomized controlled trials performed under real-life conditions. MEDLINE, EMBASE, and the Cochrane Central Register of Controlled Trials were searched systematically up to February 2015. Study groups were categorized according to test mode and type of performance measurement. Subgroup analyses were done with reference to exercise duration and range of carbohydrate concentration. Random effects and fixed effect meta-analyses were performed using the Cochrane Collaboration's Review Manager 5.3 software. Twenty-four randomized controlled trials met the objectives and were included in the present systematic review, 16 of which provided data for meta-analyses. Carbohydrate supplementation was associated with a significantly shorter exercise time in groups performing submaximal exercise followed by a time trial [mean difference -0.9 min (95% confidence interval -1.7, -0.2), p = 0.02] as compared to controls. Subgroup analysis showed that improvements were specific for studies administering a concentration of carbohydrates between 6 and 8% [mean difference -1.0 min (95% confidence interval -1.9, -0.0), p = 0.04]. Concerning groups with submaximal exercise followed by a time trial measuring power accomplished within a fixed time or distance, mean power output was significantly higher following carbohydrate load [mean difference 20.2 W (95% confidence interval 9.0, 31.5), p = 0.0004]. Likewise, mean power output was significantly increased following carbohydrate intervention in groups with a time trial measuring power within a fixed time or distance [mean difference 8.1 W (95% confidence interval 0.5, 15.7), p = 0.04]. Due to the limitations of this systematic review, results can only be applied to a subset of athletes (trained male cyclists). For those, we could observe a potential ergogenic benefit of carbohydrate supplementation, especially in a concentration range between 6 and 8%, when exercising longer than 90 min.
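The pooled mean differences reported in such a meta-analysis come from inverse-variance weighting of the per-trial effects. A minimal fixed-effect sketch with hypothetical trial values, not the review's data:

```python
import numpy as np

def fixed_effect_meta(effects, ses):
    """Inverse-variance fixed-effect pooling of per-trial mean differences."""
    e = np.asarray(effects, dtype=float)
    w = 1.0 / np.asarray(ses, dtype=float) ** 2   # weight = 1 / variance
    pooled = np.sum(w * e) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))                 # standard error of the pooled effect
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# hypothetical time-trial mean differences (minutes) and their standard errors
pooled, ci = fixed_effect_meta([-1.2, -0.6, -1.0], [0.5, 0.4, 0.6])
```

Precise trials get larger weights, so the pooled estimate sits closest to the best-measured effects; a random-effects model would additionally widen the weights by a between-trial variance term.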

  16. Evaluation of four endogenous reference genes and their real-time PCR assays for common wheat quantification in GMOs detection.

    PubMed

    Huang, Huali; Cheng, Fang; Wang, Ruoan; Zhang, Dabing; Yang, Litao

    2013-01-01

    Proper selection of endogenous reference genes and their real-time PCR assays is quite important in genetically modified organisms (GMOs) detection. To find a suitable endogenous reference gene and its real-time PCR assay for common wheat (Triticum aestivum L.) DNA content or copy number quantification, four previously reported wheat endogenous reference genes and their real-time PCR assays were comprehensively evaluated for the target gene sequence variation and their real-time PCR performance among 37 common wheat lines. Three SNPs were observed in the PKABA1 and ALMT1 genes, and these SNPs significantly decreased the efficiency of real-time PCR amplification. GeNorm analysis of the real-time PCR performance of each gene among common wheat lines showed that the Waxy-D1 assay had the lowest M values with the best stability among all tested lines. All results indicated that the Waxy-D1 gene and its real-time PCR assay were most suitable to be used as an endogenous reference gene for common wheat DNA content quantification. The validated Waxy-D1 gene assay will be useful in establishing accurate and credible qualitative and quantitative PCR analysis of GM wheat.
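The geNorm M value used to rank candidate reference genes is the average, over all other candidates, of the standard deviation of pairwise log-ratios across samples; the most stable gene has the lowest M. A minimal sketch on simulated expression data (the wheat lines and gene identities below are illustrative, not the study's measurements):

```python
import numpy as np

def genorm_m(expr):
    """geNorm stability measure M for each candidate reference gene.

    expr: (n_samples, n_genes) relative expression quantities. M for gene j is
    the mean, over all other genes k, of the standard deviation across samples
    of log2(expr_j / expr_k); a lower M means a more stable gene.
    """
    log_expr = np.log2(expr)
    n_genes = expr.shape[1]
    M = np.empty(n_genes)
    for j in range(n_genes):
        sds = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
               for k in range(n_genes) if k != j]
        M[j] = np.mean(sds)
    return M

# simulated data for 10 samples x 3 genes: gene 2 is deliberately unstable
rng = np.random.default_rng(0)
base = rng.uniform(0.5, 2.0, size=(10, 1))                  # shared per-sample effect
noise_sd = np.array([0.05, 0.1, 1.0])                       # per-gene instability
expr = base * 2.0 ** rng.normal(0.0, noise_sd, size=(10, 3))
M = genorm_m(expr)
```

Because M is built from expression ratios, a sample-wide scaling (e.g. input DNA amount) cancels out, which is why geNorm can rank stability without absolute quantification.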

  17. Evaluation of Four Endogenous Reference Genes and Their Real-Time PCR Assays for Common Wheat Quantification in GMOs Detection

    PubMed Central

    Huang, Huali; Cheng, Fang; Wang, Ruoan; Zhang, Dabing; Yang, Litao

    2013-01-01

    Proper selection of endogenous reference genes and their real-time PCR assays is quite important in genetically modified organisms (GMOs) detection. To find a suitable endogenous reference gene and its real-time PCR assay for common wheat (Triticum aestivum L.) DNA content or copy number quantification, four previously reported wheat endogenous reference genes and their real-time PCR assays were comprehensively evaluated for the target gene sequence variation and their real-time PCR performance among 37 common wheat lines. Three SNPs were observed in the PKABA1 and ALMT1 genes, and these SNPs significantly decreased the efficiency of real-time PCR amplification. GeNorm analysis of the real-time PCR performance of each gene among common wheat lines showed that the Waxy-D1 assay had the lowest M values with the best stability among all tested lines. All results indicated that the Waxy-D1 gene and its real-time PCR assay were most suitable to be used as an endogenous reference gene for common wheat DNA content quantification. The validated Waxy-D1 gene assay will be useful in establishing accurate and creditable qualitative and quantitative PCR analysis of GM wheat. PMID:24098735

  18. State space model approach for forecasting the use of electrical energy (a case study on: PT. PLN (Persero) district of Kroya)

    NASA Astrophysics Data System (ADS)

    Kurniati, Devi; Hoyyi, Abdul; Widiharih, Tatik

    2018-05-01

    Time series data is a series of data taken or measured based on observations at the same time interval. Time series analysis is used to analyze data while accounting for the effect of time. The purpose of time series analysis is to characterize the patterns of a data set and to predict future values based on data from the past. One of the forecasting methods used for time series data is the state space model. This study discusses the modeling and forecasting of electric energy consumption using the state space model for univariate data. The modeling stage begins with optimal Autoregressive (AR) order selection, determination of the state vector through canonical correlation analysis, parameter estimation, and forecasting. The results show that a state space model of order 4 models electric energy consumption with a Mean Absolute Percentage Error (MAPE) of 3.655%, which places the model in the 'very good' forecasting category.
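The AR-fitting and MAPE-evaluation steps can be illustrated in miniature. The sketch below fits an AR(4) model by ordinary least squares and scores 12-step forecasts with MAPE, on a synthetic seasonal series rather than the PLN Kroya consumption data; the full study's state space formulation via canonical correlation analysis is not reproduced here.

```python
import numpy as np

def fit_ar(series, order=4):
    """Fit an AR(order) model by ordinary least squares; returns [intercept, phi_1..phi_order]."""
    y = series[order:]
    lags = np.column_stack([series[order - i - 1: len(series) - i - 1]
                            for i in range(order)])
    X = np.column_stack([np.ones(len(y)), lags])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def forecast_ar(series, coef, steps):
    """Iterate the fitted AR recursion forward for `steps` periods."""
    hist = list(series)
    order = len(coef) - 1
    out = []
    for _ in range(steps):
        recent = hist[-1: -order - 1: -1]            # most recent lag first
        nxt = coef[0] + float(np.dot(coef[1:], recent))
        hist.append(nxt)
        out.append(nxt)
    return np.array(out)

def mape(actual, predicted):
    """Mean Absolute Percentage Error, in percent."""
    actual, predicted = np.asarray(actual), np.asarray(predicted)
    return 100.0 * np.mean(np.abs((actual - predicted) / actual))

# synthetic monthly series with a 12-month cycle (not the PLN Kroya data)
t = np.arange(60)
series = 100.0 + 10.0 * np.sin(2 * np.pi * t / 12)
coef = fit_ar(series[:48], order=4)
pred = forecast_ar(series[:48], coef, steps=12)
error = mape(series[48:], pred)
```

A pure sinusoid satisfies a low-order linear recurrence exactly, so the AR(4) fit tracks the held-out year closely; on real consumption data the MAPE reflects genuine noise, as in the study's 3.655%.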

  19. Antiepileptic drug monotherapy for epilepsy: a network meta-analysis of individual participant data.

    PubMed

    Nevitt, Sarah J; Sudell, Maria; Weston, Jennifer; Tudur Smith, Catrin; Marson, Anthony G

    2017-06-29

    Epilepsy is a common neurological condition with a worldwide prevalence of around 1%. Approximately 60% to 70% of people with epilepsy will achieve a longer-term remission from seizures, and most achieve that remission shortly after starting antiepileptic drug treatment. Most people with epilepsy are treated with a single antiepileptic drug (monotherapy) and current guidelines from the National Institute for Health and Care Excellence (NICE) in the United Kingdom for adults and children recommend carbamazepine or lamotrigine as first-line treatment for partial onset seizures and sodium valproate for generalised onset seizures; however a range of other antiepileptic drug (AED) treatments are available, and evidence is needed regarding their comparative effectiveness in order to inform treatment choices. To compare the time to withdrawal of allocated treatment, remission and first seizure of 10 AEDs (carbamazepine, phenytoin, sodium valproate, phenobarbitone, oxcarbazepine, lamotrigine, gabapentin, topiramate, levetiracetam, zonisamide) currently used as monotherapy in children and adults with partial onset seizures (simple partial, complex partial or secondary generalised) or generalised tonic-clonic seizures with or without other generalised seizure types (absence, myoclonus). We searched the following databases: Cochrane Epilepsy's Specialised Register, CENTRAL, MEDLINE and SCOPUS, and two clinical trials registers. We handsearched relevant journals and contacted pharmaceutical companies, original trial investigators, and experts in the field. The date of the most recent search was 27 July 2016. We included randomised controlled trials of a monotherapy design in adults or children with partial onset seizures or generalised onset tonic-clonic seizures (with or without other generalised seizure types). This was an individual participant data (IPD) review and network meta-analysis. 
Our primary outcome was 'time to withdrawal of allocated treatment', and our secondary outcomes were 'time to achieve 12-month remission', 'time to achieve six-month remission', 'time to first seizure post-randomisation', and 'occurrence of adverse events'. We presented all time-to-event outcomes as Cox proportional hazard ratios (HRs) with 95% confidence intervals (CIs). We performed pairwise meta-analysis of head-to-head comparisons between drugs within trials to obtain 'direct' treatment effect estimates and we performed frequentist network meta-analysis to combine direct evidence with indirect evidence across the treatment network of 10 drugs. We investigated inconsistency between direct estimates and network meta-analysis via node splitting. Due to variability in methods and detail of reporting adverse events, we have not performed an analysis. We have provided a narrative summary of the most commonly reported adverse events. IPD was provided for at least one outcome of this review for 12,391 out of a total of 17,961 eligible participants (69% of total data) from 36 out of the 77 eligible trials (47% of total trials). 
We could not include IPD from the remaining 41 trials in analysis for a variety of reasons, such as being unable to contact an author or sponsor to request data, data being lost or no longer available, cost and resources required to prepare data being prohibitive, or local authority or country-specific restrictions. We were able to calculate direct treatment effect estimates for between half and two thirds of comparisons across the outcomes of the review; however, for many of the comparisons, data were contributed by only a single trial or by a small number of participants, so confidence intervals of estimates were wide. Network meta-analysis showed that for the primary outcome 'Time to withdrawal of allocated treatment,' for individuals with partial seizures, levetiracetam performed (statistically) significantly better than both current first-line treatments carbamazepine and lamotrigine; lamotrigine performed better than all other treatments (aside from levetiracetam), and carbamazepine performed significantly better than gabapentin and phenobarbitone (high-quality evidence). For individuals with generalised onset seizures, first-line treatment sodium valproate performed significantly better than carbamazepine, topiramate and phenobarbitone (moderate- to high-quality evidence). Furthermore, for both partial and generalised onset seizures, the earliest licenced treatment, phenobarbitone, seems to perform worse than all other treatments (moderate- to high-quality evidence). Network meta-analysis also showed that for secondary outcomes 'Time to 12-month remission of seizures' and 'Time to six-month remission of seizures,' few notable differences were shown for either partial or generalised seizure types (moderate- to high-quality evidence). 
For secondary outcome 'Time to first seizure,' for individuals with partial seizures, phenobarbitone performed significantly better than both current first-line treatments carbamazepine and lamotrigine; carbamazepine performed significantly better than sodium valproate, gabapentin and lamotrigine. Phenytoin also performed significantly better than lamotrigine (high-quality evidence). In general, the earliest licenced treatments (phenytoin and phenobarbitone) performed better than the other treatments for both seizure types (moderate- to high-quality evidence). Generally, direct evidence and network meta-analysis estimates (direct plus indirect evidence) were numerically similar and consistent, with confidence intervals of effect sizes overlapping. The most commonly reported adverse events across all drugs were drowsiness/fatigue, headache or migraine, gastrointestinal disturbances, dizziness/faintness and rash or skin disorders. Overall, the high-quality evidence provided by this review supports current guidance (e.g. NICE) that carbamazepine and lamotrigine are suitable first-line treatments for individuals with partial onset seizures and also demonstrates that levetiracetam may be a suitable alternative. High-quality evidence from this review also supports the use of sodium valproate as the first-line treatment for individuals with generalised tonic-clonic seizures (with or without other generalised seizure types) and also demonstrates that lamotrigine and levetiracetam would be suitable alternatives to either of these first-line treatments, particularly for those of childbearing potential, for whom sodium valproate may not be an appropriate treatment option due to teratogenicity.
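The 'direct plus indirect evidence' combination in a network meta-analysis can be illustrated on a single three-treatment loop: an indirect log hazard ratio is formed by adding the two legs of the path (with variances adding), then pooled with the direct estimate by inverse-variance weighting. The numbers below are hypothetical, not the review's estimates.

```python
import numpy as np

def combine_direct_indirect(d_ab, se_ab, d_ac, se_ac, d_cb, se_cb):
    """Pool a direct A-vs-B log hazard ratio with the indirect A->C->B path.

    Indirect estimate: log HR(A vs B) = log HR(A vs C) + log HR(C vs B),
    with variances adding; the two estimates are inverse-variance weighted.
    """
    d_ind = d_ac + d_cb
    var_ind = se_ac ** 2 + se_cb ** 2
    w_dir, w_ind = 1.0 / se_ab ** 2, 1.0 / var_ind
    pooled = (w_dir * d_ab + w_ind * d_ind) / (w_dir + w_ind)
    se = np.sqrt(1.0 / (w_dir + w_ind))
    return pooled, se

# hypothetical log hazard ratios and standard errors for one loop
pooled, se = combine_direct_indirect(-0.20, 0.10, -0.12, 0.08, -0.10, 0.09)
```

Node splitting, as used in the review, amounts to comparing the direct and indirect components of such a loop before pooling; a large discrepancy signals inconsistency in the network.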

  20. Antiepileptic drug monotherapy for epilepsy: a network meta-analysis of individual participant data.

    PubMed

    Nevitt, Sarah J; Sudell, Maria; Weston, Jennifer; Tudur Smith, Catrin; Marson, Anthony G

    2017-12-15

    Epilepsy is a common neurological condition with a worldwide prevalence of around 1%. Approximately 60% to 70% of people with epilepsy will achieve a longer-term remission from seizures, and most achieve that remission shortly after starting antiepileptic drug treatment. Most people with epilepsy are treated with a single antiepileptic drug (monotherapy) and current guidelines from the National Institute for Health and Care Excellence (NICE) in the United Kingdom for adults and children recommend carbamazepine or lamotrigine as first-line treatment for partial onset seizures and sodium valproate for generalised onset seizures; however a range of other antiepileptic drug (AED) treatments are available, and evidence is needed regarding their comparative effectiveness in order to inform treatment choices. To compare the time to withdrawal of allocated treatment, remission and first seizure of 10 AEDs (carbamazepine, phenytoin, sodium valproate, phenobarbitone, oxcarbazepine, lamotrigine, gabapentin, topiramate, levetiracetam, zonisamide) currently used as monotherapy in children and adults with partial onset seizures (simple partial, complex partial or secondary generalised) or generalised tonic-clonic seizures with or without other generalised seizure types (absence, myoclonus). We searched the following databases: Cochrane Epilepsy's Specialised Register, CENTRAL, MEDLINE and SCOPUS, and two clinical trials registers. We handsearched relevant journals and contacted pharmaceutical companies, original trial investigators, and experts in the field. The date of the most recent search was 27 July 2016. We included randomised controlled trials of a monotherapy design in adults or children with partial onset seizures or generalised onset tonic-clonic seizures (with or without other generalised seizure types). This was an individual participant data (IPD) review and network meta-analysis. 
Our primary outcome was 'time to withdrawal of allocated treatment', and our secondary outcomes were 'time to achieve 12-month remission', 'time to achieve six-month remission', 'time to first seizure post-randomisation', and 'occurrence of adverse events'. We presented all time-to-event outcomes as Cox proportional hazard ratios (HRs) with 95% confidence intervals (CIs). We performed pairwise meta-analysis of head-to-head comparisons between drugs within trials to obtain 'direct' treatment effect estimates and we performed frequentist network meta-analysis to combine direct evidence with indirect evidence across the treatment network of 10 drugs. We investigated inconsistency between direct estimates and network meta-analysis via node splitting. Due to variability in methods and detail of reporting adverse events, we have not performed an analysis. We have provided a narrative summary of the most commonly reported adverse events. IPD was provided for at least one outcome of this review for 12,391 out of a total of 17,961 eligible participants (69% of total data) from 36 out of the 77 eligible trials (47% of total trials). 
We could not include IPD from the remaining 41 trials in analysis for a variety of reasons, such as being unable to contact an author or sponsor to request data, data being lost or no longer available, cost and resources required to prepare data being prohibitive, or local authority or country-specific restrictions. We were able to calculate direct treatment effect estimates for between half and two thirds of comparisons across the outcomes of the review; however, for many of the comparisons, data were contributed by only a single trial or by a small number of participants, so confidence intervals of estimates were wide. Network meta-analysis showed that for the primary outcome 'Time to withdrawal of allocated treatment,' for individuals with partial seizures, levetiracetam performed (statistically) significantly better than both current first-line treatments carbamazepine and lamotrigine; lamotrigine performed better than all other treatments (aside from levetiracetam), and carbamazepine performed significantly better than gabapentin and phenobarbitone (high-quality evidence). For individuals with generalised onset seizures, first-line treatment sodium valproate performed significantly better than carbamazepine, topiramate and phenobarbitone (moderate- to high-quality evidence). Furthermore, for both partial and generalised onset seizures, the earliest licenced treatment, phenobarbitone, seems to perform worse than all other treatments (moderate- to high-quality evidence). Network meta-analysis also showed that for secondary outcomes 'Time to 12-month remission of seizures' and 'Time to six-month remission of seizures,' few notable differences were shown for either partial or generalised seizure types (moderate- to high-quality evidence). 
For secondary outcome 'Time to first seizure,' for individuals with partial seizures, phenobarbitone performed significantly better than both current first-line treatments carbamazepine and lamotrigine; carbamazepine performed significantly better than sodium valproate, gabapentin and lamotrigine. Phenytoin also performed significantly better than lamotrigine (high-quality evidence). In general, the earliest licenced treatments (phenytoin and phenobarbitone) performed better than the other treatments for both seizure types (moderate- to high-quality evidence). Generally, direct evidence and network meta-analysis estimates (direct plus indirect evidence) were numerically similar and consistent, with confidence intervals of effect sizes overlapping. The most commonly reported adverse events across all drugs were drowsiness/fatigue, headache or migraine, gastrointestinal disturbances, dizziness/faintness and rash or skin disorders. Overall, the high-quality evidence provided by this review supports current guidance (e.g. NICE) that carbamazepine and lamotrigine are suitable first-line treatments for individuals with partial onset seizures and also demonstrates that levetiracetam may be a suitable alternative. High-quality evidence from this review also supports the use of sodium valproate as the first-line treatment for individuals with generalised tonic-clonic seizures (with or without other generalised seizure types) and also demonstrates that lamotrigine and levetiracetam would be suitable alternatives to either of these first-line treatments, particularly for those of childbearing potential, for whom sodium valproate may not be an appropriate treatment option due to teratogenicity.

  1. Changes in plasma protein levels as an early indication of a bloodstream infection

    PubMed Central

    Joenväärä, Sakari; Kaartinen, Johanna; Järvinen, Asko; Renkonen, Risto

    2017-01-01

    Blood culture is the primary diagnostic test performed in a suspicion of bloodstream infection to detect the presence of microorganisms and direct the treatment. However, blood culture is a slow and time-consuming method for detecting bloodstream infections or for separating septic and/or bacteremic patients from others with less serious febrile disease. Plasma proteomics, despite its challenges, remains an important source of early biomarkers for systemic diseases and might show changes before direct evidence from bacteria can be obtained. We performed a plasma proteomic analysis at the time of blood culture sampling from ten blood culture positive and ten blood culture negative patients, and quantified 172 proteins with two or more unique peptides. Principal components analysis, Orthogonal Projections to Latent Structures Discriminant Analysis (OPLS-DA) and ROC curve analysis were performed to select protein features which can classify the two groups of samples. We propose a number of candidates which qualify as potential biomarkers to distinguish the blood culture positive cases from negative ones. Pathway analysis by two methods revealed complement activation, the phagocytosis pathway and alterations in lipid metabolism as enriched pathways relevant for the condition. Data are available via ProteomeXchange with identifier PXD005022. PMID:28235076
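The ROC curve analysis used to screen candidate proteins summarizes discrimination as the area under the curve, which equals the probability that a randomly chosen positive sample outscores a randomly chosen negative one (the Mann-Whitney statistic). A minimal sketch with hypothetical protein abundances, not the study's quantifications:

```python
import numpy as np

def roc_auc(scores_pos, scores_neg):
    """AUC as the probability that a positive sample outscores a negative one
    (the Mann-Whitney U statistic, with ties counted as one half)."""
    pos = np.asarray(scores_pos, dtype=float)
    neg = np.asarray(scores_neg, dtype=float)
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# hypothetical normalised abundances of one candidate protein
auc = roc_auc([2.1, 1.8, 2.5, 2.2], [1.5, 1.9, 1.2, 1.6])
```

An AUC of 0.5 means no discrimination and 1.0 means perfect separation; ranking candidate proteins by AUC is one simple way to shortlist biomarkers before multivariate modeling such as OPLS-DA.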

  2. Application of Open Source Technologies for Oceanographic Data Analysis

    NASA Astrophysics Data System (ADS)

    Huang, T.; Gangl, M.; Quach, N. T.; Wilson, B. D.; Chang, G.; Armstrong, E. M.; Chin, T. M.; Greguska, F.

    2015-12-01

    NEXUS is a data-intensive analysis solution developed with a new approach for handling science data that enables large-scale data analysis by leveraging open source technologies such as Apache Cassandra, Apache Spark, Apache Solr, and Webification. NEXUS has been selected to provide on-the-fly time-series and histogram generation for the Soil Moisture Active Passive (SMAP) mission for Level 2 and Level 3 Active, Passive, and Active Passive products. It also provides an on-the-fly data subsetting capability. NEXUS is designed to scale horizontally, enabling it to handle massive amounts of data in parallel. It takes a new approach to managing time- and geo-referenced array data by dividing data artifacts into chunks and storing them in an industry-standard, horizontally scaled NoSQL database. This approach enables the development of scalable data analysis services that can infuse and leverage the elastic computing infrastructure of the Cloud. It is equipped with a high-performance geospatial and indexed data search solution, coupled with a high-performance data Webification solution free from file I/O bottlenecks, as well as a high-performance, in-memory data analysis engine. In this talk, we will focus on the recently funded AIST 2014 project that uses NEXUS as the core for an oceanographic anomaly detection service and web portal, which we call OceanXtremes.
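The chunking idea described above, dividing a geo-referenced array into tiles addressed by key in a horizontally scaled NoSQL store, can be sketched as follows; the key format and tile sizes are illustrative assumptions, not the actual NEXUS/Cassandra schema.

```python
import numpy as np

def chunk_grid(data, tile_rows, tile_cols):
    """Split a 2-D geo-referenced array into tiles keyed by grid position.

    The string keys stand in for the row keys of a horizontally scaled
    NoSQL store; each tile can then be fetched and processed independently.
    """
    tiles = {}
    n_rows, n_cols = data.shape
    for r0 in range(0, n_rows, tile_rows):
        for c0 in range(0, n_cols, tile_cols):
            key = f"tile:{r0 // tile_rows}:{c0 // tile_cols}"
            tiles[key] = data[r0:r0 + tile_rows, c0:c0 + tile_cols]
    return tiles

# a 6x8 grid split into four 3x4 tiles
tiles = chunk_grid(np.arange(48.0).reshape(6, 8), 3, 4)
```

Because each tile is independently addressable, a spatial subset or a time-series query touches only the relevant keys, which is what lets the store and the analysis engine scale out horizontally.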

  3. Use of the landmark method to address immortal person-time bias in comparative effectiveness research: a simulation study.

    PubMed

    Mi, Xiaojuan; Hammill, Bradley G; Curtis, Lesley H; Lai, Edward Chia-Cheng; Setoguchi, Soko

    2016-11-20

    Observational comparative effectiveness and safety studies are often subject to immortal person-time, a period of follow-up during which outcomes cannot occur because of the treatment definition. Common approaches, like excluding immortal time from the analysis or naïvely including immortal time in the analysis, are known to result in biased estimates of treatment effect. Other approaches, such as the Mantel-Byar and landmark methods, have been proposed to handle immortal time. Little is known about the performance of the landmark method in different scenarios. We conducted extensive Monte Carlo simulations to assess the performance of the landmark method compared with other methods in settings that reflect realistic scenarios. We considered four landmark times for the landmark method. We found that the Mantel-Byar method provided unbiased estimates in all scenarios, whereas the exclusion and naïve methods resulted in substantial bias when the hazard of the event was constant or decreased over time. The landmark method performed well in correcting immortal person-time bias in all scenarios when the treatment effect was small, and provided unbiased estimates when there was no treatment effect. The bias associated with the landmark method tended to be small when the treatment rate was higher in the early follow-up period than it was later. These findings were confirmed in a case study of chronic obstructive pulmonary disease. Copyright © 2016 John Wiley & Sons, Ltd.
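The immortal-time bias and the landmark correction can be reproduced in a small simulation: under a null treatment effect, classifying patients by eventual treatment makes the treated group look spuriously protected (they must survive long enough to start treatment), while classifying by status at a landmark time among those still at risk removes the bias. This is an illustrative sketch, not the paper's simulation design.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 20000

# null scenario: a constant event hazard that treatment does not change
event_time = rng.exponential(10.0, n)          # time to event
treat_start = rng.exponential(5.0, n)          # time treatment would begin
treated_ever = treat_start < event_time        # only the event-free can start treatment

# naive analysis: classify by eventual treatment -> immortal person-time bias
naive_treated_mean = event_time[treated_ever].mean()
naive_untreated_mean = event_time[~treated_ever].mean()

# landmark analysis: classify by treatment status at a fixed landmark time,
# restrict to patients still event-free then, count follow-up from the landmark
landmark = 2.0
at_risk = event_time > landmark
lm_treated = at_risk & (treat_start <= landmark)
lm_untreated = at_risk & (treat_start > landmark)
lm_treated_mean = event_time[lm_treated].mean() - landmark
lm_untreated_mean = event_time[lm_untreated].mean() - landmark
# the naive gap is large even though treatment does nothing;
# the landmark gap is near zero, as the paper reports for a null effect
```

The Mantel-Byar method goes further by treating exposure as time-varying rather than fixing it at one landmark, which is why it stays unbiased across all of the paper's scenarios.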

  4. Assessing team performance in the operating room: development and use of a "black-box" recorder and other tools for the intraoperative environment.

    PubMed

    Guerlain, Stephanie; Adams, Reid B; Turrentine, F Beth; Shin, Thomas; Guo, Hui; Collins, Stephen R; Calland, J Forrest

    2005-01-01

    The objective of this research was to develop a digital system to archive the complete operative environment along with the assessment tools for analysis of this data, allowing prospective studies of operative performance, intraoperative errors, team performance, and communication. Ability to study this environment will yield new insights, allowing design of systems to avoid preventable errors that contribute to perioperative complications. A multitrack, synchronized, digital audio-visual recording system (RATE tool) was developed to monitor intraoperative performance, including software to synchronize data and allow assignment of independent observational scores. Cases were scored for technical performance, participants' situational awareness (knowledge of critical information), and their comfort and satisfaction with the conduct of the procedure. Laparoscopic cholecystectomy (n = 10) was studied. Technical performance of the RATE tool was excellent. The RATE tool allowed real time, multitrack data collection of all aspects of the operative environment, while permitting digital recording of the objective assessment data in a time synchronized and annotated fashion during the procedure. The mean technical performance score was 73% +/- 28% of maximum (perfect) performance. Situational awareness varied widely among team members, with the attending surgeon typically the only team member having comprehensive knowledge of critical case information. The RATE tool allows prospective analysis of performance measures such as technical judgments, team performance, and communication patterns, offers the opportunity to conduct prospective intraoperative studies of human performance, and allows for postoperative discussion, review, and teaching. This study also suggests that gaps in situational awareness might be an underappreciated source of operative adverse events. Future uses of this system will aid teaching, failure or adverse event analysis, and intervention research.

  5. Optimization of BEV Charging Strategy

    NASA Astrophysics Data System (ADS)

    Ji, Wei

    This paper presents different approaches to optimizing the fast-charging and workplace-charging strategies of battery electric vehicle (BEV) drivers. For the fast-charging analysis, a rule-based model was built to simulate BEV charging behavior. Monte Carlo analysis was performed to explore the potential range of congestion at fast-charging stations, which could exceed four hours at the most crowded stations. A genetic algorithm was used to explore the theoretical minimum waiting time at fast-charging stations, which can reduce the waiting time at the most crowded stations to under one hour. A deterministic approach was proposed as a practical guideline: drivers should consider fast charging when the remaining range approaches 40 miles. This guideline is intended to help minimize potential congestion at fast-charging stations. For the workplace-charging analysis, scenario analysis was performed to simulate the temporal distribution of charging demand under different workplace-charging strategies. It was found that if BEV drivers charge as much as possible and as late as possible at the workplace, the utility of solar-generated electricity increases while relieving the grid stress of the intense nighttime electricity demand caused by charging electric vehicles at home.
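    The rule-based Monte Carlo congestion step can be illustrated with a minimal first-come-first-served station model (all parameters here, such as station size, charge duration, and uniform arrivals, are hypothetical illustrations rather than the paper's calibrated inputs):

    ```python
    import random

    def simulate_station_waits(n_arrivals, n_chargers, charge_minutes, seed=0):
        """Monte Carlo sketch: draw random arrival times over one day and
        compute each driver's wait at a first-come-first-served station."""
        rng = random.Random(seed)
        arrivals = sorted(rng.uniform(0, 24 * 60) for _ in range(n_arrivals))
        free_at = [0.0] * n_chargers  # minute at which each charger next becomes free
        waits = []
        for t in arrivals:
            i = min(range(n_chargers), key=lambda j: free_at[j])  # earliest-free charger
            start = max(t, free_at[i])
            waits.append(start - t)
            free_at[i] = start + charge_minutes
        return waits

    waits = simulate_station_waits(n_arrivals=120, n_chargers=4, charge_minutes=30)
    ```

    Repeating the simulation over many random seeds and demand levels gives the distribution of waiting times from which congestion ranges like those in the paper can be read off.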

  6. A normal incidence X-ray telescope

    NASA Technical Reports Server (NTRS)

    Golub, Leon

    1987-01-01

    The postflight performance evaluation of the X-ray telescope was summarized. All payload systems and subsystems performed well within acceptable limits, with the sole exception of the light-blocking prefilters. Launch, flight and recovery were performed in a fully satisfactory manner. The payload was recovered in a timely manner and in excellent condition. The prefilter performance analysis showed that no X-ray images were detected on the processed flight film. Recommendations for improved performance are listed.

  7. An empirically derived figure of merit for the quality of overall task performance

    NASA Technical Reports Server (NTRS)

    Lemay, Moira

    1989-01-01

    The need to develop an operationally relevant figure of merit for the quality of performance of a complex system such as an aircraft cockpit stems from a hypothesized dissociation between measures of performance and those of workload. Performance can be measured in terms of time, errors, or a combination of these. In most tasks performed by expert operators, errors are relatively rare and often corrected in time to avoid consequences. Moreover, perfect performance is seldom necessary to accomplish a particular task. Furthermore, how well an expert performs a complex task consisting of a series of discrete cognitive tasks superimposed on a continuous task, such as flying an aircraft, does not depend on how well each discrete task is performed, but on their smooth sequencing. This makes the amount of time spent on each subtask of paramount importance in measuring overall performance, since smooth sequencing requires a minimum amount of time spent on each task. Quality consists in getting tasks done within a crucial time interval while maintaining acceptable continuous-task performance. Thus, a figure of merit for overall quality of performance should be primarily a measure of time to perform discrete subtasks combined with a measure of basic vehicle control. The proposed figure of merit therefore requires performing a task analysis on a series of performances, or runs, of a particular task; listing each discrete task and its associated time; and calculating the mean and standard deviation of these times, along with the mean and standard deviation of tracking error for the whole task. A set of simulator data on 30 runs of a landing task was obtained, and a figure of merit will be calculated for each run. The figure of merit will be compared for voice and data link, so that the impact of this technology on total crew performance (not just communication performance) can be assessed. The effect of data link communication on other cockpit tasks will also be considered.
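    The computation described above can be sketched directly (a minimal sketch; the variable names and the choice to report the four summary statistics separately are assumptions, since the abstract leaves the combination rule to the empirical analysis):

    ```python
    from statistics import mean, stdev

    def figure_of_merit(subtask_times, tracking_errors):
        """Summarize one run: timing of discrete subtasks plus continuous
        tracking error. How these combine into a single score is left open
        here; the study derives that weighting empirically."""
        return {
            "time_mean": mean(subtask_times),    # mean discrete-subtask time
            "time_sd": stdev(subtask_times),     # variability of subtask timing
            "track_mean": mean(tracking_errors), # mean continuous-task error
            "track_sd": stdev(tracking_errors),  # variability of tracking error
        }

    # one hypothetical landing-task run: four subtask times (s) and error samples
    fom = figure_of_merit([2.1, 2.4, 1.9, 2.6], [0.12, 0.08, 0.15, 0.10])
    ```

    Comparing these per-run summaries across the 30 simulator runs, and between voice and data-link conditions, is the kind of analysis the abstract proposes.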

  8. Analysis of swimming performance: perceptions and practices of US-based swimming coaches.

    PubMed

    Mooney, Robert; Corley, Gavin; Godfrey, Alan; Osborough, Conor; Newell, John; Quinlan, Leo Richard; ÓLaighin, Gearóid

    2016-01-01

    In elite swimming, a broad range of methods are used to assess performance, inform coaching practices and monitor athletic progression. The aim of this paper was to examine the performance analysis practices of swimming coaches and to explore the reasons behind the decisions that coaches take when analysing performance. Survey data were analysed from 298 Level 3 competitive swimming coaches (245 male, 53 female) based in the United States. Results were compiled to provide a generalised picture of practices and perceptions and to examine key emerging themes. It was found that a disparity exists between the importance swim coaches place on biomechanical analysis of swimming performance and the types of analyses that are actually conducted. Video-based methods are most frequently employed, with over 70% of coaches using them at least monthly; analyses are mainly qualitative rather than quantitative. Barriers to the more widespread use of quantitative biomechanical analysis in elite swimming environments were explored. Constraints include time, cost and availability of resources, but other factors such as sources of information on swimming performance and analysis and control over service provision are also discussed, with particular emphasis on video-based methods and emerging sensor-based technologies.

  9. ATD-1 Avionics Phase 2: Post-Flight Data Analysis Report

    NASA Technical Reports Server (NTRS)

    Scharl, Julien

    2017-01-01

    This report aims to satisfy Air Traffic Management Technology Demonstration - 1 (ATD-1) Statement of Work (SOW) 3.6.19 and serves as the delivery mechanism for the analysis described in Annex C of the Flight Test Plan. The report describes the data collected and derived as well as the analysis methodology and associated results extracted from the data set collected during the ATD-1 Flight Test. All analyses described in the SOW were performed and are covered in this report except for the analysis of Final Approach Speed and its effect on performance. This analysis was de-prioritized and, at the time of this report, is not considered feasible in the schedule and costs remaining.

  10. Optimizing Scientist Time through In Situ Visualization and Analysis.

    PubMed

    Patchett, John; Ahrens, James

    2018-01-01

    In situ processing produces reduced-size persistent representations of a simulation's state while the simulation is running. The need for in situ visualization and data analysis is usually described in terms of supercomputer size and performance in relation to available storage size.

  11. 21ST CENTURY MOLD ANALYSIS IN FOOD

    EPA Science Inventory

    Traditionally, the indoor air community has relied on mold analysis performed by either microscopic observation or the culturing of molds on various media to assess indoor air quality. These techniques were developed in the 19th century and are very laborious and time-consuming...

  12. Linear systems analysis program, L224(QR). Volume 2: Supplemental system design and maintenance document

    NASA Technical Reports Server (NTRS)

    Heidergott, K. W.

    1979-01-01

    The computer program known as QR is described. Classical control systems analysis and synthesis (root locus, time response, and frequency response) can be performed using this program. Programming details of the QR program are presented.
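    As a generic illustration of the kind of time-response calculation such a program performs, the step response of a first-order system can be computed by forward-Euler integration (a stdlib sketch under assumed first-order dynamics, not QR's actual implementation):

    ```python
    def step_response(a, b, dt=0.001, t_end=5.0):
        """Discretized time response of dx/dt = -a*x + b*u to a unit step
        input u = 1, starting from x = 0. Returns (t, x) samples."""
        x, t, samples = 0.0, 0.0, []
        while t <= t_end:
            samples.append((t, x))
            x += dt * (-a * x + b)  # forward-Euler update
            t += dt
        return samples

    resp = step_response(a=2.0, b=4.0)
    # the response settles toward the steady-state value b/a = 2.0
    ```

    Frequency response and root locus follow from the same transfer-function data; this shows only the time-response step.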

  13. Multi-Dimensional Analysis of Dynamic Human Information Interaction

    ERIC Educational Resources Information Center

    Park, Minsoo

    2013-01-01

    Introduction: This study aims to understand the interactions of perception, effort, emotion, time and performance during the performance of multiple information tasks using Web information technologies. Method: Twenty volunteers from a university participated in this study. Questionnaires were used to obtain general background information and…

  14. Integrated Modeling Activities for the James Webb Space Telescope (JWST): Structural-Thermal-Optical Analysis

    NASA Technical Reports Server (NTRS)

    Johnston, John D.; Parrish, Keith; Howard, Joseph M.; Mosier, Gary E.; McGinnis, Mark; Bluth, Marcel; Kim, Kevin; Ha, Hong Q.

    2004-01-01

    This is a continuation of a series of papers on modeling activities for JWST. The structural-thermal-optical analysis process, often referred to as "STOP" analysis, is used to predict the effect of thermal distortion on optical performance. The benchmark STOP analysis for JWST assesses the effect of an observatory slew on wavefront error. The paper begins with an overview of multi-disciplinary engineering analysis, or integrated modeling, which is a critical element of the JWST mission. The STOP analysis process is then described. This process consists of the following steps: thermal analysis, structural analysis, and optical analysis. Temperatures predicted using geometric and thermal math models are mapped to the structural finite element model in order to predict thermally-induced deformations. Motions and deformations at optical surfaces are input to optical models, and optical performance is predicted using either an optical ray trace or WFE estimation techniques based on prior ray traces or first-order optics. Following the discussion of the analysis process, results are presented based on models representing the design at the time of the System Requirements Review. In addition to baseline performance predictions, sensitivity studies are performed to assess modeling uncertainties. Of particular interest is the sensitivity of optical performance to uncertainties in temperature predictions and variations in metal properties. The paper concludes with a discussion of modeling uncertainty as it pertains to STOP analysis.

  15. On the Usage of GPUs for Efficient Motion Estimation in Medical Image Sequences

    PubMed Central

    Thiyagalingam, Jeyarajan; Goodman, Daniel; Schnabel, Julia A.; Trefethen, Anne; Grau, Vicente

    2011-01-01

    Images are ubiquitous in biomedical applications from basic research to clinical practice. With the rapid increase in resolution, the dimensionality of the images and the need for real-time performance in many applications, computational requirements demand proper exploitation of multicore architectures. Towards this, GPU-specific implementations of image analysis algorithms are particularly promising. In this paper, we investigate the mapping of an enhanced motion estimation algorithm to novel GPU-specific architectures, and the resulting challenges and benefits therein. Using a database of three-dimensional image sequences, we show that the mapping leads to substantial performance gains, up to a factor of 60, and can provide near-real-time experience. We also show how architectural peculiarities of these devices can best be exploited to the benefit of algorithms, specifically for addressing the challenges related to their access patterns and different memory configurations. Finally, we evaluate the performance of the algorithm on three different GPU architectures and perform a comprehensive analysis of the results. PMID:21869880

  16. Simultaneous determination of niacin and pyridoxine at trace levels by using diode array high-performance liquid chromatography and liquid chromatography with quadrupole time-of-flight tandem mass spectrometry.

    PubMed

    Sel, Sabriye; Öztürk Er, Elif; Bakırdere, Sezgin

    2017-12-01

    A highly sensitive and simple diode-array high-performance liquid chromatography and liquid chromatography with quadrupole time-of-flight tandem mass spectrometry method was developed for the simultaneous determination of niacin and pyridoxine in pharmaceutical drugs, tap water, and wastewater samples. To determine the in vivo behavior of niacin and pyridoxine, analytes were subjected to simulated gastric conditions. The calibration plots of both methods showed good linearity over a wide concentration range, with correlation coefficients close to 1.0 for both analytes. The limit of detection/limit of quantitation values for liquid chromatography quadrupole time-of-flight tandem mass spectrometry analysis were 1.98/6.59 and 1.3/4.4 μg/L for niacin and pyridoxine, respectively, while the limit of detection/limit of quantitation values for niacin and pyridoxine in high-performance liquid chromatography analysis were 3.7/12.3 and 5.7/18.9 μg/L, respectively. Recovery studies were also performed to show the applicability of the developed methods, and percentage recovery values were found to be 90-105% in tap water and 94-97% in wastewater for both analytes. The method was also successfully applied for the qualitative and quantitative determination of niacin and pyridoxine in drug samples. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
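    Detection and quantitation limits of the kind reported here are commonly estimated from calibration data; a minimal sketch of the standard ICH-style formulas follows (the abstract does not state which variant the authors used, so treat this as an assumption):

    ```python
    def lod_loq(sigma, slope):
        """Common ICH-style estimates from a calibration line:
        LOD = 3.3*sigma/slope and LOQ = 10*sigma/slope, where sigma is the
        standard deviation of the blank (or of the regression residuals)
        and slope is the calibration-curve slope."""
        return 3.3 * sigma / slope, 10.0 * sigma / slope

    # hypothetical inputs: LOD comes out near 1.65, LOQ at 5.0 (same units as sigma/slope)
    lod, loq = lod_loq(sigma=1.0, slope=2.0)
    ```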

  17. Physical fitness predicts technical-tactical and time-motion profile in simulated Judo and Brazilian Jiu-Jitsu matches.

    PubMed

    Coswig, Victor S; Gentil, Paulo; Bueno, João C A; Follmer, Bruno; Marques, Vitor A; Del Vecchio, Fabrício B

    2018-01-01

    Among combat sports, Judo and Brazilian Jiu-Jitsu (BJJ) present elevated physical fitness demands from the high-intensity intermittent efforts. However, information regarding how metabolic and neuromuscular physical fitness is associated with technical-tactical performance in Judo and BJJ fights is not available. This study aimed to relate indicators of physical fitness with combat performance variables in Judo and BJJ. The sample consisted of Judo (n = 16) and BJJ (n = 24) male athletes. At the first meeting, the physical tests were applied and, in the second, simulated fights were performed for later notational analysis. The main findings indicate: (i) high reproducibility of the proposed instrument and protocol used for notational analysis in a mobile device; (ii) differences in the technical-tactical and time-motion patterns between modalities; (iii) performance-related variables are different in Judo and BJJ; and (iv) regression models based on metabolic fitness variables may account for up to 53% of the variances in technical-tactical and/or time-motion variables in Judo and up to 31% in BJJ, whereas neuromuscular fitness models can reach values up to 44 and 73% of prediction in Judo and BJJ, respectively. When all components are combined, they can explain up to 90% of high intensity actions in Judo. In conclusion, performance prediction models in simulated combat indicate that anaerobic, aerobic and neuromuscular fitness variables contribute to explain time-motion variables associated with high intensity and technical-tactical variables in Judo and BJJ fights.

  18. Numerical Stability and Control Analysis Towards Falling-Leaf Prediction Capabilities of Splitflow for Two Generic High-Performance Aircraft Models

    NASA Technical Reports Server (NTRS)

    Charlton, Eric F.

    1998-01-01

    Aerodynamic analyses are performed using the Lockheed-Martin Tactical Aircraft Systems (LMTAS) Splitflow computational fluid dynamics code to investigate the computational prediction capabilities for vortex-dominated flow fields of two different tailless aircraft models at large angles of attack and sideslip. These computations are performed with the goal of providing useful stability and control data to designers of high-performance aircraft. Appropriate metrics for accuracy, time, and ease of use are determined in consultation with both the LMTAS Advanced Design and Stability and Control groups. Results are obtained and compared to wind-tunnel data for all six components of forces and moments. Moment data are combined to form a "falling leaf" stability analysis. Finally, a handful of viscous simulations were also performed to further investigate nonlinearities and possible viscous effects in the differences between the accumulated inviscid computational and experimental data.

  19. Advanced Video Analysis Needs for Human Performance Evaluation

    NASA Technical Reports Server (NTRS)

    Campbell, Paul D.

    1994-01-01

    Evaluators of human task performance in space missions make use of video as a primary source of data. Extraction of relevant human performance information from video is often a labor-intensive process requiring a large amount of time on the part of the evaluator. Based on the experiences of several human performance evaluators, needs were defined for advanced tools which could aid in the analysis of video data from space missions. Such tools should increase the efficiency with which useful information is retrieved from large quantities of raw video. They should also provide the evaluator with new analytical functions which are not present in currently used methods. Video analysis tools based on the needs defined by this study would also have uses in U.S. industry and education. Evaluation of human performance from video data can be a valuable technique in many industrial and institutional settings where humans are involved in operational systems and processes.

  20. Automatic Real Time Ionogram Scaler with True Height Analysis - Artist

    DTIC Science & Technology

    1983-07-01

    scaled. The corresponding autoscaled values were compared with the manually scaled h'F, h'F2, fminF, foE, foEs, h'E and h'Es. The ARTIST program... Scientific Report No. 7: Automatic Real Time Ionogram Scaler with True Height Analysis - ARTIST.

  1. Automated system for the on-line monitoring of powder blending processes using near-infrared spectroscopy. Part I. System development and control.

    PubMed

    Hailey, P A; Doherty, P; Tapsell, P; Oliver, T; Aldridge, P K

    1996-03-01

    An automated system for the on-line monitoring of powder blending processes is described. The system employs near-infrared (NIR) spectroscopy using fibre-optics and a graphical user interface (GUI) developed in the LabVIEW environment. The complete supervisory control and data analysis (SCADA) software controls blender and spectrophotometer operation and performs statistical spectral data analysis in real time. A data analysis routine using standard deviation is described to demonstrate an approach to the real-time determination of blend homogeneity.
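    The standard-deviation approach to blend homogeneity can be sketched as follows (a minimal illustration under an assumed data layout, not the system's actual SCADA code): successive NIR spectra are compared within a sliding block, and the profile settles toward a stable low value as the blend homogenizes.

    ```python
    from statistics import pstdev

    def moving_block_stdev(spectra, block=5):
        """For each sliding block of consecutive spectra, compute the
        per-wavelength standard deviation and average it across wavelengths.
        spectra: list of spectra, each a list of absorbances sampled at the
        same wavelengths (an assumed layout for this sketch)."""
        profile = []
        for i in range(len(spectra) - block + 1):
            window = spectra[i:i + block]
            per_wavelength = [pstdev(col) for col in zip(*window)]  # spread at each wavelength
            profile.append(sum(per_wavelength) / len(per_wavelength))
        return profile

    # six identical spectra: a perfectly homogeneous blend gives zero spread
    profile = moving_block_stdev([[1.0, 2.0, 3.0]] * 6, block=5)
    ```

    In practice the routine would run on-line as each new spectrum arrives, with a threshold on the profile defining the homogeneity endpoint.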

  2. FIELD QUALITY CONTROL STRATEGIES ASSESSING SOLIDIFICATION/STABILIZATION

    EPA Science Inventory

    Existing regulatory mobility reduction (leaching) tests are not amenable to real-time quality control because of the time required to perform sample extraction and chemical analysis. This is of concern because the leaching test is the most important parameter used to relate trea...

  3. Scalable Performance Measurement and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gamblin, Todd

    2009-01-01

    Concurrency levels in large-scale, distributed-memory supercomputers are rising exponentially. Modern machines may contain 100,000 or more microprocessor cores, and the largest of these, IBM's Blue Gene/L, contains over 200,000 cores. Future systems are expected to support millions of concurrent tasks. In this dissertation, we focus on efficient techniques for measuring and analyzing the performance of applications running on very large parallel machines. Tuning the performance of large-scale applications can be a subtle and time-consuming task because application developers must measure and interpret data from many independent processes. While the volume of the raw data scales linearly with the number of tasks in the running system, the number of tasks is growing exponentially, and data for even small systems quickly becomes unmanageable. Transporting performance data from so many processes over a network can perturb application performance and make measurements inaccurate, and storing such data would require a prohibitive amount of space. Moreover, even if it were stored, analyzing the data would be extremely time-consuming. In this dissertation, we present novel methods for reducing performance data volume. The first draws on multi-scale wavelet techniques from signal processing to compress systemwide, time-varying load-balance data. The second uses statistical sampling to select a small subset of running processes to generate low-volume traces. A third approach combines sampling and wavelet compression to stratify performance data adaptively at run-time and to reduce further the cost of sampled tracing. We have integrated these approaches into Libra, a toolset for scalable load-balance analysis. We present Libra and show how it can be used to analyze data from large scientific applications scalably.
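    The statistical-sampling idea can be illustrated with a back-of-the-envelope estimate of how many processes need tracing (a hypothetical sketch using a textbook sample-size formula with finite-population correction; Libra's actual estimator is not reproduced here):

    ```python
    import math

    def sample_size(n_procs, conf_z=1.96, rel_err=0.05, cv=0.5):
        """How many processes to trace so the sampled mean of a performance
        metric is within rel_err of the true mean at the given confidence,
        assuming coefficient of variation cv across processes (an assumed
        input; in practice it would be estimated from a pilot sample)."""
        n0 = (conf_z * cv / rel_err) ** 2          # infinite-population size
        n = n0 / (1 + n0 / n_procs)                # finite-population correction
        return min(n_procs, math.ceil(n))
    ```

    The key property is that the required sample grows toward a constant as the machine grows, which is why sampled tracing keeps data volume bounded even at hundreds of thousands of cores.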

  4. Effects of reward and punishment on task performance, mood and autonomic nervous function, and the interaction with personality.

    PubMed

    Sakuragi, Sokichi; Sugiyama, Yoshiki

    2009-06-01

    The effects of reward and punishment are different, and there are individual differences in sensitivity to reward and punishment. The purpose of this study was to investigate the effects of reward and punishment on task performance, mood, and autonomic nervous function, along with the interaction with personality. Twenty-one healthy female subjects volunteered for the experiment. Task performance was evaluated by the time required and total errors while performing a Wisconsin Card Sorting Test. We assessed personality using the Minnesota Multiphasic Personality Inventory (MMPI) questionnaire, and mood states using the Profile of Mood States. Autonomic nervous function was estimated by spectral analysis of heart rate variability, baroreflex sensitivity, and blood pressure. Repeated-measures analysis of variance (ANOVA) revealed a significant condition x time-course interaction on mood and autonomic nervous activity, indicating a less stressed state under the rewarding condition, but no significant condition x time-course interaction on task performance. The interactions with personality were further analyzed by repeated-measures ANOVA using the clinical scales of the MMPI as independent variables, and significant condition x time course x Pt (psychasthenia) interactions on task performance, mood, and blood pressure were revealed. That is, the high-Pt group, whose members tend to be sensitive and prone to worry, showed gradual improvement of task performance under the punishing situation with a slight increase in systolic blood pressure, but no improvement under the rewarding situation, in which the sense of fatigue was attenuated. In contrast, the low-Pt group, whose members tend to be adaptive and self-confident, showed gradual improvement under the rewarding situation. Therefore, the strategy of reward or punishment should be chosen carefully, considering the interaction with personality as well as the context in which it is given.

  5. Perturbation analysis of queueing systems with a time-varying arrival rate

    NASA Technical Reports Server (NTRS)

    Cassandras, Christos G.; Pan, Jie

    1991-01-01

    The authors consider an M/G/1 queueing system with a time-varying arrival rate. The objective is to obtain infinitesimal perturbation analysis (IPA) gradient estimates for various performance measures of interest with respect to certain system parameters. In particular, the authors consider the mean system time over n arrivals and an arrival rate alternating between two values. By choosing a convenient sample path representation of this system, they derive an unbiased IPA gradient estimator which, however, is not consistent, and they investigate the nature of this problem.

  6. Improved ultra-performance liquid chromatography with electrospray ionization quadrupole-time-of-flight high-definition mass spectrometry method for the rapid analysis of the chemical constituents of a typical medical formula: Liuwei Dihuang Wan.

    PubMed

    Wang, Ping; Lv, Hai tao; Zhang, Ai hua; Sun, Hui; Yan, Guang li; Han, Ying; Wu, Xiu hong; Wang, Xi jun

    2013-11-01

    Liuwei Dihuang Wan (LDW), a classic Chinese medicinal formula, has been used to improve or restore declined functions related to aging and geriatric diseases, such as impaired mobility, vision, hearing, cognition, and memory. It has attracted increasing attention as one of the most popular and valuable herbal medicines. However, the systematic analysis of the chemical constituents of LDW is difficult and thus has not been well established. In this paper, a rapid, sensitive, and reliable ultra-performance LC with ESI quadrupole TOF high-definition MS method with automated MetaboLynx analysis in positive and negative ion mode was established to characterize the chemical constituents of LDW. The analysis was performed on a Waters UPLC™ HSS T3 using a gradient elution system. MS/MS fragmentation behavior was proposed for aiding the structural identification of the components. Under the optimized conditions, a total of 50 peaks were tentatively characterized by comparing the retention time and MS data. It is concluded that a rapid and robust platform based on ultra-performance LC with ESI quadrupole TOF high-definition MS has been successfully developed for globally identifying multiple constituents of traditional Chinese medicine prescriptions. This is the first report on the systematic analysis of the chemical constituents of LDW. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Failure modes and effects analysis automation

    NASA Technical Reports Server (NTRS)

    Kamhieh, Cynthia H.; Cutts, Dannie E.; Purves, R. Byron

    1988-01-01

    A failure modes and effects analysis (FMEA) assistant was implemented as a knowledge based system and will be used during design of the Space Station to aid engineers in performing the complex task of tracking failures throughout the entire design effort. The three major directions in which automation was pursued were the clerical components of the FMEA process, the knowledge acquisition aspects of FMEA, and the failure propagation/analysis portions of the FMEA task. The system is accessible to design, safety, and reliability engineers at single user workstations and, although not designed to replace conventional FMEA, it is expected to decrease by many man years the time required to perform the analysis.

  8. Particle Analysis Pitfalls

    NASA Technical Reports Server (NTRS)

    Hughes, David; Dazzo, Tony

    2007-01-01

    This viewgraph presentation reviews the use of particle analysis to assist in preparing for the fourth Hubble Space Telescope (HST) servicing mission, during which the Space Telescope Imaging Spectrograph (STIS) will be repaired. The particle analysis consisted of finite element mesh creation; black-body viewfactors generated using I-DEAS TMG Thermal Analysis; grey-body viewfactors calculated using the Markov method; particle distribution modeled using a time-consuming iterative Monte Carlo process in the in-house software MASTRAM; differential analysis performed in Excel; and visualization provided by Tecplot and I-DEAS. Several tests were performed and are reviewed: the Conformal Coat Particle Study, Card Extraction Study, Cover Fastener Removal Particle Generation Study, and E-Graf Vibration Particulate Study. The lessons learned during this analysis are also reviewed.

  9. Simultaneous qualitative and quantitative analysis of flavonoids and alkaloids from the leaves of Nelumbo nucifera Gaertn. using high-performance liquid chromatography with quadrupole time-of-flight mass spectrometry.

    PubMed

    Guo, Yujie; Chen, Xi; Qi, Jin; Yu, Boyang

    2016-07-01

    A reliable method, combining qualitative analysis by high-performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry and quantitative assessment by high-performance liquid chromatography with photodiode array detection, has been developed to simultaneously analyze flavonoids and alkaloids in lotus leaf extracts. In the qualitative analysis, a total of 30 compounds, including 12 flavonoids, 16 alkaloids, and two proanthocyanidins, were identified. The fragmentation behaviors of four types of flavone glycoside and three types of alkaloid are summarized. The mass spectra of four representative components, quercetin 3-O-glucuronide, norcoclaurine, nuciferine, and neferine, are shown to illustrate their fragmentation pathways. Five pairs of isomers were detected and three of them were distinguished by comparing the elution order with reference substances and the mass spectrometry data with reported data. In the quantitative analysis, 30 lotus leaf samples from different regions were analyzed to investigate the proportion of eight representative compounds. Quercetin 3-O-glucuronide was found to be the predominant constituent of lotus leaf extracts. For further discrimination among the samples, hierarchical cluster analysis, and principal component analysis, based on the areas of the eight quantitative peaks, were carried out. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Examining the relationship between endogenous pain modulation capacity and endurance exercise performance.

    PubMed

    Flood, Andrew; Waddington, Gordon; Cathcart, Stuart

    2017-01-01

    The aim of the current study was to examine the relationship between pain modulatory capacity and endurance exercise performance. Twenty-seven recreationally active males between 18 and 35 years of age participated in the study. Pain modulation was assessed by examining the inhibitory effect of a noxious conditioning stimulus (cuff occlusion) on the perceived intensity of a second noxious stimulus (pressure pain threshold). Participants completed two maximal voluntary contractions followed by a submaximal endurance time task. Both performance tasks involved an isometric contraction of the non-dominant leg. The main analysis revealed a correlation between pain modulatory capacity and performance on the endurance time task (r = -.425, p = .027), such that those with elevated pain modulation produced longer endurance times. These findings are the first to demonstrate a relationship between pain modulation responses and endurance exercise performance.

  11. Dynamic CT myocardial perfusion imaging: performance of 3D semi-automated evaluation software.

    PubMed

    Ebersberger, Ullrich; Marcus, Roy P; Schoepf, U Joseph; Lo, Gladys G; Wang, Yining; Blanke, Philipp; Geyer, Lucas L; Gray, J Cranston; McQuiston, Andrew D; Cho, Young Jun; Scheuering, Michael; Canstein, Christian; Nikolaou, Konstantin; Hoffmann, Ellen; Bamberg, Fabian

    2014-01-01

    To evaluate the performance of three-dimensional semi-automated evaluation software for the assessment of myocardial blood flow (MBF) and blood volume (MBV) at dynamic myocardial perfusion computed tomography (CT). Volume-based software relying on marginal space learning and probabilistic boosting-tree-based contour fitting was applied to CT myocardial perfusion imaging data of 37 subjects. In addition, all image data were analysed manually and both approaches were compared with SPECT findings. Study endpoints included time of analysis and conventional measures of diagnostic accuracy. Of 592 analysable segments, 42 showed perfusion defects on SPECT. Average analysis times for the manual and software-based approaches were 49.1 ± 11.2 and 16.5 ± 3.7 min, respectively (P < 0.01). There was strong agreement between the two measures of interest (MBF, ICC = 0.91, and MBV, ICC = 0.88, both P < 0.01) and no significant difference in diagnostic accuracy between the manual and software-based approaches for either MBF or MBV (all comparisons P > 0.05). Three-dimensional semi-automated evaluation of dynamic myocardial perfusion CT data provides similar measures and diagnostic accuracy to manual evaluation, albeit with substantially reduced analysis times. This capability may aid the integration of this test into clinical workflows. • Myocardial perfusion CT is attractive for comprehensive coronary heart disease assessment. • Traditional image analysis methods are cumbersome and time-consuming. • Automated 3D perfusion software shortens analysis times. • Automated 3D perfusion software increases standardisation of myocardial perfusion CT. • Automated, standardised analysis fosters myocardial perfusion CT integration into clinical practice.

  12. High-frequency video capture and a computer program with frame-by-frame angle determination functionality as tools that support judging in artistic gymnastics.

    PubMed

    Omorczyk, Jarosław; Nosiadek, Leszek; Ambroży, Tadeusz; Nosiadek, Andrzej

    2015-01-01

The main aim of this study was to verify the usefulness of selected simple methods of recording and fast biomechanical analysis performed by judges of artistic gymnastics in assessing a gymnast's movement technique. The study participants comprised six artistic gymnastics judges, who assessed back handsprings using two methods: a real-time observation method and a frame-by-frame video analysis method. They also determined flexion angles of the knee and hip joints using a computer program. In the case of the real-time observation method, the judges gave a total of 5.8 error points with an arithmetic mean of 0.16 points for the flexion of the knee joints. In the high-speed video analysis method, the total amounted to 8.6 error points and the mean value amounted to 0.24 error points. For the excessive flexion of the hip joints, the sum of the error values was 2.2 error points and the arithmetic mean was 0.06 error points during real-time observation. The sum obtained using the frame-by-frame analysis method equaled 10.8 and the mean equaled 0.30 error points. Error values obtained through the frame-by-frame video analysis of movement technique were higher than those obtained through the real-time observation method. The judges were able to indicate, with good accuracy, the number of the frame in which the maximal joint flexion occurred. Neither the real-time observation method nor high-speed video analysis performed without exact angle determination was found to be a sufficient tool for assessing movement technique and improving the quality of judging.
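The frame-by-frame angle determination mentioned in the title reduces to basic vector geometry once joint landmarks are digitised from a video frame. A minimal sketch (the function name and coordinates are illustrative, not taken from the study's software): the angle at a joint is the angle at vertex b formed by three 2D marker points a, b, c.

```python
import math

def joint_angle(a, b, c):
    """Angle at vertex b (degrees) formed by points a-b-c,
    e.g. hip-knee-ankle markers digitised from one video frame."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    # Clamp to [-1, 1] to guard against floating-point drift
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# Fully extended knee: hip, knee and ankle collinear
print(joint_angle((0, 2), (0, 1), (0, 0)))  # 180.0
```

Repeating this per frame and taking the minimum angle locates the frame of maximal flexion that the judges identified.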

  13. Bayesian analyses of time-interval data for environmental radiation monitoring.

    PubMed

    Luo, Peng; Sharp, Julia L; DeVol, Timothy A

    2013-01-01

Time-interval (time difference between two consecutive pulses) analysis based on the principles of Bayesian inference was investigated for online radiation monitoring. Using experimental and simulated data, Bayesian analysis of time-interval data [Bayesian (ti)] was compared with Bayesian and a conventional frequentist analysis of counts in a fixed count time [Bayesian (cnt) and single interval test (SIT), respectively]. The performances of the three methods were compared in terms of average run length (ARL) and detection probability for several simulated detection scenarios. Experimental data were acquired with a DGF-4C system in list mode. Simulated data were obtained using Monte Carlo techniques to obtain a random sampling of the Poisson distribution. All statistical algorithms were developed using the R Project for statistical computing. Bayesian analysis of time-interval information provided a detection probability similar to that of Bayesian analysis of count information, but allowed a decision to be made with fewer pulses at relatively higher radiation levels. In addition, for cases in which the source was present only briefly (less than the count time), time-interval information is more sensitive to a change than count information, since the source signal is averaged with the background over the entire count time. The relationships of the source time, change points, and modifications to the Bayesian approach for increasing detection probability are presented.
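The core idea of the time-interval approach can be illustrated with a small sketch. For a constant-rate Poisson pulse process, inter-arrival times are exponential, so a conjugate Gamma prior on the rate yields a closed-form posterior that updates pulse by pulse. The rates, prior, and decision threshold below are hypothetical illustrations, not the study's values (the paper's algorithms were implemented in R; this is a Python analogue):

```python
import random

def simulate_intervals(rate, n, seed=0):
    """Draw n exponential inter-arrival times (s) for a Poisson
    pulse process with the given rate (counts per second)."""
    rng = random.Random(seed)
    return [rng.expovariate(rate) for _ in range(n)]

def posterior_rate(intervals, a=1.0, b=1.0):
    """Gamma(a, b) prior on the pulse rate; the exponential likelihood
    gives a Gamma(a + n, b + sum(t)) posterior. Returns the posterior
    mean rate after observing the intervals."""
    n = len(intervals)
    return (a + n) / (b + sum(intervals))

background = simulate_intervals(rate=5.0, n=50, seed=1)   # background only
with_source = simulate_intervals(rate=20.0, n=50, seed=2) # source present

# Flag "source present" when the posterior mean rate exceeds a threshold;
# the decision can be made after each pulse, without waiting for a
# fixed count time to elapse.
threshold = 10.0  # cps, illustrative
print(posterior_rate(background) > threshold)   # typically False here
print(posterior_rate(with_source) > threshold)  # typically True here
```

Because the posterior is updated per pulse, a high source rate can trip the threshold after only a handful of intervals, which is the "decision with fewer pulses" advantage the abstract describes.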

  14. Covert checks by standardised patients of general practitioners' delivery of new periodic health examinations: clustered cross-sectional study from a consumer organisation

    PubMed Central

    Thaler, Kylie; Harris, Mark F

    2012-01-01

Objective To assess whether data collected by a consumer organisation are valid for a health services research study on physicians' performance in preventive care, and to report first results of the analysis of physicians' performance, such as consultation time and guideline adherence in history taking. Design Secondary data analysis of a clustered cross-sectional direct observation survey. Setting General practitioners (GPs) in Vienna, Austria, visited unannounced by mystery shoppers (incognito standardised patients (ISPs)). Participants 21 randomly selected GPs were visited by two different ISPs each; 40 observation protocols were realised. Main outcome measures Robustness of sampling and data collection by the consumer organisation; GPs' consultation and waiting times; guideline adherence in history taking. Results The double stratified random sampling method was robust and representative for the private and contracted GP mix of Vienna. The clinical scenarios presented by the ISPs were valid and believable, and no GP realised the ISPs were not genuine patients. The average consultation time was 46 min (95% CI 37 to 54 min). Waiting times differed more than consultation times between private and contracted GPs. No differences between private and contracted GPs were found in adherence to the evidence-based guidelines regarding history taking, including questions on alcohol use. According to the analysis, 20% of the GPs took a perfect history (95% CI 9% to 39%). Conclusions The analysis of secondary data collected by a consumer organisation was a valid method for drawing conclusions about GPs' preventive practice. Initial results, such as longer-than-anticipated consultation times and the moderate quality of history taking, encourage continuing the analysis of the available clinical data. PMID:22872721

  15. Hardware accelerated high performance neutron transport computation based on AGENT methodology

    NASA Astrophysics Data System (ADS)

    Xiao, Shanjie

The spatial heterogeneity of next-generation Gen-IV nuclear reactor core designs brings challenges to neutron transport analysis. The Arbitrary Geometry Neutron Transport (AGENT) code is a three-dimensional neutron transport analysis code being developed at the Laboratory for Neutronics and Geometry Computation (NEGE) at Purdue University. It can accurately describe spatial heterogeneity in a hierarchical structure through the R-function solid modeler. The previous version of AGENT coupled the 2D transport MOC solver and the 1D diffusion NEM solver to solve the three-dimensional Boltzmann transport equation. In this research, the 2D/1D coupling methodology was expanded to couple two transport solvers, the radial 2D MOC solver and the axial 1D MOC solver, for better accuracy. The expansion was benchmarked with the widely applied C5G7 benchmark models and two fast breeder reactor models, and showed good agreement with the reference Monte Carlo results. In practice, accurate neutron transport analysis for a full reactor core is still time-consuming, which limits its application. Therefore, the second part of this research focused on designing specific hardware, based on reconfigurable computing techniques, to accelerate AGENT computations. This is the first application of this technique to reactor physics and neutron transport for reactor design. The most time-consuming part of the AGENT algorithm was identified, and the architecture of the AGENT acceleration system was designed based on this analysis. Through parallel computation on the specially designed, highly efficient architecture, the FPGA-based acceleration design achieves high performance at a much lower clock frequency than CPUs. Whole-design simulations show that the acceleration design would be able to speed up large-scale AGENT computations by about 20 times. The high-performance AGENT acceleration system will drastically shorten the computation time for 3D full-core neutron transport analysis, making the AGENT methodology unique and advantageous, and thus opens the possibility of extending the application range of neutron transport analysis in both industrial engineering and academic research.

  16. High School Start Times and the Impact on High School Students: What We Know, and What We Hope to Learn.

    PubMed

    Morgenthaler, Timothy I; Hashmi, Sarah; Croft, Janet B; Dort, Leslie; Heald, Jonathan L; Mullington, Janet

    2016-12-15

Several organizations have provided recommendations to ensure that high school classes start no earlier than 08:30. However, although there are plausible biological reasons to support such recommendations, published recommendations have been based largely on expert opinion and a few observational studies. We sought to perform a critical review of published evidence regarding the effect of high school start times on sleep and other relevant outcomes. We performed a broad literature search to identify 287 candidate publications for inclusion in our review, which focused on studies offering direct comparison of sleep time, academic or physical performance, behavioral health measures, or motor vehicle accidents in high school students. Where possible, outcomes were combined for meta-analysis. After application of study criteria, only 18 studies were suitable for review. Eight studies were amenable to meta-analysis for some outcomes. We found that later school start times, particularly when compared with start times more than 60 min earlier, are associated with longer weekday sleep durations, smaller weekday-weekend sleep duration differences, reduced vehicle accident rates, and reduced subjective daytime sleepiness. Improvement in academic performance and behavioral issues is less well established. The literature regarding the effect of school start time delays on important aspects of high school life suggests some salutary effects, but often the evidence is indirect, imprecise, or derived from cohorts of convenience, making the overall quality of evidence weak or very weak. This review highlights a need for higher-quality data upon which to base important and complex public health decisions. © 2016 American Academy of Sleep Medicine

  17. Impact of Pilates Exercise in Multiple Sclerosis: A Randomized Controlled Trial.

    PubMed

    Duff, Whitney R D; Andrushko, Justin W; Renshaw, Doug W; Chilibeck, Philip D; Farthing, Jonathan P; Danielson, Jana; Evans, Charity D

    2018-01-01

Pilates is a series of exercises based on whole-body movement and may improve mobility in people with multiple sclerosis (MS). The purpose of this study was to determine the effect of Pilates on walking performance in people with MS. Thirty individuals with MS who were not restricted to a wheelchair or scooter (Patient-Determined Disease Steps scale score <7) were randomized to receive Pilates (twice weekly) and massage therapy (once weekly) or once-weekly massage therapy only (control group). The Pilates program was delivered in a group setting (five to ten participants per session). The primary outcome was change in walking performance (6-Minute Walk Test) after 12 weeks. Secondary outcomes included functional ability (Timed Up and Go test), balance (Fullerton Advanced Balance Scale), flexibility (sit and reach test), body composition (dual-energy X-ray absorptiometry), core endurance (plank-hold test), and muscle strength and voluntary activation (quadriceps). Intention-to-treat analysis was performed using a two-factor repeated-measures analysis of variance. Walking distance increased by a mean (SD) of 52.4 (40.2) m in the Pilates group versus 15.0 (34.1) m in the control group (group × time, P = .01). Mean (SD) time to complete the Timed Up and Go test decreased by 1.5 (2.8) seconds in the Pilates group versus an increase of 0.3 (0.9) seconds in the control group (group × time, P = .03). There were no other significant differences between groups over time. Pilates improved walking performance and functional ability in persons with MS and is a viable exercise option to help manage the disease.

  18. Cathodal transcranial direct-current stimulation over right posterior parietal cortex enhances human temporal discrimination ability.

    PubMed

    Oyama, Fuyuki; Ishibashi, Keita; Iwanaga, Koichi

    2017-12-04

Time perception associated with durations from 1 s to several minutes involves activity in the right posterior parietal cortex (rPPC). It is unclear whether altering the activity of the rPPC affects an individual's timing performance. Here, we investigated human timing performance under the application of transcranial direct-current stimulation (tDCS) that altered the neural activity of the rPPC. We measured the participants' duration-discrimination threshold by administering a behavioral task during the tDCS application. The tDCS conditions consisted of anodal, cathodal, and sham conditions. The electrodes were placed over the P4 position (10-20 system) and on the left supraorbital forehead. On each task trial, the participant observed two visual stimuli and indicated which was longer. The amount of difference between the two stimulus durations was varied repeatedly throughout the trials according to the participant's responses. The correct answer rate was calculated for each amount of difference, and the minimum amount with a correct answer rate exceeding 75% was selected as the threshold. The data were analyzed by a linear mixed-effects model procedure. Nineteen volunteers participated in the experiment. We excluded three participants from the analysis: two who reported extreme sleepiness while performing the task and one who could recognize the sham condition correctly with confidence. Our analysis of the 16 participants' data showed that the average threshold observed under the cathodal condition was lower than that of the sham condition. This suggests that inhibition of the rPPC leads to an improvement in temporal discrimination performance, resulting in improved timing performance. In the present study, we found a novel effect: cathodal tDCS over the rPPC enhances temporal discrimination performance. In terms of the existence of anodal/cathodal tDCS effects on human timing performance, the results were consistent with a previous study that investigated temporal reproduction performance during tDCS application. However, the results of the current study further indicated that cathodal tDCS over the rPPC increases the accuracy of perceived duration rather than inducing the overestimation reported previously.
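The adaptive procedure described (varying the duration difference trial by trial according to responses to find the 75%-correct point) is a classic staircase design. A minimal sketch, using a simulated observer with a simple linear psychometric function; all parameters and the 3-down/1-up rule are hypothetical illustrations, not the study's exact protocol (a 3-down/1-up rule converges near the ~79%-correct level, close to the 75% criterion):

```python
import random

def run_staircase(true_threshold, start=200.0, step=10.0, trials=200, seed=0):
    """Hypothetical 3-down/1-up staircase: the duration difference (ms)
    decreases after 3 consecutive correct responses and increases after
    each error. Returns the final difference as a crude threshold
    estimate (real analyses usually average staircase reversals)."""
    rng = random.Random(seed)
    diff, correct_run = start, 0
    for _ in range(trials):
        # Simulated observer: chance (0.5) at zero difference, rising
        # linearly to certainty at twice the true threshold.
        p_correct = 0.5 + 0.5 * min(diff / (2 * true_threshold), 1.0)
        if rng.random() < p_correct:
            correct_run += 1
            if correct_run == 3:
                diff = max(diff - step, step)  # never below one step
                correct_run = 0
        else:
            diff += step
            correct_run = 0
    return diff

estimate = run_staircase(true_threshold=100.0, seed=3)
print(estimate)  # hovers in the vicinity of the ~79%-correct difference
```

The staircase spends most trials near the convergence point, which is why such procedures estimate a threshold with far fewer trials than testing every difference level equally often.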

  19. Wear behavior of AA 5083/SiC nano-particle metal matrix composite: Statistical analysis

    NASA Astrophysics Data System (ADS)

    Hussain Idrisi, Amir; Ismail Mourad, Abdel-Hamid; Thekkuden, Dinu Thomas; Christy, John Victor

    2018-03-01

This paper reports a statistical analysis of the wear characteristics of AA5083/SiC nanocomposite. Aluminum matrix composites with different weight percentages (0%, 1% and 2%) of SiC nanoparticles were fabricated using the stir casting route. The developed composites were used to manufacture the spur gears on which the study was conducted. A specially designed test rig was used to test the wear performance of the gears. Wear was investigated under different conditions of applied load (10 N, 20 N, and 30 N) and operation time (30, 60, 90, and 120 min). The analysis was carried out at room temperature at a constant speed of 1450 rpm. The wear parameters were optimized using Taguchi's method; during this statistical approach, an L27 orthogonal array was selected for the analysis of the output. Furthermore, analysis of variance (ANOVA) was used to investigate the influence of applied load, operation time and SiC wt % on wear behaviour. Wear resistance was analyzed by selecting the "smaller is better" characteristic as the objective of the model. From this research, it is observed that operation time and SiC wt % have the most significant effect on wear performance, followed by the applied load.
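The "smaller is better" objective used in the Taguchi analysis has a standard closed form: S/N = -10·log10(mean of y²), so less wear gives a higher (better) signal-to-noise ratio. A minimal sketch with hypothetical wear-loss replicates (the numbers are illustrative, not the paper's measurements):

```python
import math

def sn_smaller_is_better(values):
    """Taguchi signal-to-noise ratio for a 'smaller is better'
    response such as wear loss: -10 * log10(mean of squared values)."""
    return -10.0 * math.log10(sum(v * v for v in values) / len(values))

# Hypothetical wear-loss replicates (mg) at two SiC contents
wear_0pct_sic = [4.2, 4.5, 4.1]
wear_2pct_sic = [2.1, 2.3, 2.0]

print(sn_smaller_is_better(wear_0pct_sic))  # lower S/N: more wear
print(sn_smaller_is_better(wear_2pct_sic))  # higher S/N: less wear
```

In a full Taguchi study, the S/N ratio is computed for each row of the L27 array and averaged per factor level; the level with the highest mean S/N is the optimum, and ANOVA then apportions each factor's contribution.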

  20. A New Strategy for Analyzing Time-Series Data Using Dynamic Networks: Identifying Prospective Biomarkers of Hepatocellular Carcinoma.

    PubMed

    Huang, Xin; Zeng, Jun; Zhou, Lina; Hu, Chunxiu; Yin, Peiyuan; Lin, Xiaohui

    2016-08-31

    Time-series metabolomics studies can provide insight into the dynamics of disease development and facilitate the discovery of prospective biomarkers. To improve the performance of early risk identification, a new strategy for analyzing time-series data based on dynamic networks (ATSD-DN) in a systematic time dimension is proposed. In ATSD-DN, the non-overlapping ratio was applied to measure the changes in feature ratios during the process of disease development and to construct dynamic networks. Dynamic concentration analysis and network topological structure analysis were performed to extract early warning information. This strategy was applied to the study of time-series lipidomics data from a stepwise hepatocarcinogenesis rat model. A ratio of lyso-phosphatidylcholine (LPC) 18:1/free fatty acid (FFA) 20:5 was identified as the potential biomarker for hepatocellular carcinoma (HCC). It can be used to classify HCC and non-HCC rats, and the area under the curve values in the discovery and external validation sets were 0.980 and 0.972, respectively. This strategy was also compared with a weighted relative difference accumulation algorithm (wRDA), multivariate empirical Bayes statistics (MEBA) and support vector machine-recursive feature elimination (SVM-RFE). The better performance of ATSD-DN suggests its potential for a more complete presentation of time-series changes and effective extraction of early warning information.
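The AUC values reported for the LPC 18:1/FFA 20:5 ratio (0.980 discovery, 0.972 validation) have a direct probabilistic reading: AUC equals the Mann-Whitney probability that a randomly chosen positive case scores higher than a randomly chosen negative one. A minimal sketch with hypothetical ratio values (not the study's data):

```python
def auc(pos_scores, neg_scores):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the fraction of positive/negative pairs in which the positive
    scores higher (ties count as half a win)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical LPC 18:1 / FFA 20:5 ratios for HCC vs non-HCC animals
hcc = [2.8, 3.1, 2.5, 3.4]
non_hcc = [1.2, 1.9, 2.6, 1.5]
print(auc(hcc, non_hcc))  # 0.9375
```

An AUC of 1.0 would mean the ratio perfectly separates the two groups at some cutoff; values near 0.98, as reported, indicate almost complete separation.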
