Science.gov

Sample records for performance analysis 1994-2002

  1. [Characteristics of malaria cases diagnosed in Edirne province between 1994-2002].

    PubMed

    Ay, Gazanfer; Gürcan, Saban; Tatman Otkun, Müşerref; Tuğrul, Murat; Otkun, Metin

    2004-01-01

    In this study, the epidemiological characteristics of malaria cases in Edirne province were investigated. Between 1994 and 2002, a total of 317,087 blood samples were collected by the medical staff of the Malaria Control Department and Health Centers to screen for the presence of Plasmodium; samples were obtained from soldiers in the province by selective active surveillance and from the resident population by active or passive surveillance. Plasmodium spp. were detected in 281 samples, and the characteristics of these malaria cases were investigated. Of the cases, 238 (84.7%) were detected in the first three years, mostly in September. Indigenous cases were detected in districts where rice is planted intensively, whereas imported cases were detected in districts with large military populations. Of the imported cases, 62% originated from Diyarbakir, Batman, and Sanliurfa provinces in southeastern Turkey. P. vivax was the causative agent in all blood samples except one case of P. ovale; this case, a student from Afghanistan, remains the only P. ovale case reported in Turkey to date. Emphasizing mosquito control in intensively rice-planted districts and strictly monitoring military staff, particularly those arriving from Southeastern Anatolia, have led to successful control of malaria in the Edirne region.

  2. A probable extralimital postbreeding assembly of bufflehead Bucephala albeola in southcentral North Dakota, USA, 1994-2002

    USGS Publications Warehouse

    Igl, L.D.

    2003-01-01

    The Bufflehead Bucephala albeola breeds predominantly in Canada and Alaska (USA). Evidence suggests that the species may have recently expanded its breeding range southward into central and south-central North Dakota. This paper presents data on observations of Buffleheads during the breeding season in Kidder County, North Dakota, 1994-2002, and discusses the possibility that the species has not expanded its breeding range but rather has established an extralimital post-breeding staging area south of its typical breeding range.

  3. Internet Access in U.S. Public Schools and Classrooms: 1994-2002. E.D. Tabs.

    ERIC Educational Resources Information Center

    Kleiner, Anne; Lewis, Laurie

    This report presents data on Internet access in U.S. public schools from 1994 to 2002 by school characteristics. It provides trend analysis on the progress of public schools and classrooms in connecting to the Internet and on the ratio of students to instructional computers with Internet access. For the year 2002, this report also presents data on…

  4. Performance Measurement Analysis System

    1989-06-01

    The PMAS4.0 (Performance Measurement Analysis System) is a user-oriented system designed to track the cost and schedule performance of Department of Energy (DOE) major projects (MPs) and major system acquisitions (MSAs) reporting under DOE Order 5700.4A, Project Management System. PMAS4.0 provides for the analysis of performance measurement data produced from management control systems complying with the Federal Government's Cost and Schedule Control Systems Criteria.

  5. MIR Performance Analysis

    SciTech Connect

    Hazen, Damian; Hick, Jason

    2012-06-12

    We provide analysis of Oracle StorageTek T10000 Generation B (T10KB) Media Information Record (MIR) Performance Data gathered over the course of a year from our production High Performance Storage System (HPSS). The analysis shows information in the MIR may be used to improve tape subsystem operations. Most notably, we found the MIR information to be helpful in determining whether the drive or tape was most suspect given a read or write error, and for helping identify which tapes should not be reused given their history of read or write errors. We also explored using the MIR Assisted Search to order file retrieval requests. We found that MIR Assisted Search may be used to reduce the time needed to retrieve collections of files from a tape volume.

  6. Dependability and performability analysis

    NASA Technical Reports Server (NTRS)

    Trivedi, Kishor S.; Ciardo, Gianfranco; Malhotra, Manish; Sahner, Robin A.

    1993-01-01

    Several practical issues regarding the specification and solution of dependability and performability models are discussed. Model types with and without rewards are compared: continuous-time Markov chains (CTMCs) are compared with (continuous-time) Markov reward models (MRMs), and generalized stochastic Petri nets (GSPNs) are compared with stochastic reward nets (SRNs). It is shown that reward-based models can lead to more concise model specifications and to the solution of a variety of new measures. With respect to the solution of dependability and performability models, three practical issues are identified: largeness, stiffness, and non-exponentiality. A variety of approaches for dealing with them is discussed, including some of the latest research efforts.
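
    To make the reward-model idea concrete, here is a minimal sketch (not from the paper; the rates and reward values are invented) that computes the steady-state expected reward rate of a two-state CTMC availability model:

    ```python
    # Toy Markov reward model: steady-state expected reward rate of a
    # two-state availability model. Rates and rewards are illustrative only.
    import numpy as np

    lam, mu = 1e-3, 1e-1           # hypothetical failure and repair rates (per hour)
    Q = np.array([[-lam, lam],     # generator matrix: state 0 = up, state 1 = down
                  [mu, -mu]])
    reward = np.array([1.0, 0.0])  # reward rate 1 while up, 0 while down

    # Solve pi Q = 0 subject to sum(pi) = 1 (least squares on the stacked system).
    A = np.vstack([Q.T, np.ones(2)])
    b = np.array([0.0, 0.0, 1.0])
    pi = np.linalg.lstsq(A, b, rcond=None)[0]

    print(f"steady-state availability / expected reward rate: {pi @ reward:.6f}")
    ```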

  7. Analysis of EDP performance

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The objective of this contract was the investigation of the potential performance gains that would result from an upgrade of the Space Station Freedom (SSF) Data Management System (DMS) Embedded Data Processor (EDP) '386' design with the Intel Pentium (registered trademark of Intel Corp.) '586' microprocessor. The Pentium ('586') is the latest member of the industry-standard Intel X86 family of CISC (Complex Instruction Set Computer) microprocessors. This contract was scheduled to run in parallel with an internal IBM Federal Systems Company (FSC) Internal Research and Development (IR&D) task whose goal was to generate a baseline flight design for an upgraded EDP using the Pentium. This final report summarizes the activities performed in support of Contract NAS2-13758. Our plan was to perform baseline performance analyses and measurements on the latest state-of-the-art commercially available Pentium processor, representative of the proposed space station design, and then to phase over to an IBM capital-funded breadboard version of the flight design (if available from IR&D and Space Station work) for additional evaluation of results. Unfortunately, the phase-over to the flight design breadboard did not take place, since the IBM Data Management System (DMS) for the Space Station Freedom was terminated by NASA before the referenced capital-funded EDP breadboard could be completed. The baseline performance analyses and measurements, however, were successfully completed, as planned, on the commercial Pentium hardware. The results of those analyses, evaluations, and measurements are presented in this final report.

  8. MPQC: Performance Analysis and Optimization

    SciTech Connect

    Sarje, Abhinav; Williams, Samuel; Bailey, David

    2012-11-30

    MPQC (Massively Parallel Quantum Chemistry) is a widely used computational quantum chemistry code, capable of performing a number of computations commonly occurring in quantum chemistry. To achieve better performance of MPQC, in this report we present a detailed performance analysis of the code. We then perform loop and memory-access optimizations and measure the resulting improvements by comparing the performance of the optimized code with that of the original MPQC code. We observe that the optimized MPQC code achieves a significant performance improvement through better utilization of vector processing and memory hierarchies.

  9. Lidar performance analysis

    NASA Technical Reports Server (NTRS)

    Spiers, Gary D.

    1994-01-01

    Section 1 details the theory used to build the lidar model, provides results of using the model to evaluate AEOLUS instrument designs, and provides snapshots of the visual appearance of the coded model. Appendix A contains a Fortran program to calculate various forms of the refractive index structure function. This program was used to determine the refractive index structure function used in the main lidar simulation code. Appendix B contains a memo on the optimization of the lidar telescope geometry for a line-scan geometry. Appendix C contains the code for the main lidar simulation and brief instructions on running the code. Appendix D contains a Fortran code to calculate the maximum permissible exposure for the eye from the ANSI Z136.1-1992 eye safety standards. Appendix E contains a paper on the eye safety analysis of a space-based coherent lidar presented at the 7th Coherent Laser Radar Applications and Technology Conference, Paris, France, 19-23 July 1993.

  10. Techniques for Automated Performance Analysis

    SciTech Connect

    Marcus, Ryan C.

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  11. Scalable Performance Measurement and Analysis

    SciTech Connect

    Gamblin, Todd

    2009-01-01

    Concurrency levels in large-scale, distributed-memory supercomputers are rising exponentially. Modern machines may contain 100,000 or more microprocessor cores, and the largest of these, IBM's Blue Gene/L, contains over 200,000 cores. Future systems are expected to support millions of concurrent tasks. In this dissertation, we focus on efficient techniques for measuring and analyzing the performance of applications running on very large parallel machines. Tuning the performance of large-scale applications can be a subtle and time-consuming task because application developers must measure and interpret data from many independent processes. While the volume of the raw data scales linearly with the number of tasks in the running system, the number of tasks is growing exponentially, and data for even small systems quickly becomes unmanageable. Transporting performance data from so many processes over a network can perturb application performance and make measurements inaccurate, and storing such data would require a prohibitive amount of space. Moreover, even if it were stored, analyzing the data would be extremely time-consuming. In this dissertation, we present novel methods for reducing performance data volume. The first draws on multi-scale wavelet techniques from signal processing to compress systemwide, time-varying load-balance data. The second uses statistical sampling to select a small subset of running processes to generate low-volume traces. A third approach combines sampling and wavelet compression to stratify performance data adaptively at run-time and to reduce further the cost of sampled tracing. We have integrated these approaches into Libra, a toolset for scalable load-balance analysis. We present Libra and show how it can be used to analyze data from large scientific applications scalably.
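
    As a rough sketch of the wavelet-compression idea described above (this is not Libra's code; the wavelet choice, threshold rule, and synthetic data are illustrative assumptions), a per-task load-balance profile can be decomposed, thresholded, and reconstructed:

    ```python
    # Illustrative wavelet compression of a load-balance profile, in the
    # spirit of the approach described above (not the Libra implementation).
    import numpy as np
    import pywt

    def compress(profile, wavelet="db4", keep=0.1):
        """Keep only the largest `keep` fraction of wavelet coefficients."""
        coeffs = pywt.wavedec(profile, wavelet)
        flat = np.concatenate([c.ravel() for c in coeffs])
        cutoff = np.quantile(np.abs(flat), 1.0 - keep)  # drop small coefficients
        return [pywt.threshold(c, cutoff, mode="hard") for c in coeffs]

    def reconstruct(coeffs, wavelet="db4"):
        return pywt.waverec(coeffs, wavelet)

    # Synthetic per-task load measurements for one timestep.
    rng = np.random.default_rng(0)
    load = np.sin(np.linspace(0, 4 * np.pi, 1024)) + 0.05 * rng.standard_normal(1024)
    approx = reconstruct(compress(load))[: load.size]
    print("max reconstruction error:", np.max(np.abs(load - approx)))
    ```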

  12. Software Performs Complex Design Analysis

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and the prediction of failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: 1) inadequate shape parameterization algorithms, and 2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its revolutionary, one-of-a-kind arbitrary shape deformation (ASD) capability, a major advancement in solving these two aforementioned problems, to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing the CFD designers to freely create their own shape parameters, therefore eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation. This eliminates the extremely costly process of having to remesh the grid for every shape change desired. The program can perform a design change in a markedly reduced amount of time, a process that would traditionally involve the designer returning to the CAD model to reshape and then remesh the shapes, something that has been known to take hours, days, even weeks or months, depending upon the size of the model.

  13. Stage Separation Performance Analysis Project

    NASA Technical Reports Server (NTRS)

    Chen, Yen-Sen; Zhang, Sijun; Liu, Jiwen; Wang, Ten-See

    2001-01-01

    Stage separation is an important phenomenon in multi-stage launch vehicle operation. The transient flowfield coupled with the multi-body system is a challenging design-analysis problem. The thermodynamic environment produced by burning propellants during upper-stage engine start in the separation process adds to the complexity of the entire system. Understanding the underlying flow physics and vehicle dynamics during stage separation is required to design a multi-stage launch vehicle with good flight performance. A computational fluid dynamics model with the capability to couple transient multi-body dynamics will be a useful tool for simulating the effects of the transient flowfield, plume/jet heating, and vehicle dynamics. A computational model using a generalized mesh system will be used as the basis of this development. The multi-body dynamics will be solved by integrating a system of six-degree-of-freedom equations of motion with high accuracy. Multi-body mesh systems and their interactions will be modeled using parallel computing algorithms. An adaptive mesh refinement method will also be employed to enhance solution accuracy in the transient process.
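
    As a minimal sketch of the high-accuracy integration step mentioned above, the fixed-step RK4 scheme below advances a generic state vector; the dynamics function is a simple translational placeholder, not the project's six-degree-of-freedom vehicle model:

    ```python
    # Generic fixed-step RK4 integrator of the kind used to advance equations
    # of motion. The dynamics here are a simple translational placeholder.
    import numpy as np

    def rk4_step(f, t, y, dt):
        k1 = f(t, y)
        k2 = f(t + dt / 2, y + dt / 2 * k1)
        k3 = f(t + dt / 2, y + dt / 2 * k2)
        k4 = f(t + dt, y + dt * k3)
        return y + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

    def dynamics(t, y):
        # y = [position (3), velocity (3)]; constant thrust minus gravity.
        accel = np.array([0.0, 0.0, 30.0 - 9.81])
        return np.concatenate([y[3:], accel])

    y = np.zeros(6)
    for step in range(100):          # 100 steps of 10 ms = 1 s of flight
        y = rk4_step(dynamics, step * 0.01, y, 0.01)
    print("position after 1 s:", y[:3])
    ```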

  14. SEP thrust subsystem performance sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Atkins, K. L.; Sauer, C. G., Jr.; Kerrisk, D. J.

    1973-01-01

    This is a two-part report on solar electric propulsion (SEP) performance sensitivity analysis. The first part describes the preliminary analysis of the SEP thrust system performance for an Encke rendezvous mission. A detailed description of thrust subsystem hardware tolerances on mission performance is included together with nominal spacecraft parameters based on these tolerances. The second part describes the method of analysis and graphical techniques used in generating the data for Part 1. Included is a description of both the trajectory program used and the additional software developed for this analysis. Part 2 also includes a comprehensive description of the use of the graphical techniques employed in this performance analysis.

  15. Analysis of Costs and Performance

    ERIC Educational Resources Information Center

    Duchesne, Roderick M.

    1973-01-01

    This article outlines a library management information system concerned with total library costs and performance. The system is essentially an adaptation of well-proven industrial and commercial management accounting techniques to the library context. (24 references) (Author)

  16. Guided wave tomography performance analysis

    NASA Astrophysics Data System (ADS)

    Huthwaite, Peter; Lowe, Michael; Cawley, Peter

    2016-02-01

    Quantifying wall loss caused by corrosion is a significant challenge for the petrochemical industry. Corrosion commonly occurs at pipe supports, where surface access for inspection is limited. Guided wave tomography is pursued as a solution to this: guided waves are transmitted through the region of interest from an array, and tomographic reconstruction techniques are applied to the measured signals in order to produce a map of thickness. There are many parameters in the system which can affect the performance; this paper investigates how the accuracy varies as defect width and depth, operating frequency and guided wave mode are all changed. For the S0 mode, the best performance was seen around 170 kHz on the 10 mm plate, with poor performance seen at almost all other frequencies. A0 showed better performance across a broad range of frequencies, with resolution improving with frequency as the wavelength reduced. However, it was shown that the resolution limit did drop relative to the wavelength, limiting the performance at high frequencies slightly.

  17. Adaptive Optics Communications Performance Analysis

    NASA Technical Reports Server (NTRS)

    Srinivasan, M.; Vilnrotter, V.; Troy, M.; Wilson, K.

    2004-01-01

    The performance improvement obtained through the use of adaptive optics for deep-space communications in the presence of atmospheric turbulence is analyzed. Using simulated focal-plane signal-intensity distributions, uncoded pulse-position modulation (PPM) bit-error probabilities are calculated assuming the use of an adaptive focal-plane detector array as well as an adaptively sized single detector. It is demonstrated that current practical adaptive optics systems can yield performance gains over an uncompensated system ranging from approximately 1 dB to 6 dB depending upon the PPM order and background radiation level.
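
    For context on the uncoded PPM error probabilities discussed above, a simple photon-counting Monte Carlo can estimate the symbol error rate; the sketch below assumes ideal Poisson statistics with invented mean signal and background counts, not the simulated focal-plane distributions used in the study:

    ```python
    # Minimal Monte Carlo sketch for uncoded M-ary PPM symbol errors under
    # Poisson photon counting. Mean counts Ks (signal) and Kb (background)
    # are illustrative assumptions, not values from the study.
    import numpy as np

    def ppm_ser(M=16, Ks=10.0, Kb=1.0, trials=200_000, seed=1):
        rng = np.random.default_rng(seed)
        # Slot 0 carries the pulse; all M slots see background light.
        counts = rng.poisson(Kb, size=(trials, M))
        counts[:, 0] += rng.poisson(Ks, size=trials)
        # Symbol error if another slot has the largest count (argmax resolves
        # ties in favor of slot 0, which is slightly optimistic).
        errors = np.argmax(counts, axis=1) != 0
        return errors.mean()

    print(f"estimated PPM symbol error rate: {ppm_ser():.4f}")
    ```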

  18. Structural-Thermal-Optical-Performance (STOP) Analysis

    NASA Technical Reports Server (NTRS)

    Bolognese, Jeffrey; Irish, Sandra

    2015-01-01

    The presentation will be given at the 26th Annual Thermal Fluids Analysis Workshop (TFAWS 2015) hosted by the Goddard Space Flight Center (GSFC) Thermal Engineering Branch (Code 545). A STOP analysis is a multidiscipline analysis, consisting of Structural, Thermal and Optical Performance Analyses, that is performed for all space flight instruments and satellites. This course will explain the different parts of performing this analysis. The student will learn how to effectively interact with each discipline in order to accurately obtain the system analysis results.

  19. Performance Analysis of GYRO: A Tool Evaluation

    SciTech Connect

    Worley, P.; Roth, P.; Candy, J.; Shan, Hongzhang; Mahinthakumar,G.; Sreepathi, S.; Carrington, L.; Kaiser, T.; Snavely, A.; Reed, D.; Zhang, Y.; Huck, K.; Malony, A.; Shende, S.; Moore, S.; Wolf, F.

    2005-06-26

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wall clock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.
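
    As a minimal illustration of the "embedded wall clock timer" style of manual instrumentation described above (not GYRO's actual timers; the timed region and summary are placeholders), using mpi4py:

    ```python
    # Sketch of manual wall-clock instrumentation of an MPI code region,
    # with per-rank timings reduced to a min/max/mean summary on rank 0.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    t0 = MPI.Wtime()
    # ... region of interest, e.g., a halo exchange or collective ...
    comm.Barrier()
    elapsed = MPI.Wtime() - t0

    tmax = comm.reduce(elapsed, op=MPI.MAX, root=0)
    tmin = comm.reduce(elapsed, op=MPI.MIN, root=0)
    tsum = comm.reduce(elapsed, op=MPI.SUM, root=0)
    if comm.rank == 0:
        print(f"region time: min={tmin:.6f}s max={tmax:.6f}s "
              f"mean={tsum / comm.size:.6f}s")
    ```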

  20. Building America Performance Analysis Procedures: Revision 1

    SciTech Connect

    2004-06-01

    To measure progress toward multi-year research goals, cost and performance trade-offs are evaluated through a series of controlled field and laboratory experiments supported by energy analysis techniques using test data to calibrate simulation models.

  21. A Perspective on DSN System Performance Analysis

    NASA Technical Reports Server (NTRS)

    Pham, Timothy T.

    2006-01-01

    This paper discusses the performance analysis effort being carried out in the NASA Deep Space Network. The activity involves root cause analysis of failures and assessment of key performance metrics. The root cause analysis helps pinpoint the true cause of observed problems so that proper corrections can be made. The assessment currently focuses on three aspects: (1) data delivery metrics such as Quantity, Quality, Continuity, and Latency; (2) link-performance metrics such as antenna pointing, system noise temperature, Doppler noise, frequency and time synchronization, wide-area-network loading, and link-configuration setup time; and (3) reliability, maintainability, and availability metrics. The analysis establishes whether the current system is meeting its specifications and, if so, how much margin is available. The findings help identify weak points in the system and direct programmatic investment toward performance improvement.

  22. Paramedir: A Tool for Programmable Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers in order to be transferred to novice users.

  23. A Performance Approach to Job Analysis.

    ERIC Educational Resources Information Center

    Folsom, Al

    2001-01-01

    Discussion of performance technology and training evaluation focuses on a job analysis process in the Coast Guard. Topics include problems with low survey response rates; costs; the need for appropriate software; discussions with stakeholders and subject matter experts; and maximizing worthy performance. (LRW)

  24. Performance optimisations for distributed analysis in ALICE

    NASA Astrophysics Data System (ADS)

    Betev, L.; Gheata, A.; Gheata, M.; Grigoras, C.; Hristov, P.

    2014-06-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last two years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system currently serves a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlation or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system still has to be improved by a significant factor to satisfy the analysis needs. We have instrumented all analysis jobs with "sensors" collecting comprehensive monitoring information on the job running conditions and performance in order to identify bottlenecks in the data processing flow. These data are collected by the MonALISA-based ALICE Grid monitoring system and are used to steer and improve the job submission and management policy, to identify operational problems in real time and to perform automatic corrective actions. In parallel with an upgrade of our production system, we are aiming for low-level improvements related to data format, data management and merging of results to allow for better-performing ALICE analysis.

  25. Interfacing Computer Aided Parallelization and Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Biegel, Bryan A. (Technical Monitor)

    2003-01-01

    When porting sequential applications to parallel computer architectures, the program developer will typically go through several cycles of source code optimization and performance analysis. We have started a project to develop an environment where the user can jointly navigate through program structure and performance data information in order to make efficient optimization decisions. In a prototype implementation we have interfaced the CAPO computer aided parallelization tool with the Paraver performance analysis tool. We describe both tools and their interface and give an example for how the interface helps within the program development cycle of a benchmark code.

  26. Integrating performance data collection, analysis, and visualization

    NASA Technical Reports Server (NTRS)

    Malony, Allen D.; Reed, Daniel A.; Rudolph, David C.

    1990-01-01

    An integrated data collection, analysis, and visualization environment is described for a specific parallel system, the Intel iPSC/2 hypercube. The data collection components of the environment encompass software event tracing at the operating system and program levels, together with a hardware-based performance monitoring system used to capture software events. A visualization system based on the X-window environment permits dynamic display and reduction of performance data. Such an integrated data collection, analysis, and visualization environment makes it possible to assess the effects of architectural and system software variations.

  27. Comparative performance analysis of mobile displays

    NASA Astrophysics Data System (ADS)

    Safaee-Rad, Reza; Aleksic, Milivoje

    2012-01-01

    Cell-phone display performance (in terms of color quality and optical efficiency) has become a critical factor in creating a positive user experience. As a result, there is a significant amount of effort by cell-phone OEMs to provide a more competitive display solution. This effort is focused on using different display technologies (with significantly different color characteristics) and more sophisticated display processors. In this paper, the results of a mobile-display comparative performance analysis are presented. Three cell-phones from major OEMs are selected and their display performances are measured and quantified. Comparative performance analysis is done using display characteristics such as display color gamut size, RGB-channels crosstalk, RGB tone responses, gray tracking performance, color accuracy, and optical efficiency.

  28. Shuttle/TDRSS communications system performance analysis

    NASA Technical Reports Server (NTRS)

    Braun, W. R.

    1980-01-01

    The results of the performance analysis performed on the Shuttle/Tracking and Data Relay Satellite System (TDRSS) communications system are presented. The existing Shuttle/TDRSS link simulation programs were modified and refined to model the post-radio-frequency-interference TDRS hardware and to evaluate the performance degradation due to RFI effects. The refined link models were then used to determine, evaluate, and assess the expected S-band and Ku-band link performance. Parameterization results are presented for the ground station carrier and timing recovery circuits.

  29. Using Covariance Analysis to Assess Pointing Performance

    NASA Technical Reports Server (NTRS)

    Bayard, David; Kang, Bryan

    2009-01-01

    A Pointing Covariance Analysis Tool (PCAT) has been developed for evaluating the expected performance of the pointing control system for NASA's Space Interferometry Mission (SIM). The SIM pointing control system is very complex, consisting of multiple feedback and feedforward loops, and operating with multiple latencies and data rates. The SIM pointing problem is particularly challenging due to the effects of thermomechanical drifts in concert with the long camera exposures needed to image dim stars. Other pointing error sources include sensor noises, mechanical vibrations, and errors in the feedforward signals. PCAT models the effects of finite camera exposures and all other error sources using linear system elements. This allows the pointing analysis to be performed using linear covariance analysis. PCAT propagates the error covariance using a Lyapunov equation associated with time-varying discrete and continuous-time system matrices. Unlike Monte Carlo analysis, which could involve thousands of computational runs for a single assessment, the PCAT analysis performs the same assessment in a single run. This capability facilitates the analysis of parametric studies, design trades, and "what-if" scenarios for quickly evaluating and optimizing the control system architecture and design.
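
    The covariance-propagation step described above can be sketched in a few lines; the discrete-time update below uses a generic stable linear system with invented matrices, not SIM's actual pointing model:

    ```python
    # Sketch of discrete-time error-covariance propagation of the kind PCAT
    # performs: P_{k+1} = A P_k A^T + Q, in a single run with no Monte Carlo.
    # The system matrices are generic placeholders.
    import numpy as np

    A = np.array([[0.99, 0.10],
                  [0.00, 0.95]])   # hypothetical state-transition matrix
    Q = np.diag([1e-4, 1e-3])      # hypothetical process-noise covariance
    P = np.eye(2)                  # initial error covariance

    for _ in range(500):           # propagate to (approximate) steady state
        P = A @ P @ A.T + Q

    print("approx. steady-state error standard deviations:", np.sqrt(np.diag(P)))
    ```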

  30. Analysis of ultra-triathlon performances.

    PubMed

    Lepers, Romuald; Knechtle, Beat; Knechtle, Patrizia; Rosemann, Thomas

    2011-01-01

    Despite increased interest in ultra-endurance events, little research has examined ultra-triathlon performance. The aims of this study were: (i) to compare swimming, cycling, running, and overall performances in three ultra-distance triathlons, the double Ironman distance triathlon (2IMT) (7.6 km swimming, 360 km cycling, and 84.4 km running), the triple Ironman distance triathlon (3IMT) (11.4 km, 540 km, and 126.6 km), and the deca Ironman distance triathlon (10IMT) (38 km, 1800 km, and 420 km), and (ii) to examine the relationships between the 2IMT, 3IMT, and 10IMT performances to create predictive equations for 10IMT performance. Race results from 1985 through 2009 were examined to identify triathletes who had competed in all three of these ultra-distances. In total, 73 triathletes (68 men and 5 women) were identified. The contribution of swimming to overall ultra-triathlon performance was lower than that of cycling and running. Running performance was more important to overall performance for the 2IMT and 3IMT than for the 10IMT. The 2IMT and 3IMT performances were significantly correlated with 10IMT performances for swimming and cycling, but not for running. 10IMT total time might be predicted by the following equation: 10IMT race time (minutes) = 5885 + 3.69 × 3IMT race time (minutes). This analysis of human performance during ultra-distance triathlons represents a unique data set in the field of ultra-endurance events. Additional studies are required to determine the physiological and psychological factors associated with ultra-triathlon performance.
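
    The reported regression can be wrapped in a small calculator; the input time below is hypothetical:

    ```python
    # The study's reported regression: predicted 10IMT time from a 3IMT time,
    # both in minutes. The example input is hypothetical.
    def predict_10imt(three_imt_minutes: float) -> float:
        return 5885 + 3.69 * three_imt_minutes

    t3 = 2400.0                     # e.g., a 40-hour triple Ironman
    t10 = predict_10imt(t3)         # 5885 + 3.69 * 2400 = 14741 minutes
    print(f"predicted 10IMT time: {t10:.0f} min (~{t10 / 60 / 24:.1f} days)")
    ```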

  31. US U-25 channel performance analysis

    SciTech Connect

    Doss, E.; Pan, Y. C.

    1980-07-01

    The results of an ANL computational analysis of the performance of the US U-25 MHD channel are presented. This channel has gone through several revisions. The major revision occurred after it had been decided by the DOE Office of MHD to operate the channel with platinum-clad copper electrodes (cold), rather than with ceramic electrodes (hot), as originally planned. This work has been performed at the request of the DOE Office of MHD and the US U-25 generator design Review Committee. The channel specifications and operating conditions are presented. The combustor temperature and thermodynamic and electrical properties of the plasma are computed, and the results are discussed. The MHD channel performance has been predicted for different operating conditions. Sensitivity studies have also been performed on the effects of mass flow rate, surface roughness, combustor temperatures, and loading on the channel performance.

  32. Using Ratio Analysis to Evaluate Financial Performance.

    ERIC Educational Resources Information Center

    Minter, John; And Others

    1982-01-01

    The ways in which ratio analysis can help in long-range planning, budgeting, and asset management to strengthen financial performance and help avoid financial difficulties are explained. Types of ratios considered include balance sheet ratios, net operating ratios, and contribution and demand ratios. (MSE)

  33. An Analysis of Elite Decathlon Performances.

    ERIC Educational Resources Information Center

    Freeman, William H.

    This analysis involved the ten events of the men's decathlon in all performances which scored 8,000 points or higher, based on the 1962 Scoring Tables of the International Amateur Athletic Federation. It attempted to (1) determine the interrelationships among the events and the final scores, (2) look for areas of difference compared to sub-8,000…

  34. Probabilistic Analysis of Gas Turbine Field Performance

    NASA Technical Reports Server (NTRS)

    Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.

    2002-01-01

    A gas turbine thermodynamic cycle was computationally simulated and probabilistically evaluated in view of the several uncertainties in the performance parameters, which are indices of gas turbine health. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design, enhance performance, increase system availability and make it cost effective. The analysis leads to the selection of the appropriate measurements to be used in the gas turbine health determination and to the identification of both the most critical measurements and parameters. Probabilistic analysis aims at unifying and improving the control and health monitoring of gas turbine aero-engines by increasing the quality and quantity of information available about the engine's health and performance.
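
    A toy Monte Carlo in the same spirit (not the study's method; the cycle model and parameter distributions are invented) shows how input uncertainty propagates to a distribution of thermal efficiency:

    ```python
    # Monte Carlo sketch of probabilistic cycle evaluation: propagate uncertain
    # parameters through a toy Brayton-cycle efficiency model and summarize the
    # output distribution. Parameter ranges are invented.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 100_000
    gamma = 1.4
    pr = rng.normal(20.0, 1.0, n)          # pressure ratio, uncertain
    eta_comp = rng.normal(0.90, 0.02, n)   # lumped component efficiency, uncertain

    # Ideal Brayton efficiency degraded by the lumped component efficiency.
    eta = eta_comp * (1.0 - pr ** (-(gamma - 1) / gamma))
    print(f"mean efficiency: {eta.mean():.3f}")
    print(f"5th / 95th percentiles: {np.percentile(eta, 5):.3f} / "
          f"{np.percentile(eta, 95):.3f}")
    ```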

  35. System analysis of high performance MHD systems

    SciTech Connect

    Chang, S.L.; Berry, G.F.; Hu, N.

    1988-01-01

    This paper presents the results of an investigation of the upper ranges of performance that an MHD power plant using advanced technology assumptions might achieve, together with a parametric study of the key variables affecting this high performance. To simulate a high performance MHD power plant and conduct the parametric study, the Systems Analysis Language Translator (SALT) code developed at Argonne National Laboratory was used. The parametric study results indicate that the overall efficiency of an MHD power plant can be further increased through the improvement of key variables such as the MHD generator inverter efficiency, channel electrical loading factor, magnetic field strength, preheated air temperature, and combustor heat loss. In an optimization calculation, the simulated high performance MHD power plant using advanced technology assumptions can attain an ultra-high overall efficiency exceeding 62%. 12 refs., 5 figs., 4 tabs.

  36. Performance analysis and prediction in triathlon.

    PubMed

    Ofoghi, Bahadorreza; Zeleznikow, John; Macmahon, Clare; Rehula, Jan; Dwyer, Dan B

    2016-01-01

    Performance in triathlon is dependent upon factors that include somatotype, physiological capacity, technical proficiency and race strategy. Given the multidisciplinary nature of triathlon and the interaction between each of the three race components, the identification of target split times that can be used to inform the design of training plans and race pacing strategies is a complex task. The present study uses machine learning techniques to analyse a large database of performances in Olympic distance triathlons (2008-2012). The analysis reveals patterns of performance in five components of triathlon (three race "legs" and two transitions) and the complex relationships between performance in each component and overall performance in a race. The results provide three perspectives on the relationship between performance in each component of triathlon and the final placing in a race. These perspectives allow the identification of target split times that are required to achieve a certain final place in a race and the opportunity to make evidence-based decisions about race tactics in order to optimise performance.

  37. NPAC-Nozzle Performance Analysis Code

    NASA Technical Reports Server (NTRS)

    Barnhart, Paul J.

    1997-01-01

    A simple and accurate nozzle performance analysis methodology has been developed. The geometry modeling requirements are minimal and very flexible, thus allowing rapid design evaluations. The solution techniques accurately couple: continuity, momentum, energy, state, and other relations which permit fast and accurate calculations of nozzle gross thrust. The control volume and internal flow analyses are capable of accounting for the effects of: over/under expansion, flow divergence, wall friction, heat transfer, and mass addition/loss across surfaces. The results from the nozzle performance methodology are shown to be in excellent agreement with experimental data for a variety of nozzle designs over a range of operating conditions.
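
    For reference, the gross-thrust bookkeeping that such a control-volume analysis produces reduces to the standard momentum-plus-pressure relation; the sketch below uses this textbook formula with invented numbers, not NPAC's internal methodology:

    ```python
    # Textbook gross-thrust relation of the kind a nozzle control-volume
    # analysis evaluates: F_g = mdot * V_e + (p_e - p_a) * A_e.
    # The operating point is illustrative only.
    def gross_thrust(mdot, v_exit, p_exit, p_ambient, a_exit):
        """mdot [kg/s], v_exit [m/s], pressures [Pa], a_exit [m^2] -> thrust [N]."""
        return mdot * v_exit + (p_exit - p_ambient) * a_exit

    # Hypothetical operating point: slightly underexpanded nozzle.
    F = gross_thrust(mdot=100.0, v_exit=2000.0,
                     p_exit=120e3, p_ambient=101.325e3, a_exit=0.5)
    print(f"gross thrust: {F / 1e3:.1f} kN")  # 200 kN momentum + ~9.3 kN pressure
    ```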

  38. Multiprocessor smalltalk: Implementation, performance, and analysis

    SciTech Connect

    Pallas, J.I.

    1990-01-01

    Multiprocessor Smalltalk demonstrates the value of object-oriented programming on a multiprocessor. Its implementation and analysis shed light on three areas: concurrent programming in an object-oriented language without special extensions, implementation techniques for adapting to multiprocessors, and performance factors in the resulting system. Adding parallelism to Smalltalk code is easy, because programs already use control abstractions like iterators. Smalltalk's basic control and concurrency primitives (lambda expressions, processes, and semaphores) can be used to build parallel control abstractions, including parallel iterators, parallel objects, atomic objects, and futures. Language extensions for concurrency are not required. This implementation demonstrates that it is possible to build an efficient parallel object-oriented programming system and illustrates techniques for doing so. Three modification tools (serialization, replication, and reorganization) adapted the Berkeley Smalltalk interpreter to the Firefly multiprocessor. Multiprocessor Smalltalk's performance shows that the combination of multiprocessing and object-oriented programming can be effective: speedups (relative to the original serial version) exceed 2.0 for five processors on all the benchmarks; the median efficiency is 48%. Analysis shows both where performance is lost and how to improve and generalize the experimental results. Changes in the interpreter to support concurrency add at most 12% overhead; better access to per-process variables could eliminate much of that. Changes in the user code to express concurrency add as much as 70% overhead; this overhead could be reduced to 54% if blocks (lambda expressions) were reentrant. Performance is also lost when the program cannot keep all five processors busy.
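
    The speedup and efficiency figures above follow the usual definitions; a tiny worked example (with invented timings) shows that 48% efficiency on five processors corresponds to roughly a 2.4x speedup:

    ```python
    # Parallel speedup and efficiency as used above: efficiency = speedup / P.
    # The timings are hypothetical.
    def speedup(t_serial, t_parallel):
        return t_serial / t_parallel

    def efficiency(s, n_procs):
        return s / n_procs

    s = speedup(100.0, 41.7)   # hypothetical serial and 5-processor timings
    print(f"speedup {s:.2f}x, efficiency {efficiency(s, 5):.0%}")
    ```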

  39. Nominal Performance Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    M. Wasiolek

    2004-09-08

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standard. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA-LA model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the "Biosphere Model Report" in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events, and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. The objectives of this analysis are to develop BDCFs for the groundwater exposure scenario for the three climate states considered in the TSPA-LA, as well as conversion factors for evaluating compliance with the groundwater protection standard. The BDCFs will be used in performance assessment for calculating all-pathway annual doses for a given concentration of radionuclides in groundwater. The conversion factors will be used for calculating gross alpha particle activity in groundwater and the annual dose from beta- and photon-emitting radionuclides.

  40. Automated Cache Performance Analysis And Optimization

    SciTech Connect

    Mohror, Kathryn

    2013-12-23

    While there is no lack of performance counter tools for coarse-grained measurement of cache activity, there is a critical lack of tools for relating data layout to cache behavior to application performance. Generally, any nontrivial optimizations are either not done at all, or are done "by hand", requiring significant time and expertise. To the best of our knowledge no tool available to users measures the latency of memory reference instructions for particular addresses and makes this information available to users in an easy-to-use and intuitive way. In this project, we worked to enable the Open|SpeedShop performance analysis tool to gather memory reference latency information for specific instructions and memory addresses, and to gather and display this information in an easy-to-use and intuitive way to aid performance analysts in identifying problematic data structures in their codes. This tool was primarily designed for use in the supercomputer domain as well as grid, cluster, cloud-based parallel e-commerce, and engineering systems and middleware. Ultimately, we envision a tool to automate optimization of application cache layout and utilization in the Open|SpeedShop performance analysis tool. To commercialize this software, we worked to develop core capabilities for gathering enhanced memory usage performance data from applications and create and apply novel methods for automatic data structure layout optimizations, tailoring the overall approach to support existing supercomputer and cluster programming models and constraints. In this Phase I project, we focused on infrastructure necessary to gather performance data and present it in an intuitive way to users. With the advent of enhanced Precise Event-Based Sampling (PEBS) counters on recent Intel processor architectures and equivalent technology on AMD processors, we are now in a position to access memory reference information for particular addresses. Prior to the introduction of PEBS counters

  41. Performance management in healthcare: a critical analysis.

    PubMed

    Hewko, Sarah J; Cummings, Greta G

    2016-01-01

    Purpose - The purpose of this paper is to explore the underlying theoretical assumptions and implications of current micro-level performance management and evaluation (PME) practices, specifically within health-care organizations. PME encompasses all activities that are designed and conducted to align employee outputs with organizational goals. Design/methodology/approach - PME, in the context of healthcare, is analyzed through the lens of critical theory. Specifically, Habermas' theory of communicative action is used to highlight some of the questions that arise in looking critically at PME. To provide a richer definition of key theoretical concepts, the authors conducted a preliminary, exploratory hermeneutic semantic analysis of the key words "performance" and "management" and of the term "performance management". Findings - Analysis reveals that existing micro-level PME systems in health-care organizations have the potential to create a workforce that is compliant, dependent, technically oriented and passive, and to support health-care systems in which inequalities and power imbalances are perpetually reinforced. Practical implications - At a time when the health-care system is under increasing pressure to provide high-quality, affordable services with fewer resources, it may be wise to investigate new sector-specific ways of evaluating and managing performance. Originality/value - In this paper, written for health-care leaders and health human resource specialists, the theoretical assumptions and implications of current PME practices within health-care organizations are explored. It is hoped that readers will be inspired to support innovative PME practices within their organizations that encourage peak performance among health-care professionals. PMID:26764960

  42. Nominal Performance Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    M.A. Wasiolek

    2003-07-25

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standard. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The "Biosphere Model Report" (BSC 2003 [DIRS 164186]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports (BSC 2003 [DIRS 160964]; BSC 2003 [DIRS 160965]; BSC 2003 [DIRS 160976]; BSC 2003 [DIRS 161239]; BSC 2003 [DIRS 161241]) contain detailed descriptions of the model input parameters. This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. The objectives of this analysis are to develop BDCFs and conversion factors for the TSPA. The BDCFs will be used in performance assessment for calculating annual doses for a given concentration of radionuclides in groundwater. The conversion factors will be used for calculating gross alpha particle activity in groundwater and the annual dose from beta- and photon-emitting radionuclides.

  43. Diversity Performance Analysis on Multiple HAP Networks.

    PubMed

    Dong, Feihong; Li, Min; Gong, Xiangwu; Li, Hongjun; Gao, Fengyue

    2015-01-01

    One of the main design challenges in wireless sensor networks (WSNs) is achieving a high-data-rate transmission for individual sensor devices. The high altitude platform (HAP) is an important communication relay platform for WSNs and next-generation wireless networks. Multiple-input multiple-output (MIMO) techniques provide the diversity and multiplexing gain, which can improve the network performance effectively. In this paper, a virtual MIMO (V-MIMO) model is proposed by networking multiple HAPs with the concept of multiple assets in view (MAV). In a shadowed Rician fading channel, the diversity performance is investigated. The probability density function (PDF) and cumulative distribution function (CDF) of the received signal-to-noise ratio (SNR) are derived. In addition, the average symbol error rate (ASER) with BPSK and QPSK is given for the V-MIMO model. The system capacity is studied for both perfect channel state information (CSI) and unknown CSI individually. The ergodic capacity with various SNR and Rician factors for different network configurations is also analyzed. The simulation results validate the effectiveness of the performance analysis. It is shown that the performance of the HAPs network in WSNs can be significantly improved by utilizing the MAV to achieve overlapping coverage, with the help of the V-MIMO techniques. PMID:26134102
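
    As a simplified companion to the ASER analysis above, the Monte Carlo sketch below estimates BPSK error rates over a plain (non-shadowed) Rician channel; the Rice factor, SNR, and channel model are illustrative assumptions, not the paper's shadowed-Rician derivation:

    ```python
    # Monte Carlo sketch of BPSK average symbol error rate over a plain Rician
    # fading channel, a simplified stand-in for the shadowed-Rician analysis.
    import numpy as np
    from scipy.stats import norm

    def bpsk_aser_rician(K=5.0, snr_db=10.0, trials=1_000_000, seed=2):
        rng = np.random.default_rng(seed)
        mu = np.sqrt(K / (K + 1))            # line-of-sight component
        sigma = np.sqrt(1 / (2 * (K + 1)))   # scattered component, per dimension
        h = mu + sigma * (rng.standard_normal(trials)
                          + 1j * rng.standard_normal(trials))
        snr = 10 ** (snr_db / 10) * np.abs(h) ** 2
        # Conditional BPSK error probability Q(sqrt(2*SNR)), averaged over fading.
        return norm.sf(np.sqrt(2 * snr)).mean()

    print(f"estimated ASER: {bpsk_aser_rician():.2e}")
    ```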

  44. Analysis approaches and interventions with occupational performance

    PubMed Central

    Ahn, Sinae

    2016-01-01

    [Purpose] The purpose of this study was to analyze approaches to and interventions for occupational performance in patients with stroke. [Subjects and Methods] In this study, articles published in the past 10 years were searched. The key terms used were "occupational performance AND stroke" and "occupational performance AND CVA". A total of 252 articles were identified, and 79 articles were selected. All interventions were classified by approach according to 6 theories, and all interventions were analyzed for frequency. [Results] Regarding the approaches, the largest group, 25 articles (31.6%), comprised studies that provided high-frequency interventions based on the biomechanical approach. These included electrical stimulation therapy, robot therapy, and sensory stimulation training, among others. Analysis of the frequency of interventions revealed that the most commonly used interventions, appearing in 18 articles (22.8%), made use of the concept of constraint-induced therapy. [Conclusion] The results of this study suggest an approach for use in clinics when selecting an appropriate intervention for occupational performance. PMID:27799719

  45. Nominal Performance Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    M.A. Wasiolek

    2005-04-28

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standards. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop BDCFs, which are input parameters for the TSPA-LA model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the "Biosphere Model Report" in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events, and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. This analysis receives direct input from the outputs of the "Biosphere Model Report" (BSC 2004 [DIRS 169460]) and the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the "Biosphere Dose Conversion Factor Importance and Sensitivity Analysis" (Figure 1-1). The objectives of this analysis are to develop BDCFs for the groundwater exposure scenario and conversion factors for assessing compliance with the groundwater protection standards.

  46. Idaho National Laboratory Quarterly Performance Analysis

    SciTech Connect

    Lisbeth Mitchell

    2014-11-01

    This report is published quarterly by the Idaho National Laboratory (INL) Quality and Performance Management Organization. The Department of Energy (DOE) Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, "Occurrence Reporting and Processing of Operations Information," requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of 60 reportable events (23 from the 4th Qtr FY14 and 37 from the prior three reporting quarters) as well as 58 other issue reports (including not-reportable events and Significant Category A and B conditions) identified at INL from July 2013 through October 2014. Battelle Energy Alliance (BEA) operates the INL under contract DE-AC07-05ID14517.

  9. SUBSONIC WIND TUNNEL PERFORMANCE ANALYSIS SOFTWARE

    NASA Technical Reports Server (NTRS)

    Eckert, W. T.

    1994-01-01

    This program was developed as an aid in the design and analysis of subsonic wind tunnels. It brings together and refines previously scattered and over-simplified techniques used for the design and loss prediction of the components of subsonic wind tunnels. It implements a system of equations for determining the total pressure losses and provides general guidelines for the design of diffusers, contractions, corners and the inlets and exits of non-return tunnels. The algorithms used in the program are applicable to compressible flow through most closed- or open-throated, single-, double- or non-return wind tunnels or ducts. A comparison between calculated performance and that actually achieved by several existing facilities produced generally good agreement. Any system through which air flows and which involves turns, fans, contractions, etc. (e.g., an HVAC system) may benefit from analysis using this software. This program is an update of ARC-11138 which includes PC compatibility and an improved user interface. The method of loss analysis used by the program is a synthesis of theoretical and empirical techniques. Generally, the algorithms used are those which have been substantiated by experimental test. The basic flow-state parameters used by the program are determined from input information about the reference control section and the test section. These parameters were derived from standard relationships for compressible flow. The local flow conditions, including Mach number, Reynolds number and friction coefficient, are determined for each end of each component or section. The loss in total pressure caused by each section is calculated in a form non-dimensionalized by local dynamic pressure. The individual losses are based on the nature of the section, local flow conditions and input geometry and parameter information. The loss forms for typical wind tunnel sections considered by the program include: constant area ducts, open throat ducts, contractions, constant
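
    The record above computes each section's total-pressure loss in a form non-dimensionalized by local dynamic pressure. A minimal sketch of that idea for a single constant-area duct section follows; the smooth-pipe turbulent friction correlation and the geometry are illustrative assumptions, not the program's actual algorithms.

      # Total-pressure loss coefficient K = dP0/q for a constant-area duct,
      # using an assumed smooth-pipe turbulent friction correlation.
      import math

      def duct_loss_coefficient(velocity, length, hyd_diameter,
                                rho=1.225, mu=1.81e-5):
          reynolds = rho * velocity * hyd_diameter / mu
          friction = 0.184 * reynolds ** -0.2       # assumed correlation
          return friction * length / hyd_diameter   # K = f * L / D_h

      v, L, D = 60.0, 10.0, 2.0                     # m/s, m, m (hypothetical)
      K = duct_loss_coefficient(v, L, D)
      q = 0.5 * 1.225 * v ** 2                      # local dynamic pressure, Pa
      print(f"K = {K:.4f}, total-pressure loss = {K * q:.1f} Pa")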

  10. Nominal Performance Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    Wasiolek, Maryla A.

    2000-12-21

    The purpose of this report was to document the process leading to development of the Biosphere Dose Conversion Factors (BDCFs) for the postclosure nominal performance of the potential repository at Yucca Mountain. BDCF calculations concerned twenty-four radionuclides. This selection included sixteen radionuclides that may be significant nominal performance dose contributors during the compliance period of up to 10,000 years, five additional radionuclides of importance for up to 1 million years postclosure, and three relatively short-lived radionuclides important for the human intrusion scenario. Consideration of radionuclide buildup in soil caused by previous irrigation with contaminated groundwater was taken into account in the BDCF development. The effect of climate evolution, from the current arid conditions to a wetter and cooler climate, on the BDCF values was evaluated. The analysis included consideration of different exposure pathways' contributions to the BDCFs. Calculations of nominal performance BDCFs used the GENII-S computer code in a series of probabilistic realizations to propagate the uncertainties of input parameters into the output. BDCFs for the nominal performance, when combined with the concentrations of radionuclides in groundwater, allow calculation of potential radiation doses to the receptor of interest. Calculated estimates of radionuclide concentration in groundwater result from the saturated zone modeling. The integration of the biosphere modeling results (BDCFs) with the outcomes of the other component models is accomplished in the Total System Performance Assessment (TSPA) to calculate doses to the receptor of interest from radionuclides postulated to be released to the environment from the potential repository at Yucca Mountain.

  11. Approaches to Cycle Analysis and Performance Metrics

    NASA Technical Reports Server (NTRS)

    Parson, Daniel E.

    2003-01-01

    The following notes were prepared as part of an American Institute of Aeronautics and Astronautics (AIAA) sponsored short course entitled Air Breathing Pulse Detonation Engine (PDE) Technology. The course was presented in January of 2003, and again in July of 2004 at two different AIAA meetings. It was taught by seven instructors, each of whom provided information on particular areas of PDE research. These notes cover two areas. The first is titled Approaches to Cycle Analysis and Performance Metrics. Here, the various methods of cycle analysis are introduced. These range from algebraic, thermodynamic equations, to single and multi-dimensional Computational Fluid Dynamic (CFD) solutions. Also discussed are the various means by which performance is measured, and how these are applied in a device which is fundamentally unsteady. The second topic covered is titled PDE Hybrid Applications. Here the concept of coupling a PDE to a conventional turbomachinery based engine is explored. Motivation for such a configuration is provided in the form of potential thermodynamic benefits. This is accompanied by a discussion of challenges to the technology.

  12. Stormwater quality models: performance and sensitivity analysis.

    PubMed

    Dotto, C B S; Kleidorfer, M; Deletic, A; Fletcher, T D; McCarthy, D T; Rauch, W

    2010-01-01

    The complex nature of pollutant accumulation and washoff, along with high temporal and spatial variations, pose challenges for the development and establishment of accurate and reliable models of the pollution generation process in urban environments. Therefore, the search for reliable stormwater quality models remains an important area of research. Model calibration and sensitivity analysis of such models are essential in order to evaluate model performance; it is very unlikely that non-calibrated models will lead to reasonable results. This paper reports on the testing of three models which aim to represent pollutant generation from urban catchments. Assessment of the models was undertaken using a simplified Monte Carlo Markov Chain (MCMC) method. Results are presented in terms of performance, sensitivity to the parameters and correlation between these parameters. In general, it was suggested that the tested models poorly represent reality and result in a high level of uncertainty. The conclusions provide useful information for the improvement of existing models and insights for the development of new model formulations.
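
    As a hedged sketch of the calibration machinery mentioned above, the following Python example uses a bare-bones Metropolis (MCMC) sampler to calibrate a single washoff-rate parameter of a toy exponential model against synthetic data; the paper's models and likelihood formulations are more elaborate.

      # Metropolis sampling of a washoff-rate parameter k in a toy model.
      import numpy as np

      rng = np.random.default_rng(1)
      t = np.linspace(0, 10, 50)
      k_true, sigma = 0.4, 0.05
      obs = np.exp(-k_true * t) + rng.normal(0, sigma, t.size)  # synthetic

      def log_like(k):
          resid = obs - np.exp(-k * t)
          return -0.5 * np.sum((resid / sigma) ** 2)

      k, chain = 1.0, []
      ll = log_like(k)
      for _ in range(20000):
          k_prop = k + rng.normal(0, 0.05)          # random-walk proposal
          if k_prop > 0:
              ll_prop = log_like(k_prop)
              if np.log(rng.uniform()) < ll_prop - ll:
                  k, ll = k_prop, ll_prop           # accept
          chain.append(k)
      post = np.array(chain[5000:])                 # discard burn-in
      print(f"posterior mean k = {post.mean():.3f} +/- {post.std():.3f}")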

  13. Radio-science performance analysis software

    NASA Technical Reports Server (NTRS)

    Morabito, D. D.; Asmar, S. W.

    1995-01-01

    The Radio Science Systems Group (RSSG) provides various support functions for several flight project radio-science teams. Among these support functions are uplink and sequence planning, real-time operations monitoring and support, data validation, archiving and distribution functions, and data processing and analysis. This article describes the support functions that encompass radio-science data performance analysis. The primary tool used by the RSSG to fulfill this support function is the STBLTY program set. STBLTY is used to reconstruct observable frequencies and calculate model frequencies, frequency residuals, frequency stability in terms of Allan deviation, reconstructed phase, frequency and phase power spectral density, and frequency drift rates. In the case of one-way data, using an ultrastable oscillator (USO) as a frequency reference, the program set computes the spacecraft transmitted frequency and maintains a database containing the in-flight history of the USO measurements. The program set also produces graphical displays. Some examples and discussions on operating the program set on Galileo and Ulysses data will be presented.
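
    The following Python sketch shows one stability computation of the kind STBLTY performs: the non-overlapping Allan deviation of fractional-frequency residuals. The white-noise input and one-second sampling interval are assumptions for illustration.

      # Non-overlapping Allan deviation from fractional-frequency data.
      import numpy as np

      def allan_deviation(y, m):
          n = y.size // m
          if n < 2:
              raise ValueError("not enough data for this averaging factor")
          ybar = y[: n * m].reshape(n, m).mean(axis=1)   # m-sample averages
          return np.sqrt(0.5 * np.mean(np.diff(ybar) ** 2))

      rng = np.random.default_rng(2)
      y = 1e-12 * rng.standard_normal(100_000)           # white FM noise
      tau0 = 1.0                                         # basic interval, s
      for m in (1, 10, 100, 1000):
          print(f"tau = {m * tau0:6.0f} s  ADEV = {allan_deviation(y, m):.2e}")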

  14. Performance Analysis of ICA in Sensor Array.

    PubMed

    Cai, Xin; Wang, Xiang; Huang, Zhitao; Wang, Fenghua

    2016-01-01

    As the best-known scheme in the field of Blind Source Separation (BSS), Independent Component Analysis (ICA) has been used intensively in various domains, including biomedical and acoustics applications, cooperative or non-cooperative communication, etc. While sensor arrays are involved in most of these applications, the influence of practical factors therein on the performance of ICA has not been sufficiently investigated yet. In this manuscript, the issue is researched by taking the typical antenna array as an illustrative example. Factors taken into consideration include the environment noise level, the properties of the array and those of the radiators. We analyze the analytic relationship between the noise variance, the source variance, the condition number of the mixing matrix and the optimal signal-to-interference-plus-noise ratio, as well as the relationship between the singularity of the mixing matrix and the practical factors concerned. Special attention has been paid to situations where the mixing process turns (nearly) singular, since such circumstances are critical in applications. The results and conclusions obtained should be instructive when applying ICA algorithms to mixtures from sensor arrays. Moreover, an effective countermeasure against the case of singular mixtures has been proposed, on the basis of the previous analysis. Experiments validating the theoretical conclusions as well as the effectiveness of the proposed scheme have been included. PMID:27164100
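
    One relationship analyzed above, separation quality versus the condition number of the mixing matrix, can be illustrated with scikit-learn's FastICA; the two synthetic sources and the near-singular mixing matrices below are assumptions, not the paper's setup.

      # FastICA separation quality as the mixing matrix turns ill-conditioned.
      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(3)
      n = 20000
      sources = np.c_[np.sign(rng.standard_normal(n)),   # BPSK-like source
                      rng.laplace(size=n)]               # super-Gaussian source

      for eps in (1.0, 0.1, 0.01):          # smaller eps -> nearly singular mix
          mix = np.array([[1.0, 1.0], [1.0, 1.0 + eps]])
          x = sources @ mix.T + 0.01 * rng.standard_normal((n, 2))
          est = FastICA(n_components=2, random_state=0).fit_transform(x)
          # Proxy metric: best |correlation| of each source with an estimate.
          c = np.abs(np.corrcoef(sources.T, est.T)[:2, 2:])
          print(f"cond = {np.linalg.cond(mix):9.1f}  match = {c.max(axis=1)}")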

  15. Performance Analysis of ICA in Sensor Array

    PubMed Central

    Cai, Xin; Wang, Xiang; Huang, Zhitao; Wang, Fenghua

    2016-01-01

    As the best-known scheme in the field of Blind Source Separation (BSS), Independent Component Analysis (ICA) has been used intensively in various domains, including biomedical and acoustics applications, cooperative or non-cooperative communication, etc. While sensor arrays are involved in most of these applications, the influence of practical factors therein on the performance of ICA has not been sufficiently investigated yet. In this manuscript, the issue is researched by taking the typical antenna array as an illustrative example. Factors taken into consideration include the environment noise level, the properties of the array and those of the radiators. We analyze the analytic relationship between the noise variance, the source variance, the condition number of the mixing matrix and the optimal signal-to-interference-plus-noise ratio, as well as the relationship between the singularity of the mixing matrix and the practical factors concerned. Special attention has been paid to situations where the mixing process turns (nearly) singular, since such circumstances are critical in applications. The results and conclusions obtained should be instructive when applying ICA algorithms to mixtures from sensor arrays. Moreover, an effective countermeasure against the case of singular mixtures has been proposed, on the basis of the previous analysis. Experiments validating the theoretical conclusions as well as the effectiveness of the proposed scheme have been included. PMID:27164100

  16. Past Performance analysis of HPOTP bearings

    NASA Technical Reports Server (NTRS)

    Bhat, B. N.; Dolan, F. J.

    1982-01-01

    The past performance analysis conducted on three High Pressure Oxygen Turbopump (HPOTP) bearings from the Space Shuttle Main Engine is presented. Metallurgical analysis of failed bearing balls and races, and wear track and crack configuration analyses, were carried out. In addition, one bearing was tested in the laboratory at very high axial loads. The results showed that the cracks were surface initiated and propagated into subsurface locations at relatively small angles. Subsurface cracks were much more extensive than appeared on the surface. The location of major cracks in the races corresponded to high radial loads rather than high axial loads. There was evidence to suggest that the inner races were heated to elevated temperatures. A failure scenario was developed based on the above findings. According to this scenario, the HPOTP bearings are heated by a combination of high loads and a high coefficient of friction (poor lubrication). Different methods of extending HPOTP bearing life are also discussed. These include reduction of axial loads, improvements in bearing design, lubrication and cooling, and use of improved bearing materials.

  17. Space Shuttle Main Engine performance analysis

    NASA Astrophysics Data System (ADS)

    Santi, L. Michael

    1993-11-01

    For a number of years, NASA has relied primarily upon periodically updated versions of Rocketdyne's power balance model (PBM) to provide space shuttle main engine (SSME) steady-state performance prediction. A recent computational study indicated that PBM predictions do not satisfy fundamental energy conservation principles. More recently, SSME test results provided by the Technology Test Bed (TTB) program have indicated significant discrepancies between PBM flow and temperature predictions and TTB observations. Results of these investigations have diminished confidence in the predictions provided by PBM, and motivated the development of new computational tools for supporting SSME performance analysis. A multivariate least squares regression algorithm was developed and implemented during this effort in order to efficiently characterize TTB data. This procedure, called the 'gains model,' was used to approximate the variation of SSME performance parameters such as flow rate, pressure, temperature, speed, and assorted hardware characteristics in terms of six assumed independent influences. These six influences were engine power level, mixture ratio, fuel inlet pressure and temperature, and oxidizer inlet pressure and temperature. A BFGS optimization algorithm provided the base procedure for determining regression coefficients for both linear and full quadratic approximations of parameter variation. Statistical information relative to data deviation from regression derived relations was also computed. A new strategy for integrating test data with theoretical performance prediction was also investigated. The current integration procedure employed by PBM treats test data as pristine and adjusts hardware characteristics in a heuristic manner to achieve engine balance. Within PBM, this integration procedure is called 'data reduction.' By contrast, the new data integration procedure, termed 'reconciliation,' uses mathematical optimization techniques, and requires both
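
    A minimal sketch of a 'gains model' style fit follows: ordinary least squares of a performance parameter against a few assumed independent influences, with linear and full-quadratic feature sets. The influences and data are synthetic placeholders, and the BFGS-based procedure of the record is replaced here by a direct least-squares solve.

      # Linear and full-quadratic regression of a parameter on 3 influences.
      import numpy as np
      from itertools import combinations_with_replacement

      rng = np.random.default_rng(4)
      X = rng.uniform(-1, 1, (200, 3))  # e.g. power level, mixture ratio, inlet T
      y = (2 + X @ np.array([1.5, -0.7, 0.3])
           + 0.5 * X[:, 0] * X[:, 1] + 0.05 * rng.standard_normal(200))

      def design(X, degree):
          cols = [np.ones(X.shape[0])]
          for d in range(1, degree + 1):
              for idx in combinations_with_replacement(range(X.shape[1]), d):
                  cols.append(np.prod(X[:, idx], axis=1))
          return np.column_stack(cols)

      for degree in (1, 2):
          A = design(X, degree)
          coef, *_ = np.linalg.lstsq(A, y, rcond=None)
          rms = np.sqrt(np.mean((A @ coef - y) ** 2))
          print(f"degree {degree}: {A.shape[1]} coefficients, "
                f"RMS residual {rms:.4f}")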

  18. Data Link Performance Analysis for LVLASO Experiments

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi

    1998-01-01

    Low-visibility Landing and Surface Operations System (LVLASO) is currently being prototyped and tested at NASA Langley Research Center. Since the main objective of the system is to maintain aircraft landings and take-offs even during low-visibility conditions, timely exchange of positional and other information between the aircraft and the ground control is critical. For safety and reliability reasons, there are several redundant sources on the ground (e.g., ASDE, AMASS) that collect and disseminate information about the environment to the aircraft. The data link subsystem of LVLASO is responsible for supporting the timely transfer of information between the aircraft and the ground controllers. In fact, if not properly designed, the data link subsystem could become a bottleneck in the proper functioning of LVLASO. Currently, the other components of the system are being designed assuming that the data link has adequate capacity and is capable of delivering the information in a timely manner. During August 1-28, 1997, several flight experiments were conducted to test the prototypes of subsystems developed under the LVLASO project. The background and details of the tests are described in the next section. The test results have been collected in two CDs by FAA and Rockwell-Collins. Under the current grant, we have analyzed the data and evaluated the performance of the Mode S datalink. In this report, we summarize the results of our analysis. Most of the results are shown as graphs or histograms. The test date (or experiment number) was typically taken as the X-axis, and the Y-axis denotes the metric of focus in that chart. In interpreting these charts, one needs to take into account the vehicular traffic during a particular experiment. In general, the performance of the data link was found to be quite satisfactory in terms of delivering long and short Mode S squitters from the vehicles to the ground receiver. Similarly, its performance in delivering control

  19. Space Shuttle Main Engine performance analysis

    NASA Technical Reports Server (NTRS)

    Santi, L. Michael

    1993-01-01

    For a number of years, NASA has relied primarily upon periodically updated versions of Rocketdyne's power balance model (PBM) to provide space shuttle main engine (SSME) steady-state performance prediction. A recent computational study indicated that PBM predictions do not satisfy fundamental energy conservation principles. More recently, SSME test results provided by the Technology Test Bed (TTB) program have indicated significant discrepancies between PBM flow and temperature predictions and TTB observations. Results of these investigations have diminished confidence in the predictions provided by PBM, and motivated the development of new computational tools for supporting SSME performance analysis. A multivariate least squares regression algorithm was developed and implemented during this effort in order to efficiently characterize TTB data. This procedure, called the 'gains model,' was used to approximate the variation of SSME performance parameters such as flow rate, pressure, temperature, speed, and assorted hardware characteristics in terms of six assumed independent influences. These six influences were engine power level, mixture ratio, fuel inlet pressure and temperature, and oxidizer inlet pressure and temperature. A BFGS optimization algorithm provided the base procedure for determining regression coefficients for both linear and full quadratic approximations of parameter variation. Statistical information relative to data deviation from regression derived relations was also computed. A new strategy for integrating test data with theoretical performance prediction was also investigated. The current integration procedure employed by PBM treats test data as pristine and adjusts hardware characteristics in a heuristic manner to achieve engine balance. Within PBM, this integration procedure is called 'data reduction.' By contrast, the new data integration procedure, termed 'reconciliation,' uses mathematical optimization techniques, and requires both

  20. Performance analysis of memory hierarchies in high performance systems

    SciTech Connect

    Yogesh, A.

    1993-07-01

    This thesis studies memory bandwidth as a performance predictor of programs. The focus of this work is on computationally intensive programs. These programs are the most likely to access large amounts of data, stressing the memory system. Computationally intensive programs are also likely to use highly optimizing compilers to produce the fastest executables possible. Methods to reduce the amount of data traffic by increasing the average number of references to each item while it resides in the cache are explored. Increasing the average number of references to each cache item reduces the number of memory requests. Chapter 2 describes the DLX architecture. This is the architecture on which all the experiments were performed. Chapter 3 studies memory moves as a performance predictor for a group of application programs. Chapter 4 introduces a model to study the performance of programs in the presence of memory hierarchies. Chapter 5 explores some compiler optimizations that can help increase the references to each item while it resides in the cache.
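
    The thesis' central point, that the same number of array references costs very different time depending on cache reuse, can be seen with a short Python experiment; the array size and strides are arbitrary assumptions.

      # Timing the same number of loads at different strides: larger strides
      # defeat cache-line reuse, so the gather takes longer per reference.
      import time
      import numpy as np

      a = np.zeros(1 << 24, dtype=np.float64)   # 128 MiB, far larger than cache

      def touch(stride, n_refs=1 << 22):
          idx = (np.arange(n_refs) * stride) % a.size
          t0 = time.perf_counter()
          _ = a[idx].sum()                      # n_refs loads
          return time.perf_counter() - t0

      for stride in (1, 16, 1024):
          print(f"stride {stride:5d}: {touch(stride):.3f} s")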

  1. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    EPA Science Inventory

    This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit m...

  2. Performance analysis of electrical circuits /PANE/

    NASA Technical Reports Server (NTRS)

    Johnson, K. L.; Steinberg, L. L.

    1968-01-01

    An automated statistical and worst-case computer program has been designed to perform dc and ac steady-state circuit analyses. The program determines worst-case circuit performance by solving the circuit equations.
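
    A toy version of a worst-case dc analysis follows: the output of a resistive divider is evaluated at every tolerance corner of its components. The component values and tolerances are invented, and the program above solves full circuit equations rather than a closed-form divider.

      # Worst-case output of a divider over all component tolerance corners.
      from itertools import product

      R1, R2, tol = 10e3, 4.7e3, 0.05           # nominal ohms, 5% tolerance
      Vin = 5.0

      outputs = []
      for s1, s2 in product((-1, 1), repeat=2): # all tolerance corners
          r1 = R1 * (1 + s1 * tol)
          r2 = R2 * (1 + s2 * tol)
          outputs.append(Vin * r2 / (r1 + r2))
      print(f"Vout nominal {Vin * R2 / (R1 + R2):.3f} V, "
            f"worst-case [{min(outputs):.3f}, {max(outputs):.3f}] V")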

  3. NEXT Performance Curve Analysis and Validation

    NASA Technical Reports Server (NTRS)

    Saripalli, Pratik; Cardiff, Eric; Englander, Jacob

    2016-01-01

    Performance curves of the NEXT thruster are highly important in determining the thruster's ability to meet mission-specific goals. New performance curves are proposed and examined here. The Evolutionary Mission Trajectory Generator (EMTG) is used to verify variations in mission solutions based on both the available thruster curves and the newly generated curves. Furthermore, variations in beginning-of-life (BOL) and end-of-life (EOL) curves are also examined. Mission design results shown here validate the use of EMTG and the new performance curves.

  4. Passive analysis technique for packet link performance

    NASA Astrophysics Data System (ADS)

    Fairhurst, G.; Wan, P. S.

    1993-01-01

    The performance of a bearer link is usually assessed by bit error rate (BER) tests or measurement of the error free seconds (EFS). These require exclusive access to the link. An alternative technique is presented that measures performance by passive observation of the frames passing over a packet link. This may be used to estimate the performance of the link.

  5. IPAC-Inlet Performance Analysis Code

    NASA Technical Reports Server (NTRS)

    Barnhart, Paul J.

    1997-01-01

    A series of analyses have been developed which permit the calculation of the performance of common inlet designs. The methods presented are useful for determining the inlet weight flows, total pressure recovery, and aerodynamic drag coefficients for given inlet geometric designs. Limited geometric input data is required to use this inlet performance prediction methodology. The analyses presented here may also be used to perform inlet preliminary design studies. The calculated inlet performance parameters may be used in subsequent engine cycle analyses or installed engine performance calculations for existing uninstalled engine data.

  6. Midlife plasma vitamin D concentrations and performance in different cognitive domains assessed 13 years later.

    PubMed

    Assmann, Karen E; Touvier, Mathilde; Andreeva, Valentina A; Deschasaux, Mélanie; Constans, Thierry; Hercberg, Serge; Galan, Pilar; Kesse-Guyot, Emmanuelle

    2015-05-28

    25-Hydroxyvitamin D (25(OH)D) insufficiency is very common in many countries. Yet, the extent to which 25(OH)D status affects cognitive performance remains unclear. The objective of the present study was to evaluate the cross-time association between midlife plasma 25(OH)D concentrations and subsequent cognitive performance, using a subsample from the French 'SUpplémentation en Vitamines et Minéraux AntioXydants' randomised trial (SU.VI.MAX, 1994-2002) and the SU.VI.MAX 2 observational follow-up study (2007-9). 25(OH)D concentrations were measured in plasma samples drawn in 1994-5, using an electrochemoluminescent immunoassay. Cognitive performance was evaluated in 2007-9 with a neuropsychological battery including phonemic and semantic fluency tasks, the RI-48 (rappel indicé-48 items) cued recall test, the Trail Making Test and the forward and backward digit span. Cognitive factors were extracted via principal component analysis (PCA). Data from 1009 individuals, aged 45-60 years at baseline, with available 25(OH)D and cognitive measurements were analysed by multivariable linear regression models and ANCOVA, stratified by educational level. PCA yielded two factors, designated as 'verbal memory' (strongly correlated with the RI-48 and phonemic/semantic fluency tasks) and 'short-term/working memory' (strongly correlated with the digit span tasks). In the fully adjusted regression model, among individuals with low education, there was a positive association between 25(OH)D concentrations and the 'short-term/working memory' factor (P=0.02), mainly driven by the backward digit span (P=0.004). No association with either cognitive factor was found among better educated participants. In conclusion, higher midlife 25(OH)D concentrations were linked to better outcomes concerning short-term and working memory. However, these results were specific to subjects with low education, suggesting a modifying effect of cognitive reserve. PMID:25864611

  7. Applying time series analysis to performance logs

    NASA Astrophysics Data System (ADS)

    Kubacki, Marcin; Sosnowski, Janusz

    2015-09-01

    Contemporary computer systems provide mechanisms for monitoring various performance parameters (e.g. processor or memory usage, disc or network transfers), which are collected and stored in performance logs. An important issue is to derive characteristic features describing normal and abnormal behavior of the systems. For this purpose we use various schemes of analyzing time series. They have been adapted to the specificity of performance logs and verified using data collected from real systems. The presented approach is useful in evaluating system dependability.
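
    One simple scheme of the kind described above flags samples that deviate from a rolling baseline; a hedged Python sketch on a synthetic CPU-usage log follows (the window length, threshold, and injected anomaly are assumptions).

      # Rolling mean/std anomaly detection on a synthetic performance log.
      import numpy as np

      rng = np.random.default_rng(5)
      cpu = 40 + 5 * rng.standard_normal(1000)  # synthetic CPU usage (%)
      cpu[700:720] += 35                        # injected abnormal episode

      window, k = 60, 3.0
      anomalies = []
      for i in range(window, cpu.size):
          base = cpu[i - window:i]
          if abs(cpu[i] - base.mean()) > k * base.std():
              anomalies.append(i)
      print("anomalous samples:", anomalies[:10], "...")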

  8. An Ethnostatistical Analysis of Performance Measurement

    ERIC Educational Resources Information Center

    Winiecki, Donald J.

    2008-01-01

    Within the fields of human performance technology (HPT), human resources management (HRM), and management in general, performance measurement is not only foundational but considered necessary at all phases in the process of HPT. In HPT in particular, there is substantial advice literature on what measurement is, why it is essential, and (at a…

  9. Probabilistic Performance Analysis of Fault Diagnosis Schemes

    NASA Astrophysics Data System (ADS)

    Wheeler, Timothy Josh

    The dissertation explores the problem of rigorously quantifying the performance of a fault diagnosis scheme in terms of probabilistic performance metrics. Typically, when the performance of a fault diagnosis scheme is of utmost importance, physical redundancy is used to create a highly reliable system that is easy to analyze. However, in this dissertation, we provide a general framework that applies to more complex analytically redundant or model-based fault diagnosis schemes. For each fault diagnosis problem in this framework, our performance metrics can be computed accurately in polynomial time. First, we cast the fault diagnosis problem as a sequence of hypothesis tests. At each time, the performance of a fault diagnosis scheme is quantified by the probability that the scheme has chosen the correct hypothesis. The resulting performance metrics are joint probabilities. Using Bayes' rule, we decompose these performance metrics into two parts: marginal probabilities that quantify the reliability of the system and conditional probabilities that quantify the performance of the fault diagnosis scheme. These conditional probabilities are used to draw connections between fault diagnosis and the fields of medical diagnostic testing, signal detection, and general statistical decision theory. Second, we examine the problem of computing the performance metrics efficiently and accurately. To solve this problem, we examine each portion of the fault diagnosis problem and specify a set of sufficient assumptions that guarantee efficient computation. In particular, we provide a detailed characterization of the class of finite-state Markov chains that lead to tractable fault parameter models. To demonstrate that these assumptions enable efficient computation, we provide pseudocode algorithms and prove that their running time is indeed polynomial. Third, we consider fault diagnosis problems involving uncertain systems. The inclusion of uncertainty enlarges the class of systems
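
    A small numeric illustration of the decomposition described above: a joint performance metric is the product of a conditional probability (diagnosis performance) and a marginal probability (system reliability). All numbers are invented for illustration.

      # Joint metrics P(decision, true state) via Bayes' rule decomposition.
      p_fault = 1e-3                 # marginal: from the reliability model
      p_detect_given_fault = 0.98    # conditional: diagnosis performance
      p_alarm_given_healthy = 1e-4   # conditional: false-alarm rate

      p_correct = (p_detect_given_fault * p_fault
                   + (1 - p_alarm_given_healthy) * (1 - p_fault))
      print(f"P(correct decision) = {p_correct:.6f}")
      print(f"P(missed fault)     = {(1 - p_detect_given_fault) * p_fault:.2e}")
      print(f"P(false alarm)      = {p_alarm_given_healthy * (1 - p_fault):.2e}")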

  10. Assessing BMP Performance Using Microtox Toxicity Analysis

    EPA Science Inventory

    Best Management Practices (BMPs) have been shown to be effective in reducing runoff and pollutants from urban areas and thus provide a mechanism to improve downstream water quality. Currently, BMP performance regarding water quality improvement is assessed through measuring each...

  11. Performance analysis of cone detection algorithms.

    PubMed

    Mariotti, Letizia; Devaney, Nicholas

    2015-04-01

    Many algorithms have been proposed to help clinicians evaluate cone density and spacing, as these may be related to the onset of retinal diseases. However, there has been no rigorous comparison of the performance of these algorithms. In addition, the performance of such algorithms is typically determined by comparison with human observers. Here we propose a technique to simulate realistic images of the cone mosaic. We use the simulated images to test the performance of three popular cone detection algorithms, and we introduce an algorithm which is used by astronomers to detect stars in astronomical images. We use Free Response Operating Characteristic (FROC) curves to evaluate and compare the performance of the four algorithms. This allows us to optimize the performance of each algorithm. We observe that performance is significantly enhanced by up-sampling the images. We investigate the effect of noise and image quality on cone mosaic parameters estimated using the different algorithms, finding that the estimated regularity is the most sensitive parameter. PMID:26366758

  12. Rocket-in-a-Duct Performance Analysis

    NASA Technical Reports Server (NTRS)

    Schneider, Steven J.; Reed, Brian D.

    1999-01-01

    An axisymmetric, 110 N class rocket configured with a free expansion between the rocket nozzle and a surrounding duct was tested in an altitude simulation facility. The propellants were gaseous hydrogen and gaseous oxygen, and the hardware consisted of a heat-sink type copper rocket firing through copper ducts of various diameters and lengths. A secondary flow of nitrogen was introduced at the blind end of the duct to mix with the primary rocket mass flow in the duct. This flow was in the range of 0 to 10% of the primary mass flow, and its effect on nozzle performance was measured. The random measurement errors on thrust and mass flow were within +/-1%. One-dimensional equilibrium calculations were used to establish the possible theoretical performance of these rocket-in-a-duct nozzles. Although the scale of these tests was small, they simulated the relevant flow expansion physics at a modest experimental cost. Test results indicated that lower performance was obtained at higher free expansion area ratios and with longer ducts, while higher performance was obtained with the addition of secondary flow. There was a discernible peak in specific impulse efficiency at 4% secondary flow. The small scale of these tests resulted in low performance efficiencies, but prior numerical modeling of larger rocket-in-a-duct engines predicted performance that was comparable to that of optimized rocket nozzles. This remains to be proven in large-scale, rocket-in-a-duct tests.

  13. Network interface unit design options performance analysis

    NASA Technical Reports Server (NTRS)

    Miller, Frank W.

    1991-01-01

    An analysis is presented of three design options for the Space Station Freedom (SSF) onboard Data Management System (DMS) Network Interface Unit (NIU). The NIU provides the interface from the Fiber Distributed Data Interface (FDDI) local area network (LAN) to the DMS processing elements. The FDDI LAN provides the primary means for command and control and low and medium rate telemetry data transfers on board the SSF. The results of this analysis provide the basis for the implementation of the NIU.

  14. Performance Analysis of an all Digital BPSK Demodulator

    NASA Technical Reports Server (NTRS)

    Gevargiz, John M.

    1993-01-01

    A simulation algorithm was developed to analyze the performance of the binary phase-shift keying (BPSK) Costas loop coupled to the symbol sync loop (SSL). Analysis was performed for various signal-to-noise ratios and loop parameters.
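
    As a baseline for this kind of analysis, the following Python sketch compares Monte Carlo BPSK bit error rates under ideal synchronization with Q-function theory; Costas-loop and symbol-sync losses, the article's actual subject, are not modeled here.

      # BPSK BER vs Eb/N0: simulation against theory 0.5*erfc(sqrt(Eb/N0)).
      import numpy as np
      from math import erfc, sqrt

      rng = np.random.default_rng(6)
      n = 1_000_000
      for ebn0_db in (0, 2, 4, 6, 8):
          ebn0 = 10 ** (ebn0_db / 10)
          bits = rng.integers(0, 2, n)
          s = 2.0 * bits - 1.0
          r = s + rng.standard_normal(n) / sqrt(2 * ebn0)
          ber = np.mean((r > 0).astype(int) != bits)
          theory = 0.5 * erfc(sqrt(ebn0))
          print(f"Eb/N0 = {ebn0_db} dB  simulated = {ber:.2e}  "
                f"theory = {theory:.2e}")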

  15. An optical probe for micromachine performance analysis

    SciTech Connect

    Dickey, F.M.; Holswade, S.C.; Smith, N.F.; Miller, S.L.

    1997-01-01

    Understanding the mechanisms that impact the performance of Microelectromechanical Systems (MEMS) is essential to the development of optimized designs and fabrication processes, as well as the qualification of devices for commercial applications. Silicon micromachines include engines that consist of orthogonally oriented linear comb drive actuators mechanically connected to a rotating gear. These gears are as small as 50 µm in diameter and can be driven at rotation rates exceeding 300,000 rpm. Optical techniques offer the potential for measuring long-term statistical performance data and transient responses needed to optimize designs and manufacturing techniques. We describe the development of Micromachine Optical Probe (MOP) technology for the evaluation of micromachine performance. The MOP approach is based on the detection of optical signals scattered by the gear teeth or other physical structures. We present experimental results obtained with a prototype optical probe and micromachines developed at Sandia National Laboratories.

  16. Forecast analysis of optical waveguide bus performance

    NASA Technical Reports Server (NTRS)

    Ledesma, R.; Rourke, M. D.

    1979-01-01

    Elements to be considered in the design of a data bus include: architecture; data rate; modulation, encoding, detection; power distribution requirements; protocol, word structure; bus reliability, maintainability; interterminal transmission medium; cost; and others specific to application. Fiber-optic data bus considerations for a 32 port transmissive star architecture are discussed in a tutorial format. General optical-waveguide bus concepts are reviewed. The electrical and optical performance of a 32 port transmissive star bus, and the effects of temperature on the performance of optical-waveguide buses, are examined. A bibliography of pertinent references and the bus receiver test results are included.

  17. Performance analysis of a VSAT network

    NASA Astrophysics Data System (ADS)

    Karam, Fouad G.; Miller, Neville; Karam, Antoine

    With the growing need for efficient satellite networking facilities, the very small aperture terminal (VSAT) technology emerges as the leading edge of satellite communications. Achieving the required performance of a VSAT network is dictated by the multiple access technique utilized. Determining the inbound access method best suited for a particular application involves trade-offs between response time and space segment utilization. In this paper, the slotted Aloha and dedicated stream access techniques are compared. It is shown that network performance is dependent on the traffic offered from remote earth stations as well as the sensitivity of customer's applications to satellite delay.

  18. Performance Systems Analysis: Learning by Doing

    ERIC Educational Resources Information Center

    Knowles, Marc P.; Suh, Sookyung

    2005-01-01

    The authors discuss potential shortfalls of assistantships and internships in preparing students for practical career application of professional degrees and describe the benefits to overall development of courses eliciting performance in authentic scenarios. This article explores what is necessary, not only to teach, but also to learn, human…

  19. THERMAL PERFORMANCE ANALYSIS FOR WSB DRUM

    SciTech Connect

    Lee, S

    2008-06-26

    The Nuclear Nonproliferation Programs Design Authority is in the design stage of the Waste Solidification Building (WSB) for the treatment and solidification of the radioactive liquid waste streams generated by the Pit Disassembly and Conversion Facility (PDCF) and Mixed Oxide (MOX) Fuel Fabrication Facility (MFFF). The waste streams will be mixed with a cementitious dry mix in a 55-gallon waste container. Savannah River National Laboratory (SRNL) has been performing the testing and evaluations to support technical decisions for the WSB. The Engineering Modeling & Simulation Group was requested to evaluate the thermal performance of the 55-gallon drum containing the hydration heat source associated with the current baseline cement waste form. A transient axisymmetric heat transfer model for the drum partially filled with waste form cement has been developed, and heat transfer calculations were performed for the baseline design configurations. For this case, 65 percent of the drum volume was assumed to be filled with the waste form, which has a transient hydration heat source, as one of the baseline conditions. A series of modeling calculations has been performed using a computational heat transfer approach. The baseline modeling results show that the time to reach the maximum temperature of the 65 percent filled drum is about 32 hours when a 43 °C initial cement temperature is assumed to be cooled by natural convection with 27 °C external air. In addition, the results computed by the present model were compared with analytical solutions. The modeling results will be benchmarked against the prototypic test results. The verified model will be used for the evaluation of the thermal performance of the WSB drum.
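
    A zero-dimensional, hedged sketch of the drum problem follows: a lumped thermal capacitance with a decaying hydration heat source, cooled by natural convection to 27 °C air. All property values are assumptions, not the report's, and the report itself uses a transient axisymmetric model rather than this lumped one.

      # Lumped-capacitance drum with exponentially decaying hydration heat.
      import math

      m_cp = 180.0 * 1100.0       # mass (kg) * specific heat (J/kg-K), assumed
      hA = 5.0 * 1.5              # convection coeff (W/m2-K) * area (m2), assumed
      q0, tau = 400.0, 3600 * 20  # peak hydration power (W), decay time (s)

      T, T_air, dt = 43.0, 27.0, 60.0
      t, T_max, t_max = 0.0, T, 0.0
      while t < 3600 * 100:       # 100 hours, explicit Euler integration
          q = q0 * math.exp(-t / tau)
          T += dt * (q - hA * (T - T_air)) / m_cp
          t += dt
          if T > T_max:
              T_max, t_max = T, t
      print(f"peak {T_max:.1f} C at {t_max / 3600:.0f} h")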

  20. Performance Analysis of IIUM Wireless Campus Network

    NASA Astrophysics Data System (ADS)

    Abd Latif, Suhaimi; Masud, Mosharrof H.; Anwar, Farhat

    2013-12-01

    International Islamic University Malaysia (IIUM) is one of the leading universities in the world in terms of quality of education, which has been achieved by providing numerous facilities, including wireless services, to every enrolled student. The quality of this wireless service is controlled and monitored by the Information Technology Division (ITD), an ISO-standardized organization under the university. This paper aims to investigate the constraints of the wireless campus network of IIUM. It evaluates the performance of the IIUM wireless campus network in terms of delay, throughput and jitter. The QualNet 5.2 simulator tool has been employed to measure these performance metrics. The observations from the simulation results could help ITD further improve its wireless services.

  1. Performance analysis of anisotropic scattering center detection

    NASA Astrophysics Data System (ADS)

    Moses, Randolph L.; Erten, Eniz; Potter, Lee C.

    1997-07-01

    We consider the problem of detecting anisotropic scattering of targets from wideband SAR measurements. We first develop a scattering model for the response of an ideal dihedral when interrogated by a wideband radar. We formulate a stochastic detection problem based on this model and Gaussian clutter models. We investigate the performance of three detectors: the conventional imaging detector, a generalized likelihood ratio test (GLRT) detector based on the dihedral anisotropic scattering model, and a sum-of-squares detector motivated as a computationally attractive alternative to the GLRT. We also investigate the performance degradation of the GLRT detector when using truncated angle response filters, and analyze detector sensitivity to changes in target length. Finally, we present initial results of angular matched filter detection applied to UWB radar measurements collected by the Army Research Laboratory at Aberdeen Proving Ground.

  2. Performance analysis of panoramic infrared systems

    NASA Astrophysics Data System (ADS)

    Furxhi, Orges; Driggers, Ronald G.; Holst, Gerald; Krapels, Keith

    2014-05-01

    Panoramic imagers are becoming more commonplace in the visible part of the spectrum. These imagers are often used in the real estate market, extreme sports, teleconferencing, and security applications. Infrared panoramic imagers, on the other hand, are not as common and only a few have been demonstrated. A panoramic image can be formed in several ways, using pan and stitch, distributed aperture, or omnidirectional optics. When omnidirectional optics are used, the detected image is a warped view of the world that is mapped on the focal plane array in a donut shape. The final image on the display is the mapping of the omnidirectional donut shape image back to the panoramic world view. In this paper we analyze the performance of uncooled thermal panoramic imagers that use omnidirectional optics, focusing on range performance.

  3. Experimental system and component performance analysis

    SciTech Connect

    Peterman, K.

    1984-10-01

    A prototype dye laser flow loop was constructed to flow-test large power amplifiers in Building 169. The flow loop is designed to operate at supply pressures up to 900 psig and flow rates up to 250 GPM. During the initial startup of the flow loop, experimental measurements were made to evaluate component and system performance. Three candidate dye flow loop pumps and three different pulsation dampeners were tested.

  4. TCP performance analysis for wide area networks

    SciTech Connect

    Chen, H.Y.; Hutchins, J.A.; Testi, N.

    1993-08-01

    Even though networks have been getting faster, perceived throughput at the application level has not increased accordingly. In an attempt to identify many of the performance bottlenecks, we collected and analyzed data over a wide area network (WAN) at T3 (45 Mbps) bandwidth. The information gained will assist in designing new protocols and/or algorithms that are consistent with future high-speed requirements.

  5. Performance Analysis of the Unitree Central File

    NASA Technical Reports Server (NTRS)

    Pentakalos, Odysseas I.; Flater, David

    1994-01-01

    This report consists of two parts. The first part briefly comments on the documentation status of two major systems at NASA's Center for Computational Sciences, specifically the Cray C98 and the Convex C3830. The second part describes the work done on improving the performance of file transfers between the Unitree Mass Storage System running on the Convex file server and the users' workstations distributed over a large geographic area.

  6. Light beam deflector performance: a comparative analysis.

    PubMed

    Zook, J D

    1974-04-01

    The performance of various types of analog light beam deflectors is summarized, and their relative positions in a deflector hierarchy are defined. The three types of deflectors considered are (1) mechanical (galvanometer) mirror deflectors, (2) acoustooptic deflectors, and (3) analog electrooptic deflectors. Material figures of merit are defined and compared, and the theoretical trade-off between speed and resolution is given for each type of deflector. PMID:20126095

  7. Performance analysis of intracavity birefringence sensing

    SciTech Connect

    Yoshino, Toshihiko

    2008-05-10

    The performance of intracavity birefringence sensing by use of a standing-wave laser is theoretically analyzed when the cavity involves internal reflection. On the three-mirror compound cavity model, the condition for converting an optical path length into a laser frequency or a retardation into an optical beat frequency with good linearity and little uncertainty is derived as a function of the cavity parameters and is numerically analyzed.

  8. Moisture performance analysis of EPS frost insulation

    SciTech Connect

    Ojanen, T.; Kokko, E.

    1997-11-01

    A horizontal layer of expanded polystyrene foam (EPS) is widely used as frost insulation for building foundations in the Nordic countries. The performance properties of the insulation depend strongly on the moisture level of the material. Experimental methods are needed to produce samples for testing the material properties in realistic moisture conditions. The objective was to analyze the moisture loads and the wetting mechanisms of horizontal EPS frost insulation. Typical wetting tests, water immersion and diffusive water vapor absorption tests, were studied and the results were compared with data from site investigations. Usually these tests give higher moisture contents of EPS than are detected in drained frost insulation applications. The effects of different parameters, such as the immersion depth and temperature gradient, were also studied. Special attention was paid to the effect of diffusion on the wetting process. Numerical simulation showed that under real working conditions the long-period diffusive moisture absorption in EPS frost insulation remained lower than 1% by volume. Moisture performance was determined experimentally as a function of the distance between the insulation and the free water level in the ground. The main moisture loads and the principles for good moisture performance of frost insulation are presented.

  9. Database for LDV Signal Processor Performance Analysis

    NASA Technical Reports Server (NTRS)

    Baker, Glenn D.; Murphy, R. Jay; Meyers, James F.

    1989-01-01

    A comparative and quantitative analysis of various laser velocimeter signal processors is difficult because standards for characterizing signal bursts have not been established. This leaves the researcher to select a signal processor based only on manufacturers' claims without the benefit of direct comparison. The present paper proposes the use of a database of digitized signal bursts obtained from a laser velocimeter under various configurations as a method for directly comparing signal processors.

  10. A New Approach to Aircraft Robust Performance Analysis

    NASA Technical Reports Server (NTRS)

    Gregory, Irene M.; Tierno, Jorge E.

    2004-01-01

    A recently developed algorithm for nonlinear system performance analysis has been applied to an F16 aircraft to begin evaluating the suitability of the method for aerospace problems. The algorithm has a potential to be much more efficient than the current methods in performance analysis for aircraft. This paper is the initial step in evaluating this potential.

  11. Optical performance test & analysis of intraocular lenses

    NASA Astrophysics Data System (ADS)

    Choi, Junoh

    Cataract is a condition in the eye that, if left untreated, could lead to blindness. One of the effective ways to treat cataract is the removal of the cataractous natural crystalline lens and implantation of an artificial lens called an intraocular lens (IOL). The designs of IOLs have shown improvements over the years to further imitate natural human vision. The need for an objective testing and analysis tool for the latest IOLs grows with the advancement of the IOLs. In this dissertation, I present a system capable of objective testing and analysis of the advanced IOLs. The system consists of (1) a model eye into which an IOL can be inserted to mimic conditions of the human eye; (2) a Modulation Transfer Function measurement setup capable of through-focus testing for depth-of-field studies and polychromatic testing for the study of effects of chromatization; (3) use of the Defocus Transfer Function to simulate the depth-of-field characteristic of rotationally symmetric multifocal designs and extension of the function to polychromatic conditions; and (4) several target imaging experiments for comparison of stray light artifacts and simulation using a non-sequential ray trace package.

  12. A case study in nonconformance and performance trend analysis

    NASA Technical Reports Server (NTRS)

    Maloy, Joseph E.; Newton, Coy P.

    1990-01-01

    As part of NASA's effort to develop an agency-wide approach to trend analysis, a pilot nonconformance and performance trending analysis study was conducted on the Space Shuttle auxiliary power unit (APU). The purpose of the study was to (1) demonstrate that nonconformance analysis can be used to identify repeating failures of a specific item (and the associated failure modes and causes) and (2) determine whether performance parameters could be analyzed and monitored to provide an indication of component or system degradation prior to failure. The nonconformance analysis of the APU did identify repeating component failures, which possibly could be reduced if key performance parameters were monitored and analyzed. The performance-trending analysis verified that the characteristics of hardware parameters can be effective in detecting degradation of hardware performance prior to failure.

  13. Shuttle TPS thermal performance and analysis methodology

    NASA Technical Reports Server (NTRS)

    Neuenschwander, W. E.; Mcbride, D. U.; Armour, G. A.

    1983-01-01

    Thermal performance of the thermal protection system was approximately as predicted. The only extensive anomalies were filler bar scorching and over-predictions in the high-Δp gap heating regions of the orbiter. A technique to predict filler bar scorching has been developed that can aid in defining a solution. Improvement in high-Δp gap heating methodology is still under study. Minor anomalies were also examined for improvements in modeling techniques and prediction capabilities. These include improved definition of low-Δp gap heating, an analytical model for inner mold line convection heat transfer, better modeling of structure, and inclusion of sneak heating. The limited number of problems related to penetration items that presented themselves during orbital flight tests were resolved expeditiously, and designs were changed and proved successful within the time frame of that program.

  14. The value of job analysis, job description and performance.

    PubMed

    Wolfe, M N; Coggins, S

    1997-01-01

    All companies, regardless of size, are faced with the same employment concerns. Efficient personnel management requires the use of three human resource techniques--job analysis, job description and performance appraisal. These techniques and tools are not for large practices only. Small groups can obtain the same benefits by employing these performance control measures. Job analysis allows for the development of a compensation system. Job descriptions summarize the most important duties. Performance appraisals help reward outstanding work.

  15. Covariance of Lucky Images: Performance analysis

    NASA Astrophysics Data System (ADS)

    Cagigal, Manuel P.; Valle, Pedro J.; Cagigas, Miguel A.; Villó-Pérez, Isidro; Colodro-Conde, Carlos; Ginski, C.; Mugrauer, M.; Seeliger, M.

    2016-09-01

    The covariance of ground-based Lucky Images (COELI) is a robust and easy-to-use algorithm that allows us to detect faint companions surrounding a host star. In this paper we analyze the influence of the number of processed frames, the frame quality, the atmospheric conditions and the detection noise on the companion detectability. This analysis has been carried out using both experimental and computer-simulated imaging data. Although the technique allows the detection of faint companions, the camera detection noise and the use of a limited number of frames limit the minimum detectable companion intensity to around 1000 times fainter than that of the host star when placed at an angular distance corresponding to the first few Airy rings. The reachable contrast could be even larger when detecting companions with the assistance of an adaptive optics system.
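
    The core idea can be sketched in a few lines of Python: over a stack of short-exposure frames, correlate each pixel's intensity with that of the host-star peak pixel, so that a faint companion sharing the star's frame-to-frame brightness fluctuations shows up as excess covariance. Frame counts, intensities and noise levels below are invented.

      # Covariance map of a synthetic lucky-imaging frame stack.
      import numpy as np

      rng = np.random.default_rng(7)
      n_frames, size = 2000, 33
      frames = rng.normal(0, 1.0, (n_frames, size, size))  # detection noise
      gain = 1 + 0.5 * rng.standard_normal(n_frames)       # frame quality
      frames[:, 16, 16] += 1000 * gain                     # host-star peak
      frames[:, 16, 22] += 1.0 * gain                      # faint companion

      host = frames[:, 16, 16]
      cov = ((frames - frames.mean(0)) *
             (host - host.mean())[:, None, None]).mean(0)  # covariance map
      print("companion:", cov[16, 22], " background:", cov[10, 10])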

  16. Performance analysis of superconducting generator electromagnetic shielding

    NASA Astrophysics Data System (ADS)

    Xia, D.; Xia, Z.

    2015-12-01

    In this paper, the shielding performance of electromagnetic shielding systems is analyzed using the finite element method. Considering the non-iron-core rotor structure of superconducting generators, it is proposed that the stator alternating magnetic field generated under different operating conditions can be decomposed into oscillating and rotating magnetic fields, so that complex issues can be greatly simplified. A 1200 kW superconducting generator was analyzed. The distribution of the oscillating and rotating magnetic fields in the rotor area, which are generated by the stator winding currents, and the distribution of the eddy currents in the electromagnetic shielding tube, which are induced by these stator winding magnetic fields, are calculated without an electromagnetic shielding system and with three different structures of electromagnetic shielding system, respectively. On the basis of the FEM results, the shielding factor of the electromagnetic shielding systems is calculated and the shielding effect of the three different structures on the oscillating and rotating magnetic fields is compared. The method and the results in this paper can provide a reference for the optimal design and loss calculation of superconducting generators.

  17. Image based performance analysis of thermal imagers

    NASA Astrophysics Data System (ADS)

    Wegner, D.; Repasi, E.

    2016-05-01

    Due to advances in technology, modern thermal imagers resemble sophisticated image processing systems in functionality. Advanced signal and image processing tools enclosed in the camera body extend the basic image capturing capability of thermal cameras. This happens in order to enhance the display presentation of the captured scene or specific scene details. Usually, the implemented methods are proprietary company expertise, distributed without extensive documentation. This makes the comparison of thermal imagers, especially from different companies, a difficult task (or at least a very time-consuming/expensive task, e.g. requiring the execution of a field trial and/or an observer trial). For example, a thermal camera equipped with turbulence mitigation capability stands for such a closed system. The Fraunhofer IOSB has started to build up a system for testing thermal imagers by image-based methods in the lab environment. This will extend our capability of measuring the classical IR-system parameters (e.g. MTF, MTDP, etc.) in the lab. The system is set up around the IR-scene projector, which is necessary for the thermal display (projection) of an image sequence for the IR camera under test. The same set of thermal test sequences might be presented to every unit under test. For turbulence mitigation tests, this could be, e.g., the same turbulence sequence. During system tests, gradual variation of input parameters (e.g. thermal contrast) can be applied. First ideas on test scene selection and on how to assemble an imaging suite (a set of image sequences) for the analysis of thermal imaging systems containing such black boxes in the image-forming path are discussed.

  18. 76 FR 60939 - Metal Fatigue Analysis Performed by Computer Software

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-30

    ... COMMISSION Metal Fatigue Analysis Performed by Computer Software AGENCY: Nuclear Regulatory Commission... applicants' analyses and methodologies using the computer software package, WESTEMS™, to demonstrate... by Computer Software Addressees All holders of, and applicants for, a power reactor operating...

  19. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined as data having known and documented paths of: origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results.
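
    As a concrete example of one propagation step in such an analysis, the following Python sketch combines the relative standard uncertainties of a voltage and a current measurement in quadrature to obtain the uncertainty of a PV power value; the numbers are illustrative only.

      # Root-sum-square uncertainty propagation for P = V * I.
      import math

      V, u_V = 35.2, 0.05      # measured voltage (V) and its std uncertainty
      I, u_I = 7.81, 0.02      # measured current (A) and its std uncertainty

      P = V * I
      u_P_rel = math.sqrt((u_V / V) ** 2 + (u_I / I) ** 2)  # product rule
      print(f"P = {P:.2f} W +/- {u_P_rel * P:.2f} W "
            f"({100 * u_P_rel:.2f}%, coverage factor k=1)")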

  20. Performance analysis of static locking in replicated distributed database systems

    NASA Technical Reports Server (NTRS)

    Kuang, Yinghong; Mukkamala, Ravi

    1991-01-01

    Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects, because they are difficult to evaluate through analysis and time consuming to evaluate through simulation. A technique is used that combines simulation and analysis to closely illustrate the impact of deadlocks and to evaluate the performance of replicated distributed databases with both shared and exclusive locks.

  1. Performance analysis of static locking in replicated distributed database systems

    NASA Technical Reports Server (NTRS)

    Kuang, Yinghong; Mukkamala, Ravi

    1991-01-01

    Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects, because they are difficult to evaluate through analysis and time consuming to evaluate through simulation. Here, a technique is discussed that combines simulation and analysis to closely illustrate the impact of deadlocks and to evaluate the performance of replicated distributed databases with both shared and exclusive locks.

  2. Performance analysis of parallel supernodal sparse LU factorization

    SciTech Connect

    Grigori, Laura; Li, Xiaoye S.

    2004-02-05

    We investigate performance characteristics of the LU factorization of large matrices with various sparsity patterns. We consider supernodal right-looking parallel factorization on a two-dimensional grid of processors, making use of static pivoting. We develop a performance model and validate it using the SuperLU-DIST implementation, real matrices, and the IBM Power3 machine at NERSC. We use this model to obtain performance bounds on parallel computers, to perform scalability analysis, and to identify performance bottlenecks. We also discuss the role of load balance and data distribution in this approach.

  3. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    SciTech Connect

    Bremer, Peer-Timo; Mohr, Bernd; Schulz, Martin; Pascucci, Valerio; Gamblin, Todd; Brunst, Holger

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  4. A Systemic Cause Analysis Model for Human Performance Technicians

    ERIC Educational Resources Information Center

    Sostrin, Jesse

    2011-01-01

    This article presents a systemic, research-based cause analysis model for use in the field of human performance technology (HPT). The model organizes the most prominent barriers to workplace learning and performance into a conceptual framework that explains and illuminates the architecture of these barriers that exist within the fabric of everyday…

  5. Using Importance-Performance Analysis to Evaluate Training

    ERIC Educational Resources Information Center

    Siniscalchi, Jason M.; Beale, Edward K.; Fortuna, Ashley

    2008-01-01

    The importance-performance analysis (IPA) is a tool that can provide timely and usable feedback to improve training. IPA measures the gap between how important and how good (performance) a class is perceived to be by a student, and presents the results on a 2x2 matrix. The quadrant in which data land in this matrix aids in determining potential future action.…

  6. Preliminary Analysis of Remote Monitoring & Robotic Concepts for Performance Confirmation

    SciTech Connect

    D.A. McAffee

    1997-02-18

    As defined in 10 CFR Part 60.2, Performance Confirmation is the "program of tests, experiments and analyses which is conducted to evaluate the accuracy and adequacy of the information used to determine with reasonable assurance that the performance objectives for the period after permanent closure will be met". The overall Performance Confirmation program begins during site characterization and continues up to repository closure. The main purpose of this document is to develop, explore and analyze initial concepts for using remotely operated and robotic systems in gathering repository performance information during Performance Confirmation. This analysis focuses primarily on possible Performance Confirmation related applications within the emplacement drifts after waste packages have been emplaced (post-emplacement) and before permanent closure of the repository (preclosure). This will be a period of time lasting approximately 100 years and basically coincides with the Caretaker phase of the project. This analysis also examines, to a lesser extent, some applications related to Caretaker operations. A previous report examined remote handling and robotic technologies that could be employed during the waste package emplacement phase of the project (Reference 5.1). This analysis is being prepared to provide an early investigation of possible design concepts and technical challenges associated with developing remote systems for monitoring and inspecting activities during Performance Confirmation. The writing of this analysis preceded formal development of Performance Confirmation functional requirements and program plans and therefore examines, in part, the fundamental Performance Confirmation monitoring needs and operating conditions. The scope and primary objectives of this analysis are to: (1) Describe the operating environment and conditions expected in the emplacement drifts during the preclosure period. (Presented in Section 7.2). (2) Identify and discuss the

  7. Performance analysis, quality function deployment and structured methods

    NASA Astrophysics Data System (ADS)

    Maier, M. W.

    Quality function deployment (QFD), an approach that synthesizes several elements of system modeling and design into a single unit, is presented. Behavioral, physical, and performance modeling are usually considered as separate aspects of system design without explicit linkages. Structured methodologies have developed linkages between behavioral and physical models before, but have not considered the integration of performance models. QFD integrates performance models with traditional structured models. In this method, performance requirements such as cost, weight, and detection range are partitioned into matrices. Partitioning is done by developing a performance model, preferably quantitative, for each requirement. The parameters of the model become the engineering objectives in a QFD analysis and the models are embedded in a spreadsheet version of the traditional QFD matrices. The performance model and its parameters are used to derive part of the functional model by recognizing that a given performance model implies some structure to the functionality of the system.

  8. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results.

  9. Performing modal analysis for multi-metric measurements: a discussion

    NASA Astrophysics Data System (ADS)

    Soman, R.; Majewska, K.; Radzienski, M.; Ostachowicz, W.

    2016-04-01

    This work addresses the severe lack of literature in the area of modal analysis for multi-metric sensing. The paper aims to provide a step-by-step tutorial for performing modal analysis using Fiber Bragg Grating (FBG) strain sensors and a Laser Doppler Vibrometer (LDV) for displacement measurements. The paper discusses in detail the different parameters which affect the accuracy of the experimental results. It highlights the often implied and unmentioned problems that researchers face while performing experiments. The paper tries to bridge the gap between the theoretical idea of the experiment and its actual execution by discussing each aspect, including the choice of specimen, boundary conditions, sensors, sensor position, excitation mechanism and its location, as well as the post-processing of the data. The paper may be viewed as a checklist for performing modal analysis in order to ensure high-quality measurements by preventing systematic errors from creeping in.

  10. Network DEA: an application to analysis of academic performance

    NASA Astrophysics Data System (ADS)

    Saniee Monfared, Mohammad Ali; Safi, Mahsa

    2013-05-01

    As governmental subsidies to universities have declined in recent years, sustaining excellence in academic performance and more efficient use of resources have become important issues for university stakeholders. To assess academic performance and the utilization of resources, two important issues need to be addressed: a capable methodology and a set of good performance indicators, both of which we consider in this paper. We propose a set of performance indicators to enable efficiency analysis of academic activities and apply a novel network DEA structure to account for subfunctional efficiencies, such as teaching quality and research productivity, as well as the overall efficiency. We tested our approach on the efficiency analysis of academic colleges at Alzahra University in Iran.

  11. Performance Analysis of Web Applications Based on User Navigation

    NASA Astrophysics Data System (ADS)

    Zhou, Quanshu; Ye, Hairong; Ding, Zuohua

    This paper proposes a method for the performance analysis of web applications. A behavior model is first built from the log file recorded during user navigation; an extended state diagram is then extracted from this model; finally, a multiple Markov model is fitted to the state diagram, from which the performance analysis is obtained. Five indexes are used to measure performance: service response time, service path length, service utilization, service implementation rate, and access error rate. The performance analysis results suggest how to improve the design of web applications and optimize their services. A case study of the Zhejiang Chess web site demonstrates the advantage of the method.
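
    The Markov step can be sketched briefly. The fragment below is an illustration, not the authors' code (the page names and sessions are invented): it estimates a first-order transition matrix from logged navigation sessions and derives the expected number of page views before the user reaches an absorbing "exit" state, one simple path-length-style index.

        # Sketch: estimate a first-order Markov chain from navigation logs and
        # compute the expected number of page views before the user exits.
        import numpy as np

        sessions = [                   # hypothetical log-derived click paths
            ["home", "board", "game", "exit"],
            ["home", "game", "game", "exit"],
            ["home", "board", "exit"],
        ]
        states = ["home", "board", "game", "exit"]
        idx = {s: i for i, s in enumerate(states)}

        counts = np.zeros((len(states), len(states)))
        for path in sessions:
            for a, b in zip(path, path[1:]):
                counts[idx[a], idx[b]] += 1
        P = counts / counts.sum(axis=1, keepdims=True).clip(min=1)

        Q = P[:3, :3]                     # transient states (all but "exit")
        N = np.linalg.inv(np.eye(3) - Q)  # fundamental matrix
        print("expected page views from 'home':", N[idx["home"]].sum())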

  12. Thermionic cogeneration burner assessment study performance analysis results

    SciTech Connect

    Not Available

    1983-12-01

    The purpose of this contract was to (1) test and evaluate two of the more important engineering aspects of designing and building thermionic cogeneration burners (TCBs); (2) make a cost and performance estimate of the TCB; and (3) identify and evaluate industries where TCBs could be installed and where the electrical power (dc) produced by the TCBs would be used directly in the process. The results of the performance analysis are detailed.

  13. Temporal geospatial analysis of secondary school students’ examination performance

    NASA Astrophysics Data System (ADS)

    Nik Abd Kadir, ND; Adnan, NA

    2016-06-01

    Malaysia's Ministry of Education has improved the organization of its data by establishing a geographical information system (GIS) school database. However, no further analysis has been done with geospatial analysis tools. Mapping has emerged as a communication tool and an effective way to publish digital and statistical data such as school performance results. The objective of this study is to analyse secondary school students' performance in the science and mathematics scores of the Sijil Pelajaran Malaysia Examination results from 2010 to 2014 for Kelantan's state schools, with the aid of GIS software and geospatial analysis. School performance, expressed as school grade point average (GPA) from Grade A to Grade G, was interpolated and mapped, and query analysis using geospatial tools was carried out. This study will benefit the education sector in analysing student performance not only in Kelantan but throughout Malaysia, and mapping will be a good method of publication towards better planning and decision making to prepare young Malaysians for the challenges of the education system.
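
    The interpolation step might look like the following inverse-distance-weighting (IDW) sketch. This is an assumption for illustration (the paper does not state which interpolation method was used), and the school coordinates and GPA values are invented:

        # Sketch: inverse-distance-weighted interpolation of school GPA
        # values at a query point, a common choice for mapping point scores.
        import numpy as np

        xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
        gpa = np.array([2.1, 3.4, 2.8, 1.9])   # invented school GPAs

        def idw(query, points, values, power=2.0, eps=1e-12):
            d = np.linalg.norm(points - query, axis=1)
            if d.min() < eps:          # query coincides with a school
                return float(values[d.argmin()])
            w = 1.0 / d**power
            return float((w * values).sum() / w.sum())

        print(idw(np.array([0.5, 0.5]), xy, gpa))  # estimate at grid centre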

  14. Performance Analysis of Multilevel Parallel Applications on Shared Memory Architectures

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Caubet, Jordi; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    In this paper we describe how to apply powerful performance analysis techniques to understand the behavior of multilevel parallel applications. We use the Paraver/OMPItrace performance analysis system for our study. This system consists of two major components: the OMPItrace dynamic instrumentation mechanism, which allows the tracing of processes and threads, and the Paraver graphical user interface for inspection and analysis of the generated traces. We describe how to use the system to conduct a detailed comparative study of a benchmark code implemented in five different programming paradigms applicable for shared memory

  15. Quantitative analysis of regional myocardial performance in coronary artery disease

    NASA Technical Reports Server (NTRS)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings from a group of subjects with significant coronary artery stenosis and from a group of controls, determined by use of a quantitative method for the study of regional myocardial performance based on the frame-by-frame analysis of biplane left ventricular angiograms, are presented. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions, timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  16. Integrated design environment for human performance and human reliability analysis

    SciTech Connect

    Nelson, W.R.

    1997-05-01

    Work over the last few years at the Idaho National Engineering and Environmental Laboratory (INEEL) has included a major focus on applying human performance and human reliability knowledge and methods as an integral element of system design and development. This work has been pursued in programs in a wide variety of technical domains, beginning with nuclear power plant operations. Since the mid-1980s the laboratory has transferred the methods and tools developed in the nuclear domain to military weapons systems and aircraft, offshore oil and shipping operations, and commercial aviation operations and aircraft design. Through these diverse applications the laboratory has developed an integrated approach and framework for application of human performance analysis, human reliability analysis (HRA), operational data analysis, and simulation studies of human performance to the design and development of complex systems. This approach was recently tested in the NASA Advanced Concepts Program "Structured Human Error Analysis for Aircraft Design." This program resulted in the prototype software tool THEA (Tool for Human Error Analysis) for incorporating human error analysis in the design of commercial aircraft, focusing on airplane maintenance tasks. Current effort is directed toward applying this framework to the development of advanced Air Traffic Management (ATM) systems as part of NASA's Advanced Air Transportation Technologies (AATT) program. This paper summarizes the approach, describes recent and current applications in commercial aviation, and provides perspectives on how the approach could be utilized in the nuclear power industry.

  17. Observer analysis and its impact on task performance modeling

    NASA Astrophysics Data System (ADS)

    Jacobs, Eddie L.; Brown, Jeremy B.

    2014-05-01

    Fire fighters use relatively low cost thermal imaging cameras to locate hot spots and fire hazards in buildings. This research describes the analyses performed to study the impact of thermal image quality on fire fighter fire hazard detection task performance. Using human perception data collected by the National Institute of Standards and Technology (NIST) for fire fighters detecting hazards in a thermal image, an observer analysis was performed to quantify the sensitivity and bias of each observer. Based on this analysis, the subjects were divided into three groups representing three different levels of performance. The top-performing group was used for the remainder of the modeling. Models were developed which related image quality factors such as contrast, brightness, spatial resolution, and noise to task performance probabilities. The models were fitted to the human perception data using both logistic and probit regression. Probit regression was found to yield superior fits and showed that models with not only second-order but also third-order parameter interactions performed best.
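
    Fitting and comparing the two link functions described takes only a few lines with statsmodels; the sketch below uses synthetic detection data (invented here; the NIST perception dataset is not reproduced):

        # Sketch: fit logistic and probit models of detection probability
        # versus image-quality factors and compare fits by log-likelihood.
        import numpy as np
        from scipy.stats import norm
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 500
        contrast = rng.uniform(0, 1, n)   # invented image-quality factors
        noise = rng.uniform(0, 1, n)
        p_true = norm.cdf(1.5 * contrast - 2.0 * noise + 0.5)
        detected = rng.binomial(1, p_true)

        X = sm.add_constant(np.column_stack([contrast, noise]))
        logit_fit = sm.Logit(detected, X).fit(disp=0)
        probit_fit = sm.Probit(detected, X).fit(disp=0)
        print(f"logit  log-likelihood: {logit_fit.llf:.1f}")
        print(f"probit log-likelihood: {probit_fit.llf:.1f}")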

  18. Analysis of Photovoltaic System Energy Performance Evaluation Method

    SciTech Connect

    Kurtz, S.; Newmiller, J.; Kimber, A.; Flottemesch, R.; Riley, E.; Dierauf, T.; McKee, J.; Krishnani, P.

    2013-11-01

    Documentation of the energy yield of a large photovoltaic (PV) system over a substantial period can be useful to measure a performance guarantee, as an assessment of the health of the system, for verification of a performance model to then be applied to a new system, or for a variety of other purposes. Although the measurement of this performance metric might appear to be straightforward, there are a number of subtleties associated with variations in weather and imperfect data collection that complicate the determination and data analysis. A performance assessment is most valuable when it is completed with a very low uncertainty and when the subtleties are systematically addressed, yet currently no standard exists to guide this process. This report summarizes a draft methodology for an Energy Performance Evaluation Method, the philosophy behind the draft method, and the lessons that were learned by implementing the method.

  19. An Empirical Analysis of Human Performance and Nuclear Safety Culture

    SciTech Connect

    Jeffrey Joe; Larry G. Blackwood

    2006-06-01

    The purpose of this analysis, which was conducted for the US Nuclear Regulatory Commission (NRC), was to test whether an empirical connection exists between human performance and nuclear power plant safety culture. This was accomplished by analyzing the relationship between a measure of human performance and a plant's Safety Conscious Work Environment (SCWE). SCWE is an important component of the safety culture construct the NRC has developed, but it is not synonymous with it. SCWE is an environment in which employees are encouraged to raise safety concerns both to their own management and to the NRC without fear of harassment, intimidation, retaliation, or discrimination. Because the relationship between human performance and allegations is intuitively reciprocal and both relationship directions need exploration, two series of analyses were performed. First, because human performance data could be indicative of safety culture, regression analyses were performed using human performance data to predict SCWE. Second, because it is also likely that safety culture contributes to human performance issues at a plant, a second set of regressions was performed using allegations to predict HFIS results.

  20. Performance Indicators in Math: Implications for Brief Experimental Analysis of Academic Performance

    ERIC Educational Resources Information Center

    VanDerheyden, Amanda M.; Burns, Matthew K.

    2009-01-01

    Brief experimental analysis (BEA) can be used to specify intervention characteristics that produce positive learning gains for individual students. A key challenge to the use of BEA for intervention planning is the identification of performance indicators (including topography of the skill, measurement characteristics, and decision criteria) that…

  1. Frontiers of Performance Analysis on Leadership-Class Systems

    SciTech Connect

    Fowler, R J; Adhianto, L; de Supinski, B R; Fagan, M; Gamblin, T; Krentel, M; Mellor-Crummey, J; Schulz, M; Tallent, N

    2009-06-15

    The number of cores employed in high-end systems for scientific computing is increasing rapidly. As a result, there is a pressing need for tools that can measure, model, and diagnose performance problems in highly parallel runs. We describe two tools that employ complementary approaches for analysis at scale, and we illustrate their use on DOE leadership-class systems.

  2. Visuo-Spatial Performance in Autism: A Meta-Analysis

    ERIC Educational Resources Information Center

    Muth, Anne; Hönekopp, Johannes; Falter, Christine M.

    2014-01-01

    Visuo-spatial skills are believed to be enhanced in autism spectrum disorders (ASDs). This meta-analysis tests the current state of evidence for Figure Disembedding, Block Design, Mental Rotation and Navon tasks in ASD and neurotypicals. Block Design (d = 0.32) and Figure Disembedding (d = 0.26) showed superior performance for ASD with large…

  3. Analysis of upper-extremity performance in athletes and musicians.

    PubMed

    An, K N; Bejjani, F J

    1990-08-01

    Injuries can result from direct or indirect trauma and overuse in sports and the performing arts. These injuries occur when the demands of the activity exceed the physiologic tolerance. Biomechanical analysis enables the estimation of the capacities of the body as well as the loading environment encountered by the tendons, muscles, bones, and joints during various types of sports and musical activities.

  4. Job Analysis, Job Descriptions, and Performance Appraisal Systems.

    ERIC Educational Resources Information Center

    Sims, Johnnie M.; Foxley, Cecelia H.

    1980-01-01

    Job analysis, job descriptions, and performance appraisal can benefit student services administration in many ways. Involving staff members in the development and implementation of these techniques can increase commitment to and understanding of the overall objectives of the office, as well as communication and cooperation among colleagues.…

  5. The Analysis of Athletic Performance: Some Practical and Philosophical Considerations

    ERIC Educational Resources Information Center

    Nelson, Lee J.; Groom, Ryan

    2012-01-01

    This article presents a hypothetical dialogue between a notational analyst (NA) recently schooled in the positivistic assessment of athletic performance, an "old-school" traditional coach (TC) who favours subjective analysis, and a pragmatic educator (PE). The conversation opens with NA and TC debating the respective value of quantitative and…

  6. A Semiotic Reading and Discourse Analysis of Postmodern Street Performance

    ERIC Educational Resources Information Center

    Lee, Mimi Miyoung; Chung, Sheng Kuan

    2009-01-01

    Postmodern street art operates under a set of references that requires art educators and researchers to adopt alternative analytical frameworks in order to understand its meanings. In this article, we describe social semiotics, critical discourse analysis, and postmodern street performance as well as the relevance of the former two in interpreting…

  7. Performance Analysis of GAME: A Generic Automated Marking Environment

    ERIC Educational Resources Information Center

    Blumenstein, Michael; Green, Steve; Fogelman, Shoshana; Nguyen, Ann; Muthukkumarasamy, Vallipuram

    2008-01-01

    This paper describes the Generic Automated Marking Environment (GAME) and provides a detailed analysis of its performance in assessing student programming projects and exercises. GAME has been designed to automatically assess programming assignments written in a variety of languages based on the "structure" of the source code and the correctness…

  8. Storage element performance optimization for CMS analysis jobs

    NASA Astrophysics Data System (ADS)

    Behrmann, G.; Dahlblom, J.; Guldmyr, J.; Happonen, K.; Lindén, T.

    2012-12-01

    Tier-2 computing sites in the Worldwide Large Hadron Collider Computing Grid (WLCG) host CPU resources (Compute Element, CE) and storage resources (Storage Element, SE). The vast amount of data that needs to be processed from the Large Hadron Collider (LHC) experiments requires good and efficient use of the available resources. Having a good CPU efficiency for the end users' analysis jobs requires that the performance of the storage system is able to scale with I/O requests from hundreds or even thousands of simultaneous jobs. In this presentation we report on the work on improving the SE performance at the Helsinki Institute of Physics (HIP) Tier-2, used for the Compact Muon Solenoid (CMS) experiment at the LHC. Statistics from CMS grid jobs are collected and stored in the CMS Dashboard for further analysis, which allows for easy performance monitoring by the sites and by the CMS collaboration. As part of the monitoring framework CMS uses the JobRobot, which sends 100 analysis jobs to each site every four hours. CMS also uses the HammerCloud tool for site monitoring and stress testing, and it has replaced the JobRobot. The performance of the analysis workflow submitted with JobRobot or HammerCloud can be used to track the performance due to site configuration changes, since the analysis workflow is kept the same for all sites and for months in time. The CPU efficiency of the JobRobot jobs at HIP was increased by approximately 50% to more than 90%, by tuning the SE and by improvements in the CMSSW and dCache software. The performance of the CMS analysis jobs improved significantly too. Similar work has been done on other CMS Tier sites, since on average the CPU efficiency for CMSSW jobs has increased during 2011. Better monitoring of the SE allows faster detection of problems, so that the performance level can be kept high. The next storage upgrade at HIP consists of SAS disk enclosures which can be stress tested on demand with HammerCloud workflows, to make sure that the I/O-performance

  9. Eddy-current steam generator data analysis performance. Final report

    SciTech Connect

    Harris, D.H.

    1993-06-01

    This study assessed the accuracy of eddy current, bobbin coil data analysis of steam generator tubes conducted under the structure of the PWR Steam Generator Examination Guidelines. Individual and team performance measures were obtained from independent analyses of data from 1619 locations in a sample of 199 steam generator tubes. The 92 reportable indications contained in the tube sample, including 64 repairable indications, were attributable to: wear at anti-vibration bars, intergranular attack/stress-corrosion cracking (IGA/SCC) within tube sheet crevice regions, primary-water stress-corrosion cracking (PWSCC) at tube roll transitions, or thinning at cold-leg tube supports. Analyses were conducted by 20 analysts, four each from five vendors of eddy current steam generator examination services. In accordance with the guidelines, site orientation was provided with plant-specific guidelines; preanalysis practice was completed on plant-specific data; analysts were qualified by performance testing; and independent primary-secondary analyses were conducted with resolution of discrepancies (team analyses). Measures of analysis performance included percentages of indications correctly reported, percentages of false reports, and relative operating characteristic (ROC) curves. ROC curves presented comprehensive pictures of analysis accuracy generalizable beyond the specific conditions of this study. They also provided single-value measures of analysis accuracy. Conclusions and recommendations were provided relative to analysis accuracy, effect of primary-secondary analyses, analyses of tube sheet crevice regions, establishment of reporting criteria, improvement of examination guidelines, and needed research.
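
    An ROC curve of the kind used in the study can be traced in a few lines. The sketch below is generic (it is not the report's procedure; the confidence scores and ground truth are invented): sweep a reporting threshold over per-indication scores and record the hit rate against the false-report rate.

        # Sketch: trace a relative operating characteristic (ROC) curve from
        # per-indication confidence scores and ground-truth labels.
        import numpy as np

        scores = np.array([0.9, 0.8, 0.75, 0.6, 0.55, 0.4, 0.3, 0.2])
        truth = np.array([1, 1, 0, 1, 0, 0, 1, 0])  # 1 = real indication

        for thr in np.unique(scores)[::-1]:
            reported = scores >= thr
            hits = (reported & (truth == 1)).sum() / (truth == 1).sum()
            fas = (reported & (truth == 0)).sum() / (truth == 0).sum()
            print(f"threshold {thr:.2f}: hit rate {hits:.2f}, "
                  f"false report rate {fas:.2f}")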

  10. Analysis of cache performance for operating systems and multiprogramming

    SciTech Connect

    Agarwal, A.

    1987-01-01

    Advances in high-performance processors continue to create an increased need for memory bandwidth. Caches can provide this bandwidth cost-effectively. This dissertation investigates the performance of large caches for realistic operating system and multiprogramming workloads. A suite of efficient and accurate cache analysis techniques is developed. These include: a new data collection method, a mathematical cache model, and a trace sampling and a trace-stitching procedure. The analyses use a data-collection technique called ATUM to obtain realistic system traces of multitasking workloads with little distortion. Accurately characterizing cache behavior using ATUM traces shows that both operating system and multiprogramming activity significantly degrade cache performance, with an even greater proportional impact on large caches. From a careful analysis of the causes of this degradation, various techniques to reduce this loss are explored. While seemingly little can be done to mitigate the effect of system references, multitasking cache misses can be reduced with little effort.
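
    The flavor of trace-driven cache analysis described here can be shown in miniature. The sketch below is generic (it is unrelated to the ATUM tooling; the addresses are invented): replay an address trace through a direct-mapped cache model and count misses.

        # Sketch: trace-driven simulation of a direct-mapped cache.
        LINE_BYTES = 64
        N_SETS = 1024                  # 64 KiB direct-mapped cache

        trace = [0x1000, 0x1004, 0x2000, 0x1000, 0x41000, 0x1000]
        tags = [None] * N_SETS
        misses = 0
        for addr in trace:
            line = addr // LINE_BYTES
            s, tag = line % N_SETS, line // N_SETS
            if tags[s] != tag:         # miss: fill the line
                tags[s] = tag
                misses += 1
        print(f"{misses} misses / {len(trace)} references")
        # 0x41000 maps to the same set as 0x1000, so the final reference to
        # 0x1000 is a conflict miss -- the kind multitasking activity inflates.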

  11. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
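
    A minimal discrete event simulation of this sort might look like the sketch below, written with the SimPy library (an assumed choice; the paper does not name its simulation tool, and the arrival and service parameters are invented):

        # Sketch: discrete event simulation of service requests contending
        # for a fixed pool of virtualized servers (M/M/c-style).
        import random
        import simpy

        ARRIVAL_RATE = 5.0   # requests per unit time (assumed)
        SERVICE_RATE = 1.0   # completions per server per unit time (assumed)
        N_SERVERS = 6
        waits = []

        def request(env, servers):
            arrived = env.now
            with servers.request() as slot:
                yield slot                         # queue for a free server
                waits.append(env.now - arrived)
                yield env.timeout(random.expovariate(SERVICE_RATE))

        def generator(env, servers):
            while True:
                yield env.timeout(random.expovariate(ARRIVAL_RATE))
                env.process(request(env, servers))

        random.seed(1)
        env = simpy.Environment()
        servers = simpy.Resource(env, capacity=N_SERVERS)
        env.process(generator(env, servers))
        env.run(until=1000)
        print(f"mean wait: {sum(waits) / len(waits):.3f} time units")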

  12. Performance demonstration program plan for analysis of simulated headspace gases

    SciTech Connect

    1995-06-01

    The Performance Demonstration Program (PDP) for analysis of headspace gases will consist of regular distribution and analyses of test standards to evaluate the capability for analyzing VOCs, hydrogen, and methane in the headspace of transuranic (TRU) waste throughout the Department of Energy (DOE) complex. Each distribution is termed a PDP cycle. These evaluation cycles will provide an objective measure of the reliability of measurements performed for TRU waste characterization. Laboratory performance will be demonstrated by the successful analysis of blind audit samples of simulated TRU waste drum headspace gases according to the criteria set within the text of this Program Plan. Blind audit samples (hereinafter referred to as PDP samples) will be used as an independent means to assess laboratory performance regarding compliance with the QAPP QAOs. The concentration of analytes in the PDP samples will encompass the range of concentrations anticipated in actual waste characterization gas samples. Analyses which are required by the WIPP to demonstrate compliance with various regulatory requirements and which are included in the PDP must be performed by laboratories which have demonstrated acceptable performance in the PDP.

  13. Mission analysis and performance specification studies report, appendix A

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The Near Term Hybrid Passenger Vehicle Development Program tasks included defining missions, developing distributions of daily travel and composite driving cycles for these missions, providing information necessary to estimate the potential replacement of the existing fleet by hybrids, and estimating acceleration/gradeability performance requirements for safe operation. The data was then utilized to develop mission specifications, define reference vehicles, develop hybrid vehicle performance specifications, and make fuel consumption estimates for the vehicles. The major assumptions which underlie the approach taken to the mission analysis and development of performance specifications are the following: the daily operating range of a hybrid vehicle should not be limited by the stored energy capacity and the performance of such a vehicle should not be strongly dependent on the battery state of charge.

  14. Qualitative urinary organic acid analysis: methodological approaches and performance.

    PubMed

    Peters, V; Garbade, S F; Langhans, C D; Hoffmann, G F; Pollitt, R J; Downing, M; Bonham, J R

    2008-12-01

    A programme for proficiency testing of biochemical genetics laboratories undertaking urinary qualitative organic acid analysis and its results for 50 samples examined for factors contributing to poor performance are described. Urine samples from patients in whom inherited metabolic disorders have been confirmed as well as control urines were circulated to participants and the results from 94 laboratories were evaluated. Laboratories showed variability both in terms of their individual performance and on a disease-specific basis. In general, conditions including methylmalonic aciduria, propionic aciduria, isovaleric aciduria, mevalonic aciduria, Canavan disease and 3-methylcrotonyl-CoA carboxylase deficiency were readily identified. Detection was poorer for other diseases such as glutaric aciduria type II, glyceric aciduria and, in one sample, 3-methylcrotonyl-CoA carboxylase deficiency. To identify the factors that allow some laboratories to perform well on a consistent basis while others perform badly, we devised a questionnaire and compared the responses with the results for performance in the scheme. A trend towards better performance could be demonstrated for those laboratories that regularly use internal quality control (QC) samples in their sample preparation (p = 0.079) and those that participate in further external quality assurance (EQA) schemes (p = 0.040). Clinicians who depend upon these diagnostic services to identify patients with these defects and the laboratories that provide them should be aware of the potential for missed diagnoses and the factors that may lead to improved performance.

  15. Safety and performance analysis of a commercial photovoltaic installation

    NASA Astrophysics Data System (ADS)

    Hamzavy, Babak T.; Bradley, Alexander Z.

    2013-09-01

    Continuing to better understand the performance of PV systems, and changes in performance over the system's life, is vital to the sustainable growth of solar. A systematic understanding of degradation mechanisms that are induced as a result of variables such as the service environment, installation, module/material design, weather, operation and maintenance, and manufacturing is required for reliable operation throughout a system's lifetime. We report the results from an analysis of a commercial c-Si PV array owned and operated by DuPont. We assessed the electrical performance of the modules by comparing the original manufacturers' performance data with measurements obtained using a solar simulator to determine the degradation rate. This evaluation provides valuable PV system field experience and documents key issues regarding safety and performance. A review of the nondestructive and destructive analytical methods and characterization strategies we have found useful for system, module, and subsequent material component evaluations is presented. We provide an overview of our inspection protocol and of the subsequent control process to mitigate risk. The objective is to explore and develop best practice protocols for PV asset optimization and to provide a rationale for reducing risk based on the analysis of our own commercial installations.

  16. The influence of three genes on whether adolescents use contraception, United States 1994-2002

    PubMed Central

    Daw, Jonathan; Guo, Guang

    2013-01-01

    In a further contribution to recent investigations of the relevance of genetic processes for demographic outcomes, we investigate genetic associations with whether adolescents use contraception. Using data from the National Longitudinal Study of Adolescent Health, we find that variants in the dopamine transporter gene DAT1, the dopamine receptor gene DRD2, and the monoamine oxidase gene MAOA are associated with unprotected sexual intercourse. Consistent with previous analyses of these data, the genotypes DRD2*A1/A2, DRD2*A2/A2, DAT1*9R/10R, and MAOA*2R/ are associated with higher odds of unprotected sexual intercourse than other genotypes at these loci. The DRD2 associations apply to both men and women, whereas the other associations apply to women only. These results are robust to controls for population stratification by continental ancestry, do not vary by contraceptive type, and are consistent with previous research showing that these genetic variants are associated with higher rates of impulsivity. PMID:21916669

  17. Phosphorus and suspended sediment load estimates for the Lower Boise River, Idaho, 1994-2002

    USGS Publications Warehouse

    Donato, Mary M.; MacCoy, Dorene E.

    2004-01-01

    The U.S. Geological Survey used LOADEST, newly developed load estimation software, to develop regression equations and estimate loads of total phosphorus (TP), dissolved orthophosphorus (OP), and suspended sediment (SS) from January 1994 through September 2002 at four sites on the lower Boise River: Boise River below Diversion Dam near Boise, Boise River at Glenwood Bridge at Boise, Boise River near Middleton, and Boise River near Parma. The objective was to help the Idaho Department of Environmental Quality develop and implement total maximum daily loads (TMDLs) by providing spatial and temporal resolution for phosphorus and sediment loads and enabling load estimates made by mass balance calculations to be refined and validated. Regression models for TP and OP generally were well fit on the basis of regression coefficients of determination (R2), but results varied in quality from site to site. The TP and OP results for Glenwood probably were affected by the upstream wastewater-treatment plant outlet, which provides a variable phosphorus input that is unrelated to river discharge. Regression models for SS generally were statistically well fit. Regression models for Middleton for all constituents, although statistically acceptable, were of limited usefulness because sparse and intermittent discharge data at that site caused many gaps in the resulting estimates. Although the models successfully simulated measured loads under predominant flow conditions, errors in TP and SS estimates at Middleton and in TP estimates at Parma were larger during high- and low-flow conditions. This shortcoming might be improved if additional concentration data for a wider range of flow conditions were available for calibrating the model. The average estimated daily TP load ranged from less than 250 pounds per day (lb/d) at Diversion to nearly 2,200 lb/d at Parma. Estimated TP loads at all four sites displayed cyclical variations coinciding with seasonal fluctuations in discharge. Estimated annual loads of TP ranged from less than 8 tons at Diversion to 570 tons at Parma. Annual loads of dissolved OP peaked in 1997 at all sites and were consistently higher at Parma than at the other sites. The ratio of OP to TP varied considerably throughout the year at all sites. Peaks in the OP:TP ratio occurred primarily when flows were at their lowest annual stages; estimated seasonal OP:TP ratios were highest in autumn at all sites. Conversely, when flows were high, the ratio was low, reflecting increased TP associated with particulate matter during high flows. Parma exhibited the highest OP:TP ratio during all seasons, at least 0.60 in spring and nearly 0.90 in autumn. Similar OP:TP ratios were estimated at Glenwood. Whereas the OP:TP ratio for Parma and Glenwood peaked in November or December, decreased from January through May, and increased again after June, estimates for Diversion showed nearly the opposite pattern: ratios were highest in July and lowest in January and February. This difference might reflect complex biological and geochemical processes involving nutrient cycling in Lucky Peak Lake, but further data are needed to substantiate this hypothesis. Estimated monthly average SS loads were highest at Diversion, about 400 tons per day (ton/d). Average annual loads from 1994 through 2002 were 144,000 tons at Diversion, 33,000 tons at Glenwood, and 88,000 tons at Parma. Estimated SS loads peaked in the spring at all sites, coinciding with high flows.
    Increases in TP in the reach from Diversion to Glenwood ranged from 200 to 350 lb/d. Decreases in TP were small in this reach only during high flows in January and February 1997. Decreases in SS were large during high-flow conditions, indicating sediment deposition in the reach. Intermittent data at Middleton indicated that increases and decreases in TP in the reach from Glenwood to Middleton were during low- and high-flow conditions, respectively. All constituents increased in the r
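
    The LOADEST approach can be illustrated with a minimal rating-curve regression (an assumed, simplified form; LOADEST's predefined models and calibration options are more elaborate, and the data below are synthetic): regress the log of load on log discharge plus seasonal terms, then exponentiate to estimate a daily load.

        # Sketch: LOADEST-style rating-curve regression,
        #   ln(L) = b0 + b1*ln(Q) + b2*sin(2*pi*T) + b3*cos(2*pi*T),
        # fitted by ordinary least squares to sampled days.
        import numpy as np

        rng = np.random.default_rng(0)
        T = rng.uniform(0, 1, 60)               # decimal time within year
        Q = np.exp(rng.normal(5, 1, 60))        # sampled discharge
        lnL = -2 + 1.1 * np.log(Q) + 0.3 * np.sin(2 * np.pi * T) \
              + rng.normal(0, 0.2, 60)          # synthetic observed ln(load)

        X = np.column_stack([np.ones_like(T), np.log(Q),
                             np.sin(2 * np.pi * T), np.cos(2 * np.pi * T)])
        beta, *_ = np.linalg.lstsq(X, lnL, rcond=None)

        # Estimate the load for a new day (retransformation bias correction
        # omitted for brevity; LOADEST applies one).
        q_new, t_new = 300.0, 0.4
        x_new = np.array([1.0, np.log(q_new),
                          np.sin(2 * np.pi * t_new), np.cos(2 * np.pi * t_new)])
        print(f"estimated load: {np.exp(x_new @ beta):.1f} lb/d")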

  18. Merging Right: Questions of Access and Merit in South African Higher Education Reform, 1994-2002

    ERIC Educational Resources Information Center

    Elliott, John

    2005-01-01

    The dismantling of South Africa's apartheid-controlled education system after 1994 brought with it unprecedented policy complications, among them the question of how best to integrate the desiderata of access and merit in school education and tertiary sectors. For the higher education sector, institutional mergers became an increasingly visible…

  19. Performance analysis of FDDI network under frequent bidding requirements

    NASA Astrophysics Data System (ADS)

    Neo, L. K.; Cheng, T. H.; Subramanian, K. R.; Dubey, V. K.

    1993-05-01

    A new bidding scheme is described for the fiber distributed data interface (FDDI). An analysis of the throughput performance of an FDDI network is presented under the assumption of heavy load, for a scheme which allows the target token rotation time (TTRT) to be bid for and adjusted frequently as and when the access time requirements of synchronous traffic change. Our results show that better throughput performance is achievable under the new bidding scheme. It is also observed that although re-bidding is desirable, escalating and uncontrolled bidding intensity may incur undue overheads that result in unacceptable throughput degradation.

  20. Item difficulty analysis of the tactual performance test trials.

    PubMed

    Charter, R A

    2000-12-01

    A study of item difficulty was performed on data from the Tactual Performance Test trials (preferred, nonpreferred, and both hands) on three groups (normal, alcoholic, and heterogeneous). The total sample size was 314. The three-way split-plot analysis of variance yielded three significant two-way interactions, i.e., group by trial, group by block, and trial by block, thereby making the interpretation complex. One finding is that the blocks at the top of the board are more difficult than the blocks at the bottom of the board.

  1. Aerodynamic Analysis of Cup Anemometers Performance: The Stationary Harmonic Response

    PubMed Central

    Pindado, Santiago; Cubas, Javier; Sanz-Andrés, Ángel

    2013-01-01

    The effect of cup anemometer shape parameters, such as the cups' shape, their size, and their center rotation radius, was experimentally analyzed. This analysis was based on both the calibration constants of the transfer function and the most important harmonic term of the rotor's movement, which due to the cup anemometer design is the third one. This harmonic analysis represents a new approach to study cup anemometer performances. The results clearly showed a good correlation between the average rotational speed of the anemometer's rotor and the mentioned third harmonic term of its movement. PMID:24381512
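
    Extracting the third harmonic of the rotor's speed can be done with a discrete Fourier transform, as in this sketch (illustrative only; the rotor frequency, ripple amplitude, and sampling rate are assumed):

        # Sketch: extract the amplitude of the third harmonic (three
        # oscillations per turn) from a sampled rotational-speed signal.
        import numpy as np

        f_rot = 10.0   # rotor frequency, Hz (assumed)
        fs = 1000.0    # sample rate, Hz
        t = np.arange(0, 2.0, 1 / fs)
        # Mean speed plus a third-harmonic ripple, as for a three-cup rotor.
        omega = 2 * np.pi * f_rot * (1 + 0.05 * np.cos(3 * 2 * np.pi * f_rot * t))

        spec = np.fft.rfft(omega - omega.mean())
        freqs = np.fft.rfftfreq(len(omega), 1 / fs)
        k3 = np.argmin(np.abs(freqs - 3 * f_rot))
        amp3 = 2 * np.abs(spec[k3]) / len(omega)
        print(f"third-harmonic amplitude: {amp3:.3f} rad/s")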

  2. INL FY2014 1st Quarterly Performance Analysis

    SciTech Connect

    Loran Kinghorn

    2014-07-01

    This report is published quarterly by the Idaho National Laboratory (INL) Performance Assurance Organization. The Department of Energy Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of 76 occurrence reports and over 16 other deficiency reports (including not reportable events) identified at the INL during the period of October 2013 through December 2013. Battelle Energy Alliance (BEA) operates the INL under contract DE-AC07-05ID14517.

  3. RTOD- RADIAL TURBINE OFF-DESIGN PERFORMANCE ANALYSIS

    NASA Technical Reports Server (NTRS)

    Glassman, A. J.

    1994-01-01

    The RTOD program was developed to accurately predict radial turbine off-design performance. The radial turbine has been used extensively in automotive turbochargers and aircraft auxiliary power units. It is now being given serious consideration for primary powerplant applications. In applications where the turbine will operate over a wide range of power settings, accurate off-design performance prediction is essential for a successful design. RTOD predictions have already illustrated a potential improvement in off-design performance offered by rotor back-sweep for high-work-factor radial turbines. RTOD can be used to analyze other potential performance enhancing design features. RTOD predicts the performance of a radial turbine (with or without rotor blade sweep) as a function of pressure ratio, speed, and stator setting. The program models the flow with the following: 1) stator viscous and trailing edge losses; 2) a vaneless space loss between the stator and the rotor; and 3) rotor incidence, viscous, trailing-edge, clearance, and disk friction losses. The stator and rotor viscous losses each represent the combined effects of profile, endwall, and secondary flow losses. The stator inlet and exit and the rotor inlet flows are modeled by a mean-line analysis, but a sector analysis is used at the rotor exit. The leakage flow through the clearance gap in a pivoting stator is also considered. User input includes gas properties, turbine geometry, and the stator and rotor viscous losses at a reference performance point. RTOD output includes predicted turbine performance over a specified operating range and any user selected flow parameters. The RTOD program is written in FORTRAN IV for batch execution and has been implemented on an IBM 370 series computer with a central memory requirement of approximately 100K of 8 bit bytes. The RTOD program was developed in 1983.

  4. Performance analysis of a generalized upset detection procedure

    NASA Technical Reports Server (NTRS)

    Blough, Douglas M.; Masson, Gerald M.

    1987-01-01

    A general procedure for upset detection in complex systems, called the data block capture and analysis upset monitoring process, is described and analyzed. The process consists of repeatedly recording a fixed amount of data from a set of predetermined observation lines of the system being monitored (i.e., capturing a block of data), and then analyzing the captured block in an attempt to determine whether the system is functioning correctly. The algorithm which analyzes the data blocks can be characterized in terms of the amount of time it requires to examine a data block of given length to ascertain the existence of features/conditions that have been predetermined to characterize the upset-free behavior of the system. The performance of linear, quadratic, and logarithmic data analysis algorithms is rigorously characterized in terms of three performance measures: (1) the probability of correctly detecting an upset; (2) the expected number of false alarms; and (3) the expected latency in detecting upsets.
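
    A toy version of the monitoring loop and its three measures might look like the following Monte Carlo sketch (the detector, error model, and rates are assumptions, not taken from the paper):

        # Sketch: Monte Carlo estimate of detection probability, false
        # alarms, and detection latency for a block-capture upset monitor.
        import random

        BLOCK_LEN = 256
        P_SAMPLE_UPSET = 0.01  # per-sample corruption rate when upset (assumed)
        P_FALSE = 0.001        # detector false-positive rate per block (assumed)

        def analyze_block(upset):
            """Linear-time detector: flag the block if any sample deviates."""
            if upset:
                return any(random.random() < P_SAMPLE_UPSET
                           for _ in range(BLOCK_LEN))
            return random.random() < P_FALSE

        random.seed(1)
        trials, detected, latency, false_alarms = 2000, 0, 0, 0
        for _ in range(trials):
            for block in range(1, 51):   # up to 50 blocks after upset onset
                if analyze_block(upset=True):
                    detected += 1
                    latency += block
                    break
            false_alarms += analyze_block(upset=False)

        print(f"P(detect within 50 blocks): {detected / trials:.3f}")
        print(f"mean detection latency:     {latency / max(detected, 1):.2f} blocks")
        print(f"false alarms per block:     {false_alarms / trials:.4f}")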

  5. High Performance Descriptive Semantic Analysis of Semantic Graph Databases

    SciTech Connect

    Joslyn, Cliff A.; Adolf, Robert D.; al-Saffar, Sinan; Feo, John T.; Haglin, David J.; Mackey, Greg E.; Mizell, David W.

    2011-06-02

    As semantic graph database technology grows to address components ranging from extant large triple stores to SPARQL endpoints over SQL-structured relational databases, it will become increasingly important to be able to understand their inherent semantic structure, whether codified in explicit ontologies or not. Our group is researching novel methods for what we call descriptive semantic analysis of RDF triplestores, to serve purposes of analysis, interpretation, visualization, and optimization. But data size and computational complexity make it increasingly necessary to bring high performance computational resources to bear on this task. Our research group built a novel high performance hybrid system comprising computational capability for semantic graph database processing utilizing the large multi-threaded architecture of the Cray XMT platform, conventional servers, and large data stores. In this paper we describe that architecture and our methods, and present the results of our analyses of basic properties, connected components, namespace interaction, and typed paths for the Billion Triple Challenge 2010 dataset.
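
    One of the basic properties named, connected components, reduces to a union-find pass over the triples, as in this sketch (a generic single-threaded illustration, nothing like the scale of the Cray XMT implementation; the triples are invented):

        # Sketch: count weakly connected components of an RDF-like triple
        # store with union-find over subject and object nodes.
        triples = [
            ("alice", "knows", "bob"),
            ("bob", "worksAt", "acme"),
            ("carol", "knows", "dave"),
        ]

        parent = {}

        def find(x):
            parent.setdefault(x, x)
            while parent[x] != x:
                parent[x] = parent[parent[x]]  # path halving
                x = parent[x]
            return x

        def union(a, b):
            parent[find(a)] = find(b)

        for s, _, o in triples:
            union(s, o)

        components = {find(n) for n in parent}
        print(f"{len(components)} connected components")  # prints: 2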

  6. Performance analysis of wireless sensor networks in geophysical sensing applications

    NASA Astrophysics Data System (ADS)

    Uligere Narasimhamurthy, Adithya

    Performance is an important criterion to consider before switching from a wired network to a wireless sensing network. Performance is especially important in geophysical sensing, where the quality of the sensing system is measured by the precision of the acquired signal. Can a wireless sensing network maintain the same reliability and quality metrics that a wired system provides? Our work focuses on evaluating the wireless GeoMote sensor motes that were developed by previous computer science graduate students at Mines. Specifically, we conducted a set of experiments, namely the WalkAway and Linear Array experiments, to characterize the performance of the wireless motes. The motes were also equipped with the Sticking Heartbeat Aperture Resynchronization Protocol (SHARP), a time synchronization protocol developed by a previous computer science graduate student at Mines. This protocol should automatically synchronize the motes' internal clocks and reduce time synchronization errors. We also collected passive data to evaluate the response of the GeoMotes to the various frequency components associated with seismic waves. With the data collected from these experiments, we evaluated the performance of the SHARP protocol and compared the performance of our GeoMote wireless system against an industry-standard wired seismograph system (Geometrics Geode). Using arrival time analysis and seismic velocity calculations, we set out to answer the following question: Can our wireless sensing system (GeoMotes) perform similarly to a traditional wired system in a realistic scenario?
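
    For a linear array, the arrival time analysis mentioned at the end reduces to fitting first-arrival time against offset; the inverse of the slope is the apparent seismic velocity. A minimal sketch (with invented picks, not data from the GeoMote trials):

        # Sketch: estimate apparent seismic velocity from first-arrival
        # picks along a linear array via a least-squares fit t = t0 + x/v.
        import numpy as np

        offsets = np.array([10.0, 20.0, 30.0, 40.0, 50.0])     # metres
        picks = np.array([0.025, 0.047, 0.071, 0.093, 0.118])  # seconds

        slope, t0 = np.polyfit(offsets, picks, 1)
        print(f"apparent velocity: {1 / slope:.0f} m/s, "
              f"intercept: {t0 * 1e3:.1f} ms")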

  7. Performance Demonstration Program Plan for Analysis of Simulated Headspace Gases

    SciTech Connect

    Carlsbad Field Office

    2007-11-19

    The Performance Demonstration Program (PDP) for headspace gases distributes blind audit samples in a gas matrix for analysis of volatile organic compounds (VOCs). Participating measurement facilities (i.e., fixed laboratories, mobile analysis systems, and on-line analytical systems) are located across the United States. Each sample distribution is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed for transuranic (TRU) waste characterization. The primary documents governing the conduct of the PDP are the Quality Assurance Program Document (QAPD) (DOE/CBFO-94-1012) and the Waste Isolation Pilot Plant (WIPP) Waste Analysis Plan (WAP) contained in the Hazardous Waste Facility Permit (NM4890139088-TSDF) issued by the New Mexico Environment Department (NMED). The WAP requires participation in the PDP; the PDP must comply with the QAPD and the WAP. This plan implements the general requirements of the QAPD and the applicable requirements of the WAP for the Headspace Gas (HSG) PDP. Participating measurement facilities analyze blind audit samples of simulated TRU waste package headspace gases according to the criteria set by this PDP Plan. Blind audit samples (hereafter referred to as PDP samples) are used as an independent means to assess each measurement facility’s compliance with the WAP quality assurance objectives (QAOs). To the extent possible, the concentrations of VOC analytes in the PDP samples encompass the range of concentrations anticipated in actual TRU waste package headspace gas samples. Analyses of headspace gases are required by the WIPP to demonstrate compliance with regulatory requirements. These analyses must be performed by measurement facilities that have demonstrated acceptable performance in this PDP. These analyses are referred to as WIPP analyses and the TRU waste package headspace gas samples on which they are performed are referred to as WIPP samples in this document

  8. Performance Demonstration Program Plan for Analysis of Simulated Headspace Gases

    SciTech Connect

    Carlsbad Field Office

    2007-11-13

    The Performance Demonstration Program (PDP) for headspace gases distributes blind audit samples in a gas matrix for analysis of volatile organic compounds (VOCs). Participating measurement facilities (i.e., fixed laboratories, mobile analysis systems, and on-line analytical systems) are located across the United States. Each sample distribution is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed for transuranic (TRU) waste characterization. The primary documents governing the conduct of the PDP are the Quality Assurance Program Document (QAPD) (DOE/CBFO-94-1012) and the Waste Isolation Pilot Plant (WIPP) Waste Analysis Plan (WAP) contained in the Hazardous Waste Facility Permit (NM4890139088-TSDF) issued by the New Mexico Environment Department (NMED). The WAP requires participation in the PDP; the PDP must comply with the QAPD and the WAP. This plan implements the general requirements of the QAPD and the applicable requirements of the WAP for the Headspace Gas (HSG) PDP. Participating measurement facilities analyze blind audit samples of simulated TRU waste package headspace gases according to the criteria set by this PDP Plan. Blind audit samples (hereafter referred to as PDP samples) are used as an independent means to assess each measurement facility’s compliance with the WAP quality assurance objectives (QAOs). To the extent possible, the concentrations of VOC analytes in the PDP samples encompass the range of concentrations anticipated in actual TRU waste package headspace gas samples. Analyses of headspace gases are required by the WIPP to demonstrate compliance with regulatory requirements. These analyses must be performed by measurement facilities that have demonstrated acceptable performance in this PDP. These analyses are referred to as WIPP analyses and the TRU waste package headspace gas samples on which they are performed are referred to as WIPP samples in this document

  9. Performance Demonstration Program Plan for Analysis of Simulated Headspace Gases

    SciTech Connect

    Carlsbad Field Office

    2006-04-01

    The Performance Demonstration Program (PDP) for headspace gases distributes sample gases of volatile organic compounds (VOCs) for analysis. Participating measurement facilities (i.e., fixed laboratories, mobile analysis systems, and on-line analytical systems) are located across the United States. Each sample distribution is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed for transuranic (TRU) waste characterization. The primary documents governing the conduct of the PDP are the Quality Assurance Program Document (QAPD) (DOE/CBFO-94-1012) and the Waste Isolation Pilot Plant (WIPP) Waste Analysis Plan (WAP) contained in the Hazardous Waste Facility Permit (NM4890139088-TSDF) issued by the New Mexico Environment Department (NMED). The WAP requires participation in the PDP; the PDP must comply with the QAPD and the WAP. This plan implements the general requirements of the QAPD and the applicable requirements of the WAP for the Headspace Gas (HSG) PDP. Participating measurement facilities analyze blind audit samples of simulated TRU waste package headspace gases according to the criteria set by this PDP Plan. Blind audit samples (hereafter referred to as PDP samples) are used as an independent means to assess each measurement facility’s compliance with the WAP quality assurance objectives (QAOs). To the extent possible, the concentrations of VOC analytes in the PDP samples encompass the range of concentrations anticipated in actual TRU waste package headspace gas samples. Analyses of headspace gases are required by the WIPP to demonstrate compliance with regulatory requirements. These analyses must be performed by measurement facilities that have demonstrated acceptable performance in this PDP. These analyses are referred to as WIPP analyses and the TRU waste package headspace gas samples on which they are performed are referred to as WIPP samples in this document. Participating measurement

  10. SIMS analysis of high-performance accelerator niobium

    SciTech Connect

    Maheshwari, P.; Stevie, F. A.; Myneni, Ganapati Rao; Rigsbee, J. M.; Dhakal, Pashupati; Ciovati, Gianluigi; Griffis, D. P.

    2014-11-01

    Niobium is used to fabricate superconducting radio frequency accelerator modules because of its high critical temperature, high critical magnetic field, and easy formability. Recent experiments have shown a very significant improvement in performance (over 100%) after a high-temperature bake at 1400 degrees C for 3h. SIMS analysis of this material showed the oxygen profile was significantly deeper than the native oxide with a shape that is indicative of diffusion. Positive secondary ion mass spectra showed the presence of Ti with a depth profile similar to that of O. It is suspected that Ti is associated with the performance improvement. The source of Ti contamination in the anneal furnace has been identified, and a new furnace was constructed without Ti. Initial results from the new furnace do not show the yield improvement. Further analyses should determine the relationship of Ti to cavity performance.
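
    A profile "indicative of diffusion" can be tested against the constant-source solution of Fick's second law, C(x) = C0 · erfc(x / (2√(Dt))). The sketch below fits that form to a depth profile; the depth and concentration arrays, the initial guesses, and the constant-source assumption are illustrative stand-ins, not the authors' SIMS data or analysis.

```python
# Sketch: fit a constant-source diffusion profile C(x) = c0 * erfc(x/(2*sqrt(D*t)))
# to a depth profile. Data, guesses, and units are illustrative assumptions.
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erfc

T_ANNEAL = 3 * 3600.0  # anneal time in seconds (3 h, from the abstract)

def diffusion_profile(x, c0, d_coeff):
    """Concentration at depth x [cm] for diffusion coefficient d_coeff [cm^2/s]."""
    return c0 * erfc(x / (2.0 * np.sqrt(d_coeff * T_ANNEAL)))

depth = np.linspace(0.0, 2e-4, 50)           # hypothetical depths [cm]
conc = diffusion_profile(depth, 1.0, 1e-11)  # synthetic "measured" profile
conc *= 1.0 + 0.05 * np.random.default_rng(0).normal(size=depth.size)

popt, _ = curve_fit(diffusion_profile, depth, conc, p0=[1.0, 1e-11])
print(f"fitted C0 = {popt[0]:.3g}, D = {popt[1]:.3g} cm^2/s")
```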

  11. Linear optimization - A case study in performance analysis

    NASA Technical Reports Server (NTRS)

    Stunkel, Craig B.; Fuchs, W. Kent; Rudolph, David C.; Reed, Daniel A.

    1989-01-01

    The paper deals with the performance of two parallel variants of the simplex algorithm on a message-passing system. First, the simplex algorithm is reviewed, two possible parallelizations of the algorithm are discussed, and results of benchmark speedups of the alternatives are presented. Between column and row partitionings, the row partitioning method is found to be generally superior, while the column partitioning method is more efficient when the number of rows is small and the number of columns is much greater than the number of rows. Various performance analysis tools are then applied to examine the reasons for the relative performance differences, and communication idle time due to global minimization and load imbalances is noted as the main factor in execution slowdown.

  12. Crew Exploration Vehicle Launch Abort Controller Performance Analysis

    NASA Technical Reports Server (NTRS)

    Sparks, Dean W., Jr.; Raney, David L.

    2007-01-01

    This paper covers the simulation and evaluation of a controller design for the Crew Module (CM) Launch Abort System (LAS), to measure its ability to meet the abort performance requirements. The controller used in this study is a hybrid design, including features developed by the Government and the Contractor. Testing is done using two separate 6-degree-of-freedom (DOF) computer simulation implementations of the LAS/CM throughout the ascent trajectory: 1) executing a series of abort simulations along a nominal trajectory for the nominal LAS/CM system; and 2) using a series of Monte Carlo runs with perturbed initial flight conditions and perturbed system parameters. The performance of the controller is evaluated against a set of criteria, which is based upon the current functional requirements of the LAS. Preliminary analysis indicates that the performance of the present controller meets (with the exception of a few cases) the evaluation criteria mentioned above.

  13. Total systems design analysis of high performance structures

    NASA Technical Reports Server (NTRS)

    Verderaime, V.

    1993-01-01

    Designer-control parameters were identified at interdiscipline interfaces to optimize structural systems performance and downstream development and operations with reliability and least life-cycle cost. Interface tasks and iterations are tracked through a matrix of performance disciplines integration versus manufacturing, verification, and operations interactions for a total system design analysis. Performance integration tasks include shapes, sizes, environments, and materials. Integrity integration tasks are reliability and recurring structural costs. Significant interface designer-control parameters were noted as shapes, dimensions, probability range factors, and cost. A structural failure concept is presented, and first-order reliability and deterministic methods, benefits, and limitations are discussed. A deterministic reliability technique combining the benefits of both is proposed for static structures, which is also timely and economically verifiable. Though launch vehicle environments were primarily considered, the system design process is applicable to any surface system using its own unique field environments.

  14. The Vehicle Integrated Performance Analysis Experience: Reconnecting With Technical Integration

    NASA Technical Reports Server (NTRS)

    McGhee, D. S.

    2006-01-01

    Very early in the Space Launch Initiative program, a small team of engineers at MSFC proposed a process for performing system-level assessments of a launch vehicle. Aimed primarily at providing insight and making NASA a smart buyer, the Vehicle Integrated Performance Analysis (VIPA) team was created. The difference between the VIPA effort and previous integration attempts is that VIPA is a process using experienced people from various disciplines, which focuses them on a technically integrated assessment. The foundations of VIPA's process are described. The VIPA team also recognized the need to target early detailed analysis toward identifying significant systems issues. This process is driven by the T-model for technical integration. VIPA's approach to performing system-level technical integration is discussed in detail. The VIPA process significantly enhances the development and monitoring of realizable project requirements. VIPA's assessment validates the concept's stated performance, identifies significant issues either with the concept or the requirements, and then reintegrates these issues to determine impacts. This process is discussed along with a description of how it may be integrated into a program's insight and review process. The VIPA process has gained favor with both engineering and project organizations for being responsive and insightful.

  15. Dynamic performances analysis of a real vehicle driving

    NASA Astrophysics Data System (ADS)

    Abdullah, M. A.; Jamil, J. F.; Salim, M. A.

    2015-12-01

    Vehicle dynamics concerns the effects of vehicle motion generated by acceleration, braking, ride, and handling activities. The dynamic behaviours are determined by the forces from the tires, gravity, and aerodynamics acting on the vehicle. This paper emphasizes the analysis of the dynamic performance of a real vehicle. A real driving experiment is conducted to determine the vehicle's roll, pitch, and yaw responses, as well as its longitudinal, lateral, and vertical accelerations. An accelerometer is used to record the vehicle's dynamic performance while it is driven on the road. The experiment starts with weighing the car to determine the center of gravity (COG), where the accelerometer sensor is placed for data acquisition (DAQ). The COG of the vehicle is determined from its weight distribution. A rural route is set for the experiment and the road conditions along the route are characterized for the test. The dynamic performance of the vehicle depends on the road conditions and the driving maneuvers. The stability of a vehicle can be assessed through this dynamic performance analysis.

  16. Quantitative analysis of the reconstruction performance of interpolants

    NASA Technical Reports Server (NTRS)

    Lansing, Donald L.; Park, Stephen K.

    1987-01-01

    The analysis presented provides a quantitative measure of the reconstruction or interpolation performance of linear, shift-invariant interpolants. The performance criterion is the mean square error of the difference between the sampled and reconstructed functions. The analysis is applicable to reconstruction algorithms used in image processing and to many types of splines used in numerical analysis and computer graphics. When formulated in the frequency domain, the mean square error clearly separates the contribution of the interpolation method from the contribution of the sampled data. The equations provide a rational basis for selecting an optimal interpolant; that is, one which minimizes the mean square error. The analysis has been applied to a selection of frequently used data splines and reconstruction algorithms: parametric cubic and quintic Hermite splines, exponential and nu splines (including the special case of the cubic spline), parametric cubic convolution, Keys' fourth-order cubic, and a cubic with a discontinuous first derivative. The emphasis in this paper is on the image-dependent case in which no a priori knowledge of the frequency spectrum of the sampled function is assumed.
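
    A minimal numerical sketch of the frequency-domain criterion, assuming unit sample spacing and a linear (triangle) interpolant whose frequency response is sinc²: the error kernel E(f) below isolates the interpolant's contribution, so the mean square error is the integral of the signal spectrum times E(f). This is the standard shift-invariant error-kernel form, not necessarily the paper's exact notation.

```python
# Sketch: frequency-domain error kernel for a shift-invariant interpolant,
# here linear interpolation. MSE = integral of S(f) * E(f) df, so E(f)
# separates the interpolant's contribution from the sampled data's spectrum.
import numpy as np

def h_linear(f):
    """Frequency response of the linear (triangle) interpolator: sinc^2."""
    return np.sinc(f) ** 2

def error_kernel(f, n_alias=20):
    """E(f) = |1 - H(f)|^2 + sum over k != 0 of |H(f + k)|^2."""
    e = np.abs(1.0 - h_linear(f)) ** 2
    for k in range(1, n_alias + 1):
        e += np.abs(h_linear(f + k)) ** 2 + np.abs(h_linear(f - k)) ** 2
    return e

for fi in np.linspace(0.0, 0.5, 6):  # frequencies up to Nyquist
    print(f"f = {fi:.1f}  E(f) = {error_kernel(fi):.4e}")
```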

  17. An advanced probabilistic structural analysis method for implicit performance functions

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Millwater, H. R.; Cruse, T. A.

    1989-01-01

    In probabilistic structural analysis, the performance or response functions usually are implicitly defined and must be solved by numerical analysis methods such as finite element methods. In such cases, the most commonly used probabilistic analysis tool is the mean-based, second-moment method which provides only the first two statistical moments. This paper presents a generalized advanced mean value (AMV) method which is capable of establishing the distributions to provide additional information for reliability design. The method requires slightly more computations than the second-moment method but is highly efficient relative to the other alternative methods. In particular, the examples show that the AMV method can be used to solve problems involving non-monotonic functions that result in truncated distributions.

  18. Analysis of latency performance of bluetooth low energy (BLE) networks.

    PubMed

    Cho, Keuchul; Park, Woojin; Hong, Moonki; Park, Gisu; Cho, Wooseong; Seo, Jihoon; Han, Kijun

    2014-12-23

    Bluetooth Low Energy (BLE) is a short-range wireless communication technology aiming at low-cost and low-power communication. The performance evaluation of classical Bluetooth device discovery has been intensively studied using analytical modeling and simulative methods, but these techniques are not applicable to BLE, since BLE has a fundamental change in the design of the discovery mechanism, including the usage of three advertising channels. Several recent works have analyzed the topic of BLE device discovery, but these studies are still far from thorough. It is thus necessary to develop a new, accurate model for the BLE discovery process. In particular, the wide range of parameter settings introduces considerable scope for BLE devices to customize their discovery performance. This motivates our study of modeling the BLE discovery process and performing intensive simulation. This paper is focused on building an analytical model to investigate the discovery probability, as well as the expected discovery latency, which are then validated via extensive experiments. Our analysis considers both continuous and discontinuous scanning modes. We analyze the sensitivity of these performance metrics to parameter settings to quantitatively examine to what extent parameters influence the performance of the discovery processes.
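
    A toy Monte Carlo sketch of the discovery process, assuming a single advertiser, a single advertising channel (the real protocol uses three), continuous scanning, and the parameter values named in the code; none of these come from the paper's model.

```python
# Sketch: Monte Carlo estimate of BLE neighbor-discovery latency under a
# heavily simplified model. All parameter values are assumptions.
import random

ADV_INTERVAL = 0.100   # advertising interval [s] (assumed)
SCAN_INTERVAL = 0.256  # scan interval [s] (assumed)
SCAN_WINDOW = 0.128    # scan window per interval [s] (assumed)

def discovery_latency():
    """Time until an advertising event lands inside an open scan window."""
    t = random.uniform(0.0, ADV_INTERVAL)  # random phase of first advertisement
    while True:
        # BLE adds a random 0-10 ms advDelay to every advertising event.
        t += ADV_INTERVAL + random.uniform(0.0, 0.010)
        if t % SCAN_INTERVAL < SCAN_WINDOW:  # event falls inside a scan window
            return t

samples = [discovery_latency() for _ in range(10_000)]
print(f"mean discovery latency ~ {1000.0 * sum(samples) / len(samples):.1f} ms")
```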

  19. Performance analysis of image fusion methods in transform domain

    NASA Astrophysics Data System (ADS)

    Choi, Yoonsuk; Sharifahmadian, Ershad; Latifi, Shahram

    2013-05-01

    Image fusion involves merging two or more images in such a way as to retain the most desirable characteristics of each. There are various image fusion methods and they can be classified into three main categories: i) Spatial domain, ii) Transform domain, and iii) Statistical domain. We focus on the transform domain in this paper as spatial domain methods are primitive and statistical domain methods suffer from a significant increase of computational complexity. In the field of image fusion, performance analysis is important since the evaluation result gives valuable information which can be utilized in various applications, such as military, medical imaging, remote sensing, and so on. In this paper, we analyze and compare the performance of fusion methods based on four different transforms: i) wavelet transform, ii) curvelet transform, iii) contourlet transform and iv) nonsubsampled contourlet transform. Fusion framework and scheme are explained in detail, and two different sets of images are used in our experiments. Furthermore, various performance evaluation metrics are adopted to quantitatively analyze the fusion results. The comparison results show that the nonsubsampled contourlet transform method performs better than the other three methods. During the experiments, we also found out that the decomposition level of 3 showed the best fusion performance, and decomposition levels beyond level-3 did not significantly affect the fusion results.
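
    A minimal sketch of the wavelet branch of such a fusion framework, using PyWavelets at the level-3 decomposition the paper found best. The average/max-absolute fusion rules are common choices assumed here, not necessarily the paper's scheme.

```python
# Sketch: wavelet-domain image fusion at decomposition level 3 with PyWavelets.
# Average the approximation band, keep the larger-magnitude detail coefficients.
import numpy as np
import pywt

def fuse_wavelet(img_a, img_b, wavelet="db2", level=3):
    ca = pywt.wavedec2(img_a, wavelet, level=level)
    cb = pywt.wavedec2(img_b, wavelet, level=level)
    fused = [(ca[0] + cb[0]) / 2.0]        # average the approximation band
    for da, db in zip(ca[1:], cb[1:]):     # (H, V, D) detail tuples per level
        fused.append(tuple(np.where(np.abs(a) >= np.abs(b), a, b)
                           for a, b in zip(da, db)))
    return pywt.waverec2(fused, wavelet)

img1 = np.random.rand(128, 128)  # stand-ins for the two source images
img2 = np.random.rand(128, 128)
print(fuse_wavelet(img1, img2).shape)
```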

  20. Cross-Industry Performance Modeling: Toward Cooperative Analysis

    SciTech Connect

    H. S. Blackman; W. J. Reece

    1998-10-01

    One of the current unsolved problems in human factors is the difficulty in acquiring information from lessons learned and data collected among human performance analysts in different domains. There are several common concerns and generally accepted issues of importance for human factors, psychology and industry analysts of performance and safety. Among these are the need to incorporate lessons learned in design, to carefully consider implementation of new designs and automation, and the need to reduce human performance-based contributions to risk. In spite of shared concerns, there are several roadblocks to widespread sharing of data and lessons learned from operating experience and simulation, including the fact that very few publicly accessible data bases exist (Gertman & Blackman, 1994, and Kirwan, 1997). There is a need to draw together analysts and analytic methodologies to comprise a centralized source of data with sufficient detail to be meaningful while ensuring source anonymity. We propose that a generic source of performance data and a multi-domain data store may provide the first steps toward cooperative performance modeling and analysis across industries.

  1. Cross-industry Performance Modeling: Toward Cooperative Analysis

    SciTech Connect

    Reece, Wendy Jane; Blackman, Harold Stabler

    1998-10-01

    One of the current unsolved problems in human factors is the difficulty in acquiring information from lessons learned and data collected among human performance analysts in different domains. There are several common concerns and generally accepted issues of importance for human factors, psychology and industry analysts of performance and safety. Among these are the need to incorporate lessons learned in design, to carefully consider implementation of new designs and automation, and the need to reduce human performance-based contributions to risk. In spite of shared concerns, there are several roadblocks to widespread sharing of data and lessons learned from operating experience and simulation, including the fact that very few publicly accessible data bases exist (Gertman & Blackman, 1994, and Kirwan, 1997). There is a need to draw together analysts and analytic methodologies to comprise a centralized source of data with sufficient detail to be meaningful while ensuring source anonymity. We propose that a generic source of performance data and a multi-domain data store may provide the first steps toward cooperative performance modeling and analysis across industries.

  2. Clinical laboratory as an economic model for business performance analysis

    PubMed Central

    Buljanović, Vikica; Patajac, Hrvoje; Petrovečki, Mladen

    2011-01-01

    Aim To perform SWOT (strengths, weaknesses, opportunities, and threats) analysis of a clinical laboratory as an economic model that may be used to improve business performance of laboratories by removing weaknesses, minimizing threats, and using external opportunities and internal strengths. Methods Impact of possible threats to and weaknesses of the Clinical Laboratory at Našice General County Hospital business performance and use of strengths and opportunities to improve operating profit were simulated using models created on the basis of SWOT analysis results. The operating profit as a measure of profitability of the clinical laboratory was defined as total revenue minus total expenses and presented using a profit and loss account. Changes in the input parameters in the profit and loss account for 2008 were determined using opportunities and potential threats, and economic sensitivity analysis was made by using changes in the key parameters. The profit and loss account and economic sensitivity analysis were tools for quantifying the impact of changes in the revenues and expenses on the business operations of clinical laboratory. Results Results of simulation models showed that operational profit of €470 723 in 2008 could be reduced to only €21 542 if all possible threats became a reality and current weaknesses remained the same. Also, operational gain could be increased to €535 804 if laboratory strengths and opportunities were utilized. If both the opportunities and threats became a reality, the operational profit would decrease by €384 465. Conclusion The operational profit of the clinical laboratory could be significantly reduced if all threats became a reality and the current weaknesses remained the same. The operational profit could be increased by utilizing strengths and opportunities as much as possible. This type of modeling may be used to monitor business operations of any clinical laboratory and improve its financial situation by
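
    A minimal sketch of the profit-and-loss scenario modeling described above. The revenue and expense figures are placeholders chosen only so the baseline reproduces the reported EUR 470,723 operating profit; the scenario shocks are likewise invented.

```python
# Sketch: operating profit = total revenue - total expenses, stressed by
# scenario shocks. Baseline figures and shock percentages are assumptions.
BASE_REVENUE = 1_470_723.0   # assumed total revenue for 2008
BASE_EXPENSES = 1_000_000.0  # assumed total expenses for 2008

def operating_profit(revenue_change=0.0, expense_change=0.0):
    """Profit after applying fractional changes to revenue and expenses."""
    return (BASE_REVENUE * (1.0 + revenue_change)
            - BASE_EXPENSES * (1.0 + expense_change))

scenarios = {
    "baseline": (0.00, 0.00),
    "all threats realized": (-0.25, 0.08),
    "strengths + opportunities used": (0.04, -0.02),
}
for name, (d_rev, d_exp) in scenarios.items():
    print(f"{name:30s} profit = EUR {operating_profit(d_rev, d_exp):,.0f}")
```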

  3. Results of a 24-inch Hybrid Motor Performance Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Sims, Joseph D.; Coleman, Hugh W.

    1998-01-01

    The subscale (11 and 24-inch) hybrid motors at the Marshall Space Flight Center (MSFC) have been used as versatile and cost effective testbeds for developing new technology. Comparisons between motor configurations, ignition systems, feed systems, fuel formulations, and nozzle materials have been carried out without detailed consideration as to how "good" the motor performance data were. For the 250,000-lb thrust motor developed by the Hybrid Propulsion Demonstration Program consortium, this shortcoming is particularly risky because motor performance will likely be used as part of a set of downselect criteria to choose between competing ignition and feed systems under development. This analysis directly addresses that shortcoming by applying uncertainty analysis techniques to the experimental determination of the characteristic velocity, theoretical characteristic velocity, and characteristic velocity efficiency for a 24-inch motor firing. With the adoption of fuel-lined headends, flow restriction, and aft mixing chambers, state-of-the-art 24-inch hybrid motors have become very efficient. However, impossibly high combustion efficiencies (some computed as high as 108%) have been measured in some tests with 11-inch motors. This analysis has given new insight into explaining how these efficiencies were measured to be so high, and into which experimental measurements contribute the most to the overall uncertainty.
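
    For a product/quotient data-reduction equation such as the characteristic velocity, c* = Pc · At / mdot, first-order uncertainty propagation adds relative uncertainties in quadrature. A minimal sketch, with illustrative measurement values rather than the MSFC test data:

```python
# Sketch: root-sum-square uncertainty propagation for the experimentally
# determined characteristic velocity c* = Pc * At / mdot. All values assumed.
import math

p_c, u_p = 3.45e6, 0.02e6   # chamber pressure [Pa] and uncertainty
a_t, u_a = 8.0e-3, 0.1e-3   # throat area [m^2] and uncertainty
m_dot, u_m = 10.0, 0.15     # propellant mass flow rate [kg/s] and uncertainty

c_star = p_c * a_t / m_dot
# For a pure product/quotient, relative uncertainties add in quadrature:
rel_u = math.sqrt((u_p / p_c) ** 2 + (u_a / a_t) ** 2 + (u_m / m_dot) ** 2)
print(f"c* = {c_star:.0f} m/s +/- {rel_u * c_star:.0f} m/s ({rel_u:.1%})")
```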

  4. Effects of specified performance criterion and performance feedback on staff behavior: a component analysis.

    PubMed

    Hardesty, Samantha L; Hagopian, Louis P; McIvor, Melissa M; Wagner, Leaora L; Sigurdsson, Sigurdur O; Bowman, Lynn G

    2014-09-01

    The present study isolated the effects of frequently used staff training intervention components to increase communication between direct care staff and clinicians working on an inpatient behavioral unit. Written "protocol review" quizzes developed by clinicians were designed to assess knowledge about a patient's behavioral protocols. Direct care staff completed these at the beginning of each day and evening shift. Clinicians were required to score and discuss these protocol reviews with direct care staff for at least 75% of shifts over a 2-week period. During baseline, only 21% of clinicians met this requirement. Completing and scoring of protocol reviews did not improve following additional in-service training (M = 15%) or following an intervention aimed at decreasing response effort combined with prompting (M = 28%). After implementing an intervention involving specified performance criterion and performance feedback, 86% of clinicians reached the established goal. Results of a component analysis suggested that the presentation of both the specified performance criterion and supporting contingencies was necessary to maintain acceptable levels of performance. PMID:24928213

  5. Effects of specified performance criterion and performance feedback on staff behavior: a component analysis.

    PubMed

    Hardesty, Samantha L; Hagopian, Louis P; McIvor, Melissa M; Wagner, Leaora L; Sigurdsson, Sigurdur O; Bowman, Lynn G

    2014-09-01

    The present study isolated the effects of frequently used staff training intervention components to increase communication between direct care staff and clinicians working on an inpatient behavioral unit. Written "protocol review" quizzes developed by clinicians were designed to assess knowledge about a patient's behavioral protocols. Direct care staff completed these at the beginning of each day and evening shift. Clinicians were required to score and discuss these protocol reviews with direct care staff for at least 75% of shifts over a 2-week period. During baseline, only 21% of clinicians met this requirement. Completing and scoring of protocol reviews did not improve following additional in-service training (M = 15%) or following an intervention aimed at decreasing response effort combined with prompting (M = 28%). After implementing an intervention involving specified performance criterion and performance feedback, 86% of clinicians reached the established goal. Results of a component analysis suggested that the presentation of both the specified performance criterion and supporting contingencies was necessary to maintain acceptable levels of performance.

  6. Performance analysis of charge plasma based dual electrode tunnel FET

    NASA Astrophysics Data System (ADS)

    Anand, Sunny; Intekhab Amin, S.; Sarin, R. K.

    2016-05-01

    This paper proposes the charge plasma based dual electrode doping-less tunnel FET (DEDLTFET). The paper compares the device performance of the conventional doping-less TFET (DLTFET) and the doped TFET (DGTFET). DEDLTFET gives superior results, with a high ON-state current (ION ∼ 0.56 mA/μm), an ION/IOFF ratio of ∼9.12 × 10^13, and an average subthreshold swing (AV-SS ∼ 48 mV/dec). Different device parameters, such as channel length, gate oxide material, gate oxide thickness, silicon thickness, gate work function, and temperature, are varied, and the results are compared with those of DLTFET and DGTFET. This extensive analysis shows that DEDLTFET performs better than the other two devices, indicating an excellent future in low-power applications.

  7. Performance of a modular interactive data analysis system (MIDAS)

    SciTech Connect

    Maples, C.; Weaver, D.; Logan, D.; Rathbun, W.

    1983-01-01

    A processor cluster, part of a multiprocessor system named MIDAS (modular interactive data analysis system), has been constructed and tested. The architecture permits considerable flexibility in organizing the processing elements for different applications. The current tests involved 8 general CPUs from commercial computers, 2 special-purpose pipelined processors, and a specially designed communications system. Results on a variety of programs indicated that the cluster performs from 8 to 16 times faster than a standard computer with an identical CPU. The range reflects the effect of differing CPU and I/O requirements, ranging from CPU-intensive to I/O-intensive workloads. A benchmark test indicated that the cluster performed at approximately 85% of the speed of the CDC 7600. Plans for further cluster enhancements and multicluster operation are discussed.

  8. Fluid and thermal performance analysis of PMSM used for driving

    NASA Astrophysics Data System (ADS)

    Ding, Shuye; Cui, Guanghui; Li, Zhongyu; Guan, Tianyu

    2016-03-01

    The permanent magnet synchronous motor (PMSM) is widely used in ships under frequency conversion control. The fluid flow performance and temperature distribution of the PMSM are difficult to clarify due to its complex structure and variable-frequency control condition. Therefore, in order to investigate the fluid and thermal characteristics of the PMSM, a 50 kW PMSM was taken as an example in this study, and a 3-D coupled fluid-thermal analysis model was established. The fluid and temperature fields were calculated by using the finite volume method. The cooling medium's properties, such as velocity, streamlines, and temperature, were then analyzed. The correctness of the proposed model, and the rationality of the solution method, were verified by a temperature test of the PMSM. In this study, the influence of the cooling medium's changing rheology on its flow performance and on the working temperature of the PMSM was revealed, which could be helpful for designing the PMSM.

  9. Commissioning and Performance Analysis of WhisperGen Stirling Engine

    NASA Astrophysics Data System (ADS)

    Pradip, Prashant Kaliram

    Stirling engine based cogeneration systems have the potential to reduce energy consumption and greenhouse gas emissions, due to their high cogeneration efficiency and the emission control afforded by steady external combustion. To date, most studies on this unit have focused on performance based on both experimentation and computer models, and experimental data for diversified operating ranges are lacking. This thesis starts with the commissioning of a WhisperGen Stirling engine with components and instrumentation to evaluate the power and thermal performance of the system. Next, a parametric study on primary engine variables, including air, diesel, and coolant flowrates and temperatures, was carried out to further understand their effect on engine power and efficiency. This trend was then validated with the thermodynamic model developed for the energy analysis of a Stirling cycle. Finally, the energy balance of the Stirling engine was compared without and with heat recovery from the engine block and the combustion chamber exhaust.

  10. Computational analysis of Variable Thrust Engine (VTE) performance

    NASA Technical Reports Server (NTRS)

    Giridharan, M. G.; Krishnan, A.; Przekwas, A. J.

    1993-01-01

    The Variable Thrust Engine (VTE) of the Orbital Maneuvering Vehicle (OMV) uses a hypergolic propellant combination of Monomethyl Hydrazine (MMH) and Nitrogen Tetroxide (NTO) as fuel and oxidizer, respectively. The performance of the VTE depends on a number of complex interacting phenomena such as atomization, spray dynamics, vaporization, turbulent mixing, convective/radiative heat transfer, and hypergolic combustion. This study involved the development of a comprehensive numerical methodology to facilitate detailed analysis of the VTE. An existing Computational Fluid Dynamics (CFD) code was extensively modified to include the following models: a two-liquid, two-phase Eulerian-Lagrangian spray model; a chemical equilibrium model; and a discrete ordinate radiation heat transfer model. The modified code was used to conduct a series of simulations to assess the effects of various physical phenomena and boundary conditions on the VTE performance. The details of the models and the results of the simulations are presented.

  11. Computational analysis of Variable Thrust Engine (VTE) performance

    NASA Astrophysics Data System (ADS)

    Giridharan, M. G.; Krishnan, A.; Przekwas, A. J.

    1993-02-01

    The Variable Thrust Engine (VTE) of the Orbital Maneuvering Vehicle (OMV) uses a hypergolic propellant combination of Monomethyl Hydrazine (MMH) and Nitrogen Tetroxide (NTO) as fuel and oxidizer, respectively. The performance of the VTE depends on a number of complex interacting phenomena such as atomization, spray dynamics, vaporization, turbulent mixing, convective/radiative heat transfer, and hypergolic combustion. This study involved the development of a comprehensive numerical methodology to facilitate detailed analysis of the VTE. An existing Computational Fluid Dynamics (CFD) code was extensively modified to include the following models: a two-liquid, two-phase Eulerian-Lagrangian spray model; a chemical equilibrium model; and a discrete ordinate radiation heat transfer model. The modified code was used to conduct a series of simulations to assess the effects of various physical phenomena and boundary conditions on the VTE performance. The details of the models and the results of the simulations are presented.

  12. Using SWE Standards for Ubiquitous Environmental Sensing: A Performance Analysis

    PubMed Central

    Tamayo, Alain; Granell, Carlos; Huerta, Joaquín

    2012-01-01

    Although smartphone applications represent the most typical data consumer tool from the citizen perspective in environmental applications, they can also be used for in-situ data collection and production in varied scenarios, such as geological sciences and biodiversity. The use of standard protocols, such as SWE, to exchange information between smartphones and sensor infrastructures brings benefits such as interoperability and scalability, but their reliance on XML is a potential problem when large volumes of data are transferred, due to limited bandwidth and processing capabilities on mobile phones. In this article we present a performance analysis about the use of SWE standards in smartphone applications to consume and produce environmental sensor data, analysing to what extent the performance problems related to XML can be alleviated by using alternative uncompressed and compressed formats.
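
    The XML overhead at issue can be made concrete by comparing an uncompressed SWE-style payload with its gzipped size. The XML snippet below is a made-up stand-in, not a complete O&M/SWE document, and its repetition exaggerates how well real payloads compress.

```python
# Sketch: size of a repetitive SWE-style XML payload before and after gzip.
import gzip

xml_payload = (
    '<om:OM_Observation xmlns:om="http://www.opengis.net/om/2.0">'
    '<om:result uom="Cel">21.4</om:result>'
    '</om:OM_Observation>'
) * 200  # mimic a batch of sensor readings

raw = xml_payload.encode("utf-8")
packed = gzip.compress(raw)
print(f"raw: {len(raw)} bytes, gzip: {len(packed)} bytes "
      f"({len(packed) / len(raw):.1%} of original)")
```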

  13. Performance analysis of a laser propelled interorbital transfer vehicle

    NASA Technical Reports Server (NTRS)

    Minovitch, M. A.

    1976-01-01

    Performance capabilities of a laser-propelled interorbital transfer vehicle receiving propulsive power from one ground-based transmitter were investigated. The laser transmits propulsive energy to the vehicle during successive station fly-overs. By applying a series of these propulsive maneuvers, large payloads can be economically transferred between low earth orbits and synchronous orbits. Operations involving the injection of large payloads onto escape trajectories are also studied. The duration of each successive engine burn must be carefully timed so that the vehicle reappears over the laser station to receive additional propulsive power within the shortest possible time. The analytical solution for determining these time intervals is presented, as is a solution to the problem of determining maximum injection payloads. Parametric computer analysis based on these optimization studies is presented. The results show that relatively low beam powers, on the order of 50 MW to 60 MW, produce significant performance capabilities.

  14. Lifting surface performance analysis for horizontal axis wind turbines

    NASA Astrophysics Data System (ADS)

    Kocurek, D.

    1987-06-01

    This report describes how numerical lifting-surface theory is applied to the calculation of a horizontal-axis wind turbine's aerodynamic characteristics and performance. The report also describes how such an application is implemented as a computer program. The method evolved from rotary-wing and helicopter applications and features a detailed, prescribed wake. The wake model extends from a hovering-rotor experimental generalization to include the effect of the windmill brake state on the radial and axial displacement rates of the trailing vortex system. Performance calculations are made by coupling the lifting-surface circulation solution to a blade-element analysis that incorporates two-dimensional airfoil characteristics as functions of angle of attack and Reynolds number. Several analytical stall models are also provided to extend the airfoil characteristics beyond the limits of available data. Although this work focuses on the steady-performance problem, the method includes ways to investigate the effects of wind-shear profile, tower shadow, and off-axis shaft alignment. Correlating the method to measured wind-turbine performance, and comparing it to blade-element momentum theory calculations, validate and highlight the extreme sensitivity of predictions to the quality of early post-stall airfoil behavior.

  15. Transient analysis techniques in performing impact and crash dynamic studies

    NASA Technical Reports Server (NTRS)

    Pifko, A. B.; Winter, R.

    1989-01-01

    Because of the emphasis being placed on crashworthiness as a design requirement, increasing demands are being made by various organizations to analyze a wide range of complex structures that must perform safely when subjected to severe impact loads, such as those generated in a crash event. The ultimate goal of crashworthiness design and analysis is to produce vehicles with the ability to reduce the dynamic forces experienced by the occupants to specified levels, while maintaining a survivable envelope around them during a specified crash event. DYCAST is a nonlinear structural dynamic finite element computer code that grew out of the PLANS system of finite element programs for static nonlinear structural analysis. The essential features of DYCAST are outlined.

  16. Removing Grit During Wastewater Treatment: CFD Analysis of HDVS Performance.

    PubMed

    Meroney, Robert N; Sheker, Robert E

    2016-05-01

    Computational Fluid Dynamics (CFD) was used to simulate the grit and sand separation effectiveness of a typical hydrodynamic vortex separator (HDVS) system. The analysis examined the influences on the separator efficiency of: flow rate, fluid viscosities, total suspended solids (TSS), and particle size and distribution. It was found that separator efficiency for a wide range of these independent variables could be consolidated into a few curves based on the particle fall velocity to separator inflow velocity ratio, Ws/Vin. Based on CFD analysis it was also determined that systems of different sizes with length scale ratios ranging from 1 to 10 performed similarly when Ws/Vin and TSS were held constant. The CFD results have also been compared to a limited range of experimental data.
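
    A sketch of the quantities behind that consolidation: the particle fall velocity Ws from Stokes' law and the ratio Ws/Vin. The logistic efficiency curve is a hypothetical stand-in for the CFD-derived curves, included only to show the idea of a collapsed curve; the inflow velocity is assumed.

```python
# Sketch: Stokes-law fall velocity Ws and the Ws/Vin collapse variable.
import math

def stokes_fall_velocity(d_particle, rho_p=2650.0, rho_f=998.0, mu=1.0e-3, g=9.81):
    """Terminal settling velocity [m/s] of a small sphere in the laminar regime."""
    return g * (rho_p - rho_f) * d_particle ** 2 / (18.0 * mu)

def efficiency(ws_over_vin, k=40.0, x50=0.05):
    """Hypothetical collapsed separator-efficiency curve versus Ws/Vin."""
    return 1.0 / (1.0 + math.exp(-k * (ws_over_vin - x50)))

V_IN = 0.5  # separator inflow velocity [m/s] (assumed)
for d_um in (50, 100, 200, 400):
    ws = stokes_fall_velocity(d_um * 1.0e-6)
    ratio = ws / V_IN
    print(f"d = {d_um:3d} um  Ws/Vin = {ratio:.4f}  eta ~ {efficiency(ratio):.2f}")
```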

  17. [An analysis of maicaodi by high performance liquid chromatography].

    PubMed

    Yang, H; Chen, R; Jiang, M

    1997-05-01

    Maicaodi has recently been developed and produced by the pesticide plant of Nanjing Agricultural University. The quantitative analysis of its effective components, tribenuron methyl and R(-)napropamide, in wettable powder of Maicaodi by a high performance liquid chromatographic method was carried out with a Lichrosorb Si-60 20 cm x 0.46 cm i.d. column, a mobile phase of petroleum ether/isopropanol/methanol/acetonitrile/chloroform mixed solvent (80:5:5:5:5), and diisooctyl phthalate as internal standard. The sample was detected by ultraviolet absorption at 254 nm. The retention times of tribenuron methyl and R(-)napropamide were 10-11 min and 6-7 min, respectively. The coefficient of variation of this analysis was 0.34% with a recovery of 99.51%-100.32%. The coefficient of linear correlation was 0.9999. PMID:15739379

  18. A Divergence Statistics Extension to VTK for Performance Analysis.

    SciTech Connect

    Pebay, Philippe Pierre; Bennett, Janine Camille

    2015-02-01

    This report follows the series of previous documents [PT08, BPRT09b, PT09, BPT09, PT10, PB13], where we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order, and auto-correlative statistics engines which we developed within the Visualization Tool Kit (VTK) as a scalable, parallel, and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.
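
    A minimal Python analogue of such a divergence statistic (the report's engine is C++/VTK): the Kullback-Leibler divergence between an observed histogram and a theoretical distribution over shared bins. The data here are synthetic.

```python
# Sketch: Kullback-Leibler divergence between an observed histogram and a
# theoretical "ideal" distribution over the same bins. Synthetic data only.
import numpy as np
from scipy.stats import norm

def kl_divergence(observed, expected, eps=1e-12):
    """D_KL(P || Q) = sum_i p_i * log(p_i / q_i) over histogram bins."""
    p = observed / observed.sum()
    q = expected / expected.sum()
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / (q[mask] + eps))))

rng = np.random.default_rng(0)
samples = rng.normal(loc=0.1, scale=1.1, size=10_000)  # "observed" data
edges = np.linspace(-4.0, 4.0, 41)
obs, _ = np.histogram(samples, bins=edges)
ideal = np.diff(norm.cdf(edges))                       # standard-normal bin masses
print(f"D_KL(observed || ideal) = {kl_divergence(obs.astype(float), ideal):.4f} nats")
```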

  19. Analysis of Random Segment Errors on Coronagraph Performance

    NASA Technical Reports Server (NTRS)

    Stahl, Mark T.; Stahl, H. Philip; Shaklan, Stuart B.; N'Diaye, Mamadou

    2016-01-01

    At the 2015 SPIE Optics & Photonics conference we presented "Preliminary Analysis of Random Segment Errors on Coronagraph Performance". Key findings: contrast leakage for a 4th-order Sinc^2(X) coronagraph is 10X more sensitive to random segment piston than to random tip/tilt; apertures with fewer segments (i.e., 1 ring) or very many segments (> 16 rings) have less contrast leakage as a function of piston or tip/tilt than apertures with 2 to 4 rings of segments. Revised finding: piston is only 2.5X more sensitive than tip/tilt.

  20. Performance Analysis of Visible Light Communication Using CMOS Sensors

    PubMed Central

    Do, Trong-Hop; Yoo, Myungsik

    2016-01-01

    This paper elucidates the fundamentals of visible light communication systems that use the rolling shutter mechanism of CMOS sensors. All related information involving different subjects, such as photometry, camera operation, photography and image processing, are studied in tandem to explain the system. Then, the system performance is analyzed with respect to signal quality and data rate. To this end, a measure of signal quality, the signal to interference plus noise ratio (SINR), is formulated. Finally, a simulation is conducted to verify the analysis. PMID:26938535
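
    A minimal sketch of the SINR figure of merit, with assumed received powers; the paper derives these terms from the rolling-shutter imaging model rather than taking them as given.

```python
# Sketch: signal to interference plus noise ratio, SINR = S / (I + N), in dB.
# The power values are placeholder assumptions in consistent units.
import math

def sinr_db(p_signal, p_interference, p_noise):
    """Return SINR in decibels."""
    return 10.0 * math.log10(p_signal / (p_interference + p_noise))

print(f"SINR = {sinr_db(2.0e-6, 4.0e-7, 1.0e-7):.1f} dB")  # -> 6.0 dB
```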

  1. Aerocapture Performance Analysis of A Venus Exploration Mission

    NASA Technical Reports Server (NTRS)

    Starr, Brett R.; Westhelle, Carlos H.

    2005-01-01

    A performance analysis of a Discovery Class Venus Exploration Mission in which aerocapture is used to capture a spacecraft into a 300 km polar orbit for a two year science mission has been conducted to quantify its performance. A preliminary performance assessment determined that a high heritage 70-degree sphere-cone rigid aeroshell with a 0.25 lift to drag ratio has adequate control authority to provide an entry flight path angle corridor large enough for the mission's aerocapture maneuver. A 114 kilograms per square meter ballistic coefficient reference vehicle was developed from the science requirements and the preliminary assessment's heating indicators and deceleration loads. Performance analyses were conducted for the reference vehicle and for sensitivity studies on vehicle ballistic coefficient and maximum bank rate. The performance analyses used a high fidelity flight simulation within a Monte Carlo executive to define the aerocapture heating environment and deceleration loads and to determine mission success statistics. The simulation utilized the Program to Optimize Simulated Trajectories (POST), which was modified to include Venus specific atmospheric and planet models, aerodynamic characteristics, and interplanetary trajectory models. In addition to Venus specific models, an autonomous guidance system, HYPAS, and a pseudo flight controller were incorporated in the simulation. The Monte Carlo analyses incorporated a reference set of approach trajectory delivery errors, aerodynamic uncertainties, and atmospheric density variations. The reference performance analysis determined that the reference vehicle achieves 100% successful capture and has a 99.87% probability of attaining the science orbit with a 90 meters per second delta V budget for post aerocapture orbital adjustments. A ballistic coefficient trade study conducted with reference uncertainties determined that the 0.25 L/D vehicle can achieve 100% successful capture with a ballistic coefficient of 228 kilograms per square meter.

  2. Performance analysis of jump-gliding locomotion for miniature robotics.

    PubMed

    Vidyasagar, A; Zufferey, Jean-Christophe; Floreano, Dario; Kovač, M

    2015-04-01

    Recent work suggests that jumping locomotion in combination with a gliding phase can be used as an effective mobility principle in robotics. Compared to pure jumping without a gliding phase, the potential benefits of hybrid jump-gliding locomotion include the ability to extend the distance travelled and to reduce the potentially damaging impact forces upon landing. This publication evaluates the performance of jump-gliding locomotion and provides models for the analysis of the relevant dynamics of flight. It also defines a jump-gliding envelope that encompasses the range that can be achieved with jump-gliding robots and that can be used to evaluate the performance and improvement potential of jump-gliding robots. We present first a planar dynamic model and then a simplified closed form model, which allow for quantification of the distance travelled and the impact energy on landing. We validate the predictions of these models with experiments using a novel jump-gliding robot, named the 'EPFL jump-glider'. It has a mass of 16.5 g and is able to perform jumps from elevated positions, perform steered gliding flight, land safely and traverse on the ground by repetitive jumping. The experiments indicate that the developed jump-gliding model fits very well with the measured flight data using the EPFL jump-glider, confirming the benefits of jump-gliding locomotion for mobile robotics. The jump-glide envelope considerations indicate that the EPFL jump-glider, when traversing from a 2 m height, reaches 74.3% of the optimal jump-gliding distance, compared to pure jumping without a gliding phase, which only reaches 33.4% of the optimal jump-gliding distance. Methods of further improving flight performance based on the models and on inspiration from biological systems are presented, providing mechanical design pathways for future jump-gliding robot designs. PMID:25811417

  3. Performance analysis of jump-gliding locomotion for miniature robotics.

    PubMed

    Vidyasagar, A; Zufferey, Jean-Christophe; Floreano, Dario; Kovač, M

    2015-03-26

    Recent work suggests that jumping locomotion in combination with a gliding phase can be used as an effective mobility principle in robotics. Compared to pure jumping without a gliding phase, the potential benefits of hybrid jump-gliding locomotion include the ability to extend the distance travelled and to reduce the potentially damaging impact forces upon landing. This publication evaluates the performance of jump-gliding locomotion and provides models for the analysis of the relevant dynamics of flight. It also defines a jump-gliding envelope that encompasses the range that can be achieved with jump-gliding robots and that can be used to evaluate the performance and improvement potential of jump-gliding robots. We present first a planar dynamic model and then a simplified closed form model, which allow for quantification of the distance travelled and the impact energy on landing. We validate the predictions of these models with experiments using a novel jump-gliding robot, named the 'EPFL jump-glider'. It has a mass of 16.5 g and is able to perform jumps from elevated positions, perform steered gliding flight, land safely and traverse on the ground by repetitive jumping. The experiments indicate that the developed jump-gliding model fits very well with the measured flight data using the EPFL jump-glider, confirming the benefits of jump-gliding locomotion for mobile robotics. The jump-glide envelope considerations indicate that the EPFL jump-glider, when traversing from a 2 m height, reaches 74.3% of the optimal jump-gliding distance, compared to pure jumping without a gliding phase, which only reaches 33.4% of the optimal jump-gliding distance. Methods of further improving flight performance based on the models and on inspiration from biological systems are presented, providing mechanical design pathways for future jump-gliding robot designs.
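
    The flavor of the simplified closed-form comparison can be sketched as follows; the constant-L/D glide-from-apex model and the launch parameters are toy assumptions, not the authors' model.

```python
# Sketch: closed-form range comparison between a pure (drag-free) jump and a
# jump followed by a constant-L/D glide from the trajectory apex.
import math

G = 9.81  # gravitational acceleration [m/s^2]

def ballistic_range(v0, theta_deg, h):
    """Projectile range [m] from launch height h, neglecting drag."""
    th = math.radians(theta_deg)
    vx, vy = v0 * math.cos(th), v0 * math.sin(th)
    t_flight = (vy + math.sqrt(vy ** 2 + 2.0 * G * h)) / G
    return vx * t_flight

def jump_glide_range(v0, theta_deg, h, lift_to_drag):
    """Ballistic ascent to apex, then a straight glide at constant L/D."""
    th = math.radians(theta_deg)
    vx, vy = v0 * math.cos(th), v0 * math.sin(th)
    apex_height = h + vy ** 2 / (2.0 * G)
    return vx * (vy / G) + lift_to_drag * apex_height

v0, theta, height = 6.0, 60.0, 2.0  # assumed take-off speed, angle, start height
print(f"pure jump : {ballistic_range(v0, theta, height):.2f} m")
print(f"jump-glide: {jump_glide_range(v0, theta, height, 2.0):.2f} m")
```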

  4. A conceptual design tool for RBCC engine performance analysis

    SciTech Connect

    Olds, J.R.; Saks, G.

    1997-01-01

    Future reusable launch vehicles will depend on new propulsion technologies to lower system operational costs while maintaining adequate performance. Recently, a number of vehicle systems utilizing rocket-based combined-cycle (RBCC) propulsion have been proposed as possible low-cost space launch solutions. Vehicles using RBCC propulsion have the potential to combine the best aspects of airbreathing propulsion (high average Isp) with the best aspects of rocket propulsion (high propellant bulk density and engine T/W). Proper conceptual assessment of each proposed vehicle will require computer-based tools that allow for quick and cheap, yet sufficiently accurate disciplinary analyses. At Georgia Tech, a spreadsheet-based tool has been developed that uses quasi-1D flow analysis with component efficiencies to parametrically model RBCC engine performance in ejector, fan-ramjet, ramjet and pure rocket modes. The technique is similar to an earlier RBCC modeling technique developed by the Marquardt Corporation in the mid-1960s. For a given sea-level static thrust requirement, the current tool generates engine weight and size data, as well as Isp and thrust data vs. altitude and Mach number. The latter is output in tabular form for use in a trajectory optimization program. This paper reviews the current state of the RBCC analysis tool and the effort to upgrade it from a Microsoft Excel spreadsheet to a design-oriented UNIX program in C suitable for integration into a multidisciplinary design optimization (MDO) framework. © 1997 American Institute of Physics.

  5. A Preliminary Analysis of LANDSAT-4 Thematic Mapper Radiometric Performance

    NASA Technical Reports Server (NTRS)

    Justice, C.; Fusco, L.; Mehl, W.

    1984-01-01

    Analysis was performed to characterize the radiometry of three Thematic Mapper (TM) digital products of a scene of Arkansas. The three digital products examined were the NASA raw (BT) product, the radiometrically corrected (AT) product, and the radiometrically and geometrically corrected (PT) product. The frequency distribution of the digital data, the statistical correlation between the bands, and the variability between the detectors within a band were examined on a series of image subsets from the full scene. The results are presented from one 1024 x 1024 pixel subset of Reelfoot Lake, Tennessee, which displayed a representative range of ground conditions and cover types occurring within the full frame image. Bands 1, 2 and 5 of the sample area are presented. The subsets were extracted from the three digital data products to cover the same geographic area. This analysis provides the first step towards a full appraisal of the TM radiometry being performed as part of the ESA/CEC contribution to the NASA/LIDQA program.

  6. Autotasked Performance in the NAS Workload: A Statistical Analysis

    NASA Technical Reports Server (NTRS)

    Carter, R. L.; Stockdale, I. E.; Kutler, Paul (Technical Monitor)

    1998-01-01

    A statistical analysis of the workload performance of a production quality FORTRAN code for five different Cray Y-MP hardware and system software configurations is performed. The analysis was based on an experimental procedure that was designed to minimize correlations between the number of requested CPUs and the time of day the runs were initiated. Observed autotasking overheads were significantly larger for the set of jobs that requested the maximum number of CPUs. Speedups for UNICOS 6 releases show consistent wall clock speedups in the workload of around 2, which is quite good. The observed speedups were very similar for the set of jobs that requested 8 CPUs and the set that requested 4 CPUs. The original NAS algorithm for determining charges to the user discourages autotasking in the workload. A new charging algorithm to be applied to jobs run in the NQS multitasking queues also discourages NAS users from using autotasking. The new algorithm favors jobs requesting 8 CPUs over those that request less, although the jobs requesting 8 CPUs experienced significantly higher overhead and presumably degraded system throughput. A charging algorithm is presented that has the following desirable characteristics when applied to the data: higher overhead jobs requesting 8 CPUs are penalized when compared to moderate overhead jobs requesting 4 CPUs, thereby providing a charging incentive to NAS users to use autotasking in a manner that provides them with significantly improved turnaround while also maintaining system throughput.

  7. Correlation analysis between ionospheric scintillation levels and receiver tracking performance

    NASA Astrophysics Data System (ADS)

    Sreeja, V.; Aquino, M.; Elmas, Z. G.; Forte, B.

    2012-06-01

    Rapid fluctuations in the amplitude and phase of a transionospheric radio signal caused by small scale plasma density irregularities in the ionosphere are known as scintillation. Scintillation can seriously impair a GNSS (Global Navigation Satellite Systems) receiver tracking performance, thus affecting the required levels of availability, accuracy and integrity, and consequently the reliability of modern day GNSS based applications. This paper presents an analysis of correlation between scintillation levels and tracking performance of a GNSS receiver for GPS L1C/A, L2C and GLONASS L1, L2 signals. The analyses make use of data recorded over Presidente Prudente (22.1°S, 51.4°W, dip latitude ∼12.3°S) in Brazil, a location close to the Equatorial Ionisation Anomaly (EIA) crest in Latin America. The study presents for the first time this type of correlation analysis for GPS L2C and GLONASS L1, L2 signals. The scintillation levels are defined by the amplitude scintillation index, S4, and the receiver tracking performance is evaluated by the phase tracking jitter. Both S4 and the phase tracking jitter are estimated from the post correlation In-Phase (I) and Quadra-Phase (Q) components logged by the receiver at a high rate. Results reveal that the dependence of the phase tracking jitter on the scintillation levels can be represented by a quadratic fit for the signals. The results presented in this paper are of importance to GNSS users, especially in view of the forthcoming high phase of solar cycle 24 (predicted for 2013).
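
    The reported quadratic dependence can be recovered with an ordinary least-squares fit; the sketch below uses synthetic stand-ins for the Presidente Prudente measurements, and the "true" coefficients are invented.

```python
# Sketch: recover a quadratic jitter(S4) relation by least squares.
import numpy as np

rng = np.random.default_rng(1)
s4 = rng.uniform(0.1, 0.9, 200)             # amplitude scintillation index
jitter = 2.0 + 1.5 * s4 + 6.0 * s4 ** 2     # assumed true relation [deg]
jitter += rng.normal(0.0, 0.3, s4.size)     # measurement noise

a, b, c = np.polyfit(s4, jitter, deg=2)     # quadratic fit
print(f"jitter(S4) ~ {a:.2f}*S4^2 + {b:.2f}*S4 + {c:.2f}")
```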

  8. Instantaneous BeiDou-GPS attitude determination: A performance analysis

    NASA Astrophysics Data System (ADS)

    Nadarajah, Nandakumaran; Teunissen, Peter J. G.; Raziq, Noor

    2014-09-01

    The advent of modernized and new global navigation satellite systems (GNSS) has enhanced the availability of satellite based positioning, navigation, and timing (PNT) solutions. Specifically, it increases redundancy and yields operational back-up or independence in case of failure or unavailability of one system. Among existing GNSS, the Chinese BeiDou system (BDS) is being developed and will consist of geostationary (GEO) satellites, inclined geosynchronous orbit (IGSO) satellites, and medium-Earth-orbit (MEO) satellites. In this contribution, a BeiDou-GPS robustness analysis is carried out for instantaneous, unaided attitude determination. Precise attitude determination using multiple GNSS antennas mounted on a platform relies on the successful resolution of the integer carrier phase ambiguities. The constrained Least-squares AMBiguity Decorrelation Adjustment (C-LAMBDA) method has been developed for the quadratically constrained GNSS compass model that incorporates the known baseline length. In this contribution the method is used to analyse the attitude determination performance when using the GPS and BeiDou systems. The attitude determination performance is evaluated using GPS/BeiDou data sets from a real data campaign in Australia spanning several days. The study includes the performance analyses of both stand-alone and mixed constellation (GPS/BeiDou) attitude estimation under various satellite deprived environments. We demonstrate and quantify the improved availability and accuracy of attitude determination using the combined constellation.

  9. Aerocapture Performance Analysis for a Neptune-Triton Exploration Mission

    NASA Technical Reports Server (NTRS)

    Starr, Brett R.; Westhelle, Carlos H.; Masciarelli, James P.

    2004-01-01

    A systems analysis has been conducted for a Neptune-Triton Exploration Mission in which aerocapture is used to capture a spacecraft at Neptune. Aerocapture uses aerodynamic drag instead of propulsion to decelerate from the interplanetary approach trajectory to a captured orbit during a single pass through the atmosphere. After capture, propulsion is used to move the spacecraft from the initial captured orbit to the desired science orbit. A preliminary assessment identified that a spacecraft with a lift to drag ratio of 0.8 was required for aerocapture. Performance analyses of the 0.8 L/D vehicle were performed using a high fidelity flight simulation within a Monte Carlo executive to determine mission success statistics. The simulation was the Program to Optimize Simulated Trajectories (POST) modified to include Neptune specific atmospheric and planet models, spacecraft aerodynamic characteristics, and interplanetary trajectory models. To these were added autonomous guidance and pseudo flight controller models. The Monte Carlo analyses incorporated approach trajectory delivery errors, aerodynamic characteristics uncertainties, and atmospheric density variations. Monte Carlo analyses were performed for a reference set of uncertainties and for sets of uncertainties modified to produce increased and reduced atmospheric variability. For the reference uncertainties, the 0.8 L/D flatbottom ellipsled vehicle achieves 100% successful capture and has a 99.87% probability of attaining the science orbit with a 360 m/s delta V budget for apoapsis and periapsis adjustment. Monte Carlo analyses were also performed for a guidance system that modulates both bank angle and angle of attack with the reference set of uncertainties. An alpha and bank modulation guidance system reduces the 99.87 percentile delta V by 173 m/s (48%), to 187 m/s, for the reference set of uncertainties.
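
    The overall shape of such a Monte Carlo success analysis can be sketched with a trivial corridor test standing in for the POST 6-DOF simulation; the corridor bounds, dispersions, and density coupling below are invented for illustration.

```python
# Sketch: Monte Carlo capture-success statistics with a toy corridor test in
# place of a full flight simulation. All numbers are assumptions.
import random

EFPA_TARGET = -12.0        # targeted entry flight path angle [deg] (assumed)
CORRIDOR = (-12.8, -11.2)  # capturable EFPA corridor [deg] (assumed)

def one_entry():
    """Sample delivery error and atmospheric variability; return success flag."""
    efpa = random.gauss(EFPA_TARGET, 0.25)  # approach delivery error
    shift = 0.3 * random.gauss(0.0, 0.10)   # corridor shift from density variation
    return CORRIDOR[0] + shift <= efpa <= CORRIDOR[1] + shift

n_runs = 100_000
successes = sum(one_entry() for _ in range(n_runs))
print(f"capture success rate: {successes / n_runs:.2%}")
```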

  10. External quality assessment in water microbiology: statistical analysis of performance.

    PubMed

    Tillett, H E; Lightfoot, N F; Eaton, S

    1993-04-01

    A UK-based scheme of water microbiology assessment requires participants to record counts of relevant organisms. Not every sample will contain the target number of organisms because of natural variation and therefore a range of results is acceptable. Results which are tail-end (i.e. at the extreme low or high end of this range) could occasionally be reported by any individual laboratory by chance. Several tail-end results might imply a laboratory problem. Statistical assessment is done in two stages. A non-parametric test of the distribution of tail-end counts amongst laboratories is performed (Cochran's Q) and, if they are not random, then observed and expected frequencies of tail-end counts are compared to identify participants who may have reported excessive numbers of low or high results. Analyses so far have shown that laboratories find high counts no more frequently than would be expected by chance, but that significant clusters of low counts can be detected among participants. These findings have been observed both in short-term and in long-term assessments, thus allowing detection of new episodes of poor performance and intermittent problems. The analysis relies on an objective definition of tail-end results. Working definitions are presented which should identify poor performance in terms of microbiological significance, and which allow fair comparison between membrane-filtration and multiple-tube techniques. Smaller differences between laboratories, which may be statistically significant, will not be detected. Different definitions of poor performance could be incorporated into future assessments.
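
    Cochran's Q, the non-parametric test mentioned above, can be computed directly from a 0/1 matrix of tail-end results. A minimal sketch with invented data follows, assuming rows are distributed QA samples and columns are laboratories:

    ```python
    import numpy as np
    from scipy.stats import chi2

    def cochrans_q(x):
        """Cochran's Q for a samples-by-laboratories 0/1 matrix, where 1
        marks a tail-end count. Tests whether tail-end results are spread
        randomly across laboratories."""
        x = np.asarray(x)
        b, k = x.shape            # b distributed samples, k laboratories
        col = x.sum(axis=0)       # tail-end results per laboratory
        row = x.sum(axis=1)       # tail-end results per sample
        n = x.sum()
        q = k * (k - 1) * ((col - n / k) ** 2).sum() / (k * n - (row ** 2).sum())
        return q, chi2.sf(q, k - 1)  # Q is chi-square with k-1 df

    # hypothetical: 6 QA distributions scored for 4 laboratories
    x = np.array([[0, 0, 1, 0],
                  [0, 1, 1, 0],
                  [0, 0, 1, 0],
                  [0, 0, 1, 1],
                  [0, 0, 0, 0],
                  [0, 1, 1, 0]])
    q, p = cochrans_q(x)
    print(f"Q = {q:.2f}, p = {p:.3f}")
    ```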

  11. Analysis for Improving Performance. Tools for Diagnosing Organizations & Documenting Workplace Expertise. Berrett-Koehler Organizational Performance Series.

    ERIC Educational Resources Information Center

    Swanson, Richard A.

    This book details the work required at the outset of all efforts designed to improve an organization's performance at the organization, process, and/or individual levels. The book is divided into four parts that are devoted to the following aspects of conducting the systematic diagnostic analysis required to improve performance: (1) analysis as…

  12. Advanced multiphysics coupling for LWR fuel performance analysis

    SciTech Connect

    Hales, J. D.; Tonks, M. R.; Gleicher, F. N.; Spencer, B. W.; Novascone, S. R.; Williamson, R. L.; Pastore, G.; Perez, D. M.

    2015-10-01

    Even the most basic nuclear fuel analysis is a multiphysics undertaking, as a credible simulation must consider at a minimum coupled heat conduction and mechanical deformation. The need for more realistic fuel modeling under a variety of conditions invariably leads to a desire to include coupling between a more complete set of the physical phenomena influencing fuel behavior, including neutronics, thermal hydraulics, and mechanisms occurring at lower length scales. This paper covers current efforts toward coupled multiphysics LWR fuel modeling in three main areas. The first area covered in this paper concerns thermomechanical coupling. The interaction of these two physics, particularly related to the feedback effect associated with heat transfer and mechanical contact at the fuel/clad gap, provides numerous computational challenges. An outline is provided of an effective approach used to manage the nonlinearities associated with an evolving gap in BISON, a nuclear fuel performance application. A second type of multiphysics coupling described here is that of coupling neutronics with thermomechanical LWR fuel performance. DeCART, a high-fidelity core analysis program based on the method of characteristics, has been coupled to BISON. DeCART provides sub-pin level resolution of the multigroup neutron flux, with resonance treatment, during a depletion or a fast transient simulation. Two-way coupling between these codes was achieved by mapping fission rate density and fast neutron flux fields from DeCART to BISON and the temperature field from BISON to DeCART while employing a Picard iterative algorithm. Finally, the need for multiscale coupling is considered. Fission gas production and evolution significantly impact fuel performance by causing swelling, a reduction in the thermal conductivity, and fission gas release. The mechanisms involved occur at the atomistic and grain scale and are therefore not the domain of a fuel performance code. However, it is possible to use
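
    The Picard iteration used to couple the two codes is a plain fixed-point loop over the exchanged fields. The sketch below mimics that structure with stand-in algebraic models in place of DeCART and BISON; the feedback coefficients are invented.

    ```python
    import numpy as np

    def neutronics(temperature):          # stand-in for DeCART
        # Doppler-like feedback: power falls as temperature rises
        return 1.0e8 / (1.0 + 1.5e-4 * (temperature - 600.0))

    def thermomechanics(power):           # stand-in for BISON
        # fuel temperature rises with the local power density
        return 600.0 + 4.0e-6 * power

    temp = np.full(10, 600.0)             # initial axial temperature field (K)
    for it in range(100):
        power = neutronics(temp)          # map T -> fission-rate density
        new_temp = thermomechanics(power) # map power -> T
        if np.max(np.abs(new_temp - temp)) < 1e-6:
            break
        temp = new_temp
    print(f"converged in {it} iterations, T = {temp[0]:.2f} K")
    ```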

  14. Performance analysis of a thermosyphon solar water heating system. Part 2: Nighttime performance

    SciTech Connect

    Ghoneim, Z.A.; Nassr, I.S.

    1995-11-01

    The paper presents an experimental and theoretical analysis of the performance of a thermosyphon solar water heating system under actual operating conditions during nighttime. The system consisted of a 1.8 m by 0.8 m flat-plate collector and a 100 liter tank. Detailed temperature-time curves for the system over a 24 hour period were recorded using a stand-alone microprocessor-based measurement and control system. Nighttime mass flow rates were calculated using several methods, based on the recorded temperature-time curves, the governing conservation equations, and the system thermal losses. Deduced nighttime mass flow rates are on the order of 0.005 kg/sec. Reverse flow was found to increase the system's thermal losses nearly fourfold.

  15. 1-D Numerical Analysis of ABCC Engine Performance

    NASA Technical Reports Server (NTRS)

    Holden, Richard

    1999-01-01

    An ABCC engine combines air-breathing and rocket engines into a single engine to increase the specific impulse over an entire flight trajectory. Except for the heat source, the basic operation of the ABCC is similar to that of the RBCC engine. The ABCC is intended to have a higher specific impulse than the RBCC for a single-stage Earth-to-orbit vehicle. Computational fluid dynamics (CFD) is a useful tool for the analysis of complex transport processes in various components of an ABCC propulsion system. The objective of the present research was to develop a transient 1-D numerical model, using the conservation of mass, linear momentum, and energy equations, that could be used to predict flow behavior throughout a generic ABCC engine following a flight path. At specific points during the development of the 1-D numerical model, numerous tests were performed to verify that the program produced consistent, realistic numbers that follow compressible flow theory for various inlet conditions.
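
    For flavor, the sketch below advances the one-dimensional conservation laws for mass, momentum, and energy with a simple Lax-Friedrichs scheme on a shock-tube-like initial state. It is a generic illustration of a transient 1-D model, not the model described in the report.

    ```python
    import numpy as np

    gamma = 1.4
    nx, L, t_end = 200, 1.0, 0.2
    dx = L / nx
    x = (np.arange(nx) + 0.5) * dx

    # Sod-type initial state; conserved variables per cell: [rho, rho*u, E]
    rho = np.where(x < 0.5, 1.0, 0.125)
    u = np.zeros(nx)
    p = np.where(x < 0.5, 1.0, 0.1)
    U = np.stack([rho, rho * u, p / (gamma - 1) + 0.5 * rho * u**2])

    def flux(U):
        rho, mom, E = U
        u = mom / rho
        p = (gamma - 1) * (E - 0.5 * rho * u**2)
        return np.stack([mom, mom * u + p, (E + p) * u])

    t = 0.0
    while t < t_end:
        rho, mom, E = U
        u = mom / rho
        p = (gamma - 1) * (E - 0.5 * rho * u**2)
        c = np.sqrt(gamma * p / rho)              # sound speed
        dt = 0.45 * dx / np.max(np.abs(u) + c)    # CFL-limited time step
        F = flux(U)
        # Lax-Friedrichs update on interior cells; end cells held fixed
        U[:, 1:-1] = 0.5 * (U[:, :-2] + U[:, 2:]) \
            - dt / (2 * dx) * (F[:, 2:] - F[:, :-2])
        t += dt

    print(f"density range: {U[0].min():.3f} .. {U[0].max():.3f}")
    ```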

  16. Automotive Gas Turbine Power System-Performance Analysis Code

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert J.

    1997-01-01

    An open-cycle gas turbine numerical modelling code suitable for thermodynamic performance analysis (i.e., thermal efficiency, specific fuel consumption, cycle state points, working fluid flowrates, etc.) of automotive and aircraft powerplant applications has been generated at the NASA Lewis Research Center's Power Technology Division. The code can be made available to automotive gas turbine preliminary design efforts, either in its present version or, assuming that resources can be obtained to incorporate empirical models for component weight and packaging volume, in a later version that includes a weight-volume estimator feature. The paper contains a brief discussion of the capabilities of the presently operational version of the code, including a listing of input and output parameters and actual sample output listings.
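
    The core of such a thermodynamic performance analysis is a cycle state-point calculation. A minimal sketch for an ideal-gas open Brayton cycle follows; the pressure ratio, turbine inlet temperature, component efficiencies, and fuel heating value are illustrative assumptions, not values from the NASA code.

    ```python
    # Open Brayton cycle state points, thermal efficiency, and SFC
    cp, gamma = 1005.0, 1.4          # air, J/(kg K)
    T1, PR = 288.0, 8.0              # inlet temperature (K), pressure ratio
    T3 = 1300.0                      # turbine inlet temperature (K)
    eta_c, eta_t = 0.82, 0.88        # compressor/turbine isentropic efficiencies
    LHV = 43.0e6                     # fuel lower heating value (J/kg)

    tau = PR ** ((gamma - 1.0) / gamma)
    T2 = T1 * (1.0 + (tau - 1.0) / eta_c)        # after compressor
    T4 = T3 * (1.0 - eta_t * (1.0 - 1.0 / tau))  # after turbine

    w_net = cp * ((T3 - T4) - (T2 - T1))   # net specific work, J/kg air
    q_in = cp * (T3 - T2)                  # heat added, J/kg air
    eta_th = w_net / q_in
    sfc = (q_in / LHV) / w_net * 3.6e6     # fuel per net work, kg/kWh

    print(f"T2 = {T2:.0f} K, T4 = {T4:.0f} K")
    print(f"thermal efficiency = {eta_th:.3f}")
    print(f"SFC = {sfc * 1000:.0f} g/kWh")
    ```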

  17. Portable Life Support Subsystem Thermal Hydraulic Performance Analysis

    NASA Technical Reports Server (NTRS)

    Barnes, Bruce; Pinckney, John; Conger, Bruce

    2010-01-01

    This paper presents the current state of the thermal hydraulic modeling efforts being conducted for the Constellation Space Suit Element (CSSE) Portable Life Support Subsystem (PLSS). The goal of these efforts is to provide realistic simulations of the PLSS under various modes of operation. The PLSS thermal hydraulic model simulates the thermal, pressure, and flow characteristics of the PLSS, as well as the related human thermal comfort. This paper presents modeling approaches and assumptions as well as component model descriptions. Results from the models are presented that show PLSS operations at steady-state and transient conditions. Finally, conclusions and recommendations are offered that summarize results, identify PLSS design weaknesses uncovered during review of the analysis results, and propose areas for improvement to increase model fidelity and accuracy.

  18. Hydrodynamic body shape analysis and their impact on swimming performance.

    PubMed

    Li, Tian-Zeng; Zhan, Jie-Min

    2015-01-01

    This study presents the hydrodynamic characteristics of different adult male swimmer's body shape using computational fluid dynamics method. This simulation strategy is carried out by CFD fluent code with solving the 3D incompressible Navier-Stokes equations using the RNG k-ε turbulence closure. The water free surface is captured by the volume of fluid (VOF) method. A set of full body models, which is based on the anthropometrical characteristics of the most common male swimmers, is created by Computer Aided Industrial Design (CAID) software, Rhinoceros. The analysis of CFD results revealed that swimmer's body shape has a noticeable effect on the hydrodynamics performances. This explains why male swimmer with an inverted triangle body shape has good hydrodynamic characteristics for competitive swimming. PMID:26898107

  19. Performance Analysis: ITS Data through September 30, 2009

    SciTech Connect

    Kerr, C E

    2009-12-07

    Data from ITS were analyzed to understand the issues at LLNL and to identify issues that may require additional management attention, as well as those that meet the threshold for reporting to the DOE Noncompliance Tracking System (NTS). In this report we discuss assessments and issues entered in ITS and compare the number and type presently entered in ITS to previous time periods. Issues reported in ITS were evaluated and discussed. The analysis identified two noncompliances that meet the threshold for reporting to the DOE NTS. All of the data in ITS are analyzed; however, the primary focus of this report is to meet requirements for performance analysis of specific functional areas. The DOE Office of Enforcement expects LLNL to 'implement comprehensive management and independent assessments that are effective in identifying deficiencies and broader problems in safety and security programs, as well as opportunities for continuous improvement within the organization' and to 'regularly perform assessments to evaluate implementation of the contractor's processes for screening and internal reporting.' LLNL has a self-assessment program, described in the document applicable during this time period, ES&H Manual Document 4.1, that includes line, management and independent assessments. LLNL also has in place a process to identify and report deficiencies of nuclear, worker safety and health and security requirements. In addition, the DOE Office of Enforcement expects that 'issues management databases are used to identify adverse trends, dominant problem areas, and potential repetitive events or conditions' (page 15, DOE Enforcement Process Overview, June 2009). LLNL requires that all worker safety and health and nuclear safety noncompliances be tracked as 'deficiencies' in the LLNL Issues Tracking System (ITS). Data from the ITS are analyzed for worker safety and health (WSH) and nuclear safety noncompliances that may meet the threshold for reporting to the DOE Noncompliance

  20. Performance analysis for stable mobile robot navigation solutions

    NASA Astrophysics Data System (ADS)

    Scrapper, Chris, Jr.; Madhavan, Raj; Balakirsky, Stephen

    2008-04-01

    Robot navigation in complex, dynamic and unstructured environments demands robust mapping and localization solutions. One of the most popular methods in recent years has been the use of scan-matching schemes where temporally correlated sensor data sets are registered for obtaining a Simultaneous Localization and Mapping (SLAM) navigation solution. The primary bottleneck of such scan-matching schemes is correspondence determination, i.e. associating a feature (structure) in one dataset to its counterpart in the other. Outliers, occlusions, and sensor noise complicate the determination of reliable correspondences. This paper describes testing scenarios being developed at NIST to analyze the performance of scan-matching algorithms. This analysis is critical for the development of practical SLAM algorithms in various application domains where sensor payload, wheel slippage, and power constraints impose severe restrictions. We will present results using a high-fidelity simulation testbed, the Unified System for Automation and Robot Simulation (USARSim).
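
    The correspondence-determination step the paper identifies as the bottleneck can be illustrated with a single scan-matching iteration: match each point to its nearest neighbour, then solve for the best-fit rigid transform. A sketch with synthetic scans (not the NIST test scenarios):

    ```python
    import numpy as np

    def align_scans(src, dst):
        """One scan-matching step: nearest-neighbour correspondences, then
        the best-fit rigid transform (Kabsch/SVD) mapping src onto its
        matched points in dst. src, dst: (N, 2) arrays of scan points."""
        # correspondence determination: match each source point to its
        # nearest neighbour in the destination scan (the step that outliers,
        # occlusions, and sensor noise make hard in practice)
        d2 = ((src[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matched = dst[d2.argmin(axis=1)]

        # best-fit rotation and translation for the matched pairs
        sc, mc = src.mean(0), matched.mean(0)
        H = (src - sc).T @ (matched - mc)
        Uv, _, Vt = np.linalg.svd(H)
        R = Vt.T @ Uv.T
        if np.linalg.det(R) < 0:          # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ Uv.T
        t = mc - R @ sc
        return R, t

    # hypothetical scans: dst is src rotated by 5 degrees and shifted
    rng = np.random.default_rng(0)
    src = rng.uniform(-5, 5, (100, 2))
    a = np.radians(5.0)
    Rt = np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])
    dst = src @ Rt.T + np.array([0.3, -0.1])
    R, t = align_scans(src, dst)
    print(np.degrees(np.arctan2(R[1, 0], R[0, 0])), t)
    ```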

  1. Performance Analysis of an Actor-Based Distributed Simulation

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1998-01-01

    Object-oriented design of simulation programs appears to be very attractive because of the natural association of components in the simulated system with objects. There is great potential in distributing the simulation across several computers for the purpose of parallel computation and its consequent handling of larger problems in less elapsed time. One approach to such a design is to use "actors", that is, active objects with their own thread of control. Because these objects execute concurrently, communication is via messages. This is in contrast to an object-oriented design using passive objects where communication between objects is via method calls (direct calls when they are in the same address space and remote procedure calls when they are in different address spaces or different machines). This paper describes a performance analysis program for the evaluation of a design for distributed simulations based upon actors.

  2. Performance analysis of spread spectrum modulation in data hiding

    NASA Astrophysics Data System (ADS)

    Gang, Litao; Akansu, Ali N.; Ramkumar, Mahalingam

    2001-12-01

    Watermarking or steganography technology provides a possible solution for digital multimedia copyright protection and pirate tracking. Most current data hiding schemes are based on spread spectrum modulation: a small-value watermark signal is embedded into the content signal in some watermark domain, and the information bits can be extracted via correlation. The schemes are applied in both escrow and oblivious cases. This paper reveals, through analysis and simulation, that in oblivious applications where the original signal is not available, the commonly used correlation detection is not optimal. Its maximum likelihood detection is analyzed and a feasible suboptimal detector is derived. Its performance is explored and compared with the correlation detector. Subsequently a linear embedding scheme is proposed and studied. Experiments with image data hiding demonstrate its effectiveness in applications.
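
    The embedding and correlation-detection scheme the paper analyzes can be sketched in a few lines: add a pseudo-random spreading sequence scaled by the information bit, then correlate against the same sequence without access to the original content. All values are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    N = 4096                        # host samples (e.g. transform coefficients)
    host = rng.normal(0, 10, N)     # stand-in content signal
    p = rng.choice([-1.0, 1.0], N)  # secret spreading sequence
    alpha = 0.5                     # embedding strength
    bit = 1                         # information bit b in {-1, +1}

    marked = host + alpha * bit * p      # spread-spectrum embedding

    # blind (oblivious) correlation detection: the host acts as noise
    stat = marked @ p / N                # ~ alpha*bit + host interference
    print("decoded bit:", 1 if stat > 0 else -1, f"(statistic {stat:+.3f})")
    ```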

  3. Performance Analysis of Paraboloidal Reflector Antennas in Radio Telescopes

    NASA Astrophysics Data System (ADS)

    Yeap, Kim Ho; Law, Young Hui; Rizman, Zairi Ismael; Cheong, Yuen Kiat; Ong, Chu En; Chong, Kok Hen

    2013-10-01

    In this paper, we present an analysis of the performance of the three most commonly used paraboloidal reflector antennas in radio telescopes, i.e. the prime focus, Cassegrain, and Gregorian antennas. In our study, we have adopted the design parameters for the Cassegrain configuration used in the Atacama Large Millimeter Array (ALMA) project. The parameters are subsequently re-calculated so as to meet the design requirements of the Gregorian and prime focus configurations. The simulation results obtained from GRASP reveal that the prime focus configuration produces the lowest side lobes and the highest main lobe level. Such a configuration, however, has the disadvantage of being highly susceptible to thermal ground noise radiation. The radiation characteristics produced by the Cassegrain and Gregorian configurations are very close to each other. Indeed, the results show that there is no significant advantage between the two designs. Hence, we can conclude that both configurations are comparable in the application of radio telescopes.

  5. Analysis of Different Blade Architectures on small VAWT Performance

    NASA Astrophysics Data System (ADS)

    Battisti, L.; Brighenti, A.; Benini, E.; Raciti Castelli, M.

    2016-09-01

    The present paper aims at describing and comparing different small Vertical Axis Wind Turbine (VAWT) architectures in terms of performance and loads. These characteristics can be highlighted by resorting to the Blade Element-Momentum (BE-M) model, commonly adopted for rotor pre-design and controller assessment. After validating the model with experimental data, the paper focuses on the analysis of VAWT loads depending on some relevant rotor features: blade number (2 and 3), airfoil camber line (comparing symmetrical and asymmetrical profiles), and blade inclination (straight versus helical blades). The effect of these characteristics on both power and thrust (in the streamwise and crosswise directions), as a function of both the blades' azimuthal position and their Tip Speed Ratio (TSR), is presented and discussed at length.

  6. Visualization and Analysis of Climate Simulation Performance Data

    NASA Astrophysics Data System (ADS)

    Röber, Niklas; Adamidis, Panagiotis; Behrens, Jörg

    2015-04-01

    Visualization is the key process of transforming abstract (scientific) data into a graphical representation, to aid in the understanding of the information hidden within the data. Climate simulation data sets are typically quite large, time varying, and consist of many different variables sampled on an underlying grid. A large variety of climate models - and sub-models - exist to simulate various aspects of the climate system. Generally, one is mainly interested in the physical variables produced by the simulation runs, but model developers are also interested in performance data measured along with these simulations. Climate simulation models are carefully developed complex software systems, designed to run in parallel on large HPC systems. An important goal is to utilize the entire hardware as efficiently as possible, that is, to distribute the workload as evenly as possible among the individual components. This is a very challenging task, and detailed performance data, such as timings, cache misses, etc., have to be used to locate and understand performance problems in order to optimize the model implementation. Furthermore, the correlation of performance data to the processes of the application and the sub-domains of the decomposed underlying grid is vital when addressing communication and load imbalance issues. High resolution climate simulations are carried out on tens to hundreds of thousands of cores, thus yielding a vast amount of profiling data, which cannot be analyzed without appropriate visualization techniques. This PICO presentation displays and discusses the ICON simulation model, which is jointly developed by the Max Planck Institute for Meteorology and the German Weather Service, in partnership with DKRZ. The visualization and analysis of the model's performance data allow us to optimize and fine-tune the model, as well as to understand its execution on the HPC system. We show and discuss our workflow, as well as present new ideas and

  7. Performance Analysis of a NASA Integrated Network Array

    NASA Technical Reports Server (NTRS)

    Nessel, James A.

    2012-01-01

    The Space Communications and Navigation (SCaN) Program is planning to integrate its individual networks into a unified network which will function as a single entity to provide services to user missions. This integrated network architecture is expected to provide SCaN customers with the capabilities to seamlessly use any of the available SCaN assets to support their missions to efficiently meet the collective needs of Agency missions. One potential optimal application of these assets, based on this envisioned architecture, is that of arraying across existing networks to significantly enhance data rates and/or link availabilities. As such, this document provides an analysis of the transmit and receive performance of a proposed SCaN inter-network antenna array. From the study, it is determined that a fully integrated internetwork array does not provide any significant advantage over an intra-network array, one in which the assets of an individual network are arrayed for enhanced performance. Therefore, it is the recommendation of this study that NASA proceed with an arraying concept, with a fundamental focus on a network-centric arraying.

  8. Scalable Analysis Techniques for Microprocessor Performance Counter Metrics

    SciTech Connect

    Ahn, D H; Vetter, J S

    2002-07-24

    Contemporary microprocessors provide a rich set of integrated performance counters that allow application developers and system architects alike the opportunity to gather important information about workload behaviors. These counters can capture instruction, memory, and operating system behaviors. Current techniques for analyzing data produced from these counters use raw counts, ratios, and visualization techniques to help users make decisions about their application source code. While these techniques are appropriate for analyzing data from one process, they do not scale easily to new levels demanded by contemporary computing systems. Indeed, the amount of data generated by these experiments is on the order of tens of thousands of data points. Furthermore, if users execute multiple experiments, then we add yet another dimension to this already knotty picture. This flood of multidimensional data can swamp efforts to harvest important ideas from these valuable counters. Very simply, this paper addresses these concerns by evaluating several multivariate statistical techniques on these datasets. We find that several techniques, such as statistical clustering, can automatically extract important features from this data. These derived results can, in turn, be fed directly back to an application developer, or used as input to a more comprehensive performance analysis environment, such as a visualization or an expert system.
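
    One of the techniques evaluated, statistical clustering, can be sketched as follows: normalize the counter vectors, then let a clustering algorithm group processes with similar behavior. The data are synthetic and the counter names illustrative only; the paper's actual methods may differ.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(3)
    # 3 synthetic behaviours x 1000 processes x 4 counters
    # (instructions, L2 misses, TLB misses, branch mispredicts)
    groups = [rng.normal(m, 0.05, (1000, 4)) for m in
              ([1.0, 0.2, 0.1, 0.1],
               [0.6, 0.8, 0.3, 0.1],
               [0.7, 0.3, 0.1, 0.6])]
    X = StandardScaler().fit_transform(np.vstack(groups))

    # cluster the counter vectors; centroids summarize distinct behaviours
    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
    print("cluster sizes:", np.bincount(km.labels_))
    ```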

  9. Magnetohydrodynamic Augmented Propulsion Experiment: I. Performance Analysis and Design

    NASA Technical Reports Server (NTRS)

    Litchford, R. J.; Cole, J. W.; Lineberry, J. T.; Chapman, J. N.; Schmidt, H. J.; Lineberry, C. W.

    2003-01-01

    The performance of conventional thermal propulsion systems is fundamentally constrained by the specific energy limitations associated with chemical fuels and the thermal limits of available materials. Electromagnetic thrust augmentation represents one intriguing possibility for improving the fuel composition of thermal propulsion systems, thereby increasing overall specific energy characteristics; however, realization of such a system requires an extremely high-energy-density electrical power source as well as an efficient plasma acceleration device. This Technical Publication describes the development of an experimental research facility for investigating the use of cross-field magnetohydrodynamic (MHD) accelerators as a possible thrust augmentation device for thermal propulsion systems. In this experiment, a 1.5-MW(sub e) Aerotherm arc heater is used to drive a 2-MW(sub e) MHD accelerator. The heatsink MHD accelerator is configured as an externally diagonalized, segmented channel, which is inserted into a large-bore, 2-T electromagnet. The performance analysis and engineering design of the flow path are described as well as the parameter measurements and flow diagnostics planned for the initial series of test runs.

  10. Performance analysis of reactive congestion control for ATM networks

    NASA Astrophysics Data System (ADS)

    Kawahara, Kenji; Oie, Yuji; Murata, Masayuki; Miyahara, Hideo

    1995-05-01

    In ATM networks, preventive congestion control is widely recognized as an efficient way to avoid congestion, and it is implemented by a conjunction of connection admission control and usage parameter control. However, congestion may still occur because of unpredictable statistical fluctuations of traffic sources, even when preventive control is performed in the network. In this paper, we study another kind of congestion control, i.e., reactive congestion control, in which each source adapts its cell emission rate to the traffic load at the switching node (or at the multiplexer). Our intention is to establish more efficient congestion control by incorporating such a method into ATM networks. We develop an analytical model and carry out an approximate analysis of the reactive congestion control algorithm. Numerical results show that the reactive congestion control algorithms are very effective in avoiding congestion and in achieving statistical gain. Furthermore, the binary congestion control algorithm with a pushout mechanism is shown to provide the best performance among the reactive congestion control algorithms treated here.

  11. Performance analysis of a digital capacitance measuring circuit.

    PubMed

    Xu, Lijun; Sun, Shijie; Cao, Zhang; Yang, Wuqiang

    2015-05-01

    This paper presents the design and study of a digital capacitance measuring circuit with theoretical analysis, numerical simulation, and experimental evaluation. The static and dynamic performances of the capacitance measuring circuit are first defined, including signal-to-noise ratio (SNR), standard deviation, accuracy, linearity, sensitivity, and response time, within a given measurement range. Then numerical simulation is carried out to analyze the SNR and standard deviation of the circuit, followed by experiments to validate the overall performance of the circuit. The simulation results show that when the standard deviation of noise is 0.08 mV and the measured capacitance decreases from 6 pF to 3 fF, the SNR decreases from 90 dB to 22 dB and the standard deviation is between 0.17 fF and 0.24 fF. The experimental results show that when the measured capacitance decreases from 6 pF to 40 fF and the data sampled in a single period are used for demodulation, the SNR decreases from 88 dB to 40 dB and the standard deviation is between 0.18 fF and 0.25 fF. The maximum absolute error and relative error are 5.12 fF and 1.26%, respectively. The SNR and standard deviation can be further improved if the data sampled in more than one period are used for demodulation by the circuit.

  12. Regression analysis of technical parameters affecting nuclear power plant performances

    SciTech Connect

    Ghazy, R.; Ricotti, M. E.; Trueco, P.

    2012-07-01

    Since the 1980s many studies have been conducted to explain the good and bad performances of commercial nuclear power plants (NPPs), but no defined correlation has yet been found to be fully representative of plant operational experience. In early works, data availability and the number of operating power stations were both limited; therefore, results suggested that specific technical characteristics of NPPs were the main causal factors for successful plant operation. Although these aspects continue to play a significant role, later studies and observations showed that other factors concerning the management and organization of the plant could instead be predominant when comparing utilities' operational and economic results. Utility quality, in a word, can be used to summarize all the managerial and operational aspects that seem to be effective in determining plant performance. In this paper operational data of a consistent sample of commercial nuclear power stations, out of the total of 433 operating NPPs, are analyzed, mainly focusing on the last decade of operational experience. The sample consists of PWR and BWR technology, operated by utilities located in different countries, including the U.S., Japan, France, Germany, and Finland. Multivariate regression is performed using the Unit Capability Factor (UCF) as the dependent variable; this factor indeed reflects the effectiveness of plant programs and practices in maximizing the available electrical generation and consequently provides an overall indication of how well plants are operated and maintained. Aspects that may not be real causal factors but which can have a consistent impact on the UCF, such as technology design, supplier, size, and age, are included in the analysis as independent variables. (authors)
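
    The regression structure described, UCF against technical covariates, might look like the sketch below. The data are synthetic and the coefficients invented; only the form of the analysis is shown.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(11)
    n = 200
    df = pd.DataFrame({
        "age": rng.uniform(5, 40, n),              # years in operation
        "size_mw": rng.uniform(400, 1400, n),      # net capacity
        "is_bwr": rng.integers(0, 2, n),           # 1 = BWR, 0 = PWR
    })
    # synthetic UCF (%) with a mild age penalty plus noise
    df["ucf"] = (92 - 0.15 * df["age"] + 0.001 * df["size_mw"]
                 - 0.5 * df["is_bwr"] + rng.normal(0, 3, n))

    # multivariate regression with UCF as the dependent variable
    model = smf.ols("ucf ~ age + size_mw + is_bwr", data=df).fit()
    print(model.summary().tables[1])
    ```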

  13. Nutrition, sensory evaluation, and performance analysis of hydrogenated frying oils.

    PubMed

    Hack, Danielle M; Bordi, Peter L; Hessert, S William

    2009-12-01

    The Food and Drug Administration now requires labeling of trans fats on nutrition labels, a decision that has created a push to reformulate deep-fat frying oils. Prior to the passage of this law, frying oils contained trans fats because trans fats made the oils more stable, allowing for longer frying use. In the present study, oil performance, sensory evaluation, and nutritional analysis were conducted on trans fat-free oils through a 10-day degradation process using French fries to break down the oil. The goals of the study were to test oil stability, conduct nutritional analysis, and learn consumer preference between trans fat and trans fat-free oils. Sensory evaluation indicated a preference for fries prepared in trans fat-free oil mixtures. The most stable oils were also combination oils. Based on these findings, industry representatives considering trans fat-free frying oils should consider using blended oils instead, which met customers' taste preferences and minimized oil rancidity and usage.

  14. Analysis of beamed-energy ramjet/scramjet performance

    NASA Technical Reports Server (NTRS)

    Myrabo, L. N.; Powers, M. V.; Zaretzky, C. L.

    1986-01-01

    A study has been performed on a laser-heated ramjet/scramjet vehicle concept for propulsion during the air-breathing portion of an orbital launch trajectory. The concept considers axisymmetric, high-thrust vehicles with external inlets and nozzles. Conceptual design and ramjet/scramjet cycle analysis are emphasized, with propulsive energy provided by beamed laser power rather than combustion of on-board fuel: the conventional ramjet/scramjet combustion chamber is replaced by a laser energy absorption chamber. The elimination of on-board propellant can result in very high thrust-to-weight ratios and payload fractions, in a vehicle with a relatively small degree of mechanical complexity. The basic vehicle has a weight of 12,250 lbf and a diameter of 5 meters, which is close to the size of the Apollo command module. The ramjet calculations are based on a Mach 3 isentropic inlet with a 13.7 degree half-angle conical tip. The scramjet analysis considers conical inlets with 10, 15, and 30 degree half-angles. Flight Mach numbers from 2 to 20 are considered in the calculations.

  15. Comparative performance analysis: Commercial cut-flower rose production

    SciTech Connect

    Whittier, J.; Fischer, C.L.

    1990-04-01

    A comparative performance analysis has been conducted to examine the various factors associated with establishing and operating a commercial rose cut-flower greenhouse in ten different locations across the United States. Plant productivity, defined as net blooms produced per plant per year, is largely dependent upon local climatic conditions and technological improvements. Regional variations in productivity have been explicitly analyzed. The greenhouse operation is assumed to be four acres in size and the facilities utilize current technologies. The operation is designed as a professionally organized company with an owner/manager, grower, and salesperson. The primary product is a red hybrid tea rose for sale. Selling markets vary by location, but in general they are large metropolitan areas. The analysis strongly indicates that new installations for cut-flower rose production are profitable in several areas of the U.S. Southwest, particularly in New Mexico, Arizona, and Texas, though no single location stands out as clearly favored. Las Cruces, New Mexico, has the highest net present value and return on investment results. 68 refs., 1 fig., 8 tabs.

  16. The Current State of Human Performance Technology: A Citation Network Analysis of "Performance Improvement Quarterly," 1988-2010

    ERIC Educational Resources Information Center

    Cho, Yonjoo; Jo, Sung Jun; Park, Sunyoung; Kang, Ingu; Chen, Zengguan

    2011-01-01

    This study conducted a citation network analysis (CNA) of human performance technology (HPT) to examine the current state of the field. Previous reviews of the field have used traditional research methods, such as content analysis, survey, Delphi, and citation analysis. The distinctive features of CNA come from using a social network analysis…

  17. Analysis of Student Performance in Peer Led Undergraduate Supplements

    NASA Astrophysics Data System (ADS)

    Gardner, Linda M.

    Foundations of Chemistry courses at the University of Kansas have traditionally accommodated nearly 1,000 individual students every year with a single course in a large lecture hall. To develop a more student-centered learning atmosphere, Peer Led Undergraduate Supplements (PLUS) were introduced to assist students, starting in the spring of 2010. PLUS was derived from the more well-known Peer-Led Team Learning, with modifications to meet the specific needs of the university and the students. The yearlong investigation of PLUS Chemistry began in the fall of 2012 to allow for adequate development of materials and training of peer leaders. We examined the impact on academic achievement for students who attended PLUS sessions while controlling for high school GPA, math ACT scores, credit hours earned in high school, completion of calculus, gender, and aspiration to be a pharmacist (i.e., pre-pharmacy students). In a linear least-squares multiple regression, PLUS participants scored on average one percent higher on exams for Chemistry 184, and four tenths of a percent higher for Chemistry 188, for each PLUS session attended. Pre-pharmacy status moderated the effect of PLUS attendance on chemistry achievement, ultimately negating any relative gain associated with attending PLUS sessions. Evidence of a gender difference was found in the Chemistry 188 model, indicating that females experience a greater benefit from PLUS sessions. Additionally, an item analysis studied the relationship between PLUS material and individual exam items. The research found that students who attended PLUS sessions answered PLUS-related items correctly 10 to 20 percent more often than their comparison group, versus no difference to 10 percent more often for non-PLUS-related items. In summary, PLUS has a positive effect on exam performance in introductory chemistry courses at the University of Kansas.

  18. Analysis of correlation between corneal topographical data and visual performance

    NASA Astrophysics Data System (ADS)

    Zhou, Chuanqing; Yu, Lei; Ren, Qiushi

    2007-02-01

    Purpose: To study the correlations among corneal asphericity, higher-order aberrations, and visual performance for eyes with virgin myopia and after laser in situ keratomileusis (LASIK). Methods: A total of 320 candidates (590 eyes) for LASIK treatment were included in this study. The mean preoperative spherical equivalent was -4.35+/-1.51 D (-1.25 to -9.75), with astigmatism less than 2.5 D. Corneal topography maps and contrast sensitivity were measured and analyzed for every eye before and one year after LASIK for the analysis of corneal asphericity and wavefront aberrations. Results: Preoperatively, only 4th and 6th order aberrations had significant correlation with corneal asphericity and apical radius of curvature (p<0.001). Postoperatively, all 3rd to 6th order aberrations had statistically significant correlation with corneal asphericity (p<0.01), but only 4th and 6th order aberrations had significant correlation with apical radius of curvature (p<0.05). Asymmetrical aberrations such as coma had significant correlation with the vertical offset of the pupil center (p<0.01). Preoperatively, corneal aberrations had no significant correlation with visual acuity or the area under the log contrast sensitivity function (AULCSF) (P>0.05). Postoperatively, corneal aberrations still had no significant correlation with visual acuity (P>0.05), but had a significantly negative correlation with AULCSF (P<0.01). Corneal asphericity had no significant correlation with AULCSF before or after the treatment (P>0.05). Conclusions: Corneal aberrations correlated differently with corneal profile and visual performance for eyes with virgin myopia and after LASIK, which may be due to the changed corneal profile and the limitations of the metrics of corneal aberrations.

  19. Design, fabrication & performance analysis of an unmanned aerial vehicle

    NASA Astrophysics Data System (ADS)

    Khan, M. I.; Salam, M. A.; Afsar, M. R.; Huda, M. N.; Mahmud, T.

    2016-07-01

    An Unmanned Aerial Vehicle was designed, analyzed, and fabricated to meet design requirements and perform the entire mission for an international aircraft design competition. The goal was a balanced design possessing good demonstrated flight handling qualities and practical and affordable manufacturing requirements while providing high vehicle performance. The UAV had to complete a total of three missions: a ferry flight (1st mission), a maximum load mission (2nd mission), and an emergency medical mission (3rd mission). The requirement of the ferry flight mission was to fly as many laps as possible within 4 minutes. The maximum load mission consisted of flying 3 laps while carrying two wooden blocks simulating cargo. The requirement of the emergency medical mission was to complete 3 laps as quickly as possible while carrying two attendants and two patients. A careful analysis revealed the lowest rated aircraft cost (RAC) as the primary design objective. So the challenge was to build an aircraft with minimum RAC that can fly fast, fly with maximum payload, and fly fast in all possible configurations. The aircraft design was reached by first generating numerous design concepts capable of completing the mission requirements. In the conceptual design phase, a Figure of Merit (FOM) analysis was carried out to select the initial aircraft configuration, propulsion, empennage, and landing gear. After completion of the conceptual design, the preliminary design was carried out. The preliminary design iterations had a low wing loading, a high lift coefficient, and a high thrust-to-weight ratio. To make the aircraft capable of rough-field taxi, springs were added to the landing gear to absorb shock. An airfoil-shaped fuselage was designed to allow sufficient space for payload and to generate less drag, letting the aircraft fly fast. The final design was a high-wing monoplane with a conventional tail, a single tractor propulsion system, and a tail-dragger landing gear. Payload was stored in

  20. Performance analysis of a Principal Component Analysis ensemble classifier for Emotiv headset P300 spellers.

    PubMed

    Elsawy, Amr S; Eldawlatly, Seif; Taher, Mohamed; Aly, Gamal M

    2014-01-01

    The current trend to use Brain-Computer Interfaces (BCIs) with mobile devices mandates the development of efficient EEG data processing methods. In this paper, we demonstrate the performance of a Principal Component Analysis (PCA) ensemble classifier for P300-based spellers. We recorded EEG data from multiple subjects using the Emotiv neuroheadset in the context of a classical oddball P300 speller paradigm. We compare the performance of the proposed ensemble classifier to the performance of traditional feature extraction and classifier methods. Our results demonstrate the capability of the PCA ensemble classifier to classify P300 data recorded using the Emotiv neuroheadset with an average accuracy of 86.29% on cross-validation data. In addition, offline testing of the recorded data reveals an average classification accuracy of 73.3% that is significantly higher than that achieved using traditional methods. Finally, we demonstrate the effect of the parameters of the P300 speller paradigm on the performance of the method.
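
    A minimal sketch of a PCA-based P300 classification pipeline is shown below on synthetic single-channel epochs. It illustrates the processing chain (dimensionality reduction, then target/non-target classification), not the authors' exact ensemble or the Emotiv recordings; all shapes and values are assumptions.

    ```python
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(5)
    n, t = 600, 128                      # epochs x time samples (1 channel)
    y = rng.integers(0, 2, n)            # 1 = target (P300 present)
    p300 = np.exp(-0.5 * ((np.arange(t) - 60) / 8.0) ** 2)  # synthetic ERP
    X = rng.normal(0, 1, (n, t)) + np.outer(y, 2.0 * p300)  # noisy epochs

    # PCA feature extraction followed by a linear classifier
    clf = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
    print(f"CV accuracy: {cross_val_score(clf, X, y, cv=5).mean():.3f}")
    ```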

  1. Advanced Analysis of Finger-Tapping Performance: A Preliminary Study

    PubMed Central

    Barut, Çağatay; Kızıltan, Erhan; Gelir, Ethem; Köktürk, Fürüzan

    2013-01-01

    Background: The finger-tapping test is a commonly employed quantitative assessment tool used to measure motor performance in the upper extremities. This task is a complex motion that is affected by external stimuli, mood and health status. The complexity of this task is difficult to explain with a single average intertap-interval value (time difference between successive tappings) which only provides general information and neglects the temporal effects of the aforementioned factors. Aims: This study evaluated the time course of average intertap-interval values and the patterns of variation in both the right and left hands of right-handed subjects using a computer-based finger-tapping system. Study Design: Cross sectional study. Methods: Thirty eight male individuals aged between 20 and 28 years (Mean±SD = 22.24±1.65) participated in the study. Participants were asked to perform single-finger-tapping test for 10 seconds of test period. Only the results of right-handed (RH) 35 participants were considered in this study. The test records the time of tapping and saves data as the time difference between successive tappings for further analysis. The average number of tappings and the temporal fluctuation patterns of the intertap-intervals were calculated and compared. The variations in the intertap-interval were evaluated with the best curve fit method. Results: An average tapping speed or tapping rate can reliably be defined for a single-finger tapping test by analysing the graphically presented data of the number of tappings within the test period. However, a different presentation of the same data, namely the intertap-interval values, shows temporal variation as the number of tapping increases. Curve fitting applications indicate that the variation has a biphasic nature. Conclusion: The measures obtained in this study reflect the complex nature of the finger-tapping task and are suggested to provide reliable information regarding hand performance. Moreover, the

  2. 1-D Numerical Analysis of RBCC Engine Performance

    NASA Technical Reports Server (NTRS)

    Han, Samuel S.

    1998-01-01

    An RBCC engine combines air-breathing and rocket engines into a single engine to increase the specific impulse over an entire flight trajectory. Considerable research pertaining to RBCC propulsion was performed during the 1960's, and these engines were revisited recently as a candidate propulsion system for either a single-stage-to-orbit (SSTO) or two-stage-to-orbit (TSTO) launch vehicle. A variety of RBCC configurations have been evaluated and new designs are currently under development. However, the basic configuration of all RBCC systems is built around the ejector scramjet engine originally developed for the hypersonic airplane. In this configuration, a rocket engine acts as an ejector in the air-augmented initial acceleration mode, as a fuel injector in scramjet mode, and as the rocket in all-rocket mode for orbital insertion. Computational fluid dynamics (CFD) is a useful tool for the analysis of complex transport processes in various components of RBCC propulsion systems. The objective of the present research was to develop a transient 1-D numerical model that could be used to predict flow behavior throughout a generic RBCC engine following a flight path.

  3. Analysis of classifiers performance for classification of potential microcalcification

    NASA Astrophysics Data System (ADS)

    M. N., Arun K.; Sheshadri, H. S.

    2013-07-01

    Breast cancer is a significant public health problem in the world. According to the literature, early detection improves breast cancer prognosis. Mammography is a screening tool used for the early detection of breast cancer. About 10-30% of cases are missed during routine checks, as it is difficult for radiologists to make accurate analyses due to the large amount of data. Microcalcifications (MCs) are considered to be important signs of breast cancer. It has been reported in the literature that 30%-50% of breast cancers detected radiographically show MCs on mammograms, and histologic examinations report that 62% to 79% of breast carcinomas reveal MCs. MCs are tiny; they vary in size, shape, and distribution, and may be closely connected to surrounding tissues. Traditional classifiers face a major challenge in the classification of individual potential MCs because processing mammograms at the appropriate stage generates data sets with unequal amounts of information for the two classes (i.e., MC and Not-MC). Most existing state-of-the-art classification approaches assume the underlying training set is evenly distributed, and they face a severe bias problem when the training set is highly imbalanced in distribution. This paper addresses this issue by using classifiers which handle imbalanced data sets. In this paper, we also compare the performance of classifiers used in the classification of potential MCs.
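
    One standard remedy in the direction the paper points at is to tell the classifier about the skewed class distribution via class weights. The sketch below contrasts an unweighted and a balanced-weight logistic regression on synthetic data with roughly 2% positives; it illustrates the issue, not the paper's specific classifiers.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import classification_report
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(2)
    n_neg, n_pos = 4900, 100             # heavy Not-MC / MC imbalance
    X = np.vstack([rng.normal(0.0, 1.0, (n_neg, 2)),
                   rng.normal(1.8, 1.0, (n_pos, 2))])
    y = np.r_[np.zeros(n_neg), np.ones(n_pos)]
    Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=0)

    for weight in (None, "balanced"):
        clf = LogisticRegression(class_weight=weight).fit(Xtr, ytr)
        rec = classification_report(yte, clf.predict(Xte),
                                    output_dict=True)["1.0"]["recall"]
        print(f"class_weight={weight}: minority-class recall = {rec:.2f}")
    ```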

  4. Analysis of Performance of Stereoscopic-Vision Software

    NASA Technical Reports Server (NTRS)

    Kim, Won; Ansar, Adnan; Steele, Robert; Steinke, Robert

    2007-01-01

    A team of JPL researchers has analyzed stereoscopic vision software and produced a document describing its performance. This software is of the type used in maneuvering exploratory robotic vehicles on Martian terrain. The software in question utilizes correlations between portions of the images recorded by two electronic cameras to compute stereoscopic disparities, which, in conjunction with camera models, are used in computing distances to terrain points to be included in constructing a three-dimensional model of the terrain. The analysis included effects of correlation-window size, a pyramidal image down-sampling scheme, vertical misalignment, focus, maximum disparity, stereo baseline, and range ripples. Contributions of sub-pixel interpolation, vertical misalignment, and foreshortening to stereo correlation error were examined theoretically and experimentally. It was found that camera-calibration inaccuracy contributes to both down-range and cross-range error but stereo correlation error affects only the down-range error. Experimental data for quantifying the stereo disparity error were obtained by use of reflective metrological targets taped to corners of bricks placed at known positions relative to the cameras. For the particular 1,024-by-768-pixel cameras of the system analyzed, the standard deviation of the down-range disparity error was found to be 0.32 pixel.
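
    The quoted disparity error translates to down-range error through the stereo range equation Z = fB/d. A short worked example follows, with an assumed focal length and baseline and the 0.32-pixel sigma from the text.

    ```python
    # First-order error propagation: Z = f*B/d  =>  sigma_Z ~ Z^2/(f*B)*sigma_d
    f_px = 1000.0        # focal length in pixels (assumed)
    B = 0.30             # stereo baseline in metres (assumed)
    sigma_d = 0.32       # disparity standard deviation, pixels (from text)

    for Z in (2.0, 5.0, 10.0):                 # target ranges, metres
        d = f_px * B / Z                       # disparity at that range
        sigma_z = Z**2 / (f_px * B) * sigma_d  # down-range error estimate
        print(f"Z={Z:4.1f} m  d={d:6.1f} px  sigma_Z={sigma_z * 100:5.1f} cm")
    ```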

  5. Design consideration and performance analysis of OCT-based topography

    NASA Astrophysics Data System (ADS)

    Meemon, Panomsak; Yao, Jianing; Rolland, Jannick P.

    2014-03-01

    We report a study on design considerations and performance analysis of OCT-based topography by tracking the maximum intensity at each layer interface. We demonstrate that, for a given stabilized OCT system, high precision and accuracy of OCT-based layer and thickness topography, on the order of tens of nanometers, can be achieved using a maximum-amplitude tracking technique. Submicron precision was obtained by oversampling the FFT of the acquired spectral fringes but was ultimately limited by the system stability. Furthermore, we report characterization of the precision, repeatability, and accuracy of surface, sub-surface, and thickness topography using our optimized FD-OCT system. We verified that, for the given stability of our OCT system, a precision of the detected signal-peak position down to 20 nm was obtained. In addition, we quantified the degradation of precision caused by the sensitivity fall-off over depth of FD-OCT. The measured precision is about 20 nm at about 0.1 mm depth and degrades to about 80 nm at 1 mm depth, a position of about 10 dB sensitivity fall-off. The measured repeatability of thickness measurements over depth was approximately 0.04 micron. Finally, the accuracy of the system was verified by comparison with a digital micrometer gauge.
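
    The oversampling idea can be sketched directly: zero-padding the spectral fringe before the FFT refines the sampling of the depth profile, so the interface peak can be located at a small fraction of a depth bin. All values below are illustrative, not the authors' system parameters.

    ```python
    import numpy as np

    n, pad = 2048, 32                 # spectral samples, oversampling factor
    k = np.arange(n)                  # normalized wavenumber index
    true_depth = 123.4567             # interface position, in depth bins
    fringe = np.cos(2 * np.pi * true_depth * k / n)   # synthetic spectral fringe

    # zero-padded FFT oversamples the depth profile by the factor `pad`
    profile = np.abs(np.fft.rfft(fringe, n * pad))
    peak = np.argmax(profile) / pad   # peak location, back in depth bins
    print(f"recovered depth: {peak:.4f} bins (truth {true_depth})")
    ```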

  6. Analysis of Illinois Home Performance with ENERGY STAR® Measure Packages

    SciTech Connect

    Baker, J.; Yee, S.; Brand, L.

    2013-09-01

    Through the Chicagoland Single Family Housing Characterization and Retrofit Prioritization report, the Partnership for Advanced Residential Retrofit research team characterized 15 housing types in the Chicagoland region based on assessor data, utility billing history, and available data from prior energy efficiency programs. Within these 15 groups, a subset showed the greatest opportunity for energy savings based on BEopt Version 1.1 modeling of potential energy efficiency package options and the percent of the housing stock represented by each group. In this project, collected field data from a whole-home program in Illinois are utilized to compare marketplace-installed measures to the energy saving optimal packages previously developed for the 15 housing types. Housing type, conditions, energy efficiency measures installed, and retrofit cost information were collected from 19 homes that participated in the Illinois Home Performance with ENERGY STAR program in 2012, representing eight of the characterized housing groups. Two were selected for further case study analysis to provide an illustration of the differences between optimal and actually installed measures. Taken together, these homes are representative of 34.8% of the Chicagoland residential building stock. In one instance, actual installed measures closely matched optimal recommended measures.

  7. Performance Analysis: Work Control Events Identified January - August 2010

    SciTech Connect

    De Grange, C E; Freeman, J W; Kerr, C E; Holman, G; Marsh, K; Beach, R

    2011-01-14

    This performance analysis evaluated 24 events that occurred at LLNL from January through August 2010. The analysis identified areas of potential work control process and/or implementation weaknesses and several common underlying causes. Human performance improvement and safety culture factors were part of the causal analysis of each event and were analyzed. The collective significance of all events in 2010, as measured by the occurrence reporting significance category and by the proportion of events that have been reported to the DOE ORPS under the "management concerns" reporting criteria, does not appear to have increased. The frequency of reporting in each of the significance categories has not changed in 2010 compared to the previous four years. There is no change indicating a trend in the significance category, and there has been no increase in the proportion of occurrences reported in the higher significance category. Also, the frequency of events, 42 events reported through August 2010, is not greater than in previous years and is below the average of 63 occurrences per year at LLNL since 2006. Over the previous four years, an average of 43% of LLNL's reported occurrences have been reported as either "management concerns" or "near misses." In 2010, 29% of the occurrences have been reported as "management concerns" or "near misses." This rate indicates that LLNL is now reporting fewer "management concern" and "near miss" occurrences compared to the previous four years. From 2008 to the present, LLNL senior management has undertaken a series of initiatives to strengthen the work planning and control system with the primary objective of improving worker safety. In 2008, the LLNL Deputy Director established the Work Control Integrated Project Team to develop the core requirements and graded elements of an institutional work planning and control system. By the end of that year this system was documented and implementation had begun. In 2009

  8. Routing performance analysis and optimization within a massively parallel computer

    DOEpatents

    Archer, Charles Jens; Peters, Amanda; Pinnow, Kurt Walter; Swartz, Brent Allen

    2013-04-16

    An apparatus, program product and method optimize the operation of a massively parallel computer system by, in part, receiving actual performance data concerning an application executed by the plurality of interconnected nodes, and analyzing the actual performance data to identify an actual performance pattern. A desired performance pattern may be determined for the application, and an algorithm may be selected from among a plurality of algorithms stored within a memory, the algorithm being configured to achieve the desired performance pattern based on the actual performance data.

  9. Analysis of TIMS performance subjected to simulated wind blast

    NASA Technical Reports Server (NTRS)

    Jaggi, S.; Kuo, S.

    1992-01-01

    The results of the performance of the Thermal Infrared Multispectral Scanner (TIMS) when it is subjected to various wind conditions in the laboratory are described. Various wind conditions were simulated using a 24-inch fan or combinations of air jet streams blowing toward either or both of the blackbody surfaces. The fan was used to simulate a large volume of air flow at moderate speeds (up to 30 mph). The small-diameter air jets were used to probe the TIMS system response to localized wind perturbations. The maximum nozzle speed of the air jet was 60 mph. A range of wind directions and speeds was set up in the laboratory during the test. The majority of the wind tests were conducted under ambient conditions, with the room temperature fluctuating no more than 2 C. The temperature of the high-speed air jet was determined to be within 1 C of the room temperature. TIMS response was recorded on analog tape. Additional thermistor readouts of the blackbody temperatures and a thermocouple readout of the ambient temperature were recorded manually to be compared with the housekeeping data recorded on the tape. Additional tests were conducted under conditions of elevated and cooled room temperatures; the room temperature was varied between 19.5 and 25.5 C in these tests. The calibration parameters needed for quantitative analysis of TIMS data were first plotted on a scanline-by-scanline basis. These parameters are the low and high blackbody temperature readings as recorded by the TIMS and their corresponding digitized count values. Using these values, the system transfer equation was calculated for each scanline; it gives the flux for any video count from the slope and intercept of the straight line that relates flux to digital count. The actual video of the target (the lab floor in this case) was then compared with a simulated target, assumed to be a blackbody with an emissivity of 0.95, and the temperature was…
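
    The scanline calibration described above is a two-point linear fit: the low and high blackbody references pin down the straight line relating band-integrated flux to digital count. A minimal sketch follows; blackbody_flux() is a stand-in (total Stefan-Boltzmann emission rather than the band-limited Planck integration TIMS processing actually uses), and the reference values are illustrative.

        SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

        def blackbody_flux(temp_c):
            # Stand-in for the band-integrated Planck flux; illustration only.
            return SIGMA * (temp_c + 273.15) ** 4

        def scanline_transfer(t_low_c, t_high_c, count_low, count_high):
            # Two-point system transfer equation for one scanline.
            f_low, f_high = blackbody_flux(t_low_c), blackbody_flux(t_high_c)
            slope = (f_high - f_low) / (count_high - count_low)
            intercept = f_low - slope * count_low
            return lambda count: slope * count + intercept

        flux_of = scanline_transfer(10.0, 45.0, 310, 3800)  # illustrative values
        print(flux_of(2048))  # flux for a mid-range video count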

  10. Performance analysis & optimization of well production in unconventional resource plays

    NASA Astrophysics Data System (ADS)

    Sehbi, Baljit Singh

    Unconventional resource plays, consisting of the lowest tier of resources (large volumes that are the most difficult to develop), have been the main focus of US domestic activity in recent times. Horizontal well drilling and hydraulic fracturing completion technology have been primarily responsible for this paradigm shift. The concept of drainage volume is examined using pressure diffusion along streamlines. We use diffusive time of flight to optimize the number of hydraulic fracture stages in horizontal well applications for tight gas reservoirs. Numerous field case histories are available in the literature for optimizing the number of hydraulic fracture stages, although the conclusions are case specific. In contrast, a general method is presented that can be used to augment the field experiments necessary to optimize the number of hydraulic fracture stages. The optimization results for the tight gas example are in line with the results from economic analysis. Fluid flow simulation for naturally fractured reservoirs (NFR) is performed by dual-permeability or dual-porosity formulations. Microseismic data from a Barnett Shale well are used to characterize the hydraulic fracture geometry. Sensitivity analysis, uncertainty assessment, and manual and computer-assisted history matching are integrated to develop a comprehensive workflow for building reliable reservoir simulation models. We demonstrate that incorporating the proper physics of flow is the first step in building reliable reservoir simulation models; lack of proper physics often leads to unreasonable reservoir parameter estimates. The workflow demonstrates reduced non-uniqueness for the inverse history matching problem. The behavior of near-critical fluids in liquid-rich shale plays defies the production behavior observed in conventional reservoir systems. In conventional reservoirs, an increased gas-oil ratio is observed when the flowing bottom-hole pressure falls below the saturation pressure. The production behavior is…

  11. Performance Analysis of Saturated Induction Motors by Virtual Tests

    ERIC Educational Resources Information Center

    Ojaghi, M.; Faiz, J.; Kazemi, M.; Rezaei, M.

    2012-01-01

    Many undergraduate-level electrical machines textbooks give detailed treatments of the performance of induction motors. Students can deepen their understanding of motor performance by performing the appropriate practical work in laboratories or in simulation using proper software packages. This paper considers various common and less-common tests…

  12. LIFT: analysis of performance in a laser assisted adaptive optics

    NASA Astrophysics Data System (ADS)

    Plantet, Cedric; Meimon, Serge; Conan, Jean-Marc; Neichel, Benoît; Fusco, Thierry

    2014-08-01

    Laser assisted adaptive optics systems rely on Laser Guide Star (LGS) Wave-Front Sensors (WFS) for high-order aberration measurements, and on Natural Guide Star (NGS) WFS to complement the measurements on low orders such as tip-tilt and focus. The sky coverage of the whole system is therefore related to the limiting magnitude of the NGS WFS. We have recently proposed LIFT, a novel phase-retrieval WFS technique that allows a 1-magnitude gain over the commonly used 2×2 Shack-Hartmann WFS. After an in-lab validation, LIFT's concept has been demonstrated on sky in open loop on GeMS (the Gemini Multiconjugate adaptive optics System at Gemini South). To complete its validation, LIFT now needs to be operated in closed loop in a laser assisted adaptive optics system. The present work gives a detailed analysis of LIFT's behavior in the presence of high-order residuals and of how to limit aliasing effects on the tip/tilt/focus estimation. We also study the impact of high orders on noise propagation. For this purpose, we simulate a multiconjugate adaptive optics loop representative of a GeMS-like 5-LGS configuration. The residual high orders are derived from a Fourier-based simulation. We demonstrate that LIFT keeps a high performance gain over the 2×2 Shack-Hartmann whatever the turbulence conditions. Finally, we show the first simulation of a closed loop with LIFT estimating turbulent tip/tilt and focus residuals such as could be induced by variations in the sodium layer's altitude.

  13. Performance analysis of signaling protocols on OBS switches

    NASA Astrophysics Data System (ADS)

    Kirci, Pinar; Zaim, A. Halim

    2005-10-01

    In this paper, Just-In-Time (JIT), Just-Enough-Time (JET) and Horizon signaling schemes for Optical Burst Switched (OBS) networks are presented. These signaling schemes run over a core dWDM network, and a network architecture based on Optical Burst Switches (OBS) is proposed to support IP, ATM and burst traffic. In IP and ATM traffic, several packets are assembled into a single packet called a burst, and burst contention is handled by burst dropping. The burst length distribution is arbitrary between 0 and 1 for IP traffic and fixed at 0.5 for ATM traffic; burst traffic, on the other hand, is arbitrary between 1 and 5. The Setup and Setup ack length distributions are arbitrary. We apply the Poisson model with rate λ and the self-similar model with Pareto-distributed rate α to identify inter-arrival times in these protocols. We consider a communication between a source client node and a destination client node over an ingress and one or more intermediate switches. We use buffering only in the ingress node. The communication is based on single-burst connections, in which the connection is set up just before sending a burst and closed as soon as the burst is sent. Our analysis accounts for several important parameters, including the burst setup, burst setup ack, keepalive messages and the optical switching protocol. We compare the performance of the three signaling schemes in terms of burst dropping probability under a range of network scenarios.
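
    The two arrival models can be reproduced in a few lines: a Poisson process has i.i.d. exponential inter-arrival gaps, while the self-similar model draws heavy-tailed Pareto gaps. A minimal sketch, with parameter values assumed purely for illustration:

        import random

        def poisson_interarrivals(lam, n):
            # Poisson arrivals with rate lam: exponential gaps with mean 1/lam.
            return [random.expovariate(lam) for _ in range(n)]

        def pareto_interarrivals(alpha, xm, n):
            # Heavy-tailed gaps for the self-similar model; alpha is the tail
            # index, xm the minimum gap (scale). Mean is alpha*xm/(alpha-1).
            return [xm * random.paretovariate(alpha) for _ in range(n)]

        random.seed(0)
        print(sum(poisson_interarrivals(2.0, 1000)) / 1000)     # approx. 0.5
        print(sum(pareto_interarrivals(1.5, 0.1, 1000)) / 1000) # approx. 0.3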

  14. Graphical User Interface for Simulink Integrated Performance Analysis Model

    NASA Technical Reports Server (NTRS)

    Durham, R. Caitlyn

    2009-01-01

    The J-2X engine (built by Pratt & Whitney Rocketdyne), in the Upper Stage of the Ares I Crew Launch Vehicle, will only start within a certain range of temperature and pressure for the liquid hydrogen and liquid oxygen propellants. The purpose of the Simulink Integrated Performance Analysis Model is to verify that in all reasonable conditions the temperature and pressure of the propellants are within the required J-2X engine start boxes. In order to run the simulation, test variables must be entered at all reasonable values of parameters such as heat leak and mass flow rate. To make this testing process as efficient as possible, to save the maximum amount of time and money, and to show that the J-2X engine will start when it is required to do so, a graphical user interface (GUI) was created to allow the input of values to be used as parameters in the Simulink model, without opening or altering the contents of the model. The GUI must allow test data to come from Microsoft Excel files, allow those values to be edited before testing, place those values into the Simulink model, and get the output from the Simulink model. The GUI was built using MATLAB and runs the Simulink simulation when the Simulate option is activated. After running the simulation, the GUI constructs a new Microsoft Excel file, as well as a MATLAB matrix file, from the output values of each test of the simulation so that they may be graphed and compared to other values.
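
    The batch flow the GUI automates (read a test matrix from a spreadsheet, run the model once per row, write the outputs back out) can be sketched generically. The file names, columns, and the run_case() stub below are assumptions for illustration, not part of the NASA tool, which is implemented in MATLAB/Simulink.

        import pandas as pd

        def run_case(params):
            # Stand-in for one Simulink run: takes one row of test parameters
            # (heat leak, mass flow rate, ...) and returns the model outputs.
            return {"lh2_temp_K": 20.4, "lox_press_kPa": 310.0}  # placeholders

        cases = pd.read_excel("test_matrix.xlsx")        # hypothetical input file
        outputs = pd.DataFrame([run_case(r.to_dict()) for _, r in cases.iterrows()])
        pd.concat([cases, outputs], axis=1).to_excel("results.xlsx", index=False)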

  15. VALIDATION GUIDELINES FOR LABORATORIES PERFORMING FORENSIC ANALYSIS OF CHEMICAL TERRORISM

    EPA Science Inventory

    The Scientific Working Group on Forensic Analysis of Chemical Terrorism (SWGFACT) has developed the following guidelines for laboratories engaged in the forensic analysis of chemical evidence associated with terrorism. This document provides a baseline framework and guidance for...

  16. Integrated Flight Performance Analysis of a Launch Abort System Concept

    NASA Technical Reports Server (NTRS)

    Tartabini, Paul V.

    2007-01-01

    This paper describes initial flight performance analyses conducted early in the Orion Project to support concept feasibility studies for the Crew Exploration Vehicle's Launch Abort System (LAS). Key performance requirements that significantly affect abort capability are presented. These requirements have implications for sizing the Abort Motor and for tailoring its thrust profile to meet escape requirements for both launch pad aborts and high-drag/high-dynamic-pressure ascent aborts. Additional performance considerations are provided for the Attitude Control Motor, a key element of the Orion LAS design that eliminates the need for ballast and provides performance robustness over a passive control approach. Finally, performance of the LAS jettison function is discussed, along with implications for Jettison Motor sizing and the timing of the jettison event during a nominal mission. These studies provide an initial understanding of LAS performance that will continue to evolve as the Orion design is matured.

  17. An empirical performance analysis of commodity memories in commodity servers

    SciTech Connect

    Kerbyson, D. J.; Lang, M. K.; Patino, G.

    2004-01-01

    This work details a performance study of six different commodity memories in two commodity server nodes on a number of microbenchmarks that measure low-level performance characteristics, as well as on two applications representative of the ASCI workload. The memories vary both in terms of performance, including latency and bandwidth, and in terms of their physical properties and manufacturer. Two server nodes were used: one Itanium-II Madison based system and one Xeon based system. All the memories examined can be used within both processing nodes. This allows the performance of the memories to be directly examined while keeping all other factors within a processing node the same (processor, motherboard, operating system, etc.). The results of this study show that there can be a significant difference in application performance from the different memories - by as much as 20%. Thus, by choosing the most appropriate memory for a processing node at a minimal cost differential, significantly improved performance may be achievable.

  18. Analysis of factors that predict clinical performance in medical school.

    PubMed

    White, Casey B; Dey, Eric L; Fantone, Joseph C

    2009-10-01

    Academic achievement indices including GPAs and MCAT scores are used to predict the spectrum of medical student academic performance types. However, use of these measures ignores two changes influencing medical school admissions: student diversity and affirmative action, and an increased focus on communication skills. To determine if GPA and MCAT predict performance in medical school consistently across students, and whether either predicts clinical performance in clerkships. A path model was developed to examine relationships among indices of medical student performance during the first three years of medical school for five cohorts of medical students. A structural equation approach was used to calculate the coefficients hypothesized in the model for majority and minority students. Significant differences between majority and minority students were observed. MCAT scores, for example, did not predict performance of minority students in the first year of medical school but did predict performance of majority students. This information may be of use to medical school admissions and resident selection committees. PMID:18030590

  19. Analysis of Aurora's Performance Simulation Engine for Three Systems

    SciTech Connect

    Freeman, Janine; Simon, Joseph

    2015-07-07

    Aurora Solar Inc. is building a cloud-based optimization platform to automate the design, engineering, and permit generation process of solar photovoltaic (PV) installations. They requested that the National Renewable Energy Laboratory (NREL) validate the performance of the PV system performance simulation engine of Aurora Solar's solar design platform, Aurora. In previous work, NREL performed a validation of multiple other PV modeling tools [1], so this study builds upon that work by examining all of the same fixed-tilt systems with available module datasheets that NREL selected and used in the aforementioned study. Aurora Solar set up these three operating PV systems in their modeling platform using NREL-provided system specifications and concurrent weather data. NREL then verified the setup of these systems, ran the simulations, and compared the Aurora-predicted performance data to measured performance data for those three systems, as well as to performance data predicted by other PV modeling tools.
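
    Validations of this kind typically compare predicted and measured production using a mean bias error and a root-mean-square error normalized by mean measured production. The report's exact metrics are not given in the abstract; the sketch below is a generic version under that assumption.

        import numpy as np

        def validation_metrics(predicted_kwh, measured_kwh):
            # Percent mean-bias error and percent RMS error vs. measured data.
            p = np.asarray(predicted_kwh, dtype=float)
            m = np.asarray(measured_kwh, dtype=float)
            mbe = 100.0 * (p - m).mean() / m.mean()
            rmse = 100.0 * np.sqrt(((p - m) ** 2).mean()) / m.mean()
            return mbe, rmse

        print(validation_metrics([410, 380, 295], [400, 390, 300]))  # toy data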

  20. Analysis of Dynamic Performances for Servo Drive Hydraulic System

    NASA Astrophysics Data System (ADS)

    Yang, Jianxi; Wang, Liying; Huang, Jian

    Based on the servo drive hydraulic system and using MATLAB/Simulink, this paper analyzes and simulates the impact of the structural parameters (J, Dp) and the mechanism parameters (A1, α, k, V1, Cp) on system dynamic performance. The relation curves between the main system characteristics and dynamic performance obtained from the simulations provide a basis for improving system dynamic performance. The simulation results indicate that dynamic performance can be improved through reasonable selection of the system structural parameters. This also lays a theoretical foundation for further study on energy saving in hydraulic injection machines.

  1. Separation of Performance-Approach and Performance-Avoidance Achievement Goals : A Broader Analysis

    ERIC Educational Resources Information Center

    Murayama, Kou; Elliot, Andrew J.; Yamagata, Shinji

    2011-01-01

    In the literature on achievement goals, performance-approach goals (striving to do better than others) and performance-avoidance goals (striving to avoid doing worse than others) tend to exhibit a moderate to high correlation, raising questions about whether the 2 goals represent distinct constructs. In the current article, we sought to examine…

  2. Manual control analysis of drug effects on driving performance

    NASA Technical Reports Server (NTRS)

    Smiley, A.; Ziedman, K.; Moskowitz, H.

    1981-01-01

    The effects of secobarbital, diazepam, alcohol, and marihuana on car-driver transfer functions obtained using a driving simulator were studied. The first three substances, all CNS depressants, reduced gain, crossover frequency, and coherence which resulted in poorer tracking performance. Marihuana also impaired tracking performance but the only effect on the transfer function parameters was to reduce coherence.
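
    The transfer-function parameters named here (gain, crossover frequency, coherence) are the fitted constants of a driver describing function. For reference, and as an assumption on my part rather than a detail given in the abstract, the conventional summary of combined driver-vehicle dynamics near crossover is the McRuer crossover model:

        Y_p(j\omega)\, Y_c(j\omega) \;\approx\; \frac{\omega_c}{j\omega}\, e^{-j\omega \tau_e}

    In this form, the reported depressant effects amount to a lower crossover frequency \omega_c, while reduced coherence means a smaller fraction of the driver's output is linearly related to the input.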

  3. An Analysis of a High Performing School District's Culture

    ERIC Educational Resources Information Center

    Corum, Kenneth D.; Schuetz, Todd B.

    2012-01-01

    This report describes a problem based learning project focusing on the cultural elements of a high performing school district. Current literature on school district culture provides numerous cultural elements that are present in high performing school districts. With the current climate in education placing pressure on school districts to perform…

  4. Cost Analysis When Open Surgeons Perform Minimally Invasive Hysterectomy

    PubMed Central

    Kantartzis, Kelly L.; Ahn, Ki Hoon; Bonidie, Michael J.; Lee, Ted

    2014-01-01

    Background and Objective: The costs to perform a hysterectomy are widely variable. Our objective was to determine hysterectomy costs by route and whether traditionally open surgeons lower costs when performing laparoscopy versus robotics. Methods: Hysterectomy costs including subcategories were collected from 2011 to 2013. Costs were skewed, so 2 statistical transformations were performed. Costs were compared by surgeon classification (open, laparoscopic, or robotic) and surgery route. Results: A total of 4,871 hysterectomies were performed: 34.2% open, 50.7% laparoscopic, and 15.1% robotic. Laparoscopic hysterectomy had the lowest total costs (P < .001). By cost subcategory, laparoscopic hysterectomy was lower than robotic hysterectomy in 6 and higher in 1. When performing robotic hysterectomy, open and robotic surgeon costs were similar. With laparoscopic hysterectomy, open surgeons had higher costs than laparoscopic surgeons for 1 of 2 statistical transformations (P = .007). Open surgeons had lower costs performing laparoscopic hysterectomy than robotic hysterectomy with robotic maintenance and depreciation included (P < .001) but similar costs if these variables were excluded. Conclusion: Although laparoscopic hysterectomy had lowest costs overall, robotics may be no more costly than laparoscopic hysterectomy when performed by surgeons who predominantly perform open hysterectomy. PMID:25489215

  5. An Analysis of the Professional Performance of Physician's Assistants

    ERIC Educational Resources Information Center

    Perry, Henry B., III

    1977-01-01

    The job performance of a national sample of 939 physician's assistants was assessed by both a self-rating scale and one completed by supervising physicians. Three quarters of the supervisors were greatly satisfied with their assistants. Amount of education and previous medical experience did not affect job performance. (Editor/LBH)

  6. Pitch Error Analysis of Young Piano Students' Music Reading Performances

    ERIC Educational Resources Information Center

    Rut Gudmundsdottir, Helga

    2010-01-01

    This study analyzed the music reading performances of 6-13-year-old piano students (N = 35) in their second year of piano study. The stimuli consisted of three piano pieces, systematically constructed to vary in terms of left-hand complexity and input simultaneity. The music reading performances were recorded digitally and a code of error analysis…

  7. Modelling and performance analysis of four and eight element TCAS

    NASA Technical Reports Server (NTRS)

    Sampath, K. S.; Rojas, R. G.; Burnside, W. D.

    1990-01-01

    This semi-annual report describes the work performed during the period September 1989 through March 1990. The first section presents a description of the effect of the engines of the Boeing 737-200 on the performance of a bottom mounted eight-element traffic alert and collision avoidance system (TCAS). The second section deals exclusively with a four element TCAS antenna. The model obtained to simulate the four element TCAS and new algorithms developed for studying its performance are described. The effect of location on its performance when mounted on top of a Boeing 737-200 operating at 1060 MHz is discussed. It was found that the four element TCAS generally does not perform as well as the eight element TCAS III.

  8. Performance analysis of PCM/ADPCM transcoding systems

    NASA Astrophysics Data System (ADS)

    Lee, J.-I.; Un, C. K.

    1985-12-01

    In this paper, the performance of PCM/ADPCM transcoding systems is analyzed. The coders studied are the 64 kbit/s PCM with the mu-255 companding law and the 32 kbit/s ADPCM proposed by Cummiskey et al. (1973). The theoretically predicted performance agrees closely with the results of computer simulation over a wide range of input signal levels. According to the results, the performance degradation resulting from the code conversion process appears to be minimal for the single-tandem case. However, for multiple tandem code conversions, performance becomes significantly degraded as the number of tandem coders increases. The overall performance is limited by whichever of the cascaded coders is inferior.
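
    The mu-255 companding law compresses a normalized sample x in [-1, 1] logarithmically before uniform quantization, so small amplitudes get most of the code space. A minimal sketch of the curve and its inverse follows; note that the deployed 64 kbit/s G.711 codec uses a segmented approximation of this curve rather than the exact formula.

        import math

        MU = 255.0

        def mu_compress(x):
            # mu-law compression of a normalized sample x in [-1, 1].
            return math.copysign(math.log1p(MU * abs(x)) / math.log1p(MU), x)

        def mu_expand(y):
            # Inverse mapping used on the decoding side.
            return math.copysign(math.expm1(abs(y) * math.log1p(MU)) / MU, y)

        x = 0.01
        print(mu_compress(x))             # approx. 0.228
        print(mu_expand(mu_compress(x)))  # approx. 0.01, round-trip recovers x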

  9. Issues in performing a network meta-analysis.

    PubMed

    Senn, Stephen; Gavini, Francois; Magrez, David; Scheen, André

    2013-04-01

    The example of the analysis of a collection of trials in diabetes consisting of a sparsely connected network of 10 treatments is used to make some points about approaches to analysis. In particular, various graphical and tabular presentations, both of the network and of the results, are provided, and the connection to the literature of incomplete blocks is made. It is clear from this example that it is inappropriate to treat the main effect of trial as random, and the implications of this for analysis are discussed. It is also argued that the generalisation from a classic random-effect meta-analysis to one applied to a network usually involves strong assumptions about the variance components involved. Despite this, it is concluded that such an analysis can be a useful way of exploring a set of trials. PMID:22218368

  10. Mir Cooperative Solar Array Flight Performance Data and Computational Analysis

    NASA Technical Reports Server (NTRS)

    Kerslake, Thomas W.; Hoffman, David J.

    1997-01-01

    The Mir Cooperative Solar Array (MCSA) was developed jointly by the United States (US) and Russia to provide approximately 6 kW of photovoltaic power to the Russian space station Mir. The MCSA was launched to Mir in November 1995 and installed on the Kvant-1 module in May 1996. Since the MCSA photovoltaic panel modules (PPMs) are nearly identical to those of the International Space Station (ISS) photovoltaic arrays, MCSA operation offered an opportunity to gather multi-year performance data on this technology prior to its implementation on ISS. Two specially designed test sequences were executed in June and December 1996 to measure MCSA performance. Each test period encompassed 3 orbital revolutions whereby the current produced by the MCSA channels was measured. The temperature of MCSA PPMs was also measured. To better interpret the MCSA flight data, a dedicated FORTRAN computer code was developed to predict the detailed thermal-electrical performance of the MCSA. Flight data compared very favorably with computational performance predictions. This indicated that the MCSA electrical performance was fully meeting pre-flight expectations. There were no measurable indications of unexpected or precipitous MCSA performance degradation due to contamination or other causes after 7 months of operation on orbit. Power delivered to the Mir bus was lower than desired as a consequence of the retrofitted power distribution cabling. The strong correlation of experimental and computational results further bolsters the confidence level of performance codes used in critical ISS electric power forecasting. In this paper, MCSA flight performance tests are described as well as the computational modeling behind the performance predictions.

  11. Analysis of a high-performance tubular solar collector

    NASA Technical Reports Server (NTRS)

    Lansing, F. L.; Yung, C. S.

    1981-01-01

    This article analyzes the thermal performance of a new vacuum tube solar collector. The assumptions and mathematical modeling are presented. The problem is reduced to the formulation of two simultaneous linear differential equations characterizing the collector's thermal behavior. After applying the boundary conditions, a general solution is obtained which is found to be similar to the general Hottel-Whillier-Bliss form, but with a complex flow factor. The details of the two-dimensional steady-state thermal model of the solar collector are also presented, including the computer simulation and the performance parameterization. Comparison of the simulated performance with the manufacturer's test data showed good agreement over wide ranges of operating conditions.
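
    For reference, the textbook Hottel-Whillier-Bliss form that the derived solution resembles is shown below; the article's result keeps this structure but replaces the usual flow factor with a more complex one specific to the tubular geometry. (These flat-plate expressions are standard, not quoted from the article.)

        Q_u = A_c F_R \left[ S - U_L \left( T_{in} - T_a \right) \right],
        \qquad
        F_R = \frac{\dot{m} c_p}{A_c U_L}
              \left[ 1 - \exp\!\left( -\frac{A_c U_L F'}{\dot{m} c_p} \right) \right]

    Here Q_u is the useful heat gain, S the absorbed solar flux, U_L the overall loss coefficient, T_{in} and T_a the inlet fluid and ambient temperatures, and F_R the heat-removal (flow) factor built from the collector efficiency factor F'.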

  12. Kinematic performance analysis of a parallel-chain hexapod machine

    SciTech Connect

    Jing Song; Jong-I Mou; Calvin King

    1998-05-18

    Inverse and forward kinematic models were derived to analyze the performance of a parallel-chain hexapod machine. Analytical models were constructed for both ideal and real structures. Performance assessment and enhancement algorithms were developed to determine the strut lengths for both ideal and real structures. The strut lengths determined from both cases can be used to analyze the effect of structural imperfections on machine performance. In an open-architecture control environment, strut length errors can be fed back to the controller to compensate for the displacement errors and thus improve the machine's accuracy in production.
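
    For the ideal structure, the inverse kinematics has a closed form: each strut length is the distance between its base anchor and the platform anchor transformed by the commanded pose. A minimal numpy sketch follows; the anchor coordinates and pose are illustrative, and the paper's real-structure model additionally perturbs the anchor locations.

        import numpy as np

        def strut_lengths(base_pts, plat_pts, pos, rpy):
            # base_pts, plat_pts: (6, 3) anchor coordinates in base/platform frames.
            # pos: platform origin in the base frame; rpy: roll, pitch, yaw (rad).
            r, p, y = rpy
            Rx = np.array([[1, 0, 0], [0, np.cos(r), -np.sin(r)], [0, np.sin(r), np.cos(r)]])
            Ry = np.array([[np.cos(p), 0, np.sin(p)], [0, 1, 0], [-np.sin(p), 0, np.cos(p)]])
            Rz = np.array([[np.cos(y), -np.sin(y), 0], [np.sin(y), np.cos(y), 0], [0, 0, 1]])
            R = Rz @ Ry @ Rx
            # Length of each strut: |pos + R p_i - b_i|.
            return np.linalg.norm(pos + plat_pts @ R.T - base_pts, axis=1)

        ang = np.radians([0, 60, 120, 180, 240, 300])
        base = np.c_[np.cos(ang), np.sin(ang), np.zeros(6)]  # unit hexagon
        plat = 0.5 * base                                    # smaller platform
        print(strut_lengths(base, plat, np.array([0, 0, 1.0]), (0, 0, 0)))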

  13. Analysis of complex network performance and heuristic node removal strategies

    NASA Astrophysics Data System (ADS)

    Jahanpour, Ehsan; Chen, Xin

    2013-12-01

    Removing important nodes from complex networks is a great challenge in fighting against criminal organizations and preventing disease outbreaks. Six network performance metrics, including four new metrics, are applied to quantify networks' diffusion speed, diffusion scale, homogeneity, and diameter. In order to efficiently identify nodes whose removal maximally destroys a network, i.e., minimizes network performance, ten structured heuristic node removal strategies are designed using different node centrality metrics, including degree, betweenness, reciprocal closeness, complement-derived closeness, and eigenvector centrality. These strategies are applied to remove nodes from the September 11, 2001 hijackers' network, and their performance is compared to that of a random strategy, which removes randomly selected nodes, and to the locally optimal solution (LOS), which removes nodes to minimize network performance at each step. The computational complexity of the 11 strategies and LOS is also analyzed. Results show that the node removal strategies using degree and betweenness centralities are more efficient than the other strategies.
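
    A minimal sketch of one such structured strategy (library usage is mine, not the paper's): repeatedly recompute a centrality, delete the top-ranked node, and track a simple performance proxy such as the size of the largest connected component.

        import networkx as nx

        def remove_by_centrality(g, centrality=nx.degree_centrality, k=3):
            # Greedy removal of the k most central nodes, recomputed each step.
            g = g.copy()
            history = []
            for _ in range(k):
                scores = centrality(g)
                target = max(scores, key=scores.get)
                g.remove_node(target)
                giant = max(nx.connected_components(g), key=len)
                history.append((target, len(giant)))  # proxy performance metric
            return history

        g = nx.karate_club_graph()
        print(remove_by_centrality(g, nx.betweenness_centrality, k=3))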

  14. Assessing BMP Performance Using Microtox Toxicity Analysis - Rhode Island

    EPA Science Inventory

    Best Management Practices (BMPs) have been shown to be effective in reducing runoff and pollutants from urban areas and thus provide a mechanism to improve downstream water quality. Currently, BMP performance regarding water quality improvement is assessed through measuring each...

  15. Performance analysis of ten brands of batteries for hearing aids

    PubMed Central

    Penteado, Silvio Pires; Bento, Ricardo Ferreira

    2013-01-01

    Introduction: Comparison of the performance of hearing instrument batteries from various manufacturers can enable otologists, audiologists, or final consumers to select the best products, maximizing the use of these materials. Aim: To analyze the performance of ten brands of batteries for hearing aids available in the Brazilian marketplace. Methods: Hearing aid batteries in four sizes were acquired from ten manufacturers and subjected to the same test conditions in an acoustic laboratory. Results: The laboratory results, contrasted with the values reported by manufacturers, revealed significant discrepancies; certain brands in certain sizes performed better on some tests, but no single brand was best in all sizes. Conclusions: It was possible to investigate the performance of ten brands of hearing aid batteries and describe the procedures to be followed for leakage, accidental intake, and disposal. PMID:25992026

  16. Preliminary basic performance analysis of the Cedar multiprocessor memory system

    NASA Technical Reports Server (NTRS)

    Gallivan, K.; Jalby, W.; Turner, S.; Veidenbaum, A.; Wijshoff, H.

    1991-01-01

    Some preliminary basic results on the performance of the Cedar multiprocessor memory system are presented. Empirical results are presented and used to calibrate a memory system simulator which is then used to discuss the scalability of the system.

  17. Assessing BMP Performance Using Microtox® Toxicity Analysis

    EPA Science Inventory

    Best Management Practices (BMPs) have been shown to be effective in reducing runoff and pollutants from urban areas and thus provide a mechanism to improve downstream water quality. Currently, BMP performance regarding water quality improvement is assessed through measuring each...

  18. Moisture and Structural Analysis for High Performance Hybrid Wall Assemblies

    SciTech Connect

    Grin, A.; Lstiburek, J.

    2012-09-01

    This report describes the work conducted by the Building Science Corporation (BSC) Building America Research Team's 'Energy Efficient Housing Research Partnerships' project. Based on past experience in the Building America program, the team has found that combinations of materials and approaches (in other words, systems) usually provide optimum performance. No single manufacturer typically provides all of the components for an assembly, nor does any one manufacturer have the specific understanding of all the individual components necessary for optimum performance.

  19. Using Weibull Distribution Analysis to Evaluate ALARA Performance

    SciTech Connect

    Frome, E. L.; Watkins, J. P.; Hagemeyer, D. A.

    2009-10-01

    As Low as Reasonably Achievable (ALARA) is the underlying principle for protecting nuclear workers from potential health outcomes related to occupational radiation exposure. Radiation protection performance is currently evaluated by measures such as collective dose and average measurable dose, which do not indicate ALARA performance. The purpose of this work is to show how statistical modeling of individual doses using the Weibull distribution can provide objective supplemental performance indicators for comparing ALARA implementation among sites and for insights into ALARA practices within a site. Maximum likelihood methods were employed to estimate the Weibull shape and scale parameters used for performance indicators. The shape parameter reflects the effectiveness of maximizing the number of workers receiving lower doses and is represented as the slope of the fitted line on a Weibull probability plot. Additional performance indicators derived from the model parameters include the 99th percentile and the exceedance fraction. When grouping sites by collective total effective dose equivalent (TEDE) and ranking by 99th percentile with confidence intervals, differences in performance among sites can be readily identified. Applying this methodology will enable more efficient and complete evaluation of the effectiveness of ALARA implementation.
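
    A minimal sketch of the indicators described, using scipy on synthetic doses (the data values are invented): a two-parameter Weibull is fit by maximum likelihood with the location fixed at zero, then the shape, 99th percentile, and exceedance fraction above a chosen dose are read off the fitted model.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        # Synthetic measurable doses (mSv) standing in for one site's workers.
        doses = stats.weibull_min.rvs(0.7, scale=1.2, size=500, random_state=rng)

        shape, _, scale = stats.weibull_min.fit(doses, floc=0)    # MLE, loc pinned at 0
        p99 = stats.weibull_min.ppf(0.99, shape, scale=scale)     # 99th percentile dose
        exceed_5 = stats.weibull_min.sf(5.0, shape, scale=scale)  # fraction above 5 mSv

        print(f"shape={shape:.2f} scale={scale:.2f} p99={p99:.2f} P(>5 mSv)={exceed_5:.3f}")

    A larger fitted shape corresponds to a steeper slope on the Weibull probability plot, i.e., more workers pushed toward lower doses.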

  1. Thermal Performance Analysis of a High-Mass Residential Building

    SciTech Connect

    Smith, M.W.; Torcellini, P.A.; Hayter, S.J.; Judkoff, R.

    2001-01-30

    Minimizing energy consumption in residential buildings using passive solar strategies almost always calls for the efficient use of massive building materials combined with solar gain control and adequate insulation. Using computerized simulation tools to understand the interactions among all the elements facilitates designing low-energy houses. Finally, the design team must feel confident that these tools are providing realistic results. The design team for the residential building described in this paper relied on computerized design tools to determine building envelope features that would maximize the energy performance [1]. Orientation, overhang dimensions, insulation amounts, window characteristics and other strategies were analyzed to optimize performance in the Pueblo, Colorado, climate. After construction, the actual performance of the house was monitored using both short-term and long-term monitoring approaches to verify the simulation results and document performance. Calibrated computer simulations showed that this house consumes 56% less energy than would a similar theoretical house constructed to meet the minimum residential energy code requirements. This paper discusses this high-mass house and compares the expected energy performance, based on the computer simulations, versus actual energy performance.

  2. Application of uncertainty analysis to cooling tower thermal performance tests

    SciTech Connect

    Yost, J.G.; Wheeler, D.E.

    1986-01-01

    The purpose of this paper is to provide an overview of uncertainty analyses. The following topics are addressed: 1. A review and summary of the basic constituents of an uncertainty analysis, with definitions and discussion of basic terms; 2. A discussion of the benefits and uses of uncertainty analysis; and 3. Example uncertainty analyses, with emphasis on the problems, limitations, and site-specific complications.

  3. Social Cognitive Career Theory, Conscientiousness, and Work Performance: A Meta-Analytic Path Analysis

    ERIC Educational Resources Information Center

    Brown, Steven D.; Lent, Robert W.; Telander, Kyle; Tramayne, Selena

    2011-01-01

    We performed a meta-analytic path analysis of an abbreviated version of social cognitive career theory's (SCCT) model of work performance (Lent, Brown, & Hackett, 1994). The model we tested included the central cognitive predictors of performance (ability, self-efficacy, performance goals), with the exception of outcome expectations. Results…

  4. Full-Envelope Launch Abort System Performance Analysis Methodology

    NASA Technical Reports Server (NTRS)

    Aubuchon, Vanessa V.

    2014-01-01

    The implementation of a new dispersion methodology is described, which disperses abort initiation altitude or time along with all other Launch Abort System (LAS) parameters during Monte Carlo simulations. In contrast, the standard methodology assumes that an abort initiation condition is held constant (e.g., aborts initiated at altitude for Mach 1, altitude for maximum dynamic pressure, etc.) while dispersing other LAS parameters. The standard method results in large gaps in performance information due to the discrete nature of initiation conditions, while the full-envelope dispersion method provides a significantly more comprehensive assessment of LAS abort performance for the full launch vehicle ascent flight envelope and identifies performance "pinch-points" that may occur at flight conditions outside of those contained in the discrete set. The new method has significantly increased the fidelity of LAS abort simulations and confidence in the results.
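
    The methodological difference is easy to see in code: in the full-envelope approach, the abort initiation time is just another dispersed input drawn for each Monte Carlo case rather than being fixed at a handful of discrete flight conditions. The parameter names and distributions below are illustrative only.

        import random

        def sample_case(rng):
            return {
                "abort_time_s":  rng.uniform(0.0, 120.0),  # dispersed over full ascent
                "motor_isp_s":   rng.gauss(265.0, 3.0),    # other LAS dispersions,
                "thrust_scale":  rng.gauss(1.0, 0.02),     # as in the standard method
                "wind_bias_mps": rng.gauss(0.0, 5.0),
            }

        rng = random.Random(42)
        cases = [sample_case(rng) for _ in range(2000)]  # each case feeds the 6-DOF sim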

  5. An Analysis of Performance Enhancement Techniques for Overset Grid Applications

    NASA Technical Reports Server (NTRS)

    Djomehri, J. J.; Biswas, R.; Potsdam, M.; Strawn, R. C.; Biegel, Bryan (Technical Monitor)

    2002-01-01

    The overset grid methodology has significantly reduced time-to-solution of high-fidelity computational fluid dynamics (CFD) simulations about complex aerospace configurations. The solution process resolves the geometrical complexity of the problem domain by using separately generated but overlapping structured discretization grids that periodically exchange information through interpolation. However, high performance computations of such large-scale realistic applications must be handled efficiently on state-of-the-art parallel supercomputers. This paper analyzes the effects of various performance enhancement techniques on the parallel efficiency of an overset grid Navier-Stokes CFD application running on an SGI Origin2000 machine. Specifically, the role of asynchronous communication, grid splitting, and grid grouping strategies are presented and discussed. Results indicate that performance depends critically on the level of latency hiding and the quality of load balancing across the processors.

  6. Off-design performance analysis of MHD generator channels

    NASA Technical Reports Server (NTRS)

    Wilson, D. R.; Williams, T. S.

    1980-01-01

    A computer code for performing parametric design point calculations, and evaluating the off-design performance of MHD generators has been developed. The program is capable of analyzing Faraday, Hall, and DCW channels, including the effect of electrical shorting in the gas boundary layers and coal slag layers. Direct integration of the electrode voltage drops is included. The program can be run in either the design or off-design mode. Details of the computer code, together with results of a study of the design and off-design performance of the proposed ETF MHD generator are presented. Design point variations of pre-heat and stoichiometry were analyzed. The off-design study included variations in mass flow rate and oxygen enrichment.

  7. Hydrogen engine performance analysis project. Second annual report

    SciTech Connect

    Adt, Jr., R. R.; Swain, M. R.; Pappas, J. M.

    1980-01-01

    Progress in a 3 year research program to evaluate the performance and emission characteristics of hydrogen-fueled internal combustion engines is reported. Fifteen hydrogen engine configurations will be subjected to performance and emissions characterization tests. During the first two years, baseline data for throttled and unthrottled, carburetted and timed hydrogen induction, Pre IVC hydrogen-fueled engine configurations, with and without exhaust gas recirculation (EGR) and water injection, were obtained. These data, along with descriptions of the test engine and its components, the test apparatus, experimental techniques, experiments performed and the results obtained, are given. Analyses of other hydrogen-engine project data are also presented and compared with the results of the present effort. The unthrottled engine vis-a-vis the throttled engine is found, in general, to exhibit higher brake thermal efficiency. The unthrottled engine also yields lower NOx emissions, which were found to be a strong function of fuel-air equivalence ratio. (LCL)

  8. Performance analysis: a study using data envelopment analysis in 26 Brazilian hospitals.

    PubMed

    Guerra, Mariana; de Souza, Antônio Artur; Moreira, Douglas Rafael

    2012-01-01

    This article describes a proposal for analyzing the performance of public Brazilian hospitals using financial and non-financial (i.e., operational) rates, and thereby highlights the effectiveness (or otherwise) of the financial management of the organizations in this study. A total of 72 hospitals in the Brazilian Unified Health Care System (in Portuguese, Sistema Único de Saúde, SUS) were selected for accessibility and completeness of their data. Twenty-six organizations made up the study sample, consisting of entities that had publicly disclosed financial statements for the period from 2008 (in particular, via the Internet) and whose operational data could be found in the SUS database. Our proposal, based on models using the method of Data Envelopment Analysis (DEA), was the construction of six initial models that were later compiled into a standard model. The relations between the rates that comprised the models were based on the variables and the notes of Schuhmann; McCue and Nayar; Barnum and Kutzin; Younis, Younies, and Okojie; Marinho, Moreno, and Cavalini; and Ersoy, Kavuncubasi, Ozcan, and Harris II. We put forward an enhanced proposal applicable to Brazil aiming to (i) confirm or refute the rates that show the effectiveness or ineffectiveness of financial management of national hospitals; and (ii) determine the best performers, which could be used as a reference for future studies. The results obtained were: (i) of all the financial indicators considered, only one showed no significance in all models; and (ii) for the operational indicators, the results were not relevant when the number of occupied beds was considered. Though the analysis covered only services provided by SUS, we conclude that our study has great potential for analyzing the financial management performance of Brazilian hospitals in general, for the following reasons: (i) it shows the relationship of financial and operational rates that can be used to analyze the performance of…
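
    The DEA building block behind such models is a small linear program solved once per hospital (decision-making unit, DMU). A generic input-oriented CCR formulation is sketched below with scipy; this is the textbook model, not necessarily the exact variant the authors used, and the toy data are invented.

        import numpy as np
        from scipy.optimize import linprog

        def ccr_efficiency(X, Y, o):
            # X: (m inputs x n DMUs), Y: (s outputs x n DMUs); o: DMU under test.
            # Minimize theta subject to X@lam <= theta*x_o and Y@lam >= y_o.
            m, n = X.shape
            s = Y.shape[0]
            c = np.r_[1.0, np.zeros(n)]                        # variables: theta, lam
            A_ub = np.vstack([np.hstack([-X[:, [o]], X]),      # input constraints
                              np.hstack([np.zeros((s, 1)), -Y])])  # output constraints
            b_ub = np.r_[np.zeros(m), -Y[:, o]]
            res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                          bounds=[(None, None)] + [(0, None)] * n)
            return res.fun                                     # efficiency in (0, 1]

        X = np.array([[20., 30., 40.], [5., 4., 6.]])  # toy: 2 inputs, 3 hospitals
        Y = np.array([[100., 120., 110.]])             # 1 output
        print([round(ccr_efficiency(X, Y, o), 3) for o in range(3)])

    DMUs scoring 1 form the efficient frontier; scores below 1 measure how far inputs could be shrunk at unchanged output.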

  9. Performance issues for engineering analysis on MIMD parallel computers

    SciTech Connect

    Fang, H.E.; Vaughan, C.T.; Gardner, D.R.

    1994-08-01

    We discuss how engineering analysts can obtain greater computational resolution in a more timely manner from applications codes running on MIMD parallel computers. Both processor speed and memory capacity are important to achieving better performance than a serial vector supercomputer. To obtain good performance, a parallel applications code must be scalable. In addition, the aspect ratios of the subdomains in the decomposition of the simulation domain onto the parallel computer should be of order 1. We demonstrate these conclusions using simulations conducted with the PCTH shock wave physics code running on a Cray Y-MP, a 1024-node nCUBE 2, and an 1840-node Paragon.

  10. Performance analysis of Ethernet PON system accommodating 64 ONUs

    NASA Astrophysics Data System (ADS)

    Tanaka, Keiji; Ohara, Kazuho; Miyazaki, Noriyuki; Edagawa, Noboru

    2007-05-01

    We report the performance of an IEEE 802.3 standard compliant Ethernet passive optical network (EPON) system accommodating 64 optical network units (ONUs). After investigating the optical transmission performance, we successfully demonstrate that a high throughput of more than 900 Mbit/s can be achieved in a 64-ONU EPON system using multiple logical link identifiers per ONU within a range of 10 km. In addition, we confirm the feasibility of IP-based high-quality triple-play services in the EPON system.

  11. An analysis of functional shoulder movements during task performance using Dartfish movement analysis software

    PubMed Central

    Khadilkar, Leenesh; MacDermid, Joy C.; Sinden, Kathryn E.; Jenkyn, Thomas R.; Birmingham, Trevor B.; Athwal, George S.

    2014-01-01

    Purpose: Video-based movement analysis software (Dartfish) has potential for clinical applications for understanding shoulder motion if functional measures can be reliably obtained. The primary purpose of this study was to describe the functional range of motion (ROM) of the shoulder used to perform a subset of functional tasks. A second purpose was to assess the reliability of functional ROM measurements obtained by different raters using Dartfish software. Materials and Methods: Ten healthy participants, mean age 29 ± 5 years, were videotaped while performing five tasks selected from the Disabilities of the Arm, Shoulder and Hand (DASH). Video cameras and markers were used to obtain video images suitable for analysis in Dartfish software. Three repetitions of each task were performed. Shoulder movements from all three repetitions were analyzed using Dartfish software. The tracking tool of the Dartfish software was used to obtain shoulder joint angles and arcs of motion. Test-retest and inter-rater reliability of the measurements were evaluated using intraclass correlation coefficients (ICC). Results: Maximum (coronal plane) abduction (118° ± 16°) and (sagittal plane) flexion (111° ± 15°) were observed during ‘washing one's hair’; maximum extension (−68° ± 9°) was identified during ‘washing one's own back.’ Minimum shoulder ROM was observed during ‘opening a tight jar’ (33° ± 13° abduction and 13° ± 19° flexion). Test-retest reliability (ICC = 0.45 to 0.94) suggests high inter-individual task variability, and inter-rater reliability (ICC = 0.68 to 1.00) showed moderate to excellent agreement. Conclusion: Key findings include: 1) the functional shoulder ROM identified in this study was comparable to that of similar studies; 2) healthy individuals require less than full ROM when performing five common ADL tasks; 3) high participant variability was observed during performance of the five ADL tasks; and 4) Dartfish software provides a clinically relevant…
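
    The per-frame quantity a 2-D video tracker reports reduces to simple vector geometry: the joint angle at the shoulder marker is the angle between the vectors to its two neighboring markers. A minimal sketch (the marker naming is mine, not Dartfish's interface):

        import numpy as np

        def joint_angle(a, b, c):
            # Angle at marker b (e.g., the shoulder) formed by markers a and c,
            # from (x, y) pixel coordinates in one video frame.
            u = np.asarray(a, float) - np.asarray(b, float)
            v = np.asarray(c, float) - np.asarray(b, float)
            cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
            return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

        print(joint_angle((0, 0), (0, 1), (1, 1)))  # -> 90.0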

  12. Performance analysis of exam gloves used for aseptic rodent surgery.

    PubMed

    LeMoine, Dana M; Bergdall, Valerie K; Freed, Carrie

    2015-05-01

    Aseptic technique includes the use of sterile surgical gloves for survival surgeries in rodents to minimize the incidence of infections. Exam gloves are much less expensive than are surgical gloves and may represent a cost-effective, readily available option for use in rodent surgery. This study examined the effectiveness of surface disinfection of exam gloves with 70% isopropyl alcohol or a solution of hydrogen peroxide and peracetic acid (HP-PA) in reducing bacterial contamination. Performance levels for asepsis were met when gloves were negative for bacterial contamination after surface disinfection and sham 'exertion' activity. According to these criteria, 94% of HP-PA-disinfected gloves passed, compared with 47% of alcohol-disinfected gloves. In addition, the effect of autoclaving on the integrity of exam gloves was examined, given that autoclaving is another readily available option for aseptic preparation. Performance criteria for glove integrity after autoclaving consisted of: the ability to don the gloves followed by successful simulation of wound closure and completion of stretch tests without tearing or observable defects. Using these criteria, 98% of autoclaved nitrile exam gloves and 76% of autoclaved latex exam gloves met performance expectations compared with the performance of standard surgical gloves (88% nitrile, 100% latex). The results of this study support the use of HP-PA-disinfected latex and nitrile exam gloves or autoclaved nitrile exam gloves as viable cost-effective alternatives to sterile surgical gloves for rodent surgeries. PMID:26045458

  13. Performance Factors Analysis -- A New Alternative to Knowledge Tracing

    ERIC Educational Resources Information Center

    Pavlik, Philip I., Jr.; Cen, Hao; Koedinger, Kenneth R.

    2009-01-01

    Knowledge tracing (KT)[1] has been used in various forms for adaptive computerized instruction for more than 40 years. However, despite its long history of application, it is difficult to use in domain model search procedures, has not been used to capture learning where multiple skills are needed to perform a single action, and has not been used…

  14. Relative Performance of Academic Departments Using DEA with Sensitivity Analysis

    ERIC Educational Resources Information Center

    Tyagi, Preeti; Yadav, Shiv Prasad; Singh, S. P.

    2009-01-01

    The process of liberalization and globalization of Indian economy has brought new opportunities and challenges in all areas of human endeavor including education. Educational institutions have to adopt new strategies to make best use of the opportunities and counter the challenges. One of these challenges is how to assess the performance of…

  15. Leadership Styles and Organizational Performance: A Predictive Analysis

    ERIC Educational Resources Information Center

    Kieu, Hung Q.

    2010-01-01

    Leadership is critically important because it affects the health of the organization. Research has found that leadership is one of the most significant contributors to organizational performance. Expanding and replicating previous research, and focusing on the specific telecommunications sector, this study used multiple correlation and regression…

  16. Performance analysis of vortex based mixers for confined flows

    NASA Astrophysics Data System (ADS)

    Buschhagen, Timo

    The hybrid rocket is still sparsely employed in major space or defense projects due to its relatively poor combustion efficiency and low fuel grain regression rate. Although hybrid rockets can claim advantages in safety, environmental impact, and performance against established solid and liquid propellant systems, the boundary layer combustion process and the diffusion-based mixing within a hybrid rocket grain port leave the core flow unmixed and limit the system performance. One principle used to enhance the mixing of gaseous flows is to induce streamwise vorticity. The counter-rotating vortex pair (CVP) mixer utilizes this principle and introduces two vortices into a confined flow, generating a stirring motion in order to transport near-wall media toward the core and vice versa. Recent studies investigated the velocity field introduced by this type of swirler. The current work evaluates the mixing performance of the CVP concept, using an experimental setup to simulate an axial primary pipe flow with a radially entering secondary flow. Hereby the primary flow is altered by the CVP swirler unit. The resulting setup therefore emulates a hybrid rocket motor with a cylindrical single-port grain. In order to evaluate the mixing performance, the secondary flow concentration at the pipe assembly exit is measured, utilizing a pressure-sensitive paint based procedure.

  17. State Aid and Student Performance: A Supply-Demand Analysis

    ERIC Educational Resources Information Center

    Kinnucan, Henry W.; Zheng, Yuqing; Brehmer, Gerald

    2006-01-01

    Using a supply-demand framework, a six-equation model is specified to generate hypotheses about the relationship between state aid and student performance. Theory predicts that an increase in state or federal aid provides an incentive to decrease local funding, but that the disincentive associated with increased state aid is moderated when federal…

  18. Meta-Analysis of Predictors of Dental School Performance

    ERIC Educational Resources Information Center

    DeCastro, Jeanette E.

    2012-01-01

    Accurate prediction of which candidates show the most promise of success in dental school is imperative for the candidates, the profession, and the public. Several studies suggested that predental GPAs and the Dental Admissions Test (DAT) produce a range of correlations with dental school performance measures. While there have been similarities,…

  1. Analysis of Factors that Predict Clinical Performance in Medical School

    ERIC Educational Resources Information Center

    White, Casey B.; Dey, Eric L.; Fantone, Joseph C.

    2009-01-01

    Academic achievement indices including GPAs and MCAT scores are used to predict the spectrum of medical student academic performance types. However, use of these measures ignores two changes influencing medical school admissions: student diversity and affirmative action, and an increased focus on communication skills. To determine if GPA and MCAT…

  2. Analysis of Job Performance Measurement Data. Report of a Workshop.

    ERIC Educational Resources Information Center

    Green, Bert F., Jr., Ed.; Wing, Hilda, Ed.

    This report describes a workshop at which Army researchers presented some results from the first phase of a two-phase Joint-Service Project. (The objective of this phase was to determine if technically adequate criterion measures can be developed that are representative of job performance.) Part I of the report presents the preliminary results of…

  3. How Motivation Affects Academic Performance: A Structural Equation Modelling Analysis

    ERIC Educational Resources Information Center

    Kusurkar, R. A.; Ten Cate, Th. J.; Vos, C. M. P.; Westers, P.; Croiset, G.

    2013-01-01

    Few studies in medical education have studied effect of quality of motivation on performance. Self-Determination Theory based on quality of motivation differentiates between Autonomous Motivation (AM) that originates within an individual and Controlled Motivation (CM) that originates from external sources. To determine whether Relative Autonomous…

  4. Detection performance analysis for time-of-flight PET

    NASA Astrophysics Data System (ADS)

    Cao, Nannan; Huesman, Ronald H.; Moses, William W.; Qi, Jinyi

    2010-11-01

    In this paper, we investigate the performance of time-of-flight (TOF) positron emission tomography (PET) in improving lesion detectability. We present a theoretical approach to compare lesion detectability of TOF versus non-TOF systems and perform computer simulations to validate the theoretical prediction. A single-ring TOF PET tomograph is simulated using SimSET software, and images are reconstructed in 2D from list-mode data using a maximum a posteriori method. We use a channelized Hotelling observer to assess the detection performance. Both the receiver operating characteristic (ROC) and localization ROC curves are compared for the TOF and non-TOF PET systems. We first studied the SNR gains for TOF PET with different scatter and random fractions, system timing resolutions, and object sizes. We found that the TOF information improves lesion detectability and that the improvement is greater with larger random fractions, better timing resolution, and bigger objects. Scattered events by themselves have little impact on the SNR gain after correction. Since the true system timing resolution may not be known precisely in practice, we investigated the effect of mismatched timing kernels and showed that using a mismatched kernel during reconstruction always degrades the detection performance, whether it is narrower or wider than the true value. Using the proposed theoretical framework, we also studied the effect of lumpy backgrounds on the detection performance. Our results indicated that with lumpy backgrounds, TOF PET still outperforms non-TOF PET, but the improvement is smaller than in the uniform background case. More specifically, with the same correlation length, the SNR gain decreases with a larger number of lumpy patches and greater lump amplitude. With the same variance, the SNR gain reaches its minimum when the width of the Gaussian lumps is close to the size of the tumor.
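
    For readers unfamiliar with the figure of merit, the sketch below computes a generic channelized Hotelling observer detection SNR from samples of reconstructed images with and without a lesion. It is a minimal illustration of the standard CHO recipe, not the paper's exact implementation; the channel templates (e.g. difference-of-Gaussians) are left to the caller.

```python
import numpy as np

def cho_snr(imgs_present, imgs_absent, channels):
    """Channelized Hotelling observer detection SNR.

    imgs_*:   (n_samples, n_pixels) reconstructed image samples
    channels: (n_pixels, n_channels) channel templates
    """
    v1 = imgs_present @ channels          # channel outputs, signal present
    v0 = imgs_absent @ channels           # channel outputs, signal absent
    dv = v1.mean(axis=0) - v0.mean(axis=0)
    k = 0.5 * (np.cov(v1, rowvar=False) + np.cov(v0, rowvar=False))
    w = np.linalg.solve(k, dv)            # Hotelling template in channel space
    return float(np.sqrt(dv @ w))         # SNR^2 = dv^T K^{-1} dv
```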

  5. Learning Problem Expanded Form--A Performance Analysis

    ERIC Educational Resources Information Center

    Zydatiss, Wolfgang

    1976-01-01

    An analysis of the written compositions of German students (aged 16+, in their fourth or sixth year of English as a foreign language) with regard to their use of the progressive form. Four problem areas are enumerated, and it is suggested that these be included in pedagogic grammars. (KM)

  6. Rapid Automatized Naming and Reading Performance: A Meta-Analysis

    ERIC Educational Resources Information Center

    Araújo, Susana; Reis, Alexandra; Petersson, Karl Magnus; Faísca, Luís

    2015-01-01

    Evidence that rapid naming skill is associated with reading ability has become increasingly prevalent in recent years. However, there is considerable variation in the literature concerning the magnitude of this relationship. The objective of the present study was to provide a comprehensive analysis of the evidence on the relationship between rapid…

  7. Applying Score Analysis to a Rehearsal Pedagogy of Expressive Performance

    ERIC Educational Resources Information Center

    Byo, James L.

    2014-01-01

    The discoveries of score analysis (e.g., minor seventh chord, ostinato, phrase elision, melodic fragment, half cadence) are more than just compositional techniques or music vocabulary. They are sounds--fascinating, storytelling, dynamic modes of expression--that when approached as such enrich the rehearsal experience. This article presents a…

  8. Transaction Performance vs. Moore's Law: A Trend Analysis

    NASA Astrophysics Data System (ADS)

    Nambiar, Raghunath; Poess, Meikel

    Intel co-founder Gordon E. Moore postulated in his famous 1965 paper that the number of components in integrated circuits had doubled every year from their invention in 1958 until 1965, and predicted that the trend would continue for at least ten years. Later, David House, an Intel colleague, after factoring in the increase in transistor performance, concluded that integrated circuits would double in performance every 18 months. Despite this trend in microprocessor improvements, your favorite text editor takes the same time to start and your PC takes pretty much the same time to reboot as it did 10 years ago. Can the same observation be made about the systems supporting the fundamental aspects of our information-based economy, namely transaction processing systems?
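
    The 18-month doubling claim is easy to turn into numbers; a minimal arithmetic check of the implied compound growth (our illustration, not data from the paper):

```python
# Doubling every 18 months implies a growth factor of 2**(t/18) after t months.
for years in (5, 10):
    months = 12 * years
    print(f"{years} years -> {2 ** (months / 18):.0f}x")
# 5 years -> ~10x, 10 years -> ~102x
```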

  9. Performance analysis of 802.15.4 wireless standard

    NASA Astrophysics Data System (ADS)

    Losavio, Emanuele; Orcioni, Simone; Conti, Massimo

    2011-05-01

    In recent years, short-range wireless connectivity has grown exponentially. Fast design and verification of wireless network performance is becoming a necessity for the electronics industry to meet increasingly demanding market requirements. A system-level model of the network is indispensable for fast and flexible design and verification. In this work a SystemC model of the IEEE 802.15.4 standard is presented. The model has been used to verify the performance of the 802.15.4 standard in terms of efficiency and channel throughput as a function of the number of nodes in the network, the size of the payload, and the frequency with which the nodes attempt to transmit.
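
    As a rough illustration of why channel throughput falls as the node count grows, the sketch below computes the one-slot contention success probability for n stations. It is a deliberately simplified, slotted-ALOHA-style stand-in, not the standard's actual slotted CSMA/CA with binary exponential backoff:

```python
# Probability that exactly one of n nodes transmits in a slot when each
# attempts independently with probability p (a crude contention proxy).
def success_probability(n: int, p: float) -> float:
    return n * p * (1.0 - p) ** (n - 1)

for n in (2, 8, 32):
    print(n, round(success_probability(n, 1.0 / n), 3))  # p = 1/n is optimal
```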

  10. Control Design and Performance Analysis for Autonomous Formation Flight Experiments

    NASA Astrophysics Data System (ADS)

    Rice, Caleb Michael

    Autonomous formation flight is a key approach for reducing greenhouse gas emissions and managing traffic in future high-density airspace. Unmanned Aerial Vehicles (UAVs) make it possible to demonstrate and validate autonomous formation flight concepts physically, inexpensively, and without flight risk to human pilots. This thesis discusses the design, implementation, and flight testing of three different formation flight control methods, Proportional Integral Derivative (PID); Fuzzy Logic (FL); and Nonlinear Dynamic Inversion (NLDI), and their respective performance behavior. Experimental results show achievable autonomous formation flight and performance quality with a pair of low-cost unmanned fixed-wing research aircraft and also with a solo vertical takeoff and landing (VTOL) quadrotor.
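
    A minimal sketch of the first of the three controllers, a discrete PID loop as it might regulate along-track separation in a formation; the gains, time step, and set-point below are illustrative assumptions, not values from the thesis:

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt."""

    def __init__(self, kp: float, ki: float, kd: float, dt: float):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, error: float) -> float:
        self.integral += error * self.dt
        derivative = (0.0 if self.prev_error is None
                      else (error - self.prev_error) / self.dt)
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Hypothetical usage: hold a 20 m along-track separation behind the leader.
pid = PID(kp=0.8, ki=0.05, kd=0.3, dt=0.1)
speed_command = pid.update(20.0 - 23.5)  # error = desired - measured separation
```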

  11. Failure Analysis and Regeneration Performances Evaluation on Engine Lubricating Oil

    NASA Astrophysics Data System (ADS)

    Wang, X. L.; Zhang, G. N.; Zhang, J. Y.; Yin, Y. L.; Xu, Y.

    To investigate the failure and recycling behavior of lubricating oils, three typical 10W-40 lubricating oils used in heavy-load vehicles were selected: new oil, waste oil, and oil regenerated with a self-developed green regeneration technology. Tribological properties were tested with a four-ball friction and wear tester. The results indicated that the extreme-pressure performance of the regenerated oil increased by 34.1% compared with the waste oil, and its load-carrying ability is close to that of the new oil; its wear-scar characteristics are better than those of the waste oil, and its frictional coefficient almost reaches the level of the new oil. Overall, the anti-wear and friction-reducing performance improves markedly after regeneration.

  12. NS&T Management Observations: Quarterly Performance Analysis

    SciTech Connect

    Gianotto, David

    2014-09-01

    The INL Management Observation Program (MOP) is designed to improve managers' and supervisors' understanding of the work being performed by employees and the barriers impacting their success. The MOP also increases workers' understanding of management's expectations as they relate to safety, security, quality, and work performance. Management observations (observations) are designed to improve the relationship and trust between employees and managers through increased engagement and interaction between managers and researchers in the field. As part of continuous improvement, NS&T management took the initiative to focus on the participation in and quality of observations in FY-14. This quarterly report is intended to (a) summarize the participation in and quality of management's observations, (b) assess observations for commonalities or trends related to facility or process barriers impacting research, and (c) provide feedback and make recommendations for improvements to NS&T's MOP.

  13. Resilient Plant Monitoring System: Design, Analysis, and Performance Evaluation

    SciTech Connect

    Humberto E. Garcia; Wen-Chiao Lin; Semyon M. Meerkov; Maruthi T. Ravichandran

    2013-12-01

    Resilient monitoring systems are sensor networks that degrade gracefully under malicious attacks on their sensors, which cause them to project misleading information. The goal of this paper is to design, analyze, and evaluate the performance of a resilient monitoring system intended to monitor plant conditions (normal or anomalous). The architecture developed consists of four layers: data quality assessment, process variable assessment, plant condition assessment, and sensor network adaptation. Each of these layers is analyzed by either analytical or numerical tools, and the performance of the overall system is evaluated using simulations. The resiliency of the resulting system is measured using the Kullback-Leibler divergence and is shown to be sufficiently high in all scenarios considered.
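
    A sketch of the resiliency metric named above: the Kullback-Leibler divergence between two discrete distributions, for instance the assessed versus actual plant-condition probabilities (the pairing is our illustration, not the paper's exact formulation):

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """D(p || q) = sum_i p_i * log(p_i / q_i) for discrete distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p, q = p / p.sum(), q / q.sum()       # normalize defensively
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# Example over (normal, anomalous): larger divergence = larger mismatch.
print(kl_divergence([0.9, 0.1], [0.6, 0.4]))
```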

  14. Performance Analysis of XCPC Powered Solar Cooling Demonstration Project

    NASA Astrophysics Data System (ADS)

    Widyolar, Bennett K.

    A solar thermal cooling system using novel non-tracking External Compound Parabolic Concentrators (XCPC) has been built at the University of California, Merced and operated for two cooling seasons. Its performance in providing power for space cooling has been analyzed. This solar cooling system comprises 53.3 m2 of XCPC trough collectors that power a 23 kW double-effect (LiBr) absorption chiller. This is the first system to combine XCPC and absorption chilling technologies. Performance of the system was measured in both sunny and cloudy conditions, with both clean and dirty collectors. It was found that these collectors are well suited to providing thermal power for absorption cooling systems, and that both the coincidence of available thermal power with cooling demand and the simplicity of the XCPC collectors compared to other solar thermal collectors make them a highly attractive candidate for cooling projects.

  15. Performance Analysis and Portability of the PLUM Load Balancing System

    NASA Technical Reports Server (NTRS)

    Oliker, Leonid; Biswas, Rupak; Gabow, Harold N.

    1998-01-01

    The ability to dynamically adapt an unstructured mesh is a powerful tool for solving computational problems with evolving physical features; however, an efficient parallel implementation is rather difficult. To address this problem, we have developed PLUM, an automatic portable framework for performing adaptive numerical computations in a message-passing environment. PLUM requires that all data be globally redistributed after each mesh adaption to achieve load balance. We present an algorithm for minimizing this remapping overhead by guaranteeing an optimal processor reassignment. We also show that the data redistribution cost can be significantly reduced by applying our heuristic processor reassignment algorithm to the default mapping of the parallel partitioner. Portability is examined by comparing performance on an SP2, an Origin2000, and a T3E. Results show that PLUM can be successfully ported to different platforms without any code modifications.
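
    The optimal-reassignment step can be phrased as a linear assignment problem: map new partitions to processors so that the amount of data that stays in place is maximized. A minimal sketch under that reading (the overlap matrix and its construction are our assumptions, not PLUM's actual data structures):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def reassign(overlap):
    """Assign new partitions to processors, maximizing retained data.

    overlap[i, j] = volume of partition i's data already resident on
    processor j; maximizing total overlap minimizes remapping traffic.
    """
    rows, cols = linear_sum_assignment(-overlap)  # negate to maximize
    return dict(zip(rows, cols))

mapping = reassign(np.array([[9, 1, 0], [2, 7, 1], [0, 3, 8]]))
print(mapping)  # {0: 0, 1: 1, 2: 2} -- the identity mapping is optimal here
```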

  16. Performance limit analysis of a metallic fuel for Kalimer

    SciTech Connect

    Lee, Byoung Oon; Cheon, J.S.; Lee, C.B.

    2007-07-01

    A metallic fuel is being considered as the fuel for the SFR in Korea. Metal fuel development for the SFR in Korea started in 2007 in the areas of metal fuel fabrication, cladding materials, and fuel performance evaluation. The MACSIS code has been developed as a steady-state performance computer code for metallic fuel. The present study presents preliminary parametric results for evaluating the design limits of the metal fuel for the SFR in Korea. The operating limits were analyzed with the MACSIS code. Modules for the creep rupture strength of Mod.HT9 and of the barrier cladding were added. The strain limits and the CDF limit were analyzed for HT9 and Mod.HT9. To apply the concept of a barrier cladding, the burnup limit of the barrier cladding was analyzed.

  17. Scattering analysis of high performance large sandwich radomes

    NASA Astrophysics Data System (ADS)

    Shavit, Reuven; Smolski, Adam P.; Michielssen, Eric; Mittra, Raj

    1992-02-01

    Large radomes are assembled from many panels connected together forming joints or seams. When the panels are type A sandwiches that are optimized for minimum transmission loss over moderately narrow bandwidths, the seams and joints introduce scattering effects that can degrade the overall electromagnetic performance. Tuning the dielectric seams with conductive wires and optimizing their geometry is, therefore, crucial to enhancing the electromagnetic performance of the radome. The authors address the problem of systematically tuning the dielectric seams and present both numerical and experimental results to illustrate the tuning procedure. Included are results showing the effect of the tuning of the radome on the radiation of an enclosed aperture of circular or elliptic shape.

  18. Parametric analysis of environmental performance of reused/recycled packaging.

    PubMed

    Tsiliyannis, C A

    2005-12-15

    Annual environmental performance of packaging products which are reused at least once per year is analyzed with respect to three specific criteria: (1) waste quantities, (2) virgin material demand and resource depletion, and (3) environmental impacts from manufacturing. Packaging flow performance is assessed via a combined reuse/recycle rate index expressed solely in terms of two dimensionless parameters: the conventional recycling rate and the mean number of reuse trips. Quantitative expressions describe the effect of the following physical quantities: annual reuse frequency, lifetime, maximum number of reuse trips, amount of packaging present in the market, annual production plus net trade imports, recycle rate of consumer discard, reuse rate, and consumer discard. The results may serve for packaging monitoring, for assessment of alternative packaging systems, and for setting more efficient environmental policy targets in terms of the reuse/recycle rate. PMID:16475365

  19. The NetLogger Methodology for High Performance Distributed Systems Performance Analysis

    SciTech Connect

    Tierney, Brian; Johnston, William; Crowley, Brian; Hoo, Gary; Brooks, Chris; Gunter, Dan

    1999-12-23

    The authors describe a methodology that enables the real-time diagnosis of performance problems in complex high-performance distributed systems. The methodology includes tools for generating precision event logs that can be used to provide detailed end-to-end application and system level monitoring; a Java agent-based system for managing the large amount of logging data; and tools for visualizing the log data and real-time state of the distributed system. The authors developed these tools for analyzing a high-performance distributed system centered around the transfer of large amounts of data at high speeds from a distributed storage server to a remote visualization client. However, this methodology should be generally applicable to any distributed system. This methodology, called NetLogger, has proven invaluable for diagnosing problems in networks and in distributed systems code. This approach is novel in that it combines network, host, and application-level monitoring, providing a complete view of the entire system.

  20. Performance analysis of an inexpensive Direct Imaging Transmission Ion Microscope

    NASA Astrophysics Data System (ADS)

    Barnes, Patrick; Pallone, Arthur

    2013-03-01

    A direct imaging transmission ion microscope (DITIM) is built from a modified webcam and a commercially available polonium-210 antistatic device mounted on an optics rail. The performance of the DITIM in radiographic mode is analyzed in terms of the line spread function (LSF) and modulation transfer function (MTF) for an opaque edge. Limitations of, potential uses for, and suggested improvements to the DITIM are also discussed.
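
    The edge-based LSF/MTF computation mentioned above is standard; a minimal numpy sketch (the edge profile array stands in for measured radiograph data):

```python
import numpy as np

def mtf_from_edge(esf, dx=1.0):
    """Line spread function and MTF from a sampled edge spread function.

    esf: 1-D intensity profile measured across the opaque edge
    dx:  sample spacing (e.g. mm per pixel)
    """
    lsf = np.gradient(esf, dx)        # LSF is the derivative of the ESF
    lsf = lsf / np.abs(lsf).sum()     # normalize
    mtf = np.abs(np.fft.rfft(lsf))    # MTF is |Fourier transform of LSF|
    mtf = mtf / mtf[0]                # unity at zero spatial frequency
    freqs = np.fft.rfftfreq(lsf.size, d=dx)
    return freqs, mtf
```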

  1. Performance of silicon immersed gratings: measurement, analysis, and modeling

    NASA Astrophysics Data System (ADS)

    Rodenhuis, Michiel; Tol, Paul J. J.; Coppens, Tonny H. M.; Laubert, Phillip P.; van Amerongen, Aaldert H.

    2015-09-01

    The use of immersed gratings offers advantages for both space- and ground-based spectrographs. As diffraction takes place inside the high-index medium, the optical path difference and angular dispersion are boosted proportionally, thereby allowing a smaller grating area and a smaller spectrometer size. Short-wave infrared (SWIR) spectroscopy is used in space-based monitoring of greenhouse and pollution gases in the Earth's atmosphere. On the extremely large telescopes currently under development, mid-infrared high-resolution spectrographs will, among other things, be used to characterize exoplanet atmospheres. At infrared wavelengths, silicon is transparent, which means that production methods used in the semiconductor industry can be applied to the fabrication of immersed gratings. Using such methods, we have designed and built immersed gratings for both space- and ground-based instruments, examples being the TROPOMI instrument for the European Space Agency (ESA) Sentinel-5 Precursor mission, Sentinel-5 (ESA), and the METIS (Mid-infrared E-ELT Imager and Spectrograph) instrument for the European Extremely Large Telescope. Three key parameters govern the performance of such gratings: the efficiency, the level of scattered light, and the wavefront error induced. In this paper we describe how we optimize these parameters during the design and manufacturing phase, focusing on the tools and methods used to measure the performance actually realized, and we present the results. The breadboard model (BBM) immersed grating developed for the SWIR-1 channel of Sentinel-5 is used to illustrate this process. Stringent requirements were specified for this grating on all three performance criteria. We show that, with some margin, the performance requirements have all been met.

  2. Performance Analysis of Distributed Object-Oriented Applications

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1998-01-01

    The purpose of this research was to evaluate the efficiency of a distributed simulation architecture which creates individual modules which are made self-scheduling through the use of a message-based communication system used for requesting input data from another module which is the source of that data. To make the architecture as general as possible, the message-based communication architecture was implemented using standard remote object architectures (Common Object Request Broker Architecture (CORBA) and/or Distributed Component Object Model (DCOM)). A series of experiments were run in which different systems are distributed in a variety of ways across multiple computers and the performance evaluated. The experiments were duplicated in each case so that the overhead due to message communication and data transmission can be separated from the time required to actually perform the computational update of a module each iteration. The software used to distribute the modules across multiple computers was developed in the first year of the current grant and was modified considerably to add a message-based communication scheme supported by the DCOM distributed object architecture. The resulting performance was analyzed using a model created during the first year of this grant which predicts the overhead due to CORBA and DCOM remote procedure calls and includes the effects of data passed to and from the remote objects. A report covering the distributed simulation software and the results of the performance experiments has been submitted separately. The above report also discusses possible future work to apply the methodology to dynamically distribute the simulation modules so as to minimize overall computation time.

  3. [High-performance liquid-liquid chromatography in beverage analysis].

    PubMed

    Bricout, J; Koziet, Y; de Carpentrie, B

    1978-01-01

    Liquid-liquid chromatography was performed with columns packed with stationary phases chemically bonded to silica microparticles. These columns show high efficiency and are very easy to use. Flavoring compounds of low volatility, such as aromatic aldehydes, were analyzed in brandy using a polar alkylnitrile phase. Sapid substances like amarogentin in Gentiana lutea or glycyrrhizin in Glycyrrhiza glabra were determined by reversed-phase chromatography. Finally, ionizable substances like synthetic dyes can be analyzed by paired-ion chromatography with a nonpolar stationary phase.

  4. Moisture and Structural Analysis for High Performance Hybrid Wall Assemblies

    SciTech Connect

    Grin, A.; Lstiburek, J.

    2012-09-01

    Based on past experience in the Building America program, BSC has found that combinations of materials and approaches—in other words, systems—usually provide optimum performance. Integration is necessary, as described in this research project. The hybrid walls analyzed utilize a combination of exterior insulation, diagonal metal strapping, and spray polyurethane foam and leave room for cavity-fill insulation. These systems can provide effective thermal, air, moisture, and water barrier systems in one assembly and provide structure.

  5. Numerical analysis of maximal bat performance in baseball.

    PubMed

    Nicholls, Rochelle L; Miller, Karol; Elliott, Bruce C

    2006-01-01

    Metal baseball bats have been experimentally demonstrated to produce higher ball exit velocity (BEV) than wooden bats. In the United States, all bats are subject to BEV tests using hitting machines that rotate the bat in a horizontal plane. In this paper, a model of bat-ball impact was developed based on 3-D translational and rotational kinematics of a swing performed by high-level players. The model was designed to simulate the maximal performance of specific models of a wooden bat and a metal bat when swung by a player, and included material properties and kinematics specific to each bat. Impact dynamics were quantified using the finite element method (ANSYS/LSDYNA, version 6.1). Maximum BEV from both a metal (61.5 m/s) and a wooden (50.9 m/s) bat exceeded the 43.1 m/s threshold by which bats are certified as appropriate for commercial sale. The lower BEV from the wooden bat was attributed to a lower pre-impact bat linear velocity, and a more oblique impact that resulted in a greater proportion of BEV being lost to lateral and vertical motion. The results demonstrate the importance of factoring bat linear velocity and spatial orientation into tests of maximal bat performance, and have implications for the design of metal baseball bats. PMID:15878593

  6. Performance Analysis of the NAS Y-MP Workload

    NASA Technical Reports Server (NTRS)

    Bergeron, Robert J.; Kutler, Paul (Technical Monitor)

    1997-01-01

    This paper describes the performance characteristics of the computational workloads on the NAS Cray Y-MP machines, a Y-MP 832 and later a Y-MP 8128. Hardware measurements indicated that the Y-MP workload performance matured over time, ultimately sustaining an average throughput of 0.8 GFLOPS and a vector operation fraction of 87%. The measurements also revealed an operation rate exceeding 1 per clock period, a well-balanced architecture featuring a strong utilization of vector functional units, and an efficient memory organization. Introduction of the larger memory 8128 increased throughput by allowing a more efficient utilization of CPUs. Throughput also depended on the metering of the batch queues; low-idle Saturday workloads required a buffer of small jobs to prevent memory starvation of the CPU. UNICOS required about 7% of total CPU time to service the 832 workloads; this overhead decreased to 5% for the 8128 workloads. While most of the system time went to service I/O requests, efficient scheduling prevented excessive idle due to I/O wait. System measurements disclosed no obvious bottlenecks in the response of the machine and UNICOS to the workloads. In most cases, Cray-provided software tools were quite sufficient for measuring the performance of both the machine and operating system.

  7. Statistical analysis of AFE GN&C aeropass performance

    NASA Technical Reports Server (NTRS)

    Chang, Ho-Pen; French, Raymond A.

    1990-01-01

    Performance of the guidance, navigation, and control (GN&C) system used on the Aeroassist Flight Experiment (AFE) spacecraft has been studied with Monte Carlo techniques. The performance of the AFE GN&C is investigated with a 6-DOF numerical dynamic model which includes a Global Reference Atmospheric Model (GRAM) and a gravitational model with oblateness corrections. The study considers all the uncertainties due to the environment and the system itself. In the AFE's aeropass phase, perturbations on the system performance are caused by an error space which has over 20 dimensions of the correlated/uncorrelated error sources. The goal of this study is to determine, in a statistical sense, how much flight path angle error can be tolerated at entry interface (EI) and still have acceptable delta-V capability at exit to position the AFE spacecraft for recovery. Assuming there is fuel available to produce 380 ft/sec of delta-V at atmospheric exit, a 3-sigma standard deviation in flight path angle error of 0.04 degrees at EI would result in a 98-percent probability of mission success.

  8. High performance computing environment for multidimensional image analysis

    PubMed Central

    Rao, A Ravishankar; Cecchi, Guillermo A; Magnasco, Marcelo

    2007-01-01

    Background The processing of images acquired through microscopy is a challenging task due to the large size of datasets (several gigabytes) and the fast turnaround time required. If the throughput of the image processing stage is significantly increased, it can have a major impact in microscopy applications. Results We present a high performance computing (HPC) solution to this problem. This involves decomposing the spatial 3D image into segments that are assigned to unique processors, and matched to the 3D torus architecture of the IBM Blue Gene/L machine. Communication between segments is restricted to the nearest neighbors. When running on a 2 GHz Intel CPU, the task of 3D median filtering on a typical 256 megabyte dataset takes two and a half hours, whereas by using 1024 nodes of Blue Gene, this task can be performed in 18.8 seconds, a 478× speedup. Conclusion Our parallel solution dramatically improves the performance of image processing, feature extraction and 3D reconstruction tasks. This increased throughput permits biologists to conduct unprecedented large scale experiments with massive datasets. PMID:17634099
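
    The decomposition idea is easy to mimic serially: split the volume into slabs, give each slab a halo of boundary voxels so neighboring data is available, and filter each slab independently. A minimal sketch under those assumptions (the parallel version assigns each slab to a node and exchanges halos with nearest neighbors):

```python
import numpy as np
from scipy.ndimage import median_filter

def filter_in_slabs(volume, n_slabs, size=3):
    """3-D median filter computed slab by slab with halo overlap."""
    halo = size // 2
    bounds = np.linspace(0, volume.shape[0], n_slabs + 1, dtype=int)
    out = np.empty_like(volume)
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        a, b = max(lo - halo, 0), min(hi + halo, volume.shape[0])
        filtered = median_filter(volume[a:b], size=size)
        out[lo:hi] = filtered[lo - a : filtered.shape[0] - (b - hi)]
    return out

vol = np.random.default_rng(0).random((32, 32, 32))
assert np.allclose(filter_in_slabs(vol, 4), median_filter(vol, size=3))
```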

  9. Performance Testing using Silicon Devices - Analysis of Accuracy: Preprint

    SciTech Connect

    Sengupta, M.; Gotseff, P.; Myers, D.; Stoffel, T.

    2012-06-01

    Accurately determining PV module performance in the field requires accurate measurements of the solar irradiance reaching the PV panel (i.e., plane-of-array, or POA, irradiance) with known measurement uncertainty. Pyranometers are commonly based on thermopile or silicon photodiode detectors. Silicon detectors, including PV reference cells, are an attractive choice for reasons that include faster time response (10 μs) than thermopile detectors (1 s to 5 s), lower cost, and easier maintenance. The main drawback of silicon detectors is their limited spectral response. Therefore, to determine broadband POA solar irradiance, a pyranometer calibration factor that converts the narrowband response to broadband is required. Normally this calibration factor is a single number determined under clear-sky conditions with respect to a broadband reference radiometer. The pyranometer is then used across scenarios of varying airmass, panel orientation, and atmospheric conditions. This would not be an issue if all irradiance wavelengths that form the broadband spectrum responded uniformly to atmospheric constituents. Unfortunately, the scattering and absorption signature varies widely with wavelength, and the calibration factor for the silicon photodiode pyranometer is not appropriate for other conditions. This paper reviews the issues that arise from the use of silicon detectors for PV performance measurement in the field, based on measurements from a group of pyranometers mounted on a 1-axis solar tracker. We also present a comparison of simultaneous spectral and broadband measurements from silicon and thermopile detectors and estimated measurement errors when using silicon devices for both array performance and resource assessment.
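
    The clear-sky calibration described above reduces to a ratio of a broadband reference reading to a response-weighted spectral integral. A minimal sketch under that simplification (array names are placeholders for measured spectra, not an actual instrument API):

```python
import numpy as np

def detector_signal(wavelengths_nm, spectral_irradiance, spectral_response):
    """Narrowband detector output: response-weighted spectral integral."""
    return np.trapz(spectral_irradiance * spectral_response, wavelengths_nm)

def clear_sky_cal_factor(broadband_reference, wavelengths_nm,
                         spectral_irradiance, spectral_response):
    """Single-number factor mapping detector signal to broadband irradiance.

    Derived under clear sky against a reference radiometer; applying it to
    other spectral conditions is exactly the error source discussed above.
    """
    signal = detector_signal(wavelengths_nm, spectral_irradiance,
                             spectral_response)
    return broadband_reference / signal
```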

  11. Macroergonomic analysis and design for improved safety and quality performance.

    PubMed

    Kleiner, B M

    1999-01-01

    Macroergonomics, which emerged historically after sociotechnical systems theory, quality management, and ergonomics, is presented as the basis for a needed integrative methodology. A macroergonomics methodology was presented in some detail to demonstrate how aspects of microergonomics, total quality management (TQM), and sociotechnical systems (STS) can be triangulated in a common approach. In the context of this methodology, quality and safety were presented as 2 of several important performance criteria. To demonstrate aspects of the methodology, 2 case studies were summarized with safety and quality performance results where available. The first case manipulated both personnel and technical factors to achieve a "safety culture" at a nuclear site. The concept of safety culture is defined in INSAG-4 (International Atomic Energy Agency, 1991) as "that assembly of characteristics and attitudes in organizations and individuals which establishes that, as an overriding priority, nuclear plant safety issues receive the attention warranted by their significance." The second case described a tire manufacturing intervention to improve quality (as defined by Sink and Tuttle, 1989) through joint consideration of technical and social factors. It was suggested that macroergonomics can yield greater performance than can be achieved through ergonomic intervention alone. Whereas case studies help to make the case, more rigorous formative and summative research is needed to refine and validate the proposed methodology, respectively.

  12. A Cross-Platform Infrastructure for Scalable Runtime Application Performance Analysis

    SciTech Connect

    Jack Dongarra; Shirley Moore; Bart Miller, Jeffrey Hollingsworth; Tracy Rafferty

    2005-03-15

    The purpose of this project was to build an extensible cross-platform infrastructure to facilitate the development of accurate and portable performance analysis tools for current and future high performance computing (HPC) architectures. Major accomplishments include tools and techniques for multidimensional performance analysis, as well as improved support for dynamic performance monitoring of multithreaded and multiprocess applications. Previous performance tool development has been limited by the burden of having to re-write a platform-dependent low-level substrate for each architecture/operating system pair in order to obtain the necessary performance data from the system. Manual interpretation of performance data is not scalable for large-scale long-running applications. The infrastructure developed by this project provides a foundation for building portable and scalable performance analysis tools, with the end goal being to provide application developers with the information they need to analyze, understand, and tune the performance of terascale applications on HPC architectures. The backend portion of the infrastructure provides runtime instrumentation capability and access to hardware performance counters, with thread-safety for shared memory environments and a communication substrate to support instrumentation of multiprocess and distributed programs. Front-end interfaces provide tool developers with a well-defined, platform-independent set of calls for requesting performance data. End-user tools have been developed that demonstrate runtime data collection, on-line and off-line analysis of performance data, and multidimensional performance analysis. The infrastructure is based on two underlying performance instrumentation technologies. These technologies are the PAPI cross-platform library interface to hardware performance counters and the cross-platform Dyninst library interface for runtime modification of executable images. The Paradyn and KOJAK

  13. Uniprocessor Performance Analysis of a Representative Workload of Sandia National Laboratories' Scientific Applications.

    SciTech Connect

    Charles Laverty

    2005-10-01

    Throughout the last decade, computer performance analysis has become essential to maximizing the performance of some workloads. Sandia National Laboratories (SNL), located in Albuquerque, New Mexico, is no different: achieving maximum performance of large scientific parallel workloads requires performance analysis at the uniprocessor level. A representative workload has been chosen as the basis of a computer performance study to determine optimal processor characteristics in order to better specify the next generation of supercomputers. Cube3, a finite element test problem developed at SNL, is representative of SNL's scientific workloads. This workload has been studied at the uniprocessor level to understand characteristics of the microarchitecture that will lead to overall performance improvement at the multiprocessor level. The goal of studying this workload at the uniprocessor level is to build a performance prediction model that will be integrated into a multiprocessor performance model currently being developed at SNL. Through the use of performance counters on the Itanium 2 microarchitecture, performance statistics are studied to determine bottlenecks in the microarchitecture and/or changes in the application code that will maximize performance. From source code analysis, a performance-degrading loop kernel was identified, and through the use of compiler optimizations a performance gain of around 20% was achieved.

  14. Tribological performance analysis of textured steel surfaces under lubricating conditions

    NASA Astrophysics Data System (ADS)

    Singh, R. C.; Pandey, R. K.; Rooplal; Ranganath, M. S.; Maji, S.

    2016-09-01

    The tribological analysis of lubricated conformal contacts formed between smooth/textured steel disc surfaces and smooth steel pin surfaces under sliding conditions has been considered. The role of dimple pitch on the textured surfaces has been investigated experimentally to understand the variation of the coefficient of friction and wear at the tribo-contacts under fully flooded lubricated conditions. Substantial reductions in the coefficient of friction and wear at the tribo-interfaces have been observed in the presence of textures on the rotating discs, for both fully flooded and starved conditions, in comparison to the corresponding lubricating conditions of the interfaces formed between the smooth surfaces of disc and pin. In the presence of surface texture, the coefficient of friction reduces considerably at elevated sliding speeds (>2 m/s) and unit loads (>0.5 MPa) for the set of operating parameters considered in the analysis.

  15. Performance analysis of Integrated Communication and Control System networks

    NASA Technical Reports Server (NTRS)

    Halevi, Y.; Ray, A.

    1990-01-01

    This paper presents statistical analysis of delays in Integrated Communication and Control System (ICCS) networks that are based on asynchronous time-division multiplexing. The models are obtained in closed form for analyzing control systems with randomly varying delays. The results of this research are applicable to ICCS design for complex dynamical processes like advanced aircraft and spacecraft, autonomous manufacturing plants, and chemical and processing plants.

  16. Comparative analysis of the speed performance of texture analysis algorithms on a graphic processing unit (GPU)

    NASA Astrophysics Data System (ADS)

    Triana-Martinez, J.; Orjuela-Vargas, S. A.; Philips, W.

    2013-03-01

    This paper compares the speed of a set of classic image texture algorithms implemented with CUDA programming. We include a summary of the general CUDA programming model. We select a set of texture algorithms, based on statistical analysis, that allow the use of repetitive functions, such as the co-occurrence matrix, Haralick features, and local binary pattern techniques. The memory allocation time between host and device memory is not taken into account. The results show a comparison of the texture algorithms in terms of speed when executed on CPU and GPU processors. The comparison shows that the algorithms can be accelerated more than 40 times when implemented in a CUDA environment.
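
    One of the named texture measures, the gray-level co-occurrence matrix with a Haralick contrast feature, in a plain numpy (CPU) sketch; the CUDA implementations compared in the paper parallelize this same per-pixel-pair counting:

```python
import numpy as np

def glcm(quantized, dx=1, dy=0, levels=8):
    """Normalized co-occurrence matrix for displacement (dx, dy).

    quantized: 2-D array of integer gray levels in [0, levels).
    """
    h, w = quantized.shape
    a = quantized[0:h - dy, 0:w - dx].ravel()
    b = quantized[dy:h, dx:w].ravel()
    m = np.zeros((levels, levels), dtype=np.float64)
    np.add.at(m, (a, b), 1.0)             # count co-occurring level pairs
    return m / m.sum()

img = np.random.default_rng(1).integers(0, 8, size=(64, 64))
p = glcm(img)
i, j = np.indices(p.shape)
contrast = float(np.sum((i - j) ** 2 * p))  # Haralick contrast feature
```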

  17. Technical and tactical analysis of youth taekwondo performance.

    PubMed

    Casolino, Erika; Lupo, Corrado; Cortis, Cristina; Chiodo, Salvatore; Minganti, Carlo; Capranica, Laura; Tessitore, Antonio

    2012-06-01

    This study aimed to analyze the technical and tactical aspects of young athletes during official taekwondo competitions. Fifty-nine youth taekwondo athletes (43 boys and 16 girls; age range: 10-12 years; weight category range: <24 to >59 kg) with at least 2 years of taekwondo training consisting of three 90-minute training sessions for 3 d·wk⁻¹ participated in this study. Thirty-seven matches (three 1-minute rounds, with 1-minute rest in between) were analyzed to verify the differences (p ≤ 0.05) in offensive and defensive actions in relation to gender (male, female), match outcome (winners, nonwinners), kicking leg (front, rear), and round (first, second, third). No difference emerged for gender and match outcome. With respect to defensive actions (8.4 ± 12.0%), youth athletes engaged more frequently (p < 0.0001) in offensive actions (91.6 ± 12.0%), which showed a significant decrease (p < 0.016) from the first round (42.3 ± 21.8%) to the second (33.1 ± 14.8%) and third (24.5 ± 16.0%) ones. Kicks performed with the rear leg (94.4 ± 7.8%) occurred more frequently (p < 0.0001) than those performed with the front leg (5.6 ± 7.8%). Considering that a high level of coordination is required to perform front-leg kicks and that defensive actions necessitate a high level of tactical skill, these findings might indicate not-yet-complete attainment of fundamental coordinative capabilities in 10- to 12-year-old athletes, independent of match outcome. To enhance coordination in youth athletes, coaches are recommended to structure their training to include skill-ability and sport-ability drills. PMID:22614139

  19. Theoretical performance analysis of multislice channelized Hotelling observers

    NASA Astrophysics Data System (ADS)

    Goossens, Bart; Platiša, Ljiljana; Philips, Wilfried

    2012-02-01

    Quality assessment of 3D medical images is becoming increasingly important because clinical practice is rapidly moving toward volumetric imaging. In a recent publication, three multi-slice channelized Hotelling observer (msCHO) models were presented for the task of detecting 3D signals in multi-slice images, where each multi-slice image is inspected in a so-called stack-browsing mode. The observer models are based on the assumption that humans observe multi-slice images in a simple two-stage process, and each of the models implements this principle in a different way. In this paper, we investigate the theoretical performance, in terms of detection signal-to-noise ratio (SNR), of msCHO models for the task of detecting a separable signal in a Gaussian background with a separable covariance matrix. We find that, despite the differences in architecture of the three models, they all have the same asymptotic performance in this task (i.e., when the number of training images tends to infinity). On the other hand, when backgrounds with nonseparable covariance matrices are considered, the third model, msCHOc, is expected to perform slightly better than the other msCHO models (msCHOa and msCHOb), but only when sufficient training images are provided. These findings suggest that the choice between the msCHO models mainly depends on the experiment setup (e.g., the number of available training samples), while the relation to human observers depends on the particular choice of the "temporal" channels that the msCHO models use.
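
    For reference, the detection SNR in this family of model-observer analyses is the Hotelling figure of merit evaluated on the channel outputs; in the generic form (not the paper's msCHO-specific channel sets),

    $$ \mathrm{SNR}^2 = \Delta\bar{\mathbf{v}}^{\mathsf{T}} \, \mathbf{K}_{\mathbf{v}}^{-1} \, \Delta\bar{\mathbf{v}}, \qquad \Delta\bar{\mathbf{v}} = \mathbb{E}[\mathbf{v} \mid \text{signal}] - \mathbb{E}[\mathbf{v} \mid \text{no signal}], $$

    where $\mathbf{v}$ is the vector of channel outputs and $\mathbf{K}_{\mathbf{v}}$ its (average) covariance matrix.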

  20. Apparatus and method for performing microfluidic manipulations for chemical analysis

    DOEpatents

    Ramsey, J. Michael

    2002-01-01

    A microchip apparatus and method provide fluidic manipulations for a variety of applications, including sample injection for microchip liquid chromatography. The microchip is fabricated using standard photolithographic procedures and chemical wet etching, with the substrate and cover plate joined using direct bonding. Capillary electrophoresis is performed in channels formed in the substrate. Injections are made by electro-osmotically pumping sample through the injection channel that crosses the separation channel, followed by a switching of the potentials to force a plug into the separation channel.

  1. Apparatus and method for performing microfluidic manipulations for chemical analysis

    DOEpatents

    Ramsey, J. Michael

    1999-01-01

    A microchip apparatus and method provide fluidic manipulations for a variety of applications, including sample injection for microchip liquid chromatography. The microchip is fabricated using standard photolithographic procedures and chemical wet etching, with the substrate and cover plate joined using direct bonding. Capillary electrophoresis is performed in channels formed in the substrate. Injections are made by electro-osmotically pumping sample through the injection channel that crosses the separation channel, followed by a switching of the potentials to force a plug into the separation channel.

  2. Apparatus and method for performing microfluidic manipulations for chemical analysis

    SciTech Connect

    Ramsey, J.M.

    1999-12-14

    A microchip apparatus and method provide fluidic manipulations for a variety of applications, including sample injection for microchip liquid chromatography. The microchip is fabricated using standard photolithographic procedures and chemical wet etching, with the substrate and cover plate joined using direct bonding. Capillary electrophoresis is performed in channels formed in the substrate. Injections are made by electro-osmotically pumping sample through the injection channel that crosses the separation channel, followed by a switching of the potentials to force a plug into the separation channel.

  3. Performance and flow analysis of vortex wind power turbines

    SciTech Connect

    Rangwalla, A.A.; Hsu, C.T.

    1982-10-01

    The theoretical study presented investigates some possible vortex flow solutions in the tornado-type wind energy system and evaluates the power coefficient that can be obtained theoretically. The actuator disc concept is applied to the vortex wind turbine configuration. The Burgers vortex model is then introduced and the performance of a turbine using it is derived. A generalized analytical solution of the model is given, followed by a numerical solution of the complete equations. The stability of a Burgers vortex is discussed. (LEW)
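
    For context, the Burgers vortex invoked above is the classical exact Navier-Stokes solution that balances an axisymmetric straining flow against a diffusing line vortex; in cylindrical coordinates $(r, \theta, z)$ its velocity field is

    $$ u_r = -\tfrac{1}{2}\alpha r, \qquad u_z = \alpha z, \qquad u_\theta(r) = \frac{\Gamma}{2\pi r}\left[1 - e^{-\alpha r^2/(4\nu)}\right], $$

    where $\alpha$ is the strain rate, $\Gamma$ the circulation, and $\nu$ the kinematic viscosity.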

  4. Dynamic Curvature Steering Control for Autonomous Vehicle: Performance Analysis

    NASA Astrophysics Data System (ADS)

    Aizzat Zakaria, Muhammad; Zamzuri, Hairi; Amri Mazlan, Saiful

    2016-02-01

    This paper discusses the design of a dynamic curvature steering control for autonomous vehicles. Both lateral control and longitudinal control are discussed. The controller is designed around a dynamic curvature calculation that estimates the path condition and modifies the vehicle speed and steering wheel angle accordingly. Simulation results are presented to show the capability of the controller to track the reference path: the controller predicts the path and modifies the vehicle speed to suit the path condition. The effectiveness of the controller is demonstrated by achieving performance identical to the benchmark, but with additional curvature adaptation capabilities.
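
    The curvature estimate at the heart of such a scheme follows directly from the planar formula $\kappa = (x'y'' - y'x'')/(x'^2 + y'^2)^{3/2}$; the lateral-acceleration speed cap below is our illustrative assumption of how the speed might be modified, not the paper's control law:

```python
import numpy as np

def path_curvature(x, y):
    """Signed curvature of a planar path sampled at equal parameter steps."""
    dx, dy = np.gradient(x), np.gradient(y)
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    return (dx * ddy - dy * ddx) / (dx**2 + dy**2) ** 1.5

def speed_for_curvature(kappa, v_max=20.0, a_lat_max=3.0):
    """Cap speed so lateral acceleration v^2*|kappa| stays below a_lat_max."""
    return np.minimum(v_max,
                      np.sqrt(a_lat_max / np.maximum(np.abs(kappa), 1e-9)))
```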

  5. Molten carbonate fuel cell networks: Principles, analysis, and performance

    NASA Astrophysics Data System (ADS)

    Wimer, J. G.; Williams, M. C.

    1993-01-01

    The chemical reactions in an internally reforming molten carbonate fuel cell (IRMCFC) are described and combined into the overall IRMCFC reaction. Thermodynamic and electrochemical principles are discussed, and structure and operation of fuel cell stacks are explained. In networking, multiple fuel cell stacks are arranged so that reactant streams are fed and recycled through stacks in series for higher reactant utilization and increased system efficiency. Advantages and performance of networked and conventional systems are compared, using ASPEN simulations. The concept of networking can be applied to any electrochemical membrane, such as that developed for hot gas cleanup in future power plants.

  6. Vane structure design trade-off and performance analysis

    NASA Astrophysics Data System (ADS)

    Breault, Robert P.

    1989-04-01

    The APART/PADE and ASAP stray-light software packages (Breault, 1988) are applied to the design of vane structures to block direct propagation paths from the surfaces of optical baffles to other system components. Results for several typical systems are presented in extensive tables and graphs and analyzed. It is shown that vane angle and depth are significant parameters only for the first-order propagation path. Also evaluated are the amounts of particulate debris produced by degraded vane coatings and the effects of the resulting surface contamination on system performance.

  7. Design and performance analysis of gas sorption compressors

    NASA Technical Reports Server (NTRS)

    Chan, C. K.

    1984-01-01

    Compressor kinetics based on gas adsorption and desorption by charcoal and on gas absorption and desorption by LaNi5 were analyzed using a two-phase model and a three-component model, respectively. The modeling assumed thermal and mechanical equilibria between phases or among the components. The analyses predicted performance well for compressors with heaters located outside the adsorbent or absorbent bed. For the rapidly cycled compressor, where the heater was centrally located, only the transient pressure compared well with the experimental data.

  8. Performance analysis of 11 Denver Metro passive homes

    NASA Astrophysics Data System (ADS)

    Claridge, D. E.

    1981-07-01

    The auxiliary heating requirements for 11 passive solar homes were calculated using SLR or SUNCAT-2.4 with a standard set of basic assumptions. It is shown that seven of the homes should use less than half as much heating fuel as typical houses, two should use about half, and two should use about two-thirds or more. These results are compared with performance estimates and show large discrepancies; the differences are attributed largely to specific differences in assumptions in every case but one.

  9. Advanced flight design systems subsystem performance models. Sample model: Environmental analysis routine library

    NASA Technical Reports Server (NTRS)

    Parker, K. C.; Torian, J. G.

    1980-01-01

    A sample environmental control and life support model performance analysis using the environmental analysis routines library is presented. An example of a complete model set up and execution is provided. The particular model was synthesized to utilize all of the component performance routines and most of the program options.

  10. Performance analysis of LDPC codes on OOK terahertz wireless channels

    NASA Astrophysics Data System (ADS)

    Chun, Liu; Chang, Wang; Jun-Cheng, Cao

    2016-02-01

    Atmospheric absorption, scattering, and scintillation are the major causes of degraded transmission quality in terahertz (THz) wireless communications. An error control coding scheme based on low-density parity-check (LDPC) codes with a soft-decision decoding algorithm is proposed to improve the bit-error-rate (BER) performance of an on-off keying (OOK) modulated THz signal propagating through the atmospheric channel. The THz wave propagation characteristics and the channel model in the atmosphere are set up. Numerical simulations validate the strong performance of LDPC codes against atmospheric fading and demonstrate their potential for future ultra-high-speed (beyond Gbps) THz communications.
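
    As an illustrative baseline for such a BER analysis, the Monte Carlo sketch below estimates the error rate of uncoded OOK through log-normal scintillation plus Gaussian noise. The LDPC encoding/decoding itself and the paper's full absorption model are omitted, and all parameters are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def ook_ber_uncoded(snr_db, sigma_x=0.2, n=200_000):
    """BER of uncoded OOK over a log-normal scintillation + AWGN channel."""
    bits = rng.integers(0, 2, n)
    amp = 10.0 ** (snr_db / 20.0)     # 'on' amplitude relative to unit noise
    # Unit-mean log-normal amplitude fading as a simple scintillation model.
    fading = np.exp(rng.normal(-sigma_x**2 / 2.0, sigma_x, n))
    rx = bits * amp * fading + rng.normal(0.0, 1.0, n)
    decided = (rx > amp / 2.0).astype(int)   # fixed mid-point threshold
    return float(np.mean(decided != bits))

for snr in (10, 14, 18):
    print(snr, "dB ->", ook_ber_uncoded(snr))
```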

  11. Structural Design and Sealing Performance Analysis of Biomimetic Sealing Ring.

    PubMed

    Han, Chuanjun; Zhang, Han; Zhang, Jie

    2015-01-01

    In order to reduce the failure probability of rubber sealing rings in reciprocating dynamic seals, a new sealing ring structure based on bionics was designed. The biomimetic ring has three concave ridges and convex bulges on each side, very similar to an earthworm. The bulges were given a circular design, and the sealing performance of the biomimetic ring in both static and dynamic sealing was simulated by FEM. In addition, the effects of precompression, medium pressure, speed, friction coefficient, and material parameters on sealing performance were discussed. The results show that the von Mises stress of the biomimetic sealing ring is distributed symmetrically in no-pressure static sealing. The maximum von Mises stress appears on the second bulge of the inner side, and high contact stress concentrates on the left bulges. The von Mises stress distribution becomes uneven under medium pressure. Both von Mises stress and contact stress increase when precompression, medium pressure, and rubber hardness increase in static sealing. The biomimetic ring can avoid rolling and distortion in reciprocating dynamic sealing, and its working life is much longer than those of the O-ring and rectangular ring. The maximum von Mises stress and contact stress increase with precompression, medium pressure, rubber hardness, and friction coefficient in reciprocating dynamic sealing. PMID:27019582

  12. Performance analysis of a large-grain dataflow scheduling paradigm

    NASA Technical Reports Server (NTRS)

    Young, Steven D.; Wills, Robert W.

    1993-01-01

    A paradigm for scheduling computations on a network of multiprocessors using large-grain data flow scheduling at run time is described and analyzed. The computations to be scheduled must follow a static flow graph, while the schedule itself will be dynamic (i.e., determined at run time). Many applications characterized by static flow exist, and they include real-time control and digital signal processing. With the advent of computer-aided software engineering (CASE) tools for capturing software designs in dataflow-like structures, macro-dataflow scheduling becomes increasingly attractive, if not necessary. For parallel implementations, using the macro-dataflow method allows the scheduling to be insulated from the application designer and enables the maximum utilization of available resources. Further, by allowing multitasking, processor utilizations can approach 100 percent while they maintain maximum speedup. Extensive simulation studies are performed on 4-, 8-, and 16-processor architectures that reflect the effects of communication delays, scheduling delays, algorithm class, and multitasking on performance and speedup gains.
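
    A toy run-time simulation of the paradigm (hypothetical task names and durations, not the paper's simulator) dispatches a task as soon as all of its inputs are available and a processor is free, exactly the macro-dataflow behavior described above:

        import heapq

        def macro_dataflow_schedule(succs, dur, n_procs):
            # succs: static flow graph, task -> list of successor tasks.
            # A task is dispatched when all predecessors have finished (data ready)
            # and the earliest-free processor picks the earliest-ready task.
            preds = {t: 0 for t in succs}
            for t in succs:
                for s in succs[t]:
                    preds[s] += 1
            ready_time = {t: 0.0 for t in succs}
            ready = [(0.0, t) for t in succs if preds[t] == 0]
            heapq.heapify(ready)
            procs = [0.0] * n_procs      # time at which each processor frees up
            heapq.heapify(procs)
            finish = {}
            while ready:
                t_ready, task = heapq.heappop(ready)
                p_free = heapq.heappop(procs)
                end = max(t_ready, p_free) + dur[task]
                finish[task] = end
                heapq.heappush(procs, end)
                for s in succs[task]:
                    ready_time[s] = max(ready_time[s], end)
                    preds[s] -= 1
                    if preds[s] == 0:    # all inputs have arrived
                        heapq.heappush(ready, (ready_time[s], s))
            return max(finish.values())  # makespan

        # Diamond graph: A feeds B and C, which both feed D.
        g = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
        d = {"A": 1.0, "B": 2.0, "C": 3.0, "D": 1.0}
        print(macro_dataflow_schedule(g, d, n_procs=2))   # 5.0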

  13. The error performance analysis over cyclic redundancy check codes

    NASA Astrophysics Data System (ADS)

    Yoon, Hee B.

    1991-06-01

    Burst errors are generated in digital communication networks by various unpredictable conditions; they occur at high error rates, last for short durations, and can impact services. To completely describe a burst error one has to know its bit pattern, which is impossible in practice on working systems. Therefore, under memoryless binary symmetric channel (MBSC) assumptions, evaluating or estimating the performance of digital signal 1 (DS1) transmission systems carrying live traffic is an interesting and important problem. This study presents analytical methods leading to efficient algorithms for detecting burst errors using cyclic redundancy check (CRC) codes. The definition of a burst error is introduced using three different models, of which the mathematical model is used in this study. A probability density function f(b) for burst errors of length b is proposed. The performance of CRC-n codes is evaluated and analyzed using f(b) in a computer simulation model of burst errors within a CRC block. The simulation results show that the mean block burst error tends to approach the pattern generated by random bit errors.
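
    To make the burst-detection property concrete, here is a minimal sketch (not from the thesis) of a bitwise CRC-16-CCITT computation; any single burst no longer than the 16-bit check is guaranteed to be caught. The generator polynomial choice and the all-bits-flipped burst pattern are illustrative.

        import random

        def crc_remainder(bits, poly=0x11021, n=16):
            # Bitwise CRC: append n zero bits and divide by the (n+1)-bit
            # generator polynomial (here CRC-16-CCITT); return the remainder.
            reg = 0
            for b in bits + [0] * n:
                reg = (reg << 1) | b
                if reg >> n:              # top bit set: XOR off the generator
                    reg ^= poly
            return reg

        random.seed(1)
        data = [random.randint(0, 1) for _ in range(512)]
        crc = crc_remainder(data)
        codeword = data + [(crc >> i) & 1 for i in range(15, -1, -1)]
        assert crc_remainder(codeword) == 0   # a valid codeword leaves remainder 0

        # One example burst per length b <= 16: flip b consecutive bits.
        for b in range(1, 17):
            pos = random.randrange(len(codeword) - b)
            bad = codeword[:]
            for i in range(pos, pos + b):
                bad[i] ^= 1
            assert crc_remainder(bad) != 0, f"burst of length {b} missed"
        print("all example bursts of length <= 16 detected")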

  15. A Preliminary Analysis of LANDSAT-4 Thematic Mapper Radiometric Performance

    NASA Technical Reports Server (NTRS)

    Justice, C.; Fusco, L.; Mehl, W.

    1985-01-01

    The NASA raw (BT) product, the radiometrically corrected (AT) product, and the radiometrically and geometrically corrected (PT) product of a TM scene were analyzed to examine the frequency distribution of the digital data, the statistical correlation between the bands, and the variability between the detectors within a band. The analyses were performed on a series of image subsets from the full scene. Results are presented from one 1024 x 1024 pixel subset of Reelfoot Lake, Tennessee, which displayed a representative range of ground conditions and cover types occurring within the full-frame image. From this cursory examination of one of the first seven-channel TM data sets, it would appear that the radiometric performance of the system is most satisfactory and largely meets pre-launch specifications. Problems were noted with Band 5 Detector 3 and Band 2 Detector 4. Differences were observed between forward and reverse scan detector responses for both the BT and AT products. No systematic variations were observed between odd and even detectors.

  16. Structural Design and Sealing Performance Analysis of Biomimetic Sealing Ring

    PubMed Central

    Han, Chuanjun

    2015-01-01

    In order to reduce the failure probability of rubber sealing rings in reciprocating dynamic seals, a new sealing ring structure based on bionics was designed. The biomimetic ring has three concave ridges and convex bulges on each side, very similar to an earthworm. The bulges were given a circular design, and the sealing performance of the biomimetic ring in both static and dynamic sealing was simulated by FEM. In addition, the effects of precompression, medium pressure, speed, friction coefficient, and material parameters on sealing performance were discussed. The results show that the von Mises stress of the biomimetic sealing ring is distributed symmetrically in no-pressure static sealing. The maximum von Mises stress appears on the second bulge of the inner side, and high contact stress concentrates on the left bulges. The von Mises stress distribution becomes uneven under medium pressure. Both von Mises stress and contact stress increase with precompression, medium pressure, and rubber hardness in static sealing. The biomimetic ring avoids rolling and distortion in reciprocating dynamic sealing, and its working life is much longer than that of O-rings and rectangular rings. The maximum von Mises stress and contact stress increase with precompression, medium pressure, rubber hardness, and friction coefficient in reciprocating dynamic sealing. PMID:27019582

  17. Cost-Effective Hyperspectral Transmissometers for Oceanographic Applications: Performance Analysis

    PubMed Central

    Ramírez-Pérez, Marta; Röttgers, Rüdiger; Torrecilla, Elena; Piera, Jaume

    2015-01-01

    The recent development of inexpensive, compact hyperspectral transmissometers broadens the research capabilities of oceanographic applications. These developments have been achieved by incorporating technologies such as micro-spectrometers as detectors as well as light emitting diodes (LEDs) as light sources. In this study, we evaluate the performance of the new commercial LED-based hyperspectral transmissometer VIPER (TriOS GmbH, Rastede, Germany), which combines different LEDs to emulate the visible light spectrum, aiming at the determination of attenuation coefficients in coastal environments. For this purpose, experimental uncertainties related to the instrument stability, the effect of ambient light and derived temperature, and salinity correction factors are analyzed. Our results identify some issues related to the thermal management of the LEDs and the contamination of ambient light. Furthermore, the performance of VIPER is validated against other transmissometers through simultaneous field measurements. It is demonstrated that VIPER provides a compact and cost-effective alternative for beam attenuation measurements in coastal waters, but it requires the consideration of several optimizations. PMID:26343652
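
    For reference, the quantity such transmissometers report follows from the Beer-Lambert relation: for transmittance T measured over optical pathlength L, the beam attenuation coefficient is c = -ln(T)/L. A minimal sketch (the pathlength and reading are assumed, and the pure-water baseline and the temperature/salinity corrections the paper derives are not included):

        import numpy as np

        def beam_attenuation(transmittance, pathlength_m=0.25):
            # Beer-Lambert: T = exp(-c * L)  =>  c = -ln(T) / L   [1/m]
            return -np.log(transmittance) / pathlength_m

        print(beam_attenuation(0.80))   # 80% transmission over 25 cm -> ~0.89 1/m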

  18. Examination performance and graphological analysis of students' handwriting.

    PubMed

    Lowis, M J; Mooney, S

    2001-10-01

    Research has yielded mixed support for graphological claims. The present study was designed to see whether specific components of students' handwriting were related to personality traits associated with achievement in written examinations. If aspects were identified that could be used to predict future academic performance, the findings would not only be of interest to graphologists but would be invaluable to both student and tutor in a teaching environment. In a blind trial, 100 handwriting samples from first-year scripts were analysed for the presence or absence of 12 graphological characteristics deemed to be relevant for academic performance, and each of these aspects was tested for association with the grade points awarded. Statistically significant differences were found for two of the 12 characteristics: "carefulness" and "constancy." Also, measurements of individual letters indicated that consistent slant was significantly associated with high grade points, whereas upright or mixed writing was not. These attributes appeared to be generally related to readability and aesthetic quality. Although such aspects might influence the grading of scripts by teachers, typed versions received similar grades to those awarded for the handwritten versions.

  19. The performance analysis of linux networking - packet receiving

    SciTech Connect

    Wu, Wenji; Crawford, Matt; Bowden, Mark; /Fermilab

    2006-11-01

    The computing models for High-Energy Physics experiments are becoming ever more globally distributed and grid-based, both for technical reasons (e.g., to place computational and data resources near each other and the demand) and for strategic reasons (e.g., to leverage equipment investments). To support such computing models, the network and end systems, computing and storage, face unprecedented challenges. One of the biggest challenges is to transfer scientific data sets--now in the multi-petabyte (10{sup 15} bytes) range and expected to grow to exabytes within a decade--reliably and efficiently among facilities and computation centers scattered around the world. Both the network and end systems should be able to provide the capabilities to support high bandwidth, sustained, end-to-end data transmission. Recent trends in technology are showing that although the raw transmission speeds used in networks are increasing rapidly, the rate of advancement of microprocessor technology has slowed down. Therefore, network protocol-processing overheads have risen sharply in comparison with the time spent in packet transmission, resulting in degraded throughput for networked applications. More and more, it is the network end system, instead of the network, that is responsible for degraded performance of network applications. In this paper, the Linux system's packet receive process is studied from NIC to application. We develop a mathematical model to characterize the Linux packet receiving process. Key factors that affect Linux systems network performance are analyzed.
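
    One concrete, low-effort way to observe the receive-path behavior the paper models on a Linux end system is the per-CPU counters in /proc/net/softnet_stat; a small reader is sketched below. The first three columns are packets processed, packets dropped because the input backlog was full, and softirq "time squeeze" events; the full column layout varies by kernel version, so treat anything beyond these as an assumption.

        def softnet_stats(path="/proc/net/softnet_stat"):
            # One row of hex counters per CPU; fields 0-2 are packets processed,
            # packets dropped (input backlog full), and time-squeeze events.
            rows = []
            with open(path) as f:
                for cpu, line in enumerate(f):
                    fields = [int(x, 16) for x in line.split()]
                    rows.append({"cpu": cpu, "processed": fields[0],
                                 "dropped": fields[1], "time_squeeze": fields[2]})
            return rows

        for r in softnet_stats():
            print(r)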

  20. Performance analysis of flexible DSSC with binder addition

    NASA Astrophysics Data System (ADS)

    Muliani, Lia; Hidayat, Jojo; Anggraini, Putri Nur

    2016-04-01

    Flexible DSSCs are a variant of the DSSC distinguished by the substrate. Because it is processed at low temperature, a flexible DSSC requires a binder to improve particle interconnection. This research compared the morphology and performance of flexible DSSCs produced with and without binder. TiO2 powder, butanol, and HCl were mixed to prepare a TiO2 paste, and a small amount of titanium isopropoxide was added to the mixture as binder. The paste was deposited on an ITO-PET plastic substrate over an area of 1 x 1 cm2 by the doctor-blade method. SEM, XRD, and BET characterization were then performed to analyze the morphology and surface area of the TiO2 photoelectrode microstructures. The dyed TiO2 photoelectrode and a platinum counter electrode were assembled and injected with electrolyte. In the last step, the flexible DSSCs were illuminated by a sun simulator for J-V measurement. The flexible DSSC containing binder showed higher performance, with a photoconversion efficiency of 0.31%.

  1. Methods for Analysis of Outdoor Performance Data (Presentation)

    SciTech Connect

    Jordan, D.

    2011-02-01

    The ability to accurately predict power delivery over time is of vital importance to the growth of the photovoltaic (PV) industry. Two key cost drivers are the efficiency with which sunlight is converted to power and how this performance develops over time. Accurate knowledge of the power decline over time, also known as the degradation rate, is essential and important to all stakeholders--utility companies, integrators, investors, and scientists alike. Different methods to determine degradation rates, for discrete and continuous data, are presented, and some general best-practice methods are outlined. In addition, historical degradation rates and some preliminary analysis with respect to climate are given.
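
    One common method for continuous field data is a least-squares linear fit of a normalized performance metric against time, with the slope expressed in percent of the initial value per year. A minimal sketch with synthetic data (all values assumed, not from the presentation):

        import numpy as np

        def degradation_rate(years, metric):
            # Least-squares linear fit; rate in percent of the fitted
            # initial value per year.
            slope, intercept = np.polyfit(years, metric, 1)
            return 100.0 * slope / intercept

        t = np.arange(0, 8, 0.25)                  # 8 years of quarterly points
        rng = np.random.default_rng(42)
        pr = 0.85 * (1 - 0.005 * t) + rng.normal(0, 0.01, t.size)  # true -0.5 %/yr
        print(f"{degradation_rate(t, pr):+.2f} %/year")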

  2. A Chaotic Ordered Hierarchies Consistency Analysis Performance Evaluation Model

    NASA Astrophysics Data System (ADS)

    Yeh, Wei-Chang

    2013-02-01

    The Hierarchies Consistency Analysis (HCA) was proposed by Guh, together with a case study on a resort, to address a weakness of the Analytic Hierarchy Process (AHP). Although the results obtained help the decision maker reach more reasonable and rational verdicts, the HCA itself is flawed. In this paper, our objective is to point out the problems of HCA and then propose a revised method, called chaotic ordered HCA (COH for short), which avoids them. Since COH is based upon Guh's method, the decision maker establishes decisions in a way similar to that of the original method.

  3. Human Performance Modeling for Dynamic Human Reliability Analysis

    SciTech Connect

    Boring, Ronald Laurids; Joe, Jeffrey Clark; Mandelli, Diego

    2015-08-01

    Part of the U.S. Department of Energy's (DOE's) Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk framework. In this paper, we review simulation-based and non-simulation-based human reliability analysis (HRA) methods. This paper summarizes the foundational information needed to develop a feasible approach to modeling human interactions in RISMC simulations.

  4. Navier-Stokes analysis of radial turbine rotor performance

    NASA Technical Reports Server (NTRS)

    Larosiliere, L. M.

    1993-01-01

    An analysis of flow through a radial turbine rotor using the three-dimensional, thin-layer Navier-Stokes code RVC3D is described. The rotor is a solid version of an air-cooled metallic radial turbine having thick trailing edges, shroud clearance, and scalloped-backface clearance. Results are presented at the nominal operating condition using both a zero-clearance model and a model simulating the effects of the shroud and scalloped-backface clearance flows. A comparison with the available test data is made and details of the internal flow physics are discussed, allowing a better understanding of the complex flow distribution within the rotor.

  5. Performance analysis of high quality parallel preconditioners applied to 3D finite element structural analysis

    SciTech Connect

    Kolotilina, L.; Nikishin, A.; Yeremin, A.

    1994-12-31

    The solution of large systems of linear equations is a crucial bottleneck when performing 3D finite element analysis of structures. Also, in many cases the reliability and robustness of iterative solution strategies, and their efficiency when exploiting hardware resources, fully determine the scope of industrial applications which can be solved on a particular computer platform. This is especially true for modern vector/parallel supercomputers with large vector length and for modern massively parallel supercomputers. Preconditioned iterative methods have been successfully applied to industrial class finite element analysis of structures. The construction and application of high quality preconditioners constitutes a high percentage of the total solution time. Parallel implementation of high quality preconditioners on such architectures is a formidable challenge. Two common types of existing preconditioners are the implicit preconditioners and the explicit preconditioners. The implicit preconditioners (e.g. incomplete factorizations of several types) are generally high quality but require solution of lower and upper triangular systems of equations per iteration which are difficult to parallelize without deteriorating the convergence rate. The explicit type of preconditionings (e.g. polynomial preconditioners or Jacobi-like preconditioners) require sparse matrix-vector multiplications and can be parallelized but their preconditioning qualities are less than desirable. The authors present results of numerical experiments with Factorized Sparse Approximate Inverses (FSAI) for symmetric positive definite linear systems. These are high quality preconditioners that possess a large resource of parallelism by construction without increasing the serial complexity.
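
    For contrast with the implicit preconditioners discussed above, the following sketch runs preconditioned conjugate gradients with SciPy on a model SPD system. A simple Jacobi (diagonal) preconditioner stands in for FSAI, whose construction is beyond a short example; like FSAI, though, it is applied by multiplication rather than by triangular solves, which is what makes it parallelizable. (Requires SciPy >= 1.12 for the rtol keyword.)

        import numpy as np
        import scipy.sparse as sp
        from scipy.sparse.linalg import cg, LinearOperator

        # Model SPD system: 2-D Laplacian as a stand-in for a stiffness matrix.
        n = 50
        T = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n))
        A = (sp.kron(sp.eye(n), T) + sp.kron(T, sp.eye(n))).tocsr()
        b = np.ones(A.shape[0])

        # Jacobi preconditioner, applied as an elementwise product.
        # (On this constant-diagonal test matrix it only rescales; on real
        # stiffness matrices with varying diagonals it helps convergence.)
        inv_diag = 1.0 / A.diagonal()
        M = LinearOperator(A.shape, matvec=lambda x: inv_diag * x)

        count = {"it": 0}
        def cb(xk):
            count["it"] += 1

        x, info = cg(A, b, M=M, rtol=1e-8, callback=cb)
        print(info, count["it"], np.linalg.norm(A @ x - b))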

  6. Integrative Performance Analysis of a Novel Bone Level Tapered Implant.

    PubMed

    Dard, M; Kuehne, S; Obrecht, M; Grandin, M; Helfenstein, J; Pippenger, B E

    2016-03-01

    Primary mechanical stability, as measured by maximum insertion torque and resonance frequency analysis, is generally considered to be positively associated with successful secondary stability and implant success. Primary implant stability can be affected by several factors, including the quality and quantity of available bone, the implant design, and the surgical procedure. The use of a tapered implant design, for instance, has been shown to result in good primary stability even in clinical scenarios where primary stability is otherwise difficult to achieve with traditional cylindrical implants, for example in soft bone and for immediate placement in extraction sockets. In this study, bone-type-specific drill procedures are presented for a novel Straumann bone level tapered implant that keep maximum insertion torque values within the range of 15 to 80 Ncm. The drill procedures are tested in vitro using polyurethane foam blocks of variable density, ex vivo on explanted porcine ribs (bone type 3), and finally in vivo on porcine mandibles (bone type 1). At each test site, the adapted drill procedures achieved good primary stability. These results are further translated into a finite element analysis model capable of predicting the primary stability of tapered implants. In conclusion, we have assessed the biomechanical behavior of a novel taper-walled implant in combination with a bone-type-specific drill procedure in both synthetic and natural bone of various types, and we have developed an in silico model for predicting primary stability upon implantation. PMID:26927485

  7. Performance analysis of elite men's and women's wheelchair basketball teams.

    PubMed

    Gómez, Miguel Ángel; Pérez, Javier; Molik, Bartosz; Szyman, Robert J; Sampaio, Jaime

    2014-01-01

    The purpose of the present study was to identify which game-related statistics discriminate between winning and losing teams in men's and women's elite wheelchair basketball. The sample comprised all games played during the 2008 Beijing Paralympics and the 2010 World Wheelchair Basketball Championship. Game-related statistics from the official box scores were gathered, and the data were analysed in two groups: balanced games (final score differences ≤ 12 points) and unbalanced games (final score differences ≥ 13 points). Discriminant analysis identified successful 2-point field-goals and free-throws, unsuccessful 3-point field-goals and free-throws, assists, and fouls received as the statistics discriminating between winning and losing teams in men's balanced games. In women's games, the teams were discriminated only by successful 2-point field-goals. Linear regression analysis showed that the quality of opposition had a great effect on the final point differential. Field-goal percentage and free-throw rate were the most important factors in men's games, and field-goal percentage and offensive rebounding percentage in women's games. The identified trends improve game understanding and help wheelchair basketball coaches plan accurate practice sessions and, ultimately, make better decisions in competition.

  8. A Kinematics Analysis Of Three Best 100 M Performances Ever

    PubMed Central

    Krzysztof, Maćkała; Mero, Antti

    2013-01-01

    The purpose of this investigation was to compare and determine the relevance of morphological characteristics and the variability of running speed parameters (stride length and stride frequency) across Usain Bolt's three best 100 m performances. Based on this, an attempt was made to define which factors determine the performance of Usain Bolt's sprint and, therefore, distinguish him from other sprinters. We analyzed the previous world record of 9.69 s set at the 2008 Beijing Olympics, the current record of 9.58 s set at the 2009 Berlin World Championships in Athletics, and the Olympic record of 9.63 s set at the 2012 London Olympic Games. The VirtualDub programme allowed the acquisition of basic kinematic variables, such as step length and step frequency over the 100 m sprint, from video footage provided by the NBC and BBC TV stations. These data were compared with other data available on the web and with data published by the Scientific Research Project Office on behalf of the IAAF and the German Athletics Association (DVL). The main hypothesis was that step length is the main factor determining running speed in the 10 and 20 m sections of the entire 100 m distance. Bolt's anthropometric advantage (body height, leg length and linear body) is unquestionable and is one of the factors that makes him faster than the rest of the finalists in each of the three competitions. Additionally, Bolt's 20 cm longer stride shows its benefit in the latter part of the race. Beyond these factors, he is probably able to strike the ground more forcefully than the rest of the sprinters, relative to body mass, and may therefore maximize his time on the ground and exert the same force over this period of time. This ability, combined with the longer stride, allows him to reach very high running speeds - over 12 m/s (12.05 - 12.34 m/s) in some 10 m sections of his three 100 m performances. These assumptions confirmed the application of

  9. A kinematics analysis of three best 100 m performances ever.

    PubMed

    Krzysztof, Maćkała; Mero, Antti

    2013-03-01

    The purpose of this investigation was to compare and determine the relevance of morphological characteristics and the variability of running speed parameters (stride length and stride frequency) across Usain Bolt's three best 100 m performances. Based on this, an attempt was made to define which factors determine the performance of Usain Bolt's sprint and, therefore, distinguish him from other sprinters. We analyzed the previous world record of 9.69 s set at the 2008 Beijing Olympics, the current record of 9.58 s set at the 2009 Berlin World Championships in Athletics, and the Olympic record of 9.63 s set at the 2012 London Olympic Games. The VirtualDub programme allowed the acquisition of basic kinematic variables, such as step length and step frequency over the 100 m sprint, from video footage provided by the NBC and BBC TV stations. These data were compared with other data available on the web and with data published by the Scientific Research Project Office on behalf of the IAAF and the German Athletics Association (DVL). The main hypothesis was that step length is the main factor determining running speed in the 10 and 20 m sections of the entire 100 m distance. Bolt's anthropometric advantage (body height, leg length and linear body) is unquestionable and is one of the factors that makes him faster than the rest of the finalists in each of the three competitions. Additionally, Bolt's 20 cm longer stride shows its benefit in the latter part of the race. Beyond these factors, he is probably able to strike the ground more forcefully than the rest of the sprinters, relative to body mass, and may therefore maximize his time on the ground and exert the same force over this period of time. This ability, combined with the longer stride, allows him to reach very high running speeds - over 12 m/s (12.05 - 12.34 m/s) in some 10 m sections of his three 100 m performances. These assumptions confirmed the application of Ballerieich

  10. Benchmarking and performance analysis of the CM-2. [SIMD computer

    NASA Technical Reports Server (NTRS)

    Myers, David W.; Adams, George B., II

    1988-01-01

    A suite of benchmarking routines testing communication, basic arithmetic operations, and selected kernel algorithms written in LISP and PARIS was developed for the CM-2. Experiment runs are automated via a software framework that sequences individual tests, allowing for unattended overnight operation. Multiple measurements are made and treated statistically to generate well-characterized results from the noisy values given by cm:time. The results obtained provide a comparison with similar, but less extensive, testing done on a CM-1. Tests were chosen to aid the algorithmist in constructing fast, efficient, and correct code on the CM-2, as well as gain insight into what performance criteria are needed when evaluating parallel processing machines.

  11. Analysis of the impact of error detection on computer performance

    NASA Technical Reports Server (NTRS)

    Shin, K. C.; Lee, Y. H.

    1983-01-01

    Conventionally, reliability analyses either assume that a fault/error is detected immediately following its occurrence, or neglect the damage caused by latent errors. Though unrealistic, this assumption has been imposed to avoid the difficulty of determining the respective probabilities that a fault induces an error and that the error is then detected a random amount of time after its occurrence. As a remedy, a model is proposed to analyze the impact of error detection on computer performance under moderate assumptions. Error latency, the time interval between the occurrence of an error and the moment of its detection, is used to measure the effectiveness of a detection mechanism. The model is used to (1) predict the probability of producing an unreliable result, and (2) estimate the loss of computation due to faults and/or errors.

  12. Oxygen rich gas generator design and performance analysis

    NASA Technical Reports Server (NTRS)

    Gloyer, P. W.; Knuth, W. H.; Crawford, R. A.

    1993-01-01

    The present oxygen-rich combustion research investigates oxygen gas generator concepts. The theoretical and modeling aspects of a selected concept are presented, together with a refined concept resulting from the findings of the study. This investigation examined a counter-flow gas generator design for O2/H2 mass ratios of 100-200, featuring a near-stoichiometric combustion zone followed by downstream mixing. The critical technologies required to develop a performance model are analyzed and include the following: (1) oxygen flow boiling; (2) two-phase oxygen flow heat transfer; (3) film-cooling in the combustion zone; (4) oxygen-rich combustion with hydrogen; and (5) mixing and dilution.

  13. Structural analysis of amorphous phosphates using high performance liquid chromatography

    SciTech Connect

    Sales, B.C.; Boatner, L.A.; Chakoumakos, B.C.; McCallum, J.C.; Ramey, J.O.; Zuhr, R.A.

    1993-12-31

    Determining the atomic-scale structure of amorphous solids has proven to be a formidable scientific and technological problem for the past 100 years. The technique of high-performance liquid chromatography (HPLC) provides unique detailed information regarding the structure of partially disordered or amorphous phosphate solids. Applications of the experimental technique of HPLC to phosphate solids are reviewed, and examples of the type of information that can be obtained with HPLC are presented. Inorganic phosphates encompass a large class of important materials whose applications include: catalysts, ion-exchange media, solid electrolytes for batteries, linear and nonlinear optical components, chelating agents, synthetic replacements for bone and teeth, phosphors, detergents, and fertilizers. Phosphate ions also represent a unique link between living systems and the inorganic world.

  14. Spaceborne Doppler Precipitation Radar: System Configurations and Performance Analysis

    NASA Technical Reports Server (NTRS)

    Tanelli, Simone; Im, Eastwood

    2004-01-01

    Knowledge of the global distribution of the vertical velocity of precipitation is important in the study of energy transport in the atmosphere, climate, and weather. Such knowledge can only be directly acquired with the use of spaceborne Doppler precipitation radars. Although the high relative speed of the radar with respect to the rainfall particles introduces significant broadening in the Doppler spectrum, recent studies have shown that the average vertical velocity can be measured to acceptable accuracy levels by appropriate selection of radar parameters. Furthermore, methods to correct for specific errors arising from NUBF effects and pointing uncertainties have recently been developed. In this paper we present the results of trade studies on the performance of a spaceborne Doppler radar under different system parameter configurations.

  15. Operational Performance Analysis of Passive Acoustic Monitoring for Killer Whales

    SciTech Connect

    Matzner, Shari; Fu, Tao; Ren, Huiying; Deng, Zhiqun; Sun, Yannan; Carlson, Thomas J.

    2011-09-30

    For the planned tidal turbine site in Puget Sound, WA, the main concern is to protect Southern Resident Killer Whales (SRKW) due to their Endangered Species Act status. A passive acoustic monitoring system is proposed because the whales emit vocalizations that can be detected by a passive system. The algorithm for detection is implemented in two stages. The first stage is an energy detector designed to detect candidate signals. The second stage is a spectral classifier that is designed to reduce false alarms. The evaluation presented here of the detection algorithm incorporates behavioral models of the species of interest, environmental models of noise levels and potential false alarm sources to provide a realistic characterization of expected operational performance.
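
    A minimal sketch of such a two-stage detector follows; the thresholds, frame sizes, and frequency band are illustrative assumptions, not the deployed algorithm. Stage 1 flags frames whose energy rises well above a median noise floor; stage 2 keeps only frames whose spectrum is concentrated in the band where SRKW tonal calls are expected, reducing false alarms from broadband noise.

        import numpy as np

        def detect_calls(x, fs, frame=1024, hop=512, energy_k=4.0,
                         band=(1000, 10000)):
            hits = []
            noise_floor = np.median([np.sum(x[i:i + frame] ** 2)
                                     for i in range(0, len(x) - frame, hop)])
            freqs = np.fft.rfftfreq(frame, 1 / fs)
            in_band = (freqs >= band[0]) & (freqs <= band[1])
            for i in range(0, len(x) - frame, hop):
                seg = x[i:i + frame]
                if np.sum(seg ** 2) < energy_k * noise_floor:
                    continue                       # stage 1: energy detector
                spec = np.abs(np.fft.rfft(seg * np.hanning(frame))) ** 2
                if spec[in_band].sum() / spec.sum() > 0.6:
                    hits.append(i / fs)            # stage 2: spectral screen
            return hits

        fs = 44100
        t = np.arange(0, 3, 1 / fs)
        x = 0.01 * np.random.randn(t.size)
        x[fs:2 * fs] += 0.2 * np.sin(2 * np.pi * 4000 * t[fs:2 * fs])  # fake call
        print(detect_calls(x, fs)[:3])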

  16. Rankine engine solar power generation. I - Performance and economic analysis

    NASA Technical Reports Server (NTRS)

    Gossler, A. A.; Orrock, J. E.

    1981-01-01

    Results of a computer simulation of the performance of a solar flat-plate-collector-powered electrical generation system are presented. The simulation covered locations in New Mexico, North Dakota, Tennessee, and Massachusetts, and considered a collector system with storage using a water-based heat-transfer fluid. The collectors also powered a Rankine-cycle boiler using a low-temperature working fluid. The generator was considered to run only when excess solar heat and full storage would otherwise require purging heat through the collectors, and all power was directed into the utility grid. The benefit of adding the solar-powered generator unit was found to depend on site location and collector area; it reduced the effective solar cost for collector areas greater than 400-670 sq m. The sites ranked economically, best to worst: New Mexico, North Dakota, Massachusetts, and Tennessee.

  17. Systems study on engineered barriers: barrier performance analysis

    SciTech Connect

    Stula, R.T.; Albert, T.E.; Kirstein, B.E.; Lester, D.H.

    1980-09-01

    A performance assessment model for multiple-barrier packages containing unreprocessed spent fuel has been modified and applied to several package designs. The objective of the study was to develop information to be used in programmatic decision making concerning engineered barrier package design and development. The assessment model, BARIER, was developed in previous tasks of the System Study on Engineered Barriers (SSEB). The new version discussed in this report contains a refined and expanded corrosion-rate data base which includes pitting, crack growth, and graphitization as well as bulk corrosion. Corrosion rates for oxic and anoxic conditions at each of the two temperature ranges are supplied. Other improvements include a rigorous treatment of radionuclide release after package failure, which accounts for the resistance of damaged barriers and backfill; refined temperature calculations that account for convection and radiation; a subroutine to calculate the nuclear gamma radiation field at each barrier surface; refined stress calculations with reduced conservatism; and various coding improvements to improve running time and core usage. This report also discusses alternative scenarios to the assumed flooded repository, as well as the impact of water-exclusion backfills. The model was used to assess post-repository-closure performance for several designs, all variations of basic designs from the Spent Unreprocessed Fuel (SURF) program. Many designs were found to delay the onset of leaching by at least a few hundred years in all geologic media. Long delay times for radionuclide release were found for packages with a few inches of sorption backfill. Release of uranium, plutonium, and americium was assessed.

  18. Performance analysis of OFDM modulation on indoor broadband PLC channels

    NASA Astrophysics Data System (ADS)

    Antonio Cortés, José; Díez, Luis; Cañete, Francisco Javier; Sánchez-Martínez, Juan José; Entrambasaguas, José Tomás

    2011-12-01

    Indoor broadband power-line communication is a suitable technology for home networking applications. In this context, orthogonal frequency-division multiplexing (OFDM) is the most widespread modulation technique. It has recently been adopted by ITU-T Recommendation G.9960 and is also used by most commercial systems, whose number of carriers has gone from about 100 to a few thousand in less than a decade. However, indoor power-line channels are frequency-selective and exhibit periodic time variations. Hence, increasing the number of carriers does not always improve performance: it reduces the distortion due to the frequency selectivity, but increases the distortion caused by the channel time variation. In addition, the long impulse response of power-line channels forces the use of an insufficient cyclic prefix; lengthening it reduces the distortion but also the symbol rate. Therefore, there are optimum values for both modulation parameters. This article evaluates the performance of an OFDM system as a function of the number of carriers and the cyclic prefix length, determining their most appropriate values for the indoor power-line scenario. Ordinarily this task would require time-consuming simulations employing linear time-varying filtering, since no consensus on a tractable statistical channel model has yet been reached. Instead, this study presents a simpler procedure in which the distortion due to the frequency selectivity is computed using a time-invariant channel response, and an analytical expression is derived for the distortion caused by the channel time variation.
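
    The two competing effects can be seen in a back-of-the-envelope way: for a fixed cyclic prefix long enough to cover the channel, more carriers N raise the time-domain efficiency N/(N + N_cp), but they also stretch the symbol, increasing sensitivity to channel time variation. A sketch with assumed numbers (30 MHz sampling, a prefix covering a 2 us impulse response; both are illustrative, not the article's values):

        # Symbol-rate efficiency vs. number of carriers N for a cyclic prefix
        # long enough to cover the channel impulse response (assumed values).
        fs = 30e6           # sample rate: 30 MHz broadband PLC band (assumption)
        cp_time = 2e-6      # prefix covering a ~2 us impulse response (assumption)
        n_cp = int(cp_time * fs)
        for N in (256, 1024, 4096, 16384):
            eff = N / (N + n_cp)      # fraction of each symbol carrying payload
            t_sym = (N + n_cp) / fs   # longer symbols: more time-variation distortion
            print(f"N={N:6d}  efficiency={eff:6.1%}  symbol={t_sym * 1e6:7.1f} us")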

  19. Project on restaurant energy performance: end-use monitoring and analysis. Appendixes I and II

    SciTech Connect

    Claar, C.N.; Mazzucchi, R.P.; Heidell, J.A.

    1985-05-01

    This is the second volume of the report, "The Project on Restaurant Energy Performance - End-Use Monitoring and Analysis". The first volume (PNL-5462) contains a summary and analysis of the metered energy performance data collected by the Project on Restaurant Energy Performance (PREP). Appendix I, presented here, contains monitoring-site descriptions, measurement plans, and data summaries for the seven restaurants metered for PREP. Appendix II, also in this volume, is a description of the PREP computer system.

  20. Increasing the performance of tritium analysis by electrolytic enrichment.

    PubMed

    Groning, M; Auer, R; Brummer, D; Jaklitsch, M; Sambandam, C; Tanweer, A; Tatzber, H

    2009-06-01

    Several improvements to the existing tritium enrichment system at the Isotope Hydrology Laboratory of the International Atomic Energy Agency for processing natural water samples are described. The improvements include a simple method for pretreatment of electrolytic cells to ensure a high tritium separation factor, an improved design of the exhaust system for explosive gases, and a vacuum distillation line for faster initial preparation of water samples for electrolytic enrichment and tritium analysis. Achievements included reducing the variation of individual enrichment parameters across all cells to less than 1% and improving the stability of the background mean by 50%. This resulted in an improved detection limit of less than 0.4 TU (at 2 sigma), important for future application of tritium measurements at low concentration levels, and in measurement precisions of ±0.2 TU and ±0.15 TU for liquid scintillation counting and gas proportional counting, respectively. PMID:20183225

  1. Algorithms and architectures for high performance analysis of semantic graphs.

    SciTech Connect

    Hendrickson, Bruce Alan

    2005-09-01

    Semantic graphs offer one promising avenue for intelligence analysis in homeland security. They provide a mechanism for describing a wide variety of relationships between entities of potential interest. The vertices are nouns of various types, e.g. people, organizations, events, etc. Edges in the graph represent different types of relationships between entities, e.g. 'is friends with', 'belongs-to', etc. Semantic graphs offer a number of potential advantages as a knowledge representation system. They allow information of different kinds, and collected in differing ways, to be combined in a seamless manner. A semantic graph is a very compressed representation of the relationship information; it has been reported that a semantic graph can be two orders of magnitude smaller than the processed intelligence data. This allows much larger portions of the data universe to be resident in computer memory. Many intelligence queries that are relevant to the terrorist threat are naturally expressed in the language of semantic graphs. One example is the search for 'interesting' relationships between two individuals, or between an individual and an event, which can be phrased as a search for short paths in the graph. Another example is the search for an analyst-specified threat pattern, which can be cast as an instance of subgraph isomorphism. It is important to note that many kinds of analysis are not relationship based, so these are not good candidates for semantic graphs; thus, a semantic graph should always be used in conjunction with traditional knowledge representation and interface methods. Operations that involve looking for chains of relationships (e.g. friend of a friend) are not efficiently executable in a traditional relational database. However, the semantic graph can be thought of as a pre-join of the database, and it is ideally suited for these kinds of operations. Researchers at Sandia National Laboratories are working to facilitate semantic graph
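
    As a toy illustration of the short-path query described above (the entities and edge types are invented, not from the report), a breadth-first search over a typed edge list finds the shortest relationship chain between two entities:

        from collections import deque

        # Toy semantic graph: typed edges between entities.
        edges = [
            ("alice", "is_friends_with", "bob"),
            ("bob", "belongs_to", "group_x"),
            ("carol", "belongs_to", "group_x"),
            ("carol", "attended", "event_1"),
        ]
        adj = {}
        for src, rel, dst in edges:
            adj.setdefault(src, []).append((rel, dst))
            adj.setdefault(dst, []).append((rel + "^-1", src))  # both directions

        def shortest_chain(a, b):
            # BFS: the 'interesting relationship' query as a short-path search.
            q, seen = deque([(a, [])]), {a}
            while q:
                node, path = q.popleft()
                if node == b:
                    return path
                for rel, nxt in adj.get(node, []):
                    if nxt not in seen:
                        seen.add(nxt)
                        q.append((nxt, path + [(node, rel, nxt)]))
            return None

        print(shortest_chain("alice", "event_1"))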

  2. High Performance Parallel Analysis of Coupled Problems for Aircraft Propulsion

    NASA Technical Reports Server (NTRS)

    Felippa, C. A.; Farhat, C.; Lanteri, S.; Maman, N.; Piperno, S.; Gumaste, U.

    1994-01-01

    In order to predict the dynamic response of a flexible structure in a fluid flow, the equations of motion of the structure and the fluid must be solved simultaneously. In this paper, we present several partitioned procedures for time-integrating this coupled fluid-structure problem and discuss their merits in terms of accuracy, stability, heterogeneous computing, I/O transfers, subcycling, and parallel processing. All theoretical results are derived for a one-dimensional piston model problem with a compressible flow, because the complete three-dimensional aeroelastic problem is difficult to analyze mathematically. However, the insight gained from the analysis of the coupled piston problem and the conclusions drawn from its numerical investigation are confirmed with the numerical simulation of the two-dimensional transient aeroelastic response of a flexible panel in a transonic nonlinear Euler flow regime.

  3. Performing Taxometric Analysis to Distinguish Categorical and Dimensional Variables.

    PubMed

    Ruscio, John; Ruscio, Ayelet Meron; Carney, Lauren M

    2011-01-01

    A fundamental question facing clinical scientists is whether the constructs they are studying are categorical or dimensional in nature. The taxometric method was developed expressly to answer this question and is being used by a growing number of investigators to inform theory, research, and practice in psychopathology. The current paper provides a practical introduction to the method, updating earlier tutorials based on the findings of recent methodological studies. We offer revised guidelines for data requirements, indicator selection, parameter estimation, and procedure selection and implementation. We illustrate our recommended approach to taxometric analysis using idealized data sets as well as data sets representative of those found in clinical research. We close with advice to help newcomers get started on their own taxometric analyses.

  4. Airborne Doppler lidar detection of wind shear. Results of performance analysis

    NASA Technical Reports Server (NTRS)

    Huffaker, R. Milton

    1988-01-01

    Results of a performance analysis of an airborne Doppler lidar wind shear detection system are given in vugraph form. It was concluded that both CO2 and Ho:YAG lasers are feasible for dry microburst applications, but offer limited performance in wet microbursts. The Ho:YAG performs better than the CO2 laser for a set of identical lidar parameters.

  5. Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0

    DOE PAGES

    Huck, Kevin A.; Malony, Allen D.; Shende, Sameer; Morris, Alan

    2008-01-01

    The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments present a challenge to managing and processing the information. Simply characterizing the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering, and correlation analysis of individual trials of large dimensions, and can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We give examples of large-scale analysis results, and discuss the future development of the framework, including the encoding and processing of expert performance rules, and the increasing use of performance metadata.

  6. Performance analysis of digital silicon photomultipliers for PET

    NASA Astrophysics Data System (ADS)

    Somlai-Schweiger, I.; Schneider, F. R.; Ziegler, S. I.

    2015-05-01

    A silicon photomultiplier (SiPM) with electronics integrated on cell level has been developed by Philips Digital Photon Counting. The device delivers a digital signal of the detected photon counts and their time stamp, making it a potential candidate for positron emission tomography (PET) applications. Several operational parameters of the specifically developed acquisition protocol can be adjusted to optimize the photon detection. In this work, the combination of five different parameters (trigger scheme, validation scheme, cell inhibition, temperature and excess bias voltage) is analyzed. Their impact on both the intrinsic as well as the PET-oriented sensor's performance is studied when coupled to two different PET candidate scintillators, GAGG and LYSO (2 × 2 × 6 mm3). The results show that SiPM intrinsic properties such as breakdown voltage temperature coefficient (20 mV/K) and optical crosstalk (20%) are similar to state-of-the-art analog devices. The main differences are induced by the logic of the acquisition sequence and its parameters. The sensor's dark-count-rate (DCR) is 770 kHz/mm2 at 24°C and 100% active cells. It can be reduced through cell inhibition and lower temperatures (ca. 2 orders of magnitude at 0°C and 20% cell inhibition). DCR reduction is necessary to avoid acquiring dark-count-triggered and validated events, causing loss of detection sensitivity. The typical time fraction spent with these events is 42.5% (GAGG) and 35.5% (LYSO). Increasing percentages of cell inhibition affect the photodetection efficiency and with it the energy resolution and the coincidence time resolution (CTR). At 5.6 °C and 10% cell inhibition, the measured energy resolution is 11.9% and 13.5% (FWHM, saturation corrected) and a FWHM CTR of 458 ps and 177 ps is achieved, for GAGG and LYSO respectively. With the implemented setup, the optimum configuration for PET, in terms of sensitivity, energy resolution and CTR, is trigger scheme 1, validation scheme 8, 10

  7. Analysis of bio-anode performance through electrochemical impedance spectroscopy.

    PubMed

    ter Heijne, Annemiek; Schaetzle, Olivier; Gimenez, Sixto; Navarro, Lucia; Hamelers, Bert; Fabregat-Santiago, Francisco

    2015-12-01

    In this paper we studied the performance of bioanodes under different experimental conditions using polarization curves and impedance spectroscopy. We identified that the large capacitances of up to 1 mF·cm(-2) for graphite anodes have their origin in the nature of the carbonaceous electrode, rather than the microbial culture. In some cases, the separate contributions of charge transfer and diffusion resistance were clearly visible, while in other cases their contribution was masked by the high capacitance of 1 mF·cm(-2). The impedance data were analyzed using the basic Randles model to extract ohmic, charge transfer, and diffusion resistances. Increasing the buffer concentration from 0 to 50 mM and increasing the pH from 6 to 8 resulted in decreased charge transfer and diffusion resistances, the lowest values being 144 Ω·cm(2) and 34 Ω·cm(2), respectively. At acetate concentrations below 1 mM, current generation was limited by acetate. We show a linear relationship between the inverse charge transfer resistance at potentials close to open circuit and the saturation (maximum) current, associated with the Butler-Volmer relationship, that needs further exploration. PMID:25869113
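
    As a sketch of the basic Randles model used for such fits (not the authors' code), the following computes the impedance spectrum of a solution resistance in series with a charge-transfer resistance in parallel with a double-layer capacitance; the Warburg diffusion element is omitted for brevity, and the numeric values merely echo the scale of the results quoted above:

        import numpy as np

        def randles_z(f, r_ohm, r_ct, c_dl):
            # Z(w) = R_ohm + R_ct || 1/(j w C_dl), with w = 2*pi*f
            w = 2 * np.pi * f
            z_c = 1.0 / (1j * w * c_dl)
            return r_ohm + (r_ct * z_c) / (r_ct + z_c)

        f = np.logspace(-1, 4, 6)    # 0.1 Hz .. 10 kHz
        z = randles_z(f, r_ohm=10, r_ct=144, c_dl=1e-3)
        for fi, zi in zip(f, z):
            print(f"{fi:9.1f} Hz   Re={zi.real:7.1f}  -Im={-zi.imag:7.1f}")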

  9. Lunar lander configuration study and parametric performance analysis

    NASA Technical Reports Server (NTRS)

    Donahue, Benjamin B.; Fowler, C. R.

    1993-01-01

    Future Lunar exploration plans will call for delivery of significant amounts of cargo to provide for crew habitation, surface transportation, and scientific exploration activities. Minimization of costly surface-based infrastructure is in large part directly related to the design of the cargo delivery/landing craft. This study focused on evaluating Lunar lander concepts from a logistics-oriented perspective; it outlines the approach used in the development of a preferred configuration, sets forth the benefits derived from its utilization, and describes the missions and systems considered. Results indicate that only direct-to-surface downloading of payloads provides for unassisted cargo removal operations imperative to efficient and low-risk site buildup, including the emplacement of Space Station-derivative surface habitat modules; immediate cargo jettison for both descent abort and emergency surface ascent, essential to piloted missions carrying cargo; and short habitat egress/ingress paths necessary for productive surface work tours by crew members carrying hand-held experiments, tools, and other bulky articles. By accommodating cargo in a position underneath the vehicle's structural frame, the landing craft described herein eliminate altogether the necessity for dedicated surface-based off-loading vehicles, the operations and maintenance associated with their use, and the precipitous ladder climbs to and from the surface that are inherent to traditional designs. Parametric evaluations illustrate performance and mass variation with respect to mission requirements.

  10. Performance analysis of GPS receivers in impulsive noise

    NASA Astrophysics Data System (ADS)

    Liu, Liyu; Amin, Moeness

    2005-06-01

    The use of GPS has broadened to include mounting on or inside manned or autonomous vehicles, which makes it subject to interference generated by motor emissions. Many sources of interference are typically modeled as impulsive noise whose characteristics may vary in terms of power, pulse width, and pulse occurrence. In this paper, we examine the effect of impulsive noise on GPS delay lock loops (DLL). We consider the DLL for the GPS Coarse/Acquisition (C/A) code, which is used in civilian applications but is also needed in military GPS receivers to perform signal acquisition and tracking. We focus on the statistics of the noise components of the early, late, and punctual correlators, which contribute to the discriminator error. The discriminator noise components are produced by the correlation between the impulsive noise and the early, late, and punctual reference C/A codes. Due to long time averaging, these components assume Gaussian distributions. The discriminator error variance is derived, incorporating the front-end precorrelation filter. It is shown that the synchronization error variance is significantly affected by the power of the received impulsive noise, the precorrelation filter, and the sample rate.
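
    To make the correlator geometry concrete, here is a toy sketch (not from the paper) of an early-late delay discriminator operating on a random stand-in for a C/A PRN code in white Gaussian noise; the code length, sampling rate, correlator spacing, and noise level are all illustrative. The early-late output crosses zero at the true code delay, and the noise terms entering it are what drive the discriminator error variance discussed above.

        import numpy as np

        rng = np.random.default_rng(7)
        code = rng.choice([-1.0, 1.0], 1023)   # random stand-in for a C/A code
        spc = 10                               # samples per chip

        def corr(delay_chips, rx):
            # Correlate rx against a local replica delayed by delay_chips.
            idx = ((np.arange(rx.size) / spc - delay_chips) // 1).astype(int) % 1023
            return rx @ code[idx] / rx.size

        samples = np.repeat(code, spc)
        tau = 0.3                              # true code delay, in chips
        rx = np.roll(samples, int(tau * spc)) + rng.normal(0, 1.0, samples.size)

        half = 0.5                             # early-late spacing / 2, in chips
        for est in (0.0, 0.2, 0.3, 0.4):
            e, l = corr(est - half, rx), corr(est + half, rx)
            print(f"est={est:.1f} chips  early-late = {e - l:+.4f}")  # ~0 at tau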

  11. Performance Analysis of TCP Enhancements in Satellite Data Networks

    NASA Technical Reports Server (NTRS)

    Broyles, Ren H.

    1999-01-01

    This research examines two proposed enhancements to the well-known Transmission Control Protocol (TCP) in the presence of noisy communication links. The Multiple Pipes protocol is an application-level adaptation of the standard TCP protocol in which several TCP links cooperate to transfer data. The Space Communication Protocol Standard - Transport Protocol (SCPS-TP) modifies TCP to optimize performance in a satellite environment. While SCPS-TP has inherent advantages that allow it to deliver data more rapidly than Multiple Pipes, the protocol, when optimized for operation in a high-error environment, is not compatible with legacy TCP systems and requires changes to the TCP specification. This investigation determines the level of improvement offered by SCPS-TP's Corruption Mode, which will help determine whether migration to the protocol is appropriate in different environments. As the percentage of corrupted packets approaches 5%, Multiple Pipes can take over five times longer than SCPS-TP to deliver data. At high error rates, SCPS-TP's advantage is primarily caused by Multiple Pipes' use of congestion control algorithms. The lack of congestion control, however, limits the systems in which SCPS-TP can be effectively used.

  12. Performance analysis of a fault-tolerant distributed multimedia server

    NASA Astrophysics Data System (ADS)

    Derryberry, Barbara

    1998-12-01

    The evolving demands on networks to support Webtone, H.323, AIN, and other advanced services require multimedia servers that can deliver a number of value-added capabilities, such as negotiating protocols, delivering network services, and responding to QoS requests. The server is one of the primary limiters on network capacity. The next-generation server must be based upon a flexible, robust, scalable, and reliable platform to keep abreast of the revolutionary pace of service demand and development while continuing to provide the same dependability that voice networks have provided for decades. A new distributed platform, based upon the Totem fault-tolerant messaging system, is described. Processor and network resources are modeled and analyzed. Quantitative results are presented that assess this platform in terms of messaging capacity and performance for various architecture and design options, including processing technologies and fault-tolerance modes. The impacts of fault-tolerant messaging are identified based upon analytical modeling of the proposed server architecture.

  13. Automated DSM Extraction from UAV Images and Performance Analysis

    NASA Astrophysics Data System (ADS)

    Rhee, S.; Kim, T.

    2015-08-01

    As technology evolves, unmanned aerial vehicle (UAV) imagery is being used for applications ranging from simple image acquisition to complicated tasks such as 3D spatial information extraction. Spatial information is usually provided in the form of a DSM or point cloud, and it is important to generate very dense tie points automatically from stereo images. In this paper, we applied a stereo image-based matching technique developed for satellite/aerial images to UAV images, proposed processing steps for automated DSM generation, and analysed the feasibility of DSM generation. For DSM generation from UAV images, exterior orientation parameters (EOPs) for each dataset were first adjusted. Secondly, optimum matching pairs were determined. Thirdly, stereo image matching was performed for each pair. The matching algorithm is based on grey-level correlation of pixels applied along epipolar lines. Finally, the extracted match results were merged into one result and the final DSM was produced. The generated DSM was compared with a reference DSM from lidar; overall accuracy was 1.5 m in NMAD. However, several problems remain to be solved in the future, including obtaining precise EOPs and handling occlusion and image-blurring problems. A more effective interpolation technique also needs to be developed.

  14. Compact time- and space-integrating SAR processor: performance analysis

    NASA Astrophysics Data System (ADS)

    Haney, Michael W.; Levy, James J.; Michael, Robert R., Jr.; Christensen, Marc P.

    1995-06-01

    Progress made during the previous 12 months toward the fabrication and test of a flight demonstration prototype of the acousto-optic time- and space-integrating real-time SAR image formation processor is reported. Compact, rugged, and low-power analog optical signal processing techniques are used for the most computationally taxing portions of the SAR imaging problem to overcome the size and power consumption limitations of electronic approaches. Flexibility and performance are maintained by the use of digital electronics for the critical low-complexity filter generation and output image processing functions. The results reported for this year include tests of a laboratory version of the RAPID SAR concept on phase history data generated from real high-resolution SAR imagery; a description of the new compact 2D acousto-optic scanner, with a 2D space-bandwidth product approaching 10^6 spots, specified and procured from NEOS Technologies during the last year; and a design and layout of the optical module portion of the flight-worthy prototype.

  15. Building America Performance Analysis Procedures for Existing Homes

    SciTech Connect

    Hendron, R.

    2006-05-01

    Because there are more than 101 million residential households in the United States today, it is not surprising that existing residential buildings represent an extremely large source of potential energy savings. Because thousands of these homes are renovated each year, Building America is investigating the best ways to make existing homes more energy-efficient, based on lessons learned from research in new homes. The Building America program is aiming for a 20%-30% reduction in energy use in existing homes by 2020. The strategy for the existing homes project of Building America is to establish technology pathways that reduce energy consumption cost-effectively in American homes. The existing buildings project focuses on finding ways to adapt the results from the new homes research to retrofit applications in existing homes. Research activities include a combination of computer modeling, field demonstrations, and long-term monitoring to support the development of integrated approaches to reduce energy use in existing residential buildings. Analytical tools are being developed to guide designers and builders in selecting the best approaches for each application. Also, DOE partners with the U.S. Environmental Protection Agency (EPA) to increase energy efficiency in existing homes through the Home Performance with ENERGY STAR program.

  16. Engineering sciences area and module performance and failure analysis area

    NASA Technical Reports Server (NTRS)

    Ross, R. G., Jr.; Runkle, L. D.

    1982-01-01

    Photovoltaic-array/power-conditioner interface studies are updated. An experiment conducted to evaluate different operating-point strategies, such as constant voltage and pilot cells, and to determine array energy losses when the array is operated off the maximum power points is described. Initial results over a test period of three and a half weeks showed a 2% energy loss when the array is operated at a fixed voltage. Degraded-array studies conducted at NE RES that used a range of simulated common types of degraded I-V curves are reviewed. The instrumentation installed at the JPL field-test site to obtain irradiance data is described. Experiments using an optical filter to adjust the spectral irradiance of the large-area pulsed solar simulator (LAPSS) to AM1.5 are described. Residential-array research activity is reviewed. Voltage isolation test results are described. Experiments performed on one type of module to determine the relationship between leakage current and temperature are reviewed. An encapsulated-cell testing approach is explained. The test program, data reduction methods, and initial results of long-duration module testing are described.

  17. New analysis and performance of a wall-current monitor

    NASA Astrophysics Data System (ADS)

    Suwada, T.; Tamiya, K.; Urano, T.; Kobayashi, H.; Asami, A.

    1997-02-01

    A new wall-current monitor has been developed to reinforce the beam-monitoring system in the PF 2.5-GeV linac for the KEK B-Factory. A prototype monitor was tested for its performance and characteristics. Experimental results from both bench tests and beam tests with single-bunch electron beams were analyzed on the basis of equivalent-circuit models. The frequency response of the monitor agreed well with a lumped equivalent-circuit model in both time- and frequency-domain measurements. The position dependence of the monitor and its frequency characteristics also agreed well with a distributed equivalent-circuit model in both time- and frequency-domain measurements. The rise time of the monitor was about 3 ns, which indicates a poor response for short-pulse beams (< 1 ns). This limitation can be attributed to the stray inductance of the ceramic solid resistor and the poor frequency response of the ferrite core.
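
    The reported rise time can be related to bandwidth with the standard single-pole rule of thumb t_r ~ 0.35 / f_3dB. The back-of-the-envelope check below is not from the paper; it simply shows why a ~3 ns rise time is too slow for sub-nanosecond pulses.

        # Single-pole rise-time/bandwidth estimate (illustrative only):
        # t_r ~ 0.35 / f_3dB, so a 3 ns rise time implies roughly 120 MHz
        # of bandwidth, far short of what sub-nanosecond pulses require.
        t_r = 3e-9                 # measured 10-90% rise time, seconds
        f_3db = 0.35 / t_r         # approximate 3 dB bandwidth, Hz
        print(f"approx. bandwidth: {f_3db / 1e6:.0f} MHz")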

  18. Performance analysis of a medical record exchanges model.

    PubMed

    Huang, Ean-Wen; Liou, Der-Ming

    2007-03-01

    Electronic medical record exchange among hospitals can provide more information for physician diagnosis and reduce the costs of duplicate examinations. In this paper, we propose and implement a medical record exchange model. In our design, exchange interface servers (EISs) manage information communication through the intra- and interhospital networks linked with a medical records database, and an index service center is responsible for managing the EISs and publishing their addresses and public keys. The prototype system has been implemented to generate, parse, and transfer Health Level Seven (HL7) query messages; it can also encrypt and decrypt a message using a public-key encryption algorithm. Queuing theory is applied to evaluate the performance of the proposed model: we estimated the service time for each queue of the CPU, database, and network, and measured the response time and possible bottlenecks of the model. The capacity of the model is estimated at about 4000 patients/h in the 1-MB network backbone environment, which is about 4% of the total outpatient volume in Taiwan.
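
    As a rough illustration of the kind of queueing estimate described above, each resource can be modelled as an M/M/1 queue and the mean response times summed along the request path. The service times and arrival rate below are assumed values chosen for illustration, not the paper's measurements.

        # Hedged M/M/1 sketch: utilisation rho = lambda * s, mean response
        # time W = s / (1 - rho) per resource; values are assumed.
        arrival_rate = 4000 / 3600.0                 # requests per second (~4000/h)
        service_times = {"cpu": 0.05, "database": 0.20, "network": 0.30}  # seconds

        total_response = 0.0
        for name, s in service_times.items():
            rho = arrival_rate * s                   # utilisation of this resource
            assert rho < 1.0, f"{name} saturates: it is the bottleneck"
            w = s / (1.0 - rho)                      # M/M/1 mean response time
            total_response += w
            print(f"{name:8s}: utilisation {rho:.2f}, response {w * 1000:.0f} ms")
        print(f"end-to-end response: {total_response * 1000:.0f} ms")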

  19. Lunar lander configuration study and parametric performance analysis

    NASA Astrophysics Data System (ADS)

    Donahue, Benjamin B.; Fowler, C. R.

    1993-06-01

    Future Lunar exploration plans will call for delivery of significant amounts of cargo to provide for crew habitation, surface transportation, and scientific exploration activities. Minimization of costly surface-based infrastructure is in large part directly related to the design of the cargo delivery/landing craft. This study evaluated Lunar lander concepts from a logistics-oriented perspective; it outlines the approach used in the development of a preferred configuration, sets forth the benefits derived from its utilization, and describes the missions and systems considered. Results indicate that only direct-to-surface delivery of payloads provides for unassisted cargo removal operations, which are imperative to efficient and low-risk site buildup (including the emplacement of Space Station-derivative surface habitat modules); immediate cargo jettison for both descent abort and emergency surface ascent, essential to piloted missions carrying cargo; and short habitat egress/ingress paths, necessary for productive surface work tours by crew members carrying hand-held experiments, tools, and other bulky articles. By accommodating cargo underneath the vehicle's structural frame, the landing craft described herein eliminate the need for dedicated surface-based off-loading vehicles, the operations and maintenance associated with them, and the precipitous ladder climbs to and from the surface inherent to traditional designs. Parametric evaluations illustrate performance and mass variation with respect to mission requirements.

  20. T-wave alternans: clinical performance, limitations and analysis methodologies.

    PubMed

    Garcia, Euler V; Pastore, Carlos Alberto; Samesima, Nelson; Pereira Filho, Horácio G

    2011-03-01

    Accurate recognition of individuals at higher immediate risk of sudden cardiac death (SCD) is still an open question. The fortuitous nature of acute cardiovascular events just does not seem to fit the well-known model of ventricular tachycardia/fibrillation induction in a static arrhythmogenic substrate by a synchronous trigger. As a mechanism of SCD, a dynamic electrical instability would better explain the rarity of the simultaneous association of a correct trigger and an appropriate cardiac substrate. Several studies have been conducted trying to measure this cardiac electrical instability (or any valid surrogate) in an ECG beat stream. Among the current candidates are QT prolongation, QT dispersion, late potentials, T-wave alternans (TWA), and heart rate turbulence. This article reviews the particular role of TWA in the current cardiac risk stratification scenario. TWA findings are still heterogeneous, ranging from very good to nearly null prognostic performance depending on the clinical population observed and the clinical protocol in use. To fill the current gaps in the TWA knowledge base, practitioners and researchers should better explore the technical features of the several technologies available for TWA evaluation and pay greater attention to the fact that TWA values are responsive to several factors other than medications. Information about the cellular and subcellular mechanisms of TWA is outside the scope of this article, but the reader is referred to some of the good papers available on this topic whenever this extra information could help the understanding of the concepts and facts covered herein. PMID:21359487

  1. Performance analysis of bullet trajectory estimation: Approach, simulation, and experiments

    SciTech Connect

    Ng, L.C.; Karr, T.J.

    1994-11-08

    This paper describes an approach to estimating a bullet's trajectory from a time sequence of angles-only observations from a high-speed camera, and analyzes its performance. The technique is based on fitting a ballistic model of a bullet in flight, along with unknown source-location parameters, to a time series of angular observations. The theory is developed to precisely reconstruct, from firing-range geometry, the actual bullet trajectory as it appeared on the focal plane array and in real space. A metric for measuring the effective trajectory track error is also presented. Detailed Monte-Carlo simulations assuming different bullet ranges, shot angles, camera frame rates, and angular noise show that the angular track error can be as small as 100 μrad for a 2 mrad/pixel sensor. It is also shown that if actual values of the bullet's ballistic parameters were available, the bullet's source-location variables and angle-of-flight information could also be determined.
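
    A greatly simplified version of the fitting step can be sketched as a nonlinear least-squares problem on the angular residuals. The sketch below assumes a 2-D, constant-velocity bullet observed by a camera at the origin, with the down-range distance fixed to resolve the scale ambiguity that angles-only data leave (the paper resolves it with the full ballistic model instead); all numbers are synthetic.

        # Toy angles-only trajectory fit; not the paper's ballistic model.
        import numpy as np
        from scipy.optimize import least_squares

        X0 = 200.0   # assumed known down-range distance (m); fixing it
                     # removes the angles-only scale ambiguity.

        def residuals(p, t, theta_obs):
            y0, vx, vy = p                            # height offset and velocity
            return np.arctan2(y0 + vy * t, X0 + vx * t) - theta_obs

        rng = np.random.default_rng(0)
        t = np.arange(0.0, 0.05, 1e-3)                # 1 kHz frames for 50 ms
        true_y0, true_vx, true_vy = 10.0, -300.0, -2.0
        theta = np.arctan2(true_y0 + true_vy * t, X0 + true_vx * t)
        theta += 100e-6 * rng.standard_normal(t.size) # 100 urad angular noise

        fit = least_squares(residuals, x0=[5.0, -250.0, 0.0], args=(t, theta))
        print("estimated [y0, vx, vy]:", fit.x.round(2))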

  2. Analysis of radiation performances of plasma sheet antenna

    NASA Astrophysics Data System (ADS)

    Yin, Bo; Zhang, Zu-Fan; Wang, Ping

    2015-12-01

    A novel concept of plasma sheet antennas is presented in this paper, and their radiation performances are investigated in detail. Firstly, a model of a planar plasma antenna (PPA) fed by a microstrip line is developed; its reflection coefficient is computed by the JE convolution finite-difference time-domain method and compared with that of a metallic patch antenna. It is found that the design of a PPA can draw on the theory of the metallic patch antenna, and that impedance matching and retuning of the resonant frequency can be conveniently realized by adjusting the plasma parameters. The PPA is then mounted on a metallic cylindrical surface, and the reflection coefficient of the conformal plasma antenna (CPA) is computed, along with the influence of the conformal cylinder radius on the reflection coefficient. Finally, the radiation pattern of a CPA is given; the results show that the pattern agrees well with that of the PPA in the main radiation direction, but the side-lobe level deteriorates significantly.

  3. Performance Analysis of an Improved MUSIC DoA Estimator

    NASA Astrophysics Data System (ADS)

    Vallet, Pascal; Mestre, Xavier; Loubaton, Philippe

    2015-12-01

    This paper addresses the statistical performance of subspace DoA estimation using a sensor array, in the asymptotic regime where the number of samples and the number of sensors both converge to infinity at the same rate. Improved subspace DoA estimators (termed G-MUSIC) were derived in previous works and were shown to be consistent and asymptotically Gaussian distributed in the case where the number of sources and their DoAs remain fixed. In this case, which models widely spaced DoA scenarios, it is proved in the present paper that the traditional MUSIC method also provides consistent DoA estimates, with the same asymptotic variances as the G-MUSIC estimates. The case of DoAs separated on the order of a beamwidth, which models closely spaced sources, is also considered. It is shown that G-MUSIC estimates are still able to consistently separate the sources, whereas the MUSIC estimates no longer are. The asymptotic variances of the G-MUSIC estimates are also evaluated.
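
    For reference, the classical MUSIC pseudospectrum that the paper takes as its baseline can be sketched in a few lines for a half-wavelength uniform linear array. The G-MUSIC corrections analyzed in the paper are not implemented here, and the array size, snapshot count, and source angles below are arbitrary test values.

        # Classical MUSIC sketch: noise-subspace projection of steering vectors.
        import numpy as np

        def steering(theta, m):
            # half-wavelength ULA steering vector for DoA theta (radians)
            return np.exp(1j * np.pi * np.arange(m) * np.sin(theta))

        def music_spectrum(X, n_sources, grid):
            m = X.shape[0]
            R = X @ X.conj().T / X.shape[1]          # sample covariance matrix
            w, V = np.linalg.eigh(R)                 # eigenvalues ascending
            En = V[:, : m - n_sources]               # noise subspace
            return np.array([1.0 / np.linalg.norm(En.conj().T @ steering(th, m)) ** 2
                             for th in grid])

        rng = np.random.default_rng(1)
        m, n = 20, 100
        doas = np.deg2rad([-10.0, 20.0])             # two widely spaced sources
        A = np.column_stack([steering(th, m) for th in doas])
        S = (rng.standard_normal((2, n)) + 1j * rng.standard_normal((2, n))) / np.sqrt(2)
        noise = (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))) / np.sqrt(2)
        X = A @ S + noise

        grid = np.deg2rad(np.linspace(-90.0, 90.0, 721))
        spec = music_spectrum(X, n_sources=2, grid=grid)
        peaks = np.where((spec[1:-1] > spec[:-2]) & (spec[1:-1] > spec[2:]))[0] + 1
        top2 = peaks[np.argsort(spec[peaks])[-2:]]   # two strongest local maxima
        print("estimated DoAs (deg):", np.sort(np.rad2deg(grid[top2])).round(1))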

  4. The Performance Analysis Based on SAR Sample Covariance Matrix

    PubMed Central

    Erten, Esra

    2012-01-01

    Multi-channel systems appear in several fields of application in science. In the Synthetic Aperture Radar (SAR) context, multi-channel systems may refer to different domains, such as multi-polarization, multi-interferometric, or multi-temporal data, or even a combination of them. Due to the inherent speckle phenomenon present in SAR images, a statistical description of the data is almost mandatory for its utilization. Complex images acquired over natural media generally present zero-mean circular Gaussian characteristics; in this case, second-order statistics such as the multi-channel covariance matrix fully describe the data. In practical situations, however, the covariance matrix has to be estimated using a limited number of samples, and this sample covariance matrix follows the complex Wishart distribution. In this context, the eigendecomposition of the multi-channel covariance matrix has been shown to be highly relevant in different areas concerned with the physical properties of the imaged scene. Specifically, the maximum eigenvalue of the covariance matrix has frequently been used in applications such as target or change detection, estimation of the dominant scattering mechanism in polarimetric data, and moving target indication. In this paper, the statistical behavior of the maximum eigenvalue derived from the eigendecomposition of the sample multi-channel covariance matrix is presented in a form simplified for the SAR community. Validation is performed against simulated data, and examples of estimation and detection problems using the analytical expressions are given as well. PMID:22736976
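
    The statistic in question is easy to explore by simulation: draw n independent zero-mean circular Gaussian looks, form the sample covariance, and record its largest eigenvalue. The Monte-Carlo sketch below is an illustration of the setting, not the paper's closed-form result; the dimensions are arbitrary.

        # Monte-Carlo look at the largest eigenvalue of a complex-Wishart
        # sample covariance; note its upward bias relative to the true value.
        import numpy as np

        rng = np.random.default_rng(0)
        p, n, trials = 3, 25, 10000      # channels, looks, Monte-Carlo runs

        max_eigs = np.empty(trials)
        for i in range(trials):
            Z = (rng.standard_normal((p, n)) + 1j * rng.standard_normal((p, n))) / np.sqrt(2)
            R = Z @ Z.conj().T / n       # sample covariance (true covariance = I)
            max_eigs[i] = np.linalg.eigvalsh(R)[-1]

        print(f"mean largest eigenvalue: {max_eigs.mean():.3f} (true eigenvalue 1.0)")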

  5. Performance Analysis of Digital Tracking Loops for Telemetry Ranging Applications

    NASA Astrophysics Data System (ADS)

    Vilnrotter, V.; Hamkins, J.; Xie, H.; Ashrafi, S.

    2015-08-01

    In this article, we analyze mathematical models of the digital loops used to track the phase and timing of communications and navigation signals. The limits on the accuracy of phase and timing estimates play a critical role in the accuracy achievable in telemetry ranging applications. We describe in detail a practical algorithm to compute the loop parameters for discrete-update (DU) and continuous-update (CU) loop formulations, and we show that a simple power-series approximation to the DU model is valid over a large range of time-bandwidth products. Several numerical examples compare the estimation error variance of the DU and CU models to each other and to Cramer-Rao lower bounds. Finally, the results are applied to the ranging problem by evaluating the performance of a phase-locked loop designed to track a typical ambiguity-resolving pseudonoise (PN) code received and demodulated at the spacecraft on the uplink part of the two-way ranging link, and of a data transition tracking loop (DTTL) on the downlink part.
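
    The flavor of a discrete-update loop can be conveyed with a toy first-order phase tracker whose gain plays the role of the loop's time-bandwidth product. This is a hedged illustration only; the gain, noise level, and geometry are assumed values, and none of the article's DU/CU machinery is reproduced.

        # Toy first-order discrete-update phase tracker.
        import numpy as np

        rng = np.random.default_rng(0)
        n_steps, gain = 2000, 0.05        # update count and loop gain (B_L*T analogue)
        true_phase = 0.3                  # constant phase offset to acquire, radians

        est, errors = 0.0, []
        for _ in range(n_steps):
            measured = true_phase + 0.5 * rng.standard_normal()  # noisy phase detector
            est += gain * (measured - est)                       # DU loop update
            errors.append(true_phase - est)

        # For a first-order loop the steady-state noise variance scales roughly
        # as gain/(2 - gain), mirroring the time-bandwidth dependence above.
        print(f"steady-state error std: {np.std(errors[500:]):.4f} rad")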

  6. Design and demonstrate the performance of cryogenic components representative of space vehicles: Start basket liquid acquisition device performance analysis

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The objective was to design, fabricate and test an integrated cryogenic test article incorporating both fluid and thermal propellant management subsystems. A 2.2 m (87 in) diameter aluminum test tank was outfitted with multilayer insulation, helium purge system, low-conductive tank supports, thermodynamic vent system, liquid acquisition device and immersed outflow pump. Tests and analysis performed on the start basket liquid acquisition device and studies of the liquid retention characteristics of fine mesh screens are discussed.

  7. Performance analysis of bonded composite doublers on aircraft structures

    SciTech Connect

    Roach, D.

    1995-08-01

    Researchers contend that composite repairs (or structural reinforcement doublers) offer numerous advantages over metallic patches, including corrosion resistance, light weight, high strength, elimination of rivets, and time savings in installation. Their use in commercial aviation has been stifled by uncertainties surrounding their application, subsequent inspection, and long-term endurance. The process of repairing or reinforcing airplane structures is time consuming, and the design is dependent upon an accompanying stress and fatigue analysis. A repair that is too stiff may result in a loss of fatigue life, continued growth of the crack being repaired, and the initiation of a new flaw in the undesirable high-stress field around the patch. Uncertainties in the load spectra used to design repairs exacerbate these problems, as does the use of rivets to apply conventional doublers. Many of these repair or structural reinforcement difficulties can be addressed through the use of composite doublers. Primary among the unknowns are the effects of non-optimum installations and the certification of adequate inspection procedures. This paper presents an overview of a program intended to introduce composite doubler technology to the US commercial aircraft fleet. In this project, a specific composite application has been chosen on an L-1011 aircraft in order to focus the tasks on application and operation issues. Through the use of laboratory test structures and flight demonstrations on an in-service L-1011 airplane, this study is investigating composite doubler design, fabrication, installation, structural integrity, and non-destructive evaluation. In addition to providing an overview of the L-1011 project, this paper focuses on a series of fatigue and strength tests which have been conducted in order to study the damage tolerance of composite doublers. Test results to date are presented.

  8. A Covariance Analysis Tool for Assessing Fundamental Limits of SIM Pointing Performance

    NASA Technical Reports Server (NTRS)

    Bayard, David S.; Kang, Bryan H.

    2007-01-01

    This paper presents a performance analysis of the instrument pointing control system for NASA's Space Interferometer Mission (SIM). SIM has a complex pointing system that uses a fast steering mirror in combination with a multirate control architecture to blend feedforward information with feedback information. A pointing covariance analysis tool (PCAT) is developed specifically to analyze systems with such complexity. The development of PCAT as a mathematical tool for covariance analysis is outlined in the paper. PCAT is then applied to studying the performance of SIM's science pointing system. The analysis reveals and clearly delineates a fundamental limit that exists for SIM pointing performance. The limit is especially stringent for dim star targets. The nature of the performance limit is discussed, and methods are suggested to potentially improve pointing performance.

  9. Performance analysis of mini-propellers based on FlightGear

    NASA Astrophysics Data System (ADS)

    Vogeltanz, Tomáš

    2016-06-01

    This paper presents a performance analysis of three mini-propellers based on the FlightGear flight simulator. Although a basic propeller analysis has to be performed before FlightGear can be used, it is advantageous for a complex and more practical performance analysis to use a propeller model in conjunction with a particular aircraft model. This approach can determine whether the propeller is of sufficient quality with respect to aircraft requirements. The first section describes the software used for the analysis. The parameters of the analyzed mini-propellers and the tested UAV are then described. Finally, the main section presents and discusses the results of the performance analysis of the mini-propellers.

  10. Performance Analysis of Occurrences January 1, 2011-December 31, 2011

    SciTech Connect

    Ludwig, M

    2012-03-16

    This report documents the analysis of occurrences during the period January 1, 2011 through December 31, 2011. The report compares LLNL occurrences by reporting criteria and significance category to see whether LLNL is reporting occurrences at percentages similar to other DOE sites, and analyzes three-year trends. It does not include analysis of the causes of or the lessons learned from the occurrences, as these are analyzed separately. The number and types of occurrences that LLNL reports to DOE vary over time. This variation can be attributed to normally occurring changes in frequency; DOE's or LLNL's heightened interest in a particular subject area; changes in LLNL processes; or emerging problems. Since all of the DOE sites use the same reporting criteria, it is helpful to understand whether LLNL is consistent with or diverging from reporting at other sites. This section compares the normalized number of occurrences reported by LLNL and other DOE sites. In order to compare LLNL occurrence reports to occurrence reports from other DOE sites, we normalized (or standardized) the data from the sites. DOE sites vary widely in their budgets, populations, and scope of work, and these variations may affect reporting frequency. In addition, reports are required for a wide range of occurrence types, some of which may not be applicable to all DOE sites. For example, one occurrence reporting group is Group 3, Nuclear Safety Basis, and not all sites have nuclear operations. Because limited information is available for all sites, the sites were normalized based on the best available information. Site effort hours were extracted from the DOE Computerized Accident Incident Reporting System (CAIRS) and used to normalize the number of occurrences by site. Effort hours are those hours that employees normally work and do not include vacation or holiday hours. Sites are responsible for calculating their effort hours and ensuring entry into CAIRS. Out of the 30 DOE

  11. Space tug economic analysis study. Volume 2: Tug concepts analysis. Appendix: Tug design and performance data base

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The tug design and performance data base for the economic analysis of space tug operation are presented. A compendium of the detailed design and performance information from the data base is developed. The design data are parametric across a range of reusable space tug sizes. The performance curves are generated for selected point designs of expendable orbit injection stages and reusable tugs. Data are presented in the form of graphs for various modes of operation.

  12. Development and performance analysis of a metallic micro-direct methanol fuel cell for high-performance applications

    NASA Astrophysics Data System (ADS)

    Zhang, Bo; Zhang, Yufeng; He, Hong; Li, Jianmin; Yuan, Zhenyu; Na, Chaoran; Liu, Xiaowei

    As a promising alternative to conventional micro-power sources, the micro-direct methanol fuel cell (μDMFC) is currently attracting increased attention due to its various advantages and prospective suitability for portable applications. This paper reports the design, fabrication, and analysis of a high-performance μDMFC with two metal current collectors. Employing micro-stamping technology, the current collectors are fabricated on 300-μm-thick stainless steel plates. The flow fields for both cathode and anode are uniform in shape and size. Two sheets of stainless steel mesh are added between the membrane electrode assembly (MEA) and the current collectors in order to improve cell performance. To avoid electrochemical corrosion, titanium nitride (TiN) layers 500 nm thick are deposited onto the surfaces of the current collectors and the stainless steel mesh. The performance of this metallic μDMFC is thoroughly studied by both simulation and experiment. The results show that all the parameters investigated, including current collector material, stainless steel mesh, anode feeding mode, methanol concentration, anode flow rate, and operating temperature, have significant effects on cell performance. Moreover, under optimal operating conditions the metallic μDMFC exhibits promising performance, yielding a maximum power density of 65.66 mW cm^-2 at 40 °C and 115.0 mW cm^-2 at 80 °C.

  13. Water-quality and biological conditions in the Lower Boise River, Ada and Canyon Counties, Idaho, 1994-2002

    USGS Publications Warehouse

    MacCoy, Dorene E.

    2004-01-01

    The water quality and biotic integrity of the lower Boise River between Lucky Peak Dam and the river's mouth near Parma, Idaho, have been affected by agricultural land and water use, wastewater treatment facility discharge, urbanization, reservoir operations, and river channel alteration. The U.S. Geological Survey (USGS) and cooperators have studied water-quality and biological aspects of the lower Boise River in the past to address water-quality concerns and issues brought forth by the Clean Water Act of 1977. Past and present issues include preservation of beneficial uses of the river for fisheries, recreation, and irrigation; and maintenance of high-quality water for domestic and agricultural uses. Evaluation of the data collected from 1994 to 2002 by the USGS revealed increases in constituent concentrations in the lower Boise in a downstream direction. Median suspended sediment concentrations from Diversion Dam (downstream from Lucky Peak Dam) to Parma increased more than 11 times, nitrogen concentrations increased more than 8 times, phosphorus concentrations increased more than 7 times, and fecal coliform concentrations increased more than 400 times. Chlorophyll-a concentrations, used as an indicator of nutrient input and the potential for nuisance algal growth, also increased in a downstream direction; median concentrations were highest at the Middleton and Parma sites. There were no discernible temporal trends in nutrients, sediment, or bacteria concentrations over the 8-year study. The State of Idaho's temperature standards to protect coldwater biota and salmonid spawning were exceeded most frequently at Middleton and Parma. Suspended sediment concentrations exceeded criteria proposed by Idaho Department of Environmental Quality most frequently at Parma and at all but three tributaries. Total nitrogen concentrations at Glenwood, Middleton, and Parma exceeded national background levels; median flow-adjusted total nitrogen concentrations at Middleton and Parma were higher than those in undeveloped basins sampled nationwide by the USGS. Total phosphorus concentrations at Glenwood, Middleton, and Parma also exceeded those in undeveloped basins. Macroinvertebrate and fish communities were used to evaluate the long-term integration of water-quality contaminants and loss of habitat in the lower Boise. Biological integrity of the macroinvertebrate population was assessed with the attributes (metrics) of Ephemeroptera, Plecoptera, and Trichoptera (EPT) richness and metrics used in the Idaho River Macroinvertebrate Index (RMI): taxa richness; EPT richness; percent dominant taxon; percent Elmidae (riffle beetles); and percent predators. Average EPT was about 10, and RMI scores were frequently below 16, which indicated intermediate or poor water quality. The number of EPT taxa and RMI scores for the lower Boise were half those for least-impacted streams in Idaho. The fine sediment bioassessment index (FSBI) was used to evaluate macroinvertebrate sediment tolerance. The FSBI scores were lower than those for a site upstream in the Boise River Basin near Twin Springs, a site not impacted by urbanization and agriculture, which indicated that the lower Boise macroinvertebrate population may be impacted by fine sediment. Macroinvertebrate functional feeding groups and percent tolerant species, mainly at Middleton and Parma, were typical of those in areas of degraded water quality and habitat.
The biological integrity of the fish population was evaluated using the Idaho River Fish Index (RFI), which consists of 10 metrics: number of coldwater native species, percent sculpin, percent coldwater species, percent sensitive native individuals, percent tolerant individuals, number of nonindigenous species, number of coldwater fish captured per minute of electrofishing, percent of fish with deformities (eroded fins, lesions, or tumors), number of trout age classes, and percent carp. RFI scores for lower Boise sites indicated a d

  14. Analysis of sampling and quantization effects on the performance of PN code tracking loops

    NASA Technical Reports Server (NTRS)

    Quirk, K. J.; Srinivasan, M.

    2002-01-01

    Pseudonoise (PN) code tracking loops in direct-sequence spread-spectrum systems are often implemented using digital hardware. Performance degradation due to quantization and sampling effects is not adequately characterized by the traditional analog system feedback loop analysis.

  15. Analytical techniques and instrumentation: A compilation. [analytical instrumentation, materials performance, and systems analysis

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.

  16. Performance Cycle Analysis of a Two-Spool, Separate-Exhaust Turbofan With Interstage Turbine Burner

    NASA Technical Reports Server (NTRS)

    Liew, K. H.; Urip, E.; Yang, S. L.; Mattingly, J. D.; Marek, C. J.

    2005-01-01

    This paper presents the performance cycle analysis of a dual-spool, separate-exhaust turbofan engine with an interstage turbine burner (ITB) serving as a secondary combustor. The ITB, which is located at the transition duct between the high- and the low-pressure turbines, is a relatively new concept for increasing specific thrust and lowering pollutant emissions in modern jet engine propulsion. A detailed performance analysis of this engine has been conducted for steady-state engine performance prediction. A code was written that is capable of predicting engine performance (i.e., thrust and thrust-specific fuel consumption) at varying flight conditions and throttle settings. Two design-point engines were studied to reveal performance trends at both full and partial throttle operation. A mission analysis is also presented to confirm the fuel-saving advantage of adding an ITB.

  17. Comment on 'Performance analysis in football: a critical review and implications for future research'.

    PubMed

    Carling, Christopher; Wright, Craig; Nelson, Lee John; Bradley, Paul S

    2014-01-01

    MacKenzie and Cushion (2013) recently reviewed performance analysis research in association football (soccer). Their critical review focused on several themes related to methodological approaches such as sample size, match context and operational definitions and the implications of research findings to professional practice. In this response letter, we comment on additional pragmatic concerns regarding these key themes as well as some of the difficulties commonly faced when conducting performance analysis research.

  18. Performance characterization of image and video analysis systems at Siemens Corporate Research

    NASA Astrophysics Data System (ADS)

    Ramesh, Visvanathan; Jolly, Marie-Pierre; Greiffenhagen, Michael

    2000-06-01

    There has been a significant increase in commercial products using imaging analysis techniques to solve real-world problems in diverse fields such as manufacturing, medical imaging, document analysis, transportation, and public security. This has been accelerated by various factors: more advanced algorithms, the availability of cheaper sensors, and faster processors. While algorithms continue to improve in performance, a major stumbling block in translating those improvements into faster deployment of image analysis systems is the lack of characterization of the limits of algorithms and how they affect total system performance. The research community has realized the need for performance analysis, and there have been significant efforts in the last few years to remedy the situation. Our efforts at SCR have focused on statistical modeling and characterization of modules and systems. The emphasis is on both white-box and black-box methodologies to evaluate and optimize vision systems. In the first part of this paper we review the literature on performance characterization and then provide an overview of the status of research in performance characterization of image and video understanding systems. The second part of the paper covers performance evaluation of medical image segmentation algorithms. Finally, we highlight some research issues in performance analysis in medical imaging systems.

  19. Performance Measures Analysis (PMA) as a Means of Assessing Consistency between Course Objectives and Evaluation Process.

    ERIC Educational Resources Information Center

    Curtiss, Frederic R.; Swonger, Alvin K.

    1981-01-01

    A performance measure analysis process developed by the Competency Based Education Committee of the University of Rhode Island College of Pharmacy to assess the status of the measurement of student performance is described. A taxonomy of levels of learning is appended. (Author/MLW)

  20. Child Maltreatment and School Performance Declines: An Event-History Analysis.

    ERIC Educational Resources Information Center

    Leiter, Jeffrey; Johnsen, Matthew C.

    1997-01-01

    Presents a longitudinal analysis of school performance declines among neglected and abused children, using the maltreatment and school histories of 1,369 children in Mecklenburg County, North Carolina. Significant relationships between maltreatment and declines in performance were found in diverse school outcomes. (SLD)

  1. The Effect of Goal Setting on Group Performance: A Meta-Analysis

    ERIC Educational Resources Information Center

    Kleingeld, Ad; van Mierlo, Heleen; Arends, Lidia

    2011-01-01

    Updating and extending the work of O'Leary-Kelly, Martocchio, and Frink (1994), with this meta-analysis on goal setting and group performance we show that specific difficult goals yield considerably higher group performance compared with nonspecific goals (d = 0.80 plus or minus 0.35, k = 23 effect sizes). Moderately difficult and easy goals were…

  2. Performance Metrics Development Analysis for Information and Communications Technology Outsourcing: A Case Study

    ERIC Educational Resources Information Center

    Travis, James L., III

    2014-01-01

    This study investigated how and to what extent the development and use of the OV-5a operational architecture decomposition tree (OADT) from the Department of Defense (DoD) Architecture Framework (DoDAF) affects requirements analysis with respect to complete performance metrics for performance-based services acquisition of ICT under rigid…

  3. An Analysis of Factors That Affect the Educational Performance of Agricultural Students

    ERIC Educational Resources Information Center

    Greenway, Gina

    2012-01-01

    Many factors contribute to student achievement. This study focuses on three areas: how students learn, how student personality type affects performance, and how course format affects performance outcomes. The analysis sought to improve understanding of the direction and magnitude with which each of these factors impacts student success. Improved…

  4. Application of Data Envelopment Analysis on the Indicators Contributing to Learning and Teaching Performance

    ERIC Educational Resources Information Center

    Montoneri, Bernard; Lin, Tyrone T.; Lee, Chia-Chi; Huang, Shio-Ling

    2012-01-01

    This paper applies data envelopment analysis (DEA) to explore the quantitative relative efficiency of 18 classes of freshmen students studying a course of English conversation in a university of Taiwan from the academic year 2004-2006. A diagram of teaching performance improvement mechanism is designed to identify key performance indicators for…

  5. Longitudinal Trend Analysis of Performance Indicators for South Carolina's Technical Colleges

    ERIC Educational Resources Information Center

    Hossain, Mohammad Nurul

    2010-01-01

    This study included an analysis of the trend of performance indicators for the technical college sector of higher education in South Carolina. In response to demands for accountability and transparency in higher education, the state of South Carolina developed sector specific performance indicators to measure various educational outcomes for each…

  6. Value-Added and Other Methods for Measuring School Performance: An Analysis of Performance Measurement Strategies in Teacher Incentive Fund Proposals. Research Brief

    ERIC Educational Resources Information Center

    National Center on Performance Incentives, 2008

    2008-01-01

    In "Value-Added and Other Methods for Measuring School Performance: An Analysis of Performance Measurement Strategies in Teacher Incentive Fund Proposals"--a paper presented at the February 2008 National Center on Performance Incentives research to policy conference--Robert Meyer and Michael Christian examine select performance-pay plans used by…

  7. Hope and optimism: latent structures and influences on grade expectancy and academic performance.

    PubMed

    Rand, Kevin L

    2009-02-01

    A synthesized model of trait hope (Snyder 1994, 2002) and trait optimism (Scheier & Carver, 1985) is proposed. In this model hope and optimism are conceptualized as facets of an overarching trait called goal attitude. Structural equation modeling is used to test the plausibility of the proposed model in a sample of 345 students in a university psychology course who completed the Adult Hope Scale (Snyder et al., 1991) and the Life Orientation Test-Revised (Scheier, Carver, & Bridges, 1994). The proposed model shows acceptable fit to the observed data. The synthesized model is used to examine the unique and common influences of hope and optimism on grade expectancy and academic performance in 312 students who completed the course. The results show that hope uniquely influenced students' grade expectancies, whereas optimism did not. In turn, grade expectancies influenced academic performance. Neither hope nor optimism had a unique, direct influence on academic performance. In contrast, the shared aspect of hope and optimism (i.e., goal attitude) had a direct influence on academic performance.

  8. Parallels in Academic and Nonacademic Discursive Styles: An Analysis of a Mexican Woman's Narrative Performance

    ERIC Educational Resources Information Center

    Barajas, E. Dominguez

    2007-01-01

    This article presents a rhetorical analysis of a Mexican woman's oral narrative performance using a discourse studies and interactional sociolinguistics framework. The results of the analysis suggest that the discursive practice of the oral narrative and that of academic discourse share certain rhetorical features. These features are (a) the…

  9. 41 CFR 102-80.130 - Who must perform the equivalent level of safety analysis?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire Prevention Equivalent Level of Safety Analysis... 41 Public Contracts and Property Management 3 2012-01-01 2012-01-01 false Who must perform the equivalent level of safety analysis? 102-80.130 Section 102-80.130 Public Contracts and Property...

  10. 41 CFR 102-80.130 - Who must perform the equivalent level of safety analysis?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire Prevention Equivalent Level of Safety Analysis... 41 Public Contracts and Property Management 3 2013-07-01 2013-07-01 false Who must perform the equivalent level of safety analysis? 102-80.130 Section 102-80.130 Public Contracts and Property...

  11. 41 CFR 102-80.130 - Who must perform the equivalent level of safety analysis?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire Prevention Equivalent Level of Safety Analysis... 41 Public Contracts and Property Management 3 2014-01-01 2014-01-01 false Who must perform the equivalent level of safety analysis? 102-80.130 Section 102-80.130 Public Contracts and Property...

  12. COMPONENT, IMAGE, AND FACTOR ANALYSIS OF TESTS OF INTELLECT AND OF MOTOR PERFORMANCE.

    ERIC Educational Resources Information Center

    HARRIS, CHESTER W.; LIBA, MARIE R.

    AN ATTEMPT WAS MADE TO DETERMINE THE EFFECTS OF CERTAIN VARIATIONS IN METHODOLOGY ON THE ANALYSIS OF EXISTING SETS OF DATA IN THE AREAS OF ABILITY OR INTELLIGENCE AND MOTOR PERFORMANCE OR PHYSICAL FITNESS. USING CURRENT DEVELOPMENTS IN THEORY AND METHODS OF FACTOR ANALYSIS DIFFERENT TREATMENTS OF VARIOUS SETS OF DATA, THREE RELATIVELY NEW MODELS…

  13. Evaluating Language Environment Analysis System Performance for Chinese: A Pilot Study in Shanghai

    ERIC Educational Resources Information Center

    Gilkerson, Jill; Zhang, Yiwen; Xu, Dongxin; Richards, Jeffrey A.; Xu, Xiaojuan; Jiang, Fan; Harnsberger, James; Topping, Keith

    2015-01-01

    Purpose: The purpose of this study was to evaluate performance of the Language Environment Analysis (LENA) automated language-analysis system for the Chinese Shanghai dialect and Mandarin (SDM) languages. Method: Volunteer parents of 22 children aged 3-23 months were recruited in Shanghai. Families provided daylong in-home audio recordings using…

  14. Computer Analysis of the Auditory Characteristics of Musical Performance. Final Report.

    ERIC Educational Resources Information Center

    Heller, Jack J.; Campbell, Warren C.

    The purpose of this research was to perform computer analysis and modification of complex musical tones and to develop models of perceptual and learning processes in music. Analysis of the physical attributes of sound (frequency, intensity, and harmonic content, versus time) provided necessary information about the musical parameters of…

  15. Geometrically nonlinear design sensitivity analysis on parallel-vector high-performance computers

    NASA Technical Reports Server (NTRS)

    Baddourah, Majdi A.; Nguyen, Duc T.

    1993-01-01

    Parallel-vector solution strategies for the generation and assembly of element matrices, solution of the resulting system of linear equations, calculation of the unbalanced loads, displacements, and stresses, and design sensitivity analysis (DSA) are all incorporated into the Newton-Raphson (NR) procedure for nonlinear finite element analysis and DSA. Numerical results are included to show the performance of the proposed method for structural analysis and DSA in a parallel-vector computer environment.

  16. A further analysis for the minimum-variance deconvolution filter performance

    NASA Astrophysics Data System (ADS)

    Chi, Chong-Yung

    1987-06-01

    Chi and Mendel (1984) analyzed the performance of minimum-variance deconvolution (MVD). In this correspondence, a further analysis of the performance of the MVD filter is presented. It is shown that the MVD filter behaves like an inverse filter and a whitening filter as the SNR goes to infinity, and like a matched filter as the SNR goes to zero. The estimation error of the MVD filter is colored noise, but it becomes white as the SNR goes to zero. This analysis also connects the error power-spectral density of the MVD filter with the spectrum of the causal-prediction error filter.
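
    These limiting behaviors are consistent with a Wiener-style deconvolution structure. As a hedged sketch in our own notation (not necessarily the authors'), for a channel $V(\omega)$, a white input of intensity $q$, and additive white noise of intensity $\nu$,

        $$ H(\omega) \;=\; \frac{q\,V^{*}(\omega)}{q\,|V(\omega)|^{2} + \nu}, $$

    which tends to the inverse filter $1/V(\omega)$ as $q/\nu \to \infty$ and to the matched filter $(q/\nu)\,V^{*}(\omega)$ as $q/\nu \to 0$, matching the SNR limits stated above.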

  17. Elastic-plastic mixed-iterative finite element analysis: Implementation and performance assessment

    NASA Technical Reports Server (NTRS)

    Sutjahjo, Edhi; Chamis, Christos C.

    1993-01-01

    An elastic-plastic algorithm based on the Von Mises and associative flow criteria is implemented in MHOST, a mixed-iterative finite element analysis computer program developed by NASA Lewis Research Center. The performance of the resulting elastic-plastic mixed-iterative analysis is examined through a set of convergence studies. The membrane and bending behaviors of 4-node quadrilateral shell finite elements are tested for elastic-plastic performance. Generally, the membrane results are excellent, indicating that the implementation of elastic-plastic mixed-iterative analysis is appropriate.

  18. Methodology issues concerning the accuracy of kinematic data collection and analysis using the Ariel Performance Analysis System

    NASA Technical Reports Server (NTRS)

    Wilmington, R. P.; Klute, Glenn K. (Editor); Carroll, Amy E. (Editor); Stuart, Mark A. (Editor); Poliner, Jeff (Editor); Rajulu, Sudhakar (Editor); Stanush, Julie (Editor)

    1992-01-01

    Kinematics, the study of motion exclusive of the influences of mass and force, is one of the primary methods used for the analysis of human biomechanical systems as well as other types of mechanical systems. The Anthropometry and Biomechanics Laboratory (ABL) in the Crew Interface Analysis section of the Man-Systems Division performs both human body kinematics as well as mechanical system kinematics using the Ariel Performance Analysis System (APAS). The APAS supports both analysis of analog signals (e.g. force plate data collection) as well as digitization and analysis of video data. The current evaluations address several methodology issues concerning the accuracy of the kinematic data collection and analysis used in the ABL. This document describes a series of evaluations performed to gain quantitative data pertaining to position and constant angular velocity movements under several operating conditions. Two-dimensional as well as three-dimensional data collection and analyses were completed in a controlled laboratory environment using typical hardware setups. In addition, an evaluation was performed to evaluate the accuracy impact due to a single axis camera offset. Segment length and positional data exhibited errors within 3 percent when using three-dimensional analysis and yielded errors within 8 percent through two-dimensional analysis (Direct Linear Software). Peak angular velocities displayed errors within 6 percent through three-dimensional analyses and exhibited errors of 12 percent when using two-dimensional analysis (Direct Linear Software). The specific results from this series of evaluations and their impacts on the methodology issues of kinematic data collection and analyses are presented in detail. The accuracy levels observed in these evaluations are also presented.

  19. Intrinsic motivation and extrinsic incentives jointly predict performance: a 40-year meta-analysis.

    PubMed

    Cerasoli, Christopher P; Nicklin, Jessica M; Ford, Michael T

    2014-07-01

    More than 4 decades of research and 9 meta-analyses have focused on the undermining effect: namely, the debate over whether the provision of extrinsic incentives erodes intrinsic motivation. This review and meta-analysis builds on such previous reviews by focusing on the interrelationship among intrinsic motivation, extrinsic incentives, and performance, with reference to 2 moderators: performance type (quality vs. quantity) and incentive contingency (directly performance-salient vs. indirectly performance-salient), which have not been systematically reviewed to date. Based on random-effects meta-analytic methods, findings from school, work, and physical domains (k = 183, N = 212,468) indicate that intrinsic motivation is a medium to strong predictor of performance (ρ = .21-.45). The importance of intrinsic motivation to performance remained in place whether incentives were presented. In addition, incentive salience influenced the predictive validity of intrinsic motivation for performance: In a "crowding out" fashion, intrinsic motivation was less important to performance when incentives were directly tied to performance and was more important when incentives were indirectly tied to performance. Considered simultaneously through meta-analytic regression, intrinsic motivation predicted more unique variance in quality of performance, whereas incentives were a better predictor of quantity of performance. With respect to performance, incentives and intrinsic motivation are not necessarily antagonistic and are best considered simultaneously. Future research should consider using nonperformance criteria (e.g., well-being, job satisfaction) as well as applying the percent-of-maximum-possible (POMP) method in meta-analyses. PMID:24491020

  1. Parametric performance analysis of OTEC system using HFC32/HFC134a mixtures

    SciTech Connect

    Uehara, Haruo; Ikegami, Yasuyuki

    1995-11-01

    Parametric performance analysis is performed on an Ocean Thermal Energy Conversion (OTEC) system using HFC32/HFC134a mixtures as the working fluid. The analyzed OTEC system uses the Kalina cycle. The parameters in the performance analysis consist of the warm sea water inlet temperature, the cold sea water inlet temperature, the heat transfer performance of the evaporator, condenser, and regenerator, the turbine inlet pressure, the turbine inlet temperature, and the molar fraction of HFC32. The effects of these various parameters on the efficiency of the Kalina cycle using HFC32/HFC134a mixtures are clarified using this analysis and compared with results calculated for ammonia/water mixtures as the working fluid. The thermal efficiency of an OTEC system using the Kalina cycle can reach about 5 percent with an inlet warm sea water temperature of 28 C and an inlet cold sea water temperature of 4 C.
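
    A quick plausibility check (not from the paper) is the Carnot bound implied by the quoted sea-water temperatures, which caps the efficiency of any OTEC cycle:

        # Carnot limit for the quoted sea-water inlet temperatures.
        T_warm = 28.0 + 273.15   # warm sea water inlet, K
        T_cold = 4.0 + 273.15    # cold sea water inlet, K

        eta_carnot = 1.0 - T_cold / T_warm
        print(f"Carnot limit: {eta_carnot * 100:.1f}%")   # ~8%, so ~5% is plausible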

  2. Performance Analysis of Distributed Applications using Automatic Classification of Communication Inefficiencies

    SciTech Connect

    Vetter, J.

    1999-11-01

    We present a technique for performance analysis that helps users understand the communication behavior of their message passing applications. Our method automatically classifies individual communication operations and it reveals the cause of communication inefficiencies in the application. This classification allows the developer to focus quickly on the culprits of truly inefficient behavior, rather than manually foraging through massive amounts of performance data. Specifically, we trace the message operations of MPI applications and then classify each individual communication event using decision tree classification, a supervised learning technique. We train our decision tree using microbenchmarks that demonstrate both efficient and inefficient communication. Since our technique adapts to the target system's configuration through these microbenchmarks, we can simultaneously automate the performance analysis process and improve classification accuracy. Our experiments on four applications demonstrate that our technique can improve the accuracy of performance analysis, and dramatically reduce the amount of data that users must encounter.
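
    A hedged sketch of this approach is shown below: a decision tree is trained on labelled microbenchmark communication events and then used to classify trace events. The feature set, class labels, and values are invented for illustration; the paper's actual features and tracing infrastructure are not reproduced.

        # Illustrative decision-tree classification of communication events.
        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        # features per MPI event: [message_bytes, wait_time_us, bandwidth_MB_s]
        bench_X = np.array([
            [1024,       5.0,   80.0],   # efficient small message
            [1048576,   90.0,  600.0],   # efficient large message
            [1024,     900.0,    0.1],   # late sender: long wait, tiny bandwidth
            [1048576, 5000.0,   12.0],   # contention: large message, poor bandwidth
        ])
        bench_y = ["efficient", "efficient", "late_sender", "contention"]

        clf = DecisionTreeClassifier(random_state=0).fit(bench_X, bench_y)

        trace_events = np.array([[2048, 700.0, 0.2], [524288, 80.0, 500.0]])
        print(clf.predict(trace_events))   # e.g. ['late_sender' 'efficient']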

  3. Performance (Off-Design) Cycle Analysis for a Turbofan Engine With Interstage Turbine Burner

    NASA Technical Reports Server (NTRS)

    Liew, K. H.; Urip, E.; Yang, S. L.; Mattingly, J. D.; Marek, C. J.

    2005-01-01

    This report presents the performance of a steady-state, dual-spool, separate-exhaust turbofan engine with an interstage turbine burner (ITB) serving as a secondary combustor. The ITB, which is located in the transition duct between the high- and the low-pressure turbines, is a relatively new concept for increasing specific thrust and lowering pollutant emissions in modern jet-engine propulsion. A detailed off-design performance analysis of ITB engines, written as Microsoft Excel macro code with Visual Basic for Applications, calculates engine performance over the entire operating envelope. Several design-point engine cases are pre-selected for off-design analysis using a parametric cycle-analysis code developed previously in Microsoft Excel. The off-design code calculates engine performance (i.e., thrust and thrust-specific fuel consumption) at various flight conditions and throttle settings.

  4. Performance analysis in football: a critical review and implications for future research.

    PubMed

    Mackenzie, Rob; Cushion, Chris

    2013-01-01

    This paper critically reviews existing literature relating to performance analysis (PA) in football, arguing that an alternative approach is warranted. The paper considers the applicability of variables analysed along with research findings in the context of their implications for professional practice. This includes a review of methodological approaches commonly adopted throughout PA research, including a consideration of the nature and size of the samples used in relation to generalisability. Definitions and classifications of variables used within performance analysis are discussed in the context of reliability and validity. The contribution of PA findings to the field is reviewed. The review identifies an overemphasis on researching predictive and performance controlling variables. A different approach is proposed that works with and from performance analysis information to develop research investigating athlete and coach learning, thus adding to applied practice. Future research should pay attention to the social and cultural influences that impact PA delivery and athlete learning in applied settings.

  5. Design and performance of an analysis-by-synthesis class of predictive speech coders

    NASA Technical Reports Server (NTRS)

    Rose, Richard C.; Barnwell, Thomas P., III

    1990-01-01

    The performance of a broad class of analysis-by-synthesis linear predictive speech coders is quantified experimentally. The class of coders includes a number of well-known techniques as well as a very large number of speech coders which have not been named or studied. A general formulation for deriving the parametric representation used in all of the coders in the class is presented. A new coder, named the self-excited vocoder, is discussed because of its good performance with low complexity, and because of the insight this coder gives to analysis-by-synthesis coders in general. The results of a study comparing the performances of different members of this class are presented. The study takes the form of a series of formal subjective and objective speech quality tests performed on selected coders. The results of this study lead to some interesting and important observations concerning the controlling parameters for analysis-by-synthesis speech coders.

  6. Statistical Analysis of the Performance of MDL Enumeration for Multiple-Missed Detection in Array Processing

    PubMed Central

    Du, Fei; Li, Yibo; Jin, Shijiu

    2015-01-01

    An accurate performance analysis of the MDL criterion for source enumeration in array processing is presented in this paper. The enumeration results of MDL can be predicted precisely by the proposed procedure via statistical analysis of the sample eigenvalues, whose distributional properties are investigated with consideration of their interactions. A novel approach is also developed for performance evaluation when the source number is underestimated by a number greater than one, denoted as “multiple-missed detection”, and the probability of a specific underestimated source number can be estimated by ratio distribution analysis. Simulation results are included to demonstrate the superiority of the presented method over available results and to confirm the ability of the proposed approach to perform multiple-missed detection analysis. PMID:26295232
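
    For context, the standard Wax-Kailath MDL enumeration that this paper analyzes can be written compactly as below; this sketch implements only the textbook criterion, not the paper's refined statistical analysis, and the eigenvalues in the example are invented.

        import numpy as np

        def mdl_source_count(eigvals, n_snapshots):
            """Pick the source count k minimizing the Wax-Kailath MDL criterion."""
            lam = np.sort(np.asarray(eigvals, dtype=float))[::-1]
            p = len(lam)
            mdl = np.empty(p)
            for k in range(p):
                tail = lam[k:]  # the p - k smallest sample eigenvalues
                # log of (arithmetic mean / geometric mean) of the tail
                log_ratio = np.log(tail.mean()) - np.log(tail).mean()
                mdl[k] = (n_snapshots * (p - k) * log_ratio
                          + 0.5 * k * (2 * p - k) * np.log(n_snapshots))
            return int(np.argmin(mdl))

        # Toy example: three dominant eigenvalues among six sensors, 200 snapshots.
        print(mdl_source_count([9.1, 5.4, 3.2, 1.02, 0.99, 0.97], 200))  # -> 3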

  7. A Finite Rate Chemical Analysis of Nitric Oxide Flow Contamination Effects on Scramjet Performance

    NASA Technical Reports Server (NTRS)

    Cabell, Karen F.; Rock, Kenneth E.

    2003-01-01

    The level of nitric oxide contamination in the test gas of the Langley Research Center Arc-Heated Scramjet Test Facility and the effect of the contamination on scramjet test engine performance were investigated analytically. A finite rate chemical analysis was performed to determine the levels of nitric oxide produced in the facility at conditions corresponding to Mach 6 to 8 flight simulations. Results indicate that nitric oxide levels range from one to three mole percent, corroborating previously obtained measurements. A three-stream combustor code with finite rate chemistry was used to investigate the effects of nitric oxide on scramjet performance. Results indicate that nitric oxide in the test gas causes a small increase in heat release and thrust performance for the test conditions investigated. However, a rate constant uncertainty analysis suggests that the effect of nitric oxide ranges from no net effect, to an increase of about 10 percent in thrust performance.

  8. The Development of a Handbook for Astrobee F Performance and Stability Analysis

    NASA Technical Reports Server (NTRS)

    Wolf, R. S.

    1982-01-01

    An Astrobee F performance and stability analysis is presented for use by the NASA Sounding Rocket Division. The performance analysis provides information regarding altitude, Mach number, dynamic pressure, and velocity as functions of time since launch. It is found that payload weight has the greatest effect on performance, and performance prediction accuracy was calculated to remain within 1%. In addition, to assure sufficient flight stability, a predicted rigid-body static margin of at least 8% of the total vehicle length is required. Finally, fin cant angle predictions are given in order to achieve a 2.5 cycle-per-second burnout roll rate, based on obtaining 75% of the steady roll rate. It is noted that this method can be used by flight performance engineers to create a similar handbook for any sounding rocket series.

  9. A performance index approach to aerodynamic design with the use of analysis codes only

    NASA Technical Reports Server (NTRS)

    Barger, Raymond L.; Moitra, Anutosh

    1988-01-01

    A method is described for designing an aerodynamic configuration for a specified performance vector, based on results from several similar, but not identical, trial configurations, each defined by a geometry parameter vector. The theory shows the method to be effective provided that: (1) the results for the trial configurations provide sufficient variation so that a linear combination of them approximates the specified performance; and (2) the differences between the performance vectors (including the specified performance) are sufficiently small that the linearity assumption of sensitivity analysis applies to the differences. A computed example describes the design of a high supersonic Mach number missile wing-body configuration based on results from a set of four trial configurations.
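
    The core computation described here reduces to a small least-squares problem: find weights so that a linear combination of the trial performance vectors matches the specified performance, then (under the stated linearity assumption) apply the same weights to the geometry parameter vectors. A sketch with invented numbers:

        import numpy as np

        # Columns: performance vectors of four trial configurations (invented values).
        P = np.array([[1.00, 1.10, 0.95, 1.05],
                      [0.80, 0.85, 0.78, 0.90],
                      [2.10, 2.00, 2.20, 1.95]])
        p_target = np.array([1.04, 0.84, 2.05])  # specified performance vector

        # Weights w minimizing ||P @ w - p_target||.
        w, *_ = np.linalg.lstsq(P, p_target, rcond=None)

        # The same weights combine the trials' geometry parameter vectors.
        G = np.array([[10.0, 11.0, 9.5, 10.5],  # invented geometry parameters
                      [3.0, 3.2, 2.9, 3.1]])
        print(G @ w)  # candidate design geometry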

  10. Multiple Criteria and Multiple Periods Performance Analysis: The Comparison of North African Railways

    NASA Astrophysics Data System (ADS)

    Sabri, Karim; Colson, Gérard E.; Mbangala, Augustin M.

    2008-10-01

    Multi-period differences in the technical and financial performances of five North African railways are analysed over the period 1990-2004. A first approach is based on the Malmquist DEA TFP index for measuring total factor productivity change, decomposed into technical efficiency change and technological change. A multiple criteria analysis is also performed using the PROMETHEE II method and the ARGOS software. These methods provide complementary detailed information: the Malmquist index discriminates between technological and management progress, while PROMETHEE captures two often conflicting dimensions of performance, namely service to the community and enterprise performance.

  11. Cost and performance analysis of physical protection systems -- A case study

    SciTech Connect

    Hicks, M.J.; Snell, M.S.; Sandoval, J.S.; Potter, C.S.

    1998-08-01

    Design and analysis of physical protection systems requires (1) identification of mission critical assets; (2) identification of potential threats that might undermine mission capability; (3) identification of the consequences of loss of mission-critical assets (e.g., time and cost to recover required capability and impact on operational readiness); and (4) analysis of the effectiveness of physical protection elements. CPA -- Cost and Performance Analysis -- addresses the fourth of these four issues. CPA is a methodology that joins Activity Based Cost estimation with performance-based analysis of physical protection systems. CPA offers system managers an approach that supports both tactical decision making and strategic planning. Current exploratory applications of the CPA methodology address analysis of alternative conceptual designs. Hypothetical data is used to illustrate this process.

  12. Analysis of validation data sets in the Class A Performance Evaluation Program

    SciTech Connect

    Hunn, B.D.

    1983-01-01

    The primary objective of the DOE Passive Solar Class A Performance Evaluation Program is to collect, analyze, and archive detailed test data for the rigorous validation of analysis/design tools used for passive solar research and design. This paper presents results of the analysis and qualification of several one- and two-week data sets taken at three Class A test sites for the purpose of validating envelope and thermal-storage-energy-transfer processes in passive solar analysis/design tools. Analysis of the data sets consists of editing the measured data and comparing these data with simulated performance results using public-domain, passive solar analysis tools and a standard reporting format developed for the Class A program. Comparisons of the measured data with results using the DOE-2 computer program are presented.

  13. Latent class analysis of reading, decoding, and writing performance using the Academic Performance Test: concurrent and discriminating validity

    PubMed Central

    Cogo-Moreira, Hugo; Carvalho, Carolina Alves Ferreira; de Souza Batista Kida, Adriana; de Avila, Clara Regina Brandão; Salum, Giovanni Abrahão; Moriyama, Tais Silveira; Gadelha, Ary; Rohde, Luis Augusto; de Moura, Luciana Monteiro; Jackowski, Andrea Parolin; de Jesus Mari, Jair

    2013-01-01

    Aim To explore and validate the best returned latent class solution for reading and writing subtests from the Academic Performance Test (TDE). Sample A total of 1,945 children (6–14 years of age), who answered the TDE, the Development and Well-Being Assessment (DAWBA), and had an estimated intelligence quotient (IQ) higher than 70, came from public schools in São Paulo (35 schools) and Porto Alegre (22 schools) that participated in the ‘High Risk Cohort Study for Childhood Psychiatric Disorders’ project. They were on average 9.52 years old (standard deviation = 1.856), from the 1st to 9th grades, and 53.3% male. The mean estimated IQ was 102.70 (standard deviation = 16.44). Methods Via Item Response Theory (IRT), the highest-discriminating items (‘a’ > 1.7) were selected from the TDE subtests of reading and writing. A latent class analysis was run based on these subtests. The statistically and empirically best latent class solutions were validated through concurrent (IQ and combined attention deficit hyperactivity disorder [ADHD] diagnoses) and discriminant (major depression diagnoses) measures. Results A three-class solution was found to be the best model solution, revealing classes of children with good, not-so-good, or poor performance on TDE reading and writing tasks. The three-class solution was shown to be correlated with estimated IQ and with ADHD diagnosis. No association was observed between the latent classes and major depression. Conclusion The three-class solution showed both concurrent and discriminant validity. This work provides initial evidence of validity for an empirically derived categorical classification of reading, decoding, and writing performance using the TDE. A valid classification encourages further research investigating correlates of reading and writing performance using the TDE. PMID:23983466

  14. Analysis and compilation of missile aerodynamic data. Volume 2: Performance analysis

    NASA Technical Reports Server (NTRS)

    Burkhalter, J. E.

    1977-01-01

    A general analysis is given of the flight dynamics of several surface-to-air and two air-to-air missile configurations. The analysis involves three phases: vertical climb, straight and level flight, and constant altitude turn. Wind tunnel aerodynamic data and full scale missile characteristics are used where available; unknown data are estimated. For the constant altitude turn phase, a three degree of freedom flight simulation is used. Important parameters considered in this analysis are the vehicle weight, Mach number, heading angle, thrust level, sideslip angle, g loading, and time to make the turn. The actual flight path during the turn is also determined. Results are presented in graphical form.

  15. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments

    PubMed Central

    Bass, Ellen J.; Baumgart, Leigh A.; Shepley, Kathryn Klein

    2014-01-01

    Displaying both the strategy that information analysis automation employs to make its judgments and the variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noisy sensor data are used by both the human and the information analysis automation to make judgments. In a simplified air traffic conflict prediction experiment, 32 participants made probability of horizontal conflict judgments under different display content conditions. After being exposed to the information analysis automation, judgment achievement significantly improved for all participants as compared to judgments without any of the automation's information. Participants provided with additional display content pertaining to cue variability in the task environment had significantly higher aided judgment achievement compared to those provided with only the automation's judgment of a probability of conflict. When designing information analysis automation for environments where the automation's judgment achievement is impacted by noisy environmental data, it may be beneficial to show additional task environment information to the human judge in order to improve judgment performance. PMID:24847184

  16. Evaluation of Field Portable X-Ray Fluorescence Performance for the Analysis of Ni in Soil.

    PubMed

    Du, Guo-dong; Lei, Mei; Zhou, Guang-dong; Chen, Tong-bin; Qiu, Rong-liang

    2015-03-01

    As a rapid, in-situ analysis method, field-portable X-ray fluorescence spectrometry (FP-XRF) can be widely applied in soil heavy-metal analysis. However, several factors may affect FP-XRF performance and restrict its application. Studies have shown that FP-XRF performs more poorly when the concentration of the target element is low, and that soil moisture and particle size affect its performance, but few studies have examined these effects in depth. Taking Ni as an example, this study demonstrated the relationship between Ni concentration and FP-XRF accuracy and precision, and identified a critical value. The effects of soil moisture and particle size on accuracy and precision were also compared. Results show that FP-XRF performance is related to Ni concentration, with a critical value of 400 mg·kg(-1). Relative standard deviation (RSD) and relative uncertainty decrease while the Ni concentration is below 400 mg·kg(-1), so FP-XRF performance improves with increasing Ni concentration in this range; RSD and relative uncertainty change little when the Ni concentration is above 400 mg·kg(-1), so FP-XRF performance is no longer correlated with Ni concentration. For in-situ analysis, the relative uncertainty contributed by soil moisture is 3.77%, and that contributed by particle size is 0.56%. The effect of soil moisture on both accuracy and precision is evidently more serious than that of particle size.
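
    The precision metric quoted above, relative standard deviation, is straightforward to reproduce; the replicate readings below are invented for illustration.

        import numpy as np

        reads = np.array([372.0, 405.0, 389.0, 398.0, 381.0])  # replicate Ni readings, mg/kg
        rsd = reads.std(ddof=1) / reads.mean() * 100.0
        print(f"mean = {reads.mean():.0f} mg/kg, RSD = {rsd:.1f}%")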

  17. From driving cycle analysis to understanding battery performance in real-life electric hybrid vehicle operation

    NASA Astrophysics Data System (ADS)

    Liaw, Bor Yann; Dubarry, Matthieu

    This paper proposes a methodology and approach to understand battery performance and life through driving cycle and duty cycle analyses from electric and hybrid vehicle (EHV) operation in real-world situations. Conducting driving cycle analysis with trip data collected from EHV operation in real life is very difficult and challenging. In fact, no comprehensive approach has been accepted to date, except those using standard driving cycles on a dynamometer or a track. Similarly, analyzing the duty cycle performance of a battery under real-life operation faces the same challenge. A successful driving cycle analysis, however, can significantly enhance our understanding of EHV performance in real-life driving. Likewise, we expect similar benefits from duty cycle analysis for batteries. Since 1995, we have been developing tools to analyze EHV and power source performance. In particular, we were able to collect data from a fleet of 15 Hyundai Santa Fe electric sports utility vehicles (e-SUVs) operated on Oahu, Hawaii, from July 2001 to June 2003 to allow driving and duty cycle analyses in order to understand battery pack performance under a variety of EHV operating conditions. We thus developed a comprehensive approach that uses fuzzy logic pattern recognition (FL-PR) techniques to perform driving and duty cycle analyses. This approach has been successfully applied to EHV performance analysis via the creation of a compositional driving profile called the "driving cycle profile" (DrCP) for each trip. The same approach was used to analyze battery performance via the construction of a "duty cycle profile" (DuCP) to express battery usage under various operating conditions. The combination of the two analyses enables us to understand both the usage profile of the EHV and battery performance in synergistic detail and in a systematic manner using a pattern recognition technique.

  18. BioGraphE: high-performance bionetwork analysis using the Biological Graph Environment

    PubMed Central

    Chin, George; Chavarria, Daniel G; Nakamura, Grant C; Sofia, Heidi J

    2008-01-01

    Background Graphs and networks are common analysis representations for biological systems. Many traditional graph algorithms such as k-clique, k-coloring, and subgraph matching have great potential as analysis techniques for newly available data in biology. Yet, as the amount of genomic and bionetwork information rapidly grows, scientists need advanced new computational strategies and tools for dealing with the complexities of the bionetwork analysis and the volume of the data. Results We introduce a computational framework for graph analysis called the Biological Graph Environment (BioGraphE), which provides a general, scalable integration platform for connecting graph problems in biology to optimized computational solvers and high-performance systems. This framework enables biology researchers and computational scientists to identify and deploy network analysis applications and to easily connect them to efficient and powerful computational software and hardware that are specifically designed and tuned to solve complex graph problems. In our particular application of BioGraphE to support network analysis in genome biology, we investigate the use of a Boolean satisfiability solver known as Survey Propagation as a core computational solver executing on standard high-performance parallel systems, as well as multi-threaded architectures. Conclusion In our application of BioGraphE to conduct bionetwork analysis of homology networks, we found that BioGraphE and a custom, parallel implementation of the Survey Propagation SAT solver were capable of solving very large bionetwork problems at high rates of execution on different high-performance computing platforms. PMID:18541059

  19. Performance analysis of a turbofan as a part of an airbreathing propulsion system for space shuttles

    NASA Astrophysics Data System (ADS)

    Steinebach, D. A.; Kuehl, W.; Gallus, H. E.

    1993-04-01

    This paper presents the results of the design and performance analysis of airbreathing engines for aerospace planes. The analysis is illustrated by introducing an exemplary twin-shaft turbofan engine with post-combustion and bypass-combustion. Some modules of the performance analysis algorithm such as inlet pressure recovery or real gas effects are also presented. The jet engine is designed in view of increasing temperatures at high flight Mach numbers. Hence, the engine design data are dependent on the characteristics of the available materials as well as on the trajectory of the aerospace plane. The results illustrate the strong influence of the real gas effects on the engine thrust particularly in the case of over-stoichiometric combustion of hydrogen. Turbofan engines offer the following advantages in comparison with equivalent turbojet engines: higher thrust performance in supersonic flight range and lower fuel consumption due to operation management of post-combustion and bypass-combustion.

  20. Sensitivity Analysis of Wind Plant Performance to Key Turbine Design Parameters: A Systems Engineering Approach; Preprint

    SciTech Connect

    Dykes, K.; Ning, A.; King, R.; Graf, P.; Scott, G.; Veers, P.

    2014-02-01

    This paper introduces the development of a new software framework for research, design, and development of wind energy systems which is meant to 1) represent a full wind plant including all physical and nonphysical assets and associated costs up to the point of grid interconnection, 2) allow use of interchangeable models of varying fidelity for different aspects of the system, and 3) support system level multidisciplinary analyses and optimizations. This paper describes the design of the overall software capability and applies it to a global sensitivity analysis of wind turbine and plant performance and cost. The analysis was performed using three different model configurations involving different levels of fidelity, which illustrate how increasing fidelity can preserve important system interactions that build up to overall system performance and cost. Analyses were performed for a reference wind plant based on the National Renewable Energy Laboratory's 5-MW reference turbine at a mid-Atlantic offshore location within the United States.
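
    As a toy illustration of sensitivity screening (not NREL's framework, models, or data), the one-at-a-time sketch below perturbs each design parameter by plus or minus 10% around a 5-MW-class baseline; the response function and every number are invented.

        def plant_cost(rotor_diam, hub_height, rating):
            # Invented response surface standing in for a plant cost model.
            return 0.9 * rotor_diam ** 1.5 + 0.4 * hub_height ** 1.2 + 110.0 * rating

        base = {"rotor_diam": 126.0, "hub_height": 90.0, "rating": 5.0}
        for name in base:
            hi = {**base, name: base[name] * 1.1}
            lo = {**base, name: base[name] * 0.9}
            # Normalized sensitivity: % change in cost per % change in the parameter.
            s = (plant_cost(**hi) - plant_cost(**lo)) / (0.2 * plant_cost(**base))
            print(f"{name}: {s:+.2f}")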

  1. Free wake analysis of hover performance using a new influence coefficient method

    NASA Technical Reports Server (NTRS)

    Quackenbush, Todd R.; Bliss, Donald B.; Ong, Ching Cho

    1990-01-01

    A new approach to the prediction of helicopter rotor performance using a free wake analysis was developed. This new method uses a relaxation process that does not suffer from the convergence problems associated with previous time-marching simulations. This wake relaxation procedure was coupled to a vortex-lattice, lifting-surface loads analysis to produce a novel, self-contained performance prediction code: EHPIC (Evaluation of Helicopter Performance using Influence Coefficients). The major technical features of the EHPIC code are described and a substantial amount of background information on the capabilities and proper operation of the code is supplied. Sample problems were undertaken to demonstrate the robustness and flexibility of the basic approach. Also, a performance correlation study was carried out to establish the breadth of applicability of the code, with very favorable results.

  2. Are memory self-efficacy and memory performance related? A meta-analysis.

    PubMed

    Beaudoin, Marine; Desrichard, Olivier

    2011-03-01

    The association between memory self-efficacy (MSE) and memory performance is highly documented in the literature. However, previous studies have produced inconsistent results, and there is no consensus on the existence of a significant link between these two variables. In order to evaluate whether or not the effect size of the MSE-memory performance relationship in healthy adults is significant and to test several theory-driven moderators, we conducted a meta-analysis of published and unpublished studies. A random-effects model analysis of data from 107 relevant studies (673 effect sizes) indicated a low but significant weighted mean correlation between MSE and memory performance, r = .15, 95% CI [.13, .17]. In addition, the mean effect size was significantly moderated by the way MSE was assessed. Memory performance was more strongly related to concurrent MSE (perceived current ability to perform a given task) than it was to global MSE (perceived usual memory ability in general). Furthermore, we found marginally larger MSE-memory performance correlations when the memory situations used to assess MSE involved familiar stimuli. No effect of the method used to assess global MSE or domain MSE (memory rating vs. performance predictions) was found. The results also show that the resource demands of the memory tasks have a moderator effect, as the MSE-performance correlation is larger with free-recall and cued-recall tasks than it is with recognition tasks. Limitations (generalization issues, moderators not considered) and implications for future research are discussed. PMID:21244133
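
    The random-effects aggregation reported above follows standard meta-analytic machinery; a minimal DerSimonian-Laird sketch on the Fisher-z scale, with five invented studies, looks like this.

        import numpy as np

        def random_effects_mean_r(r, n):
            """DerSimonian-Laird random-effects mean correlation with a 95% CI."""
            r, n = np.asarray(r, float), np.asarray(n, float)
            z = np.arctanh(r)    # Fisher z-transform of the correlations
            v = 1.0 / (n - 3.0)  # within-study variance on the z scale
            w = 1.0 / v
            z_fe = (w * z).sum() / w.sum()
            Q = (w * (z - z_fe) ** 2).sum()
            c = w.sum() - (w ** 2).sum() / w.sum()
            tau2 = max(0.0, (Q - (len(z) - 1)) / c)  # between-study variance
            w_re = 1.0 / (v + tau2)
            z_re = (w_re * z).sum() / w_re.sum()
            se = (1.0 / w_re.sum()) ** 0.5
            return np.tanh(z_re), (np.tanh(z_re - 1.96 * se), np.tanh(z_re + 1.96 * se))

        # Five invented studies: observed correlations and sample sizes.
        print(random_effects_mean_r([0.10, 0.18, 0.05, 0.22, 0.15], [120, 85, 200, 60, 150]))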

  3. The UTRC wind energy conversion system performance analysis for horizontal axis wind turbines (WECSPER)

    NASA Technical Reports Server (NTRS)

    Egolf, T. A.; Landgrebe, A. J.

    1981-01-01

    The theory for the UTRC Wind Energy Conversion System Performance Analysis (WECSPER) for the prediction of horizontal axis wind turbine performance is presented. Major features of the analysis are the ability to: (1) treat the wind turbine blades as lifting lines with a prescribed wake model; (2) solve for the wake-induced inflow and blade circulation using real nonlinear airfoil data; and (3) iterate internally to obtain a compatible wake transport velocity and blade loading solution. This analysis also provides an approximate treatment of wake distortions due to tower shadow or wind shear profiles. Finally, selected results of internal UTRC application of the analysis to existing wind turbines and correlation with limited test data are described.

  4. Inertial Sensor Technology for Elite Swimming Performance Analysis: A Systematic Review.

    PubMed

    Mooney, Robert; Corley, Gavin; Godfrey, Alan; Quinlan, Leo R; ÓLaighin, Gearóid

    2015-01-01

    Technical evaluation of swimming performance is an essential factor of elite athletic preparation. Novel methods of analysis, incorporating body worn inertial sensors (i.e., Microelectromechanical systems, or MEMS, accelerometers and gyroscopes), have received much attention recently from both research and commercial communities as an alternative to video-based approaches. This technology may allow for improved analysis of stroke mechanics, race performance and energy expenditure, as well as real-time feedback to the coach, potentially enabling more efficient, competitive and quantitative coaching. The aim of this paper is to provide a systematic review of the literature related to the use of inertial sensors for the technical analysis of swimming performance. This paper focuses on providing an evaluation of the accuracy of different feature detection algorithms described in the literature for the analysis of different phases of swimming, specifically starts, turns and free-swimming. The consequences associated with different sensor attachment locations are also considered for both single and multiple sensor configurations. Additional information such as this should help practitioners to select the most appropriate systems and methods for extracting the key performance related parameters that are important to them for analysing their swimmers' performance and may serve to inform both applied and research practices. PMID:26712760

  7. Theoretical analysis of the subcritical experiments performed in the IPEN/MB-01 research reactor facility

    SciTech Connect

    Lee, S. M.; Dos Santos, A.

    2012-07-01

    The theoretical analysis of the subcritical experiments performed at the IPEN/MB-01 reactor employing the coupled NJOY/AMPX-II/TORT systems was successfully accomplished. All of the analysis was performed employing ENDF/B-VII.0. The theoretical approach follows all the steps of the subcritical model of Gandini and Salvatores. The theory/experiment comparison reveals that the calculated subcritical reactivity is in very good agreement with the experimental values. The subcritical index (ξ) shows some discrepancies, although in this particular case some work still has to be done to better model the neutron source present in the experiments. (authors)

  8. Seventy-meter antenna performance predictions: GTD analysis compared with traditional ray-tracing methods

    NASA Technical Reports Server (NTRS)

    Schredder, J. M.

    1988-01-01

    A comparative analysis was performed, using both the Geometrical Theory of Diffraction (GTD) and traditional pathlength error analysis techniques, for predicting RF antenna gain performance and pointing corrections. The NASA/JPL 70-meter antenna with its shaped surface was analyzed for gravity loading over the range of elevation angles. Also analyzed were the effects of lateral and axial displacements of the subreflector. Significant differences were noted between the predictions of the two methods, in the effect of subreflector displacements, and in the optimal subreflector positions to focus a gravity-deformed main reflector. The results are of relevance to future design procedures.

  9. Statistical analysis of microgravity experiment performance using the degrees of success scale

    NASA Technical Reports Server (NTRS)

    Upshaw, Bernadette; Liou, Ying-Hsin Andrew; Morilak, Daniel P.

    1994-01-01

    This paper describes an approach to identify factors that significantly influence microgravity experiment performance. Investigators developed the 'degrees of success' scale to provide a numerical representation of success. A degree of success was assigned to 293 microgravity experiments. Experiment information including the degree of success rankings and factors for analysis was compiled into a database. Through an analysis of variance, nine significant factors in microgravity experiment performance were identified. The frequencies of these factors are presented along with the average degree of success at each level. A preliminary discussion of the relationship between the significant factors and the degree of success is presented.

  11. Performance Refactoring of Instrumentation, Measurement, and Analysis Technologies for Petascale Computing: the PRIMA Project

    SciTech Connect

    Malony, Allen D.; Wolf, Felix G.

    2014-01-31

    The growing number of cores provided by today’s high-end computing systems present substantial challenges to application developers in their pursuit of parallel efficiency. To find the most effective optimization strategy, application developers need insight into the runtime behavior of their code. The University of Oregon (UO) and the Juelich Supercomputing Centre of Forschungszentrum Juelich (FZJ) develop the performance analysis tools TAU and Scalasca, respectively, which allow high-performance computing (HPC) users to collect and analyze relevant performance data – even at very large scales. TAU and Scalasca are considered among the most advanced parallel performance systems available, and are used extensively across HPC centers in the U.S., Germany, and around the world. The TAU and Scalasca groups share a heritage of parallel performance tool research and partnership throughout the past fifteen years. Indeed, the close interactions of the two groups resulted in a cross-fertilization of tool ideas and technologies that pushed TAU and Scalasca to what they are today. It also produced two performance systems with an increasing degree of functional overlap. While each tool has its specific analysis focus, the tools were implementing measurement infrastructures that were substantially similar. Because each tool provides complementary performance analysis, sharing of measurement results is valuable to provide the user with more facets to understand performance behavior. However, each measurement system was producing performance data in different formats, requiring data interoperability tools to be created. A common measurement and instrumentation system was needed to more closely integrate TAU and Scalasca and to avoid the duplication of development and maintenance effort. The PRIMA (Performance Refactoring of Instrumentation, Measurement, and Analysis) project was proposed over three years ago as a joint international effort between UO and FZJ to accomplish

  13. Idaho National Laboratory Quarterly Performance Analysis - 3rd Quarter FY2014

    SciTech Connect

    Lisbeth A. Mitchell

    2014-09-01

    This report is published quarterly by the Idaho National Laboratory (INL) Performance Assurance Organization. The Department of Energy (DOE) Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of occurrence reports and other non-reportable issues identified at INL from July 2013 through June 2014.

  14. Idaho National Laboratory Quarterly Performance Analysis for the 2nd Quarter FY 2015

    SciTech Connect

    Mitchell, Lisbeth A.

    2015-04-01

    This report is published quarterly by the Idaho National Laboratory (INL) Quality and Performance Management Organization. The Department of Energy (DOE) Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of events for the 2nd Qtr FY-15.

  15. An analysis for high speed propeller-nacelle aerodynamic performance prediction. Volume 1: Theory and application

    NASA Technical Reports Server (NTRS)

    Egolf, T. Alan; Anderson, Olof L.; Edwards, David E.; Landgrebe, Anton J.

    1988-01-01

    A computer program, the Propeller Nacelle Aerodynamic Performance Prediction Analysis (PANPER), was developed for the prediction and analysis of the performance and airflow of propeller-nacelle configurations operating over a forward speed range inclusive of high speed flight typical of recent propfan designs. A propeller lifting line, wake program was combined with a compressible, viscous center body interaction program, originally developed for diffusers, to compute the propeller-nacelle flow field, blade loading distribution, propeller performance, and the nacelle forebody pressure and viscous drag distributions. The computer analysis is applicable to single and coaxial counterrotating propellers. The blade geometries can include spanwise variations in sweep, droop, taper, thickness, and airfoil section type. In the coaxial mode of operation the analysis can treat both equal and unequal blade number and rotational speeds on the propeller disks. The nacelle portion of the analysis can treat both free air and tunnel wall configurations including wall bleed. The analysis was applied to many different sets of flight conditions using selected aerodynamic modeling options. The influence of different propeller nacelle-tunnel wall configurations was studied. Comparisons with available test data for both single and coaxial propeller configurations are presented along with a discussion of the results.

  16. The Role of Culture, Competitiveness and Economic Performance in Explaining Academic Performance: A Global Market Analysis for International Student Segmentation

    ERIC Educational Resources Information Center

    Baumann, Chris; Hamin

    2011-01-01

    A nation's culture, competitiveness and economic performance explain academic performance. Partial Least Squares (PLS) testing of 2252 students shows culture affects competitiveness and academic performance. Culture and economic performance each explain 32%; competitiveness 36%. The model predicts academic performance when culture, competitiveness…

  17. Measuring the performance of Internet companies using a two-stage data envelopment analysis model

    NASA Astrophysics Data System (ADS)

    Cao, Xiongfei; Yang, Feng

    2011-05-01

    In exploring the business operations of Internet companies, few researchers have used data envelopment analysis (DEA) to evaluate their performance. Since Internet companies have a two-stage production process, marketability and profitability, this study employs a relational two-stage DEA model to assess the efficiency of 40 dot-com firms. The results show that our model performs better in measuring efficiency and is able to discriminate the causes of inefficiency, thus helping business management to be more effective by providing more guidance for business performance improvement.
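
    The building block behind such models is the single-stage, input-oriented CCR program; relational two-stage formulations chain two of these through intermediate measures. Below is a hedged sketch of the plain CCR model using scipy's LP solver, with invented data, not the paper's two-stage formulation.

        import numpy as np
        from scipy.optimize import linprog

        def ccr_efficiency(X, Y, j0):
            """Input-oriented CCR efficiency of DMU j0 (columns of X and Y are DMUs)."""
            m, n = X.shape
            s = Y.shape[0]
            c = np.r_[1.0, np.zeros(n)]                # minimize theta
            A_in = np.hstack([-X[:, [j0]], X])         # sum_j lam_j * x_ij <= theta * x_i,j0
            A_out = np.hstack([np.zeros((s, 1)), -Y])  # sum_j lam_j * y_rj >= y_r,j0
            res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                          b_ub=np.r_[np.zeros(m), -Y[:, j0]],
                          bounds=[(0, None)] * (n + 1), method="highs")
            return res.x[0]

        X = np.array([[4.0, 7.0, 8.0, 4.0],   # two inputs, four invented firms
                      [3.0, 3.0, 1.0, 2.0]])
        Y = np.array([[1.0, 1.0, 1.0, 1.0]])  # one output
        print([round(ccr_efficiency(X, Y, j), 2) for j in range(4)])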

  18. Performance analysis for queueing systems with close down periods and server under maintenance

    NASA Astrophysics Data System (ADS)

    Krishna Kumar, B.; Anbarasu, S.; Lakshmi, S. R. Anantha

    2015-01-01

    A single-server queue subject to maintenance of the server and a close-down period is considered. We obtain explicit expressions for the transient probabilities of the system size, the server-under-maintenance state and the close-down period. The time-dependent performance measures of the system and the probability density function of the first-passage time to reach the maintenance state are discussed. The corresponding steady-state analysis and key performance measures of the system are also presented. Finally, the effect of various parameters on system performance measures is demonstrated by a numerical example.
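
    For orientation, the model reduces to a plain M/M/1 queue when the maintenance and close-down mechanisms are switched off; in that special case the steady-state baseline measures are one-liners (parameter names assumed, not taken from the paper).

        def mm1_measures(lam, mu):
            """Steady-state M/M/1: mean number in system L and mean sojourn time W."""
            rho = lam / mu
            assert rho < 1.0, "stability requires lam < mu"
            L = rho / (1.0 - rho)
            W = 1.0 / (mu - lam)  # consistent with Little's law, L = lam * W
            return L, W

        print(mm1_measures(0.6, 1.0))  # -> (1.5, 2.5)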

  19. Analysis of the Energy Performance of the Chesapeake Bay Foundation's Philip Merrill Environmental Center

    SciTech Connect

    Griffith, B.; Deru M.; Torcellini, P.; Ellis, P.

    2005-04-01

    The Chesapeake Bay Foundation designed their new headquarters building to minimize its environmental impact on the already highly polluted Chesapeake Bay by incorporating numerous high-performance energy saving features into the building design. CBF then contacted NREL to perform an unbiased energy evaluation of the building. Because their building attracted much attention in the sustainable design community, an unbiased evaluation was necessary to help designers replicate successes and identify and correct problem areas. This report focuses on NREL's monitoring and analysis of the overall energy performance of the building.

  20. The age of peak performance in Ironman triathlon: a cross-sectional and longitudinal data analysis

    PubMed Central

    2013-01-01

    Background The aims of the present study were, firstly, to investigate in a cross-sectional analysis the age of peak Ironman performance within one calendar year in all qualifiers for Ironman Hawaii and Ironman Hawaii; secondly, to determine in a longitudinal analysis on a qualifier for Ironman Hawaii whether the age of peak Ironman performance and Ironman performance itself change across years; and thirdly, to determine the gender difference in performance. Methods In a cross-sectional analysis, the age of the top ten finishers for all qualifier races for Ironman Hawaii and Ironman Hawaii was determined in 2010. For a longitudinal analysis, the age and the performance of the annual top ten female and male finishers in a qualifier for Ironman Hawaii was determined in Ironman Switzerland between 1995 and 2010. Results In 19 of the 20 analyzed triathlons held in 2010, there was no difference in the age of peak Ironman performance between women and men (p > 0.05). The only difference in the age of peak Ironman performance between genders was in ‘Ironman Canada’ where men were older than women (p = 0.023). For all 20 races, the age of peak Ironman performance was 32.2 ± 1.5 years for men and 33.0 ± 1.6 years for women (p > 0.05). In Ironman Switzerland, there was no difference in the age of peak Ironman performance between genders for top ten women and men from 1995 to 2010 (F = 0.06, p = 0.8). The mean age of top ten women and men was 31.4 ± 1.7 and 31.5 ± 1.7 years (Cohen's d = 0.06), respectively. The gender difference in performance in the three disciplines and for overall race time decreased significantly across years. Men and women improved overall race times by approximately 1.2 and 4.2 min/year, respectively. Conclusions Women and men peak at a similar age of 32–33 years in an Ironman triathlon with no gender difference. In a qualifier for Ironman Hawaii, the age of peak Ironman performance remained unchanged across years. In contrast, gender

  1. Stability and performance analysis of a full-train system with inerters

    NASA Astrophysics Data System (ADS)

    Wang, Fu-Cheng; Hsieh, Min-Ruei; Chen, Hsueh-Ju

    2012-04-01

    This paper discusses the use of inerters to improve the stability and performance of a full-train system. First, we construct a 28-degree-of-freedom train model in AutoSim and obtain a linearised model for analysis in Matlab. Then, the benefits of inerters are investigated in terms of the critical speed, settling time and passenger comfort. In addition, we apply a new mechatronic network for further performance improvement, and synthesise the optimal electrical circuit for experimental verification. From the results, inerters are shown to be effective in improving the stability and performance of train systems.

  2. Supercomputer and cluster performance modeling and analysis efforts:2004-2006.

    SciTech Connect

    Sturtevant, Judith E.; Ganti, Anand; Meyer, Harold Edward; Stevenson, Joel O.; Benner, Robert E., Jr.; Goudy, Susan Phelps; Doerfler, Douglas W.; Domino, Stefan Paul; Taylor, Mark A.; Malins, Robert Joseph; Scott, Ryan T.; Barnette, Daniel Wayne; Rajan, Mahesh; Ang, James Alfred; Black, Amalia Rebecca; Laub, Thomas William; Vaughan, Courtenay Thomas; Franke, Brian Claude

    2007-02-01

    This report describes efforts by the Performance Modeling and Analysis Team to investigate performance characteristics of Sandia's engineering and scientific applications on the ASC capability and advanced architecture supercomputers, and Sandia's capacity Linux clusters. Efforts to model various aspects of these computers are also discussed. The goals of these efforts are to quantify and compare Sandia's supercomputer and cluster performance characteristics; to reveal strengths and weaknesses in such systems; and to predict performance characteristics of, and provide guidelines for, future acquisitions and follow-on systems. Described herein are the results obtained from running benchmarks and applications to extract performance characteristics and comparisons, as well as modeling efforts, obtained during the time period 2004-2006. The format of the report, with hypertext links to numerous additional documents, purposefully minimizes the document size needed to disseminate the extensive results from our research.

  3. The pragmatics of "madness": performance analysis of a Bangladeshi woman's "aberrant" lament.

    PubMed

    Wilce, J M

    1998-03-01

    A fine-grained analysis of the transcript of a Bangladeshi woman's lament is used to argue for an anthropology of "madness" that attends closely to performance and performativity. The emergent, interactive production of wept speech, together with the conflicting use to which it is put by the performer and her relatives, is linked problematically to performance genres and to ethnopsychiatric indexes of madness. Tuneful weeping is taken by relatives to be performative of madness, in a sense something like Austin's. Yet, exploration of the divergent linguistic ideologies which are brought to bear on the lament not only enables more nuanced ethnographic treatment but also has reflexive ramifications for medical and psychological anthropology. This leads to a critique of the referentialism in our own treatment of language. The role played by transparent reference is overshadowed by indexicality and by dialogical processes of proposing and resisting labels for speech genres attributed to the "mad."

  4. Analysis of hydrocortisone acetate ointments and creams by high-performance liquid chromatography.

    PubMed

    Lea, A R; Kennedy, J M; Low, G K

    1980-09-26

    High-performance liquid chromatographic (HPLC) methods for the analysis of hydrocortisone-containing ointments and creams have been investigated. A method which uses a silica column and involves a minimum of sample pre-treatment has been shown to compare favourably with the triphenyltetrazolium chloride method of the British Pharmacopoeia. For hydrocortisone ointments the HPLC procedure provides results of equivalent precision and has advantages with respect to the time taken for each analysis and specificity. Application of the method to the analysis of hydrocortisone creams has been explored, and the deviation between the HPLC and colorimetric methods requires further investigation.

  5. A Secondary Analysis of the Impact of School Management Practices on School Performance

    ERIC Educational Resources Information Center

    Talbert, Dale A.

    2009-01-01

    The purpose of this study was to conduct a secondary analysis of the impact of school management practices on school performance utilizing a survey design of School and Staffing (SASS) data collected by the National Center for Education Statistics (NCES) of the U.S. Department of Education, 1999-2000. The study identifies those school management…

  6. Using Performance Analysis for Training in an Organization Implementing ISO-9000 Manufacturing Practices: A Case Study.

    ERIC Educational Resources Information Center

    Kunneman, Dale E.; Sleezer, Catherine M.

    2000-01-01

    This case study examines the application of the Performance Analysis for Training (PAT) Model in an organization that was implementing ISO-9000 (International Standards Organization) processes for manufacturing practices. Discusses the interaction of organization characteristics, decision maker characteristics, and analyst characteristics to…

  7. Multilevel Structural Equation Models for the Analysis of Comparative Data on Educational Performance

    ERIC Educational Resources Information Center

    Goldstein, Harvey; Bonnet, Gerard; Rocher, Thierry

    2007-01-01

    The Programme for International Student Assessment comparative study of reading performance among 15-year-olds is reanalyzed using statistical procedures that allow the full complexity of the data structures to be explored. The article extends existing multilevel factor analysis and structural equation models and shows how this can extract richer…

  8. Social Cognitive Predictors of College Students' Academic Performance and Persistence: A Meta-Analytic Path Analysis

    ERIC Educational Resources Information Center

    Brown, Steven D.; Tramayne, Selena; Hoxha, Denada; Telander, Kyle; Fan, Xiaoyan; Lent, Robert W.

    2008-01-01

    This study tested Social Cognitive Career Theory's (SCCT) academic performance model using a two-stage approach that combined meta-analytic and structural equation modeling methodologies. Unbiased correlations obtained from a previously published meta-analysis [Robbins, S. B., Lauver, K., Le, H., Davis, D., & Langley, R. (2004). Do psychosocial…

  9. A SWOT Analysis of Male and Female Students' Performance in Chemistry: A Comparative Study

    ERIC Educational Resources Information Center

    Ezeudu, Florence O.; Chiaha, Gertrude-Theresa Uzoamaka; Anazor, Lynda Chioma; Eze, Justina Uzoamaka; Omeke, Faith Chinwe

    2015-01-01

    The purpose of this study was to do a SWOT analysis and compare performances of male and female students in chemistry. Four research questions and four null hypotheses guided the study. Two boys', two girls' and two coeducational schools involving 1319 males and 1831 females, were selected by a stratified, deliberate sampling technique. A…

  10. Meta-Analysis of Studies Investigating the Effects of Father Absence on Children's Cognitive Performance.

    ERIC Educational Resources Information Center

    Salzman, Stephanie A.

    A meta-analysis was conducted of 137 studies investigating the effects of father absence due to employment, military service, death, divorce, separation, or desertion on children's cognitive performance as assessed by scores on standardized intelligence, scholastic aptitude, and academic achievement tests and school grades. Aggregation of the…

  11. Performance-Based Occupational Affective Behavior Analysis (OABA). Implementation and Supporting Research.

    ERIC Educational Resources Information Center

    Pucel, David J.; And Others

    This document contains two sections: implementation of the performance-based Occupational Affective Behavior Analysis (OABA), and supporting research. Section 1 presents OABA, an analytic procedure designed to identify those affective behaviors important to success in an occupation, and gives directions on how to implement the procedure. The…

  12. A Meta-Analysis of the Five-Factor Model of Personality and Academic Performance

    ERIC Educational Resources Information Center

    Poropat, Arthur E.

    2009-01-01

    This article reports a meta-analysis of personality-academic performance relationships, based on the 5-factor model, in which cumulative sample sizes ranged to over 70,000. Most analyzed studies came from the tertiary level of education, but there were similar aggregate samples from secondary and tertiary education. There was a comparatively…

  13. Increasing Student Performance on the Independent School Entrance Exam (ISEE) Using the Gap Analysis Approach

    ERIC Educational Resources Information Center

    Sarshar, Shanon Etty

    2013-01-01

    Using the Gap Analysis problem-solving framework (Clark & Estes, 2008), this study examined the performance gap experienced by 6th grade students on the math sections of the ISEE (Independent School Entrance Exam). The purpose of the study was to identify and validate the knowledge, motivation, and organization causes of the students' low…

  14. Funding Ohio Community Colleges: An Analysis of the Performance Funding Model

    ERIC Educational Resources Information Center

    Krueger, Cynthia A.

    2013-01-01

    This study examined Ohio's community college performance funding model that is based on seven student success metrics. A percentage of the regular state subsidy is withheld from institutions; funding is earned back based on the three-year average of success points achieved in comparison to other community colleges in the state. Analysis of…

  15. How to Construct More Accurate Student Models: Comparing and Optimizing Knowledge Tracing and Performance Factor Analysis

    ERIC Educational Resources Information Center

    Gong, Yue; Beck, Joseph E.; Heffernan, Neil T.

    2011-01-01

    Student modeling is a fundamental concept applicable to a variety of intelligent tutoring systems (ITS). However, there is not a lot of practical guidance on how to construct and train such models. This paper compares two approaches for student modeling, Knowledge Tracing (KT) and Performance Factors Analysis (PFA), by evaluating their predictive…
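
    To make the two modeling approaches concrete, here is a minimal sketch of both update rules, with hypothetical parameter values and a toy response sequence; production systems estimate these parameters from tutoring logs rather than hard-coding them.

```python
import math

def pfa_prob(beta, gamma, rho, successes, failures):
    """PFA: P(correct) = logistic(beta + gamma*successes + rho*failures)."""
    return 1.0 / (1.0 + math.exp(-(beta + gamma * successes + rho * failures)))

def bkt_update(p_know, correct, slip=0.1, guess=0.2, learn=0.15):
    """One Knowledge Tracing step: Bayes posterior on mastery, then learning."""
    if correct:
        post = p_know * (1 - slip) / (p_know * (1 - slip) + (1 - p_know) * guess)
    else:
        post = p_know * slip / (p_know * slip + (1 - p_know) * (1 - guess))
    return post + (1 - post) * learn

# Toy practice sequence for one student on one skill (1 = correct answer).
obs = [0, 1, 1, 0, 1, 1, 1]
s = f = 0
p_know = 0.3                      # hypothetical prior probability of mastery
for correct in obs:
    print(f"PFA P(correct)={pfa_prob(-0.5, 0.3, 0.1, s, f):.3f}  "
          f"KT P(know)={p_know:.3f}")
    s += correct
    f += 1 - correct
    p_know = bkt_update(p_know, correct)
```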

  16. High Performance Labeling, Separation and Detection Methods for Genome Analysis

    SciTech Connect

    Richard A. Mathies

    2003-11-24

    Our research efforts over the past year have focused on the development of advanced integrated PCR reactors on chips, robust integrated valves, integrated optical detectors, and microdevices for performing integrated genetic analysis. Details of all the projects can be found in the listed papers.

  17. A Multilevel Multivariate Analysis of Academic Performances in College Based on NCAA Student-Athletes

    ERIC Educational Resources Information Center

    McArdle, John J.; Paskus, Thomas S.; Boker, Steven M.

    2013-01-01

    This is an application of contemporary multilevel regression modeling to the prediction of academic performances of 1st-year college students. At a first level of analysis, the data come from N greater than 16,000 students who were college freshman in 1994-1995 and who were also participants in high-level college athletics. At a second level of…

  18. Retention Test and Strategy Analysis of a Nonvisual Seriation Task Performed by First Grade Children.

    ERIC Educational Resources Information Center

    Padilla, Michael J.

    Reported is a test of retention related to a weight, texture, and force experiment performed by a group of 120 first grade students who had been randomly grouped into three treatment strategies. The results of the retention task, accuracy test, an analysis of the strategies used on both posttests and retention tests, and some miscellaneous post…

  19. Video Analysis of Athletic Training Student Performance: Changing Educational Competency into Clinical Proficiency

    ERIC Educational Resources Information Center

    Kawaguchi, Jeffrey K.

    2009-01-01

    Context: Assessing clinical proficiency and documenting learning over time is quite challenging. Educators must look for unique ways to effectively examine students' performance and archive evidence of their academic progress. Objective: To discuss the use of video analysis to bridge the gap from educational competency to clinical proficiency, and…

  20. Predicting Teacher Performance with Test Scores and Grade Point Average: A Meta-Analysis

    ERIC Educational Resources Information Center

    D'Agostino, Jerome V.; Powers, Sonya J.

    2009-01-01

    A meta-analysis was conducted to examine the degree to which teachers' test scores and their performance in preparation programs as measured by their collegiate grade point average (GPA) predicted their teaching competence. Results from 123 studies that yielded 715 effect sizes were analyzed, and the mediating effects of test and GPA type,…

  1. The Impact of Training and Demographics in WIA Program Performance: A Statistical Analysis

    ERIC Educational Resources Information Center

    Moore, Richard W.; Gorman, Philip C.

    2009-01-01

    The Workforce Investment Act (WIA) measures participant labor market outcomes to drive program performance. This article uses statistical analysis to examine the relationship between participant characteristics and key outcome measures in one large California local WIA program. This study also measures the impact of different training…

  2. A Meta-Analysis of Adult-Rated Child Personality and Academic Performance in Primary Education

    ERIC Educational Resources Information Center

    Poropat, Arthur E.

    2014-01-01

    Background: Personality is reliably associated with academic performance, but personality measurement in primary education can be problematic. Young children find it difficult to accurately self-rate personality, and dominant models of adult personality may be inappropriate for children. Aims: This meta-analysis was conducted to determine the…

  3. School Expenditure and School Performance: Evidence from New South Wales Schools Using a Dynamic Panel Analysis

    ERIC Educational Resources Information Center

    Pugh, G.; Mangan, J.; Blackburn, V.; Radicic, D.

    2015-01-01

    This article estimates the effects of school expenditure on school performance in government secondary schools in New South Wales, Australia over the period 2006-2010. It uses dynamic panel analysis to exploit time series data on individual schools that has only recently become available. We find a significant but small effect of expenditure on…

  4. California Charter Schools Serving Low-SES Students: An Analysis of the Academic Performance Index.

    ERIC Educational Resources Information Center

    Slovacek, Simeon P.; Kunnan, Antony J.; Kim, Hae-Jin

    This report presents the findings of an analysis of the Academic Performance Index (API) scores based on SATs taken in 1999, 2000, and 2001. It focuses on charter schools in California that serve students from low socioeconomic-status (SES) families. The purpose of the study was to see how standardized test scores from charter schools serving…

  5. Integration of Pharmacy Practice and Pharmaceutical Analysis: Quality Assessment of Laboratory Performance.

    ERIC Educational Resources Information Center

    McGill, Julian E.; Holly, Deborah R.

    1996-01-01

    Laboratory portions of courses in pharmacy practice and pharmaceutical analysis at the Medical University of South Carolina are integrated and coordinated to provide feedback on student performance in compounding medications. Students analyze the products they prepare, with early exposure to compendia requirements and other references. Student…

  6. Use of a Survival Analysis Technique in Understanding Game Performance in Instructional Games. CRESST Report 812

    ERIC Educational Resources Information Center

    Kim, Jinok; Chung, Gregory K. W. K.

    2012-01-01

    In this study we compared the effects of two math game designs on math and game performance, using discrete-time survival analysis (DTSA) to model players' risk of not advancing to the next level in the game. 137 students were randomly assigned to two game conditions. The game covered the concept of a unit and the addition of like-sized fractional…
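
    As an illustration of the discrete-time survival framing, the sketch below estimates the per-level empirical hazard of failing to advance from hypothetical stopping data; the study itself fit covariate-adjusted DTSA models, which this toy example omits.

```python
# Each tuple is (last_level_reached, failed_there); failed=0 means censored,
# i.e. the player advanced past every level observed for them.
players = [(1, 1), (2, 1), (2, 0), (3, 1), (3, 0), (3, 0), (2, 1)]

max_level = max(lvl for lvl, _ in players)
for t in range(1, max_level + 1):
    at_risk = sum(1 for lvl, _ in players if lvl >= t)   # reached level t
    events = sum(1 for lvl, ev in players if lvl == t and ev == 1)
    hazard = events / at_risk if at_risk else float("nan")
    print(f"level {t}: at risk {at_risk}, failed {events}, hazard {hazard:.2f}")
```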

  7. The Effect of Modeling and Silent Analysis on the Performance Effectiveness of Advanced Elementary Instrumentalists.

    ERIC Educational Resources Information Center

    Fortney, Patrick M.

    1992-01-01

    By keeping abreast of the latest research in the field, music educators can better help students use the practice methods that are most effective. The purpose of this study was to determine the relative effectiveness of modeling and silent analysis on the performance ability of advanced elementary school…

  8. Performance Evaluation of Technical Institutions: An Application of Data Envelopment Analysis

    ERIC Educational Resources Information Center

    Debnath, Roma Mitra; Shankar, Ravi; Kumar, Surender

    2008-01-01

    Technical institutions (TIs) are playing an important role in making India a knowledge hub of this century. There is still great diversity in their relative performance, which is a matter of concern to the education planner. This article employs the method of data envelopment analysis (DEA) to compare the relative efficiency of TIs in India. The…
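
    A minimal input-oriented CCR envelopment model, the classic DEA formulation, can be posed as one linear program per institution. The sketch below uses scipy.optimize.linprog with hypothetical inputs and outputs; the article's actual variable selection may differ.

```python
import numpy as np
from scipy.optimize import linprog

# Rows = DMUs (institutions); hypothetical inputs (faculty, budget)
# and outputs (graduates, publications).
X = np.array([[20, 5.0], [30, 8.0], [25, 6.0], [40, 9.0]])
Y = np.array([[200, 15], [260, 30], [240, 20], [300, 28]], dtype=float)
n, m = X.shape
s = Y.shape[1]

for o in range(n):
    c = np.zeros(n + 1); c[0] = 1.0               # minimize theta
    A_ub, b_ub = [], []
    for i in range(m):    # sum_j lam_j * x_ij <= theta * x_io
        A_ub.append(np.concatenate(([-X[o, i]], X[:, i]))); b_ub.append(0.0)
    for r in range(s):    # sum_j lam_j * y_rj >= y_ro
        A_ub.append(np.concatenate(([0.0], -Y[:, r]))); b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, None)] * (n + 1), method="highs")
    print(f"DMU {o}: efficiency = {res.x[0]:.3f}")   # 1.0 = on the frontier
```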

  9. Performance analysis of an integrated GPS/inertial attitude determination system. M.S. Thesis - MIT

    NASA Technical Reports Server (NTRS)

    Sullivan, Wendy I.

    1994-01-01

    The performance of an integrated GPS/inertial attitude determination system is investigated using a linear covariance analysis. The principles of GPS interferometry are reviewed, and the major error sources of both interferometers and gyroscopes are discussed and modeled. A new figure of merit, attitude dilution of precision (ADOP), is defined for two possible GPS attitude determination methods, namely single difference and double difference interferometry. Based on this figure of merit, a satellite selection scheme is proposed. The performance of the integrated GPS/inertial attitude determination system is determined using a linear covariance analysis. Based on this analysis, it is concluded that the baseline errors (i.e., knowledge of the GPS interferometer baseline relative to the vehicle coordinate system) are the limiting factor in system performance. By reducing baseline errors, it should be possible to use lower quality gyroscopes without significantly reducing performance. For the cases considered, single difference interferometry is only marginally better than double difference interferometry. Finally, the performance of the system is found to be relatively insensitive to the satellite selection technique.
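
    The thesis' exact ADOP definition is not reproduced here, but dilution-of-precision figures of merit generally reduce to the geometry matrix of the selected satellites. The sketch below computes a generic DOP-style metric from hypothetical line-of-sight vectors, assuming the familiar sqrt(trace((H^T H)^-1)) form.

```python
import numpy as np

los = np.array([   # hypothetical unit line-of-sight vectors to 4 satellites
    [ 0.3,  0.5, 0.81],
    [-0.6,  0.2, 0.77],
    [ 0.1, -0.7, 0.70],
    [ 0.7, -0.1, 0.70],
])
H = los / np.linalg.norm(los, axis=1, keepdims=True)   # normalize each row
dop = np.sqrt(np.trace(np.linalg.inv(H.T @ H)))
print(f"DOP-style figure of merit: {dop:.2f}")         # lower = better geometry
```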

  10. Meta-analysis of the technical performance of an imaging procedure: guidelines and statistical methodology.

    PubMed

    Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun

    2015-02-01

    Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often causes violations of assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test-retest repeatability data for illustrative purposes. PMID:24872353
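
    The standard random-effects machinery the paper builds on can be sketched compactly. Below is a DerSimonian-Laird pooling of hypothetical per-study repeatability estimates; note the paper's point that such standard techniques can misbehave when the contributing studies are small.

```python
import numpy as np

est = np.array([0.12, 0.18, 0.15, 0.22])      # hypothetical study estimates
var = np.array([0.002, 0.004, 0.003, 0.006])  # their within-study variances

w = 1.0 / var                                 # fixed-effect weights
mu_fe = np.sum(w * est) / np.sum(w)
Q = np.sum(w * (est - mu_fe) ** 2)            # Cochran's Q heterogeneity
df = len(est) - 1
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - df) / c)                 # between-study variance

w_re = 1.0 / (var + tau2)                     # random-effects weights
mu_re = np.sum(w_re * est) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
print(f"tau^2 = {tau2:.4f}, pooled = {mu_re:.3f} +/- {1.96 * se:.3f}")
```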

  12. Analysis of incinerator performance and metal emissions from recent trial and test burns

    SciTech Connect

    Ho, T.C.; Lee, H.T.; Kuo, T.H.

    1994-12-31

    Recent trial- and test-burn data from five rotary kiln incinerator facilities were analyzed for combustion performance and metal emissions. The incinerator facilities examined included: DuPont's Gulf Coast Regional Waste Incinerator in Orange, Texas; Chemical Waste Management's Incinerator in Port Arthur, Texas; Rollins Environmental Service's Incinerator in Deer Park, Texas; Martin Marietta's TSCA Incinerator in Oak Ridge, Tennessee; and EPA's Incineration Research Facility in Jefferson, Arkansas. The analysis involved the use of a PC-based computer program capable of performing material and energy balance calculations and predicting equilibrium compositions based on the minimization of system free energy. For each analysis, the feed data of waste and fuel and the corresponding operating parameters associated with incinerator and/or afterburner operation were input to the program and the program simulated the combustion performance under equilibrium conditions. In the analysis, the field-recorded performance data were compared with the simulated equilibrium results and the incinerator performance, including the quality of the field data, the combustion efficiency, the percent excess air, the heat loss, and the amount of air inleakage, was evaluated. In addition, the field-obtained metal data were analyzed for emission rate and metal balance. 13 refs., 4 figs., 16 tabs.
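
    A full free-energy-minimization simulation is beyond a short example, but the percent-excess-air figure mentioned above follows from a simple material balance. The sketch below uses a hypothetical fuel composition and per-component stoichiometric O2 demands.

```python
# mol O2 required per mol of each combustible component (stoichiometry).
O2_DEMAND = {"CH4": 2.0, "C2H6": 3.5, "H2": 0.5}

feed = {"CH4": 0.8, "C2H6": 0.15, "H2": 0.05}   # mol fractions, 1 mol basis
o2_stoich = sum(n * O2_DEMAND[sp] for sp, n in feed.items())

air_supplied = 10.5                              # mol air per mol fuel (hypothetical)
o2_supplied = 0.21 * air_supplied                # O2 fraction of air
excess = (o2_supplied - o2_stoich) / o2_stoich
print(f"stoichiometric O2: {o2_stoich:.3f} mol; excess air: {excess:.1%}")
```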

  13. Thermal Performance Analysis of Solar Collectors Installed for Combisystem in the Apartment Building

    NASA Astrophysics Data System (ADS)

    Žandeckis, A.; Timma, L.; Blumberga, D.; Rochas, C.; Rošā, M.

    2012-01-01

    The paper focuses on the application of a wood pellet and solar combisystem for space heating and hot-water preparation in apartment buildings under the climate of Northern Europe. A pilot project has been implemented in the city of Sigulda (N 57° 09.410 E 024° 52.194), Latvia. The system was designed and optimised using TRNSYS, a dynamic simulation tool, and the pilot project was continuously monitored. The heat transfer fluid flow rate and the influence of the inlet temperature on the performance of the solar collectors were subjected to analysis. The thermal performance of the solar collector loop was studied using a direct method. A multiple regression analysis was carried out using STATGRAPHICS Centurion 16.1.15 with the aim of identifying the operational and weather parameters that most strongly influence the collectors' performance. The parameters to be used for the system's optimisation have been evaluated.
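
    The "direct method" for collector-loop thermal performance amounts to dividing the measured useful heat gain by the incident solar radiation. A minimal sketch with hypothetical operating values (and an assumed glycol-mix heat capacity):

```python
def collector_efficiency(m_dot, t_in, t_out, area, irradiance, cp=3800.0):
    """m_dot [kg/s], temperatures [degC], area [m^2], irradiance [W/m^2].
    cp defaults to a propylene-glycol mix; use ~4186 for pure water."""
    q_useful = m_dot * cp * (t_out - t_in)   # useful heat gain, W
    return q_useful / (area * irradiance)

eta = collector_efficiency(m_dot=0.05, t_in=40.0, t_out=55.0,
                           area=10.0, irradiance=800.0)
print(f"instantaneous collector efficiency: {eta:.2f}")
```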

  14. Experimental study on the performance characteristics and emission analysis of a diesel engine using vegetable oils

    NASA Astrophysics Data System (ADS)

    Saha, Anup; Ehite, Ekramul Haque; Alam, M. M.

    2016-07-01

    In this research, vegetable oils derived from sesame seed and rice bran were used and experimented upon. Using kerosene as the solvent in varying proportions (30%, 50%, 70% by volume) with the vegetable oils, different blends of sesame and rice bran oils were produced. The important characteristic properties were found by experimentation and compared with those of straight-run diesel. Subsequently, straight-run diesel, the vegetable oils, and their blends were used to run a diesel engine one by one, and a performance analysis was conducted, followed by an investigation of the exhaust emissions. From the comparative performance analysis, it was found that rice bran oil showed better performance as a fuel than sesame oil with regard to power production and specific fuel consumption, and also resulted in lower carbon monoxide (CO) emissions than the sesame oil blends.

  15. Optical ensemble analysis of intraocular lens performance through a simulated clinical trial with ZEMAX.

    PubMed

    Zhao, Huawei

    2009-01-01

    A ZEMAX model was constructed to simulate a clinical trial of intraocular lenses (IOLs) based on a clinically oriented Monte Carlo ensemble analysis using postoperative ocular parameters. The purpose of this model is to test the feasibility of streamlining and optimizing both the design process and the clinical testing of IOLs. This optical ensemble analysis (OEA) is also validated. Simulated pseudophakic eyes were generated by using the tolerancing and programming features of ZEMAX optical design software. OEA methodology was verified by demonstrating that the results of clinical performance simulations were consistent with previously published clinical performance data using the same types of IOLs. From these results we conclude that the OEA method can objectively simulate the potential clinical trial performance of IOLs.
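
    The ensemble idea can be sketched outside ZEMAX: draw postoperative eye parameters from assumed distributions and push each sample through a merit function. The function below is a placeholder penalty, not a real eye model, and all distributions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000
axial_len = rng.normal(23.6, 0.7, n)     # mm, hypothetical postoperative spread
iol_tilt = rng.normal(0.0, 2.0, n)       # degrees
iol_decenter = rng.normal(0.0, 0.3, n)   # mm

def merit(al, tilt, dec):
    # Placeholder image-quality penalty: grows with deviation from nominal.
    return 0.05 * (al - 23.6) ** 2 + 0.002 * tilt ** 2 + 0.3 * dec ** 2

m = merit(axial_len, iol_tilt, iol_decenter)
print(f"median merit {np.median(m):.4f}, "
      f"95th percentile {np.percentile(m, 95):.4f}")
```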

  16. Performance & stability analysis of a three lobe journal bearing with varying parameters: Experiments and analysis

    NASA Astrophysics Data System (ADS)

    Biswas, Nabarun; Chakraborti, Prasun; Saha, Ankuran; Biswas, Srijit

    2016-07-01

    Three-lobe hydrodynamic oil journal bearings are widely used in heavy industry as parts of rotating machinery because of their high level of performance. A three-lobe hydrodynamic oil journal bearing allows the transmission of large loads at a mean speed of rotation. In the present work, an attempt has been made to investigate the pressure domain and its subsequent effects in a three-lobe journal bearing under different static loads at a stable operating speed. Analytical calculations were carried out with codes generated in Matlab. Experiments were performed on a journal bearing test rig incorporating a three-lobe bearing under different loads at a stable operating speed of 1000 RPM. It was observed that an increase in load resulted in a rise in the pressure profile, maximum pressure angle, and temperature. A further attempt was made to examine the effect of eccentricity ratio and dynamic viscosity with no change in RPM. It was also observed that dynamic viscosity has a significant effect on the stable operating speed. With a reduction in static load, the stable operating speed was attained at higher values.

  17. Performance analysis of coexisting IEEE 802.15.4-based health monitoring WBANs.

    PubMed

    Deylami, Mohammad; Jovanov, Emil

    2012-01-01

    Wireless Body Area Networks (WBANs) for health monitoring systems are required to meet stringent performance demands regarding the tradeoff between reliability, latency, and power efficiency. WBANs feature limited range and bandwidth, and they are prone to interference. Considering the life-critical nature of some WBAN systems, we present an in-depth investigation of the situations where the dynamic coexistence of multiple WBANs may severely affect their performance. In this paper, we analytically study the effect of coexistence on the operation of WBANs. We present a mathematical analysis to precisely obtain the probabilities of successful communication and validate this analysis through simulation. Our simulation analysis indicates that in the default mode of operation, coexistence of three WBANs can lead to the loss of 20-85% of data transmissions for typical sensor configurations.
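
    The flavor of the coexistence analysis can be reproduced with a much-simplified Monte Carlo model: several WBANs independently picking transmission slots in a shared superframe, with overlapping slots lost. This slotted-ALOHA-style abstraction stands in for the paper's full IEEE 802.15.4 CSMA/CA analysis, and all parameter values are hypothetical.

```python
import random

def success_rate(n_wbans=3, sensors_per_wban=4, slots=16, trials=20_000):
    ok = total = 0
    for _ in range(trials):
        # Every sensor in every coexisting WBAN picks a random slot.
        picks = [random.randrange(slots)
                 for _ in range(n_wbans * sensors_per_wban)]
        for slot in picks:
            total += 1
            ok += picks.count(slot) == 1   # success iff the slot is unshared
    return ok / total

for n in (1, 2, 3):
    print(f"{n} coexisting WBAN(s): success = {success_rate(n_wbans=n):.2%}")
```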

  18. Implementation of multivariate linear mixed-effects models in the analysis of indoor climate performance experiments

    NASA Astrophysics Data System (ADS)

    Jensen, Kasper L.; Spiild, Henrik; Toftum, Jørn

    2012-01-01

    The aim of the current study was to apply multivariate mixed-effects modeling to analyze experimental data on the relation between air quality and the performance of office work. The method estimates in one step the effect of the exposure on a multi-dimensional response variable, and yields important information on the correlation between the different dimensions of the response variable, which in this study was composed of both subjective perceptions and a two-dimensional performance task outcome. Such correlation is typically not included in the output from univariate analysis methods. Data originated from three different series of experiments investigating the effects of air quality on performance. The example analyses resulted in a significant and positive correlation between two performance tasks, indicating that the two tasks to some extent measured the same dimension of mental performance. The analysis seems superior to conventional univariate statistics and the information provided may be important for the design of performance experiments in general and for the conclusions that can be based on such studies.
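
    One common way to fit such a multivariate mixed model is to stack the response dimensions into a single column with an outcome indicator, letting a per-subject random intercept induce the cross-outcome correlation. A minimal sketch with synthetic data and hypothetical variable names, using statsmodels:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 60
u = rng.normal(0, 1.0, n)                  # shared subject effect
air = rng.integers(0, 2, n)                # 0 = poor, 1 = good air quality
typing = 50 + 3 * air + u + rng.normal(0, 1, n)
addition = 30 + 2 * air + u + rng.normal(0, 1, n)

# Stack the two outcomes into long format with a task indicator.
long = pd.DataFrame({
    "subject": np.tile(np.arange(n), 2),
    "air": np.tile(air, 2),
    "task": np.repeat(["typing", "addition"], n),
    "score": np.concatenate([typing, addition]),
})
model = smf.mixedlm("score ~ task * air", long, groups=long["subject"])
print(model.fit().summary())
```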

  19. CFD study on flow characteristics of pump sump and performance analysis of the mixed flow pump

    NASA Astrophysics Data System (ADS)

    Zhao, Y. X.; Kim, C. G.; Lee, Y. H.

    2013-12-01

    Head-capacity curves provided by the pump manufacturer are obtained on the condition of no vortices flowing into the pump intake. The efficiency and performance of pumping stations depend not only on the performance of the selected pumps but also on the proper design of the intake sumps. A faulty design of a pump sump can lead to the occurrence of swirl and vortices, which reduce pump performance. Therefore, a sump model test is necessary in order to check the flow condition around the intake structure. Numerical simulation is a good means of reducing the time and cost involved throughout the design process. In this study, the commercial software ANSYS CFX-13.0 has been used for the CFD analysis of the pump sump. The effect of an anti-vortex device (AVD) on the submerged vortex has been examined. Hydraulic performance for head rise, shaft power, and pump efficiency versus flow rate is studied using the performance curves. In addition, numerical simulation of the cavitation phenomenon in a mixed flow pump has been performed by applying the full cavitation model with the k-ε turbulence model. According to the results, the efficacy of the AVD in ensuring uniform flow conditions around the pump intake is confirmed. From the numerical analysis, the inception of cavitation is observed on the suction surface where the leading edge meets the tip, and then the cavitation zone expands.
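
    The hydraulic efficiency behind such performance curves is the ratio of water power to shaft power. A one-formula sketch with hypothetical operating values:

```python
def pump_efficiency(q, head, shaft_power, rho=998.0, g=9.81):
    """q [m^3/s], head [m], shaft_power [W] -> hydraulic efficiency."""
    return rho * g * q * head / shaft_power   # water power / shaft power

print(f"efficiency: {pump_efficiency(q=0.5, head=12.0, shaft_power=70_000):.2f}")
```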

  20. Effects of warming-up on physical performance: a systematic review with meta-analysis.

    PubMed

    Fradkin, Andrea J; Zazryn, Tsharni R; Smoliga, James M

    2010-01-01

    The value of warming-up is a worthy research problem because it is not known whether warming-up benefits, harms, or has no effect on individuals. The purpose of this study was to review the evidence relating to performance improvement using a warm-up. A systematic review and meta-analysis were undertaken. Relevant studies were identified by searching Medline, SPORTDiscus, and PubMed (1966-April 2008). Studies investigating the effects of warming-up on performance improvement in physical activities were included. Studies were included only if the subjects were human and only if the warm-up included activities other than stretching. The quality of included studies was assessed independently by 2 assessors using the Physiotherapy Evidence Database scale. Thirty-two studies, all of high quality (6.5-9 [mean = 7.6] of 10), reported sufficient data (quality score >6) on the effects of warming-up on performance improvement. Warm-up was shown to improve performance in 79% of the criteria examined. This analysis has shown that performance improvements can be demonstrated after completion of adequate warm-up activities, and there is little evidence to suggest that warming-up is detrimental to sports participants. Because there were few well-conducted, randomized, controlled trials undertaken, more of these are needed to further determine the role of warming-up in relation to performance improvement. PMID:19996770