Science.gov

Sample records for performance analysis 1994-2002

  1. Using remote-sensing data to determine equilibrium-line altitude and mass-balance time series: validation on three French glaciers, 1994-2002

    NASA Astrophysics Data System (ADS)

    Rabatel, Antoine; Dedieu, Jean-Pierre; Vincent, Christian

    Alpine glaciers are very sensitive to climate fluctuations, and their mass balance can be used as an indicator of regional-scale climate change. Here, we present a method to calculate glacier mass balance using remote-sensing data. Snowline measurements from remotely sensed images recorded at the end of the hydrological year provide an effective proxy of the equilibrium line. Mass balance can be deduced from the equilibrium-line altitude (ELA) variations. Three well-documented glaciers in the French Alps, where the mass balance is measured at ground level with a stake network, were selected to assess the accuracy of the method over the 1994-2002 period (eight mass-balance cycles). Results obtained by ground measurements and remote sensing are compared and show excellent correlation (r2 > 0.89), both for the ELA and for the mass balance, indicating that the remote-sensing method can be applied to glaciers where no ground data exist, on the scale of a mountain range or a given climatic area. The main differences can be attributed to discrepancies between the dates of image acquisition and field measurements. Cloud cover and recent snowfalls constitute the main restrictions of the image-based method.
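    The core conversion in this kind of method (deducing mass balance from ELA variations) can be sketched with a linear mass-balance gradient. This is a minimal illustration only; the gradient value and the ELAs below are hypothetical, not figures from the study.

    ```python
    # Sketch: annual glacier mass balance from a remotely sensed
    # equilibrium-line altitude (ELA), assuming a linear mass-balance
    # gradient. All numeric values are illustrative assumptions.

    def mass_balance_from_ela(ela_m, ela_equilibrium_m, gradient_m_we_per_m=0.0078):
        """Annual specific mass balance (m w.e.) from the ELA deviation.

        A rise of the ELA above its steady-state altitude implies a more
        negative balance; the constant is the assumed sensitivity of mass
        balance to ELA change (m w.e. per m).
        """
        return -gradient_m_we_per_m * (ela_m - ela_equilibrium_m)

    # Example: snowline mapped at 3050 m on an end-of-hydrological-year
    # image, against an assumed steady-state ELA of 2900 m.
    b = mass_balance_from_ela(3050.0, 2900.0)
    print(round(b, 3))  # -1.17 (m w.e.)
    ```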

  2. [Molecular epidemiology of rabies epizootics in Colombia, 1994-2002: evidence of human and canine rabies associated with chiroptera].

    PubMed

    Páez, Andrés; Nuñez, Constanza; García, Clemencia; Boshell, Jorge

    2003-03-01

    Three urban rabies outbreaks have been reported in Colombia during the last two decades, one of which is ongoing in the Caribbean region (northern Colombia). The earlier outbreaks occurred almost simultaneously in Arauca (eastern Colombia) and in the Central region, ending in 1997. Phylogenetic relationships among rabies viruses isolated from the three areas were based on a comparison of cDNA fragments coding for the endodomain of protein G and a fragment of the L protein obtained by RT-PCR. The sequenced amplicons, which included the G-L intergenic region, contained 902 base pairs. Phylogenetic analysis showed three distinct groups of viruses. Colombian genetic variant I viruses were isolated only from Arauca and the Central region, but are now apparently extinct. Colombian genetic variant II viruses were isolated in the Caribbean region and are still being transmitted in that area. The third group, of bat rabies variants, was isolated from two insectivorous bats, three domestic dogs, and a human. This associates bat rabies virus with rabies in Colombian dogs and humans, and indicates bats to be a rabies reservoir of public health significance.

  3. Holley Stick Performance Analysis

    DTIC Science & Technology

    2013-04-01

    TECHNICAL REPORT 2015, April 2013. Holley Stick Performance Analysis. Steven T. Holste, Ph.D.; Jeffrey J. Person. Approved for public release...IEDs on U.S. and coalition forces. Figure 2 depicts OEF casualties from 2008 through 2012, indicating Total and IED-caused Killed in Action (KIA).

  4. Performance Support for Performance Analysis

    ERIC Educational Resources Information Center

    Schaffer, Scott; Douglas, Ian

    2004-01-01

    Over the past several years, there has been a shift in emphasis in many business, industry, government and military training organizations toward human performance technology or HPT (Rossett, 2002; Dean, 1995). This trend has required organizations to increase the human performance knowledge, skills, and abilities of the training workforce.…

  5. MIR Performance Analysis

    SciTech Connect

    Hazen, Damian; Hick, Jason

    2012-06-12

    We provide analysis of Oracle StorageTek T10000 Generation B (T10KB) Media Information Record (MIR) Performance Data gathered over the course of a year from our production High Performance Storage System (HPSS). The analysis shows information in the MIR may be used to improve tape subsystem operations. Most notably, we found the MIR information to be helpful in determining whether the drive or tape was most suspect given a read or write error, and for helping identify which tapes should not be reused given their history of read or write errors. We also explored using the MIR Assisted Search to order file retrieval requests. We found that MIR Assisted Search may be used to reduce the time needed to retrieve collections of files from a tape volume.

  6. Performance analysis in saber.

    PubMed

    Aquili, Andrea; Tancredi, Virginia; Triossi, Tamara; De Sanctis, Desiree; Padua, Elvira; DʼArcangelo, Giovanna; Melchiorri, Giovanni

    2013-03-01

    Fencing is a sport practiced by both men and women, which uses 3 weapons: foil, épée, and saber. In general, there are few scientific studies available in the international literature; they are limited to the performance analysis of fencing bouts, and there is nothing about saber. There are 2 kinds of competitions in the World Cup for both men and women: the "FIE GP" and the "A." The aim of this study was to carry out a saber performance analysis to gain useful indicators for the definition of a performance model, and to verify whether the model is influenced by the type of competition and whether there are differences between men and women. Sixty bouts were analyzed: 33 from FIE GP and 27 from "A" competitions (35 men's and 25 women's saber bouts). The results indicated that most actions are offensive (55% for men and 49% for women); the central area of the piste is mostly used (72% for men and 67% for women); the effective fighting time is 13.6% for men and 17.1% for women; and the ratio between the action and break times is 1:6.5 for men and 1:5.1 for women. A lunge is carried out every 23.9 seconds by men and every 20 seconds by women, and a direction change is carried out every 65.3 seconds by men and every 59.7 seconds by women. The data confirm the differences between the saber and the other 2 weapons. There is no significant difference between the data of the 2 different kinds of competitions.

  7. DAS performance analysis

    SciTech Connect

    Bates, G.; Bodine, S.; Carroll, T.; Keller, M.

    1984-02-01

    This report begins with an overview of the Data Acquisition System (DAS), which supports several of PPPL's experimental devices. Performance measurements which were taken on DAS and the tools used to make them are then described.

  8. Dependability and performability analysis

    NASA Technical Reports Server (NTRS)

    Trivedi, Kishor S.; Ciardo, Gianfranco; Malhotra, Manish; Sahner, Robin A.

    1993-01-01

    Several practical issues regarding the specification and solution of dependability and performability models are discussed. Model types with and without rewards are compared. Continuous-time Markov chains (CTMCs) are compared with (continuous-time) Markov reward models (MRMs), and generalized stochastic Petri nets (GSPNs) are compared with stochastic reward nets (SRNs). It is shown that reward-based models can lead to more concise model specifications and to the solution of a variety of new measures. With respect to the solution of dependability and performability models, three practical issues are identified: largeness, stiffness, and non-exponentiality. A variety of approaches to deal with them are discussed, including some of the latest research efforts.

  9. Lidar performance analysis

    NASA Technical Reports Server (NTRS)

    Spiers, Gary D.

    1994-01-01

    Section 1 details the theory used to build the lidar model, provides results of using the model to evaluate AEOLUS instrument designs, and provides snapshots of the visual appearance of the coded model. Appendix A contains a Fortran program to calculate various forms of the refractive index structure function. This program was used to determine the refractive index structure function used in the main lidar simulation code. Appendix B contains a memo on the optimization of the lidar telescope geometry for a line-scan geometry. Appendix C contains the code for the main lidar simulation and brief instructions on running the code. Appendix D contains a Fortran code to calculate the maximum permissible exposure for the eye from the ANSI Z136.1-1992 eye safety standards. Appendix E contains a paper on the eye safety analysis of a space-based coherent lidar presented at the 7th Coherent Laser Radar Applications and Technology Conference, Paris, France, 19-23 July 1993.

  10. Techniques for Automated Performance Analysis

    SciTech Connect

    Marcus, Ryan C.

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  11. Scalable Performance Measurement and Analysis

    SciTech Connect

    Gamblin, Todd

    2009-01-01

    Concurrency levels in large-scale, distributed-memory supercomputers are rising exponentially. Modern machines may contain 100,000 or more microprocessor cores, and the largest of these, IBM's Blue Gene/L, contains over 200,000 cores. Future systems are expected to support millions of concurrent tasks. In this dissertation, we focus on efficient techniques for measuring and analyzing the performance of applications running on very large parallel machines. Tuning the performance of large-scale applications can be a subtle and time-consuming task because application developers must measure and interpret data from many independent processes. While the volume of the raw data scales linearly with the number of tasks in the running system, the number of tasks is growing exponentially, and data for even small systems quickly becomes unmanageable. Transporting performance data from so many processes over a network can perturb application performance and make measurements inaccurate, and storing such data would require a prohibitive amount of space. Moreover, even if it were stored, analyzing the data would be extremely time-consuming. In this dissertation, we present novel methods for reducing performance data volume. The first draws on multi-scale wavelet techniques from signal processing to compress systemwide, time-varying load-balance data. The second uses statistical sampling to select a small subset of running processes to generate low-volume traces. A third approach combines sampling and wavelet compression to stratify performance data adaptively at run-time and to reduce further the cost of sampled tracing. We have integrated these approaches into Libra, a toolset for scalable load-balance analysis. We present Libra and show how it can be used to analyze data from large scientific applications scalably.
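    The first data-reduction idea above (wavelet compression of load-balance data) can be illustrated with a single Haar analysis step plus thresholding of the detail coefficients. This toy sketch is not Libra's actual multi-scale pipeline; the load profile and threshold are invented, and only the broad idea (small fluctuations discarded, the load spike preserved) carries over.

    ```python
    # Toy one-level Haar wavelet compression of a per-process load profile.
    # Small detail coefficients are zeroed; large ones (e.g., a load spike)
    # survive reconstruction. Illustrative assumption, not Libra's code.

    def haar_step(signal):
        """One Haar analysis step: (approximations, details) over pairs."""
        approx = [(a + b) / 2 for a, b in zip(signal[::2], signal[1::2])]
        detail = [(a - b) / 2 for a, b in zip(signal[::2], signal[1::2])]
        return approx, detail

    def compress(signal, threshold):
        """Zero out detail coefficients below the threshold."""
        approx, detail = haar_step(signal)
        kept = [d if abs(d) >= threshold else 0.0 for d in detail]
        return approx, kept

    def reconstruct(approx, detail):
        """Invert the Haar step: each (a, d) pair yields (a + d, a - d)."""
        out = []
        for a, d in zip(approx, detail):
            out.extend([a + d, a - d])
        return out

    load = [10.0, 10.2, 10.1, 9.9, 30.0, 10.0, 10.1, 10.3]  # one imbalanced rank
    approx, detail = compress(load, threshold=1.0)
    recon = reconstruct(approx, detail)  # spike at index 4 is preserved
    ```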

  12. Software Performs Complex Design Analysis

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and the prediction of failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: 1) inadequate shape parameterization algorithms, and 2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its revolutionary, one-of-a-kind arbitrary shape deformation (ASD) capability (a major advancement in solving these two problems) to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing the CFD designers to freely create their own shape parameters, therefore eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation. This eliminates the extremely costly process of having to remesh the grid for every shape change desired. The program can perform a design change in a markedly reduced amount of time, a process that would traditionally involve the designer returning to the CAD model to reshape and then remesh the shapes, something that has been known to take hours, days (even weeks or months) depending upon the size of the model.

  13. Stage Separation Performance Analysis Project

    NASA Technical Reports Server (NTRS)

    Chen, Yen-Sen; Zhang, Sijun; Liu, Jiwen; Wang, Ten-See

    2001-01-01

    The stage separation process is an important phenomenon in multi-stage launch vehicle operation. The transient flowfield coupled with the multi-body system is a challenging problem in design analysis. The thermodynamic environment created by burning propellants during the upper-stage engine start in the separation process adds to the complexity of the entire system. Understanding the underlying flow physics and vehicle dynamics during stage separation is required to design a multi-stage launch vehicle with good flight performance. A computational fluid dynamics model with the capability to couple transient multi-body dynamics systems will be a useful tool for simulating the effects of the transient flowfield, plume/jet heating, and vehicle dynamics. A computational model using a generalized mesh system will be used as the basis of this development. The multi-body dynamics system will be solved by integrating a system of six-degree-of-freedom equations of motion with high accuracy. Multi-body mesh systems and their interactions will be modeled using parallel computing algorithms. An adaptive mesh refinement method will also be employed to enhance solution accuracy in the transient process.

  14. SEP thrust subsystem performance sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Atkins, K. L.; Sauer, C. G., Jr.; Kerrisk, D. J.

    1973-01-01

    This is a two-part report on solar electric propulsion (SEP) performance sensitivity analysis. The first part describes the preliminary analysis of the SEP thrust system performance for an Encke rendezvous mission. A detailed description of thrust subsystem hardware tolerances on mission performance is included together with nominal spacecraft parameters based on these tolerances. The second part describes the method of analysis and graphical techniques used in generating the data for Part 1. Included is a description of both the trajectory program used and the additional software developed for this analysis. Part 2 also includes a comprehensive description of the use of the graphical techniques employed in this performance analysis.

  15. Model Performance Evaluation and Scenario Analysis (MPESA)

    EPA Pesticide Factsheets

    Model Performance Evaluation and Scenario Analysis (MPESA) assesses how well models predict time series data. The tool was developed for the Hydrological Simulation Program-Fortran (HSPF) and the Stormwater Management Model (SWMM).

  16. Structural-Thermal-Optical-Performance (STOP) Analysis

    NASA Technical Reports Server (NTRS)

    Bolognese, Jeffrey; Irish, Sandra

    2015-01-01

    The presentation will be given at the 26th Annual Thermal Fluids Analysis Workshop (TFAWS 2015) hosted by the Goddard Space Flight Center (GSFC) Thermal Engineering Branch (Code 545). A STOP analysis is a multidiscipline analysis, consisting of structural, thermal, and optical performance analyses, that is performed for all space flight instruments and satellites. This course will explain the different parts of performing this analysis. The student will learn how to interact effectively with each discipline in order to accurately obtain the system analysis results.

  17. Echo Ranging/Probe Alert Performance Analysis.

    DTIC Science & Technology

    1982-11-04

    contract included technical analyses of acoustic communication equipment, system performance predictions, sea test design and data analysis, and...proposing functional system design alternatives. 2.0 SUMMARY OF WORK PERFORMED: The JAYCOR effort focused on the analysis of the Echo Ranging/Probe Alert...JAYCOR Document No. J640-020-82-2242, 16 August 1982, CONFIDENTIAL. 13. Probe Alert Design System Performance Estimates (U), J.L. Collins, JAYCOR Document

  18. Building America Performance Analysis Procedures: Revision 1

    SciTech Connect

    2004-06-01

    To measure progress toward multi-year research goals, cost and performance trade-offs are evaluated through a series of controlled field and laboratory experiments supported by energy analysis techniques using test data to calibrate simulation models.

  19. Paramedir: A Tool for Programmable Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers in order to be transferred to novice users.

  20. Architecture Analysis of High Performance Capacitors (POSTPRINT)

    DTIC Science & Technology

    2009-07-01

    includes the measurement of heat dissipated from a recently developed fluorenyl polyester (FPE) capacitor under an AC excitation...AFRL-RZ-WP-TP-2010-2100, Architecture Analysis of High Performance Capacitors (Postprint), Hiroyuki Kosai and Tyler Bixel, UES, Inc., 2009. Contract number: in-house.

  1. Analysis of Performance Variation Using Query Expansion.

    ERIC Educational Resources Information Center

    Alemayehu, Nega

    2003-01-01

    Discussion of information retrieval performance evaluation focuses on a case study using a statistical repeated measures analysis of variance for testing the significance of factors, such as retrieval method and topic in retrieval performance variation. Analyses of the effect of query expansion on document ranking confirm that expansion affects…

  2. A Performance Approach to Job Analysis.

    ERIC Educational Resources Information Center

    Folsom, Al

    2001-01-01

    Discussion of performance technology and training evaluation focuses on a job analysis process in the Coast Guard. Topics include problems with low survey response rates; costs; the need for appropriate software; discussions with stakeholders and subject matter experts; and maximizing worthy performance. (LRW)

  3. Performance optimisations for distributed analysis in ALICE

    NASA Astrophysics Data System (ADS)

    Betev, L.; Gheata, A.; Gheata, M.; Grigoras, C.; Hristov, P.

    2014-06-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access, and a heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlation or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system still has to be improved by a significant factor to satisfy the analysis needs. We have instrumented all analysis jobs with "sensors" collecting comprehensive monitoring information on the job running conditions and performance in order to identify bottlenecks in the data processing flow. These data are collected by the MonALISA-based ALICE Grid monitoring system and are used to steer and improve the job submission and management policy, to identify operational problems in real time, and to perform automatic corrective actions. In parallel with an upgrade of our production system, we are aiming for low-level improvements related to data format, data management, and merging of results to allow for a better-performing ALICE analysis.

  4. Interfacing Computer Aided Parallelization and Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Biegel, Bryan A. (Technical Monitor)

    2003-01-01

    When porting sequential applications to parallel computer architectures, the program developer will typically go through several cycles of source code optimization and performance analysis. We have started a project to develop an environment where the user can jointly navigate through program structure and performance data information in order to make efficient optimization decisions. In a prototype implementation we have interfaced the CAPO computer aided parallelization tool with the Paraver performance analysis tool. We describe both tools and their interface and give an example for how the interface helps within the program development cycle of a benchmark code.

  5. Massive Contingency Analysis with High Performance Computing

    SciTech Connect

    Huang, Zhenyu; Chen, Yousu; Nieplocha, Jaroslaw

    2009-07-26

    Contingency analysis is a key function in the Energy Management System (EMS) to assess the impact of various combinations of power system component failures based on state estimates. Contingency analysis is also extensively used in power market operation for feasibility testing of market solutions. Faster analysis of more cases is required to safely and reliably operate today’s power grids, with smaller operating margins and more intermittent renewable energy sources. Enabled by the latest developments in the computer industry, high performance computing holds the promise of meeting this need in the power industry. This paper investigates the potential of high performance computing for massive contingency analysis. The framework of "N-x" contingency analysis is established, and computational load balancing schemes are studied and implemented with high performance computers. Case studies of massive 300,000-contingency-case analysis using the Western Electricity Coordinating Council power grid model are presented to illustrate the application of high performance computing and demonstrate the performance of the framework and computational load balancing schemes.
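    One simple computational load-balancing scheme of the kind studied here is greedy longest-processing-time assignment of contingency cases to workers. A sketch under invented assumptions (the case names, estimated costs, and two-worker setup are illustrative; the paper's actual schemes and 300,000-case set differ):

    ```python
    # Greedy longest-processing-time (LPT) balancing of contingency cases:
    # always hand the next-most-expensive case to the least-loaded worker.

    import heapq

    def balance(cases, n_workers):
        """Assign (case_id, est_cost) pairs; return (load, worker, cases) per worker."""
        heap = [(0.0, w, []) for w in range(n_workers)]  # (current load, id, assigned)
        heapq.heapify(heap)
        for case_id, cost in sorted(cases, key=lambda c: -c[1]):
            load, w, assigned = heapq.heappop(heap)      # least-loaded worker
            assigned.append(case_id)
            heapq.heappush(heap, (load + cost, w, assigned))
        return sorted(heap, key=lambda t: t[1])

    # Hypothetical contingency cases: (outaged component, estimated seconds).
    cases = [("line-1", 3.0), ("line-2", 1.0), ("gen-7", 4.0), ("xfmr-2", 2.0)]
    for load, worker, assigned in balance(cases, 2):
        print(worker, load, assigned)  # both workers end up with load 5.0
    ```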

  6. Integrating Reliability Analysis with a Performance Tool

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

    1995-01-01

    A large number of commercial simulation tools support performance oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production quality simulation based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

  7. Shuttle/TDRSS communications system performance analysis

    NASA Technical Reports Server (NTRS)

    Braun, W. R.

    1980-01-01

    The results of the performance analysis performed on the Shuttle/Tracking and Data Relay Satellite System (TDRSS) communications system are presented. The existing Shuttle/TDRSS link simulation program was modified and refined to model the post-radio-frequency-interference (RFI) TDRS hardware and to evaluate the performance degradation due to RFI effects. The refined link models were then used to determine, evaluate, and assess expected S-band and Ku-band link performance. Parameterization results are presented for the ground station carrier and timing recovery circuits.

  8. Performance analysis of LAN bridges and routers

    NASA Technical Reports Server (NTRS)

    Hajare, Ankur R.

    1991-01-01

    Bridges and routers are used to interconnect Local Area Networks (LANs). The performance of these devices is important since they can become bottlenecks in large multi-segment networks. Performance metrics and test methodology for bridges and routers were not standardized. Performance data reported by vendors is not applicable to the actual scenarios encountered in an operational network. However, vendor-provided data can be used to calibrate models of bridges and routers that, along with other models, yield performance data for a network. Several tools are available for modeling bridges and routers - Network II.5 was used. The results of the analysis of some bridges and routers are presented.

  9. Using Covariance Analysis to Assess Pointing Performance

    NASA Technical Reports Server (NTRS)

    Bayard, David; Kang, Bryan

    2009-01-01

    A Pointing Covariance Analysis Tool (PCAT) has been developed for evaluating the expected performance of the pointing control system for NASA's Space Interferometry Mission (SIM). The SIM pointing control system is very complex, consisting of multiple feedback and feedforward loops, and operating with multiple latencies and data rates. The SIM pointing problem is particularly challenging due to the effects of thermomechanical drifts in concert with the long camera exposures needed to image dim stars. Other pointing error sources include sensor noises, mechanical vibrations, and errors in the feedforward signals. PCAT models the effects of finite camera exposures and all other error sources using linear system elements. This allows the pointing analysis to be performed using linear covariance analysis. PCAT propagates the error covariance using a Lyapunov equation associated with time-varying discrete and continuous-time system matrices. Unlike Monte Carlo analysis, which could involve thousands of computational runs for a single assessment, the PCAT analysis performs the same assessment in a single run. This capability facilitates the analysis of parametric studies, design trades, and "what-if" scenarios for quickly evaluating and optimizing the control system architecture and design.
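    The discrete-time covariance propagation described above can be sketched for a toy two-state model: the error covariance is pushed through the recursion P(k+1) = A P(k) A^T + Q, so one pass replaces thousands of Monte Carlo runs. The matrices, time step, and noise levels below are invented for illustration and are not SIM parameters.

    ```python
    # Toy discrete Lyapunov covariance propagation, P <- A P A^T + Q,
    # for an assumed 2-state model: [pointing error, drift rate], dt = 0.1 s.
    # Plain-Python 2x2 matrix helpers keep the sketch self-contained.

    def matmul(X, Y):
        return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
                for i in range(2)]

    def transpose(X):
        return [[X[j][i] for j in range(2)] for i in range(2)]

    def add(X, Y):
        return [[X[i][j] + Y[i][j] for j in range(2)] for i in range(2)]

    A = [[1.0, 0.1],          # error integrates the drift rate each step
         [0.0, 1.0]]
    Q = [[1e-6, 0.0],         # assumed process noise per step
         [0.0, 1e-8]]
    P = [[1e-4, 0.0],         # initial error covariance
         [0.0, 1e-6]]

    for _ in range(100):      # propagate over 100 steps (10 s)
        P = add(matmul(matmul(A, P), transpose(A)), Q)

    rms_pointing_error = P[0][0] ** 0.5  # 1-sigma pointing error after 10 s
    ```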

  10. Analysis of ultra-triathlon performances.

    PubMed

    Lepers, Romuald; Knechtle, Beat; Knechtle, Patrizia; Rosemann, Thomas

    2011-01-01

    Despite increased interest in ultra-endurance events, little research has examined ultra-triathlon performance. The aims of this study were: (i) to compare swimming, cycling, running, and overall performances in three ultra-distance triathlons, the double Ironman distance triathlon (2IMT) (7.6 km swimming, 360 km cycling, and 84.4 km running), the triple Ironman distance triathlon (3IMT) (11.4 km, 540 km, and 126.6 km), and the deca Ironman distance triathlon (10IMT) (38 km, 1800 km, and 420 km); and (ii) to examine the relationships between the 2IMT, 3IMT, and 10IMT performances to create predictive equations for 10IMT performance. Race results from 1985 through 2009 were examined to identify triathletes who had completed all three of the considered ultra-distances. In total, 73 triathletes (68 men and 5 women) were identified. The contribution of swimming to overall ultra-triathlon performance was lower than that of cycling and running. Running performance was more important to overall performance for the 2IMT and 3IMT compared with the 10IMT. The 2IMT and 3IMT performances were significantly correlated with 10IMT performances for swimming and cycling, but not for running. 10IMT total time might be predicted by the following equation: 10IMT race time (minutes) = 5885 + 3.69 × 3IMT race time (minutes). This analysis of human performance during ultra-distance triathlons represents a unique data set in the field of ultra-endurance events. Additional studies are required to determine the physiological and psychological factors associated with ultra-triathlon performance.
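    The abstract's reported regression is simple enough to apply directly. A sketch using the stated coefficients, with a hypothetical 3IMT finishing time as input:

    ```python
    # The prediction equation reported in the abstract:
    #   10IMT race time (min) = 5885 + 3.69 * 3IMT race time (min)

    def predict_10imt_minutes(t_3imt_minutes):
        """Predicted deca Ironman (10IMT) time from a triple Ironman (3IMT) time."""
        return 5885 + 3.69 * t_3imt_minutes

    # Example: a hypothetical 3IMT finish of 40 hours (2400 minutes).
    predicted = predict_10imt_minutes(2400)
    print(round(predicted / 60, 1), "hours")  # about 245.7 hours
    ```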

  11. Analysis of driver performance under reduced visibility

    NASA Technical Reports Server (NTRS)

    Kaeppler, W. D.

    1982-01-01

    Mathematical models describing vehicle dynamics as well as human behavior may be useful in evaluating driver performance and in establishing design criteria for vehicles more compatible with man. In 1977, a two level model of driver steering behavior was developed, but its parameters were identified for clear visibility conditions only. Since driver performance degrades under conditions of reduced visibility, e.g., fog, the two level model should be investigated to determine its applicability to such conditions. The data analysis of a recently performed driving simulation experiment showed that the model still performed reasonably well under fog conditions, although there was a degradation in its predictive capacity during fog. Some additional parameters affecting anticipation and lag time may improve the model's performance for reduced visibility conditions.

  12. US U-25 channel performance analysis

    SciTech Connect

    Doss, E.; Pan, Y. C.

    1980-07-01

    The results of an ANL computational analysis of the performance of the US U-25 MHD channel are presented. This channel has gone through several revisions. The major revision occurred after it had been decided by the DOE Office of MHD to operate the channel with platinum-clad copper electrodes (cold), rather than with ceramic electrodes (hot), as originally planned. This work has been performed at the request of the DOE Office of MHD and the US U-25 generator design Review Committee. The channel specifications and operating conditions are presented. The combustor temperature and thermodynamic and electrical properties of the plasma are computed, and the results are discussed. The MHD channel performance has been predicted for different operating conditions. Sensitivity studies have also been performed on the effects of mass flow rate, surface roughness, combustor temperatures, and loading on the channel performance.

  13. Using Ratio Analysis to Evaluate Financial Performance.

    ERIC Educational Resources Information Center

    Minter, John; And Others

    1982-01-01

    The ways in which ratio analysis can help in long-range planning, budgeting, and asset management to strengthen financial performance and help avoid financial difficulties are explained. Types of ratios considered include balance sheet ratios, net operating ratios, and contribution and demand ratios. (MSE)
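
    The ratio families mentioned above can be illustrated with a short sketch. The ratio definitions are standard, but the figures and the particular selection below are hypothetical, chosen only to show the arithmetic:

```python
def ratios(current_assets, current_liabilities, total_debt, total_assets,
           revenue, operating_expense):
    """Compute a few common financial ratios used in institutional planning."""
    return {
        "current_ratio": current_assets / current_liabilities,    # liquidity
        "debt_ratio": total_debt / total_assets,                  # leverage
        "net_operating_margin": (revenue - operating_expense) / revenue,
    }

# Hypothetical institutional balance-sheet and operating figures (dollars).
r = ratios(current_assets=4.2e6, current_liabilities=2.1e6,
           total_debt=9.0e6, total_assets=30.0e6,
           revenue=25.0e6, operating_expense=23.5e6)
for name, value in r.items():
    print(f"{name}: {value:.2f}")
```

    Trends in such ratios over several budget cycles, rather than single-year values, are what typically inform long-range planning.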

  14. Probabilistic Analysis of Gas Turbine Field Performance

    NASA Technical Reports Server (NTRS)

    Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.

    2002-01-01

    A gas turbine thermodynamic cycle was computationally simulated and probabilistically evaluated in view of the several uncertainties in the performance parameters, which are indices of gas turbine health. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design, enhance performance, increase system availability and make it cost effective. The analysis leads to the selection of the appropriate measurements to be used in the gas turbine health determination and to the identification of both the most critical measurements and parameters. Probabilistic analysis aims at unifying and improving the control and health monitoring of gas turbine aero-engines by increasing the quality and quantity of information available about the engine's health and performance.
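
    The probabilistic evaluation described above can be sketched with a toy Monte Carlo propagation. The simple-cycle model, parameter means, and uncertainty magnitudes below are assumptions for illustration, not the study's actual engine model:

```python
import random
import statistics

def brayton_efficiency(pr, t1, t3, eta_c, eta_t, gamma=1.4):
    """Thermal efficiency of a simple Brayton cycle with component efficiencies."""
    k = (gamma - 1.0) / gamma
    tau = pr ** k                          # ideal compressor temperature ratio
    w_c = t1 * (tau - 1.0) / eta_c         # compressor work per unit cp (K)
    t2 = t1 + w_c                          # compressor exit temperature (K)
    w_t = eta_t * t3 * (1.0 - 1.0 / tau)   # turbine work per unit cp (K)
    return (w_t - w_c) / (t3 - t2)

random.seed(1)
samples = []
for _ in range(5000):
    eta_c = random.gauss(0.85, 0.01)       # uncertain component efficiencies
    eta_t = random.gauss(0.90, 0.01)
    samples.append(brayton_efficiency(pr=15.0, t1=300.0, t3=1500.0,
                                      eta_c=eta_c, eta_t=eta_t))
samples.sort()                             # sorted samples give the empirical CDF
mean = statistics.mean(samples)
p05 = samples[int(0.05 * len(samples))]
p95 = samples[int(0.95 * len(samples))]
print(f"mean eta = {mean:.3f}, 90% band = [{p05:.3f}, {p95:.3f}]")
```

    Sensitivity factors follow in the same way, e.g., by correlating each sampled input with the resulting efficiency samples.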

  15. Performance Analysis of Surfing: A Review.

    PubMed

    Farley, Oliver R L; Abbiss, Chris R; Sheppard, Jeremy M

    2017-01-01

    Farley, ORL, Abbiss, CR, and Sheppard, JM. Performance Analysis of Surfing: A Review. J Strength Cond Res 31(1): 260-271, 2017. Despite the increased professionalism and substantial growth of surfing worldwide, there is limited information available to practitioners and coaches in terms of key performance analytics that are common in other field-based sports. Indeed, research analyzing surfing performance is limited to a few studies examining male surfers' heart rates, surfing activities through time-motion analysis (TMA) using video recordings, and Global Positioning System (GPS) data during competition and recreational surfing. These studies have indicated that surfing involves a unique variety of activities (i.e., paddling, resting, wave riding, breath holding, and recovery of the surfboard in the surf). Furthermore, environmental and wave conditions also seem to influence the physical demands of competition surfing. It is due to these demands that surfers require high cardiorespiratory fitness, high muscular endurance, and considerable strength and anaerobic power, particularly within the upper torso. By exploring various methods of performance analysis used within other sports, it is possible to improve our understanding of surfing demands. Doing so will assist in the development of protocols and strategies to assess physiological characteristics of surfers, monitor athlete performance, improve training prescription, and identify talent. Therefore, this review explores the current literature to provide insights into methodological protocols, delimitations of research into athlete analysis, and an overview of surfing dynamics. Specifically, this review describes and reviews the use of TMA, GPS, and other technologies (i.e., HR) that are used in external and internal load monitoring as they pertain to surfing.

  16. Performance analysis and prediction in triathlon.

    PubMed

    Ofoghi, Bahadorreza; Zeleznikow, John; Macmahon, Clare; Rehula, Jan; Dwyer, Dan B

    2016-01-01

    Performance in triathlon is dependent upon factors that include somatotype, physiological capacity, technical proficiency and race strategy. Given the multidisciplinary nature of triathlon and the interaction between each of the three race components, the identification of target split times that can be used to inform the design of training plans and race pacing strategies is a complex task. The present study uses machine learning techniques to analyse a large database of performances in Olympic distance triathlons (2008-2012). The analysis reveals patterns of performance in five components of triathlon (three race "legs" and two transitions) and the complex relationships between performance in each component and overall performance in a race. The results provide three perspectives on the relationship between performance in each component of triathlon and the final placing in a race. These perspectives allow the identification of target split times that are required to achieve a certain final place in a race and the opportunity to make evidence-based decisions about race tactics in order to optimise performance.
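
    The relationship between component splits and overall result can be probed with simple statistics. The sketch below uses synthetic, independent splits (real splits are correlated, and the study used machine learning rather than plain correlation), so it only illustrates the idea:

```python
import random

def pearson(xs, ys):
    """Pearson correlation coefficient, stdlib only."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

random.seed(7)
components = ["swim", "T1", "bike", "T2", "run"]
mean_s = {"swim": 1100, "T1": 35, "bike": 3500, "T2": 30, "run": 1900}  # seconds
sd_s = {"swim": 60, "T1": 5, "bike": 120, "T2": 4, "run": 150}
athletes = [{c: random.gauss(mean_s[c], sd_s[c]) for c in components}
            for _ in range(200)]
totals = [sum(a.values()) for a in athletes]
corrs = {c: pearson([a[c] for a in athletes], totals) for c in components}
for c in components:
    print(f"{c:5s} r = {corrs[c]:+.2f}")
```

    With independent components, the leg with the largest spread (here the run) dominates the overall variation, which is one reason split-level analysis is informative.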

  17. NPAC-Nozzle Performance Analysis Code

    NASA Technical Reports Server (NTRS)

    Barnhart, Paul J.

    1997-01-01

    A simple and accurate nozzle performance analysis methodology has been developed. The geometry modeling requirements are minimal and very flexible, thus allowing rapid design evaluations. The solution techniques accurately couple: continuity, momentum, energy, state, and other relations which permit fast and accurate calculations of nozzle gross thrust. The control volume and internal flow analyses are capable of accounting for the effects of: over/under expansion, flow divergence, wall friction, heat transfer, and mass addition/loss across surfaces. The results from the nozzle performance methodology are shown to be in excellent agreement with experimental data for a variety of nozzle designs over a range of operating conditions.

  18. Multiprocessor Smalltalk: Implementation, performance, and analysis

    SciTech Connect

    Pallas, J.I.

    1990-01-01

    Multiprocessor Smalltalk demonstrates the value of object-oriented programming on a multiprocessor. Its implementation and analysis shed light on three areas: concurrent programming in an object-oriented language without special extensions, implementation techniques for adapting to multiprocessors, and performance factors in the resulting system. Adding parallelism to Smalltalk code is easy, because programs already use control abstractions like iterators. Smalltalk's basic control and concurrency primitives (lambda expressions, processes and semaphores) can be used to build parallel control abstractions, including parallel iterators, parallel objects, atomic objects, and futures. Language extensions for concurrency are not required. This implementation demonstrates that it is possible to build an efficient parallel object-oriented programming system and illustrates techniques for doing so. Three modification tools (serialization, replication, and reorganization) adapted the Berkeley Smalltalk interpreter to the Firefly multiprocessor. Multiprocessor Smalltalk's performance shows that the combination of multiprocessing and object-oriented programming can be effective: speedups (relative to the original serial version) exceed 2.0 for five processors on all the benchmarks; the median efficiency is 48%. Analysis shows both where performance is lost and how to improve and generalize the experimental results. Changes in the interpreter to support concurrency add at most 12% overhead; better access to per-process variables could eliminate much of that. Changes in the user code to express concurrency add as much as 70% overhead; this overhead could be reduced to 54% if blocks (lambda expressions) were reentrant. Performance is also lost when the program cannot keep all five processors busy.
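
    The speedup and efficiency figures quoted above follow from the standard definitions; a minimal sketch with hypothetical benchmark timings:

```python
def speedup_and_efficiency(t_serial, t_parallel, n_procs):
    """Speedup relative to the serial version, and per-processor efficiency."""
    s = t_serial / t_parallel
    return s, s / n_procs

# Hypothetical benchmark timings (seconds): serial interpreter vs. 5 processors.
s, e = speedup_and_efficiency(t_serial=10.0, t_parallel=4.0, n_procs=5)
print(f"speedup {s:.2f}x, efficiency {e:.0%}")
```

    A speedup above 2.0 on five processors, as reported, corresponds to a per-processor efficiency above 40%.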

  19. Nominal Performance Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    M. Wasiolek

    2004-09-08

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standard. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA-LA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the ''Biosphere Model Report'' in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events, and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. The objectives of this analysis are to develop BDCFs for the groundwater exposure scenario for the three climate states considered in the TSPA-LA as well as conversion factors for evaluating compliance with the groundwater protection standard. The BDCFs will be used in performance assessment for calculating all-pathway annual doses for a given concentration of radionuclides in groundwater. The conversion factors will be used for calculating gross alpha particle activity in groundwater and the annual dose

  20. Automated Cache Performance Analysis And Optimization

    SciTech Connect

    Mohror, Kathryn

    2013-12-23

    While there is no lack of performance counter tools for coarse-grained measurement of cache activity, there is a critical lack of tools for relating data layout to cache behavior to application performance. Generally, any nontrivial optimizations are either not done at all, or are done "by hand", requiring significant time and expertise. To the best of our knowledge, no tool available to users measures the latency of memory reference instructions for particular addresses and makes this information available to users in an easy-to-use and intuitive way. In this project, we worked to enable the Open|SpeedShop performance analysis tool to gather memory reference latency information for specific instructions and memory addresses, and to gather and display this information in an easy-to-use and intuitive way to aid performance analysts in identifying problematic data structures in their codes. This tool was primarily designed for use in the supercomputer domain as well as grid, cluster, cloud-based parallel e-commerce, and engineering systems and middleware. Ultimately, we envision a tool to automate optimization of application cache layout and utilization in the Open|SpeedShop performance analysis tool. To commercialize this software, we worked to develop core capabilities for gathering enhanced memory usage performance data from applications and create and apply novel methods for automatic data structure layout optimizations, tailoring the overall approach to support existing supercomputer and cluster programming models and constraints. In this Phase I project, we focused on infrastructure necessary to gather performance data and present it in an intuitive way to users. With the advent of enhanced Precise Event-Based Sampling (PEBS) counters on recent Intel processor architectures and equivalent technology on AMD processors, we are now in a position to access memory reference information for particular addresses. Prior to the introduction of PEBS counters

  1. Performance management in healthcare: a critical analysis.

    PubMed

    Hewko, Sarah J; Cummings, Greta G

    2016-01-01

    Purpose - The purpose of this paper is to explore the underlying theoretical assumptions and implications of current micro-level performance management and evaluation (PME) practices, specifically within health-care organizations. PME encompasses all activities that are designed and conducted to align employee outputs with organizational goals. Design/methodology/approach - PME, in the context of healthcare, is analyzed through the lens of critical theory. Specifically, Habermas' theory of communicative action is used to highlight some of the questions that arise in looking critically at PME. To provide a richer definition of key theoretical concepts, the authors conducted a preliminary, exploratory hermeneutic semantic analysis of the key words "performance" and "management" and of the term "performance management". Findings - Analysis reveals that existing micro-level PME systems in health-care organizations have the potential to create a workforce that is compliant, dependent, technically oriented and passive, and to support health-care systems in which inequalities and power imbalances are perpetually reinforced. Practical implications - At a time when the health-care system is under increasing pressure to provide high-quality, affordable services with fewer resources, it may be wise to investigate new sector-specific ways of evaluating and managing performance. Originality/value - In this paper, written for health-care leaders and health human resource specialists, the theoretical assumptions and implications of current PME practices within health-care organizations are explored. It is hoped that readers will be inspired to support innovative PME practices within their organizations that encourage peak performance among health-care professionals.

  2. IBIS detector performance during calibration - preliminary analysis

    NASA Astrophysics Data System (ADS)

    Bazzano, A.; Bird, A. J.; Laurent, P.; Malaguti, G.; Quadrini, E. M.; Segreto, A.; Volkmer, R.; del Santo, M.; Gabriele, M.; Tikkanen, T.

    2003-11-01

    The IBIS telescope is a high angular resolution gamma-ray imager due to be launched on the INTEGRAL satellite on October 17, 2002. The scientific goal of IBIS is to study astrophysical processes from celestial sources and diffuse regions in the hard X-ray and soft gamma-ray domains. IBIS features a coded aperture imaging system and a novel large area (~3000 cm²) multilayer pixellated detector which utilises both cadmium telluride (16,384 detectors) and caesium iodide elements (4096 detectors) surrounded by a BGO active veto shield. We present an overview of, and preliminary analysis from, the IBIS calibration campaign. The performance of each pixel has been characterised, and hence the scientific performance of the IBIS detector system as a whole can now be established.

  3. Analysis approaches and interventions with occupational performance

    PubMed Central

    Ahn, Sinae

    2016-01-01

    [Purpose] The purpose of this study was to analyze approaches and interventions targeting occupational performance in patients with stroke. [Subjects and Methods] In this study, articles published in the past 10 years were searched. The key terms used were “occupational performance AND stroke” and “occupational performance AND CVA”. A total of 252 articles were identified, and 79 articles were selected. All interventions were classified according to 6 theoretical approaches, and all interventions were analyzed for frequency. [Results] Regarding the approaches, the biomechanical approach was the most frequent, with 25 articles (31.6%) providing interventions based on it. These included electrical stimulation therapy, robot therapy, and sensory stimulation training, among others. Analysis of the frequency of interventions revealed that the most commonly used interventions, appearing in 18 articles (22.8%), made use of the concept of constraint-induced therapy. [Conclusion] The results of this study suggest an approach for use in clinics for selecting an appropriate intervention for occupational performance. PMID:27799719

  4. Performance analysis of quantum dots infrared photodetector

    NASA Astrophysics Data System (ADS)

    Liu, Hongmei; Zhang, Fangfang; Zhang, Jianqi; He, Guojing

    2011-08-01

    Performance analysis of the quantum dot infrared photodetector (QDIP), which can provide device designers with theoretical guidance and experimental verification, has attracted wide interest and become an active research topic in recent years. In this paper, the performance of the QDIP is discussed and summarized in comparison with the characteristics of the quantum well infrared photodetector (QWIP), by analyzing the special properties of quantum dot material. Specifically, the dark current density and the detectivity under normal incidence are obtained from the Phillips performance model, the carrier lifetime and the dark current of the QDIP are studied in combination with the "phonon bottleneck" effect, and the detectivity of the QDIP is theoretically derived by considering the photoconductive gain under the influence of the capture probability. From the results, it is concluded that the QDIP can not only receive normally incident light but also offers the advantages of long carrier lifetime, large photoconductive gain, low dark current, and so on, further illustrating the anticipated superiority of the QDIP in performance and its potential wide use in many engineering fields.

  5. Diversity Performance Analysis on Multiple HAP Networks

    PubMed Central

    Dong, Feihong; Li, Min; Gong, Xiangwu; Li, Hongjun; Gao, Fengyue

    2015-01-01

    One of the main design challenges in wireless sensor networks (WSNs) is achieving a high-data-rate transmission for individual sensor devices. The high altitude platform (HAP) is an important communication relay platform for WSNs and next-generation wireless networks. Multiple-input multiple-output (MIMO) techniques provide the diversity and multiplexing gain, which can improve the network performance effectively. In this paper, a virtual MIMO (V-MIMO) model is proposed by networking multiple HAPs with the concept of multiple assets in view (MAV). In a shadowed Rician fading channel, the diversity performance is investigated. The probability density function (PDF) and cumulative distribution function (CDF) of the received signal-to-noise ratio (SNR) are derived. In addition, the average symbol error rate (ASER) with BPSK and QPSK is given for the V-MIMO model. The system capacity is studied for both perfect channel state information (CSI) and unknown CSI individually. The ergodic capacity with various SNR and Rician factors for different network configurations is also analyzed. The simulation results validate the effectiveness of the performance analysis. It is shown that the performance of the HAPs network in WSNs can be significantly improved by utilizing the MAV to achieve overlapping coverage, with the help of the V-MIMO techniques. PMID:26134102
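
    The symbol error rate analysis can be illustrated by Monte Carlo simulation. The sketch below is a single-link BPSK system over a plain (non-shadowed) Rician fading channel with an assumed K-factor; it is not the paper's closed-form V-MIMO derivation, only the underlying idea:

```python
import math
import random

def bpsk_ser_rician(snr_db, k_factor, n_symbols=100_000, seed=3):
    """Monte Carlo symbol error rate for BPSK over a flat Rician fading channel."""
    rng = random.Random(seed)
    snr = 10 ** (snr_db / 10)
    # Rician gain: deterministic LOS part plus scattered Gaussian part, E[|h|^2] = 1.
    los = math.sqrt(k_factor / (k_factor + 1))
    sigma = math.sqrt(1 / (2 * (k_factor + 1)))
    n0 = 1 / snr                        # noise power for unit symbol energy
    errors = 0
    for _ in range(n_symbols):
        s = rng.choice((-1.0, 1.0))
        hr = los + sigma * rng.gauss(0, 1)
        hi = sigma * rng.gauss(0, 1)
        nr = rng.gauss(0, math.sqrt(n0 / 2))
        ni = rng.gauss(0, math.sqrt(n0 / 2))
        # Coherent detection: project the received sample onto the channel.
        metric = hr * (hr * s + nr) + hi * (hi * s + ni)
        errors += (metric > 0) != (s > 0)
    return errors / n_symbols

sers = {db: bpsk_ser_rician(db, k_factor=5) for db in (0, 5, 10)}
for db, ser in sers.items():
    print(f"{db:2d} dB -> SER = {ser:.4f}")
```

    As expected, the simulated SER falls as the SNR rises; raising the K-factor (a stronger line-of-sight component) lowers it further.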

  6. Nominal Performance Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    M.A. Wasiolek

    2005-04-28

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standards. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop BDCFs, which are input parameters for the TSPA-LA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the ''Biosphere Model Report'' in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events, and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis'' (Figure 1-1). The objectives of this analysis are to develop BDCFs for the

  7. PERFORMANCE ANALYSIS OF MECHANICAL DRAFT COOLING TOWER

    SciTech Connect

    Lee, S.; Garrett, A.; Bollinger, J.; Koffman, L.

    2009-02-10

    Industrial processes use mechanical draft cooling towers (MDCTs) to dissipate waste heat by transferring heat from water to air via evaporative cooling, which causes air humidification. The Savannah River Site (SRS) has cross-flow and counter-current MDCTs consisting of four independent compartments called cells. Each cell has its own fan to help maximize heat transfer between ambient air and circulated water. The primary objective of this work is to simulate the performance of the counter-current cooling tower and to conduct a parametric study under different fan speeds and ambient air conditions. The Savannah River National Laboratory (SRNL) developed a computational fluid dynamics (CFD) model and performed a benchmarking analysis against the integral measurement results to accomplish the objective. The model uses three-dimensional steady-state momentum and continuity equations, an air-vapor species balance equation, and two-equation turbulence closure as the basic governing equations. It was assumed that the vapor phase is always transported by the continuous air phase with no slip velocity. The water droplet component was considered as a discrete phase for the interfacial heat and mass transfer via a Lagrangian approach. Thus, the air-vapor mixture model with a discrete water droplet phase is used for the analysis. A series of parametric calculations was performed to investigate the impact of wind speeds and ambient conditions on the thermal performance of the cooling tower when fans were operating and when they were turned off. The model was also benchmarked against the literature data and the SRS integral test results for key parameters such as air temperature and humidity at the tower exit and water temperature for given ambient conditions. Detailed results are presented here.

  8. Idaho National Laboratory Quarterly Performance Analysis

    SciTech Connect

    Mitchell, Lisbeth

    2014-11-01

    This report is published quarterly by the Idaho National Laboratory (INL) Quality and Performance Management Organization. The Department of Energy (DOE) Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of 60 reportable events (23 from the 4th Qtr FY14 and 37 from the prior three reporting quarters) as well as 58 other issue reports (including not reportable events and Significant Category A and B conditions) identified at INL from July 2013 through October 2014. Battelle Energy Alliance (BEA) operates the INL under contract DE-AC07-05ID14517.

  9. Performance Analysis on Fault Tolerant Control System

    NASA Technical Reports Server (NTRS)

    Shin, Jong-Yeob; Belcastro, Christine

    2005-01-01

    In a fault tolerant control (FTC) system, a parameter varying FTC law is reconfigured based on fault parameters estimated by fault detection and isolation (FDI) modules. FDI modules require some time to detect fault occurrences in aero-vehicle dynamics. In this paper, an FTC analysis framework is provided to calculate the upper bound of the induced L2 norm of an FTC system in the presence of false identification and detection time delay. The upper bound is written as a function of the fault detection time and exponential decay rates, and has been used to determine which FTC law produces less performance degradation (tracking error) due to false identification. The analysis framework is applied to an FTC system of a HiMAT (Highly Maneuverable Aircraft Technology) vehicle. Index Terms: fault tolerant control system; linear parameter varying system; HiMAT vehicle.

  10. SUBSONIC WIND TUNNEL PERFORMANCE ANALYSIS SOFTWARE

    NASA Technical Reports Server (NTRS)

    Eckert, W. T.

    1994-01-01

    This program was developed as an aid in the design and analysis of subsonic wind tunnels. It brings together and refines previously scattered and over-simplified techniques used for the design and loss prediction of the components of subsonic wind tunnels. It implements a system of equations for determining the total pressure losses and provides general guidelines for the design of diffusers, contractions, corners and the inlets and exits of non-return tunnels. The algorithms used in the program are applicable to compressible flow through most closed- or open-throated, single-, double- or non-return wind tunnels or ducts. A comparison between calculated performance and that actually achieved by several existing facilities produced generally good agreement. Any system through which air is flowing which involves turns, fans, contractions etc. (e.g., an HVAC system) may benefit from analysis using this software. This program is an update of ARC-11138 which includes PC compatibility and an improved user interface. The method of loss analysis used by the program is a synthesis of theoretical and empirical techniques. Generally, the algorithms used are those which have been substantiated by experimental test. The basic flow-state parameters used by the program are determined from input information about the reference control section and the test section. These parameters were derived from standard relationships for compressible flow. The local flow conditions, including Mach number, Reynolds number and friction coefficient are determined for each end of each component or section. The loss in total pressure caused by each section is calculated in a form non-dimensionalized by local dynamic pressure. The individual losses are based on the nature of the section, local flow conditions and input geometry and parameter information. 
The loss forms for typical wind tunnel sections considered by the program include: constant area ducts, open throat ducts, contractions, constant
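
    The loss bookkeeping described above (per-section coefficients non-dimensionalized by local dynamic pressure) can be sketched as follows. The areas and coefficients are hypothetical, and the sketch assumes incompressible flow, whereas the program itself handles compressibility:

```python
def tunnel_energy_ratio(sections, a_test):
    """Sum per-section losses referenced to test-section dynamic pressure.

    Each section is (name, K_local, area) with K_local = dp0 / q_local.  For
    low-speed (incompressible) flow, q_local / q_test = (a_test / area) ** 2,
    so each section contributes K_local * (a_test / area) ** 2 to the total.
    """
    k_total = sum(k * (a_test / area) ** 2 for _, k, area in sections)
    return k_total, 1.0 / k_total      # energy ratio = 1 / sum of referenced K

sections = [  # hypothetical closed-return tunnel; areas in m^2
    ("test section", 0.02, 4.0),
    ("diffuser",     0.20, 9.0),
    ("corners",      0.50, 9.0),
    ("fan duct",     0.10, 9.0),
    ("contraction",  0.04, 9.0),
]
k_total, er = tunnel_energy_ratio(sections, a_test=4.0)
print(f"total loss coefficient {k_total:.4f}, energy ratio {er:.2f}")
```

    The reciprocal of the summed referenced coefficients is the tunnel's energy ratio, a common figure of merit for circuit efficiency.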

  11. Failure analysis of high performance ballistic fibers

    NASA Astrophysics Data System (ADS)

    Spatola, Jennifer S.

    High performance fibers have a high tensile strength and modulus, good wear resistance, and a low density, making them ideal for applications in ballistic impact resistance, such as body armor. However, the observed ballistic performance of these fibers is much lower than the predicted values. Since the predictions assume only tensile stress failure, it is safe to assume that the stress state is affecting fiber performance. The purpose of this research was to determine if there are failure mode changes in the fiber fracture when transversely loaded by indenters of different shapes. An experimental design mimicking transverse impact was used to determine any such effects. Three different indenters were used: round, FSP, and razor blade. The indenter height was changed to change the angle of failure tested. Five high performance fibers were examined: Kevlar KM2, Spectra 130d, Dyneema SK-62 and SK-76, and Zylon 555. Failed fibers were analyzed using an SEM to determine failure mechanisms. The results show that the round and razor blade indenters produced a constant failure strain, as well as failure mechanisms independent of testing angle. The FSP indenter produced a decrease in failure strain as the angle increased. Fibrillation was the dominant failure mechanism at all angles for the round indenter, while through thickness shearing was the failure mechanism for the razor blade. The FSP indenter showed a transition from fibrillation at low angles to through thickness shearing at high angles, indicating that the round and razor blade indenters are extreme cases of the FSP indenter. The failure mechanisms observed with the FSP indenter at various angles correlated with the experimental strain data obtained during fiber testing. This indicates that geometry of the indenter tip in compression is a contributing factor in lowering the failure strain of the high performance fibers.
TEM analysis of the fiber failure mechanisms was also attempted, though without

  12. Nominal Performance Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    Wasiolek, Maryla A.

    2000-12-21

    The purpose of this report was to document the process leading to development of the Biosphere Dose Conversion Factors (BDCFs) for the postclosure nominal performance of the potential repository at Yucca Mountain. BDCF calculations concerned twenty-four radionuclides. This selection included sixteen radionuclides that may be significant nominal performance dose contributors during the compliance period of up to 10,000 years, five additional radionuclides of importance for up to 1 million years postclosure, and three relatively short-lived radionuclides important for the human intrusion scenario. Consideration of radionuclide buildup in soil caused by previous irrigation with contaminated groundwater was taken into account in the BDCF development. The effect of climate evolution, from the current arid conditions to a wetter and cooler climate, on the BDCF values was evaluated. The analysis included consideration of different exposure pathways' contributions to the BDCFs. Calculations of nominal performance BDCFs used the GENII-S computer code in a series of probabilistic realizations to propagate the uncertainties of input parameters into the output. BDCFs for the nominal performance, when combined with the concentrations of radionuclides in groundwater, allow calculation of potential radiation doses to the receptor of interest. Calculated estimates of radionuclide concentration in groundwater result from the saturated zone modeling. The integration of the biosphere modeling results (BDCFs) with the outcomes of the other component models is accomplished in the Total System Performance Assessment (TSPA) to calculate doses to the receptor of interest from radionuclides postulated to be released to the environment from the potential repository at Yucca Mountain.

  13. Approaches to Cycle Analysis and Performance Metrics

    NASA Technical Reports Server (NTRS)

    Parson, Daniel E.

    2003-01-01

    The following notes were prepared as part of an American Institute of Aeronautics and Astronautics (AIAA) sponsored short course entitled Air Breathing Pulse Detonation Engine (PDE) Technology. The course was presented in January of 2003, and again in July of 2004 at two different AIAA meetings. It was taught by seven instructors, each of whom provided information on particular areas of PDE research. These notes cover two areas. The first is titled Approaches to Cycle Analysis and Performance Metrics. Here, the various methods of cycle analysis are introduced. These range from algebraic, thermodynamic equations, to single and multi-dimensional Computational Fluid Dynamic (CFD) solutions. Also discussed are the various means by which performance is measured, and how these are applied in a device which is fundamentally unsteady. The second topic covered is titled PDE Hybrid Applications. Here the concept of coupling a PDE to a conventional turbomachinery based engine is explored. Motivation for such a configuration is provided in the form of potential thermodynamic benefits. This is accompanied by a discussion of challenges to the technology.

  14. Axial and centrifugal pump meanline performance analysis

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.

    1994-01-01

    A meanline pump flow modeling method has been developed to provide a fast capability for modeling pumps of cryogenic rocket engines. Based on this method, a meanline pump flow code (PUMPA) has been written that can predict the performance of pumps at off-design operating conditions, given the loss of the diffusion system at the design point. The design point rotor efficiency is obtained from empirically derived correlations of loss to rotor specific speed. The rapid input setup and computer run time for the meanline pump flow code makes it an effective analysis and conceptual design tool. The map generation capabilities of the PUMPA code provide the information needed for interfacing with a rocket engine system modeling code.
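The abstract relates design-point rotor efficiency to rotor specific speed. As a rough illustration of that correlation's input (not the PUMPA code itself, and with invented numbers), the US-customary specific speed can be computed as:

```python
import math

def specific_speed(rpm, flow_gpm, head_ft):
    """US-customary pump specific speed: Ns = N * sqrt(Q) / H^0.75."""
    return rpm * math.sqrt(flow_gpm) / head_ft ** 0.75

# Illustrative numbers only (not PUMPA data): a high-speed rocket pump stage
ns = specific_speed(30000.0, 1000.0, 5000.0)
```

Empirical loss correlations of the kind mentioned in the abstract are typically tabulated against this dimensional Ns value.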

  15. Stormwater quality models: performance and sensitivity analysis.

    PubMed

    Dotto, C B S; Kleidorfer, M; Deletic, A; Fletcher, T D; McCarthy, D T; Rauch, W

    2010-01-01

    The complex nature of pollutant accumulation and washoff, along with high temporal and spatial variations, pose challenges for the development and establishment of accurate and reliable models of the pollution generation process in urban environments. Therefore, the search for reliable stormwater quality models remains an important area of research. Model calibration and sensitivity analysis of such models are essential in order to evaluate model performance; it is very unlikely that non-calibrated models will lead to reasonable results. This paper reports on the testing of three models which aim to represent pollutant generation from urban catchments. Assessment of the models was undertaken using a simplified Monte Carlo Markov Chain (MCMC) method. Results are presented in terms of performance, sensitivity to the parameters and correlation between these parameters. In general, it was suggested that the tested models poorly represent reality and result in a high level of uncertainty. The conclusions provide useful information for the improvement of existing models and insights for the development of new model formulations.
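The calibration approach described above can be sketched in miniature. The code below is a generic random-walk Metropolis sampler applied to a toy one-parameter washoff model; the model form, data, and tuning constants are all invented for illustration and are not from the study:

```python
import math
import random

def log_likelihood(theta, data, sigma=1.0):
    # Toy washoff model: predicted pollutant load proportional to rainfall
    return -sum((obs - theta * rain) ** 2 for rain, obs in data) / (2 * sigma ** 2)

def metropolis(data, n_iter=5000, step=0.1, theta0=1.0, seed=42):
    """Random-walk Metropolis sampler for a single model parameter."""
    random.seed(seed)
    theta, ll = theta0, log_likelihood(theta0, data)
    chain = []
    for _ in range(n_iter):
        prop = theta + random.gauss(0.0, step)
        ll_prop = log_likelihood(prop, data)
        if math.log(random.random()) < ll_prop - ll:   # accept/reject step
            theta, ll = prop, ll_prop
        chain.append(theta)
    return chain

# Synthetic observations generated with a "true" coefficient of 2.0
data = [(r, 2.0 * r) for r in (1.0, 2.0, 3.0, 4.0)]
chain = metropolis(data)
posterior_mean = sum(chain[1000:]) / len(chain[1000:])
```

The spread of the post-burn-in chain is what supplies the parameter-sensitivity and correlation information the paper reports.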

  16. Past Performance analysis of HPOTP bearings

    NASA Technical Reports Server (NTRS)

    Bhat, B. N.; Dolan, F. J.

    1982-01-01

    The past performance analysis conducted on three High Pressure Oxygen Turbopump (HPOTP) bearings from the Space Shuttle Main Engine is presented. Metallurgical analysis of failed bearing balls and races, and wear track and crack configuration analyses, were carried out. In addition, one bearing was tested in the laboratory at very high axial loads. The results showed that the cracks were surface initiated and propagated into subsurface locations at relatively small angles. Subsurface cracks were much more extensive than appeared on the surface. The location of major cracks in the races corresponded to high radial loads rather than high axial loads. There was evidence to suggest that the inner races were heated to elevated temperatures. A failure scenario was developed based on the above findings. According to this scenario, the HPOTP bearings are heated by a combination of high loads and a high coefficient of friction (poor lubrication). Different methods of extending HPOTP bearing life are also discussed. These include reduction of axial loads, improvements in bearing design, lubrication and cooling, and use of improved bearing materials.

  17. Radio-science performance analysis software

    NASA Technical Reports Server (NTRS)

    Morabito, D. D.; Asmar, S. W.

    1995-01-01

    The Radio Science Systems Group (RSSG) provides various support functions for several flight project radio-science teams. Among these support functions are uplink and sequence planning, real-time operations monitoring and support, data validation, archiving and distribution functions, and data processing and analysis. This article describes the support functions that encompass radio-science data performance analysis. The primary tool used by the RSSG to fulfill this support function is the STBLTY program set. STBLTY is used to reconstruct observable frequencies and calculate model frequencies, frequency residuals, frequency stability in terms of Allan deviation, reconstructed phase, frequency and phase power spectral density, and frequency drift rates. In the case of one-way data, using an ultrastable oscillator (USO) as a frequency reference, the program set computes the spacecraft transmitted frequency and maintains a database containing the in-flight history of the USO measurements. The program set also produces graphical displays. Some examples and discussions on operating the program set on Galileo and Ulysses data will be presented.
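The frequency-stability measure named above (Allan deviation) is standard and can be sketched independently of the STBLTY program set; the sample data below are synthetic, not USO measurements:

```python
import math

def allan_deviation(y, m=1):
    """Non-overlapping Allan deviation of fractional-frequency samples y
    at an averaging factor m (tau = m * tau0)."""
    means = [sum(y[i:i + m]) / m for i in range(0, len(y) - m + 1, m)]
    diffs = [(b - a) ** 2 for a, b in zip(means, means[1:])]
    return math.sqrt(sum(diffs) / (2 * len(diffs)))

# Alternating +/-1e-12 fractional-frequency samples as a synthetic input
y = [1e-12 if i % 2 == 0 else -1e-12 for i in range(1000)]
adev = allan_deviation(y)
```

For this alternating sequence every adjacent pair differs by 2e-12, so the Allan deviation equals 2e-12 divided by the square root of two.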

  18. Performance Analysis of ICA in Sensor Array

    PubMed Central

    Cai, Xin; Wang, Xiang; Huang, Zhitao; Wang, Fenghua

    2016-01-01

    As the best-known scheme in the field of Blind Source Separation (BSS), Independent Component Analysis (ICA) has been intensively used in various domains, including biomedical and acoustics applications, cooperative or non-cooperative communication, etc. While sensor arrays are involved in most of these applications, the influence of practical factors on the performance of ICA has not been sufficiently investigated yet. In this manuscript, the issue is researched by taking the typical antenna array as an illustrative example. Factors taken into consideration include the environment noise level, the properties of the array, and those of the radiators. We analyze the analytic relationship between the noise variance, the source variance, the condition number of the mixing matrix, and the optimal signal-to-interference-plus-noise ratio, as well as the relationship between the singularity of the mixing matrix and the practical factors concerned. Situations where the mixing process turns (nearly) singular have received special attention, since such circumstances are critical in applications. Results and conclusions obtained should be instructive when applying ICA algorithms to mixtures from sensor arrays. Moreover, an effective countermeasure against the cases of singular mixtures has been proposed on the basis of the previous analysis. Experiments validating the theoretical conclusions as well as the effectiveness of the proposed scheme have been included. PMID:27164100
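The link between array geometry and mixing-matrix singularity can be illustrated with a minimal sketch (a two-element uniform linear array with assumed half-wavelength spacing; the angles are invented, and this is not the paper's model):

```python
import numpy as np

def mixing_condition(angles_deg, n_elements=2, spacing_wavelengths=0.5):
    """Condition number of the array steering ('mixing') matrix for
    far-field narrowband sources on a uniform linear antenna array."""
    n = np.arange(n_elements)
    phases = np.outer(n, np.sin(np.deg2rad(angles_deg)))
    a = np.exp(2j * np.pi * spacing_wavelengths * phases)
    return float(np.linalg.cond(a))

well_separated = mixing_condition([-30.0, 40.0])
nearly_singular = mixing_condition([10.0, 11.0])   # near-collinear columns
```

As the two source directions approach each other the steering columns become collinear and the condition number blows up, which is exactly the near-singular regime the abstract flags as critical.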

  19. Radio-science performance analysis software

    NASA Astrophysics Data System (ADS)

    Morabito, D. D.; Asmar, S. W.

    1995-02-01

    The Radio Science Systems Group (RSSG) provides various support functions for several flight project radio-science teams. Among these support functions are uplink and sequence planning, real-time operations monitoring and support, data validation, archiving and distribution functions, and data processing and analysis. This article describes the support functions that encompass radio-science data performance analysis. The primary tool used by the RSSG to fulfill this support function is the STBLTY program set. STBLTY is used to reconstruct observable frequencies and calculate model frequencies, frequency residuals, frequency stability in terms of Allan deviation, reconstructed phase, frequency and phase power spectral density, and frequency drift rates. In the case of one-way data, using an ultrastable oscillator (USO) as a frequency reference, the program set computes the spacecraft transmitted frequency and maintains a database containing the in-flight history of the USO measurements. The program set also produces graphical displays. Some examples and discussions on operating the program set on Galileo and Ulysses data will be presented.

  20. Space Shuttle Main Engine performance analysis

    NASA Astrophysics Data System (ADS)

    Santi, L. Michael

    1993-11-01

    For a number of years, NASA has relied primarily upon periodically updated versions of Rocketdyne's power balance model (PBM) to provide space shuttle main engine (SSME) steady-state performance prediction. A recent computational study indicated that PBM predictions do not satisfy fundamental energy conservation principles. More recently, SSME test results provided by the Technology Test Bed (TTB) program have indicated significant discrepancies between PBM flow and temperature predictions and TTB observations. Results of these investigations have diminished confidence in the predictions provided by PBM, and motivated the development of new computational tools for supporting SSME performance analysis. A multivariate least squares regression algorithm was developed and implemented during this effort in order to efficiently characterize TTB data. This procedure, called the 'gains model,' was used to approximate the variation of SSME performance parameters such as flow rate, pressure, temperature, speed, and assorted hardware characteristics in terms of six assumed independent influences. These six influences were engine power level, mixture ratio, fuel inlet pressure and temperature, and oxidizer inlet pressure and temperature. A BFGS optimization algorithm provided the base procedure for determining regression coefficients for both linear and full quadratic approximations of parameter variation. Statistical information relative to data deviation from regression derived relations was also computed. A new strategy for integrating test data with theoretical performance prediction was also investigated. The current integration procedure employed by PBM treats test data as pristine and adjusts hardware characteristics in a heuristic manner to achieve engine balance. Within PBM, this integration procedure is called 'data reduction.' By contrast, the new data integration procedure, termed 'reconciliation,' uses mathematical optimization techniques, and requires both
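The full quadratic regression described above is linear in its coefficients, so it can be sketched with an ordinary least-squares solve rather than the BFGS procedure the study used. Everything below (two mock influences, coefficients, noise level) is invented for illustration:

```python
import numpy as np

def quadratic_design_matrix(x):
    """Columns 1, x_i, and x_i*x_j (i <= j): a full quadratic model."""
    n, k = x.shape
    cols = [np.ones(n)]
    cols += [x[:, i] for i in range(k)]
    cols += [x[:, i] * x[:, j] for i in range(k) for j in range(i, k)]
    return np.column_stack(cols)

rng = np.random.default_rng(0)
# Two mock influences (say, power level and mixture ratio) and a response
x = rng.uniform(-1.0, 1.0, size=(200, 2))
y = 3.0 + 1.5 * x[:, 0] - 0.5 * x[:, 0] * x[:, 1] \
    + 0.01 * rng.standard_normal(200)

# Column order: [1, x0, x1, x0^2, x0*x1, x1^2]
coef, *_ = np.linalg.lstsq(quadratic_design_matrix(x), y, rcond=None)
```

With six influences the same construction simply yields more columns; the residuals from the fit supply the statistical deviation information the abstract mentions.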

  1. Space Shuttle Main Engine performance analysis

    NASA Technical Reports Server (NTRS)

    Santi, L. Michael

    1993-01-01

    For a number of years, NASA has relied primarily upon periodically updated versions of Rocketdyne's power balance model (PBM) to provide space shuttle main engine (SSME) steady-state performance prediction. A recent computational study indicated that PBM predictions do not satisfy fundamental energy conservation principles. More recently, SSME test results provided by the Technology Test Bed (TTB) program have indicated significant discrepancies between PBM flow and temperature predictions and TTB observations. Results of these investigations have diminished confidence in the predictions provided by PBM, and motivated the development of new computational tools for supporting SSME performance analysis. A multivariate least squares regression algorithm was developed and implemented during this effort in order to efficiently characterize TTB data. This procedure, called the 'gains model,' was used to approximate the variation of SSME performance parameters such as flow rate, pressure, temperature, speed, and assorted hardware characteristics in terms of six assumed independent influences. These six influences were engine power level, mixture ratio, fuel inlet pressure and temperature, and oxidizer inlet pressure and temperature. A BFGS optimization algorithm provided the base procedure for determining regression coefficients for both linear and full quadratic approximations of parameter variation. Statistical information relative to data deviation from regression derived relations was also computed. A new strategy for integrating test data with theoretical performance prediction was also investigated. The current integration procedure employed by PBM treats test data as pristine and adjusts hardware characteristics in a heuristic manner to achieve engine balance. Within PBM, this integration procedure is called 'data reduction.' By contrast, the new data integration procedure, termed 'reconciliation,' uses mathematical optimization techniques, and requires both

  2. Data Link Performance Analysis for LVLASO Experiments

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi

    1998-01-01

    Low-visibility Landing and Surface Operations System (LVLASO) is currently being prototyped and tested at NASA Langley Research Center. Since the main objective of the system is to maintain aircraft landings and take-offs even during low-visibility conditions, timely exchange of positional and other information between the aircraft and the ground control is critical. For safety and reliability reasons, there are several redundant sources on the ground (e.g., ASDE, AMASS) that collect and disseminate information about the environment to the aircraft. The data link subsystem of LVLASO is responsible for supporting the timely transfer of information between the aircraft and the ground controllers. In fact, if not properly designed, the data link subsystem could become a bottleneck in the proper functioning of LVLASO. Currently, the other components of the system are being designed assuming that the data link has adequate capacity and is capable of delivering the information in a timely manner. During August 1-28, 1997, several flight experiments were conducted to test the prototypes of subsystems developed under the LVLASO project. The background and details of the tests are described in the next section. The test results have been collected in two CDs by FAA and Rockwell-Collins. Under the current grant, we have analyzed the data and evaluated the performance of the Mode S data link. In this report, we summarize the results of our analysis. Most of the results are shown in terms of graphs or histograms. The test date (or experiment number) was often taken as the X-axis, with the Y-axis denoting the metric of focus in that chart. In interpreting these charts, one needs to take into account the vehicular traffic during a particular experiment.
In general, the performance of the data link was found to be quite satisfactory in terms of delivering long and short Mode S squitters from the vehicles to the ground receiver. Similarly, its performance in delivering control

  3. Performance analysis of robust road sign identification

    NASA Astrophysics Data System (ADS)

    Ali, Nursabillilah M.; Mustafah, Y. M.; Rashid, N. K. A. M.

    2013-12-01

    This study describes the performance analysis of a robust system for road sign identification that incorporates two stages of different algorithms. The proposed algorithms consist of HSV color filtering and PCA techniques in the detection and recognition stages, respectively. The proposed algorithms are able to detect the three standard types of colored signs, namely Red, Yellow and Blue. The hypothesis of the study is that road sign images can be used to detect and identify signs even in the presence of occlusions and rotational changes. PCA is a feature extraction technique that reduces dimensional size; a sign image can be easily recognized and identified by the PCA method, as it has been used in many application areas. Based on the experimental results, HSV is robust in road sign detection, with minimum successful rates of 88% and 77% for non-partial and partial occlusion images, respectively. Successful recognition rates using PCA are in the range of 94-98%. All classes are recognized successfully at occlusion levels between 5% and 10%.
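The PCA recognition stage described above can be sketched generically: project gallery and probe images onto the leading principal axes and classify by nearest neighbour. The data below are synthetic clusters standing in for sign images, not the study's dataset:

```python
import numpy as np

def pca_fit(samples, n_components):
    """PCA via SVD of mean-centred data: returns mean and principal axes."""
    mean = samples.mean(axis=0)
    _, _, vt = np.linalg.svd(samples - mean, full_matrices=False)
    return mean, vt[:n_components]

def nearest_class(x, gallery, labels, mean, axes):
    """Classify by nearest neighbour in the reduced PCA space."""
    z = (x - mean) @ axes.T
    zg = (gallery - mean) @ axes.T
    return labels[int(np.argmin(np.linalg.norm(zg - z, axis=1)))]

rng = np.random.default_rng(1)
# Two mock "sign" classes as clusters in a 64-dimensional feature space
red = rng.normal(0.0, 0.1, (20, 64)) + 1.0
blue = rng.normal(0.0, 0.1, (20, 64)) - 1.0
gallery = np.vstack([red, blue])
labels = ["red"] * 20 + ["blue"] * 20
mean, axes = pca_fit(gallery, 5)
probe = red[0] + rng.normal(0.0, 0.05, 64)   # a perturbed red sample
pred = nearest_class(probe, gallery, labels, mean, axes)
```

Partial occlusion corresponds to perturbing part of the probe vector; the projection onto a few axes is what gives the method its tolerance to such changes.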

  4. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    EPA Science Inventory

    This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provides modelers with statistical goodness-of-fit m...

  5. NEXT Performance Curve Analysis and Validation

    NASA Technical Reports Server (NTRS)

    Saripalli, Pratik; Cardiff, Eric; Englander, Jacob

    2016-01-01

    Performance curves of the NEXT thruster are highly important in determining the thruster's ability in performing towards mission-specific goals. New performance curves are proposed and examined here. The Evolutionary Mission Trajectory Generator (EMTG) is used to verify variations in mission solutions based on both available thruster curves and the new curves generated. Furthermore, variations in BOL and EOL curves are also examined. Mission design results shown here validate the use of EMTG and the new performance curves.

  6. IPAC-Inlet Performance Analysis Code

    NASA Technical Reports Server (NTRS)

    Barnhart, Paul J.

    1997-01-01

    A series of analyses have been developed which permit the calculation of the performance of common inlet designs. The methods presented are useful for determining the inlet weight flows, total pressure recovery, and aerodynamic drag coefficients for given inlet geometric designs. Limited geometric input data is required to use this inlet performance prediction methodology. The analyses presented here may also be used to perform inlet preliminary design studies. The calculated inlet performance parameters may be used in subsequent engine cycle analyses or installed engine performance calculations for existing uninstalled engine data.
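One of the quantities the abstract names, inlet total pressure recovery, is often approximated in preliminary design with the MIL-E-5008B style ram-recovery schedule. Whether IPAC uses this exact schedule is not stated in the abstract; the sketch below is a common convention, offered only as an illustration:

```python
def ram_recovery(mach):
    """Inlet total-pressure recovery schedule: 1.0 subsonic, and the
    MIL-E-5008B style 1 - 0.075 * (M - 1)**1.35 for 1 < M < 5."""
    return 1.0 if mach <= 1.0 else 1.0 - 0.075 * (mach - 1.0) ** 1.35

# Recovery falls off with supersonic Mach number
recoveries = {m / 10.0: ram_recovery(m / 10.0) for m in (8, 10, 20, 25)}
```

At Mach 2 this schedule gives a recovery of 0.925; values like these feed directly into the installed-cycle analyses the abstract mentions.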

  7. An Ethnostatistical Analysis of Performance Measurement

    ERIC Educational Resources Information Center

    Winiecki, Donald J.

    2008-01-01

    Within the fields of human performance technology (HPT), human resources management (HRM), and management in general, performance measurement is not only foundational but considered necessary at all phases in the process of HPT. In HPT in particular, there is a substantial advice literature on what measurement is, why it is essential, and (at a…

  8. Spacecraft Multiple Array Communication System Performance Analysis

    NASA Technical Reports Server (NTRS)

    Hwu, Shian U.; Desilva, Kanishka; Sham, Catherine C.

    2010-01-01

    The Communication Systems Simulation Laboratory (CSSL) at the NASA Johnson Space Center is tasked to perform spacecraft and ground network communication system simulations, design validation, and performance verification. The CSSL has developed simulation tools that model spacecraft communication systems and the space and ground environment in which the tools operate. In this paper, a spacecraft communication system with multiple arrays is simulated. A multiple-array combining technique is used to increase the radio frequency coverage and data rate performance. The technique achieves phase coherence among the phased arrays so that the signals combine constructively at the target receiver. There are many technical challenges in spacecraft integration with a high transmit power communication system. The array combining technique can improve the communication system data rate and coverage performance without increasing the system transmit power requirements. Example simulation results indicate significant performance improvement can be achieved with phase coherence implementation.
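The payoff of the coherent-combining idea can be stated in one line: with n equal apertures combined in phase, signal voltages add while independent noise powers add, so SNR improves by a factor of n. A minimal sketch (idealized, ignoring combining losses, and not the CSSL simulation):

```python
import math

def combined_snr_db(snr_single_db, n_arrays):
    """Ideal phase-coherent combining of n equal apertures:
    SNR improves by a factor of n (10*log10(n) in dB)."""
    return snr_single_db + 10.0 * math.log10(n_arrays)

# Four coherently combined arrays: about a 6 dB improvement over one array
gain_db = combined_snr_db(3.0, 4) - 3.0
```

That SNR margin is what translates into the higher data rate or wider coverage at unchanged transmit power noted in the abstract.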

  9. Path analysis of self-efficacy and diving performance revisited.

    PubMed

    Feltz, Deborah L; Chow, Graig M; Hepler, Teri J

    2008-06-01

    The Feltz (1982) path analysis of the relationship between diving efficacy and performance showed that, over trials, past performance was a stronger predictor of performance than self-efficacy. Bandura (1997) criticized the study as statistically "overcontrolling" for past performance by using raw past performance scores along with self-efficacy as predictors of performance. He suggests residualizing past performance by regressing the raw scores on self-efficacy and entering them into the model to remove prior contributions of self-efficacy embedded in past performance scores. To resolve this controversy, we reanalyzed the Feltz data using three statistical models: raw past performance, residual past performance, and a method that residualizes past performance and self-efficacy. Results revealed that self-efficacy was a stronger predictor of performance in both residualized models than in the raw past performance model. Furthermore, the influence of past performance on future performance was weaker when the residualized methods were conducted.
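The residualization step at the heart of the controversy is a simple regression operation; the sketch below uses synthetic data, not the Feltz diving scores:

```python
import numpy as np

def residualize(y, x):
    """Part of y left over after removing its linear dependence on x."""
    design = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    return y - design @ beta

rng = np.random.default_rng(7)
efficacy = rng.normal(size=100)
# Past performance partly driven by self-efficacy, partly independent
past = 0.8 * efficacy + rng.normal(scale=0.5, size=100)
past_resid = residualize(past, efficacy)

# By construction the residualized scores carry no linear trace of efficacy
corr = float(np.corrcoef(past_resid, efficacy)[0, 1])
```

Because the residuals are exactly orthogonal to self-efficacy, entering them as a predictor can no longer absorb variance that self-efficacy shares with past performance, which is why self-efficacy emerges stronger in the residualized models.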

  10. Assessing BMP Performance Using Microtox Toxicity Analysis

    EPA Science Inventory

    Best Management Practices (BMPs) have been shown to be effective in reducing runoff and pollutants from urban areas and thus provide a mechanism to improve downstream water quality. Currently, BMP performance regarding water quality improvement is assessed through measuring each...

  11. Space conditioning performance analysis and simulation study

    NASA Astrophysics Data System (ADS)

    Patani, A.; Bonne, U.

    1981-07-01

    The engine-driven heat pump model was expanded to incorporate an approach for evaluating the influence of cycling; system studies included the sensitivity of performance to electric consumption, compressor speed, mixing, and climate. A modular program for evaluating the steady-state performance of absorption heat pumps was developed. Initial simulations indicated performance trends as a function of outdoor temperature and the refrigerant absorber charge. The combustion heating system model, HFLAME, was used to simulate the benefits of fan/pump overrun and the dependence of the corresponding setpoints on off-period losses and electric costs. Benefits of fuel and fuel/air modulation as compared to cyclic performance were also analyzed. An energy distribution factor was defined to describe the effect of the distribution system on realizing savings from retrofits.

  12. Managing Performance Analysis with Dynamic Statistical Projection Pursuit

    SciTech Connect

    Vetter, J.S.; Reed, D.A.

    2000-05-22

    Computer systems and applications are growing more complex. Consequently, performance analysis has become more difficult due to the complex, transient interrelationships among runtime components. To diagnose these types of performance issues, developers must use detailed instrumentation to capture a large number of performance metrics. Unfortunately, this instrumentation may actually influence the performance analysis, leading the developer to an ambiguous conclusion. In this paper, we introduce a technique for focusing a performance analysis on interesting performance metrics. This technique, called dynamic statistical projection pursuit, identifies interesting performance metrics that the monitoring system should capture across some number of processors. By reducing the number of performance metrics, projection pursuit can limit the impact of instrumentation on the performance of the target system and can reduce the volume of performance data.
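Projection pursuit searches for linear combinations of metrics whose distribution is far from Gaussian, on the premise that such projections are the "interesting" ones. A crude random-search sketch of the idea (the interestingness index, data, and search strategy are simplifications, not the paper's dynamic algorithm):

```python
import numpy as np

def interestingness(z):
    """|excess kurtosis| of a projection: far-from-Gaussian is 'interesting'."""
    z = (z - z.mean()) / z.std()
    return abs((z ** 4).mean() - 3.0)

def pursue(data, n_trials=500, seed=0):
    """Crude random-search projection pursuit over unit directions."""
    rng = np.random.default_rng(seed)
    best_dir, best_score = None, -1.0
    for _ in range(n_trials):
        w = rng.normal(size=data.shape[1])
        w /= np.linalg.norm(w)
        score = interestingness(data @ w)
        if score > best_score:
            best_dir, best_score = w, score
    return best_dir, best_score

rng = np.random.default_rng(3)
# Mock metric matrix: four Gaussian metrics plus one bimodal (interesting) one
n = 400
metrics = rng.normal(size=(n, 5))
metrics[:, 2] = np.where(rng.random(n) < 0.5, -2.0, 2.0) \
    + 0.1 * rng.normal(size=n)
direction, score = pursue(metrics)
```

The recovered direction loads on the bimodal metric, illustrating how the method singles out the few metrics worth instrumenting from the many that behave like noise.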

  13. Rocket-in-a-Duct Performance Analysis

    NASA Technical Reports Server (NTRS)

    Schneider, Steven J.; Reed, Brian D.

    1999-01-01

    An axisymmetric, 110 N class, rocket configured with a free expansion between the rocket nozzle and a surrounding duct was tested in an altitude simulation facility. The propellants were gaseous hydrogen and gaseous oxygen and the hardware consisted of a heat sink type copper rocket firing through copper ducts of various diameters and lengths. A secondary flow of nitrogen was introduced at the blind end of the duct to mix with the primary rocket mass flow in the duct. This flow was in the range of 0 to 10% of the primary mass flow and its effect on nozzle performance was measured. The random measurement errors on thrust and mass flow were within +/-1%. One dimensional equilibrium calculations were used to establish the possible theoretical performance of these rocket-in-a-duct nozzles. Although the scale of these tests was small, they simulated the relevant flow expansion physics at a modest experimental cost. Test results indicated that lower performance was obtained at higher free expansion area ratios and longer ducts, while higher performance was obtained with the addition of secondary flow. There was a discernible peak in specific impulse efficiency at 4% secondary flow. The small scale of these tests resulted in low performance efficiencies, but prior numerical modeling of larger rocket-in-a-duct engines predicted performance that was comparable to that of optimized rocket nozzles. This remains to be proven in large-scale, rocket-in-a-duct tests.
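The specific-impulse efficiency quoted above is just measured Isp over the one-dimensional equilibrium ideal. A minimal sketch of the arithmetic, with an assumed mass flow and an assumed ideal value (neither is from the test report):

```python
G0 = 9.80665  # standard gravity, m/s^2

def specific_impulse(thrust_n, mdot_kg_s):
    """Specific impulse in seconds: Isp = F / (mdot * g0)."""
    return thrust_n / (mdot_kg_s * G0)

def isp_efficiency(measured, ideal):
    """Measured-to-ideal Isp ratio used to judge nozzle performance."""
    return measured / ideal

# 110 N of thrust at an assumed 28 g/s total propellant flow
isp = specific_impulse(110.0, 0.028)
eta = isp_efficiency(isp, 450.0)   # 450 s: an assumed 1-D equilibrium value
```

Since thrust and mass flow each carry about +/-1% random error, the efficiency inferred this way carries roughly their combined uncertainty.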

  14. Network interface unit design options performance analysis

    NASA Technical Reports Server (NTRS)

    Miller, Frank W.

    1991-01-01

    An analysis is presented of three design options for the Space Station Freedom (SSF) onboard Data Management System (DMS) Network Interface Unit (NIU). The NIU provides the interface from the Fiber Distributed Data Interface (FDDI) local area network (LAN) to the DMS processing elements. The FDDI LAN provides the primary means for command and control and low and medium rate telemetry data transfers on board the SSF. The results of this analysis provide the basis for the implementation of the NIU.

  15. A performance analysis system for MEMS using automated imaging methods

    SciTech Connect

    LaVigne, G.F.; Miller, S.L.

    1998-08-01

    The ability to make in-situ performance measurements of MEMS operating at high speeds has been demonstrated using a new image analysis system. Significant improvements in performance and reliability have directly resulted from the use of this system.

  16. An optical probe for micromachine performance analysis

    SciTech Connect

    Dickey, F.M.; Holswade, S.C.; Smith, N.F.; Miller, S.L.

    1997-01-01

    Understanding the mechanisms that impact the performance of Microelectromechanical Systems (MEMS) is essential to the development of optimized designs and fabrication processes, as well as the qualification of devices for commercial applications. Silicon micromachines include engines that consist of orthogonally oriented linear comb drive actuators mechanically connected to a rotating gear. These gears are as small as 50 {mu}m in diameter and can be driven at rotation rates exceeding 300,000 rpm. Optical techniques offer the potential for measuring long term statistical performance data and transient responses needed to optimize designs and manufacturing techniques. We describe the development of Micromachine Optical Probe (MOP) technology for the evaluation of micromachine performance. The MOP approach is based on the detection of optical signals scattered by the gear teeth or other physical structures. We present experimental results obtained with a prototype optical probe and micromachines developed at Sandia National Laboratories.

  17. Forecast analysis of optical waveguide bus performance

    NASA Technical Reports Server (NTRS)

    Ledesma, R.; Rourke, M. D.

    1979-01-01

    Elements to be considered in the design of a data bus include: architecture; data rate; modulation, encoding, detection; power distribution requirements; protocol, word structure; bus reliability, maintainability; interterminal transmission medium; cost; and others specific to the application. Fiber-optic data bus considerations for a 32-port transmissive star architecture are discussed in a tutorial format. General optical-waveguide bus concepts are reviewed. The electrical and optical performance of a 32-port transmissive star bus, and the effects of temperature on the performance of optical-waveguide buses, are examined. A bibliography of pertinent references and the bus receiver test results are included.

  18. Performance analysis of a VSAT network

    NASA Astrophysics Data System (ADS)

    Karam, Fouad G.; Miller, Neville; Karam, Antoine

    With the growing need for efficient satellite networking facilities, the very small aperture terminal (VSAT) technology emerges as the leading edge of satellite communications. Achieving the required performance of a VSAT network is dictated by the multiple access technique utilized. Determining the inbound access method best suited for a particular application involves trade-offs between response time and space segment utilization. In this paper, the slotted Aloha and dedicated stream access techniques are compared. It is shown that network performance is dependent on the traffic offered from remote earth stations as well as the sensitivity of customer's applications to satellite delay.
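The slotted Aloha trade-off mentioned above has a classical closed form: with Poisson offered load G (packets per slot), throughput is S = G·e^(-G), peaking at 1/e when G = 1. A minimal sketch of that curve (a textbook result, not this paper's VSAT traffic model):

```python
import math

def slotted_aloha_throughput(g):
    """Successful packets per slot at offered load G: S = G * exp(-G)."""
    return g * math.exp(-g)

# Throughput peaks at G = 1 with S = 1/e (about 0.368 packets per slot)
peak = max(slotted_aloha_throughput(g / 100.0) for g in range(1, 300))
```

The low ceiling of this curve is exactly why the choice between slotted Aloha and dedicated streams trades response time against space-segment utilization.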

  19. Applying Mechanics to Swimming Performance Analysis.

    ERIC Educational Resources Information Center

    Barthels, Katharine

    1989-01-01

    Swimming teachers and coaches can improve their feedback to swimmers, when correcting or refining swim movements, by applying some basic biomechanical concepts relevant to swimming. This article focuses on the biomechanical considerations used in analyzing swimming performance. Techniques for spotting and correcting problems that impede…

  20. Cognitive Performance: A Model for Analysis

    ERIC Educational Resources Information Center

    Marjoribanks, Kevin

    1975-01-01

    In the present study, cognitive performance was examined by analysing a path model which included family environment variables, social status indicators, and a set of enabling conditions consisting of self-esteem, attitudes toward schoolwork and educational and occupational aspirations. (Editor)

  1. THERMAL PERFORMANCE ANALYSIS FOR WSB DRUM

    SciTech Connect

    Lee, S

    2008-06-26

    The Nuclear Nonproliferation Programs Design Authority is in the design stage of the Waste Solidification Building (WSB) for the treatment and solidification of the radioactive liquid waste streams generated by the Pit Disassembly and Conversion Facility (PDCF) and Mixed Oxide (MOX) Fuel Fabrication Facility (MFFF). The waste streams will be mixed with a cementitious dry mix in a 55-gallon waste container. Savannah River National Laboratory (SRNL) has been performing the testing and evaluations to support technical decisions for the WSB. The Engineering Modeling & Simulation Group was requested to evaluate the thermal performance of the 55-gallon drum containing the hydration heat source associated with the current baseline cement waste form. A transient axisymmetric heat transfer model for the drum partially filled with waste form cement has been developed, and heat transfer calculations were performed for the baseline design configurations. For this case, 65 percent of the drum volume was assumed to be filled with the waste form, which has a transient hydration heat source, as one of the baseline conditions. A series of modeling calculations has been performed using a computational heat transfer approach. The baseline modeling results show that the time to reach the maximum temperature of the 65 percent filled drum is about 32 hours, assuming a 43 C initial cement temperature cooled by natural convection with 27 C external air. In addition, the results computed by the present model were compared with analytical solutions. The modeling results will be benchmarked against the prototypic test results. The verified model will be used for the evaluation of the thermal performance of the WSB drum.
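The transient-conduction-with-source problem described above can be sketched in one dimension with an explicit finite-difference scheme. All numbers below (geometry, diffusivity, source decay, time step) are invented for illustration and are far simpler than SRNL's axisymmetric model:

```python
def step_temperature(t, alpha, dx, dt, q_vol=0.0):
    """One explicit finite-difference step of 1-D transient conduction
    with a uniform volumetric source (expressed as a heating rate, K/s).
    Endpoints are held fixed as isothermal boundaries."""
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "explicit scheme unstable for this step size"
    new = t[:]
    for i in range(1, len(t) - 1):
        new[i] = t[i] + r * (t[i - 1] - 2.0 * t[i] + t[i + 1]) + q_vol * dt
    return new

# Mock cement column: 43 C initial fill, 27 C boundaries, decaying
# hydration source -- all values are assumptions for illustration only
n, dx = 21, 0.025                  # 21 nodes over 0.5 m
temps = [43.0] * n
temps[0] = temps[-1] = 27.0
alpha, dt = 5e-7, 200.0            # diffusivity (m^2/s) and time step (s)
for k in range(500):
    q = 1e-4 * 0.99 ** k           # hypothetical hydration heating rate, K/s
    temps = step_temperature(temps, alpha, dx, dt, q)
peak = max(temps)
```

The competition between the decaying source and boundary cooling is what produces a peak temperature at a finite time, analogous to the roughly 32-hour peak reported for the drum.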

  2. Performance analysis of panoramic infrared systems

    NASA Astrophysics Data System (ADS)

    Furxhi, Orges; Driggers, Ronald G.; Holst, Gerald; Krapels, Keith

    2014-05-01

    Panoramic imagers are becoming more commonplace in the visible part of the spectrum. These imagers are often used in the real estate market, extreme sports, teleconferencing, and security applications. Infrared panoramic imagers, on the other hand, are not as common and only a few have been demonstrated. A panoramic image can be formed in several ways, using pan and stitch, distributed aperture, or omnidirectional optics. When omnidirectional optics are used, the detected image is a warped view of the world that is mapped on the focal plane array in a donut shape. The final image on the display is the mapping of the omnidirectional donut shape image back to the panoramic world view. In this paper we analyze the performance of uncooled thermal panoramic imagers that use omnidirectional optics, focusing on range performance.
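The donut-to-panorama mapping described above is a polar coordinate transform. A minimal sketch of the inverse map (panorama pixel back to donut pixel), with invented image dimensions and radii:

```python
import math

def panorama_to_donut(col, row, width, height, r_inner, r_outer, cx, cy):
    """Map a pixel of the unwrapped panoramic view back to the donut-
    shaped image that omnidirectional optics form on the focal plane."""
    theta = 2.0 * math.pi * col / width               # azimuth around the donut
    r = r_inner + (r_outer - r_inner) * row / height  # elevation -> radius
    return cx + r * math.cos(theta), cy + r * math.sin(theta)

# A 1440 x 200 panorama strip from a donut centred at (320, 240)
x, y = panorama_to_donut(0, 0, 1440, 200, 60.0, 220.0, 320.0, 240.0)
```

Because azimuthal sampling density varies with radius on the donut, resolution (and hence range performance) is not uniform across the unwrapped view, which is one of the effects such an analysis must account for.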

  3. Experimental system and component performance analysis

    SciTech Connect

    Peterman, K.

    1984-10-01

    A prototype dye laser flow loop was constructed to flow test large power amplifiers in Building 169. The flow loop is designed to operate at supply pressures up to 900 psig and flow rates up to 250 GPM. During the initial startup of the flow loop experimental measurements were made to evaluate component and system performance. Three candidate dye flow loop pumps and three different pulsation dampeners were tested.

  4. Performance Analysis of the Unitree Central File

    NASA Technical Reports Server (NTRS)

    Pentakalos, Odysseas I.; Flater, David

    1994-01-01

This report consists of two parts. The first part briefly comments on the documentation status of two major systems at NASA's Center for Computational Sciences, specifically the Cray C98 and the Convex C3830. The second part describes the work done on improving the performance of file transfers between the Unitree Mass Storage System running on the Convex file server and the users' workstations distributed over a large geographic area.

  5. Cost and Training Effectiveness Analysis Performance Guide

    DTIC Science & Technology

    1980-07-23

perform cost and training effectiveness analyses (CTEA) during Weapon System Acquisition required by the Life Cycle System Management Model (LCSMM) and...light of training risk. This would be the case if a training program were estimated to be more effective in training certain high-risk tasks but were...also estimated to be somewhat more costly than the next best program. Possible impacts of training risk may indicate that the more effective

  6. Moisture performance analysis of EPS frost insulation

    SciTech Connect

    Ojanen, T.; Kokko, E.

    1997-11-01

A horizontal layer of expanded polystyrene foam (EPS) is widely used as frost insulation for building foundations in the Nordic countries. The performance properties of the insulation depend strongly on the moisture level of the material. Experimental methods are needed to produce samples for testing the material properties in realistic moisture conditions. The objective was to analyze the moisture loads and the wetting mechanisms of horizontal EPS frost insulation. Typical wetting tests, water immersion and diffusive water vapor absorption tests, were studied, and the results were compared with data from site investigations. Usually these tests give higher moisture contents of EPS than those detected in drained frost insulation applications. The effects of different parameters, such as the immersion depth and the temperature gradient, were also studied. Special attention was paid to the effect of diffusion on the wetting process. Numerical simulation showed that under real working conditions the long-period diffusive moisture absorption in EPS frost insulation remained lower than 1% Vol. Moisture performance was determined experimentally as a function of the distance between the insulation and the free water level in the ground. The main moisture loads and the principles for good moisture performance of frost insulation are presented.

  7. Database for LDV Signal Processor Performance Analysis

    NASA Technical Reports Server (NTRS)

    Baker, Glenn D.; Murphy, R. Jay; Meyers, James F.

    1989-01-01

    A comparative and quantitative analysis of various laser velocimeter signal processors is difficult because standards for characterizing signal bursts have not been established. This leaves the researcher to select a signal processor based only on manufacturers' claims without the benefit of direct comparison. The present paper proposes the use of a database of digitized signal bursts obtained from a laser velocimeter under various configurations as a method for directly comparing signal processors.

  8. Shuttle TPS thermal performance and analysis methodology

    NASA Technical Reports Server (NTRS)

    Neuenschwander, W. E.; Mcbride, D. U.; Armour, G. A.

    1983-01-01

    Thermal performance of the thermal protection system was approximately as predicted. The only extensive anomalies were filler bar scorching and over-predictions in the high Delta p gap heating regions of the orbiter. A technique to predict filler bar scorching has been developed that can aid in defining a solution. Improvement in high Delta p gap heating methodology is still under study. Minor anomalies were also examined for improvements in modeling techniques and prediction capabilities. These include improved definition of low Delta p gap heating, an analytical model for inner mode line convection heat transfer, better modeling of structure, and inclusion of sneak heating. The limited number of problems related to penetration items that presented themselves during orbital flight tests were resolved expeditiously, and designs were changed and proved successful within the time frame of that program.

  9. Performance analysis of advanced spacecraft TPS

    NASA Technical Reports Server (NTRS)

    Pitts, William C.

    1991-01-01

    Spacecraft entering a planetary atmosphere require a very sophisticated thermal protection system. The materials used must be tailored to each specific vehicle based on its planned mission profiles. Starting with the Space Shuttle, many types of ceramic insulation with various combinations of thermal properties have been developed by others. The development of two new materials is described: A Composite Flexible Blanket Insulation which has a significantly lower effective thermal conductivity than other ceramic blankets; and a Silicon Matrix Composite which has applications at high temperature locations such as wing leading edges. Also, a systematic study is described that considers the application of these materials for a proposed Personnel Launch System. The study shows how most of these available ceramic materials would perform during atmospheric entry of this vehicle. Other specific applications of these thermal protection materials are discussed.

  10. Beyond "Yes or No": The Vulpe' Performance Analysis System. Revised.

    ERIC Educational Resources Information Center

    Hampton Univ., VA.

    The booklet describes the Vulpe' Performance Analysis System (VPAS), a measure of a child's progress in developmental activities which provides a link to instructional programming. In the assessment stage the child's performance is scored according to how much and what type of assistance is required to perform the task. The scale ranges from no…

  11. What Do HPT Consultants Do for Performance Analysis?

    ERIC Educational Resources Information Center

    Kang, Sung

    2017-01-01

    This study was conducted to contribute to the field of Human Performance Technology (HPT) through the validation of the performance analysis process of the International Society for Performance Improvement (ISPI) HPT model, the most representative and frequently utilized process model in the HPT field. The study was conducted using content…

  12. Performance analysis of advanced spacecraft TPS

    NASA Technical Reports Server (NTRS)

    Pitts, William C.

    1987-01-01

The analysis of the feasibility of using metal hydrides in the thermal protection system of cryogenic tanks in space was based on the heat capacity of ice as the phase change material (PCM). It was found that with ice the thermal protection system weight could be reduced by, at most, about 20 percent over an all-LI-900 insulation. For this concept to be viable, a metal hydride with considerably more capacity than water would be required; none were found. Special metal hydrides have been developed for hydrogen fuel storage applications, and it may be possible to develop one for the current application. Until this appears promising, further effort on this feasibility study does not seem warranted.

  13. Covariance of lucky images: performance analysis

    NASA Astrophysics Data System (ADS)

    Cagigal, Manuel P.; Valle, Pedro J.; Cagigas, Miguel A.; Villó-Pérez, Isidro; Colodro-Conde, Carlos; Ginski, C.; Mugrauer, M.; Seeliger, M.

    2017-01-01

The covariance of ground-based lucky images is a robust and easy-to-use algorithm that allows us to detect faint companions surrounding a host star. In this paper, we analyse the influence of the number of processed frames, the frames' quality, the atmospheric conditions and the detection noise on the companion detectability. This analysis has been carried out using both experimental and computer-simulated imaging data. Although the technique allows the detection of faint companions, the camera detection noise and the use of a limited number of frames limit the minimum detectable companion intensity to around 1000 times fainter than that of the host star when placed at an angular distance corresponding to the first few Airy rings. The reachable contrast could be even larger when detecting companions with the assistance of an adaptive optics system.
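The core idea, correlating each pixel's intensity time series with that of the host star across a stack of short-exposure frames, can be sketched on synthetic data. The positions, intensities, 200:1 contrast, shared "seeing" factor, and noise level below are invented for illustration and are not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stack of short-exposure frames: a host star plus a faint
# companion whose fluxes fluctuate together (shared frame quality), plus
# detection noise. All parameters are illustrative.
n_frames, size = 500, 32
y0, x0 = 16, 16                     # host star pixel
yc, xc = 16, 22                     # companion pixel
frames = rng.normal(0.0, 0.5, (n_frames, size, size))   # detection noise
flux = rng.uniform(0.5, 1.5, n_frames)                  # per-frame quality
frames[:, y0, x0] += 1000.0 * flux
frames[:, yc, xc] += 5.0 * flux                         # 200x fainter

# Covariance of every pixel's time series with the host-star pixel.
ref = frames[:, y0, x0]
cov_map = np.tensordot(ref - ref.mean(),
                       frames - frames.mean(axis=0),
                       axes=(0, 0)) / (n_frames - 1)

# The companion pixel stands out against the covariance noise floor.
bg = np.delete(cov_map.ravel(), [y0 * size + x0, yc * size + xc])
print(cov_map[yc, xc] > bg.max())
```

The companion's covariance scales with the product of the two fluxes and the variance of the shared fluctuation, while uncorrelated noise pixels average toward zero, which is why the method suppresses detection noise as more frames are processed.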

  14. 76 FR 60939 - Metal Fatigue Analysis Performed by Computer Software

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-30

    ... COMMISSION Metal Fatigue Analysis Performed by Computer Software AGENCY: Nuclear Regulatory Commission... applicants' analyses and methodologies using the computer software package, WESTEMS TM , to demonstrate... by Computer Software Addressees All holders of, and applicants for, a power reactor operating...

  15. Instrument performs nondestructive chemical analysis, data can be telemetered

    NASA Technical Reports Server (NTRS)

    Turkevich, A.

    1965-01-01

    Instrument automatically performs a nondestructive chemical analysis of surfaces and transmits the data in the form of electronic signals. It employs solid-state nuclear particle detectors with a charged nuclear particle source and an electronic pulse-height analyzer.

  16. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect

    Wells, C.

    1992-10-01

The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
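The propagation step at the heart of such an analysis can be sketched for a PV power measurement P = V·I; the voltage and current readings and their standard uncertainties below are invented for illustration.

```python
import math

# Sketch of a simple uncertainty budget for a PV power measurement
# P = V * I. All values and standard uncertainties are illustrative.
V, u_V = 35.0, 0.05      # volts, standard uncertainty (assumed)
I, u_I = 8.0, 0.02       # amps, standard uncertainty (assumed)

P = V * I
# First-order propagation: u_P^2 = (dP/dV * u_V)^2 + (dP/dI * u_I)^2
u_P = math.hypot(I * u_V, V * u_I)
k = 2                    # coverage factor for an approximately 95% interval
print(f"P = {P:.1f} W +/- {k * u_P:.2f} W (k={k})")
```

The stated interval P ± k·u_P is exactly the "estimate of the interval about a measured value" the abstract refers to.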

  17. Performance analysis of static locking in replicated distributed database systems

    NASA Technical Reports Server (NTRS)

    Kuang, Yinghong; Mukkamala, Ravi

    1991-01-01

Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects because they are difficult to evaluate through analysis and time-consuming to evaluate through simulation. A technique is used that combines simulation and analysis to closely illustrate the impact of deadlock and to evaluate the performance of replicated distributed databases with both shared and exclusive locks.

  18. Performance analysis of static locking in replicated distributed database systems

    NASA Technical Reports Server (NTRS)

    Kuang, Yinghong; Mukkamala, Ravi

    1991-01-01

Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects because they are difficult to evaluate through analysis and time-consuming to evaluate through simulation. Here, a technique is discussed that combines simulation and analysis to closely illustrate the impact of deadlock and to evaluate the performance of replicated distributed databases with both shared and exclusive locks.

  19. Performance analysis of parallel supernodal sparse LU factorization

    SciTech Connect

    Grigori, Laura; Li, Xiaoye S.

    2004-02-05

We investigate performance characteristics of the LU factorization of large matrices with various sparsity patterns. We consider supernodal right-looking parallel factorization on a two-dimensional grid of processors, making use of static pivoting. We develop a performance model and validate it using the implementation in SuperLU-DIST, real matrices, and the IBM Power3 machine at NERSC. We use this model to obtain performance bounds on parallel computers, to perform scalability analysis, and to identify performance bottlenecks. We also discuss the role of load balance and data distribution in this approach.
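A performance model of the general kind described, compute time plus latency and bandwidth communication terms on a Pr × Pc process grid, can be sketched as follows. The cost terms and machine constants are assumed round numbers for a dense factorization, not the paper's calibrated sparse/Power3 model.

```python
# Back-of-the-envelope model for LU factorization on a pr x pc process
# grid: balanced compute time plus latency (alpha) and bandwidth (beta)
# communication terms. All constants are assumed, illustrative values.
def lu_time(n, pr, pc, flop_rate=1e9, alpha=10e-6, beta=1e-8):
    p = pr * pc
    t_comp = (2.0 / 3.0) * n**3 / (flop_rate * p)   # flops, perfectly balanced
    t_msg = alpha * n * (pr + pc)                   # O(n) panel messages
    t_vol = beta * n**2 * (1.0 / pr + 1.0 / pc)     # panel broadcast volume
    return t_comp + t_msg + t_vol

n = 20000
for grid in [(1, 1), (2, 2), (4, 4), (8, 8)]:
    print(grid, round(lu_time(n, *grid), 2))
```

Comparing the three terms as the grid grows is the basic move behind the scalability analysis and bottleneck identification the abstract mentions: once t_comp shrinks below the communication terms, adding processors stops paying off.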

  20. An Exploratory Analysis of Performance on the SAT.

    ERIC Educational Resources Information Center

    Wainer, Howard

    1984-01-01

    Techniques of exploratory data analysis (EDA) were used to decompose data tables portraying performance of ethnic groups on the Scholastic Aptitude Test. These analyses indicate the size and structure of differences in performance among groups studied, nature of changes across time, and interactions between group membership and time. (Author/DWH)

  1. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    SciTech Connect

Bremer, Peer-Timo; Mohr, Bernd; Schulz, Martin; Pascucci, Valerio; Gamblin, Todd; Brunst, Holger

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  2. School Performance Feedback Systems: Conceptualization, Analysis, and Reflection.

    ERIC Educational Resources Information Center

    Visscher, Adrie J.; Coe, Robert

    2003-01-01

    Presents a conceptualization and analysis of school performance feedback systems (SPFS), followed by framework that includes factors crucial for their use and effects. Provides two examples of use of SPFS. Summarizes evidence on the process, problems, and impact of SPFS; suggests strategies for using performance feedback to improve schools.…

  3. A Systemic Cause Analysis Model for Human Performance Technicians

    ERIC Educational Resources Information Center

    Sostrin, Jesse

    2011-01-01

    This article presents a systemic, research-based cause analysis model for use in the field of human performance technology (HPT). The model organizes the most prominent barriers to workplace learning and performance into a conceptual framework that explains and illuminates the architecture of these barriers that exist within the fabric of everyday…

  4. Using Importance-Performance Analysis to Evaluate Training

    ERIC Educational Resources Information Center

    Siniscalchi, Jason M.; Beale, Edward K.; Fortuna, Ashley

    2008-01-01

    The importance-performance analysis (IPA) is a tool that can provide timely and usable feedback to improve training. IPA measures the gaps between the importance and how good (performance) a class is perceived by a student and is presented on a 2x2 matrix. The quadrant in which data land in this matrix aids in determining potential future action.…
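The 2x2 quadrant classification behind IPA can be sketched directly: each item is placed by comparing its mean importance and performance ratings against the grand means. The item names, ratings, and quadrant labels below are invented for illustration.

```python
# Minimal sketch of an importance-performance analysis (IPA) matrix.
# Each item is assigned a quadrant by comparing its importance and
# performance ratings to the grand means. Ratings are invented.
items = {
    "clear objectives":  (4.6, 4.4),
    "hands-on practice": (4.8, 3.1),
    "course handouts":   (3.0, 4.2),
    "venue comfort":     (2.8, 2.5),
}   # item: (importance, performance)

imp_mean = sum(i for i, p in items.values()) / len(items)
perf_mean = sum(p for i, p in items.values()) / len(items)

QUADRANTS = {
    (True, False):  "Concentrate here",      # important, underperforming
    (True, True):   "Keep up the good work",
    (False, True):  "Possible overkill",
    (False, False): "Low priority",
}
for name, (imp, perf) in items.items():
    print(f"{name}: {QUADRANTS[(imp >= imp_mean, perf >= perf_mean)]}")
```

The quadrant in which an item lands is what "aids in determining potential future action": items rated important but performing poorly are the ones flagged for improvement.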

  5. Using Importance-Performance Analysis To Evaluate Teaching Effectiveness.

    ERIC Educational Resources Information Center

    Attarian, Aram

    This paper introduces Importance-Performance (IP) analysis as a method to evaluate teaching effectiveness in a university outdoor program. Originally developed for use in the field of marketing, IP analysis is simple and easy to administer, and provides the instructor with a visual representation of what teaching attributes are important, how…

  6. Using Latent Class Analysis To Set Academic Performance Standards.

    ERIC Educational Resources Information Center

    Brown, Richard S.

    The use of latent class analysis for establishing student performance standards was studied. Latent class analysis (LCA) is an established procedure for investigating the latent structure of a set of data. LCA presumes that groups, classes, or respondents differ qualitatively from one another, and that these differences account for all of the…

  7. Preliminary Analysis of Remote Monitoring & Robotic Concepts for Performance Confirmation

    SciTech Connect

    D.A. McAffee

    1997-02-18

    As defined in 10 CFR Part 60.2, Performance Confirmation is the ''program of tests, experiments and analyses which is conducted to evaluate the accuracy and adequacy of the information used to determine with reasonable assurance that the performance objectives for the period after permanent closure will be met''. The overall Performance Confirmation program begins during site characterization and continues up to repository closure. The main purpose of this document is to develop, explore and analyze initial concepts for using remotely operated and robotic systems in gathering repository performance information during Performance Confirmation. This analysis focuses primarily on possible Performance Confirmation related applications within the emplacement drifts after waste packages have been emplaced (post-emplacement) and before permanent closure of the repository (preclosure). This will be a period of time lasting approximately 100 years and basically coincides with the Caretaker phase of the project. This analysis also examines, to a lesser extent, some applications related to Caretaker operations. A previous report examined remote handling and robotic technologies that could be employed during the waste package emplacement phase of the project (Reference 5.1). This analysis is being prepared to provide an early investigation of possible design concepts and technical challenges associated with developing remote systems for monitoring and inspecting activities during Performance Confirmation. The writing of this analysis preceded formal development of Performance Confirmation functional requirements and program plans and therefore examines, in part, the fundamental Performance Confirmation monitoring needs and operating conditions. The scope and primary objectives of this analysis are to: (1) Describe the operating environment and conditions expected in the emplacement drifts during the preclosure period. (Presented in Section 7.2). (2) Identify and discuss the

  8. Advanced Risk Analysis for High-Performing Organizations

    DTIC Science & Technology

    2006-01-01

using traditional risk analysis techniques. Mission Assurance Analysis Protocol (MAAP) is one technique that high performers can use to identify and mitigate the risks arising from operational complexity. ...The operational environment for many types of organizations is changing. Changes in operational environments are driving the need for advanced risk analysis techniques. Many types of risk prevalent in today's operational environments (e.g., event risks, inherited risk) are not readily identified

  9. Long-tailed duck (Clangula hyemalis) microsatellite DNA data; Alaska, Canada, Russia, 1994-2002

    USGS Publications Warehouse

    Wilson, Robert E.; Talbot, Sandra L.

    2016-01-01

    This data set describes nuclear microsatellite genotypes derived from twelve autosomal loci (6AB, Aph02, Aph08, Aph19, Aph23, Bca10, Bca11, Hhi5, Sfi11, Smo07, Smo09, and CRG), and two Z-linked microsatellite loci (Bca4 and Smo1). A total of 111 Long-tailed Ducks were examined for this genotyping with samples coming from the two primary breeding locales within Alaska (Arctic Coastal Plain of Alaska and the Yukon Delta, Western Alaska) and a representative locale in the central Canadian Arctic (Queen Maud Gulf Bird Sanctuary, Nunavut, Canada). The sex of most samples was determined in the field by plumage and later confirmed by using the CHD molecular sexing protocol (Griffiths et al., 1998).

  10. Phosphorus and suspended sediment load estimates for the Lower Boise River, Idaho, 1994-2002

    USGS Publications Warehouse

    Donato, Mary M.; MacCoy, Dorene E.

    2004-01-01

    The U.S. Geological Survey used LOADEST, newly developed load estimation software, to develop regression equations and estimate loads of total phosphorus (TP), dissolved orthophosphorus (OP), and suspended sediment (SS) from January 1994 through September 2002 at four sites on the lower Boise River: Boise River below Diversion Dam near Boise, Boise River at Glenwood Bridge at Boise, Boise River near Middleton, and Boise River near Parma. The objective was to help the Idaho Department of Environmental Quality develop and implement total maximum daily loads (TMDLs) by providing spatial and temporal resolution for phosphorus and sediment loads and enabling load estimates made by mass balance calculations to be refined and validated. Regression models for TP and OP generally were well fit on the basis of regression coefficients of determination (R2), but results varied in quality from site to site. The TP and OP results for Glenwood probably were affected by the upstream wastewater-treatment plant outlet, which provides a variable phosphorus input that is unrelated to river discharge. Regression models for SS generally were statistically well fit. Regression models for Middleton for all constituents, although statistically acceptable, were of limited usefulness because sparse and intermittent discharge data at that site caused many gaps in the resulting estimates. Although the models successfully simulated measured loads under predominant flow conditions, errors in TP and SS estimates at Middleton and in TP estimates at Parma were larger during high- and low-flow conditions. This shortcoming might be improved if additional concentration data for a wider range of flow conditions were available for calibrating the model. The average estimated daily TP load ranged from less than 250 pounds per day (lb/d) at Diversion to nearly 2,200 lb/d at Parma. Estimated TP loads at all four sites displayed cyclical variations coinciding with seasonal fluctuations in discharge. 
Estimated annual loads of TP ranged from less than 8 tons at Diversion to 570 tons at Parma. Annual loads of dissolved OP peaked in 1997 at all sites and were consistently higher at Parma than at the other sites. The ratio of OP to TP varied considerably throughout the year at all sites. Peaks in the OP:TP ratio occurred primarily when flows were at their lowest annual stages; estimated seasonal OP:TP ratios were highest in autumn at all sites. Conversely, when flows were high, the ratio was low, reflecting increased TP associated with particulate matter during high flows. Parma exhibited the highest OP:TP ratio during all seasons, at least 0.60 in spring and nearly 0.90 in autumn. Similar OP:TP ratios were estimated at Glenwood. Whereas the OP:TP ratio for Parma and Glenwood peaked in November or December, decreased from January through May, and increased again after June, estimates for Diversion showed nearly the opposite pattern: ratios were highest in July and lowest in January and February. This difference might reflect complex biological and geochemical processes involving nutrient cycling in Lucky Peak Lake, but further data are needed to substantiate this hypothesis. Estimated monthly average SS loads were highest at Diversion, about 400 tons per day (ton/d). Average annual loads from 1994 through 2002 were 144,000 tons at Diversion, 33,000 tons at Glenwood, and 88,000 tons at Parma. Estimated SS loads peaked in the spring at all sites, coinciding with high flows. Increases in TP in the reach from Diversion to Glenwood ranged from 200 to 350 lb/d. Decreases in TP were small in this reach only during high flows in January and February 1997. Decreases in SS were large during high-flow conditions, indicating sediment deposition in the reach. Intermittent data at Middleton indicated that increases and decreases in TP in the reach from Glenwood to Middleton occurred during low- and high-flow conditions, respectively. All constituents increased in the r
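The rating-curve idea behind LOADEST-style load estimation, regressing log load on log discharge from sparse samples and then predicting daily loads from the continuous discharge record, can be sketched on synthetic data. This is a simplified stand-in with invented coefficients, not the LOADEST model, which also carries seasonal and time terms.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic calibration samples: discharge (cfs) and load (lb/d) related
# by an assumed log-linear rating curve plus noise.
q_samples = rng.uniform(200.0, 5000.0, 40)
true_b0, true_b1 = -2.0, 1.3
load_samples = np.exp(true_b0 + true_b1 * np.log(q_samples)
                      + rng.normal(0.0, 0.1, 40))

# Fit ln(L) = b0 + b1 * ln(Q) by least squares.
X = np.column_stack([np.ones_like(q_samples), np.log(q_samples)])
b, *_ = np.linalg.lstsq(X, np.log(load_samples), rcond=None)

# Predict a daily load, with a simple smearing-style retransformation
# bias correction on the residuals.
resid = np.log(load_samples) - X @ b
smear = float(np.mean(np.exp(resid)))
def daily_load(q):
    return np.exp(b[0] + b[1] * np.log(q)) * smear

print(round(float(daily_load(1000.0)), 1))
```

The abstract's caveat about high- and low-flow errors follows directly from this setup: predictions extrapolate the fitted curve, so flows outside the range of the calibration samples are the least constrained.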

  11. Merging Right: Questions of Access and Merit in South African Higher Education Reform, 1994-2002

    ERIC Educational Resources Information Center

    Elliott, John

    2005-01-01

    The dismantling of South Africa's apartheid-controlled education system after 1994 brought with it unprecedented policy complications, among them the question of how best to integrate the desiderata of access and merit in school education and tertiary sectors. For the higher education sector, institutional mergers became an increasingly visible…

  12. [Phenotypic diagnosis of primary immunodeficiencies in Antioquia, Colombia, 1994-2002].

    PubMed

    Montoya, Carlos Julio; Henao, Julieta; Salgado, Helí; Olivares, María M; López, Juan A; Rugeles, Claudia; Franco, José Luis; Orrego, Julio; García, Diana M; Patiño, Pablo J

    2002-12-01

Recurrent infections are a frequent cause of medical visits. They can be due to a heterogeneous group of dysfunctions that increase susceptibility to pathogenic and opportunistic microorganisms, such as immunological deficiencies. To define opportune, rational treatment and to guide the molecular diagnosis of primary immunodeficiency diseases, we established a program for the phenotypic diagnosis of these illnesses in Antioquia, Colombia, including clinical and laboratory evaluations of patients who present recurrent infections with abnormal evolution. Between August 1, 1994 and July 31, 2002, a phenotypic diagnosis of primary immunodeficiency was made in 98 patients. Similar to data reported in the literature, antibody deficiencies were the most frequent (40.8%), followed by combined deficiencies (21.4%). This phenotypic characterization has allowed appropriate treatment for each patient and, in some cases, functional and molecular studies that can lead to a definitive molecular diagnosis.

  13. Analysis of portable impactor performance for enumeration of viable bioaerosols.

    PubMed

    Yao, Maosheng; Mainelis, Gediminas

    2007-07-01

    Portable impactors are increasingly being used to estimate concentration of bioaerosols in residential and occupational environments; however, little data are available about their performance. This study investigated the overall performances of the SMA MicroPortable, BioCulture, Microflow, Microbiological Air Sampler (MAS-100), Millipore Air Tester, SAS Super 180, and RCS High Flow portable microbial samplers when collecting bacteria and fungi both indoors and outdoors. The performance of these samplers was compared with that of the BioStage impactor. The Button Aerosol Sampler equipped with gelatin filter was also included in the study. Results showed that the sampling environment can have a statistically significant effect on sampler performance, most likely due to the differences in airborne microorganism composition and/or their size distribution. Data analysis using analysis of variance showed that the relative performance of all samplers (except the RCS High Flow and MAS-100) was statistically different (lower) compared with the BioStage. The MAS-100 also had statistically higher performance compared with other portable samplers except the RCS High Flow. The Millipore Air Tester and the SMA had the lowest performances. The relative performance of the impactors was described using a multiple linear regression model (R(2) = 0.83); the effects of the samplers' cutoff sizes and jet-to-plate distances as predictor variables were statistically significant. The data presented in this study will help field professionals in selecting bioaerosol samplers. The developed empirical formula describing the overall performance of bioaerosol impactors can assist in sampler design.

  14. The development of a reliable amateur boxing performance analysis template.

    PubMed

    Thomson, Edward; Lamb, Kevin; Nicholas, Ceri

    2013-01-01

    The aim of this study was to devise a valid performance analysis system for the assessment of the movement characteristics associated with competitive amateur boxing and assess its reliability using analysts of varying experience of the sport and performance analysis. Key performance indicators to characterise the demands of an amateur contest (offensive, defensive and feinting) were developed and notated using a computerised notational analysis system. Data were subjected to intra- and inter-observer reliability assessment using median sign tests and calculating the proportion of agreement within predetermined limits of error. For all performance indicators, intra-observer reliability revealed non-significant differences between observations (P > 0.05) and high agreement was established (80-100%) regardless of whether exact or the reference value of ±1 was applied. Inter-observer reliability was less impressive for both analysts (amateur boxer and experienced analyst), with the proportion of agreement ranging from 33-100%. Nonetheless, there was no systematic bias between observations for any indicator (P > 0.05), and the proportion of agreement within the reference range (±1) was 100%. A reliable performance analysis template has been developed for the assessment of amateur boxing performance and is available for use by researchers, coaches and athletes to classify and quantify the movement characteristics of amateur boxing.
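The proportion-of-agreement measure used in this reliability assessment can be sketched in a few lines: count the paired observations that match within a reference limit. The paired frequency counts below are invented for illustration.

```python
# Sketch of the inter-observer agreement measure described: the proportion
# of paired observations agreeing within a reference limit. Counts invented.
obs_a = [12, 8, 15, 20, 7, 9, 14, 11]   # analyst 1 frequency counts
obs_b = [11, 8, 16, 22, 7, 10, 14, 12]  # analyst 2 frequency counts

def agreement(a, b, tol=0):
    """Proportion of pairs whose absolute difference is within tol."""
    return sum(abs(x - y) <= tol for x, y in zip(a, b)) / len(a)

print(f"exact: {agreement(obs_a, obs_b):.0%}, "
      f"within +/-1: {agreement(obs_a, obs_b, tol=1):.0%}")
```

As in the study, agreement under the exact criterion can look modest while agreement within the ±1 reference limit approaches 100%.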

  15. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect

    Wells, C

    1992-10-01

The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.

  16. Performing modal analysis for multi-metric measurements: a discussion

    NASA Astrophysics Data System (ADS)

    Soman, R.; Majewska, K.; Radzienski, M.; Ostachowicz, W.

    2016-04-01

    This work addresses the severe lack of literature in the area of modal analysis for multi-metric sensing. The paper aims to provide a step-by-step tutorial for performing modal analysis using Fiber Bragg Grating (FBG) strain sensors and a Laser Doppler Vibrometer (LDV) for displacement measurements. The paper discusses in detail the different parameters which affect the accuracy of the experimental results. It highlights the often implied, unmentioned problems that researchers face while performing experiments. The paper tries to bridge the gap between the theoretical idea of the experiment and its actual execution by discussing each aspect, including the choice of specimen, boundary conditions, sensors, sensor position, excitation mechanism and its location, as well as the post-processing of the data. The paper may be viewed as a checklist for performing modal analysis in order to ensure high-quality measurements by preventing systematic errors from creeping in.

  17. Advanced Video Analysis Needs for Human Performance Evaluation

    NASA Technical Reports Server (NTRS)

    Campbell, Paul D.

    1994-01-01

    Evaluators of human task performance in space missions make use of video as a primary source of data. Extraction of relevant human performance information from video is often a labor-intensive process requiring a large amount of time on the part of the evaluator. Based on the experiences of several human performance evaluators, needs were defined for advanced tools which could aid in the analysis of video data from space missions. Such tools should increase the efficiency with which useful information is retrieved from large quantities of raw video. They should also provide the evaluator with new analytical functions which are not present in currently used methods. Video analysis tools based on the needs defined by this study would also have uses in U.S. industry and education. Evaluation of human performance from video data can be a valuable technique in many industrial and institutional settings where humans are involved in operational systems and processes.

  18. Temporal geospatial analysis of secondary school students’ examination performance

    NASA Astrophysics Data System (ADS)

    Nik Abd Kadir, ND; Adnan, NA

    2016-06-01

    Malaysia's Ministry of Education has improved the organization of its data into a geographical information system (GIS) school database; however, no further analysis has been done using geospatial analysis tools. Mapping has emerged as a communication tool and an effective way to publish digital and statistical data such as school performance results. The objective of this study is to analyse secondary school student performance on the science and mathematics scores of the Sijil Pelajaran Malaysia Examination from 2010 to 2014 for Kelantan's state schools with the aid of GIS software and geospatial analysis. School performance according to grade point average (GPA), from Grade A to Grade G, was interpolated and mapped, and query analysis was carried out using geospatial tools. This study will be beneficial to the education sector in analysing student performance not only in Kelantan but across the whole of Malaysia, and publishing the results in map form will support better planning and decision making to prepare young Malaysians for the challenges of the education system.
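One simple way to interpolate point scores such as school GPAs onto a map surface is inverse-distance weighting. The sketch below is a generic illustration with made-up coordinates and GPA values, not the specific interpolation method used in the study:

```python
import math

def idw(points, query, power=2):
    """Inverse-distance-weighted estimate at `query` from (x, y, value)
    samples: nearer samples dominate, with influence falling off as
    1/distance**power."""
    num = den = 0.0
    for x, y, v in points:
        d = math.hypot(x - query[0], y - query[1])
        if d == 0:
            return v  # query coincides with a sample point
        w = 1.0 / d**power
        num += w * v
        den += w
    return num / den

# Illustrative school locations (x, y) with GPA values, and one query
schools = [(0, 0, 2.0), (4, 0, 4.0)]
print(idw(schools, (2, 0)))  # midway between equal weights -> 3.0
```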

  19. Performance Analysis of Multilevel Parallel Applications on Shared Memory Architectures

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Caubet, Jordi; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    In this paper we describe how to apply powerful performance analysis techniques to understand the behavior of multilevel parallel applications. We use the Paraver/OMPItrace performance analysis system for our study. This system consists of two major components: The OMPItrace dynamic instrumentation mechanism, which allows the tracing of processes and threads and the Paraver graphical user interface for inspection and analyses of the generated traces. We describe how to use the system to conduct a detailed comparative study of a benchmark code implemented in five different programming paradigms applicable for shared memory

  20. Visualization and Data Analysis for High-Performance Computing

    SciTech Connect

    Sewell, Christopher Meyer

    2016-09-27

    This is a set of slides from a guest lecture for a class at the University of Texas, El Paso on visualization and data analysis for high-performance computing. The topics covered are the following: trends in high-performance computing; scientific visualization, such as OpenGL, ray tracing and volume rendering, VTK, and ParaView; data science at scale, such as in-situ visualization, image databases, distributed memory parallelism, shared memory parallelism, VTK-m, "big data", and then an analysis example.

  1. Quantitative analysis of regional myocardial performance in coronary artery disease

    NASA Technical Reports Server (NTRS)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings are presented for a group of subjects with significant coronary artery stenosis and a group of controls, determined by use of a quantitative method for the study of regional myocardial performance based on the frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions and the timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  2. Integrated design environment for human performance and human reliability analysis

    SciTech Connect

    Nelson, W.R.

    1997-05-01

    Work over the last few years at the Idaho National Engineering and Environmental Laboratory (INEEL) has included a major focus on applying human performance and human reliability knowledge and methods as an integral element of system design and development. This work has been pursued in programs in a wide variety of technical domains, beginning with nuclear power plant operations. Since the mid-1980s the laboratory has transferred the methods and tools developed in the nuclear domain to military weapons systems and aircraft, offshore oil and shipping operations, and commercial aviation operations and aircraft design. Through these diverse applications the laboratory has developed an integrated approach and framework for application of human performance analysis, human reliability analysis (HRA), operational data analysis, and simulation studies of human performance to the design and development of complex systems. This approach was recently tested in the NASA Advanced Concepts Program "Structured Human Error Analysis for Aircraft Design." This program resulted in the prototype software tool THEA (Tool for Human Error Analysis) for incorporating human error analysis in the design of commercial aircraft, focusing on airplane maintenance tasks. Current effort is directed toward applying this framework to the development of advanced Air Traffic Management (ATM) systems as part of NASA's Advanced Air Transportation Technologies (AATT) program. This paper summarizes the approach, describes recent and current applications in commercial aviation, and provides perspectives on how the approach could be utilized in the nuclear power industry.

  3. Design and performance analysis of gas and liquid radial turbines

    NASA Astrophysics Data System (ADS)

    Tan, Xu

    In the first part of the research, pumps running in reverse as turbines are studied. This work uses experimental data from a wide range of pumps representative of centrifugal pump configurations in terms of specific speed. Based on specific speed and specific diameter, an accurate correlation is developed to predict performance at the best efficiency point of a centrifugal pump in its turbine mode of operation. The proposed prediction method is compared to nine previous methods found in the literature, and the comparison shows it to be the most accurate to date. The method can be further complemented and supplemented by future tests to increase its accuracy, and it is meaningful because it is based on both specific speed and specific diameter. The second part of the research is focused on the design and analysis of a radial gas turbine. The specification of the turbine is obtained from a solar biogas hybrid system, which is theoretically analyzed and constructed around a purchased compressor. Theoretical analysis results in a specification of 100 lb/min, 900 °C inlet total temperature and 1.575 atm inlet total pressure. 1-D and 3-D geometry of the rotor is generated based on Aungier's method. 1-D loss-model analysis and 3-D CFD simulations are performed to examine the performance of the rotor; the total-to-total efficiency of the rotor is more than 90%. With the help of CFD analysis, modifications to the preliminary design yielded optimized aerodynamic performance. Finally, a theoretical performance analysis of the hybrid system is performed with the designed turbine.
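The two similarity parameters the correlation is built on are standard turbomachinery quantities. Below is a minimal sketch of their dimensionless forms in one common convention; the dissertation may use a dimensional variant, so the formulas here are illustrative:

```python
def specific_speed(omega, q, head, g=9.81):
    """Dimensionless specific speed of a pump or turbine:
    omega [rad/s], volumetric flow q [m^3/s], head [m]."""
    return omega * q**0.5 / (g * head)**0.75

def specific_diameter(d, q, head, g=9.81):
    """Dimensionless specific diameter: rotor diameter d [m],
    volumetric flow q [m^3/s], head [m]."""
    return d * (g * head)**0.25 / q**0.5

# Sanity check: with unit flow and unit specific energy (g*H = 1),
# both parameters reduce to the raw speed and diameter
ns = specific_speed(1.0, 1.0, 1 / 9.81)     # -> 1.0
ds = specific_diameter(1.0, 1.0, 1 / 9.81)  # -> 1.0
```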

  4. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    SciTech Connect

    Bhatele, Abhinav

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to the physicists and computer scientists developing the simulation codes and runtimes, respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big-data issues when analyzing performance on leadership-computing-class systems and to assist the HPC community in making the most effective use of these resources.

  5. Analysis of Photovoltaic System Energy Performance Evaluation Method

    SciTech Connect

    Kurtz, S.; Newmiller, J.; Kimber, A.; Flottemesch, R.; Riley, E.; Dierauf, T.; McKee, J.; Krishnani, P.

    2013-11-01

    Documentation of the energy yield of a large photovoltaic (PV) system over a substantial period can be useful to measure a performance guarantee, as an assessment of the health of the system, for verification of a performance model to then be applied to a new system, or for a variety of other purposes. Although the measurement of this performance metric might appear to be straightforward, there are a number of subtleties associated with variations in weather and imperfect data collection that complicate the determination and data analysis. A performance assessment is most valuable when it is completed with a very low uncertainty and when the subtleties are systematically addressed, yet currently no standard exists to guide this process. This report summarizes a draft methodology for an Energy Performance Evaluation Method, the philosophy behind the draft method, and the lessons that were learned by implementing the method.
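A core quantity in most such evaluations is the performance ratio, which normalizes measured energy by what the nameplate rating would produce under the measured insolation. The sketch below shows only this basic ratio with invented numbers; the draft method itself involves many more corrections (temperature, availability, etc.):

```python
def performance_ratio(energy_kwh, p_stc_kw, insolation_kwh_m2, g_stc=1.0):
    """PR = measured AC energy / (nameplate STC power x plane-of-array
    insolation / STC irradiance of 1 kW/m^2). Dimensionless; roughly
    0.75-0.85 is typical for healthy systems."""
    expected_kwh = p_stc_kw * insolation_kwh_m2 / g_stc
    return energy_kwh / expected_kwh

# Illustrative month: 100 kW array, 150 kWh/m^2 of plane-of-array
# insolation, 12,000 kWh metered at the AC output
pr = performance_ratio(12000.0, 100.0, 150.0)  # -> 0.8
```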

  6. An Empirical Analysis of Human Performance and Nuclear Safety Culture

    SciTech Connect

    Jeffrey Joe; Larry G. Blackwood

    2006-06-01

    The purpose of this analysis, which was conducted for the US Nuclear Regulatory Commission (NRC), was to test whether an empirical connection exists between human performance and nuclear power plant safety culture. This was accomplished by analyzing the relationship between a measure of human performance and a plant's Safety Conscious Work Environment (SCWE). SCWE is an important component of the safety culture construct the NRC has developed, but it is not synonymous with it. SCWE is an environment in which employees are encouraged to raise safety concerns both to their own management and to the NRC without fear of harassment, intimidation, retaliation, or discrimination. Because the relationship between human performance and allegations is intuitively reciprocal and both directions of the relationship need exploration, two series of analyses were performed. First, because human performance data could be indicative of safety culture, regression analyses were performed using human performance data to predict SCWE. Second, because it is also likely that safety culture contributes to human performance issues at a plant, a second set of regressions was performed using allegations to predict HFIS results.
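The regressions described are ordinary least-squares fits. As a minimal single-predictor sketch with made-up plant-level numbers (the report's actual data and indices are not reproduced here):

```python
def ols(x, y):
    """Least-squares slope and intercept for a single predictor, the
    kind of simple regression used to relate a human-performance index
    to a safety-culture score."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical per-plant values: performance index vs. SCWE score
perf = [1.0, 2.0, 3.0, 4.0]
scwe = [2.1, 3.9, 6.1, 7.9]
slope, intercept = ols(perf, scwe)  # slope 1.96, intercept 0.1
```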

  7. The Analysis of Athletic Performance: Some Practical and Philosophical Considerations

    ERIC Educational Resources Information Center

    Nelson, Lee J.; Groom, Ryan

    2012-01-01

    This article presents a hypothetical dialogue between a notational analyst (NA) recently schooled in the positivistic assessment of athletic performance, an "old-school" traditional coach (TC) who favours subjective analysis, and a pragmatic educator (PE). The conversation opens with NA and TC debating the respective value of quantitative and…

  8. Performance on the Pharmacy College Admission Test: An Exploratory Analysis.

    ERIC Educational Resources Information Center

    Kawahara, Nancy E.; Ethington, Corinna

    1994-01-01

    Median polishing, an exploratory data statistical analysis technique, was used to study achievement patterns for men and women on the Pharmacy College Admission Test over a six-year period. In general, a declining trend in scores was found, and males performed better than females, with the largest differences found in chemistry and biology.…

  9. Visuo-Spatial Performance in Autism: A Meta-Analysis

    ERIC Educational Resources Information Center

    Muth, Anne; Hönekopp, Johannes; Falter, Christine M.

    2014-01-01

    Visuo-spatial skills are believed to be enhanced in autism spectrum disorders (ASDs). This meta-analysis tests the current state of evidence for Figure Disembedding, Block Design, Mental Rotation and Navon tasks in ASD and neurotypicals. Block Design (d = 0.32) and Figure Disembedding (d = 0.26) showed superior performance for ASD with large…
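The d values quoted are Cohen's d effect sizes. A minimal sketch of the pooled-standard-deviation form, using illustrative scores rather than the meta-analysis data:

```python
import math

def cohens_d(group_a, group_b):
    """Cohen's d with a pooled standard deviation: standardized mean
    difference between two groups (e.g. ASD vs. neurotypical task
    scores), positive when group_a scores higher."""
    na, nb = len(group_a), len(group_b)
    ma, mb = sum(group_a) / na, sum(group_b) / nb
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)
    pooled = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled

# Illustrative task scores for two small groups
d = cohens_d([1, 2, 3], [2, 3, 4])  # -> -1.0
```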

  10. Path Analysis Tests of Theoretical Models of Children's Memory Performance

    ERIC Educational Resources Information Center

    DeMarie, Darlene; Miller, Patricia H.; Ferron, John; Cunningham, Walter R.

    2004-01-01

    Path analysis was used to test theoretical models of relations among variables known to predict differences in children's memory--strategies, capacity, and metamemory. Children in kindergarten to fourth grade (chronological ages 5 to 11) performed different memory tasks. Several strategies (i.e., sorting, clustering, rehearsal, and self-testing)…

  11. A Semiotic Reading and Discourse Analysis of Postmodern Street Performance

    ERIC Educational Resources Information Center

    Lee, Mimi Miyoung; Chung, Sheng Kuan

    2009-01-01

    Postmodern street art operates under a set of references that requires art educators and researchers to adopt alternative analytical frameworks in order to understand its meanings. In this article, we describe social semiotics, critical discourse analysis, and postmodern street performance as well as the relevance of the former two in interpreting…

  12. Performance Analysis of GAME: A Generic Automated Marking Environment

    ERIC Educational Resources Information Center

    Blumenstein, Michael; Green, Steve; Fogelman, Shoshana; Nguyen, Ann; Muthukkumarasamy, Vallipuram

    2008-01-01

    This paper describes the Generic Automated Marking Environment (GAME) and provides a detailed analysis of its performance in assessing student programming projects and exercises. GAME has been designed to automatically assess programming assignments written in a variety of languages based on the "structure" of the source code and the correctness…

  13. Frontiers of Performance Analysis on Leadership-Class Systems

    SciTech Connect

    Fowler, R J; Adhianto, L; de Supinski, B R; Fagan, M; Gamblin, T; Krentel, M; Mellor-Crummey, J; Schulz, M; Tallent, N

    2009-06-15

    The number of cores employed in high-end systems for scientific computing is increasing rapidly. As a result, there is a pressing need for tools that can measure, model, and diagnose performance problems in highly parallel runs. We describe two tools that employ complementary approaches for analysis at scale, and we illustrate their use on DOE leadership-class systems.

  14. Storage element performance optimization for CMS analysis jobs

    NASA Astrophysics Data System (ADS)

    Behrmann, G.; Dahlblom, J.; Guldmyr, J.; Happonen, K.; Lindén, T.

    2012-12-01

    Tier-2 computing sites in the Worldwide LHC Computing Grid (WLCG) host CPU resources (Compute Element, CE) and storage resources (Storage Element, SE). The vast amount of data that needs to be processed from the Large Hadron Collider (LHC) experiments requires good and efficient use of the available resources. Good CPU efficiency for end users' analysis jobs requires that the performance of the storage system be able to scale with the I/O requests of hundreds or even thousands of simultaneous jobs. In this presentation we report on the work on improving the SE performance at the Helsinki Institute of Physics (HIP) Tier-2, used for the Compact Muon Solenoid (CMS) experiment at the LHC. Statistics from CMS grid jobs are collected and stored in the CMS Dashboard for further analysis, which allows for easy performance monitoring by the sites and by the CMS collaboration. As part of the monitoring framework, CMS used the JobRobot, which sent 100 analysis jobs to each site every four hours; CMS also uses the HammerCloud tool for site monitoring and stress testing, and it has since replaced the JobRobot. The performance of the analysis workflow submitted with JobRobot or HammerCloud can be used to track performance changes due to site configuration changes, since the analysis workflow is kept the same for all sites and for months at a time. The CPU efficiency of the JobRobot jobs at HIP was increased by approximately 50% to more than 90%, by tuning the SE and through improvements in the CMSSW and dCache software. The performance of the CMS analysis jobs improved significantly too. Similar work has been done at other CMS Tier sites, and on average the CPU efficiency for CMSSW jobs increased during 2011. Better monitoring of the SE allows faster detection of problems, so that the performance level can be kept high.
The next storage upgrade at HIP consists of SAS disk enclosures which can be stress tested on demand with HammerCloud workflows, to make sure that the I/O-performance
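The CPU-efficiency metric tracked by these monitoring jobs is simply CPU time over wall-clock time. A minimal sketch with illustrative timings, not HIP's actual numbers:

```python
def cpu_efficiency(cpu_seconds, wall_seconds):
    """CPU efficiency of a batch job in percent: the fraction of
    wall-clock time the CPU was busy. Low values usually mean the job
    sat waiting on the storage element for input data."""
    return 100.0 * cpu_seconds / wall_seconds

# An I/O-bound analysis job before and after SE tuning (illustrative)
before = cpu_efficiency(3600, 6000)  # 60.0 %
after = cpu_efficiency(3600, 3900)   # ~92.3 %
```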

  15. Modeling and performance analysis of GPS vector tracking algorithms

    NASA Astrophysics Data System (ADS)

    Lashley, Matthew

    This dissertation provides a detailed analysis of GPS vector tracking algorithms and the advantages they have over traditional receiver architectures. Standard GPS receivers use a decentralized architecture that separates the tasks of signal tracking and position/velocity estimation. Vector tracking algorithms combine the two tasks into a single algorithm. The signals from the various satellites are processed collectively through a Kalman filter. The advantages of vector tracking over traditional, scalar tracking methods are thoroughly investigated. A method for making a valid comparison between vector and scalar tracking loops is developed. This technique avoids the ambiguities encountered when attempting to make a valid comparison between tracking loops (which are characterized by noise bandwidths and loop order) and the Kalman filters (which are characterized by process and measurement noise covariance matrices) that are used by vector tracking algorithms. The improvement in performance offered by vector tracking is calculated in multiple different scenarios. Rule of thumb analysis techniques for scalar Frequency Lock Loops (FLL) are extended to the vector tracking case. The analysis tools provide a simple method for analyzing the performance of vector tracking loops. The analysis tools are verified using Monte Carlo simulations. Monte Carlo simulations are also used to study the effects of carrier to noise power density (C/N0) ratio estimation and the advantage offered by vector tracking over scalar tracking. The improvement from vector tracking ranges from 2.4 to 6.2 dB in various scenarios. The difference in the performance of the three vector tracking architectures is analyzed. The effects of using a federated architecture with and without information sharing between the receiver's channels are studied. A combination of covariance analysis and Monte Carlo simulation is used to analyze the performance of the three algorithms. 
The federated algorithm without

  16. Performance Analysis of HF Band FB-MC-SS

    SciTech Connect

    Hussein Moradi; Stephen Andrew Laraway; Behrouz Farhang-Boroujeny

    2016-01-01

    In a recent paper [1] the filter bank multicarrier spread spectrum (FB-MC-SS) waveform was proposed for wideband spread spectrum HF communications. A significant benefit of this waveform is robustness against narrowband and partial-band interference. Simulation results in [1] demonstrated good performance in a wideband HF channel over a wide range of conditions. In this paper we present a theoretical analysis of the bit error probability for this system. Our analysis tailors the results from [2], where BER performance was analyzed for maximum ratio combining systems, accounting for correlation between subcarriers and channel estimation error. Equations are given for BER that closely match the simulated performance in most situations.
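For orientation, the non-fading baseline against which such analyses are usually compared is the textbook BPSK error rate over AWGN. The sketch below implements only that baseline, not the paper's FB-MC-SS expressions:

```python
import math

def ber_bpsk_awgn(ebn0_db):
    """Textbook BPSK bit-error probability over an AWGN channel,
    P_b = 0.5 * erfc(sqrt(Eb/N0)), with Eb/N0 given in dB."""
    ebn0 = 10.0 ** (ebn0_db / 10.0)
    return 0.5 * math.erfc(math.sqrt(ebn0))

# At 0 dB about 7.9 % of bits are in error; the rate falls steeply
# as Eb/N0 rises
p0 = ber_bpsk_awgn(0.0)
p10 = ber_bpsk_awgn(10.0)
```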

  17. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized, network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
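The core of such a model is an event-driven pass that assigns each service request to a resource as it becomes free. A minimal single-server sketch (the actual model covers multiple resource types and stochastic demand distributions):

```python
def single_server_completions(arrivals, service):
    """Toy discrete-event pass over a single-server FIFO queue: given
    request arrival times and service demands, return completion times.
    Requests that arrive while the server is busy wait in the queue."""
    completions, free_at = [], 0.0
    for arrive, need in sorted(zip(arrivals, service)):
        start = max(arrive, free_at)  # wait until the server frees up
        free_at = start + need
        completions.append(free_at)
    return completions

# Three requests, each needing 2 time units, arriving 1 unit apart:
# the queue builds up and completions spread out
print(single_server_completions([0, 1, 2], [2, 2, 2]))  # [2, 4, 6]
```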

  18. Performance demonstration program plan for analysis of simulated headspace gases

    SciTech Connect

    1995-06-01

    The Performance Demonstration Program (PDP) for analysis of headspace gases will consist of regular distribution and analyses of test standards to evaluate the capability for analyzing VOCs, hydrogen, and methane in the headspace of transuranic (TRU) waste throughout the Department of Energy (DOE) complex. Each distribution is termed a PDP cycle. These evaluation cycles will provide an objective measure of the reliability of measurements performed for TRU waste characterization. Laboratory performance will be demonstrated by the successful analysis of blind audit samples of simulated TRU waste drum headspace gases according to the criteria set within the text of this Program Plan. Blind audit samples (hereinafter referred to as PDP samples) will be used as an independent means to assess laboratory performance regarding compliance with the QAPP QAOs. The concentration of analytes in the PDP samples will encompass the range of concentrations anticipated in actual waste characterization gas samples. Analyses which are required by the WIPP to demonstrate compliance with various regulatory requirements and which are included in the PDP must be performed by laboratories which have demonstrated acceptable performance in the PDP.

  19. Relative performance of academic departments using DEA with sensitivity analysis.

    PubMed

    Tyagi, Preeti; Yadav, Shiv Prasad; Singh, S P

    2009-05-01

    The process of liberalization and globalization of the Indian economy has brought new opportunities and challenges in all areas of human endeavor, including education. Educational institutions have to adopt new strategies to make the best use of the opportunities and counter the challenges. One of these challenges is how to assess the performance of academic programs based on multiple criteria. Keeping this in view, this paper attempts to evaluate the performance efficiencies of 19 academic departments of IIT Roorkee (India) through the data envelopment analysis (DEA) technique. The technique has been used to assess the performance of academic institutions in a number of countries, such as the USA, UK and Australia, but to the best of our knowledge this is the first time it has been applied in the Indian context. Applying DEA models, we calculate technical, pure technical and scale efficiencies and identify the reference sets for inefficient departments. Input and output projections are also suggested for inefficient departments to reach the frontier. Overall performance, research performance and teaching performance are assessed separately using sensitivity analysis.
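In the special case of one input and one output, the CCR efficiency score reduces to each unit's output/input ratio divided by the best ratio in the sample; the general multi-input, multi-output case used in the paper requires solving a linear program per department. A minimal sketch of the single-ratio case with invented department data:

```python
def ccr_single(inputs, outputs):
    """DEA efficiencies for the single-input, single-output special
    case: each decision-making unit's output/input ratio scaled by the
    best ratio across all units, so efficient units score 1.0."""
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical departments: faculty count (input) vs. publications
# (output); departments 1 and 3 sit on the efficient frontier
eff = ccr_single([10, 20, 20], [30, 40, 60])  # [1.0, 0.667, 1.0]
```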

  20. Mission analysis and performance specification studies report, appendix A

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The Near Term Hybrid Passenger Vehicle Development Program tasks included defining missions, developing distributions of daily travel and composite driving cycles for these missions, providing information necessary to estimate the potential replacement of the existing fleet by hybrids, and estimating acceleration/gradeability performance requirements for safe operation. The data was then utilized to develop mission specifications, define reference vehicles, develop hybrid vehicle performance specifications, and make fuel consumption estimates for the vehicles. The major assumptions which underlie the approach taken to the mission analysis and development of performance specifications are the following: the daily operating range of a hybrid vehicle should not be limited by the stored energy capacity and the performance of such a vehicle should not be strongly dependent on the battery state of charge.

  1. Integrated microfluidic systems for high-performance genetic analysis.

    PubMed

    Liu, Peng; Mathies, Richard A

    2009-10-01

    Driven by the ambitious goals of genome-related research, fully integrated microfluidic systems have developed rapidly to advance biomolecular and, in particular, genetic analysis. To produce a microsystem with high performance, several key elements must be strategically chosen, including device materials, temperature control, microfluidic control, and sample/product transport integration. We review several significant examples of microfluidic integration in DNA sequencing, gene expression analysis, pathogen detection, and forensic short tandem repeat typing. The advantages of high speed, increased sensitivity, and enhanced reliability enable these integrated microsystems to address bioanalytical challenges such as single-copy DNA sequencing, single-cell gene expression analysis, pathogen detection, and forensic identification of humans in formats that enable large-scale and point-of-analysis applications.

  2. Safety and performance analysis of a commercial photovoltaic installation

    NASA Astrophysics Data System (ADS)

    Hamzavy, Babak T.; Bradley, Alexander Z.

    2013-09-01

    Continuing to better understand the performance of PV systems, and how that performance changes over the system's life, is vital to the sustainable growth of solar. A systematic understanding of the degradation mechanisms induced by variables such as the service environment, installation, module/material design, weather, operation and maintenance, and manufacturing is required for reliable operation throughout a system's lifetime. We report the results from an analysis of a commercial c-Si PV array owned and operated by DuPont. We assessed the electrical performance of the modules by comparing the original manufacturers' performance data with measurements obtained using a solar simulator to determine the degradation rate. This evaluation provides valuable PV system field experience and documents key issues regarding safety and performance. A review of the nondestructive and destructive analytical methods and characterization strategies we have found useful for system, module, and subsequent material component evaluations is presented. We provide an overview of our inspection protocol and the subsequent control process to mitigate risk. The objective is to explore and develop best-practice protocols for PV asset optimization and to provide a rationale for reducing risk based on the analysis of our own commercial installations.
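Degradation rate is typically reported as the least-squares slope of measured power versus time, expressed as a percentage of initial power per year. A minimal sketch with invented module measurements, not the DuPont array data:

```python
def degradation_rate(years, pmax):
    """Least-squares slope of measured Pmax vs. time, expressed as
    percent of the initial measurement per year (negative values
    indicate degradation)."""
    n = len(years)
    mt, mp = sum(years) / n, sum(pmax) / n
    slope = (sum((t - mt) * (p - mp) for t, p in zip(years, pmax))
             / sum((t - mt) ** 2 for t in years))
    return 100.0 * slope / pmax[0]

# Module measured at install and at three later inspections
rate = degradation_rate([0, 2, 4, 6], [200.0, 198.0, 196.0, 194.0])
# -> -0.5 %/year
```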

  3. Thermodynamic performance analysis of ramjet engine at wide working conditions

    NASA Astrophysics Data System (ADS)

    Ou, Min; Yan, Li; Tang, Jing-feng; Huang, Wei; Chen, Xiao-qian

    2017-03-01

    Although the ramjet has the advantages of high-speed flight and higher specific impulse, its performance parameters decline seriously with increasing flight Mach number and flight height. An investigation of the thermodynamic performance of the ramjet is therefore crucial for broadening its working range. In the current study, a typical ramjet model has been employed to investigate the performance characteristics at wide working conditions. First, a compression characteristic analysis is carried out based on the Brayton cycle. The results show that the specific cross-sectional areas (A2 and A5) and the air-fuel ratio (f) have a great influence on the ramjet performance indexes. Second, the thermodynamic calculation process of the ramjet is given from the viewpoint of aerothermal analysis. The variation trends of the ramjet performance indexes with the flow conditions, the air-fuel ratio (f) and the specific cross-sectional areas (A2 and A5) are then discussed under a fixed operating condition, an equal-dynamic-pressure condition and a variable-dynamic-pressure condition. Finally, optimum values of the specific cross-sectional area (A5) and the air-fuel ratio (f) of the ramjet model at a fixed working condition (Ma = 3.5, H = 12 km) are obtained.
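The Brayton-cycle starting point can be made concrete: when all compression is supplied by ram effect, the ideal thermal efficiency depends only on flight Mach number. The sketch below shows this ideal limit; the losses and the f, A2, A5 dependencies studied in the paper are deliberately omitted:

```python
def ideal_ramjet_thermal_efficiency(mach, gamma=1.4):
    """Ideal Brayton-cycle thermal efficiency with ram compression
    only: eta = 1 - 1 / (1 + (gamma - 1)/2 * M^2). Real engine
    performance falls below this ideal limit."""
    tau = 1.0 + 0.5 * (gamma - 1.0) * mach**2
    return 1.0 - 1.0 / tau

# At the paper's fixed working condition, Ma = 3.5
eta = ideal_ramjet_thermal_efficiency(3.5)  # ~0.71
```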

  4. Performance requirements analysis for payload delivery from a space station

    NASA Technical Reports Server (NTRS)

    Friedlander, A. L.; Soldner, J. K.; Bell, J. (Editor); Ricks, G. W.; Kincade, R. E.; Deatkins, D.; Reynolds, R.; Nader, B. A.; Hill, O.; Babb, G. R.

    1983-01-01

    Operations conducted from a space station in low Earth orbit are subject to different constraints and opportunities than those conducted by direct Earth launch. While a space station relieves many size and performance constraints on the space shuttle, the station's inertial orbit imposes launch window constraints different from those associated with customary Earth launches, which reflect upon upper stage capability. A performance requirements analysis was developed to provide a reference source of parametric data, specific case solutions, and upper stage sizing trades to assist potential space station users, as well as space station and upper stage developers, in assessing the impacts of a space station on missions of interest.

  5. A Study of ATLAS Grid Performance for Distributed Analysis

    NASA Astrophysics Data System (ADS)

    Panitkin, Sergey; Fine, Valery; Wenaus, Torre

    2012-12-01

    In the past two years the ATLAS Collaboration at the LHC has collected a large volume of data and published a number of groundbreaking papers. The Grid-based ATLAS distributed computing infrastructure played a crucial role in enabling timely analysis of the data. We present a study of the performance and usage of the ATLAS Grid as a platform for physics analysis in 2011. This includes studies of general properties as well as timing properties of user jobs (wait time, run time, etc.). These studies are based on mining of data archived by the PanDA workload management system.
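
    Timing studies of this kind reduce to aggregating per-job timestamps. A minimal sketch, assuming hypothetical job records (the field names are illustrative, not the actual PanDA schema):

```python
from datetime import datetime
from statistics import median

# Hypothetical archived job records (illustrative field names).
jobs = [
    {"submitted": datetime(2011, 6, 1, 10, 0),
     "started":   datetime(2011, 6, 1, 10, 7),
     "finished":  datetime(2011, 6, 1, 11, 2)},
    {"submitted": datetime(2011, 6, 1, 10, 5),
     "started":   datetime(2011, 6, 1, 10, 6),
     "finished":  datetime(2011, 6, 1, 10, 50)},
]

def timing(job):
    """Wait time (submission to start) and run time (start to finish), in seconds."""
    wait = (job["started"] - job["submitted"]).total_seconds()
    run = (job["finished"] - job["started"]).total_seconds()
    return wait, run

waits, runs = zip(*(timing(j) for j in jobs))
print("median wait [s]:", median(waits))
print("median run  [s]:", median(runs))
```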

  6. Aerodynamic Analysis of Cup Anemometers Performance: The Stationary Harmonic Response

    PubMed Central

    Pindado, Santiago; Cubas, Javier; Sanz-Andrés, Ángel

    2013-01-01

    The effect of cup anemometer shape parameters, such as the cups' shape, their size, and their center rotation radius, was experimentally analyzed. This analysis was based on both the calibration constants of the transfer function and the most important harmonic term of the rotor's movement, which, due to the cup anemometer design, is the third one. This harmonic analysis represents a new approach to studying cup anemometer performance. The results clearly showed a good correlation between the average rotational speed of the anemometer's rotor and the aforementioned third harmonic term of its movement. PMID:24381512
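
    The third-harmonic term can be extracted from sampled rotor speed with a direct Fourier sum. A small sketch on a synthetic signal (the ripple amplitude is made up):

```python
import math

def harmonic_amplitude(samples, k):
    """Amplitude of the k-th harmonic of a uniformly sampled periodic signal,
    via a direct discrete Fourier sum over one period."""
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
    im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
    return 2.0 * math.hypot(re, im) / n

# Synthetic rotor speed over one turn: mean speed plus a third-harmonic
# ripple, mimicking the three-cup geometry (amplitudes are invented).
n = 360
omega = [10.0 + 0.8 * math.cos(3 * 2 * math.pi * i / n) for i in range(n)]
print(round(harmonic_amplitude(omega, 3), 3))  # recovers the 0.8 ripple
```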

  7. INL FY2014 1st Quarterly Performance Analysis

    SciTech Connect

    Kinghorn, Loran

    2014-07-01

    This report is published quarterly by the Idaho National Laboratory (INL) Performance Assurance Organization. The Department of Energy Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of 76 occurrence reports and 16 other deficiency reports (including not reportable events) identified at the INL during the period of October 2013 through December 2013. Battelle Energy Alliance (BEA) operates the INL under contract DE-AC07-05ID14517.

  8. Aerodynamic analysis of cup anemometers performance: the stationary harmonic response.

    PubMed

    Pindado, Santiago; Cubas, Javier; Sanz-Andrés, Angel

    2013-01-01

    The effect of cup anemometer shape parameters, such as the cups' shape, their size, and their center rotation radius, was experimentally analyzed. This analysis was based on both the calibration constants of the transfer function and the most important harmonic term of the rotor's movement, which, due to the cup anemometer design, is the third one. This harmonic analysis represents a new approach to studying cup anemometer performance. The results clearly showed a good correlation between the average rotational speed of the anemometer's rotor and the aforementioned third harmonic term of its movement.

  9. Performance analysis of solar powered absorption refrigeration system

    NASA Astrophysics Data System (ADS)

    Abu-Ein, Suleiman Qaseem; Fayyad, Sayel M.; Momani, Waleed; Al-Bousoul, Mamdouh

    2009-12-01

    The present work provides a detailed thermodynamic analysis of a 10 kW solar absorption refrigeration system using ammonia-water mixtures as the working medium. This analysis includes both the first and second laws of thermodynamics. The coefficient of performance (COP), exergetic coefficient of performance (ECOP) and the exergy losses (ΔE) through each component of the system at different operating conditions are obtained. The minimum and maximum values of COP and ECOP were found to be at generator temperatures of 110°C and 200°C, respectively. About 40% of the system exergy losses were found to occur in the generator. The maximum exergy losses in the absorber occur at a generator temperature of 130°C for all evaporator temperatures. A computer simulation model is developed to carry out the calculations and to obtain the results of the present study.
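
    The first-law and second-law performance figures mentioned here have standard definitions, sketched below with illustrative numbers (the heat loads and temperatures are not taken from the paper):

```python
def cop(q_evap, q_gen, w_pump=0.0):
    """First-law coefficient of performance of an absorption chiller."""
    return q_evap / (q_gen + w_pump)

def ecop(q_evap, q_gen, t_evap, t_gen, t0=298.15):
    """Exergetic COP: exergy delivered at the evaporator over the
    exergy supplied at the generator (standard second-law definition)."""
    ex_evap = q_evap * (t0 / t_evap - 1.0)   # exergy of cooling below T0
    ex_gen = q_gen * (1.0 - t0 / t_gen)      # exergy of heat supplied above T0
    return ex_evap / ex_gen

# Illustrative numbers: 10 kW cooling load, invented generator heat input.
q_e, q_g = 10.0, 14.5          # kW
t_e, t_g = 278.15, 383.15      # evaporator / generator temperatures [K]
print(round(cop(q_e, q_g), 3))
print(round(ecop(q_e, q_g, t_e, t_g), 3))
```

    The ECOP is always well below the COP because only a fraction of the generator heat and cooling load is exergy, which is why second-law analysis pinpoints the generator as the main loss site.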

  10. Identifying influential factors of business process performance using dependency analysis

    NASA Astrophysics Data System (ADS)

    Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank

    2011-02-01

    We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures to represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain a deeper knowledge why certain KPI targets are not met.
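
    One node of such a dependency tree amounts to finding the metric threshold that best separates KPI violations. A minimal sketch using Gini impurity (the metric and data are invented for illustration; a real framework would grow a full decision tree):

```python
def gini(labels):
    """Gini impurity of a list of 0/1 KPI-violation labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 2 * p * (1 - p)

def best_split(values, kpi_violated):
    """Find the metric threshold that best separates KPI violations,
    by weighted Gini impurity (one node of a dependency tree)."""
    best = (None, gini(kpi_violated))
    for t in sorted(set(values)):
        left = [k for v, k in zip(values, kpi_violated) if v <= t]
        right = [k for v, k in zip(values, kpi_violated) if v > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(values)
        if score < best[1]:
            best = (t, score)
    return best

# Toy data: the KPI is violated whenever a service's response-time
# metric is high (values are illustrative).
resp_ms = [120, 150, 900, 980, 130, 1100]
violated = [0, 0, 1, 1, 0, 1]
threshold, impurity = best_split(resp_ms, violated)
print(threshold, round(impurity, 3))
```

    A split that drives the impurity to zero tells the analyst that this QoS metric alone explains the KPI misses, which is exactly the drill-down the dependency trees support.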

  11. Microfabricated devices for performing chemical and biochemical analysis

    SciTech Connect

    Ramsey, J.M.; Jacobson, S.C.; Foote, R.S.

    1997-05-01

    There is growing interest in microfabricated devices that perform chemical and biochemical analysis. The general goal is to use microfabrication tools to construct miniature devices that can perform a complete analysis starting with an unprocessed sample. Such devices have been referred to as lab-on-a-chip devices. Initial efforts on microfluidic laboratory-on-a-chip devices focused on chemical separations. There are many potential applications of these fluidic microchip devices. Some applications such as chemical process control or environmental monitoring would require that a chip be used over an extended period of time or for many analyses. Other applications such as forensics, clinical diagnostics, and genetic diagnostics would employ the chip devices as single use disposable devices.

  12. Modeling and performance analysis of QoS data

    NASA Astrophysics Data System (ADS)

    Strzeciwilk, Dariusz; Zuberek, Włodzimierz M.

    2016-09-01

    The article presents the results of modeling and analysis of data transmission performance in systems that support quality of service. Models are designed and tested taking into account a multiservice network architecture, i.e., one supporting the transmission of data belonging to different traffic classes. Traffic shaping mechanisms based on Priority Queuing were studied, with both an integrated data source and various generated data sources. The basic problems of QoS-supporting architectures and queuing systems are discussed. Models based on Petri nets, supported by temporal logics, were designed and built. Simulation tools were used to verify the traffic shaping mechanisms with the applied queuing algorithms. It is shown that temporal Petri net models can be used effectively in modeling and analyzing the performance of computer networks.

  13. High Performance Descriptive Semantic Analysis of Semantic Graph Databases

    SciTech Connect

    Joslyn, Cliff A.; Adolf, Robert D.; al-Saffar, Sinan; Feo, John T.; Haglin, David J.; Mackey, Greg E.; Mizell, David W.

    2011-06-02

    As semantic graph database technology grows to address components ranging from extant large triple stores to SPARQL endpoints over SQL-structured relational databases, it will become increasingly important to be able to understand their inherent semantic structure, whether codified in explicit ontologies or not. Our group is researching novel methods for what we call descriptive semantic analysis of RDF triplestores, to serve purposes of analysis, interpretation, visualization, and optimization. But data size and computational complexity make it increasingly necessary to bring high performance computational resources to bear on this task. Our research group built a novel high performance hybrid system comprising computational capability for semantic graph database processing, utilizing the large multi-threaded architecture of the Cray XMT platform, conventional servers, and large data stores. In this paper we describe that architecture and our methods, and present the results of our analyses of basic properties, connected components, namespace interaction, and typed paths for the Billion Triple Challenge 2010 dataset.

  14. Detection of Wind Turbine Power Performance Abnormalities Using Eigenvalue Analysis

    DTIC Science & Technology

    2014-12-23

    Detection of Wind Turbine Power Performance Abnormalities Using Eigenvalue Analysis. Georgios Alexandros Skrimpas, Christian Walsted Sweeney, Kun S...University of Denmark, Lyngby, 2800, Denmark. nm@elektro.dtu.dk jh@elektro.dtu.dk ABSTRACT Condition monitoring of wind turbines is a field of continuous research and development as new turbine configurations enter the market and new failure modes appear. Systems utilising well established

  15. Performance Analysis of Visible Light Communication Using CMOS Sensors.

    PubMed

    Do, Trong-Hop; Yoo, Myungsik

    2016-02-29

    This paper elucidates the fundamentals of visible light communication systems that use the rolling shutter mechanism of CMOS sensors. All related information involving different subjects, such as photometry, camera operation, photography and image processing, is studied in tandem to explain the system. Then, the system performance is analyzed with respect to signal quality and data rate. To this end, a measure of signal quality, the signal to interference plus noise ratio (SINR), is formulated. Finally, a simulation is conducted to verify the analysis.
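
    The SINR measure and the raw rolling-shutter data rate can be sketched in a few lines; the one-bit-per-row encoding model below is a simplification for illustration, not the paper's exact formulation:

```python
import math

def sinr_db(signal, interference, noise):
    """Signal to interference plus noise ratio, in dB;
    arguments are average powers in watts."""
    return 10.0 * math.log10(signal / (interference + noise))

def rolling_shutter_rate(rows, fps, bits_per_row=1):
    """Raw data rate when each exposed sensor row carries bits_per_row bits
    (illustrative model of rolling-shutter reception)."""
    return rows * fps * bits_per_row

print(round(sinr_db(1e-6, 2e-7, 5e-8), 2))   # illustrative powers
print(rolling_shutter_rate(1080, 30))        # 1080 rows at 30 fps
```

    The rate expression makes the key point of rolling-shutter reception explicit: the achievable data rate scales with the row count and frame rate rather than the frame rate alone.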

  16. Performance of multifractal detrended fluctuation analysis on short time series

    NASA Astrophysics Data System (ADS)

    López, Juan Luis; Contreras, Jesús Guillermo

    2013-02-01

    The performance of multifractal detrended fluctuation analysis on short time series is evaluated for synthetic samples of several mono- and multifractal models. The reconstruction of the generalized Hurst exponents is used to determine the range of applicability of the method and the precision of its results as a function of the decreasing length of the series. As an application, the series of the daily exchange rate between the U.S. dollar and the euro is studied.
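
    The monofractal core of the method, detrended fluctuation analysis, can be sketched compactly; MFDFA generalizes it by raising the segment variances to varying powers q. For uncorrelated noise the estimated exponent should sit near 0.5:

```python
import math
import random

def linfit(x, y):
    """Least-squares slope and intercept of y against x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return b, my - b * mx

def dfa(series, scales):
    """DFA with linear detrending: slope of log F(s) vs log s,
    i.e. the generalized Hurst exponent h(2) in MFDFA terms."""
    mean = sum(series) / len(series)
    profile, acc = [], 0.0
    for v in series:                      # cumulative-sum profile
        acc += v - mean
        profile.append(acc)
    logs, logf = [], []
    for s in scales:
        nseg = len(profile) // s
        var = 0.0
        for i in range(nseg):             # detrend each segment linearly
            seg = profile[i * s:(i + 1) * s]
            xs = list(range(s))
            b, a = linfit(xs, seg)
            var += sum((y - (a + b * x)) ** 2 for x, y in zip(xs, seg)) / s
        logs.append(math.log(s))
        logf.append(0.5 * math.log(var / nseg))
    h, _ = linfit(logs, logf)
    return h

random.seed(1)
noise = [random.gauss(0, 1) for _ in range(1024)]
h = dfa(noise, [8, 16, 32, 64, 128])
print(round(h, 2))  # near 0.5 for uncorrelated noise
```

    Shrinking the series length (and hence the number of segments per scale) degrades this estimate, which is precisely the short-series effect the paper quantifies.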

  17. Using Linguistic Analysis to Identify High Performing Teams

    DTIC Science & Technology

    2006-06-01

    linguistic analysis (specifically the Linguistic Inquiry and Word Count, LIWC) in identifying potential high performing teams. In a series of studies...usefulness of one technological tool, the Linguistic Inquiry Word Count (LIWC; Pennebaker, Francis, & Booth, 2001), in identifying productive groups. The...LIWC analyzes text on a word-by-word basis, categorizes each word using 72 linguistic dimensions (e.g., pronoun, present tense, cognitive process), and

  18. Analysis and performance of flat-plate solar collector arrays

    SciTech Connect

    Wang, X.A.; Wu, L.G.

    1990-01-01

    A new discrete numerical model is proposed to calculate the flow and temperature distribution in solar collector arrays. The flow nonuniformity, the longitudinal heat conduction, and the buoyancy effect are all taken into account in the analysis. The numerical results for the pressure and temperature distributions are found to be in agreement with the experimental results. It is found that flow nonuniformity has a detrimental effect on the thermal performance of the collector array.

  19. Analysis of guidance law performance using personal computers

    NASA Technical Reports Server (NTRS)

    Barrios, J. Rene

    1990-01-01

    A point mass, three-degree of freedom model is presented as a basic development tool for PC based simulation models. The model has been used in the development of guidance algorithms as well as in other applications such as performance management systems to compute optimal speeds. Its limitations and advantages are discussed with regard to the windshear environment. A method for simulating a simple autopilot is explained in detail and applied in the analysis of different guidance laws.

  20. Performance analysis of wireless sensor networks in geophysical sensing applications

    NASA Astrophysics Data System (ADS)

    Uligere Narasimhamurthy, Adithya

    Performance is an important criterion to consider before switching from a wired network to a wireless sensing network. Performance is especially important in geophysical sensing, where the quality of the sensing system is measured by the precision of the acquired signal. Can a wireless sensing network maintain the same reliability and quality metrics that a wired system provides? Our work focuses on evaluating the wireless GeoMote sensor motes that were developed by previous computer science graduate students at Mines. Specifically, we conducted a set of experiments, namely WalkAway and Linear Array experiments, to characterize the performance of the wireless motes. The motes were also equipped with the Sticking Heartbeat Aperture Resynchronization Protocol (SHARP), a time synchronization protocol developed by a previous computer science graduate student at Mines. This protocol should automatically synchronize the motes' internal clocks and reduce time synchronization errors. We also collected passive data to evaluate the response of GeoMotes to various frequency components associated with seismic waves. With the data collected from these experiments, we evaluated the performance of the SHARP protocol and compared the performance of our GeoMote wireless system against the industry standard wired seismograph system (Geometrics Geode). Using arrival time analysis and seismic velocity calculations, we set out to answer the following question: can our wireless sensing system (GeoMotes) perform similarly to a traditional wired system in a realistic scenario?

  1. Performance Demonstration Program Plan for Analysis of Simulated Headspace Gases

    SciTech Connect

    Carlsbad Field Office

    2006-04-01

    The Performance Demonstration Program (PDP) for headspace gases distributes sample gases of volatile organic compounds (VOCs) for analysis. Participating measurement facilities (i.e., fixed laboratories, mobile analysis systems, and on-line analytical systems) are located across the United States. Each sample distribution is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed for transuranic (TRU) waste characterization. The primary documents governing the conduct of the PDP are the Quality Assurance Program Document (QAPD) (DOE/CBFO-94-1012) and the Waste Isolation Pilot Plant (WIPP) Waste Analysis Plan (WAP) contained in the Hazardous Waste Facility Permit (NM4890139088-TSDF) issued by the New Mexico Environment Department (NMED). The WAP requires participation in the PDP; the PDP must comply with the QAPD and the WAP. This plan implements the general requirements of the QAPD and the applicable requirements of the WAP for the Headspace Gas (HSG) PDP. Participating measurement facilities analyze blind audit samples of simulated TRU waste package headspace gases according to the criteria set by this PDP Plan. Blind audit samples (hereafter referred to as PDP samples) are used as an independent means to assess each measurement facility’s compliance with the WAP quality assurance objectives (QAOs). To the extent possible, the concentrations of VOC analytes in the PDP samples encompass the range of concentrations anticipated in actual TRU waste package headspace gas samples. Analyses of headspace gases are required by the WIPP to demonstrate compliance with regulatory requirements. These analyses must be performed by measurement facilities that have demonstrated acceptable performance in this PDP. These analyses are referred to as WIPP analyses and the TRU waste package headspace gas samples on which they are performed are referred to as WIPP samples in this document. Participating measurement

  2. Performance Demonstration Program Plan for Analysis of Simulated Headspace Gases

    SciTech Connect

    Carlsbad Field Office

    2007-11-19

    The Performance Demonstration Program (PDP) for headspace gases distributes blind audit samples in a gas matrix for analysis of volatile organic compounds (VOCs). Participating measurement facilities (i.e., fixed laboratories, mobile analysis systems, and on-line analytical systems) are located across the United States. Each sample distribution is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed for transuranic (TRU) waste characterization. The primary documents governing the conduct of the PDP are the Quality Assurance Program Document (QAPD) (DOE/CBFO-94-1012) and the Waste Isolation Pilot Plant (WIPP) Waste Analysis Plan (WAP) contained in the Hazardous Waste Facility Permit (NM4890139088-TSDF) issued by the New Mexico Environment Department (NMED). The WAP requires participation in the PDP; the PDP must comply with the QAPD and the WAP. This plan implements the general requirements of the QAPD and the applicable requirements of the WAP for the Headspace Gas (HSG) PDP. Participating measurement facilities analyze blind audit samples of simulated TRU waste package headspace gases according to the criteria set by this PDP Plan. Blind audit samples (hereafter referred to as PDP samples) are used as an independent means to assess each measurement facility’s compliance with the WAP quality assurance objectives (QAOs). To the extent possible, the concentrations of VOC analytes in the PDP samples encompass the range of concentrations anticipated in actual TRU waste package headspace gas samples. Analyses of headspace gases are required by the WIPP to demonstrate compliance with regulatory requirements. These analyses must be performed by measurement facilities that have demonstrated acceptable performance in this PDP. These analyses are referred to as WIPP analyses and the TRU waste package headspace gas samples on which they are performed are referred to as WIPP samples in this document

  3. Performance Demonstration Program Plan for Analysis of Simulated Headspace Gases

    SciTech Connect

    Carlsbad Field Office

    2007-11-13

    The Performance Demonstration Program (PDP) for headspace gases distributes blind audit samples in a gas matrix for analysis of volatile organic compounds (VOCs). Participating measurement facilities (i.e., fixed laboratories, mobile analysis systems, and on-line analytical systems) are located across the United States. Each sample distribution is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed for transuranic (TRU) waste characterization. The primary documents governing the conduct of the PDP are the Quality Assurance Program Document (QAPD) (DOE/CBFO-94-1012) and the Waste Isolation Pilot Plant (WIPP) Waste Analysis Plan (WAP) contained in the Hazardous Waste Facility Permit (NM4890139088-TSDF) issued by the New Mexico Environment Department (NMED). The WAP requires participation in the PDP; the PDP must comply with the QAPD and the WAP. This plan implements the general requirements of the QAPD and the applicable requirements of the WAP for the Headspace Gas (HSG) PDP. Participating measurement facilities analyze blind audit samples of simulated TRU waste package headspace gases according to the criteria set by this PDP Plan. Blind audit samples (hereafter referred to as PDP samples) are used as an independent means to assess each measurement facility’s compliance with the WAP quality assurance objectives (QAOs). To the extent possible, the concentrations of VOC analytes in the PDP samples encompass the range of concentrations anticipated in actual TRU waste package headspace gas samples. Analyses of headspace gases are required by the WIPP to demonstrate compliance with regulatory requirements. These analyses must be performed by measurement facilities that have demonstrated acceptable performance in this PDP. These analyses are referred to as WIPP analyses and the TRU waste package headspace gas samples on which they are performed are referred to as WIPP samples in this document

  4. Multi-order analysis framework for comprehensive biometric performance evaluation

    NASA Astrophysics Data System (ADS)

    Gorodnichy, Dmitry O.

    2010-04-01

    It is not uncommon for contemporary biometric systems to have more than one match below the matching threshold, or to have two or more matches with close matching scores. This is especially true for systems that store large quantities of identities and/or measure loosely constrained biometric traits, such as in identification from video or at a distance. Current biometric performance evaluation standards, however, are still largely based on measuring single-score statistics such as the False Match and False Non-Match rates and the trade-off curves based thereon. Such methodology and reporting make it impossible to investigate the risks and risk mitigation strategies associated with not having a unique identifying score. To address the issue, the Canada Border Services Agency has developed a novel modality-agnostic multi-order performance analysis framework. The framework allows one to analyze system performance at several levels of detail, by defining the traditional single-score-based metrics as Order-1 analysis, and introducing Order-2 and Order-3 analysis to permit the investigation of the system's reliability and the confidence of its recognition decisions. Implemented in a toolkit called C-BET (Comprehensive Biometrics Evaluation Toolkit), the framework has been applied in a recent examination of state-of-the-art iris recognition systems, the results of which are presented, and is now recommended to other agencies interested in testing and tuning biometric systems.

  5. A Multifaceted Independent Performance Analysis of Facial Subspace Recognition Algorithms

    PubMed Central

    Bajwa, Usama Ijaz; Taj, Imtiaz Ahmad; Anwar, Muhammad Waqas; Wang, Xuan

    2013-01-01

    Face recognition has emerged as the fastest growing biometric technology and has expanded considerably in the last few years. Many new algorithms and commercial systems have been proposed and developed. Most of them use Principal Component Analysis (PCA) as a base for their techniques. Different and even conflicting results have been reported by researchers comparing these algorithms. The purpose of this study is to provide an independent comparative analysis, considering both performance and computational complexity, of six appearance-based face recognition algorithms, namely PCA, 2DPCA, A2DPCA, (2D)2PCA, LPP and 2DLPP, under equal working conditions. This study was motivated by the lack of an unbiased comprehensive comparative analysis of some recent subspace methods with diverse distance metric combinations. For comparison with other studies, the FERET, ORL and YALE databases have been used, with evaluation criteria as in the FERET evaluations, which closely simulate real-life scenarios. A comparison of results with previous studies is performed and anomalies are reported. An important contribution of this study is that it presents the suitable performance conditions for each of the algorithms under consideration. PMID:23451054

  6. Total systems design analysis of high performance structures

    NASA Technical Reports Server (NTRS)

    Verderaime, V.

    1993-01-01

    Designer-controlled parameters were identified at interdiscipline interfaces to optimize structural system performance and downstream development and operations with reliability and least life-cycle cost. Interface tasks and iterations are tracked through a matrix of performance-discipline integration versus manufacturing, verification, and operations interactions for a total system design analysis. Performance integration tasks include shapes, sizes, environments, and materials. Integrity integration tasks are reliability and recurring structural costs. Significant interface designer-controlled parameters were noted as shapes, dimensions, probability range factors, and cost. A structural failure concept is presented, and first-order reliability and deterministic methods, their benefits, and their limitations are discussed. A deterministic reliability technique combining the benefits of both, which is also timely and economically verifiable, is proposed for static structures. Though launch vehicle environments were primarily considered, the system design process is applicable to any surface system using its own unique field environments.

  7. Crew Exploration Vehicle Launch Abort Controller Performance Analysis

    NASA Technical Reports Server (NTRS)

    Sparks, Dean W., Jr.; Raney, David L.

    2007-01-01

    This paper covers the simulation and evaluation of a controller design for the Crew Module (CM) Launch Abort System (LAS), to measure its ability to meet the abort performance requirements. The controller used in this study is a hybrid design, including features developed by the Government and the Contractor. Testing is done using two separate 6-degree-of-freedom (DOF) computer simulation implementations of the LAS/CM throughout the ascent trajectory: 1) executing a series of abort simulations along a nominal trajectory for the nominal LAS/CM system; and 2) using a series of Monte Carlo runs with perturbed initial flight conditions and perturbed system parameters. The performance of the controller is evaluated against a set of criteria, which is based upon the current functional requirements of the LAS. Preliminary analysis indicates that the performance of the present controller meets (with the exception of a few cases) the evaluation criteria mentioned above.
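
    The Monte Carlo part of such an evaluation has a simple generic shape: perturb the initial flight conditions and system parameters, run the simulation, and score each run against a criterion. The sketch below uses a toy stand-in for the 6-DOF simulation; all numbers, names, and the pass criterion are invented for illustration.

```python
import random

def simulate_abort(dyn_pressure, thrust_scale):
    """Stand-in for a 6-DOF abort simulation: returns an achieved
    separation distance [m] (toy model, purely illustrative)."""
    return 500.0 * thrust_scale - 0.15 * dyn_pressure

def monte_carlo(runs=1000, required_sep=400.0, seed=7):
    """Fraction of perturbed runs meeting the (invented) separation criterion."""
    random.seed(seed)
    passes = 0
    for _ in range(runs):
        q = random.uniform(0.0, 800.0)     # perturbed flight condition
        thrust = random.gauss(1.0, 0.03)   # perturbed system parameter
        if simulate_abort(q, thrust) >= required_sep:
            passes += 1
    return passes / runs

print(monte_carlo())
```

    The pass fraction over the perturbed runs is the kind of aggregate statistic evaluated against the LAS functional requirements.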

  8. SIMS analysis of high-performance accelerator niobium

    SciTech Connect

    Maheshwari, P.; Stevie, F. A.; Myneni, Ganapati Rao; Rigsbee, J. M.; Dhakal, Pashupati; Ciovati, Gianluigi; Griffis, D. P.

    2014-11-01

    Niobium is used to fabricate superconducting radio frequency accelerator modules because of its high critical temperature, high critical magnetic field, and easy formability. Recent experiments have shown a very significant improvement in performance (over 100%) after a high-temperature bake at 1400 degrees C for 3 h. SIMS analysis of this material showed that the oxygen profile was significantly deeper than the native oxide, with a shape indicative of diffusion. Positive secondary ion mass spectra showed the presence of Ti with a depth profile similar to that of O. It is suspected that Ti is associated with the performance improvement. The source of Ti contamination in the anneal furnace has been identified, and a new furnace was constructed without Ti. Initial results from the new furnace do not show the yield improvement. Further analyses should determine the relationship of Ti to cavity performance.
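
    A diffusion-shaped depth profile of the kind described is conventionally modeled with the constant-source erfc solution and an Arrhenius diffusivity. In the sketch below the prefactor and activation energy are illustrative placeholders, not measured values for oxygen in niobium.

```python
import math

def diffusion_profile(x_um, temp_c, time_h, d0=1e-4, ea=2.8):
    """Constant-source diffusion profile C/C_s = erfc(x / (2*sqrt(D*t))),
    with Arrhenius diffusivity D = D0 * exp(-Ea / kT).
    d0 [cm^2/s] and ea [eV] are illustrative placeholders."""
    k = 8.617e-5                        # Boltzmann constant [eV/K]
    t_k = temp_c + 273.15
    d = d0 * math.exp(-ea / (k * t_k))  # diffusivity [cm^2/s]
    x_cm = x_um * 1e-4
    return math.erfc(x_cm / (2.0 * math.sqrt(d * time_h * 3600.0)))

# Normalized concentration 1 um deep after a 3 h bake: the profile is far
# deeper after the high-temperature anneal, the signature SIMS looks for.
shallow = diffusion_profile(1.0, 800.0, 3.0)
deep = diffusion_profile(1.0, 1400.0, 3.0)
print(round(shallow, 3), round(deep, 3))
```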

  9. Performance Analysis of Coaxial Fed Stacked Patch Antennas

    NASA Astrophysics Data System (ADS)

    Jain, Satish K.; Jain, Shobha

    2014-01-01

    A performance analysis of a coaxial-fed, stacked dual-patch, electromagnetically coupled microstrip antenna useful for satellite communication in the X/Ku band is presented. A simplified stacked dual-patch antenna structure with an adjustable foam gap between the patches is proposed. A few important geometrical parameters on which the performance of the stacked dual-patch antenna mainly depends were chosen: the dimensions of the lower square patch and of the upper square patch, and the height of the foam gap between the two patches. These parameters were varied one by one, keeping the others constant. Performance was observed through the reflection coefficient (dB) and Smith chart impedance plots, obtained from the numerical simulator (IE3D), for the dual resonance frequency and bandwidth. The proposed stacked dual-patch antenna geometry was also analyzed with a cavity model and an artificial neural network modeling technique; the dual resonance frequencies and associated bandwidths were calculated with them, and the results were cross-checked in the laboratory against a few experimental findings.

  10. The Vehicle Integrated Performance Analysis Experience: Reconnecting With Technical Integration

    NASA Technical Reports Server (NTRS)

    McGhee, D. S.

    2006-01-01

    Very early in the Space Launch Initiative program, a small team of engineers at MSFC proposed a process for performing system-level assessments of a launch vehicle. Aimed primarily at providing insight and making NASA a smart buyer, the Vehicle Integrated Performance Analysis (VIPA) team was created. The difference between the VIPA effort and previous integration attempts is that VIPA is a process using experienced people from various disciplines, focusing them on a technically integrated assessment. The foundations of VIPA's process are described. The VIPA team also recognized the need to target early detailed analysis toward identifying significant systems issues. This process is driven by the T-model for technical integration. VIPA's approach to performing system-level technical integration is discussed in detail. The VIPA process significantly enhances the development and monitoring of realizable project requirements. VIPA's assessment validates the concept's stated performance, identifies significant issues either with the concept or the requirements, and then reintegrates these issues to determine impacts. This process is discussed, along with a description of how it may be integrated into a program's insight and review process. The VIPA process has gained favor with both engineering and project organizations for being responsive and insightful.

  11. Dynamic performances analysis of a real vehicle driving

    NASA Astrophysics Data System (ADS)

    Abdullah, M. A.; Jamil, J. F.; Salim, M. A.

    2015-12-01

    Vehicle dynamics concerns the effects of vehicle motion generated by acceleration, braking, ride, and handling activities. The dynamic behaviours are determined by the forces from the tires, gravity, and aerodynamics acting on the vehicle. This paper emphasizes the analysis of the dynamic performance of a real vehicle. A real driving experiment is conducted to determine the vehicle's roll, pitch, and yaw, as well as its longitudinal, lateral, and vertical accelerations. The experiment uses an accelerometer to record the vehicle's dynamic performance while it is driven on the road. The experiment starts by weighing the vehicle to locate the center of gravity (COG), where the accelerometer sensor is placed for data acquisition (DAQ). The COG of the vehicle is determined from the measured wheel weights. A rural route is set for the experiment and the road conditions are characterized for the test. The dynamic performance of the vehicle depends on the road conditions and the driving maneuvers; the stability of a vehicle can be managed through dynamic performance analysis.
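
    The COG determination from wheel weights reduces to a weighted average of the wheel positions. A minimal sketch with invented scale readings and wheel coordinates:

```python
def cog_position(weights, positions):
    """Planar CoG location from per-wheel scale readings.
    weights: load on each wheel [kg]; positions: (x, y) of each wheel [m]."""
    total = sum(weights)
    x = sum(w * p[0] for w, p in zip(weights, positions)) / total
    y = sum(w * p[1] for w, p in zip(weights, positions)) / total
    return x, y, total

# Illustrative wheel layout for a small car: front axle at x = 0,
# wheelbase 2.5 m, track 1.5 m; loads are made up.
wheels = [(0.0, -0.75), (0.0, 0.75), (2.5, -0.75), (2.5, 0.75)]
loads = [330.0, 330.0, 270.0, 270.0]
x, y, total = cog_position(loads, wheels)
print(round(x, 3), round(y, 3), total)
```

    A heavier front axle pulls the longitudinal CoG forward of the wheelbase midpoint, which is where the accelerometer would then be mounted.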

  12. Reproducible LTE uplink performance analysis using precomputed interference signals

    NASA Astrophysics Data System (ADS)

    Pauli, Volker; Nisar, Muhammad Danish; Seidel, Eiko

    2011-12-01

    The consideration of realistic uplink inter-cell interference is essential for the overall performance testing of future cellular systems, and in particular for the evaluation of radio resource management (RRM) algorithms. Most beyond-3G communication systems employ orthogonal multiple access in the uplink (SC-FDMA in LTE and OFDMA in WiMAX) and additionally rely on frequency-selective RRM (scheduling) algorithms. This makes the accurate modeling of uplink interference both crucial and non-trivial; traditional methods (e.g., additive white Gaussian noise interference sources) are proving ineffective at realistically modeling uplink interference in next-generation cellular systems. In this article, we propose the use of realistic precomputed interference patterns for LTE uplink performance analysis and testing. The interference patterns are generated via an LTE system-level simulator for a given set of scenario parameters, such as cell configuration, user configurations, and traffic models. The generated interference patterns (some of which are made publicly available) can be employed to benchmark the performance of any LTE uplink system in both lab simulations and field trials for practical deployments. It is worth mentioning that the proposed approach can also be extended to other cellular communication systems employing OFDMA-like multiple access with frequency-selective RRM techniques. The proposed approach offers twofold advantages. First, it allows for repeatability and reproducibility of the performance analysis. This is of crucial significance not only for researchers and developers to analyze the behavior and performance of their systems, but also for network operators to compare the performance of competing system vendors. Second, the proposed testing mechanism evades the need for deployment of multiple cells (with multiple active users in each) to achieve realistic field trials, thereby resulting in

  13. Design and performance analysis of multilayer nested grazing incidence optics

    NASA Astrophysics Data System (ADS)

    Zuo, Fuchang; Deng, Loulou; Mei, Zhiwu; Li, Liansheng; Lv, Zhengxin

    2014-10-01

    We have developed X-ray grazing incidence optics with a single mirror. Although it can be used for ground demonstrations and tests to verify the feasibility of an X-ray detection system, it cannot meet the requirements of X-ray pulsar navigation because of its small effective area and large mass. There is therefore an urgent need for multilayer nested grazing incidence optics, in which multiple mirror layers form a coaxial and confocal system that maximizes the use of space and increases the effective area. In this paper, aiming at the future demands of X-ray pulsar navigation, optimization and analysis of nested X-ray grazing incidence optics were carried out: the recurrence relations between the mirror layers were derived, reasonable initial structural parameters and a stray light reduction method were given, and the theoretical effective collection area was calculated. The initial structure and the stray-light-eliminating structure were designed. An optical-mechanical-thermal numerical model was established using optical analysis software and finite element software for stray light analysis, focusing performance analysis, tolerance analysis, and mechanical analysis, providing evidence and guidance for the processing and alignment of nested X-ray grazing incidence optics.

  14. Quantitative analysis of the reconstruction performance of interpolants

    NASA Technical Reports Server (NTRS)

    Lansing, Donald L.; Park, Stephen K.

    1987-01-01

    The analysis presented provides a quantitative measure of the reconstruction or interpolation performance of linear, shift-invariant interpolants. The performance criterion is the mean square error of the difference between the sampled and reconstructed functions. The analysis is applicable to reconstruction algorithms used in image processing and to many types of splines used in numerical analysis and computer graphics. When formulated in the frequency domain, the mean square error clearly separates the contribution of the interpolation method from the contribution of the sampled data. The equations provide a rational basis for selecting an optimal interpolant; that is, one which minimizes the mean square error. The analysis has been applied to a selection of frequently used data splines and reconstruction algorithms: parametric cubic and quintic Hermite splines, exponential and nu splines (including the special case of the cubic spline), parametric cubic convolution, Keys' fourth-order cubic, and a cubic with a discontinuous first derivative. The emphasis in this paper is on the image-dependent case in which no a priori knowledge of the frequency spectrum of the sampled function is assumed.
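    The mean-square performance criterion described above can be made concrete with a small sketch. The code below is an illustrative time-domain analogue (the paper itself works in the frequency domain, where the interpolant and data contributions separate): it samples a test function, reconstructs it with a linear interpolant, and reports the mean square reconstruction error. The test function and sample counts are arbitrary choices, not the paper's.

    ```python
    import math

    def reconstruction_mse(f, n_samples=32, n_eval=2000):
        # Sample f uniformly on [0, 1], reconstruct with linear interpolation,
        # and return the mean square error on a dense evaluation grid.
        xs = [i / (n_samples - 1) for i in range(n_samples)]
        ys = [f(x) for x in xs]
        h = xs[1] - xs[0]

        def lerp(x):
            i = min(int(x / h), n_samples - 2)
            t = (x - xs[i]) / h
            return (1 - t) * ys[i] + t * ys[i + 1]

        return sum((f(j / (n_eval - 1)) - lerp(j / (n_eval - 1))) ** 2
                   for j in range(n_eval)) / n_eval

    mse = reconstruction_mse(lambda x: math.sin(2 * math.pi * 3 * x))
    ```

    Swapping `lerp` for a cubic convolution or Hermite spline reconstructor, as in the paper's comparison set, changes only the inner function.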

  15. How motivation affects academic performance: a structural equation modelling analysis.

    PubMed

    Kusurkar, R A; Ten Cate, Th J; Vos, C M P; Westers, P; Croiset, G

    2013-03-01

    Few studies in medical education have studied the effect of the quality of motivation on performance. Self-Determination Theory, based on the quality of motivation, differentiates between Autonomous Motivation (AM), which originates within an individual, and Controlled Motivation (CM), which originates from external sources. The aim was to determine whether Relative Autonomous Motivation (RAM, a measure of the balance between AM and CM) affects academic performance through good study strategy and higher study effort, and to compare this model between subgroups: males and females, and students selected via two different systems, namely qualitative and weighted lottery selection. Data on motivation, study strategy, and effort were collected from 383 medical students of VU University Medical Center Amsterdam, and their academic performance results were obtained from the student administration. A Structural Equation Modelling analysis was used to test a hypothesized model in which high RAM would positively affect Good Study Strategy (GSS) and study effort, which in turn would positively affect academic performance in the form of grade point averages. This model fit the data well (Chi square = 1.095, df = 3, p = 0.778, RMSEA = 0.000), and it also fitted well for all tested subgroups of students. As expected, the strength of the relationships between the variables differed across subgroups. In conclusion, RAM positively correlated with academic performance through a deep strategy towards study and higher study effort. This model seems valid in medical education for subgroups such as males, females, and students selected by qualitative and weighted lottery selection.

  16. Analysis of latency performance of bluetooth low energy (BLE) networks.

    PubMed

    Cho, Keuchul; Park, Woojin; Hong, Moonki; Park, Gisu; Cho, Wooseong; Seo, Jihoon; Han, Kijun

    2014-12-23

    Bluetooth Low Energy (BLE) is a short-range wireless communication technology aiming at low-cost and low-power communication. The performance evaluation of classical Bluetooth device discovery has been intensively studied using analytical modeling and simulative methods, but these techniques are not applicable to BLE, since BLE has a fundamental change in the design of the discovery mechanism, including the usage of three advertising channels. Recently, several works have analyzed the topic of BLE device discovery, but these studies are still far from thorough. It is thus necessary to develop a new, accurate model for the BLE discovery process. In particular, the wide range of settings of the parameters introduces lots of potential for BLE devices to customize their discovery performance. This motivates our study of modeling the BLE discovery process and performing intensive simulation. This paper is focused on building an analytical model to investigate the discovery probability, as well as the expected discovery latency, which are then validated via extensive experiments. Our analysis considers both continuous and discontinuous scanning modes. We analyze the sensitivity of these performance metrics to parameter settings to quantitatively examine to what extent parameters influence the performance of the discovery processes.
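    As a rough illustration of the kind of discovery-latency modeling the abstract describes, the sketch below simulates a single advertiser and a duty-cycled scanner under assumed parameter values. The intervals, the window, and the 0-10 ms advertising jitter are simplified stand-ins, not a faithful BLE channel model.

    ```python
    import random

    def mean_discovery_latency(adv_interval, scan_interval, scan_window,
                               trials=2000, seed=1):
        """Monte Carlo estimate of the mean time (ms) until an advertising
        event falls inside the scanner's listening window (continuous scanning)."""
        rng = random.Random(seed)
        total = 0.0
        for _ in range(trials):
            offset = rng.uniform(0, scan_interval)   # phase between the two devices
            t = rng.uniform(0, adv_interval)         # time of first advertising event
            while (t + offset) % scan_interval >= scan_window:
                t += adv_interval + rng.uniform(0, 10)  # next event, 0-10 ms jitter
            total += t
        return total / trials

    lat = mean_discovery_latency(adv_interval=100.0, scan_interval=100.0,
                                 scan_window=50.0)
    ```

    The jitter term is what prevents the advertiser and scanner from phase-locking, which is also why the real protocol randomizes the advertising interval.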

  17. Cross-industry Performance Modeling: Toward Cooperative Analysis

    SciTech Connect

    Reece, Wendy Jane; Blackman, Harold Stabler

    1998-10-01

    One of the current unsolved problems in human factors is the difficulty in acquiring information from lessons learned and data collected among human performance analysts in different domains. There are several common concerns and generally accepted issues of importance for human factors, psychology and industry analysts of performance and safety. Among these are the need to incorporate lessons learned in design, to carefully consider implementation of new designs and automation, and the need to reduce human performance-based contributions to risk. In spite of shared concerns, there are several roadblocks to widespread sharing of data and lessons learned from operating experience and simulation, including the fact that very few publicly accessible databases exist (Gertman & Blackman, 1994; Kirwan, 1997). There is a need to draw together analysts and analytic methodologies to comprise a centralized source of data with sufficient detail to be meaningful while ensuring source anonymity. We propose that a generic source of performance data and a multi-domain data store may provide the first steps toward cooperative performance modeling and analysis across industries.

  18. Cross-Industry Performance Modeling: Toward Cooperative Analysis

    SciTech Connect

    H. S. Blackman; W. J. Reece

    1998-10-01

    One of the current unsolved problems in human factors is the difficulty in acquiring information from lessons learned and data collected among human performance analysts in different domains. There are several common concerns and generally accepted issues of importance for human factors, psychology and industry analysts of performance and safety. Among these are the need to incorporate lessons learned in design, to carefully consider implementation of new designs and automation, and the need to reduce human performance-based contributions to risk. In spite of shared concerns, there are several roadblocks to widespread sharing of data and lessons learned from operating experience and simulation, including the fact that very few publicly accessible databases exist (Gertman & Blackman, 1994; Kirwan, 1997). There is a need to draw together analysts and analytic methodologies to comprise a centralized source of data with sufficient detail to be meaningful while ensuring source anonymity. We propose that a generic source of performance data and a multi-domain data store may provide the first steps toward cooperative performance modeling and analysis across industries.

  19. Results of a 24-inch Hybrid Motor Performance Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Sims, Joseph D.; Coleman, Hugh W.

    1998-01-01

    The subscale (11 and 24-inch) hybrid motors at the Marshall Space Flight Center (MSFC) have been used as versatile and cost-effective testbeds for developing new technology. Comparisons between motor configurations, ignition systems, feed systems, fuel formulations, and nozzle materials have been carried out without detailed consideration of how "good" the motor performance data were. For the 250,000 lb-thrust motor developed by the Hybrid Propulsion Demonstration Program consortium, this shortcoming is particularly risky because motor performance will likely be used as part of a set of downselect criteria to choose between competing ignition and feed systems under development. This analysis directly addresses that shortcoming by applying uncertainty analysis techniques to the experimental determination of the characteristic velocity, theoretical characteristic velocity, and characteristic velocity efficiency for a 24-inch motor firing. With the adoption of fuel-lined headends, flow restriction, and aft mixing chambers, state-of-the-art 24-inch hybrid motors have become very efficient. However, impossibly high combustion efficiencies (some computed as high as 108%) have been measured in some tests with 11-inch motors. This analysis has given new insight into explaining how these efficiencies were measured to be so high, and into which experimental measurements contribute the most to the overall uncertainty.

  20. Effects of specified performance criterion and performance feedback on staff behavior: a component analysis.

    PubMed

    Hardesty, Samantha L; Hagopian, Louis P; McIvor, Melissa M; Wagner, Leaora L; Sigurdsson, Sigurdur O; Bowman, Lynn G

    2014-09-01

    The present study isolated the effects of frequently used staff training intervention components to increase communication between direct care staff and clinicians working on an inpatient behavioral unit. Written "protocol review" quizzes developed by clinicians were designed to assess knowledge about a patient's behavioral protocols. Direct care staff completed these at the beginning of each day and evening shift. Clinicians were required to score and discuss these protocol reviews with direct care staff for at least 75% of shifts over a 2-week period. During baseline, only 21% of clinicians met this requirement. Completing and scoring of protocol reviews did not improve following additional in-service training (M = 15%) or following an intervention aimed at decreasing response effort combined with prompting (M = 28%). After implementing an intervention involving specified performance criterion and performance feedback, 86% of clinicians reached the established goal. Results of a component analysis suggested that the presentation of both the specified performance criterion and supporting contingencies was necessary to maintain acceptable levels of performance.

  1. Computational analysis of Variable Thrust Engine (VTE) performance

    NASA Technical Reports Server (NTRS)

    Giridharan, M. G.; Krishnan, A.; Przekwas, A. J.

    1993-01-01

    The Variable Thrust Engine (VTE) of the Orbital Maneuvering Vehicle (OMV) uses a hypergolic propellant combination of Monomethyl Hydrazine (MMH) and Nitrogen Tetroxide (NTO) as fuel and oxidizer, respectively. The performance of the VTE depends on a number of complex interacting phenomena such as atomization, spray dynamics, vaporization, turbulent mixing, convective/radiative heat transfer, and hypergolic combustion. This study involved the development of a comprehensive numerical methodology to facilitate detailed analysis of the VTE. An existing Computational Fluid Dynamics (CFD) code was extensively modified to include the following models: a two-liquid, two-phase Eulerian-Lagrangian spray model; a chemical equilibrium model; and a discrete ordinate radiation heat transfer model. The modified code was used to conduct a series of simulations to assess the effects of various physical phenomena and boundary conditions on the VTE performance. The details of the models and the results of the simulations are presented.

  2. Performance analysis of charge plasma based dual electrode tunnel FET

    NASA Astrophysics Data System (ADS)

    Anand, Sunny; Intekhab Amin, S.; Sarin, R. K.

    2016-05-01

    This paper proposes a charge plasma based dual electrode doping-less tunnel FET (DEDLTFET). The paper compares the device performance of the conventional doping-less TFET (DLTFET) and the doped TFET (DGTFET). DEDLTFET gives superior results, with a high ON-state current (ION ∼ 0.56 mA/μm), an ION/IOFF ratio of ∼ 9.12 × 10¹³, and an average subthreshold swing (AV-SS ∼ 48 mV/dec). Variations of different device parameters, such as channel length, gate oxide material, gate oxide thickness, silicon thickness, gate work function, and temperature, are studied and compared against DLTFET and DGTFET. Through this extensive analysis it is found that DEDLTFET performs better than the other two devices, which indicates an excellent future in low-power applications.

  3. Commissioning and Performance Analysis of WhisperGen Stirling Engine

    NASA Astrophysics Data System (ADS)

    Pradip, Prashant Kaliram

    Stirling engine based cogeneration systems have the potential to reduce energy consumption and greenhouse gas emissions, owing to their high cogeneration efficiency and the emission control afforded by steady external combustion. To date, most studies of this unit have focused on performance based on both experimentation and computer models, and experimental data for diversified operating ranges are lacking. This thesis starts with the commissioning of a WhisperGen Stirling engine, with components and instrumentation to evaluate the power and thermal performance of the system. Next, a parametric study of primary engine variables, including air, diesel, and coolant flow rates and temperatures, was carried out to further understand their effect on engine power and efficiency. This trend was then validated with the thermodynamic model developed for the energy analysis of a Stirling cycle. Finally, the energy balance of the Stirling engine was compared with and without heat recovery from the engine block and the combustion chamber exhaust.

  4. Performance analysis of multiple PRF technique for ambiguity resolution

    NASA Technical Reports Server (NTRS)

    Chang, C. Y.; Curlander, J. C.

    1992-01-01

    For short wavelength spaceborne synthetic aperture radar (SAR), ambiguity in Doppler centroid estimation occurs when the azimuth squint angle uncertainty is larger than the azimuth antenna beamwidth. Multiple pulse recurrence frequency (PRF) hopping is a technique developed to resolve the ambiguity by operating the radar in different PRF's in the pre-imaging sequence. Performance analysis results of the multiple PRF technique are presented, given the constraints of the attitude bound, the drift rate uncertainty, and the arbitrary numerical values of PRF's. The algorithm performance is derived in terms of the probability of correct ambiguity resolution. Examples, using the Shuttle Imaging Radar-C (SIR-C) and X-SAR parameters, demonstrate that the probability of correct ambiguity resolution obtained by the multiple PRF technique is greater than 95 percent and 80 percent for the SIR-C and X-SAR applications, respectively. The success rate is significantly higher than that achieved by the range cross correlation technique.

  5. An analysis of calendar performance in two autistic calendar savants

    PubMed Central

    Kennedy, Daniel P.; Squire, Larry R.

    2007-01-01

    We acquired large data sets of calendar performance from two autistic calendar savants, DG and RN. An analysis of their errors and reaction times revealed that (1) both individuals had knowledge of calendar information from a limited range of years; (2) there was no evidence for the use of memorized anchor dates that could, by virtue of counting away from the anchors, allow correct responses to questions about other dates; and (3) the two individuals differed in their calendar knowledge, as well as in their ability to perform secondary tasks in which calendar knowledge was assessed indirectly. In view of the fact that there are only 14 possible annual calendars, we suggest that both savants worked by memorizing these 14 possible calendar arrangements. PMID:17686947
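    The closing observation is easy to verify: a year's calendar is fully determined by the weekday of 1 January (7 choices) and whether the year is a leap year (2 choices), giving exactly 14 possible layouts. A quick standard-library check:

    ```python
    import calendar

    # A year's layout is fixed by the weekday of 1 Jan and its leap status.
    # Two centuries is more than enough to encounter every combination.
    layouts = {(calendar.weekday(y, 1, 1), calendar.isleap(y))
               for y in range(1901, 2101)}
    n_layouts = len(layouts)  # 7 starting weekdays x {leap, common} = 14
    ```

    This is the fact that makes the memorization hypothesis plausible: 14 templates cover every possible date question.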

  6. Using SWE Standards for Ubiquitous Environmental Sensing: A Performance Analysis

    PubMed Central

    Tamayo, Alain; Granell, Carlos; Huerta, Joaquín

    2012-01-01

    Although smartphone applications represent the most typical data consumer tool from the citizen perspective in environmental applications, they can also be used for in-situ data collection and production in varied scenarios, such as geological sciences and biodiversity. The use of standard protocols, such as SWE, to exchange information between smartphones and sensor infrastructures brings benefits such as interoperability and scalability, but their reliance on XML is a potential problem when large volumes of data are transferred, due to limited bandwidth and processing capabilities on mobile phones. In this article we present a performance analysis about the use of SWE standards in smartphone applications to consume and produce environmental sensor data, analysing to what extent the performance problems related to XML can be alleviated by using alternative uncompressed and compressed formats.
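    The XML overhead at issue can be illustrated with a minimal sketch: a repetitive observation payload in XML (the element names below are invented for illustration, not actual SWE markup) compresses dramatically, which is why compressed formats can alleviate the bandwidth problem on mobile devices.

    ```python
    import zlib

    # Build a repetitive XML payload of 200 observation results.
    xml = "<obs:Collection>" + "".join(
        "<obs:result uom='Cel'>%d</obs:result>" % (20 + i % 5) for i in range(200)
    ) + "</obs:Collection>"

    raw = xml.encode("utf-8")
    packed = zlib.compress(raw, level=9)
    ratio = len(packed) / len(raw)  # far below 1.0 for repetitive markup
    ```

    The trade-off the article measures is precisely this: compression shrinks the transfer but costs CPU time on the phone, so the benefit depends on payload size and device capability.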

  7. An analysis of calendar performance in two autistic calendar savants.

    PubMed

    Kennedy, Daniel P; Squire, Larry R

    2007-08-01

    We acquired large data sets of calendar performance from two autistic calendar savants, DG and RN. An analysis of their errors and reaction times revealed that (1) both individuals had knowledge of calendar information from a limited range of years; (2) there was no evidence for the use of memorized anchor dates that could, by virtue of counting away from the anchors, allow correct responses to questions about other dates; and (3) the two individuals differed in their calendar knowledge, as well as in their ability to perform secondary tasks in which calendar knowledge was assessed indirectly. In view of the fact that there are only 14 possible annual calendars, we suggest that both savants worked by memorizing these 14 possible calendar arrangements.

  8. Fluid and thermal performance analysis of PMSM used for driving

    NASA Astrophysics Data System (ADS)

    Ding, Shuye; Cui, Guanghui; Li, Zhongyu; Guan, Tianyu

    2016-03-01

    The permanent magnet synchronous motor (PMSM) is widely used in ships under frequency conversion control system. The fluid flow performance and temperature distribution of the PMSM are difficult to clarify due to its complex structure and variable frequency control condition. Therefore, in order to investigate the fluid and thermal characteristics of the PMSM, a 50 kW PMSM was taken as an example in this study, and a 3-D coupling analysis model of fluid and thermal was established. The fluid and temperature fields were calculated by using finite volume method. The cooling medium's properties, such a velocity, streamlines, and temperature, were then analyzed. The correctness of the proposed model, and the rationality of the solution method, were verified by a temperature test of the PMSM. In this study, the changing rheology on the performance of the cooling medium and the working temperature of the PMSM were revealed, which could be helpful for designing the PMSM.

  9. Performance analysis of a laser-propelled interorbital transfer vehicle

    NASA Technical Reports Server (NTRS)

    Minovitch, M. A.

    1976-01-01

    The performance capabilities of a laser-propelled interorbital transfer vehicle receiving propulsive power from one ground-based transmitter were investigated. The laser transmits propulsive energy to the vehicle during successive station fly-overs. By applying a series of these propulsive maneuvers, large payloads can be economically transferred between low earth orbits and synchronous orbits. Operations involving the injection of large payloads onto escape trajectories are also studied. The duration of each successive engine burn must be carefully timed so that the vehicle reappears over the laser station to receive additional propulsive power within the shortest possible time. The analytical solution for determining these time intervals is presented, as is a solution to the problem of determining maximum injection payloads. A parametric computer analysis based on these optimization studies is presented. The results show that relatively low beam powers, on the order of 50 MW to 60 MW, produce significant performance capabilities.

  10. Turnover rates and organizational performance: a meta-analysis.

    PubMed

    Park, Tae-Youn; Shaw, Jason D

    2013-03-01

    The authors conducted a meta-analysis of the relationship between turnover rates and organizational performance to (a) determine the magnitude of the relationship; (b) test organization-, context-, and methods-related moderators of the relationship; and (c) suggest future directions for the turnover literature on the basis of the findings. The results from 300 total correlations (N = 309,245) and 110 independent correlations (N = 120,066) show that the relationship between total turnover rates and organizational performance is significant and negative (ρ = -.15). In addition, the relationship is more negative for voluntary (ρ = -.15) and reduction-in-force turnover (ρ = -.17) than for involuntary turnover (ρ = -.01). Moreover, the meta-analytic correlation differs significantly across several organization- and context-related factors (e.g., types of employment system, dimensions of organizational performance, region, and entity size). Finally, in sample-level regressions, the strength of the turnover rates-organizational performance relationship significantly varies across different average levels of total and voluntary turnover rates, which suggests a potential curvilinear relationship. The authors outline the practical magnitude of the findings and discuss implications for future organizational-level turnover research.
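    The first meta-analytic step, pooling correlations across studies, can be sketched as a sample-size-weighted mean (bare-bones, with none of the artifact corrections a full meta-analysis applies; the study values below are invented for illustration, not the paper's data):

    ```python
    def weighted_mean_r(studies):
        """Sample-size-weighted mean correlation; studies is a list of (r, n) pairs."""
        total_n = sum(n for _, n in studies)
        return sum(r * n for r, n in studies) / total_n

    # Hypothetical studies: (observed correlation, sample size).
    rho_hat = weighted_mean_r([(-0.20, 500), (-0.10, 1500), (-0.18, 1000)])
    ```

    Weighting by n means large studies dominate the pooled estimate, which is why the paper reports both total and independent correlation counts alongside the combined N.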

  11. Performance Model and Sensitivity Analysis for a Solar Thermoelectric Generator

    NASA Astrophysics Data System (ADS)

    Rehman, Naveed Ur; Siddiqui, Mubashir Ali

    2017-01-01

    In this paper, a regression model for evaluating the performance of solar concentrated thermoelectric generators (SCTEGs) is established and the significance of contributing parameters is discussed in detail. The model is based on several natural, design and operational parameters of the system, including the thermoelectric generator (TEG) module and its intrinsic material properties, the connected electrical load, concentrator attributes, heat transfer coefficients, solar flux, and ambient temperature. The model is developed by fitting a response curve, using the least-squares method, to the results. The sample points for the model were obtained by simulating a thermodynamic model, also developed in this paper, over a range of values of input variables. These samples were generated employing the Latin hypercube sampling (LHS) technique using a realistic distribution of parameters. The coefficient of determination was found to be 99.2%. The proposed model is validated by comparing the predicted results with those in the published literature. In addition, based on the elasticity for parameters in the model, sensitivity analysis was performed and the effects of parameters on the performance of SCTEGs are discussed in detail. This research will contribute to the design and performance evaluation of any SCTEG system for a variety of applications.
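    A minimal version of the Latin hypercube sampling step used to generate the model's sample points might look like this (an illustrative sketch on the unit cube; the actual study maps the strata onto realistic parameter distributions):

    ```python
    import random

    def latin_hypercube(n, d, rng=None):
        """n points in d dimensions; each dimension uses each of n strata exactly once."""
        rng = rng or random.Random(0)
        cols = []
        for _ in range(d):
            strata = list(range(n))
            rng.shuffle(strata)                       # random stratum order per dimension
            cols.append([(s + rng.random()) / n for s in strata])  # jitter within stratum
        return [tuple(col[i] for col in cols) for i in range(n)]

    pts = latin_hypercube(8, 3)
    ```

    Compared with plain random sampling, this guarantees every stratum of every parameter is represented, so the least-squares response surface is fit over well-spread sample points.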

  12. Performance Model and Sensitivity Analysis for a Solar Thermoelectric Generator

    NASA Astrophysics Data System (ADS)

    Rehman, Naveed Ur; Siddiqui, Mubashir Ali

    2017-03-01

    In this paper, a regression model for evaluating the performance of solar concentrated thermoelectric generators (SCTEGs) is established and the significance of contributing parameters is discussed in detail. The model is based on several natural, design and operational parameters of the system, including the thermoelectric generator (TEG) module and its intrinsic material properties, the connected electrical load, concentrator attributes, heat transfer coefficients, solar flux, and ambient temperature. The model is developed by fitting a response curve, using the least-squares method, to the results. The sample points for the model were obtained by simulating a thermodynamic model, also developed in this paper, over a range of values of input variables. These samples were generated employing the Latin hypercube sampling (LHS) technique using a realistic distribution of parameters. The coefficient of determination was found to be 99.2%. The proposed model is validated by comparing the predicted results with those in the published literature. In addition, based on the elasticity for parameters in the model, sensitivity analysis was performed and the effects of parameters on the performance of SCTEGs are discussed in detail. This research will contribute to the design and performance evaluation of any SCTEG system for a variety of applications.

  13. Analysis of Random Segment Errors on Coronagraph Performance

    NASA Technical Reports Server (NTRS)

    Stahl, Mark T.; Stahl, H. Philip; Shaklan, Stuart B.; N'Diaye, Mamadou

    2016-01-01

    At 2015 SPIE O&P we presented "Preliminary Analysis of Random Segment Errors on Coronagraph Performance." Key findings: contrast leakage for a 4th-order Sinc2(X) coronagraph is 10X more sensitive to random segment piston than to random tip/tilt; apertures with fewer segments (i.e., 1 ring) or very many segments (> 16 rings) have less contrast leakage as a function of piston or tip/tilt than apertures with 2 to 4 rings of segments. Revised finding: piston is only 2.5X more sensitive than tip/tilt.

  14. Cost-Performance Analysis of Perovskite Solar Modules.

    PubMed

    Cai, Molang; Wu, Yongzhen; Chen, Han; Yang, Xudong; Qiang, Yinghuai; Han, Liyuan

    2017-01-01

    Perovskite solar cells (PSCs) are promising candidates for the next generation of solar cells because they are easy to fabricate and have high power conversion efficiencies. However, there has been no detailed analysis of the cost of PSC modules. We selected two representative examples of PSCs and performed a cost analysis of their production: one was a moderate-efficiency module produced from cheap materials, and the other was a high-efficiency module produced from expensive materials. The costs of both modules were found to be lower than those of other photovoltaic technologies. We used the calculated module costs to estimate the levelized cost of electricity (LCOE) of PSCs. The LCOE was calculated to be 3.5-4.9 US cents/kWh given an efficiency above 12% and a lifetime of more than 15 years, below the cost of traditional energy sources.
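    The LCOE metric referred to can be sketched with a simplified discounted-cash-flow formula (an assumed textbook form with placeholder inputs, not the paper's cost data or methodology):

    ```python
    def lcoe(capex, opex_per_year, kwh_per_year, years, discount=0.05):
        """Lifetime cost divided by discounted lifetime energy output (simplified)."""
        cost = capex + sum(opex_per_year / (1 + discount) ** t
                           for t in range(1, years + 1))
        energy = sum(kwh_per_year / (1 + discount) ** t
                     for t in range(1, years + 1))
        return cost / energy  # currency units per kWh

    # Placeholder inputs chosen only to land in a plausible range.
    cents_per_kwh = lcoe(capex=100.0, opex_per_year=2.0,
                         kwh_per_year=300.0, years=15) * 100
    ```

    The sensitivity the abstract implies falls out directly: stretching the module lifetime grows the energy denominator, which is why the quoted 3.5-4.9 cents/kWh is conditional on a 15-year lifetime.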

  15. Cost‐Performance Analysis of Perovskite Solar Modules

    PubMed Central

    Cai, Molang; Wu, Yongzhen; Chen, Han; Yang, Xudong; Qiang, Yinghuai

    2016-01-01

    Perovskite solar cells (PSCs) are promising candidates for the next generation of solar cells because they are easy to fabricate and have high power conversion efficiencies. However, there has been no detailed analysis of the cost of PSC modules. We selected two representative examples of PSCs and performed a cost analysis of their production: one was a moderate-efficiency module produced from cheap materials, and the other was a high-efficiency module produced from expensive materials. The costs of both modules were found to be lower than those of other photovoltaic technologies. We used the calculated module costs to estimate the levelized cost of electricity (LCOE) of PSCs. The LCOE was calculated to be 3.5-4.9 US cents/kWh given an efficiency above 12% and a lifetime of more than 15 years, below the cost of traditional energy sources. PMID:28105403

  16. Space mission scenario development and performance analysis tool

    NASA Technical Reports Server (NTRS)

    Kordon, Mark; Baker, John; Gilbert, John; Hanks, David

    2004-01-01

    This paper discusses a new and innovative approach for a rapid spacecraft multi-disciplinary performance analysis using a tool called the Mission Scenario Development Workbench (MSDW). To meet the needs of new classes of space missions, analysis tools with proven models were developed and integrated into a framework to enable rapid trades and analyses between spacecraft designs and operational scenarios during the formulation phase of a mission. Generally speaking, spacecraft resources are highly constrained on deep space missions and this approach makes it possible to maximize the use of existing resources to attain the best possible science return. This approach also has the potential benefit of reducing the risk of costly design changes made later in the design cycle necessary to meet the mission requirements by understanding system design sensitivities early and adding appropriate margins. This paper will describe the approach used by the Mars Science Laboratory Project to accomplish this result.

  17. A Divergence Statistics Extension to VTK for Performance Analysis

    SciTech Connect

    Pebay, Philippe Pierre; Bennett, Janine Camille

    2015-02-01

    This report follows the series of previous documents [PT08, BPRT09b, PT09, BPT09, PT10, PB13], where we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order and auto-correlative statistics engines which we developed within the Visualization Tool Kit (VTK) as a scalable, parallel and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.
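
    The notion of divergence described above can be illustrated with a small stand-alone example. The sketch below computes the Kullback-Leibler divergence between a binned empirical distribution and a theoretical one in plain Python; it is one common divergence measure, not necessarily the engine's exact choice, and it does not use the VTK API:

```python
import math
from collections import Counter

def kl_divergence(samples, theoretical):
    """D_KL(empirical || theoretical) over a discrete support, in nats."""
    n = len(samples)
    counts = Counter(samples)
    d = 0.0
    for value, p_theory in theoretical.items():
        p_emp = counts.get(value, 0) / n
        if p_emp > 0.0:
            d += p_emp * math.log(p_emp / p_theory)
    return d

# Observed die rolls vs. the "ideal" fair-die distribution
ideal = {k: 1.0 / 6.0 for k in range(1, 7)}
rolls = [1, 2, 2, 3, 4, 5, 6, 6, 6, 6]
print(kl_divergence(rolls, ideal) > 0.0)  # → True
```

    The divergence is zero exactly when the empirical histogram matches the theoretical distribution, and grows as the observed data drift away from the "ideal" model.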

  18. Transient analysis techniques in performing impact and crash dynamic studies

    NASA Technical Reports Server (NTRS)

    Pifko, A. B.; Winter, R.

    1989-01-01

    Because of the emphasis being placed on crashworthiness as a design requirement, increasing demands are being made by various organizations to analyze a wide range of complex structures that must perform safely when subjected to severe impact loads, such as those generated in a crash event. The ultimate goal of crashworthiness design and analysis is to produce vehicles with the ability to reduce the dynamic forces experienced by the occupants to specified levels, while maintaining a survivable envelope around them during a specified crash event. DYCAST is a nonlinear structural dynamic finite element computer code that grew out of the PLANS system of finite element programs for static nonlinear structural analysis. The essential features of DYCAST are outlined.

  19. Statistical Performance Analysis of Data-Driven Neural Models.

    PubMed

    Freestone, Dean R; Layton, Kelvin J; Kuhlmann, Levin; Cook, Mark J

    2017-02-01

    Data-driven model-based analysis of electrophysiological data is an emerging technique for understanding the mechanisms of seizures. Model-based analysis enables tracking of hidden brain states that are represented by the dynamics of neural mass models. Neural mass models describe the mean firing rates and mean membrane potentials of populations of neurons. Various neural mass models exist with different levels of complexity and realism. An ideal data-driven model-based analysis framework will incorporate the most realistic model possible, enabling accurate imaging of the physiological variables. However, models must be sufficiently parsimonious to enable tracking of important variables using data. This paper provides a tool to inform the realism versus parsimony trade-off: the Bayesian Cramer-Rao (lower) bound (BCRB). We demonstrate how the BCRB can be used to assess the feasibility of using various popular neural mass models to track epilepsy-related dynamics via stochastic filtering methods. A series of simulations show how optimal state estimates relate to measurement noise, model error and initial state uncertainty. We also demonstrate that state estimation accuracy will vary between seizure-like and normal rhythms. The performance of the extended Kalman filter (EKF) is assessed against the BCRB. This work lays a foundation for assessing feasibility of model-based analysis. We discuss how the framework can be used to design experiments to better understand epilepsy.
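
    For intuition, in the linear-Gaussian special case the BCRB information obeys a simple recursion (a scalar form of the Tichavsky-style posterior bound). The toy sketch below shows how such a bound is propagated over time; it is an illustration of the concept, not the neural-mass-model computation of the paper:

```python
def bcrb_recursion(a, h, q, r, j0, steps):
    """Posterior Bayesian CRB for the scalar linear-Gaussian model
    x[k+1] = a*x[k] + w (var q),  y[k] = h*x[k] + v (var r).
    Information recursion: J[k+1] = h*h/r + 1/(q + a*a/J[k]); returns variances 1/J[k]."""
    j = j0
    bounds = []
    for _ in range(steps):
        j = h * h / r + 1.0 / (q + a * a / j)
        bounds.append(1.0 / j)
    return bounds

bounds = bcrb_recursion(a=0.95, h=1.0, q=0.01, r=0.1, j0=1.0, steps=50)
print(bounds[-1] < bounds[0])  # → True (the bound tightens toward a steady state)
```

    A filter's mean-square error (e.g., the EKF's) can then be compared against this sequence of lower bounds, which is the spirit of the assessment described in the abstract.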

  20. Performance and Degradation Analysis of Operating PV Systems

    NASA Astrophysics Data System (ADS)

    Da Silva Freire, Felipe

    Environmental concerns, together with decreasing technology costs, led the solar market to grow rapidly over the last decade. Photovoltaic (PV) systems are one of the solar energy alternatives, and silicon solar cells are currently the most widespread technology. PV modules are considered the most reliable component of a photovoltaic system; their reliability and lifetime depend on the modules' energy conversion performance and degradation modes. The analysis of monitoring data gives insight into a PV system's performance over its service time, and comparison of these data with mathematical models provides a way to predict the performance of future and new PV installations. The goal of this study is to understand PV system performance and degradation over the system lifetime. A mathematical model was employed to predict the power output of a real, relatively new operating PV system with respect to the environmental parameters temperature, irradiance and cloud coverage. The model is based on the one-diode ideality factor and takes into account the parasitic series resistance. The results have been compared with the actual PV output data collected for the year 2014 and show good correlation. As the model predicts the system power output assuming the system is in new condition, the deviation of the real data from the modeling results needs to be further investigated for systems in service for a longer time. For this purpose, the study presents a condensed review of various causes of degradation in silicon PV modules and techniques to observe and investigate these degradation mechanisms. Major effects on output performance are an increase in the observed ideality factor n2 and recombination current J02, primarily caused by a decrease in minority carrier lifetime, shunts and an increase in series resistance. The study further investigates the governing degradation modes on a ten-year-old crystalline silicon PV module
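
    The one-diode model with parasitic series resistance mentioned above makes the cell current implicit in itself. A minimal sketch, solving I = IL - I0*(exp((V + I*Rs)/(n*Vt)) - 1) by bisection with made-up parameters (not the parameters of the system modeled in the study):

```python
import math

def diode_current(v, il=8.0, i0=1e-9, n=1.3, rs=0.005, t=298.15):
    """Single-diode cell current at terminal voltage v, with series resistance:
    solves I = IL - I0*(exp((V + I*Rs)/(n*Vt)) - 1) by bisection (toy parameters)."""
    vt = 1.380649e-23 * t / 1.602176634e-19  # thermal voltage kT/q, ~25.7 mV
    def residual(i):
        return il - i0 * (math.exp((v + i * rs) / (n * vt)) - 1.0) - i
    lo, hi = 0.0, il  # residual(0) > 0 and residual(IL) < 0 for sensible parameters
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if residual(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(round(diode_current(0.0), 2))  # → 8.0 (short-circuit current ~ IL)
```

    Degradation mechanisms of the kind reviewed in the study (rising series resistance, rising recombination current) enter this model directly through Rs, I0 and n, which is why fitted changes in those parameters are diagnostic.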

  1. Performance analysis of jump-gliding locomotion for miniature robotics.

    PubMed

    Vidyasagar, A; Zufferey, Jean-Christophe; Floreano, Dario; Kovač, M

    2015-03-26

    Recent work suggests that jumping locomotion in combination with a gliding phase can be used as an effective mobility principle in robotics. Compared to pure jumping without a gliding phase, the potential benefits of hybrid jump-gliding locomotion include the ability to extend the distance travelled and to reduce the potentially damaging impact forces upon landing. This publication evaluates the performance of jump-gliding locomotion and provides models for the analysis of the relevant dynamics of flight. It also defines a jump-gliding envelope that encompasses the range that can be achieved with jump-gliding robots and that can be used to evaluate the performance and improvement potential of jump-gliding robots. We present first a planar dynamic model and then a simplified closed-form model, which allow for quantification of the distance travelled and the impact energy on landing. To validate the predictions of these models, we performed experiments with a novel jump-gliding robot, named the 'EPFL jump-glider'. It has a mass of 16.5 g and is able to perform jumps from elevated positions, perform steered gliding flight, land safely and traverse on the ground by repetitive jumping. The experiments indicate that the developed jump-gliding model fits very well with the measured flight data from the EPFL jump-glider, confirming the benefits of jump-gliding locomotion for mobile robotics. The jump-gliding envelope considerations indicate that the EPFL jump-glider, when traversing from a 2 m height, reaches 74.3% of the optimal jump-gliding distance, whereas pure jumping without a gliding phase reaches only 33.4% of the optimal jump-gliding distance. Methods of further improving flight performance based on the models and inspiration from biological systems are presented, providing mechanical design pathways for future jump-gliding robot designs.

  2. Performance Analysis of the Least-Squares Estimator in Astrometry

    NASA Astrophysics Data System (ADS)

    Lobos, Rodrigo A.; Silva, Jorge F.; Mendez, Rene A.; Orchard, Marcos

    2015-11-01

    We characterize the performance of the widely used least-squares estimator in astrometry in terms of a comparison with the Cramer-Rao lower variance bound. In this inference context the performance of the least-squares estimator does not offer a closed-form expression, but a new result is presented (Theorem 1) where both the bias and the mean-square-error of the least-squares estimator are bounded and approximated analytically, in the latter case in terms of a nominal value and an interval around it. From the predicted nominal value we analyze how efficient the least-squares estimator is in comparison with the minimum variance Cramer-Rao bound. Based on our results, we show that, for the high signal-to-noise ratio regime, the performance of the least-squares estimator is significantly poorer than the Cramer-Rao bound, and we characterize this gap analytically. On the positive side, we show that for the challenging low signal-to-noise regime (attributed to either a weak astronomical signal or a noise-dominated condition) the least-squares estimator is near optimal, as its performance asymptotically approaches the Cramer-Rao bound. However, we also demonstrate that, in general, there is no unbiased estimator for the astrometric position that can precisely reach the Cramer-Rao bound. We validate our theoretical analysis through simulated digital-detector observations under typical observing conditions. We show that the nominal value for the mean-square-error of the least-squares estimator (obtained from our theorem) can be used as a benchmark indicator of the expected statistical performance of the least-squares method under a wide range of conditions. Our results are valid for an idealized linear (one-dimensional) array detector where intra-pixel response changes are neglected, and where flat-fielding is achieved with very high accuracy.
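
    The Cramer-Rao bound used as the benchmark above has a compact form for an idealized one-dimensional detector. The sketch below computes the bound for the position of a Gaussian PSF under constant-variance additive Gaussian noise; this is an illustrative idealization, and the paper's exact noise model and PSF may differ:

```python
import math

def crb_position(flux, psf_sigma, pixel_size, noise_sigma, n_pixels=41):
    """Cramer-Rao variance bound on the position of a 1D Gaussian PSF sampled by a
    pixel array under additive Gaussian noise of constant variance (idealized detector).
    Fisher information: sum over pixels of (dF_i/dx_c)^2 / noise_sigma^2."""
    fisher = 0.0
    for i in range(n_pixels):
        x = (i - n_pixels // 2) * pixel_size            # pixel center; source at x_c = 0
        g = math.exp(-x * x / (2.0 * psf_sigma ** 2)) / (math.sqrt(2.0 * math.pi) * psf_sigma)
        dF = flux * pixel_size * g * x / psf_sigma ** 2  # d F_i / d x_c at x_c = 0
        fisher += dF * dF / noise_sigma ** 2
    return 1.0 / fisher

# Doubling the flux quarters the variance bound (Fisher information scales as flux^2)
print(crb_position(2000.0, 1.0, 0.5, 10.0) < crb_position(1000.0, 1.0, 0.5, 10.0))  # → True
```

    The mean-square error of any estimator (least-squares included) can be compared against this bound, which is the structure of the comparison reported in the abstract.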

  3. TPMS Data Analysis for Enhancing Intelligent Vehicle Performance

    NASA Astrophysics Data System (ADS)

    Hannan, M. A.; Hussain, A.; Mohamed, A.; Samad, S. A.

    The main objective of the study is to analyze Tire Pressure Monitoring System (TPMS) data that contributes significantly towards the enhancement of intelligent vehicle performance evaluation. TPMS pressure and temperature data were collected, through its receiver, from the prototype model of the MEMS Tire Pressure Module (TPM) that was fitted onto an intelligent tire rim. In this study, we focus only on the analytical analysis of the TPMS data. In the analytical study, a novel method for data classification, goodness of fit and hypothesis testing was proposed. A classification scheme was employed to classify the temperature and pressure data by module ID on a quadrant basis for the operating zones of the Front Right (FR), Front Left (FL), Rear Left (RL) and Rear Right (RR) tires. Principal Component Analysis (PCA) with polynomial fitting for exploring the goodness of fit of the tire data was also applied. Finally, hypothesis testing using the Satterthwaite statistic was carried out. The results obtained are in agreement with the null hypothesis and as such validate the usefulness of the TPMS in maintaining and enhancing vehicle performance.
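
    The polynomial goodness-of-fit step can be illustrated independently of the TPMS hardware. Below is a pure-Python least-squares quadratic fit with an R² goodness-of-fit measure; the data and polynomial degree are illustrative, not the study's:

```python
def quadratic_fit(xs, ys):
    """Least-squares quadratic fit y ~ b0 + b1*x + b2*x^2 via the 3x3 normal equations."""
    s = [sum(x ** k for x in xs) for k in range(5)]          # power sums of x
    A = [[s[0], s[1], s[2]], [s[1], s[2], s[3]], [s[2], s[3], s[4]]]
    b = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    for i in range(3):                                       # Gaussian elimination
        for j in range(i + 1, 3):
            f = A[j][i] / A[i][i]
            A[j] = [ajk - f * aik for ajk, aik in zip(A[j], A[i])]
            b[j] -= f * b[i]
    beta = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                                      # back substitution
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, 3))) / A[i][i]
    return beta

def r_squared(xs, ys, beta):
    """Coefficient of determination as a goodness-of-fit measure."""
    mean_y = sum(ys) / len(ys)
    ss_res = sum((y - (beta[0] + beta[1] * x + beta[2] * x * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - mean_y) ** 2 for y in ys)
    return 1.0 - ss_res / ss_tot

print(quadratic_fit([0, 1, 2, 3, 4], [1, 6, 17, 34, 57]))  # → [1.0, 2.0, 3.0]
```

    An R² near 1 indicates the fitted polynomial explains nearly all the variance in the measured trace, which is the sense of "goodness of fit" used in the abstract.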

  4. Autotasked Performance in the NAS Workload: A Statistical Analysis

    NASA Technical Reports Server (NTRS)

    Carter, R. L.; Stockdale, I. E.; Kutler, Paul (Technical Monitor)

    1998-01-01

    A statistical analysis of the workload performance of a production-quality FORTRAN code for five different Cray Y-MP hardware and system software configurations is performed. The analysis was based on an experimental procedure that was designed to minimize correlations between the number of requested CPUs and the time of day the runs were initiated. Observed autotasking overheads were significantly larger for the set of jobs that requested the maximum number of CPUs. Speedups for UNICOS 6 releases show consistent wall-clock speedups in the workload of around 2, which is quite good. The observed speedups were very similar for the set of jobs that requested 8 CPUs and the set that requested 4 CPUs. The original NAS algorithm for determining charges to the user discourages autotasking in the workload. A new charging algorithm to be applied to jobs run in the NQS multitasking queues also discourages NAS users from using autotasking. The new algorithm favors jobs requesting 8 CPUs over those that request fewer, although the jobs requesting 8 CPUs experienced significantly higher overhead and presumably degraded system throughput. A charging algorithm is presented that has the following desirable characteristics when applied to the data: higher-overhead jobs requesting 8 CPUs are penalized relative to moderate-overhead jobs requesting 4 CPUs, thereby providing a charging incentive for NAS users to use autotasking in a manner that gives them significantly improved turnaround while also maintaining system throughput.

  5. A meta-analysis of math performance in Turner syndrome

    PubMed Central

    Baker, Joseph M; Reiss, Allan L

    2015-01-01

    AIM Studies investigating the relationship between Turner syndrome and math learning disability have used a wide variation of tasks designed to test various aspects of mathematical competencies. Although these studies have revealed much about the math deficits common to Turner syndrome, their diversity makes comparisons between individual studies difficult. As a result, the consistency of outcomes among these diverse measures remains unknown. The overarching aim of this review is to provide a systematic meta-analysis of the differences in math and number performance between females with Turner syndrome and age-matched neurotypical peers. METHOD We provide a meta-analysis of behavioral performance in Turner syndrome relative to age-matched neurotypical populations on assessments of math and number aptitude. In total, 112 comparisons collected across 17 studies were included. RESULTS Although 54% of all statistical comparisons in our analyses failed to reject the null hypothesis, our results indicate that meaningful group differences exist on all comparisons except those that do not require explicit calculation. INTERPRETATION Taken together, these results help elucidate our current understanding of math and number weaknesses in Turner syndrome, while highlighting specific topics that require further investigation. PMID:26566693

  6. A Preliminary Analysis of LANDSAT-4 Thematic Mapper Radiometric Performance

    NASA Technical Reports Server (NTRS)

    Justice, C.; Fusco, L.; Mehl, W.

    1984-01-01

    Analysis was performed to characterize the radiometry of three Thematic Mapper (TM) digital products of a scene of Arkansas. The three digital products examined were the NASA raw (BT) product, the radiometrically corrected (AT) product and the radiometrically and geometrically corrected (PT) product. The frequency distribution of the digital data, the statistical correlation between the bands, and the variability between the detectors within a band were examined on a series of image subsets from the full scene. The results are presented for one 1024 x 1024 pixel subset of Reelfoot Lake, Tennessee, which displayed a representative range of ground conditions and cover types occurring within the full frame image. Bands 1, 2 and 5 of the sample area are presented. The subsets were extracted from the three digital data products to cover the same geographic area. This analysis provides the first step towards a full appraisal of the TM radiometry being performed as part of the ESA/CEC contribution to the NASA/LIDQA program.

  7. A conceptual design tool for RBCC engine performance analysis

    SciTech Connect

    Olds, J.R.; Saks, G.

    1997-01-01

    Future reusable launch vehicles will depend on new propulsion technologies to lower system operational costs while maintaining adequate performance. Recently, a number of vehicle systems utilizing rocket-based combined-cycle (RBCC) propulsion have been proposed as possible low-cost space launch solutions. Vehicles using RBCC propulsion have the potential to combine the best aspects of airbreathing propulsion (high average Isp) with the best aspects of rocket propulsion (high propellant bulk density and engine T/W). Proper conceptual assessment of each proposed vehicle will require computer-based tools that allow for quick and cheap, yet sufficiently accurate disciplinary analyses. At Georgia Tech, a spreadsheet-based tool has been developed that uses quasi-1D flow analysis with component efficiencies to parametrically model RBCC engine performance in ejector, fan-ramjet, ramjet and pure rocket modes. The technique is similar to an earlier RBCC modeling technique developed by the Marquardt Corporation in the mid-1960's. For a given sea-level static thrust requirement, the current tool generates engine weight and size data, as well as Isp and thrust data vs. altitude and Mach number. The latter is output in tabular form for use in a trajectory optimization program. This paper reviews the current state of the RBCC analysis tool and the effort to upgrade it from a Microsoft Excel spreadsheet to a design-oriented UNIX program in C suitable for integration into a multidisciplinary design optimization (MDO) framework. © 1997 American Institute of Physics.
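
    For context on the kind of quasi-1D cycle relations such a tool evaluates, the sketch below computes ideal-ramjet specific thrust and Isp from textbook relations with no component losses. This is a simplification for illustration only: the tool described above includes component efficiencies and additional engine modes, and the atmospheric and fuel constants here are representative values, not the tool's inputs.

```python
import math

def ideal_ramjet(M0, T0=220.0, Tt4=1800.0, gamma=1.4, cp=1004.5, hpr=42.8e6, g0=9.81):
    """Ideal-ramjet cycle: specific thrust (N*s/kg) and Isp (s) at flight Mach M0."""
    a0 = math.sqrt(cp * (gamma - 1.0) * T0)          # ambient speed of sound
    tau_r = 1.0 + 0.5 * (gamma - 1.0) * M0 ** 2      # ram total-temperature ratio
    tau_lambda = Tt4 / T0                            # burner total-temperature ratio
    spec_thrust = a0 * M0 * (math.sqrt(tau_lambda / tau_r) - 1.0)
    f = cp * T0 * (tau_lambda - tau_r) / hpr         # fuel-air ratio from energy balance
    return spec_thrust, spec_thrust / (f * g0)       # Isp = F / (mdot_fuel * g0)

st, isp = ideal_ramjet(3.0)
print(500.0 < st < 800.0 and 2000.0 < isp < 2700.0)  # → True
```

    Sweeping M0 with such relations produces the Isp-vs-Mach tables that, per the abstract, feed a trajectory optimization program.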

  8. A conceptual design tool for RBCC engine performance analysis

    NASA Astrophysics Data System (ADS)

    Olds, John R.; Saks, Greg

    1997-01-01

    Future reusable launch vehicles will depend on new propulsion technologies to lower system operational costs while maintaining adequate performance. Recently, a number of vehicle systems utilizing rocket-based combined-cycle (RBCC) propulsion have been proposed as possible low-cost space launch solutions. Vehicles using RBCC propulsion have the potential to combine the best aspects of airbreathing propulsion (high average Isp) with the best aspects of rocket propulsion (high propellant bulk density and engine T/W). Proper conceptual assessment of each proposed vehicle will require computer-based tools that allow for quick and cheap, yet sufficiently accurate disciplinary analyses. At Georgia Tech, a spreadsheet-based tool has been developed that uses quasi-1D flow analysis with component efficiencies to parametrically model RBCC engine performance in ejector, fan-ramjet, ramjet and pure rocket modes. The technique is similar to an earlier RBCC modeling technique developed by the Marquardt Corporation in the mid-1960's. For a given sea-level static thrust requirement, the current tool generates engine weight and size data, as well as Isp and thrust data vs. altitude and Mach number. The latter is output in tabular form for use in a trajectory optimization program. This paper reviews the current state of the RBCC analysis tool and the effort to upgrade it from a Microsoft Excel spreadsheet to a design-oriented UNIX program in C suitable for integration into a multidisciplinary design optimization (MDO) framework.

  9. Topology design and performance analysis of an integrated communication network

    NASA Technical Reports Server (NTRS)

    Li, V. O. K.; Lam, Y. F.; Hou, T. C.; Yuen, J. H.

    1985-01-01

    A research study on the topology design and performance analysis for the Space Station Information System (SSIS) network is conducted. It is begun with a survey of existing research efforts in network topology design. Then a new approach for topology design is presented. It uses an efficient algorithm to generate candidate network designs (consisting of subsets of the set of all network components) in increasing order of their total costs, and checks each design to see if it forms an acceptable network. This technique gives the true cost-optimal network, and is particularly useful when the network has many constraints and not too many components. The algorithm for generating subsets is described in detail, and various aspects of the overall design procedure are discussed. Two more efficient versions of this algorithm (applicable in specific situations) are also given. Next, two important aspects of network performance analysis: network reliability and message delays are discussed. A new model is introduced to study the reliability of a network with dependent failures. For message delays, a collection of formulas from existing research results is given to compute or estimate the delays of messages in a communication network without making the independence assumption. The design algorithm coded in PASCAL is included as an appendix.

  10. Correlation analysis between ionospheric scintillation levels and receiver tracking performance

    NASA Astrophysics Data System (ADS)

    Sreeja, V.; Aquino, M.; Elmas, Z. G.; Forte, B.

    2012-06-01

    Rapid fluctuations in the amplitude and phase of a transionospheric radio signal caused by small scale plasma density irregularities in the ionosphere are known as scintillation. Scintillation can seriously impair a GNSS (Global Navigation Satellite Systems) receiver tracking performance, thus affecting the required levels of availability, accuracy and integrity, and consequently the reliability of modern day GNSS based applications. This paper presents an analysis of the correlation between scintillation levels and the tracking performance of a GNSS receiver for GPS L1C/A, L2C and GLONASS L1, L2 signals. The analyses make use of data recorded over Presidente Prudente (22.1°S, 51.4°W, dip latitude ~12.3°S) in Brazil, a location close to the Equatorial Ionisation Anomaly (EIA) crest in Latin America. The study presents for the first time this type of correlation analysis for GPS L2C and GLONASS L1, L2 signals. The scintillation levels are defined by the amplitude scintillation index, S4, and the receiver tracking performance is evaluated by the phase tracking jitter. Both S4 and the phase tracking jitter are estimated from the post correlation In-Phase (I) and Quadrature (Q) components logged by the receiver at a high rate. Results reveal that the dependence of the phase tracking jitter on the scintillation levels can be represented by a quadratic fit for the signals. The results presented in this paper are of importance to GNSS users, especially in view of the forthcoming high phase of solar cycle 24 (predicted for 2013).
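
    The amplitude scintillation index S4 mentioned above is defined from signal-power samples. A minimal sketch computing S4 from post-correlation I/Q components (synthetic samples; real receivers also detrend the power and average over fixed intervals, which is omitted here):

```python
import math

def s4_index(i_samples, q_samples):
    """Amplitude scintillation index from I/Q samples:
    S4^2 = (<P^2> - <P>^2) / <P>^2, with signal power P = I^2 + Q^2."""
    powers = [i * i + q * q for i, q in zip(i_samples, q_samples)]
    mean_p = sum(powers) / len(powers)
    mean_p2 = sum(p * p for p in powers) / len(powers)
    return math.sqrt((mean_p2 - mean_p ** 2) / mean_p ** 2)

print(s4_index([1.0, 1.0, 1.0, 1.0], [0.0, 0.0, 0.0, 0.0]))  # → 0.0 (constant power)
```

    A quadratic fit of measured phase tracking jitter against S4, of the kind reported in the abstract, can then be obtained with any least-squares polynomial fit.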

  11. Instantaneous BeiDou-GPS attitude determination: A performance analysis

    NASA Astrophysics Data System (ADS)

    Nadarajah, Nandakumaran; Teunissen, Peter J. G.; Raziq, Noor

    2014-09-01

    The advent of modernized and new global navigation satellite systems (GNSS) has enhanced the availability of satellite based positioning, navigation, and timing (PNT) solutions. Specifically, it increases redundancy and yields operational back-up or independence in case of failure or unavailability of one system. Among existing GNSS, the Chinese BeiDou system (BDS) is being developed and will consist of geostationary (GEO) satellites, inclined geosynchronous orbit (IGSO) satellites, and medium-Earth-orbit (MEO) satellites. In this contribution, a BeiDou-GPS robustness analysis is carried out for instantaneous, unaided attitude determination. Precise attitude determination using multiple GNSS antennas mounted on a platform relies on the successful resolution of the integer carrier phase ambiguities. The constrained Least-squares AMBiguity Decorrelation Adjustment (C-LAMBDA) method has been developed for the quadratically constrained GNSS compass model that incorporates the known baseline length. In this contribution the method is used to analyse the attitude determination performance when using the GPS and BeiDou systems. The attitude determination performance is evaluated using GPS/BeiDou data sets from a real data campaign in Australia spanning several days. The study includes the performance analyses of both stand-alone and mixed constellation (GPS/BeiDou) attitude estimation under various satellite deprived environments. We demonstrate and quantify the improved availability and accuracy of attitude determination using the combined constellation.

  12. Design and Performance Analysis of Incremental Networked Predictive Control Systems.

    PubMed

    Pang, Zhong-Hua; Liu, Guo-Ping; Zhou, Donghua

    2016-06-01

    This paper is concerned with the design and performance analysis of networked control systems with network-induced delay, packet disorder, and packet dropout. Based on the incremental form of the plant input-output model and an incremental error feedback control strategy, an incremental networked predictive control (INPC) scheme is proposed to actively compensate for the round-trip time delay resulting from the above communication constraints. The output tracking performance and closed-loop stability of the resulting INPC system are considered for two cases: 1) plant-model match case and 2) plant-model mismatch case. For the former case, the INPC system can achieve the same output tracking performance and closed-loop stability as those of the corresponding local control system. For the latter case, a sufficient condition for the stability of the closed-loop INPC system is derived using the switched system theory. Furthermore, for both cases, the INPC system can achieve a zero steady-state output tracking error for step commands. Finally, both numerical simulations and practical experiments on an Internet-based servo motor system illustrate the effectiveness of the proposed method.

  13. Visual Analysis of Cloud Computing Performance Using Behavioral Lines.

    PubMed

    Muelder, Chris; Zhu, Biao; Chen, Wei; Zhang, Hongxin; Ma, Kwan-Liu

    2016-02-29

    Cloud computing is an essential technology for Big Data analytics and services. A cloud computing system often comprises a large number of parallel computing and storage devices. Monitoring the usage and performance of such a system is important for efficient operations, maintenance, and security. Tracing every application on a large cloud system is untenable due to scale and privacy issues, but profile data can be collected relatively efficiently by regularly sampling the state of the system, including properties such as CPU load, memory usage, network usage, and others, creating a set of multivariate time series for each system. Adequate tools for studying such large-scale, multidimensional data are lacking. In this paper, we present a visual analysis approach to understanding and analyzing the performance and behavior of cloud computing systems. Our design is based on similarity measures and a layout method to portray the behavior of each compute node over time. When visualizing a large number of behavioral lines together, distinct patterns often appear, suggesting particular types of performance bottleneck. The resulting system provides multiple linked views, which allow the user to interactively explore the data by examining the data or a selected subset at different levels of detail. Our case studies, which use datasets collected from two different cloud systems, show that this visual analysis approach is effective in identifying trends and anomalies of the systems.
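
    A behavioral-line layout of the kind described needs a similarity measure between per-node metric traces. The sketch below uses a simple Euclidean-distance-based similarity to rank nodes against a reference trace; this is one plausible choice, and the paper's exact measure and layout method may differ:

```python
import math

def node_similarity(trace_a, trace_b):
    """Similarity in (0, 1] between two equal-length metric traces,
    derived from Euclidean distance: identical traces score 1.0."""
    d = math.sqrt(sum((a - b) ** 2 for a, b in zip(trace_a, trace_b)))
    return 1.0 / (1.0 + d)

# Rank compute nodes by similarity of their sampled CPU-load traces to a reference node
reference = [0.2, 0.8, 0.9, 0.3]
nodes = {"node-a": [0.2, 0.8, 0.9, 0.3], "node-b": [0.9, 0.1, 0.2, 0.8]}
ranked = sorted(nodes, key=lambda k: node_similarity(nodes[k], reference), reverse=True)
print(ranked)  # → ['node-a', 'node-b']
```

    Placing nodes with high mutual similarity near each other is what makes anomalous nodes stand out visually as divergent behavioral lines.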

  14. The Use of Error Analysis to Assess Resident Performance

    PubMed Central

    D’Angelo, Anne-Lise D.; Law, Katherine E.; Cohen, Elaine R.; Greenberg, Jacob A.; Kwan, Calvin; Greenberg, Caprice; Wiegmann, Douglas A.; Pugh, Carla M.

    2015-01-01

    Background The aim of this study is to assess validity of a human factors error assessment method for evaluating resident performance during a simulated operative procedure. Methods Seven PGY4-5 residents had 30 minutes to complete a simulated laparoscopic ventral hernia (LVH) repair on Day 1 of a national, advanced laparoscopic course. Faculty provided immediate feedback on operative errors and residents participated in a final product analysis of their repairs. Residents then received didactic and hands-on training regarding several advanced laparoscopic procedures during a lecture session and animate lab. On Day 2, residents performed a nonequivalent LVH repair using a simulator. Three investigators reviewed and coded videos of the repairs using previously developed human error classification systems. Results Residents committed 121 total errors on Day 1 compared to 146 on Day 2. One of seven residents successfully completed the LVH repair on Day 1 compared to all seven residents on Day 2 (p=.001). The majority of errors (85%) committed on Day 2 were technical and occurred during the last two steps of the procedure. There were significant differences in error type (p<.001) and level (p=.019) from Day 1 to Day 2. The proportion of omission errors decreased from Day 1 (33%) to Day 2 (14%). In addition, there were more technical and commission errors on Day 2. Conclusion The error assessment tool was successful in categorizing performance errors, supporting known-groups validity evidence. Evaluating resident performance through error classification has great potential in facilitating our understanding of operative readiness. PMID:26003910
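
    As an illustration of how such differences in error proportions can be tested, the sketch below applies a pooled two-proportion z statistic to counts rounded from the reported percentages; the study's actual statistical tests may have differed:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z statistic for comparing rates x1/n1 vs x2/n2."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                                  # pooled proportion
    se = math.sqrt(p * (1.0 - p) * (1.0 / n1 + 1.0 / n2))      # pooled standard error
    return (p1 - p2) / se

# Omission errors: ~33% of 121 errors on Day 1 vs ~14% of 146 on Day 2 (counts rounded)
z = two_proportion_z(40, 121, 20, 146)
print(abs(z) > 1.96)  # → True (significant at the 5% level)
```

    With rounded counts the statistic lands well beyond the 5% critical value, consistent with the significant shift in error type reported in the abstract.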

  15. Aerocapture Performance Analysis for a Neptune-Triton Exploration Mission

    NASA Technical Reports Server (NTRS)

    Starr, Brett R.; Westhelle, Carlos H.; Masciarelli, James P.

    2004-01-01

    A systems analysis has been conducted for a Neptune-Triton Exploration Mission in which aerocapture is used to capture a spacecraft at Neptune. Aerocapture uses aerodynamic drag instead of propulsion to decelerate from the interplanetary approach trajectory to a captured orbit during a single pass through the atmosphere. After capture, propulsion is used to move the spacecraft from the initial captured orbit to the desired science orbit. A preliminary assessment identified that a spacecraft with a lift to drag ratio of 0.8 was required for aerocapture. Performance analyses of the 0.8 L/D vehicle were performed using a high fidelity flight simulation within a Monte Carlo executive to determine mission success statistics. The simulation was the Program to Optimize Simulated Trajectories (POST) modified to include Neptune specific atmospheric and planet models, spacecraft aerodynamic characteristics, and interplanetary trajectory models. To these were added autonomous guidance and pseudo flight controller models. The Monte Carlo analyses incorporated approach trajectory delivery errors, aerodynamic characteristics uncertainties, and atmospheric density variations. Monte Carlo analyses were performed for a reference set of uncertainties and sets of uncertainties modified to produce increased and reduced atmospheric variability. For the reference uncertainties, the 0.8 L/D flatbottom ellipsled vehicle achieves 100% successful capture and has a 99.87% probability of attaining the science orbit with a 360 m/s ΔV budget for apoapsis and periapsis adjustment. Monte Carlo analyses were also performed for a guidance system that modulates both bank angle and angle of attack with the reference set of uncertainties. The angle-of-attack and bank modulation guidance system reduces the 99.87-percentile ΔV by 173 m/s (48%), to 187 m/s, for the reference set of uncertainties.
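
    The 99.87-percentile ΔV figures above come from Monte Carlo dispersion analysis. A generic sketch of nearest-rank percentile estimation over dispersed trials (a toy dispersion model with invented sigmas, not the POST simulation):

```python
import math
import random

def percentile(values, q):
    """Nearest-rank q-th percentile of a sample."""
    s = sorted(values)
    k = max(1, math.ceil(q / 100.0 * len(s)))
    return s[k - 1]

random.seed(1)
# Toy dispersion model: nominal 150 m/s orbit-adjust Delta-V plus made-up Gaussian
# contributions standing in for entry-state and atmospheric-density dispersions
trials = [150.0 + random.gauss(0.0, 20.0) + abs(random.gauss(0.0, 15.0)) for _ in range(10000)]
print(percentile(trials, 99.87) > percentile(trials, 50.0))  # → True
```

    The 99.87th percentile is the one-sided 3-sigma-equivalent level, which is why it is the budgeting statistic quoted in the abstract.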

  16. Advanced multiphysics coupling for LWR fuel performance analysis

    DOE PAGES

    Hales, J. D.; Tonks, M. R.; Gleicher, F. N.; ...

    2015-10-01

    Even the most basic nuclear fuel analysis is a multiphysics undertaking, as a credible simulation must consider at a minimum coupled heat conduction and mechanical deformation. The need for more realistic fuel modeling under a variety of conditions invariably leads to a desire to include coupling between a more complete set of the physical phenomena influencing fuel behavior, including neutronics, thermal hydraulics, and mechanisms occurring at lower length scales. This paper covers current efforts toward coupled multiphysics LWR fuel modeling in three main areas. The first area covered in this paper concerns thermomechanical coupling. The interaction of these two physics, particularly related to the feedback effect associated with heat transfer and mechanical contact at the fuel/clad gap, provides numerous computational challenges. An outline is provided of an effective approach used to manage the nonlinearities associated with an evolving gap in BISON, a nuclear fuel performance application. A second type of multiphysics coupling described here is that of coupling neutronics with thermomechanical LWR fuel performance. DeCART, a high-fidelity core analysis program based on the method of characteristics, has been coupled to BISON. DeCART provides sub-pin level resolution of the multigroup neutron flux, with resonance treatment, during a depletion or a fast transient simulation. Two-way coupling between these codes was achieved by mapping fission rate density and fast neutron flux fields from DeCART to BISON and the temperature field from BISON to DeCART while employing a Picard iterative algorithm. Finally, the need for multiscale coupling is considered. Fission gas production and evolution significantly impact fuel performance by causing swelling, a reduction in the thermal conductivity, and fission gas release. The mechanisms involved occur at the atomistic and grain scale and are therefore not the domain of a fuel performance code. However, it is
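
    The Picard (fixed-point) coupling iteration described above can be sketched with two toy single-physics solves that exchange fields until they agree. The feedback laws below are invented for illustration only and are not BISON or DeCART physics:

```python
def picard_couple(t0=600.0, tol=1e-8, max_iter=100):
    """Toy Picard iteration between two single-physics solves: a 'neutronics' update
    (power falls as temperature rises, Doppler-like) and a 'thermal' update
    (temperature rises with power). Iterates to a mutually consistent (power, temp)."""
    def power_from_temp(t):
        return 200.0 / (1.0 + 0.001 * (t - 600.0))    # invented feedback law
    def temp_from_power(p):
        return 500.0 + 0.6 * p                        # invented conduction law
    t = t0
    for it in range(1, max_iter + 1):
        p = power_from_temp(t)        # physics 1 solve with frozen temperature field
        t_new = temp_from_power(p)    # physics 2 solve with frozen power field
        if abs(t_new - t) < tol:
            return p, t_new, it
        t = t_new
    raise RuntimeError("Picard iteration did not converge")

p, t, iters = picard_couple()
print(iters < 100)  # → True (converges in a handful of sweeps)
```

    In the real code coupling, the exchanged quantities are whole fields (fission rate density, fast flux, temperature) mapped between meshes, but the fixed-point structure of the iteration is the same.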

  17. Advanced multiphysics coupling for LWR fuel performance analysis

    SciTech Connect

    Hales, J. D.; Tonks, M. R.; Gleicher, F. N.; Spencer, B. W.; Novascone, S. R.; Williamson, R. L.; Pastore, G.; Perez, D. M.

    2015-10-01

    Even the most basic nuclear fuel analysis is a multiphysics undertaking, as a credible simulation must consider at a minimum coupled heat conduction and mechanical deformation. The need for more realistic fuel modeling under a variety of conditions invariably leads to a desire to include coupling between a more complete set of the physical phenomena influencing fuel behavior, including neutronics, thermal hydraulics, and mechanisms occurring at lower length scales. This paper covers current efforts toward coupled multiphysics LWR fuel modeling in three main areas. The first area covered in this paper concerns thermomechanical coupling. The interaction of these two physics, particularly related to the feedback effect associated with heat transfer and mechanical contact at the fuel/clad gap, provides numerous computational challenges. An outline is provided of an effective approach used to manage the nonlinearities associated with an evolving gap in BISON, a nuclear fuel performance application. A second type of multiphysics coupling described here is that of coupling neutronics with thermomechanical LWR fuel performance. DeCART, a high-fidelity core analysis program based on the method of characteristics, has been coupled to BISON. DeCART provides sub-pin level resolution of the multigroup neutron flux, with resonance treatment, during a depletion or a fast transient simulation. Two-way coupling between these codes was achieved by mapping fission rate density and fast neutron flux fields from DeCART to BISON and the temperature field from BISON to DeCART while employing a Picard iterative algorithm. Finally, the need for multiscale coupling is considered. Fission gas production and evolution significantly impact fuel performance by causing swelling, a reduction in the thermal conductivity, and fission gas release. The mechanisms involved occur at the atomistic and grain scale and are therefore not the domain of a fuel performance code. 
However, it is possible to use
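    The Picard iterative exchange of fields described above can be illustrated with a toy fixed-point sketch. The two solver stand-ins and all constants below are invented for illustration only; they have nothing to do with the actual DeCART or BISON models.

```python
# Toy 0-D stand-ins for two coupled solvers: a "neutronics" solve returning
# power from temperature (Doppler-style negative feedback) and a "thermal"
# solve returning temperature from power. All constants are invented; this
# only illustrates the Picard fixed-point exchange, not DeCART/BISON.

def neutronics_solve(T):
    return 200.0 / (1.0 + 0.001 * (T - 300.0))  # power density, arbitrary units

def thermal_solve(q):
    return 300.0 + 2.0 * q  # temperature rises linearly with power

def picard_coupling(tol=1e-8, max_iter=100):
    T = 300.0  # initial temperature guess
    for i in range(max_iter):
        q = neutronics_solve(T)   # map temperature field -> power field
        T_new = thermal_solve(q)  # map power field -> temperature field
        if abs(T_new - T) < tol:  # fixed point reached: fields are consistent
            return T_new, q, i + 1
        T = T_new
    raise RuntimeError("Picard iteration did not converge")

T, q, iterations = picard_coupling()
```

    In a real coupling, the scalars T and q would be full spatial fields mapped between the two codes' meshes, but the fixed-point structure of the iteration is the same.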

  18. A theoretical analysis of vacuum arc thruster performance

    NASA Technical Reports Server (NTRS)

    Polk, James E.; Sekerak, Mike; Ziemer, John K.; Schein, Jochen; Qi, Niansheng; Binder, Robert; Anders, Andre

    2001-01-01

    In vacuum arc discharges the current is conducted through vapor evaporated from the cathode surface. In these devices very dense, highly ionized plasmas can be created from any metallic or conducting solid used as the cathode. This paper describes theoretical models of performance for several thruster configurations which use vacuum arc plasma sources. This analysis suggests that thrusters using vacuum arc sources can be operated efficiently with a range of propellant options that gives great flexibility in specific impulse. In addition, the efficiency of plasma production in these devices appears to be largely independent of scale because the metal vapor is ionized within a few microns of the cathode electron emission sites, so this approach is well-suited for micropropulsion.

  19. Performance analysis of ultrasonic ranging using a digital polarity correlator

    NASA Astrophysics Data System (ADS)

    Kodama, T.; Nakahira, K.

    2013-01-01

    This paper presents a performance analysis of distance measurement using a digital polarity correlator applied to an ultrasonic ranging system consisting of piezoelectric transducers for pulse-echo operation and a pulse compression filter using chirp signals. Analytical and simulation results show that the technique of one-bit correlation is as effective as two-bit correlation with respect to signal-to-noise ratio and probability of detecting a target, and further that both methods approach the results obtained from a complete correlation of the received signals with a reference signal, provided that the threshold of the received signals is adjusted with regard to the noise level. Experimental results show close agreement with the presented theory.
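    The comparison between one-bit (polarity) correlation and full-precision correlation can be sketched on a simple delay-estimation task. The chirp, noise level, and delay below are arbitrary illustrative choices, not parameters from the paper's ranging system.

```python
import numpy as np

rng = np.random.default_rng(0)

# Reference chirp and a delayed, noisy echo. Chirp rate, delay, and noise
# level are arbitrary choices for this sketch, not the paper's parameters.
n = 512
t = np.arange(n)
ref = np.sin(2 * np.pi * (0.01 + 0.0004 * t) * t)  # linear chirp
true_delay = 100
echo = np.zeros(n)
echo[true_delay:] = ref[:n - true_delay]
echo += 0.5 * rng.standard_normal(n)  # additive receiver noise

def estimate_delay(x, reference):
    # Cross-correlate and return the lag of the correlation peak.
    c = np.correlate(x, reference, mode="full")
    return int(np.argmax(c)) - (len(reference) - 1)

full_est = estimate_delay(echo, ref)                        # full-precision
polarity_est = estimate_delay(np.sign(echo), np.sign(ref))  # one-bit (polarity)
```

    At a reasonable signal-to-noise ratio both estimators locate the echo at essentially the same lag, which is the effect the paper quantifies analytically.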

  20. Analysis of Different Blade Architectures on small VAWT Performance

    NASA Astrophysics Data System (ADS)

    Battisti, L.; Brighenti, A.; Benini, E.; Raciti Castelli, M.

    2016-09-01

    The present paper aims at describing and comparing different small Vertical Axis Wind Turbine (VAWT) architectures in terms of performance and loads. These characteristics can be highlighted by resorting to the Blade Element-Momentum (BE-M) model, commonly adopted for rotor pre-design and controller assessment. After validating the model with experimental data, the paper focuses on the analysis of VAWT loads depending on some relevant rotor features: blade number (2 and 3), airfoil camber line (comparing symmetrical and asymmetrical profiles) and blade inclination (straight versus helical blades). The effects of such characteristics on both power and thrusts (in the streamwise and crosswise directions), as a function of both the blades' azimuthal position and their Tip Speed Ratio (TSR), are presented and discussed in detail.
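    The azimuthal dependence at the heart of the BE-M treatment can be illustrated with the basic velocity-triangle kinematics of a straight-bladed VAWT. Induction effects are deliberately omitted here, so this is only the kinematic skeleton of such a model, not the model itself.

```python
import math

# Velocity-triangle kinematics for a straight-bladed VAWT in the BE-M
# framing, with induction deliberately omitted (kinematic skeleton only).

def relative_flow(tsr, theta_deg):
    # tsr: tip speed ratio (blade speed over free-stream speed)
    # theta_deg: blade azimuthal position in degrees
    theta = math.radians(theta_deg)
    w_t = tsr + math.cos(theta)  # tangential component / V_inf
    w_n = math.sin(theta)        # normal component / V_inf
    w = math.hypot(w_t, w_n)     # relative speed / V_inf
    alpha = math.degrees(math.atan2(w_n, w_t))  # angle of attack, degrees
    return w, alpha

w, alpha = relative_flow(tsr=3.0, theta_deg=90.0)
```

    Sweeping theta_deg over a revolution shows how the angle of attack, and hence the blade loads, oscillate with azimuth, which is why the paper reports thrusts as a function of azimuthal position and TSR.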

  1. Human performance analysis of industrial radiography radiation exposure events

    SciTech Connect

    Reece, W.J.; Hill, S.G.

    1995-12-01

    A set of radiation overexposure event reports were reviewed as part of a program to examine human performance in industrial radiography for the US Nuclear Regulatory Commission. Incident records for a seven year period were retrieved from an event database. Ninety-five exposure events were initially categorized and sorted for further analysis. Descriptive models were applied to a subset of severe overexposure events. Modeling included: (1) operational sequence tables to outline the key human actions and interactions with equipment, (2) human reliability event trees, (3) an application of an information processing failures model, and (4) an extrapolated use of the error influences and effects diagram. Results of the modeling analyses provided insights into the industrial radiography task and suggested areas for further action and study to decrease overexposures.

  2. Performance Analysis: ITS Data through September 30, 2009

    SciTech Connect

    Kerr, C E

    2009-12-07

    Data from ITS were analyzed to understand the issues at LLNL and to identify issues that may require additional management attention and those that meet the threshold for reporting to the DOE Noncompliance Tracking System (NTS). In this report we discuss assessments and issues entered in ITS and compare the number and type presently entered in ITS to previous time periods. Issues reported in ITS were evaluated and discussed. The analysis identified two noncompliances that meet the threshold for reporting to the DOE NTS. All of the data in ITS are analyzed; however, the primary focus of this report is to meet requirements for performance analysis of specific functional areas. The DOE Office of Enforcement expects LLNL to 'implement comprehensive management and independent assessments that are effective in identifying deficiencies and broader problems in safety and security programs, as well as opportunities for continuous improvement within the organization' and to 'regularly perform assessments to evaluate implementation of the contractor's processes for screening and internal reporting.' LLNL has a self-assessment program, described in the document applicable during this time period, ES&H Manual Document 4.1, that includes line, management and independent assessments. LLNL also has in place a process to identify and report deficiencies of nuclear, worker safety and health and security requirements. In addition, the DOE Office of Enforcement expects that 'issues management databases are used to identify adverse trends, dominant problem areas, and potential repetitive events or conditions' (page 15, DOE Enforcement Process Overview, June 2009). LLNL requires that all worker safety and health and nuclear safety noncompliances be tracked as 'deficiencies' in the LLNL Issues Tracking System (ITS). Data from the ITS are analyzed for worker safety and health (WSH) and nuclear safety noncompliances that may meet the threshold for reporting to the DOE Noncompliance

  3. Hydrodynamic body shape analysis and their impact on swimming performance.

    PubMed

    Li, Tian-Zeng; Zhan, Jie-Min

    2015-01-01

    This study presents the hydrodynamic characteristics of different adult male swimmers' body shapes using a computational fluid dynamics method. The simulation is carried out with the CFD code Fluent, solving the 3D incompressible Navier-Stokes equations with the RNG k-ε turbulence closure. The water free surface is captured by the volume of fluid (VOF) method. A set of full-body models, based on the anthropometrical characteristics of the most common male swimmers, is created with the Computer Aided Industrial Design (CAID) software Rhinoceros. The analysis of the CFD results revealed that a swimmer's body shape has a noticeable effect on hydrodynamic performance, which explains why a male swimmer with an inverted-triangle body shape has good hydrodynamic characteristics for competitive swimming.
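    A typical post-processing step in such CFD studies is converting a computed drag force into a dimensionless drag coefficient. The density, speed, and reference area below are placeholder values, not data from the paper's simulations.

```python
# Drag coefficient from a simulated drag force: Cd = 2 F / (rho * V**2 * A).
# Water density, towing speed, and frontal area below are placeholder values.

def drag_coefficient(force_n, rho=998.0, speed=2.0, area=0.1):
    return 2.0 * force_n / (rho * speed ** 2 * area)

cd = drag_coefficient(59.88)  # hypothetical 59.88 N drag force
```

    Normalizing this way is what makes drag comparable across body shapes of different size, which is the basis of the shape comparison in the study.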

  4. Portable Life Support Subsystem Thermal Hydraulic Performance Analysis

    NASA Technical Reports Server (NTRS)

    Barnes, Bruce; Pinckney, John; Conger, Bruce

    2010-01-01

    This paper presents the current state of the thermal hydraulic modeling efforts being conducted for the Constellation Space Suit Element (CSSE) Portable Life Support Subsystem (PLSS). The goal of these efforts is to provide realistic simulations of the PLSS under various modes of operation. The PLSS thermal hydraulic model simulates the thermal, pressure, flow characteristics, and human thermal comfort related to the PLSS performance. This paper presents modeling approaches and assumptions as well as component model descriptions. Results from the models are presented that show PLSS operations at steady-state and transient conditions. Finally, conclusions and recommendations are offered that summarize results, identify PLSS design weaknesses uncovered during review of the analysis results, and propose areas for improvement to increase model fidelity and accuracy.

  5. Performance Analysis of an Actor-Based Distributed Simulation

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1998-01-01

    Object-oriented design of simulation programs appears to be very attractive because of the natural association of components in the simulated system with objects. There is great potential in distributing the simulation across several computers for the purpose of parallel computation and its consequent handling of larger problems in less elapsed time. One approach to such a design is to use "actors", that is, active objects with their own thread of control. Because these objects execute concurrently, communication is via messages. This is in contrast to an object-oriented design using passive objects where communication between objects is via method calls (direct calls when they are in the same address space and remote procedure calls when they are in different address spaces or different machines). This paper describes a performance analysis program for the evaluation of a design for distributed simulations based upon actors.
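    The distinction drawn above, active objects communicating by messages rather than by method calls, can be sketched with a minimal actor built from a thread and a mailbox queue. This is purely a pattern illustration, unrelated to the simulation program analyzed in the paper.

```python
import queue
import threading

# Minimal "actor": an object with its own thread of control that reacts to
# messages arriving in its mailbox, in contrast to a passive object whose
# methods are invoked directly by callers.
class CounterActor:
    def __init__(self):
        self.mailbox = queue.Queue()
        self.count = 0
        self.done = threading.Event()
        threading.Thread(target=self._run, daemon=True).start()

    def _run(self):
        while True:
            msg = self.mailbox.get()  # block until a message arrives
            if msg == "stop":
                self.done.set()
                return
            self.count += msg         # message payload: a number to add

    def send(self, msg):
        self.mailbox.put(msg)         # asynchronous, non-blocking send

actor = CounterActor()
for i in range(1, 11):
    actor.send(i)
actor.send("stop")
actor.done.wait()  # messages are processed in FIFO order by the actor thread
```

    Because all state changes happen on the actor's own thread, no locking of `count` is needed; serialization through the mailbox is what makes concurrent actors composable in a distributed simulation.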

  6. 1-D Numerical Analysis of ABCC Engine Performance

    NASA Technical Reports Server (NTRS)

    Holden, Richard

    1999-01-01

    The ABCC engine combines an air-breathing engine and a rocket engine into a single engine to increase the specific impulse over an entire flight trajectory. Except for the heat source, the basic operation of the ABCC is similar to that of the RBCC engine. The ABCC is intended to have a higher specific impulse than the RBCC for a single-stage Earth-to-orbit vehicle. Computational fluid dynamics (CFD) is a useful tool for the analysis of complex transport processes in the various components of an ABCC propulsion system. The objective of the present research was to develop a transient 1-D numerical model, using the conservation of mass, linear momentum, and energy equations, that could be used to predict flow behavior throughout a generic ABCC engine following a flight path. At specific points during the development of the 1-D numerical model, a myriad of tests were performed to verify that the program produced consistent, realistic numbers that follow compressible flow theory for various inlet conditions.
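    A sanity check of the kind mentioned, verifying that computed numbers follow compressible flow theory, often rests on the standard isentropic relations; for example (a textbook relation, not code from the model itself):

```python
# Isentropic stagnation-to-static temperature ratio for a calorically
# perfect gas: T0/T = 1 + (gamma - 1)/2 * M**2.

def stagnation_temperature_ratio(mach, gamma=1.4):
    return 1.0 + 0.5 * (gamma - 1.0) * mach ** 2

ratio = stagnation_temperature_ratio(2.0)  # Mach 2 inlet condition
```

    Checking the model's computed temperatures against closed-form values like this at a few stations is a cheap way to confirm the 1-D solver respects compressible flow theory.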

  7. Automotive Gas Turbine Power System-Performance Analysis Code

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert J.

    1997-01-01

    An open-cycle gas turbine numerical modelling code suitable for thermodynamic performance analysis (i.e., thermal efficiency, specific fuel consumption, cycle state points, working fluid flowrates, etc.) of automotive and aircraft powerplant applications has been generated at the NASA Lewis Research Center's Power Technology Division. This code can be made available to automotive gas turbine preliminary design efforts, either in its present version or, assuming that resources can be obtained to incorporate empirical models for component weight and packaging volume, in a later version that includes the weight-volume estimator feature. The paper contains a brief discussion of the capabilities of the presently operational version of the code, including a listing of input and output parameters and actual sample output listings.
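    The cycle state-point arithmetic such a code automates can be illustrated with the ideal air-standard Brayton efficiency relation (a textbook formula, not an excerpt of the NASA code):

```python
# Ideal air-standard Brayton cycle thermal efficiency as a function of
# compressor pressure ratio r and specific-heat ratio gamma:
#   eta = 1 - r ** ((1 - gamma) / gamma)

def brayton_efficiency(pressure_ratio, gamma=1.4):
    return 1.0 - pressure_ratio ** ((1.0 - gamma) / gamma)

eta = brayton_efficiency(10.0)  # e.g. a pressure ratio of 10
```

    A full cycle code layers component efficiencies, pressure losses, and real-gas properties on top of this ideal relation, but the pressure-ratio dependence of thermal efficiency is the same underlying trend.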

  8. Seismic performance analysis of Tendaho earth fill dam, Ethiopia.

    NASA Astrophysics Data System (ADS)

    Berhe, T.; Wu, W.

    2009-04-01

    The Tendaho dam is located in the Afar regional state in northeastern Ethiopia, within an area known as the 'Tendaho Graben', which forms the center of the Afar triangle, a low-lying area of land where the East African, Red Sea and Gulf of Aden rift systems converge. The dam is an earthfill dam with a volume of about 4 million cubic meters and a mixed clay core. The geological setting of the dam site, the geotechnical properties of the dam materials and the seismicity of the region are reviewed. Based on this review, the foundation materials and dam body include some liquefiable granular soils. Moreover, the active East African Rift Valley fault, which can generate an earthquake of magnitude greater than 6, passes through the dam body. This valley is the primary seismic source contributing to the hazard at the Tendaho dam site. The presence of liquefiable materials beneath and within the dam body, together with the active fault crossing the dam site, demands a thorough seismic analysis of the dam. The peak ground acceleration (PGA) is selected as a measure of ground motion severity, according to the guidelines of the International Commission on Large Dams (ICOLD). Based on the criteria set by ICOLD, the dam is analyzed for two different earthquake magnitudes, the Maximum Credible Earthquake (MCE) and the Operating Basis Earthquake (OBE). Numerical codes are useful tools to investigate the safety of dams in seismically active areas. In this paper, the FLAC3D numerical tool is used to investigate the performance of the dam under dynamic loading. Based on the numerical analysis, the seismic performance of the dam is assessed.

  9. Visualization and Analysis of Climate Simulation Performance Data

    NASA Astrophysics Data System (ADS)

    Röber, Niklas; Adamidis, Panagiotis; Behrens, Jörg

    2015-04-01

    Visualization is the key process of transforming abstract (scientific) data into a graphical representation, to aid in the understanding of the information hidden within the data. Climate simulation data sets are typically quite large, time varying, and consist of many different variables sampled on an underlying grid. A large variety of climate models - and sub-models - exist to simulate various aspects of the climate system. Generally, one is mainly interested in the physical variables produced by the simulation runs, but model developers are also interested in performance data measured along with these simulations. Climate simulation models are carefully developed complex software systems, designed to run in parallel on large HPC systems. An important goal thereby is to utilize the entire hardware as efficiently as possible, that is, to distribute the workload as evenly as possible among the individual components. This is a very challenging task, and detailed performance data, such as timings and cache misses, have to be used to locate and understand performance problems in order to optimize the model implementation. Furthermore, the correlation of performance data to the processes of the application and the sub-domains of the decomposed underlying grid is vital when addressing communication and load imbalance issues. High resolution climate simulations are carried out on tens to hundreds of thousands of cores, thus yielding a vast amount of profiling data, which cannot be analyzed without appropriate visualization techniques. This PICO presentation displays and discusses the ICON simulation model, which is jointly developed by the Max Planck Institute for Meteorology and the German Weather Service, in partnership with DKRZ. The visualization and analysis of the model's performance data allow us to optimize and fine-tune the model, as well as to understand its execution on the HPC system. We show and discuss our workflow, as well as present new ideas and

  10. A performance-based approach to landslide risk analysis

    NASA Astrophysics Data System (ADS)

    Romeo, R. W.

    2009-04-01

    An approach to risk assessment based on a probabilistic analysis of the performance of structures threatened by landslides is shown and discussed. The risk is a possible loss due to the occurrence of a potentially damaging event. Analytically, the risk is the probability convolution of hazard, which defines the frequency of occurrence of the event (i.e., the demand), and fragility, which defines the capacity of the system to withstand the event given its characteristics (i.e., severity) and those of the exposed goods (vulnerability), that is: Risk = P(D >= d | S, V). The inequality sets a damage (or loss) threshold beyond which the system's performance is no longer met. Therefore a consistent approach to risk assessment should: 1) adopt a probabilistic model which takes into account all the uncertainties of the involved variables (capacity and demand), 2) follow a performance approach based on given loss or damage thresholds. The proposed method belongs to the category of semi-empirical methods: the theoretical component is given by the probabilistic capacity-demand model; the empirical component is given by the observed statistical behaviour of structures damaged by landslides. Only two landslide properties are required: the area-extent and the type (or kinematism). All other properties required to determine the severity of landslides (such as depth, speed and frequency) are derived via probabilistic methods. The severity (or intensity) of landslides, in terms of kinetic energy, is the demand of resistance; the resistance capacity is given by the cumulative distribution functions of the limit-state performance (fragility functions) assessed via damage surveys and card compilation. The investigated limit states are aesthetic (of nominal concern alone), functional (interruption of service) and structural (economic and social losses).
The damage probability is the probabilistic convolution of hazard (the probability mass function of the frequency of occurrence of given
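    The probabilistic convolution of hazard and fragility described above reduces, in the discrete case, to a severity-weighted sum. The severity classes and probabilities below are invented solely to show the mechanics.

```python
# Discrete hazard-fragility convolution for one limit state. The severity
# classes and all probabilities are invented to show the mechanics only.

hazard = {"low": 0.10, "moderate": 0.03, "high": 0.005}    # P(S = s) per year
fragility = {"low": 0.05, "moderate": 0.40, "high": 0.90}  # P(D >= d | S = s)

# Risk = sum over severities of P(S = s) * P(D >= d | S = s)
risk = sum(hazard[s] * fragility[s] for s in hazard)
```

    Each limit state (aesthetic, functional, structural) would have its own fragility column, yielding one such annual exceedance probability per damage threshold.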

  11. Performance analysis for second-design space Stirling engine model

    NASA Astrophysics Data System (ADS)

    Ogiwara, Sachio; Fujiwara, Tsutomu; Eguchi, Kunihisa; Nakamura, Yoshihiro

    A hybrid free-piston Stirling research engine, called NALSEM 125, has been tested since 1988 as part of a solar dynamic power technology program. It is a gamma-type Stirling-driven linear-alternator machine with helium as the working fluid. The objective of the experimental program is to understand the thermodynamic and dynamic mechanisms of the free-piston engine integrated with a moving-magnet alternator. After the first-phase engine experiments with NALSEM 125, the second-design Stirling engine, NALSEM 125R, has been tested. Using a second-order analytical tool, some design modifications were made to provide much more stable dynamic operation over the required operating range, as well as to incorporate an electric heater head simulating the hot interface of 12 sodium heat pipes. Described in this paper are thermodynamic performance data from NALSEM 125R operations, which are also compared with the computational analysis, considering the power losses resulting from pressure drop and gas leakage.

  12. Performance analysis of a digital capacitance measuring circuit

    NASA Astrophysics Data System (ADS)

    Xu, Lijun; Sun, Shijie; Cao, Zhang; Yang, Wuqiang

    2015-05-01

    This paper presents the design and study of a digital capacitance measuring circuit with theoretical analysis, numerical simulation, and experimental evaluation. The static and dynamic performances of the capacitance measuring circuit are first defined, including signal-to-noise ratio (SNR), standard deviation, accuracy, linearity, sensitivity, and response time, within a given measurement range. Then numerical simulation is carried out to analyze the SNR and standard deviation of the circuit, followed by experiments to validate the overall performance of the circuit. The simulation results show that when the standard deviation of noise is 0.08 mV and the measured capacitance decreases from 6 pF to 3 fF, the SNR decreases from 90 dB to 22 dB and the standard deviation is between 0.17 fF and 0.24 fF. The experimental results show that when the measured capacitance decreases from 6 pF to 40 fF and the data sampled in a single period are used for demodulation, the SNR decreases from 88 dB to 40 dB and the standard deviation is between 0.18 fF and 0.25 fF. The maximum absolute error and relative error are 5.12 fF and 1.26%, respectively. The SNR and standard deviation can be further improved if the data sampled in more than one period are used for demodulation by the circuit.
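    The SNR figures quoted above follow the usual decibel definition; a minimal helper makes the conversion explicit. The example values are placeholders, not the paper's measured signal levels.

```python
import math

# SNR in decibels from an RMS signal level and a noise standard deviation.
# Example values are placeholders, not the paper's measured amplitudes.

def snr_db(signal_rms, noise_std):
    return 20.0 * math.log10(signal_rms / noise_std)

example = snr_db(1.0, 1e-4)  # 1 V RMS signal over 0.1 mV noise
```

    Because the relation is logarithmic, the paper's observed drop from about 90 dB to 22 dB as the capacitance shrinks corresponds to the demodulated signal amplitude falling by more than three orders of magnitude relative to the noise floor.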

  13. Performance analysis and optimization of power plants with gas turbines

    NASA Astrophysics Data System (ADS)

    Besharati-Givi, Maryam

    The gas turbine is one of the most important applications for power generation. The purpose of this research is the performance analysis and optimization of power plants using different design systems at different operating conditions. In this research, accurate efficiency calculation and the determination of optimum efficiency values for chiller inlet cooling and blade-cooled gas turbine designs are investigated. This research shows how it is possible to find the optimum design for different operating conditions, such as ambient temperature, relative humidity, turbine inlet temperature, and compressor pressure ratio. The simulated designs include a chiller with varied COP and fogging cooling for the compressor. In addition, the overall thermal efficiency is improved by adding design systems such as reheat and regenerative heating. The other goal of this research focuses on the blade-cooled gas turbine for higher turbine inlet temperature and, consequently, higher efficiency. New film cooling equations, along with changing film cooling effectiveness for optimum cooling air requirement at the first-stage blades, and internal and trailing-edge cooling for the second stage, are introduced for optimal efficiency calculation. This research sets the groundwork for using the optimum value of efficiency calculation while using inlet cooling and blade cooling designs. In the final step, the designed gas cycle systems are combined with a steam cycle for performance improvement.

  14. Magnetohydrodynamic Augmented Propulsion Experiment: I. Performance Analysis and Design

    NASA Technical Reports Server (NTRS)

    Litchford, R. J.; Cole, J. W.; Lineberry, J. T.; Chapman, J. N.; Schmidt, H. J.; Lineberry, C. W.

    2003-01-01

    The performance of conventional thermal propulsion systems is fundamentally constrained by the specific energy limitations associated with chemical fuels and the thermal limits of available materials. Electromagnetic thrust augmentation represents one intriguing possibility for improving the fuel composition of thermal propulsion systems, thereby increasing overall specific energy characteristics; however, realization of such a system requires an extremely high-energy-density electrical power source as well as an efficient plasma acceleration device. This Technical Publication describes the development of an experimental research facility for investigating the use of cross-field magnetohydrodynamic (MHD) accelerators as a possible thrust augmentation device for thermal propulsion systems. In this experiment, a 1.5-MW(sub e) Aerotherm arc heater is used to drive a 2-MW(sub e) MHD accelerator. The heatsink MHD accelerator is configured as an externally diagonalized, segmented channel, which is inserted into a large-bore, 2-T electromagnet. The performance analysis and engineering design of the flow path are described, as well as the parameter measurements and flow diagnostics planned for the initial series of test runs.

  15. Performance Analysis of a NASA Integrated Network Array

    NASA Technical Reports Server (NTRS)

    Nessel, James A.

    2012-01-01

    The Space Communications and Navigation (SCaN) Program is planning to integrate its individual networks into a unified network which will function as a single entity to provide services to user missions. This integrated network architecture is expected to provide SCaN customers with the capabilities to seamlessly use any of the available SCaN assets to support their missions and to efficiently meet the collective needs of Agency missions. One promising application of these assets under the envisioned architecture is arraying across existing networks to significantly enhance data rates and/or link availabilities. As such, this document provides an analysis of the transmit and receive performance of a proposed SCaN inter-network antenna array. From the study, it is determined that a fully integrated inter-network array does not provide any significant advantage over an intra-network array, one in which the assets of an individual network are arrayed for enhanced performance. Therefore, it is the recommendation of this study that NASA proceed with an arraying concept with a fundamental focus on network-centric arraying.

  16. Scalable Analysis Techniques for Microprocessor Performance Counter Metrics

    SciTech Connect

    Ahn, D H; Vetter, J S

    2002-07-24

    Contemporary microprocessors provide a rich set of integrated performance counters that allow application developers and system architects alike the opportunity to gather important information about workload behaviors. These counters can capture instruction, memory, and operating system behaviors. Current techniques for analyzing data produced from these counters use raw counts, ratios, and visualization techniques to help users make decisions about their application source code. While these techniques are appropriate for analyzing data from one process, they do not scale easily to the new levels demanded by contemporary computing systems. Indeed, the amount of data generated by these experiments is on the order of tens of thousands of data points. Furthermore, if users execute multiple experiments, then we add yet another dimension to this already knotty picture. This flood of multidimensional data can swamp efforts to harvest important ideas from these valuable counters. Very simply, this paper addresses these concerns by evaluating several multivariate statistical techniques on these datasets. We find that several techniques, such as statistical clustering, can automatically extract important features from this data. These derived results can, in turn, be fed directly back to an application developer, or used as input to a more comprehensive performance analysis environment, such as a visualization or an expert system.
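    A minimal version of the statistical clustering the authors apply to counter data can be sketched with Lloyd's k-means algorithm on synthetic counter vectors. The two "behavior groups" and all numbers below are fabricated for the illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "performance counter" vectors for 100 processes: two behavior
# groups (e.g. compute-bound vs. memory-bound) in a 3-counter feature
# space. All values are invented for this sketch.
group_a = rng.normal(loc=[10.0, 1.0, 0.2], scale=0.5, size=(50, 3))
group_b = rng.normal(loc=[2.0, 8.0, 5.0], scale=0.5, size=(50, 3))
data = np.vstack([group_a, group_b])

def kmeans(x, k=2, iters=50):
    # Farthest-point initialization, then Lloyd's iterations: assign each
    # point to its nearest centroid, recompute each centroid as the mean.
    centroids = [x[0]]
    for _ in range(k - 1):
        dists = np.min([np.linalg.norm(x - c, axis=1) for c in centroids], axis=0)
        centroids.append(x[dists.argmax()])
    centroids = np.array(centroids)
    for _ in range(iters):
        d = np.linalg.norm(x[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        centroids = np.array([x[labels == j].mean(axis=0) for j in range(k)])
    return labels, centroids

labels, centroids = kmeans(data)
```

    Grouping thousands of per-process counter vectors this way collapses them into a handful of representative behaviors, which is the scalability gain over inspecting raw counts process by process.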

  17. Analysis of beamed-energy ramjet/scramjet performance

    NASA Technical Reports Server (NTRS)

    Myrabo, L. N.; Powers, M. V.; Zaretzky, C. L.

    1986-01-01

    A study has been performed on a laser-heated ramjet/scramjet vehicle concept for propulsion during the air-breathing portion of an orbital launch trajectory. The concept considers axisymmetric, high-thrust vehicles with external inlets and nozzles. Conceptual design and ramjet/scramjet cycle analysis are emphasized, with propulsive energy provided by beamed laser power rather than combustion of on-board fuel: the conventional ramjet/scramjet combustion chamber is replaced by a laser energy absorption chamber. The elimination of on-board propellant can result in very high thrust-to-weight ratios and payload fractions, in a vehicle with a relatively small degree of mechanical complexity. The basic vehicle has a weight of 12,250 lbf and a diameter of 5 meters, which is close to the size of the Apollo command module. The ramjet calculations are based on a Mach 3 isentropic inlet with a 13.7-degree half-angle conical tip. The scramjet analysis considers conical inlets with 10, 15, and 30 degree half-angles. Flight Mach numbers from 2 to 20 are considered in the calculations.

  18. The Current State of Human Performance Technology: A Citation Network Analysis of "Performance Improvement Quarterly," 1988-2010

    ERIC Educational Resources Information Center

    Cho, Yonjoo; Jo, Sung Jun; Park, Sunyoung; Kang, Ingu; Chen, Zengguan

    2011-01-01

    This study conducted a citation network analysis (CNA) of human performance technology (HPT) to examine the current state of the field. Previous reviews of the field have used traditional research methods, such as content analysis, survey, Delphi, and citation analysis. The distinctive features of CNA come from using a social network analysis…

  19. Analysis of correlation between corneal topographical data and visual performance

    NASA Astrophysics Data System (ADS)

    Zhou, Chuanqing; Yu, Lei; Ren, Qiushi

    2007-02-01

    Purpose: To study the correlations among corneal asphericity, higher-order aberrations and visual performance for eyes with virgin myopia and after laser in situ keratomileusis (LASIK). Methods: A total of 320 candidates (590 eyes) for LASIK treatment were included in this study. The mean preoperative spherical equivalent was -4.35+/-1.51 D (range -1.25 to -9.75 D), with astigmatism less than 2.5 D. Corneal topography maps and contrast sensitivity were measured and analyzed for every eye before and one year after LASIK for the analysis of corneal asphericity and wavefront aberrations. Results: Preoperatively, only the 4th- and 6th-order aberrations had significant correlation with corneal asphericity and apical radius of curvature (p<0.001). Postoperatively, all 3rd- to 6th-order aberrations had statistically significant correlation with corneal asphericity (p<0.01), but only the 4th- and 6th-order aberrations had significant correlation with apical radius of curvature (p<0.05). Asymmetrical aberrations such as coma had significant correlation with the vertical offset of the pupil center (p<0.01). Preoperatively, corneal aberrations had no significant correlation with visual acuity or the area under the log contrast sensitivity function (AULCSF) (P>0.05). Postoperatively, corneal aberrations still did not correlate significantly with visual acuity (P>0.05), but had a significantly negative correlation with AULCSF (P<0.01). Corneal asphericity had no significant correlation with AULCSF before or after the treatment (P>0.05). Conclusions: Corneal aberrations correlate differently with corneal profile and visual performance for eyes with virgin myopia and after LASIK, which may be due to the changed corneal profile and limitations of the metrics of corneal aberrations.

  20. Analysis of Student Performance in Peer Led Undergraduate Supplements

    NASA Astrophysics Data System (ADS)

    Gardner, Linda M.

    Foundations of Chemistry courses at the University of Kansas have traditionally accommodated nearly 1,000 individual students every year with a single course in a large lecture hall. To develop a more student-centered learning atmosphere, Peer Led Undergraduate Supplements (PLUS) were introduced to assist students, starting in the spring of 2010. PLUS was derived from the better-known Peer-Led Team Learning, with modifications to meet the specific needs of the university and the students. The yearlong investigation of PLUS Chemistry began in the fall of 2012 to allow for adequate development of materials and training of peer leaders. We examined the impact on academic achievement for students who attended PLUS sessions while controlling for high school GPA, math ACT scores, credit hours earned in high school, completion of calculus, gender, and aspiration to be a pharmacist (i.e., pre-pharmacy students). In a linear least squares multiple regression, PLUS participants performed on average one percent higher on exam scores in Chemistry 184, and four tenths of a percent higher in Chemistry 188, for each PLUS session attended. Pre-pharmacy students moderated the effect of PLUS attendance on chemistry achievement, ultimately negating any relative gain associated with attending PLUS sessions. Evidence of a gender difference was demonstrated in the Chemistry 188 model, indicating females experience a greater benefit from PLUS sessions. Additionally, an item analysis studied the relationship of PLUS material to individual exam items. Students who attended PLUS sessions answered PLUS-related items correctly 10 to 20 percent more often than their comparison group, with differences ranging from none to 10 percent for non-PLUS-related items. In summary, PLUS has a positive effect on exam performance in introductory chemistry courses at the University of Kansas.
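    The regression design described above can be sketched with ordinary least squares. The data, coefficient values, and variable names below are hypothetical, chosen only to illustrate estimating a per-session effect while controlling for a covariate; they are not the study's data.

    ```python
    import numpy as np

    # Hypothetical data: exam score regressed on PLUS sessions attended,
    # controlling for high-school GPA.
    rng = np.random.default_rng(1)
    n = 500
    sessions = rng.integers(0, 10, n).astype(float)   # PLUS sessions attended
    hs_gpa = rng.normal(3.2, 0.4, n)                  # control covariate
    exam = 60 + 1.0 * sessions + 8.0 * hs_gpa + rng.normal(0, 5, n)

    # Ordinary least squares: design matrix with an intercept column
    X = np.column_stack([np.ones(n), sessions, hs_gpa])
    beta, *_ = np.linalg.lstsq(X, exam, rcond=None)
    # beta[1] is the estimated per-session effect, holding GPA fixed
    ```

    With enough data the estimated coefficient recovers the simulated per-session effect; the actual study additionally controlled for ACT scores, credit hours, calculus completion, gender, and pre-pharmacy status.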

  1. Measurement Performance of a Computer Assisted Vertebral Motion Analysis System

    PubMed Central

    Davis, Reginald J.; Lee, David C.; Cheng, Boyle

    2015-01-01

    Background Segmental instability of the lumbar spine is a significant cost within the US health care system; however, current thresholds for the indication of radiographic instability are not well defined. Purpose To determine the performance measurements of sagittal lumbar intervertebral measurements using computer-assisted measurements of the lumbar spine based on motion sequences from a video-fluoroscopic technique. Study design Sensitivity, specificity, predictive values, prevalence, and test-retest reliability evaluation of digitized manual versus computer-assisted measurements of the lumbar spine. Patient sample A total of 2239 intervertebral levels from 509 symptomatic patients, and 287 intervertebral levels from 73 asymptomatic participants, were retrospectively evaluated. Outcome measures Specificity, sensitivity, negative predictive value (NPV), diagnostic accuracy, and prevalence between the two measurement techniques; coefficient of repeatability (CR), limits of agreement (LOA), intraclass correlation coefficient (ICC; type 3,1), and standard error of measurement for both measurement techniques. Methods Asymptomatic individuals and symptomatic patients were all evaluated using both the Vertebral Motion Analysis (VMA) system and fluoroscopic flexion-extension static radiographs (FE). The analysis was compared to known thresholds of 15% intervertebral translation (IVT, equivalent to 5.3 mm assuming a 35 mm vertebral body depth) and 25° intervertebral rotation (IVR). Results The VMA measurements demonstrated greater specificity, sensitivity, NPV, prevalence, and reliability compared with FE for radiographic evidence of instability. Specificity was 99.4% and 99.1% in the VMA compared to 98.3% and 98.2% in the FE for IVR and IVT, respectively. Sensitivity was 41.2% and 44.6% greater in the VMA compared to the FE for IVR and IVT, respectively. NPV was 91% and 88% in the VMA compared to 62% and 66% in the FE for IVR and IVT
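    The diagnostic measures reported here (sensitivity, specificity, NPV, prevalence) all derive from confusion-matrix counts. A minimal sketch with illustrative counts, not the study's data:

    ```python
    def diagnostic_metrics(tp, fp, tn, fn):
        """Standard diagnostic performance measures from confusion-matrix counts."""
        total = tp + fp + tn + fn
        return {
            "sensitivity": tp / (tp + fn),   # true positive rate
            "specificity": tn / (tn + fp),   # true negative rate
            "ppv": tp / (tp + fp),           # positive predictive value
            "npv": tn / (tn + fn),           # negative predictive value
            "prevalence": (tp + fn) / total,
            "accuracy": (tp + tn) / total,
        }

    # Illustrative counts only
    m = diagnostic_metrics(tp=40, fp=10, tn=90, fn=10)
    # m["sensitivity"] -> 0.8, m["specificity"] -> 0.9, m["npv"] -> 0.9
    ```

    Note that NPV and PPV, unlike sensitivity and specificity, depend on prevalence, which is why the asymptomatic and symptomatic samples are reported separately.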

  2. Design, fabrication & performance analysis of an unmanned aerial vehicle

    NASA Astrophysics Data System (ADS)

    Khan, M. I.; Salam, M. A.; Afsar, M. R.; Huda, M. N.; Mahmud, T.

    2016-07-01

    An Unmanned Aerial Vehicle was designed, analyzed and fabricated to meet design requirements and perform the entire mission for an international aircraft design competition. The goal was to have a balanced design possessing good demonstrated flight handling qualities and practical, affordable manufacturing requirements while providing high vehicle performance. The UAV had to complete three missions: a ferry flight (1st mission), a maximum load mission (2nd mission) and an emergency medical mission (3rd mission). The requirement of the ferry flight mission was to fly as many laps as possible within 4 minutes. The maximum load mission consisted of flying 3 laps while carrying two wooden blocks which simulated cargo. The requirement of the emergency medical mission was to complete 3 laps as quickly as possible while carrying two attendants and two patients. A careful analysis revealed lowest rated aircraft cost (RAC) as the primary design objective. So, the challenge was to build an aircraft with minimum RAC that could fly fast, fly with maximum payload, and fly fast with all possible configurations. The aircraft design was reached by first generating numerous design concepts capable of completing the mission requirements. In the conceptual design phase, a Figure of Merit (FOM) analysis was carried out to select the initial aircraft configuration, propulsion, empennage and landing gear. After completion of the conceptual design, preliminary design was carried out. The preliminary design iterations had a low wing loading, high lift coefficient, and a high thrust-to-weight ratio. To make the aircraft capable of rough-field taxi, springs were added in the landing gear to absorb shock. An airfoil-shaped fuselage was designed to allow sufficient space for payload and generate less drag, letting the aircraft fly fast. The final design was a high-wing monoplane with a conventional tail, a single tractor propulsion system and a tail-dragger landing gear. Payload was stored in

  3. Advanced Analysis of Finger-Tapping Performance: A Preliminary Study

    PubMed Central

    Barut, Çağatay; Kızıltan, Erhan; Gelir, Ethem; Köktürk, Fürüzan

    2013-01-01

    Background: The finger-tapping test is a commonly employed quantitative assessment tool used to measure motor performance in the upper extremities. The task is a complex motion that is affected by external stimuli, mood and health status. This complexity is difficult to capture with a single average intertap-interval value (the time difference between successive taps), which only provides general information and neglects the temporal effects of the aforementioned factors. Aims: This study evaluated the time course of average intertap-interval values and the patterns of variation in both the right and left hands of right-handed subjects using a computer-based finger-tapping system. Study Design: Cross-sectional study. Methods: Thirty-eight male individuals aged between 20 and 28 years (Mean±SD = 22.24±1.65) participated in the study. Participants were asked to perform a single-finger tapping test over a 10-second test period. Only the results of the 35 right-handed (RH) participants were considered in this study. The test records the time of each tap and saves the data as the time differences between successive taps for further analysis. The average number of taps and the temporal fluctuation patterns of the intertap-intervals were calculated and compared. The variation in the intertap-interval was evaluated with the best curve fit method. Results: An average tapping speed or tapping rate can reliably be defined for a single-finger tapping test by analysing the graphically presented data of the number of taps within the test period. However, a different presentation of the same data, namely the intertap-interval values, shows temporal variation as the number of taps increases. Curve fitting applications indicate that the variation has a biphasic nature. Conclusion: The measures obtained in this study reflect the complex nature of the finger-tapping task and are suggested to provide reliable information regarding hand performance. Moreover, the

  4. Thermal Performance Analysis of a Geologic Borehole Repository

    SciTech Connect

    Reagin, Lauren

    2016-08-16

    The Brazilian Nuclear Research Institute (IPEN) proposed a design for the disposal of Disused Sealed Radioactive Sources (DSRS) based on the IAEA Borehole Disposal of Sealed Radioactive Sources (BOSS) design that would allow the entirety of Brazil’s inventory of DSRS to be disposed of in a single borehole. The proposed IPEN design allows 170 waste packages (WPs) containing DSRS (such as Co-60 and Cs-137) to be stacked on top of each other inside the borehole. The primary objective of this work was to evaluate the thermal performance of a conservative approach to the IPEN proposal with the equivalent of two WPs and two different internal configurations, using Co-60 as the radioactive heat source. The current WP configuration (heterogeneous) for the IPEN proposal has 60% of the WP volume occupied by a radioactive heat source and the remaining 40% as vacant space. The second configuration (homogeneous) considered for this project had 100% of the WP volume occupied by a radioactive heat source. The computational models for the thermal analyses of the WP configurations with the Co-60 heat source considered three different cooling mechanisms (conduction, radiation, and convection) and the effect of mesh size on the results of the thermal analysis. The analyses yielded maximum temperatures inside the WPs for both WP configurations and various mesh sizes. The heterogeneous WP model included conduction, convection, and radiation; its temperature results suggest that the model is cooled predominantly by conduction, with the effects of radiation and natural convection being negligible. The thermal analysis comparing the two WP configurations suggests that either configuration could be used for the design. The mesh sensitivity results verify the meshes used, and results obtained from the thermal analyses were close to

  5. Analysis of Performance of Stereoscopic-Vision Software

    NASA Technical Reports Server (NTRS)

    Kim, Won; Ansar, Adnan; Steele, Robert; Steinke, Robert

    2007-01-01

    A team of JPL researchers has analyzed stereoscopic vision software and produced a document describing its performance. This software is of the type used in maneuvering exploratory robotic vehicles on Martian terrain. The software in question utilizes correlations between portions of the images recorded by two electronic cameras to compute stereoscopic disparities, which, in conjunction with camera models, are used in computing distances to terrain points to be included in constructing a three-dimensional model of the terrain. The analysis included effects of correlation-window size, a pyramidal image down-sampling scheme, vertical misalignment, focus, maximum disparity, stereo baseline, and range ripples. Contributions of sub-pixel interpolation, vertical misalignment, and foreshortening to stereo correlation error were examined theoretically and experimentally. It was found that camera-calibration inaccuracy contributes to both down-range and cross-range error but stereo correlation error affects only the down-range error. Experimental data for quantifying the stereo disparity error were obtained by use of reflective metrological targets taped to corners of bricks placed at known positions relative to the cameras. For the particular 1,024-by-768-pixel cameras of the system analyzed, the standard deviation of the down-range disparity error was found to be 0.32 pixel.
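    A disparity error such as the reported 0.32 pixel can be translated into a down-range error with the standard pinhole-stereo relations. The camera parameters below are hypothetical, chosen only to illustrate the first-order error propagation:

    ```python
    def range_from_disparity(f_px, baseline_m, disparity_px):
        """Pinhole stereo: range Z = f * B / d."""
        return f_px * baseline_m / disparity_px

    def range_error(f_px, baseline_m, z_m, sigma_d_px):
        """First-order propagation of disparity error: sigma_Z = Z**2 / (f*B) * sigma_d."""
        return z_m ** 2 / (f_px * baseline_m) * sigma_d_px

    # Hypothetical focal length (in pixels) and stereo baseline (in meters)
    f_px, B = 1000.0, 0.30
    z = range_from_disparity(f_px, B, disparity_px=30.0)   # 10.0 m range
    err = range_error(f_px, B, z, sigma_d_px=0.32)         # ~0.11 m at 10 m
    ```

    The quadratic dependence on Z is why down-range error grows much faster with distance than cross-range error.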

  6. Performance Analysis of Intelligent Robust Facility Layout Design

    NASA Astrophysics Data System (ADS)

    Moslemipour, G.; Lee, T. S.; Loong, Y. T.

    2017-03-01

    Design of a robust production facility layout with minimum handling cost (MHC) presents an appropriate approach to tackle facility layout problems in a dynamic volatile environment, in which product demands randomly change in each planning period. The objective of the design is to find the robust facility layout with minimum total material handling cost over the entire multi-period planning horizon. This paper proposes a new mathematical model for designing robust machine layout in the stochastic dynamic environment of manufacturing systems using quadratic assignment problem (QAP) formulation. In this investigation, product demands are assumed to be normally distributed random variables with known expected value, variance, and covariance that randomly change from period to period. The proposed model was verified and validated using randomly generated numerical data and benchmark examples. The effect of dependent product demands and varying interest rate on the total cost function of the proposed model has also been investigated. Sensitivity analysis on the proposed model has been performed. Dynamic programming and simulated annealing optimization algorithms were used in solving the modeled example problems.
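    The QAP objective described above, total flow-weighted distance for an assignment of machines to locations, can be sketched as follows. The toy flow and distance matrices are illustrative, not from the paper, and the exhaustive search stands in for the dynamic programming and simulated annealing solvers the authors used:

    ```python
    import itertools

    def handling_cost(assign, flow, dist):
        """Total material handling cost for one layout.
        assign[i] = location of machine i; flow[i][j] = expected flow between
        machines i and j; dist[a][b] = distance between locations a and b."""
        n = len(assign)
        return sum(flow[i][j] * dist[assign[i]][assign[j]]
                   for i in range(n) for j in range(n))

    # Toy 3-machine instance (illustrative numbers only)
    flow = [[0, 5, 2], [5, 0, 3], [2, 3, 0]]
    dist = [[0, 1, 2], [1, 0, 1], [2, 1, 0]]
    best = min(itertools.permutations(range(3)),
               key=lambda a: handling_cost(a, flow, dist))
    ```

    In the multi-period robust version, the flows become expected values of random demands and the cost is summed over the planning horizon.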

  7. Analysis of classifiers performance for classification of potential microcalcification

    NASA Astrophysics Data System (ADS)

    M. N., Arun K.; Sheshadri, H. S.

    2013-07-01

    Breast cancer is a significant public health problem in the world. According to the literature, early detection improves breast cancer prognosis. Mammography is a screening tool used for early detection of breast cancer. About 10-30% of cases are missed during routine checks, as it is difficult for radiologists to make an accurate analysis due to the large amount of data. Microcalcifications (MCs) are considered to be important signs of breast cancer. It has been reported in the literature that 30-50% of breast cancers detected radiographically show MCs on mammograms, and histologic examinations report that 62% to 79% of breast carcinomas reveal MCs. MCs are tiny, vary in size, shape, and distribution, and may be closely connected to surrounding tissues. Using traditional classifiers to classify individual potential MCs poses a major challenge, because processing mammograms at the appropriate stage generates data sets with an unequal amount of information for the two classes (i.e., MC and Not-MC). Most existing state-of-the-art classification approaches assume that the underlying training set is evenly distributed, and they face a severe bias problem when the training set is highly imbalanced in distribution. This paper addresses this issue by using classifiers that handle imbalanced data sets, and compares the performance of classifiers used in the classification of potential MCs.
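    One simple way to counter the MC/Not-MC class imbalance described above is random oversampling of the minority class before training. This is a generic technique, not necessarily the one the paper's classifiers use; the feature data below are hypothetical:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Hypothetical feature vectors for candidate spots: 950 "Not-MC", 50 "MC"
    X = rng.normal(size=(1000, 4))
    y = np.array([0] * 950 + [1] * 50)

    # Random oversampling: resample the minority class (with replacement)
    # until the two classes are balanced
    minority = np.flatnonzero(y == 1)
    extra = rng.choice(minority, size=900, replace=True)
    X_bal = np.vstack([X, X[extra]])
    y_bal = np.concatenate([y, y[extra]])
    ```

    Alternatives include undersampling the majority class or cost-sensitive training, where minority-class errors are weighted more heavily in the loss.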

  8. Space rescue system definition (system performance analysis and trades)

    NASA Astrophysics Data System (ADS)

    Housten, Sam; Elsner, Tim; Redler, Ken; Svendsen, Hal; Wenzel, Sheri

    This paper addresses key technical issues involved in the system definition of the Assured Crew Return Vehicle (ACRV). The perspective on these issues is that of a prospective ACRV contractor, performing system analysis and trade studies. The objective of these analyses and trade studies is to develop the recovery vehicle system concept and top level requirements. The starting point for this work is the definition of the set of design missions for the ACRV. This set of missions encompasses three classes of contingency/emergency (crew illness/injury, space station catastrophe/failure, transportation element catastrophe/failure). The need is to provide a system to return Space Station crew to Earth quickly (less than 24 hours) in response to randomly occurring contingency events over an extended period of time (30 years of planned Space Station life). The main topics addressed and characterized in this paper include the following: Key Recovery (Rescue) Site Access Considerations; Rescue Site Locations and Distribution; Vehicle Cross Range vs Site Access; On-orbit Loiter Capability and Vehicle Design; and Water vs. Land Recovery.

  9. 1-D Numerical Analysis of RBCC Engine Performance

    NASA Technical Reports Server (NTRS)

    Han, Samuel S.

    1998-01-01

    An RBCC engine combines air-breathing and rocket engines into a single engine to increase the specific impulse over an entire flight trajectory. Considerable research pertaining to RBCC propulsion was performed during the 1960's, and these engines were revisited recently as a candidate propulsion system for either a single-stage-to-orbit (SSTO) or two-stage-to-orbit (TSTO) launch vehicle. A variety of RBCC configurations have been evaluated, and new designs are currently under development. However, the basic configuration of all RBCC systems is built around the ejector scramjet engine originally developed for the hypersonic airplane. In this configuration, a rocket engine acts as an ejector in the air-augmented initial acceleration mode, as a fuel injector in scramjet mode, and as the rocket in all-rocket mode for orbital insertion. Computational fluid dynamics (CFD) is a useful tool for the analysis of complex transport processes in the various components of RBCC propulsion systems. The objective of the present research was to develop a transient 1-D numerical model that could be used to predict flow behavior throughout a generic RBCC engine following a flight path.

  10. Analysis of Illinois Home Performance with ENERGY STAR® Measure Packages

    SciTech Connect

    Baker, J.; Yee, S.; Brand, L.

    2013-09-01

    Through the Chicagoland Single Family Housing Characterization and Retrofit Prioritization report, the Partnership for Advanced Residential Retrofit research team characterized 15 housing types in the Chicagoland region based on assessor data, utility billing history, and available data from prior energy efficiency programs. Within these 15 groups, a subset showed the greatest opportunity for energy savings based on BEopt Version 1.1 modeling of potential energy efficiency package options and the percent of the housing stock represented by each group. In this project, collected field data from a whole-home program in Illinois are utilized to compare marketplace-installed measures to the energy saving optimal packages previously developed for the 15 housing types. Housing type, conditions, energy efficiency measures installed, and retrofit cost information were collected from 19 homes that participated in the Illinois Home Performance with ENERGY STAR program in 2012, representing eight of the characterized housing groups. Two were selected for further case study analysis to provide an illustration of the differences between optimal and actually installed measures. Taken together, these homes are representative of 34.8% of the Chicagoland residential building stock. In one instance, actual installed measures closely matched optimal recommended measures.

  11. Performance Analysis: Work Control Events Identified January - August 2010

    SciTech Connect

    De Grange, C E; Freeman, J W; Kerr, C E; Holman, G; Marsh, K; Beach, R

    2011-01-14

    This performance analysis evaluated 24 events that occurred at LLNL from January through August 2010. The analysis identified areas of potential work control process and/or implementation weaknesses and several common underlying causes. Human performance improvement and safety culture factors were included in the causal analysis of each event. The collective significance of all events in 2010, as measured by the occurrence reporting significance category and by the proportion of events reported to the DOE ORPS under the ''management concerns'' reporting criteria, does not appear to have increased. The frequency of reporting in each of the significance categories has not changed in 2010 compared to the previous four years: there is no trend in the significance category, and there has been no increase in the proportion of occurrences reported in the higher significance category. Also, the frequency of events, 42 events reported through August 2010, is not greater than in previous years and is below the average of 63 occurrences per year at LLNL since 2006. Over the previous four years, an average of 43% of LLNL's reported occurrences have been reported as either ''management concerns'' or ''near misses.'' In 2010, 29% of the occurrences have been reported as ''management concerns'' or ''near misses,'' indicating that LLNL is now reporting fewer such occurrences than in the previous four years. From 2008 to the present, LLNL senior management has undertaken a series of initiatives to strengthen the work planning and control system with the primary objective of improving worker safety. In 2008, the LLNL Deputy Director established the Work Control Integrated Project Team to develop the core requirements and graded elements of an institutional work planning and control system. By the end of that year this system was documented and implementation had begun. 
In 2009

  12. Routing performance analysis and optimization within a massively parallel computer

    DOEpatents

    Archer, Charles Jens; Peters, Amanda; Pinnow, Kurt Walter; Swartz, Brent Allen

    2013-04-16

    An apparatus, program product and method optimize the operation of a massively parallel computer system by, in part, receiving actual performance data concerning an application executed by the plurality of interconnected nodes, and analyzing the actual performance data to identify an actual performance pattern. A desired performance pattern may be determined for the application, and an algorithm may be selected from among a plurality of algorithms stored within a memory, the algorithm being configured to achieve the desired performance pattern based on the actual performance data.

  13. Polarisation of High-Performing and Low-Performing Secondary Schools in Victoria, Australia: An Analysis of Causal Complexities

    ERIC Educational Resources Information Center

    Bandaranayake, Bandara

    2016-01-01

    Applying qualitative comparative analysis (QCA), this study explores the configurations of conditions that contribute to the polarisation of high-performing and low-performing secondary schools in Victoria, Australia. It is argued that the success and failure of schools can be understood in terms of causal complexity, where one or several…

  14. Analysis of TIMS performance subjected to simulated wind blast

    NASA Technical Reports Server (NTRS)

    Jaggi, S.; Kuo, S.

    1992-01-01

    The results of the performance of the Thermal Infrared Multispectral Scanner (TIMS) when subjected to various wind conditions in the laboratory are described. Wind conditions were simulated using a 24-inch fan or combinations of air jet streams blowing toward either or both of the blackbody surfaces. The fan was used to simulate a large volume of air flow at moderate speeds (up to 30 mph); the small-diameter air jets were used to probe TIMS system response to localized wind perturbations. The maximum nozzle speed of the air jet was 60 mph. A range of wind directions and speeds was set up in the laboratory during the test. The majority of the wind tests were conducted under ambient conditions, with the room temperature fluctuating no more than 2 C. The temperature of the high-speed air jet was determined to be within 1 C of room temperature. TIMS response was recorded on analog tape. Additional thermistor readouts of the blackbody temperatures and a thermocouple readout of the ambient temperature were recorded manually to be compared with the housekeeping data recorded on the tape. Additional tests were conducted at elevated and cooled room temperatures, varied between 19.5 and 25.5 C. The calibration parameters needed for quantitative analysis of TIMS data were first plotted on a scanline-by-scanline basis. These parameters are the low and high blackbody temperature readings as recorded by the TIMS and their corresponding digitized count values. Using these values, the system transfer equation was calculated; it allows the flux for any video count to be computed from the slope and intercept of the straight line that relates flux to digital count. The actual video of the target (the lab floor in this case) was then compared with a simulated target, assumed to be a blackbody with an emissivity of 0.95, and the temperature was
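    The system transfer equation described above is a two-point linear calibration between the blackbody reference counts and their fluxes. A minimal sketch with hypothetical calibration numbers (not actual TIMS values):

    ```python
    def flux_from_count(count, c_lo, c_hi, f_lo, f_hi):
        """Two-point linear transfer: map a digital count to radiant flux using
        the counts (c_lo, c_hi) and fluxes (f_lo, f_hi) of the low and high
        blackbody references for the scanline."""
        slope = (f_hi - f_lo) / (c_hi - c_lo)
        intercept = f_lo - slope * c_lo
        return slope * count + intercept

    # Hypothetical scanline calibration values
    flux = flux_from_count(125, c_lo=50, c_hi=200, f_lo=10.0, f_hi=40.0)  # -> 25.0
    ```

    Because the two reference counts are recorded per scanline, the calibration tracks any drift in detector response during the wind tests.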

  15. Performance Analysis of Saturated Induction Motors by Virtual Tests

    ERIC Educational Resources Information Center

    Ojaghi, M.; Faiz, J.; Kazemi, M.; Rezaei, M.

    2012-01-01

    Many undergraduate-level electrical machines textbooks give detailed treatments of the performance of induction motors. Students can deepen this understanding of motor performance by performing the appropriate practical work in laboratories or in simulation using proper software packages. This paper considers various common and less-common tests…

  16. The Algerian Seismic Network: Performance from data quality analysis

    NASA Astrophysics Data System (ADS)

    Yelles, Abdelkarim; Allili, Toufik; Alili, Azouaou

    2013-04-01

    Seismic monitoring in Algeria has seen a great change since the Boumerdes earthquake of May 21st, 2003. Indeed, the installation of a new digital seismic network (ADSN) drastically upgraded the previous analog telemetry network. During the last four years, the number of stations in operation has greatly increased to 66, with 15 broadband, 2 very broadband and 47 short-period sensors and 21 accelerometers connected in real time using various modes of transmission (VSAT, ADSL, GSM, ...) and managed by Antelope software. The spatial distribution of these stations covers most of northern Algeria from east to west. Since the network began operating, a significant number of local, regional and tele-seismic events have been located by the automatic processing, revised and archived in databases. This new set of data is characterized by the accuracy of the automatic location of local seismicity and the ability to determine its focal mechanisms. Periodically, recorded data including earthquakes, calibration pulses and cultural noise are checked using PSD (Power Spectral Density) analysis to determine the noise level. ADSN broadband station data quality is controlled in quasi real time using the PQLX software by computing PDFs and PSDs of the recordings. Other tools and programs allow the monitoring and maintenance of the entire electronic system, for example checking the power state of the system, the mass position of the sensors and the environmental conditions (temperature, humidity, air pressure) inside the vaults. The new design of the network allows management of many aspects of real-time seismology: seismic monitoring, rapid earthquake determination, alert messages, moment tensor estimation, seismic source determination, shakemap calculation, etc. Compliance with international standards permits contributions to regional seismic monitoring and the Mediterranean warning system. The next two years, with the acquisition of new seismic equipment to reach 50 new BB stations, led to
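    The PSD-based noise check mentioned above can be sketched with a one-sided periodogram. The synthetic record below is illustrative only, not ADSN data, and production tools like PQLX use more elaborate averaged estimates (McNamara-style PDFs):

    ```python
    import numpy as np

    fs = 100.0                                   # sampling rate, Hz
    t = np.arange(0, 10, 1 / fs)
    # Synthetic record: a 5 Hz signal plus background noise
    x = np.sin(2 * np.pi * 5 * t) + 0.1 * np.random.default_rng(0).normal(size=t.size)

    # One-sided periodogram estimate of the power spectral density
    X = np.fft.rfft(x)
    psd = np.abs(X) ** 2 / (fs * t.size)
    psd[1:-1] *= 2                               # fold negative frequencies
    freqs = np.fft.rfftfreq(t.size, 1 / fs)
    peak_hz = freqs[np.argmax(psd)]              # dominant frequency
    ```

    Comparing such spectra against reference noise models makes it easy to flag stations whose background level has degraded.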

  17. Graphical User Interface for Simulink Integrated Performance Analysis Model

    NASA Technical Reports Server (NTRS)

    Durham, R. Caitlyn

    2009-01-01

    The J-2X engine (built by Pratt & Whitney Rocketdyne), in the Upper Stage of the Ares I Crew Launch Vehicle, will only start within a certain range of temperature and pressure for the liquid hydrogen and liquid oxygen propellants. The purpose of the Simulink Integrated Performance Analysis Model is to verify that in all reasonable conditions the temperature and pressure of the propellants are within the required J-2X engine start boxes. In order to run the simulation, test variables must be entered at all reasonable values of parameters such as heat leak and mass flow rate. To make this testing process as efficient as possible, saving the maximum amount of time and money, and to show that the J-2X engine will start when required, a graphical user interface (GUI) was created to allow the input of values to be used as parameters in the Simulink model without opening or altering the contents of the model. The GUI must allow test data to come from Microsoft Excel files, allow those values to be edited before testing, place those values into the Simulink model, and get the output from the Simulink model. The GUI was built using MATLAB, and runs the Simulink simulation when the Simulate option is activated. After running the simulation, the GUI constructs a new Microsoft Excel file, as well as a MATLAB matrix file, from the output values of each test so that they may be graphed and compared to other values.

  18. VALIDATION GUIDELINES FOR LABORATORIES PERFORMING FORENSIC ANALYSIS OF CHEMICAL TERRORISM

    EPA Science Inventory

    The Scientific Working Group on Forensic Analysis of Chemical Terrorism (SWGFACT) has developed the following guidelines for laboratories engaged in the forensic analysis of chemical evidence associated with terrorism. This document provides a baseline framework and guidance for...

  19. Analysis of Aurora's Performance Simulation Engine for Three Systems

    SciTech Connect

    Freeman, Janine; Simon, Joseph

    2015-07-07

    Aurora Solar Inc. is building a cloud-based optimization platform to automate the design, engineering, and permit generation process of solar photovoltaic (PV) installations. They requested that the National Renewable Energy Laboratory (NREL) validate the performance of the PV system performance simulation engine of Aurora Solar’s solar design platform, Aurora. In previous work, NREL performed a validation of multiple other PV modeling tools, so this study builds upon that work by examining all of the same fixed-tilt systems with available module datasheets that NREL selected and used in the aforementioned study. Aurora Solar set up these three operating PV systems in their modeling platform using NREL-provided system specifications and concurrent weather data. NREL then verified the setup of these systems, ran the simulations, and compared the Aurora-predicted performance data to measured performance data for those three systems, as well as to performance data predicted by other PV modeling tools.

  20. An empirical performance analysis of commodity memories in commodity servers

    SciTech Connect

    Kerbyson, D. J.; Lang, M. K.; Patino, G.

    2004-01-01

    This work details a performance study of six different commodity memories in two commodity server nodes, using a number of microbenchmarks that measure low-level performance characteristics as well as two applications representative of the ASCI workload. The memories vary both in terms of performance, including latency and bandwidth, and in terms of their physical properties and manufacturer. Two server nodes were used: one Itanium-II Madison based system and one Xeon based system. All the memories examined can be used within both processing nodes. This allows the performance of the memories to be directly examined while keeping all other factors within a processing node the same (processor, motherboard, operating system, etc.). The results of this study show that there can be a significant difference in application performance between the different memories - by as much as 20%. Thus, by choosing the most appropriate memory for a processing node at a minimal cost differential, significantly improved performance may be achievable.

  1. Multidimensional scaling analysis of simulated air combat maneuvering performance data.

    PubMed

    Polzella, D J; Reid, G B

    1989-02-01

    This paper describes the decomposition of air combat maneuvering by means of multidimensional scaling (MDS). MDS analyses were applied to performance data obtained from expert and novice pilots during simulated air-to-air combat. The results of these analyses revealed that the performance of expert pilots is characterized by advantageous maneuverability and intelligent energy management. It is argued that MDS, unlike simpler metrics, permits the investigator to achieve greater insights into the underlying structure associated with performance of a complex task.

  2. Student academic performance analysis using fuzzy C-means clustering

    NASA Astrophysics Data System (ADS)

    Rosadi, R.; Akamal; Sudrajat, R.; Kharismawan, B.; Hambali, Y. A.

    2017-01-01

    Grade Point Average (GPA) is commonly used as an indicator of academic performance, and its evaluation is a basic way to track the progression of student performance. When evaluating students' academic performance, the data are often grouped, especially when the amount of data is large; grouping reveals the patterns of relationships within and among groups. Grouping can be accomplished with a clustering method, one of which is the Fuzzy C-Means algorithm. In this study, the algorithm is applied to a set of student data from the Faculty of Mathematics and Natural Sciences, Padjadjaran University.
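The grouping step described above can be sketched with a minimal one-dimensional Fuzzy C-Means implementation. The GPA values, cluster count, and fuzzifier m = 2 used below are illustrative assumptions, not the study's actual data or settings.

```python
import random

def fuzzy_c_means(data, c=2, m=2.0, iters=100, seed=0):
    """Minimal 1-D Fuzzy C-Means: returns cluster centers and memberships."""
    rng = random.Random(seed)
    n = len(data)
    # random initial membership matrix, each row normalised to sum to 1
    u = []
    for _ in range(n):
        row = [rng.random() for _ in range(c)]
        s = sum(row)
        u.append([w / s for w in row])
    for _ in range(iters):
        # update centers as fuzzily weighted means
        centers = []
        for j in range(c):
            num = sum((u[i][j] ** m) * data[i] for i in range(n))
            den = sum(u[i][j] ** m for i in range(n))
            centers.append(num / den)
        # update memberships from distances to centers
        for i in range(n):
            d = [abs(data[i] - v) or 1e-12 for v in centers]
            for j in range(c):
                u[i][j] = 1.0 / sum((d[j] / d[k]) ** (2.0 / (m - 1.0))
                                    for k in range(c))
    return centers, u
```

Each student's row of memberships sums to 1, so a student can partially belong to several performance groups rather than being hard-assigned to one.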

  3. The Relationship between Performance and Satisfaction: A Utility Analysis.

    DTIC Science & Technology

    1985-03-01

    Organizational Behavior Research, Department of Management, Department of Psychology...will briefly review present thought on the satisfaction-performance relationship, and then turn to an explanation for the proposed curvilinear...implications of this new approach will be discussed. The Relationship between Performance and Satisfaction: Since the consistent finding in several reviews

  4. Pitch Error Analysis of Young Piano Students' Music Reading Performances

    ERIC Educational Resources Information Center

    Rut Gudmundsdottir, Helga

    2010-01-01

    This study analyzed the music reading performances of 6-13-year-old piano students (N = 35) in their second year of piano study. The stimuli consisted of three piano pieces, systematically constructed to vary in terms of left-hand complexity and input simultaneity. The music reading performances were recorded digitally and a code of error analysis…

  5. Manual control analysis of drug effects on driving performance

    NASA Technical Reports Server (NTRS)

    Smiley, A.; Ziedman, K.; Moskowitz, H.

    1981-01-01

    The effects of secobarbital, diazepam, alcohol, and marihuana on car-driver transfer functions obtained using a driving simulator were studied. The first three substances, all CNS depressants, reduced gain, crossover frequency, and coherence which resulted in poorer tracking performance. Marihuana also impaired tracking performance but the only effect on the transfer function parameters was to reduce coherence.

  6. An Analysis of Parents' Attitudes towards Authentic Performance Assessment.

    ERIC Educational Resources Information Center

    Xue, Yange; Meisels, Samuel J.; Bickel, Donna DiPrima; Nicholson, Julie; Atkins-Burnett, Sally

    This study focused on parents' reactions to the implementation of a curriculum-embedded performance assessment for young children. It examines the Work Sampling System (WSS) (S. Meisels, J. Jablon, D. Marsden, M. Dichtelmiller, A. Dorfman, and D. Steele, 1994), a continuous progress performance assessment system that offers an alternative to…

  7. An Analysis of a High Performing School District's Culture

    ERIC Educational Resources Information Center

    Corum, Kenneth D.; Schuetz, Todd B.

    2012-01-01

    This report describes a problem based learning project focusing on the cultural elements of a high performing school district. Current literature on school district culture provides numerous cultural elements that are present in high performing school districts. With the current climate in education placing pressure on school districts to perform…

  8. Modelling and performance analysis of four and eight element TCAS

    NASA Technical Reports Server (NTRS)

    Sampath, K. S.; Rojas, R. G.; Burnside, W. D.

    1990-01-01

    This semi-annual report describes the work performed during the period September 1989 through March 1990. The first section presents a description of the effect of the engines of the Boeing 737-200 on the performance of a bottom-mounted, eight-element traffic alert and collision avoidance system (TCAS). The second section deals exclusively with a four-element TCAS antenna. The model obtained to simulate the four-element TCAS and new algorithms developed for studying its performance are described. The effect of location on its performance when mounted on top of a Boeing 737-200 operating at 1060 MHz is discussed. It was found that the four-element TCAS generally does not perform as well as the eight-element TCAS III.

  9. Sensitivity analysis and performance estimation of refractivity from clutter techniques

    NASA Astrophysics Data System (ADS)

    Yardim, Caglar; Gerstoft, Peter; Hodgkiss, William S.

    2009-02-01

    Refractivity from clutter (RFC) refers to techniques that estimate the atmospheric refractivity profile from radar clutter returns. An RFC algorithm works by finding the environment whose simulated clutter pattern matches the radar-measured one. This paper introduces a procedure to compute RFC estimator performance. It addresses the major factors that affect estimator performance, such as the radar parameters, the sea surface characteristics, and the environment (region, time of day, season), and formalizes an error metric combining all of these. This is important for applications such as calculating the optimal radar parameters, selecting the best RFC inversion algorithm under a set of conditions, and creating a regional performance map of an RFC system. The performance metric is used to compute the RFC performance of a non-Bayesian evaporation duct estimator. A Bayesian estimator that incorporates meteorological statistics in the inversion is introduced and compared to the non-Bayesian estimator. The performance metric is used to determine the optimal radar parameters of the evaporation duct estimator for six scenarios. An evaporation duct inversion performance map for an S-band radar is created for the larger Mediterranean/Arabian Sea region.

  10. Experimental Analysis of Team Performance: Methodological Developments and Research Results.

    DTIC Science & Technology

    1982-07-06

    The effects of a cooperation contingency on behavior in a continuous three-person environment. Journal of the Experimental Analysis of Behavior, 25...J.V. Effects of a pairing contingency on behavior in a three-person programmed environment. Journal of the Experimental Analysis of Behavior, 1978

  11. Mir Cooperative Solar Array Flight Performance Data and Computational Analysis

    NASA Technical Reports Server (NTRS)

    Kerslake, Thomas W.; Hoffman, David J.

    1997-01-01

    The Mir Cooperative Solar Array (MCSA) was developed jointly by the United States (US) and Russia to provide approximately 6 kW of photovoltaic power to the Russian space station Mir. The MCSA was launched to Mir in November 1995 and installed on the Kvant-1 module in May 1996. Since the MCSA photovoltaic panel modules (PPMs) are nearly identical to those of the International Space Station (ISS) photovoltaic arrays, MCSA operation offered an opportunity to gather multi-year performance data on this technology prior to its implementation on ISS. Two specially designed test sequences were executed in June and December 1996 to measure MCSA performance. Each test period encompassed 3 orbital revolutions whereby the current produced by the MCSA channels was measured. The temperature of MCSA PPMs was also measured. To better interpret the MCSA flight data, a dedicated FORTRAN computer code was developed to predict the detailed thermal-electrical performance of the MCSA. Flight data compared very favorably with computational performance predictions. This indicated that the MCSA electrical performance was fully meeting pre-flight expectations. There were no measurable indications of unexpected or precipitous MCSA performance degradation due to contamination or other causes after 7 months of operation on orbit. Power delivered to the Mir bus was lower than desired as a consequence of the retrofitted power distribution cabling. The strong correlation of experimental and computational results further bolsters the confidence level of performance codes used in critical ISS electric power forecasting. In this paper, MCSA flight performance tests are described as well as the computational modeling behind the performance predictions.

  12. Mir Cooperative Solar Array flight performance data and computational analysis

    SciTech Connect

    Kerslake, T.W.; Hoffman, D.J.

    1997-12-31

    The Mir Cooperative Solar Array (MCSA) was developed jointly by the United States (US) and Russia to provide approximately 6 kW of photovoltaic power to the Russian space station Mir. The MCSA was launched to Mir in November 1995 and installed on the Kvant-1 module in May 1996. Since the MCSA photovoltaic panel modules (PPMs) are nearly identical to those of the International Space Station (ISS) photovoltaic arrays, MCSA operation offered an opportunity to gather multi-year performance data on this technology prior to its implementation on ISS. Two specially designed test sequences were executed in June and December 1996 to measure MCSA performance. Each test period encompassed 3 orbital revolutions whereby the current produced by the MCSA channels was measured. The temperature of MCSA PPMs was also measured. To better interpret the MCSA flight data, a dedicated FORTRAN computer code was developed to predict the detailed thermal-electrical performance of the MCSA. Flight data compared very favorably with computational performance predictions. This indicated that the MCSA electrical performance was fully meeting pre-flight expectations. There were no measurable indications of unexpected or precipitous MCSA performance degradation due to contamination or other causes after 7 months of operation on orbit. Power delivered to the Mir bus was lower than desired as a consequence of the retrofitted power distribution cabling. The strong correlation of experimental and computational results further bolsters the confidence level of performance codes used in critical ISS electric power forecasting. In this paper, MCSA flight performance tests are described as well as the computational modeling behind the performance predictions.

  13. Issues in performing a network meta-analysis.

    PubMed

    Senn, Stephen; Gavini, Francois; Magrez, David; Scheen, André

    2013-04-01

    The example of the analysis of a collection of trials in diabetes, consisting of a sparsely connected network of 10 treatments, is used to make some points about approaches to analysis. In particular, various graphical and tabular presentations, both of the network and of the results, are provided, and the connection to the literature on incomplete blocks is made. It is clear from this example that it is inappropriate to treat the main effect of trial as random, and the implications of this for analysis are discussed. It is also argued that the generalisation from a classic random-effects meta-analysis to one applied to a network usually involves strong assumptions about the variance components involved. Despite this, it is concluded that such an analysis can be a useful way of exploring a set of trials.
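A minimal sketch of the fixed-effect version of such an analysis, consistent with the abstract's argument for treating the trial effect as fixed: each trial contributes within-trial treatment contrasts, and basic treatment parameters (relative to a reference treatment) are recovered by weighted least squares. The treatment labels, contrast estimates, and variances below are hypothetical, not the diabetes data from the paper.

```python
def solve(a, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(b)
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = a[r][col] / a[col][col]
            for c in range(col, n):
                a[r][c] -= f * a[col][c]
            b[r] -= f * b[col]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (b[r] - sum(a[r][c] * x[c] for c in range(r + 1, n))) / a[r][r]
    return x

def network_meta_fixed(contrasts, n_treat, ref=0):
    """Fixed-effect network meta-analysis via weighted least squares.

    contrasts: list of (i, j, estimate, variance), the within-trial
    effect of treatment j relative to treatment i.
    Returns effects of all non-reference treatments versus `ref`.
    """
    params = [t for t in range(n_treat) if t != ref]
    idx = {t: k for k, t in enumerate(params)}
    p = len(params)
    xtx = [[0.0] * p for _ in range(p)]
    xty = [0.0] * p
    for i, j, est, var in contrasts:
        w = 1.0 / var                      # inverse-variance weight
        row = [0.0] * p
        if j != ref:
            row[idx[j]] += 1.0
        if i != ref:
            row[idx[i]] -= 1.0
        for r in range(p):
            for c in range(p):
                xtx[r][c] += w * row[r] * row[c]
            xty[r] += w * row[r] * est
    sol = solve(xtx, xty)
    return {t: sol[idx[t]] for t in params}
```

For the consistent triangle 0→1 = 1, 1→2 = 1, 0→2 = 2 (unit variances), the recovered effects versus treatment 0 are exactly 1 and 2; an inconsistent network would instead be reconciled by the weights.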

  14. Imaging Performance Analysis of Simbol-X with Simulations

    SciTech Connect

    Chauvin, M.; Roques, J. P.

    2009-05-11

    Simbol-X is an X-ray telescope operating in formation flight, which means that its optical performance will strongly depend on the drift of the two spacecraft and on the ability to measure these drifts for image reconstruction. We built a dynamical ray-tracing code to study the impact of these parameters on the optical performance of Simbol-X (see Chauvin et al., these proceedings). Using the simulation tool we have developed, we have conducted detailed analyses of the impact of different parameters on the imaging performance of the Simbol-X telescope.

  15. Imaging Performance Analysis of Simbol-X with Simulations

    NASA Astrophysics Data System (ADS)

    Chauvin, M.; Roques, J. P.

    2009-05-01

    Simbol-X is an X-ray telescope operating in formation flight, which means that its optical performance will strongly depend on the drift of the two spacecraft and on the ability to measure these drifts for image reconstruction. We built a dynamical ray-tracing code to study the impact of these parameters on the optical performance of Simbol-X (see Chauvin et al., these proceedings). Using the simulation tool we have developed, we have conducted detailed analyses of the impact of different parameters on the imaging performance of the Simbol-X telescope.

  16. Performance Analysis of FSO Communication Using Different Coding Schemes

    NASA Astrophysics Data System (ADS)

    Gupta, Nidhi; Prakash, Siddi Jai; Kaushal, Hemani; Jain, V. K.; Kar, Subrat

    2011-10-01

    A major impairment in Free Space Optical (FSO) links is turbulence-induced fading, which severely degrades link performance. To mitigate turbulence-induced fading and thereby improve the error rate performance, error control coding schemes can be used. In this paper, we investigate the bit error performance of FSO links with different coding techniques over log-normal atmospheric turbulence fading channels. The modulation scheme considered is BPSK. On the basis of results computed using Monte Carlo simulation, a comparative study of uncoded and coded systems is made.
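The uncoded baseline of such a Monte Carlo study can be sketched as follows. The log-intensity standard deviation and bit count are illustrative assumptions, and the coding schemes compared in the paper are not implemented here.

```python
import math
import random

def ber_bpsk_lognormal(snr_db, sigma=0.3, n_bits=100_000, seed=1):
    """Monte Carlo BER of uncoded BPSK over a log-normal intensity
    fading channel; sigma is the log-intensity standard deviation and
    the mean intensity is normalised to 1."""
    rng = random.Random(seed)
    es = 10 ** (snr_db / 10.0)  # symbol-energy-to-noise-variance ratio
    errors = 0
    for _ in range(n_bits):
        b = 1 if rng.random() < 0.5 else -1
        # unit-mean log-normal intensity: I = exp(Z), Z ~ N(-sigma^2/2, sigma^2)
        intensity = math.exp(rng.gauss(-sigma * sigma / 2.0, sigma))
        r = b * math.sqrt(es * intensity) + rng.gauss(0.0, 1.0)
        if (r > 0) != (b > 0):  # hard decision on the sign
            errors += 1
    return errors / n_bits
```

Coded performance would be obtained by passing encoded bit streams through the same channel model and decoding before counting errors.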

  17. Performance analysis of job scheduling policies in parallel supercomputing environments

    SciTech Connect

    Naik, V.K.; Squillante, M.S.; Setia, S.K.

    1993-12-31

    In this paper the authors analyze three general classes of scheduling policies under a workload typical of large-scale scientific computing. These policies differ in the manner in which processors are partitioned among the jobs as well as the way in which jobs are prioritized for execution on the partitions. Their results indicate that existing static schemes do not perform well under varying workloads. Adaptive policies tend to make better scheduling decisions, but their ability to adjust to workload changes is limited. Dynamic partitioning policies, on the other hand, yield the best performance and can be tuned to provide desired performance differences among jobs with varying resource demands.

  18. Performance analysis of morphological component analysis (MCA) method for mammograms using some statistical features

    NASA Astrophysics Data System (ADS)

    Gardezi, Syed Jamal Safdar; Faye, Ibrahima; Kamel, Nidal; Eltoukhy, Mohamed Meselhy; Hussain, Muhammad

    2014-10-01

    Early detection of breast cancer helps reduce mortality rates, and mammography is a very useful tool for breast cancer detection. However, it is very difficult to separate different morphological features in mammographic images. In this study, the Morphological Component Analysis (MCA) method is used to extract different morphological aspects of mammographic images while effectively preserving the morphological characteristics of regions. MCA decomposes the mammogram into a piecewise-smooth part and a texture part using the Local Discrete Cosine Transform (LDCT) and the Curvelet Transform via wrapping (CURVwrap). A simple performance comparison is made, using some statistical features, between the original image and the piecewise-smooth part obtained from the MCA decomposition. The results show that MCA suppresses structural noise and blood vessels in the mammogram and enhances the performance of mass detection.

  19. Analysis of complex network performance and heuristic node removal strategies

    NASA Astrophysics Data System (ADS)

    Jahanpour, Ehsan; Chen, Xin

    2013-12-01

    Removing important nodes from complex networks is a great challenge in fighting against criminal organizations and preventing disease outbreaks. Six network performance metrics, including four new metrics, are applied to quantify networks' diffusion speed, diffusion scale, homogeneity, and diameter. In order to efficiently identify nodes whose removal maximally destroys a network, i.e., minimizes network performance, ten structured heuristic node removal strategies are designed using different node centrality metrics, including degree, betweenness, reciprocal closeness, complement-derived closeness, and eigenvector centrality. These strategies are applied to remove nodes from the September 11, 2001 hijackers' network, and their performance is compared to that of a random strategy, which removes randomly selected nodes, and to the locally optimal solution (LOS), which removes nodes to minimize network performance at each step. The computational complexity of the 11 strategies and LOS is also analyzed. Results show that the node removal strategies using degree and betweenness centralities are more efficient than other strategies.
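One of the simplest strategies of this kind, removal by degree centrality, can be sketched on a toy undirected network. The adjacency data and the largest-connected-component size used as a performance proxy below are illustrative assumptions, not the paper's actual metrics or data.

```python
def degree_removal_order(adj, k):
    """Greedy degree strategy: repeatedly remove the highest-degree node.
    adj: dict node -> set of neighbours (undirected, no self-loops)."""
    adj = {u: set(vs) for u, vs in adj.items()}  # work on a copy
    removed = []
    for _ in range(k):
        target = max(adj, key=lambda u: len(adj[u]))
        removed.append(target)
        for v in adj.pop(target):
            adj[v].discard(target)
    return removed

def largest_component(adj):
    """Size of the largest connected component (a simple damage proxy)."""
    seen, best = set(), 0
    for s in adj:
        if s in seen:
            continue
        stack, comp = [s], 0
        seen.add(s)
        while stack:
            u = stack.pop()
            comp += 1
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
        best = max(best, comp)
    return best
```

On a star graph the hub is removed first, shattering the network into isolated nodes; betweenness-based strategies follow the same greedy loop with a different centrality score.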

  20. Performance analysis of ten brands of batteries for hearing aids

    PubMed Central

    Penteado, Silvio Pires; Bento, Ricardo Ferreira

    2013-01-01

    Introduction: Comparing the performance of hearing instrument batteries from various manufacturers can enable otologists, audiologists, or final consumers to select the best products, maximizing the use of these materials. Aim: To analyze the performance of ten brands of batteries for hearing aids available in the Brazilian marketplace. Methods: Hearing aid batteries in four sizes were acquired from ten manufacturers and subjected to the same test conditions in an acoustic laboratory. Results: The laboratory results, contrasted with the values reported by the manufacturers, revealed significant discrepancies; certain brands in certain sizes performed better on some tests, but no single brand was best in all sizes. Conclusions: It was possible to investigate the performance of ten brands of hearing aid batteries and to describe the procedures to be followed in case of leakage, accidental intake, and disposal. PMID:25992026

  1. Preliminary basic performance analysis of the Cedar multiprocessor memory system

    NASA Technical Reports Server (NTRS)

    Gallivan, K.; Jalby, W.; Turner, S.; Veidenbaum, A.; Wijshoff, H.

    1991-01-01

    Some preliminary basic results on the performance of the Cedar multiprocessor memory system are presented. Empirical results are presented and used to calibrate a memory system simulator which is then used to discuss the scalability of the system.

  2. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    EPA Pesticide Factsheets

    The model performance evaluation consists of metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit measures that capture magnitude-only, sequence-only, and combined magnitude and sequence errors.

  3. Assessing BMP Performance Using Microtox® Toxicity Analysis

    EPA Science Inventory

    Best Management Practices (BMPs) have been shown to be effective in reducing runoff and pollutants from urban areas and thus provide a mechanism to improve downstream water quality. Currently, BMP performance regarding water quality improvement is assessed through measuring each...

  4. Parent involvement and student academic performance: A multiple mediational analysis

    PubMed Central

    Topor, David R.; Keane, Susan P.; Shelton, Terri L.; Calkins, Susan D.

    2011-01-01

    Parent involvement in a child's education is consistently found to be positively associated with a child's academic performance. However, there has been little investigation of the mechanisms that explain this association. The present study examines two potential mechanisms of this association: the child's perception of cognitive competence and the quality of the student-teacher relationship. This study used a sample of 158 seven-year old participants, their mothers, and their teachers. Results indicated a statistically significant association between parent involvement and a child's academic performance, over and above the impact of the child's intelligence. A multiple mediation model indicated that the child's perception of cognitive competence fully mediated the relation between parent involvement and the child's performance on a standardized achievement test. The quality of the student-teacher relationship fully mediated the relation between parent involvement and teacher ratings of the child's classroom academic performance. Limitations, future research directions, and implications for public policy initiatives were discussed. PMID:20603757

  5. Assessing BMP Performance Using Microtox Toxicity Analysis - Rhode Island

    EPA Science Inventory

    Best Management Practices (BMPs) have been shown to be effective in reducing runoff and pollutants from urban areas and thus provide a mechanism to improve downstream water quality. Currently, BMP performance regarding water quality improvement is assessed through measuring each...

  6. Parent involvement and student academic performance: a multiple mediational analysis.

    PubMed

    Topor, David R; Keane, Susan P; Shelton, Terri L; Calkins, Susan D

    2010-01-01

    Parent involvement in a child's education is consistently found to be positively associated with a child's academic performance. However, there has been little investigation of the mechanisms that explain this association. The present study examines two potential mechanisms of this association: the child's perception of cognitive competence and the quality of the student-teacher relationship. This study used a sample of 158 seven-year-old participants, their mothers, and their teachers. Results indicated a statistically significant association between parent involvement and a child's academic performance, over and above the impact of the child's intelligence. A multiple mediation model indicated that the child's perception of cognitive competence fully mediated the relation between parent involvement and the child's performance on a standardized achievement test. The quality of the student-teacher relationship fully mediated the relation between parent involvement and teacher ratings of the child's classroom academic performance. Limitations, future research directions, and implications for public policy initiatives are discussed.

  7. Moisture and Structural Analysis for High Performance Hybrid Wall Assemblies

    SciTech Connect

    Grin, A.; Lstiburek, J.

    2012-09-01

    This report describes the work conducted by the Building Science Corporation (BSC) Building America Research Team's 'Energy Efficient Housing Research Partnerships' project. Based on past experience in the Building America program, BSC has found that combinations of materials and approaches (in other words, systems) usually provide optimum performance. No single manufacturer typically provides all of the components for an assembly, nor has the specific understanding of all the individual components necessary for optimum performance.

  8. Distributed Sensor Fusion Performance Analysis Under an Uncertain Environment

    DTIC Science & Technology

    2012-10-01

    cannot be obtained accurately, the sub-optimal fusion processor is assumed to have an estimated correlation coefficient and its performance difference...detectability indices for the sub-optimal and optimal cases is derived as a function of the true correlation coefficient, the estimated value, and the...performance is to a mismatched estimation of the correlation coefficient. Furthermore, we show that for the special case where all local sensors have the

  9. A Performance Analysis of the USAF Work Information Management System

    DTIC Science & Technology

    1990-09-01

    numbers reset when they reach 65535. Two statistics are available from this information that provide a means to evaluate the impact of VTOC I/Os and...One SA recommends page pools on all disks (18). The decision on number and placement of page pools can best be made with a complete evaluation of I...provide economic evaluation or justification for performance improvement alternatives. Several methods are available to measure performance. Benchmark

  10. Using Weibull Distribution Analysis to Evaluate ALARA Performance

    SciTech Connect

    E. L. Frome, J. P. Watkins, and D. A. Hagemeyer

    2009-10-01

    As Low as Reasonably Achievable (ALARA) is the underlying principle for protecting nuclear workers from potential health outcomes related to occupational radiation exposure. Radiation protection performance is currently evaluated by measures such as collective dose and average measurable dose, which do not indicate ALARA performance. The purpose of this work is to show how statistical modeling of individual doses using the Weibull distribution can provide objective supplemental performance indicators for comparing ALARA implementation among sites and for insights into ALARA practices within a site. Maximum likelihood methods were employed to estimate the Weibull shape and scale parameters used for performance indicators. The shape parameter reflects the effectiveness of maximizing the number of workers receiving lower doses and is represented as the slope of the fitted line on a Weibull probability plot. Additional performance indicators derived from the model parameters include the 99th percentile and the exceedance fraction. When grouping sites by collective total effective dose equivalent (TEDE) and ranking by 99th percentile with confidence intervals, differences in performance among sites can be readily identified. Applying this methodology will enable more efficient and complete evaluation of the effectiveness of ALARA implementation.
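The model-based indicators described above can be sketched as follows: maximum-likelihood fitting of the Weibull shape and scale (here via bisection on the standard profile-likelihood equation), a quantile such as the 99th percentile, and the exceedance fraction above a dose threshold. This is a sketch of the standard Weibull results the abstract names; it omits the confidence intervals used in the study, and any dose data fed to it would be simulated, not the occupational records analyzed there.

```python
import math

def weibull_mle(x, tol=1e-9):
    """MLE of Weibull shape k and scale lam (x strictly positive),
    via bisection on the profile-likelihood equation for k."""
    logs = [math.log(v) for v in x]
    mlog = sum(logs) / len(x)

    def g(k):  # increasing in k; its root is the shape MLE
        num = sum(v ** k * math.log(v) for v in x)
        den = sum(v ** k for v in x)
        return num / den - 1.0 / k - mlog

    lo, hi = 1e-3, 100.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            hi = mid
        else:
            lo = mid
    k = 0.5 * (lo + hi)
    lam = (sum(v ** k for v in x) / len(x)) ** (1.0 / k)
    return k, lam

def weibull_quantile(k, lam, p):
    """p-th quantile, e.g. p = 0.99 for the 99th-percentile indicator."""
    return lam * (-math.log(1.0 - p)) ** (1.0 / k)

def exceedance_fraction(k, lam, t):
    """Model fraction of doses above threshold t: the Weibull survival function."""
    return math.exp(-(t / lam) ** k)
```

A larger fitted shape parameter corresponds to a steeper slope on the Weibull probability plot, i.e. more workers concentrated at lower doses.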

  11. Social Cognitive Career Theory, Conscientiousness, and Work Performance: A Meta-Analytic Path Analysis

    ERIC Educational Resources Information Center

    Brown, Steven D.; Lent, Robert W.; Telander, Kyle; Tramayne, Selena

    2011-01-01

    We performed a meta-analytic path analysis of an abbreviated version of social cognitive career theory's (SCCT) model of work performance (Lent, Brown, & Hackett, 1994). The model we tested included the central cognitive predictors of performance (ability, self-efficacy, performance goals), with the exception of outcome expectations. Results…

  12. Simulation and performance analysis of triple-effect absorption cycles

    SciTech Connect

    Grossman, G.; Wilk, M.; DeVault, R.C.

    1993-08-01

    Performance simulation has been carried out for several triple-effect cycles, designed to improve utilization of high temperature heat sources for absorption systems and capable of substantial performance improvement over equivalent double-effect cycles. The systems investigated include the three-condenser-three-desorber (3C3D) cycle, forming an extension of the conventional double-effect one; the recently proposed Double Condenser Coupled (DCC) cycle, which recovers heat from the hot condensate leaving the high temperature condensers and adds it to the lower temperature desorbers; and the dual loop cycle comprising two complete single-effect loops, recovering heat from the condenser and absorber of one loop to the desorber of the other loop and generating a cooling effect in the evaporators of both loops. A modular computer code for simulation of absorption systems was used to investigate the performance of the cycles and compare them on an equivalent basis, by selecting a common reference design and operating condition. Performance simulation was carried out over a range of operating conditions, including some investigation of the influence of the design parameters. Coefficients of performance ranging from 1.27 for the series-flow 3C3D to 1.73 for the parallel-flow DCC have been calculated at the design point. The relative merits and shortcomings of the different cycle configurations have been studied.

  13. Full-Envelope Launch Abort System Performance Analysis Methodology

    NASA Technical Reports Server (NTRS)

    Aubuchon, Vanessa V.

    2014-01-01

    The implementation of a new dispersion methodology is described, which disperses abort initiation altitude or time along with all other Launch Abort System (LAS) parameters during Monte Carlo simulations. In contrast, the standard methodology assumes that an abort initiation condition is held constant (e.g., aborts initiated at altitude for Mach 1, altitude for maximum dynamic pressure, etc.) while dispersing other LAS parameters. The standard method results in large gaps in performance information due to the discrete nature of initiation conditions, while the full-envelope dispersion method provides a significantly more comprehensive assessment of LAS abort performance for the full launch vehicle ascent flight envelope and identifies performance "pinch-points" that may occur at flight conditions outside of those contained in the discrete set. The new method has significantly increased the fidelity of LAS abort simulations and confidence in the results.

  14. Trajectory analysis and performance for SEP Comet Encke missions

    NASA Technical Reports Server (NTRS)

    Sauer, C. G., Jr.

    1973-01-01

    A summary of the performance of Solar Electric Propulsion spacecraft for Comet Encke missions for the 1980, 1984 and 1987 mission opportunities is presented, together with a description of the spacecraft trajectory for each opportunity. Included are data for rendezvous trajectories for all three opportunities and for a slow flyby mission during the 1980 opportunity. A range of propulsion system input powers of 10 to 20 kW is considered, together with a constant spacecraft power requirement of 400 W. The performance presented in this paper is indicative of that using 30-cm mercury electron-bombardment thrusters that are currently being developed. Performance is given in terms of final spacecraft mass and is thus independent of any particular spacecraft design concept.

  15. Off-design performance analysis of MHD generator channels

    NASA Astrophysics Data System (ADS)

    Wilson, D. R.; Williams, T. S.

    1980-01-01

    A computer code for performing parametric design point calculations, and evaluating the off-design performance of MHD generators has been developed. The program is capable of analyzing Faraday, Hall, and DCW channels, including the effect of electrical shorting in the gas boundary layers and coal slag layers. Direct integration of the electrode voltage drops is included. The program can be run in either the design or off-design mode. Details of the computer code, together with results of a study of the design and off-design performance of the proposed ETF MHD generator are presented. Design point variations of pre-heat and stoichiometry were analyzed. The off-design study included variations in mass flow rate and oxygen enrichment.

  16. Hydrogen engine performance analysis project. Second annual report

    SciTech Connect

    Adt, Jr., R. R.; Swain, M. R.; Pappas, J. M.

    1980-01-01

    Progress in a 3 year research program to evaluate the performance and emission characteristics of hydrogen-fueled internal combustion engines is reported. Fifteen hydrogen engine configurations will be subjected to performance and emissions characterization tests. During the first two years, baseline data for throttled and unthrottled, carburetted and timed hydrogen induction, Pre IVC hydrogen-fueled engine configurations, with and without exhaust gas recirculation (EGR) and water injection, were obtained. These data, along with descriptions of the test engine and its components, the test apparatus, experimental techniques, experiments performed and the results obtained, are given. Analyses of other hydrogen-engine project data are also presented and compared with the results of the present effort. The unthrottled engine vis-a-vis the throttled engine is found, in general, to exhibit higher brake thermal efficiency. The unthrottled engine also yields lower NO/sub x/ emissions, which were found to be a strong function of fuel-air equivalence ratio. (LCL)

  17. An Analysis of Performance Enhancement Techniques for Overset Grid Applications

    NASA Technical Reports Server (NTRS)

    Djomehri, J. J.; Biswas, R.; Potsdam, M.; Strawn, R. C.; Biegel, Bryan (Technical Monitor)

    2002-01-01

The overset grid methodology has significantly reduced the time-to-solution of high-fidelity computational fluid dynamics (CFD) simulations about complex aerospace configurations. The solution process resolves the geometrical complexity of the problem domain by using separately generated but overlapping structured discretization grids that periodically exchange information through interpolation. However, high-performance computations of such large-scale realistic applications must be handled efficiently on state-of-the-art parallel supercomputers. This paper analyzes the effects of various performance enhancement techniques on the parallel efficiency of an overset grid Navier-Stokes CFD application running on an SGI Origin2000 machine. Specifically, the roles of asynchronous communication, grid splitting, and grid grouping strategies are presented and discussed. Results indicate that performance depends critically on the level of latency hiding and the quality of load balancing across the processors.

  18. Comparison of performance between rescaled range analysis and rescaled variance analysis in detecting abrupt dynamic change

    NASA Astrophysics Data System (ADS)

    He, Wen-Ping; Liu, Qun-Qun; Jiang, Yun-Di; Lu, Ying

    2015-04-01

    In the present paper, a comparison of the performance between moving cutting data-rescaled range analysis (MC-R/S) and moving cutting data-rescaled variance analysis (MC-V/S) is made. The results clearly indicate that the operating efficiency of the MC-R/S algorithm is higher than that of the MC-V/S algorithm. In our numerical test, the computer time consumed by MC-V/S is approximately 25 times that by MC-R/S for an identical window size in artificial data. Except for the difference in operating efficiency, there are no significant differences in performance between MC-R/S and MC-V/S for the abrupt dynamic change detection. MC-R/S and MC-V/S both display some degree of anti-noise ability. However, it is important to consider the influences of strong noise on the detection results of MC-R/S and MC-V/S in practical application processes. Project supported by the National Basic Research Program of China (Grant No. 2012CB955902) and the National Natural Science Foundation of China (Grant Nos. 41275074, 41475073, and 41175084).
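The rescaled range statistic underlying MC-R/S can be sketched in a few lines. This is a generic textbook R/S estimator of the Hurst exponent, not the authors' moving-cutting implementation, and the window sizes below are chosen purely for illustration:

```python
import numpy as np

def rescaled_range(series):
    """Classical R/S statistic for one window of data."""
    x = np.asarray(series, dtype=float)
    y = np.cumsum(x - x.mean())          # cumulative deviation from the window mean
    r = y.max() - y.min()                # range of the cumulative deviations
    s = x.std(ddof=0)                    # standard deviation of the window
    return r / s

def hurst_exponent(series, window_sizes):
    """Estimate the Hurst exponent as the slope of log(R/S) vs. log(n)."""
    log_n, log_rs = [], []
    for n in window_sizes:
        # average R/S over non-overlapping windows of length n
        windows = [series[i:i + n] for i in range(0, len(series) - n + 1, n)]
        rs = np.mean([rescaled_range(w) for w in windows])
        log_n.append(np.log(n))
        log_rs.append(np.log(rs))
    slope, _ = np.polyfit(log_n, log_rs, 1)
    return slope

noise = np.random.default_rng(0).standard_normal(4096)
h = hurst_exponent(noise, [16, 32, 64, 128, 256])   # near 0.5 for white noise
```

For uncorrelated noise the estimate hovers around 0.5; persistent series push it toward 1, which is what abrupt-change detectors based on R/S exploit.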

  19. Performance issues for engineering analysis on MIMD parallel computers

    SciTech Connect

    Fang, H.E.; Vaughan, C.T.; Gardner, D.R.

    1994-08-01

    We discuss how engineering analysts can obtain greater computational resolution in a more timely manner from applications codes running on MIMD parallel computers. Both processor speed and memory capacity are important to achieving better performance than a serial vector supercomputer. To obtain good performance, a parallel applications code must be scalable. In addition, the aspect ratios of the subdomains in the decomposition of the simulation domain onto the parallel computer should be of order 1. We demonstrate these conclusions using simulations conducted with the PCTH shock wave physics code running on a Cray Y-MP, a 1024-node nCUBE 2, and an 1840-node Paragon.
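The order-1 aspect-ratio guideline can be checked with a trivial helper; the grid and processor-array dimensions below are hypothetical examples, not taken from the PCTH runs:

```python
def subdomain_shape(nx, ny, px, py):
    """Shape and aspect ratio of each subdomain when an nx-by-ny grid
    is split across a px-by-py processor array (assumes even division)."""
    sx, sy = nx // px, ny // py
    aspect = max(sx, sy) / min(sx, sy)
    return sx, sy, aspect

subdomain_shape(1024, 1024, 32, 32)   # square subdomains, aspect ratio 1.0
subdomain_shape(1024, 1024, 64, 16)   # elongated subdomains, aspect ratio 4.0
```

Elongated subdomains carry more boundary surface per unit volume, so communication cost grows relative to computation, which is why aspect ratios near 1 scale better.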

  20. Virtual Mastoidectomy Performance Evaluation through Multi-Volume Analysis

    PubMed Central

    Kerwin, Thomas; Stredney, Don; Wiet, Gregory; Shen, Han-Wei

    2012-01-01

Purpose: Development of a visualization system that provides surgical instructors with a method to compare the results of many virtual surgeries (n > 100). Methods: A masked distance field models the overlap between expert and resident results. Multiple volume displays are used side-by-side with a 2D point display. Results: Performance characteristics were examined by comparing the results of specific residents with those of experts and the entire class. Conclusions: The software provides a promising approach for comparing performance between large groups of residents learning mastoidectomy techniques. PMID:22528058

  1. A relative performance analysis of atmospheric Laser Doppler Velocimeter methods.

    NASA Technical Reports Server (NTRS)

    Farmer, W. M.; Hornkohl, J. O.; Brayton, D. B.

    1971-01-01

    Evaluation of the effectiveness of atmospheric applications of a Laser Doppler Velocimeter (LDV) at a wavelength of about 0.5 micrometer in conjunction with dual scatter LDV illuminating techniques, or at a wavelength of 10.6 micrometer with local oscillator LDV illuminating techniques. Equations and examples are given to provide a quantitative basis for LDV system selection and performance criteria in atmospheric research. The comparative study shows that specific ranges and conditions exist where performance of one of the methods is superior to that of the other. It is also pointed out that great care must be exercised in choosing system parameters that optimize a particular LDV designed for atmospheric applications.

  2. Performance analysis of different database in new internet mapping system

    NASA Astrophysics Data System (ADS)

    Yao, Xing; Su, Wei; Gao, Shuai

    2017-03-01

In the Mapping System of New Internet, massive numbers of mapping entries between AID and RID need to be stored, added, updated, and deleted. To cope with large volumes of mapping-entry update and query requests, the Mapping System of New Internet must use a high-performance database. In this paper, we focus on the performance of three typical databases, Redis, SQLite, and MySQL; the results show that a Mapping System based on different databases can adapt to different needs according to the actual situation.
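Of the three databases compared, only SQLite ships with the Python standard library, so a self-contained benchmark sketch of the mapping-table workload is easiest to show for it. The AID→RID schema below is a hypothetical stand-in for the paper's mapping entries:

```python
import sqlite3
import time

def bench_sqlite(n=1000):
    """Time n mapping-entry inserts and n point lookups in an
    in-memory SQLite database; returns (insert_seconds, query_seconds)."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE mapping (aid TEXT PRIMARY KEY, rid TEXT)")

    t0 = time.perf_counter()
    con.executemany("INSERT INTO mapping VALUES (?, ?)",
                    [(f"aid{i}", f"rid{i}") for i in range(n)])
    con.commit()
    t_insert = time.perf_counter() - t0

    t0 = time.perf_counter()
    for i in range(n):
        con.execute("SELECT rid FROM mapping WHERE aid = ?",
                    (f"aid{i}",)).fetchone()
    t_query = time.perf_counter() - t0

    con.close()
    return t_insert, t_query
```

Analogous harnesses for Redis and MySQL would need running servers and client libraries, which is exactly the kind of deployment trade-off the paper weighs.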

  3. Performance analysis of the Jersey City Total Energy Site

    NASA Astrophysics Data System (ADS)

    Hurley, C. W.; Ryan, J. D.; Phillips, C. W.

    1982-08-01

Engineering, economic, environmental, and reliability data from a 486-unit apartment/commercial complex were gathered. The complex was designed to recover waste heat from diesel engines, making the central equipment building a total energy (TE) plant. Analysis of the data indicates that a significant savings in fuel is possible through minor modifications in plant procedures. Also included are the results of an analysis of the quality of utility services supplied to consumers on the site and of a series of environmental tests made to determine the effects of the plant on air quality and noise. In general, although those systems utilizing the TE concept showed a significant savings in fuel, such systems do not represent attractive investments compared to conventional systems.

  4. Performance analysis of exam gloves used for aseptic rodent surgery.

    PubMed

    LeMoine, Dana M; Bergdall, Valerie K; Freed, Carrie

    2015-05-01

Aseptic technique includes the use of sterile surgical gloves for survival surgeries in rodents to minimize the incidence of infections. Exam gloves are much less expensive than are surgical gloves and may represent a cost-effective, readily available option for use in rodent surgery. This study examined the effectiveness of surface disinfection of exam gloves with 70% isopropyl alcohol or a solution of hydrogen peroxide and peracetic acid (HP-PA) in reducing bacterial contamination. Performance levels for asepsis were met when gloves were negative for bacterial contamination after surface disinfection and sham 'exertion' activity. According to these criteria, 94% of HP-PA-disinfected gloves passed, compared with 47% of alcohol-disinfected gloves. In addition, the effect of autoclaving on the integrity of exam gloves was examined, given that autoclaving is another readily available option for aseptic preparation. Performance criteria for glove integrity after autoclaving consisted of the ability to don the gloves, followed by successful simulation of wound closure and completion of stretch tests without tearing or observable defects. Using these criteria, 98% of autoclaved nitrile exam gloves and 76% of autoclaved latex exam gloves met performance expectations, compared with the performance of standard surgical gloves (88% nitrile, 100% latex). The results of this study support the use of HP-PA-disinfected latex and nitrile exam gloves or autoclaved nitrile exam gloves as viable, cost-effective alternatives to sterile surgical gloves for rodent surgeries.

  5. Relative Performance of Academic Departments Using DEA with Sensitivity Analysis

    ERIC Educational Resources Information Center

    Tyagi, Preeti; Yadav, Shiv Prasad; Singh, S. P.

    2009-01-01

    The process of liberalization and globalization of Indian economy has brought new opportunities and challenges in all areas of human endeavor including education. Educational institutions have to adopt new strategies to make best use of the opportunities and counter the challenges. One of these challenges is how to assess the performance of…

  6. Performance analysis of vortex based mixers for confined flows

    NASA Astrophysics Data System (ADS)

    Buschhagen, Timo

Hybrid rockets are still sparsely employed in major space or defense projects due to their relatively poor combustion efficiency and low fuel grain regression rates. Although hybrid rockets can claim advantages in safety, environmental impact, and performance over established solid and liquid propellant systems, the boundary-layer combustion process and the diffusion-based mixing within a hybrid rocket grain port leave the core flow unmixed and limit system performance. One principle used to enhance the mixing of gaseous flows is to induce streamwise vorticity. The counter-rotating vortex pair (CVP) mixer utilizes this principle and introduces two vortices into a confined flow, generating a stirring motion that transports near-wall media toward the core and vice versa. Recent studies investigated the velocity field introduced by this type of swirler. The current work evaluates the mixing performance of the CVP concept, using an experimental setup that simulates an axial primary pipe flow with a radially entering secondary flow; the primary flow is altered by the CVP swirler unit. The resulting setup therefore emulates a hybrid rocket motor with a cylindrical single-port grain. To evaluate mixing performance, the secondary-flow concentration at the pipe assembly exit is measured using a pressure-sensitive-paint-based procedure.

  7. Analysis of Factors that Predict Clinical Performance in Medical School

    ERIC Educational Resources Information Center

    White, Casey B.; Dey, Eric L.; Fantone, Joseph C.

    2009-01-01

    Academic achievement indices including GPAs and MCAT scores are used to predict the spectrum of medical student academic performance types. However, use of these measures ignores two changes influencing medical school admissions: student diversity and affirmative action, and an increased focus on communication skills. To determine if GPA and MCAT…

  8. Performance analysis of digital integrate-and-dump filters

    NASA Technical Reports Server (NTRS)

    Chie, C. M.

    1982-01-01

    Key design parameters associated with the operation of a digital integrate-and-dump filter are identified in this paper. Performance degradation effects associated with the selection of these parameters such as analog-to-digital converter (ADC) gain loading factor, number of bits used, predetection bandwidth, sampling rate, and accumulator length are considered. Numerical results of practical interest are also provided.

  9. Performance Factors Analysis -- A New Alternative to Knowledge Tracing

    ERIC Educational Resources Information Center

    Pavlik, Philip I., Jr.; Cen, Hao; Koedinger, Kenneth R.

    2009-01-01

    Knowledge tracing (KT)[1] has been used in various forms for adaptive computerized instruction for more than 40 years. However, despite its long history of application, it is difficult to use in domain model search procedures, has not been used to capture learning where multiple skills are needed to perform a single action, and has not been used…

  10. Meta-Analysis of Predictors of Dental School Performance

    ERIC Educational Resources Information Center

    DeCastro, Jeanette E.

    2012-01-01

    Accurate prediction of which candidates show the most promise of success in dental school is imperative for the candidates, the profession, and the public. Several studies suggested that predental GPAs and the Dental Admissions Test (DAT) produce a range of correlations with dental school performance measures. While there have been similarities,…

  11. Wireless imaging sensor network design and performance analysis

    NASA Astrophysics Data System (ADS)

    Sundaram, Ramakrishnan

    2016-05-01

This paper discusses (a) the design and implementation of the integrated radio tomographic imaging (RTI) interface for radio signal strength (RSS) data obtained from a wireless imaging sensor network (WISN), (b) the use of model-driven methods to determine the extent of regularization to be applied to reconstruct images from the RSS data, and (c) a preliminary study of the performance of the network.

  12. Performance Assessment in Water Polo Using Compositional Data Analysis.

    PubMed

    Ordóñez, Enrique García; Pérez, María Del Carmen Iglesias; González, Carlos Touriño

    2016-12-01

The aim of the present study was to identify groups of offensive performance indicators which best discriminated between match scores (favourable, balanced, or unfavourable) in water polo. The sample comprised 88 regular season games (2011-2014) from the Spanish Professional Water Polo League. The offensive performance indicators were clustered in five groups: Attacks in relation to the different playing situations; Shots in relation to the different playing situations; Attacks outcome; Origin of shots; Technical execution of shots. The variables of each group had a constant sum which equalled 100%. Because the data were compositional, the variables were transformed by means of the additive log-ratio (alr) transformation. Multivariate discriminant analyses to compare the match scores were calculated using the transformed variables. With regard to the percentage of correct classification, the results showed that the group that discriminated the most between the match scores was "Attacks outcome" (60.4% for the original sample and 52.2% for cross-validation). The performance indicators that discriminated the most between the match scores in games with penalties were goals (structure coefficient (SC) = .761), counterattack shots (SC = .541), and counterattacks (SC = .481). In matches without penalties, goals were the primary discriminating factor (SC = .576). This approach provides a new tool to compare the importance of the offensive performance groups and their effect on match score discrimination.
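The additive log-ratio transform used on each compositional group can be sketched as follows; the example composition is illustrative, not taken from the study's data:

```python
import numpy as np

def alr(composition, denom_index=-1):
    """Additive log-ratio transform of a composition (parts summing to
    a constant, e.g. 100%): log of each part over a chosen denominator part."""
    x = np.asarray(composition, dtype=float)
    x = x / x.sum()                           # close the composition to sum to 1
    denom = x[denom_index]
    others = np.delete(x, denom_index % len(x))
    return np.log(others / denom)

# e.g. 60% / 30% / 10% split across three attack categories
alr([60.0, 30.0, 10.0])    # -> [log(6), log(3)]
```

The transformed values live in unconstrained real space, which is what makes standard multivariate discriminant analysis applicable to the originally constant-sum data.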

  13. How Motivation Affects Academic Performance: A Structural Equation Modelling Analysis

    ERIC Educational Resources Information Center

    Kusurkar, R. A.; Ten Cate, Th. J.; Vos, C. M. P.; Westers, P.; Croiset, G.

    2013-01-01

    Few studies in medical education have studied effect of quality of motivation on performance. Self-Determination Theory based on quality of motivation differentiates between Autonomous Motivation (AM) that originates within an individual and Controlled Motivation (CM) that originates from external sources. To determine whether Relative Autonomous…

  14. Performance Analysis of Exam Gloves Used for Aseptic Rodent Surgery

    PubMed Central

    LeMoine, Dana M; Bergdall, Valerie K; Freed, Carrie

    2015-01-01

Aseptic technique includes the use of sterile surgical gloves for survival surgeries in rodents to minimize the incidence of infections. Exam gloves are much less expensive than are surgical gloves and may represent a cost-effective, readily available option for use in rodent surgery. This study examined the effectiveness of surface disinfection of exam gloves with 70% isopropyl alcohol or a solution of hydrogen peroxide and peracetic acid (HP–PA) in reducing bacterial contamination. Performance levels for asepsis were met when gloves were negative for bacterial contamination after surface disinfection and sham ‘exertion’ activity. According to these criteria, 94% of HP–PA-disinfected gloves passed, compared with 47% of alcohol-disinfected gloves. In addition, the effect of autoclaving on the integrity of exam gloves was examined, given that autoclaving is another readily available option for aseptic preparation. Performance criteria for glove integrity after autoclaving consisted of the ability to don the gloves, followed by successful simulation of wound closure and completion of stretch tests without tearing or observable defects. Using these criteria, 98% of autoclaved nitrile exam gloves and 76% of autoclaved latex exam gloves met performance expectations, compared with the performance of standard surgical gloves (88% nitrile, 100% latex). The results of this study support the use of HP–PA-disinfected latex and nitrile exam gloves or autoclaved nitrile exam gloves as viable, cost-effective alternatives to sterile surgical gloves for rodent surgeries. PMID:26045458

  15. Design and performance analysis of digital acoustic underwater telemetry system

    NASA Astrophysics Data System (ADS)

    Catipovic, J. A.; Baggeroer, A. B.; Vonderheydt, K.; Koelsch, D. E.

    1985-11-01

    The work discusses the design and performance characteristics of a Digital Acoustic Telemetry System (DATS) which incorporates the current state-of-the-art technology and is capable of reliable data transmission at rates useful to a wide range of ocean exploration and development gear.

  16. An analysis of performance models for free water surface wetlands.

    PubMed

    Carleton, James N; Montas, Hubert J

    2010-06-01

    Although treatment wetlands are intended to attenuate pollutants, reliably predicting their performance remains a challenge because removal processes are often complex, spatially heterogeneous, and incompletely understood. Although initially popular for characterizing wetland performance, plug flow reactor models are problematic because their parameters exhibit correlation with hydraulic loading. One-dimensional advective-dispersive-reactive models may also be inadequate when longitudinal dispersion is non-Fickian as a result of pronounced transverse gradients in velocity (preferential flow). Models that make use of residence time distributions have shown promise in improving wetland performance characterization, however their applicability may be limited by certain inherent assumptions, e.g. that transverse mixing is nil. A recently-developed bicontinuum (mobile-mobile) model that addresses some of these weaknesses may hold promise for improving wetland performance modeling, however this model has yet to be tested against real-world wetland data. This paper examines the state of the science of free water surface wetland hydrodynamics and transport modeling, discusses the strengths and weaknesses of various steady state models, and compares them to each other in terms of each model's ability to represent data sets from monitored wetlands.

  17. Performance Assessment in Water Polo Using Compositional Data Analysis

    PubMed Central

    Ordóñez, Enrique García; González, Carlos Touriño

    2016-01-01

The aim of the present study was to identify groups of offensive performance indicators which best discriminated between match scores (favourable, balanced, or unfavourable) in water polo. The sample comprised 88 regular season games (2011-2014) from the Spanish Professional Water Polo League. The offensive performance indicators were clustered in five groups: Attacks in relation to the different playing situations; Shots in relation to the different playing situations; Attacks outcome; Origin of shots; Technical execution of shots. The variables of each group had a constant sum which equalled 100%. Because the data were compositional, the variables were transformed by means of the additive log-ratio (alr) transformation. Multivariate discriminant analyses to compare the match scores were calculated using the transformed variables. With regard to the percentage of correct classification, the results showed that the group that discriminated the most between the match scores was “Attacks outcome” (60.4% for the original sample and 52.2% for cross-validation). The performance indicators that discriminated the most between the match scores in games with penalties were goals (structure coefficient (SC) = .761), counterattack shots (SC = .541), and counterattacks (SC = .481). In matches without penalties, goals were the primary discriminating factor (SC = .576). This approach provides a new tool to compare the importance of the offensive performance groups and their effect on match score discrimination. PMID:28031766

  18. Leadership Styles and Organizational Performance: A Predictive Analysis

    ERIC Educational Resources Information Center

    Kieu, Hung Q.

    2010-01-01

    Leadership is critically important because it affects the health of the organization. Research has found that leadership is one of the most significant contributors to organizational performance. Expanding and replicating previous research, and focusing on the specific telecommunications sector, this study used multiple correlation and regression…

  19. Rapid analysis of phentolamine by high-performance liquid chromatography.

    PubMed

    Webster, Gregory K; Lemmer, Robert R; Greenwald, Steven

    2003-02-01

    A rapid liquid chromatographic method is validated for the quantitative analysis of phentolamine. Phentolamine exists in three forms for this investigation: as a mesylate salt, hydrochloride salt, and free base. In solution, phentolamine dissociates from its salt and is chromatographed as free phentolamine. This validation confirms the analysis of each form, which is simply based upon molar mass differences encountered in weighing. As such, both the United States Pharmacopeia hydrochloride and mesylate standards are used throughout this validation to demonstrate this equivalency. The validation demonstrates that this method may be used to quantitate phentolamine, regardless of its salt form.
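The molar-mass correction described reduces to simple arithmetic: the weighed salt mass is scaled by the ratio of free-base to salt molecular weight. The molecular weights below are approximate illustrative values, not figures from the validation:

```python
def free_base_fraction(mw_free_base, mw_salt):
    """Fraction of a weighed salt's mass that is the free base."""
    return mw_free_base / mw_salt

# Approximate, illustrative molecular weights for the three phentolamine forms
MW_FREE_BASE = 281.4   # phentolamine free base
MW_HCL_SALT  = 317.8   # hydrochloride salt
MW_MESYLATE  = 377.5   # mesylate salt

def free_base_mass(weighed_mg, mw_salt):
    """Free-base equivalent (mg) of a weighed salt sample."""
    return weighed_mg * free_base_fraction(MW_FREE_BASE, mw_salt)

# 10 mg of weighed mesylate salt contains roughly 7.5 mg of free phentolamine
free_base_mass(10.0, MW_MESYLATE)
```

Because phentolamine dissociates from its counter-ion in solution and is chromatographed as the free base, this weighing-time correction is the only difference between assaying the three forms.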

  20. Theoretical performance analysis for CMOS based high resolution detectors.

    PubMed

    Jain, Amit; Bednarek, Daniel R; Rudin, Stephen

    2013-03-06

    High resolution imaging capabilities are essential for accurately guiding successful endovascular interventional procedures. Present x-ray imaging detectors are not always adequate due to their inherent limitations. The newly-developed high-resolution micro-angiographic fluoroscope (MAF-CCD) detector has demonstrated excellent clinical image quality; however, further improvement in performance and physical design may be possible using CMOS sensors. We have thus calculated the theoretical performance of two proposed CMOS detectors which may be used as a successor to the MAF. The proposed detectors have a 300 μm thick HL-type CsI phosphor, a 50 μm-pixel CMOS sensor with and without a variable gain light image intensifier (LII), and are designated MAF-CMOS-LII and MAF-CMOS, respectively. For the performance evaluation, linear cascade modeling was used. The detector imaging chains were divided into individual stages characterized by one of the basic processes (quantum gain, binomial selection, stochastic and deterministic blurring, additive noise). Ranges of readout noise and exposure were used to calculate the detectors' MTF and DQE. The MAF-CMOS showed slightly better MTF than the MAF-CMOS-LII, but the MAF-CMOS-LII showed far better DQE, especially for lower exposures. The proposed detectors can have improved MTF and DQE compared with the present high resolution MAF detector. The performance of the MAF-CMOS is excellent for the angiography exposure range; however it is limited at fluoroscopic levels due to additive instrumentation noise. The MAF-CMOS-LII, having the advantage of the variable LII gain, can overcome the noise limitation and hence may perform exceptionally for the full range of required exposures; however, it is more complex and hence more expensive.
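The zero-frequency behavior of such a linear cascade can be illustrated with the textbook result for chained Poisson quantum-gain stages (a deliberate simplification of the full spatial-frequency-dependent model used in the paper, which also includes binomial selection, blurring, and additive-noise stages; the gain values below are hypothetical):

```python
def dqe_zero_frequency(gains):
    """Zero-frequency DQE of a chain of quantum-gain stages, assuming each
    stage has Poisson-distributed gain: DQE(0) = 1 / (1 + sum_k 1/(g1...gk))."""
    total = 1.0
    cum_gain = 1.0
    for g in gains:
        cum_gain *= g
        total += 1.0 / cum_gain   # each stage's noise term, scaled by upstream gain
    return 1.0 / total

# A large first-stage gain (e.g. phosphor light output) protects DQE against
# a lossy downstream stage (e.g. optical coupling below unity):
dqe_zero_frequency([1000.0, 0.5])   # stays close to 1
dqe_zero_frequency([0.5])           # a lossy first stage is far more damaging
```

This is the same qualitative mechanism behind the MAF-CMOS-LII result: the variable-gain LII boosts early-stage gain so that downstream readout noise no longer dominates at fluoroscopic exposures.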

  1. Mirror Analysis: How To Achieve Customer-Driven Human Performance.

    ERIC Educational Resources Information Center

    Mourier, Pierre

    1999-01-01

    Presents an evaluation/development method for achieving customer-driven improvement in organizations. Describes the steps to external and internal "mirror analysis," a process for determining if the organization functions as a mirror of customers' needs and expectations. Twelve figures illustrate factors in the process. (AEF)

  2. An Integrated Solution for Performing Thermo-fluid Conjugate Analysis

    NASA Technical Reports Server (NTRS)

    Kornberg, Oren

    2009-01-01

A method has been developed which integrates a fluid flow analyzer and a thermal analyzer to produce both steady-state and transient results for 1-D, 2-D, and 3-D analysis models. The Generalized Fluid System Simulation Program (GFSSP) is a one-dimensional, general-purpose fluid analysis code which computes pressures and flow distributions in complex fluid networks. The MSC Systems Improved Numerical Differencing Analyzer (MSC.SINDA) is a one-dimensional, general-purpose thermal analyzer that solves network representations of thermal systems. Both GFSSP and MSC.SINDA have graphical user interfaces which are used to build the respective model and prepare it for analysis. The SINDA/GFSSP Conjugate Integrator (SGCI) is a form-based graphical integration program used to set input parameters for the conjugate analyses and run the models. This paper describes SGCI and its thermo-fluid conjugate analysis techniques and capabilities by presenting results from several example models, including the cryogenic chilldown of a copper pipe, a bar between two walls in a fluid stream, and a solid plate creating a phase change in a flowing fluid.

  3. Applying Score Analysis to a Rehearsal Pedagogy of Expressive Performance

    ERIC Educational Resources Information Center

    Byo, James L.

    2014-01-01

    The discoveries of score analysis (e.g., minor seventh chord, ostinato, phrase elision, melodic fragment, half cadence) are more than just compositional techniques or music vocabulary. They are sounds--fascinating, storytelling, dynamic modes of expression--that when approached as such enrich the rehearsal experience. This article presents a…

  4. Rapid Automatized Naming and Reading Performance: A Meta-Analysis

    ERIC Educational Resources Information Center

    Araújo, Susana; Reis, Alexandra; Petersson, Karl Magnus; Faísca, Luís

    2015-01-01

    Evidence that rapid naming skill is associated with reading ability has become increasingly prevalent in recent years. However, there is considerable variation in the literature concerning the magnitude of this relationship. The objective of the present study was to provide a comprehensive analysis of the evidence on the relationship between rapid…

  5. The NetLogger Methodology for High Performance Distributed Systems Performance Analysis

    SciTech Connect

    Tierney, Brian; Johnston, William; Crowley, Brian; Hoo, Gary; Brooks, Chris; Gunter, Dan

    1999-12-23

    The authors describe a methodology that enables the real-time diagnosis of performance problems in complex high-performance distributed systems. The methodology includes tools for generating precision event logs that can be used to provide detailed end-to-end application and system level monitoring; a Java agent-based system for managing the large amount of logging data; and tools for visualizing the log data and real-time state of the distributed system. The authors developed these tools for analyzing a high-performance distributed system centered around the transfer of large amounts of data at high speeds from a distributed storage server to a remote visualization client. However, this methodology should be generally applicable to any distributed system. This methodology, called NetLogger, has proven invaluable for diagnosing problems in networks and in distributed systems code. This approach is novel in that it combines network, host, and application-level monitoring, providing a complete view of the entire system.

  6. Water-quality and biological conditions in the Lower Boise River, Ada and Canyon Counties, Idaho, 1994-2002

    USGS Publications Warehouse

    MacCoy, Dorene E.

    2004-01-01

The water quality and biotic integrity of the lower Boise River between Lucky Peak Dam and the river's mouth near Parma, Idaho, have been affected by agricultural land and water use, wastewater treatment facility discharge, urbanization, reservoir operations, and river channel alteration. The U.S. Geological Survey (USGS) and cooperators have studied water-quality and biological aspects of the lower Boise River in the past to address water-quality concerns and issues brought forth by the Clean Water Act of 1977. Past and present issues include preservation of beneficial uses of the river for fisheries, recreation, and irrigation; and maintenance of high-quality water for domestic and agricultural uses. Evaluation of the data collected from 1994 to 2002 by the USGS revealed increases in constituent concentrations in the lower Boise in a downstream direction. Median suspended sediment concentrations from Diversion Dam (downstream from Lucky Peak Dam) to Parma increased more than 11 times, nitrogen concentrations increased more than 8 times, phosphorus concentrations increased more than 7 times, and fecal coliform concentrations increased more than 400 times. Chlorophyll-a concentrations, used as an indicator of nutrient input and the potential for nuisance algal growth, also increased in a downstream direction; median concentrations were highest at the Middleton and Parma sites. There were no discernible temporal trends in nutrients, sediment, or bacteria concentrations over the 8-year study. The State of Idaho's temperature standards to protect coldwater biota and salmonid spawning were exceeded most frequently at Middleton and Parma. Suspended sediment concentrations exceeded criteria proposed by Idaho Department of Environmental Quality most frequently at Parma and at all but three tributaries.
Total nitrogen concentrations at Glenwood, Middleton, and Parma exceeded national background levels; median flow-adjusted total nitrogen concentrations at Middleton and Parma were higher than those in undeveloped basins sampled nationwide by the USGS. Total phosphorus concentrations at Glenwood, Middleton, and Parma also exceeded those in undeveloped basins. Macroinvertebrate and fish communities were used to evaluate the long-term integration of water-quality contaminants and loss of habitat in the lower Boise. Biological integrity of the macroinvertebrate population was assessed with the attributes (metrics) of Ephemeroptera, Plecoptera, and Trichoptera (EPT) richness and metrics used in the Idaho River Macroinvertebrate Index (RMI): taxa richness; EPT richness; percent dominant taxon; percent Elmidae (riffle beetles); and percent predators. Average EPT was about 10, and RMI scores were frequently below 16, which indicated intermediate or poor water quality. The number of EPT taxa and RMI scores for the lower Boise were half those for least-impacted streams in Idaho. The fine sediment bioassessment index (FSBI) was used to evaluate macroinvertebrate sediment tolerance. The FSBI scores were lower than those for a site upstream in the Boise River Basin near Twin Springs, a site not impacted by urbanization and agriculture, which indicated that the lower Boise macroinvertebrate population may be impacted by fine sediment. Macroinvertebrate functional feeding groups and percent tolerant species, mainly at Middleton and Parma, were typical of those in areas of degraded water quality and habitat. 
The biological integrity of the fish population was evaluated using the Idaho River Fish Index (RFI), which consists of the 10 metrics: number of coldwater native species, percent sculpin, percent coldwater species, percent sensitive native individuals, percent tolerant individuals, number of nonindigenous species, number of coldwater fish captured per minute of electrofishing, percent of fish with deformities (eroded fins, lesions, or tumors), number of trout age classes, and percent carp. RFI scores for lower Boise sites indicated a d

  7. The effect of urban street gang densities on small area homicide incidence in a large metropolitan county, 1994-2002.

    PubMed

    Robinson, Paul L; Boscardin, W John; George, Sheba M; Teklehaimanot, Senait; Heslin, Kevin C; Bluthenthal, Ricky N

    2009-07-01

    The presence of street gangs has been hypothesized as influencing overall levels of violence in urban communities through a process of gun-drug diffusion and cross-type homicide. This effect is said to act independently of other known correlates of violence, such as neighborhood poverty. To test this hypothesis, we independently assessed the impact of population exposure to local street gang densities on 8-year homicide rates in small areas of Los Angeles County, California. Homicide data from the Los Angeles County Coroner's Office were analyzed with original field survey data on street gang locations, while controlling for the established covariates of community homicide rates. Bivariate and multivariate regression analyses revealed strong relationships between homicide rates, gang density, race/ethnicity, and socioeconomic structure. Street gang densities alone had cumulative effects on small area homicide rates. Local gang densities, along with high school dropout rates, high unemployment rates, racial and ethnic concentration, and higher population densities, together explained 90% of the variation in local 8-year homicide rates. Several other commonly considered covariates were insignificant in the model. Urban environments with higher densities of street gangs exhibited higher overall homicide rates, independent of other community covariates of homicide. The unique nature of street gang killings and their greater potential to influence future local rates of violence suggest that more direct public health interventions are needed alongside traditional criminal justice mechanisms to combat urban violence and homicides.
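    The study's multivariate regression fits homicide rates against several area-level covariates. The sketch below shows the core mechanics with ordinary least squares solved via the normal equations; the covariate names and toy data are invented for illustration and are not the study's data.

```python
def ols(X, y):
    """Ordinary least squares via the normal equations (X'X) b = X'y.

    Rows of X are small areas; columns are covariates, with a leading 1
    for the intercept. y holds the 8-year homicide rates.
    """
    n, k = len(X), len(X[0])
    xtx = [[sum(X[r][i] * X[r][j] for r in range(n)) for j in range(k)]
           for i in range(k)]
    xty = [sum(X[r][i] * y[r] for r in range(n)) for i in range(k)]
    # Gaussian elimination with partial pivoting
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, k):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, k):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    beta = [0.0] * k
    for i in reversed(range(k)):
        beta[i] = (xty[i] - sum(xtx[i][j] * beta[j]
                                for j in range(i + 1, k))) / xtx[i][i]
    return beta

# Toy data constructed so that rate = 2 + 3*gang_density + 0.5*unemployment
X = [[1, g, u] for g, u in [(0, 5), (1, 6), (2, 4), (3, 8), (4, 7)]]
y = [2 + 3 * g + 0.5 * u for _, g, u in X]
beta = ols(X, y)
```

Because the toy responses are generated exactly from the linear model, the fitted coefficients recover the intercept and slopes up to floating-point error.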

  8. Performance Analysis of XCPC Powered Solar Cooling Demonstration Project

    NASA Astrophysics Data System (ADS)

    Widyolar, Bennett K.

    A solar thermal cooling system using novel non-tracking External Compound Parabolic Concentrators (XCPC) has been built at the University of California, Merced and operated for two cooling seasons. Its performance in providing power for space cooling has been analyzed. The system comprises 53.3 m2 of XCPC trough collectors used to power a 23 kW double-effect lithium bromide (LiBr) absorption chiller, and is the first system to combine XCPC and absorption chilling technologies. Performance was measured in both sunny and cloudy conditions, with both clean and dirty collectors. The collectors proved well suited to providing thermal power for absorption cooling systems; the coincidence of available thermal power with cooling demand, together with the simplicity of XCPC collectors compared with other solar thermal collectors, makes them a highly attractive candidate for cooling projects.

  9. Transaction Performance vs. Moore's Law: A Trend Analysis

    NASA Astrophysics Data System (ADS)

    Nambiar, Raghunath; Poess, Meikel

    Intel co-founder Gordon E. Moore postulated in his famous 1965 paper that the number of components in integrated circuits had doubled every year from their invention in 1958 until 1965, and predicted that the trend would continue for at least ten years. Later, David House, an Intel colleague, after factoring in the increase in performance of transistors, concluded that integrated circuits would double in performance every 18 months. Despite this trend in microprocessor improvements, your favorite text editor takes the same time to start and your PC takes pretty much the same time to reboot as it did 10 years ago. Can this observation also be made about systems supporting the fundamental aspects of our information-based economy, namely transaction processing systems?
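    House's 18-month doubling rule compounds quickly; a short worked calculation makes the implied gap between hardware and perceived software speed concrete:

```python
# Doubling every 18 months means a performance multiplier of 2**(t/18)
# after t months.
def growth_factor(months, doubling_period=18):
    return 2 ** (months / doubling_period)

decade = growth_factor(120)      # 10 years = 120 months
print(round(decade, 1))          # about 101.6x per decade
```

So if transaction processing systems tracked House's rule, a decade would buy roughly two orders of magnitude in performance, which is exactly the kind of trend the paper sets out to test.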

  10. Performance Analysis and Portability of the PLUM Load Balancing System

    NASA Technical Reports Server (NTRS)

    Oliker, Leonid; Biswas, Rupak; Gabow, Harold N.

    1998-01-01

    The ability to dynamically adapt an unstructured mesh is a powerful tool for solving computational problems with evolving physical features; however, an efficient parallel implementation is rather difficult. To address this problem, we have developed PLUM, an automatic portable framework for performing adaptive numerical computations in a message-passing environment. PLUM requires that all data be globally redistributed after each mesh adaption to achieve load balance. We present an algorithm for minimizing this remapping overhead by guaranteeing an optimal processor reassignment. We also show that the data redistribution cost can be significantly reduced by applying our heuristic processor reassignment algorithm to the default mapping of the parallel partitioner. Portability is examined by comparing performance on an SP2, an Origin2000, and a T3E. Results show that PLUM can be successfully ported to different platforms without any code modifications.
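    The optimal-reassignment idea can be phrased as: given how much of each new partition's data already resides on each processor, pick the partition-to-processor mapping that keeps the most data in place. The brute-force sketch below illustrates the objective on a tiny case (the similarity matrix is invented; PLUM itself uses an efficient algorithm rather than enumeration):

```python
from itertools import permutations

def optimal_reassignment(similarity):
    """Map new partitions to processors so the most data stays in place.

    similarity[p][q] = units of partition q's data already on processor p.
    Brute force over all assignments; feasible only for a handful of
    processors, but it defines the optimum exactly.
    """
    n = len(similarity)
    best_perm, best_kept = None, -1
    for perm in permutations(range(n)):  # perm[p] = partition for processor p
        kept = sum(similarity[p][perm[p]] for p in range(n))
        if kept > best_kept:
            best_kept, best_perm = kept, perm
    return best_perm, best_kept

# 3 processors: the identity mapping keeps only 6 units; a swap keeps 12.
S = [[1, 5, 0],
     [4, 2, 0],
     [0, 0, 3]]
print(optimal_reassignment(S))   # ((1, 0, 2), 12)
```

Here the default (identity) mapping would remap most of the mesh, while swapping the first two partitions doubles the amount of data left untouched, which is the overhead reduction the paper quantifies.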

  11. NS&T Management Observations: Quarterly Performance Analysis

    SciTech Connect

    Gianotto, David

    2014-09-01

    The INL Management Observation Program (MOP) is designed to improve managers' and supervisors' understanding of the work being performed by employees and the barriers impacting their success. The MOP also increases workers' understanding of management's expectations as they relate to safety, security, quality, and work performance. Management observations (observations) are designed to improve the relationship and trust between employees and managers through increased engagement and interactions between managers and researchers in the field. As part of continuous improvement, NS&T management took the initiative to focus on the participation and quality of observations in FY-14. This quarterly report is intended to (a) summarize the participation and quality of management's observations, (b) assess observations for commonalities or trends related to facility or process barriers impacting research, and (c) provide feedback and make recommendations for improvements to NS&T's MOP.

  12. Resilient Plant Monitoring System: Design, Analysis, and Performance Evaluation

    SciTech Connect

    Humberto E. Garcia; Wen-Chiao Lin; Semyon M. Meerkov; Maruthi T. Ravichandran

    2013-12-01

    Resilient monitoring systems are sensor networks that degrade gracefully under malicious attacks on their sensors, which cause the sensors to project misleading information. The goal of this paper is to design, analyze, and evaluate the performance of a resilient monitoring system intended to monitor plant conditions (normal or anomalous). The architecture developed consists of four layers: data quality assessment, process variable assessment, plant condition assessment, and sensor network adaptation. Each of these layers is analyzed by either analytical or numerical tools, and the performance of the overall system is evaluated using simulations. The resiliency of the resulting system is measured using the Kullback-Leibler divergence, and is shown to be sufficiently high in all scenarios considered.
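    The Kullback-Leibler divergence used as the resiliency measure compares two discrete distributions, for example the distribution of assessed plant states against the truth. A minimal sketch (the two example distributions are invented, not taken from the paper):

```python
from math import log

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in nats, for discrete
    distributions given as equal-length probability lists."""
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

normal  = [0.9, 0.1]   # hypothetical state probabilities, no attack
anomaly = [0.5, 0.5]   # hypothetical probabilities reported under attack
print(kl_divergence(normal, anomaly))  # larger = the two are easier to tell apart
```

D(p || q) is zero exactly when the distributions agree, so a monitoring system whose assessed distribution stays close to the truth under attack scores a small divergence.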

  13. ATM solar array in-flight performance analysis

    NASA Technical Reports Server (NTRS)

    Thornton, J. P.; Crabtree, L. W.

    1974-01-01

    The physical and electrical characteristics of the Apollo Telescope Mount (ATM) solar array are described, and in-flight performance data are analyzed and compared with predicted results. Two solar cell module configurations were used. Each type I module consists of 228 2 x 6 cm solar cells with two cells in parallel and 114 cells in series. Type II modules contain 684 2 x 2 cm cells with six cells in parallel and 114 cells in series. A different interconnection scheme was used for each type. Panels using type II modules with the mesh interconnect system performed marginally better than those using type I modules with the loop interconnect system. The average degradation rate for the ATM array was 8.2% for a 271-day mission.
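    The reported 8.2% loss over 271 days can be converted to an average daily degradation rate. Whether a linear or compound model is appropriate is not stated in the abstract, so the sketch computes both:

```python
total_degradation = 0.082   # 8.2% power loss over the mission
mission_days = 271

# Simple linear average loss per day
linear_daily = total_degradation / mission_days
# Equivalent constant daily rate if the loss compounded geometrically
compound_daily = 1 - (1 - total_degradation) ** (1 / mission_days)

print(f"{linear_daily:.5%}/day linear, {compound_daily:.5%}/day compound")
```

Either way the rate is about 0.03% per day; over a 271-day mission the two models differ only slightly because the total loss is small.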

  14. ROC analysis of diagnostic performance in liver scintigraphy.

    PubMed

    Fritz, S L; Preston, D F; Gallagher, J H

    1981-02-01

    Studies on the accuracy of liver scintigraphy for the detection of metastases were assembled from 38 sources in the medical literature. An ROC curve was fitted to the observed values of sensitivity and specificity using an algorithm developed by Ogilvie and Creelman. This ROC curve fitted the data better than average sensitivity and specificity values in each of four subsets of the data. For the subset dealing with Tc-99m sulfur colloid scintigraphy, performed for detection of suspected metastases and containing data on 2800 scans from 17 independent series, it was not possible to reject the hypothesis that interobserver variation was entirely due to the use of different decision thresholds by the reporting clinicians. Thus the ROC curve obtained is a reasonable baseline estimate of the performance potentially achievable in today's clinical setting. Comparison of new reports with these data is possible, but is limited by the small sample sizes in most reported series.
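    One classical way to fit a single ROC curve to sensitivity/specificity pairs pooled from many series is the binormal model, TPR = Phi(a + b * Phi^-1(FPR)), which becomes a straight line in z-space. The sketch below fits a and b by simple least squares; Ogilvie and Creelman's algorithm is a more careful maximum-likelihood fit, and the example points are invented:

```python
from statistics import NormalDist

nd = NormalDist()

def fit_binormal_roc(points):
    """Fit TPR = Phi(a + b * Phi^-1(FPR)) by least squares in z-space.

    points: (sensitivity, specificity) pairs, one per reported series.
    Works because the binormal ROC is linear in normal-deviate coordinates.
    """
    xs = [nd.inv_cdf(1 - spec) for _, spec in points]   # z(FPR)
    ys = [nd.inv_cdf(sens) for sens, _ in points]       # z(TPR)
    n = len(points)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Hypothetical observers on one curve but using different thresholds:
pts = [(0.60, 0.95), (0.75, 0.90), (0.85, 0.80)]
a, b = fit_binormal_roc(pts)
```

When the observers truly share one underlying ROC curve and differ only in decision threshold, as the abstract found plausible for the Tc-99m subset, the z-space points fall on a line and the fit recovers the curve.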

  15. Failure Analysis and Regeneration Performances Evaluation on Engine Lubricating Oil

    NASA Astrophysics Data System (ADS)

    Wang, X. L.; Zhang, G. N.; Zhang, J. Y.; Yin, Y. L.; Xu, Y.

    To investigate the failure and recycling behavior of lubricating oils, three typical 10W-40 lubricating oils used in heavy-load vehicles were selected: new oil, waste oil, and oil regenerated with a self-developed green regeneration technology. Tribological properties were tested with a four-ball friction and wear tester. The results indicated that the extreme-pressure performance of the regenerated oil increased by 34.1% compared with the waste oil, and its load-carrying ability is close to that of the new oil; the wear scar characteristics are better than those of the waste oil, and the frictional coefficient almost reaches the level of the new oil. Overall, the anti-wear and friction-reducing performance improved markedly after regeneration.

  16. Uniprocessor Performance Analysis of a Representative Workload of Sandia National Laboratories' Scientific Applications.

    SciTech Connect

    Charles Laverty

    2005-10-01

    UNIPROCESSOR PERFORMANCE ANALYSIS OF A REPRESENTATIVE WORKLOAD OF SANDIA NATIONAL LABORATORIES' SCIENTIFIC APPLICATIONS Master of Science in Electrical Engineering New Mexico State University Las Cruces, New Mexico, 2005 Dr. Jeanine Cook, Chair Throughout the last decade, computer performance analysis has become essential to maximizing the performance of some workloads. Sandia National Laboratories (SNL), located in Albuquerque, New Mexico, is no different: achieving maximum performance of large scientific, parallel workloads requires performance analysis at the uniprocessor level. A representative workload has been chosen as the basis of a computer performance study to determine optimal processor characteristics in order to better specify the next generation of supercomputers. Cube3, a finite element test problem developed at SNL, is representative of SNL's scientific workloads. This workload has been studied at the uniprocessor level to understand characteristics of the microarchitecture that will lead to overall performance improvement at the multiprocessor level. The goal of studying this workload at the uniprocessor level is to build a performance prediction model that will be integrated into a multiprocessor performance model currently being developed at SNL. Through the use of performance counters on the Itanium 2 microarchitecture, performance statistics are studied to determine bottlenecks in the microarchitecture and/or changes in the application code that will maximize performance. From source code analysis a performance-degrading loop kernel was identified, and through the use of compiler optimizations a performance gain of around 20% was achieved.

  17. Moisture and Structural Analysis for High Performance Hybrid Wall Assemblies

    SciTech Connect

    Grin, A.; Lstiburek, J.

    2012-09-01

    Based on past experience in the Building America program, BSC has found that combinations of materials and approaches—in other words, systems—usually provide optimum performance. Integration is necessary, as described in this research project. The hybrid walls analyzed utilize a combination of exterior insulation, diagonal metal strapping, and spray polyurethane foam and leave room for cavity-fill insulation. These systems can provide effective thermal, air, moisture, and water barrier systems in one assembly and provide structure.

  18. Performance analysis of an inexpensive Direct Imaging Transmission Ion Microscope

    NASA Astrophysics Data System (ADS)

    Barnes, Patrick; Pallone, Arthur

    2013-03-01

    A direct imaging transmission ion microscope (DITIM) is built from a modified webcam and a commercially available polonium-210 antistatic device mounted on an optics rail. The performance of the DITIM in radiographic mode is analyzed in terms of the line spread function (LSF) and modulation transfer function (MTF) for an opaque edge. Limitations of, potential uses for, and suggested improvements to the DITIM are also discussed.
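    The standard edge method behind the LSF/MTF analysis can be sketched in a few lines: differentiate the measured edge spread function (ESF) to get the LSF, then take the magnitude of its discrete Fourier transform, normalized so MTF(0) = 1. The edge profiles below are invented illustrations, not the paper's data:

```python
from cmath import exp, pi

def mtf_from_edge(esf):
    """Estimate the MTF from samples of an edge spread function.

    LSF = discrete derivative of the ESF; MTF = |DFT(LSF)| normalized
    so that MTF(0) = 1. A minimal sketch of the standard edge method.
    """
    lsf = [b - a for a, b in zip(esf, esf[1:])]
    n = len(lsf)
    mtf = []
    for k in range(n // 2 + 1):
        c = sum(lsf[m] * exp(-2j * pi * k * m / n) for m in range(n))
        mtf.append(abs(c))
    return [v / mtf[0] for v in mtf]

ideal_edge = [0, 0, 0, 0, 1, 1, 1, 1]        # perfectly sharp edge
blurred    = [0, 0, 0.1, 0.3, 0.7, 0.9, 1, 1]  # edge softened by the system
```

A perfectly sharp edge gives a delta-function LSF and hence a flat MTF of 1 at all frequencies; any blur spreads the LSF and pulls the MTF down at high spatial frequencies, which is what quantifies the DITIM's resolution limits.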

  19. Distributed Tracking Fidelity-Metric Performance Analysis Using Confusion Matrices

    DTIC Science & Technology

    2012-07-01

    evaluation [3, 4], T2T developments [5, 6], and simultaneous tracking and ID (STID) approaches [7, 8, 9, 10], we seek a method for distributed tracking...While there is an interest in processing all the data in signal-level fusion, such as image fusion [31], the transmission of the data is limited by...combination. Section 2 describes the tracking metrics. Section 3 overviews the JBPDAF. Section 4 describes the CM DLF. Section 5 shows a performance

  20. Performance of silicon immersed gratings: measurement, analysis, and modeling

    NASA Astrophysics Data System (ADS)

    Rodenhuis, Michiel; Tol, Paul J. J.; Coppens, Tonny H. M.; Laubert, Phillip P.; van Amerongen, Aaldert H.

    2015-09-01

    The use of immersed gratings offers advantages for both space- and ground-based spectrographs. As diffraction takes place inside the high-index medium, the optical path difference and angular dispersion are boosted proportionally, thereby allowing a smaller grating area and a smaller spectrometer size. Short-wave infrared (SWIR) spectroscopy is used in space-based monitoring of greenhouse and pollution gases in the Earth's atmosphere. On the extremely large telescopes currently under development, mid-infrared high-resolution spectrographs will, among other things, be used to characterize exo-planet atmospheres. At infrared wavelengths, silicon is transparent, which means that production methods used in the semiconductor industry can be applied to the fabrication of immersed gratings. Using such methods, we have designed and built immersed gratings for both space- and ground-based instruments, examples being the TROPOMI instrument for the European Space Agency (ESA) Sentinel-5 Precursor mission, Sentinel-5 (ESA), and the METIS (Mid-infrared E-ELT Imager and Spectrograph) instrument for the European Extremely Large Telescope. Three key parameters govern the performance of such gratings: the efficiency, the level of scattered light, and the wavefront error induced. In this paper we describe how we optimize these parameters during the design and manufacturing phase, focusing on the tools and methods used to measure the performance actually realized, and present the results. The bread-board model (BBM) immersed grating developed for the SWIR-1 channel of Sentinel-5 is used to illustrate this process. Stringent requirements were specified for this grating for the three performance criteria. We will show that, with some margin, the performance requirements have all been met.

  1. Deployable Air Beam Fender System (DAFS): Energy Absorption Performance Analysis

    DTIC Science & Technology

    2007-03-30

    its energy absorption performance. Quarter-scale and full-scale models were evaluated and compared to prototype tests for a variety of inflation...pressures, impact berthing conditions, and ballast levels. Model predictions were validated with correlated test data. The explicit FEA method captured...was used. In step 1, the fender was inflated to the specified inflation pressure and the acceleration caused by gravity (386.4 in./s^2) was applied

  2. Performance Analysis of Automated Attack Graph Generation Software

    DTIC Science & Technology

    2006-12-01

    gapped from the target network. Although no performance details were available from Skybox, an examination of recent patents submitted by Skybox...The current generation of network vulnerability detection software uses databases of known vulnerabilities and scans target networks for these weaknesses. The results can be voluminous and difficult to assess. Thus, the

  3. Performance Analysis of Distributed Object-Oriented Applications

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1998-01-01

    The purpose of this research was to evaluate the efficiency of a distributed simulation architecture which creates individual modules which are made self-scheduling through the use of a message-based communication system used for requesting input data from another module which is the source of that data. To make the architecture as general as possible, the message-based communication architecture was implemented using standard remote object architectures (Common Object Request Broker Architecture (CORBA) and/or Distributed Component Object Model (DCOM)). A series of experiments were run in which different systems are distributed in a variety of ways across multiple computers and the performance evaluated. The experiments were duplicated in each case so that the overhead due to message communication and data transmission can be separated from the time required to actually perform the computational update of a module each iteration. The software used to distribute the modules across multiple computers was developed in the first year of the current grant and was modified considerably to add a message-based communication scheme supported by the DCOM distributed object architecture. The resulting performance was analyzed using a model created during the first year of this grant which predicts the overhead due to CORBA and DCOM remote procedure calls and includes the effects of data passed to and from the remote objects. A report covering the distributed simulation software and the results of the performance experiments has been submitted separately. The above report also discusses possible future work to apply the methodology to dynamically distribute the simulation modules so as to minimize overall computation time.

  4. Analysis of Air Force Wartime Contracted Construction Project Performance

    DTIC Science & Technology

    2015-03-26

    peacetime factors as a baseline, project factors, health and safety compliance, quality of work, technical performance, work productivity, and...significant difference in overall project quality. In conclusion, cost monitoring from the owner and scrutiny of project management is critical to the...sponsor's need for the research, and finishes with a brief description of the scope and methodology for the paper. The AFCEC program, though

  5. Performance Analysis of Evolutionary Algorithms for Steiner Tree Problems.

    PubMed

    Lai, Xinsheng; Zhou, Yuren; Xia, Xiaoyun; Zhang, Qingfu

    2016-12-13

    The Steiner tree problem (STP) aims to determine some Steiner nodes such that the minimum spanning tree over these Steiner nodes and a given set of special nodes has the minimum weight, which is NP-hard. STP includes several important cases. The Steiner tree problem in graphs (GSTP) is one of them. Many heuristics have been proposed for STP, and some of them have proved to be performance guarantee approximation algorithms for this problem. Since evolutionary algorithms (EAs) are general and popular randomized heuristics, it is significant to investigate the performance of EAs for STP. Several empirical investigations have shown that EAs are efficient for STP. However, up to now, there is no theoretical work on the performance of EAs for STP. In this paper, we reveal that the (1 + 1) EA achieves a 3/2-approximation ratio for STP in a special class of quasi-bipartite graphs in expected runtime O(r(r + s - 1) * w_max), where r, s and w_max are respectively the number of Steiner nodes, the number of special nodes and the largest weight among all edges in the input graph. We also show that the (1 + 1) EA is better than two other heuristics on two GSTP instances, and the (1 + 1) EA may be inefficient on a constructed GSTP instance.
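    The (1 + 1) EA the paper analyzes keeps a single bitstring selecting which Steiner nodes to include, flips each bit with probability 1/n, and accepts the offspring if its MST weight is no worse. A minimal sketch on an invented toy instance (three special nodes, one useful and one useless Steiner node); the instance and parameter choices are illustrations, not the paper's constructions:

```python
import random

def mst_weight(nodes, w):
    """Prim's algorithm on the complete subgraph induced by `nodes`."""
    nodes = list(nodes)
    in_tree, total = {nodes[0]}, 0
    while len(in_tree) < len(nodes):
        d, v = min((w[u][x], x) for u in in_tree
                   for x in nodes if x not in in_tree)
        in_tree.add(v)
        total += d
    return total

def one_plus_one_ea(terminals, steiner, w, iters=500, seed=1):
    """(1+1) EA: a bitstring selects Steiner nodes; fitness = MST weight."""
    rng = random.Random(seed)
    n = len(steiner)
    x = [rng.randint(0, 1) for _ in range(n)]
    fit = mst_weight(terminals + [s for s, b in zip(steiner, x) if b], w)
    for _ in range(iters):
        y = [b ^ (rng.random() < 1 / n) for b in x]   # flip each bit w.p. 1/n
        f = mst_weight(terminals + [s for s, b in zip(steiner, y) if b], w)
        if f <= fit:                                  # accept if no worse
            x, fit = y, f
    return fit

# Special nodes 0,1,2; Steiner node 3 is useful, node 4 is useless.
w = [
    [0,  4,  4,  1, 10],
    [4,  0,  4,  1, 10],
    [4,  4,  0,  1, 10],
    [1,  1,  1,  0, 10],
    [10, 10, 10, 10, 0],
]
best = one_plus_one_ea([0, 1, 2], [3, 4], w)
```

On this instance the MST over the special nodes alone weighs 8, while including Steiner node 3 (and only node 3) drops the weight to 3, which the EA finds quickly.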

  6. Performance analysis of axial-flow mixing impellers

    SciTech Connect

    Wu, J.; Pullum, L.

    2000-03-01

    Theoretical formulations for impeller performance were evaluated based on a blade-element theory. These enable the calculation of the head and power vs. flow-rate curves of axial-flow impellers. The technique uses the lift and drag coefficients of the blade section of an impeller to calculate the spanwise swirl-velocity distribution. Using the angular-momentum equation, it is possible to calculate the corresponding spanwise distribution of the energy head of the impeller. Integration of these distributions of head and torque gives the impeller's performance. Parameters including the flow number, the power number, the thrust force number, and the swirl velocity can be found at the impeller operating point, determined using the head curve and an experimentally calibrated resistance curve. A laser Doppler velocimetry (LDV) system was used to measure the velocity distribution for different axial flow impellers in mixing tanks. Calculated flow and power numbers agreed well with the experimental results. Using the blade's spanwise head distribution and a set of calibrated flow-resistance data, it is also possible to estimate an impeller's outlet axial-velocity distribution. Predictions compared well with LDV experimental data. The effect of impeller-blade angle, number of blades, blade camber, and blade thickness on the performance of axial-flow impellers was investigated using the Agitator software.

  7. Performance testing and analysis of vertical ambient air vaporizers

    NASA Astrophysics Data System (ADS)

    Pandey, A. S.; Singh, V. N.; Shah, M. I.; Acharya, D. V.

    2017-02-01

    Ambient air vaporizers are used to regasify cryogenic liquids at extremely low temperatures (below -153°C). Frost forms on them due to the large temperature difference between the ambient air and the cryogenic fluid. Frosting induces additional load on the equipment and reduces its heat transfer effectiveness; hence the mechanical and thermal design of vaporizers must account for frosting. An experimental set-up has been designed, and the effects of flow rate and ground clearance on the performance of ambient air vaporizers are evaluated. The flow rate is increased from the rated capacity of 500 Nm3/h to 640 Nm3/h, and the ground clearance is reduced from 500 mm to 175 mm. These variations reduce the time for which gaseous nitrogen is delivered above the desired temperature of 10.1°C; hence the duty cycle reduces from eight hours to five hours. Other factors affecting performance, such as fin configuration, fluid type, fluid pressure, intermittent flow nature and climatic conditions, are assumed to be constant over the test duration. The decrease in outlet gas temperature (from 38°C to 10.1°C) with the corresponding increase in frost thickness indicates deterioration of the performance of the ambient air vaporizers.

  8. Statistical analysis of AFE GN&C aeropass performance

    NASA Technical Reports Server (NTRS)

    Chang, Ho-Pen; French, Raymond A.

    1990-01-01

    Performance of the guidance, navigation, and control (GN&C) system used on the Aeroassist Flight Experiment (AFE) spacecraft has been studied with Monte Carlo techniques. The performance of the AFE GN&C is investigated with a 6-DOF numerical dynamic model which includes a Global Reference Atmospheric Model (GRAM) and a gravitational model with oblateness corrections. The study considers all the uncertainties due to the environment and the system itself. In the AFE's aeropass phase, perturbations on the system performance are caused by an error space which has over 20 dimensions of the correlated/uncorrelated error sources. The goal of this study is to determine, in a statistical sense, how much flight path angle error can be tolerated at entry interface (EI) and still have acceptable delta-V capability at exit to position the AFE spacecraft for recovery. Assuming there is fuel available to produce 380 ft/sec of delta-V at atmospheric exit, a 3-sigma standard deviation in flight path angle error of 0.04 degrees at EI would result in a 98-percent probability of mission success.
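    The structure of such a Monte Carlo study is: sample the entry-interface error from its assumed distribution, propagate it through the dynamics, and count the fraction of runs whose exit delta-V stays within budget. The sketch below keeps that structure but replaces the 6-DOF model with an invented linear sensitivity (the `dv_per_degree` and `dv_nominal` numbers are placeholders, not AFE values):

```python
import random

def mission_success_probability(sigma_fpa, dv_budget=380.0, n=100_000, seed=42):
    """Toy Monte Carlo sketch of the AFE-style aeropass study.

    sigma_fpa: 1-sigma flight path angle error at entry interface, degrees.
    The linear delta-V sensitivity below is an invented stand-in for the
    real 6-DOF simulation with GRAM atmosphere and gravity models.
    """
    rng = random.Random(seed)
    dv_per_degree = 7000.0   # hypothetical ft/s of exit delta-V per degree
    dv_nominal = 150.0       # hypothetical delta-V needed with zero error
    hits = sum(
        dv_nominal + abs(rng.gauss(0.0, sigma_fpa)) * dv_per_degree <= dv_budget
        for _ in range(n)
    )
    return hits / n

# 3-sigma flight path angle error of 0.04 degrees -> sigma = 0.04/3
p = mission_success_probability(0.04 / 3)
```

With the placeholder sensitivity chosen here the estimated success probability lands near the high-90s percent range, illustrating how the study's "3-sigma of 0.04 degrees gives 98% success" type of statement is produced.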

  9. Performance Testing using Silicon Devices - Analysis of Accuracy: Preprint

    SciTech Connect

    Sengupta, M.; Gotseff, P.; Myers, D.; Stoffel, T.

    2012-06-01

    Accurately determining PV module performance in the field requires accurate measurements of solar irradiance reaching the PV panel (i.e., Plane-of-Array - POA Irradiance) with known measurement uncertainty. Pyranometers are commonly based on thermopile or silicon photodiode detectors. Silicon detectors, including PV reference cells, are an attractive choice for reasons that include faster time response (10 μs) than thermopile detectors (1 s to 5 s), lower cost, and easier maintenance. The main drawback of silicon detectors is their limited spectral response. Therefore, to determine broadband POA solar irradiance, a pyranometer calibration factor that converts the narrowband response to broadband is required. Normally this calibration factor is a single number determined under clear-sky conditions with respect to a broadband reference radiometer. The pyranometer is then used for various scenarios including varying airmass, panel orientation and atmospheric conditions. This would not be an issue if all irradiance wavelengths that form the broadband spectrum responded uniformly to atmospheric constituents. Unfortunately, the scattering and absorption signature varies widely with wavelength, and the calibration factor for the silicon photodiode pyranometer is not appropriate for other conditions. This paper reviews the issues that arise from the use of silicon detectors for PV performance measurement in the field, based on measurements from a group of pyranometers mounted on a 1-axis solar tracker. We will also present a comparison of simultaneous spectral and broadband measurements from silicon and thermopile detectors, and estimated measurement errors when using silicon devices for both array performance and resource assessment.

  10. Analysis of thermoelastohydrodynamic performance of journal misaligned engine main bearings

    NASA Astrophysics Data System (ADS)

    Bi, Fengrong; Shao, Kang; Liu, Changwen; Wang, Xia; Zhang, Jian

    2015-05-01

    Understanding the working condition of engine main bearings is important for improving engine performance. However, thermal effects and thermal deformations of engine main bearings are rarely considered simultaneously in most studies. A typical finite element model is selected and the effect of the thermoelastohydrodynamic (TEHD) reaction on engine main bearings is investigated. The thermal hydrodynamic reaction of the main bearing and the journal misalignment effect are calculated by the finite difference method, and the deformation reaction is calculated by the finite element method. The oil film pressure is solved numerically with Reynolds boundary conditions when various bearing characteristics are calculated. The model considers a temperature-pressure-viscosity relationship for the lubricant, surface roughness effects, and an angular misalignment between the journal and the bearing. Numerical simulations of the operation of a typical I6 diesel engine main bearing are conducted, and the importance of several contributing factors in mixed lubrication is discussed. The performance characteristics of journal-misaligned main bearings under elastohydrodynamic (EHD) and TEHD loads of an I6 diesel engine are obtained, and the journal center orbit movement, minimum oil film thickness and maximum oil film pressure of the main bearings are estimated over a wide range of engine operation. The model is verified through comparison with other published models. The TEHD performance of engine main bearings under the influence of journal misalignment is revealed; this is helpful for understanding the EHD and TEHD behavior of misaligned engine main bearings.

  11. Performance Analysis of Hybrid Electric Vehicle over Different Driving Cycles

    NASA Astrophysics Data System (ADS)

    Panday, Aishwarya; Bansal, Hari Om

    2017-02-01

    This article aims to determine the nature and response of a hybrid vehicle over various standard driving cycles. Road profile parameters play an important role in determining fuel efficiency. The typical parameters of a road profile can be reduced to a smaller, more useful set using principal component analysis and independent component analysis; the reduced data set yields a more appropriate and important parameter cluster. With the reduced parameter set, fuel economies over various driving cycles are ranked using the TOPSIS and VIKOR multi-criteria decision making methods. The ranking trend is then compared with the fuel economies achieved after driving the vehicle over the respective roads. The control strategy responsible for the power split is optimized using a genetic algorithm. A 1RC battery model and a modified SOC estimation method are used in the simulation, and improved results compared with the defaults are obtained.
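    TOPSIS, one of the two ranking methods named above, scores each alternative by its closeness to an ideal solution and distance from the worst one. A minimal sketch with invented driving-cycle data (the criteria, weights, and numbers are illustrative only):

```python
from math import sqrt

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS.

    matrix[i][j]: score of alternative i on criterion j.
    benefit[j]:  True if larger is better (e.g. fuel economy in mpg),
                 False if smaller is better (e.g. consumption in L/100 km).
    """
    ncrit = len(weights)
    norms = [sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncrit)]
    v = [[row[j] / norms[j] * weights[j] for j in range(ncrit)]
         for row in matrix]
    ideal = [max(col) if benefit[j] else min(col)
             for j, col in enumerate(zip(*v))]
    anti = [min(col) if benefit[j] else max(col)
            for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = sqrt(sum((x - i) ** 2 for x, i in zip(row, ideal)))
        d_neg = sqrt(sum((x - a) ** 2 for x, a in zip(row, anti)))
        scores.append(d_neg / (d_pos + d_neg))   # 1 = ideal, 0 = worst
    return scores

# Hypothetical data: 3 driving cycles x 2 criteria (mpg up, L/100 km down)
cycles = [[40, 5.9], [35, 6.7], [45, 5.2]]
scores = topsis(cycles, [0.5, 0.5], [True, False])
```

The cycle that dominates on every criterion gets a closeness score of exactly 1, and the fully dominated one gets 0, so ranking by score reproduces the intuitive ordering here.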

  12. DWT features performance analysis for automatic speech recognition of Urdu.

    PubMed

    Ali, Hazrat; Ahmad, Nasir; Zhou, Xianwei; Iqbal, Khalid; Ali, Sahibzada Muhammad

    2014-01-01

    This paper presents work on Automatic Speech Recognition of the Urdu language, using a comparative analysis of Discrete Wavelet Transform (DWT) based features and Mel Frequency Cepstral Coefficients (MFCC). These features have been extracted for one hundred isolated words of Urdu, each uttered by ten different speakers. The words have been selected from the most frequently used words of Urdu, and a variety of ages and dialects has been covered using a balanced corpus approach. After feature extraction, classification has been performed using Linear Discriminant Analysis. The confusion matrix obtained for the DWT features has then been compared with the one obtained for MFCC-based speech recognition. The framework has been trained and tested on speech data recorded under controlled environments. The experimental results are useful for determining the optimum features for the speech recognition task.
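    The DWT feature-extraction stage can be illustrated with the simplest wavelet, the Haar transform, which splits a frame into approximation (low-frequency) and detail (high-frequency) coefficients. This is a minimal stand-in: the paper's exact wavelet and decomposition depth are not stated here, and the sample frame is invented:

```python
from math import sqrt

def haar_dwt(signal):
    """One level of the Haar DWT: approximation and detail coefficients.

    Real ASR front-ends use deeper decompositions and longer wavelets;
    this shows only the basic pairwise average/difference structure.
    """
    s = sqrt(2)
    approx = [(a + b) / s for a, b in zip(signal[::2], signal[1::2])]
    detail = [(a - b) / s for a, b in zip(signal[::2], signal[1::2])]
    return approx, detail

frame = [1.0, 1.0, 2.0, 2.0, 3.0, 5.0, 4.0, 0.0]
approx, detail = haar_dwt(frame)
```

The transform is orthonormal, so the energy of the frame is exactly preserved across the approximation and detail bands; the coefficients (possibly after further levels) then serve as the feature vector fed to the LDA classifier.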

  13. Performance analysis of Integrated Communication and Control System networks

    NASA Technical Reports Server (NTRS)

    Halevi, Y.; Ray, A.

    1990-01-01

    This paper presents statistical analysis of delays in Integrated Communication and Control System (ICCS) networks that are based on asynchronous time-division multiplexing. The models are obtained in closed form for analyzing control systems with randomly varying delays. The results of this research are applicable to ICCS design for complex dynamical processes like advanced aircraft and spacecraft, autonomous manufacturing plants, and chemical and processing plants.

  14. An Analysis of the Air Force Enlisted Performance Feedback System

    DTIC Science & Technology

    1992-09-01

    subjects giving and receiving the feedback from the various methods (DeGregorio and Fisher, 1988:605). The four types of techniques they studied in... researchers' analysis of the literature. The researchers found evidence that the new Air Force feedback system is an improvement over the old design.

  15. HF Over-the-Horizon Radar System Performance Analysis

    DTIC Science & Technology

    2007-09-01

    3,500 km at cf = 14.5 MHz. A model of the maximum detection range for the Chinese FMCW OTH backscatter (OTH-B) radar was developed in MATLAB. An... calculation of the maximum usable frequency (MUF), and footprint prediction. Also, radar equation analysis was done in MATLAB to study the signal-to-noise... target detection technique and radar equations are applied. Chapter V uses PROPLAB model simulation to bring in the principle of raytracing.

  16. Cost and Training Effectiveness Analysis (CTEA) Performance Guide.

    DTIC Science & Technology

    1980-09-01

    effectiveness analyses (CTEA) during Weapon System Acquisition required by the Life Cycle System Management Model (LCSMM) and other related... weapon from its Program/Project/Product Manager (PM), potential manufacturer(s), or from a laser expert and use it. Further, remember that you are costing... the DIO, the Management Analysis Division, and input data to TRADOC Form 273-R (Figure 2) submitted in accordance with TRADOC Reg 11-12.

  17. Theoretical performance analysis of multislice channelized Hotelling observers

    NASA Astrophysics Data System (ADS)

    Goossens, Bart; Platiša, Ljiljana; Philips, Wilfried

    2012-02-01

    Quality assessment of 3D medical images is becoming increasingly important because clinical practice is rapidly moving in the direction of volumetric imaging. In a recent publication, three multi-slice channelized Hotelling observer (msCHO) models were presented for the task of detecting 3D signals in multi-slice images, where each multi-slice image is inspected in a so-called stack-browsing mode. The observer models are based on the assumption that humans observe multi-slice images in a simple two-stage process, and each of the models implements this principle in a different way. In this paper, we investigate the theoretical performance, in terms of detection signal-to-noise ratio (SNR), of msCHO models for the task of detecting a separable signal in a Gaussian background with a separable covariance matrix. We find that, despite the differences in architecture of the three models, they all have the same asymptotic performance in this task (i.e., when the number of training images tends to infinity). On the other hand, when backgrounds with nonseparable covariance matrices are considered, the third model, msCHOc, is expected to perform slightly better than the other msCHO models (msCHOa and msCHOb), but only when sufficient training images are provided. These findings suggest that the choice between the msCHO models mainly depends on the experiment setup (e.g., the number of available training samples), while the relation to human observers depends on the particular choice of the "temporal" channels that the msCHO models use.
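
    The detection SNR of a (channelized) Hotelling observer is the standard figure of merit SNR^2 = dm^T C^-1 dm, where dm is the mean signal difference and C the data covariance in channel space. A minimal numpy sketch, with a hypothetical 3-channel example:

```python
import numpy as np

def hotelling_snr(delta_mean, cov):
    """Detection SNR of a Hotelling observer:
    SNR^2 = dm^T C^{-1} dm, with dm the mean signal difference
    between signal-present and signal-absent classes."""
    dm = np.asarray(delta_mean, dtype=float)
    w = np.linalg.solve(np.asarray(cov, dtype=float), dm)  # Hotelling template
    return float(np.sqrt(dm @ w))

# Hypothetical 3-channel statistics with a diagonal (trivially
# separable) covariance.
snr = hotelling_snr([1.0, 0.5, 0.2], np.diag([1.0, 2.0, 4.0]))
```

    For the diagonal case the quadratic form reduces to a weighted sum of squared per-channel contrasts, here 1 + 0.125 + 0.01 = 1.135.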

  18. Technical and tactical analysis of youth taekwondo performance.

    PubMed

    Casolino, Erika; Lupo, Corrado; Cortis, Cristina; Chiodo, Salvatore; Minganti, Carlo; Capranica, Laura; Tessitore, Antonio

    2012-06-01

    This study aimed to analyze the technical and tactical aspects of young athletes during official taekwondo competitions. Fifty-nine youth taekwondo athletes (43 boys and 16 girls; age range: 10-12 years; weight category range: <24 to >59 kg) with at least 2 years of taekwondo training consisting of three 90-minute training sessions for 3 d·wk⁻¹ participated in this study. Thirty-seven matches (three 1-minute rounds, with 1-minute rest in between) were analyzed to verify the differences (p ≤ 0.05) in offensive and defensive actions in relation to gender (male, female), match outcome (winners, nonwinners), kicking leg (front, rear), and round (first, second, third). No difference emerged for gender and match outcome. With respect to defensive actions (8.4 ± 12.0%), youth athletes engaged more frequently (p < 0.0001) in offensive actions (91.6 ± 12.0%), which showed a significant decrease (p < 0.016) from the first round (42.3 ± 21.8%) to the second (33.1 ± 14.8%) and third (24.5 ± 16.0%) ones. Kicks performed with the rear leg (94.4 ± 7.8%) occurred more frequently (p < 0.0001) than those performed with the front leg (5.6 ± 7.8%). In considering that a high level of coordination is required to perform front-leg kicks and defensive actions necessitate a high level of tactical skills, these findings might indicate a not-yet complete attainment of fundamental coordinative capabilities in 10- to 12-year-old athletes, independently of match outcome. To enhance coordination capabilities in youth athletes, coaches are recommended to structure their training including skill-ability and sport-ability drills.

  19. Design and performance analysis of gas sorption compressors

    NASA Technical Reports Server (NTRS)

    Chan, C. K.

    1984-01-01

    Compressor kinetics based on gas adsorption and desorption processes by charcoal and for gas absorption and desorption processes by LaNi5 were analyzed using a two-phase model and a three-component model, respectively. The assumption of the modeling involved thermal and mechanical equilibria between phases or among the components. The analyses predicted performance well for compressors which have heaters located outside the adsorbent or the absorbent bed. For the rapidly-cycled compressor, where the heater was centrally located, only the transient pressure compared well with the experimental data.

  20. Dynamic Curvature Steering Control for Autonomous Vehicle: Performance Analysis

    NASA Astrophysics Data System (ADS)

    Aizzat Zakaria, Muhammad; Zamzuri, Hairi; Amri Mazlan, Saiful

    2016-02-01

    This paper discusses the design of a dynamic curvature steering control for an autonomous vehicle, covering both lateral and longitudinal control. The controller is designed around a dynamic curvature calculation that estimates the path condition and modifies the vehicle speed and steering wheel angle accordingly. Simulation results are presented to show the capability of the controller to track the reference path: the controller is able to predict the path ahead and adapt the vehicle speed to suit the path condition. The effectiveness of the controller is demonstrated by performance identical to the benchmark, but with additional curvature-adaptation capability.

  1. Performance and flow analysis of vortex wind power turbines

    SciTech Connect

    Rangwalla, A.A.; Hsu, C.T.

    1982-10-01

    The theoretical study presented investigates some possible vortex flow solutions in the tornado-type wind energy system and evaluates the power coefficient that can be obtained theoretically. The actuator disc concept is applied to the vortex wind turbine configuration. The Burgers vortex model is then introduced and the performance of a turbine using it is derived. A generalized analytical solution of the model is given, followed by a numerical solution of the complete equations. The stability of a Burgers vortex is discussed. (LEW)
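
    The Burgers vortex mentioned in the abstract has the closed-form azimuthal velocity v_theta(r) = Gamma/(2*pi*r) * (1 - exp(-a r^2 / (2 nu))), with circulation Gamma, axial strain rate a, and kinematic viscosity nu. A small sketch with assumed parameter values (not taken from the paper):

```python
import numpy as np

def burgers_v_theta(r, gamma=1.0, a=1.0, nu=0.01):
    """Azimuthal velocity of a Burgers vortex: solid-body rotation in
    the viscous core, potential-vortex decay (gamma / 2 pi r) outside."""
    r = np.asarray(r, dtype=float)
    return gamma / (2 * np.pi * r) * (1.0 - np.exp(-a * r**2 / (2 * nu)))

r = np.linspace(0.01, 1.0, 200)
v = burgers_v_theta(r)
```

    Near the axis the profile is linear, v_theta ~ (gamma * a / (4 pi nu)) r, while far from the core it recovers the potential vortex gamma / (2 pi r); both limits are easy to check numerically.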

  2. Apparatus and method for performing microfluidic manipulations for chemical analysis

    DOEpatents

    Ramsey, J. Michael

    2002-01-01

    A microchip apparatus and method provide fluidic manipulations for a variety of applications, including sample injection for microchip liquid chromatography. The microchip is fabricated using standard photolithographic procedures and chemical wet etching, with the substrate and cover plate joined using direct bonding. Capillary electrophoresis is performed in channels formed in the substrate. Injections are made by electro-osmotically pumping sample through the injection channel that crosses the separation channel, followed by a switching of the potentials to force a plug into the separation channel.

  3. Apparatus and method for performing microfluidic manipulations for chemical analysis

    DOEpatents

    Ramsey, J. Michael

    1999-01-01

    A microchip apparatus and method provide fluidic manipulations for a variety of applications, including sample injection for microchip liquid chromatography. The microchip is fabricated using standard photolithographic procedures and chemical wet etching, with the substrate and cover plate joined using direct bonding. Capillary electrophoresis is performed in channels formed in the substrate. Injections are made by electro-osmotically pumping sample through the injection channel that crosses the separation channel, followed by a switching of the potentials to force a plug into the separation channel.

  4. Performability analysis using semi-Markov reward processes

    NASA Technical Reports Server (NTRS)

    Ciardo, Gianfranco; Marie, Raymond A.; Sericola, Bruno; Trivedi, Kishor S.

    1990-01-01

    Beaudry (1978) proposed a simple method of computing the distribution of performability in a Markov reward process. Two extensions of Beaudry's approach are presented. The method is generalized to a semi-Markov reward process by removing the restriction requiring the association of zero reward to absorbing states only. The algorithm proceeds by replacing zero-reward nonabsorbing states by a probabilistic switch; it is therefore related to the elimination of vanishing states from the reachability graph of a generalized stochastic Petri net and to the elimination of fast transient states in a decomposition approach to stiff Markov chains. The use of the approach is illustrated with three applications.
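
    A much-simplified cousin of the performability computation above is the *expected* reward accumulated until absorption in a Markov reward process, which needs only one linear solve over the transient states. This is a sketch of the mean only, not the full distribution that Beaudry-type methods compute; the 3-state chain is hypothetical.

```python
import numpy as np

def expected_accumulated_reward(Q, rewards, transient):
    """Expected reward accumulated until absorption in a CTMC with
    generator Q (rows sum to zero) and per-state reward *rates*.
    Solves  -Q[T,T] E = rewards[T]  over the transient set T."""
    Q = np.asarray(Q, dtype=float)
    r = np.asarray(rewards, dtype=float)
    T = list(transient)
    E = np.linalg.solve(-Q[np.ix_(T, T)], r[T])
    full = np.zeros(len(r))
    full[T] = E          # absorbing states accumulate nothing further
    return full

# 3-state example: states 0 and 1 transient, state 2 absorbing (failure).
Q = np.array([[-2.0,  1.0, 1.0],
              [ 1.0, -3.0, 2.0],
              [ 0.0,  0.0, 0.0]])
rew = np.array([1.0, 0.5, 0.0])   # reward rate while sojourning in each state
E = expected_accumulated_reward(Q, rew, transient=[0, 1])
```

    For this chain the solve gives E = (0.7, 0.4, 0) reward units accumulated before absorption, depending on the starting state.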

  5. Using Importance-Performance Analysis to Guide Instructional Design of Experiential Learning Activities

    ERIC Educational Resources Information Center

    Anderson, Sheri; Hsu, Yu-Chang; Kinney, Judy

    2016-01-01

    Designing experiential learning activities requires an instructor to think about what they want the students to learn. Using importance-performance analysis can assist with the instructional design of the activities. This exploratory study used importance-performance analysis in an online introduction to criminology course. There is limited…

  6. Identifying Musical Performance Behavior in Instrumentalists Using Computer-Based Sound Spectrum Analysis.

    ERIC Educational Resources Information Center

    Rees, Fred J.; Michelis, Rainer M.

    1991-01-01

    Examines relationships between musical information processed by a computer-based sound analysis system and the audiovisual record of a performer's response to musical assignments. Concludes that computer analysis permits identification of performance behavior. Suggests that a database could be designed to provide both diagnostic response to…

  7. Performance analysis of LVQ algorithms: a statistical physics approach.

    PubMed

    Ghosh, Anarta; Biehl, Michael; Hammer, Barbara

    2006-01-01

    Learning vector quantization (LVQ) constitutes a powerful and intuitive method for adaptive nearest prototype classification. However, original LVQ has been introduced based on heuristics and numerous modifications exist to achieve better convergence and stability. Recently, a mathematical foundation by means of a cost function has been proposed which, as a limiting case, yields a learning rule similar to classical LVQ2.1. It also motivates a modification which shows better stability. However, the exact dynamics as well as the generalization ability of many LVQ algorithms have not been thoroughly investigated so far. Using concepts from statistical physics and the theory of on-line learning, we present a mathematical framework to analyse the performance of different LVQ algorithms in a typical scenario in terms of their dynamics, sensitivity to initial conditions, and generalization ability. Significant differences in the algorithmic stability and generalization ability can be found already for slightly different variants of LVQ. We study five LVQ algorithms in detail: Kohonen's original LVQ1, unsupervised vector quantization (VQ), a mixture of VQ and LVQ, LVQ2.1, and a variant of LVQ which is based on a cost function. Surprisingly, basic LVQ1 shows very good performance in terms of stability, asymptotic generalization ability, and robustness to initializations and model parameters which, in many cases, is superior to recent alternative proposals.
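
    Kohonen's basic LVQ1 rule, which the abstract finds surprisingly robust, moves the winning prototype toward a training sample when their labels agree and away when they disagree. A minimal sketch on synthetic two-cluster data (the clusters and learning rate are illustrative assumptions):

```python
import numpy as np

def lvq1_step(prototypes, labels, x, y, eta=0.05):
    """One LVQ1 update: attract the nearest prototype to x if the
    labels match, repel it otherwise."""
    d = np.linalg.norm(prototypes - x, axis=1)
    w = int(np.argmin(d))                       # winner index
    sign = 1.0 if labels[w] == y else -1.0
    prototypes[w] += sign * eta * (x - prototypes[w])
    return w

rng = np.random.default_rng(0)
protos = np.array([[0.0, 0.0], [1.0, 1.0]])     # one prototype per class
plabels = [0, 1]
# Hypothetical data: class 0 clustered at (0,0), class 1 at (1,1).
for _ in range(200):
    y = int(rng.integers(0, 2))
    x = rng.normal(loc=y, scale=0.1, size=2)
    lvq1_step(protos, plabels, x, y)
```

    After training, each prototype stays near the mean of its own cluster, which is the stable fixed point of the attraction/repulsion dynamics in this separable setting.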

  8. An aerodynamic performance analysis of a perforated wind turbine blade

    NASA Astrophysics Data System (ADS)

    Didane, D. H.; Mohd, S.; Subari, Z.; Rosly, N.; Ghafir, M. F. Abdul; Mohd Masrom, M. F.

    2016-11-01

    Wind power is one of the important renewable energy sources. Currently, many researches are focusing on improving the aerodynamic performance of wind turbine blades through simulations and wind tunnel testing. In the present study, the aerodynamic performance of the perforated Eqwin blade (shell type blade) is investigated by using numerical simulation. Three types of slots namely circular, horizontal rectangular and vertical rectangular were evaluated. It was found that the optimum angle of attack for a perforated shell type blade was 12° with maximum Cl/Cd value of 6.420. In general, for all the perforated blade cases, Cl/Cd tended to decrease as the slot size increased except for the circular slot with 5 mm diameter. This was due to the disturbance of the airflow in lower side region which passed through the bigger slot size. Among the modified slots; the circular slot with diameter of 5 mm would be the best slot configuration that can be considered for blade fabrication. The Cl/Cd obtained was 6.46 which is about 5% more than the value of the reference blade. Moreover, the introduced slot would also reduce the overall weight of the blade by 1.3%.

  9. Performance analysis of flexible DSSC with binder addition

    NASA Astrophysics Data System (ADS)

    Muliani, Lia; Hidayat, Jojo; Anggraini, Putri Nur

    2016-04-01

    Flexible DSSC is one of modification of DSSC based on its substrate. Operating at low temperature, flexible DSSC requires a binder to improve particles interconnection. This research was done to compare the morphology and performance of flexible DSSC that was produced with binder-added and binder-free. TiO2 powder, butanol, and HCl were mixed for preparation of TiO2 paste. Small amount of titanium isopropoxide as binder was added into the mixture. TiO2 paste was deposited on ITO-PET plastic substrate with area of 1x1 cm2 by doctor blade method. Furthermore, SEM, XRD, and BET characterization were done to analyze morphology and surface area of the TiO2 photoelectrode microstructures. Dyed TiO2 photoelectrode and platinum counter electrode were assembled and injected by electrolyte. In the last process, flexible DSSCs were illuminated by sun simulator to do J-V measurement. As a result, flexible DSSC containing binder showed higher performance with photoconversion efficiency of 0.31%.

  10. The performance analysis of linux networking - packet receiving

    SciTech Connect

    Wu, Wenji; Crawford, Matt; Bowden, Mark; /Fermilab

    2006-11-01

    The computing models for High-Energy Physics experiments are becoming ever more globally distributed and grid-based, both for technical reasons (e.g., to place computational and data resources near each other and the demand) and for strategic reasons (e.g., to leverage equipment investments). To support such computing models, the network and end systems, computing and storage, face unprecedented challenges. One of the biggest challenges is to transfer scientific data sets--now in the multi-petabyte (10{sup 15} bytes) range and expected to grow to exabytes within a decade--reliably and efficiently among facilities and computation centers scattered around the world. Both the network and end systems should be able to provide the capabilities to support high bandwidth, sustained, end-to-end data transmission. Recent trends in technology are showing that although the raw transmission speeds used in networks are increasing rapidly, the rate of advancement of microprocessor technology has slowed down. Therefore, network protocol-processing overheads have risen sharply in comparison with the time spent in packet transmission, resulting in degraded throughput for networked applications. More and more, it is the network end system, instead of the network, that is responsible for degraded performance of network applications. In this paper, the Linux system's packet receive process is studied from NIC to application. We develop a mathematical model to characterize the Linux packet receiving process. Key factors that affect Linux systems network performance are analyzed.
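
    One ingredient of the receive path analyzed above, the finite NIC receive ring, can be illustrated with a toy discrete-event simulation: Poisson packet arrivals, deterministic per-packet service by the kernel, and drops when the ring is full. This is a didactic M/D/1/K-style sketch, not the paper's actual model; all rates are made up.

```python
import random

def simulate_ring(arrival_rate, service_rate, capacity, n_packets, seed=1):
    """Toy simulation of a NIC receive ring buffer. Returns the
    fraction of arriving packets dropped because the ring was full."""
    rng = random.Random(seed)
    t = 0.0
    next_done = float("inf")    # completion time of the packet in service
    queue = 0
    drops = 0
    service = 1.0 / service_rate
    for _ in range(n_packets):
        t += rng.expovariate(arrival_rate)       # next arrival time
        while next_done <= t and queue > 0:      # drain finished services
            queue -= 1
            next_done = next_done + service if queue else float("inf")
        if queue >= capacity:
            drops += 1                           # ring full: packet lost
        else:
            if queue == 0:
                next_done = t + service          # server was idle
            queue += 1
    return drops / n_packets

light = simulate_ring(arrival_rate=0.5, service_rate=1.0, capacity=64,
                      n_packets=20000)
heavy = simulate_ring(arrival_rate=2.0, service_rate=1.0, capacity=64,
                      n_packets=20000)
```

    Under light load the ring almost never fills, while at twice the service rate roughly half of the arrivals are dropped in the long run, the qualitative behavior the paper's receive-path model captures analytically.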

  11. Performance analysis of a large-grain dataflow scheduling paradigm

    NASA Technical Reports Server (NTRS)

    Young, Steven D.; Wills, Robert W.

    1993-01-01

    A paradigm for scheduling computations on a network of multiprocessors using large-grain dataflow scheduling at run time is described and analyzed. The computations to be scheduled must follow a static flow graph, while the schedule itself will be dynamic (i.e., determined at run time). Many applications characterized by static flow exist, including real-time control and digital signal processing. With the advent of computer-aided software engineering (CASE) tools for capturing software designs in dataflow-like structures, macro-dataflow scheduling becomes increasingly attractive, if not necessary. For parallel implementations, the macro-dataflow method allows the scheduling to be insulated from the application designer and enables maximum utilization of available resources. Further, by allowing multitasking, processor utilizations can approach 100 percent while maintaining maximum speedup. Extensive simulation studies are performed on 4-, 8-, and 16-processor architectures that reflect the effects of communication delays, scheduling delays, algorithm class, and multitasking on performance and speedup gains.
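
    The core of macro-dataflow scheduling, dispatching tasks of a static flow graph to processors as their predecessors complete, can be illustrated with greedy list scheduling. This is a generic textbook sketch, not the paper's simulator; the diamond-shaped task graph is hypothetical.

```python
import heapq

def list_schedule(tasks, deps, n_proc):
    """Greedy list scheduling of a static dataflow graph.
    tasks: {name: duration}; deps: {name: [predecessor names]}.
    Returns the makespan (parallel completion time)."""
    indeg = {t: len(deps.get(t, [])) for t in tasks}
    succ = {t: [] for t in tasks}
    for t, ps in deps.items():
        for p in ps:
            succ[p].append(t)
    ready_at = {t: 0.0 for t in tasks}
    ready = [(0.0, t) for t in tasks if indeg[t] == 0]
    heapq.heapify(ready)
    procs = [0.0] * n_proc               # time each processor frees up
    finish = {}
    while ready:
        r, t = heapq.heappop(ready)      # earliest-ready task first
        i = min(range(n_proc), key=procs.__getitem__)
        start = max(procs[i], r)
        procs[i] = finish[t] = start + tasks[t]
        for s in succ[t]:                # release successors
            ready_at[s] = max(ready_at[s], finish[t])
            indeg[s] -= 1
            if indeg[s] == 0:
                heapq.heappush(ready, (ready_at[s], s))
    return max(finish.values())

# Hypothetical diamond graph: a feeds b and c, which both feed d.
tasks = {"a": 1.0, "b": 2.0, "c": 2.0, "d": 1.0}
deps = {"b": ["a"], "c": ["a"], "d": ["b", "c"]}
serial = sum(tasks.values())                    # 6.0 on one processor
makespan = list_schedule(tasks, deps, n_proc=2)
```

    On two processors the independent tasks b and c run concurrently, so the makespan equals the critical path (a-b-d = 4) and the speedup is 6/4 = 1.5.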

  12. Structural Design and Sealing Performance Analysis of Biomimetic Sealing Ring

    PubMed Central

    Han, Chuanjun

    2015-01-01

    In order to reduce the failure probability of rubber sealing rings in reciprocating dynamic seal, a new structure of sealing ring based on bionics was designed. The biomimetic ring has three concave ridges and convex bulges on each side which are very similar to earthworms. Bulges were circularly designed and sealing performances of the biomimetic ring in both static seal and dynamic seal were simulated by FEM. In addition, effects of precompression, medium pressure, speed, friction coefficient, and material parameters on sealing performances were discussed. The results show that von Mises stress of the biomimetic sealing ring distributed symmetrically in no-pressure static sealing. The maximum von Mises stress appears on the second bulge of the inner side. High contact stress concentrates on left bulges. Von Mises stress distribution becomes uneven under medium pressure. Both von Mises stress and contact stress increase when precompression, medium pressure, and rubber hardness increase in static sealing. Biomimetic ring can avoid rolling and distortion in reciprocating dynamic seal, and its working life is much longer than O-ring and rectangular ring. The maximum von Mises stress and contact stress increase with the precompression, medium pressure, rubber hardness, and friction coefficient in reciprocating dynamic seal. PMID:27019582

  13. A Preliminary Analysis of LANDSAT-4 Thematic Mapper Radiometric Performance

    NASA Technical Reports Server (NTRS)

    Justice, C.; Fusco, L.; Mehl, W.

    1985-01-01

    The NASA raw (BT) product, the radiometrically corrected (AT) product, and the radiometrically and geometrically corrected (PT) product of a TM scene were analyzed to examine the frequency distribution of the digital data, the statistical correlation between the bands, and the variability between the detectors within a band. The analyses were performed on a series of image subsets from the full scene. Results are presented for one 1024 × 1024 pixel subset of Reelfoot Lake, Tennessee, which displayed a representative range of ground conditions and cover types occurring within the full-frame image. From this cursory examination of one of the first seven-channel TM data sets, it would appear that the radiometric performance of the system is most satisfactory and largely meets pre-launch specifications. Problems were noted with Band 5 Detector 3 and Band 2 Detector 4. Differences were observed between forward and reverse scan detector responses for both the BT and AT products. No systematic variations were observed between odd and even detectors.

  14. Performance Analysis for the New g-2 Experiment at Fermilab

    SciTech Connect

    Stratakis, Diktys; Convery, Mary; Crmkovic, J.; Froemming, Nathan; Johnstone, Carol; Johnstone, John; Korostelev, Maxim; Morgan, James; Morse, William; Syphers, Michael; Tishchenko, Vladimir

    2016-06-01

    The new g-2 experiment at Fermilab aims to measure the muon anomalous magnetic moment to a precision of ±0.14 ppm, a fourfold improvement over the 0.54 ppm precision obtained in the BNL E821 g-2 experiment. Achieving this goal requires the delivery of highly polarized 3.094 GeV/c muons with a narrow ±0.5% Δp/p acceptance to the g-2 storage ring. In this study, we describe a muon capture and transport scheme that should meet this requirement. First, we present the conceptual design of our proposed scheme and describe its basic features. Then, we detail its performance numerically by simulating the pion production in the g-2 production target, the muon collection by the downstream beamline optics, as well as the beam polarization and spin-momentum correlation up to the storage ring. The sensitivity of the proposed channel's performance to key parameters such as magnet apertures and magnet positioning errors is also analyzed.

  15. High-performance computing in accelerating structure design and analysis

    NASA Astrophysics Data System (ADS)

    Li, Zenghai; Folwell, Nathan; Ge, Lixin; Guetz, Adam; Ivanov, Valentin; Kowalski, Marc; Lee, Lie-Quan; Ng, Cho-Kuen; Schussman, Greg; Stingelin, Lukas; Uplenchwar, Ravindra; Wolf, Michael; Xiao, Liling; Ko, Kwok

    2006-03-01

    Future high-energy accelerators such as the Next Linear Collider (NLC) will accelerate multi-bunch beams of high current and low emittance to obtain high luminosity, which put stringent requirements on the accelerating structures for efficiency and beam stability. While numerical modeling has been quite standard in accelerator R&D, designing the NLC accelerating structure required a new simulation capability because of the geometric complexity and level of accuracy involved. Under the US DOE Advanced Computing initiatives (first the Grand Challenge and now SciDAC), SLAC has developed a suite of electromagnetic codes based on unstructured grids and utilizing high-performance computing to provide an advanced tool for modeling structures at accuracies and scales previously not possible. This paper will discuss the code development and computational science research (e.g. domain decomposition, scalable eigensolvers, adaptive mesh refinement) that have enabled the large-scale simulations needed for meeting the computational challenges posed by the NLC as well as projects such as the PEP-II and RIA. Numerical results will be presented to show how high-performance computing has made a qualitative improvement in accelerator structure modeling for these accelerators, either at the component level (single cell optimization), or on the scale of an entire structure (beam heating and long-range wakefields).

  16. High-Performance Computing in Accelerating Structure Design And Analysis

    SciTech Connect

    Li, Z.H.; Folwell, N.; Ge, Li-Xin; Guetz, A.; Ivanov, V.; Kowalski, M.; Lee, L.Q.; Ng, C.K.; Schussman, G.; Stingelin, L.; Uplenchwar, R.; Wolf, M.; Xiao, L.L.; Ko, K.; /SLAC /PSI, Villigen /Illinois U., Urbana

    2006-06-27

    Future high-energy accelerators such as the Next Linear Collider (NLC) will accelerate multi-bunch beams of high current and low emittance to obtain high luminosity, which put stringent requirements on the accelerating structures for efficiency and beam stability. While numerical modeling has been quite standard in accelerator R&D, designing the NLC accelerating structure required a new simulation capability because of the geometric complexity and level of accuracy involved. Under the US DOE Advanced Computing initiatives (first the Grand Challenge and now SciDAC), SLAC has developed a suite of electromagnetic codes based on unstructured grids and utilizing high performance computing to provide an advanced tool for modeling structures at accuracies and scales previously not possible. This paper will discuss the code development and computational science research (e.g. domain decomposition, scalable eigensolvers, adaptive mesh refinement) that have enabled the large-scale simulations needed for meeting the computational challenges posed by the NLC as well as projects such as the PEP-II and RIA. Numerical results will be presented to show how high performance computing has made a qualitative improvement in accelerator structure modeling for these accelerators, either at the component level (single cell optimization), or on the scale of an entire structure (beam heating and long range wakefields).

  17. Ascending performance analysis for high altitude zero pressure balloon

    NASA Astrophysics Data System (ADS)

    Saleh, Sherif; He, Weiliang

    2017-04-01

    This paper describes a comprehensive simulation of high-altitude zero-pressure balloon trajectories. A mathematical model was established to simulate the ascending process, taking into account atmospheric conditions and thermodynamic variations. The influences of launch parameters on ascending performance were analyzed. The necessary quantity of initial lift gas was estimated and optimized to ensure that no ballast is consumed during the ascent. The climbing rate was used as the governing parameter to evaluate ascending performance. The simulation results revealed distinctly different effects on the climbing rate in the troposphere and stratosphere layers. Changes in launch time and site mainly affect the climbing rate in the stratosphere and have no significant effect at troposphere and tropopause altitudes, while a change in launch date has a negligible effect on both layers. Due to the Earth's declination angle, the influence of a given launch latitude and longitude is not identical throughout the year. The results also showed that the optimized lift-gas quantity improved the stability of the climbing rate.
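
    The climbing rate that governs the analysis above can be sketched, in the simplest steady-state view, as a balance of net free lift against aerodynamic drag: v = sqrt(2 F_net / (rho C_D A)). This ignores the thermodynamics the paper models; the launch numbers below are assumed for illustration.

```python
import math

def climb_rate(m_system, v_gas, rho_air, rho_gas, cd=0.5, g=9.81):
    """Steady climb rate of a zero-pressure balloon from a buoyancy-drag
    balance, assuming a spherical envelope of gas volume v_gas [m^3]
    and total system mass m_system [kg]. Returns 0 if lift is negative."""
    f_net = (rho_air - rho_gas) * v_gas * g - m_system * g   # net lift [N]
    if f_net <= 0:
        return 0.0
    r = (3 * v_gas / (4 * math.pi)) ** (1 / 3)   # sphere radius
    area = math.pi * r**2                        # frontal area
    return math.sqrt(2 * f_net / (rho_air * cd * area))

# Hypothetical near-surface launch: 100 m^3 of helium, 80 kg system.
v = climb_rate(m_system=80.0, v_gas=100.0, rho_air=1.225, rho_gas=0.169)
```

    With these numbers the gross lift is about 105.6 kg, leaving 25.6 kg of free lift and a climb rate of a few metres per second, which is the right order of magnitude for scientific balloons.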

  18. Methods for Analysis of Outdoor Performance Data (Presentation)

    SciTech Connect

    Jordan, D.

    2011-02-01

    The ability to accurately predict power delivery over the course of time is of vital importance to the growth of the photovoltaic (PV) industry. Two key cost drivers are the efficiency with which sunlight is converted to power and secondly how this relationship develops over time. The accurate knowledge of power decline over time, also known as degradation rates, is essential and important to all stakeholders--utility companies, integrators, investors, and scientists alike. Different methods to determine degradation rates and discrete versus continuous data are presented, and some general best practice methods are outlined. In addition, historical degradation rates and some preliminary analysis with respect to climate are given.
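
    The simplest of the degradation-rate methods compared in the talk is a least-squares linear fit of normalized performance versus time, with the slope expressed in %/year. A sketch on synthetic data (the 0.8 %/year decline is made up):

```python
import numpy as np

def degradation_rate(years, perf):
    """Linear-fit degradation rate in %/year, relative to the fitted
    initial performance."""
    slope, intercept = np.polyfit(years, perf, 1)
    return 100.0 * slope / intercept

# Synthetic 5-year record: 0.8 %/year decline plus measurement noise.
rng = np.random.default_rng(42)
t = np.linspace(0, 5, 60)
y = 1.0 - 0.008 * t + rng.normal(0, 0.002, t.size)
rate = degradation_rate(t, y)
```

    The fit recovers a rate close to the injected -0.8 %/year; with real outdoor data, seasonality and soiling make the choice of method (linear fit, year-on-year, decomposition) matter far more than in this clean example.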

  19. Human Performance Modeling for Dynamic Human Reliability Analysis

    SciTech Connect

    Boring, Ronald Laurids; Joe, Jeffrey Clark; Mandelli, Diego

    2015-08-01

    Part of the U.S. Department of Energy’s (DOE’s) Light Water Reac- tor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Charac- terization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk framework. In this paper, we review simulation based and non simulation based human reliability analysis (HRA) methods. This paper summarizes the founda- tional information needed to develop a feasible approach to modeling human in- teractions in RISMC simulations.

  20. Navier-Stokes analysis of radial turbine rotor performance

    NASA Technical Reports Server (NTRS)

    Larosiliere, L. M.

    1993-01-01

    An analysis of flow through a radial turbine rotor using the three-dimensional, thin-layer Navier-Stokes code RVC3D is described. The rotor is a solid version of an air-cooled metallic radial turbine having thick trailing edges, shroud clearance, and scalloped-backface clearance. Results are presented at the nominal operating condition using both a zero-clearance model and a model simulating the effects of the shroud and scalloped-backface clearance flows. A comparison with the available test data is made and details of the internal flow physics are discussed, allowing a better understanding of the complex flow distribution within the rotor.

  1. Performance analysis of high quality parallel preconditioners applied to 3D finite element structural analysis

    SciTech Connect

    Kolotilina, L.; Nikishin, A.; Yeremin, A.

    1994-12-31

    The solution of large systems of linear equations is a crucial bottleneck when performing 3D finite element analysis of structures. Also, in many cases the reliability and robustness of iterative solution strategies, and their efficiency when exploiting hardware resources, fully determine the scope of industrial applications which can be solved on a particular computer platform. This is especially true for modern vector/parallel supercomputers with large vector length and for modern massively parallel supercomputers. Preconditioned iterative methods have been successfully applied to industrial-class finite element analysis of structures. The construction and application of high quality preconditioners constitutes a high percentage of the total solution time. Parallel implementation of high quality preconditioners on such architectures is a formidable challenge. Two common types of existing preconditioners are the implicit preconditioners and the explicit preconditioners. The implicit preconditioners (e.g. incomplete factorizations of several types) are generally high quality but require the solution of lower and upper triangular systems of equations per iteration, which are difficult to parallelize without deteriorating the convergence rate. The explicit preconditioners (e.g. polynomial preconditioners or Jacobi-like preconditioners) require sparse matrix-vector multiplications and can be parallelized, but their preconditioning quality is less than desirable. The authors present results of numerical experiments with Factorized Sparse Approximate Inverses (FSAI) for symmetric positive definite linear systems. These are high quality preconditioners that possess a large resource of parallelism by construction without increasing the serial complexity.
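
    The basic FSAI construction can be sketched densely for a small SPD matrix: for each row i, with sparsity pattern J taken from the lower triangle of A, solve the small system A[J,J] g = e_i and scale so that G A G^T has unit diagonal. This is a didactic dense sketch (real FSAI codes work on sparse data structures); the 1D Laplacian test matrix is an assumption.

```python
import numpy as np

def fsai(A):
    """Factorized sparse approximate inverse: lower-triangular G with
    G A G^T ~ I, using the lower-triangle pattern of A. Each row only
    requires an independent small solve, hence the abundant parallelism."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    G = np.zeros_like(A)
    for i in range(n):
        J = [j for j in range(i + 1) if A[i, j] != 0 or j == i]
        k = J.index(i)                     # position of the diagonal
        e = np.zeros(len(J))
        e[k] = 1.0
        g = np.linalg.solve(A[np.ix_(J, J)], e)
        G[i, J] = g / np.sqrt(g[k])        # scaling gives unit diagonal
    return G

# 1D Laplacian (SPD, tridiagonal) as a test matrix.
n = 12
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
G = fsai(A)
M = G @ A @ G.T    # preconditioned operator
```

    By construction diag(G A G^T) = 1 exactly, and even this minimal bandwidth-1 pattern noticeably reduces the condition number of the Laplacian.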

  2. Comparative analysis of actigraphy performance in healthy young subjects.

    PubMed

    Bellone, Giannina J; Plano, Santiago A; Cardinali, Daniel P; Chada, Daniel Pérez; Vigo, Daniel E; Golombek, Diego A

    2016-01-01

    Sleep-related health disorders are increasing worldwide; diagnosis and treatment of such sleep diseases are commonly invasive and sometimes impractical or expensive. Actigraphy has recently been introduced as a tool for the study of sleep and circadian disorders; however, there are several devices that claim to be useful for research but have not been thoroughly tested. This comparative study provides activity, sleep and temperature information regarding several of the most commonly used actigraphs: Micro-Mini Motion Logger; Act Trust; Misfit Flash; Fitbit Flex & Thermochron. Twenty-two healthy young subjects were assessed with five different commercial actigraphs (Micro-Mini Motionlogger Watch, Condor Act Trust, MisFit Flash and Fitbit Flex) and a temperature recorder (Thermochron), and also completed a sleep diary for a week. There were no significant differences in the analysis of rest-activity patterns between devices. Temperature rhythm comparison between the Act Trust and the Thermochron showed significant differences in rhythm percentage (p<0.05) and mesor (p<0.0563) but not in amplitude or acrophase. Although data accessibility and ease of use were very different for the diverse devices, there were no significant differences for sleep onset, total sleep time and sleep efficiency recordings, where applicable. In conclusion, depending on the type of study and analysis desired (as well as cost and compliance of use), we propose some relative advantages for the different actigraphy/temperature recording devices.

  3. Performance analysis of elite men's and women's wheelchair basketball teams.

    PubMed

    Gómez, Miguel Ángel; Pérez, Javier; Molik, Bartosz; Szyman, Robert J; Sampaio, Jaime

    2014-01-01

    The purpose of the present study was to identify which game-related statistics discriminate winning and losing teams in men's and women's elite wheelchair basketball. The sample comprised all the games played during the Beijing Paralympics 2008 and the World Wheelchair Basketball Championship 2010. The game-related statistics from the official box scores were gathered and data were analysed in 2 groups: balanced games (final score differences ≤ 12 points) and unbalanced games (final score differences >13 points). Discriminant analysis identified the successful 2-point field-goals and free-throws, the unsuccessful 3-point field-goals and free-throws, and the assists and fouls received as discriminant statistics between winning and losing teams in men's balanced games. In women's games, the teams were discriminated only by the successful 2-point field-goals. Linear regression analysis showed that the quality of opposition had a large effect on the final point differential. The field-goal percentage and free-throw rate were the most important factors in men's games, and field-goal percentage and offensive rebounding percentage in women's games. The identified trends improve game understanding and help wheelchair basketball coaches to plan accurate practice sessions and, ultimately, to make better decisions in competition.

  4. A Kinematics Analysis Of Three Best 100 M Performances Ever

    PubMed Central

    Krzysztof, Maćkała; Mero, Antti

    2013-01-01

    The purpose of this investigation was to compare and determine the relevance of the morphological characteristics and variability of running speed parameters (stride length and stride frequency) between Usain Bolt’s three best 100 m performances. Based on this, an attempt was made to define which factors determine the performance of Usain Bolt’s sprint and, therefore, distinguish him from other sprinters. We analyzed the previous world record of 9.69 s set at the 2008 Beijing Olympics, the current record of 9.58 s set at the 2009 Berlin World Championships in Athletics and the Olympic record of 9.63 s set at the 2012 London Olympic Games by Usain Bolt. The VirtualDub program was used to acquire basic kinematic variables such as step length and step frequency over the 100 m sprint from video footage provided by the NBC and BBC TV stations. These data were compared with other data available on the web and data published by the Scientific Research Project office responsible on behalf of the IAAF and the German Athletics Association (DLV). The main hypothesis was that step length is the main factor determining running speed in the 10 and 20 m sections of the entire 100 m distance. Bolt’s anthropometric advantage (body height, leg length and linear body) is unquestionable, and it is one of the factors that makes him faster than the rest of the finalists in each of the three competitions. Additionally, Bolt’s 20 cm longer stride is a benefit in the latter part of the race. Beyond these factors, he is probably able to strike the ground more forcefully than the rest of the sprinters, relative to body mass, and may therefore maximize his time on the ground and exert the same force over this period of time. This ability, combined with a longer stride, allows him to reach a very high running speed - over 12 m/s (12.05-12.34 m/s) - in some 10 m sections of his three 100 m performances. These assumptions confirmed the application of

  5. Structural analysis of amorphous phosphates using high performance liquid chromatography

    SciTech Connect

    Sales, B.C.; Boatner, L.A.; Chakoumakos, B.C.; McCallum, J.C.; Ramey, J.O.; Zuhr, R.A.

    1993-12-31

    Determining the atomic-scale structure of amorphous solids has proven to be a formidable scientific and technological problem for the past 100 years. The technique of high-performance liquid chromatography (HPLC) provides unique detailed information regarding the structure of partially disordered or amorphous phosphate solids. Applications of the experimental technique of HPLC to phosphate solids are reviewed, and examples of the type of information that can be obtained with HPLC are presented. Inorganic phosphates encompass a large class of important materials whose applications include: catalysts, ion-exchange media, solid electrolytes for batteries, linear and nonlinear optical components, chelating agents, synthetic replacements for bone and teeth, phosphors, detergents, and fertilizers. Phosphate ions also represent a unique link between living systems and the inorganic world.

  6. Rankine engine solar power generation. I - Performance and economic analysis

    NASA Technical Reports Server (NTRS)

    Gossler, A. A.; Orrock, J. E.

    1981-01-01

    Results of a computer simulation of the performance of a solar flat-plate-collector-powered electrical generation system are presented. The simulation covered locations in New Mexico, North Dakota, Tennessee, and Massachusetts, and considered a collector system with storage using a water-based heat-transfer fluid. The collectors also powered a Rankine-cycle boiler filled with a low-temperature working fluid. The generator was considered to run only when excess solar heat and full storage would otherwise require purging heat through the collectors. All power was directed into the utility grid. The benefit of adding the solar-powered generator unit was found to depend on site location and collector area; it reduced the effective solar cost for collector areas greater than 400-670 sq m. The sites were ranked economically, best to worst: New Mexico, North Dakota, Massachusetts, and Tennessee.

  7. Oxygen rich gas generator design and performance analysis

    NASA Technical Reports Server (NTRS)

    Gloyer, P. W.; Knuth, W. H.; Crawford, R. A.

    1993-01-01

    The present oxygen-rich combustion research investigates oxygen gas generator concepts. The theoretical and modeling aspects of a selected concept are presented, together with a refined concept resulting from the findings of the study. This investigation examined a counter-flow gas generator design for O2/H2 mass ratios of 100-200, featuring a near-stoichiometric combustion zone followed by downstream mixing. The critical technologies required to develop a performance model are analyzed and include the following: (1) oxygen flow boiling; (2) two-phase oxygen flow heat transfer; (3) film-cooling in the combustion zone; (4) oxygen-rich combustion with hydrogen; and (5) mixing and dilution.

  8. Performance analysis of a mirror by numerical iterative method.

    PubMed

    Park, Kwijong; Cho, Myung; Lee, Dae-Hee; Moon, Bongkon

    2014-12-29

    Zernike polynomials are generally used to predict the optical performance of a mirror. However, it can also be done by a numerical iterative method. As piston, tip, tilt, and defocus (P.T.T.F) aberrations can be easily removed by optical alignment, we iteratively used a rotation transformation and a paraboloid graph subtraction for removal of the aberrations from a raw deformation of the optical surface through a Finite Element Method (FEM). The results of a 30 cm concave circular mirror corrected by the iterative method were almost the same as those yielded by Zernike polynomial fitting, and the computational time was fast. In addition, a concave square mirror whose surface area is π was analyzed in order to visualize the deformation maps of a general mirror aperture shape. The iterative method can be applicable efficiently because it does not depend on the mirror aperture shape.
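
    The removal of piston, tip, tilt and defocus can also be done in one shot by linear least squares. The sketch below is a simplification of my own, not the paper's rotation/paraboloid-subtraction iteration: it fits those four terms to a sampled deformation map and subtracts them, leaving the residual surface error.

```python
import numpy as np

def remove_pttf(x, y, dz):
    """Remove piston, tip, tilt and defocus (P.T.T.F) from a surface
    deformation map by linear least squares.  x, y are sample coordinates,
    dz the raw deformation; returns the residual map and its RMS."""
    # Basis functions: 1 (piston), x (tip), y (tilt), x^2 + y^2 (defocus)
    B = np.column_stack([np.ones_like(x), x, y, x**2 + y**2])
    coef, *_ = np.linalg.lstsq(B, dz, rcond=None)
    resid = dz - B @ coef
    return resid, np.sqrt(np.mean(resid**2))
```

Because the fit is linear in the coefficients, it works for any aperture shape sampled by the FEM nodes, which matches the abstract's point that the approach does not depend on the mirror aperture.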

  9. Finite element analysis and performance study of switched reluctance generator

    NASA Astrophysics Data System (ADS)

    Zhang, Qianhan; Guo, Yingjun; Xu, Qi; Yu, Xiaoying; Guo, Yajie

    2017-03-01

    This paper analyses a three-phase 12/8 switched reluctance generator (SRG) on the basis of its structure and operating principle. The initial size data were calculated in MathCAD, and the simulation model was set up in the ANSOFT software environment with the maximum efficiency and the maximum output power as the main reference parameters. The outer diameter of the stator and the inner diameter of the rotor were parameterized. The static magnetic field distribution, magnetic flux, magnetic energy, torque, inductance characteristics, back electromotive force and phase current waveform of the SRG are obtained by analyzing the static magnetic field and the steady-state motion of the two-dimensional transient magnetic field in the ANSOFT environment. Finally, the experimental data of the prototype are compared with the simulation results, which provide a reliable basis for the design and research of an SRG wind turbine system.

  10. Performance analysis of the ultra-linear optical intensity modulator

    NASA Astrophysics Data System (ADS)

    Madamopoulos, Nicholas; Dingel, Benjamin

    2006-10-01

    The linear optical intensity modulator is a key component in any broadband optical access-based analog fiber-optic link system such as sub-carrier multiplexing (SCM) systems, ultra-dense CATV, Radio-over-Fiber (RoF) communications, and other platform access systems. Previously, we proposed a super-linear optical modulator, having SFDR = 130-140 dB·Hz^(2/3), based on a unique combination of a phase modulator (PM) and a weak ring resonator (RR) modulator within a Mach-Zehnder interferometer (MZI), and presented some of its unique features. In this paper, we characterize this ultra-linear optical intensity modulator further, analyze its RF performance and provide a method for parameter optimization. Other excellent features of this modulator design, such as high manufacturing tolerance, the effect of link insertion loss, its adaptive characteristic and device simplicity, are also discussed.

  11. Benchmarking and performance analysis of the CM-2. [SIMD computer

    NASA Technical Reports Server (NTRS)

    Myers, David W.; Adams, George B., II

    1988-01-01

    A suite of benchmarking routines testing communication, basic arithmetic operations, and selected kernel algorithms written in LISP and PARIS was developed for the CM-2. Experiment runs are automated via a software framework that sequences individual tests, allowing for unattended overnight operation. Multiple measurements are made and treated statistically to generate well-characterized results from the noisy values given by cm:time. The results obtained provide a comparison with similar, but less extensive, testing done on a CM-1. Tests were chosen to aid the algorithmist in constructing fast, efficient, and correct code on the CM-2, as well as gain insight into what performance criteria are needed when evaluating parallel processing machines.

  12. Analysis of the impact of error detection on computer performance

    NASA Technical Reports Server (NTRS)

    Shin, K. C.; Lee, Y. H.

    1983-01-01

    Conventionally, reliability analyses either assume that a fault/error is detected immediately following its occurrence, or neglect damages caused by latent errors. Though unrealistic, this assumption was imposed in order to avoid the difficulty of determining the respective probabilities that a fault induces an error and the error is then detected in a random amount of time after its occurrence. As a remedy for this problem a model is proposed to analyze the impact of error detection on computer performance under moderate assumptions. Error latency, the time interval between occurrence and the moment of detection, is used to measure the effectiveness of a detection mechanism. This model is used to: (1) predict the probability of producing an unreliable result, and (2) estimate the loss of computation due to fault and/or error.
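
    A toy model in this spirit can be simulated directly. The distributional choices below are my own assumptions, not the paper's model: the fault occurs at a uniform random time within a task of length T, and the error latency is exponential with rate λ. The sketch estimates the probability that the error is still latent when the task completes, i.e. that an unreliable result is produced; under these assumptions the analytic value is (1 - e^(-λT))/(λT).

```python
import numpy as np

def p_unreliable(lam, T, n=200_000, seed=1):
    """Monte-Carlo estimate of the probability that a latent error escapes
    detection before a task of length T completes.

    Assumptions (illustrative, not the paper's): the fault occurs at a
    uniform random time in [0, T), and detection follows after an
    exponential latency with rate lam."""
    rng = np.random.default_rng(seed)
    t_fault = rng.uniform(0.0, T, n)          # when the error appears
    latency = rng.exponential(1.0 / lam, n)   # error latency until detection
    # unreliable result: the task ends before the error is detected
    return np.mean(t_fault + latency > T)
```

The same skeleton extends naturally to the paper's second quantity, the lost computation, by accumulating T - t_fault over the undetected runs.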

  13. Operational Performance Analysis of Passive Acoustic Monitoring for Killer Whales

    SciTech Connect

    Matzner, Shari; Fu, Tao; Ren, Huiying; Deng, Zhiqun; Sun, Yannan; Carlson, Thomas J.

    2011-09-30

    For the planned tidal turbine site in Puget Sound, WA, the main concern is to protect Southern Resident Killer Whales (SRKW) due to their Endangered Species Act status. A passive acoustic monitoring system is proposed because the whales emit vocalizations that can be detected by a passive system. The algorithm for detection is implemented in two stages. The first stage is an energy detector designed to detect candidate signals. The second stage is a spectral classifier that is designed to reduce false alarms. The evaluation presented here of the detection algorithm incorporates behavioral models of the species of interest, environmental models of noise levels and potential false alarm sources to provide a realistic characterization of expected operational performance.
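
    A two-stage detector of the shape described above can be sketched as follows. The window length, energy threshold and call band are illustrative placeholders of mine, not the operational values of the proposed system.

```python
import numpy as np

def detect_calls(x, fs, win=1024, energy_k=3.0, band=(1000, 10000), band_frac=0.6):
    """Two-stage detection sketch.

    Stage 1 (energy detector): flag windows whose energy exceeds
    energy_k times the median window energy.
    Stage 2 (spectral classifier): keep a candidate only if at least
    band_frac of its spectral energy lies in the assumed call band,
    which reduces false alarms from broadband noise."""
    n_win = len(x) // win
    e = np.array([np.sum(x[i * win:(i + 1) * win] ** 2) for i in range(n_win)])
    thresh = energy_k * np.median(e)
    freqs = np.fft.rfftfreq(win, 1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    hits = []
    for i in np.flatnonzero(e > thresh):                       # stage 1
        spec = np.abs(np.fft.rfft(x[i * win:(i + 1) * win])) ** 2
        if spec[in_band].sum() / spec.sum() >= band_frac:      # stage 2
            hits.append(int(i))
    return hits
```

Running it on synthetic data (noise plus a tone burst standing in for a vocalization) flags exactly the windows containing the burst.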

  14. Performance analysis of rotating disc contactor (RDC) column

    NASA Astrophysics Data System (ADS)

    Aiffah, Wan Nurul; Aisyah, Siti; Fashihah, Nor; Anuar, Khairil

    2014-06-01

    Liquid-liquid extraction is one of the most important separation processes. Different kinds of liquid-liquid extractors, such as the Rotating Disc Contactor (RDC) column, are used in industry. The study of liquid-liquid extraction in an RDC column has become a very important subject, discussed not just among chemical engineers but among mathematicians as well. In this study, the performance of a small-diameter RDC column using the chemical system cumene/isobutyric acid/water is analyzed by the method of design of experiments (DOE). DOE is applied to estimate the effects of four independent variables (rotor speed, flow rate, and the concentrations of the continuous and dispersed inlets) and their interactions, to determine the most significant factors affecting the concentrations of the continuous and dispersed outlets as output parameters.
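
    The effect-estimation step of such a study can be illustrated with a coded 2^4 full-factorial design. The factor names follow the abstract, but the design and the response used here are synthetic, not the study's data.

```python
import itertools
import numpy as np

def main_effects(y):
    """Main effects from a 2^4 full-factorial experiment in coded -1/+1 units.

    y must be ordered to match the standard-order design matrix generated
    below.  Factors (following the abstract): rotor speed, flow rate, and
    the concentrations of the continuous and dispersed inlets.
    Effect of factor j = mean(y at +1) - mean(y at -1)."""
    X = np.array(list(itertools.product([-1, 1], repeat=4)), dtype=float)
    return X.T @ y / (len(y) / 2)
```

Because the coded columns are mutually orthogonal, each effect is estimated independently of the others, which is what makes the factorial DOE analysis so compact.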

  15. Performance analysis of the BNL FACE gas injection system

    SciTech Connect

    Lipfert, F.W.; Hendrey, G.R.; Lewin, K.F.; Nagy, J.; Alexander, Y.

    1992-12-31

    As described elsewhere in this volume, the criteria for successful operation of a FACE-type crop fumigation system include the spatial uniformity of the gas injected over the crop growing area, the efficiency of gas usage, and the overall cost of the system. Efficiency of gas usage is important not only from a cost standpoint, but also to reduce the distances required to preclude interference between replicate FACE arrays or control plots. The details of the FACE design and analyses of the important fluid mechanical concepts are described below. A more detailed description of the hardware has been given elsewhere. Additional FACE performance and CO2 distribution data are given. 7 refs., 25 figs.

  16. Performance analysis of rocket-ramjet propelled SSTO-vehicles

    NASA Astrophysics Data System (ADS)

    Schoettle, U. M.

    1985-10-01

    Winged single-stage-to-orbit (SSTO) vehicles designed for vertical or horizontal takeoff and horizontal landing with both rocket and rocket-ramjet propulsion concepts are analyzed and their performance and costs are compared. For this purpose, LOX/LH2 rocket baseline vehicles with payload and mission requirements similar to the Space Shuttle system are modified to accommodate hydrogen-fueled ramjet engines. The use of the airbreathing engines results in a substantial decrease in propellant consumption but is heavily penalized by the added weights of the ramjet engine and structural reinforcements to resist higher aero-thermodynamic loads. The results suggest that for vehicles of the same gross lift-off weight the total system costs of the airbreathing vehicles are 19 percent higher compared to rocket systems; however, due to increased payload capabilities, the specific transportation costs are lower.

  17. Stress and Sealing Performance Analysis of Containment Vessel

    SciTech Connect

    WU, TSU-TE

    2005-05-24

    This paper presents a numerical technique for analyzing the containment vessel subjected to the combined loading of closure-bolt torque and internal pressure. The detailed stress distributions in the O-rings generated by both the torque load and the internal pressure can be evaluated by using this method. Consequently, the sealing performance of the O-rings can be determined. The material of the O-rings can be represented by any available constitutive equation for hyperelastic material. In the numerical calculation of this paper, the form of the Mooney-Rivlin strain energy potential is used. The technique treats both the preloading process of bolt tightening and the application of internal pressure as slow dynamic loads. Consequently, the problem can be evaluated using an explicit numerical integration scheme.
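
    For reference, the Mooney-Rivlin form mentioned above gives a closed-form nominal stress for uniaxial extension of an incompressible solid, P = 2(λ - λ^-2)(C10 + C01/λ). The sketch below evaluates this textbook relation with placeholder constants; the O-rings' actual material constants are not given in the abstract.

```python
import numpy as np

def mooney_rivlin_uniaxial(stretch, c10, c01):
    """Nominal (engineering) stress of an incompressible Mooney-Rivlin
    solid in uniaxial extension: P = 2(l - l^-2)(C10 + C01/l).
    c10, c01 are material constants (placeholders here); stretch l = 1
    gives zero stress, as required in the undeformed state."""
    l = np.asarray(stretch, dtype=float)
    return 2.0 * (l - l**-2) * (c10 + c01 / l)
```

In a finite-element setting such as the paper's, the same strain-energy potential W = C10(I1 - 3) + C01(I2 - 3) is supplied to the solver rather than evaluated in closed form.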

  18. Performance analysis of SA-3 missile second stage

    NASA Technical Reports Server (NTRS)

    Helmy, A. M.

    1981-01-01

    One SA-3 missile was disassembled. The constituents of the second stage were thoroughly investigated for geometrical details. The second stage slotted composite propellant grain was subjected to mechanical properties testing, physiochemical analyses, and burning rate measurements at different conditions. To determine the propellant performance parameters, the slotted composite propellant grain was machined into a set of small-size tubular grains. These grains were fired in a small size rocket motor with a set of interchangeable nozzles with different throat diameters. The firings were carried out at three different conditions. The data from test motor firings, physiochemical properties of the propellant, burning rate measurement results and geometrical details of the second stage motor, were used as input data in a computer program to compute the internal ballistic characteristics of the second stage.

  19. Motion coordination and performance analysis of multiple vehicle systems

    NASA Astrophysics Data System (ADS)

    Sharma, Vikrant

    In this dissertation, issues related to multiple vehicle systems are studied. First, the issue of vehicular congestion is addressed and its effect on the performance of some systems is studied. Motion coordination algorithms for some systems of interest are also developed. The issue of vehicular congestion is addressed by characterizing the effect of increasing the number of vehicles, in a bounded region, on the speed of the vehicles. A multiple vehicle routing problem is considered in which vehicles are required to stay a velocity-dependent distance away from each other to avoid physical collisions. Optimal solutions to the minimum-time routing problem are characterized; the minimum time is found to increase with the square root of the number of vehicles in the environment, for different distributions of the sources and destinations of the vehicles. The second issue addressed is the effect of vehicular congestion on the delay associated with data delivery in wireless networks in which vehicles are used to transport data to increase the wireless capacity of the network. Tight bounds on the associated delay are derived. The next problem addressed is that of covering an arbitrary path-connected two-dimensional region, using multiple unmanned aerial vehicles, in minimum time. A constant-factor optimal algorithm is presented for any given initial positions of the vehicles inside the environment. The last problem addressed is the deployment of an environment-monitoring network of mobile sensors to improve the network lifetime and sensing quality. A distributed algorithm is presented that improves the system's performance starting from an initial deployment.

  20. Systems study on engineered barriers: barrier performance analysis

    SciTech Connect

    Stula, R.T.; Albert, T.E.; Kirstein, B.E.; Lester, D.H.

    1980-09-01

    A performance assessment model for multiple barrier packages containing unreprocessed spent fuel has been modified and applied to several package designs. The objective of the study was to develop information to be used in programmatic decision making concerning engineered barrier package design and development. The assessment model, BARIER, was developed in previous tasks of the System Study on Engineered Barriers (SSEB). The new version discussed in this report contains a refined and expanded corrosion rate data base which includes pitting, crack growth, and graphitization as well as bulk corrosion. Corrosion rates for oxic and anoxic conditions at each of the two temperature ranges are supplied. Other improvements include a rigorous treatment of radionuclide release after package failure which includes resistance of damaged barriers and backfill, refined temperature calculations that account for convection and radiation, a subroutine to calculate the nuclear gamma radiation field at each barrier surface, refined stress calculations with reduced conservatism, and various coding improvements to improve running time and core usage. This report also contains a discussion of alternative scenarios to the assumed flooded repository as well as the impact of water-exclusion backfills. The model was used to assess post-repository-closure performance for several designs, which were all variations of basic designs from the Spent Unreprocessed Fuel (SURF) program. Many designs were found to delay the onset of leaching by at least a few hundred years in all geologic media. Long delay times for radionuclide release were found for packages with a few inches of sorption backfill. Release of uranium, plutonium, and americium was assessed.

  1. Performance analysis of OFDM modulation on indoor broadband PLC channels

    NASA Astrophysics Data System (ADS)

    Antonio Cortés, José; Díez, Luis; Cañete, Francisco Javier; Sánchez-Martínez, Juan José; Entrambasaguas, José Tomás

    2011-12-01

    Indoor broadband power-line communications is a suitable technology for home networking applications. In this context, orthogonal frequency-division multiplexing (OFDM) is the most widespread modulation technique. It has recently been adopted by the ITU-T Recommendation G.9960 and is also used by most of the commercial systems, whose number of carriers has gone from about 100 to a few thousand in less than a decade. However, indoor power-line channels are frequency-selective and exhibit periodic time variations. Hence, increasing the number of carriers does not always improve the performance: it reduces the distortion due to the frequency selectivity, but increases the distortion caused by the channel time variation. In addition, the long impulse response of power-line channels forces the use of an insufficiently long cyclic prefix. Increasing its value reduces the distortion, but also the symbol rate. Therefore, there are optimum values for both modulation parameters. This article evaluates the performance of an OFDM system as a function of the number of carriers and the cyclic prefix length, determining their most appropriate values for the indoor power-line scenario. This task would otherwise require time-consuming simulations employing linear time-varying filtering, since no consensus on a tractable statistical channel model has been reached yet. Instead, this study presents a simpler procedure in which the distortion due to the frequency selectivity is computed using a time-invariant channel response, and an analytical expression is derived for the distortion caused by the channel time variation.
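
    The role of the cyclic prefix can be demonstrated in a few lines: with a static FIR channel and a prefix at least as long as the channel memory, a one-tap equalizer per carrier recovers the data exactly. This is a time-invariant, noiseless simplification of the scenario above, with illustrative parameter values of my own.

```python
import numpy as np

def ofdm_roundtrip(sym, h, n_cp):
    """One OFDM symbol through a static FIR channel with a cyclic prefix.

    If n_cp >= len(h) - 1, the linear convolution looks circular over the
    retained samples, so the channel is diagonalized by the FFT and a
    one-tap equalizer per carrier inverts it exactly (no noise, no time
    variation; illustrative only)."""
    n = len(sym)
    x = np.fft.ifft(sym)                      # frequency-domain symbols -> time
    x_cp = np.concatenate([x[-n_cp:], x])     # prepend cyclic prefix
    y = np.convolve(x_cp, h)[n_cp:n_cp + n]   # channel, then strip the prefix
    H = np.fft.fft(h, n)                      # per-carrier channel response
    return np.fft.fft(y) / H                  # one-tap equalization
```

Shortening n_cp below len(h) - 1 breaks the circularity and introduces inter-symbol interference, which is exactly the distortion/symbol-rate trade-off discussed in the abstract.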

  2. Algorithms and architectures for high performance analysis of semantic graphs.

    SciTech Connect

    Hendrickson, Bruce Alan

    2005-09-01

    Semantic graphs offer one promising avenue for intelligence analysis in homeland security. They provide a mechanism for describing a wide variety of relationships between entities of potential interest. The vertices are nouns of various types, e.g. people, organizations, events, etc. Edges in the graph represent different types of relationships between entities, e.g. 'is friends with', 'belongs-to', etc. Semantic graphs offer a number of potential advantages as a knowledge representation system. They allow information of different kinds, collected in differing ways, to be combined in a seamless manner. A semantic graph is a very compressed representation of the relationship information. It has been reported that the semantic graph can be two orders of magnitude smaller than the processed intelligence data. This allows for much larger portions of the data universe to be resident in computer memory. Many intelligence queries that are relevant to the terrorist threat are naturally expressed in the language of semantic graphs. One example is the search for 'interesting' relationships between two individuals or between an individual and an event, which can be phrased as a search for short paths in the graph. Another example is the search for an analyst-specified threat pattern, which can be cast as an instance of subgraph isomorphism. It is important to note that many kinds of analysis are not relationship based, so these are not good candidates for semantic graphs. Thus, a semantic graph should always be used in conjunction with traditional knowledge representation and interface methods. Operations that involve looking for chains of relationships (e.g. friend of a friend) are not efficiently executable in a traditional relational database. However, the semantic graph can be thought of as a pre-join of the database, and it is ideally suited for these kinds of operations. Researchers at Sandia National Laboratories are working to facilitate semantic graph
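
    The "friend of a friend" query mentioned above reduces to a shortest-path search, which a plain breadth-first search answers on an adjacency-list graph. The graph encoding and entity names below are invented for illustration.

```python
from collections import deque

def shortest_relationship_chain(graph, src, dst):
    """Breadth-first search for the shortest chain of relationships
    between two entities in a semantic graph stored as adjacency lists
    (dict: entity -> list of related entities).  Returns the path as a
    list of entities, or None if no chain exists."""
    parent = {src: None}
    queue = deque([src])
    while queue:
        v = queue.popleft()
        if v == dst:                      # reconstruct the chain
            path = []
            while v is not None:
                path.append(v)
                v = parent[v]
            return path[::-1]
        for w in graph.get(v, ()):
            if w not in parent:           # first visit = shortest chain
                parent[w] = v
                queue.append(w)
    return None
```

This is the kind of chain-following operation that the abstract notes is awkward in a relational database but natural on the pre-joined graph representation.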

  3. High Performance Parallel Analysis of Coupled Problems for Aircraft Propulsion

    NASA Technical Reports Server (NTRS)

    Felippa, C. A.; Farhat, C.; Lanteri, S.; Maman, N.; Piperno, S.; Gumaste, U.

    1994-01-01

    In order to predict the dynamic response of a flexible structure in a fluid flow, the equations of motion of the structure and the fluid must be solved simultaneously. In this paper, we present several partitioned procedures for time-integrating this coupled problem and discuss their merits in terms of accuracy, stability, heterogeneous computing, I/O transfers, subcycling, and parallel processing. All theoretical results are derived for a one-dimensional piston model problem with a compressible flow, because the complete three-dimensional aeroelastic problem is difficult to analyze mathematically. However, the insight gained from the analysis of the coupled piston problem and the conclusions drawn from its numerical investigation are confirmed with the numerical simulation of the two-dimensional transient aeroelastic response of a flexible panel in a transonic nonlinear Euler flow regime.

  4. Gust characteristics for WECS design and performance analysis

    SciTech Connect

    Doran, J.C.; Powell, D.C.

    1980-05-01

    This document provides a description of some gust characteristics which are useful in the study of wind turbine fatigue caused by a fluctuating wind environment. The particular gust form chosen can also be used in the analysis of the dynamic response of a turbine. The statistical behavior of such gust characteristics is not identical to that determined simply from the wind recorded by an anemometer. These modes of behavior may be related, however, by the application of appropriate digital filters to the anemometer data. This procedure has been carried out for a number of sample cases, and the variations of the resultant gust features are presented. A number of suggestions on specific applications and interpretations of the data are included.

  5. Increasing the performance of tritium analysis by electrolytic enrichment.

    PubMed

    Groning, M; Auer, R; Brummer, D; Jaklitsch, M; Sambandam, C; Tanweer, A; Tatzber, H

    2009-06-01

    Several improvements are described for the existing tritium enrichment system at the Isotope Hydrology Laboratory of the International Atomic Energy Agency for processing natural water samples. The improvements include a simple method for pretreatment of electrolytic cells to ensure a high tritium separation factor, an improved design of the exhaust system for explosive gases, and a vacuum distillation line for faster initial preparation of water samples for electrolytic enrichment and for tritium analysis. Achievements included reducing the variation of the individual enrichment parameters across all cells to less than 1% and improving the stability of the background mean by 50%. This resulted in an improved detection limit of less than 0.4 TU (at 2σ), important for future applications of tritium measurement at low concentration levels, and in measurement precisions of ±0.2 TU and ±0.15 TU for liquid scintillation counting and gas proportional counting, respectively.

  6. Interaction Analysis in Performing Arts: A Case Study in Multimodal Choreography

    NASA Astrophysics Data System (ADS)

    Christou, Maria; Luciani, Annie

    The growing openness towards interactive virtual worlds, and the variety of their uses, have brought great changes to the performing arts that are worth a deeper analysis in order to understand the emerging issues. We examine the conception of a performance in terms of its capacity for embodiment, with a methodology based on interaction analysis. Finally, we propose a new situation of multimodal choreography that respects this analysis, and we evaluate the results on a simulation exercise.

  7. Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0

    DOE PAGES

    Huck, Kevin A.; Malony, Allen D.; Shende, Sameer; ...

    2008-01-01

    The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments present a challenge to manage and process the information. Simply characterizing the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering and correlation analysis of individual trials of large dimensions, and can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We give examples of large-scale analysis results, and discuss the future development of the framework, including the encoding and processing of expert performance rules, and the increasing use of performance metadata.

  8. Analysis of the performance of an air-water heat pump: Regulation of intrinsic performances

    NASA Astrophysics Data System (ADS)

    Martin-Neuville, H.; Reybillet, M.; Patureau, J. P.

Improvements to an electrically driven compressor heat pump of around 12 kW, with air as the heat source, are examined. To test the heat pump under different weather conditions, a test loop was built. On the condenser side, a water circuit with several capacities and heat exchangers simulates the thermal behavior of a 120 sq m dwelling. A commercial domestic heat pump was extensively tested. The instantaneous performance of the heat pump agreed well with the data claimed by the manufacturer. The annual energy saving, however, was significantly less, due to the following: (1) loss of efficiency caused by defrosting cycles; (2) loss of efficiency due to inadequate thermal load matching between the heat pump and the house (it was shown that control of the condensing temperature can bring energy savings of 10 percent, which could probably also be realized by load matching with a variable-speed compressor); and (3) inefficient operation of components such as the evaporator and condenser heat exchangers and the expansion valve, whose optimization could lead to considerable improvement. Modifications to the compressor are proposed which may increase its efficiency to 60 or 70 percent.

  9. High speed spherical roller-bearing analysis and comparison with experimental performance

    NASA Technical Reports Server (NTRS)

    Kleckner, R. J.; Dyba, G.

    1983-01-01

    The capabilities of a spherical roller bearing analysis/design tool, Spherbean (spherical bearing analysis) are described. Capabilities of the analysis are demonstrated and verified by comparison with experimental data. A practical design problem is presented where the computer program is used to improve a particular bearing's performance.

  10. Analysis of bio-anode performance through electrochemical impedance spectroscopy.

    PubMed

    ter Heijne, Annemiek; Schaetzle, Olivier; Gimenez, Sixto; Navarro, Lucia; Hamelers, Bert; Fabregat-Santiago, Francisco

    2015-12-01

In this paper we studied the performance of bioanodes under different experimental conditions using polarization curves and impedance spectroscopy. We identified that the large capacitances of up to 1 mF·cm⁻² for graphite anodes have their origin in the nature of the carbonaceous electrode, rather than the microbial culture. In some cases, the separate contributions of charge transfer and diffusion resistance were clearly visible, while in other cases their contribution was masked by the high capacitance of 1 mF·cm⁻². The impedance data were analyzed using the basic Randles model to separate ohmic, charge transfer and diffusion resistances. Increasing the buffer concentration from 0 to 50 mM and increasing the pH from 6 to 8 resulted in decreased charge transfer and diffusion resistances, with lowest values of 144 Ω·cm² and 34 Ω·cm², respectively. At acetate concentrations below 1 mM, current generation was limited by acetate. We show a linear relationship between the inverse charge transfer resistance at potentials close to open circuit and the saturation (maximum) current, associated with the Butler-Volmer relationship, that needs further exploration.

  11. Performance analysis of structured pedigree distributed fusion systems

    NASA Astrophysics Data System (ADS)

    Arambel, Pablo O.

    2009-05-01

Structured pedigree is a way to compress pedigree information. When applied to distributed fusion systems, the approach avoids the well-known problem of information double counting that results from ignoring the cross-correlation among fused estimates. Other schemes that attempt to compute optimal fused estimates require the transmission of full pedigree information or raw data. This usually cannot be implemented in practical systems because of the enormous communications bandwidth requirements. The Structured Pedigree approach achieves data compression by maintaining multiple covariance matrices, one for each uncorrelated source in the network. These covariance matrices are transmitted by each node along with the state estimate. This represents a significant compression when compared to full pedigree schemes. The transmission of these covariance matrices (or a subset of them) allows for efficient fusion of the estimates, while avoiding information double counting and guaranteeing consistency of the estimates. This is achieved by exploiting the additional partial knowledge of the correlation of the estimates. The approach uses a generalized version of the Split Covariance Intersection algorithm that applies to multiple estimates and multiple uncorrelated sources. In this paper we study the performance of the proposed distributed fusion system by analyzing a simple but instructive example.
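    The paper's generalized Split Covariance Intersection is not reproduced here, but the underlying consistency idea can be sketched with standard Covariance Intersection, which fuses two estimates with unknown cross-correlation and yields a consistent fused covariance for any mixing weight (a minimal illustration, not the paper's algorithm; the numbers are made up):

    ```python
    import numpy as np

    def covariance_intersection(x1, P1, x2, P2, w=0.5):
        """Fuse two estimates with unknown cross-correlation.

        Standard Covariance Intersection: the fused covariance is
        consistent for any mixing weight w in [0, 1]."""
        P1_inv = np.linalg.inv(P1)
        P2_inv = np.linalg.inv(P2)
        P = np.linalg.inv(w * P1_inv + (1 - w) * P2_inv)
        x = P @ (w * P1_inv @ x1 + (1 - w) * P2_inv @ x2)
        return x, P

    # Two estimates of the same 2-D state from different nodes
    x1, P1 = np.array([1.0, 2.0]), np.diag([1.0, 4.0])
    x2, P2 = np.array([1.2, 1.8]), np.diag([4.0, 1.0])
    x, P = covariance_intersection(x1, P1, x2, P2, w=0.5)
    ```

    The weight w can be chosen to minimize, e.g., the trace of the fused covariance; a fixed w = 0.5 keeps the sketch simple.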

  12. Analysis of radiation performances of plasma sheet antenna

    NASA Astrophysics Data System (ADS)

    Yin, Bo; Zhang, Zu-Fan; Wang, Ping

    2015-12-01

A novel concept of plasma sheet antennas is presented in this paper, and their radiation performance is investigated in detail. Firstly, a model of a planar plasma antenna (PPA) fed by a microstrip line is developed, and its reflection coefficient is computed by the JE convolution finite-difference time-domain method and compared with that of a metallic patch antenna. It is found that the design of the PPA can draw on the theory of the metallic patch antenna, and that impedance matching and adjustment of the resonant frequency can be conveniently realized by tuning the plasma parameters. Then the PPA is mounted on a metallic cylindrical surface, and the reflection coefficient of the conformal plasma antenna (CPA) is computed. The influence of the conformal cylinder radius on the reflection coefficient is also analyzed. Finally, the radiation pattern of a CPA is given; the results show that the pattern agrees well with that of the PPA in the main radiation direction, but its side-lobe level is significantly degraded.

  13. Performance Analysis of an Improved MUSIC DoA Estimator

    NASA Astrophysics Data System (ADS)

    Vallet, Pascal; Mestre, Xavier; Loubaton, Philippe

    2015-12-01

This paper addresses the statistical performance of subspace DoA estimation using a sensor array, in the asymptotic regime where the number of samples and the number of sensors both converge to infinity at the same rate. Improved subspace DoA estimators (termed G-MUSIC) were derived in previous works, and were shown to be consistent and asymptotically Gaussian distributed in the case where the number of sources and their DoA remain fixed. In this case, which models widely spaced DoA scenarios, it is proved in the present paper that the traditional MUSIC method also provides consistent DoA estimates, with the same asymptotic variances as the G-MUSIC estimates. The case of DoA separated on the order of a beamwidth, which models closely spaced sources, is also considered. It is shown that G-MUSIC estimates are still able to consistently separate the sources, while this is no longer the case for the MUSIC estimates. The asymptotic variances of the G-MUSIC estimates are also evaluated.
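    As a rough illustration of the classical MUSIC method discussed above (not the improved G-MUSIC estimator), a minimal pseudospectrum search for a uniform linear array might look like the following; the array size, spacing, and source angle are arbitrary illustrative choices:

    ```python
    import numpy as np

    def music_spectrum(X, n_sources, angles_deg, d=0.5):
        """Classical MUSIC pseudospectrum for a uniform linear array.

        X: (n_sensors, n_snapshots) complex data; d: element spacing
        in wavelengths."""
        n_sensors = X.shape[0]
        R = X @ X.conj().T / X.shape[1]          # sample covariance
        _, vecs = np.linalg.eigh(R)              # eigenvalues ascending
        En = vecs[:, : n_sensors - n_sources]    # noise subspace
        k = np.arange(n_sensors)
        spec = []
        for th in np.deg2rad(angles_deg):
            a = np.exp(2j * np.pi * d * k * np.sin(th))   # steering vector
            spec.append(1.0 / np.linalg.norm(En.conj().T @ a) ** 2)
        return np.array(spec)

    # Single source at 20 degrees, 8 sensors, noiseless snapshots
    rng = np.random.default_rng(0)
    k = np.arange(8)
    a = np.exp(2j * np.pi * 0.5 * k * np.sin(np.deg2rad(20.0)))
    s = rng.standard_normal(200) + 1j * rng.standard_normal(200)
    X = np.outer(a, s)
    grid = np.arange(-90, 91)
    est = grid[np.argmax(music_spectrum(X, 1, grid))]
    ```

    In the noiseless rank-one case the pseudospectrum peaks sharply at the true angle; the asymptotic regime analyzed in the paper concerns what happens when noise is present and the sensor count grows with the sample count.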

  14. Compact time- and space-integrating SAR processor: performance analysis

    NASA Astrophysics Data System (ADS)

    Haney, Michael W.; Levy, James J.; Michael, Robert R., Jr.; Christensen, Marc P.

    1995-06-01

Progress made during the previous 12 months toward the fabrication and test of a flight demonstration prototype of the acousto-optic time- and space-integrating real-time SAR image formation processor is reported. Compact, rugged, and low-power analog optical signal processing techniques are used for the most computationally taxing portions of the SAR imaging problem, to overcome the size and power consumption limitations of electronic approaches. Flexibility and performance are maintained by the use of digital electronics for the critical low-complexity filter generation and output image processing functions. The results reported for this year include tests of a laboratory version of the RAPID SAR concept on phase history data generated from real high-resolution SAR imagery; a description of the new compact 2D acousto-optic scanner, with a 2D space-bandwidth product approaching 10⁶ spots, specified and procured from NEOS Technologies during the last year; and a design and layout of the optical module portion of the flight-worthy prototype.

  15. Lunar lander configuration study and parametric performance analysis

    NASA Astrophysics Data System (ADS)

    Donahue, Benjamin B.; Fowler, C. R.

    1993-06-01

Future Lunar exploration plans will call for delivery of significant amounts of cargo to provide for crew habitation, surface transportation, and scientific exploration activities. Minimization of costly surface-based infrastructure is in large part directly related to the design of the cargo delivery/landing craft. This study focused on evaluating Lunar lander concepts from a logistics-oriented perspective; it outlines the approach used in the development of a preferred configuration, sets forth the benefits derived from its utilization, and describes the missions and systems considered. Results indicate that only direct-to-surface delivery of payloads provides for: unassisted cargo removal operations, imperative to efficient and low-risk site buildup, including the emplacement of Space Station-derivative surface habitat modules; immediate cargo jettison for both descent abort and emergency surface ascent, essential to piloted missions carrying cargo; and short habitat egress/ingress paths, necessary for productive surface work tours by crew members carrying hand-held experiments, tools and other bulky articles. By accommodating cargo in a position underneath the vehicle's structural frame, the landing craft described herein eliminate altogether the need for dedicated surface-based off-loading vehicles, the operations and maintenance associated with them, and the precipitous ladder climbs to and from the surface that are inherent to traditional designs. Parametric evaluations illustrate performance and mass variation with respect to mission requirements.

  16. The performance analysis based on SAR sample covariance matrix.

    PubMed

    Erten, Esra

    2012-01-01

Multi-channel systems appear in several fields of application in science. In the Synthetic Aperture Radar (SAR) context, multi-channel systems may refer to different domains, such as multi-polarization, multi-interferometric or multi-temporal data, or even a combination of them. Due to the inherent speckle phenomenon present in SAR images, a statistical description of the data is almost mandatory for its utilization. The complex images acquired over natural media present in general zero-mean circular Gaussian characteristics. In this case, second-order statistics such as the multi-channel covariance matrix fully describe the data. In practical situations, however, the covariance matrix has to be estimated using a limited number of samples, and this sample covariance matrix follows the complex Wishart distribution. In this context, the eigendecomposition of the multi-channel covariance matrix has been shown to be of high relevance in different areas regarding the physical properties of the imaged scene. Specifically, the maximum eigenvalue of the covariance matrix has been frequently used in applications such as target or change detection, estimation of the dominant scattering mechanism in polarimetric data, moving target indication, etc. In this paper, the statistical behavior of the maximum eigenvalue derived from the eigendecomposition of the sample multi-channel covariance matrix of multi-channel SAR images is presented in a form simplified for the SAR community. Validation is performed against simulated data, and examples of estimation and detection problems using the analytical expressions are given as well.
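    The quantity analyzed above — the maximum eigenvalue of a sample covariance matrix estimated from a limited number of zero-mean circular Gaussian samples — can be illustrated with a short simulation (the channel count, number of looks, and diagonal "true" covariance below are arbitrary choices for illustration, not values from the paper):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    p, n = 3, 64                                    # channels, looks
    C = np.diag([4.0, 2.0, 1.0]).astype(complex)    # "true" channel covariance

    # Zero-mean circular complex Gaussian samples with covariance C
    L = np.linalg.cholesky(C)
    z = (rng.standard_normal((p, n)) + 1j * rng.standard_normal((p, n))) / np.sqrt(2)
    x = L @ z

    # Sample covariance (complex-Wishart distributed) and its maximum eigenvalue
    C_hat = x @ x.conj().T / n
    lam_max = np.linalg.eigvalsh(C_hat).max()
    ```

    Repeating this many times traces out the sampling distribution of lam_max around the true maximum eigenvalue (4.0 here), which is the kind of behavior the paper characterizes analytically.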

  17. Performance Analysis of TCP Enhancements in Satellite Data Networks

    NASA Technical Reports Server (NTRS)

    Broyles, Ren H.

    1999-01-01

This research examines two proposed enhancements to the well-known Transmission Control Protocol (TCP) in the presence of noisy communication links. The Multiple Pipes protocol is an application-level adaptation of standard TCP, where several TCP links cooperate to transfer data. The Space Communication Protocol Standard - Transport Protocol (SCPS-TP) modifies TCP to optimize performance in a satellite environment. While SCPS-TP has inherent advantages that allow it to deliver data more rapidly than Multiple Pipes, the protocol, when optimized for operation in a high-error environment, is not compatible with legacy TCP systems and requires changes to the TCP specification. This investigation determines the level of improvement offered by SCPS-TP's Corruption Mode, which will help determine whether migration to the protocol is appropriate in different environments. As the percentage of corrupted packets approaches 5%, Multiple Pipes can take over five times longer than SCPS-TP to deliver data. At high error rates, SCPS-TP's advantage is primarily caused by Multiple Pipes' use of congestion control algorithms. The lack of congestion control, however, limits the systems in which SCPS-TP can be effectively used.
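    The study's measurements are not reproduced here, but the qualitative effect of random loss on congestion-controlled TCP can be sketched with the well-known Mathis et al. throughput model (not part of the paper; the MSS, RTT, and loss values below are illustrative):

    ```python
    from math import sqrt

    def mathis_throughput(mss_bytes, rtt_s, loss_rate):
        """Rough upper bound on steady-state TCP throughput under random
        loss (Mathis et al. model): rate ≈ (MSS / RTT) * (C / sqrt(p)),
        with C ≈ 1.22 for a simple Reno-style sender."""
        C = 1.22
        return (mss_bytes / rtt_s) * (C / sqrt(loss_rate))

    # Illustrative GEO satellite link: 1460-byte MSS, 550 ms RTT, 1% loss
    bps = mathis_throughput(1460, 0.55, 0.01) * 8   # bits per second
    ```

    The inverse-square-root dependence on loss rate shows why corruption mistaken for congestion is so damaging on satellite links, and why a corruption-aware mode like SCPS-TP's can outperform loss-reactive congestion control.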

  18. Automated Dsm Extraction from Uav Images and Performance Analysis

    NASA Astrophysics Data System (ADS)

    Rhee, S.; Kim, T.

    2015-08-01

As technology evolves, unmanned aerial vehicle (UAV) imagery is being used for applications ranging from simple image acquisition to complicated 3D spatial information extraction. Spatial information is usually provided in the form of a DSM or point cloud, so it is important to generate very dense tie points automatically from stereo images. In this paper, we apply a stereo image matching technique developed for satellite/aerial images to UAV images, propose processing steps for automated DSM generation, and analyse the feasibility of DSM generation. For DSM generation from UAV images, firstly, exterior orientation parameters (EOPs) for each dataset were adjusted. Secondly, optimum matching pairs were determined. Thirdly, stereo image matching was performed for each pair; the matching algorithm is based on grey-level correlation of pixels along epipolar lines. Finally, the extracted matching results were merged into a single result and the final DSM was made. The generated DSM was compared with a reference DSM from Lidar. Overall accuracy was 1.5 m in NMAD. However, several problems have to be solved in the future, including obtaining precise EOPs and handling occlusion and image blurring. More effective interpolation techniques also need to be developed.
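    The NMAD accuracy measure quoted above can be computed as follows (the height differences are made-up values; note how the metric resists the single blunder, unlike a plain standard deviation):

    ```python
    import numpy as np

    def nmad(errors):
        """Normalized median absolute deviation: a robust accuracy
        measure for DSM height errors. The factor 1.4826 makes NMAD
        match the standard deviation for normally distributed errors."""
        errors = np.asarray(errors, dtype=float)
        return 1.4826 * np.median(np.abs(errors - np.median(errors)))

    # Height differences (m) between generated DSM and Lidar reference
    dh = np.array([0.2, -1.1, 0.5, 2.3, -0.4, 0.9, -0.7, 15.0])  # one blunder
    robust_acc = nmad(dh)          # ~1.33 m, barely moved by the 15 m outlier
    naive_acc = float(np.std(dh))  # inflated by the blunder
    ```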

  19. Performance analysis of a medical record exchanges model.

    PubMed

    Huang, Ean-Wen; Liou, Der-Ming

    2007-03-01

Electronic medical record exchange among hospitals can provide more information for physician diagnosis and reduce the costs of duplicate examinations. In this paper, we propose and implement a medical record exchange model. In our design, exchange interface servers (EISs) allow hospitals to manage information communication through intra- and interhospital networks linked to a medical records database, and an index service center is responsible for managing the EISs and publishing their addresses and public keys. The prototype system has been implemented to generate, parse, and transfer Health Level Seven (HL7) query messages. Moreover, the system can encrypt and decrypt a message using a public-key encryption algorithm. Queuing theory is applied to evaluate the performance of the proposed model: we estimated the service time for each queue (CPU, database, and network) and measured the response time and possible bottlenecks of the model. The model is estimated to have the capacity to process the medical records of about 4000 patients/h in a 1-MB network backbone environment, which corresponds to about 4% of the total outpatients in Taiwan.
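    The abstract does not specify which queuing model was used; as a minimal illustration of the kind of response-time estimate queuing theory provides, here is an M/M/1 sketch (only the 4000 patients/h arrival rate comes from the abstract; the 5000/h service rate is an assumed figure):

    ```python
    def mm1_response_time(arrival_rate, service_rate):
        """Mean response time W = 1 / (mu - lambda) for an M/M/1 queue;
        valid only when utilization rho = lambda/mu < 1."""
        rho = arrival_rate / service_rate
        if rho >= 1.0:
            raise ValueError("queue is unstable (rho >= 1)")
        return 1.0 / (service_rate - arrival_rate)

    # e.g. 4000 records/h arriving at a server that handles 5000/h
    w_hours = mm1_response_time(4000.0, 5000.0)   # 0.001 h = 3.6 s
    ```

    Chaining such estimates for the CPU, database, and network queues is how a per-stage analysis exposes the bottleneck stage.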

  20. Performance analysis of ultrasono-therapy transducer with contact detection.

    PubMed

    Moreno, Eduardo; González, Gilberto; Leija, Lorenzo; Rodríguez, Orlando; Castillo, Martha; Fuentes, Martín

    2003-06-01

The performance of an ultrasono-therapy transducer with contact detection based on the change in impedance phase is described. A therapy transducer is usually designed with a lambda/2 frontal plate glued to a PZT-4 piezoceramic. This plate ensures good mechanical protection of the piezoceramic with correspondingly high transmitted energy. Normally the transducer is operated at the frequency of the minimum of the modulus of its input electric impedance, but this operating point is affected by the shift caused by the expected temperature increase. This shift can exceed the narrow bandwidth available, resulting in a decrease of the power level delivered for medical treatment. Electronic drivers with automatic control are usually designed to follow the frequency change, but the relatively narrow bandwidth makes the design difficult. Another operating frequency is presented and analyzed here, using the criterion of the maximum of the impedance phase, which offers a wider bandwidth than the previous case. Simulations with mechanical losses are presented, together with experimental results that show the suitability of this criterion for practical application.

  1. Performance analysis of the continuous trace gas preconcentrator

    NASA Astrophysics Data System (ADS)

    Muntz, E. P.; Han, Y.-L.

    2011-03-01

    In gas molecule detection systems, certain trace gas components can go undetected. This is due to ultralow yet dangerous concentrations combined with limitations of the detection methods. To remedy this problem, a preconcentrator can be included in a system to increase the trace gas concentrations, before the gas samples enter the detection unit. The widely used adsorption/desorption preconcentrators enable detection by interrupting the sampled gas flow for significant periods, in order to accumulate detectable periodic concentrations of trace gas molecules. The recently patented continuous trace gas preconcentrator (CTGP) provides a unique approach for enhancing the trace gas concentration, without stopping the flow. In this study, a performance model is developed for the CTGP, by application of the Poiseuille flow coefficients for long tubes. Based on the Cercignani-Lampis scattering kernel, Sharipov calculated the Poiseuille flow coefficients for various geometries and numerous operating Knudsen numbers. The concentrations of sampled molecules were analyzed in this study using Sharipov's flow coefficients. The results presented here reinforce the potential benefits of the CTGP.

  2. Performance analysis of bullet trajectory estimation: Approach, simulation, and experiments

    SciTech Connect

    Ng, L.C.; Karr, T.J.

    1994-11-08

This paper describes an approach to estimate a bullet's trajectory from a time sequence of angles-only observations from a high-speed camera, and analyzes its performance. The technique is based on fitting a ballistic model of a bullet in flight, along with unknown source location parameters, to a time series of angular observations. The theory is developed to precisely reconstruct, from firing range geometry, the actual bullet trajectory as it appeared on the focal plane array and in real space. A metric for measuring the effective trajectory track error is also presented. Detailed Monte Carlo simulations assuming different bullet ranges, shot angles, camera frame rates, and angular noise show that the angular track error can be as small as 100 μrad for a 2 mrad/pixel sensor. It is also shown that, if actual values of the bullet's ballistic parameters were available, the bullet's source location variables and angle-of-flight information could also be determined.

  3. Analysis of LCoS displays performance in diffractive optics

    NASA Astrophysics Data System (ADS)

    Lizana, A.; Lobato, L.; Iemmi, C.; Márquez, A.; Moreno, I.; Campos, J.; Yzuel, M. J.

    2010-06-01

In this paper, we describe the combined Mueller-Jones method, which is useful for optimizing the phase response of LCoS displays. Using the experimentally obtained Mueller matrices of the device, this method enables one to obtain pairs of states of polarization (for the generation and detection states) which lead to the phase-only modulation regime. Moreover, some experimental results are provided as a function of the incident angle, wavelength and gray level. In addition, we also show the strong dependence of the LCoS performance on the signal addressed to the device, which affects the values of different physical parameters, such as the global phase shift or the time fluctuations in phase. Retardance curves and time fluctuations in phase for the different sequences studied are obtained from the experimental Mueller matrices (the former) and by using a diffraction-based set-up (the latter). The efficiency of basic diffractive optical elements is tested with the LCoS display, emphasizing the suitability of the best electrical sequence found when used in diffractive optics.

  4. The interlaboratory performance of microbiological methods for food analysis.

    PubMed

    Ellison, Stephen L R; Key, Pauline; Wood, Roger

    2012-01-01

Repeatability and reproducibility data for microbiological methods in food analysis were collated and assessed with a view to identifying useful or important trends. Generalized additive modeling for location, scale, and shape was used to model the distribution of variances. It was found that mean reproducibility SD for log10(CFU) data is largely independent of concentration, while repeatability SD of log10(CFU) data shows a strongly significant decrease with increasing enumeration. The model for reproducibility SD gave a mean of 0.44, with an upper 95th percentile of approximately 0.76. Repeatability variance could be described reasonably well by a simple dichotomous model: at enumerations below 10⁵/g, the model for repeatability SD gave a mean of approximately 0.35 and an upper 95th percentile of 0.63; above 10⁵/g, it gave a mean of 0.2 and an upper 95th percentile of 0.36. A Horwitz-like function showed no appreciable advantage in describing the data set and gave an apparently worse fit. The relationship between repeatability and reproducibility of log10(CFU) is not constant across the concentration range studied. Both repeatability and reproducibility were found to depend on matrix class and organism.
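    The repeatability and reproducibility SDs discussed above are conventionally estimated from a balanced collaborative trial by one-way ANOVA (ISO 5725 style); a minimal sketch with made-up log10(CFU) data, not the study's data set:

    ```python
    import numpy as np

    def repeatability_reproducibility(data):
        """Repeatability SD (s_r) and reproducibility SD (s_R) from a
        balanced collaborative trial via one-way ANOVA.
        data: (labs, replicates) array of log10(CFU) results."""
        p, n = data.shape
        lab_means = data.mean(axis=1)
        grand = lab_means.mean()
        msw = ((data - lab_means[:, None]) ** 2).sum() / (p * (n - 1))
        msb = n * ((lab_means - grand) ** 2).sum() / (p - 1)
        s_r2 = msw                                    # within-lab variance
        s_L2 = max((msb - msw) / n, 0.0)              # between-lab variance
        return np.sqrt(s_r2), np.sqrt(s_r2 + s_L2)   # s_r, s_R

    # Hypothetical trial: 3 labs, duplicate log10(CFU) enumerations
    data = np.array([[5.1, 5.3], [5.6, 5.8], [5.0, 5.2]])
    s_r, s_R = repeatability_reproducibility(data)
    ```

    s_R ≥ s_r always holds, since reproducibility variance adds the between-laboratory component on top of the within-laboratory one.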

  5. The Performance Analysis of a Uav Based Mobile Mapping System

    NASA Astrophysics Data System (ADS)

    Tsai, M. L.; Chiang, K. W.; Tseng, Y. H.; Rau, J. Y.; Huang, Y. W.; Lo, C. F.

    2012-07-01

In order to facilitate applications such as environment detection or disaster monitoring, developing a quick and low-cost system to collect near-real-time spatial information is very important. Such rapid spatial information collection capability has become an emerging trend in remote sensing and mapping technology. In this study, a fixed-wing UAV-based spatial information acquisition platform is developed and evaluated. The proposed platform has a direct georeferencing module, including a low-cost INS/GPS integrated system and a low-cost digital camera, as well as other general UAV modules, including an immediate video monitoring communication system. The direct georeferencing module provides differential GPS processing with single-frequency carrier phase measurements to obtain sufficient positioning accuracy. All necessary calibration procedures, including interior orientation parameters, the lever arm and the boresight angle, are implemented. In addition, a flight test is performed to verify the positioning accuracy in direct georeferencing mode without using any ground control points (GCPs), which are required by most current UAV-based photogrammetric platforms. In other words, this is one of the pilot studies on direct-georeferenced UAV photogrammetric platforms. The preliminary positioning accuracies in direct georeferencing mode without any GCPs are less than 20 meters in both the x and y axes, and less than 50 meters in the z axis, at a flight height of 600 meters above ground. Such accuracy is good enough for near-real-time disaster relief. The platform is therefore a relatively safe and cheap way to collect critical spatial information for urgent responses such as disaster relief and assessment, where ground control points are not available.

  6. Graphitic polymer nanocomposites: Wear performance and wear debris analysis

    NASA Astrophysics Data System (ADS)

    Liu, Tian

With the addition of appropriate nanofillers, nanocomposites have been shown to be an effective avenue to achieve a multitude of enhanced properties, even extending to multi-functionalities not normally considered possible for conventional polymer materials. However, the structure and properties of polymeric nanocomposites can be influenced in practical use by environmental factors such as wear and temperature, due to the viscoelastic nature of the polymer matrix. The large interfacial areas that exist between the matrix and the nanofillers are also susceptible to wear- and temperature-related changes. This work was devoted to developing graphitic-nanofiller-reinforced high-density polyethylene (HDPE) composites with high wear and/or thermal performance. Two critical issues, appropriate filler-matrix interactions and proper dispersion of the nano-reinforcement, were addressed through effective nanofiller surface modification. The wear, thermal and mechanical properties of the resultant nanocomposites were systematically investigated. Meanwhile, the wear debris generated on the sliding surface of the composite materials was analyzed morphologically and quantitatively. In particular, research was conducted on the possibility of detecting the effects of wear and thermal processes on the nanocomposites through dielectric signals over the lifetime of the polymeric materials. Correlations between the effects of wear or thermal processes and the dielectric properties of the nanocomposites were then explored. Building on the HDPE studies, this thesis finally extends the approach to high-quality ultrahigh-molecular-weight polyethylene (UHMWPE) nanocomposites reinforced by graphitic nanofillers. UHMWPE is an extremely viscous polymer and thus cannot be processed conventionally, typically resulting in dispersion issues far worse than in other composite systems. The research presented aims at solving this issue by using ultrasonication-assisted melt

  7. Time series analysis of regional climate model performance

    NASA Astrophysics Data System (ADS)

    Evans, Jason P.; Oglesby, Robert J.; Lapenta, William M.

    2005-02-01

Four regional climate models (RegCM2, MM5/BATS, MM5/SHEELS, and MM5/OSU) were intercompared on a fairly small domain covering a relatively homogeneous area in Kansas, United States, including the First International Satellite Land Surface Climatology Project (ISLSCP) Field Experiment (FIFE) site. The models were integrated for a 2-year period covering 1987 and 1988. The model results are evaluated against data collected during this period at the Konza Prairie Long-Term Ecological Research (LTER) site, as well as over the summer observation periods of FIFE. All the models captured the proper qualitative behavior of the interannual variability, though the magnitudes varied considerably between models. They also found it particularly difficult to reproduce observed changes in the variance of surface variables. No model performed consistently better than the others, with each displaying particular strengths and weaknesses of its own. RegCM2 could be improved by including an ice phase in the cloud microphysics parameterization. MM5/BATS and MM5/SHEELS need a revised formulation of the stability dependence of the surface drag coefficients, including the coupling to the wind field, as well as a total soil depth more representative of the area. MM5/OSU simulates too much resistance to evapotranspiration and fails to close the energy budget. All of the models overestimate runoff and evapotranspiration during winter, creating a dry anomaly which persists throughout the following summer. Development and verification of the parameterizations that couple the land surface and atmospheric components of these models is at least as important as the development and verification of each component individually.

  8. Miniaturized ultra-high performance liquid chromatography coupled to electrochemical detection: Investigation of system performance for neurochemical analysis.

    PubMed

    Van Schoors, Jolien; Maes, Katrien; Van Wanseele, Yannick; Broeckhoven, Ken; Van Eeckhaut, Ann

    2016-01-04

The interest in implementing miniaturized ultra-high performance liquid chromatography (UHPLC) in neurochemical research is growing because of the need for faster, more selective and more sensitive neurotransmitter analyses. The instrument performance of a tailor-designed microbore UHPLC system coupled to electrochemical detection (ECD) is investigated, focusing on quantitative monoamine determination in in vivo microdialysis samples. The use of a microbore column (1.0 mm I.D.) requires miniaturization of the entire instrument, though a balance between extra-column band broadening and injection volume must be considered. This is accomplished through the user-defined Performance Optimizing Injection Sequence, whereby 5 μL of sample is injected on the column with a measured extra-column variance of 4.5-9.0 μL² and only 7 μL of sample uptake. Different sub-2 μm and superficially porous particle stationary phases are compared by means of the kinetic plot approach. Peak efficiencies of about 16000-35000 theoretical plates are obtained for the Acquity UPLC BEH C18 column within a 13 min analysis time. Furthermore, the coupling to ECD is shown to be suitable for microbore UHPLC analysis thanks to the miniaturized flow cell design, sufficiently fast data acquisition and mathematical data filtering. Ultimately, injection of in vivo samples demonstrates the applicability of the system for microdialysis analysis.
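    Plate counts like those quoted above are conventionally computed from the peak's retention time and its width at half height; a minimal sketch (the retention time and peak width below are illustrative, not values from the paper):

    ```python
    def theoretical_plates(t_r, w_half):
        """Column efficiency from retention time and peak width at half
        height, using the standard half-height formula
        N = 5.54 * (t_r / w_half)^2 (both in the same time units)."""
        return 5.54 * (t_r / w_half) ** 2

    # e.g. a peak eluting at 6.0 min with a 0.04 min half-height width
    n_plates = theoretical_plates(6.0, 0.04)
    ```

    Extra-column band broadening widens w_half without moving t_r, which is why minimizing extra-column variance, as described above, directly protects the measured plate count.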

  9. Performance analysis of bonded composite doublers on aircraft structures

    SciTech Connect

    Roach, D.

    1995-08-01

Researchers contend that composite repairs (or structural reinforcement doublers) offer numerous advantages over metallic patches, including corrosion resistance, light weight, high strength, elimination of rivets, and time savings in installation. Their use in commercial aviation has been stifled by uncertainties surrounding their application, subsequent inspection, and long-term endurance. The process of repairing or reinforcing airplane structures is time consuming, and the design depends on an accompanying stress and fatigue analysis. A repair that is too stiff may result in a loss of fatigue life, continued growth of the crack being repaired, and the initiation of a new flaw in the undesirable high-stress field around the patch. Uncertainties in the load spectra used to design repairs exacerbate these problems, as does the use of rivets to apply conventional doublers. Many of these repair and reinforcement difficulties can be addressed through the use of composite doublers. Primary among the unknowns are the effects of non-optimum installations and the certification of adequate inspection procedures. This paper presents an overview of a program intended to introduce composite doubler technology to the US commercial aircraft fleet. In this project, a specific composite application has been chosen on an L-1011 aircraft in order to focus the tasks on application and operation issues. Through the use of laboratory test structures and flight demonstrations on an in-service L-1011 airplane, this study is investigating composite doubler design, fabrication, installation, structural integrity, and non-destructive evaluation. In addition to providing an overview of the L-1011 project, this paper focuses on a series of fatigue and strength tests conducted to study the damage tolerance of composite doublers. Test results to date are presented.

  10. A Covariance Analysis Tool for Assessing Fundamental Limits of SIM Pointing Performance

    NASA Technical Reports Server (NTRS)

    Bayard, David S.; Kang, Bryan H.

    2007-01-01

    This paper presents a performance analysis of the instrument pointing control system for NASA's Space Interferometer Mission (SIM). SIM has a complex pointing system that uses a fast steering mirror in combination with a multirate control architecture to blend feed forward information with feedback information. A pointing covariance analysis tool (PCAT) is developed specifically to analyze systems with such complexity. The development of PCAT as a mathematical tool for covariance analysis is outlined in the paper. PCAT is then applied to studying performance of SIM's science pointing system. The analysis reveals and clearly delineates a fundamental limit that exists for SIM pointing performance. The limit is especially stringent for dim star targets. Discussion of the nature of the performance limit is provided, and methods are suggested to potentially improve pointing performance.
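
The covariance-propagation idea behind a tool like PCAT can be sketched in scalar form: iterate the discrete-time recursion until the pointing-error covariance settles at its fundamental floor. The recursion, gain, and noise values below are illustrative assumptions, not SIM parameters.

```python
# Scalar sketch of discrete-time covariance propagation: P_{k+1} = a^2 P_k + q,
# where a is the closed-loop error dynamic and q the per-step noise injection.
# All numbers are illustrative, not taken from the SIM design.
def propagate_covariance(a, q, p0, steps):
    """Iterate the covariance recursion and return the full history."""
    history = [p0]
    for _ in range(steps):
        history.append(a * a * history[-1] + q)
    return history

def steady_state(a, q):
    """Closed-form limit q / (1 - a^2), valid for |a| < 1 (stable loop)."""
    return q / (1.0 - a * a)

if __name__ == "__main__":
    hist = propagate_covariance(a=0.9, q=0.01, p0=0.0, steps=200)
    print(hist[-1], steady_state(0.9, 0.01))
```

The steady-state value is the kind of fundamental performance limit the paper discusses: no matter how long the loop runs, the error covariance cannot fall below it for a given noise level.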

  11. Performance analysis of mini-propellers based on FlightGear

    NASA Astrophysics Data System (ADS)

    Vogeltanz, Tomáš

    2016-06-01

    This paper presents a performance analysis of three mini-propellers based on the FlightGear flight simulator. Although a basic propeller analysis has to be performed before the use of FlightGear, for a complex and more practical performance analysis, it is advantageous to use a propeller model in cooperation with a particular aircraft model. This approach may determine whether the propeller has sufficient quality in respect of aircraft requirements. In the first section, the software used for the analysis is illustrated. Then, the parameters of the analyzed mini-propellers and the tested UAV are described. Finally, the main section shows and discusses the results of the performance analysis of the mini-propellers.

  12. A TIERED APPROACH TO PERFORMING UNCERTAINTY ANALYSIS IN CONDUCTING EXPOSURE ANALYSIS FOR CHEMICALS

    EPA Science Inventory

    The WHO/IPCS draft Guidance Document on Characterizing and Communicating Uncertainty in Exposure Assessment provides guidance on recommended strategies for conducting uncertainty analysis as part of human exposure analysis. Specifically, a tiered approach to uncertainty analysis ...

  13. Performance Analysis of Occurrences January 1, 2011-December 31, 2011

    SciTech Connect

    Ludwig, M

    2012-03-16

    This report documents the analysis of the occurrences during the period January 1, 2011 through December 31, 2011. The report compares LLNL occurrences by reporting criteria and significance category to see if LLNL is reporting occurrences along similar percentages as other DOE sites. The three-year trends are analyzed. It does not include the analysis of the causes or the lessons learned from the occurrences, as they are analyzed separately. The number and types of occurrences that LLNL reports to DOE varies over time. This variation can be attributed to normally occurring changes in frequency; DOE's or LLNL's heightened interest in a particular subject area; changes in LLNL processes; or emerging problems. Since all of the DOE sites use the same reporting criteria, it is helpful to understand if LLNL is consistent with or diverging from reporting at other sites. This section compares the normalized number of occurrences reported by LLNL and other DOE sites. In order to compare LLNL occurrence reports to occurrence reports from other DOE sites, we normalized (or standardized) the data from the sites. DOE sites vary widely in their budgets, populations, and scope of work and these variations may affect reporting frequency. In addition, reports are required for a wide range of occurrence types, some of which may not be applicable to all DOE sites. For example, one occurrence reporting group is Group 3, Nuclear Safety Basis, and not all sites have nuclear operations. Because limited information is available for all sites, the sites were normalized based on best available information. Site effort hours were extracted from the DOE Computerized Accident Incident Reporting System (CAIRS) and used to normalize (or standardize) the number of occurrences by site. Effort hours are those hours that employees normally work and do not include vacation, holiday hours etc. Sites are responsible for calculating their effort hours and ensuring entry into CAIRS. Out of the 30 DOE
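
The normalization step described above can be sketched as follows. The 200,000-hour divisor is the convention used for CAIRS-style recordable-case rates (roughly 100 full-time employees for one year); whether the report applies exactly this divisor to occurrence counts is an assumption made here for illustration.

```python
def occurrences_per_200k_hours(occurrences, effort_hours):
    """Normalize an occurrence count to a rate per 200,000 effort hours.

    The 200,000-hour basis is the CAIRS/OSHA-style convention
    (assumed here; the report may normalize differently).
    """
    return occurrences / effort_hours * 200_000
```

For example, a site reporting 45 occurrences over 18 million effort hours would have a normalized rate of 0.5 occurrences per 200,000 hours, which can then be compared across DOE sites of very different sizes.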

  14. The Performance Analysis of Ultra High Speed PM Type Synchronous Motor-Generator for Micro Turbine

    NASA Astrophysics Data System (ADS)

    Hong, Do-Kwan; Woo, Byung-Chul; Jeong, Yeon-Ho; Koo, Dae-Hyun; Cho, Yun-Hyun

    This paper deals with the loss, structural, thermal-fluid, and rotordynamic (critical speed and unbalance) analyses needed in developing the motor-generator. The machine is designed as an 800 W, 400 krpm generator and a 400 W, 200 krpm starter. The losses of the motor-generator are derived by magnetic analysis, and thermal-fluid analysis is performed using the loss-analysis results. The critical speed is extracted from a Campbell diagram, and unbalance vibration response analysis predicts the vibration amplitude expected from unbalance. The motor-generator was successfully developed using these analysis techniques.

  15. Analytical techniques and instrumentation: A compilation. [analytical instrumentation, materials performance, and systems analysis

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.

  16. Performance Cycle Analysis of a Two-Spool, Separate-Exhaust Turbofan With Interstage Turbine Burner

    NASA Technical Reports Server (NTRS)

    Liew, K. H.; Urip, E.; Yang, S. L.; Mattingly, J. D.; Marek, C. J.

    2005-01-01

    This paper presents the performance cycle analysis of a dual-spool, separate-exhaust turbofan engine, with an Interstage Turbine Burner serving as a secondary combustor. The ITB, which is located at the transition duct between the high- and the low-pressure turbines, is a relatively new concept for increasing specific thrust and lowering pollutant emissions in modern jet engine propulsion. A detailed performance analysis of this engine has been conducted for steady-state engine performance prediction. A code is written and is capable of predicting engine performances (i.e., thrust and thrust specific fuel consumption) at varying flight conditions and throttle settings. Two design-point engines were studied to reveal trends in performance at both full and partial throttle operations. A mission analysis is also presented to assure the advantage of saving fuel by adding ITB.
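
The performance figures such a cycle code predicts, specific thrust and thrust-specific fuel consumption, can be sketched for a separate-exhaust turbofan, ignoring exit-pressure thrust terms. The velocities, fuel-air ratio, and bypass ratio below are illustrative inputs, not the paper's design points.

```python
def specific_thrust(v0, v9, v19, f, alpha):
    """Uninstalled specific thrust F/mdot0 per unit total airflow,
    neglecting exit-pressure terms.

    v0: flight velocity, v9: core exit velocity, v19: fan exit velocity,
    f: fuel-air ratio (per unit core airflow), alpha: bypass ratio.
    """
    core = (1.0 + f) * v9 - v0      # core-stream momentum thrust
    fan = v19 - v0                  # bypass-stream momentum thrust
    return (core + alpha * fan) / (1.0 + alpha)

def tsfc(v0, v9, v19, f, alpha):
    """Thrust-specific fuel consumption: fuel mass flow per unit thrust.
    Fuel flow is f * mdot_core = f * mdot0 / (1 + alpha)."""
    return f / ((1.0 + alpha) * specific_thrust(v0, v9, v19, f, alpha))
```

At a static condition (v0 = 0) with v9 = 600 m/s, v19 = 300 m/s, f = 0.02, and a bypass ratio of 5, the specific thrust works out to 352 N·s/kg; the same two functions evaluated across flight conditions and throttle settings would generate the kind of performance map the paper describes.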

  17. Simulink-Based Implementation and Performance Analysis of TDS-OFDM in Time-Varying Environments

    DTIC Science & Technology

    2014-09-01

    Master's Thesis: Simulink-Based Implementation and Performance Analysis of TDS-OFDM in Time-Varying Environments, by Hui-Chen Lai, September 2014; Thesis Advisor: Monique P... This thesis uses Simulink-based software models to implement and test the time-domain synchronous OFDM (TDS-OFDM) transmitter and receiver systems. This technique

  18. Performance characterization of image and video analysis systems at Siemens Corporate Research

    NASA Astrophysics Data System (ADS)

    Ramesh, Visvanathan; Jolly, Marie-Pierre; Greiffenhagen, Michael

    2000-06-01

    There has been a significant increase in commercial products using imaging analysis techniques to solve real-world problems in diverse fields such as manufacturing, medical imaging, document analysis, transportation and public security, etc. This has been accelerated by various factors: more advanced algorithms, the availability of cheaper sensors, and faster processors. While algorithms continue to improve in performance, a major stumbling block in translating improvements in algorithms to faster deployment of image analysis systems is the lack of characterization of limits of algorithms and how they affect total system performance. The research community has realized the need for performance analysis and there have been significant efforts in the last few years to remedy the situation. Our efforts at SCR have been on statistical modeling and characterization of modules and systems. The emphasis is on both white-box and black box methodologies to evaluate and optimize vision systems. In the first part of this paper we review the literature on performance characterization and then provide an overview of the status of research in performance characterization of image and video understanding systems. The second part of the paper is on performance evaluation of medical image segmentation algorithms. Finally, we highlight some research issues in performance analysis in medical imaging systems.

  19. Value-Added and Other Methods for Measuring School Performance: An Analysis of Performance Measurement Strategies in Teacher Incentive Fund Proposals. Research Brief

    ERIC Educational Resources Information Center

    National Center on Performance Incentives, 2008

    2008-01-01

    In "Value-Added and Other Methods for Measuring School Performance: An Analysis of Performance Measurement Strategies in Teacher Incentive Fund Proposals"--a paper presented at the February 2008 National Center on Performance Incentives research to policy conference--Robert Meyer and Michael Christian examine select performance-pay plans…

  20. Magnitude of Task-Sampling Variability in Performance Assessment: A Meta-Analysis

    ERIC Educational Resources Information Center

    Huang, Chiungjung

    2009-01-01

    This study examined the percentage of task-sampling variability in performance assessment via a meta-analysis. In total, 50 studies containing 130 independent data sets were analyzed. Overall results indicate that the percentage of variance for (a) differential difficulty of task was roughly 12% and (b) examinee's differential performance of the…

  1. Performance Metrics Development Analysis for Information and Communications Technology Outsourcing: A Case Study

    ERIC Educational Resources Information Center

    Travis, James L., III

    2014-01-01

    This study investigated how and to what extent the development and use of the OV-5a operational architecture decomposition tree (OADT) from the Department of Defense (DoD) Architecture Framework (DoDAF) affects requirements analysis with respect to complete performance metrics for performance-based services acquisition of ICT under rigid…

  2. Relation of Early Testing and Incentive on Quiz Performance in Introductory Psychology: An Archival Analysis

    ERIC Educational Resources Information Center

    McGuire, Michael J.; MacDonald, Pamelyn M.

    2009-01-01

    Students should learn best by repeating a cycle of studying, testing, and feedback, all of which are components of "mastery learning." We performed an archival analysis to determine the relation between taking quizzes early and quiz performance in a "mastery learning" context. Also investigated was whether extra credit resulted in early testing…

  3. Using Multilevel Analysis to Monitor Test Performance across Administrations. Research Report. ETS RR-14-29

    ERIC Educational Resources Information Center

    Wei, Youhua; Qu, Yanxuan

    2014-01-01

    For a testing program with frequent administrations, it is important to understand and monitor the stability and fluctuation of test performance across administrations. Different methods have been proposed for this purpose. This study explored the potential of using multilevel analysis to understand and monitor examinees' test performance across…

  4. Embedded Figures Test Performance in the Broader Autism Phenotype: A Meta-Analysis

    ERIC Educational Resources Information Center

    Cribb, Serena J.; Olaithe, Michelle; Di Lorenzo, Renata; Dunlop, Patrick D.; Maybery, Murray T.

    2016-01-01

    People with autism show superior performance to controls on the Embedded Figures Test (EFT). However, studies examining the relationship between autistic-like traits and EFT performance in neurotypical individuals have yielded inconsistent findings. To examine the inconsistency, a meta-analysis was conducted of studies that (a) compared high and…

  5. Application of Data Envelopment Analysis on the Indicators Contributing to Learning and Teaching Performance

    ERIC Educational Resources Information Center

    Montoneri, Bernard; Lin, Tyrone T.; Lee, Chia-Chi; Huang, Shio-Ling

    2012-01-01

    This paper applies data envelopment analysis (DEA) to explore the quantitative relative efficiency of 18 classes of freshmen students studying a course of English conversation in a university of Taiwan from the academic year 2004-2006. A diagram of teaching performance improvement mechanism is designed to identify key performance indicators for…

  6. Computer Analysis of the Auditory Characteristics of Musical Performance. Final Report.

    ERIC Educational Resources Information Center

    Heller, Jack J.; Campbell, Warren C.

    The purpose of this research was to perform computer analysis and modification of complex musical tones and to develop models of perceptual and learning processes in music. Analysis of the physical attributes of sound (frequency, intensity, and harmonic content, versus time) provided necessary information about the musical parameters of…

  7. Interdisciplinary Learning in Pharmacology: Analysis of M.D. and P.A. Student Performance.

    ERIC Educational Resources Information Center

    Andrus, Peter L.; And Others

    1981-01-01

    An analysis of M.D. and physician's assistant (P.A.) student comparative performances at Baylor College of Medicine is presented. The analysis was seen as being appropriate in view of the P.A.'s increasing involvement with the implementation and monitoring of therapeutic programs. (Author/MLW)

  8. Evaluating Language Environment Analysis System Performance for Chinese: A Pilot Study in Shanghai

    ERIC Educational Resources Information Center

    Gilkerson, Jill; Zhang, Yiwen; Xu, Dongxin; Richards, Jeffrey A.; Xu, Xiaojuan; Jiang, Fan; Harnsberger, James; Topping, Keith

    2015-01-01

    Purpose: The purpose of this study was to evaluate performance of the Language Environment Analysis (LENA) automated language-analysis system for the Chinese Shanghai dialect and Mandarin (SDM) languages. Method: Volunteer parents of 22 children aged 3-23 months were recruited in Shanghai. Families provided daylong in-home audio recordings using…

  9. Component, Image, and Factor Analysis of Tests of Intellect and of Motor Performance.

    ERIC Educational Resources Information Center

    Harris, Chester W.; Liba, Marie R.

    An attempt was made to determine the effects of certain variations in methodology on the analysis of existing sets of data in the areas of ability or intelligence and motor performance or physical fitness. Using current developments in theory and methods of factor analysis, different treatments of various sets of data, three relatively new models…

  10. Sound analysis of a musical performance to evaluate prosthodontic treatment for a clarinet player.

    PubMed

    Hattori, Mariko; Sumita, Yuka I; Taniguchi, Hisashi

    2015-01-01

    Some dental patients use the orofacial region to play wind instruments; however, musical performance has not been objectively evaluated following prosthodontic treatment in such patients. The purpose of this report was to describe prosthodontic treatment for a clarinet player using sound analysis. The patient required a removable partial denture for his maxillary anterior teeth. Sound analysis was performed before and after denture adjustment, and the patient completed a questionnaire regarding his perceptions while playing his clarinet. After adjustment, the denture showed better performance, and patient satisfaction increased compared with that before adjustment.

  11. Elastic-plastic mixed-iterative finite element analysis: Implementation and performance assessment

    NASA Technical Reports Server (NTRS)

    Sutjahjo, Edhi; Chamis, Christos C.

    1993-01-01

    An elastic-plastic algorithm based on Von Mises and associative flow criteria is implemented in MHOST-a mixed iterative finite element analysis computer program developed by NASA Lewis Research Center. The performance of the resulting elastic-plastic mixed-iterative analysis is examined through a set of convergence studies. Membrane and bending behaviors of 4-node quadrilateral shell finite elements are tested for elastic-plastic performance. Generally, the membrane results are excellent, indicating the implementation of elastic-plastic mixed-iterative analysis is appropriate.

  12. Intrinsic motivation and extrinsic incentives jointly predict performance: a 40-year meta-analysis.

    PubMed

    Cerasoli, Christopher P; Nicklin, Jessica M; Ford, Michael T

    2014-07-01

    More than 4 decades of research and 9 meta-analyses have focused on the undermining effect: namely, the debate over whether the provision of extrinsic incentives erodes intrinsic motivation. This review and meta-analysis builds on such previous reviews by focusing on the interrelationship among intrinsic motivation, extrinsic incentives, and performance, with reference to 2 moderators: performance type (quality vs. quantity) and incentive contingency (directly performance-salient vs. indirectly performance-salient), which have not been systematically reviewed to date. Based on random-effects meta-analytic methods, findings from school, work, and physical domains (k = 183, N = 212,468) indicate that intrinsic motivation is a medium to strong predictor of performance (ρ = .21-.45). The importance of intrinsic motivation to performance remained in place whether or not incentives were presented. In addition, incentive salience influenced the predictive validity of intrinsic motivation for performance: In a "crowding out" fashion, intrinsic motivation was less important to performance when incentives were directly tied to performance and was more important when incentives were indirectly tied to performance. Considered simultaneously through meta-analytic regression, intrinsic motivation predicted more unique variance in quality of performance, whereas incentives were a better predictor of quantity of performance. With respect to performance, incentives and intrinsic motivation are not necessarily antagonistic and are best considered simultaneously. Future research should consider using nonperformance criteria (e.g., well-being, job satisfaction) as well as applying the percent-of-maximum-possible (POMP) method in meta-analyses.
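
A random-effects pooled estimate of the kind used in such meta-analyses can be sketched with the DerSimonian-Laird estimator of between-study variance. This is a generic illustration of the method, not the authors' exact procedure or data.

```python
def dersimonian_laird(effects, variances):
    """Random-effects pooled effect via the DerSimonian-Laird tau^2.

    effects: per-study effect sizes; variances: their sampling variances.
    Returns (pooled_effect, tau_squared).
    """
    k = len(effects)
    w = [1.0 / v for v in variances]              # fixed-effect weights
    sw = sum(w)
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sw
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sw - sum(wi * wi for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)            # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    return pooled, tau2
```

When the study effects are homogeneous, tau² collapses to zero and the estimate reduces to the fixed-effect (inverse-variance) mean; heterogeneity inflates tau² and pulls the weights toward equality.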

  13. Methodology issues concerning the accuracy of kinematic data collection and analysis using the ariel performance analysis system

    NASA Technical Reports Server (NTRS)

    Wilmington, R. P.; Klute, Glenn K. (Editor); Carroll, Amy E. (Editor); Stuart, Mark A. (Editor); Poliner, Jeff (Editor); Rajulu, Sudhakar (Editor); Stanush, Julie (Editor)

    1992-01-01

    Kinematics, the study of motion exclusive of the influences of mass and force, is one of the primary methods used for the analysis of human biomechanical systems as well as other types of mechanical systems. The Anthropometry and Biomechanics Laboratory (ABL) in the Crew Interface Analysis section of the Man-Systems Division performs both human body kinematics as well as mechanical system kinematics using the Ariel Performance Analysis System (APAS). The APAS supports both analysis of analog signals (e.g. force plate data collection) as well as digitization and analysis of video data. The current evaluations address several methodology issues concerning the accuracy of the kinematic data collection and analysis used in the ABL. This document describes a series of evaluations performed to gain quantitative data pertaining to position and constant angular velocity movements under several operating conditions. Two-dimensional as well as three-dimensional data collection and analyses were completed in a controlled laboratory environment using typical hardware setups. In addition, an evaluation was performed to evaluate the accuracy impact due to a single axis camera offset. Segment length and positional data exhibited errors within 3 percent when using three-dimensional analysis and yielded errors within 8 percent through two-dimensional analysis (Direct Linear Software). Peak angular velocities displayed errors within 6 percent through three-dimensional analyses and exhibited errors of 12 percent when using two-dimensional analysis (Direct Linear Software). The specific results from this series of evaluations and their impacts on the methodology issues of kinematic data collection and analyses are presented in detail. The accuracy levels observed in these evaluations are also presented.
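
Angular-velocity estimates of the kind evaluated in these tests are typically obtained by differencing digitized position or angle samples. A minimal sketch, using central differences and illustrative data rather than actual APAS output:

```python
def central_diff_velocity(angles, dt):
    """Estimate angular velocity (rad/s) from sampled angles (rad) using
    central differences; the two endpoints fall back to one-sided differences."""
    n = len(angles)
    vel = []
    for i in range(n):
        if i == 0:
            vel.append((angles[1] - angles[0]) / dt)
        elif i == n - 1:
            vel.append((angles[-1] - angles[-2]) / dt)
        else:
            vel.append((angles[i + 1] - angles[i - 1]) / (2.0 * dt))
    return vel

def percent_error(measured, true):
    """Percent error, as used for the accuracy figures quoted above."""
    return abs(measured - true) / abs(true) * 100.0
```

Applied to a constant-angular-velocity trial, the percent error of the recovered peak velocity against the known rig velocity gives exactly the kind of 6-12 percent accuracy figure reported in the evaluations.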

  14. Design and performance of an analysis-by-synthesis class of predictive speech coders

    NASA Technical Reports Server (NTRS)

    Rose, Richard C.; Barnwell, Thomas P., III

    1990-01-01

    The performance of a broad class of analysis-by-synthesis linear predictive speech coders is quantified experimentally. The class of coders includes a number of well-known techniques as well as a very large number of speech coders which have not been named or studied. A general formulation for deriving the parametric representation used in all of the coders in the class is presented. A new coder, named the self-excited vocoder, is discussed because of its good performance with low complexity, and because of the insight this coder gives to analysis-by-synthesis coders in general. The results of a study comparing the performances of different members of this class are presented. The study takes the form of a series of formal subjective and objective speech quality tests performed on selected coders. The results of this study lead to some interesting and important observations concerning the controlling parameters for analysis-by-synthesis speech coders.

  15. Performance (Off-Design) Cycle Analysis for a Turbofan Engine With Interstage Turbine Burner

    NASA Technical Reports Server (NTRS)

    Liew, K. H.; Urip, E.; Yang, S. L.; Mattingly, J. D.; Marek, C. J.

    2005-01-01

    This report presents the performance of a steady-state, dual-spool, separate-exhaust turbofan engine, with an interstage turbine burner (ITB) serving as a secondary combustor. The ITB, which is located in the transition duct between the high- and the low-pressure turbines, is a relatively new concept for increasing specific thrust and lowering pollutant emissions in modern jet-engine propulsion. A detailed off-design performance analysis of ITB engines is written in Microsoft(Registered Trademark) Excel (Redmond, Washington) macrocode with Visual Basic Application to calculate engine performances over the entire operating envelope. Several design-point engine cases are pre-selected using a parametric cycle-analysis code developed previously in Microsoft(Registered Trademark) Excel, for off-design analysis. The off-design code calculates engine performances (i.e. thrust and thrust-specific-fuel-consumption) at various flight conditions and throttle settings.

  16. Performance analysis in football: a critical review and implications for future research.

    PubMed

    Mackenzie, Rob; Cushion, Chris

    2013-01-01

    This paper critically reviews existing literature relating to performance analysis (PA) in football, arguing that an alternative approach is warranted. The paper considers the applicability of variables analysed along with research findings in the context of their implications for professional practice. This includes a review of methodological approaches commonly adopted throughout PA research, including a consideration of the nature and size of the samples used in relation to generalisability. Definitions and classifications of variables used within performance analysis are discussed in the context of reliability and validity. The contribution of PA findings to the field is reviewed. The review identifies an overemphasis on researching predictive and performance controlling variables. A different approach is proposed that works with and from performance analysis information to develop research investigating athlete and coach learning, thus adding to applied practice. Future research should pay attention to the social and cultural influences that impact PA delivery and athlete learning in applied settings.

  17. Parametric performance analysis of OTEC system using HFC32/HFC134a mixtures

    SciTech Connect

    Uehara, Haruo; Ikegami, Yasuyuki

    1995-11-01

    Parametric performance analysis is performed on an Ocean Thermal Energy Conversion (OTEC) system using HFC32/HFC134a mixtures as working fluid. The analyzed OTEC system uses the Kalina cycle. The parameters in the performance analysis consist of the warm sea water inlet temperature, the cold sea water inlet temperature, the heat transfer performance of the evaporator, condenser and regenerator, the turbine inlet pressure, the turbine inlet temperature, the molar fraction of HFC32. Effects of these various parameters on the efficiency of the Kalina cycle using HFC32/HFC134a mixtures are clarified by using this analysis, and compared with calculation results using ammonia/water mixtures as working fluid. The thermal efficiency of OTEC system using the Kalina cycle can reach up to about 5 percent with an inlet warm sea water temperature of 28 C and an inlet cold sea water temperature of 4 C.
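
A useful sanity check on the quoted 5 percent thermal efficiency is the Carnot limit for the stated sea-water temperatures:

```python
def carnot_efficiency(t_warm_c, t_cold_c):
    """Carnot limit 1 - Tc/Th for heat-engine efficiency between the
    warm and cold sea-water temperatures (inputs in Celsius)."""
    th = t_warm_c + 273.15
    tc = t_cold_c + 273.15
    return 1.0 - tc / th
```

For 28 C warm water and 4 C cold water the limit is about 8 percent, so the reported Kalina-cycle efficiency of roughly 5 percent sits plausibly below the thermodynamic ceiling for OTEC's small temperature difference.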

  18. Performance Analysis of Distributed Applications using Automatic Classification of Communication Inefficiencies

    SciTech Connect

    Vetter, J.

    1999-11-01

    We present a technique for performance analysis that helps users understand the communication behavior of their message passing applications. Our method automatically classifies individual communication operations and it reveals the cause of communication inefficiencies in the application. This classification allows the developer to focus quickly on the culprits of truly inefficient behavior, rather than manually foraging through massive amounts of performance data. Specifically, we trace the message operations of MPI applications and then classify each individual communication event using decision tree classification, a supervised learning technique. We train our decision tree using microbenchmarks that demonstrate both efficient and inefficient communication. Since our technique adapts to the target system's configuration through these microbenchmarks, we can simultaneously automate the performance analysis process and improve classification accuracy. Our experiments on four applications demonstrate that our technique can improve the accuracy of performance analysis, and dramatically reduce the amount of data that users must encounter.
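
The classification step can be illustrated with a one-feature decision stump trained on labeled microbenchmark events. The paper uses full decision-tree classification over traced MPI operations; this is a deliberately simplified sketch with made-up durations, showing only how a threshold learned from efficient/inefficient training examples then labels new communication events.

```python
def train_stump(durations, labels):
    """Fit a one-feature decision stump: choose the threshold on event
    duration that minimizes misclassification on the training set.
    labels: 0 = efficient, 1 = inefficient (from microbenchmarks)."""
    best_thresh, best_errors = None, len(labels) + 1
    for t in sorted(set(durations)):
        errors = sum(1 for d, y in zip(durations, labels)
                     if (1 if d > t else 0) != y)
        if errors < best_errors:
            best_thresh, best_errors = t, errors
    return best_thresh

def classify(duration, thresh):
    """Label a traced communication event against the learned threshold."""
    return 1 if duration > thresh else 0
```

Because the threshold is learned from microbenchmarks run on the target system itself, the classifier adapts to that system's configuration, which is the key property the paper exploits.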

  19. Computational Environments and Analysis methods available on the NCI High Performance Computing (HPC) and High Performance Data (HPD) Platform

    NASA Astrophysics Data System (ADS)

    Evans, B. J. K.; Foster, C.; Minchin, S. A.; Pugh, T.; Lewis, A.; Wyborn, L. A.; Evans, B. J.; Uhlherr, A.

    2014-12-01

    The National Computational Infrastructure (NCI) has established a powerful in-situ computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress in addressing harmonisation of the underlying data collections for future transdisciplinary research that enables accurate climate projections. NCI makes available 10+ PB of major data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected large scale and high-bandwidth Lustre filesystems. This computational environment supports a catalogue of integrated reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. To enable transdisciplinary research on this scale, data needs to be harmonised so that researchers can readily apply techniques and software across the corpus of data available and not be constrained to work within artificial disciplinary boundaries. Future challenges will

  20. Statistical Analysis of the Performance of MDL Enumeration for Multiple-Missed Detection in Array Processing.

    PubMed

    Du, Fei; Li, Yibo; Jin, Shijiu

    2015-08-18

    An accurate performance analysis on the MDL criterion for source enumeration in array processing is presented in this paper. The enumeration results of MDL can be predicted precisely by the proposed procedure via the statistical analysis of the sample eigenvalues, whose distributive properties are investigated with the consideration of their interactions. A novel approach is also developed for the performance evaluation when the source number is underestimated by a number greater than one, which is denoted as "multiple-missed detection", and the probability of a specific underestimated source number can be estimated by ratio distribution analysis. Simulation results are included to demonstrate the superiority of the presented method over available results and confirm the ability of the proposed approach to perform multiple-missed detection analysis.
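
The MDL criterion whose performance this paper analyzes can be sketched in its standard Wax-Kailath form, computed directly from the sample eigenvalues of the array covariance matrix:

```python
import math

def mdl_enumerate(eigvals, n_snapshots):
    """Wax-Kailath MDL estimate of the number of sources.

    eigvals: sample eigenvalues sorted in descending order (p sensors);
    n_snapshots: number of array snapshots N. Returns argmin_k MDL(k).
    """
    p = len(eigvals)
    best_k, best_val = 0, float("inf")
    for k in range(p):
        tail = eigvals[k:]                      # smallest p-k eigenvalues
        m = p - k
        geo = math.exp(sum(math.log(x) for x in tail) / m)
        arith = sum(tail) / m
        val = (-n_snapshots * m * math.log(geo / arith)
               + 0.5 * k * (2 * p - k) * math.log(n_snapshots))
        if val < best_val:
            best_k, best_val = k, val
    return best_k
```

With two strong signal eigenvalues above a flat noise floor the criterion picks k = 2; the underestimation events the paper studies occur when finite-sample fluctuations of these eigenvalues push the minimum below the true source count.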

  1. Statistical Analysis of the Performance of MDL Enumeration for Multiple-Missed Detection in Array Processing

    PubMed Central

    Du, Fei; Li, Yibo; Jin, Shijiu

    2015-01-01

    An accurate performance analysis on the MDL criterion for source enumeration in array processing is presented in this paper. The enumeration results of MDL can be predicted precisely by the proposed procedure via the statistical analysis of the sample eigenvalues, whose distributive properties are investigated with the consideration of their interactions. A novel approach is also developed for the performance evaluation when the source number is underestimated by a number greater than one, which is denoted as “multiple-missed detection”, and the probability of a specific underestimated source number can be estimated by ratio distribution analysis. Simulation results are included to demonstrate the superiority of the presented method over available results and confirm the ability of the proposed approach to perform multiple-missed detection analysis. PMID:26295232

  2. The Development of a Handbook for Astrobee F Performance and Stability Analysis

    NASA Technical Reports Server (NTRS)

    Wolf, R. S.

    1982-01-01

    An Astrobee F performance and stability analysis is presented, for use by the NASA Sounding Rocket Division. The performance analysis provides information regarding altitude, Mach number, dynamic pressure, and velocity as functions of time since launch. It is found that payload weight has the greatest effect on performance, and performance prediction accuracy was calculated to remain within 1%. In addition, to assure sufficient flight stability, a predicted rigid-body static margin of at least 8% of the total vehicle length is required. Finally, fin cant angle predictions are given in order to achieve a 2.5 cycle per second burnout roll rate, based on obtaining 75% of the steady roll rate. It is noted that this method can be used by flight performance engineers to create a similar handbook for any sounding rocket series.

  3. A Finite Rate Chemical Analysis of Nitric Oxide Flow Contamination Effects on Scramjet Performance

    NASA Technical Reports Server (NTRS)

    Cabell, Karen F.; Rock, Kenneth E.

    2003-01-01

    The level of nitric oxide contamination in the test gas of the Langley Research Center Arc-Heated Scramjet Test Facility and the effect of the contamination on scramjet test engine performance were investigated analytically. A finite rate chemical analysis was performed to determine the levels of nitric oxide produced in the facility at conditions corresponding to Mach 6 to 8 flight simulations. Results indicate that nitric oxide levels range from one to three mole percent, corroborating previously obtained measurements. A three-stream combustor code with finite rate chemistry was used to investigate the effects of nitric oxide on scramjet performance. Results indicate that nitric oxide in the test gas causes a small increase in heat release and thrust performance for the test conditions investigated. However, a rate constant uncertainty analysis suggests that the effect of nitric oxide ranges from no net effect, to an increase of about 10 percent in thrust performance.

  4. A performance index approach to aerodynamic design with the use of analysis codes only

    NASA Technical Reports Server (NTRS)

    Barger, Raymond L.; Moitra, Anutosh

    1988-01-01

    A method is described for designing an aerodynamic configuration for a specified performance vector, based on results from several similar, but not identical, trial configurations, each defined by a geometry parameter vector. The theory shows the method effective provided that: (1) the results for the trial configurations provide sufficient variation so that a linear combination of them approximates the specified performance; and (2) the differences between the performance vectors (including the specified performance) are sufficiently small that the linearity assumption of sensitivity analysis applies to the differences. A computed example describes the design of a high supersonic Mach number missile wing body configuration based on results from a set of four trial configurations.
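
    Condition (1) amounts to a least-squares fit of the trial performance vectors to the specified performance, after which the same weights combine the geometries; a minimal numpy sketch in which P, G and p_target are hypothetical trial data, not values from the report:

```python
import numpy as np

# Each column of P is the performance vector of one trial configuration;
# each column of G is that configuration's geometry parameter vector.
P = np.array([[1.0, 1.2, 0.9, 1.1],
              [0.5, 0.7, 0.4, 0.6],
              [2.0, 1.8, 2.2, 1.9]])
G = np.array([[0.10, 0.12, 0.09, 0.11],
              [4.00, 4.50, 3.80, 4.20]])

p_target = np.array([1.05, 0.55, 1.95])   # specified performance vector

# Least-squares weights c so that P @ c approximates p_target ...
c, *_ = np.linalg.lstsq(P, p_target, rcond=None)

# ... then, under the linearity assumption, the same combination of trial
# geometries approximates the geometry achieving the target performance.
g_design = G @ c
```

    If the fit residual is large, condition (1) fails and more (or more varied) trial configurations are needed.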

  5. Performance analysis of optical wireless communications on long-range links

    NASA Astrophysics Data System (ADS)

    Epple, Bernhard

    2011-09-01

    Optical wireless communication over long-range atmospheric links experiences strong fading that heavily influences the performance of communication systems. Most research on this topic focuses on simulation or measurement of the link performance in terms of the bit error ratio. In this work a statistical channel model derived from measurements is used for simulations of the link performance at the packet layer. To analyze the possible improvement of packet-layer performance through error protection techniques such as forward error correction and automatic repeat request, additional simulations are performed. All simulations cover several communication scenarios, such as maritime, land-mobile and air-to-ground links.
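
    A far simpler abstraction than the measured channel model used in this work, an i.i.d. packet-erasure sketch still shows how ARQ and FEC trade retransmissions against coding overhead at the packet layer; all parameters below are hypothetical:

```python
from math import comb

def arq_expected_transmissions(per, max_retries=None):
    """Mean transmissions per packet under stop-and-wait ARQ with i.i.d.
    packet error ratio `per`; optionally truncated after max_retries."""
    if max_retries is None:
        return 1.0 / (1.0 - per)
    n = max_retries + 1                        # total attempts allowed
    return sum(k * per ** (k - 1) * (1.0 - per) for k in range(1, n + 1)) + n * per ** n

def fec_packet_error(ber, packet_bits, correctable_bits):
    """Packet error ratio when a block code corrects up to `correctable_bits`
    errors, assuming independent bit errors at ratio `ber`."""
    ok = sum(comb(packet_bits, i) * ber ** i * (1.0 - ber) ** (packet_bits - i)
             for i in range(correctable_bits + 1))
    return 1.0 - ok
```

    For example, at a packet error ratio of 0.5, uncapped ARQ averages two transmissions per packet; on a fading channel the i.i.d. assumption breaks down, which is precisely why a measured statistical channel model is needed.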

  6. Analysis of validation data sets in the Class A Performance Evaluation Program

    SciTech Connect

    Hunn, B.D.

    1983-01-01

    The primary objective of the DOE Passive Solar Class A Performance Evaluation Program is to collect, analyze, and archive detailed test data for the rigorous validation of analysis/design tools used for passive solar research and design. This paper presents results of the analysis and qualification of several one- and two-week data sets taken at three Class A test sites for the purpose of validating envelope and thermal-storage-energy-transfer processes in passive solar analysis/design tools. Analysis of the data sets consists of editing the measured data and comparing these data with simulated performance results using public-domain, passive solar analysis tools and a standard reporting format developed for the Class A program. Comparisons of the measured data with results using the DOE-2 computer program are presented.
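
    Comparisons of measured and simulated data are commonly summarized with bias and scatter statistics; the sketch below uses NMBE and CV(RMSE), two standard building-energy validation metrics, though the Class A standard reporting format may define its own:

```python
import numpy as np

def validation_metrics(measured, simulated):
    """Normalized mean bias error (%) and coefficient of variation of the
    RMSE (%), two common agreement metrics for measured vs. simulated
    building-energy data (sign convention: positive NMBE = over-prediction)."""
    m = np.asarray(measured, float)
    s = np.asarray(simulated, float)
    mean_m = m.mean()
    nmbe = 100.0 * (s - m).sum() / (m.size * mean_m)
    cv_rmse = 100.0 * np.sqrt(((s - m) ** 2).mean()) / mean_m
    return nmbe, cv_rmse

# Perfect agreement (illustrative data) yields zero for both metrics:
nmbe, cv = validation_metrics([10, 12, 11, 13], [10, 12, 11, 13])
```
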

  7. Latent class analysis of reading, decoding, and writing performance using the Academic Performance Test: concurrent and discriminating validity

    PubMed Central

    Cogo-Moreira, Hugo; Carvalho, Carolina Alves Ferreira; de Souza Batista Kida, Adriana; de Avila, Clara Regina Brandão; Salum, Giovanni Abrahão; Moriyama, Tais Silveira; Gadelha, Ary; Rohde, Luis Augusto; de Moura, Luciana Monteiro; Jackowski, Andrea Parolin; de Jesus Mari, Jair

    2013-01-01

    Aim: To explore and validate the best returned latent class solution for reading and writing subtests from the Academic Performance Test (TDE). Sample: A total of 1,945 children (6–14 years of age), who answered the TDE, the Development and Well-Being Assessment (DAWBA), and had an estimated intelligence quotient (IQ) higher than 70, came from public schools in São Paulo (35 schools) and Porto Alegre (22 schools) that participated in the ‘High Risk Cohort Study for Childhood Psychiatric Disorders’ project. They were on average 9.52 years old (standard deviation = 1.856), from the 1st to 9th grades, and 53.3% male. The mean estimated IQ was 102.70 (standard deviation = 16.44). Methods: Via Item Response Theory (IRT), the highest discriminating items (‘a’>1.7) were selected from the TDE subtests of reading and writing. A latent class analysis was run based on these subtests. The statistically and empirically best latent class solutions were validated through concurrent (IQ and combined attention deficit hyperactivity disorder [ADHD] diagnoses) and discriminant (major depression diagnoses) measures. Results: A three-class solution was found to be the best model solution, revealing classes of children with good, not-so-good, or poor performance on TDE reading and writing tasks. The three-class solution has been shown to be correlated with estimated IQ and with ADHD diagnosis. No association was observed between the latent class and major depression. Conclusion: The three-class solution showed both concurrent and discriminant validity. This work provides initial evidence of validity for an empirically derived categorical classification of reading, decoding, and writing performance using the TDE. A valid classification encourages further research investigating correlates of reading and writing performance using the TDE. PMID:23983466

  8. TOF plotter—a program to perform routine analysis of time-of-flight mass spectral data

    NASA Astrophysics Data System (ADS)

    Knippel, Brad C.; Padgett, Clifford W.; Marcus, R. Kenneth

    2004-03-01

    The main article discusses the operation and application of the program to mass spectral data files. This laboratory has recently reported the construction and characterization of a linear time-of-flight mass spectrometer (ToF-MS) utilizing a radio frequency glow discharge ionization source. Data acquisition and analysis was performed using a digital oscilloscope and Microsoft Excel, respectively. Presently, no software package is available that is specifically designed for time-of-flight mass spectral analysis that is not instrument dependent. While spreadsheet applications such as Excel offer tremendous utility, they can be cumbersome when repeatedly performing tasks which are too complex or too user intensive for macros to be viable. To address this situation and make data analysis a faster, simpler task, our laboratory has developed a Microsoft Windows-based software program coded in Microsoft Visual Basic. This program enables the user to rapidly perform routine data analysis tasks such as mass calibration, plotting and smoothing on x-y data sets. In addition to a suite of tools for data analysis, a number of calculators are built into the software to simplify routine calculations pertaining to linear ToF-MS. These include mass resolution, ion kinetic energy and single peak identification calculators. A detailed description of the software and its associated functions is presented followed by a characterization of its performance in the analysis of several representative ToF-MS spectra obtained from different GD-ToF-MS systems.
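
    The mass-calibration task such a tool automates rests on the linear ToF relation t = t0 + k·sqrt(m/z); a minimal sketch with hypothetical calibrant peaks (the values are chosen for illustration, not taken from the article):

```python
import numpy as np

def calibrate_tof(times, known_mz):
    """Fit t = t0 + k*sqrt(m/z) to calibrant peaks; returns (t0, k)."""
    k, t0 = np.polyfit(np.sqrt(np.asarray(known_mz, float)),
                       np.asarray(times, float), 1)
    return t0, k

def tof_to_mz(t, t0, k):
    """Invert the calibration: m/z = ((t - t0) / k)**2."""
    return ((np.asarray(t, float) - t0) / k) ** 2

# Hypothetical two-point calibration (flight times in microseconds):
t0, k = calibrate_tof(times=[5.0, 7.0], known_mz=[25.0, 49.0])
mz_unknown = tof_to_mz(6.0, t0, k)
```
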

  9. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments

    PubMed Central

    Bass, Ellen J.; Baumgart, Leigh A.; Shepley, Kathryn Klein

    2014-01-01

    Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noisy sensor data are used by both the human and the information analysis automation to make judgments. In a simplified air traffic conflict prediction experiment, 32 participants made probability of horizontal conflict judgments under different display content conditions. After being exposed to the information analysis automation, judgment achievement significantly improved for all participants as compared to judgments without any of the automation's information. Participants provided with additional display content pertaining to cue variability in the task environment had significantly higher aided judgment achievement compared to those provided with only the automation's judgment of a probability of conflict. When designing information analysis automation for environments where the automation's judgment achievement is impacted by noisy environmental data, it may be beneficial to show additional task environment information to the human judge in order to improve judgment performance. PMID:24847184

  10. Analysis and compilation of missile aerodynamic data. Volume 2: Performance analysis

    NASA Technical Reports Server (NTRS)

    Burkhalter, J. E.

    1977-01-01

    A general analysis is given of the flight dynamics of several surface-to-air and two air-to-air missile configurations. The analysis involves three phases: vertical climb, straight and level flight, and constant altitude turn. Wind tunnel aerodynamic data and full scale missile characteristics are used where available; unknown data are estimated. For the constant altitude turn phase, a three degree of freedom flight simulation is used. Important parameters considered in this analysis are the vehicle weight, Mach number, heading angle, thrust level, sideslip angle, g loading, and time to make the turn. The actual flight path during the turn is also determined. Results are presented in graphical form.
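
    For the constant-altitude turn phase, the basic kinematics follow from the load factor alone; the point-mass sketch below is a simplification of, not a substitute for, the report's three-degree-of-freedom flight simulation, and the numbers are illustrative:

```python
import math

G0 = 9.80665  # standard gravity, m/s^2

def level_turn(v, n):
    """Bank angle (deg), turn rate (rad/s), turn radius (m) and time (s) for
    a 180-degree heading change in a constant-altitude coordinated turn at
    speed v (m/s) and load factor n (g)."""
    bank = math.degrees(math.acos(1.0 / n))
    omega = G0 * math.sqrt(n * n - 1.0) / v
    radius = v * v / (G0 * math.sqrt(n * n - 1.0))
    return bank, omega, radius, math.pi / omega

# E.g. a 4-g level turn at 300 m/s (hypothetical flight condition):
bank, omega, radius, t_half = level_turn(v=300.0, n=4.0)
```

    The g loading, speed, and time-to-turn trade-offs the analysis explores all fall out of these relations: higher n steepens the bank, tightens the radius, and shortens the turn.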

  11. Performance analysis and dynamic modeling of a single-spool turbojet engine

    NASA Astrophysics Data System (ADS)

    Andrei, Irina-Carmen; Toader, Adrian; Stroe, Gabriela; Frunzulica, Florin

    2017-01-01

    The purposes of modeling and simulation of a turbojet engine are steady-state analysis and transient analysis. The steady-state analysis, which investigates the operating and equilibrium regimes and is based on appropriate modeling of turbojet operation at design and off-design regimes, yields the performance analysis, concluded by the engine's operational maps (i.e. the altitude, velocity and speed maps) and the engine's universal map. The mathematical model that allows calculation of the design and off-design performances of a single-spool turbojet is detailed. An in-house code was developed and calibrated against the J85 turbojet engine as the test case. The dynamic model of the turbojet engine is obtained from the energy balance equations for the compressor, combustor and turbine, the engine's main parts. The transient analysis, based on appropriate modeling of the engine and its main parts, expresses the dynamic behavior of the turbojet engine and provides details regarding the engine's control. The aim of the dynamic analysis is to determine a control program for the turbojet, based on the results of the performance analysis. For a single-spool turbojet engine with fixed nozzle geometry, thrust is controlled by one parameter, the fuel flow rate. The design and management of the aircraft engine controls are based on the results of the transient analysis. The construction of the design model is complex, since it rests on both steady-state and transient analysis, further allowing flight path cycle analysis and optimization. This paper presents numerical simulations for a single-spool turbojet engine (J85 as test case), with appropriate modeling for steady-state and dynamic analysis.

  12. Noise analysis and performance comparison of low current measurement systems for biomedical applications.

    PubMed

    Dongsoo Kim; Goldstein, B; Wei Tang; Sigworth, F J; Culurciello, E

    2013-02-01

    In this paper, we report on the noise analysis of low current measurement systems for biomedical applications and their fundamental limits. We analyzed resistive feedback, capacitive feedback and current amplifier circuits for low current measurement systems. Detailed noise analysis for different biomedical applications are presented and matched with measurement data using a 0.5-μm fabrication process. Based on the theoretical analysis and the corresponding measurement results, the capacitive feedback system provides better noise performance for the measurement of low current than the others. The capacitive feedback circuit is capable of measuring 750 fA RMS at a 10 kHz sampling rate, whereas the resistive feedback provides 4 pA and the current conveyor provides 600 pA at the same bandwidth. This paper provides design guidelines to maximize the performance of low current measuring system for biomedical instrumentation and to provide the best performance available with CMOS technologies.
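
    A fundamental limit for the resistive-feedback case is the Johnson noise of the feedback resistor, which is one reason capacitive feedback (which has no such resistor) fares better; a quick sketch of that floor, with hypothetical component values:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def resistor_current_noise_rms(r_feedback, bandwidth, temperature=300.0):
    """RMS Johnson-noise current of a feedback resistor: sqrt(4*k*T*B/R),
    the thermal floor of a resistive-feedback transimpedance amplifier."""
    return math.sqrt(4.0 * K_B * temperature * bandwidth / r_feedback)

# E.g. a 10-GOhm feedback resistor over a 5-kHz bandwidth (~0.09 pA RMS):
i_rms = resistor_current_noise_rms(r_feedback=10e9, bandwidth=5e3)
```

    Increasing R lowers the current noise but limits bandwidth, a trade-off the capacitive-feedback topology sidesteps by eliminating this noise term altogether.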

  13. Free wake analysis of hover performance using a new influence coefficient method

    NASA Technical Reports Server (NTRS)

    Quackenbush, Todd R.; Bliss, Donald B.; Ong, Ching Cho

    1990-01-01

    A new approach to the prediction of helicopter rotor performance using a free wake analysis was developed. This new method uses a relaxation process that does not suffer from the convergence problems associated with previous time marching simulations. This wake relaxation procedure was coupled to a vortex-lattice, lifting surface loads analysis to produce a novel, self contained performance prediction code: EHPIC (Evaluation of Helicopter Performance using Influence Coefficients). The major technical features of the EHPIC code are described and a substantial amount of background information on the capabilities and proper operation of the code is supplied. Sample problems were undertaken to demonstrate the robustness and flexibility of the basic approach. Also, a performance correlation study was carried out to establish the breadth of applicability of the code, with very favorable results.

  14. Sensitivity Analysis of Wind Plant Performance to Key Turbine Design Parameters: A Systems Engineering Approach; Preprint

    SciTech Connect

    Dykes, K.; Ning, A.; King, R.; Graf, P.; Scott, G.; Veers, P.

    2014-02-01

    This paper introduces the development of a new software framework for research, design, and development of wind energy systems which is meant to 1) represent a full wind plant including all physical and nonphysical assets and associated costs up to the point of grid interconnection, 2) allow use of interchangeable models of varying fidelity for different aspects of the system, and 3) support system level multidisciplinary analyses and optimizations. This paper describes the design of the overall software capability and applies it to a global sensitivity analysis of wind turbine and plant performance and cost. The analysis was performed using three different model configurations involving different levels of fidelity, which illustrate how increasing fidelity can preserve important system interactions that build up to overall system performance and cost. Analyses were performed for a reference wind plant based on the National Renewable Energy Laboratory's 5-MW reference turbine at a mid-Atlantic offshore location within the United States.

  15. Functional data analysis of joint coordination in the development of vertical jump performance.

    PubMed

    Harrison, A J; Ryan, W; Hayes, K

    2007-05-01

    Mastery of complex motor skills requires effective development of inter-segment coordination patterns. These coordination patterns can be described and quantified using various methods, including descriptive angle-angle diagrams, conjugate cross-correlations, vector coding, normalized root mean squared error techniques and, as in this study, functional data analysis procedures. Lower limb kinematic data were obtained for 49 children performing the vertical jump. Participants were assigned to developmental stages using the criteria of Gallahue and Ozmun. Inter-segment joint coordination data consisting of pairs of joint angle-time data were smoothed using B-splines and the resulting bivariate functions were analysed using functional principal component analysis and stepwise discriminant analysis. The results of the analysis showed that the knee-hip joint coordination pattern was most effective at discriminating between developmental stages. The results provide support for the application of functional data analysis techniques in the analysis of joint coordination or time series type data.
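
    Once curves are discretized onto a common time base, functional PCA reduces to an SVD of the centered curve matrix; a minimal sketch on synthetic joint-angle curves (the data are fabricated for illustration, and the study's B-spline smoothing step is omitted):

```python
import numpy as np

rng = np.random.default_rng(0)

# Fabricated data: 49 subjects x 101 time-normalized samples of a joint-angle
# signal, built as a mean curve plus one random mode of variation.
t = np.linspace(0.0, 1.0, 101)
curves = np.sin(2 * np.pi * t) + 0.3 * rng.standard_normal((49, 1)) * np.cos(np.pi * t)

# Functional PCA on the discretized curves: center across subjects, then SVD.
centered = curves - curves.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
scores = U * S                    # per-subject principal component scores
explained = S**2 / np.sum(S**2)   # fraction of variation per component
```

    The per-subject scores could then feed a stepwise discriminant analysis across developmental stages, mirroring the study's pipeline.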

  16. Performance Refactoring of Instrumentation, Measurement, and Analysis Technologies for Petascale Computing. The PRIMA Project

    SciTech Connect

    Malony, Allen D.; Wolf, Felix G.

    2014-01-31

    The growing number of cores provided by today’s high-end computing systems present substantial challenges to application developers in their pursuit of parallel efficiency. To find the most effective optimization strategy, application developers need insight into the runtime behavior of their code. The University of Oregon (UO) and the Juelich Supercomputing Centre of Forschungszentrum Juelich (FZJ) develop the performance analysis tools TAU and Scalasca, respectively, which allow high-performance computing (HPC) users to collect and analyze relevant performance data – even at very large scales. TAU and Scalasca are considered among the most advanced parallel performance systems available, and are used extensively across HPC centers in the U.S., Germany, and around the world. The TAU and Scalasca groups share a heritage of parallel performance tool research and partnership throughout the past fifteen years. Indeed, the close interactions of the two groups resulted in a cross-fertilization of tool ideas and technologies that pushed TAU and Scalasca to what they are today. It also produced two performance systems with an increasing degree of functional overlap. While each tool has its specific analysis focus, the tools were implementing measurement infrastructures that were substantially similar. Because each tool provides complementary performance analysis, sharing of measurement results is valuable to provide the user with more facets to understand performance behavior. However, each measurement system was producing performance data in different formats, requiring data interoperability tools to be created. A common measurement and instrumentation system was needed to more closely integrate TAU and Scalasca and to avoid the duplication of development and maintenance effort. The PRIMA (Performance Refactoring of Instrumentation, Measurement, and Analysis) project was proposed over three years ago as a joint international effort between UO and FZJ to

  18. Inertial Sensor Technology for Elite Swimming Performance Analysis: A Systematic Review

    PubMed Central

    Mooney, Robert; Corley, Gavin; Godfrey, Alan; Quinlan, Leo R; ÓLaighin, Gearóid

    2015-01-01

    Technical evaluation of swimming performance is an essential factor of elite athletic preparation. Novel methods of analysis, incorporating body worn inertial sensors (i.e., Microelectromechanical systems, or MEMS, accelerometers and gyroscopes), have received much attention recently from both research and commercial communities as an alternative to video-based approaches. This technology may allow for improved analysis of stroke mechanics, race performance and energy expenditure, as well as real-time feedback to the coach, potentially enabling more efficient, competitive and quantitative coaching. The aim of this paper is to provide a systematic review of the literature related to the use of inertial sensors for the technical analysis of swimming performance. This paper focuses on providing an evaluation of the accuracy of different feature detection algorithms described in the literature for the analysis of different phases of swimming, specifically starts, turns and free-swimming. The consequences associated with different sensor attachment locations are also considered for both single and multiple sensor configurations. Additional information such as this should help practitioners to select the most appropriate systems and methods for extracting the key performance related parameters that are important to them for analysing their swimmers’ performance and may serve to inform both applied and research practices. PMID:26712760
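
    A common building block in such sensor pipelines is stroke detection by peak-picking on the acceleration trace; the deliberately minimal sketch below, run on a synthetic signal, is not any of the reviewed feature detection algorithms:

```python
import numpy as np

def count_strokes(accel, threshold):
    """Count stroke cycles as strict local maxima of an acceleration trace
    that exceed `threshold` (a minimal sketch, not a published algorithm)."""
    a = np.asarray(accel, float)
    is_peak = (a[1:-1] > a[:-2]) & (a[1:-1] > a[2:]) & (a[1:-1] > threshold)
    return int(np.count_nonzero(is_peak))

# Synthetic wrist trace: 10 stroke cycles at 1 Hz, sampled at 50 Hz.
fs = 50
t = np.arange(0.0, 10.0, 1.0 / fs)
accel = np.sin(2.0 * np.pi * 1.0 * t + 0.1)   # phase offset avoids sample ties
n_strokes = count_strokes(accel, threshold=0.5)
```

    Real swimming data require filtering and orientation handling first, and, as the review notes, accuracy depends strongly on where the sensor is attached.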

  19. Efficacy of Ginseng Supplements on Fatigue and Physical Performance: a Meta-analysis

    PubMed Central

    2016-01-01

    We conducted a meta-analysis to investigate the efficacy of ginseng supplements on fatigue reduction and physical performance enhancement as reported by randomized controlled trials (RCTs). RCTs that investigated the efficacy of ginseng supplements on fatigue reduction and physical performance enhancement compared with placebos were included. The main outcome measures were fatigue reduction and physical performance enhancement. Out of 155 articles meeting initial criteria, 12 RCTs involving 630 participants (311 participants in the intervention group and 319 participants in the placebo group) were included in the final analysis. In the fixed-effect meta-analysis of four RCTs, there was a statistically significant efficacy of ginseng supplements on fatigue reduction (standardized mean difference, SMD = 0.34; 95% confidence interval [CI] = 0.16 to 0.52). However, ginseng supplements were not associated with physical performance enhancement in the fixed-effect meta-analysis of eight RCTs (SMD = −0.01; 95% CI = −0.29 to 0.27). We found that there was insufficient clinical evidence to support the use of ginseng supplements in reducing fatigue and enhancing physical performance because only a few RCTs with a small sample size have been published so far. Further larger RCTs are required to confirm the efficacy of ginseng supplements on fatigue reduction. PMID:27822924

  20. Inertial Sensor Technology for Elite Swimming Performance Analysis: A Systematic Review.

    PubMed

    Mooney, Robert; Corley, Gavin; Godfrey, Alan; Quinlan, Leo R; ÓLaighin, Gearóid

    2015-12-25

    Technical evaluation of swimming performance is an essential factor of elite athletic preparation. Novel methods of analysis, incorporating body worn inertial sensors (i.e., Microelectromechanical systems, or MEMS, accelerometers and gyroscopes), have received much attention recently from both research and commercial communities as an alternative to video-based approaches. This technology may allow for improved analysis of stroke mechanics, race performance and energy expenditure, as well as real-time feedback to the coach, potentially enabling more efficient, competitive and quantitative coaching. The aim of this paper is to provide a systematic review of the literature related to the use of inertial sensors for the technical analysis of swimming performance. This paper focuses on providing an evaluation of the accuracy of different feature detection algorithms described in the literature for the analysis of different phases of swimming, specifically starts, turns and free-swimming. The consequences associated with different sensor attachment locations are also considered for both single and multiple sensor configurations. Additional information such as this should help practitioners to select the most appropriate systems and methods for extracting the key performance related parameters that are important to them for analysing their swimmers' performance and may serve to inform both applied and research practices.

  1. Analysis of swimming performance: perceptions and practices of US-based swimming coaches.

    PubMed

    Mooney, Robert; Corley, Gavin; Godfrey, Alan; Osborough, Conor; Newell, John; Quinlan, Leo Richard; ÓLaighin, Gearóid

    2016-01-01

    In elite swimming, a broad range of methods are used to assess performance, inform coaching practices and monitor athletic progression. The aim of this paper was to examine the performance analysis practices of swimming coaches and to explore the reasons behind the decisions that coaches take when analysing performance. Survey data were analysed from 298 Level 3 competitive swimming coaches (245 male, 53 female) based in the United States. Results were compiled to provide a generalised picture of practices and perceptions and to examine key emerging themes. It was found that a disparity exists between the importance swim coaches place on biomechanical analysis of swimming performance and the types of analyses that are actually conducted. Video-based methods are most frequently employed, with over 70% of coaches using these methods at least monthly, with analyses being mainly qualitative in nature rather than quantitative. Barriers to the more widespread use of quantitative biomechanical analysis in elite swimming environments were explored. Constraints include time, cost and availability of resources, but other factors such as sources of information on swimming performance and analysis and control over service provision are also discussed, with particular emphasis on video-based methods and emerging sensor-based technologies.

  2. Efficacy of Ginseng Supplements on Fatigue and Physical Performance: a Meta-analysis.

    PubMed

    Bach, Hoang Viet; Kim, Jeongseon; Myung, Seung Kwon; Cho, Young Ae

    2016-12-01

    We conducted a meta-analysis to investigate the efficacy of ginseng supplements on fatigue reduction and physical performance enhancement as reported by randomized controlled trials (RCTs). RCTs that investigated the efficacy of ginseng supplements on fatigue reduction and physical performance enhancement compared with placebos were included. The main outcome measures were fatigue reduction and physical performance enhancement. Out of 155 articles meeting initial criteria, 12 RCTs involving 630 participants (311 participants in the intervention group and 319 participants in the placebo group) were included in the final analysis. In the fixed-effect meta-analysis of four RCTs, there was a statistically significant efficacy of ginseng supplements on fatigue reduction (standardized mean difference, SMD = 0.34; 95% confidence interval [CI] = 0.16 to 0.52). However, ginseng supplements were not associated with physical performance enhancement in the fixed-effect meta-analysis of eight RCTs (SMD = -0.01; 95% CI = -0.29 to 0.27). We found that there was insufficient clinical evidence to support the use of ginseng supplements in reducing fatigue and enhancing physical performance because only a few RCTs with a small sample size have been published so far. Further larger RCTs are required to confirm the efficacy of ginseng supplements on fatigue reduction.
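
    The fixed-effect pooling reported here is standard inverse-variance weighting of per-trial SMDs; a sketch with hypothetical effect sizes and standard errors, not the trials analyzed in the paper:

```python
import numpy as np

def fixed_effect_pool(effects, std_errors):
    """Inverse-variance fixed-effect pooled estimate with a 95% CI, the
    standard way standardized mean differences from several RCTs are combined."""
    y = np.asarray(effects, float)
    w = 1.0 / np.asarray(std_errors, float) ** 2   # weight = 1 / variance
    pooled = np.sum(w * y) / np.sum(w)
    se = 1.0 / np.sqrt(np.sum(w))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical per-trial SMDs and standard errors (not the paper's data):
pooled, (lo, hi) = fixed_effect_pool([0.40, 0.25, 0.35, 0.30],
                                     [0.15, 0.20, 0.18, 0.25])
```

    A CI excluding zero, as in the fatigue result above, indicates a statistically significant pooled effect under the fixed-effect assumption.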

  3. Statistical analysis of microgravity experiment performance using the degrees of success scale

    NASA Technical Reports Server (NTRS)

    Upshaw, Bernadette; Liou, Ying-Hsin Andrew; Morilak, Daniel P.

    1994-01-01

    This paper describes an approach to identify factors that significantly influence microgravity experiment performance. Investigators developed the 'degrees of success' scale to provide a numerical representation of success. A degree of success was assigned to 293 microgravity experiments. Experiment information including the degree of success rankings and factors for analysis was compiled into a database. Through an analysis of variance, nine significant factors in microgravity experiment performance were identified. The frequencies of these factors are presented along with the average degree of success at each level. A preliminary discussion of the relationship between the significant factors and the degree of success is presented.
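
    The analysis-of-variance step can be sketched as a one-way F test of degree-of-success scores across the levels of a factor; the scores below are invented for illustration, not drawn from the 293-experiment database:

```python
import numpy as np

def one_way_anova_f(groups):
    """F statistic for a one-way analysis of variance across factor levels,
    as used to test whether a factor shifts the mean degree of success."""
    groups = [np.asarray(g, float) for g in groups]
    n_total = sum(g.size for g in groups)
    grand = np.concatenate(groups).mean()
    ss_between = sum(g.size * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_between = len(groups) - 1
    df_within = n_total - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical degree-of-success scores at three levels of one factor:
f_stat = one_way_anova_f([[7, 8, 9, 8], [5, 6, 5, 6], [9, 9, 8, 10]])
```

    A large F relative to the F(df_between, df_within) reference distribution flags the factor as significant, which is how the nine influential factors would be identified.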

  4. Seventy-meter antenna performance predictions: GTD analysis compared with traditional ray-tracing methods

    NASA Technical Reports Server (NTRS)

    Schredder, J. M.

    1988-01-01

    A comparative analysis was performed, using both the Geometrical Theory of Diffraction (GTD) and traditional pathlength error analysis techniques, for predicting RF antenna gain performance and pointing corrections. The NASA/JPL 70-meter antenna with its shaped surface was analyzed for gravity loading over the range of elevation angles. Also analyzed were the effects of lateral and axial displacements of the subreflector. Significant differences were noted between the predictions of the two methods, in the effect of subreflector displacements, and in the optimal subreflector positions to focus a gravity-deformed main reflector. The results are of relevance to future design procedures.

  5. The UTRC wind energy conversion system performance analysis for horizontal axis wind turbines (WECSPER)

    NASA Technical Reports Server (NTRS)

    Egolf, T. A.; Landgrebe, A. J.

    1981-01-01

    The theory for the UTRC Energy Conversion System Performance Analysis (WECSPER) for the prediction of horizontal axis wind turbine performance is presented. Major features of the analysis are the ability to: (1) treat the wind turbine blades as lifting lines with a prescribed wake model; (2) solve for the wake-induced inflow and blade circulation using real nonlinear airfoil data; and (3) iterate internally to obtain a compatible wake transport velocity and blade loading solution. This analysis also provides an approximate treatment of wake distortions due to tower shadow or wind shear profiles. Finally, selected results of internal UTRC application of the analysis to existing wind turbines and correlation with limited test data are described.

  6. Spectral Graph Theory Analysis of Software-Defined Networks to Improve Performance and Security

    DTIC Science & Technology

    2015-09-01

    [Cover-page text only] Dissertation by Thomas C. Parker, September 2015.

  7. Performance Analysis of the Link-16/JTIDS Waveform With Concatenated Coding

    DTIC Science & Technology

    2009-09-01

    [Cover-page text] Master's thesis by Ioannis Koromilas, September 2009; thesis advisor: Ralph C. Robertson. The communication terminal of Link-16 is called the Joint Tactical Information Distribution System (JTIDS) and features Reed-Solomon (RS) coding.

  8. Idaho National Laboratory Quarterly Performance Analysis for the 2nd Quarter FY 2015

    SciTech Connect

    Mitchell, Lisbeth A.

    2015-04-01

    This report is published quarterly by the Idaho National Laboratory (INL) Quality and Performance Management Organization. The Department of Energy (DOE) Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of events for the 2nd Qtr FY-15.

  9. Idaho National Laboratory Quarterly Performance Analysis - 3rd Quarter FY2014

    SciTech Connect

    Lisbeth A. Mitchell

    2014-09-01

    This report is published quarterly by the Idaho National Laboratory (INL) Performance Assurance Organization. The Department of Energy (DOE) Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of occurrence reports and other non-reportable issues identified at INL from July 2013 through June 2014.

  10. The Role of Culture, Competitiveness and Economic Performance in Explaining Academic Performance: A Global Market Analysis for International Student Segmentation

    ERIC Educational Resources Information Center

    Baumann, Chris; Hamin

    2011-01-01

    A nation's culture, competitiveness and economic performance explain academic performance. Partial Least Squares (PLS) testing of 2252 students shows culture affects competitiveness and academic performance. Culture and economic performance each explain 32%; competitiveness 36%. The model predicts academic performance when culture, competitiveness…

  11. Analysis of the temporal dynamics of model performance and parameter sensitivity for hydrological models

    NASA Astrophysics Data System (ADS)

    Reusser, D.; Zehe, E.

    2009-04-01

    The temporal dynamics of hydrological model performance gives insights into errors that cannot be obtained from global performance measures assigning a single number to the fit of a simulated time series to an observed reference series. These errors can include errors in data, model parameters, or model structure. Dealing with a set of performance measures evaluated at a high temporal resolution implies analyzing and interpreting a high dimensional data set. We present a method for such a hydrological model performance assessment with a high temporal resolution. Information about possible relevant processes during times with distinct model performance is obtained from parameter sensitivity analysis - also with high temporal resolution. We illustrate the combined approach of temporally resolved model performance and parameter sensitivity for a rainfall-runoff modeling case study. The headwater catchment of the Wilde Weisseritz in the eastern Ore Mountains is simulated with the conceptual model WaSiM-ETH. The proposed time-resolved performance assessment starts with the computation of a large set of classically used performance measures for a moving window. The key to the developed approach is a data-reduction method based on self-organizing maps (SOMs) and cluster analysis to classify the high-dimensional performance matrix. Synthetic peak errors are used to interpret the resulting error classes. The temporally resolved sensitivity analysis is based on the FAST algorithm. The final outcome of the proposed method is a time series of the occurrence of dominant error types as well as a time series of the relative parameter sensitivity. For the two case studies analyzed here, six error types have been identified. They show clear temporal patterns which can lead to the identification of model structural errors. The parameter sensitivity helps to identify the relevant model parts.
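    The moving-window computation of classical performance measures can be sketched for one such measure. A minimal illustration with Nash-Sutcliffe efficiency on hypothetical data (not the WaSiM-ETH setup), turning a single global score into a time series of model performance:

```python
import numpy as np

def moving_window_nse(obs, sim, window):
    """Nash-Sutcliffe efficiency over a sliding window.

    Returns one NSE value per window position, i.e. a time series of
    model performance instead of a single global number.
    """
    obs = np.asarray(obs, dtype=float)
    sim = np.asarray(sim, dtype=float)
    nse = []
    for start in range(len(obs) - window + 1):
        o = obs[start:start + window]
        s = sim[start:start + window]
        denom = np.sum((o - o.mean()) ** 2)
        if denom == 0.0:
            nse.append(np.nan)  # constant observations: NSE undefined
        else:
            nse.append(1.0 - np.sum((s - o) ** 2) / denom)
    return np.array(nse)

# Hypothetical observed/simulated discharge series
obs = [1.0, 2.0, 3.0, 2.5, 2.0, 3.5, 4.0, 3.0]
sim = [1.1, 1.9, 2.8, 2.6, 2.2, 3.4, 4.2, 2.9]
nse_series = moving_window_nse(obs, sim, window=4)
```

    In the paper's approach many such measures are computed per window and the resulting high-dimensional matrix is then classified with SOMs and cluster analysis.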

  12. An analysis for high speed propeller-nacelle aerodynamic performance prediction. Volume 1: Theory and application

    NASA Technical Reports Server (NTRS)

    Egolf, T. Alan; Anderson, Olof L.; Edwards, David E.; Landgrebe, Anton J.

    1988-01-01

    A computer program, the Propeller Nacelle Aerodynamic Performance Prediction Analysis (PANPER), was developed for the prediction and analysis of the performance and airflow of propeller-nacelle configurations operating over a forward speed range inclusive of high speed flight typical of recent propfan designs. A propeller lifting line, wake program was combined with a compressible, viscous center body interaction program, originally developed for diffusers, to compute the propeller-nacelle flow field, blade loading distribution, propeller performance, and the nacelle forebody pressure and viscous drag distributions. The computer analysis is applicable to single and coaxial counterrotating propellers. The blade geometries can include spanwise variations in sweep, droop, taper, thickness, and airfoil section type. In the coaxial mode of operation the analysis can treat both equal and unequal blade number and rotational speeds on the propeller disks. The nacelle portion of the analysis can treat both free air and tunnel wall configurations including wall bleed. The analysis was applied to many different sets of flight conditions using selected aerodynamic modeling options. The influence of different propeller nacelle-tunnel wall configurations was studied. Comparisons with available test data for both single and coaxial propeller configurations are presented along with a discussion of the results.

  13. The age of peak performance in Ironman triathlon: a cross-sectional and longitudinal data analysis

    PubMed Central

    2013-01-01

    Background The aims of the present study were, firstly, to investigate in a cross-sectional analysis the age of peak Ironman performance within one calendar year in all qualifiers for Ironman Hawaii and Ironman Hawaii; secondly, to determine in a longitudinal analysis on a qualifier for Ironman Hawaii whether the age of peak Ironman performance and Ironman performance itself change across years; and thirdly, to determine the gender difference in performance. Methods In a cross-sectional analysis, the age of the top ten finishers for all qualifier races for Ironman Hawaii and Ironman Hawaii was determined in 2010. For a longitudinal analysis, the age and the performance of the annual top ten female and male finishers in a qualifier for Ironman Hawaii were determined for Ironman Switzerland between 1995 and 2010. Results In 19 of the 20 analyzed triathlons held in 2010, there was no difference in the age of peak Ironman performance between women and men (p > 0.05). The only difference in the age of peak Ironman performance between genders was in ‘Ironman Canada’ where men were older than women (p = 0.023). For all 20 races, the age of peak Ironman performance was 32.2 ± 1.5 years for men and 33.0 ± 1.6 years for women (p > 0.05). In Ironman Switzerland, there was no difference in the age of peak Ironman performance between genders for top ten women and men from 1995 to 2010 (F = 0.06, p = 0.8). The mean age of top ten women and men was 31.4 ± 1.7 and 31.5 ± 1.7 years (Cohen's d = 0.06), respectively. The gender difference in performance in the three disciplines and for overall race time decreased significantly across years. Men and women improved overall race times by approximately 1.2 and 4.2 min/year, respectively. Conclusions Women and men peak at a similar age of 32–33 years in an Ironman triathlon with no gender difference. In a qualifier for Ironman Hawaii, the age of peak Ironman performance remained unchanged across years. In contrast, the gender difference in performance decreased across years.
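    The small effect size reported for age (Cohen's d = 0.06) can be checked with a pooled-standard-deviation Cohen's d. The group size below is an assumption for illustration (the abstract does not state n; 10 finishers over 16 years would give 160 per group):

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Cohen's d for two independent groups, using the pooled SD."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2)
                          / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Ages from the abstract: men 31.5 +/- 1.7, women 31.4 +/- 1.7;
# n = 160 per group is an assumption, not a figure from the study
d = cohens_d(31.5, 1.7, 160, 31.4, 1.7, 160)
# d is approximately 0.06, matching the reported effect size
```

    Because the two SDs are equal, the pooled SD is simply 1.7 and d reduces to the mean difference divided by 1.7 regardless of the assumed n.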

  14. Supercomputer and cluster performance modeling and analysis efforts: 2004-2006.

    SciTech Connect

    Sturtevant, Judith E.; Ganti, Anand; Meyer, Harold Edward; Stevenson, Joel O.; Benner, Robert E., Jr.; Goudy, Susan Phelps; Doerfler, Douglas W.; Domino, Stefan Paul; Taylor, Mark A.; Malins, Robert Joseph; Scott, Ryan T.; Barnette, Daniel Wayne; Rajan, Mahesh; Ang, James Alfred; Black, Amalia Rebecca; Laub, Thomas William; Vaughan, Courtenay Thomas; Franke, Brian Claude

    2007-02-01

    This report describes efforts by the Performance Modeling and Analysis Team to investigate performance characteristics of Sandia's engineering and scientific applications on the ASC capability and advanced architecture supercomputers, and Sandia's capacity Linux clusters. Efforts to model various aspects of these computers are also discussed. The goals of these efforts are to quantify and compare Sandia's supercomputer and cluster performance characteristics; to reveal strengths and weaknesses in such systems; and to predict performance characteristics of, and provide guidelines for, future acquisitions and follow-on systems. Described herein are the results obtained from running benchmarks and applications to extract performance characteristics and comparisons, as well as modeling efforts, obtained during the time period 2004-2006. The format of the report, with hypertext links to numerous additional documents, purposefully minimizes the document size needed to disseminate the extensive results from our research.

  15. Task versus relationship conflict, team performance, and team member satisfaction: a meta-analysis.

    PubMed

    De Dreu, Carsten K W; Weingart, Laurie R

    2003-08-01

    This study provides a meta-analysis of research on the associations between relationship conflict, task conflict, team performance, and team member satisfaction. Consistent with past theorizing, results revealed strong and negative correlations between relationship conflict, team performance, and team member satisfaction. In contrast to what has been suggested in both academic research and introductory textbooks, however, results also revealed strong and negative (instead of the predicted positive) correlations between task conflict, team performance, and team member satisfaction. As predicted, conflict had stronger negative relations with team performance in highly complex (decision making, project, mixed) than in less complex (production) tasks. Finally, task conflict was less negatively related to team performance when task conflict and relationship conflict were weakly, rather than strongly, correlated.

  16. Analysis of performance losses of direct ethanol fuel cells with the aid of a reference electrode

    NASA Astrophysics Data System (ADS)

    Li, Guangchun; Pickup, Peter G.

    The performances of direct ethanol fuel cells with different anode catalysts, different ethanol concentrations, and at different operating temperatures have been studied. The performance losses of the cell have been separated into individual electrode performance losses with the aid of a reference electrode, ethanol crossover has been quantified, and CO2 and acetic acid production have been measured by titration. It has been shown that the cell performance strongly depends on the anode catalyst, ethanol concentration, and operating temperature. It was found that the cathode and anode exhibit different dependences on ethanol concentration and operating temperature. The performance of the cathode is very sensitive to the rate of ethanol crossover. Product analysis provides insights into the mechanisms of electro-oxidation of ethanol.

  17. Characterizing Learning by Simultaneous Analysis of Continuous and Binary Measures of Performance

    PubMed Central

    Prerau, M. J.; Smith, A. C.; Eden, Uri T.; Kubota, Y.; Yanike, M.; Suzuki, W.; Graybiel, A. M.

    2009-01-01

    Continuous observations, such as reaction and run times, and binary observations, such as correct/incorrect responses, are recorded routinely in behavioral learning experiments. Although both types of performance measures are often recorded simultaneously, the two have not been used in combination to evaluate learning. We present a state-space model of learning in which the observation process has simultaneously recorded continuous and binary measures of performance. We use these performance measures simultaneously to estimate the model parameters and the unobserved cognitive state process by maximum likelihood using an approximate expectation maximization (EM) algorithm. We introduce the concept of a reaction-time curve and reformulate our previous definitions of the learning curve, the ideal observer curve, the learning trial and between-trial comparisons of performance in terms of the new model. We illustrate the properties of the new model in an analysis of a simulated learning experiment. In the simulated data analysis, simultaneous use of the two measures of performance provided more credible and accurate estimates of the learning than either measure analyzed separately. We also analyze two actual learning experiments in which the performance of rats and of monkeys was tracked across trials by simultaneously recorded reaction and run times and the correct and incorrect responses. In the analysis of the actual experiments, our algorithm gave a straightforward, efficient way to characterize learning by combining continuous and binary measures of performance. This analysis paradigm has implications for characterizing learning and for the more general problem of combining different data types to characterize the properties of a neural system. PMID:19692505

  18. Climate Analysis and Long Range Forecasting of Radar Performance in the Western North Pacific

    DTIC Science & Technology

    2009-06-01

    [Cover-page text only] Master of Science in Meteorology thesis by David C. Ramsaur, Naval Postgraduate School, June 2009.

  19. Performance-Based Occupational Affective Behavior Analysis (OABA). Implementation and Supporting Research.

    ERIC Educational Resources Information Center

    Pucel, David J.; And Others

    This document contains two sections: implementation of the performance-based Occupational Affective Behavior Analysis (OABA), and supporting research. Section 1 presents OABA, an analytic procedure designed to identify those affective behaviors important to success in an occupation, and gives directions on how to implement the procedure. The…

  20. Increasing Student Performance on the Independent School Entrance Exam (ISEE) Using the Gap Analysis Approach

    ERIC Educational Resources Information Center

    Sarshar, Shanon Etty

    2013-01-01

    Using the Gap Analysis problem-solving framework (Clark & Estes, 2008), this study examined the performance gap experienced by 6th grade students on the math sections of the ISEE (Independent School Entrance Exam). The purpose of the study was to identify and validate the knowledge, motivation, and organization causes of the students' low…