Science.gov

Sample records for performance analysis 1994-2002

  1. MyPyramid equivalents database for USDA survey food codes, 1994-2002, version 1.0.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The MyPyramid Equivalents Database for USDA Food Codes, 1994-2002 Version 1.0 (MyPyrEquivDB_v1) is based on USDA’s MyPyramid Food Guidance System (2005) and provides equivalents data on the five major food groups and selected subgroups (32 groups in all) for all USDA survey food codes available for ...

  2. [Characteristics of malaria cases diagnosed in Edirne province between 1994-2002].

    PubMed

    Ay, Gazanfer; Gürcan, Saban; Tatman Otkun, Müşerref; Tuğrul, Murat; Otkun, Metin

    2004-01-01

    In this study, the epidemiological characteristics of malaria cases in Edirne province were investigated. Between 1994 and 2002, a total of 317,087 blood samples were collected and examined for the presence of Plasmodium by the medical staff of the Malaria Control Department and Health Centers, using selective active surveillance among soldiers in the province and active or passive surveillance among the resident population. Plasmodium spp. were detected in 281 samples, and the characteristics of these malaria cases were investigated. Of the cases, 238 (84.7%) were detected in the first three years, mostly in September. Indigenous cases were detected in districts where rice is planted intensively, whereas imported cases were detected in districts heavily populated by military staff. Of the imported cases, 62% originated from the Diyarbakir, Batman, and Sanliurfa provinces (southeastern Turkey). P. vivax was detected as the causative agent in all blood samples except one case of P. ovale; this case, a student from Afghanistan, has so far been the only one reported in Turkey. Prioritizing mosquito control in districts with intensive rice cultivation and strictly monitoring military staff, particularly those arriving from Southeastern Anatolia, have led to successful control of malaria in the Edirne region. PMID:15293910

  3. A probable extralimital post-breeding assembly of Bufflehead Bucephala albeola in southcentral North Dakota, USA, 1994-2002

    USGS Publications Warehouse

    Igl, L.D.

    2003-01-01

    The Bufflehead Bucephala albeola breeds predominantly in Canada and Alaska (USA). Evidence suggests that the species may have recently expanded its breeding range southward into central and south central North Dakota. This paper presents data on observations of Buffleheads during the breeding season in Kidder County, North Dakota, 1994-2002, and discusses the possibility that the species has not expanded its breeding range but rather has established an extralimital post-breeding staging area south of its typical breeding range.

  5. Doppler images of the RS CVn binary II Pegasi during the years 1994-2002

    NASA Astrophysics Data System (ADS)

    Lindborg, M.; Korpi, M. J.; Hackman, T.; Tuominen, I.; Ilyin, I.; Piskunov, N.

    2011-02-01

    Aims: We publish 16 Doppler imaging temperature maps for the years 1994-2002 of the active RS CVn star II Peg. The six maps from 1999-2002 are based on previously unpublished observations. Through Doppler imaging we want to study the spot evolution of the star and in particular compare it with previous results showing cyclic spot behaviour and persistent active longitudes. Methods: The observations were collected with the SOFIN spectrograph at the Nordic Optical Telescope. The temperature maps were calculated using a Doppler imaging code based on Tikhonov regularization. Results: During 1994-2001, our results show a consistent trend in the derived longitudes of the principal and secondary temperature minima over time such that the magnetic structure appears to rotate somewhat more rapidly than the orbital period of this close binary. A sudden phase jump in the active region occurred between the observing seasons of 2001 and 2002. No clear trend over time is detected in the derived latitudes of the spots, indicating that the systematic motion could be related to the drift of the spot-generating mechanism rather than to differential rotation. The derived temperature maps are quite similar to the ones obtained earlier with different methods; the main differences occur in the spot latitudes and the relative strength of the spot structures. Conclusions: We observe both longitude and latitude shifts in the spot activity of II Peg. However, our results are not consistent with the periodic behaviour presented in previous studies. Full Table 1 is only available in electronic form at the CDS via anonymous ftp to cdsarc.u-strasbg.fr (130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/526/A44
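    Tikhonov regularization, the stabilizer named in the Methods above, turns an ill-posed inversion (such as recovering a surface temperature map from observed line profiles) into a well-behaved least-squares problem. A minimal sketch on a toy 1-D deconvolution problem, not actual Doppler imaging data:

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    """Solve min ||A x - b||^2 + lam * ||x||^2 via the regularized normal equations."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Ill-posed toy forward problem: a Gaussian blurring operator.
rng = np.random.default_rng(0)
n = 50
A = np.exp(-0.5 * ((np.arange(n)[:, None] - np.arange(n)[None, :]) / 3.0) ** 2)
x_true = np.sin(np.linspace(0, np.pi, n))
b = A @ x_true + 1e-3 * rng.standard_normal(n)   # noisy observations

x_reg = tikhonov_solve(A, b, lam=1e-2)
x_naive = np.linalg.solve(A, b)   # unregularized: noise is hugely amplified

err_reg = np.linalg.norm(x_reg - x_true)
err_naive = np.linalg.norm(x_naive - x_true)
print(err_reg < err_naive)   # regularization wins on noisy, ill-conditioned data
```

    The regularization parameter `lam` trades fidelity to the data against smoothness of the solution; in Doppler imaging codes it is chosen so the fit matches the noise level of the spectra.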

  6. Internet Access in U.S. Public Schools and Classrooms: 1994-2002. E.D. Tabs.

    ERIC Educational Resources Information Center

    Kleiner, Anne; Lewis, Laurie

    This report presents data on Internet access in U.S. public schools from 1994 to 2002 by school characteristics. It provides trend analysis on the progress of public schools and classrooms in connecting to the Internet and on the ratio of students to instructional computers with Internet access. For the year 2002, this report also presents data on…

  7. [Molecular epidemiology of rabies epizootics in Colombia, 1994-2002: evidence of human and canine rabies associated with chiroptera].

    PubMed

    Páez, Andrés; Nuñez, Constanza; García, Clemencia; Boshell, Jorge

    2003-03-01

    Three urban rabies outbreaks have been reported in Colombia during the last two decades, one of which is ongoing in the Caribbean region (northern Colombia). The earlier outbreaks occurred almost simultaneously in Arauca (eastern Colombia) and in the Central region, ending in 1997. Phylogenetic relationships among rabies viruses isolated from the three areas were determined based on a comparison of cDNA fragments coding for the endodomain of protein G and a fragment of the L protein obtained by RT-PCR. The sequenced amplicons, which included the G-L intergenic region, contained 902 base pairs. Phylogenetic analysis showed three distinct groups of viruses. Colombian genetic variant I viruses were isolated only from Arauca and the Central region but are now apparently extinct. Colombian genetic variant II viruses were isolated in the Caribbean region and are still being transmitted in that area. A third group, of bat rabies variants, was isolated from two insectivorous bats, three domestic dogs, and a human. This associates bat rabies virus with rabies in Colombian dogs and humans and indicates that bats are a rabies reservoir of public health significance. PMID:12696396

  8. Performance Measurement Analysis System

    Energy Science and Technology Software Center (ESTSC)

    1989-06-01

    The PMAS4.0 (Performance Measurement Analysis System) is a user-oriented system designed to track the cost and schedule performance of Department of Energy (DOE) major projects (MPs) and major system acquisitions (MSAs) reporting under DOE Order 5700.4A, Project Management System. PMAS4.0 provides for the analysis of performance measurement data produced from management control systems complying with the Federal Government's Cost and Schedule Control Systems Criteria.

  9. Ariel Performance Analysis System

    NASA Astrophysics Data System (ADS)

    Ariel, Gideon B.; Penny, M. A.; Saar, Dany

    1990-08-01

    The Ariel Performance Analysis System is a computer-based system for the measurement, analysis and presentation of human performance. The system is based on a proprietary technique for processing multiple high-speed film and video recordings of a subject's performance. It is noninvasive, and does not require wires, sensors, markers or reflectors. In addition, it is portable and does not require modification of the performing environment. The scale and accuracy of measurement can be set to whatever levels are required by the activity being performed.

  10. Performance Support for Performance Analysis

    ERIC Educational Resources Information Center

    Schaffer, Scott; Douglas, Ian

    2004-01-01

    Over the past several years, there has been a shift in emphasis in many business, industry, government and military training organizations toward human performance technology or HPT (Rossett, 2002; Dean, 1995). This trend has required organizations to increase the human performance knowledge, skills, and abilities of the training workforce.…

  11. MIR Performance Analysis

    SciTech Connect

    Hazen, Damian; Hick, Jason

    2012-06-12

    We provide analysis of Oracle StorageTek T10000 Generation B (T10KB) Media Information Record (MIR) Performance Data gathered over the course of a year from our production High Performance Storage System (HPSS). The analysis shows information in the MIR may be used to improve tape subsystem operations. Most notably, we found the MIR information to be helpful in determining whether the drive or tape was most suspect given a read or write error, and for helping identify which tapes should not be reused given their history of read or write errors. We also explored using the MIR Assisted Search to order file retrieval requests. We found that MIR Assisted Search may be used to reduce the time needed to retrieve collections of files from a tape volume.
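    The MIR Assisted Search idea above, ordering file retrieval requests by their recorded position on the tape so the drive sweeps the volume once instead of seeking back and forth, can be sketched as a simple sort. The field names below are illustrative only, not the actual MIR record layout:

```python
# Hypothetical sketch: order tape file-retrieval requests by recorded
# position ("wrap", "block" are assumed illustrative fields) so the drive
# makes a single sequential pass over the medium.

def order_requests(requests):
    """Sort retrieval requests by (wrap, block) for one sequential pass."""
    return sorted(requests, key=lambda r: (r["wrap"], r["block"]))

reqs = [
    {"file": "c", "wrap": 1, "block": 40},
    {"file": "a", "wrap": 0, "block": 900},
    {"file": "b", "wrap": 0, "block": 15},
]
print([r["file"] for r in order_requests(reqs)])  # ['b', 'a', 'c']
```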

  12. DAS performance analysis

    SciTech Connect

    Bates, G.; Bodine, S.; Carroll, T.; Keller, M.

    1984-02-01

    This report begins with an overview of the Data Acquisition System (DAS), which supports several of PPPL's experimental devices. Performance measurements which were taken on DAS and the tools used to make them are then described.

  13. Dependability and performability analysis

    NASA Technical Reports Server (NTRS)

    Trivedi, Kishor S.; Ciardo, Gianfranco; Malhotra, Manish; Sahner, Robin A.

    1993-01-01

    Several practical issues regarding specifications and solution of dependability and performability models are discussed. Model types with and without rewards are compared. Continuous-time Markov chains (CTMC's) are compared with (continuous-time) Markov reward models (MRM's) and generalized stochastic Petri nets (GSPN's) are compared with stochastic reward nets (SRN's). It is shown that reward-based models could lead to more concise model specifications and solution of a variety of new measures. With respect to the solution of dependability and performability models, three practical issues were identified: largeness, stiffness, and non-exponentiality, and a variety of approaches are discussed to deal with them, including some of the latest research efforts.
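    The reward-based modeling compared above can be illustrated with a minimal Markov reward model: a two-state availability CTMC (an assumed textbook example, not from the paper) in which each state carries a reward rate, and the steady-state expected reward rate equals system availability:

```python
import numpy as np

# Two-state CTMC availability model (assumed rates, for illustration only).
# A Markov reward model attaches a reward rate r_i to each state; the
# steady-state expected reward rate is sum_i pi_i * r_i, where pi solves
# pi Q = 0 with sum(pi) = 1.

lam, mu = 0.001, 0.1          # failure and repair rates (per hour), assumed
Q = np.array([[-lam,  lam],   # state 0: up
              [  mu,  -mu]])  # state 1: down
r = np.array([1.0, 0.0])      # reward 1 while up -> expected reward = availability

# Append the normalization constraint and solve the overdetermined system.
A = np.vstack([Q.T, np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

availability = pi @ r
print(round(availability, 6))  # analytically mu / (lam + mu)
```

    The same machinery scales to larger chains; the paper's point is that attaching rewards lets one measure (here, availability) fall out of the same solution that yields the state probabilities.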

  14. Analysis of EDP performance

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The objective of this contract was the investigation of the potential performance gains that would result from an upgrade of the Space Station Freedom (SSF) Data Management System (DMS) Embedded Data Processor (EDP) '386' design with the Intel Pentium (registered trade-mark of Intel Corp.) '586' microprocessor. The Pentium ('586') is the latest member of the industry standard Intel X86 family of CISC (Complex Instruction Set Computer) microprocessors. This contract was scheduled to run in parallel with an internal IBM Federal Systems Company (FSC) Internal Research and Development (IR&D) task that had the goal to generate a baseline flight design for an upgraded EDP using the Pentium. This final report summarizes the activities performed in support of Contract NAS2-13758. Our plan was to baseline performance analyses and measurements on the latest state-of-the-art commercially available Pentium processor, representative of the proposed space station design, and then phase to an IBM capital funded breadboard version of the flight design (if available from IR&D and Space Station work) for additional evaluation of results. Unfortunately, the phase-over to the flight design breadboard did not take place, since the IBM Data Management System (DMS) for the Space Station Freedom was terminated by NASA before the referenced capital funded EDP breadboard could be completed. The baseline performance analyses and measurements, however, were successfully completed, as planned, on the commercial Pentium hardware. The results of those analyses, evaluations, and measurements are presented in this final report.

  15. MPQC: Performance Analysis and Optimization

    SciTech Connect

    Sarje, Abhinav; Williams, Samuel; Bailey, David

    2012-11-30

    MPQC (Massively Parallel Quantum Chemistry) is a widely used computational quantum chemistry code. It is capable of performing a number of computations commonly occurring in quantum chemistry. In order to achieve better performance of MPQC, in this report we present a detailed performance analysis of this code. We then perform loop and memory access optimizations, and measure performance improvements by comparing the performance of the optimized code with that of the original MPQC code. We observe that the optimized MPQC code achieves a significant improvement in the performance through a better utilization of vector processing and memory hierarchies.

  16. Lidar performance analysis

    NASA Technical Reports Server (NTRS)

    Spiers, Gary D.

    1994-01-01

    Section 1 details the theory used to build the lidar model, provides results of using the model to evaluate AEOLUS instrument designs, and provides snapshots of the visual appearance of the coded model. Appendix A contains a Fortran program to calculate various forms of the refractive index structure function. This program was used to determine the refractive index structure function used in the main lidar simulation code. Appendix B contains a memo on the optimization of the lidar telescope geometry for a line-scan geometry. Appendix C contains the code for the main lidar simulation and brief instructions on running the code. Appendix D contains a Fortran code to calculate the maximum permissible exposure for the eye from the ANSI Z136.1-1992 eye safety standards. Appendix E contains a paper on the eye safety analysis of a space-based coherent lidar presented at the 7th Coherent Laser Radar Applications and Technology Conference, Paris, France, 19-23 July 1993.

  17. Techniques for Automated Performance Analysis

    SciTech Connect

    Marcus, Ryan C.

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  18. Inlet Performance Analysis Code Developed

    NASA Technical Reports Server (NTRS)

    Jules, Kenol; Barnhart, Paul J.

    1998-01-01

    The design characteristics of an inlet very much depend on whether the inlet is to be flown at subsonic, supersonic, or hypersonic speed. Whichever the case, the primary function of an inlet is to deliver free-stream air to the engine face at the highest stagnation pressure possible and with the lowest possible variation in both stagnation pressure and temperature. At high speeds, this is achieved by a system of oblique and/or normal shock waves, and possibly some isentropic compression. For both subsonic and supersonic flight, current design practice indicates that the inlet should deliver the air to the engine face at approximately Mach 0.45. As a result, even for flight in the high subsonic regime, the inlet must retard (or diffuse) the air substantially. Second, the design of an inlet is influenced largely by the compromise between high performance and low weight. This compromise involves tradeoffs between the mission requirements, flight trajectory, airframe aerodynamics, engine performance, and weight, all of which, in turn, influence each other. Therefore, to study the effects of some of these influential factors, the Propulsion System Analysis Office of the NASA Lewis Research Center developed the Inlet Performance Analysis Code (IPAC). This code uses oblique shock and Prandtl-Meyer expansion theory to predict inlet performance. It can be used to predict performance for a given inlet geometric design such as pitot, axisymmetric, and two-dimensional. IPAC also can be used to design preliminary inlet systems and to make subsequent performance analyses. It computes the total pressure, the recovery, the airflow, and the drag coefficients. The pressure recovery includes losses associated with normal and oblique shocks, internal and external friction, the sharp lip, and diffuser components. Flow rate includes captured, engine, spillage, bleed, and bypass flows. The aerodynamic drag calculation includes drags associated with spillage, cowl lip suction, wave, bleed
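    One building block of a shock-based recovery estimate like IPAC's is the total-pressure ratio across a normal shock. A sketch for a perfect gas, checked against standard compressible-flow tables (the code below is an illustration of the standard relation, not IPAC itself):

```python
import math

def normal_shock_recovery(M, gamma=1.4):
    """Total-pressure ratio p02/p01 across a normal shock at Mach M > 1 (perfect gas)."""
    a = ((gamma + 1) * M * M / ((gamma - 1) * M * M + 2)) ** (gamma / (gamma - 1))
    b = ((gamma + 1) / (2 * gamma * M * M - (gamma - 1))) ** (1 / (gamma - 1))
    return a * b

# At Mach 2 the standard tables give p02/p01 = 0.7209.
print(round(normal_shock_recovery(2.0), 4))
```

    An external-compression inlet model multiplies such ratios over each oblique shock in the system, then applies the terminal normal-shock loss, which is why recovery falls off steeply with flight Mach number.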

  19. Scalable Performance Measurement and Analysis

    SciTech Connect

    Gamblin, Todd

    2009-01-01

    Concurrency levels in large-scale, distributed-memory supercomputers are rising exponentially. Modern machines may contain 100,000 or more microprocessor cores, and the largest of these, IBM's Blue Gene/L, contains over 200,000 cores. Future systems are expected to support millions of concurrent tasks. In this dissertation, we focus on efficient techniques for measuring and analyzing the performance of applications running on very large parallel machines. Tuning the performance of large-scale applications can be a subtle and time-consuming task because application developers must measure and interpret data from many independent processes. While the volume of the raw data scales linearly with the number of tasks in the running system, the number of tasks is growing exponentially, and data for even small systems quickly becomes unmanageable. Transporting performance data from so many processes over a network can perturb application performance and make measurements inaccurate, and storing such data would require a prohibitive amount of space. Moreover, even if it were stored, analyzing the data would be extremely time-consuming. In this dissertation, we present novel methods for reducing performance data volume. The first draws on multi-scale wavelet techniques from signal processing to compress systemwide, time-varying load-balance data. The second uses statistical sampling to select a small subset of running processes to generate low-volume traces. A third approach combines sampling and wavelet compression to stratify performance data adaptively at run-time and to reduce further the cost of sampled tracing. We have integrated these approaches into Libra, a toolset for scalable load-balance analysis. We present Libra and show how it can be used to analyze data from large scientific applications scalably.
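    The wavelet-compression idea above (transform the per-timestep load trace, discard small coefficients, reconstruct an approximation) can be sketched with a hand-rolled one-level Haar transform. This is a simplified illustration; the actual Libra toolset uses multi-scale wavelets and combines them with statistical sampling:

```python
import numpy as np

def haar_forward(x):
    """One-level orthonormal Haar transform of an even-length signal."""
    s = (x[0::2] + x[1::2]) / np.sqrt(2)   # smooth (approximation) coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    return s, d

def haar_inverse(s, d):
    """Exact inverse of haar_forward."""
    x = np.empty(2 * len(s))
    x[0::2] = (s + d) / np.sqrt(2)
    x[1::2] = (s - d) / np.sqrt(2)
    return x

t = np.linspace(0, 1, 256)
load = 100 + 20 * np.sin(2 * np.pi * 3 * t)   # smooth synthetic load trace

s, d = haar_forward(load)
d[np.abs(d) < 1.0] = 0.0                      # drop small detail coefficients
approx = haar_inverse(s, d)

rel_err = np.linalg.norm(approx - load) / np.linalg.norm(load)
print(rel_err < 0.01)  # smooth signals compress with little error
```

    Because the transform is orthonormal, the reconstruction error is exactly the energy of the discarded coefficients, which is what makes the data-volume/accuracy trade-off easy to control.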

  20. MUSE instrument global performance analysis

    NASA Astrophysics Data System (ADS)

    Loupias, M.; Bacon, R.; Caillier, P.; Fleischmann, A.; Jarno, A.; Kelz, A.; Kosmalski, J.; Laurent, F.; Le Floch, M.; Lizon, J. L.; Manescau, A.; Nicklas, H.; Parès, L.; Pécontal, A.; Reiss, R.; Remillieux, A.; Renault, E.; Roth, M. M.; Rupprecht, G.; Stuik, R.

    2010-07-01

    MUSE (Multi Unit Spectroscopic Explorer) is a second-generation instrument developed for ESO (European Southern Observatory) and will be mounted on the VLT (Very Large Telescope) in 2012. The MUSE instrument can simultaneously record 90,000 spectra in the visible wavelength range (465-930 nm) across a 1×1 arcmin² field of view, thanks to 24 identical Integral Field Units (IFUs). A collaboration of 7 institutes has successfully passed the Final Design Review and is currently working on the first sub-assemblies. The performance budget is divided among 5 main functional sub-systems. The Fore Optics sub-system derotates and anamorphoses the VLT Nasmyth focal-plane image, and the Splitting and Relay Optics, together with the Main Structure, feed each IFU with 1/24th of the field of view. Each IFU provides a 3D function, realized by an image-slicer system and a spectrograph, and a detection function, realized by a 4k×4k CCD cooled to 163 K. The 5th function is the calibration and data reduction of the instrument. This article describes the breakdown of performance among these sub-systems (throughput, image quality, etc.) and underlines the constraining parameters of the interfaces, whether internal or with the VLT. The validation of all these requirements is a critical task, started a few months ago, which requires clear traceability and performance analysis.

  1. Software Performs Complex Design Analysis

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and the prediction of failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: 1) inadequate shape parameterization algorithms, and 2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its revolutionary, one-of-a-kind arbitrary shape deformation (ASD) capability, a major advancement in solving these two aforementioned problems, to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing the CFD designers to freely create their own shape parameters, therefore eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation. This eliminates the extremely costly process of having to remesh the grid for every shape change desired. The program can perform a design change in a markedly reduced amount of time, a process that would traditionally involve the designer returning to the CAD model to reshape and then remesh the shapes, something that has been known to take hours, days, even weeks or months, depending upon the size of the model.

  2. Stage Separation Performance Analysis Project

    NASA Technical Reports Server (NTRS)

    Chen, Yen-Sen; Zhang, Sijun; Liu, Jiwen; Wang, Ten-See

    2001-01-01

    Stage separation is an important phenomenon in multi-stage launch vehicle operation. The transient flowfield coupled with the multi-body system is a challenging problem in design analysis. The thermodynamic environment with burning propellants during upper-stage engine start in the separation process adds to the complexity of the entire system. Understanding the underlying flow physics and vehicle dynamics during stage separation is required to design a multi-stage launch vehicle with good flight performance. A computational fluid dynamics model with the capability to couple transient multi-body dynamics will be a useful tool for simulating the effects of the transient flowfield, plume/jet heating, and vehicle dynamics. A computational model using a generalized mesh system will be used as the basis of this development. The multi-body dynamics system will be solved by integrating a system of six-degree-of-freedom equations of motion with high accuracy. Multi-body mesh systems and their interactions will be modeled using parallel computing algorithms. An adaptive mesh refinement method will also be employed to enhance solution accuracy in the transient process.

  3. SEP thrust subsystem performance sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Atkins, K. L.; Sauer, C. G., Jr.; Kerrisk, D. J.

    1973-01-01

    This is a two-part report on solar electric propulsion (SEP) performance sensitivity analysis. The first part describes the preliminary analysis of the SEP thrust system performance for an Encke rendezvous mission. A detailed description of thrust subsystem hardware tolerances on mission performance is included together with nominal spacecraft parameters based on these tolerances. The second part describes the method of analysis and graphical techniques used in generating the data for Part 1. Included is a description of both the trajectory program used and the additional software developed for this analysis. Part 2 also includes a comprehensive description of the use of the graphical techniques employed in this performance analysis.

  4. Analysis of Costs and Performance

    ERIC Educational Resources Information Center

    Duchesne, Roderick M.

    1973-01-01

    This article outlines a library management information system concerned with total library costs and performance. The system is essentially an adaptation of well-proven industrial and commercial management accounting techniques to the library context. (24 references) (Author)

  5. Guided wave tomography performance analysis

    NASA Astrophysics Data System (ADS)

    Huthwaite, Peter; Lowe, Michael; Cawley, Peter

    2016-02-01

    Quantifying wall loss caused by corrosion is a significant challenge for the petrochemical industry. Corrosion commonly occurs at pipe supports, where surface access for inspection is limited. Guided wave tomography is pursued as a solution to this: guided waves are transmitted through the region of interest from an array, and tomographic reconstruction techniques are applied to the measured signals in order to produce a map of thickness. There are many parameters in the system which can affect the performance; this paper investigates how the accuracy varies as defect width and depth, operating frequency, and guided wave mode are changed. For the S0 mode, the best performance was seen around 170 kHz on the 10 mm plate, with poor performance at almost all other frequencies. A0 showed better performance across a broad range of frequencies, with resolution improving with frequency as the wavelength reduced. However, the resolution limit dropped relative to the wavelength, slightly limiting performance at high frequencies.

  6. Adaptive Optics Communications Performance Analysis

    NASA Technical Reports Server (NTRS)

    Srinivasan, M.; Vilnrotter, V.; Troy, M.; Wilson, K.

    2004-01-01

    The performance improvement obtained through the use of adaptive optics for deep-space communications in the presence of atmospheric turbulence is analyzed. Using simulated focal-plane signal-intensity distributions, uncoded pulse-position modulation (PPM) bit-error probabilities are calculated assuming the use of an adaptive focal-plane detector array as well as an adaptively sized single detector. It is demonstrated that current practical adaptive optics systems can yield performance gains over an uncompensated system ranging from approximately 1 dB to 6 dB depending upon the PPM order and background radiation level.
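    The uncoded PPM error probabilities discussed above can be illustrated with a Monte Carlo sketch on an idealized Poisson photon-counting channel (an assumed model, not the paper's simulated focal-plane intensity distributions): the pulsed slot receives signal plus background photons, the other slots background only, and the receiver picks the slot with the largest count.

```python
import math
import random

def poisson_sample(mean, rng):
    """Poisson variate via Knuth's method; adequate for the small means used here."""
    L, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def ppm_symbol_error_rate(M, Ks, Kb, trials, seed=1):
    """Simulate M-ary PPM: slot 0 has mean Ks+Kb photons, others Kb."""
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        counts = [poisson_sample(Kb, rng) for _ in range(M)]
        counts[0] += poisson_sample(Ks, rng)   # slot 0 carries the pulse
        if max(counts[1:]) >= counts[0]:       # ties counted as errors (pessimistic)
            errors += 1
    return errors / trials

# Strong signal, weak background: symbol errors should be rare.
print(ppm_symbol_error_rate(M=16, Ks=10, Kb=0.1, trials=2000))
```

    Sweeping `Kb` upward in such a simulation shows the same qualitative trend as the paper: higher background radiation erodes the detection margin, which is exactly where adaptive-optics compensation of the focal-plane spot pays off.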

  7. Structural-Thermal-Optical-Performance (STOP) Analysis

    NASA Technical Reports Server (NTRS)

    Bolognese, Jeffrey; Irish, Sandra

    2015-01-01

    The presentation will be given at the 26th Annual Thermal Fluids Analysis Workshop (TFAWS 2015) hosted by the Goddard Spaceflight Center (GSFC) Thermal Engineering Branch (Code 545). A STOP analysis is a multidiscipline analysis, consisting of Structural, Thermal and Optical Performance Analyses, that is performed for all space flight instruments and satellites. This course will explain the different parts of performing this analysis. The student will learn how to effectively interact with each discipline in order to accurately obtain the system analysis results.

  8. Performance of statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Davis, R. F.; Hines, D. E.

    1973-01-01

    Statistical energy analysis (SEA) methods have been developed for high frequency modal analyses on random vibration environments. These SEA methods are evaluated by comparing analytical predictions to test results. Simple test methods are developed for establishing SEA parameter values. Techniques are presented, based on the comparison of the predictions with test values, for estimating SEA accuracy as a function of frequency for a general structure.
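    The SEA power balance underlying such predictions (input power is dissipated within each subsystem and exchanged through coupling loss factors) reduces to a small linear system in the subsystem energies. A two-subsystem textbook sketch with assumed loss factors, not the structure analyzed in the paper:

```python
import numpy as np

# Two-subsystem SEA power balance (all parameter values assumed, illustrative):
#   P1 = omega * ((eta1 + eta12) * E1 - eta21 * E2)
#   0  = omega * (-eta12 * E1 + (eta2 + eta21) * E2)

omega = 2 * np.pi * 1000          # band centre frequency (rad/s), assumed
eta1, eta2 = 0.01, 0.02           # damping loss factors, assumed
eta12, eta21 = 0.002, 0.001       # coupling loss factors, assumed
P1 = 1.0                          # input power into subsystem 1 (W)

A = omega * np.array([
    [eta1 + eta12, -eta21],
    [-eta12, eta2 + eta21],
])
E = np.linalg.solve(A, np.array([P1, 0.0]))
print(E)  # steady-state band-averaged energies E1, E2

# Sanity check: total dissipated power must equal the input power.
dissipated = omega * (eta1 * E[0] + eta2 * E[1])
```

    Summing the two balance equations shows analytically that omega*(eta1*E1 + eta2*E2) = P1, which is the energy-conservation check the code verifies.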

  9. Quasi steady MPD performance analysis

    NASA Astrophysics Data System (ADS)

    Guarducci, F.; Paccani, G.; Lehnert, J.

    2011-04-01

    Pulsed (quasi-steady) solid-propellant magnetoplasmadynamic thruster operation has been investigated in both the self-induced and applied magnetic field cases. Input parameters have been varied in order to analyze the dependence of performance (in particular, impulse bit) on these parameters. The stored energy per shot has been set to four values between 2000 and 3000 J, while the magnetic field has been set to six values between 0 and 159 mT. Impulse bit has been evaluated through a thrust stand technique: a brief overview of this method is given together with a description of the data processing procedure. Current measurements allow Maecker's formula to be used as a reference for comparison between theoretical and empirical results, as well as between self-field and applied-field operation. Appreciable improvements in the thruster's impulse bit performance have been noticed for certain sets of stored energy and applied field values. An inductive interaction between the magnet coil and the laboratory facilities, resulting in thrust stand displacement, has been observed: this phenomenon and its consequences for the measurements have been investigated. A target used as a ballistic pendulum, insensitive to magnetic coupling, has been employed to acquire a new set of measurements: the results obtained with the target technique show a maximum discrepancy of 5% when compared with the measurements derived from the thrust stand technique. Finally, the thrust stand measurements appear to be affected by the inductive interactions only for very high values of the applied field.
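    Maecker's formula, the self-field reference mentioned above, gives thrust from the discharge current and electrode radii, T = (mu0*I^2/4pi)*(ln(ra/rc) + 3/4); integrating thrust over the pulse gives the impulse bit. All numerical values below are assumed for illustration, not taken from the paper:

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability (H/m)

def maecker_thrust(I, ra, rc):
    """Self-field MPD thrust (N) for current I (A) and anode/cathode radii ra, rc (m)."""
    return MU0 * I**2 / (4 * math.pi) * (math.log(ra / rc) + 0.75)

# Quasi-steady pulse: impulse bit is thrust integrated over the pulse duration.
I, ra, rc = 20e3, 0.05, 0.01   # 20 kA pulse, assumed electrode geometry
pulse_s = 1e-3                 # 1 ms quasi-steady pulse, assumed
T = maecker_thrust(I, ra, rc)
impulse_bit = T * pulse_s
print(round(T, 2), "N;", round(impulse_bit * 1e3, 2), "mN*s")
```

    The I^2 scaling is why quasi-steady devices report impulse bit against stored energy per shot: doubling the discharge current quadruples the electromagnetic thrust component.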

  10. Performance Analysis of GYRO: A Tool Evaluation

    SciTech Connect

    Worley, P.; Roth, P.; Candy, J.; Shan, Hongzhang; Mahinthakumar,G.; Sreepathi, S.; Carrington, L.; Kaiser, T.; Snavely, A.; Reed, D.; Zhang, Y.; Huck, K.; Malony, A.; Shende, S.; Moore, S.; Wolf, F.

    2005-06-26

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wall clock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

  11. Stocker/feeder cattle standardized performance analysis.

    PubMed

    McGrann, J M; McAdams, J

    1995-07-01

    The Standardized Performance Analysis (SPA) for stocker/feeder cattle is a recommended set of production and financial performance analysis guidelines developed specifically for the grazing, growing, and finishing phases of beef cattle production. Guidelines were developed by members of the National Cattlemen's Association (NCA), Extension Specialists, and the National Integrated Resource Management Coordination Committee to provide beef cattle producers with a comprehensive, standardized means of measuring, analyzing, and reporting the performance and profitability of an operation. This article describes and illustrates through an example the performance measures chosen. The NCA certifies software and education materials conforming to the Stocker/Feeder Guidelines. PMID:7584818

  12. Building America Performance Analysis Procedures: Revision 1

    SciTech Connect

    2004-06-01

    To measure progress toward multi-year research goals, cost and performance trade-offs are evaluated through a series of controlled field and laboratory experiments supported by energy analysis techniques using test data to calibrate simulation models.

  13. A Perspective on DSN System Performance Analysis

    NASA Technical Reports Server (NTRS)

    Pham, Timothy T.

    2006-01-01

    This paper discusses the performance analysis effort being carried out in the NASA Deep Space Network. The activity involves root cause analysis of failures and assessment of key performance metrics. The root cause analysis helps pinpoint the true cause of observed problems so that proper corrections can be made. The assessment currently focuses on three aspects: (1) data delivery metrics such as Quantity, Quality, Continuity, and Latency; (2) link-performance metrics such as antenna pointing, system noise temperature, Doppler noise, frequency and time synchronization, wide-area-network loading, and link-configuration setup time; and (3) reliability, maintainability, and availability metrics. The analysis establishes whether the current system is meeting its specifications and, if so, how much margin is available. The findings help identify weak points in the system and direct programmatic investment toward performance improvement.

  14. Paramedir: A Tool for Programmable Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers in order to be transferred to novice users.
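    As a rough illustration of the idea (not Paramedir's actual interface), a programmable metric can be expressed as a user-supplied predicate evaluated over trace records, so that expert knowledge is captured as reusable metric definitions rather than manual inspection:

    ```python
    # Illustrative trace records; the field names are assumptions for this
    # sketch, not the tool's real trace format.
    events = [
        {"thread": 0, "state": "compute", "duration": 8.0},
        {"thread": 0, "state": "mpi",     "duration": 2.0},
        {"thread": 1, "state": "compute", "duration": 6.0},
        {"thread": 1, "state": "mpi",     "duration": 4.0},
    ]

    def metric(events, predicate):
        """A 'programmable metric': fraction of total time matching a predicate."""
        selected = sum(e["duration"] for e in events if predicate(e))
        total = sum(e["duration"] for e in events)
        return selected / total

    # An expert encodes "communication fraction" once; a novice just applies it.
    comm_fraction = metric(events, lambda e: e["state"] == "mpi")
    print(comm_fraction)  # 0.3
    ```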

  15. A Performance Approach to Job Analysis.

    ERIC Educational Resources Information Center

    Folsom, Al

    2001-01-01

    Discussion of performance technology and training evaluation focuses on a job analysis process in the Coast Guard. Topics include problems with low survey response rates; costs; the need for appropriate software; discussions with stakeholders and subject matter experts; and maximizing worthy performance. (LRW)

  16. Teaching performance management using behavior analysis

    PubMed Central

    Ackley, George B. E.; Bailey, Jon S.

    1995-01-01

    A special undergraduate track in performance management, taught using behavior analysis principles, is described. The key elements of the program are presented, including the point systems and other reinforcement contingencies in the classes, the goals of the instructional activities, and many of the requirements used to evaluate student performance. Finally, the article provides examples of the performance management projects students have conducted with local businesses. PMID:22478206

  17. Performance optimisations for distributed analysis in ALICE

    NASA Astrophysics Data System (ADS)

    Betev, L.; Gheata, A.; Gheata, M.; Grigoras, C.; Hristov, P.

    2014-06-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlation or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system still has to be improved by a significant factor to satisfy the analysis needs. We have instrumented all analysis jobs with "sensors" collecting comprehensive monitoring information on the job running conditions and performance in order to identify bottlenecks in the data processing flow. These data are collected by the MonALISA-based ALICE Grid monitoring system and are used to steer and improve the job submission and management policy, to identify operational problems in real time and to perform automatic corrective actions. In parallel with an upgrade of our production system, we are aiming for low-level improvements related to data format, data management and merging of results to allow for a better-performing ALICE analysis.

  18. Integrating performance data collection, analysis, and visualization

    NASA Technical Reports Server (NTRS)

    Malony, Allen D.; Reed, Daniel A.; Rudolph, David C.

    1990-01-01

    An integrated data collection, analysis, and data visualization environment is described for a specific parallel system - the Intel iPSC/2 hypercube. The data collection components of the environment encompass software event tracing at the operating system and program levels, together with a hardware-based performance monitoring system used to capture software events. A visualization system based on the X-window environment permits dynamic display and reduction of performance data. A performance data collection, analysis, and visualization environment makes it possible to assess the effects of architectural and system software variations.

  19. Interfacing Computer Aided Parallelization and Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Biegel, Bryan A. (Technical Monitor)

    2003-01-01

    When porting sequential applications to parallel computer architectures, the program developer will typically go through several cycles of source code optimization and performance analysis. We have started a project to develop an environment where the user can jointly navigate through program structure and performance data information in order to make efficient optimization decisions. In a prototype implementation we have interfaced the CAPO computer aided parallelization tool with the Paraver performance analysis tool. We describe both tools and their interface and give an example of how the interface helps within the program development cycle of a benchmark code.

  20. Comparative performance analysis of mobile displays

    NASA Astrophysics Data System (ADS)

    Safaee-Rad, Reza; Aleksic, Milivoje

    2012-01-01

    Cell-phone display performance (in terms of color quality and optical efficiency) has become a critical factor in creating a positive user experience. As a result, there is a significant amount of effort by cell-phone OEMs to provide a more competitive display solution. This effort is focused on using different display technologies (with significantly different color characteristics) and more sophisticated display processors. In this paper, the results of a mobile-display comparative performance analysis are presented. Three cell-phones from major OEMs are selected and their display performances are measured and quantified. Comparative performance analysis is done using display characteristics such as display color gamut size, RGB-channels crosstalk, RGB tone responses, gray tracking performance, color accuracy, and optical efficiency.

  1. Integrating Reliability Analysis with a Performance Tool

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

    1995-01-01

    A large number of commercial simulation tools support performance oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production quality simulation based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

  2. Comprehensive analysis of transport aircraft flight performance

    NASA Astrophysics Data System (ADS)

    Filippone, Antonio

    2008-04-01

    This paper reviews the state of the art in comprehensive performance codes for fixed-wing aircraft. The importance of system analysis in flight performance is discussed. The paper highlights the role of aerodynamics, propulsion, flight mechanics, aeroacoustics, flight operation, numerical optimisation, stochastic methods and numerical analysis. The latter discipline is used to investigate the sensitivities of the sub-systems to uncertainties in critical state parameters or functional parameters. The paper critically discusses the data used for performance analysis, and the areas where progress is required. Comprehensive analysis codes can be used for mission fuel planning, envelope exploration, competition analysis, a wide variety of environmental studies, marketing analysis, aircraft certification and conceptual aircraft design. A comprehensive program that uses the multi-disciplinary approach for transport aircraft is presented. The model includes a geometry deck, a separate engine input deck with the main parameters, a database of engine performance from an independent simulation, and an operational deck. The comprehensive code has modules for deriving the geometry from bitmap files, an aerodynamics model for all flight conditions, a flight mechanics model for flight envelopes and mission analysis, an aircraft noise model and engine emissions. The model is validated at different levels. Validation of the aerodynamic model is done against the scale models DLR-F4 and F6. A general model analysis and flight envelope exploration are shown for the Boeing B-777-300 with GE-90 turbofan engines with intermediate passenger capacity (394 passengers in 2 classes). Validation of the flight model is done by sensitivity analysis on the wetted area (or profile drag), on the specific air range, the brake-release gross weight and the aircraft noise. A variety of results is shown, including specific air range charts, take-off weight-altitude charts, and payload-range performance.

  3. Performance analysis of LAN bridges and routers

    NASA Technical Reports Server (NTRS)

    Hajare, Ankur R.

    1991-01-01

    Bridges and routers are used to interconnect Local Area Networks (LANs). The performance of these devices is important since they can become bottlenecks in large multi-segment networks. Performance metrics and test methodology for bridges and routers have not been standardized. Performance data reported by vendors are not applicable to the actual scenarios encountered in an operational network. However, vendor-provided data can be used to calibrate models of bridges and routers that, along with other models, yield performance data for a network. Several tools are available for modeling bridges and routers; Network II.5 was used in this work. The results of the analysis of some bridges and routers are presented.

  4. Using Covariance Analysis to Assess Pointing Performance

    NASA Technical Reports Server (NTRS)

    Bayard, David; Kang, Bryan

    2009-01-01

    A Pointing Covariance Analysis Tool (PCAT) has been developed for evaluating the expected performance of the pointing control system for NASA's Space Interferometry Mission (SIM). The SIM pointing control system is very complex, consisting of multiple feedback and feedforward loops, and operating with multiple latencies and data rates. The SIM pointing problem is particularly challenging due to the effects of thermomechanical drifts in concert with the long camera exposures needed to image dim stars. Other pointing error sources include sensor noises, mechanical vibrations, and errors in the feedforward signals. PCAT models the effects of finite camera exposures and all other error sources using linear system elements. This allows the pointing analysis to be performed using linear covariance analysis. PCAT propagates the error covariance using a Lyapunov equation associated with time-varying discrete and continuous-time system matrices. Unlike Monte Carlo analysis, which could involve thousands of computational runs for a single assessment, the PCAT analysis performs the same assessment in a single run. This capability facilitates the analysis of parametric studies, design trades, and "what-if" scenarios for quickly evaluating and optimizing the control system architecture and design.
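    The core recursion behind this kind of linear covariance analysis can be sketched as follows; the two-state drift model and noise values are invented for illustration and are far simpler than SIM's actual pointing loops.

    ```python
    # Minimal discrete-time covariance propagation P_{k+1} = A P_k A^T + Q,
    # the building block of a linear covariance tool. All matrices here are
    # assumed toy values, not mission parameters.
    def matmul(X, Y):
        return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
                 for j in range(len(Y[0]))] for i in range(len(X))]

    def transpose(X):
        return [list(row) for row in zip(*X)]

    def add(X, Y):
        return [[a + b for a, b in zip(rx, ry)] for rx, ry in zip(X, Y)]

    A = [[1.0, 0.1], [0.0, 1.0]]    # state transition: pointing angle and drift rate
    Q = [[1e-6, 0.0], [0.0, 1e-6]]  # process noise added each step
    P = [[1e-4, 0.0], [0.0, 1e-4]]  # initial error covariance

    for _ in range(10):
        P = add(matmul(matmul(A, P), transpose(A)), Q)

    print(P[0][0] > 1e-4)  # True: angle variance grows under drift and noise
    ```

    A single pass of this recursion replaces the thousands of sample trajectories a Monte Carlo assessment would need, which is the efficiency argument the abstract makes.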

  5. Laser Atmospheric Wind Sounder (LAWS) performance analysis

    NASA Technical Reports Server (NTRS)

    Kenyon, D.; Petheram, J.

    1991-01-01

    The science objectives of the NASA's Laser Atmospheric Sounder (LAWS) are discussed, and results of the performance analysis of the LAWS system are presented together with the instrument configuration used for these performance analyses. The results of analyses show that the science requirements for the wind-velocity accuracies of m/sec in the lower troposphere and 5 m/sec in the upper troposphere will be met by the present design of the LAWS system. The paper presents the performance estimates of the LAWS in terms of the global coverage, spatial resolution, signal-to-noise ratio, line-of-sight velocity error, and horizontal inversion accuracy.

  6. Analysis of driver performance under reduced visibility

    NASA Technical Reports Server (NTRS)

    Kaeppler, W. D.

    1982-01-01

    Mathematical models describing vehicle dynamics as well as human behavior may be useful in evaluating driver performance and in establishing design criteria for vehicles more compatible with man. In 1977, a two level model of driver steering behavior was developed, but its parameters were identified for clear visibility conditions only. Since driver performance degrades under conditions of reduced visibility, e.g., fog, the two level model should be investigated to determine its applicability to such conditions. The data analysis of a recently performed driving simulation experiment showed that the model still performed reasonably well under fog conditions, although there was a degradation in its predictive capacity during fog. Some additional parameters affecting anticipation and lag time may improve the model's performance for reduced visibility conditions.

  7. Automated Cache Performance Analysis And Optimization

    SciTech Connect

    Mohror, Kathryn

    2013-12-23

    While there is no lack of performance counter tools for coarse-grained measurement of cache activity, there is a critical lack of tools for relating data layout to cache behavior to application performance. Generally, any nontrivial optimizations are either not done at all, or are done "by hand", requiring significant time and expertise. To the best of our knowledge, no tool available to users measures the latency of memory reference instructions for particular addresses and makes this information available to users in an easy-to-use and intuitive way. In this project, we worked to enable the Open|SpeedShop performance analysis tool to gather memory reference latency information for specific instructions and memory addresses, and to gather and display this information in an easy-to-use and intuitive way to aid performance analysts in identifying problematic data structures in their codes. This tool was primarily designed for use in the supercomputer domain as well as grid, cluster, cloud-based parallel e-commerce, and engineering systems and middleware. Ultimately, we envision a tool to automate optimization of application cache layout and utilization in the Open|SpeedShop performance analysis tool. To commercialize this software, we worked to develop core capabilities for gathering enhanced memory usage performance data from applications and to create and apply novel methods for automatic data structure layout optimizations, tailoring the overall approach to support existing supercomputer and cluster programming models and constraints. In this Phase I project, we focused on the infrastructure necessary to gather performance data and present it in an intuitive way to users. With the advent of enhanced Precise Event-Based Sampling (PEBS) counters on recent Intel processor architectures and equivalent technology on AMD processors, we are now in a position to access memory reference information for particular addresses. Prior to the introduction of PEBS counters

  8. Using Ratio Analysis to Evaluate Financial Performance.

    ERIC Educational Resources Information Center

    Minter, John; And Others

    1982-01-01

    The ways in which ratio analysis can help in long-range planning, budgeting, and asset management to strengthen financial performance and help avoid financial difficulties are explained. Types of ratios considered include balance sheet ratios, net operating ratios, and contribution and demand ratios. (MSE)
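    As a minimal illustration of the technique (with figures that are hypothetical, not drawn from the article), two common balance sheet ratios can be computed directly:

    ```python
    # Hypothetical balance sheet figures for illustration only.
    balance_sheet = {
        "current_assets": 500_000,
        "current_liabilities": 250_000,
        "total_debt": 400_000,
        "total_assets": 1_200_000,
    }

    # Current ratio: short-term liquidity (ability to cover near-term obligations).
    current_ratio = balance_sheet["current_assets"] / balance_sheet["current_liabilities"]

    # Debt-to-assets: leverage, a warning sign for long-range financial difficulty.
    debt_to_assets = balance_sheet["total_debt"] / balance_sheet["total_assets"]

    print(current_ratio)              # 2.0
    print(round(debt_to_assets, 3))   # 0.333
    ```

    Tracking such ratios over successive budget years is what turns them into the planning signal the abstract describes, rather than a one-off snapshot.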

  9. Probabilistic Analysis of Gas Turbine Field Performance

    NASA Technical Reports Server (NTRS)

    Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.

    2002-01-01

    A gas turbine thermodynamic cycle was computationally simulated and probabilistically evaluated in view of the several uncertainties in the performance parameters, which are indices of gas turbine health. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design, enhance performance, increase system availability and make it cost effective. The analysis leads to the selection of the appropriate measurements to be used in the gas turbine health determination and to the identification of both the most critical measurements and parameters. Probabilistic analysis aims at unifying and improving the control and health monitoring of gas turbine aero-engines by increasing the quality and quantity of information available about the engine's health and performance.
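    A hedged sketch of the probabilistic approach, using an ideal Brayton-cycle efficiency model and an assumed distribution for the pressure ratio (neither taken from the paper): sample the uncertain input, propagate it through the cycle relation, and read off distribution statistics.

    ```python
    import random

    random.seed(1)  # deterministic sampling for reproducibility

    def efficiency(r, gamma=1.4):
        """Ideal Brayton-cycle thermal efficiency for pressure ratio r."""
        return 1.0 - r ** (-(gamma - 1.0) / gamma)

    # Treat the pressure ratio as an uncertain "health" parameter
    # (Gaussian here purely as an illustrative assumption).
    samples = sorted(efficiency(random.gauss(30.0, 2.0)) for _ in range(10_000))

    mean = sum(samples) / len(samples)
    p05 = samples[int(0.05 * len(samples))]  # 5th-percentile efficiency (CDF readout)

    print(0.5 < mean < 0.7)  # True for this assumed pressure-ratio range
    ```

    Sensitivity factors follow the same pattern: perturb one random variable at a time and compare the resulting shift in the output distribution.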

  10. Performance analysis and prediction in triathlon.

    PubMed

    Ofoghi, Bahadorreza; Zeleznikow, John; Macmahon, Clare; Rehula, Jan; Dwyer, Dan B

    2016-01-01

    Performance in triathlon is dependent upon factors that include somatotype, physiological capacity, technical proficiency and race strategy. Given the multidisciplinary nature of triathlon and the interaction between each of the three race components, the identification of target split times that can be used to inform the design of training plans and race pacing strategies is a complex task. The present study uses machine learning techniques to analyse a large database of performances in Olympic distance triathlons (2008-2012). The analysis reveals patterns of performance in five components of triathlon (three race "legs" and two transitions) and the complex relationships between performance in each component and overall performance in a race. The results provide three perspectives on the relationship between performance in each component of triathlon and the final placing in a race. These perspectives allow the identification of target split times that are required to achieve a certain final place in a race and the opportunity to make evidence-based decisions about race tactics in order to optimise performance. PMID:26177783
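    The component-versus-overall relationship described above can be illustrated with a simple correlation between one split and total race time; the split data below are fabricated, and the method is far simpler than the machine learning analysis the study actually uses.

    ```python
    # Fabricated swim splits and total race times (minutes) for five athletes.
    swim  = [18.0, 19.5, 17.2, 20.1, 18.8]
    total = [110.0, 118.0, 106.0, 121.0, 114.0]

    def pearson(x, y):
        """Pearson correlation between two equal-length sequences."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sum((a - mx) ** 2 for a in x) ** 0.5
        sy = sum((b - my) ** 2 for b in y) ** 0.5
        return cov / (sx * sy)

    r = pearson(swim, total)
    print(r > 0.9)  # in this toy data the swim split tracks overall time closely
    ```

    Repeating this per component (and per transition) gives a first-order view of which legs most constrain final placing, before any pacing-strategy modelling.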

  11. NPAC-Nozzle Performance Analysis Code

    NASA Technical Reports Server (NTRS)

    Barnhart, Paul J.

    1997-01-01

    A simple and accurate nozzle performance analysis methodology has been developed. The geometry modeling requirements are minimal and very flexible, thus allowing rapid design evaluations. The solution techniques accurately couple the continuity, momentum, energy, state, and other relations, permitting fast and accurate calculation of nozzle gross thrust. The control volume and internal flow analyses are capable of accounting for the effects of over/under expansion, flow divergence, wall friction, heat transfer, and mass addition/loss across surfaces. The results from the nozzle performance methodology are shown to be in excellent agreement with experimental data for a variety of nozzle designs over a range of operating conditions.
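    The gross-thrust relation at the heart of such a control-volume analysis can be sketched directly; the flow values below are illustrative assumptions, not NPAC inputs.

    ```python
    # Control-volume gross thrust across the nozzle exit plane:
    # momentum thrust plus pressure thrust.
    def gross_thrust(mdot, v_exit, p_exit, p_amb, a_exit):
        """F_g = mdot * V_e + (p_e - p_amb) * A_e  (SI units)."""
        return mdot * v_exit + (p_exit - p_amb) * a_exit

    # Ideally expanded case: p_exit == p_amb, so only the momentum term remains.
    F = gross_thrust(mdot=100.0, v_exit=600.0, p_exit=101_325.0,
                     p_amb=101_325.0, a_exit=1.0)
    print(F)  # 60000.0
    ```

    Over- or under-expansion simply makes the pressure term nonzero, which is why the exit-plane state coupling the abstract mentions matters for accuracy.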

  12. Multiprocessor smalltalk: Implementation, performance, and analysis

    SciTech Connect

    Pallas, J.I.

    1990-01-01

    Multiprocessor Smalltalk demonstrates the value of object-oriented programming on a multiprocessor. Its implementation and analysis shed light on three areas: concurrent programming in an object oriented language without special extensions, implementation techniques for adapting to multiprocessors, and performance factors in the resulting system. Adding parallelism to Smalltalk code is easy, because programs already use control abstractions like iterators. Smalltalk's basic control and concurrency primitives (lambda expressions, processes and semaphores) can be used to build parallel control abstractions, including parallel iterators, parallel objects, atomic objects, and futures. Language extensions for concurrency are not required. This implementation demonstrates that it is possible to build an efficient parallel object-oriented programming system and illustrates techniques for doing so. Three modification tools (serialization, replication, and reorganization) adapted the Berkeley Smalltalk interpreter to the Firefly multiprocessor. Multiprocessor Smalltalk's performance shows that the combination of multiprocessing and object-oriented programming can be effective: speedups (relative to the original serial version) exceed 2.0 for five processors on all the benchmarks; the median efficiency is 48%. Analysis shows both where performance is lost and how to improve and generalize the experimental results. Changes in the interpreter to support concurrency add at most 12% overhead; better access to per-process variables could eliminate much of that. Changes in the user code to express concurrency add as much as 70% overhead; this overhead could be reduced to 54% if blocks (lambda expressions) were reentrant. Performance is also lost when the program cannot keep all five processors busy.

  13. Transformer cooler performance: Analysis and experiment

    SciTech Connect

    Lang, S.K.; Bergles, A.E.

    1994-12-31

    During the summer of 1988, the coolers operating on the number-one transformer at the Niagara Mohawk New Scotland Substation were unable to maintain the temperature of the transformer oil at an acceptable level during a period of peak power and hot weather conditions. As a result of this incident, the Niagara Mohawk Power Corporation requested that the operation performance characteristics of the failed General Electric FOA oil transformer cooler be investigated by the RPI Heat Transfer Laboratory. A theoretical and experimental analysis has been performed on the performance of a transformer cooler. The theoretical study involved the formulation of a numerical model of the cooler, which predicted that the performance is extremely sensitive to and dependent upon the air-side flow rate/heat transfer coefficient, as well as the available heat transfer area. The experimental work consisted of the design and implementation of a cooling loop from which experimental data were obtained to confirm the reliability of the numerical calculations. The experimental results are in good agreement with the numerical predictions; therefore, they confirm the reliability of the analysis.
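    The reported sensitivity to the air-side coefficient follows from the series-resistance form of the overall conductance; the sketch below uses hypothetical values, not the cooler's actual parameters.

    ```python
    # Overall conductance UA of an oil-to-air cooler from two thermal
    # resistances in series: 1/UA = 1/(h_oil*A_oil) + 1/(h_air*A_air).
    # All coefficients and areas here are made-up illustrative values.
    def ua(h_oil, a_oil, h_air, a_air):
        return 1.0 / (1.0 / (h_oil * a_oil) + 1.0 / (h_air * a_air))

    base     = ua(h_oil=500.0, a_oil=20.0, h_air=30.0, a_air=100.0)  # W/K
    degraded = ua(h_oil=500.0, a_oil=20.0, h_air=24.0, a_air=100.0)  # h_air 20% lower

    # Because the air side dominates the total resistance, a 20% drop in
    # h_air cuts UA (and hence heat rejection at fixed temperatures) by ~16%.
    print(round(1.0 - degraded / base, 3))  # 0.161
    ```

    This is why fouling or reduced fan airflow on the air side degrades cooler capacity almost one-for-one, while the oil-side coefficient has comparatively little leverage.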

  14. Transformer cooler performance: Analysis and experiment

    SciTech Connect

    Lang, S.K.; Bergles, A.E.

    1995-10-01

    During the summer of 1988, the coolers operating on the number-one transformer at the Niagara Mohawk New Scotland Substation were unable to maintain the temperature of the transformer oil at an acceptable level during a period of peak power and hot weather conditions. As a result of that incident, the Niagara Mohawk Power Corporation requested that the operation performance characteristics of the failed General Electric FOA oil transformer cooler be investigated by the Heat Transfer Laboratory at Rensselaer Polytechnic Institute. A theoretical and experimental analysis has been performed on the performance of a transformer cooler. The theoretical study involved the formulation of a model of the cooler, which predicted that the performance is extremely sensitive and dependent upon the air-side flow rate/heat transfer coefficient, as well as the available heat transfer area. The experimental work consisted of the design and implementation of a cooling loop, from which experimental data were obtained to confirm the accuracy of the predictions. The experimental results are in good agreement with the numerical predictions; therefore, they confirm the reliability of the analysis.

  15. Nominal Performance Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    M. Wasiolek

    2004-09-08

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standard. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA-LA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the ''Biosphere Model Report'' in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events, and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. The objectives of this analysis are to develop BDCFs for the groundwater exposure scenario for the three climate states considered in the TSPA-LA as well as conversion factors for evaluating compliance with the groundwater protection standard. The BDCFs will be used in performance assessment for calculating all-pathway annual doses for a given concentration of radionuclides in groundwater. The conversion factors will be used for calculating gross alpha particle activity in groundwater and the annual dose

  16. Performance management in healthcare: a critical analysis.

    PubMed

    Hewko, Sarah J; Cummings, Greta G

    2016-01-01

    Purpose - The purpose of this paper is to explore the underlying theoretical assumptions and implications of current micro-level performance management and evaluation (PME) practices, specifically within health-care organizations. PME encompasses all activities that are designed and conducted to align employee outputs with organizational goals. Design/methodology/approach - PME, in the context of healthcare, is analyzed through the lens of critical theory. Specifically, Habermas' theory of communicative action is used to highlight some of the questions that arise in looking critically at PME. To provide a richer definition of key theoretical concepts, the authors conducted a preliminary, exploratory hermeneutic semantic analysis of the key words "performance" and "management" and of the term "performance management". Findings - Analysis reveals that existing micro-level PME systems in health-care organizations have the potential to create a workforce that is compliant, dependent, technically oriented and passive, and to support health-care systems in which inequalities and power imbalances are perpetually reinforced. Practical implications - At a time when the health-care system is under increasing pressure to provide high-quality, affordable services with fewer resources, it may be wise to investigate new sector-specific ways of evaluating and managing performance. Originality/value - In this paper, written for health-care leaders and health human resource specialists, the theoretical assumptions and implications of current PME practices within health-care organizations are explored. It is hoped that readers will be inspired to support innovative PME practices within their organizations that encourage peak performance among health-care professionals. PMID:26764960

  17. Nominal Performance Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    M.A. Wasiolek

    2003-07-25

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standard. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2003 [DIRS 164186]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports (BSC 2003 [DIRS 160964]; BSC 2003 [DIRS 160965]; BSC 2003 [DIRS 160976]; BSC 2003 [DIRS 161239]; BSC 2003 [DIRS 161241]) contain detailed description of the model input parameters. This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. The objectives of this analysis are to develop BDCFs and conversion factors for the TSPA. The BDCFs will be used in performance assessment for calculating annual doses for a given concentration of radionuclides in groundwater. The conversion factors will be used for calculating gross alpha particle activity in groundwater and the annual dose from beta- and photon-emitting radionuclides.

  18. Nominal Performance Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    M.A. Wasiolek

    2005-04-28

This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standards. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop BDCFs, which are input parameters for the TSPA-LA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the ''Biosphere Model Report'' in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events, and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis'' (Figure 1-1). The objectives of this analysis are to develop BDCFs for the

  19. Diversity Performance Analysis on Multiple HAP Networks.

    PubMed

    Dong, Feihong; Li, Min; Gong, Xiangwu; Li, Hongjun; Gao, Fengyue

    2015-01-01

    One of the main design challenges in wireless sensor networks (WSNs) is achieving a high-data-rate transmission for individual sensor devices. The high altitude platform (HAP) is an important communication relay platform for WSNs and next-generation wireless networks. Multiple-input multiple-output (MIMO) techniques provide the diversity and multiplexing gain, which can improve the network performance effectively. In this paper, a virtual MIMO (V-MIMO) model is proposed by networking multiple HAPs with the concept of multiple assets in view (MAV). In a shadowed Rician fading channel, the diversity performance is investigated. The probability density function (PDF) and cumulative distribution function (CDF) of the received signal-to-noise ratio (SNR) are derived. In addition, the average symbol error rate (ASER) with BPSK and QPSK is given for the V-MIMO model. The system capacity is studied for both perfect channel state information (CSI) and unknown CSI individually. The ergodic capacity with various SNR and Rician factors for different network configurations is also analyzed. The simulation results validate the effectiveness of the performance analysis. It is shown that the performance of the HAPs network in WSNs can be significantly improved by utilizing the MAV to achieve overlapping coverage, with the help of the V-MIMO techniques. PMID:26134102
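The closed-form PDF, CDF, and ASER results in the abstract are derived for a shadowed Rician channel and the full V-MIMO configuration; as a minimal single-link sketch (plain, unshadowed Rician fading, coherent detection, hypothetical parameters), the BPSK symbol error rate can be estimated by Monte Carlo:

```python
import numpy as np

rng = np.random.default_rng(0)

def bpsk_ser_rician(snr_db, K, n=200_000):
    """Monte Carlo symbol error rate of coherent BPSK over a flat Rician
    fading channel; K is the LOS-to-scatter power ratio (K = 0 gives
    Rayleigh fading). The channel is normalized so E[|h|^2] = 1."""
    los = np.sqrt(K / (K + 1.0))
    scatter = np.sqrt(0.5 / (K + 1.0)) * (
        rng.standard_normal(n) + 1j * rng.standard_normal(n)
    )
    h = los + scatter
    snr = 10.0 ** (snr_db / 10.0)
    noise = np.sqrt(0.5) * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
    r = np.sqrt(snr) * h + noise          # all-ones BPSK stream, WLOG
    # Matched-filter decision: an error occurs when Re(h* r) goes negative.
    return np.mean(np.real(np.conj(h) * r) < 0)

for K in (0.0, 5.0):
    print(f"K = {K}: SER at 10 dB average SNR = {bpsk_ser_rician(10.0, K):.4f}")
```

As expected, a stronger line-of-sight component (larger K) lowers the error rate at the same average SNR; the paper's diversity analysis quantifies the further gains from combining multiple HAP links.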

  20. Diversity Performance Analysis on Multiple HAP Networks

    PubMed Central

    Dong, Feihong; Li, Min; Gong, Xiangwu; Li, Hongjun; Gao, Fengyue

    2015-01-01

    One of the main design challenges in wireless sensor networks (WSNs) is achieving a high-data-rate transmission for individual sensor devices. The high altitude platform (HAP) is an important communication relay platform for WSNs and next-generation wireless networks. Multiple-input multiple-output (MIMO) techniques provide the diversity and multiplexing gain, which can improve the network performance effectively. In this paper, a virtual MIMO (V-MIMO) model is proposed by networking multiple HAPs with the concept of multiple assets in view (MAV). In a shadowed Rician fading channel, the diversity performance is investigated. The probability density function (PDF) and cumulative distribution function (CDF) of the received signal-to-noise ratio (SNR) are derived. In addition, the average symbol error rate (ASER) with BPSK and QPSK is given for the V-MIMO model. The system capacity is studied for both perfect channel state information (CSI) and unknown CSI individually. The ergodic capacity with various SNR and Rician factors for different network configurations is also analyzed. The simulation results validate the effectiveness of the performance analysis. It is shown that the performance of the HAPs network in WSNs can be significantly improved by utilizing the MAV to achieve overlapping coverage, with the help of the V-MIMO techniques. PMID:26134102

  1. Performance Analysis on Fault Tolerant Control System

    NASA Technical Reports Server (NTRS)

    Shin, Jong-Yeob; Belcastro, Christine

    2005-01-01

In a fault tolerant control (FTC) system, a parameter varying FTC law is reconfigured based on fault parameters estimated by fault detection and isolation (FDI) modules. FDI modules require some time to detect fault occurrences in aero-vehicle dynamics. In this paper, an FTC analysis framework is provided to calculate the upper bound of an induced-L(sub 2) norm of an FTC system in the presence of false identification and detection time delay. The upper bound is written as a function of the fault detection time and exponential decay rates, and has been used to determine which FTC law produces less performance degradation (tracking error) due to false identification. The analysis framework is applied to an FTC system of a HiMAT (Highly Maneuverable Aircraft Technology) vehicle. Index Terms: fault tolerant control system; linear parameter varying system; HiMAT vehicle.
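The paper's bound is derived analytically for the time-varying closed loop; as background intuition only, note that for a stable LTI system the induced-L2 norm equals the peak frequency-response magnitude, which a coarse grid sweep can approximate. The transfer function below is an arbitrary first-order example, not the HiMAT model:

```python
import numpy as np

def induced_l2_gain(num, den, w=np.logspace(-3, 3, 20_000)):
    """Approximate the induced-L2 (H-infinity) norm of a stable SISO LTI
    system G(s) = num(s)/den(s) as the peak of |G(jw)| over a log grid."""
    jw = 1j * w
    gain = np.abs(np.polyval(num, jw) / np.polyval(den, jw))
    return gain.max()

# G(s) = 5 / (s + 2): the peak gain occurs at DC, 5/2 = 2.5.
print("induced-L2 norm ~", induced_l2_gain([5.0], [1.0, 2.0]))
```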

  2. Idaho National Laboratory Quarterly Performance Analysis

    SciTech Connect

    Lisbeth Mitchell

    2014-11-01

This report is published quarterly by the Idaho National Laboratory (INL) Quality and Performance Management Organization. The Department of Energy (DOE) Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of 60 reportable events (23 from the 4th Qtr FY14 and 37 from the prior three reporting quarters) as well as 58 other issue reports (including not reportable events and Significant Category A and B conditions) identified at INL from July 2013 through October 2014. Battelle Energy Alliance (BEA) operates the INL under contract DE-AC07-05ID14517.

  3. An analysis of air-turborocket performance

    NASA Astrophysics Data System (ADS)

    Bussi, Giuseppe; Colasurdo, Guido; Pastrone, Dario

    1993-06-01

In order to assess the capabilities of the air-turborocket, an off-design analysis of a representative LOX-LH2 fed engine is carried out. Working lines on an envisageable compressor map are drawn for different flight conditions along a typical transatmospheric vehicle flight path. Characteristic aspects of the air-turborocket behavior in the spontaneous and controlled mode are highlighted. Specific thrust and propellant consumption at full throttle are computed, both in the dry and augmented mode. The performance achievable by exploiting the permissible mass-flow range of the compressor map via variation of the nozzle throat area is shown.

  4. Analysis of imaging system performance capabilities

    NASA Astrophysics Data System (ADS)

    Haim, Harel; Marom, Emanuel

    2013-06-01

Performance analyses of optical imaging systems based on results obtained with classic one-dimensional (1D) resolution targets (such as the USAF resolution chart) differ significantly from those obtained with a newly proposed 2D target [1]. We prove this claim and show how the novel 2D target should be used to characterize optical imaging systems correctly in terms of resolution and contrast. We then apply the consequences of these observations to the optimal design of some two-dimensional barcode structures.

  5. PERFORMANCE ANALYSIS OF MECHANICAL DRAFT COOLING TOWER

    SciTech Connect

Lee, S.; Garrett, A.; Bollinger, J.; Koffman, L.

    2009-02-10

Industrial processes use mechanical draft cooling towers (MDCTs) to dissipate waste heat by transferring heat from water to air via evaporative cooling, which causes air humidification. The Savannah River Site (SRS) has cross-flow and counter-current MDCTs consisting of four independent compartments called cells. Each cell has its own fan to help maximize heat transfer between ambient air and circulated water. The primary objective of the work is to simulate the performance of the counter-current cooling tower and to conduct a parametric study under different fan speeds and ambient air conditions. The Savannah River National Laboratory (SRNL) developed a computational fluid dynamics (CFD) model and performed a benchmarking analysis against the integral measurement results to accomplish the objective. The model uses three-dimensional steady-state momentum and continuity equations, an air-vapor species balance equation, and two-equation turbulence closure as the basic governing equations. It was assumed that the vapor phase is always transported by the continuous air phase with no slip velocity. The water droplet component was treated as a discrete phase for the interfacial heat and mass transfer via a Lagrangian approach. Thus, the air-vapor mixture model with a discrete water droplet phase is used for the analysis. A series of parametric calculations was performed to investigate the impact of wind speeds and ambient conditions on the thermal performance of the cooling tower when fans were operating and when they were turned off. The model was also benchmarked against the literature data and the SRS integral test results for key parameters such as air temperature and humidity at the tower exit and water temperature for given ambient conditions. Detailed results are presented.
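Independent of the CFD detail, the overall energy balance of evaporative cooling gives a quick sanity check: nearly all of the heat removed from the circulating water leaves as the latent heat of the evaporated fraction, so that fraction is roughly cp*dT/h_fg per pass. The property values and cooling range below are generic assumptions, not SRS data:

```python
# Zeroth-order evaporative cooling balance: m_evap/m_water ~ cp*dT/h_fg.
# (Ignores sensible heating of the air and drift losses.)
cp = 4.18e3      # J/(kg K), liquid water specific heat
h_fg = 2.45e6    # J/kg, latent heat of vaporization near 25 C
dT = 10.0        # K, assumed cooling range (hot inlet minus cold outlet)

evap_fraction = cp * dT / h_fg
print(f"evaporated fraction per pass: {evap_fraction:.2%}")
```

The familiar rule of thumb, just under 2% of the circulating flow evaporated per 10 K of range, drops out directly.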

  6. SUBSONIC WIND TUNNEL PERFORMANCE ANALYSIS SOFTWARE

    NASA Technical Reports Server (NTRS)

    Eckert, W. T.

    1994-01-01

    This program was developed as an aid in the design and analysis of subsonic wind tunnels. It brings together and refines previously scattered and over-simplified techniques used for the design and loss prediction of the components of subsonic wind tunnels. It implements a system of equations for determining the total pressure losses and provides general guidelines for the design of diffusers, contractions, corners and the inlets and exits of non-return tunnels. The algorithms used in the program are applicable to compressible flow through most closed- or open-throated, single-, double- or non-return wind tunnels or ducts. A comparison between calculated performance and that actually achieved by several existing facilities produced generally good agreement. Any system through which air is flowing which involves turns, fans, contractions etc. (e.g., an HVAC system) may benefit from analysis using this software. This program is an update of ARC-11138 which includes PC compatibility and an improved user interface. The method of loss analysis used by the program is a synthesis of theoretical and empirical techniques. Generally, the algorithms used are those which have been substantiated by experimental test. The basic flow-state parameters used by the program are determined from input information about the reference control section and the test section. These parameters were derived from standard relationships for compressible flow. The local flow conditions, including Mach number, Reynolds number and friction coefficient are determined for each end of each component or section. The loss in total pressure caused by each section is calculated in a form non-dimensionalized by local dynamic pressure. The individual losses are based on the nature of the section, local flow conditions and input geometry and parameter information. 
The loss forms for typical wind tunnel sections considered by the program include: constant area ducts, open throat ducts, contractions, constant
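The loss bookkeeping the program performs can be sketched as follows. Each section contributes a coefficient K_local = dP0/q_local; referring it to the test section multiplies by q_local/q_ts, which for incompressible flow is (A_ts/A_local)^2. The section names, coefficients, and area ratios below are hypothetical, not the program's built-in models:

```python
# Hypothetical circuit: (name, K referenced to local q, area ratio A_local/A_ts).
sections = [
    ("test section", 0.010, 1.0),
    ("diffuser",     0.030, 2.5),
    ("corner 1",     0.150, 6.0),
    ("corner 2",     0.150, 6.0),
    ("contraction",  0.008, 1.0),   # conventionally referenced to exit q
]

# Refer each local loss to test-section dynamic pressure and sum.
k_total = sum(k / a**2 for _name, k, a in sections)
energy_ratio = 1.0 / k_total        # common figure of merit for a circuit
print(f"sum of K_ts = {k_total:.4f}, energy ratio = {energy_ratio:.1f}")
```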

  7. Failure analysis of high performance ballistic fibers

    NASA Astrophysics Data System (ADS)

    Spatola, Jennifer S.

High performance fibers have a high tensile strength and modulus, good wear resistance, and a low density, making them ideal for applications in ballistic impact resistance, such as body armor. However, the observed ballistic performance of these fibers is much lower than the predicted values. Since the predictions assume only tensile stress failure, it is safe to assume that the stress state is affecting fiber performance. The purpose of this research was to determine if there are failure mode changes in the fiber fracture when transversely loaded by indenters of different shapes. An experimental design mimicking transverse impact was used to determine any such effects. Three different indenters were used: round, FSP, and razor blade. The indenter height was changed to change the angle of failure tested. Five high performance fibers were examined: Kevlar® KM2, Spectra® 130d, Dyneema® SK-62 and SK-76, and Zylon® 555. Failed fibers were analyzed using an SEM to determine failure mechanisms. The results show that the round and razor blade indenters produced a constant failure strain, as well as failure mechanisms independent of testing angle. The FSP indenter produced a decrease in failure strain as the angle increased. Fibrillation was the dominant failure mechanism at all angles for the round indenter, while through thickness shearing was the failure mechanism for the razor blade. The FSP indenter showed a transition from fibrillation at low angles to through thickness shearing at high angles, indicating that the round and razor blade indenters are extreme cases of the FSP indenter. The failure mechanisms observed with the FSP indenter at various angles correlated with the experimental strain data obtained during fiber testing. This indicates that geometry of the indenter tip in compression is a contributing factor in lowering the failure strain of the high performance fibers. 
TEM analysis of the fiber failure mechanisms was also attempted, though without

  8. Nominal Performance Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    Wasiolek, Maryla A.

    2000-12-21

The purpose of this report was to document the process leading to development of the Biosphere Dose Conversion Factors (BDCFs) for the postclosure nominal performance of the potential repository at Yucca Mountain. BDCF calculations concerned twenty-four radionuclides. This selection included sixteen radionuclides that may be significant nominal performance dose contributors during the compliance period of up to 10,000 years, five additional radionuclides of importance for up to 1 million years postclosure, and three relatively short-lived radionuclides important for the human intrusion scenario. Radionuclide buildup in soil caused by previous irrigation with contaminated groundwater was taken into account in the BDCF development. The effect of climate evolution, from the current arid conditions to a wetter and cooler climate, on the BDCF values was evaluated. The analysis included consideration of different exposure pathways' contributions to the BDCFs. Calculations of nominal performance BDCFs used the GENII-S computer code in a series of probabilistic realizations to propagate the uncertainties of input parameters into the output. BDCFs for the nominal performance, when combined with the concentrations of radionuclides in groundwater, allow calculation of potential radiation doses to the receptor of interest. Calculated estimates of radionuclide concentration in groundwater result from the saturated zone modeling. The integration of the biosphere modeling results (BDCFs) with the outcomes of the other component models is accomplished in the Total System Performance Assessment (TSPA) to calculate doses to the receptor of interest from radionuclides postulated to be released to the environment from the potential repository at Yucca Mountain.
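The probabilistic propagation described (sample uncertain inputs, run the model, collect the output distribution) can be illustrated in a few lines. The distributions and values below are invented for illustration and have no connection to the actual GENII-S parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

n = 10_000
# Hypothetical uncertain inputs for a single radionuclide:
bdcf = rng.lognormal(mean=np.log(1e-6), sigma=0.5, size=n)  # Sv/yr per Bq/m^3
conc = rng.normal(loc=500.0, scale=50.0, size=n)            # Bq/m^3 groundwater

dose = conc * bdcf   # annual dose per realization, Sv/yr

print(f"mean annual dose: {dose.mean():.2e} Sv/yr")
print(f"95th percentile:  {np.percentile(dose, 95):.2e} Sv/yr")
```

In the real assessment the groundwater concentrations come from the saturated-zone model and many radionuclides are summed, but the propagation pattern is the same.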

  9. Approaches to Cycle Analysis and Performance Metrics

    NASA Technical Reports Server (NTRS)

    Parson, Daniel E.

    2003-01-01

    The following notes were prepared as part of an American Institute of Aeronautics and Astronautics (AIAA) sponsored short course entitled Air Breathing Pulse Detonation Engine (PDE) Technology. The course was presented in January of 2003, and again in July of 2004 at two different AIAA meetings. It was taught by seven instructors, each of whom provided information on particular areas of PDE research. These notes cover two areas. The first is titled Approaches to Cycle Analysis and Performance Metrics. Here, the various methods of cycle analysis are introduced. These range from algebraic, thermodynamic equations, to single and multi-dimensional Computational Fluid Dynamic (CFD) solutions. Also discussed are the various means by which performance is measured, and how these are applied in a device which is fundamentally unsteady. The second topic covered is titled PDE Hybrid Applications. Here the concept of coupling a PDE to a conventional turbomachinery based engine is explored. Motivation for such a configuration is provided in the form of potential thermodynamic benefits. This is accompanied by a discussion of challenges to the technology.
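At the algebraic end of the cycle-analysis spectrum mentioned above sits the ideal Brayton cycle, the usual baseline against which pulse detonation cycles are compared. Its thermal efficiency depends only on the pressure ratio:

```python
# Ideal Brayton cycle thermal efficiency: eta = 1 - rp^(-(gamma-1)/gamma).
gamma = 1.4                       # ratio of specific heats for air
for rp in (5, 10, 20, 30):        # compressor pressure ratios
    eta = 1.0 - rp ** (-(gamma - 1.0) / gamma)
    print(f"pressure ratio {rp:2d}: ideal thermal efficiency {eta:.3f}")
```

The detonation-based cycles discussed in the course require the unsteady, multi-dimensional CFD treatments noted in the abstract; nothing this simple captures them.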

  10. Axial and centrifugal pump meanline performance analysis

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.

    1994-01-01

    A meanline pump flow modeling method has been developed to provide a fast capability for modeling pumps of cryogenic rocket engines. Based on this method, a meanline pump flow code (PUMPA) has been written that can predict the performance of pumps at off-design operating conditions, given the loss of the diffusion system at the design point. The design point rotor efficiency is obtained from empirically derived correlations of loss to rotor specific speed. The rapid input setup and computer run time for the meanline pump flow code makes it an effective analysis and conceptual design tool. The map generation capabilities of the PUMPA code provide the information needed for interfacing with a rocket engine system modeling code.
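The rotor-efficiency correlations mentioned are keyed to specific speed, which is cheap to compute. A sketch in US customary units, with a hypothetical stage:

```python
import math

def pump_specific_speed(rpm, gpm, head_ft):
    """US-customary pump specific speed Ns = N*sqrt(Q)/H^0.75, the
    parameter meanline methods typically correlate efficiency against."""
    return rpm * math.sqrt(gpm) / head_ft ** 0.75

# Hypothetical high-head stage: 30,000 rpm, 1,000 gpm, 5,000 ft of head.
ns = pump_specific_speed(30_000, 1_000, 5_000)
print(f"specific speed Ns = {ns:.0f}")   # low Ns -> radial (centrifugal) stage
```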

  11. Performance Analysis of ICA in Sensor Array

    PubMed Central

    Cai, Xin; Wang, Xiang; Huang, Zhitao; Wang, Fenghua

    2016-01-01

As the best-known scheme in the field of Blind Source Separation (BSS), Independent Component Analysis (ICA) has been intensively used in various domains, including biomedical and acoustics applications, cooperative or non-cooperative communication, etc. While sensor arrays are involved in most of these applications, the influence of practical factors on the performance of ICA has not yet been sufficiently investigated. In this manuscript, the issue is investigated by taking the typical antenna array as an illustrative example. Factors taken into consideration include the environment noise level and the properties of the array and of the radiators. We analyze the analytic relationship between the noise variance, the source variance, the condition number of the mixing matrix, and the optimal signal-to-interference-plus-noise ratio, as well as the relationship between the singularity of the mixing matrix and the practical factors concerned. Special attention has been paid to situations where the mixing process turns (nearly) singular, since such circumstances are critical in applications. The results and conclusions obtained should be instructive when applying ICA algorithms to mixtures from sensor arrays. Moreover, on the basis of this analysis, an effective countermeasure against singular mixtures has been proposed. Experiments validating the theoretical conclusions as well as the effectiveness of the proposed scheme are included. PMID:27164100
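The role the condition number of the mixing matrix plays can be seen without running ICA at all: even unmixing with the pseudoinverse of the true mixing matrix amplifies sensor noise in proportion to how ill-conditioned that matrix is, which bounds what any ICA algorithm can achieve. The matrices and noise level below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

def output_sinr_db(A, noise_std=0.05, n=50_000):
    """Average per-source output SINR (dB) after unmixing x = A s + n with
    pinv(A), i.e. with the *true* mixing matrix known (best case)."""
    s = rng.choice([-1.0, 1.0], size=(2, n))            # unit-power sources
    x = A @ s + noise_std * rng.standard_normal((2, n))
    y = np.linalg.pinv(A) @ x
    err = y - s
    sinr = np.mean(s**2, axis=1) / np.mean(err**2, axis=1)
    return float((10 * np.log10(sinr)).mean())

well = np.array([[1.0, 0.2], [0.3, 1.0]])    # well-conditioned mixing
ill  = np.array([[1.0, 0.99], [0.99, 1.0]])  # nearly singular mixing

for name, A in (("well", well), ("ill", ill)):
    print(f"{name}: cond(A) = {np.linalg.cond(A):7.1f}, "
          f"output SINR = {output_sinr_db(A):6.1f} dB")
```

The nearly singular mixture loses tens of dB of output SINR at the same noise level, which is the regime the proposed countermeasure targets.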

  12. High Performance Data Analysis via Coordinated Caches

    NASA Astrophysics Data System (ADS)

    Fischer, M.; Metzlaff, C.; Kühn, E.; Giffels, M.; Quast, G.; Jung, C.; Hauth, T.

    2015-12-01

    With the second run period of the LHC, high energy physics collaborations will have to face increasing computing infrastructural needs. Opportunistic resources are expected to absorb many computationally expensive tasks, such as Monte Carlo event simulation. This leaves dedicated HEP infrastructure with an increased load of analysis tasks that in turn will need to process an increased volume of data. In addition to storage capacities, a key factor for future computing infrastructure is therefore input bandwidth available per core. Modern data analysis infrastructure relies on one of two paradigms: data is kept on dedicated storage and accessed via network or distributed over all compute nodes and accessed locally. Dedicated storage allows data volume to grow independently of processing capacities, whereas local access allows processing capacities to scale linearly. However, with the growing data volume and processing requirements, HEP will require both of these features. For enabling adequate user analyses in the future, the KIT CMS group is merging both paradigms: popular data is spread over a local disk layer on compute nodes, while any data is available from an arbitrarily sized background storage. This concept is implemented as a pool of distributed caches, which are loosely coordinated by a central service. A Tier 3 prototype cluster is currently being set up for performant user analyses of both local and remote data.
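The coordination idea (a central service tracks popularity; nodes pin hot data on a limited local disk and fall through to background storage otherwise) can be sketched as a toy policy. The class name, threshold, and eviction rule are invented, not the KIT implementation:

```python
from collections import Counter, OrderedDict

class CoordinatedCache:
    """Toy node-local cache loosely coordinated by a shared popularity
    counter (standing in for the central service)."""

    def __init__(self, capacity, hot_threshold=3):
        self.capacity = capacity
        self.hot_threshold = hot_threshold
        self.popularity = Counter()      # central view of dataset popularity
        self.local = OrderedDict()       # local disk layer, LRU order

    def read(self, dataset):
        self.popularity[dataset] += 1
        if dataset in self.local:
            self.local.move_to_end(dataset)      # refresh LRU position
            return "local"
        if self.popularity[dataset] >= self.hot_threshold:
            if len(self.local) >= self.capacity:
                self.local.popitem(last=False)   # evict least recently used
            self.local[dataset] = True           # pull from background storage
            return "remote, now cached"
        return "remote"                          # unpopular: stream remotely

cache = CoordinatedCache(capacity=2)
for _ in range(4):
    print(cache.read("run2_data"))   # goes remote -> cached -> local
```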

  13. Radio-science performance analysis software

    NASA Astrophysics Data System (ADS)

    Morabito, D. D.; Asmar, S. W.

    1995-02-01

    The Radio Science Systems Group (RSSG) provides various support functions for several flight project radio-science teams. Among these support functions are uplink and sequence planning, real-time operations monitoring and support, data validation, archiving and distribution functions, and data processing and analysis. This article describes the support functions that encompass radio-science data performance analysis. The primary tool used by the RSSG to fulfill this support function is the STBLTY program set. STBLTY is used to reconstruct observable frequencies and calculate model frequencies, frequency residuals, frequency stability in terms of Allan deviation, reconstructed phase, frequency and phase power spectral density, and frequency drift rates. In the case of one-way data, using an ultrastable oscillator (USO) as a frequency reference, the program set computes the spacecraft transmitted frequency and maintains a database containing the in-flight history of the USO measurements. The program set also produces graphical displays. Some examples and discussions on operating the program set on Galileo and Ulysses data will be presented.
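Of the quantities listed, frequency stability in terms of Allan deviation is the one most worth a concrete sketch. Below is a minimal non-overlapping estimator for fractional-frequency data at unit sample spacing, exercised on synthetic white frequency noise rather than USO data:

```python
import numpy as np

def allan_deviation(y, m_values):
    """Non-overlapping Allan deviation of fractional-frequency samples y
    (unit spacing) for each averaging factor m in m_values."""
    out = []
    for m in m_values:
        k = len(y) // m
        ybar = y[: k * m].reshape(k, m).mean(axis=1)   # tau-averaged frequency
        out.append(np.sqrt(0.5 * np.mean(np.diff(ybar) ** 2)))
    return np.array(out)

# White frequency noise: ADEV should fall off as tau^(-1/2).
rng = np.random.default_rng(3)
y = 1e-12 * rng.standard_normal(100_000)
adev = allan_deviation(y, [1, 10, 100])
print(adev)
```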

  14. Past Performance analysis of HPOTP bearings

    NASA Technical Reports Server (NTRS)

    Bhat, B. N.; Dolan, F. J.

    1982-01-01

The past performance analysis conducted on three High Pressure Oxygen Turbopump (HPOTP) bearings from the Space Shuttle Main Engine is presented. Metallurgical analysis of failed bearing balls and races, and wear track and crack configuration analyses, were carried out. In addition, one bearing was tested in the laboratory at very high axial loads. The results showed that the cracks were surface initiated and propagated into subsurface locations at relatively small angles. Subsurface cracks were much more extensive than appeared on the surface. The location of major cracks in the races corresponded to high radial loads rather than high axial loads. There was evidence to suggest that the inner races were heated to elevated temperatures. A failure scenario was developed based on the above findings. According to this scenario the HPOTP bearings are heated by a combination of high loads and a high coefficient of friction (poor lubrication). Different methods of extending the HPOTP bearing life are also discussed. These include reduction of axial loads, improvements in bearing design, lubrication and cooling, and use of improved bearing materials.
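One quantitative backdrop for the load-reduction recommendation is the standard rolling-element fatigue relation (generic to ball bearings, not specific to the HPOTP hardware): rated L10 life scales as (C/P)^3, so load reductions pay off cubically. The loads below are hypothetical:

```python
def life_ratio(p_old, p_new):
    """Factor by which L10 fatigue life improves when the equivalent
    bearing load drops from p_old to p_new (ball-bearing exponent 3)."""
    return (p_old / p_new) ** 3

# A 20% reduction in equivalent load roughly doubles rated life.
print(f"20% load reduction -> life x {life_ratio(1.0, 0.8):.2f}")
```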

  15. Performance Analysis of ICA in Sensor Array.

    PubMed

    Cai, Xin; Wang, Xiang; Huang, Zhitao; Wang, Fenghua

    2016-01-01

As the best-known scheme in the field of Blind Source Separation (BSS), Independent Component Analysis (ICA) has been intensively used in various domains, including biomedical and acoustics applications, cooperative or non-cooperative communication, etc. While sensor arrays are involved in most of these applications, the influence of practical factors on the performance of ICA has not yet been sufficiently investigated. In this manuscript, the issue is investigated by taking the typical antenna array as an illustrative example. Factors taken into consideration include the environment noise level and the properties of the array and of the radiators. We analyze the analytic relationship between the noise variance, the source variance, the condition number of the mixing matrix, and the optimal signal-to-interference-plus-noise ratio, as well as the relationship between the singularity of the mixing matrix and the practical factors concerned. Special attention has been paid to situations where the mixing process turns (nearly) singular, since such circumstances are critical in applications. The results and conclusions obtained should be instructive when applying ICA algorithms to mixtures from sensor arrays. Moreover, on the basis of this analysis, an effective countermeasure against singular mixtures has been proposed. Experiments validating the theoretical conclusions as well as the effectiveness of the proposed scheme are included. PMID:27164100

  16. Radio-science performance analysis software

    NASA Technical Reports Server (NTRS)

    Morabito, D. D.; Asmar, S. W.

    1995-01-01

    The Radio Science Systems Group (RSSG) provides various support functions for several flight project radio-science teams. Among these support functions are uplink and sequence planning, real-time operations monitoring and support, data validation, archiving and distribution functions, and data processing and analysis. This article describes the support functions that encompass radio-science data performance analysis. The primary tool used by the RSSG to fulfill this support function is the STBLTY program set. STBLTY is used to reconstruct observable frequencies and calculate model frequencies, frequency residuals, frequency stability in terms of Allan deviation, reconstructed phase, frequency and phase power spectral density, and frequency drift rates. In the case of one-way data, using an ultrastable oscillator (USO) as a frequency reference, the program set computes the spacecraft transmitted frequency and maintains a database containing the in-flight history of the USO measurements. The program set also produces graphical displays. Some examples and discussions on operating the program set on Galileo and Ulysses data will be presented.

  17. Space Shuttle Main Engine performance analysis

    NASA Technical Reports Server (NTRS)

    Santi, L. Michael

    1993-01-01

    For a number of years, NASA has relied primarily upon periodically updated versions of Rocketdyne's power balance model (PBM) to provide space shuttle main engine (SSME) steady-state performance prediction. A recent computational study indicated that PBM predictions do not satisfy fundamental energy conservation principles. More recently, SSME test results provided by the Technology Test Bed (TTB) program have indicated significant discrepancies between PBM flow and temperature predictions and TTB observations. Results of these investigations have diminished confidence in the predictions provided by PBM, and motivated the development of new computational tools for supporting SSME performance analysis. A multivariate least squares regression algorithm was developed and implemented during this effort in order to efficiently characterize TTB data. This procedure, called the 'gains model,' was used to approximate the variation of SSME performance parameters such as flow rate, pressure, temperature, speed, and assorted hardware characteristics in terms of six assumed independent influences. These six influences were engine power level, mixture ratio, fuel inlet pressure and temperature, and oxidizer inlet pressure and temperature. A BFGS optimization algorithm provided the base procedure for determining regression coefficients for both linear and full quadratic approximations of parameter variation. Statistical information relative to data deviation from regression derived relations was also computed. A new strategy for integrating test data with theoretical performance prediction was also investigated. The current integration procedure employed by PBM treats test data as pristine and adjusts hardware characteristics in a heuristic manner to achieve engine balance. Within PBM, this integration procedure is called 'data reduction.' By contrast, the new data integration procedure, termed 'reconciliation,' uses mathematical optimization techniques, and requires both
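The 'gains model' fit described (a full quadratic in a handful of assumed-independent influences, determined by least squares) is easy to sketch. Two influences stand in for the six, and the data are synthetic, not TTB measurements:

```python
import numpy as np

rng = np.random.default_rng(4)

n = 500
power = rng.uniform(0.65, 1.09, n)     # engine power level fraction (assumed range)
mr = rng.uniform(5.8, 6.2, n)          # mixture ratio (assumed range)

# Synthetic "test data": a quadratic response plus measurement scatter.
truth = 100 + 40 * power + 3 * mr + 15 * power**2 + 2 * power * mr
y = truth + rng.normal(0.0, 0.5, n)

# Full quadratic design matrix: 1, x1, x2, x1^2, x1*x2, x2^2.
X = np.column_stack([np.ones(n), power, mr, power**2, power * mr, mr**2])
coef, _res, _rank, _sv = np.linalg.lstsq(X, y, rcond=None)

rms = np.sqrt(np.mean((y - X @ coef) ** 2))
print(f"RMS regression residual: {rms:.3f}")   # close to the injected scatter
```

(The actual effort used a BFGS-based optimization procedure and statistical deviation measures; ordinary least squares is the simplest stand-in for the same fitting step.)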

  18. Space Shuttle Main Engine performance analysis

    NASA Astrophysics Data System (ADS)

    Santi, L. Michael

    1993-11-01

    For a number of years, NASA has relied primarily upon periodically updated versions of Rocketdyne's power balance model (PBM) to provide space shuttle main engine (SSME) steady-state performance prediction. A recent computational study indicated that PBM predictions do not satisfy fundamental energy conservation principles. More recently, SSME test results provided by the Technology Test Bed (TTB) program have indicated significant discrepancies between PBM flow and temperature predictions and TTB observations. Results of these investigations have diminished confidence in the predictions provided by PBM, and motivated the development of new computational tools for supporting SSME performance analysis. A multivariate least squares regression algorithm was developed and implemented during this effort in order to efficiently characterize TTB data. This procedure, called the 'gains model,' was used to approximate the variation of SSME performance parameters such as flow rate, pressure, temperature, speed, and assorted hardware characteristics in terms of six assumed independent influences. These six influences were engine power level, mixture ratio, fuel inlet pressure and temperature, and oxidizer inlet pressure and temperature. A BFGS optimization algorithm provided the base procedure for determining regression coefficients for both linear and full quadratic approximations of parameter variation. Statistical information relative to data deviation from regression derived relations was also computed. A new strategy for integrating test data with theoretical performance prediction was also investigated. The current integration procedure employed by PBM treats test data as pristine and adjusts hardware characteristics in a heuristic manner to achieve engine balance. Within PBM, this integration procedure is called 'data reduction.' By contrast, the new data integration procedure, termed 'reconciliation,' uses mathematical optimization techniques, and requires both

  19. Data Link Performance Analysis for LVLASO Experiments

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi

    1998-01-01

    The Low-Visibility Landing and Surface Operations System (LVLASO) is currently being prototyped and tested at NASA Langley Research Center. Since the main objective of the system is to maintain aircraft landings and take-offs even during low-visibility conditions, timely exchange of positional and other information between the aircraft and ground control is critical. For safety and reliability reasons, there are several redundant sources on the ground (e.g., ASDE, AMASS) that collect and disseminate information about the environment to the aircraft. The data link subsystem of LVLASO is responsible for supporting the timely transfer of information between the aircraft and the ground controllers. In fact, if not properly designed, the data link subsystem could become a bottleneck in the proper functioning of LVLASO. Currently, the other components of the system are being designed on the assumption that the data link has adequate capacity and can deliver information in a timely manner. During August 1-28, 1997, several flight experiments were conducted to test the prototypes of subsystems developed under the LVLASO project. The background and details of the tests are described in the next section. The test results have been collected on two CDs by the FAA and Rockwell-Collins. Under the current grant, we have analyzed the data and evaluated the performance of the Mode S data link. In this report, we summarize the results of our analysis. Most of the results are shown as graphs or histograms, with the test date (or experiment number) on the X-axis and the metric of interest on the Y-axis. In interpreting these charts, one needs to take into account the vehicular traffic during a particular experiment. In general, the performance of the data link was found to be quite satisfactory in terms of delivering long and short Mode S squitters from the vehicles to the ground receiver. Similarly, its performance in delivering control

  20. Performance analysis of memory hierarchies in high performance systems

    SciTech Connect

    Yogesh, A.

    1993-07-01

    This thesis studies memory bandwidth as a performance predictor of programs. The focus of this work is on computationally intensive programs. These programs are the most likely to access large amounts of data, stressing the memory system. Computationally intensive programs are also likely to use highly optimizing compilers to produce the fastest executables possible. Methods to reduce the amount of data traffic by increasing the average number of references to each item while it resides in the cache are explored. Increasing the average number of references to each cache item reduces the number of memory requests. Chapter 2 describes the DLX architecture. This is the architecture on which all the experiments were performed. Chapter 3 studies memory moves as a performance predictor for a group of application programs. Chapter 4 introduces a model to study the performance of programs in the presence of memory hierarchies. Chapter 5 explores some compiler optimizations that can help increase the references to each item while it resides in the cache.
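
The data-traffic idea above (raising the number of references to each item while it resides in the cache) can be illustrated with a toy direct-mapped cache simulator; the sizes and access patterns below are invented for illustration, not taken from the thesis.

```python
def misses_direct_mapped(trace, num_lines, line_size):
    """Count misses in a direct-mapped cache for a sequence of word
    addresses; each miss is one memory move (a line fill)."""
    tags = [None] * num_lines
    misses = 0
    for addr in trace:
        line_addr = addr // line_size
        idx = line_addr % num_lines
        if tags[idx] != line_addr:   # tag mismatch: fetch the line
            tags[idx] = line_addr
            misses += 1
    return misses

N = 1024
# Four full sweeps over an array larger than the cache: no reuse.
single_pass = list(range(N)) * 4
# Blocked order: revisit one cache-sized block four times, then move on.
block = 256
blocked = [b + i for b in range(0, N, block) for _ in range(4) for i in range(block)]

m1 = misses_direct_mapped(single_pass, num_lines=64, line_size=4)
m2 = misses_direct_mapped(blocked, num_lines=64, line_size=4)
```

Both orderings touch each word the same number of times, but the blocked order makes four references to each line per fill, cutting memory moves by a factor of four.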

  1. Performance analysis of robust road sign identification

    NASA Astrophysics Data System (ADS)

    Ali, Nursabillilah M.; Mustafah, Y. M.; Rashid, N. K. A. M.

    2013-12-01

    This study describes a performance analysis of a robust road sign identification system that incorporates two stages of different algorithms: HSV color filtering in the detection stage and PCA in the recognition stage. The proposed algorithms are able to detect the three standard types of colored signs, namely red, yellow and blue. The hypothesis of the study is that road sign images can be used to detect and identify signs even in the presence of occlusions and rotational changes. PCA is a feature extraction technique that reduces dimensionality; sign images can be readily recognized and identified by the PCA method, as it has been used in many application areas. The experimental results show that HSV color filtering is robust in road sign detection, with minimum success rates of 88% and 77% for non-occluded and partially occluded images, respectively. Successful recognition rates using PCA were in the range of 94-98%, with all classes recognized successfully at occlusion levels between 5% and 10%.
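
As a minimal sketch of the recognition stage, the snippet below fits PCA (via SVD) on synthetic "sign" vectors and classifies a probe by its nearest neighbor in the reduced space. The three-cluster data, dimensions, and nearest-neighbor rule are assumptions for illustration, not the paper's dataset or exact classifier.

```python
import numpy as np

def pca_fit(X, k):
    """Fit PCA on row-vector samples: return the mean and the
    top-k principal directions (rows of Vt from the SVD)."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k]

def pca_project(X, mu, comps):
    return (X - mu) @ comps.T

rng = np.random.default_rng(1)
# Three synthetic "sign classes" as noisy clusters in 100-D
# (stand-ins for flattened sign images).
centers = rng.normal(size=(3, 100)) * 5
train = np.vstack([c + rng.normal(size=(20, 100)) for c in centers])
labels = np.repeat([0, 1, 2], 20)

mu, comps = pca_fit(train, k=5)
Z = pca_project(train, mu, comps)

# Classify a fresh noisy sample by its nearest training point in PCA space.
probe = centers[1] + rng.normal(size=100)
z = pca_project(probe[None, :], mu, comps)
pred = labels[np.argmin(np.linalg.norm(Z - z, axis=1))]
```

Reducing 100 dimensions to 5 keeps the between-class separation while shrinking the feature vector, which is the dimensionality-reduction benefit the abstract attributes to PCA.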

  2. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    EPA Science Inventory

    This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit m...

  3. Performance analysis of electrical circuits /PANE/

    NASA Technical Reports Server (NTRS)

    Johnson, K. L.; Steinberg, L. L.

    1968-01-01

    An automated statistical and worst-case computer program has been designed to perform dc and ac steady-state circuit analyses. The program determines the worst-case circuit performance by solving the circuit equations.

  4. NEXT Performance Curve Analysis and Validation

    NASA Technical Reports Server (NTRS)

    Saripalli, Pratik; Cardiff, Eric; Englander, Jacob

    2016-01-01

    Performance curves of the NEXT thruster are highly important in determining the thruster's ability to meet mission-specific goals. New performance curves are proposed and examined here. The Evolutionary Mission Trajectory Generator (EMTG) is used to verify variations in mission solutions based on both the available thruster curves and the newly generated curves. Furthermore, variations between beginning-of-life (BOL) and end-of-life (EOL) curves are also examined. The mission design results shown here validate the use of EMTG and the new performance curves.

  5. Cost and performance analysis of physical security systems

    SciTech Connect

    Hicks, M.J.; Yates, D.; Jago, W.H.; Phillips, A.W.

    1998-04-01

    Analysis of cost and performance of physical security systems can be a complex, multi-dimensional problem. There are a number of point tools that address various aspects of cost and performance analysis. Increased interest in cost tradeoffs of physical security alternatives has motivated development of an architecture called Cost and Performance Analysis (CPA), which takes a top-down approach to aligning cost and performance metrics. CPA incorporates results generated by existing physical security system performance analysis tools, and utilizes an existing cost analysis tool. The objective of this architecture is to offer comprehensive visualization of complex data to security analysts and decision-makers.

  6. IPAC-Inlet Performance Analysis Code

    NASA Technical Reports Server (NTRS)

    Barnhart, Paul J.

    1997-01-01

    A series of analyses have been developed which permit the calculation of the performance of common inlet designs. The methods presented are useful for determining the inlet weight flows, total pressure recovery, and aerodynamic drag coefficients for given inlet geometric designs. Limited geometric input data is required to use this inlet performance prediction methodology. The analyses presented here may also be used to perform inlet preliminary design studies. The calculated inlet performance parameters may be used in subsequent engine cycle analyses or installed engine performance calculations for existing uninstalled engine data.

  7. An Ethnostatistical Analysis of Performance Measurement

    ERIC Educational Resources Information Center

    Winiecki, Donald J.

    2008-01-01

    Within the fields of human performance technology (HPT), human resources management (HRM), and management in general, performance measurement is not only foundational but considered necessary at all phases in the process of HPT. In HPT in particular, there is substantial advice literature on what measurement is, why it is essential, and (at a…

  8. Spacecraft Multiple Array Communication System Performance Analysis

    NASA Technical Reports Server (NTRS)

    Hwu, Shian U.; Desilva, Kanishka; Sham, Catherine C.

    2010-01-01

    The Communication Systems Simulation Laboratory (CSSL) at the NASA Johnson Space Center is tasked to perform spacecraft and ground network communication system simulations, design validation, and performance verification. The CSSL has developed simulation tools that model spacecraft communication systems and the space and ground environment in which the tools operate. In this paper, a spacecraft communication system with multiple arrays is simulated. A multiple-array combining technique is used to increase the radio frequency coverage and data rate performance. The technique achieves phase coherence among the phased arrays so that the signals combine constructively at the targeted receiver. There are many technical challenges in integrating a high-transmit-power communication system on a spacecraft. The array combining technique can improve the communication system's data rate and coverage performance without increasing the system's transmit power requirements. Example simulation results indicate that significant performance improvement can be achieved with phase coherence implementation.
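
The benefit of phase coherence can be sketched numerically: N phase-aligned unit-amplitude signals combine to N-squared power, while randomly phased signals average only N. This is a generic array-combining illustration, not the CSSL simulation.

```python
import numpy as np

def combined_power(phases):
    """Power of N unit-amplitude signals summed with the given phases,
    normalized so a single element gives power 1."""
    s = np.sum(np.exp(1j * np.asarray(phases)))
    return np.abs(s) ** 2

N = 4
coherent = combined_power([0.0] * N)  # phase-aligned arrays: N^2 = 16
rng = np.random.default_rng(2)
# Average power with uniformly random phases: tends to N = 4.
incoherent = np.mean([combined_power(rng.uniform(0, 2 * np.pi, N))
                      for _ in range(20000)])
```

The factor-of-N gain of coherent over incoherent combining (here 16 vs. about 4) is the reason phase coherence improves data rate and coverage without raising per-element transmit power.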

  9. Performance Analysis of Multilevel Parallel Applications on Shared Memory Architectures

    NASA Technical Reports Server (NTRS)

    Biegel, Bryan A. (Technical Monitor); Jost, G.; Jin, H.; Labarta, J.; Gimenez, J.; Caubet, J.

    2003-01-01

    Parallel programming paradigms include process level parallelism, thread level parallelization, and multilevel parallelism. This viewgraph presentation describes a detailed performance analysis of these paradigms for Shared Memory Architecture (SMA). This analysis uses the Paraver Performance Analysis System. The presentation includes diagrams of a flow of useful computations.

  10. Power plant performance monitoring and improvement: Volume 5, Turbine cycle performance analysis: Interim report

    SciTech Connect

    Crim, H.G. Jr.; Westcott, J.C.; de Mello, R.W.; Brandon, R.E.; Kona, C.; Schmehl, T.G.; Reddington, J.R.

    1987-12-01

    This volume describes advanced instrumentation and computer programs for turbine cycle performance analysis. Unit conditions are displayed on-line. Included are techniques for monitoring the performance of feedwater heaters and the main condenser, procedures for planning turbine maintenance based on an analysis of preoutage testing and performance history, and an overview of the project's computerized data handling and display systems. (DWL)

  11. Assessing BMP Performance Using Microtox Toxicity Analysis

    EPA Science Inventory

    Best Management Practices (BMPs) have been shown to be effective in reducing runoff and pollutants from urban areas and thus provide a mechanism to improve downstream water quality. Currently, BMP performance regarding water quality improvement is assessed through measuring each...

  12. Analysis of telescope performance: MTF approach

    NASA Astrophysics Data System (ADS)

    Vítek, Stanislav; Páta, Petr

    2006-03-01

    Small robotic telescopes (like BOOTES in Spain, BART in the Czech Republic, or FRAM in Argentina) are constructed for continuous galactic surveys and fast reactions to GRB (gamma-ray burst) alerts. Due to their lightweight construction, the performance of these instruments depends strongly on temperature, atmospheric scintillation, etc. This article discusses possibilities for performance improvement based on knowledge of a transfer characteristic, such as the modulation transfer function (MTF), or of course the point spread function (PSF), of the imaging system comprising a robotic telescope.

  13. Performance analysis of cone detection algorithms.

    PubMed

    Mariotti, Letizia; Devaney, Nicholas

    2015-04-01

    Many algorithms have been proposed to help clinicians evaluate cone density and spacing, as these may be related to the onset of retinal diseases. However, there has been no rigorous comparison of the performance of these algorithms. In addition, the performance of such algorithms is typically determined by comparison with human observers. Here we propose a technique to simulate realistic images of the cone mosaic. We use the simulated images to test the performance of three popular cone detection algorithms, and we introduce an algorithm which is used by astronomers to detect stars in astronomical images. We use Free Response Operating Characteristic (FROC) curves to evaluate and compare the performance of the four algorithms. This allows us to optimize the performance of each algorithm. We observe that performance is significantly enhanced by up-sampling the images. We investigate the effect of noise and image quality on cone mosaic parameters estimated using the different algorithms, finding that the estimated regularity is the most sensitive parameter. PMID:26366758

  14. Rocket-in-a-Duct Performance Analysis

    NASA Technical Reports Server (NTRS)

    Schneider, Steven J.; Reed, Brian D.

    1999-01-01

    An axisymmetric, 110 N class rocket configured with a free expansion between the rocket nozzle and a surrounding duct was tested in an altitude simulation facility. The propellants were gaseous hydrogen and gaseous oxygen, and the hardware consisted of a heat-sink-type copper rocket firing through copper ducts of various diameters and lengths. A secondary flow of nitrogen was introduced at the blind end of the duct to mix with the primary rocket mass flow in the duct. This flow was in the range of 0 to 10% of the primary mass flow, and its effect on nozzle performance was measured. The random measurement errors on thrust and mass flow were within +/-1%. One-dimensional equilibrium calculations were used to establish the possible theoretical performance of these rocket-in-a-duct nozzles. Although the scale of these tests was small, they simulated the relevant flow expansion physics at a modest experimental cost. Test results indicated that lower performance was obtained at higher free expansion area ratios and longer ducts, while higher performance was obtained with the addition of secondary flow. There was a discernible peak in specific impulse efficiency at 4% secondary flow. The small scale of these tests resulted in low performance efficiencies, but prior numerical modeling of larger rocket-in-a-duct engines predicted performance comparable to that of optimized rocket nozzles. This remains to be proven in large-scale rocket-in-a-duct tests.
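
The specific impulse efficiency referred to above is the ratio of delivered to theoretical Isp, where delivered Isp comes directly from measured thrust and mass flow. The snippet below shows the standard definitions; all numbers are hypothetical stand-ins for a 110 N class thruster, not the test data.

```python
G0 = 9.80665  # standard gravity, m/s^2

def specific_impulse(thrust_n, mdot_kg_s):
    """Delivered specific impulse in seconds: Isp = F / (mdot * g0)."""
    return thrust_n / (mdot_kg_s * G0)

def isp_efficiency(thrust_n, mdot_kg_s, isp_theoretical_s):
    """Ratio of delivered Isp to the one-dimensional-equilibrium value."""
    return specific_impulse(thrust_n, mdot_kg_s) / isp_theoretical_s

# Hypothetical numbers (not from the tests): 0.030 kg/s primary flow
# plus 4% secondary flow, against an assumed theoretical Isp of 420 s.
mdot = 0.030 * 1.04
eff = isp_efficiency(110.0, mdot, isp_theoretical_s=420.0)
```

Note that the secondary flow is charged to the total mass flow in the denominator, which is why a small secondary addition can either raise or lower efficiency depending on how much thrust it recovers.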

  15. Network interface unit design options performance analysis

    NASA Technical Reports Server (NTRS)

    Miller, Frank W.

    1991-01-01

    An analysis is presented of three design options for the Space Station Freedom (SSF) onboard Data Management System (DMS) Network Interface Unit (NIU). The NIU provides the interface from the Fiber Distributed Data Interface (FDDI) local area network (LAN) to the DMS processing elements. The FDDI LAN provides the primary means for command and control and low and medium rate telemetry data transfers on board the SSF. The results of this analysis provide the basis for the implementation of the NIU.

  16. Managing Performance Analysis with Dynamic Statistical Projection Pursuit

    SciTech Connect

    Vetter, J.S.; Reed, D.A.

    2000-05-22

    Computer systems and applications are growing more complex. Consequently, performance analysis has become more difficult due to the complex, transient interrelationships among runtime components. To diagnose these types of performance issues, developers must use detailed instrumentation to capture a large number of performance metrics. Unfortunately, this instrumentation may actually influence the performance analysis, leading the developer to an ambiguous conclusion. In this paper, we introduce a technique for focusing a performance analysis on interesting performance metrics. This technique, called dynamic statistical projection pursuit, identifies interesting performance metrics that the monitoring system should capture across some number of processors. By reducing the number of performance metrics, projection pursuit can limit the impact of instrumentation on the performance of the target system and can reduce the volume of performance data.
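
A crude sketch of the projection-pursuit idea: search for a unit direction whose 1-D projection of the metric matrix looks "interesting" under some index. The kurtosis-based index, random search, and synthetic metrics below are simplifying assumptions; the paper's dynamic statistical projection pursuit is more sophisticated.

```python
import numpy as np

def projection_index(z):
    """A simple 'interestingness' index: magnitude of excess kurtosis.
    Projections that look non-Gaussian (e.g., bimodal) score high."""
    z = (z - z.mean()) / z.std()
    return abs(np.mean(z ** 4) - 3.0)

def pursue(X, trials=500, seed=3):
    """Random search over unit directions for the most 'interesting'
    1-D projection of the metric matrix X (samples x metrics)."""
    rng = np.random.default_rng(seed)
    best_w, best_score = None, -1.0
    for _ in range(trials):
        w = rng.normal(size=X.shape[1])
        w /= np.linalg.norm(w)
        score = projection_index(X @ w)
        if score > best_score:
            best_w, best_score = w, score
    return best_w, best_score

rng = np.random.default_rng(4)
# Ten synthetic performance metrics: nine plain Gaussian noise, one
# bimodal (a metric that alternates between two behavior regimes).
X = rng.normal(size=(2000, 10))
X[:, 3] = np.where(rng.uniform(size=2000) < 0.5, -3.0, 3.0) \
          + 0.3 * rng.normal(size=2000)

w, score = pursue(X)
```

The direction that wins loads heavily on the bimodal metric, so the monitoring system could capture only that metric (or that combination) and drop the uninteresting ones, reducing instrumentation volume.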

  17. Midlife plasma vitamin D concentrations and performance in different cognitive domains assessed 13 years later.

    PubMed

    Assmann, Karen E; Touvier, Mathilde; Andreeva, Valentina A; Deschasaux, Mélanie; Constans, Thierry; Hercberg, Serge; Galan, Pilar; Kesse-Guyot, Emmanuelle

    2015-05-28

    25-Hydroxyvitamin D (25(OH)D) insufficiency is very common in many countries. Yet, the extent to which 25(OH)D status affects cognitive performance remains unclear. The objective of the present study was to evaluate the cross-time association between midlife plasma 25(OH)D concentrations and subsequent cognitive performance, using a subsample from the French 'SUpplémentation en Vitamines et Minéraux AntioXydants' randomised trial (SU.VI.MAX, 1994-2002) and the SU.VI.MAX 2 observational follow-up study (2007-9). 25(OH)D concentrations were measured in plasma samples drawn in 1994-5, using an electrochemoluminescent immunoassay. Cognitive performance was evaluated in 2007-9 with a neuropsychological battery including phonemic and semantic fluency tasks, the RI-48 (rappel indicé-48 items) cued recall test, the Trail Making Test and the forward and backward digit span. Cognitive factors were extracted via principal component analysis (PCA). Data from 1009 individuals, aged 45-60 years at baseline, with available 25(OH)D and cognitive measurements were analysed by multivariable linear regression models and ANCOVA, stratified by educational level. PCA yielded two factors, designated as 'verbal memory' (strongly correlated with the RI-48 and phonemic/semantic fluency tasks) and 'short-term/working memory' (strongly correlated with the digit span tasks). In the fully adjusted regression model, among individuals with low education, there was a positive association between 25(OH)D concentrations and the 'short-term/working memory' factor (P=0.02), mainly driven by the backward digit span (P=0.004). No association with either cognitive factor was found among better educated participants. In conclusion, higher midlife 25(OH)D concentrations were linked to better outcomes concerning short-term and working memory. However, these results were specific to subjects with low education, suggesting a modifying effect of cognitive reserve. PMID:25864611

  18. System performance analysis of stretched membrane heliostats

    SciTech Connect

    Anderson, J V; Murphy, L M; Short, W; Wendelin, T

    1985-12-01

    The optical performance of both focused and unfocused stretched membrane heliostats was examined in the context of the overall cost and performance of central receiver systems. The sensitivity of optical performance to variations in design parameters such as the system size (capacity), delivery temperature, heliostat size, and heliostat surface quality was also examined. The results support the conclusion that focused stretched membrane systems provide an economically attractive alternative to current glass/metal heliostats over essentially the entire range of design parameters studied. In addition, unfocused stretched membrane heliostats may be attractive for a somewhat more limited range of applications, which would include the larger plant sizes (e.g., 450 MW) and lower delivery temperatures (e.g., 450°C), or situations in which the heliostat size could economically be reduced.

  19. Modeling and analysis of web portals performance

    NASA Astrophysics Data System (ADS)

    Abdul Rahim, Rahela; Ibrahim, Haslinda; Syed Yahaya, Sharipah Soaad; Khalid, Khairini

    2011-10-01

    The main objective of this study is to develop a model, based on queuing theory, of web portal performance at the system level for a university. A system-level performance model views the system being modeled as a 'black box' and considers the arrival rate of packets to the portal server and the service rate of the portal server. These two parameters are the key inputs for computing web portal performance metrics such as server utilization, average server throughput, average number of packets in the server, and mean response time. This study assumes an infinite population and a finite queue. The proposed analytical model is simple, in that it is easy to define and fast to interpret, yet still represents the real situation.
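
An infinite population with a finite queue corresponds to the classical M/M/1/K model, whose closed-form steady-state metrics match the ones listed above. The snippet below is a textbook sketch with invented arrival/service rates, not the study's actual parameters or model.

```python
def mm1k_metrics(lam, mu, K):
    """Steady-state metrics for an M/M/1/K queue: Poisson arrivals at
    rate lam, exponential service at rate mu, at most K packets in the
    system (including the one in service)."""
    rho = lam / mu
    if abs(rho - 1.0) < 1e-12:
        probs = [1.0 / (K + 1)] * (K + 1)
    else:
        norm = (1 - rho ** (K + 1)) / (1 - rho)
        probs = [rho ** n / norm for n in range(K + 1)]
    p_block = probs[K]                # arriving packet finds the queue full
    throughput = lam * (1 - p_block)  # admitted rate = departure rate
    utilization = 1 - probs[0]        # server busy whenever n > 0
    L = sum(n * p for n, p in enumerate(probs))  # mean number in system
    W = L / throughput                # Little's law on admitted packets
    return utilization, throughput, L, W

# Illustrative numbers: 80 packets/s arriving, 100 packets/s service,
# room for 10 packets in the server.
util, thr, L, W = mm1k_metrics(lam=80.0, mu=100.0, K=10)
```

A useful sanity check is that the admitted arrival rate equals the departure rate, i.e., lam*(1-pK) = mu*(1-p0), which the test below verifies.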

  20. Forecast analysis of optical waveguide bus performance

    NASA Technical Reports Server (NTRS)

    Ledesma, R.; Rourke, M. D.

    1979-01-01

    Elements to be considered in the design of a data bus include: architecture; data rate; modulation, encoding, and detection; power distribution requirements; protocol and word structure; bus reliability and maintainability; interterminal transmission medium; cost; and others specific to the application. Fiber-optic data bus considerations for a 32-port transmissive star architecture are discussed in a tutorial format. General optical-waveguide bus concepts are reviewed. The electrical and optical performance of a 32-port transmissive star bus and the effects of temperature on the performance of optical-waveguide buses are examined. A bibliography of pertinent references and the bus receiver test results are included.

  1. Computer program performs statistical analysis for random processes

    NASA Technical Reports Server (NTRS)

    Newberry, M. H.

    1966-01-01

    Random Vibration Analysis Program /RAVAN/ performs statistical analysis on a number of phenomena associated with flight and captive tests, but can also be used in analyzing data from many other random processes.

  2. Computer program performs stiffness matrix structural analysis

    NASA Technical Reports Server (NTRS)

    Bamford, R.; Batchelder, R.; Schmele, L.; Wada, B. K.

    1968-01-01

    Computer program generates the stiffness matrix for a particular type of structure from geometrical data, and performs static and normal mode analyses. It requires the structure to be modeled as a stable framework of uniform, weightless members, and joints at which loads are applied and weights are lumped.

  3. THERMAL PERFORMANCE ANALYSIS FOR WSB DRUM

    SciTech Connect

    Lee, S

    2008-06-26

    The Nuclear Nonproliferation Programs Design Authority is in the design stage of the Waste Solidification Building (WSB) for the treatment and solidification of the radioactive liquid waste streams generated by the Pit Disassembly and Conversion Facility (PDCF) and the Mixed Oxide (MOX) Fuel Fabrication Facility (MFFF). The waste streams will be mixed with a cementitious dry mix in a 55-gallon waste container. Savannah River National Laboratory (SRNL) has been performing the testing and evaluations to support technical decisions for the WSB. The Engineering Modeling & Simulation Group was requested to evaluate the thermal performance of the 55-gallon drum containing the hydration heat source associated with the current baseline cement waste form. A transient axisymmetric heat transfer model for the drum partially filled with waste form cement has been developed, and heat transfer calculations were performed for the baseline design configurations. For this case, 65 percent of the drum volume was assumed to be filled with the waste form, which has a transient hydration heat source, as one of the baseline conditions. A series of modeling calculations has been performed using a computational heat transfer approach. The baseline modeling results show that the time for the 65 percent filled drum to reach its maximum temperature is about 32 hours, assuming a 43 C initial cement temperature cooled by natural convection with 27 C external air. In addition, the results computed by the present model were compared with analytical solutions. The modeling results will be benchmarked against the prototypic test results, and the verified model will be used for the evaluation of the thermal performance of the WSB drum.
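
The physics of the analysis — transient conduction with a decaying hydration heat source, warm initial cement, cooler ambient boundary — can be sketched with a simple explicit 1-D finite-difference model. All material properties, geometry, and source parameters below are invented placeholders; this is not the SRNL axisymmetric drum model and should not be expected to reproduce its 32-hour result.

```python
import numpy as np

def simulate(hours=60.0, nx=41):
    """Explicit 1-D transient conduction across a slab of cement waste
    form with an exponentially decaying hydration heat source.
    Illustrative properties only."""
    L = 0.28          # slab half-width, m (roughly a 55-gallon drum radius)
    k = 1.0           # thermal conductivity, W/m-K (assumed)
    rho_cp = 2.0e6    # volumetric heat capacity, J/m^3-K (assumed)
    alpha = k / rho_cp
    q0, tau = 150.0, 20 * 3600.0  # source strength W/m^3 and decay time, s

    dx = L / (nx - 1)
    dt = 0.4 * dx * dx / alpha    # below the explicit stability limit of 0.5
    T = np.full(nx, 43.0)         # initial cement temperature, deg C
    t, peak, t_peak = 0.0, T.max(), 0.0
    while t < hours * 3600.0:
        q = q0 * np.exp(-t / tau)             # decaying hydration heat
        Tn = T.copy()
        Tn[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2*T[1:-1] + T[:-2]) \
                   + q * dt / rho_cp
        Tn[0] = Tn[1]     # insulated centerline (symmetry)
        Tn[-1] = 27.0     # surface held at ambient air temperature
        T, t = Tn, t + dt
        if T.max() > peak:
            peak, t_peak = T.max(), t
    return peak, t_peak / 3600.0

peak, t_peak_h = simulate()
```

The qualitative behavior matches the abstract: the interior heats above its initial temperature for some hours while the source outpaces conduction losses, reaches a peak, and then cools toward ambient.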

  4. A guide for performing system safety analysis

    NASA Technical Reports Server (NTRS)

    Brush, J. M.; Douglass, R. W., III.; Williamson, F. R.; Dorman, M. C. (Editor)

    1974-01-01

    A general guide is presented for performing system safety analyses of hardware, software, operations and human elements of an aerospace program. The guide describes a progression of activities that can be effectively applied to identify hazards to personnel and equipment during all periods of system development. The general process of performing safety analyses is described; setting forth in a logical order the information and data requirements, the analytical steps, and the results. These analyses are the technical basis of a system safety program. Although the guidance established by this document cannot replace human experience and judgement, it does provide a methodical approach to the identification of hazards and evaluation of risks to the system.

  5. Performance Analysis of IIUM Wireless Campus Network

    NASA Astrophysics Data System (ADS)

    Abd Latif, Suhaimi; Masud, Mosharrof H.; Anwar, Farhat

    2013-12-01

    International Islamic University Malaysia (IIUM) is one of the leading universities in the world in terms of quality of education, achieved in part by providing numerous facilities, including wireless services, to every enrolled student. The quality of this wireless service is controlled and monitored by the Information Technology Division (ITD), an ISO-standardized organization under the university. This paper aims to investigate the constraints of the IIUM wireless campus network. It evaluates the performance of the network in terms of delay, throughput and jitter. The QualNet 5.2 simulator tool was employed to measure these performance metrics. The observations from the simulation results could be one of the influencing factors for ITD in improving the wireless services.

  6. Performance analysis of panoramic infrared systems

    NASA Astrophysics Data System (ADS)

    Furxhi, Orges; Driggers, Ronald G.; Holst, Gerald; Krapels, Keith

    2014-05-01

    Panoramic imagers are becoming more commonplace in the visible part of the spectrum. These imagers are often used in the real estate market, extreme sports, teleconferencing, and security applications. Infrared panoramic imagers, on the other hand, are not as common and only a few have been demonstrated. A panoramic image can be formed in several ways, using pan and stitch, distributed aperture, or omnidirectional optics. When omnidirectional optics are used, the detected image is a warped view of the world that is mapped on the focal plane array in a donut shape. The final image on the display is the mapping of the omnidirectional donut shape image back to the panoramic world view. In this paper we analyze the performance of uncooled thermal panoramic imagers that use omnidirectional optics, focusing on range performance.

  7. Light beam deflector performance: a comparative analysis.

    PubMed

    Zook, J D

    1974-04-01

    The performance of various types of analog light beam deflectors is summarized, and their relative positions in a deflector hierarchy are defined. The three types of deflectors considered are (1) mechanical (galvanometer) mirror deflectors, (2) acoustooptic deflectors, and (3) analog electrooptic deflectors. Material figures of merit are defined and compared, and the theoretical trade-off between speed and resolution is given for each type of deflector. PMID:20126095

  8. Performance analysis of intracavity birefringence sensing

    SciTech Connect

    Yoshino, Toshihiko

    2008-05-10

    The performance of intracavity birefringence sensing using a standing-wave laser is theoretically analyzed for the case where the cavity involves internal reflection. Based on a three-mirror compound cavity model, the condition for converting an optical path length into a laser frequency, or a retardation into an optical beat frequency, with good linearity and little uncertainty is derived as a function of the cavity parameters and is numerically analyzed.

  9. Performance Analysis of the Unitree Central File

    NASA Technical Reports Server (NTRS)

    Pentakalos, Odysseas I.; Flater, David

    1994-01-01

    This report consists of two parts. The first part briefly comments on the documentation status of two major systems at NASA's Center for Computational Sciences, specifically the Cray C98 and the Convex C3830. The second part describes the work done on improving the performance of file transfers between the Unitree Mass Storage System running on the Convex file server and the users' workstations distributed over a large geographic area.

  10. Moisture performance analysis of EPS frost insulation

    SciTech Connect

    Ojanen, T.; Kokko, E.

    1997-11-01

    A horizontal layer of expanded polystyrene foam (EPS) is widely used as frost insulation for building foundations in the Nordic countries. The performance properties of the insulation depend strongly on the moisture level of the material. Experimental methods are needed to produce samples for testing the material properties in realistic moisture conditions. The objective was to analyze the moisture loads and the wetting mechanisms of horizontal EPS frost insulation. Typical wetting tests, water immersion and diffusive water vapor absorption tests, were studied and the results were compared with data from site investigations. Usually these tests give higher moisture contents of EPS than those detected in drained frost insulation applications. The effects of different parameters, such as immersion depth and temperature gradient, were also studied. Special attention was paid to the effect of diffusion on the wetting process. Numerical simulation showed that under real working conditions the long-period diffusive moisture absorption in EPS frost insulation remained lower than 1% by volume. Moisture performance was determined experimentally as a function of the distance between the insulation and the free water level in the ground. The main moisture loads and the principles for good moisture performance of frost insulation are presented.

  11. Performance analysis of the multichannel astrometric photometer

    NASA Technical Reports Server (NTRS)

    Huang, Chunsheng; Lawrence, George N.; Levy, Eugene H.; Mcmillan, Robert S.

    1987-01-01

    It has been proposed that extrasolar planetary systems may be observed if perturbations in star position due to the orbit of Jupiter-type planets could be detected. To see this motion, high accuracy measurements of 0.01 milliarcsecond are required over a relatively large field of view. Techniques using a moving Ronchi grating have been proposed for this application and have been successful in ground-based lower resolution tests. The method may have application to other precision angular measurement problems. This paper explores the theoretical description of the method, considers certain of the error sources, and presents a preliminary calculation of the performance which may be achieved.

  12. Performance analysis of instrumentation system management policies

    SciTech Connect

    Waheed, A.; Melfi, V.F.; Rover, D.T.

    1995-12-01

    Run-time trace data helps in debugging and analyzing parallel programs. Obtaining and managing this data at run time is the responsibility of an instrumentation system, which incurs overhead. In the worst case, this overhead can result in severe perturbation of the behavior of the actual program. This paper presents a queuing model for an instrumentation system. The purpose is to provide a rigorous mathematical tool for analyzing the perturbation of program behavior due to the instrumentation system's management policies. We summarize the effects of two management policies: the FOF and FAOF policies.

  13. Analysis of tandem mirror reactor performance

    SciTech Connect

    Wu, K.F.; Campbell, R.B.; Peng, Y.K.M.

    1984-11-01

    Parametric studies are performed using a tandem mirror plasma point model to evaluate the wall loading Γ and the physics figure of merit, Q (fusion power/injected power). We explore the relationship among several dominant parameters and determine the impact on the plasma performance of electron cyclotron resonance heating in the plug region. These global particle and energy balance studies were carried out under the constraints of magnetohydrodynamic (MHD) equilibrium and stability and constant magnetic flux, assuming a fixed end-cell geometry. We found that the higher the choke coil field, the higher the Q, wall loading, and fusion power, due to the combination of the increased central-cell field B_c and density n_c and the reduced central-cell beta β_c. The MHD stability requirement of constant B_c²β_c causes the reduction in β_c. In addition, a higher value of fusion power can also be obtained, at a fixed central-cell length, by operating at a lower value of B_c and a higher value of β_c.

  14. A New Approach to Aircraft Robust Performance Analysis

    NASA Technical Reports Server (NTRS)

    Gregory, Irene M.; Tierno, Jorge E.

    2004-01-01

    A recently developed algorithm for nonlinear system performance analysis has been applied to an F-16 aircraft to begin evaluating the suitability of the method for aerospace problems. The algorithm has the potential to be much more efficient than current methods of aircraft performance analysis. This paper is the initial step in evaluating that potential.

  15. Optical performance test & analysis of intraocular lenses

    NASA Astrophysics Data System (ADS)

    Choi, Junoh

    A cataract is a condition of the eye that, if left untreated, can lead to blindness. One effective treatment is removal of the cataractous natural crystalline lens and implantation of an artificial lens called an intraocular lens (IOL). IOL designs have improved over the years to better imitate natural human vision, and the need for an objective testing and analysis tool for the latest IOLs grows with these advancements. In this dissertation, I present a system capable of objective testing and analysis of advanced IOLs. The system consists of (1) a model eye into which an IOL can be inserted to mimic conditions of the human eye; (2) a modulation transfer function measurement setup capable of through-focus testing for depth-of-field studies and polychromatic testing to study the effects of chromatization; (3) use of the defocus transfer function to simulate the depth-of-field characteristics of rotationally symmetric multifocal designs, and extension of the function to polychromatic conditions; and (4) several target imaging experiments for comparison of stray-light artifacts, with simulation using a non-sequential ray-trace package.

  16. A case study in nonconformance and performance trend analysis

    NASA Technical Reports Server (NTRS)

    Maloy, Joseph E.; Newton, Coy P.

    1990-01-01

    As part of NASA's effort to develop an agency-wide approach to trend analysis, a pilot nonconformance and performance trending analysis study was conducted on the Space Shuttle auxiliary power unit (APU). The purpose of the study was to (1) demonstrate that nonconformance analysis can be used to identify repeating failures of a specific item (and the associated failure modes and causes) and (2) determine whether performance parameters could be analyzed and monitored to provide an indication of component or system degradation prior to failure. The nonconformance analysis of the APU did identify repeating component failures, which possibly could be reduced if key performance parameters were monitored and analyzed. The performance-trending analysis verified that the characteristics of hardware parameters can be effective in detecting degradation of hardware performance prior to failure.

  17. Shuttle TPS thermal performance and analysis methodology

    NASA Technical Reports Server (NTRS)

    Neuenschwander, W. E.; Mcbride, D. U.; Armour, G. A.

    1983-01-01

    Thermal performance of the thermal protection system was approximately as predicted. The only extensive anomalies were filler bar scorching and over-predictions in the high-Δp gap heating regions of the orbiter. A technique to predict filler bar scorching has been developed that can aid in defining a solution; improvement of the high-Δp gap heating methodology is still under study. Minor anomalies were also examined for improvements in modeling techniques and prediction capabilities, including improved definition of low-Δp gap heating, an analytical model for inner mold line convective heat transfer, better modeling of structure, and inclusion of sneak heating. The limited number of problems related to penetration items that arose during the orbital flight tests were resolved expeditiously, and designs were changed and proved successful within the time frame of that program.

  18. Performance analysis of advanced spacecraft TPS

    NASA Technical Reports Server (NTRS)

    Pitts, William C.

    1991-01-01

    Spacecraft entering a planetary atmosphere require a very sophisticated thermal protection system. The materials used must be tailored to each specific vehicle based on its planned mission profiles. Starting with the Space Shuttle, many types of ceramic insulation with various combinations of thermal properties have been developed by others. The development of two new materials is described: A Composite Flexible Blanket Insulation which has a significantly lower effective thermal conductivity than other ceramic blankets; and a Silicon Matrix Composite which has applications at high temperature locations such as wing leading edges. Also, a systematic study is described that considers the application of these materials for a proposed Personnel Launch System. The study shows how most of these available ceramic materials would perform during atmospheric entry of this vehicle. Other specific applications of these thermal protection materials are discussed.

  19. Lytro camera technology: theory, algorithms, performance analysis

    NASA Astrophysics Data System (ADS)

    Georgiev, Todor; Yu, Zhan; Lumsdaine, Andrew; Goma, Sergio

    2013-03-01

    The Lytro camera is the first implementation of a plenoptic camera for the consumer market. We consider it a successful example of the miniaturization aided by the increase in computational power characterizing mobile computational photography. The plenoptic camera approach to radiance capture uses a microlens array as an imaging system focused on the focal plane of the main camera lens. This paper analyzes the performance of the Lytro camera from a system-level perspective, treating the camera as a black box, and uses our interpretation of the image data saved by the camera. We present our findings based on our interpretation of the Lytro camera file structure, image calibration, and image rendering; in this context, artifacts and final image resolution are discussed.

  20. Energy performance analysis of prototype electrochromic windows

    SciTech Connect

    Sullivan, R.; Rubin, M.; Selkowitz, S.

    1996-12-01

    This paper presents the results of a study investigating the energy performance of three newly developed prototype electrochromic devices. The DOE-2.1E energy simulation program was used to analyze the annual cooling, lighting, and total electric energy use and peak demand as a function of window type and size. The authors simulated a prototypical commercial office building module located in the cooling-dominated locations of Phoenix, AZ and Miami, FL. Heating energy use was also studied in the heating-dominated location of Madison, WI. Daylight illuminance was used to control electrochromic state-switching. Two types of window systems were analyzed: the outer-pane electrochromic glazing was combined with either a conventional low-E or a spectrally selective inner pane. The properties of the electrochromic glazings are based on measured data of new prototypes developed as part of a cooperative DOE-industry program. The results show the largest difference in annual electric energy performance between the different window types occurs in Phoenix and is about 6.5 kWh/m² of floor area (0.60 kWh/ft²), which can represent a cost of about $0.52/m² ($0.05/ft²) at an electricity price of $0.08/kWh. In heating-dominated locations, the electrochromic should be maintained in its bleached state during the heating season to take advantage of beneficial solar heat gain, which reduces the amount of required heating. This also means that the electrochromic window with the largest solar heat gain coefficient is best.

  1. Performance analysis of advanced spacecraft TPS

    NASA Technical Reports Server (NTRS)

    Pitts, William C.

    1987-01-01

    The analysis of the feasibility of using metal hydrides in the thermal protection system of cryogenic tanks in space was based on the heat capacity of ice as the phase-change material (PCM). It was found that with ice the thermal protection system weight could be reduced by, at most, about 20 percent relative to an all LI-900 insulation. For this concept to be viable, a metal hydride with considerably more capacity than water would be required; none were found. Special metal hydrides have been developed for hydrogen fuel storage applications, and it may be possible to do so for the current application. Until that appears promising, further effort on this feasibility study does not seem warranted.

  2. Performance analysis of superconducting generator electromagnetic shielding

    NASA Astrophysics Data System (ADS)

    Xia, D.; Xia, Z.

    2015-12-01

    In this paper, the performance of electromagnetic shielding systems for superconducting generators is analyzed using the finite element method. Considering the non-iron-core rotor structure of superconducting generators, it is proposed that the alternating stator magnetic field generated under different operating conditions can be decomposed into oscillating and rotating fields, greatly simplifying a complex problem. A 1200 kW superconducting generator was analyzed. The distributions of the oscillating and rotating magnetic fields in the rotor region generated by the stator winding currents, and of the eddy currents these fields induce in the electromagnetic shielding tube, are calculated without an electromagnetic shielding system and with three different shielding structures. On the basis of the FEM results, the shielding factor of each electromagnetic shielding system is calculated and the shielding effects of the three structures on the oscillating and rotating fields are compared. The method and results can serve as a reference for the optimal design and loss calculation of superconducting generators.

  3. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis, but uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. We define valid data as data having known and documented paths of origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results.
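The interval estimate described in this abstract can be illustrated with a small sketch: elemental random and systematic uncertainty components are combined by root-sum-square and multiplied by a coverage factor. This is a generic, hedged example (the component values and the measured power are invented, not from the presentation).

```python
import math

# Illustrative sketch: combine random and systematic uncertainty components
# of a hypothetical PV module power measurement by root-sum-square, then
# form the interval believed to contain the true value. All values assumed.

def combined_uncertainty(random_u: list, systematic_u: list,
                         coverage_k: float = 2.0) -> float:
    """RSS-combine elemental uncertainties and apply a coverage factor."""
    s = math.sqrt(sum(u * u for u in random_u))      # combined random part
    b = math.sqrt(sum(u * u for u in systematic_u))  # combined systematic part
    return coverage_k * math.hypot(s, b)             # expanded uncertainty

measured_power_w = 52.3  # hypothetical module output
u = combined_uncertainty(random_u=[0.2, 0.1], systematic_u=[0.3, 0.15])
print(f"{measured_power_w:.1f} W ± {u:.2f} W")  # → 52.3 W ± 0.81 W
```

The pre-test version of the same calculation, run with estimated component uncertainties before any data are taken, is what lets the analysis drive experiment design.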

  4. Performance analysis of static locking in replicated distributed database systems

    NASA Technical Reports Server (NTRS)

    Kuang, Yinghong; Mukkamala, Ravi

    1991-01-01

    Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects because they are difficult to evaluate through analysis and time-consuming to evaluate through simulation. Here, a technique is discussed that combines simulation and analysis to closely illustrate the impact of deadlock and to evaluate the performance of replicated distributed databases with both shared and exclusive locks.

  5. Performance analysis of static locking in replicated distributed database systems

    NASA Technical Reports Server (NTRS)

    Kuang, Yinghong; Mukkamala, Ravi

    1991-01-01

    Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects because they are difficult to evaluate through analysis and time-consuming to evaluate through simulation. A technique is used that combines simulation and analysis to closely illustrate the impact of deadlock and to evaluate the performance of replicated distributed databases with both shared and exclusive locks.

  6. Performance analysis of parallel supernodal sparse LU factorization

    SciTech Connect

    Grigori, Laura; Li, Xiaoye S.

    2004-02-05

    We investigate performance characteristics of the LU factorization of large matrices with various sparsity patterns. We consider supernodal right-looking parallel factorization on a two-dimensional grid of processors, making use of static pivoting. We develop a performance model and validate it using the SuperLU_DIST implementation, real matrices, and the IBM Power3 machine at NERSC. We use this model to obtain performance bounds on parallel computers, to perform scalability analysis, and to identify performance bottlenecks. We also discuss the role of load balance and data distribution in this approach.
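The kind of performance model this abstract describes can be sketched in a few lines: predicted time is computation plus communication, from which speedup and efficiency bounds follow. All machine and problem parameters below are invented placeholders, not the paper's SuperLU_DIST or Power3 numbers, and perfect load balance is assumed.

```python
# Hedged sketch of a compute + communicate performance model for a
# parallel factorization. Every parameter value here is an assumption.

def predicted_time(flops: float, flop_rate: float,
                   n_msgs: float, latency: float,
                   volume: float, bandwidth: float, p: int) -> float:
    """Per-process time on p processes, assuming perfect load balance."""
    compute = flops / (p * flop_rate)                 # seconds of arithmetic
    communicate = n_msgs * latency + volume / bandwidth  # message + volume cost
    return compute + communicate

t1 = predicted_time(2e12, 1e9, 0, 0.0, 0, 1.0, p=1)       # serial baseline
t64 = predicted_time(2e12, 1e9, 5e4, 1e-5, 2e9, 1e8, p=64)
speedup = t1 / t64
print(f"speedup on 64 procs: {speedup:.1f}, efficiency: {speedup / 64:.0%}")
```

Fixing the problem size and growing `p` in such a model exposes the scalability bottleneck: the compute term shrinks while the communication term does not.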

  7. A Systemic Cause Analysis Model for Human Performance Technicians

    ERIC Educational Resources Information Center

    Sostrin, Jesse

    2011-01-01

    This article presents a systemic, research-based cause analysis model for use in the field of human performance technology (HPT). The model organizes the most prominent barriers to workplace learning and performance into a conceptual framework that explains and illuminates the architecture of these barriers that exist within the fabric of everyday…

  8. An Alternative Method to Predict Performance: Canonical Redundancy Analysis.

    ERIC Educational Resources Information Center

    Dawson-Saunders, Beth; Doolen, Deane R.

    1981-01-01

    The relationships between predictors of performance and subsequent measures of clinical performance in medical school were examined for two classes at Southern Illinois University of Medicine. Canonical redundancy analysis was used to evaluate the association between six academic and three biographical preselection characteristics and four…

  9. An Exploratory Analysis of Performance on the SAT.

    ERIC Educational Resources Information Center

    Wainer, Howard

    1984-01-01

    Techniques of exploratory data analysis (EDA) were used to decompose data tables portraying performance of ethnic groups on the Scholastic Aptitude Test. These analyses indicate the size and structure of differences in performance among groups studied, nature of changes across time, and interactions between group membership and time. (Author/DWH)

  10. Performance and stability analysis of a photovoltaic power system

    NASA Technical Reports Server (NTRS)

    Merrill, W. C.; Blaha, R. J.; Pickrell, R. L.

    1978-01-01

    The performance and stability characteristics of a 10 kVA photovoltaic power system are studied using linear Bode analysis and a nonlinear analog simulation. Power conversion efficiencies, system stability, and system transient performance results are given for system operation at various levels of solar insolation. Additionally, system operation and the modeling of system components for the purpose of computer simulation are described.

  11. Using Importance-Performance Analysis to Evaluate Training

    ERIC Educational Resources Information Center

    Siniscalchi, Jason M.; Beale, Edward K.; Fortuna, Ashley

    2008-01-01

    The importance-performance analysis (IPA) is a tool that can provide timely and usable feedback to improve training. IPA measures the gap between how important and how good (performance) a class is perceived to be by a student, and the results are presented on a 2x2 matrix. The quadrant in which the data land in this matrix aids in determining potential future action.…
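The 2x2 matrix in this abstract can be sketched directly. The quadrant labels are the conventional IPA ones, and the attributes, ratings, and cut-points below are invented for illustration (real IPA uses mean student ratings of each training attribute).

```python
# Hedged sketch of the importance-performance (IPA) matrix: each attribute
# is placed in one of four quadrants by its importance and performance
# ratings. All data below are fabricated for the example.

def ipa_quadrant(importance: float, performance: float,
                 imp_cut: float, perf_cut: float) -> str:
    """Place one attribute in the classic 2x2 IPA matrix."""
    if importance >= imp_cut:
        return "Concentrate here" if performance < perf_cut else "Keep up the good work"
    return "Low priority" if performance < perf_cut else "Possible overkill"

ratings = {  # attribute: (mean importance, mean performance) on a 1-5 scale
    "Course pacing":        (4.6, 2.9),
    "Instructor knowledge": (4.8, 4.5),
    "Handouts":             (2.1, 2.5),
    "Room comfort":         (2.3, 4.4),
}
cuts = (3.5, 3.5)  # often the scale midpoint or the grand means
for attr, (imp, perf) in ratings.items():
    print(attr, "->", ipa_quadrant(imp, perf, *cuts))
```

The choice of cut-points (scale midpoint versus grand means) shifts items between quadrants, so it is worth reporting alongside the matrix.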

  12. Analysis of a Ubiquitous Performance Support System for Teachers

    ERIC Educational Resources Information Center

    Chen, Chao-Hsiu; Hwang, Gwo-Jen; Yang, Tzu-Chi; Chen, Shih-Hsuan; Huang, Shen-Yu

    2009-01-01

    This paper describes a Ubiquitous Performance Support System for Teachers (UPSST) and its implementation model. Personal Digital Assistants (PDAs) were used as the platform to support high-school teachers. Based on concepts of Electronic Performance Support Systems and design-based research, the authors conducted an iterative process of analysis,…

  13. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    SciTech Connect

    Bremer, Peer-Timo; Mohr, Bernd; Schulz, Martin; Pascucci, Valerio; Gamblin, Todd; Brunst, Holger

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  14. Using Importance-Performance Analysis To Evaluate Teaching Effectiveness.

    ERIC Educational Resources Information Center

    Attarian, Aram

    This paper introduces Importance-Performance (IP) analysis as a method to evaluate teaching effectiveness in a university outdoor program. Originally developed for use in the field of marketing, IP analysis is simple and easy to administer, and provides the instructor with a visual representation of what teaching attributes are important, how…

  15. Preliminary Analysis of Remote Monitoring & Robotic Concepts for Performance Confirmation

    SciTech Connect

    D.A. McAffee

    1997-02-18

    As defined in 10 CFR Part 60.2, Performance Confirmation is the ''program of tests, experiments and analyses which is conducted to evaluate the accuracy and adequacy of the information used to determine with reasonable assurance that the performance objectives for the period after permanent closure will be met''. The overall Performance Confirmation program begins during site characterization and continues up to repository closure. The main purpose of this document is to develop, explore and analyze initial concepts for using remotely operated and robotic systems in gathering repository performance information during Performance Confirmation. This analysis focuses primarily on possible Performance Confirmation related applications within the emplacement drifts after waste packages have been emplaced (post-emplacement) and before permanent closure of the repository (preclosure). This will be a period of time lasting approximately 100 years and basically coincides with the Caretaker phase of the project. This analysis also examines, to a lesser extent, some applications related to Caretaker operations. A previous report examined remote handling and robotic technologies that could be employed during the waste package emplacement phase of the project (Reference 5.1). This analysis is being prepared to provide an early investigation of possible design concepts and technical challenges associated with developing remote systems for monitoring and inspecting activities during Performance Confirmation. The writing of this analysis preceded formal development of Performance Confirmation functional requirements and program plans and therefore examines, in part, the fundamental Performance Confirmation monitoring needs and operating conditions. The scope and primary objectives of this analysis are to: (1) Describe the operating environment and conditions expected in the emplacement drifts during the preclosure period. (Presented in Section 7.2). (2) Identify and discuss the

  16. The development of a reliable amateur boxing performance analysis template.

    PubMed

    Thomson, Edward; Lamb, Kevin; Nicholas, Ceri

    2013-01-01

    The aim of this study was to devise a valid performance analysis system for the assessment of the movement characteristics associated with competitive amateur boxing and assess its reliability using analysts of varying experience of the sport and performance analysis. Key performance indicators to characterise the demands of an amateur contest (offensive, defensive and feinting) were developed and notated using a computerised notational analysis system. Data were subjected to intra- and inter-observer reliability assessment using median sign tests and calculating the proportion of agreement within predetermined limits of error. For all performance indicators, intra-observer reliability revealed non-significant differences between observations (P > 0.05) and high agreement was established (80-100%) regardless of whether exact or the reference value of ±1 was applied. Inter-observer reliability was less impressive for both analysts (amateur boxer and experienced analyst), with the proportion of agreement ranging from 33-100%. Nonetheless, there was no systematic bias between observations for any indicator (P > 0.05), and the proportion of agreement within the reference range (±1) was 100%. A reliable performance analysis template has been developed for the assessment of amateur boxing performance and is available for use by researchers, coaches and athletes to classify and quantify the movement characteristics of amateur boxing. PMID:23121380

  17. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect

    Wells, C

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis, but uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. We define valid data as data having known and documented paths of origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results.

  18. Mean streamline aerodynamic performance analysis of centrifugal compressors

    SciTech Connect

    Aungier, R.H.

    1995-07-01

    Aerodynamic performance prediction models for centrifugal compressor impellers are presented. In combination with similar procedures for stationary components, previously published in the open literature, a comprehensive mean streamline performance analysis for centrifugal compressor stages is provided. The accuracy and versatility of the overall analysis is demonstrated for several centrifugal compressor stages of various types, including comparison with intrastage component performance data. Detailed validation of the analysis against experimental data has been accomplished for over a hundred stages, including stage flow coefficients from 0.009 to 0.15 and pressure ratios up to about 3.5. Its application to turbocharger stages includes pressure ratios up to 4.2, but with test uncertainty much greater than for the data used in the detailed validation studies.

  19. Performing modal analysis for multi-metric measurements: a discussion

    NASA Astrophysics Data System (ADS)

    Soman, R.; Majewska, K.; Radzienski, M.; Ostachowicz, W.

    2016-04-01

    This work addresses the severe lack of literature in the area of modal analysis for multi-metric sensing. The paper aims to provide a step-by-step tutorial on performing modal analysis using fiber Bragg grating (FBG) strain sensors and a laser Doppler vibrometer (LDV) for displacement measurements. The paper discusses in detail the different parameters that affect the accuracy of the experimental results. It highlights the often implied but unmentioned problems that researchers face while performing experiments. The paper tries to bridge the gap between the theoretical idea of the experiment and its actual execution by discussing each aspect, including the choice of specimen, boundary conditions, sensors, sensor position, excitation mechanism and its location, and the post-processing of the data. The paper may be viewed as a checklist for performing modal analysis, helping ensure high-quality measurements by preventing systematic errors from creeping in.

  20. Advanced Video Analysis Needs for Human Performance Evaluation

    NASA Technical Reports Server (NTRS)

    Campbell, Paul D.

    1994-01-01

    Evaluators of human task performance in space missions make use of video as a primary source of data. Extraction of relevant human performance information from video is often a labor-intensive process requiring a large amount of time on the part of the evaluator. Based on the experiences of several human performance evaluators, needs were defined for advanced tools which could aid in the analysis of video data from space missions. Such tools should increase the efficiency with which useful information is retrieved from large quantities of raw video. They should also provide the evaluator with new analytical functions which are not present in currently used methods. Video analysis tools based on the needs defined by this study would also have uses in U.S. industry and education. Evaluation of human performance from video data can be a valuable technique in many industrial and institutional settings where humans are involved in operational systems and processes.

  1. Network DEA: an application to analysis of academic performance

    NASA Astrophysics Data System (ADS)

    Saniee Monfared, Mohammad Ali; Safi, Mahsa

    2013-05-01

    As governmental subsidies to universities have declined in recent years, sustaining excellence in academic performance and using resources more efficiently have become important issues for university stakeholders. Assessing academic performance and resource utilization requires two things, both considered in this paper: a capable methodology and a set of good performance indicators. We propose a set of performance indicators to enable efficiency analysis of academic activities and apply a novel network DEA structure to account for sub-functional efficiencies, such as teaching quality and research productivity, as well as overall efficiency. We tested our approach on the efficiency analysis of academic colleges at Alzahra University in Iran.

  2. Performance Analysis of Web Applications Based on User Navigation

    NASA Astrophysics Data System (ADS)

    Zhou, Quanshu; Ye, Hairong; Ding, Zuohua

    This paper proposes a method for performance analysis of web applications. A behavior model is first built from the log file recording user navigation; an extended state diagram is then extracted from this log file; finally, a multiple Markov model is fitted to this state diagram, from which the performance analysis is obtained. Five indices are used to measure performance: service response time, service path length, service utilization, service implementation rate, and access error rate. Our performance analysis results suggest ways to improve the design of web applications and optimize their services. A case study of the Zhejiang Chess web site demonstrates the advantages of our method.
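The Markov-model step in this abstract can be sketched with a toy navigation chain: pages are states, transitions carry probabilities mined from the log, and propagating the state distribution yields usage measures such as how quickly sessions end. The pages and probabilities below are invented, not the paper's Zhejiang Chess data, and none of the five indices are computed here.

```python
# Hypothetical sketch of a user-navigation Markov chain. "exit" is an
# absorbing state; each row of transition probabilities sums to 1.

states = ["home", "board", "game", "exit"]
P = {
    "home":  {"home": 0.0, "board": 0.6, "game": 0.3, "exit": 0.1},
    "board": {"home": 0.2, "board": 0.0, "game": 0.5, "exit": 0.3},
    "game":  {"home": 0.1, "board": 0.3, "game": 0.4, "exit": 0.2},
    "exit":  {"home": 0.0, "board": 0.0, "game": 0.0, "exit": 1.0},
}

def visit_distribution(start: str, steps: int) -> dict:
    """Propagate the state distribution `steps` transitions forward."""
    dist = {s: float(s == start) for s in states}
    for _ in range(steps):
        nxt = {s: 0.0 for s in states}
        for s, p_s in dist.items():
            for t, p in P[s].items():
                nxt[t] += p_s * p
        dist = nxt
    return dist

d = visit_distribution("home", steps=10)
print(d)  # most probability mass has been absorbed into "exit" by 10 clicks
```

The transient (non-exit) mass at each step is what log-derived indices such as service utilization and path length are built from.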

  3. Performance Analysis of Multilevel Parallel Applications on Shared Memory Architectures

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Caubet, Jordi; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    In this paper we describe how to apply powerful performance analysis techniques to understand the behavior of multilevel parallel applications. We use the Paraver/OMPItrace performance analysis system for our study. This system consists of two major components: the OMPItrace dynamic instrumentation mechanism, which allows tracing of processes and threads, and the Paraver graphical user interface for inspection and analysis of the generated traces. We describe how to use the system to conduct a detailed comparative study of a benchmark code implemented in five different programming paradigms applicable to shared memory architectures.

  4. Temporal geospatial analysis of secondary school students’ examination performance

    NASA Astrophysics Data System (ADS)

    Nik Abd Kadir, ND; Adnan, NA

    2016-06-01

    Malaysia's Ministry of Education has improved the organization of its data by establishing a geographical information system (GIS) school database; however, no further analysis has been done with geospatial analysis tools. Mapping has emerged as a communication tool and an effective way to publish digital and statistical data such as school performance results. The objective of this study is to analyse secondary school students' performance on the science and mathematics portions of the Sijil Pelajaran Malaysia examination for the years 2010 to 2014 in Kelantan's state schools, with the aid of GIS software and geospatial analysis. School performance by grade point average (GPA), from Grade A to Grade G, was interpolated and mapped, and query analysis using geospatial tools could be performed. This study will benefit the education sector by enabling analysis of student performance not only in Kelantan but across Malaysia, and publishing results as maps supports better planning and decision making in preparing young Malaysians for the challenges of the education system.
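Interpolating point values such as per-school GPA, as this abstract describes, is commonly done in GIS tools with methods like inverse-distance weighting (IDW). The sketch below shows the idea; the coordinates and GPA values are invented, not Kelantan data, and the study does not state which interpolation method was used.

```python
import math

# Illustrative inverse-distance-weighting (IDW) interpolation, a common
# geospatial method for point data such as per-school GPA. All sample
# points below are assumptions for the example.

def idw(x: float, y: float, samples: list, power: float = 2.0) -> float:
    """Estimate a value at (x, y) from (xi, yi, value) samples."""
    num = den = 0.0
    for xi, yi, v in samples:
        d = math.hypot(x - xi, y - yi)
        if d == 0:
            return v              # query point coincides with a sample
        w = 1.0 / d ** power      # nearer samples get more weight
        num += w * v
        den += w
    return num / den

schools = [(0.0, 0.0, 3.2), (1.0, 0.0, 2.1), (0.0, 1.0, 2.8)]  # (x, y, GPA)
print(round(idw(0.4, 0.4, schools), 2))  # → 2.79, pulled toward the nearest school
```

Evaluating `idw` over a grid of (x, y) points produces the continuous surface that gets rendered as the performance map.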

  5. Quantitative analysis of regional myocardial performance in coronary artery disease

    NASA Technical Reports Server (NTRS)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings from a group of subjects with significant coronary artery stenosis and from a group of controls are presented, determined by use of a quantitative method for the study of regional myocardial performance based on frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions and the timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  6. Integrated design environment for human performance and human reliability analysis

    SciTech Connect

    Nelson, W.R.

    1997-05-01

    Work over the last few years at the Idaho National Engineering and Environmental Laboratory (INEEL) has included a major focus on applying human performance and human reliability knowledge and methods as an integral element of system design and development. This work has been pursued in programs in a wide variety of technical domains, beginning with nuclear power plant operations. Since the mid-1980s the laboratory has transferred the methods and tools developed in the nuclear domain to military weapons systems and aircraft, offshore oil and shipping operations, and commercial aviation operations and aircraft design. Through these diverse applications the laboratory has developed an integrated approach and framework for application of human performance analysis, human reliability analysis (HRA), operational data analysis, and simulation studies of human performance to the design and development of complex systems. This approach was recently tested in the NASA Advanced Concepts Program "Structured Human Error Analysis for Aircraft Design." This program resulted in the prototype software tool THEA (Tool for Human Error Analysis) for incorporating human error analysis in the design of commercial aircraft, focusing on airplane maintenance tasks. Current effort is directed toward applying this framework to the development of advanced Air Traffic Management (ATM) systems as part of NASA's Advanced Air Transportation Technologies (AATT) program. This paper summarizes the approach, describes recent and current applications in commercial aviation, and provides perspectives on how the approach could be utilized in the nuclear power industry.

  7. Analysis of Photovoltaic System Energy Performance Evaluation Method

    SciTech Connect

    Kurtz, S.; Newmiller, J.; Kimber, A.; Flottemesch, R.; Riley, E.; Dierauf, T.; McKee, J.; Krishnani, P.

    2013-11-01

    Documentation of the energy yield of a large photovoltaic (PV) system over a substantial period can be useful to measure a performance guarantee, as an assessment of the health of the system, for verification of a performance model to then be applied to a new system, or for a variety of other purposes. Although the measurement of this performance metric might appear to be straightforward, there are a number of subtleties associated with variations in weather and imperfect data collection that complicate the determination and data analysis. A performance assessment is most valuable when it is completed with a very low uncertainty and when the subtleties are systematically addressed, yet currently no standard exists to guide this process. This report summarizes a draft methodology for an Energy Performance Evaluation Method, the philosophy behind the draft method, and the lessons that were learned by implementing the method.

  8. An Empirical Analysis of Human Performance and Nuclear Safety Culture

    SciTech Connect

    Jeffrey Joe; Larry G. Blackwood

    2006-06-01

    The purpose of this analysis, which was conducted for the US Nuclear Regulatory Commission (NRC), was to test whether an empirical connection exists between human performance and nuclear power plant safety culture. This was accomplished by analyzing the relationship between a measure of human performance and a plant's Safety Conscious Work Environment (SCWE). SCWE is an important component of safety culture that the NRC has developed, but it is not synonymous with it. SCWE is an environment in which employees are encouraged to raise safety concerns both to their own management and to the NRC without fear of harassment, intimidation, retaliation, or discrimination. Because the relationship between human performance and allegations is intuitively reciprocal and both directions of the relationship need exploration, two series of analyses were performed. First, because human performance data could be indicative of safety culture, regression analyses were performed using human performance data to predict SCWE. Second, because safety culture likely contributes to human performance issues at a plant, a second set of regressions was performed using allegations to predict HFIS results.

  9. Performance on the Pharmacy College Admission Test: An Exploratory Analysis.

    ERIC Educational Resources Information Center

    Kawahara, Nancy E.; Ethington, Corinna

    1994-01-01

    Median polishing, an exploratory data analysis technique, was used to study achievement patterns for men and women on the Pharmacy College Admission Test over a six-year period. In general, a declining trend in scores was found, and males performed better than females, with the largest differences found in chemistry and biology.…
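
    Median polish alternately sweeps row and column medians out of a two-way table, leaving overall, row, and column effects plus residuals. A minimal sketch in Python; the 3x2 table below is invented for illustration, not the PCAT data:

```python
import statistics

def median_polish(table, n_iter=10):
    """Tukey's median polish: decompose a two-way table into
    overall + row effects + column effects + residuals."""
    rows, cols = len(table), len(table[0])
    res = [row[:] for row in table]
    overall, row_eff, col_eff = 0.0, [0.0] * rows, [0.0] * cols
    for _ in range(n_iter):
        for i in range(rows):                      # sweep row medians
            m = statistics.median(res[i])
            row_eff[i] += m
            res[i] = [v - m for v in res[i]]
        m = statistics.median(row_eff)             # recentre row effects
        overall += m
        row_eff = [v - m for v in row_eff]
        for j in range(cols):                      # sweep column medians
            m = statistics.median(res[i][j] for i in range(rows))
            col_eff[j] += m
            for i in range(rows):
                res[i][j] -= m
        m = statistics.median(col_eff)             # recentre column effects
        overall += m
        col_eff = [v - m for v in col_eff]
    return overall, row_eff, col_eff, res

# Invented cohort-by-subtest table; purely additive, so residuals vanish
table = [[11.0, 21.0], [12.0, 22.0], [13.0, 23.0]]
overall, row_eff, col_eff, res = median_polish(table)
```

    Applied to a subtest-by-year score table, the column effects expose the trend over years while large residuals flag unusual cells.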

  10. Performance Analysis of GAME: A Generic Automated Marking Environment

    ERIC Educational Resources Information Center

    Blumenstein, Michael; Green, Steve; Fogelman, Shoshana; Nguyen, Ann; Muthukkumarasamy, Vallipuram

    2008-01-01

    This paper describes the Generic Automated Marking Environment (GAME) and provides a detailed analysis of its performance in assessing student programming projects and exercises. GAME has been designed to automatically assess programming assignments written in a variety of languages based on the "structure" of the source code and the correctness…

  11. Frontiers of Performance Analysis on Leadership-Class Systems

    SciTech Connect

    Fowler, R J; Adhianto, L; de Supinski, B R; Fagan, M; Gamblin, T; Krentel, M; Mellor-Crummey, J; Schulz, M; Tallent, N

    2009-06-15

    The number of cores employed in high-end systems for scientific computing is increasing rapidly. As a result, there is a pressing need for tools that can measure, model, and diagnose performance problems in highly parallel runs. We describe two tools that employ complementary approaches to analysis at scale, and we illustrate their use on DOE leadership-class systems.

  12. Job Analysis, Job Descriptions, and Performance Appraisal Systems.

    ERIC Educational Resources Information Center

    Sims, Johnnie M.; Foxley, Cecelia H.

    1980-01-01

    Job analysis, job descriptions, and performance appraisal can benefit student services administration in many ways. Involving staff members in the development and implementation of these techniques can increase commitment to and understanding of the overall objectives of the office, as well as communication and cooperation among colleagues.…

  13. A Semiotic Reading and Discourse Analysis of Postmodern Street Performance

    ERIC Educational Resources Information Center

    Lee, Mimi Miyoung; Chung, Sheng Kuan

    2009-01-01

    Postmodern street art operates under a set of references that requires art educators and researchers to adopt alternative analytical frameworks in order to understand its meanings. In this article, we describe social semiotics, critical discourse analysis, and postmodern street performance as well as the relevance of the former two in interpreting…

  14. The Analysis of Athletic Performance: Some Practical and Philosophical Considerations

    ERIC Educational Resources Information Center

    Nelson, Lee J.; Groom, Ryan

    2012-01-01

    This article presents a hypothetical dialogue between a notational analyst (NA) recently schooled in the positivistic assessment of athletic performance, an "old-school" traditional coach (TC) who favours subjective analysis, and a pragmatic educator (PE). The conversation opens with NA and TC debating the respective value of quantitative and…

  15. Visuo-Spatial Performance in Autism: A Meta-Analysis

    ERIC Educational Resources Information Center

    Muth, Anne; Hönekopp, Johannes; Falter, Christine M.

    2014-01-01

    Visuo-spatial skills are believed to be enhanced in autism spectrum disorders (ASDs). This meta-analysis tests the current state of evidence for Figure Disembedding, Block Design, Mental Rotation and Navon tasks in ASD and neurotypicals. Block Design (d = 0.32) and Figure Disembedding (d = 0.26) showed superior performance for ASD with large…

  16. Identification of human operator performance models utilizing time series analysis

    NASA Technical Reports Server (NTRS)

    Holden, F. M.; Shinners, S. M.

    1973-01-01

    The results of an effort performed by Sperry Systems Management Division for AMRL in applying time series analysis as a tool for modeling the human operator are presented. This technique is utilized for determining the variation of the human transfer function under various levels of stress. The human operator's model is determined based on actual input and output data from a tracking experiment.

  17. Storage element performance optimization for CMS analysis jobs

    NASA Astrophysics Data System (ADS)

    Behrmann, G.; Dahlblom, J.; Guldmyr, J.; Happonen, K.; Lindén, T.

    2012-12-01

    Tier-2 computing sites in the Worldwide Large Hadron Collider Computing Grid (WLCG) host CPU resources (Compute Element, CE) and storage resources (Storage Element, SE). The vast amount of data that needs to be processed from the Large Hadron Collider (LHC) experiments requires good and efficient use of the available resources. Good CPU efficiency for end-user analysis jobs requires that the performance of the storage system scales with the I/O requests from hundreds or even thousands of simultaneous jobs. In this presentation we report on work to improve the SE performance at the Helsinki Institute of Physics (HIP) Tier-2 used for the Compact Muon Solenoid (CMS) experiment at the LHC. Statistics from CMS grid jobs are collected and stored in the CMS Dashboard for further analysis, which allows for easy performance monitoring by the sites and by the CMS collaboration. As part of the monitoring framework CMS used the JobRobot, which sent 100 analysis jobs to each site every four hours. CMS also uses the HammerCloud tool for site monitoring and stress testing, and it has replaced the JobRobot. The performance of the analysis workflow submitted with JobRobot or HammerCloud can be used to track the effect of site configuration changes, since the analysis workflow is kept the same for all sites and for months at a time. The CPU efficiency of the JobRobot jobs at HIP was increased by approximately 50% to more than 90%, by tuning the SE and by improvements in the CMSSW and dCache software. The performance of the CMS analysis jobs improved significantly too. Similar work has been done at other CMS Tier sites, and on average the CPU efficiency for CMSSW jobs has increased during 2011. Better monitoring of the SE allows faster detection of problems, so that the performance level can be kept high. 
The next storage upgrade at HIP consists of SAS disk enclosures which can be stress tested on demand with HammerCloud workflows, to make sure that the I/O-performance

  18. Eddy-current steam generator data analysis performance. Final report

    SciTech Connect

    Harris, D.H.

    1993-06-01

    This study assessed the accuracy of eddy-current bobbin-coil data analysis of steam generator tubes conducted under the structure of the PWR Steam Generator Examination Guidelines. Individual and team performance measures were obtained from independent analyses of data from 1619 locations in a sample of 199 steam generator tubes. The 92 reportable indications contained in the tube sample, including 64 repairable indications, were attributable to: wear at anti-vibration bars, intergranular attack/stress-corrosion cracking (IGA/SCC) within tube sheet crevice regions, primary-water stress-corrosion cracking (PWSCC) at tube roll transitions, or thinning at cold-leg tube supports. Analyses were conducted by 20 analysts, four each from five vendors of eddy-current steam generator examination services. In accordance with the guidelines, site orientation was provided with plant-specific guidelines; preanalysis practice was completed on plant-specific data; analysts were qualified by performance testing; and independent primary-secondary analyses were conducted with resolution of discrepancies (team analyses). Measures of analysis performance included percentages of indications correctly reported, percentages of false reports, and relative operating characteristic (ROC) curves. ROC curves presented comprehensive pictures of analysis accuracy generalizable beyond the specific conditions of this study. They also provided single-value measures of analysis accuracy. Conclusions and recommendations were provided relative to analysis accuracy, the effect of primary-secondary analyses, analyses of tube sheet crevice regions, establishment of reporting criteria, improvement of examination guidelines, and needed research.
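
    A relative operating characteristic (ROC) curve plots the true-positive rate against the false-positive rate as the reporting threshold is varied, and the area under it gives a single-value accuracy measure. A minimal sketch (the confidence scores and labels are hypothetical, and tied scores are not grouped, a simplification):

```python
def roc_points(scores, labels):
    """(FPR, TPR) pairs obtained by sweeping the decision threshold
    down through the sorted scores. labels: 1 = real indication."""
    pairs = sorted(zip(scores, labels), reverse=True)
    pos = sum(labels)
    neg = len(labels) - pos
    tp = fp = 0
    pts = [(0.0, 0.0)]
    for _score, y in pairs:
        if y:
            tp += 1
        else:
            fp += 1
        pts.append((fp / neg, tp / pos))
    return pts

def auc(pts):
    """Area under the ROC curve by the trapezoid rule -- a
    single-value summary of analysis accuracy."""
    return sum((x2 - x1) * (y1 + y2) / 2
               for (x1, y1), (x2, y2) in zip(pts, pts[1:]))

# A perfectly discriminating analyst scores all real indications higher
pts = roc_points([0.9, 0.8, 0.2, 0.1], [1, 1, 0, 0])
print(auc(pts))  # 1.0
```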

  19. Performance Indicators in Math: Implications for Brief Experimental Analysis of Academic Performance

    ERIC Educational Resources Information Center

    VanDerheyden, Amanda M.; Burns, Matthew K.

    2009-01-01

    Brief experimental analysis (BEA) can be used to specify intervention characteristics that produce positive learning gains for individual students. A key challenge to the use of BEA for intervention planning is the identification of performance indicators (including topography of the skill, measurement characteristics, and decision criteria) that…

  20. Modeling and performance analysis of GPS vector tracking algorithms

    NASA Astrophysics Data System (ADS)

    Lashley, Matthew

    This dissertation provides a detailed analysis of GPS vector tracking algorithms and the advantages they have over traditional receiver architectures. Standard GPS receivers use a decentralized architecture that separates the tasks of signal tracking and position/velocity estimation. Vector tracking algorithms combine the two tasks into a single algorithm. The signals from the various satellites are processed collectively through a Kalman filter. The advantages of vector tracking over traditional, scalar tracking methods are thoroughly investigated. A method for making a valid comparison between vector and scalar tracking loops is developed. This technique avoids the ambiguities encountered when attempting to make a valid comparison between tracking loops (which are characterized by noise bandwidths and loop order) and the Kalman filters (which are characterized by process and measurement noise covariance matrices) that are used by vector tracking algorithms. The improvement in performance offered by vector tracking is calculated in multiple different scenarios. Rule of thumb analysis techniques for scalar Frequency Lock Loops (FLL) are extended to the vector tracking case. The analysis tools provide a simple method for analyzing the performance of vector tracking loops. The analysis tools are verified using Monte Carlo simulations. Monte Carlo simulations are also used to study the effects of carrier to noise power density (C/N0) ratio estimation and the advantage offered by vector tracking over scalar tracking. The improvement from vector tracking ranges from 2.4 to 6.2 dB in various scenarios. The difference in the performance of the three vector tracking architectures is analyzed. The effects of using a federated architecture with and without information sharing between the receiver's channels are studied. A combination of covariance analysis and Monte Carlo simulation is used to analyze the performance of the three algorithms. 
The federated algorithm without
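
    The filtering style behind vector tracking can be made concrete with a toy example. The sketch below is a scalar random-walk Kalman filter smoothing noisy discriminator-like measurements; it illustrates only the gain/update mechanics, not the dissertation's vector tracking algorithm (q, r, and the measurements are invented):

```python
def kalman_1d(measurements, q=1e-4, r=0.5):
    """Scalar Kalman filter with a random-walk state model:
    x is the estimate, p its variance, q the process noise,
    r the measurement noise. Returns the estimate after each update."""
    x, p = 0.0, 1.0
    estimates = []
    for z in measurements:
        p += q                    # predict: state variance grows
        k = p / (p + r)           # Kalman gain
        x += k * (z - x)          # update with measurement residual
        p *= 1.0 - k
        estimates.append(x)
    return estimates

# Noisy observations of a constant true value of 5.0
zs = [5.3, 4.8, 5.1, 4.9, 5.2, 5.0] * 5
estimates = kalman_1d(zs)
```

    Unlike a fixed-bandwidth loop filter, the gain k adapts as the state variance shrinks, which is the property that lets Kalman-based tracking be characterized by noise covariances rather than loop order and bandwidth.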

  1. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is at a disadvantage compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized, network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
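
    The modeling style can be illustrated with a minimal discrete event simulation of a single queue of service requests. This is a hedged sketch of the approach, not NASA's model; the arrival and service rates are invented:

```python
import random

def mean_wait(arrival_rate, service_rate, n_jobs, seed=1):
    """Simulate n_jobs requests arriving at a single server with
    exponential interarrival and service times; return the mean
    time a request waits in the queue before service starts."""
    rng = random.Random(seed)
    t = 0.0                # current arrival time
    server_free = 0.0      # time at which the server next becomes idle
    total_wait = 0.0
    for _ in range(n_jobs):
        t += rng.expovariate(arrival_rate)     # next arrival event
        start = max(t, server_free)            # wait if server is busy
        total_wait += start - t
        server_free = start + rng.expovariate(service_rate)
    return total_wait / n_jobs

# A generously provisioned server vs. a nearly saturated one
print(mean_wait(1.0, 5.0, 2000) < mean_wait(1.0, 1.2, 2000))  # True
```

    A full model would draw request types from the demand distribution and contend for multiple resources, but the event-driven skeleton is the same.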

  2. Analysis of portable impactor performance for enumeration of viable bioaerosols.

    PubMed

    Yao, Maosheng; Mainelis, Gediminas

    2007-07-01

    Portable impactors are increasingly being used to estimate concentrations of bioaerosols in residential and occupational environments; however, little data are available about their performance. This study investigated the overall performance of the SMA MicroPortable, BioCulture, Microflow, Microbiological Air Sampler (MAS-100), Millipore Air Tester, SAS Super 180, and RCS High Flow portable microbial samplers when collecting bacteria and fungi both indoors and outdoors. The performance of these samplers was compared with that of the BioStage impactor. The Button Aerosol Sampler equipped with a gelatin filter was also included in the study. Results showed that the sampling environment can have a statistically significant effect on sampler performance, most likely due to differences in airborne microorganism composition and/or size distribution. Analysis of variance showed that the relative performance of all samplers (except the RCS High Flow and MAS-100) was statistically significantly lower than that of the BioStage. The MAS-100 also had statistically higher performance than the other portable samplers except the RCS High Flow. The Millipore Air Tester and the SMA had the lowest performance. The relative performance of the impactors was described using a multiple linear regression model (R(2) = 0.83); the effects of the samplers' cutoff sizes and jet-to-plate distances as predictor variables were statistically significant. The data presented in this study will help field professionals in selecting bioaerosol samplers, and the developed empirical formula describing the overall performance of bioaerosol impactors can assist in sampler design. PMID:17538812

  3. Performance demonstration program plan for analysis of simulated headspace gases

    SciTech Connect

    1995-06-01

    The Performance Demonstration Program (PDP) for analysis of headspace gases will consist of regular distribution and analyses of test standards to evaluate the capability for analyzing VOCs, hydrogen, and methane in the headspace of transuranic (TRU) waste throughout the Department of Energy (DOE) complex. Each distribution is termed a PDP cycle. These evaluation cycles will provide an objective measure of the reliability of measurements performed for TRU waste characterization. Laboratory performance will be demonstrated by the successful analysis of blind audit samples of simulated TRU waste drum headspace gases according to the criteria set within the text of this Program Plan. Blind audit samples (hereinafter referred to as PDP samples) will be used as an independent means to assess laboratory performance regarding compliance with the QAPP QAOs. The concentration of analytes in the PDP samples will encompass the range of concentrations anticipated in actual waste characterization gas samples. Analyses which are required by the WIPP to demonstrate compliance with various regulatory requirements and which are included in the PDP must be performed by laboratories which have demonstrated acceptable performance in the PDP.

  4. Performance analysis of mini centrifugal pump with splitter blades

    NASA Astrophysics Data System (ADS)

    Shigemitsu, T.; Fukutomi, J.; Wada, T.; Shinohara, H.

    2013-12-01

    A design method for mini centrifugal pumps has not been established, because the internal flow conditions of these small fluid machines have not been clarified and conventional theory is not suitable for small pumps. A semi-open impeller with a 55 mm diameter is therefore adopted in this research, with simplicity and ease of maintenance taken into consideration. Splitter blades are adopted to improve the performance and internal flow condition of a mini centrifugal pump having a large blade outlet angle. Performance tests are conducted with these rotors in order to investigate the effect of the splitter blades on the performance and internal flow condition of the mini centrifugal pump. A three-dimensional steady numerical flow analysis is conducted to analyze rotor and volute efficiency and the loss caused by a vortex. The experimental results make clear that the performance of the mini centrifugal pump is improved by the splitter blades. The flow condition at the rotor outlet becomes uniform and back-flow regions are suppressed in the case with splitter blades. Further, the volute efficiency increases and the vortex loss decreases. In the present paper, the performance of the mini centrifugal pump is presented and the flow condition is clarified with the results of the experiment and the numerical flow analysis. Furthermore, performance analyses of the mini centrifugal pumps with and without splitter blades are conducted.

  5. Mission analysis and performance specification studies report, appendix A

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The Near Term Hybrid Passenger Vehicle Development Program tasks included defining missions, developing distributions of daily travel and composite driving cycles for these missions, providing information necessary to estimate the potential replacement of the existing fleet by hybrids, and estimating acceleration/gradeability performance requirements for safe operation. The data was then utilized to develop mission specifications, define reference vehicles, develop hybrid vehicle performance specifications, and make fuel consumption estimates for the vehicles. The major assumptions which underlie the approach taken to the mission analysis and development of performance specifications are the following: the daily operating range of a hybrid vehicle should not be limited by the stored energy capacity and the performance of such a vehicle should not be strongly dependent on the battery state of charge.

  6. Safety and performance analysis of a commercial photovoltaic installation

    NASA Astrophysics Data System (ADS)

    Hamzavy, Babak T.; Bradley, Alexander Z.

    2013-09-01

    Continuing to better understand the performance of PV systems, and how that performance changes over the system's life, is vital to the sustainable growth of solar. A systematic understanding of degradation mechanisms induced by variables such as the service environment, installation, module/material design, weather, operation and maintenance, and manufacturing is required for reliable operation throughout a system's lifetime. We report the results of an analysis of a commercial c-Si PV array owned and operated by DuPont. We assessed the electrical performance of the modules by comparing the original manufacturers' performance data with measurements obtained using a solar simulator to determine the degradation rate. This evaluation provides valuable PV system field experience and documents key issues regarding safety and performance. A review of the nondestructive and destructive analytical methods and characterization strategies we have found useful for system, module, and subsequent material component evaluations is presented. We provide an overview of our inspection protocol and the subsequent control process to mitigate risk. The objective is to explore and develop best-practice protocols for PV asset optimization and to provide a rationale for reducing risk based on the analysis of our own commercial installations.

  7. Performance analysis of static locking in distributed database systems

    SciTech Connect

    Shyu, S.C. ); Li, V.O.K. . Dept. of Electrical Engineering)

    1990-06-01

    Numerous performance models have been proposed for locking algorithms in centralized database systems, but few have been developed for distributed ones. Existing results on distributed locking usually ignore the deadlock problem so as to simplify the analysis. In this paper, a new performance model for static locking in distributed database systems is developed. A queuing model is used to approximate static locking in distributed database systems without deadlocks. Then a random graph model is proposed to find the deadlock probability of each transaction. The two models are integrated, so that given the transaction arrival rate, the response time and the effective throughput can be calculated.
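
    The random-graph idea can be illustrated with a Monte Carlo sketch: generate random wait-for graphs among transactions and count how often a cycle (i.e., a deadlock) appears. This is an illustration of the concept only, not the paper's analytical model; the edge probability is invented:

```python
import random

def deadlock_prob(n_txn, p_edge, trials=2000, seed=7):
    """Estimate the probability that a random wait-for graph on
    n_txn transactions (each directed edge present with probability
    p_edge) contains a cycle, i.e. a deadlock."""
    rng = random.Random(seed)

    def has_cycle(adj):
        # DFS with white/gray/black coloring; a gray-to-gray edge is a cycle
        color = [0] * len(adj)
        def dfs(u):
            color[u] = 1
            for v in adj[u]:
                if color[v] == 1 or (color[v] == 0 and dfs(v)):
                    return True
            color[u] = 2
            return False
        return any(color[u] == 0 and dfs(u) for u in range(len(adj)))

    hits = 0
    for _ in range(trials):
        adj = [[v for v in range(n_txn) if v != u and rng.random() < p_edge]
               for u in range(n_txn)]
        hits += has_cycle(adj)
    return hits / trials
```

    Denser wait-for graphs (more lock contention) make deadlock sharply more likely, which is why ignoring deadlocks biases a pure queuing analysis.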

  8. Performance requirements analysis for payload delivery from a space station

    NASA Technical Reports Server (NTRS)

    Friedlander, A. L.; Soldner, J. K.; Bell, J. (Editor); Ricks, G. W.; Kincade, R. E.; Deatkins, D.; Reynolds, R.; Nader, B. A.; Hill, O.; Babb, G. R.

    1983-01-01

    Operations conducted from a space station in low Earth orbit which have different constraints and opportunities than those conducted from direct Earth launch were examined. While a space station relieves many size and performance constraints on the space shuttle, the space station's inertial orbit has different launch window constraints from those associated with customary Earth launches which reflect upon upper stage capability. A performance requirements analysis was developed to provide a reference source of parametric data, and specific case solutions and upper stage sizing trade to assist potential space station users and space station and upper stage developers assess the impacts of a space station on missions of interest.

  9. A Study of ATLAS Grid Performance for Distributed Analysis

    NASA Astrophysics Data System (ADS)

    Panitkin, Sergey; Fine, Valery; Wenaus, Torre

    2012-12-01

    In the past two years the ATLAS Collaboration at the LHC has collected a large volume of data and published a number of ground breaking papers. The Grid-based ATLAS distributed computing infrastructure played a crucial role in enabling timely analysis of the data. We will present a study of the performance and usage of the ATLAS Grid as platform for physics analysis in 2011. This includes studies of general properties as well as timing properties of user jobs (wait time, run time, etc). These studies are based on mining of data archived by the PanDA workload management system.

  10. INL FY2014 1st Quarterly Performance Analysis

    SciTech Connect

    Loran Kinghorn

    2014-07-01

    This report is published quarterly by the Idaho National Laboratory (INL) Performance Assurance Organization. The Department of Energy Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, "Occurrence Reporting and Processing of Operations Information," requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of 76 occurrence reports and over 16 other deficiency reports (including not-reportable events) identified at the INL during the period of October 2013 through December 2013. Battelle Energy Alliance (BEA) operates the INL under contract DE-AC07-05ID14517.

  11. High Performance Descriptive Semantic Analysis of Semantic Graph Databases

    SciTech Connect

    Joslyn, Cliff A.; Adolf, Robert D.; al-Saffar, Sinan; Feo, John T.; Haglin, David J.; Mackey, Greg E.; Mizell, David W.

    2011-06-02

    As semantic graph database technology grows to address components ranging from extant large triple stores to SPARQL endpoints over SQL-structured relational databases, it will become increasingly important to be able to understand their inherent semantic structure, whether codified in explicit ontologies or not. Our group is researching novel methods for what we call descriptive semantic analysis of RDF triplestores, to serve purposes of analysis, interpretation, visualization, and optimization. But data size and computational complexity make it increasingly necessary to bring high performance computational resources to bear on this task. Our research group built a novel high performance hybrid system comprising computational capability for semantic graph database processing utilizing the large multi-threaded architecture of the Cray XMT platform, conventional servers, and large data stores. In this paper we describe that architecture and our methods, and present the results of our analyses of basic properties, connected components, namespace interaction, and typed paths for the Billion Triple Challenge 2010 dataset.

  12. Performance analysis of solar powered absorption refrigeration system

    NASA Astrophysics Data System (ADS)

    Abu-Ein, Suleiman Qaseem; Fayyad, Sayel M.; Momani, Waleed; Al-Bousoul, Mamdouh

    2009-12-01

    The present work provides a detailed thermodynamic analysis of a 10 kW solar absorption refrigeration system using ammonia-water mixtures as the working medium. This analysis includes both the first and second laws of thermodynamics. The coefficient of performance (COP), the exergetic coefficient of performance (ECOP), and the exergy losses (ΔE) through each component of the system at different operating conditions are obtained. The minimum and maximum values of COP and ECOP were found to occur at generator temperatures of 110 and 200 °C, respectively. About 40% of the system exergy losses were found to be in the generator. The maximum exergy losses in the absorber occur at a generator temperature of 130 °C for all evaporator temperatures. A computer simulation model is developed to carry out the calculations and to obtain the results of the present study.
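
    With the usual textbook definitions (which may differ in detail from the paper's formulation), COP and ECOP can be computed as below; the energy flows and temperatures are invented for illustration:

```python
def cop(q_evap, q_gen, w_pump=0.0):
    """First-law COP of an absorption chiller: cooling effect per
    unit of driving heat input (plus solution pump work)."""
    return q_evap / (q_gen + w_pump)

def ecop(q_evap, q_gen, t0, t_evap, t_gen, w_pump=0.0):
    """Exergetic COP: exergy of the cooling effect divided by the
    exergy supplied, using Carnot factors (temperatures in kelvin,
    t0 = ambient reference temperature)."""
    exergy_out = q_evap * (t0 / t_evap - 1.0)
    exergy_in = q_gen * (1.0 - t0 / t_gen) + w_pump
    return exergy_out / exergy_in

# 10 kW of cooling driven by 14 kW of generator heat
print(round(cop(10.0, 14.0), 3))  # 0.714
print(round(ecop(10.0, 14.0, 298.0, 278.0, 383.0), 3))
```

    ECOP is always well below COP at these temperatures because low-grade cooling carries little exergy relative to the heat supplied at the generator.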

  13. Microfabricated devices for performing chemical and biochemical analysis

    SciTech Connect

    Ramsey, J.M.; Jacobson, S.C.; Foote, R.S.

    1997-05-01

    There is growing interest in microfabricated devices that perform chemical and biochemical analysis. The general goal is to use microfabrication tools to construct miniature devices that can perform a complete analysis starting with an unprocessed sample. Such devices have been referred to as lab-on-a-chip devices. Initial efforts on microfluidic laboratory-on-a-chip devices focused on chemical separations. There are many potential applications of these fluidic microchip devices. Some applications such as chemical process control or environmental monitoring would require that a chip be used over an extended period of time or for many analyses. Other applications such as forensics, clinical diagnostics, and genetic diagnostics would employ the chip devices as single use disposable devices.

  14. Performance analysis of a generalized upset detection procedure

    NASA Technical Reports Server (NTRS)

    Blough, Douglas M.; Masson, Gerald M.

    1987-01-01

    A general procedure for upset detection in complex systems, called the data block capture and analysis upset monitoring process is described and analyzed. The process consists of repeatedly recording a fixed amount of data from a set of predetermined observation lines of the system being monitored (i.e., capturing a block of data), and then analyzing the captured block in an attempt to determine whether the system is functioning correctly. The algorithm which analyzes the data blocks can be characterized in terms of the amount of time it requires to examine a given length data block to ascertain the existence of features/conditions that have been predetermined to characterize the upset-free behavior of the system. The performance of linear, quadratic, and logarithmic data analysis algorithms is rigorously characterized in terms of three performance measures: (1) the probability of correctly detecting an upset; (2) the expected number of false alarms; and (3) the expected latency in detecting upsets.
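A minimal sketch of the capture-and-analyze loop, assuming an invented feature test (counting samples outside a 3-sigma band) and a toy upset model; it estimates the first two performance measures empirically:

```python
import random

def analyze_block(block, threshold):
    """Linear-time analysis: flag the block if enough samples fall out of range."""
    return sum(1 for x in block if abs(x) > 3.0) >= threshold

def monitor(n_blocks, block_len, upset_rate, threshold, rng):
    """Repeatedly capture a block and analyze it; tally detections and false alarms."""
    detections = upsets = false_alarms = 0
    for _ in range(n_blocks):
        upset = rng.random() < upset_rate
        scale = 4.0 if upset else 1.0    # toy model: upsets inflate sample variance
        block = [rng.gauss(0.0, scale) for _ in range(block_len)]
        flagged = analyze_block(block, threshold)
        if upset:
            upsets += 1
            detections += flagged
        elif flagged:
            false_alarms += 1
    return detections, upsets, false_alarms
```

With such a loop, detections/upsets estimates the detection probability, and the threshold trades detection probability against the expected number of false alarms.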

  15. Identifying influential factors of business process performance using dependency analysis

    NASA Astrophysics Data System (ADS)

    Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank

    2011-02-01

    We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures to represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain a deeper knowledge why certain KPI targets are not met.
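A one-level version of such a dependency tree can be sketched as a variance-minimizing split search over candidate metrics; the metric names and data here are invented for illustration:

```python
def variance(ys):
    m = sum(ys) / len(ys)
    return sum((y - m) ** 2 for y in ys) / len(ys)

def best_split(rows, kpi):
    """Find the metric and threshold whose binary split best explains the KPI
    (minimum total within-group variance), i.e. the root of a dependency tree."""
    best = None
    for name in rows[0]:
        for t in sorted({r[name] for r in rows}):
            left = [k for r, k in zip(rows, kpi) if r[name] <= t]
            right = [k for r, k in zip(rows, kpi) if r[name] > t]
            if not left or not right:
                continue
            score = len(left) * variance(left) + len(right) * variance(right)
            if best is None or score < best[0]:
                best = (score, name, t)
    return best[1], best[2]

# Invented example: the KPI tracks service latency, not queue length
rows = [{"latency_ms": 10, "queue_len": 5}, {"latency_ms": 10, "queue_len": 1},
        {"latency_ms": 50, "queue_len": 5}, {"latency_ms": 50, "queue_len": 1}]
kpi = [1.0, 1.0, 9.0, 9.0]
print(best_split(rows, kpi))
```

Recursing the same split search on each branch yields the full tree that an analyst can drill down into.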

  16. Performance of multifractal detrended fluctuation analysis on short time series

    NASA Astrophysics Data System (ADS)

    López, Juan Luis; Contreras, Jesús Guillermo

    2013-02-01

The performance of multifractal detrended fluctuation analysis on short time series is evaluated for synthetic samples of several mono- and multifractal models. The reconstruction of the generalized Hurst exponents is used to determine the range of applicability of the method and the precision of its results as a function of the decreasing length of the series. As an application, the series of the daily exchange rate between the U.S. dollar and the euro is studied.
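As a sketch of the method's core quantity, the q-th order fluctuation function F_q(s) with linear detrending can be computed as below; the generalized Hurst exponent h(q) is then the slope of log F_q(s) versus log s. This is a minimal illustration, not the authors' implementation:

```python
import math
import random

def linear_detrend_rms(seg):
    """RMS residual of a segment after removing its least-squares line."""
    n = len(seg)
    xm, ym = (n - 1) / 2.0, sum(seg) / n
    sxx = sum((x - xm) ** 2 for x in range(n))
    sxy = sum((x - xm) * (y - ym) for x, y in enumerate(seg))
    b = sxy / sxx
    a = ym - b * xm
    return math.sqrt(sum((y - (a + b * x)) ** 2 for x, y in enumerate(seg)) / n)

def fluctuation(series, scale, q):
    """q-th order fluctuation function F_q(s) of MFDFA with linear detrending."""
    profile, c, mean = [], 0.0, sum(series) / len(series)
    for x in series:
        c += x - mean
        profile.append(c)                 # cumulative (profile) series
    segs = [profile[i:i + scale]
            for i in range(0, len(profile) - scale + 1, scale)]
    f2 = [linear_detrend_rms(s) ** 2 for s in segs]
    if q == 0:                            # logarithmic average for q = 0
        return math.exp(sum(math.log(f) for f in f2) / (2 * len(f2)))
    return (sum(f ** (q / 2.0) for f in f2) / len(f2)) ** (1.0 / q)

# white-noise example: F_2(s) should grow roughly like sqrt(s)
rng = random.Random(42)
noise = [rng.gauss(0.0, 1.0) for _ in range(1024)]
print(fluctuation(noise, 8, 2), fluctuation(noise, 64, 2))
```

For a monofractal series the fitted slope h(q) is roughly constant in q; curvature of h(q) is the multifractal signature the paper probes on short series.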

  17. Analysis of guidance law performance using personal computers

    NASA Technical Reports Server (NTRS)

    Barrios, J. Rene

    1990-01-01

    A point mass, three-degree of freedom model is presented as a basic development tool for PC based simulation models. The model has been used in the development of guidance algorithms as well as in other applications such as performance management systems to compute optimal speeds. Its limitations and advantages are discussed with regard to the windshear environment. A method for simulating a simple autopilot is explained in detail and applied in the analysis of different guidance laws.
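A planar point-mass model of this kind reduces to a handful of state equations; the sketch below uses Euler integration with placeholder thrust, drag, and lift inputs (the paper's model is three-dimensional and more complete):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def step(state, thrust, drag, lift, mass, dt):
    """One Euler step of a planar point-mass: state = (x, h, V, gamma),
    with x downrange distance, h altitude, V airspeed, gamma flight-path angle."""
    x, h, v, gamma = state
    v_dot = (thrust - drag) / mass - G * math.sin(gamma)
    gamma_dot = lift / (mass * v) - (G / v) * math.cos(gamma)
    return (x + v * math.cos(gamma) * dt,
            h + v * math.sin(gamma) * dt,
            v + v_dot * dt,
            gamma + gamma_dot * dt)
```

In trimmed level flight (lift balancing weight, thrust balancing drag) every state except downrange distance stays constant, which makes a convenient sanity check before wiring in a guidance law or a windshear model.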

  18. Performance Demonstration Program Plan for Analysis of Simulated Headspace Gases

    SciTech Connect

    Carlsbad Field Office

    2007-11-19

    The Performance Demonstration Program (PDP) for headspace gases distributes blind audit samples in a gas matrix for analysis of volatile organic compounds (VOCs). Participating measurement facilities (i.e., fixed laboratories, mobile analysis systems, and on-line analytical systems) are located across the United States. Each sample distribution is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed for transuranic (TRU) waste characterization. The primary documents governing the conduct of the PDP are the Quality Assurance Program Document (QAPD) (DOE/CBFO-94-1012) and the Waste Isolation Pilot Plant (WIPP) Waste Analysis Plan (WAP) contained in the Hazardous Waste Facility Permit (NM4890139088-TSDF) issued by the New Mexico Environment Department (NMED). The WAP requires participation in the PDP; the PDP must comply with the QAPD and the WAP. This plan implements the general requirements of the QAPD and the applicable requirements of the WAP for the Headspace Gas (HSG) PDP. Participating measurement facilities analyze blind audit samples of simulated TRU waste package headspace gases according to the criteria set by this PDP Plan. Blind audit samples (hereafter referred to as PDP samples) are used as an independent means to assess each measurement facility’s compliance with the WAP quality assurance objectives (QAOs). To the extent possible, the concentrations of VOC analytes in the PDP samples encompass the range of concentrations anticipated in actual TRU waste package headspace gas samples. Analyses of headspace gases are required by the WIPP to demonstrate compliance with regulatory requirements. These analyses must be performed by measurement facilities that have demonstrated acceptable performance in this PDP. These analyses are referred to as WIPP analyses and the TRU waste package headspace gas samples on which they are performed are referred to as WIPP samples in this document

  19. Performance Demonstration Program Plan for Analysis of Simulated Headspace Gases

    SciTech Connect

    Carlsbad Field Office

    2007-11-13

    The Performance Demonstration Program (PDP) for headspace gases distributes blind audit samples in a gas matrix for analysis of volatile organic compounds (VOCs). Participating measurement facilities (i.e., fixed laboratories, mobile analysis systems, and on-line analytical systems) are located across the United States. Each sample distribution is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed for transuranic (TRU) waste characterization. The primary documents governing the conduct of the PDP are the Quality Assurance Program Document (QAPD) (DOE/CBFO-94-1012) and the Waste Isolation Pilot Plant (WIPP) Waste Analysis Plan (WAP) contained in the Hazardous Waste Facility Permit (NM4890139088-TSDF) issued by the New Mexico Environment Department (NMED). The WAP requires participation in the PDP; the PDP must comply with the QAPD and the WAP. This plan implements the general requirements of the QAPD and the applicable requirements of the WAP for the Headspace Gas (HSG) PDP. Participating measurement facilities analyze blind audit samples of simulated TRU waste package headspace gases according to the criteria set by this PDP Plan. Blind audit samples (hereafter referred to as PDP samples) are used as an independent means to assess each measurement facility’s compliance with the WAP quality assurance objectives (QAOs). To the extent possible, the concentrations of VOC analytes in the PDP samples encompass the range of concentrations anticipated in actual TRU waste package headspace gas samples. Analyses of headspace gases are required by the WIPP to demonstrate compliance with regulatory requirements. These analyses must be performed by measurement facilities that have demonstrated acceptable performance in this PDP. These analyses are referred to as WIPP analyses and the TRU waste package headspace gas samples on which they are performed are referred to as WIPP samples in this document

  20. Performance Demonstration Program Plan for Analysis of Simulated Headspace Gases

    SciTech Connect

    Carlsbad Field Office

    2006-04-01

    The Performance Demonstration Program (PDP) for headspace gases distributes sample gases of volatile organic compounds (VOCs) for analysis. Participating measurement facilities (i.e., fixed laboratories, mobile analysis systems, and on-line analytical systems) are located across the United States. Each sample distribution is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed for transuranic (TRU) waste characterization. The primary documents governing the conduct of the PDP are the Quality Assurance Program Document (QAPD) (DOE/CBFO-94-1012) and the Waste Isolation Pilot Plant (WIPP) Waste Analysis Plan (WAP) contained in the Hazardous Waste Facility Permit (NM4890139088-TSDF) issued by the New Mexico Environment Department (NMED). The WAP requires participation in the PDP; the PDP must comply with the QAPD and the WAP. This plan implements the general requirements of the QAPD and the applicable requirements of the WAP for the Headspace Gas (HSG) PDP. Participating measurement facilities analyze blind audit samples of simulated TRU waste package headspace gases according to the criteria set by this PDP Plan. Blind audit samples (hereafter referred to as PDP samples) are used as an independent means to assess each measurement facility’s compliance with the WAP quality assurance objectives (QAOs). To the extent possible, the concentrations of VOC analytes in the PDP samples encompass the range of concentrations anticipated in actual TRU waste package headspace gas samples. Analyses of headspace gases are required by the WIPP to demonstrate compliance with regulatory requirements. These analyses must be performed by measurement facilities that have demonstrated acceptable performance in this PDP. These analyses are referred to as WIPP analyses and the TRU waste package headspace gas samples on which they are performed are referred to as WIPP samples in this document. Participating measurement

  1. Performance analysis of wireless sensor networks in geophysical sensing applications

    NASA Astrophysics Data System (ADS)

    Uligere Narasimhamurthy, Adithya

Performance is an important criterion to consider before switching from a wired network to a wireless sensing network. Performance is especially important in geophysical sensing, where the quality of the sensing system is measured by the precision of the acquired signal. Can a wireless sensing network maintain the same reliability and quality metrics that a wired system provides? Our work focuses on evaluating the wireless GeoMote sensor motes that were developed by previous computer science graduate students at Mines. Specifically, we conducted a set of experiments, namely WalkAway and Linear Array experiments, to characterize the performance of the wireless motes. The motes were also equipped with the Sticking Heartbeat Aperture Resynchronization Protocol (SHARP), a time synchronization protocol developed by a previous computer science graduate student at Mines. This protocol should automatically synchronize the motes' internal clocks and reduce time synchronization errors. We also collected passive data to evaluate the response of GeoMotes to various frequency components associated with the seismic waves. With the data collected from these experiments, we evaluated the performance of the SHARP protocol and compared the performance of our GeoMote wireless system against the industry-standard wired seismograph system (Geometric-Geode). Using arrival time analysis and seismic velocity calculations, we set out to answer the following question: can our wireless sensing system (GeoMotes) perform similarly to a traditional wired system in a realistic scenario?
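The arrival-time analysis mentioned above amounts to fitting first-arrival picks against offset along the array; a minimal sketch, with array geometry and picks invented for illustration:

```python
def apparent_velocity(offsets_m, picks_s):
    """Apparent seismic velocity from first-arrival picks along a linear array:
    the inverse slope of the least-squares arrival-time vs offset line."""
    n = len(offsets_m)
    xm = sum(offsets_m) / n
    tm = sum(picks_s) / n
    sxx = sum((x - xm) ** 2 for x in offsets_m)
    sxt = sum((x - xm) * (t - tm) for x, t in zip(offsets_m, picks_s))
    slope = sxt / sxx     # seconds per metre
    return 1.0 / slope    # metres per second

# hypothetical 10 m geophone spacing with clean picks
print(apparent_velocity([0.0, 10.0, 20.0, 30.0], [0.0, 0.02, 0.04, 0.06]))
```

Comparing the velocity recovered from the wireless picks against the wired system's picks is one way to quantify the effect of residual time-synchronization error.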

  2. Performance Analysis of Rotary Dehumidifier/Humidifier and Systems

    NASA Astrophysics Data System (ADS)

    Hamamoto, Yoshinori; Okajima, Jiro; Matsuoka, Fumio; Akisawa, Atsushi; Kashiwagi, Takao

The study aims at clarifying the performance of a desiccant rotor and at obtaining design guidelines for highly efficient desiccant rotors and systems. In the paper, a theoretical analysis is performed for a rotary dehumidifier and humidifier. The validity of the model is confirmed by comparison between experimental data and calculation. The influences of several interacting factors, such as the adsorption/desorption time ratio, air flow path patterns, and air conditions, on rotor performance are examined. It is clarified that there is an optimum angle of the adsorption, desorption, and purge zones that maximizes the amount of humidification, and also an optimum desorption-side air flow rate. It is confirmed that air flow rate and air temperature significantly influence the amounts of dehumidification and humidification. Furthermore, it is suggested that heat transfer enhancement of the rotor is effective for mass transfer enhancement at the beginning of the desorption process.

  3. SIMS analysis of high-performance accelerator niobium

    SciTech Connect

Maheshwari, P.; Stevie, F. A.; Myneni, Ganapati Rao; Rigsbee, J. M.; Dhakal, Pashupati; Ciovati, Gianluigi; Griffis, D. P.

    2014-11-01

Niobium is used to fabricate superconducting radio frequency accelerator modules because of its high critical temperature, high critical magnetic field, and easy formability. Recent experiments have shown a very significant improvement in performance (over 100%) after a high-temperature bake at 1400 °C for 3 h. SIMS analysis of this material showed the oxygen profile was significantly deeper than the native oxide with a shape that is indicative of diffusion. Positive secondary ion mass spectra showed the presence of Ti with a depth profile similar to that of O. It is suspected that Ti is associated with the performance improvement. The source of Ti contamination in the anneal furnace has been identified, and a new furnace was constructed without Ti. Initial results from the new furnace do not show the yield improvement. Further analyses should determine the relationship of Ti to cavity performance.

  4. Crew Exploration Vehicle Launch Abort Controller Performance Analysis

    NASA Technical Reports Server (NTRS)

    Sparks, Dean W., Jr.; Raney, David L.

    2007-01-01

    This paper covers the simulation and evaluation of a controller design for the Crew Module (CM) Launch Abort System (LAS), to measure its ability to meet the abort performance requirements. The controller used in this study is a hybrid design, including features developed by the Government and the Contractor. Testing is done using two separate 6-degree-of-freedom (DOF) computer simulation implementations of the LAS/CM throughout the ascent trajectory: 1) executing a series of abort simulations along a nominal trajectory for the nominal LAS/CM system; and 2) using a series of Monte Carlo runs with perturbed initial flight conditions and perturbed system parameters. The performance of the controller is evaluated against a set of criteria, which is based upon the current functional requirements of the LAS. Preliminary analysis indicates that the performance of the present controller meets (with the exception of a few cases) the evaluation criteria mentioned above.

  5. Total systems design analysis of high performance structures

    NASA Technical Reports Server (NTRS)

    Verderaime, V.

    1993-01-01

Designer-control parameters were identified at interdiscipline interfaces to optimize structural systems performance and downstream development and operations with reliability and least life-cycle cost. Interface tasks and iterations are tracked through a matrix of performance disciplines integration versus manufacturing, verification, and operations interactions for a total system design analysis. Performance integration tasks include shapes, sizes, environments, and materials. Integrity integrating tasks are reliability and recurring structural costs. Significant interface designer-control parameters were noted as shapes, dimensions, probability range factors, and cost. The structural failure concept is presented, and first-order reliability and deterministic methods, their benefits, and limitations are discussed. A deterministic reliability technique combining the benefits of both is proposed for static structures, which is also timely and economically verifiable. Though launch vehicle environments were primarily considered, the system design process is applicable to any surface system using its own unique field environments.

  6. The Vehicle Integrated Performance Analysis Experience: Reconnecting With Technical Integration

    NASA Technical Reports Server (NTRS)

    McGhee, D. S.

    2006-01-01

Very early in the Space Launch Initiative program, a small team of engineers at MSFC proposed a process for performing system-level assessments of a launch vehicle. Aimed primarily at providing insight and making NASA a smart buyer, the Vehicle Integrated Performance Analysis (VIPA) team was created. The difference between the VIPA effort and previous integration attempts is that VIPA is a process that uses experienced people from various disciplines and focuses them on a technically integrated assessment. The foundations of VIPA's process are described. The VIPA team also recognized the need to target early detailed analysis toward identifying significant systems issues. This process is driven by the T-model for technical integration. VIPA's approach to performing system-level technical integration is discussed in detail. The VIPA process significantly enhances the development and monitoring of realizable project requirements. VIPA's assessment validates the concept's stated performance, identifies significant issues either with the concept or the requirements, and then reintegrates these issues to determine impacts. This process is discussed along with a description of how it may be integrated into a program's insight and review process. The VIPA process has gained favor with both engineering and project organizations for being responsive and insightful.

  7. Dynamic performances analysis of a real vehicle driving

    NASA Astrophysics Data System (ADS)

    Abdullah, M. A.; Jamil, J. F.; Salim, M. A.

    2015-12-01

Vehicle dynamics concerns the effects of vehicle motion generated by acceleration, braking, ride, and handling activities. The dynamic behaviours are determined by the forces from the tires, gravity, and aerodynamics acting on the vehicle. This paper emphasizes the analysis of the dynamic performance of a real vehicle. A real driving experiment is conducted to determine the vehicle's roll, pitch, and yaw, as well as its longitudinal, lateral, and vertical acceleration. An accelerometer records the vehicle's dynamic response while it is driven on the road. The experiment starts with weighing the car to locate the center of gravity (COG), where the accelerometer sensor is placed for data acquisition (DAQ). The COG of the vehicle is determined from the measured weights. A rural route is set for the experiment and its road conditions are noted for the test. The dynamic performance of the vehicle depends on the road conditions and driving maneuvers, and the stability of the vehicle can be assessed through dynamic performance analysis.
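The weighing step can be sketched as a simple moment balance; a minimal illustration assuming only front and rear axle loads are measured:

```python
def cog_from_axle_loads(front_load, rear_load, wheelbase):
    """Longitudinal COG position measured back from the front axle.
    Taking moments about the front axle: x = W_rear * L / W_total."""
    return rear_load / (front_load + rear_load) * wheelbase

# hypothetical axle loads (N) and wheelbase (m)
print(cog_from_axle_loads(6000.0, 4000.0, 2.5))
```

Repeating the measurement with the car tilted (or weighing each wheel) extends the same balance to the lateral and vertical COG coordinates.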

  8. Quantitative analysis of the reconstruction performance of interpolants

    NASA Technical Reports Server (NTRS)

    Lansing, Donald L.; Park, Stephen K.

    1987-01-01

    The analysis presented provides a quantitative measure of the reconstruction or interpolation performance of linear, shift-invariant interpolants. The performance criterion is the mean square error of the difference between the sampled and reconstructed functions. The analysis is applicable to reconstruction algorithms used in image processing and to many types of splines used in numerical analysis and computer graphics. When formulated in the frequency domain, the mean square error clearly separates the contribution of the interpolation method from the contribution of the sampled data. The equations provide a rational basis for selecting an optimal interpolant; that is, one which minimizes the mean square error. The analysis has been applied to a selection of frequently used data splines and reconstruction algorithms: parametric cubic and quintic Hermite splines, exponential and nu splines (including the special case of the cubic spline), parametric cubic convolution, Keys' fourth-order cubic, and a cubic with a discontinuous first derivative. The emphasis in this paper is on the image-dependent case in which no a priori knowledge of the frequency spectrum of the sampled function is assumed.
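The performance criterion itself, the mean square error between the sampled function and its reconstruction, can also be estimated numerically; the sketch below does so for a linear interpolant (the paper works in the frequency domain and covers higher-order splines):

```python
import math

def mse_linear_interp(f, n_samples, n_eval=1000):
    """Mean square error between f on [0, 1] and its reconstruction from
    n_samples uniform samples using linear interpolation."""
    xs = [i / (n_samples - 1) for i in range(n_samples)]
    ys = [f(x) for x in xs]
    err = 0.0
    for j in range(n_eval):
        t = j / (n_eval - 1)
        i = min(int(t * (n_samples - 1)), n_samples - 2)   # enclosing interval
        w = t * (n_samples - 1) - i                        # interpolation weight
        recon = (1.0 - w) * ys[i] + w * ys[i + 1]
        err += (f(t) - recon) ** 2
    return err / n_eval

# example: reconstructing a sine from 8 vs 64 samples
print(mse_linear_interp(lambda x: math.sin(2 * math.pi * x), 8))
print(mse_linear_interp(lambda x: math.sin(2 * math.pi * x), 64))
```

Swapping in a different reconstruction kernel (cubic convolution, Hermite splines, and so on) while holding f fixed reproduces, empirically, the kind of interpolant comparison the analysis formalizes.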

  9. Reproducible LTE uplink performance analysis using precomputed interference signals

    NASA Astrophysics Data System (ADS)

    Pauli, Volker; Nisar, Muhammad Danish; Seidel, Eiko

    2011-12-01

    The consideration of realistic uplink inter-cell interference is essential for the overall performance testing of future cellular systems, and in particular for the evaluation of the radio resource management (RRM) algorithms. Most beyond-3G communication systems employ orthogonal multiple access in uplink (SC-FDMA in LTE and OFDMA in WiMAX), and additionally rely on frequency-selective RRM (scheduling) algorithms. This makes the task of accurate modeling of uplink interference both crucial and non-trivial. Traditional methods for its modeling (e.g., via additive white Gaussian noise interference sources) are therefore proving to be ineffective to realistically model the uplink interference in the next generation cellular systems. In this article, we propose the use of realistic precomputed interference patterns for LTE uplink performance analysis and testing. The interference patterns are generated via an LTE system-level simulator for a given set of scenario parameters, such as cell configuration, user configurations, and traffic models. The generated interference patterns (some of which are made publicly available) can be employed to benchmark the performance of any LTE uplink system in both lab simulations and field trials for practical deployments. It is worth mentioning that the proposed approach can also be extended to other cellular communication systems employing OFDMA-like multiple access with frequency-selective RRM techniques. The proposed approach offers twofold advantages. First, it allows for repeatability and reproducibility of the performance analysis. This is of crucial significance not only for researchers and developers to analyze the behavior and performance of their systems, but also for the network operators to compare the performance of competing system vendors. 
Second, the proposed testing mechanism evades the need for deployment of multiple cells (with multiple active users in each) to achieve realistic field trials, thereby resulting in

  10. An advanced probabilistic structural analysis method for implicit performance functions

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Millwater, H. R.; Cruse, T. A.

    1989-01-01

    In probabilistic structural analysis, the performance or response functions usually are implicitly defined and must be solved by numerical analysis methods such as finite element methods. In such cases, the most commonly used probabilistic analysis tool is the mean-based, second-moment method which provides only the first two statistical moments. This paper presents a generalized advanced mean value (AMV) method which is capable of establishing the distributions to provide additional information for reliability design. The method requires slightly more computations than the second-moment method but is highly efficient relative to the other alternative methods. In particular, the examples show that the AMV method can be used to solve problems involving non-monotonic functions that result in truncated distributions.
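As a sketch of the mean-based first-order step that the AMV method builds on, the response mean and standard deviation can be estimated from finite-difference sensitivities at the input means; the AMV correction itself, re-evaluating the implicit response at the most probable point, is omitted here:

```python
import math

def mv_first_order(g, means, stds, h=1e-5):
    """Mean-value first-order estimate of the mean and standard deviation of
    an implicit response g(x) for independent inputs with given means/stds."""
    g0 = g(means)
    var = 0.0
    for i in range(len(means)):
        x = list(means)
        x[i] += h
        dg_dxi = (g(x) - g0) / h        # finite-difference sensitivity
        var += (dg_dxi * stds[i]) ** 2  # first-order variance contribution
    return g0, math.sqrt(var)
```

For an implicit g evaluated by a finite element code, each perturbation is one extra analysis run, which is why the paper stresses the method's efficiency relative to sampling-based alternatives.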

  11. Clinical laboratory as an economic model for business performance analysis

    PubMed Central

    Buljanović, Vikica; Patajac, Hrvoje; Petrovečki, Mladen

    2011-01-01

    Aim To perform SWOT (strengths, weaknesses, opportunities, and threats) analysis of a clinical laboratory as an economic model that may be used to improve business performance of laboratories by removing weaknesses, minimizing threats, and using external opportunities and internal strengths. Methods Impact of possible threats to and weaknesses of the Clinical Laboratory at Našice General County Hospital business performance and use of strengths and opportunities to improve operating profit were simulated using models created on the basis of SWOT analysis results. The operating profit as a measure of profitability of the clinical laboratory was defined as total revenue minus total expenses and presented using a profit and loss account. Changes in the input parameters in the profit and loss account for 2008 were determined using opportunities and potential threats, and economic sensitivity analysis was made by using changes in the key parameters. The profit and loss account and economic sensitivity analysis were tools for quantifying the impact of changes in the revenues and expenses on the business operations of clinical laboratory. Results Results of simulation models showed that operational profit of €470 723 in 2008 could be reduced to only €21 542 if all possible threats became a reality and current weaknesses remained the same. Also, operational gain could be increased to €535 804 if laboratory strengths and opportunities were utilized. If both the opportunities and threats became a reality, the operational profit would decrease by €384 465. Conclusion The operational profit of the clinical laboratory could be significantly reduced if all threats became a reality and the current weaknesses remained the same. The operational profit could be increased by utilizing strengths and opportunities as much as possible. This type of modeling may be used to monitor business operations of any clinical laboratory and improve its financial situation by
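The underlying arithmetic is simple: operating profit is total revenue minus total expenses, re-evaluated under scenario adjustments. The revenue/expense split below is invented for illustration (the paper reports only the €470 723 profit for 2008):

```python
def operating_profit(total_revenue, total_expenses):
    """Operating profit as defined in the study: revenue minus expenses."""
    return total_revenue - total_expenses

def scenario(total_revenue, total_expenses, d_revenue=0.0, d_expenses=0.0):
    """Recompute profit after fractional revenue/expense changes drawn from a
    SWOT scenario (threats realized, opportunities taken, etc.)."""
    return operating_profit(total_revenue * (1.0 + d_revenue),
                            total_expenses * (1.0 + d_expenses))

# Hypothetical split that reproduces the reported EUR 470 723 profit
revenue, expenses = 2_000_000.0, 1_529_277.0
print(scenario(revenue, expenses))                   # baseline
print(scenario(revenue, expenses, d_revenue=-0.10))  # one threat scenario
```

Sweeping the adjustment fractions over a grid gives exactly the kind of economic sensitivity analysis the abstract describes.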

  12. Analysis of latency performance of bluetooth low energy (BLE) networks.

    PubMed

    Cho, Keuchul; Park, Woojin; Hong, Moonki; Park, Gisu; Cho, Wooseong; Seo, Jihoon; Han, Kijun

    2015-01-01

Bluetooth Low Energy (BLE) is a short-range wireless communication technology aiming at low-cost and low-power communication. The performance evaluation of classical Bluetooth device discovery has been intensively studied using analytical modeling and simulative methods, but these techniques are not applicable to BLE, since BLE has a fundamental change in the design of the discovery mechanism, including the usage of three advertising channels. Recently, several works have analyzed the topic of BLE device discovery, but these studies are still far from thorough. It is thus necessary to develop a new, accurate model for the BLE discovery process. In particular, the wide range of parameter settings gives BLE devices considerable latitude to customize their discovery performance. This motivates our study of modeling the BLE discovery process and performing intensive simulation. This paper is focused on building an analytical model to investigate the discovery probability, as well as the expected discovery latency, which are then validated via extensive experiments. Our analysis considers both continuous and discontinuous scanning modes. We analyze the sensitivity of these performance metrics to parameter settings to quantitatively examine to what extent parameters influence the performance of the discovery process. PMID:25545266
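The discovery process can also be sketched as a Monte Carlo simulation: an advertiser transmits every advertising interval (plus the random delay the specification adds) and is discovered when an event lands inside a scan window. This toy model collapses the three advertising channels into one and ignores packet length:

```python
import random

def discovery_latency(adv_interval, scan_interval, scan_window, rng, t_max=30.0):
    """Time (s) until an advertising event first lands inside a scan window."""
    t = rng.random() * adv_interval               # random initial phase
    while t < t_max:
        if (t % scan_interval) < scan_window:
            return t                              # PDU heard by the scanner
        t += adv_interval + rng.random() * 0.01   # next event, plus 0-10 ms delay
    return t_max                                  # undiscovered within the horizon

# continuous scanning: first advertising event is always heard
print(discovery_latency(0.1, 1.0, 1.0, random.Random(1)))
```

Averaging over many seeds estimates the expected latency and discovery probability for a parameter setting, which is how a simulation like this can cross-check the paper's analytical model.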

  13. Performance analysis of image fusion methods in transform domain

    NASA Astrophysics Data System (ADS)

    Choi, Yoonsuk; Sharifahmadian, Ershad; Latifi, Shahram

    2013-05-01

    Image fusion involves merging two or more images in such a way as to retain the most desirable characteristics of each. There are various image fusion methods and they can be classified into three main categories: i) Spatial domain, ii) Transform domain, and iii) Statistical domain. We focus on the transform domain in this paper as spatial domain methods are primitive and statistical domain methods suffer from a significant increase of computational complexity. In the field of image fusion, performance analysis is important since the evaluation result gives valuable information which can be utilized in various applications, such as military, medical imaging, remote sensing, and so on. In this paper, we analyze and compare the performance of fusion methods based on four different transforms: i) wavelet transform, ii) curvelet transform, iii) contourlet transform and iv) nonsubsampled contourlet transform. Fusion framework and scheme are explained in detail, and two different sets of images are used in our experiments. Furthermore, various performance evaluation metrics are adopted to quantitatively analyze the fusion results. The comparison results show that the nonsubsampled contourlet transform method performs better than the other three methods. During the experiments, we also found out that the decomposition level of 3 showed the best fusion performance, and decomposition levels beyond level-3 did not significantly affect the fusion results.
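Transform-domain fusion follows one pattern regardless of the transform: decompose, combine coefficients, invert. A minimal sketch with a one-level Haar wavelet on 1-D signals of even length, averaging the approximation coefficients and keeping the stronger detail coefficient (the paper's transforms are richer, but the rule is the same):

```python
def haar(sig):
    """One-level Haar decomposition of an even-length signal."""
    avg = [(a + b) / 2 for a, b in zip(sig[0::2], sig[1::2])]
    det = [(a - b) / 2 for a, b in zip(sig[0::2], sig[1::2])]
    return avg, det

def ihaar(avg, det):
    """Inverse of haar()."""
    out = []
    for a, d in zip(avg, det):
        out += [a + d, a - d]
    return out

def fuse(sig1, sig2):
    a1, d1 = haar(sig1)
    a2, d2 = haar(sig2)
    a = [(x + y) / 2 for x, y in zip(a1, a2)]   # average approximations
    d = [x if abs(x) >= abs(y) else y           # keep the stronger detail
         for x, y in zip(d1, d2)]
    return ihaar(a, d)
```

Fusing a signal with itself returns the signal unchanged, a quick sanity check that the decompose/combine/invert pipeline is lossless.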

  14. Analysis of Latency Performance of Bluetooth Low Energy (BLE) Networks

    PubMed Central

    Cho, Keuchul; Park, Woojin; Hong, Moonki; Park, Gisu; Cho, Wooseong; Seo, Jihoon; Han, Kijun

    2015-01-01

Bluetooth Low Energy (BLE) is a short-range wireless communication technology aiming at low-cost and low-power communication. The performance evaluation of classical Bluetooth device discovery has been intensively studied using analytical modeling and simulative methods, but these techniques are not applicable to BLE, since BLE has a fundamental change in the design of the discovery mechanism, including the usage of three advertising channels. Recently, several works have analyzed the topic of BLE device discovery, but these studies are still far from thorough. It is thus necessary to develop a new, accurate model for the BLE discovery process. In particular, the wide range of parameter settings gives BLE devices considerable latitude to customize their discovery performance. This motivates our study of modeling the BLE discovery process and performing intensive simulation. This paper is focused on building an analytical model to investigate the discovery probability, as well as the expected discovery latency, which are then validated via extensive experiments. Our analysis considers both continuous and discontinuous scanning modes. We analyze the sensitivity of these performance metrics to parameter settings to quantitatively examine to what extent parameters influence the performance of the discovery process. PMID:25545266

  15. Cross-industry Performance Modeling: Toward Cooperative Analysis

    SciTech Connect

    Reece, Wendy Jane; Blackman, Harold Stabler

    1998-10-01

    One of the current unsolved problems in human factors is the difficulty in acquiring information from lessons learned and data collected among human performance analysts in different domains. There are several common concerns and generally accepted issues of importance for human factors, psychology and industry analysts of performance and safety. Among these are the need to incorporate lessons learned in design, to carefully consider implementation of new designs and automation, and the need to reduce human performance-based contributions to risk. In spite of shared concerns, there are several roadblocks to widespread sharing of data and lessons learned from operating experience and simulation, including the fact that very few publicly accessible data bases exist (Gertman & Blackman, 1994, and Kirwan, 1997). There is a need to draw together analysts and analytic methodologies to comprise a centralized source of data with sufficient detail to be meaningful while ensuring source anonymity. We propose that a generic source of performance data and a multi-domain data store may provide the first steps toward cooperative performance modeling and analysis across industries.

  17. Performance analysis and visualization of electric power systems

    NASA Astrophysics Data System (ADS)

    Dong, Xuejiang; Shinozuka, Masanobu

    2003-08-01

    This paper describes a method of system performance evaluation for electric power networks. The basic element that plays a crucial role here is the fragility information for transmission system equipment. The method utilizes this fragility information to evaluate the system performance degradation of LADWP's (Los Angeles Department of Water and Power's) power network when damaged by a severe earthquake, by comparing its performance before and after the earthquake event. One of the highlights of this paper is the use of the computer code "PowerWorld" to visualize the state of power flow of the network, segment by segment. Similarly, the method can quantitatively evaluate the effect of various measures of rehabilitation or retrofit performed on equipment and/or facilities of the network. This is done by comparing the system performance with and without the rehabilitation. In this context, the results of experimental and analytical studies carried out by other researchers are used to determine the possible range of fragility enhancement associated with the rehabilitation of transformers by means of base-isolation systems. In this analysis, 47 scenario earthquakes are used to develop risk curves for LADWP's power transmission system. The risk curve can then be correlated to the economic impact of the reduction in power supply due to the earthquake. Recovery aspects of the damaged power system will be studied from this point of view in future work.

  18. Performance-based design and analysis of flexible composite propulsors

    NASA Astrophysics Data System (ADS)

    Motley, Michael R.; Young, Yin L.

    2011-11-01

    Advanced composite propellers, turbines, and jet engines have become increasingly popular, in part because of their ability to provide improved performance over traditional metallic rotors through exploitation of the intrinsic bend-twist coupling characteristics of anisotropic composite materials. While these performance improvements can be significant from a conceptual perspective, the load-dependent deformation responses of adaptive blades make the design of these structures highly non-trivial. Hence, it is necessary to understand and predict the dependence of the deformations on the geometry, material constitution, and fluid-structure interaction responses across the entire range of expected loading conditions. The objective of this work is to develop a probabilistic performance-based design and analysis methodology for flexible composite propulsors. To demonstrate the method, it is applied to the design and analysis of a (rigid) metallic propeller and a (flexible) composite propeller for a twin-shafted naval combatant craft. The probabilistic operational space is developed by considering the variation of vessel thrust requirements as a function of vessel speed and wave conditions, along with probabilistic speed profiles. The performances of the metallic and composite propellers are compared and discussed. The implications of load-dependent deformations of the flexible composite propeller on the operating conditions and the resulting performance with respect to propeller efficiency, power demand, and fluid cavitation are presented for both spatially uniform and varying flows. While the proposed framework is demonstrated for marine propellers, the methodology can be applied generally to any marine, aerospace, or wind energy structure that must operate over a wide range of loading conditions during its expected life.

  19. Results of a 24-inch Hybrid Motor Performance Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Sims, Joseph D.; Coleman, Hugh W.

    1998-01-01

    The subscale (11- and 24-inch) hybrid motors at the Marshall Space Flight Center (MSFC) have been used as versatile and cost-effective testbeds for developing new technology. Comparisons between motor configurations, ignition systems, feed systems, fuel formulations, and nozzle materials have been carried out without detailed consideration as to how "good" the motor performance data were. For the 250,000 lb thrust motor developed by the Hybrid Propulsion Demonstration Program consortium, this shortcoming is particularly risky because motor performance will likely be used as part of a set of downselect criteria to choose between competing ignition and feed systems under development. This analysis directly addresses that shortcoming by applying uncertainty analysis techniques to the experimental determination of the characteristic velocity, theoretical characteristic velocity, and characteristic velocity efficiency for a 24-inch motor firing. With the adoption of fuel-lined headends, flow restriction, and aft mixing chambers, state-of-the-art 24-inch hybrid motors have become very efficient. However, impossibly high combustion efficiencies (some computed as high as 108%) have been measured in some tests with 11-inch motors. This analysis has given new insight into explaining how these efficiencies were measured to be so high, and into which experimental measurements contribute the most to the overall uncertainty.

  20. An analysis of calendar performance in two autistic calendar savants.

    PubMed

    Kennedy, Daniel P; Squire, Larry R

    2007-08-01

    We acquired large data sets of calendar performance from two autistic calendar savants, DG and RN. An analysis of their errors and reaction times revealed that (1) both individuals had knowledge of calendar information from a limited range of years; (2) there was no evidence for the use of memorized anchor dates that could, by virtue of counting away from the anchors, allow correct responses to questions about other dates; and (3) the two individuals differed in their calendar knowledge, as well as in their ability to perform secondary tasks in which calendar knowledge was assessed indirectly. In view of the fact that there are only 14 possible annual calendars, we suggest that both savants worked by memorizing these 14 possible calendar arrangements. PMID:17686947
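
    The closing observation, that only 14 annual calendars exist, can be verified directly: a year's calendar is fully determined by the weekday of January 1 (7 options) and whether the year is a leap year (2 options). A short sketch enumerating a full 400-year Gregorian cycle:

```python
import calendar
import datetime

# Key that uniquely identifies a year's annual calendar layout.
def calendar_key(year):
    return (datetime.date(year, 1, 1).weekday(), calendar.isleap(year))

# One full Gregorian cycle (400 years) realizes every possible layout.
distinct = {calendar_key(y) for y in range(1600, 2000)}
print(len(distinct))  # 14
```

    Memorizing these 14 layouts (plus which years map to which layout) is therefore sufficient to answer any date question, consistent with the authors' suggestion.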

  1. Computational analysis of Variable Thrust Engine (VTE) performance

    NASA Technical Reports Server (NTRS)

    Giridharan, M. G.; Krishnan, A.; Przekwas, A. J.

    1993-01-01

    The Variable Thrust Engine (VTE) of the Orbital Maneuvering Vehicle (OMV) uses a hypergolic propellant combination of Monomethyl Hydrazine (MMH) and Nitrogen Tetroxide (NTO) as fuel and oxidizer, respectively. The performance of the VTE depends on a number of complex interacting phenomena such as atomization, spray dynamics, vaporization, turbulent mixing, convective/radiative heat transfer, and hypergolic combustion. This study involved the development of a comprehensive numerical methodology to facilitate detailed analysis of the VTE. An existing Computational Fluid Dynamics (CFD) code was extensively modified to include the following models: a two-liquid, two-phase Eulerian-Lagrangian spray model; a chemical equilibrium model; and a discrete ordinate radiation heat transfer model. The modified code was used to conduct a series of simulations to assess the effects of various physical phenomena and boundary conditions on the VTE performance. The details of the models and the results of the simulations are presented.

  2. Performance analysis of coherent wireless optical communications with atmospheric turbulence.

    PubMed

    Niu, Mingbo; Song, Xuegui; Cheng, Julian; Holzman, Jonathan F

    2012-03-12

    Coherent wireless optical communication systems with heterodyne detection are analyzed for binary phase-shift keying (BPSK), differential PSK (DPSK), and M-ary PSK over Gamma-Gamma turbulence channels. Closed-form error rate expressions are derived using a series expansion approach. It is shown that, in the special case of K-distributed turbulence channel, the DPSK incurs a 3 dB signal-to-noise ratio (SNR) penalty compared to BPSK in the large SNR regime. The outage probability is also obtained, and a detailed outage truncation error analysis is presented and used to assess the accuracy in system performance estimation. It is shown that our series error rate expressions are simple to use and highly accurate for practical system performance estimation. PMID:22418534
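
    The 3 dB SNR penalty of DPSK relative to BPSK in a fading channel has a well-known analogue that is easy to check numerically. The sketch below uses the standard closed-form average BER expressions for Rayleigh fading (not the paper's Gamma-Gamma or K-distributed analysis): at high average SNR g they approach 1/(4g) for BPSK and 1/(2g) for DPSK, i.e. DPSK needs about twice (3 dB) the SNR for the same error rate.

```python
import math

def ber_bpsk_rayleigh(g):
    # Average BER of coherent BPSK over Rayleigh fading, mean SNR g.
    return 0.5 * (1.0 - math.sqrt(g / (1.0 + g)))

def ber_dpsk_rayleigh(g):
    # Average BER of differentially detected BPSK over Rayleigh fading.
    return 1.0 / (2.0 * (1.0 + g))

g = 10 ** (30 / 10)  # 30 dB average SNR
ratio = ber_dpsk_rayleigh(g) / ber_bpsk_rayleigh(g)
print(round(ratio, 2))  # approaches 2 (a 3 dB gap) at high SNR
```

    The same asymptotic factor-of-two behavior is what the paper establishes, via series expansions, for the K-distributed turbulence channel.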

  3. Performance analysis of charge plasma based dual electrode tunnel FET

    NASA Astrophysics Data System (ADS)

    Anand, Sunny; Intekhab Amin, S.; Sarin, R. K.

    2016-05-01

    This paper proposes a charge plasma based dual electrode doping-less tunnel FET (DEDLTFET). The paper compares the device performance of the conventional doping-less TFET (DLTFET) and the doped TFET (DGTFET). DEDLTFET gives superior results, with a high ON-state current (ION ∼ 0.56 mA/μm), an ION/IOFF ratio of ∼ 9.12 × 10¹³ and an average subthreshold swing (AV-SS ∼ 48 mV/dec). Variations of different device parameters, such as channel length, gate oxide material, gate oxide thickness, silicon thickness, gate work function and temperature, are examined and compared with DLTFET and DGTFET. Through this extensive analysis it is found that DEDLTFET shows better performance than the other two devices, which indicates an excellent future in low-power applications.

  4. Using SWE Standards for Ubiquitous Environmental Sensing: A Performance Analysis

    PubMed Central

    Tamayo, Alain; Granell, Carlos; Huerta, Joaquín

    2012-01-01

    Although smartphone applications represent the most typical data consumer tool from the citizen perspective in environmental applications, they can also be used for in-situ data collection and production in varied scenarios, such as geological sciences and biodiversity. The use of standard protocols, such as SWE, to exchange information between smartphones and sensor infrastructures brings benefits such as interoperability and scalability, but their reliance on XML is a potential problem when large volumes of data are transferred, due to limited bandwidth and processing capabilities on mobile phones. In this article we present a performance analysis about the use of SWE standards in smartphone applications to consume and produce environmental sensor data, analysing to what extent the performance problems related to XML can be alleviated by using alternative uncompressed and compressed formats.
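
    The XML overhead problem the article analyzes can be illustrated with the standard library alone. The element names below are invented for the sketch (not taken from the actual SWE/O&M schemas); it simply shows how much of a verbose, repetitive XML payload generic compression can recover before transmission from a bandwidth-limited phone.

```python
import gzip

# A batch of verbose, repetitive SWE-style observations (names are
# illustrative placeholders, not the real schema).
xml = (
    '<om:OM_Observation xmlns:om="http://www.opengis.net/om/2.0">'
    '<om:phenomenonTime>2012-01-01T00:00:00Z</om:phenomenonTime>'
    '<om:result uom="Cel">21.5</om:result>'
    '</om:OM_Observation>'
) * 100

raw = xml.encode("utf-8")
packed = gzip.compress(raw)
print(len(raw), len(packed), f"ratio = {len(raw) / len(packed):.1f}x")
```

    Repetitive markup compresses very well, which is one reason compressed formats can alleviate, though not eliminate, the processing and bandwidth costs the article measures.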

  5. Fluid and thermal performance analysis of PMSM used for driving

    NASA Astrophysics Data System (ADS)

    Ding, Shuye; Cui, Guanghui; Li, Zhongyu; Guan, Tianyu

    2016-03-01

    The permanent magnet synchronous motor (PMSM) is widely used in ships under frequency conversion control systems. The fluid flow performance and temperature distribution of the PMSM are difficult to clarify due to its complex structure and variable frequency control condition. Therefore, in order to investigate the fluid and thermal characteristics of the PMSM, a 50 kW PMSM was taken as an example in this study, and a 3-D coupled fluid-thermal analysis model was established. The fluid and temperature fields were calculated by using the finite volume method. The cooling medium's properties, such as velocity, streamlines, and temperature, were then analyzed. The correctness of the proposed model and the rationality of the solution method were verified by a temperature test of the PMSM. In this study, the effects of the cooling medium's changing rheology on its performance and on the working temperature of the PMSM were revealed, which could be helpful for designing the PMSM.

  6. Performance analysis of fractional order extremum seeking control.

    PubMed

    Malek, Hadi; Dadras, Sara; Chen, YangQuan

    2016-07-01

    Extremum-seeking is a powerful adaptive technique to optimize steady-state system performance. In this paper, a novel extremum-seeking scheme for the optimization of nonlinear plants using fractional order calculus is proposed. The fractional order extremum-seeking algorithm only utilizes output measurements of the plant, yet it performs better in many respects, such as convergence speed and robustness. A detailed stability analysis is given to not only guarantee a faster convergence of the system to an adjustable neighborhood of the optimum but also confirm better robustness for the proposed algorithm. Furthermore, simulation and experimental results demonstrate that the fractional order extremum-seeking scheme for nonlinear systems outperforms the traditional integer order one. PMID:27000632

  7. Commissioning and Performance Analysis of WhisperGen Stirling Engine

    NASA Astrophysics Data System (ADS)

    Pradip, Prashant Kaliram

    Stirling engine based cogeneration systems have the potential to reduce energy consumption and greenhouse gas emissions, owing to their high cogeneration efficiency and the emission control afforded by steady external combustion. To date, most studies of this unit have focused on performance based on both experimentation and computer models, and lack experimental data for diversified operating ranges. This thesis starts with the commissioning of a WhisperGen Stirling engine with components and instrumentation to evaluate the power and thermal performance of the system. Next, a parametric study of primary engine variables, including air, diesel, and coolant flow rates and temperatures, was carried out to further understand their effect on engine power and efficiency. This trend was then validated with the thermodynamic model developed for the energy analysis of a Stirling cycle. Finally, the energy balance of the Stirling engine was compared without and with heat recovery from the engine block and the combustion chamber exhaust.

  8. Performance analysis of two high actuator count MEMS deformable mirrors

    NASA Astrophysics Data System (ADS)

    Ryan, Peter J.; Cornelissen, Steven A.; Lam, Charlie V.; Bierden, Paul A.

    2013-03-01

    Two new MEMS deformable mirrors have been designed and fabricated: one having a continuous facesheet with an active aperture of 20 mm and 2040 actuators, and the other a similarly sized segmented tip-tilt-piston DM containing 1021 elements and 3063 actuators. The surface figures, electromechanical performance, and actuator yield of these devices are reported here with statistical information. The statistical distributions of these measurements directly illustrate the surface variance of Boston Micromachines deformable mirrors. Measurements of the surface figure were also performed with the elements at different actuation states. Also presented here are deviations of the surface figure under actuation versus at its rest state, the electromechanical distribution, and a dynamic analysis.

  9. Performance analysis of a laser propelled interorbital transfer vehicle

    NASA Technical Reports Server (NTRS)

    Minovitch, M. A.

    1976-01-01

    The performance capabilities of a laser-propelled interorbital transfer vehicle receiving propulsive power from one ground-based transmitter were investigated. The laser transmits propulsive energy to the vehicle during successive station fly-overs. By applying a series of these propulsive maneuvers, large payloads can be economically transferred between low earth orbits and synchronous orbits. Operations involving the injection of large payloads onto escape trajectories are also studied. The duration of each successive engine burn must be carefully timed so that the vehicle reappears over the laser station to receive additional propulsive power within the shortest possible time. The analytical solution for determining these time intervals is presented, as is a solution to the problem of determining maximum injection payloads. Parametric computer analysis based on these optimization studies is presented. The results show that relatively low beam powers, on the order of 50 MW to 60 MW, produce significant performance capabilities.

  10. Performance analysis of a SOFC under direct internal reforming conditions

    NASA Astrophysics Data System (ADS)

    Janardhanan, Vinod M.; Heuveline, Vincent; Deutschmann, Olaf

    This paper presents the performance analysis of a planar solid-oxide fuel cell (SOFC) under direct internal reforming conditions. A detailed solid-oxide fuel cell model is used to study the influences of various operating parameters on cell performance. Significant differences in efficiency and power density are observed for isothermal and adiabatic operational regimes. The influence of air number, specific catalyst area, anode thickness, steam to carbon (s/c) ratio of the inlet fuel, and extent of pre-reforming on cell performance is analyzed. In all cases except for the case of pre-reformed fuel, adiabatic operation results in lower performance compared to isothermal operation. It is further discussed that, though direct internal reforming may lead to cost reduction and increased efficiency by effective utilization of waste heat, the efficiency of the fuel cell itself is higher for pre-reformed fuel compared to non-reformed fuel. Furthermore, criteria for the choice of optimal operating conditions for cell stacks operating under direct internal reforming conditions are discussed.

  11. Space mission scenario development and performance analysis tool

    NASA Technical Reports Server (NTRS)

    Kordon, Mark; Baker, John; Gilbert, John; Hanks, David

    2004-01-01

    This paper discusses a new and innovative approach for a rapid spacecraft multi-disciplinary performance analysis using a tool called the Mission Scenario Development Workbench (MSDW). To meet the needs of new classes of space missions, analysis tools with proven models were developed and integrated into a framework to enable rapid trades and analyses between spacecraft designs and operational scenarios during the formulation phase of a mission. Generally speaking, spacecraft resources are highly constrained on deep space missions and this approach makes it possible to maximize the use of existing resources to attain the best possible science return. This approach also has the potential benefit of reducing the risk of costly design changes made later in the design cycle necessary to meet the mission requirements by understanding system design sensitivities early and adding appropriate margins. This paper will describe the approach used by the Mars Science Laboratory Project to accomplish this result.

  12. Removing Grit During Wastewater Treatment: CFD Analysis of HDVS Performance.

    PubMed

    Meroney, Robert N; Sheker, Robert E

    2016-05-01

    Computational Fluid Dynamics (CFD) was used to simulate the grit and sand separation effectiveness of a typical hydrodynamic vortex separator (HDVS) system. The analysis examined the influences on the separator efficiency of: flow rate, fluid viscosities, total suspended solids (TSS), and particle size and distribution. It was found that separator efficiency for a wide range of these independent variables could be consolidated into a few curves based on the particle fall velocity to separator inflow velocity ratio, Ws/Vin. Based on CFD analysis it was also determined that systems of different sizes with length scale ratios ranging from 1 to 10 performed similarly when Ws/Vin and TSS were held constant. The CFD results have also been compared to a limited range of experimental data. PMID:27131307
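
    The collapse of the efficiency curves onto the fall-velocity-to-inflow-velocity ratio Ws/Vin can be made concrete with a back-of-the-envelope calculation. The sketch below uses the textbook Stokes settling law for a fine sand grain in water; the parameter values are illustrative assumptions, not figures from the paper, and Stokes' law is only approximate for grains this large.

```python
# Illustrative Stokes fall velocity for a sand grain (laminar regime assumed).
g = 9.81          # gravitational acceleration, m/s^2
rho_p = 2650.0    # particle density, kg/m^3 (quartz sand)
rho_f = 998.0     # fluid density, kg/m^3 (water)
mu = 1.0e-3       # dynamic viscosity, Pa*s
d = 150e-6        # particle diameter, m

ws = g * (rho_p - rho_f) * d**2 / (18.0 * mu)  # Stokes fall velocity, m/s
vin = 0.5                                      # assumed separator inflow velocity, m/s
print(f"Ws = {ws:.4f} m/s, Ws/Vin = {ws / vin:.4f}")
```

    Two separators of different size but equal Ws/Vin (and TSS) would, per the paper's finding, sit at the same point on the consolidated efficiency curve.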

  13. Transient analysis techniques in performing impact and crash dynamic studies

    NASA Technical Reports Server (NTRS)

    Pifko, A. B.; Winter, R.

    1989-01-01

    Because of the emphasis being placed on crashworthiness as a design requirement, increasing demands are being made by various organizations to analyze a wide range of complex structures that must perform safely when subjected to severe impact loads, such as those generated in a crash event. The ultimate goal of crashworthiness design and analysis is to produce vehicles with the ability to reduce the dynamic forces experienced by the occupants to specified levels, while maintaining a survivable envelope around them during a specified crash event. DYCAST is a nonlinear structural dynamic finite element computer code that evolved from the PLANS system of finite element programs for static nonlinear structural analysis. The essential features of DYCAST are outlined.

  14. [An analysis of maicaodi by high performance liquid chromatography].

    PubMed

    Yang, H; Chen, R; Jiang, M

    1997-05-01

    Maicaodi has recently been developed and produced by the pesticide plant of Nanjing Agricultural University. The quantitative analysis of its effective components, tribenuron methyl and R(-)napropamide, in the wettable powder of Maicaodi by a high performance liquid chromatographic method was carried out with a Lichrosorb Si-60 20 cm x 0.46 cm i.d. column, a mobile phase of petroleum ether/isopropanol/methanol/acetonitrile/chloroform mixed solvent (80:5:5:5:5) and diisooctyl phthalate as the internal standard. The sample was detected by ultraviolet absorption at 254 nm. The retention times of tribenuron methyl and R(-)napropamide were 10-11 min and 6-7 min, respectively. The coefficient of variation of this analysis was 0.34%, with a recovery of 99.51%-100.32%. The coefficient of linear correlation was 0.9999. PMID:15739379

  15. A Divergence Statistics Extension to VTK for Performance Analysis.

    SciTech Connect

    Pebay, Philippe Pierre; Bennett, Janine Camille

    2015-02-01

    This report follows the series of previous documents [PT08, BPRT09b, PT09, BPT09, PT10, PB13], where we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order and auto-correlative statistics engines which we developed within the Visualization Tool Kit (VTK) as a scalable, parallel and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.
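
    The divergence idea, a distance-like discrepancy between an observed empirical distribution and a theoretical one, can be sketched with the Kullback-Leibler divergence, one common choice (the VTK engine's exact divergence measures may differ):

```python
import math
from collections import Counter

def kl_divergence(p, q):
    """KL divergence D(p || q) for dicts mapping outcome -> probability.

    q must assign nonzero probability to every outcome in p's support.
    """
    return sum(pi * math.log(pi / q[x]) for x, pi in p.items() if pi > 0)

observed = Counter("aabacbaaab")            # empirical sample
n = sum(observed.values())
p = {x: c / n for x, c in observed.items()} # observed distribution
q = {"a": 0.5, "b": 0.3, "c": 0.2}          # theoretical "ideal" model
print(round(kl_divergence(p, q), 4))
```

    A value of zero means the empirical data match the model exactly; larger values quantify the discrepancy, which is precisely the use case the engine targets in performance analysis.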

  16. Theoretical analysis of the performance of a foam fractionation column

    PubMed Central

    Tobin, S. T.; Weaire, D.; Hutzler, S.

    2014-01-01

    A model system for theory and experiment which is relevant to foam fractionation consists of a column of foam moving through an inverted U-tube between two pools of surfactant solution. The foam drainage equation is used for a detailed theoretical analysis of this process. In a previous paper, we focused on the case where the lengths of the two legs are large. In this work, we examine the approach to the limiting case (i.e. the effects of finite leg lengths) and how it affects the performance of the fractionation column. We also briefly discuss some alternative set-ups that are of interest in industry and experiment, with numerical and analytical results to support them. Our analysis is shown to be generally applicable to a range of fractionation columns. PMID:24808752

  17. Performance Analysis of Visible Light Communication Using CMOS Sensors.

    PubMed

    Do, Trong-Hop; Yoo, Myungsik

    2016-01-01

    This paper elucidates the fundamentals of visible light communication systems that use the rolling shutter mechanism of CMOS sensors. All related information involving different subjects, such as photometry, camera operation, photography and image processing, are studied in tandem to explain the system. Then, the system performance is analyzed with respect to signal quality and data rate. To this end, a measure of signal quality, the signal to interference plus noise ratio (SINR), is formulated. Finally, a simulation is conducted to verify the analysis. PMID:26938535
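
    The signal-quality measure named above follows the generic SINR definition: received signal power divided by the sum of interference and noise power. A minimal sketch in those terms, with illustrative placeholder power values rather than measured figures from the paper:

```python
import math

def sinr_db(signal_w, interference_w, noise_w):
    """SINR in decibels from powers expressed in watts."""
    return 10 * math.log10(signal_w / (interference_w + noise_w))

# Placeholder received powers for a rolling-shutter VLC link.
print(round(sinr_db(2e-6, 3e-7, 1e-7), 2))  # -> 6.99 dB
```

    In the rolling-shutter setting, interference would come from neighboring image rows and ambient light; the paper's contribution is formulating those terms for CMOS sensor operation.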

  19. Analysis of Random Segment Errors on Coronagraph Performance

    NASA Technical Reports Server (NTRS)

    Shaklan, Stuart B.; N'Diaye, Mamadou; Stahl, Mark T.; Stahl, H. Philip

    2016-01-01

    At 2015 SPIE O&P we presented "Preliminary Analysis of Random Segment Errors on Coronagraph Performance". Key findings: contrast leakage for a 4th-order Sinc^2(X) coronagraph is 10X more sensitive to random segment piston than to random tip/tilt, and apertures with fewer segments (i.e., 1 ring) or very many segments (> 16 rings) have less contrast leakage as a function of piston or tip/tilt than apertures with 2 to 4 rings of segments. Revised finding: piston is only 2.5X more sensitive than tip/tilt.

  20. Active charge/passive discharge solar heating systems: Thermal analysis and performance comparisons

    NASA Astrophysics Data System (ADS)

    Swisher, J.

    1981-06-01

    This type of system combines liquid-cooled solar collector panels with a massive integral storage component that passively heats the building interior by radiation and free convection. The TRNSYS simulation program is used to evaluate system performance and to provide input for the development of a simplified analysis method. This method, which provides monthly calculations of delivered solar energy, is based on Klein's Phi-bar procedure and data from hourly TRNSYS simulations. The method can be applied to systems using a floor slab, a structural wall, or a water tank as the storage component. Important design parameters include collector area and orientation, building heat loss, collector and heat exchanger efficiencies, storage capacity, and storage to room coupling. Performance simulation results are used for comparisons with active and passive solar designs.

  1. Effects of specified performance criterion and performance feedback on staff behavior: a component analysis.

    PubMed

    Hardesty, Samantha L; Hagopian, Louis P; McIvor, Melissa M; Wagner, Leaora L; Sigurdsson, Sigurdur O; Bowman, Lynn G

    2014-09-01

    The present study isolated the effects of frequently used staff training intervention components to increase communication between direct care staff and clinicians working on an inpatient behavioral unit. Written "protocol review" quizzes developed by clinicians were designed to assess knowledge about a patient's behavioral protocols. Direct care staff completed these at the beginning of each day and evening shift. Clinicians were required to score and discuss these protocol reviews with direct care staff for at least 75% of shifts over a 2-week period. During baseline, only 21% of clinicians met this requirement. Completing and scoring of protocol reviews did not improve following additional in-service training (M = 15%) or following an intervention aimed at decreasing response effort combined with prompting (M = 28%). After implementing an intervention involving specified performance criterion and performance feedback, 86% of clinicians reached the established goal. Results of a component analysis suggested that the presentation of both the specified performance criterion and supporting contingencies was necessary to maintain acceptable levels of performance. PMID:24928213

  2. Voxel model in BNCT treatment planning: performance analysis and improvements

    NASA Astrophysics Data System (ADS)

    González, Sara J.; Carando, Daniel G.; Santa Cruz, Gustavo A.; Zamenhof, Robert G.

    2005-02-01

    In recent years, many efforts have been made to study the performance of treatment planning systems in deriving an accurate dosimetry of the complex radiation fields involved in boron neutron capture therapy (BNCT). The computational model of the patient's anatomy is one of the main factors involved in this subject. This work presents a detailed analysis of the performance of the 1 cm based voxel reconstruction approach. First, a new and improved material assignment algorithm implemented in NCTPlan treatment planning system for BNCT is described. Based on previous works, the performances of the 1 cm based voxel methods used in the MacNCTPlan and NCTPlan treatment planning systems are compared by standard simulation tests. In addition, the NCTPlan voxel model is benchmarked against in-phantom physical dosimetry of the RA-6 reactor of Argentina. This investigation shows the 1 cm resolution to be accurate enough for all reported tests, even in the extreme cases such as a parallelepiped phantom irradiated through one of its sharp edges. This accuracy can be degraded at very shallow depths in which, to improve the estimates, the anatomy images need to be positioned in a suitable way. Rules for this positioning are presented. The skin is considered one of the organs at risk in all BNCT treatments and, in the particular case of cutaneous melanoma of extremities, limits the delivered dose to the patient. Therefore, the performance of the voxel technique is deeply analysed in these shallow regions. A theoretical analysis is carried out to assess the distortion caused by homogenization and material percentage rounding processes. Then, a new strategy for the treatment of surface voxels is proposed and tested using two different irradiation problems. For a parallelepiped phantom perpendicularly irradiated with a 5 keV neutron source, the large thermal neutron fluence deviation present at shallow depths (from 54% at 0 mm depth to 5% at 4 mm depth) is reduced to 2% on average

  3. Aerocapture Performance Analysis of A Venus Exploration Mission

    NASA Technical Reports Server (NTRS)

    Starr, Brett R.; Westhelle, Carlos H.

    2005-01-01

    A performance analysis of a Discovery Class Venus Exploration Mission in which aerocapture is used to capture a spacecraft into a 300km polar orbit for a two year science mission has been conducted to quantify its performance. A preliminary performance assessment determined that a high heritage 70 sphere-cone rigid aeroshell with a 0.25 lift to drag ratio has adequate control authority to provide an entry flight path angle corridor large enough for the mission s aerocapture maneuver. A 114 kilograms per square meter ballistic coefficient reference vehicle was developed from the science requirements and the preliminary assessment s heating indicators and deceleration loads. Performance analyses were conducted for the reference vehicle and for sensitivity studies on vehicle ballistic coefficient and maximum bank rate. The performance analyses used a high fidelity flight simulation within a Monte Carlo executive to define the aerocapture heating environment and deceleration loads and to determine mission success statistics. The simulation utilized the Program to Optimize Simulated Trajectories (POST) that was modified to include Venus specific atmospheric and planet models, aerodynamic characteristics, and interplanetary trajectory models. In addition to Venus specific models, an autonomous guidance system, HYPAS, and a pseudo flight controller were incorporated in the simulation. The Monte Carlo analyses incorporated a reference set of approach trajectory delivery errors, aerodynamic uncertainties, and atmospheric density variations. The reference performance analysis determined the reference vehicle achieves 100% successful capture and has a 99.87% probability of attaining the science orbit with a 90 meters per second delta V budget for post aerocapture orbital adjustments. 
A ballistic coefficient trade study conducted with reference uncertainties determined that the 0.25 L/D vehicle can achieve 100% successful capture with a ballistic coefficient of 228 kilograms
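
    The Monte Carlo success statistics quoted above (capture probability, probability of attaining the science orbit within a delta-V budget) can be sketched in miniature. The surrogate below replaces the full POST flight simulation with an assumed dispersion model; the `simulate_capture` helper, its distribution parameters, and the seeds are all hypothetical, purely to show how such percentages are tabulated.

    ```python
    import random

    def simulate_capture(seed, dv_budget=90.0):
        """Toy surrogate for one dispersed aerocapture trajectory.
        A real analysis flies a full simulation per sample; here the
        post-capture delta-V is drawn from an assumed distribution."""
        rng = random.Random(seed)
        dv_needed = max(0.0, rng.gauss(40.0, 15.0))  # m/s, assumed dispersion
        captured = rng.random() > 1e-4               # assumed capture reliability
        return captured, captured and dv_needed <= dv_budget

    def monte_carlo(n=2000):
        captured = in_orbit = 0
        for seed in range(n):
            c, ok = simulate_capture(seed)
            captured += c
            in_orbit += ok
        return captured / n, in_orbit / n

    p_capture, p_orbit = monte_carlo()
    print(f"capture: {p_capture:.2%}, science orbit: {p_orbit:.2%}")
    ```

    The quoted 100% / 99.87% figures are exactly this kind of tally, taken over dispersed samples of delivery error, aerodynamics, and atmospheric density.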

  4. WRF model performance analysis for a suite of simulation design

    NASA Astrophysics Data System (ADS)

    Mohan, Manju; Sati, Ankur Prabhat

    2016-03-01

    At present, scientists successfully use Numerical Weather Prediction (NWP) models to achieve reliable forecasts. Nested domains with varying grid ratios are preferred by the modelling community and have wide applications. The impact of the nesting grid ratio (NGR) on model performance needs systematic analysis and is explored in the present study. WRF is mostly used as a mesoscale model to simulate either extreme events or events of short duration, with statistical model evaluation reported for a correspondingly short period of time. Thus, the influence of the simulation period on model performance has been examined for key meteorological parameters. Several earlier studies of episodes ran the model for longer durations, often as one continuous simulation. This study scrutinizes the influence on model performance of one single simulation versus several smaller simulations covering the same duration; essentially splitting the run-time. In the present study, the surface wind (i.e., winds at 10 meters) and the temperature and relative humidity at 2 meters obtained from model simulations are compared with observations. Sensitivity to nesting grid ratio, to continuous versus smaller split simulations, and to a realistic simulation period is examined. It is found that there is no statistically significant difference in the simulated results on changing the nesting grid ratio, while the smaller time-split schemes (2-day and 4-day schemes compared with 8-day and 16-day continuous runs) improve the results significantly. The impact of an increasing number of observations from different sites on model performance is also scrutinised. Furthermore, a conceptual framework is provided for the optimum time period for simulations to have confidence in statistical model evaluation.
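
    The statistical model evaluation referred to above typically rests on a few standard scores computed against observations. A minimal sketch, assuming synthetic 2 m temperature series (both the observation and model values below are made up):

    ```python
    import math

    def eval_stats(model, obs):
        """Standard model-evaluation statistics (mean bias, RMSE, Pearson r)
        of the kind used to compare simulation designs against observations."""
        n = len(model)
        bias = sum(m - o for m, o in zip(model, obs)) / n
        rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / n)
        mm = sum(model) / n
        mo = sum(obs) / n
        cov = sum((m - mm) * (o - mo) for m, o in zip(model, obs))
        var_m = sum((m - mm) ** 2 for m in model)
        var_o = sum((o - mo) ** 2 for o in obs)
        r = cov / math.sqrt(var_m * var_o)
        return bias, rmse, r

    obs   = [25.1, 26.0, 27.3, 26.8, 25.5]   # e.g. observed 2 m temperature, deg C
    model = [24.6, 25.8, 27.9, 27.1, 25.0]   # hypothetical simulated values
    print(eval_stats(model, obs))
    ```

    A "statistically significant difference" between schemes is then a test (e.g. a paired test) on such scores, not a visual comparison of the time series.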

  5. Performance analysis of jump-gliding locomotion for miniature robotics.

    PubMed

    Vidyasagar, A; Zufferey, Jean-Christophe; Floreano, Dario; Kovač, M

    2015-04-01

    Recent work suggests that jumping locomotion in combination with a gliding phase can be used as an effective mobility principle in robotics. Compared to pure jumping without a gliding phase, the potential benefits of hybrid jump-gliding locomotion include the ability to extend the distance travelled and to reduce the potentially damaging impact forces upon landing. This publication evaluates the performance of jump-gliding locomotion and provides models for the analysis of the relevant dynamics of flight. It also defines a jump-gliding envelope that encompasses the range achievable with jump-gliding robots and that can be used to evaluate the performance and improvement potential of jump-gliding robots. We first present a planar dynamic model and then a simplified closed-form model, which allow quantification of the distance travelled and the impact energy on landing. In order to validate the predictions of these models, we performed experiments with a novel jump-gliding robot, named the 'EPFL jump-glider'. It has a mass of 16.5 g and is able to perform jumps from elevated positions, perform steered gliding flight, land safely and traverse on the ground by repetitive jumping. The experiments indicate that the developed jump-gliding model fits very well with the flight data measured using the EPFL jump-glider, confirming the benefits of jump-gliding locomotion for mobile robotics. The jump-gliding envelope considerations indicate that the EPFL jump-glider, when traversing from a 2 m height, reaches 74.3% of the optimal jump-gliding distance, compared to pure jumping without a gliding phase, which reaches only 33.4% of the optimal jump-gliding distance. Methods of further improving flight performance, based on the models and on inspiration from biological systems, are presented, providing mechanical design pathways for future jump-gliding robot designs. PMID:25811417
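
    A simplified closed-form model of the kind described can be sketched as a drag-free ballistic jump followed by a straight glide at a constant glide ratio from apex. This is an illustrative point-mass simplification under assumed numbers, not the paper's actual model:

    ```python
    import math

    G = 9.81  # m/s^2

    def ballistic_range(v0, theta_deg, h0=0.0):
        """Horizontal distance of a point-mass jump launched at speed v0 (m/s)
        and angle theta from height h0 (m), neglecting drag."""
        th = math.radians(theta_deg)
        vx, vy = v0 * math.cos(th), v0 * math.sin(th)
        # time to land: solve h0 + vy*t - G*t^2/2 = 0 for the positive root
        t = (vy + math.sqrt(vy * vy + 2 * G * h0)) / G
        return vx * t

    def jump_glide_range(v0, theta_deg, h0, glide_ratio):
        """Crude jump-glide estimate: ballistic flight to apex, then a straight
        glide at a constant glide (lift-to-drag) ratio down to the ground."""
        th = math.radians(theta_deg)
        vx, vy = v0 * math.cos(th), v0 * math.sin(th)
        t_apex = vy / G
        x_apex = vx * t_apex
        h_apex = h0 + vy * vy / (2 * G)
        return x_apex + glide_ratio * h_apex

    print(ballistic_range(6.0, 45.0, h0=2.0))                      # pure jump
    print(jump_glide_range(6.0, 45.0, h0=2.0, glide_ratio=1.5))    # jump + glide
    ```

    Even this crude model shows the qualitative result: from an elevated start, adding a glide phase extends range roughly in proportion to the apex height times the glide ratio.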

  6. A Preliminary Analysis of LANDSAT-4 Thematic Mapper Radiometric Performance

    NASA Technical Reports Server (NTRS)

    Justice, C.; Fusco, L.; Mehl, W.

    1984-01-01

    Analysis was performed to characterize the radiometry of three Thematic Mapper (TM) digital products of a scene of Arkansas. The three digital products examined were the NASA raw (BT) product, the radiometrically corrected (AT) product and the radiometrically and geometrically corrected (PT) product. The frequency distribution of the digital data, the statistical correlation between the bands, and the variability between the detectors within a band were examined on a series of image subsets from the full scene. The results are presented for one 1024 x 1024 pixel subset of Reelfoot Lake, Tennessee, which displayed a representative range of ground conditions and cover types occurring within the full frame image. Bands 1, 2 and 5 of the sample area are presented. The subsets were extracted from the three digital data products to cover the same geographic area. This analysis provides the first step towards a full appraisal of the TM radiometry being performed as part of the ESA/CEC contribution to the NASA/LIDQA program.

  7. A conceptual design tool for RBCC engine performance analysis

    NASA Astrophysics Data System (ADS)

    Olds, John R.; Saks, Greg

    1997-01-01

    Future reusable launch vehicles will depend on new propulsion technologies to lower system operational costs while maintaining adequate performance. Recently, a number of vehicle systems utilizing rocket-based combined-cycle (RBCC) propulsion have been proposed as possible low-cost space launch solutions. Vehicles using RBCC propulsion have the potential to combine the best aspects of airbreathing propulsion (high average Isp) with the best aspects of rocket propulsion (high propellant bulk density and engine T/W). Proper conceptual assessment of each proposed vehicle will require computer-based tools that allow for quick and cheap, yet sufficiently accurate disciplinary analyses. At Georgia Tech, a spreadsheet-based tool has been developed that uses quasi-1D flow analysis with component efficiencies to parametrically model RBCC engine performance in ejector, fan-ramjet, ramjet and pure rocket modes. The technique is similar to an earlier RBCC modeling technique developed by the Marquardt Corporation in the mid-1960's. For a given sea-level static thrust requirement, the current tool generates engine weight and size data, as well as Isp and thrust data vs. altitude and Mach number. The latter is output in tabular form for use in a trajectory optimization program. This paper reviews the current state of the RBCC analysis tool and the effort to upgrade it from a Microsoft Excel spreadsheet to a design-oriented UNIX program in C suitable for integration into a multidisciplinary design optimization (MDO) framework.

  8. Topology design and performance analysis of an integrated communication network

    NASA Technical Reports Server (NTRS)

    Li, V. O. K.; Lam, Y. F.; Hou, T. C.; Yuen, J. H.

    1985-01-01

    A research study on the topology design and performance analysis for the Space Station Information System (SSIS) network is conducted. It begins with a survey of existing research efforts in network topology design. Then a new approach for topology design is presented. It uses an efficient algorithm to generate candidate network designs (consisting of subsets of the set of all network components) in increasing order of their total costs, and checks each design to see if it forms an acceptable network. This technique gives the true cost-optimal network, and is particularly useful when the network has many constraints and not too many components. The algorithm for generating subsets is described in detail, and various aspects of the overall design procedure are discussed. Two more efficient versions of this algorithm (applicable in specific situations) are also given. Next, two important aspects of network performance analysis are discussed: network reliability and message delays. A new model is introduced to study the reliability of a network with dependent failures. For message delays, a collection of formulas from existing research results is given to compute or estimate the delays of messages in a communication network without making the independence assumption. The design algorithm, coded in PASCAL, is included as an appendix.
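
    The core idea of generating candidate designs in increasing order of total cost can be sketched with a best-first (heap-driven) enumeration of component subsets. This is one standard way to realize such an enumeration, not necessarily the paper's exact algorithm (which was coded in PASCAL); in a real design loop each popped subset would be checked for network feasibility, stopping at the first acceptable one.

    ```python
    import heapq

    def subsets_by_cost(costs):
        """Yield (total_cost, index_subset) pairs for every subset of `costs`,
        in nondecreasing order of total cost.  Each popped subset spawns two
        successors over the cost-sorted components: append the next component,
        or swap the last component for the next; every subset is produced once."""
        order = sorted(range(len(costs)), key=costs.__getitem__)
        c = [costs[i] for i in order]
        yield 0.0, frozenset()            # empty design is the cheapest
        if not c:
            return
        heap = [(c[0], (0,))]
        while heap:
            s, idx = heapq.heappop(heap)
            yield s, frozenset(order[i] for i in idx)
            last = idx[-1]
            if last + 1 < len(c):
                heapq.heappush(heap, (s + c[last + 1], idx + (last + 1,)))
                heapq.heappush(heap, (s - c[last] + c[last + 1], idx[:-1] + (last + 1,)))

    costs = [4.0, 1.0, 3.0]               # hypothetical component costs
    for total, subset in subsets_by_cost(costs):
        print(total, sorted(subset))
    ```

    Because designs emerge cheapest-first, the first subset that passes the acceptability check is guaranteed cost-optimal, which is exactly the property the abstract claims.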

  9. A conceptual design tool for RBCC engine performance analysis

    SciTech Connect

    Olds, J.R.; Saks, G.

    1997-01-01

    Future reusable launch vehicles will depend on new propulsion technologies to lower system operational costs while maintaining adequate performance. Recently, a number of vehicle systems utilizing rocket-based combined-cycle (RBCC) propulsion have been proposed as possible low-cost space launch solutions. Vehicles using RBCC propulsion have the potential to combine the best aspects of airbreathing propulsion (high average Isp) with the best aspects of rocket propulsion (high propellant bulk density and engine T/W). Proper conceptual assessment of each proposed vehicle will require computer-based tools that allow for quick and cheap, yet sufficiently accurate disciplinary analyses. At Georgia Tech, a spreadsheet-based tool has been developed that uses quasi-1D flow analysis with component efficiencies to parametrically model RBCC engine performance in ejector, fan-ramjet, ramjet and pure rocket modes. The technique is similar to an earlier RBCC modeling technique developed by the Marquardt Corporation in the mid-1960's. For a given sea-level static thrust requirement, the current tool generates engine weight and size data, as well as Isp and thrust data vs. altitude and Mach number. The latter is output in tabular form for use in a trajectory optimization program. This paper reviews the current state of the RBCC analysis tool and the effort to upgrade it from a Microsoft Excel spreadsheet to a design-oriented UNIX program in C suitable for integration into a multidisciplinary design optimization (MDO) framework. © 1997 American Institute of Physics.

  10. Autotasked Performance in the NAS Workload: A Statistical Analysis

    NASA Technical Reports Server (NTRS)

    Carter, R. L.; Stockdale, I. E.; Kutler, Paul (Technical Monitor)

    1998-01-01

    A statistical analysis of the workload performance of a production quality FORTRAN code for five different Cray Y-MP hardware and system software configurations is performed. The analysis was based on an experimental procedure that was designed to minimize correlations between the number of requested CPUs and the time of day the runs were initiated. Observed autotasking overheads were significantly larger for the set of jobs that requested the maximum number of CPUs. UNICOS 6 releases show consistent wall-clock speedups in the workload of around 2, which is quite good. The observed speedups were very similar for the set of jobs that requested 8 CPUs and the set that requested 4 CPUs. The original NAS algorithm for determining charges to the user discourages autotasking in the workload. A new charging algorithm to be applied to jobs run in the NQS multitasking queues also discourages NAS users from using autotasking. The new algorithm favors jobs requesting 8 CPUs over those that request fewer, although the jobs requesting 8 CPUs experienced significantly higher overhead and presumably degraded system throughput. A charging algorithm is presented that has the following desirable characteristic when applied to the data: higher-overhead jobs requesting 8 CPUs are penalized compared to moderate-overhead jobs requesting 4 CPUs, thereby providing a charging incentive for NAS users to use autotasking in a manner that provides them with significantly improved turnaround while also maintaining system throughput.

  11. Ultra performance liquid chromatography tandem mass spectrometry performance evaluation for analysis of antibiotics in natural waters.

    PubMed

    Tamtam, Fatima; Mercier, Fabien; Eurin, Joëlle; Chevreuil, Marc; Le Bot, Barbara

    2009-03-01

    An ultra performance liquid chromatography electrospray tandem mass spectrometry (UPLC/MS/MS) method was developed and validated for the determination of 17 antibiotics in natural waters in a single extraction and chromatographic procedure. Gradient separation conditions were optimised for 17 compounds belonging to five different antibiotic groups: quinolones (oxolinic acid, nalidixic acid, pipemidic acid, flumequine), fluoroquinolones (enoxacin, ciprofloxacin, norfloxacin, ofloxacin, enrofloxacin, sarafloxacin, danofloxacin, difloxacin, lomefloxacin), sulphonamides (sulphamethoxazole, sulphamethazine), nitro-imidazoles (ornidazole) and diaminopyrimidines (trimethoprim). The separation of all compounds, obtained using a 1.7 microm particle size column (100 mm x 2.1 mm), was achieved within 10 min. Water samples were adjusted to pH 7 and extracted using Oasis hydrophilic-lipophilic balance (HLB) solid phase extraction cartridges. After elution with methanol and concentration, extracts were injected onto a C18 column (Acquity UPLC BEH C18) and detected by tandem mass spectrometry. Average recovery from 100 ng L(-1) fortified samples was higher than 70% for most of the compounds, with relative standard deviations below 20%. Performance characteristics of the method (recoveries, detection limit, quantification limit and relative standard deviation) and matrix effects were studied, and the results showed that the method was suitable for routine analysis of antibiotics in surface water. Analysis of samples from the Seine River (France) confirmed the relevance of evaluating antibiotic contamination in that area. PMID:19148627

  12. Correlation analysis between ionospheric scintillation levels and receiver tracking performance

    NASA Astrophysics Data System (ADS)

    Sreeja, V.; Aquino, M.; Elmas, Z. G.; Forte, B.

    2012-06-01

    Rapid fluctuations in the amplitude and phase of a transionospheric radio signal caused by small scale plasma density irregularities in the ionosphere are known as scintillation. Scintillation can seriously impair a GNSS (Global Navigation Satellite Systems) receiver tracking performance, thus affecting the required levels of availability, accuracy and integrity, and consequently the reliability of modern day GNSS based applications. This paper presents an analysis of correlation between scintillation levels and tracking performance of a GNSS receiver for GPS L1C/A, L2C and GLONASS L1, L2 signals. The analyses make use of data recorded over Presidente Prudente (22.1°S, 51.4°W, dip latitude ˜12.3°S) in Brazil, a location close to the Equatorial Ionisation Anomaly (EIA) crest in Latin America. The study presents for the first time this type of correlation analysis for GPS L2C and GLONASS L1, L2 signals. The scintillation levels are defined by the amplitude scintillation index, S4 and the receiver tracking performance is evaluated by the phase tracking jitter. Both S4 and the phase tracking jitter are estimated from the post correlation In-Phase (I) and Quadra-Phase (Q) components logged by the receiver at a high rate. Results reveal that the dependence of the phase tracking jitter on the scintillation levels can be represented by a quadratic fit for the signals. The results presented in this paper are of importance to GNSS users, especially in view of the forthcoming high phase of solar cycle 24 (predicted for 2013).
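
    The reported quadratic relationship between phase tracking jitter and the S4 index can be illustrated with an ordinary least-squares fit. The (S4, jitter) samples below are synthetic stand-ins for values estimated from high-rate post-correlation I/Q data; the fitted coefficients are illustrative only, not the paper's results.

    ```python
    import numpy as np

    # Hypothetical scintillation samples: amplitude index S4 (dimensionless)
    # versus receiver phase tracking jitter (degrees, synthetic values).
    s4     = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7])
    jitter = np.array([2.1, 2.4, 3.0, 3.9, 5.2, 6.8, 8.9])

    # Quadratic fit: jitter ~ a*S4^2 + b*S4 + c
    a, b, c = np.polyfit(s4, jitter, deg=2)
    predict = lambda x: a * x**2 + b * x + c
    print(f"jitter ~ {a:.2f}*S4^2 + {b:.2f}*S4 + {c:.2f}")
    print(predict(0.45))
    ```

    The positive curvature (a > 0) captures the key operational message: tracking degrades faster than linearly as scintillation strengthens toward solar maximum.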

  13. Design and Performance Analysis of Incremental Networked Predictive Control Systems.

    PubMed

    Pang, Zhong-Hua; Liu, Guo-Ping; Zhou, Donghua

    2016-06-01

    This paper is concerned with the design and performance analysis of networked control systems with network-induced delay, packet disorder, and packet dropout. Based on the incremental form of the plant input-output model and an incremental error feedback control strategy, an incremental networked predictive control (INPC) scheme is proposed to actively compensate for the round-trip time delay resulting from the above communication constraints. The output tracking performance and closed-loop stability of the resulting INPC system are considered for two cases: 1) plant-model match case and 2) plant-model mismatch case. For the former case, the INPC system can achieve the same output tracking performance and closed-loop stability as those of the corresponding local control system. For the latter case, a sufficient condition for the stability of the closed-loop INPC system is derived using the switched system theory. Furthermore, for both cases, the INPC system can achieve a zero steady-state output tracking error for step commands. Finally, both numerical simulations and practical experiments on an Internet-based servo motor system illustrate the effectiveness of the proposed method. PMID:26186798

  14. Instantaneous BeiDou-GPS attitude determination: A performance analysis

    NASA Astrophysics Data System (ADS)

    Nadarajah, Nandakumaran; Teunissen, Peter J. G.; Raziq, Noor

    2014-09-01

    The advent of modernized and new global navigation satellite systems (GNSS) has enhanced the availability of satellite based positioning, navigation, and timing (PNT) solutions. Specifically, it increases redundancy and yields operational back-up or independence in case of failure or unavailability of one system. Among existing GNSS, the Chinese BeiDou system (BDS) is being developed and will consist of geostationary (GEO) satellites, inclined geosynchronous orbit (IGSO) satellites, and medium-Earth-orbit (MEO) satellites. In this contribution, a BeiDou-GPS robustness analysis is carried out for instantaneous, unaided attitude determination. Precise attitude determination using multiple GNSS antennas mounted on a platform relies on the successful resolution of the integer carrier phase ambiguities. The constrained Least-squares AMBiguity Decorrelation Adjustment (C-LAMBDA) method has been developed for the quadratically constrained GNSS compass model that incorporates the known baseline length. In this contribution the method is used to analyse the attitude determination performance when using the GPS and BeiDou systems. The attitude determination performance is evaluated using GPS/BeiDou data sets from a real data campaign in Australia spanning several days. The study includes the performance analyses of both stand-alone and mixed constellation (GPS/BeiDou) attitude estimation under various satellite deprived environments. We demonstrate and quantify the improved availability and accuracy of attitude determination using the combined constellation.

  15. A convolution integral approach for performance assessments with uncertainty analysis

    SciTech Connect

    Dawoud, E.; Miller, L.F.

    1999-09-01

    Performance assessments that include uncertainty analyses and risk assessments are typically not obtained for time-dependent releases of radioactive contaminants to the geosphere when a series of sequentially coupled transport models is required for determining results. This is due, in part, to the geophysical complexity of the site, and to the numerical complexity of the fate and transport models. The lack of a practical tool for linking the transport models in a fashion that facilitates uncertainty analysis is another reason for not performing uncertainty analyses in these studies. The multiconvolution integral (MCI) approach presented herein greatly facilitates the practicality of incorporating uncertainty analyses into performance assessments. In this research an MCI approach is developed, and the decoupling of fate and transport processes into an independent system is described. A conceptual model, extracted from the Inactive Tanks project at the Oak Ridge National Laboratory (ORNL), is used to demonstrate the approach. Numerical models are used for transport of {sup 90}Sr from a disposal facility, WC-1 at ORNL, through the vadose and saturated zones to a downgradient point at Fifth Creek, and an analytical surface water model is used to transport the contaminants to a downstream potential receptor point at White Oak Creek. The probability density functions of the final concentrations obtained by the MCI approach are in excellent agreement with those obtained by a Monte Carlo approach that propagated uncertainties through all submodels for each random sample.
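
    The convolution idea can be shown in miniature: for independent sequential transport stages, the PDF of the total travel time is the convolution of the stage PDFs, so uncertainty propagates through the chain without re-running coupled models per sample. The Gaussian stage distributions below are assumed purely for illustration, not taken from the ORNL study.

    ```python
    import numpy as np

    dt = 0.01
    t = np.arange(0.0, 10.0, dt)

    def gauss(t, mu, sigma):
        """Normalised Gaussian PDF evaluated on the time grid."""
        return np.exp(-0.5 * ((t - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

    p1 = gauss(t, 2.0, 0.3)   # assumed stage-1 travel-time PDF (e.g. vadose zone)
    p2 = gauss(t, 3.0, 0.4)   # assumed stage-2 travel-time PDF (e.g. saturated zone)

    p_sum = np.convolve(p1, p2) * dt        # PDF of T1 + T2 by discrete convolution
    t_sum = np.arange(len(p_sum)) * dt

    norm = float(np.sum(p_sum) * dt)        # should stay ~1 (a valid PDF)
    mean = float(np.sum(t_sum * p_sum) * dt)
    print(norm, mean)                       # means add: ~2.0 + 3.0 = ~5.0
    ```

    The resulting PDF is what a Monte Carlo run over both stages would estimate by sampling, which is the agreement the abstract reports between the MCI and Monte Carlo approaches.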

  16. Aerocapture Performance Analysis for a Neptune-Triton Exploration Mission

    NASA Technical Reports Server (NTRS)

    Starr, Brett R.; Westhelle, Carlos H.; Masciarelli, James P.

    2004-01-01

    A systems analysis has been conducted for a Neptune-Triton Exploration Mission in which aerocapture is used to capture a spacecraft at Neptune. Aerocapture uses aerodynamic drag instead of propulsion to decelerate from the interplanetary approach trajectory to a captured orbit during a single pass through the atmosphere. After capture, propulsion is used to move the spacecraft from the initial captured orbit to the desired science orbit. A preliminary assessment identified that a spacecraft with a lift to drag ratio of 0.8 was required for aerocapture. Performance analyses of the 0.8 L/D vehicle were performed using a high fidelity flight simulation within a Monte Carlo executive to determine mission success statistics. The simulation was the Program to Optimize Simulated Trajectories (POST) modified to include Neptune specific atmospheric and planet models, spacecraft aerodynamic characteristics, and interplanetary trajectory models. To these were added autonomous guidance and pseudo flight controller models. The Monte Carlo analyses incorporated approach trajectory delivery errors, aerodynamic characteristics uncertainties, and atmospheric density variations. Monte Carlo analyses were performed for a reference set of uncertainties and for sets of uncertainties modified to produce increased and reduced atmospheric variability. For the reference uncertainties, the 0.8 L/D flat-bottom ellipsled vehicle achieves 100% successful capture and has a 99.87% probability of attaining the science orbit with a 360 m/s ΔV budget for apoapsis and periapsis adjustment. Monte Carlo analyses were also performed for a guidance system that modulates both bank angle and angle of attack with the reference set of uncertainties. An alpha and bank modulation guidance system reduces the 99.87-percentile ΔV by 173 m/s (48%), to 187 m/s, for the reference set of uncertainties.

  17. Advanced multiphysics coupling for LWR fuel performance analysis

    SciTech Connect

    Hales, J. D.; Tonks, M. R.; Gleicher, F. N.; Spencer, B. W.; Novascone, S. R.; Williamson, R. L.; Pastore, G.; Perez, D. M.

    2015-10-01

    Even the most basic nuclear fuel analysis is a multiphysics undertaking, as a credible simulation must consider at a minimum coupled heat conduction and mechanical deformation. The need for more realistic fuel modeling under a variety of conditions invariably leads to a desire to include coupling between a more complete set of the physical phenomena influencing fuel behavior, including neutronics, thermal hydraulics, and mechanisms occurring at lower length scales. This paper covers current efforts toward coupled multiphysics LWR fuel modeling in three main areas. The first area covered in this paper concerns thermomechanical coupling. The interaction of these two physics, particularly related to the feedback effect associated with heat transfer and mechanical contact at the fuel/clad gap, provides numerous computational challenges. An outline is provided of an effective approach used to manage the nonlinearities associated with an evolving gap in BISON, a nuclear fuel performance application. A second type of multiphysics coupling described here is that of coupling neutronics with thermomechanical LWR fuel performance. DeCART, a high-fidelity core analysis program based on the method of characteristics, has been coupled to BISON. DeCART provides sub-pin level resolution of the multigroup neutron flux, with resonance treatment, during a depletion or a fast transient simulation. Two-way coupling between these codes was achieved by mapping fission rate density and fast neutron flux fields from DeCART to BISON and the temperature field from BISON to DeCART while employing a Picard iterative algorithm. Finally, the need for multiscale coupling is considered. Fission gas production and evolution significantly impact fuel performance by causing swelling, a reduction in the thermal conductivity, and fission gas release. The mechanisms involved occur at the atomistic and grain scale and are therefore not the domain of a fuel performance code. 
However, it is possible to use
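
    The Picard (fixed-point) exchange described above can be sketched with two scalar surrogate solvers: one maps temperature to power (a Doppler-like negative feedback), the other maps power back to temperature. The feedback coefficients are invented for illustration; the real codes exchange full fission-rate, flux, and temperature fields rather than scalars.

    ```python
    def power_from_temperature(T):
        """Surrogate neutronics solve: power falls as fuel temperature rises
        (assumed feedback law, coefficients made up)."""
        return 100.0 / (1.0 + 0.002 * (T - 300.0))

    def temperature_from_power(P):
        """Surrogate thermomechanics solve: linear heat-up above a 300 K
        coolant temperature (assumed)."""
        return 300.0 + 2.0 * P

    def picard(T0=300.0, tol=1e-8, max_iter=100):
        """Iterate the two solvers until the exchanged fields stop changing."""
        T = T0
        for i in range(max_iter):
            P = power_from_temperature(T)
            T_new = temperature_from_power(P)
            if abs(T_new - T) < tol:
                return T_new, P, i + 1
            T = T_new
        raise RuntimeError("Picard iteration did not converge")

    T, P, iters = picard()
    print(f"converged in {iters} iterations: T = {T:.2f} K, P = {P:.2f}")
    ```

    Convergence here relies on the composed feedback map being a contraction near the solution; in practice the coupled codes add relaxation when the raw exchange is not.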

  18. Advanced multiphysics coupling for LWR fuel performance analysis

    DOE PAGESBeta

    Hales, J. D.; Tonks, M. R.; Gleicher, F. N.; Spencer, B. W.; Novascone, S. R.; Williamson, R. L.; Pastore, G.; Perez, D. M.

    2015-10-01

    Even the most basic nuclear fuel analysis is a multiphysics undertaking, as a credible simulation must consider at a minimum coupled heat conduction and mechanical deformation. The need for more realistic fuel modeling under a variety of conditions invariably leads to a desire to include coupling between a more complete set of the physical phenomena influencing fuel behavior, including neutronics, thermal hydraulics, and mechanisms occurring at lower length scales. This paper covers current efforts toward coupled multiphysics LWR fuel modeling in three main areas. The first area covered in this paper concerns thermomechanical coupling. The interaction of these two physics, particularly related to the feedback effect associated with heat transfer and mechanical contact at the fuel/clad gap, provides numerous computational challenges. An outline is provided of an effective approach used to manage the nonlinearities associated with an evolving gap in BISON, a nuclear fuel performance application. A second type of multiphysics coupling described here is that of coupling neutronics with thermomechanical LWR fuel performance. DeCART, a high-fidelity core analysis program based on the method of characteristics, has been coupled to BISON. DeCART provides sub-pin level resolution of the multigroup neutron flux, with resonance treatment, during a depletion or a fast transient simulation. Two-way coupling between these codes was achieved by mapping fission rate density and fast neutron flux fields from DeCART to BISON and the temperature field from BISON to DeCART while employing a Picard iterative algorithm. Finally, the need for multiscale coupling is considered. Fission gas production and evolution significantly impact fuel performance by causing swelling, a reduction in the thermal conductivity, and fission gas release. The mechanisms involved occur at the atomistic and grain scale and are therefore not the domain of a fuel performance code. However, it is

  19. Performance Analysis of an Actor-Based Distributed Simulation

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1998-01-01

    Object-oriented design of simulation programs appears to be very attractive because of the natural association of components in the simulated system with objects. There is great potential in distributing the simulation across several computers for the purpose of parallel computation and its consequent handling of larger problems in less elapsed time. One approach to such a design is to use "actors", that is, active objects with their own thread of control. Because these objects execute concurrently, communication is via messages. This is in contrast to an object-oriented design using passive objects where communication between objects is via method calls (direct calls when they are in the same address space and remote procedure calls when they are in different address spaces or different machines). This paper describes a performance analysis program for the evaluation of a design for distributed simulations based upon actors.

  20. Performance analysis for geometrical attack on digital image watermarking

    NASA Astrophysics Data System (ADS)

    Jayanthi, VE.; Rajamani, V.; Karthikayen, P.

    2011-11-01

    We present an irreversible watermarking technique robust to affine-transform attacks in camera, biomedical and satellite images stored as monochrome bitmap images. The watermarking approach is based on image normalisation, in which both watermark embedding and extraction are carried out with respect to an image normalised to meet a set of predefined moment criteria. The normalisation procedure is invariant to affine-transform attacks. The resulting scheme is suitable for public watermarking applications, where the original image is not available for watermark extraction. Here, a direct-sequence code division multiple access approach is used to embed multibit text information in the DCT and DWT transform domains. The proposed watermarking schemes are robust against various types of attacks such as Gaussian noise, shearing, scaling, rotation, flipping, affine transform, signal processing and JPEG compression. Performance analysis results are measured using image processing metrics.

  1. Performance Analysis: ITS Data through September 30, 2009

    SciTech Connect

    Kerr, C E

    2009-12-07

    Data from ITS were analyzed to understand the issues at LLNL and to identify issues that may require additional management attention, as well as those that meet the threshold for reporting to the DOE Noncompliance Tracking System (NTS). In this report we discuss assessments and issues entered in ITS and compare the number and type presently entered in ITS to previous time periods. Issues reported in ITS were evaluated and discussed. The analysis identified two noncompliances that meet the threshold for reporting to the DOE NTS. All of the data in ITS is analyzed; however, the primary focus of this report is to meet requirements for performance analysis of specific functional areas. The DOE Office of Enforcement expects LLNL to 'implement comprehensive management and independent assessments that are effective in identifying deficiencies and broader problems in safety and security programs, as well as opportunities for continuous improvement within the organization' and to 'regularly perform assessments to evaluate implementation of the contractor's processes for screening and internal reporting.' LLNL has a self-assessment program, described in the document applicable during this time period, ES&H Manual Document 4.1, that includes line, management and independent assessments. LLNL also has in place a process to identify and report deficiencies of nuclear, worker safety and health and security requirements. In addition, the DOE Office of Enforcement expects that 'issues management databases are used to identify adverse trends, dominant problem areas, and potential repetitive events or conditions' (page 15, DOE Enforcement Process Overview, June 2009). LLNL requires that all worker safety and health and nuclear safety noncompliances be tracked as 'deficiencies' in the LLNL Issues Tracking System (ITS). Data from the ITS are analyzed for worker safety and health (WSH) and nuclear safety noncompliances that may meet the threshold for reporting to the DOE Noncompliance

  2. Performance analysis of bearings-only tracking algorithm

    NASA Astrophysics Data System (ADS)

    van Huyssteen, David; Farooq, Mohamad

    1998-07-01

    A number of 'bearings-only' target motion analysis algorithms have appeared in the literature over the years, all suited to tracking an object based solely on noisy measurements of its angular position. In their paper 'Utilization of Modified Polar (MP) Coordinates for Bearings-Only Tracking,' Aidala and Hammel advocate a filter in which the observable and unobservable states are naturally decoupled. While the MP filter has certain advantages over Cartesian and pseudolinear extended Kalman filters, it does not escape the requirement for the observer to steer an optimum maneuvering course to guarantee acceptable performance. This paper demonstrates by simulation the consequences if the observer deviates from this profile, even when the deviated course is still sufficient to produce full state observability.
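
    The observability issue above can be illustrated with a small numeric sketch (hypothetical trajectories, not the paper's scenario): with a non-maneuvering observer, scaling the relative state of a constant-velocity target by any factor k leaves every bearing measurement unchanged, so range cannot be recovered without an observer maneuver.

```python
import numpy as np

dt, n = 1.0, 20
t = np.arange(n) * dt

# Observer: constant velocity along x (no maneuver).
obs = np.stack([0.5 * t, np.zeros(n)], axis=1)

# Target A: constant velocity.
tgt_a = np.stack([10.0 + 0.2 * t, 5.0 + 0.3 * t], axis=1)

# "Ghost" target B: relative state scaled by k. Because the observer itself
# moves at constant velocity, B is also a valid constant-velocity track.
k = 3.0
tgt_b = obs + k * (tgt_a - obs)

bear_a = np.arctan2(tgt_a[:, 1] - obs[:, 1], tgt_a[:, 0] - obs[:, 0])
bear_b = np.arctan2(tgt_b[:, 1] - obs[:, 1], tgt_b[:, 0] - obs[:, 0])

# Identical bearing sequences despite very different ranges -> ambiguity.
print(bool(np.allclose(bear_a, bear_b)))
```

    An observer maneuver breaks this symmetry: the ghost track obs + k*(tgt - obs) is no longer constant velocity, so the ambiguity disappears.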

  3. A theoretical analysis of vacuum arc thruster performance

    NASA Technical Reports Server (NTRS)

    Polk, James E.; Sekerak, Mike; Ziemer, John K.; Schein, Jochen; Qi, Niansheng; Binder, Robert; Anders, Andre

    2001-01-01

    In vacuum arc discharges the current is conducted through vapor evaporated from the cathode surface. In these devices very dense, highly ionized plasmas can be created from any metallic or conducting solid used as the cathode. This paper describes theoretical models of performance for several thruster configurations which use vacuum arc plasma sources. This analysis suggests that thrusters using vacuum arc sources can be operated efficiently with a range of propellant options that gives great flexibility in specific impulse. In addition, the efficiency of plasma production in these devices appears to be largely independent of scale because the metal vapor is ionized within a few microns of the cathode electron emission sites, so this approach is well-suited for micropropulsion.

  4. Performance Analysis of Paraboloidal Reflector Antennas in Radio Telescopes

    NASA Astrophysics Data System (ADS)

    Yeap, Kim Ho; Law, Young Hui; Rizman, Zairi Ismael; Cheong, Yuen Kiat; Ong, Chu En; Chong, Kok Hen

    2013-10-01

    In this paper, we present an analysis of the performance of the three most commonly used paraboloidal reflector antennas in radio telescopes, i.e. the prime focus, Cassegrain, and Gregorian antennas. In our study, we have adopted the design parameters for the Cassegrain configuration used in the Atacama Large Millimeter Array (ALMA) project. The parameters are subsequently re-calculated so as to meet the design requirements of the Gregorian and prime focus configurations. The simulation results obtained from GRASP reveal that the prime focus configuration produces the lowest side lobes and the highest main lobe level. Such a configuration, however, has the disadvantage of being highly susceptible to thermal ground noise radiation. The radiation characteristics produced by the Cassegrain and Gregorian configurations are very close to each other. Indeed, the results show that there is no significant advantage of one design over the other. Hence, we can conclude that both configurations are comparable in the application of radio telescopes.

  5. Automotive Gas Turbine Power System-Performance Analysis Code

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert J.

    1997-01-01

    An open-cycle gas turbine numerical modelling code suitable for thermodynamic performance analysis (i.e. thermal efficiency, specific fuel consumption, cycle state points, working fluid flowrates, etc.) of automotive and aircraft powerplant applications has been generated at the NASA Lewis Research Center's Power Technology Division. The code can be made available to automotive gas turbine preliminary design efforts, either in its present version, or, assuming that resources can be obtained to incorporate empirical models for component weight and packaging volume, in a later version that includes the weight-volume estimator feature. The paper contains a brief discussion of the capabilities of the presently operational version of the code, including a listing of input and output parameters and actual sample output listings.
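
    As an illustration of the kind of cycle-level bookkeeping such a code performs (a textbook sketch, not the NASA code itself), the ideal open-cycle (Brayton) thermal efficiency follows directly from the compressor pressure ratio, assuming an ideal gas and isentropic components:

```python
def brayton_efficiency(pressure_ratio: float, gamma: float = 1.4) -> float:
    """Thermal efficiency of an ideal Brayton cycle.

    gamma is the ratio of specific heats (1.4 for air).
    """
    return 1.0 - pressure_ratio ** (-(gamma - 1.0) / gamma)

# Efficiency rises with pressure ratio, with diminishing returns.
for r in (8, 16, 32):
    print(r, round(brayton_efficiency(r), 3))
```

    A real cycle code adds component efficiencies, pressure losses, and fuel properties on top of this ideal relation, which is why its output includes full state points rather than a single number.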

  6. Performance analysis of spread spectrum modulation in data hiding

    NASA Astrophysics Data System (ADS)

    Gang, Litao; Akansu, Ali N.; Ramkumar, Mahalingam

    2001-12-01

    Watermarking or steganography technology provides a possible solution for digital multimedia copyright protection and piracy tracking. Most current data hiding schemes are based on spread spectrum modulation: a small-valued watermark signal is embedded into the content signal in some watermark domain, and the information bits can be extracted via correlation. The schemes are applied in both escrow and oblivious cases. This paper reveals, through analysis and simulation, that in oblivious applications where the original signal is not available, the commonly used correlation detection is not optimal. The maximum likelihood detector is analyzed and a feasible suboptimal detector is derived; its performance is explored and compared with the correlation detector. Subsequently a linear embedding scheme is proposed and studied. Experiments with image data hiding demonstrate its effectiveness in applications.
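
    A minimal spread-spectrum embed/correlate sketch (illustrative parameters and signal model, not the authors' scheme) shows the correlation detector that the paper takes as its baseline:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 4096
host = rng.normal(0.0, 10.0, n)      # host "content" signal
pn = rng.choice([-1.0, 1.0], n)      # pseudo-noise spreading sequence
alpha = 1.0                          # embedding strength (illustrative)

def embed(host, pn, bit, alpha):
    # Antipodal spread-spectrum embedding of one bit.
    return host + alpha * (1 if bit else -1) * pn

def detect(marked, pn):
    # Oblivious correlation detector: the sign of the correlation with the
    # known PN sequence recovers the bit, treating the host as noise.
    return int(np.dot(marked, pn) > 0)

print(detect(embed(host, pn, 1, alpha), pn),
      detect(embed(host, pn, 0, alpha), pn))
```

    The host term np.dot(host, pn) acts as interference at the detector, which is precisely why, in the oblivious case, plain correlation is suboptimal compared with detectors that model the host statistics.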

  7. Human performance analysis of industrial radiography radiation exposure events

    SciTech Connect

    Reece, W.J.; Hill, S.G.

    1995-12-01

    A set of radiation overexposure event reports were reviewed as part of a program to examine human performance in industrial radiography for the US Nuclear Regulatory Commission. Incident records for a seven year period were retrieved from an event database. Ninety-five exposure events were initially categorized and sorted for further analysis. Descriptive models were applied to a subset of severe overexposure events. Modeling included: (1) operational sequence tables to outline the key human actions and interactions with equipment, (2) human reliability event trees, (3) an application of an information processing failures model, and (4) an extrapolated use of the error influences and effects diagram. Results of the modeling analyses provided insights into the industrial radiography task and suggested areas for further action and study to decrease overexposures.

  8. Portable Life Support Subsystem Thermal Hydraulic Performance Analysis

    NASA Technical Reports Server (NTRS)

    Barnes, Bruce; Pinckney, John; Conger, Bruce

    2010-01-01

    This paper presents the current state of the thermal hydraulic modeling efforts being conducted for the Constellation Space Suit Element (CSSE) Portable Life Support Subsystem (PLSS). The goal of these efforts is to provide realistic simulations of the PLSS under various modes of operation. The PLSS thermal hydraulic model simulates the thermal, pressure, flow characteristics, and human thermal comfort related to the PLSS performance. This paper presents modeling approaches and assumptions as well as component model descriptions. Results from the models are presented that show PLSS operations at steady-state and transient conditions. Finally, conclusions and recommendations are offered that summarize results, identify PLSS design weaknesses uncovered during review of the analysis results, and propose areas for improvement to increase model fidelity and accuracy.

  9. Hydrodynamic body shape analysis and their impact on swimming performance.

    PubMed

    Li, Tian-Zeng; Zhan, Jie-Min

    2015-01-01

    This study presents the hydrodynamic characteristics of different adult male swimmers' body shapes using a computational fluid dynamics method. The simulation is carried out with the CFD code Fluent, solving the 3D incompressible Navier-Stokes equations with the RNG k-ε turbulence closure. The water free surface is captured by the volume of fluid (VOF) method. A set of full-body models, based on the anthropometric characteristics of the most common male swimmers, is created with the Computer Aided Industrial Design (CAID) software Rhinoceros. The analysis of the CFD results revealed that a swimmer's body shape has a noticeable effect on hydrodynamic performance. This explains why a male swimmer with an inverted-triangle body shape has good hydrodynamic characteristics for competitive swimming. PMID:26898107

  10. Performance Analysis: Work Control Events Identified January - August 2010

    SciTech Connect

    De Grange, C E; Freeman, J W; Kerr, C E; Holman, G; Marsh, K; Beach, R

    2011-01-14

    This performance analysis evaluated 24 events that occurred at LLNL from January through August 2010. The analysis identified areas of potential work control process and/or implementation weaknesses and several common underlying causes. Human performance improvement and safety culture factors were included in the causal analysis of each event. The collective significance of all events in 2010, as measured by the occurrence reporting significance category and by the proportion of events reported to the DOE ORPS under the ''management concerns'' reporting criteria, does not appear to have increased. The frequency of reporting in each of the significance categories has not changed in 2010 compared to the previous four years. There is no change indicating a trend in the significance category and there has been no increase in the proportion of occurrences reported in the higher significance category. Also, the frequency of events, 42 events reported through August 2010, is not greater than in previous years and is below the average of 63 occurrences per year at LLNL since 2006. Over the previous four years, an average of 43% of LLNL's reported occurrences have been reported as either ''management concerns'' or ''near misses.'' In 2010, 29% of the occurrences have been reported as ''management concerns'' or ''near misses.'' This rate indicates that LLNL is now reporting fewer ''management concern'' and ''near miss'' occurrences compared to the previous four years. From 2008 to the present, LLNL senior management has undertaken a series of initiatives to strengthen the work planning and control system with the primary objective to improve worker safety. In 2008, the LLNL Deputy Director established the Work Control Integrated Project Team to develop the core requirements and graded elements of an institutional work planning and control system. By the end of that year this system was documented and implementation had begun.
In 2009

  11. Visualization and Analysis of Climate Simulation Performance Data

    NASA Astrophysics Data System (ADS)

    Röber, Niklas; Adamidis, Panagiotis; Behrens, Jörg

    2015-04-01

    Visualization is the key process of transforming abstract (scientific) data into a graphical representation, to aid in the understanding of the information hidden within the data. Climate simulation data sets are typically quite large, time varying, and consist of many different variables sampled on an underlying grid. A large variety of climate models - and sub-models - exist to simulate various aspects of the climate system. Generally, one is mainly interested in the physical variables produced by the simulation runs, but model developers are also interested in performance data measured along with these simulations. Climate simulation models are carefully developed complex software systems, designed to run in parallel on large HPC systems. An important goal thereby is to utilize the entire hardware as efficiently as possible, that is, to distribute the workload as evenly as possible among the individual components. This is a very challenging task, and detailed performance data, such as timings, cache misses, etc. have to be used to locate and understand performance problems in order to optimize the model implementation. Furthermore, the correlation of performance data to the processes of the application and the sub-domains of the decomposed underlying grid is vital when addressing communication and load imbalance issues. High-resolution climate simulations are carried out on tens to hundreds of thousands of cores, yielding a vast amount of profiling data which cannot be analyzed without appropriate visualization techniques. This PICO presentation displays and discusses the ICON simulation model, which is jointly developed by the Max Planck Institute for Meteorology and the German Weather Service, in partnership with DKRZ. The visualization and analysis of the model's performance data allow us to optimize and fine-tune the model, as well as to understand its execution on the HPC system. We show and discuss our workflow, as well as present new ideas and

  12. A performance-based approach to landslide risk analysis

    NASA Astrophysics Data System (ADS)

    Romeo, R. W.

    2009-04-01

    An approach for risk assessment based on a probabilistic analysis of the performance of structures threatened by landslides is shown and discussed. The risk is a possible loss due to the occurrence of a potentially damaging event. Analytically, the risk is the probability convolution of hazard, which defines the frequency of occurrence of the event (i.e., the demand), and fragility, which defines the capacity of the system to withstand the event given its characteristics (i.e., severity) and those of the exposed goods (vulnerability), that is: Risk = P(D >= d | S, V). The inequality sets a damage (or loss) threshold beyond which the system's performance is no longer met. Therefore a consistent approach to risk assessment should: 1) adopt a probabilistic model which takes into account all the uncertainties of the involved variables (capacity and demand); 2) follow a performance approach based on given loss or damage thresholds. The proposed method belongs to the category of semi-empirical ones: the theoretical component is given by the probabilistic capacity-demand model; the empirical component is given by the observed statistical behaviour of structures damaged by landslides. Two landslide properties alone are required: the area-extent and the type (or kinematism). All other properties required to determine the severity of landslides (such as depth, speed and frequency) are derived via probabilistic methods. The severity (or intensity) of landslides, in terms of kinetic energy, is the demand of resistance; the resistance capacity is given by the cumulative distribution functions of the limit-state performance (fragility functions) assessed via damage surveys and card compilation. The investigated limit states are aesthetic (of nominal concern alone), functional (interruption of service) and structural (economic and social losses).
The damage probability is the probabilistic convolution of hazard (the probability mass function of the frequency of occurrence of given
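
    For discrete severity classes, the convolution of hazard and fragility reduces to a severity-weighted sum; all numbers below are purely illustrative, not from the study:

```python
import numpy as np

# Hazard: probability mass of each landslide severity class, P(S = s).
# Fragility: probability the damage threshold is exceeded at that severity,
# P(D >= d | S = s). Values are illustrative placeholders.
severity = np.array([1, 2, 3, 4])
hazard = np.array([0.6, 0.25, 0.1, 0.05])       # sums to 1
fragility = np.array([0.01, 0.10, 0.45, 0.90])

# Risk = sum over severities of P(S = s) * P(D >= d | S = s).
risk = float(np.sum(hazard * fragility))
print(round(risk, 4))
```

    In the continuous case the sum becomes an integral of the fragility function against the hazard density, but the structure of the calculation is the same.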

  13. Magnetohydrodynamic Augmented Propulsion Experiment: I. Performance Analysis and Design

    NASA Technical Reports Server (NTRS)

    Litchford, R. J.; Cole, J. W.; Lineberry, J. T.; Chapman, J. N.; Schmidt, H. J.; Lineberry, C. W.

    2003-01-01

    The performance of conventional thermal propulsion systems is fundamentally constrained by the specific energy limitations associated with chemical fuels and the thermal limits of available materials. Electromagnetic thrust augmentation represents one intriguing possibility for improving the fuel composition of thermal propulsion systems, thereby increasing overall specific energy characteristics; however, realization of such a system requires an extremely high-energy-density electrical power source as well as an efficient plasma acceleration device. This Technical Publication describes the development of an experimental research facility for investigating the use of cross-field magnetohydrodynamic (MHD) accelerators as a possible thrust augmentation device for thermal propulsion systems. In this experiment, a 1.5-MW(sub e) Aerotherm arc heater is used to drive a 2-MW(sub e) MHD accelerator. The heatsink MHD accelerator is configured as an externally diagonalized, segmented channel, which is inserted into a large-bore, 2-T electromagnet. The performance analysis and engineering design of the flow path are described as well as the parameter measurements and flow diagnostics planned for the initial series of test runs.

  14. RWMC Performance Assessment/Composite Analysis Monitoring Report - FY-2002

    SciTech Connect

    Ritter, P.D.; Parsons, A.M.

    2002-09-30

    US DOE Order 435.1, Radioactive Waste Management, Chapter IV and the associated implementation manual and guidance require monitoring of low-level radioactive waste (LLW) disposal facilities. The Performance Assessment/Composite Analysis (PA/CA) Monitoring program was developed and implemented to meet this requirement. This report represents the results of PA/CA monitoring projects that are available as of September 2002. The technical basis for the PA/CA program is provided in the PA/CA Monitoring Program document and a program description document (PDD) serves as the quality assurance project plan for implementing the PM program. Subsurface monitoring, air pathway surveillance, and subsidence monitoring/control are required to comply with DOE Order 435.1, Chapter IV. Subsidence monitoring/control and air pathway surveillance are performed entirely by other INEEL programs - their work is summarized herein. Subsurface monitoring includes near-field (source) monitoring of buried activated beryllium and steel, monitoring of groundwater in the vadose zone, and monitoring of the Snake River Plain Aquifer. Most of the required subsurface monitoring information presented in this report was gathered from the results of ongoing INEEL monitoring programs. This report also presents results for several new monitoring efforts that have been initiated to characterize any migration of radionuclides in surface sediment near the waste.

  15. Performance analysis of reactive congestion control for ATM networks

    NASA Astrophysics Data System (ADS)

    Kawahara, Kenji; Oie, Yuji; Murata, Masayuki; Miyahara, Hideo

    1995-05-01

    In ATM networks, preventive congestion control is widely recognized as an efficient way to avoid congestion, and it is implemented by a combination of connection admission control and usage parameter control. However, congestion may still occur because of unpredictable statistical fluctuation of traffic sources even when preventive control is performed in the network. In this paper, we study another kind of congestion control, i.e., reactive congestion control, in which each source adapts its cell emission rate to the traffic load at the switching node (or at the multiplexer). Our intention is to establish more efficient congestion control in ATM networks by incorporating such a method. We develop an analytical model and carry out an approximate analysis of the reactive congestion control algorithm. Numerical results show that the reactive congestion control algorithms are very effective in avoiding congestion and in achieving statistical gain. Furthermore, the binary congestion control algorithm with a pushout mechanism is shown to provide the best performance among the reactive congestion control algorithms treated here.
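
    The binary-feedback idea can be sketched as a toy rate-adaptation loop (illustrative parameters, not the paper's analytical model): the source halves its cell rate when the multiplexer queue crosses a threshold and increases it additively otherwise, so the queue oscillates but stays bounded.

```python
def simulate(steps=200, capacity=10.0, threshold=50.0):
    """Single source feeding a queue drained at fixed link capacity."""
    rate, queue = 12.0, 0.0
    peak = 0.0
    for _ in range(steps):
        queue = max(0.0, queue + rate - capacity)  # queue drains at capacity
        if queue > threshold:
            rate *= 0.5   # binary "congested" signal: multiplicative decrease
        else:
            rate += 0.5   # additive increase while uncongested
        peak = max(peak, queue)
    return peak

print(simulate())
```

    The sawtooth behavior this produces is the qualitative picture behind the statistical gain the paper quantifies analytically.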

  16. Performance analysis and optimization of power plants with gas turbines

    NASA Astrophysics Data System (ADS)

    Besharati-Givi, Maryam

    The gas turbine is one of the most important applications for power generation. The purpose of this research is the performance analysis and optimization of power plants using different design systems at different operating conditions. In particular, accurate efficiency calculation and the search for optimum efficiency in the design of chiller inlet cooling and blade-cooled gas turbines are investigated. This research shows how it is possible to find the optimum design for different operating conditions, such as ambient temperature, relative humidity, turbine inlet temperature, and compressor pressure ratio. The simulated designs include a chiller with varied COP and fogging cooling for the compressor. In addition, the overall thermal efficiency is improved by adding design features such as reheat and regenerative heating. The other goal of this research focuses on the blade-cooled gas turbine for higher turbine inlet temperature and, consequently, higher efficiency. New film cooling equations, along with a varying film cooling effectiveness for the optimum cooling air requirement at the first-stage blades, and internal and trailing-edge cooling for the second stage, are introduced for optimal efficiency calculation. This research sets the groundwork for using the optimum value of efficiency calculation while using inlet cooling and blade cooling designs. In the final step, the designed gas cycles are combined with a steam cycle for performance improvement.

  17. Scalable Analysis Techniques for Microprocessor Performance Counter Metrics

    SciTech Connect

    Ahn, D H; Vetter, J S

    2002-07-24

    Contemporary microprocessors provide a rich set of integrated performance counters that allow application developers and system architects alike the opportunity to gather important information about workload behaviors. These counters can capture instruction, memory, and operating system behaviors. Current techniques for analyzing data produced from these counters use raw counts, ratios, and visualization techniques to help users make decisions about their application source code. While these techniques are appropriate for analyzing data from one process, they do not scale easily to new levels demanded by contemporary computing systems. Indeed, the amount of data generated by these experiments is on the order of tens of thousands of data points. Furthermore, if users execute multiple experiments, then we add yet another dimension to this already knotty picture. This flood of multidimensional data can swamp efforts to harvest important ideas from these valuable counters. Very simply, this paper addresses these concerns by evaluating several multivariate statistical techniques on these datasets. We find that several techniques, such as statistical clustering, can automatically extract important features from this data. These derived results can, in turn, be fed directly back to an application developer, or used as input to a more comprehensive performance analysis environment, such as a visualization or an expert system.
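
    One of the techniques mentioned, statistical clustering, can be sketched on synthetic per-process counter vectors (a tiny k-means; the counter names, values, and cluster count are illustrative assumptions, not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic counter vectors [instructions, cache_misses] for two groups of
# processes with distinct behaviors (well separated on purpose).
a = rng.normal([1e6, 1e3], [5e4, 1e2], size=(50, 2))
b = rng.normal([4e5, 8e3], [5e4, 1e2], size=(50, 2))
data = np.vstack([a, b])

def kmeans(x, k=2, iters=20):
    """Plain k-means: assign to nearest center, then recompute centers."""
    centers = x[rng.choice(len(x), k, replace=False)]
    for _ in range(iters):
        dists = ((x[:, None, :] - centers[None]) ** 2).sum(-1)
        labels = np.argmin(dists, axis=1)
        centers = np.array([x[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])
    return labels

labels = kmeans(data)
print(labels[0] != labels[50])  # the two behavior groups land in different clusters
```

    In practice one would normalize each counter dimension first, since raw counters live on very different scales; that step is omitted here because the synthetic groups are already separable.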

  18. Performance Analysis of a NASA Integrated Network Array

    NASA Technical Reports Server (NTRS)

    Nessel, James A.

    2012-01-01

    The Space Communications and Navigation (SCaN) Program is planning to integrate its individual networks into a unified network which will function as a single entity to provide services to user missions. This integrated network architecture is expected to provide SCaN customers with the capabilities to seamlessly use any of the available SCaN assets to support their missions and to efficiently meet the collective needs of Agency missions. One potential optimal application of these assets, based on this envisioned architecture, is arraying across existing networks to significantly enhance data rates and/or link availabilities. As such, this document provides an analysis of the transmit and receive performance of a proposed SCaN inter-network antenna array. From the study, it is determined that a fully integrated inter-network array does not provide any significant advantage over an intra-network array, one in which the assets of an individual network are arrayed for enhanced performance. Therefore, it is the recommendation of this study that NASA proceed with an arraying concept with a fundamental focus on network-centric arraying.

  19. Performance analysis for second-design space Stirling engine model

    NASA Astrophysics Data System (ADS)

    Ogiwara, Sachio; Fujiwara, Tsutomu; Eguchi, Kunihisa; Nakamura, Yoshihiro

    A hybrid free-piston Stirling research engine, called NALSEM 125, has been tested since 1988 as part of a solar dynamic power technology program. It is a gamma-type Stirling-driven linear-alternator machine with helium as the working fluid. The objective of the experimental program is to understand the thermodynamic and dynamic mechanisms of the free-piston engine integrated with a moving-magnet alternator. After the first-phase engine experiments with NALSEM 125, a second-design Stirling engine, NALSEM 125R, has been tested. Using a second-order analytical tool, design modifications were made to provide much more stable dynamic operation over the required operating range, as well as to incorporate an electric heater head simulating the hot interface of 12 sodium heat pipes. Described in this paper are thermodynamic performance data of NALSEM 125R operations, which are also compared with the computational analysis, considering the power losses resulting from pressure drop and gas leakage.

  20. Performance analysis and experiment validation of a pneumatic vibration isolator

    NASA Astrophysics Data System (ADS)

    Yang, Yuanyuan; Tan, Jiubin; Wang, Lei; Zhou, Tong

    2015-02-01

    A performance analysis and experimental validation of a pneumatic vibration isolator (PVI) applied in the wafer stage of a lithography machine is presented in this work. The wafer stage is a dual-stage actuator system comprising a long-stroke stage (LS) and a short-stroke stage (SS). In order to achieve nanometer-level positioning, the isolator is designed to reduce the transmission of LS excitations to the SS. In addition, because the SS has six degrees of freedom and must be kept in a strictly constant-temperature environment, the isolator needs to provide two functions: decoupling of vertical and horizontal motion, and gravity compensation. In this isolator, a biaxial hinge was designed to decouple the vertical rotational freedom, and a gas bearing was designed to decouple horizontal motion. The stiffness and damping of the pneumatic vibration isolator were analyzed, and an analysis of the natural frequency and vibration transmissibility of the isolator is presented. The results show that vibration transmission is reduced significantly by the isolator and that the natural frequency can be lower than 0.6 Hz, with experimental results in accordance with the prediction model.
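
    The transmissibility analysis follows the standard single-degree-of-freedom isolator model; below is a sketch using the 0.6 Hz natural frequency quoted in the abstract and an assumed illustrative damping ratio:

```python
import math

def transmissibility(f, fn=0.6, zeta=0.05):
    """|X_out/X_in| of a 1-DOF isolator at excitation frequency f (Hz).

    fn is the natural frequency, zeta the damping ratio (assumed value).
    """
    r = f / fn  # frequency ratio
    num = 1.0 + (2.0 * zeta * r) ** 2
    den = (1.0 - r * r) ** 2 + (2.0 * zeta * r) ** 2
    return math.sqrt(num / den)

# Below sqrt(2)*fn the isolator amplifies; well above fn it strongly attenuates.
print(round(transmissibility(0.6), 2), round(transmissibility(6.0), 4))
```

    Lowering fn toward 0.6 Hz pushes the attenuation band down to the low frequencies that matter for nanometer-level stage positioning, which is why a soft pneumatic spring is used.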

  1. Performance of membrane filters used for TEM analysis of asbestos.

    PubMed

    Webber, James S; Czuhanich, Alex G; Carhart, Laurie J

    2007-10-01

    This article presents findings related to characteristics of membrane filters that can affect the recovery of asbestos and the quality of preparations for transmission electron microscopy (TEM) analysis. Certain applications and preparation steps can lead to unacceptable performance of membrane filters used in analysis of asbestos by TEM. Unless substantial care is used in the collapsing of mixed-cellulose ester (MCE) filters with an acetone hot block, grid preparations can suffer and fiber recoveries can be compromised. Calibration of the etching depth of MCE filters, especially at differing locations in an asher's chamber, is critical for reliable fiber recovery. Excessive etching of MCE filters with aerosol-deposited asbestos can lead to loss of short fibers, while insufficient etching of MCE filters with aqueous-deposited asbestos can, paradoxically, also lead to loss of short fibers. Interlaboratory precision on MCE filters is improved by aerosol-deposited asbestos, as opposed to aqueous deposition. In comparison, straightforward preparation, improved solvents, and reduced contamination make PC filters an increasingly acceptable alternative. Variations in the geometric configuration during application of carbon films can lead to fiber loss and unacceptable grid quality for either type of filter. PMID:17763069

  2. Nutrition, sensory evaluation, and performance analysis of hydrogenated frying oils.

    PubMed

    Hack, Danielle M; Bordi, Peter L; Hessert, S William

    2009-12-01

    The Food and Drug Administration now requires labeling of trans fats on nutrition labels, a decision that has created a push to reformulate deep-fat frying oils. Prior to the passage of this law, frying oils contained trans fats because trans fats made the oils more stable and thus allowed longer frying usage. In the present study, oil performance, sensory evaluation, and nutritional analysis were conducted on trans fat-free oils through a 10-day degradation process, using French fries to break down the oil. The goal of the study was to test oil stability, conduct nutrition analysis, and learn consumer preference between trans fat and trans fat-free oils. Sensory evaluation indicated a preference for fries cooked in trans fat-free oil mixtures. The most stable oils were also combination oils. Based on these findings, industry representatives considering trans fat-free frying oils should consider using blended oils, which met customers' taste preferences and minimized oil rancidity and usage. PMID:19919512

  3. Residential fenestration performance analysis using RESFEN3.1

    SciTech Connect

    Huang, Y.J.; Mitchell, R.; Arasteh, D.; Selkowitz, S.

    1999-02-01

    This paper describes the development of RESFEN3.1, a PC-based computer program for calculating the heating and cooling energy performance and cost of residential fenestration systems. The development of RESFEN has been coordinated with ongoing efforts by the National Fenestration Rating Council (NFRC) to develop an energy rating system for windows and skylights, to maintain maximum consistency between RESFEN and NFRC's planned energy rating system. Unlike previous versions of RESFEN, which used regression equations to replicate a large database of computer simulations, Version 3.1 produces results based on actual hour-by-hour simulations. This approach has been facilitated by the exponential increase in the speed of personal computers in recent years. RESFEN3.1 has the capability of analyzing the energy performance of windows in new residential buildings in 52 North American locations. The user describes the physical, thermal, and optical properties of the windows in each orientation; solar heat gain reductions due to obstructions, overhangs, or shades; and the location of the house. The RESFEN program then models a prototypical house for that location and calculates the energy use of the house using the DOE-2 program. The user can vary the HVAC system, foundation type, and utility costs. Results are presented for the annual heating and cooling energy use, energy cost, and peak energy demand of the house, and the incremental energy use or peak demand attributable to the windows in each orientation. This paper describes the capabilities of RESFEN3.1, its usefulness in analyzing the energy performance of residential windows, and its development effort, and gives insight into the structure of the computer program. It also discusses the rationale and benefits of the approach taken in RESFEN in combining a simple-to-use graphical front-end with a detailed hour-by-hour ''simulation engine'' to produce an energy analysis tool for the general public that is user

  4. Regression analysis of technical parameters affecting nuclear power plant performances

    SciTech Connect

    Ghazy, R.; Ricotti, M. E.; Trueco, P.

    2012-07-01

    Since the 1980s many studies have been conducted to explain the good and bad performances of commercial nuclear power plants (NPPs), but no definitive correlation has yet been found to be fully representative of plant operational experience. In early works, data availability and the number of operating power stations were both limited; therefore, results suggested that specific technical characteristics of NPPs were the main causal factors for successful plant operation. Although these aspects continue to play a significant role, later studies and observations showed that other factors concerning the management and organization of the plant could instead be predominant when comparing utilities' operational and economic results. Utility quality, in a word, can be used to summarize all the managerial and operational aspects that seem to be effective in determining plant performance. In this paper operational data of a consistent sample of commercial nuclear power stations, out of the total 433 operating NPPs, are analyzed, mainly focusing on the last decade of operational experience. The sample consists of PWR and BWR technology, operated by utilities located in different countries, including the U.S., Japan, France, Germany, and Finland. Multivariate regression is performed using the Unit Capability Factor (UCF) as the dependent variable; this factor reflects the effectiveness of plant programs and practices in maximizing the available electrical generation and consequently provides an overall indication of how well plants are operated and maintained. Aspects that may not be true causal factors but which can have a consistent impact on the UCF, such as technology design, supplier, size, and age, are included in the analysis as independent variables. (authors)
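
    The regression setup described can be sketched as an ordinary least-squares fit of UCF on plant attributes; the data, variable names, and coefficients below are synthetic illustrations, not the study's results:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 120

# Synthetic plant attributes (illustrative ranges).
age = rng.uniform(5, 40, n)          # plant age, years
size = rng.uniform(400, 1400, n)     # net capacity, MWe
is_pwr = rng.integers(0, 2, n).astype(float)

# Synthetic "true" relationship for UCF (%) plus noise.
ucf = 92.0 - 0.15 * age + 0.002 * size + 1.5 * is_pwr + rng.normal(0, 1.0, n)

# OLS: beta = argmin ||X beta - ucf||^2, with an intercept column.
X = np.column_stack([np.ones(n), age, size, is_pwr])
beta, *_ = np.linalg.lstsq(X, ucf, rcond=None)
print(np.round(beta, 3))  # [intercept, age, size, PWR-dummy] estimates
```

    With enough plants, the fitted coefficients recover the generating relationship; the study's harder problem is that real attributes like age and technology are correlated with the unmeasured "utility quality" factor.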

  5. The Current State of Human Performance Technology: A Citation Network Analysis of "Performance Improvement Quarterly," 1988-2010

    ERIC Educational Resources Information Center

    Cho, Yonjoo; Jo, Sung Jun; Park, Sunyoung; Kang, Ingu; Chen, Zengguan

    2011-01-01

    This study conducted a citation network analysis (CNA) of human performance technology (HPT) to examine the current state of the field. Previous reviews of the field have used traditional research methods, such as content analysis, surveys, Delphi studies, and citation analysis. The distinctive features of CNA come from using a social network analysis…

  6. Design, fabrication & performance analysis of an unmanned aerial vehicle

    NASA Astrophysics Data System (ADS)

    Khan, M. I.; Salam, M. A.; Afsar, M. R.; Huda, M. N.; Mahmud, T.

    2016-07-01

    An Unmanned Aerial Vehicle was designed, analyzed, and fabricated to meet design requirements and perform the entire mission for an international aircraft design competition. The goal was to have a balanced design possessing good demonstrated flight handling qualities and practical, affordable manufacturing requirements while providing high vehicle performance. The UAV had to complete three missions in total: a ferry flight (1st mission), a maximum load mission (2nd mission), and an emergency medical mission (3rd mission). The requirement of the ferry flight mission was to fly as many laps as possible within 4 minutes. The maximum load mission consisted of flying 3 laps while carrying two wooden blocks which simulate cargo. The requirement of the emergency medical mission was to complete 3 laps as quickly as possible while carrying two attendants and two patients. A careful analysis revealed the lowest rated aircraft cost (RAC) as the primary design objective. So the challenge was to build an aircraft with minimum RAC that can fly fast, fly with maximum payload, and fly fast with all the possible configurations. The aircraft design was reached by first generating numerous design concepts capable of completing the mission requirements. In the conceptual design phase, a Figure of Merit (FOM) analysis was carried out to select the initial aircraft configuration, propulsion, empennage, and landing gear. After completion of the conceptual design, preliminary design was carried out. The preliminary design iterations had a low wing loading, a high lift coefficient, and a high thrust-to-weight ratio. To make the aircraft capable of rough-field taxi, springs were added to the landing gear to absorb shock. An airfoil-shaped fuselage was designed to allow sufficient space for payload while generating less drag, making the aircraft fly fast. The final design was a high-wing monoplane with a conventional tail, a single tractor propulsion system, and a tail-dragger landing gear. Payload was stored in

  7. Measurement Performance of a Computer Assisted Vertebral Motion Analysis System

    PubMed Central

    Davis, Reginald J.; Lee, David C.; Cheng, Boyle

    2015-01-01

    Background Segmental instability of the lumbar spine is a significant cost within the US health care system; however, current thresholds for indication of radiographic instability are not well defined. Purpose To determine the performance measurements of sagittal lumbar intervertebral measurements using computer-assisted measurements of the lumbar spine from motion sequences acquired with a video-fluoroscopic technique. Study design Sensitivity, specificity, predictive values, prevalence, and test-retest reliability evaluation of digitized manual versus computer-assisted measurements of the lumbar spine. Patient sample A total of 2239 intervertebral levels from 509 symptomatic patients, and 287 intervertebral levels from 73 asymptomatic participants, were retrospectively evaluated. Outcome measures Specificity, sensitivity, negative predictive value (NPV), diagnostic accuracy, and prevalence between the two measurement techniques; measurements of coefficient of repeatability (CR), limits of agreement (LOA), intraclass correlation coefficient (ICC; type 3,1), and standard error of measurement for both measurement techniques. Methods Asymptomatic individuals and symptomatic patients were all evaluated using both the Vertebral Motion Analysis (VMA) system and fluoroscopic flexion extension static radiographs (FE). The analysis was compared to known thresholds of 15% intervertebral translation (IVT, equivalent to 5.3 mm assuming a 35 mm vertebral body depth) and 25° intervertebral rotation (IVR). Results The VMA measurements demonstrated greater specificity, sensitivity, NPV, prevalence, and reliability compared with FE for radiographic evidence of instability. Specificity was 99.4% and 99.1% in the VMA compared to 98.3% and 98.2% in the FE for IVR and IVT, respectively. Sensitivity in this study was 41.2% and 44.6% greater in the VMA compared to the FE for IVR and IVT, respectively. NPV was 91% and 88% in the VMA compared to 62% and 66% in the FE for IVR and IVT
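    The performance measures reported above all derive from confusion-matrix counts. The following minimal sketch shows the standard definitions; the counts are illustrative only, not the study's data.

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard screening-test measures from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)                  # true positive rate
    specificity = tn / (tn + fp)                  # true negative rate
    npv = tn / (tn + fn)                          # negative predictive value
    prevalence = (tp + fn) / (tp + fp + tn + fn)  # fraction truly positive
    return sensitivity, specificity, npv, prevalence

# Illustrative counts only, not taken from the VMA/FE comparison
sens, spec, npv, prev = diagnostic_metrics(tp=40, fp=5, tn=500, fn=10)
print(f"sensitivity={sens:.3f} specificity={spec:.3f} NPV={npv:.3f}")
```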

  8. Analysis of correlation between corneal topographical data and visual performance

    NASA Astrophysics Data System (ADS)

    Zhou, Chuanqing; Yu, Lei; Ren, Qiushi

    2007-02-01

    Purpose: To study correlations among corneal asphericity, higher-order aberrations, and visual performance for eyes with virgin myopia and after laser in situ keratomileusis (LASIK). Methods: 320 candidates (590 eyes) for LASIK treatment were included in this study. The mean preoperative spherical equivalent was -4.35+/-1.51 D (-1.25 to -9.75), with astigmatism less than 2.5 D. Corneal topography maps and contrast sensitivity were measured and analyzed for every eye before and one year after LASIK for the analysis of corneal asphericity and wavefront aberrations. Results: Preoperatively, only the 4th- and 6th-order aberrations had significant correlations with corneal asphericity and apical radius of curvature (p<0.001). Postoperatively, all 3rd- to 6th-order aberrations had statistically significant correlations with corneal asphericity (p<0.01), but only the 4th- and 6th-order aberrations correlated significantly with apical radius of curvature (p<0.05). Asymmetrical aberrations such as coma correlated significantly with the vertical offset of the pupil center (p<0.01). Preoperatively, corneal aberrations had no significant correlation with visual acuity or area under the log contrast sensitivity function (AULCSF) (P>0.05). Postoperatively, corneal aberrations still had no significant correlation with visual acuity (P>0.05), but had a significantly negative correlation with AULCSF (P<0.01). Corneal asphericity had no significant correlation with AULCSF before or after treatment (P>0.05). Conclusions: Corneal aberrations correlated differently with corneal profile and visual performance for eyes with virgin myopia and after LASIK, which may be due to the changed corneal profile and limitations of the metrics of corneal aberrations.

  9. Analysis of Student Performance in Peer Led Undergraduate Supplements

    NASA Astrophysics Data System (ADS)

    Gardner, Linda M.

    Foundations of Chemistry courses at the University of Kansas have traditionally accommodated nearly 1,000 individual students every year with a single course in a large lecture hall. To develop a more student-centered learning atmosphere, Peer Led Undergraduate Supplements (PLUS) were introduced to assist students, starting in the spring of 2010. PLUS was derived from the better-known Peer-Led Team Learning, with modifications to meet the specific needs of the university and the students. The yearlong investigation of PLUS Chemistry began in the fall of 2012 to allow for adequate development of materials and training of peer leaders. We examined the impact on academic achievement for students who attended PLUS sessions while controlling for high school GPA, math ACT scores, credit hours earned in high school, completion of calculus, gender, and aspiration to be a pharmacist (i.e., pre-pharmacy students). In a linear least-squares multiple regression, PLUS participants performed on average one percent higher on Chemistry 184 exam scores and four tenths of a percent higher on Chemistry 188 exam scores for each PLUS session attended. Pre-pharmacy status moderated the effect of PLUS attendance on chemistry achievement, ultimately negating any relative gain associated with attending PLUS sessions. Evidence of a gender difference was demonstrated in the Chemistry 188 model, indicating that females experience a greater benefit from PLUS sessions. Additionally, an item analysis studied the relationship of PLUS material to individual exam items. The research discovered that students who attended PLUS sessions answered PLUS-related items correctly 10 to 20 percent more often than their comparison group, versus no difference to 10 percent more often for non-PLUS-related items. In summary, PLUS has a positive effect on exam performance in introductory chemistry courses at the University of Kansas.

  10. Parental Behaviors and Adolescent Academic Performance: A Longitudinal Analysis.

    ERIC Educational Resources Information Center

    Melby, Janet N.; Conger, Rand D.

    1996-01-01

    Used 4 waves of data on 347 seventh graders and their parents to examine relation of parental involvement and hostility to academic performance. Parental behavior affected later academic performance, when controlling for earlier performance. Setting and positively reinforcing appropriate behavioral standards increased academic performance, whereas…

  11. Space rescue system definition (system performance analysis and trades)

    NASA Astrophysics Data System (ADS)

    Housten, Sam; Elsner, Tim; Redler, Ken; Svendsen, Hal; Wenzel, Sheri

    This paper addresses key technical issues involved in the system definition of the Assured Crew Return Vehicle (ACRV). The perspective on these issues is that of a prospective ACRV contractor, performing system analysis and trade studies. The objective of these analyses and trade studies is to develop the recovery vehicle system concept and top level requirements. The starting point for this work is the definition of the set of design missions for the ACRV. This set of missions encompasses three classes of contingency/emergency (crew illness/injury, space station catastrophe/failure, transportation element catastrophe/failure). The need is to provide a system to return Space Station crew to Earth quickly (less than 24 hours) in response to randomly occurring contingency events over an extended period of time (30 years of planned Space Station life). The main topics addressed and characterized in this paper include the following: Key Recovery (Rescue) Site Access Considerations; Rescue Site Locations and Distribution; Vehicle Cross Range vs Site Access; On-orbit Loiter Capability and Vehicle Design; and Water vs. Land Recovery.

  12. 1-D Numerical Analysis of RBCC Engine Performance

    NASA Technical Reports Server (NTRS)

    Han, Samuel S.

    1998-01-01

    An RBCC engine combines air-breathing and rocket engines into a single engine to increase the specific impulse over an entire flight trajectory. Considerable research pertaining to RBCC propulsion was performed during the 1960s, and these engines were revisited recently as a candidate propulsion system for either a single-stage-to-orbit (SSTO) or two-stage-to-orbit (TSTO) launch vehicle. There are a variety of RBCC configurations that have been evaluated, and new designs are currently under development. However, the basic configuration of all RBCC systems is built around the ejector scramjet engine originally developed for the hypersonic airplane. In this configuration, a rocket engine acts as an ejector in the air-augmented initial acceleration mode, as a fuel injector in scramjet mode, and as the rocket in all-rocket mode for orbital insertion. Computational fluid dynamics (CFD) is a useful tool for the analysis of complex transport processes in various components of RBCC propulsion systems. The objective of the present research was to develop a transient 1-D numerical model that could be used to predict flow behavior throughout a generic RBCC engine following a flight path.

  13. Analysis of Illinois Home Performance with ENERGY STAR® Measure Packages

    SciTech Connect

    Baker, J.; Yee, S.; Brand, L.

    2013-09-01

    Through the Chicagoland Single Family Housing Characterization and Retrofit Prioritization report, the Partnership for Advanced Residential Retrofit research team characterized 15 housing types in the Chicagoland region based on assessor data, utility billing history, and available data from prior energy efficiency programs. Within these 15 groups, a subset showed the greatest opportunity for energy savings based on BEopt Version 1.1 modeling of potential energy efficiency package options and the percent of the housing stock represented by each group. In this project, collected field data from a whole-home program in Illinois are utilized to compare marketplace-installed measures to the energy saving optimal packages previously developed for the 15 housing types. Housing type, conditions, energy efficiency measures installed, and retrofit cost information were collected from 19 homes that participated in the Illinois Home Performance with ENERGY STAR program in 2012, representing eight of the characterized housing groups. Two were selected for further case study analysis to provide an illustration of the differences between optimal and actually installed measures. Taken together, these homes are representative of 34.8% of the Chicagoland residential building stock. In one instance, actual installed measures closely matched optimal recommended measures.

  14. Analysis of Performance of Stereoscopic-Vision Software

    NASA Technical Reports Server (NTRS)

    Kim, Won; Ansar, Adnan; Steele, Robert; Steinke, Robert

    2007-01-01

    A team of JPL researchers has analyzed stereoscopic vision software and produced a document describing its performance. This software is of the type used in maneuvering exploratory robotic vehicles on Martian terrain. The software in question utilizes correlations between portions of the images recorded by two electronic cameras to compute stereoscopic disparities, which, in conjunction with camera models, are used in computing distances to terrain points to be included in constructing a three-dimensional model of the terrain. The analysis included effects of correlation-window size, a pyramidal image down-sampling scheme, vertical misalignment, focus, maximum disparity, stereo baseline, and range ripples. Contributions of sub-pixel interpolation, vertical misalignment, and foreshortening to stereo correlation error were examined theoretically and experimentally. It was found that camera-calibration inaccuracy contributes to both down-range and cross-range error but stereo correlation error affects only the down-range error. Experimental data for quantifying the stereo disparity error were obtained by use of reflective metrological targets taped to corners of bricks placed at known positions relative to the cameras. For the particular 1,024-by-768-pixel cameras of the system analyzed, the standard deviation of the down-range disparity error was found to be 0.32 pixel.
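    The link between disparity error and down-range error follows from the triangulation relation Z = fB/d, whose first-order error propagation gives sigma_Z = Z^2 * sigma_d / (fB). The sketch below propagates the 0.32-pixel disparity standard deviation reported above through that relation; the baseline and focal length are assumed values for illustration, not the analyzed system's actual parameters.

```python
def stereo_range(baseline_m, focal_px, disparity_px):
    """Triangulated down-range distance from stereo disparity: Z = f*B/d."""
    return baseline_m * focal_px / disparity_px

def downrange_error(range_m, baseline_m, focal_px, sigma_d_px):
    """First-order propagation: sigma_Z = Z**2 * sigma_d / (f * B)."""
    return range_m**2 * sigma_d_px / (focal_px * baseline_m)

# Assumed camera geometry (illustrative): 30 cm baseline, focal length in pixels
B, f = 0.3, 1000.0
Z = stereo_range(B, f, disparity_px=30.0)
err = downrange_error(Z, B, f, sigma_d_px=0.32)  # 0.32 px from the abstract
print(Z, err)
```

The quadratic growth of sigma_Z with Z is why stereo correlation error dominates the down-range error at long range.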

  15. Performance analysis of unmanned vehicle positioning and obstacle mapping

    NASA Astrophysics Data System (ADS)

    Bostelman, Roger; Hong, Tsai; Madhavan, Raj; Chang, Tommy; Scott, Harry

    2006-05-01

    As unmanned ground vehicles take on more and more intelligent tasks, determination of potential obstacles and accurate estimation of their position become critical for successful navigation and path planning. The performance analysis of obstacle mapping and unmanned vehicle positioning in outdoor environments is the subject of this paper. Recently, the National Institute of Standards and Technology's (NIST) Intelligent Systems Division has been a part of the Defense Advanced Research Projects Agency LAGR (Learning Applied to Ground Robots) Program. NIST's objective for the LAGR Project is to insert learning algorithms into the modules that make up the NIST 4D/RCS (Four Dimensional/Real-Time Control System) standard reference model architecture, which has been successfully applied to many intelligent systems. We detail the world modeling techniques used in the 4D/RCS architecture and then analyze the high-precision maps generated by the vehicle world modeling algorithms as compared to ground truth obtained from an independent differential GPS system operable throughout most of the NIST campus. This work has implications not only for outdoor vehicles but also for indoor automated guided vehicles, where future systems will have more and more onboard intelligence requiring non-contact sensors to provide accurate vehicle and object positioning.

  16. Micromechanical analysis of damping performance of piezoelectric structural fiber composites

    NASA Astrophysics Data System (ADS)

    Dai, Qingli; Ng, Kenny

    2010-04-01

    Recent studies have shown that active piezoelectric structural fiber (PSF) composites may achieve significant and simultaneous improvements in sensing/actuating, stiffness, fracture toughness, and vibration damping. These characteristics can be of particular importance in various civil, mechanical, and aerospace structures. This study first conducted a micromechanical finite element analysis to predict the elastic properties and piezoelectric coupling parameters of a special type of active PSF composite laminate. The PSF composite laminates are made of longitudinally poled PSFs that are unidirectionally deployed in the polymer binding matrix. The passive damping performance of these active composites was studied under cyclic force loadings at different frequencies. It was found that the passive electromechanical coupling behavior can absorb limited dynamic energy and delay the structural response with minimal viscoelastic damping. The actuating function of the piezoelectric materials was then applied to reduce the dynamic mechanical deformation. Step voltage inputs were imposed on the interdigital electrodes of the PSF laminate transducer along the poled direction. The cyclic pressure loading was applied transversely to the composite laminate. The electromechanical interaction through the 1-3 coupling parameter generated a transverse expansion, which can reduce the cyclic deformation evenly by shifting the response waves. This study shows promise in using this type of active composite as an actuator to improve the dynamic stability of structures.

  17. Analysis and performance of subsonic ejectors for pulsatile flow applications

    SciTech Connect

    Roche, J.G.; Liburdy, J.A.

    1994-12-31

    This study looks at the application of ejectors to four-stroke engines. The goal is to develop a system of exhaust gas emission control by premixing exhaust gas with fresh atmospheric air. The constraints on the system include relatively low-pressure pulsatile flow of the primary gas, geometric constraints (small size), significant density differences between the two fluid streams, and possible large back-pressure operating conditions. A model is applied to the ejector application for pulsatile flow based on a global control volume analysis. The model constrains the operating conditions based on conservation of mass, momentum, and energy for incompressible flow conditions. The time-dependent effects are modeled by including a representative inertia term in the momentum equation based on quasi-steady conditions. The results are used to illustrate the operating characteristics for a small four-stroke engine application. The sensitivity of operation to the operating and design parameters of the system is illustrated. In particular, the effects of the pulsatile flow on the operation are shown to increase the performance under certain operating conditions. The model simulation is compared to some data available in the literature.

  18. Analysis of classifiers performance for classification of potential microcalcification

    NASA Astrophysics Data System (ADS)

    M. N., Arun K.; Sheshadri, H. S.

    2013-07-01

    Breast cancer is a significant public health problem in the world. According to the literature, early detection improves breast cancer prognosis. Mammography is a screening tool used for early detection of breast cancer. About 10-30% of cases are missed during routine checks, as it is difficult for radiologists to make an accurate analysis of the large amount of data. Microcalcifications (MCs) are considered to be important signs of breast cancer. It has been reported in the literature that 30%-50% of breast cancers detected radiographically show MCs on mammograms. Histologic examinations report that 62% to 79% of breast carcinomas reveal MCs. MCs are tiny; vary in size, shape, and distribution; and may be closely connected to surrounding tissues. There is a major challenge in using traditional classifiers for the classification of individual potential MCs, as the processing of mammograms at the appropriate stage generates data sets with an unequal amount of information for the two classes (i.e., MC and not-MC). Most of the existing state-of-the-art classification approaches assume that the underlying training set is evenly distributed. However, they face a severe bias problem when the training set is highly imbalanced in distribution. This paper addresses this issue by using classifiers which handle imbalanced data sets. In this paper, we also compare the performance of classifiers used in the classification of potential MCs.
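    One common remedy for the imbalance problem described above is inverse-frequency class weighting, which makes misclassifying the rare class (MC) proportionally more costly during training. The sketch below is a generic illustration of that idea with a made-up 1:9 imbalance; it is not the specific set of classifiers compared in the paper.

```python
from collections import Counter

def balanced_class_weights(labels):
    """Inverse-frequency weights, w_c = n / (k * n_c), so that each class
    contributes equally to the training loss despite unequal counts."""
    counts = Counter(labels)
    n, k = len(labels), len(counts)
    return {c: n / (k * nc) for c, nc in counts.items()}

# Illustrative 1:9 imbalance between MC and non-MC candidate regions
y = ["MC"] * 10 + ["not-MC"] * 90
w = balanced_class_weights(y)
print(w)
```

Weights of this form are what "balanced" class-weighting options in common machine-learning libraries compute internally.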

  19. A Design and Performance Analysis Tool for Superconducting RF Systems

    NASA Astrophysics Data System (ADS)

    Schilcher, Th.; Simrock, S. N.; Merminga, L.; Wang, D. X.

    1997-05-01

    Superconducting rf systems are usually operated with continuous rf power or with rf pulse lengths exceeding 1 ms to maximize the overall wall plug power efficiency. Typical examples are CEBAF at Jefferson Lab and the TESLA Test Facility at DESY. The long pulses allow for effective application of feedback to stabilize the accelerating field in the presence of microphonics, Lorentz force detuning, and fluctuations of the beam current. In this paper we describe a set of tools to be used with MATLAB and SIMULINK, which allow one to analyze the quality of field regulation for a given design. The tools include models for the cavities, the rf power source, the beam, sources of field perturbations, and the rf feedback system. The rf control relevant electrical and mechanical characteristics of the cavity are described in the form of time-varying state space models. The power source is modeled as a current generator and includes saturation characteristics and noise. An arbitrary time structure can be imposed on the beam current to reflect a macro-pulse structure and bunch charge fluctuations. For rf feedback several schemes can be selected: traditional amplitude and phase control as well as I/Q control. The choices for the feedback controller include analog or digital approaches and various choices of frequency response. Feedforward can be added to further suppress repetitive errors. The results of a performance analysis of the CEBAF and the TESLA Linac rf systems using these tools are presented.

  20. A design and performance analysis tool for superconducting RF systems

    SciTech Connect

    T. Schilcher; S.N. Simrock; L. Merminga; D.X. Wang

    1997-05-01

    Superconducting rf systems are usually operated with continuous rf power or with rf pulse lengths exceeding 1 ms to maximize the overall wall plug power efficiency. Typical examples are CEBAF at the Thomas Jefferson National Accelerator Facility (Jefferson Lab) and the TESLA Test Facility at DESY. The long pulses allow for effective application of feedback to stabilize the accelerating field in the presence of microphonics, Lorentz force detuning, and fluctuations of the beam current. In this paper the authors describe a set of tools to be used with MATLAB and SIMULINK, which allow one to analyze the quality of field regulation for a given design. The tools include models for the cavities, the rf power source, the beam, sources of field perturbations, and the rf feedback system. The rf control relevant electrical and mechanical characteristics of the cavity are described in the form of time-varying state space models. The power source is modeled as a current generator and includes saturation characteristics and noise. An arbitrary time structure can be imposed on the beam current to reflect a macro-pulse structure and bunch charge fluctuations. For rf feedback several schemes can be selected: traditional amplitude and phase control as well as I/Q control. The choices for the feedback controller include analog or digital approaches and various choices of frequency response. Feedforward can be added to further suppress repetitive errors. The results of a performance analysis of the CEBAF and the TESLA Linac rf systems using these tools are presented.
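    The benefit of feedback in suppressing static perturbations such as Lorentz-force detuning can be illustrated with a toy scalar stand-in for the cavity state-space model: a first-order low-pass cavity driven by the rf source, with a constant field perturbation and proportional feedback. All numbers below are illustrative, not CEBAF or TESLA parameters.

```python
import math

def simulate(kp, steps=2000, dt=1e-6, tau=1e-3, setpoint=1.0, detune=0.2):
    """Toy scalar cavity: first-order low-pass response to the rf drive,
    with a static perturbation 'detune' and proportional feedback gain kp."""
    a = math.exp(-dt / tau)   # one-step decay of the cavity field
    v = 0.0                   # normalized accelerating field
    for _ in range(steps):
        error = setpoint - v
        drive = setpoint + kp * error           # feedforward + feedback
        v = a * v + (1 - a) * (drive - detune)  # perturbed cavity update
    return v

open_loop = simulate(kp=0.0)
closed_loop = simulate(kp=50.0)
print(open_loop, closed_loop)
```

At steady state the residual error is detune/(1 + kp), so raising the loop gain shrinks the static field error, which is the qualitative effect the MATLAB/SIMULINK tools quantify for realistic cavity and controller models.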

  1. ADS-Demo Fuel Rod Performance: Multivariate Statistical Analysis

    SciTech Connect

    Calabrese, R.; Vettraino, F.; Luzzi, L.

    2004-07-01

    A forward step in the development of the Accelerator Driven System (ADS) for Pu, MA, and LLFP transmutation is the realisation of an 80 MWt ADS-demo (XADS), whose basic objective is demonstration of system feasibility. The XADS is expected to adopt the UO{sub 2}-PuO{sub 2} mixed-oxide fuel already tested in sodium-cooled fast reactors such as the French SPX-1. The present multivariate statistical analysis, performed using the Transuranus code, was carried out for Normal Operation at the so-called Enhanced Nominal Conditions (120% nominal reactor power), aimed at verifying that the fuel system complies with the stated design limits, i.e., centerline fuel temperature, cladding temperature, and damage, during the entire in-reactor lifetime. A statistical input set similar to the SPX and PEC fuel cases was adopted. One of the most relevant assumptions in the present calculations was 30% AISI-316 cladding thickness corrosion at EOL. The relative influence of the main fuel rod parameters on fuel centerline temperature was also evaluated. (authors)

  2. Design consideration and performance analysis of OCT-based topography

    NASA Astrophysics Data System (ADS)

    Meemon, Panomsak; Yao, Jianing; Rolland, Jannick P.

    2014-03-01

    We report a study on design considerations and performance analysis of OCT-based topography by tracking the maximum intensity at each layer interface. We demonstrate that, for a given stabilized OCT system, high precision and accuracy of OCT-based layer and thickness topography on the order of tens of nanometers can be achieved using a maximum-amplitude tracking technique. Submicron precision was obtained by oversampling through the FFT of the acquired spectral fringes but was ultimately limited by system stability. Furthermore, we report characterization of the precision, repeatability, and accuracy of surface, sub-surface, and thickness topography using our optimized FD-OCT system. We verified that, for the given stability of our OCT system, precision of the detected signal-peak position down to 20 nm was obtained. In addition, we quantified the degradation of precision caused by the sensitivity fall-off over depth in FD-OCT. The measured precision is about 20 nm at about 0.1 mm depth and degrades to about 80 nm at 1 mm depth, a position of about 10 dB sensitivity fall-off. The measured repeatability of thickness measurements over depth was approximately 0.04 micron. Finally, the accuracy of the system was verified by comparison with a digital micrometer gauge.
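    Tracking a peak to sub-sample precision, as described above, is often implemented by parabolic interpolation around the strongest sample of the A-scan magnitude. The sketch below applies that generic technique to a synthetic Gaussian peak; it is an illustration of the idea, not the paper's actual processing chain, and the peak width and position are made up.

```python
import numpy as np

def subpixel_peak(y):
    """Parabolic interpolation around the strongest sample: a common way to
    localize a peak to a fraction of the sample spacing."""
    i = int(np.argmax(y))
    y0, y1, y2 = y[i - 1], y[i], y[i + 1]
    # Vertex of the parabola through the three samples around the maximum
    delta = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)
    return i + delta

# Synthetic A-scan envelope: Gaussian peak centered between samples at 50.3
x = np.arange(100)
y = np.exp(-((x - 50.3) ** 2) / (2 * 4.0**2))
print(subpixel_peak(y))
```

Zero-padding the spectral fringes before the FFT (the oversampling mentioned above) serves the same purpose of refining the peak position beyond the raw sample grid.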

  3. Routing performance analysis and optimization within a massively parallel computer

    DOEpatents

    Archer, Charles Jens; Peters, Amanda; Pinnow, Kurt Walter; Swartz, Brent Allen

    2013-04-16

    An apparatus, program product and method optimize the operation of a massively parallel computer system by, in part, receiving actual performance data concerning an application executed by the plurality of interconnected nodes, and analyzing the actual performance data to identify an actual performance pattern. A desired performance pattern may be determined for the application, and an algorithm may be selected from among a plurality of algorithms stored within a memory, the algorithm being configured to achieve the desired performance pattern based on the actual performance data.

  4. Analysis of TIMS performance subjected to simulated wind blast

    NASA Technical Reports Server (NTRS)

    Jaggi, S.; Kuo, S.

    1992-01-01

    The results of the performance of the Thermal Infrared Multispectral Scanner (TIMS) when it is subjected to various wind conditions in the laboratory are described. Various wind conditions were simulated using a 24-inch fan or combinations of air jet streams blowing toward either or both of the blackbody surfaces. The fan was used to simulate a large volume of air flow at moderate speeds (up to 30 mph). The small-diameter air jets were used to probe the TIMS system response to localized wind perturbations. The maximum nozzle speed of the air jet was 60 mph. A range of wind directions and speeds was set up in the laboratory during the test. The majority of the wind tests were conducted under ambient conditions, with the room temperature fluctuating no more than 2 C. The temperature of the high-speed air jet was determined to be within 1 C of the room temperature. The TIMS response was recorded on analog tape. Additional thermistor readouts of the blackbody temperatures and a thermocouple readout of the ambient temperature were recorded manually to be compared with the housekeeping data recorded on the tape. Additional tests were conducted under conditions of elevated and cooled room temperatures. The room temperature was varied between 19.5 and 25.5 C in these tests. The calibration parameters needed for quantitative analysis of TIMS data were first plotted on a scanline-by-scanline basis. These parameters are the low and high blackbody temperature readings as recorded by the TIMS and their corresponding digitized count values. Using these values, the system transfer equation was calculated. This equation allows us to compute the flux for any video count by computing the slope and intercept of the straight line that relates flux to digital count. The actual video of the target (the lab floor in this case) was then compared with a simulated target. This simulated target was assumed to be a blackbody with an emissivity of 0.95, and the temperature was
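    The system transfer equation described above is a standard two-point radiometric calibration: the two blackbody references fix the slope and intercept of the linear count-to-flux relation for each scanline. The sketch below uses made-up counts and fluxes for illustration, not actual TIMS housekeeping values.

```python
def two_point_calibration(count_low, flux_low, count_high, flux_high):
    """Slope and intercept of the linear transfer between digital counts
    and radiant flux, fixed by the two blackbody reference readings."""
    slope = (flux_high - flux_low) / (count_high - count_low)
    intercept = flux_low - slope * count_low
    return slope, intercept

def count_to_flux(count, slope, intercept):
    """Apply the scanline's transfer equation to any video count."""
    return slope * count + intercept

# Illustrative reference points only (counts, flux in arbitrary units)
slope, intercept = two_point_calibration(30, 5.0, 220, 42.0)
print(count_to_flux(125, slope, intercept))
```

Recomputing the pair per scanline, as the abstract describes, compensates for any drift in the scanner's response during the wind tests.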

  5. Performance analysis & optimization of well production in unconventional resource plays

    NASA Astrophysics Data System (ADS)

    Sehbi, Baljit Singh

    The Unconventional Resource Plays, consisting of the lowest tier of resources (large volumes and most difficult to develop), have been the main focus of US domestic activity during recent times. Horizontal well drilling and hydraulic fracturing completion technology have been primarily responsible for this paradigm shift. The concept of drainage volume is examined using pressure diffusion along streamlines. We use diffusive time of flight to optimize the number of hydraulic fracture stages in horizontal well applications for Tight Gas reservoirs. Numerous field case histories are available in the literature for optimizing the number of hydraulic fracture stages, although the conclusions are case specific. In contrast, a general method is presented that can be used to augment the field experiments necessary to optimize the number of hydraulic fracture stages. The optimization results for the tight gas example are in line with the results from economic analysis. Fluid flow simulation for Naturally Fractured Reservoirs (NFR) is performed with Dual-Permeability or Dual-Porosity formulations. Microseismic data from a Barnett Shale well are used to characterize the hydraulic fracture geometry. Sensitivity analysis, uncertainty assessment, and manual and computer-assisted history matching are integrated to develop a comprehensive workflow for building reliable reservoir simulation models. We demonstrate that incorporating the proper physics of flow is the first step in building reliable reservoir simulation models. Lack of proper physics often leads to unreasonable reservoir parameter estimates. The workflow demonstrates reduced non-uniqueness for the inverse history matching problem. The behavior of near-critical fluids in Liquid Rich Shale plays defies the production behavior observed in conventional reservoir systems. In conventional reservoirs, an increased gas-oil ratio is observed as the flowing bottom-hole pressure falls below the saturation pressure. The production behavior is

  6. Performance of a malaria microscopy image analysis slide reading device

    PubMed Central

    2012-01-01

    Background Viewing Plasmodium in Romanovsky-stained blood has long been considered the gold standard for diagnosis and a cornerstone in management of the disease. This method, however, requires a subjective evaluation by trained, experienced diagnosticians, and establishing proficiency of diagnosis is fraught with many challenges. Reported here is an evaluation of a diagnostic system (a “device” consisting of a microscope, a scanner, and a computer algorithm) that evaluates scanned images of standard Giemsa-stained slides and reports species and parasitaemia. Methods The device was challenged with two independent tests: a 55-slide expert slide-reading test whose composition has been published by the World Health Organization (“WHO55” test), and a second test in which slides were made from a sample of consenting subjects participating in a malaria incidence survey conducted in Equatorial Guinea (EGMIS test). These subjects’ blood was tested by malaria RDT, and the blood smear diagnosis was unequivocally determined by a worldwide panel of a minimum of six reference microscopists. Only slides with unequivocal microscopic diagnoses were used for the device challenge, n = 119. Results On the WHO55 test, the device scored a “Level 4” using the WHO published grading scheme. Broken down by more traditional analysis parameters, this result translates to 89% and 70% sensitivity and specificity, respectively. Species were correctly identified in 61% of the slides, and the quantification of parasites fell within the acceptable range of the validated parasitaemia in 10% of the cases. On the EGMIS test it scored 100% and 94% sensitivity/specificity, with 64% of the species correct and 45% of the parasitaemia within an acceptable range. A pooled analysis of the 174 slides used for both tests resulted in an overall 92% sensitivity and 90% specificity, with 61% of species and 19% of quantifications correct. Conclusions In its current manifestation, the
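The sensitivity and specificity figures reported above follow directly from confusion-matrix counts; a minimal sketch (the counts below are hypothetical, not the study's data):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity and specificity from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)  # fraction of infected slides flagged positive
    specificity = tn / (tn + fp)  # fraction of clean slides correctly cleared
    return sensitivity, specificity

# Hypothetical counts for a slide-reading challenge (not the study's data)
sens, spec = diagnostic_metrics(tp=50, fp=3, tn=47, fn=0)
print(f"sensitivity={sens:.0%} specificity={spec:.0%}")  # → sensitivity=100% specificity=94%
```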

  7. Performance Analysis of Saturated Induction Motors by Virtual Tests

    ERIC Educational Resources Information Center

    Ojaghi, M.; Faiz, J.; Kazemi, M.; Rezaei, M.

    2012-01-01

    Many undergraduate-level electrical machines textbooks give detailed treatments of the performance of induction motors. Students can deepen this understanding of motor performance by performing the appropriate practical work in laboratories or in simulation using proper software packages. This paper considers various common and less-common tests…

  8. Performance analysis in sport: contributions from a joint analysis of athletes' experience and biomechanical indicators.

    PubMed

    Sève, C; Nordez, A; Poizat, G; Saury, J

    2013-10-01

    The purpose of this study was to test the usefulness of combining two types of analysis to investigate sports performance with the aim of optimizing it. These two types of analysis correspond to two levels of athletes' activity: (a) their experiences during performance and (b) the biomechanical characteristics of their movements. Rowing served as an illustration, and the activity of one female crew member was studied during a race. Three types of data were collected: (a) audiovisual data recorded during the race; (b) verbalization data obtained in interviews conducted afterward; and (c) biomechanical data. The courses of experience of the two rowers during the race were reconstructed on the basis of the audiovisual and verbalization data. This paper presents a detailed analysis of a single phenomenon of the race experienced by one of the rowers. According to the coaches, it reflected a dysfunction in crew coordination. The aim of this analysis was to identify the biomechanical characteristics of the rowers' movements that might explain it. The results showed that the phenomenon could be explained principally by an amplitude differential between the two rowers' strokes. On this basis, the coaches defined new training objectives to remedy the dysfunction in crew coordination. PMID:22150999

  9. Graphical User Interface for Simulink Integrated Performance Analysis Model

    NASA Technical Reports Server (NTRS)

    Durham, R. Caitlyn

    2009-01-01

    The J-2X Engine (built by Pratt & Whitney Rocketdyne) in the Upper Stage of the Ares I Crew Launch Vehicle will only start within a certain range of temperature and pressure for Liquid Hydrogen and Liquid Oxygen propellants. The purpose of the Simulink Integrated Performance Analysis Model is to verify that in all reasonable conditions the temperature and pressure of the propellants are within the required J-2X engine start boxes. In order to run the simulation, test variables must be entered at all reasonable values of parameters such as heat leak and mass flow rate. To make this testing process as efficient as possible in order to save the maximum amount of time and money, and to show that the J-2X engine will start when it is required to do so, a graphical user interface (GUI) was created to allow the input of values to be used as parameters in the Simulink Model, without opening or altering the contents of the model. The GUI must allow for test data to come from Microsoft Excel files, allow those values to be edited before testing, place those values into the Simulink Model, and get the output from the Simulink Model. The GUI was built using MATLAB, and will run the Simulink simulation when the Simulate option is activated. After running the simulation, the GUI will construct a new Microsoft Excel file, as well as a MATLAB matrix file, using the output values for each test of the simulation so that they may be graphed and compared to other values.

  10. The Algerian Seismic Network: Performance from data quality analysis

    NASA Astrophysics Data System (ADS)

    Yelles, Abdelkarim; Allili, Toufik; Alili, Azouaou

    2013-04-01

    Seismic monitoring in Algeria has seen a great change since the Boumerdes earthquake of May 21st, 2003. Indeed, the installation of a new Algerian Digital Seismic Network (ADSN) drastically upgraded the previous analog telemetry network. During the last four years, the number of stations in operation has greatly increased to 66 stations, with 15 broadband, 2 very broadband, 47 short-period and 21 accelerometer stations connected in real time using various modes of transmission (VSAT, ADSL, GSM, etc.) and managed by Antelope software. The spatial distribution of these stations covers most of northern Algeria from east to west. Since the network became operational, a significant number of local, regional and tele-seismic events have been located by the automatic processing, then revised and archived in databases. This new set of data is characterized by the accuracy of the automatic location of local seismicity and the ability to determine its focal mechanisms. Periodically, recorded data including earthquakes, calibration pulses and cultural noise are checked using PSD (Power Spectral Density) analysis to determine the noise level. ADSN broadband station data quality is controlled in quasi real time using the PQLX software by computing PDFs and PSDs of the recordings. Other tools and programs allow monitoring and maintenance of the entire electronic system, for example checking the power state of the system, the mass position of the sensors and the environmental conditions (temperature, humidity, air pressure) inside the vaults. The new design of the network allows management of many aspects of real-time seismology: seismic monitoring, rapid earthquake determination, alert messages, moment tensor estimation, seismic source determination, shakemap calculation, etc. Adherence to international standards permits contributions to regional seismic monitoring and the Mediterranean warning system. The next two years, with the acquisition of new seismic equipment to reach 50 new BB stations, led to
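The PSD-based noise check mentioned above can be illustrated with a Welch-style averaged periodogram; this is a generic NumPy sketch on a synthetic signal, not the ADSN or PQLX implementation:

```python
import numpy as np

def welch_psd(x, fs, nperseg=256):
    """Estimate one-sided power spectral density by averaging windowed,
    50%-overlapping periodograms (Welch's method)."""
    window = np.hanning(nperseg)
    step = nperseg // 2                      # 50% overlap
    scale = fs * (window ** 2).sum()         # standard Welch normalization
    segments = [x[i:i + nperseg] for i in range(0, len(x) - nperseg + 1, step)]
    psd = np.zeros(nperseg // 2 + 1)
    for seg in segments:
        spec = np.fft.rfft(seg * window)
        psd += (np.abs(spec) ** 2) / scale
    psd /= len(segments)
    psd[1:-1] *= 2                           # fold negative frequencies (one-sided)
    freqs = np.fft.rfftfreq(nperseg, d=1.0 / fs)
    return freqs, psd

# Example: 20 Hz sinusoid sampled at 100 Hz as a stand-in for a calibration pulse
fs = 100.0
t = np.arange(0, 10, 1 / fs)
x = np.sin(2 * np.pi * 20 * t)
freqs, psd = welch_psd(x, fs)
print(freqs[np.argmax(psd)])  # peak near 20 Hz
```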

  11. LIFT: analysis of performance in a laser assisted adaptive optics

    NASA Astrophysics Data System (ADS)

    Plantet, Cedric; Meimon, Serge; Conan, Jean-Marc; Neichel, Benoît; Fusco, Thierry

    2014-08-01

    Laser assisted adaptive optics systems rely on Laser Guide Star (LGS) Wave-Front Sensors (WFS) for high order aberration measurements, and on Natural Guide Star (NGS) WFS to complement the measurements on low orders such as tip-tilt and focus. The sky coverage of the whole system is therefore related to the limiting magnitude of the NGS WFS. We have recently proposed LIFT, a novel phase retrieval WFS technique that allows a 1 magnitude gain over the commonly used 2×2 Shack-Hartmann WFS. After an in-lab validation, LIFT's concept has been demonstrated on sky in open loop on GeMS (the Gemini Multiconjugate adaptive optics System at Gemini South). To complete its validation, LIFT now needs to be operated in closed loop in a laser assisted adaptive optics system. The present work gives a detailed analysis of LIFT's behavior in the presence of high order residuals and of how to limit aliasing effects on the tip/tilt/focus estimation. We also study the high orders' impact on noise propagation. For this purpose, we simulate a multiconjugate adaptive optics loop representative of a GeMS-like 5 LGS configuration. The residual high orders are derived from a Fourier based simulation. We demonstrate that LIFT keeps a high performance gain over the 2×2 Shack-Hartmann whatever the turbulence conditions. Finally, we show the first simulation of a closed loop with LIFT estimating turbulent tip/tilt and focus residuals that could be induced by sodium layer altitude variations.

  12. VALIDATION GUIDELINES FOR LABORATORIES PERFORMING FORENSIC ANALYSIS OF CHEMICAL TERRORISM

    EPA Science Inventory

    The Scientific Working Group on Forensic Analysis of Chemical Terrorism (SWGFACT) has developed the following guidelines for laboratories engaged in the forensic analysis of chemical evidence associated with terrorism. This document provides a baseline framework and guidance for...

  13. An empirical performance analysis of commodity memories in commodity servers

    SciTech Connect

    Kerbyson, D. J.; Lang, M. K.; Patino, G.

    2004-01-01

    This work details a performance study of six different commodity memories in two commodity server nodes on a number of microbenchmarks that measure low-level performance characteristics, as well as on two applications representative of the ASCI workload. The memories vary both in terms of performance, including latency and bandwidth, and in terms of their physical properties and manufacturer. Two server nodes were used: one Itanium-II Madison based system and one Xeon based system. All the memories examined can be used within both processing nodes. This allows the performance of the memories to be directly examined while keeping all other factors within a processing node the same (processor, motherboard, operating system, etc.). The results of this study show that there can be a significant difference in application performance from the different memories - by as much as 20%. Thus, by choosing the most appropriate memory for a processing node at a minimal cost differential, significantly improved performance may be achievable.

  14. Analysis of factors that predict clinical performance in medical school.

    PubMed

    White, Casey B; Dey, Eric L; Fantone, Joseph C

    2009-10-01

    Academic achievement indices including GPAs and MCAT scores are used to predict the spectrum of medical student academic performance types. However, use of these measures ignores two changes influencing medical school admissions: student diversity and affirmative action, and an increased focus on communication skills. To determine if GPA and MCAT predict performance in medical school consistently across students, and whether either predicts clinical performance in clerkships. A path model was developed to examine relationships among indices of medical student performance during the first three years of medical school for five cohorts of medical students. A structural equation approach was used to calculate the coefficients hypothesized in the model for majority and minority students. Significant differences between majority and minority students were observed. MCAT scores, for example, did not predict performance of minority students in the first year of medical school but did predict performance of majority students. This information may be of use to medical school admissions and resident selection committees. PMID:18030590

  15. Integrated Flight Performance Analysis of a Launch Abort System Concept

    NASA Technical Reports Server (NTRS)

    Tartabini, Paul V.

    2007-01-01

    This paper describes initial flight performance analyses conducted early in the Orion Project to support concept feasibility studies for the Crew Exploration Vehicle s Launch Abort System (LAS). Key performance requirements that significantly affect abort capability are presented. These requirements have implications on sizing the Abort Motor, tailoring its thrust profile to meet escape requirements for both launch pad and high drag/high dynamic pressure ascent aborts. Additional performance considerations are provided for the Attitude Control Motor, a key element of the Orion LAS design that eliminates the need for ballast and provides performance robustness over a passive control approach. Finally, performance of the LAS jettison function is discussed, along with implications on Jettison Motor sizing and the timing of the jettison event during a nominal mission. These studies provide an initial understanding of LAS performance that will continue to evolve as the Orion design is matured.

  16. Analysis of Aurora's Performance Simulation Engine for Three Systems

    SciTech Connect

    Freeman, Janine; Simon, Joseph

    2015-07-07

    Aurora Solar Inc. is building a cloud-based optimization platform to automate the design, engineering, and permit generation process of solar photovoltaic (PV) installations. They requested that the National Renewable Energy Laboratory (NREL) validate the performance of the PV system performance simulation engine of Aurora Solar’s solar design platform, Aurora. In previous work, NREL performed a validation of multiple other PV modeling tools, so this study builds upon that work by examining all of the same fixed-tilt systems with available module datasheets that NREL selected and used in the aforementioned study. Aurora Solar set up these three operating PV systems in their modeling platform using NREL-provided system specifications and concurrent weather data. NREL then verified the setup of these systems, ran the simulations, and compared the Aurora-predicted performance data to measured performance data for those three systems, as well as to performance data predicted by other PV modeling tools.

  17. Analysis of Dynamic Performances for Servo Drive Hydraulic System

    NASA Astrophysics Data System (ADS)

    Yang, Jianxi; Wang, Liying; Huang, Jian

    Based on a model of the servo drive hydraulic system built in MATLAB/Simulink, this paper analyzes and simulates the impact of the system parameters (structural parameters J and Dp, and mechanism parameters A1, α, k, V1, Cp) on system dynamic performance. The relation curves between the main system characteristics and dynamic performance obtained from the simulations provide a basis for improving system dynamic performance. The simulation results indicate that dynamic performance can be improved through reasonable selection of the system structural parameters. This also lays a theoretical foundation for further study on energy saving in hydraulic injection machines.

  18. Analysis and Performance of a 12-Pulse High Power Regulator

    NASA Technical Reports Server (NTRS)

    Silva, Arnold; Daeges, John

    1994-01-01

    Under work being performed to upgrade the 20 Kilowatt CW uplink transmitters of the NASA Deep Space Network (DSN), the high voltage regulator has been revisited in order to optimize its performance (long-term stability and regulation), and enhance field reliability.

  19. Manual control analysis of drug effects on driving performance

    NASA Technical Reports Server (NTRS)

    Smiley, A.; Ziedman, K.; Moskowitz, H.

    1981-01-01

    The effects of secobarbital, diazepam, alcohol, and marihuana on car-driver transfer functions obtained using a driving simulator were studied. The first three substances, all CNS depressants, reduced gain, crossover frequency, and coherence which resulted in poorer tracking performance. Marihuana also impaired tracking performance but the only effect on the transfer function parameters was to reduce coherence.

  20. Pitch Error Analysis of Young Piano Students' Music Reading Performances

    ERIC Educational Resources Information Center

    Rut Gudmundsdottir, Helga

    2010-01-01

    This study analyzed the music reading performances of 6-13-year-old piano students (N = 35) in their second year of piano study. The stimuli consisted of three piano pieces, systematically constructed to vary in terms of left-hand complexity and input simultaneity. The music reading performances were recorded digitally and a code of error analysis…

  1. An Analysis of Comprehension Performance of Sudanese EFL Students

    ERIC Educational Resources Information Center

    Ahmed, Mahgoub Dafalla

    2015-01-01

    This study examines the Sudanese EFL students' comprehension performance and measures the differences in their performance according to gender. After one semester of participation in extensive reading, 300 secondary school students from 15 schools in the state of Khartoum are randomly selected for the study. A comprehension test followed by a…

  2. An Analysis of a High Performing School District's Culture

    ERIC Educational Resources Information Center

    Corum, Kenneth D.; Schuetz, Todd B.

    2012-01-01

    This report describes a problem based learning project focusing on the cultural elements of a high performing school district. Current literature on school district culture provides numerous cultural elements that are present in high performing school districts. With the current climate in education placing pressure on school districts to perform…

  3. Analysis and performance of radial flow rotary desiccant dehumidifiers

    SciTech Connect

    Elsayed, M.M.; Chamkha, A.J.

    1997-02-01

    A model is developed to predict the steady periodic performance of a radial flow desiccant wheel. The model is expressed in terms of the same dimensionless parameters that are commonly used in modeling the conventional axial flow desiccant wheel. In addition, a dimensionless geometrical ratio of the volume of the matrix to the volume of the wheel core is found to affect the performance of the wheel. A finite difference technique on a staggered grid is used to discretize the governing dimensionless equations. The discretized equations are solved to predict the performance of the desiccant wheel at given values of the operating parameters. A sensitivity study is carried out to investigate the effect of changing any of these parameters on the performance of the wheel. The performance of the radial flow desiccant wheel is then compared with that of the axial flow wheel having the same values of the operating parameters.

  4. Modelling and performance analysis of four and eight element TCAS

    NASA Technical Reports Server (NTRS)

    Sampath, K. S.; Rojas, R. G.; Burnside, W. D.

    1990-01-01

    This semi-annual report describes the work performed during the period September 1989 through March 1990. The first section presents a description of the effect of the engines of the Boeing 737-200 on the performance of a bottom mounted eight-element traffic alert and collision avoidance system (TCAS). The second section deals exclusively with a four element TCAS antenna. The model obtained to simulate the four element TCAS and new algorithms developed for studying its performance are described. The effect of location on its performance when mounted on top of a Boeing 737-200 operating at 1060 MHz is discussed. It was found that the four element TCAS generally does not perform as well as the eight element TCAS III.

  5. Performance analysis of morphological component analysis (MCA) method for mammograms using some statistical features

    NASA Astrophysics Data System (ADS)

    Gardezi, Syed Jamal Safdar; Faye, Ibrahima; Kamel, Nidal; Eltoukhy, Mohamed Meselhy; Hussain, Muhammad

    2014-10-01

    Early detection of breast cancer helps reduce mortality rates. Mammography is a very useful tool in breast cancer detection, but it is difficult to separate different morphological features in mammographic images. In this study, the Morphological Component Analysis (MCA) method is used to extract different morphological aspects of mammographic images while effectively preserving the morphological characteristics of regions. MCA decomposes the mammogram into a piecewise-smooth part and a texture part using the Local Discrete Cosine Transform (LDCT) and the Curvelet Transform via wrapping (CURVwrap). A simple performance comparison is made, using some statistical features, between the original image and the piecewise-smooth part obtained from the MCA decomposition. The results show that MCA suppresses structural noise and blood vessels in the mammogram and enhances performance for mass detection.

  6. Performance analysis of a Principal Component Analysis ensemble classifier for Emotiv headset P300 spellers.

    PubMed

    Elsawy, Amr S; Eldawlatly, Seif; Taher, Mohamed; Aly, Gamal M

    2014-01-01

    The current trend to use Brain-Computer Interfaces (BCIs) with mobile devices mandates the development of efficient EEG data processing methods. In this paper, we demonstrate the performance of a Principal Component Analysis (PCA) ensemble classifier for P300-based spellers. We recorded EEG data from multiple subjects using the Emotiv neuroheadset in the context of a classical oddball P300 speller paradigm. We compare the performance of the proposed ensemble classifier to the performance of traditional feature extraction and classifier methods. Our results demonstrate the capability of the PCA ensemble classifier to classify P300 data recorded using the Emotiv neuroheadset with an average accuracy of 86.29% on cross-validation data. In addition, offline testing of the recorded data reveals an average classification accuracy of 73.3% that is significantly higher than that achieved using traditional methods. Finally, we demonstrate the effect of the parameters of the P300 speller paradigm on the performance of the method. PMID:25571123
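PCA-based dimensionality reduction ahead of a simple classifier can be sketched as follows; this is a generic illustration on synthetic data, not the authors' ensemble pipeline or the Emotiv data format:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for epoched EEG features: 200 trials x 64 features,
# where class 1 ("target") carries a small added deflection on 8 features.
X = rng.normal(size=(200, 64))
y = np.arange(200) % 2
X[y == 1, :8] += 1.0

def pca_fit(X, n_components):
    """Fit PCA via SVD of the mean-centered data; returns (mean, components)."""
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:n_components]

mean, comps = pca_fit(X, n_components=10)
Z = (X - mean) @ comps.T        # project trials onto principal components

# Nearest-class-mean classifier on the reduced features
mu0, mu1 = Z[y == 0].mean(axis=0), Z[y == 1].mean(axis=0)
pred = (np.linalg.norm(Z - mu1, axis=1) < np.linalg.norm(Z - mu0, axis=1)).astype(int)
print("training accuracy:", (pred == y).mean())
```

The reduction from 64 features to 10 components is the step that makes downstream classification cheap enough for mobile use, which is the motivation the abstract cites.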

  7. Sensitivity analysis and performance estimation of refractivity from clutter techniques

    NASA Astrophysics Data System (ADS)

    Yardim, Caglar; Gerstoft, Peter; Hodgkiss, William S.

    2009-02-01

    Refractivity from clutter (RFC) refers to techniques that estimate the atmospheric refractivity profile from radar clutter returns. A RFC algorithm works by finding the environment whose simulated clutter pattern matches the radar measured one. This paper introduces a procedure to compute RFC estimator performance. It addresses the major factors such as the radar parameters, the sea surface characteristics, and the environment (region, time of the day, season) that affect the estimator performance and formalizes an error metric combining all of these. This is important for applications such as calculating the optimal radar parameters, selecting the best RFC inversion algorithm under a set of conditions, and creating a regional performance map of a RFC system. The performance metric is used to compute the RFC performance of a non-Bayesian evaporation duct estimator. A Bayesian estimator that incorporates meteorological statistics in the inversion is introduced and compared to the non-Bayesian estimator. The performance metric is used to determine the optimal radar parameters of the evaporation duct estimator for six scenarios. An evaporation duct inversion performance map for a S band radar is created for the larger Mediterranean/Arabian Sea region.

  8. Mir Cooperative Solar Array Flight Performance Data and Computational Analysis

    NASA Technical Reports Server (NTRS)

    Kerslake, Thomas W.; Hoffman, David J.

    1997-01-01

    The Mir Cooperative Solar Array (MCSA) was developed jointly by the United States (US) and Russia to provide approximately 6 kW of photovoltaic power to the Russian space station Mir. The MCSA was launched to Mir in November 1995 and installed on the Kvant-1 module in May 1996. Since the MCSA photovoltaic panel modules (PPMs) are nearly identical to those of the International Space Station (ISS) photovoltaic arrays, MCSA operation offered an opportunity to gather multi-year performance data on this technology prior to its implementation on ISS. Two specially designed test sequences were executed in June and December 1996 to measure MCSA performance. Each test period encompassed 3 orbital revolutions whereby the current produced by the MCSA channels was measured. The temperature of MCSA PPMs was also measured. To better interpret the MCSA flight data, a dedicated FORTRAN computer code was developed to predict the detailed thermal-electrical performance of the MCSA. Flight data compared very favorably with computational performance predictions. This indicated that the MCSA electrical performance was fully meeting pre-flight expectations. There were no measurable indications of unexpected or precipitous MCSA performance degradation due to contamination or other causes after 7 months of operation on orbit. Power delivered to the Mir bus was lower than desired as a consequence of the retrofitted power distribution cabling. The strong correlation of experimental and computational results further bolsters the confidence level of performance codes used in critical ISS electric power forecasting. In this paper, MCSA flight performance tests are described as well as the computational modeling behind the performance predictions.

  9. Mir Cooperative Solar Array flight performance data and computational analysis

    SciTech Connect

    Kerslake, T.W.; Hoffman, D.J.

    1997-12-31

    The Mir Cooperative Solar Array (MCSA) was developed jointly by the United States (US) and Russia to provide approximately 6 kW of photovoltaic power to the Russian space station Mir. The MCSA was launched to Mir in November 1995 and installed on the Kvant-1 module in May 1996. Since the MCSA photovoltaic panel modules (PPMs) are nearly identical to those of the International Space Station (ISS) photovoltaic arrays, MCSA operation offered an opportunity to gather multi-year performance data on this technology prior to its implementation on ISS. Two specially designed test sequences were executed in June and December 1996 to measure MCSA performance. Each test period encompassed 3 orbital revolutions whereby the current produced by the MCSA channels was measured. The temperature of MCSA PPMs was also measured. To better interpret the MCSA flight data, a dedicated FORTRAN computer code was developed to predict the detailed thermal-electrical performance of the MCSA. Flight data compared very favorably with computational performance predictions. This indicated that the MCSA electrical performance was fully meeting pre-flight expectations. There were no measurable indications of unexpected or precipitous MCSA performance degradation due to contamination or other causes after 7 months of operation on orbit. Power delivered to the Mir bus was lower than desired as a consequence of the retrofitted power distribution cabling. The strong correlation of experimental and computational results further bolsters the confidence level of performance codes used in critical ISS electric power forecasting. In this paper, MCSA flight performance tests are described as well as the computational modeling behind the performance predictions.

  10. Performance Analysis of Satellite Clock Bias Based on Wavelet Analysis and Neural Network

    NASA Astrophysics Data System (ADS)

    Guo, C. J.; Teng, Y. L.

    2010-10-01

    In real-time GPS precise point positioning (PPP), reliable real-time prediction of satellite clock bias (SCB) is one key to achieving high accuracy. Satellite-borne GPS atomic clocks operate at high frequency and are very sensitive, being easily influenced by the outside environment and by their own internal factors, so it is difficult to characterize their complicated and detailed pattern of change. Given these characteristics, a novel four-stage method for SCB prediction based on wavelet analysis and neural networks is proposed. The basic ideas, prediction models and steps of clock bias prediction based on wavelet analysis and a radial basis function (RBF) network are discussed. The model adopts a sliding window to partition the data and uses a neural network to predict the coefficients of the clock bias sequence at each layer after wavelet analysis and denoising. As a result, the intricate variation pattern of the clock bias sequence is captured more accurately and the sequence is better approximated. Compared with the grey system model and a plain neural network model, a careful precision analysis of SCB prediction is made to verify the feasibility and validity of the proposed method using the performance parameters of GPS satellite clocks. The simulation results show that the prediction precision of the novel model is much better, so it can support SCB prediction with relatively high precision for real-time GPS PPP.
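The sliding-window prediction idea can be illustrated with a minimal Gaussian RBF regressor on a synthetic smooth series; this is a generic sketch with hypothetical data and window size, not the paper's wavelet-layer model:

```python
import numpy as np

def rbf_fit(X, y, sigma):
    """Gaussian RBF network with kernels centered on the training inputs;
    weights obtained by a least-squares solve of the Gram system."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    G = np.exp(-d2 / (2.0 * sigma ** 2))
    w, *_ = np.linalg.lstsq(G, y, rcond=None)
    return X, w

def rbf_predict(model, Xq, sigma):
    centers, w = model
    d2 = ((Xq[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2)) @ w

# Hypothetical smooth stand-in for a denoised clock-bias sequence (seconds)
t = np.arange(60, dtype=float)
series = 1e-3 * t + 5e-3 * np.sin(t / 6.0)

# Sliding window: learn to map the `win` previous samples to the next one
win = 5
X = np.array([series[i:i + win] for i in range(len(series) - win)])
y = series[win:]
model = rbf_fit(X[:-1], y[:-1], sigma=0.02)
pred = rbf_predict(model, X[-1:], sigma=0.02)
print(abs(pred[0] - y[-1]))  # one-step-ahead prediction error
```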

  11. Performance analysis of bearing-only target location algorithms

    NASA Astrophysics Data System (ADS)

    Gavish, Motti; Weiss, Anthony J.

    1992-07-01

    The performance of two well-known bearing-only location techniques, the maximum likelihood (ML) and the Stansfield estimators, is examined. Analytical expressions are obtained for the bias and the covariance matrix of the estimation error, which permit performance comparison for any case of interest. It is shown that the Stansfield algorithm provides biased estimates even for large numbers of measurements, in contrast with the ML method. The rms error of the Stansfield technique is not necessarily larger than that of the ML technique. However, it is shown that the ML technique is superior to the Stansfield method when the number of measurements is large enough. Simulation results verify the predicted theoretical performance.
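The ML estimator discussed here amounts to a nonlinear least-squares fit of target position to the measured bearings; a minimal Gauss-Newton sketch with a hypothetical four-sensor geometry (not the paper's scenarios):

```python
import numpy as np

def ml_bearing_fix(sensors, bearings, x0, iters=20):
    """Maximum-likelihood (nonlinear least-squares) emitter fix from noisy
    bearings, via Gauss-Newton iterations."""
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        dx = x[0] - sensors[:, 0]
        dy = x[1] - sensors[:, 1]
        r2 = dx ** 2 + dy ** 2
        resid = np.arctan2(dy, dx) - bearings
        resid = np.arctan2(np.sin(resid), np.cos(resid))  # wrap to (-pi, pi]
        # Jacobian of the bearing w.r.t. target position
        J = np.column_stack([-dy / r2, dx / r2])
        x -= np.linalg.lstsq(J, resid, rcond=None)[0]
    return x

# Hypothetical scenario: four sensors, target at (5, 7), mildly noisy bearings
rng = np.random.default_rng(1)
sensors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
target = np.array([5.0, 7.0])
bearings = np.arctan2(target[1] - sensors[:, 1], target[0] - sensors[:, 0])
bearings += rng.normal(scale=0.01, size=4)
est = ml_bearing_fix(sensors, bearings, x0=[4.0, 4.0])
print(est)  # close to (5, 7)
```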

  12. Assessing BMP Performance Using Microtox Toxicity Analysis - Rhode Island

    EPA Science Inventory

    Best Management Practices (BMPs) have been shown to be effective in reducing runoff and pollutants from urban areas and thus provide a mechanism to improve downstream water quality. Currently, BMP performance regarding water quality improvement is assessed through measuring each...

  13. Preliminary basic performance analysis of the Cedar multiprocessor memory system

    NASA Technical Reports Server (NTRS)

    Gallivan, K.; Jalby, W.; Turner, S.; Veidenbaum, A.; Wijshoff, H.

    1991-01-01

    Some preliminary basic results on the performance of the Cedar multiprocessor memory system are presented. Empirical results are presented and used to calibrate a memory system simulator which is then used to discuss the scalability of the system.

  14. Federated queries for comparative effectiveness research: performance analysis.

    PubMed

    Price, Ronald C; Huth, Derick; Smith, Jody; Harper, Steve; Pace, Wilson; Pulver, Gerald; Kahn, Michael G; Schilling, Lisa M; Facelli, Julio C

    2012-01-01

    This paper presents a study of the performance of federated queries implemented in a system that simulates the architecture proposed for the Scalable Architecture for Federated Translational Inquiries Network (SAFTINet). Performance tests were conducted using both physical hardware and virtual machines within the test laboratory of the Center for High Performance Computing at the University of Utah. Tests were performed on SAFTINet networks ranging from 4 to 32 nodes with databases containing synthetic data for several million patients. The results show that the caGrid FQE (Federated Query Engine) is capable and suitable for comparative effectiveness research (CER) federated queries given its nearly linear scalability as partner nodes increase in number. The results presented here are also important for the specification of the hardware required to run a CER grid. PMID:22941983

  16. Performance analysis of ten brands of batteries for hearing aids

    PubMed Central

    Penteado, Silvio Pires; Bento, Ricardo Ferreira

    2013-01-01

    Introduction: Comparing the performance of hearing instrument batteries from various manufacturers can enable otologists, audiologists, and consumers to select the best products, maximizing the use of these materials. Aim: To analyze the performance of ten brands of batteries for hearing aids available in the Brazilian marketplace. Methods: Hearing aid batteries in four sizes were acquired from ten manufacturers and subjected to the same test conditions in an acoustic laboratory. Results: The laboratory results, contrasted with the values reported by the manufacturers, revealed significant discrepancies; certain brands in certain sizes performed better on some tests, but no single brand was best across all sizes. Conclusions: It was possible to investigate the performance of ten brands of hearing aid batteries and to describe the procedures to be followed for leakage, accidental intake, and disposal. PMID:25992026

  17. Analysis of complex network performance and heuristic node removal strategies

    NASA Astrophysics Data System (ADS)

    Jahanpour, Ehsan; Chen, Xin

    2013-12-01

    Removing important nodes from complex networks is a great challenge in fighting against criminal organizations and preventing disease outbreaks. Six network performance metrics, including four new metrics, are applied to quantify networks' diffusion speed, diffusion scale, homogeneity, and diameter. In order to efficiently identify nodes whose removal maximally destroys a network, i.e., minimizes network performance, ten structured heuristic node removal strategies are designed using different node centrality metrics including degree, betweenness, reciprocal closeness, complement-derived closeness, and eigenvector centrality. These strategies are applied to remove nodes from the September 11, 2001 hijackers' network, and their performance is compared to that of a random strategy, which removes randomly selected nodes, and the locally optimal solution (LOS), which removes nodes to minimize network performance at each step. The computational complexity of the 11 strategies and LOS is also analyzed. Results show that the node removal strategies using degree and betweenness centralities are more efficient than other strategies.
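    The greedy degree-based removal strategy described in the abstract can be sketched in pure Python. The graph below is a small hypothetical example, not the hijacker network studied in the paper; damage is measured here simply as the size of the largest remaining connected component.

```python
from collections import defaultdict, deque

def largest_component(adj, removed):
    # BFS over the remaining nodes to find the largest connected component
    seen, best = set(), 0
    for start in adj:
        if start in removed or start in seen:
            continue
        comp, queue = 0, deque([start])
        seen.add(start)
        while queue:
            u = queue.popleft()
            comp += 1
            for v in adj[u]:
                if v not in removed and v not in seen:
                    seen.add(v)
                    queue.append(v)
        best = max(best, comp)
    return best

def degree_removal(adj, k):
    # Greedy heuristic: repeatedly delete the highest-degree remaining node
    removed, sizes = set(), []
    for _ in range(k):
        target = max((n for n in adj if n not in removed),
                     key=lambda n: sum(1 for v in adj[n] if v not in removed))
        removed.add(target)
        sizes.append(largest_component(adj, removed))
    return sizes

# Toy undirected network (hypothetical, for illustration only)
edges = [(0,1),(0,2),(0,3),(1,2),(2,3),(3,4),(4,5),(5,6),(5,7),(6,7),(4,8)]
adj = defaultdict(set)
for a, b in edges:
    adj[a].add(b)
    adj[b].add(a)

print(degree_removal(adj, 3))  # largest-component size after each removal
```

    The other centrality-based strategies from the paper (betweenness, closeness, eigenvector) would slot into the same loop by swapping the `key` function used to pick the next node.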

  18. Moisture and Structural Analysis for High Performance Hybrid Wall Assemblies

    SciTech Connect

    Grin, A.; Lstiburek, J.

    2012-09-01

    This report describes the work conducted by the Building Science Corporation (BSC) Building America Research Team's 'Energy Efficient Housing Research Partnerships' project. Based on past experience in the Building America program, the team has found that combinations of materials and approaches, in other words systems, usually provide optimum performance. No single manufacturer typically provides all of the components for an assembly, or has the specific understanding of every individual component necessary for optimum performance.

  19. Using Weibull Distribution Analysis to Evaluate ALARA Performance

    SciTech Connect

    E. L. Frome, J. P. Watkins, and D. A. Hagemeyer

    2009-10-01

    As Low as Reasonably Achievable (ALARA) is the underlying principle for protecting nuclear workers from potential health outcomes related to occupational radiation exposure. Radiation protection performance is currently evaluated by measures such as collective dose and average measurable dose, which do not indicate ALARA performance. The purpose of this work is to show how statistical modeling of individual doses using the Weibull distribution can provide objective supplemental performance indicators for comparing ALARA implementation among sites and for insights into ALARA practices within a site. Maximum likelihood methods were employed to estimate the Weibull shape and scale parameters used for performance indicators. The shape parameter reflects the effectiveness of maximizing the number of workers receiving lower doses and is represented as the slope of the fitted line on a Weibull probability plot. Additional performance indicators derived from the model parameters include the 99th percentile and the exceedance fraction. When grouping sites by collective total effective dose equivalent (TEDE) and ranking by 99th percentile with confidence intervals, differences in performance among sites can be readily identified. Applying this methodology will enable more efficient and complete evaluation of the effectiveness of ALARA implementation.
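    The Weibull-based indicators described above can be sketched as follows. The dose values are synthetic stand-ins (evenly spaced quantiles of an assumed Weibull distribution), not real dosimetry data, and the bisection on the profile equation is a minimal illustration of the maximum likelihood fit for the shape and scale parameters.

```python
import math

def weibull_mle(doses, iters=80):
    """Fit Weibull shape/scale by maximum likelihood: bisection on the
    profile score equation for the shape parameter k."""
    n = len(doses)
    mean_log = sum(math.log(x) for x in doses) / n

    def g(k):
        s = sum(x**k for x in doses)
        sl = sum((x**k) * math.log(x) for x in doses)
        return sl / s - 1.0 / k - mean_log

    lo, hi = 0.05, 20.0   # g(lo) < 0 < g(hi) for non-degenerate data
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if g(mid) < 0:
            lo = mid
        else:
            hi = mid
    shape = 0.5 * (lo + hi)
    scale = (sum(x**shape for x in doses) / n) ** (1.0 / shape)
    return shape, scale

def percentile_99(shape, scale):
    # Q(p) = scale * (-ln(1 - p))**(1/shape), evaluated at p = 0.99
    return scale * math.log(100.0) ** (1.0 / shape)

def exceedance_fraction(shape, scale, threshold):
    # Weibull survival function: fraction of workers above `threshold`
    return math.exp(-((threshold / scale) ** shape))

# Hypothetical measurable doses (mSv): quantiles of Weibull(k=1.5, lam=2.0)
k0, lam0, n = 1.5, 2.0, 200
doses = [lam0 * (-math.log(1 - (i + 0.5) / n)) ** (1 / k0) for i in range(n)]

shape, scale = weibull_mle(doses)
print(f"shape={shape:.3f} scale={scale:.3f} "
      f"D99={percentile_99(shape, scale):.2f} mSv "
      f"EF(5 mSv)={exceedance_fraction(shape, scale, 5.0):.4f}")
```

    The fitted shape parameter plays the role of the slope on a Weibull probability plot, and the 99th percentile and exceedance fraction are the derived indicators the abstract mentions for ranking sites.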

  20. High-performance liquid chromatographic analysis of ampicillin.

    PubMed

    Tsuji, K; Robertson, J H

    1975-09-01

    A high-pressure liquid chromatographic method for the analysis of ampicillin is described. The method uses a 1-m long stainless steel column packed with anionic exchange resin, with a mobile phase of 0.02 M NaNO3 in 0.01 M pH 9.15 borate buffer at a flow rate of 0.45 ml/min. The degradation products of ampicillin, penicillenic and penicilloic acids of ampicillin, can be separated and quantitated in less than 12 min of chromatographic time. The relative standard deviation for the analysis of ampicillin is less than 1%, and the method is sensitive to approximately 20 ng of ampicillin/sample injected. The method was applied to the analysis of various pharmaceutical preparations of ampicillin. It is also applicable, with a slight modification, for the analysis of penicillins G and V. PMID:1185575

  1. Application of uncertainty analysis to cooling tower thermal performance tests

    SciTech Connect

    Yost, J.G.; Wheeler, D.E.

    1986-01-01

    The purpose of this paper is to provide an overview of uncertainty analyses. The following topics are addressed: 1. A review and summary of the basic constituents of an uncertainty analysis with definitions and discussion of basic terms; 2. A discussion of the benefits and uses of uncertainty analysis; and 3. Example uncertainty analyses with emphasis on the problems, limitations, and site-specific complications.

  2. Performance analysis of the ascent propulsion system of the Apollo spacecraft

    NASA Technical Reports Server (NTRS)

    Hooper, J. C., III

    1973-01-01

    Activities involved in the performance analysis of the Apollo lunar module ascent propulsion system are discussed. A description of the ascent propulsion system, including hardware, instrumentation, and system characteristics, is included. The methods used to predict the inflight performance and to establish performance uncertainties of the ascent propulsion system are discussed. The techniques of processing the telemetered flight data and performing postflight performance reconstruction to determine actual inflight performance are discussed. Problems that have been encountered and results from the analysis of the ascent propulsion system performance during the Apollo 9, 10, and 11 missions are presented.

  3. Trajectory analysis and performance for SEP Comet Encke missions

    NASA Technical Reports Server (NTRS)

    Sauer, C. G., Jr.

    1973-01-01

    A summary of the performance of Solar Electric Propulsion spacecraft for Comet Encke missions for the 1980, 1984 and 1987 mission opportunities is presented together with a description of the spacecraft trajectory for each opportunity. Included are data for rendezvous trajectories for all three opportunities and data for a slow flyby mission during the 1980 opportunity. A range of propulsion system input powers of 10 to 20 kW is considered together with a constant spacecraft power requirement of 400 watts. The performance presented in this paper is indicative of that obtained using the 30-cm mercury electron bombardment thrusters currently being developed. Performance is given in terms of final spacecraft mass and is thus independent of any particular spacecraft design concept.

  4. Hydrogen engine performance analysis project. Second annual report

    SciTech Connect

    Adt, Jr., R. R.; Swain, M. R.; Pappas, J. M.

    1980-01-01

    Progress in a 3 year research program to evaluate the performance and emission characteristics of hydrogen-fueled internal combustion engines is reported. Fifteen hydrogen engine configurations will be subjected to performance and emissions characterization tests. During the first two years, baseline data for throttled and unthrottled, carburetted and timed hydrogen induction, PreIVC hydrogen-fueled engine configurations, with and without exhaust gas recirculation (EGR) and water injection, were obtained. These data, along with descriptions of the test engine and its components, the test apparatus, experimental techniques, experiments performed and the results obtained, are given. Analyses of other hydrogen-engine project data are also presented and compared with the results of the present effort. The unthrottled engine vis-a-vis the throttled engine is found, in general, to exhibit higher brake thermal efficiency. The unthrottled engine also yields lower NOx emissions, which were found to be a strong function of fuel-air equivalence ratio. (LCL)

  5. An Analysis of Performance Enhancement Techniques for Overset Grid Applications

    NASA Technical Reports Server (NTRS)

    Djomehri, J. J.; Biswas, R.; Potsdam, M.; Strawn, R. C.; Biegel, Bryan (Technical Monitor)

    2002-01-01

    The overset grid methodology has significantly reduced time-to-solution of high-fidelity computational fluid dynamics (CFD) simulations about complex aerospace configurations. The solution process resolves the geometrical complexity of the problem domain by using separately generated but overlapping structured discretization grids that periodically exchange information through interpolation. However, high performance computations of such large-scale realistic applications must be handled efficiently on state-of-the-art parallel supercomputers. This paper analyzes the effects of various performance enhancement techniques on the parallel efficiency of an overset grid Navier-Stokes CFD application running on an SGI Origin2000 machine. Specifically, the role of asynchronous communication, grid splitting, and grid grouping strategies are presented and discussed. Results indicate that performance depends critically on the level of latency hiding and the quality of load balancing across the processors.

  6. Off-design performance analysis of MHD generator channels

    NASA Technical Reports Server (NTRS)

    Wilson, D. R.; Williams, T. S.

    1980-01-01

    A computer code for performing parametric design point calculations, and evaluating the off-design performance of MHD generators has been developed. The program is capable of analyzing Faraday, Hall, and DCW channels, including the effect of electrical shorting in the gas boundary layers and coal slag layers. Direct integration of the electrode voltage drops is included. The program can be run in either the design or off-design mode. Details of the computer code, together with results of a study of the design and off-design performance of the proposed ETF MHD generator are presented. Design point variations of pre-heat and stoichiometry were analyzed. The off-design study included variations in mass flow rate and oxygen enrichment.

  7. Full-Envelope Launch Abort System Performance Analysis Methodology

    NASA Technical Reports Server (NTRS)

    Aubuchon, Vanessa V.

    2014-01-01

    The implementation of a new dispersion methodology is described, which disperses abort initiation altitude or time along with all other Launch Abort System (LAS) parameters during Monte Carlo simulations. In contrast, the standard methodology assumes that an abort initiation condition is held constant (e.g., aborts initiated at altitude for Mach 1, altitude for maximum dynamic pressure, etc.) while dispersing other LAS parameters. The standard method results in large gaps in performance information due to the discrete nature of initiation conditions, while the full-envelope dispersion method provides a significantly more comprehensive assessment of LAS abort performance for the full launch vehicle ascent flight envelope and identifies performance "pinch-points" that may occur at flight conditions outside of those contained in the discrete set. The new method has significantly increased the fidelity of LAS abort simulations and confidence in the results.
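    The contrast between discrete abort-initiation conditions and a fully dispersed initiation time can be illustrated with a toy Monte Carlo. The performance metric and the location of the "pinch point" below are invented for illustration; they are not the actual LAS flight conditions from the paper.

```python
import math
import random

def abort_performance(t):
    # Toy performance metric with a narrow "pinch point" near t = 37 s
    return 1.0 - 0.8 * math.exp(-((t - 37.0) / 2.0) ** 2)

# Standard method: a few fixed initiation conditions (hypothetical times)
discrete = [abort_performance(t) for t in (10, 30, 50, 70)]

# Full-envelope method: initiation time dispersed uniformly over ascent
random.seed(1)
dispersed = [abort_performance(random.uniform(0.0, 80.0)) for _ in range(2000)]

print(f"discrete min={min(discrete):.3f}  dispersed min={min(dispersed):.3f}")
```

    The discrete set straddles the pinch point and reports near-nominal performance everywhere, while the dispersed sampling lands inside the dip and exposes the degraded region, which is the gap-coverage argument the abstract makes.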

  8. Logit Model based Performance Analysis of an Optimization Algorithm

    NASA Astrophysics Data System (ADS)

    Hernández, J. A.; Ospina, J. D.; Villada, D.

    2011-09-01

    In this paper, the performance of the Multi Dynamics Algorithm for Global Optimization (MAGO) is studied through simulation using five standard test functions. To assess whether the algorithm converges to a global optimum, a set of experiments searching for the best combination of the only two MAGO parameters (number of iterations and number of potential solutions) is considered. These parameters are varied sequentially while the dimension of several test functions is increased, and performance curves are obtained. MAGO was originally designed to perform well with small populations; the self-adaptation task with small populations is therefore more challenging as the problem dimension grows. The results showed that the convergence probability to an optimal solution increases with the number of iterations and the number of potential solutions; however, the success rates slow down as the dimension of the problem escalates. A logit model is used to determine the mutual effects between the parameters of the algorithm.
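    A minimal logit (logistic regression) fit to convergence outcomes can be sketched in pure Python. The run data and the success rule below are hypothetical stand-ins for the MAGO experiments: success is generated from a linear rule in iterations, population size, and problem dimension, and the model is fitted by plain gradient descent.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logit(X, y, lr=0.5, epochs=5000):
    """Plain batch gradient-descent logistic regression (no regularization)."""
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        gw, gb = [0.0] * d, 0.0
        for xi, yi in zip(X, y):
            err = sigmoid(b + sum(wj * xj for wj, xj in zip(w, xi))) - yi
            for j in range(d):
                gw[j] += err * xi[j]
            gb += err
        w = [wj - lr * gj / n for wj, gj in zip(w, gw)]
        b -= lr * gb / n
    return w, b

# Hypothetical convergence outcomes: success depends positively on the
# number of iterations and population size, negatively on dimension.
runs, labels = [], []
for it in (20, 60, 100, 140, 180):
    for pop in (5, 15, 25):
        for dim in (2, 6, 10):
            runs.append((it, pop, dim))
            labels.append(1 if 0.02 * it + 0.05 * pop - 0.6 * dim > 0 else 0)

# Standardize each feature so gradient descent behaves well
cols = list(zip(*runs))
means = [sum(c) / len(c) for c in cols]
stds = [(sum((v - m) ** 2 for v in c) / len(c)) ** 0.5
        for c, m in zip(cols, means)]
X = [[(v - m) / s for v, m, s in zip(row, means, stds)] for row in runs]

w, b = fit_logit(X, labels)
preds = [1 if sigmoid(b + sum(wj * xj for wj, xj in zip(w, xi))) > 0.5 else 0
         for xi in X]
acc = sum(p == t for p, t in zip(preds, labels)) / len(labels)
print(f"weights={['%.2f' % v for v in w]} accuracy={acc:.2f}")
```

    The signs of the fitted weights recover the mutual effects: more iterations and larger populations raise the convergence probability, while higher dimension lowers it, mirroring the trends the abstract reports.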

  9. Performance analysis of SHE-PWM using Fourier Series and Newton-Raphson analysis

    NASA Astrophysics Data System (ADS)

    Lada, M. Y.; Khiar, M. S. A.; Ghani, S. A.; Nawawi, M. R. M.; Nor, A. S. M.; Yuen, J. G. M.

    2015-05-01

    Inverter performance plays a vital role in an effective power system, and a major issue that degrades it is harmonic distortion, which contributes to power losses. A variety of control techniques have therefore been implemented for inverter switching, such as square wave, SHE-PWM, unipolar, and bipolar. A square-wave inverter produces a square-shaped output voltage and uses simple logic control and power switches. The unipolar and bipolar techniques use a comparator to compare a reference voltage waveform with a triangular waveform; the difference is that unipolar switching compares two reference signals with the triangular waveform, whereas bipolar switching compares a single reference signal with it. Selective Harmonic Elimination Pulse-Width Modulation (SHE-PWM) is another control technique for inverters. This research proposes SHE-PWM as a low-switching-frequency strategy that uses Fourier series and Newton-Raphson analysis to calculate the switching angles that eliminate harmonic distortion. The Fourier series is used to determine the amplitude of any odd harmonic in the output signal, whereas Newton-Raphson is used to solve the resulting equations for the switching angles. As a result, SHE-PWM can select the low-frequency harmonic components to be eliminated and reduce the harmonic distortion to which inverter performance is sensitive.
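    The Newton-Raphson solve for SHE-PWM switching angles can be sketched as follows. This assumes the common staircase/quarter-wave form b_n proportional to sum_k cos(n*alpha_k), with three angles setting the fundamental to a modulation index m and nulling the 5th and 7th harmonics; this is one standard convention for illustration, not necessarily the exact formulation used in the paper.

```python
import math

def solve(A, b):
    """Tiny Gaussian elimination with partial pivoting for the Newton step."""
    n = len(b)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def she_angles(m, harmonics=(5, 7), tol=1e-10, max_iter=50):
    """Newton-Raphson for switching angles: fundamental = m, listed
    harmonics forced to zero, using b_n ~ sum_k cos(n * alpha_k)."""
    M = len(harmonics) + 1        # number of switching angles
    alpha = [0.2, 0.5, 1.0][:M]   # initial guess in radians (quarter cycle)

    def F(a):
        out = [sum(math.cos(x) for x in a) - M * m]
        for n in harmonics:
            out.append(sum(math.cos(n * x) for x in a))
        return out

    for _ in range(max_iter):
        f = F(alpha)
        if max(abs(v) for v in f) < tol:
            break
        # Analytic Jacobian: d/d alpha_j of cos(n*alpha_j) = -n sin(n*alpha_j)
        J = [[-n * math.sin(n * alpha[j]) for j in range(M)]
             for n in [1] + list(harmonics)]
        delta = solve(J, [-v for v in f])
        alpha = [a + d for a, d in zip(alpha, delta)]
    return alpha

angles = she_angles(m=0.8)
print("switching angles (deg):", [round(math.degrees(a), 2) for a in angles])
```

    The residual vector plays the role of the Fourier-series amplitudes: once Newton-Raphson drives it to zero, the 5th and 7th harmonics vanish from the output waveform while the fundamental matches the commanded modulation index.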

  10. Social Cognitive Career Theory, Conscientiousness, and Work Performance: A Meta-Analytic Path Analysis

    ERIC Educational Resources Information Center

    Brown, Steven D.; Lent, Robert W.; Telander, Kyle; Tramayne, Selena

    2011-01-01

    We performed a meta-analytic path analysis of an abbreviated version of social cognitive career theory's (SCCT) model of work performance (Lent, Brown, & Hackett, 1994). The model we tested included the central cognitive predictors of performance (ability, self-efficacy, performance goals), with the exception of outcome expectations. Results…

  11. Analysis of thermal performance of penetrated multi-layer insulation

    NASA Technical Reports Server (NTRS)

    Foster, Winfred A., Jr.; Jenkins, Rhonald M.; Yoo, Chai H.; Barrett, William E.

    1988-01-01

    Results of research performed for the purpose of studying the sensitivity of multi-layer insulation blanket performance caused by penetrations through the blanket are presented. The work described in this paper presents the experimental data obtained from thermal vacuum tests of various penetration geometries similar to those present on the Hubble Space Telescope. The data obtained from these tests are presented in terms of sensitivity factors for required electrical power, referenced to a multi-layer blanket without a penetration. The results of these experiments indicate that a significant increase in electrical power is required to overcome the radiation heat losses in the vicinity of the penetrations.

  12. Performance analysis of Ethernet PON system accommodating 64 ONUs

    NASA Astrophysics Data System (ADS)

    Tanaka, Keiji; Ohara, Kazuho; Miyazaki, Noriyuki; Edagawa, Noboru

    2007-05-01

    We report the performance of an IEEE 802.3 standard compliant Ethernet passive optical network (EPON) system accommodating 64 optical network units (ONUs). After investigating the optical transmission performance, we successfully demonstrate that a high throughput of more than 900 Mbit/s can be achieved in a 64-ONU EPON system using multiple logical link identifiers per ONU within a range of 10 km. In addition, we confirm the feasibility of IP-based high-quality triple play services in the EPON system.

  13. A relative performance analysis of atmospheric Laser Doppler Velocimeter methods.

    NASA Technical Reports Server (NTRS)

    Farmer, W. M.; Hornkohl, J. O.; Brayton, D. B.

    1971-01-01

    Evaluation of the effectiveness of atmospheric applications of a Laser Doppler Velocimeter (LDV) at a wavelength of about 0.5 micrometer in conjunction with dual scatter LDV illuminating techniques, or at a wavelength of 10.6 micrometer with local oscillator LDV illuminating techniques. Equations and examples are given to provide a quantitative basis for LDV system selection and performance criteria in atmospheric research. The comparative study shows that specific ranges and conditions exist where performance of one of the methods is superior to that of the other. It is also pointed out that great care must be exercised in choosing system parameters that optimize a particular LDV designed for atmospheric applications.

  14. Performance issues for engineering analysis on MIMD parallel computers

    SciTech Connect

    Fang, H.E.; Vaughan, C.T.; Gardner, D.R.

    1994-08-01

    We discuss how engineering analysts can obtain greater computational resolution in a more timely manner from applications codes running on MIMD parallel computers. Both processor speed and memory capacity are important to achieving better performance than a serial vector supercomputer. To obtain good performance, a parallel applications code must be scalable. In addition, the aspect ratios of the subdomains in the decomposition of the simulation domain onto the parallel computer should be of order 1. We demonstrate these conclusions using simulations conducted with the PCTH shock wave physics code running on a Cray Y-MP, a 1024-node nCUBE 2, and an 1840-node Paragon.
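    The claim that subdomain aspect ratios of order 1 help performance can be checked with a quick sketch: for a fixed grid split across a fixed number of processors, the one-cell-wide halo (communication surface) per subdomain shrinks as the process grid approaches square. The grid and process counts below are illustrative, not the PCTH configurations from the paper.

```python
def halo_cells(nx, ny, px, py):
    """Per-subdomain communication surface (one-cell-wide halo) for an
    nx-by-ny grid decomposed onto a px-by-py process grid."""
    sx, sy = nx // px, ny // py   # subdomain dimensions
    return 2 * (sx + sy)          # cells along the subdomain boundary

nx = ny = 1024
procs = 64
for px, py in [(64, 1), (32, 2), (16, 4), (8, 8)]:
    assert px * py == procs       # same compute load in every case
    sx, sy = nx // px, ny // py
    print(f"{px:2d}x{py:<2d} subdomain {sx:4d}x{sy:<4d} "
          f"aspect {max(sx, sy) / min(sx, sy):5.1f} "
          f"halo {halo_cells(nx, ny, px, py)}")
```

    With identical per-processor work in every case, the 8x8 (aspect ratio 1) decomposition communicates a quarter as much per subdomain as the 64x1 strip decomposition, which is the geometric reason behind the aspect-ratio guideline.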

  15. Performance analysis of the Jersey City Total Energy Site

    NASA Astrophysics Data System (ADS)

    Hurley, C. W.; Ryan, J. D.; Phillips, C. W.

    1982-08-01

    Engineering, economic, environmental, and reliability data from a 486-unit apartment/commercial complex were gathered. The complex was designed to recover waste heat from diesel engines, making the central equipment building a total energy (TE) plant. Analysis of the data indicates that a significant savings in fuel is possible through minor modifications in plant procedures. Also included are an analysis of the quality of utility services supplied to the consumers on the site and an analysis of a series of environmental tests made on the effects of the plant on air quality and noise. In general, although the systems utilizing the TE concept showed a significant savings in fuel, such systems do not represent attractive investments compared to conventional systems.

  16. Leadership Styles and Organizational Performance: A Predictive Analysis

    ERIC Educational Resources Information Center

    Kieu, Hung Q.

    2010-01-01

    Leadership is critically important because it affects the health of the organization. Research has found that leadership is one of the most significant contributors to organizational performance. Expanding and replicating previous research, and focusing on the specific telecommunications sector, this study used multiple correlation and regression…

  17. How Motivation Affects Academic Performance: A Structural Equation Modelling Analysis

    ERIC Educational Resources Information Center

    Kusurkar, R. A.; Ten Cate, Th. J.; Vos, C. M. P.; Westers, P.; Croiset, G.

    2013-01-01

    Few studies in medical education have studied effect of quality of motivation on performance. Self-Determination Theory based on quality of motivation differentiates between Autonomous Motivation (AM) that originates within an individual and Controlled Motivation (CM) that originates from external sources. To determine whether Relative Autonomous…

  18. Academic Performance and School Integration: A Multi-Ethnic Analysis.

    ERIC Educational Resources Information Center

    Maynor, Waltz

    Determining whether statistically significant differences occur in the measured achievement of a group of 608 white pupils, 127 Lumbee Indian pupils, and 680 black pupils--from a newly racially integrated North Carolina school system--this study analyzed academic performance with respect to each student ethnic group, each teacher ethnic group, and…

  19. Relative Performance of Academic Departments Using DEA with Sensitivity Analysis

    ERIC Educational Resources Information Center

    Tyagi, Preeti; Yadav, Shiv Prasad; Singh, S. P.

    2009-01-01

    The process of liberalization and globalization of Indian economy has brought new opportunities and challenges in all areas of human endeavor including education. Educational institutions have to adopt new strategies to make best use of the opportunities and counter the challenges. One of these challenges is how to assess the performance of…

  20. Performance analysis of exam gloves used for aseptic rodent surgery.

    PubMed

    LeMoine, Dana M; Bergdall, Valerie K; Freed, Carrie

    2015-05-01

    Aseptic technique includes the use of sterile surgical gloves for survival surgeries in rodents to minimize the incidence of infections. Exam gloves are much less expensive than are surgical gloves and may represent a cost-effective, readily available option for use in rodent surgery. This study examined the effectiveness of surface disinfection of exam gloves with 70% isopropyl alcohol or a solution of hydrogen peroxide and peracetic acid (HP-PA) in reducing bacterial contamination. Performance levels for asepsis were met when gloves were negative for bacterial contamination after surface disinfection and sham 'exertion' activity. According to these criteria, 94% of HP-PA-disinfected gloves passed, compared with 47% of alcohol-disinfected gloves. In addition, the effect of autoclaving on the integrity of exam gloves was examined, given that autoclaving is another readily available option for aseptic preparation. Performance criteria for glove integrity after autoclaving consisted of the ability to don the gloves followed by successful simulation of wound closure and completion of stretch tests without tearing or observable defects. Using these criteria, 98% of autoclaved nitrile exam gloves and 76% of autoclaved latex exam gloves met performance expectations compared with the performance of standard surgical gloves (88% nitrile, 100% latex). The results of this study support the use of HP-PA-disinfected latex and nitrile exam gloves or autoclaved nitrile exam gloves as viable cost-effective alternatives to sterile surgical gloves for rodent surgeries. PMID:26045458

  2. State Aid and Student Performance: A Supply-Demand Analysis

    ERIC Educational Resources Information Center

    Kinnucan, Henry W.; Zheng, Yuqing; Brehmer, Gerald

    2006-01-01

    Using a supply-demand framework, a six-equation model is specified to generate hypotheses about the relationship between state aid and student performance. Theory predicts that an increase in state or federal aid provides an incentive to decrease local funding, but that the disincentive associated with increased state aid is moderated when federal…

  3. Meta-Analysis of Predictors of Dental School Performance

    ERIC Educational Resources Information Center

    DeCastro, Jeanette E.

    2012-01-01

    Accurate prediction of which candidates show the most promise of success in dental school is imperative for the candidates, the profession, and the public. Several studies suggested that predental GPAs and the Dental Admissions Test (DAT) produce a range of correlations with dental school performance measures. While there have been similarities,…

  4. Analysis of Factors that Predict Clinical Performance in Medical School

    ERIC Educational Resources Information Center

    White, Casey B.; Dey, Eric L.; Fantone, Joseph C.

    2009-01-01

    Academic achievement indices including GPAs and MCAT scores are used to predict the spectrum of medical student academic performance types. However, use of these measures ignores two changes influencing medical school admissions: student diversity and affirmative action, and an increased focus on communication skills. To determine if GPA and MCAT…

  5. Performance Achievement and Analysis of Teaching during Choral Rehearsals.

    ERIC Educational Resources Information Center

    Davis, Anita P.

    1998-01-01

    Evaluates teaching sequences in high-school choral rehearsals to provide insight into the relationship between ensemble achievement and performance preparation. Indicates that teacher pace improvement is unrelated to ensemble maturity, teacher verbalization may not relate to success, and teacher assistance and instructions decrease with student…

  6. Performance Analysis of Optical Code Division Multiplex System

    NASA Astrophysics Data System (ADS)

    Kaur, Sandeep; Bhatia, Kamaljit Singh

    2013-12-01

    This paper presents a pseudo-orthogonal code generator for an Optical Code Division Multiple Access (OCDMA) system, which helps reduce the need for bandwidth expansion and improves spectral efficiency. We investigate the performance of a multi-user OCDMA system in achieving data rates of more than 1 Tbit/s.

  7. Performance analysis of vortex based mixers for confined flows

    NASA Astrophysics Data System (ADS)

    Buschhagen, Timo

    The hybrid rocket is still sparsely employed within major space or defense projects due to its relatively poor combustion efficiency and low fuel grain regression rate. Although hybrid rockets can claim advantages in safety, environmental, and performance aspects against established solid and liquid propellant systems, the boundary layer combustion process and the diffusion-based mixing within a hybrid rocket grain port leave the core flow unmixed and limit the system performance. One principle used to enhance the mixing of gaseous flows is to induce streamwise vorticity. The counter-rotating vortex pair (CVP) mixer utilizes this principle and introduces two vortices into a confined flow, generating a stirring motion in order to transport near-wall media towards the core and vice versa. Recent studies investigated the velocity field introduced by this type of swirler. The current work evaluates the mixing performance of the CVP concept, using an experimental setup to simulate an axial primary pipe flow with a radially entering secondary flow. Hereby the primary flow is altered by the CVP swirler unit. The resulting setup therefore emulates a hybrid rocket motor with a cylindrical single-port grain. In order to evaluate the mixing performance, the secondary flow concentration at the pipe assembly exit is measured, utilizing a pressure-sensitive-paint-based procedure.

  8. Hydrogen engine performance analysis project. First quarterly report, March 1980

    SciTech Connect

    Adt, Jr, R R; Swain, M R; Pappas, J M

    1980-01-01

    Progress in a program aimed at obtaining operational and performance data on a prototype pre-intake-valve-closing fuel ingestion (PreIVC) hydrogen-fueled automotive engine is reported. Information is included on the construction and testing of an unthrottled hydrogen delivery system and on flashback during starting. It was determined that the flashback was caused by runaway surface ignition. (LCL)

  9. Performance Factors Analysis -- A New Alternative to Knowledge Tracing

    ERIC Educational Resources Information Center

    Pavlik, Philip I., Jr.; Cen, Hao; Koedinger, Kenneth R.

    2009-01-01

    Knowledge tracing (KT)[1] has been used in various forms for adaptive computerized instruction for more than 40 years. However, despite its long history of application, it is difficult to use in domain model search procedures, has not been used to capture learning where multiple skills are needed to perform a single action, and has not been used…

  10. Elementary School Size and Student Performance: A Conceptual Analysis

    ERIC Educational Resources Information Center

    Zoda, Pamela; Combs, Julie P.; Slate, John R.

    2011-01-01

    In this article, we reviewed the empirical literature concerning the relationship between school size and student performance, with a focus on determining the extent to which school size, specifically elementary school size, was related to student academic achievement. Most of the extant literature was on secondary school size with fewer…

  11. Theoretical performance analysis for CMOS based high resolution detectors.

    PubMed

    Jain, Amit; Bednarek, Daniel R; Rudin, Stephen

    2013-03-01

    High resolution imaging capabilities are essential for accurately guiding successful endovascular interventional procedures. Present x-ray imaging detectors are not always adequate due to their inherent limitations. The newly-developed high-resolution micro-angiographic fluoroscope (MAF-CCD) detector has demonstrated excellent clinical image quality; however, further improvement in performance and physical design may be possible using CMOS sensors. We have thus calculated the theoretical performance of two proposed CMOS detectors which may be used as a successor to the MAF. The proposed detectors have a 300 μm thick HL-type CsI phosphor, a 50 μm pixel CMOS sensor with and without a variable gain light image intensifier (LII), and are designated MAF-CMOS-LII and MAF-CMOS, respectively. For the performance evaluation, linear cascade modeling was used. The detector imaging chains were divided into individual stages characterized by one of the basic processes (quantum gain, binomial selection, stochastic and deterministic blurring, additive noise). Ranges of readout noise and exposure were used to calculate the detectors' MTF and DQE. The MAF-CMOS showed slightly better MTF than the MAF-CMOS-LII, but the MAF-CMOS-LII showed far better DQE, especially for lower exposures. The proposed detectors can have improved MTF and DQE compared with the present high resolution MAF detector. The performance of the MAF-CMOS is excellent for the angiography exposure range; however, it is limited at fluoroscopic levels due to additive instrumentation noise. The MAF-CMOS-LII, having the advantage of the variable LII gain, can overcome the noise limitation and hence may perform exceptionally for the full range of required exposures; however, it is more complex and hence more expensive. PMID:24353390
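
    The linear cascade formalism mentioned above can be sketched for the zero-frequency case: each stage is a gain with a mean and a variance, and the signal mean and noise variance propagate stage by stage. The stage values below are illustrative assumptions (rounded absorption, optical gain, and coupling figures), not the paper's actual detector parameters.

```python
def propagate(stages, n0, add_noise_var=0.0):
    """Zero-frequency signal/noise propagation through a linear cascade.

    stages: list of (mean_gain, gain_variance) tuples, in order.
    n0: mean number of input x-ray quanta (Poisson, so variance = n0).
    """
    mean, var = n0, float(n0)
    for g, vg in stages:
        var = g * g * var + vg * mean   # gain-variance (Burgess) theorem
        mean = g * mean
    var += add_noise_var                # additive instrumentation noise
    snr_in_sq = n0                      # Poisson input: SNR^2 = n0
    snr_out_sq = mean * mean / var
    return snr_out_sq / snr_in_sq       # DQE(0)

# hypothetical stage values (not from the paper): CsI absorption as a
# binomial selection with p = 0.8, then a Poisson optical gain of 60,
# then optical coupling as a binomial selection with p = 0.5
stages = [(0.8, 0.8 * 0.2), (60.0, 60.0), (0.5, 0.5 * 0.5)]
dqe0 = propagate(stages, n0=1000.0)
```

Adding a nonzero `add_noise_var` reproduces the qualitative conclusion above: additive readout noise degrades DQE most at low exposures (small `n0`).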

  12. Cost and performance analysis of conceptual designs of physical protection systems

    SciTech Connect

    Hicks, M.J.; Snell, M.S.; Sandoval, J.S.; Potter, C.S.

    1998-06-01

    CPA -- Cost and Performance Analysis -- is a methodology that joins Activity Based Cost (ABC) estimation with performance-based analysis of physical protection systems. CPA offers system managers an approach that supports both tactical decision making and strategic planning. Current exploratory applications of the CPA methodology are addressing the analysis of alternative conceptual designs. To support these activities, the original CPA architecture is being expanded to incorporate results from a suite of performance and consequence analysis tools such as JTS (Joint Tactical Simulation), ERAD (Explosive Release Atmospheric Dispersion), and blast effect models. The process flow for applying CPA to the development and analysis of conceptual designs is illustrated graphically.

  13. An Analysis of Tasks Performed in the Ornamental Horticulture Industry.

    ERIC Educational Resources Information Center

    Berkey, Arthur L.; Drake, William E.

    This publication is the result of a detailed task analysis study of the ornamental horticulture industry in New York State. Nine types of horticulture businesses identified were: (1) retail florists, (2) farm and garden supply store, (3) landscape services, (4) greenhouse production, (5) nursery production, (6) turf production, (7) arborist…

  14. NYMEX crude oil futures market: An analysis of its performance

    SciTech Connect

    Chassard, C.; Halliwell, M.

    1986-01-01

    This book includes a description of the mechanics of crude oil futures trading and an analysis of the growth and structure of NYMEX deals. Futures markets are expected to fulfill two functions: price risk transfer and price discovery. The analysis shows that NYMEX is effective for hedging short-term price risks but of little help for price discovery.

  15. Performing Resource Usage Analysis for a NOTIS System.

    ERIC Educational Resources Information Center

    Hinnebusch, Mark

    1991-01-01

    Outlines methods that the Florida Center for Library Automation (FCLA) has developed to estimate transaction costs and overall demand for NOTIS services. Transaction resource usage analysis is discussed, record structures are explained, institution collection size is considered, and usage and response time by hour of day is described. (six…

  16. Applying Score Analysis to a Rehearsal Pedagogy of Expressive Performance

    ERIC Educational Resources Information Center

    Byo, James L.

    2014-01-01

    The discoveries of score analysis (e.g., minor seventh chord, ostinato, phrase elision, melodic fragment, half cadence) are more than just compositional techniques or music vocabulary. They are sounds--fascinating, storytelling, dynamic modes of expression--that when approached as such enrich the rehearsal experience. This article presents a…

  17. Rapid Automatized Naming and Reading Performance: A Meta-Analysis

    ERIC Educational Resources Information Center

    Araújo, Susana; Reis, Alexandra; Petersson, Karl Magnus; Faísca, Luís

    2015-01-01

    Evidence that rapid naming skill is associated with reading ability has become increasingly prevalent in recent years. However, there is considerable variation in the literature concerning the magnitude of this relationship. The objective of the present study was to provide a comprehensive analysis of the evidence on the relationship between rapid…

  18. Dopingless PNPN tunnel FET with improved performance: Design and analysis

    NASA Astrophysics Data System (ADS)

    Ram, Mamidala Saketh; Abdi, Dawit Burusie

    2015-06-01

    In this paper, we present a two-dimensional simulation study of a dopingless PNPN TFET with a hetero-gate dielectric. Using a dual-material gate in a dopingless TFET, the energy band gap on the source side is modulated to create an N+ source pocket. This technique obviates the need for ion implantation to form the N+ source pocket. The dopingless PNPN TFET with a hetero-gate dielectric is demonstrated to exhibit superior performance in terms of ON-state current and subthreshold swing when compared to a conventional dopingless TFET. Our results may pave the way for realizing high-performance dopingless TFETs with the low thermal budget required for low-power, low-cost applications.

  19. Resilient Plant Monitoring System: Design, Analysis, and Performance Evaluation

    SciTech Connect

    Humberto E. Garcia; Wen-Chiao Lin; Semyon M. Meerkov; Maruthi T. Ravichandran

    2013-12-01

    Resilient monitoring systems are sensor networks that degrade gracefully under malicious attacks on their sensors, which cause the sensors to project misleading information. The goal of this paper is to design, analyze, and evaluate the performance of a resilient monitoring system intended to monitor plant conditions (normal or anomalous). The architecture developed consists of four layers: data quality assessment, process variable assessment, plant condition assessment, and sensor network adaptation. Each of these layers is analyzed by either analytical or numerical tools, and the performance of the overall system is evaluated using simulations. The resiliency of the resulting system is measured using the Kullback-Leibler divergence and is shown to be sufficiently high in all scenarios considered.
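
    As a minimal illustration of such a divergence-based resiliency measure, the Kullback-Leibler divergence between two discrete probability assessments can be computed directly. The probability values below are hypothetical, not taken from the paper.

```python
from math import log

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits between two
    discrete probability assessments over plant conditions."""
    return sum(pi * log(pi / qi, 2) for pi, qi in zip(p, q) if pi > 0)

# hypothetical assessments over (normal, anomalous); not from the paper
truth    = [0.95, 0.05]   # actual plant condition probabilities
assessed = [0.90, 0.10]   # monitoring system's assessment under attack
penalty = kl_divergence(truth, assessed)
```

A divergence of zero means the assessment matches the truth exactly; the smaller the divergence stays under attack, the more resilient the monitoring system.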

  20. Performance limit analysis of a metallic fuel for Kalimer

    SciTech Connect

    Lee, Byoung Oon; Cheon, J.S.; Lee, C.B.

    2007-07-01

    A metallic fuel is being considered as the fuel for the SFR in Korea. Metal fuel development for the SFR in Korea started in 2007 in the areas of metal fuel fabrication, cladding materials, and fuel performance evaluation. The MACSIS code has been developed as a steady-state performance code for metallic fuel. The present study presents preliminary parametric results for evaluating the design limits of the metal fuel for the SFR in Korea. The operating limits were analyzed with the MACSIS code. Modules for the creep rupture strength of Mod.HT9 and of the barrier cladding were incorporated. The strain limits and the CDF limit were analyzed for HT9 and Mod.HT9. To apply the concept of a barrier cladding, the burnup limit of the barrier cladding was analyzed. (authors)

  1. Failure Analysis and Regeneration Performances Evaluation on Engine Lubricating Oil

    NASA Astrophysics Data System (ADS)

    Wang, X. L.; Zhang, G. N.; Zhang, J. Y.; Yin, Y. L.; Xu, Y.

    To investigate the failure and recycling behavior of lubricating oils, three typical 10W-40 lubricating oils used in heavy-load vehicles were selected: new oil, waste oil, and oil regenerated by a self-developed green regeneration technology. Their tribological properties were tested with a four-ball friction and wear tester. The results indicated that the extreme-pressure performance of the regenerated oil increased by 34.1% compared with the waste oil, and its load-carrying ability is close to that of the new oil; the wear-scar characteristics are better than those of the waste oil, and the friction coefficient almost reaches the level of the new oil. As a result, the anti-wear and friction-reducing performance is clearly improved.

  2. Performance Analysis of XCPC Powered Solar Cooling Demonstration Project

    NASA Astrophysics Data System (ADS)

    Widyolar, Bennett K.

    A solar thermal cooling system using novel non-tracking External Compound Parabolic Concentrators (XCPC) has been built at the University of California, Merced and operated for two cooling seasons. Its performance in providing power for space cooling has been analyzed. The system comprises 53.3 m2 of XCPC trough collectors that power a 23 kW double-effect lithium bromide (LiBr) absorption chiller; it is the first system to combine XCPC and absorption chilling technologies. Performance was measured in both sunny and cloudy conditions, with both clean and dirty collectors. The collectors proved well suited to providing thermal power for absorption cooling, and both the coincidence of available thermal power with cooling demand and the simplicity of XCPC collectors relative to other solar thermal collectors make them a highly attractive candidate for cooling projects.

  3. Control Design and Performance Analysis for Autonomous Formation Flight Experiments

    NASA Astrophysics Data System (ADS)

    Rice, Caleb Michael

    Autonomous formation flight is a key approach for reducing greenhouse gas emissions and managing traffic in future high-density airspace. Unmanned Aerial Vehicles (UAVs) make it possible to demonstrate and validate autonomous formation flight concepts inexpensively and eliminate the flight risk to human pilots. This thesis discusses the design, implementation, and flight testing of three formation flight control methods, Proportional-Integral-Derivative (PID), Fuzzy Logic (FL), and Nonlinear Dynamic Inversion (NLDI), and their respective performance. Experimental results show achievable autonomous formation flight and performance quality with a pair of low-cost unmanned fixed-wing research aircraft and also with a solo vertical takeoff and landing (VTOL) quadrotor.
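
    Of the three controllers, PID is the simplest to illustrate. The sketch below closes a station-keeping loop on leader-follower separation; the gains, limits, and the simple velocity-command plant model are illustrative assumptions, not the thesis's actual design.

```python
class PID:
    """Textbook PID loop (gains are illustrative, not from the thesis)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, error):
        self.integral += error * self.dt
        deriv = (error - self.prev_err) / self.dt
        self.prev_err = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

# drive the follower toward a 10 m separation from the leader, modeling
# the autopilot as a simple velocity-command plant
pid = PID(kp=0.8, ki=0.01, kd=0.1, dt=0.1)
sep = 25.0                              # initial separation, m
for _ in range(600):                    # 60 s of simulated flight
    cmd = pid.step(sep - 10.0)          # commanded closure rate, m/s
    cmd = max(-2.0, min(2.0, cmd))      # airspeed authority limit
    sep -= cmd * 0.1                    # separation shrinks as we close
```

FL and NLDI replace the `step` computation with rule-based inference and model-inverting control laws, respectively, while the outer station-keeping structure stays the same.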

  4. Transaction Performance vs. Moore's Law: A Trend Analysis

    NASA Astrophysics Data System (ADS)

    Nambiar, Raghunath; Poess, Meikel

    Intel co-founder Gordon E. Moore postulated in his famous 1965 paper that the number of components in integrated circuits had doubled every year from their invention in 1958 until 1965, and then predicted that the trend would continue for at least ten years. Later, David House, an Intel colleague, after factoring in the increase in transistor performance, concluded that integrated circuits would double in performance every 18 months. Despite this trend in microprocessor improvements, your favorite text editor still takes the same time to start and your PC takes pretty much the same time to reboot as it did 10 years ago. Can this observation be made of systems supporting the fundamental aspects of our information-based economy, namely transaction processing systems?
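
    House's 18-month doubling rule is simple compound growth, which a one-line sketch makes concrete:

```python
def house_doubling(months, period=18):
    """Performance multiple after `months`, doubling every `period`
    months (David House's 18-month rule of thumb)."""
    return 2 ** (months / period)

decade = house_doubling(120)   # 10 years: roughly a 100x multiple
```

The contrast the authors draw is exactly this: a two-orders-of-magnitude hardware multiple per decade that is not reflected in end-to-end transaction (or boot) times.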

  5. Performance Analysis and Portability of the PLUM Load Balancing System

    NASA Technical Reports Server (NTRS)

    Oliker, Leonid; Biswas, Rupak; Gabow, Harold N.

    1998-01-01

    The ability to dynamically adapt an unstructured mesh is a powerful tool for solving computational problems with evolving physical features; however, an efficient parallel implementation is rather difficult. To address this problem, we have developed PLUM, an automatic portable framework for performing adaptive numerical computations in a message-passing environment. PLUM requires that all data be globally redistributed after each mesh adaption to achieve load balance. We present an algorithm for minimizing this remapping overhead by guaranteeing an optimal processor reassignment. We also show that the data redistribution cost can be significantly reduced by applying our heuristic processor reassignment algorithm to the default mapping of the parallel partitioner. Portability is examined by comparing performance on an SP2, an Origin2000, and a T3E. Results show that PLUM can be successfully ported to different platforms without any code modifications.
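
    The remapping idea can be illustrated with a small sketch: given a similarity matrix counting how much of each new partition already resides on each processor, an assignment that keeps the largest resident blocks in place minimizes data movement. PLUM's own algorithm guarantees an optimal reassignment; the greedy version below is only an illustrative heuristic, and the matrix values are invented.

```python
def greedy_reassign(similarity):
    """Map new partitions to processors, greedily keeping the largest
    amounts of already-resident data in place."""
    n = len(similarity)
    # all (data_retained, processor, partition) triples, largest first
    triples = sorted(
        ((similarity[p][q], p, q) for p in range(n) for q in range(n)),
        reverse=True)
    used_p, used_q, mapping = set(), set(), {}
    for amount, p, q in triples:
        if p not in used_p and q not in used_q:
            mapping[q] = p               # new partition q stays on p
            used_p.add(p)
            used_q.add(q)
    return mapping

# toy similarity matrix: similarity[p][q] = cells of new partition q
# already resident on processor p (illustrative numbers)
sim = [[90, 10, 0],
       [20, 60, 30],
       [0, 40, 70]]
mapping = greedy_reassign(sim)
```

Here each diagonal entry dominates its row and column, so the greedy pass keeps every partition on its current processor and no bulk data moves.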

  6. Wavelet-based ultrasound image denoising: performance analysis and comparison.

    PubMed

    Rizi, F Yousefi; Noubari, H Ahmadi; Setarehdan, S K

    2011-01-01

    Ultrasound images are generally affected by multiplicative speckle noise, which is mainly due to the coherent nature of the scattering phenomenon. Speckle noise filtering is thus a critical pre-processing step in medical ultrasound imaging, provided that the diagnostic features of interest are not lost. A comparative study of the performance of alternative wavelet-based ultrasound image denoising methods is presented in this article. In particular, the contourlet and curvelet techniques, along with dual-tree complex wavelet, real, and double-density wavelet transform denoising methods, were applied to real ultrasound images and the results were quantitatively compared. The results show that the curvelet-based method performs better than the other methods and can effectively reduce most of the speckle noise content of a given image. PMID:22255196

  7. Moisture and Structural Analysis for High Performance Hybrid Wall Assemblies

    SciTech Connect

    Grin, A.; Lstiburek, J.

    2012-09-01

    Based on past experience in the Building America program, BSC has found that combinations of materials and approaches—in other words, systems—usually provide optimum performance. Integration is necessary, as described in this research project. The hybrid walls analyzed utilize a combination of exterior insulation, diagonal metal strapping, and spray polyurethane foam and leave room for cavity-fill insulation. These systems can provide effective thermal, air, moisture, and water barrier systems in one assembly and provide structure.

  8. Performance analysis of a potassium-base AMTEC cell

    SciTech Connect

    Huang, C.; Hendricks, T.J.; Hunt, T.K.

    1998-07-01

    Sodium-BASE Alkali-Metal-Thermal-to-Electric-Conversion (AMTEC) cells have been receiving increased attention and funding from the Department of Energy, NASA, and the United States Air Force. Recently, sodium-BASE (Na-BASE) AMTEC cells were selected for the Advanced Radioisotope Power System (ARPS) program for the next generation of deep-space missions and spacecraft. Potassium-BASE (K-BASE) AMTEC cells have not received as much attention to date, even though the vapor pressure of potassium is higher than that of sodium at the same temperature; K-BASE AMTEC cells with potentially higher open-circuit voltage and higher power output than Na-BASE AMTEC cells are therefore possible. Because the surface tension of potassium is about half that of sodium at the same temperature, the artery and evaporator design in a potassium AMTEC cell has much more challenging pore-size requirements than designs using sodium. This paper uses a flexible thermal/fluid/electrical model to predict the performance of a K-BASE AMTEC cell. Pore sizes in the artery of K-BASE AMTEC cells must be an order of magnitude smaller than in Na-BASE AMTEC cells. The performance of a K-BASE AMTEC cell was higher than that of a Na-BASE AMTEC cell at low voltages/high currents. K-BASE AMTEC cells also have the potential of much better electrode performance, creating another avenue for potentially better overall performance.
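
    The pore-size argument follows from the Young-Laplace capillary relation: the largest tolerable pore shrinks with the working fluid's surface tension and with the pressure rise the artery must support. The surface tensions below are rounded textbook-order values and the pressure rises are hypothetical, not the paper's design numbers.

```python
def max_pore_radius(surface_tension, pressure_rise):
    """Largest artery pore radius (m) that can still support a given
    capillary pressure rise (Pa): Young-Laplace, delta_p = 2*sigma/r."""
    return 2.0 * surface_tension / pressure_rise

# illustrative values only: liquid-metal surface tensions of roughly
# 0.16 N/m (Na) and 0.08 N/m (K) near operating temperature, against
# hypothetical required pressure rises
r_na = max_pore_radius(0.16, 20e3)    # sodium artery, 20 kPa
r_k = max_pore_radius(0.08, 100e3)    # potassium artery, 100 kPa
```

With potassium's halved surface tension and a stiffer pressure requirement, the admissible pore radius drops by an order of magnitude, consistent with the abstract's claim.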

  9. Performance of silicon immersed gratings: measurement, analysis, and modeling

    NASA Astrophysics Data System (ADS)

    Rodenhuis, Michiel; Tol, Paul J. J.; Coppens, Tonny H. M.; Laubert, Phillip P.; van Amerongen, Aaldert H.

    2015-09-01

    The use of Immersed Gratings offers advantages for both space- and ground-based spectrographs. As diffraction takes place inside the high-index medium, the optical path difference and angular dispersion are boosted proportionally, thereby allowing a smaller grating area and a smaller spectrometer size. Short-wave infrared (SWIR) spectroscopy is used in space-based monitoring of greenhouse and pollution gases in the Earth atmosphere. On the extremely large telescopes currently under development, mid-infrared high-resolution spectrographs will, among other things, be used to characterize exo-planet atmospheres. At infrared wavelengths, silicon is transparent. This means that production methods used in the semiconductor industry can be applied to the fabrication of immersed gratings. Using such methods, we have designed and built immersed gratings for both space- and ground-based instruments, examples being the TROPOMI instrument for the European Space Agency's Sentinel-5 Precursor mission, Sentinel-5 (ESA), and the METIS (Mid-infrared E-ELT Imager and Spectrograph) instrument for the European Extremely Large Telescope. Three key parameters govern the performance of such gratings: the efficiency, the level of scattered light, and the wavefront error induced. In this paper we describe how we optimize these parameters during the design and manufacturing phase, focusing on the tools and methods used to measure the performance actually realized, and present the results. The bread-board model (BBM) immersed grating developed for the SWIR-1 channel of Sentinel-5 is used to illustrate this process. Stringent requirements were specified for this grating for the three performance criteria. We show that, with some margin, the performance requirements have all been met.

  10. Performance analysis of an inexpensive Direct Imaging Transmission Ion Microscope

    NASA Astrophysics Data System (ADS)

    Barnes, Patrick; Pallone, Arthur

    2013-03-01

    A direct imaging transmission ion microscope (DITIM) was built from a modified webcam and a commercially available polonium-210 antistatic device mounted on an optics rail. The performance of the DITIM in radiographic mode is analyzed in terms of the line spread function (LSF) and modulation transfer function (MTF) for an opaque edge. Limitations of, potential uses for, and suggested improvements to the DITIM are also discussed.
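
    The opaque-edge analysis can be sketched directly: differentiate the edge-spread function to get the LSF, then take the normalized magnitude of its Fourier transform to get the MTF. The edge profile below is synthetic; a real analysis would oversample the edge and window the LSF before transforming.

```python
from cmath import exp, pi

def mtf_from_edge(esf, frequencies):
    """MTF from an edge-spread function (opaque-edge profile):
    finite-difference the ESF to get the LSF, then evaluate the
    normalized magnitude of its Fourier transform at the requested
    spatial frequencies (cycles per pixel)."""
    lsf = [esf[i + 1] - esf[i] for i in range(len(esf) - 1)]
    def magnitude(f):
        return abs(sum(v * exp(-2j * pi * f * x) for x, v in enumerate(lsf)))
    dc = magnitude(0.0)                      # zero-frequency normalization
    return [magnitude(f) / dc for f in frequencies]

# synthetic, slightly blurred edge profile (illustrative data)
edge = [0, 0, 0, 0, 1, 4, 6, 7, 7, 7, 7]
mtf = mtf_from_edge(edge, [0.0, 0.1, 0.25, 0.5])
```

As expected for a blurred edge, the MTF starts at 1 and falls monotonically toward the Nyquist frequency.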

  11. Mars orbiter laser altimeter: receiver model and performance analysis.

    PubMed

    Abshire, J B; Sun, X; Afzal, R S

    2000-05-20

    The design, calibration, and performance of the Mars Orbiter Laser Altimeter (MOLA) receiver are described. The MOLA measurements include the range to the surface, which is determined by the laser-pulse time of flight; the height variability within the footprint determined by the laser echo pulse width; and the apparent surface reflectivity determined by the ratio of the echo to transmitted pulse energies. PMID:18345159
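
    The three MOLA measurements reduce to simple relations: two-way time of flight to range, and an echo-to-transmitted energy ratio for apparent reflectivity. The numbers below are illustrative, not MOLA calibration values.

```python
C = 299_792_458.0  # speed of light, m/s

def range_from_tof(t_flight):
    """One-way range (m) from a laser pulse's two-way time of flight (s)."""
    return C * t_flight / 2.0

def apparent_reflectivity(e_echo, e_transmit):
    """Ratio of received echo energy to transmitted pulse energy
    (system and geometry factors are ignored in this sketch)."""
    return e_echo / e_transmit

# a ~2.67 ms round trip corresponds to ~400 km of orbital altitude
rng = range_from_tof(2.67e-3)
```

The third measurement, surface height variability within the footprint, comes from the broadening of the echo pulse width relative to the transmitted pulse and would need the beam geometry to convert to meters.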

  12. Performance Analysis of Distributed Object-Oriented Applications

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1998-01-01

    The purpose of this research was to evaluate the efficiency of a distributed simulation architecture which creates individual modules which are made self-scheduling through the use of a message-based communication system used for requesting input data from another module which is the source of that data. To make the architecture as general as possible, the message-based communication architecture was implemented using standard remote object architectures (Common Object Request Broker Architecture (CORBA) and/or Distributed Component Object Model (DCOM)). A series of experiments were run in which different systems are distributed in a variety of ways across multiple computers and the performance evaluated. The experiments were duplicated in each case so that the overhead due to message communication and data transmission can be separated from the time required to actually perform the computational update of a module each iteration. The software used to distribute the modules across multiple computers was developed in the first year of the current grant and was modified considerably to add a message-based communication scheme supported by the DCOM distributed object architecture. The resulting performance was analyzed using a model created during the first year of this grant which predicts the overhead due to CORBA and DCOM remote procedure calls and includes the effects of data passed to and from the remote objects. A report covering the distributed simulation software and the results of the performance experiments has been submitted separately. The above report also discusses possible future work to apply the methodology to dynamically distribute the simulation modules so as to minimize overall computation time.

  13. Analysis of thermoelastohydrodynamic performance of journal misaligned engine main bearings

    NASA Astrophysics Data System (ADS)

    Bi, Fengrong; Shao, Kang; Liu, Changwen; Wang, Xia; Zhang, Jian

    2015-05-01

    Understanding the working condition of engine main bearings is important for improving engine performance. However, thermal effects and thermally induced deformations of engine main bearings are rarely considered simultaneously in most studies. A typical finite element model is selected and the thermoelastohydrodynamic (TEHD) behavior of engine main bearings is investigated. The thermal hydrodynamic reaction and the journal misalignment effect are calculated by the finite difference method, and the deformation is calculated by the finite element method. The oil film pressure is solved numerically with Reynolds boundary conditions when the various bearing characteristics are calculated. The model considers a temperature-pressure-viscosity relationship for the lubricant, surface roughness effects, and an angular misalignment between the journal and the bearing. Numerical simulations of a typical I6 diesel engine main bearing are conducted and the importance of several contributing factors in mixed lubrication is discussed. The performance characteristics of misaligned main bearings under elastohydrodynamic (EHD) and TEHD loads of an I6 diesel engine are obtained, and the journal center orbit, minimum oil film thickness, and maximum oil film pressure of the main bearings are estimated over a wide range of engine operation. The model is verified through comparison with other published models. The TEHD performance of engine main bearings under journal misalignment is revealed, which is helpful for understanding the EHD and TEHD behavior of misaligned engine main bearings.

  14. Numerical analysis of maximal bat performance in baseball.

    PubMed

    Nicholls, Rochelle L; Miller, Karol; Elliott, Bruce C

    2006-01-01

    Metal baseball bats have been experimentally demonstrated to produce higher ball exit velocity (BEV) than wooden bats. In the United States, all bats are subject to BEV tests using hitting machines that rotate the bat in a horizontal plane. In this paper, a model of bat-ball impact was developed based on 3-D translational and rotational kinematics of a swing performed by high-level players. The model was designed to simulate the maximal performance of specific models of a wooden bat and a metal bat when swung by a player, and included material properties and kinematics specific to each bat. Impact dynamics were quantified using the finite element method (ANSYS/LSDYNA, version 6.1). Maximum BEV from both a metal (61.5 m/s) and a wooden (50.9 m/s) bat exceeded the 43.1 m/s threshold by which bats are certified as appropriate for commercial sale. The lower BEV from the wooden bat was attributed to a lower pre-impact bat linear velocity, and a more oblique impact that resulted in a greater proportion of BEV being lost to lateral and vertical motion. The results demonstrate the importance of factoring bat linear velocity and spatial orientation into tests of maximal bat performance, and have implications for the design of metal baseball bats. PMID:15878593

  15. Performance analysis of axial-flow mixing impellers

    SciTech Connect

    Wu, J.; Pullum, L.

    2000-03-01

    Theoretical formulations for impeller performance were evaluated based on a blade-element theory. These enable the calculation of the head and power vs. flow-rate curves of axial-flow impellers. The technique uses the lift and drag coefficients of the blade section of an impeller to calculate the spanwise swirl-velocity distribution. Using the angular-momentum equation, it is possible to calculate the corresponding spanwise distribution of the energy head of the impeller. Integration of these distributions of head and torque gives the impeller's performance. Parameters including the flow number, the power number, the thrust force number, and the swirl velocity can be found at the impeller operating point, determined using the head curve and an experimentally calibrated resistance curve. A laser Doppler velocimetry (LDV) system was used to measure the velocity distribution for different axial flow impellers in mixing tanks. Calculated flow and power numbers agreed well with the experimental results. Using the blade's spanwise head distribution and a set of calibrated flow-resistance data, it is also possible to estimate an impeller's outlet axial-velocity distribution. Predictions compared well with LDV experimental data. The effect of impeller-blade angle, number of blades, blade camber, and blade thickness on the performance of axial-flow impellers was investigated using the Agitator software.
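
    The angular-momentum integration step can be sketched numerically: given spanwise swirl and axial-velocity distributions, integrate annulus by annulus to get a flow-weighted head and a shaft torque. The Euler relation used (head increment U*c_theta/g, torque increment rho*c_theta*r*dQ) is standard turbomachinery theory, and all distributions and numbers below are illustrative, not from the paper.

```python
from math import pi

G = 9.81  # gravitational acceleration, m/s^2

def spanwise_performance(radii, axial_vel, swirl_vel, omega, rho=1000.0):
    """Trapezoid-style spanwise integration of swirl and axial velocity
    distributions into flow-weighted head (m) and shaft torque (N*m)."""
    head_num = head_den = torque = 0.0
    for i in range(len(radii) - 1):
        r = 0.5 * (radii[i] + radii[i + 1])       # mid-radius of annulus
        dr = radii[i + 1] - radii[i]
        cz = 0.5 * (axial_vel[i] + axial_vel[i + 1])
        ct = 0.5 * (swirl_vel[i] + swirl_vel[i + 1])
        dq = cz * 2.0 * pi * r * dr               # annular volumetric flow
        head_num += (omega * r * ct / G) * dq     # Euler head, flow-weighted
        head_den += dq
        torque += rho * ct * r * dq               # angular-momentum flux
    return head_num / head_den, torque, head_den

# illustrative distributions for a small axial impeller (hub to tip)
radii = [0.05, 0.10, 0.15]    # m
cz = [1.0, 1.2, 1.1]          # axial velocity, m/s
ct = [0.8, 0.5, 0.3]          # swirl velocity, m/s
head, torque, flow = spanwise_performance(radii, cz, ct, omega=30.0)
```

In the paper's procedure, the swirl distribution itself would first be derived from the blade section's lift and drag coefficients rather than assumed.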

  16. Performance Analysis of a Convection-Based Tilt Sensor

    NASA Astrophysics Data System (ADS)

    Ju Chan Choi; Seong Ho Kong

    2010-06-01

    This paper presents the fabrication sequence and performance-improvement methods for a convection-based tilt sensor, along with a packaging method that minimizes the effect of environmental temperature fluctuation. Both electrolytic-solution-based and air-convection-based tilt sensors realized with micro-electro-mechanical-systems (MEMS) technology have previously been reported by our research group. Although the MEMS-based electrolytic tilt sensor offers advantages such as a wider operating tilt range, lower cost, and compactness compared with commercial electrolytic tilt sensors, it still suffers from metal electrode corrosion, electrolyte deterioration, surface tension of the electrolyte, and difficulty in packaging. To avoid these drawbacks, a convective tilt sensor using an air medium instead of an electrolytic solution was proposed and its fundamental performance demonstrated in previous work. In this paper, the effect of the air-medium condition on the sensitivity of the proposed convective tilt sensor is investigated. In addition, a packaging method utilizing a Peltier device is presented to minimize environmental thermal effects without an additional temperature-compensation circuit. This technique is expected to similarly improve the performance and reliability of other sensors using gas media.

  17. Performance Testing using Silicon Devices - Analysis of Accuracy: Preprint

    SciTech Connect

    Sengupta, M.; Gotseff, P.; Myers, D.; Stoffel, T.

    2012-06-01

    Accurately determining PV module performance in the field requires accurate measurements of solar irradiance reaching the PV panel (i.e., plane-of-array (POA) irradiance) with known measurement uncertainty. Pyranometers are commonly based on thermopile or silicon photodiode detectors. Silicon detectors, including PV reference cells, are an attractive choice for reasons that include faster time response (10 μs) than thermopile detectors (1 s to 5 s), lower cost, and lower maintenance. The main drawback of silicon detectors is their limited spectral response. Therefore, to determine broadband POA solar irradiance, a pyranometer calibration factor that converts the narrowband response to broadband is required. Normally, this calibration factor is a single number determined under clear-sky conditions with respect to a broadband reference radiometer. The pyranometer is then used for various scenarios including varying airmass, panel orientation, and atmospheric conditions. This would not be an issue if all irradiance wavelengths that form the broadband spectrum responded uniformly to atmospheric constituents. Unfortunately, the scattering and absorption signature varies widely with wavelength, and the calibration factor for the silicon photodiode pyranometer is not appropriate for other conditions. This paper reviews the issues that arise from the use of silicon detectors for PV performance measurement in the field, based on measurements from a group of pyranometers mounted on a 1-axis solar tracker. We also present a comparison of simultaneous spectral and broadband measurements from silicon and thermopile detectors and estimate the measurement errors when using silicon devices for both array performance and resource assessment.

  18. Performance Analysis of the NAS Y-MP Workload

    NASA Technical Reports Server (NTRS)

    Bergeron, Robert J.; Kutler, Paul (Technical Monitor)

    1997-01-01

    This paper describes the performance characteristics of the computational workloads on the NAS Cray Y-MP machines, a Y-MP 832 and later a Y-MP 8128. Hardware measurements indicated that the Y-MP workload performance matured over time, ultimately sustaining an average throughput of 0.8 GFLOPS and a vector operation fraction of 87%. The measurements also revealed an operation rate exceeding 1 per clock period, a well-balanced architecture featuring strong utilization of the vector functional units, and an efficient memory organization. Introduction of the larger-memory 8128 increased throughput by allowing more efficient utilization of the CPUs. Throughput also depended on the metering of the batch queues; low-idle Saturday workloads required a buffer of small jobs to prevent memory starvation of the CPU. UNICOS required about 7% of total CPU time to service the 832 workloads; this overhead decreased to 5% for the 8128 workloads. While most of the system time went to servicing I/O requests, efficient scheduling prevented excessive idle due to I/O wait. System measurements disclosed no obvious bottlenecks in the response of the machine and UNICOS to the workloads. In most cases, Cray-provided software tools were quite sufficient for measuring the performance of both the machine and the operating system.

  19. Analysis of face deformation effects on gas film seal performance

    NASA Technical Reports Server (NTRS)

    Zuk, J.

    1972-01-01

    Analyses are presented for compressible fluid flow across shaft face seals with face deformation. The solutions are obtained from an approximate integral analysis. The models used in this analysis can predict gas film seal behavior at subsonic or choked flow conditions. The flow regime can be either laminar or turbulent, and entrance losses can also be accounted for. When fluid inertia effects are negligible and the sealing faces are slightly deformed, the following results hold for both laminar and turbulent flows: (1) the pressure profiles are independent of fluid properties; and (2) the parallel-film leakage equation can be used, provided a characteristic film thickness is used. Pressure profiles are presented for both divergent and convergent seal faces under choked flow conditions.

  20. DWT features performance analysis for automatic speech recognition of Urdu.

    PubMed

    Ali, Hazrat; Ahmad, Nasir; Zhou, Xianwei; Iqbal, Khalid; Ali, Sahibzada Muhammad

    2014-01-01

    This paper presents work on automatic speech recognition of the Urdu language, using a comparative analysis of Discrete Wavelet Transform (DWT) based features and Mel Frequency Cepstral Coefficients (MFCC). These features have been extracted for one hundred isolated words of Urdu, each word uttered by ten different speakers. The words have been selected from the most frequently used words of Urdu. A variety of ages and dialects has been covered by using a balanced corpus approach. After extraction of features, classification has been achieved by using Linear Discriminant Analysis. After the classification task, the confusion matrix obtained for the DWT features has been compared with the one obtained for MFCC-based speech recognition. The framework has been trained and tested on speech data recorded under controlled environments. The experimental results are useful for determining the optimum features for the speech recognition task. PMID:25674450
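    The DWT front end described above can be sketched with a numpy-only Haar transform (a minimal stand-in: the paper's wavelet family, feature set, and corpus are not specified here, and the two toy "utterances" below are synthetic tones):

```python
import numpy as np

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform:
    returns (approximation, detail) coefficients."""
    x = np.asarray(signal, dtype=float)
    if len(x) % 2:                       # pad odd-length input
        x = np.append(x, x[-1])
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def dwt_features(signal, levels=3):
    """Energy of the detail band at each level -- a compact feature vector."""
    feats = []
    a = signal
    for _ in range(levels):
        a, d = haar_dwt(a)
        feats.append(float(np.sum(d**2)))
    return np.array(feats)

# Toy "utterances": a low-frequency and a high-frequency tone.
t = np.linspace(0, 1, 512, endpoint=False)
low = np.sin(2 * np.pi * 5 * t)
high = np.sin(2 * np.pi * 60 * t)

f_low, f_high = dwt_features(low), dwt_features(high)
# The high-frequency tone puts far more energy into the detail bands.
print(f_low.round(3), f_high.round(3))
```

    Feature vectors like these would then feed a classifier such as the Linear Discriminant Analysis mentioned in the abstract.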

  1. Tribological performance analysis of textured steel surfaces under lubricating conditions

    NASA Astrophysics Data System (ADS)

    Singh, R. C.; Pandey, R. K.; Rooplal; Ranganath, M. S.; Maji, S.

    2016-09-01

    The tribological analysis of lubricated conformal contacts formed between the smooth/textured surfaces of steel discs and the smooth surface of steel pins under sliding conditions has been considered. The role of the dimples' pitch on the textured surfaces has been investigated experimentally to understand the variations of the coefficient of friction and wear at the tribo-contacts under fully flooded lubricated conditions. Substantial reductions in the coefficient of friction and wear at the tribo-interfaces have been observed in the presence of textures on the rotating discs, for both fully flooded and starved conditions, in comparison to the corresponding lubricating conditions of the interfaces formed between the smooth surfaces of disc and pin. In the presence of surface texture, the coefficient of friction reduces considerably at elevated sliding speeds (>2 m/s) and unit loads (>0.5 MPa) for the set of operating parameters considered in the analysis.

  2. Performance analysis of Integrated Communication and Control System networks

    NASA Technical Reports Server (NTRS)

    Halevi, Y.; Ray, A.

    1990-01-01

    This paper presents statistical analysis of delays in Integrated Communication and Control System (ICCS) networks that are based on asynchronous time-division multiplexing. The models are obtained in closed form for analyzing control systems with randomly varying delays. The results of this research are applicable to ICCS design for complex dynamical processes like advanced aircraft and spacecraft, autonomous manufacturing plants, and chemical and processing plants.

  3. Comparative analysis of the speed performance of texture analysis algorithms on a graphic processing unit (GPU)

    NASA Astrophysics Data System (ADS)

    Triana-Martinez, J.; Orjuela-Vargas, S. A.; Philips, W.

    2013-03-01

    This paper compares the speed performance of a set of classic image algorithms for evaluating texture in images using CUDA programming. We include a summary of the general programming model of CUDA. We select a set of texture algorithms, based on statistical analysis, that allow the use of repetitive functions, such as the co-occurrence matrix, Haralick features, and local binary pattern techniques. The memory allocation time between the host and device memory is not taken into account. The results of this approach show a comparison of the texture algorithms in terms of speed when executed on CPU and GPU processors. The comparison shows that the algorithms can be accelerated more than 40 times when implemented in the CUDA environment.
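    A minimal CPU reference for one of the texture measures listed above, a gray-level co-occurrence matrix (GLCM) with a horizontal offset of 1, can be written as follows; a CUDA port would parallelize the per-pixel-pair histogram accumulation (the tiny image is made up):

```python
import numpy as np

def glcm_horizontal(img, levels):
    """Count co-occurrences of gray levels (i, j) over pixel pairs
    (r, c) -> (r, c+1)."""
    g = np.zeros((levels, levels), dtype=np.int64)
    left, right = img[:, :-1].ravel(), img[:, 1:].ravel()
    np.add.at(g, (left, right), 1)   # histogram accumulation
    return g

img = np.array([[0, 0, 1],
                [0, 1, 1],
                [2, 2, 2]])
g = glcm_horizontal(img, levels=3)
print(g)

# Contrast (one Haralick feature): sum over (i - j)^2 * p(i, j).
i, j = np.indices(g.shape)
contrast = float(np.sum((i - j) ** 2 * g / g.sum()))
print(f"contrast = {contrast:.3f}")
```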

  4. The NetLogger Methodology for High Performance Distributed Systems Performance Analysis

    SciTech Connect

    Tierney, Brian; Johnston, William; Crowley, Brian; Hoo, Gary; Brooks, Chris; Gunter, Dan

    1999-12-23

    The authors describe a methodology that enables the real-time diagnosis of performance problems in complex high-performance distributed systems. The methodology includes tools for generating precision event logs that can be used to provide detailed end-to-end application and system level monitoring; a Java agent-based system for managing the large amount of logging data; and tools for visualizing the log data and real-time state of the distributed system. The authors developed these tools for analyzing a high-performance distributed system centered around the transfer of large amounts of data at high speeds from a distributed storage server to a remote visualization client. However, this methodology should be generally applicable to any distributed system. This methodology, called NetLogger, has proven invaluable for diagnosing problems in networks and in distributed systems code. This approach is novel in that it combines network, host, and application-level monitoring, providing a complete view of the entire system.

  5. A Cross-Platform Infrastructure for Scalable Runtime Application Performance Analysis

    SciTech Connect

    Jack Dongarra; Shirley Moore; Bart Miller; Jeffrey Hollingsworth; Tracy Rafferty

    2005-03-15

    The purpose of this project was to build an extensible cross-platform infrastructure to facilitate the development of accurate and portable performance analysis tools for current and future high performance computing (HPC) architectures. Major accomplishments include tools and techniques for multidimensional performance analysis, as well as improved support for dynamic performance monitoring of multithreaded and multiprocess applications. Previous performance tool development has been limited by the burden of having to re-write a platform-dependent low-level substrate for each architecture/operating system pair in order to obtain the necessary performance data from the system. Manual interpretation of performance data is not scalable for large-scale long-running applications. The infrastructure developed by this project provides a foundation for building portable and scalable performance analysis tools, with the end goal being to provide application developers with the information they need to analyze, understand, and tune the performance of terascale applications on HPC architectures. The backend portion of the infrastructure provides runtime instrumentation capability and access to hardware performance counters, with thread-safety for shared memory environments and a communication substrate to support instrumentation of multiprocess and distributed programs. Front-end interfaces provide tool developers with a well-defined, platform-independent set of calls for requesting performance data. End-user tools have been developed that demonstrate runtime data collection, on-line and off-line analysis of performance data, and multidimensional performance analysis. The infrastructure is based on two underlying performance instrumentation technologies. These technologies are the PAPI cross-platform library interface to hardware performance counters and the cross-platform Dyninst library interface for runtime modification of executable images. The Paradyn and KOJAK

  6. Uniprocessor Performance Analysis of a Representative Workload of Sandia National Laboratories' Scientific Applications.

    SciTech Connect

    Charles Laverty

    2005-10-01

    UNIPROCESSOR PERFORMANCE ANALYSIS OF A REPRESENTATIVE WORKLOAD OF SANDIA NATIONAL LABORATORIES' SCIENTIFIC APPLICATIONS Master of Science in Electrical Engineering New Mexico State University Las Cruces, New Mexico, 2005 Dr. Jeanine Cook, Chair Throughout the last decade, computer performance analysis has become absolutely necessary for maximizing the performance of many workloads. Sandia National Laboratories (SNL), located in Albuquerque, New Mexico, is no different: to achieve maximum performance of large scientific parallel workloads, performance analysis is needed at the uniprocessor level. A representative workload has been chosen as the basis of a computer performance study to determine optimal processor characteristics in order to better specify the next generation of supercomputers. Cube3, a finite element test problem developed at SNL, is representative of their scientific workloads. This workload has been studied at the uniprocessor level to understand characteristics of the microarchitecture that will lead to overall performance improvement at the multiprocessor level. The goal of studying this workload at the uniprocessor level is to build a performance prediction model that will be integrated into a multiprocessor performance model currently being developed at SNL. Through the use of performance counters on the Itanium 2 microarchitecture, performance statistics are studied to determine bottlenecks in the microarchitecture and/or changes in the application code that will maximize performance. From source code analysis, a performance-degrading loop kernel was identified, and through the use of compiler optimizations a performance gain of around 20% was achieved.

  7. Performance.

    PubMed

    Chambers, David W

    2006-01-01

    High performance is difficult to maintain because it is dynamic and not well understood. Based on a synthesis of many sources, a model is proposed where performance is a function of the balance between capacity and challenge. Too much challenge produces coping (or a crash); excess capacity results in boredom. Over time, peak performance drifts toward boredom. Performance can be managed by adjusting our level of ability, our effort, the opportunity to perform, and the challenge we agree to take on. Coping, substandard but acceptable performance, is common among professionals and its long-term side effects can be debilitating. A crash occurs when coping mechanisms fail. PMID:17020177

  8. Performance and flow analysis of vortex wind power turbines

    SciTech Connect

    Rangwalla, A.A.; Hsu, C.T.

    1982-10-01

    The theoretical study presented investigates some possible vortex flow solutions in the tornado-type wind energy system and evaluates the power coefficient that can be obtained theoretically. The actuator disc concept is applied to the vortex wind turbine configuration. The Burgers vortex model is then introduced and the performance of a turbine using it is derived. A generalized analytical solution of the model is given, followed by a numerical solution of the complete equations. The stability of a Burgers vortex is discussed. (LEW)
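    For reference, the Burgers vortex used in the study has a closed-form azimuthal velocity profile (shown here in one standard convention, with Gamma the circulation, a the axial strain rate, and nu the kinematic viscosity; the parameter values below are arbitrary):

```python
import numpy as np

# v_theta(r) = Gamma / (2 pi r) * (1 - exp(-a r^2 / (4 nu)))
# Near the axis this reduces to solid-body rotation, Gamma*a*r/(8 pi nu);
# far from the axis it approaches the potential vortex Gamma/(2 pi r).

def v_theta(r, gamma=1.0, a=2.0, nu=0.1):
    return gamma / (2 * np.pi * r) * (1 - np.exp(-a * r**2 / (4 * nu)))

r = np.linspace(0.01, 5, 500)
v = v_theta(r)
r_peak = r[np.argmax(v)]        # core radius scales with sqrt(nu/a)
print(f"peak azimuthal velocity at r = {r_peak:.2f}")
```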

  9. Vane structure design trade-off and performance analysis

    NASA Astrophysics Data System (ADS)

    Breault, Robert P.

    1989-04-01

    The APART/PADE and ASAP stray-light software packages (Breault, 1988) are applied to the design of vane structures to block direct propagation paths from the surfaces of optical baffles to other system components. Results for several typical systems are presented in extensive tables and graphs and analyzed. It is shown that vane angle and depth are significant parameters only for the first-order propagation path. Also evaluated are the amounts of particulate debris produced by degraded vane coatings and the effects of the resulting surface contamination on system performance.

  10. Dynamic Curvature Steering Control for Autonomous Vehicle: Performance Analysis

    NASA Astrophysics Data System (ADS)

    Aizzat Zakaria, Muhammad; Zamzuri, Hairi; Amri Mazlan, Saiful

    2016-02-01

    This paper discusses the design of dynamic curvature steering control for an autonomous vehicle. Both lateral control and longitudinal control are discussed. The controller is designed to estimate the path condition from a dynamic curvature calculation and to modify the vehicle speed and steering wheel angle accordingly. Simulation results are presented to show the capability of the controller to track the reference path. The controller is able to predict the path and modify the vehicle speed to suit the path condition. The effectiveness of the controller is demonstrated by achieving performance identical to the benchmark, but with an additional curvature adaptation capability.
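    The curvature estimate such a controller relies on can be sketched with finite differences along a sampled reference path (a hypothetical helper, not the paper's exact formulation), using kappa = |x'y'' - y'x''| / (x'^2 + y'^2)^(3/2):

```python
import numpy as np

def path_curvature(x, y):
    """Discrete curvature of a sampled planar path; the formula is
    invariant to the parametrization, so index spacing is fine."""
    dx, dy = np.gradient(x), np.gradient(y)
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    return np.abs(dx * ddy - dy * ddx) / (dx**2 + dy**2) ** 1.5

# Sanity check: a circle of radius 20 m has curvature 1/20 everywhere.
theta = np.linspace(0, 2 * np.pi, 400)
kappa = path_curvature(20 * np.cos(theta), 20 * np.sin(theta))
print(kappa[200])  # ~0.05

# A longitudinal speed schedule might then cap speed by a lateral-
# acceleration limit (3 m/s^2 is an assumed comfort value):
a_lat_max = 3.0
v_max = np.sqrt(a_lat_max / np.maximum(kappa, 1e-6))
```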

  11. Apparatus and method for performing microfluidic manipulations for chemical analysis

    SciTech Connect

    Ramsey, J.M.

    1999-12-14

    A microchip apparatus and method provide fluidic manipulations for a variety of applications, including sample injection for microchip liquid chromatography. The microchip is fabricated using standard photolithographic procedures and chemical wet etching, with the substrate and cover plate joined using direct bonding. Capillary electrophoresis is performed in channels formed in the substrate. Injections are made by electro-osmotically pumping sample through the injection channel that crosses the separation channel, followed by a switching of the potentials to force a plug into the separation channel.

  12. Received optical power calculations for optical communications link performance analysis

    NASA Technical Reports Server (NTRS)

    Marshall, W. K.; Burk, B. D.

    1986-01-01

    The factors affecting optical communication link performance differ substantially from those at microwave frequencies, due to the drastically differing technologies, modulation formats, and effects of quantum noise in optical communications. In addition, detailed design control table calculations for optical systems are less well developed than the corresponding microwave system techniques, reflecting the relatively less mature state of development of optical communications. Described below are detailed calculations of received optical signal and background power in optical communication systems, with emphasis on analytic models for accurately predicting transmitter and receiver system losses.

  13. Design and performance analysis of gas sorption compressors

    NASA Technical Reports Server (NTRS)

    Chan, C. K.

    1984-01-01

    Compressor kinetics based on gas adsorption and desorption by charcoal and on gas absorption and desorption by LaNi5 were analyzed using a two-phase model and a three-component model, respectively. The models assumed thermal and mechanical equilibria between phases or among the components. The analyses predicted performance well for compressors that have heaters located outside the adsorbent or absorbent bed. For the rapidly cycled compressor, where the heater was centrally located, only the transient pressure compared well with the experimental data.

  14. Performability analysis using semi-Markov reward processes

    NASA Technical Reports Server (NTRS)

    Ciardo, Gianfranco; Marie, Raymond A.; Sericola, Bruno; Trivedi, Kishor S.

    1990-01-01

    Beaudry (1978) proposed a simple method of computing the distribution of performability in a Markov reward process. Two extensions of Beaudry's approach are presented. The method is generalized to a semi-Markov reward process by removing the restriction requiring the association of zero reward to absorbing states only. The algorithm proceeds by replacing zero-reward nonabsorbing states by a probabilistic switch; it is therefore related to the elimination of vanishing states from the reachability graph of a generalized stochastic Petri net and to the elimination of fast transient states in a decomposition approach to stiff Markov chains. The use of the approach is illustrated with three applications.
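    The "probabilistic switch" replacement of zero-reward nonabsorbing states can be illustrated on a plain discrete-time chain (a simplified stand-in for the semi-Markov setting; the transition matrix below is made up). Partitioning states into rewarded (R) and zero-reward (Z), the reduced chain is P_RR + P_RZ (I - P_ZZ)^(-1) P_ZR:

```python
import numpy as np

# Made-up 4-state chain: states 0, 1 carry reward; 2, 3 are zero-reward
# and nonabsorbing. Eliminating Z treats each visit to Z as an
# instantaneous probabilistic switch back into R.
P = np.array([
    [0.5, 0.2, 0.3, 0.0],
    [0.1, 0.6, 0.0, 0.3],
    [0.4, 0.4, 0.1, 0.1],
    [0.5, 0.3, 0.2, 0.0],
])
R, Z = [0, 1], [2, 3]
P_RR = P[np.ix_(R, R)]
P_RZ = P[np.ix_(R, Z)]
P_ZR = P[np.ix_(Z, R)]
P_ZZ = P[np.ix_(Z, Z)]

# (I - P_ZZ)^(-1) sums over all excursions through Z before re-entering R.
P_red = P_RR + P_RZ @ np.linalg.inv(np.eye(len(Z)) - P_ZZ) @ P_ZR
print(P_red)
print(P_red.sum(axis=1))   # rows still sum to 1
```

    The same folding is what eliminates vanishing states from the reachability graph of a generalized stochastic Petri net, as the abstract notes.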

  15. Apparatus and method for performing microfluidic manipulations for chemical analysis

    DOEpatents

    Ramsey, J. Michael

    2002-01-01

    A microchip apparatus and method provide fluidic manipulations for a variety of applications, including sample injection for microchip liquid chromatography. The microchip is fabricated using standard photolithographic procedures and chemical wet etching, with the substrate and cover plate joined using direct bonding. Capillary electrophoresis is performed in channels formed in the substrate. Injections are made by electro-osmotically pumping sample through the injection channel that crosses the separation channel, followed by a switching of the potentials to force a plug into the separation channel.

  16. Apparatus and method for performing microfluidic manipulations for chemical analysis

    DOEpatents

    Ramsey, J. Michael

    1999-01-01

    A microchip apparatus and method provide fluidic manipulations for a variety of applications, including sample injection for microchip liquid chromatography. The microchip is fabricated using standard photolithographic procedures and chemical wet etching, with the substrate and cover plate joined using direct bonding. Capillary electrophoresis is performed in channels formed in the substrate. Injections are made by electro-osmotically pumping sample through the injection channel that crosses the separation channel, followed by a switching of the potentials to force a plug into the separation channel.

  17. Performance analysis of high quality parallel preconditioners applied to 3D finite element structural analysis

    SciTech Connect

    Kolotilina, L.; Nikishin, A.; Yeremin, A.

    1994-12-31

    The solution of large systems of linear equations is a crucial bottleneck when performing 3D finite element analysis of structures. Also, in many cases the reliability and robustness of iterative solution strategies, and their efficiency when exploiting hardware resources, fully determine the scope of industrial applications which can be solved on a particular computer platform. This is especially true for modern vector/parallel supercomputers with large vector length and for modern massively parallel supercomputers. Preconditioned iterative methods have been successfully applied to industrial-class finite element analysis of structures. The construction and application of high quality preconditioners constitutes a high percentage of the total solution time, and parallel implementation of high quality preconditioners on such architectures is a formidable challenge. Two common types of existing preconditioners are implicit and explicit. Implicit preconditioners (e.g., incomplete factorizations of several types) are generally of high quality but require the solution of lower and upper triangular systems of equations per iteration, which is difficult to parallelize without deteriorating the convergence rate. Explicit preconditioners (e.g., polynomial or Jacobi-like preconditioners) require only sparse matrix-vector multiplications and can be parallelized, but their preconditioning quality is less than desirable. The authors present results of numerical experiments with Factorized Sparse Approximate Inverses (FSAI) for symmetric positive definite linear systems. These are high quality preconditioners that possess a large resource of parallelism by construction, without increasing the serial complexity.

  18. Performance analysis of program execution on data flow systems

    SciTech Connect

    Jennings, S.F.; Oldehoeft, A.E.

    1983-01-01

    A Petri net model and graph analysis technique are presented for program execution on data flow systems. The model encompasses static systems, in which recurrent computations reuse the same copy of a program node, and dynamic systems, in which reuse of a node results in the creation of a new copy. Program execution time, assuming sufficient resources, is analyzed using the theory of state-machine-decomposable Petri nets and an approximation technique based on graph reduction. Numerical simulation results are presented to validate the model for static systems.

  19. Human Performance Modeling for Dynamic Human Reliability Analysis

    SciTech Connect

    Boring, Ronald Laurids; Joe, Jeffrey Clark; Mandelli, Diego

    2015-08-01

    Part of the U.S. Department of Energy’s (DOE’s) Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk framework. In this paper, we review simulation-based and non-simulation-based human reliability analysis (HRA) methods. This paper summarizes the foundational information needed to develop a feasible approach to modeling human interactions in RISMC simulations.

  20. Performance analysis of replication ALOHA for fading mobile communications channels

    NASA Technical Reports Server (NTRS)

    Yan, Tsun-Yee; Clare, Loren P.

    1986-01-01

    This paper describes an ALOHA random access protocol for fading communications channels. A two-state Markov model is used for the channel error process to account for the channel fading memory. The ALOHA protocol is modified to send multiple contiguous copies of a message at each transmission attempt. Both pure and slotted ALOHA channels are considered. The analysis is applicable to fading environments where the channel memory is short compared to the propagation delay. It is shown that replication can achieve smaller delay and, in noisy conditions, can also improve throughput.
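    A Monte Carlo sketch of the replication idea on a two-state channel with memory (Gilbert-style; all parameters below are invented): a copy survives only if every slot it occupies is in the Good state, and a transmission succeeds if at least one of the contiguous copies survives.

```python
import numpy as np

rng = np.random.default_rng(1)
p_gb, p_bg = 0.05, 0.2          # Good->Bad and Bad->Good probabilities
copy_len, n_trials = 4, 20000   # slots per copy, trials per estimate

def success_prob(n_copies):
    wins = 0
    for _ in range(n_trials):
        # Start the channel in its steady-state distribution.
        good = rng.random() < p_bg / (p_gb + p_bg)
        slots = []
        for _ in range(n_copies * copy_len):
            slots.append(good)
            # Flip state with the appropriate transition probability.
            good = good != (rng.random() < (p_gb if good else p_bg))
        # Success if any contiguous copy saw only Good slots.
        ok = any(all(slots[i * copy_len:(i + 1) * copy_len])
                 for i in range(n_copies))
        wins += ok
    return wins / n_trials

p1, p3 = success_prob(1), success_prob(3)
print(f"1 copy: {p1:.3f}   3 copies: {p3:.3f}")
```

    Because the bad state has memory, the three copies are correlated, so the gain is smaller than three independent tries would give, but it is still substantial.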

  1. Navier-Stokes analysis of radial turbine rotor performance

    NASA Technical Reports Server (NTRS)

    Larosiliere, L. M.

    1993-01-01

    An analysis of flow through a radial turbine rotor using the three-dimensional, thin-layer Navier-Stokes code RVC3D is described. The rotor is a solid version of an air-cooled metallic radial turbine having thick trailing edges, shroud clearance, and scalloped-backface clearance. Results are presented at the nominal operating condition using both a zero-clearance model and a model simulating the effects of the shroud and scalloped-backface clearance flows. A comparison with the available test data is made and details of the internal flow physics are discussed, allowing a better understanding of the complex flow distribution within the rotor.

  2. Methods for Analysis of Outdoor Performance Data (Presentation)

    SciTech Connect

    Jordan, D.

    2011-02-01

    The ability to accurately predict power delivery over time is of vital importance to the growth of the photovoltaic (PV) industry. Two key cost drivers are the efficiency with which sunlight is converted to power and how this conversion degrades over time. Accurate knowledge of the power decline over time, known as the degradation rate, is essential and important to all stakeholders: utility companies, integrators, investors, and scientists alike. Different methods to determine degradation rates, and discrete versus continuous data, are presented, and some general best-practice methods are outlined. In addition, historical degradation rates and some preliminary analysis with respect to climate are given.
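    A common way to estimate a degradation rate from such outdoor data is a straight-line fit of normalized performance against time, reading the slope off in %/year (the data below are synthetic, generated with a true rate of -0.8 %/year plus noise):

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(0, 10, 1 / 12)     # monthly samples over 10 years
# Synthetic normalized performance: linear decline plus measurement noise.
perf = 1.0 - 0.008 * years + rng.normal(0, 0.01, years.size)

# Least-squares linear fit; np.polyfit returns [slope, intercept].
slope, intercept = np.polyfit(years, perf, 1)
rate_pct_per_year = 100 * slope / intercept
print(f"estimated degradation rate: {rate_pct_per_year:.2f} %/year")
```

    Robust or seasonal-decomposition variants exist for noisier, seasonally varying field data; the plain fit is just the simplest instance.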

  3. A Preliminary Analysis of LANDSAT-4 Thematic Mapper Radiometric Performance

    NASA Technical Reports Server (NTRS)

    Justice, C.; Fusco, L.; Mehl, W.

    1985-01-01

    The NASA raw (BT) product, the radiometrically corrected (AT) product, and the radiometrically and geometrically corrected (PT) product of a TM scene were analyzed to examine the frequency distribution of the digital data, the statistical correlation between the bands, and the variability between the detectors within a band. The analyses were performed on a series of image subsets from the full scene. Results are presented from one 1024 x 1024 pixel subset of Reelfoot Lake, Tennessee, which displayed a representative range of ground conditions and cover types occurring within the full-frame image. From this cursory examination of one of the first seven-channel TM data sets, it would appear that the radiometric performance of the system is most satisfactory and largely meets pre-launch specifications. Problems were noted with Band 5 Detector 3 and Band 2 Detector 4. Differences were observed between forward and reverse scan detector responses for both the BT and AT products. No systematic variations were observed between odd and even detectors.

  4. Performance analysis of a large-grain dataflow scheduling paradigm

    NASA Technical Reports Server (NTRS)

    Young, Steven D.; Wills, Robert W.

    1993-01-01

    A paradigm for scheduling computations on a network of multiprocessors using large-grain data flow scheduling at run time is described and analyzed. The computations to be scheduled must follow a static flow graph, while the schedule itself will be dynamic (i.e., determined at run time). Many applications characterized by static flow exist, and they include real-time control and digital signal processing. With the advent of computer-aided software engineering (CASE) tools for capturing software designs in dataflow-like structures, macro-dataflow scheduling becomes increasingly attractive, if not necessary. For parallel implementations, using the macro-dataflow method allows the scheduling to be insulated from the application designer and enables the maximum utilization of available resources. Further, by allowing multitasking, processor utilizations can approach 100 percent while they maintain maximum speedup. Extensive simulation studies are performed on 4-, 8-, and 16-processor architectures that reflect the effects of communication delays, scheduling delays, algorithm class, and multitasking on performance and speedup gains.

  5. Performance analysis of LDPC codes on OOK terahertz wireless channels

    NASA Astrophysics Data System (ADS)

    Chun, Liu; Chang, Wang; Jun-Cheng, Cao

    2016-02-01

    Atmospheric absorption, scattering, and scintillation are the major causes of degraded transmission quality in terahertz (THz) wireless communications. An error control coding scheme based on low-density parity-check (LDPC) codes with a soft-decision decoding algorithm is proposed to improve the bit-error-rate (BER) performance of an on-off keying (OOK) modulated THz signal through an atmospheric channel. The THz wave propagation characteristics and a channel model for the atmosphere are set up. Numerical simulations validate the strong performance of LDPC codes against atmospheric fading and demonstrate their great potential for future ultra-high-speed (beyond Gbps) THz communications. Project supported by the National Key Basic Research Program of China (Grant No. 2014CB339803), the National High Technology Research and Development Program of China (Grant No. 2011AA010205), the National Natural Science Foundation of China (Grant Nos. 61131006, 61321492, and 61204135), the Major National Development Project of Scientific Instrument and Equipment (Grant No. 2011YQ150021), the National Science and Technology Major Project (Grant No. 2011ZX02707), the International Collaboration and Innovation Program on High Mobility Materials Engineering of the Chinese Academy of Sciences, and the Shanghai Municipal Commission of Science and Technology (Grant No. 14530711300).

  6. The performance analysis of linux networking - packet receiving

    SciTech Connect

    Wu, Wenji; Crawford, Matt; Bowden, Mark; /Fermilab

    2006-11-01

    The computing models for High-Energy Physics experiments are becoming ever more globally distributed and grid-based, both for technical reasons (e.g., to place computational and data resources near each other and the demand) and for strategic reasons (e.g., to leverage equipment investments). To support such computing models, the network and the end systems (computing and storage) face unprecedented challenges. One of the biggest challenges is to transfer scientific data sets--now in the multi-petabyte (10^15 bytes) range and expected to grow to exabytes within a decade--reliably and efficiently among facilities and computation centers scattered around the world. Both the network and end systems should be able to provide the capabilities to support high bandwidth, sustained, end-to-end data transmission. Recent trends in technology show that although the raw transmission speeds used in networks are increasing rapidly, the rate of advancement of microprocessor technology has slowed down. Therefore, network protocol-processing overheads have risen sharply in comparison with the time spent in packet transmission, resulting in degraded throughput for networked applications. More and more, it is the network end system, instead of the network, that is responsible for degraded performance of network applications. In this paper, the Linux system's packet receive process is studied from NIC to application. We develop a mathematical model to characterize the Linux packet receiving process. Key factors that affect Linux system network performance are analyzed.

  7. Structural Design and Sealing Performance Analysis of Biomimetic Sealing Ring.

    PubMed

    Han, Chuanjun; Zhang, Han; Zhang, Jie

    2015-01-01

    In order to reduce the failure probability of rubber sealing rings in reciprocating dynamic seal, a new structure of sealing ring based on bionics was designed. The biomimetic ring has three concave ridges and convex bulges on each side which are very similar to earthworms. Bulges were circularly designed and sealing performances of the biomimetic ring in both static seal and dynamic seal were simulated by FEM. In addition, effects of precompression, medium pressure, speed, friction coefficient, and material parameters on sealing performances were discussed. The results show that von Mises stress of the biomimetic sealing ring distributed symmetrically in no-pressure static sealing. The maximum von Mises stress appears on the second bulge of the inner side. High contact stress concentrates on left bulges. Von Mises stress distribution becomes uneven under medium pressure. Both von Mises stress and contact stress increase when precompression, medium pressure, and rubber hardness increase in static sealing. Biomimetic ring can avoid rolling and distortion in reciprocating dynamic seal, and its working life is much longer than O-ring and rectangular ring. The maximum von Mises stress and contact stress increase with the precompression, medium pressure, rubber hardness, and friction coefficient in reciprocating dynamic seal. PMID:27019582

  8. Performance analysis of flexible DSSC with binder addition

    NASA Astrophysics Data System (ADS)

    Muliani, Lia; Hidayat, Jojo; Anggraini, Putri Nur

    2016-04-01

    Flexible DSSCs are one modification of the DSSC based on its substrate. Operating at low temperature, a flexible DSSC requires a binder to improve particle interconnection. This research compares the morphology and performance of flexible DSSCs produced with and without binder. TiO2 powder, butanol, and HCl were mixed to prepare the TiO2 paste, and a small amount of titanium isopropoxide was added to the mixture as binder. The TiO2 paste was deposited on an ITO-PET plastic substrate over an area of 1 x 1 cm2 by the doctor-blade method. SEM, XRD, and BET characterization were then performed to analyze the morphology and surface area of the TiO2 photoelectrode microstructures. The dyed TiO2 photoelectrode and a platinum counter electrode were assembled and injected with electrolyte. Finally, the flexible DSSCs were illuminated by a sun simulator for J-V measurement. The binder-containing flexible DSSC showed higher performance, with a photoconversion efficiency of 0.31%.

  9. Cost-Effective Hyperspectral Transmissometers for Oceanographic Applications: Performance Analysis

    PubMed Central

    Ramírez-Pérez, Marta; Röttgers, Rüdiger; Torrecilla, Elena; Piera, Jaume

    2015-01-01

    The recent development of inexpensive, compact hyperspectral transmissometers broadens the research capabilities of oceanographic applications. These developments have been achieved by incorporating technologies such as micro-spectrometers as detectors and light emitting diodes (LEDs) as light sources. In this study, we evaluate the performance of the new commercial LED-based hyperspectral transmissometer VIPER (TriOS GmbH, Rastede, Germany), which combines different LEDs to emulate the visible light spectrum, aiming at the determination of attenuation coefficients in coastal environments. For this purpose, experimental uncertainties related to instrument stability and the effect of ambient light are analyzed, and temperature and salinity correction factors are derived. Our results identify some issues related to the thermal management of the LEDs and contamination by ambient light. Furthermore, the performance of VIPER is validated against other transmissometers through simultaneous field measurements. It is demonstrated that VIPER provides a compact and cost-effective alternative for beam attenuation measurements in coastal waters, but it requires several optimizations. PMID:26343652
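
The quantity a transmissometer retrieves is the beam attenuation coefficient, obtained from the measured transmittance over a known path via the Beer-Lambert law, c = -ln(T)/L. A minimal sketch with illustrative numbers (the path length and transmittance are assumptions, not VIPER specifications):

```python
import math

def beam_attenuation(transmittance, pathlength_m):
    """Beam attenuation coefficient c (1/m) from Beer-Lambert: T = exp(-c*L)."""
    return -math.log(transmittance) / pathlength_m

# Hypothetical coastal-water reading: 60% transmission over a 0.25 m path.
c = beam_attenuation(0.60, 0.25)
print(round(c, 3))  # 2.043 (1/m)
```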

  11. Advanced flight design systems subsystem performance models. Sample model: Environmental analysis routine library

    NASA Technical Reports Server (NTRS)

    Parker, K. C.; Torian, J. G.

    1980-01-01

    A sample environmental control and life support model performance analysis using the environmental analysis routines library is presented. An example of a complete model set up and execution is provided. The particular model was synthesized to utilize all of the component performance routines and most of the program options.

  12. Performance analysis of elite men's and women's wheelchair basketball teams.

    PubMed

    Gómez, Miguel Ángel; Pérez, Javier; Molik, Bartosz; Szyman, Robert J; Sampaio, Jaime

    2014-01-01

    The purpose of the present study was to identify which game-related statistics discriminate between winning and losing teams in men's and women's elite wheelchair basketball. The sample comprised all games played during the 2008 Beijing Paralympics and the 2010 World Wheelchair Basketball Championship. Game-related statistics were gathered from the official box scores, and the data were analysed in two groups: balanced games (final score differences ≤ 12 points) and unbalanced games (final score differences ≥ 13 points). Discriminant analysis identified successful 2-point field-goals and free-throws, unsuccessful 3-point field-goals and free-throws, assists, and fouls received as statistics discriminating between winning and losing teams in men's balanced games. In women's games, the teams were discriminated only by successful 2-point field-goals. Linear regression analysis showed that the quality of opposition had a great effect on the final point differential. Field-goal percentage and free-throw rate were the most important factors in men's games; field-goal percentage and offensive rebounding percentage in women's games. The identified trends improve game understanding and help wheelchair basketball coaches to plan accurate practice sessions and, ultimately, to make better decisions in competition. PMID:24506819
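
The core technique here, separating two groups by a linear combination of box-score statistics, can be sketched with a two-class Fisher discriminant. The data below are invented (the study used official box scores), and only two of the paper's variables are kept for readability:

```python
# Minimal two-class Fisher discriminant on hypothetical box-score features
# (successful 2-point field goals, assists) for winning vs. losing teams.
# All numbers are made up for illustration.

def mean(rows):
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def within_scatter(rows, m):
    s = [[0.0, 0.0], [0.0, 0.0]]
    for r in rows:
        d = [r[0] - m[0], r[1] - m[1]]
        for i in range(2):
            for j in range(2):
                s[i][j] += d[i] * d[j]
    return s

winners = [[22, 14], [25, 16], [21, 12], [24, 15]]
losers  = [[15, 8],  [17, 10], [14, 7],  [16, 9]]

m1, m2 = mean(winners), mean(losers)
s1, s2 = within_scatter(winners, m1), within_scatter(losers, m2)
sw = [[s1[i][j] + s2[i][j] for j in range(2)] for i in range(2)]

# Fisher direction w = Sw^-1 (m1 - m2), via the closed-form 2x2 inverse
det = sw[0][0] * sw[1][1] - sw[0][1] * sw[1][0]
dm = [m1[0] - m2[0], m1[1] - m2[1]]
w = [( sw[1][1] * dm[0] - sw[0][1] * dm[1]) / det,
     (-sw[1][0] * dm[0] + sw[0][0] * dm[1]) / det]

# Project onto w: winners fall above the midpoint threshold, losers below.
proj = lambda r: w[0] * r[0] + w[1] * r[1]
threshold = (proj(m1) + proj(m2)) / 2
print(all(proj(r) > threshold for r in winners))  # True
print(all(proj(r) < threshold for r in losers))   # True
```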

  13. Energy performance standards for new buildings: economic analysis

    SciTech Connect

    Not Available

    1980-01-01

    This assessment determines the major economic impacts of the implementation of the standards on affected groups, and evaluates the effectiveness of the standards as an investment in energy conservation. The analyses examine the impacts on individual owners, construction industry, and the Nation. Chapter 2, Summary, briefly displays the results of the analysis. Chapter 3, Approach, describes the methodology used to evaluate the standards for the various building types and perspectives. The basis and structure for evaluating the standards' impacts on occupants and the Nation are described. Chapter 4, Building Microeconomics, evaluates the net economic effects of changes in building cost and energy use for three categories of buildings: single family residential, commercial and multifamily residential, and mobile homes. Chapter 5, Primary National Impacts, develops forecasts of energy savings and national costs and benefits both with and without implementation of the standards. Chapter 6, Impacts on Selected Building Industries, estimates changes in labor and material use in building construction and assesses the importance of these changes. Chapter 7, Net National Impacts, assesses the effects of changes in energy consumption and construction of new buildings on the national economy, including such factors as national income, investment, employment, and balance of trade. Details of models and data bases used in the analysis are included in Appendixes A through I. (MCW)
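
Evaluating a standard "as an investment in energy conservation", as the assessment does, usually comes down to comparing the added first cost against the discounted value of the annual energy savings. A hedged sketch with invented figures (not taken from the assessment):

```python
# Present value of a stream of annual energy savings vs. added first cost.
# All dollar amounts, the discount rate, and the horizon are hypothetical.

def present_value(annual_saving, discount_rate, years):
    return sum(annual_saving / (1 + discount_rate) ** t for t in range(1, years + 1))

extra_first_cost = 2000.0        # added construction cost, $
annual_energy_saving = 250.0     # energy cost avoided per year, $
pv = present_value(annual_energy_saving, 0.07, 30)
print(round(pv, 2), pv > extra_first_cost)  # net benefit is positive in this example
```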

  14. Integrative Performance Analysis of a Novel Bone Level Tapered Implant.

    PubMed

    Dard, M; Kuehne, S; Obrecht, M; Grandin, M; Helfenstein, J; Pippenger, B E

    2016-03-01

    Primary mechanical stability, as measured by maximum insertion torque and resonance frequency analysis, is generally considered to be positively associated with successful secondary stability and implant success. Primary implant stability can be affected by several factors, including the quality and quantity of available bone, the implant design, and the surgical procedure. The use of a tapered implant design, for instance, has been shown to result in good primary stability even in clinical scenarios where primary stability is otherwise difficult to achieve with traditional cylindrical implants-for example, in soft bone and for immediate placement in extraction sockets. In this study, bone-type specific drill procedures are presented for a novel Straumann bone level tapered implant that ensure maximum insertion torque values are kept within the range of 15 to 80 Ncm. The drill procedures are tested in vitro using polyurethane foam blocks of variable density, ex vivo on explanted porcine ribs (bone type 3), and finally in vivo on porcine mandibles (bone type 1). In each test site, adapted drill procedures are found to achieve a good primary stability. These results are further translated into a finite element analysis model capable of predicting primary stability of tapered implants. In conclusion, we have assessed the biomechanical behavior of a novel taper-walled implant in combination with a bone-type specific drill procedure in both synthetic and natural bone of various types, and we have developed an in silico model for predicting primary stability upon implantation. PMID:26927485

  15. A kinematics analysis of three best 100 m performances ever.

    PubMed

    Krzysztof, Maćkała; Mero, Antti

    2013-03-01

    The purpose of this investigation was to compare the morphological characteristics and the variability of running speed parameters (stride length and stride frequency) across Usain Bolt's three best 100 m performances, and to determine their relevance. On this basis, an attempt was made to define which factors determine the performance of Bolt's sprint and therefore distinguish him from other sprinters. We analyzed the previous world record of 9.69 s set at the 2008 Beijing Olympics, the current record of 9.58 s set at the 2009 Berlin World Championships in Athletics, and the Olympic record of 9.63 s set at the 2012 London Olympic Games. The VirtualDub program allowed the acquisition of basic kinematic variables, such as step length and step frequency over the 100 m sprint, from video footage provided by the NBC and BBC TV stations. These data were compared with other data available on the web and with data published by the Scientific Research Project office acting on behalf of the IAAF and the German Athletics Association (DLV). The main hypothesis was that step length is the main factor determining running speed in the 10 m and 20 m sections of the 100 m distance. Bolt's anthropometric advantage (body height, leg length, and linear body build) is unquestionable, and it is one of the factors that made him faster than the rest of the finalists in each of the three competitions. Additionally, Bolt's roughly 20 cm longer stride is an advantage in the latter part of the race. Beyond these factors, he is probably able to strike the ground more forcefully than the other sprinters, relative to body mass, and may therefore maximize his time on the ground and exert the same force over this period. This ability, combined with a longer stride, allows him to reach very high running speeds - over 12 m/s (12.05-12.34 m/s) - in some 10 m sections of his three 100 m performances. These assumptions were confirmed by the application of Ballerieich
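
The decomposition underlying the analysis is simply v = step length x step frequency. The sketch below plugs in approximate published figures for the 9.58 s race, used here only to illustrate the arithmetic, and lands inside the 12.05-12.34 m/s band quoted above:

```python
# Running speed decomposed into step length and step frequency:
# v = step_length * step_rate. The 2.77 m / 4.4 Hz figures are approximate
# illustrative values, not measurements from this study.

def speed(step_length_m, steps_per_s):
    return step_length_m * steps_per_s

v = speed(2.77, 4.4)
print(round(v, 2))  # 12.19 m/s
```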

  16. Pulse Amplitude and Delay Modulation: Design and performance analysis

    NASA Astrophysics Data System (ADS)

    Slaiman, Iskandar; Tang, Tong Boon; Hamid, Nor Hisham

    2015-06-01

    Power-efficient modulation techniques have previously been proposed to provide the uplink in visible light communication systems. However, such techniques have poor bandwidth utilization, as multiple bits are mapped to a single narrow pulse. When the bandwidth is limited, the optical power penalty becomes very high and the data rate poor. In this paper we introduce a new modulation technique called Pulse Amplitude and Delay Modulation (PADM). We compare its performance with Dual Header Pulse Interval Modulation (DH-PIM), which has the best reported bandwidth efficiency. Experimental results show that the data rate could be enhanced from 3.2 kbps to 4.3 kbps over a red link (640 nm) at the same error rate. This suggests PADM has better bandwidth efficiency than DH-PIM.
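
The "multiple bits per narrow pulse" trade-off can be seen in plain pulse interval modulation, the textbook scheme the DH-PIM family refines. This sketch shows generic PIM only; it is not the exact DH-PIM or PADM symbol structure from the paper:

```python
# Textbook pulse interval modulation (PIM): each symbol is one pulse
# followed by a number of empty slots equal to the data value. Symbol
# length therefore varies with the data, which is the bandwidth-utilization
# issue the abstract refers to.

def pim_encode(values, bits_per_symbol):
    slots = []
    for v in values:
        assert 0 <= v < 2 ** bits_per_symbol
        slots.append(1)            # pulse marks the symbol start
        slots.extend([0] * v)      # v empty slots encode the value
    return slots

print(pim_encode([0, 3, 1], bits_per_symbol=2))  # [1, 1, 0, 0, 0, 1, 0]
```

With b bits per symbol the average PIM symbol spans 1 + (2**b - 1)/2 slots, so higher-order symbols are power efficient but slot hungry.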

  17. Spaceborne Doppler Precipitation Radar: System Configurations and Performance Analysis

    NASA Technical Reports Server (NTRS)

    Tanelli, Simone; Im, Eastwood

    2004-01-01

    Knowledge of the global distribution of the vertical velocity of precipitation is important in the study of energy transport in the atmosphere, climate, and weather. Such knowledge can only be directly acquired with spaceborne Doppler precipitation radars. Although the high relative speed of the radar with respect to the rainfall particles introduces significant broadening of the Doppler spectrum, recent studies have shown that the average vertical velocity can be measured to acceptable accuracy levels by appropriate selection of radar parameters. Furthermore, methods to correct for specific errors arising from non-uniform beam filling (NUBF) effects and pointing uncertainties have recently been developed. In this paper we present the results of trade studies on the performance of a spaceborne Doppler radar with different system parameter configurations.
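
The velocity retrieval rests on the Doppler relation v = f_d x lambda / 2. A minimal sketch, with the carrier frequency and shift chosen as illustrative values (13.8 GHz is a typical precipitation-radar band, not necessarily this system's):

```python
# Radial velocity from a measured Doppler shift: v = f_d * wavelength / 2.
# Carrier frequency and shift below are assumed example values.

C = 299_792_458.0  # speed of light, m/s

def radial_velocity(doppler_shift_hz, carrier_hz):
    wavelength = C / carrier_hz
    return doppler_shift_hz * wavelength / 2.0

v = radial_velocity(doppler_shift_hz=460.0, carrier_hz=13.8e9)
print(round(v, 2))  # 5.0 m/s
```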

  18. Performance and design analysis of ballistic reusable SSTO launch vehicles

    NASA Astrophysics Data System (ADS)

    Koelle, Dietrich E.

    Based on previous MBB system studies from 1969 and 1986 on single-stage ballistic launch vehicles with vertical take-off and landing (VTOL), a review is presented of the performance and design criteria of such advanced launch systems with respect to the present state of the art. This type of launch vehicle is a prime candidate for an economical future space transportation system in the medium-size payload class. Ascent trajectory optimization, which is more difficult than for a multistage rocket, reveals the requirement for careful thrust variation (reduction) during ascent, as well as a high takeoff acceleration, in order to meet the minimum velocity requirement and maximize payload. Further, the impacts of vehicle net mass and average specific impulse are presented, as well as the design options for the single-stage-to-orbit (SSTO) propulsion system and other specific design features.

  19. Structural analysis of amorphous phosphates using high performance liquid chromatography

    SciTech Connect

    Sales, B.C.; Boatner, L.A.; Chakoumakos, B.C.; McCallum, J.C.; Ramey, J.O.; Zuhr, R.A.

    1993-12-31

    Determining the atomic-scale structure of amorphous solids has proven to be a formidable scientific and technological problem for the past 100 years. The technique of high-performance liquid chromatography (HPLC) provides unique detailed information regarding the structure of partially disordered or amorphous phosphate solids. Applications of the experimental technique of HPLC to phosphate solids are reviewed, and examples of the type of information that can be obtained with HPLC are presented. Inorganic phosphates encompass a large class of important materials whose applications include: catalysts, ion-exchange media, solid electrolytes for batteries, linear and nonlinear optical components, chelating agents, synthetic replacements for bone and teeth, phosphors, detergents, and fertilizers. Phosphate ions also represent a unique link between living systems and the inorganic world.

  20. Operational Performance Analysis of Passive Acoustic Monitoring for Killer Whales

    SciTech Connect

    Matzner, Shari; Fu, Tao; Ren, Huiying; Deng, Zhiqun; Sun, Yannan; Carlson, Thomas J.

    2011-09-30

    For the planned tidal turbine site in Puget Sound, WA, the main concern is to protect Southern Resident Killer Whales (SRKW), given their Endangered Species Act status. A passive acoustic monitoring system is proposed because the whales emit vocalizations that such a system can detect. The detection algorithm is implemented in two stages: the first is an energy detector designed to detect candidate signals; the second is a spectral classifier designed to reduce false alarms. The evaluation of the detection algorithm presented here incorporates behavioral models of the species of interest and environmental models of noise levels and potential false-alarm sources to provide a realistic characterization of expected operational performance.
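
The first stage described above, an energy detector, can be sketched as a windowed sum of squared samples compared against a threshold. The signal and threshold here are synthetic; a real system would estimate the noise floor adaptively:

```python
# First-stage energy detector: flag windows whose energy exceeds a
# threshold. Data below are synthetic, purely for illustration.

def energy_detector(samples, window, threshold):
    hits = []
    for start in range(0, len(samples) - window + 1, window):
        energy = sum(x * x for x in samples[start:start + window])
        if energy > threshold:
            hits.append(start)
    return hits

quiet = [0.1, -0.1] * 8                 # background-noise-like segment
call  = [0.9, -0.8, 0.7, -0.9] * 4      # loud vocalization-like burst
signal = quiet + call + quiet

print(energy_detector(signal, window=16, threshold=1.0))  # [16] -> the burst
```

Candidate windows flagged this way would then be passed to the second-stage spectral classifier to weed out false alarms.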

  1. Performance analysis of SA-3 missile second stage

    NASA Technical Reports Server (NTRS)

    Helmy, A. M.

    1981-01-01

    One SA-3 missile was disassembled. The constituents of the second stage were thoroughly investigated for geometrical details. The second stage slotted composite propellant grain was subjected to mechanical properties testing, physiochemical analyses, and burning rate measurements at different conditions. To determine the propellant performance parameters, the slotted composite propellant grain was machined into a set of small-size tubular grains. These grains were fired in a small size rocket motor with a set of interchangeable nozzles with different throat diameters. The firings were carried out at three different conditions. The data from test motor firings, physiochemical properties of the propellant, burning rate measurement results and geometrical details of the second stage motor, were used as input data in a computer program to compute the internal ballistic characteristics of the second stage.

  2. Benchmarking and performance analysis of the CM-2. [SIMD computer

    NASA Technical Reports Server (NTRS)

    Myers, David W.; Adams, George B., II

    1988-01-01

    A suite of benchmarking routines testing communication, basic arithmetic operations, and selected kernel algorithms, written in LISP and PARIS, was developed for the CM-2. Experimental runs are automated via a software framework that sequences the individual tests, allowing unattended overnight operation. Multiple measurements are made and treated statistically to generate well-characterized results from the noisy values given by cm:time. The results obtained provide a comparison with similar, but less extensive, testing done on a CM-1. Tests were chosen to aid the algorithmist in constructing fast, efficient, and correct code on the CM-2, as well as to gain insight into what performance criteria are needed when evaluating parallel processing machines.
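
The repeated-measurement idea, run each test several times and report statistics rather than a single noisy reading, translates directly to a modern timing harness. Here time.perf_counter stands in for the CM-2's cm:time; the benchmarked workload is an arbitrary placeholder:

```python
# Repeated-measurement benchmark harness: collect several timings and
# report mean and standard deviation instead of one noisy sample.
import statistics
import time

def benchmark(fn, repeats=5):
    samples = []
    for _ in range(repeats):
        t0 = time.perf_counter()
        fn()
        samples.append(time.perf_counter() - t0)
    return statistics.mean(samples), statistics.stdev(samples)

mean_s, stdev_s = benchmark(lambda: sum(i * i for i in range(100_000)))
print(f"mean {mean_s:.6f}s  stdev {stdev_s:.6f}s")
```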

  3. Rankine engine solar power generation. I - Performance and economic analysis

    NASA Technical Reports Server (NTRS)

    Gossler, A. A.; Orrock, J. E.

    1981-01-01

    Results of a computer simulation of the performance of a solar flat-plate-collector-powered electrical generation system are presented. The simulation covered locations in New Mexico, North Dakota, Tennessee, and Massachusetts, and considered a water-based heat-transfer-fluid collector system with storage. The collectors also powered a Rankine-cycle boiler filled with a low-temperature working fluid. The generator was assumed to run only when excess solar heat and full storage would otherwise require purging heat through the collectors. All power was directed into the utility grid. The economic benefit of adding the solar powered generator unit was found to depend on site location and collector area, and the addition reduced the effective solar cost for collector areas greater than 400-670 sq m. The sites ranked economically, from best to worst: New Mexico, North Dakota, Massachusetts, and Tennessee.

  4. Performance analysis of OFDM modulation on indoor broadband PLC channels

    NASA Astrophysics Data System (ADS)

    Antonio Cortés, José; Díez, Luis; Cañete, Francisco Javier; Sánchez-Martínez, Juan José; Entrambasaguas, José Tomás

    2011-12-01

    Indoor broadband power-line communication is a suitable technology for home networking applications. In this context, orthogonal frequency-division multiplexing (OFDM) is the most widespread modulation technique. It has recently been adopted by ITU-T Recommendation G.9960 and is also used by most commercial systems, whose number of carriers has gone from about 100 to a few thousand in less than a decade. However, indoor power-line channels are frequency-selective and exhibit periodic time variations. Hence, increasing the number of carriers does not always improve performance: it reduces the distortion due to frequency selectivity but increases the distortion caused by the channel time variation. In addition, the long impulse response of power-line channels forces the use of an insufficient cyclic prefix; increasing its length reduces the distortion but also the symbol rate. Therefore, there are optimum values for both modulation parameters. This article evaluates the performance of an OFDM system as a function of the number of carriers and the cyclic prefix length, determining their most appropriate values for the indoor power-line scenario. Ordinarily this task would require time-consuming simulations employing linear time-varying filtering, since no consensus on a tractable statistical channel model has been reached yet. Instead, this study presents a simpler procedure in which the distortion due to frequency selectivity is computed using a time-invariant channel response, and an analytical expression is derived for the distortion caused by the channel time variation.
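
The cyclic-prefix half of the trade-off is easy to put in numbers: a fixed CP is amortized better by longer symbols, at the cost of longer symbols suffering more from the channel's periodic time variation. The lengths below are illustrative, not values from G.9960 or the article:

```python
# Cyclic-prefix overhead: fraction of each OFDM symbol that carries data.
# Carrier counts and CP length are illustrative example values.

def cp_efficiency(n_carriers, cp_len):
    return n_carriers / (n_carriers + cp_len)

cp = 256  # fixed cyclic prefix, in samples
for n in (128, 1024, 4096):
    print(n, round(cp_efficiency(n, cp), 3))
# Larger symbols amortize the same CP better -- but each longer symbol
# also spans more of the channel's time variation.
```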

  5. Systems study on engineered barriers: barrier performance analysis

    SciTech Connect

    Stula, R.T.; Albert, T.E.; Kirstein, B.E.; Lester, D.H.

    1980-09-01

    A performance assessment model for multiple-barrier packages containing unreprocessed spent fuel has been modified and applied to several package designs. The objective of the study was to develop information to be used in programmatic decision making concerning engineered barrier package design and development. The assessment model, BARIER, was developed in previous tasks of the System Study on Engineered Barriers (SSEB). The new version discussed in this report contains a refined and expanded corrosion rate data base, which includes pitting, crack growth, and graphitization as well as bulk corrosion. Corrosion rates for oxic and anoxic conditions at each of the two temperature ranges are supplied. Other improvements include a rigorous treatment of radionuclide release after package failure, which accounts for the resistance of damaged barriers and backfill; refined temperature calculations that account for convection and radiation; a subroutine to calculate the nuclear gamma radiation field at each barrier surface; refined stress calculations with reduced conservatism; and various coding improvements to reduce running time and core usage. This report also contains a discussion of alternative scenarios to the assumed flooded repository, as well as of the impact of water-exclusion backfills. The model was used to assess post-repository-closure performance for several designs, all variations of basic designs from the Spent Unreprocessed Fuel (SURF) program. Many designs were found to delay the onset of leaching by at least a few hundred years in all geologic media. Long delay times for radionuclide release were found for packages with a few inches of sorption backfill. Release of uranium, plutonium, and americium was assessed.

  6. High Performance Parallel Analysis of Coupled Problems for Aircraft Propulsion

    NASA Technical Reports Server (NTRS)

    Felippa, C. A.; Farhat, C.; Lanteri, S.; Maman, N.; Piperno, S.; Gumaste, U.

    1994-01-01

    In order to predict the dynamic response of a flexible structure in a fluid flow, the equations of motion of the structure and the fluid must be solved simultaneously. In this paper, we present several partitioned procedures for time-integrating this coupled problem and discuss their merits in terms of accuracy, stability, heterogeneous computing, I/O transfers, subcycling, and parallel processing. All theoretical results are derived for a one-dimensional piston model problem with a compressible flow, because the complete three-dimensional aeroelastic problem is difficult to analyze mathematically. However, the insight gained from the analysis of the coupled piston problem and the conclusions drawn from its numerical investigation are confirmed with the numerical simulation of the two-dimensional transient aeroelastic response of a flexible panel in a transonic nonlinear Euler flow regime.

  7. Algorithms and architectures for high performance analysis of semantic graphs.

    SciTech Connect

    Hendrickson, Bruce Alan

    2005-09-01

    Semantic graphs offer one promising avenue for intelligence analysis in homeland security. They provide a mechanism for describing a wide variety of relationships between entities of potential interest. The vertices are nouns of various types, e.g. people, organizations, events, etc. Edges in the graph represent different types of relationships between entities, e.g. 'is friends with', 'belongs to', etc. Semantic graphs offer a number of potential advantages as a knowledge representation system. They allow information of different kinds, and collected in differing ways, to be combined in a seamless manner. A semantic graph is a very compressed representation of relationship information. It has been reported that a semantic graph can be two orders of magnitude smaller than the processed intelligence data. This allows much larger portions of the data universe to be resident in computer memory. Many intelligence queries that are relevant to the terrorist threat are naturally expressed in the language of semantic graphs. One example is the search for 'interesting' relationships between two individuals, or between an individual and an event, which can be phrased as a search for short paths in the graph. Another example is the search for an analyst-specified threat pattern, which can be cast as an instance of subgraph isomorphism. It is important to note that many kinds of analysis are not relationship based, so these are not good candidates for semantic graphs. Thus, a semantic graph should always be used in conjunction with traditional knowledge representation and interface methods. Operations that involve looking for chains of relationships (e.g. friend of a friend) are not efficiently executable in a traditional relational database. However, the semantic graph can be thought of as a pre-join of the database, and it is ideally suited for these kinds of operations. Researchers at Sandia National Laboratories are working to facilitate semantic graph
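
The short-path query mentioned above is just breadth-first search over the graph. A minimal sketch on a toy semantic graph; the entities and edges are invented for illustration only:

```python
# "Interesting relationship" query as a shortest-path search: breadth-first
# search over an adjacency-list semantic graph. All entities are fictional.
from collections import deque

graph = {
    "alice": ["event_x", "org_a"],
    "bob": ["org_a", "event_y"],
    "event_x": ["alice", "bob"],
    "org_a": ["alice", "bob"],
    "event_y": ["bob"],
}

def shortest_path(graph, start, goal):
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(shortest_path(graph, "alice", "event_y"))
# ['alice', 'event_x', 'bob', 'event_y']
```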

  8. Increasing the performance of tritium analysis by electrolytic enrichment.

    PubMed

    Groning, M; Auer, R; Brummer, D; Jaklitsch, M; Sambandam, C; Tanweer, A; Tatzber, H

    2009-06-01

    Several improvements are described for the existing tritium enrichment system at the Isotope Hydrology Laboratory of the International Atomic Energy Agency for processing natural water samples. The improvements include a simple method for the pretreatment of electrolytic cells to ensure a high tritium separation factor, an improved design of the exhaust system for explosive gases, and a vacuum distillation line for faster initial preparation of water samples for electrolytic enrichment and tritium analysis. Achievements included reducing the variation of individual enrichment parameters across all cells to less than 1% and improving the stability of the background mean by 50%. This resulted in an improved detection limit of less than 0.4 TU (at 2σ), important for future applications of tritium measurement at low concentration levels, and in measurement precisions of ±0.2 TU and ±0.15 TU for liquid scintillation counting and gas proportional counting, respectively. PMID:20183225

  9. Project on restaurant energy performance: end-use monitoring and analysis. Appendixes I and II

    SciTech Connect

    Claar, C.N.; Mazzucchi, R.P.; Heidell, J.A.

    1985-05-01

    This is the second volume of the report ''The Project on Restaurant Energy Performance - End-Use Monitoring and Analysis''. The first volume (PNL-5462) contains a summary and analysis of the metered energy performance data collected by the Project on Restaurant Energy Performance (PREP). Appendix I, presented here, contains monitoring site descriptions, measurement plans, and data summaries for the seven restaurants metered for PREP. Appendix II, also in this volume, is a description of the PREP computer system.

  10. Performance analysis of digital silicon photomultipliers for PET

    NASA Astrophysics Data System (ADS)

    Somlai-Schweiger, I.; Schneider, F. R.; Ziegler, S. I.

    2015-05-01

    A silicon photomultiplier (SiPM) with electronics integrated on cell level has been developed by Philips Digital Photon Counting. The device delivers a digital signal of the detected photon counts and their time stamp, making it a potential candidate for positron emission tomography (PET) applications. Several operational parameters of the specifically developed acquisition protocol can be adjusted to optimize the photon detection. In this work, the combination of five different parameters (trigger scheme, validation scheme, cell inhibition, temperature and excess bias voltage) is analyzed. Their impact on both the intrinsic as well as the PET-oriented sensor's performance is studied when coupled to two different PET candidate scintillators, GAGG and LYSO (2 × 2 × 6 mm3). The results show that SiPM intrinsic properties such as breakdown voltage temperature coefficient (20 mV/K) and optical crosstalk (20%) are similar to state-of-the-art analog devices. The main differences are induced by the logic of the acquisition sequence and its parameters. The sensor's dark-count-rate (DCR) is 770 kHz/mm2 at 24°C and 100% active cells. It can be reduced through cell inhibition and lower temperatures (ca. 2 orders of magnitude at 0°C and 20% cell inhibition). DCR reduction is necessary to avoid acquiring dark-count-triggered and validated events, causing loss of detection sensitivity. The typical time fraction spent with these events is 42.5% (GAGG) and 35.5% (LYSO). Increasing percentages of cell inhibition affect the photodetection efficiency and with it the energy resolution and the coincidence time resolution (CTR). At 5.6 °C and 10% cell inhibition, the measured energy resolution is 11.9% and 13.5% (FWHM, saturation corrected) and a FWHM CTR of 458 ps and 177 ps is achieved, for GAGG and LYSO respectively. With the implemented setup, the optimum configuration for PET, in terms of sensitivity, energy resolution and CTR, is trigger scheme 1, validation scheme 8, 10
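
The sensitivity penalty from dark counts can be illustrated with the Poisson model p = 1 - exp(-rate x window). The DCR figure comes from the abstract; the pixel area and window length are assumed values, not the sensor's specifications:

```python
import math

# Probability that at least one dark count lands in an acquisition window,
# p = 1 - exp(-rate * window). DCR per mm^2 is taken from the abstract;
# the area and window duration are assumed example values.

def dark_count_probability(dcr_hz, window_s):
    return 1.0 - math.exp(-dcr_hz * window_s)

dcr_per_mm2 = 770e3      # Hz/mm^2 at 24 C, 100% active cells (from abstract)
area_mm2 = 4.0           # assumed 2 x 2 mm pixel
window_s = 40e-9         # assumed 40 ns window

p = dark_count_probability(dcr_per_mm2 * area_mm2, window_s)
print(round(p, 3))  # 0.116 -- why cooling and cell inhibition help
```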

  11. Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0

    DOE PAGESBeta

    Huck, Kevin A.; Malony, Allen D.; Shende, Sameer; Morris, Alan

    2008-01-01

    The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments present a challenge to managing and processing the information. Simply characterizing the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering, and correlation analysis of individual trials of large dimensions, and can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We give examples of large-scale analysis results and discuss the future development of the framework, including the encoding and processing of expert performance rules and the increasing use of performance metadata.

  12. Analysis of bio-anode performance through electrochemical impedance spectroscopy.

    PubMed

    ter Heijne, Annemiek; Schaetzle, Olivier; Gimenez, Sixto; Navarro, Lucia; Hamelers, Bert; Fabregat-Santiago, Francisco

    2015-12-01

    In this paper we studied the performance of bioanodes under different experimental conditions using polarization curves and impedance spectroscopy. We identified that the large capacitances of up to 1 mF·cm(-2) for graphite anodes have their origin in the nature of the carbonaceous electrode rather than the microbial culture. In some cases, the separate contributions of charge transfer and diffusion resistance were clearly visible, while in other cases their contribution was masked by the high capacitance of 1 mF·cm(-2). The impedance data were analyzed using the basic Randles model to extract ohmic, charge transfer, and diffusion resistances. Increasing the buffer concentration from 0 to 50 mM and increasing the pH from 6 to 8 resulted in decreased charge transfer and diffusion resistances, the lowest values being 144 Ω·cm(2) and 34 Ω·cm(2), respectively. At acetate concentrations below 1 mM, current generation was limited by acetate. We show a linear relationship between the inverse charge transfer resistance at potentials close to open circuit and the saturation (maximum) current, associated with the Butler-Volmer relationship, that needs further exploration. PMID:25869113
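
The Randles model used in the analysis is a series resistance plus a double-layer capacitance shunting the charge-transfer resistance and a Warburg diffusion element. A sketch with hypothetical element values (only the 144 Ω·cm² figure echoes the abstract; the rest are assumptions):

```python
# Randles-circuit impedance: Z(w) = Rs + 1 / (jwC + 1/(Rct + Zw)),
# with semi-infinite Warburg diffusion Zw = sigma * w^-0.5 * (1 - j).
# Element values are hypothetical, for illustration only.
import cmath
import math

def randles_impedance(freq_hz, rs, rct, cdl, sigma_w):
    w = 2 * math.pi * freq_hz
    z_warburg = sigma_w / math.sqrt(w) * (1 - 1j)
    z_branch = rct + z_warburg
    return rs + 1 / (1j * w * cdl + 1 / z_branch)

# At high frequency the capacitance shorts the branch, leaving only Rs:
z_hi = randles_impedance(1e6, rs=10.0, rct=144.0, cdl=1e-3, sigma_w=5.0)
z_lo = randles_impedance(0.01, rs=10.0, rct=144.0, cdl=1e-3, sigma_w=5.0)
print(round(abs(z_hi), 1))  # 10.0 (ohm): Rs only
print(abs(z_lo) > 100)      # True: Rct + diffusion dominate at low frequency
```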

  13. Compact time- and space-integrating SAR processor: performance analysis

    NASA Astrophysics Data System (ADS)

    Haney, Michael W.; Levy, James J.; Michael, Robert R., Jr.; Christensen, Marc P.

    1995-06-01

Progress made during the previous 12 months toward the fabrication and test of a flight demonstration prototype of the acousto-optic time- and space-integrating real-time SAR image formation processor is reported. Compact, rugged, and low-power analog optical signal processing techniques are used for the most computationally taxing portions of the SAR imaging problem to overcome the size and power consumption limitations of electronic approaches. Flexibility and performance are maintained by the use of digital electronics for the critical low-complexity filter generation and output image processing functions. The results reported for this year include tests of a laboratory version of the RAPID SAR concept on phase history data generated from real SAR high-resolution imagery; a description of the new compact 2D acousto-optic scanner, with a 2D space-bandwidth product approaching 10^6 spots, specified and procured from NEOS Technologies during the last year; and a design and layout of the optical module portion of the flight-worthy prototype.

  14. Performance analysis of a multilevel coded modulation system

    NASA Astrophysics Data System (ADS)

    Kofman, Yosef; Zehavi, Ephraim; Shamai, Shlomo

    1994-02-01

A modified version of the multilevel coded modulation scheme of Imai and Hirakawa is presented and analyzed. In the transmitter, the outputs of the component codes are bit interleaved prior to mapping into 8-PSK channel signals. A multistage receiver is considered, in which the output amplitudes of the Gaussian channel are soft limited before entering the second and third stage decoders. Upper bounds and Gaussian approximations for the bit error probability of every component code, which take into account errors in previously decoded stages, are presented. Aided by a comprehensive computer simulation, it is demonstrated in a specific example that the addition of the interleaver and soft limiter in the third stage improves its performance by 1.1 dB at a bit error probability of 10(exp -5), and that the multilevel scheme improves on an Ungerboeck code with the same decoding complexity. The rate selection of the component codes is also considered and a simple selection rule, based on information theoretic arguments, is provided.
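
    The Gaussian approximations referred to above rest on the standard Q-function. As an illustration only (not the paper's actual per-level 8-PSK bounds), the sketch below evaluates the textbook Gaussian-channel BPSK bit error probability Pb = Q(sqrt(2 Eb/N0)); multilevel bounds of the kind described build on the same Q-function with per-level signal distances.

```python
import math

def q_func(x):
    """Gaussian tail probability Q(x) = P(N(0,1) > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def bpsk_bit_error_prob(ebno_db):
    """Textbook bit error probability for BPSK on the Gaussian channel,
    Pb = Q(sqrt(2 Eb/N0)); shown only as the building block that
    multilevel per-stage bounds reuse with per-level distances."""
    ebno = 10 ** (ebno_db / 10)
    return q_func(math.sqrt(2 * ebno))

for db in (0.0, 5.0, 9.6):
    print(f"Eb/N0 = {db:4.1f} dB -> Pb ~ {bpsk_bit_error_prob(db):.2e}")
```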

  15. Performance analysis of bullet trajectory estimation: Approach, simulation, and experiments

    SciTech Connect

    Ng, L.C.; Karr, T.J.

    1994-11-08

This paper describes an approach to estimate a bullet's trajectory from a time sequence of angles-only observations from a high-speed camera, and analyzes its performance. The technique is based on fitting a ballistic model of a bullet in flight, along with unknown source location parameters, to a time series of angular observations. The theory is developed to precisely reconstruct, from firing range geometry, the actual bullet trajectory as it appeared on the focal plane array and in real space. A metric for measuring the effective trajectory track error is also presented. Detailed Monte-Carlo simulations assuming different bullet ranges, shot angles, camera frame rates, and angular noise show that angular track error can be as small as 100 µrad for a 2 mrad/pixel sensor. It is also shown that if actual values of bullet ballistic parameters were available, the bullet's source location and angle-of-flight information could also be determined.
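
    To illustrate the idea of fitting a trajectory model to angles-only data, the toy sketch below assumes a known, constant range and neglects drag, so the focal-plane elevation angle is quadratic in time and ordinary least squares suffices. The paper's full ballistic model and firing-range geometry are considerably more involved, and all numbers here are hypothetical.

```python
import random

def fit_quadratic(ts, ys):
    """Ordinary least-squares fit of y = a + b*t + c*t^2 via normal equations."""
    # Power sums for the basis [1, t, t^2]: (X^T X)[j][k] = sum t^(j+k).
    s = [sum(t**k for t in ts) for k in range(5)]
    xtx = [[s[0], s[1], s[2]], [s[1], s[2], s[3]], [s[2], s[3], s[4]]]
    xty = [sum(y * t**k for t, y in zip(ts, ys)) for k in range(3)]
    # Gaussian elimination with partial pivoting on the augmented matrix.
    m = [row[:] + [rhs] for row, rhs in zip(xtx, xty)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(m[r][i]))
        m[i], m[p] = m[p], m[i]
        for r in range(i + 1, 3):
            f = m[r][i] / m[i][i]
            for col in range(i, 4):
                m[r][col] -= f * m[i][col]
    coeffs = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        coeffs[i] = (m[i][3] - sum(m[i][j] * coeffs[j] for j in range(i + 1, 3))) / m[i][i]
    return coeffs

# Toy data: bullet crossing the field of view at a known range z;
# elevation angle ~ (y0 + vy*t - 0.5*g*t^2) / z, plus angular noise.
z, y0, vy, g = 100.0, 0.0, 5.0, 9.81
random.seed(1)
ts = [i / 1000.0 for i in range(100)]            # 1 kHz frame rate
angles = [(y0 + vy * t - 0.5 * g * t * t) / z + random.gauss(0, 100e-6)
          for t in ts]                            # 100 urad angular noise
a, b, c = fit_quadratic(ts, angles)
print(f"estimated vy = {b * z:.2f} m/s (true {vy})")
```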

  16. Application of high performance capillary electrophoresis on toxic alkaloids analysis.

    PubMed

    Zhang, Li; Wang, Rong; Zhang, Yurong; Yu, Yunqiu

    2007-06-01

We employed CE to identify mixtures of the toxic alkaloids lappaconitine, bullatine A, atropine sulfate, atropine methobromide, scopolamine hydrobromide, anisodamine hydrobromide, brucine, strychnine, quinine sulfate, and chloroquine in human blood and urine, using procaine hydrochloride as an internal standard. The separation employed a fused-silica capillary of 75 microm id x 60 cm length (effective length: 50.2 cm) and a buffer containing 100 mM phosphate and 5% ACN (pH 4.0). The sample was injected in pressure mode and the separation was performed at a voltage of 16 kV and a temperature of 25 degrees C. The compounds were detected by UV absorbance at wavelengths of 195 and 235 nm. All ten alkaloids were separated within 16 min. The method was validated with regard to precision (RSD), accuracy, sensitivity, linear range, LOD, and LOQ. In blood and urine samples, the detection limits were 5-40 ng/mL and linear calibration curves were obtained over the range of 0.02-10 microg/mL. The precision of intra- and interday measurements was less than 15%. Electrophoretic peaks could be identified either by the relative migration time or by their UV spectrum. PMID:17623479

  17. Engineering sciences area and module performance and failure analysis area

    NASA Technical Reports Server (NTRS)

    Ross, R. G., Jr.; Runkle, L. D.

    1982-01-01

Photovoltaic-array/power-conditioner interface studies are updated. An experiment conducted to evaluate different operating-point strategies, such as constant voltage and pilot cells, and to determine array energy losses when the array is operated off the maximum power points is described. Initial results over a test period of three and a half weeks showed a 2% energy loss when the array is operated at a fixed voltage. Degraded-array studies conducted at NE RES that used a range of simulated common types of degraded I-V curves are reviewed. The instrumentation installed at the JPL field-test site to obtain the irradiance data is described. Experiments using an optical filter to adjust the spectral irradiance of the large-area pulsed solar simulator (LAPSS) to AM1.5 are described. Residential-array research activity is reviewed. Voltage isolation test results are described. Experiments performed on one type of module to determine the relationship between leakage current and temperature are reviewed. An encapsulated-cell testing approach is explained. The test program, data reduction methods, and initial results of long-duration module testing are described.

  18. Structure performance analysis for TMT tertiary mirror system

    NASA Astrophysics Data System (ADS)

    Su, Yan-qin; Zhang, Jing-xu; Yang, Fei; Wang, Huai; Chen, Bao-gang

    2014-09-01

TMT Tertiary Mirror System (M3S) is required to be able to track and point, rotating with the observed object. In this article, the schemes of the Tertiary Mirror supporting assembly and position assembly are introduced. Then the static and dynamic performance of the Tertiary Mirror System is analyzed. It is shown that the maximum deflection is 1.024 mm and the maximum stress is 138.91 MPa under gravity loading. The security of M3S can be assured when the telescope is working at all required positions. The first natural frequency is 15.39 Hz, satisfying the 15 Hz requirement. In addition, the response to earthquakes has been estimated preliminarily. Results show that when an earthquake with a mean return time of 200 years occurs in three directions simultaneously, or an earthquake with a mean return time of 1000 years occurs, the lateral support will be destroyed; protection measures should be considered. The conclusions in this article are useful guides for the M3S design.

  19. Performing integrative functional genomics analysis in GeneWeaver.org.

    PubMed

    Jay, Jeremy J; Chesler, Elissa J

    2014-01-01

Functional genomics experiments and analyses give rise to large sets of results, each typically quantifying the relation of molecular entities including genes, gene products, polymorphisms, and other genomic features with biological characteristics or processes. There is tremendous utility and value in using these data in an integrative fashion to find convergent evidence for the role of genes in various processes, to identify functionally similar molecular entities, or to compare processes based on their genomic correlates. However, these gene-centered data are often deposited in diverse and non-interoperable stores. Therefore, integration requires biologists to implement computational algorithms and to harmonize gene identifiers both within and across species. The GeneWeaver web-based software system brings together a large archive of diverse functional genomics data with a suite of combinatorial tools in an interactive environment. Account management features allow data and results to be shared among user-defined groups. Users can retrieve curated gene set data, upload, store, and share their own experimental results and perform integrative analyses including novel algorithmic approaches for set-set integration of genes and functions. PMID:24233775
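
    As a minimal illustration of the kind of set-set comparison such tools perform (GeneWeaver's actual combinatorial algorithms are richer), the sketch below scores the overlap of two hypothetical gene sets with the Jaccard index:

```python
def jaccard(a, b):
    """Jaccard similarity between two gene sets: |A & B| / |A | B|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical gene sets; identifiers are illustrative only.
set1 = {"Drd2", "Comt", "Bdnf", "Disc1"}
set2 = {"Drd2", "Bdnf", "Grin2b"}
print(jaccard(set1, set2))  # 2 shared genes out of 5 total -> 0.4
```

    In practice the harmonization step matters as much as the similarity score: the same gene must map to one identifier within and across species before any set operation is meaningful.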

  20. The Performance Analysis Based on SAR Sample Covariance Matrix

    PubMed Central

    Erten, Esra

    2012-01-01

Multi-channel systems appear in several fields of application in science. In the Synthetic Aperture Radar (SAR) context, multi-channel systems may refer to different domains, such as multi-polarization, multi-interferometric or multi-temporal data, or even a combination of them. Due to the inherent speckle phenomenon present in SAR images, the statistical description of the data is almost mandatory for its utilization. The complex images acquired over natural media present in general zero-mean circular Gaussian characteristics. In this case, second order statistics such as the multi-channel covariance matrix fully describe the data. In practical situations, however, the covariance matrix has to be estimated using a limited number of samples, and this sample covariance matrix follows the complex Wishart distribution. In this context, the eigendecomposition of the multi-channel covariance matrix has been shown to be of high relevance in different areas regarding the physical properties of the imaged scene. Specifically, the maximum eigenvalue of the covariance matrix has been frequently used in applications such as target or change detection, estimation of the dominant scattering mechanism in polarimetric data, moving target indication, etc. In this paper, the statistical behavior of the maximum eigenvalue derived from the eigendecomposition of the sample multi-channel covariance matrix of multi-channel SAR images is simplified for the SAR community. Validation is performed against simulated data, and examples of estimation and detection problems using the analytical expressions are given as well. PMID:22736976
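
    A minimal numerical sketch of the quantity under study (with assumed parameter values, not the paper's): draw zero-mean circular complex Gaussian samples whose covariance is dominated by one scattering mechanism, form the n-look sample covariance (whose scaled version is complex Wishart distributed), and compare its maximum eigenvalue with the true one.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n_looks = 3, 64          # channels, independent samples (looks)

# Assumed true covariance: one dominant rank-one "scattering mechanism"
# plus a unit noise floor.
a = np.array([1.0, 0.8, 0.5], dtype=complex)[:, None]
true_cov = 4.0 * (a @ a.conj().T) + np.eye(p)

# Zero-mean circular complex Gaussian samples with that covariance.
chol = np.linalg.cholesky(true_cov)
white = (rng.standard_normal((p, n_looks))
         + 1j * rng.standard_normal((p, n_looks))) / np.sqrt(2)
z = chol @ white

# n-look sample covariance; n_looks * c_hat is complex Wishart.
c_hat = z @ z.conj().T / n_looks

lam_max_true = np.linalg.eigvalsh(true_cov).max()
lam_max_hat = np.linalg.eigvalsh(c_hat).max()
print(f"max eigenvalue: true {lam_max_true:.2f}, sample {lam_max_hat:.2f}")
```

    The fluctuation of the sample maximum eigenvalue around the true one, as a function of the number of looks, is exactly the behavior the paper characterizes analytically.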

  1. Building America Performance Analysis Procedures for Existing Homes

    SciTech Connect

    Hendron, R.

    2006-05-01

    Because there are more than 101 million residential households in the United States today, it is not surprising that existing residential buildings represent an extremely large source of potential energy savings. Because thousands of these homes are renovated each year, Building America is investigating the best ways to make existing homes more energy-efficient, based on lessons learned from research in new homes. The Building America program is aiming for a 20%-30% reduction in energy use in existing homes by 2020. The strategy for the existing homes project of Building America is to establish technology pathways that reduce energy consumption cost-effectively in American homes. The existing buildings project focuses on finding ways to adapt the results from the new homes research to retrofit applications in existing homes. Research activities include a combination of computer modeling, field demonstrations, and long-term monitoring to support the development of integrated approaches to reduce energy use in existing residential buildings. Analytical tools are being developed to guide designers and builders in selecting the best approaches for each application. Also, DOE partners with the U.S. Environmental Protection Agency (EPA) to increase energy efficiency in existing homes through the Home Performance with ENERGY STAR program.

  2. Performance Analysis of TCP Enhancements in Satellite Data Networks

    NASA Technical Reports Server (NTRS)

    Broyles, Ren H.

    1999-01-01

    This research examines two proposed enhancements to the well-known Transport Control Protocol (TCP) in the presence of noisy communication links. The Multiple Pipes protocol is an application-level adaptation of the standard TCP protocol, where several TCP links cooperate to transfer data. The Space Communication Protocol Standard - Transport Protocol (SCPS-TP) modifies TCP to optimize performance in a satellite environment. While SCPS-TP has inherent advantages that allow it to deliver data more rapidly than Multiple Pipes, the protocol, when optimized for operation in a high-error environment, is not compatible with legacy TCP systems, and requires changes to the TCP specification. This investigation determines the level of improvement offered by SCPS-TP's Corruption Mode, which will help determine if migration to the protocol is appropriate in different environments. As the percentage of corrupted packets approaches 5 %, Multiple Pipes can take over five times longer than SCPS-TP to deliver data. At high error rates, SCPS-TP's advantage is primarily caused by Multiple Pipes' use of congestion control algorithms. The lack of congestion control, however, limits the systems in which SCPS-TP can be effectively used.

  3. Analysis of radiation performances of plasma sheet antenna

    NASA Astrophysics Data System (ADS)

    Yin, Bo; Zhang, Zu-Fan; Wang, Ping

    2015-12-01

A novel concept of plasma sheet antennas is presented in this paper, and the radiation performances of plasma sheet antennas are investigated in detail. Firstly, a model of planar plasma antenna (PPA) fed by a microstrip line is developed, and its reflection coefficient is computed by the JE convolution finite-difference time-domain method and compared with that of the metallic patch antenna. It is found that the design of the PPA can learn from the theory of the metallic patch antenna, and that impedance matching and reconstruction of the resonant frequency can be expediently realized by adjusting the parameters of the plasma. Then the PPA is mounted on a metallic cylindrical surface, and the reflection coefficient of the conformal plasma antenna (CPA) is also computed. At the same time, the influence of the conformal cylinder radius on the reflection coefficient is analyzed. Finally, the radiation pattern of a CPA is given; the results show that the pattern agrees well with that of the PPA in the main radiation direction, but its side lobe level deteriorates significantly.

  4. Performance Analysis of an Improved MUSIC DoA Estimator

    NASA Astrophysics Data System (ADS)

    Vallet, Pascal; Mestre, Xavier; Loubaton, Philippe

    2015-12-01

This paper addresses the statistical performance of subspace DoA estimation using a sensor array, in the asymptotic regime where the number of samples and sensors both converge to infinity at the same rate. Improved subspace DoA estimators (termed G-MUSIC) were derived in previous works, and were shown to be consistent and asymptotically Gaussian distributed in the case where the number of sources and their DoA remain fixed. In this case, which models widely spaced DoA scenarios, it is proved in the present paper that the traditional MUSIC method also provides consistent DoA estimates having the same asymptotic variances as the G-MUSIC estimates. The case of DoA spaced on the order of a beamwidth, which models closely spaced sources, is also considered. It is shown that G-MUSIC estimates are still able to consistently separate the sources, while this is no longer the case for the MUSIC ones. The asymptotic variances of G-MUSIC estimates are also evaluated.
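
    For readers unfamiliar with the baseline estimator being compared against G-MUSIC, the following is a minimal sketch of classical MUSIC for a half-wavelength uniform linear array with two widely spaced sources (all scenario parameters are hypothetical). In this widely spaced regime, the abstract notes, MUSIC itself is consistent.

```python
import numpy as np

def music_spectrum(x, n_sources, scan_deg):
    """Classical MUSIC pseudospectrum for a half-wavelength ULA.
    x: (n_sensors, n_snapshots) complex data matrix."""
    m = x.shape[0]
    r = x @ x.conj().T / x.shape[1]                 # sample covariance
    eigval, eigvec = np.linalg.eigh(r)              # ascending eigenvalues
    noise_sub = eigvec[:, : m - n_sources]          # noise subspace
    spec = []
    for theta in scan_deg:
        a = np.exp(1j * np.pi * np.arange(m) * np.sin(np.radians(theta)))
        proj = noise_sub.conj().T @ a
        spec.append(1.0 / np.real(proj.conj() @ proj))
    return np.array(spec)

# Simulate two widely spaced sources at -20 and 30 degrees.
rng = np.random.default_rng(2)
m, n_snap = 10, 500
doas = np.array([-20.0, 30.0])
steer = np.exp(1j * np.pi * np.arange(m)[:, None] * np.sin(np.radians(doas)))
s = (rng.standard_normal((2, n_snap)) + 1j * rng.standard_normal((2, n_snap))) / np.sqrt(2)
noise = 0.1 * (rng.standard_normal((m, n_snap)) + 1j * rng.standard_normal((m, n_snap)))
x = steer @ s + noise

scan = np.arange(-90.0, 90.0, 0.1)
spec = music_spectrum(x, n_sources=2, scan_deg=scan)
# Pick the two largest local maxima of the pseudospectrum.
interior = spec[1:-1]
is_peak = (interior > spec[:-2]) & (interior > spec[2:])
cand_deg, cand_val = scan[1:-1][is_peak], interior[is_peak]
top2 = np.sort(cand_deg[np.argsort(cand_val)[-2:]])
print("estimated DoAs (deg):", np.round(top2, 1))
```

    G-MUSIC replaces the plain noise-subspace projection with weights correcting for the sample-covariance bias that appears when sensors and snapshots grow at the same rate; the sketch above deliberately shows only the uncorrected baseline.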

  5. Performance analysis of structured pedigree distributed fusion systems

    NASA Astrophysics Data System (ADS)

    Arambel, Pablo O.

    2009-05-01

Structured pedigree is a way to compress pedigree information. When applied to distributed fusion systems, the approach avoids the well-known problem of information double counting that results from ignoring the cross-correlation among fused estimates. Other schemes that attempt to compute optimal fused estimates require the transmission of full pedigree information or raw data. This usually cannot be implemented in practical systems because of the enormous communications bandwidth requirements. The Structured Pedigree approach achieves data compression by maintaining multiple covariance matrices, one for each uncorrelated source in the network. These covariance matrices are transmitted by each node along with the state estimate. This represents a significant compression when compared to full pedigree schemes. The transmission of these covariance matrices (or a subset of them) allows for an efficient fusion of the estimates, while avoiding information double counting and guaranteeing consistency of the estimates. This is achieved by exploiting the additional partial knowledge of the correlation of the estimates. The approach uses a generalized version of the Split Covariance Intersection algorithm that applies to multiple estimates and multiple uncorrelated sources. In this paper we study the performance of the proposed distributed fusion system by analyzing a simple but instructive example.
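
    The paper's Split Covariance Intersection carries a separate covariance term per uncorrelated source; the sketch below shows only the standard (non-split) Covariance Intersection rule that it generalizes, with a simple grid search over the mixing weight. The example matrices are hypothetical.

```python
import numpy as np

def covariance_intersection(x1, p1, x2, p2, n_grid=101):
    """Fuse two estimates with unknown cross-correlation:
    P^-1 = w*P1^-1 + (1-w)*P2^-1, choosing w to minimize trace(P).
    This is plain CI; Split CI additionally keeps separate covariance
    terms for correlated and uncorrelated error components."""
    i1, i2 = np.linalg.inv(p1), np.linalg.inv(p2)
    best = None
    for w in np.linspace(0.0, 1.0, n_grid):
        info = w * i1 + (1 - w) * i2
        if np.linalg.det(info) <= 0:
            continue
        p = np.linalg.inv(info)
        if best is None or np.trace(p) < best[0]:
            best = (np.trace(p), w, p)
    _, w, p = best
    x = p @ (w * i1 @ x1 + (1 - w) * i2 @ x2)
    return x, p, w

# Two consistent estimates of the same 2-D state, accurate in
# complementary directions (hypothetical values).
x1, p1 = np.array([1.0, 0.0]), np.diag([1.0, 4.0])
x2, p2 = np.array([0.0, 1.0]), np.diag([4.0, 1.0])
x, p, w = covariance_intersection(x1, p1, x2, p2)
print("fused state:", np.round(x, 3), "weight:", round(w, 2))
```

    CI's guarantee is consistency, not optimality: the fused covariance never understates the error whatever the unknown cross-correlation, which is exactly the double-counting protection the abstract refers to.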

  6. Lunar lander configuration study and parametric performance analysis

    NASA Astrophysics Data System (ADS)

    Donahue, Benjamin B.; Fowler, C. R.

    1993-06-01

Future Lunar exploration plans will call for delivery of significant amounts of cargo to provide for crew habitation, surface transportation, and scientific exploration activities. Minimization of costly surface-based infrastructure is in large part directly related to the design of the cargo delivery/landing craft. This study focused on evaluating Lunar lander concepts from a logistics-oriented perspective; it outlines the approach used in the development of a preferred configuration, sets forth the benefits derived from its utilization, and describes the missions and systems considered. Results indicate that only direct-to-surface downloading of payloads provides for unassisted cargo removal operations imperative to efficient and low-risk site buildup, including the emplacement of Space Station derivative surface habitat modules; immediate cargo jettison for both descent abort and emergency surface ascent essential to piloted missions carrying cargo; and short habitat egress/ingress paths necessary to productive surface work tours for crew members carrying hand-held experiments, tools and other bulky articles. By accommodating cargo in a position underneath the vehicle's structural frame, the landing craft described herein eliminate altogether the necessity for dedicated surface-based off-loading vehicles, the operations and maintenance associated with their operation, and the precipitous ladder climbs to and from the surface that are inherent to traditional designs. Parametric evaluations illustrate performance and mass variation with respect to mission requirements.

  7. Lunar lander configuration study and parametric performance analysis

    NASA Technical Reports Server (NTRS)

    Donahue, Benjamin B.; Fowler, C. R.

    1993-01-01

Future Lunar exploration plans will call for delivery of significant amounts of cargo to provide for crew habitation, surface transportation, and scientific exploration activities. Minimization of costly surface-based infrastructure is in large part directly related to the design of the cargo delivery/landing craft. This study focused on evaluating Lunar lander concepts from a logistics-oriented perspective; it outlines the approach used in the development of a preferred configuration, sets forth the benefits derived from its utilization, and describes the missions and systems considered. Results indicate that only direct-to-surface downloading of payloads provides for unassisted cargo removal operations imperative to efficient and low-risk site buildup, including the emplacement of Space Station derivative surface habitat modules; immediate cargo jettison for both descent abort and emergency surface ascent essential to piloted missions carrying cargo; and short habitat egress/ingress paths necessary to productive surface work tours for crew members carrying hand-held experiments, tools and other bulky articles. By accommodating cargo in a position underneath the vehicle's structural frame, the landing craft described herein eliminate altogether the necessity for dedicated surface-based off-loading vehicles, the operations and maintenance associated with their operation, and the precipitous ladder climbs to and from the surface that are inherent to traditional designs. Parametric evaluations illustrate performance and mass variation with respect to mission requirements.

  8. Automated DSM Extraction from UAV Images and Performance Analysis

    NASA Astrophysics Data System (ADS)

    Rhee, S.; Kim, T.

    2015-08-01

As technology evolves, unmanned aerial vehicle (UAV) imagery is being used in applications ranging from simple image acquisition to complicated 3D spatial information extraction. Spatial information is usually provided in the form of a DSM or point cloud, so it is important to generate very dense tie points automatically from stereo images. In this paper, we applied a stereo image-based matching technique developed for satellite/aerial images to UAV images, propose processing steps for automated DSM generation, and analyse the quality of the generated DSM. For DSM generation from UAV images, firstly, exterior orientation parameters (EOPs) for each dataset were adjusted. Secondly, optimum matching pairs were determined. Thirdly, stereo image matching was performed with each pair. The developed matching algorithm is based on grey-level correlation of pixels applied along epipolar lines. Finally, the extracted match results were merged into one result and the final DSM was made. The generated DSM was compared with a reference DSM from Lidar; overall accuracy was 1.5 m in NMAD. However, several problems have to be solved in the future, including obtaining precise EOPs and handling occlusion and image blurring. A more effective interpolation technique also needs to be developed.

  9. Performance of the Carbon Dioxide Information Analysis Center (CDIAC)

    SciTech Connect

    Stoss, F.W.; Jones, S.B.

    1993-11-01

The Carbon Dioxide Information Analysis Center (CDIAC) provides information and data resources in support of the US Department of Energy's Global Change Research Program. CDIAC also serves as a resource of global change information for a broader international community of researchers, policymakers, managers, educators, and students. The number of requests for CDIAC's data products, information services, and publications has grown over the years and represents multidisciplinary interests in the physical, life, and social sciences and diverse work settings in government, business, and academia. CDIAC's staff addresses thousands of requests yearly for data and information resources. In response to these requests, CDIAC has distributed tens of thousands of data products, technical reports, newsletters, and other information resources worldwide since 1982. This paper describes CDIAC, examines CDIAC's user community, and describes CDIAC's response to requests for information. The CDIAC Information System, which serves as a comprehensive PC-based inventory and information management tracking system, is also described.

  10. High speed spherical roller-bearing analysis and comparison with experimental performance

    NASA Technical Reports Server (NTRS)

    Kleckner, R. J.; Dyba, G.

    1983-01-01

    The capabilities of a spherical roller bearing analysis/design tool, Spherbean (spherical bearing analysis) are described. Capabilities of the analysis are demonstrated and verified by comparison with experimental data. A practical design problem is presented where the computer program is used to improve a particular bearing's performance.

  11. A TIERED APPROACH TO PERFORMING UNCERTAINTY ANALYSIS IN CONDUCTING EXPOSURE ANALYSIS FOR CHEMICALS

    EPA Science Inventory

    The WHO/IPCS draft Guidance Document on Characterizing and Communicating Uncertainty in Exposure Assessment provides guidance on recommended strategies for conducting uncertainty analysis as part of human exposure analysis. Specifically, a tiered approach to uncertainty analysis ...

  12. Computerized Neurocognitive Test Performance in Schizophrenia: A Lifespan Analysis

    PubMed Central

    Irani, Farzin; Brensinger, Colleen M.; Richard, Jan; Calkins, Monica E.; Moberg, Paul J.; Bilker, Waren; Gur, Raquel E.; Gur, Ruben C.

    2011-01-01

    Objective Computerized neurocognitive batteries based on advanced behavioral neuroscience methods are increasingly used in large-scale clinical and genomic studies. Favorable construct validity in younger schizophrenia patients has been reported, but not in older patients. New variables afforded by computerized assessments were used to clarify age-associated cognitive impairment across the lifespan. Methods 624 patients with schizophrenia and 624 healthy comparison (HC) subjects aged 16–75 completed a 1–2 hour computerized neurocognitive battery (CNB) that assessed abstraction and mental flexibility, attention, working memory, recognition memory (verbal, facial, spatial), language, visuospatial and emotion processing. Linear mixed effects models tested for group differences in accuracy, response time, and efficiency scores. Contrasts were stratified by age. Results 91% of older (45+) and 94% of younger (<45) groups provided “good” data quality. After controlling for parental education and project, there were significant three-way interactions for diagnosis x domain x age group on all three outcome variables. Patients performed worse than HC across all neurocognitive domains, except in the oldest group of 60+ patients. Age-stratified analyses did not show differences between younger (16–45) and older patients (45–60, 60+), except for the attention domain. Older patients’ reduced working memory efficiency was due to worse speed, not accuracy. Older patients were quicker than younger patients in processing emotions. Conclusions Computerized assessments are feasible in large cohorts of schizophrenia patients. There is stable and generalized neurocognitive dysfunction across the lifespan in schizophrenia, albeit with fewer differences in some domains between older patients and HC after age 60. Speed-accuracy tradeoff strategies suggest deceleration of some frontal networks and improvements in speed of emotional processing. PMID:22183011

  13. The Performance Analysis of a UAV Based Mobile Mapping System

    NASA Astrophysics Data System (ADS)

    Tsai, M. L.; Chiang, K. W.; Tseng, Y. H.; Rau, J. Y.; Huang, Y. W.; Lo, C. F.

    2012-07-01

In order to facilitate applications such as environment detection or disaster monitoring, developing a quick and low-cost system to collect near real-time spatial information is very important. Such a rapid spatial information collection capability has become an emerging trend in remote sensing and mapping technology. In this study, a fixed-wing UAV based spatial information acquisition platform is developed and evaluated. The proposed UAV based platform has a direct georeferencing module including a low-cost INS/GPS integrated system and a low-cost digital camera, as well as other general UAV modules including an immediate video monitoring communication system. This direct georeferencing module is able to provide differential GPS processing with single-frequency carrier phase measurements to obtain sufficient positioning accuracy. All the necessary calibration procedures, including interior orientation parameters, the lever arm, and the boresight angle, are implemented. In addition, a flight test is performed to verify the positioning accuracy in direct georeferencing mode without using any ground control point (GCP), which is required for most current UAV based photogrammetric platforms. In other words, this is one of the pilot studies concerning a direct-georeferencing based UAV photogrammetric platform. The preliminary results in terms of positioning accuracy in direct georeferencing mode without using any GCP show horizontal positioning accuracies in the x and y axes both less than 20 meters; the positioning accuracy of the z axis is less than 50 meters at a 600 meter flight height above ground. Such accuracy is good enough for near real-time disaster relief, making this a relatively safe and cheap platform to collect critical spatial information for urgent responses such as disaster relief and assessment applications where ground control points are not available.

  14. Performance analysis of bonded composite doublers on aircraft structures

    SciTech Connect

    Roach, D.

    1995-08-01

Researchers contend that composite repairs (or structural reinforcement doublers) offer numerous advantages over metallic patches, including corrosion resistance, light weight, high strength, elimination of rivets, and time savings in installation. Their use in commercial aviation has been stifled by uncertainties surrounding their application, subsequent inspection, and long-term endurance. The process of repairing or reinforcing airplane structures is time consuming and the design is dependent upon an accompanying stress and fatigue analysis. A repair that is too stiff may result in a loss of fatigue life, continued growth of the crack being repaired, and the initiation of a new flaw in the undesirable high stress field around the patch. Uncertainties in the load spectrums used to design repairs exacerbate these problems, as does the use of rivets to apply conventional doublers. Many of these repair or structural reinforcement difficulties can be addressed through the use of composite doublers. Primary among the unknowns are the effects of non-optimum installations and the certification of adequate inspection procedures. This paper presents an overview of a program intended to introduce composite doubler technology to the US commercial aircraft fleet. In this project, a specific composite application has been chosen on an L-1011 aircraft in order to focus the tasks on application and operation issues. Through the use of laboratory test structures and flight demonstrations on an in-service L-1011 airplane, this study is investigating composite doubler design, fabrication, installation, structural integrity, and non-destructive evaluation. In addition to providing an overview of the L-1011 project, this paper focuses on a series of fatigue and strength tests that have been conducted in order to study the damage tolerance of composite doublers. Test results to date are presented.

  15. Design and demonstrate the performance of cryogenic components representative of space vehicles: Start basket liquid acquisition device performance analysis

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The objective was to design, fabricate and test an integrated cryogenic test article incorporating both fluid and thermal propellant management subsystems. A 2.2 m (87 in) diameter aluminum test tank was outfitted with multilayer insulation, helium purge system, low-conductive tank supports, thermodynamic vent system, liquid acquisition device and immersed outflow pump. Tests and analysis performed on the start basket liquid acquisition device and studies of the liquid retention characteristics of fine mesh screens are discussed.

  16. A covariance analysis tool for assessing fundamental limits of SIM pointing performance

    NASA Astrophysics Data System (ADS)

    Bayard, David S.; Kang, Bryan H.

    2007-09-01

    This paper presents a performance analysis of the instrument pointing control system for NASA's Space Interferometer Mission (SIM). SIM has a complex pointing system that uses a fast steering mirror in combination with a multirate control architecture to blend feedforward information with feedback information. A pointing covariance analysis tool (PCAT) is developed specifically to analyze systems with such complexity. The development of PCAT as a mathematical tool for covariance analysis is outlined in the paper. PCAT is then applied to studying performance of SIM's science pointing system. The analysis reveals and clearly delineates a fundamental limit that exists for SIM pointing performance. The limit is especially stringent for dim star targets. Discussion of the nature of the performance limit is provided, and methods are suggested to potentially improve pointing performance.
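The kind of covariance analysis a tool like PCAT automates can be sketched with a toy discrete-time model: propagate the state covariance through the closed-loop dynamics until it reaches the steady state that bounds achievable pointing. The two-state model and its numbers below are invented for illustration and are not SIM's actual pointing dynamics.

```python
import numpy as np

# Hypothetical 2-state closed-loop pointing model: state = [error, error rate].
# A is the (stable) closed-loop transition matrix, Q the per-step process noise.
A = np.array([[0.95, 0.10],
              [-0.05, 0.90]])
Q = np.diag([1e-4, 1e-5])

# Iterate the covariance recursion P_{k+1} = A P A^T + Q to steady state;
# the converged P gives the fundamental floor on RMS pointing error.
P = np.zeros((2, 2))
for _ in range(2000):
    P = A @ P @ A.T + Q

rms_pointing_error = np.sqrt(P[0, 0])
print(rms_pointing_error)
```

Because the toy loop is stable, the recursion converges to the fixed point of the discrete Lyapunov equation; dim-star targets would enter such an analysis as a larger measurement-noise contribution to Q.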

  17. A Covariance Analysis Tool for Assessing Fundamental Limits of SIM Pointing Performance

    NASA Technical Reports Server (NTRS)

    Bayard, David S.; Kang, Bryan H.

    2007-01-01

    This paper presents a performance analysis of the instrument pointing control system for NASA's Space Interferometer Mission (SIM). SIM has a complex pointing system that uses a fast steering mirror in combination with a multirate control architecture to blend feedforward information with feedback information. A pointing covariance analysis tool (PCAT) is developed specifically to analyze systems with such complexity. The development of PCAT as a mathematical tool for covariance analysis is outlined in the paper. PCAT is then applied to studying performance of SIM's science pointing system. The analysis reveals and clearly delineates a fundamental limit that exists for SIM pointing performance. The limit is especially stringent for dim star targets. Discussion of the nature of the performance limit is provided, and methods are suggested to potentially improve pointing performance.

  18. Performance Analysis of Occurrences January 1, 2011-December 31, 2011

    SciTech Connect

    Ludwig, M

    2012-03-16

    This report documents the analysis of the occurrences during the period January 1, 2011 through December 31, 2011. The report compares LLNL occurrences by reporting criteria and significance category to see if LLNL is reporting occurrences in similar proportions to other DOE sites. The three-year trends are analyzed. It does not include the analysis of the causes or the lessons learned from the occurrences, as they are analyzed separately. The number and types of occurrences that LLNL reports to DOE varies over time. This variation can be attributed to normally occurring changes in frequency; DOE's or LLNL's heightened interest in a particular subject area; changes in LLNL processes; or emerging problems. Since all of the DOE sites use the same reporting criteria, it is helpful to understand if LLNL is consistent with or diverging from reporting at other sites. This section compares the normalized number of occurrences reported by LLNL and other DOE sites. In order to compare LLNL occurrence reports to occurrence reports from other DOE sites, we normalized (or standardized) the data from the sites. DOE sites vary widely in their budgets, populations, and scope of work and these variations may affect reporting frequency. In addition, reports are required for a wide range of occurrence types, some of which may not be applicable to all DOE sites. For example, one occurrence reporting group is Group 3, Nuclear Safety Basis, and not all sites have nuclear operations. Because limited information is available for all sites, the sites were normalized based on best available information. Site effort hours were extracted from the DOE Computerized Accident Incident Reporting System (CAIRS) and used to normalize (or standardize) the number of occurrences by site. Effort hours are those hours that employees normally work and do not include vacation or holiday hours. Sites are responsible for calculating their effort hours and ensuring entry into CAIRS. Out of the 30 DOE
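The normalization step described above reduces to dividing each site's occurrence count by its effort hours and scaling to a common rate base. A minimal sketch, with invented site numbers (not actual CAIRS data) and a 200,000-hour rate base assumed for illustration (the report itself may use a different base):

```python
# Hypothetical site data: occurrence counts and CAIRS-style effort hours.
sites = {
    "Site A": {"occurrences": 120, "effort_hours": 12_000_000},
    "Site B": {"occurrences": 45,  "effort_hours": 3_500_000},
}

# Normalize to occurrences per 200,000 effort hours (roughly 100
# full-time-equivalent work-years), a common base for safety statistics.
RATE_BASE = 200_000
for name, d in sites.items():
    rate = d["occurrences"] * RATE_BASE / d["effort_hours"]
    print(f"{name}: {rate:.2f} occurrences per {RATE_BASE:,} effort hours")
```

On these invented numbers the smaller site reports at a higher normalized rate despite fewer raw occurrences, which is exactly the distortion the normalization is meant to expose.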

  19. Performance analysis of mini-propellers based on FlightGear

    NASA Astrophysics Data System (ADS)

    Vogeltanz, Tomáš

    2016-06-01

    This paper presents a performance analysis of three mini-propellers based on the FlightGear flight simulator. Although a basic propeller analysis has to be performed before the use of FlightGear, for a complex and more practical performance analysis, it is advantageous to use a propeller model in cooperation with a particular aircraft model. This approach may determine whether the propeller has sufficient quality in respect of aircraft requirements. In the first section, the software used for the analysis is illustrated. Then, the parameters of the analyzed mini-propellers and the tested UAV are described. Finally, the main section shows and discusses the results of the performance analysis of the mini-propellers.

  20. Space tug economic analysis study. Volume 2: Tug concepts analysis. Appendix: Tug design and performance data base

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The tug design and performance data base for the economic analysis of space tug operation are presented. A compendium of the detailed design and performance information from the data base is developed. The design data are parametric across a range of reusable space tug sizes. The performance curves are generated for selected point designs of expendable orbit injection stages and reusable tugs. Data are presented in the form of graphs for various modes of operation.

  1. Analysis of sampling and quantization effects on the performance of PN code tracking loops

    NASA Technical Reports Server (NTRS)

    Quirk, K. J.; Srinivasan, M.

    2002-01-01

    Pseudonoise (PN) code tracking loops in direct-sequence spread-spectrum systems are often implemented using digital hardware. Performance degradation due to quantization and sampling effects is not adequately characterized by the traditional analog system feedback loop analysis.

  2. Analytical techniques and instrumentation: A compilation. [analytical instrumentation, materials performance, and systems analysis

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.

  3. Performance Analysis and Optimization on a Parallel Atmospheric General Circulation Model Code

    NASA Technical Reports Server (NTRS)

    Lou, J. Z.; Farrara, J. D.

    1997-01-01

    An analysis is presented of the primary factors influencing the performance of a parallel implementation of the UCLA atmospheric general circulation model (AGCM) on distributed-memory, massively parallel computer systems.

  4. Performance Cycle Analysis of a Two-Spool, Separate-Exhaust Turbofan With Interstage Turbine Burner

    NASA Technical Reports Server (NTRS)

    Liew, K. H.; Urip, E.; Yang, S. L.; Mattingly, J. D.; Marek, C. J.

    2005-01-01

    This paper presents the performance cycle analysis of a dual-spool, separate-exhaust turbofan engine, with an Interstage Turbine Burner (ITB) serving as a secondary combustor. The ITB, which is located at the transition duct between the high- and the low-pressure turbines, is a relatively new concept for increasing specific thrust and lowering pollutant emissions in modern jet engine propulsion. A detailed performance analysis of this engine has been conducted for steady-state engine performance prediction. A code was written that is capable of predicting engine performance (i.e., thrust and thrust-specific fuel consumption) at varying flight conditions and throttle settings. Two design-point engines were studied to reveal trends in performance at both full and partial throttle operations. A mission analysis is also presented to confirm the fuel-saving advantage of adding the ITB.
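The flavor of such a parametric cycle calculation can be shown with a much simpler case than the paper's ITB turbofan: an ideal single-spool turbojet (no bypass, no ITB, calorically perfect gas) evaluated at one design point. The formulas are the standard parametric-cycle relations; the flight condition and component numbers are chosen arbitrarily for illustration.

```python
import math

# Ideal turbojet on-design cycle (illustrative only).
gamma, R = 1.4, 287.0                     # gas properties
cp = gamma * R / (gamma - 1.0)            # specific heat [J/(kg K)]
T0, M0 = 216.7, 0.8                       # ambient temperature [K], flight Mach
pi_c, Tt4, h_PR = 20.0, 1600.0, 42.8e6    # compressor ratio, turbine inlet T [K], fuel LHV [J/kg]

a0 = math.sqrt(gamma * R * T0)            # ambient speed of sound
tau_r = 1.0 + 0.5 * (gamma - 1.0) * M0**2                 # ram temperature ratio
tau_c = pi_c ** ((gamma - 1.0) / gamma)                   # compressor temperature ratio
tau_lam = Tt4 / T0
tau_t = 1.0 - tau_r * (tau_c - 1.0) / tau_lam             # turbine drives compressor

# Exit velocity ratio, specific thrust, fuel-air ratio, and TSFC
V9_a0 = math.sqrt(2.0 / (gamma - 1.0) * tau_lam / (tau_r * tau_c)
                  * (tau_r * tau_c * tau_t - 1.0))
F_mdot = a0 * (V9_a0 - M0)                                # specific thrust [N s/kg]
f = cp * T0 * (tau_lam - tau_r * tau_c) / h_PR            # fuel-air ratio
S = f / F_mdot                                            # TSFC [kg/(N s)]
print(F_mdot, S)
```

The engine in the paper layers a second power balance (the ITB between the turbine spools), a bypass stream, and off-design matching on top of this same skeleton.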

  5. Application of modified profile analysis to function testing of simulated CTOL transport touchdown-performance data

    NASA Technical Reports Server (NTRS)

    Parrish, R. V.; Mckissick, B. T.

    1979-01-01

    The modification to the methodology of profile analysis to accommodate the testing of differences between two functions with a single test, rather than multiple tests at various values of the abscissa, is described and demonstrated for two sets of simulation-performance data. The first application was to a flight-simulation comparison of pilot-vehicle performance with a three-element refractive display to performance with a more widely used beam-splitter-reflective-mirror display system. The results demonstrate that the refractive system for out-the-window scene display provides equivalent performance to the reflective system. The second application demonstrates the detection of significant differences by modified profile-analysis procedures. This application compares the effects of two sets of pitch-axis force-feel characteristics on the sink rate at touchdown performance utilizing the refractive system. This experiment demonstrates the dependence of simulator sink-rate performance on force-feel characteristics.

  6. Longitudinal Trend Analysis of Performance Indicators for South Carolina's Technical Colleges

    ERIC Educational Resources Information Center

    Hossain, Mohammad Nurul

    2010-01-01

    This study included an analysis of the trend of performance indicators for the technical college sector of higher education in South Carolina. In response to demands for accountability and transparency in higher education, the state of South Carolina developed sector specific performance indicators to measure various educational outcomes for each…

  7. Performance Metrics Development Analysis for Information and Communications Technology Outsourcing: A Case Study

    ERIC Educational Resources Information Center

    Travis, James L., III

    2014-01-01

    This study investigated how and to what extent the development and use of the OV-5a operational architecture decomposition tree (OADT) from the Department of Defense (DoD) Architecture Framework (DoDAF) affects requirements analysis with respect to complete performance metrics for performance-based services acquisition of ICT under rigid…

  8. Performance Measures Analysis (PMA) as a Means of Assessing Consistency between Course Objectives and Evaluation Process.

    ERIC Educational Resources Information Center

    Curtiss, Frederic R.; Swonger, Alvin K.

    1981-01-01

    A performance measure analysis process developed by the Competency Based Education Committee of the University of Rhode Island College of Pharmacy to assess the status of the measurement of student performance is described. A taxonomy of levels of learning is appended. (Author/MLW)

  9. Relation of Early Testing and Incentive on Quiz Performance in Introductory Psychology: An Archival Analysis

    ERIC Educational Resources Information Center

    McGuire, Michael J.; MacDonald, Pamelyn M.

    2009-01-01

    Students should learn best by repeating a cycle of studying, testing, and feedback, all of which are components of "mastery learning." We performed an archival analysis to determine the relation between taking quizzes early and quiz performance in a "mastery learning" context. Also investigated was whether extra credit resulted in early testing…

  10. Application of Data Envelopment Analysis on the Indicators Contributing to Learning and Teaching Performance

    ERIC Educational Resources Information Center

    Montoneri, Bernard; Lin, Tyrone T.; Lee, Chia-Chi; Huang, Shio-Ling

    2012-01-01

    This paper applies data envelopment analysis (DEA) to explore the quantitative relative efficiency of 18 classes of freshmen students studying a course of English conversation in a university of Taiwan from the academic year 2004-2006. A diagram of teaching performance improvement mechanism is designed to identify key performance indicators for…

  11. The Effect of Goal Setting on Group Performance: A Meta-Analysis

    ERIC Educational Resources Information Center

    Kleingeld, Ad; van Mierlo, Heleen; Arends, Lidia

    2011-01-01

    Updating and extending the work of O'Leary-Kelly, Martocchio, and Frink (1994), with this meta-analysis on goal setting and group performance we show that specific difficult goals yield considerably higher group performance compared with nonspecific goals (d = 0.80 plus or minus 0.35, k = 23 effect sizes). Moderately difficult and easy goals were…

  12. An Analysis of Factors That Affect the Educational Performance of Agricultural Students

    ERIC Educational Resources Information Center

    Greenway, Gina

    2012-01-01

    Many factors contribute to student achievement. This study focuses on three areas: how students learn, how student personality type affects performance, and how course format affects performance outcomes. The analysis sought to improve understanding of the direction and magnitude with which each of these factors impacts student success. Improved…

  13. 41 CFR 102-80.130 - Who must perform the equivalent level of safety analysis?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire Prevention Equivalent Level of Safety Analysis... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false Who must perform the equivalent level of safety analysis? 102-80.130 Section 102-80.130 Public Contracts and Property...

  14. Parallels in Academic and Nonacademic Discursive Styles: An Analysis of a Mexican Woman's Narrative Performance

    ERIC Educational Resources Information Center

    Barajas, E. Dominguez

    2007-01-01

    This article presents a rhetorical analysis of a Mexican woman's oral narrative performance using a discourse studies and interactional sociolinguistics framework. The results of the analysis suggest that the discursive practice of the oral narrative and that of academic discourse share certain rhetorical features. These features are (a) the…

  15. Learners Performing Tasks in a Japanese EFL Classroom: A Multimodal and Interpersonal Approach to Analysis

    ERIC Educational Resources Information Center

    Stone, Paul

    2012-01-01

    In this paper I describe and analyse learner task-based interactions from a multimodal perspective with the aim of better understanding how learners' interpersonal relationships might affect task performance. Task-based pedagogy is focused on classroom interaction between learners, yet analysis of tasks has often neglected the analysis of this…

  16. Evaluating Language Environment Analysis System Performance for Chinese: A Pilot Study in Shanghai

    ERIC Educational Resources Information Center

    Gilkerson, Jill; Zhang, Yiwen; Xu, Dongxin; Richards, Jeffrey A.; Xu, Xiaojuan; Jiang, Fan; Harnsberger, James; Topping, Keith

    2015-01-01

    Purpose: The purpose of this study was to evaluate performance of the Language Environment Analysis (LENA) automated language-analysis system for the Chinese Shanghai dialect and Mandarin (SDM) languages. Method: Volunteer parents of 22 children aged 3-23 months were recruited in Shanghai. Families provided daylong in-home audio recordings using…

  17. Computer Analysis of the Auditory Characteristics of Musical Performance. Final Report.

    ERIC Educational Resources Information Center

    Heller, Jack J.; Campbell, Warren C.

    The purpose of this research was to perform computer analysis and modification of complex musical tones and to develop models of perceptual and learning processes in music. Analysis of the physical attributes of sound (frequency, intensity, and harmonic content, versus time) provided necessary information about the musical parameters of…

  18. Component, Image, and Factor Analysis of Tests of Intellect and of Motor Performance.

    ERIC Educational Resources Information Center

    Harris, Chester W.; Liba, Marie R.

    An attempt was made to determine the effects of certain variations in methodology on the analysis of existing sets of data in the areas of ability or intelligence and motor performance or physical fitness. Using current developments in theory and methods of factor analysis, different treatments of various sets of data, three relatively new models…

  19. 41 CFR 102-80.130 - Who must perform the equivalent level of safety analysis?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire Prevention Equivalent Level of Safety Analysis... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false Who must perform the equivalent level of safety analysis? 102-80.130 Section 102-80.130 Public Contracts and Property...

  20. Cost and Schedule Control Systems Criteria for contract performance measurement. Data Analysis Guide

    SciTech Connect

    1986-03-01

    The Data Analysis Guide has been prepared to aid both DOE and industry personnel in the effective use of contract performance measurement data. It suggests techniques for analyzing contractor cost and schedule data to give insight into current contract performance status and help validate contractor estimates of future contract performance. The techniques contained herein should be modified and tailored to fit particular project and special needs.
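The core contract performance measurement quantities in the C/SCSC vocabulary are simple ratios and differences of scheduled, earned, and actual cost. A minimal sketch with invented values (not from the Guide):

```python
# Earned-value indices in C/SCSC terminology; the dollar figures are hypothetical.
BCWS = 100.0   # budgeted cost of work scheduled
BCWP = 90.0    # budgeted cost of work performed (earned value)
ACWP = 120.0   # actual cost of work performed

cost_variance = BCWP - ACWP        # negative -> over cost
schedule_variance = BCWP - BCWS    # negative -> behind schedule
CPI = BCWP / ACWP                  # cost performance index (<1 is unfavorable)
SPI = BCWP / BCWS                  # schedule performance index (<1 is unfavorable)
print(cost_variance, schedule_variance, CPI, SPI)
```

Analysts use trends in CPI and SPI, rather than single-point values, to validate contractor estimates of cost and schedule at completion.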

  1. Methodology issues concerning the accuracy of kinematic data collection and analysis using the ariel performance analysis system

    NASA Technical Reports Server (NTRS)

    Wilmington, R. P.; Klute, Glenn K. (Editor); Carroll, Amy E. (Editor); Stuart, Mark A. (Editor); Poliner, Jeff (Editor); Rajulu, Sudhakar (Editor); Stanush, Julie (Editor)

    1992-01-01

    Kinematics, the study of motion exclusive of the influences of mass and force, is one of the primary methods used for the analysis of human biomechanical systems as well as other types of mechanical systems. The Anthropometry and Biomechanics Laboratory (ABL) in the Crew Interface Analysis section of the Man-Systems Division performs both human-body and mechanical-system kinematic analyses using the Ariel Performance Analysis System (APAS). The APAS supports both analysis of analog signals (e.g., force-plate data collection) and digitization and analysis of video data. The current evaluations address several methodology issues concerning the accuracy of the kinematic data collection and analysis used in the ABL. This document describes a series of evaluations performed to gain quantitative data pertaining to position and constant angular velocity movements under several operating conditions. Two-dimensional as well as three-dimensional data collection and analyses were completed in a controlled laboratory environment using typical hardware setups. In addition, an evaluation was performed to quantify the accuracy impact due to a single axis camera offset. Segment length and positional data exhibited errors within 3 percent when using three-dimensional analysis and yielded errors within 8 percent through two-dimensional analysis (Direct Linear Software). Peak angular velocities displayed errors within 6 percent through three-dimensional analyses and exhibited errors of 12 percent when using two-dimensional analysis (Direct Linear Software). The specific results from this series of evaluations and their impacts on the methodology issues of kinematic data collection and analyses are presented in detail. The accuracy levels observed in these evaluations are also presented.
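The constant-angular-velocity check described above can be sketched in a few lines: generate a marker moving at a known rate, recover the rate by differencing the digitized angle, and report the peak percentage error. The data here are synthetic (an ideal noiseless circle at 60 Hz), not APAS output, so the recovered error is essentially zero; real digitization noise produces the percent-level errors reported in the abstract.

```python
import numpy as np

true_omega = 2.0                  # known angular velocity [rad/s]
dt = 1.0 / 60.0                   # 60 Hz video sampling
t = np.arange(0.0, 2.0, dt)
x, y = np.cos(true_omega * t), np.sin(true_omega * t)   # ideal marker path

theta = np.unwrap(np.arctan2(y, x))   # continuous angle from digitized x, y
omega = np.gradient(theta, dt)        # central-difference angular velocity
peak_error_pct = 100.0 * np.max(np.abs(omega - true_omega)) / true_omega
print(peak_error_pct)
```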

  2. Value-Added and Other Methods for Measuring School Performance: An Analysis of Performance Measurement Strategies in Teacher Incentive Fund Proposals. Research Brief

    ERIC Educational Resources Information Center

    National Center on Performance Incentives, 2008

    2008-01-01

    In "Value-Added and Other Methods for Measuring School Performance: An Analysis of Performance Measurement Strategies in Teacher Incentive Fund Proposals"--a paper presented at the February 2008 National Center on Performance Incentives research to policy conference--Robert Meyer and Michael Christian examine select performance-pay plans used by…

  3. Geometrically nonlinear design sensitivity analysis on parallel-vector high-performance computers

    NASA Technical Reports Server (NTRS)

    Baddourah, Majdi A.; Nguyen, Duc T.

    1993-01-01

    Parallel-vector solution strategies for generation and assembly of element matrices, solution of the resulted system of linear equations, calculations of the unbalanced loads, displacements, stresses, and design sensitivity analysis (DSA) are all incorporated into the Newton Raphson (NR) procedure for nonlinear finite element analysis and DSA. Numerical results are included to show the performance of the proposed method for structural analysis and DSA in a parallel-vector computer environment.

  4. Performance Analysis of DPSK-OCDMA System for Optical Access Network

    NASA Astrophysics Data System (ADS)

    Islam, Monirul; Ahmed, N.; Aljunid, S. A.; Ali, Sharafat; Sayeed, S.; Sabri, Naseer

    2016-03-01

    In this research, the performance of optical code division multiple access (OCDMA) using differential phase shift keying (DPSK) has been compared with OCDMA on-off keying (OOK). The comparison was made in terms of bit error rate (BER) and received power, with two bit rates (155 Mbps and 622 Mbps) used for the analysis. The systems were simulated in OptiSystem 7.0, comparing eye diagrams and optical spectra alongside BER and received power. It is found that OCDMA-DPSK performs better than OCDMA-OOK. The performance analysis also provides parameters for the design and development of an OCDMA system for an optical access network using DPSK.
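The DPSK-versus-OOK advantage has a textbook analogue that is easy to sketch. The expressions below are the standard AWGN-channel formulas for coherent OOK and differentially detected binary DPSK, not the optical simulation in the abstract, so they illustrate the direction of the comparison rather than reproduce its numbers.

```python
import math

def q_func(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def ber_ook(ebn0):
    """Coherent on-off keying over AWGN (average bit energy Eb)."""
    return q_func(math.sqrt(ebn0))

def ber_dbpsk(ebn0):
    """Differentially detected binary DPSK over AWGN."""
    return 0.5 * math.exp(-ebn0)

ebn0 = 10.0 ** (10.0 / 10.0)   # Eb/N0 of 10 dB, in linear units
print(ber_ook(ebn0), ber_dbpsk(ebn0))
```

At the same Eb/N0, the DPSK curve sits well below the OOK curve, consistent with the simulation's finding that OCDMA-DPSK outperforms OCDMA-OOK.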

  5. Sound analysis of a musical performance to evaluate prosthodontic treatment for a clarinet player.

    PubMed

    Hattori, Mariko; Sumita, Yuka I; Taniguchi, Hisashi

    2015-01-01

    Some dental patients use the orofacial region to play wind instruments; however, musical performance has not been objectively evaluated following prosthodontic treatment in such patients. The purpose of this report was to describe prosthodontic treatment for a clarinet player using sound analysis. The patient required a removable partial denture for his maxillary anterior teeth. Sound analysis was performed before and after denture adjustment, and the patient completed a questionnaire regarding his perceptions while playing his clarinet. After adjustment, the denture showed better performance, and patient satisfaction increased compared with that before adjustment. PMID:24920520

  6. A further analysis for the minimum-variance deconvolution filter performance

    NASA Technical Reports Server (NTRS)

    Chi, Chong-Yung

    1987-01-01

    Chi and Mendel (1984) analyzed the performance of minimum-variance deconvolution (MVD). In this correspondence, a further analysis of the performance of the MVD filter is presented. It is shown that the MVD filter performs like an inverse filter and a whitening filter as SNR goes to infinity, and like a matched filter as SNR goes to zero. The estimation error of the MVD filter is colored noise, but it becomes white when SNR goes to zero. This analysis also connects the error power-spectral density of the MVD filter with the spectrum of the causal-prediction error filter.
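The two limiting behaviors stated in the abstract can be illustrated numerically with the Wiener deconvolution response H = conj(V)·S / (|V|²·S + σ²): as the noise σ² goes to zero it approaches the inverse filter 1/V, and as σ² grows it becomes proportional to the matched filter conj(V). This is a generic demonstration of the limits, not Chi's derivation itself; the channel response V is an arbitrary made-up example.

```python
import numpy as np

V = np.array([1 + 0.5j, -0.8 + 1.2j, 0.3 - 0.7j, 2.0 + 0.1j])  # channel response
S = 1.0                                                        # white message spectrum

def wiener(V, sigma2):
    """Wiener deconvolution frequency response for white input spectrum S."""
    return np.conj(V) * S / (np.abs(V) ** 2 * S + sigma2)

H_hi_snr = wiener(V, 1e-12)   # noise -> 0
H_lo_snr = wiener(V, 1e12)    # noise -> infinity

inverse_err = np.max(np.abs(H_hi_snr - 1.0 / V))               # vs inverse filter
matched_err = np.max(np.abs(H_lo_snr * 1e12 - np.conj(V)))     # vs scaled matched filter
print(inverse_err, matched_err)
```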

  7. A further analysis for the minimum-variance deconvolution filter performance

    NASA Astrophysics Data System (ADS)

    Chi, Chong-Yung

    1987-06-01

    Chi and Mendel (1984) analyzed the performance of minimum-variance deconvolution (MVD). In this correspondence, a further analysis of the performance of the MVD filter is presented. It is shown that the MVD filter performs like an inverse filter and a whitening filter as SNR goes to infinity, and like a matched filter as SNR goes to zero. The estimation error of the MVD filter is colored noise, but it becomes white when SNR goes to zero. This analysis also connects the error power-spectral density of the MVD filter with the spectrum of the causal-prediction error filter.

  8. Elastic-plastic mixed-iterative finite element analysis: Implementation and performance assessment

    NASA Technical Reports Server (NTRS)

    Sutjahjo, Edhi; Chamis, Christos C.

    1993-01-01

    An elastic-plastic algorithm based on Von Mises and associative flow criteria is implemented in MHOST-a mixed iterative finite element analysis computer program developed by NASA Lewis Research Center. The performance of the resulting elastic-plastic mixed-iterative analysis is examined through a set of convergence studies. Membrane and bending behaviors of 4-node quadrilateral shell finite elements are tested for elastic-plastic performance. Generally, the membrane results are excellent, indicating the implementation of elastic-plastic mixed-iterative analysis is appropriate.
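The constitutive step at the heart of such a Von Mises, associative-flow algorithm is the radial-return update: elastically predict a trial stress, and if it lies outside the yield surface, scale its deviator back onto the surface. The sketch below is a minimal J2 update with linear isotropic hardening under assumed material constants; it illustrates the technique, not the MHOST implementation.

```python
import numpy as np

G, H, sigma_y = 80e9, 0.0, 250e6   # shear modulus, hardening modulus, yield stress [Pa]

def radial_return(sigma_trial):
    """Return-mapped stress given a trial (elastically predicted) stress tensor."""
    s = sigma_trial - np.trace(sigma_trial) / 3.0 * np.eye(3)  # deviatoric part
    q = np.sqrt(1.5 * np.sum(s * s))                           # Von Mises stress
    if q <= sigma_y:
        return sigma_trial                                     # elastic step
    dgamma = (q - sigma_y) / (3.0 * G + H)                     # plastic multiplier
    n = 1.5 * s / q                                            # associative flow direction
    return sigma_trial - 2.0 * G * dgamma * n                  # scale deviator back

# A uniaxial trial stress beyond yield is returned onto the yield surface.
trial = np.diag([400e6, 0.0, 0.0])
sigma = radial_return(trial)
s = sigma - np.trace(sigma) / 3.0 * np.eye(3)
q_new = np.sqrt(1.5 * np.sum(s * s))
print(q_new)
```

With H = 0 the updated Von Mises stress lands exactly on sigma_y; inside a mixed-iterative scheme this update is repeated at every integration point per iteration.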

  9. Developing a Comprehensive Software Suite for Advanced Reactor Performance and Safety Analysis

    SciTech Connect

    Pointer, William David; Bradley, Keith S; Fischer, Paul F; Smith, Micheal A; Tautges, Timothy J; Ferencz, Robert M; Martineau, Richard C; Jain, Rajeev; Obabko, Aleksandr; Billings, Jay Jay

    2013-01-01

    This paper provides an introduction to the reactor analysis capabilities of the nuclear power reactor simulation tools that are being developed as part of the U.S. Department of Energy's Nuclear Energy Advanced Modeling and Simulation (NEAMS) Toolkit. The NEAMS Toolkit is an integrated suite of multi-physics simulation tools that leverage high-performance computing to reduce uncertainty in the prediction of performance and safety of advanced reactor and fuel designs. The Toolkit effort comprises two major components, the Fuels Product Line (FPL), which provides tools for fuel performance analysis, and the Reactor Product Line (RPL), which provides tools for reactor performance and safety analysis. This paper provides an overview of the NEAMS RPL development effort.

  10. Design and performance of an analysis-by-synthesis class of predictive speech coders

    NASA Technical Reports Server (NTRS)

    Rose, Richard C.; Barnwell, Thomas P., III

    1990-01-01

    The performance of a broad class of analysis-by-synthesis linear predictive speech coders is quantified experimentally. The class of coders includes a number of well-known techniques as well as a very large number of speech coders which have not been named or studied. A general formulation for deriving the parametric representation used in all of the coders in the class is presented. A new coder, named the self-excited vocoder, is discussed because of its good performance with low complexity, and because of the insight this coder gives to analysis-by-synthesis coders in general. The results of a study comparing the performances of different members of this class are presented. The study takes the form of a series of formal subjective and objective speech quality tests performed on selected coders. The results of this study lead to some interesting and important observations concerning the controlling parameters for analysis-by-synthesis speech coders.
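All coders in this class share the same "analysis" front end: fit an order-p linear predictor to a frame and work with the whitened prediction residual. A minimal autocorrelation-method sketch on a synthetic frame (not real speech, and not any specific coder from the study):

```python
import numpy as np

# Synthetic pseudo-voiced frame: a sinusoid plus a little noise.
rng = np.random.default_rng(1)
n = np.arange(240)
frame = np.sin(0.2 * n) + 0.1 * rng.normal(size=n.size)

# Autocorrelation method: solve the normal equations R a = r for the
# order-p predictor coefficients.
p = 8
r = np.array([frame[:frame.size - k] @ frame[k:] for k in range(p + 1)])
R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])  # Toeplitz
a = np.linalg.solve(R, r[1:])

# Prediction residual (the signal an analysis-by-synthesis coder models
# with its excitation codebook) and the resulting prediction gain.
pred = np.zeros_like(frame)
for k in range(1, p + 1):
    pred[k:] += a[k - 1] * frame[:-k]
residual = frame - pred
gain_db = 10.0 * np.log10(np.sum(frame**2) / np.sum(residual**2))
print(gain_db)
```

The members of the class differ in how they search for an excitation that, passed through the synthesis filter 1/A(z), best reproduces the frame; the self-excited vocoder draws that excitation from the coder's own past residual.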

  11. Performance (Off-Design) Cycle Analysis for a Turbofan Engine With Interstage Turbine Burner

    NASA Technical Reports Server (NTRS)

    Liew, K. H.; Urip, E.; Yang, S. L.; Mattingly, J. D.; Marek, C. J.

    2005-01-01

    This report presents the performance of a steady-state, dual-spool, separate-exhaust turbofan engine, with an interstage turbine burner (ITB) serving as a secondary combustor. The ITB, which is located in the transition duct between the high- and the low-pressure turbines, is a relatively new concept for increasing specific thrust and lowering pollutant emissions in modern jet-engine propulsion. A detailed off-design performance analysis of ITB engines is written in Microsoft(Registered Trademark) Excel (Redmond, Washington) macrocode with Visual Basic Application to calculate engine performances over the entire operating envelope. Several design-point engine cases are pre-selected using a parametric cycle-analysis code developed previously in Microsoft(Registered Trademark) Excel, for off-design analysis. The off-design code calculates engine performances (i.e. thrust and thrust-specific-fuel-consumption) at various flight conditions and throttle settings.

  12. Statistical Analysis of the Performance of MDL Enumeration for Multiple-Missed Detection in Array Processing

    PubMed Central

    Du, Fei; Li, Yibo; Jin, Shijiu

    2015-01-01

    An accurate performance analysis of the MDL criterion for source enumeration in array processing is presented in this paper. The enumeration results of MDL can be predicted precisely by the proposed procedure via the statistical analysis of the sample eigenvalues, whose distributive properties are investigated with the consideration of their interactions. A novel approach is also developed for the performance evaluation when the source number is underestimated by a number greater than one, which is denoted as "multiple-missed detection", and the probability of a specific underestimated source number can be estimated by ratio distribution analysis. Simulation results are included to demonstrate the superiority of the presented method over available results and confirm the ability of the proposed approach to perform multiple-missed detection analysis. PMID:26295232
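The quantity whose statistics the paper characterizes is the MDL cost computed from the sample eigenvalues: a log-ratio of arithmetic to geometric means of the presumed noise eigenvalues plus a complexity penalty. A sketch of the standard criterion on synthetic eigenvalues (two "signal" eigenvalues above a unit noise floor; the numbers are invented):

```python
import numpy as np

def mdl_estimate(eigs, N):
    """MDL source-number estimate from sample eigenvalues (N snapshots)."""
    eigs = np.sort(np.asarray(eigs, dtype=float))[::-1]   # descending order
    p = eigs.size
    costs = []
    for k in range(p):
        tail = eigs[k:]                                   # presumed noise eigenvalues
        arith = tail.mean()
        geom = np.exp(np.mean(np.log(tail)))
        costs.append(N * (p - k) * np.log(arith / geom)   # sphericity term
                     + 0.5 * k * (2 * p - k) * np.log(N)) # complexity penalty
    return int(np.argmin(costs))

eigs = [10.0, 5.0, 1.02, 0.99, 1.0, 0.98]
print(mdl_estimate(eigs, N=1000))
```

A "multiple-missed detection" in the paper's sense is the event that finite-sample fluctuation of the eigenvalues drives this argmin two or more below the true source number.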

  13. Intrinsic motivation and extrinsic incentives jointly predict performance: a 40-year meta-analysis.

    PubMed

    Cerasoli, Christopher P; Nicklin, Jessica M; Ford, Michael T

    2014-07-01

    More than 4 decades of research and 9 meta-analyses have focused on the undermining effect: namely, the debate over whether the provision of extrinsic incentives erodes intrinsic motivation. This review and meta-analysis builds on such previous reviews by focusing on the interrelationship among intrinsic motivation, extrinsic incentives, and performance, with reference to 2 moderators: performance type (quality vs. quantity) and incentive contingency (directly performance-salient vs. indirectly performance-salient), which have not been systematically reviewed to date. Based on random-effects meta-analytic methods, findings from school, work, and physical domains (k = 183, N = 212,468) indicate that intrinsic motivation is a medium to strong predictor of performance (ρ = .21-.45). The importance of intrinsic motivation to performance remained in place whether or not incentives were presented. In addition, incentive salience influenced the predictive validity of intrinsic motivation for performance: In a "crowding out" fashion, intrinsic motivation was less important to performance when incentives were directly tied to performance and was more important when incentives were indirectly tied to performance. Considered simultaneously through meta-analytic regression, intrinsic motivation predicted more unique variance in quality of performance, whereas incentives were a better predictor of quantity of performance. With respect to performance, incentives and intrinsic motivation are not necessarily antagonistic and are best considered simultaneously. Future research should consider using nonperformance criteria (e.g., well-being, job satisfaction) as well as applying the percent-of-maximum-possible (POMP) method in meta-analyses. PMID:24491020
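The random-effects pooling such meta-analyses rest on can be sketched with the DerSimonian-Laird estimator: compute a fixed-effect mean, estimate the between-study variance from the heterogeneity statistic Q, then re-pool with variance-inflated weights. The effect sizes and within-study variances below are invented for illustration, not the study's k = 183 data.

```python
import numpy as np

y = np.array([0.30, 0.45, 0.15, 0.60])   # hypothetical study effect sizes
v = np.array([0.02, 0.03, 0.01, 0.05])   # hypothetical within-study variances

w = 1.0 / v
y_fe = np.sum(w * y) / np.sum(w)                      # fixed-effect pooled mean
Q = np.sum(w * (y - y_fe) ** 2)                       # heterogeneity statistic
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - (y.size - 1)) / c)               # between-study variance (DL)

w_re = 1.0 / (v + tau2)                               # random-effects weights
pooled = np.sum(w_re * y) / np.sum(w_re)              # random-effects pooled mean
se = np.sqrt(1.0 / np.sum(w_re))                      # standard error of the mean
print(pooled, se)
```

The moderator analyses in the abstract (performance type, incentive contingency) amount to running this pooling within subgroups, or as meta-analytic regression with the moderators as covariates.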

  14. Computational Environments and Analysis methods available on the NCI High Performance Computing (HPC) and High Performance Data (HPD) Platform

    NASA Astrophysics Data System (ADS)

    Evans, B. J. K.; Foster, C.; Minchin, S. A.; Pugh, T.; Lewis, A.; Wyborn, L. A.; Evans, B. J.; Uhlherr, A.

    2014-12-01

    The National Computational Infrastructure (NCI) has established a powerful in-situ computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods that support data analysis, and 3) the progress in addressing harmonisation of the underlying data collections for future transdisciplinary research that enables accurate climate projections. NCI makes available 10+ PB of major data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected, large-scale, high-bandwidth Lustre filesystems. This computational environment supports a catalogue of integrated reusable software and workflows for earth system and ecosystem modelling, weather research, and satellite and other observed data processing and analysis. To enable transdisciplinary research on this scale, data needs to be harmonised so that researchers can readily apply techniques and software across the corpus of data available and not be constrained to work within artificial disciplinary boundaries. Future challenges will

  15. A Finite Rate Chemical Analysis of Nitric Oxide Flow Contamination Effects on Scramjet Performance

    NASA Technical Reports Server (NTRS)

    Cabell, Karen F.; Rock, Kenneth E.

    2003-01-01

    The level of nitric oxide contamination in the test gas of the Langley Research Center Arc-Heated Scramjet Test Facility and the effect of the contamination on scramjet test engine performance were investigated analytically. A finite rate chemical analysis was performed to determine the levels of nitric oxide produced in the facility at conditions corresponding to Mach 6 to 8 flight simulations. Results indicate that nitric oxide levels range from one to three mole percent, corroborating previously obtained measurements. A three-stream combustor code with finite rate chemistry was used to investigate the effects of nitric oxide on scramjet performance. Results indicate that nitric oxide in the test gas causes a small increase in heat release and thrust performance for the test conditions investigated. However, a rate constant uncertainty analysis suggests that the effect of nitric oxide ranges from no net effect, to an increase of about 10 percent in thrust performance.
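The rate-constant uncertainty analysis mentioned above can be illustrated with a simple Arrhenius-form bracketing sweep. The pre-exponential factor, activation energy, and factor-of-3 uncertainty band in this sketch are placeholder values for illustration, not the rates used in the study.

```python
import math

R_UNIVERSAL = 8.314  # universal gas constant, J/(mol K)

def arrhenius(a, e_a, t):
    """Arrhenius rate constant k(T) = A * exp(-Ea / (R T))."""
    return a * math.exp(-e_a / (R_UNIVERSAL * t))

def uncertainty_band(a, e_a, t, factor=3.0):
    """Bracket k(T) by an assumed multiplicative uncertainty factor,
    mirroring how a rate-constant uncertainty analysis bounds the
    predicted effect of a contaminant species on performance."""
    k = arrhenius(a, e_a, t)
    return k / factor, k, k * factor
```

Running the combustor model at the low and high ends of such a band is what produces the spread from "no net effect" to a measurable thrust change.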

  16. The Development of a Handbook for Astrobee F Performance and Stability Analysis

    NASA Technical Reports Server (NTRS)

    Wolf, R. S.

    1982-01-01

    An Astrobee F performance and stability analysis is presented for use by the NASA Sounding Rocket Division. The performance analysis provides information regarding altitude, Mach number, dynamic pressure, and velocity as functions of time since launch. It is found that payload weight has the greatest effect on performance, and performance prediction accuracy was calculated to remain within 1%. In addition, to assure sufficient flight stability, a predicted rigid-body static margin of at least 8% of the total vehicle length is required. Finally, fin cant angle predictions are given in order to achieve a 2.5 cycle per second burnout roll rate, based on obtaining 75% of the steady roll rate. It is noted that this method can be used by flight performance engineers to create a similar handbook for any sounding rocket series.
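The burnout roll-rate target quoted above follows from simple arithmetic on the steady roll rate; the sketch below works through it under a first-order spin-up assumption (the time-constant model is an illustration, not the handbook's method).

```python
import math

def steady_roll_rate(burnout_rate_hz, burnout_fraction):
    """Steady roll rate implied by reaching a given fraction of it at
    burnout: a 2.5 Hz burnout rate at 75% implies a steady rate of
    about 3.33 Hz."""
    return burnout_rate_hz / burnout_fraction

def roll_rate_at(t, p_ss, tau):
    """First-order spin-up model: p(t) = p_ss * (1 - exp(-t / tau))."""
    return p_ss * (1.0 - math.exp(-t / tau))
```

The fin cant angle is then whatever produces that steady rate for the vehicle's roll damping, which is the quantity the handbook tabulates.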

  18. Analysis and compilation of missile aerodynamic data. Volume 2: Performance analysis

    NASA Technical Reports Server (NTRS)

    Burkhalter, J. E.

    1977-01-01

    A general analysis is given of the flight dynamics of several surface-to-air and two air-to-air missile configurations. The analysis involves three phases: vertical climb, straight and level flight, and constant altitude turn. Wind tunnel aerodynamic data and full scale missile characteristics are used where available; unknown data are estimated. For the constant altitude turn phase, a three-degree-of-freedom flight simulation is used. Important parameters considered in this analysis are the vehicle weight, Mach number, heading angle, thrust level, sideslip angle, g loading, and time to make the turn. The actual flight path during the turn is also determined. Results are presented in graphical form.
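For the constant-altitude turn phase, the standard coordinated-turn relations connect g loading, speed, turn radius, and time to complete a heading change. The sketch below uses those textbook relations, not the report's three-degree-of-freedom simulation itself.

```python
import math

G0 = 9.80665  # standard gravity, m/s^2

def turn_radius(v, n):
    """Constant-altitude coordinated-turn radius for speed v (m/s)
    and load factor n: R = v^2 / (g * sqrt(n^2 - 1))."""
    return v ** 2 / (G0 * math.sqrt(n ** 2 - 1.0))

def time_to_turn(v, n, heading_change_rad):
    """Time to change heading by the given angle at constant altitude,
    using turn rate omega = g * sqrt(n^2 - 1) / v."""
    omega = G0 * math.sqrt(n ** 2 - 1.0) / v   # turn rate, rad/s
    return heading_change_rad / omega
```

Higher g loading tightens the radius and shortens the turn time, which is why g loading and time-to-turn appear together in the parameter list above.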

  19. [Epidemics of Ebola haemorrhagic fever in Gabon (1994-2002). Epidemiologic aspects and considerations on control measures].

    PubMed

    Milleliri, J M; Tévi-Benissan, C; Baize, S; Leroy, E; Georges-Courbot, M C

    2004-08-01

    Based on the description of the four Ebola haemorrhagic fever (EHF) epidemics that occurred in Gabon between 1994 and 2002, the authors consider the cultural and psycho-sociological factors that made control measures difficult to implement. In all, these epidemics accounted for 207 cases and 150 deaths (case-fatality rate: 72%). By analysing the third epidemic in detail and highlighting the factors that may explain its spread far beyond its epicentre, the authors bring out the limitations of control measures that were not always understood by local populations. The discussion addresses the possibilities of better surveillance, rapid mobilisation of intervention resources including a permanent regional pre-alert, and the issue raised by possible Ebola virus endemicity. PMID:15462203

  20. Analysis of validation data sets in the Class A Performance Evaluation Program

    SciTech Connect

    Hunn, B.D.

    1983-01-01

    The primary objective of the DOE Passive Solar Class A Performance Evaluation Program is to collect, analyze, and archive detailed test data for the rigorous validation of analysis/design tools used for passive solar research and design. This paper presents results of the analysis and qualification of several one- and two-week data sets taken at three Class A test sites for the purpose of validating envelope and thermal-storage-energy-transfer processes in passive solar analysis/design tools. Analysis of the data sets consists of editing the measured data and comparing these data with simulated performance results using public-domain, passive solar analysis tools and a standard reporting format developed for the Class A program. Comparisons of the measured data with results using the DOE-2 computer program are presented.