Science.gov

Sample records for performance analysis 1994-2002

  1. Using remote-sensing data to determine equilibrium-line altitude and mass-balance time series: validation on three French glaciers, 1994-2002

    NASA Astrophysics Data System (ADS)

    Rabatel, Antoine; Dedieu, Jean-Pierre; Vincent, Christian

    Alpine glaciers are very sensitive to climate fluctuations, and their mass balance can be used as an indicator of regional-scale climate change. Here, we present a method to calculate glacier mass balance using remote-sensing data. Snowline measurements from remotely sensed images recorded at the end of the hydrological year provide an effective proxy of the equilibrium line. Mass balance can be deduced from the equilibrium-line altitude (ELA) variations. Three well-documented glaciers in the French Alps, where the mass balance is measured at ground level with a stake network, were selected to assess the accuracy of the method over the 1994-2002 period (eight mass-balance cycles). Results obtained by ground measurements and remote sensing are compared and show excellent correlation (r2 > 0.89), both for the ELA and for the mass balance, indicating that the remote-sensing method can be applied to glaciers where no ground data exist, on the scale of a mountain range or a given climatic area. The main differences can be attributed to discrepancies between the dates of image acquisition and field measurements. Cloud cover and recent snowfalls constitute the main restrictions of the image-based method.
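
    The relation this method exploits, that annual mass balance varies roughly linearly with the equilibrium-line altitude, can be sketched with a simple least-squares fit. The numbers below are illustrative stand-ins, not the paper's data:

```python
import numpy as np

# Hypothetical stake-network data: annual mass balance (m w.e.) vs. the
# equilibrium-line altitude (ELA, m a.s.l.) inferred from end-of-summer
# snowline positions on satellite images. Values are illustrative only.
ela = np.array([2980.0, 3020.0, 3075.0, 3110.0, 2950.0, 3060.0, 3140.0, 3005.0])
mass_balance = np.array([0.45, 0.10, -0.35, -0.70, 0.65, -0.20, -0.95, 0.25])

# A linear fit captures the expected inverse relation: a higher ELA means
# a smaller accumulation area and hence a more negative mass balance.
slope, intercept = np.polyfit(ela, mass_balance, 1)

# Coefficient of determination (r^2) of the linear model.
pred = slope * ela + intercept
ss_res = np.sum((mass_balance - pred) ** 2)
ss_tot = np.sum((mass_balance - mass_balance.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot

print(f"slope = {slope:.4f} m w.e. per m of ELA rise, r^2 = {r2:.3f}")
```

    With real stake-network measurements, a fit of this kind lets ELA values read from satellite snowlines be converted directly into mass-balance estimates for unmonitored glaciers.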

  2. Internet Access in U.S. Public Schools and Classrooms: 1994-2002. E.D. Tabs.

    ERIC Educational Resources Information Center

    Kleiner, Anne; Lewis, Laurie

    This report presents data on Internet access in U.S. public schools from 1994 to 2002 by school characteristics. It provides trend analysis on the progress of public schools and classrooms in connecting to the Internet and on the ratio of students to instructional computers with Internet access. For the year 2002, this report also presents data on…

  3. [Molecular epidemiology of rabies epizootics in Colombia, 1994-2002: evidence of human and canine rabies associated with chiroptera].

    PubMed

    Páez, Andrés; Nuñez, Constanza; García, Clemencia; Boshell, Jorge

    2003-03-01

    Three urban rabies outbreaks have been reported in Colombia during the last two decades, one of which is ongoing in the Caribbean region (northern Colombia). The earlier outbreaks occurred almost simultaneously in Arauca (eastern Colombia) and in the Central region, ending in 1997. Phylogenetic relationships among rabies viruses isolated from the three areas were based on a comparison of cDNA fragments coding for the endodomain of protein G and a fragment of the L protein obtained by RT-PCR. The sequenced amplicons, which included the G-L intergenic region, contained 902 base pairs. Phylogenetic analysis showed three distinct groups of viruses. Colombian genetic variant I viruses were isolated only from Arauca and the Central region, but are now apparently extinct. Colombian genetic variant II viruses were isolated in the Caribbean region and are still being transmitted in that area. The third group, of bat rabies variants, was isolated from two insectivorous bats, three domestic dogs and a human. This associates bat rabies virus with rabies in Colombian dogs and humans, and indicates that bats are a rabies reservoir of public health significance.

  4. Dependability and Performability Analysis

    DTIC Science & Technology

    1993-11-01

    NASA Contractor Report 191565 ICASE Report No. 93-85 DEPENDABILITY AND PERFORMABILITY ANALYSIS Kishor S. Trivedi Gianfranco Ciardo...PERFORMABILITY ANALYSIS Kishor S. Trivedi Gianfranco Ciardo Manish Malhotra Robin A. Sahner Department of Electrical Engineering, Duke University...Trivedi, Gianfranco Ciardo, Manish Malhotra, and Robin A. Sahner 7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES) 8. PERFORMING ORGANIZATION Institute

  5. Performance Support for Performance Analysis

    ERIC Educational Resources Information Center

    Schaffer, Scott; Douglas, Ian

    2004-01-01

    Over the past several years, there has been a shift in emphasis in many business, industry, government and military training organizations toward human performance technology or HPT (Rossett, 2002; Dean, 1995). This trend has required organizations to increase the human performance knowledge, skills, and abilities of the training workforce.…

  6. MIR Performance Analysis

    SciTech Connect

    Hazen, Damian; Hick, Jason

    2012-06-12

    We provide analysis of Oracle StorageTek T10000 Generation B (T10KB) Media Information Record (MIR) Performance Data gathered over the course of a year from our production High Performance Storage System (HPSS). The analysis shows information in the MIR may be used to improve tape subsystem operations. Most notably, we found the MIR information to be helpful in determining whether the drive or tape was most suspect given a read or write error, and for helping identify which tapes should not be reused given their history of read or write errors. We also explored using the MIR Assisted Search to order file retrieval requests. We found that MIR Assisted Search may be used to reduce the time needed to retrieve collections of files from a tape volume.
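
    The retrieval-ordering idea behind MIR Assisted Search can be illustrated with a toy sketch. The field names below ("volume", "position") are hypothetical stand-ins for the drive-position metadata a MIR actually carries; only the ordering principle is taken from the abstract:

```python
# A minimal sketch of ordering tape retrieval requests by physical position.
# The "volume"/"position" fields are hypothetical; real MIR records hold
# drive-specific locate metadata, but the idea is the same: reading files
# in on-tape order avoids costly back-and-forth repositioning.
requests = [
    {"file": "a.dat", "volume": "T00042", "position": 9_500},
    {"file": "b.dat", "volume": "T00042", "position": 1_200},
    {"file": "c.dat", "volume": "T00007", "position": 4_800},
    {"file": "d.dat", "volume": "T00042", "position": 5_300},
]

# Group by volume (one mount each), then read in increasing position.
ordered = sorted(requests, key=lambda r: (r["volume"], r["position"]))

for r in ordered:
    print(r["volume"], r["position"], r["file"])
```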

  7. Performance analysis in saber.

    PubMed

    Aquili, Andrea; Tancredi, Virginia; Triossi, Tamara; De Sanctis, Desiree; Padua, Elvira; DʼArcangelo, Giovanna; Melchiorri, Giovanni

    2013-03-01

    Fencing is a sport practiced by both men and women that uses 3 weapons: foil, épée, and saber. In general, few scientific studies are available in the international literature; they are limited to the performance analysis of fencing bouts, and there is nothing about saber. There are 2 kinds of competitions in the World Cup for both men and women: the "FIE GP" and the "A." The aim of this study was to carry out a saber performance analysis to gain useful indicators for the definition of a performance model, and to verify whether performance is influenced by the type of competition and whether there are differences between men and women. Sixty bouts were analyzed: 33 from FIE GP and 27 from "A" competitions (35 men's and 25 women's saber bouts). The results indicated that most actions are offensive (55% for men and 49% for women); the central area of the piste is mostly used (72% for men and 67% for women); the effective fighting time is 13.6% for men and 17.1% for women; and the ratio between action and break times is 1:6.5 for men and 1:5.1 for women. A lunge is carried out every 23.9 seconds by men and every 20 seconds by women, and a direction change every 65.3 seconds by men and every 59.7 seconds by women. The data confirm the differences between the saber and the other 2 weapons. There is no significant difference between the data of the 2 kinds of competitions.

  8. DAS performance analysis

    SciTech Connect

    Bates, G.; Bodine, S.; Carroll, T.; Keller, M.

    1984-02-01

    This report begins with an overview of the Data Acquisition System (DAS), which supports several of PPPL's experimental devices. Performance measurements which were taken on DAS and the tools used to make them are then described.

  9. Performance analysis of FDDI

    NASA Technical Reports Server (NTRS)

    Johnson, Marjory J.

    1988-01-01

    The Fiber Distributed Data Interface (FDDI) is an emerging ANSI and ISO standard for a 100 megabit per second fiber optic token ring. The performance of the FDDI media access control protocol is analyzed using a simulation developed at NASA Ames. The analysis includes both standard measures of performance (average delay for asynchronous traffic, channel utilization, and transmission queue length) and characteristics of ring behavior attributable to the constraints the timed token protocol imposes on token holding time (bounded token rotation time, support for synchronous traffic, and fairness of channel access for nodes transmitting asynchronous traffic).

  10. Dependability and performability analysis

    NASA Technical Reports Server (NTRS)

    Trivedi, Kishor S.; Ciardo, Gianfranco; Malhotra, Manish; Sahner, Robin A.

    1993-01-01

    Several practical issues regarding specifications and solution of dependability and performability models are discussed. Model types with and without rewards are compared. Continuous-time Markov chains (CTMC's) are compared with (continuous-time) Markov reward models (MRM's) and generalized stochastic Petri nets (GSPN's) are compared with stochastic reward nets (SRN's). It is shown that reward-based models could lead to more concise model specifications and solution of a variety of new measures. With respect to the solution of dependability and performability models, three practical issues were identified: largeness, stiffness, and non-exponentiality, and a variety of approaches are discussed to deal with them, including some of the latest research efforts.
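
    A Markov reward model in the sense discussed here can be sketched in a few lines. The failure and repair rates below are illustrative, and the two-state availability model is the simplest possible case:

```python
import numpy as np

# A minimal Markov reward model: a repairable component that fails at
# rate lam and is repaired at rate mu (per hour; illustrative values).
# State 0 = up (reward rate 1.0), state 1 = down (reward rate 0.0).
lam, mu = 0.01, 0.5

# CTMC generator matrix Q (rows sum to zero).
Q = np.array([[-lam, lam],
              [mu, -mu]])

# Steady-state probabilities: solve pi @ Q = 0 with sum(pi) = 1 by
# appending the normalization condition to the balance equations.
A = np.vstack([Q.T, np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

reward_rates = np.array([1.0, 0.0])
expected_reward = pi @ reward_rates  # steady-state availability here

print(f"availability = {expected_reward:.6f}")  # analytically mu/(lam+mu)
```

    Attaching different reward rates to the same chain (e.g., throughput in each state) yields other measures from one model specification, which is the conciseness argument made in the abstract.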

  11. Analysis of EDP performance

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The objective of this contract was the investigation of the potential performance gains that would result from an upgrade of the Space Station Freedom (SSF) Data Management System (DMS) Embedded Data Processor (EDP) '386' design with the Intel Pentium (registered trade-mark of Intel Corp.) '586' microprocessor. The Pentium ('586') is the latest member of the industry standard Intel X86 family of CISC (Complex Instruction Set Computer) microprocessors. This contract was scheduled to run in parallel with an internal IBM Federal Systems Company (FSC) Internal Research and Development (IR&D) task that had the goal to generate a baseline flight design for an upgraded EDP using the Pentium. This final report summarizes the activities performed in support of Contract NAS2-13758. Our plan was to baseline performance analyses and measurements on the latest state-of-the-art commercially available Pentium processor, representative of the proposed space station design, and then phase to an IBM capital funded breadboard version of the flight design (if available from IR&D and Space Station work) for additional evaluation of results. Unfortunately, the phase-over to the flight design breadboard did not take place, since the IBM Data Management System (DMS) for the Space Station Freedom was terminated by NASA before the referenced capital funded EDP breadboard could be completed. The baseline performance analyses and measurements, however, were successfully completed, as planned, on the commercial Pentium hardware. The results of those analyses, evaluations, and measurements are presented in this final report.

  12. Performance Analysis of MYSEA

    DTIC Science & Technology

    2012-09-01

    algebra libraries automatically tuned for the target processor. An evaluation of the Denali Isolation Kernel [24] made use of web server benchmarks to...the Denali Isolation Kernel's primitive operations [24]. Network micro benchmarks measure the bandwidth, throughput and network latency experienced...and Software. In Proceedings: IEEE, volume 93, pp. 293-312, 2005. [24] A. Whitaker, M. Shaw, and S. D. Gribble. Scale and performance in the Denali

  13. Sensor performance analysis

    NASA Technical Reports Server (NTRS)

    Montgomery, H. E.; Ostrow, H.; Ressler, G. M.

    1990-01-01

    The theory is described, and the equations required to design and analyze the performance of electro-optical sensor systems operating from the visible through the thermal infrared spectral regions are developed. Methods are developed to compute essential optical and detector parameters, signal-to-noise ratio, MTF, and figures of merit such as NE delta rho and NE delta T. A set of atmospheric tables is provided to determine scene radiance in the visible spectral region. The Planck function is used to determine radiance in the infrared. The equations developed were incorporated in a spreadsheet so that a wide variety of sensor studies can be conducted rapidly and efficiently.
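
    The infrared part of the radiometry rests on the Planck function. A minimal sketch of the radiance computation follows (standard physics, not the report's spreadsheet):

```python
import math

# Blackbody spectral radiance from the Planck function (wavelength form,
# SI units: W · m^-2 · sr^-1 · m^-1), as used for infrared scene radiance.
H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
K = 1.380649e-23     # Boltzmann constant, J/K

def planck_radiance(wavelength_m: float, temperature_k: float) -> float:
    """Spectral radiance B(lambda, T) of a blackbody."""
    x = H * C / (wavelength_m * K * temperature_k)
    return (2.0 * H * C**2) / (wavelength_m**5 * (math.exp(x) - 1.0))

# Example: a 300 K scene at 10 micrometers, near the peak of terrestrial
# thermal emission.
b = planck_radiance(10e-6, 300.0)
print(f"B(10 um, 300 K) = {b:.4e} W m^-2 sr^-1 m^-1")
```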

  14. MPQC: Performance Analysis and Optimization

    SciTech Connect

    Sarje, Abhinav; Williams, Samuel; Bailey, David

    2013-01-24

    MPQC (Massively Parallel Quantum Chemistry) is a widely used computational quantum chemistry code. It is capable of performing a number of computations commonly occurring in quantum chemistry. In order to achieve better performance of MPQC, in this report we present a detailed performance analysis of this code. We then perform loop and memory access optimizations, and measure performance improvements by comparing the performance of the optimized code with that of the original MPQC code. We observe that the optimized MPQC code achieves a significant improvement in the performance through a better utilization of vector processing and memory hierarchies.

  15. Lidar performance analysis

    NASA Technical Reports Server (NTRS)

    Spiers, Gary D.

    1994-01-01

    Section 1 details the theory used to build the lidar model, provides results of using the model to evaluate AEOLUS instrument designs, and provides snapshots of the visual appearance of the coded model. Appendix A contains a Fortran program to calculate various forms of the refractive index structure function. This program was used to determine the refractive index structure function used in the main lidar simulation code. Appendix B contains a memo on the optimization of the lidar telescope geometry for a line-scan geometry. Appendix C contains the code for the main lidar simulation and brief instructions on running the code. Appendix D contains a Fortran code to calculate the maximum permissible exposure for the eye from the ANSI Z136.1-1992 eye safety standards. Appendix E contains a paper on the eye safety analysis of a space-based coherent lidar presented at the 7th Coherent Laser Radar Applications and Technology Conference, Paris, France, 19-23 July 1993.

  16. French adults' cognitive performance after daily supplementation with antioxidant vitamins and minerals at nutritional doses: a post hoc analysis of the Supplementation in Vitamins and Mineral Antioxidants (SU.VI.MAX) trial.

    PubMed

    Kesse-Guyot, Emmanuelle; Fezeu, Léopold; Jeandel, Claude; Ferry, Monique; Andreeva, Valentina; Amieva, Hélène; Hercberg, Serge; Galan, Pilar

    2011-09-01

    Antioxidant properties of some vitamins and trace elements may help to prevent cognitive decline. The aim of the current study was to estimate the long-term effects of antioxidant nutrient supplementation on the cognitive performance of participants in the Supplementation in Vitamins and Mineral Antioxidants (SU.VI.MAX) study 6 y after the end of the trial. This study included 4447 French participants aged 45-60 y who were enrolled in the SU.VI.MAX study (1994-2002), which was a double-blind, placebo-controlled, randomized trial. From 1994 to 2002, participants received daily vitamin C (120 mg), β-carotene (6 mg), vitamin E (30 mg), selenium (100 μg), and zinc (20 mg) in combination or as a placebo. In 2007-2009, the cognitive performance of participants was assessed with 4 neuropsychological tests (6 tasks). Principal components analysis (PCA) was performed to identify cognitive-function summary scores. Associations between antioxidant supplementation and cognitive functions, in the full sample and by subgroups, were estimated through ANOVA and expressed as mean differences and 95% CIs. Subgroup analyses were performed according to baseline characteristics. Subjects receiving active antioxidant supplementation had better episodic memory scores (mean difference: 0.61; 95% CI: 0.02, 1.20). PCA indicated 2 factors that were interpreted as showing verbal memory and executive functioning. Verbal memory was improved by antioxidant supplementation only in subjects who were nonsmokers or who had low serum vitamin C concentrations at baseline. This study supports the role of an adequate antioxidant nutrient status in the preservation of verbal memory under certain conditions. This trial was registered at clinicaltrials.gov as NCT00272428.

  17. Techniques for Automated Performance Analysis

    SciTech Connect

    Marcus, Ryan C.

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  18. Causal analysis of academic performance.

    PubMed

    Rao, D C; Morton, N E; Elston, R C; Yee, S

    1977-03-01

    Maximum likelihood methods are presented to test for the relations between causes and effects in linear path diagrams, without assuming that estimates of causes are free of error. Causal analysis is illustrated by published data of the Equal Educational Opportunity Survey, which show that American schools do not significantly modify socioeconomic differences in academic performance and that little of the observed racial difference in academic performance is causal. For two races differing by 15 IQ points, the differential if social class were randomized would be only about 3 points. The principle is stressed that a racial effect in a causal system may be environmental and that its etiology can be studied only by analysis of family resemblance in hybrid populations.

  19. Scalable Performance Measurement and Analysis

    SciTech Connect

    Gamblin, Todd

    2009-01-01

    Concurrency levels in large-scale, distributed-memory supercomputers are rising exponentially. Modern machines may contain 100,000 or more microprocessor cores, and the largest of these, IBM's Blue Gene/L, contains over 200,000 cores. Future systems are expected to support millions of concurrent tasks. In this dissertation, we focus on efficient techniques for measuring and analyzing the performance of applications running on very large parallel machines. Tuning the performance of large-scale applications can be a subtle and time-consuming task because application developers must measure and interpret data from many independent processes. While the volume of the raw data scales linearly with the number of tasks in the running system, the number of tasks is growing exponentially, and data for even small systems quickly becomes unmanageable. Transporting performance data from so many processes over a network can perturb application performance and make measurements inaccurate, and storing such data would require a prohibitive amount of space. Moreover, even if it were stored, analyzing the data would be extremely time-consuming. In this dissertation, we present novel methods for reducing performance data volume. The first draws on multi-scale wavelet techniques from signal processing to compress systemwide, time-varying load-balance data. The second uses statistical sampling to select a small subset of running processes to generate low-volume traces. A third approach combines sampling and wavelet compression to stratify performance data adaptively at run-time and to reduce further the cost of sampled tracing. We have integrated these approaches into Libra, a toolset for scalable load-balance analysis. We present Libra and show how it can be used to analyze data from large scientific applications scalably.
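
    The statistical-sampling idea, estimating a system-wide metric from a small subset of processes instead of collecting data from every task, can be sketched as follows. The load values are synthetic, and this is a generic illustration, not Libra's implementation:

```python
import random
import statistics

# Synthetic per-process "load" measurements for a large run; in a real
# tool these would come from timers on each MPI rank. Values illustrative.
random.seed(42)
n_procs = 100_000
loads = [random.gauss(1.0, 0.1) for _ in range(n_procs)]

# Instead of transporting all 100,000 values, sample a small subset of
# ranks and estimate the mean load with a 95% confidence half-width.
sample_size = 400
sample = random.sample(loads, sample_size)

mean_est = statistics.fmean(sample)
stdev = statistics.stdev(sample)
half_width = 1.96 * stdev / sample_size ** 0.5  # normal approximation

true_mean = statistics.fmean(loads)
print(f"estimate {mean_est:.4f} ± {half_width:.4f}, true {true_mean:.4f}")
```

    The data volume here drops by a factor of 250 while the estimate stays within a small, quantifiable confidence interval, which is the trade-off sampled tracing exploits.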

  20. Software Performs Complex Design Analysis

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and the prediction of failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: 1) inadequate shape parameterization algorithms, and 2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its revolutionary, one-of-a-kind arbitrary shape deformation (ASD) capability-a major advancement in solving these two aforementioned problems-to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing the CFD designers to freely create their own shape parameters, therefore eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation. This eliminates the extremely costly process of having to remesh the grid for every shape change desired. The program can perform a design change in a markedly reduced amount of time, a process that would traditionally involve the designer returning to the CAD model to reshape and then remesh the shapes, something that has been known to take hours, days-even weeks or months-depending upon the size of the model.

  1. Stage Separation Performance Analysis Project

    NASA Technical Reports Server (NTRS)

    Chen, Yen-Sen; Zhang, Sijun; Liu, Jiwen; Wang, Ten-See

    2001-01-01

    Stage separation is an important phenomenon in multi-stage launch vehicle operation. The transient flowfield coupled with the multi-body system is a challenging problem in design analysis. The thermodynamic environment with burning propellants during the upper-stage engine start in the separation process adds to the complexity of the entire system. Understanding the underlying flow physics and vehicle dynamics during stage separation is required in designing a multi-stage launch vehicle with good flight performance. A computational fluid dynamics model with the capability to couple transient multi-body dynamics systems will be a useful tool for simulating the effects of transient flowfield, plume/jet heating and vehicle dynamics. A computational model using a generalized mesh system will be used as the basis of this development. The multi-body dynamics system will be solved by integrating a system of six-degree-of-freedom equations of motion with high accuracy. Multi-body mesh systems and their interactions will be modeled using parallel computing algorithms. An adaptive mesh refinement method will also be employed to enhance solution accuracy in the transient process.

  2. SEP thrust subsystem performance sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Atkins, K. L.; Sauer, C. G., Jr.; Kerrisk, D. J.

    1973-01-01

    This is a two-part report on solar electric propulsion (SEP) performance sensitivity analysis. The first part describes the preliminary analysis of the SEP thrust system performance for an Encke rendezvous mission. A detailed description of thrust subsystem hardware tolerances on mission performance is included together with nominal spacecraft parameters based on these tolerances. The second part describes the method of analysis and graphical techniques used in generating the data for Part 1. Included is a description of both the trajectory program used and the additional software developed for this analysis. Part 2 also includes a comprehensive description of the use of the graphical techniques employed in this performance analysis.

  3. Adaptive Optics Communications Performance Analysis

    NASA Technical Reports Server (NTRS)

    Srinivasan, M.; Vilnrotter, V.; Troy, M.; Wilson, K.

    2004-01-01

    The performance improvement obtained through the use of adaptive optics for deep-space communications in the presence of atmospheric turbulence is analyzed. Using simulated focal-plane signal-intensity distributions, uncoded pulse-position modulation (PPM) bit-error probabilities are calculated assuming the use of an adaptive focal-plane detector array as well as an adaptively sized single detector. It is demonstrated that current practical adaptive optics systems can yield performance gains over an uncompensated system ranging from approximately 1 dB to 6 dB depending upon the PPM order and background radiation level.
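
    A sketch of how uncoded PPM symbol-error rates can be estimated by Monte Carlo under Poisson photon counting; the order, signal, and background counts below are illustrative choices, not the paper's simulated focal-plane distributions:

```python
import numpy as np

# Monte Carlo sketch of uncoded M-ary PPM hard-decision detection with
# Poisson photon counting. Ks/Kb are mean signal/background counts per
# slot; all parameter values are illustrative.
rng = np.random.default_rng(1)
M = 16                 # PPM order: one pulsed slot out of M per symbol
Ks, Kb = 5.0, 1.0      # mean signal and background counts per slot
trials = 20_000

signal = rng.poisson(Ks + Kb, size=trials)        # pulsed-slot counts
noise = rng.poisson(Kb, size=(trials, M - 1))     # background-only slots

# Hard decision: pick the slot with the most counts. Count a symbol error
# whenever some noise slot ties or beats the signal slot (pessimistic ties).
errors = np.count_nonzero(noise.max(axis=1) >= signal)
ser = errors / trials
print(f"symbol error rate ≈ {ser:.4f}")
```

    Rerunning the estimate with a larger effective Ks mimics the gain from adaptive-optics compensation concentrating more signal photons onto the detector.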

  4. Model Performance Evaluation and Scenario Analysis (MPESA)

    EPA Pesticide Factsheets

    Model Performance Evaluation and Scenario Analysis (MPESA) assesses how well models predict time series data. The tool was developed for the Hydrological Simulation Program-Fortran (HSPF) and the Stormwater Management Model (SWMM).

  5. Kuipers performs Water Sample Analysis

    NASA Image and Video Library

    2012-05-15

    ISS031-E-084619 (15 May 2012) --- After collecting samples from the Water Recovery System (WRS), European Space Agency astronaut Andre Kuipers, Expedition 31 flight engineer, processes the samples for chemical and microbial analysis in the Unity node of the International Space Station.

  6. Structural-Thermal-Optical-Performance (STOP) Analysis

    NASA Technical Reports Server (NTRS)

    Bolognese, Jeffrey; Irish, Sandra

    2015-01-01

    The presentation will be given at the 26th Annual Thermal Fluids Analysis Workshop (TFAWS 2015), hosted by the Goddard Space Flight Center (GSFC) Thermal Engineering Branch (Code 545). A STOP analysis is a multidiscipline analysis, consisting of structural, thermal, and optical performance analyses, that is performed for all space flight instruments and satellites. This course will explain the different parts of performing this analysis. The student will learn how to interact effectively with each discipline in order to obtain accurate system analysis results.

  7. Performance of statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Davis, R. F.; Hines, D. E.

    1973-01-01

    Statistical energy analysis (SEA) methods have been developed for high frequency modal analyses on random vibration environments. These SEA methods are evaluated by comparing analytical predictions to test results. Simple test methods are developed for establishing SEA parameter values. Techniques are presented, based on the comparison of the predictions with test values, for estimating SEA accuracy as a function of frequency for a general structure.

  8. Echo Ranging/Probe Alert Performance Analysis.

    DTIC Science & Technology

    1982-11-04

    contract included technical analyses of acoustic communication equipment, system performance predictions, sea test design and data analysis, and...proposing functional system design alternatives. 2.0 SUMMARY OF WORK PERFORMED The JAYCOR effort focused on the analysis of the Echo Ranging/Probe Alert...JAYCOR Document No. J640-020-82-2242, 16 August 1982, CONFIDENTIAL. 13. Probe Alert Design System Performance Estimates (U), J.L. Collins, JAYCOR Document

  9. Performance Analysis of GYRO: A Tool Evaluation

    SciTech Connect

    Worley, P.; Roth, P.; Candy, J.; Shan, Hongzhang; Mahinthakumar,G.; Sreepathi, S.; Carrington, L.; Kaiser, T.; Snavely, A.; Reed, D.; Zhang, Y.; Huck, K.; Malony, A.; Shende, S.; Moore, S.; Wolf, F.

    2005-06-26

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wall clock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

  10. A Perspective on DSN System Performance Analysis

    NASA Technical Reports Server (NTRS)

    Pham, Timothy T.

    2006-01-01

    This paper discusses the performance analysis effort being carried out in the NASA Deep Space Network. The activity involves root cause analysis of failures and assessment of key performance metrics. The root cause analysis helps pinpoint the true cause of observed problems so that proper correction can be effected. The assessment currently focuses on three aspects: (1) data delivery metrics such as Quantity, Quality, Continuity, and Latency; (2) link-performance metrics such as antenna pointing, system noise temperature, Doppler noise, frequency and time synchronization, wide-area-network loading, link-configuration setup time; and (3) reliability, maintainability, availability metrics. The analysis establishes whether the current system is meeting its specifications and if so, how much margin is available. The findings help identify the weak points in the system and direct attention of programmatic investment for performance improvement.

  11. Building America Performance Analysis Procedures: Revision 1

    SciTech Connect

    2004-06-01

    To measure progress toward multi-year research goals, cost and performance trade-offs are evaluated through a series of controlled field and laboratory experiments supported by energy analysis techniques using test data to calibrate simulation models.

  12. Paramedir: A Tool for Programmable Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers in order to be transferred to novice users.

  13. Architecture Analysis of High Performance Capacitors (POSTPRINT)

    DTIC Science & Technology

    2009-07-01

    includes the measurement of heat dissipated from a recently developed fluorenyl polyester (FPE) capacitor under an AC excitation. II. Capacitor ...AFRL-RZ-WP-TP-2010-2100 ARCHITECTURE ANALYSIS OF HIGH PERFORMANCE CAPACITORS (POSTPRINT) Hiroyuki Kosai and Tyler Bixel UES, Inc...2009 4. TITLE AND SUBTITLE ARCHITECTURE ANALYSIS OF HIGH PERFORMANCE CAPACITORS (POSTPRINT) 5a. CONTRACT NUMBER In-house 5b. GRANT NUMBER 5c

  14. A Performance Approach to Job Analysis.

    ERIC Educational Resources Information Center

    Folsom, Al

    2001-01-01

    Discussion of performance technology and training evaluation focuses on a job analysis process in the Coast Guard. Topics include problems with low survey response rates; costs; the need for appropriate software; discussions with stakeholders and subject matter experts; and maximizing worthy performance. (LRW)

  15. Conducting a Customer-Focused Performance Analysis.

    ERIC Educational Resources Information Center

    Grant, David A.; Moseley, James L.

    1999-01-01

    Explains how to conduct an organization's performance analysis that focuses on customer needs by identifying the desired state, determining the current state, and identifying the current or predicted gap in performance. Considers the organization's mission, a vision or strategic plan, the organization's cultural values, and organizational goals.…

  17. Analysis of Performance Variation Using Query Expansion.

    ERIC Educational Resources Information Center

    Alemayehu, Nega

    2003-01-01

    Discussion of information retrieval performance evaluation focuses on a case study using a statistical repeated measures analysis of variance for testing the significance of factors, such as retrieval method and topic in retrieval performance variation. Analyses of the effect of query expansion on document ranking confirm that expansion affects…

  19. Performance optimisations for distributed analysis in ALICE

    NASA Astrophysics Data System (ADS)

    Betev, L.; Gheata, A.; Gheata, M.; Grigoras, C.; Hristov, P.

    2014-06-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlation or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system still has to be improved by a significant factor to satisfy the analysis needs. We have instrumented all analysis jobs with "sensors" collecting comprehensive monitoring information on the job running conditions and performance in order to identify bottlenecks in the data processing flow. These data are collected by the MonALISA-based ALICE Grid monitoring system and are used to steer and improve the job submission and management policy, to identify operational problems in real time and to perform automatic corrective actions. In parallel with an upgrade of our production system, we are aiming for low-level improvements related to data format, data management and merging of results to allow for better-performing ALICE analysis.
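
    The "sensor" idea described above can be sketched as a simple job classifier: flag each job as I/O-bound or CPU-bound from its CPU-to-wall-time ratio. The job records and the 0.8 threshold below are made up for illustration; this is not MonALISA data or the ALICE framework.

```python
# Hypothetical per-job monitoring records (seconds of CPU vs. wall time).
JOBS = [
    {"id": "job-1", "cpu_s": 3400.0, "wall_s": 3600.0},  # mostly computing
    {"id": "job-2", "cpu_s": 900.0,  "wall_s": 3600.0},  # mostly waiting on data
    {"id": "job-3", "cpu_s": 1200.0, "wall_s": 3500.0},
]

def classify(job, cpu_bound_threshold=0.8):
    """Label a job by how much of its wall time was spent on the CPU."""
    ratio = job["cpu_s"] / job["wall_s"]
    return "cpu-bound" if ratio >= cpu_bound_threshold else "io-bound"

labels = {j["id"]: classify(j) for j in JOBS}
print(labels)  # → {'job-1': 'cpu-bound', 'job-2': 'io-bound', 'job-3': 'io-bound'}
```

    A real monitoring system would collect many more signals (read rates, stage-in latency, merge times), but the same thresholding idea steers submission policy.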

  20. Interfacing Computer Aided Parallelization and Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Biegel, Bryan A. (Technical Monitor)

    2003-01-01

    When porting sequential applications to parallel computer architectures, the program developer will typically go through several cycles of source code optimization and performance analysis. We have started a project to develop an environment where the user can jointly navigate through program structure and performance data information in order to make efficient optimization decisions. In a prototype implementation we have interfaced the CAPO computer aided parallelization tool with the Paraver performance analysis tool. We describe both tools and their interface and give an example for how the interface helps within the program development cycle of a benchmark code.

  1. Massive Contingency Analysis with High Performance Computing

    SciTech Connect

    Huang, Zhenyu; Chen, Yousu; Nieplocha, Jaroslaw

    2009-07-26

    Contingency analysis is a key function in the Energy Management System (EMS) to assess the impact of various combinations of power system component failures based on state estimates. Contingency analysis is also extensively used in power market operation for feasibility tests of market solutions. Faster analysis of more cases is required to safely and reliably operate today's power grids with smaller margins and more intermittent renewable energy sources. Enabled by the latest developments in the computer industry, high performance computing holds the promise of meeting this need in the power industry. This paper investigates the potential of high performance computing for massive contingency analysis. The framework of "N-x" contingency analysis is established, and computational load balancing schemes are studied and implemented with high performance computers. Case studies of massive 300,000-contingency-case analysis using the Western Electricity Coordinating Council power grid model are presented to illustrate the application of high performance computing and demonstrate the performance of the framework and computational load balancing schemes.
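
    The "N-1" screening and load-balancing pattern can be sketched at toy scale: each contingency case (one line outage) is an independent evaluation, so a worker pool can balance cases across workers. The 5-line system and the trivial capacity check below are hypothetical; the paper's framework uses full AC state estimates and the WECC model.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical 5-line system: line name -> transfer capacity (MW).
LINES = {"L1": 300, "L2": 250, "L3": 200, "L4": 150, "L5": 100}
DEMAND = 750  # MW that must remain deliverable after any single outage

def n_minus_1_case(outage):
    """Evaluate one contingency: trip `outage`, check remaining capacity."""
    remaining = sum(cap for name, cap in LINES.items() if name != outage)
    return outage, remaining >= DEMAND  # True = secure, False = violation

def screen(cases, workers=2):
    # Load balancing: the executor hands each case to whichever worker is idle.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(n_minus_1_case, cases))

results = screen(list(LINES))
violations = sorted(k for k, ok in results.items() if not ok)
print(violations)  # → ['L1']  (losing the largest line leaves too little capacity)
```

    Scaling this to 300,000 cases is mostly a scheduling problem, which is why load-balancing schemes dominate the paper's framework.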

  2. Comparative performance analysis of mobile displays

    NASA Astrophysics Data System (ADS)

    Safaee-Rad, Reza; Aleksic, Milivoje

    2012-01-01

    Cell-phone display performance (in terms of color quality and optical efficiency) has become a critical factor in creating a positive user experience. As a result, there is a significant amount of effort by cell-phone OEMs to provide a more competitive display solution. This effort is focused on using different display technologies (with significantly different color characteristics) and more sophisticated display processors. In this paper, the results of a mobile-display comparative performance analysis are presented. Three cell-phones from major OEMs are selected and their display performances are measured and quantified. Comparative performance analysis is done using display characteristics such as display color gamut size, RGB-channels crosstalk, RGB tone responses, gray tracking performance, color accuracy, and optical efficiency.

  3. Integrating Reliability Analysis with a Performance Tool

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

    1995-01-01

    A large number of commercial simulation tools support performance oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production quality simulation based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

  4. Comprehensive analysis of transport aircraft flight performance

    NASA Astrophysics Data System (ADS)

    Filippone, Antonio

    2008-04-01

    This paper reviews the state-of-the-art in comprehensive performance codes for fixed-wing aircraft. The importance of system analysis in flight performance is discussed. The paper highlights the role of aerodynamics, propulsion, flight mechanics, aeroacoustics, flight operation, numerical optimisation, stochastic methods and numerical analysis. The latter discipline is used to investigate the sensitivities of the sub-systems to uncertainties in critical state parameters or functional parameters. The paper critically discusses the data used for performance analysis, and the areas where progress is required. Comprehensive analysis codes can be used for mission fuel planning, envelope exploration, competition analysis, a wide variety of environmental studies, marketing analysis, aircraft certification and conceptual aircraft design. A comprehensive program that uses the multi-disciplinary approach for transport aircraft is presented. The model includes a geometry deck, a separate engine input deck with the main parameters, a database of engine performance from an independent simulation, and an operational deck. The comprehensive code has modules for deriving the geometry from bitmap files, an aerodynamics model for all flight conditions, a flight mechanics model for flight envelopes and mission analysis, an aircraft noise model and engine emissions. The model is validated at different levels. Validation of the aerodynamic model is done against the scale models DLR-F4 and F6. A general model analysis and flight envelope exploration are shown for the Boeing B-777-300 with GE-90 turbofan engines with intermediate passenger capacity (394 passengers in 2 classes). Validation of the flight model is done by sensitivity analysis on the wetted area (or profile drag), on the specific air range, the brake-release gross weight and the aircraft noise. A variety of results is shown, including specific air range charts, take-off weight-altitude charts, payload-range performance
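
    The mission-fuel and specific-air-range trades mentioned above can be illustrated with the classical Breguet range equation, R = (V/c)(L/D) ln(W1/W2). The numbers below are made-up B777-class magnitudes, not outputs of the code under review.

```python
import math

def breguet_range_km(v_ms, tsfc_per_s, l_over_d, w_start, w_end):
    """Jet cruise range: R = (V / c) * (L/D) * ln(W1 / W2), returned in km."""
    return (v_ms / tsfc_per_s) * l_over_d * math.log(w_start / w_end) / 1000.0

# Hypothetical long-haul cruise leg.
R = breguet_range_km(v_ms=250.0,         # cruise true airspeed, m/s
                     tsfc_per_s=1.5e-4,  # thrust-specific fuel consumption, 1/s
                     l_over_d=18.0,      # cruise lift-to-drag ratio
                     w_start=290e3,      # start-of-cruise weight, kg
                     w_end=223e3)        # end-of-cruise weight, kg
print(round(R), "km")
```

    A comprehensive code replaces each constant here (L/D, TSFC, weights) with full sub-models, which is exactly where the sensitivity analyses in the paper apply.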

  5. Shuttle/TDRSS communications system performance analysis

    NASA Technical Reports Server (NTRS)

    Braun, W. R.

    1980-01-01

    The results of a performance analysis of the Shuttle/Tracking and Data Relay Satellite System (TDRSS) communications system are presented. The existing Shuttle/TDRSS link simulation programs were modified and refined to model the post-radio-frequency-interference TDRS hardware and to evaluate the performance degradation due to RFI effects. The refined link models were then used to determine, evaluate and assess expected S-band and Ku-band link performance. Parameterization results are presented for the ground station carrier and timing recovery circuits.

  6. Performance analysis of LAN bridges and routers

    NASA Technical Reports Server (NTRS)

    Hajare, Ankur R.

    1991-01-01

    Bridges and routers are used to interconnect Local Area Networks (LANs). The performance of these devices is important since they can become bottlenecks in large multi-segment networks. Performance metrics and test methodology for bridges and routers were not standardized. Performance data reported by vendors is not applicable to the actual scenarios encountered in an operational network. However, vendor-provided data can be used to calibrate models of bridges and routers that, along with other models, yield performance data for a network. Several tools are available for modeling bridges and routers - Network II.5 was used. The results of the analysis of some bridges and routers are presented.

  7. Using Covariance Analysis to Assess Pointing Performance

    NASA Technical Reports Server (NTRS)

    Bayard, David; Kang, Bryan

    2009-01-01

    A Pointing Covariance Analysis Tool (PCAT) has been developed for evaluating the expected performance of the pointing control system for NASA's Space Interferometry Mission (SIM). The SIM pointing control system is very complex, consisting of multiple feedback and feedforward loops, and operating with multiple latencies and data rates. The SIM pointing problem is particularly challenging due to the effects of thermomechanical drifts in concert with the long camera exposures needed to image dim stars. Other pointing error sources include sensor noises, mechanical vibrations, and errors in the feedforward signals. PCAT models the effects of finite camera exposures and all other error sources using linear system elements. This allows the pointing analysis to be performed using linear covariance analysis. PCAT propagates the error covariance using a Lyapunov equation associated with time-varying discrete and continuous-time system matrices. Unlike Monte Carlo analysis, which could involve thousands of computational runs for a single assessment, the PCAT analysis performs the same assessment in a single run. This capability facilitates the analysis of parametric studies, design trades, and "what-if" scenarios for quickly evaluating and optimizing the control system architecture and design.
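
    The single-run covariance propagation idea can be illustrated with the discrete Lyapunov recursion P(k+1) = A P Aᵀ + Q for a generic two-state drift model (attitude error and drift rate). This is a minimal textbook sketch with made-up noise levels, not SIM's multi-loop, multi-rate model.

```python
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

def transpose(X):
    return [[X[j][i] for j in range(2)] for i in range(2)]

def add(X, Y):
    return [[X[i][j] + Y[i][j] for j in range(2)] for i in range(2)]

# Hypothetical model: state = [attitude error, drift rate], 1 s time steps.
A = [[1.0, 1.0],
     [0.0, 1.0]]        # drift integrates into attitude each step
Q = [[1e-4, 0.0],
     [0.0, 1e-6]]       # per-step process noise (sensor/vibration floor)

P = [[0.0, 0.0], [0.0, 0.0]]  # start perfectly known
for _ in range(100):          # one "camera exposure" of 100 steps
    P = add(matmul(matmul(A, P), transpose(A)), Q)

print(round(P[0][0] ** 0.5, 4))  # 1-sigma pointing error after the exposure
```

    One pass of this recursion replaces thousands of Monte Carlo runs, which is the efficiency argument the abstract makes for PCAT.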

  8. Analysis of ultra-triathlon performances.

    PubMed

    Lepers, Romuald; Knechtle, Beat; Knechtle, Patrizia; Rosemann, Thomas

    2011-01-01

    Despite increased interest in ultra-endurance events, little research has examined ultra-triathlon performance. The aims of this study were: (i) to compare swimming, cycling, running, and overall performances in three ultra-distance triathlons, double Ironman distance triathlon (2IMT) (7.6 km swimming, 360 km cycling, and 84.4 km running), triple Ironman distance triathlon (3IMT) (11.4 km, 540 km, and 126.6 km), and deca Ironman distance triathlon (10IMT) (38 km, 1800 km, and 420 km) and (ii) to examine the relationships between the 2IMT, 3IMT, and 10IMT performances to create predictive equations for the 10IMT performances. Race results from 1985 through 2009 were examined to identify triathletes who performed the three considered ultra-distances. In total, 73 triathletes (68 men and 5 women) were identified. The contribution of swimming to overall ultra-triathlon performance was lower than for cycling and running. Running performance was more important to overall performance for 2IMT and 3IMT compared with 10IMT. The 2IMT and 3IMT performances were significantly correlated with 10IMT performances for swimming and cycling, but not for running. 10IMT total time performance might be predicted by the following equation: 10IMT race time (minutes) = 5885 + 3.69 × 3IMT race time (minutes). This analysis of human performance during ultra-distance triathlons represents a unique data set in the field of ultra-endurance events. Additional studies are required to determine the physiological and psychological factors associated with ultra-triathlon performance.
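
    The reported regression can be applied directly. The 40-hour triple-Ironman finish used below is an illustrative input, not a case from the study.

```python
def predict_10imt_minutes(t3_minutes):
    """Predicted deca-Ironman time from the study's regression:
    10IMT (min) = 5885 + 3.69 * 3IMT (min)."""
    return 5885 + 3.69 * t3_minutes

t3 = 40 * 60                       # hypothetical 40 h 3IMT finish, in minutes
t10 = predict_10imt_minutes(t3)
print(round(t10 / 60 / 24, 2))     # predicted deca-Ironman time in days
```

    The large intercept (5885 min, roughly four days) reflects how much of a deca-Ironman is not explained by shorter-distance speed.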

  9. US U-25 channel performance analysis

    SciTech Connect

    Doss, E.; Pan, Y. C.

    1980-07-01

    The results of an ANL computational analysis of the performance of the US U-25 MHD channel are presented. This channel has gone through several revisions. The major revision occurred after it had been decided by the DOE Office of MHD to operate the channel with platinum-clad copper electrodes (cold), rather than with ceramic electrodes (hot), as originally planned. This work has been performed at the request of the DOE Office of MHD and the US U-25 generator design Review Committee. The channel specifications and operating conditions are presented. The combustor temperature and thermodynamic and electrical properties of the plasma are computed, and the results are discussed. The MHD channel performance has been predicted for different operating conditions. Sensitivity studies have also been performed on the effects of mass flow rate, surface roughness, combustor temperatures, and loading on the channel performance.

  10. Analysis of driver performance under reduced visibility

    NASA Technical Reports Server (NTRS)

    Kaeppler, W. D.

    1982-01-01

    Mathematical models describing vehicle dynamics as well as human behavior may be useful in evaluating driver performance and in establishing design criteria for vehicles more compatible with man. In 1977, a two level model of driver steering behavior was developed, but its parameters were identified for clear visibility conditions only. Since driver performance degrades under conditions of reduced visibility, e.g., fog, the two level model should be investigated to determine its applicability to such conditions. The data analysis of a recently performed driving simulation experiment showed that the model still performed reasonably well under fog conditions, although there was a degradation in its predictive capacity during fog. Some additional parameters affecting anticipation and lag time may improve the model's performance for reduced visibility conditions.

  11. Using Ratio Analysis to Evaluate Financial Performance.

    ERIC Educational Resources Information Center

    Minter, John; And Others

    1982-01-01

    The ways in which ratio analysis can help in long-range planning, budgeting, and asset management to strengthen financial performance and help avoid financial difficulties are explained. Types of ratios considered include balance sheet ratios, net operating ratios, and contribution and demand ratios. (MSE)
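
    Two of the balance sheet ratios mentioned can be computed directly; the figures and the specific ratio names below are hypothetical illustrations, not examples from the article.

```python
# Hypothetical year-end figures for a small institution, in $ thousands.
current_assets = 5_200
current_liabilities = 2_000
expendable_net_assets = 9_000
long_term_debt = 6_000

current_ratio = current_assets / current_liabilities      # short-term liquidity
viability_ratio = expendable_net_assets / long_term_debt  # debt coverage

print(current_ratio, viability_ratio)  # → 2.6 1.5
```

    Tracking such ratios year over year is what lets planners spot financial difficulties before they become acute.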

  12. Probabilistic Analysis of Gas Turbine Field Performance

    NASA Technical Reports Server (NTRS)

    Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.

    2002-01-01

    A gas turbine thermodynamic cycle was computationally simulated and probabilistically evaluated in view of the several uncertainties in the performance parameters, which are indices of gas turbine health. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design, enhance performance, increase system availability and make it cost effective. The analysis leads to the selection of the appropriate measurements to be used in the gas turbine health determination and to the identification of both the most critical measurements and parameters. Probabilistic analysis aims at unifying and improving the control and health monitoring of gas turbine aero-engines by increasing the quality and quantity of information available about the engine's health and performance.
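
    The probabilistic evaluation can be sketched with a simple Monte Carlo over the ideal Brayton-cycle efficiency η = 1 − r^((1−γ)/γ), treating the pressure ratio as a random variable. The textbook relation and the uncertainty bounds are generic illustrations, not the paper's cycle model.

```python
import random

random.seed(1)
GAMMA = 1.4  # ratio of specific heats for air

def thermal_efficiency(pressure_ratio):
    """Ideal Brayton-cycle thermal efficiency."""
    return 1.0 - pressure_ratio ** ((1.0 - GAMMA) / GAMMA)

# Health-related uncertainty: pressure ratio drawn uniformly from [14, 16].
samples = sorted(thermal_efficiency(random.uniform(14.0, 16.0))
                 for _ in range(10_000))

median = samples[5_000]
p05, p95 = samples[500], samples[9_500]
print(round(p05, 3), round(median, 3), round(p95, 3))  # empirical CDF quantiles
```

    The sorted samples are exactly the empirical cumulative distribution function the abstract refers to; sensitivity factors follow from repeating this with each input varied in turn.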

  13. Performance Analysis of Surfing: A Review.

    PubMed

    Farley, Oliver R L; Abbiss, Chris R; Sheppard, Jeremy M

    2017-01-01

    Farley, ORL, Abbiss, CR, and Sheppard, JM. Performance Analysis of Surfing: A Review. J Strength Cond Res 31(1): 260-271, 2017-Despite the increased professionalism and substantial growth of surfing worldwide, there is limited information available to practitioners and coaches in terms of key performance analytics that are common in other field-based sports. Indeed, research analyzing surfing performance is limited to a few studies examining male surfers' heart rates, surfing activities through time-motion analysis (TMA) using video recordings and Global Positioning Satellite (GPS) data during competition and recreational surfing. These studies have indicated that specific activities undertaken during surfing are unique, with a variety of activities (i.e., paddling, resting, wave riding, breath holding, and recovery of surfboard in the surf). Furthermore, environmental and wave conditions also seem to influence the physical demands of competition surfing. It is due to these demands that surfers are required to have a high cardiorespiratory fitness, high muscular endurance, and considerable strength and anaerobic power, particularly within the upper torso. By exploring various methods of performance analysis used within other sports, it is possible to improve our understanding of surfing demands. In so doing, this will assist in the development of protocols and strategies to assess physiological characteristics of surfers, monitor athlete performance, improve training prescription, and identify talent. Therefore, this review explores the current literature to provide insights into methodological protocols, delimitations of research into athlete analysis and an overview of surfing dynamics. Specifically, this review will describe and review the use of TMA, GPS, and other technologies (i.e., HR) that are used in external and internal load monitoring as they pertain to surfing.

  14. Performance analysis and prediction in triathlon.

    PubMed

    Ofoghi, Bahadorreza; Zeleznikow, John; Macmahon, Clare; Rehula, Jan; Dwyer, Dan B

    2016-01-01

    Performance in triathlon is dependent upon factors that include somatotype, physiological capacity, technical proficiency and race strategy. Given the multidisciplinary nature of triathlon and the interaction between each of the three race components, the identification of target split times that can be used to inform the design of training plans and race pacing strategies is a complex task. The present study uses machine learning techniques to analyse a large database of performances in Olympic distance triathlons (2008-2012). The analysis reveals patterns of performance in five components of triathlon (three race "legs" and two transitions) and the complex relationships between performance in each component and overall performance in a race. The results provide three perspectives on the relationship between performance in each component of triathlon and the final placing in a race. These perspectives allow the identification of target split times that are required to achieve a certain final place in a race and the opportunity to make evidence-based decisions about race tactics in order to optimise performance.

  15. NPAC-Nozzle Performance Analysis Code

    NASA Technical Reports Server (NTRS)

    Barnhart, Paul J.

    1997-01-01

    A simple and accurate nozzle performance analysis methodology has been developed. The geometry modeling requirements are minimal and very flexible, thus allowing rapid design evaluations. The solution techniques accurately couple: continuity, momentum, energy, state, and other relations which permit fast and accurate calculations of nozzle gross thrust. The control volume and internal flow analyses are capable of accounting for the effects of: over/under expansion, flow divergence, wall friction, heat transfer, and mass addition/loss across surfaces. The results from the nozzle performance methodology are shown to be in excellent agreement with experimental data for a variety of nozzle designs over a range of operating conditions.

  16. PATHA: Performance Analysis Tool for HPC Applications

    SciTech Connect

    Yoo, Wucherl; Koo, Michelle; Cao, Yi; Sim, Alex; Nugent, Peter; Wu, Kesheng

    2016-02-18

    Large science projects rely on complex workflows to analyze terabytes or petabytes of data. These jobs often run on thousands of CPU cores while simultaneously performing data accesses, data movements, and computation. It is difficult to identify bottlenecks or to debug the performance issues in these large workflows. In order to address these challenges, we have developed the Performance Analysis Tool for HPC Applications (PATHA) using state-of-the-art open source big data processing tools. Our framework can ingest system logs to extract key performance measures, and apply sophisticated statistical tools and data mining methods on the performance data. Furthermore, it utilizes an efficient data processing engine to allow users to interactively analyze a large amount of different types of logs and measurements. To illustrate the functionality of PATHA, we conduct a case study on the workflows from an astronomy project known as the Palomar Transient Factory (PTF). This study processed 1.6 TB of system logs collected on the NERSC supercomputer Edison. Using PATHA, we were able to identify performance bottlenecks, which reside in three tasks of the PTF workflow and depend on the density of celestial objects.
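
    The core ingest-and-aggregate step described above can be sketched by totaling per-task durations from workflow log lines and flagging the heaviest task. The log format and task names below are made up; PATHA's actual pipeline uses a big-data processing engine over NERSC system logs.

```python
from collections import defaultdict

# Hypothetical workflow log: "timestamp task duration_seconds"
LOG = """\
1455800000 image_subtraction 124.0
1455800130 source_extraction 310.5
1455800450 image_subtraction 118.2
1455800600 candidate_matching 612.9
1455801220 source_extraction 295.1
"""

totals = defaultdict(float)
for line in LOG.strip().splitlines():
    _, task, seconds = line.split()
    totals[task] += float(seconds)

# The task consuming the most aggregate time is the bottleneck candidate.
bottleneck = max(totals, key=totals.get)
print(bottleneck, round(totals[bottleneck], 1))  # → candidate_matching 612.9
```

    At PTF scale the same aggregation runs over 1.6 TB of logs, which is why an efficient processing engine matters.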

  17. PATHA: Performance Analysis Tool for HPC Applications

    DOE PAGES

    Yoo, Wucherl; Koo, Michelle; Cao, Yi; ...

    2016-02-18

    Large science projects rely on complex workflows to analyze terabytes or petabytes of data. These jobs often run on thousands of CPU cores while simultaneously performing data accesses, data movements, and computation. It is difficult to identify bottlenecks or to debug the performance issues in these large workflows. In order to address these challenges, we have developed the Performance Analysis Tool for HPC Applications (PATHA) using state-of-the-art open source big data processing tools. Our framework can ingest system logs to extract key performance measures, and apply sophisticated statistical tools and data mining methods on the performance data. Furthermore, it utilizes an efficient data processing engine to allow users to interactively analyze a large amount of different types of logs and measurements. To illustrate the functionality of PATHA, we conduct a case study on the workflows from an astronomy project known as the Palomar Transient Factory (PTF). This study processed 1.6 TB of system logs collected on the NERSC supercomputer Edison. Using PATHA, we were able to identify performance bottlenecks, which reside in three tasks of the PTF workflow and depend on the density of celestial objects.

  18. Multiprocessor smalltalk: Implementation, performance, and analysis

    SciTech Connect

    Pallas, J.I.

    1990-01-01

    Multiprocessor Smalltalk demonstrates the value of object-oriented programming on a multiprocessor. Its implementation and analysis shed light on three areas: concurrent programming in an object oriented language without special extensions, implementation techniques for adapting to multiprocessors, and performance factors in the resulting system. Adding parallelism to Smalltalk code is easy, because programs already use control abstractions like iterators. Smalltalk's basic control and concurrency primitives (lambda expressions, processes and semaphores) can be used to build parallel control abstractions, including parallel iterators, parallel objects, atomic objects, and futures. Language extensions for concurrency are not required. This implementation demonstrates that it is possible to build an efficient parallel object-oriented programming system and illustrates techniques for doing so. Three modification tools (serialization, replication, and reorganization) adapted the Berkeley Smalltalk interpreter to the Firefly multiprocessor. Multiprocessor Smalltalk's performance shows that the combination of multiprocessing and object-oriented programming can be effective: speedups (relative to the original serial version) exceed 2.0 for five processors on all the benchmarks; the median efficiency is 48%. Analysis shows both where performance is lost and how to improve and generalize the experimental results. Changes in the interpreter to support concurrency add at most 12% overhead; better access to per-process variables could eliminate much of that. Changes in the user code to express concurrency add as much as 70% overhead; this overhead could be reduced to 54% if blocks (lambda expressions) were reentrant. Performance is also lost when the program cannot keep all five processors busy.
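
    The speedup and efficiency figures quoted follow from the standard definitions S = T₁/Tₚ and E = S/p. The timings below are illustrative, chosen only to land near the abstract's numbers, not the paper's benchmark data.

```python
def speedup(t_serial, t_parallel):
    """Speedup relative to the serial version: S = T1 / Tp."""
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, processors):
    """Parallel efficiency: E = S / p."""
    return speedup(t_serial, t_parallel) / processors

# Hypothetical benchmark: 100 s serial, 41.7 s on 5 processors.
s = speedup(100.0, 41.7)
e = efficiency(100.0, 41.7, 5)
print(round(s, 2), round(e, 2))  # → 2.4 0.48
```

    An efficiency near 48% on 5 processors means roughly half the machine is lost to the overheads the abstract itemizes (interpreter changes, non-reentrant blocks, idle processors).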

  19. Nominal Performance Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    M. Wasiolek

    2004-09-08

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standard. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA-LA model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the "Biosphere Model Report" in Figure 1-1, contain a detailed description of the model input parameters, their development, and the relationship between the parameters and specific features, events, and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. The objectives of this analysis are to develop BDCFs for the groundwater exposure scenario for the three climate states considered in the TSPA-LA as well as conversion factors for evaluating compliance with the groundwater protection standard. The BDCFs will be used in performance assessment for calculating all-pathway annual doses for a given concentration of radionuclides in groundwater. The conversion factors will be used for calculating gross alpha particle activity in groundwater and the annual dose

  20. Automated Cache Performance Analysis And Optimization

    SciTech Connect

    Mohror, Kathryn

    2013-12-23

    While there is no lack of performance counter tools for coarse-grained measurement of cache activity, there is a critical lack of tools for relating data layout to cache behavior to application performance. Generally, any nontrivial optimizations are either not done at all, or are done "by hand" requiring significant time and expertise. To the best of our knowledge no tool available to users measures the latency of memory reference instructions for particular addresses and makes this information available to users in an easy-to-use and intuitive way. In this project, we worked to enable the Open|SpeedShop performance analysis tool to gather memory reference latency information for specific instructions and memory addresses, and to gather and display this information in an easy-to-use and intuitive way to aid performance analysts in identifying problematic data structures in their codes. This tool was primarily designed for use in the supercomputer domain as well as grid, cluster, cloud-based parallel e-commerce, and engineering systems and middleware. Ultimately, we envision a tool to automate optimization of application cache layout and utilization in the Open|SpeedShop performance analysis tool. To commercialize this software, we worked to develop core capabilities for gathering enhanced memory usage performance data from applications and create and apply novel methods for automatic data structure layout optimizations, tailoring the overall approach to support existing supercomputer and cluster programming models and constraints. In this Phase I project, we focused on infrastructure necessary to gather performance data and present it in an intuitive way to users. With the advent of enhanced Precise Event-Based Sampling (PEBS) counters on recent Intel processor architectures and equivalent technology on AMD processors, we are now in a position to access memory reference information for particular addresses. Prior to the introduction of PEBS counters
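
    The locality effect such tools attribute to specific addresses can be felt even from user space: the sketch below times a sequential index walk against a pointer-chase through a shuffled permutation. This is a crude, made-up proxy for cache behavior, nowhere near PEBS-grade per-instruction latency sampling, and interpreter overhead can mask the gap.

```python
import random
import time

N = 1_000_000
chase = list(range(N))
random.shuffle(chase)  # random permutation -> cache-unfriendly jumps

def walk(order):
    """Chase indices through `order`, timing N dependent loads."""
    start = time.perf_counter()
    i = 0
    for _ in range(N):
        i = order[i]  # next index depends on the current load
    return time.perf_counter() - start

sequential = walk(list(range(1, N)) + [0])  # visits indices in order
scattered = walk(chase)
# Poor locality typically costs extra latency; per-address tools pinpoint
# which data structure's accesses pay that cost.
print(sequential > 0 and scattered > 0)  # → True
```

    A counter-based tool would report only aggregate miss counts for both walks; the project's goal is to attribute the latency of the scattered one to the offending structure.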

  1. Performance management in healthcare: a critical analysis.

    PubMed

    Hewko, Sarah J; Cummings, Greta G

    2016-01-01

    Purpose - The purpose of this paper is to explore the underlying theoretical assumptions and implications of current micro-level performance management and evaluation (PME) practices, specifically within health-care organizations. PME encompasses all activities that are designed and conducted to align employee outputs with organizational goals. Design/methodology/approach - PME, in the context of healthcare, is analyzed through the lens of critical theory. Specifically, Habermas' theory of communicative action is used to highlight some of the questions that arise in looking critically at PME. To provide a richer definition of key theoretical concepts, the authors conducted a preliminary, exploratory hermeneutic semantic analysis of the key words "performance" and "management" and of the term "performance management". Findings - Analysis reveals that existing micro-level PME systems in health-care organizations have the potential to create a workforce that is compliant, dependent, technically oriented and passive, and to support health-care systems in which inequalities and power imbalances are perpetually reinforced. Practical implications - At a time when the health-care system is under increasing pressure to provide high-quality, affordable services with fewer resources, it may be wise to investigate new sector-specific ways of evaluating and managing performance. Originality/value - In this paper, written for health-care leaders and health human resource specialists, the theoretical assumptions and implications of current PME practices within health-care organizations are explored. It is hoped that readers will be inspired to support innovative PME practices within their organizations that encourage peak performance among health-care professionals.

  2. Nominal Performance Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    M.A. Wasiolek

    2003-07-25

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standard. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The "Biosphere Model Report" (BSC 2003 [DIRS 164186]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports (BSC 2003 [DIRS 160964]; BSC 2003 [DIRS 160965]; BSC 2003 [DIRS 160976]; BSC 2003 [DIRS 161239]; BSC 2003 [DIRS 161241]) contain detailed descriptions of the model input parameters. This report describes the biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. The objectives of this analysis are to develop BDCFs and conversion factors for the TSPA. The BDCFs will be used in performance assessment for calculating annual doses for a given concentration of radionuclides in groundwater. The conversion factors will be used for calculating gross alpha particle activity in groundwater and the annual dose from beta- and photon-emitting radionuclides.
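
    The way BDCFs feed the TSPA dose calculation is a simple linear combination of factors and groundwater concentrations. The sketch below uses made-up radionuclides and values purely to illustrate the bookkeeping, not actual ERMYN outputs:

```python
# Illustrative only: annual dose as the sum over radionuclides of
# groundwater concentration times the corresponding BDCF.
def annual_dose(concentrations, bdcfs):
    """concentrations: Bq/L by radionuclide; bdcfs: (mrem/yr)/(Bq/L)."""
    return sum(concentrations[n] * bdcfs[n] for n in concentrations)

bdcfs = {"Tc-99": 0.05, "I-129": 1.2}   # hypothetical values
conc = {"Tc-99": 10.0, "I-129": 0.5}    # hypothetical values
print(annual_dose(conc, bdcfs))  # 10*0.05 + 0.5*1.2 = 1.1 (mrem/yr, illustrative)
```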

  3. IBIS detector performance during calibration - preliminary analysis

    NASA Astrophysics Data System (ADS)

    Bazzano, A.; Bird, A. J.; Laurent, P.; Malaguti, G.; Quadrini, E. M.; Segreto, A.; Volkmer, R.; del Santo, M.; Gabriele, M.; Tikkanen, T.

    2003-11-01

    The IBIS telescope is a high angular resolution gamma-ray imager due to be launched on the INTEGRAL satellite on October 17, 2002. The scientific goal of IBIS is to study astrophysical processes from celestial sources and diffuse regions in the hard X-ray and soft gamma-ray domains. IBIS features a coded aperture imaging system and a novel large-area (~3000 cm²) multilayer pixellated detector which utilises both cadmium telluride (16,384 detectors) and caesium iodide elements (4096 detectors), surrounded by a BGO active veto shield. We present an overview of, and preliminary analysis from, the IBIS calibration campaign. The performance of each pixel has been characterised, and hence the scientific performance of the IBIS detector system as a whole can now be established.

  4. Performance analysis of quantum dots infrared photodetector

    NASA Astrophysics Data System (ADS)

    Liu, Hongmei; Zhang, Fangfang; Zhang, Jianqi; He, Guojing

    2011-08-01

    Performance analysis of the quantum dot infrared photodetector (QDIP), which can provide device designers with theoretical guidance and experimental verification, has aroused wide interest and become a hot research topic in recent years. In this paper, the performance of the QDIP is discussed and summarized in comparison with that of the quantum well infrared photodetector (QWIP), by analyzing the special properties of quantum dot material. Specifically, the dark current density and the detectivity under normal incidence are obtained from the Phillips performance model; the carrier lifetime and the dark current of the QDIP are studied in combination with the "phonon bottleneck" effect; and the detectivity of the QDIP is derived theoretically by considering the photoconductive gain under the influence of the capture probability. From the experimental results, it is concluded that the QDIP not only accepts normally incident light but also offers long carrier lifetime, large photoconductive gain, and low dark current, further illustrating the anticipated performance advantages of the QDIP and its wide future use in many engineering fields.
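
    One concrete figure of merit mentioned above, specific detectivity, can be computed from responsivity and noise current. This is the standard textbook definition, not the paper's model, and the numbers are invented:

```python
import math

# Standard detector figure of merit: specific detectivity
# D* = R * sqrt(A * df) / i_n, where R is responsivity (A/W), A detector
# area (cm^2), df measurement bandwidth (Hz), and i_n the noise current (A).
def detectivity(responsivity, area_cm2, bandwidth_hz, noise_current):
    return responsivity * math.sqrt(area_cm2 * bandwidth_hz) / noise_current

print(detectivity(0.5, 1e-4, 1.0, 1e-12))  # 0.5 * 0.01 / 1e-12 = 5e9 Jones
```

    A lower dark current lowers i_n and so raises D*, which is why the dark-current comparison with the QWIP matters.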

  5. Diversity Performance Analysis on Multiple HAP Networks.

    PubMed

    Dong, Feihong; Li, Min; Gong, Xiangwu; Li, Hongjun; Gao, Fengyue

    2015-06-30

    One of the main design challenges in wireless sensor networks (WSNs) is achieving a high-data-rate transmission for individual sensor devices. The high altitude platform (HAP) is an important communication relay platform for WSNs and next-generation wireless networks. Multiple-input multiple-output (MIMO) techniques provide the diversity and multiplexing gain, which can improve the network performance effectively. In this paper, a virtual MIMO (V-MIMO) model is proposed by networking multiple HAPs with the concept of multiple assets in view (MAV). In a shadowed Rician fading channel, the diversity performance is investigated. The probability density function (PDF) and cumulative distribution function (CDF) of the received signal-to-noise ratio (SNR) are derived. In addition, the average symbol error rate (ASER) with BPSK and QPSK is given for the V-MIMO model. The system capacity is studied for both perfect channel state information (CSI) and unknown CSI individually. The ergodic capacity with various SNR and Rician factors for different network configurations is also analyzed. The simulation results validate the effectiveness of the performance analysis. It is shown that the performance of the HAPs network in WSNs can be significantly improved by utilizing the MAV to achieve overlapping coverage, with the help of the V-MIMO techniques.
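
    The diversity benefit described above can be checked with a small Monte Carlo experiment. The sketch below uses a plain (unshadowed) Rician channel and maximal-ratio combining as a simplified stand-in for the paper's analytical V-MIMO model, with illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Monte Carlo sketch (plain Rician fading, no shadowing, unlike the paper's
# shadowed-Rician analysis): BPSK symbol error rate with L-branch
# maximal-ratio combining, where L plays the role of the number of HAPs.
def bpsk_ser_mrc(snr_db, K, L, n=200_000):
    snr = 10 ** (snr_db / 10)              # average SNR per branch
    los = np.sqrt(K / (K + 1))             # line-of-sight component
    sigma = np.sqrt(1 / (2 * (K + 1)))     # scattered component std (per dim)
    h = los + sigma * (rng.standard_normal((n, L)) + 1j * rng.standard_normal((n, L)))
    noise = (rng.standard_normal((n, L)) + 1j * rng.standard_normal((n, L))) / np.sqrt(2 * snr)
    r = h * 1.0 + noise                    # transmit BPSK symbol +1
    decision = np.real(np.sum(np.conj(h) * r, axis=1))   # MRC combiner
    return float(np.mean(decision < 0))

print(bpsk_ser_mrc(5.0, K=3.0, L=1))   # single platform
print(bpsk_ser_mrc(5.0, K=3.0, L=4))   # 4-HAP virtual MIMO: far fewer errors
```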

  6. Analysis approaches and interventions with occupational performance

    PubMed Central

    Ahn, Sinae

    2016-01-01

    [Purpose] The purpose of this study was to analyze approaches to and interventions for occupational performance in patients with stroke. [Subjects and Methods] Articles published in the past 10 years were searched. The key terms used were “occupational performance AND stroke” and “occupational performance AND CVA”. A total of 252 articles were identified, and 79 articles were selected. All interventions were classified by approach according to six theories, and all interventions were analyzed for frequency. [Results] Regarding the approaches, 25 articles (31.6%) described high-frequency interventions based on the biomechanical approach. These included electrical stimulation therapy, robot therapy, and sensory stimulation training, among others. Analysis of the frequency of interventions revealed that the most commonly used interventions, reported in 18 articles (22.8%), made use of the concept of constraint-induced therapy. [Conclusion] The results of this study suggest an approach for use in clinics for selecting an appropriate intervention for occupational performance. PMID:27799719

  7. Diversity Performance Analysis on Multiple HAP Networks

    PubMed Central

    Dong, Feihong; Li, Min; Gong, Xiangwu; Li, Hongjun; Gao, Fengyue

    2015-01-01

    One of the main design challenges in wireless sensor networks (WSNs) is achieving a high-data-rate transmission for individual sensor devices. The high altitude platform (HAP) is an important communication relay platform for WSNs and next-generation wireless networks. Multiple-input multiple-output (MIMO) techniques provide the diversity and multiplexing gain, which can improve the network performance effectively. In this paper, a virtual MIMO (V-MIMO) model is proposed by networking multiple HAPs with the concept of multiple assets in view (MAV). In a shadowed Rician fading channel, the diversity performance is investigated. The probability density function (PDF) and cumulative distribution function (CDF) of the received signal-to-noise ratio (SNR) are derived. In addition, the average symbol error rate (ASER) with BPSK and QPSK is given for the V-MIMO model. The system capacity is studied for both perfect channel state information (CSI) and unknown CSI individually. The ergodic capacity with various SNR and Rician factors for different network configurations is also analyzed. The simulation results validate the effectiveness of the performance analysis. It is shown that the performance of the HAPs network in WSNs can be significantly improved by utilizing the MAV to achieve overlapping coverage, with the help of the V-MIMO techniques. PMID:26134102

  8. Nominal Performance Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    M.A. Wasiolek

    2005-04-28

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standards. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop BDCFs, which are input parameters for the TSPA-LA model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the "Biosphere Model Report" in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events, and processes (FEPs). This report describes the biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. This analysis receives direct input from the outputs of the "Biosphere Model Report" (BSC 2004 [DIRS 169460]) and the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the "Biosphere Dose Conversion Factor Importance and Sensitivity Analysis" (Figure 1-1). The objectives of this analysis are to develop BDCFs for the

  9. PERFORMANCE ANALYSIS OF MECHANICAL DRAFT COOLING TOWER

    SciTech Connect

    Lee, S; Garrett, A; Bollinger, J; Koffman, L

    2009-02-10

    Industrial processes use mechanical draft cooling towers (MDCTs) to dissipate waste heat by transferring heat from water to air via evaporative cooling, which humidifies the air. The Savannah River Site (SRS) has cross-flow and counter-current MDCTs consisting of four independent compartments called cells. Each cell has its own fan to help maximize heat transfer between ambient air and circulated water. The primary objective of the work was to simulate the performance of the counter-current cooling tower and to conduct a parametric study under different fan speeds and ambient air conditions. To accomplish this, the Savannah River National Laboratory (SRNL) developed a computational fluid dynamics (CFD) model and benchmarked it against integral measurement results. The model uses three-dimensional steady-state momentum and continuity equations, an air-vapor species balance equation, and a two-equation turbulence model as the basic governing equations. It was assumed that the vapor phase is always transported by the continuous air phase with no slip velocity. The water droplet component was treated as a discrete phase for the interfacial heat and mass transfer via a Lagrangian approach. Thus, an air-vapor mixture model with a discrete water droplet phase was used for the analysis. A series of parametric calculations was performed to investigate the impact of wind speed and ambient conditions on the thermal performance of the cooling tower with the fans operating and with them turned off. The model was also benchmarked against literature data and the SRS integral test results for key parameters such as air temperature and humidity at the tower exit and water temperature for given ambient conditions. Detailed results are presented here.
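
    At the opposite end of the fidelity spectrum from the CFD model described above, the basic physics can be sketched as a zero-dimensional energy balance between the water stream and the moist-air stream; all values below are illustrative:

```python
# Zero-dimensional energy-balance sketch (far simpler than the SRNL CFD
# model): heat rejected by the water stream equals the enthalpy gained by
# the moist air stream.
CP_WATER = 4.186  # kJ/(kg*K)

def water_outlet_temp(t_water_in, m_water, m_air, h_air_in, h_air_out):
    """Cooled-water temperature from an air/water energy balance.
    Flows in kg/s; h_air_* are moist-air enthalpies in kJ/kg dry air."""
    q = m_air * (h_air_out - h_air_in)      # kW picked up by the air
    return t_water_in - q / (m_water * CP_WATER)

# 100 kg/s of water entering at 40 C; 80 kg/s of air whose enthalpy
# rises from 60 to 95 kJ/kg across the fill:
print(water_outlet_temp(40.0, 100.0, 80.0, 60.0, 95.0))  # ~33.3 C
```

    The CFD model effectively resolves how fan speed and ambient humidity set the air-side enthalpy rise that this balance takes as an input.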

  10. Idaho National Laboratory Quarterly Performance Analysis

    SciTech Connect

    Mitchell, Lisbeth

    2014-11-01

    This report is published quarterly by the Idaho National Laboratory (INL) Quality and Performance Management Organization. The Department of Energy (DOE) Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of 60 reportable events (23 from the 4th Qtr FY14 and 37 from the prior three reporting quarters) as well as 58 other issue reports (including not reportable events and Significant Category A and B conditions) identified at INL from July 2013 through October 2014. Battelle Energy Alliance (BEA) operates the INL under contract DE-AC07-05ID14517.

  11. Performance Analysis on Fault Tolerant Control System

    NASA Technical Reports Server (NTRS)

    Shin, Jong-Yeob; Belcastro, Christine

    2005-01-01

    In a fault tolerant control (FTC) system, a parameter-varying FTC law is reconfigured based on fault parameters estimated by fault detection and isolation (FDI) modules. FDI modules require some time to detect fault occurrences in aero-vehicle dynamics. In this paper, an FTC analysis framework is provided to calculate the upper bound of an induced L2 norm of an FTC system in the presence of false identification and detection time delay. The upper bound is written as a function of the fault detection time and exponential decay rates, and has been used to determine which FTC law produces less performance degradation (tracking error) due to false identification. The analysis framework is applied to an FTC system for a HiMAT (Highly Maneuverable Aircraft Technology) vehicle. Index Terms: fault tolerant control system, linear parameter varying system, HiMAT vehicle.

  12. SUBSONIC WIND TUNNEL PERFORMANCE ANALYSIS SOFTWARE

    NASA Technical Reports Server (NTRS)

    Eckert, W. T.

    1994-01-01

    This program was developed as an aid in the design and analysis of subsonic wind tunnels. It brings together and refines previously scattered and over-simplified techniques used for the design and loss prediction of the components of subsonic wind tunnels. It implements a system of equations for determining the total pressure losses and provides general guidelines for the design of diffusers, contractions, corners and the inlets and exits of non-return tunnels. The algorithms used in the program are applicable to compressible flow through most closed- or open-throated, single-, double- or non-return wind tunnels or ducts. A comparison between calculated performance and that actually achieved by several existing facilities produced generally good agreement. Any system through which air is flowing which involves turns, fans, contractions etc. (e.g., an HVAC system) may benefit from analysis using this software. This program is an update of ARC-11138 which includes PC compatibility and an improved user interface. The method of loss analysis used by the program is a synthesis of theoretical and empirical techniques. Generally, the algorithms used are those which have been substantiated by experimental test. The basic flow-state parameters used by the program are determined from input information about the reference control section and the test section. These parameters were derived from standard relationships for compressible flow. The local flow conditions, including Mach number, Reynolds number and friction coefficient are determined for each end of each component or section. The loss in total pressure caused by each section is calculated in a form non-dimensionalized by local dynamic pressure. The individual losses are based on the nature of the section, local flow conditions and input geometry and parameter information. 
The loss forms for typical wind tunnel sections considered by the program include: constant area ducts, open throat ducts, contractions, constant
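
    The loss-summation scheme described above, with each section's total-pressure loss expressed as a coefficient times the local dynamic pressure, reduces to a simple sum. The coefficients below are invented placeholders, not values from the program:

```python
# Sketch of the loss-summation idea: each tunnel section contributes a
# total-pressure loss K_i * q_i, where K_i is the section loss coefficient
# and q_i the local dynamic pressure.
def total_pressure_loss(sections):
    """sections: list of (loss coefficient K, local dynamic pressure q [Pa])."""
    return sum(k * q for k, q in sections)

sections = [
    (0.02, 1200.0),   # test section (friction)
    (0.15, 300.0),    # diffuser
    (0.12, 250.0),    # corner with turning vanes
    (0.04, 1500.0),   # contraction
]
print(total_pressure_loss(sections))  # 24 + 45 + 30 + 60 = 159 Pa
```

    The program's value comes from how it derives each K from the local Mach number, Reynolds number, and geometry; the final bookkeeping is this sum.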

  14. Nominal Performance Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    Wasiolek, Maryla A.

    2000-12-21

    The purpose of this report was to document the process leading to development of the Biosphere Dose Conversion Factors (BDCFs) for the postclosure nominal performance of the potential repository at Yucca Mountain. BDCF calculations concerned twenty-four radionuclides. This selection included sixteen radionuclides that may be significant nominal performance dose contributors during the compliance period of up to 10,000 years, five additional radionuclides of importance for up to 1 million years postclosure, and three relatively short-lived radionuclides important for the human intrusion scenario. Radionuclide buildup in soil caused by previous irrigation with contaminated groundwater was taken into account in the BDCF development. The effect of climate evolution, from the current arid conditions to a wetter and cooler climate, on the BDCF values was evaluated. The analysis included consideration of different exposure pathways' contributions to the BDCFs. Calculations of nominal performance BDCFs used the GENII-S computer code in a series of probabilistic realizations to propagate the uncertainties of input parameters into the output. BDCFs for the nominal performance, when combined with the concentrations of radionuclides in groundwater, allow calculation of potential radiation doses to the receptor of interest. Calculated estimates of radionuclide concentration in groundwater result from the saturated zone modeling. The integration of the biosphere modeling results (BDCFs) with the outcomes of the other component models is accomplished in the Total System Performance Assessment (TSPA) to calculate doses to the receptor of interest from radionuclides postulated to be released to the environment from the potential repository at Yucca Mountain.
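
    The probabilistic-realization approach described above, in which uncertain inputs are sampled and propagated to an output distribution, can be sketched as follows. The toy model and its distributions are invented for illustration and bear no relation to the actual GENII-S parameterization:

```python
import random

random.seed(1)

# Toy uncertainty propagation: sample uncertain biosphere parameters,
# evaluate a stand-in dose-factor model per realization, and summarize
# the output distribution. All distributions and names are illustrative.
def one_realization():
    irrigation_rate = random.uniform(0.5, 1.5)      # relative units
    crop_uptake = random.lognormvariate(0.0, 0.3)   # right-skewed factor
    consumption = random.triangular(0.8, 1.2, 1.0)  # relative units
    return irrigation_rate * crop_uptake * consumption  # toy BDCF

realizations = [one_realization() for _ in range(10_000)]
mean = sum(realizations) / len(realizations)
p95 = sorted(realizations)[int(0.95 * len(realizations))]
print(f"mean={mean:.3f}, 95th percentile={p95:.3f}")
```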

  15. Failure analysis of high performance ballistic fibers

    NASA Astrophysics Data System (ADS)

    Spatola, Jennifer S.

    High performance fibers have a high tensile strength and modulus, good wear resistance, and a low density, making them ideal for applications in ballistic impact resistance, such as body armor. However, the observed ballistic performance of these fibers is much lower than the predicted values. Since the predictions assume only tensile stress failure, it is safe to assume that the stress state is affecting fiber performance. The purpose of this research was to determine if there are failure mode changes in the fiber fracture when transversely loaded by indenters of different shapes. An experimental design mimicking transverse impact was used to determine any such effects. Three different indenters were used: round, FSP, and razor blade. The indenter height was changed to change the angle of failure tested. Five high performance fibers were examined: Kevlar® KM2, Spectra® 130d, Dyneema® SK-62 and SK-76, and Zylon® 555. Failed fibers were analyzed using an SEM to determine failure mechanisms. The results show that the round and razor blade indenters produced a constant failure strain, as well as failure mechanisms independent of testing angle. The FSP indenter produced a decrease in failure strain as the angle increased. Fibrillation was the dominant failure mechanism at all angles for the round indenter, while through thickness shearing was the failure mechanism for the razor blade. The FSP indenter showed a transition from fibrillation at low angles to through thickness shearing at high angles, indicating that the round and razor blade indenters are extreme cases of the FSP indenter. The failure mechanisms observed with the FSP indenter at various angles correlated with the experimental strain data obtained during fiber testing. This indicates that geometry of the indenter tip in compression is a contributing factor in lowering the failure strain of the high performance fibers. 
TEM analysis of the fiber failure mechanisms was also attempted, though without

  16. Approaches to Cycle Analysis and Performance Metrics

    NASA Technical Reports Server (NTRS)

    Parson, Daniel E.

    2003-01-01

    The following notes were prepared as part of an American Institute of Aeronautics and Astronautics (AIAA) sponsored short course entitled Air Breathing Pulse Detonation Engine (PDE) Technology. The course was presented in January of 2003, and again in July of 2004 at two different AIAA meetings. It was taught by seven instructors, each of whom provided information on particular areas of PDE research. These notes cover two areas. The first is titled Approaches to Cycle Analysis and Performance Metrics. Here, the various methods of cycle analysis are introduced. These range from algebraic, thermodynamic equations, to single and multi-dimensional Computational Fluid Dynamic (CFD) solutions. Also discussed are the various means by which performance is measured, and how these are applied in a device which is fundamentally unsteady. The second topic covered is titled PDE Hybrid Applications. Here the concept of coupling a PDE to a conventional turbomachinery based engine is explored. Motivation for such a configuration is provided in the form of potential thermodynamic benefits. This is accompanied by a discussion of challenges to the technology.
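
    One standard algebraic comparison used to motivate PDE hybrids (assumed here, not taken from the course notes) contrasts the ideal Brayton cycle with the constant-volume Humphrey cycle at the same compressor temperature ratio:

```python
# Ideal-cycle thermal efficiencies for the same compressor temperature
# ratio T2/T1 and heat-addition temperature ratio T3/T2. The Humphrey
# (constant-volume) cycle models idealized detonative heat addition.
GAMMA = 1.4

def brayton_eff(t2_over_t1):
    return 1.0 - 1.0 / t2_over_t1

def humphrey_eff(t2_over_t1, t3_over_t2, g=GAMMA):
    r = t3_over_t2
    return 1.0 - g * (1.0 / t2_over_t1) * (r ** (1.0 / g) - 1.0) / (r - 1.0)

print(brayton_eff(2.0))        # 0.5
print(humphrey_eff(2.0, 3.0))  # > 0.5: constant-volume heat addition wins
```

    The efficiency gap at modest compression ratios is the usual thermodynamic argument for coupling a PDE to turbomachinery.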

  17. High-resolution SAR ATR performance analysis

    NASA Astrophysics Data System (ADS)

    Douglas, Joel; Burke, Monica; Ettinger, Gil J.

    2004-09-01

    High resolution Synthetic Aperture Radar (SAR) imagery (e.g., four inch or better resolution) contains features not seen in one foot or lower resolution imagery, due to the isolation of the scatterers into separate resolution cells. These features provide the potential for additional discrimination power for Automatic Target Recognition (ATR) systems. In this paper, we analyze the performance of the Real-Time MSTAR (RT-MSTAR) system as a function of image resolution. Performance is measured both in terms of the probability of correct identification on military targets, and also in terms of confuser rejection. The analysis demonstrates two factors that significantly enhance performance. First, use of the high resolution imagery results in much higher probability of correct identification, as demonstrated using Lynx SAR imagery at 4" and 12". Second, incorporating models of the confusers, when available, greatly reduces false alarms, even at higher resolutions. Several new areas of work emerge, including making use of higher-level feature information available in the imagery, and rapid creation of models for vehicles that pose particular confuser rejection challenges.

  18. Axial and centrifugal pump meanline performance analysis

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.

    1994-01-01

    A meanline pump flow modeling method has been developed to provide a fast capability for modeling pumps of cryogenic rocket engines. Based on this method, a meanline pump flow code (PUMPA) has been written that can predict the performance of pumps at off-design operating conditions, given the loss of the diffusion system at the design point. The design point rotor efficiency is obtained from empirically derived correlations of loss to rotor specific speed. The rapid input setup and computer run time for the meanline pump flow code makes it an effective analysis and conceptual design tool. The map generation capabilities of the PUMPA code provide the information needed for interfacing with a rocket engine system modeling code.
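
    The correlation-driven design-point step described above can be sketched as follows. The specific-speed definition is standard, but the efficiency fit is a made-up placeholder, since the actual PUMPA correlations are not reproduced in the abstract:

```python
import math

# Sketch of the rotor-efficiency-from-specific-speed idea.
def specific_speed(rpm, flow_gpm, head_ft):
    """US-customary pump specific speed Ns = N * sqrt(Q) / H^0.75."""
    return rpm * math.sqrt(flow_gpm) / head_ft ** 0.75

def rotor_efficiency(ns):
    # Hypothetical smooth correlation peaking near Ns ~ 2500 (placeholder,
    # not the empirically derived PUMPA fit).
    return 0.90 - 1.5e-9 * (ns - 2500.0) ** 2

ns = specific_speed(10_000, 500.0, 1500.0)
print(ns, rotor_efficiency(ns))
```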

  19. Stormwater quality models: performance and sensitivity analysis.

    PubMed

    Dotto, C B S; Kleidorfer, M; Deletic, A; Fletcher, T D; McCarthy, D T; Rauch, W

    2010-01-01

    The complex nature of pollutant accumulation and washoff, along with high temporal and spatial variations, poses challenges for the development and establishment of accurate and reliable models of the pollution generation process in urban environments. Therefore, the search for reliable stormwater quality models remains an important area of research. Model calibration and sensitivity analysis of such models are essential in order to evaluate model performance; it is very unlikely that non-calibrated models will lead to reasonable results. This paper reports on the testing of three models which aim to represent pollutant generation from urban catchments. Assessment of the models was undertaken using a simplified Monte Carlo Markov Chain (MCMC) method. Results are presented in terms of performance, sensitivity to the parameters and correlation between these parameters. In general, it was suggested that the tested models poorly represent reality and result in a high level of uncertainty. The conclusions provide useful information for the improvement of existing models and insights for the development of new model formulations.
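
    A minimal sketch of the MCMC calibration idea, assuming a one-parameter exponential washoff model and synthetic observations (a much-simplified stand-in for the models tested in the paper):

```python
import math
import random

random.seed(0)

# Toy washoff model: fraction of the pollutant load mobilized by rainfall r.
def model(k, rainfall):
    return [1.0 - math.exp(-k * r) for r in rainfall]

rainfall = [0.5, 1.0, 2.0, 4.0]
true_k = 0.8
obs = [y + 0.02 * random.gauss(0, 1) for y in model(true_k, rainfall)]

def log_like(k):
    return -sum((o - m) ** 2 for o, m in zip(obs, model(k, rainfall))) / (2 * 0.02 ** 2)

# Plain Metropolis sampler over the single washoff-rate parameter k.
k, samples = 0.3, []
for _ in range(20_000):
    prop = k + random.gauss(0, 0.05)
    if prop > 0:
        delta = log_like(prop) - log_like(k)
        if random.random() < math.exp(min(0.0, delta)):
            k = prop
    samples.append(k)
post = samples[5_000:]                 # discard burn-in
print(sum(post) / len(post))           # posterior mean, near the true k = 0.8
```

    Posterior spread (not shown) is what the sensitivity analysis in the paper quantifies per parameter.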

  20. Deep Space Optical Link ARQ Performance Analysis

    NASA Technical Reports Server (NTRS)

    Clare, Loren; Miles, Gregory

    2016-01-01

    Substantial advancements have been made toward the use of optical communications for deep space exploration missions, promising a much higher volume of data to be communicated in comparison with present-day Radio Frequency (RF) based systems. One or more ground-based optical terminals are assumed to communicate with the spacecraft. Both short-term and long-term link outages will arise due to weather at the ground station(s), space platform pointing stability, and other effects. To mitigate these outages, an Automatic Repeat Query (ARQ) retransmission method is assumed, together with a reliable back channel for acknowledgement traffic. Specifically, the Licklider Transmission Protocol (LTP) is used, which is a component of the Disruption-Tolerant Networking (DTN) protocol suite that is well suited for high bandwidth-delay product links subject to disruptions. We provide an analysis of envisioned deep space mission scenarios and quantify buffering, latency and throughput performance, using a simulation in which long-term weather effects are modeled with a Gilbert-Elliott Markov chain, short-term outages occur as a Bernoulli process, and scheduled outages arising from geometric visibility or operational constraints are represented. We find that both short- and long-term effects impact throughput, but long-term weather effects dominate buffer sizing and overflow losses as well as latency performance.
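
    The long-term weather model mentioned above is a two-state Markov chain; a minimal simulation of link availability, with illustrative transition probabilities, looks like this:

```python
import random

random.seed(42)

# Two-state Gilbert-Elliott-style weather model: the link is either "good"
# or "bad", with memory captured by the transition probabilities.
def simulate_link(steps, p_good_to_bad=0.02, p_bad_to_good=0.10):
    good, available = True, 0
    for _ in range(steps):
        available += good
        if good:
            good = random.random() >= p_good_to_bad
        else:
            good = random.random() < p_bad_to_good
    return available / steps

# Steady-state P(good) = 0.10 / (0.10 + 0.02) ~= 0.83
print(simulate_link(100_000))
```

    The small transition probabilities produce long correlated outage bursts, which is why these weather effects dominate buffer sizing.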

  1. Radio-science performance analysis software

    NASA Astrophysics Data System (ADS)

    Morabito, D. D.; Asmar, S. W.

    1995-02-01

    The Radio Science Systems Group (RSSG) provides various support functions for several flight project radio-science teams. Among these support functions are uplink and sequence planning, real-time operations monitoring and support, data validation, archiving and distribution functions, and data processing and analysis. This article describes the support functions that encompass radio-science data performance analysis. The primary tool used by the RSSG to fulfill this support function is the STBLTY program set. STBLTY is used to reconstruct observable frequencies and calculate model frequencies, frequency residuals, frequency stability in terms of Allan deviation, reconstructed phase, frequency and phase power spectral density, and frequency drift rates. In the case of one-way data, using an ultrastable oscillator (USO) as a frequency reference, the program set computes the spacecraft transmitted frequency and maintains a database containing the in-flight history of the USO measurements. The program set also produces graphical displays. Some examples and discussions on operating the program set on Galileo and Ulysses data will be presented.
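
    The central stability statistic computed by STBLTY, the Allan deviation, has a compact two-sample estimator. The sketch below applies it to synthetic fractional-frequency residuals, not actual USO data:

```python
import math

# Non-overlapping two-sample (Allan) deviation of fractional-frequency
# samples y taken at their basic sampling interval tau0.
def allan_deviation(y):
    diffs = [(b - a) ** 2 for a, b in zip(y, y[1:])]
    return math.sqrt(sum(diffs) / (2 * len(diffs)))

# Toy data: alternating residuals of magnitude 1e-12.
y = [1e-12 if i % 2 else -1e-12 for i in range(1000)]
print(allan_deviation(y))   # |delta y| / sqrt(2) = 1.414e-12
```

    Real analyses evaluate this at multiple averaging times tau by first averaging y over blocks; the single-tau estimator above is the building block.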

  2. Radio-science performance analysis software

    NASA Technical Reports Server (NTRS)

    Morabito, D. D.; Asmar, S. W.

    1995-01-01

    The Radio Science Systems Group (RSSG) provides various support functions for several flight project radio-science teams. Among these support functions are uplink and sequence planning, real-time operations monitoring and support, data validation, archiving and distribution functions, and data processing and analysis. This article describes the support functions that encompass radio-science data performance analysis. The primary tool used by the RSSG to fulfill this support function is the STBLTY program set. STBLTY is used to reconstruct observable frequencies and calculate model frequencies, frequency residuals, frequency stability in terms of Allan deviation, reconstructed phase, frequency and phase power spectral density, and frequency drift rates. In the case of one-way data, using an ultrastable oscillator (USO) as a frequency reference, the program set computes the spacecraft transmitted frequency and maintains a database containing the in-flight history of the USO measurements. The program set also produces graphical displays. Some examples and discussions on operating the program set on Galileo and Ulysses data will be presented.

  3. Performance Analysis of ICA in Sensor Array

    PubMed Central

    Cai, Xin; Wang, Xiang; Huang, Zhitao; Wang, Fenghua

    2016-01-01

    As the best-known scheme in the field of Blind Source Separation (BSS), Independent Component Analysis (ICA) has been used intensively in various domains, including biomedical and acoustics applications, cooperative or non-cooperative communication, etc. While sensor arrays are involved in most of these applications, the influence of practical factors on the performance of ICA has not yet been sufficiently investigated. In this manuscript, the issue is researched by taking the typical antenna array as an illustrative example. Factors taken into consideration include the environment noise level, the properties of the array, and those of the radiators. We analyze the analytic relationship between the noise variance, the source variance, the condition number of the mixing matrix, and the optimal signal-to-interference-plus-noise ratio, as well as the relationship between the singularity of the mixing matrix and the practical factors concerned. Special attention is paid to situations where the mixing process turns (nearly) singular, since such circumstances are critical in applications. The results and conclusions obtained should be instructive when applying ICA algorithms to mixtures from sensor arrays. Moreover, an effective countermeasure against the case of singular mixtures is proposed on the basis of the preceding analysis. Experiments validating the theoretical conclusions as well as the effectiveness of the proposed scheme are included. PMID:27164100
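
The link between mixing-matrix conditioning and achievable output SINR can be illustrated numerically. This is a simplified pseudoinverse-unmixing model under assumed source and noise variances, not the paper's derivation:

```python
import numpy as np

# Unmixing with W = pinv(A) gives output s + W n, so the per-source noise
# is amplified by the row norms of pinv(A); an ill-conditioned A hurts.
def optimal_sinr_db(A, source_var, noise_var):
    """Best-case per-source SINR (dB) for pseudoinverse unmixing."""
    W = np.linalg.pinv(A)
    sinr = source_var / (noise_var * np.sum(W ** 2, axis=1))
    return 10 * np.log10(sinr)

rng = np.random.default_rng(1)
A_good = np.eye(4) + 0.1 * rng.standard_normal((4, 4))      # well conditioned
A_bad = A_good.copy()
A_bad[:, 1] = A_bad[:, 0] + 1e-3 * rng.standard_normal(4)   # nearly singular

sinr_good = optimal_sinr_db(A_good, source_var=1.0, noise_var=0.01)
sinr_bad = optimal_sinr_db(A_bad, source_var=1.0, noise_var=0.01)
```

As the two columns of `A_bad` become nearly parallel, the condition number explodes and the worst-case source SINR collapses, which is the regime the abstract flags as critical.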

  4. Past Performance analysis of HPOTP bearings

    NASA Technical Reports Server (NTRS)

    Bhat, B. N.; Dolan, F. J.

    1982-01-01

    The past performance analysis conducted on three High Pressure Oxygen Turbopump (HPOTP) bearings from the Space Shuttle Main Engine is presented. Metallurgical analysis of failed bearing balls and races, and wear track and crack configuration analyses, were carried out. In addition, one bearing was tested in the laboratory at very high axial loads. The results showed that the cracks were surface initiated and propagated into subsurface locations at relatively small angles. Subsurface cracks were much more extensive than appeared on the surface. The location of major cracks in the races corresponded to high radial loads rather than high axial loads. There was evidence to suggest that the inner races were heated to elevated temperatures. A failure scenario was developed based on the above findings. According to this scenario, the HPOTP bearings are heated by a combination of high loads and a high coefficient of friction (poor lubrication). Different methods of extending HPOTP bearing life are also discussed, including reduction of axial loads; improvements in bearing design, lubrication, and cooling; and use of improved bearing materials.

  5. Performance analysis of memory hierarchies in high performance systems

    SciTech Connect

    Yogesh, Agrawel

    1993-07-01

    This thesis studies memory bandwidth as a performance predictor of programs. The focus of this work is on computationally intensive programs. These programs are the most likely to access large amounts of data, stressing the memory system. Computationally intensive programs are also likely to use highly optimizing compilers to produce the fastest executables possible. Methods to reduce the amount of data traffic by increasing the average number of references to each item while it resides in the cache are explored. Increasing the average number of references to each cache item reduces the number of memory requests. Chapter 2 describes the DLX architecture. This is the architecture on which all the experiments were performed. Chapter 3 studies memory moves as a performance predictor for a group of application programs. Chapter 4 introduces a model to study the performance of programs in the presence of memory hierarchies. Chapter 5 explores some compiler optimizations that can help increase the references to each item while it resides in the cache.
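
The idea of increasing the references to each cached item can be illustrated with a toy direct-mapped cache model. The parameters and workloads are invented for illustration; the thesis used a DLX-based simulator:

```python
# Toy direct-mapped cache: counts memory requests (misses) for an access
# stream, showing how reuse within a cache line reduces memory traffic.
def memory_traffic(addresses, num_lines=64, line_bytes=32):
    tags = [None] * num_lines
    misses = 0
    for addr in addresses:
        line = addr // line_bytes          # which memory line this byte is in
        idx = line % num_lines             # direct-mapped slot
        if tags[idx] != line:              # miss: fetch the line from memory
            tags[idx] = line
            misses += 1
    return misses

# Sequential sweep of 4-byte elements: 8 references per line fetched.
seq = [4 * i for i in range(1024)]
# Large-stride sweep: no spatial reuse, every access goes to memory.
strided = [128 * i for i in range(1024)]
```

The sequential sweep generates 128 memory requests for 1024 references (one per 32-byte line), while the strided sweep misses on all 1024, which is exactly the traffic ratio the compiler optimizations of Chapter 5 aim to improve.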

  6. Space Shuttle Main Engine performance analysis

    NASA Astrophysics Data System (ADS)

    Santi, L. Michael

    1993-11-01

    For a number of years, NASA has relied primarily upon periodically updated versions of Rocketdyne's power balance model (PBM) to provide space shuttle main engine (SSME) steady-state performance prediction. A recent computational study indicated that PBM predictions do not satisfy fundamental energy conservation principles. More recently, SSME test results provided by the Technology Test Bed (TTB) program have indicated significant discrepancies between PBM flow and temperature predictions and TTB observations. Results of these investigations have diminished confidence in the predictions provided by PBM, and motivated the development of new computational tools for supporting SSME performance analysis. A multivariate least squares regression algorithm was developed and implemented during this effort in order to efficiently characterize TTB data. This procedure, called the 'gains model,' was used to approximate the variation of SSME performance parameters such as flow rate, pressure, temperature, speed, and assorted hardware characteristics in terms of six assumed independent influences. These six influences were engine power level, mixture ratio, fuel inlet pressure and temperature, and oxidizer inlet pressure and temperature. A BFGS optimization algorithm provided the base procedure for determining regression coefficients for both linear and full quadratic approximations of parameter variation. Statistical information relative to data deviation from regression derived relations was also computed. A new strategy for integrating test data with theoretical performance prediction was also investigated. The current integration procedure employed by PBM treats test data as pristine and adjusts hardware characteristics in a heuristic manner to achieve engine balance. Within PBM, this integration procedure is called 'data reduction.' By contrast, the new data integration procedure, termed 'reconciliation,' uses mathematical optimization techniques, and requires both
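
The "gains model" regression can be sketched as a least-squares fit over a full quadratic basis. The basis construction and example influences below are assumptions for illustration; the actual effort used a BFGS-based procedure on TTB data:

```python
import numpy as np

def quadratic_design(X):
    """Full quadratic basis: columns 1, x_i, and x_i*x_j for i <= j."""
    n, k = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i, k)]
    return np.column_stack(cols)

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))   # e.g. power level, mixture ratio (scaled)
# Synthetic "performance parameter" with an interaction term plus noise:
y = 3.0 + 2.0 * X[:, 0] + 0.5 * X[:, 0] * X[:, 1] + 0.01 * rng.standard_normal(200)

D = quadratic_design(X)
coef, *_ = np.linalg.lstsq(D, y, rcond=None)
residuals = y - D @ coef                 # deviation of data from the fit
```

The residual statistics computed this way play the role of the "statistical information relative to data deviation" mentioned above: they quantify how well the quadratic response surface characterizes the test data.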

  7. Data Link Performance Analysis for LVLASO Experiments

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi

    1998-01-01

    The Low-Visibility Landing and Surface Operations System (LVLASO) is currently being prototyped and tested at NASA Langley Research Center. Since the main objective of the system is to maintain aircraft landings and take-offs even during low-visibility conditions, timely exchange of positional and other information between the aircraft and ground control is critical. For safety and reliability reasons, there are several redundant sources on the ground (e.g., ASDE, AMASS) that collect and disseminate information about the environment to the aircraft. The data link subsystem of LVLASO is responsible for supporting the timely transfer of information between the aircraft and the ground controllers. In fact, if not properly designed, the data link subsystem could become a bottleneck in the proper functioning of LVLASO. Currently, the other components of the system are being designed assuming that the data link has adequate capacity and is capable of delivering the information in a timely manner. During August 1-28, 1997, several flight experiments were conducted to test the prototypes of subsystems developed under the LVLASO project. The background and details of the tests are described in the next section. The test results have been collected on two CDs by the FAA and Rockwell-Collins. Under the current grant, we have analyzed the data and evaluated the performance of the Mode S datalink. In this report, we summarize the results of our analysis. Most of the results are shown as graphs or histograms; the test date (or experiment number) is typically taken as the X-axis, and the Y-axis denotes the metric of focus in that chart. In interpreting these charts, one needs to take into account the vehicular traffic during a particular experiment.
In general, the performance of the data link was found to be quite satisfactory in terms of delivering long and short Mode S squitters from the vehicles to the ground receiver. Similarly, its performance in delivering control

  8. Space Shuttle Main Engine performance analysis

    NASA Technical Reports Server (NTRS)

    Santi, L. Michael

    1993-01-01

    For a number of years, NASA has relied primarily upon periodically updated versions of Rocketdyne's power balance model (PBM) to provide space shuttle main engine (SSME) steady-state performance prediction. A recent computational study indicated that PBM predictions do not satisfy fundamental energy conservation principles. More recently, SSME test results provided by the Technology Test Bed (TTB) program have indicated significant discrepancies between PBM flow and temperature predictions and TTB observations. Results of these investigations have diminished confidence in the predictions provided by PBM, and motivated the development of new computational tools for supporting SSME performance analysis. A multivariate least squares regression algorithm was developed and implemented during this effort in order to efficiently characterize TTB data. This procedure, called the 'gains model,' was used to approximate the variation of SSME performance parameters such as flow rate, pressure, temperature, speed, and assorted hardware characteristics in terms of six assumed independent influences. These six influences were engine power level, mixture ratio, fuel inlet pressure and temperature, and oxidizer inlet pressure and temperature. A BFGS optimization algorithm provided the base procedure for determining regression coefficients for both linear and full quadratic approximations of parameter variation. Statistical information relative to data deviation from regression derived relations was also computed. A new strategy for integrating test data with theoretical performance prediction was also investigated. The current integration procedure employed by PBM treats test data as pristine and adjusts hardware characteristics in a heuristic manner to achieve engine balance. Within PBM, this integration procedure is called 'data reduction.' By contrast, the new data integration procedure, termed 'reconciliation,' uses mathematical optimization techniques, and requires both

  9. Performance analysis of robust road sign identification

    NASA Astrophysics Data System (ADS)

    Ali, Nursabillilah M.; Mustafah, Y. M.; Rashid, N. K. A. M.

    2013-12-01

    This study describes the performance analysis of a robust system for road sign identification that incorporates two stages of different algorithms. The proposed algorithms consist of HSV color filtering and PCA techniques, used in the detection and recognition stages respectively. The proposed algorithms are able to detect the three standard types of colored signs, namely Red, Yellow, and Blue. The hypothesis of the study is that road sign images can be used to detect and identify signs even in the presence of occlusions and rotational changes. PCA is known as a feature extraction technique that reduces dimensional size; the sign image can be easily recognized and identified by the PCA method, as it has been used in many application areas. Based on the experimental results, HSV is robust in road sign detection, with minimum success rates of 88% and 77% for non-partially and partially occluded images. Successful recognition rates using PCA were achieved in the range of 94-98%. All classes were recognized successfully at occlusion levels between 5% and 10%.
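
A minimal version of the per-pixel HSV color gate that such a detection stage relies on can be sketched as follows. The thresholds are illustrative guesses, not the paper's values:

```python
import colorsys

# HSV gate for "red" sign pixels. Red hue wraps around 0 in HSV, so we
# accept hues near either end of the [0, 1) hue circle, and require the
# pixel to be reasonably saturated and bright.
def is_red(r, g, b):
    """r, g, b in [0, 1]. True for saturated, bright red pixels."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return (h < 0.05 or h > 0.95) and s > 0.5 and v > 0.3

stop_sign_pixel = is_red(0.9, 0.1, 0.1)   # saturated red
blue_sign_pixel = is_red(0.1, 0.1, 0.9)   # saturated blue
```

HSV filtering tolerates illumination changes better than raw RGB thresholds because brightness is isolated in the V channel, which is one reason it holds up under partial occlusion.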

  10. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    EPA Science Inventory

    This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit m...

  11. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    EPA Science Inventory

    This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit m...

  12. NEXT Performance Curve Analysis and Validation

    NASA Technical Reports Server (NTRS)

    Saripalli, Pratik; Cardiff, Eric; Englander, Jacob

    2016-01-01

    Performance curves of the NEXT thruster are highly important in determining the thruster's ability to perform toward mission-specific goals. New performance curves are proposed and examined here. The Evolutionary Mission Trajectory Generator (EMTG) is used to verify variations in mission solutions based on both the available thruster curves and the new curves generated. Furthermore, variations in BOL and EOL curves are also examined. The mission design results shown here validate the use of EMTG and the new performance curves.

  13. IPAC-Inlet Performance Analysis Code

    NASA Technical Reports Server (NTRS)

    Barnhart, Paul J.

    1997-01-01

    A series of analyses have been developed which permit the calculation of the performance of common inlet designs. The methods presented are useful for determining the inlet weight flows, total pressure recovery, and aerodynamic drag coefficients for given inlet geometric designs. Limited geometric input data is required to use this inlet performance prediction methodology. The analyses presented here may also be used to perform inlet preliminary design studies. The calculated inlet performance parameters may be used in subsequent engine cycle analyses or installed engine performance calculations for existing uninstalled engine data.

  14. Cost and performance analysis of physical security systems

    SciTech Connect

    Hicks, M.J.; Yates, D.; Jago, W.H.; Phillips, A.W.

    1998-04-01

    Analysis of cost and performance of physical security systems can be a complex, multi-dimensional problem. There are a number of point tools that address various aspects of cost and performance analysis. Increased interest in cost tradeoffs of physical security alternatives has motivated development of an architecture called Cost and Performance Analysis (CPA), which takes a top-down approach to aligning cost and performance metrics. CPA incorporates results generated by existing physical security system performance analysis tools, and utilizes an existing cost analysis tool. The objective of this architecture is to offer comprehensive visualization of complex data to security analysts and decision-makers.

  15. [Performance analysis of scientific researchers in biomedicine].

    PubMed

    Gamba, Gerardo

    2013-01-01

    There are no data about the performance of scientific researchers in biomedicine in our environment that can be used by individual researchers to compare their output with that of their peers. Using the Scopus browser, the following data on 115 scientific researchers in biomedicine were obtained: current institution, number of articles published, the researcher's place within the author list of each article as first, last, or sole author, total number of citations, percentage of citations due to the most cited paper, and h-index. Results were analyzed with descriptive statistics and simple linear regressions. Most of the scientific researchers in the sample are from the National Institutes of the Health Ministry or from research institutes or faculties at the Universidad Nacional Autónoma de México. The total number of publications was < 50 in 26%, from 50 to 100 in 36.5%, from 101 to 150 in 18.2%, from 151 to 200 in 9.5%, and more than 200 papers in 9.5%. The researcher was the main author, by being the first, the last, or the sole author, in 22 to 91% of the papers, with 75% being main author in more than 50% of their manuscripts. Total citations varied from 240 to 10,866. There is a significant correlation between the number of papers and citations, with an R2 of 0.46. In the most cited paper, the researcher was the main author in 43% of cases. The h-index varied from 7 to 57. Eight researchers had an h-index of less than 10; most are between 11 and 20, 25% are between 21 and 30, and only 10.4% had an h-index of more than 30. There is a significant correlation between the number of published papers and the h-index, with an R2 of 0.57. This work provides an analysis of scientific publications in a sample of 115 scientific researchers in biomedicine in Mexico City, which can be used to compare the productivity of individual researchers with that of their peers.
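
The h-index used above follows the standard definition and is straightforward to compute from a citation list:

```python
# h-index: the largest h such that h of the author's papers have at least
# h citations each (standard definition, as reported by Scopus).
def h_index(citations):
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:       # the rank-th most-cited paper has >= rank citations
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))   # -> 4
print(h_index([25, 8, 5, 3, 3]))   # -> 3
```

The second example shows why the h-index is insensitive to a single highly cited paper, which is relevant to the abstract's observation about the share of citations due to each researcher's most cited work.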

  16. An Ethnostatistical Analysis of Performance Measurement

    ERIC Educational Resources Information Center

    Winiecki, Donald J.

    2008-01-01

    Within the fields of human performance technology (HPT), human resources management (HRM), and management in general, performance measurement is not only foundational but considered necessary at all phases in the process of HPT. In HPT in particular, there is substantial advice literature on what measurement is, why it is essential, and (at a…

  17. An Ethnostatistical Analysis of Performance Measurement

    ERIC Educational Resources Information Center

    Winiecki, Donald J.

    2008-01-01

    Within the fields of human performance technology (HPT), human resources management (HRM), and management in general, performance measurement is not only foundational but considered necessary at all phases in the process of HPT. In HPT in particular, there is substantial advice literature on what measurement is, why it is essential, and (at a…

  18. Spacecraft Multiple Array Communication System Performance Analysis

    NASA Technical Reports Server (NTRS)

    Hwu, Shian U.; Desilva, Kanishka; Sham, Catherine C.

    2010-01-01

    The Communication Systems Simulation Laboratory (CSSL) at the NASA Johnson Space Center is tasked to perform spacecraft and ground network communication system simulations, design validation, and performance verification. The CSSL has developed simulation tools that model spacecraft communication systems and the space and ground environment in which they operate. In this paper, a spacecraft communication system with multiple arrays is simulated. A multiple-array combining technique is used to increase the radio frequency coverage and data rate performance: phase coherence is achieved among the phased arrays so that the signals combine constructively at the target receiver. There are many technical challenges in spacecraft integration with a high-transmit-power communication system. The array combining technique can improve the communication system's data rate and coverage performance without increasing the system's transmit power requirements. Example simulation results indicate that significant performance improvement can be achieved with a phase coherence implementation.

  19. Assessing BMP Performance Using Microtox Toxicity Analysis

    EPA Science Inventory

    Best Management Practices (BMPs) have been shown to be effective in reducing runoff and pollutants from urban areas and thus provide a mechanism to improve downstream water quality. Currently, BMP performance regarding water quality improvement is assessed through measuring each...

  20. Space conditioning performance analysis and simulation study

    NASA Astrophysics Data System (ADS)

    Patani, A.; Bonne, U.

    1981-07-01

    The engine-driven heat pump model was expanded to incorporate an approach for evaluating the influence of cycling; system studies included the sensitivity of performance to electric consumption, compressor speed, mixing, and climate. A modular program for evaluating the steady-state performance of absorption heat pumps was developed. Initial simulations indicated performance trends as a function of outdoor temperature and refrigerant absorber charge. The combustion heating system model, HFLAME, was used to simulate the benefits of fan/pump overrun and the dependence of the corresponding setpoints on off-period losses and electric costs. Benefits of fuel and fuel/air modulation as compared to cyclic performance were also analyzed. An energy distribution factor was defined to describe the effect of the distribution system on realizing savings from retrofits.

  1. Assessing BMP Performance Using Microtox Toxicity Analysis

    EPA Science Inventory

    Best Management Practices (BMPs) have been shown to be effective in reducing runoff and pollutants from urban areas and thus provide a mechanism to improve downstream water quality. Currently, BMP performance regarding water quality improvement is assessed through measuring each...

  2. Rocket-in-a-Duct Performance Analysis

    NASA Technical Reports Server (NTRS)

    Schneider, Steven J.; Reed, Brian D.

    1999-01-01

    An axisymmetric, 110 N class rocket configured with a free expansion between the rocket nozzle and a surrounding duct was tested in an altitude simulation facility. The propellants were gaseous hydrogen and gaseous oxygen, and the hardware consisted of a heat-sink-type copper rocket firing through copper ducts of various diameters and lengths. A secondary flow of nitrogen was introduced at the blind end of the duct to mix with the primary rocket mass flow in the duct. This flow was in the range of 0 to 10% of the primary mass flow, and its effect on nozzle performance was measured. The random measurement errors on thrust and mass flow were within +/-1%. One-dimensional equilibrium calculations were used to establish the possible theoretical performance of these rocket-in-a-duct nozzles. Although the scale of these tests was small, they simulated the relevant flow expansion physics at a modest experimental cost. Test results indicated that lower performance was obtained at higher free expansion area ratios and longer ducts, while higher performance was obtained with the addition of secondary flow. There was a discernible peak in specific impulse efficiency at 4% secondary flow. The small scale of these tests resulted in low performance efficiencies, but prior numerical modeling of larger rocket-in-a-duct engines predicted performance comparable to that of optimized rocket nozzles. This remains to be proven in large-scale rocket-in-a-duct tests.

  3. Network interface unit design options performance analysis

    NASA Technical Reports Server (NTRS)

    Miller, Frank W.

    1991-01-01

    An analysis is presented of three design options for the Space Station Freedom (SSF) onboard Data Management System (DMS) Network Interface Unit (NIU). The NIU provides the interface from the Fiber Distributed Data Interface (FDDI) local area network (LAN) to the DMS processing elements. The FDDI LAN provides the primary means for command and control and low and medium rate telemetry data transfers on board the SSF. The results of this analysis provide the basis for the implementation of the NIU.

  4. Path analysis of self-efficacy and diving performance revisited.

    PubMed

    Feltz, Deborah L; Chow, Graig M; Hepler, Teri J

    2008-06-01

    The Feltz (1982) path analysis of the relationship between diving efficacy and performance showed that, over trials, past performance was a stronger predictor than self-efficacy of performance. Bandura (1997) criticized the study as statistically "overcontrolling" for past performance by using raw past performance scores along with self-efficacy as predictors of performance. He suggested residualizing past performance by regressing the raw scores on self-efficacy and entering them into the model, to remove prior contributions of self-efficacy embedded in past performance scores. To resolve this controversy, we reanalyzed the Feltz data using three statistical models: raw past performance, residual past performance, and a method that residualizes both past performance and self-efficacy. Results revealed that self-efficacy was a stronger predictor of performance in both residualized models than in the raw past performance model. Furthermore, the influence of past performance on future performance was weaker when the residualized methods were conducted.
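
The residualizing step Bandura proposed can be sketched with ordinary least squares. The variable names and effect sizes below are invented for illustration:

```python
import numpy as np

def residualize(x, on):
    """Residuals of x after removing its linear dependence on `on`
    (regress x on [1, on] and subtract the fitted values)."""
    X = np.column_stack([np.ones(len(on)), on])
    beta, *_ = np.linalg.lstsq(X, x, rcond=None)
    return x - X @ beta

rng = np.random.default_rng(0)
efficacy = rng.standard_normal(500)
# Past performance that shares variance with self-efficacy:
past = 0.8 * efficacy + 0.3 * rng.standard_normal(500)
past_resid = residualize(past, efficacy)
# By construction the residual is uncorrelated with self-efficacy, so any
# variance they shared is now credited to self-efficacy in the path model.
corr = np.corrcoef(past_resid, efficacy)[0, 1]
```

Entering `past_resid` rather than `past` into the path model is what lets self-efficacy reclaim the predictive variance the raw-score model assigns to past performance.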

  5. Managing Performance Analysis with Dynamic Statistical Projection Pursuit

    SciTech Connect

    Vetter, J.S.; Reed, D.A.

    2000-05-22

    Computer systems and applications are growing more complex. Consequently, performance analysis has become more difficult due to the complex, transient interrelationships among runtime components. To diagnose these types of performance issues, developers must use detailed instrumentation to capture a large number of performance metrics. Unfortunately, this instrumentation may actually influence the performance analysis, leading the developer to an ambiguous conclusion. In this paper, we introduce a technique for focusing a performance analysis on interesting performance metrics. This technique, called dynamic statistical projection pursuit, identifies interesting performance metrics that the monitoring system should capture across some number of processors. By reducing the number of performance metrics, projection pursuit can limit the impact of instrumentation on the performance of the target system and can reduce the volume of performance data.

  6. An optical probe for micromachine performance analysis

    SciTech Connect

    Dickey, F.M.; Holswade, S.C.; Smith, N.F.; Miller, S.L.

    1997-01-01

    Understanding the mechanisms that impact the performance of Microelectromechanical Systems (MEMS) is essential to the development of optimized designs and fabrication processes, as well as the qualification of devices for commercial applications. Silicon micromachines include engines that consist of orthogonally oriented linear comb drive actuators mechanically connected to a rotating gear. These gears are as small as 50 {mu}m in diameter and can be driven at rotation rates exceeding 300,000 rpm. Optical techniques offer the potential for measuring long term statistical performance data and transient responses needed to optimize designs and manufacturing techniques. We describe the development of Micromachine Optical Probe (MOP) technology for the evaluation of micromachine performance. The MOP approach is based on the detection of optical signals scattered by the gear teeth or other physical structures. We present experimental results obtained with a prototype optical probe and micromachines developed at Sandia National Laboratories.

  7. System performance analysis of stretched membrane heliostats

    SciTech Connect

    Anderson, J V; Murphy, L M; Short, W; Wendelin, T

    1985-12-01

    The optical performance of both focused and unfocused stretched membrane heliostats was examined in the context of the overall cost and performance of central receiver systems. The sensitivity of optical performance to variations in design parameters such as the system size (capacity), delivery temperature, heliostat size, and heliostat surface quality was also examined. The results support the conclusion that focused stretched membrane systems provide an economically attractive alternative to current glass/metal heliostats over essentially the entire range of design parameters studied. In addition, unfocused stretched membrane heliostats may be attractive for a somewhat more limited range of applications, which would include the larger plant sizes (e.g., 450 MW) and lower delivery temperatures (e.g., 450°C), or situations in which the heliostat size could economically be reduced.

  8. Forecast analysis of optical waveguide bus performance

    NASA Technical Reports Server (NTRS)

    Ledesma, R.; Rourke, M. D.

    1979-01-01

    Elements to be considered in the design of a data bus include: architecture; data rate; modulation, encoding, detection; power distribution requirements; protocol, word structure; bus reliability, maintainability; interterminal transmission medium; cost; and others specific to the application. Fiber-optic data bus considerations for a 32-port transmissive star architecture are discussed in a tutorial format. General optical-waveguide bus concepts are reviewed. The electrical and optical performance of a 32-port transmissive star bus and the effects of temperature on the performance of optical-waveguide buses are examined. A bibliography of pertinent references and the bus receiver test results are included.

  9. Performance analysis of a VSAT network

    NASA Astrophysics Data System (ADS)

    Karam, Fouad G.; Miller, Neville; Karam, Antoine

    With the growing need for efficient satellite networking facilities, the very small aperture terminal (VSAT) technology emerges as the leading edge of satellite communications. Achieving the required performance of a VSAT network is dictated by the multiple access technique utilized. Determining the inbound access method best suited for a particular application involves trade-offs between response time and space segment utilization. In this paper, the slotted Aloha and dedicated stream access techniques are compared. It is shown that network performance is dependent on the traffic offered from remote earth stations as well as the sensitivity of customers' applications to satellite delay.
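
The response-time versus utilization trade-off for the slotted Aloha inbound channel is governed by the classic throughput relation S = G·e^(-G), a standard result used here only to illustrate why offered load must be kept near the optimum:

```python
import math

# Slotted Aloha: with offered load G (packets per slot, Poisson arrivals),
# a slot carries a successful packet only when exactly one station
# transmits, giving throughput S = G * exp(-G).
def slotted_aloha_throughput(G):
    return G * math.exp(-G)

peak_S = slotted_aloha_throughput(1.0)   # maximum, 1/e ~ 0.368 packets/slot
```

Throughput peaks at roughly 37% channel utilization and falls off on both sides, which is why dedicated stream access becomes attractive for delay-sensitive, steadily loaded terminals.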

  10. Systematic Performance Analysis in Foreign Language Instruction.

    ERIC Educational Resources Information Center

    Prokop, Manfred

    This paper reports on the development of an error-coding instrument and a computerized continuous feedback system for the diagnostic evaluation and remedial treatment of unstructured second-language performances and the use of such a system in an instructional setting. The system does the following: it allows quick, objective, accurate, and…

  11. Applying Mechanics to Swimming Performance Analysis.

    ERIC Educational Resources Information Center

    Barthels, Katharine

    1989-01-01

    Swimming teachers and coaches can improve their feedback to swimmers, when correcting or refining swim movements, by applying some basic biomechanical concepts relevant to swimming. This article focuses on the biomechanical considerations used in analyzing swimming performance. Techniques for spotting and correcting problems that impede…

  12. Cognitive Performance: A Model for Analysis

    ERIC Educational Resources Information Center

    Marjoribanks, Kevin

    1975-01-01

    In the present study, cognitive performance was examined by analysing a path model which included family environment variables, social status indicators, and a set of enabling conditions consisting of self-esteem, attitudes toward schoolwork and educational and occupational aspirations. (Editor)

  13. A performance analysis system for MEMS using automated imaging methods

    SciTech Connect

    LaVigne, G.F.; Miller, S.L.

    1998-08-01

    The ability to make in-situ performance measurements of MEMS operating at high speeds has been demonstrated using a new image analysis system. Significant improvements in performance and reliability have directly resulted from the use of this system.

  14. THERMAL PERFORMANCE ANALYSIS FOR WSB DRUM

    SciTech Connect

    Lee, S

    2008-06-26

    The Nuclear Nonproliferation Programs Design Authority is in the design stage of the Waste Solidification Building (WSB) for the treatment and solidification of the radioactive liquid waste streams generated by the Pit Disassembly and Conversion Facility (PDCF) and Mixed Oxide (MOX) Fuel Fabrication Facility (MFFF). The waste streams will be mixed with a cementitious dry mix in a 55-gallon waste container. Savannah River National Laboratory (SRNL) has been performing the testing and evaluations to support technical decisions for the WSB. The Engineering Modeling & Simulation Group was requested to evaluate the thermal performance of the 55-gallon drum containing a hydration heat source associated with the current baseline cement waste form. A transient axi-symmetric heat transfer model for the drum partially filled with waste form cement has been developed, and heat transfer calculations have been performed for the baseline design configurations. For this case, 65 percent of the drum volume was assumed to be filled with the waste form, which has a transient hydration heat source, as one of the baseline conditions. A series of modeling calculations has been performed using a computational heat transfer approach. The baseline modeling results show that the time to reach the maximum temperature of the 65 percent filled drum is about 32 hours, assuming a 43 C initial cement temperature cooled by natural convection with 27 C external air. In addition, the results computed by the present model were compared with analytical solutions. The modeling results will be benchmarked against the prototypic test results. The verified model will be used for the evaluation of the thermal performance of the WSB drum.
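A transient conduction problem with a decaying hydration source, like the one this abstract describes, can be sketched with a minimal 1-D explicit finite-difference scheme. All material properties, source terms, and boundary coefficients below are illustrative placeholders, not the SRNL/WSB inputs:

```python
import numpy as np

# 1-D slab of curing cement: insulated bottom, convective top surface,
# internal heat source decaying exponentially (crude hydration model).
nx, L = 50, 0.6             # nodes, slab thickness [m]
dx = L / (nx - 1)
alpha = 5e-7                # thermal diffusivity [m^2/s] (assumed)
k = 1.0                     # conductivity [W/m/K] (assumed)
rho_cp = k / alpha          # volumetric heat capacity [J/m^3/K]
h, T_air = 5.0, 27.0        # convection coeff [W/m^2/K], air temp [C]
q0, tau = 150.0, 24 * 3600  # peak hydration power [W/m^3], decay time [s]
dt = 0.4 * dx**2 / alpha    # stable explicit time step (factor < 0.5)

T = np.full(nx, 43.0)       # initial cement temperature [C]
t, t_end = 0.0, 60 * 3600
T_max_history = []
while t < t_end:
    q = q0 * np.exp(-t / tau)                     # decaying source
    Tn = T.copy()
    Tn[1:-1] = (T[1:-1]
                + alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
                + q * dt / rho_cp)
    Tn[0] = Tn[1]                                 # insulated bottom
    # convective balance at the top: k*(T[-2]-T[-1])/dx = h*(T[-1]-T_air)
    Tn[-1] = (k / dx * Tn[-2] + h * T_air) / (k / dx + h)
    T = Tn
    t += dt
    T_max_history.append(T.max())
peak = max(T_max_history)   # temperature rises, peaks, then cools
```

The qualitative behavior (an interior peak reached some tens of hours in, then slow cooling toward ambient) mirrors the baseline result reported in the abstract, but the numbers here carry no significance.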

  15. NEAT breadboard system analysis and performance models

    NASA Astrophysics Data System (ADS)

    Hénault, François; Crouzier, Antoine; Malbet, Fabien; Kern, Pierre; Martin, Guillermo; Feautrier, Philippe; Staedler, Eric; Lafrasse, Sylvain; Delboulbé, Alain; Le Duigou, Jean-Michel; Cara, Christophe; Léger, Alain

    2014-08-01

    NEAT (Nearby Earth Astrometric Telescope) is an astrometric space mission aiming at detecting Earth-like exoplanets located in the habitable zone of nearby solar-type stars. For that purpose, NEAT should be able to measure stellar centroids within an accuracy of 5 × 10^-6 pixels. In order to fulfil such a stringent requirement, NEAT incorporates an interferometric metrology system measuring pixel gains and location errors. To validate this technology and assess the whole performance of the instrument, a dedicated test bench has been built at IPAG, in Grenoble (France). In this paper we summarize the main system engineering considerations used to define sub-system specifications. Then we describe the general architecture of the performance models (including photometric, interferometric, and final astrometric budgets) and confront their predictions with the experimental results obtained on the test bench. It is concluded that most of the error terms are well understood, although some of them deserve further investigation.

  16. Performance analysis of panoramic infrared systems

    NASA Astrophysics Data System (ADS)

    Furxhi, Orges; Driggers, Ronald G.; Holst, Gerald; Krapels, Keith

    2014-05-01

    Panoramic imagers are becoming more commonplace in the visible part of the spectrum. These imagers are often used in the real estate market, extreme sports, teleconferencing, and security applications. Infrared panoramic imagers, on the other hand, are not as common and only a few have been demonstrated. A panoramic image can be formed in several ways, using pan and stitch, distributed aperture, or omnidirectional optics. When omnidirectional optics are used, the detected image is a warped view of the world that is mapped on the focal plane array in a donut shape. The final image on the display is the mapping of the omnidirectional donut shape image back to the panoramic world view. In this paper we analyze the performance of uncooled thermal panoramic imagers that use omnidirectional optics, focusing on range performance.
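The donut-to-panorama mapping described above can be sketched as a polar resampling: each panoramic pixel (u, v) is mapped back to an angle and radius in the donut image. The radii, image size, and nearest-neighbour interpolation below are illustrative choices, not a description of any specific imager:

```python
import numpy as np

def unwrap_donut(img, r_in, r_out, out_w, out_h):
    """Unwrap an omnidirectional 'donut' image into a panoramic strip
    by nearest-neighbour sampling at angle theta and radius r."""
    cy, cx = (img.shape[0] - 1) / 2.0, (img.shape[1] - 1) / 2.0
    pano = np.zeros((out_h, out_w), dtype=img.dtype)
    for v in range(out_h):
        r = r_in + (r_out - r_in) * v / (out_h - 1)   # row -> radius
        for u in range(out_w):
            theta = 2 * np.pi * u / out_w              # column -> angle
            y = int(round(cy + r * np.sin(theta)))
            x = int(round(cx + r * np.cos(theta)))
            pano[v, u] = img[y, x]
    return pano

# Sanity check: a donut image whose value encodes the polar angle should
# unwrap to a panorama whose columns are (nearly) constant.
size = 201
yy, xx = np.mgrid[0:size, 0:size] - (size - 1) / 2.0
angle = np.arctan2(yy, xx)
pano = unwrap_donut(angle, r_in=30, r_out=90, out_w=90, out_h=30)
```

In a real system the resampling would use the lens's actual projection function and bilinear or better interpolation; the point here is only the geometry of the donut-shaped mapping on the focal plane array.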

  17. A guide for performing system safety analysis

    NASA Technical Reports Server (NTRS)

    Brush, J. M.; Douglass, R. W., III.; Williamson, F. R.; Dorman, M. C. (Editor)

    1974-01-01

    A general guide is presented for performing system safety analyses of hardware, software, operations and human elements of an aerospace program. The guide describes a progression of activities that can be effectively applied to identify hazards to personnel and equipment during all periods of system development. The general process of performing safety analyses is described; setting forth in a logical order the information and data requirements, the analytical steps, and the results. These analyses are the technical basis of a system safety program. Although the guidance established by this document cannot replace human experience and judgement, it does provide a methodical approach to the identification of hazards and evaluation of risks to the system.

  18. Performance Analysis of IIUM Wireless Campus Network

    NASA Astrophysics Data System (ADS)

    Abd Latif, Suhaimi; Masud, Mosharrof H.; Anwar, Farhat

    2013-12-01

    International Islamic University Malaysia (IIUM) is one of the leading universities in the world in terms of quality of education, which has been achieved by providing numerous facilities, including wireless services, to every enrolled student. The quality of this wireless service is controlled and monitored by the Information Technology Division (ITD), an ISO-standardized organization under the university. This paper aims to investigate the constraints of the wireless campus network of IIUM. It evaluates the performance of the IIUM wireless campus network in terms of delay, throughput and jitter. The QualNet 5.2 simulator tool has been employed to measure the performance of the IIUM wireless campus network. The observations from the simulation results could help ITD improve its wireless services further.

  19. Performance Management Systems: A Statistical Analysis

    DTIC Science & Technology

    1993-09-01

    PERFORMANCE AREAS | PEARSON CORR. | SIGNIFICANCE
    Respon. to Customer Needs | .4993 | < .001
    Continuous Process Imp. | .3004 | .001
    Product Innovation | .5020 | < .001
    ...that existed in the areas of "responsiveness to customer needs", "first to market", and "productivity" could be a function of current position held... "responsiveness to customer needs", "continuous process improvement", "on-time delivery", and "productivity". While some of the relationships

  20. Functional endoscopic analysis of beatbox performers.

    PubMed

    Sapthavee, Andrew; Yi, Paul; Sims, H Steven

    2014-05-01

    Beatboxing is a form of vocal percussion in which performers imitate drum sounds, interspersed with vocalization and other sounds, using their vocal tracts. Although similarities between beatboxing and singing are expected because of the anatomy involved, the medical literature has a wealth of information on singing and minimal studies on beatboxing. The objective of our study was to report on a case series of functional endoscopic evaluation of the anatomy involved in beatboxing and determine whether beatboxing may be a risk factor for phonotrauma or if this form of vocalization might be protective of the vocal folds. We reviewed the flexible fiberoptic data collected from four beatbox artists who were evaluated at an outpatient laryngology clinic. These records included videos of a standard flexible laryngoscopic evaluation during which the beatboxers performed beatbox sounds in isolation and in various combinations ("beats"), both standardized and improvised. All four participants were males aged 22-32 years. We found that voicing during beatboxing was not the same as full voice; rather, performers tended to have sustained phonation interlaced with percussive sounds. Performers overall demonstrated similarities in delivery of the same beatbox sounds, although subtle differences were noted between performers. Beatboxing is a complex form of vocal percussion using the entire vocal tract. Although similarities with singing in the anatomical structures and positioning are noted in beatboxing, there are several unique and interesting anatomical processes occurring. Use of the entire vocal tract, including the pharyngeal constrictors, may actually protect against glottic injury. Copyright © 2014 The Voice Foundation. Published by Mosby, Inc. All rights reserved.

  1. Performance Analysis of the Unitree Central File

    NASA Technical Reports Server (NTRS)

    Pentakalos, Odysseas I.; Flater, David

    1994-01-01

    This report consists of two parts. The first part briefly comments on the documentation status of two major systems at NASA's Center for Computational Sciences, specifically the Cray C98 and the Convex C3830. The second part describes the work done on improving the performance of file transfers between the Unitree Mass Storage System running on the Convex file server and the users' workstations distributed over a large geographic area.

  2. Experimental system and component performance analysis

    SciTech Connect

    Peterman, K.

    1984-10-01

    A prototype dye laser flow loop was constructed to flow test large power amplifiers in Building 169. The flow loop is designed to operate at supply pressures up to 900 psig and flow rates up to 250 GPM. During the initial startup of the flow loop experimental measurements were made to evaluate component and system performance. Three candidate dye flow loop pumps and three different pulsation dampeners were tested.

  3. Cost and Training Effectiveness Analysis Performance Guide

    DTIC Science & Technology

    1980-07-23

    perform cost and training effectiveness analyses (CTEA) during Weapon System Acquisition required by the Life Cycle System Management Model (LCSMM) and... light of training risk. This would be the case if a training program were estimated to be more effective in training certain high-risk tasks but were... also estimated to be somewhat more costly than the next best program. Possible impacts of training risk may indicate that the more effective

  4. Moisture performance analysis of EPS frost insulation

    SciTech Connect

    Ojanen, T.; Kokko, E.

    1997-11-01

    A horizontal layer of expanded polystyrene foam (EPS) is widely used as a frost insulation of building foundations in the Nordic countries. The performance properties of the insulation depend strongly on the moisture level of the material. Experimental methods are needed to produce samples for testing the material properties in realistic moisture conditions. The objective was to analyze the moisture loads and the wetting mechanisms of horizontal EPS frost insulation. Typical wetting tests, water immersion and diffusive water vapor absorption tests, were studied and the results were compared with the data from site investigations. Usually these tests give higher moisture contents of EPS than what are detected in drained frost insulation applications. Also the effect of different parameters, like the immersion depth and temperature gradient were studied. Special attention was paid to study the effect of diffusion on the wetting process. Numerical simulation showed that under real working conditions the long period diffusive moisture absorption in EPS frost insulation remained lower than 1% Vol. Moisture performance was determined experimentally as a function of the distance between the insulation and the free water level in the ground. The main moisture loads and the principles for good moisture performance of frost insulation are presented.

  5. Database for LDV Signal Processor Performance Analysis

    NASA Technical Reports Server (NTRS)

    Baker, Glenn D.; Murphy, R. Jay; Meyers, James F.

    1989-01-01

    A comparative and quantitative analysis of various laser velocimeter signal processors is difficult because standards for characterizing signal bursts have not been established. This leaves the researcher to select a signal processor based only on manufacturers' claims without the benefit of direct comparison. The present paper proposes the use of a database of digitized signal bursts obtained from a laser velocimeter under various configurations as a method for directly comparing signal processors.

  6. Performance analysis of the multichannel astrometric photometer

    NASA Technical Reports Server (NTRS)

    Huang, Chunsheng; Lawrence, George N.; Levy, Eugene H.; Mcmillan, Robert S.

    1987-01-01

    It has been proposed that extrasolar planetary systems may be observed if perturbations in star position due to the orbit of Jupiter-type planets could be detected. To see this motion, high accuracy measurements of 0.01 milliarcsecond are required over a relatively large field of view. Techniques using a moving Ronchi grating have been proposed for this application and have been successful in ground-based lower resolution tests. The method may have application to other precision angular measurement problems. This paper explores the theoretical description of the method, considers certain of the error sources, and presents a preliminary calculation of the performance which may be achieved.

  7. Shuttle TPS thermal performance and analysis methodology

    NASA Technical Reports Server (NTRS)

    Neuenschwander, W. E.; Mcbride, D. U.; Armour, G. A.

    1983-01-01

    Thermal performance of the thermal protection system was approximately as predicted. The only extensive anomalies were filler bar scorching and over-predictions in the high Delta p gap heating regions of the orbiter. A technique to predict filler bar scorching has been developed that can aid in defining a solution. Improvement in high Delta p gap heating methodology is still under study. Minor anomalies were also examined for improvements in modeling techniques and prediction capabilities. These include improved definition of low Delta p gap heating, an analytical model for inner mode line convection heat transfer, better modeling of structure, and inclusion of sneak heating. The limited number of problems related to penetration items that presented themselves during orbital flight tests were resolved expeditiously, and designs were changed and proved successful within the time frame of that program.

  8. Probabilistic Analysis of Multistage Interconnection Network Performance

    DTIC Science & Technology

    1992-04-01

    ...independence of channel loads has been pre-computed, and channels have been assigned names generated from the names of their nodes... the example below: > (setq d8x8 (parse-multistage-network deterministically-interwired-8x8-rep)) ...performs considerably worse than either.

  9. Performance analysis of advanced spacecraft TPS

    NASA Technical Reports Server (NTRS)

    Pitts, William C.

    1991-01-01

    Spacecraft entering a planetary atmosphere require a very sophisticated thermal protection system. The materials used must be tailored to each specific vehicle based on its planned mission profiles. Starting with the Space Shuttle, many types of ceramic insulation with various combinations of thermal properties have been developed by others. The development of two new materials is described: A Composite Flexible Blanket Insulation which has a significantly lower effective thermal conductivity than other ceramic blankets; and a Silicon Matrix Composite which has applications at high temperature locations such as wing leading edges. Also, a systematic study is described that considers the application of these materials for a proposed Personnel Launch System. The study shows how most of these available ceramic materials would perform during atmospheric entry of this vehicle. Other specific applications of these thermal protection materials are discussed.

  10. Performance analysis of embedded-wavelet coders

    NASA Astrophysics Data System (ADS)

    Yang, Shih-Hsuan; Liao, Wu-Jie

    2005-09-01

    We analyze the design issues for the SPIHT (set partitioning in hierarchical trees) coding, one of the best-regarded embedded-wavelet-based algorithms in the literature. Equipped with the multiresolution decomposition, progressive scalar quantization, and adaptive arithmetic coding, SPIHT generates highly compact scalable bitstreams suitable for real-time multimedia applications. The design parameters at each stage of SPIHT greatly influence its performance in terms of compression efficiency and computational complexity. We first evaluate two important classes of wavelet filters, orthogonal and biorthogonal. Orthogonal filters are energy-preserving, while biorthogonal linear-phase filters allow symmetric extension across the boundary. Among the various properties of wavelets pertaining to coding, we investigate the effects of energy compaction, energy conservation, and symmetric extension, respectively. Second, the magnitude of biorthogonal wavelet coefficients may not faithfully reflect their actual significance. We explore a scaling scheme in quantization that minimizes the overall mean squared error. Finally, the contribution of entropy coding is measured.
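The energy-conservation property this abstract attributes to orthogonal filters can be sketched with a one-level 2-D orthonormal Haar transform (a stand-in example for illustration; SPIHT is more commonly paired with biorthogonal filters such as the 9/7 pair, which is exactly when the coefficient-scaling issue the abstract raises appears):

```python
import numpy as np

def haar2d(x):
    """One level of a 2-D orthonormal Haar decomposition:
    returns the LL, LH, HL, HH subbands."""
    a = (x[0::2, :] + x[1::2, :]) / np.sqrt(2)   # rows: lowpass
    d = (x[0::2, :] - x[1::2, :]) / np.sqrt(2)   # rows: highpass
    ll = (a[:, 0::2] + a[:, 1::2]) / np.sqrt(2)  # then columns
    lh = (a[:, 0::2] - a[:, 1::2]) / np.sqrt(2)
    hl = (d[:, 0::2] + d[:, 1::2]) / np.sqrt(2)
    hh = (d[:, 0::2] - d[:, 1::2]) / np.sqrt(2)
    return ll, lh, hl, hh

rng = np.random.default_rng(0)
img = rng.standard_normal((64, 64))
ll, lh, hl, hh = haar2d(img)
e_img = np.sum(img**2)
e_sub = sum(np.sum(s**2) for s in (ll, lh, hl, hh))
# For an orthonormal filter bank the two energies match, so coefficient
# magnitude is a faithful significance measure for bit-plane coding.
```

With a biorthogonal filter bank the subband energies no longer sum to the image energy, which motivates the per-subband scaling scheme the abstract explores.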

  11. Performance analysis of microphone array methods

    NASA Astrophysics Data System (ADS)

    Herold, Gert; Sarradj, Ennes

    2017-08-01

    Microphone array methods aim at the characterization of multiple simultaneously operating sound sources. However, existing data processing algorithms have been shown to yield different results when applied to the same input data. The present paper introduces a method for estimating the reliability of such algorithms. Using Monte Carlo simulations, data sets with random variation of selected parameters are generated. Four different microphone array methods are applied to analyze the simulated data sets. The calculated results are compared with the expected outcome, and the dependency of the reliability on several parameters is quantified. It is shown not only that the performance of a method depends on the given source distribution, but also that the methods differ in terms of their sensitivity to imperfect input data.

  12. A New Approach to Aircraft Robust Performance Analysis

    NASA Technical Reports Server (NTRS)

    Gregory, Irene M.; Tierno, Jorge E.

    2004-01-01

    A recently developed algorithm for nonlinear system performance analysis has been applied to an F16 aircraft to begin evaluating the suitability of the method for aerospace problems. The algorithm has a potential to be much more efficient than the current methods in performance analysis for aircraft. This paper is the initial step in evaluating this potential.

  13. Covariance of lucky images: performance analysis

    NASA Astrophysics Data System (ADS)

    Cagigal, Manuel P.; Valle, Pedro J.; Cagigas, Miguel A.; Villó-Pérez, Isidro; Colodro-Conde, Carlos; Ginski, C.; Mugrauer, M.; Seeliger, M.

    2017-01-01

    The covariance of ground-based lucky images is a robust and easy-to-use algorithm that allows us to detect faint companions surrounding a host star. In this paper, we analyse the relevance of the number of processed frames, the frames' quality, the atmosphere conditions and the detection noise on the companion detectability. This analysis has been carried out using both experimental and computer-simulated imaging data. Although the technique allows the detection of faint companions, the camera detection noise and the use of a limited number of frames limit the minimum detectable companion intensity to around 1000 times fainter than the host star when placed at an angular distance corresponding to the first few Airy rings. The reachable contrast could be even larger when detecting companions with the assistance of an adaptive optics system.

  14. Performance analysis of advanced spacecraft TPS

    NASA Technical Reports Server (NTRS)

    Pitts, William C.

    1987-01-01

    The analysis on the feasibility for using metal hydrides in the thermal protection system of cryogenic tanks in space was based on the heat capacity of ice as the phase change material (PCM). It was found that with ice the thermal protection system weight could be reduced by, at most, about 20 percent over an all LI-900 insulation. For this concept to be viable, a metal hydride with considerably more capacity than water would be required. None were found. Special metal hydrides were developed for hydrogen fuel storage applications and it may be possible to do so for the current application. Until this appears promising further effort on this feasibility study does not seem warranted.

  15. Performance, Applications, and Analysis of Rotating Detonation Engine Technologies (Preprint)

    DTIC Science & Technology

    2015-12-01

    AFRL-RQ-WP-TP-2015-0171 PERFORMANCE, APPLICATION, AND ANALYSIS OF ROTATING DETONATION ENGINE TECHNOLOGIES (PREPRINT) Brent A. Rankin... compact engine designs for a broad range of power and propulsion applications. Recent accomplishments related to the performance, application, and

  16. Electromagnetic tracking performance analysis and optimization.

    PubMed

    Qi, Yu; Sadjadi, Hossein; Yeo, Caitlin T; Hashtrudi-Zaad, Keyvan; Fichtinger, Gabor

    2014-01-01

    The purpose of this study is to evaluate the uncertainties of an electromagnetic (EM) tracking system and to improve both the trueness and the precision of the EM tracker. For evaluating errors, we introduce an optical (OP) tracking system and consider its measurement as "ground truth". In the experiment, static data sets and dynamic profiles are both collected in relatively less-metallic environments. Static data sets are for error modeling, and dynamic ones are for testing. To improve the trueness and precision of the EM tracker, tracker calibration based on polynomial fitting and smoothing filters, such as the Kalman filter, the moving-average filter and the local regression filter, are deployed. From the experimental data analysis, as the distance between the transmitter and the sensor of the EM tracking system increases, the trueness and precision tend to decrease. The system's trueness and jitter errors can be modeled as third-order polynomial error equations. After minimizing the positional error and applying smoothing filters, the mean value of error reduction is 36.9%. Our method can effectively reduce both positional systematic error and jitter error caused by EM field distortion. The method is successfully applied to calibrate an EM tracked surgical cautery tool.
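Two of the ingredients named in this abstract, a third-order polynomial error model fitted against distance and a moving-average smoothing filter for jitter, can be sketched on synthetic data. The distances, bias model, and noise level below are invented for illustration and are not the paper's measurements:

```python
import numpy as np

rng = np.random.default_rng(1)
dist = np.linspace(0.1, 0.6, 200)              # transmitter-sensor distance [m]
true_bias = 0.2 + 1.5 * dist + 4.0 * dist**3   # systematic error grows with distance [mm]
measured = true_bias + 0.05 * rng.standard_normal(dist.size)  # bias + jitter

# Calibration: fit a third-order polynomial error model and subtract it.
coeffs = np.polyfit(dist, measured, deg=3)
calibrated = measured - np.polyval(coeffs, dist)

# Jitter suppression: simple moving-average filter over the residuals.
def moving_average(x, w=9):
    return np.convolve(x, np.ones(w) / w, mode="same")

smoothed = moving_average(calibrated)
```

The calibration removes the distance-dependent trueness error, and the filter then knocks down the remaining zero-mean jitter; the paper additionally evaluates Kalman and local-regression filters for the same role.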

  17. Hydrothermal performance analysis of wind barrier structures

    SciTech Connect

    Ojanen, T.; Kohonen, R.O.

    1995-08-01

    Wind barriers are used in structures that have air-permeable thermal insulation materials. Their main function is to prevent the pressure differences from causing airflow-related heat loss through the building envelope. Wind barriers should not contribute to moisture problems in structures by causing condensation or moisture accumulation. This paper presents requirements for the air tightness of wind barriers and results of the hydrothermal analysis of wind barrier structures. The studied wind barrier structures were typical for small houses in Finland--timber-framed structures with lightweight glass wool thermal insulation. The air permeances and the parameter sensitivities were studied numerically both for ideal and nonideal structures. In ideal structures, the material layers were assumed to be tightly (ideally) connected to each other, but in nonideal structures, there were air leakage routes (air cracks) at the interphases of thermal insulation and adjacent material layers. The drying of moisture through different wind barriers was analyzed in laboratory experiments under boundary conditions similar to those in practice, e.g., with outdoor temperatures below and above 0 C. The measured moisture flows were compared with those derived from a wet-cup water vapor permeability test. Also, the liquid flow along the interface of the wind barrier and glass wool was studied in full-scale experiments with high moisture loads.

  18. What Do HPT Consultants Do for Performance Analysis?

    ERIC Educational Resources Information Center

    Kang, Sung

    2017-01-01

    This study was conducted to contribute to the field of Human Performance Technology (HPT) through the validation of the performance analysis process of the International Society for Performance Improvement (ISPI) HPT model, the most representative and frequently utilized process model in the HPT field. The study was conducted using content…

  19. Beyond "Yes or No": The Vulpe' Performance Analysis System. Revised.

    ERIC Educational Resources Information Center

    Hampton Univ., VA.

    The booklet describes the Vulpe' Performance Analysis System (VPAS), a measure of a child's progress in developmental activities which provides a link to instructional programming. In the assessment stage the child's performance is scored according to how much and what type of assistance is required to perform the task. The scale ranges from no…

  1. 76 FR 60939 - Metal Fatigue Analysis Performed by Computer Software

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-30

    ... COMMISSION Metal Fatigue Analysis Performed by Computer Software AGENCY: Nuclear Regulatory Commission... applicants' analyses and methodologies using the computer software package, WESTEMS TM , to demonstrate... by Computer Software Addressees All holders of, and applicants for, a power reactor operating...

  2. Instrument performs nondestructive chemical analysis, data can be telemetered

    NASA Technical Reports Server (NTRS)

    Turkevich, A.

    1965-01-01

    Instrument automatically performs a nondestructive chemical analysis of surfaces and transmits the data in the form of electronic signals. It employs solid-state nuclear particle detectors with a charged nuclear particle source and an electronic pulse-height analyzer.

  3. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
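The Type A (statistical) core of such an analysis can be sketched as follows: the standard uncertainty of a mean and an approximate 95% interval. The readings and the coverage factor k = 2 are illustrative, not PV data from this presentation:

```python
import math

# Repeated readings of one quantity (illustrative values, arbitrary units).
readings = [100.2, 99.8, 100.5, 100.1, 99.9, 100.3, 100.0, 100.2]
n = len(readings)
mean = sum(readings) / n
# Sample standard deviation (n - 1 in the denominator).
s = math.sqrt(sum((x - mean) ** 2 for x in readings) / (n - 1))
u = s / math.sqrt(n)       # standard uncertainty of the mean
U = 2 * u                  # expanded uncertainty, coverage factor k = 2 (~95 %)
interval = (mean - U, mean + U)
```

A full analysis would combine this with Type B terms (instrument specifications, calibration certificates) in quadrature before expanding, which is where the pre-test and post-test analyses mentioned above come in.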

  4. Performance Analysis of Pilot-Aided Forward CDMA Cellular Channel

    DTIC Science & Technology

    2001-09-01

    in the DS-CDMA cellular systems. For example, performance analysis in a Nakagami or a Ricean instead of a Rayleigh fading channel could be done... operating in a Rayleigh-fading, lognormal-shadowing environment. We develop an upper bound on the probability of bit error, including all the participating... Wireless, Performance Analysis, Rayleigh Fading, Lognormal Shadowing, Hata Model, Convolutional Code, Narrowband Filtering, Pilot Tone, Power

  5. Performance analysis of static locking in replicated distributed database systems

    NASA Technical Reports Server (NTRS)

    Kuang, Yinghong; Mukkamala, Ravi

    1991-01-01

    Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects, because they are difficult to evaluate through analysis and time consuming to evaluate through simulation. A technique is used that combines simulation and analysis to closely illustrate the impact of deadlock and evaluate the performance of replicated distributed databases with both shared and exclusive locks.

  6. Performance analysis of static locking in replicated distributed database systems

    NASA Technical Reports Server (NTRS)

    Kuang, Yinghong; Mukkamala, Ravi

    1991-01-01

    Data replications and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects, because it is difficult to evaluate through analysis and time consuming to evaluate through simulation. Here, a technique is discussed that combines simulation and analysis to closely illustrate the impact of deadlock and evaluate performance of replicated distributed databases with both shared and exclusive locks.

  7. Performance analysis of parallel supernodal sparse LU factorization

    SciTech Connect

    Grigori, Laura; Li, Xiaoye S.

    2004-02-05

    We investigate performance characteristics for the LU factorization of large matrices with various sparsity patterns. We consider supernodal right-looking parallel factorization on a bi-dimensional grid of processors, making use of static pivoting. We develop a performance model and validate it using the SuperLU-DIST implementation, real matrices, and the IBM Power3 machine at NERSC. We use this model to obtain performance bounds on parallel computers, to perform scalability analysis, and to identify performance bottlenecks. We also discuss the role of load balance and data distribution in this approach.

  8. School Performance Feedback Systems: Conceptualization, Analysis, and Reflection.

    ERIC Educational Resources Information Center

    Visscher, Adrie J.; Coe, Robert

    2003-01-01

    Presents a conceptualization and analysis of school performance feedback systems (SPFS), followed by a framework that includes factors crucial to their use and effects. Provides two examples of SPFS use. Summarizes evidence on the process, problems, and impact of SPFS; suggests strategies for using performance feedback to improve schools.…

  9. A Systemic Cause Analysis Model for Human Performance Technicians

    ERIC Educational Resources Information Center

    Sostrin, Jesse

    2011-01-01

    This article presents a systemic, research-based cause analysis model for use in the field of human performance technology (HPT). The model organizes the most prominent barriers to workplace learning and performance into a conceptual framework that explains and illuminates the architecture of these barriers that exist within the fabric of everyday…

  11. An Exploratory Analysis of Performance on the SAT.

    ERIC Educational Resources Information Center

    Wainer, Howard

    1984-01-01

    Techniques of exploratory data analysis (EDA) were used to decompose data tables portraying performance of ethnic groups on the Scholastic Aptitude Test. These analyses indicate the size and structure of differences in performance among groups studied, nature of changes across time, and interactions between group membership and time. (Author/DWH)

  12. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    SciTech Connect

    Bremer, Peer-Timo; Mohr, Bernd; Schulz, Martin; Pascucci, Valerio; Gamblin, Todd; Brunst, Holger

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  14. Using Importance-Performance Analysis to Evaluate Training

    ERIC Educational Resources Information Center

    Siniscalchi, Jason M.; Beale, Edward K.; Fortuna, Ashley

    2008-01-01

    Importance-performance analysis (IPA) is a tool that can provide timely and usable feedback to improve training. IPA measures the gap between how important a class is and how good (performance) it is perceived to be by a student, and presents the results on a 2x2 matrix. The quadrant in which the data land in this matrix aids in determining potential future action.…
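
    The 2x2 classification described above can be sketched in a few lines. This is a hedged illustration: the attribute names and ratings are hypothetical, and the quadrant labels follow conventional IPA terminology rather than anything specified in the article.

```python
# Minimal importance-performance analysis (IPA) sketch: each training
# attribute gets an importance and a performance rating, and the grand
# means split the plane into the four classic IPA quadrants.

def ipa_quadrant(importance, performance, mean_imp, mean_perf):
    """Classify one attribute into an IPA quadrant."""
    if importance >= mean_imp and performance < mean_perf:
        return "Concentrate here"       # important but under-performing
    if importance >= mean_imp and performance >= mean_perf:
        return "Keep up the good work"
    if importance < mean_imp and performance < mean_perf:
        return "Low priority"
    return "Possible overkill"          # performing well on a low-importance item

# Hypothetical class attributes: (importance, performance) on a 1-5 scale
ratings = {
    "pacing":    (4.6, 3.1),
    "materials": (4.2, 4.4),
    "venue":     (2.8, 2.5),
    "catering":  (2.5, 4.0),
}
mean_imp = sum(i for i, _ in ratings.values()) / len(ratings)
mean_perf = sum(p for _, p in ratings.values()) / len(ratings)
quadrants = {name: ipa_quadrant(i, p, mean_imp, mean_perf)
             for name, (i, p) in ratings.items()}
```

    Attributes landing in the "Concentrate here" quadrant are the natural candidates for future training revisions.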

  15. Analysis of a Ubiquitous Performance Support System for Teachers

    ERIC Educational Resources Information Center

    Chen, Chao-Hsiu; Hwang, Gwo-Jen; Yang, Tzu-Chi; Chen, Shih-Hsuan; Huang, Shen-Yu

    2009-01-01

    This paper describes a Ubiquitous Performance Support System for Teachers (UPSST) and its implementation model. Personal Digital Assistants (PDAs) were used as the platform to support high-school teachers. Based on concepts of Electronic Performance Support Systems and design-based research, the authors conducted an iterative process of analysis,…

  16. Merging Right: Questions of Access and Merit in South African Higher Education Reform, 1994-2002

    ERIC Educational Resources Information Center

    Elliott, John

    2005-01-01

    The dismantling of South Africa's apartheid-controlled education system after 1994 brought with it unprecedented policy complications, among them the question of how best to integrate the desiderata of access and merit in school education and tertiary sectors. For the higher education sector, institutional mergers became an increasingly visible…

  17. [Phenotypic diagnosis of primary immunodeficiencies in Antioquia, Colombia, 1994-2002].

    PubMed

    Montoya, Carlos Julio; Henao, Julieta; Salgado, Helí; Olivares, María M; López, Juan A; Rugeles, Claudia; Franco, José Luis; Orrego, Julio; García, Diana M; Patiño, Pablo J

    2002-12-01

    Recurrent infections are a frequent cause of medical visits. They can be due to a heterogeneous group of dysfunctions that increase susceptibility to pathogenic and opportunistic microorganisms, such as immunological deficiencies. To enable timely, rational treatment and to guide the molecular diagnosis of primary immunodeficiency diseases, we established a program for the phenotypic diagnosis of these illnesses in Antioquia, Colombia, including clinical and laboratory evaluations of patients presenting recurrent infections with abnormal evolution. Between August 1, 1994 and July 31, 2002, a phenotypic diagnosis of primary immunodeficiency was made in 98 patients. Similar to data reported in the literature, antibody deficiencies were the most frequent (40.8%), followed by combined deficiencies (21.4%). This phenotypic characterization has allowed appropriate treatment for each patient and, in some cases, functional and molecular studies that can lead to a definitive molecular diagnosis.

  19. Long-tailed duck (Clangula hyemalis) microsatellite DNA data; Alaska, Canada, Russia, 1994-2002

    USGS Publications Warehouse

    Wilson, Robert E.; Talbot, Sandra L.

    2016-01-01

    This data set describes nuclear microsatellite genotypes derived from twelve autosomal loci (6AB, Aph02, Aph08, Aph19, Aph23, Bca10, Bca11, Hhi5, Sfi11, Smo07, Smo09, and CRG), and two Z-linked microsatellite loci (Bca4 and Smo1). A total of 111 Long-tailed Ducks were examined for this genotyping with samples coming from the two primary breeding locales within Alaska (Arctic Coastal Plain of Alaska and the Yukon Delta, Western Alaska) and a representative locale in the central Canadian Arctic (Queen Maud Gulf Bird Sanctuary, Nunavut, Canada). The sex of most samples was determined in the field by plumage and later confirmed by using the CHD molecular sexing protocol (Griffiths et al., 1998).

  20. Phosphorus and suspended sediment load estimates for the Lower Boise River, Idaho, 1994-2002

    USGS Publications Warehouse

    Donato, Mary M.; MacCoy, Dorene E.

    2004-01-01

    The U.S. Geological Survey used LOADEST, newly developed load estimation software, to develop regression equations and estimate loads of total phosphorus (TP), dissolved orthophosphorus (OP), and suspended sediment (SS) from January 1994 through September 2002 at four sites on the lower Boise River: Boise River below Diversion Dam near Boise, Boise River at Glenwood Bridge at Boise, Boise River near Middleton, and Boise River near Parma. The objective was to help the Idaho Department of Environmental Quality develop and implement total maximum daily loads (TMDLs) by providing spatial and temporal resolution for phosphorus and sediment loads and enabling load estimates made by mass balance calculations to be refined and validated. Regression models for TP and OP generally were well fit on the basis of regression coefficients of determination (R2), but results varied in quality from site to site. The TP and OP results for Glenwood probably were affected by the upstream wastewater-treatment plant outlet, which provides a variable phosphorus input that is unrelated to river discharge. Regression models for SS generally were statistically well fit. Regression models for Middleton for all constituents, although statistically acceptable, were of limited usefulness because sparse and intermittent discharge data at that site caused many gaps in the resulting estimates. Although the models successfully simulated measured loads under predominant flow conditions, errors in TP and SS estimates at Middleton and in TP estimates at Parma were larger during high- and low-flow conditions. This shortcoming might be improved if additional concentration data for a wider range of flow conditions were available for calibrating the model. The average estimated daily TP load ranged from less than 250 pounds per day (lb/d) at Diversion to nearly 2,200 lb/d at Parma. Estimated TP loads at all four sites displayed cyclical variations coinciding with seasonal fluctuations in discharge. 
Estimated annual loads of TP ranged from less than 8 tons at Diversion to 570 tons at Parma. Annual loads of dissolved OP peaked in 1997 at all sites and were consistently higher at Parma than at the other sites. The ratio of OP to TP varied considerably throughout the year at all sites. Peaks in the OP:TP ratio occurred primarily when flows were at their lowest annual stages; estimated seasonal OP:TP ratios were highest in autumn at all sites. Conversely, when flows were high, the ratio was low, reflecting increased TP associated with particulate matter during high flows. Parma exhibited the highest OP:TP ratio during all seasons, at least 0.60 in spring and nearly 0.90 in autumn. Similar OP:TP ratios were estimated at Glenwood. Whereas the OP:TP ratio for Parma and Glenwood peaked in November or December, decreased from January through May, and increased again after June, estimates for Diversion showed nearly the opposite pattern: ratios were highest in July and lowest in January and February. This difference might reflect complex biological and geochemical processes involving nutrient cycling in Lucky Peak Lake, but further data are needed to substantiate this hypothesis. Estimated monthly average SS loads were highest at Diversion, about 400 tons per day (ton/d). Average annual loads from 1994 through 2002 were 144,000 tons at Diversion, 33,000 tons at Glenwood, and 88,000 tons at Parma. Estimated SS loads peaked in the spring at all sites, coinciding with high flows. Increases in TP in the reach from Diversion to Glenwood ranged from 200 to 350 lb/d. Decreases in TP were small in this reach only during high flows in January and February 1997. Decreases in SS were large during high-flow conditions, indicating sediment deposition in the reach. Intermittent data at Middleton indicated that increases and decreases in TP in the reach from Glenwood to Middleton were during low- and high-flow conditions, respectively. All constituents increased in the r
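
    The regression-based load estimation described above follows a rating-curve idea: fit ln(load) against ln(discharge) on the sampled days, then predict loads from the continuous discharge record. The sketch below is a bare-bones, single-predictor version of that idea (LOADEST itself supports additional seasonal and time terms); all numbers are illustrative, not Boise River data.

```python
# Rating-curve sketch: ordinary least squares of ln(load) on ln(discharge),
# then daily load prediction from continuous discharge.
import math

def fit_log_linear(discharge, load):
    """OLS fit of ln(load) = a + b * ln(discharge)."""
    x = [math.log(q) for q in discharge]
    y = [math.log(v) for v in load]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

def estimate_loads(a, b, daily_discharge):
    """Back-transform the fitted line to loads for each day."""
    return [math.exp(a + b * math.log(q)) for q in daily_discharge]

# Illustrative sampled days: discharge (cfs) and measured TP load (lb/d)
q_sampled = [250, 400, 800, 1500, 3000]
tp_sampled = [60, 110, 260, 520, 1200]
a, b = fit_log_linear(q_sampled, tp_sampled)
daily_loads = estimate_loads(a, b, [300, 1000, 2500])
```

    A production estimator would also correct for the bias introduced by back-transforming from log space, which LOADEST handles internally.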

  1. Using Latent Class Analysis To Set Academic Performance Standards.

    ERIC Educational Resources Information Center

    Brown, Richard S.

    The use of latent class analysis for establishing student performance standards was studied. Latent class analysis (LCA) is an established procedure for investigating the latent structure of a set of data. LCA presumes that groups, classes, or respondents differ qualitatively from one another, and that these differences account for all of the…

  2. Using Importance-Performance Analysis To Evaluate Teaching Effectiveness.

    ERIC Educational Resources Information Center

    Attarian, Aram

    This paper introduces Importance-Performance (IP) analysis as a method to evaluate teaching effectiveness in a university outdoor program. Originally developed for use in the field of marketing, IP analysis is simple and easy to administer, and provides the instructor with a visual representation of what teaching attributes are important, how…

  3. Preliminary Analysis of Remote Monitoring & Robotic Concepts for Performance Confirmation

    SciTech Connect

    D.A. McAffee

    1997-02-18

    As defined in 10 CFR Part 60.2, Performance Confirmation is the "program of tests, experiments and analyses which is conducted to evaluate the accuracy and adequacy of the information used to determine with reasonable assurance that the performance objectives for the period after permanent closure will be met". The overall Performance Confirmation program begins during site characterization and continues up to repository closure. The main purpose of this document is to develop, explore and analyze initial concepts for using remotely operated and robotic systems in gathering repository performance information during Performance Confirmation. This analysis focuses primarily on possible Performance Confirmation related applications within the emplacement drifts after waste packages have been emplaced (post-emplacement) and before permanent closure of the repository (preclosure). This will be a period of time lasting approximately 100 years and basically coincides with the Caretaker phase of the project. This analysis also examines, to a lesser extent, some applications related to Caretaker operations. A previous report examined remote handling and robotic technologies that could be employed during the waste package emplacement phase of the project (Reference 5.1). This analysis is being prepared to provide an early investigation of possible design concepts and technical challenges associated with developing remote systems for monitoring and inspecting activities during Performance Confirmation. The writing of this analysis preceded formal development of Performance Confirmation functional requirements and program plans and therefore examines, in part, the fundamental Performance Confirmation monitoring needs and operating conditions. The scope and primary objectives of this analysis are to: (1) Describe the operating environment and conditions expected in the emplacement drifts during the preclosure period. (Presented in Section 7.2). (2) Identify and discuss the

  4. Advanced Risk Analysis for High-Performing Organizations

    DTIC Science & Technology

    2006-01-01

    The operational environment for many types of organizations is changing. Changes in operational environments are driving the need for advanced risk analysis techniques. Many types of risk prevalent in today's operational environments (e.g., event risks, inherited risk) are not readily identified using traditional risk analysis techniques. Mission Assurance Analysis Protocol (MAAP) is one technique that high performers can use to identify and mitigate the risks arising from operational complexity.

  5. Analysis of portable impactor performance for enumeration of viable bioaerosols.

    PubMed

    Yao, Maosheng; Mainelis, Gediminas

    2007-07-01

    Portable impactors are increasingly being used to estimate concentration of bioaerosols in residential and occupational environments; however, little data are available about their performance. This study investigated the overall performances of the SMA MicroPortable, BioCulture, Microflow, Microbiological Air Sampler (MAS-100), Millipore Air Tester, SAS Super 180, and RCS High Flow portable microbial samplers when collecting bacteria and fungi both indoors and outdoors. The performance of these samplers was compared with that of the BioStage impactor. The Button Aerosol Sampler equipped with gelatin filter was also included in the study. Results showed that the sampling environment can have a statistically significant effect on sampler performance, most likely due to the differences in airborne microorganism composition and/or their size distribution. Data analysis using analysis of variance showed that the relative performance of all samplers (except the RCS High Flow and MAS-100) was statistically different (lower) compared with the BioStage. The MAS-100 also had statistically higher performance compared with other portable samplers except the RCS High Flow. The Millipore Air Tester and the SMA had the lowest performances. The relative performance of the impactors was described using a multiple linear regression model (R(2) = 0.83); the effects of the samplers' cutoff sizes and jet-to-plate distances as predictor variables were statistically significant. The data presented in this study will help field professionals in selecting bioaerosol samplers. The developed empirical formula describing the overall performance of bioaerosol impactors can assist in sampler design.
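
    The multiple linear regression the study reports (relative sampler performance as a function of cutoff size and jet-to-plate distance) can be sketched with plain normal equations. Everything below is a hedged illustration: the predictor and response values are made up, and the response is generated from an exact linear model so the fitted coefficients are fully recoverable.

```python
# Two-predictor ordinary least squares via the 3x3 normal equations,
# solved with Gaussian elimination. x1 mimics the impactor cutoff size
# and x2 the jet-to-plate distance; all values are illustrative.

def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

def fit_ols2(x1, x2, y):
    """Fit y = b0 + b1*x1 + b2*x2 by least squares."""
    n = len(y)
    XtX = [[n, sum(x1), sum(x2)],
           [sum(x1), sum(a * a for a in x1), sum(a * b for a, b in zip(x1, x2))],
           [sum(x2), sum(a * b for a, b in zip(x1, x2)), sum(b * b for b in x2)]]
    Xty = [sum(y),
           sum(a * c for a, c in zip(x1, y)),
           sum(b * c for b, c in zip(x2, y))]
    return solve3(XtX, Xty)

d50 = [0.6, 1.0, 1.5, 2.0, 0.8, 1.2]   # hypothetical cutoff sizes (um)
gap = [1.0, 2.0, 1.5, 3.0, 2.5, 1.8]   # hypothetical jet-to-plate distances (mm)
perf = [2 + 0.5 * d - 0.3 * j for d, j in zip(d50, gap)]  # exact linear response
coeffs = fit_ols2(d50, gap, perf)      # recovers b0=2, b1=0.5, b2=-0.3
```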

  6. The development of a reliable amateur boxing performance analysis template.

    PubMed

    Thomson, Edward; Lamb, Kevin; Nicholas, Ceri

    2013-01-01

    The aim of this study was to devise a valid performance analysis system for the assessment of the movement characteristics associated with competitive amateur boxing and assess its reliability using analysts of varying experience of the sport and performance analysis. Key performance indicators to characterise the demands of an amateur contest (offensive, defensive and feinting) were developed and notated using a computerised notational analysis system. Data were subjected to intra- and inter-observer reliability assessment using median sign tests and calculating the proportion of agreement within predetermined limits of error. For all performance indicators, intra-observer reliability revealed non-significant differences between observations (P > 0.05) and high agreement was established (80-100%) regardless of whether exact or the reference value of ±1 was applied. Inter-observer reliability was less impressive for both analysts (amateur boxer and experienced analyst), with the proportion of agreement ranging from 33-100%. Nonetheless, there was no systematic bias between observations for any indicator (P > 0.05), and the proportion of agreement within the reference range (±1) was 100%. A reliable performance analysis template has been developed for the assessment of amateur boxing performance and is available for use by researchers, coaches and athletes to classify and quantify the movement characteristics of amateur boxing.
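
    The agreement statistic described above (the proportion of paired observations within a reference limit of error of ±1) is straightforward to compute directly; the punch counts below are hypothetical, not the study's data.

```python
# Proportion of agreement between two passes of notational analysis,
# within a limit of error (0 = exact agreement, 1 = the ±1 reference).

def proportion_agreement(obs1, obs2, limit=1):
    """Percent of paired observations agreeing within `limit`."""
    agree = sum(1 for a, b in zip(obs1, obs2) if abs(a - b) <= limit)
    return 100.0 * agree / len(obs1)

# Hypothetical counts of one performance indicator across six rounds
first_pass  = [12, 9, 15, 7, 11, 14]
second_pass = [12, 8, 16, 7, 10, 17]
exact  = proportion_agreement(first_pass, second_pass, limit=0)
within = proportion_agreement(first_pass, second_pass, limit=1)
```

    With these numbers, exact agreement is 33.3% but agreement within the ±1 reference rises to 83.3%, illustrating why the study reports both.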

  7. Performance criteria for emergency medicine residents: a job analysis.

    PubMed

    Blouin, Danielle; Dagnone, Jeffrey Damon

    2008-11-01

    A major role of admission interviews is to assess a candidate's suitability for a residency program. Structured interviews have greater reliability and validity than do unstructured ones. The development of content for a structured interview is typically based on the dimensions of performance that are perceived as important to succeed in a particular line of work. A formal job analysis is normally conducted to determine these dimensions. The dimensions essential to succeed as an emergency medicine (EM) resident have not yet been studied. We aimed to analyze the work of EM residents to determine these essential dimensions. The "critical incident technique" was used to generate scenarios of poor and excellent resident performance. Two reviewers independently read each scenario and labelled the performance dimensions that were reflected in each. All labels assigned to a particular scenario were pooled and reviewed again until a consensus was reached. Five faculty members (25% of our total faculty) comprised the subject experts. Fifty-one incidents were generated and 50 different labels were applied. Eleven dimensions of performance applied to at least 5 incidents. "Professionalism" was the most valued performance dimension, represented in 56% of the incidents, followed by "self-confidence" (22%), "experience" (20%) and "knowledge" (20%). "Professionalism," "self-confidence," "experience" and "knowledge" were identified as the performance dimensions essential to succeed as an EM resident based on our formal job analysis using the critical incident technique. Performing a formal job analysis may assist training program directors with developing admission interviews.

  8. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect

    Wells, C

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined here as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
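
    The interval described above is conventionally obtained by combining independent elemental uncertainties in quadrature and expanding by a coverage factor; the component values below are illustrative assumptions, not figures from the presentation.

```python
# Root-sum-square combination of independent standard uncertainties,
# expanded by a coverage factor to give an interval about the measurement.
import math

def combined_uncertainty(components):
    """Root-sum-square of independent standard uncertainties."""
    return math.sqrt(sum(u ** 2 for u in components))

# Hypothetical elemental uncertainties (watts) for one module measurement:
# irradiance sensor, temperature correction, data acquisition, curve fit
elemental = [1.2, 0.8, 0.5, 0.3]
u_total = combined_uncertainty(elemental)
measured_power = 100.0   # watts
k = 2                    # coverage factor for an approximately 95% interval
interval = (measured_power - k * u_total, measured_power + k * u_total)
```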

  9. Performing modal analysis for multi-metric measurements: a discussion

    NASA Astrophysics Data System (ADS)

    Soman, R.; Majewska, K.; Radzienski, M.; Ostachowicz, W.

    2016-04-01

    This work addresses the severe lack of literature in the area of modal analysis for multi-metric sensing. The paper aims to provide a step-by-step tutorial on performing modal analysis using Fiber Bragg Grating (FBG) strain sensors and a Laser Doppler Vibrometer (LDV) for displacement measurements. The paper discusses in detail the different parameters that affect the accuracy of experimental results. It highlights the often implied but unmentioned problems that researchers face while performing experiments. The paper tries to bridge the gap between the theoretical idea of the experiment and its actual execution by discussing each aspect, including the choice of specimen, boundary conditions, sensors, sensor position, excitation mechanism and its location, as well as the post-processing of the data. The paper may be viewed as a checklist for performing modal analysis in order to ensure high-quality measurements by preventing systematic errors from creeping in.

  10. Advanced Video Analysis Needs for Human Performance Evaluation

    NASA Technical Reports Server (NTRS)

    Campbell, Paul D.

    1994-01-01

    Evaluators of human task performance in space missions make use of video as a primary source of data. Extraction of relevant human performance information from video is often a labor-intensive process requiring a large amount of time on the part of the evaluator. Based on the experiences of several human performance evaluators, needs were defined for advanced tools which could aid in the analysis of video data from space missions. Such tools should increase the efficiency with which useful information is retrieved from large quantities of raw video. They should also provide the evaluator with new analytical functions which are not present in currently used methods. Video analysis tools based on the needs defined by this study would also have uses in U.S. industry and education. Evaluation of human performance from video data can be a valuable technique in many industrial and institutional settings where humans are involved in operational systems and processes.

  12. Temporal geospatial analysis of secondary school students’ examination performance

    NASA Astrophysics Data System (ADS)

    Nik Abd Kadir, ND; Adnan, NA

    2016-06-01

    Malaysia's Ministry of Education has improved its data organization by establishing a geographical information system (GIS) school database; however, little further analysis has been done with geospatial analysis tools. Mapping has emerged as a communication tool and an effective way to publish digital and statistical data such as school performance results. The objective of this study is to analyse the performance of Kelantan's state secondary schools, using science and mathematics scores from the Sijil Pelajaran Malaysia examination for the years 2010 to 2014, with the aid of GIS software and geospatial analysis. School performance by grade point average (GPA), from Grade A to Grade G, was interpolated and mapped, and query analysis was carried out using geospatial tools. This study will benefit the education sector in analysing student performance not only in Kelantan but throughout Malaysia, and publishing results as maps supports better planning and decision making to prepare young Malaysians for the challenges of the education system.
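
    Interpolating point GPA values onto a map surface, as described above, is commonly done with inverse distance weighting (IDW); the coordinates and GPA values below are hypothetical, not Kelantan schools.

```python
# Minimal inverse-distance-weighting (IDW) interpolation of school GPA
# point data onto an arbitrary map location.
import math

def idw(points, target, power=2):
    """Interpolate a value at `target` from (x, y, value) points."""
    num = den = 0.0
    for x, y, value in points:
        d = math.hypot(x - target[0], y - target[1])
        if d == 0:
            return value        # target coincides with a data point
        w = d ** -power
        num += w * value
        den += w
    return num / den

# Hypothetical school locations (x, y) with their GPA values
schools = [(0.0, 0.0, 2.5), (1.0, 0.0, 3.5), (0.0, 1.0, 3.0)]
gpa_estimate = idw(schools, (0.5, 0.5))
```

    Because the three sample points are equidistant from the target here, the estimate reduces to their plain mean; nearer schools dominate in the general case.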

  13. Visualization and Data Analysis for High-Performance Computing

    SciTech Connect

    Sewell, Christopher Meyer

    2016-09-27

    This is a set of slides from a guest lecture for a class at the University of Texas, El Paso on visualization and data analysis for high-performance computing. The topics covered are the following: trends in high-performance computing; scientific visualization, such as OpenGL, ray tracing and volume rendering, VTK, and ParaView; data science at scale, such as in-situ visualization, image databases, distributed memory parallelism, shared memory parallelism, VTK-m, "big data", and then an analysis example.

  14. Performance Analysis of Multilevel Parallel Applications on Shared Memory Architectures

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Caubet, Jordi; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    In this paper we describe how to apply powerful performance analysis techniques to understand the behavior of multilevel parallel applications. We use the Paraver/OMPItrace performance analysis system for our study. This system consists of two major components: The OMPItrace dynamic instrumentation mechanism, which allows the tracing of processes and threads and the Paraver graphical user interface for inspection and analyses of the generated traces. We describe how to use the system to conduct a detailed comparative study of a benchmark code implemented in five different programming paradigms applicable for shared memory

  15. Artificial Intelligence: An Analysis of Potential Applications to Training, Performance Measurement, and Job Performance Aiding.

    DTIC Science & Technology

    1983-09-01

    AD-A133 592. Artificial Intelligence: An Analysis of Potential Applications to Training, Performance Measurement, and Job Performance Aiding. Interim technical paper AFHRL-TP-83-28, Denver Research Institute, J. Richardson, September 1983. Keywords: artificial intelligence, military research, computer-aided diagnosis, performance tests.

  16. Quantitative analysis of regional myocardial performance in coronary artery disease

    NASA Technical Reports Server (NTRS)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings are presented from a group of subjects with significant coronary artery stenosis and from a group of controls, determined by use of a quantitative method for the study of regional myocardial performance based on frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions and the timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  18. Integrated design environment for human performance and human reliability analysis

    SciTech Connect

    Nelson, W.R.

    1997-05-01

    Work over the last few years at the Idaho National Engineering and Environmental Laboratory (INEEL) has included a major focus on applying human performance and human reliability knowledge and methods as an integral element of system design and development. This work has been pursued in programs in a wide variety of technical domains, beginning with nuclear power plant operations. Since the mid-1980s the laboratory has transferred the methods and tools developed in the nuclear domain to military weapons systems and aircraft, offshore oil and shipping operations, and commercial aviation operations and aircraft design. Through these diverse applications the laboratory has developed an integrated approach and framework for application of human performance analysis, human reliability analysis (HRA), operational data analysis, and simulation studies of human performance to the design and development of complex systems. This approach was recently tested in the NASA Advanced Concepts Program "Structured Human Error Analysis for Aircraft Design." This program resulted in the prototype software tool THEA (Tool for Human Error Analysis) for incorporating human error analysis in the design of commercial aircraft, focusing on airplane maintenance tasks. Current effort is directed toward applying this framework to the development of advanced Air Traffic Management (ATM) systems as part of NASA's Advanced Air Transportation Technologies (AATT) program. This paper summarizes the approach, describes recent and current applications in commercial aviation, and provides perspectives on how the approach could be utilized in the nuclear power industry.

  19. Relative performance analysis of IR FPA technologies from the perspective of system level performance

    NASA Astrophysics Data System (ADS)

    Haran, Terence L.; James, J. Christopher; Cincotta, Tomas E.

    2017-08-01

    The majority of high-performance infrared systems today utilize FPAs composed of intrinsic direct-bandgap semiconductor photon detectors such as MCT or InSb. Quantum well detector technologies such as QWIPs, QDIPs, and SLS photodetectors are potentially lower-cost alternatives to MCT and InSb, but their relative performance has not been sufficiently high to allow widespread adoption outside a handful of applications. While detectors are often evaluated using figures of merit such as NETD or D∗, these metrics, which fold in many underlying aspects such as spectral quantum efficiency, dark current, well size, MTF, and array response uniformity, may be far removed from the metrics used to judge a system's performance in an operationally relevant scenario. True comparisons of various detector technologies from the perspective of end-to-end system performance have rarely been conducted, especially considering the rapid progress of the newer quantum well technologies. System-level models such as the US Army's Night Vision Integrated Performance Model (NV-IPM) can calculate image contrast and spatial frequency content using data from the target/background, the intervening atmosphere, and system components. This paper includes results from a performance parameter sensitivity analysis using NV-IPM to determine the relative importance of various FPA performance parameters to the overall performance of a long-range imaging system. Parameters included are: QE, dark current density, quantum well capacity, downstream readout noise, well fill, image frame rate, frame averaging, and residual fixed pattern noise. The state of the art for XBn, QWIP, and SLS detector technologies operating in the MWIR and LWIR bands will be surveyed to assess the performance of quantum structures compared to MCT and InSb. The intent is to provide a comprehensive assessment of quantum detector performance and to identify areas where increased research

  20. Design and performance analysis of gas and liquid radial turbines

    NASA Astrophysics Data System (ADS)

    Tan, Xu

    In the first part of the research, pumps running in reverse as turbines are studied. This work uses experimental data from a wide range of pumps representing centrifugal pump configurations in terms of specific speed. Based on specific speed and specific diameter, an accurate correlation is developed to predict the performance at the best efficiency point of a centrifugal pump in turbine-mode operation. The proposed method is compared to nine previous methods found in the literature, and the comparison shows it to be the most accurate to date. The method is meaningful because it is based on both specific speed and specific diameter, and it can be further complemented and refined with future tests to increase its accuracy. The second part of the research is focused on the design and analysis of a radial gas turbine. The specification of the turbine is obtained from a solar biogas hybrid system, which is theoretically analyzed and constructed around a purchased compressor. Theoretical analysis results in a specification of 100 lb/min mass flow, 900 ºC inlet total temperature, and 1.575 atm inlet total pressure. The 1-D and 3-D geometry of the rotor is generated based on Aungier's method, and 1-D loss-model analysis and 3-D CFD simulations are performed to examine the rotor's performance. The total-to-total efficiency of the rotor is more than 90%. With the help of CFD analysis, modifications to the preliminary design yielded optimized aerodynamic performance. Finally, a theoretical performance analysis of the hybrid system is performed with the designed turbine.
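
As an illustration of the quantities involved, here is a minimal sketch of the dimensionless specific speed and specific diameter at the best efficiency point, together with a placeholder power-law conversion to turbine-mode operation. The coefficients `k` and `m` are illustrative assumptions, not the correlation fitted in this work.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def specific_speed(omega_rad_s, q_m3_s, head_m):
    # Dimensionless specific speed at the best efficiency point (BEP):
    # Ns = omega * sqrt(Q) / (g*H)^0.75
    return omega_rad_s * math.sqrt(q_m3_s) / (G * head_m) ** 0.75

def specific_diameter(d_m, q_m3_s, head_m):
    # Dimensionless specific diameter at the BEP:
    # Ds = D * (g*H)^0.25 / sqrt(Q)
    return d_m * (G * head_m) ** 0.25 / math.sqrt(q_m3_s)

def turbine_bep(head_pump_m, q_pump_m3_s, ns, k=1.2, m=-0.1):
    # Hypothetical power-law correlation: head and flow conversion factors
    # are modeled as k * ns**m (k and m are placeholders, not the paper's
    # fitted values); returns estimated turbine-mode BEP head and flow.
    c = k * ns ** m
    return c * head_pump_m, c * q_pump_m3_s
```

A real correlation of this form would be fitted to the test data of many pumps; the point of the sketch is only that both `specific_speed` and `specific_diameter` enter the prediction.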

  1. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    SciTech Connect

    Bhatele, Abhinav

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs, and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to the physicists and computer scientists developing the simulation codes and runtimes, respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC, and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC Applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA), and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  2. Analysis of Photovoltaic System Energy Performance Evaluation Method

    SciTech Connect

    Kurtz, S.; Newmiller, J.; Kimber, A.; Flottemesch, R.; Riley, E.; Dierauf, T.; McKee, J.; Krishnani, P.

    2013-11-01

    Documentation of the energy yield of a large photovoltaic (PV) system over a substantial period can be useful to measure a performance guarantee, as an assessment of the health of the system, for verification of a performance model to then be applied to a new system, or for a variety of other purposes. Although the measurement of this performance metric might appear to be straightforward, there are a number of subtleties associated with variations in weather and imperfect data collection that complicate the determination and data analysis. A performance assessment is most valuable when it is completed with a very low uncertainty and when the subtleties are systematically addressed, yet currently no standard exists to guide this process. This report summarizes a draft methodology for an Energy Performance Evaluation Method, the philosophy behind the draft method, and the lessons that were learned by implementing the method.
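
One such subtlety is weather correction. The sketch below shows a simple performance-ratio-style comparison of measured energy against a rating scaled by insolation with a linear temperature correction; the temperature coefficient `gamma` is a typical c-Si value and the whole form is an illustration, not the draft method itself.

```python
def weather_corrected_ratio(measured_kwh, poa_insolation_kwh_m2, rated_kw,
                            cell_temp_c, gamma=-0.0040, t_ref=25.0):
    # Expected energy = nameplate rating (at 1 kW/m^2 reference irradiance)
    # times plane-of-array insolation, corrected linearly for cell
    # temperature; gamma (1/degC) is a typical c-Si power coefficient.
    expected_kwh = rated_kw * poa_insolation_kwh_m2 * (
        1.0 + gamma * (cell_temp_c - t_ref))
    # Ratio near 1.0 means the system delivered the weather-adjusted
    # expectation; persistently low values suggest a health problem.
    return measured_kwh / expected_kwh
```

In practice the correction is applied per interval (e.g. hourly) and summed, and bad or missing data intervals must be filtered first, which is where much of the difficulty noted above lies.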

  3. An Empirical Analysis of Human Performance and Nuclear Safety Culture

    SciTech Connect

    Jeffrey Joe; Larry G. Blackwood

    2006-06-01

    The purpose of this analysis, which was conducted for the US Nuclear Regulatory Commission (NRC), was to test whether an empirical connection exists between human performance and nuclear power plant safety culture. This was accomplished by analyzing the relationship between a measure of human performance and a plant's Safety Conscious Work Environment (SCWE). SCWE is an important component of the safety culture concept the NRC has developed, but it is not synonymous with safety culture. A SCWE is an environment in which employees are encouraged to raise safety concerns both to their own management and to the NRC without fear of harassment, intimidation, retaliation, or discrimination. Because the relationship between human performance and allegations is intuitively reciprocal, both directions of the relationship need exploration, so two series of analyses were performed. First, because human performance data could be indicative of safety culture, regression analyses were performed using human performance data to predict SCWE. Second, because safety culture likely contributes to human performance issues at a plant, a second set of regressions was performed using allegations to predict HFIS results.
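
The regression step can be illustrated with ordinary least squares; the data below are synthetic and the variable names illustrative, since the study's actual data are not reproduced here.

```python
def ols(x, y):
    # Simple-regression OLS: slope, intercept, and r^2 for y regressed on x.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    slope = sxy / sxx
    intercept = my - slope * mx
    r2 = (sxy ** 2) / (sxx * syy) if syy else 0.0
    return slope, intercept, r2

# Hypothetical example: does a human-performance event rate predict a
# SCWE-style index across plants? (numbers are made up)
event_rate = [0.5, 1.0, 1.5, 2.0, 3.0]
scwe_index = [4.1, 3.8, 3.6, 3.1, 2.7]
slope, intercept, r2 = ols(event_rate, scwe_index)
```

A negative slope with high r^2 would suggest the association the study tests for; the reverse regression (SCWE predicting performance) uses the same machinery with the roles of x and y swapped.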

  4. Performance Indicators in Math: Implications for Brief Experimental Analysis of Academic Performance

    ERIC Educational Resources Information Center

    VanDerheyden, Amanda M.; Burns, Matthew K.

    2009-01-01

    Brief experimental analysis (BEA) can be used to specify intervention characteristics that produce positive learning gains for individual students. A key challenge to the use of BEA for intervention planning is the identification of performance indicators (including topography of the skill, measurement characteristics, and decision criteria) that…

  5. Path Analysis Tests of Theoretical Models of Children's Memory Performance

    ERIC Educational Resources Information Center

    DeMarie, Darlene; Miller, Patricia H.; Ferron, John; Cunningham, Walter R.

    2004-01-01

    Path analysis was used to test theoretical models of relations among variables known to predict differences in children's memory--strategies, capacity, and metamemory. Children in kindergarten to fourth grade (chronological ages 5 to 11) performed different memory tasks. Several strategies (i.e., sorting, clustering, rehearsal, and self-testing)…

  6. Accounting for trip frequency in importance-performance analysis

    Treesearch

    Joshua K. Gill; J.M. Bowker; John C. Bergstrom; Stanley J. Zarnoch

    2010-01-01

    Understanding customer satisfaction is critical to the successful operation of both privately and publicly managed recreation venues. A popular tool for assessing recreation visitor satisfaction is Importance- Performance Analysis (IPA). IPA provides resource managers, government officials, and private businesses with easy-to-understand and -use information about...

  7. Performance Analysis of GAME: A Generic Automated Marking Environment

    ERIC Educational Resources Information Center

    Blumenstein, Michael; Green, Steve; Fogelman, Shoshana; Nguyen, Ann; Muthukkumarasamy, Vallipuram

    2008-01-01

    This paper describes the Generic Automated Marking Environment (GAME) and provides a detailed analysis of its performance in assessing student programming projects and exercises. GAME has been designed to automatically assess programming assignments written in a variety of languages based on the "structure" of the source code and the correctness…

  8. Frontiers of Performance Analysis on Leadership-Class Systems

    SciTech Connect

    Fowler, R J; Adhianto, L; de Supinski, B R; Fagan, M; Gamblin, T; Krentel, M; Mellor-Crummey, J; Schulz, M; Tallent, N

    2009-06-15

    The number of cores employed in high-end systems for scientific computing is increasing rapidly. As a result, there is a pressing need for tools that can measure, model, and diagnose performance problems in highly parallel runs. We describe two tools that employ complementary approaches for analysis at scale, and we illustrate their use on DOE leadership-class systems.

  10. Performance on the Pharmacy College Admission Test: An Exploratory Analysis.

    ERIC Educational Resources Information Center

    Kawahara, Nancy E.; Ethington, Corinna

    1994-01-01

    Median polishing, an exploratory data statistical analysis technique, was used to study achievement patterns for men and women on the Pharmacy College Admission Test over a six-year period. In general, a declining trend in scores was found, and males performed better than females, with the largest differences found in chemistry and biology.…

  11. The Analysis of Athletic Performance: Some Practical and Philosophical Considerations

    ERIC Educational Resources Information Center

    Nelson, Lee J.; Groom, Ryan

    2012-01-01

    This article presents a hypothetical dialogue between a notational analyst (NA) recently schooled in the positivistic assessment of athletic performance, an "old-school" traditional coach (TC) who favours subjective analysis, and a pragmatic educator (PE). The conversation opens with NA and TC debating the respective value of quantitative and…

  12. Visuo-Spatial Performance in Autism: A Meta-Analysis

    ERIC Educational Resources Information Center

    Muth, Anne; Hönekopp, Johannes; Falter, Christine M.

    2014-01-01

    Visuo-spatial skills are believed to be enhanced in autism spectrum disorders (ASDs). This meta-analysis tests the current state of evidence for Figure Disembedding, Block Design, Mental Rotation and Navon tasks in ASD and neurotypicals. Block Design (d = 0.32) and Figure Disembedding (d = 0.26) showed superior performance for ASD with large…

  13. A Semiotic Reading and Discourse Analysis of Postmodern Street Performance

    ERIC Educational Resources Information Center

    Lee, Mimi Miyoung; Chung, Sheng Kuan

    2009-01-01

    Postmodern street art operates under a set of references that requires art educators and researchers to adopt alternative analytical frameworks in order to understand its meanings. In this article, we describe social semiotics, critical discourse analysis, and postmodern street performance as well as the relevance of the former two in interpreting…

  14. Cost and performance analysis of physical security systems

    SciTech Connect

    Hicks, M.J.; Yates, D.; Jago, W.H.

    1997-06-01

    CPA - Cost and Performance Analysis - is a prototype integration of existing PC-based cost and performance analysis tools: ACEIT (Automated Cost Estimating Integrated Tools) and ASSESS (Analytic System and Software for Evaluating Safeguards and Security). ACE is an existing DOD PC-based tool that supports cost analysis over the full life cycle of a system; that is, the cost to procure, operate, maintain and retire the system and all of its components. ASSESS is an existing DOE PC-based tool for analysis of performance of physical protection systems. Through CPA, the cost and performance data are collected into Excel workbooks, making the data readily available to analysts and decision makers in both tabular and graphical formats and at both the system and subsystem levels. The structure of the cost spreadsheets incorporates an activity-based approach to cost estimation. Activity-based costing (ABC) is an accounting philosophy used by industry to trace direct and indirect costs to the products or services of a business unit. By tracing costs through security sensors and procedures and then mapping the contributions of the various sensors and procedures to system effectiveness, the CPA architecture can provide security managers with information critical for both operational and strategic decisions. The architecture, features and applications of the CPA prototype are presented. 5 refs., 3 figs.
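
The activity-based idea of tracing costs to the subsystems they support can be sketched as a simple rollup; the data layout below is illustrative only, not the CPA/ACEIT schema, and costs of shared activities are split evenly across subsystems as a stand-in for a real cost driver.

```python
from collections import defaultdict

def rollup(activities):
    # Each activity carries a direct cost and a list of the security
    # subsystems it supports; allocate each cost across its subsystems
    # (even split here; a real ABC model would use cost drivers) and
    # return per-subsystem totals plus the system total.
    by_subsystem = defaultdict(float)
    for act in activities:
        share = act["cost"] / len(act["subsystems"])
        for s in act["subsystems"]:
            by_subsystem[s] += share
    return dict(by_subsystem), sum(a["cost"] for a in activities)
```

With subsystem-level performance numbers alongside these totals, a manager can see where money buys effectiveness, which is the pairing CPA is built around.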

  15. Human Schedule Performance, Protocol Analysis, and the "Silent Dog" Methodology

    ERIC Educational Resources Information Center

    Cabello, Francisco; Luciano, Carmen; Gomez, Inmaculada; Barnes-Holmes, Dermot

    2004-01-01

    The purpose of the current experiment was to investigate the role of private verbal behavior on the operant performances of human adults, using a protocol analysis procedure with additional methodological controls (the "silent dog" method). Twelve subjects were exposed to fixed ratio 8 and differential reinforcement of low rate 3-s schedules. For…

  16. Identification of human operator performance models utilizing time series analysis

    NASA Technical Reports Server (NTRS)

    Holden, F. M.; Shinners, S. M.

    1973-01-01

    The results of an effort performed by Sperry Systems Management Division for AMRL in applying time series analysis as a tool for modeling the human operator are presented. This technique is utilized for determining the variation of the human transfer function under various levels of stress. The human operator's model is determined based on actual input and output data from a tracking experiment.

  19. Storage element performance optimization for CMS analysis jobs

    NASA Astrophysics Data System (ADS)

    Behrmann, G.; Dahlblom, J.; Guldmyr, J.; Happonen, K.; Lindén, T.

    2012-12-01

    Tier-2 computing sites in the Worldwide LHC Computing Grid (WLCG) host CPU resources (Compute Element, CE) and storage resources (Storage Element, SE). The vast amount of data from the Large Hadron Collider (LHC) experiments that needs to be processed requires good and efficient use of the available resources. Achieving good CPU efficiency for end-user analysis jobs requires that the performance of the storage system scale with the I/O requests from hundreds or even thousands of simultaneous jobs. Here we report on work to improve SE performance at the Helsinki Institute of Physics (HIP) Tier-2 site used for the Compact Muon Solenoid (CMS) experiment at the LHC. Statistics from CMS grid jobs are collected and stored in the CMS Dashboard for further analysis, which allows easy performance monitoring by the sites and by the CMS collaboration. As part of the monitoring framework, CMS used the JobRobot, which sent 100 analysis jobs to each site every four hours; the HammerCloud tool, also used for site monitoring and stress testing, has since replaced it. The performance of the analysis workflow submitted with JobRobot or HammerCloud can be used to track the effect of site configuration changes, since the workflow is kept the same for all sites and for months at a time. The CPU efficiency of the JobRobot jobs at HIP was increased by approximately 50% to more than 90% by tuning the SE and through improvements in the CMSSW and dCache software. The performance of CMS analysis jobs improved significantly too. Similar work has been done at other CMS Tier sites, and on average the CPU efficiency of CMSSW jobs increased during 2011. Better monitoring of the SE allows faster detection of problems, so that the performance level can be kept high. 
    The next storage upgrade at HIP consists of SAS disk enclosures which can be stress tested on demand with HammerCloud workflows, to make sure that the I/O-performance
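
The CPU-efficiency metric referred to above is simply consumed CPU time over wall-clock time; I/O stalls against the storage element show up as a low value. A small sketch (our own illustration, not CMS Dashboard code):

```python
def cpu_efficiency_pct(cpu_time_s, wall_time_s):
    # Grid-style CPU efficiency for one job: CPU seconds consumed divided
    # by elapsed wall-clock seconds, as a percentage.
    return 100.0 * cpu_time_s / wall_time_s

def mean_efficiency_pct(jobs):
    # jobs: iterable of (cpu_time_s, wall_time_s) pairs. Aggregate as the
    # ratio of totals so that long jobs weigh more, the usual convention
    # in accounting dashboards.
    tot_cpu = sum(c for c, _ in jobs)
    tot_wall = sum(w for _, w in jobs)
    return 100.0 * tot_cpu / tot_wall
```

A job that spends a quarter of its wall time waiting on storage reads shows 75% efficiency even if its computation is perfectly tuned, which is why SE tuning moves this number.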

  20. Modeling and performance analysis of GPS vector tracking algorithms

    NASA Astrophysics Data System (ADS)

    Lashley, Matthew

    This dissertation provides a detailed analysis of GPS vector tracking algorithms and the advantages they have over traditional receiver architectures. Standard GPS receivers use a decentralized architecture that separates the tasks of signal tracking and position/velocity estimation. Vector tracking algorithms combine the two tasks into a single algorithm. The signals from the various satellites are processed collectively through a Kalman filter. The advantages of vector tracking over traditional, scalar tracking methods are thoroughly investigated. A method for making a valid comparison between vector and scalar tracking loops is developed. This technique avoids the ambiguities encountered when attempting to make a valid comparison between tracking loops (which are characterized by noise bandwidths and loop order) and the Kalman filters (which are characterized by process and measurement noise covariance matrices) that are used by vector tracking algorithms. The improvement in performance offered by vector tracking is calculated in multiple different scenarios. Rule of thumb analysis techniques for scalar Frequency Lock Loops (FLL) are extended to the vector tracking case. The analysis tools provide a simple method for analyzing the performance of vector tracking loops. The analysis tools are verified using Monte Carlo simulations. Monte Carlo simulations are also used to study the effects of carrier to noise power density (C/N0) ratio estimation and the advantage offered by vector tracking over scalar tracking. The improvement from vector tracking ranges from 2.4 to 6.2 dB in various scenarios. The difference in the performance of the three vector tracking architectures is analyzed. The effects of using a federated architecture with and without information sharing between the receiver's channels are studied. A combination of covariance analysis and Monte Carlo simulation is used to analyze the performance of the three algorithms. 
The federated algorithm without

  1. Performance Analysis of HF Band FB-MC-SS

    SciTech Connect

    Hussein Moradi; Stephen Andrew Laraway; Behrouz Farhang-Boroujeny

    2016-01-01

    In a recent paper [1], the filter bank multicarrier spread spectrum (FB-MC-SS) waveform was proposed for wideband spread-spectrum HF communications. A significant benefit of this waveform is its robustness against narrowband and partial-band interference. Simulation results in [1] demonstrated good performance in a wideband HF channel over a wide range of conditions. In this paper we present a theoretical analysis of the bit error probability of this system. Our analysis adapts the results of [2], where BER performance was analyzed for maximum ratio combining systems accounting for correlation between subcarriers and channel estimation error. Equations are given for the BER that closely match the simulated performance in most situations.
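
For reference, the textbook closed form this kind of analysis builds on is the BER of BPSK with L-branch maximal-ratio combining over i.i.d. Rayleigh fading. The paper's analysis additionally accounts for subcarrier correlation and channel-estimation error, which this sketch omits.

```python
import math

def ber_bpsk_mrc_rayleigh(ebn0_linear, branches):
    # Classic closed form (e.g. Proakis) for BPSK with L-branch
    # maximal-ratio combining over independent Rayleigh-faded branches,
    # each with mean Eb/N0 = ebn0_linear:
    #   p  = 0.5 * (1 - sqrt(g / (1 + g)))
    #   Pb = p^L * sum_{k=0}^{L-1} C(L-1+k, k) * (1-p)^k
    g = ebn0_linear
    p = 0.5 * (1.0 - math.sqrt(g / (1.0 + g)))
    return (p ** branches) * sum(
        math.comb(branches - 1 + k, k) * ((1.0 - p) ** k)
        for k in range(branches)
    )
```

With one branch this reduces to the flat-Rayleigh BPSK result, and each added diversity branch steepens the BER curve, which is the combining gain the FB-MC-SS receiver exploits across subcarriers.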

  2. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two part model framework characterizes both the demand using a probability distribution for each type of service request as well as enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
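
A two-part demand/resource model of this kind can be sketched in a few lines of discrete-event code. The toy simulator below is our own illustration, not the authors' model: exponential interarrivals and service times, FIFO dispatch to the earliest-free server, returning mean response time.

```python
import heapq
import random

def simulate(arrival_rate, service_rate, n_servers, n_jobs, seed=1):
    # Minimal discrete-event simulation: jobs arrive with exponential
    # interarrival times (the demand distribution), each needs an
    # exponential service time, and n_servers identical servers model the
    # resource constraint. Returns mean response (queue + service) time.
    rng = random.Random(seed)
    t = 0.0
    free_at = [0.0] * n_servers       # earliest time each server is free
    heapq.heapify(free_at)
    total_resp = 0.0
    for _ in range(n_jobs):
        t += rng.expovariate(arrival_rate)       # next arrival
        start = max(t, heapq.heappop(free_at))   # wait for a free server
        finish = start + rng.expovariate(service_rate)
        heapq.heappush(free_at, finish)
        total_resp += finish - t
    return total_resp / n_jobs
```

Sweeping `n_servers` against a fixed demand stream is the essence of the provisioning question: response time in a dedicated (small, static) pool versus a cloud-style (larger, shared) pool.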

  3. Relative performance of academic departments using DEA with sensitivity analysis.

    PubMed

    Tyagi, Preeti; Yadav, Shiv Prasad; Singh, S P

    2009-05-01

    The process of liberalization and globalization of the Indian economy has brought new opportunities and challenges in all areas of human endeavor, including education. Educational institutions have to adopt new strategies to make the best use of the opportunities and counter the challenges. One of these challenges is how to assess the performance of academic programs based on multiple criteria. Keeping this in view, this paper evaluates the performance efficiencies of 19 academic departments of IIT Roorkee (India) through the data envelopment analysis (DEA) technique. The technique has been used to assess the performance of academic institutions in a number of countries, such as the USA, UK, and Australia, but to the best of our knowledge this is its first application in the Indian context. Applying DEA models, we calculate technical, pure technical, and scale efficiencies and identify the reference sets for inefficient departments. Input and output projections are also suggested for inefficient departments to reach the frontier. Overall performance, research performance, and teaching performance are assessed separately using sensitivity analysis.
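
In the special case of one input and one output, CCR (constant-returns-to-scale) DEA efficiency reduces to each unit's output/input ratio against the best performer, which conveys the idea; the paper's multi-input, multi-output setting instead requires solving a linear program per department.

```python
def ccr_efficiency(inputs, outputs):
    # Single-input, single-output CCR efficiency: each decision-making
    # unit's output/input ratio, normalized by the best ratio observed.
    # Units scoring 1.0 lie on the efficient frontier; the others' scores
    # show how far they fall short of it.
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]
```

For departments with multiple inputs (staff, budget) and outputs (publications, graduates), the LP picks the weights most favorable to each department, so a score below 1.0 means no weighting can place it on the frontier.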

  4. Performance demonstration program plan for analysis of simulated headspace gases

    SciTech Connect

    1995-06-01

    The Performance Demonstration Program (PDP) for analysis of headspace gases will consist of regular distribution and analyses of test standards to evaluate the capability for analyzing VOCs, hydrogen, and methane in the headspace of transuranic (TRU) waste throughout the Department of Energy (DOE) complex. Each distribution is termed a PDP cycle. These evaluation cycles will provide an objective measure of the reliability of measurements performed for TRU waste characterization. Laboratory performance will be demonstrated by the successful analysis of blind audit samples of simulated TRU waste drum headspace gases according to the criteria set within the text of this Program Plan. Blind audit samples (hereinafter referred to as PDP samples) will be used as an independent means to assess laboratory performance regarding compliance with the QAPP QAOs. The concentration of analytes in the PDP samples will encompass the range of concentrations anticipated in actual waste characterization gas samples. Analyses which are required by the WIPP to demonstrate compliance with various regulatory requirements and which are included in the PDP must be performed by laboratories which have demonstrated acceptable performance in the PDP.
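
The scoring logic for a blind audit cycle can be sketched as percent recovery of each analyte against acceptance limits; the 70-130% window below is a placeholder for illustration, not the Program Plan's actual criterion.

```python
def evaluate_pdp(reported_ppmv, certified_ppmv, low=0.70, high=1.30):
    # Compare a laboratory's reported concentrations for a blind PDP
    # sample against the certified values. Returns, per analyte, the
    # fractional recovery and whether it falls inside the (hypothetical)
    # acceptance window.
    results = {}
    for analyte, certified in certified_ppmv.items():
        recovery = reported_ppmv[analyte] / certified
        results[analyte] = (recovery, low <= recovery <= high)
    return results
```

A laboratory would need every analyte in a cycle to score inside the window (per whatever criteria the Program Plan actually sets) before its waste-characterization analyses count toward compliance.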

  5. Analysis performance of proton exchange membrane fuel cell (PEMFC)

    NASA Astrophysics Data System (ADS)

    Mubin, A. N. A.; Bahrom, M. H.; Azri, M.; Ibrahim, Z.; Rahim, N. A.; Raihan, S. R. S.

    2017-06-01

    Recently, the proton exchange membrane fuel cell (PEMFC) has gained much attention as a renewable energy technology because it is a mechanically simple, zero-emission power source. PEMFC performance depends on surrounding conditions such as temperature and pressure. This paper presents an analysis of PEMFC performance through a mathematical thermodynamic model developed in Matlab/Simulink. The differential equations of the thermodynamic model are used to explain the contribution of heat to the PEMFC's output voltage, and the partial-pressure equation for hydrogen is included in the model to study the voltage behaviour with respect to the input hydrogen pressure. The efficiency of the model, calculated by applying energy-conversion-device equations to the thermal efficiency, is 33.8%. The PEMFC's output voltage increases with increasing hydrogen input pressure and temperature.
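
The pressure and temperature dependence described above is visible in the widely used empirical Nernst expression for a PEM cell (Amphlett-style coefficients). A sketch, assuming pressures in atm and temperature in kelvin; this is the standard open-circuit term, not the paper's full Simulink model.

```python
import math

def nernst_voltage(t_kelvin, p_h2_atm, p_o2_atm):
    # Empirical Nernst potential for a single PEM cell (Amphlett et al.):
    # rises with the hydrogen/oxygen partial-pressure term, falls slowly
    # with temperature through the linear term.
    return (1.229
            - 0.85e-3 * (t_kelvin - 298.15)
            + 4.3085e-5 * t_kelvin
              * math.log(p_h2_atm * math.sqrt(p_o2_atm)))
```

The actual cell voltage subtracts activation, ohmic, and concentration losses from this value, but the logarithmic pressure term already shows why raising the hydrogen inlet pressure lifts the output voltage.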

  6. Mission analysis and performance specification studies report, appendix A

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The Near Term Hybrid Passenger Vehicle Development Program tasks included defining missions, developing distributions of daily travel and composite driving cycles for these missions, providing information necessary to estimate the potential replacement of the existing fleet by hybrids, and estimating acceleration/gradeability performance requirements for safe operation. The data was then utilized to develop mission specifications, define reference vehicles, develop hybrid vehicle performance specifications, and make fuel consumption estimates for the vehicles. The major assumptions which underlie the approach taken to the mission analysis and development of performance specifications are the following: the daily operating range of a hybrid vehicle should not be limited by the stored energy capacity and the performance of such a vehicle should not be strongly dependent on the battery state of charge.

  7. Intelligent Performance Analysis with a Natural Language Interface

    NASA Astrophysics Data System (ADS)

    Juuso, Esko K.

    2017-09-01

    Performance improvement is taken as the primary goal in the asset management. Advanced data analysis is needed to efficiently integrate condition monitoring data into the operation and maintenance. Intelligent stress and condition indices have been developed for control and condition monitoring by combining generalized norms with efficient nonlinear scaling. These nonlinear scaling methodologies can also be used to handle performance measures used for management since management oriented indicators can be presented in the same scale as intelligent condition and stress indices. Performance indicators are responses of the process, machine or system to the stress contributions analyzed from process and condition monitoring data. Scaled values are directly used in intelligent temporal analysis to calculate fluctuations and trends. All these methodologies can be used in prognostics and fatigue prediction. The meanings of the variables are beneficial in extracting expert knowledge and representing information in natural language. The idea of dividing the problems into the variable specific meanings and the directions of interactions provides various improvements for performance monitoring and decision making. The integrated temporal analysis and uncertainty processing facilitates the efficient use of domain expertise. Measurements can be monitored with generalized statistical process control (GSPC) based on the same scaling functions.
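
A generalized norm is a power mean of the signal magnitudes, and the nonlinear scaling maps it onto a common range such as [-2, 2]. The piecewise-linear scaler below is a simplified stand-in for the published monotone scaling functions, included only to show the mechanics.

```python
def generalized_norm(values, p):
    # Generalized (power) mean of magnitudes: (mean(|x|^p))^(1/p).
    # p = 1 gives the mean absolute value; large p approaches the maximum,
    # emphasizing peaks in a stress signal.
    n = len(values)
    return (sum(abs(v) ** p for v in values) / n) ** (1.0 / p)

def scale_index(x, lo, center, hi):
    # Piecewise-linear sketch of nonlinear scaling to [-2, 2]: 'lo' maps
    # to -2, 'center' (normal operation) to 0, 'hi' to +2. The published
    # method fits smoother monotone functions from data instead.
    if x <= center:
        return -2.0 + 2.0 * (x - lo) / (center - lo)
    return 2.0 * (x - center) / (hi - center)
```

Once condition, stress, and management indicators all live on the same [-2, 2] scale, fluctuation and trend analysis can treat them uniformly, which is the integration point the abstract describes.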

  8. Integrated microfluidic systems for high-performance genetic analysis.

    PubMed

    Liu, Peng; Mathies, Richard A

    2009-10-01

    Driven by the ambitious goals of genome-related research, fully integrated microfluidic systems have developed rapidly to advance biomolecular and, in particular, genetic analysis. To produce a microsystem with high performance, several key elements must be strategically chosen, including device materials, temperature control, microfluidic control, and sample/product transport integration. We review several significant examples of microfluidic integration in DNA sequencing, gene expression analysis, pathogen detection, and forensic short tandem repeat typing. The advantages of high speed, increased sensitivity, and enhanced reliability enable these integrated microsystems to address bioanalytical challenges such as single-copy DNA sequencing, single-cell gene expression analysis, pathogen detection, and forensic identification of humans in formats that enable large-scale and point-of-analysis applications.

  9. Thermodynamic performance analysis of ramjet engine at wide working conditions

    NASA Astrophysics Data System (ADS)

    Ou, Min; Yan, Li; Tang, Jing-feng; Huang, Wei; Chen, Xiao-qian

    2017-03-01

    Although the ramjet has the advantages of high-speed flight and high specific impulse, its performance parameters decline seriously with increasing flight Mach number and flight height. The investigation of the thermodynamic performance of the ramjet is therefore crucial for broadening its working range. In the current study, a typical ramjet model has been employed to investigate the performance characteristics at wide working conditions. First of all, the compression characteristic analysis is carried out based on the Brayton cycle. The obtained results show that the specific cross-sectional areas (A2 and A5) and the air-fuel ratio (f) have a great influence on the ramjet performance indexes. Secondly, the thermodynamic calculation process of the ramjet is given from the view of aerothermal analysis. Then, the variation trends of the ramjet performance indexes with the flow conditions, the air-fuel ratio (f), and the specific cross-sectional areas (A2 and A5) under the fixed operating condition, the equal dynamic pressure condition, and the variable dynamic pressure condition are discussed. Finally, the optimum values of the specific cross-sectional area (A5) and the air-fuel ratio (f) of the ramjet model at a fixed operating condition (Ma=3.5, H=12 km) are obtained.
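    The Brayton-cycle starting point of such an analysis can be illustrated with the classic ideal-ramjet relation for specific thrust. This is a generic textbook sketch under ideal-cycle assumptions (calorically perfect gas, lossless components, an assumed burner stagnation temperature Tt4), not the model used in the paper:

```python
import math

GAMMA = 1.4   # ratio of specific heats for air
R = 287.0     # gas constant for air, J/(kg K)

def ramjet_specific_thrust(M0, T0, Tt4):
    """Ideal-ramjet specific thrust F/mdot [N per kg/s of air].
    Classic result: F/mdot = a0 * M0 * (sqrt(tau_lambda / tau_r) - 1)."""
    a0 = math.sqrt(GAMMA * R * T0)                 # ambient speed of sound
    tau_r = 1.0 + 0.5 * (GAMMA - 1.0) * M0 ** 2    # ram temperature ratio
    tau_lam = Tt4 / T0                             # burner stagnation ratio
    return a0 * M0 * (math.sqrt(tau_lam / tau_r) - 1.0)

# At the paper's design point Ma = 3.5, H = 12 km (T0 about 216.65 K),
# with an assumed burner limit Tt4 = 2000 K:
F35 = ramjet_specific_thrust(3.5, 216.65, 2000.0)
F60 = ramjet_specific_thrust(6.0, 216.65, 2000.0)
```

    Because tau_r grows with Mach number while tau_lambda is capped by the burner limit, specific thrust collapses at high Mach, consistent with the performance decline at wide working conditions noted above.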

  10. Safety and performance analysis of a commercial photovoltaic installation

    NASA Astrophysics Data System (ADS)

    Hamzavy, Babak T.; Bradley, Alexander Z.

    2013-09-01

    Continuing to better understand the performance of PV systems, and how that performance changes over the system's life, is vital to the sustainable growth of solar. A systematic understanding of degradation mechanisms that are induced as a result of variables such as the service environment, installation, module/material design, weather, operation and maintenance, and manufacturing is required for reliable operation throughout a system's lifetime. We wish to report the results from an analysis of a commercial c-Si PV array owned and operated by DuPont. We assessed the electrical performance of the modules by comparing the original manufacturers' performance data with measurements obtained using a solar simulator to determine the degradation rate. This evaluation provides valuable PV system field experience and documents key issues regarding safety and performance. A review of the nondestructive and destructive analytical methods and characterization strategies we have found useful for system, module, and subsequent material component evaluations is presented. We provide an overview of our inspection protocol and subsequent control process to mitigate risk. The objective is to explore and develop best-practice protocols regarding PV asset optimization and to provide a rationale for reducing risk based on the analysis of our own commercial installations.

  11. Performance requirements analysis for payload delivery from a space station

    NASA Technical Reports Server (NTRS)

    Friedlander, A. L.; Soldner, J. K.; Bell, J. (Editor); Ricks, G. W.; Kincade, R. E.; Deatkins, D.; Reynolds, R.; Nader, B. A.; Hill, O.; Babb, G. R.

    1983-01-01

    Operations conducted from a space station in low Earth orbit, which are subject to different constraints and opportunities than those conducted by direct Earth launch, were examined. While a space station relieves many size and performance constraints on the space shuttle, the space station's inertial orbit imposes launch window constraints different from those associated with customary Earth launches, which reflect upon upper-stage capability. A performance requirements analysis was developed to provide a reference source of parametric data, specific case solutions, and upper-stage sizing trades to help potential space station users and space station and upper-stage developers assess the impacts of a space station on missions of interest.

  12. Aerodynamic Analysis of Cup Anemometers Performance: The Stationary Harmonic Response

    PubMed Central

    Pindado, Santiago; Cubas, Javier; Sanz-Andrés, Ángel

    2013-01-01

    The effect of cup anemometer shape parameters, such as the cups' shape, their size, and their center rotation radius, was experimentally analyzed. This analysis was based on both the calibration constants of the transfer function and the most important harmonic term of the rotor's movement, which due to the cup anemometer design is the third one. This harmonic analysis represents a new approach to studying cup anemometer performance. The results clearly showed a good correlation between the average rotational speed of the anemometer's rotor and the mentioned third harmonic term of its movement. PMID:24381512
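    The role of the third harmonic can be illustrated with a toy calculation: simulate the rotation rate over one revolution with a 3-per-revolution ripple (as three cups produce) and recover the dominant harmonic with a plain DFT. All amplitudes below are invented:

```python
import cmath
import math

def dominant_harmonic(samples):
    """Return the harmonic number of the largest nonzero DFT bin."""
    n = len(samples)
    mean = sum(samples) / n
    centered = [s - mean for s in samples]   # drop the DC component
    mags = []
    for k in range(1, n // 2):
        coef = sum(centered[j] * cmath.exp(-2j * math.pi * k * j / n)
                   for j in range(n))
        mags.append((abs(coef), k))
    return max(mags)[1]

# Rotor speed over one rotation: mean rate plus a 3-per-rev ripple
N = 64
omega = [10.0 + 0.4 * math.cos(3 * 2 * math.pi * j / N) for j in range(N)]
```

    Running `dominant_harmonic(omega)` picks out the third harmonic, mirroring the term the authors correlate with average rotational speed.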

  13. INL FY2014 1st Quarterly Performance Analysis

    SciTech Connect

    Kinghorn, Loran

    2014-07-01

    This report is published quarterly by the Idaho National Laboratory (INL) Performance Assurance Organization. The Department of Energy Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous 12 months. This report is the analysis of 76 occurrence reports and over 16 other deficiency reports (including not reportable events) identified at the INL during the period of October 2013 through December 2013. Battelle Energy Alliance (BEA) operates the INL under contract DE-AC07-05ID14517.

  14. Aerodynamic analysis of cup anemometers performance: the stationary harmonic response.

    PubMed

    Pindado, Santiago; Cubas, Javier; Sanz-Andrés, Angel

    2013-01-01

    The effect of cup anemometer shape parameters, such as the cups' shape, their size, and their center rotation radius, was experimentally analyzed. This analysis was based on both the calibration constants of the transfer function and the most important harmonic term of the rotor's movement, which due to the cup anemometer design is the third one. This harmonic analysis represents a new approach to studying cup anemometer performance. The results clearly showed a good correlation between the average rotational speed of the anemometer's rotor and the mentioned third harmonic term of its movement.

  15. A Study of ATLAS Grid Performance for Distributed Analysis

    NASA Astrophysics Data System (ADS)

    Panitkin, Sergey; Fine, Valery; Wenaus, Torre

    2012-12-01

    In the past two years the ATLAS Collaboration at the LHC has collected a large volume of data and published a number of groundbreaking papers. The Grid-based ATLAS distributed computing infrastructure played a crucial role in enabling timely analysis of the data. We present a study of the performance and usage of the ATLAS Grid as a platform for physics analysis in 2011. This includes studies of general properties as well as timing properties of user jobs (wait time, run time, etc.). These studies are based on mining of data archived by the PanDA workload management system.

  16. RTOD- RADIAL TURBINE OFF-DESIGN PERFORMANCE ANALYSIS

    NASA Technical Reports Server (NTRS)

    Glassman, A. J.

    1994-01-01

    The RTOD program was developed to accurately predict radial turbine off-design performance. The radial turbine has been used extensively in automotive turbochargers and aircraft auxiliary power units. It is now being given serious consideration for primary powerplant applications. In applications where the turbine will operate over a wide range of power settings, accurate off-design performance prediction is essential for a successful design. RTOD predictions have already illustrated a potential improvement in off-design performance offered by rotor back-sweep for high-work-factor radial turbines. RTOD can be used to analyze other potential performance-enhancing design features. RTOD predicts the performance of a radial turbine (with or without rotor blade sweep) as a function of pressure ratio, speed, and stator setting. The program models the flow with the following: 1) stator viscous and trailing edge losses; 2) a vaneless space loss between the stator and the rotor; and 3) rotor incidence, viscous, trailing-edge, clearance, and disk friction losses. The stator and rotor viscous losses each represent the combined effects of profile, endwall, and secondary flow losses. The stator inlet and exit and the rotor inlet flows are modeled by a mean-line analysis, but a sector analysis is used at the rotor exit. The leakage flow through the clearance gap in a pivoting stator is also considered. User input includes gas properties, turbine geometry, and the stator and rotor viscous losses at a reference performance point. RTOD output includes predicted turbine performance over a specified operating range and any user-selected flow parameters. The RTOD program is written in FORTRAN IV for batch execution and has been implemented on an IBM 370 series computer with a central memory requirement of approximately 100K 8-bit bytes. The RTOD program was developed in 1983.

  17. Microfabricated devices for performing chemical and biochemical analysis

    SciTech Connect

    Ramsey, J.M.; Jacobson, S.C.; Foote, R.S.

    1997-05-01

    There is growing interest in microfabricated devices that perform chemical and biochemical analysis. The general goal is to use microfabrication tools to construct miniature devices that can perform a complete analysis starting with an unprocessed sample. Such devices have been referred to as lab-on-a-chip devices. Initial efforts on microfluidic laboratory-on-a-chip devices focused on chemical separations. There are many potential applications of these fluidic microchip devices. Some applications such as chemical process control or environmental monitoring would require that a chip be used over an extended period of time or for many analyses. Other applications such as forensics, clinical diagnostics, and genetic diagnostics would employ the chip devices as single use disposable devices.

  18. Modeling and performance analysis of QoS data

    NASA Astrophysics Data System (ADS)

    Strzeciwilk, Dariusz; Zuberek, Włodzimierz M.

    2016-09-01

    The article presents the results of modeling and analysis of data transmission performance in systems that support quality of service. Models are designed and tested taking into account a multiservice network architecture, i.e., one supporting the transmission of data belonging to different traffic classes. Traffic-shaping mechanisms based on Priority Queuing were studied, both with an integrated data source and with several distinct generated data sources. The basic problems of QoS-supporting architectures and of queuing systems are discussed. Models based on Petri nets, supported by temporal logics, were designed and built. Simulation tools were used to verify the traffic-shaping mechanisms with the applied queuing algorithms. It is shown that temporal Petri net models can be effectively used in the modeling and analysis of computer network performance.
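    The Priority Queuing discipline underlying the models can be illustrated with a minimal event-driven sketch (plain Python, not the Petri-net formulation of the paper; packet names and timings are invented):

```python
import heapq

def simulate_pq(arrivals, service_time):
    """Strict-priority, single-server, non-preemptive queue.
    arrivals: list of (arrival_time, priority, packet_id); a lower
    priority value is more important. Returns packet_id -> finish time."""
    arrivals = sorted(arrivals)
    ready, finish, t, i = [], {}, 0.0, 0
    while i < len(arrivals) or ready:
        if not ready and arrivals[i][0] > t:
            t = arrivals[i][0]               # server idles until next arrival
        while i < len(arrivals) and arrivals[i][0] <= t:
            at, prio, pid = arrivals[i]
            heapq.heappush(ready, (prio, at, pid))
            i += 1
        prio, at, pid = heapq.heappop(ready)  # highest-priority waiting packet
        t += service_time
        finish[pid] = t
    return finish

# Two classes arriving back-to-back: the high-priority packet (prio 0)
# overtakes the earlier-queued low-priority one.
done = simulate_pq([(0.0, 1, "lo1"), (0.1, 1, "lo2"), (0.2, 0, "hi")], 1.0)
```

    In the example, the high-priority packet arrives last but is served before the waiting low-priority packet, which is exactly the reordering a traffic-shaping analysis must account for.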

  19. Identifying influential factors of business process performance using dependency analysis

    NASA Astrophysics Data System (ADS)

    Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank

    2011-02-01

    We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures that represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain a deeper understanding of why certain KPI targets are not met.
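    The dependency-tree idea can be illustrated with a single decision-stump split: pick the process metric and threshold that best separate KPI-met from KPI-violated process instances. This is a sketch in the spirit of the framework, not the authors' algorithm; the metrics and labels are invented:

```python
def best_split(metrics, kpi_ok):
    """Pick the (metric index, threshold) whose split best separates
    KPI-met (1) from KPI-violated (0) cases, by weighted Gini impurity."""
    def gini(labels):
        if not labels:
            return 0.0
        p = sum(labels) / len(labels)
        return 2.0 * p * (1.0 - p)
    n = len(kpi_ok)
    best = (None, None, float("inf"))
    for j in range(len(metrics[0])):
        for t in sorted({row[j] for row in metrics}):
            left = [kpi_ok[i] for i in range(n) if metrics[i][j] <= t]
            right = [kpi_ok[i] for i in range(n) if metrics[i][j] > t]
            score = (len(left) * gini(left) + len(right) * gini(right)) / n
            if score < best[2]:
                best = (j, t, score)
    return best[0], best[1]

# Hypothetical instances: KPI fails exactly when the service response
# time (metric index 1, in ms) exceeds 300 ms.
rows = [[5, 120], [4, 450], [6, 280], [5, 510], [4, 300]]
ok = [1, 0, 1, 0, 1]
j, t = best_split(rows, ok)
```

    Recursively applying such splits yields the dependency trees the business analyst drills into.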

  20. Performance analysis of a generalized upset detection procedure

    NASA Technical Reports Server (NTRS)

    Blough, Douglas M.; Masson, Gerald M.

    1987-01-01

    A general procedure for upset detection in complex systems, called the data block capture and analysis upset monitoring process, is described and analyzed. The process consists of repeatedly recording a fixed amount of data from a set of predetermined observation lines of the system being monitored (i.e., capturing a block of data), and then analyzing the captured block in an attempt to determine whether the system is functioning correctly. The algorithm which analyzes the data blocks can be characterized in terms of the amount of time it requires to examine a data block of given length to ascertain the existence of features/conditions that have been predetermined to characterize the upset-free behavior of the system. The performance of linear, quadratic, and logarithmic data analysis algorithms is rigorously characterized in terms of three performance measures: (1) the probability of correctly detecting an upset; (2) the expected number of false alarms; and (3) the expected latency in detecting upsets.
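    Under a simplifying independence assumption across captured blocks (which the rigorous analysis in the paper does not require), the three performance measures reduce to elementary formulas:

```python
def detection_probability(p_block, m_blocks):
    """Probability an upset is caught within m analyzed blocks, given
    independent per-block detection probability p_block."""
    return 1.0 - (1.0 - p_block) ** m_blocks

def expected_false_alarms(p_false, m_blocks):
    """Expected number of false alarms over m upset-free blocks."""
    return p_false * m_blocks

def expected_latency_blocks(p_block):
    """Expected number of blocks analyzed before detection
    (mean of a geometric distribution)."""
    return 1.0 / p_block
```

    These toy formulas show the trade-off the paper quantifies: lowering the detection threshold raises p_block (shorter latency) but raises p_false (more false alarms).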

  1. Performance analysis of solar powered absorption refrigeration system

    NASA Astrophysics Data System (ADS)

    Abu-Ein, Suleiman Qaseem; Fayyad, Sayel M.; Momani, Waleed; Al-Bousoul, Mamdouh

    2009-12-01

    The present work provides a detailed thermodynamic analysis of a 10 kW solar absorption refrigeration system using ammonia-water mixtures as the working medium. This analysis includes both the first and second laws of thermodynamics. The coefficient of performance (COP), exergetic coefficient of performance (ECOP), and the exergy losses (ΔE) through each component of the system at different operating conditions are obtained. The minimum and maximum values of COP and ECOP were found to occur at generator temperatures of 110 and 200°C, respectively. About 40% of the system exergy losses were found to be in the generator. The maximum exergy losses in the absorber occur at a generator temperature of 130°C for all evaporator temperatures. A computer simulation model is developed to carry out the calculations and to obtain the results of the present study.
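    The first- and second-law indices follow standard definitions, sketched below; the loads and temperatures in the example are illustrative, not the paper's operating points:

```python
def cop(q_evap, q_gen, w_pump=0.0):
    """First-law coefficient of performance of an absorption chiller:
    cooling delivered per unit heat (plus pump work) supplied."""
    return q_evap / (q_gen + w_pump)

def ecop(q_evap, q_gen, t_evap, t_gen, t0):
    """Exergetic COP: exergy of the cooling effect over the exergy of the
    heat supplied to the generator. Temperatures in kelvin; t0 is the
    dead-state (ambient) temperature."""
    ex_cooling = q_evap * (t0 / t_evap - 1.0)
    ex_input = q_gen * (1.0 - t0 / t_gen)
    return ex_cooling / ex_input

# Illustrative numbers: 10 kW of cooling driven by 14 kW of generator heat,
# evaporator at 278 K, generator at 383 K (110 C), ambient at 298 K.
first_law = cop(10.0, 14.0)
second_law = ecop(10.0, 14.0, 278.0, 383.0, 298.0)
```

    The gap between the two indices is the point of the second-law analysis: a COP near 0.7 can coexist with an ECOP near 0.2 because low-temperature cooling carries little exergy.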

  2. High Performance Descriptive Semantic Analysis of Semantic Graph Databases

    SciTech Connect

    Joslyn, Cliff A.; Adolf, Robert D.; al-Saffar, Sinan; Feo, John T.; Haglin, David J.; Mackey, Greg E.; Mizell, David W.

    2011-06-02

    As semantic graph database technology grows to address components ranging from extant large triple stores to SPARQL endpoints over SQL-structured relational databases, it will become increasingly important to be able to understand their inherent semantic structure, whether codified in explicit ontologies or not. Our group is researching novel methods for what we call descriptive semantic analysis of RDF triplestores, to serve purposes of analysis, interpretation, visualization, and optimization. But data size and computational complexity make it increasingly necessary to bring high performance computational resources to bear on this task. Our research group built a novel high performance hybrid system comprising computational capability for semantic graph database processing utilizing the large multi-threaded architecture of the Cray XMT platform, conventional servers, and large data stores. In this paper we describe that architecture and our methods, and present the results of our analyses of basic properties, connected components, namespace interaction, and typed paths for the Billion Triple Challenge 2010 dataset.

  3. Analysis and performance of flat-plate solar collector arrays

    SciTech Connect

    Wang, X.A.; Wu, L.G.

    1990-01-01

    A new discrete numerical model is proposed to calculate the flow and temperature distribution in solar collector arrays. The flow nonuniformity, the longitudinal heat conduction, and the buoyancy effect are all taken into account in the analysis. The numerical results for the pressure and temperature distribution are found to be in agreement with the experimental results. It is found that flow nonuniformity has a detrimental effect on the thermal performance of the collector array.

  4. Using Linguistic Analysis to Identify High Performing Teams

    DTIC Science & Technology

    2006-06-01

    linguistic analysis (specifically the Linguistic Inquiry and Word Count, LIWC) in identifying potential high performing teams. In a series of studies...usefulness of one technological tool, the Linguistic Inquiry Word Count (LIWC; Pennebaker, Francis, & Booth, 2001), in identifying productive groups. The...LIWC analyzes text on a word-by-word basis, categorizes each word using 72 linguistic dimensions (e.g., pronoun, present tense, cognitive process), and

  5. Detection of Wind Turbine Power Performance Abnormalities Using Eigenvalue Analysis

    DTIC Science & Technology

    2014-12-23

    Detection of Wind Turbine Power Performance Abnormalities Using Eigenvalue Analysis Georgios Alexandros Skrimpas1, Christian Walsted Sweeney2, Kun S...University of Denmark, Lyngby, 2800, Denmark nm@elektro.dtu.dk jh@elektro.dtu.dk ABSTRACT Condition monitoring of wind turbines is a field of continuous research and development as new turbine configurations enter into the market and new failure modes appear. Systems utilising well established

  6. Design and Performance Analysis of a Digital Acoustic Telemetry System

    DTIC Science & Technology

    1988-05-01

    Charles D. Hollister Dean of Graduate Students DESIGN AND PERFORMANCE ANALYSIS OF A DIGITAL ACOUSTIC TELEMETRY SYSTEM by Josko A. Catipovic B. S...light of available digital hardware. This section is not intended as a detailed design guide, but gives a clear indication that the proposed and...ocean acoustic channels and confirm the theoretical predictions for system behavior, a complete simulation of the proposed system was implemented on a

  7. Performance of multifractal detrended fluctuation analysis on short time series

    NASA Astrophysics Data System (ADS)

    López, Juan Luis; Contreras, Jesús Guillermo

    2013-02-01

    The performance of multifractal detrended fluctuation analysis on short time series is evaluated for synthetic samples of several mono- and multifractal models. The reconstruction of the generalized Hurst exponents is used to determine the range of applicability of the method and the precision of its results as a function of the decreasing length of the series. As an application, the series of the daily exchange rate between the U.S. dollar and the euro is studied.
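    A minimal pure-Python MFDFA with first-order (linear) detrending shows the mechanics: build the profile, compute segment-wise detrended fluctuations, and read h(q) off the log-log slope. The scales and series length here are illustrative; for uncorrelated noise h(2) should come out near 0.5:

```python
import math
import random

def _linfit_residual_var(y):
    """Mean squared residual after a least-squares line fit to y."""
    n = len(y)
    xs = range(n)
    sx, sy = sum(xs), sum(y)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * v for x, v in zip(xs, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    inter = (sy - slope * sx) / n
    return sum((v - (slope * x + inter)) ** 2 for x, v in zip(xs, y)) / n

def hurst_mfdfa(series, q=2.0, scales=(16, 32, 64, 128, 256)):
    """Generalized Hurst exponent h(q) by MFDFA with linear detrending."""
    mean = sum(series) / len(series)
    profile, c = [], 0.0
    for v in series:                      # profile = cumulative sum
        c += v - mean
        profile.append(c)
    log_s, log_f = [], []
    for s in scales:
        n_seg = len(profile) // s
        f2 = [_linfit_residual_var(profile[i * s:(i + 1) * s])
              for i in range(n_seg)]
        fq = (sum(v ** (q / 2.0) for v in f2) / n_seg) ** (1.0 / q)
        log_s.append(math.log(s))
        log_f.append(math.log(fq))
    # h(q) is the slope of log F_q(s) versus log s
    n = len(scales)
    sx, sy = sum(log_s), sum(log_f)
    sxx = sum(x * x for x in log_s)
    sxy = sum(x * y for x, y in zip(log_s, log_f))
    return (n * sxy - sx * sy) / (n * sxx - sx * sx)

random.seed(1)
noise = [random.gauss(0.0, 1.0) for _ in range(4096)]
h2 = hurst_mfdfa(noise)   # white noise: h(2) should be near 0.5
```

    Shrinking the series (and hence the usable scales) widens the spread of the estimated h(q), which is precisely the short-series effect the paper quantifies.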

  8. Analysis of guidance law performance using personal computers

    NASA Technical Reports Server (NTRS)

    Barrios, J. Rene

    1990-01-01

    A point mass, three-degree of freedom model is presented as a basic development tool for PC based simulation models. The model has been used in the development of guidance algorithms as well as in other applications such as performance management systems to compute optimal speeds. Its limitations and advantages are discussed with regard to the windshear environment. A method for simulating a simple autopilot is explained in detail and applied in the analysis of different guidance laws.

  9. Analysis of End-to-End Performance of LAN Systems

    DTIC Science & Technology

    1990-03-01

    thesis research. We show the results with respect to LAN utilization, request delay, complete transfer, delivery time, and incomplete transfer. These...request delay, LAN utilization and delivery time will be measured for the purpose of the performance analysis of LANs. Since the analytical approach based...one or two servers (as Tables 7-12 and Figures 26-31) " AVG, STD DEV delivery time for transaction class 1 with one or two servers: from PC to Server

  10. Detection of Wind Turbine Power Performance Abnormalities Using Eigenvalue Analysis

    DTIC Science & Technology

    2014-10-02

    seen for Turbine #09 in figure 9. Then, a noise reduction mode was enabled for the current wind turbine (and for the vast majority of the turbines ...Production, Wind Speed and Power Curve - Case: Enabling of noise reduction mode. Figure 11. Turbine #07 - Noise reduction mode - Trending behaviour of...Detection of Wind Turbine Power Performance Abnormalities Using Eigenvalue Analysis Georgios Alexandros Skrimpas1, Christian Walsted Sweeney2, Kun S

  11. Architecture and Performance Analysis of General Bio-Molecular Networks

    DTIC Science & Technology

    2012-01-14

    General Bio-Molecular Networks Contract/Grant #: FA9550-10-1-0128 Table of Contents...14-10-2011 4. TITLE AND SUBTITLE Architecture and Performance Analysis of Bio-Molecular Network 5a. CONTRACT NUMBER FA9550-10-1-0128 5b...method is expected to be much better, in terms of the running time, for the system with more molecules. 15. SUBJECT TERMS Stochastic Bio-molecular

  12. Performance Analysis of Visible Light Communication Using CMOS Sensors.

    PubMed

    Do, Trong-Hop; Yoo, Myungsik

    2016-02-29

    This paper elucidates the fundamentals of visible light communication systems that use the rolling shutter mechanism of CMOS sensors. All related information involving different subjects, such as photometry, camera operation, photography and image processing, are studied in tandem to explain the system. Then, the system performance is analyzed with respect to signal quality and data rate. To this end, a measure of signal quality, the signal to interference plus noise ratio (SINR), is formulated. Finally, a simulation is conducted to verify the analysis.
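    The SINR measure is compact enough to state directly; the rolling-shutter rate helper below is a simplified upper bound that assumes one symbol per sensor row, which is an idealization rather than the paper's derivation:

```python
import math

def sinr_db(signal_w, interference_w, noise_w):
    """Signal-to-interference-plus-noise ratio, in decibels."""
    return 10.0 * math.log10(signal_w / (interference_w + noise_w))

def rolling_shutter_rate(fps, rows, bits_per_row=1):
    """Idealized data-rate upper bound for a rolling-shutter receiver:
    every sensor row contributes one sample per frame."""
    return fps * rows * bits_per_row

snr = sinr_db(1.0, 0.0, 0.1)           # interference-free case
rate = rolling_shutter_rate(30, 1080)  # 30 fps, 1080-row sensor
```

    The row-scanning effect is what lets a 30 fps camera exceed a naive 30 samples-per-second limit, at the cost of the inter-symbol interference captured in the SINR term.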

  13. Performance analysis of wireless sensor networks in geophysical sensing applications

    NASA Astrophysics Data System (ADS)

    Uligere Narasimhamurthy, Adithya

    Performance is an important criterion to consider before switching from a wired network to a wireless sensing network. Performance is especially important in geophysical sensing, where the quality of the sensing system is measured by the precision of the acquired signal. Can a wireless sensing network maintain the same reliability and quality metrics that a wired system provides? Our work focuses on evaluating the wireless GeoMote sensor motes that were developed by previous computer science graduate students at Mines. Specifically, we conducted a set of experiments, namely WalkAway and Linear Array experiments, to characterize the performance of the wireless motes. The motes were also equipped with the Sticking Heartbeat Aperture Resynchronization Protocol (SHARP), a time synchronization protocol developed by a previous computer science graduate student at Mines. This protocol should automatically synchronize the motes' internal clocks and reduce time synchronization errors. We also collected passive data to evaluate the response of GeoMotes to various frequency components associated with the seismic waves. With the data collected from these experiments, we evaluated the performance of the SHARP protocol and compared the performance of our GeoMote wireless system against the industry-standard wired seismograph system (Geometrics Geode). Using arrival time analysis and seismic velocity calculations, we set out to answer the following question: Can our wireless sensing system (GeoMotes) perform similarly to a traditional wired system in a realistic scenario?

  14. Performance Demonstration Program Plan for Analysis of Simulated Headspace Gases

    SciTech Connect

    Carlsbad Field Office

    2007-11-13

    The Performance Demonstration Program (PDP) for headspace gases distributes blind audit samples in a gas matrix for analysis of volatile organic compounds (VOCs). Participating measurement facilities (i.e., fixed laboratories, mobile analysis systems, and on-line analytical systems) are located across the United States. Each sample distribution is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed for transuranic (TRU) waste characterization. The primary documents governing the conduct of the PDP are the Quality Assurance Program Document (QAPD) (DOE/CBFO-94-1012) and the Waste Isolation Pilot Plant (WIPP) Waste Analysis Plan (WAP) contained in the Hazardous Waste Facility Permit (NM4890139088-TSDF) issued by the New Mexico Environment Department (NMED). The WAP requires participation in the PDP; the PDP must comply with the QAPD and the WAP. This plan implements the general requirements of the QAPD and the applicable requirements of the WAP for the Headspace Gas (HSG) PDP. Participating measurement facilities analyze blind audit samples of simulated TRU waste package headspace gases according to the criteria set by this PDP Plan. Blind audit samples (hereafter referred to as PDP samples) are used as an independent means to assess each measurement facility’s compliance with the WAP quality assurance objectives (QAOs). To the extent possible, the concentrations of VOC analytes in the PDP samples encompass the range of concentrations anticipated in actual TRU waste package headspace gas samples. Analyses of headspace gases are required by the WIPP to demonstrate compliance with regulatory requirements. These analyses must be performed by measurement facilities that have demonstrated acceptable performance in this PDP. These analyses are referred to as WIPP analyses and the TRU waste package headspace gas samples on which they are performed are referred to as WIPP samples in this document

  15. Performance Demonstration Program Plan for Analysis of Simulated Headspace Gases

    SciTech Connect

    Carlsbad Field Office

    2007-11-19

    The Performance Demonstration Program (PDP) for headspace gases distributes blind audit samples in a gas matrix for analysis of volatile organic compounds (VOCs). Participating measurement facilities (i.e., fixed laboratories, mobile analysis systems, and on-line analytical systems) are located across the United States. Each sample distribution is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed for transuranic (TRU) waste characterization. The primary documents governing the conduct of the PDP are the Quality Assurance Program Document (QAPD) (DOE/CBFO-94-1012) and the Waste Isolation Pilot Plant (WIPP) Waste Analysis Plan (WAP) contained in the Hazardous Waste Facility Permit (NM4890139088-TSDF) issued by the New Mexico Environment Department (NMED). The WAP requires participation in the PDP; the PDP must comply with the QAPD and the WAP. This plan implements the general requirements of the QAPD and the applicable requirements of the WAP for the Headspace Gas (HSG) PDP. Participating measurement facilities analyze blind audit samples of simulated TRU waste package headspace gases according to the criteria set by this PDP Plan. Blind audit samples (hereafter referred to as PDP samples) are used as an independent means to assess each measurement facility’s compliance with the WAP quality assurance objectives (QAOs). To the extent possible, the concentrations of VOC analytes in the PDP samples encompass the range of concentrations anticipated in actual TRU waste package headspace gas samples. Analyses of headspace gases are required by the WIPP to demonstrate compliance with regulatory requirements. These analyses must be performed by measurement facilities that have demonstrated acceptable performance in this PDP. These analyses are referred to as WIPP analyses and the TRU waste package headspace gas samples on which they are performed are referred to as WIPP samples in this document

  16. Performance Demonstration Program Plan for Analysis of Simulated Headspace Gases

    SciTech Connect

    Carlsbad Field Office

    2006-04-01

    The Performance Demonstration Program (PDP) for headspace gases distributes sample gases of volatile organic compounds (VOCs) for analysis. Participating measurement facilities (i.e., fixed laboratories, mobile analysis systems, and on-line analytical systems) are located across the United States. Each sample distribution is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed for transuranic (TRU) waste characterization. The primary documents governing the conduct of the PDP are the Quality Assurance Program Document (QAPD) (DOE/CBFO-94-1012) and the Waste Isolation Pilot Plant (WIPP) Waste Analysis Plan (WAP) contained in the Hazardous Waste Facility Permit (NM4890139088-TSDF) issued by the New Mexico Environment Department (NMED). The WAP requires participation in the PDP; the PDP must comply with the QAPD and the WAP. This plan implements the general requirements of the QAPD and the applicable requirements of the WAP for the Headspace Gas (HSG) PDP. Participating measurement facilities analyze blind audit samples of simulated TRU waste package headspace gases according to the criteria set by this PDP Plan. Blind audit samples (hereafter referred to as PDP samples) are used as an independent means to assess each measurement facility’s compliance with the WAP quality assurance objectives (QAOs). To the extent possible, the concentrations of VOC analytes in the PDP samples encompass the range of concentrations anticipated in actual TRU waste package headspace gas samples. Analyses of headspace gases are required by the WIPP to demonstrate compliance with regulatory requirements. These analyses must be performed by measurement facilities that have demonstrated acceptable performance in this PDP. These analyses are referred to as WIPP analyses and the TRU waste package headspace gas samples on which they are performed are referred to as WIPP samples in this document. Participating measurement

  17. An assessment of SBS modified asphalt concrete pavements performance features performing numerical analysis

    NASA Astrophysics Data System (ADS)

    Karakas, Ahmet Sertac; Bozkurt, Tarik Serhat; Sayin, Baris; Ortes, Faruk

    2017-07-01

    Asphalt concrete pavement prepared with hot mix asphalt (HMA) is one of the most preferred types of flexible superstructure for roads, which carry the largest share of passenger and freight traffic. During the service life of the road, the pavement must provide the performance it is expected to show. HMA must have a high-performance mix design and be comfortable, safe, and resistant to degradation. In addition, because raw material supplies are limited, the use of various additive materials becomes a critical need if roads are to serve long term against environmental conditions such as traffic and climate. Styrene Butadiene Styrene (SBS) polymers are widely used among these additives. In this study, numerical analysis of asphalt concrete pavements prepared with SBS-modified HMA in different thicknesses is performed. Stress and deformation values of the three pavement models are then compared and evaluated.

  18. Multi-order analysis framework for comprehensive biometric performance evaluation

    NASA Astrophysics Data System (ADS)

    Gorodnichy, Dmitry O.

    2010-04-01

    It is not uncommon for contemporary biometric systems to have more than one match below the matching threshold, or to have two or more matches with close matching scores. This is especially true for those that store large quantities of identities and/or are applied to measure loosely constrained biometric traits, such as in identification from video or at a distance. Current biometric performance evaluation standards, however, are still largely based on measuring single-score statistics such as False Match and False Non-Match rates and the trade-off curves based thereon. Such methodology and reporting make it impossible to investigate the risks and risk mitigation strategies associated with not having a unique identifying score. To address the issue, the Canada Border Services Agency has developed a novel modality-agnostic multi-order performance analysis framework. The framework allows one to analyze the system performance at several levels of detail, by defining the traditional single-score-based metrics as Order-1 analysis, and introducing Order-2 and Order-3 analysis to permit the investigation of the system reliability and the confidence of its recognition decisions. Implemented in a toolkit called C-BET (Comprehensive Biometrics Evaluation Toolkit), the framework has been applied in a recent examination of state-of-the-art iris recognition systems, the results of which are presented, and is now recommended to other agencies interested in testing and tuning biometric systems.

  19. A Multifaceted Independent Performance Analysis of Facial Subspace Recognition Algorithms

    PubMed Central

    Bajwa, Usama Ijaz; Taj, Imtiaz Ahmad; Anwar, Muhammad Waqas; Wang, Xuan

    2013-01-01

    Face recognition has emerged as the fastest growing biometric technology and has expanded considerably in the last few years. Many new algorithms and commercial systems have been proposed and developed, most of which use Principal Component Analysis (PCA) as a basis for their techniques. Different and even conflicting results have been reported by researchers comparing these algorithms. The purpose of this study is to provide an independent comparative analysis, considering both performance and computational complexity, of six appearance-based face recognition algorithms, namely PCA, 2DPCA, A2DPCA, (2D)2PCA, LPP and 2DLPP, under equal working conditions. This study was motivated by the lack of unbiased, comprehensive comparative analysis of some recent subspace methods with diverse distance-metric combinations. For comparability with other studies, the FERET, ORL and YALE databases have been used, with the evaluation criteria of the FERET evaluations, which closely simulate real-life scenarios. A comparison of results with previous studies is performed and anomalies are reported. An important contribution of this study is that it presents the suitable performance conditions for each of the algorithms under consideration. PMID:23451054

  20. SIMS analysis of high-performance accelerator niobium

    SciTech Connect

    Maheshwari, P.; Stevie, F. A.; Myneni, Ganapati Rao; Rigsbee, J. M.; Dhakal, Pashupati; Ciovati, Gianluigi; Griffis, D. P.

    2014-11-01

    Niobium is used to fabricate superconducting radio frequency accelerator modules because of its high critical temperature, high critical magnetic field, and easy formability. Recent experiments have shown a very significant improvement in performance (over 100%) after a high-temperature bake at 1400 °C for 3 h. SIMS analysis of this material showed that the oxygen profile was significantly deeper than the native oxide, with a shape indicative of diffusion. Positive secondary ion mass spectra showed the presence of Ti with a depth profile similar to that of O. It is suspected that Ti is associated with the performance improvement. The source of Ti contamination in the anneal furnace has been identified, and a new furnace was constructed without Ti. Initial results from the new furnace do not show the yield improvement. Further analyses should determine the relationship of Ti to cavity performance.

  1. Performance Analysis of Coaxial Fed Stacked Patch Antennas

    NASA Astrophysics Data System (ADS)

    Jain, Satish K.; Jain, Shobha

    2014-01-01

    A performance analysis of a coaxial-fed, stacked dual-patch, electromagnetically coupled microstrip antenna for satellite communication in the X/Ku band is presented. A simplified stacked dual-patch antenna structure with an adjustable foam gap between the patches is proposed. A few important geometrical parameters on which the performance of the stacked dual-patch antenna mainly depends were chosen: the dimensions of the lower square patch, the dimensions of the upper square patch, and the height of the foam gap between the two patches. These were varied one by one, keeping the other parameters constant. The performance was observed through the reflection coefficient (dB) and the Smith-chart impedance plot obtained from a numerical simulator (IE3D) for the dual resonance frequencies and bandwidth. The proposed stacked dual-patch antenna geometry was also analyzed with the cavity model and an artificial neural network modeling technique; the dual resonance frequencies and associated bandwidths were calculated with them, and the results were cross-checked in the laboratory against a few experimental findings.
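
    The cavity-model analysis mentioned above rests on the standard dominant-mode resonance estimate for a patch; a hedged sketch (patch length and effective permittivity below are assumed illustrative values, not the paper's design, and the fringing-length correction is omitted):

```python
import math

C = 2.998e8  # speed of light, m/s

def patch_resonant_freq(length, eps_eff):
    """Dominant-mode (TM10) resonant frequency of a square microstrip patch,
    cavity-model estimate without the fringing-length correction."""
    return C / (2 * length * math.sqrt(eps_eff))

# Assumed example: ~10 mm patch over a substrate with eps_eff ~ 2.2
print(patch_resonant_freq(10e-3, 2.2) / 1e9, "GHz")
```

Varying the two patch lengths shifts the two resonances of the stacked pair, which is why the abstract varies those dimensions one at a time.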

  2. Performance Analysis and Improvement of WPAN MAC for Home Networks

    PubMed Central

    Mehta, Saurabh; Kwak, Kyung Sup

    2010-01-01

    The wireless personal area network (WPAN) is an emerging wireless technology for future short-range indoor and outdoor communication applications. The IEEE 802.15.3 medium access control (MAC) is proposed to coordinate access to the wireless medium among competing devices, especially for short-range, high-data-rate applications in home networks. In this paper we use analytical modeling to study the performance of the WPAN (IEEE 802.15.3) MAC in terms of throughput, efficient bandwidth utilization, and delay with various ACK policies under error-prone channel conditions. This analysis leads us to introduce a K-Dly-ACK-AGG policy, a payload size adjustment mechanism, and an Improved Backoff algorithm to improve the performance of the WPAN MAC. Performance evaluation results demonstrate the impact of our improvements on network capacity. Moreover, these results can be very useful to WPAN application designers and protocol architects to easily and correctly implement WPAN for home networking. PMID:22319274

  3. Performance analysis and improvement of WPAN MAC for home networks.

    PubMed

    Mehta, Saurabh; Kwak, Kyung Sup

    2010-01-01

    The wireless personal area network (WPAN) is an emerging wireless technology for future short-range indoor and outdoor communication applications. The IEEE 802.15.3 medium access control (MAC) is proposed to coordinate access to the wireless medium among competing devices, especially for short-range, high-data-rate applications in home networks. In this paper we use analytical modeling to study the performance of the WPAN (IEEE 802.15.3) MAC in terms of throughput, efficient bandwidth utilization, and delay with various ACK policies under error-prone channel conditions. This analysis leads us to introduce a K-Dly-ACK-AGG policy, a payload size adjustment mechanism, and an Improved Backoff algorithm to improve the performance of the WPAN MAC. Performance evaluation results demonstrate the impact of our improvements on network capacity. Moreover, these results can be very useful to WPAN application designers and protocol architects to easily and correctly implement WPAN for home networking.

  4. Total systems design analysis of high performance structures

    NASA Technical Reports Server (NTRS)

    Verderaime, V.

    1993-01-01

    Designer-controlled parameters were identified at interdiscipline interfaces to optimize structural systems performance and downstream development and operations with reliability and least life-cycle cost. Interface tasks and iterations are tracked through a matrix of performance-discipline integration versus manufacturing, verification, and operations interactions for a total system design analysis. Performance integration tasks include shapes, sizes, environments, and materials. Integrity integration tasks are reliability and recurring structural costs. Significant interface designer-controlled parameters noted were shapes, dimensions, probability range factors, and cost. A structural failure concept is presented, and first-order reliability and deterministic methods, their benefits, and their limitations are discussed. A deterministic reliability technique combining the benefits of both, which is also timely and economically verifiable, is proposed for static structures. Though launch vehicle environments were primarily considered, the system design process is applicable to any surface system using its own unique field environments.
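
    The first-order reliability method referred to above can be sketched for the simplest case of independent, normally distributed resistance and load (the means and standard deviations below are invented for illustration):

```python
import math

def reliability_index(mu_r, sd_r, mu_s, sd_s):
    """First-order reliability for margin M = R - S with independent normal
    resistance R and load S: beta = (mu_R - mu_S)/sqrt(sd_R^2 + sd_S^2),
    failure probability Pf = Phi(-beta)."""
    beta = (mu_r - mu_s) / math.hypot(sd_r, sd_s)
    pf = 0.5 * math.erfc(beta / math.sqrt(2))  # standard normal tail
    return beta, pf

beta, pf = reliability_index(mu_r=500.0, sd_r=30.0, mu_s=350.0, sd_s=40.0)
print(f"beta = {beta:.2f}, Pf = {pf:.2e}")
```

A deterministic check would instead compare mu_r against mu_s times a safety factor; the abstract's proposed technique blends the two viewpoints.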

  5. Crew Exploration Vehicle Launch Abort Controller Performance Analysis

    NASA Technical Reports Server (NTRS)

    Sparks, Dean W., Jr.; Raney, David L.

    2007-01-01

    This paper covers the simulation and evaluation of a controller design for the Crew Module (CM) Launch Abort System (LAS), to measure its ability to meet the abort performance requirements. The controller used in this study is a hybrid design, including features developed by the Government and the Contractor. Testing is done using two separate 6-degree-of-freedom (DOF) computer simulation implementations of the LAS/CM throughout the ascent trajectory: 1) executing a series of abort simulations along a nominal trajectory for the nominal LAS/CM system; and 2) using a series of Monte Carlo runs with perturbed initial flight conditions and perturbed system parameters. The performance of the controller is evaluated against a set of criteria, which is based upon the current functional requirements of the LAS. Preliminary analysis indicates that the performance of the present controller meets (with the exception of a few cases) the evaluation criteria mentioned above.

  6. The Vehicle Integrated Performance Analysis Experience: Reconnecting With Technical Integration

    NASA Technical Reports Server (NTRS)

    McGhee, D. S.

    2006-01-01

    Very early in the Space Launch Initiative program, a small team of engineers at MSFC proposed a process for performing system-level assessments of a launch vehicle. Aimed primarily at providing insight and making NASA a smart buyer, the Vehicle Integrated Performance Analysis (VIPA) team was created. The difference between the VIPA effort and previous integration attempts is that VIPA is a process using experienced people from various disciplines, which focuses them on a technically integrated assessment. The foundations of VIPA's process are described. The VIPA team also recognized the need to target early detailed analysis toward identifying significant systems issues. This process is driven by the T-model for technical integration. VIPA's approach to performing system-level technical integration is discussed in detail. The VIPA process significantly enhances the development and monitoring of realizable project requirements. VIPA's assessment validates the concept's stated performance, identifies significant issues either with the concept or the requirements, and then reintegrates these issues to determine impacts. This process is discussed along with a description of how it may be integrated into a program's insight and review process. The VIPA process has gained favor with both engineering and project organizations for being responsive and insightful.

  7. Performance bounds for modal analysis using sparse linear arrays

    NASA Astrophysics Data System (ADS)

    Li, Yuanxin; Pezeshki, Ali; Scharf, Louis L.; Chi, Yuejie

    2017-05-01

    We study the performance of modal analysis using sparse linear arrays (SLAs), such as nested and co-prime arrays, in both first-order and second-order measurement models. We treat SLAs as constructed from a subset of sensors in a dense uniform linear array (ULA), and characterize the performance loss of SLAs with respect to the ULA due to using far fewer sensors. In particular, we claim that, given the same aperture, to achieve comparable performance in terms of the Cramér-Rao bound (CRB) for modal analysis, SLAs require more snapshots: approximately the number of snapshots used by the ULA times the compression ratio in the number of sensors. This is shown analytically for the case with one undamped mode, as well as empirically via extensive numerical experiments for more complex scenarios. Moreover, the misspecified CRB proposed by Richmond and Horowitz is also studied, where SLAs suffer more performance loss than their ULA counterpart.
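
    The snapshot rule of thumb stated above is simple arithmetic; a sketch (the array sizes and snapshot count below are assumed for illustration, not taken from the paper's experiments):

```python
def required_snapshots_sla(n_ula_snapshots, m_ula, m_sla):
    """Snapshots an SLA needs for comparable CRB, per the stated rule of
    thumb: the ULA snapshot count times the sensor compression ratio."""
    return n_ula_snapshots * m_ula / m_sla

# Assumed example: an 8-sensor SLA spanning the aperture of a 20-sensor ULA
print(required_snapshots_sla(100, m_ula=20, m_sla=8))
```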

  8. Dynamic performances analysis of a real vehicle driving

    NASA Astrophysics Data System (ADS)

    Abdullah, M. A.; Jamil, J. F.; Salim, M. A.

    2015-12-01

    Vehicle dynamics concerns the motions of a vehicle generated by acceleration, braking, ride and handling activities. The dynamic behaviour is determined by the tire, gravity and aerodynamic forces acting on the vehicle. This paper emphasizes the analysis of the dynamic performance of a real vehicle. A real driving experiment is conducted to measure the vehicle's roll, pitch and yaw motions and its longitudinal, lateral and vertical accelerations. An accelerometer records the vehicle's dynamic behaviour while it is driven on the road. The experiment starts by weighing the car to locate the center of gravity (COG), where the accelerometer sensor is placed for data acquisition (DAQ); the COG of the vehicle is determined from its weight distribution. A rural route is set for the experiment and the road conditions along the test route are recorded. The dynamic performance of the vehicle depends on the road conditions and the driving maneuvers, and the stability of a vehicle can be assessed through dynamic performance analysis.
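
    The COG location from weighing can be sketched as a weighted average of the wheel positions (the corner weights, wheelbase and track below are invented illustrative numbers, not the test vehicle's data):

```python
def cog_from_corner_weights(weights, positions):
    """Planar centre of gravity from wheel-scale readings.
    weights: per-wheel loads (kg); positions: matching (x, y) coords (m)."""
    total = sum(weights)
    x = sum(w * p[0] for w, p in zip(weights, positions)) / total
    y = sum(w * p[1] for w, p in zip(weights, positions)) / total
    return x, y

# Assumed corner weights for a 2.5 m wheelbase, 1.5 m track: FL, FR, RL, RR
weights = [310.0, 300.0, 250.0, 240.0]
positions = [(0.0, 0.0), (0.0, 1.5), (2.5, 0.0), (2.5, 1.5)]
print(cog_from_corner_weights(weights, positions))
```

Mounting the accelerometer at this point avoids contaminating the translational acceleration channels with lever-arm effects from body rotation.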

  9. Reproducible LTE uplink performance analysis using precomputed interference signals

    NASA Astrophysics Data System (ADS)

    Pauli, Volker; Nisar, Muhammad Danish; Seidel, Eiko

    2011-12-01

    The consideration of realistic uplink inter-cell interference is essential for the overall performance testing of future cellular systems, and in particular for the evaluation of the radio resource management (RRM) algorithms. Most beyond-3G communication systems employ orthogonal multiple access in uplink (SC-FDMA in LTE and OFDMA in WiMAX), and additionally rely on frequency-selective RRM (scheduling) algorithms. This makes the task of accurate modeling of uplink interference both crucial and non-trivial. Traditional methods for its modeling (e.g., via additive white Gaussian noise interference sources) are therefore proving to be ineffective to realistically model the uplink interference in the next generation cellular systems. In this article, we propose the use of realistic precomputed interference patterns for LTE uplink performance analysis and testing. The interference patterns are generated via an LTE system-level simulator for a given set of scenario parameters, such as cell configuration, user configurations, and traffic models. The generated interference patterns (some of which are made publicly available) can be employed to benchmark the performance of any LTE uplink system in both lab simulations and field trials for practical deployments. It is worth mentioning that the proposed approach can also be extended to other cellular communication systems employing OFDMA-like multiple access with frequency-selective RRM techniques. The proposed approach offers twofold advantages. First, it allows for repeatability and reproducibility of the performance analysis. This is of crucial significance not only for researchers and developers to analyze the behavior and performance of their systems, but also for the network operators to compare the performance of competing system vendors. Second, the proposed testing mechanism evades the need for deployment of multiple cells (with multiple active users in each) to achieve realistic field trials, thereby resulting in

  10. Design and performance analysis of multilayer nested grazing incidence optics

    NASA Astrophysics Data System (ADS)

    Zuo, Fuchang; Deng, Loulou; Mei, Zhiwu; Li, Liansheng; Lv, Zhengxin

    2014-10-01

    We have developed X-ray grazing incidence optics with a single mirror. Although such optics can be used for ground demonstration and testing to verify the feasibility of an X-ray detection system, they cannot meet the requirements of X-ray pulsar navigation because of their small effective area and large mass. There is therefore an urgent need to develop multilayer nested grazing incidence optics, in which multiple layers of mirrors form a coaxial, confocal system that maximizes the use of space and increases the effective area. In this paper, aiming at the future demands of X-ray pulsar navigation, the optimization and analysis of nested X-ray grazing incidence optics were carried out: the recurrence relations between the layers of mirrors were derived, reasonable initial structural parameters and a stray-light reduction method were given, and the theoretical effective collection area was calculated. The initial structure and the stray-light-eliminating structure are designed. An optical-mechanical-thermal numerical model was established using optical analysis software and finite element software for stray-light analysis, focusing performance analysis, tolerance analysis, and mechanical analysis, providing evidence and guidance for the fabrication and alignment of nested X-ray grazing incidence optics.
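
    The effective-area gain from nesting can be sketched with the standard geometric estimate for a grazing-incidence shell, a projected annulus of roughly 2*pi*r*L*sin(alpha) scaled by mirror reflectivity (all radii, length, graze angle and reflectivity below are assumed illustrative values, not the paper's design):

```python
import math

def shell_area(radius, length, graze_angle_rad, reflectivity=1.0):
    """On-axis geometric collecting area (m^2) of one grazing-incidence
    shell: projected annulus ~ 2*pi*r*L*sin(alpha), times reflectivity."""
    return 2 * math.pi * radius * length * math.sin(graze_angle_rad) * reflectivity

# Assumed nested set of shell radii (m), 0.10 m long, 1 deg graze angle
radii = [0.030, 0.040, 0.050, 0.060]
total = sum(shell_area(r, length=0.10, graze_angle_rad=math.radians(1.0),
                       reflectivity=0.8) for r in radii)
print(f"total effective area ~ {total * 1e4:.1f} cm^2")
```

Each added inner shell contributes its own annulus, which is how the nested design multiplies the effective area of the single-mirror optic.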

  11. Quantitative analysis of the reconstruction performance of interpolants

    NASA Technical Reports Server (NTRS)

    Lansing, Donald L.; Park, Stephen K.

    1987-01-01

    The analysis presented provides a quantitative measure of the reconstruction or interpolation performance of linear, shift-invariant interpolants. The performance criterion is the mean square error of the difference between the sampled and reconstructed functions. The analysis is applicable to reconstruction algorithms used in image processing and to many types of splines used in numerical analysis and computer graphics. When formulated in the frequency domain, the mean square error clearly separates the contribution of the interpolation method from the contribution of the sampled data. The equations provide a rational basis for selecting an optimal interpolant; that is, one which minimizes the mean square error. The analysis has been applied to a selection of frequently used data splines and reconstruction algorithms: parametric cubic and quintic Hermite splines, exponential and nu splines (including the special case of the cubic spline), parametric cubic convolution, Keys' fourth-order cubic, and a cubic with a discontinuous first derivative. The emphasis in this paper is on the image-dependent case in which no a priori knowledge of the frequency spectrum of the sampled function is assumed.
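
    The mean-square-error criterion described above can be illustrated with the simplest linear, shift-invariant interpolant, the linear (tent) reconstruction; this is a toy numeric check in the spatial domain, not the paper's frequency-domain formulation, and the test function sin(x) is an assumption:

```python
import math

def linear_reconstruct(samples, dx, x):
    """Reconstruct f(x) from uniform samples via the linear (tent) interpolant."""
    i = min(int(x / dx), len(samples) - 2)
    t = x / dx - i
    return (1 - t) * samples[i] + t * samples[i + 1]

def reconstruction_mse(n_samples, n_eval=2000):
    """MSE between sin on [0, pi] and its linear reconstruction from
    n_samples uniform samples, evaluated on a fine grid."""
    dx = math.pi / (n_samples - 1)
    samples = [math.sin(i * dx) for i in range(n_samples)]
    errs = []
    for k in range(n_eval):
        x = math.pi * k / (n_eval - 1)
        errs.append((math.sin(x) - linear_reconstruct(samples, dx, x)) ** 2)
    return sum(errs) / len(errs)

print(reconstruction_mse(8), reconstruction_mse(32))  # denser sampling -> smaller MSE
```

Comparing such MSE figures across interpolants (cubic convolution, splines, etc.) at a fixed sampling rate is exactly the kind of ranking the paper's criterion enables.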

  12. Evaluating health service quality: using importance performance analysis.

    PubMed

    Izadi, Azar; Jahani, Younes; Rafiei, Sima; Masoud, Ali; Vali, Leila

    2017-08-14

    Purpose Measuring healthcare service quality provides an objective guide for managers and policy makers to improve their services and patient satisfaction. Consequently, the purpose of this paper is to measure service quality provided to surgical and medical inpatients at Kerman Medical Sciences University (KUMS) in 2015. Design/methodology/approach A descriptive-analytic study, using a cross-sectional method in the KUMS training hospitals, was implemented between October 2 and March 15, 2015. Using stratified random sampling, 268 patients were selected. Data were collected using an importance-performance analysis (IPA) questionnaire, which measures current performance and determines each item's importance from the patients' perspectives. These data indicate overall satisfaction and appropriate practical strategies for managers to plan accordingly. Findings Findings revealed a significant gap between service importance and performance. From the patients' viewpoint, tangibility was the highest priority (mean=3.54), while reliability was given the highest performance (mean=3.02). The least important and lowest performance level was social accountability (mean=1.91 and 1.98, respectively). Practical implications Healthcare managers should focus on patient viewpoints and apply patient comments to solve problems, improve service quality and patient satisfaction. Originality/value The authors applied an IPA questionnaire to measure service quality provided to surgical and medical ward patients. This method identifies and corrects service quality shortcomings, thereby improving service recipients' perceptions.
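
    The core of an importance-performance analysis is plotting each dimension against the grand means and reading off the four classic quadrants; a hedged sketch (the scores below loosely echo the abstract's reported means but are otherwise invented, as are the dimension names not mentioned in the abstract):

```python
def ipa_quadrants(items):
    """Classify items into the four importance-performance quadrants,
    using the grand means of importance and performance as crosshairs.
    items: {name: (importance, performance)}."""
    mi = sum(v[0] for v in items.values()) / len(items)
    mp = sum(v[1] for v in items.values()) / len(items)
    q = {}
    for name, (imp, perf) in items.items():
        if imp >= mi and perf < mp:
            q[name] = "concentrate here"
        elif imp >= mi and perf >= mp:
            q[name] = "keep up the good work"
        elif imp < mi and perf < mp:
            q[name] = "low priority"
        else:
            q[name] = "possible overkill"
    return q

items = {"tangibility": (3.54, 2.30), "reliability": (3.10, 3.02),
         "responsiveness": (2.80, 2.10), "social accountability": (1.91, 1.98)}
print(ipa_quadrants(items))
```

Dimensions landing in "concentrate here" (high importance, low performance) are the ones the gap analysis flags for managerial action.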

  13. How motivation affects academic performance: a structural equation modelling analysis.

    PubMed

    Kusurkar, R A; Ten Cate, Th J; Vos, C M P; Westers, P; Croiset, G

    2013-03-01

    Few studies in medical education have examined the effect of the quality of motivation on performance. Self-Determination Theory, based on quality of motivation, differentiates between Autonomous Motivation (AM), which originates within an individual, and Controlled Motivation (CM), which originates from external sources. Our aims were to determine whether Relative Autonomous Motivation (RAM, a measure of the balance between AM and CM) affects academic performance through good study strategy and higher study effort, and to compare this model between subgroups: males and females, and students selected via two different systems, namely qualitative and weighted lottery selection. Data on motivation, study strategy and effort were collected from 383 medical students of VU University Medical Center Amsterdam, and their academic performance results were obtained from the student administration. Structural Equation Modelling was used to test a hypothesized model in which high RAM would positively affect Good Study Strategy (GSS) and study effort, which in turn would positively affect academic performance in the form of grade point averages. This model fit the data well (chi-square = 1.095, df = 3, p = 0.778, RMSEA = 0.000), and it also fit well for all tested subgroups of students. Differences were found, as expected, in the strength of the relationships between the variables for the different subgroups. In conclusion, RAM was positively correlated with academic performance through a deep study strategy and higher study effort. This model appears valid in medical education for subgroups such as males, females, and students selected by qualitative and weighted lottery selection.

  14. Analysis of Latency Performance of Bluetooth Low Energy (BLE) Networks

    PubMed Central

    Cho, Keuchul; Park, Woojin; Hong, Moonki; Park, Gisu; Cho, Wooseong; Seo, Jihoon; Han, Kijun

    2015-01-01

    Bluetooth Low Energy (BLE) is a short-range wireless communication technology aiming at low-cost and low-power communication. The performance evaluation of classical Bluetooth device discovery has been intensively studied using analytical modeling and simulative methods, but these techniques are not applicable to BLE, since BLE fundamentally changes the design of the discovery mechanism, including the use of three advertising channels. Several recent works have analyzed BLE device discovery, but these studies are still far from thorough. It is thus necessary to develop a new, accurate model for the BLE discovery process. In particular, the wide ranges over which the parameters may be set give BLE devices great latitude to customize their discovery performance, which motivates our study of modeling the BLE discovery process and performing intensive simulation. This paper focuses on building an analytical model of the discovery probability and the expected discovery latency, which are then validated via extensive experiments. Our analysis considers both continuous and discontinuous scanning modes. We analyze the sensitivity of these performance metrics to the parameter settings to examine quantitatively the extent to which the parameters influence the performance of the discovery process. PMID:25545266
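
    The sensitivity of discovery latency to the advertising and scanning parameters can be sketched with a deliberately crude Monte Carlo model: an advertising event is caught when it lands inside the scanner's window on the channel currently being scanned (modelled here as a flat 1-in-3 chance). This is an assumption-laden toy, not the paper's analytical model; packet airtime, collisions, and per-channel timing are ignored:

```python
import random

def mean_discovery_latency(adv_interval, scan_interval, scan_window, trials=2000):
    """Monte Carlo estimate (s) of BLE discovery latency under a crude model."""
    rng = random.Random(42)
    total = 0.0
    for _ in range(trials):
        t = rng.uniform(0, adv_interval)      # time of first advertising event
        while True:
            in_window = (t % scan_interval) < scan_window
            if in_window and rng.random() < 1 / 3:  # channel match assumption
                total += t
                break
            # next event: advInterval plus the spec's 0-10 ms pseudo-random delay
            t += adv_interval + rng.uniform(0, 0.010)
    return total / trials

print(mean_discovery_latency(adv_interval=0.1, scan_interval=0.1, scan_window=0.05))
```

Even this toy reproduces the qualitative finding that shrinking the scan window relative to the scan interval (discontinuous scanning) stretches the expected latency.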

  15. Cross-industry Performance Modeling: Toward Cooperative Analysis

    SciTech Connect

    Reece, Wendy Jane; Blackman, Harold Stabler

    1998-10-01

    One of the current unsolved problems in human factors is the difficulty in acquiring information from lessons learned and data collected among human performance analysts in different domains. There are several common concerns and generally accepted issues of importance for human factors, psychology and industry analysts of performance and safety. Among these are the need to incorporate lessons learned in design, to carefully consider implementation of new designs and automation, and the need to reduce human performance-based contributions to risk. In spite of shared concerns, there are several roadblocks to widespread sharing of data and lessons learned from operating experience and simulation, including the fact that very few publicly accessible databases exist (Gertman & Blackman, 1994, and Kirwan, 1997). There is a need to draw together analysts and analytic methodologies to comprise a centralized source of data with sufficient detail to be meaningful while ensuring source anonymity. We propose that a generic source of performance data and a multi-domain data store may provide the first steps toward cooperative performance modeling and analysis across industries.

  16. Cross-Industry Performance Modeling: Toward Cooperative Analysis

    SciTech Connect

    H. S. Blackman; W. J. Reece

    1998-10-01

    One of the current unsolved problems in human factors is the difficulty in acquiring information from lessons learned and data collected among human performance analysts in different domains. There are several common concerns and generally accepted issues of importance for human factors, psychology and industry analysts of performance and safety. Among these are the need to incorporate lessons learned in design, to carefully consider implementation of new designs and automation, and the need to reduce human performance-based contributions to risk. In spite of shared concerns, there are several roadblocks to widespread sharing of data and lessons learned from operating experience and simulation, including the fact that very few publicly accessible databases exist (Gertman & Blackman, 1994, and Kirwan, 1997). There is a need to draw together analysts and analytic methodologies to comprise a centralized source of data with sufficient detail to be meaningful while ensuring source anonymity. We propose that a generic source of performance data and a multi-domain data store may provide the first steps toward cooperative performance modeling and analysis across industries.

  17. Analysis of latency performance of bluetooth low energy (BLE) networks.

    PubMed

    Cho, Keuchul; Park, Woojin; Hong, Moonki; Park, Gisu; Cho, Wooseong; Seo, Jihoon; Han, Kijun

    2014-12-23

    Bluetooth Low Energy (BLE) is a short-range wireless communication technology aiming at low-cost and low-power communication. The performance evaluation of classical Bluetooth device discovery has been intensively studied using analytical modeling and simulative methods, but these techniques are not applicable to BLE, since BLE fundamentally changes the design of the discovery mechanism, including the use of three advertising channels. Several recent works have analyzed BLE device discovery, but these studies are still far from thorough. It is thus necessary to develop a new, accurate model for the BLE discovery process. In particular, the wide ranges over which the parameters may be set give BLE devices great latitude to customize their discovery performance, which motivates our study of modeling the BLE discovery process and performing intensive simulation. This paper focuses on building an analytical model of the discovery probability and the expected discovery latency, which are then validated via extensive experiments. Our analysis considers both continuous and discontinuous scanning modes. We analyze the sensitivity of these performance metrics to the parameter settings to examine quantitatively the extent to which the parameters influence the performance of the discovery process.

  18. Clinical laboratory as an economic model for business performance analysis

    PubMed Central

    Buljanović, Vikica; Patajac, Hrvoje; Petrovečki, Mladen

    2011-01-01

    Aim To perform a SWOT (strengths, weaknesses, opportunities, and threats) analysis of a clinical laboratory as an economic model that may be used to improve the business performance of laboratories by removing weaknesses, minimizing threats, and using external opportunities and internal strengths. Methods The impact of possible threats and weaknesses on the business performance of the Clinical Laboratory at Našice General County Hospital, and the use of strengths and opportunities to improve operating profit, were simulated using models created on the basis of the SWOT analysis results. The operating profit, as a measure of the profitability of the clinical laboratory, was defined as total revenue minus total expenses and presented using a profit and loss account. Changes in the input parameters of the profit and loss account for 2008 were determined using the opportunities and potential threats, and an economic sensitivity analysis was made using changes in the key parameters. The profit and loss account and the economic sensitivity analysis were tools for quantifying the impact of changes in revenues and expenses on the business operations of the clinical laboratory. Results The simulation models showed that the operating profit of €470 723 in 2008 could be reduced to only €21 542 if all possible threats became a reality and current weaknesses remained the same. Conversely, the operating profit could be increased to €535 804 if the laboratory's strengths and opportunities were utilized. If both the opportunities and threats became a reality, the operating profit would decrease by €384 465. Conclusion The operating profit of the clinical laboratory could be significantly reduced if all threats became a reality and the current weaknesses remained the same, and could be increased by utilizing strengths and opportunities as much as possible. This type of modeling may be used to monitor business operations of any clinical laboratory and improve its financial situation by

  19. Clinical laboratory as an economic model for business performance analysis.

    PubMed

    Buljanović, Vikica; Patajac, Hrvoje; Petrovecki, Mladen

    2011-08-15

    To perform a SWOT (strengths, weaknesses, opportunities, and threats) analysis of a clinical laboratory as an economic model that may be used to improve the business performance of laboratories by removing weaknesses, minimizing threats, and using external opportunities and internal strengths. The impact of possible threats and weaknesses on the business performance of the Clinical Laboratory at Našice General County Hospital, and the use of strengths and opportunities to improve operating profit, were simulated using models created on the basis of the SWOT analysis results. The operating profit, as a measure of the profitability of the clinical laboratory, was defined as total revenue minus total expenses and presented using a profit and loss account. Changes in the input parameters of the profit and loss account for 2008 were determined using the opportunities and potential threats, and an economic sensitivity analysis was made using changes in the key parameters. The profit and loss account and the economic sensitivity analysis were tools for quantifying the impact of changes in revenues and expenses on the business operations of the clinical laboratory. The simulation models showed that the operating profit of €470 723 in 2008 could be reduced to only €21 542 if all possible threats became a reality and current weaknesses remained the same. Conversely, the operating profit could be increased to €535 804 if the laboratory's strengths and opportunities were utilized. If both the opportunities and threats became a reality, the operating profit would decrease by €384 465. The operating profit of the clinical laboratory could be significantly reduced if all threats became a reality and the current weaknesses remained the same, and could be increased by utilizing strengths and opportunities as much as possible. This type of modeling may be used to monitor business operations of any clinical laboratory and improve its financial situation by implementing changes in the next fiscal
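
    The profit-and-loss arithmetic behind the model is easy to sketch. The revenue and expense figures below are hypothetical, chosen only so the baseline matches the abstract's €470 723 operating profit, and the shock sizes are assumptions; the abstract reports only the profit outcomes, not the underlying line items:

```python
def operating_profit(revenue, expenses):
    """Profit-and-loss bottom line: total revenue minus total expenses."""
    return revenue - expenses

def sensitivity(revenue, expenses, rev_shock, exp_shock):
    """Operating profit after fractional shocks to revenue and expenses,
    the basic move of the economic sensitivity analysis."""
    return operating_profit(revenue * (1 + rev_shock), expenses * (1 + exp_shock))

revenue, expenses = 2_000_000, 1_529_277      # hypothetical EUR line items
print(operating_profit(revenue, expenses))    # baseline profit
print(sensitivity(revenue, expenses, -0.10, +0.05))  # threats: revenue -10%, costs +5%
```

Running such shocks for each SWOT scenario is what produces the spread of outcomes (from €21 542 up to €535 804) reported in the abstract.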

  20. Capacity and performance analysis of signaling networks in multivendor environments

    NASA Astrophysics Data System (ADS)

    Bafutto, Marcos; Kuehn, Paul J.; Willmann, Gert

    1994-04-01

    The load on common channel signaling networks is increasing with the introduction of new services such as supplementary services and mobile communication services. This may lead to a performance degradation of the signaling network, which affects the quality both of the new services and of the services already offered by the network. In this paper, a generic methodology for modeling the signaling load and the signaling network performance resulting from the various communication services is extended to include certain implementation-dependent particularities. The models are obtained by considering the protocol functions of Signaling System No. 7 as specified by the CCITT, as well as the information flows through these functions. With this approach, virtual processor models are derived which can be mapped onto particular implementations, allowing the analysis of signaling networks in a multivendor environment. Using these principles, a signaling network planning tool concept has been developed which provides the distinct loading of hardware and software signaling network resources, and on which hierarchical performance analysis and planning procedures are based. This supports the planning of signaling networks according to given service, load, and grade-of-service figures. A simple case study illustrates the application of the tool concept to a network supporting Freephone, Credit Card, and ISDN voice services.
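
    A first-cut sketch of how per-service message streams load a signaling processor: aggregate the streams, weight the service time by traffic share, and apply single-server queueing results. This is a generic M/M/1-style approximation under assumed traffic figures, not the paper's virtual processor models:

```python
def processor_delay(arrival_rates, service_times):
    """Utilisation and mean sojourn time of a signaling processor fed by
    several Poisson message streams, treated as a single M/M/1-like server
    with a traffic-weighted mean service time (a common first-cut model)."""
    lam = sum(arrival_rates)
    s = sum(a * t for a, t in zip(arrival_rates, service_times)) / lam
    rho = lam * s
    if rho >= 1:
        raise ValueError("unstable: utilisation >= 1")
    return rho, s / (1 - rho)

# Assumed streams, e.g. ISDN voice, Freephone, Credit Card: (msgs/s, s/msg)
rho, w = processor_delay([100.0, 20.0, 10.0], [0.002, 0.004, 0.004])
print(f"utilisation = {rho:.2f}, mean delay = {w * 1000:.2f} ms")
```

Mapping such per-service loads onto distinct hardware and software resources is the step the planning tool concept adds on top of this kind of baseline calculation.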

  1. Thermal performance analysis of vacuum variable-temperature blackbody system

    NASA Astrophysics Data System (ADS)

    Lee, Sang-Yong; Kim, Geon-Hee; Lee, Young-Shin; Kim, Ghiseok

    2014-05-01

    In this paper, the design and structure of a vacuum variable-temperature blackbody system are described, and a steady-state thermal analysis of a 3-D blackbody model is presented. In addition, the thermal performance of the blackbody was evaluated using an infrared camera system. The blackbody system was constructed to operate under vacuum conditions (2.67 × 10⁻² Pa) to reduce its temperature uncertainty, which can be caused by vapor condensation at low temperatures, usually below 273.15 K. A heat sink and a heat shield including a cold shield were embedded around the radiator to maintain the heat balance of the blackbody. A simplified 3-D model of the blackbody including the radiator, heat sink, heat shield, cold shield, and heat source was thermophysically evaluated by performing finite element analysis using the extended Stefan-Boltzmann law, and the infrared radiating performance of the developed system was analyzed using an infrared camera system. On the basis of the measurement and simulation results, we expect that the suggested blackbody system can serve as a highly stable reference source for the calibration and measurement of infrared optical systems within its operational temperature range.

  2. Experimental and Numerical analysis of Metallic Bellow for Acoustic Performance

    NASA Astrophysics Data System (ADS)

    Panchwadkar, Amit A.; Awasare, Pradeep J., Dr.; Ingle, Ravidra B., Dr.

    2017-08-01

    Industrial machinery environments have high overall noise levels that interrupt communication between workers; this miscommunication and the associated health hazard motivate noise attenuation. Modifying the machine setup may affect its performance, so attenuation along the transmission path using the Helmholtz resonator principle is a better option. A resonator's design variables determine its resonant frequency, which helps confirm the target frequency range. This paper deals with a metallic bellow that behaves like an inertial mass under an incident sound wave; the sound wave energy is affected by the hard boundary conditions of the resonator and the bellow. The metallic bellow is used in combination with a resonator to determine the transmission loss (TL). Microphone measurements with an FFT analyzer give the frequency range for numerical analysis. Numerical analysis of the bellow and resonator is carried out to summarize the acoustic behavior of the bellow, and the bellow can be analyzed numerically to check noise attenuation for a centrifugal blower. An impedance tube measurement technique is used to validate the numerical results for the assembly. Dimensional and shape modifications can then be made to achieve the desired acoustic performance of the bellow.

  3. Results of a 24-inch Hybrid Motor Performance Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Sims, Joseph D.; Coleman, Hugh W.

    1998-01-01

    The subscale (11- and 24-inch) hybrid motors at the Marshall Space Flight Center (MSFC) have been used as versatile and cost-effective testbeds for developing new technology. Comparisons between motor configurations, ignition systems, feed systems, fuel formulations, and nozzle materials have been carried out without detailed consideration as to how "good" the motor performance data were. For the 250,000-lbf-thrust motor developed by the Hybrid Propulsion Demonstration Program consortium, this shortcoming is particularly risky because motor performance will likely be used as part of a set of downselect criteria to choose between competing ignition and feed systems under development. This analysis directly addresses that shortcoming by applying uncertainty analysis techniques to the experimental determination of the characteristic velocity, theoretical characteristic velocity, and characteristic velocity efficiency for a 24-inch motor firing. With the adoption of fuel-lined headends, flow restriction, and aft mixing chambers, state-of-the-art 24-inch hybrid motors have become very efficient. However, impossibly high combustion efficiencies (some computed as high as 108%) have been measured in some tests with 11-inch motors. This analysis has given new insight into explaining how these efficiencies were measured to be so high, and into which experimental measurements contribute the most to the overall uncertainty.
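
    The uncertainty-analysis step described above rests on the standard data reduction for characteristic velocity, c* = Pc·At/ṁ, whose relative uncertainty combines by root-sum-square for a pure product/quotient. A minimal sketch, with illustrative values rather than MSFC test data:

```python
import math

# Sketch of uncertainty propagation for characteristic velocity,
# c* = Pc * At / mdot. All numeric values are hypothetical.

def cstar(Pc: float, At: float, mdot: float) -> float:
    """Characteristic velocity from chamber pressure, throat area, mass flow."""
    return Pc * At / mdot

def rel_uncertainty_cstar(u_Pc: float, u_At: float, u_mdot: float) -> float:
    """Root-sum-square of the relative uncertainties of Pc, At, and mdot."""
    return math.sqrt(u_Pc**2 + u_At**2 + u_mdot**2)

Pc, At, mdot = 3.5e6, 0.05, 110.0      # Pa, m^2, kg/s (hypothetical)
c_star = cstar(Pc, At, mdot)
u_rel = rel_uncertainty_cstar(0.01, 0.005, 0.02)   # 1%, 0.5%, 2% (assumed)
print(f"c* = {c_star:.0f} m/s, relative uncertainty = {100 * u_rel:.1f} %")
```

    The same root-sum-square form applies to the c* efficiency (measured c* over theoretical c*), which is why a small bias in the mass-flow measurement can plausibly push computed efficiencies above 100%.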

  4. Active charge/passive discharge solar heating systems: Thermal analysis and performance comparisons

    NASA Astrophysics Data System (ADS)

    Swisher, J.

    1981-06-01

    This type of system combines liquid-cooled solar collector panels with a massive integral storage component that passively heats the building interior by radiation and free convection. The TRNSYS simulation program is used to evaluate system performance and to provide input for the development of a simplified analysis method. This method, which provides monthly calculations of delivered solar energy, is based on Klein's Phi-bar procedure and data from hourly TRNSYS simulations. The method can be applied to systems using a floor slab, a structural wall, or a water tank as the storage component. Important design parameters include collector area and orientation, building heat loss, collector and heat exchanger efficiencies, storage capacity, and storage to room coupling. Performance simulation results are used for comparisons with active and passive solar designs.

  5. Effects of specified performance criterion and performance feedback on staff behavior: a component analysis.

    PubMed

    Hardesty, Samantha L; Hagopian, Louis P; McIvor, Melissa M; Wagner, Leaora L; Sigurdsson, Sigurdur O; Bowman, Lynn G

    2014-09-01

    The present study isolated the effects of frequently used staff training intervention components to increase communication between direct care staff and clinicians working on an inpatient behavioral unit. Written "protocol review" quizzes developed by clinicians were designed to assess knowledge about a patient's behavioral protocols. Direct care staff completed these at the beginning of each day and evening shift. Clinicians were required to score and discuss these protocol reviews with direct care staff for at least 75% of shifts over a 2-week period. During baseline, only 21% of clinicians met this requirement. Completing and scoring of protocol reviews did not improve following additional in-service training (M = 15%) or following an intervention aimed at decreasing response effort combined with prompting (M = 28%). After implementing an intervention involving specified performance criterion and performance feedback, 86% of clinicians reached the established goal. Results of a component analysis suggested that the presentation of both the specified performance criterion and supporting contingencies was necessary to maintain acceptable levels of performance. © The Author(s) 2014.

  6. An analysis of calendar performance in two autistic calendar savants

    PubMed Central

    Kennedy, Daniel P.; Squire, Larry R.

    2007-01-01

    We acquired large data sets of calendar performance from two autistic calendar savants, DG and RN. An analysis of their errors and reaction times revealed that (1) both individuals had knowledge of calendar information from a limited range of years; (2) there was no evidence for the use of memorized anchor dates that could, by virtue of counting away from the anchors, allow correct responses to questions about other dates; and (3) the two individuals differed in their calendar knowledge, as well as in their ability to perform secondary tasks in which calendar knowledge was assessed indirectly. In view of the fact that there are only 14 possible annual calendars, we suggest that both savants worked by memorizing these 14 possible calendar arrangements. PMID:17686947

  7. Performance analysis of a laser propelled interorbital transfer vehicle

    NASA Technical Reports Server (NTRS)

    Minovitch, M. A.

    1976-01-01

    The performance capabilities of a laser-propelled interorbital transfer vehicle receiving propulsive power from a single ground-based transmitter were investigated. The laser transmits propulsive energy to the vehicle during successive station fly-overs. By applying a series of these propulsive maneuvers, large payloads can be economically transferred between low Earth orbits and synchronous orbits. Operations involving the injection of large payloads onto escape trajectories are also studied. The duration of each successive engine burn must be carefully timed so that the vehicle reappears over the laser station to receive additional propulsive power within the shortest possible time. The analytical solution for determining these time intervals is presented, as is a solution to the problem of determining maximum injection payloads. A parametric computer analysis based on these optimization studies is presented. The results show that relatively low beam powers, on the order of 50 MW to 60 MW, produce significant performance capabilities.

  8. Computational analysis of Variable Thrust Engine (VTE) performance

    NASA Technical Reports Server (NTRS)

    Giridharan, M. G.; Krishnan, A.; Przekwas, A. J.

    1993-01-01

    The Variable Thrust Engine (VTE) of the Orbital Maneuvering Vehicle (OMV) uses a hypergolic propellant combination of Monomethyl Hydrazine (MMH) and Nitrogen Tetroxide (NTO) as fuel and oxidizer, respectively. The performance of the VTE depends on a number of complex interacting phenomena such as atomization, spray dynamics, vaporization, turbulent mixing, convective/radiative heat transfer, and hypergolic combustion. This study involved the development of a comprehensive numerical methodology to facilitate detailed analysis of the VTE. An existing Computational Fluid Dynamics (CFD) code was extensively modified to include the following models: a two-liquid, two-phase Eulerian-Lagrangian spray model; a chemical equilibrium model; and a discrete ordinate radiation heat transfer model. The modified code was used to conduct a series of simulations to assess the effects of various physical phenomena and boundary conditions on the VTE performance. The details of the models and the results of the simulations are presented.

  9. Performance Evaluation and Analysis for Gravity Matching Aided Navigation

    PubMed Central

    Wu, Lin; Wang, Hubiao; Chai, Hua; Zhang, Lu; Hsu, Houtse; Wang, Yong

    2017-01-01

    Simulation tests were carried out in this paper to evaluate the performance of gravity matching aided navigation (GMAN). The study focused on four essential factors to quantitatively evaluate performance: gravity database (DB) resolution, fitting degree of the gravity measurements, number of samples in matching, and gravity changes in the matching area. A marine gravity anomaly DB derived from satellite altimetry was employed, and actual dynamic gravimetry accuracy and operating conditions were referenced to design the simulation parameters. The results verified that improving the DB resolution, gravimetry accuracy, number of measurement samples, or gravity changes in the matching area generally led to higher positioning accuracy, although their effects differed and were interrelated. Moreover, three typical positioning accuracy targets for GMAN were proposed, and the conditions needed to achieve them were derived from an analysis of several different system requirements. Finally, various approaches were provided to improve the positioning accuracy of GMAN. PMID:28379178

  10. Using SWE Standards for Ubiquitous Environmental Sensing: A Performance Analysis

    PubMed Central

    Tamayo, Alain; Granell, Carlos; Huerta, Joaquín

    2012-01-01

    Although smartphone applications represent the most typical data consumer tool from the citizen perspective in environmental applications, they can also be used for in-situ data collection and production in varied scenarios, such as geological sciences and biodiversity. The use of standard protocols, such as SWE, to exchange information between smartphones and sensor infrastructures brings benefits such as interoperability and scalability, but their reliance on XML is a potential problem when large volumes of data are transferred, due to limited bandwidth and processing capabilities on mobile phones. In this article we present a performance analysis about the use of SWE standards in smartphone applications to consume and produce environmental sensor data, analysing to what extent the performance problems related to XML can be alleviated by using alternative uncompressed and compressed formats.

  11. Computational analysis of Variable Thrust Engine (VTE) performance

    NASA Astrophysics Data System (ADS)

    Giridharan, M. G.; Krishnan, A.; Przekwas, A. J.

    1993-02-01

    The Variable Thrust Engine (VTE) of the Orbital Maneuvering Vehicle (OMV) uses a hypergolic propellant combination of Monomethyl Hydrazine (MMH) and Nitrogen Tetroxide (NTO) as fuel and oxidizer, respectively. The performance of the VTE depends on a number of complex interacting phenomena such as atomization, spray dynamics, vaporization, turbulent mixing, convective/radiative heat transfer, and hypergolic combustion. This study involved the development of a comprehensive numerical methodology to facilitate detailed analysis of the VTE. An existing Computational Fluid Dynamics (CFD) code was extensively modified to include the following models: a two-liquid, two-phase Eulerian-Lagrangian spray model; a chemical equilibrium model; and a discrete ordinate radiation heat transfer model. The modified code was used to conduct a series of simulations to assess the effects of various physical phenomena and boundary conditions on the VTE performance. The details of the models and the results of the simulations are presented.

  12. Performance analysis of multiple PRF technique for ambiguity resolution

    NASA Technical Reports Server (NTRS)

    Chang, C. Y.; Curlander, J. C.

    1992-01-01

    For short wavelength spaceborne synthetic aperture radar (SAR), ambiguity in Doppler centroid estimation occurs when the azimuth squint angle uncertainty is larger than the azimuth antenna beamwidth. Multiple pulse recurrence frequency (PRF) hopping is a technique developed to resolve the ambiguity by operating the radar in different PRF's in the pre-imaging sequence. Performance analysis results of the multiple PRF technique are presented, given the constraints of the attitude bound, the drift rate uncertainty, and the arbitrary numerical values of PRF's. The algorithm performance is derived in terms of the probability of correct ambiguity resolution. Examples, using the Shuttle Imaging Radar-C (SIR-C) and X-SAR parameters, demonstrate that the probability of correct ambiguity resolution obtained by the multiple PRF technique is greater than 95 percent and 80 percent for the SIR-C and X-SAR applications, respectively. The success rate is significantly higher than that achieved by the range cross correlation technique.

  13. Fluid and thermal performance analysis of PMSM used for driving

    NASA Astrophysics Data System (ADS)

    Ding, Shuye; Cui, Guanghui; Li, Zhongyu; Guan, Tianyu

    2016-03-01

    The permanent magnet synchronous motor (PMSM) is widely used in ships under frequency conversion control. The fluid flow performance and temperature distribution of the PMSM are difficult to clarify because of its complex structure and variable-frequency control condition. Therefore, in order to investigate the fluid and thermal characteristics of the PMSM, a 50 kW PMSM was taken as an example in this study, and a 3-D coupled fluid-thermal analysis model was established. The fluid and temperature fields were calculated using the finite volume method. The cooling medium's properties, such as velocity, streamlines, and temperature, were then analyzed. The correctness of the proposed model and the rationality of the solution method were verified by a temperature test of the PMSM. This study reveals the effects of the cooling medium's changing rheology on its performance and on the working temperature of the PMSM, which could be helpful for designing the PMSM.

  14. Performance analysis of charge plasma based dual electrode tunnel FET

    NASA Astrophysics Data System (ADS)

    Anand, Sunny; Intekhab Amin, S.; Sarin, R. K.

    2016-05-01

    This paper proposes a charge plasma based dual electrode doping-less tunnel FET (DEDLTFET). The paper compares the device performance of the conventional doping-less TFET (DLTFET) and the doped TFET (DGTFET). The DEDLTFET gives superior results, with a high ON-state current (ION ∼ 0.56 mA/μm), an ION/IOFF ratio of ∼9.12 × 10¹³, and an average subthreshold swing (AV-SS ∼ 48 mV/dec). The effects of varying different device parameters, such as channel length, gate oxide material, gate oxide thickness, silicon thickness, gate work function, and temperature, were studied and compared with the DLTFET and DGTFET. Through this extensive analysis it is found that the DEDLTFET performs better than the other two devices, which indicates an excellent future in low-power applications.

  15. Performance analysis and optimization of eccentric annular disk fins

    SciTech Connect

    Kundu, B.; Das, P.K.

    1999-02-01

    In the first part of the paper, a semi-analytical method has been described for solving the two-dimensional heat conduction equation in an eccentric annular disk fin circumscribing a circular tube, subjected to convective cooling. Analysis has been done considering both convective and insulated conditions at the fin tip. The effects of surface and tip heat transfer coefficients and eccentricity on the performance of the fin have been studied. Comparative studies have also been made between the performance of concentric and eccentric fins with same radius ratio. Next, the optimum dimensions for eccentric annular fins have been determined using Lagrange multiplier technique. In the scheme, either the fin volume or the heat transfer duty can be taken as the constraint. Finally, it has been shown that when space restriction is imposed on one side of the tube, eccentric annular fins can be designed to have lesser volumes compared to concentric annular fins above a certain heat transfer duty.

  16. An analysis of calendar performance in two autistic calendar savants.

    PubMed

    Kennedy, Daniel P; Squire, Larry R

    2007-08-01

    We acquired large data sets of calendar performance from two autistic calendar savants, DG and RN. An analysis of their errors and reaction times revealed that (1) both individuals had knowledge of calendar information from a limited range of years; (2) there was no evidence for the use of memorized anchor dates that could, by virtue of counting away from the anchors, allow correct responses to questions about other dates; and (3) the two individuals differed in their calendar knowledge, as well as in their ability to perform secondary tasks in which calendar knowledge was assessed indirectly. In view of the fact that there are only 14 possible annual calendars, we suggest that both savants worked by memorizing these 14 possible calendar arrangements.

  17. Commissioning and Performance Analysis of WhisperGen Stirling Engine

    NASA Astrophysics Data System (ADS)

    Pradip, Prashant Kaliram

    Stirling engine based cogeneration systems have the potential to reduce energy consumption and greenhouse gas emissions, owing to their high cogeneration efficiency and the emission control afforded by steady external combustion. To date, most studies of this unit have focused on performance based on both experimentation and computer models, and experimental data are lacking for diversified operating ranges. This thesis starts with the commissioning of a WhisperGen Stirling engine, with components and instrumentation to evaluate the power and thermal performance of the system. Next, a parametric study of the primary engine variables, including air, diesel, and coolant flow rates and temperatures, was carried out to further understand their effect on engine power and efficiency. This trend was then validated with the thermodynamic model developed for the energy analysis of a Stirling cycle. Finally, the energy balance of the Stirling engine was compared with and without heat recovery from the engine block and the combustion chamber exhaust.

  18. Performance analysis of multiple PRF technique for ambiguity resolution

    NASA Technical Reports Server (NTRS)

    Chang, C. Y.; Curlander, J. C.

    1992-01-01

    For short wavelength spaceborne synthetic aperture radar (SAR), ambiguity in Doppler centroid estimation occurs when the azimuth squint angle uncertainty is larger than the azimuth antenna beamwidth. Multiple pulse recurrence frequency (PRF) hopping is a technique developed to resolve the ambiguity by operating the radar in different PRF's in the pre-imaging sequence. Performance analysis results of the multiple PRF technique are presented, given the constraints of the attitude bound, the drift rate uncertainty, and the arbitrary numerical values of PRF's. The algorithm performance is derived in terms of the probability of correct ambiguity resolution. Examples, using the Shuttle Imaging Radar-C (SIR-C) and X-SAR parameters, demonstrate that the probability of correct ambiguity resolution obtained by the multiple PRF technique is greater than 95 percent and 80 percent for the SIR-C and X-SAR applications, respectively. The success rate is significantly higher than that achieved by the range cross correlation technique.

  19. Performance Model and Sensitivity Analysis for a Solar Thermoelectric Generator

    NASA Astrophysics Data System (ADS)

    Rehman, Naveed Ur; Siddiqui, Mubashir Ali

    2017-03-01

    In this paper, a regression model for evaluating the performance of solar concentrated thermoelectric generators (SCTEGs) is established and the significance of contributing parameters is discussed in detail. The model is based on several natural, design and operational parameters of the system, including the thermoelectric generator (TEG) module and its intrinsic material properties, the connected electrical load, concentrator attributes, heat transfer coefficients, solar flux, and ambient temperature. The model is developed by fitting a response curve, using the least-squares method, to the results. The sample points for the model were obtained by simulating a thermodynamic model, also developed in this paper, over a range of values of input variables. These samples were generated employing the Latin hypercube sampling (LHS) technique using a realistic distribution of parameters. The coefficient of determination was found to be 99.2%. The proposed model is validated by comparing the predicted results with those in the published literature. In addition, based on the elasticity for parameters in the model, sensitivity analysis was performed and the effects of parameters on the performance of SCTEGs are discussed in detail. This research will contribute to the design and performance evaluation of any SCTEG system for a variety of applications.

  20. Performance Model and Sensitivity Analysis for a Solar Thermoelectric Generator

    NASA Astrophysics Data System (ADS)

    Rehman, Naveed Ur; Siddiqui, Mubashir Ali

    2017-01-01

    In this paper, a regression model for evaluating the performance of solar concentrated thermoelectric generators (SCTEGs) is established and the significance of contributing parameters is discussed in detail. The model is based on several natural, design and operational parameters of the system, including the thermoelectric generator (TEG) module and its intrinsic material properties, the connected electrical load, concentrator attributes, heat transfer coefficients, solar flux, and ambient temperature. The model is developed by fitting a response curve, using the least-squares method, to the results. The sample points for the model were obtained by simulating a thermodynamic model, also developed in this paper, over a range of values of input variables. These samples were generated employing the Latin hypercube sampling (LHS) technique using a realistic distribution of parameters. The coefficient of determination was found to be 99.2%. The proposed model is validated by comparing the predicted results with those in the published literature. In addition, based on the elasticity for parameters in the model, sensitivity analysis was performed and the effects of parameters on the performance of SCTEGs are discussed in detail. This research will contribute to the design and performance evaluation of any SCTEG system for a variety of applications.
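
    The Latin hypercube sampling step described above can be sketched with SciPy's quasi-Monte Carlo module. The parameter names and ranges below are illustrative assumptions, not the paper's actual input distributions:

```python
from scipy.stats import qmc

# Sketch of Latin hypercube sampling of model inputs, as used to generate
# sample points for fitting a regression model. Ranges are hypothetical.

sampler = qmc.LatinHypercube(d=3, seed=0)
unit = sampler.random(n=100)                 # stratified samples in [0, 1)^3

# Scale to physical ranges: solar flux (W/m^2), ambient temp (K), load (ohm)
l_bounds = [600.0, 280.0, 0.5]
u_bounds = [1000.0, 320.0, 5.0]
samples = qmc.scale(unit, l_bounds, u_bounds)

print(samples.shape)                         # one row per simulation run
```

    Each row would then be fed to the thermodynamic model, and the resulting input/output pairs fitted by least squares to obtain the response surface.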

  1. A Divergence Statistics Extension to VTK for Performance Analysis

    SciTech Connect

    Pebay, Philippe Pierre; Bennett, Janine Camille

    2015-02-01

    This report follows the series of previous documents [PT08, BPRT09b, PT09, BPT09, PT10, PB13], in which we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order, and auto-correlative statistics engines that we developed within the Visualization Tool Kit (VTK) as a scalable, parallel, and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.
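
    As a concrete instance of a divergence statistic, the Kullback-Leibler divergence between an observed histogram and a theoretical reference can be computed in a few lines. This is a hedged illustration of the concept, not the VTK engine's implementation (which is in C++ and may use a different divergence):

```python
import numpy as np

# Kullback-Leibler divergence KL(p || q) between an observed empirical
# distribution p and a theoretical "ideal" distribution q, in nats.

def kl_divergence(p, q):
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p, q = p / p.sum(), q / q.sum()      # normalize counts to probabilities
    mask = p > 0                         # 0 * log(0/q) == 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

observed = [18, 22, 31, 29]              # hypothetical empirical histogram
ideal = [25, 25, 25, 25]                 # theoretical uniform reference

print(kl_divergence(observed, ideal))    # 0.0 only if distributions match
```

    Like a distance, the result is zero when the observed distribution matches the ideal one and grows with the discrepancy, though KL divergence is not symmetric in its arguments.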

  2. Space mission scenario development and performance analysis tool

    NASA Technical Reports Server (NTRS)

    Kordon, Mark; Baker, John; Gilbert, John; Hanks, David

    2004-01-01

    This paper discusses a new and innovative approach for a rapid spacecraft multi-disciplinary performance analysis using a tool called the Mission Scenario Development Workbench (MSDW). To meet the needs of new classes of space missions, analysis tools with proven models were developed and integrated into a framework to enable rapid trades and analyses between spacecraft designs and operational scenarios during the formulation phase of a mission. Generally speaking, spacecraft resources are highly constrained on deep space missions and this approach makes it possible to maximize the use of existing resources to attain the best possible science return. This approach also has the potential benefit of reducing the risk of costly design changes made later in the design cycle necessary to meet the mission requirements by understanding system design sensitivities early and adding appropriate margins. This paper will describe the approach used by the Mars Science Laboratory Project to accomplish this result.

  3. Space mission scenario development and performance analysis tool

    NASA Technical Reports Server (NTRS)

    Kordon, Mark; Baker, John; Gilbert, John; Hanks, David

    2004-01-01

    This paper discusses a new and innovative approach for a rapid spacecraft multi-disciplinary performance analysis using a tool called the Mission Scenario Development Workbench (MSDW). To meet the needs of new classes of space missions, analysis tools with proven models were developed and integrated into a framework to enable rapid trades and analyses between spacecraft designs and operational scenarios during the formulation phase of a mission. Generally speaking, spacecraft resources are highly constrained on deep space missions and this approach makes it possible to maximize the use of existing resources to attain the best possible science return. This approach also has the potential benefit of reducing the risk of costly design changes made later in the design cycle necessary to meet the mission requirements by understanding system design sensitivities early and adding appropriate margins. This paper will describe the approach used by the Mars Science Laboratory Project to accomplish this result.

  4. Removing Grit During Wastewater Treatment: CFD Analysis of HDVS Performance.

    PubMed

    Meroney, Robert N; Sheker, Robert E

    2016-05-01

    Computational Fluid Dynamics (CFD) was used to simulate the grit and sand separation effectiveness of a typical hydrodynamic vortex separator (HDVS) system. The analysis examined the influences on the separator efficiency of: flow rate, fluid viscosities, total suspended solids (TSS), and particle size and distribution. It was found that separator efficiency for a wide range of these independent variables could be consolidated into a few curves based on the particle fall velocity to separator inflow velocity ratio, Ws/Vin. Based on CFD analysis it was also determined that systems of different sizes with length scale ratios ranging from 1 to 10 performed similarly when Ws/Vin and TSS were held constant. The CFD results have also been compared to a limited range of experimental data.
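
    The consolidating parameter Ws/Vin described above can be sketched by estimating the particle fall velocity from Stokes' law (valid for small particle Reynolds numbers) and dividing by the inflow velocity. The particle and flow values below are illustrative, not the paper's test conditions:

```python
# Sketch of the fall-velocity-to-inflow-velocity ratio Ws/Vin used to
# consolidate separator efficiency curves. Inputs are hypothetical.

G = 9.81  # gravitational acceleration, m/s^2

def stokes_fall_velocity(d: float, rho_p: float, rho_f: float, mu: float) -> float:
    """Terminal settling velocity of a small sphere in the Stokes regime."""
    return G * d**2 * (rho_p - rho_f) / (18.0 * mu)

d = 200e-6       # grit particle diameter, m
rho_p = 2650.0   # sand density, kg/m^3
rho_f = 998.0    # water density, kg/m^3
mu = 1.0e-3      # water dynamic viscosity, Pa*s

ws = stokes_fall_velocity(d, rho_p, rho_f, mu)
vin = 0.6        # separator inflow velocity, m/s (hypothetical)
print(f"Ws = {ws:.4f} m/s, Ws/Vin = {ws / vin:.4f}")
```

    Holding Ws/Vin (and TSS) constant is what lets geometrically similar separators of different sizes collapse onto the same efficiency curve.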

  5. Analysis and performance of a parallel axis flatness measuring instrument

    SciTech Connect

    Marsh, Eric; Schalcosky, David; Couey, Jeremiah; Vallance, Ryan

    2006-02-15

    This article describes the design, analysis, and performance of a flatness inspection instrument to measure workpieces with up to 1 mm departure from flatness. The instrument uses two air bearing spindles arranged with parallel axes to simultaneously rotate a workpiece and slowly pass a capacitance probe over the spinning surface. Capacitance probes offer user-selectable sensitivity to provide multiple combinations of measurement range and resolution. In tests with a high sensitivity probe, the instrument demonstrated measurement repeatability of 25 nm on a null-set 75 mm workpiece. This article presents a complete homogeneous transformation matrix analysis of the propagation of errors into the measurement as well as sample measurements on diamond turned workpieces.

  6. Transient analysis techniques in performing impact and crash dynamic studies

    NASA Technical Reports Server (NTRS)

    Pifko, A. B.; Winter, R.

    1989-01-01

    Because of the emphasis being placed on crashworthiness as a design requirement, increasing demands are being made by various organizations to analyze a wide range of complex structures that must perform safely when subjected to severe impact loads, such as those generated in a crash event. The ultimate goal of crashworthiness design and analysis is to produce vehicles with the ability to reduce the dynamic forces experienced by the occupants to specified levels, while maintaining a survivable envelope around them during a specified crash event. DYCAST is a nonlinear structural dynamic finite element computer code that grew out of PLANS, a system of finite element programs for static nonlinear structural analysis. The essential features of DYCAST are outlined.

  7. Cost-Performance Analysis of Perovskite Solar Modules.

    PubMed

    Cai, Molang; Wu, Yongzhen; Chen, Han; Yang, Xudong; Qiang, Yinghuai; Han, Liyuan

    2017-01-01

    Perovskite solar cells (PSCs) are promising candidates for the next generation of solar cells because they are easy to fabricate and have high power conversion efficiencies. However, there has been no detailed analysis of the cost of PSC modules. We selected two representative examples of PSCs and performed a cost analysis of their production: one was a moderate-efficiency module produced from cheap materials, and the other was a high-efficiency module produced from expensive materials. The costs of both modules were found to be lower than those of other photovoltaic technologies. We used the calculated module costs to estimate the levelized cost of electricity (LCOE) of PSCs. The LCOE was calculated to be 3.5-4.9 US cents/kWh with an efficiency and lifetime of greater than 12% and 15 years, respectively, below the cost of traditional energy sources.

  8. Cost‐Performance Analysis of Perovskite Solar Modules

    PubMed Central

    Cai, Molang; Wu, Yongzhen; Chen, Han; Yang, Xudong; Qiang, Yinghuai

    2016-01-01

    Perovskite solar cells (PSCs) are promising candidates for the next generation of solar cells because they are easy to fabricate and have high power conversion efficiencies. However, there has been no detailed analysis of the cost of PSC modules. We selected two representative examples of PSCs and performed a cost analysis of their production: one was a moderate‐efficiency module produced from cheap materials, and the other was a high‐efficiency module produced from expensive materials. The costs of both modules were found to be lower than those of other photovoltaic technologies. We used the calculated module costs to estimate the levelized cost of electricity (LCOE) of PSCs. The LCOE was calculated to be 3.5–4.9 US cents/kWh with an efficiency and lifetime of greater than 12% and 15 years, respectively, below the cost of traditional energy sources. PMID:28105403

  9. Analysis of Random Segment Errors on Coronagraph Performance

    NASA Technical Reports Server (NTRS)

    Stahl, Mark T.; Stahl, H. Philip; Shaklan, Stuart B.; N'Diaye, Mamadou

    2016-01-01

    At the 2015 SPIE O&P conference we presented "Preliminary Analysis of Random Segment Errors on Coronagraph Performance". Key findings: contrast leakage for a 4th-order Sinc2(X) coronagraph is 10X more sensitive to random segment piston than to random tip/tilt; apertures with fewer segments (i.e., 1 ring) or very many segments (> 16 rings) have less contrast leakage as a function of piston or tip/tilt than apertures with 2 to 4 rings of segments. Revised finding: piston is only 2.5X more sensitive than tip/tilt.

  10. Statistical Performance Analysis of Data-Driven Neural Models.

    PubMed

    Freestone, Dean R; Layton, Kelvin J; Kuhlmann, Levin; Cook, Mark J

    2017-02-01

    Data-driven model-based analysis of electrophysiological data is an emerging technique for understanding the mechanisms of seizures. Model-based analysis enables tracking of hidden brain states that are represented by the dynamics of neural mass models. Neural mass models describe the mean firing rates and mean membrane potentials of populations of neurons. Various neural mass models exist with different levels of complexity and realism. An ideal data-driven model-based analysis framework will incorporate the most realistic model possible, enabling accurate imaging of the physiological variables. However, models must be sufficiently parsimonious to enable tracking of important variables using data. This paper provides a tool to inform the realism versus parsimony trade-off: the Bayesian Cramer-Rao (lower) Bound (BCRB). We demonstrate how the BCRB can be used to assess the feasibility of using various popular neural mass models to track epilepsy-related dynamics via stochastic filtering methods. A series of simulations show how optimal state estimates relate to measurement noise, model error and initial state uncertainty. We also demonstrate that state estimation accuracy will vary between seizure-like and normal rhythms. The performance of the extended Kalman filter (EKF) is assessed against the BCRB. This work lays a foundation for assessing feasibility of model-based analysis. We discuss how the framework can be used to design experiments to better understand epilepsy.
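
    As a minimal sanity-check sketch of the BCRB, consider a scalar linear-Gaussian state-space model (an assumption for illustration; the paper's neural mass models are nonlinear). Here the recursive Bayesian information follows the standard Tichavsky-style recursion, and the Kalman filter attains the bound exactly.

```python
# Scalar linear-Gaussian toy model:
#   x_{k+1} = a x_k + w_k,  w ~ N(0, q)
#   y_k     = h x_k + v_k,  v ~ N(0, r)
a, h, q, r = 0.95, 1.0, 0.1, 0.5
p0 = 1.0                       # prior variance on x_0

# Recursive Bayesian information (scalar form):
#   J_{k+1} = 1 / (q + a^2 / J_k) + h^2 / r
J = 1.0 / p0
for _ in range(50):
    J = 1.0 / (q + a * a / J) + h * h / r
bcrb = 1.0 / J                 # Bayesian Cramer-Rao variance bound

# Kalman filter posterior variance for the same model:
P = p0
for _ in range(50):
    Pp = a * a * P + q                      # predict
    K = Pp * h / (h * h * Pp + r)           # Kalman gain
    P = (1 - K * h) * Pp                    # update
print(bcrb, P)   # in the linear-Gaussian case the KF attains the BCRB
```

    For nonlinear neural mass models the bound and the EKF variance separate, which is exactly the gap the paper uses to judge tracking feasibility.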

  11. Aerocapture Performance Analysis of A Venus Exploration Mission

    NASA Technical Reports Server (NTRS)

    Starr, Brett R.; Westhelle, Carlos H.

    2005-01-01

    A performance analysis of a Discovery Class Venus Exploration Mission in which aerocapture is used to capture a spacecraft into a 300 km polar orbit for a two-year science mission has been conducted to quantify its performance. A preliminary performance assessment determined that a high-heritage 70° sphere-cone rigid aeroshell with a 0.25 lift-to-drag ratio has adequate control authority to provide an entry flight path angle corridor large enough for the mission's aerocapture maneuver. A 114 kilograms per square meter ballistic coefficient reference vehicle was developed from the science requirements and the preliminary assessment's heating indicators and deceleration loads. Performance analyses were conducted for the reference vehicle and for sensitivity studies on vehicle ballistic coefficient and maximum bank rate. The performance analyses used a high-fidelity flight simulation within a Monte Carlo executive to define the aerocapture heating environment and deceleration loads and to determine mission success statistics. The simulation utilized the Program to Optimize Simulated Trajectories (POST), which was modified to include Venus-specific atmospheric and planet models, aerodynamic characteristics, and interplanetary trajectory models. In addition to Venus-specific models, an autonomous guidance system, HYPAS, and a pseudo flight controller were incorporated in the simulation. The Monte Carlo analyses incorporated a reference set of approach trajectory delivery errors, aerodynamic uncertainties, and atmospheric density variations. The reference performance analysis determined the reference vehicle achieves 100% successful capture and has a 99.87% probability of attaining the science orbit with a 90 meters per second delta-V budget for post-aerocapture orbital adjustments. A ballistic coefficient trade study conducted with reference uncertainties determined that the 0.25 L/D vehicle can achieve 100% successful capture with a ballistic coefficient of 228 kilograms per square meter
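
    The Monte Carlo structure of such an analysis can be sketched as follows. The dispersion magnitudes and the linearized response model standing in for the POST trajectory simulation are entirely hypothetical; a real analysis flies each dispersed case through the full guided simulation.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 10_000

# Dispersions (illustrative values, not the study's reference set):
gamma_err = rng.normal(0.0, 0.15, N)      # entry flight path angle error, deg
rho_scale = rng.lognormal(0.0, 0.10, N)   # atmospheric density multiplier

# Stand-in response model: post-aerocapture apoapsis error grows with
# both dispersions (hypothetical sensitivities, km per unit dispersion).
apoapsis_err_km = (400.0 * gamma_err + 900.0 * np.log(rho_scale)
                   + rng.normal(0.0, 20.0, N))

# Delta-V to correct to the 300 km science orbit, ~linear in the error.
dv = 5.0 + 0.18 * np.abs(apoapsis_err_km)            # m/s

captured = np.abs(apoapsis_err_km) < 2000.0          # no skip-out / impact
science = captured & (dv < 90.0)                     # within 90 m/s budget
print(f"capture: {captured.mean():.2%}, science orbit: {science.mean():.2%}")
```

    Success statistics like the abstract's 99.87% figure come from exactly this kind of counting over the dispersed cases.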

  12. Voxel model in BNCT treatment planning: performance analysis and improvements.

    PubMed

    González, Sara J; Carando, Daniel G; Santa Cruz, Gustavo A; Zamenhof, Robert G

    2005-02-07

    In recent years, many efforts have been made to study the performance of treatment planning systems in deriving an accurate dosimetry of the complex radiation fields involved in boron neutron capture therapy (BNCT). The computational model of the patient's anatomy is one of the main factors involved in this subject. This work presents a detailed analysis of the performance of the 1 cm based voxel reconstruction approach. First, a new and improved material assignment algorithm implemented in NCTPlan treatment planning system for BNCT is described. Based on previous works, the performances of the 1 cm based voxel methods used in the MacNCTPlan and NCTPlan treatment planning systems are compared by standard simulation tests. In addition, the NCTPlan voxel model is benchmarked against in-phantom physical dosimetry of the RA-6 reactor of Argentina. This investigation shows the 1 cm resolution to be accurate enough for all reported tests, even in the extreme cases such as a parallelepiped phantom irradiated through one of its sharp edges. This accuracy can be degraded at very shallow depths in which, to improve the estimates, the anatomy images need to be positioned in a suitable way. Rules for this positioning are presented. The skin is considered one of the organs at risk in all BNCT treatments and, in the particular case of cutaneous melanoma of extremities, limits the delivered dose to the patient. Therefore, the performance of the voxel technique is deeply analysed in these shallow regions. A theoretical analysis is carried out to assess the distortion caused by homogenization and material percentage rounding processes. Then, a new strategy for the treatment of surface voxels is proposed and tested using two different irradiation problems. For a parallelepiped phantom perpendicularly irradiated with a 5 keV neutron source, the large thermal neutron fluence deviation present at shallow depths (from 54% at 0 mm depth to 5% at 4 mm depth) is reduced to 2% on average

  13. Performance and Degradation Analysis of Operating PV Systems

    NASA Astrophysics Data System (ADS)

    Da Silva Freire, Felipe

    Environmental concerns, together with decreasing technology costs, led the solar market to grow rapidly over the last decade. Photovoltaic (PV) systems are one of the solar energy alternatives, and silicon solar cells are currently the most widespread technology. PV modules are considered the most reliable component of a photovoltaic system. Their reliability and lifetime depend on the modules' energy conversion performance and degradation modes. Analysis of monitoring data gives insights into a PV system's performance over its service time, and comparison of these data with mathematical models provides a way to predict the performance of future and new PV installations. The goal of this study is to understand PV system performance and degradation over the system lifetime. A mathematical model was employed to predict the power output of a real, relatively new operating PV system with respect to the environmental parameters temperature, irradiance and cloud coverage. The model used is based on a one-diode ideality factor and takes into account the parasitic series resistance. The results have been compared with the actual PV output data collected for the year 2014 and show good correlation. As the model predicts the system power output assuming the system is in new condition, the deviation of the real data from the modeling results needs to be further investigated for systems in service for a longer time. For this purpose, the study presents a condensed review of various causes of degradation in silicon PV modules and techniques to observe and investigate these degradation mechanisms. Major effects on output performance exhibit an increase in the observed ideality factor n2 and recombination current J02, primarily caused by a decrease in minority carrier lifetime, shunts and an increase in series resistance. The study further investigates the governing degradation modes of a ten-year-old PV crystalline silicon module
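
    A one-diode model with series and shunt resistance, of the kind used here for power-output prediction, can be sketched as an implicit I-V relation solved by damped fixed-point iteration. The cell parameters below are illustrative, not fitted to the studied system.

```python
import numpy as np

def diode_current(v, il, i0, n, rs, rsh, t=298.15):
    """Cell current from the single-diode model with series and shunt
    resistance, solved by damped fixed-point iteration:
        I = IL - I0*(exp((V + I*Rs)/(n*Vt)) - 1) - (V + I*Rs)/Rsh
    All parameter values used below are illustrative assumptions."""
    k, qe = 1.380649e-23, 1.602176634e-19
    vt = k * t / qe                          # thermal voltage
    i = il                                   # start from the photocurrent
    for _ in range(200):
        f = il - i0 * np.expm1((v + i * rs) / (n * vt)) - (v + i * rs) / rsh
        i = 0.5 * i + 0.5 * f                # damping keeps iteration stable
    return i

# I-V sweep and maximum power point for one illustrative cell
v = np.linspace(0.0, 0.7, 200)
i = np.array([diode_current(vk, il=8.0, i0=1e-9, n=1.3, rs=0.01, rsh=100.0)
              for vk in v])
p = v * i
print(f"Pmax ~ {p.max():.2f} W at {v[p.argmax()]:.2f} V")
```

    Degradation shows up in such a model as drifting parameters: rising series resistance and recombination current pull the maximum power point down, which is what the abstract's n2/J02 observations describe.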

  14. Performance Analysis of the Least-Squares Estimator in Astrometry

    NASA Astrophysics Data System (ADS)

    Lobos, Rodrigo A.; Silva, Jorge F.; Mendez, Rene A.; Orchard, Marcos

    2015-11-01

    We characterize the performance of the widely-used least-squares estimator in astrometry in terms of a comparison with the Cramer-Rao lower variance bound. In this inference context the performance of the least-squares estimator does not offer a closed-form expression, but a new result is presented (Theorem 1) where both the bias and the mean-square-error of the least-squares estimator are bounded and approximated analytically, in the latter case in terms of a nominal value and an interval around it. From the predicted nominal value we analyze how efficient the least-squares estimator is in comparison with the minimum variance Cramer-Rao bound. Based on our results, we show that, for the high signal-to-noise ratio regime, the performance of the least-squares estimator is significantly poorer than the Cramer-Rao bound, and we characterize this gap analytically. On the positive side, we show that for the challenging low signal-to-noise regime (attributed to either a weak astronomical signal or a noise-dominated condition) the least-squares estimator is near optimal, as its performance asymptotically approaches the Cramer-Rao bound. However, we also demonstrate that, in general, there is no unbiased estimator for the astrometric position that can precisely reach the Cramer-Rao bound. We validate our theoretical analysis through simulated digital-detector observations under typical observing conditions. We show that the nominal value for the mean-square-error of the least-squares estimator (obtained from our theorem) can be used as a benchmark indicator of the expected statistical performance of the least-squares method under a wide range of conditions. Our results are valid for an idealized linear (one-dimensional) array detector where intra-pixel response changes are neglected, and where flat-fielding is achieved with very high accuracy.
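
    The comparison at the heart of the paper can be sketched numerically: simulate Poisson-noise observations of a Gaussian PSF on an idealized 1D array, estimate the position by (unweighted) least squares, and compare the empirical variance with the Cramer-Rao bound. Detector and source values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1D array detector, Gaussian PSF, Poisson photon noise.
pix = np.arange(-20.0, 21.0)              # pixel centers
sigma_psf, flux, bkg = 1.5, 5000.0, 10.0  # illustrative values
xc_true = 0.3

def model(xc):
    g = np.exp(-0.5 * ((pix - xc) / sigma_psf) ** 2)
    return flux * g / g.sum() + bkg       # expected counts per pixel

# Cramer-Rao bound on the position under Poisson statistics:
#   var(x) >= [ sum_i F_i'(x)^2 / F_i(x) ]^{-1}
eps = 1e-4
dF = (model(xc_true + eps) - model(xc_true - eps)) / (2 * eps)
crb = 1.0 / np.sum(dF ** 2 / model(xc_true))

# Empirical variance of the unweighted least-squares position estimate
grid = np.linspace(-1.0, 1.5, 501)
F = np.stack([model(x) for x in grid])
est = []
for _ in range(2000):
    counts = rng.poisson(model(xc_true))
    sse = ((counts - F) ** 2).sum(axis=1)  # LS objective on the grid
    est.append(grid[sse.argmin()])
var_ls = np.var(est)
print(f"CRB: {crb:.2e} px^2,  LS variance: {var_ls:.2e} px^2")
```

    At moderate flux the two are close; pushing the flux up in this sketch widens the gap, mirroring the paper's high-SNR result.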

  15. Performance analysis of jump-gliding locomotion for miniature robotics.

    PubMed

    Vidyasagar, A; Zufferey, Jean-Christophe; Floreano, Dario; Kovač, M

    2015-03-26

    Recent work suggests that jumping locomotion in combination with a gliding phase can be used as an effective mobility principle in robotics. Compared to pure jumping without a gliding phase, the potential benefits of hybrid jump-gliding locomotion include the ability to extend the distance travelled and reduce the potentially damaging impact forces upon landing. This publication evaluates the performance of jump-gliding locomotion and provides models for the analysis of the relevant dynamics of flight. It also defines a jump-gliding envelope that encompasses the range that can be achieved with jump-gliding robots and that can be used to evaluate the performance and improvement potential of jump-gliding robots. We present first a planar dynamic model and then a simplified closed form model, which allow for quantification of the distance travelled and the impact energy on landing. In order to validate the predictions of these models, we conducted experiments with a novel jump-gliding robot, named the 'EPFL jump-glider'. It has a mass of 16.5 g and is able to perform jumps from elevated positions, perform steered gliding flight, land safely and traverse on the ground by repetitive jumping. The experiments indicate that the developed jump-gliding model fits very well with the measured flight data from the EPFL jump-glider, confirming the benefits of jump-gliding locomotion to mobile robotics. The jump-glide envelope considerations indicate that the EPFL jump-glider, when traversing from a 2 m height, reaches 74.3% of optimal jump-gliding distance, whereas pure jumping without a gliding phase only reaches 33.4% of the optimal jump-gliding distance. Methods of further improving flight performance based on the models and inspiration from biological systems are presented, providing mechanical design pathways to future jump-gliding robot designs.
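
    A simplified closed-form flight model of this kind can be sketched as drag-free ballistics plus a constant glide-ratio descent from apogee. The launch values below are illustrative, not the EPFL jump-glider's measured parameters.

```python
import math

def ballistic_range(v0, angle_deg, h0, g=9.81):
    """Range of a pure (drag-free) jump launched from height h0."""
    a = math.radians(angle_deg)
    vx, vy = v0 * math.cos(a), v0 * math.sin(a)
    # time to reach the ground: h0 + vy*t - g*t^2/2 = 0
    t = (vy + math.sqrt(vy * vy + 2 * g * h0)) / g
    return vx * t

def jump_glide_range(v0, angle_deg, h0, glide_ratio, g=9.81):
    """Simplified closed form: ballistic ascent to apogee, then a
    steady glide at constant lift-to-drag ratio down to the ground."""
    a = math.radians(angle_deg)
    vx, vy = v0 * math.cos(a), v0 * math.sin(a)
    t_up = vy / g                            # time to apogee
    apogee = h0 + vy * vy / (2 * g)
    return vx * t_up + glide_ratio * apogee  # glide covers L/D x height

# Illustrative numbers: 6 m/s takeoff at 50 deg from a 2 m perch
v0, angle, h0 = 6.0, 50.0, 2.0
print(f"pure jump:  {ballistic_range(v0, angle, h0):.2f} m")
print(f"jump-glide: {jump_glide_range(v0, angle, h0, glide_ratio=2.0):.2f} m")
```

    Even a modest glide ratio roughly doubles the distance from an elevated start in this sketch, which is the qualitative effect the jump-gliding envelope quantifies.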

  16. A conceptual design tool for RBCC engine performance analysis

    NASA Astrophysics Data System (ADS)

    Olds, John R.; Saks, Greg

    1997-01-01

    Future reusable launch vehicles will depend on new propulsion technologies to lower system operational costs while maintaining adequate performance. Recently, a number of vehicle systems utilizing rocket-based combined-cycle (RBCC) propulsion have been proposed as possible low-cost space launch solutions. Vehicles using RBCC propulsion have the potential to combine the best aspects of airbreathing propulsion (high average Isp) with the best aspects of rocket propulsion (high propellant bulk density and engine T/W). Proper conceptual assessment of each proposed vehicle will require computer-based tools that allow for quick and cheap, yet sufficiently accurate disciplinary analyses. At Georgia Tech, a spreadsheet-based tool has been developed that uses quasi-1D flow analysis with component efficiencies to parametrically model RBCC engine performance in ejector, fan-ramjet, ramjet and pure rocket modes. The technique is similar to an earlier RBCC modeling technique developed by the Marquardt Corporation in the mid-1960s. For a given sea-level static thrust requirement, the current tool generates engine weight and size data, as well as Isp and thrust data vs. altitude and Mach number. The latter is output in tabular form for use in a trajectory optimization program. This paper reviews the current state of the RBCC analysis tool and the effort to upgrade it from a Microsoft Excel spreadsheet to a design-oriented UNIX program in C suitable for integration into a multidisciplinary design optimization (MDO) framework.
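
    The ramjet mode of such a quasi-1D cycle analysis can be illustrated with an ideal-cycle sketch (component efficiencies set to 1; the burner temperature, fuel heating value and flight condition below are illustrative assumptions, not the tool's inputs).

```python
import math

def ideal_ramjet(M0, T0, Tt4=2200.0, gamma=1.4, cp=1004.5, hpr=42.8e6, g0=9.81):
    """Ideal ramjet-mode performance from standard cycle analysis.
    Returns specific thrust (N*s per kg of air) and Isp (s). A real
    quasi-1D tool applies per-component efficiencies to these relations."""
    a0 = math.sqrt((gamma - 1) * cp * T0)        # freestream speed of sound
    tau_r = 1 + (gamma - 1) / 2 * M0 ** 2        # ram temperature ratio
    tau_lam = Tt4 / T0                           # burner enthalpy ratio
    spec_thrust = a0 * M0 * (math.sqrt(tau_lam / tau_r) - 1)
    f = cp * T0 * (tau_lam - tau_r) / hpr        # fuel-air ratio
    isp = spec_thrust / (f * g0)                 # fuel-based Isp
    return spec_thrust, isp

# Tabulate vs. Mach at a 10 km standard-atmosphere temperature (~223 K),
# the kind of table a trajectory optimizer would consume.
for M0 in (1.5, 2.0, 3.0, 4.0):
    st, isp = ideal_ramjet(M0, 223.0)
    print(f"M{M0}: F/mdot = {st:6.1f} N*s/kg, Isp = {isp:6.0f} s")
```

    The airbreathing Isp values that come out (thousands of seconds) versus a rocket's few hundred are exactly why the RBCC concept trades modes across the trajectory.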

  17. A meta-analysis of math performance in Turner syndrome.

    PubMed

    Baker, Joseph M; Reiss, Allan L

    2016-02-01

    Studies investigating the relationship between Turner syndrome and math learning disability have used a wide variation of tasks designed to test various aspects of mathematical competencies. Although these studies have revealed much about the math deficits common to Turner syndrome, their diversity makes comparisons between individual studies difficult. As a result, the consistency of outcomes among these diverse measures remains unknown. The overarching aim of this review is to provide a systematic meta-analysis of the differences in math and number performance between females with Turner syndrome and age-matched neurotypical peers. We provide a meta-analysis of behavioral performance in Turner syndrome relative to age-matched neurotypical populations on assessments of math and number aptitude. In total, 112 comparisons collected across 17 studies were included. Although 54% of all statistical comparisons in our analyses failed to reject the null hypothesis, our results indicate that meaningful group differences exist on all comparisons except those that do not require explicit calculation. Taken together, these results help elucidate our current understanding of math and number weaknesses in Turner syndrome, while highlighting specific topics that require further investigation. © 2015 Mac Keith Press.

  18. Autotasked Performance in the NAS Workload: A Statistical Analysis

    NASA Technical Reports Server (NTRS)

    Carter, R. L.; Stockdale, I. E.; Kutler, Paul (Technical Monitor)

    1998-01-01

    A statistical analysis of the workload performance of a production-quality FORTRAN code for five different Cray Y-MP hardware and system software configurations is performed. The analysis was based on an experimental procedure that was designed to minimize correlations between the number of requested CPUs and the time of day the runs were initiated. Observed autotasking overheads were significantly larger for the set of jobs that requested the maximum number of CPUs. Runs under UNICOS 6 releases show consistent wall-clock speedups in the workload of around 2, which is quite good. The observed speedups were very similar for the set of jobs that requested 8 CPUs and the set that requested 4 CPUs. The original NAS algorithm for determining charges to the user discourages autotasking in the workload. A new charging algorithm to be applied to jobs run in the NQS multitasking queues also discourages NAS users from using autotasking. The new algorithm favors jobs requesting 8 CPUs over those that request fewer, although the jobs requesting 8 CPUs experienced significantly higher overhead and presumably degraded system throughput. A charging algorithm is presented that has the following desirable characteristics when applied to the data: higher-overhead jobs requesting 8 CPUs are penalized relative to moderate-overhead jobs requesting 4 CPUs, thereby giving NAS users a charging incentive to use autotasking in a manner that provides them with significantly improved turnaround while also maintaining system throughput.

  19. TPMS Data Analysis for Enhancing Intelligent Vehicle Performance

    NASA Astrophysics Data System (ADS)

    Hannan, M. A.; Hussain, A.; Mohamed, A.; Samad, S. A.

    The main objective of this study is to analyze Tire Pressure Monitoring System (TPMS) data that contribute significantly towards enhancing the evaluation of intelligent vehicle performance. TPMS pressure and temperature data were collected, through its receiver, from the prototype MEMS Tire Pressure Module (TPM) fitted to an intelligent tire rim. In this study, we focus only on the analytical analysis of the TPMS data. In the analytical study, a novel method for data classification, goodness of fit and hypothesis testing was proposed. A classification scheme was employed to classify the temperature and pressure data based on ID at the quadrant-basis operating zone of the Front Right (FR), Front Left (FL), Rear Left (RL) and Rear Right (RR) tires. Principal Component Analysis (PCA) with polynomial fitting for exploring the goodness of fit of the tire data was also applied. Finally, hypothesis testing using the Satterthwaite statistic was carried out. The results obtained are in agreement with the null hypothesis and thus validate the usefulness of the TPMS in maintaining and enhancing vehicle performance.
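
    The fitting and testing steps can be sketched with numpy and scipy; the per-tire series below are synthetic stand-ins, not the MEMS TPM measurements. Note that scipy's Welch t-test (`equal_var=False`) uses the Welch-Satterthwaite degrees-of-freedom approximation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Synthetic stand-ins for per-tire pressure series (kPa), one per
# quadrant ID: Front Right, Front Left, Rear Left, Rear Right.
tires = {t: 220 + rng.normal(0, 3, 300) + np.linspace(0, 5, 300)
         for t in ("FR", "FL", "RL", "RR")}

# Goodness of fit: a low-order polynomial trend per tire.
t_axis = np.arange(300)
for name, p in tires.items():
    coef = np.polyfit(t_axis, p, deg=2)
    resid = p - np.polyval(coef, t_axis)
    print(f"{name}: fit RMS residual = {resid.std():.2f} kPa")

# Hypothesis test with the Satterthwaite (Welch) statistic:
# are the FR and RL mean pressures equal?
t_stat, p_val = stats.ttest_ind(tires["FR"], tires["RL"], equal_var=False)
print(f"Welch-Satterthwaite t = {t_stat:.2f}, p = {p_val:.3f}")
```

    With real TPMS data, failing to reject the null across quadrants is the outcome the abstract reports as evidence of consistent tire behavior.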

  20. A meta-analysis of math performance in Turner syndrome

    PubMed Central

    Baker, Joseph M; Reiss, Allan L

    2015-01-01

    AIM Studies investigating the relationship between Turner syndrome and math learning disability have used a wide variation of tasks designed to test various aspects of mathematical competencies. Although these studies have revealed much about the math deficits common to Turner syndrome, their diversity makes comparisons between individual studies difficult. As a result, the consistency of outcomes among these diverse measures remains unknown. The overarching aim of this review is to provide a systematic meta-analysis of the differences in math and number performance between females with Turner syndrome and age-matched neurotypical peers. METHOD We provide a meta-analysis of behavioral performance in Turner syndrome relative to age-matched neurotypical populations on assessments of math and number aptitude. In total, 112 comparisons collected across 17 studies were included. RESULTS Although 54% of all statistical comparisons in our analyses failed to reject the null hypothesis, our results indicate that meaningful group differences exist on all comparisons except those that do not require explicit calculation. INTERPRETATION Taken together, these results help elucidate our current understanding of math and number weaknesses in Turner syndrome, while highlighting specific topics that require further investigation. PMID:26566693

  1. A conceptual design tool for RBCC engine performance analysis

    SciTech Connect

    Olds, J.R.; Saks, G.

    1997-01-01

    Future reusable launch vehicles will depend on new propulsion technologies to lower system operational costs while maintaining adequate performance. Recently, a number of vehicle systems utilizing rocket-based combined-cycle (RBCC) propulsion have been proposed as possible low-cost space launch solutions. Vehicles using RBCC propulsion have the potential to combine the best aspects of airbreathing propulsion (high average Isp) with the best aspects of rocket propulsion (high propellant bulk density and engine T/W). Proper conceptual assessment of each proposed vehicle will require computer-based tools that allow for quick and cheap, yet sufficiently accurate disciplinary analyses. At Georgia Tech, a spreadsheet-based tool has been developed that uses quasi-1D flow analysis with component efficiencies to parametrically model RBCC engine performance in ejector, fan-ramjet, ramjet and pure rocket modes. The technique is similar to an earlier RBCC modeling technique developed by the Marquardt Corporation in the mid-1960s. For a given sea-level static thrust requirement, the current tool generates engine weight and size data, as well as Isp and thrust data vs. altitude and Mach number. The latter is output in tabular form for use in a trajectory optimization program. This paper reviews the current state of the RBCC analysis tool and the effort to upgrade it from a Microsoft Excel spreadsheet to a design-oriented UNIX program in C suitable for integration into a multidisciplinary design optimization (MDO) framework. © 1997 American Institute of Physics.

  2. Topology design and performance analysis of an integrated communication network

    NASA Technical Reports Server (NTRS)

    Li, V. O. K.; Lam, Y. F.; Hou, T. C.; Yuen, J. H.

    1985-01-01

    A research study on the topology design and performance analysis for the Space Station Information System (SSIS) network is conducted. The study begins with a survey of existing research efforts in network topology design. Then a new approach for topology design is presented. It uses an efficient algorithm to generate candidate network designs (consisting of subsets of the set of all network components) in increasing order of their total costs, and checks each design to see if it forms an acceptable network. This technique gives the true cost-optimal network, and is particularly useful when the network has many constraints and not too many components. The algorithm for generating subsets is described in detail, and various aspects of the overall design procedure are discussed. Two more efficient versions of this algorithm (applicable in specific situations) are also given. Next, two important aspects of network performance analysis, network reliability and message delays, are discussed. A new model is introduced to study the reliability of a network with dependent failures. For message delays, a collection of formulas from existing research results is given to compute or estimate the delays of messages in a communication network without making the independence assumption. The design algorithm coded in PASCAL is included as an appendix.
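
    The candidate-generation idea, producing designs in increasing order of total cost until one passes the acceptability check, can be sketched with a standard best-first heap enumeration of subsets. This is a generic scheme under the assumption of positive component costs, not a reproduction of the paper's PASCAL algorithm or its network-acceptability test.

```python
import heapq

def subsets_by_cost(costs):
    """Yield index subsets of `costs` (all positive) in nondecreasing
    order of total cost, without materializing the power set."""
    order = sorted(range(len(costs)), key=lambda i: costs[i])
    c = [costs[i] for i in order]
    yield frozenset()                        # the empty design first
    if not c:
        return
    heap = [(c[0], 0, (0,))]                 # (total cost, last idx, subset)
    while heap:
        total, last, subset = heapq.heappop(heap)
        yield frozenset(order[i] for i in subset)
        if last + 1 < len(c):
            # extend the subset with the next cheapest component...
            heapq.heappush(heap, (total + c[last + 1], last + 1,
                                  subset + (last + 1,)))
            # ...or swap its last component for the next cheapest one
            heapq.heappush(heap, (total - c[last] + c[last + 1], last + 1,
                                  subset[:-1] + (last + 1,)))

def first_acceptable(costs, is_acceptable):
    """Return the cheapest design passing the acceptability check."""
    return next(s for s in subsets_by_cost(costs) if is_acceptable(s))

# Toy use: cheapest subset of components covering both 'nodes' 0 and 1
costs = [4.0, 1.0, 3.0, 2.0]
covers = {0: {0}, 1: {1}, 2: {0, 1}, 3: {1}}
ok = lambda s: {0, 1} <= set().union(*(covers[i] for i in s)) if s else False
print(sorted(first_acceptable(costs, ok)))
```

    Because designs are emitted in cost order, the first acceptable one is provably cost-optimal, which is the property the abstract claims for the approach.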

  3. Performance analysis for standoff biological warfare agent detection lidar

    NASA Astrophysics Data System (ADS)

    Steinvall, Ove; Jonsson, Per; Kullander, Fredrik

    2007-10-01

    Lidar has been identified as a promising sensor for remote detection of biological warfare agents. Elastic lidar can be used for cloud detection at long ranges and UV laser induced fluorescence can be used for discrimination of bioaerosols against naturally occurring aerosols. This paper analyzes the performance of elastic lidar such as sensitivity, range and angular coverage rate vs. atmospheric visibility, laser and receiver parameters. The analysis of the UV fluorescence lidar is concentrated on estimating the signal strength as a function of range, concentration and optical background level. The performance analysis supports the goal for a practical lidar system to detect 1000 particles/liter at 2-3 km using elastic backscatter and to verify the bioaerosol using fluorescence characterization at 1 km. Some examples of test results with an elastic lidar and a range gated imaging system both at 1.5 μm wavelength are presented together with fluorescence spectra of biological warfare agent simulants measured at an excitation wavelength of 355 nm.
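
    The signal-strength side of such a performance analysis can be sketched with the single-scatter elastic lidar equation. The system parameters below (pulse energy, aperture, range gate, overall efficiency) are illustrative assumptions, not the paper's lidar.

```python
import numpy as np

def lidar_photons(R, beta, alpha, E0=25e-3, wavelength=1.5e-6,
                  D=0.1, dR=30.0, eta=0.3):
    """Expected photoelectrons from range R via the single-scatter
    elastic lidar equation. beta is the backscatter coefficient in
    1/(m sr), alpha the extinction coefficient in 1/m; E0, D, dR and
    eta are hypothetical system parameters."""
    h, c = 6.626e-34, 3e8
    A = np.pi * (D / 2) ** 2                     # receiver aperture area
    T2 = np.exp(-2 * alpha * R)                  # two-way transmission
    energy = E0 * dR * beta * (A / R ** 2) * T2  # received optical energy
    return eta * energy * wavelength / (h * c)   # photoelectron count

# Aerosol return in a clear atmosphere (illustrative coefficients)
for R in (500.0, 1000.0, 2000.0, 3000.0):
    n = lidar_photons(R, beta=1e-6, alpha=1e-4)
    print(f"R = {R:5.0f} m: ~{n:.0f} photoelectrons")
```

    The 1/R^2 geometric loss and the two-way extinction are what set the 2-3 km detection goal quoted in the abstract; the fluorescence channel obeys the same geometry with a much weaker cross-section, hence its shorter 1 km range.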

  4. Topology design and performance analysis of an integrated communication network

    NASA Astrophysics Data System (ADS)

    Li, V. O. K.; Lam, Y. F.; Hou, T. C.; Yuen, J. H.

    1985-09-01

    A research study on the topology design and performance analysis for the Space Station Information System (SSIS) network is conducted. The study begins with a survey of existing research efforts in network topology design. Then a new approach for topology design is presented. It uses an efficient algorithm to generate candidate network designs (consisting of subsets of the set of all network components) in increasing order of their total costs, and checks each design to see if it forms an acceptable network. This technique gives the true cost-optimal network, and is particularly useful when the network has many constraints and not too many components. The algorithm for generating subsets is described in detail, and various aspects of the overall design procedure are discussed. Two more efficient versions of this algorithm (applicable in specific situations) are also given. Next, two important aspects of network performance analysis, network reliability and message delays, are discussed. A new model is introduced to study the reliability of a network with dependent failures. For message delays, a collection of formulas from existing research results is given to compute or estimate the delays of messages in a communication network without making the independence assumption. The design algorithm coded in PASCAL is included as an appendix.

  5. Correlation analysis between ionospheric scintillation levels and receiver tracking performance

    NASA Astrophysics Data System (ADS)

    Sreeja, V.; Aquino, M.; Elmas, Z. G.; Forte, B.

    2012-06-01

    Rapid fluctuations in the amplitude and phase of a transionospheric radio signal caused by small scale plasma density irregularities in the ionosphere are known as scintillation. Scintillation can seriously impair a GNSS (Global Navigation Satellite Systems) receiver tracking performance, thus affecting the required levels of availability, accuracy and integrity, and consequently the reliability of modern day GNSS based applications. This paper presents an analysis of correlation between scintillation levels and tracking performance of a GNSS receiver for GPS L1C/A, L2C and GLONASS L1, L2 signals. The analyses make use of data recorded over Presidente Prudente (22.1°S, 51.4°W, dip latitude ˜12.3°S) in Brazil, a location close to the Equatorial Ionisation Anomaly (EIA) crest in Latin America. The study presents for the first time this type of correlation analysis for GPS L2C and GLONASS L1, L2 signals. The scintillation levels are defined by the amplitude scintillation index, S4, and the receiver tracking performance is evaluated by the phase tracking jitter. Both S4 and the phase tracking jitter are estimated from the post-correlation In-Phase (I) and Quadrature-Phase (Q) components logged by the receiver at a high rate. Results reveal that the dependence of the phase tracking jitter on the scintillation levels can be represented by a quadratic fit for the signals. The results presented in this paper are of importance to GNSS users, especially in view of the forthcoming high phase of solar cycle 24 (predicted for 2013).
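
    The quadratic dependence of phase tracking jitter on S4 can be reproduced in miniature with a least-squares polynomial fit. The data below are synthetic stand-ins generated from an assumed toy law, not the Presidente Prudente measurements.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic (S4, phase jitter) pairs standing in for the post-correlation
# estimates; the quadratic coefficients of the toy law are arbitrary.
s4 = rng.uniform(0.1, 1.0, 200)
jitter_deg = 2.0 + 1.5 * s4 + 6.0 * s4**2 + rng.normal(0, 0.5, 200)

coef = np.polyfit(s4, jitter_deg, deg=2)       # [a2, a1, a0]
pred = np.polyval(coef, s4)
ss_res = np.sum((jitter_deg - pred) ** 2)
ss_tot = np.sum((jitter_deg - jitter_deg.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"fit: jitter ~ {coef[0]:.2f}*S4^2 + {coef[1]:.2f}*S4 + {coef[2]:.2f}, "
      f"R^2 = {r2:.3f}")
```

    Applied per signal (GPS L1C/A, L2C, GLONASS L1/L2), a fit of this form gives the jitter-versus-S4 curves the paper reports.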

  6. Design and Performance Analysis of Incremental Networked Predictive Control Systems.

    PubMed

    Pang, Zhong-Hua; Liu, Guo-Ping; Zhou, Donghua

    2016-06-01

    This paper is concerned with the design and performance analysis of networked control systems with network-induced delay, packet disorder, and packet dropout. Based on the incremental form of the plant input-output model and an incremental error feedback control strategy, an incremental networked predictive control (INPC) scheme is proposed to actively compensate for the round-trip time delay resulting from the above communication constraints. The output tracking performance and closed-loop stability of the resulting INPC system are considered for two cases: 1) plant-model match case and 2) plant-model mismatch case. For the former case, the INPC system can achieve the same output tracking performance and closed-loop stability as those of the corresponding local control system. For the latter case, a sufficient condition for the stability of the closed-loop INPC system is derived using the switched system theory. Furthermore, for both cases, the INPC system can achieve a zero steady-state output tracking error for step commands. Finally, both numerical simulations and practical experiments on an Internet-based servo motor system illustrate the effectiveness of the proposed method.

  7. Instantaneous BeiDou-GPS attitude determination: A performance analysis

    NASA Astrophysics Data System (ADS)

    Nadarajah, Nandakumaran; Teunissen, Peter J. G.; Raziq, Noor

    2014-09-01

    The advent of modernized and new global navigation satellite systems (GNSS) has enhanced the availability of satellite based positioning, navigation, and timing (PNT) solutions. Specifically, it increases redundancy and yields operational back-up or independence in case of failure or unavailability of one system. Among existing GNSS, the Chinese BeiDou system (BDS) is being developed and will consist of geostationary (GEO) satellites, inclined geosynchronous orbit (IGSO) satellites, and medium-Earth-orbit (MEO) satellites. In this contribution, a BeiDou-GPS robustness analysis is carried out for instantaneous, unaided attitude determination. Precise attitude determination using multiple GNSS antennas mounted on a platform relies on the successful resolution of the integer carrier phase ambiguities. The constrained Least-squares AMBiguity Decorrelation Adjustment (C-LAMBDA) method has been developed for the quadratically constrained GNSS compass model that incorporates the known baseline length. In this contribution the method is used to analyse the attitude determination performance when using the GPS and BeiDou systems. The attitude determination performance is evaluated using GPS/BeiDou data sets from a real data campaign in Australia spanning several days. The study includes the performance analyses of both stand-alone and mixed constellation (GPS/BeiDou) attitude estimation under various satellite deprived environments. We demonstrate and quantify the improved availability and accuracy of attitude determination using the combined constellation.

  8. Visual Analysis of Cloud Computing Performance Using Behavioral Lines.

    PubMed

    Muelder, Chris; Zhu, Biao; Chen, Wei; Zhang, Hongxin; Ma, Kwan-Liu

    2016-02-29

    Cloud computing is an essential technology to Big Data analytics and services. A cloud computing system is often comprised of a large number of parallel computing and storage devices. Monitoring the usage and performance of such a system is important for efficient operations, maintenance, and security. Tracing every application on a large cloud system is untenable due to scale and privacy issues. But profile data can be collected relatively efficiently by regularly sampling the state of the system, including properties such as CPU load, memory usage, network usage, and others, creating a set of multivariate time series for each system. Adequate tools for studying such large-scale, multidimensional data are lacking. In this paper, we present a visual based analysis approach to understanding and analyzing the performance and behavior of cloud computing systems. Our design is based on similarity measures and a layout method to portray the behavior of each compute node over time. When visualizing a large number of behavioral lines together, distinct patterns often appear suggesting particular types of performance bottleneck. The resulting system provides multiple linked views, which allow the user to interactively explore the data by examining the data or a selected subset at different levels of detail. Our case studies, which use datasets collected from two different cloud systems, show that this visual based approach is effective in identifying trends and anomalies of the systems.
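The similarity step behind such behavioral lines can be sketched as follows: each node's sampled multivariate profile is normalized and nodes are compared by a simple distance, with outliers flagged as anomaly candidates. The metric, array shapes, and the injected "hot" node are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np

# Each node yields a multivariate time series (e.g. CPU, memory, network).
# Four well-behaved nodes plus one synthetic overloaded node (assumption).
rng = np.random.default_rng(0)
T, M = 100, 3                                     # time steps, metrics per node
normal = rng.normal(0.5, 0.05, size=(4, T, M))
hot = rng.normal(0.9, 0.05, size=(1, T, M))
nodes = np.concatenate([normal, hot])             # shape (5, T, M)

# z-score each metric across all nodes and times so metrics are comparable.
mu = nodes.mean(axis=(0, 1), keepdims=True)
sd = nodes.std(axis=(0, 1), keepdims=True)
z = (nodes - mu) / sd

flat = z.reshape(len(nodes), -1)                  # one profile vector per node
dist = np.linalg.norm(flat[:, None, :] - flat[None, :, :], axis=-1)

# The node farthest from all others stands out as an anomaly candidate.
outlier = int(dist.sum(axis=1).argmax())
print(outlier)
```

In a real deployment the pairwise distances would feed a layout algorithm that draws one line per node, so nodes with similar behavior bundle together and anomalies separate visually.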

  9. The use of error analysis to assess resident performance.

    PubMed

    D'Angelo, Anne-Lise D; Law, Katherine E; Cohen, Elaine R; Greenberg, Jacob A; Kwan, Calvin; Greenberg, Caprice; Wiegmann, Douglas A; Pugh, Carla M

    2015-11-01

    The aim of this study was to assess validity of a human factors error assessment method for evaluating resident performance during a simulated operative procedure. Seven postgraduate year 4-5 residents had 30 minutes to complete a simulated laparoscopic ventral hernia (LVH) repair on day 1 of a national, advanced laparoscopic course. Faculty provided immediate feedback on operative errors and residents participated in a final product analysis of their repairs. Residents then received didactic and hands-on training regarding several advanced laparoscopic procedures during a lecture session and animate lab. On day 2, residents performed a nonequivalent LVH repair using a simulator. Three investigators reviewed and coded videos of the repairs using previously developed human error classification systems. Residents committed 121 total errors on day 1 compared with 146 on day 2. One of 7 residents successfully completed the LVH repair on day 1 compared with all 7 residents on day 2 (P = .001). The majority of errors (85%) committed on day 2 were technical and occurred during the last 2 steps of the procedure. There were significant differences in error type (P ≤ .001) and level (P = .019) from day 1 to day 2. The proportion of omission errors decreased from day 1 (33%) to day 2 (14%). In addition, there were more technical and commission errors on day 2. The error assessment tool was successful in categorizing performance errors, supporting known-groups validity evidence. Evaluating resident performance through error classification has great potential in facilitating our understanding of operative readiness. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. The Use of Error Analysis to Assess Resident Performance

    PubMed Central

    D’Angelo, Anne-Lise D.; Law, Katherine E.; Cohen, Elaine R.; Greenberg, Jacob A.; Kwan, Calvin; Greenberg, Caprice; Wiegmann, Douglas A.; Pugh, Carla M.

    2015-01-01

    Background The aim of this study is to assess validity of a human factors error assessment method for evaluating resident performance during a simulated operative procedure. Methods Seven PGY4-5 residents had 30 minutes to complete a simulated laparoscopic ventral hernia (LVH) repair on Day 1 of a national, advanced laparoscopic course. Faculty provided immediate feedback on operative errors and residents participated in a final product analysis of their repairs. Residents then received didactic and hands-on training regarding several advanced laparoscopic procedures during a lecture session and animate lab. On Day 2, residents performed a nonequivalent LVH repair using a simulator. Three investigators reviewed and coded videos of the repairs using previously developed human error classification systems. Results Residents committed 121 total errors on Day 1 compared to 146 on Day 2. One of seven residents successfully completed the LVH repair on Day 1 compared to all seven residents on Day 2 (p=.001). The majority of errors (85%) committed on Day 2 were technical and occurred during the last two steps of the procedure. There were significant differences in error type (p=<.001) and level (p=.019) from Day 1 to Day 2. The proportion of omission errors decreased from Day 1 (33%) to Day 2 (14%). In addition, there were more technical and commission errors on Day 2. Conclusion The error assessment tool was successful in categorizing performance errors, supporting known-groups validity evidence. Evaluating resident performance through error classification has great potential in facilitating our understanding of operative readiness. PMID:26003910

  11. Optical performance analysis and optimization of large telescope structural designs

    NASA Astrophysics Data System (ADS)

    Roberts, Scott; Sun, Simon; Kerley, Dan

    2005-09-01

    We describe a tool to analyze the effects of gravity induced deflections on a telescope structure with segmented primary mirror optics. An objective of the telescope structural design process is to minimize image quality degradation due to uncorrectable static deflections of the optics under gravity, while ensuring that the overall system meets several requirements including limits of maximum primary mirror actuator stroke, segment rotation and decenter, and secondary mirror actuation. These design and performance criteria are not readily calculated within a finite element program. Our Merit Function routine, implemented in MATLAB and called by ANSYS, calculates these parameters and makes them available within ANSYS for evaluation and design optimization. In this analysis, ANSYS outputs key structural-model nodal displacements to a file, which are used to determine the 6 degree of freedom motion of the telescope's optical surfaces. MATLAB then utilizes these displacements, along with a database containing coordinate system transforms and a linear optics model derived from ZEMAX, to calculate various performance criteria. The values returned to ANSYS can be used to iteratively optimize performance over a set of structural design parameters. Optical parameters calculated by this routine include the optical path difference at the pupil, RMS wavefront, encircled energy and low order Zernike terms resulting from primary mirror segment rotation and decenter. Also reported are the maximum actuator strokes required to restore tip-tilt and piston of the primary mirror segments, and the deflection of the secondary mirror under gravitational load. The merit function routine is being used by the Thirty Meter Telescope (TMT) project to optimize and assess the performance of various telescope structural designs. This paper describes the mathematical basis of the calculations, their implementation and gives preliminary results of the TMT Telescope Structure Reference Design.
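A core step of such a merit-function routine is recovering the six-degree-of-freedom rigid-body motion of an optical surface from FE nodal displacements. Below is a minimal least-squares sketch; the small-angle rigid-body model d = t + w × p is standard, while the node coordinates, motions, and names are invented for illustration.

```python
import numpy as np

# Synthetic mirror: node coordinates and an imposed rigid-body motion
# (3 translations t, 3 small rotations w). All values are assumptions.
rng = np.random.default_rng(1)
P = rng.uniform(-1, 1, size=(50, 3))          # node coordinates on the surface
t_true = np.array([1e-4, -2e-4, 5e-5])        # translations
w_true = np.array([2e-5, -1e-5, 3e-5])        # small rotation vector (rad)
d = t_true + np.cross(w_true, P)              # rigid-body displacement field

# Build the linear system A @ [t, w] = d: for each node, d_i = t + w x p_i.
rows = []
for p in P:
    wx = np.array([[0.0,  p[2], -p[1]],       # matrix M(p) with M(p) @ w = w x p
                   [-p[2], 0.0,  p[0]],
                   [p[1], -p[0], 0.0]])
    rows.append(np.hstack([np.eye(3), wx]))
A = np.vstack(rows)
sol, *_ = np.linalg.lstsq(A, d.ravel(), rcond=None)

print(sol[:3], sol[3:])                       # recovered translations, rotations
```

The recovered 6-DOF motions of the primary segments and secondary mirror are exactly the quantities from which actuator strokes and segment decenters can then be computed.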

  12. Aerocapture Performance Analysis for a Neptune-Triton Exploration Mission

    NASA Technical Reports Server (NTRS)

    Starr, Brett R.; Westhelle, Carlos H.; Masciarelli, James P.

    2004-01-01

A systems analysis has been conducted for a Neptune-Triton Exploration Mission in which aerocapture is used to capture a spacecraft at Neptune. Aerocapture uses aerodynamic drag instead of propulsion to decelerate from the interplanetary approach trajectory to a captured orbit during a single pass through the atmosphere. After capture, propulsion is used to move the spacecraft from the initial captured orbit to the desired science orbit. A preliminary assessment identified that a spacecraft with a lift to drag ratio of 0.8 was required for aerocapture. Performance analyses of the 0.8 L/D vehicle were performed using a high fidelity flight simulation within a Monte Carlo executive to determine mission success statistics. The simulation was the Program to Optimize Simulated Trajectories (POST) modified to include Neptune specific atmospheric and planet models, spacecraft aerodynamic characteristics, and interplanetary trajectory models. To these were added autonomous guidance and pseudo flight controller models. The Monte Carlo analyses incorporated approach trajectory delivery errors, aerodynamic characteristics uncertainties, and atmospheric density variations. Monte Carlo analyses were performed for a reference set of uncertainties and for sets of uncertainties modified to produce increased and reduced atmospheric variability. For the reference uncertainties, the 0.8 L/D flatbottom ellipsled vehicle achieves 100% successful capture and has a 99.87% probability of attaining the science orbit within a 360 m/s ΔV budget for apoapsis and periapsis adjustment. Monte Carlo analyses were also performed for a guidance system that modulates both bank angle and angle of attack with the reference set of uncertainties. The alpha and bank modulation guidance system reduces the 99.87th percentile ΔV by 173 m/s (48%), from 360 m/s to 187 m/s, for the reference set of uncertainties.
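The Monte Carlo bookkeeping described above reduces to: run many dispersed trajectory cases, then report the capture success fraction and a high percentile of the required ΔV. The toy stand-in below invents a surrogate ΔV model purely for illustration; the real analysis runs POST trajectory simulations.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 20000

# Surrogate for one dispersed case: entry and atmosphere dispersions inflate
# the post-aerocapture Delta-V (m/s). This closed-form model is invented and
# stands in for a full POST trajectory run per Monte Carlo sample.
dv = 150 + 40 * np.abs(rng.normal(size=N)) + 10 * rng.normal(size=N) ** 2

dv_9987 = np.percentile(dv, 99.87)      # the +3-sigma requirement level
success = (dv <= 360.0).mean()          # fraction of cases within the budget
print(round(dv_9987, 1), round(success, 4))
```

Reporting the 99.87th percentile rather than the mean is what ties the ΔV budget to a 3-sigma success criterion.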

  13. Advanced multiphysics coupling for LWR fuel performance analysis

    DOE PAGES

    Hales, J. D.; Tonks, M. R.; Gleicher, F. N.; ...

    2015-10-01

Even the most basic nuclear fuel analysis is a multiphysics undertaking, as a credible simulation must consider at a minimum coupled heat conduction and mechanical deformation. The need for more realistic fuel modeling under a variety of conditions invariably leads to a desire to include coupling between a more complete set of the physical phenomena influencing fuel behavior, including neutronics, thermal hydraulics, and mechanisms occurring at lower length scales. This paper covers current efforts toward coupled multiphysics LWR fuel modeling in three main areas. The first area covered in this paper concerns thermomechanical coupling. The interaction of these two physics, particularly related to the feedback effect associated with heat transfer and mechanical contact at the fuel/clad gap, provides numerous computational challenges. An outline is provided of an effective approach used to manage the nonlinearities associated with an evolving gap in BISON, a nuclear fuel performance application. A second type of multiphysics coupling described here is that of coupling neutronics with thermomechanical LWR fuel performance. DeCART, a high-fidelity core analysis program based on the method of characteristics, has been coupled to BISON. DeCART provides sub-pin level resolution of the multigroup neutron flux, with resonance treatment, during a depletion or a fast transient simulation. Two-way coupling between these codes was achieved by mapping fission rate density and fast neutron flux fields from DeCART to BISON and the temperature field from BISON to DeCART while employing a Picard iterative algorithm. Finally, the need for multiscale coupling is considered. Fission gas production and evolution significantly impact fuel performance by causing swelling, a reduction in the thermal conductivity, and fission gas release. The mechanisms involved occur at the atomistic and grain scale and are therefore not the domain of a fuel performance code. However, it is

  14. Advanced multiphysics coupling for LWR fuel performance analysis

    SciTech Connect

    Hales, J. D.; Tonks, M. R.; Gleicher, F. N.; Spencer, B. W.; Novascone, S. R.; Williamson, R. L.; Pastore, G.; Perez, D. M.

    2015-10-01

    Even the most basic nuclear fuel analysis is a multiphysics undertaking, as a credible simulation must consider at a minimum coupled heat conduction and mechanical deformation. The need for more realistic fuel modeling under a variety of conditions invariably leads to a desire to include coupling between a more complete set of the physical phenomena influencing fuel behavior, including neutronics, thermal hydraulics, and mechanisms occurring at lower length scales. This paper covers current efforts toward coupled multiphysics LWR fuel modeling in three main areas. The first area covered in this paper concerns thermomechanical coupling. The interaction of these two physics, particularly related to the feedback effect associated with heat transfer and mechanical contact at the fuel/clad gap, provides numerous computational challenges. An outline is provided of an effective approach used to manage the nonlinearities associated with an evolving gap in BISON, a nuclear fuel performance application. A second type of multiphysics coupling described here is that of coupling neutronics with thermomechanical LWR fuel performance. DeCART, a high-fidelity core analysis program based on the method of characteristics, has been coupled to BISON. DeCART provides sub-pin level resolution of the multigroup neutron flux, with resonance treatment, during a depletion or a fast transient simulation. Two-way coupling between these codes was achieved by mapping fission rate density and fast neutron flux fields from DeCART to BISON and the temperature field from BISON to DeCART while employing a Picard iterative algorithm. Finally, the need for multiscale coupling is considered. Fission gas production and evolution significantly impact fuel performance by causing swelling, a reduction in the thermal conductivity, and fission gas release. The mechanisms involved occur at the atomistic and grain scale and are therefore not the domain of a fuel performance code. 
However, it is possible to use
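The two-way Picard (fixed-point) field exchange described above can be sketched with two closed-form stand-ins for the neutronics and fuel-performance solvers. The feedback coefficients, field size, and tolerance below are invented for illustration; DeCART and BISON solve full transport and thermomechanics problems at each exchange.

```python
# Picard iteration between two coupled "solvers": one maps temperature to
# power (toy neutronics with negative temperature feedback), the other maps
# power back to temperature (toy heat conduction). Both are assumptions.

def power_from_temperature(T):
    # reactivity feedback: local power drops as temperature rises
    return [100.0 / (1.0 + 0.002 * t) for t in T]

def temperature_from_power(q):
    # temperature rises linearly with local power density
    return [300.0 + 2.0 * qi for qi in q]

T = [300.0] * 5                         # initial temperature field guess (K)
for it in range(100):
    q = power_from_temperature(T)       # field mapped solver 1 -> solver 2
    T_new = temperature_from_power(q)   # field mapped solver 2 -> solver 1
    resid = max(abs(a - b) for a, b in zip(T_new, T))
    T = T_new
    if resid < 1e-8:                    # converged field exchange
        break

print(it, round(T[0], 2))
```

Because the combined mapping is a contraction here, the exchange converges in a handful of iterations; in practice relaxation is often added when the feedback is stronger.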

  15. 1-D Numerical Analysis of ABCC Engine Performance

    NASA Technical Reports Server (NTRS)

    Holden, Richard

    1999-01-01

The ABCC engine combines an air-breathing engine and a rocket engine into a single engine to increase the specific impulse over an entire flight trajectory. Except for the heat source, the basic operation of the ABCC is similar to that of the RBCC engine. The ABCC is intended to have a higher specific impulse than the RBCC for a single-stage Earth-to-orbit vehicle. Computational fluid dynamics (CFD) is a useful tool for the analysis of complex transport processes in the various components of an ABCC propulsion system. The objective of the present research was to develop a transient 1-D numerical model, based on the conservation of mass, linear momentum, and energy, that could be used to predict flow behavior throughout a generic ABCC engine following a flight path. At specific points during the development of the 1-D numerical model, a series of tests was performed to verify that the program produced consistent, realistic numbers that follow compressible flow theory for various inlet conditions.
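A standard compressible-flow check of the kind mentioned above is the isentropic area-Mach relation, sketched below. The relation itself is textbook theory; its use here as a verification case for a 1-D engine model is only illustrative.

```python
# Isentropic area-Mach relation for a calorically perfect gas:
#   A/A* = (1/M) * [ (2/(g+1)) * (1 + (g-1)/2 * M^2) ] ^ ((g+1)/(2(g-1)))
# A 1-D model's converging-diverging sections can be checked against it.

def area_ratio(M, gamma=1.4):
    """Isentropic area ratio A/A* for Mach number M."""
    g = gamma
    core = (2.0 / (g + 1.0)) * (1.0 + 0.5 * (g - 1.0) * M * M)
    return core ** ((g + 1.0) / (2.0 * (g - 1.0))) / M

# A/A* equals 1 at the sonic throat and grows on both sides of M = 1.
print(area_ratio(0.5), area_ratio(1.0), area_ratio(2.0))
```

For air (γ = 1.4) the tabulated value at M = 2 is A/A* = 1.6875, a convenient spot check for any 1-D compressible solver.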

  16. A theoretical analysis of vacuum arc thruster performance

    NASA Technical Reports Server (NTRS)

    Polk, James E.; Sekerak, Mike; Ziemer, John K.; Schein, Jochen; Qi, Niansheng; Binder, Robert; Anders, Andre

    2001-01-01

    In vacuum arc discharges the current is conducted through vapor evaporated from the cathode surface. In these devices very dense, highly ionized plasmas can be created from any metallic or conducting solid used as the cathode. This paper describes theoretical models of performance for several thruster configurations which use vacuum arc plasma sources. This analysis suggests that thrusters using vacuum arc sources can be operated efficiently with a range of propellant options that gives great flexibility in specific impulse. In addition, the efficiency of plasma production in these devices appears to be largely independent of scale because the metal vapor is ionized within a few microns of the cathode electron emission sites, so this approach is well-suited for micropropulsion.

  17. Performance Analysis: ITS Data through September 30, 2009

    SciTech Connect

    Kerr, C E

    2009-12-07

Data from ITS were analyzed to understand the issues at LLNL, to identify issues that may require additional management attention, and to identify those that meet the threshold for reporting to the DOE Noncompliance Tracking System (NTS). In this report we discuss assessments and issues entered in ITS and compare the number and type presently entered in ITS to previous time periods. Issues reported in ITS were evaluated and discussed. The analysis identified two noncompliances that meet the threshold for reporting to the DOE NTS. All of the data in ITS are analyzed; however, the primary focus of this report is to meet requirements for performance analysis of specific functional areas. The DOE Office of Enforcement expects LLNL to 'implement comprehensive management and independent assessments that are effective in identifying deficiencies and broader problems in safety and security programs, as well as opportunities for continuous improvement within the organization' and to 'regularly perform assessments to evaluate implementation of the contractor's processes for screening and internal reporting.' LLNL has a self-assessment program, described in the document applicable during this time period, ES&H Manual Document 4.1, that includes line, management and independent assessments. LLNL also has in place a process to identify and report deficiencies of nuclear, worker safety and health and security requirements. In addition, the DOE Office of Enforcement expects that 'issues management databases are used to identify adverse trends, dominant problem areas, and potential repetitive events or conditions' (page 15, DOE Enforcement Process Overview, June 2009). LLNL requires that all worker safety and health and nuclear safety noncompliances be tracked as 'deficiencies' in the LLNL Issues Tracking System (ITS). Data from the ITS are analyzed for worker safety and health (WSH) and nuclear safety noncompliances that may meet the threshold for reporting to the DOE Noncompliance

  18. Performance analysis of spread spectrum modulation in data hiding

    NASA Astrophysics Data System (ADS)

    Gang, Litao; Akansu, Ali N.; Ramkumar, Mahalingam

    2001-12-01

Watermarking or steganography technology provides a possible solution for digital multimedia copyright protection and pirate tracking. Most current data hiding schemes are based on spread spectrum modulation: a small-valued watermark signal is embedded into the content signal in some watermark domain, and the information bits are extracted via correlation. The schemes are applied in both escrow and oblivious cases. This paper reveals, through analysis and simulation, that in oblivious applications where the original signal is not available, the commonly used correlation detection is not optimal. Maximum likelihood detection is analyzed and a feasible suboptimal detector is derived. Its performance is explored and compared with the correlation detector. Subsequently, a linear embedding scheme is proposed and studied. Experiments with image data hiding demonstrate its effectiveness in applications.
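A minimal sketch of spread-spectrum embedding and of the commonly used blind correlation detector (the detector the paper argues is suboptimal in the oblivious case). Signal length, embedding strength, and all names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 4096
host = rng.normal(0.0, 10.0, N)          # content signal in a watermark domain
pattern = rng.choice([-1.0, 1.0], N)     # shared pseudo-random spreading code
alpha, bit = 1.0, 1                      # embedding strength, hidden bit

# Spread-spectrum embedding: add the pattern, scaled by the bit's sign.
marked = host + alpha * (1.0 if bit else -1.0) * pattern

# Oblivious (blind) detection: the original is unavailable, so the host acts
# as noise and the bit is read off the sign of the normalized correlation.
corr = marked @ pattern / N
decoded = 1 if corr > 0 else 0
print(decoded)
```

The host-signal term averages toward zero over the N chips while the embedded term survives at strength alpha, which is why correlation works at all; the paper's point is that treating the host purely as Gaussian noise leaves detection performance on the table.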

  19. Performance Analysis of an Actor-Based Distributed Simulation

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1998-01-01

    Object-oriented design of simulation programs appears to be very attractive because of the natural association of components in the simulated system with objects. There is great potential in distributing the simulation across several computers for the purpose of parallel computation and its consequent handling of larger problems in less elapsed time. One approach to such a design is to use "actors", that is, active objects with their own thread of control. Because these objects execute concurrently, communication is via messages. This is in contrast to an object-oriented design using passive objects where communication between objects is via method calls (direct calls when they are in the same address space and remote procedure calls when they are in different address spaces or different machines). This paper describes a performance analysis program for the evaluation of a design for distributed simulations based upon actors.

  20. Portable Life Support Subsystem Thermal Hydraulic Performance Analysis

    NASA Technical Reports Server (NTRS)

    Barnes, Bruce; Pinckney, John; Conger, Bruce

    2010-01-01

    This paper presents the current state of the thermal hydraulic modeling efforts being conducted for the Constellation Space Suit Element (CSSE) Portable Life Support Subsystem (PLSS). The goal of these efforts is to provide realistic simulations of the PLSS under various modes of operation. The PLSS thermal hydraulic model simulates the thermal, pressure, flow characteristics, and human thermal comfort related to the PLSS performance. This paper presents modeling approaches and assumptions as well as component model descriptions. Results from the models are presented that show PLSS operations at steady-state and transient conditions. Finally, conclusions and recommendations are offered that summarize results, identify PLSS design weaknesses uncovered during review of the analysis results, and propose areas for improvement to increase model fidelity and accuracy.

  2. Analysis of Different Blade Architectures on small VAWT Performance

    NASA Astrophysics Data System (ADS)

    Battisti, L.; Brighenti, A.; Benini, E.; Raciti Castelli, M.

    2016-09-01

The present paper aims at describing and comparing different small Vertical Axis Wind Turbine (VAWT) architectures in terms of performance and loads. These characteristics can be highlighted by resorting to the Blade Element-Momentum (BE-M) model, commonly adopted for rotor pre-design and controller assessment. After validating the model with experimental data, the paper focuses on the analysis of VAWT loads depending on some relevant rotor features: blade number (2 and 3), airfoil camber line (comparing symmetrical and asymmetrical profiles) and blade inclination (straight versus helical blade). The effects of such characteristics on both power and thrusts (in the streamwise and crosswise directions), as a function of both the blades' azimuthal position and their Tip Speed Ratio (TSR), are presented and discussed in detail.

  3. Human performance analysis of industrial radiography radiation exposure events

    SciTech Connect

    Reece, W.J.; Hill, S.G.

    1995-12-01

A set of radiation overexposure event reports was reviewed as part of a program to examine human performance in industrial radiography for the US Nuclear Regulatory Commission. Incident records for a seven-year period were retrieved from an event database. Ninety-five exposure events were initially categorized and sorted for further analysis. Descriptive models were applied to a subset of severe overexposure events. Modeling included: (1) operational sequence tables to outline the key human actions and interactions with equipment, (2) human reliability event trees, (3) an application of an information processing failures model, and (4) an extrapolated use of the error influences and effects diagram. Results of the modeling analyses provided insights into the industrial radiography task and suggested areas for further action and study to decrease overexposures.

  4. Performance analysis of ultrasonic ranging using a digital polarity correlator

    NASA Astrophysics Data System (ADS)

    Kodama, T.; Nakahira, K.

    2013-01-01

This paper presents a performance analysis of distance measurement using a digital polarity correlator applied to an ultrasonic ranging system consisting of piezoelectric transducers for pulse-echo operation and a pulse compression filter using chirp signals. Analytical and simulation results show that one-bit correlation is as effective as two-bit correlation with respect to signal-to-noise ratio and the probability of detecting a target, and further that both methods approach the results obtained from a complete correlation of the received signals with a reference signal, provided that the threshold applied to the received signals is adjusted with regard to the noise level. Experimental results show close agreement with the presented theory.
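The comparison above can be sketched numerically: pulse-compress a noisy chirp echo once with full-precision correlation and once with a one-bit (polarity) correlator, and check that both locate the echo. The chirp parameters, noise level, and echo delay below are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(4)
fs = 1.0e6                                    # sample rate (Hz), assumed
T = 1.0e-3                                    # chirp duration (s), assumed
t = np.arange(int(fs * T)) / fs
# Linear chirp sweeping 40 kHz over the pulse (40 -> 80 kHz).
chirp = np.sin(2 * np.pi * (40e3 * t + 0.5 * (40e3 / T) * t ** 2))

delay = 700                                   # echo position in samples
rx = np.zeros(4000)
rx[delay:delay + chirp.size] += chirp         # received echo
rx += rng.normal(0.0, 0.5, rx.size)           # receiver noise

# Full-precision pulse compression vs. the one-bit polarity correlator,
# which keeps only the sign of each sample before correlating.
full = np.correlate(rx, chirp, mode="valid")
onebit = np.correlate(np.sign(rx), np.sign(chirp), mode="valid")

print(int(full.argmax()), int(onebit.argmax()))
```

The polarity correlator needs only comparators and counters in hardware, which is the practical appeal the paper quantifies.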

  5. Hydrodynamic body shape analysis and their impact on swimming performance.

    PubMed

    Li, Tian-Zeng; Zhan, Jie-Min

    2015-01-01

This study presents the hydrodynamic characteristics of different adult male swimmers' body shapes using a computational fluid dynamics method. The simulation is carried out with the CFD Fluent code, solving the 3D incompressible Navier-Stokes equations with the RNG k-ε turbulence closure. The water free surface is captured by the volume of fluid (VOF) method. A set of full-body models, based on the anthropometric characteristics of the most common male swimmers, is created with the Computer Aided Industrial Design (CAID) software Rhinoceros. The analysis of the CFD results revealed that a swimmer's body shape has a noticeable effect on hydrodynamic performance. This explains why a male swimmer with an inverted-triangle body shape has good hydrodynamic characteristics for competitive swimming.

  6. Performance Analysis of Paraboloidal Reflector Antennas in Radio Telescopes

    NASA Astrophysics Data System (ADS)

    Yeap, Kim Ho; Law, Young Hui; Rizman, Zairi Ismael; Cheong, Yuen Kiat; Ong, Chu En; Chong, Kok Hen

    2013-10-01

In this paper, we present an analysis of the performance of the three most commonly used paraboloidal reflector antennas in radio telescopes - i.e. the prime focus, Cassegrain, and Gregorian antennas. In our study, we have adopted the design parameters for the Cassegrain configuration used in the Atacama Large Millimeter Array (ALMA) project. The parameters are subsequently re-calculated so as to meet the design requirements of the Gregorian and prime focus configurations. The simulation results obtained from GRASP reveal that the prime focus configuration produces the lowest side lobes and the highest main lobe level. Such a configuration, however, has the disadvantage of being highly susceptible to thermal ground noise radiation. The radiation characteristics produced by the Cassegrain and Gregorian configurations are very close to each other. Indeed, the results show no significant advantage of one design over the other. Hence, we conclude that both configurations are comparable in the application of radio telescopes.

  7. Automotive Gas Turbine Power System-Performance Analysis Code

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert J.

    1997-01-01

An open cycle gas turbine numerical modelling code suitable for thermodynamic performance analysis (i.e. thermal efficiency, specific fuel consumption, cycle state points, working fluid flowrates, etc.) of automotive and aircraft powerplant applications has been generated at the NASA Lewis Research Center's Power Technology Division. This code can be made available to automotive gas turbine preliminary design efforts, either in its present version or, assuming that resources can be obtained to incorporate empirical models for component weight and packaging volume, in a later version that includes the weight-volume estimator feature. The paper contains a brief discussion of the capabilities of the presently operational version of the code, including a listing of input and output parameters and actual sample output listings.
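The cycle state-point arithmetic such a code automates can be sketched for a simple open Brayton cycle with non-ideal components. The pressure ratio, temperatures, and component efficiencies below are invented illustrative inputs, not values from the NASA code.

```python
# Open Brayton cycle state points with isentropic component efficiencies.
# States: 1 compressor inlet, 2 compressor exit, 3 turbine inlet, 4 turbine exit.
gamma, cp = 1.4, 1005.0          # air properties, J/(kg K)
pr = 8.0                         # compressor pressure ratio (assumed)
T1, T3 = 288.0, 1200.0           # inlet and turbine-entry temperatures (K)
eta_c, eta_t = 0.85, 0.90        # isentropic efficiencies (assumed)

tau = pr ** ((gamma - 1.0) / gamma)           # isentropic temperature ratio
T2 = T1 + T1 * (tau - 1.0) / eta_c            # compressor exit (real)
T4 = T3 - eta_t * T3 * (1.0 - 1.0 / tau)      # turbine exit (real)

w_net = cp * ((T3 - T4) - (T2 - T1))          # specific net work, J/kg
q_in = cp * (T3 - T2)                         # heat added in the combustor
eta_th = w_net / q_in                         # cycle thermal efficiency
print(round(eta_th, 3), round(w_net / 1000.0, 1))
```

From the same state points one can also derive specific fuel consumption once a fuel heating value is assumed, which is exactly the chain of outputs the abstract lists.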

  9. Seismic performance analysis of Tendaho earth fill dam, Ethiopia.

    NASA Astrophysics Data System (ADS)

    Berhe, T.; Wu, W.

    2009-04-01

The Tendaho dam is located in the Afar regional state in northeastern Ethiopia, within an area known as the 'Tendaho Graben', which forms the center of the Afar triangle, a low-lying area of land where the East African, Red Sea, and Gulf of Aden rift systems converge. The dam is an earthfill dam with a volume of about 4 million cubic meters and a mixed clay core. The geological setting of the dam site, the geotechnical properties of the dam materials, and the seismicity of the region are reviewed. Based on this review, the foundation materials and dam body include some liquefiable granular soils. Moreover, the active East African Rift Valley fault, which can generate an earthquake of magnitude greater than 6, passes through the dam body; this fault is the primary seismic source contributing to the hazard at the Tendaho dam site. The presence of liquefiable materials beneath and within the dam body, together with the active fault crossing the dam site, demands a thorough seismic analysis of the dam. The peak ground acceleration (PGA) is selected as a measure of ground motion severity, chosen according to the guidelines of the International Commission on Large Dams (ICOLD). Based on the criteria set by ICOLD, the dam is analyzed for two different earthquake levels, the Maximum Credible Earthquake (MCE) and the Operating Basis Earthquake (OBE). Numerical codes are useful tools to investigate the safety of dams in seismically active areas. In this paper, the FLAC3D numerical tool is used to investigate the performance of the dam under dynamic loading. Based on the numerical analysis, the seismic performance of the dam is assessed.

  10. Visualization and Analysis of Climate Simulation Performance Data

    NASA Astrophysics Data System (ADS)

    Röber, Niklas; Adamidis, Panagiotis; Behrens, Jörg

    2015-04-01

    Visualization is the key process of transforming abstract (scientific) data into a graphical representation to aid in the understanding of the information hidden within the data. Climate simulation data sets are typically quite large, time varying, and consist of many different variables sampled on an underlying grid. A large variety of climate models - and sub-models - exist to simulate various aspects of the climate system. Generally, one is mainly interested in the physical variables produced by the simulation runs, but model developers are also interested in performance data measured along with these simulations. Climate simulation models are carefully developed, complex software systems designed to run in parallel on large HPC systems. An important goal is to utilize the hardware as efficiently as possible, that is, to distribute the workload as evenly as possible among the individual components. This is a very challenging task, and detailed performance data, such as timings and cache misses, must be used to locate and understand performance problems in order to optimize the model implementation. Furthermore, correlating the performance data with the processes of the application and the sub-domains of the decomposed underlying grid is vital when addressing communication and load-imbalance issues. High-resolution climate simulations are carried out on tens to hundreds of thousands of cores, yielding a vast amount of profiling data that cannot be analyzed without appropriate visualization techniques. This PICO presentation displays and discusses the ICON simulation model, which is jointly developed by the Max Planck Institute for Meteorology and the German Weather Service in partnership with DKRZ. The visualization and analysis of the model's performance data allow us to optimize and fine-tune the model, as well as to understand its execution on the HPC system. We show and discuss our workflow, as well as present new ideas and

  11. A performance-based approach to landslide risk analysis

    NASA Astrophysics Data System (ADS)

    Romeo, R. W.

    2009-04-01

    An approach to risk assessment based on a probabilistic analysis of the performance of structures threatened by landslides is presented and discussed. Risk is a possible loss due to the occurrence of a potentially damaging event. Analytically, risk is the probability convolution of hazard, which defines the frequency of occurrence of the event (i.e., the demand), and fragility, which defines the capacity of the system to withstand the event given its characteristics (i.e., severity) and those of the exposed goods (vulnerability), that is: Risk = P(D ≥ d | S, V). The inequality sets a damage (or loss) threshold beyond which the system's performance is no longer met. Therefore a consistent approach to risk assessment should: 1) adopt a probabilistic model which takes into account all the uncertainties of the involved variables (capacity and demand), and 2) follow a performance approach based on given loss or damage thresholds. The proposed method belongs to the category of semi-empirical methods: the theoretical component is given by the probabilistic capacity-demand model; the empirical component is given by the observed statistical behaviour of structures damaged by landslides. Only two landslide properties are required: the area-extent and the type (or kinematics). All other properties required to determine the severity of landslides (such as depth, speed and frequency) are derived via probabilistic methods. The severity (or intensity) of landslides, in terms of kinetic energy, is the demand of resistance; the resistance capacity is given by the cumulative distribution functions of the limit-state performance (fragility functions) assessed via damage surveys and card compilation. The investigated limit states are aesthetic (of nominal concern alone), functional (interruption of service), and structural (economic and social losses). 
The damage probability is the probabilistic convolution of hazard (the probability mass function of the frequency of occurrence of given
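    The hazard-fragility convolution described above can be sketched numerically as a discrete sum over severity classes. The severity bins, hazard probabilities, and fragility values below are hypothetical, not the paper's data.

```python
# Illustrative sketch: damage probability as the discrete convolution of a
# hazard curve with a fragility function. All numbers are hypothetical.
severity = [1.0, 2.0, 3.0, 4.0]          # e.g. kinetic-energy classes
hazard = [0.50, 0.30, 0.15, 0.05]        # P(event of each severity); sums to 1
fragility = [0.01, 0.10, 0.40, 0.80]     # P(D >= d | severity) per class

# Risk = sum over severities of P(severity) * P(D >= d | severity)
risk = sum(h * f for h, f in zip(hazard, fragility))
```

In practice each limit state (aesthetic, functional, structural) would have its own fragility function, giving one such sum per damage threshold.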

  12. Performance analysis and optimization of power plants with gas turbines

    NASA Astrophysics Data System (ADS)

    Besharati-Givi, Maryam

    The gas turbine is one of the most important applications for power generation. The purpose of this research is the performance analysis and optimization of power plants using different design systems at different operating conditions. In this research, accurate efficiency calculation and the determination of optimum efficiency values for the design of chiller inlet cooling and blade-cooled gas turbines are investigated. This research shows how it is possible to find the optimum design for different operating conditions, such as ambient temperature, relative humidity, turbine inlet temperature, and compressor pressure ratio. The simulated designs include chiller inlet cooling with varied COP, and fogging cooling for the compressor. In addition, the overall thermal efficiency is improved by adding design systems such as reheat and regenerative heating. The other goal of this research focuses on the blade-cooled gas turbine for higher turbine inlet temperature and, consequently, higher efficiency. New film cooling equations, along with a varying film cooling effectiveness for the optimum cooling-air requirement at the first-stage blades, and internal and trailing-edge cooling for the second stage, are introduced for optimal efficiency calculation. This research sets the groundwork for using the optimum value of efficiency calculation while using inlet cooling and blade cooling designs. In the final step, the designed gas-cycle systems are combined with a steam cycle for performance improvement.

  13. Magnetohydrodynamic Augmented Propulsion Experiment: I. Performance Analysis and Design

    NASA Technical Reports Server (NTRS)

    Litchford, R. J.; Cole, J. W.; Lineberry, J. T.; Chapman, J. N.; Schmidt, H. J.; Lineberry, C. W.

    2003-01-01

    The performance of conventional thermal propulsion systems is fundamentally constrained by the specific energy limitations associated with chemical fuels and the thermal limits of available materials. Electromagnetic thrust augmentation represents one intriguing possibility for improving the fuel composition of thermal propulsion systems, thereby increasing overall specific energy characteristics; however, realization of such a system requires an extremely high-energy-density electrical power source as well as an efficient plasma acceleration device. This Technical Publication describes the development of an experimental research facility for investigating the use of cross-field magnetohydrodynamic (MHD) accelerators as a possible thrust augmentation device for thermal propulsion systems. In this experiment, a 1.5-MW(sub e) Aerotherm arc heater is used to drive a 2-MW(sub e) MHD accelerator. The heatsink MHD accelerator is configured as an externally diagonalized, segmented channel, which is inserted into a large-bore, 2-T electromagnet. The performance analysis and engineering design of the flow path are described as well as the parameter measurements and flow diagnostics planned for the initial series of test runs.

  14. Performance analysis of a digital capacitance measuring circuit

    NASA Astrophysics Data System (ADS)

    Xu, Lijun; Sun, Shijie; Cao, Zhang; Yang, Wuqiang

    2015-05-01

    This paper presents the design and study of a digital capacitance measuring circuit with theoretical analysis, numerical simulation, and experimental evaluation. The static and dynamic performances of the capacitance measuring circuit are first defined, including signal-to-noise ratio (SNR), standard deviation, accuracy, linearity, sensitivity, and response time, within a given measurement range. Then numerical simulation is carried out to analyze the SNR and standard deviation of the circuit, followed by experiments to validate the overall performance of the circuit. The simulation results show that when the standard deviation of noise is 0.08 mV and the measured capacitance decreases from 6 pF to 3 fF, the SNR decreases from 90 dB to 22 dB and the standard deviation is between 0.17 fF and 0.24 fF. The experimental results show that when the measured capacitance decreases from 6 pF to 40 fF and the data sampled in a single period are used for demodulation, the SNR decreases from 88 dB to 40 dB and the standard deviation is between 0.18 fF and 0.25 fF. The maximum absolute error and relative error are 5.12 fF and 1.26%, respectively. The SNR and standard deviation can be further improved if the data sampled in more than one period are used for demodulation by the circuit.
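    As a rough sketch of the static figures of merit defined above, the snippet below estimates the standard deviation and SNR of a simulated capacitance reading. The 6 pF level and 0.2 fF noise are assumed values chosen to resemble the reported range, not measurements from the circuit.

```python
# Hedged sketch: SNR (in dB) and standard deviation of a simulated
# capacitance measurement. Signal level and noise are assumed values.
import math
import random

random.seed(0)
c_true_fF = 6000.0          # 6 pF expressed in fF (assumed)
noise_std_fF = 0.2          # assumed demodulation noise, fF
readings = [c_true_fF + random.gauss(0.0, noise_std_fF) for _ in range(10000)]

mean_c = sum(readings) / len(readings)
std_c = math.sqrt(sum((r - mean_c) ** 2 for r in readings)
                  / (len(readings) - 1))
snr_db = 20.0 * math.log10(mean_c / std_c)   # SNR in decibels
```

With these assumed values the SNR lands near 90 dB, consistent in scale with the 6 pF figures quoted in the abstract; smaller capacitances at the same noise floor would give proportionally lower SNR.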

  15. Performance analysis for second-design space Stirling engine model

    NASA Astrophysics Data System (ADS)

    Ogiwara, Sachio; Fujiwara, Tsutomu; Eguchi, Kunihisa; Nakamura, Yoshihiro

    A hybrid free-piston Stirling research engine, called NALSEM 125, has been tested since 1988 as part of a solar dynamic power technology program. It is a gamma-type Stirling-driven linear-alternator machine with helium as the working fluid. The objective of the experimental program is to understand the thermodynamic and dynamic mechanisms of the free-piston engine integrated with a moving-magnet alternator. After the first-phase engine experiments on NALSEM 125, a second-design Stirling engine, NALSEM 125R, has been tested. Using a second-order analytical tool, design modifications were made to provide much more stable dynamic operation over the required operating range, as well as to incorporate an electric heater head simulating the hot interface of 12 sodium heat pipes. Described in this paper are thermodynamic performance data from NALSEM 125R operations, which are also compared with the computational analysis, considering the power losses resulting from pressure drop and gas leakage.

  16. RWMC Performance Assessment/Composite Analysis Monitoring Report - FY-2002

    SciTech Connect

    Ritter, P.D.; Parsons, A.M.

    2002-09-30

    US DOE Order 435.1, Radioactive Waste Management, Chapter IV and the associated implementation manual and guidance require monitoring of low-level radioactive waste (LLW) disposal facilities. The Performance Assessment/Composite Analysis (PA/CA) Monitoring program was developed and implemented to meet this requirement. This report represents the results of PA/CA monitoring projects that are available as of September 2002. The technical basis for the PA/CA program is provided in the PA/CA Monitoring Program document and a program description document (PDD) serves as the quality assurance project plan for implementing the PM program. Subsurface monitoring, air pathway surveillance, and subsidence monitoring/control are required to comply with DOE Order 435.1, Chapter IV. Subsidence monitoring/control and air pathway surveillance are performed entirely by other INEEL programs - their work is summarized herein. Subsurface monitoring includes near-field (source) monitoring of buried activated beryllium and steel, monitoring of groundwater in the vadose zone, and monitoring of the Snake River Plain Aquifer. Most of the required subsurface monitoring information presented in this report was gathered from the results of ongoing INEEL monitoring programs. This report also presents results for several new monitoring efforts that have been initiated to characterize any migration of radionuclides in surface sediment near the waste.

  17. Scalable Analysis Techniques for Microprocessor Performance Counter Metrics

    SciTech Connect

    Ahn, D H; Vetter, J S

    2002-07-24

    Contemporary microprocessors provide a rich set of integrated performance counters that allow application developers and system architects alike the opportunity to gather important information about workload behaviors. These counters can capture instruction, memory, and operating system behaviors. Current techniques for analyzing data produced from these counters use raw counts, ratios, and visualization techniques to help users make decisions about their application source code. While these techniques are appropriate for analyzing data from one process, they do not scale easily to the new levels demanded by contemporary computing systems. Indeed, the amount of data generated by these experiments is on the order of tens of thousands of data points. Furthermore, if users execute multiple experiments, then we add yet another dimension to this already knotty picture. This flood of multidimensional data can swamp efforts to harvest important ideas from these valuable counters. Very simply, this paper addresses these concerns by evaluating several multivariate statistical techniques on these datasets. We find that several techniques, such as statistical clustering, can automatically extract important features from this data. These derived results can, in turn, be fed directly back to an application developer, or used as input to a more comprehensive performance analysis environment, such as a visualization or an expert system.
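    One of the multivariate techniques named above, statistical clustering, can be sketched on synthetic counter vectors as follows. The two "behaviours", the choice of counters, and the deterministic seeding of k-means are all assumptions for illustration; the paper works with real counter data and a broader statistical toolbox.

```python
# Sketch of the idea only: cluster per-process hardware-counter vectors so
# that processes with similar behaviour group together. Data is synthetic.
import numpy as np

rng = np.random.default_rng(42)
# two synthetic "behaviours": compute-bound vs memory-bound processes
# columns: [instructions retired, cache misses] (normalized, hypothetical)
compute = rng.normal([1.0, 0.1], 0.05, size=(20, 2))
memory = rng.normal([0.4, 0.9], 0.05, size=(20, 2))
X = np.vstack([compute, memory])

def kmeans(X, k, iters=20):
    """Plain k-means; seeded with the first and last points for determinism."""
    centers = X[[0, -1]].copy() if k == 2 else X[:k].copy()
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

labels, centers = kmeans(X, k=2)
```

With two well-separated behaviours the cluster labels recover the grouping automatically, which is the "important feature extraction" the abstract refers to.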

  18. Performance Analysis of a NASA Integrated Network Array

    NASA Technical Reports Server (NTRS)

    Nessel, James A.

    2012-01-01

    The Space Communications and Navigation (SCaN) Program is planning to integrate its individual networks into a unified network that will function as a single entity to provide services to user missions. This integrated network architecture is expected to provide SCaN customers with the capability to seamlessly use any of the available SCaN assets to support their missions, efficiently meeting the collective needs of Agency missions. One potentially optimal application of these assets, based on this envisioned architecture, is arraying across existing networks to significantly enhance data rates and/or link availabilities. As such, this document provides an analysis of the transmit and receive performance of a proposed SCaN inter-network antenna array. From the study, it is determined that a fully integrated inter-network array does not provide any significant advantage over an intra-network array, one in which the assets of an individual network are arrayed for enhanced performance. Therefore, it is the recommendation of this study that NASA proceed with an arraying concept focused on network-centric arraying.

  19. Insomnia and daytime cognitive performance: a meta-analysis.

    PubMed

    Fortier-Brochu, Emilie; Beaulieu-Bonneau, Simon; Ivers, Hans; Morin, Charles M

    2012-02-01

    Individuals with insomnia consistently report difficulties pertaining to their cognitive functioning (e.g., memory, concentration). However, objective measurements of their performance on neuropsychological tests have produced inconsistent findings. This meta-analysis was conducted to provide a quantitative summary of evidence regarding the magnitude of differences between individuals with primary insomnia and normal sleepers on a broad range of neuropsychological measures. Reference databases (PubMed, PsycInfo, Dissertation Abstracts International) were searched for studies comparing adults with primary insomnia to normal sleepers on neuropsychological measures. Dependent variables related to cognitive and psychomotor performance were extracted from each study. Variables were classified independently by two licensed neuropsychologists according to the main cognitive function being measured. Individual effect sizes (Cohen's d) were weighted by variability and combined for each cognitive function using a fixed effects model. Average effect sizes and their 95% confidence intervals were computed for each cognitive function. Twenty-four studies met inclusion criteria, for a total of 639 individuals with insomnia and 558 normal sleepers. Significant impairments (p<0.05) of small to moderate magnitude were found in individuals with insomnia for tasks assessing episodic memory (ES = -0.51), problem solving (ES = -0.42), manipulation in working memory (ES = -0.42), and retention in working memory (ES = -0.22). No significant group differences were observed for tasks assessing general cognitive function, perceptual and psychomotor processes, procedural learning, verbal functions, different dimensions of attention (alertness, complex reaction time, speed of information processing, selective attention, sustained attention/vigilance) and some aspects of executive functioning (verbal fluency, cognitive flexibility). 
Individuals with insomnia exhibit performance impairments for
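    The fixed-effects combination step described above (inverse-variance weighting of per-study effect sizes) can be sketched as follows. The three study effect sizes and variances are invented for illustration, not taken from the meta-analysis.

```python
# Hedged sketch of fixed-effects pooling: each study's effect size is
# weighted by the inverse of its variance. Study values are hypothetical.
import math

# (effect size d, variance of d) for three hypothetical studies
studies = [(-0.55, 0.04), (-0.48, 0.09), (-0.50, 0.06)]

weights = [1.0 / v for _, v in studies]
pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
se = math.sqrt(1.0 / sum(weights))                 # pooled standard error
ci95 = (pooled - 1.96 * se, pooled + 1.96 * se)    # 95% confidence interval
```

A pooled estimate whose 95% interval excludes zero corresponds to the "significant impairment" criterion used in the abstract.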

  20. Physiological stress and performance analysis to karate combat.

    PubMed

    Chaabene, Helmi; Hellara, Ilhem; Ghali, Faten B; Franchini, Emerson; Neffati, Fedoua; Tabben, Montassar; Najjar, Mohamed F; Hachana, Younés

    2016-10-01

    This study aimed to evaluate the relationship between physiological parameters and performance-analysis parameters during karate contests. Nine elite-level karate athletes participated in this study. Saliva samples were collected pre- and post-combat. Salivary cortisol (sC) post-combat 2 rose significantly compared with that recorded at pre-combat 1 (Δ%=105.3%; P=0.04; dz=0.78). The largest decrease of the salivary T/C ratio (sR) compared with pre-combat 1 was recorded post-combat 2 (Δ%=-43.5%; P=0.03). Moreover, blood lactate concentration post-combat 1 correlated positively with sC post-combat 1 (r=0.66; P=0.05) and negatively with both salivary testosterone (sT) (r=-0.76; P=0.01) and sR post-combat 1 (r=-0.76; P=0.01). There was no significant relationship between hormonal measures and parameters of match analysis. Although conducted under simulated conditions, karate combat imposes a large physiological stress on the karateka. Additionally, the physiological strain of karate combat led to a catabolic hormonal response.

  1. Analysis of beamed-energy ramjet/scramjet performance

    NASA Technical Reports Server (NTRS)

    Myrabo, L. N.; Powers, M. V.; Zaretzky, C. L.

    1986-01-01

    A study has been performed on a laser-heated ramjet/scramjet vehicle concept for propulsion during the air-breathing portion of an orbital launch trajectory. The concept considers axisymmetric, high-thrust vehicles with external inlets and nozzles. Conceptual design and ramjet/scramjet cycle analysis are emphasized, with propulsive energy provided by beamed laser power rather than by combustion of on-board fuel: the conventional ramjet/scramjet combustion chamber is replaced by a laser energy absorption chamber. The elimination of on-board propellant can result in very high thrust-to-weight ratios and payload fractions in a vehicle with a relatively small degree of mechanical complexity. The basic vehicle has a weight of 12,250 lbf and a diameter of 5 meters, which is close to the size of the Apollo command module. The ramjet calculations are based on a Mach 3 isentropic inlet with a 13.7-degree half-angle conical tip. The scramjet analysis considers conical inlets with 10-, 15-, and 30-degree half-angles. Flight Mach numbers from 2 to 20 are considered in the calculations.

  2. Performance analysis and experiment validation of a pneumatic vibration isolator

    NASA Astrophysics Data System (ADS)

    Yang, Yuanyuan; Tan, Jiubin; Wang, Lei; Zhou, Tong

    2015-02-01

    A performance analysis and experimental validation of a pneumatic vibration isolator (PVI) applied in the wafer stage of a lithography machine are presented in this work. The wafer stage is a dual-stage actuator system comprising a long-stroke stage (LS) and a short-stroke stage (SS). To achieve nanometer-level positioning, the isolator is designed to reduce the transmission of LS excitations to the SS. In addition, because the SS has six degrees of freedom and must be kept in a strictly constant-temperature environment, the isolator needs to provide two functions: decoupling of vertical and horizontal motion, and gravity compensation. In this isolator, a biaxial hinge was designed to decouple the vertical rotational freedom, and a gas bearing was designed to decouple horizontal motion. The stiffness and damping of the pneumatic vibration isolator were analyzed, and an analysis of the natural frequency and vibration transmissibility of the isolator is presented. The results show that vibration transmission is reduced significantly by the isolator and that the natural frequency can be lower than 0.6 Hz; the experimental results accord with the prediction model.
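    A minimal sketch of the single-degree-of-freedom quantities analyzed above: natural frequency and base-excitation transmissibility of a spring-damper isolator. The mass, stiffness, and damping values are assumptions chosen only so the natural frequency falls below the 0.6 Hz mentioned, not parameters of the actual PVI.

```python
# Rough sketch, not the paper's model: natural frequency and absolute
# transmissibility of a single-DOF spring-damper isolator.
import math

m = 1000.0       # isolated mass, kg (assumed)
k = 10000.0      # effective pneumatic stiffness, N/m (assumed)
zeta = 0.1       # damping ratio (assumed)

f_n = math.sqrt(k / m) / (2.0 * math.pi)   # natural frequency, Hz

def transmissibility(f):
    """|X/Y| for base excitation at frequency f (Hz)."""
    r = f / f_n  # frequency ratio
    num = 1.0 + (2.0 * zeta * r) ** 2
    den = (1.0 - r**2) ** 2 + (2.0 * zeta * r) ** 2
    return math.sqrt(num / den)
```

Well above the natural frequency the transmissibility falls far below one, which is the isolation effect the abstract reports; near the natural frequency it exceeds one, which is why a low f_n is the design target.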

  3. Comparative performance analysis: Commercial cut-flower rose production

    SciTech Connect

    Whittier, J.; Fischer, C.L.

    1990-04-01

    A comparative performance analysis has been conducted to examine the various factors associated with establishing and operating a commercial cut-flower rose greenhouse in ten different locations across the United States. Plant productivity, defined as net blooms produced per plant per year, is largely dependent upon local climatic conditions and technological improvements. Regional variations in productivity have been explicitly analyzed. The greenhouse operation is assumed to be four acres in size, and the facilities utilize current technologies. The operation is designed as a professionally organized company with an owner/manager, a grower, and a salesperson. The primary product is a red hybrid tea rose for sale. Selling markets vary by location, but in general they are large metropolitan areas. The analysis strongly indicates that new installations for cut-flower rose production are profitable in several areas of the U.S. Southwest, particularly in New Mexico, Arizona, and Texas. No single location stands out as clearly favored, although Las Cruces, New Mexico, shows the highest net present value and return on investment. 68 refs., 1 fig., 8 tabs.

  4. The Current State of Human Performance Technology: A Citation Network Analysis of "Performance Improvement Quarterly," 1988-2010

    ERIC Educational Resources Information Center

    Cho, Yonjoo; Jo, Sung Jun; Park, Sunyoung; Kang, Ingu; Chen, Zengguan

    2011-01-01

    This study conducted a citation network analysis (CNA) of human performance technology (HPT) to examine the current state of the field. Previous reviews of the field have used traditional research methods, such as content analysis, survey, Delphi, and citation analysis. The distinctive features of CNA come from using a social network analysis…

  5. Effects of lithium on cognitive performance: a meta-analysis.

    PubMed

    Wingo, Aliza P; Wingo, Thomas S; Harvey, Philip D; Baldessarini, Ross J

    2009-11-01

    Cognitive impairment is underrecognized among patients with bipolar disorder and may represent not only effects of the illness but also adverse effects of its treatments. Among these, lithium is the best-studied mood stabilizer. As its cognitive effects are mixed and not well-known, we assessed reported effects of lithium on cognitive performance. MEDLINE, PsycINFO, and EMBASE databases (1950 to December 2008) were queried with the keywords lithium, cognit*, neurocognit*, neuropsych*, psycholog*, attention, concentration, processing speed, memory, executive, and learning. Database searches were supplemented with bibliographic cross-referencing by hand. The literature search was conducted independently by 2 authors (A.P.W. and T.S.W.) during August and September 2008, and questions about appropriate inclusion or exclusion were resolved between them by consensus. Of 586 reports initially identified as being of potential interest, 12, involving 539 subjects, met our inclusion criteria: (1) cognitive performance compared between subjects taking lithium and comparable subjects not taking lithium; comparability was assured by: (2) patients with the same affective disorder diagnoses in euthymic or remitted status or healthy volunteers; (3) groups of similar age and sex; (4) similar intelligence, education, or occupation; (5) similar distribution of other concurrent psychotropic drugs; and (6) cognitive abilities (outcomes) assessed with performance-based measures. Standardized mean-difference effect size (ES), corrected for small-sample bias (Hedges' g), was computed for cognitive tasks in each study. ES estimates were transformed so that positive values indicate poorer performance by lithium-treated subjects. Infrequently, when means and standard deviations were not provided, ES was estimated from reported values of t, F, or z tests. For analysis, similar neurocognitive tests were grouped a priori based on the cognitive domains they aimed to assess. We identified 12
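    The small-sample correction named above (Cohen's d corrected to Hedges' g) can be sketched as follows. The group means, standard deviations, and sample sizes are invented for illustration, not drawn from the included studies.

```python
# Sketch of the effect-size computation: Cohen's d from group statistics,
# then the small-sample bias correction giving Hedges' g. Values invented.
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference using the pooled standard deviation."""
    sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (m1 - m2) / sp

def hedges_g(d, n1, n2):
    """Apply the small-sample bias correction factor J to Cohen's d."""
    J = 1.0 - 3.0 / (4.0 * (n1 + n2) - 9.0)
    return J * d

# hypothetical lithium vs. control groups on one cognitive task
d = cohens_d(98.0, 10.0, 15, 102.0, 10.0, 15)
g = hedges_g(d, 15, 15)
```

Per the sign convention in the abstract, the transformed value would then be flipped so that positive values indicate poorer performance by the lithium-treated group.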

  6. Analysis of correlation between corneal topographical data and visual performance

    NASA Astrophysics Data System (ADS)

    Zhou, Chuanqing; Yu, Lei; Ren, Qiushi

    2007-02-01

    Purpose: To study the correlations among corneal asphericity, higher-order aberrations, and visual performance for eyes with virgin myopia and after laser in situ keratomileusis (LASIK). Methods: A total of 320 candidates (590 eyes) for LASIK treatment were included in this study. The mean preoperative spherical equivalent was -4.35+/-1.51 D (-1.25 to -9.75), with astigmatism less than 2.5 D. Corneal topography maps and contrast sensitivity were measured and analyzed for every eye before and one year after LASIK for the analysis of corneal asphericity and wavefront aberrations. Results: Preoperatively, only the 4th- and 6th-order aberrations had significant correlation with corneal asphericity and apical radius of curvature (p<0.001). Postoperatively, all 3rd- to 6th-order aberrations had statistically significant correlation with corneal asphericity (p<0.01), but only the 4th- and 6th-order aberrations had significant correlation with apical radius of curvature (p<0.05). Asymmetrical aberrations such as coma had significant correlation with the vertical offset of the pupil center (p<0.01). Preoperatively, corneal aberrations had no significant correlation with visual acuity or the area under the log contrast sensitivity function (AULCSF) (P>0.05). Postoperatively, corneal aberrations still had no significant correlation with visual acuity (P>0.05), but had a significantly negative correlation with AULCSF (P<0.01). Corneal asphericity had no significant correlation with AULCSF before or after the treatment (P>0.05). Conclusions: Corneal aberrations correlated differently with corneal profile and visual performance for eyes with virgin myopia and after LASIK, which may be due to the changed corneal profile and limitations of the metrics of corneal aberrations.

  7. Analysis of Student Performance in Peer Led Undergraduate Supplements

    NASA Astrophysics Data System (ADS)

    Gardner, Linda M.

    Foundations of Chemistry courses at the University of Kansas have traditionally accommodated nearly 1,000 individual students every year with a single course in a large lecture hall. To develop a more student-centered learning atmosphere, Peer-Led Undergraduate Supplements (PLUS) were introduced to assist students, starting in the spring of 2010. PLUS was derived from the better-known Peer-Led Team Learning, with modifications to meet the specific needs of the university and the students. The yearlong investigation of PLUS Chemistry began in the fall of 2012 to allow for adequate development of materials and training of peer leaders. We examined the impact on academic achievement for students who attended PLUS sessions while controlling for high school GPA, math ACT scores, credit hours earned in high school, completion of calculus, gender, and aspiration to be a pharmacist (i.e., pre-pharmacy students). In a linear least-squares multiple regression, PLUS participants scored on average one percent higher on exams in Chemistry 184, and four tenths of a percent higher in Chemistry 188, for each PLUS session attended. Pre-pharmacy status moderated the effect of PLUS attendance on chemistry achievement, ultimately negating any relative gain associated with attending PLUS sessions. Evidence of a gender difference was demonstrated in the Chemistry 188 model, indicating that females experience a greater benefit from PLUS sessions. Additionally, an item analysis studied the relationship of PLUS material to individual exam items. Students who attended PLUS sessions answered PLUS-related items correctly 10 to 20 percent more often than their comparison group, versus no difference to 10 percent more often for non-PLUS-related items. In summary, PLUS has a positive effect on exam performance in introductory chemistry courses at the University of Kansas.

  8. Measurement Performance of a Computer Assisted Vertebral Motion Analysis System.

    PubMed

    Davis, Reginald J; Lee, David C; Wade, Chip; Cheng, Boyle

    2015-01-01

    Segmental instability of the lumbar spine is a significant cost within the US health care system; however, current thresholds for indication of radiographic instability are not well defined. To determine the performance measurements of sagittal lumbar intervertebral measurements using computer-assisted measurements of the lumbar spine using motion sequences from a video-fluoroscopic technique. Sensitivity, specificity, predictive values, prevalence, and test-retest reliability evaluation of digitized manual versus computer-assisted measurements of the lumbar spine. A total of 2239 intervertebral levels from 509 symptomatic patients, and 287 intervertebral levels from 73 asymptomatic participants, were retrospectively evaluated. Specificity, sensitivity, negative predictive value (NPV), diagnostic accuracy, and prevalence between the two measurement techniques; measurements of coefficient of repeatability (CR), limits of agreement (LOA), intraclass correlation coefficient (ICC; type 3,1), and standard error of measurement for both measurement techniques. Asymptomatic individuals and symptomatic patients were all evaluated using both the Vertebral Motion Analysis (VMA) system and fluoroscopic flexion extension static radiographs (FE). The analysis was compared to known thresholds of 15% intervertebral translation (IVT, equivalent to 5.3 mm assuming a 35 mm vertebral body depth) and 25° intervertebral rotation (IVR). The VMA measurements demonstrated greater specificity, % change in sensitivity, NPV, prevalence, and reliability compared with FE for radiographic evidence of instability. Specificity was 99.4% and 99.1% in the VMA compared to 98.3% and 98.2% in the FE for IVR and IVT, respectively. Sensitivity in this study was 41.2% and 44.6% greater in the VMA compared to the FE for IVR and IVT, respectively. NPV was 91% and 88% in the VMA compared to 62% and 66% in the FE for IVR and IVT, respectively. Prevalence was 12.3% and 11.9% for the VMA compared to 6.1% and 5
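    The diagnostic metrics reported above follow directly from a 2x2 confusion matrix of detections against a reference standard. The counts below are hypothetical, chosen only to show the arithmetic, not the study's data.

```python
# Illustrative counts only (not the study's data): sensitivity, specificity,
# NPV, prevalence, and accuracy from a 2x2 confusion matrix.
tp, fn = 45, 15    # instability present: detected / missed (hypothetical)
tn, fp = 930, 10   # instability absent: correctly ruled out / false alarms
total = tp + fn + tn + fp

sensitivity = tp / (tp + fn)       # fraction of true instability detected
specificity = tn / (tn + fp)       # fraction of stable levels ruled out
npv = tn / (tn + fn)               # negative predictive value
prevalence = (tp + fn) / total     # fraction of levels truly unstable
accuracy = (tp + tn) / total       # overall fraction classified correctly
```

Note how NPV depends on prevalence as well as on sensitivity and specificity, which is why the abstract reports all of them side by side.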

  9. Measurement Performance of a Computer Assisted Vertebral Motion Analysis System

    PubMed Central

    Davis, Reginald J.; Lee, David C.; Cheng, Boyle

    2015-01-01

    Background Segmental instability of the lumbar spine is a significant cost within the US health care system; however, current thresholds for indication of radiographic instability are not well defined. Purpose To determine the performance measurements of sagittal lumbar intervertebral measurements using computer-assisted measurements of the lumbar spine using motion sequences from a video-fluoroscopic technique. Study design Sensitivity, specificity, predictive values, prevalence, and test-retest reliability evaluation of digitized manual versus computer-assisted measurements of the lumbar spine. Patient sample A total of 2239 intervertebral levels from 509 symptomatic patients, and 287 intervertebral levels from 73 asymptomatic participants, were retrospectively evaluated. Outcome measures Specificity, sensitivity, negative predictive value (NPV), diagnostic accuracy, and prevalence between the two measurement techniques; measurements of coefficient of repeatability (CR), limits of agreement (LOA), intraclass correlation coefficient (ICC; type 3,1), and standard error of measurement for both measurement techniques. Methods Asymptomatic individuals and symptomatic patients were all evaluated using both the Vertebral Motion Analysis (VMA) system and fluoroscopic flexion extension static radiographs (FE). The analysis was compared to known thresholds of 15% intervertebral translation (IVT, equivalent to 5.3 mm assuming a 35 mm vertebral body depth) and 25° intervertebral rotation (IVR). Results The VMA measurements demonstrated greater specificity, % change in sensitivity, NPV, prevalence, and reliability compared with FE for radiographic evidence of instability. Specificity was 99.4% and 99.1% in the VMA compared to 98.3% and 98.2% in the FE for IVR and IVT, respectively. Sensitivity in this study was 41.2% and 44.6% greater in the VMA compared to the FE for IVR and IVT, respectively. NPV was 91% and 88% in the VMA compared to 62% and 66% in the FE for IVR and IVT

  10. Design, fabrication & performance analysis of an unmanned aerial vehicle

    NASA Astrophysics Data System (ADS)

    Khan, M. I.; Salam, M. A.; Afsar, M. R.; Huda, M. N.; Mahmud, T.

    2016-07-01

    An Unmanned Aerial Vehicle was designed, analyzed and fabricated to meet design requirements and perform the entire mission for an international aircraft design competition. The goal was a balanced design possessing good demonstrated flight handling qualities and practical, affordable manufacturing requirements while providing high vehicle performance. The UAV had to complete a total of three missions: a ferry flight (1st mission), a maximum load mission (2nd mission) and an emergency medical mission (3rd mission). The requirement of the ferry flight mission was to fly as many laps as possible within 4 minutes. The maximum load mission consisted of flying 3 laps while carrying two wooden blocks which simulated cargo. The requirement of the emergency medical mission was to complete 3 laps as quickly as possible while carrying two attendants and two patients. A careful analysis revealed lowest rated aircraft cost (RAC) as the primary design objective. So the challenge was to build an aircraft with minimum RAC that could fly fast, fly with maximum payload, and fly fast in all the possible configurations. The aircraft design was reached by first generating numerous design concepts capable of completing the mission requirements. In the conceptual design phase, a Figure of Merit (FOM) analysis was carried out to select the initial aircraft configuration, propulsion, empennage and landing gear. After completion of the conceptual design, preliminary design was carried out. The preliminary design iterations had a low wing loading, a high lift coefficient, and a high thrust-to-weight ratio. To make the aircraft capable of rough-field taxi, springs were added to the landing gear to absorb shock. An airfoil-shaped fuselage was designed to allow sufficient space for payload and generate less drag, making the aircraft fly fast. The final design was a high-wing monoplane with a conventional tail, a single tractor propulsion system and a tail-dragger landing gear. Payload was stored in
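    A Figure of Merit analysis of the kind mentioned is typically a weighted sum of criterion scores per concept. A minimal sketch with hypothetical criteria, weights, and concept scores (not the competition's actual values):

```python
def figure_of_merit(scores, weights):
    """Weighted-sum Figure of Merit: higher is better.
    scores: {concept: {criterion: score}}, weights: {criterion: weight}."""
    return {concept: sum(weights[k] * v for k, v in s.items())
            for concept, s in scores.items()}

# Hypothetical criteria, weights, and concept scores for illustration
weights = {'RAC': 0.5, 'speed': 0.3, 'payload': 0.2}
scores = {'monoplane': {'RAC': 4, 'speed': 3, 'payload': 5},
          'biplane':   {'RAC': 5, 'speed': 2, 'payload': 3}}
fom = figure_of_merit(scores, weights)
best = max(fom, key=fom.get)
```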

  11. Finite element analysis and performance evaluation of synthetic jet actuators

    NASA Astrophysics Data System (ADS)

    Ro, Jeng-Jong; Wu, K. C.

    2002-11-01

    The primary objective of active flow control research is to develop a cost-effective technology that has the potential for revolutionary advances in aerodynamic performance and maneuvering compared to conventional approaches. The development of such systems has many implications for aerospace vehicles, including: reducing mechanical complexity and hydraulic failure, reducing noise and weight, lowering energy and fuel consumption, lowering downtime and maintenance, and enhancing maneuvering and agility with enhanced aerodynamic performance and safety. Interest in active flow control for aerospace applications has stimulated the recent development of innovative actuator designs that create localized disturbances in a flowfield. A novel class of devices, known as synthetic jet actuators, has been demonstrated to exhibit promising flow control capabilities, including separation control and thrust vectoring. The basic components of a synthetic jet actuator are a cavity and oscillating diaphragms. The synthetic jet actuator developed at NASA LaRC has a small housing in which a cylindrical cavity is enclosed by two metal diaphragms, 50 mm in diameter, placed opposite each other. A circular piezoelectric wafer is attached to the center of the outside face of each metal diaphragm. The pair of piezoelectric metal diaphragms is operated with a 180° phase differential at the same sinusoidal voltage and frequency. With actuation, a synthetic jet issues from a 35.5 mm long by 0.5 mm wide slot on the top of the device. In this study, a finite element model of the synthetic jet actuator developed at NASA LaRC is investigated. The developed finite element model can be utilized to design and determine the performance of the synthetic jet actuator. The analysis includes the FE model of the circular plate, the FE model of the piezoelectric actuator/circular plate, the piezoelectric (electrical field)/circular plate (structural field)/cavity (flow field) coupled system and experimental validation. The phase

  12. Advanced Analysis of Finger-Tapping Performance: A Preliminary Study

    PubMed Central

    Barut, Çağatay; Kızıltan, Erhan; Gelir, Ethem; Köktürk, Fürüzan

    2013-01-01

    Background: The finger-tapping test is a commonly employed quantitative assessment tool used to measure motor performance in the upper extremities. This task is a complex motion that is affected by external stimuli, mood and health status. The complexity of this task is difficult to explain with a single average intertap-interval value (the time difference between successive taps), which only provides general information and neglects the temporal effects of the aforementioned factors. Aims: This study evaluated the time course of average intertap-interval values and the patterns of variation in both the right and left hands of right-handed subjects using a computer-based finger-tapping system. Study Design: Cross-sectional study. Methods: Thirty-eight male individuals aged between 20 and 28 years (Mean±SD = 22.24±1.65) participated in the study. Participants were asked to perform a single-finger-tapping test over a 10-second test period. Only the results of the 35 right-handed (RH) participants were considered in this study. The test records the time of each tap and saves the data as the time difference between successive taps for further analysis. The average number of taps and the temporal fluctuation patterns of the intertap-intervals were calculated and compared. The variations in the intertap-interval were evaluated with the best curve fit method. Results: An average tapping speed or tapping rate can reliably be defined for a single-finger tapping test by analysing the graphically presented data of the number of taps within the test period. However, a different presentation of the same data, namely the intertap-interval values, shows temporal variation as the number of taps increases. Curve fitting applications indicate that the variation has a biphasic nature. Conclusion: The measures obtained in this study reflect the complex nature of the finger-tapping task and are suggested to provide reliable information regarding hand performance. Moreover, the
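    One way to fit the temporal variation of intertap-intervals is a least-squares polynomial fit; a quadratic is a simple stand-in for the biphasic trend described. A sketch on synthetic data, not the study's measurements:

```python
import numpy as np

# Synthetic intertap intervals (ms): a quadratic gives a simple biphasic
# (decrease-then-increase) time course over 30 taps.
taps = np.arange(30)
true_coeffs = (0.05, -1.2, 180.0)              # a*t**2 + b*t + c
intervals = np.polyval(true_coeffs, taps)

# Least-squares quadratic fit recovers the generating coefficients
fit_coeffs = np.polyfit(taps, intervals, deg=2)
```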

  13. Thermal Performance Analysis of a Geologic Borehole Repository

    SciTech Connect

    Reagin, Lauren

    2016-08-16

    The Brazilian Nuclear Research Institute (IPEN) proposed a design for the disposal of Disused Sealed Radioactive Sources (DSRS), based on the IAEA Borehole Disposal of Sealed Radioactive Sources (BOSS) design, that would allow the entirety of Brazil's inventory of DSRS to be disposed of in a single borehole. The proposed IPEN design allows for 170 waste packages (WPs) containing DSRS (such as Co-60 and Cs-137) to be stacked on top of each other inside the borehole. The primary objective of this work was to evaluate the thermal performance of a conservative approach to the IPEN proposal with the equivalent of two WPs and two different inside configurations, using Co-60 as the radioactive heat source. The current WP configuration (heterogeneous) for the IPEN proposal has 60% of the WP volume occupied by a nuclear radioactive heat source and the remaining 40% as vacant space. The second configuration (homogeneous) considered for this project was a homogeneous case where 100% of the WP volume was occupied by a nuclear radioactive heat source. The computational models for the thermal analyses of the WP configurations with the Co-60 heat source considered three different cooling mechanisms (conduction, radiation, and convection) and the effect of mesh size on the results from the thermal analysis. The results of the analyses yielded maximum temperatures inside the WPs for both of the WP configurations and various mesh sizes. The heterogeneous WP considered the cooling mechanisms of conduction, convection, and radiation. The temperature results from the heterogeneous WP analysis suggest that the model is cooled predominantly by conduction, with the effects of radiation and natural convection on cooling being negligible. From the thermal analysis comparing the two WP configurations, the results suggest that either WP configuration could be used for the design. The mesh sensitivity results verify the meshes used, and the results obtained from the thermal analyses were close to

  14. Performance Analysis of Intelligent Robust Facility Layout Design

    NASA Astrophysics Data System (ADS)

    Moslemipour, G.; Lee, T. S.; Loong, Y. T.

    2017-03-01

    Design of a robust production facility layout with minimum handling cost (MHC) presents an appropriate approach to tackle facility layout problems in a dynamic volatile environment, in which product demands randomly change in each planning period. The objective of the design is to find the robust facility layout with minimum total material handling cost over the entire multi-period planning horizon. This paper proposes a new mathematical model for designing robust machine layout in the stochastic dynamic environment of manufacturing systems using quadratic assignment problem (QAP) formulation. In this investigation, product demands are assumed to be normally distributed random variables with known expected value, variance, and covariance that randomly change from period to period. The proposed model was verified and validated using randomly generated numerical data and benchmark examples. The effect of dependent product demands and varying interest rate on the total cost function of the proposed model has also been investigated. Sensitivity analysis on the proposed model has been performed. Dynamic programming and simulated annealing optimization algorithms were used in solving the modeled example problems.

  15. Analysis of classifiers performance for classification of potential microcalcification

    NASA Astrophysics Data System (ADS)

    M. N., Arun K.; Sheshadri, H. S.

    2013-07-01

    Breast cancer is a significant public health problem in the world. According to the literature, early detection improves breast cancer prognosis. Mammography is a screening tool used for early detection of breast cancer. About 10-30% of cases are missed during routine checks, as it is difficult for radiologists to make an accurate analysis due to the large amount of data. Microcalcifications (MCs) are considered to be important signs of breast cancer. It has been reported in the literature that 30-50% of breast cancers detected radiographically show MCs on mammograms. Histologic examinations report that 62% to 79% of breast carcinomas reveal MCs. MCs are tiny, vary in size, shape, and distribution, and may be closely connected to surrounding tissues. There is a major challenge in using traditional classifiers for the classification of individual potential MCs, as the processing of mammograms at the appropriate stage generates data sets with an unequal amount of information for the two classes (i.e., MC and Not-MC). Most of the existing state-of-the-art classification approaches are developed by assuming the underlying training set is evenly distributed. However, they face a severe bias problem when the training set is highly imbalanced in distribution. This paper addresses this issue by using classifiers which handle imbalanced data sets. In this paper, we also compare the performance of classifiers used in the classification of potential MCs.
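    A common way classifiers handle such imbalance is inverse-frequency class weighting. A minimal sketch of the "balanced" weighting rule w_c = n / (k * n_c), applied to a hypothetical 10:90 MC/Not-MC split:

```python
from collections import Counter

def balanced_class_weights(labels):
    """'Balanced' class weights: w_c = n_samples / (n_classes * n_c),
    so the rare class counts more in a weighted loss."""
    counts = Counter(labels)
    n, k = len(labels), len(counts)
    return {c: n / (k * m) for c, m in counts.items()}

# Hypothetical imbalanced label set: 10 MC vs 90 Not-MC samples
labels = ['MC'] * 10 + ['Not-MC'] * 90
weights = balanced_class_weights(labels)
```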

  16. Applying importance-performance analysis to patient safety culture.

    PubMed

    Lee, Yii-Ching; Wu, Hsin-Hung; Hsieh, Wan-Lin; Weng, Shao-Jen; Hsieh, Liang-Po; Huang, Chih-Hsuan

    2015-01-01

    Sexton et al.'s (2006) safety attitudes questionnaire (SAQ) has been widely used to assess staff attitudes towards patient safety in healthcare organizations. However, to date there have been few studies that discuss the perceptions of patient safety from both hospital staff and upper management. The purpose of this paper is to improve and develop better strategies regarding patient safety in healthcare organizations. The Chinese version of the SAQ, based on the Taiwan Joint Commission on Hospital Accreditation, is used to evaluate the perceptions of hospital staff. The current study then applies the importance-performance analysis technique to identify the major strengths and weaknesses of the safety culture. The results show that teamwork climate, safety climate, job satisfaction, stress recognition and working conditions are major strengths and should be maintained in order to provide a better patient safety culture. On the contrary, perceptions of management and hospital handoffs and transitions are important weaknesses and should be improved immediately. Research limitations/implications - The research is restricted in generalizability. The assessment of hospital staff in patient safety culture covers physicians and registered nurses. It would be interesting to further evaluate other staff's (e.g. technicians, pharmacists and others) opinions regarding patient safety culture in the hospital. Few studies have clearly evaluated the perceptions of healthcare organization management regarding patient safety culture. Healthcare managers can take more effective actions to improve the level of patient safety by investigating key characteristics (either strengths or weaknesses) that healthcare organizations should focus on.
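    Importance-performance analysis places each dimension in one of four quadrants, split at the grand means of importance and performance. A sketch with hypothetical dimension scores (not the study's data):

```python
def ipa_quadrants(scores):
    """Importance-performance analysis.
    scores: {dimension: (importance, performance)}; quadrant boundaries
    are the grand means of importance and performance."""
    imp_mean = sum(i for i, _ in scores.values()) / len(scores)
    perf_mean = sum(p for _, p in scores.values()) / len(scores)
    labels = {}
    for name, (imp, perf) in scores.items():
        if imp >= imp_mean and perf >= perf_mean:
            labels[name] = 'keep up the good work'
        elif imp >= imp_mean:
            labels[name] = 'concentrate here'     # important but underperforming
        elif perf >= perf_mean:
            labels[name] = 'possible overkill'
        else:
            labels[name] = 'low priority'
    return labels

# Hypothetical (importance, performance) pairs for four SAQ-style dimensions
quadrants = ipa_quadrants({'teamwork': (4.5, 4.2), 'handoffs': (4.4, 3.0),
                           'stress':   (3.0, 4.0), 'other':    (3.1, 3.0)})
```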

  18. Aerothermal Performance Constraint Analysis of Sharp Nosecaps and Leading Edges

    NASA Technical Reports Server (NTRS)

    Rizk, Yehia; Gee, Ken

    2004-01-01

    The main objective of this work is to predict the Aerothermal Performance Constraint (APC) for a class of Crew Transfer Vehicles (CTV) with sharp noses and wing leading edges made of UHTC, a family of Ultra High Temperature Ceramic materials developed at NASA Ames. The APC is based on the theoretical temperature limit of the material, which is usually encountered at the CTV nose or wing leading edge. The APC places a lower limit on the trajectory of the CTV in the altitude-velocity space and is used as one of the constraints in developing reentry and abort trajectories for the CTV. The trajectories are then used to generate the transient thermal response of the nosecaps and wing leading edges, which are represented as either one piece of UHTC or two pieces (UHTC + RCC) with perfect axial contact. The final paper will include more details about the analysis procedure and will also include results for reentry and abort design trajectories.

  19. Analysis of Performance of Stereoscopic-Vision Software

    NASA Technical Reports Server (NTRS)

    Kim, Won; Ansar, Adnan; Steele, Robert; Steinke, Robert

    2007-01-01

    A team of JPL researchers has analyzed stereoscopic vision software and produced a document describing its performance. This software is of the type used in maneuvering exploratory robotic vehicles on Martian terrain. The software in question utilizes correlations between portions of the images recorded by two electronic cameras to compute stereoscopic disparities, which, in conjunction with camera models, are used in computing distances to terrain points to be included in constructing a three-dimensional model of the terrain. The analysis included effects of correlation-window size, a pyramidal image down-sampling scheme, vertical misalignment, focus, maximum disparity, stereo baseline, and range ripples. Contributions of sub-pixel interpolation, vertical misalignment, and foreshortening to stereo correlation error were examined theoretically and experimentally. It was found that camera-calibration inaccuracy contributes to both down-range and cross-range error but stereo correlation error affects only the down-range error. Experimental data for quantifying the stereo disparity error were obtained by use of reflective metrological targets taped to corners of bricks placed at known positions relative to the cameras. For the particular 1,024-by-768-pixel cameras of the system analyzed, the standard deviation of the down-range disparity error was found to be 0.32 pixel.
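    The down-range error follows from the stereo range equation R = fB/d: propagating a disparity error sigma_d to first order gives sigma_R ~ R^2 * sigma_d / (fB). A sketch using the 0.32-pixel figure from the abstract; the focal length and baseline are illustrative, not the analyzed system's calibration values:

```python
def downrange_sigma(range_m, focal_px, baseline_m, disparity_sigma_px):
    """First-order error propagation for stereo ranging:
    R = f*B/d, so sigma_R ~= R**2 * sigma_d / (f * B)."""
    return range_m ** 2 * disparity_sigma_px / (focal_px * baseline_m)

# 0.32 px disparity sigma from the abstract; camera parameters hypothetical
sigma_r = downrange_sigma(range_m=5.0, focal_px=1000.0, baseline_m=0.2,
                          disparity_sigma_px=0.32)
```

Note the quadratic growth with range: doubling the target distance quadruples the down-range uncertainty for the same disparity error.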

  20. 1-D Numerical Analysis of RBCC Engine Performance

    NASA Technical Reports Server (NTRS)

    Han, Samuel S.

    1998-01-01

    An RBCC engine combines air-breathing and rocket engines into a single engine to increase the specific impulse over an entire flight trajectory. Considerable research pertaining to RBCC propulsion was performed during the 1960's, and these engines were revisited recently as a candidate propulsion system for either a single-stage-to-orbit (SSTO) or two-stage-to-orbit (TSTO) launch vehicle. A variety of RBCC configurations have been evaluated, and new designs are currently under development. However, the basic configuration of all RBCC systems is built around the ejector scramjet engine originally developed for the hypersonic airplane. In this configuration, a rocket engine acts as an ejector in the air-augmented initial acceleration mode, as a fuel injector in scramjet mode, and as the rocket in all-rocket mode for orbital insertion. Computational fluid dynamics (CFD) is a useful tool for the analysis of complex transport processes in the various components of RBCC propulsion systems. The objective of the present research was to develop a transient 1-D numerical model that could be used to predict flow behavior throughout a generic RBCC engine following a flight path.

  1. Analysis of Illinois Home Performance with ENERGY STAR® Measure Packages

    SciTech Connect

    Baker, J.; Yee, S.; Brand, L.

    2013-09-01

    Through the Chicagoland Single Family Housing Characterization and Retrofit Prioritization report, the Partnership for Advanced Residential Retrofit research team characterized 15 housing types in the Chicagoland region based on assessor data, utility billing history, and available data from prior energy efficiency programs. Within these 15 groups, a subset showed the greatest opportunity for energy savings based on BEopt Version 1.1 modeling of potential energy efficiency package options and the percent of the housing stock represented by each group. In this project, collected field data from a whole-home program in Illinois are utilized to compare marketplace-installed measures to the energy saving optimal packages previously developed for the 15 housing types. Housing type, conditions, energy efficiency measures installed, and retrofit cost information were collected from 19 homes that participated in the Illinois Home Performance with ENERGY STAR program in 2012, representing eight of the characterized housing groups. Two were selected for further case study analysis to provide an illustration of the differences between optimal and actually installed measures. Taken together, these homes are representative of 34.8% of the Chicagoland residential building stock. In one instance, actual installed measures closely matched optimal recommended measures.

  2. Space rescue system definition (system performance analysis and trades)

    NASA Astrophysics Data System (ADS)

    Housten, Sam; Elsner, Tim; Redler, Ken; Svendsen, Hal; Wenzel, Sheri

    This paper addresses key technical issues involved in the system definition of the Assured Crew Return Vehicle (ACRV). The perspective on these issues is that of a prospective ACRV contractor, performing system analysis and trade studies. The objective of these analyses and trade studies is to develop the recovery vehicle system concept and top level requirements. The starting point for this work is the definition of the set of design missions for the ACRV. This set of missions encompasses three classes of contingency/emergency (crew illness/injury, space station catastrophe/failure, transportation element catastrophe/failure). The need is to provide a system to return Space Station crew to Earth quickly (less than 24 hours) in response to randomly occurring contingency events over an extended period of time (30 years of planned Space Station life). The main topics addressed and characterized in this paper include the following: Key Recovery (Rescue) Site Access Considerations; Rescue Site Locations and Distribution; Vehicle Cross Range vs Site Access; On-orbit Loiter Capability and Vehicle Design; and Water vs. Land Recovery.

  3. Performance Analysis: Work Control Events Identified January - August 2010

    SciTech Connect

    De Grange, C E; Freeman, J W; Kerr, C E; Holman, G; Marsh, K; Beach, R

    2011-01-14

    This performance analysis evaluated 24 events that occurred at LLNL from January through August 2010. The analysis identified areas of potential work control process and/or implementation weaknesses and several common underlying causes. Human performance improvement and safety culture factors were part of the causal analysis of each event and were analyzed. The collective significance of all events in 2010, as measured by the occurrence reporting significance category and by the proportion of events that have been reported to the DOE ORPS under the ''management concerns'' reporting criteria, does not appear to have increased in 2010. The frequency of reporting in each of the significance categories has not changed in 2010 compared to the previous four years. There is no change indicating a trend in the significance category, and there has been no increase in the proportion of occurrences reported in the higher significance category. Also, the frequency of events, 42 events reported through August 2010, is not greater than in previous years and is below the average of 63 occurrences per year at LLNL since 2006. Over the previous four years, an average of 43% of LLNL's reported occurrences have been reported as either ''management concerns'' or ''near misses.'' In 2010, 29% of the occurrences have been reported as ''management concerns'' or ''near misses.'' This rate indicates that LLNL is now reporting fewer ''management concern'' and ''near miss'' occurrences compared to the previous four years. From 2008 to the present, LLNL senior management has undertaken a series of initiatives to strengthen the work planning and control system with the primary objective of improving worker safety. In 2008, the LLNL Deputy Director established the Work Control Integrated Project Team to develop the core requirements and graded elements of an institutional work planning and control system. By the end of that year this system was documented and implementation had begun. In 2009

  4. Routing performance analysis and optimization within a massively parallel computer

    DOEpatents

    Archer, Charles Jens; Peters, Amanda; Pinnow, Kurt Walter; Swartz, Brent Allen

    2013-04-16

    An apparatus, program product and method optimize the operation of a massively parallel computer system by, in part, receiving actual performance data concerning an application executed by the plurality of interconnected nodes, and analyzing the actual performance data to identify an actual performance pattern. A desired performance pattern may be determined for the application, and an algorithm may be selected from among a plurality of algorithms stored within a memory, the algorithm being configured to achieve the desired performance pattern based on the actual performance data.

  5. Performance Evaluation and Analysis of Effective Range and Data Throughput for Unmodified Bluetooth Communication Devices

    DTIC Science & Technology

    2003-03-01

    Performance Evaluation and Analysis of Effective Range and Data Throughput for Unmodified Bluetooth Communication Devices. Thesis, AFIT/GCS/ENG/03-08.

  6. Using Performance Analysis for Training in an Organization Implementing Integrated Manufacturing: A Case Study.

    ERIC Educational Resources Information Center

    Sleezer, Catherine M.

    1996-01-01

    Examines the use of the Performance Analysis for Training (PAT) model to assess performance needs in an organization that implements integrated manufacturing processes. Results showed that the PAT model was a useful guide for assessing performance needs and that the process and product of the performance analysis was influenced by the…

  7. Analysis of TIMS performance subjected to simulated wind blast

    NASA Technical Reports Server (NTRS)

    Jaggi, S.; Kuo, S.

    1992-01-01

    The results of the performance of the Thermal Infrared Multispectral Scanner (TIMS) when subjected to various wind conditions in the laboratory are described. Various wind conditions were simulated using a 24-inch fan or combinations of air jet streams blowing toward either or both of the blackbody surfaces. The fan was used to simulate a large volume of air flow at moderate speeds (up to 30 mph). The small-diameter air jets were used to probe the TIMS system response to localized wind perturbations. The maximum nozzle speed of the air jet was 60 mph. A range of wind directions and speeds was set up in the laboratory during the test. The majority of the wind tests were conducted under ambient conditions, with the room temperature fluctuating no more than 2 C. The temperature of the high-speed air jet was determined to be within 1 C of the room temperature. The TIMS response was recorded on analog tape. Additional thermistor readouts of the blackbody temperatures and a thermocouple readout of the ambient temperature were recorded manually to be compared with the housekeeping data recorded on the tape. Additional tests were conducted under conditions of elevated and cooled room temperatures. The room temperature was varied between 19.5 and 25.5 C in these tests. The calibration parameters needed for quantitative analysis of TIMS data were first plotted on a scanline-by-scanline basis. These parameters are the low and high blackbody temperature readings as recorded by the TIMS and their corresponding digitized count values. Using these values, the system transfer equation was calculated. This equation allows us to compute the flux for any video count by computing the slope and intercept of the straight line that relates the flux to the digital count. The actual video of the target (the lab floor in this case) was then compared with a simulated target. This simulated target was assumed to be a blackbody with an emissivity of 0.95, and the temperature was
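    The two-point transfer equation described can be sketched directly: the two blackbody (count, flux) reference pairs in each scanline fix the slope and intercept of the count-to-flux line. The values below are illustrative, not TIMS calibration data:

```python
def calib_line(count_lo, flux_lo, count_hi, flux_hi):
    """Slope and intercept of the line relating digital count to flux,
    from the low/high blackbody reference readings of a scanline."""
    slope = (flux_hi - flux_lo) / (count_hi - count_lo)
    return slope, flux_lo - slope * count_lo

def count_to_flux(count, slope, intercept):
    """Apply the per-scanline transfer equation to one video count."""
    return slope * count + intercept

# Illustrative blackbody references: counts 50/200 at fluxes 10.0/40.0
slope, intercept = calib_line(50, 10.0, 200, 40.0)
flux = count_to_flux(125, slope, intercept)
```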

  8. Performance analysis & optimization of well production in unconventional resource plays

    NASA Astrophysics Data System (ADS)

    Sehbi, Baljit Singh

    The Unconventional Resource Plays consisting of the lowest tier of resources (large volumes and most difficult to develop) have been the main focus of US domestic activity during recent times. Horizontal well drilling and hydraulic fracturing completion technology have been primarily responsible for this paradigm shift. The concept of drainage volume is being examined using pressure diffusion along streamlines. We use diffusive time of flight to optimize the number of hydraulic fracture stages in horizontal well application for Tight Gas reservoirs. Numerous field case histories are available in literature for optimizing number of hydraulic fracture stages, although the conclusions are case specific. In contrast, a general method is being presented that can be used to augment field experiments necessary to optimize the number of hydraulic fracture stages. The optimization results for the tight gas example are in line with the results from economic analysis. The fluid flow simulation for Naturally Fractured Reservoirs (NFR) is performed by Dual-Permeability or Dual-Porosity formulations. Microseismic data from Barnett Shale well is used to characterize the hydraulic fracture geometry. Sensitivity analysis, uncertainty assessment, manual & computer assisted history matching are integrated to develop a comprehensive workflow for building reliable reservoir simulation models. We demonstrate that incorporating proper physics of flow is the first step in building reliable reservoir simulation models. Lack of proper physics often leads to unreasonable reservoir parameter estimates. The workflow demonstrates reduced non-uniqueness for the inverse history matching problem. The behavior of near-critical fluids in Liquid Rich Shale plays defies the production behavior observed in conventional reservoir systems. In conventional reservoirs an increased gas-oil ratio is observed as flowing bottom-hole pressure is less than the saturation pressure. The production behavior is

  9. Performance Analysis of Saturated Induction Motors by Virtual Tests

    ERIC Educational Resources Information Center

    Ojaghi, M.; Faiz, J.; Kazemi, M.; Rezaei, M.

    2012-01-01

    Many undergraduate-level electrical machines textbooks give detailed treatments of the performance of induction motors. Students can deepen this understanding of motor performance by performing the appropriate practical work in laboratories or in simulation using proper software packages. This paper considers various common and less-common tests…

  11. Light-frame wall and floor systems : analysis and performance

    Treesearch

    G. Sherwood; R. C. Moody

    1989-01-01

    This report describes methods of predicting the performance of light-frame wood structures with emphasis on floor and wall systems. Methods of predicting structural performance, fire safety, and environmental concerns including thermal, moisture, and acoustic performance are addressed in the three major sections.

  12. Performance analysis of next-generation lunar laser retroreflectors

    NASA Astrophysics Data System (ADS)

    Ciocci, Emanuele; Martini, Manuele; Contessa, Stefania; Porcelli, Luca; Mastrofini, Marco; Currie, Douglas; Delle Monache, Giovanni; Dell'Agnello, Simone

    2017-09-01

    …to GR tests and to constraints on new gravitational theories (like non-minimally coupled gravity and spacetime torsion); the description of the associated physics analysis and the global LLR error budget is outside the scope of the present paper. We note that, according to Reasenberg et al. (2016), the software models used for LLR physics and lunar science cannot process residuals with an accuracy better than a few centimeters, and that, in order to process millimeter ranging data (or better) coming from future reflectors (and not only those), it is necessary to update and improve the respective models inside the software package. The work presented here on the results of the SCF-test thermal and optical analysis shows that good performance is expected from MoonLIGHT-2 after its deployment on the Moon. This in turn will stimulate improvements in LLR ground segment hardware and help refine the LLR software code and models. Without a significant improvement of the LLR space segment, the acquisition of improved ground LLR hardware and challenging LLR software refinements may languish for lack of motivation, since the librations of the old-generation LLR payloads largely dominate the global LLR error budget.

  13. Graphical User Interface for Simulink Integrated Performance Analysis Model

    NASA Technical Reports Server (NTRS)

    Durham, R. Caitlyn

    2009-01-01

    The J-2X Engine (built by Pratt & Whitney Rocketdyne), in the Upper Stage of the Ares I Crew Launch Vehicle, will only start within a certain range of temperature and pressure for Liquid Hydrogen and Liquid Oxygen propellants. The purpose of the Simulink Integrated Performance Analysis Model is to verify that in all reasonable conditions the temperature and pressure of the propellants are within the required J-2X engine start boxes. In order to run the simulation, test variables must be entered at all reasonable values of parameters such as heat leak and mass flow rate. To make this testing process as efficient as possible in order to save the maximum amount of time and money, and to show that the J-2X engine will start when it is required to do so, a graphical user interface (GUI) was created to allow the input of values to be used as parameters in the Simulink Model, without opening or altering the contents of the model. The GUI must allow for test data to come from Microsoft Excel files, allow those values to be edited before testing, place those values into the Simulink Model, and get the output from the Simulink Model. The GUI was built using MATLAB, and will run the Simulink simulation when the Simulate option is activated. After running the simulation, the GUI will construct a new Microsoft Excel file, as well as a MATLAB matrix file, using the output values for each test of the simulation so that they may be graphed and compared to other values.

  14. The Algerian Seismic Network: Performance from data quality analysis

    NASA Astrophysics Data System (ADS)

    Yelles, Abdelkarim; Allili, Toufik; Alili, Azouaou

    2013-04-01

    Seismic monitoring in Algeria has seen a great change after the Boumerdes earthquake of May 21st, 2003. Indeed, the installation of a new digital seismic network (ADSN) drastically upgraded the previous analog telemetry network. During the last four years, the number of stations in operation has greatly increased to 66 stations, with 15 broadband, 2 very-broadband, 47 short-period sensors and 21 accelerometers, connected in real time using various modes of transmission (VSAT, ADSL, GSM, etc.) and managed by Antelope software. The spatial distribution of these stations covers most of northern Algeria from east to west. Since the network began operating, a significant number of local, regional and tele-seismic events have been located by the automatic processing, revised and archived in databases. This new set of data is characterized by the accuracy of the automatic location of local seismicity and the ability to determine its focal mechanisms. Periodically, recorded data, including earthquakes, calibration pulses and cultural noise, are checked using PSD (Power Spectral Density) analysis to determine the noise level. ADSN broadband station data quality is controlled in quasi real time using the "PQLX" software by computing PDFs and PSDs of the recordings. Other tools and programs allow the monitoring and maintenance of the entire electronic system, for example to check the power state of the system, the mass position of the sensors and the environmental conditions (temperature, humidity, air pressure) inside the vaults. The new design of the network allows management of many aspects of real-time seismology: seismic monitoring, rapid earthquake determination, message alerts, moment tensor estimation, seismic source determination, shakemap calculation, etc. Compliance with international standards allows the network to contribute to regional seismic monitoring and the Mediterranean warning system. Over the next two years, the acquisition of new seismic equipment to reach 50 new BB stations led to…

  15. Polarisation of High-Performing and Low-Performing Secondary Schools in Victoria, Australia: An Analysis of Causal Complexities

    ERIC Educational Resources Information Center

    Bandaranayake, Bandara

    2016-01-01

    Applying qualitative comparative analysis (QCA), this study explores the configurations of conditions that contribute to the polarisation of high-performing and low-performing secondary schools in Victoria, Australia. It is argued that the success and failure of schools can be understood in terms of causal complexity, where one or several…

  16. VALIDATION GUIDELINES FOR LABORATORIES PERFORMING FORENSIC ANALYSIS OF CHEMICAL TERRORISM

    EPA Science Inventory

    The Scientific Working Group on Forensic Analysis of Chemical Terrorism (SWGFACT) has developed the following guidelines for laboratories engaged in the forensic analysis of chemical evidence associated with terrorism. This document provides a baseline framework and guidance for...

  18. An empirical performance analysis of commodity memories in commodity servers

    SciTech Connect

    Kerbyson, D. J.; Lang, M. K.; Patino, G.

    2004-01-01

    This work details a performance study of six different commodity memories in two commodity server nodes on a number of microbenchmarks that measure low-level performance characteristics, as well as on two applications representative of the ASCI workload. The memories vary both in terms of performance, including latency and bandwidths, and also in terms of their physical properties and manufacturer. Two server nodes were used: one Itanium-II Madison based system, and one Xeon based system. All the memories examined can be used within both processing nodes. This allows the performance of the memories to be directly examined while keeping all other factors within a processing node the same (processor, motherboard, operating system, etc.). The results of this study show that there can be a significant difference in application performance from the different memories - by as much as 20%. Thus, by choosing the most appropriate memory for a processing node at a minimal cost differential, significantly improved performance may be achievable.
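    A low-level microbenchmark of the kind described can be approximated even from Python by timing a large array copy and reporting effective bandwidth. This is a rough sketch, not the study's benchmark suite, and absolute numbers depend entirely on the machine and memory installed.

```python
import time
import numpy as np

# Minimal copy-bandwidth microbenchmark: time a large array copy and report
# effective GB/s. The best-of-N timing reduces noise from other processes.
def copy_bandwidth(n_bytes=256 * 2**20, repeats=5):
    src = np.ones(n_bytes // 8, dtype=np.float64)
    dst = np.empty_like(src)
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        np.copyto(dst, src)               # streaming copy through memory
        best = min(best, time.perf_counter() - t0)
    # a copy moves 2 * n_bytes of traffic (one read, one write)
    return 2 * n_bytes / best / 1e9

print(f"copy bandwidth: {copy_bandwidth():.1f} GB/s")
```

    Run on nodes that differ only in their installed memory, the spread in reported bandwidth is the low-level analogue of the application-level differences the study measured.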

  19. Analysis of Aurora's Performance Simulation Engine for Three Systems

    SciTech Connect

    Freeman, Janine; Simon, Joseph

    2015-07-07

    Aurora Solar Inc. is building a cloud-based optimization platform to automate the design, engineering, and permit generation process of solar photovoltaic (PV) installations. They requested that the National Renewable Energy Laboratory (NREL) validate the performance of the PV system performance simulation engine of Aurora Solar’s solar design platform, Aurora. In previous work, NREL performed a validation of multiple other PV modeling tools [1], so this study builds upon that work by examining all of the same fixed-tilt systems with available module datasheets that NREL selected and used in the aforementioned study. Aurora Solar set up these three operating PV systems in their modeling platform using NREL-provided system specifications and concurrent weather data. NREL then verified the setup of these systems, ran the simulations, and compared the Aurora-predicted performance data to measured performance data for those three systems, as well as to performance data predicted by other PV modeling tools.

  20. Integrated Flight Performance Analysis of a Launch Abort System Concept

    NASA Technical Reports Server (NTRS)

    Tartabini, Paul V.

    2007-01-01

    This paper describes initial flight performance analyses conducted early in the Orion Project to support concept feasibility studies for the Crew Exploration Vehicle's Launch Abort System (LAS). Key performance requirements that significantly affect abort capability are presented. These requirements have implications for sizing the Abort Motor and tailoring its thrust profile to meet escape requirements for both launch pad and high drag/high dynamic pressure ascent aborts. Additional performance considerations are provided for the Attitude Control Motor, a key element of the Orion LAS design that eliminates the need for ballast and provides performance robustness over a passive control approach. Finally, performance of the LAS jettison function is discussed, along with implications for Jettison Motor sizing and the timing of the jettison event during a nominal mission. These studies provide an initial understanding of LAS performance that will continue to evolve as the Orion design is matured.

  1. Multidimensional scaling analysis of simulated air combat maneuvering performance data.

    PubMed

    Polzella, D J; Reid, G B

    1989-02-01

    This paper describes the decomposition of air combat maneuvering by means of multidimensional scaling (MDS). MDS analyses were applied to performance data obtained from expert and novice pilots during simulated air-to-air combat. The results of these analyses revealed that the performance of expert pilots is characterized by advantageous maneuverability and intelligent energy management. It is argued that MDS, unlike simpler metrics, permits the investigator to achieve greater insights into the underlying structure associated with performance of a complex task.
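    The MDS machinery itself can be illustrated with classical (Torgerson) scaling, which recovers a low-dimensional configuration of points from a matrix of pairwise distances via double centering and an eigendecomposition. This is a generic sketch of the technique, not the authors' analysis pipeline or their air-combat data.

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical (Torgerson) MDS: embed points in k dimensions from a
    symmetric matrix of pairwise distances D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                  # double-centered Gram matrix
    w, v = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:k]                # largest eigenvalues first
    return v[:, idx] * np.sqrt(np.maximum(w[idx], 0))

# Four points on a unit square: the embedding recovers their layout
# (up to rotation/reflection), so pairwise distances are reproduced.
pts = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], float)
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
X = classical_mds(D, 2)
D_hat = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
print(np.allclose(D, D_hat))
```

    Applied to dissimilarities between maneuvering-performance profiles rather than geometric points, the same procedure yields the low-dimensional structure the paper interprets.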

  2. Student academic performance analysis using fuzzy C-means clustering

    NASA Astrophysics Data System (ADS)

    Rosadi, R.; Akamal; Sudrajat, R.; Kharismawan, B.; Hambali, Y. A.

    2017-01-01

    Grade Point Average (GPA) is commonly used as an indicator of academic performance, and academic performance evaluation is a basic way to track the progression of student performance. When evaluating students' academic performance, there are occasions where the data must be grouped, especially when the amount of data is large; the patterns of relationships within and among groups can then be revealed. Grouping data can be done using a clustering method, one of which is the Fuzzy C-Means algorithm. This algorithm is then applied to a set of student data from the Faculty of Mathematics and Natural Sciences, Padjadjaran University.
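    A minimal Fuzzy C-Means implementation shows the grouping step: memberships are soft, so each student belongs to every cluster to some degree. The student records below are hypothetical stand-ins for the Padjadjaran University data.

```python
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, iters=100, seed=0):
    """Minimal Fuzzy C-Means. Returns (centers, U) where U[i, j] is the
    degree to which record i belongs to cluster j (rows sum to 1)."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        W = U ** m                                   # fuzzified memberships
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None] - centers[None], axis=-1) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))               # closer => higher weight
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

# Hypothetical (GPA, credits-completed) records for nine students,
# in three informal performance bands.
X = np.array([[3.8, 120], [3.6, 110], [3.9, 125],
              [2.0, 60], [2.2, 55], [1.9, 65],
              [3.0, 90], [2.9, 85], [3.1, 95]], float)
centers, U = fuzzy_c_means(X, c=3)
print(U.argmax(axis=1))   # hard assignment per student
```

    In practice the features should be standardized first (here the credits axis dominates the distances), but the grouping logic is the same.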

  3. Shadowboxing with Data: A Framework for Informing the Critical Analysis of Performance and Performance Measures

    ERIC Educational Resources Information Center

    Winiecki, Donald J.

    2009-01-01

    The practice of performance improvement requires measuring before and after conditions to determine if changes have occurred as a result of an intervention. Understanding how to take, make, interpret, and use measurements can go a long way toward improving performance improvement work and improving conditions for clients. The risk of not…

  4. Separation of Performance-Approach and Performance-Avoidance Achievement Goals : A Broader Analysis

    ERIC Educational Resources Information Center

    Murayama, Kou; Elliot, Andrew J.; Yamagata, Shinji

    2011-01-01

    In the literature on achievement goals, performance-approach goals (striving to do better than others) and performance-avoidance goals (striving to avoid doing worse than others) tend to exhibit a moderate to high correlation, raising questions about whether the 2 goals represent distinct constructs. In the current article, we sought to examine…

  5. Manual control analysis of drug effects on driving performance

    NASA Technical Reports Server (NTRS)

    Smiley, A.; Ziedman, K.; Moskowitz, H.

    1981-01-01

    The effects of secobarbital, diazepam, alcohol, and marihuana on car-driver transfer functions obtained using a driving simulator were studied. The first three substances, all CNS depressants, reduced gain, crossover frequency, and coherence which resulted in poorer tracking performance. Marihuana also impaired tracking performance but the only effect on the transfer function parameters was to reduce coherence.

  6. Analysis of Air Force Wartime Contracted Construction Project Performance

    DTIC Science & Technology

    2015-03-26

    Wartime projects share the same factors as peacetime projects, with some notable additions. Using peacetime factors as a baseline: project factors, health and safety compliance, quality of work, technical performance, work productivity, and external environmental factors in wartime…

  7. An Analysis of Parents' Attitudes towards Authentic Performance Assessment.

    ERIC Educational Resources Information Center

    Xue, Yange; Meisels, Samuel J.; Bickel, Donna DiPrima; Nicholson, Julie; Atkins-Burnett, Sally

    This study focused on parents' reactions to the implementation of a curriculum-embedded performance assessment for young children. It examines the Work Sampling System (WSS) (S. Meisels, J. Jablon, D. Marsden, M. Dichtelmiller, A. Dorfman, and D. Steele, 1994), a continuous progress performance assessment system that offers an alternative to…

  8. Cost analysis when open surgeons perform minimally invasive hysterectomy.

    PubMed

    Shepherd, Jonathan P; Kantartzis, Kelly L; Ahn, Ki Hoon; Bonidie, Michael J; Lee, Ted

    2014-01-01

    The costs to perform a hysterectomy are widely variable. Our objective was to determine hysterectomy costs by route and whether traditionally open surgeons lower costs when performing laparoscopy versus robotics. Hysterectomy costs including subcategories were collected from 2011 to 2013. Costs were skewed, so 2 statistical transformations were performed. Costs were compared by surgeon classification (open, laparoscopic, or robotic) and surgery route. A total of 4,871 hysterectomies were performed: 34.2% open, 50.7% laparoscopic, and 15.1% robotic. Laparoscopic hysterectomy had the lowest total costs (P < .001). By cost subcategory, laparoscopic hysterectomy was lower than robotic hysterectomy in 6 and higher in 1. When performing robotic hysterectomy, open and robotic surgeon costs were similar. With laparoscopic hysterectomy, open surgeons had higher costs than laparoscopic surgeons for 1 of 2 statistical transformations (P = .007). Open surgeons had lower costs performing laparoscopic hysterectomy than robotic hysterectomy with robotic maintenance and depreciation included (P < .001) but similar costs if these variables were excluded. Although laparoscopic hysterectomy had lowest costs overall, robotics may be no more costly than laparoscopic hysterectomy when performed by surgeons who predominantly perform open hysterectomy.

  9. Pitch Error Analysis of Young Piano Students' Music Reading Performances

    ERIC Educational Resources Information Center

    Rut Gudmundsdottir, Helga

    2010-01-01

    This study analyzed the music reading performances of 6-13-year-old piano students (N = 35) in their second year of piano study. The stimuli consisted of three piano pieces, systematically constructed to vary in terms of left-hand complexity and input simultaneity. The music reading performances were recorded digitally and a code of error analysis…

  10. The Relationship between Performance and Satisfaction: A Utility Analysis.

    DTIC Science & Technology

    1985-03-01

    …will briefly review present thought on the satisfaction-performance relationship, and then turn to an explanation for the proposed curvilinear… implications of this new approach will be discussed. Since the consistent finding in several reviews…

  12. An Analysis of a High Performing School District's Culture

    ERIC Educational Resources Information Center

    Corum, Kenneth D.; Schuetz, Todd B.

    2012-01-01

    This report describes a problem based learning project focusing on the cultural elements of a high performing school district. Current literature on school district culture provides numerous cultural elements that are present in high performing school districts. With the current climate in education placing pressure on school districts to perform…

  14. Improving Department of Defense Global Distribution Performance Through Network Analysis

    DTIC Science & Technology

    2016-06-01

    Table-of-contents excerpt: Segment Days; Standards; Methodology: 1. Segment Independence; 2. Statistical Analysis of IPGs by Segment; 3. Network…

  15. Modelling and performance analysis of four and eight element TCAS

    NASA Technical Reports Server (NTRS)

    Sampath, K. S.; Rojas, R. G.; Burnside, W. D.

    1990-01-01

    This semi-annual report describes the work performed during the period September 1989 through March 1990. The first section presents a description of the effect of the engines of the Boeing 737-200 on the performance of a bottom mounted eight-element traffic alert and collision avoidance system (TCAS). The second section deals exclusively with a four element TCAS antenna. The model obtained to simulate the four element TCAS and new algorithms developed for studying its performance are described. The effect of location on its performance when mounted on top of a Boeing 737-200 operating at 1060 MHz is discussed. It was found that the four element TCAS generally does not perform as well as the eight element TCAS III.

  16. Sensitivity analysis and performance estimation of refractivity from clutter techniques

    NASA Astrophysics Data System (ADS)

    Yardim, Caglar; Gerstoft, Peter; Hodgkiss, William S.

    2009-02-01

    Refractivity from clutter (RFC) refers to techniques that estimate the atmospheric refractivity profile from radar clutter returns. An RFC algorithm works by finding the environment whose simulated clutter pattern matches the radar-measured one. This paper introduces a procedure to compute RFC estimator performance. It addresses the major factors that affect estimator performance, such as the radar parameters, the sea surface characteristics, and the environment (region, time of day, season), and formalizes an error metric combining all of these. This is important for applications such as calculating the optimal radar parameters, selecting the best RFC inversion algorithm under a set of conditions, and creating a regional performance map of an RFC system. The performance metric is used to compute the RFC performance of a non-Bayesian evaporation duct estimator. A Bayesian estimator that incorporates meteorological statistics in the inversion is introduced and compared to the non-Bayesian estimator. The performance metric is used to determine the optimal radar parameters of the evaporation duct estimator for six scenarios. An evaporation duct inversion performance map for an S-band radar is created for the larger Mediterranean/Arabian Sea region.

  17. Experimental Analysis of Team Performance: Methodological Developments and Research Results.

    DTIC Science & Technology

    1982-07-06

    The effects of a cooperation contingency on behavior in a continuous three-person environment. Journal of the Experimental Analysis of Behavior, 25… Effects of a pairing contingency on behavior in a three-person programmed environment. Journal of the Experimental Analysis of Behavior, 1978…

  18. Mir Cooperative Solar Array Flight Performance Data and Computational Analysis

    NASA Technical Reports Server (NTRS)

    Kerslake, Thomas W.; Hoffman, David J.

    1997-01-01

    The Mir Cooperative Solar Array (MCSA) was developed jointly by the United States (US) and Russia to provide approximately 6 kW of photovoltaic power to the Russian space station Mir. The MCSA was launched to Mir in November 1995 and installed on the Kvant-1 module in May 1996. Since the MCSA photovoltaic panel modules (PPMs) are nearly identical to those of the International Space Station (ISS) photovoltaic arrays, MCSA operation offered an opportunity to gather multi-year performance data on this technology prior to its implementation on ISS. Two specially designed test sequences were executed in June and December 1996 to measure MCSA performance. Each test period encompassed 3 orbital revolutions whereby the current produced by the MCSA channels was measured. The temperature of MCSA PPMs was also measured. To better interpret the MCSA flight data, a dedicated FORTRAN computer code was developed to predict the detailed thermal-electrical performance of the MCSA. Flight data compared very favorably with computational performance predictions. This indicated that the MCSA electrical performance was fully meeting pre-flight expectations. There were no measurable indications of unexpected or precipitous MCSA performance degradation due to contamination or other causes after 7 months of operation on orbit. Power delivered to the Mir bus was lower than desired as a consequence of the retrofitted power distribution cabling. The strong correlation of experimental and computational results further bolsters the confidence level of performance codes used in critical ISS electric power forecasting. In this paper, MCSA flight performance tests are described as well as the computational modeling behind the performance predictions.

  19. Mir Cooperative Solar Array flight performance data and computational analysis

    SciTech Connect

    Kerslake, T.W.; Hoffman, D.J.

    1997-12-31

    The Mir Cooperative Solar Array (MCSA) was developed jointly by the United States (US) and Russia to provide approximately 6 kW of photovoltaic power to the Russian space station Mir. The MCSA was launched to Mir in November 1995 and installed on the Kvant-1 module in May 1996. Since the MCSA photovoltaic panel modules (PPMs) are nearly identical to those of the International Space Station (ISS) photovoltaic arrays, MCSA operation offered an opportunity to gather multi-year performance data on this technology prior to its implementation on ISS. Two specially designed test sequences were executed in June and December 1996 to measure MCSA performance. Each test period encompassed 3 orbital revolutions whereby the current produced by the MCSA channels was measured. The temperature of MCSA PPMs was also measured. To better interpret the MCSA flight data, a dedicated FORTRAN computer code was developed to predict the detailed thermal-electrical performance of the MCSA. Flight data compared very favorably with computational performance predictions. This indicated that the MCSA electrical performance was fully meeting pre-flight expectations. There were no measurable indications of unexpected or precipitous MCSA performance degradation due to contamination or other causes after 7 months of operation on orbit. Power delivered to the Mir bus was lower than desired as a consequence of the retrofitted power distribution cabling. The strong correlation of experimental and computational results further bolsters the confidence level of performance codes used in critical ISS electric power forecasting. In this paper, MCSA flight performance tests are described as well as the computational modeling behind the performance predictions.

  20. Issues in performing a network meta-analysis.

    PubMed

    Senn, Stephen; Gavini, Francois; Magrez, David; Scheen, André

    2013-04-01

    The example of the analysis of a collection of trials in diabetes, consisting of a sparsely connected network of 10 treatments, is used to make some points about approaches to analysis. In particular, various graphical and tabular presentations, both of the network and of the results, are provided, and the connection to the literature of incomplete blocks is made. It is clear from this example that it is inappropriate to treat the main effect of trial as random, and the implications of this for analysis are discussed. It is also argued that the generalisation from a classic random-effect meta-analysis to one applied to a network usually involves strong assumptions about the variance components involved. Despite this, it is concluded that such an analysis can be a useful way of exploring a set of trials.
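    The incomplete-blocks connection can be sketched as a fixed-effect linear model: each trial gets its own fixed effect (the "block"), and treatments are contrasts against a reference. Pooling the direct and indirect evidence is then just least squares on that design. The arm-level data below are hypothetical, not the diabetes network.

```python
import numpy as np

# Fixed-effect network meta-analysis as a two-way (trial + treatment) linear
# model. Hypothetical arm-level data: (trial, treatment, observed effect).
arms = [(0, "A", 0.0), (0, "B", -0.5),
        (1, "B", 0.1), (1, "C", -0.3),
        (2, "A", 0.2), (2, "C", -0.6)]

trials = sorted({t for t, _, _ in arms})
trts = sorted({k for _, k, _ in arms})
X = np.zeros((len(arms), len(trials) + len(trts) - 1))
y = np.zeros(len(arms))
for i, (t, k, eff) in enumerate(arms):
    X[i, t] = 1.0                                  # fixed trial (block) effect
    if k != "A":                                   # "A" is the reference
        X[i, len(trials) + trts.index(k) - 1] = 1.0
    y[i] = eff
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(["B vs A", "C vs A"], beta[len(trials):].round(3))))
```

    The pooled "B vs A" contrast falls between the direct comparison in trial 0 and the indirect path through C, which is exactly the borrowing of strength the network provides; a real analysis would additionally weight arms by their precision.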

  1. Performance Analysis of FSO Communication Using Different Coding Schemes

    NASA Astrophysics Data System (ADS)

    Gupta, Nidhi; Prakash, Siddi Jai; Kaushal, Hemani; Jain, V. K.; Kar, Subrat

    2011-10-01

    A major impairment in Free Space Optical (FSO) links is turbulence-induced fading, which severely degrades link performance. To mitigate turbulence-induced fading and, therefore, to improve the error rate performance, error control coding schemes can be used. In this paper, we investigate the bit error performance of FSO links with different coding techniques over log-normal atmospheric turbulence fading channels. The modulation scheme considered is BPSK. On the basis of results computed using Monte Carlo simulation, a comparative study of uncoded and coded systems is made.
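    The uncoded baseline of such a Monte Carlo study might look as follows: BPSK symbols are scaled by a log-normal fade, corrupted with Gaussian noise, and hard-detected. The log-amplitude deviation sigma_x and the unit-average-power normalization are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

# Monte Carlo BER of uncoded BPSK over a log-normal fading channel.
rng = np.random.default_rng(1)
n = 200_000
snr_db, sigma_x = 10.0, 0.3
snr = 10 ** (snr_db / 10)

bits = rng.integers(0, 2, n)
s = 2 * bits - 1                                   # BPSK: 0 -> -1, 1 -> +1
h = np.exp(rng.normal(-sigma_x**2, sigma_x, n))    # log-normal fade, E[h^2]=1
noise = rng.normal(0, np.sqrt(1 / (2 * snr)), n)
r = h * s + noise
ber = np.mean((r > 0).astype(int) != bits)
print(f"BER at {snr_db} dB: {ber:.4g}")
```

    A coded comparison would insert an encoder before the mapping and a decoder after detection and count residual errors the same way; the fading term is what separates this curve from the AWGN result.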

  2. Kinematic performance analysis of a parallel-chain hexapod machine

    SciTech Connect

    Jing Song; Jong-I Mou; Calvin King

    1998-05-18

    Inverse and forward kinematic models were derived to analyze the performance of a parallel-chain hexapod machine. Analytical models were constructed for both ideal and real structures. Performance assessment and enhancement algorithms were developed to determine the strut lengths for both ideal and real structures. The strut lengths determined from both cases can be used to analyze the effect of structural imperfections on machine performance. In an open-architecture control environment, strut length errors can be fed back to the controller to compensate for the displacement errors and thus improve the machine's accuracy in production.
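    For an idealized hexapod, the inverse kinematic model at the heart of such an analysis reduces to computing each strut length as the distance between its base joint and its platform joint under the commanded pose. The joint layout and pose below are hypothetical; a real machine adds joint offsets and the structural imperfections the paper analyzes.

```python
import numpy as np

# Inverse kinematics of an idealized hexapod: strut i has length
# | T + R p_i - b_i | for base joints b_i, platform joints p_i, and a
# commanded pose (translation T, roll-pitch-yaw rotation R).
def strut_lengths(base, plat, T, rpy):
    a, b, g = rpy
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(a), -np.sin(a)],
                   [0, np.sin(a), np.cos(a)]])
    Ry = np.array([[np.cos(b), 0, np.sin(b)],
                   [0, 1, 0],
                   [-np.sin(b), 0, np.cos(b)]])
    Rz = np.array([[np.cos(g), -np.sin(g), 0],
                   [np.sin(g), np.cos(g), 0],
                   [0, 0, 1]])
    R = Rz @ Ry @ Rx
    return np.linalg.norm(T + plat @ R.T - base, axis=1)

ang = np.arange(6) * np.pi / 3
base = np.c_[np.cos(ang), np.sin(ang), np.zeros(6)]               # radius-1 base
plat = 0.5 * np.c_[np.cos(ang + 0.3), np.sin(ang + 0.3), np.zeros(6)]
L = strut_lengths(base, plat, np.array([0, 0, 1.0]), (0.02, -0.01, 0.05))
print(L.round(4))
```

    Perturbing the joint coordinates in `base` and `plat` and recomputing `L` is the mechanism by which strut-length errors from structural imperfections can be quantified and fed back to the controller.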

  3. Imaging Performance Analysis of Simbol-X with Simulations

    SciTech Connect

    Chauvin, M.; Roques, J. P.

    2009-05-11

    Simbol-X is an X-ray telescope operating in formation flight. This means that its optical performance will depend strongly on the drift of the two spacecraft and on the ability to measure these drifts for image reconstruction. We built a dynamical ray-tracing code to study the impact of these parameters on the optical performance of Simbol-X (see Chauvin et al., these proceedings). Using the simulation tool we have developed, we have conducted detailed analyses of the impact of different parameters on the imaging performance of the Simbol-X telescope.

  4. Imaging Performance Analysis of Simbol-X with Simulations

    NASA Astrophysics Data System (ADS)

    Chauvin, M.; Roques, J. P.

    2009-05-01

    Simbol-X is an X-ray telescope operating in formation flight. This means that its optical performance will depend strongly on the drift of the two spacecraft and on the ability to measure these drifts for image reconstruction. We built a dynamical ray-tracing code to study the impact of these parameters on the optical performance of Simbol-X (see Chauvin et al., these proceedings). Using the simulation tool we have developed, we have conducted detailed analyses of the impact of different parameters on the imaging performance of the Simbol-X telescope.

  5. Performance analysis of job scheduling policies in parallel supercomputing environments

    SciTech Connect

    Naik, V.K.; Squillante, M.S.; Setia, S.K.

    1993-12-31

    In this paper the authors analyze three general classes of scheduling policies under a workload typical of large-scale scientific computing. These policies differ in the manner in which processors are partitioned among the jobs as well as the way in which jobs are prioritized for execution on the partitions. Their results indicate that existing static schemes do not perform well under varying workloads. Adaptive policies tend to make better scheduling decisions, but their ability to adjust to workload changes is limited. Dynamic partitioning policies, on the other hand, yield the best performance and can be tuned to provide desired performance differences among jobs with varying resource demands.

  6. Assessing BMP Performance Using Microtox Toxicity Analysis - Rhode Island

    EPA Science Inventory

    Best Management Practices (BMPs) have been shown to be effective in reducing runoff and pollutants from urban areas and thus provide a mechanism to improve downstream water quality. Currently, BMP performance regarding water quality improvement is assessed through measuring each...

  7. Preliminary basic performance analysis of the Cedar multiprocessor memory system

    NASA Technical Reports Server (NTRS)

    Gallivan, K.; Jalby, W.; Turner, S.; Veidenbaum, A.; Wijshoff, H.

    1991-01-01

    Some preliminary basic results on the performance of the Cedar multiprocessor memory system are presented. Empirical results are presented and used to calibrate a memory system simulator which is then used to discuss the scalability of the system.

  8. Parent involvement and student academic performance: a multiple mediational analysis.

    PubMed

    Topor, David R; Keane, Susan P; Shelton, Terri L; Calkins, Susan D

    2010-01-01

    Parent involvement in a child's education is consistently found to be positively associated with a child's academic performance. However, there has been little investigation of the mechanisms that explain this association. The present study examines two potential mechanisms of this association: the child's perception of cognitive competence and the quality of the student-teacher relationship. This study used a sample of 158 seven-year-old participants, their mothers, and their teachers. Results indicated a statistically significant association between parent involvement and a child's academic performance, over and above the impact of the child's intelligence. A multiple mediation model indicated that the child's perception of cognitive competence fully mediated the relation between parent involvement and the child's performance on a standardized achievement test. The quality of the student-teacher relationship fully mediated the relation between parent involvement and teacher ratings of the child's classroom academic performance. Limitations, future research directions, and implications for public policy initiatives are discussed.
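
    The product-of-coefficients logic behind a mediation model like this can be sketched with ordinary least squares on synthetic data. This is only a minimal illustration of the idea, not the authors' analysis; the variable names, effect sizes, and sample values are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 158  # sample size matching the study; the data here are synthetic

# Simulate: parent involvement (X) -> perceived competence (M) -> achievement (Y)
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)             # mediator depends on X
y = 0.6 * m + 0.1 * x + rng.normal(size=n)   # outcome depends mostly on M

def ols_slope(pred, resp):
    """Least-squares slope of resp on pred (with intercept)."""
    A = np.column_stack([np.ones_like(pred), pred])
    coef, *_ = np.linalg.lstsq(A, resp, rcond=None)
    return coef[1]

a = ols_slope(x, m)                          # path X -> M
# path M -> Y controlling for X: regress Y on [1, X, M]
A = np.column_stack([np.ones(n), x, m])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
b = coef[2]
indirect = a * b                             # mediated (indirect) effect
total = ols_slope(x, y)                      # total effect of X on Y
print(f"indirect effect {indirect:.2f} of total {total:.2f}")
```

    Full mediation, as reported in the abstract, corresponds to the direct path shrinking to non-significance once the indirect path through the mediator is accounted for.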

  9. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    EPA Pesticide Factsheets

    The model performance evaluation consists of metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit measures that capture magnitude-only, sequence-only, and combined magnitude and sequence errors.

  10. Parent involvement and student academic performance: A multiple mediational analysis

    PubMed Central

    Topor, David R.; Keane, Susan P.; Shelton, Terri L.; Calkins, Susan D.

    2011-01-01

    Parent involvement in a child's education is consistently found to be positively associated with a child's academic performance. However, there has been little investigation of the mechanisms that explain this association. The present study examines two potential mechanisms of this association: the child's perception of cognitive competence and the quality of the student-teacher relationship. This study used a sample of 158 seven-year-old participants, their mothers, and their teachers. Results indicated a statistically significant association between parent involvement and a child's academic performance, over and above the impact of the child's intelligence. A multiple mediation model indicated that the child's perception of cognitive competence fully mediated the relation between parent involvement and the child's performance on a standardized achievement test. The quality of the student-teacher relationship fully mediated the relation between parent involvement and teacher ratings of the child's classroom academic performance. Limitations, future research directions, and implications for public policy initiatives are discussed. PMID:20603757

  11. Analysis of complex network performance and heuristic node removal strategies

    NASA Astrophysics Data System (ADS)

    Jahanpour, Ehsan; Chen, Xin

    2013-12-01

    Removing important nodes from complex networks is a great challenge in fighting against criminal organizations and preventing disease outbreaks. Six network performance metrics, including four new metrics, are applied to quantify networks' diffusion speed, diffusion scale, homogeneity, and diameter. In order to efficiently identify nodes whose removal maximally destroys a network, i.e., minimizes network performance, ten structured heuristic node removal strategies are designed using different node centrality metrics including degree, betweenness, reciprocal closeness, complement-derived closeness, and eigenvector centrality. These strategies are applied to remove nodes from the September 11, 2001 hijackers' network, and their performance is compared to that of a random strategy, which removes randomly selected nodes, and the locally optimal solution (LOS), which removes nodes to minimize network performance at each step. The computational complexity of the 11 strategies and LOS is also analyzed. Results show that the node removal strategies using degree and betweenness centralities are more efficient than other strategies.
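
    As a toy illustration of the simplest of these strategies, removal by degree centrality can be sketched in a few lines. The graph below is a hypothetical toy network, not the hijackers' network studied in the paper.

```python
from collections import defaultdict

def degree_attack(edges, k):
    """Repeatedly remove the highest-degree node from an undirected graph,
    a sketch of the degree-centrality removal strategy."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    removed = []
    for _ in range(k):
        if not adj:
            break
        target = max(adj, key=lambda n: len(adj[n]))
        for nbr in adj.pop(target):
            adj[nbr].discard(target)
            if not adj[nbr]:
                del adj[nbr]   # drop nodes left isolated by the removal
        removed.append(target)
    return removed

# Toy star-plus-chain network: the hub "A" is removed first
edges = [("A", "B"), ("A", "C"), ("A", "D"), ("D", "E"), ("E", "F")]
print(degree_attack(edges, 2))  # -> ['A', 'E']
```

    Betweenness-based strategies follow the same loop but rank nodes by shortest-path betweenness instead of degree, at a higher computational cost per step.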

  12. Assessing BMP Performance Using Microtox® Toxicity Analysis

    EPA Science Inventory

    Best Management Practices (BMPs) have been shown to be effective in reducing runoff and pollutants from urban areas and thus provide a mechanism to improve downstream water quality. Currently, BMP performance regarding water quality improvement is assessed through measuring each...

  13. Performance analysis of ten brands of batteries for hearing aids

    PubMed Central

    Penteado, Silvio Pires; Bento, Ricardo Ferreira

    2013-01-01

    Summary Introduction: Comparing the performance of hearing instrument batteries from various manufacturers can enable otologists, audiologists, and consumers to select the best products, maximizing the use of these materials. Aim: To analyze the performance of ten brands of batteries for hearing aids available in the Brazilian marketplace. Methods: Hearing aid batteries in four sizes were acquired from ten manufacturers and subjected to the same test conditions in an acoustic laboratory. Results: The laboratory results, contrasted with the values reported by the manufacturers, revealed significant discrepancies; certain brands in certain sizes performed better on some tests, but no single brand was best across all sizes. Conclusions: It was possible to investigate the performance of ten brands of hearing aid batteries and to describe the procedures to be followed for leakage, accidental intake, and disposal. PMID:25992026

  14. Assessing BMP Performance Using Microtox Toxicity Analysis - Rhode Island

    EPA Science Inventory

    Best Management Practices (BMPs) have been shown to be effective in reducing runoff and pollutants from urban areas and thus provide a mechanism to improve downstream water quality. Currently, BMP performance regarding water quality improvement is assessed through measuring each...

  15. Assessing BMP Performance Using Microtox® Toxicity Analysis

    EPA Science Inventory

    Best Management Practices (BMPs) have been shown to be effective in reducing runoff and pollutants from urban areas and thus provide a mechanism to improve downstream water quality. Currently, BMP performance regarding water quality improvement is assessed through measuring each...

  16. Moisture and Structural Analysis for High Performance Hybrid Wall Assemblies

    SciTech Connect

    Grin, A.; Lstiburek, J.

    2012-09-01

    This report describes work conducted under the Building Science Corporation (BSC) Building America Research Team's 'Energy Efficient Housing Research Partnerships' project. Based on past experience in the Building America program, BSC has found that combinations of materials and approaches (in other words, systems) usually provide optimum performance. No single manufacturer typically provides all of the components for an assembly, nor has the specific understanding of all the individual components necessary for optimum performance.

  17. A Performance Analysis of the USAF Work Information Management System

    DTIC Science & Technology

    1990-09-01

    numbers reset when they reach 65535. Two statistics are available from this information that provide a means to evaluate the impact of VTOC I/Os and ... One SA recommends page pools on all disks (18). The decision on number and placement of page pools can best be made with a complete evaluation of I... provide economic evaluation or justification for performance improvement alternatives. Several methods are available to measure performance. Benchmark

  18. Distributed Sensor Fusion Performance Analysis Under an Uncertain Environment

    DTIC Science & Technology

    2012-10-01

    cannot be obtained accurately, the sub-optimal fusion processor is assumed to have an estimated correlation coefficient and its performance difference ... detectability indices for the sub-optimal and optimal cases is derived as a function of the true correlation coefficient, the estimated value, and the ... performance is to a mismatched estimation of the correlation coefficient. Furthermore, we show that for the special case where all local sensors have the

  19. Performance Analysis of Computer Installations Virtual Machine / 370 (VM/370).

    DTIC Science & Technology

    1981-12-01

    designated the 4331 Model Group 1. The most important aspects of the 4300 Series product line are (Ref. 2): 1) the strikingly improved price/performance ... Chapter III presents the highlights and performance considerations of the Virtual Machine/System Product (VM/SP), which is planned to replace the VM ... improve both the functionality of its software and the support it provides to users of these products. At the same time, the software policy is clearly

  20. Performance analysis of morphological component analysis (MCA) method for mammograms using some statistical features

    NASA Astrophysics Data System (ADS)

    Gardezi, Syed Jamal Safdar; Faye, Ibrahima; Kamel, Nidal; Eltoukhy, Mohamed Meselhy; Hussain, Muhammad

    2014-10-01

    Early detection of breast cancer helps reduce mortality rates. Mammography is a very useful tool in breast cancer detection, but it is very difficult to separate different morphological features in mammographic images. In this study, the Morphological Component Analysis (MCA) method is used to extract different morphological aspects of mammographic images while effectively preserving the morphological characteristics of regions. MCA decomposes the mammogram into a piecewise smooth part and a texture part using the Local Discrete Cosine Transform (LDCT) and the Curvelet Transform via wrapping (CURVwrap). In this study, a simple performance comparison has been made, using some statistical features, between the original image and the piecewise smooth part obtained from the MCA decomposition. The results show that MCA suppresses the structural noises and blood vessels in the mammogram and enhances mass-detection performance.
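
    The two-part separation idea can be illustrated with a much cruder stand-in for MCA: splitting an image into a low-frequency ("piecewise smooth") component and the residual ("texture"). This is only a sketch of the concept on a synthetic image; the actual method decomposes over LDCT and curvelet dictionaries.

```python
import numpy as np

def smooth_texture_split(img, keep=0.1):
    """Crude frequency-domain stand-in for MCA's two-part decomposition:
    low frequencies -> 'piecewise smooth' part, remainder -> 'texture'.
    Illustrative only; real MCA uses LDCT/curvelet dictionaries."""
    F = np.fft.fft2(img)
    h, w = img.shape
    kh, kw = int(h * keep), int(w * keep)
    mask = np.zeros_like(F, dtype=bool)
    # keep the four low-frequency corners of the unshifted FFT
    mask[:kh, :kw] = mask[:kh, -kw:] = mask[-kh:, :kw] = mask[-kh:, -kw:] = True
    smooth = np.real(np.fft.ifft2(np.where(mask, F, 0)))
    return smooth, img - smooth

rng = np.random.default_rng(1)
img = np.outer(np.linspace(0, 1, 64), np.linspace(0, 1, 64))  # smooth ramp
noisy = img + 0.05 * rng.normal(size=img.shape)               # plus "texture"
smooth, texture = smooth_texture_split(noisy)
print(np.allclose(smooth + texture, noisy))  # the split reconstructs exactly
```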

  1. Using Weibull Distribution Analysis to Evaluate ALARA Performance

    SciTech Connect

    E. L. Frome, J. P. Watkins, and D. A. Hagemeyer

    2009-10-01

    As Low as Reasonably Achievable (ALARA) is the underlying principle for protecting nuclear workers from potential health outcomes related to occupational radiation exposure. Radiation protection performance is currently evaluated by measures such as collective dose and average measurable dose, which do not indicate ALARA performance. The purpose of this work is to show how statistical modeling of individual doses using the Weibull distribution can provide objective supplemental performance indicators for comparing ALARA implementation among sites and for insights into ALARA practices within a site. Maximum likelihood methods were employed to estimate the Weibull shape and scale parameters used for performance indicators. The shape parameter reflects the effectiveness of maximizing the number of workers receiving lower doses and is represented as the slope of the fitted line on a Weibull probability plot. Additional performance indicators derived from the model parameters include the 99th percentile and the exceedance fraction. When grouping sites by collective total effective dose equivalent (TEDE) and ranking by 99th percentile with confidence intervals, differences in performance among sites can be readily identified. Applying this methodology will enable more efficient and complete evaluation of the effectiveness of ALARA implementation.
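
    The shape/scale estimation described here can be sketched with the standard damped fixed-point iteration for the Weibull maximum-likelihood equations. The doses below are synthetic, and this is an illustration of the statistical technique, not the report's implementation.

```python
import numpy as np

def weibull_mle(doses, iters=100):
    """Weibull shape/scale by maximum likelihood, using a damped
    fixed-point iteration for the shape parameter."""
    x = np.asarray(doses, dtype=float)
    lnx = np.log(x)
    k = 1.0
    for _ in range(iters):
        xk = x ** k
        k = 0.5 * (k + 1.0 / (np.sum(xk * lnx) / np.sum(xk) - lnx.mean()))
    scale = np.mean(x ** k) ** (1.0 / k)
    return k, scale

rng = np.random.default_rng(2)
doses = 100.0 * rng.weibull(1.5, size=2000)      # synthetic dose records
shape, scale = weibull_mle(doses)
p99 = scale * (-np.log(0.01)) ** (1.0 / shape)   # modeled 99th percentile
exceed = float(np.mean(doses > p99))             # empirical exceedance fraction
print(f"shape {shape:.2f}, scale {scale:.1f}, 99th pct {p99:.0f}")
```

    A steeper fitted slope (larger shape parameter) on the Weibull probability plot corresponds to a workforce concentrated at lower doses, which is the sense in which the shape parameter acts as an ALARA indicator.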

  3. Simulation and performance analysis of triple-effect absorption cycles

    SciTech Connect

    Grossman, G.; Wilk, M.; DeVault, R.C.

    1993-08-01

    Performance simulation has been carried out for several triple-effect cycles, designed to improve utilization of high temperature heat sources for absorption systems and capable of substantial performance improvement over equivalent double-effect cycles. The systems investigated include the three-condenser-three-desorber (3C3D) cycle, forming an extension of the conventional double-effect one; the recently proposed Double Condenser Coupled (DCC) cycle which recovers heat from the hot condensate leaving the high temperature condensers and adds it to the lower temperature desorbers; and the dual loop cycle comprising two complete single-effect loops, recovering heat from the condenser and absorber of one loop to the desorber of the other loop and generating a cooling effect in the evaporators of both loops. A modular computer code for simulation of absorption systems was used to investigate the performances of the cycles and compare them on an equivalent basis, by selecting a common reference design and operating condition. Performance simulation was carried out over a range of operating conditions, including some investigation of the influence of the design parameters. Coefficients of performance ranging from 1.27 for the series-flow 3C3D to 1.73 for the parallel-flow DCC have been calculated at the design point. The relative merits and shortcomings of the different cycle configurations have been studied.

  4. An Analysis of Performance Enhancement Techniques for Overset Grid Applications

    NASA Technical Reports Server (NTRS)

    Djomehri, J. J.; Biswas, R.; Potsdam, M.; Strawn, R. C.; Biegel, Bryan (Technical Monitor)

    2002-01-01

    The overset grid methodology has significantly reduced time-to-solution of high-fidelity computational fluid dynamics (CFD) simulations about complex aerospace configurations. The solution process resolves the geometrical complexity of the problem domain by using separately generated but overlapping structured discretization grids that periodically exchange information through interpolation. However, high performance computations of such large-scale realistic applications must be handled efficiently on state-of-the-art parallel supercomputers. This paper analyzes the effects of various performance enhancement techniques on the parallel efficiency of an overset grid Navier-Stokes CFD application running on an SGI Origin2000 machine. Specifically, the role of asynchronous communication, grid splitting, and grid grouping strategies are presented and discussed. Results indicate that performance depends critically on the level of latency hiding and the quality of load balancing across the processors.

  5. Full-Envelope Launch Abort System Performance Analysis Methodology

    NASA Technical Reports Server (NTRS)

    Aubuchon, Vanessa V.

    2014-01-01

    The implementation of a new dispersion methodology is described, which disperses abort initiation altitude or time along with all other Launch Abort System (LAS) parameters during Monte Carlo simulations. In contrast, the standard methodology assumes that an abort initiation condition is held constant (e.g., aborts initiated at the altitude for Mach 1, at the altitude for maximum dynamic pressure, etc.) while dispersing other LAS parameters. The standard method results in large gaps in performance information due to the discrete nature of initiation conditions, while the full-envelope dispersion method provides a significantly more comprehensive assessment of LAS abort performance for the full launch vehicle ascent flight envelope and identifies performance "pinch-points" that may occur at flight conditions outside of those contained in the discrete set. The new method has significantly increased the fidelity of LAS abort simulations and confidence in the results.
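
    The contrast between holding the abort condition at discrete set points and dispersing it with everything else can be sketched with a toy Monte Carlo. The margin model, parameter names, and thresholds below are all invented for illustration; they bear no relation to the actual LAS simulation.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10_000

# Hypothetical toy model: abort success margin depends on abort altitude
# and dynamic pressure (all coefficients invented for illustration)
def abort_margin(alt_km, q_kpa):
    return 1.0 - 0.02 * q_kpa + 0.01 * alt_km

q = rng.normal(30.0, 5.0, n)          # dispersed dynamic pressure, kPa

# Standard approach: abort altitude held at a few discrete set points
fixed_margins = [abort_margin(a, q).mean() for a in (5.0, 10.0, 15.0)]

# Full-envelope approach: abort altitude dispersed along with q
alt = rng.uniform(0.0, 20.0, n)
full = abort_margin(alt, q)
worst = full.min()                    # exposes "pinch-points" between set points
print([round(m, 2) for m in fixed_margins], round(worst, 2))
```

    The dispersed run surfaces a worst-case margin that the three discrete initiation conditions never sample, which is the gap the full-envelope method is designed to close.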

  6. Off-design performance analysis of MHD generator channels

    NASA Astrophysics Data System (ADS)

    Wilson, D. R.; Williams, T. S.

    1980-01-01

    A computer code for performing parametric design point calculations, and evaluating the off-design performance of MHD generators has been developed. The program is capable of analyzing Faraday, Hall, and DCW channels, including the effect of electrical shorting in the gas boundary layers and coal slag layers. Direct integration of the electrode voltage drops is included. The program can be run in either the design or off-design mode. Details of the computer code, together with results of a study of the design and off-design performance of the proposed ETF MHD generator are presented. Design point variations of pre-heat and stoichiometry were analyzed. The off-design study included variations in mass flow rate and oxygen enrichment.

  7. Hydrogen engine performance analysis project. Second annual report

    SciTech Connect

    Adt, Jr., R. R.; Swain, M. R.; Pappas, J. M.

    1980-01-01

    Progress in a 3 year research program to evaluate the performance and emission characteristics of hydrogen-fueled internal combustion engines is reported. Fifteen hydrogen engine configurations will be subjected to performance and emissions characterization tests. During the first two years, baseline data for throttled and unthrottled, carburetted and timed hydrogen induction, Pre IVC hydrogen-fueled engine configurations, with and without exhaust gas recirculation (EGR) and water injection, were obtained. These data, along with descriptions of the test engine and its components, the test apparatus, experimental techniques, experiments performed and the results obtained, are given. Analyses of other hydrogen-engine project data are also presented and compared with the results of the present effort. The unthrottled engine vis-a-vis the throttled engine is found, in general, to exhibit higher brake thermal efficiency. The unthrottled engine also yields lower NOx emissions, which were found to be a strong function of fuel-air equivalence ratio. (LCL)

  8. Trajectory analysis and performance for SEP Comet Encke missions

    NASA Technical Reports Server (NTRS)

    Sauer, C. G., Jr.

    1973-01-01

    A summary of the performance of Solar Electric Propulsion spacecraft for Comet Encke missions for the 1980, 1984, and 1987 mission opportunities is presented, together with a description of the spacecraft trajectory for each opportunity. Included are data for rendezvous trajectories for all three opportunities and for a slow flyby mission during the 1980 opportunity. A range of propulsion system input powers of 10 to 20 kW is considered, together with a constant spacecraft power requirement of 400 watts. The performance presented in this paper is indicative of that obtained using the 30 cm mercury electron bombardment thrusters currently being developed. Performance is given in terms of final spacecraft mass and is thus independent of any particular spacecraft design concept.

  9. Performance analysis of the ascent propulsion system of the Apollo spacecraft

    NASA Technical Reports Server (NTRS)

    Hooper, J. C., III

    1973-01-01

    Activities involved in the performance analysis of the Apollo lunar module ascent propulsion system are discussed. A description of the ascent propulsion system, including hardware, instrumentation, and system characteristics, is included. The methods used to predict the inflight performance and to establish performance uncertainties of the ascent propulsion system are discussed. The techniques of processing the telemetered flight data and performing postflight performance reconstruction to determine actual inflight performance are discussed. Problems that have been encountered and results from the analysis of the ascent propulsion system performance during the Apollo 9, 10, and 11 missions are presented.

  10. Performance analysis of different database in new internet mapping system

    NASA Astrophysics Data System (ADS)

    Yao, Xing; Su, Wei; Gao, Shuai

    2017-03-01

    In the Mapping System of the New Internet, massive numbers of mapping entries between AID and RID must be stored, added, updated, and deleted. To better handle large volumes of mapping-entry updates and query requests, the Mapping System must use a high-performance database. In this paper, we focus on the performance of three typical databases, Redis, SQLite, and MySQL, and the results show that a Mapping System based on different databases can adapt to different needs according to the actual situation.
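
    A benchmark of this kind can be sketched against SQLite alone using Python's standard library. The table layout and workload sizes below are invented stand-ins for the paper's AID-to-RID mapping workload, not its actual benchmark code.

```python
import sqlite3
import time

def time_sqlite(n=10_000):
    """Rough insert/point-query timing against an in-memory SQLite
    mapping table (AID -> RID); a sketch of the benchmarking idea."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE mapping (aid TEXT PRIMARY KEY, rid TEXT)")
    rows = [(f"aid{i}", f"rid{i}") for i in range(n)]

    t0 = time.perf_counter()
    db.executemany("INSERT INTO mapping VALUES (?, ?)", rows)
    db.commit()
    insert_s = time.perf_counter() - t0

    t0 = time.perf_counter()
    for i in range(0, n, 100):  # 100 point lookups on the primary key
        db.execute("SELECT rid FROM mapping WHERE aid = ?",
                   (f"aid{i}",)).fetchone()
    query_s = time.perf_counter() - t0
    db.close()
    return insert_s, query_s

ins, qry = time_sqlite()
print(f"insert: {ins:.4f}s  point queries: {qry:.4f}s")
```

    Comparing Redis and MySQL would follow the same pattern through their respective client libraries, with the caveat that network round-trips rather than storage-engine cost can dominate their point-query timings.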

  11. Analysis of thermal performance of penetrated multi-layer insulation

    NASA Technical Reports Server (NTRS)

    Foster, Winfred A., Jr.; Jenkins, Rhonald M.; Yoo, Chai H.; Barrett, William E.

    1988-01-01

    Results of research performed for the purpose of studying the sensitivity of multi-layer insulation blanket performance caused by penetrations through the blanket are presented. The work described in this paper presents the experimental data obtained from thermal vacuum tests of various penetration geometries similar to those present on the Hubble Space Telescope. The data obtained from these tests is presented in terms of electrical power required sensitivity factors referenced to a multi-layer blanket without a penetration. The results of these experiments indicate that a significant increase in electrical power is required to overcome the radiation heat losses in the vicinity of the penetrations.

  12. A relative performance analysis of atmospheric Laser Doppler Velocimeter methods.

    NASA Technical Reports Server (NTRS)

    Farmer, W. M.; Hornkohl, J. O.; Brayton, D. B.

    1971-01-01

    Evaluation of the effectiveness of atmospheric applications of a Laser Doppler Velocimeter (LDV) at a wavelength of about 0.5 micrometer in conjunction with dual scatter LDV illuminating techniques, or at a wavelength of 10.6 micrometer with local oscillator LDV illuminating techniques. Equations and examples are given to provide a quantitative basis for LDV system selection and performance criteria in atmospheric research. The comparative study shows that specific ranges and conditions exist where performance of one of the methods is superior to that of the other. It is also pointed out that great care must be exercised in choosing system parameters that optimize a particular LDV designed for atmospheric applications.

  13. Virtual Mastoidectomy Performance Evaluation through Multi-Volume Analysis

    PubMed Central

    Kerwin, Thomas; Stredney, Don; Wiet, Gregory; Shen, Han-Wei

    2012-01-01

    Purpose Development of a visualization system that provides surgical instructors with a method to compare the results of many virtual surgeries (n > 100). Methods A masked distance field models the overlap between expert and resident results. Multiple volume displays are used side-by-side with a 2D point display. Results Performance characteristics were examined by comparing the results of specific residents with those of experts and the entire class. Conclusions The software provides a promising approach for comparing performance between large groups of residents learning mastoidectomy techniques. PMID:22528058

  14. Performance issues for engineering analysis on MIMD parallel computers

    SciTech Connect

    Fang, H.E.; Vaughan, C.T.; Gardner, D.R.

    1994-08-01

    We discuss how engineering analysts can obtain greater computational resolution in a more timely manner from applications codes running on MIMD parallel computers. Both processor speed and memory capacity are important to achieving better performance than a serial vector supercomputer. To obtain good performance, a parallel applications code must be scalable. In addition, the aspect ratios of the subdomains in the decomposition of the simulation domain onto the parallel computer should be of order 1. We demonstrate these conclusions using simulations conducted with the PCTH shock wave physics code running on a Cray Y-MP, a 1024-node nCUBE 2, and an 1840-node Paragon.
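
    The "aspect ratio of order 1" guideline can be illustrated with a tiny calculation. The grid and processor counts below are invented examples, not the PCTH decompositions from the study.

```python
def subdomain_aspect(nx, ny, px, py):
    """Aspect ratio of each subdomain when an nx-by-ny grid is split
    across a px-by-py processor layout (toy sketch of the guideline)."""
    sx, sy = nx / px, ny / py
    return max(sx, sy) / min(sx, sy)

# 1024x1024 grid on 64 processors: an 8x8 layout keeps the aspect ratio
# at 1, while a 64x1 layout produces long thin strips with far more
# boundary (communication surface) per unit of interior work.
print(subdomain_aspect(1024, 1024, 8, 8))    # -> 1.0
print(subdomain_aspect(1024, 1024, 64, 1))   # -> 64.0
```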

  15. Water-quality and biological conditions in the Lower Boise River, Ada and Canyon Counties, Idaho, 1994-2002

    USGS Publications Warehouse

    MacCoy, Dorene E.

    2004-01-01

    The water quality and biotic integrity of the lower Boise River between Lucky Peak Dam and the river's mouth near Parma, Idaho, have been affected by agricultural land and water use, wastewater treatment facility discharge, urbanization, reservoir operations, and river channel alteration. The U.S. Geological Survey (USGS) and cooperators have studied water-quality and biological aspects of the lower Boise River in the past to address water-quality concerns and issues brought forth by the Clean Water Act of 1977. Past and present issues include preservation of beneficial uses of the river for fisheries, recreation, and irrigation; and maintenance of high-quality water for domestic and agricultural uses. Evaluation of the data collected from 1994 to 2002 by the USGS revealed increases in constituent concentrations in the lower Boise in a downstream direction. Median suspended sediment concentrations from Diversion Dam (downstream from Lucky Peak Dam) to Parma increased more than 11 times, nitrogen concentrations increased more than 8 times, phosphorus concentrations increased more than 7 times, and fecal coliform concentrations increased more than 400 times. Chlorophyll-a concentrations, used as an indicator of nutrient input and the potential for nuisance algal growth, also increased in a downstream direction; median concentrations were highest at the Middleton and Parma sites. There were no discernible temporal trends in nutrients, sediment, or bacteria concentrations over the 8-year study. The State of Idaho's temperature standards to protect coldwater biota and salmonid spawning were exceeded most frequently at Middleton and Parma. Suspended sediment concentrations exceeded criteria proposed by the Idaho Department of Environmental Quality most frequently at Parma and at all but three tributaries.
Total nitrogen concentrations at Glenwood, Middleton, and Parma exceeded national background levels; median flow-adjusted total nitrogen concentrations at Middleton and Parma were higher than those in undeveloped basins sampled nationwide by the USGS. Total phosphorus concentrations at Glenwood, Middleton, and Parma also exceeded those in undeveloped basins. Macroinvertebrate and fish communities were used to evaluate the long-term integration of water-quality contaminants and loss of habitat in the lower Boise. Biological integrity of the macroinvertebrate population was assessed with the attributes (metrics) of Ephemeroptera, Plecoptera, and Trichoptera (EPT) richness and metrics used in the Idaho River Macroinvertebrate Index (RMI): taxa richness; EPT richness; percent dominant taxon; percent Elmidae (riffle beetles); and percent predators. Average EPT was about 10, and RMI scores were frequently below 16, which indicated intermediate or poor water quality. The number of EPT taxa and RMI scores for the lower Boise were half those for least-impacted streams in Idaho. The fine sediment bioassessment index (FSBI) was used to evaluate macroinvertebrate sediment tolerance. The FSBI scores were lower than those for a site upstream in the Boise River Basin near Twin Springs, a site not impacted by urbanization and agriculture, which indicated that the lower Boise macroinvertebrate population may be impacted by fine sediment. Macroinvertebrate functional feeding groups and percent tolerant species, mainly at Middleton and Parma, were typical of those in areas of degraded water quality and habitat. 
The biological integrity of the fish population was evaluated using the Idaho River Fish Index (RFI), which consists of the 10 metrics: number of coldwater native species, percent sculpin, percent coldwater species, percent sensitive native individuals, percent tolerant individuals, number of nonindigenous species, number of coldwater fish captured per minute of electrofishing, percent of fish with deformities (eroded fins, lesions, or tumors), number of trout age classes, and percent carp. RFI scores for lower Boise sites indicated a d

  16. [Decreasing incidence of perinatal group B streptococcal disease (Barcelona 1994-2002). Relation with hospital prevention policies].

    PubMed

    Andreu, Antonia; Sanfeliu, Isabel; Viñas, Lluis; Barranco, Margarita; Bosch, Jordi; Dopico, Eva; Guardia, Celia; Juncosa, Teresa; Lite, Josep; Matas, Lurdes; Sánchez, Ferrán; Sierr, Montse

    2003-04-01

    To analyze the incidence of perinatal sepsis due to group B streptococcus (GBS) as related to compliance with recommendations for its prevention issued by the Catalan Societies for Obstetrics, for Pediatrics, and for Infectious Diseases and Clinical Microbiology in 1997. The study was conducted from 1994 to 2001 in 10 Barcelona-area hospitals, where 157,848 live infants were born. GBS disease was diagnosed in 129 neonates. Incidence decreased by 86.1% over the study period, from 1.92 cases per 1000 live births in 1994 to 0.26 per 1000 in 2001 (p < 0.001). Changes in the characteristics of perinatal GBS disease were observed in the 18 cases diagnosed in the last 3 years, the time when prevention policies were operative. The incidence was lower (0.28 per 1000 vs. 1.19 for the previous 5 years, p < 0.00006), the proportion of mothers without risk factors was greater (77.8% vs. 55.9%, p = 0.009), and premature neonates were not affected (0% vs. 12.6%, p = 0.003); nevertheless, mortality was similar (5.5% vs. 4.5%, p = 0.8). Among these 18 cases of sepsis, 9 can be considered failures inherent to the prevention policy and 9 failures of compliance. Only 3 hospitals had prevention policies in 1994, whereas all 10 used intrapartum prophylaxis based on screening results in 2001. A substantial decrease in the incidence of perinatal GBS disease, coinciding with the application of prevention measures for this pathology, has been registered in the 10 participating hospitals over the 1994-2001 period.

  17. Relationship between climate, pollen concentrations of Ambrosia and medical consultations for allergic rhinitis in Montreal, 1994-2002.

    PubMed

    Breton, Marie-Claude; Garneau, Michelle; Fortier, Isabel; Guay, Frédéric; Louis, Jacques

    2006-10-15

    The aim of this study is to evaluate the influence of meteorological factors on Ambrosia pollen concentrations and its impact on medical consultations for allergic rhinitis among residents of various socio-economic levels in Montréal (Québec, Canada) between 1994 and 2002. The study was conducted to assess the sensitivity of pollen productivity to daily climate variability in order to estimate the consequences for human health vulnerability in the context of global climate change. Information related to medical consultations for allergic rhinitis due to pollen comes from the Quebec Health Insurance Board (Régie de l'assurance-maladie du Québec). Ambrosia pollen concentration was measured by the Aerobiology Research Laboratories (Nepean, Ontario). Daily temperature (maximum, minimum, and mean) and precipitation data were obtained from the Meteorological Service of Canada. Socio-economic data come from the 1996 and 2001 census data of Statistics Canada. Between 1994 and 2002, during the Ambrosia pollen season, 7667 consultations for allergic rhinitis due to pollen were recorded. We found a significant association between the number of medical consultations and pollen levels. Significant associations were detected for over-consultation on the day of exposure and 1, 2, 3, and 5 days after exposure to high levels of pollen. The consultation rate is higher for low-income residents (3.10 consultations per 10,000 inhabitants) than for high-income residents (1.65 consultations per 10,000 inhabitants). Considering the demonstrated impact of pollen levels on health, it has become critical to ensure adequate monitoring of Ambrosia and its meteorological sensitivity in the context of anticipated climate change and its potential consequences for human health.
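
    The lagged-exposure associations reported here can be sketched as lagged correlations between two daily time series. The series below are synthetic and the two-day response model is invented purely to illustrate the technique.

```python
import numpy as np

rng = np.random.default_rng(4)
days = 90
pollen = rng.gamma(2.0, 20.0, size=days)  # synthetic daily pollen counts
# consultations respond to pollen from two days earlier, plus noise
# (an invented toy response model, not the study's data)
consults = 0.05 * np.roll(pollen, 2) + rng.poisson(3, size=days)

def lag_corr(x, y, lag):
    """Pearson correlation between x[t] and y[t + lag]."""
    if lag == 0:
        return float(np.corrcoef(x, y)[0, 1])
    return float(np.corrcoef(x[:-lag], y[lag:])[0, 1])

corrs = {lag: round(lag_corr(pollen, consults, lag), 2) for lag in range(6)}
print(corrs)  # the correlation should peak at the built-in two-day lag
```

    Epidemiological studies of this kind typically go further, using Poisson regression with distributed lags and adjusting for temperature and day-of-week effects, but the lag structure being probed is the same.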

  18. The effect of urban street gang densities on small area homicide incidence in a large metropolitan county, 1994-2002.

    PubMed

    Robinson, Paul L; Boscardin, W John; George, Sheba M; Teklehaimanot, Senait; Heslin, Kevin C; Bluthenthal, Ricky N

    2009-07-01

    The presence of street gangs has been hypothesized as influencing overall levels of violence in urban communities through a process of gun-drug diffusion and cross-type homicide. This effect is said to act independently of other known correlates of violence, such as neighborhood poverty. To test this hypothesis, we independently assessed the impact of population exposure to local street gang densities on 8-year homicide rates in small areas of Los Angeles County, California. Homicide data from the Los Angeles County Coroner's Office were analyzed with original field survey data on street gang locations, while controlling for the established covariates of community homicide rates. Bivariate and multivariate regression analyses revealed strong relationships between homicide rates, gang density, race/ethnicity, and socioeconomic structure. Street gang densities alone had cumulative effects on small area homicide rates. Local gang densities, along with high school dropout rates, high unemployment rates, racial and ethnic concentration, and higher population densities, together explained 90% of the variation in local 8-year homicide rates. Several other commonly considered covariates were not significant in the model. Urban environments with higher densities of street gangs exhibited higher overall homicide rates, independent of other community covariates of homicide. The unique nature of street gang killings and their greater potential to influence future local rates of violence suggest that more direct public health interventions are needed alongside traditional criminal justice mechanisms to combat urban violence and homicides.

  19. Comparison of performance between rescaled range analysis and rescaled variance analysis in detecting abrupt dynamic change

    NASA Astrophysics Data System (ADS)

    He, Wen-Ping; Liu, Qun-Qun; Jiang, Yun-Di; Lu, Ying

    2015-04-01

    In the present paper, a comparison of the performance between moving cutting data-rescaled range analysis (MC-R/S) and moving cutting data-rescaled variance analysis (MC-V/S) is made. The results clearly indicate that the operating efficiency of the MC-R/S algorithm is higher than that of the MC-V/S algorithm. In our numerical test, the computer time consumed by MC-V/S is approximately 25 times that consumed by MC-R/S for an identical window size in artificial data. Apart from the difference in operating efficiency, there are no significant differences in performance between MC-R/S and MC-V/S for abrupt dynamic change detection. MC-R/S and MC-V/S both display some degree of anti-noise ability. However, it is important to consider the influence of strong noise on the detection results of MC-R/S and MC-V/S in practical applications. Project supported by the National Basic Research Program of China (Grant No. 2012CB955902) and the National Natural Science Foundation of China (Grant Nos. 41275074, 41475073, and 41175084).
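
    The rescaled range statistic at the core of R/S analysis can be sketched as follows. This is a minimal illustration of the classic R/S computation plus a hypothetical moving-window wrapper; it is not the authors' MC-R/S implementation, and the function names are ours:

```python
import numpy as np

def rescaled_range(series):
    """Classic R/S statistic: range of cumulative deviations over the std dev."""
    x = np.asarray(series, dtype=float)
    z = np.cumsum(x - x.mean())   # cumulative deviation from the mean
    r = z.max() - z.min()         # range of the cumulative deviations
    s = x.std()                   # standard deviation of the series
    return r / s

def moving_cut_rs(series, window):
    """Hypothetical moving-window variant: one R/S value per window position."""
    x = np.asarray(series, dtype=float)
    return np.array([rescaled_range(x[i:i + window])
                     for i in range(len(x) - window + 1)])

rng = np.random.default_rng(0)
noise = rng.standard_normal(500)
print(moving_cut_rs(noise, 100).shape)  # one value per sliding-window position
```

    An abrupt dynamic change would show up as a systematic shift in the windowed R/S values as the window slides across the change point.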

  20. Performance analysis of the Jersey City Total Energy Site

    NASA Astrophysics Data System (ADS)

    Hurley, C. W.; Ryan, J. D.; Phillips, C. W.

    1982-08-01

    Engineering, economic, environmental, and reliability data were gathered from a 486-unit apartment/commercial complex. The complex was designed to recover waste heat from diesel engines, making the central equipment building a total energy (TE) plant. Analysis of the data indicates that a significant savings in fuel is possible through minor modifications in plant procedures. Also included are the results of an analysis of the quality of utility services supplied to the consumers on the site and of a series of environmental tests made to determine the effects of the plant on air quality and noise. In general, although the systems utilizing the TE concept showed a significant savings in fuel, such systems do not represent attractive investments compared to conventional systems.

  1. Performance analysis of exam gloves used for aseptic rodent surgery.

    PubMed

    LeMoine, Dana M; Bergdall, Valerie K; Freed, Carrie

    2015-05-01

    Aseptic technique includes the use of sterile surgical gloves for survival surgeries in rodents to minimize the incidence of infections. Exam gloves are much less expensive than are surgical gloves and may represent a cost-effective, readily available option for use in rodent surgery. This study examined the effectiveness of surface disinfection of exam gloves with 70% isopropyl alcohol or a solution of hydrogen peroxide and peracetic acid (HP-PA) in reducing bacterial contamination. Performance levels for asepsis were met when gloves were negative for bacterial contamination after surface disinfection and sham 'exertion' activity. According to these criteria, 94% of HP-PA-disinfected gloves passed, compared with 47% of alcohol-disinfected gloves. In addition, the effect of autoclaving on the integrity of exam gloves was examined, given that autoclaving is another readily available option for aseptic preparation. Performance criteria for glove integrity after autoclaving consisted of the ability to don the gloves, followed by successful simulation of wound closure and completion of stretch tests without tearing or observable defects. Using these criteria, 98% of autoclaved nitrile exam gloves and 76% of autoclaved latex exam gloves met performance expectations, compared with the performance of standard surgical gloves (88% nitrile, 100% latex). The results of this study support the use of HP-PA-disinfected latex and nitrile exam gloves or autoclaved nitrile exam gloves as viable cost-effective alternatives to sterile surgical gloves for rodent surgeries.

  2. Performance Analysis of Exam Gloves Used for Aseptic Rodent Surgery

    PubMed Central

    LeMoine, Dana M; Bergdall, Valerie K; Freed, Carrie

    2015-01-01

    Aseptic technique includes the use of sterile surgical gloves for survival surgeries in rodents to minimize the incidence of infections. Exam gloves are much less expensive than are surgical gloves and may represent a cost-effective, readily available option for use in rodent surgery. This study examined the effectiveness of surface disinfection of exam gloves with 70% isopropyl alcohol or a solution of hydrogen peroxide and peracetic acid (HP–PA) in reducing bacterial contamination. Performance levels for asepsis were met when gloves were negative for bacterial contamination after surface disinfection and sham ‘exertion’ activity. According to these criteria, 94% of HP–PA-disinfected gloves passed, compared with 47% of alcohol-disinfected gloves. In addition, the effect of autoclaving on the integrity of exam gloves was examined, given that autoclaving is another readily available option for aseptic preparation. Performance criteria for glove integrity after autoclaving consisted of the ability to don the gloves, followed by successful simulation of wound closure and completion of stretch tests without tearing or observable defects. Using these criteria, 98% of autoclaved nitrile exam gloves and 76% of autoclaved latex exam gloves met performance expectations, compared with the performance of standard surgical gloves (88% nitrile, 100% latex). The results of this study support the use of HP–PA-disinfected latex and nitrile exam gloves or autoclaved nitrile exam gloves as viable cost-effective alternatives to sterile surgical gloves for rodent surgeries. PMID:26045458

  3. Performance analysis of vortex based mixers for confined flows

    NASA Astrophysics Data System (ADS)

    Buschhagen, Timo

    The hybrid rocket is still sparsely employed within major space or defense projects due to its relatively poor combustion efficiency and low fuel grain regression rate. Although hybrid rockets can claim advantages in safety, environmental impact, and performance over established solid and liquid propellant systems, the boundary layer combustion process and the diffusion-based mixing within a hybrid rocket grain port leave the core flow unmixed and limit the system performance. One principle used to enhance the mixing of gaseous flows is to induce streamwise vorticity. The counter-rotating vortex pair (CVP) mixer utilizes this principle and introduces two vortices into a confined flow, generating a stirring motion that transports near-wall media towards the core and vice versa. Recent studies investigated the velocity field introduced by this type of swirler. The current work evaluates the mixing performance of the CVP concept, using an experimental setup that simulates an axial primary pipe flow with a radially entering secondary flow; the primary flow is altered by the CVP swirler unit. The resulting setup therefore emulates a hybrid rocket motor with a cylindrical single-port grain. To evaluate the mixing performance, the secondary flow concentration at the pipe assembly exit is measured using a pressure-sensitive-paint-based procedure.

  4. Leadership Styles and Organizational Performance: A Predictive Analysis

    ERIC Educational Resources Information Center

    Kieu, Hung Q.

    2010-01-01

    Leadership is critically important because it affects the health of the organization. Research has found that leadership is one of the most significant contributors to organizational performance. Expanding and replicating previous research, and focusing on the specific telecommunications sector, this study used multiple correlation and regression…

  5. Design and performance analysis of digital acoustic underwater telemetry system

    NASA Astrophysics Data System (ADS)

    Catipovic, J. A.; Baggeroer, A. B.; Vonderheydt, K.; Koelsch, D. E.

    1985-11-01

    The work discusses the design and performance characteristics of a Digital Acoustic Telemetry System (DATS) which incorporates the current state-of-the-art technology and is capable of reliable data transmission at rates useful to a wide range of ocean exploration and development gear.

  6. An analysis of performance models for free water surface wetlands.

    PubMed

    Carleton, James N; Montas, Hubert J

    2010-06-01

    Although treatment wetlands are intended to attenuate pollutants, reliably predicting their performance remains a challenge because removal processes are often complex, spatially heterogeneous, and incompletely understood. Although initially popular for characterizing wetland performance, plug flow reactor models are problematic because their parameters exhibit correlation with hydraulic loading. One-dimensional advective-dispersive-reactive models may also be inadequate when longitudinal dispersion is non-Fickian as a result of pronounced transverse gradients in velocity (preferential flow). Models that make use of residence time distributions have shown promise in improving wetland performance characterization, however their applicability may be limited by certain inherent assumptions, e.g. that transverse mixing is nil. A recently-developed bicontinuum (mobile-mobile) model that addresses some of these weaknesses may hold promise for improving wetland performance modeling, however this model has yet to be tested against real-world wetland data. This paper examines the state of the science of free water surface wetland hydrodynamics and transport modeling, discusses the strengths and weaknesses of various steady state models, and compares them to each other in terms of each model's ability to represent data sets from monitored wetlands.
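
    To make the contrast between these model families concrete, here is a minimal sketch (with illustrative rate constants, not values from the paper) of the ideal plug-flow reactor model for first-order removal alongside a tanks-in-series approximation, a common way to represent intermediate degrees of longitudinal mixing:

```python
import math

def pfr_outlet(c_in, k, tau):
    """Ideal plug-flow reactor, first-order decay: C_out = C_in * exp(-k * tau)."""
    return c_in * math.exp(-k * tau)

def tis_outlet(c_in, k, tau, n):
    """Tanks-in-series model: n equal stirred tanks approximating dispersion."""
    return c_in / (1.0 + k * tau / n) ** n

# Illustrative values: inlet 10 mg/L, k = 0.5 1/day, residence time 4 days
c_in, k, tau = 10.0, 0.5, 4.0
print(pfr_outlet(c_in, k, tau))     # ideal plug flow outlet concentration
print(tis_outlet(c_in, k, tau, 3))  # more mixing -> higher outlet (less removal)
```

    As the number of tanks n grows, the tanks-in-series prediction converges to the plug-flow result, which is one reason fitted plug-flow parameters can mislead when real wetlands behave like a small number of mixed cells.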

  7. Performance analysis of digital integrate-and-dump filters

    NASA Technical Reports Server (NTRS)

    Chie, C. M.

    1982-01-01

    Key design parameters associated with the operation of a digital integrate-and-dump filter are identified in this paper. Performance degradation effects associated with the selection of these parameters such as analog-to-digital converter (ADC) gain loading factor, number of bits used, predetection bandwidth, sampling rate, and accumulator length are considered. Numerical results of practical interest are also provided.
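
    The core integrate-and-dump operation can be sketched as follows; this is a generic digital illustration (sample counts and signal values are ours), not the paper's analyzed receiver:

```python
import numpy as np

def integrate_and_dump(samples, n):
    """Accumulate n samples per symbol interval, then dump (reset).

    Returns one integrator output per complete symbol; trailing samples
    that do not fill a symbol are discarded.
    """
    x = np.asarray(samples, dtype=float)
    m = len(x) // n
    return x[:m * n].reshape(m, n).sum(axis=1)

# Illustrative: 4 samples per symbol, two symbols of opposite polarity plus noise
rng = np.random.default_rng(1)
signal = np.repeat([1.0, -1.0], 4) + 0.1 * rng.standard_normal(8)
print(integrate_and_dump(signal, 4))  # two outputs, near +4 and -4
```

    The sampling rate (samples per symbol, n here) and the accumulator word length are exactly the kinds of parameters whose quantization effects the paper analyzes.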

  8. Analysis of Factors that Predict Clinical Performance in Medical School

    ERIC Educational Resources Information Center

    White, Casey B.; Dey, Eric L.; Fantone, Joseph C.

    2009-01-01

    Academic achievement indices including GPAs and MCAT scores are used to predict the spectrum of medical student academic performance types. However, use of these measures ignores two changes influencing medical school admissions: student diversity and affirmative action, and an increased focus on communication skills. To determine if GPA and MCAT…

  9. Performance Factors Analysis -- A New Alternative to Knowledge Tracing

    ERIC Educational Resources Information Center

    Pavlik, Philip I., Jr.; Cen, Hao; Koedinger, Kenneth R.

    2009-01-01

    Knowledge tracing (KT)[1] has been used in various forms for adaptive computerized instruction for more than 40 years. However, despite its long history of application, it is difficult to use in domain model search procedures, has not been used to capture learning where multiple skills are needed to perform a single action, and has not been used…

  10. Using Conjoint Analysis to Evaluate and Reward Teaching Performance

    ERIC Educational Resources Information Center

    Bacon, Donald R.; Zheng, Yilong; Stewart, Kim A.; Johnson, Carol J.; Paul, Pallab

    2016-01-01

    Although widely used, student evaluations of teaching do not address several factors that should be considered in evaluating teaching performance such as new course preparations, teaching larger classes, and inconvenient class times. Consequently, the incentive exists to avoid certain teaching assignments to achieve high SET scores while…

  11. Wireless imaging sensor network design and performance analysis

    NASA Astrophysics Data System (ADS)

    Sundaram, Ramakrishnan

    2016-05-01

    This paper discusses (a) the design and implementation of the integrated radio tomographic imaging (RTI) interface for radio signal strength (RSS) data obtained from a wireless imaging sensor network (WISN) (b) the use of model-driven methods to determine the extent of regularization to be applied to reconstruct images from the RSS data, and (c) preliminary study of the performance of the network.
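
    A common model-driven regularization for linear RSS-to-image reconstruction of this kind is Tikhonov (ridge) least squares. The sketch below is a generic illustration on synthetic data under assumed dimensions (40 links, a 5x5 pixel grid), not the paper's method:

```python
import numpy as np

def reconstruct(weights, rss, alpha):
    """Tikhonov-regularized least squares: x = (W^T W + alpha*I)^-1 W^T y."""
    wtw = weights.T @ weights
    reg = alpha * np.eye(wtw.shape[0])   # regularization stabilizes the inverse
    return np.linalg.solve(wtw + reg, weights.T @ rss)

rng = np.random.default_rng(2)
W = rng.standard_normal((40, 25))   # assumed link-to-pixel weight matrix
x_true = np.zeros(25)
x_true[12] = 1.0                    # one attenuating object in the scene
y = W @ x_true + 0.05 * rng.standard_normal(40)   # noisy RSS measurements
x_hat = reconstruct(W, y, alpha=0.1)
print(int(np.argmax(x_hat)))        # brightest reconstructed pixel
```

    The extent of regularization (alpha here) trades noise suppression against spatial resolution, which is the tuning question the model-driven methods in the paper address.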

  12. How Motivation Affects Academic Performance: A Structural Equation Modelling Analysis

    ERIC Educational Resources Information Center

    Kusurkar, R. A.; Ten Cate, Th. J.; Vos, C. M. P.; Westers, P.; Croiset, G.

    2013-01-01

    Few studies in medical education have studied effect of quality of motivation on performance. Self-Determination Theory based on quality of motivation differentiates between Autonomous Motivation (AM) that originates within an individual and Controlled Motivation (CM) that originates from external sources. To determine whether Relative Autonomous…

  13. Performance Assessment in Water Polo Using Compositional Data Analysis.

    PubMed

    Ordóñez, Enrique García; Pérez, María Del Carmen Iglesias; González, Carlos Touriño

    2016-12-01

    The aim of the present study was to identify the groups of offensive performance indicators that best discriminated between match score categories (favourable, balanced, or unfavourable) in water polo. The sample comprised 88 regular season games (2011-2014) from the Spanish Professional Water Polo League. The offensive performance indicators were clustered in five groups: Attacks in relation to the different playing situations; Shots in relation to the different playing situations; Attacks outcome; Origin of shots; and Technical execution of shots. The variables of each group had a constant sum equal to 100%. Because the data were compositional, the variables were transformed by means of the additive log-ratio (alr) transformation. Multivariate discriminant analyses comparing the match scores were calculated using the transformed variables. In terms of the percentage of correct classification, the results showed that the group that discriminated most between the match scores was "Attacks outcome" (60.4% for the original sample and 52.2% for cross-validation). The performance indicators that discriminated most between the match scores in games with penalties were goals (structure coefficient (SC) = .761), counterattack shots (SC = .541), and counterattacks (SC = .481). In matches without penalties, goals were the primary discriminating factor (SC = .576). This approach provides a new tool for comparing the importance of the offensive performance groups and their effect on match score discrimination.
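
    The additive log-ratio transform used on these compositional variables can be sketched as follows (the percentages are synthetic, not the study's data):

```python
import numpy as np

def alr(composition):
    """Additive log-ratio transform: log of each part over the last part."""
    x = np.asarray(composition, dtype=float)
    return np.log(x[:-1] / x[-1])

def alr_inverse(z):
    """Map alr coordinates back to a composition summing to 1."""
    w = np.append(np.exp(z), 1.0)
    return w / w.sum()

# Hypothetical shot shares (%) across three playing situations
shots = np.array([55.0, 25.0, 20.0])
z = alr(shots)
print(np.round(alr_inverse(z), 2))  # recovers the shares as proportions
```

    The transform removes the constant-sum constraint, so standard multivariate methods such as discriminant analysis can be applied to the alr coordinates.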

  14. State Aid and Student Performance: A Supply-Demand Analysis

    ERIC Educational Resources Information Center

    Kinnucan, Henry W.; Zheng, Yuqing; Brehmer, Gerald

    2006-01-01

    Using a supply-demand framework, a six-equation model is specified to generate hypotheses about the relationship between state aid and student performance. Theory predicts that an increase in state or federal aid provides an incentive to decrease local funding, but that the disincentive associated with increased state aid is moderated when federal…

  15. Meta-Analysis of Predictors of Dental School Performance

    ERIC Educational Resources Information Center

    DeCastro, Jeanette E.

    2012-01-01

    Accurate prediction of which candidates show the most promise of success in dental school is imperative for the candidates, the profession, and the public. Several studies suggested that predental GPAs and the Dental Admissions Test (DAT) produce a range of correlations with dental school performance measures. While there have been similarities,…

  16. Analysis of uncompensated phase error on automatic target recognition performance

    NASA Astrophysics Data System (ADS)

    Montagnino, Lee J.; Cassabaum, Mary L.; Halversen, Shawn D.; Rupp, Chad T.; Wagner, Gregory M.; Young, Matthew T.

    2009-05-01

    Performance of Automatic Target Recognition (ATR) algorithms for Synthetic Aperture Radar (SAR) systems relies heavily on the system performance and specifications of the SAR sensor. A representative multi-stage SAR ATR algorithm [1, 2] is analyzed across imagery containing phase errors in the down-range direction induced during the transmission of the radar's waveform. The degradation induced on the SAR imagery by the phase errors is measured in terms of peak phase error, Root-Mean-Square (RMS) phase error, and multiplicative noise. The ATR algorithm consists of three stages: a two-parameter CFAR, a discrimination stage to reduce false alarms, and a classification stage to identify targets in the scene. The end-to-end performance of the ATR algorithm is quantified as a function of the multiplicative noise present in the SAR imagery through Receiver Operating Characteristic (ROC) curves. Results indicate that the performance of the ATR algorithm presented is robust over a 3dB change in multiplicative noise.
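
    A ROC curve of the kind used to quantify end-to-end ATR performance can be built with a generic threshold sweep over detection scores; this sketch (with made-up scores and labels) is not the authors' evaluation code:

```python
import numpy as np

def roc_points(scores, labels):
    """Sweep a threshold over scores; return (P_fa, P_d) pairs.

    P_d is the probability of detection, P_fa the probability of false alarm.
    """
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    order = np.argsort(-scores)        # descending score = loosening threshold
    tp = np.cumsum(labels[order])      # true positives at each threshold
    fp = np.cumsum(~labels[order])     # false positives at each threshold
    return fp / (~labels).sum(), tp / labels.sum()

scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3]   # hypothetical detector confidences
labels = [1, 1, 0, 1, 0, 0]               # 1 = true target, 0 = clutter
pfa, pd = roc_points(scores, labels)
print(list(zip(pfa.round(2), pd.round(2))))
```

    Plotting such curves for imagery at different multiplicative-noise levels is one way to visualize how phase-error-induced degradation shifts the detection/false-alarm trade-off.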

  17. Relative Performance of Academic Departments Using DEA with Sensitivity Analysis

    ERIC Educational Resources Information Center

    Tyagi, Preeti; Yadav, Shiv Prasad; Singh, S. P.

    2009-01-01

    The process of liberalization and globalization of Indian economy has brought new opportunities and challenges in all areas of human endeavor including education. Educational institutions have to adopt new strategies to make best use of the opportunities and counter the challenges. One of these challenges is how to assess the performance of…

  19. Elementary School Size and Student Performance: A Conceptual Analysis

    ERIC Educational Resources Information Center

    Zoda, Pamela; Combs, Julie P.; Slate, John R.

    2011-01-01

    In this article, we reviewed the empirical literature concerning the relationship between school size and student performance with a focus was on determining the extent to which school size, specifically elementary school size, was related to student academic achievement. Most of the extant literature was on secondary school size with fewer…

  1. Performance Assessment in Water Polo Using Compositional Data Analysis

    PubMed Central

    Ordóñez, Enrique García; González, Carlos Touriño

    2016-01-01

    The aim of the present study was to identify the groups of offensive performance indicators that best discriminated between match score categories (favourable, balanced, or unfavourable) in water polo. The sample comprised 88 regular season games (2011-2014) from the Spanish Professional Water Polo League. The offensive performance indicators were clustered in five groups: Attacks in relation to the different playing situations; Shots in relation to the different playing situations; Attacks outcome; Origin of shots; and Technical execution of shots. The variables of each group had a constant sum equal to 100%. Because the data were compositional, the variables were transformed by means of the additive log-ratio (alr) transformation. Multivariate discriminant analyses comparing the match scores were calculated using the transformed variables. In terms of the percentage of correct classification, the results showed that the group that discriminated most between the match scores was “Attacks outcome” (60.4% for the original sample and 52.2% for cross-validation). The performance indicators that discriminated most between the match scores in games with penalties were goals (structure coefficient (SC) = .761), counterattack shots (SC = .541), and counterattacks (SC = .481). In matches without penalties, goals were the primary discriminating factor (SC = .576). This approach provides a new tool for comparing the importance of the offensive performance groups and their effect on match score discrimination. PMID:28031766

  3. Social Cognitive Career Theory, Conscientiousness, and Work Performance: A Meta-Analytic Path Analysis

    ERIC Educational Resources Information Center

    Brown, Steven D.; Lent, Robert W.; Telander, Kyle; Tramayne, Selena

    2011-01-01

    We performed a meta-analytic path analysis of an abbreviated version of social cognitive career theory's (SCCT) model of work performance (Lent, Brown, & Hackett, 1994). The model we tested included the central cognitive predictors of performance (ability, self-efficacy, performance goals), with the exception of outcome expectations. Results…

  5. Rapid analysis of phentolamine by high-performance liquid chromatography.

    PubMed

    Webster, Gregory K; Lemmer, Robert R; Greenwald, Steven

    2003-02-01

    A rapid liquid chromatographic method is validated for the quantitative analysis of phentolamine. Phentolamine exists in three forms for this investigation: as a mesylate salt, hydrochloride salt, and free base. In solution, phentolamine dissociates from its salt and is chromatographed as free phentolamine. This validation confirms the analysis of each form, which is simply based upon molar mass differences encountered in weighing. As such, both the United States Pharmacopeia hydrochloride and mesylate standards are used throughout this validation to demonstrate this equivalency. The validation demonstrates that this method may be used to quantitate phentolamine, regardless of its salt form.

  6. Theoretical performance analysis for CMOS based high resolution detectors.

    PubMed

    Jain, Amit; Bednarek, Daniel R; Rudin, Stephen

    2013-03-06

    High resolution imaging capabilities are essential for accurately guiding successful endovascular interventional procedures. Present x-ray imaging detectors are not always adequate due to their inherent limitations. The newly-developed high-resolution micro-angiographic fluoroscope (MAF-CCD) detector has demonstrated excellent clinical image quality; however, further improvement in performance and physical design may be possible using CMOS sensors. We have thus calculated the theoretical performance of two proposed CMOS detectors which may be used as a successor to the MAF. The proposed detectors have a 300 μm thick HL-type CsI phosphor, a 50 μm-pixel CMOS sensor with and without a variable gain light image intensifier (LII), and are designated MAF-CMOS-LII and MAF-CMOS, respectively. For the performance evaluation, linear cascade modeling was used. The detector imaging chains were divided into individual stages characterized by one of the basic processes (quantum gain, binomial selection, stochastic and deterministic blurring, additive noise). Ranges of readout noise and exposure were used to calculate the detectors' MTF and DQE. The MAF-CMOS showed slightly better MTF than the MAF-CMOS-LII, but the MAF-CMOS-LII showed far better DQE, especially for lower exposures. The proposed detectors can have improved MTF and DQE compared with the present high resolution MAF detector. The performance of the MAF-CMOS is excellent for the angiography exposure range; however it is limited at fluoroscopic levels due to additive instrumentation noise. The MAF-CMOS-LII, having the advantage of the variable LII gain, can overcome the noise limitation and hence may perform exceptionally for the full range of required exposures; however, it is more complex and hence more expensive.
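
    The linear cascade modeling mentioned above propagates signal mean and variance through each stochastic stage via the Burgess variance theorem. A minimal zero-frequency sketch with illustrative stage parameters (not the paper's detector values) follows:

```python
def cascade_gain(mean_in, var_in, gain, gain_var):
    """Propagate mean and variance through one stochastic gain stage
    (Burgess variance theorem): var_out = g^2 * var_in + mean_in * var_g."""
    return mean_in * gain, gain ** 2 * var_in + mean_in * gain_var

# Hypothetical two-stage chain for 1000 incident x-ray quanta (Poisson input)
n0 = 1000.0
m, v = n0, n0                                     # Poisson: variance = mean
m, v = cascade_gain(m, v, 0.8, 0.8 * (1 - 0.8))   # binomial absorption stage
m, v = cascade_gain(m, v, 50.0, 50.0)             # quantum gain, Poisson spread
dqe0 = (m ** 2 / v) / n0                          # zero-frequency DQE
print(round(dqe0, 3))
```

    With these numbers the result matches the closed form eta / (1 + 1/g) = 0.8 / 1.02, showing how a finite secondary gain slightly degrades the DQE below the absorption efficiency; blurring and additive-noise stages would be appended the same way.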

  7. An Integrated Solution for Performing Thermo-fluid Conjugate Analysis

    NASA Technical Reports Server (NTRS)

    Kornberg, Oren

    2009-01-01

    A method has been developed which integrates a fluid flow analyzer and a thermal analyzer to produce both steady state and transient results for 1-D, 2-D, and 3-D analysis models. The Generalized Fluid System Simulation Program (GFSSP) is a one-dimensional, general-purpose fluid analysis code which computes pressures and flow distributions in complex fluid networks. The MSC Systems Improved Numerical Differencing Analyzer (MSC.SINDA) is a one-dimensional, general-purpose thermal analyzer that solves network representations of thermal systems. Both GFSSP and MSC.SINDA have graphical user interfaces which are used to build the respective model and prepare it for analysis. The SINDA/GFSSP Conjugate Integrator (SGCI) is a form-based graphical integration program used to set input parameters for the conjugate analyses and run the models. This paper describes SGCI and its thermo-fluid conjugate analysis techniques and capabilities by presenting results from example models, including the cryogenic chilldown of a copper pipe, a bar between two walls in a fluid stream, and a solid plate creating a phase change in a flowing fluid.

  8. Mirror Analysis: How To Achieve Customer-Driven Human Performance.

    ERIC Educational Resources Information Center

    Mourier, Pierre

    1999-01-01

    Presents an evaluation/development method for achieving customer-driven improvement in organizations. Describes the steps to external and internal "mirror analysis," a process for determining if the organization functions as a mirror of customers' needs and expectations. Twelve figures illustrate factors in the process. (AEF)

  9. Applying Score Analysis to a Rehearsal Pedagogy of Expressive Performance

    ERIC Educational Resources Information Center

    Byo, James L.

    2014-01-01

    The discoveries of score analysis (e.g., minor seventh chord, ostinato, phrase elision, melodic fragment, half cadence) are more than just compositional techniques or music vocabulary. They are sounds--fascinating, storytelling, dynamic modes of expression--that when approached as such enrich the rehearsal experience. This article presents a…

  10. Analysis of Activity Patterns and Performance in Polio Survivors

    DTIC Science & Technology

    2006-10-01

    Only fragments of this abstract survive extraction: subjects provided a subjective estimate of their activity level over the past week using the Physical Activity Scale for the Elderly (PASE); cited work includes Talaty M., "Models for Gait Analysis," 5th SIAMOC (Società Italiana di Analisi del Movimento in Clinica) Congress, Loano, Italy.

  11. Design and Performance Analysis of a Relational Replicated Database System

    DTIC Science & Technology

    1988-01-01

    Only fragments of this report survive extraction, drawn from its table of contents and text: Data Placement; Value Range Partitioning (VRP) Data Placement; Analysis of RR and VRP; Data Structures for EMPLOYEES Relation; Partitioning of EMPLOYEES Relation Resulting from VRP. The system features a collection of autonomous, geographically dispersed systems which communicate over a long-haul communication network.

  12. An Analysis of Tasks Performed in the Ornamental Horticulture Industry.

    ERIC Educational Resources Information Center

    Berkey, Arthur L.; Drake, William E.

    This publication is the result of a detailed task analysis study of the ornamental horticulture industry in New York State. Nine types of horticulture businesses identified were: (1) retail florists, (2) farm and garden supply store, (3) landscape services, (4) greenhouse production, (5) nursery production, (6) turf production, (7) arborist…

  14. Rapid Automatized Naming and Reading Performance: A Meta-Analysis

    ERIC Educational Resources Information Center

    Araújo, Susana; Reis, Alexandra; Petersson, Karl Magnus; Faísca, Luís

    2015-01-01

    Evidence that rapid naming skill is associated with reading ability has become increasingly prevalent in recent years. However, there is considerable variation in the literature concerning the magnitude of this relationship. The objective of the present study was to provide a comprehensive analysis of the evidence on the relationship between rapid…

  15. Applying Score Analysis to a Rehearsal Pedagogy of Expressive Performance

    ERIC Educational Resources Information Center

    Byo, James L.

    2014-01-01

    The discoveries of score analysis (e.g., minor seventh chord, ostinato, phrase elision, melodic fragment, half cadence) are more than just compositional techniques or music vocabulary. They are sounds--fascinating, storytelling, dynamic modes of expression--that when approached as such enrich the rehearsal experience. This article presents a…

  16. An Integrated Solution for Performing Thermo-fluid Conjugate Analysis

    NASA Technical Reports Server (NTRS)

    Kornberg, Oren

    2009-01-01

    A method has been developed that integrates a fluid flow analyzer and a thermal analyzer to produce both steady-state and transient results for 1-D, 2-D, and 3-D analysis models. The Generalized Fluid System Simulation Program (GFSSP) is a one-dimensional, general-purpose fluid analysis code that computes pressures and flow distributions in complex fluid networks. The MSC Systems Improved Numerical Differencing Analyzer (MSC.SINDA) is a one-dimensional, general-purpose thermal analyzer that solves network representations of thermal systems. Both GFSSP and MSC.SINDA have graphical user interfaces that are used to build the respective models and prepare them for analysis. The SINDA/GFSSP Conjugate Integrator (SGCI) is a form-based graphical integration program used to set input parameters for the conjugate analyses and run the models. This paper describes SGCI and its thermo-fluid conjugate analysis techniques and capabilities by presenting results from example models, including the cryogenic chilldown of a copper pipe, a bar between two walls in a fluid stream, and a solid plate creating a phase change in a flowing fluid.
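
    The network-solver idea behind SINDA-style thermal analyzers can be sketched in a few lines: lumped capacitance nodes joined by conductors, advanced with explicit time steps. All values below (node capacitances, conductances, wall temperatures) are illustrative assumptions, not taken from GFSSP or MSC.SINDA.

```python
# Minimal sketch of a lumped-node thermal network solve.
# Nodes have capacitance C [J/K]; conductors G [W/K] link node pairs.
# Explicit Euler integration; values are illustrative only.

def step_network(T, C, conductors, dt):
    """Advance node temperatures one explicit time step."""
    Q = [0.0] * len(T)                      # net heat flow into each node [W]
    for i, j, G in conductors:
        q = G * (T[j] - T[i])               # conduction from node j into node i
        Q[i] += q
        Q[j] -= q
    return [t + dt * q / c for t, q, c in zip(T, Q, C)]

# "Bar between two walls": fixed-temperature ends modeled with huge capacitance.
T = [300.0, 400.0, 400.0, 400.0, 500.0]     # wall, 3 bar nodes, wall [K]
C = [1e12, 10.0, 10.0, 10.0, 1e12]          # end "walls" barely change
conductors = [(0, 1, 0.5), (1, 2, 0.5), (2, 3, 0.5), (3, 4, 0.5)]

for _ in range(5000):
    T = step_network(T, C, conductors, dt=1.0)

print([round(t, 1) for t in T])             # interior approaches a linear profile
```

    At steady state the interior nodes settle onto the linear conduction profile between the two wall temperatures, which is a quick sanity check on the network formulation.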

  18. The NetLogger Methodology for High Performance Distributed Systems Performance Analysis

    SciTech Connect

    Tierney, Brian; Johnston, William; Crowley, Brian; Hoo, Gary; Brooks, Chris; Gunter, Dan

    1999-12-23

    The authors describe a methodology that enables the real-time diagnosis of performance problems in complex high-performance distributed systems. The methodology includes tools for generating precision event logs that can be used to provide detailed end-to-end application and system level monitoring; a Java agent-based system for managing the large amount of logging data; and tools for visualizing the log data and real-time state of the distributed system. The authors developed these tools for analyzing a high-performance distributed system centered around the transfer of large amounts of data at high speeds from a distributed storage server to a remote visualization client. However, this methodology should be generally applicable to any distributed system. This methodology, called NetLogger, has proven invaluable for diagnosing problems in networks and in distributed systems code. This approach is novel in that it combines network, host, and application-level monitoring, providing a complete view of the entire system.
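
    The core of a NetLogger-style approach is simple to sketch: precision-timestamped event records from which end-to-end latencies can be computed. The event names and record fields below are invented for illustration; the real NetLogger toolkit defines its own log format and APIs.

```python
# Sketch of precision event logging for end-to-end latency analysis.
# Field and event names are illustrative, not NetLogger's actual schema.
import time

log = []

def log_event(event, **fields):
    """Append a timestamped event record to the in-memory log."""
    log.append({"ts": time.time(), "event": event, **fields})

def elapsed(log, start_event, end_event):
    """End-to-end time between the first start and first end event."""
    t0 = next(e["ts"] for e in log if e["event"] == start_event)
    t1 = next(e["ts"] for e in log if e["event"] == end_event)
    return t1 - t0

log_event("app.request.start", req_id=1)
time.sleep(0.01)                      # stand-in for network + server work
log_event("app.request.end", req_id=1)
print(f"end-to-end: {elapsed(log, 'app.request.start', 'app.request.end'):.4f} s")
```

    Correlating such events across hosts is where the real methodology earns its keep; that requires synchronized clocks, which the sketch above does not address.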

  19. ROC analysis of diagnostic performance in liver scintigraphy.

    PubMed

    Fritz, S L; Preston, D F; Gallagher, J H

    1981-02-01

    Studies on the accuracy of liver scintigraphy for the detection of metastases were assembled from 38 sources in the medical literature. An ROC curve was fitted to the observed values of sensitivity and specificity using an algorithm developed by Ogilvie and Creelman. This ROC curve fitted the data better than average sensitivity and specificity values in each of four subsets of the data. For the subset dealing with Tc-99m sulfur colloid scintigraphy, performed for detection of suspected metastases and containing data on 2800 scans from 17 independent series, it was not possible to reject the hypothesis that interobserver variation was entirely due to the use of different decision thresholds by the reporting clinicians. Thus the ROC curve obtained is a reasonable baseline estimate of the performance potentially achievable in today's clinical setting. Comparison of new reports with these data is possible, but is limited by the small sample sizes in most reported series.
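
    A minimal sketch of the empirical side of such an analysis follows: building an ROC curve and its area (AUC) from reader confidence scores. This is the generic empirical construction, not the Ogilvie and Creelman fitting algorithm used in the study, and the scores and labels are invented.

```python
# Empirical ROC curve and trapezoidal AUC from reader scores.
# Assumes no tied scores; a fitted (e.g. binormal) ROC would smooth this.

def roc_points(scores, labels):
    """(FPR, TPR) pairs as the decision threshold sweeps over the scores."""
    pairs = sorted(zip(scores, labels), reverse=True)
    P = sum(labels)
    N = len(labels) - P
    tp = fp = 0
    pts = [(0.0, 0.0)]
    for _, y in pairs:
        if y:
            tp += 1
        else:
            fp += 1
        pts.append((fp / N, tp / P))
    return pts

def auc(pts):
    """Area under the ROC curve by the trapezoidal rule."""
    return sum((x2 - x1) * (y1 + y2) / 2
               for (x1, y1), (x2, y2) in zip(pts, pts[1:]))

scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.4, 0.3, 0.2]   # e.g. reader confidence
labels = [1, 1, 0, 1, 1, 0, 0, 0]                    # 1 = metastasis present
print(round(auc(roc_points(scores, labels)), 3))
```

    The paper's point that different readers may sit at different thresholds on one underlying curve corresponds to picking different operating points along the list of (FPR, TPR) pairs.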

  20. Performance Analysis and Portability of the PLUM Load Balancing System

    NASA Technical Reports Server (NTRS)

    Oliker, Leonid; Biswas, Rupak; Gabow, Harold N.

    1998-01-01

    The ability to dynamically adapt an unstructured mesh is a powerful tool for solving computational problems with evolving physical features; however, an efficient parallel implementation is rather difficult. To address this problem, we have developed PLUM, an automatic portable framework for performing adaptive numerical computations in a message-passing environment. PLUM requires that all data be globally redistributed after each mesh adaption to achieve load balance. We present an algorithm for minimizing this remapping overhead by guaranteeing an optimal processor reassignment. We also show that the data redistribution cost can be significantly reduced by applying our heuristic processor reassignment algorithm to the default mapping of the parallel partitioner. Portability is examined by comparing performance on an SP2, an Origin2000, and a T3E. Results show that PLUM can be successfully ported to different platforms without any code modifications.
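
    The remapping step can be illustrated with a toy version of the processor-reassignment problem: choose the assignment of new partitions to processors that maximizes the data each processor already holds locally (equivalently, minimizes redistribution traffic). The brute-force search and the retain matrix below are illustrative only; PLUM's actual algorithms scale well beyond this toy size.

```python
# Toy processor reassignment: maximize locally retained data.
from itertools import permutations

# retain[p][q] = data units processor p already holds of new partition q
retain = [
    [9, 2, 1],
    [3, 8, 2],
    [1, 3, 7],
]

def best_assignment(retain):
    """Brute-force the partition-to-processor mapping keeping the most data."""
    n = len(retain)
    return max(permutations(range(n)),
               key=lambda perm: sum(retain[p][perm[p]] for p in range(n)))

perm = best_assignment(retain)
kept = sum(retain[p][perm[p]] for p in range(len(retain)))
print(perm, kept)   # here the identity mapping keeps 9+8+7 = 24 units
```

    For realistic processor counts this is an assignment problem, solvable optimally in polynomial time (e.g. with the Hungarian algorithm) rather than by enumeration.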

  1. Analysis of microgravity measurements performed during D1

    NASA Astrophysics Data System (ADS)

    Hamacher, H.; Jilg, R.; Merbold, U.

    Characteristic features and results of D1 μg-measurements are discussed as performed in the Material Science Double Rack (MSDR) and MEDEA. The μg-data are analysed with respect to selected mission events such as thruster firings for attitude control, operations of Spacelab experiment facilities, vestibular experiments and crew activities. The origins are divided into orbit, vehicle and experiment induced perturbations. It has been found that the μg-environment is dictated mainly by payload induced perturbations. To reduce the μg-level, the design of some experiment facilities has to be improved. In addition, strongly disturbing experiments and very sensitive investigations should be performed in separate mission phases.

  2. ATM solar array in-flight performance analysis

    NASA Technical Reports Server (NTRS)

    Thornton, J. P.; Crabtree, L. W.

    1974-01-01

    The physical and electrical characteristics of the Apollo Telescope Mount (ATM) solar array are described, and in-flight performance data are analyzed and compared with predicted results. Two solar cell module configurations were used. A Type I module consists of 228 2 x 6 cm solar cells with two cells in parallel and 114 cells in series. A Type II module contains 684 2 x 2 cm cells with six cells in parallel and 114 cells in series. A different interconnection scheme was used for each type. Panels using Type II modules with the mesh interconnect system performed marginally better than those using Type I modules with the loop interconnect system. The average degradation rate for the ATM array was 8.2% for a 271-day mission.

  3. Control Design and Performance Analysis for Autonomous Formation Flight Experiments

    NASA Astrophysics Data System (ADS)

    Rice, Caleb Michael

    Autonomous formation flight is a key approach for reducing greenhouse gas emissions and managing traffic in future high-density airspace. Unmanned Aerial Vehicles (UAVs) have made it possible to demonstrate and validate autonomous formation flight concepts inexpensively and without flight risk to human pilots. This thesis discusses the design, implementation, and flight testing of three formation flight control methods, Proportional Integral and Derivative (PID), Fuzzy Logic (FL), and NonLinear Dynamic Inversion (NLDI), and their respective performance behavior. Experimental results show that autonomous formation flight of good performance quality is achievable with a pair of low-cost unmanned fixed-wing research aircraft, and also with a solo vertical takeoff and landing (VTOL) quadrotor.

  4. Performance Analysis of XCPC Powered Solar Cooling Demonstration Project

    NASA Astrophysics Data System (ADS)

    Widyolar, Bennett K.

    A solar thermal cooling system using novel non-tracking External Compound Parabolic Concentrators (XCPC) has been built at the University of California, Merced and operated for two cooling seasons. Its performance in providing power for space cooling has been analyzed. This solar cooling system is comprised of 53.3 m2 of XCPC trough collectors which are used to power a 23 kW double effect (LiBr) absorption chiller. This is the first system that combines both XCPC and absorption chilling technologies. Performance of the system was measured in both sunny and cloudy conditions, with both clean and dirty collectors. It was found that these collectors are well suited at providing thermal power to drive absorption cooling systems and that both the coinciding of available thermal power with cooling demand and the simplicity of the XCPC collectors compared to other solar thermal collectors makes them a highly attractive candidate for cooling projects.

  5. NS&T Management Observations: Quarterly Performance Analysis

    SciTech Connect

    Gianotto, David

    2014-09-01

    The INL Management Observation Program (MOP) is designed to improve managers' and supervisors' understanding of the work being performed by employees and the barriers impacting their success. The MOP also increases workers' understanding of management's expectations as they relate to safety, security, quality, and work performance. Management observations are designed to improve the relationship and trust between employees and managers through increased engagement and interactions between managers and researchers in the field. As part of continuous improvement, NS&T management took the initiative to focus on the participation in and quality of observations in FY-14. This quarterly report is intended to (a) summarize the participation in and quality of management's observations, (b) assess observations for commonalities or trends related to facility or process barriers impacting research, and (c) provide feedback and make recommendations for improvements to NS&T's MOP.

  6. Transaction Performance vs. Moore's Law: A Trend Analysis

    NASA Astrophysics Data System (ADS)

    Nambiar, Raghunath; Poess, Meikel

    Intel co-founder Gordon E. Moore postulated in his famous 1965 paper that the number of components in integrated circuits had doubled every year from their invention in 1958 until 1965, and then predicted that the trend would continue for at least ten years. Later, David House, an Intel colleague, after factoring in the increase in performance of transistors, concluded that integrated circuits would double in performance every 18 months. Despite this trend in microprocessor improvements, your favorite text editor takes the same time to start and your PC takes pretty much the same time to reboot as they did 10 years ago. Can the same observation be made about systems supporting the fundamental aspects of our information-based economy, namely transaction processing systems?
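
    House's 18-month doubling rule compounds quickly; a two-line sketch makes the magnitude of the trend the paper compares against concrete.

```python
# Compounded performance growth under an 18-month doubling period.

def growth(months, doubling_period=18.0):
    """Performance multiple after the given number of months."""
    return 2 ** (months / doubling_period)

print(round(growth(18)))    # 2x after one doubling period
print(round(growth(120)))   # ~102x over ten years
```

    An ~100x hardware improvement per decade is the yardstick against which the paper's transaction-performance trend is measured.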

  7. Resilient Plant Monitoring System: Design, Analysis, and Performance Evaluation

    SciTech Connect

    Humberto E. Garcia; Wen-Chiao Lin; Semyon M. Meerkov; Maruthi T. Ravichandran

    2013-12-01

    Resilient monitoring systems are sensor networks that degrade gracefully under malicious attacks that cause their sensors to report misleading information. The goal of this paper is to design, analyze, and evaluate the performance of a resilient monitoring system intended to monitor plant conditions (normal or anomalous). The architecture developed consists of four layers: data quality assessment, process variable assessment, plant condition assessment, and sensor network adaptation. Each of these layers is analyzed by either analytical or numerical tools, and the performance of the overall system is evaluated using simulations. The resiliency of the resulting system is measured using Kullback-Leibler divergence, and is shown to be sufficiently high in all scenarios considered.
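
    The resiliency measure mentioned is based on Kullback-Leibler divergence. A minimal sketch for discrete distributions follows; the example distributions are invented for illustration, not taken from the paper.

```python
# Kullback-Leibler divergence between two discrete distributions, in bits.
from math import log2

def kl(p, q):
    """D(p || q); assumes q[i] > 0 wherever p[i] > 0."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

truth    = [0.9, 0.1]        # plant condition: P(normal), P(anomalous)
assessed = [0.7, 0.3]        # a monitoring system's assessment under attack
print(round(kl(truth, assessed), 4))
```

    Note that KL divergence is asymmetric: D(p || q) generally differs from D(q || p), so which distribution plays the role of reference matters when defining a resiliency measure.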

  8. Failure Analysis and Regeneration Performances Evaluation on Engine Lubricating Oil

    NASA Astrophysics Data System (ADS)

    Wang, X. L.; Zhang, G. N.; Zhang, J. Y.; Yin, Y. L.; Xu, Y.

    To investigate the failure and recycling behavior of lubricating oils, three typical 10W-40 lubricating oils used in heavy-load vehicles were selected: new oil, waste oil, and oil regenerated by a self-developed green regeneration technology. Tribological properties were tested with a four-ball friction and wear tester. The results indicated that the extreme-pressure performance of the regenerated oil increased by 34.1% compared with the waste oil, and its load-carrying ability is close to that of the new oil; its wear-scar characteristics are better than those of the waste oil, and its frictional coefficient almost reaches the level of the new oil's. Overall, the anti-wear and friction-reducing performance improves markedly after regeneration.

  9. Performance analysis of an inexpensive Direct Imaging Transmission Ion Microscope

    NASA Astrophysics Data System (ADS)

    Barnes, Patrick; Pallone, Arthur

    2013-03-01

    A direct imaging transmission ion microscope (DITIM) is built from a modified webcam and a commercially available polonium-210 antistatic device mounted on an optics rail. The performance of the DITIM in radiographic mode is analyzed in terms of the line spread function (LSF) and modulation transfer function (MTF) for an opaque edge. Limitations of, potential uses for, and suggested improvements to the DITIM are also discussed.
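
    The LSF-to-MTF step is standard: the MTF is the normalized magnitude of the Fourier transform of the LSF. The sketch below uses a pure-Python DFT on an assumed Gaussian line spread function rather than the DITIM's measured data.

```python
# MTF as the normalized DFT magnitude of the line spread function (LSF).
# Illustrative Gaussian LSF; a measured LSF would be substituted in practice.
from math import exp, cos, sin, pi

lsf = [exp(-((i - 32) / 4.0) ** 2) for i in range(64)]   # assumed Gaussian LSF

def dft_mag(x, k):
    """Magnitude of the k-th DFT coefficient of sequence x."""
    re = sum(xi * cos(-2 * pi * k * i / len(x)) for i, xi in enumerate(x))
    im = sum(xi * sin(-2 * pi * k * i / len(x)) for i, xi in enumerate(x))
    return (re * re + im * im) ** 0.5

mtf = [dft_mag(lsf, k) for k in range(8)]
mtf = [m / mtf[0] for m in mtf]          # normalize so MTF(0) = 1
print([round(m, 3) for m in mtf])        # falls off with spatial frequency
```

    A broader LSF (worse resolution) makes the MTF fall off faster, which is why the two quantities together characterize imaging performance.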

  10. Moisture and Structural Analysis for High Performance Hybrid Wall Assemblies

    SciTech Connect

    Grin, A.; Lstiburek, J.

    2012-09-01

    Based on past experience in the Building America program, BSC has found that combinations of materials and approaches (in other words, systems) usually provide optimum performance; integration is necessary, as described in this research project. The hybrid walls analyzed utilize a combination of exterior insulation, diagonal metal strapping, and spray polyurethane foam, and leave room for cavity-fill insulation. These systems can provide effective thermal, air, moisture, and water barrier systems in one assembly while also providing structure.

  11. Performance of silicon immersed gratings: measurement, analysis, and modeling

    NASA Astrophysics Data System (ADS)

    Rodenhuis, Michiel; Tol, Paul J. J.; Coppens, Tonny H. M.; Laubert, Phillip P.; van Amerongen, Aaldert H.

    2015-09-01

    The use of immersed gratings offers advantages for both space- and ground-based spectrographs. As diffraction takes place inside the high-index medium, the optical path difference and angular dispersion are boosted proportionally, thereby allowing a smaller grating area and a smaller spectrometer size. Short-wave infrared (SWIR) spectroscopy is used in space-based monitoring of greenhouse and pollution gases in the Earth's atmosphere. On the extremely large telescopes currently under development, mid-infrared high-resolution spectrographs will, among other things, be used to characterize exoplanet atmospheres. At infrared wavelengths silicon is transparent, which means that production methods used in the semiconductor industry can be applied to the fabrication of immersed gratings. Using such methods, we have designed and built immersed gratings for both space- and ground-based instruments, examples being the TROPOMI instrument for the European Space Agency's Sentinel-5 Precursor mission, Sentinel-5 (ESA), and the METIS (Mid-infrared E-ELT Imager and Spectrograph) instrument for the European Extremely Large Telescope. Three key parameters govern the performance of such gratings: the efficiency, the level of scattered light, and the induced wavefront error. In this paper we describe how we optimize these parameters during the design and manufacturing phases, focusing on the tools and methods used to measure the performance actually realized, and we present the results. The breadboard model (BBM) immersed grating developed for the SWIR-1 channel of Sentinel-5 is used to illustrate this process. Stringent requirements were specified for this grating for all three performance criteria. We show that, with some margin, the performance requirements have all been met.

  12. Performance analysis of a solar water pumping system

    SciTech Connect

    Katan, R.E.; Agelidis, V.G.; Nayar, C.V.

    1995-12-31

    The performance of a solar water pumping system is discussed in this paper. The system consists of a photovoltaic (PV) array, a permanent magnet (PM) DC motor and a helical rotor pump. The operation of the PV array is analyzed using PSPICE. The efficiency of the system is improved with a maximum power point tracker (MPPT) and a sun-tracker. Simulation and field test results are presented.
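
    Maximum power point trackers are commonly implemented with a perturb-and-observe loop: nudge the operating voltage, keep the direction if power rose, reverse it if power fell. The PV power curve and step size below are toy assumptions; the paper does not describe its MPPT implementation at this level of detail.

```python
# Perturb-and-observe MPPT sketch against a toy PV power curve.

def pv_power(v):
    """Toy PV power curve [W] with a single maximum at 17 V (assumed)."""
    return max(0.0, 80.0 - 0.5 * (v - 17.0) ** 2)

def mppt_po(v=10.0, dv=0.5, steps=200):
    """Perturb-and-observe: reverse the voltage step whenever power drops."""
    p_prev = pv_power(v)
    for _ in range(steps):
        v += dv
        p = pv_power(v)
        if p < p_prev:          # power dropped: reverse the perturbation
            dv = -dv
        p_prev = p
    return v

v_mp = mppt_po()
print(round(v_mp, 1))           # settles into oscillation around 17 V
```

    The characteristic trade-off is visible even in this sketch: a larger step size converges faster but oscillates more widely around the maximum power point at steady state.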

  13. Performance Analysis of Distributed Object-Oriented Applications

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1998-01-01

    The purpose of this research was to evaluate the efficiency of a distributed simulation architecture which creates individual modules which are made self-scheduling through the use of a message-based communication system used for requesting input data from another module which is the source of that data. To make the architecture as general as possible, the message-based communication architecture was implemented using standard remote object architectures (Common Object Request Broker Architecture (CORBA) and/or Distributed Component Object Model (DCOM)). A series of experiments were run in which different systems are distributed in a variety of ways across multiple computers and the performance evaluated. The experiments were duplicated in each case so that the overhead due to message communication and data transmission can be separated from the time required to actually perform the computational update of a module each iteration. The software used to distribute the modules across multiple computers was developed in the first year of the current grant and was modified considerably to add a message-based communication scheme supported by the DCOM distributed object architecture. The resulting performance was analyzed using a model created during the first year of this grant which predicts the overhead due to CORBA and DCOM remote procedure calls and includes the effects of data passed to and from the remote objects. A report covering the distributed simulation software and the results of the performance experiments has been submitted separately. The above report also discusses possible future work to apply the methodology to dynamically distribute the simulation modules so as to minimize overall computation time.

  14. Deployable Air Beam Fender System (DAFS): Energy Absorption Performance Analysis

    DTIC Science & Technology

    2007-03-30

    This report evaluates the energy absorption performance of the Deployable Air Beam Fender System (DAFS). Quarter-scale and full-scale models were evaluated and compared to prototype tests for a variety of inflation pressures, impact berthing conditions, and ballast levels. Model predictions were validated with correlated test data, and the explicit FEA method captured the fender response. In the first analysis step, the fender was inflated to the specified inflation pressure and the acceleration caused by gravity (386.4 in./s2) was applied.

  15. A Detection Theory Analysis of Visual Display Performance

    DTIC Science & Technology

    1988-01-01

    A detection theory analysis was used in an evaluation of four different display formats, including a numerical display composed of n two-digit numbers arranged in a linear horizontal format. Performance was evaluated for 1, 2, 4, 9, and 16 element displays and over a range of display durations.

  16. Analysis of Soldier Radio Waveform Performance in Operational Test

    DTIC Science & Technology

    2015-05-01

    This report analyzes Soldier Radio Waveform performance in operational test, including use on air vehicles. Interoperability with other, high-throughput networking waveforms, such as the Wideband Networking Waveform (WNW), is also a key concern. Though high overhead is often blamed for poor network performance (usually with good reason), the authors are unfamiliar with comparable M&S efforts. Some routing approaches minimize the periodic control messages that would be needed to update routing information; the downside is high latency due to the time spent discovering routes.

  17. Distributed Tracking Fidelity-Metric Performance Analysis Using Confusion Matrices

    DTIC Science & Technology

    2012-07-01

    Building on prior work in tracking evaluation [3, 4], T2T developments [5, 6], and simultaneous tracking and ID (STID) approaches [7, 8, 9, 10], the authors seek a method for distributed tracking. While there is interest in processing all the data via signal-level fusion, such as image fusion [31], the transmission of the data is limited by bandwidth. Section 2 describes the tracking metrics, Section 3 overviews the JBPDAF, Section 4 describes the CM DLF, and Section 5 presents a performance analysis.

  18. Performance Analysis of Automated Attack Graph Generation Software

    DTIC Science & Technology

    2006-12-01

    gapped from the target network . Although no performance details were available from Skybox, an examination of recent patents submitted by Skybox...distribution is unlimited 12b. DISTRIBUTION CODE 13. ABSTRACT (maximum 200 words) The current generation of network vulnerability detection software uses...databases of known vulnerabilities and scans target networks for these weaknesses. The results can be voluminous and difficult to assess. Thus, the

  19. Long-Term Performance Analysis of GIOVE Clocks

    DTIC Science & Technology

    2010-11-01

    Author affiliation: European Space Agency, ESA-ESTEC, Noordwijk, The Netherlands (contact: pierre.waller@esa.int). Related work was presented at the European Frequency and Time Forum (EFTF), 13-16 April 2010, Noordwijk, The Netherlands.

  20. Numerical analysis of maximal bat performance in baseball.

    PubMed

    Nicholls, Rochelle L; Miller, Karol; Elliott, Bruce C

    2006-01-01

    Metal baseball bats have been experimentally demonstrated to produce higher ball exit velocity (BEV) than wooden bats. In the United States, all bats are subject to BEV tests using hitting machines that rotate the bat in a horizontal plane. In this paper, a model of bat-ball impact was developed based on 3-D translational and rotational kinematics of a swing performed by high-level players. The model was designed to simulate the maximal performance of specific models of a wooden bat and a metal bat when swung by a player, and included material properties and kinematics specific to each bat. Impact dynamics were quantified using the finite element method (ANSYS/LSDYNA, version 6.1). Maximum BEV from both a metal (61.5 m/s) and a wooden (50.9 m/s) bat exceeded the 43.1 m/s threshold by which bats are certified as appropriate for commercial sale. The lower BEV from the wooden bat was attributed to a lower pre-impact bat linear velocity, and a more oblique impact that resulted in a greater proportion of BEV being lost to lateral and vertical motion. The results demonstrate the importance of factoring bat linear velocity and spatial orientation into tests of maximal bat performance, and have implications for the design of metal baseball bats.