Sample records for exergetic performance analysis

  1. Exergy analysis of hybrid nanofluids with optimum concentration in a plate heat exchanger

    NASA Astrophysics Data System (ADS)

    Kumar, Vikas; Tiwari, Arun Kumar; Ghosh, Subrata Kumar

    2018-06-01

    This paper presents a comparative analysis of the exergetic performance of hybrid nanofluids at optimum volume concentration in a plate heat exchanger (PHE). Different hybrid nanofluids (Al2O3 + MWCNT/water, TiO2 + MWCNT/water, ZnO + MWCNT/water, and CeO2 + MWCNT/water) were tested as coolants. A volume concentration of 0.75% was found to be the optimum. The requisite thermal and physical properties of the hybrid nanofluids were measured at 35 °C. Various exergetic performance parameters were examined to compare the different hybrid nanofluids. The highest reduction in exergy loss, about 24.75%, was obtained with the CeO2 + MWCNT/water hybrid nanofluid. Entropy generation decreased with increasing volume concentration. The results establish that CeO2 + MWCNT/water hybrid nanofluid is a promising coolant for exergetic performance in a PHE.

  2. Exergetic analysis of autonomous power complex for drilling rig

    NASA Astrophysics Data System (ADS)

    Lebedev, V. A.; Karabuta, V. S.

    2017-10-01

    The article considers ways of increasing the energy efficiency of the power equipment of a drilling rig. Diverse types of power plants are currently used in drilling-rig power supply systems, and energy efficiency is one of the main criteria when designing and selecting a power plant. The usual indicator is the effective efficiency factor calculated by the method of thermal balances. The article instead proposes the exergy method for determining energy efficiency, which makes it possible to estimate the degree of thermodynamic perfection of the system both relatively (the exergetic efficiency factor) and absolutely, as illustrated for a gas turbine plant. An exergetic analysis of a gas turbine plant operating in a simple scheme was carried out using the program WaterSteamPro, and the exergy losses in the equipment elements were calculated.

  3. Application of exergetic sustainability index to a nano-scale irreversible Brayton cycle operating with ideal Bose and Fermi gasses

    NASA Astrophysics Data System (ADS)

    Açıkkalp, Emin; Caner, Necmettin

    2015-09-01

    In this study, a nano-scale irreversible Brayton cycle operating with quantum gases (Bose and Fermi gases) is investigated. Developments in nanotechnology make the study of nano-scale machines, including thermal systems, unavoidable. A thermodynamic analysis of the cycle was performed, with particular attention to the exergetic sustainability index, alongside classical evaluation parameters such as work output, exergy output, entropy generation, and energy and exergy efficiencies. Results are presented numerically, together with some useful recommendations. Among the important findings: entropy generation and the exergetic sustainability index are affected most strongly by the degeneracy parameter x for the Bose gas, whereas power output and exergy output are affected most strongly for the Fermi gas. At high-temperature conditions, work output and entropy generation are high compared with other degeneracy conditions.

  4. Thermodynamic analysis of a new dual evaporator CO2 transcritical refrigeration cycle

    NASA Astrophysics Data System (ADS)

    Abdellaoui, Ezzaalouni Yathreb; Kairouani, Lakdar Kairouani

    2017-03-01

    In this work, a new dual-evaporator CO2 transcritical refrigeration cycle with two ejectors is proposed. In this new system, we propose to recover the condensation energy otherwise lost from the gas cooler, to operate the refrigeration cycle ejector-free, to enhance system performance, and to obtain dual-temperature refrigeration simultaneously. The effects of key parameters on the thermodynamic performance of the modified cycle are investigated theoretically through energetic and exergetic analyses. The simulation results indicate a system performance improvement of up to 46% over the single-ejector CO2 vapor compression cycle that uses the ejector as an expander. An exergetic analysis of the system is also presented. The performance characteristics of the proposed cycle show its promise as a dual-evaporator refrigeration system.

  5. The use of exergetic indicators in the food industry - A review.

    PubMed

    Zisopoulos, Filippos K; Rossier-Miranda, Francisco J; van der Goot, Atze Jan; Boom, Remko M

    2017-01-02

    Assessment of sustainability will become more relevant for the food industry in the years to come. Analysis based on exergy, including the use of exergetic indicators and Grassmann diagrams, is a useful tool for the quantitative and qualitative assessment of the efficiency of industrial food chains. In this paper, we review the methodology of exergy analysis and the exergetic indicators that are most appropriate for use in the food industry. The challenges of applying exergy analysis in industrial food chains and the specific features of food processes are also discussed.

  6. Energy and Exergy Analysis of a Diesel Engine Fuelled with Diesel and Simarouba Biodiesel Blends

    NASA Astrophysics Data System (ADS)

    Panigrahi, Nabnit; Mohanty, Mahendra Kumar; Mishra, Sruti Ranjan; Mohanty, Ramesh Chandra

    2018-02-01

    This article determines the available work and various losses of a diesel engine fuelled with diesel and SB20 (20% Simarouba biodiesel by volume blended with 80% diesel by volume). The energy and exergy analyses were carried out using the first and second laws of thermodynamics, respectively. The experiments were performed on a 3.5 kW compression ignition engine, and the analysis was conducted on a per-mole-of-fuel basis. The energy analysis indicates that about 37.23% and 37.79% of the input energy is converted into the capacity to do work for diesel and SB20, respectively. The exergetic efficiency was 34.8% and 35% for diesel and SB20, respectively. The comparative study indicates that the energetic and exergetic performance of SB20 resembles that of diesel fuel.

  7. Analyses of exergy efficiency for forced convection heat transfer in a tube with CNT nanofluid under laminar flow conditions

    NASA Astrophysics Data System (ADS)

    Hazbehian, Mohammad; Mohammadiun, Mohammad; Maddah, Heydar; Alizadeh, Mostafa

    2017-05-01

    In the present study, theoretical and experimental results of a second-law analysis of the performance of a uniform-heat-flux tube in the laminar flow regime are presented. Carbon nanotube (CNT)/water nanofluids are used as the working fluid. The experimental investigations covered the Reynolds number range 800-2600 and volume concentrations of 0.1-1%. Results are verified against well-known correlations. The focus is on the entrance region under laminar flow conditions for the SWCNT nanofluid. The results showed that the Nusselt number increased by about 90-270% with increasing nanoparticle volume concentration compared to water, the enhancement being particularly significant in the entrance region. Based on the exergy analysis, exergetic heat transfer effectiveness increased by 22-67% when employing nanofluids, and the exergetic efficiency increases with increasing nanoparticle concentration. On the other hand, exergy loss was reduced by 23-43% when employing nanofluids as the heat transfer medium compared to the conventional fluid. In addition, an empirical correlation for exergetic efficiency was developed; its predictions agree with the experimental results within ±5%.

  8. Exergy analysis of a solid oxide fuel cell micropowerplant

    NASA Astrophysics Data System (ADS)

    Hotz, Nico; Senn, Stephan M.; Poulikakos, Dimos

    In this paper, an analytical model of a micro solid oxide fuel cell (SOFC) system fed by butane is introduced and analyzed in order to optimize its exergetic efficiency. The micro SOFC system is equipped with a partial oxidation (POX) reformer, a vaporizer, two pre-heaters, and a post-combustor. A one-dimensional (1D) polarization model of the SOFC is used to examine the effects of concentration overpotentials, activation overpotentials, and ohmic resistances on cell performance. This 1D polarization model is extended in this study to a two-dimensional (2D) fuel cell model considering convective mass and heat transport along the fuel cell channel and from the fuel cell to the environment. The influence of significant operational parameters on the exergetic efficiency of the micro SOFC system is discussed. The present study shows the importance of an exergy analysis of the fuel cell as part of an entire thermodynamic system (transportable micropowerplant) generating electric power.

  9. Resource recovery from residual household waste: An application of exergy flow analysis and exergetic life cycle assessment.

    PubMed

    Laner, David; Rechberger, Helmut; De Soete, Wouter; De Meester, Steven; Astrup, Thomas F

    2015-12-01

    Exergy is based on the second law of thermodynamics; it expresses physical and chemical potential and provides a unified measure for resource accounting. In this study, exergy analysis was applied to four residual household waste management scenarios, with focus on the achieved resource recovery efficiencies. The calculated exergy efficiencies were used to compare the scenarios and to evaluate the applicability of exergy-based measures for expressing resource quality and for optimizing resource recovery. Exergy efficiencies were determined using two approaches: (i) exergy flow analysis of the waste treatment system under investigation and (ii) exergetic life cycle assessment (LCA) using the Cumulative Exergy Extraction from the Natural Environment (CEENE) method for resource accounting. Scenario efficiencies of around 17-27% were found with the exergy flow analysis (higher efficiencies were associated with high levels of material recycling), while the scenario efficiencies based on the exergetic LCA lay in a narrow range around 14%. Metal recovery was beneficial in both types of analysis, but had more influence on the overall efficiency in the exergetic LCA approach, as the avoided burdens of primary metal production were much more important than the exergy content of the recovered metals. On the other hand, plastic recovery was highly beneficial in the exergy flow analysis but rather insignificant in the exergetic LCA. The two approaches thereby yielded different quantitative results as well as different conclusions regarding material recovery. With respect to resource quality, the main challenge for the exergy flow analysis is the use of exergy content and exergy losses as a proxy for resource quality and resource losses, as exergy content is not per se correlated with the functionality of a material. In addition, the definition of appropriate waste system boundaries is critical for the exergy efficiencies derived from the flow analysis, as it is constrained by the limited information available about the composition of flows in the system and about secondary production processes and their interaction with primary or traditional production chains. In the exergetic LCA, resource quality could be reflected by the savings achieved through product substitution, and consideration of the waste's upstream burden allowed an evaluation of its resource potential. For a comprehensive assessment of resource efficiency in waste LCA, the sensitivity of accounting for product substitution should be carefully analyzed, and cumulative exergy consumption measures should be complemented by other impact categories.

  10. Sustainability assessment of turbofan engine with mixed exhaust through exergetic approach

    NASA Astrophysics Data System (ADS)

    Saadon, S.; Redzuan, M. S. Mohd

    2017-12-01

    In this study, the theory, methods and example application are described for a CF6 high-bypass turbofan engine with mixed exhaust flow based on exergo-sustainable point of view. To determine exergetic sustainability index, the turbofan engine has to undergo detailed exergy analysis. The sustainability indicators reviewed here are the overall exergy efficiency of the system, waste exergy ratio, exergy destruction factor, environmental effect factor and the exergetic sustainability index. The results obtained for these parameters are 26.9%, 73.1%, 38.6%, 2.72 and 0.37, respectively, for the maximum take-off condition of the engine. These results would be useful to better understand the connection between the propulsion system parameters and their impact to the environment in order to make it more sustainable for future development.
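    The five indicators quoted above are linked by simple algebraic relations when the definitions usual in exergo-sustainability studies are assumed (an assumption for illustration, not a statement of this paper's exact formulation). A quick consistency check in Python:

    ```python
    # Assumed standard definitions (exergo-sustainability literature):
    #   waste exergy ratio            = 1 - exergy efficiency
    #   environmental effect factor   = waste exergy ratio / exergy efficiency
    #   exergetic sustainability idx  = 1 / environmental effect factor
    def sustainability_indicators(exergy_efficiency):
        waste_exergy_ratio = 1.0 - exergy_efficiency
        environmental_effect_factor = waste_exergy_ratio / exergy_efficiency
        exergetic_sustainability_index = 1.0 / environmental_effect_factor
        return (waste_exergy_ratio, environmental_effect_factor,
                exergetic_sustainability_index)

    wer, eef, esi = sustainability_indicators(0.269)    # 26.9% from the abstract
    print(round(wer, 3), round(eef, 2), round(esi, 2))  # → 0.731 2.72 0.37
    ```

    The computed values reproduce the abstract's 73.1%, 2.72 and 0.37; the exergy destruction factor is defined relative to the total exergy input and cannot be recovered from the efficiency alone, so it is omitted here.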

  11. Practical exergy analysis of centrifugal compressor performance using ASME-PTC-10 data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carranti, F.J.

    1997-07-01

    It has been shown that measures of performance currently in use for industrial and process compressors do not give a true measure of energy utilization, and that the required assumptions of isentropic or adiabatic behavior are not always valid. A better indication of machine or process performance can be achieved using exergetic (second-law) efficiencies and by employing the second law of thermodynamics to characterize the irreversibilities and entropy generation in the compression process. In this type of analysis, performance is related to an environmental equilibrium condition, or dead state. Often, the differences between avoidable and unavoidable irreversibilities can be interpreted from these results. A general overview of the techniques involved in exergy analysis as applied to compressors and blowers is presented. A practical method that allows manufacturers and end users to calculate exergetic efficiencies is demonstrated using data from ASME Power Test Code input. These data are often readily available from compressor manufacturers for both design and off-design conditions, or can sometimes be obtained from field measurements. The calculations involved are simple and straightforward, and can demonstrate the energy usage situation for a variety of conditions. Here, off-design is taken to mean different rates of flow as well as different environmental states. The techniques presented are also applicable to many other equipment and process types.
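    A minimal numeric sketch of what such a second-law calculation looks like for an adiabatic compressor, treating air as an ideal gas with constant specific heats; the inlet/outlet states and the simplified property model are illustrative assumptions, not the PTC-10 procedure itself:

    ```python
    import math

    CP = 1.005   # kJ/(kg K), air, assumed constant
    R = 0.287    # kJ/(kg K), gas constant of air
    T0 = 298.15  # dead-state temperature, K

    def exergetic_efficiency(t1, p1, t2, p2):
        # Actual specific work of an adiabatic compressor = enthalpy rise
        w_actual = CP * (t2 - t1)
        # Specific entropy change of an ideal gas between the two states
        ds = CP * math.log(t2 / t1) - R * math.log(p2 / p1)
        # Minimum (reversible) work = rise in flow exergy; T0*ds is the
        # exergy destroyed by irreversibility
        w_min = w_actual - T0 * ds
        return w_min / w_actual

    # Hypothetical test-code states: 300 K, 100 kPa in; 480 K, 400 kPa out
    print(round(exergetic_efficiency(300.0, 100.0, 480.0, 400.0), 3))  # → 0.877
    ```

    The same arithmetic extends directly to off-design points by re-evaluating the function at each measured state.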

  12. Performance assessment of an irreversible nano Brayton cycle operating with Maxwell-Boltzmann gas

    NASA Astrophysics Data System (ADS)

    Açıkkalp, Emin; Caner, Necmettin

    2015-05-01

    Nanotechnology has developed very rapidly over recent decades, and nano-cycle thermodynamics should advance at a similar rate. In this paper, a nano-scale irreversible Brayton cycle working with helium is evaluated against different thermodynamic criteria: maximum work output, ecological function, ecological coefficient of performance, exergetic performance criterion and energy efficiency. A thermodynamic analysis was performed for these criteria and the results are presented numerically. In addition, the criteria are compared with each other and the most convenient choices for optimum conditions are suggested.

  13. Exergetic simulation of a combined infrared-convective drying process

    NASA Astrophysics Data System (ADS)

    Aghbashlo, Mortaza

    2016-04-01

    Optimal design and performance of a combined infrared-convective drying system with respect to energy use depends strongly on the application of advanced engineering analyses. This article proposes a theoretical approach for the exergy analysis of the combined infrared-convective drying process using a simple heat and mass transfer model. The applicability of the developed model to actual drying processes is demonstrated with an illustrative example for a typical food.

  14. Thrust Performance Evaluation of a Turbofan Engine Based on Exergetic Approach and Thrust Management in Aircraft

    NASA Astrophysics Data System (ADS)

    Yalcin, Enver

    2017-05-01

    Environmental parameters such as temperature and air pressure, which change with altitude, affect the thrust and fuel consumption of aircraft engines. On long routes, the thrust management function in the aircraft information system provides altitude and performance management. This study examined thrust changes throughout an entire flight by considering the energy and exergy performance of an aircraft engine; the fuel consumption of an engine used on a long-route flight was taken as the reference. Energetic and exergetic performance evaluations were made under various altitude conditions. The thrust changes across the altitude conditions reached 86.53% in the descending direction and 142.58% in the ascending direction, while the energy and exergy efficiencies of the referenced engine were found to be 80.77% and 84.45%, respectively. The results can help manage thrust and reduce fuel consumption, although engine performance will remain subject to operational requirements.
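    The altitude dependence mentioned above can be sketched with the International Standard Atmosphere (ISA) troposphere model: ambient density falls with altitude, and a common first-order approximation takes net thrust as roughly proportional to that density. The thrust-density link is an illustrative assumption here, not the paper's model:

    ```python
    # ISA troposphere constants (SI units)
    G0, R_AIR, LAPSE, T_SL = 9.80665, 287.058, 0.0065, 288.15

    def isa_density_ratio(alt_m):
        # Valid for 0..11 km: rho/rho0 = (T/T0)^(g/(R*L) - 1)
        t_ratio = (T_SL - LAPSE * alt_m) / T_SL
        return t_ratio ** (G0 / (R_AIR * LAPSE) - 1.0)

    # At the tropopause (~11 km, typical cruise), density — and so, roughly,
    # thrust — drops to about 30% of its sea-level value
    print(round(isa_density_ratio(11000.0), 3))  # → 0.297
    ```

    This already suggests why thrust changes of the magnitude reported above arise purely from climbing and descending through the atmosphere.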

  15. Exergetic analysis of a thermo-generator for automotive application: A dynamic numerical approach

    NASA Astrophysics Data System (ADS)

    Glavatskaya, O.; Goupil, C.; Bakkali, A. El; Shonda, O.

    2012-06-01

    It is well known that when driving a passenger car with an internal combustion engine (ICE), only a fraction of the burnt fuel energy actually contributes to propelling the vehicle. Typical passenger vehicle engines run at about 25% efficiency, while a great part of the remaining energy (about 40%) is lost through the exhaust gases. The exhaust has significant energy conversion potential, since its temperature (more than 300 °C) and mass flow rate are high enough. Thus, direct conversion of heat into electricity is a credible option if the overall system is optimized. This point is crucial, since the heat-to-work conversion process is very sensitive to any mismatch between the different parts of the system and to varying working conditions. All these effects constitute irreversibility sources that degrade the overall efficiency. Exergetic analysis is known to be an efficient tool for finding the root causes of these irreversible processes. To investigate the performance of our automotive thermo-generator, we propose an analysis of the exergy flow through the system under dynamic conditions. Taking into account the different irreversibility sources, such as thermal conduction and the Joule effect, we are able to localize and quantify the exergy losses. Then, to optimize the thermoelectric converter for a given vehicle, corrective actions in terms of design and working conditions can be proposed.

  16. Designing an artificial neural network using radial basis function to model exergetic efficiency of nanofluids in mini double pipe heat exchanger

    NASA Astrophysics Data System (ADS)

    Ghasemi, Nahid; Aghayari, Reza; Maddah, Heydar

    2018-06-01

    The present study aims at predicting and optimizing the exergetic efficiency of TiO2-Al2O3/water nanofluid at different Reynolds numbers, volume fractions and twist ratios using artificial neural networks (ANN) and experimental data. Central composite design (CCD) and a cascade radial basis function (RBF) network were used to display the significance levels of the analyzed factors on the exergetic efficiency. The size of the TiO2-Al2O3/water nanocomposite was 20-70 nm. The parameters of the ANN model were adapted by an RBF training algorithm with a wide range of experimental data. Total mean square error and the correlation coefficient were used to evaluate the results; the best result was obtained from a double-layer perceptron neural network with 30 neurons, for which the total mean square error (MSE) and correlation coefficient (R2) were 0.002 and 0.999, respectively, indicating successful prediction by the network. Moreover, the proposed equation for predicting exergetic efficiency was extremely successful. According to the optimal curves, the optimum design parameters of a double pipe heat exchanger with inner twisted tape and nanofluid, under the constraint of an exergetic efficiency of 0.937, are found to be a Reynolds number of 2500, a twist ratio of 2.5 and a volume fraction (v/v%) of 0.05.
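    A minimal sketch of RBF regression of the kind applied here to exergetic-efficiency data. Everything below is synthetic and illustrative: the inputs stand in for scaled Reynolds number, volume fraction and twist ratio, and the target is an invented smooth function, not the authors' measurements.

    ```python
    import numpy as np

    def rbf_design(X, centers, gamma):
        # Gaussian kernel matrix: phi[i, j] = exp(-gamma * ||x_i - c_j||^2)
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-gamma * d2)

    rng = np.random.default_rng(0)
    X = rng.uniform(size=(50, 3))                       # 50 synthetic operating points
    y = 0.5 + 0.3 * X[:, 0] - 0.2 * X[:, 1] * X[:, 2]  # synthetic "efficiency"

    # Using every sample as a center gives exact RBF interpolation; a deployed
    # model (the paper reports 30 neurons) would use fewer centers plus
    # regularization so that it generalizes beyond the training data.
    Phi = rbf_design(X, X, gamma=2.0)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)         # least-squares weights

    y_hat = Phi @ w
    print(round(float(np.corrcoef(y, y_hat)[0, 1]), 3))  # training-fit R, ~1.0
    ```

    The near-perfect training correlation mirrors the R2 = 0.999 quoted above, with the caveat that training fit alone says nothing about predictive accuracy on unseen conditions.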

  17. Sustainability Metrics of a Small Scale Turbojet Engine

    NASA Astrophysics Data System (ADS)

    Ekici, Selcuk; Sohret, Yasin; Coban, Kahraman; Altuntas, Onder; Karakoc, T. Hikmet

    2018-05-01

    Over the last decade, sustainable energy consumption has attracted the attention of scientists and researchers. The current paper presents sustainability indicators of a small scale turbojet engine, operated on micro-aerial vehicles, for discussion of the sustainable development of the aviation industry from a different perspective. Experimental data was obtained from an engine at full power load and utilized to conduct an exergy-based sustainability analysis. Exergy efficiency, waste exergy ratio, recoverable exergy ratio, environmental effect factor, exergy destruction factor and exergetic sustainability index are evaluated as exergetic sustainability indicators of the turbojet engine under investigation in the current study. The exergy efficiency of the small scale turbojet engine is calculated as 27.25 % whereas the waste exergy ratio, the exergy destruction factor and the sustainability index of the engine are found to be 0.9756, 0.5466 and 0.2793, respectively.

  18. Metallic phase change material thermal storage for Dish Stirling

    DOE PAGES

    Andraka, C. E.; Kruizenga, A. M.; Hernandez-Sanchez, B. A.; ...

    2015-06-05

    Dish-Stirling systems provide high-efficiency solar-only electrical generation and currently hold the world record at 31.25%. This high efficiency gives the system a strong chance of meeting the DOE SunShot goal of $0.06/kWh. However, current dish-Stirling systems do not incorporate thermal storage. For the next generation of non-intermittent and cost-competitive solar power plants, we propose adding a thermal energy storage system that combines latent (phase-change) energy transport and latent energy storage, in order to match the isothermal input requirements of Stirling engines while also maximizing the exergetic efficiency of the entire system. This paper reports current findings in the areas of selection, synthesis and evaluation of a suitable high-performance metallic phase change material (PCM), as well as potential interactions with containment alloy materials. The metallic PCMs, while more expensive than salts, have been identified as having substantial performance advantages, primarily due to their high thermal conductivity, which leads to high exergetic efficiency. Systems modeling has indicated, based on high dish-Stirling system performance, an allowable cost of the PCM storage system that is substantially higher than the SunShot goals for storage cost on tower systems. Several PCMs are identified with suitable melting temperature, cost, and performance.

  19. Fuels and chemicals from equine-waste-derived tail gas reactive pyrolysis oil: technoeconomic analysis, environmental and exergetic life cycle assessment

    USDA-ARS?s Scientific Manuscript database

    Horse manure, whose improper disposal imposes considerable environmental costs, constitutes an apt feedstock for conversion to renewable fuels and chemicals when tail gas reactive pyrolysis (TGRP) is employed. TGRP is a modification of fast pyrolysis that recycles its non-condensable gases and produ...

  20. Experimental study of 2-layer regenerators using Mn-Fe-Si-P materials

    NASA Astrophysics Data System (ADS)

    Christiaanse, T. V.; Trevizoli, P. V.; Misra, Sumohan; Carroll, Colman; van Asten, David; Zhang, Lian; Teyber, R.; Govindappa, P.; Niknia, I.; Rowe, A.

    2018-03-01

    This work describes an experimental study of a two-layer active magnetic regenerator with varying transition-temperature spacing. The transition temperature of each material is based on its specific-heat peak; throughout this paper, materials are referred to by a transition temperature defined as the average of the heating and cooling curves at zero applied field. The study uses five Mn-Fe-Si-P materials with transition temperatures of 294.6 K, 292.3 K, 290.7 K, 282.5 K and 281.4 K. Six different regenerators are tested. A reference configuration uses the 294.6 K material as the hot-side layer with a second, passive layer of lead spheres as the cold-side layer. Four further configurations use the same 294.6 K material as the hot-side layer but a different active cold-side material in each case: the 292.3 K, 290.7 K, 282.5 K and 281.4 K materials in sequence. Lastly, a sixth configuration uses the 292.3 K and 282.5 K materials. For each configuration, the temperature span is measured for rejection temperatures from 40 °C to 9 °C at 0 W and 2 W applied load. Experimental results for temperature span and exergetic cooling power are compared against the reference configuration. The materials are analysed using performance metrics such as peak adiabatic temperature change, peak entropy change and RCP(s) values. For the cases considered, closer transition-temperature spacing generally gives a greater temperature span and exergetic cooling power than wider spacing, even when the combined materials have comparatively lower performance metrics: when two materials with higher RCP(s) values but large transition-temperature spacing are compared to materials with lower RCP(s) values but closer spacing, the latter yields the higher exergetic cooling power and temperature span.

  1. Energo- and exergo-technical assessment of ground-source heat pump systems for geothermal energy production from underground mines.

    PubMed

    Amiri, Leyla; Madadian, Edris; Hassani, Ferri P

    2018-06-08

    The objective of this study is to perform an energy and exergy analysis of an integrated ground-source heat pump (GSHP) system, along with a technical assessment, for geothermal energy production, using Engineering Equation Solver (EES). The system comprises a heat pump cycle and a ground heat exchanger for extracting geothermal energy from underground mine water. A simultaneous energy and exergy analysis of the system was carried out. These analyses gave persuasive results, owing to the use of an economical and green source of energy. The energetic coefficient of performance (COP) of the entire system is 2.33 and the exergy efficiency of the system is 28.6%. The exergetic efficiencies of the compressor, ground heat exchanger, evaporator, expansion valve, condenser and fan are computed to be 38%, 42%, 53%, 55%, 60% and 64%, respectively. In the numerical investigation, alterations such as changing the temperature and pressure of the condenser show promising potential for further application of GSHPs. The outcomes of this research can be used for developing and designing novel coupled heat and power systems.
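    The link between the energetic COP and a second-law (exergy) efficiency can be sketched by comparing the actual COP with the reversible (Carnot) COP between the two reservoir temperatures. The temperatures below are illustrative assumptions, not values from the paper, so the resulting efficiency differs from the reported 28.6%:

    ```python
    def carnot_cop_heating(t_hot_k, t_cold_k):
        # Reversible heating COP between two reservoirs at t_hot_k, t_cold_k [K]
        return t_hot_k / (t_hot_k - t_cold_k)

    def exergy_efficiency(cop_actual, t_hot_k, t_cold_k):
        # Second-law efficiency = actual COP / reversible COP
        return cop_actual / carnot_cop_heating(t_hot_k, t_cold_k)

    # Illustrative only: 35 °C heat delivery, 8 °C mine water, COP from abstract
    print(round(exergy_efficiency(2.33, 308.15, 281.15), 3))  # → 0.204
    ```

    A system-level exergy analysis such as the paper's additionally accounts for the exergy destroyed in each component, which is why the component efficiencies listed above all exceed the whole-system figure.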

  2. Exergy as a useful tool for the performance assessment of aircraft gas turbine engines: A key review

    NASA Astrophysics Data System (ADS)

    Şöhret, Yasin; Ekici, Selcuk; Altuntaş, Önder; Hepbasli, Arif; Karakoç, T. Hikmet

    2016-05-01

    It is known that aircraft gas turbine engines operate according to thermodynamic principles. Exergy is considered a very useful tool for assessing machines working on the basis of thermodynamics. In the current study, exergy-based assessment methodologies are initially explained in detail. A literature overview is then presented. According to the literature overview, turbofans may be described as the most investigated type of aircraft gas turbine engines. The combustion chamber is found to be the most irreversible component, and the gas turbine component needs less exergetic improvement compared to all other components of an aircraft gas turbine engine. Finally, the need for analyses of exergy, exergo-economic, exergo-environmental and exergo-sustainability for aircraft gas turbine engines is emphasized. A lack of agreement on exergy analysis paradigms and assumptions is noted by the authors. Exergy analyses of aircraft gas turbine engines, fed with conventional fuel as well as alternative fuel using advanced exergy analysis methodology to understand the interaction among components, are suggested to those interested in thermal engineering, aerospace engineering and environmental sciences.

  3. Thermodynamic and Mechanical Analysis of a Thermomagnetic Rotary Engine

    NASA Astrophysics Data System (ADS)

    Fajar, D. M.; Khotimah, S. N.; Khairurrijal

    2016-08-01

    A heat engine in a magnetic system has three thermodynamic coordinates: magnetic intensity ℋ, total magnetization ℳ, and temperature T, where the first two are respectively analogous to the pressure P and volume V of a gaseous system. Consequently, the Carnot cycle that constitutes the principle of a heat engine in a gaseous system is also valid in a magnetic system. A thermomagnetic rotary engine is one model of this: a ferromagnetic wheel that rotates because of the change in magnetization at the Curie temperature. The study aims to describe the thermodynamic and mechanical analysis of a thermomagnetic rotary engine and to calculate its efficiencies. In the thermodynamic view, the ideal processes are isothermal demagnetization, adiabatic demagnetization, isothermal magnetization, and adiabatic magnetization; the thermodynamic efficiency depends on the temperature difference between the hot and cold reservoirs. In the mechanical view, the rotational work is determined through calculation of the moment of inertia and the average angular speed, and the mechanical efficiency is calculated as the ratio of rotational work to the heat received by the system. The study also obtains the exergetic efficiency, which characterizes the performance quality of the engine.
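    Because the Carnot cycle applies in the magnetic system just as in the gaseous one, the thermodynamic efficiency is bounded by the Carnot limit, which depends only on the reservoir temperatures. A minimal sketch (the temperatures are illustrative, not from the paper; a gadolinium wheel, for example, would operate near its Curie point of roughly 293 K):

    ```python
    def carnot_efficiency(t_hot_k, t_cold_k):
        # Upper bound for any heat engine, magnetic or gaseous
        return 1.0 - t_cold_k / t_hot_k

    # Hypothetical reservoirs: 350 K hot side, 293 K (near a Curie point) cold side
    print(round(carnot_efficiency(350.0, 293.0), 3))  # → 0.163
    ```

    Real thermomagnetic engines fall well below this bound once the mechanical losses captured by the paper's moment-of-inertia analysis are included.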

  4. Heat Transfer and Friction Characteristics of Artificially Roughened Duct used for Solar Air Heaters—a Review

    NASA Astrophysics Data System (ADS)

    Kumar, Khushmeet; Prajapati, D. R.; Samir, Sushant

    2018-02-01

    A solar air heater uses energy from the sun to heat air. The conversion rate of solar energy to heat depends on the efficiency of the solar air heater, and this efficiency can be increased by applying artificial roughness to the surface of the absorber plate. Various studies have analysed the effect of different roughness geometries on heat transfer and friction factor characteristics. The thermo-hydraulic performance of a solar air heater can be evaluated in terms of effective efficiency, a thermo-hydraulic performance parameter and exergetic efficiency. This review surveys the various roughness geometries used to improve the performance of solar air heaters and presents the correlations developed by various researchers.

  5. Exergy analysis of helium liquefaction systems based on modified Claude cycle with two-expanders

    NASA Astrophysics Data System (ADS)

    Thomas, Rijo Jacob; Ghosh, Parthasarathi; Chowdhury, Kanchan

    2011-06-01

    Large-scale helium liquefaction systems, being energy-intensive, demand judicious selection of process parameters. An effective tool for the design and analysis of thermodynamic cycles for these systems is exergy analysis, which is used here to study the behavior of a helium liquefaction system based on a modified Claude cycle. Parametric evaluation using the process simulator Aspen HYSYS® helps to identify the effects of cycle pressure ratio and expander flow fraction on the exergetic efficiency of the liquefaction cycle. The study computes the distribution of losses at the different refrigeration stages of the cycle and helps in selecting optimum cycle pressures, operating temperature levels of the expanders and mass flow rates through them. Results from the analysis may help in evolving guidelines for designing appropriate thermodynamic cycles for practical helium liquefaction systems.

  6. A methodology for thermodynamic simulation of high temperature, internal reforming fuel cell systems

    NASA Astrophysics Data System (ADS)

    Matelli, José Alexandre; Bazzo, Edson

    This work presents a methodology for the simulation of fuel cells to be used for power production in small on-site power/cogeneration plants fuelled by natural gas. The methodology contemplates thermodynamic and electrochemical aspects related to molten carbonate and solid oxide fuel cells (MCFC and SOFC, respectively). Internal steam reforming of the natural gas hydrocarbons is considered for hydrogen production. From inputs such as cell potential, cell power, number of cells in the stack, ancillary-system power consumption, reformed natural gas composition and hydrogen utilization factor, the simulation gives the natural gas consumption, the anode and cathode stream gas temperatures and compositions, and the thermodynamic, electrochemical and practical efficiencies. Both energetic and exergetic methods are considered for performance analysis. The results obtained from the natural gas reforming thermodynamics simulation show that hydrogen production is maximum around 700 °C for a steam/carbon ratio equal to 3. Consistent with the literature, the results indicate that the SOFC is more efficient than the MCFC.

  7. Evaluation of Fuel Cell Operation and Degradation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Mark; Gemmen, Randall; Richards, George

    The concepts of area specific resistance (ASR) and degradation are developed for different fuel cell operating modes. The concepts of exergetic efficiency and entropy production were applied to ASR and degradation. It is shown that exergetic efficiency is a time-dependent function useful for describing the thermal efficiency of a fuel cell and the change in thermal efficiency of a degrading fuel cell. Entropy production was evaluated for constant-voltage and constant-current operation of a fuel cell undergoing ohmic degradation. It was found that the Gaussian hypergeometric function describes the cumulative entropy and electrical work produced by fuel cells operating at constant voltage. The Gaussian hypergeometric function is found in many applications in modern physics. This paper builds from and is an extension of several papers recently published by the authors in the Journal of The Electrochemical Society (ECS), ECS Transactions, Journal of Power Sources, and the Journal of Fuel Cell Science and Technology.

  8. Exergy Analysis of the Cryogenic Helium Distribution System for the Large Hadron Collider (lhc)

    NASA Astrophysics Data System (ADS)

    Claudet, S.; Lebrun, Ph.; Tavian, L.; Wagner, U.

    2010-04-01

    The Large Hadron Collider (LHC) at CERN features the world's largest helium cryogenic system, spreading over the 26.7 km circumference of the superconducting accelerator. With a total equivalent capacity of 145 kW at 4.5 K, including 18 kW at 1.8 K, the LHC refrigerators produce an unprecedented exergetic load, which must be distributed efficiently to the magnets in the tunnel over the 3.3 km length of each of the eight independent sectors of the machine. We recall the main features of the LHC cryogenic helium distribution system at its different temperature levels and present its exergy analysis, thus enabling us to quantify second-principle efficiency and identify the main remaining sources of irreversibility.

  9. Microsystem process networks

    DOEpatents

    Wegeng, Robert S [Richland, WA; TeGrotenhuis, Ward E [Kennewick, WA; Whyatt, Greg A [West Richland, WA

    2006-10-24

    Various aspects and applications of microsystem process networks are described. The design of many types of microsystems can be improved by ortho-cascading mass, heat, or other unit process operations. Microsystems having exergetically efficient microchannel heat exchangers are also described. Detailed descriptions of numerous design features in microcomponent systems are also provided.

  10. Performance Evaluation of an Experimental Turbojet Engine

    NASA Astrophysics Data System (ADS)

    Ekici, Selcuk; Sohret, Yasin; Coban, Kahraman; Altuntas, Onder; Karakoc, T. Hikmet

    2017-11-01

    An exergy analysis of a gas turbine engine is presented, including design parameters and a performance assessment, by identifying the losses and efficiencies of its components. The aim of this paper is to determine the performance of a small turbojet engine through an exergetic analysis based on test data. Experimental data were collected at full load of the small turbojet engine. The turbojet engine exhaust contains CO2, CO, CH4, H2, H2O, NO, NO2, N2 and O2, with a relative humidity of 35% for the ambient air during the experiments. The main components of the turbojet engine evaluated are the air compressor, the combustion chamber and the gas turbine. As a result of the thermodynamic analysis, the exergy efficiencies (based on product/fuel) of the air compressor, the combustion chamber and the gas turbine are 81.57%, 50.13% and 97.81%, respectively. The major share of the total exergy destruction was found in the combustion chamber, at 167.33 kW. The exergy destruction rates are 8.20%, 90.70% and 1.08% in the compressor, the combustion chamber and the gas turbine, respectively. The rates of exergy destruction within the system components are also compared on the basis of the exergy rate of the fuel provided to the engine. On this basis, 4.50% of the fuel exergy rate is rendered unusable by exergy destruction within the compressor, 49.76% within the combustion chamber and 0.59% within the gas turbine. It can be stated that approximately 55% of the exergy rate of the fuel provided to the engine cannot be used by the engine.
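    The reported figures are internally consistent: the component shares of exergy destruction sum to 100%, and the per-component unusable fractions of fuel exergy sum to the quoted "approximately 55%". A quick cross-check:

```python
# Cross-check of the exergy-destruction figures quoted in the abstract.
destruction_share = {          # % of total exergy destruction
    "compressor": 8.20,
    "combustion chamber": 90.70,
    "gas turbine": 1.08,
}
fuel_unusable = {              # % of fuel exergy rendered unusable per component
    "compressor": 4.50,
    "combustion chamber": 49.76,
    "gas turbine": 0.59,
}

total_unusable = sum(fuel_unusable.values())          # 54.85 -> "approximately 55 %"
assert abs(sum(destruction_share.values()) - 100.0) < 0.1

# Each component's unusable fraction, normalized by the total, reproduces its
# share of the total exergy destruction (to rounding).
for name in destruction_share:
    share = 100.0 * fuel_unusable[name] / total_unusable
    assert abs(share - destruction_share[name]) < 0.1

print(f"total unusable fuel exergy: {total_unusable:.2f} %")  # 54.85 %
```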

  11. Study on the Effect of a Cogeneration System Capacity on its CO2 Emissions

    NASA Astrophysics Data System (ADS)

    Fonseca, J. G. S., Jr.; Asano, Hitoshi; Fujii, Terushige; Hirasawa, Shigeki

    With the global warming problem worsening and the subsequent implementation of the Kyoto Protocol, CO2 emissions are becoming an important factor when verifying the usability of cogeneration systems. Considering this, the purpose of this work is to study the effect of the capacity of a cogeneration system on its CO2 emissions under two kinds of operation strategies: one focused on exergetic efficiency and another on running cost. The system meets the demand pattern typical of a hospital in Japan, operating during one year with an average heat-to-power ratio of 1.3. The main equipment items of the cogeneration system are a gas turbine with a waste heat boiler, a main boiler and an auxiliary steam turbine. Each of these was characterized with partial-load models, and the turbine efficiencies at full load changed according to the system capacity. It was also assumed that any surplus electricity generated could be sold. The main results showed that, for all of the capacities simulated, an operational strategy focused on exergetic efficiency always resulted in a higher CO2 emissions reduction than the strategy focused on running cost. Furthermore, the amount of reduction in emissions decreased as the system capacity decreased, reaching 1.6% when the system capacity was 33% of the maximum electricity demand with a heat-to-power ratio of 4.1. When the system operated with a focus on running cost, the economic savings increased with capacity and reached 42% for a system capacity of 80% of maximum electricity demand with a heat-to-power ratio of 2.3. In such conditions, however, there was an increase in emissions of 8.5%. For the same capacity, an exergetic efficiency operation strategy presented the best balance between cost and emissions, generating economic savings of 29% with a decrease in CO2 emissions of 7.1%. The results show the importance of an exergy-focused operational strategy and also indicate that lower capacities result in smaller gains in both CO2 emissions and running cost reduction.

  12. Econophysics and bio-chemical engineering thermodynamics: The exergetic analysis of a municipality

    NASA Astrophysics Data System (ADS)

    Lucia, Umberto

    2016-11-01

    Exergy is a fundamental quantity because it allows us to obtain information on the useful work obtainable in a process. Analyses of irreversibility are important not only in the design and development of industrial devices, but also in fundamental thermodynamics and in the socio-economic analysis of a municipality. Consequently, the link between entropy and exergy is discussed in order to connect econophysics to bio-chemical engineering thermodynamics. This link, in turn, points to the fundamental role of fluxes and of the exergy exchanged in the interaction between a system and its environment. The result is a thermodynamic approach to the analysis of the unavailability of economic, productive or social systems, where unavailability is what the system cannot use in relation to its internal processes. This quantity is also of interest as a support to public managers for economic decisions. Here, the Alessandria Municipality is analyzed in order to illustrate the application of the theoretical results.

  13. Influence of the cooling degree upon performances of internal combustion engine

    NASA Astrophysics Data System (ADS)

    Grǎdinariu, Andrei Cristian; Mihai, Ioan

    2016-12-01

    To date, air cooling systems still raise several unsolved problems due to the conditions imposed by the environment in terms of temperature and pollution levels. The present paper investigates the impact of the engine cooling degree upon its performances, since a high specific power is desired with as low as possible a fuel consumption. A technical solution advanced by the authors [1] consists of constructing a bi-flux compressor, which can enhance the engine's performances. The bi-flux axial compressor accomplishes two major functions: it cools down the engine and it also turbocharges it. The present paper investigates the temperature changes of the fresh charge during the use of a bi-flux axial compressor. This compressor is simple, compact and economical, and offers an optimal response at low rotational speeds of the engine when two compression stages are used. The influence of the relative coefficient of air temperature drop upon the working agent temperature at the intercooler exit is also investigated. The variation of the thermal load coefficient with respect to the working agent temperature during engine cooling is examined as well. The variation of the average combustion temperature is analyzed in correlation with the thermal load coefficient and the temperatures of the working fluid at its exit from the cooling system. Finally, an exergetic analysis of the influence of the cooling degree on the motor fluid and on the gases resulting from the combustion process was conducted.

  14. High performance felt-metal-wick heat pipe for solar receivers

    NASA Astrophysics Data System (ADS)

    Andraka, Charles E.; Moss, Timothy A.; Baturkin, Volodymyr; Zaripov, Vladlen; Nishchyk, Oleksandr

    2016-05-01

    Sodium heat pipes have been identified as a potentially effective heat transport approach for CSP systems that require near-isothermal input to power cycles or storage, such as dish Stirling and highly recuperated reheat-cycle supercritical CO2 turbines. Heat pipes offer high heat flux capabilities, leading to small receivers, as well as low exergetic losses through isothermal coupling with the engine. Sandia developed a felt-metal-wick approach in the 1990s and demonstrated very high performance [1]. However, multiple durability issues arose, primarily the structural collapse of the wick at temperature over short time periods. NTUU developed several methods of improving the robustness of the wick [2], but the resulting wick had limited performance capabilities. For application to CSP systems, the wick structures must retain high heat pipe performance with robustness for long-term operation. In this paper we present our findings in developing an optimal balance between performance and ruggedness, including operation of a laboratory-scale heat pipe for over 5500 hours so far. Application of heat pipes to dish-Stirling systems has been shown to increase performance by as much as 20% [3], and application to supercritical CO2 systems has been proposed.

  15. Design of Particle-Based Thermal Energy Storage for a Concentrating Solar Power System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Zhiwen; Zhang, Ruichong; Sawaged, Fadi

    Solid particles can operate at higher temperatures than current molten salts or oils, and they can serve as a heat-transfer and storage medium in a concentrating solar power (CSP) system. By using inexpensive solid particles and containment material for thermal energy storage (TES), the particle-TES cost can be significantly lower than that of other TES methods such as a nitrate-salt system. The particle-TES system can hold hot particles at more than 800 °C with high thermal performance. The high particle temperatures increase the temperature difference between the hot and cold particles, which improves the TES capacity. The particle-based CSP system is able to support high-efficiency power generation, such as the supercritical carbon-dioxide Brayton power cycle, to achieve >50% thermal-electric conversion efficiency. This paper describes a solid particle-TES system integrated into a CSP plant. The hot particles discharge to a heat exchanger to drive the power cycle. The returning cold particles circulate through a particle receiver to absorb solar heat and charge the TES. This paper shows the design of a particle-TES system including containment silos, foundation, silo insulation, and particle materials. The analysis provides results for four TES capacities and two silo configurations. The design analysis indicates that the system can achieve high thermal efficiency, storage effectiveness (i.e., percentage usage of the hot particles), and exergetic efficiency. An insulation method for the hot silo was also considered. The particle-TES system can achieve high performance and low cost, and it holds potential for next-generation CSP technology.
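    The capacity benefit of hotter particles follows directly from the sensible-heat relation Q = m·cp·ΔT. A sketch with illustrative property values (the heat capacities and temperature levels below are assumptions, not design data from the paper):

```python
def sensible_storage_mj(mass_kg: float, cp_kj_per_kg_k: float,
                        t_hot_c: float, t_cold_c: float) -> float:
    """Stored sensible heat Q = m * cp * (T_hot - T_cold), returned in MJ."""
    return mass_kg * cp_kj_per_kg_k * (t_hot_c - t_cold_c) / 1000.0

# Illustrative comparison for 1 tonne of medium; cp and temperature swings assumed.
particles = sensible_storage_mj(1000.0, 1.0, t_hot_c=800.0, t_cold_c=300.0)  # 500 MJ
salt = sensible_storage_mj(1000.0, 1.5, t_hot_c=565.0, t_cold_c=290.0)       # 412.5 MJ
print(f"particle bed: {particles:.0f} MJ, nitrate salt: {salt:.0f} MJ")
```

Even with a lower specific heat, the wider allowable temperature swing gives the particle bed the larger capacity per unit mass in this sketch.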

  16. Exergy and extended exergy accounting of very large complex systems with an application to the province of Siena, Italy.

    PubMed

    Sciubba, Enrico; Bastianoni, Simone; Tiezzi, Enzo

    2008-01-01

    This paper describes the application of exergy and extended exergy analyses to large complex systems. The system to be analysed is assumed to be at steady state, and the input and output fluxes of matter and energy are expressed in units of exergy. Human societies of any reasonable extent are indeed Very Large Complex Systems and can be represented as interconnected networks of N elementary "components", their Subsystems; the detail of the disaggregation depends on the type and quality of the available data. The structural connectivity of the "model" of the System must correctly describe the interactions of each mass or energy flow with each sector of the society: since it is seldom the case that all of these fluxes are available in detail, some preliminary mass- and energy balances must be completed and constitute in fact a part of the initial assumptions. Exergy accounting converts the total amount of resources inflow into their equivalent exergetic form with the help of a table of "raw exergy data" available in the literature. The quantification of each flow on a homogeneous exergetic basis paves the way to the evaluation of the efficiency of each energy and mass transfer between the N sectors and makes it possible to quantify the irreversible losses and identify their sources. The advantage of the EEA, compared to a classical exergy accounting, is the inclusion in the system balance of the exergetic equivalents of three additional "Production Factors": human Labour, Capital and Environmental Remediation costs. EEA has an additional advantage: it allows for the calculation of the efficiency of the domestic sector (impossible to evaluate with any other energy- or exergy-based method) by considering the working hours as its product. 
    As implied in the title, the method was applied to a model of the province of Siena (on a year-2000 database): the results show that the sectors of this Province have efficiency values close to the Italian average, with the exception of the commercial and energy-conversion sectors, which are more efficient, in agreement with the rather peculiar socio-economic situation of the Province. The largest inefficiency is found in the transportation sector, which has an efficiency lower than 30% in EEA and lower than 10% in classical exergy accounting.
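    The bookkeeping described above can be sketched as follows; the exergetic-equivalent conversion factors for Labour and Capital are society-specific, and all numbers here are purely illustrative assumptions:

```python
def extended_exergy_mj(physical_mj: float, labour_hours: float, capital_units: float,
                       remediation_mj: float, ee_labour_mj_per_h: float,
                       ee_capital_mj_per_unit: float) -> float:
    """Extended exergy input: physical exergy plus the exergetic equivalents of
    the EEA production factors Labour, Capital and Environmental Remediation."""
    return (physical_mj
            + labour_hours * ee_labour_mj_per_h
            + capital_units * ee_capital_mj_per_unit
            + remediation_mj)

def sector_efficiency(product_mj: float, extended_input_mj: float) -> float:
    """Second-law-style efficiency of a sector on an extended-exergy basis."""
    return product_mj / extended_input_mj
```

For the domestic sector, EEA counts working hours as the product, which is what makes that sector's efficiency computable at all.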

  17. Comparison between reverse Brayton and Kapitza based LNG boil-off gas reliquefaction system using exergy analysis

    NASA Astrophysics Data System (ADS)

    Kochunni, Sarun Kumar; Chowdhury, Kanchan

    2017-02-01

    LNG boil-off gas (BOG) reliquefaction systems in LNG carrier ships use refrigeration devices based on reverse Brayton, Claude, Kapitza (modified Claude) or cascade cycles. Some of these refrigeration devices use nitrogen as the refrigerant, and hence nitrogen storage vessels or nitrogen generators need to be installed in LNG carrier ships, which consume space and add weight to the carrier. In the present work, a new configuration based on the Kapitza liquefaction cycle, which uses the BOG itself as the working fluid, is proposed and compared with a reverse Brayton cycle (RBC) in terms of heat exchanger sizes and compressor operating parameters. Exergy analysis is performed after steady-state simulation with Aspen Hysys 8.6®, and the comparison between the RBC and Kapitza systems may help designers choose a reliquefaction system with appropriate process parameters and equipment sizes. With an exergetic efficiency comparable to that of an RBC, a Kapitza system needs only a BOG compressor, without any need for nitrogen gas.
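    The figure of merit in such comparisons is the exergetic efficiency of the reliquefier, i.e. the rate of exergy delivered to the reliquefied BOG stream divided by the compressor work input. A minimal sketch with illustrative numbers (not values from the paper):

```python
def reliquefier_exergy_efficiency(m_dot_kg_s: float,
                                  specific_exergy_gain_kj_kg: float,
                                  compressor_power_kw: float) -> float:
    """Exergetic efficiency = exergy rate added to the BOG stream / work input."""
    return m_dot_kg_s * specific_exergy_gain_kj_kg / compressor_power_kw

# Illustrative: 0.5 kg/s of BOG gaining 1000 kJ/kg of flow exergy for 1600 kW
# of compressor work (all figures assumed).
eta = reliquefier_exergy_efficiency(0.5, 1000.0, 1600.0)
print(f"exergetic efficiency: {eta:.4f}")  # 0.3125
```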

  18. Supercharging an internal combustion engine by aid of a dual-rotor bi-flux axial compressor

    NASA Astrophysics Data System (ADS)

    Grǎdinariu, Andrei Cristian; Mihai, Ioan

    2016-12-01

    Internal combustion engines can be supercharged in order to enhance their performances [1-3]. Engine power is proportional to the quantity of fresh fluid introduced into the cylinder. At present, the general tendency is to obtain specific powers as high as possible for as small as possible a cylinder capacity, without increasing the generated pollution hazards. The present paper investigates the impact of replacing a centrifugal turbo-compressor with an axial dual-rotor bi-flux one [4]. The proposed method allows, for the same number of cylinders, an increase in discharged airflow accompanied by a decrease in fuel consumption. Using a program developed under the MathCad environment, the present work studied how the temperature at the end of isentropic compression changes under supercharging conditions. Taking into account a variation of the ambient temperature between extreme limits, its influence upon the evolution of the thermal load coefficient was analyzed considering the air pressure at the compressor cooling system outlet. This analysis was completed by an exergetic study of the heat evacuated through the cylinder walls under supercharged engine conditions. The conducted investigation allows verification of whether significant differences can be observed between an axial dual-rotor bi-flux compressor and centrifugal compressors.

  19. Impact of large beam-induced heat loads on the transient operation of the beam screens and the cryogenic plants of the Future Circular Collider (FCC)

    NASA Astrophysics Data System (ADS)

    Correia Rodrigues, H.; Tavian, L.

    2017-12-01

    The Future Circular Collider (FCC) under study at CERN will produce 50-TeV high-energy proton beams. The high-energy particle beams are bent by 16-T superconducting dipole magnets operating at 1.9 K and distributed over a circumference of 80 km. The circulating beams induce 5 MW of dynamic heat loads by several processes such as synchrotron radiation, resistive dissipation of beam image currents and electron clouds. These beam-induced heat loads will be intercepted by beam screens operating between 40 and 60 K, and will induce transients, during beam injection, energy ramp-up and beam dumping, on the distributed beam-screen cooling loops, the sector cryogenic plants and the dedicated circulators. Based on the current baseline parameters, numerical simulations of the fluid flow in the cryogenic distribution system during a beam operation cycle were performed. The effects of the thermal inertia of the headers on the helium flow temperature at the cryogenic plant inlet, as well as the temperature gradient experienced by the beam screen, have been assessed. Additionally, this work enabled a thorough exergetic analysis of different cryogenic plant configurations and laid the groundwork for establishing design specifications for the cold and warm circulators.

  20. Multi-criteria assessment of energy conversion systems by means of thermodynamic, economic and environmental parameters

    NASA Astrophysics Data System (ADS)

    Becerra Lopez, Humberto Ruben

    2007-12-01

    High expansion of power demand is expected in the Upper Rio Grande region (El Paso, Hudspeth, Culberson, Jeff Davis, Presidio and Brewster counties) as a result of both electrical demand growth and decommissioning of installed capacity. On the supply side, a notable deployment of renewable power technologies can be projected owing to the recent introduction of a new energy policy in Texas, which aims to reach 10,000 MWe of installed renewable capacity by 2025. Power generation fueled by natural gas might consistently expand due to the encouraged use of this fuel. In this context, the array of participating technologies can be optimized, which, within a sustainability framework, translates into a multidimensional problem. This dissertation presents the solution to the problem in two main parts. The first part solves the thermodynamic-environmental problem by developing a dynamic model to project the maximum allowable expansion of technologies. Predetermined alternatives include diverse renewable energy technologies (wind turbines, photovoltaic conversion, hybrid solar thermal parabolic troughs, and solid oxide fuel cells), a conventional fossil-fuel technology (natural gas combined cycle), and a breakthrough fossil-fuel technology (solid oxide fuel cells). The analysis is based on the concept of cumulative exergy consumption, expanded to include abatement of emissions. A Gompertz sigmoid growth is assumed and constrained by both exergetic self-sustenance and regional energy resource availability. This part of the analysis assumes that power demand expansion is met by full deployment of alternative technologies backed up by conventional technology. Results show that, through a proper allowance for exergy reinvestment, the power demand expansion may be met largely by alternative technologies, minimizing primary resource depletion. 
The second part of the study makes use of the dynamic model to support a multi-objective optimization routine, in which the exergetic and economic costs are established as the primary competing factors. An optimization algorithm is implemented using the constraint method. The solution is given as a Pareto optimality, with arrays for minimum cost and possible arrays along the tradeoff front. These arrays are further analyzed in terms of sustainability, cumulative exergy loss (i.e., irreversibilities and waste exergy) and incremental economic cost, and the results are compared with the goals of currently legislated energy policy.

  1. Development and Analysis of New Integrated Energy Systems for Sustainable Buildings

    NASA Astrophysics Data System (ADS)

    Khalid, Farrukh

    Excessive consumption of fossil fuels in the residential sector and the associated negative environmental impacts pose a significant challenge to engineers within research and industrial communities throughout the world to develop more environmentally benign methods of meeting the energy needs of the residential sector in particular. This thesis addresses potential solutions for the issue of fossil fuel consumption in residential buildings. Three novel renewable-energy-based multigeneration systems are proposed for different types of residential buildings, and a comprehensive assessment of their energetic and exergetic performances is given on the basis of total occupancy, energy load, and climate conditions. System 1 is a multigeneration system based on two renewable energy sources, biomass and solar. The outputs of System 1 are electricity, space heating, cooling, and hot water. The energy and exergy efficiencies of System 1 are 91.0% and 34.9%, respectively. The results of the optimisation analysis show that the net present cost of System 1 is 2,700,496 and that the levelised cost of electricity is 0.117/kWh. System 2 is a multigeneration system integrating three renewable-energy-based subsystems: a wind turbine, a concentrated solar collector, and an organic Rankine cycle supplied by a ground-source heat exchanger. The outputs of System 2 are electricity, hot water, heating and cooling. The optimisation analysis shows that the net present cost is 35,502 and the levelised cost of electricity is 0.186/kWh. The energy and exergy efficiencies of System 2 are found to be 34.6% and 16.2%, respectively. System 3 is a multigeneration system comprising two renewable energy subsystems, geothermal and solar, to supply power, cooling, heating, and hot water. The optimisation analysis shows that the net present cost of System 3 is 598,474 and the levelised cost of electricity is 0.111/kWh. 
The energy and exergy efficiencies of System 3 are 20.2% and 19.2%, respectively, with outputs of electricity, hot water, cooling and space heating. A performance assessment under identical conditions indicates that System 3 offers the best performance, with the minimum net present cost of 26,001 and a levelised cost of electricity of 0.136/kWh.
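    Levelised cost of electricity figures such as those quoted above are typically obtained from the net present cost via a capital recovery factor. A hedged sketch (the discount rate, project lifetime and annual output below are assumptions, not the thesis's inputs):

```python
def capital_recovery_factor(rate: float, years: int) -> float:
    """CRF = i(1+i)^n / ((1+i)^n - 1): annualizes a present cost."""
    growth = (1.0 + rate) ** years
    return rate * growth / (growth - 1.0)

def levelised_cost_of_electricity(net_present_cost: float,
                                  annual_energy_kwh: float,
                                  rate: float = 0.06, years: int = 25) -> float:
    """LCOE = annualized net present cost / annual electricity delivered."""
    return net_present_cost * capital_recovery_factor(rate, years) / annual_energy_kwh

# Illustrative only: a net present cost of 598,474 serving ~420,000 kWh/yr
# would imply an LCOE near 0.111/kWh at the assumed rate and lifetime.
print(f"{levelised_cost_of_electricity(598474, 420000):.3f}")
```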

  2. Thermal Assessment of a Latent-Heat Energy Storage Module During Melting and Freezing for Solar Energy Applications

    NASA Astrophysics Data System (ADS)

    Ramos Archibold, Antonio

    Capital investment reduction, exergetic efficiency improvement and material compatibility issues have been identified as the primary techno-economic challenges associated with the near-term development and deployment of thermal energy storage (TES) in commercial-scale concentrating solar power plants. Three TES techniques have gained attention in the solar energy research community as possible candidates to reduce the cost of solar-generated electricity, namely (1) sensible heat storage, (2) latent heat storage (a tank filled with phase change materials (PCMs) or encapsulated PCMs packed in a vessel) and (3) thermochemical storage. Among these, the PCM macro-encapsulation approach seems to be one of the most promising methods because of its potential to enable more effective energy exchange, reduce the cost associated with the tank and increase the exergetic efficiency. However, the technological barriers to this approach arise from the encapsulation techniques used to create a durable capsule, as well as from the need to assess the fundamental thermal energy transport mechanisms during the phase change. This investigation reports a comprehensive theoretical study of the energy exchange interactions and induced fluid flow during melting and solidification of a confined storage medium. Emphasis has been placed on the thermal characterization of a single-constituent storage module rather than an entire storage system, in order to precisely capture the energy exchange contributions of all the fundamental heat transfer mechanisms during the phase change processes. Two-dimensional, axisymmetric, transient equations for mass, momentum and energy conservation have been solved numerically by the finite volume scheme. Initially, the interaction between conduction and natural convection energy transport modes, in the absence of thermal radiation, is investigated for solar power applications at temperatures of 300-400 °C. 
Later, participating thermal radiation within the storage medium is included in order to extend the conventional natural-convection-dominated model and to analyze its influence on the melting and freezing dynamics at elevated temperatures (800-850 °C). A parametric analysis has been performed to ascertain the effects of the controlling parameters on the melting/freezing rates and on the total and radiative heat transfer rates at the inner surface of the shell. The results show that the presence of thermal radiation enhances the melting and solidification processes. Finally, a simplified model of a packed-bed heat exchanger, with multiple spherical capsules filled with the storage medium and positioned in a vertical array inside a cylindrical container, is analyzed and numerically solved. The influence of the inlet mass flow rate, inner shell surface emissivity and PCM attenuation coefficient on the melting dynamics of the PCM has been analyzed and quantified.
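    The energy absorbed by a PCM capsule across the melt combines two sensible terms with the latent term, Q = m[c_p,s(T_m - T_lo) + L + c_p,l(T_hi - T_m)]. A sketch with illustrative property values (assumptions, not the storage medium studied here):

```python
def pcm_stored_energy_kj(mass_kg: float, cp_solid: float, cp_liquid: float,
                         latent_heat: float, t_melt: float,
                         t_low: float, t_high: float) -> float:
    """Heat absorbed charging a PCM from t_low to t_high through melting at t_melt.
    cp values in kJ/(kg K), latent_heat in kJ/kg, temperatures in degrees C."""
    sensible_solid = cp_solid * (t_melt - t_low)
    sensible_liquid = cp_liquid * (t_high - t_melt)
    return mass_kg * (sensible_solid + latent_heat + sensible_liquid)

# Illustrative: 1 kg, cp_s = 2.0, cp_l = 2.5, L = 200 kJ/kg, melting at 100 C,
# charged from 50 C to 120 C -> 2*50 + 200 + 2.5*20 = 350 kJ
print(pcm_stored_energy_kj(1.0, 2.0, 2.5, 200.0, 100.0, 50.0, 120.0))  # 350.0
```

The latent term is what gives PCM storage its capacity advantage over purely sensible storage across a narrow temperature window.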

  3. Process development and exergy cost sensitivity analysis of a hybrid molten carbonate fuel cell power plant and carbon dioxide capturing process

    NASA Astrophysics Data System (ADS)

    Mehrpooya, Mehdi; Ansarinasab, Hojat; Moftakhari Sharifzadeh, Mohammad Mehdi; Rosen, Marc A.

    2017-10-01

    An integrated power plant with a net electrical power output of 3.71 × 10⁵ kW is developed and investigated. The electrical efficiency of the process is found to be 60.1%. The process includes three main sub-systems: a molten carbonate fuel cell system, a heat recovery section and a cryogenic carbon dioxide capturing process. Conventional and advanced exergoeconomic methods are used for analyzing the process. Advanced exergoeconomic analysis is a comprehensive evaluation tool that combines an exergetic approach with economic analysis procedures. With this method, the investment and exergy destruction costs of the process components are divided into endogenous/exogenous and avoidable/unavoidable parts. Results of the conventional exergoeconomic analyses demonstrate that the combustion chamber has the largest exergy destruction rate (182 MW) and cost rate (13,100 /h). The total process cost rate can be decreased by reducing the cost rate of the fuel cell and improving the efficiency of the combustion chamber and heat recovery steam generator. Based on the total avoidable endogenous cost rate, the priority for modification is, in rank order, the heat recovery steam generator, a compressor and a turbine of the power plant. A sensitivity analysis is performed to investigate the exergoeconomic factor parameters through variation of the effective parameters.
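    The conventional and advanced steps described above can be sketched as follows. The proportional split of the unavoidable part over the endogenous part is one simplifying convention among several; all numbers are illustrative assumptions, not the plant's data:

```python
def destruction_cost_rate(c_fuel_per_kwh: float, e_destroyed_kw: float) -> float:
    """Conventional step: cost rate of exergy destruction, C_D = c_F * E_D (per hour)."""
    return c_fuel_per_kwh * e_destroyed_kw

def split_destruction(e_d_total: float, e_d_unavoidable: float,
                      e_d_endogenous: float) -> dict:
    """Advanced step: avoidable/unavoidable and endogenous/exogenous parts.
    The avoidable-endogenous term assumes the unavoidable share applies
    proportionally to the endogenous part (a simplifying assumption)."""
    unavoidable_fraction = e_d_unavoidable / e_d_total
    return {
        "avoidable": e_d_total - e_d_unavoidable,
        "exogenous": e_d_total - e_d_endogenous,
        "avoidable_endogenous": e_d_endogenous * (1.0 - unavoidable_fraction),
    }

parts = split_destruction(e_d_total=100.0, e_d_unavoidable=40.0, e_d_endogenous=70.0)
print(parts)  # {'avoidable': 60.0, 'exogenous': 30.0, 'avoidable_endogenous': 42.0}
```

Ranking components by their avoidable-endogenous cost rate is what yields a modification priority list like the one quoted in the abstract.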

  4. Modeling and comparative assessment of bubbling fluidized bed gasification system for syngas production - a gateway for a cleaner future in Pakistan.

    PubMed

    Shehzad, Areeb; Bashir, Mohammed J K; Horttanainen, Mika; Manttari, Mika; Havukainen, Jouni; Abbas, Ghulam

    2017-06-19

The present study explores the potential of MSW gasification through exergy analysis, a topic that has recently received particular attention in a region like Pakistan, where urbanization is growing rapidly and resources are scarce. The plant capacity was set at 50 MW based on the available reference data, and the total exergetic output was recorded to be 31.5 MW. The largest irreversibility appears in the gasifier, followed by the methanation unit and CO2 capture. The effects of process temperature, equivalence ratio and MSW moisture content were explored to inspect the variations in syngas composition, lower heating value, carbon conversion efficiency and cold gas efficiency. Special attention is paid to the comparative assessment of MSW gasification products in four regions, namely Pakistan, the USA, the UAE and Thailand. This extended study gives insight into a spectrum of socioeconomic conditions with varying MSW compositions, in order to explain the effect of MSW composition variance on the gasification products.

  5. Waste-to-energy: A review of life cycle assessment and its extension methods.

    PubMed

    Zhou, Zhaozhi; Tang, Yuanjun; Chi, Yong; Ni, Mingjiang; Buekens, Alfons

    2018-01-01

This article provides a comprehensive review of evaluation tools based on life cycle thinking, as applied to waste-to-energy. Habitually, life cycle assessment is adopted to assess the environmental burdens associated with waste-to-energy initiatives. Based on this framework, several extension methods have been developed to focus on specific aspects: exergetic life cycle assessment for reducing resource depletion, life cycle costing for evaluating the economic burden, and social life cycle assessment for recording social impacts. Additionally, the environment-energy-economy model integrates the life cycle assessment and life cycle costing methods and simultaneously judges these three aspects for sustainable waste-to-energy conversion. Life cycle assessment of waste-to-energy is sufficiently developed, with concrete data inventories and sensitivity analyses, although data and model uncertainty are unavoidable. Compared with life cycle assessment, only a few evaluations of waste-to-energy techniques have been conducted using the extension methods, and their methodology and application need to be further developed. Finally, this article succinctly summarises some recommendations for further research.

  6. Analysis and optimization of hybrid electric vehicle thermal management systems

    NASA Astrophysics Data System (ADS)

    Hamut, H. S.; Dincer, I.; Naterer, G. F.

    2014-02-01

In this study, the thermal management system of a hybrid electric vehicle is optimized using single and multi-objective evolutionary algorithms in order to maximize the exergy efficiency and minimize the cost and environmental impact of the system. The objective functions are defined and decision variables, along with their respective system constraints, are selected for the analysis. In the multi-objective optimization, a Pareto frontier is obtained and a single desirable optimal solution is selected based on the LINMAP decision-making process. The corresponding solutions are compared against the exergetic, exergoeconomic and exergoenvironmental single-objective optimization results. The exergy efficiency, total cost rate and environmental impact rate for the baseline system are determined to be 0.29, 28 ¢/h and 77.3 mPts/h, respectively. Moreover, based on the exergoeconomic optimization, 14% higher exergy efficiency and 5% lower cost can be achieved compared to the baseline parameters, at the expense of a 14% increase in the environmental impact. Based on the exergoenvironmental optimization, a 13% higher exergy efficiency and 5% lower environmental impact can be achieved at the expense of a 27% increase in the total cost.
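A common form of the LINMAP selection mentioned above normalizes each objective to [0, 1] and picks the Pareto point closest (in Euclidean distance) to the ideal point. The sketch below assumes that form; the three-objective front is hypothetical apart from the baseline point quoted in the abstract.

```python
import math

def linmap_select(points, maximize):
    """Pick the Pareto point nearest, after [0,1] normalization, to the
    ideal point. maximize[i] says whether objective i is to be maximized."""
    n = len(points[0])
    lo = [min(p[i] for p in points) for i in range(n)]
    hi = [max(p[i] for p in points) for i in range(n)]

    def norm(p):
        return [(p[i] - lo[i]) / (hi[i] - lo[i]) if hi[i] > lo[i] else 0.0
                for i in range(n)]

    ideal = [1.0 if maximize[i] else 0.0 for i in range(n)]

    def dist(p):
        q = norm(p)
        return math.sqrt(sum((q[i] - ideal[i]) ** 2 for i in range(n)))

    return min(points, key=dist)

# Hypothetical front: (exergy efficiency, cost rate in ¢/h, impact in mPts/h);
# the first tuple is the baseline from the abstract, the rest are made up.
front = [(0.29, 28.0, 77.3), (0.33, 29.5, 88.0), (0.31, 26.6, 73.4)]
best = linmap_select(front, maximize=[True, False, False])
```

Here the compromise point with moderate efficiency but the lowest cost and impact is selected, which is the kind of trade-off the single-objective optima cannot capture.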

  7. Exergy-based efficiency and renewability assessment of biofuel production.

    PubMed

    Dewulf, J; Van Langenhove, H; Van De Velde, B

    2005-05-15

This study presents an efficiency and renewability analysis of the production of three biofuels: rapeseed methyl ester (RME), soybean methyl ester (SME) and corn-based ethanol (EtOH). The overall production chains have been taken into account: not only the agricultural crop production and the industrial conversion into biofuel, but also the supply of agricultural resources (pesticides, fertilizers, fuel, seeding material) and industrial resources (energy and chemicals) to transform the crops into biofuel. Simultaneously, byproducts of the agricultural and industrial processes have been taken into account when resources have to be allocated to the biofuels. The technical analysis via the second law of thermodynamics revealed that corn-based EtOH results in the highest production rate, with an exergetic fuel content of 68.8 GJ ha(-1) yr(-1), whereas the RME and SME results were limited to 47.5 and 16.4 GJ ha(-1) yr(-1). The allocated nonrenewable resource input to deliver these biofuels is significant: 16.5, 15.4, and 5.6 GJ ha(-1) yr(-1), respectively. This means that these biofuels, generally considered renewable resources, embed a nonrenewable fraction of one-quarter for EtOH and even one-third for RME and SME. This type of analysis provides scientifically sound quantitative information that is necessary with respect to the sustainability analysis of so-called renewable energy.
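The nonrenewable fractions quoted above follow directly from the figures in the abstract: the embedded nonrenewable share is the allocated nonrenewable input divided by the exergetic fuel output, both in GJ ha(-1) yr(-1).

```python
# Check of the "one-quarter / one-third" claim from the quoted figures.
outputs = {"EtOH": 68.8, "RME": 47.5, "SME": 16.4}  # GJ/ha/yr, exergetic fuel content
inputs  = {"EtOH": 16.5, "RME": 15.4, "SME": 5.6}   # GJ/ha/yr, nonrenewable input

fractions = {fuel: inputs[fuel] / outputs[fuel] for fuel in outputs}
# EtOH ≈ 0.24 (about one-quarter); RME ≈ 0.32 and SME ≈ 0.34 (about one-third)
```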

  8. Modelling and operation strategies of DLR's large scale thermocline test facility (TESIS)

    NASA Astrophysics Data System (ADS)

    Odenthal, Christian; Breidenbach, Nils; Bauer, Thomas

    2017-06-01

This work gives an overview of the TESIS:store thermocline test facility and its current construction status. On this basis, the TESIS:store facility using sensible solid filler material is modelled with a fully transient model implemented in MATLAB®. Results on the impact of filler size and operation strategies are presented. While low porosity and small particle diameters of the filler material are beneficial, the operation strategy is one key element with potential for optimization. It is shown that plant operators must weigh utilization against exergetic efficiency. Different durations of the charging and discharging periods offer further potential for optimization.

  9. Simulation of high temperature thermal energy storage system based on coupled metal hydrides for solar driven steam power plants

    DOE PAGES

    d'Entremont, Anna; Corgnale, Claudio; Hardy, Bruce; ...

    2018-01-11

Concentrating solar power plants can achieve low cost and efficient renewable electricity production if equipped with adequate thermal energy storage systems. Metal hydride based thermal energy storage systems are appealing candidates due to their demonstrated potential for very high volumetric energy densities, high exergetic efficiencies, and low costs. The feasibility and performance of a thermal energy storage system based on NaMgH2F hydride paired with TiCr1.6Mn0.2 is examined, discussing its integration with a solar-driven ultra-supercritical steam power plant. The simulated storage system is based on a laboratory-scale experimental apparatus. It is analyzed using a detailed transport model accounting for the thermochemical hydrogen absorption and desorption reactions, including kinetics expressions adequate for the current metal hydride system. The results show that the proposed metal hydride pair can suitably be integrated with a high temperature steam power plant. The thermal energy storage system achieves output energy densities of 226 kWh/m3, 9 times the DOE SunShot target, with moderate temperature and pressure swings. Also, simulations indicate that there is significant scope for performance improvement via heat-transfer enhancement strategies.

  11. Comparison of Overall Resource Consumption of Biosolids Management System Processes Using Exergetic Life Cycle Assessment.

    PubMed

    Alanya, Sevda; Dewulf, Jo; Duran, Metin

    2015-08-18

This study focused on the evaluation of biosolids management systems (BMS) from a natural resource consumption point of view. Additionally, the environmental impact of the facilities was benchmarked using Life Cycle Assessment (LCA) to provide a comprehensive assessment. This is the first study to apply the Cumulative Exergy Extraction from the Natural Environment (CEENE) method for an in-depth resource use assessment of BMS; two full-scale BMS and seven system variations were analyzed. CEENE allows better system evaluation and understanding of how much benefit is achievable from the products generated by BMS, which have valorization potential. LCA results showed that the environmental burden stems mostly from intense electricity consumption. The CEENE analysis further revealed that this burden is due to the high consumption of fossil and nuclear-based natural resources. Using the Cumulative Degree of Perfection, a higher resource-use efficiency of 53% was observed in PTA-2, where alkaline stabilization rather than anaerobic digestion is employed. However, an anaerobic digestion process is favorable over alkaline stabilization, with 35% lower overall natural resource use. The most significant reduction of the resource footprint occurred when the output biogas was valorized in a combined heat and power system.

  12. Experimental comparison between R409A and R437A performance in a heat pump unit

    NASA Astrophysics Data System (ADS)

    Duarte, M. V.; Pires, L. C.; Silva, P. D.; Gaspar, P. D.

    2017-04-01

This paper reports an experimental comparison between the refrigerants R409A and R437A in a heat pump unit designed and developed to work with R12. Although the use of both refrigerants in new equipment has been abolished in the EU and US under the EU F-Gas Regulation and SNAP, they are still used as substitutes for R12 in old equipment, especially in developing countries. Both refrigerants were studied under the same test conditions, in two groups of tests: group A (variation of the heat source temperature) and group B (variation of the refrigerant flow rate). The results showed that R437A presents a higher discharge pressure and a lower discharge temperature. The heating and cooling capacities of both refrigerants were similar, as was the exergetic efficiency. For the group A tests the COP of both refrigerants was similar, and for the group B tests R409A presented an average COP 15% higher. According to these results, the use of R409A in old equipment (as a transition refrigerant) is recommended until the acquisition of equipment operating with low-GWP refrigerants becomes technically and economically feasible.
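The COP comparison above reduces to heating capacity divided by compressor input power. The sketch below shows that relation with made-up capacities chosen only to reproduce a 15% COP gap; they are not the paper's measured values.

```python
# Heating COP = delivered heating capacity / compressor input power.
# The capacities below are hypothetical, chosen to illustrate a 15% gap.
def cop_heating(q_heating_kw, w_compressor_kw):
    return q_heating_kw / w_compressor_kw

cop_r409a = cop_heating(q_heating_kw=3.45, w_compressor_kw=1.0)
cop_r437a = cop_heating(q_heating_kw=3.00, w_compressor_kw=1.0)
advantage = cop_r409a / cop_r437a - 1.0   # relative COP advantage of R409A
```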

  13. Exergetic assessment for resources input and environmental emissions by Chinese industry during 1997-2006.

    PubMed

    Zhang, Bo; Peng, Beihua; Liu, Mingchu

    2012-01-01

This paper presents an overview of the resources use and environmental impact of Chinese industry during 1997-2006. For the purpose of this analysis the thermodynamic concept of exergy has been employed both to quantify and to aggregate the resources input and the environmental emissions arising from the sector. The resources input and environmental emissions show an increasing trend in this period. Compared with 47568.7 PJ in 1997, the resources input in 2006 increased by 75.4% and reached 83437.9 PJ, of which 82.5% came from nonrenewable resources, mainly from coal and other energy minerals. Furthermore, the total exergy of environmental emissions was estimated to be 3499.3 PJ in 2006, 1.7 times that in 1997, of which 93.4% was from GHG emissions and only 6.6% from "three wastes" emissions. A rapid increase of the nonrenewable resources input and GHG emissions over 2002-2006 can be observed, owing to the excessive expansion of resource- and energy-intensive subsectors. Exergy intensity time series, in terms of resource input intensity and environmental emission intensity, are also calculated; the trends are evidently influenced by the macroeconomic situation, particularly by the investment-driven economic development of recent years. Corresponding policy implications to guide a more sustainable industrial system are addressed.
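The aggregate figures above are internally consistent and easy to verify: the 75.4% growth follows from the two quoted resource-input totals, and the nonrenewable share follows from the 82.5% fraction.

```python
# Consistency check of the quoted exergy aggregates (all in PJ).
resources_1997 = 47568.7
resources_2006 = 83437.9

growth = resources_2006 / resources_1997 - 1.0   # ≈ 0.754, the quoted 75.4%
nonrenewable_2006 = 0.825 * resources_2006       # ≈ 68,836 PJ from nonrenewables
```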

  14. Sustainability Efficiency Factor: Measuring Sustainability in Advanced Energy Systems through Exergy, Exergoeconomic, Life Cycle, and Economic Analyses

    NASA Astrophysics Data System (ADS)

    Boldon, Lauren

    The Encyclopedia of Life Support Systems defines sustainability or industrial ecology as "the wise use of resources through critical attention to policy, social, economic, technological, and ecological management of natural and human engineered capital so as to promote innovations that assure a higher degree of human needs fulfilment, or life support, across all regions of the world, while at the same time ensuring intergenerational equity" (Encyclopedia of Life Support Systems 1998). Developing and integrating sustainable energy systems to meet growing energy demands is a daunting task. Although the technology to utilize renewable energies is well understood, there are limited locations which are ideally suited for renewable energy development. Even in areas with significant wind or solar availability, backup or redundant energy supplies are still required during periods of low renewable generation. This is precisely why it would be difficult to make the switch directly from fossil fuel to renewable energy generation. A transition period in which a base-load generation supports renewables is required, and nuclear energy suits this need well with its limited life cycle emissions and fuel price stability. Sustainability is achieved by balancing environmental, economic, and social considerations, such that energy is produced without detriment to future generations through loss of resources, harm to the environment, etcetera. In essence, the goal is to provide future generations with the same opportunities to produce energy that the current generation has. This research explores sustainability metrics as they apply to a small modular reactor (SMR)-hydrogen production plant coupled with wind energy and storage technologies to develop a new quantitative sustainability metric, the Sustainability Efficiency Factor (SEF), for comparison of energy systems. 
The SEF incorporates the three fundamental aspects of sustainability and provides SMR or nuclear hybrid energy system (NHES) reference case studies to (1) introduce sustainability metrics, such as life cycle assessment, (2) demonstrate the methods behind exergy and exergoeconomic analyses, (3) provide an economic analysis of the potential for SMR development from first-of-a-kind (FOAK) to nth-of-a-kind (NOAK), thereby illustrating possible cost reductions and deployment flexibility for SMRs over large conventional nuclear reactors, (4) assess the competitive potential for incorporation of storage and hydrogen production in NHES and in regulated and deregulated electricity markets, (5) compare an SMR-hydrogen production plant to a natural gas steam methane reforming plant using the SEF, and (6) identify and review the social considerations which would support future nuclear development domestically and abroad, such as public and political/regulatory needs and challenges. The Global Warming Potential (GWP) for the SMR (300 MWth)-wind (60 MWe)-high temperature steam electrolysis (200 tons hydrogen per day) system was calculated as approximately 874 g CO2-equivalent as part of the life cycle assessment. This is 92.6% less than the GWP estimated for steam methane reforming production of hydrogen by Spath and Mann. The unit exergetic and exergoeconomic costs were determined for each flow within the NHES system as part of the exergy/exergoeconomic cost analyses. The unit exergetic cost is lower for flows carrying more useful work, such as the flow exiting the SMR with a unit exergetic cost of 1.075 MW/MW. In comparison, the flow exiting the turbine has a very high unit exergetic cost of 15.31, as most of the useful work was already removed through the turning of the generator/compressor shaft. In a similar manner, a high unit exergoeconomic cost of $12.45/MW·s is observed for the return flow to the reactors, because there is very little exergy present.
The first and second law efficiencies and the exergoeconomic factors were also determined over several cases. For the first or base SMR case, first and second law efficiencies of 81.5% and 93.3% were observed, respectively. With an increase in reactor outlet temperature of only 20°C, both SMR efficiencies increased, while the exergoeconomic factor decreased by 0.2%. As part of the SMR economic analysis, specific capital and total capital investment costs (TCIC) were determined, in addition to conditional effects on the net present value (NPV), levelized cost of electricity (LCOE), and payback periods. For a 1260 MWe FOAK multi-module SMR site with 7 modules, the specific capital costs were 27-38% higher than those of a 1260 MWe single large reactor site. A NOAK site, on the other hand, may range from 19% lower to 18% higher than the large reactor site, demonstrating that it may break even or be even more economical in average or favorable market conditions. The NOAK TCIC for single and multi-module SMR sites were determined to be $914-1,230 million and $660-967 million per module, respectively, reflecting the substantial savings incurred with sites designed for and deployed with multiple modules. For the same NOAK 7-unit multi-module site, the LCOE was calculated as $67-84/MWh, which is slightly less than the conventional large reactor LCOE of $89/MWh with a weighted average cost of capital of 10%, a 50%-50% share of debt and equity, and a corporate tax rate of 35%. The payback period for the SMR site, however, is 4 years longer. Construction delays were also analyzed to compare the SMR and large reactor sites, demonstrating that the SMR NPV and LCOE are less sensitive to delays. For a 3 year delay, the SMR NPV decreased by 22%, while the large reactor NPV decreased by 34.1%. Similarly, the SMR and large reactor LCOEs increased by 7.8% and 8.1%, respectively.
An NHES case study with hydrogen production and storage was performed, illustrating how the profit share of revenue improves with the addition of hydrogen production. Although costs increase with the addition, 78% of the hydrogen revenue is profit, while only 50% of the electricity generation revenue is profit. A second NHES case study assessed the NPV, LCOE, and payback differences in deregulated and regulated electricity markets. For a 60 year lifetime, Case C (with nuclear, wind, and hydrogen production) is economical in the deregulated market, with an NPV of $66.3 million and a payback period of 10 years, but not in the regulated one, with an NPV of approximately -$115.3 million and a payback period of 11 years. With either market type, the plant's levelized cost remains $82.82/MWh, which is still reasonable with respect to the prior LCOE values determined for SMR and large reactor sites. Utilizing all the methodology and results obtained and presented in this thesis, the SEF may be calculated. The NHES SEF was determined to be 18.3% higher than that of natural gas steam methane reforming, illustrating a higher level of sustainability. The SEF quantitatively uses the exergoeconomic cost and irreversibilities obtained from the exergy analysis, the GWP obtained from the life cycle assessment along with costs/fees associated with emissions and pollutants, and relevant economic data obtained from an economic analysis. This reflects the environmental, socio-political, and economic pillars of sustainability.
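The NPV and LCOE figures discussed above rest on standard discounted-cash-flow definitions. The sketch below shows those two formulas in their simplest form; the cash flows and rates are illustrative, not the thesis's actual inputs, and tax, debt structure, and delays are deliberately omitted.

```python
# Minimal discounted-cash-flow sketch: LCOE is discounted lifetime cost per
# discounted lifetime generation; NPV discounts a cash-flow series to t = 0.
# All inputs are illustrative placeholders.

def lcoe(capex, annual_opex, annual_mwh, rate, years):
    """Levelized cost of electricity in $/MWh (uniform annual terms)."""
    disc_cost = capex + sum(annual_opex / (1 + rate) ** t for t in range(1, years + 1))
    disc_energy = sum(annual_mwh / (1 + rate) ** t for t in range(1, years + 1))
    return disc_cost / disc_energy

def npv(cash_flows, rate):
    """Net present value; cash_flows[0] is the (negative) investment at t = 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

example_lcoe = lcoe(capex=0.0, annual_opex=50.0, annual_mwh=1.0, rate=0.0, years=10)
example_npv = npv([-100.0, 60.0, 60.0], rate=0.0)
```

With a zero discount rate the examples reduce to simple sums, which makes the definitions easy to sanity-check by hand.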

  15. Performance Testing of Jefferson Lab 12 GeV Helium Screw Compressors

    DOE PAGES

    Knudsen, P.; Ganni, V.; Dixon, K.; ...

    2015-08-10

Oil injected screw compressors have essentially superseded all other types of compressors in modern helium refrigeration systems due to their large displacement capacity, reliability, minimal vibration, and capability of handling helium's high heat of compression. At the present state of compressor system designs for helium refrigeration systems, typically two-thirds of the lost input power is due to the compression system. It is important to understand the isothermal and volumetric efficiencies of these machines to help properly design the compression system to match the refrigeration process. It is also important to identify those primary compressor skid exergetic loss mechanisms which may be reduced, thereby offering the possibility of significantly reducing the input power to helium refrigeration processes which are extremely energy intensive. This paper summarizes the results collected during the commissioning of the new compressor system for Jefferson Lab's (JLab's) 12 GeV upgrade. The compressor skid packages were designed by JLab and built to print by industry. They incorporate a number of modifications not typical of helium screw compressor packages and most importantly allow a very wide range of operation so that JLab's patented Floating Pressure Process can be fully utilized. This paper also summarizes key features of the skid design that allow this process and facilitate the maintenance and reliability of these helium compressor systems.
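The isothermal efficiency mentioned above compares the minimum (isothermal) compression work for an ideal gas, m·R·T·ln(p2/p1), against the measured shaft input power. The operating point below is hypothetical, chosen only to show the calculation.

```python
import math

# Isothermal efficiency of a helium compressor: ideal isothermal compression
# power over measured input power. Operating-point numbers are illustrative.
R_HELIUM = 2077.0  # J/(kg·K), specific gas constant of helium

def isothermal_efficiency(m_dot, t_in, p_in, p_out, w_input):
    """m_dot in kg/s, temperatures in K, pressures in Pa, w_input in W."""
    w_isothermal = m_dot * R_HELIUM * t_in * math.log(p_out / p_in)
    return w_isothermal / w_input

# Hypothetical point: 0.5 kg/s helium, 300 K suction, 4:1 pressure ratio,
# 900 kW shaft power.
eta = isothermal_efficiency(m_dot=0.5, t_in=300.0, p_in=1.05e5,
                            p_out=4.2e5, w_input=900e3)
```

The result (roughly 0.48 for this made-up point) is in the range where the remaining input power shows up as the compressor-skid exergetic losses the paper discusses.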

  17. Preliminary design of the beam screen cooling for the Future Circular Collider of hadron beams

    NASA Astrophysics Data System (ADS)

    Kotnig, C.; Tavian, L.

    2015-12-01

Following recommendations of the recent update of the European strategy in particle physics, CERN has undertaken an international study of possible future circular colliders beyond the LHC. This study considers an option for a very high energy (100 TeV) hadron-hadron collider located in a quasi-circular underground tunnel with a circumference of 80 to 100 km. The synchrotron radiation emitted by the high-energy hadron beam increases by more than two orders of magnitude compared to the LHC. To reduce the entropic load on the refrigeration system of the superconducting magnets, beam screens are indispensable for extracting the heat load at a higher temperature level. After illustrating the decisive constraints on the beam screen's refrigeration design, this paper presents a preliminary design of a continuous cooling loop, comparing helium and neon for different cooling channel geometries, with emphasis on the cooling length limitations and the exergetic efficiency.
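The thermodynamic motivation for beam screens can be made concrete with the Carnot limit: removing 1 W at a cold temperature T and rejecting it at ambient T0 costs at least (T0/T - 1) W of work. The screen temperature used below is illustrative, not the value adopted by the FCC study.

```python
# Ideal (Carnot) specific refrigeration work: W/Q = T0/T - 1 for T < T0.
T0 = 300.0  # K, ambient rejection temperature

def carnot_work_per_watt(t_cold):
    """Minimum work (W) to extract 1 W at t_cold (K) and reject at T0."""
    return T0 / t_cold - 1.0

w_at_1p9k = carnot_work_per_watt(1.9)   # at a superfluid-helium magnet cold mass
w_at_50k = carnot_work_per_watt(50.0)   # at an illustrative beam-screen level
```

Even in this ideal limit, heat extracted at 1.9 K costs roughly 157 W per watt versus 5 W per watt at 50 K, which is why intercepting the synchrotron-radiation load on a warmer beam screen dominates the exergetic accounting.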

  18. Exergie (4th revised and enlarged edition)

    NASA Astrophysics Data System (ADS)

    Baloh, T.; Wittwer, E.

    The theoretical concept of exergy is explained and its practical applications are discussed. Equilibrium and thermal equilibrium are reviewed as background, and exergy is considered as a reference point for solid-liquid, liquid-liquid, and liquid-gas systems. Exergetic calculations and their graphic depictions are covered. The concepts of enthalpy and entropy are reviewed in detail, including their applications to gas mixtures, solutions, and isolated substances. The exergy of gas mixtures, solutions, and isolated substances is discussed, including moist air, liquid water in water vapor, dry air, and saturation-limited solutions. Mollier exergy-enthalpy-entropy diagrams are presented for two-component systems, and exergy losses for throttling, isobaric mixing, and heat transfer are addressed. The relationship of exergy to various processes is covered, including chemical processes, combustion, and nuclear reactions. The optimization of evaporation plants through exergy is discussed. Calculative examples are presented for energy production and heating, industrial chemical processes, separation of liquid air, nuclear reactors, and others.
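The exergy calculations surveyed in this book rest on the standard specific flow exergy, e = (h - h0) - T0·(s - s0), relative to a dead state (T0, p0). The sketch below assumes that textbook form; the steam property values are rounded illustrative figures, not taken from the book.

```python
# Specific flow exergy relative to the dead state (T0, p0):
#   e = (h - h0) - T0 * (s - s0)
# Kinetic, potential and chemical terms are neglected here.
def flow_exergy(h, s, h0, s0, t0):
    """h, h0 in kJ/kg; s, s0 in kJ/(kg·K); t0 in K; result in kJ/kg."""
    return (h - h0) - t0 * (s - s0)

# Illustrative rounded values: superheated steam (~3 MPa, ~400 °C) against
# liquid water at a 298.15 K dead state.
e_steam = flow_exergy(h=3230.0, s=6.92, h0=104.9, s0=0.367, t0=298.15)
```

The result (on the order of 1200 kJ/kg for this stream) is what enters exergy-loss balances for throttling, mixing and heat transfer of the kind the book tabulates.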

  19. Comparative Assessment of Gasification Based Coal Power Plants with Various CO2 Capture Technologies Producing Electricity and Hydrogen

    PubMed Central

    2014-01-01

Seven different types of gasification-based coal conversion processes for producing mainly electricity and in some cases hydrogen (H2), with and without carbon dioxide (CO2) capture, were compared on a consistent basis through simulation studies. The flowsheet for each process was developed in a chemical process simulation tool “Aspen Plus”. The pressure swing adsorption (PSA), physical absorption (Selexol), and chemical looping combustion (CLC) technologies were separately analyzed for processes with CO2 capture. The performances of the above three capture technologies were compared with respect to energetic and exergetic efficiencies, and the level of CO2 emission. The effect of air separation unit (ASU) and gas turbine (GT) integration on the power output of all the CO2 capture cases is assessed. Sensitivity analysis was carried out for the CLC process (electricity-only case) to examine the effect of temperature and water-cooling of the air reactor on the overall efficiency of the process. The results show that, when only electricity production is considered, the case using CLC technology has an electrical efficiency 1.3% and 2.3% higher than the PSA and Selexol based cases, respectively. The CLC based process achieves an overall CO2 capture efficiency of 99.9% in contrast to 89.9% for PSA and 93.5% for Selexol based processes. The overall efficiency of the CLC case for combined electricity and H2 production is marginally higher (by 0.3%) than Selexol and lower (by 0.6%) than PSA cases. The integration between the ASU and GT units benefits all three technologies in terms of electrical efficiency. Furthermore, our results suggest that it is favorable to operate the air reactor of the CLC process at higher temperatures with excess air supply in order to achieve higher power efficiency. PMID:24578590

  20. Comparative Assessment of Gasification Based Coal Power Plants with Various CO2 Capture Technologies Producing Electricity and Hydrogen.

    PubMed

    Mukherjee, Sanjay; Kumar, Prashant; Hosseini, Ali; Yang, Aidong; Fennell, Paul

    2014-02-20

Seven different types of gasification-based coal conversion processes for producing mainly electricity and in some cases hydrogen (H2), with and without carbon dioxide (CO2) capture, were compared on a consistent basis through simulation studies. The flowsheet for each process was developed in a chemical process simulation tool "Aspen Plus". The pressure swing adsorption (PSA), physical absorption (Selexol), and chemical looping combustion (CLC) technologies were separately analyzed for processes with CO2 capture. The performances of the above three capture technologies were compared with respect to energetic and exergetic efficiencies, and the level of CO2 emission. The effect of air separation unit (ASU) and gas turbine (GT) integration on the power output of all the CO2 capture cases is assessed. Sensitivity analysis was carried out for the CLC process (electricity-only case) to examine the effect of temperature and water-cooling of the air reactor on the overall efficiency of the process. The results show that, when only electricity production is considered, the case using CLC technology has an electrical efficiency 1.3% and 2.3% higher than the PSA and Selexol based cases, respectively. The CLC based process achieves an overall CO2 capture efficiency of 99.9% in contrast to 89.9% for PSA and 93.5% for Selexol based processes. The overall efficiency of the CLC case for combined electricity and H2 production is marginally higher (by 0.3%) than Selexol and lower (by 0.6%) than PSA cases. The integration between the ASU and GT units benefits all three technologies in terms of electrical efficiency. Furthermore, our results suggest that it is favorable to operate the air reactor of the CLC process at higher temperatures with excess air supply in order to achieve higher power efficiency.

  1. Exergetic life cycle assessment of hydrogen production from renewables

    NASA Astrophysics Data System (ADS)

    Granovskii, Mikhail; Dincer, Ibrahim; Rosen, Marc A.

Life cycle assessment is extended to exergetic life cycle assessment and used to evaluate the exergy efficiency, economic effectiveness and environmental impact of producing hydrogen using wind and solar energy in place of fossil fuels. The product hydrogen is considered a fuel for fuel cell vehicles and a substitute for gasoline. Fossil fuel technologies for producing hydrogen from natural gas and gasoline from crude oil are contrasted with options using renewable energy. Exergy efficiencies and greenhouse gas and air pollution emissions are evaluated for all process steps, including crude oil and natural gas pipeline transportation, crude oil distillation and natural gas reforming, wind and solar electricity generation, hydrogen production through water electrolysis, and gasoline and hydrogen distribution and utilization. The use of wind power to produce hydrogen via electrolysis, and its application in a fuel cell vehicle, exhibits the lowest fossil and mineral resource consumption rate. However, the economic attractiveness, as measured by a "capital investment effectiveness factor," of renewable technologies depends significantly on the ratio of costs for hydrogen and natural gas. At the present cost ratio of about 2 (per unit of lower heating value or exergy), capital investments to produce hydrogen from natural gas are about five times lower than from wind energy. As a consequence, the cost of wind- and solar-based electricity and hydrogen is substantially higher than that of natural gas. The implementation of a hydrogen fuel cell instead of an internal combustion engine permits, theoretically, an increase in a vehicle's engine efficiency of about two times. 
Depending on the ratio in engine efficiencies, the substitution of gasoline with "renewable" hydrogen leads to (a) greenhouse gas (GHG) emissions reductions of 12-23 times for hydrogen from wind and 5-8 times for hydrogen from solar energy, and (b) air pollution (AP) emissions reductions of 38-76 times for hydrogen from wind and 16-32 times for hydrogen from solar energy. By comparison, substitution of gasoline with hydrogen from natural gas allows reductions in GHG emissions only as a result of the increased efficiency of a fuel cell engine, and a reduction of AP emissions of 2.5-5 times. These data suggest that "renewable" hydrogen represents a potential long-term solution to many environmental problems.

  2. Exergetic Assessment for Resources Input and Environmental Emissions by Chinese Industry during 1997–2006

    PubMed Central

    Zhang, Bo; Peng, Beihua; Liu, Mingchu

    2012-01-01

    This paper presents an overview of the resource use and environmental impact of Chinese industry during 1997–2006. For the purpose of this analysis, the thermodynamic concept of exergy has been employed to quantify and aggregate both the resource inputs and the environmental emissions arising from the sector. Resource inputs and environmental emissions show an increasing trend over this period. Compared with 47568.7 PJ in 1997, resource input in 2006 increased by 75.4% and reached 83437.9 PJ, of which 82.5% came from nonrenewable resources, mainly coal and other energy minerals. Furthermore, the total exergy of environmental emissions in 2006 was estimated to be 3499.3 PJ, 1.7 times that in 1997, of which 93.4% was from GHG emissions and only 6.6% from "three wastes" emissions. A rapid increase in nonrenewable resource input and GHG emissions over 2002–2006 can be seen, owing to the excessive expansion of resource- and energy-intensive subsectors. Exergy intensity time series, in terms of resource input intensity and environmental emission intensity, are also calculated; the trends are evidently influenced by the macroeconomic situation, particularly by the investment-driven economic development of recent years. Corresponding policy implications to guide a more sustainable industrial system are addressed. PMID:22973176
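The reported growth figures can be checked with a short calculation (all values taken from the abstract):

```python
# Resource-input exergy of Chinese industry (values from the abstract, in PJ).
input_1997 = 47568.7
input_2006 = 83437.9

# Percentage increase over the period (abstract reports 75.4%).
growth_pct = (input_2006 / input_1997 - 1) * 100
print(f"Resource input growth 1997-2006: {growth_pct:.1f}%")

# Nonrenewable share of the 2006 input (82.5% per the abstract).
nonrenewable_2006 = input_2006 * 0.825
print(f"Nonrenewable input in 2006: {nonrenewable_2006:.1f} PJ")
```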

  3. Modeling of a thermal energy storage system based on coupled metal hydrides (magnesium iron – sodium alanate) for concentrating solar power plants

    DOE PAGES

    d'Entremont, A.; Corgnale, C.; Sulic, M.; ...

    2017-08-31

    Concentrating solar power plants represent low cost and efficient solutions for renewable electricity production only if adequate thermal energy storage systems are included. Metal hydride thermal energy storage systems have demonstrated the potential to achieve very high volumetric energy densities, high exergetic efficiencies, and low costs. The current work analyzes the technical feasibility and the performance of a storage system based on the high temperature Mg2FeH6 hydride coupled with the low temperature Na3AlH6 hydride. To accomplish this, a detailed transport model has been set up and the coupled metal hydride system has been simulated based on a laboratory scale experimental configuration. Proper kinetics expressions have been developed and included in the model to replicate the absorption and desorption process in the high temperature and low temperature hydride materials. The system showed adequate hydrogen transfer between the two metal hydrides, with almost complete charging and discharging, during both thermal energy storage and thermal energy release. The system operating temperatures varied from 450°C to 500°C, with hydrogen pressures between 30 bar and 70 bar. This makes the thermal energy storage system a suitable candidate for pairing with a solar driven steam power plant. The model results, obtained for the selected experimental configuration, showed an actual thermal energy storage system volumetric energy density of about 132 kWh/m³, which is more than 5 times the U.S. Department of Energy SunShot target (25 kWh/m³).
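The reported operating window (450-500 °C, 30-70 bar) is governed by the van't Hoff relation for the hydrides; a minimal sketch, using illustrative thermodynamic constants for a generic high-temperature hydride rather than values from the paper:

```python
import math

R = 8.314  # J/(mol K), gas constant

def vant_hoff_pressure(dH, dS, T):
    """Equilibrium H2 pressure (bar) from the van't Hoff relation:
    ln(p/p0) = -dH/(R*T) + dS/R, with reference pressure p0 = 1 bar."""
    return math.exp(-dH / (R * T) + dS / R)

# Illustrative constants (assumed, not taken from the paper):
# dH = 77 kJ/mol H2, dS = 137 J/(mol K).
for T_c in (450, 500):
    p = vant_hoff_pressure(77e3, 137.0, T_c + 273.15)
    print(f"T = {T_c} C -> p_eq ~ {p:.0f} bar")
```

Higher reaction enthalpy shifts the same pressure window to higher temperature, which is why pairing a high- and a low-temperature hydride sets the system's operating range.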

  5. Modeling the energetic and exergetic self-sustainability of societies with different structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sciubba, E.

    1995-06-01

    The paper examines global energy and exergy flows in various models of organized human societies: from primitive tribal organizations, to theocratic/aristocratic societies, to the present industrial (and post-industrial) society, to possible future highly robotized or centrally controlled social organizations. The analysis focuses on the very general chain of technological processes connected to the extraction, conversion, distribution and final use of the real energetic content of natural resources (i.e., their exergy); the biological food chain is also considered, albeit in a very simplified, human-centered sense. It is argued that, to sustain this chain of processes, it is necessary to use a substantial portion of the final-use energy flow, and to employ a large portion of the total work force sustained by this end-use energy. It is shown that if these quantities can be related to the total exergy flow rate (from the source) and to the total available work force, then this functional relationship takes different forms in different types of society. The procedure is very general: each type of societal organization is reduced to a simple model for which energy and exergy flow diagrams are calculated under certain well-defined assumptions, which constrain both the exchanges among the functional groups that constitute the model and the exchanges with the environment. The results can be quantified using assumptions/projections about energy consumption levels for different stages of technological development available in the literature; the procedure is applied to some models of primitive and pre-industrial societies, to the present industrial/post-industrial society, and to a hypothetical model of a future, high-technology society.

  6. Operation Results of the Kstar Helium Refrigeration System

    NASA Astrophysics Data System (ADS)

    Chang, H.-S.; Fauve, E.; Park, D.-S.; Joo, J.-J.; Moon, K.-M.; Cho, K.-W.; Na, H. K.; Kwon, M.; Yang, S.-H.; Gistau-Baguer, G.

    2010-04-01

    The "first plasma" (100 kA of controllable plasma current for 100 ms) of KSTAR was successfully generated in July 2008. The major outstanding feature of KSTAR compared to most other tokamaks is that all the magnet coils are superconducting (SC), which enables higher plasma currents for longer durations once nominal operation status has been reached. However, to establish the operating condition for the SC coils, other cold components, such as thermal shields, coil-supporting structures, SC buslines, and current leads, also must be maintained at proper cryogenic temperature levels. A helium refrigeration system (HRS) with an exergetic equivalent cooling power of 9 kW at 4.5 K has been installed for this purpose and successfully commissioned. In this paper, we report on the operation results of the HRS during the first plasma campaign of KSTAR. Using the HRS, the 300-ton cold mass of KSTAR was cooled down from ambient to the operating temperature levels of each cold component. Stable and steady cryogenic conditions, proper for the generation of the "first plasma", were maintained for three months, after which all of the cold mass was warmed up again to ambient temperature.
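An "exergetic equivalent" cooling power combines loads at different temperature levels by weighting each with its Carnot factor; a sketch of that conversion with invented load values (the actual KSTAR load breakdown is not given here):

```python
# Convert cooling loads at various temperatures into a single 4.5 K
# "exergetic equivalent" via the Carnot factor (T0/T - 1).
# The load values below are illustrative, not KSTAR data.
T0 = 300.0  # K, ambient reference temperature

def carnot_factor(T):
    """Minimum (ideal) work per unit of heat removed at temperature T (K)."""
    return T0 / T - 1.0

loads = {4.5: 3.0e3, 70.0: 20.0e3}  # W of cooling at each level (assumed)

equivalent_at_4p5 = sum(q * carnot_factor(T) / carnot_factor(4.5)
                        for T, q in loads.items())
print(f"4.5 K equivalent cooling power ~ {equivalent_at_4p5 / 1e3:.2f} kW")
```

Note how a large shield load at 70 K contributes only a small 4.5 K equivalent, because removing heat at 70 K costs far less ideal work than at 4.5 K.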

  7. Environmental sustainability assessments of pharmaceuticals: an emerging need for simplification in life cycle assessments.

    PubMed

    De Soete, Wouter; Debaveye, Sam; De Meester, Steven; Van der Vorst, Geert; Aelterman, Wim; Heirman, Bert; Cappuyns, Philippe; Dewulf, Jo

    2014-10-21

    The pharmaceutical and fine chemical industries are eager to strive toward innovative products and technologies. This study first derives hotspots in resource consumption of 2839 Basic Operations in 40 Active Pharmaceutical Ingredient synthesis steps through Exergetic Life Cycle Assessment (ELCA). Second, since companies are increasingly obliged to quantify the environmental sustainability of their products, two alternative ways of simplifying (E)LCA are discussed. The usage of averaged product group values (R² = 3.40 × 10⁻³⁰) is compared with multiple linear regression models (R² = 8.66 × 10⁻¹) in order to estimate resource consumption of synthesis steps. An optimal set of predictor variables is postulated to balance model complexity and embedded information with usability and capability of merging models with existing Enterprise Resource Planning (ERP) data systems. The amount of organic solvents used, molar efficiency, and duration of a synthesis step were shown to be the most significant predictor variables. Including additional predictor variables did not contribute to the predictive power and eventually weakens the model interpretation. Ideally, an organization should be able to derive its environmental impact from readily available ERP data, linking supply chains back to the cradle of resource extraction, excluding the need for an approximation with product group averages.
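The multiple-linear-regression simplification described above can be sketched as follows, with invented data standing in for the study's dataset (predictors per the abstract: organic solvent amount, molar efficiency, step duration):

```python
import numpy as np

# Hypothetical example: fit resource consumption (exergy, MJ) of a synthesis
# step from the three predictors the study found most significant. All data
# below are invented for illustration; the paper's dataset is not public here.
rng = np.random.default_rng(0)
n = 40
solvent_kg = rng.uniform(5, 200, n)     # organic solvent used per step
molar_eff = rng.uniform(0.2, 0.95, n)   # molar efficiency
duration_h = rng.uniform(1, 24, n)      # step duration

# Assumed "true" relationship plus noise, just to have something to fit.
exergy_MJ = (12.0 * solvent_kg - 300.0 * molar_eff + 8.0 * duration_h
             + rng.normal(0, 20, n))

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), solvent_kg, molar_eff, duration_h])
coef, *_ = np.linalg.lstsq(X, exergy_MJ, rcond=None)

# Coefficient of determination for the fitted model.
pred = X @ coef
ss_res = np.sum((exergy_MJ - pred) ** 2)
ss_tot = np.sum((exergy_MJ - exergy_MJ.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"R^2 = {r2:.3f}")
```

A product-group average corresponds to fitting the intercept alone, which is why its R² is essentially zero when steps vary widely.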

  8. Nuclear Hybrid Energy System: Molten Salt Energy Storage (Summer Report 2013)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sabharwall, Piyush; mckellar, Michael George; Yoon, Su-Jong

    2013-11-01

    Effective energy use is a main focus and concern in the world today because of the growing demand for energy. The nuclear hybrid energy system (NHES) is a valuable technical concept that can potentially diversify and leverage existing energy technologies. This report considers a particular NHES design that combines multiple energy systems including a nuclear reactor, energy storage system (ESS), variable renewable generator (VRG), and additional process heat applications. Energy storage is an essential component of this particular NHES because its design allows the system to produce peak power while the nuclear reactor operates at constant power output. Many energy storage options are available, but this study mainly focuses on a molten salt ESS. The primary purpose of the molten salt ESS is to enable the nuclear reactor to be a purely constant heat source by acting as a heat storage component for the reactor during times of low demand, and providing additional capacity for thermo-electric power generation during times of peak electricity demand. This report describes the rationale behind using a molten salt ESS and identifies an efficient molten salt ESS configuration that may be used in load-following power applications. Several criteria are considered for effective energy storage and are used to identify the most effective ESS within the NHES. Different types of energy storage are briefly described with their advantages and disadvantages. The general analysis to determine the most efficient molten salt ESS involves two parts: thermodynamic, in which energetic and exergetic efficiencies are considered, and economic. Within the molten salt ESS, the two-part analysis covers three major system elements: the molten salt ESS design (two-tank direct and thermocline), the molten salt choice, and the different power cycles coupled with the molten salt ESS. Analysis models are formulated and analyzed to determine the most effective ESS. The results show that the most efficient idealized energy storage system is the two-tank direct molten salt ESS with an air Brayton combined cycle using LiF-NaF-KF as the molten salt, and the most economical is the same design with KCl-MgCl2 as the molten salt. With energy production being a major worldwide industry, understanding the most efficient molten salt ESS boosts development of an effective NHES with cheap, clean, and steady power.
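The storage capacity of a two-tank sensible-heat molten salt ESS follows from E = m·cp·ΔT; a small sketch with illustrative property values (not taken from the report):

```python
# Sensible-heat capacity of a two-tank molten salt store: E = m * cp * dT.
# All property values below are illustrative assumptions, not from the report.
m = 1.0e6     # kg of salt inventory
cp = 1.55e3   # J/(kg K), assumed specific heat of the salt
dT = 200.0    # K, temperature difference between hot and cold tanks

energy_MWh = m * cp * dT / 3.6e9  # J -> MWh(th)
print(f"stored thermal energy ~ {energy_MWh:.0f} MWh(th)")
```

The salt choice enters through cp and the usable ΔT (melting point to thermal-stability limit), which is one reason different salts rank differently on thermodynamic versus economic criteria.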

  9. Regenerative Carbonate-Based Thermochemical Energy Storage System for Concentrating Solar Power

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gangwal, Santosh; Muto, Andrew

    Southern Research has developed a thermochemical energy storage (TCES) technology that utilizes the endothermic-exothermic reversible carbonation of calcium oxide (lime) to store thermal energy at high temperatures, such as those achieved by next generation concentrating solar power (CSP) facilities. The major challenges addressed in the development of this system include refining a high capacity, yet durable, sorbent material and designing a low thermal resistance, low-cost heat exchanger reactor system to move heat between the sorbent and a heat transfer fluid under conditions relevant for CSP operation (e.g., energy density, reaction kinetics, heat flow). The proprietary stabilized sorbent was developed by Precision Combustion, Inc. (PCI). A factorial matrix of sorbent compositions covering the design space was tested using accelerated high throughput screening in a thermo-gravimetric analyzer. Several promising formulations were selected for more thorough evaluation, and one formulation with high capacity (0.38 g CO2/g sorbent) and durability (>99.7% capacity retention over 100 cycles) was chosen as a basis for further development of the energy storage reactor system. In parallel with this effort, a full range of currently available commercial and developmental heat exchange reactor systems and sorbent loading methods were examined through literature research and contacts with commercial vendors. Process models were developed to examine whether a heat exchange reactor system and balance of plant can meet required TCES performance and cost targets, optimizing tradeoffs between thermal performance, exergetic efficiency, and cost. Reactor types evaluated included many forms, from microchannel reactors, to diffusion bonded heat exchangers, to shell and tube heat exchangers. The most viable design for application to a supercritical CO2 power cycle operating at 200-300 bar pressure and >700°C was determined to be a combination of a diffusion bonded heat exchanger with a shell and tube reactor. A bench scale reactor system was then designed and constructed to test sorbent performance under more commercially relevant conditions. This system utilizes a tube-in-tube reactor design containing approximately 250 grams of sorbent and is able to operate under the wide range of temperature, pressure and flow conditions needed to explore system performance under a variety of operating conditions. A variety of sorbent loading methods may be tested using the reactor design. Initial bench test results over 25 cycles showed very high sorbent stability (>99%) and sufficient capacity (>0.28 g CO2/g sorbent) for an economical commercial-scale system. Initial technoeconomic evaluation of the proposed storage system shows that the sorbent cost should not have a significant impact on overall system cost, and that the largest cost impacts come from the heat exchanger reactor and balance of plant equipment, including compressors and gas storage, due to the high temperatures required for sCO2 cycles. Current estimated system costs are $47/kWh(th) based on current material and equipment cost estimates.
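The reported sorbent capacity translates into a thermal storage density via the CaO carbonation enthalpy; a quick estimate using the standard literature value of roughly 178 kJ/mol CO2 (this enthalpy is not given in the report and is an assumption here):

```python
# Storage density implied by the reported sorbent capacity.
capacity = 0.38   # g CO2 captured per g sorbent (from the abstract)
M_co2 = 44.01     # g/mol, molar mass of CO2
dH = 178.0        # kJ/mol CO2, CaO carbonation enthalpy (literature value)

mol_per_kg = 1000.0 * capacity / M_co2    # mol CO2 per kg sorbent
energy_kj_per_kg = mol_per_kg * dH        # stored reaction heat
kwh_per_kg = energy_kj_per_kg / 3600.0
print(f"~{kwh_per_kg:.2f} kWh(th) per kg sorbent")
```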

  10. Evaluation of Hybrid Power Plants using Biomass, Photovoltaics and Steam Electrolysis for Hydrogen and Power Generation

    NASA Astrophysics Data System (ADS)

    Petrakopoulou, F.; Sanz, J.

    2014-12-01

    Steam electrolysis is a promising process for large-scale centralized hydrogen production, and it is also considered an excellent option for the efficient use of renewable solar and geothermal energy resources. This work studies the operation of an intermediate temperature steam electrolyzer (ITSE) and its incorporation into hybrid power plants that include biomass combustion and photovoltaic panels (PV). The plants generate both electricity and hydrogen. The reference (biomass) power plant and four variations of a hybrid biomass-PV plant, incorporating the reference biomass plant and the ITSE, are simulated and evaluated using exergetic analysis. The variations of the hybrid power plants are associated with (1) the air recirculation from the electrolyzer to the biomass power plant, (2) the elimination of the sweep gas of the electrolyzer, (3) the replacement of two electric heaters with gas/gas heat exchangers, and (4) the replacement of two heat exchangers of the reference electrolyzer unit with one heat exchanger that uses steam from the biomass power plant. In all cases, 60% of the electricity required in the electrolyzer is covered by the biomass plant and 40% by the photovoltaic panels. When comparing the hybrid plants with the reference biomass power plant, which has identical operation and structure to that incorporated in the hybrid plants, we observe an efficiency decrease that varies depending on the scenario. The efficiency decrease stems mainly from the low effectiveness of the photovoltaic panels (14.4%). When comparing the hybrid scenarios, we see that the elimination of the sweep gas decreases the power consumption due to the elimination of the compressor used to cover the pressure losses of the filter, the heat exchangers and the electrolyzer. Nevertheless, if the sweep gas is used to preheat the air entering the boiler of the biomass power plant, the efficiency of the plant increases. When replacing the electric heaters with gas-gas heat exchangers, the efficiency of the plant increases, although the higher pressure losses of the flue-gas path increase the requirements of the air compressor. Finally, replacing the two heat exchangers of the electrolyzer unit with one that uses extracted steam from the biomass power plant can lead to an overall decrease in the operating and investment costs of the plant.

  11. Debugging and Performance Analysis Software Tools for Peregrine System |

    Science.gov Websites

    Learn about debugging and performance analysis software tools, such as Allinea, available to use with the Peregrine high-performance computing system at NREL.

  12. Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0

    DOE PAGES

    Huck, Kevin A.; Malony, Allen D.; Shende, Sameer; ...

    2008-01-01

    The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments presents a challenge to manage and process the information. Simply characterizing the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering and correlation analysis of individual trials of large dimensions, and can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We give examples of large-scale analysis results, and discuss the future development of the framework, including the encoding and processing of expert performance rules, and the increasing use of performance metadata.
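The dimension-reduction step such frameworks apply to large trial data can be illustrated with a generic PCA sketch on synthetic per-process profiles (this is not PerfExplorer code):

```python
import numpy as np

# Project per-process performance profiles (one row per process, one column
# per timed region) onto their top principal components, so that groups of
# processes with similar behavior become visible. Data are synthetic.
rng = np.random.default_rng(1)
profiles = np.vstack([
    rng.normal([10, 50, 5], 1.0, size=(64, 3)),   # "compute-bound" processes
    rng.normal([10, 20, 35], 1.0, size=(64, 3)),  # "communication-bound"
])

centered = profiles - profiles.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ vt[:2].T  # 2-D embedding of each process

# The two behavioral groups separate cleanly along the first component.
group_a = scores[:64, 0].mean()
group_b = scores[64:, 0].mean()
print(f"component-1 means: {group_a:.1f} vs {group_b:.1f}")
```

Clustering the low-dimensional scores (rather than the raw profiles) is what keeps such analysis tractable at hundreds of thousands of cores.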

  13. Model-Based Linkage Analysis of a Quantitative Trait.

    PubMed

    Song, Yeunjoo E; Song, Sunah; Schnell, Audrey H

    2017-01-01

    Linkage analysis is a family-based method to examine whether any typed genetic markers cosegregate with a given trait, in this case a quantitative trait. If linkage exists, this is taken as evidence in support of a genetic basis for the trait. Historically, linkage analysis was performed using a binary disease trait, but it has been extended to include quantitative disease measures. Quantitative traits are desirable as they provide more information than binary traits. Linkage analysis can be performed using single-marker methods (one marker at a time) or multipoint methods (using multiple markers simultaneously). In model-based linkage analysis the genetic model for the trait of interest is specified. There are many software options for performing linkage analysis. Here, we use the program package Statistical Analysis for Genetic Epidemiology (S.A.G.E.). S.A.G.E. was chosen because it also includes programs to perform data cleaning procedures and to generate and test genetic models for a quantitative trait, in addition to performing linkage analysis. We demonstrate in detail the process of running the program LODLINK to perform single-marker analysis, and MLOD to perform multipoint analysis using output from SEGREG, where SEGREG was used to determine the best fitting statistical model for the trait.
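The two-point statistic underlying model-based linkage programs such as LODLINK is the LOD score; a minimal sketch of the textbook formula (not S.A.G.E. code):

```python
import math

def lod_score(recombinants, nonrecombinants, theta):
    """Textbook two-point LOD score: log10 of the likelihood of observing
    r recombinants and n nonrecombinants at recombination fraction theta,
    relative to free recombination (theta = 0.5)."""
    r, n = recombinants, nonrecombinants
    l_theta = (theta ** r) * ((1 - theta) ** n)
    l_null = 0.5 ** (r + n)
    return math.log10(l_theta / l_null)

# 2 recombinants out of 10 informative meioses, evaluated at theta = 0.2:
print(round(lod_score(2, 8, 0.2), 2))  # 0.84
```

A LOD of 3 or more is the conventional evidence threshold for linkage; at theta = 0.5 the score is zero by construction.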

  14. Structural-Thermal-Optical-Performance (STOP) Analysis

    NASA Technical Reports Server (NTRS)

    Bolognese, Jeffrey; Irish, Sandra

    2015-01-01

    The presentation will be given at the 26th Annual Thermal Fluids Analysis Workshop (TFAWS 2015) hosted by the Goddard Spaceflight Center (GSFC) Thermal Engineering Branch (Code 545). A STOP analysis is a multidiscipline analysis, consisting of Structural, Thermal and Optical Performance Analyses, that is performed for all space flight instruments and satellites. This course will explain the different parts of performing this analysis. The student will learn how to effectively interact with each discipline in order to accurately obtain the system analysis results.

  15. Orbit Transfer Vehicle (OTV) engine, phase A study. Volume 2: Study

    NASA Technical Reports Server (NTRS)

    Mellish, J. A.

    1979-01-01

    The hydrogen-oxygen engine used in the orbit transfer vehicle (OTV) is described. The engine design is analyzed and minimum engine performance and man rating requirements are discussed. Reliability and safety analysis test results are presented and payload, risk and cost, and engine installation parameters are defined. Engine tests were performed including performance analysis, structural analysis, thermal analysis, turbomachinery analysis, controls analysis, and cycle analysis.

  16. SEP thrust subsystem performance sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Atkins, K. L.; Sauer, C. G., Jr.; Kerrisk, D. J.

    1973-01-01

    This is a two-part report on solar electric propulsion (SEP) performance sensitivity analysis. The first part describes the preliminary analysis of the SEP thrust system performance for an Encke rendezvous mission. A detailed description of thrust subsystem hardware tolerances on mission performance is included together with nominal spacecraft parameters based on these tolerances. The second part describes the method of analysis and graphical techniques used in generating the data for Part 1. Included is a description of both the trajectory program used and the additional software developed for this analysis. Part 2 also includes a comprehensive description of the use of the graphical techniques employed in this performance analysis.

  17. A case study in nonconformance and performance trend analysis

    NASA Technical Reports Server (NTRS)

    Maloy, Joseph E.; Newton, Coy P.

    1990-01-01

    As part of NASA's effort to develop an agency-wide approach to trend analysis, a pilot nonconformance and performance trending analysis study was conducted on the Space Shuttle auxiliary power unit (APU). The purpose of the study was to (1) demonstrate that nonconformance analysis can be used to identify repeating failures of a specific item (and the associated failure modes and causes) and (2) determine whether performance parameters could be analyzed and monitored to provide an indication of component or system degradation prior to failure. The nonconformance analysis of the APU did identify repeating component failures, which possibly could be reduced if key performance parameters were monitored and analyzed. The performance-trending analysis verified that the characteristics of hardware parameters can be effective in detecting degradation of hardware performance prior to failure.

  18. Performance analysis of mini-propellers based on FlightGear

    NASA Astrophysics Data System (ADS)

    Vogeltanz, Tomáš

    2016-06-01

    This paper presents a performance analysis of three mini-propellers based on the FlightGear flight simulator. Although a basic propeller analysis has to be performed before the use of FlightGear, for a complex and more practical performance analysis, it is advantageous to use a propeller model in cooperation with a particular aircraft model. This approach may determine whether the propeller has sufficient quality in respect of aircraft requirements. In the first section, the software used for the analysis is illustrated. Then, the parameters of the analyzed mini-propellers and the tested UAV are described. Finally, the main section shows and discusses the results of the performance analysis of the mini-propellers.

  19. Power analysis for multivariate and repeated measurements designs via SPSS: correction and extension of D'Amico, Neilands, and Zambarano (2001).

    PubMed

    Osborne, Jason W

    2006-05-01

    D'Amico, Neilands, and Zambarano (2001) published SPSS syntax to perform power analyses for three complex procedures: ANCOVA, MANOVA, and repeated measures ANOVA. Unfortunately, the published SPSS syntax for performing the repeated measures analysis needed some minor revision in order to perform the analysis correctly. This article presents the corrected syntax that will successfully perform the repeated measures analysis and provides some guidance on modifying the syntax to customize the analysis.
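The kind of repeated-measures power computation such syntax performs can also be approximated outside SPSS; a Monte Carlo sketch for the simplest repeated-measures design, a paired t test, with illustrative effect size and sample size (not values from the article):

```python
import numpy as np
from scipy import stats

# Monte Carlo power estimate for a paired-samples t test. The effect size
# (standardized mean difference) and n below are illustrative assumptions.
def paired_t_power(n=30, effect=0.5, alpha=0.05, sims=20000, seed=0):
    rng = np.random.default_rng(seed)
    crit = stats.t.ppf(1 - alpha / 2, df=n - 1)  # two-sided critical value
    # Simulate standardized within-pair differences and compute t statistics.
    diffs = rng.normal(effect, 1.0, size=(sims, n))
    t = diffs.mean(axis=1) / (diffs.std(axis=1, ddof=1) / np.sqrt(n))
    return np.mean(np.abs(t) > crit)

print(f"power ~ {paired_t_power():.2f}")
```

Simulation generalizes directly to full repeated-measures ANOVA designs where closed-form noncentrality calculations become error-prone, which is precisely the class of mistake the correction addresses.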

  20. Performance Analysis of Multilevel Parallel Applications on Shared Memory Architectures

    NASA Technical Reports Server (NTRS)

    Biegel, Bryan A. (Technical Monitor); Jost, G.; Jin, H.; Labarta J.; Gimenez, J.; Caubet, J.

    2003-01-01

    Parallel programming paradigms include process level parallelism, thread level parallelization, and multilevel parallelism. This viewgraph presentation describes a detailed performance analysis of these paradigms for Shared Memory Architecture (SMA). This analysis uses the Paraver Performance Analysis System. The presentation includes diagrams of a flow of useful computations.

  1. EEG Correlates of Fluctuation in Cognitive Performance in an Air Traffic Control Task

    DTIC Science & Technology

    2014-11-01

    Non-parametric statistical analysis was used to identify neurophysiological patterns due to the time-on-task effect; significant changes in EEG power were found. Keywords: EEG, cognitive performance, power spectral analysis, non-parametric analysis. The document is available to the public through the Internet.

  2. A Cross-Platform Infrastructure for Scalable Runtime Application Performance Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jack Dongarra; Shirley Moore; Bart Miller; Jeffrey Hollingsworth

    2005-03-15

    The purpose of this project was to build an extensible cross-platform infrastructure to facilitate the development of accurate and portable performance analysis tools for current and future high performance computing (HPC) architectures. Major accomplishments include tools and techniques for multidimensional performance analysis, as well as improved support for dynamic performance monitoring of multithreaded and multiprocess applications. Previous performance tool development has been limited by the burden of having to re-write a platform-dependent low-level substrate for each architecture/operating system pair in order to obtain the necessary performance data from the system. Manual interpretation of performance data is not scalable for large-scale, long-running applications. The infrastructure developed by this project provides a foundation for building portable and scalable performance analysis tools, with the end goal of providing application developers with the information they need to analyze, understand, and tune the performance of terascale applications on HPC architectures. The backend portion of the infrastructure provides runtime instrumentation capability and access to hardware performance counters, with thread-safety for shared memory environments and a communication substrate to support instrumentation of multiprocess and distributed programs. Front-end interfaces provide tool developers with a well-defined, platform-independent set of calls for requesting performance data. End-user tools have been developed that demonstrate runtime data collection, on-line and off-line analysis of performance data, and multidimensional performance analysis. The infrastructure is based on two underlying performance instrumentation technologies: the PAPI cross-platform library interface to hardware performance counters and the cross-platform Dyninst library interface for runtime modification of executable images. The Paradyn and KOJAK projects have made use of this infrastructure to build performance measurement and analysis tools that scale to long-running programs on large parallel and distributed systems and that automate much of the search for performance bottlenecks.

  3. Dispersion analysis for baseline reference mission 2

    NASA Technical Reports Server (NTRS)

    Snow, L. S.

    1975-01-01

    A dispersion analysis considering uncertainties (or perturbations) in platform, vehicle, and environmental parameters was performed for baseline reference mission (BRM) 2. The dispersion analysis is based on the nominal trajectory for BRM 2. The analysis was performed to determine state vector and performance dispersions (or variations) which result from the indicated uncertainties. The dispersions are determined at major mission events and fixed times from liftoff (time slices). The dispersion results will be used to evaluate the capability of the vehicle to perform the mission within a specified level of confidence and to determine flight performance reserves.
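A dispersion analysis of this kind can be sketched as a Monte Carlo propagation of parameter uncertainties to a performance quantity; the toy trajectory model and 1-sigma values below are invented for illustration, not BRM 2 data:

```python
import numpy as np

# Propagate platform/vehicle/environment uncertainties through a simple
# performance model (ideal rocket equation as a stand-in for a trajectory
# simulation). All nominal values and sigmas are illustrative assumptions.
rng = np.random.default_rng(42)
n = 100_000

isp = rng.normal(450.0, 2.0, n)              # specific impulse, s
prop_mass = rng.normal(20_000.0, 100.0, n)   # usable propellant, kg
dry_mass = rng.normal(5_000.0, 50.0, n)      # dry mass, kg

g0 = 9.80665  # m/s^2, standard gravity
dv = isp * g0 * np.log((dry_mass + prop_mass) / dry_mass)  # ideal delta-v

mean, sigma = dv.mean(), dv.std()
print(f"delta-v = {mean:.0f} +/- {3 * sigma:.0f} m/s (3-sigma)")
```

The 3-sigma dispersion on the performance quantity is the kind of number used to size flight performance reserves.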

  4. COBRA ATD minefield detection model initial performance analysis

    NASA Astrophysics Data System (ADS)

    Holmes, V. Todd; Kenton, Arthur C.; Hilton, Russell J.; Witherspoon, Ned H.; Holloway, John H., Jr.

    2000-08-01

    A statistical performance analysis of the USMC Coastal Battlefield Reconnaissance and Analysis (COBRA) Minefield Detection (MFD) Model has been performed in support of the COBRA ATD Program under execution by the Naval Surface Warfare Center/Dahlgren Division/Coastal Systems Station. This analysis uses the Veridian ERIM International MFD model from the COBRA Sensor Performance Evaluation and Computational Tools for Research Analysis modeling toolbox and a collection of multispectral mine detection algorithm response distributions for mines and minelike clutter objects. These mine detection response distributions were generated from actual COBRA ATD test missions over littoral zone minefields. This analysis serves to validate both the utility and effectiveness of the COBRA MFD Model as a predictive MFD performance tool. COBRA ATD minefield detection model algorithm performance results based on a simulated baseline minefield detection scenario are presented, as well as results of an MFD model algorithm parametric sensitivity study.

  5. Acquisition and production of skilled behavior in dynamic decision-making tasks. Semiannual Status Report M.S. Thesis - Georgia Inst. of Tech., Nov. 1992

    NASA Technical Reports Server (NTRS)

    Kirlik, Alex; Kossack, Merrick Frank

    1993-01-01

    This status report consists of a thesis entitled 'Ecological Task Analysis: A Method for Display Enhancements.' Previous use of various analysis processes for display interface design or enhancement has risked failing to improve user performance because the analysis yields only a sequential listing of user tasks. Adopting an ecological approach to the task analysis, however, may produce the modeling of an unpredictable and variable task domain required to improve user performance. Kirlik has proposed an Ecological Task Analysis framework designed for this purpose. The purpose of this research is to measure the framework's effectiveness at enhancing display interfaces in order to improve user performance. Following the proposed framework, an ecological task analysis of experienced users of a complex and dynamic laboratory task, Star Cruiser, was performed. Based on this analysis, display enhancements were proposed and implemented. An experiment was then conducted to compare this new version of Star Cruiser to the original. By measuring user performance at different tasks, it was determined that during early sessions, use of the enhanced display contributed to better user performance than the original display. Furthermore, the results indicate that the enhancements proposed as a result of the ecological task analysis affected user performance differently depending on whether they aid in the selection of a possible action or in the performance of an action. Generalizations of these findings to larger, more complex systems were avoided since the analysis was performed on only this one system.

  6. Performance analysis of the ascent propulsion system of the Apollo spacecraft

    NASA Technical Reports Server (NTRS)

    Hooper, J. C., III

    1973-01-01

    Activities involved in the performance analysis of the Apollo lunar module ascent propulsion system are discussed. A description of the ascent propulsion system, including hardware, instrumentation, and system characteristics, is included. The methods used to predict the inflight performance and to establish performance uncertainties of the ascent propulsion system are discussed. The techniques of processing the telemetered flight data and performing postflight performance reconstruction to determine actual inflight performance are discussed. Problems that have been encountered and results from the analysis of the ascent propulsion system performance during the Apollo 9, 10, and 11 missions are presented.

  7. Box truss analysis and technology development. Task 1: Mesh analysis and control

    NASA Technical Reports Server (NTRS)

    Bachtell, E. E.; Bettadapur, S. S.; Coyner, J. V.

    1985-01-01

    An analytical tool was developed to model, analyze and predict RF performance of box truss antennas with reflective mesh surfaces. The analysis system is unique in that it integrates custom written programs for cord tied mesh surfaces, thereby drastically reducing the cost of analysis. The analysis system is capable of determining the RF performance of antennas under any type of manufacturing or operating environment by integrating together the various disciplines of design, finite element analysis, surface best fit analysis and RF analysis. The Integrated Mesh Analysis System consists of six separate programs: The Mesh Tie System Model Generator, The Loadcase Generator, The Model Optimizer, The Model Solver, The Surface Topography Solver and The RF Performance Solver. Additionally, a study using the mesh analysis system was performed to determine the effect of on orbit calibration, i.e., surface adjustment, on a typical box truss antenna.

  8. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    NASA Technical Reports Server (NTRS)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  9. The Current State of Human Performance Technology: A Citation Network Analysis of "Performance Improvement Quarterly," 1988-2010

    ERIC Educational Resources Information Center

    Cho, Yonjoo; Jo, Sung Jun; Park, Sunyoung; Kang, Ingu; Chen, Zengguan

    2011-01-01

    This study conducted a citation network analysis (CNA) of human performance technology (HPT) to examine the current state of the field. Previous reviews of the field have used traditional research methods, such as content analysis, survey, Delphi, and citation analysis. The distinctive features of CNA come from using a social network analysis…
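At its core, a citation network analysis treats articles as nodes and citations as directed edges. The minimal sketch below uses an invented citation list and in-degree (times cited within the network) as the simplest centrality measure; the actual study used richer social-network metrics.

```python
# Toy citation network analysis: articles as nodes, citations as directed
# edges, in-degree as a simple centrality measure. Citation list is invented.
from collections import defaultdict

citations = [  # (citing article, cited article) - hypothetical identifiers
    ("A2010", "A1995"), ("A2010", "A2001"),
    ("A2008", "A1995"), ("A2005", "A1995"),
    ("A2008", "A2001"),
]

in_degree = defaultdict(int)
for citing, cited in citations:
    in_degree[cited] += 1

# Rank articles by how often they are cited within the network.
ranking = sorted(in_degree.items(), key=lambda kv: kv[1], reverse=True)
print(ranking)  # [('A1995', 3), ('A2001', 2)]
```

Highly cited nodes identified this way are the "influential works" that such reviews typically report.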

  10. Comparative analysis of techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Hitt, E. F.; Bridgman, M. S.; Robinson, A. C.

    1981-01-01

    Performability analysis is a technique developed for evaluating the effectiveness of fault-tolerant computing systems in multiphase missions. Performability was evaluated for its accuracy, practical usefulness, and relative cost. The evaluation was performed by applying performability and the fault tree method to a set of sample problems ranging from simple to moderately complex. The problems involved as many as five outcomes, two to five mission phases, permanent faults, and some functional dependencies. Transient faults and software errors were not considered. A different analyst was responsible for each technique. Significantly more time and effort were required to learn performability analysis than the fault tree method. Performability is inherently as accurate as fault tree analysis. For the sample problems, fault trees were more practical and less time consuming to apply, while performability required less ingenuity and was more checkable. Performability offers some advantages for evaluating very complex problems.
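The fault tree method used here as the comparison baseline can be illustrated with a minimal evaluator for independent basic events combined through AND/OR gates. The example tree and failure probabilities below are invented, not taken from the study's sample problems.

```python
# Minimal fault-tree evaluation: top-event probability from independent
# basic events through AND/OR gates. Tree and probabilities are invented.

def and_gate(probs):
    # All input events must occur: product of probabilities.
    out = 1.0
    for p in probs:
        out *= p
    return out

def or_gate(probs):
    # At least one input event occurs: complement of "none occur".
    none = 1.0
    for p in probs:
        none *= (1.0 - p)
    return 1.0 - none

# Hypothetical system: fails if both redundant computers fail, or the
# shared bus fails.
p_computer = 1e-3   # per-mission failure probability (assumed)
p_bus = 1e-4

p_top = or_gate([and_gate([p_computer, p_computer]), p_bus])
print(p_top)  # ≈ 1.01e-4, dominated by the single-point bus failure
```

Performability analysis generalizes this single failure probability to a distribution over multiple mission outcomes and phases.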

  11. Performance optimisations for distributed analysis in ALICE

    NASA Astrophysics Data System (ADS)

    Betev, L.; Gheata, A.; Gheata, M.; Grigoras, C.; Hristov, P.

    2014-06-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlations or resonances studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system still has to be improved by a significant factor to satisfy the analysis needs. We have instrumented all analysis jobs with "sensors" collecting comprehensive monitoring information on the job running conditions and performance in order to identify bottlenecks in the data processing flow. These data are collected by the MonALISA-based ALICE Grid monitoring system and are used to steer and improve the job submission and management policy, to identify operational problems in real time and to perform automatic corrective actions. In parallel with an upgrade of our production system we are aiming for low-level improvements related to data format, data management and merging of results to allow for a better performing ALICE analysis.

  12. Performance Analysis of GYRO: A Tool Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Worley, P.; Roth, P.; Candy, J.

    2005-06-26

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wall clock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

  13. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    EPA Science Inventory

    This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit m...

  14. Using Importance-Performance Analysis to Guide Instructional Design of Experiential Learning Activities

    ERIC Educational Resources Information Center

    Anderson, Sheri; Hsu, Yu-Chang; Kinney, Judy

    2016-01-01

    Designing experiential learning activities requires an instructor to think about what they want the students to learn. Using importance-performance analysis can assist with the instructional design of the activities. This exploratory study used importance-performance analysis in an online introduction to criminology course. There is limited…
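The quadrant logic behind importance-performance analysis is simple enough to sketch. The activity names and ratings below are invented, and the quadrant labels follow the common Martilla-James convention rather than anything specific to the cited course.

```python
# Importance-performance analysis (IPA): classify items into quadrants
# split at the grand means of importance and performance ratings.
# All data are hypothetical.

def ipa_quadrants(items):
    """items: dict mapping item name -> (mean importance, mean performance)."""
    imp_mean = sum(i for i, _ in items.values()) / len(items)
    perf_mean = sum(p for _, p in items.values()) / len(items)
    labels = {}
    for name, (imp, perf) in items.items():
        if imp >= imp_mean and perf < perf_mean:
            labels[name] = "Concentrate Here"
        elif imp >= imp_mean and perf >= perf_mean:
            labels[name] = "Keep Up the Good Work"
        elif imp < imp_mean and perf < perf_mean:
            labels[name] = "Low Priority"
        else:
            labels[name] = "Possible Overkill"
    return labels

# Hypothetical learning activities rated on 1-5 scales.
activities = {
    "field notes":    (4.6, 3.1),
    "peer review":    (4.2, 4.4),
    "reflection log": (2.9, 2.5),
    "photo essay":    (2.7, 4.0),
}
print(ipa_quadrants(activities))
```

Items landing in "Concentrate Here" (important but underperforming) are the ones an instructional designer would redesign first.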

  15. Performance Reports: Mirror alignment system performance prediction comparison between SAO and EKC

    NASA Technical Reports Server (NTRS)

    Tananbaum, H. D.; Zhang, J. P.

    1994-01-01

    The objective of this study is to perform an independent analysis of the residual high resolution mirror assembly (HRMA) mirror distortions caused by force and moment errors in the mirror alignment system (MAS) to statistically predict the HRMA performance. These performance predictions are then compared with those performed by Kodak to verify their analysis results.

  16. Initial empirical analysis of nuclear power plant organization and its effect on safety performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olson, J.; McLaughlin, S.D.; Osborn, R.N.

    This report contains an analysis of the relationship between selected aspects of organizational structure and the safety-related performance of nuclear power plants. The report starts by identifying and operationalizing certain key dimensions of organizational structure that may be expected to be related to plant safety performance. Next, indicators of plant safety performance are created by combining existing performance measures into more reliable indicators. Finally, the indicators of plant safety performance are analyzed using correlational and discriminant analysis. The overall results show that plants with better developed coordination mechanisms, shorter vertical hierarchies, and a greater number of departments tend to perform more safely.

  17. Integrating Reliability Analysis with a Performance Tool

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

    1995-01-01

    A large number of commercial simulation tools support performance oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production quality simulation based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

  18. Role of IAC in large space systems thermal analysis

    NASA Technical Reports Server (NTRS)

    Jones, G. K.; Skladany, J. T.; Young, J. P.

    1982-01-01

    Computer analysis programs to evaluate critical coupling effects that can significantly influence spacecraft system performance are described. These coupling effects arise from the varied parameters of the spacecraft systems, environments, and forcing functions associated with disciplines such as thermal, structures, and controls. Adverse effects can be expected to significantly impact system design aspects such as structural integrity, controllability, and mission performance. One such needed design analysis capability is a software system that can integrate individual discipline computer codes into a highly user-oriented/interactive-graphics-based analysis capability. The integrated analysis capability (IAC) system can be viewed as: a core framework system which serves as an integrating base whereby users can readily add desired analysis modules and as a self-contained interdisciplinary system analysis capability having a specific set of fully integrated multidisciplinary analysis programs that deal with the coupling of thermal, structures, controls, antenna radiation performance, and instrument optical performance disciplines.

  19. Neophyte experiences of football (soccer) match analysis: a multiple case study approach.

    PubMed

    McKenna, Mark; Cowan, Daryl Thomas; Stevenson, David; Baker, Julien Steven

    2018-03-05

    Performance analysis is extensively used in sport, but its pedagogical application is little understood. Given its expanding role across football, this study explored the experiences of neophyte performance analysts. Experiences of six analysis interns, across three professional football clubs, were investigated as multiple cases of new match analysis. Each intern was interviewed after their first season, with archival data providing background information. Four themes emerged from qualitative analysis: (1) "building of relationships" was important, along with trust and role clarity; (2) "establishing an analysis system" was difficult due to tacit coach knowledge, but analysis was established; (3) the quality of the "feedback process" hinged on coaching styles, with balance of feedback and athlete engagement considered essential; (4) "establishing effect" was complex with no statistical effects reported; yet enhanced relationships, role clarity, and improved performances were reported. Other emic accounts are required to further understand occupational culture within performance analysis.

  20. OAO battery data analysis

    NASA Technical Reports Server (NTRS)

    Gaston, S.; Wertheim, M.; Orourke, J. A.

    1973-01-01

    Summary, consolidation and analysis of specifications, manufacturing process and test controls, and performance results for OAO-2 and OAO-3 lot 20 Amp-Hr sealed nickel cadmium cells and batteries are reported. Correlation of improvements in control requirements with performance is a key feature. Updates for a cell/battery computer model to improve performance prediction capability are included. Applicability of regression analysis computer techniques to relate process controls to performance is checked.

  1. Boosting Stochastic Problem Solvers Through Online Self-Analysis of Performance

    DTIC Science & Technology

    2003-07-21

    Boosting Stochastic Problem Solvers Through Online Self-Analysis of Performance. Vincent A. Cicirello, CMU-RI-TR-03-27. Submitted in partial fulfillment... This work led to the development of a search control framework, called QD-BEACON, that uses online-generated statistical models of search performance to

  2. Performance characterization of image and video analysis systems at Siemens Corporate Research

    NASA Astrophysics Data System (ADS)

    Ramesh, Visvanathan; Jolly, Marie-Pierre; Greiffenhagen, Michael

    2000-06-01

    There has been a significant increase in commercial products using imaging analysis techniques to solve real-world problems in diverse fields such as manufacturing, medical imaging, document analysis, transportation and public security, etc. This has been accelerated by various factors: more advanced algorithms, the availability of cheaper sensors, and faster processors. While algorithms continue to improve in performance, a major stumbling block in translating improvements in algorithms to faster deployment of image analysis systems is the lack of characterization of the limits of algorithms and how they affect total system performance. The research community has realized the need for performance analysis and there have been significant efforts in the last few years to remedy the situation. Our efforts at SCR have been on statistical modeling and characterization of modules and systems. The emphasis is on both white-box and black-box methodologies to evaluate and optimize vision systems. In the first part of this paper we review the literature on performance characterization and then provide an overview of the status of research in performance characterization of image and video understanding systems. The second part of the paper is on performance evaluation of medical image segmentation algorithms. Finally, we highlight some research issues in performance analysis in medical imaging systems.

  3. Seven Performance Drivers.

    ERIC Educational Resources Information Center

    Ross, Linda

    2003-01-01

    Recent work with automotive e-commerce clients led to the development of a performance analysis methodology called the Seven Performance Drivers, including: standards, incentives, capacity, knowledge and skill, measurement, feedback, and analysis. This methodology has been highly effective in introducing and implementing performance improvement.…

  4. Multiphysics Nuclear Thermal Rocket Thrust Chamber Analysis

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See

    2005-01-01

    The objective of this effort is to develop an efficient and accurate thermo-fluid computational methodology to predict environments for hypothetical thrust chamber design and analysis. The current task scope is to perform multidimensional, multiphysics analysis of thrust performance and heat transfer for a hypothetical solid-core nuclear thermal engine, including thrust chamber and nozzle. The multiphysics aspects of the model include: real fluid dynamics, chemical reactivity, turbulent flow, and conjugate heat transfer. The model will be designed to identify thermal, fluid, and hydrogen environments in all flow paths and materials. This model would then be used to perform non-nuclear reproduction of the flow element failures demonstrated in the Rover/NERVA testing, investigate performance of specific configurations, and assess potential issues and enhancements. A two-pronged approach will be employed in this effort: a detailed analysis of a multi-channel flow element, and global modeling of the entire thrust chamber assembly with a porosity modeling technique. It is expected that the detailed analysis of a single flow element would provide detailed fluid, thermal, and hydrogen environments for stress analysis, while the global thrust chamber assembly analysis would promote understanding of the effects of hydrogen dissociation and heat transfer on thrust performance. These modeling activities will be validated as much as possible by testing performed by other related efforts.

  5. Evaluation of adding item-response theory analysis for evaluation of the European Board of Ophthalmology Diploma examination.

    PubMed

    Mathysen, Danny G P; Aclimandos, Wagih; Roelant, Ella; Wouters, Kristien; Creuzot-Garcher, Catherine; Ringens, Peter J; Hawlina, Marko; Tassignon, Marie-José

    2013-11-01

    To investigate whether introduction of item-response theory (IRT) analysis, in parallel to the 'traditional' statistical analysis methods available for performance evaluation of multiple T/F items as used in the European Board of Ophthalmology Diploma (EBOD) examination, has proved beneficial, and secondly, to study whether the overall assessment performance of the current written part of EBOD is sufficiently high (KR-20 ≥ 0.90) for the format to be kept in future EBOD editions. 'Traditional' analysis methods for individual MCQ item performance comprise P-statistics, Rit-statistics and item discrimination, while overall reliability is evaluated through KR-20 for multiple T/F items. The additional set of statistical analysis methods for the evaluation of EBOD comprises mainly IRT analysis. These analysis techniques are used to monitor whether the introduction of negative marking for incorrect answers (since EBOD 2010) has a positive influence on the statistical performance of EBOD as a whole and of its individual test items in particular. Item-response theory analysis demonstrated that item performance parameters should not be evaluated individually, but should be related to one another. Before the introduction of negative marking, the overall EBOD reliability (KR-20) was good, though with room for improvement (EBOD 2008: 0.81; EBOD 2009: 0.78). After the introduction of negative marking, the overall reliability of EBOD improved significantly (EBOD 2010: 0.92; EBOD 2011: 0.91; EBOD 2012: 0.91). Although many statistical performance parameters are available to evaluate individual items, our study demonstrates that the overall reliability assessment remains the crucial parameter that allows comparison across examinations. While individual item performance analysis is worthwhile as a secondary analysis, drawing final conclusions from it is more difficult, because performance parameters need to be related to one another, as shown by IRT analysis. Therefore, IRT analysis has proved beneficial for the statistical analysis of EBOD. The introduction of negative marking has led to a significant increase in reliability (KR-20 > 0.90), indicating that the current examination format can be kept for future EBOD examinations. © 2013 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
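The KR-20 coefficient reported throughout this record has a simple closed form: KR-20 = (k/(k-1)) · (1 - Σ pⱼqⱼ / σ²), where pⱼ is the proportion answering item j correctly, qⱼ = 1 - pⱼ, and σ² is the variance of total scores. The sketch below computes it from an invented 0/1 response matrix, not EBOD data.

```python
# Kuder-Richardson formula 20 (KR-20) for dichotomously scored items.
# The response matrix in the example is invented for illustration.

def kr20(responses):
    """responses: list of examinee rows, each a list of 0/1 item scores."""
    n = len(responses)
    k = len(responses[0])
    # Item difficulty p_j (proportion correct); q_j = 1 - p_j.
    p = [sum(row[j] for row in responses) / n for j in range(k)]
    sum_pq = sum(pj * (1 - pj) for pj in p)
    # Population variance of the examinees' total scores.
    totals = [sum(row) for row in responses]
    mean_t = sum(totals) / n
    var_t = sum((t - mean_t) ** 2 for t in totals) / n
    return (k / (k - 1)) * (1 - sum_pq / var_t)

responses = [
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 0],
]
print(round(kr20(responses), 3))  # ≈ 0.8 for this toy matrix
```

Values at or above 0.90, as EBOD achieved after negative marking was introduced, are conventionally read as high-stakes-examination reliability.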

  6. Value-Added and Other Methods for Measuring School Performance: An Analysis of Performance Measurement Strategies in Teacher Incentive Fund Proposals. Research Brief

    ERIC Educational Resources Information Center

    National Center on Performance Incentives, 2008

    2008-01-01

    In "Value-Added and Other Methods for Measuring School Performance: An Analysis of Performance Measurement Strategies in Teacher Incentive Fund Proposals"--a paper presented at the February 2008 National Center on Performance Incentives research to policy conference--Robert Meyer and Michael Christian examine select performance-pay plans…

  7. Advanced flight design systems subsystem performance models. Sample model: Environmental analysis routine library

    NASA Technical Reports Server (NTRS)

    Parker, K. C.; Torian, J. G.

    1980-01-01

    A sample environmental control and life support model performance analysis using the environmental analysis routines library is presented. An example of a complete model set up and execution is provided. The particular model was synthesized to utilize all of the component performance routines and most of the program options.

  8. 32 CFR 989.38 - Requirements for analysis abroad.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Requirements for analysis abroad. 989.38 Section... PROTECTION ENVIRONMENTAL IMPACT ANALYSIS PROCESS (EIAP) § 989.38 Requirements for analysis abroad. (a) The EPF will generally perform the same functions for analysis of actions abroad that it performs in the...

  9. 32 CFR 989.38 - Requirements for analysis abroad.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 32 National Defense 6 2012-07-01 2012-07-01 false Requirements for analysis abroad. 989.38 Section... PROTECTION ENVIRONMENTAL IMPACT ANALYSIS PROCESS (EIAP) § 989.38 Requirements for analysis abroad. (a) The EPF will generally perform the same functions for analysis of actions abroad that it performs in the...

  10. 32 CFR 989.38 - Requirements for analysis abroad.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 6 2013-07-01 2013-07-01 false Requirements for analysis abroad. 989.38 Section... PROTECTION ENVIRONMENTAL IMPACT ANALYSIS PROCESS (EIAP) § 989.38 Requirements for analysis abroad. (a) The EPF will generally perform the same functions for analysis of actions abroad that it performs in the...

  11. 32 CFR 989.38 - Requirements for analysis abroad.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 6 2014-07-01 2014-07-01 false Requirements for analysis abroad. 989.38 Section... PROTECTION ENVIRONMENTAL IMPACT ANALYSIS PROCESS (EIAP) § 989.38 Requirements for analysis abroad. (a) The EPF will generally perform the same functions for analysis of actions abroad that it performs in the...

  12. 32 CFR 989.38 - Requirements for analysis abroad.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 6 2011-07-01 2011-07-01 false Requirements for analysis abroad. 989.38 Section... PROTECTION ENVIRONMENTAL IMPACT ANALYSIS PROCESS (EIAP) § 989.38 Requirements for analysis abroad. (a) The EPF will generally perform the same functions for analysis of actions abroad that it performs in the...

  13. Spotlight-8 Image Analysis Software

    NASA Technical Reports Server (NTRS)

    Klimek, Robert; Wright, Ted

    2006-01-01

    Spotlight is a cross-platform GUI-based software package designed to perform image analysis on sequences of images generated by combustion and fluid physics experiments run in a microgravity environment. Spotlight can perform analysis on a single image in an interactive mode or perform analysis on a sequence of images in an automated fashion. Image processing operations can be employed to enhance the image before various statistics and measurement operations are performed. An arbitrarily large number of objects can be analyzed simultaneously with independent areas of interest. Spotlight saves results in a text file that can be imported into other programs for graphing or further analysis. Spotlight can be run on Microsoft Windows, Linux, and Apple OS X platforms.

  14. Constant False Alarm Rate (CFAR) Autotrend Evaluation Report

    DTIC Science & Technology

    2011-12-01

    represent a level of uncertainty in the performance analysis. The performance analysis produced the following Key Performance Indicators (KPIs) as... (Abbreviations from the report: MooN, M-out-of-N; MSPU, Modernized Signal Processor Unit; NFF, No Fault Found; PAT, Parameter Allocation Table.)

  15. Reconsidering Human Performance Technology

    ERIC Educational Resources Information Center

    Jang, Hwan Young

    2008-01-01

    This article discusses three perceived challenges in the field of human performance technology: a missing link from training to performance, limitations in gap analysis and cause analysis, and a lack of attention to business and organization performance. It then provides possible alternatives for each issue, such as instructional system…

  16. Model Performance Evaluation and Scenario Analysis (MPESA)

    EPA Pesticide Factsheets

    Model Performance Evaluation and Scenario Analysis (MPESA) assesses the performance with which models predict time series data. The tool was developed for the Hydrological Simulation Program-Fortran (HSPF) and the Stormwater Management Model (SWMM).
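Goodness-of-fit metrics of the kind such a tool reports for simulated-versus-observed time series can be sketched directly. The Nash-Sutcliffe efficiency (NSE) and percent-bias definitions below are the standard ones; the observed/simulated series are invented, and the sign convention chosen here (positive percent bias for overprediction) is an assumption, since conventions vary.

```python
# Standard time-series goodness-of-fit metrics: Nash-Sutcliffe efficiency
# (1.0 is a perfect fit) and percent bias. Series values are invented.

def nse(obs, sim):
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def pbias(obs, sim):
    # Positive = model overpredicts on average (assumed sign convention).
    return 100.0 * sum(s - o for o, s in zip(obs, sim)) / sum(obs)

observed = [2.0, 3.5, 5.0, 4.0, 3.0]
simulated = [2.2, 3.3, 4.6, 4.1, 3.1]
print(round(nse(observed, simulated), 3),
      round(pbias(observed, simulated), 2))  # NSE ≈ 0.948, PBIAS ≈ -1.14
```

A modeler would read this pair as a close fit with a slight tendency to underpredict.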

  17. Front-End Analysis Cornerstone of Logistics

    NASA Technical Reports Server (NTRS)

    Nager, Paul J.

    2000-01-01

    The presentation provides an overview of Front-End Logistics Support Analysis (FELSA): when it should be performed, why it should be performed and its benefits, how it is conducted, and examples.

  18. Body sway, aim point fluctuation and performance in rifle shooters: inter- and intra-individual analysis.

    PubMed

    Ball, Kevin A; Best, Russell J; Wrigley, Tim V

    2003-07-01

    In this study, we examined the relationships between body sway, aim point fluctuation and performance in rifle shooting on an inter- and intra-individual basis. Six elite shooters performed 20 shots under competition conditions. For each shot, body sway parameters and four aim point fluctuation parameters were quantified for the time periods 5 s to shot, 3 s to shot and 1 s to shot. Three parameters were used to indicate performance. An AMTI LG6-4 force plate was used to measure body sway parameters, while a SCATT shooting analysis system was used to measure aim point fluctuation and shooting performance. Multiple regression analysis indicated that body sway was related to performance for four shooters. Also, body sway was related to aim point fluctuation for all shooters. These relationships were specific to the individual, with the strength of association, parameters of importance and time period of importance different for different shooters. Correlation analysis of significant regressions indicated that, as body sway increased, performance decreased and aim point fluctuation increased for most relationships. We conclude that body sway and aim point fluctuation are important in elite rifle shooting and performance errors are highly individual-specific at this standard. Individual analysis should be a priority when examining elite sports performance.

  19. Techno-Economic Analysis | Energy Analysis | NREL

    Science.gov Websites

    Technology Cost and Performance Data for Distributed Generation. Our capital cost and performance analysis yields insights to support industry decisions about R&D targets, investment strategies, and…

  20. Performance Evaluation of Counter-Based Dynamic Load Balancing Schemes for Massive Contingency Analysis with Different Computing Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Huang, Zhenyu; Chavarría-Miranda, Daniel

    Contingency analysis is a key function in the Energy Management System (EMS) to assess the impact of various combinations of power system component failures based on state estimation. Contingency analysis is also extensively used in power market operation for feasibility test of market solutions. High performance computing holds the promise of faster analysis of more contingency cases for the purpose of safe and reliable operation of today’s power grids with less operating margin and more intermittent renewable energy sources. This paper evaluates the performance of counter-based dynamic load balancing schemes for massive contingency analysis under different computing environments. Insights from the performance evaluation can be used as guidance for users to select suitable schemes in the application of massive contingency analysis. Case studies, as well as MATLAB simulations, of massive contingency cases using the Western Electricity Coordinating Council power grid model are presented to illustrate the application of high performance computing with counter-based dynamic load balancing schemes.

  1. A New Approach to Aircraft Robust Performance Analysis

    NASA Technical Reports Server (NTRS)

    Gregory, Irene M.; Tierno, Jorge E.

    2004-01-01

    A recently developed algorithm for nonlinear system performance analysis has been applied to an F16 aircraft to begin evaluating the suitability of the method for aerospace problems. The algorithm has the potential to be much more efficient than current methods of performance analysis for aircraft. This paper is the initial step in evaluating this potential.

  2. An Empirical Analysis of the Performance of Vietnamese Higher Education Institutions

    ERIC Educational Resources Information Center

    Tran, Carolyn-Dung T. T.; Villano, Renato A.

    2017-01-01

    This article provides an analysis of the academic performance of higher education institutions (HEIs) in Vietnam with 50 universities and 50 colleges in 2011/12. The two-stage semiparametric data envelopment analysis is used to estimate the efficiency of HEIs and investigate the effects of various factors on their performance. The findings reveal…

  3. Empirical Analysis of Optical Attenuator Performance in Quantum Key Distribution Systems Using a Particle Model

    DTIC Science & Technology

    2012-03-01

    AFIT/GCS/ENG/12-01; distribution is unlimited. …challenging as the complexity of actual implementation specifics is considered. Two components common to most quantum key distribution…

  4. Performance Cycle Analysis of a Two-Spool, Separate-Exhaust Turbofan With Interstage Turbine Burner

    NASA Technical Reports Server (NTRS)

    Liew, K. H.; Urip, E.; Yang, S. L.; Mattingly, J. D.; Marek, C. J.

    2005-01-01

    This paper presents the performance cycle analysis of a dual-spool, separate-exhaust turbofan engine with an Interstage Turbine Burner (ITB) serving as a secondary combustor. The ITB, which is located in the transition duct between the high- and low-pressure turbines, is a relatively new concept for increasing specific thrust and lowering pollutant emissions in modern jet engine propulsion. A detailed performance analysis of this engine has been conducted for steady-state engine performance prediction. A code was written that is capable of predicting engine performance (i.e., thrust and thrust-specific fuel consumption) at varying flight conditions and throttle settings. Two design-point engines were studied to reveal trends in performance at both full and partial throttle operation. A mission analysis is also presented to demonstrate the fuel-saving advantage of adding the ITB.

  5. The value of job analysis, job description and performance.

    PubMed

    Wolfe, M N; Coggins, S

    1997-01-01

    All companies, regardless of size, are faced with the same employment concerns. Efficient personnel management requires the use of three human resource techniques--job analysis, job description and performance appraisal. These techniques and tools are not for large practices only. Small groups can obtain the same benefits by employing these performance control measures. Job analysis allows for the development of a compensation system. Job descriptions summarize the most important duties. Performance appraisals help reward outstanding work.

  6. Multivariate meta-analysis of individual participant data helped externally validate the performance and implementation of a prediction model.

    PubMed

    Snell, Kym I E; Hua, Harry; Debray, Thomas P A; Ensor, Joie; Look, Maxime P; Moons, Karel G M; Riley, Richard D

    2016-01-01

    Our aim was to improve meta-analysis methods for summarizing a prediction model's performance when individual participant data are available from multiple studies for external validation. We suggest multivariate meta-analysis for jointly synthesizing calibration and discrimination performance, while accounting for their correlation. The approach estimates a prediction model's average performance, the heterogeneity in performance across populations, and the probability of "good" performance in new populations. This allows different implementation strategies (e.g., recalibration) to be compared. Application is made to a diagnostic model for deep vein thrombosis (DVT) and a prognostic model for breast cancer mortality. In both examples, multivariate meta-analysis reveals that calibration performance is excellent on average but highly heterogeneous across populations unless the model's intercept (baseline hazard) is recalibrated. For the cancer model, the probability of "good" performance (defined by C statistic ≥0.7 and calibration slope between 0.9 and 1.1) in a new population was 0.67 with recalibration but 0.22 without recalibration. For the DVT model, even with recalibration, there was only a 0.03 probability of "good" performance. Multivariate meta-analysis can be used to externally validate a prediction model's calibration and discrimination performance across multiple populations and to evaluate different implementation strategies. Crown Copyright © 2016. Published by Elsevier Inc. All rights reserved.
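
    The "probability of good performance" criterion above can be illustrated with a toy calculation. The predictive means and SDs below are invented, and for brevity the two performance margins are treated as independent, whereas the paper's multivariate approach models their joint, correlated distribution.

```python
import math

def normal_cdf(x, mu, sd):
    """CDF of a normal distribution via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sd * math.sqrt(2.0))))

# Hypothetical predictive distributions for model performance in a new
# population (SDs include between-population heterogeneity):
c_mu, c_sd = 0.72, 0.03   # C statistic
s_mu, s_sd = 1.00, 0.08   # calibration slope

# "Good" performance: C statistic >= 0.7 and slope between 0.9 and 1.1.
p_c_good = 1.0 - normal_cdf(0.7, c_mu, c_sd)
p_s_good = normal_cdf(1.1, s_mu, s_sd) - normal_cdf(0.9, s_mu, s_sd)

# Independence approximation (the multivariate meta-analysis would use
# the correlated joint distribution instead):
p_good = p_c_good * p_s_good
```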

  7. Network analysis of patient flow in two UK acute care hospitals identifies key sub-networks for A&E performance

    PubMed Central

    Stringer, Clive; Beeknoo, Neeraj

    2017-01-01

    The topology of the patient flow network in a hospital is complex, comprising hundreds of overlapping patient journeys, and is a determinant of operational efficiency. To understand the network architecture of patient flow, we performed a data-driven network analysis of patient flow through two acute hospital sites of King’s College Hospital NHS Foundation Trust. Administration databases were queried for all intra-hospital patient transfers in an 18-month period and modelled as a dynamic weighted directed graph. A ‘core’ subnetwork containing only 13–17% of all edges channelled 83–90% of the patient flow, while an ‘ephemeral’ network constituted the remainder. Unsupervised cluster analysis and differential network analysis identified sub-networks where traffic is most associated with A&E performance. Increased flow to clinical decision units was associated with the best A&E performance in both sites. The component analysis also detected a weekend effect on patient transfers which was not associated with performance. We have performed the first data-driven hypothesis-free analysis of patient flow which can enhance understanding of whole healthcare systems. Such analysis can drive transformation in healthcare as it has in industries such as manufacturing. PMID:28968472
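
    The core/ephemeral decomposition described above can be sketched as a simple greedy split: rank transfer edges by patient volume and keep the smallest edge set that carries a target share of total flow. The ward names and transfer counts below are hypothetical.

```python
# Hypothetical weighted directed edges: (from_ward, to_ward) -> transfers
transfers = {
    ("A&E", "CDU"): 900, ("A&E", "AMU"): 700, ("AMU", "Ward1"): 500,
    ("CDU", "Ward2"): 300, ("Ward1", "Ward2"): 40, ("Ward2", "Rehab"): 30,
    ("Ward1", "ICU"): 20, ("ICU", "Ward1"): 10,
}

def core_edges(edges, share=0.85):
    """Smallest set of highest-volume edges carrying >= `share` of flow."""
    total = sum(edges.values())
    core, carried = [], 0
    for edge, w in sorted(edges.items(), key=lambda kv: -kv[1]):
        if carried >= share * total:
            break
        core.append(edge)
        carried += w
    return core, carried / total

core, frac = core_edges(transfers)
# A small fraction of edges carries most of the patient flow, mirroring
# the paper's finding that 13-17% of edges channelled 83-90% of flow.
```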

  8. Plausibility assessment of a 2-state self-paced mental task-based BCI using the no-control performance analysis.

    PubMed

    Faradji, Farhad; Ward, Rabab K; Birch, Gary E

    2009-06-15

    The feasibility of having a self-paced brain-computer interface (BCI) based on mental tasks is investigated. The EEG signals of four subjects performing five mental tasks each are used in the design of a 2-state self-paced BCI. The output of the BCI should only be activated when the subject performs a specific mental task and should remain inactive otherwise. For each subject and each task, the feature coefficient and the classifier that yield the best performance are selected, using the autoregressive coefficients as the features. The classifier with a zero false positive rate and the highest true positive rate is selected as the best classifier. The classifiers tested include: linear discriminant analysis, quadratic discriminant analysis, Mahalanobis discriminant analysis, support vector machine, and radial basis function neural network. The results show that: (1) some classifiers obtained the desired zero false positive rate; (2) the linear discriminant analysis classifier does not yield acceptable performance; (3) the quadratic discriminant analysis classifier outperforms the Mahalanobis discriminant analysis classifier and performs almost as well as the radial basis function neural network; and (4) the support vector machine classifier has the highest true positive rates but unfortunately has nonzero false positive rates in most cases.
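
    The selection rule described above (zero false-positive rate, then maximum true-positive rate) is easy to state in code. The rates below are hypothetical placeholders, not the study's results.

```python
# Hypothetical candidate classifiers with their measured rates:
candidates = {
    "LDA": {"tpr": 0.55, "fpr": 0.04},
    "QDA": {"tpr": 0.48, "fpr": 0.00},
    "RBF": {"tpr": 0.50, "fpr": 0.00},
    "SVM": {"tpr": 0.70, "fpr": 0.02},  # best TPR but nonzero FPR
}

def select(cands):
    """Keep zero-FPR classifiers; among them, pick the highest TPR."""
    zero_fp = {k: v for k, v in cands.items() if v["fpr"] == 0.0}
    if not zero_fp:
        return None
    return max(zero_fp, key=lambda k: zero_fp[k]["tpr"])

best = select(candidates)  # "RBF": zero FPR and the better TPR among those
```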

  9. An Analysis of Effects of Variable Factors on Weapon Performance

    DTIC Science & Technology

    1993-03-01

    Alternative analysis; categorical data analysis. Statistical methodology for categorical data analysis traces its roots to the work of Francis Galton in the… choice of statistical tests. This thesis examines an analysis performed by the Surface Warfare Development Group (SWDG). The SWDG analysis is shown to be incorrect due to the misapplication of testing methods. A corrected analysis is presented and recommendations suggested for changes to the testing…

  10. Receiver design, performance analysis, and evaluation for space-borne laser altimeters and space-to-space laser ranging systems

    NASA Technical Reports Server (NTRS)

    Davidson, Frederic M.; Sun, Xiaoli; Field, Christopher T.

    1994-01-01

    Accomplishments in the following areas of research are presented: receiver performance study of spaceborne laser altimeters and cloud and aerosol lidars; receiver performance analysis for space-to-space laser ranging systems; and receiver performance study for the Mars Environmental Survey (MESUR).

  11. What Do HPT Consultants Do for Performance Analysis?

    ERIC Educational Resources Information Center

    Kang, Sung

    2017-01-01

    This study was conducted to contribute to the field of Human Performance Technology (HPT) through the validation of the performance analysis process of the International Society for Performance Improvement (ISPI) HPT model, the most representative and frequently utilized process model in the HPT field. The study was conducted using content…

  12. An urban energy performance evaluation system and its computer implementation.

    PubMed

    Wang, Lei; Yuan, Guan; Long, Ruyin; Chen, Hong

    2017-12-15

    To improve the urban environment and effectively reflect and promote urban energy performance, an urban energy performance evaluation system was constructed, thereby strengthening urban environmental management capabilities. From the perspectives of internalization and externalization, a framework of evaluation indicators and key factors that determine urban energy performance and explain differences in performance was proposed according to established theory and previous studies. Using the improved stochastic frontier analysis method, an urban energy performance evaluation and factor analysis model was built that brings performance evaluation and factor analysis into the same stage of study. Using data for the Chinese provincial capitals from 2004 to 2013, the coefficients of the evaluation indicators and key factors were calculated by this model, and the coefficients were then used to compile the program file. The urban energy performance evaluation system developed in this study was designed in three parts: a database, a distributed component server, and a human-machine interface. Its functions include login, addition, editing, input, calculation, analysis, comparison, inquiry, and export. On the basis of these contents, the system was implemented using Microsoft Visual Studio .NET 2015 and can effectively reflect the status of and changes in urban energy performance. Beijing was considered as an example in an empirical study, which further verified the applicability and convenience of this evaluation system. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Space tug economic analysis study. Volume 2: Tug concepts analysis. Appendix: Tug design and performance data base

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The tug design and performance data base for the economic analysis of space tug operation are presented. A compendium of the detailed design and performance information from the data base is developed. The design data are parametric across a range of reusable space tug sizes. The performance curves are generated for selected point designs of expendable orbit injection stages and reusable tugs. Data are presented in the form of graphs for various modes of operation.

  14. Space Shuttle Main Engine structural analysis and data reduction/evaluation. Volume 5: Main Injector LOX Inlet analysis

    NASA Technical Reports Server (NTRS)

    Violett, Rebeca S.

    1989-01-01

    The analysis performed on the Main Injector LOX Inlet Assembly located on the Space Shuttle Main Engine is summarized. An ANSYS finite element model of the inlet assembly was built and executed. Static stress analysis was also performed.

  15. Integrated Modeling Activities for the James Webb Space Telescope (JWST): Structural-Thermal-Optical Analysis

    NASA Technical Reports Server (NTRS)

    Johnston, John D.; Parrish, Keith; Howard, Joseph M.; Mosier, Gary E.; McGinnis, Mark; Bluth, Marcel; Kim, Kevin; Ha, Hong Q.

    2004-01-01

    This is a continuation of a series of papers on modeling activities for JWST. The structural-thermal-optical, often referred to as "STOP", analysis process is used to predict the effect of thermal distortion on optical performance. The benchmark STOP analysis for JWST assesses the effect of an observatory slew on wavefront error. The paper begins with an overview of multi-disciplinary engineering analysis, or integrated modeling, which is a critical element of the JWST mission. The STOP analysis process is then described. This process consists of the following steps: thermal analysis, structural analysis, and optical analysis. Temperatures predicted using geometric and thermal math models are mapped to the structural finite element model in order to predict thermally-induced deformations. Motions and deformations at optical surfaces are input to optical models, and optical performance is predicted using either an optical ray trace or WFE estimation techniques based on prior ray traces or first-order optics. Following the discussion of the analysis process, results are presented based on models representing the design at the time of the System Requirements Review. In addition to baseline performance predictions, sensitivity studies are performed to assess modeling uncertainties. Of particular interest is the sensitivity of optical performance to uncertainties in temperature predictions and variations in metal properties. The paper concludes with a discussion of modeling uncertainty as it pertains to STOP analysis.

  16. Uncovering the requirements of cognitive work.

    PubMed

    Roth, Emilie M

    2008-06-01

    In this article, the author provides an overview of cognitive analysis methods and how they can be used to inform system analysis and design. Human factors has seen a shift toward modeling and support of cognitively intensive work (e.g., military command and control, medical planning and decision making, supervisory control of automated systems). Cognitive task analysis and cognitive work analysis methods extend traditional task analysis techniques to uncover the knowledge and thought processes that underlie performance in cognitively complex settings. The author reviews the multidisciplinary roots of cognitive analysis and the variety of cognitive task analysis and cognitive work analysis methods that have emerged. Cognitive analysis methods have been used successfully to guide system design, as well as development of function allocation, team structure, and training, so as to enhance performance and reduce the potential for error. A comprehensive characterization of cognitive work requires two mutually informing analyses: (a) examination of domain characteristics and constraints that define cognitive requirements and challenges and (b) examination of practitioner knowledge and strategies that underlie both expert and error-vulnerable performance. A variety of specific methods can be adapted to achieve these aims within the pragmatic constraints of particular projects. Cognitive analysis methods can be used effectively to anticipate cognitive performance problems and specify ways to improve individual and team cognitive performance (be it through new forms of training, user interfaces, or decision aids).

  17. Analysis relating to pavement material characterizations and their effects on pavement performance.

    DOT National Transportation Integrated Search

    1998-01-01

    This report presents the analysis conducted on relating pavement performance or response measures and design considerations to specific pavement layers utilizing data contained in the Long-Term Pavement Performance Program National Information Manage...

  18. Stability, performance and sensitivity analysis of I.I.D. jump linear systems

    NASA Astrophysics Data System (ADS)

    Chávez Fuentes, Jorge R.; González, Oscar R.; Gray, W. Steven

    2018-06-01

    This paper presents a symmetric Kronecker product analysis of independent and identically distributed (i.i.d.) jump linear systems, developing new, lower-dimensional equations for the stability and performance analysis of such systems than were previously available. In addition, new closed-form expressions characterising multi-parameter relative sensitivity functions for performance metrics are introduced. The analysis technique is illustrated with a distributed fault-tolerant flight control example in which the communication links are allowed to fail randomly.
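
    A well-known test conveys the flavor of this analysis: an i.i.d. jump linear system x(k+1) = A(s(k)) x(k) is mean-square stable iff the spectral radius of E[A ⊗ A] is below 1. The sketch below uses the plain (not symmetric) Kronecker product, so it is higher-dimensional than the paper's formulation, and the mode matrices and probabilities are hypothetical.

```python
def kron(A, B):
    """Kronecker product of two square matrices (lists of lists)."""
    n, m = len(A), len(B)
    return [[A[i // m][j // m] * B[i % m][j % m]
             for j in range(n * m)] for i in range(n * m)]

def spectral_radius(M, iters=500):
    """Power iteration; adequate here since M is entrywise nonnegative."""
    n = len(M)
    v, lam = [1.0] * n, 0.0
    for _ in range(iters):
        w = [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(abs(x) for x in w)
        if lam == 0.0:
            return 0.0
        v = [x / lam for x in w]
    return lam

# Hypothetical modes: a nominal mode and an unstable failure mode.
A1 = [[0.5, 0.3], [0.0, 0.6]]   # nominal, probability 0.9
A2 = [[1.2, 0.7], [0.0, 1.1]]   # failure, probability 0.1
K1, K2 = kron(A1, A1), kron(A2, A2)
M = [[0.9 * K1[i][j] + 0.1 * K2[i][j] for j in range(4)] for i in range(4)]

rho = spectral_radius(M)
# rho < 1: mean-square stable despite the occasionally active unstable mode.
```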

  19. Elastic-plastic mixed-iterative finite element analysis: Implementation and performance assessment

    NASA Technical Reports Server (NTRS)

    Sutjahjo, Edhi; Chamis, Christos C.

    1993-01-01

    An elastic-plastic algorithm based on the von Mises and associative flow criteria is implemented in MHOST, a mixed-iterative finite element analysis computer program developed by NASA Lewis Research Center. The performance of the resulting elastic-plastic mixed-iterative analysis is examined through a set of convergence studies. Membrane and bending behaviors of 4-node quadrilateral shell finite elements are tested for elastic-plastic performance. Generally, the membrane results are excellent, indicating the implementation of elastic-plastic mixed-iterative analysis is appropriate.

  20. Low-Latency Embedded Vision Processor (LLEVS)

    DTIC Science & Technology

    2016-03-01

    Task 3: Projected Performance Analysis of FPGA-Based Vision Processor; Algorithms Latency Analysis; Field Programmable Gate Array Custom Hardware for Real-Time Multiresolution Analysis. …conduct data analysis for performance projections. The data acquired through measurements, simulation and estimation provide the requisite platform for…

  1. Skylab M518 multipurpose furnace convection analysis

    NASA Technical Reports Server (NTRS)

    Bourgeois, S. V.; Spradley, L. W.

    1975-01-01

    An analysis was performed of the convection which existed in ground tests and during Skylab processing of two experiments: vapor growth of IV-VI compounds and growth of spherical crystals. A parallel analysis was also performed on the Skylab indium antimonide crystals experiment because indium antimonide (InSb) was used and a free surface existed in the tellurium-doped Skylab III sample. In addition, brief analyses were performed of the microsegregation in germanium experiment because the Skylab crystals indicated turbulent convection effects. Simple dimensional-analysis calculations and a more accurate, but complex, convection computer model were used in the analysis.

  2. Determining team cognition from delay analysis using cross recurrence plot.

    PubMed

    Hajari, Nasim; Cheng, Irene; Bin Zheng; Basu, Anup

    2016-08-01

    Team cognition is an important factor in evaluating and determining team performance. Forming a team with good shared cognition is even more crucial for laparoscopic surgery applications. In this study, we analyzed the eye tracking data of two surgeons during a simulated laparoscopic operation, then performed Cross Recurrence Analysis (CRA) on the recorded data to study the delay behaviour of good-performer and poor-performer teams. Dual eye tracking data for twenty-two dyad teams were recorded during a laparoscopic task, and the teams were divided into good performers and poor performers based on task times. We then studied the delay between the two members of each team. The results indicated that good-performer teams show a smaller delay compared with poor-performer teams. This finding is consistent with gaze-overlap analysis between team members and is therefore good evidence of shared cognition between team members.
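
    Delay estimation from a cross recurrence plot can be sketched as follows: recurrences are counted along each diagonal (lag) of the plot, and the lag with the most recurrences estimates how far one member's gaze trails the other's. The 1-D gaze traces below are synthetic, constructed so the follower lags the leader by two samples.

```python
# Synthetic gaze traces: follower repeats the leader two samples later.
leader = [0.0, 0.2, 0.5, 0.9, 1.0, 0.8, 0.4, 0.1, 0.0, 0.2]
follower = [0.0, 0.0] + leader[:-2]

def crp_delay(x, y, radius=0.05, max_lag=4):
    """Lag maximizing the diagonal recurrence count |x[i] - y[i+lag]| <= r."""
    n = len(x)
    best_lag, best_count = 0, -1
    for lag in range(-max_lag, max_lag + 1):
        count = sum(1 for i in range(n)
                    if 0 <= i + lag < n and abs(x[i] - y[i + lag]) <= radius)
        if count > best_count:
            best_lag, best_count = lag, count
    return best_lag

delay = crp_delay(leader, follower)  # 2: follower trails by two samples
```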

  3. Ongoing Analysis of Rocket Based Combined Cycle Engines by the Applied Fluid Dynamics Analysis Group at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Ruf, Joseph; Holt, James B.; Canabal, Francisco

    1999-01-01

    This paper presents the status of analyses on three Rocket Based Combined Cycle configurations underway in the Applied Fluid Dynamics Analysis Group (TD64). TD64 is performing computational fluid dynamics analysis on a Penn State RBCC test rig, the proposed Draco axisymmetric RBCC engine and the Trailblazer engine. The intent of the analysis on the Penn State test rig is to benchmark the Finite Difference Navier Stokes code for ejector mode fluid dynamics. The Draco engine analysis is a trade study to determine the ejector mode performance as a function of three engine design variables. The Trailblazer analysis is to evaluate the nozzle performance in scramjet mode. Results to date of each analysis are presented.

  4. Non-Simultaneous Relative Importance-Performance Analysis: Meta-Results from 80 College Choice Surveys with 55,276 Respondents.

    ERIC Educational Resources Information Center

    Chapman, Randall G.

    1993-01-01

    A study investigated the utility of importance-performance analysis, a marketing tool for assessing marketing position and performance, in learning how college applicants perceive their chosen college in comparison with others. Findings reflect the complexity of student decisions and suggest the "average" college performs above average…

  5. Rotorcraft performance data for AEDT : Methods of using the NASA Design and Analysis of Rotorcraft tool for developing data for AEDT's Rotorcraft Performance Model

    DOT National Transportation Integrated Search

    2016-09-01

    This report documents use of the NASA Design and Analysis of Rotorcraft (NDARC) helicopter performance software tool in developing data for the FAA's Aviation Environmental Design Tool (AEDT). These data support the Rotorcraft Performance Model (RP...

  6. Performance Management in the French System of Secondary-Teacher Training

    ERIC Educational Resources Information Center

    Tchibozo, Guy

    2005-01-01

    The present study focuses on performance analysis and performance management in teacher training in France. After a brief summary of the French system of secondary-teacher training, determinants affecting performance are analyzed. The analysis shows that three determinants--the number of external competitors, the size of a department and the…

  7. Digital microarray analysis for digital artifact genomics

    NASA Astrophysics Data System (ADS)

    Jaenisch, Holger; Handley, James; Williams, Deborah

    2013-06-01

    We implement a Spatial Voting (SV) based analogy of microarray analysis for digital gene marker identification in malware code sections. We examine a famous set of malware formally analyzed by Mandiant and code named Advanced Persistent Threat 1 (APT1). APT1 is a Chinese organization formed with the specific intent to infiltrate and exploit US resources. Mandiant provided a detailed behavior and string analysis report for the 288 malware samples available. We performed an independent analysis using a new alternative to traditional dynamic analysis and static analysis that we call Spatial Analysis (SA). We perform unsupervised SA on the APT1-originating malware code sections and report our findings. We also show the results of SA performed on some members of the families associated by Mandiant. We conclude that SV-based SA is a practical, fast alternative to dynamic analysis and static analysis.

  8. Temporal geospatial analysis of secondary school students’ examination performance

    NASA Astrophysics Data System (ADS)

    Nik Abd Kadir, ND; Adnan, NA

    2016-06-01

    Malaysia's Ministry of Education has improved the organization of its data to create a geographical information system (GIS) school database. However, no further analysis has been done using geospatial analysis tools. Mapping has emerged as a communication tool and is an effective way to publish digital and statistical data such as school performance results. The objective of this study is to analyse secondary school students' science and mathematics scores in the Sijil Pelajaran Malaysia examination from 2010 to 2014 for Kelantan's state schools with the aid of GIS software and geospatial analysis. School performance according to grade point average (GPA), from Grade A to Grade G, was interpolated and mapped, and query analysis using geospatial tools was performed. This study will be beneficial to the education sector for analysing student performance not only in Kelantan but throughout Malaysia, and mapping will be a good method of publication towards better planning and decision making to prepare young Malaysians for the challenges of the education system.

  9. Managing in-hospital quality improvement: An importance-performance analysis to set priorities for ST-elevation myocardial infarction care.

    PubMed

    Aeyels, Daan; Seys, Deborah; Sinnaeve, Peter R; Claeys, Marc J; Gevaert, Sofie; Schoors, Danny; Sermeus, Walter; Panella, Massimiliano; Bruyneel, Luk; Vanhaecht, Kris

    2018-02-01

    A focus on specific priorities increases the success rate of quality improvement efforts for broad and complex-care processes. Importance-performance analysis presents a possible approach to set priorities around which to design and implement effective quality improvement initiatives. Persistent variation in hospital performance makes ST-elevation myocardial infarction care relevant to consider for importance-performance analysis. The purpose of this study was to identify quality improvement priorities in ST-elevation myocardial infarction care. Importance and performance levels of ST-elevation myocardial infarction key interventions were combined in an importance-performance analysis. Content validity indexes on 23 ST-elevation myocardial infarction key interventions of a multidisciplinary RAND Delphi Survey defined importance levels. Structured review of 300 patient records in 15 acute hospitals determined performance levels. The significance of between-hospital variation was determined by a Kruskal-Wallis test. A performance heat-map allowed for hospital-specific priority setting. Seven key interventions were each rated as an overall improvement priority. Priority key interventions related to risk assessment, timely reperfusion by percutaneous coronary intervention and secondary prevention. Between-hospital performance varied significantly for the majority of key interventions. The type and number of priorities varied strongly across hospitals. Guideline adherence in ST-elevation myocardial infarction care is low and improvement priorities vary between hospitals. Importance-performance analysis helps clinicians and management in demarcation of the nature, number and order of improvement priorities. By offering a tailored improvement focus, this methodology makes improvement efforts more specific and achievable.
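
    The quadrant logic of importance-performance analysis can be sketched directly: each key intervention is placed in a quadrant by comparing its importance and performance scores against the respective means, and the high-importance/low-performance quadrant yields the improvement priorities. Intervention names and scores below are hypothetical, not the study's data.

```python
# Hypothetical scores: name -> (importance 0-1, performance 0-1)
interventions = {
    "risk assessment": (0.95, 0.40),
    "timely PCI reperfusion": (0.98, 0.55),
    "secondary prevention advice": (0.90, 0.50),
    "routine oxygen": (0.40, 0.85),
    "discharge letter sent": (0.55, 0.90),
}

def ipa_quadrants(items):
    """Classify each item into a classic IPA quadrant by mean splits."""
    mean_imp = sum(i for i, _ in items.values()) / len(items)
    mean_perf = sum(p for _, p in items.values()) / len(items)
    quadrants = {}
    for name, (imp, perf) in items.items():
        if imp >= mean_imp and perf < mean_perf:
            quadrants[name] = "concentrate here"   # improvement priority
        elif imp >= mean_imp:
            quadrants[name] = "keep up the good work"
        elif perf >= mean_perf:
            quadrants[name] = "possible overkill"
        else:
            quadrants[name] = "low priority"
    return quadrants

priorities = [n for n, q in ipa_quadrants(interventions).items()
              if q == "concentrate here"]
```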

  10. The Effectiveness of Self-Regulated Learning Scaffolds on Academic Performance in Computer-Based Learning Environments: A Meta-Analysis

    ERIC Educational Resources Information Center

    Zheng, Lanqin

    2016-01-01

    This meta-analysis examined research on the effects of self-regulated learning scaffolds on academic performance in computer-based learning environments from 2004 to 2015. A total of 29 articles met inclusion criteria and were included in the final analysis with a total sample size of 2,648 students. Moderator analyses were performed using a…

  11. Performance analysis of static locking in replicated distributed database systems

    NASA Technical Reports Server (NTRS)

    Kuang, Yinghong; Mukkamala, Ravi

    1991-01-01

    Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects, because they are difficult to evaluate through analysis and time-consuming to evaluate through simulation. A technique that combines simulation and analysis is used to closely illustrate the impact of deadlocks and to evaluate the performance of replicated distributed databases with both shared and exclusive locks.

  12. Navigation Design and Analysis for the Orion Cislunar Exploration Missions

    NASA Technical Reports Server (NTRS)

    D'Souza, Christopher; Holt, Greg; Gay, Robert; Zanetti, Renato

    2014-01-01

    This paper details the design and analysis of the cislunar optical navigation system being proposed for the Orion Earth-Moon (EM) missions. In particular, it presents the mathematics of the navigation filter. It also presents the sensitivity analysis that has been performed to understand the performance of the proposed system, with particular attention paid to entry flight path angle constraints and delta-V performance.

  13. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments.

    PubMed

    Bass, Ellen J; Baumgart, Leigh A; Shepley, Kathryn Klein

    2013-03-01

Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noisy sensor data are used by both the human and the information analysis automation to make judgments. In a simplified air traffic conflict prediction experiment, 32 participants made probability of horizontal conflict judgments under different display content conditions. After being exposed to the information analysis automation, judgment achievement significantly improved for all participants as compared to judgments without any of the automation's information. Participants provided with additional display content pertaining to cue variability in the task environment had significantly higher aided judgment achievement compared to those provided with only the automation's judgment of a probability of conflict. When designing information analysis automation for environments where the automation's judgment achievement is impacted by noisy environmental data, it may be beneficial to show additional task environment information to the human judge in order to improve judgment performance.

  14. Multidisciplinary design optimization using multiobjective formulation techniques

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Pagaldipti, Narayanan S.

    1995-01-01

This report addresses the development of a multidisciplinary optimization procedure using an efficient semi-analytical sensitivity analysis technique and multilevel decomposition for the design of aerospace vehicles. A semi-analytical sensitivity analysis procedure is developed for calculating computational grid sensitivities and aerodynamic design sensitivities. Accuracy and efficiency of the sensitivity analysis procedure are established through comparison of the results with those obtained using a finite difference technique. The developed sensitivity analysis technique is then used within a multidisciplinary optimization procedure for designing aerospace vehicles. The optimization problem, with the integration of aerodynamics and structures, is decomposed into two levels. Optimization is performed for improved aerodynamic performance at the first level and improved structural performance at the second level. Aerodynamic analysis is performed by solving the three-dimensional parabolized Navier-Stokes equations. A nonlinear programming technique and an approximate analysis procedure are used for optimization. The procedure developed is applied to design the wing of a high speed aircraft. Results obtained show significant improvements in the aircraft aerodynamic and structural performance when compared to a reference or baseline configuration. The use of the semi-analytical sensitivity technique provides significant computational savings.

  15. Meta-analysis of the efficacy of psychological and educational interventions to improve academic performance of students with learning disabilities in Iran

    PubMed Central

    Faramarzi, Salar; Shamsi, Abdolhossein; Samadi, Maryam; Ahmadzade, Maryam

    2015-01-01

Introduction: Given the importance of learning disabilities and the need for interventions that remediate these disorders and prevent future problems, this study applied meta-analysis to research on the impact of psychological and educational interventions on the academic performance of students with learning disabilities. Methods: Using the meta-analysis method to integrate the results of various studies, the effect of psychological and educational interventions was quantified. To this end, 57 studies whose methodology was judged acceptable were selected and meta-analyzed. The research instrument was a meta-analysis checklist. Results: The effect sizes for the effectiveness of psychological-educational interventions on improving the academic performance of students with mathematics disorder, writing disorder and dyslexia were 0.57, 0.50 and 0.55, respectively. Conclusions: The meta-analysis showed that, according to Cohen's benchmarks, these effect sizes are above average, and it can be concluded that educational and psychological interventions improve the academic performance of students with learning disabilities. PMID:26430685

  16. Meta-analysis of the efficacy of psychological and educational interventions to improve academic performance of students with learning disabilities in Iran.

    PubMed

    Faramarzi, Salar; Shamsi, Abdolhossein; Samadi, Maryam; Ahmadzade, Maryam

    2015-01-01

Given the importance of learning disabilities and the need for interventions that remediate these disorders and prevent future problems, this study applied meta-analysis to research on the impact of psychological and educational interventions on the academic performance of students with learning disabilities. Using the meta-analysis method to integrate the results of various studies, the effect of psychological and educational interventions was quantified. To this end, 57 studies whose methodology was judged acceptable were selected and meta-analyzed. The research instrument was a meta-analysis checklist. The effect sizes for the effectiveness of psychological-educational interventions on improving the academic performance of students with mathematics disorder, writing disorder and dyslexia were 0.57, 0.50 and 0.55, respectively. The meta-analysis showed that, according to Cohen's benchmarks, these effect sizes are above average, and it can be concluded that educational and psychological interventions improve the academic performance of students with learning disabilities.
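
For context, Cohen's conventional benchmarks classify standardized effect sizes of roughly 0.2, 0.5 and 0.8 as small, medium and large. A sketch of how the reported effect sizes fall against those thresholds (the helper function is illustrative, not from the study):

```python
def cohen_label(d: float) -> str:
    """Classify an effect size by Cohen's conventional benchmarks."""
    d = abs(d)
    if d < 0.2:
        return "negligible"
    if d < 0.5:
        return "small"
    if d < 0.8:
        return "medium"
    return "large"

# Effect sizes reported in the meta-analysis
effects = {"mathematics disorder": 0.57, "writing disorder": 0.50, "dyslexia": 0.55}
labels = {k: cohen_label(v) for k, v in effects.items()}
# All three fall in the "medium" band, consistent with the paper's
# characterization of the effects as above average.
```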

  17. Evaluation of Counter-Based Dynamic Load Balancing Schemes for Massive Contingency Analysis on Over 10,000 Cores

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Huang, Zhenyu; Rice, Mark J.

Contingency analysis studies are necessary to assess the impact of possible power system component failures. The results of contingency analysis are used to ensure grid reliability and, in power market operation, to test the feasibility of market solutions. Currently, these studies are performed in real time based on the current operating conditions of the grid with a pre-selected contingency list, which might result in overlooking some critical contingencies caused by variable system status. To have a complete picture of a power grid, more contingencies need to be studied to improve grid reliability. High-performance computing techniques hold the promise of being able to perform the analysis for more contingency cases within a much shorter time frame. This paper evaluates the performance of counter-based dynamic load balancing schemes for a massive contingency analysis program on 10,000+ cores. One million N-2 contingency analysis cases with a Western Electricity Coordinating Council power grid model have been used to demonstrate the performance. Speedups of 3,964 with 4,096 cores and 7,877 with 10,240 cores were obtained. This paper reports the performance of the load balancing scheme with a single counter and with two counters, describes disk I/O issues, and discusses other potential techniques for further improving performance.
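
The reported speedups imply the following parallel efficiencies; this is a back-of-the-envelope check on the abstract's figures, not a computation from the paper:

```python
def parallel_efficiency(speedup: float, cores: int) -> float:
    """Parallel efficiency = speedup / core count (1.0 would be ideal linear scaling)."""
    return speedup / cores

# Figures reported in the abstract
e_4096 = parallel_efficiency(3964, 4096)    # ~0.97: near-ideal scaling
e_10240 = parallel_efficiency(7877, 10240)  # ~0.77: efficiency drops at higher core counts
```

The drop in efficiency from roughly 97% to 77% as the core count grows is consistent with the disk I/O bottlenecks the paper discusses.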

  18. Validity analysis on merged and averaged data using within and between analysis: focus on effect of qualitative social capital on self-rated health.

    PubMed

    Shin, Sang Soo; Shin, Young-Jeon

    2016-01-01

    With an increasing number of studies highlighting regional social capital (SC) as a determinant of health, many studies are using multi-level analysis with merged and averaged scores of community residents' survey responses calculated from community SC data. Sufficient examination is required to validate if the merged and averaged data can represent the community. Therefore, this study analyzes the validity of the selected indicators and their applicability in multi-level analysis. Within and between analysis (WABA) was performed after creating community variables using merged and averaged data of community residents' responses from the 2013 Community Health Survey in Korea, using subjective self-rated health assessment as a dependent variable. Further analysis was performed following the model suggested by WABA result. Both E-test results (1) and WABA results (2) revealed that single-level analysis needs to be performed using qualitative SC variable with cluster mean centering. Through single-level multivariate regression analysis, qualitative SC with cluster mean centering showed positive effect on self-rated health (0.054, p<0.001), although there was no substantial difference in comparison to analysis using SC variables without cluster mean centering or multi-level analysis. As modification in qualitative SC was larger within the community than between communities, we validate that relational analysis of individual self-rated health can be performed within the group, using cluster mean centering. Other tests besides the WABA can be performed in the future to confirm the validity of using community variables and their applicability in multi-level analysis.
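
Cluster (group) mean centering, as applied in the study, subtracts each community's mean from its members' scores so that only within-group variation remains. A minimal sketch with made-up data:

```python
from collections import defaultdict

def cluster_mean_center(scores, groups):
    """Subtract each group's mean from its members' scores (within-group centering)."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for s, g in zip(scores, groups):
        sums[g] += s
        counts[g] += 1
    means = {g: sums[g] / counts[g] for g in sums}
    return [s - means[g] for s, g in zip(scores, groups)]

# Hypothetical social capital scores for residents of two communities A and B
centered = cluster_mean_center([3.0, 5.0, 2.0, 4.0], ["A", "A", "B", "B"])
# → [-1.0, 1.0, -1.0, 1.0]: between-community differences are removed,
#   leaving only each resident's deviation from their own community mean
```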

  19. EEG source analysis of data from paralysed subjects

    NASA Astrophysics Data System (ADS)

    Carabali, Carmen A.; Willoughby, John O.; Fitzgibbon, Sean P.; Grummett, Tyler; Lewis, Trent; DeLosAngeles, Dylan; Pope, Kenneth J.

    2015-12-01

One of the limitations of electroencephalography (EEG) data is its quality, as it is usually contaminated with electrical signals from muscle. This research studies the results of two EEG source analysis methods applied to scalp recordings taken during paralysis and under normal conditions while a cognitive task was performed. The aim is to determine which types of analysis are appropriate for dealing with EEG data containing myogenic components. The data are scalp recordings of six subjects under normal conditions and during paralysis while performing different cognitive tasks, including the oddball task, which is the focus of this research. The data were pre-processed by filtering and artefact correction; then one-second epochs for targets and distractors were extracted. Distributed source analysis was performed in BESA Research 6.0; using its results and information from the literature, nine ideal locations for source dipoles were identified. The nine dipoles were used to perform discrete source analysis, fitting them to the averaged epochs to obtain source waveforms. The results were statistically analysed by comparing outcomes before and after the subjects were paralysed. Finally, frequency analysis was performed to better explain the results. The findings were that distributed source analysis can produce confounded results for EEG contaminated with myogenic signals; conversely, statistical analysis of the discrete source analysis results showed that this method can help in dealing with EEG data contaminated with muscle electrical signal.

  20. Performer-centric Interface Design.

    ERIC Educational Resources Information Center

    McGraw, Karen L.

    1995-01-01

    Describes performer-centric interface design and explains a model-based approach for conducting performer-centric analysis and design. Highlights include design methodology, including cognitive task analysis; creating task scenarios; creating the presentation model; creating storyboards; proof of concept screens; object models and icons;…

  1. Correlation between physics A-levels/A-levels and degree performance

    NASA Astrophysics Data System (ADS)

    Chadwick, Roy

    1985-09-01

The author presents an analysis of 178 students who left Solihull Sixth Form College between 1975 and 1981 to take a degree in physics (approximately one third) or engineering (approximately two thirds) at a university or polytechnic. The first table analyses physics A-level grade against degree performance; the second analyses the points total for physics A-level plus maths A-level (five for A, four for B, etc.) against degree performance; and the final table analyses the points total for physics A-level plus maths A-level plus third A-level (again five for A, four for B, etc.) against degree performance.
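
The points scheme described (five for A, four for B, and so on) can be sketched as:

```python
# Grade-to-points mapping described in the abstract: A=5, B=4, C=3, D=2, E=1
POINTS = {"A": 5, "B": 4, "C": 3, "D": 2, "E": 1}

def points_total(grades):
    """Total points for a set of A-level grades, e.g. physics + maths (+ third subject)."""
    return sum(POINTS[g] for g in grades)

# Physics A plus maths B gives 9 points; adding a third subject at grade C gives 12
```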

  2. A performance analysis method for distributed real-time robotic systems: A case study of remote teleoperation

    NASA Technical Reports Server (NTRS)

    Lefebvre, D. R.; Sanderson, A. C.

    1994-01-01

Robot coordination and control systems for remote teleoperation applications are by necessity implemented on distributed computers. Modeling and performance analysis of these distributed robotic systems is difficult, but important for economic system design. Performance analysis methods originally developed for conventional distributed computer systems are often unsatisfactory for evaluating real-time systems. The paper introduces a formal model of distributed robotic control systems and a performance analysis method, based on scheduling theory, which can handle concurrent hard-real-time response specifications. Use of the method is illustrated by a case study of remote teleoperation which assesses the effect of communication delays and the allocation of robot control functions on control system hardware requirements.

  3. Apollo 15 time and motion study

    NASA Technical Reports Server (NTRS)

    Kubis, J. F.; Elrod, J. T.; Rusnak, R.; Barnes, J. E.

    1972-01-01

    A time and motion study of Apollo 15 lunar surface activity led to examination of four distinct areas of crewmen activity. These areas are: an analysis of lunar mobility, a comparative analysis of tasks performed in 1-g training and lunar EVA, an analysis of the metabolic cost of two activities that are performed in several EVAs, and a fall/near-fall analysis. An analysis of mobility showed that the crewmen used three basic mobility patterns (modified walk, hop, side step) while on the lunar surface. These mobility patterns were utilized as adaptive modes to compensate for the uneven terrain and varied soil conditions that the crewmen encountered. A comparison of the time required to perform tasks at the final 1-g lunar EVA training sessions and the time required to perform the same task on the lunar surface indicates that, in almost all cases, it took significantly more time (on the order of 40%) to perform tasks on the moon. This increased time was observed even after extraneous factors (e.g., hardware difficulties) were factored out.

  4. SRM Internal Flow Test and Computational Fluid Dynamic Analysis. Volume 1; Major Task Summaries

    NASA Technical Reports Server (NTRS)

    Whitesides, R. Harold; Dill, Richard A.; Purinton, David C.

    1995-01-01

During the four-year period of performance for NASA contract NAS8-39095, ERC performed a wide variety of tasks to support the design and continued development of new and existing solid rocket motors and the resolution of operational problems associated with existing solid rocket motors at NASA MSFC. This report summarizes the support provided to NASA MSFC during the contractual period of performance. The report is divided into three main sections. The first section presents summaries of the major tasks performed. These tasks are grouped into three major categories: full scale motor analysis, subscale motor analysis and cold flow analysis. The second section includes summaries describing the computational fluid dynamics (CFD) tasks performed. The third section, the appendices of the report, presents detailed descriptions of the analysis efforts as well as published papers, memoranda and final reports associated with specific tasks. These appendices are referenced in the summaries. The subsection numbers for the three sections correspond to the same topics for direct cross referencing.

  5. Comparative study on DuPont analysis and DEA models for measuring stock performance using financial ratio

    NASA Astrophysics Data System (ADS)

    Arsad, Roslah; Shaari, Siti Nabilah Mohd; Isa, Zaidi

    2017-11-01

Determining stock performance using financial ratios is challenging for many investors and researchers. Financial ratios can indicate the strengths and weaknesses of a company's stock performance. There are five categories of financial ratios, namely liquidity, efficiency, leverage, profitability and market ratios. It is important to interpret the ratios correctly for proper financial decision making. The purpose of this study is to compare the performance of listed companies in Bursa Malaysia using Data Envelopment Analysis (DEA) and DuPont analysis models. The study was conducted in 2015 on 116 consumer products companies listed in Bursa Malaysia. The Data Envelopment Analysis estimation method computes efficiency scores and ranks the companies accordingly. The Alirezaee and Afsharian method of analysis, based on the Charnes, Cooper and Rhodes (CCR) model with Constant Returns to Scale (CRS), is employed. DuPont analysis is a traditional tool for measuring the operating performance of companies. In this study, DuPont analysis is used to evaluate three different aspects: profitability, efficiency of asset utilization and financial leverage. Return on Equity (ROE) is also calculated in the DuPont analysis. This study finds that the two analysis models rank the selected samples differently. Hypothesis testing based on Pearson's correlation indicates that there is no correlation between the rankings produced by DEA and DuPont analysis. The DEA ranking model proposed by Alirezaee and Afsharian is unstable: the method cannot provide a complete ranking because the Balance Index values are equal to zero.
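
The three aspects evaluated in the DuPont analysis (profitability, asset utilization, financial leverage) multiply out to ROE in the standard three-factor decomposition. A sketch with hypothetical figures, not data from the study:

```python
def dupont_roe(net_income, revenue, total_assets, equity):
    """Three-factor DuPont decomposition: ROE = margin x turnover x leverage."""
    margin = net_income / revenue        # profitability (net profit margin)
    turnover = revenue / total_assets    # efficiency of asset utilization
    leverage = total_assets / equity     # financial leverage (equity multiplier)
    return margin * turnover * leverage  # algebraically equal to net_income / equity

# Hypothetical company: the decomposition must agree with ROE computed directly
roe = dupont_roe(net_income=50, revenue=500, total_assets=400, equity=200)
# → 0.25, i.e. the same as 50 / 200
```

The decomposition is useful precisely because each factor isolates one of the three aspects the study compares against the DEA efficiency scores.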

  6. Performance analysis of static locking in replicated distributed database systems

    NASA Technical Reports Server (NTRS)

    Kuang, Yinghong; Mukkamala, Ravi

    1991-01-01

Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects, because they are difficult to evaluate through analysis and time consuming to evaluate through simulation. Here, a technique is discussed that combines simulation and analysis to closely illustrate the impact of deadlocks and evaluate the performance of replicated distributed databases with both shared and exclusive locks.

  7. Competence and Performance in Belief-Desire Reasoning across Two Cultures: The Truth, the Whole Truth and Nothing but the Truth about False Belief?

    ERIC Educational Resources Information Center

    Yazdi, Amir Amin; German, Tim P.; Defeyter, Margaret Anne; Siegal, Michael

    2006-01-01

    There is a change in false belief task performance across the 3-5 year age range, as confirmed in a recent meta-analysis [Wellman, H. M., Cross, D., & Watson, J. (2001). Meta-analysis of theory mind development: The truth about false-belief. "Child Development," 72, 655-684]. This meta-analysis identified several performance factors influencing…

  8. Updating the Behavior Engineering Model.

    ERIC Educational Resources Information Center

    Chevalier, Roger

    2003-01-01

    Considers Thomas Gilbert's Behavior Engineering Model as a tool for systematically identifying barriers to individual and organizational performance. Includes a detailed case study and a performance aid that incorporates gap analysis, cause analysis, and force field analysis to update the original model. (Author/LRW)

  9. Analysis of a Rocket Based Combined Cycle Engine during Rocket Only Operation

    NASA Technical Reports Server (NTRS)

    Smith, T. D.; Steffen, C. J., Jr.; Yungster, S.; Keller, D. J.

    1998-01-01

The all rocket mode of operation is a critical factor in the overall performance of a rocket based combined cycle (RBCC) vehicle. However, outside of performing experiments or a full three dimensional analysis, there are no first order parametric models to estimate performance. As a result, an axisymmetric RBCC engine was used to analytically determine specific impulse efficiency values based upon both full flow and gas generator configurations. Design of experiments methodology was used to construct a test matrix and statistical regression analysis was used to build parametric models. The main parameters investigated in this study were: rocket chamber pressure, rocket exit area ratio, percent of injected secondary flow, mixer-ejector inlet area, mixer-ejector area ratio, and mixer-ejector length-to-inlet diameter ratio. A perfect gas computational fluid dynamics analysis was performed to obtain values of vacuum specific impulse. Statistical regression analysis was performed based on both full flow and gas generator engine cycles. Results were also found to be dependent upon the entire cycle assumptions. The statistical regression analysis determined that there were five significant linear effects, six interactions, and one second-order effect. Two parametric models were created to provide performance assessments of an RBCC engine in the all rocket mode of operation.

  10. Performance of an Axisymmetric Rocket Based Combined Cycle Engine During Rocket Only Operation Using Linear Regression Analysis

    NASA Technical Reports Server (NTRS)

    Smith, Timothy D.; Steffen, Christopher J., Jr.; Yungster, Shaye; Keller, Dennis J.

    1998-01-01

    The all rocket mode of operation is shown to be a critical factor in the overall performance of a rocket based combined cycle (RBCC) vehicle. An axisymmetric RBCC engine was used to determine specific impulse efficiency values based upon both full flow and gas generator configurations. Design of experiments methodology was used to construct a test matrix and multiple linear regression analysis was used to build parametric models. The main parameters investigated in this study were: rocket chamber pressure, rocket exit area ratio, injected secondary flow, mixer-ejector inlet area, mixer-ejector area ratio, and mixer-ejector length-to-inlet diameter ratio. A perfect gas computational fluid dynamics analysis, using both the Spalart-Allmaras and k-omega turbulence models, was performed with the NPARC code to obtain values of vacuum specific impulse. Results from the multiple linear regression analysis showed that for both the full flow and gas generator configurations increasing mixer-ejector area ratio and rocket area ratio increase performance, while increasing mixer-ejector inlet area ratio and mixer-ejector length-to-diameter ratio decrease performance. Increasing injected secondary flow increased performance for the gas generator analysis, but was not statistically significant for the full flow analysis. Chamber pressure was found to be not statistically significant.
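
A parametric model of the kind described is built by least-squares fitting of computed performance values against design parameters. A minimal single-variable sketch with made-up data (closed-form simple regression, rather than the multiple linear regression over six parameters used in the study):

```python
def simple_linear_fit(xs, ys):
    """Closed-form least-squares fit of y ≈ a + b*x."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Hypothetical data: specific impulse efficiency vs. mixer-ejector area ratio.
# The positive slope mirrors the paper's finding that increasing mixer-ejector
# area ratio increases performance.
a, b = simple_linear_fit([1.0, 2.0, 3.0, 4.0], [0.80, 0.84, 0.88, 0.92])
# slope b ≈ 0.04 per unit of area ratio, intercept a ≈ 0.76
```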

  11. Validating Human Behavioral Models for Combat Simulations Using Techniques for the Evaluation of Human Performance

    DTIC Science & Technology

    2004-01-01

    Cognitive Task Analysis Abstract As Department of Defense (DoD) leaders rely more on modeling and simulation to provide information on which to base...capabilities and intent. Cognitive Task Analysis (CTA) Cognitive Task Analysis (CTA) is an extensive/detailed look at tasks and subtasks performed by a...Domain Analysis and Task Analysis: A Difference That Matters. In Cognitive Task Analysis , edited by J. M. Schraagen, S.

  12. CFD Predictions for Transonic Performance of the ERA Hybrid Wing-Body Configuration

    NASA Technical Reports Server (NTRS)

    Deere, Karen A.; Luckring, James M.; McMillin, S. Naomi; Flamm, Jeffrey D.; Roman, Dino

    2016-01-01

A computational study was performed for a Hybrid Wing Body configuration focused on transonic cruise performance conditions. In the absence of experimental data, two fully independent computational fluid dynamics analyses were conducted to add confidence to the estimated transonic performance predictions. The primary analysis was performed by Boeing with the structured overset-mesh code OVERFLOW. The secondary analysis was performed by NASA Langley Research Center with the unstructured-mesh code USM3D. Both analyses were performed at full-scale flight conditions and included three configurations customary to drag buildup and interference analysis: a powered complete configuration, the configuration with the nacelle/pylon removed, and the powered nacelle in isolation. The results in this paper are focused primarily on transonic performance up to cruise and through drag rise. Agreement between the two sets of CFD results was very good despite some minor geometric differences in the two analyses.

  13. Uncertainty analysis for low-level radioactive waste disposal performance assessment at Oak Ridge National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, D.W.; Yambert, M.W.; Kocher, D.C.

    1994-12-31

A performance assessment of the operating Solid Waste Storage Area 6 (SWSA 6) facility for the disposal of low-level radioactive waste at the Oak Ridge National Laboratory has been prepared to provide the technical basis for demonstrating compliance with the performance objectives of DOE Order 5820.2A, Chapter III.2. An analysis of the uncertainty incorporated into the assessment was performed which addressed the quantitative uncertainty in the data used by the models, the subjective uncertainty associated with the models used for assessing performance of the disposal facility and site, and the uncertainty in the models used for estimating dose and human exposure. The results of the uncertainty analysis were used to interpret results and to formulate conclusions about the performance assessment. This paper discusses the approach taken in analyzing the uncertainty in the performance assessment and the role of uncertainty in performance assessment.

  14. Quantitative Analysis of Color Differences within High Contrast, Low Power Reversible Electrophoretic Displays

    DOE PAGES

    Giera, Brian; Bukosky, Scott; Lee, Elaine; ...

    2018-01-23

Here, quantitative color analysis is performed on videos of high contrast, low power reversible electrophoretic deposition (EPD)-based displays operated under different applied voltages. The analysis is implemented in open-source software, relies on a color differentiation metric, ΔE*00, derived from digital video, and provides an intuitive relationship between the operating conditions of the devices and their performance. Time-dependent ΔE*00 color analysis reveals color relaxation behavior, recoverability for different voltage sequences, and operating conditions that can lead to optimal performance.

  15. Quantitative Analysis of Color Differences within High Contrast, Low Power Reversible Electrophoretic Displays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giera, Brian; Bukosky, Scott; Lee, Elaine

Here, quantitative color analysis is performed on videos of high contrast, low power reversible electrophoretic deposition (EPD)-based displays operated under different applied voltages. The analysis is implemented in open-source software, relies on a color differentiation metric, ΔE*00, derived from digital video, and provides an intuitive relationship between the operating conditions of the devices and their performance. Time-dependent ΔE*00 color analysis reveals color relaxation behavior, recoverability for different voltage sequences, and operating conditions that can lead to optimal performance.
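
For illustration, the simpler CIE76 color difference (ΔE*ab, the Euclidean distance between two colors in CIELAB space) conveys the idea behind the ΔE*00 metric used in these papers; the full CIEDE2000 formula adds perceptual weighting and rotation terms omitted here:

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two CIELAB colors.
    (The papers use the more elaborate CIEDE2000 ΔE*00; this is its simpler ancestor.)"""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Two CIELAB colors differing only in the a*/b* (chromatic) channels
d = delta_e_cie76((50.0, 0.0, 0.0), (50.0, 3.0, 4.0))
# → 5.0; a ΔE around 2.3 is often quoted as a just-noticeable difference
```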

  16. Fusion Advanced Design Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    El-Guebaly, Laila; Henderson, Douglass; Wilson, Paul

    2017-03-24

    During the January 1, 2013 – December 31, 2015 contract period, the UW Fusion Technology Institute personnel have actively participated in the ARIES-ACT and FESS-FNSF projects, led the nuclear and thermostructural tasks, attended several project meetings, and participated in all conference calls. The main areas of effort and technical achievements include updating and documenting the nuclear analysis for ARIES-ACT1, performing nuclear analysis for ARIES-ACT2, performing thermostructural analysis for ARIES divertor, performing disruption analysis for ARIES vacuum vessel, and developing blanket testing strategy and Materials Test Module for FNSF.

  17. Safety and Performance Analysis of the Non-Radar Oceanic/Remote Airspace In-Trail Procedure

    NASA Technical Reports Server (NTRS)

    Carreno, Victor A.; Munoz, Cesar A.

    2007-01-01

    This document presents a safety and performance analysis of the nominal case for the In-Trail Procedure (ITP) in a non-radar oceanic/remote airspace. The analysis estimates the risk of collision between the aircraft performing the ITP and a reference aircraft. The risk of collision is only estimated for the ITP maneuver and it is based on nominal operating conditions. The analysis does not consider human error, communication error conditions, or the normal risk of flight present in current operations. The hazards associated with human error and communication errors are evaluated in an Operational Hazards Analysis presented elsewhere.

  18. Ongoing Analyses of Rocket Based Combined Cycle Engines by the Applied Fluid Dynamics Analysis Group at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Ruf, Joseph H.; Holt, James B.; Canabal, Francisco

    2001-01-01

    This paper presents the status of analyses on three Rocket Based Combined Cycle (RBCC) configurations underway in the Applied Fluid Dynamics Analysis Group (TD64). TD64 is performing computational fluid dynamics (CFD) analysis on a Penn State RBCC test rig, the proposed Draco axisymmetric RBCC engine and the Trailblazer engine. The intent of the analysis on the Penn State test rig is to benchmark the Finite Difference Navier Stokes (FDNS) code for ejector mode fluid dynamics. The Draco analysis was a trade study to determine the ejector mode performance as a function of three engine design variables. The Trailblazer analysis is to evaluate the nozzle performance in scramjet mode. Results to date of each analysis are presented.

  19. The effectiveness of external sensory cues in improving functional performance in individuals with Parkinson's disease: a systematic review with meta-analysis.

    PubMed

    Cassimatis, Constantine; Liu, Karen P Y; Fahey, Paul; Bissett, Michelle

    2016-09-01

A systematic review with meta-analysis was performed to investigate the effect of external sensory cued therapy on activities of daily living (ADL) performance, including walking and daily tasks such as dressing, for individuals with Parkinson's disease (PD). A detailed computer-aided search of the literature was applied to MEDLINE, Cumulative Index to Nursing and Allied Health Literature, EMBASE and PubMed. Studies investigating the effects of external sensory cued therapy on ADL performance for individuals with PD in all stages of disease progression were collected. Relevant articles were critically reviewed and study results were synthesized by two independent researchers. A data-extraction method was used to obtain data from the selected articles, and a meta-analysis was carried out for all randomized controlled trials. Six studies with 243 individuals with PD were included in this review. All six studies yielded positive findings in favour of external sensory cues. The meta-analysis showed statistically significant improvements in ADL performance after treatment (P=0.011) and at follow-up (P<0.001). The results of this review provide evidence of a general improvement in ADL performance in individuals with PD. It is recommended that clinicians incorporate external sensory cues into training programmes focused on improving daily task performance.

  20. The development of a reliable amateur boxing performance analysis template.

    PubMed

    Thomson, Edward; Lamb, Kevin; Nicholas, Ceri

    2013-01-01

    The aim of this study was to devise a valid performance analysis system for the assessment of the movement characteristics associated with competitive amateur boxing and assess its reliability using analysts of varying experience of the sport and performance analysis. Key performance indicators to characterise the demands of an amateur contest (offensive, defensive and feinting) were developed and notated using a computerised notational analysis system. Data were subjected to intra- and inter-observer reliability assessment using median sign tests and calculating the proportion of agreement within predetermined limits of error. For all performance indicators, intra-observer reliability revealed non-significant differences between observations (P > 0.05) and high agreement was established (80-100%) regardless of whether exact or the reference value of ±1 was applied. Inter-observer reliability was less impressive for both analysts (amateur boxer and experienced analyst), with the proportion of agreement ranging from 33-100%. Nonetheless, there was no systematic bias between observations for any indicator (P > 0.05), and the proportion of agreement within the reference range (±1) was 100%. A reliable performance analysis template has been developed for the assessment of amateur boxing performance and is available for use by researchers, coaches and athletes to classify and quantify the movement characteristics of amateur boxing.
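
    The reliability statistic used above (proportion of agreement within a reference limit of ±1) can be illustrated with a short sketch. This is a generic illustration, not the authors' template; the punch counts and tolerance are hypothetical:

    ```python
    import numpy as np

    def proportion_agreement(counts_a, counts_b, tol=0):
        """Proportion of paired frequency counts that agree within +/- tol."""
        a, b = np.asarray(counts_a), np.asarray(counts_b)
        return float(np.mean(np.abs(a - b) <= tol))

    # Hypothetical action counts notated twice for the same bout
    first_pass  = [12, 8, 5, 3]
    second_pass = [12, 7, 5, 1]
    exact   = proportion_agreement(first_pass, second_pass)           # 0.5
    within1 = proportion_agreement(first_pass, second_pass, tol=1)    # 0.75
    ```

    Widening the tolerance from exact agreement to ±1 is what lifts the agreement figures reported in the abstract.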

  1. HPLC-Orbitrap analysis for identification of organic molecules in complex material

    NASA Astrophysics Data System (ADS)

    Gautier, T.; Schmitz-Afonso, I.; Carrasco, N.; Touboul, D.; Szopa, C.; Buch, A.; Pernot, P.

    2015-10-01

    We performed High Performance Liquid Chromatography (HPLC) coupled to Orbitrap High Resolution Mass Spectrometry (OHR MS) analysis of Titan's tholins. This analysis allowed us to determine the exact composition and structure of some of the major components of tholins.

  2. Performance analysis of fusion nuclear-data benchmark experiments for light to heavy materials in MeV energy region with a neutron spectrum shifter

    NASA Astrophysics Data System (ADS)

    Murata, Isao; Ohta, Masayuki; Miyamaru, Hiroyuki; Kondo, Keitaro; Yoshida, Shigeo; Iida, Toshiyuki; Ochiai, Kentaro; Konno, Chikara

    2011-10-01

    Nuclear data are indispensable for the development of fusion reactor candidate materials. However, benchmarking of the nuclear data in the MeV energy region is not yet adequate. In the present study, benchmark performance in the MeV energy region was investigated theoretically for experiments using a 14 MeV neutron source. We carried out a systematic analysis for light to heavy materials. As a result, the benchmark performance for the neutron spectrum was confirmed to be acceptable, while for gamma-rays it was not sufficiently accurate. Consequently, a spectrum shifter has to be applied. Beryllium had the best performance as a shifter. Moreover, a preliminary examination was carried out of whether it is really acceptable that only the spectrum before the last collision is considered in the benchmark performance analysis. It was pointed out that not only the last collision but also earlier collisions should be considered equally in the benchmark performance analysis.

  3. Training Situation Analysis: Conducting a Needs Analysis for Teams and New Systems.

    ERIC Educational Resources Information Center

    Dell, Jay; Fox, John; Malcolm, Ralph

    1998-01-01

    The United States Coast Guard uses training situation analysis (TSA) to develop quantified training requirements, collect training and non-training performance data, and overcome turf issues to focus on performance outcomes. Presents the 1947 MLB (Motor Lifeboat Project) as a case study. Outlines 11 steps in the TSA needs analysis for teams and…

  4. Precision Attitude Determination System (PADS) design and analysis. Two-axis gimbal star tracker

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Development of the Precision Attitude Determination System (PADS) focused chiefly on the two-axis gimballed star tracker and electronics design improved from that of the Precision Pointing Control System (PPCS), and application of the improved tracker for PADS at geosynchronous altitude. System design, system analysis, software design, and hardware design activities are reported. The system design encompasses the PADS configuration, system performance characteristics, component design summaries, and interface considerations. The PADS design and performance analysis includes error analysis, performance analysis via attitude determination simulation, and star tracker servo design analysis. The design of the star tracker and electronics is discussed. Sensor electronics schematics are included. A detailed characterization of the application software algorithms and computer requirements is provided.

  5. Performance Analysis of MYSEA

    DTIC Science & Technology

    2012-09-01

    Services FSD Federated Services Daemon I&A Identification and Authentication IKE Internet Key Exchange KPI Key Performance Indicator LAN Local Area...spection takes place in different processes in the server architecture. Key Performance Indicators (KPIs) associated with the system need to be...application and risk analysis of security controls. Thus, measurement of the KPIs is needed before an informed tradeoff between the performance penalties

  6. Spectral analysis using CCDs

    NASA Technical Reports Server (NTRS)

    Hewes, C. R.; Brodersen, R. W.; De Wit, M.; Buss, D. D.

    1976-01-01

    Charge-coupled devices (CCDs) are ideally suited for performing sampled-data transversal filtering operations in the analog domain. Two algorithms have been identified for performing spectral analysis in which the bulk of the computation can be performed in a CCD transversal filter; the chirp z-transform and the prime transform. CCD implementation of both these transform algorithms is presented together with performance data and applications.
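
    The chirp z-transform named above decomposes the DFT into a chirp pre-multiplication, a convolution (exactly the operation a CCD transversal filter performs), and a chirp post-multiplication. A minimal software sketch of the standard Bluestein formulation (a generic illustration, not a CCD-specific implementation):

    ```python
    import numpy as np

    def czt(x, M=None, W=None, A=1.0 + 0j):
        """Chirp z-transform X[k] = sum_n x[n] * A**(-n) * W**(n*k), k = 0..M-1,
        via Bluestein's identity n*k = (n**2 + k**2 - (k-n)**2) / 2, which turns
        the transform into chirp multiplies around a linear convolution."""
        x = np.asarray(x, dtype=complex)
        N = len(x)
        M = N if M is None else M
        W = np.exp(-2j * np.pi / M) if W is None else W
        n, k = np.arange(N), np.arange(M)
        y = x * A**(-n) * W**(n**2 / 2.0)             # chirp pre-multiplication
        L = 1 << int(np.ceil(np.log2(N + M - 1)))     # FFT size for linear convolution
        v = np.zeros(L, dtype=complex)
        v[:M] = W**(-(k**2) / 2.0)                    # kernel W**(-(k-n)**2/2) ...
        v[L - N + 1:] = W**(-(n[1:][::-1]**2) / 2.0)  # ... with negative lags wrapped
        g = np.fft.ifft(np.fft.fft(y, L) * np.fft.fft(v))
        return g[:M] * W**(k**2 / 2.0)                # chirp post-multiplication

    # With default M, W, A the CZT reduces to the ordinary DFT:
    x = np.array([1.0, 2.0, 0.5, -1.0, 0.25, 3.0])
    np.allclose(czt(x), np.fft.fft(x))  # True
    ```

    The convolution in the middle is where the bulk of the computation lives, which is why the algorithm maps naturally onto an analog transversal filter.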

  7. Grid orthogonality effects on predicted turbine midspan heat transfer and performance

    NASA Technical Reports Server (NTRS)

    Boyle, R. J.; Ameri, A. A.

    1995-01-01

    The effect of five different C type grid geometries on the predicted heat transfer and aerodynamic performance of a turbine stator is examined. Predictions were obtained using two flow analysis codes. One was a finite difference analysis, and the other was a finite volume analysis. Differences among the grids in terms of heat transfer and overall performance were small. The most significant difference among the five grids occurred in the prediction of pitchwise variation in total pressure. There was consistency between results obtained with each of the flow analysis codes when the same grid was used. A grid generating procedure in which the viscous grid is embedded within an inviscid type grid resulted in the best overall performance.

  8. Solar array electrical performance assessment for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Smith, Bryan K.; Brisco, Holly

    1993-01-01

    Electrical power for Space Station Freedom will be generated by large Photovoltaic arrays with a beginning of life power requirement of 30.8 kW per array. The solar arrays will operate in a Low Earth Orbit (LEO) over a design life of fifteen years. This paper provides an analysis of the predicted solar array electrical performance over the design life and presents a summary of supporting analysis and test data for the assigned model parameters and performance loss factors. Each model parameter and loss factor is assessed based upon program requirements, component analysis, and test data to date. A description of the LMSC performance model, future test plans, and predicted performance ranges are also given.

  10. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments

    PubMed Central

    Bass, Ellen J.; Baumgart, Leigh A.; Shepley, Kathryn Klein

    2014-01-01

    Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noisy sensor data are used by both the human and the information analysis automation to make judgments. In a simplified air traffic conflict prediction experiment, 32 participants made probability of horizontal conflict judgments under different display content conditions. After being exposed to the information analysis automation, judgment achievement significantly improved for all participants as compared to judgments without any of the automation's information. Participants provided with additional display content pertaining to cue variability in the task environment had significantly higher aided judgment achievement compared to those provided with only the automation's judgment of a probability of conflict. When designing information analysis automation for environments where the automation's judgment achievement is impacted by noisy environmental data, it may be beneficial to show additional task environment information to the human judge in order to improve judgment performance. PMID:24847184

  11. Analysis of swimming performance: perceptions and practices of US-based swimming coaches.

    PubMed

    Mooney, Robert; Corley, Gavin; Godfrey, Alan; Osborough, Conor; Newell, John; Quinlan, Leo Richard; ÓLaighin, Gearóid

    2016-01-01

    In elite swimming, a broad range of methods are used to assess performance, inform coaching practices and monitor athletic progression. The aim of this paper was to examine the performance analysis practices of swimming coaches and to explore the reasons behind the decisions that coaches take when analysing performance. Survey data were analysed from 298 Level 3 competitive swimming coaches (245 male, 53 female) based in the United States. Results were compiled to provide a generalised picture of practices and perceptions and to examine key emerging themes. It was found that a disparity exists between the importance swim coaches place on biomechanical analysis of swimming performance and the types of analyses that are actually conducted. Video-based methods are most frequently employed, with over 70% of coaches using these methods at least monthly, with analyses being mainly qualitative in nature rather than quantitative. Barriers to the more widespread use of quantitative biomechanical analysis in elite swimming environments were explored. Constraints include time, cost and availability of resources, but other factors such as sources of information on swimming performance and analysis and control over service provision are also discussed, with particular emphasis on video-based methods and emerging sensor-based technologies.

  12. Performance (Off-Design) Cycle Analysis for a Turbofan Engine With Interstage Turbine Burner

    NASA Technical Reports Server (NTRS)

    Liew, K. H.; Urip, E.; Yang, S. L.; Mattingly, J. D.; Marek, C. J.

    2005-01-01

    This report presents the performance of a steady-state, dual-spool, separate-exhaust turbofan engine, with an interstage turbine burner (ITB) serving as a secondary combustor. The ITB, which is located in the transition duct between the high- and the low-pressure turbines, is a relatively new concept for increasing specific thrust and lowering pollutant emissions in modern jet-engine propulsion. A detailed off-design performance analysis of ITB engines is written in Microsoft® Excel (Redmond, Washington) macro code with Visual Basic for Applications to calculate engine performances over the entire operating envelope. Several design-point engine cases are pre-selected using a parametric cycle-analysis code developed previously in Microsoft® Excel, for off-design analysis. The off-design code calculates engine performances (i.e. thrust and thrust-specific fuel consumption) at various flight conditions and throttle settings.

  13. 78 FR 8150 - Proposed Information Collection Activity; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-05

    ... three components: the "Design and Implementation Study," the "Performance Analysis Study," and the...- Component Evaluation--Data Collection Related to the Performance Analysis Study and the Impact and the In-depth Implementation Study. OMB No.: 0970-0398 Description: The Office of Data Analysis, Research, and...

  14. Fortran 4 program for two-impulse rendezvous analysis

    NASA Technical Reports Server (NTRS)

    Barling, W. H., Jr.; Brothers, W. J.; Darling, W. H., Jr.

    1967-01-01

    Program determines if rendezvous in near space is possible, and performs an analysis to determine the approximate required values of the magnitude and direction of two thrust applications of the upper stage of a rocket firing. The analysis is performed by using ordinary Keplerian mechanics.

  15. Posttest analysis of the FFTF inherent safety tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Padilla, A. Jr.; Claybrook, S.W.

    Inherent safety tests were performed during 1986 in the 400-MW (thermal) Fast Flux Test Facility (FFTF) reactor to demonstrate the effectiveness of an inherent shutdown device called the gas expansion module (GEM). The GEM device provided a strong negative reactivity feedback during loss-of-flow conditions by increasing the neutron leakage as a result of an expanding gas bubble. The best-estimate pretest calculations for these tests were performed using the IANUS plant analysis code (Westinghouse Electric Corporation proprietary code) and the MELT/SIEX3 core analysis code. These two codes were also used to perform the required operational safety analyses for the FFTF reactor and plant. Although it was intended to also use the SASSYS systems (core and plant) analysis code, the calibration of the SASSYS code for FFTF core and plant analysis was not completed in time to perform pretest analyses. The purpose of this paper is to present the results of the posttest analysis of the 1986 FFTF inherent safety tests using the SASSYS code.

  16. 77 FR 20474 - Delegation of Authority; Delegation of Authority No. 24 to the Chief Operating Officer

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-04

    ... and performance planning, measurement, analysis, assessment of progress, and use of performance...; long- range budgeting and accounting; hiring and training employees; modernizing information technology..., measurement, analysis, regular assessment of progress and the use of performance information to improve...

  17. Safety and operational performance evaluation of four types of exit ramps on Florida's freeways (final report).

    DOT National Transportation Integrated Search

    2010-12-01

    This project mainly focuses on exit ramp performance analysis of safety and operations. In addition, issues of advance guide signs for exit ramps are also mentioned. Safety analysis evaluates safety performances of different exit ramps used in Florid...

  18. Vehicle Design Evaluation Program (VDEP). A computer program for weight sizing, economic, performance and mission analysis of fuel-conservative aircraft, multibodied aircraft and large cargo aircraft using both JP and alternative fuels

    NASA Technical Reports Server (NTRS)

    Oman, B. H.

    1977-01-01

    The NASA Langley Research Center vehicle design evaluation program (VDEP-2) was expanded by (1) incorporating into the program a capability to conduct preliminary design studies on subsonic commercial transport type aircraft using both JP and such alternate fuels as hydrogen and methane; (2) incorporating an aircraft detailed mission and performance analysis capability; and (3) developing and incorporating an external loads analysis capability. The resulting computer program (VDEP-3) provides a preliminary design tool that enables the user to perform integrated sizing, structural analysis, and cost studies on subsonic commercial transport aircraft. Both versions of the VDEP-3 program, designated Preliminary Analysis VDEP-3 and Detailed Analysis VDEP, utilize the same vehicle sizing subprogram, which includes a detailed mission analysis capability as well as a geometry and weight analysis for multibodied configurations.

  19. Interfacing Computer Aided Parallelization and Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Biegel, Bryan A. (Technical Monitor)

    2003-01-01

    When porting sequential applications to parallel computer architectures, the program developer will typically go through several cycles of source code optimization and performance analysis. We have started a project to develop an environment where the user can jointly navigate through program structure and performance data information in order to make efficient optimization decisions. In a prototype implementation we have interfaced the CAPO computer aided parallelization tool with the Paraver performance analysis tool. We describe both tools and their interface and give an example for how the interface helps within the program development cycle of a benchmark code.

  20. Separation analysis, a tool for analyzing multigrid algorithms

    NASA Technical Reports Server (NTRS)

    Costiner, Sorin; Taasan, Shlomo

    1995-01-01

    The separation of vectors by multigrid (MG) algorithms is applied to the study of convergence and to the prediction of the performance of MG algorithms. The separation operator for a two-level cycle algorithm is derived. It is used to analyze the efficiency of the cycle when mixing of eigenvectors occurs. In particular cases the separation analysis reduces to Fourier-type analysis. The separation operator of a two-level cycle for a Schrödinger eigenvalue problem is derived and analyzed in a Fourier basis. Separation analysis gives information on how to choose relaxations and inter-level transfers for good performance. Separation analysis is a tool for analyzing and designing algorithms, and for optimizing their performance.
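
    The object of this kind of analysis is the two-level cycle itself. The sketch below is a generic two-grid cycle for a 1-D Poisson model problem, not the paper's algorithm; the relaxation counts, damping factor, and transfer operators are illustrative assumptions:

    ```python
    import numpy as np

    def poisson_matrix(N, h):
        """3-point finite-difference matrix for -u'' on N interior points."""
        return (2*np.eye(N) - np.eye(N, k=1) - np.eye(N, k=-1)) / h**2

    def two_grid_cycle(u, f, h, nu=2, omega=2.0/3.0):
        """One two-level cycle for -u'' = f with zero Dirichlet boundary values.
        The fine grid has N interior points (N odd); the coarse grid (N-1)//2."""
        N = len(u)
        A = poisson_matrix(N, h)
        for _ in range(nu):                        # weighted-Jacobi pre-smoothing
            u = u + omega * (h**2 / 2.0) * (f - A @ u)
        r = f - A @ u
        rc = 0.25 * (r[0:-2:2] + 2.0*r[1:-1:2] + r[2::2])          # full weighting
        ec = np.linalg.solve(poisson_matrix((N - 1)//2, 2*h), rc)  # exact coarse solve
        e = np.zeros(N)                            # linear interpolation of correction
        e[1::2] = ec
        pad = np.r_[0.0, ec, 0.0]
        e[0::2] = 0.5 * (pad[:-1] + pad[1:])
        u = u + e
        for _ in range(nu):                        # post-smoothing
            u = u + omega * (h**2 / 2.0) * (f - A @ u)
        return u
    ```

    A handful of cycles reduces the algebraic error by orders of magnitude; the mixing of eigenvectors introduced by the inter-grid transfers is precisely what the separation operator is built to quantify.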

  1. Nuclear reactor descriptions for space power systems analysis

    NASA Technical Reports Server (NTRS)

    Mccauley, E. W.; Brown, N. J.

    1972-01-01

    For the small, high performance reactors required for space electric applications, adequate neutronic analysis is of crucial importance, but in terms of computational time consumed, nuclear calculations probably yield the least amount of detail for mission analysis study. It has been found possible, after generating only a few designs of a reactor family in elaborate thermomechanical and nuclear detail, to use simple curve-fitting techniques to assure the desired neutronic performance while still performing the thermomechanical analysis in explicit detail. The resulting speed-up in computation time permits a broad, detailed examination of constraints by the mission analyst.

  2. Inlet Development for a Rocket Based Combined Cycle, Single Stage to Orbit Vehicle Using Computational Fluid Dynamics

    NASA Technical Reports Server (NTRS)

    DeBonis, J. R.; Trefny, C. J.; Steffen, C. J., Jr.

    1999-01-01

    Design and analysis of the inlet for a rocket based combined cycle engine is discussed. Computational fluid dynamics was used in both the design and subsequent analysis. Reynolds averaged Navier-Stokes simulations were performed using both perfect gas and real gas assumptions. An inlet design that operates over the required Mach number range from 0 to 12 was produced. Performance data for cycle analysis were post-processed using a stream thrust averaging technique. A detailed performance database for cycle analysis is presented. The effect of vehicle forebody compression on air capture is also examined.

  3. A Workbench for Discovering Task-Specific Theories of Learning

    DTIC Science & Technology

    1989-03-03

    mind (the cognitive architecture) will not be of much use to educators who wish to perform a cognitive task analysis of their subject matter before...analysis packages that can be added to a cognitive architecture, thus creating a 'workbench' for performing cognitive task analysis. Such tools becomes...learning theories have been. Keywords: Cognitive task analysis, Instructional design, Cognitive modelling, Learning.

  4. HEPDOOP: High-Energy Physics Analysis using Hadoop

    NASA Astrophysics Data System (ADS)

    Bhimji, W.; Bristow, T.; Washbrook, A.

    2014-06-01

    We perform a LHC data analysis workflow using tools and data formats that are commonly used in the "Big Data" community outside High Energy Physics (HEP). These include Apache Avro for serialisation to binary files, Pig and Hadoop for mass data processing and Python Scikit-Learn for multi-variate analysis. Comparison is made with the same analysis performed with current HEP tools in ROOT.

  5. Principal Angle Enrichment Analysis (PAEA): Dimensionally Reduced Multivariate Gene Set Enrichment Analysis Tool

    PubMed Central

    Clark, Neil R.; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D.; Jones, Matthew R.; Ma’ayan, Avi

    2016-01-01

    Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, the performance of this method has not been assessed nor its implementation as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community. PMID:26848405
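
    The principal angles that give PAEA its name can be computed with the standard Björck–Golub SVD recipe. This is a generic illustration of the underlying geometry, not the PAEA implementation, and the two example subspaces are hypothetical:

    ```python
    import numpy as np

    def principal_angles(A, B):
        """Principal angles (radians) between the column spaces of A and B,
        from the singular values of Qa.T @ Qb (Bjorck-Golub method)."""
        Qa, _ = np.linalg.qr(A)
        Qb, _ = np.linalg.qr(B)
        s = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
        return np.arccos(np.clip(s, -1.0, 1.0))

    # Two planes in R^3 sharing one direction: angles are 0 and pi/2
    plane_xy = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
    plane_xz = np.array([[1.0, 0.0], [0.0, 0.0], [0.0, 1.0]])
    angles = principal_angles(plane_xy, plane_xz)  # -> [0, pi/2]
    ```

    In the enrichment setting, the smaller the principal angles between a differential-expression subspace and a gene-set subspace, the stronger the enrichment signal.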

  7. Aeroelastic Analysis Of Joined Wing Of High Altitude Long Endurance (HALE) Aircraft Based On The Sensor-Craft Configuration

    NASA Astrophysics Data System (ADS)

    Marisarla, Soujanya; Ghia, Urmila; Ghia, Kirti "Karman"

    2002-11-01

    Towards a comprehensive aeroelastic analysis of a joined wing, fluid dynamics and structural analyses are initially performed separately. Steady flow calculations are currently performed using 3-D compressible Navier-Stokes equations. Flow analysis of M6-Onera wing served to validate the software for the fluid dynamics analysis. The complex flow field of the joined wing is analyzed and the prevailing fluid dynamic forces are computed using COBALT software. Currently, these forces are being transferred as fluid loads on the structure. For the structural analysis, several test cases were run considering the wing as a cantilever beam; these served as validation cases. A nonlinear structural analysis of the wing is being performed using ANSYS software to predict the deflections and stresses on the joined wing. Issues related to modeling, and selecting appropriate mesh for the structure were addressed by first performing a linear analysis. The frequencies and mode shapes of the deformed wing are obtained from modal analysis. Both static and dynamic analyses are carried out, and the results obtained are carefully analyzed. Loose coupling between the fluid and structural analyses is currently being examined.

  8. Performance Indicators in Math: Implications for Brief Experimental Analysis of Academic Performance

    ERIC Educational Resources Information Center

    VanDerheyden, Amanda M.; Burns, Matthew K.

    2009-01-01

    Brief experimental analysis (BEA) can be used to specify intervention characteristics that produce positive learning gains for individual students. A key challenge to the use of BEA for intervention planning is the identification of performance indicators (including topography of the skill, measurement characteristics, and decision criteria) that…

  9. Improving Student Naval Aviator Aircraft Carrier Landing Performance

    ERIC Educational Resources Information Center

    Sheppard, Thomas H.; Foster, T. Chris

    2008-01-01

    This article discusses the use of human performance technology (HPT) to improve qualification rates for learning to land onboard aircraft carriers. This project started as a request for a business case analysis and evolved into a full-fledged performance improvement project, from mission analysis through evaluation. The result was a significant…

  10. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bremer, Peer-Timo; Mohr, Bernd; Schulz, Martin

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  11. Comparison of variance estimators for meta-analysis of instrumental variable estimates

    PubMed Central

    Schmidt, AF; Hingorani, AD; Jefferis, BJ; White, J; Groenwold, RHH; Dudbridge, F

    2016-01-01

    Abstract Background: Mendelian randomization studies perform instrumental variable (IV) analysis using genetic IVs. Results of individual Mendelian randomization studies can be pooled through meta-analysis. We explored how different variance estimators influence the meta-analysed IV estimate. Methods: Two versions of the delta method (IV before or after pooling), four bootstrap estimators, a jack-knife estimator and a heteroscedasticity-consistent (HC) variance estimator were compared using simulation. Two types of meta-analyses were compared, a two-stage meta-analysis pooling results, and a one-stage meta-analysis pooling datasets. Results: Using a two-stage meta-analysis, coverage of the point estimate using bootstrapped estimators deviated from nominal levels at weak instrument settings and/or outcome probabilities ≤ 0.10. The jack-knife estimator was the least biased resampling method, the HC estimator often failed at outcome probabilities ≤ 0.50 and overall the delta method estimators were the least biased. In the presence of between-study heterogeneity, the delta method before meta-analysis performed best. Using a one-stage meta-analysis all methods performed equally well and better than two-stage meta-analysis of greater or equal size. Conclusions: In the presence of between-study heterogeneity, two-stage meta-analyses should preferentially use the delta method before meta-analysis. Weak instrument bias can be reduced by performing a one-stage meta-analysis. PMID:27591262
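
    For a single instrument, the delta-method variance of the Wald ratio and the two-stage (inverse-variance) pooling compared above can be sketched as follows. This is an illustrative simplification with made-up summary statistics; the paper compares many more estimators and settings:

    ```python
    import numpy as np

    def wald_ratio_delta(beta_zy, se_zy, beta_zx, se_zx):
        """Single-instrument IV (Wald ratio) estimate with first-order
        delta-method SE: var = se_zy^2/beta_zx^2 + beta_zy^2*se_zx^2/beta_zx^4.
        beta_zy: gene-outcome association; beta_zx: gene-exposure association."""
        est = beta_zy / beta_zx
        var = se_zy**2 / beta_zx**2 + beta_zy**2 * se_zx**2 / beta_zx**4
        return est, np.sqrt(var)

    def two_stage_meta(estimates, ses):
        """Fixed-effect inverse-variance pooling of per-study IV estimates
        (the 'delta method before pooling' route in a two-stage meta-analysis)."""
        w = 1.0 / np.asarray(ses)**2
        pooled = np.sum(w * np.asarray(estimates)) / np.sum(w)
        return pooled, np.sqrt(1.0 / np.sum(w))

    # Hypothetical summary statistics for two studies
    e1, s1 = wald_ratio_delta(0.20, 0.05, 0.50, 0.10)   # est 0.4
    e2, s2 = wald_ratio_delta(0.30, 0.06, 0.60, 0.08)   # est 0.5
    pooled, pooled_se = two_stage_meta([e1, e2], [s1, s2])
    ```

    The one-stage alternative recommended under weak instruments pools the datasets before estimation rather than pooling per-study ratios like this.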

  12. Artificial Intelligence: An Analysis of Potential Applications to Training, Performance Measurement, and Job Performance Aiding.

    DTIC Science & Technology

    1983-09-01

    AD-A133 592. Artificial Intelligence: An Analysis of Potential Applications to Training, Performance Measurement, and Job Performance Aiding (U). Denver Research Inst., CO. J. Richardson. Interim report, Sep 83. AFHRL-TP-83-28. Keywords: artificial intelligence, military research, computer-aided diagnosis, performance tests, computer...

  13. Paramedir: A Tool for Programmable Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers in order to be transferred to novice users.

  14. Terminal Performance of Lead Free Pistol Bullets in Ballistic Gelatin Using Retarding Force Analysis from High Speed Video

    DTIC Science & Technology

    2016-04-04

    Terminal performance of lead-free pistol bullets in ballistic gelatin was quantified using retarding force analysis from high-speed video. The temporary stretch cavities and permanent wound cavities are also characterized. Two factors that tend to reduce the cavity are identified, and the additional contribution of stretching is discussed.

  15. Development of Response Surface Models for Rapid Analysis & Multidisciplinary Optimization of Launch Vehicle Design Concepts

    NASA Technical Reports Server (NTRS)

    Unal, Resit

    1999-01-01

    Multidisciplinary design optimization (MDO) is an important step in the design and evaluation of launch vehicles, since it has a significant impact on performance and lifecycle cost. The objective in MDO is to search the design space to determine the values of design parameters that optimize the performance characteristics subject to system constraints. The Vehicle Analysis Branch (VAB) at NASA Langley Research Center has computerized analysis tools in many of the disciplines required for the design and analysis of launch vehicles. Vehicle performance characteristics can be determined by the use of these computerized analysis tools. The next step is to optimize the system performance characteristics subject to multidisciplinary constraints. However, most of the complex sizing and performance evaluation codes used for launch vehicle design are stand-alone tools, operated by disciplinary experts. They are, in general, difficult to integrate and use directly for MDO. An alternative has been to utilize response surface methodology (RSM) to obtain polynomial models that approximate the functional relationships between performance characteristics and design variables. These approximation models, called response surface models, are then used to integrate the disciplines using mathematical programming methods for efficient system-level design analysis, MDO, and fast sensitivity simulations. A second-order (quadratic) response surface model is commonly used in RSM, since in many cases it provides an adequate approximation, especially if the region of interest is sufficiently limited.
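
    The quadratic response surface fit reduces to ordinary least squares on an expanded design matrix. A minimal two-variable sketch on synthetic, noise-free data (illustrative only; not the VAB analysis codes):

```python
import numpy as np

def quadratic_design(X):
    """Second-order RSM design matrix for two design variables:
    columns [1, x1, x2, x1^2, x2^2, x1*x2]."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1 * x2])

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(30, 2))        # 30 hypothetical design points
b_true = np.array([1.0, 2.0, -1.0, 0.5, 0.3, -0.7])
y = quadratic_design(X) @ b_true                # responses from the true surface
b_fit, *_ = np.linalg.lstsq(quadratic_design(X), y, rcond=None)
```

    With noise-free responses the least-squares fit recovers the surface coefficients exactly; with real disciplinary-code outputs it returns the best quadratic approximation over the sampled region.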

  16. HSI top-down requirements analysis for ship manpower reduction

    NASA Astrophysics Data System (ADS)

    Malone, Thomas B.; Bost, J. R.

    2000-11-01

    U.S. Navy ship acquisition programs such as DD 21 and CVNX are increasingly relying on top-down requirements analysis (TDRA) to define and assess design approaches for workload and manpower reduction, and for ensuring required levels of human performance, reliability, safety, and quality of life at sea. The human systems integration (HSI) approach to TDRA begins with a function analysis which identifies the functions derived from the requirements in the Operational Requirements Document (ORD). The function analysis serves as the function baseline for the ship, and also supports the definition of RDT&E and Total Ownership Cost requirements. A mission analysis is then conducted to identify mission scenarios, again based on requirements in the ORD, and the Design Reference Mission (DRM). This is followed by a mission/function analysis which establishes the function requirements to successfully perform the ship's missions. Function requirements of major importance for HSI are information, performance, decision, and support requirements associated with each function. An allocation of functions defines the roles of humans and automation in performing the functions associated with a mission. Alternate design concepts, based on function allocation strategies, are then described, and task networks associated with the concepts are developed. Task network simulations are conducted to assess workloads and human performance capabilities associated with alternate concepts. An assessment of the affordability and risk associated with alternate concepts is performed, and manning estimates are developed for feasible design concepts.

  17. Methodology issues concerning the accuracy of kinematic data collection and analysis using the ariel performance analysis system

    NASA Technical Reports Server (NTRS)

    Wilmington, R. P.; Klute, Glenn K. (Editor); Carroll, Amy E. (Editor); Stuart, Mark A. (Editor); Poliner, Jeff (Editor); Rajulu, Sudhakar (Editor); Stanush, Julie (Editor)

    1992-01-01

    Kinematics, the study of motion exclusive of the influences of mass and force, is one of the primary methods used for the analysis of human biomechanical systems as well as other types of mechanical systems. The Anthropometry and Biomechanics Laboratory (ABL) in the Crew Interface Analysis section of the Man-Systems Division performs both human-body and mechanical-system kinematic analyses using the Ariel Performance Analysis System (APAS). The APAS supports both analysis of analog signals (e.g., force-plate data collection) and digitization and analysis of video data. The current evaluations address several methodology issues concerning the accuracy of the kinematic data collection and analysis used in the ABL. This document describes a series of evaluations performed to gain quantitative data pertaining to position and constant angular velocity movements under several operating conditions. Two-dimensional as well as three-dimensional data collection and analyses were completed in a controlled laboratory environment using typical hardware setups. In addition, an evaluation was performed to quantify the accuracy impact due to a single-axis camera offset. Segment length and positional data exhibited errors within 3 percent when using three-dimensional analysis and yielded errors within 8 percent through two-dimensional analysis (Direct Linear Software). Peak angular velocities displayed errors within 6 percent through three-dimensional analyses and exhibited errors of 12 percent when using two-dimensional analysis (Direct Linear Software). The specific results from this series of evaluations and their impacts on the methodology issues of kinematic data collection and analyses are presented in detail. The accuracy levels observed in these evaluations are also presented.

  18. Lunar Exploration Architecture Level Key Drivers and Sensitivities

    NASA Technical Reports Server (NTRS)

    Goodliff, Kandyce; Cirillo, William; Earle, Kevin; Reeves, J. D.; Shyface, Hilary; Andraschko, Mark; Merrill, R. Gabe; Stromgren, Chel; Cirillo, Christopher

    2009-01-01

    Strategic level analysis of the integrated behavior of lunar transportation and lunar surface systems architecture options is performed to assess the benefit, viability, affordability, and robustness of system design choices. This analysis employs both deterministic and probabilistic modeling techniques so that the extent of potential future uncertainties associated with each option is properly characterized. The results of these analyses are summarized in a predefined set of high-level Figures of Merit (FOMs) so as to provide senior NASA Constellation Program (CxP) and Exploration Systems Mission Directorate (ESMD) management with pertinent information to better inform strategic level decision making. The strategic level exploration architecture model is designed to perform analysis at as high a level as possible but still capture those details that have major impacts on system performance. The strategic analysis methodology focuses on integrated performance, affordability, and risk analysis, and captures the linkages and feedbacks between these three areas. Each of these results leads into the determination of the high-level FOMs. This strategic level analysis methodology has been previously applied to Space Shuttle and International Space Station assessments and is now being applied to the development of the Constellation Program point-of-departure lunar architecture. This paper provides an overview of the strategic analysis methodology and the lunar exploration architecture analyses to date. In studying these analysis results, the strategic analysis team has identified and characterized key drivers affecting the integrated architecture behavior. Among these key drivers are the inclusion of a cargo lander, mission rate, mission location, fixed-versus-variable costs/return on investment, and the requirement for probabilistic analysis. Results of sensitivity analysis performed on lunar exploration architecture scenarios are also presented.

  19. Ca analysis: an Excel based program for the analysis of intracellular calcium transients including multiple, simultaneous regression analysis.

    PubMed

    Greensmith, David J

    2014-01-01

    Here I present an Excel-based program for the analysis of intracellular Ca transients recorded using fluorescent indicators. The program can perform all the necessary steps to convert recorded raw voltage changes into meaningful physiological information. The program performs two fundamental processes. (1) It can prepare the raw signal by several methods. (2) It can then be used to analyze the prepared data to provide information such as absolute intracellular Ca levels. Also, the rates of change of Ca can be measured using multiple, simultaneous regression analysis. I demonstrate that this program performs as well as commercially available software, but has numerous advantages, chief among them a simplified, self-contained analysis workflow. Copyright © 2013 The Author. Published by Elsevier Ireland Ltd. All rights reserved.
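
    The rate-of-change step can be pictured as straight-line fits over successive windows of the transient. A toy sketch (simple non-overlapping windows; the program's actual simultaneous-regression implementation is not reproduced here):

```python
import numpy as np

def segment_slopes(t, ca, window):
    """Rate of Ca change: slope of a linear fit in each consecutive,
    non-overlapping window of `window` samples."""
    slopes = []
    for start in range(0, len(t) - window + 1, window):
        s = np.polyfit(t[start:start + window], ca[start:start + window], 1)[0]
        slopes.append(s)
    return slopes
```

    Applied to a prepared (calibrated) transient, each slope estimates the local rate of Ca rise or decay in concentration units per unit time.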

  20. Caffeine ingestion enhances Wingate performance: a meta-analysis.

    PubMed

    Grgic, Jozo

    2018-03-01

    The positive effects of caffeine ingestion on aerobic performance are well-established; however, recent findings suggest that caffeine ingestion might also enhance components of anaerobic performance. A commonly used test of anaerobic performance and power output is the 30-second Wingate test. Several studies have explored the effects of caffeine ingestion on Wingate performance, with equivocal findings. To elucidate this topic, this paper aims to determine the effects of caffeine ingestion on Wingate performance using meta-analytic statistical techniques. Following a search through PubMed/MEDLINE, Scopus, and SportDiscus®, 16 studies were found meeting the inclusion criteria (pooled number of participants = 246). Random-effects meta-analysis of standardized mean differences (SMD) for peak power output and mean power output was performed. Study quality was assessed using the modified version of the PEDro checklist. Results of the meta-analysis indicated a significant difference (p = .005) between the placebo and caffeine trials on mean power output, with SMD values of small magnitude (0.18; 95% confidence interval: 0.05, 0.31; +3%). The meta-analysis performed for peak power output indicated a significant difference (p = .006) between the placebo and caffeine trials (SMD = 0.27; 95% confidence interval: 0.08, 0.47 [moderate magnitude]; +4%). The results from the PEDro checklist indicated that, in general, the studies are of good to excellent methodological quality. This meta-analysis adds to the current body of evidence showing that caffeine ingestion can also enhance components of anaerobic performance. The results presented herein may be helpful for developing more efficient evidence-based recommendations regarding caffeine supplementation.
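
    Random-effects pooling of SMDs of this kind is conventionally done with the DerSimonian-Laird estimator of between-study variance; a compact sketch of that standard method (not the authors' code):

```python
import math

def dersimonian_laird(smd, var):
    """Random-effects pooled SMD using DerSimonian-Laird tau^2.

    smd: per-study standardized mean differences.
    var: per-study sampling variances (SE squared).
    Returns (pooled SMD, its SE, tau^2)."""
    w = [1.0 / v for v in var]
    fixed = sum(wi * d for wi, d in zip(w, smd)) / sum(w)
    q = sum(wi * (d - fixed) ** 2 for wi, d in zip(w, smd))   # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(smd) - 1)) / c)                 # between-study variance
    w_star = [1.0 / (v + tau2) for v in var]
    pooled = sum(wi * d for wi, d in zip(w_star, smd)) / sum(w_star)
    return pooled, math.sqrt(1.0 / sum(w_star)), tau2
```

    When the per-study effects are homogeneous, tau^2 collapses to zero and the result reduces to the fixed-effect inverse-variance pool.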

  1. The role of ecological dynamics in analysing performance in team sports.

    PubMed

    Vilar, Luís; Araújo, Duarte; Davids, Keith; Button, Chris

    2012-01-01

    Performance analysis is a subdiscipline of sports sciences, and one approach, notational analysis, has been used to objectively audit and describe behaviours of performers during different subphases of play, providing additional information for practitioners to improve future sports performance. Recent criticisms of these methods have suggested the need for a sound theoretical rationale to explain performance behaviours, not just describe them. The aim of this article was to show how ecological dynamics provides a valid theoretical explanation of performance in team sports by explaining the formation of successful and unsuccessful patterns of play, based on symmetry-breaking processes emerging from functional interactions between players and the performance environment. We offer the view that ecological dynamics is an upgrade to more operational methods of performance analysis that merely document statistics of competitive performance. In support of our arguments, we refer to exemplar data on competitive performance in team sports that have revealed functional interpersonal interactions between attackers and defenders, based on variations in the spatial positioning of performers relative to each other in critical performance areas, such as the scoring zones. Implications of this perspective are also considered for practice task design and sport development programmes.

  2. Factors affecting construction performance: exploratory factor analysis

    NASA Astrophysics Data System (ADS)

    Soewin, E.; Chinda, T.

    2018-04-01

    The present work attempts to develop a multidimensional performance evaluation framework for a construction company by considering all relevant measures of performance. Based on previous studies, this study hypothesizes nine key factors, with a total of 57 associated items. The hypothesized factors, with their associated items, are then used to develop a questionnaire survey to gather data. Exploratory factor analysis (EFA) applied to the collected data gave rise to 10 factors, covering the 57 items, affecting construction performance. The findings reveal ten key performance factors (KPIs), namely: 1) Time, 2) Cost, 3) Quality, 4) Safety & Health, 5) Internal Stakeholder, 6) External Stakeholder, 7) Client Satisfaction, 8) Financial Performance, 9) Environment, and 10) Information, Technology & Innovation. The analysis helps to develop a multidimensional performance evaluation framework for effective measurement of construction performance. The 10 key performance factors can be broadly categorized into economic, social, environmental, and technology aspects. It is important to build a multidimensional performance evaluation framework around all key factors affecting a company's construction performance, so that management can plan and implement an effective performance development plan that matches the mission and vision of the company.
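
    A standard first step in an EFA of this kind is deciding how many factors to retain. A minimal sketch using the Kaiser criterion on an item correlation matrix (the paper's exact extraction and rotation settings are not stated here, and the 5-item matrix below is hypothetical):

```python
import numpy as np

def kaiser_factor_count(R):
    """Number of factors to retain: count of eigenvalues of the item
    correlation matrix R that exceed 1 (Kaiser criterion)."""
    return int((np.linalg.eigvalsh(R) > 1.0).sum())

# Hypothetical 5-item correlation matrix with two clear item clusters
R = np.array([
    [1.0, 0.8, 0.8, 0.0, 0.0],
    [0.8, 1.0, 0.8, 0.0, 0.0],
    [0.8, 0.8, 1.0, 0.0, 0.0],
    [0.0, 0.0, 0.0, 1.0, 0.7],
    [0.0, 0.0, 0.0, 0.7, 1.0],
])
```

    For this block structure the criterion retains two factors, one per item cluster; on the survey data the same test would be applied to the 57-item correlation matrix.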

  3. PIPER: Performance Insight for Programmers and Exascale Runtimes: Guiding the Development of the Exascale Software Stack

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mellor-Crummey, John

    The PIPER project set out to develop methodologies and software for measurement, analysis, attribution, and presentation of performance data for extreme-scale systems. Goals of the project were to support analysis of massive multi-scale parallelism, heterogeneous architectures, and multi-faceted performance concerns, and to support both post-mortem performance analysis, to identify program features that contribute to problematic performance, and on-line performance analysis, to drive adaptation. This final report summarizes the research and development activity at Rice University as part of the PIPER project. Producing a complete suite of performance tools for exascale platforms during the course of this project was impossible, since both hardware and software for exascale systems are still a moving target. For that reason, the project focused broadly on: the development of new techniques for measurement and analysis of performance on modern parallel architectures; enhancements to HPCToolkit’s software infrastructure to support our research goals or use on sophisticated applications; engaging developers of multithreaded runtimes to explore how support for tools should be integrated into their designs; engaging operating system developers with feature requests for enhanced monitoring support; engaging vendors with requests that they add the hardware measurement capabilities and software interfaces needed by tools as they design new components of HPC platforms, including processors, accelerators, and networks; and, finally, collaborations with partners interested in using HPCToolkit to analyze and tune scalable parallel applications.

  4. Development of Rasch-based item banks for the assessment of work performance in patients with musculoskeletal diseases.

    PubMed

    Mueller, Evelyn A; Bengel, Juergen; Wirtz, Markus A

    2013-12-01

    This study aimed to develop a self-description assessment instrument to measure work performance in patients with musculoskeletal diseases. In terms of the International Classification of Functioning, Disability and Health (ICF), work performance is defined as the degree of meeting the work demands (activities) at the actual workplace (environment). To account for the fact that work performance depends on the work demands of the job, we strived to develop item banks that allow a flexible use of item subgroups depending on the specific work demands of the patients' jobs. Item development included the collection of work tasks from literature and content validation through expert surveys and patient interviews. The resulting 122 items were answered by 621 patients with musculoskeletal diseases. Exploratory factor analysis to ascertain dimensionality and Rasch analysis (partial credit model) for each of the resulting dimensions were performed. Exploratory factor analysis resulted in four dimensions, and subsequent Rasch analysis led to the following item banks: 'impaired productivity' (15 items), 'impaired cognitive performance' (18), 'impaired coping with stress' (13) and 'impaired physical performance' (low physical workload 20 items, high physical workload 10 items). The item banks exhibited person separation indices (reliability) between 0.89 and 0.96. The assessment of work performance adds the activities component to the more commonly employed participation component of the ICF-model. The four item banks can be adapted to specific jobs where necessary without losing comparability of person measures, as the item banks are based on Rasch analysis.

  5. Predictive Validity of National Basketball Association Draft Combine on Future Performance.

    PubMed

    Teramoto, Masaru; Cross, Chad L; Rieger, Randall H; Maak, Travis G; Willick, Stuart E

    2018-02-01

    Teramoto, M, Cross, CL, Rieger, RH, Maak, TG, and Willick, SE. Predictive validity of National Basketball Association Draft Combine on future performance. J Strength Cond Res 32(2): 396-408, 2018-The National Basketball Association (NBA) Draft Combine is an annual event where prospective players are evaluated in terms of their athletic abilities and basketball skills. Data collected at the Combine should help NBA teams select the right players for the upcoming NBA draft; however, its value for predicting future performance of players has not been examined. This study investigated the predictive validity of the NBA Draft Combine on future performance of basketball players. We performed a principal component analysis (PCA) on the 2010-2015 Combine data to reduce correlated variables (N = 234), a correlation analysis on the Combine data and future on-court performance to examine relationships (maximum pairwise N = 217), and a robust principal component regression (PCR) analysis to predict first-year and 3-year on-court performance from the Combine measures (N = 148 and 127, respectively). Three components were identified within the Combine data through PCA (= Combine subscales): length-size, power-quickness, and upper-body strength. As per the correlation analysis, the individual Combine items for anthropometrics, including height without shoes, standing reach, weight, wingspan, and hand length, as well as the Combine subscale of length-size, had positive, medium-to-large-sized correlations (r = 0.313-0.545) with defensive performance quantified by Defensive Box Plus/Minus. The robust PCR analysis showed that the Combine subscale of length-size was the predictor most significantly associated with future on-court performance (p ≤ 0.05), including Win Shares, Box Plus/Minus, and Value Over Replacement Player, followed by upper-body strength. In conclusion, the NBA Draft Combine has value for predicting future performance of players.
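
    The PCA-then-regression pipeline can be sketched in a few lines. Plain least squares is used below where the authors used a robust PCR variant, and the data are synthetic stand-ins for the Combine measures:

```python
import numpy as np

def pcr_fit(X, y, k):
    """Principal component regression: standardize X, take the first k
    right singular vectors as components, then OLS of y on the scores.
    Returns the fitted values."""
    Xc = (X - X.mean(axis=0)) / X.std(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:k].T                      # component scores (subscales)
    A = np.column_stack([np.ones(len(y)), scores])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return A @ coef

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 5))                    # hypothetical Combine-style items
Xc = (X - X.mean(axis=0)) / X.std(axis=0)
y = 2.0 + Xc @ np.array([1.0, 2.0, 3.0, 0.5, -1.0])   # synthetic on-court metric
fitted = pcr_fit(X, y, k=5)
```

    With k equal to the full rank, PCR coincides with ordinary regression on the standardized items; choosing a small k (three, in the paper) trades some fit for stability among correlated measures.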

  6. Performance Analysis Tool for HPC and Big Data Applications on Scientific Clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Wucherl; Koo, Michelle; Cao, Yu

    Big data is prevalent in HPC computing. Many HPC projects rely on complex workflows to analyze terabytes or petabytes of data. These workflows often require running over thousands of CPU cores and performing simultaneous data accesses, data movements, and computation. It is challenging to analyze the performance involving terabytes or petabytes of workflow data, or measurement data of the executions, from complex workflows over a large number of nodes and multiple parallel task executions. To help identify performance bottlenecks or debug performance issues in large-scale scientific applications and scientific clusters, we have developed a performance analysis framework using state-of-the-art open-source big data processing tools. Our tool can ingest system logs and application performance measurements to extract key performance features, and apply sophisticated statistical tools and data mining methods to the performance data. It utilizes an efficient data processing engine to allow users to interactively analyze a large amount of different types of logs and measurements. To illustrate the functionality of the big data analysis framework, we conduct case studies on the workflows from an astronomy project known as the Palomar Transient Factory (PTF) and the job logs from a genome analysis scientific cluster. Our study processed many terabytes of system logs and application performance measurements collected on the HPC systems at NERSC. The implementation of our tool is generic enough to be used for analyzing the performance of other HPC systems and Big Data workflows.

  7. Using Importance-Performance Analysis To Evaluate Teaching Effectiveness.

    ERIC Educational Resources Information Center

    Attarian, Aram

    This paper introduces Importance-Performance (IP) analysis as a method to evaluate teaching effectiveness in a university outdoor program. Originally developed for use in the field of marketing, IP analysis is simple and easy to administer, and provides the instructor with a visual representation of what teaching attributes are important, how…

  8. Homemade Bienzymatic-Amperometric Biosensor for Beverages Analysis

    ERIC Educational Resources Information Center

    Blanco-Lopez, M. C.; Lobo-Castanon, M. J.; Miranda-Ordieres, A. J.

    2007-01-01

    The construction of an amperometric biosensor for glucose analysis is described demonstrating that the analysis is easy to perform and the biosensor gives good analytical performance. This experiment helped the students to acquire problem-solving and teamwork skills, allowing them to reach a high level of independent and critical thought.

  9. A Rational Analysis of the Acquisition of Multisensory Representations

    ERIC Educational Resources Information Center

    Yildirim, Ilker; Jacobs, Robert A.

    2012-01-01

    How do people learn multisensory, or amodal, representations, and what consequences do these representations have for perceptual performance? We address this question by performing a rational analysis of the problem of learning multisensory representations. This analysis makes use of a Bayesian nonparametric model that acquires latent multisensory…

  10. Meta-Analysis in Stata Using Gllamm

    ERIC Educational Resources Information Center

    Bagos, Pantelis G.

    2015-01-01

    There are several user-written programs for performing meta-analysis in Stata (Stata Statistical Software: College Station, TX: Stata Corp LP). These include metan, metareg, mvmeta, and glst. However, there are several cases for which these programs do not suffice. For instance, there is no software for performing univariate meta-analysis with…

  11. Cue Representation and Situational Awareness in Task Analysis

    ERIC Educational Resources Information Center

    Carl, Diana R.

    2009-01-01

    Task analysis in human performance technology is used to determine how human performance can be well supported with training, job aids, environmental changes, and other interventions. Early work by Miller (1953) and Gilbert (1969, 1974) addressed cue processing in task execution and recommended cue descriptions in task analysis. Modern task…

  12. An Analysis of the Automobile Sales Occupation.

    ERIC Educational Resources Information Center

    Bohac, Robert D.; Vernon, Robert C.

    The general purpose of the occupational analysis is to provide workable, basic information dealing with the many and varied duties performed in the auto sales occupation. The analysis follows the salesperson through the essential everyday performance of the tasks in the occupation. The duties involve the process of obtaining the prospects and…

  13. The effect of biological movement variability on the performance of the golf swing in high- and low-handicapped players.

    PubMed

    Bradshaw, Elizabeth J; Keogh, Justin W L; Hume, Patria A; Maulder, Peter S; Nortje, Jacques; Marnewick, Michel

    2009-06-01

    The purpose of this study was to examine the role of neuromotor noise on golf swing performance in high- and low-handicap players. Selected two-dimensional kinematic measures of 20 male golfers (n=10 per high- or low-handicap group) performing 10 golf swings with a 5-iron club were obtained through video analysis. Neuromotor noise was calculated by subtracting the standard error of the measurement from the coefficient of variation obtained from intra-individual analysis. Statistical methods included linear regression analysis and one-way analysis of variance using SPSS. Absolute invariance in the key technical positions (e.g., at the top of the backswing) of the golf swing appears to be a more favorable technique for skilled performance.
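
    The noise computation described reduces to a subtraction of two percentages. A toy sketch, assuming the measurement-error term `sem_percent` comes pre-computed from a separate repeatability check:

```python
import statistics

def neuromotor_noise(trials, sem_percent):
    """Intra-individual coefficient of variation (%) across a player's
    repeated swings, minus the standard error of measurement (%), as an
    estimate of biological (neuromotor) variability."""
    cv = statistics.stdev(trials) / statistics.mean(trials) * 100.0
    return cv - sem_percent
```

    For ten repeated measurements of, say, a joint angle at the top of the backswing, the residual after removing measurement error is attributed to the player rather than the video analysis.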

  14. Dispersion analysis for baseline reference mission 1. [flight simulation and trajectory analysis for space shuttle orbiter

    NASA Technical Reports Server (NTRS)

    Kuhn, A. E.

    1975-01-01

    A dispersion analysis considering 3 sigma uncertainties (or perturbations) in platform, vehicle, and environmental parameters was performed for the baseline reference mission (BRM) 1 of the space shuttle orbiter. The dispersion analysis is based on the nominal trajectory for the BRM 1. State vector and performance dispersions (or variations) which result from the indicated 3 sigma uncertainties were studied. The dispersions were determined at major mission events and fixed times from lift-off (time slices) and the results will be used to evaluate the capability of the vehicle to perform the mission within a 3 sigma level of confidence and to determine flight performance reserves. A computer program is given that was used for dynamic flight simulations of the space shuttle orbiter.
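
    Individual 3-sigma perturbation runs are classically combined by root-sum-square to obtain a total dispersion; a one-function sketch of that convention (the report's exact combination rules may differ):

```python
import math

def rss_dispersion(sensitivities, sigmas):
    """3-sigma dispersion of a state or performance variable from
    per-source sensitivities (partial derivatives from individual
    perturbation runs) and 1-sigma input uncertainties, combined
    root-sum-square under an independence assumption."""
    return 3.0 * math.sqrt(sum((p * s) ** 2 for p, s in zip(sensitivities, sigmas)))
```

    Evaluated at each mission event or time slice, the result bounds the variable at the 3-sigma confidence level used for flight performance reserves.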

  15. Rapid Elemental Analysis and Provenance Study of Blumea balsamifera DC Using Laser-Induced Breakdown Spectroscopy

    PubMed Central

    Liu, Xiaona; Zhang, Qiao; Wu, Zhisheng; Shi, Xinyuan; Zhao, Na; Qiao, Yanjiang

    2015-01-01

    Laser-induced breakdown spectroscopy (LIBS) was applied to perform a rapid elemental analysis and provenance study of Blumea balsamifera DC. Principal component analysis (PCA) and partial least squares discriminant analysis (PLS-DA) were implemented to exploit the multivariate nature of the LIBS data. Scores and loadings of computed principal components visually illustrated the differing spectral data. The PLS-DA algorithm showed good classification performance. The PLS-DA model using complete spectra as input variables had similar discrimination performance to using selected spectral lines as input variables. The down-selection of spectral lines was specifically focused on the major elements of B. balsamifera samples. Results indicated that LIBS could be used to rapidly analyze elements and to perform provenance study of B. balsamifera. PMID:25558999

  16. Comprehensive analysis of transport aircraft flight performance

    NASA Astrophysics Data System (ADS)

    Filippone, Antonio

    2008-04-01

    This paper reviews the state-of-the art in comprehensive performance codes for fixed-wing aircraft. The importance of system analysis in flight performance is discussed. The paper highlights the role of aerodynamics, propulsion, flight mechanics, aeroacoustics, flight operation, numerical optimisation, stochastic methods and numerical analysis. The latter discipline is used to investigate the sensitivities of the sub-systems to uncertainties in critical state parameters or functional parameters. The paper discusses critically the data used for performance analysis, and the areas where progress is required. Comprehensive analysis codes can be used for mission fuel planning, envelope exploration, competition analysis, a wide variety of environmental studies, marketing analysis, aircraft certification and conceptual aircraft design. A comprehensive program that uses the multi-disciplinary approach for transport aircraft is presented. The model includes a geometry deck, a separate engine input deck with the main parameters, a database of engine performance from an independent simulation, and an operational deck. The comprehensive code has modules for deriving the geometry from bitmap files, an aerodynamics model for all flight conditions, a flight mechanics model for flight envelopes and mission analysis, an aircraft noise model and engine emissions. The model is validated at different levels. Validation of the aerodynamic model is done against the scale models DLR-F4 and F6. A general model analysis and flight envelope exploration are shown for the Boeing B-777-300 with GE-90 turbofan engines with intermediate passenger capacity (394 passengers in 2 classes). Validation of the flight model is done by sensitivity analysis on the wetted area (or profile drag), on the specific air range, the brake-release gross weight and the aircraft noise. 
A variety of results is shown, including specific air range charts, take-off weight-altitude charts, payload-range performance, atmospheric effects, economic Mach number and noise trajectories at F.A.R. landing points.

  17. Application Performance Analysis and Efficient Execution on Systems with multi-core CPUs, GPUs and MICs: A Case Study with Microscopy Image Analysis

    PubMed Central

    Teodoro, George; Kurc, Tahsin; Andrade, Guilherme; Kong, Jun; Ferreira, Renato; Saltz, Joel

    2015-01-01

    We carry out a comparative performance study of multi-core CPUs, GPUs and Intel Xeon Phi (Many Integrated Core-MIC) with a microscopy image analysis application. We experimentally evaluate the performance of computing devices on core operations of the application. We correlate the observed performance with the characteristics of computing devices and data access patterns, computation complexities, and parallelization forms of the operations. The results show a significant variability in the performance of operations with respect to the device used. The performance of operations with regular data access is comparable, and sometimes better, on a MIC than on a GPU. GPUs are more efficient than MICs for operations that access data irregularly, because of the lower bandwidth of the MIC for random data accesses. We propose new performance-aware scheduling strategies that consider variabilities in operation speedups. Our scheduling strategies significantly improve application performance compared to classic strategies in hybrid configurations. PMID:28239253

  18. Multiplex network analysis of employee performance and employee social relationships

    NASA Astrophysics Data System (ADS)

    Cai, Meng; Wang, Wei; Cui, Ying; Stanley, H. Eugene

    2018-01-01

    In human resource management, employee performance is strongly affected by both formal and informal employee networks. Most previous research on employee performance has focused on monolayer networks that can represent only single categories of employee social relationships. We study employee performance by taking into account the entire multiplex structure of underlying employee social networks. We collect three datasets consisting of five different employee relationship categories in three firms, and predict employee performance using degree centrality and eigenvector centrality in a superimposed multiplex network (SMN) and an unfolded multiplex network (UMN). We use a quadratic assignment procedure (QAP) analysis and a regression analysis to demonstrate that the different categories of relationship are mutually embedded and that the strength of their impact on employee performance differs. We also use weighted/unweighted SMN/UMN to measure the predictive accuracy of this approach and find that employees with high centrality in a weighted UMN are more likely to perform well. Our results shed new light on how social structures affect employee performance.
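
    The centrality measures named above can be sketched on a toy superimposed multiplex network (the two layers and three employees below are invented; the study itself used five relationship categories across three firms).

```python
# Sketch: sum the layer adjacency matrices into a superimposed multiplex
# network (SMN), then compute weighted degree centrality and eigenvector
# centrality (by power iteration) on the summed network.

def superimpose(layers):
    n = len(layers[0])
    return [[sum(layer[i][j] for layer in layers) for j in range(n)]
            for i in range(n)]

def degree_centrality(adj):
    return [sum(row) for row in adj]

def eigenvector_centrality(adj, iters=200):
    n = len(adj)
    x = [1.0] * n
    for _ in range(iters):
        y = [sum(adj[i][j] * x[j] for j in range(n)) for i in range(n)]
        norm = max(y) or 1.0
        x = [v / norm for v in y]
    return x

friendship = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]  # invented layer 1
advice     = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]  # invented layer 2
smn = superimpose([friendship, advice])
deg = degree_centrality(smn)      # employee 1 has the most weighted ties
ec = eigenvector_centrality(smn)  # and the highest eigenvector centrality
```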

  19. The performance evaluation model of mining project founded on the weight optimization entropy value method

    NASA Astrophysics Data System (ADS)

    Mao, Chao; Chen, Shou

    2017-01-01

    Because the traditional entropy value method still yields low evaluation accuracy when assessing the performance of mining projects, a performance evaluation model founded on an improved entropy value method is proposed. First, a new weight assignment model is established, founded on compatibility matrix analysis of the analytic hierarchy process (AHP) and the entropy value method: when the compatibility matrix analysis meets the consistency requirement but the subjective and objective weights differ, the proportions of both are moderately adjusted. On this basis, a fuzzy evaluation matrix is then constructed for the performance evaluation. Simulation experiments show that, compared with the traditional entropy value and compatibility matrix analysis methods, the proposed performance evaluation model based on the improved entropy value method has higher assessment accuracy.
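
    As a minimal sketch of the entropy value method behind the model (the blending rule and all data are assumed for illustration, not the authors' exact formulation): indicators whose values are more dispersed across alternatives carry more information and receive larger objective weights, which can then be mixed with subjective AHP weights.

```python
import math

def entropy_weights(matrix):
    """matrix[i][j]: positive score of alternative i on indicator j."""
    m, n = len(matrix), len(matrix[0])
    k = 1.0 / math.log(m)
    divergence = []
    for j in range(n):
        col = [matrix[i][j] for i in range(m)]
        p = [v / sum(col) for v in col]
        e = -k * sum(pi * math.log(pi) for pi in p if pi > 0)  # entropy in [0, 1]
        divergence.append(1.0 - e)
    s = sum(divergence)
    return [d / s for d in divergence]

def blend(subjective, objective, alpha=0.5):
    """Adjustable mix of AHP and entropy weights; alpha is an assumed knob."""
    mixed = [alpha * a + (1 - alpha) * b for a, b in zip(subjective, objective)]
    return [w / sum(mixed) for w in mixed]

scores = [[0.9, 0.2, 0.5],   # invented mining-project scores:
          [0.8, 0.9, 0.5],   # rows = projects, columns = indicators
          [0.7, 0.4, 0.5]]
w_obj = entropy_weights(scores)        # constant third indicator gets ~0 weight
w_mix = blend([0.5, 0.3, 0.2], w_obj)  # invented subjective AHP weights
```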

  20. Using a virtual reality temporal bone simulator to assess otolaryngology trainees.

    PubMed

    Zirkle, Molly; Roberson, David W; Leuwer, Rudolf; Dubrowski, Adam

    2007-02-01

    The objective of this study is to determine the feasibility of computerized evaluation of resident performance using hand motion analysis on a virtual reality temporal bone (VR TB) simulator. We hypothesized that both computerized analysis and expert ratings would discriminate the performance of novices from experienced trainees. We also hypothesized that performance on the virtual reality temporal bone simulator (VR TB) would differentiate based on previous drilling experience. The authors conducted a randomized, blind assessment study. Nineteen volunteers from the Otolaryngology-Head and Neck Surgery training program at the University of Toronto drilled both a cadaveric TB and a simulated VR TB. Expert reviewers were asked to assess operative readiness of the trainee based on a blind video review of their performance. Computerized hand motion analysis of each participant's performance was conducted. Expert raters were able to discriminate novices from experienced trainees (P < .05) on cadaveric temporal bones, and there was a trend toward discrimination on VR TB performance. Hand motion analysis showed that experienced trainees had better movement economy than novices (P < .05) on the VR TB. Performance, as measured by hand motion analysis on the VR TB simulator, reflects trainees' previous drilling experience. This study suggests that otolaryngology trainees could accomplish initial temporal bone training on a VR TB simulator, which can provide feedback to the trainee, and may reduce the need for constant faculty supervision and evaluation.

  1. The Development of a Handbook for Astrobee F Performance and Stability Analysis

    NASA Technical Reports Server (NTRS)

    Wolf, R. S.

    1982-01-01

    An Astrobee F performance and stability analysis is presented, for use by the NASA Sounding Rocket Division. The performance analysis provides information regarding altitude, Mach number, dynamic pressure, and velocity as functions of time since launch. It is found that payload weight has the greatest effect on performance, and performance prediction accuracy was calculated to remain within 1%. In addition, to assure sufficient flight stability, a predicted rigid-body static margin of at least 8% of the total vehicle length is required. Finally, fin cant angle predictions are given in order to achieve a 2.5 cycle per second burnout roll rate, based on obtaining 75% of the steady roll rate. It is noted that this method can be used by flight performance engineers to create a similar handbook for any sounding rocket series.
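
    The 8% static-margin criterion quoted above can be illustrated in a few lines (the dimensions are invented, not Astrobee F data): static margin is the gap between center of pressure and center of gravity as a fraction of total vehicle length.

```python
def static_margin(x_cp: float, x_cg: float, length: float) -> float:
    """Positive when the CP lies aft of the CG (statically stable);
    x measured from the nose, result as a fraction of vehicle length."""
    return (x_cp - x_cg) / length

def meets_requirement(margin: float, minimum: float = 0.08) -> bool:
    return margin >= minimum

sm = static_margin(x_cp=7.4, x_cg=6.5, length=10.0)  # 9% of length
ok = meets_requirement(sm)
```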

  2. NASA trend analysis procedures

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This publication is primarily intended for use by NASA personnel engaged in managing or implementing trend analysis programs. 'Trend analysis' refers to the observation of current activity in the context of the past in order to infer the expected level of future activity. NASA trend analysis was divided into 5 categories: problem, performance, supportability, programmatic, and reliability. Problem trend analysis uncovers multiple occurrences of historical hardware or software problems or failures in order to focus future corrective action. Performance trend analysis observes changing levels of real-time or historical flight vehicle performance parameters such as temperatures, pressures, and flow rates as compared to specification or 'safe' limits. Supportability trend analysis assesses the adequacy of the spaceflight logistics system; example indicators are repair-turn-around time and parts stockage levels. Programmatic trend analysis uses quantitative indicators to evaluate the 'health' of NASA programs of all types. Finally, reliability trend analysis attempts to evaluate the growth of system reliability based on a decreasing rate of occurrence of hardware problems over time. Procedures for conducting all five types of trend analysis are provided in this publication, prepared through the joint efforts of the NASA Trend Analysis Working Group.

  3. Measurement uncertainty analysis techniques applied to PV performance measurements

    NASA Astrophysics Data System (ADS)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
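
    The core arithmetic of such an analysis can be sketched briefly (the component values are invented): independent standard uncertainties combine in quadrature, and an expanded uncertainty defines the interval quoted about the result.

```python
import math

def combined_standard_uncertainty(components):
    """Root-sum-square of independent standard uncertainty components."""
    return math.sqrt(sum(u * u for u in components))

def expanded_uncertainty(u_c, k=2.0):
    """Coverage factor k=2 gives roughly 95% coverage for normal errors."""
    return k * u_c

# e.g. a PV module power measurement: calibration 0.8 W, data acquisition
# 0.3 W, repeatability 0.5 W (all invented standard uncertainties)
u_c = combined_standard_uncertainty([0.8, 0.3, 0.5])
U = expanded_uncertainty(u_c)  # quote the result as P ± U watts
```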

  4. NDARC NASA Design and Analysis of Rotorcraft. Appendix 5; Theory

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2017-01-01

    The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. 
Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
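
    The component build-up described above can be sketched in a few lines (the classes and values are illustrative, not NDARC's actual data structures): aircraft attributes are obtained by summing component attributes.

```python
from dataclasses import dataclass

@dataclass
class Component:
    name: str
    weight_lb: float
    flat_plate_drag_ft2: float

def aircraft_totals(components):
    """Aircraft weight and drag as sums over the component list."""
    weight = sum(c.weight_lb for c in components)
    drag = sum(c.flat_plate_drag_ft2 for c in components)
    return weight, drag

parts = [Component("fuselage", 2800.0, 6.5),     # invented numbers
         Component("main rotor", 1500.0, 2.0),
         Component("tail rotor", 200.0, 0.4)]
gross_weight, total_drag = aircraft_totals(parts)
```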

  5. NDARC: NASA Design and Analysis of Rotorcraft. Appendix 3; Theory

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2016-01-01

    The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. 
Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.

  6. NDARC NASA Design and Analysis of Rotorcraft - Input, Appendix 2

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2016-01-01

    The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. 
Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tilt-rotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.

  7. NDARC NASA Design and Analysis of Rotorcraft. Appendix 6; Input

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2017-01-01

    The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. 
Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.

  8. NDARC NASA Design and Analysis of Rotorcraft

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne R.

    2009-01-01

    The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool intended to support both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility; a hierarchy of models; and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. 
Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.

  9. NDARC - NASA Design and Analysis of Rotorcraft

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2015-01-01

    The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. 
Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.

  10. NDARC NASA Design and Analysis of Rotorcraft Theory Appendix 1

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2016-01-01

    The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. 
Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kupca, L.; Beno, P.

    A very brief summary is provided of a primary circuit piping material properties analysis. The analysis was performed for the Bohunice V-1 reactor and the Kola-1 and -2 reactors. Assessment was performed on Bohunice V-1 archive materials and primary piping material cut from the Kola units after 100,000 hours of operation. Main research program tasks included analysis of mechanical properties, corrosion stability, and microstructural properties. Analysis results are not provided.

  12. Space tug economic analysis study. Volume 2: Tug concepts analysis. Part 1: Overall approach and data generation

    NASA Technical Reports Server (NTRS)

    1972-01-01

    An economic analysis of space tug operations is presented. The subjects discussed are: (1) data base for orbit injection stages, (2) data base for reusable space tug, (3) performance equations, (4) data integration and interpretation, (5) tug performance and mission model accommodation, (6) total program cost, (7) payload analysis, (8) computer software, and (9) comparison of tug concepts.
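
    As a hedged illustration of the kind of performance equation item (3) refers to (the stage numbers are invented, not taken from the study), the ideal rocket equation relates a tug's propellant load to its achievable velocity change.

```python
import math

G0 = 9.80665  # standard gravity, m/s^2

def delta_v(isp_s: float, m0_kg: float, mf_kg: float) -> float:
    """Tsiolkovsky rocket equation: dv = Isp * g0 * ln(m0 / mf)."""
    return isp_s * G0 * math.log(m0_kg / mf_kg)

# A notional cryogenic tug stage: Isp 460 s, 25 t loaded, 7 t at burnout.
dv = delta_v(460.0, 25000.0, 7000.0)  # m/s available for orbit transfer
```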

  13. State Policy Climates for College Student Success: An Analysis of State Policy Documents Pertaining to College Persistence and Completion

    ERIC Educational Resources Information Center

    McLendon, Michael K.; Tuchmayer, Jeremy B.; Park, Toby J.

    2010-01-01

    This article reports the findings of an exploratory analysis of state policy climates for college student persistence and completion. We performed an analysis of more than 100 documents collected from 8 states chosen largely on the basis of their performance on past "Measuring Up" reports. Our analysis of governors' state-of-the-state…

  14. Parallel Algorithms for Image Analysis.

    DTIC Science & Technology

    1982-06-01

    Report documentation page for Technical Report TR-1180, "Parallel Algorithms for Image Analysis," by Azriel Rosenfeld, supported under grant AFOSR-77-3271. Keywords: image processing; image analysis; parallel processing; cellular computers.

  15. Evaluating the Effect of Virtual Reality Temporal Bone Simulation on Mastoidectomy Performance: A Meta-analysis.

    PubMed

    Lui, Justin T; Hoy, Monica Y

    2017-06-01

    Background: The increasing prevalence of virtual reality simulation in temporal bone surgery warrants an investigation to assess training effectiveness. Objectives: To determine if temporal bone simulator use improves mastoidectomy performance. Data Sources: Ovid Medline, Embase, and PubMed databases were systematically searched per the PRISMA guidelines. Review Methods: Inclusion criteria were peer-reviewed publications that utilized quantitative data of mastoidectomy performance following the use of a temporal bone simulator. The search was restricted to human studies published in English. Studies were excluded if they were in non-peer-reviewed format, were descriptive in nature, or failed to provide surgical performance outcomes. Meta-analysis calculations were then performed. Results: A meta-analysis based on the random-effects model revealed an improvement in overall mastoidectomy performance following training on the temporal bone simulator. A standardized mean difference of 0.87 (95% CI, 0.38-1.35) was generated in the setting of a heterogeneous study population (I2 = 64.3%, P < .006). Conclusion: In the context of a diverse population of virtual reality simulation temporal bone surgery studies, meta-analysis calculations demonstrate an improvement in trainee mastoidectomy performance with virtual simulation training.
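
    The random-effects pooling used in such a meta-analysis can be sketched with the DerSimonian-Laird estimator (the per-study data below are invented, not the reviewed results).

```python
# Sketch: estimate between-study variance tau^2 from Cochran's Q, then pool
# standardized mean differences with inverse-variance weights 1/(v_i + tau^2).

def dersimonian_laird(effects, variances):
    w = [1.0 / v for v in variances]
    sw = sum(w)
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sw
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sw - sum(wi * wi for wi in w) / sw
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    return pooled, tau2

smds = [0.5, 1.2, 0.9]          # invented per-study standardized mean differences
variances = [0.04, 0.09, 0.06]  # invented per-study sampling variances
pooled_smd, tau2 = dersimonian_laird(smds, variances)
```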

  16. Quantitative analysis of the anti-noise performance of an m-sequence in an electromagnetic method

    NASA Astrophysics Data System (ADS)

    Yuan, Zhe; Zhang, Yiming; Zheng, Qijia

    2018-02-01

    An electromagnetic method whose transmitted waveform is coded by an m-sequence achieves better anti-noise performance than the conventional approach using a square wave. The anti-noise performance of the m-sequence varies with multiple coding parameters; hence, a quantitative analysis of the anti-noise performance of m-sequences with different coding parameters is required to optimize them. This paper proposes the concept of an identification system, in which the identified Earth impulse response is obtained by measuring the system output given the voltage response as input. A quantitative analysis of the anti-noise performance of the m-sequence is achieved by analyzing the amplitude-frequency response of the corresponding identification system. The effects of the coding parameters on the anti-noise performance are summarized by numerical simulation, and their optimization is further discussed in our conclusions; the validity of the conclusions is verified by field experiment. The quantitative analysis method proposed in this paper provides new insight into the anti-noise mechanism of the m-sequence, and could be used to evaluate the anti-noise performance of artificial sources in other time-domain exploration methods, such as the seismic method.
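
    An m-sequence source can be sketched with a maximal-length linear feedback shift register (the degree-4 register below is illustrative, not the authors' transmitter); the flat periodic autocorrelation away from zero lag is what underlies the anti-noise behaviour.

```python
# Fibonacci LFSR for the primitive polynomial x^4 + x^3 + 1: one period of
# length 2^4 - 1 = 15. In +/-1 form an m-sequence has periodic
# autocorrelation N at zero lag and exactly -1 at every other lag.

def m_sequence(taps=(4, 3), degree=4):
    state = [1] * degree
    out = []
    for _ in range(2 ** degree - 1):
        out.append(state[-1])
        fb = 0
        for t in taps:
            fb ^= state[t - 1]
        state = [fb] + state[:-1]
    return out

def periodic_autocorrelation(seq, lag):
    s = [1 if b else -1 for b in seq]
    n = len(s)
    return sum(s[i] * s[(i + lag) % n] for i in range(n))

seq = m_sequence()                     # one 15-chip period
r0 = periodic_autocorrelation(seq, 0)  # 15 at zero lag
r3 = periodic_autocorrelation(seq, 3)  # -1 at any nonzero lag
```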

  17. Application of the Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA) for Dynamic Systems Analysis

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.; Zinnecker, Alicia M.

    2014-01-01

    The aircraft engine design process seeks to achieve the best overall system-level performance, weight, and cost for a given engine design. This is achieved by a complex process known as systems analysis, where steady-state simulations are used to identify trade-offs that should be balanced to optimize the system. The steady-state simulations and data on which systems analysis relies may not adequately capture the true performance trade-offs that exist during transient operation. Dynamic Systems Analysis provides the capability for assessing these trade-offs at an earlier stage of the engine design process. The concept of dynamic systems analysis and the type of information available from this analysis are presented in this paper. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) was developed. This tool aids a user in the design of a power management controller to regulate thrust, and a transient limiter to protect the engine model from surge at a single flight condition (defined by an altitude and Mach number). Results from simulation of the closed-loop system may be used to estimate the dynamic performance of the model. This enables evaluation of the trade-off between performance and operability, or safety, in the engine, which could not be done with steady-state data alone. A design study is presented to compare the dynamic performance of two different engine models integrated with the TTECTrA software.

  18. EPA/ECLSS consumables analyses for the Spacelab 1 flight

    NASA Technical Reports Server (NTRS)

    Steines, G. J.; Pipher, M. D.

    1976-01-01

    The results of electrical power system (EPS) and environmental control/life support system (ECLSS) consumables analyses for the Spacelab 1 mission are presented. The analyses were performed to assess the capability of the orbiter systems to support the proposed mission and to establish the various non-propulsive consumables requirements. The EPS analysis was performed using the shuttle electrical power system (SEPS) analysis computer program. The ECLSS analysis was performed using the shuttle environmental consumables requirements evaluation tool (SECRET) program.

  19. Optical Performance Of The Gemini Carbon Dioxide Laser Fusion System

    NASA Astrophysics Data System (ADS)

    Viswanathan, V. K.; Hayden, J. J.; Liberman, I.

    1980-11-01

    The performance of the Gemini two-beam carbon dioxide laser fusion system was recently upgraded by installing optical components of improved quality in the final amplifier. A theoretical analysis was conducted in conjunction with measurements of the new performance. The analysis, the experimental procedures, and the results obtained are reported and compared. Agreement was good, within the uncertainties of the analysis and the inaccuracies of the experiments. The focal spot Strehl ratio was between 0.24 and 0.3 for both beams.

  20. A Review and Analysis of Performance Appraisal Processes, Volume III. Performance Appraisal for Professional Service Employees: Non-Technical Report. Professionalism in Schools Series.

    ERIC Educational Resources Information Center

    Ondrack, D. A.; Oliver, C.

    The third of three volumes, this report summarizes the findings of, first, a review and analysis of published literature on performance appraisal in general and particularly on the use of appraisals in public education systems, and, second, a series of field-site investigations of performance appraisal systems in action. The field site studies of…

  1. Arcjet thruster research and technology

    NASA Technical Reports Server (NTRS)

    Makel, Darby B.; Cann, Gordon L.

    1988-01-01

    The design, analysis, and performance testing of an advanced low-power arcjet is described. A high-impedance, vortex-stabilized 1-kW-class arcjet has been studied. A baseline research thruster has been built and endurance and performance tested. This advanced arcjet has demonstrated long-lifetime characteristics, but lower than expected performance. Analysis of the specific design has identified modifications which should improve performance while maintaining the long lifetime shown by the arcjet.

  2. A Ballistic Limit Analysis Program for Shielding Against Micrometeoroids and Orbital Debris

    NASA Technical Reports Server (NTRS)

    Ryan, Shannon; Christiansen, Erie

    2010-01-01

    A software program has been developed that enables the user to quickly and simply perform ballistic limit calculations for common spacecraft structures that are subject to hypervelocity impact of micrometeoroid and orbital debris (MMOD) projectiles. This analysis program consists of two core modules: design and performance. The design module enables a user to calculate preliminary dimensions of a shield configuration (e.g., thicknesses/areal densities, spacing, etc.) for a "design" particle (diameter, density, impact velocity, incidence). The performance module enables a more detailed shielding analysis, providing the performance of a user-defined shielding configuration over the range of relevant in-orbit impact conditions.
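    As a rough illustration of what such a performance module computes, the sketch below implements one widely published Whipple-shield ballistic limit form (Christiansen's equation for the hypervelocity regime, impact speeds above roughly 7 km/s): the critical projectile diameter as a function of rear-wall thickness (cm), shield spacing (cm), and rear-wall yield stress (ksi). This is a sketch of a published equation, not the program described above, and the inputs are illustrative.

```python
import math

def whipple_critical_diameter(tw_cm, spacing_cm, sigma_ksi,
                              rho_p=2.8, rho_b=2.7, v_kms=7.0, theta_deg=0.0):
    """Critical projectile diameter (cm) at the hypervelocity ballistic limit."""
    vn = v_kms * math.cos(math.radians(theta_deg))   # normal velocity component
    return (3.918 * tw_cm ** (2.0 / 3.0) * spacing_cm ** (1.0 / 3.0)
            * (sigma_ksi / 70.0) ** (1.0 / 3.0)
            / (rho_p ** (1.0 / 3.0) * rho_b ** (1.0 / 9.0) * vn ** (2.0 / 3.0)))

# Illustrative aluminum Whipple shield: 2 mm rear wall, 10 cm standoff.
dc = whipple_critical_diameter(tw_cm=0.2, spacing_cm=10.0, sigma_ksi=57.0)
```

    A design module would invert this relationship, sizing the wall and spacing for a prescribed "design" particle.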

  3. An Analysis of Factors That Affect the Educational Performance of Agricultural Students

    ERIC Educational Resources Information Center

    Greenway, Gina

    2012-01-01

    Many factors contribute to student achievement. This study focuses on three areas: how students learn, how student personality type affects performance, and how course format affects performance outcomes. The analysis sought to improve understanding of the direction and magnitude with which each of these factors impacts student success. Improved…

  4. Longitudinal Trend Analysis of Performance Indicators for South Carolina's Technical Colleges

    ERIC Educational Resources Information Center

    Hossain, Mohammad Nurul

    2010-01-01

    This study included an analysis of the trend of performance indicators for the technical college sector of higher education in South Carolina. In response to demands for accountability and transparency in higher education, the state of South Carolina developed sector specific performance indicators to measure various educational outcomes for each…

  5. A Systemic Cause Analysis Model for Human Performance Technicians

    ERIC Educational Resources Information Center

    Sostrin, Jesse

    2011-01-01

    This article presents a systemic, research-based cause analysis model for use in the field of human performance technology (HPT). The model organizes the most prominent barriers to workplace learning and performance into a conceptual framework that explains and illuminates the architecture of these barriers that exist within the fabric of everyday…

  6. Using Multilevel Analysis to Monitor Test Performance across Administrations. Research Report. ETS RR-14-29

    ERIC Educational Resources Information Center

    Wei, Youhua; Qu, Yanxuan

    2014-01-01

    For a testing program with frequent administrations, it is important to understand and monitor the stability and fluctuation of test performance across administrations. Different methods have been proposed for this purpose. This study explored the potential of using multilevel analysis to understand and monitor examinees' test performance across…

  7. Analytically Quantifying Gains in the Test and Evaluation Process through Capabilities-Based Analysis

    DTIC Science & Technology

    2011-09-01

    Author: Eric J. Lednicky. Performing organization: Naval Postgraduate School, Monterey, CA 93943-5000. [Only standard report-form fields and a contents entry, "Measures of Effectiveness / Measures of Performance," were recovered; no abstract text is available.]

  8. Embedded Figures Test Performance in the Broader Autism Phenotype: A Meta-Analysis

    ERIC Educational Resources Information Center

    Cribb, Serena J.; Olaithe, Michelle; Di Lorenzo, Renata; Dunlop, Patrick D.; Maybery, Murray T.

    2016-01-01

    People with autism show superior performance to controls on the Embedded Figures Test (EFT). However, studies examining the relationship between autistic-like traits and EFT performance in neurotypical individuals have yielded inconsistent findings. To examine the inconsistency, a meta-analysis was conducted of studies that (a) compared high and…

  9. Performance analysis of an integrated GPS/inertial attitude determination system. M.S. Thesis - MIT

    NASA Technical Reports Server (NTRS)

    Sullivan, Wendy I.

    1994-01-01

    The performance of an integrated GPS/inertial attitude determination system is investigated using a linear covariance analysis. The principles of GPS interferometry are reviewed, and the major error sources of both interferometers and gyroscopes are discussed and modeled. A new figure of merit, attitude dilution of precision (ADOP), is defined for two possible GPS attitude determination methods, namely single difference and double difference interferometry. Based on this figure of merit, a satellite selection scheme is proposed. The performance of the integrated GPS/inertial attitude determination system is determined using a linear covariance analysis. Based on this analysis, it is concluded that the baseline errors (i.e., knowledge of the GPS interferometer baseline relative to the vehicle coordinate system) are the limiting factor in system performance. By reducing baseline errors, it should be possible to use lower quality gyroscopes without significantly reducing performance. For the cases considered, single difference interferometry is only marginally better than double difference interferometry. Finally, the performance of the system is found to be relatively insensitive to the satellite selection technique.

  10. Ca analysis: An Excel based program for the analysis of intracellular calcium transients including multiple, simultaneous regression analysis

    PubMed Central

    Greensmith, David J.

    2014-01-01

    Here I present an Excel based program for the analysis of intracellular Ca transients recorded using fluorescent indicators. The program can perform all the necessary steps which convert recorded raw voltage changes into meaningful physiological information. The program performs two fundamental processes. (1) It can prepare the raw signal by several methods. (2) It can then be used to analyze the prepared data to provide information such as absolute intracellular Ca levels. Also, the rates of change of Ca can be measured using multiple, simultaneous regression analysis. I demonstrate that this program performs as well as commercially available software, but has numerous advantages, namely creating a simplified, self-contained analysis workflow. PMID:24125908
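    The rate-of-change measurement mentioned above can be sketched as a sliding-window linear regression: fit a line to each short segment of the Ca record and take the steepest slope as the maximal rate of rise. The synthetic transient and window length here are illustrative, not taken from the published program.

```python
import math

def slope(xs, ys):
    """Least-squares slope of a straight-line fit."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def max_rate_of_rise(t, ca, window=5):
    """Steepest regression slope over all sliding windows of the trace."""
    return max(slope(t[i:i + window], ca[i:i + window])
               for i in range(len(t) - window + 1))

# Synthetic transient (nM): linear upstroke over 20 ms, then a slow decay.
t = [i * 0.001 for i in range(300)]
ca = [100.0 + 400.0 * min(x / 0.02, 1.0) * math.exp(-max(x - 0.02, 0.0) / 0.1)
      for x in t]
rate = max_rate_of_rise(t, ca)   # ~20000 nM/s for this synthetic upstroke
```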

  11. The UTRC wind energy conversion system performance analysis for horizontal axis wind turbines (WECSPER)

    NASA Technical Reports Server (NTRS)

    Egolf, T. A.; Landgrebe, A. J.

    1981-01-01

    The theory for the UTRC Wind Energy Conversion System Performance analysis (WECSPER) for the prediction of horizontal axis wind turbine performance is presented. Major features of the analysis are the ability to: (1) treat the wind turbine blades as lifting lines with a prescribed wake model; (2) solve for the wake-induced inflow and blade circulation using real nonlinear airfoil data; and (3) iterate internally to obtain a compatible wake transport velocity and blade loading solution. This analysis also provides an approximate treatment of wake distortions due to tower shadow or wind shear profiles. Finally, selected results of internal UTRC application of the analysis to existing wind turbines and correlation with limited test data are described.

  12. Numerical Analysis of Coolant Flow and Heat Transfer in ITER Diagnostic First Wall

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khodak, A.; Loesser, G.; Zhai, Y.

    2015-07-24

    We performed numerical simulations of the ITER Diagnostic First Wall (DFW) using ANSYS Workbench. During operation the DFW will include a solid main body as well as liquid coolant. Thus the thermal and hydraulic analysis of the DFW was performed using a conjugate heat transfer approach, in which heat transfer was resolved in both the solid and liquid parts, while fluid dynamics analysis was performed only in the liquid part. This approach includes the interface between the solid and liquid parts of the system. The analysis was performed using ANSYS CFX software, which allows solution of the heat transfer equations in the solid and liquid parts and solution of the flow equations in the liquid part. Coolant flow in the DFW was assumed turbulent and was resolved using the Reynolds-averaged Navier-Stokes equations with the Shear Stress Transport turbulence model. Meshing was performed using the CFX method available within ANSYS. The data cloud for thermal loading, consisting of volumetric heating and surface heating, was imported into CFX. The volumetric heating source was generated using Attila software, and the surface heating was obtained from a radiation heat transfer analysis. Our results allowed us to identify areas of excessive heating. Proposals for cooling channel relocation were made, and additional suggestions were made to improve the hydraulic performance of the cooling system.

  13. Inertial Sensor Technology for Elite Swimming Performance Analysis: A Systematic Review

    PubMed Central

    Mooney, Robert; Corley, Gavin; Godfrey, Alan; Quinlan, Leo R; ÓLaighin, Gearóid

    2015-01-01

    Technical evaluation of swimming performance is an essential factor of elite athletic preparation. Novel methods of analysis, incorporating body worn inertial sensors (i.e., Microelectromechanical systems, or MEMS, accelerometers and gyroscopes), have received much attention recently from both research and commercial communities as an alternative to video-based approaches. This technology may allow for improved analysis of stroke mechanics, race performance and energy expenditure, as well as real-time feedback to the coach, potentially enabling more efficient, competitive and quantitative coaching. The aim of this paper is to provide a systematic review of the literature related to the use of inertial sensors for the technical analysis of swimming performance. This paper focuses on providing an evaluation of the accuracy of different feature detection algorithms described in the literature for the analysis of different phases of swimming, specifically starts, turns and free-swimming. The consequences associated with different sensor attachment locations are also considered for both single and multiple sensor configurations. Additional information such as this should help practitioners to select the most appropriate systems and methods for extracting the key performance related parameters that are important to them for analysing their swimmers’ performance and may serve to inform both applied and research practices. PMID:26712760

  14. Efficacy of Ginseng Supplements on Fatigue and Physical Performance: a Meta-analysis

    PubMed Central

    2016-01-01

    We conducted a meta-analysis to investigate the efficacy of ginseng supplements on fatigue reduction and physical performance enhancement as reported by randomized controlled trials (RCTs). RCTs that investigated the efficacy of ginseng supplements on fatigue reduction and physical performance enhancement compared with placebo were included. The main outcome measures were fatigue reduction and physical performance enhancement. Out of 155 articles meeting initial criteria, 12 RCTs involving 630 participants (311 in the intervention group and 319 in the placebo group) were included in the final analysis. In the fixed-effect meta-analysis of four RCTs, there was a statistically significant effect of ginseng supplements on fatigue reduction (standardized mean difference, SMD = 0.34; 95% confidence interval [CI] = 0.16 to 0.52). However, ginseng supplements were not associated with physical performance enhancement in the fixed-effect meta-analysis of eight RCTs (SMD = −0.01; 95% CI = −0.29 to 0.27). We found that there was insufficient clinical evidence to support the use of ginseng supplements for reducing fatigue and enhancing physical performance, because only a few RCTs with small sample sizes have been published so far. Further larger RCTs are required to confirm the efficacy of ginseng supplements on fatigue reduction. PMID:27822924
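    The fixed-effect pooling behind such an SMD estimate can be sketched with inverse-variance weighting: each trial's SMD is weighted by the reciprocal of its variance, and a 95% CI follows from the summed weights. The per-trial values below are invented for illustration, not the trials in this meta-analysis.

```python
import math

def fixed_effect_pool(smds, variances):
    """Inverse-variance fixed-effect pooled SMD with a 95% CI."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * d for w, d in zip(weights, smds)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

smds = [0.41, 0.28, 0.35, 0.30]        # hypothetical per-trial SMDs
variances = [0.04, 0.02, 0.05, 0.03]   # hypothetical per-trial variances
pooled, ci = fixed_effect_pool(smds, variances)
```

    If the CI excludes zero, the pooled effect is statistically significant, which is how the fatigue result above is read.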

  15. The reliability of an instrumented start block analysis system.

    PubMed

    Tor, Elaine; Pease, David L; Ball, Kevin A

    2015-02-01

    The swimming start is highly influential to overall competition performance; it is therefore paramount to develop reliable methods for accurate biomechanical analysis of start performance in training and research. The Wetplate Analysis System is a custom-made force plate system developed by the Australian Institute of Sport--Aquatic Testing, Training and Research Unit (AIS ATTRU). This sophisticated system combines force data and 2D digitization to measure a number of kinetic and kinematic parameter values in order to evaluate start performance. Fourteen elite swimmers performed two maximal effort dives (performance was defined as time from start signal to 15 m) over two separate testing sessions. Intraclass correlation coefficients (ICC) were used to determine each parameter's reliability. The kinetic parameters all had ICCs greater than 0.9 except for the time of peak vertical force (0.742), which may have been due to variations in movement initiation after the starting signal between trials. The kinematic and time parameters also had ICCs greater than 0.9 except for the time of maximum depth (0.719); this parameter was lower because the swimmers varied their depth between trials. Based on the high ICC scores for all parameters, the Wetplate Analysis System is suitable for biomechanical analysis of swimming starts.
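    A test-retest reliability coefficient like those reported above can be sketched as a two-way random-effects ICC(2,1) computed from the subjects-by-sessions score table. The fourteen start times below are invented illustration values, not the Wetplate data.

```python
def icc_2_1(scores):
    """Two-way random-effects ICC(2,1) for an n-subjects x k-sessions table."""
    n, k = len(scores), len(scores[0])
    grand = sum(sum(row) for row in scores) / (n * k)
    row_means = [sum(row) / k for row in scores]
    col_means = [sum(row[j] for row in scores) / n for j in range(k)]
    ss_total = sum((x - grand) ** 2 for row in scores for x in row)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ms_rows = ss_rows / (n - 1)                      # between-subjects
    ms_cols = ss_cols / (k - 1)                      # between-sessions
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return ((ms_rows - ms_err)
            / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n))

# Hypothetical times to 15 m (s) for 14 swimmers over two sessions.
session1 = [6.8, 7.1, 7.4, 6.9, 7.3, 7.0, 7.2, 6.7, 7.5, 7.1, 6.9, 7.3, 7.0, 7.2]
session2 = [6.82, 7.08, 7.43, 6.88, 7.32, 7.02, 7.18, 6.72, 7.52, 7.09, 6.93, 7.31, 6.98, 7.22]
icc = icc_2_1([list(pair) for pair in zip(session1, session2)])
```

    With small session-to-session noise relative to the between-swimmer spread, the ICC comes out close to 1, as for the reliable Wetplate parameters.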

  16. Individual and population pharmacokinetic compartment analysis: a graphic procedure for quantification of predictive performance.

    PubMed

    Eksborg, Staffan

    2013-01-01

    Pharmacokinetic studies are important for optimizing drug dosing, but they require proper validation of the pharmacokinetic procedures used. However, simple and reliable statistical methods suitable for evaluating the predictive performance of pharmacokinetic analysis are essentially lacking. The aim of the present study was to construct and evaluate a graphic procedure for quantification of the predictive performance of individual and population pharmacokinetic compartment analysis. Original data from previously published pharmacokinetic compartment analyses after intravenous, oral, and epidural administration, and digitized data obtained from published scatter plots of observed versus predicted drug concentrations from population pharmacokinetic studies using the NPEM algorithm, the NONMEM computer program, and Bayesian forecasting procedures, were used to estimate the predictive performance according to the proposed graphical method and by the method of Sheiner and Beal. The graphical plot proposed in the present paper proved to be a useful tool for evaluation of the predictive performance of both individual and population compartment pharmacokinetic analysis. The proposed method is simple to use and gives valuable information concerning time- and concentration-dependent inaccuracies that might occur in individual and population pharmacokinetic compartment analysis. Predictive performance can be quantified by the fraction of concentration ratios within arbitrarily specified ranges, e.g. within the range 0.8-1.2.
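    The quantification step described in the last sentence reduces to a few lines: compute observed/predicted concentration ratios and report the fraction falling inside a specified range such as 0.8-1.2. The concentrations below are invented for illustration.

```python
def fraction_within(observed, predicted, low=0.8, high=1.2):
    """Fraction of observed/predicted concentration ratios inside [low, high]."""
    ratios = [o / p for o, p in zip(observed, predicted)]
    return sum(low <= r <= high for r in ratios) / len(ratios)

# Hypothetical paired drug concentrations (same units for both lists).
observed = [10.0, 12.0, 8.0, 15.0, 9.0]
predicted = [11.0, 10.0, 9.0, 14.0, 14.0]
frac = fraction_within(observed, predicted)
```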

  17. Modeling and Analysis of Actinide Diffusion Behavior in Irradiated Metal Fuel

    NASA Astrophysics Data System (ADS)

    Edelmann, Paul G.

    There have been numerous attempts to model fast reactor fuel behavior in the last 40 years. The US currently does not have a fully reliable tool to simulate the behavior of metal fuels in fast reactors, and the experimental database necessary to validate the codes is also very limited. The DOE-sponsored Advanced Fuels Campaign (AFC) has performed various experiments that are ready for analysis. Current metal fuel performance codes are either not available to the AFC or have limitations and deficiencies in predicting AFC fuel performance. A modified version of a new fuel performance code, FEAST-Metal, was employed in this investigation with useful results. This work explores the modeling and analysis of AFC metallic fuels using FEAST-Metal, particularly in the area of constituent actinide diffusion behavior. The FEAST-Metal code calculations for this work were conducted at Los Alamos National Laboratory (LANL) in support of on-going activities related to sensitivity analysis of fuel performance codes. A sensitivity analysis of FEAST-Metal was completed to identify important macroscopic parameters of interest to the modeling and simulation of metallic fuel performance. A modification was made to the FEAST-Metal constituent redistribution model to enable accommodation of newer AFC metal fuel compositions, with verified results. The applicability of this modified model for sodium fast reactor metal fuel design is demonstrated.

  18. Effects of user mental state on EEG-BCI performance.

    PubMed

    Myrden, Andrew; Chau, Tom

    2015-01-01

    Changes in psychological state have been proposed as a cause of variation in brain-computer interface (BCI) performance, but little formal analysis has been conducted to support this hypothesis. In this study, we investigated the effects of three mental states (fatigue, frustration, and attention) on BCI performance. Twelve able-bodied participants were trained to use a two-class EEG-BCI based on the performance of user-specific mental tasks. Following training, participants completed three testing sessions, during which they used the BCI to play a simple maze navigation game while periodically reporting their perceived levels of fatigue, frustration, and attention. Statistical analysis indicated that there is a significant relationship between frustration and BCI performance, while the relationship between fatigue and BCI performance approached significance. BCI performance was 7% lower than average when self-reported fatigue was low and 7% higher than average when self-reported frustration was moderate. A multivariate analysis of mental state revealed the presence of contiguous regions in mental state space where BCI performance was more accurate than average, suggesting the importance of moderate fatigue for achieving effortless focus on BCI control, of frustration as a potential motivating factor, and of attention as a compensatory mechanism for increasing frustration. Finally, a visual analysis showed the sensitivity of the underlying class distributions to changes in mental state. Collectively, these results indicate that mental state is closely related to BCI performance, encouraging future development of psychologically adaptive BCIs.

  19. Develop Advanced Nonlinear Signal Analysis Topographical Mapping System

    NASA Technical Reports Server (NTRS)

    Jong, Jen-Yi

    1997-01-01

    During the development of the SSME, a hierarchy of advanced signal analysis techniques for mechanical signature analysis has been developed by NASA and AI Signal Research Inc. (ASRI) to improve the safety and reliability for Space Shuttle operations. These techniques can process and identify intelligent information hidden in a measured signal which is often unidentifiable using conventional signal analysis methods. Currently, due to the highly interactive processing requirements and the volume of dynamic data involved, detailed diagnostic analysis is being performed manually, which requires immense man-hours with extensive human interface. To overcome this manual process, NASA implemented this program to develop an Advanced nonlinear signal Analysis Topographical Mapping System (ATMS) to provide automatic/unsupervised engine diagnostic capabilities. The ATMS utilizes a rule-based CLIPS expert system to supervise a hierarchy of diagnostic signature analysis techniques in the Advanced Signal Analysis Library (ASAL). ASAL performs automatic signal processing, archiving, and anomaly detection/identification tasks in order to provide an intelligent and fully automated engine diagnostic capability. The ATMS has been successfully developed under this contract. In summary, the program objectives to design, develop, test and conduct performance evaluation for an automated engine diagnostic system have been successfully achieved. Software implementation of the entire ATMS system on MSFC's OISPS computer has been completed. The significance of the ATMS developed under this program is attributed to the fully automated coherence analysis capability for anomaly detection and identification, which can greatly enhance the power and reliability of engine diagnostic evaluation. The results have demonstrated that ATMS can significantly save time and man-hours in performing engine test/flight data analysis and performance evaluation of large volumes of dynamic test data.

  20. From recording discrete actions to studying continuous goal-directed behaviours in team sports.

    PubMed

    Correia, Vanda; Araújo, Duarte; Vilar, Luís; Davids, Keith

    2013-01-01

    This paper highlights the importance of examining interpersonal interactions in performance analysis of team sports, predicated on the relationship between perception and action, compared to the traditional cataloguing of actions by individual performers. We discuss how ecological dynamics may provide a potential unifying theoretical and empirical framework to achieve this re-emphasis in research. With reference to data from illustrative studies on performance analysis and sport expertise, we critically evaluate some of the main assumptions and methodological approaches with regard to understanding how information influences action and decision-making during team sports performance. Current data demonstrate how the understanding of performance behaviours in team sports by sport scientists and practitioners may be enhanced with a re-emphasis in research on the dynamics of emergent ongoing interactions. Ecological dynamics provides formal and theoretically grounded descriptions of player-environment interactions with respect to key performance goals and the unfolding information of competitive performance. Developing these formal descriptions and explanations of sport performance may provide a significant contribution to the field of performance analysis, supporting design and intervention in both research and practice.

  1. 2007 international meeting on Reduced Enrichment for Research and Test Reactors (RERTR). Abstracts and available papers presented at the meeting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2008-07-15

    The meeting papers discuss research and test reactor fuel performance, manufacturing, and testing. Some of the main topics are: conversion from HEU to LEU in different reactors and the corresponding problems and activities; flux performance and core lifetime analysis with HEU and LEU fuels; physics and safety characteristics; measurement of gamma field parameters in cores with LEU fuel; nondestructive analysis of RERTR fuel; thermal hydraulic analysis; fuel interactions; transient analyses and thermal hydraulics for HEU and LEU cores; microstructure of research reactor fuels; post irradiation analysis and performance; and computer codes and other related problems.

  2. Storage element performance optimization for CMS analysis jobs

    NASA Astrophysics Data System (ADS)

    Behrmann, G.; Dahlblom, J.; Guldmyr, J.; Happonen, K.; Lindén, T.

    2012-12-01

    Tier-2 computing sites in the Worldwide Large Hadron Collider Computing Grid (WLCG) host CPU resources (Compute Element, CE) and storage resources (Storage Element, SE). The vast amount of data that needs to be processed from the Large Hadron Collider (LHC) experiments requires good and efficient use of the available resources. Achieving good CPU efficiency for the end users' analysis jobs requires that the performance of the storage system scale with the I/O requests from hundreds or even thousands of simultaneous jobs. In this presentation we report on the work on improving the SE performance at the Helsinki Institute of Physics (HIP) Tier-2 used for the Compact Muon Solenoid (CMS) experiment at the LHC. Statistics from CMS grid jobs are collected and stored in the CMS Dashboard for further analysis, which allows for easy performance monitoring by the sites and by the CMS collaboration. As part of the monitoring framework, CMS used the JobRobot, which sent 100 analysis jobs to each site every four hours; CMS also uses the HammerCloud tool for site monitoring and stress testing, and it has replaced the JobRobot. The performance of the analysis workflow submitted with JobRobot or HammerCloud can be used to track the performance impact of site configuration changes, since the analysis workflow is kept the same for all sites and for months at a time. The CPU efficiency of the JobRobot jobs at HIP was increased by approximately 50% to more than 90%, by tuning the SE and by improvements in the CMSSW and dCache software. The performance of the CMS analysis jobs improved significantly too. Similar work has been done on other CMS Tier-2 sites, since on average the CPU efficiency for CMSSW jobs increased during 2011. Better monitoring of the SE allows faster detection of problems, so that the performance level can be kept high. The next storage upgrade at HIP consists of SAS disk enclosures, which can be stress tested on demand with HammerCloud workflows to make sure that the I/O performance is good.

  3. Meta-analysis on occupational exposure to pesticides--neurobehavioral impact and dose-response relationships.

    PubMed

    Meyer-Baron, Monika; Knapp, Guido; Schäper, Michael; van Thriel, Christoph

    2015-01-01

    While the health impact of high exposures to pesticides is acknowledged, the impact of chronic exposures in the absence of acute poisonings is controversial. A systematic analysis of dose-response relationships is still missing, and its absence may provoke alternative explanations for altered performances; consequently, opportunities for health prevention in the occupational and environmental field may be missed. The objectives were (1) quantification of the neurotoxic impact of pesticides by an analysis of functional alterations in workers measured by neuropsychological performance tests, (2) estimates of dose-response relationships on the basis of exposure duration, and (3) exploration of susceptible subgroups. The meta-analysis employed a random effects model to obtain overall effects for individual performance tests. Twenty-two studies with a total of 1758 exposed and 1260 reference individuals met the inclusion criteria. At least three independent outcomes were available for twenty-six performance variables. Significant performance effects were shown in adults and referred to both cognitive and motor performances. Effect sizes ranging from dRE=-0.14 to dRE=-0.67 showed consistent outcomes for memory and attention. Relationships between effect sizes and exposure duration were indicated for individual performance variables and for the total of measured performances. Studies on adolescents had to be analyzed separately due to numerous outliers; the large variation among outcomes hampered the analysis of susceptibility in this group, while data on female workers were too scant for analysis. Relationships exist between the impact of pesticides on performances and exposure duration. A change in test paradigms would help to decipher the impact more specifically, and the use of biomarkers appropriate for lower exposures would allow better prevention of neurotoxic effects due to occupational and environmental exposure. Intervention studies in adolescents seem warranted to specify their risk.

  4. Influence of Averaging Preprocessing on Image Analysis with a Markov Random Field Model

    NASA Astrophysics Data System (ADS)

    Sakamoto, Hirotaka; Nakanishi-Ohno, Yoshinori; Okada, Masato

    2018-02-01

    This paper describes our investigations into the influence of averaging preprocessing on the performance of image analysis. Averaging preprocessing involves a trade-off: image averaging is often undertaken to reduce noise while the number of image data available for image analysis is decreased. We formulated a process of generating image data by using a Markov random field (MRF) model to achieve image analysis tasks such as image restoration and hyper-parameter estimation by a Bayesian approach. According to the notions of Bayesian inference, posterior distributions were analyzed to evaluate the influence of averaging. There are three main results. First, we found that the performance of image restoration with a predetermined value for hyper-parameters is invariant regardless of whether averaging is conducted. We then found that the performance of hyper-parameter estimation deteriorates due to averaging. Our analysis of the negative logarithm of the posterior probability, which is called the free energy based on an analogy with statistical mechanics, indicated that the confidence of hyper-parameter estimation remains higher without averaging. Finally, we found that when the hyper-parameters are estimated from the data, the performance of image restoration worsens as averaging is undertaken. We conclude that averaging adversely influences the performance of image analysis through hyper-parameter estimation.
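    The kind of model analyzed above can be sketched in its simplest Gaussian form, reduced to 1D for brevity: MAP restoration minimizes sum((y_i - x_i)^2) + lam * sum((x_i - x_{i+1})^2), where lam plays the role of the smoothness hyper-parameter, and the quadratic minimum is found by Gauss-Seidel sweeps. The signal and noise level are synthetic, and this Gaussian sketch stands in for, rather than reproduces, the paper's MRF formulation.

```python
import random

def restore(y, lam=2.0, sweeps=300):
    """MAP estimate under a 1D Gaussian MRF prior via Gauss-Seidel sweeps."""
    x = list(y)
    n = len(x)
    for _ in range(sweeps):
        for i in range(n):
            neigh = ([x[i - 1]] if i > 0 else []) + ([x[i + 1]] if i < n - 1 else [])
            # Coordinate-wise minimum of (y_i - x_i)^2 + lam * sum (x_i - x_j)^2
            x[i] = (y[i] + lam * sum(neigh)) / (1.0 + lam * len(neigh))
    return x

random.seed(0)
clean = [1.0 if 20 <= i < 40 else 0.0 for i in range(60)]
noisy = [c + random.gauss(0.0, 0.3) for c in clean]
restored = restore(noisy)
```

    Averaging n_ave noisy copies before restoring corresponds to the preprocessing trade-off studied above: the effective noise variance drops while the number of data available for hyper-parameter estimation shrinks.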

  5. Response of the Alliance 1 Proof-of-Concept Airplane Under Gust Loads

    NASA Technical Reports Server (NTRS)

    Naser, A. S.; Pototzky, A. S.; Spain, C. V.

    2001-01-01

    This report presents the work performed by Lockheed Martin's Langley Program Office in support of NASA's Environmental Research Aircraft and Sensor Technology (ERAST) program. The primary purpose of this work was to develop and demonstrate a gust analysis method which accounts for the span-wise variation of gust velocity. This is important because these unmanned aircraft, with their high aspect ratios and low wing loadings, are very flexible and fly at low speeds. The main focus of the work was therefore to perform a two-dimensional Power Spectral Density (PSD) analysis of the Alliance 1 Proof-of-Concept Unmanned Aircraft. As of this writing, none of the aircraft described in this report have been constructed; they are concepts represented by analytical models. The process first involved the development of suitable structural and aeroelastic Finite Element Models (FEM). This was followed by development of a one-dimensional PSD gust analysis, and then the two-dimensional PSD analysis of the Alliance 1. For further validation and comparison, two additional analyses were performed. A two-dimensional PSD gust analysis was performed on a simple MSC/NASTRAN example problem. Finally, a one-dimensional discrete gust analysis was performed on Alliance 1. This report describes this process, shows the relevant comparisons between analytical methods, and discusses the physical meanings of the results.

  6. The Diagnostic Performance of Stool DNA Testing for Colorectal Cancer: A Systematic Review and Meta-Analysis.

    PubMed

    Zhai, Rong-Lin; Xu, Fei; Zhang, Pei; Zhang, Wan-Li; Wang, Hui; Wang, Ji-Liang; Cai, Kai-Lin; Long, Yue-Ping; Lu, Xiao-Ming; Tao, Kai-Xiong; Wang, Guo-Bin

    2016-02-01

    This meta-analysis was designed to evaluate the diagnostic performance of stool DNA testing for colorectal cancer (CRC) and compare the performance between single-gene and multiple-gene tests. MEDLINE, Cochrane, and EMBASE databases were searched using keywords colorectal cancers, stool/fecal, sensitivity, specificity, DNA, and screening. Sensitivity analysis, quality assessments, and performance bias were performed for the included studies. Fifty-three studies were included in the analysis with a total sample size of 7524 patients. The studies were heterogeneous with regard to the genes being analyzed for fecal genetic biomarkers of CRC, as well as the laboratory methods being used for each assay. The sensitivity of the different assays ranged from 2% to 100% and the specificity ranged from 81% to 100%. The meta-analysis found that the pooled sensitivities for single- and multigene assays were 48.0% and 77.8%, respectively, while the pooled specificities were 97.0% and 92.7%. Receiver operator curves and diagnostic odds ratios showed no significant difference between both tests with regard to sensitivity or specificity. This meta-analysis revealed that using assays that evaluated multiple genes compared with single-gene assays did not increase the sensitivity or specificity of stool DNA testing in detecting CRC.
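The per-study quantities entering such a pooling come from each study's 2x2 table. A minimal sketch, with made-up counts (not data from any included study):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and diagnostic odds ratio from a 2x2 table."""
    sens = tp / (tp + fn)          # true positives among diseased
    spec = tn / (tn + fp)          # true negatives among disease-free
    dor = (tp * tn) / (fp * fn)    # assumes no zero cells (else add 0.5 to each)
    return sens, spec, dor

# Hypothetical multigene stool-DNA study: 70/90 cancers and 186/200 controls correct
sens, spec, dor = diagnostic_metrics(tp=70, fp=14, fn=20, tn=186)
print(f"sensitivity={sens:.3f}, specificity={spec:.3f}, DOR={dor:.1f}")
```

Pooling then combines these per-study values (often on the logit scale) weighted by their precision, which is how the 77.8% / 92.7% multigene figures above arise from heterogeneous individual studies.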

  7. Frame synchronization performance and analysis

    NASA Technical Reports Server (NTRS)

    Aguilera, C. S. R.; Swanson, L.; Pitt, G. H., III

    1988-01-01

    The analysis used to generate the theoretical models showing the performance of the frame synchronizer is described for various frame lengths and marker lengths at various signal-to-noise ratios and bit error tolerances.
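The sliding-correlation idea behind such a synchronizer can be sketched directly: slide the marker across the bit stream and accept any alignment with at most a given number of bit errors. The marker pattern and frame contents below are made up for illustration:

```python
import numpy as np

def find_marker(bits, marker, tol):
    """Return all offsets where `marker` matches `bits` with <= tol bit errors."""
    bits, marker = np.asarray(bits), np.asarray(marker)
    m = len(marker)
    return [i for i in range(len(bits) - m + 1)
            if np.count_nonzero(bits[i:i + m] != marker) <= tol]

marker = np.array([1, 0, 1, 1, 0, 0, 1, 0])
frame = np.zeros(24, dtype=int)
frame[5:13] = marker
frame[6] ^= 1                              # one channel-induced bit error in the marker
print(find_marker(frame, marker, tol=1))   # → [5]
```

Raising the tolerance improves acquisition at low signal-to-noise ratio but increases the chance of false synchronization on data that happens to resemble the marker, which is exactly the trade-off the theoretical models quantify.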

  8. Increasing Transparency Through a Multiverse Analysis.

    PubMed

    Steegen, Sara; Tuerlinckx, Francis; Gelman, Andrew; Vanpaemel, Wolf

    2016-09-01

    Empirical research inevitably includes constructing a data set by processing raw data into a form ready for statistical analysis. Data processing often involves choices among several reasonable options for excluding, transforming, and coding data. We suggest that instead of performing only one analysis, researchers could perform a multiverse analysis, which involves performing all analyses across the whole set of alternatively processed data sets corresponding to a large set of reasonable scenarios. Using an example focusing on the effect of fertility on religiosity and political attitudes, we show that analyzing a single data set can be misleading and propose a multiverse analysis as an alternative practice. A multiverse analysis offers an idea of how much the conclusions change because of arbitrary choices in data construction and gives pointers as to which choices are most consequential in the fragility of the result. © The Author(s) 2016.
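The mechanics of a multiverse analysis amount to enumerating the cross-product of processing choices and running the same analysis in every cell. A minimal sketch with toy data and hypothetical choices (the rules and values below are invented, not from the fertility study):

```python
from itertools import product
from math import log
from statistics import mean

# Toy raw data: (outcome, age) pairs; all values and rules are illustrative.
raw = [(3.1, 17), (2.4, 25), (5.9, 31), (4.2, 44), (8.8, 52), (3.5, 61)]

exclusion_rules = {
    "keep_all":    lambda value, age: True,
    "adults_only": lambda value, age: age >= 18,
}
transforms = {
    "raw": lambda value: value,
    "log": lambda value: log(value),
}

# One analysis per cell of the multiverse of processing choices.
results = {}
for (ex_name, keep), (tr_name, tr) in product(exclusion_rules.items(), transforms.items()):
    processed = [tr(v) for v, age in raw if keep(v, age)]
    results[(ex_name, tr_name)] = mean(processed)

for choice, m in sorted(results.items()):
    print(choice, round(m, 3))
```

Inspecting how the estimate varies across cells shows which processing choices the conclusion is fragile to, which is the transparency gain the authors argue for.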

  9. Automatically visualise and analyse data on pathways using PathVisioRPC from any programming environment.

    PubMed

    Bohler, Anwesha; Eijssen, Lars M T; van Iersel, Martijn P; Leemans, Christ; Willighagen, Egon L; Kutmon, Martina; Jaillard, Magali; Evelo, Chris T

    2015-08-23

    Biological pathways are descriptive diagrams of biological processes widely used for functional analysis of differentially expressed genes or proteins. Primary data analysis, such as quality control, normalisation, and statistical analysis, is often performed in scripting languages like R, Perl, and Python. Subsequent pathway analysis is usually performed using dedicated external applications. Workflows involving manual use of multiple environments are time consuming and error prone. Therefore, tools are needed that enable pathway analysis directly within the same scripting languages used for primary data analyses. Existing tools have limited capability in terms of available pathway content, pathway editing and visualisation options, and export file formats. Consequently, making the full-fledged pathway analysis tool PathVisio available from various scripting languages will benefit researchers. We developed PathVisioRPC, an XMLRPC interface for the pathway analysis software PathVisio. PathVisioRPC enables creating and editing biological pathways, visualising data on pathways, performing pathway statistics, and exporting results in several image formats in multiple programming environments. We demonstrate PathVisioRPC functionalities using examples in Python. Subsequently, we analyse a publicly available NCBI GEO gene expression dataset studying tumour bearing mice treated with cyclophosphamide in R. The R scripts demonstrate how calls to existing R packages for data processing and calls to PathVisioRPC can directly work together. To further support R users, we have created RPathVisio simplifying the use of PathVisioRPC in this environment. We have also created a pathway module for the microarray data analysis portal ArrayAnalysis.org that calls the PathVisioRPC interface to perform pathway analysis. 
This module allows users to use PathVisio functionality online without having to download and install the software and exemplifies how the PathVisioRPC interface can be used by data analysis pipelines for functional analysis of processed genomics data. PathVisioRPC enables data visualisation and pathway analysis directly from within various analytical environments used for preliminary analyses. It supports the use of existing pathways from WikiPathways or pathways created using the RPC itself. It also enables automation of tasks performed using PathVisio, making it useful to PathVisio users performing repeated visualisation and analysis tasks. PathVisioRPC is freely available for academic and commercial use at http://projects.bigcat.unimaas.nl/pathvisiorpc.

  10. Evolutionary space platform concept study. Volume 2, part B: Manned space platform concepts

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Logical, cost-effective steps in the evolution of manned space platforms are investigated and assessed. Tasks included the analysis of requirements for a manned space platform, identifying alternative concepts, performing system analysis and definition of the concepts, comparing the concepts and performing programmatic analysis for a reference concept.

  11. 77 FR 75173 - Comprehensive Assessment of the Process for the Review of Device Submissions; Request for Comments

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-19

    ... performing the technical analysis, management assessment, and program evaluation tasks required to address... premarket reviews that meet regulatory review standards. 2. Analysis of elements of the review process... process. This includes analysis of root causes for inefficiencies that may affect review performance and...

  12. Analysis of rosen piezoelectric transformers with a varying cross-section.

    PubMed

    Xue, H; Yang, J; Hu, Y

    2008-07-01

    We study the effects of a varying cross-section on the performance of Rosen piezoelectric transformers operating with length extensional modes of rods. A theoretical analysis is performed using an extended version of a one-dimensional model developed in a previous paper. Numerical results based on the theoretical analysis are presented.

  13. An Ideal Observer Analysis of Visual Working Memory

    ERIC Educational Resources Information Center

    Sims, Chris R.; Jacobs, Robert A.; Knill, David C.

    2012-01-01

    Limits in visual working memory (VWM) strongly constrain human performance across many tasks. However, the nature of these limits is not well understood. In this article we develop an ideal observer analysis of human VWM by deriving the expected behavior of an optimally performing but limited-capacity memory system. This analysis is framed around…

  14. 41 CFR 102-80.130 - Who must perform the equivalent level of safety analysis?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire Prevention Equivalent Level of Safety Analysis... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false Who must perform the equivalent level of safety analysis? 102-80.130 Section 102-80.130 Public Contracts and Property Management...

  15. An analysis for high speed propeller-nacelle aerodynamic performance prediction. Volume 1: Theory and application

    NASA Technical Reports Server (NTRS)

    Egolf, T. Alan; Anderson, Olof L.; Edwards, David E.; Landgrebe, Anton J.

    1988-01-01

    A computer program, the Propeller Nacelle Aerodynamic Performance Prediction Analysis (PANPER), was developed for the prediction and analysis of the performance and airflow of propeller-nacelle configurations operating over a forward speed range inclusive of high speed flight typical of recent propfan designs. A propeller lifting line, wake program was combined with a compressible, viscous center body interaction program, originally developed for diffusers, to compute the propeller-nacelle flow field, blade loading distribution, propeller performance, and the nacelle forebody pressure and viscous drag distributions. The computer analysis is applicable to single and coaxial counterrotating propellers. The blade geometries can include spanwise variations in sweep, droop, taper, thickness, and airfoil section type. In the coaxial mode of operation the analysis can treat both equal and unequal blade number and rotational speeds on the propeller disks. The nacelle portion of the analysis can treat both free air and tunnel wall configurations including wall bleed. The analysis was applied to many different sets of flight conditions using selected aerodynamic modeling options. The influence of different propeller nacelle-tunnel wall configurations was studied. Comparisons with available test data for both single and coaxial propeller configurations are presented along with a discussion of the results.

  16. Frequency Spectrum Method-Based Stress Analysis for Oil Pipelines in Earthquake Disaster Areas

    PubMed Central

    Wu, Xiaonan; Lu, Hongfang; Huang, Kun; Wu, Shijuan; Qiao, Weibiao

    2015-01-01

    When a long distance oil pipeline crosses an earthquake disaster area, inertial force and strong ground motion can cause the pipeline stress to exceed the failure limit, resulting in bending and deformation failure. To date, researchers have performed limited safety analyses of oil pipelines in earthquake disaster areas that include stress analysis. Therefore, using the spectrum method and theory of one-dimensional beam units, CAESAR II is used to perform a dynamic earthquake analysis for an oil pipeline in the XX earthquake disaster area. This software is used to determine if the displacement and stress of the pipeline meet the standards when subjected to a strong earthquake. After performing the numerical analysis, the primary seismic action axial, longitudinal and horizontal displacement directions and the critical section of the pipeline can be located. Feasible project enhancement suggestions based on the analysis results are proposed. The designer is able to utilize this stress analysis method to perform an ultimate design for an oil pipeline in earthquake disaster areas; therefore, improving the safe operation of the pipeline. PMID:25692790

  17. Study of Solid State Drives performance in PROOF distributed analysis system

    NASA Astrophysics Data System (ADS)

    Panitkin, S. Y.; Ernst, M.; Petkus, R.; Rind, O.; Wenaus, T.

    2010-04-01

    Solid State Drives (SSDs) are a promising storage technology for High Energy Physics parallel analysis farms. Their combination of low random access time and relatively high read speed is very well suited for situations where multiple jobs concurrently access data located on the same drive. SSDs also have lower energy consumption and higher vibration tolerance than Hard Disk Drives (HDDs), which makes them an attractive choice in many applications ranging from personal laptops to large analysis farms. The Parallel ROOT Facility (PROOF) is a distributed analysis system that exploits the inherent event-level parallelism of high energy physics data. PROOF is especially efficient together with distributed local storage systems like Xrootd, when data are distributed over computing nodes. In such an architecture the local disk subsystem I/O performance becomes a critical factor, especially when computing nodes use multi-core CPUs. We will discuss our experience with SSDs in the PROOF environment. We will compare the performance of HDDs with SSDs in I/O intensive analysis scenarios. In particular we will discuss PROOF system performance scaling with the number of simultaneously running analysis jobs.

  18. Frequency spectrum method-based stress analysis for oil pipelines in earthquake disaster areas.

    PubMed

    Wu, Xiaonan; Lu, Hongfang; Huang, Kun; Wu, Shijuan; Qiao, Weibiao

    2015-01-01

    When a long distance oil pipeline crosses an earthquake disaster area, inertial force and strong ground motion can cause the pipeline stress to exceed the failure limit, resulting in bending and deformation failure. To date, researchers have performed limited safety analyses of oil pipelines in earthquake disaster areas that include stress analysis. Therefore, using the spectrum method and theory of one-dimensional beam units, CAESAR II is used to perform a dynamic earthquake analysis for an oil pipeline in the XX earthquake disaster area. This software is used to determine if the displacement and stress of the pipeline meet the standards when subjected to a strong earthquake. After performing the numerical analysis, the primary seismic action axial, longitudinal and horizontal displacement directions and the critical section of the pipeline can be located. Feasible project enhancement suggestions based on the analysis results are proposed. The designer is able to utilize this stress analysis method to perform an ultimate design for an oil pipeline in earthquake disaster areas; therefore, improving the safe operation of the pipeline.

  19. Dynamic Systems Analysis for Turbine Based Aero Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.

    2016-01-01

    The aircraft engine design process seeks to optimize the overall system-level performance, weight, and cost for a given concept. Steady-state simulations and data are used to identify trade-offs that should be balanced to optimize the system in a process known as systems analysis. These systems analysis simulations and data may not adequately capture the true performance trade-offs that exist during transient operation. Dynamic systems analysis provides the capability for assessing the dynamic trade-offs at an earlier stage of the engine design process. The dynamic systems analysis concept, developed tools, and potential benefits are presented in this paper. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) was developed to give the user an estimate of the closed-loop performance (response time) and operability (high pressure compressor surge margin) for a given engine design and set of control design requirements. TTECTrA, along with engine deterioration information, can be used to develop a more generic relationship between performance and operability that can impact the engine design constraints and potentially lead to a more efficient engine.

  20. Experiment Design and Analysis Guide - Neutronics & Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Misti A Lillo

    2014-06-01

    The purpose of this guide is to provide a consistent, standardized approach to performing neutronics/physics analysis for experiments inserted into the Advanced Test Reactor (ATR). This document provides neutronics/physics analysis guidance to support experiment design and analysis needs for experiments irradiated in the ATR. This guide addresses neutronics/physics analysis in support of experiment design, experiment safety, and experiment program objectives and goals. The intent of this guide is to provide a standardized approach for performing typical neutronics/physics analyses. Deviation from this guide is allowed provided that neutronics/physics analysis details are properly documented in an analysis report.

  1. LOX/LH2 vane pump for auxiliary propulsion systems

    NASA Technical Reports Server (NTRS)

    Hemminger, J. A.; Ulbricht, T. E.

    1985-01-01

    Positive displacement pumps offer potential efficiency advantages over centrifugal pumps for future low thrust space missions. Low flow rate applications, such as space station auxiliary propulsion or dedicated low thrust orbiter transfer vehicles, are typical of missions where low flow and high head rise challenge centrifugal pumps. The positive displacement vane pump for pumping of LOX and LH2 is investigated. This effort has included: (1) a testing program in which pump performance was investigated for differing pump clearances and for differing pump materials while pumping LN2, LOX, and LH2; and (2) an analysis effort, in which a comprehensive pump performance analysis computer code was developed and exercised. An overview of the theoretical framework of the performance analysis computer code is presented, along with a summary of analysis results. Experimental results are presented for the pump operating in liquid nitrogen. Included are data on the effects of pump clearance, speed, and pressure rise on pump performance. Pump suction performance is also presented.

  2. Movement analysis of upper limb during resistance training using general purpose robot arm "PA10"

    NASA Astrophysics Data System (ADS)

    Morita, Yoshifumi; Yamamoto, Takashi; Suzuki, Takahiro; Hirose, Akinori; Ukai, Hiroyuki; Matsui, Nobuyuki

    2005-12-01

    In this paper we perform movement analysis of an upper limb during resistance training. We selected sanding training, which is one type of resistance training for upper limbs widely performed in occupational therapy. Our long-term aims are to quantitatively evaluate the therapeutic effect on upper limb motor function during training and to develop a new rehabilitation training support system. For these purposes, we first perform movement analysis using a conventional training tool, extracting features from measurements of upper limb motion during sanding training. Next, we perform movement analysis using a simulated sanding training system. This system is constructed using the general purpose robot arm "PA10"; it enables us to measure the force/torque exerted by subjects and to easily change the resistance load. The control algorithm is based on impedance control. We identified the characteristic features of upper limb motion during the sanding training.

  3. Design and performance of an analysis-by-synthesis class of predictive speech coders

    NASA Technical Reports Server (NTRS)

    Rose, Richard C.; Barnwell, Thomas P., III

    1990-01-01

    The performance of a broad class of analysis-by-synthesis linear predictive speech coders is quantified experimentally. The class of coders includes a number of well-known techniques as well as a very large number of speech coders which have not been named or studied. A general formulation for deriving the parametric representation used in all of the coders in the class is presented. A new coder, named the self-excited vocoder, is discussed because of its good performance with low complexity, and because of the insight this coder gives to analysis-by-synthesis coders in general. The results of a study comparing the performances of different members of this class are presented. The study takes the form of a series of formal subjective and objective speech quality tests performed on selected coders. The results of this study lead to some interesting and important observations concerning the controlling parameters for analysis-by-synthesis speech coders.
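The analysis-by-synthesis principle shared by all coders in the class can be sketched in a few lines: each candidate excitation is passed through the synthesis (all-pole) filter, and the one minimizing the reconstruction error against the target frame is selected. The filter order, coefficients, and codebook below are arbitrary stand-ins, not parameters from the paper:

```python
import numpy as np

def synthesize(exc, a):
    """All-pole synthesis filter: s[n] = exc[n] + sum_k a[k] * s[n-1-k]."""
    s = np.zeros(len(exc))
    for n in range(len(exc)):
        s[n] = exc[n] + sum(a[k] * s[n - 1 - k] for k in range(len(a)) if n - 1 - k >= 0)
    return s

def best_excitation(target, codebook, a):
    """Analysis-by-synthesis search: pick the excitation whose synthesized
    output is closest (squared error) to the target frame."""
    errors = [float(np.sum((target - synthesize(e, a)) ** 2)) for e in codebook]
    return int(np.argmin(errors)), errors

a = [0.5]                                    # first-order predictor (illustrative)
codebook = [np.array([1.0, 0.0, 0.0, 0.0]),
            np.array([0.0, 1.0, 0.0, 0.0]),
            np.array([1.0, 0.0, 0.0, 1.0])]
target = synthesize(codebook[2], a)          # a frame producible by entry 2
idx, errs = best_excitation(target, codebook, a)
print(idx)   # → 2
```

The members of the class differ mainly in how the excitation codebook is constructed (multipulse, stochastic, or, as in the self-excited vocoder, drawn from past excitation), while this search loop stays the same.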

  4. Performance analysis of distributed applications using automatic classification of communication inefficiencies

    DOEpatents

    Vetter, Jeffrey S.

    2005-02-01

    The method and system described herein present a technique for performance analysis that helps users understand the communication behavior of their message passing applications. The method and system described herein may automatically classify individual communication operations and reveal the cause of communication inefficiencies in the application. This classification allows the developer to quickly focus on the culprits of truly inefficient behavior, rather than manually foraging through massive amounts of performance data. Specifically, the method and system described herein trace the message operations of Message Passing Interface (MPI) applications and then classify each individual communication event using a supervised learning technique: decision tree classification. The decision tree may be trained using microbenchmarks that demonstrate both efficient and inefficient communication. Since the method and system described herein adapt to the target system's configuration through these microbenchmarks, they simultaneously automate the performance analysis process and improve classification accuracy. The method and system described herein may improve the accuracy of performance analysis and dramatically reduce the amount of data that users must encounter.
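The supervised-classification step can be illustrated with a deliberately tiny stand-in: a one-level decision tree (stump) trained on synthetic "microbenchmark" latencies labeled efficient/inefficient. The patent describes full decision trees on traced MPI events; everything below (feature, values, labels) is a toy assumption:

```python
def train_stump(samples):
    """samples: (latency_per_byte, inefficient?) pairs. Return the threshold
    minimizing training errors for the rule
    'predict inefficient when latency > threshold'."""
    best_t, best_err = None, None
    for t, _ in sorted(samples):
        errs = sum((x > t) != label for x, label in samples)
        if best_err is None or errs < best_err:
            best_t, best_err = t, errs
    return best_t

# Synthetic microbenchmark: low latencies are efficient, high ones are not.
bench = [(1.0, False), (2.0, False), (3.0, False), (10.0, True), (12.0, True)]
t = train_stump(bench)

def classify(latency):
    return "inefficient" if latency > t else "efficient"

print(classify(5.0), classify(2.5))   # → inefficient efficient
```

Because the threshold is learned from microbenchmarks run on the target system, the same classifier adapts automatically to machines with different baseline communication costs, which is the adaptivity the patent claims.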

  5. How to Perform an Ethical Risk Analysis (eRA).

    PubMed

    Hansson, Sven Ove

    2018-02-26

    Ethical analysis is often needed in the preparation of policy decisions on risk. A three-step method is proposed for performing an ethical risk analysis (eRA). In the first step, the people concerned are identified and categorized in terms of the distinct but compatible roles of being risk-exposed, a beneficiary, or a decisionmaker. In the second step, a more detailed classification of roles and role combinations is performed, and ethically problematic role combinations are identified. In the third step, further ethical deliberation takes place, with an emphasis on individual risk-benefit weighing, distributional analysis, rights analysis, and power analysis. Ethical issues pertaining to subsidiary risk roles, such as those of experts and journalists, are also treated in this phase. An eRA should supplement, not replace, a traditional risk analysis that puts emphasis on the probabilities and severities of undesirable events but does not cover ethical issues such as agency, interpersonal relationships, and justice. © 2018 Society for Risk Analysis.

  6. Note on Professor Sizer's Paper.

    ERIC Educational Resources Information Center

    Balderston, Frederick E.

    1979-01-01

    Issues suggested by John Sizer's paper, an overview of the assessment of institutional performance, include: the efficient-frontier approach, multiple-criterion decision-making models, performance analysis approached as path analysis, and assessment of academic quality. (JMD)

  7. DistributedFBA.jl: High-level, high-performance flux balance analysis in Julia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heirendt, Laurent; Thiele, Ines; Fleming, Ronan M. T.

    Flux balance analysis and its variants are widely used methods for predicting steady-state reaction rates in biochemical reaction networks. The exploration of high dimensional networks with such methods is currently hampered by software performance limitations. DistributedFBA.jl is a high-level, high-performance, open-source implementation of flux balance analysis in Julia. It is tailored to solve multiple flux balance analyses on a subset or all the reactions of large and huge-scale networks, on any number of threads or nodes.

  8. DistributedFBA.jl: High-level, high-performance flux balance analysis in Julia

    DOE PAGES

    Heirendt, Laurent; Thiele, Ines; Fleming, Ronan M. T.

    2017-01-16

    Flux balance analysis and its variants are widely used methods for predicting steady-state reaction rates in biochemical reaction networks. The exploration of high dimensional networks with such methods is currently hampered by software performance limitations. DistributedFBA.jl is a high-level, high-performance, open-source implementation of flux balance analysis in Julia. It is tailored to solve multiple flux balance analyses on a subset or all the reactions of large and huge-scale networks, on any number of threads or nodes.
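Each individual flux balance analysis is a linear program: maximize c·v subject to S·v = 0 and bounds on the fluxes v. A toy two-reaction network is sketched below with SciPy's LP solver; DistributedFBA.jl distributes many such solves across Julia workers, so this Python fragment only illustrates the underlying problem, not the package's API:

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: R1 imports metabolite A (v1 <= 10), R2 consumes A as "biomass".
S = np.array([[1.0, -1.0]])          # steady state: v1 - v2 = 0
c = np.array([0.0, -1.0])            # linprog minimizes, so maximize v2 via -v2
bounds = [(0.0, 10.0), (0.0, 100.0)]

res = linprog(c, A_eq=S, b_eq=[0.0], bounds=bounds, method="highs")
print(res.x)   # optimal flux distribution: v1 = v2 = 10
```

Flux variability analysis, one of the package's target workloads, simply repeats this solve with each reaction in turn as the objective, which is why the problem parallelizes so naturally over threads and nodes.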

  9. The Causality Study of External Environment Analysis (EEA), Internal Environment Analysis (IEA), Strategy Implementation on Study Program Performance at Vocational High School (VHS) in Nias Archipelago, Indonesia

    ERIC Educational Resources Information Center

    Waruwu, Binahati; Sitompul, Harun; Manullang, Belferik

    2016-01-01

    The purposes of this study are to find out the significant effect of: (1) EEA on strategy implementation, (2) IEA on strategy implementation, (3) EEA on study program performance, (4) IEA on study program performance, and (5) strategy implementation on study program performance of Vocational High School (VHS) in Nias Archipelago. The population of…

  10. Exact Performance Analysis of Two Distributed Processes with Multiple Synchronization Points.

    DTIC Science & Technology

    1987-05-01

    ...number of processes with straight-line sequences of semaphore operations. We use the geometric model for performance analysis, in contrast to proving...

  11. Space Shuttle Main Engine structural analysis and data reduction/evaluation. Volume 3A: High pressure oxidizer turbo-pump preburner pump housing stress analysis report

    NASA Technical Reports Server (NTRS)

    Shannon, Robert V., Jr.

    1989-01-01

    The model generation and structural analysis performed for the High Pressure Oxidizer Turbopump (HPOTP) preburner pump volute housing located on the main pump end of the HPOTP in the space shuttle main engine are summarized. An ANSYS finite element model of the volute housing was built and executed, and a static structural analysis was performed on the Engineering Analysis and Data System (EADS) Cray-XMP supercomputer.

  12. Performance bounds on parallel self-initiating discrete-event simulations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1990-01-01

    The use of massively parallel architectures to execute discrete-event simulations of what are termed self-initiating models is considered. A logical process in a self-initiating model schedules its own state re-evaluation times, independently of any other logical process, and sends its new state to other logical processes following the re-evaluation. The interest is in the effects of that communication on synchronization. The performance of various synchronization protocols is considered by deriving upper and lower bounds on optimal performance, upper bounds on Time Warp's performance, and lower bounds on the performance of a new conservative protocol. The analysis of Time Warp includes the overhead costs of state-saving and rollback. The analysis points out sufficient conditions for the conservative protocol to outperform Time Warp. The analysis also quantifies the sensitivity of performance to message fan-out, lookahead ability, and the probability distributions underlying the simulation.

  13. A Finite Rate Chemical Analysis of Nitric Oxide Flow Contamination Effects on Scramjet Performance

    NASA Technical Reports Server (NTRS)

    Cabell, Karen F.; Rock, Kenneth E.

    2003-01-01

    The level of nitric oxide contamination in the test gas of the Langley Research Center Arc-Heated Scramjet Test Facility and the effect of the contamination on scramjet test engine performance were investigated analytically. A finite rate chemical analysis was performed to determine the levels of nitric oxide produced in the facility at conditions corresponding to Mach 6 to 8 flight simulations. Results indicate that nitric oxide levels range from one to three mole percent, corroborating previously obtained measurements. A three-stream combustor code with finite rate chemistry was used to investigate the effects of nitric oxide on scramjet performance. Results indicate that nitric oxide in the test gas causes a small increase in heat release and thrust performance for the test conditions investigated. However, a rate constant uncertainty analysis suggests that the effect of nitric oxide ranges from no net effect, to an increase of about 10 percent in thrust performance.

  14. Performance Analysis: Control of Hazardous Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Grange, Connie E.; Freeman, Jeff W.; Kerr, Christine E.

    2010-10-06

    LLNL experienced 26 occurrences related to the control of hazardous energy from January 1, 2008 through August 2010. These occurrences were 17% of the total number of reported occurrences during this 32-month period. The Performance Analysis and Reporting Section of the Contractor Assurance Office (CAO) routinely analyzes reported occurrences and issues looking for patterns that may indicate changes in LLNL’s performance and early indications of performance trends. It became apparent through these analyses that LLNL might have experienced a change in the control of hazardous energy and that these occurrences should be analyzed in more detail to determine if the perceived change in performance was real, whether that change is significant, and if the causes of the occurrences are similar. This report documents the results of this more detailed analysis.

  15. Analysis of high vacuum systems using SINDA'85

    NASA Technical Reports Server (NTRS)

    Spivey, R. A.; Clanton, S. E.; Moore, J. D.

    1993-01-01

    The theory, algorithms, and test data correlation analysis of a math model developed to predict performance of the Space Station Freedom Vacuum Exhaust System are presented. The theory used to predict the flow characteristics of viscous, transition, and molecular flow is presented in detail. Development of user subroutines which predict the flow characteristics in conjunction with the SINDA'85/FLUINT analysis software are discussed. The resistance-capacitance network approach with application to vacuum system analysis is demonstrated and results from the model are correlated with test data. The model was developed to predict the performance of the Space Station Freedom Vacuum Exhaust System. However, the unique use of the user subroutines developed in this model and written into the SINDA'85/FLUINT thermal analysis model provides a powerful tool that can be used to predict the transient performance of vacuum systems and gas flow in tubes of virtually any geometry. This can be accomplished using a resistance-capacitance (R-C) method very similar to the methods used to perform thermal analyses.
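
    The resistance-capacitance analogy described above can be sketched in a few lines: gas volumes play the role of capacitors and tube flow resistances the role of resistors, so node pressures evolve like node temperatures in a thermal network. The Python sketch below is not part of the SINDA'85/FLUINT model, and all numbers are illustrative; it simply integrates such a pressure network explicitly.

```python
# Hypothetical sketch of the resistance-capacitance (R-C) network approach:
# gas volumes act as capacitors and tube conductances as resistors, so the
# pressure network can be integrated like a thermal model.

def simulate_rc_network(pressures, capacitance, resistance, dt, steps):
    """Explicit-Euler integration of dP_i/dt = sum_j (P_j - P_i) / (R_ij * C_i).

    pressures   -- list of node pressures
    capacitance -- list of node "capacitances" (volume-derived terms)
    resistance  -- dict mapping (i, j) node pairs to flow resistance
    """
    p = list(pressures)
    n = len(p)
    for _ in range(steps):
        dp = [0.0] * n
        for (i, j), r in resistance.items():
            flow = (p[j] - p[i]) / r       # flow driven by pressure difference
            dp[i] += flow / capacitance[i]
            dp[j] -= flow / capacitance[j]
        p = [pi + dt * dpi for pi, dpi in zip(p, dp)]
    return p

# Two chambers joined by one tube: pressures relax toward a common value.
final = simulate_rc_network([100.0, 0.0], [1.0, 1.0], {(0, 1): 5.0}, 0.01, 10000)
```

    With equal capacitances the two nodes equilibrate at the capacitance-weighted mean pressure, which is the conservation property a vacuum-system R-C model relies on.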

  16. Performance Metrics Development Analysis for Information and Communications Technology Outsourcing: A Case Study

    ERIC Educational Resources Information Center

    Travis, James L., III

    2014-01-01

    This study investigated how and to what extent the development and use of the OV-5a operational architecture decomposition tree (OADT) from the Department of Defense (DoD) Architecture Framework (DoDAF) affects requirements analysis with respect to complete performance metrics for performance-based services acquisition of ICT under rigid…

  17. Task Analysis for Health Occupations. Cluster: Nursing. Occupation: Geriatric Aide. Education for Employment Task Lists.

    ERIC Educational Resources Information Center

    Lake County Area Vocational Center, Grayslake, IL.

    This task analysis for nursing education provides performance standards, steps to be followed, knowledge required, attitudes to be developed, safety procedures, and equipment and supplies needed for 13 tasks performed by geriatric aides in the duty area of performing diagnostic measures and for 30 tasks in the duty area of providing therapeutic…

  18. Application of Data Envelopment Analysis on the Indicators Contributing to Learning and Teaching Performance

    ERIC Educational Resources Information Center

    Montoneri, Bernard; Lin, Tyrone T.; Lee, Chia-Chi; Huang, Shio-Ling

    2012-01-01

    This paper applies data envelopment analysis (DEA) to explore the quantitative relative efficiency of 18 classes of freshmen students studying a course of English conversation in a university of Taiwan from the academic year 2004-2006. A diagram of teaching performance improvement mechanism is designed to identify key performance indicators for…

  19. Effect analysis of design variables on the disc in a double-eccentric butterfly valve.

    PubMed

    Kang, Sangmo; Kim, Da-Eun; Kim, Kuk-Kyeom; Kim, Jun-Oh

    2014-01-01

    We have performed a shape optimization of the disc in an industrial double-eccentric butterfly valve using the effect analysis of design variables to enhance the valve performance. For the optimization, we select three performance quantities (pressure drop, maximum stress, and mass) as the responses and three dimensions regarding the disc shape as the design variables. Subsequently, we compose a layout of orthogonal array (L16) by performing numerical simulations on the flow and structure using a commercial package, ANSYS v13.0, and then perform an effect analysis of the design variables on the responses using the design of experiments. Finally, we formulate a multiobjective function consisting of the three responses and then propose an optimal combination of the design variables to maximize the valve performance. Simulation results show that the disc thickness has the most significant effect on the performance and that the optimal design provides better performance than the initial design.

  20. Relative performance of academic departments using DEA with sensitivity analysis.

    PubMed

    Tyagi, Preeti; Yadav, Shiv Prasad; Singh, S P

    2009-05-01

    The process of liberalization and globalization of the Indian economy has brought new opportunities and challenges in all areas of human endeavor, including education. Educational institutions have to adopt new strategies to make the best use of the opportunities and counter the challenges. One of these challenges is how to assess the performance of academic programs based on multiple criteria. Keeping this in view, this paper attempts to evaluate the performance efficiencies of 19 academic departments of IIT Roorkee (India) through the data envelopment analysis (DEA) technique. The technique has been used to assess the performance of academic institutions in a number of countries, such as the USA, UK, and Australia, but to the best of our knowledge this is its first application in the Indian context. Applying DEA models, we calculate technical, pure technical, and scale efficiencies and identify the reference sets for inefficient departments. Input and output projections are also suggested for inefficient departments to reach the frontier. Overall performance, research performance, and teaching performance are assessed separately using sensitivity analysis.
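
    As a concrete illustration of the DEA technique this record describes, the sketch below solves the input-oriented CCR envelopment linear program for each decision-making unit with SciPy. The three-unit dataset is invented for illustration and has no relation to the IIT Roorkee data.

```python
# Minimal input-oriented CCR DEA sketch (hypothetical data): each unit's
# efficiency is the optimal theta of a small linear program.

import numpy as np
from scipy.optimize import linprog

def dea_efficiency(inputs, outputs, k):
    """Return the CCR input-oriented efficiency of unit k.

    inputs  -- (n_units, n_inputs) array, e.g. faculty count, budget
    outputs -- (n_units, n_outputs) array, e.g. publications, graduates
    """
    n, m = inputs.shape
    s = outputs.shape[1]
    c = np.zeros(1 + n)
    c[0] = 1.0                                   # minimize theta
    A_ub, b_ub = [], []
    for i in range(m):                           # sum_j lam_j * x_ij <= theta * x_ik
        A_ub.append(np.r_[-inputs[k, i], inputs[:, i]])
        b_ub.append(0.0)
    for r in range(s):                           # sum_j lam_j * y_rj >= y_rk
        A_ub.append(np.r_[0.0, -outputs[:, r]])
        b_ub.append(-outputs[k, r])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + n))
    return res.x[0]

x = np.array([[2.0], [4.0], [8.0]])              # single input per unit
y = np.array([[2.0], [4.0], [4.0]])              # single output per unit
scores = [dea_efficiency(x, y, k) for k in range(3)]
```

    Units on the efficient frontier score 1.0; an inefficient unit's score is the proportional input reduction needed to reach the frontier, which is exactly the "input projection" suggested for inefficient departments.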

  1. Bridging ImmunoGenomic Data Analysis Workflow Gaps (BIGDAWG): An integrated case-control analysis pipeline.

    PubMed

    Pappas, Derek J; Marin, Wesley; Hollenbach, Jill A; Mack, Steven J

    2016-03-01

    Bridging ImmunoGenomic Data-Analysis Workflow Gaps (BIGDAWG) is an integrated data-analysis pipeline designed for the standardized analysis of highly-polymorphic genetic data, specifically for the HLA and KIR genetic systems. Most modern genetic analysis programs are designed for the analysis of single nucleotide polymorphisms, but the highly polymorphic nature of HLA and KIR data require specialized methods of data analysis. BIGDAWG performs case-control data analyses of highly polymorphic genotype data characteristic of the HLA and KIR loci. BIGDAWG performs tests for Hardy-Weinberg equilibrium, calculates allele frequencies and bins low-frequency alleles for k×2 and 2×2 chi-squared tests, and calculates odds ratios, confidence intervals and p-values for each allele. When multi-locus genotype data are available, BIGDAWG estimates user-specified haplotypes and performs the same binning and statistical calculations for each haplotype. For the HLA loci, BIGDAWG performs the same analyses at the individual amino-acid level. Finally, BIGDAWG generates figures and tables for each of these comparisons. BIGDAWG obviates the error-prone reformatting needed to traffic data between multiple programs, and streamlines and standardizes the data-analysis process for case-control studies of highly polymorphic data. BIGDAWG has been implemented as the bigdawg R package and as a free web application at bigdawg.immunogenomics.org. Copyright © 2015 American Society for Histocompatibility and Immunogenetics. Published by Elsevier Inc. All rights reserved.
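
    The per-allele statistics listed above (2x2 chi-squared test, odds ratio, confidence interval, p-value) can be sketched as follows. This is not BIGDAWG's code; the counts are invented, and the 95% CI uses the standard Woolf (log-scale) approximation.

```python
# Hedged sketch of per-allele case-control statistics of the kind BIGDAWG
# reports: a 2x2 chi-squared test plus an odds ratio with a Woolf 95% CI.

import math
from scipy.stats import chi2_contingency

def allele_association(case_pos, case_neg, ctrl_pos, ctrl_neg):
    """Counts of chromosomes carrying / not carrying an allele in cases and controls."""
    table = [[case_pos, case_neg], [ctrl_pos, ctrl_neg]]
    chi2, p, _, _ = chi2_contingency(table)
    orat = (case_pos * ctrl_neg) / (case_neg * ctrl_pos)
    se = math.sqrt(sum(1.0 / c for row in table for c in row))  # SE of log(OR)
    lo = math.exp(math.log(orat) - 1.96 * se)
    hi = math.exp(math.log(orat) + 1.96 * se)
    return {"chi2": chi2, "p": p, "OR": orat, "CI95": (lo, hi)}

# Invented example: allele carried by 30% of case and 15% of control chromosomes.
stats = allele_association(30, 70, 15, 85)
```

    Low-frequency alleles would first be binned together, as the abstract notes, so that the chi-squared expected counts stay large enough for the test to be valid.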

  2. Diagnostic Performance of Mammographic Texture Analysis in the Differential Diagnosis of Benign and Malignant Breast Tumors.

    PubMed

    Li, Zhiming; Yu, Lan; Wang, Xin; Yu, Haiyang; Gao, Yuanxiang; Ren, Yande; Wang, Gang; Zhou, Xiaoming

    2017-11-09

    The purpose of this study was to investigate the diagnostic performance of mammographic texture analysis in the differential diagnosis of benign and malignant breast tumors. Digital mammography images were obtained from the Picture Archiving and Communication System at our institute. Texture features of mammographic images were calculated. Mann-Whitney U test was used to identify differences between the benign and malignant group. The receiver operating characteristic (ROC) curve analysis was used to assess the diagnostic performance of texture features. Significant differences of texture features of histogram, gray-level co-occurrence matrix (GLCM) and run length matrix (RLM) were found between the benign and malignant breast group (P < .05). The area under the ROC (AUROC) of histogram, GLCM, and RLM were 0.800, 0.787, and 0.761, with no differences between them (P > .05). The AUROCs of imaging-based diagnosis, texture analysis, and imaging-based diagnosis combined with texture analysis were 0.873, 0.863, and 0.961, respectively. When imaging-based diagnosis was combined with texture analysis, the AUROC was higher than that of imaging-based diagnosis or texture analysis (P < .05). Mammographic texture analysis is a reliable technique for differential diagnosis of benign and malignant breast tumors. Furthermore, the combination of imaging-based diagnosis and texture analysis can significantly improve diagnostic performance. Copyright © 2017 Elsevier Inc. All rights reserved.
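
    The AUROC used throughout this record has a simple rank interpretation: it equals the Mann-Whitney U statistic normalized by the number of benign/malignant pairs. The sketch below scores one hypothetical texture feature that way; the values are illustrative, not from the study.

```python
# Minimal AUROC computation: the probability that a randomly chosen positive
# case outscores a randomly chosen negative case (ties count as 0.5).

def auroc(scores_pos, scores_neg):
    wins = 0.0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1.0
            elif sp == sn:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

malignant = [0.9, 0.8, 0.7, 0.6]    # hypothetical texture-feature values
benign = [0.5, 0.4, 0.65, 0.3]
auc = auroc(malignant, benign)
```

    An AUC of 0.5 means the feature carries no discriminative information; the study's reported values of 0.76 to 0.96 sit well above that baseline.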

  3. Rotor design optimization using a free wake analysis

    NASA Technical Reports Server (NTRS)

    Quackenbush, Todd R.; Boschitsch, Alexander H.; Wachspress, Daniel A.; Chua, Kiat

    1993-01-01

    The aim of this effort was to develop a comprehensive performance optimization capability for tiltrotor and helicopter blades. The analysis incorporates the validated EHPIC (Evaluation of Hover Performance using Influence Coefficients) model of helicopter rotor aerodynamics within a general linear/quadratic programming algorithm that allows optimization using a variety of objective functions involving the performance. The resulting computer code, EHPIC/HERO (HElicopter Rotor Optimization), improves upon several features of the previous EHPIC performance model and allows optimization utilizing a wide spectrum of design variables, including twist, chord, anhedral, and sweep. The new analysis supports optimization of a variety of objective functions, including weighted measures of rotor thrust, power, and propulsive efficiency. The fundamental strength of the approach is that an efficient search for improved versions of the baseline design can be carried out while retaining the demonstrated accuracy inherent in the EHPIC free wake/vortex lattice performance analysis. Sample problems are described that demonstrate the success of this approach for several representative rotor configurations in hover and axial flight. Features that were introduced to convert earlier demonstration versions of this analysis into a generally applicable tool for researchers and designers are also discussed.

  4. Reliability-Based Stability Analysis of Rock Slopes Using Numerical Analysis and Response Surface Method

    NASA Astrophysics Data System (ADS)

    Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.

    2017-08-01

    While advanced numerical techniques in slope stability analysis are successfully used in deterministic studies, they have so far found limited use in probabilistic analyses due to their high computation cost. The first-order reliability method (FORM) is one of the most efficient probabilistic techniques to perform probabilistic stability analysis by considering the associated uncertainties in the analysis parameters. However, it is not possible to directly use FORM in numerical slope stability evaluations as it requires definition of a limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on the response surface method, where FORM is used to develop an explicit performance function from the results of numerical simulations. The implementation of the proposed methodology is demonstrated on a large potential rock wedge in Sumela Monastery, Turkey. The accuracy of the developed performance function in representing the limit state surface is evaluated by monitoring the slope behavior. The calculated probability of failure is compared with the Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, at the cost of a 24% error in accuracy.
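
    The response-surface idea above can be sketched in miniature: fit an explicit performance function to a handful of "numerical model" evaluations, then run the probabilistic analysis on the cheap surrogate instead of the expensive model. The toy limit state `g_true` and the input distribution below are invented, and the sketch uses Monte Carlo on the surrogate rather than the paper's FORM step.

```python
# Hedged response-surface sketch: a quadratic surrogate replaces an expensive
# numerical slope model, and failure probability P(g < 0) is then estimated
# cheaply by sampling the surrogate.

import numpy as np

rng = np.random.default_rng(0)

def g_true(c):            # stand-in for an expensive numerical slope model:
    return c - 4.0        # failure when the strength-like variable c drops below 4

# 1) Build the surrogate from a few designed model evaluations.
design = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
coeffs = np.polyfit(design, g_true(design), 2)      # explicit quadratic surface

# 2) Probabilistic analysis on the surrogate.
c_samples = rng.normal(6.0, 1.0, 200_000)           # uncertain input, N(6, 1)
pf = float(np.mean(np.polyval(coeffs, c_samples) < 0.0))
```

    For this linear toy model the exact failure probability is Phi(-2), about 0.023, so the surrogate-based estimate can be checked directly; in the real methodology the surrogate is refit near the design point because no closed form exists.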

  5. Session 6: Dynamic Modeling and Systems Analysis

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey; Chapman, Jeffryes; May, Ryan

    2013-01-01

    These presentations cover some of the ongoing work in dynamic modeling and dynamic systems analysis. The first presentation discusses dynamic systems analysis and how to integrate dynamic performance information into the systems analysis. The ability to evaluate the dynamic performance of an engine design may allow tradeoffs between the dynamic performance and operability of a design, resulting in a more efficient engine design. The second presentation discusses the Toolbox for Modeling and Analysis of Thermodynamic Systems (T-MATS). T-MATS is a simulation system with a library containing the basic building blocks needed to create dynamic thermodynamic systems. Some of the key features include turbomachinery components, such as turbines and compressors, and basic control system blocks. T-MATS is written in the Matlab-Simulink environment and is open source software. The third presentation focuses on getting additional performance from the engine by allowing the limit regulators to be active only when a limit is in danger of being violated. Typical aircraft engine control architecture is based on a min-max scheme, which is designed to keep the engine operating within prescribed mechanical/operational safety limits. Using a conditionally active min-max limit regulator scheme, additional performance can be gained by disabling non-relevant limit regulators.

  6. Performance-cost evaluation methodology for ITS equipment deployment

    DOT National Transportation Integrated Search

    2000-09-01

    Although extensive Intelligent Transportation Systems (ITS) technology is being deployed in the field, little analysis is being performed to evaluate the benefits of implementation schemes. Benefit analysis is particularly in need for one popular ITS...

  7. Cone-Beam Computed Tomography (CBCT) Hepatic Arteriography in Chemoembolization for Hepatocellular Carcinoma: Performance Depicting Tumors and Tumor Feeders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, In Joon; Chung, Jin Wook, E-mail: chungjw@snu.ac.kr; Yin, Yong Hu

    2015-10-15

    Purpose: This study was designed to analyze retrospectively the performance of cone-beam computed tomography (CBCT) hepatic arteriography in depicting tumors and their feeders and to investigate the related determining factors in chemoembolization for hepatocellular carcinoma (HCC). Methods: Eighty-six patients with 142 tumors satisfying the imaging diagnosis criteria of HCC were included in this study. The performance of CBCT hepatic arteriography for chemoembolization per tumor and per patient was evaluated using maximum intensity projection images alone (MIP analysis) or MIP combined with multiplanar reformation images (MIP + MPR analysis) regarding the following three aspects: tumor depiction, confidence of tumor feeder detection, and trackability of tumor feeders. Tumor size, tumor enhancement, tumor location, number of feeders, diaphragmatic motion, portal vein enhancement, and hepatic artery to parenchyma enhancement ratio were regarded as potential determining factors. Results: Tumors were depicted in 125 (88.0 %) and 142 tumors (100 %) on MIP and MIP + MPR analysis, respectively. Imaging performances on MIP and MIP + MPR analysis were good enough to perform subsegmental chemoembolization without additional angiographic investigation in 88 (62.0 %) and 128 tumors (90.1 %) on a per-tumor basis and in 43 (50 %) and 73 patients (84.9 %) on a per-patient basis, respectively. Significant determining factors for performance in MIP + MPR analysis on a per-tumor basis were tumor size (p = 0.030), tumor enhancement (p = 0.005), tumor location (p = 0.001), and diaphragmatic motion (p < 0.001). Conclusions: CBCT hepatic arteriography provided sufficient information for subsegmental chemoembolization by depicting tumors and their feeders in the vast majority of patients. Combined analysis of MIP and MPR images was essential to enhance the performance of CBCT hepatic arteriography.

  8. Coupled Solid Rocket Motor Ballistics and Trajectory Modeling for Higher Fidelity Launch Vehicle Design

    NASA Technical Reports Server (NTRS)

    Ables, Brett

    2014-01-01

    Multi-stage launch vehicles with solid rocket motors (SRMs) face design optimization challenges, especially when the mission scope changes frequently. Significant performance benefits can be realized if the solid rocket motors are optimized to the changing requirements. While SRMs represent a fixed performance at launch, rapid design iterations enable flexibility at design time, yielding significant performance gains. The streamlining and integration of SRM design and analysis can be achieved with improved analysis tools. While powerful and versatile, the Solid Performance Program (SPP) is not conducive to rapid design iteration. Performing a design iteration with SPP and a trajectory solver is a labor intensive process. To enable a better workflow, SPP, the Program to Optimize Simulated Trajectories (POST), and the interfaces between them have been improved and automated, and a graphical user interface (GUI) has been developed. The GUI enables real-time visual feedback of grain and nozzle design inputs, enforces parameter dependencies, removes redundancies, and simplifies manipulation of SPP and POST's numerous options. Automating the analysis also simplifies batch analyses and trade studies. Finally, the GUI provides post-processing, visualization, and comparison of results. Wrapping legacy high-fidelity analysis codes with modern software provides the improved interface necessary to enable rapid coupled SRM ballistics and vehicle trajectory analysis. Low cost trade studies demonstrate the sensitivities of flight performance metrics to propulsion characteristics. Incorporating high fidelity analysis from SPP into vehicle design reduces performance margins and improves reliability. By flying an SRM designed with the same assumptions as the rest of the vehicle, accurate comparisons can be made between competing architectures. In summary, this flexible workflow is a critical component to designing a versatile launch vehicle model that can accommodate a volatile mission scope.

  9. Design and performance analysis of gas and liquid radial turbines

    NASA Astrophysics Data System (ADS)

    Tan, Xu

    In the first part of the research, pumps running in reverse as turbines are studied. This work uses experimental data from a wide range of pumps representing centrifugal pump configurations in terms of specific speed. Based on specific speed and specific diameter, an accurate correlation is developed to predict the performance at the best efficiency point of a centrifugal pump in its turbine-mode operation. The proposed prediction method is compared to nine previous methods found in the literature, and the comparison shows that it is the most accurate to date. The method can be further complemented and supplemented by future tests to increase its accuracy, and it is distinctive because it is based on both specific speed and specific diameter. The second part of the research is focused on the design and analysis of a radial gas turbine. The specification of the turbine is obtained from a solar biogas hybrid system, which is theoretically analyzed and constructed around a purchased compressor. Theoretical analysis results in a specification of 100 lb/min mass flow, 900 °C inlet total temperature, and 1.575 atm inlet total pressure. 1-D and 3-D geometry of the rotor is generated based on Aungier's method. 1-D loss model analysis and 3-D CFD simulations are performed to examine the performance of the rotor. The total-to-total efficiency of the rotor is more than 90%. With the help of CFD analysis, modifications to the preliminary design yielded optimized aerodynamic performance. Finally, a theoretical performance analysis of the hybrid system is performed with the designed turbine.
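
    The two dimensionless groups the correlation is built on are standard and can be computed directly; the sketch below uses arbitrary operating numbers, and the prediction correlation itself belongs to the thesis and is not reproduced here.

```python
# Sketch of the dimensionless groups used to correlate pump-as-turbine
# performance: specific speed and specific diameter (SI, rad/s basis).

import math

def specific_speed(omega, q, gH):
    """n_s = omega * sqrt(Q) / (g*H)^0.75  (omega in rad/s, Q in m^3/s, gH in J/kg)."""
    return omega * math.sqrt(q) / gH ** 0.75

def specific_diameter(d, q, gH):
    """d_s = D * (g*H)^0.25 / sqrt(Q)  (D in m)."""
    return d * gH ** 0.25 / math.sqrt(q)

# Arbitrary best-efficiency-point values for illustration.
ns = specific_speed(omega=150.0, q=0.2, gH=9.81 * 20.0)
ds = specific_diameter(d=0.25, q=0.2, gH=9.81 * 20.0)
```

    Because both groups are dimensionless, machines of different size and speed collapse onto one chart, which is what makes a correlation over a "wide range of pumps" possible.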

  10. Analysis and methodology for aeronautical systems technology program planning

    NASA Technical Reports Server (NTRS)

    White, M. J.; Gershkoff, I.; Lamkin, S.

    1983-01-01

    A structured methodology was developed that allows the generation, analysis, and rank-ordering of system concepts by their benefits and costs, indicating the preferred order of implementation. The methodology is supported by a base of data on civil transport aircraft fleet growth projections and data on aircraft performance relating the contribution of each element of the aircraft to overall performance. The performance data are used to assess the benefits of proposed concepts. The methodology includes a computer program for performing the calculations needed to rank-order the concepts and compute their cumulative benefit-to-cost ratio. The use of the methodology and supporting data is illustrated through the analysis of actual system concepts from various sources.

  11. Performance Analysis of HF Band FB-MC-SS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hussein Moradi; Stephen Andrew Laraway; Behrouz Farhang-Boroujeny

    In a recent paper [1] the filter bank multicarrier spread spectrum (FB-MC-SS) waveform was proposed for wideband spread spectrum HF communications. A significant benefit of this waveform is robustness against narrow and partial band interference. Simulation results in [1] demonstrated good performance in a wideband HF channel over a wide range of conditions. In this paper we present a theoretical analysis of the bit error probability for this system. Our analysis tailors the results from [2], where BER performance was analyzed for maximum ratio combining systems that accounted for correlation between subcarriers and channel estimation error. Equations are given for the BER that closely match the simulated performance in most situations.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goltz, G.; Weiner, H.

    A computer program has been developed for designing and analyzing the performance of solar array/battery power systems for the U.S. Coast Guard Navigational Aids. This program is called the Design Synthesis/Performance Analysis (DSPA) Computer Program. The basic function of the Design Synthesis portion of the DSPA program is to evaluate functional and economic criteria to provide specifications for viable solar array/battery power systems. The basic function of the Performance Analysis portion of the DSPA program is to simulate the operation of solar array/battery power systems under specific loads and environmental conditions. This document provides all the information necessary to access the DSPA programs, to input required data, and to generate appropriate Design Synthesis or Performance Analysis output.

  13. Interactive design and analysis of future large spacecraft concepts

    NASA Technical Reports Server (NTRS)

    Garrett, L. B.

    1981-01-01

    An interactive computer-aided design program used to perform systems-level design and analysis of large spacecraft concepts is presented. Emphasis is on rapid design, analysis of integrated spacecraft, and automatic spacecraft modeling for lattice structures. Capabilities and performance of the multidiscipline applications modules, the executive and data management software, and the graphics display features are reviewed. A single user at an interactive terminal can create, design, analyze, and conduct parametric studies of Earth-orbiting spacecraft with relative ease. Data generated in the design, analysis, and performance evaluation of an Earth-orbiting large-diameter antenna satellite are used to illustrate current capabilities. Computer run time statistics for the individual modules quantify the speed at which modeling, analysis, and design evaluation of integrated spacecraft concepts are accomplished in a user-interactive computing environment.

  14. Space Shuttle Main Engine structural analysis and data reduction/evaluation. Volume 3B: High pressure fuel turbo-pump preburner pump bearing assembly analysis

    NASA Technical Reports Server (NTRS)

    Power, Gloria B.; Violett, Rebeca S.

    1989-01-01

    The analysis performed on the High Pressure Oxidizer Turbopump (HPOTP) preburner pump bearing assembly located on the Space Shuttle Main Engine (SSME) is summarized. An ANSYS finite element model for the inlet assembly was built and executed. Thermal and static analyses were performed.

  15. Evaluating Language Environment Analysis System Performance for Chinese: A Pilot Study in Shanghai

    ERIC Educational Resources Information Center

    Gilkerson, Jill; Zhang, Yiwen; Xu, Dongxin; Richards, Jeffrey A.; Xu, Xiaojuan; Jiang, Fan; Harnsberger, James; Topping, Keith

    2015-01-01

    Purpose: The purpose of this study was to evaluate performance of the Language Environment Analysis (LENA) automated language-analysis system for the Chinese Shanghai dialect and Mandarin (SDM) languages. Method: Volunteer parents of 22 children aged 3-23 months were recruited in Shanghai. Families provided daylong in-home audio recordings using…

  16. Analysis of Performance Factors for Accounting and Finance Related Business Courses in a Distance Education Environment

    ERIC Educational Resources Information Center

    Benligiray, Serdar; Onay, Ahmet

    2017-01-01

    The objective of this study is to explore business courses performance factors with a focus on accounting and finance. Course score interrelations are assumed to represent interpretable constructs of these factors. Factor analysis is proposed to identify the constructs that explain the correlations. Factor analysis results identify three…

  17. Analysis of Multiple-Impact Ballistic Performance of a Tempered Glass Laminate with a Strike Face Film

    DTIC Science & Technology

    2014-02-01

    a 0.18 in thick polymer interlayer between two layers of 0.5 in tempered silica-based “soda lime” glass. A 0.08 in shatter-resistant film was... (AFCEC-CX-TY-TR-2014-0005; Michael A. Magrini; Interim Technical Report, 3 Jan 2012 to 2 Jan 2013)

  18. Performance Analysis of Multilevel Parallel Applications on Shared Memory Architectures

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Caubet, Jordi; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    In this paper we describe how to apply powerful performance analysis techniques to understand the behavior of multilevel parallel applications. We use the Paraver/OMPItrace performance analysis system for our study. This system consists of two major components: The OMPItrace dynamic instrumentation mechanism, which allows the tracing of processes and threads and the Paraver graphical user interface for inspection and analyses of the generated traces. We describe how to use the system to conduct a detailed comparative study of a benchmark code implemented in five different programming paradigms applicable for shared memory

  19. A Comparison of Satellite Conjunction Analysis Screening Tools

    DTIC Science & Technology

    2011-09-01

    visualization tool. Version 13.1.4 for Linux was tested. The SOAP conjunction analysis function does not have the capacity to perform the large...was examined by SOAP to confirm the conjunction. STK Advanced CAT (Conjunction Analysis Tools) is an add-on module for the STK...run with each tool. When attempting to perform the seven day all vs all analysis with STK Advanced CAT, the program consistently crashed during report

  20. Automatic differentiation for design sensitivity analysis of structural systems using multiple processors

    NASA Technical Reports Server (NTRS)

    Nguyen, Duc T.; Storaasli, Olaf O.; Qin, Jiangning; Qamar, Ramzi

    1994-01-01

    An automatic differentiation tool (ADIFOR) is incorporated into a finite element based structural analysis program for shape and non-shape design sensitivity analysis of structural systems. The entire analysis and sensitivity procedures are parallelized and vectorized for high performance computation. Small scale examples to verify the accuracy of the proposed program and a medium scale example to demonstrate the parallel vector performance on multiple CRAY C90 processors are included.

  1. How to Perform a Systematic Review and Meta-analysis of Diagnostic Imaging Studies.

    PubMed

    Cronin, Paul; Kelly, Aine Marie; Altaee, Duaa; Foerster, Bradley; Petrou, Myria; Dwamena, Ben A

    2018-05-01

    A systematic review is a comprehensive search, critical evaluation, and synthesis of all the relevant studies on a specific (clinical) topic that can be applied to the evaluation of diagnostic and screening imaging studies. It can be a qualitative or a quantitative (meta-analysis) review of available literature. A meta-analysis uses statistical methods to combine and summarize the results of several studies. In this review, a 12-step approach to performing a systematic review (and meta-analysis) is outlined under the four domains: (1) Problem Formulation and Data Acquisition, (2) Quality Appraisal of Eligible Studies, (3) Statistical Analysis of Quantitative Data, and (4) Clinical Interpretation of the Evidence. This review is specifically geared toward the performance of a systematic review and meta-analysis of diagnostic test accuracy (imaging) studies. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
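
    One quantitative step in the "Statistical Analysis of Quantitative Data" domain can be made concrete: pooling per-study sensitivities with an inverse-variance fixed-effect model on the logit scale. The study counts below are invented, and real diagnostic meta-analyses typically prefer bivariate or hierarchical models over this simple univariate sketch.

```python
# Hedged sketch of fixed-effect pooling of sensitivity on the logit scale.

import math

def pooled_sensitivity(studies):
    """studies -- list of (true_positives, false_negatives) per study."""
    num = den = 0.0
    for tp, fn in studies:
        sens = tp / (tp + fn)
        logit = math.log(sens / (1.0 - sens))
        var = 1.0 / tp + 1.0 / fn            # approximate variance of the logit
        num += logit / var                   # inverse-variance weighting
        den += 1.0 / var
    pooled_logit = num / den
    return 1.0 / (1.0 + math.exp(-pooled_logit))

# Three invented studies: sensitivities 0.90, 0.80, and 0.75.
sens = pooled_sensitivity([(45, 5), (80, 20), (30, 10)])
```

    The logit transform keeps the pooled estimate inside (0, 1) and makes the sampling variance approximately normal, which is why it is the usual working scale for proportions in meta-analysis.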

  2. Integrative analysis of environmental sequences using MEGAN4.

    PubMed

    Huson, Daniel H; Mitra, Suparna; Ruscheweyh, Hans-Joachim; Weber, Nico; Schuster, Stephan C

    2011-09-01

    A major challenge in the analysis of environmental sequences is data integration. The question is how to analyze different types of data in a unified approach, addressing both the taxonomic and functional aspects. To facilitate such analyses, we have substantially extended MEGAN, a widely used taxonomic analysis program. The new program, MEGAN4, provides an integrated approach to the taxonomic and functional analysis of metagenomic, metatranscriptomic, metaproteomic, and rRNA data. While taxonomic analysis is performed based on the NCBI taxonomy, functional analysis is performed using the SEED classification of subsystems and functional roles or the KEGG classification of pathways and enzymes. A number of examples illustrate how such analyses can be performed, and show that one can also import and compare classification results obtained using others' tools. MEGAN4 is freely available for academic purposes, and installers for all three major operating systems can be downloaded from www-ab.informatik.uni-tuebingen.de/software/megan.
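
    Taxonomic binning of the kind MEGAN performs against the NCBI taxonomy is commonly described as a lowest-common-ancestor (LCA) assignment: a read whose matches span several taxa is placed on the deepest node all of them share. The sketch below uses a toy taxonomy and is an illustration of the idea, not MEGAN4's actual implementation.

```python
# LCA-style taxonomic binning sketch on a toy child -> parent taxonomy.

PARENT = {"E.coli": "Escherichia", "Escherichia": "Enterobacteriaceae",
          "Salmonella": "Enterobacteriaceae", "Enterobacteriaceae": "Bacteria",
          "Bacteria": "root"}

def lineage(taxon):
    """Path from a taxon up to the root."""
    path = [taxon]
    while taxon in PARENT:
        taxon = PARENT[taxon]
        path.append(taxon)
    return path

def lca(taxa):
    """Assign a read hitting several taxa to their lowest common ancestor."""
    common = set(lineage(taxa[0]))
    for t in taxa[1:]:
        common &= set(lineage(t))
    return max(common, key=lambda t: len(lineage(t)))  # deepest shared node

# A read matching both E. coli and Salmonella is binned at the family level.
bin_taxon = lca(["E.coli", "Salmonella"])
```

    Specific reads stay near the leaves while ambiguous reads rise toward higher ranks, which is why LCA binning is conservative by construction.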

  3. Optimization of analytical laboratory work using computer networking and databasing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Upp, D.L.; Metcalf, R.A.

    1996-06-01

    The Health Physics Analysis Laboratory (HPAL) performs around 600,000 analyses for radioactive nuclides each year at Los Alamos National Laboratory (LANL). Analysis matrices vary from nasal swipes, air filters, work area swipes, and liquids to the bottoms of shoes and cat litter. HPAL uses 8 liquid scintillation counters, 8 gas proportional counters, and 9 high purity germanium detectors in 5 laboratories to perform these analyses. HPAL has developed a computer network between the labs and software to produce analysis results. The software and hardware package includes barcode sample tracking, log-in, chain of custody, analysis calculations, analysis result printing, and utility programs. All data are written to a database, mirrored on a central server, and eventually written to CD-ROM to provide online historical results. This system has greatly reduced the work required to produce analysis results as well as improving the quality of the work performed.

  4. [Development of performance evaluation and management system on advanced schistosomiasis medical treatment].

    PubMed

    Zhou, Xiao-Rong; Huang, Shui-Sheng; Gong, Xin-Guo; Cen, Li-Ping; Zhang, Cong; Zhu, Hong; Yang, Jun-Jing; Chen, Li

    2012-04-01

To construct a performance evaluation and management system for advanced schistosomiasis medical treatment, and to analyze and evaluate the treatment work over the years. By applying database management and C++ programming techniques, we entered the information of advanced schistosomiasis cases into the system and comprehensively evaluated the medical treatment work through cost-effect analysis, cost-effectiveness analysis, and cost-benefit analysis. We developed a set of software formulas for cost-effect, cost-effectiveness, and cost-benefit analysis. The system features a clear structure, ease of operation, a user-friendly interface, and convenient information entry and search. It supports the performance evaluation of the province's advanced schistosomiasis medical treatment work. The system satisfies the current needs of this work and can readily be adopted more widely.

  5. Performance Characteristics of a Kernel-Space Packet Capture Module

    DTIC Science & Technology

    2010-03-01

AFIT/GCO/ENG/10-03: a thesis presented to the Air Force Institute of Technology. The proof of concept for this research is the design, development, and comparative performance analysis of a kernel-level packet capture module, which can be used for both user-space and kernel-space capture applications.

  6. Quality evaluation of moluodan concentrated pill using high-performance liquid chromatography fingerprinting coupled with chemometrics.

    PubMed

    Tao, Lingyan; Zhang, Qing; Wu, Yongjiang; Liu, Xuesong

    2016-12-01

In this study, a fast and effective high-performance liquid chromatography method was developed to obtain a fingerprint chromatogram and, simultaneously, a quantitative analysis of four index compounds (gallic acid, chlorogenic acid, albiflorin and paeoniflorin) of the traditional Chinese medicine Moluodan Concentrated Pill. The method was performed using a Waters X-bridge C18 reversed-phase column on an Agilent 1200S high-performance liquid chromatography system coupled with diode array detection. The mobile phase was composed of 20 mmol/L phosphate solution and acetonitrile at a flow rate of 1 mL/min, with a column temperature of 30°C and a UV detection wavelength of 254 nm. After methodology validation, 16 batches of Moluodan Concentrated Pill were analyzed by this method, and both qualitative and quantitative evaluation results were obtained by similarity analysis, principal component analysis and hierarchical cluster analysis. The results of these three chemometric methods were in good agreement and all indicated that batch 10 and batch 16 differed significantly from the other 14 batches. This suggests that the developed high-performance liquid chromatography method can be applied in the quality evaluation of Moluodan Concentrated Pill. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
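The similarity analysis step described above can be sketched as a pairwise comparison of fingerprint vectors. The function is a generic cosine similarity, and the batch values below are invented for illustration (a real fingerprint would use the full chromatogram, not just four peak areas):

```python
# Hypothetical sketch of chemometric similarity analysis of HPLC fingerprints.
# Each vector holds illustrative peak areas for the four index compounds
# (gallic acid, chlorogenic acid, albiflorin, paeoniflorin); not the paper's data.

def cosine_similarity(a, b):
    """Similarity between two fingerprint vectors (1.0 = identical profile)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)

reference = [1.00, 0.80, 0.60, 1.20]   # mean fingerprint of all batches
batch_03  = [0.98, 0.82, 0.59, 1.18]   # close to the reference profile
batch_10  = [0.40, 1.50, 0.20, 0.70]   # outlier batch

print(cosine_similarity(reference, batch_03))  # near 1.0 -> consistent batch
print(cosine_similarity(reference, batch_10))  # noticeably lower -> outlier
```

Batches whose similarity to the reference falls below a chosen threshold would then be flagged for closer inspection, which is how outliers such as batches 10 and 16 surface.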

  7. Model Performance Evaluation and Scenario Analysis ...

    EPA Pesticide Factsheets

This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit measures that capture magnitude-only, sequence-only, and combined magnitude-and-sequence errors. The performance measures include error analysis, the coefficient of determination, Nash-Sutcliffe efficiency, and a new weighted rank method. These performance metrics only provide useful information about overall model performance. Note that MPESA is based on the separation of observed and simulated time series into magnitude and sequence components. The separation of time series into magnitude and sequence components, and their reconstruction back into time series, provides diagnostic insights to modelers. For example, traditional approaches lack the capability to identify whether the source of uncertainty in the simulated data is the quality of the input data or the way the analyst adjusted the model parameters. This report presents a suite of model diagnostics that identify whether mismatches between observed and simulated data result from magnitude- or sequence-related errors. MPESA offers graphical and statistical options that allow HSPF users to compare observed and simulated time series and identify the parameter values to adjust or the input data to modify. The scenario analysis part of the tool ...
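Two of the ideas above can be sketched directly: the Nash-Sutcliffe efficiency, NSE = 1 − Σ(Oᵢ − Sᵢ)² / Σ(Oᵢ − Ō)², and the magnitude/sequence separation, illustrated here by comparing sorted series to isolate magnitude errors from timing errors. The data are invented, and the sorting step is only a minimal stand-in for MPESA's decomposition:

```python
# Minimal sketch, with illustrative data: Nash-Sutcliffe efficiency (NSE) and
# a magnitude-only comparison via sorted series (timing/sequence errors vanish
# when both series are sorted, leaving only magnitude mismatch).

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1.0 is a perfect fit."""
    mean_obs = sum(obs) / len(obs)
    ss_err = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - ss_err / ss_tot

observed  = [2.0, 5.0, 9.0, 4.0, 1.0]
simulated = [1.0, 9.0, 5.0, 4.0, 2.0]   # right magnitudes, wrong sequence

print(nse(observed, simulated))                  # low: penalized for timing errors
print(nse(sorted(observed), sorted(simulated)))  # 1.0: magnitudes alone match
```

A low raw NSE alongside a high sorted-series NSE points to sequence (timing) errors rather than magnitude errors, which is the kind of diagnostic distinction the report describes.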

  8. Probability of Failure Analysis Standards and Guidelines for Expendable Launch Vehicles

    NASA Astrophysics Data System (ADS)

    Wilde, Paul D.; Morse, Elisabeth L.; Rosati, Paul; Cather, Corey

    2013-09-01

Recognizing the central importance of probability of failure estimates to ensuring public safety for launches, the Federal Aviation Administration (FAA), Office of Commercial Space Transportation (AST), the National Aeronautics and Space Administration (NASA), and U.S. Air Force (USAF), through the Common Standards Working Group (CSWG), developed a guide for conducting valid probability of failure (POF) analyses for expendable launch vehicles (ELV), with an emphasis on POF analysis for new ELVs. A probability of failure analysis for an ELV produces estimates of the likelihood of occurrence of potentially hazardous events, which are critical inputs to launch risk analysis of debris, toxic, or explosive hazards. This guide is intended to document a framework for POF analyses commonly accepted in the US, and should be useful to anyone who performs or evaluates launch risk analyses for new ELVs. The CSWG guidelines provide performance standards and definitions of key terms, and are being revised to address allocation to flight times and vehicle response modes. The POF performance standard allows a launch operator to employ alternative, potentially innovative methodologies so long as the results satisfy the performance standard. Current POF analysis practice at US ranges includes multiple methodologies described in the guidelines as accepted methods, but not necessarily the only methods available to demonstrate compliance with the performance standard. The guidelines include illustrative examples for each POF analysis method, which are intended to illustrate an acceptable level of fidelity for ELV POF analyses used to ensure public safety. The focus is on providing guiding principles rather than "recipe lists." Independent reviews of these guidelines were performed to assess their logic, completeness, accuracy, self-consistency, consistency with risk analysis practices, use of available information, and ease of applicability. The independent reviews confirmed the general validity of the performance standard approach and suggested potential updates to improve the accuracy of each of the example methods, especially to address reliability growth.

  9. What Performance Analysts Need to Know About Research Trends in Association Football (2012-2016): A Systematic Review.

    PubMed

    Sarmento, Hugo; Clemente, Filipe Manuel; Araújo, Duarte; Davids, Keith; McRobert, Allistair; Figueiredo, António

    2018-04-01

    Evolving patterns of match analysis research need to be systematically reviewed regularly since this area of work is burgeoning rapidly and studies can offer new insights to performance analysts if theoretically and coherently organized. The purpose of this paper was to conduct a systematic review of published articles on match analysis in adult male football, identify and organize common research topics, and synthesize the emerging patterns of work between 2012 and 2016, according to the Preferred Reporting Items for Systematic Reviews and Meta-analyses (PRISMA) guidelines. The Web of Science database was searched for relevant published studies using the following keywords: 'football' and 'soccer', each one associated with the terms 'match analysis', 'performance analysis', 'notational analysis', 'game analysis', 'tactical analysis' and 'patterns of play'. Of 483 studies initially identified, 77 were fully reviewed and their outcome measures extracted and analyzed. Results showed that research mainly focused on (1) performance at set pieces, i.e. corner kicks, free kicks, penalty kicks; (2) collective system behaviours, captured by established variables such as team centroid (geometrical centre of a set of players) and team dispersion (quantification of how far players are apart), as well as tendencies for team communication (establishing networks based on passing sequences), sequential patterns (predicting future passing sequences), and group outcomes (relationships between match-related statistics and final match scores); and (3) activity profile of players, i.e. playing roles, effects of fatigue, substitutions during matches, and the effects of environmental constraints on performance, such as heat and altitude. From the previous review, novel variables were identified that require new measurement techniques. It is evident that the complexity engendered during performance in competitive soccer requires an integrated approach that considers multiple aspects. 
A challenge for researchers is to align these new measures with the needs of the coaches through a more integrated relationship between coaches and researchers, to produce practical and usable information that improves player performance and coach activity.

  10. Performance Analysis of Three-Phase Induction Motor with AC Direct and VFD

    NASA Astrophysics Data System (ADS)

    Kumar, Dinesh

    2018-03-01

The analysis and performance calculation of electrical machines is an important aspect of efficient drive system design. Developments in power electronic devices and power converters provide smooth speed control of induction motors by changing the frequency of the input supply. These converters, on the one hand, offer more flexible speed control; on the other hand, they introduce harmonics and their associated ailments, such as pulsating torque, distorted current and voltage waveforms, and increased losses. This paper presents the performance analysis of a three-phase induction motor with direct three-phase AC supply and with a variable frequency drive (VFD). The comparison is drawn with respect to various parameters. MATLAB-Simulink™ is used for the analysis.
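The speed relations underlying VFD control can be sketched with the standard textbook formulas (assumed here, not taken from the paper): synchronous speed Ns = 120·f/P, so varying the supply frequency f varies the speed proportionally, and slip s = (Ns − N)/Ns:

```python
# Hedged sketch of standard induction-motor speed relations behind VFD control.
# Frequencies, pole count, and rotor speed below are illustrative values.

def synchronous_speed_rpm(f_hz, poles):
    """Synchronous speed Ns = 120 * f / P (rpm)."""
    return 120.0 * f_hz / poles

def slip(ns_rpm, rotor_rpm):
    """Slip s = (Ns - N) / Ns (dimensionless)."""
    return (ns_rpm - rotor_rpm) / ns_rpm

ns_50 = synchronous_speed_rpm(50, 4)    # 1500 rpm at 50 Hz for a 4-pole machine
ns_25 = synchronous_speed_rpm(25, 4)    # 750 rpm when the VFD halves the frequency
print(ns_50, ns_25, slip(ns_50, 1440))  # slip 0.04 at a rotor speed of 1440 rpm
```

This is why a VFD gives smooth speed control: the operating speed tracks the commanded frequency, at the cost of the harmonic distortion the abstract discusses.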

  11. Data Transmission Signal Design and Analysis

    NASA Technical Reports Server (NTRS)

    Moore, J. D.

    1972-01-01

The error performances of several digital signaling methods are determined as a function of a specified signal-to-noise ratio. Results are obtained for Gaussian noise and impulse noise. Performance of a receiver for differentially encoded biphase signaling is obtained by extending the results of differential phase shift keying. The analysis presented obtains a closed-form answer through the use of some simplifying assumptions. The results give insight into the analysis problem; however, the actual error performance may show a degradation because of the assumptions made in the analysis. Bipolar signaling decision-threshold selection is also investigated. The optimum threshold depends on the signal-to-noise ratio and requires the use of an adaptive receiver.
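As one concrete instance of such a closed-form error result, the textbook bit-error rate of binary DPSK in additive Gaussian noise is Pe = ½·exp(−Eb/N0). This is a standard result consistent with the kind of analysis described above, not a formula quoted from the report; the SNR values below are illustrative:

```python
import math

# Textbook closed-form bit-error rate for binary DPSK in Gaussian noise:
# Pe = 0.5 * exp(-Eb/N0), with Eb/N0 supplied in dB for convenience.

def dpsk_bit_error_rate(ebn0_db):
    """Bit-error probability of binary DPSK at the given Eb/N0 (dB)."""
    ebn0 = 10 ** (ebn0_db / 10.0)  # convert dB to a linear ratio
    return 0.5 * math.exp(-ebn0)

for snr_db in (0, 5, 10):
    print(snr_db, "dB ->", dpsk_bit_error_rate(snr_db))
```

The monotone fall-off with SNR is the shape such error-performance curves take; impulse-noise and threshold-selection results would require the report's additional assumptions.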

  12. Surface Management System Departure Event Data Analysis

    NASA Technical Reports Server (NTRS)

    Monroe, Gilena A.

    2010-01-01

This paper presents a data analysis of the Surface Management System (SMS) performance of departure events, including push-back and runway departure events. The paper focuses on the detection performance, or the ability to detect departure events, as well as the prediction performance of SMS. The results detail a modest overall detection performance of push-back events and a significantly high overall detection performance of runway departure events. The overall detection performance of SMS for push-back events is approximately 55%. The overall detection performance of SMS for runway departure events nears 100%. This paper also presents the overall SMS prediction performance for runway departure events as well as the timeliness of the Aircraft Situation Display for Industry data source for SMS predictions.

  13. Is math anxiety in the secondary classroom limiting physics mastery? A study of math anxiety and physics performance

    NASA Astrophysics Data System (ADS)

    Mercer, Gary J.

    This quantitative study examined the relationship between secondary students with math anxiety and physics performance in an inquiry-based constructivist classroom. The Revised Math Anxiety Rating Scale was used to evaluate math anxiety levels. The results were then compared to the performance on a physics standardized final examination. A simple correlation was performed, followed by a multivariate regression analysis to examine effects based on gender and prior math background. The correlation showed statistical significance between math anxiety and physics performance. The regression analysis showed statistical significance for math anxiety, physics performance, and prior math background, but did not show statistical significance for math anxiety, physics performance, and gender.

  14. Predicting performance with traffic analysis tools : final report.

    DOT National Transportation Integrated Search

    2008-03-01

    This document provides insights into the common pitfalls and challenges associated with use of traffic analysis tools for predicting future performance of a transportation facility. It provides five in-depth case studies that demonstrate common ways ...

  15. Rapid Energy Modeling Workflow Demonstration Project

    DTIC Science & Technology

    2014-01-01

Abbreviations include BIM (Building Information Model), BLCC (building life cycle costs), BPA (Building Performance Analysis), and CAD (computer-assisted design). Participants were invited to enroll in the Autodesk Building Performance Analysis (BPA) Certificate Program under a group specifically for DoD installations.

  16. Evaluating Service Quality from Patients' Perceptions: Application of Importance-performance Analysis Method.

    PubMed

    Mohebifar, Rafat; Hasani, Hana; Barikani, Ameneh; Rafiei, Sima

    2016-08-01

    Providing high service quality is one of the main functions of health systems. Measuring service quality is the basic prerequisite for improving quality. The aim of this study was to evaluate the quality of service in teaching hospitals using importance-performance analysis matrix. A descriptive-analytic study was conducted through a cross-sectional method in six academic hospitals of Qazvin, Iran, in 2012. A total of 360 patients contributed to the study. The sampling technique was stratified random sampling. Required data were collected based on a standard questionnaire (SERVQUAL). Data analysis was done through SPSS version 18 statistical software and importance-performance analysis matrix. The results showed a significant gap between importance and performance in all five dimensions of service quality (p < 0.05). In reviewing the gap, "reliability" (2.36) and "assurance" (2.24) dimensions had the highest quality gap and "responsiveness" had the lowest gap (1.97). Also, according to findings, reliability and assurance were in Quadrant (I), empathy was in Quadrant (II), and tangibles and responsiveness were in Quadrant (IV) of the importance-performance matrix. The negative gap in all dimensions of quality shows that quality improvement is necessary in all dimensions. Using quality and diagnosis measurement instruments such as importance-performance analysis will help hospital managers with planning of service quality improvement and achieving long-term goals.
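The importance-performance analysis matrix used above can be sketched as follows. Only the quadrant logic (splitting at the mean importance and mean performance) follows the standard IPA method; the dimension scores are invented for illustration, not the study's data:

```python
# Hypothetical sketch of an importance-performance analysis (IPA) matrix for
# SERVQUAL dimensions. Scores are illustrative (importance, performance) pairs.

def ipa_quadrant(importance, performance, mean_imp, mean_perf):
    """Classify a dimension into one of the four IPA quadrants."""
    if importance >= mean_imp:
        # High importance: performance decides "concentrate" vs "keep up".
        return "I (concentrate here)" if performance < mean_perf else "II (keep up the good work)"
    return "III (low priority)" if performance < mean_perf else "IV (possible overkill)"

dimensions = {
    "reliability":    (4.6, 2.2),
    "assurance":      (4.5, 2.3),
    "empathy":        (4.4, 3.1),
    "responsiveness": (3.9, 2.9),
    "tangibles":      (3.8, 3.0),
}
mean_imp  = sum(i for i, _ in dimensions.values()) / len(dimensions)
mean_perf = sum(p for _, p in dimensions.values()) / len(dimensions)

for name, (imp, perf) in dimensions.items():
    print(f"{name}: gap={imp - perf:.2f}, quadrant {ipa_quadrant(imp, perf, mean_imp, mean_perf)}")
```

With these illustrative scores the placement mirrors the study's findings: reliability and assurance fall in Quadrant I, empathy in Quadrant II, and tangibles and responsiveness in Quadrant IV.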

  17. Integrated Modeling Activities for the James Webb Space Telescope: Structural-Thermal-Optical Analysis

    NASA Technical Reports Server (NTRS)

    Johnston, John D.; Howard, Joseph M.; Mosier, Gary E.; Parrish, Keith A.; McGinnis, Mark A.; Bluth, Marcel; Kim, Kevin; Ha, Kong Q.

    2004-01-01

The James Webb Space Telescope (JWST) is a large, infrared-optimized space telescope scheduled for launch in 2011. This is a continuation of a series of papers on modeling activities for JWST. The structural-thermal-optical, often referred to as STOP, analysis process is used to predict the effect of thermal distortion on optical performance. The benchmark STOP analysis for JWST assesses the effect of an observatory slew on wavefront error. Temperatures predicted using geometric and thermal math models are mapped to a structural finite element model in order to predict thermally induced deformations. Motions and deformations at optical surfaces are then input to optical models, and optical performance is predicted using either an optical ray trace or a linear optical analysis tool. In addition to baseline performance predictions, a process for performing sensitivity studies to assess modeling uncertainties is described.

  18. Cost Analysis and Performance Assessment of Partner Services for Human Immunodeficiency Virus and Sexually Transmitted Diseases, New York State, 2014.

    PubMed

    Johnson, Britney L; Tesoriero, James; Feng, Wenhui; Qian, Feng; Martin, Erika G

    2017-12-01

    To estimate the programmatic costs of partner services for HIV, syphilis, gonorrhea, and chlamydial infection. New York State and local health departments conducting partner services activities in 2014. A cost analysis estimated, from the state perspective, total program costs and cost per case assignment, patient interview, partner notification, and disease-specific key performance indicator. Data came from contracts, a time study of staff effort, and statewide surveillance systems. Disease-specific costs per case assignment (mean: $580; range: $502-$1,111), patient interview ($703; $608-$1,609), partner notification ($1,169; $950-$1,936), and key performance indicator ($2,697; $1,666-$20,255) varied across diseases. Most costs (79 percent) were devoted to gonorrhea and chlamydial infection investigations. Cost analysis complements cost-effectiveness analysis in evaluating program performance and guiding improvements. © Health Research and Educational Trust.

  19. Big Data Analysis of Contractor Performance Information for Services Acquisition in DoD: A Proof of Concept

    DTIC Science & Technology

    2016-04-30

Thirteenth Annual Acquisition Research Symposium, Thursday Sessions, Volume II. Big Data Analysis of Contractor Performance Information for Services Acquisition in DoD: A Proof of Concept. Director, Acquisition Career Management, ASN(RD&A). Acquisition Research Program: Creating Synergy for Informed Change.

  20. Spectral multi-energy CT texture analysis with machine learning for tissue classification: an investigation using classification of benign parotid tumours as a testing paradigm.

    PubMed

    Al Ajmi, Eiman; Forghani, Behzad; Reinhold, Caroline; Bayat, Maryam; Forghani, Reza

    2018-06-01

There is a rich amount of quantitative information in spectral datasets generated from dual-energy CT (DECT). In this study, we compare the performance of texture analysis performed on multi-energy datasets to that of virtual monochromatic images (VMIs) at 65 keV only, using classification of the two most common benign parotid neoplasms as a testing paradigm. Forty-two patients with pathologically proven Warthin tumour (n = 25) or pleomorphic adenoma (n = 17) were evaluated. Texture analysis was performed on VMIs ranging from 40 to 140 keV in 5-keV increments (multi-energy analysis) or 65-keV VMIs only, which is typically considered equivalent to single-energy CT. Random forest (RF) models were constructed for outcome prediction using separate randomly selected training and testing sets or the entire patient set. Using multi-energy texture analysis, tumour classification in the independent testing set had accuracy, sensitivity, specificity, positive predictive value, and negative predictive value of 92%, 86%, 100%, 100%, and 83%, compared to 75%, 57%, 100%, 100%, and 63%, respectively, for single-energy analysis. Multi-energy texture analysis demonstrates superior performance compared to single-energy texture analysis of VMIs at 65 keV for classification of benign parotid tumours. • We present and validate a paradigm for texture analysis of DECT scans. • Multi-energy dataset texture analysis is superior to single-energy dataset texture analysis. • DECT texture analysis has high accuracy for diagnosis of benign parotid tumours. • DECT texture analysis with machine learning can enhance non-invasive diagnostic tumour evaluation.
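The five metrics reported above all derive from a 2×2 confusion matrix. The sketch below uses generic formulas; the counts (tp=6, fn=1, fp=0, tn=5) are an assumed example that happens to reproduce the multi-energy testing-set percentages, not the study's actual tabulated data:

```python
# Sketch of binary-classification metrics from confusion-matrix counts.
# The counts below are an illustrative assumption, not published data.

def diagnostic_metrics(tp, fn, fp, tn):
    """Standard diagnostic performance measures."""
    return {
        "accuracy":    (tp + tn) / (tp + fn + fp + tn),
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
        "ppv":         tp / (tp + fp),   # positive predictive value
        "npv":         tn / (tn + fn),   # negative predictive value
    }

m = diagnostic_metrics(tp=6, fn=1, fp=0, tn=5)
for name, value in m.items():
    print(f"{name}: {value:.0%}")   # 92%, 86%, 100%, 100%, 83%
```

Note how a single false negative in a small testing set moves sensitivity and NPV substantially, which is why small-cohort radiomics results are usually read with confidence intervals in mind.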

  1. A Meta-Analysis of Schema Instruction on the Problem-Solving Performance of Elementary School Students

    ERIC Educational Resources Information Center

    Peltier, Corey; Vannest, Kimberly J.

    2017-01-01

    A variety of instructional practices have been recommended to increase the problem-solving (PS) performance of elementary school children. The purpose of this meta-analysis was to systematically review research on the use of schema instruction to increase the PS performance of elementary school-age students. A total of 21 studies, with 3,408…

  2. Performance Analysis and Optimization on the UCLA Parallel Atmospheric General Circulation Model Code

    NASA Technical Reports Server (NTRS)

    Lou, John; Ferraro, Robert; Farrara, John; Mechoso, Carlos

    1996-01-01

An analysis is presented of several factors influencing the performance of a parallel implementation of the UCLA atmospheric general circulation model (AGCM) on massively parallel computer systems. Several modifications to the original parallel AGCM code aimed at improving its numerical efficiency, interprocessor communication cost, load balance, and issues affecting single-node code performance are discussed.

  3. Application of Cognitive Apprenticeship Model to a Graduate Course in Performance Systems Analysis: A Case Study

    ERIC Educational Resources Information Center

    Darabi, A. Aubteen

    2005-01-01

    This article reports a case study describing how the principles of a cognitive apprenticeship (CA) model developed by Collins, Brown, and Holum (1991) were applied to a graduate course on performance systems analysis (PSA), and the differences this application made in student performance and evaluation of the course compared to the previous…

  4. A Theoretical Analysis of the Performance of Learning Disabled Students on the Woodcock-Johnson Psycho-Educational Battery.

    ERIC Educational Resources Information Center

    Shinn, Mark; And Others

    Two studies were conducted to (1) analyze the subtest characteristics of the Woodcock-Johnson Psycho-Educational Battery, and (2) apply those results to an analysis of 50 fourth grade learning disabled (LD) students' performance on the Battery. Analyses indicated that the poorer performance of LD students on the Woodcock-Johnson Tests of Cognitive…

  5. Free wake analysis of hover performance using a new influence coefficient method

    NASA Technical Reports Server (NTRS)

Quackenbush, Todd R.; Bliss, Donald B.; Ong, Ching Cho

    1990-01-01

A new approach to the prediction of helicopter rotor performance using a free wake analysis was developed. This new method uses a relaxation process that does not suffer from the convergence problems associated with previous time marching simulations. This wake relaxation procedure was coupled to a vortex-lattice, lifting surface loads analysis to produce a novel, self-contained performance prediction code: EHPIC (Evaluation of Helicopter Performance using Influence Coefficients). The major technical features of the EHPIC code are described and a substantial amount of background information on the capabilities and proper operation of the code is supplied. Sample problems were undertaken to demonstrate the robustness and flexibility of the basic approach. Also, a performance correlation study was carried out to establish the breadth of applicability of the code, with very favorable results.

  6. A conflict analysis of 4D descent strategies in a metered, multiple-arrival route environment

    NASA Technical Reports Server (NTRS)

    Izumi, K. H.; Harris, C. S.

    1990-01-01

A conflict analysis was performed on multiple arrival traffic at a typical metered airport. The Flow Management Evaluation Model (FMEM) was used to simulate arrival operations using Denver Stapleton's arrival route structure. Sensitivities of conflict performance to three different 4-D descent strategies (clean-idle Mach/constant airspeed (CAS), constant descent angle Mach/CAS, and energy optimal) were examined for three traffic mixes represented by those found at Denver Stapleton, John F. Kennedy, and typical en route metering (ERM) airports. The Monte Carlo technique was used to generate simulation entry point times. Analysis results indicate that the clean-idle descent strategy offers the best compromise in overall performance. Performance measures primarily include susceptibility to conflict and conflict severity. Fuel usage performance is extrapolated from previous descent strategy studies.

  7. The Impact of Measurement Noise in GPA Diagnostic Analysis of a Gas Turbine Engine

    NASA Astrophysics Data System (ADS)

    Ntantis, Efstratios L.; Li, Y. G.

    2013-12-01

The performance diagnostic analysis of a gas turbine is accomplished by estimating a set of internal engine health parameters from available sensor measurements. No physical measuring instrument, however, can ever completely eliminate the presence of measurement uncertainties. Sensor measurements are often distorted by noise and bias, leading to inaccurate estimation results. This paper explores the impact of measurement noise on gas turbine gas path analysis (GPA). The analysis is demonstrated with a test case in which the gas turbine performance simulation and diagnostics code TURBOMATCH is used to build a performance model of an engine similar to the Rolls-Royce Trent 500 turbofan, and to carry out the diagnostic analysis in the presence of different levels of measurement noise. Finally, to improve the reliability of the diagnostic results, a statistical analysis of the data scattering caused by sensor uncertainties is made. The diagnostic tool used for the statistical analysis of measurement noise impact is a model-based method utilizing non-linear GPA.

  8. Comparison of Collisional and Electron-Based Dissociation Modes for Middle-Down Analysis of Multiply Glycosylated Peptides

    NASA Astrophysics Data System (ADS)

    Khatri, Kshitij; Pu, Yi; Klein, Joshua A.; Wei, Juan; Costello, Catherine E.; Lin, Cheng; Zaia, Joseph

    2018-04-01

Analysis of singly glycosylated peptides has evolved to a point where large-scale LC-MS analyses can be performed at almost the same scale as proteomics experiments. While collisionally activated dissociation (CAD) remains the mainstay of bottom-up analyses, it performs poorly for the middle-down analysis of multiply glycosylated peptides. With improvements in instrumentation, electron-activated dissociation (ExD) modes are becoming increasingly prevalent for proteomics experiments and for the analysis of fragile modifications such as glycosylation. While these methods have been applied for glycopeptide analysis in isolated studies, an organized effort to compare their efficiencies, particularly for analysis of multiply glycosylated peptides (termed here middle-down glycoproteomics), has not been made. We therefore compared the performance of different ExD modes for middle-down glycopeptide analyses. We identified key features among the different dissociation modes and show that increased electron energy and supplemental activation provide the most useful data for middle-down glycopeptide analysis.

  9. A comparison of hierarchical cluster analysis and league table rankings as methods for analysis and presentation of district health system performance data in Uganda.

    PubMed

    Tashobya, Christine K; Dubourg, Dominique; Ssengooba, Freddie; Speybroeck, Niko; Macq, Jean; Criel, Bart

    2016-03-01

In 2003, the Uganda Ministry of Health introduced the district league table for district health system performance assessment. The league table presents district performance against a number of input, process and output indicators and a composite index to rank districts. This study explores the use of hierarchical cluster analysis for analysing and presenting district health systems performance data and compares this approach with the use of the league table in Uganda. Ministry of Health and district plans and reports, and published documents were used to provide information on the development and utilization of the Uganda district league table. Quantitative data were accessed from the Ministry of Health databases. Statistical analysis was performed using SPSS version 20; hierarchical cluster analysis utilizing Ward's method was used. The hierarchical cluster analysis was conducted on the basis of seven clusters determined for each year from 2003 to 2010, ranging from a cluster of good through moderate-to-poor performers. The characteristics and membership of clusters varied from year to year and were determined by the identity and magnitude of performance of the individual variables. Criticisms of the league table include: perceived unfairness, as it did not take into consideration district peculiarities; and being oversummarized and not adequately informative. Clustering organizes the many data points into clusters of similar entities according to an agreed set of indicators and can provide the beginning point for identifying factors behind the observed performance of districts. Although league table rankings emphasize summation and external control, clustering has the potential to encourage a formative, learning approach. More research is required to shed more light on factors behind observed performance of the different clusters. Other countries, especially low-income countries that share many similarities with Uganda, can learn from these experiences. 
© The Author 2015. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine.
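Agglomerative clustering with Ward's criterion, as applied above, can be sketched on a one-dimensional composite score. The district scores below are invented, and a real analysis (such as the study's SPSS run) would cluster on the full set of indicators rather than a single index:

```python
# Illustrative sketch of Ward-style agglomerative clustering on a composite
# district performance score. Scores are invented for demonstration.

def ward_cost(c1, c2):
    """Increase in within-cluster variance if clusters c1 and c2 are merged."""
    n1, m1 = len(c1), sum(c1) / len(c1)
    n2, m2 = len(c2), sum(c2) / len(c2)
    return n1 * n2 / (n1 + n2) * (m1 - m2) ** 2

def cluster(scores, k):
    """Merge the cheapest pair repeatedly until k clusters remain."""
    clusters = [[s] for s in scores]
    while len(clusters) > k:
        i, j = min(
            ((a, b) for a in range(len(clusters)) for b in range(a + 1, len(clusters))),
            key=lambda ab: ward_cost(clusters[ab[0]], clusters[ab[1]]),
        )
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return [sorted(c) for c in clusters]

scores = [92, 88, 90, 55, 58, 20, 25]      # composite scores for 7 districts
print(cluster(scores, 3))                  # good / moderate / poor performers
```

The result groups similar performers together rather than imposing a single rank order, which is the formative, learning-oriented property the abstract contrasts with league tables.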

  10. A comparison of hierarchical cluster analysis and league table rankings as methods for analysis and presentation of district health system performance data in Uganda†

    PubMed Central

    Tashobya, Christine K; Dubourg, Dominique; Ssengooba, Freddie; Speybroeck, Niko; Macq, Jean; Criel, Bart

    2016-01-01

In 2003, the Uganda Ministry of Health introduced the district league table for district health system performance assessment. The league table presents district performance against a number of input, process and output indicators and a composite index to rank districts. This study explores the use of hierarchical cluster analysis for analysing and presenting district health systems performance data and compares this approach with the use of the league table in Uganda. Ministry of Health and district plans and reports, and published documents were used to provide information on the development and utilization of the Uganda district league table. Quantitative data were accessed from the Ministry of Health databases. Statistical analysis was performed using SPSS version 20; hierarchical cluster analysis utilizing Ward's method was used. The hierarchical cluster analysis was conducted on the basis of seven clusters determined for each year from 2003 to 2010, ranging from a cluster of good through moderate-to-poor performers. The characteristics and membership of clusters varied from year to year and were determined by the identity and magnitude of performance of the individual variables. Criticisms of the league table include: perceived unfairness, as it did not take into consideration district peculiarities; and being oversummarized and not adequately informative. Clustering organizes the many data points into clusters of similar entities according to an agreed set of indicators and can provide the beginning point for identifying factors behind the observed performance of districts. Although league table rankings emphasize summation and external control, clustering has the potential to encourage a formative, learning approach. More research is required to shed more light on factors behind observed performance of the different clusters. Other countries, especially low-income countries that share many similarities with Uganda, can learn from these experiences. PMID:26024882

  11. Relationships among video gaming proficiency and spatial orientation, laparoscopic, and traditional surgical skills of third-year veterinary students.

    PubMed

    Millard, Heather A Towle; Millard, Ralph P; Constable, Peter D; Freeman, Lyn J

    2014-02-01

    To determine the relationships among traditional and laparoscopic surgical skills, spatial analysis skills, and video gaming proficiency of third-year veterinary students. Prospective, randomized, controlled study. A convenience sample of 29 third-year veterinary students. The students had completed basic surgical skills training with inanimate objects but had no experience with soft tissue, orthopedic, or laparoscopic surgery; the spatial analysis test; or the video games that were used in the study. Scores for traditional surgical, laparoscopic, spatial analysis, and video gaming skills were determined, and associations among these were analyzed by means of Spearman's rank order correlation coefficient (rs). A significant positive association (rs = 0.40) was detected between summary scores for video game performance and laparoscopic skills, but not between video game performance and traditional surgical skills scores. Spatial analysis scores were positively (rs = 0.30) associated with video game performance scores; however, that result was not significant. Spatial analysis scores were not significantly associated with laparoscopic surgical skills scores. Traditional surgical skills scores were not significantly associated with laparoscopic skills or spatial analysis scores. Results of this study indicated video game performance of third-year veterinary students was predictive of laparoscopic but not traditional surgical skills, suggesting that laparoscopic performance may be improved with video gaming experience. Additional studies would be required to identify methods for improvement of traditional surgical skills.
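
    The Spearman rank-order analysis reported above is straightforward to reproduce with SciPy. The paired scores below are invented for illustration and are not the study's data.

```python
from scipy.stats import spearmanr

# Hypothetical paired skill scores for seven students (not the
# study's measurements).
video_game = [72, 85, 60, 90, 78, 66, 88]
laparoscopic = [65, 80, 58, 92, 70, 62, 75]

# Spearman's rs measures the strength of the monotonic association
# between the two rankings, without assuming linearity.
rs, p_value = spearmanr(video_game, laparoscopic)
print(round(rs, 2))  # strong positive rank correlation
```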

  12. Finite element method analysis of cold forging for deformation and densification of Mo alloyed sintered steel

    NASA Astrophysics Data System (ADS)

    Kamakoshi, Y.; Nishida, S.; Kanbe, K.; Shohji, I.

    2017-10-01

    In recent years, powder metallurgy (P/M) materials have been expected to be applied to automobile products, so not only high cost performance but also greater strength, wear resistance, long life and so on are required of P/M materials. Densification is expected to be one of the effective processes for improving the mechanical properties of P/M materials. In this study, finite element method (FEM) analysis was performed to examine the densification behaviour of Mo-alloyed sintered steel in a cold-forging process. Firstly, a columnar specimen was cut from the inner part of a sintered specimen and a load-stroke diagram was obtained by a compression test. 2D FEM analysis was performed using the obtained load-stroke diagram. To correct the stress errors between the porous mode and the rigid-elastic mode of the analysis software, a polynomial approximation was applied. As a result, a modified true stress-true strain diagram was obtained for the sintered steel with densification. Afterwards, 3D FEM analysis of backward extrusion was carried out using the modified true stress-true strain diagram. It was confirmed that both the shape and the density of the sintered steel predicted by the proposed FEM analysis correspond well with the experimental ones.

  13. CLUSFAVOR 5.0: hierarchical cluster and principal-component analysis of microarray-based transcriptional profiles

    PubMed Central

    Peterson, Leif E

    2002-01-01

    CLUSFAVOR (CLUSter and Factor Analysis with Varimax Orthogonal Rotation) 5.0 is a Windows-based computer program for hierarchical cluster and principal-component analysis of microarray-based transcriptional profiles. CLUSFAVOR 5.0 standardizes input data; sorts data according to gene-specific coefficient of variation, standard deviation, average and total expression, and Shannon entropy; performs hierarchical cluster analysis using nearest-neighbor, unweighted pair-group method using arithmetic averages (UPGMA), or furthest-neighbor joining methods, and Euclidean, correlation, or jack-knife distances; and performs principal-component analysis. PMID:12184816
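
    The standardization and principal-component steps that CLUSFAVOR applies can be sketched with NumPy via the singular value decomposition; the gene-by-sample matrix below is random placeholder data rather than microarray output.

```python
import numpy as np

# Placeholder expression matrix (genes x samples); illustrative only.
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 6))

# Standardize each gene profile to zero mean and unit variance.
Xs = (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)

# Principal components from the SVD; squared singular values give
# the variance captured by each component.
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
explained = s**2 / np.sum(s**2)
print(explained.round(2))  # variance fraction per component
```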

  14. Predictability of Brayton electric power system performance

    NASA Technical Reports Server (NTRS)

    Klann, J. L.; Hettel, H. J.

    1972-01-01

    Data from the first tests of the 2- to 15-kilowatt space power system in a vacuum chamber were compared with predictions of both a pretest analysis and a modified version of that analysis. The pretest analysis predicted test results within 9 percent of the largest measured value for each quantity. The modified analysis correlated the measurements: differences in conversion efficiency and power output were no greater than plus or minus 2.5 percent. This modified analysis was used to project space performance maps for the current test system.

  15. Therapy of bovine endometritis with prostaglandin F2α: a meta-analysis.

    PubMed

    Haimerl, P; Heuwieser, W; Arlt, S

    2013-05-01

    The objective of this meta-analysis was to assess the efficacy of treating bovine endometritis with PGF(2α) by statistical means. Postpartum uterine infections have a high prevalence and a very negative effect on reproductive performance in dairy cattle, and research results on PGF(2α) treatment are widely discordant. A comprehensive literature search of online databases revealed a total of 2,307 references, and 5 additional articles were retrieved by reviewing citations. After applying specific exclusion criteria and evaluating specific evidence parameters, 5 publications, comprising 6 trials, were eligible for meta-analysis. Data for each trial were extracted and analyzed using the meta-analysis software Review Manager (version 5.1; The Nordic Cochrane Centre, Copenhagen, Denmark). Estimated effect sizes of PGF(2α) were calculated for the calving to first service and calving to conception intervals. Prostaglandin F(2α) treatment of cows with chronic endometritis had a negative effect on both reproductive performance parameters. Heterogeneity was substantial for both intervals [I(2) (measure of variation beyond chance) = 100 and 87%, respectively]; therefore, random-effects models were used. Sensitivity and subgroup analyses showed that the performance of randomization modified the effect size of PGF(2α) treatment. The funnel plot illustrated a publication bias toward smaller studies that reported a prolonged calving to conception interval after PGF(2α) treatment. We conclude that this meta-analysis did not reveal an improvement in the reproductive performance of cows with endometritis after treatment with PGF(2α). Furthermore, there is a shortage of comparable high-quality studies investigating reproductive performance after PGF(2α) treatment of cows with chronic endometritis. Copyright © 2013 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
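
    The random-effects pooling with an I² heterogeneity check, as described in the abstract, can be sketched with the standard DerSimonian-Laird estimator. The per-trial effect sizes and variances below are invented for illustration; they are not the values extracted from the six trials.

```python
import numpy as np

# Hypothetical per-trial effects (e.g., days added to the calving to
# conception interval) and their variances; illustrative only.
y = np.array([12.0, 8.0, 15.0, -2.0, 10.0, 6.0])
v = np.array([4.0, 9.0, 6.0, 5.0, 8.0, 7.0])

# Fixed-effect weights and Cochran's Q heterogeneity statistic.
w = 1.0 / v
q = np.sum(w * (y - np.sum(w * y) / np.sum(w)) ** 2)
df = len(y) - 1
i2 = max(0.0, (q - df) / q) * 100.0  # % variation beyond chance

# DerSimonian-Laird between-trial variance, then random-effects pooling.
tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
w_re = 1.0 / (v + tau2)
pooled = np.sum(w_re * y) / np.sum(w_re)
print(round(float(pooled), 2), round(float(i2), 1))
```

    A large I² (as in the review, 87-100%) signals that the trials disagree beyond chance, which is what motivates a random-effects model over a fixed-effect one.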

  16. Demonstration and Evaluation of Solid Phase Microextraction for the Assessment of Bioavailability and Contaminant Mobility. ESTCP Cost and Performance Report

    DTIC Science & Technology

    2012-08-01

    subsequent chemical analysis (into acetonitrile for high-performance liquid chromatography [HPLC] analysis or hexane for gas chromatography [GC] analysis) is rapid and complete. In this work, PAHs were analyzed by Waters 2795 HPLC with fluorescent detection (USEPA Method 8310) and PCBs were ... detection limits by direct water injection versus SPME with PDMS, and coefficient of variation and correlation coefficient for SPME. Analysis by HPLC ...

  17. Radiological performance assessment for the E-Area Vaults Disposal Facility. Appendices A through M

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, J.R.

    1994-04-15

    This document contains appendices A through M for the performance assessment. They are: A, details of models and assumptions; B, computer codes; C, data tabulation; D, geochemical interactions; E, hydrogeology of the Savannah River Site; F, software QA plans; G, completeness review guide; H, performance assessment peer review panel recommendations; I, suspect soil performance analysis; J, sensitivity/uncertainty analysis; K, vault degradation study; L, description of naval reactor waste disposal; M, porflow input file. (GHH)

  18. MSFC Skylab structures and mechanical systems mission evaluation

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A performance analysis for the structural and mechanical major hardware systems and components is presented. Development background, testing, modifications, and requirement adjustments are included. Functional narratives are provided for comparison purposes, as are predicted design performance criteria. Each item is evaluated on an individual basis: (1) history (requirements, design, manufacture, and test); (2) in-orbit performance (description and analysis); and (3) conclusions and recommendations regarding future space hardware application. Overall, the structural and mechanical performance of the Skylab hardware was outstanding.

  19. Multi-ingredients determination and fingerprint analysis of leaves from Ilex latifolia using ultra-performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry.

    PubMed

    Fan, Chunlin; Deng, Jiewei; Yang, Yunyun; Liu, Junshan; Wang, Ying; Zhang, Xiaoqi; Fai, Kuokchiu; Zhang, Qingwen; Ye, Wencai

    2013-10-01

    An ultra-performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry (UPLC-QTOF-MS) method integrating multi-ingredient determination and fingerprint analysis has been established for quality assessment and control of leaves from Ilex latifolia. The method is fast, efficient and accurate, and allows multi-ingredient determination and fingerprint analysis in one chromatographic run within 13 min. Multi-ingredient determination was performed on the extracted ion chromatograms of the exact pseudo-molecular ions (with a 0.01 Da window), and fingerprint analysis was performed on the base peak chromatograms, both obtained by negative-ion electrospray ionization QTOF-MS. The validation results demonstrated that the developed method possesses desirable specificity, linearity, precision and accuracy. The method was used to analyze 22 I. latifolia samples from different origins. Quality assessment was achieved using both similarity analysis (SA) and principal component analysis (PCA), and the results from SA were consistent with those from PCA. Our experimental results demonstrate that the strategy integrating multi-ingredient determination and fingerprint analysis using the UPLC-QTOF-MS technique is a useful approach for rapid pharmaceutical analysis, with promising prospects for differentiation of origin, determination of authenticity, and overall quality assessment of herbal medicines. Copyright © 2013 Elsevier B.V. All rights reserved.

  20. The Function of Neuroendocrine Cells in Prostate Cancer

    DTIC Science & Technology

    2013-04-01

    integration site. We then performed deep sequencing and aligned reads to the genome. Our analysis revealed that both histological phenotypes are derived from ... lentiviral integration site analysis. (B) Laser capture microdissection was performed on individual glands containing both squamous and ... lentiviral integration site analysis. LTR: long terminal repeat (viral DNA), PCR: polymerase chain reaction. (D) Venn diagrams depict shared lentiviral

  1. Failure Mode/Mechanism Distributions

    DTIC Science & Technology

    1991-09-01

    circuits, hybrids, discrete semiconductors, microwave devices, optoelectronics and nonelectronic parts employed in military, space, industrial and ... FMEA may be performed as a hardware analysis, a functional analysis, or a combination analysis and is ideally initiated at the part, circuit or ... by a single replaceable module, a separate FMEA could be performed on the internal functions of the module, viewing the module as a system. The level

  2. Finite wordlength implementation of a megachannel digital spectrum analyzer

    NASA Technical Reports Server (NTRS)

    Satorius, E. H.; Grimm, M. J.; Zimmerman, G. A.; Wilck, H. C.

    1986-01-01

    The results of an extensive system analysis of the megachannel spectrum analyzer currently being developed for use in various applications of the Deep Space Network are presented. The intent of this analysis is to quantify the effects of digital quantization errors on system performance. The results of this analysis provide useful guidelines for choosing various system design parameters to enhance system performance.
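
    The report's quantization-error budgeting is specific to the analyzer's fixed-point architecture, but the textbook figure of merit for uniform quantization gives a feel for the numbers; the function below is that generic model, not the report's analysis.

```python
def quantization_snr_db(bits: int) -> float:
    """Ideal SNR of a b-bit uniform quantizer driven by a
    full-scale sinusoid: 6.02*b + 1.76 dB."""
    return 6.02 * bits + 1.76

# Each extra bit of word length buys about 6 dB of dynamic range.
for b in (8, 12, 16):
    print(b, round(quantization_snr_db(b), 2))
```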

  3. Material nonlinear analysis via mixed-iterative finite element method

    NASA Technical Reports Server (NTRS)

    Sutjahjo, Edhi; Chamis, Christos C.

    1992-01-01

    The performance of elastic-plastic mixed-iterative analysis is examined through a set of convergence studies. Membrane and bending behaviors are tested using 4-node quadrilateral finite elements. The membrane result is excellent, which indicates the implementation of elastic-plastic mixed-iterative analysis is appropriate. On the other hand, further research to improve bending performance of the method seems to be warranted.

  4. Driver performance measurement and analysis system (DPMAS). Volume 1, Description and operations manual

    DOT National Transportation Integrated Search

    1976-08-01

    A prototype driver performance measurement and analysis system (DPMAS) has been developed for the National Highway Traffic Safety Administration (NHTSA). This system includes a completely instrumented 1974 Chevrolet Impala capable of digitally record...

  5. Statistical Evaluation of Time Series Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Benignus, V. A.

    1973-01-01

    The performance of a modified version of NASA's multivariate spectrum analysis program is discussed. A multiple regression model was used to make the revisions. Performance improvements were documented and compared to the standard fast Fourier transform by Monte Carlo techniques.

  6. Fem and Experimental Analysis of Thin-Walled Composite Elements Under Compression

    NASA Astrophysics Data System (ADS)

    Różyło, P.; Wysmulski, P.; Falkowicz, K.

    2017-05-01

    Thin-walled steel elements in the form of openwork columns with variable geometrical parameters of holes were studied. The thin-walled composite columns were modelled numerically and subjected to axial compression to examine their behavior in the critical and post-critical states. The numerical models were articulately supported on the upper and lower edges of the cross-section of the profiles. The analysis addressed only the non-linear stability of the structure, and the FEM computation was continued until the material reached its yield stress, in order to force loss of stability of the structures. The numerical analysis was performed with the ABAQUS® software, and only for the elastic range, to ensure the operating stability of the tested thin-walled structures.

  7. Performance analysis and dynamic modeling of a single-spool turbojet engine

    NASA Astrophysics Data System (ADS)

    Andrei, Irina-Carmen; Toader, Adrian; Stroe, Gabriela; Frunzulica, Florin

    2017-01-01

    The purposes of modeling and simulation of a turbojet engine are steady-state analysis and transient analysis. The steady-state analysis investigates the operating (equilibrium) regimes, based on appropriate modeling of the turbojet engine at design and off-design conditions; its result is the performance analysis, summarized in the engine's operational maps (i.e., the altitude map, velocity map and speed map) and the engine's universal map. The mathematical model that allows calculation of the design and off-design performance of a single-spool turbojet is detailed. An in-house code was developed and calibrated using the J85 turbojet engine as the test case. The dynamic model of the turbojet engine is obtained from the energy balance equations for the compressor, combustor and turbine, as the engine's main parts. The transient analysis, based on appropriate modeling of the engine and its main parts, expresses the dynamic behavior of the turbojet engine and provides details regarding the engine's control. The aim of the dynamic analysis is to determine a control program for the turbojet, based on the results of the performance analysis. In the case of a single-spool turbojet engine with fixed nozzle geometry, the thrust is controlled by a single parameter, the fuel flow rate. The design and management of aircraft engine controls are based on the results of the transient analysis. The construction of the design model is complex, since it is based on both steady-state and transient analysis, further allowing flight path cycle analysis and optimization. This paper presents numerical simulations for a single-spool turbojet engine (J85 as the test case), with appropriate modeling for steady-state and dynamic analysis.

  8. Development of an automated analysis system for data from flow cytometric intracellular cytokine staining assays from clinical vaccine trials

    PubMed Central

    Shulman, Nick; Bellew, Matthew; Snelling, George; Carter, Donald; Huang, Yunda; Li, Hongli; Self, Steven G.; McElrath, M. Juliana; De Rosa, Stephen C.

    2008-01-01

    Background Intracellular cytokine staining (ICS) by multiparameter flow cytometry is one of the primary methods for determining T cell immunogenicity in HIV-1 clinical vaccine trials. Data analysis requires considerable expertise and time. The amount of data is increasing quickly as more and larger trials are performed, so there is a critical need for high-throughput methods of data analysis. Methods A web-based flow cytometric analysis system, LabKey Flow, was developed for analyses of data from standardized ICS assays. A gating template was created manually in commercially available flow cytometric analysis software. Using this template, the system automatically compensated and analyzed all data sets. Quality control queries were designed to identify potentially incorrect sample collections. Results Comparison of the semi-automated analysis performed by LabKey Flow and the manual analysis performed using FlowJo software demonstrated excellent concordance (concordance correlation coefficient >0.990). Manual inspection of the analyses performed by LabKey Flow for 8-color ICS data files from several clinical vaccine trials indicates that template gates can appropriately be used for most data sets. Conclusions The semi-automated LabKey Flow analysis system can accurately analyze large ICS data files. Routine use of the system does not require specialized expertise. This high-throughput analysis will provide great utility for rapid evaluation of complex multiparameter flow cytometric measurements collected from large clinical trials. PMID:18615598

  9. Handbook of experiences in the design and installation of solar heating and cooling systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ward, D.S.; Oberoi, H.S.

    1980-07-01

    A large array of problems encountered is detailed, including design errors, installation mistakes, cases of inadequate durability of materials and unacceptable reliability of components, and wide variations in the performance and operation of different solar systems. Durability, reliability, and design problems are reviewed for solar collector subsystems, heat transfer fluids, thermal storage, passive solar components, piping/ducting, and reliability/operational problems. The following performance topics are covered: criteria for design and performance analysis, domestic hot water systems, passive space heating systems, active space heating systems, space cooling systems, analysis of systems performance, and performance evaluations. (MHR)

  10. How Game Location Affects Soccer Performance: T-Pattern Analysis of Attack Actions in Home and Away Matches.

    PubMed

    Diana, Barbara; Zurloni, Valentino; Elia, Massimiliano; Cavalera, Cesare M; Jonsson, Gudberg K; Anguera, M Teresa

    2017-01-01

    The influence of game location on performance has been widely examined in sport contexts. In soccer, game location positively affects the secondary and tertiary levels of performance; however, there is less evidence about its effect on game structure (the primary level of performance). This study aimed to detect the effect of game location on the primary level of performance in soccer. In particular, the objective was to reveal the hidden structures underlying the attack actions in both home and away matches played by a top club (Serie A 2012/2013, First Leg). The methodological approach was based on systematic observation, supported by digital recordings and T-pattern analysis. Data were analyzed with THEME 6.0 software. A quantitative analysis, with the nonparametric Mann-Whitney test and descriptive statistics, was carried out to test the hypotheses. A qualitative analysis of complex patterns was performed to obtain in-depth information on the game structure. This study showed that game tactics differed significantly, with home matches characterized by a more structured and varied game than away matches. In particular, a higher number of different patterns, with a higher level of complexity and including more unique behaviors, was detected in home matches than in the away ones. No significant differences were found in the number of events coded per game between the two conditions. THEME software, and the corresponding T-pattern detection algorithm, enhance research opportunities by going beyond frequency-based analyses, making this method an effective tool for supporting sport performance analysis and training.

  11. How Game Location Affects Soccer Performance: T-Pattern Analysis of Attack Actions in Home and Away Matches

    PubMed Central

    Diana, Barbara; Zurloni, Valentino; Elia, Massimiliano; Cavalera, Cesare M.; Jonsson, Gudberg K.; Anguera, M. Teresa

    2017-01-01

    The influence of game location on performance has been widely examined in sport contexts. In soccer, game location positively affects the secondary and tertiary levels of performance; however, there is less evidence about its effect on game structure (the primary level of performance). This study aimed to detect the effect of game location on the primary level of performance in soccer. In particular, the objective was to reveal the hidden structures underlying the attack actions in both home and away matches played by a top club (Serie A 2012/2013, First Leg). The methodological approach was based on systematic observation, supported by digital recordings and T-pattern analysis. Data were analyzed with THEME 6.0 software. A quantitative analysis, with the nonparametric Mann-Whitney test and descriptive statistics, was carried out to test the hypotheses. A qualitative analysis of complex patterns was performed to obtain in-depth information on the game structure. This study showed that game tactics differed significantly, with home matches characterized by a more structured and varied game than away matches. In particular, a higher number of different patterns, with a higher level of complexity and including more unique behaviors, was detected in home matches than in the away ones. No significant differences were found in the number of events coded per game between the two conditions. THEME software, and the corresponding T-pattern detection algorithm, enhance research opportunities by going beyond frequency-based analyses, making this method an effective tool for supporting sport performance analysis and training. PMID:28878712

  12. T-tubule disease: Relationship between t-tubule organization and regional contractile performance in human dilated cardiomyopathy.

    PubMed

    Crossman, David J; Young, Alistair A; Ruygrok, Peter N; Nason, Guy P; Baddelely, David; Soeller, Christian; Cannell, Mark B

    2015-07-01

    Evidence from animal models suggests that t-tubule changes may play an important role in the contractile deficit associated with heart failure. However, samples are usually taken at random with no regard to the regional variability present in failing hearts, which leads to uncertainty in the relationship between contractile performance and possible t-tubule derangement. Regional contraction in human hearts was measured by tagged cine MRI and model fitting. At transplant, failing hearts were biopsy sampled in identified regions, and immunocytochemistry was used to label t-tubules and sarcomeric z-lines. Computer image analysis was used to assess 5 different unbiased measures of t-tubule structure/organization. In regions of failing hearts that showed good contractile performance, t-tubule organization was similar to that seen in normal hearts, with worsening structure correlating with the loss of regional contractile performance. Statistical analysis showed that t-tubule direction was most highly correlated with local contractile performance, followed by the amplitude of the sarcomeric peak in the Fourier transform of the t-tubule image. Other area-based measures were less well correlated. We conclude that regional contractile performance in failing human hearts is strongly correlated with local t-tubule organization. Cluster tree analysis with a functional definition of failing contraction strength allowed a pathological definition of 't-tubule disease'. The regional variability in contractile performance and cellular structure is a confounding issue for the analysis of samples taken from failing human hearts, although this may be overcome with regional analysis using tagged cMRI and biopsy mapping. Copyright © 2015 Elsevier Ltd. All rights reserved.
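
    The "amplitude of the sarcomeric peak in the Fourier transform" can be illustrated with a one-dimensional toy: a synthetic intensity profile with ~1.8 um periodicity (a typical sarcomere spacing) whose dominant spectral peak recovers the period. The pixel size and scan length below are assumptions, not parameters from the paper.

```python
import numpy as np

# Synthetic line scan standing in for a t-tubule image profile.
dx = 0.1                      # assumed pixel size, um
x = np.arange(0, 50, dx)      # 50 um scan
period = 1.8                  # sarcomere spacing, um
signal = 1.0 + 0.5 * np.sin(2 * np.pi * x / period)

# Amplitude spectrum of the mean-subtracted profile; the sarcomeric
# peak sits near 1/1.8 cycles per um.
spec = np.abs(np.fft.rfft(signal - signal.mean()))
freqs = np.fft.rfftfreq(len(x), d=dx)
peak_freq = freqs[np.argmax(spec)]
print(round(1.0 / peak_freq, 2))  # recovered period, um
```

    A lower, blurrier peak indicates less regular t-tubule spacing, which is the intuition behind using its amplitude as a structural measure.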

  13. Robust Control Systems.

    DTIC Science & Technology

    1981-12-01

    time control system algorithms that will perform adequately (i.e., at least maintain closed-loop system stability) when uncertain parameters in the ... system design models vary significantly. Such a control algorithm is said to have stability robustness, or more simply is said to be "robust". This ... cases above, the performance is analyzed using a covariance analysis. The development of all the controllers and the performance analysis algorithms is

  14. Applications of advanced V/STOL aircraft concepts to civil utility missions. Volume 2: Appendices

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The linear performance definition curves for the lift fan aircraft, tilt rotor aircraft, and advanced helicopter are given. The computer program written to perform the mission analysis for this study is also documented, and examples of its use are shown. Methods used to derive the performance coefficients for use in the mission analysis of the lift fan aircraft are described.

  15. The Quadratic Relationship between Socioeconomic Status and Learning Performance in China by Multilevel Analysis: Implications for Policies to Foster Education Equity

    ERIC Educational Resources Information Center

    Zhao, Ningning; Valcke, Martin; Desoete, Annemie; Verhaeghe, JeanPierre

    2012-01-01

    The purpose of the present study is to explore the relationship between family socioeconomic status and mathematics performance on the base of a multi-level analysis involving a large sample of Chinese primary school students. A weak relationship is found between socioeconomic status and performance in the Chinese context. The relationship does…

  16. Study on the influence of X-ray tube spectral distribution on the analysis of bulk samples and thin films: Fundamental parameters method and theoretical coefficient algorithms

    NASA Astrophysics Data System (ADS)

    Sitko, Rafał

    2008-11-01

    Knowledge of X-ray tube spectral distribution is necessary in theoretical methods of matrix correction, i.e. in both fundamental parameter (FP) methods and theoretical influence coefficient algorithms. Thus, the influence of X-ray tube distribution on the accuracy of the analysis of thin films and bulk samples is presented. The calculations are performed using experimental X-ray tube spectra taken from the literature and theoretical X-ray tube spectra evaluated by three different algorithms proposed by Pella et al. (X-Ray Spectrom. 14 (1985) 125-135), Ebel (X-Ray Spectrom. 28 (1999) 255-266), and Finkelshtein and Pavlova (X-Ray Spectrom. 28 (1999) 27-32). In this study, Fe-Cr-Ni system is selected as an example and the calculations are performed for X-ray tubes commonly applied in X-ray fluorescence analysis (XRF), i.e., Cr, Mo, Rh and W. The influence of X-ray tube spectra on FP analysis is evaluated when quantification is performed using various types of calibration samples. FP analysis of bulk samples is performed using pure-element bulk standards and multielement bulk standards similar to the analyzed material, whereas for FP analysis of thin films, the bulk and thin pure-element standards are used. For the evaluation of the influence of X-ray tube spectra on XRF analysis performed by theoretical influence coefficient methods, two algorithms for bulk samples are selected, i.e. Claisse-Quintin (Can. Spectrosc. 12 (1967) 129-134) and COLA algorithms (G.R. Lachance, Paper Presented at the International Conference on Industrial Inorganic Elemental Analysis, Metz, France, June 3, 1981) and two algorithms (constant and linear coefficients) for thin films recently proposed by Sitko (X-Ray Spectrom. 37 (2008) 265-272).

  17. Plug cluster module demonstration

    NASA Technical Reports Server (NTRS)

    Rousar, D. C.

    1978-01-01

    The low-pressure, film-cooled rocket engine design concept developed during two previous ALRC programs was re-evaluated for application as a module of a plug cluster engine capable of performing space shuttle OTV missions. The nominal engine mixture ratio was 5.5, and the engine life requirements were 1200 thermal cycles and 10 hours total operating life. The program consisted of pretest analysis; engine tests, performed using residual components; and posttest analysis. The pretest analysis indicated that operation of the film-cooled engine at O/F = 5.5 was feasible. During the engine tests, steady-state wall temperature and performance measurements were obtained over a range of film cooling flow rates, and the durability of the engine was demonstrated by firing the test engine 1220 times at a nominal performance ranging from 430 to 432 seconds. The performance of the test engine was limited by film coolant sleeve damage which had occurred during previous testing. The post-test analyses indicated that the nominal performance level can be increased to 436 seconds.

  18. Musicians, postural quality and musculoskeletal health: A literature's review.

    PubMed

    Blanco-Piñeiro, Patricia; Díaz-Pereira, M Pino; Martínez, Aurora

    2017-01-01

    An analysis of the salient characteristics of research papers published between 1989 and 2015 that evaluate the relationship between postural quality during musical performance and various performance quality and health factors, with emphasis on musculoskeletal health variables. Searches of Medline, Scopus and Google Scholar for papers that analysed the subject of the study objective. The following MeSH descriptors were used: posture; postural balance; muscle, skeletal; task performance and analysis; back; and spine and music. A descriptive statistical analysis of their methodology (sample types, temporal design, and postural, health and other variables analysed) and findings has been made. The inclusion criterion was that the body postural quality of the musicians during performance was included among the target study variables. Forty-one relevant empirical studies were found, written in English. Comparison and analysis of their results was hampered by great disparities in measuring instruments and operationalization of variables. Despite the growing interest in the relationships among these variables, the empirical knowledge base still has many limitations, making rigorous comparative analysis difficult. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Transmission Index Research of Parallel Manipulators Based on Matrix Orthogonal Degree

    NASA Astrophysics Data System (ADS)

    Shao, Zhu-Feng; Mo, Jiao; Tang, Xiao-Qiang; Wang, Li-Ping

    2017-11-01

    A performance index is the standard of performance evaluation and the foundation of both performance analysis and optimal design for the parallel manipulator. Seeking suitable kinematic indices is always an important and challenging issue for the parallel manipulator. So far there have been extensive studies in this field, but few existing indices meet all the requirements of being simple, intuitive, and universal. To solve this problem, the matrix orthogonal degree is adopted, and generalized transmission indices that can evaluate the motion/force transmissibility of fully parallel manipulators are proposed. Transmission performance analysis of typical branches, end effectors, and parallel manipulators is given to illustrate the proposed indices and analysis methodology. Simulation and analysis results reveal that the proposed transmission indices possess significant advantages: they are normalized and finite (ranging from 0 to 1), dimensionally homogeneous, frame-free, intuitive, and easy to calculate. Besides, the proposed indices indicate the good-transmission region and the proximity to singularity with better resolution than the traditional local conditioning index, and provide a novel tool for kinematic analysis and optimal design of fully parallel manipulators.

  20. SENSITIVITY ANALYSIS OF THE USEPA WINS PM 2.5 SEPARATOR

    EPA Science Inventory

    Factors affecting the performance of the US EPA WINS PM2.5 separator have been systematically evaluated. In conjunction with the separator's laboratory calibrated penetration curve, analysis of the governing equation that describes conventional impactor performance was used to ...

  1. MODA A Framework for Memory Centric Performance Characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shrestha, Sunil; Su, Chun-Yi; White, Amanda M.

    2012-06-29

    In the age of massive parallelism, the focus of performance analysis has switched from the processor and related structures to the memory and I/O resources. Adapting to this new reality, a performance analysis tool has to provide a way to analyze resource usage to pinpoint existing and potential problems in a given application. This paper provides an overview of the Memory Observant Data Analysis (MODA) tool, a memory-centric tool first implemented on the Cray XMT supercomputer. Throughout the paper, MODA's capabilities have been showcased with experiments done on matrix multiply and Graph-500 application codes.

  2. Scalable and Power Efficient Data Analytics for Hybrid Exascale Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choudhary, Alok; Samatova, Nagiza; Wu, Kesheng

    This project developed a generic and optimized set of core data analytics functions. These functions organically consolidate a broad constellation of high performance analytical pipelines. As the architectures of emerging HPC systems become inherently heterogeneous, there is a need to design algorithms for data analysis kernels accelerated on hybrid multi-node, multi-core HPC architectures comprised of a mix of CPUs, GPUs, and SSDs. Furthermore, the power-aware trend drives advances in our performance-energy tradeoff analysis framework, which enables our data analysis kernel algorithms and software to be parameterized so that users can choose the right power-performance optimizations.

  3. SEM-PLS Analysis of Inhibiting Factors of Cost Performance for Large Construction Projects in Malaysia: Perspective of Clients and Consultants

    PubMed Central

    Memon, Aftab Hameed; Rahman, Ismail Abdul

    2014-01-01

    This study uncovered inhibiting factors to cost performance in large construction projects of Malaysia. Questionnaire survey was conducted among clients and consultants involved in large construction projects. In the questionnaire, a total of 35 inhibiting factors grouped in 7 categories were presented to the respondents for rating significant level of each factor. A total of 300 questionnaire forms were distributed. Only 144 completed sets were received and analysed using advanced multivariate statistical software of Structural Equation Modelling (SmartPLS v2). The analysis involved three iteration processes where several of the factors were deleted in order to make the model acceptable. The result of the analysis found that R² value of the model is 0.422 which indicates that the developed model has a substantial impact on cost performance. Based on the final form of the model, contractor's site management category is the most prominent in exhibiting effect on cost performance of large construction projects. This finding is validated using advanced techniques of power analysis. This vigorous multivariate analysis has explicitly found the significant category which consists of several causative factors to poor cost performance in large construction projects. This will benefit all parties involved in construction projects for controlling cost overrun. PMID:24693227
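
    For context, the reported R² of 0.422 is the share of variance in cost performance explained by the structural model. Below is a minimal sketch of how such an R² is computed for a fitted model; the data are hypothetical and the fit is plain least squares, not the SmartPLS v2 estimation used in the study:

```python
import numpy as np

# Hypothetical scores standing in for the 144 survey responses; illustrative
# only -- this shows the R^2 calculation, not the study's PLS path model.
rng = np.random.default_rng(0)
scores = rng.normal(size=(144, 3))                  # e.g., three category scores
cost_performance = scores @ np.array([0.5, -0.3, 0.2]) \
    + rng.normal(scale=0.8, size=144)

X = np.column_stack([np.ones(144), scores])         # add an intercept column
coefs, *_ = np.linalg.lstsq(X, cost_performance, rcond=None)
fitted = X @ coefs

ss_res = np.sum((cost_performance - fitted) ** 2)   # residual sum of squares
ss_tot = np.sum((cost_performance - cost_performance.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot                   # variance explained
print(f"R^2 = {r_squared:.3f}")
```

    With an intercept in the model, R² always falls between 0 and 1; a value like 0.422 means the predictors account for roughly 42% of the variance in the outcome.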

  5. Application of a High-Fidelity Icing Analysis Method to a Model-Scale Rotor in Forward Flight

    NASA Technical Reports Server (NTRS)

    Narducci, Robert; Orr, Stanley; Kreeger, Richard E.

    2012-01-01

    An icing analysis process involving the loose coupling of OVERFLOW-RCAS for rotor performance prediction with LEWICE3D for thermal analysis and ice accretion is applied to a model-scale rotor for validation. The process offers high-fidelity rotor analysis for non-iced and iced rotor performance evaluation that accounts for the interaction of nonlinear aerodynamics with blade elastic deformations. Ice accumulation prediction also involves loosely coupled data exchanges between OVERFLOW and LEWICE3D to produce accurate ice shapes. Validation of the process uses data collected in the 1993 icing test involving Sikorsky's Powered Force Model. Non-iced and iced rotor performance predictions are compared to experimental measurements, as are predicted ice shapes.

  6. Specialized data analysis of SSME and advanced propulsion system vibration measurements

    NASA Technical Reports Server (NTRS)

    Coffin, Thomas; Swanson, Wayne L.; Jong, Yen-Yi

    1993-01-01

    The basic objectives of this contract were to perform detailed analysis and evaluation of dynamic data obtained during Space Shuttle Main Engine (SSME) test and flight operations, including analytical/statistical assessment of component dynamic performance, and to continue the development and implementation of analytical/statistical models to effectively define nominal component dynamic characteristics, detect anomalous behavior, and assess machinery operational conditions. This study was to provide timely assessment of engine component operational status, identify probable causes of malfunction, and define feasible engineering solutions. The work was performed under three broad tasks: (1) Analysis, Evaluation, and Documentation of SSME Dynamic Test Results; (2) Data Base and Analytical Model Development and Application; and (3) Development and Application of Vibration Signature Analysis Techniques.

  7. User's Guide for a Modular Flutter Analysis Software System (Fast Version 1.0)

    NASA Technical Reports Server (NTRS)

    Desmarais, R. N.; Bennett, R. M.

    1978-01-01

    The use and operation of a group of computer programs to perform a flutter analysis of a single planar wing are described. This system of programs is called FAST for Flutter Analysis System, and consists of five programs. Each program performs certain portions of a flutter analysis and can be run sequentially as a job step or individually. FAST uses natural vibration modes as input data and performs a conventional V-g type of solution. The unsteady aerodynamics programs in FAST are based on the subsonic kernel function lifting-surface theory although other aerodynamic programs can be used. Application of the programs is illustrated by a sample case of a complete flutter calculation that exercises each program.

  8. NPSS Multidisciplinary Integration and Analysis

    NASA Technical Reports Server (NTRS)

    Hall, Edward J.; Rasche, Joseph; Simons, Todd A.; Hoyniak, Daniel

    2006-01-01

    The objective of this task was to enhance the capability of the Numerical Propulsion System Simulation (NPSS) by expanding its reach into the high-fidelity multidisciplinary analysis area. This task investigated numerical techniques to convert compressor blade geometry between cold static and hot running conditions. Numerical calculations of blade deformations were performed iteratively, coupling high fidelity flow simulations with high fidelity structural analysis of the compressor blade. The flow simulations were performed with the Advanced Ducted Propfan Analysis (ADPAC) code, while structural analyses were performed with the ANSYS code. High fidelity analyses were used to evaluate the effects on performance of variations in tip clearance, uncertainty in manufacturing tolerance, and variable inlet guide vane scheduling, as well as the effects of rotational speed on the hot running geometry of the compressor blades.

  9. Space radiator simulation manual for computer code

    NASA Technical Reports Server (NTRS)

    Black, W. Z.; Wulff, W.

    1972-01-01

    A computer program that simulates the performance of a space radiator is presented. The program basically consists of a rigorous analysis which analyzes a symmetrical fin panel and an approximate analysis that predicts system characteristics for cases of non-symmetrical operation. The rigorous analysis accounts for both transient and steady state performance including aerodynamic and radiant heating of the radiator system. The approximate analysis considers only steady state operation with no aerodynamic heating. A description of the radiator system and instructions to the user for program operation are included. The input required for the execution of all program options is described, and several examples of program output are provided. Sample output includes the radiator performance during ascent, reentry and orbit.

  10. Percutaneous Trigger Finger Release: A Cost-effectiveness Analysis.

    PubMed

    Gancarczyk, Stephanie M; Jang, Eugene S; Swart, Eric P; Makhni, Eric C; Kadiyala, Rajendra Kumar

    2016-07-01

    Percutaneous trigger finger releases (TFRs) performed in the office setting are becoming more prevalent. This study compares the costs of in-hospital open TFRs, open TFRs performed in ambulatory surgical centers (ASCs), and in-office percutaneous releases. An expected-value decision-analysis model was constructed from the payer perspective to estimate total costs of the three competing treatment strategies for TFR. Model parameters were estimated based on the best available literature and were tested using multiway sensitivity analysis. Percutaneous TFR performed in the office and then, if needed, revised open TFR performed in the ASC, was the most cost-effective strategy, with an attributed cost of $603. The cost associated with an initial open TFR performed in the ASC was approximately 7% higher. Initial open TFR performed in the hospital was the least cost-effective, with an attributed cost nearly twice that of primary percutaneous TFR. An initial attempt at percutaneous TFR is more cost-effective than an open TFR. Currently, only about 5% of TFRs are performed in the office; therefore, a substantial opportunity exists for cost savings in the future. Decision model level II.
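
    The expected-value model above weighs each strategy's upfront cost against the chance of needing a revision. A minimal sketch of that calculation follows; the failure rates and costs are hypothetical placeholders, not the study's model parameters (only the roughly $603 attributed cost and the relative rankings are reported in the record):

```python
# Expected-value cost comparison for three trigger finger release strategies.
# All inputs below are hypothetical, for illustration of the method only.

def expected_cost(initial_cost, failure_rate, revision_cost):
    """Expected total cost when a failed initial release must be revised."""
    return initial_cost + failure_rate * revision_cost

office_percutaneous = expected_cost(initial_cost=450.0, failure_rate=0.10,
                                    revision_cost=1500.0)  # revise open in ASC
asc_open = expected_cost(initial_cost=640.0, failure_rate=0.02,
                         revision_cost=1500.0)
hospital_open = expected_cost(initial_cost=1150.0, failure_rate=0.02,
                              revision_cost=1500.0)

strategies = {"percutaneous-then-ASC": office_percutaneous,
              "open in ASC": asc_open,
              "open in hospital": hospital_open}
best = min(strategies, key=strategies.get)
print(best, strategies[best])
```

    A multiway sensitivity analysis, as in the study, would repeat this comparison while sweeping each input over a plausible range to see whether the cheapest strategy changes.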

  11. A novel integrated assessment methodology of urban water reuse.

    PubMed

    Listowski, A; Ngo, H H; Guo, W S; Vigneswaran, S

    2011-01-01

    Wastewater is no longer considered a waste product, and water reuse needs to play a stronger part in securing urban water supply. Although treatment technologies for water reclamation have significantly improved, the question that deserves further analysis is how the selection of a particular wastewater treatment technology relates to performance and sustainability. The proposed assessment model integrates: (i) technology, characterised by selected quantity and quality performance parameters; (ii) productivity, efficiency and reliability criteria; (iii) quantitative performance indicators; and (iv) development of an evaluation model. The challenges related to the hierarchy and selection of performance indicators have been resolved through the case study analysis. The goal of this study is to validate a new assessment methodology in relation to performance of the microfiltration (MF) technology, a key element of the treatment process. Specific performance data and measurements were obtained at specific Control and Data Acquisition Points (CP) to satisfy the input-output inventory in relation to water resources, products, material flows, energy requirements, chemicals use, etc. The performance assessment process contains analysis and the necessary linking across important parametric functions, leading to reliable outcomes and results.

  12. Organizational variables on nurses' job performance in Turkey: nursing assessments.

    PubMed

    Top, Mehmet

    2013-01-01

    The purpose of this study was to describe the influence of organizational variables on hospital staff nurses' job performance as reported by staff nurses in two cities in Turkey. Hospital ownership status and employment status were examined for their effect on this influence. The reported influence of organizational variables on job performance was measured by a questionnaire developed for this study. Nurses were asked to evaluate the influence of 28 organizational variables on their job performance using a five-point Likert-type scale (1 - Never effective, 5 - Very effective). The study used a comparative and descriptive design, with a sample of 831 hospital staff nurses. Descriptive statistics, frequencies, t-tests, ANOVA and factor analysis were used for data analysis. The study showed the relative importance of the 28 organizational variables in influencing nurses' job performance. Nurses in this study reported that workload and technological support are the most influential organizational variables on their job performance. Factor analysis yielded a five-factor model that explained 53.99% of total variance. Administratively controllable organizational variables influence nurses' job performance to differing degrees.

  13. Development of the Performance Confirmation Program at Yucca Mountain, Nevada

    USGS Publications Warehouse

    LeCain, G.D.; Barr, D.; Weaver, D.; Snell, R.; Goodin, S.W.; Hansen, F.D.

    2006-01-01

    The Yucca Mountain Performance Confirmation program consists of tests, monitoring activities, experiments, and analyses to evaluate the adequacy of assumptions, data, and analyses that form the basis of the conceptual and numerical models of flow and transport associated with a proposed radioactive waste repository at Yucca Mountain, Nevada. The Performance Confirmation program uses an eight-stage risk-informed, performance-based approach. Selection of the Performance Confirmation activities for inclusion in the Performance Confirmation program was done using a risk-informed, performance-based decision analysis. The result of this analysis was a Performance Confirmation base portfolio that consists of 20 activities. The 20 Performance Confirmation activities include geologic, hydrologic, and construction/engineering testing. Some of the activities began during site characterization; others will begin during construction or post emplacement and continue until repository closure.

  14. Internal Flow of Contra-Rotating Small Hydroturbine at Off- Design Flow Rates

    NASA Astrophysics Data System (ADS)

    SHIGEMITSU, Toru; TAKESHIMA, Yasutoshi; OGAWA, Yuya; FUKUTOMI, Junichiro

    2016-11-01

    Small hydropower generation is an important alternative energy source, and enormous potential lies in small hydropower. However, the efficiency of small hydroturbines is lower than that of large ones, so small hydroturbines are expected to maintain high performance over a wide flow rate range. Therefore, contra-rotating rotors, which can be expected to achieve high performance, were adopted. In this research, the performance of a contra-rotating small hydroturbine with a 60 mm casing diameter was investigated by experiment and numerical analysis. The efficiency of the contra-rotating small hydroturbine was high for a pico-hydroturbine, and high efficiency could be maintained over a wide flow rate range; however, the performance of the rear rotor decreased significantly at partial flow rates. The internal flow condition, which is difficult to measure experimentally, was therefore investigated by numerical flow analysis, and the relation between performance and internal flow condition was considered on the basis of the numerical results.

  15. Numerical Stability and Control Analysis Towards Falling-Leaf Prediction Capabilities of Splitflow for Two Generic High-Performance Aircraft Models

    NASA Technical Reports Server (NTRS)

    Charlton, Eric F.

    1998-01-01

    Aerodynamic analyses are performed using the Lockheed-Martin Tactical Aircraft Systems (LMTAS) Splitflow computational fluid dynamics code to investigate the computational prediction capabilities for vortex-dominated flow fields of two different tailless aircraft models at large angles of attack and sideslip. These computations are performed with the goal of providing useful stability and control data to designers of high performance aircraft. Appropriate metrics for accuracy, time, and ease of use are determined in consultations with both the LMTAS Advanced Design and Stability and Control groups. Results are obtained and compared to wind-tunnel data for all six components of forces and moments. Moment data are combined to form a "falling leaf" stability analysis. Finally, a handful of viscous simulations were also performed to further investigate nonlinearities and possible viscous effects in the differences between the accumulated inviscid computational and experimental data.

  16. Advanced Video Analysis Needs for Human Performance Evaluation

    NASA Technical Reports Server (NTRS)

    Campbell, Paul D.

    1994-01-01

    Evaluators of human task performance in space missions make use of video as a primary source of data. Extraction of relevant human performance information from video is often a labor-intensive process requiring a large amount of time on the part of the evaluator. Based on the experiences of several human performance evaluators, needs were defined for advanced tools which could aid in the analysis of video data from space missions. Such tools should increase the efficiency with which useful information is retrieved from large quantities of raw video. They should also provide the evaluator with new analytical functions which are not present in currently used methods. Video analysis tools based on the needs defined by this study would also have uses in U.S. industry and education. Evaluation of human performance from video data can be a valuable technique in many industrial and institutional settings where humans are involved in operational systems and processes.

  17. Performance criteria for emergency medicine residents: a job analysis.

    PubMed

    Blouin, Danielle; Dagnone, Jeffrey Damon

    2008-11-01

    A major role of admission interviews is to assess a candidate's suitability for a residency program. Structured interviews have greater reliability and validity than do unstructured ones. The development of content for a structured interview is typically based on the dimensions of performance that are perceived as important to succeed in a particular line of work. A formal job analysis is normally conducted to determine these dimensions. The dimensions essential to succeed as an emergency medicine (EM) resident have not yet been studied. We aimed to analyze the work of EM residents to determine these essential dimensions. The "critical incident technique" was used to generate scenarios of poor and excellent resident performance. Two reviewers independently read each scenario and labelled the performance dimensions that were reflected in each. All labels assigned to a particular scenario were pooled and reviewed again until a consensus was reached. Five faculty members (25% of our total faculty) comprised the subject experts. Fifty-one incidents were generated and 50 different labels were applied. Eleven dimensions of performance applied to at least 5 incidents. "Professionalism" was the most valued performance dimension, represented in 56% of the incidents, followed by "self-confidence" (22%), "experience" (20%) and "knowledge" (20%). "Professionalism," "self-confidence," "experience" and "knowledge" were identified as the performance dimensions essential to succeed as an EM resident based on our formal job analysis using the critical incident technique. Performing a formal job analysis may assist training program directors with developing admission interviews.

  18. B-1B Avionics/Automatic Test Equipment: Maintenance Queueing Analysis.

    DTIC Science & Technology

    1983-12-01

    ...analysis (which is logistics terminology for an avionics/ATE queueing analysis). To allow each vendor the opportunity to perform such an analysis... for system performance measures may be found for the queueing system in Figure 7. This is due to the preemptive blocking caused by ATE failures. ... B-1B Avionics/Automatic Test Equipment: Maintenance Queueing Analysis (U), Air Force Inst of Tech, Wright-Patterson AFB OH, School of...

  19. Distributed intelligent data analysis in diabetic patient management.

    PubMed Central

    Bellazzi, R.; Larizza, C.; Riva, A.; Mira, A.; Fiocchi, S.; Stefanelli, M.

    1996-01-01

    This paper outlines the methodologies that can be used to perform an intelligent analysis of diabetic patients' data, realized in a distributed management context. We present a decision-support system architecture based on two modules, a Patient Unit and a Medical Unit, connected by telecommunication services. We stress the necessity to resort to temporal abstraction techniques, combined with time series analysis, in order to provide useful advice to patients; finally, we outline how data analysis and interpretation can be cooperatively performed by the two modules. PMID:8947655

  20. Method of assessing heterogeneity in images

    DOEpatents

    Jacob, Richard E.; Carson, James P.

    2016-08-23

    A method of assessing heterogeneity in images is disclosed. 3D images of an object are acquired. The acquired images may be filtered and masked. Iterative decomposition is performed on the masked images to obtain image subdivisions that are relatively homogeneous. Comparative analysis, such as variogram analysis or correlogram analysis, is then performed on the decomposed images to determine spatial relationships between regions of the images that are relatively homogeneous.
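
    A variogram, as used in the comparative analysis step, measures how dissimilarity between pixel values grows with separation distance. Below is a minimal empirical-variogram sketch for a 2-D image (the patent works with 3D images and its own decomposition; this 2-D version only illustrates the estimator), assuming the standard form gamma(h) = mean of 0.5*(z(x) - z(x+h))^2 over pixel pairs at lag h:

```python
import numpy as np

def empirical_variogram(image, max_lag):
    """Isotropic empirical semivariogram over axis-aligned pixel pairs."""
    lags = np.arange(1, max_lag + 1)
    gamma = np.empty(len(lags))
    for i, h in enumerate(lags):
        dx = 0.5 * (image[:, h:] - image[:, :-h]) ** 2   # horizontal pairs
        dy = 0.5 * (image[h:, :] - image[:-h, :]) ** 2   # vertical pairs
        gamma[i] = (dx.sum() + dy.sum()) / (dx.size + dy.size)
    return lags, gamma

# A spatially correlated test image (integrated noise): semivariance
# should increase with lag, indicating heterogeneity at larger scales.
rng = np.random.default_rng(1)
img = np.cumsum(np.cumsum(rng.normal(size=(64, 64)), axis=0), axis=1)
lags, gamma = empirical_variogram(img, max_lag=10)
print(gamma)
```

    A flat semivariogram would indicate a spatially homogeneous region; a rising one indicates increasing dissimilarity with distance.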

  1. Differences in the Performance of Children with Specific Language Impairment and Their Typically Developing Peers on Nonverbal Cognitive Tests: A Meta-Analysis

    ERIC Educational Resources Information Center

    Gallinat, Erica; Spaulding, Tammie J.

    2014-01-01

    Purpose: This study used meta-analysis to investigate the difference in nonverbal cognitive test performance of children with specific language impairment (SLI) and their typically developing (TD) peers. Method: The meta-analysis included studies (a) that were published between 1995 and 2012 of children with SLI who were age matched (and not…

  2. Insulation commonality assessment (phase 1). Volume 2: Section 7.0 through 16.0. [evaluation of materials used for spacecraft thermal insulation

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The heat transfer characteristics of various materials used for the thermal insulation of spacecraft are discussed. Techniques for conducting thermal performance analysis, structural performance analysis, and dynamic analysis are described. Processes for producing and finishing the materials are explained. The methods for determining reliability, system safety, materials tests, and design effectiveness are explained.

  3. Statistical evaluation of vibration analysis techniques

    NASA Technical Reports Server (NTRS)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
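
    Detection performance of the kind quantified here, probability of detection versus probability of false alarm, can be estimated by sweeping a decision threshold over a scalar feature. The sketch below uses two hypothetical Gaussian feature populations (healthy vs. seeded fault), not the report's vibration data:

```python
import numpy as np

# Hypothetical vibration-feature samples under normal and faulty operation.
rng = np.random.default_rng(2)
healthy = rng.normal(loc=1.0, scale=0.3, size=500)
faulty = rng.normal(loc=2.0, scale=0.4, size=500)

# Sweep a detection threshold: each threshold yields one operating point
# (P_FA, P_D) of the detector; the full sweep traces an ROC curve.
thresholds = np.linspace(0.0, 3.0, 61)
p_fa = [(healthy > t).mean() for t in thresholds]   # false-alarm probability
p_d = [(faulty > t).mean() for t in thresholds]     # detection probability

for t, fa, d in zip(thresholds[::15], p_fa[::15], p_d[::15]):
    print(f"threshold={t:.2f}  P_FA={fa:.2f}  P_D={d:.2f}")
```

    Comparing such curves across candidate vibration analysis techniques is one way to rank their detection performance on a common scale.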

  4. Multidisciplinary Tool for Systems Analysis of Planetary Entry, Descent, and Landing

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    2011-01-01

    Systems analysis of planetary entry, descent, and landing (EDL) is inherently a multidisciplinary activity. SAPE improves the performance of the systems analysis team by automating and streamlining the process, and this improvement can reduce the errors that stem from manual data transfer among discipline experts. SAPE is a multidisciplinary tool for systems analysis of planetary EDL for Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune, and Titan. It performs EDL systems analysis for any planet, operates cross-platform (i.e., Windows, Mac, and Linux operating systems), uses existing software components and open-source software to avoid software licensing issues, performs low-fidelity systems analysis in one hour on a computer that is comparable to an average laptop, and keeps discipline experts in the analysis loop. SAPE uses Python, a platform-independent, open-source language, for integration and for the user interface. Development has relied heavily on the object-oriented programming capabilities that are available in Python. Modules are provided to interface with commercial and government off-the-shelf software components (e.g., thermal protection systems and finite-element analysis). SAPE currently includes the following analysis modules: geometry, trajectory, aerodynamics, aerothermal, thermal protection system, and interface for structural sizing.

  5. Divergence in Morris Water Maze-Based Cognitive Performance under Chronic Stress Is Associated with the Hippocampal Whole Transcriptomic Modification in Mice

    PubMed Central

    Jung, Seung H.; Brownlow, Milene L.; Pellegrini, Matteo; Jankord, Ryan

    2017-01-01

    Individual susceptibility determines the magnitude of stress effects on cognitive function. The hippocampus, a brain region of memory consolidation, is vulnerable to stressful environments, and the impact of stress on the hippocampus may determine individual variability in cognitive performance. Therefore, the purpose of this study was to define the relationship between the divergence in spatial memory performance under chronic unpredictable stress and an associated transcriptomic alteration in the hippocampus, the brain region of spatial memory consolidation. Multiple strains of BXD (B6 × D2) recombinant inbred mice went through a 4-week chronic variable stress (CVS) paradigm, and the Morris water maze (MWM) test was conducted during the last week of CVS to assess hippocampal-dependent spatial memory performance and group animals into low and high performing groups based on cognitive performance. Using hippocampal whole transcriptome RNA-sequencing data, differential expression, PANTHER analysis, WGCNA, Ingenuity's upstream regulator analysis in the Ingenuity Pathway Analysis® and phenotype association analysis were conducted. Our data identified multiple genes and pathways that were significantly associated with chronic stress-associated cognitive modification and the divergence in hippocampal dependent memory performance under chronic stress. Biological pathways associated with memory performance following chronic stress included metabolism, neurotransmitter and receptor regulation, immune response and cellular process. The Ingenuity's upstream regulator analysis identified 247 upstream transcriptional regulators from 16 different molecule types. Transcripts predictive of cognitive performance under high stress included genes that are associated with a high occurrence of Alzheimer's and cognitive impairments (e.g., Ncl, Eno1, Scn9a, Slc19a3, Ncstn, Fos, Eif4h, Copa, etc.). 
Our results show that the variable effects of chronic stress on the hippocampal transcriptome are related to the ability to complete the MWM task and that the modulations of specific pathways are indicative of hippocampal dependent memory performance. Thus, the divergence in spatial memory performance following chronic stress is related to the unique pattern of gene expression within the hippocampus. PMID:28912681

  6. Performance Evaluation Model for Application Layer Firewalls.

    PubMed

    Xuan, Shichang; Yang, Wu; Dong, Hui; Zhang, Jiangchuan

    2016-01-01

    Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.
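
    The Erlangian queueing analysis described above can be illustrated with the classic M/M/c (Erlang C) formulas for the probability that an arriving request queues and for its mean waiting time. The arrival rate, service rate, and server count below are hypothetical, not the paper's parameters, and this sketch covers a single service layer rather than the paper's three-layer composition:

```python
from math import factorial

def erlang_c(arrival_rate, service_rate, servers):
    """M/M/c queue: P(an arrival waits) via Erlang C, and mean waiting time."""
    a = arrival_rate / service_rate           # offered load in erlangs
    rho = a / servers                         # per-server utilization
    assert rho < 1.0, "queue is unstable unless utilization < 1"
    summation = sum(a**k / factorial(k) for k in range(servers))
    tail = a**servers / (factorial(servers) * (1.0 - rho))
    p_wait = tail / (summation + tail)        # Erlang C probability of queueing
    mean_wait = p_wait / (servers * service_rate - arrival_rate)
    return p_wait, mean_wait

# Hypothetical firewall layer: 80 req/s arriving, 10 req/s per service desk,
# 10 desks (utilization 0.8).
p_wait, mean_wait = erlang_c(arrival_rate=80.0, service_rate=10.0, servers=10)
print(f"P(wait) = {p_wait:.3f}, mean wait = {mean_wait * 1000:.1f} ms")
```

    Repeating this calculation for each resource allocation scenario, as the paper does with its layered model, lets one pick the allocation that best trades off throughput, delay, and loss.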

  7. Scalable Performance Environments for Parallel Systems

    NASA Technical Reports Server (NTRS)

    Reed, Daniel A.; Olson, Robert D.; Aydt, Ruth A.; Madhyastha, Tara M.; Birkett, Thomas; Jensen, David W.; Nazief, Bobby A. A.; Totty, Brian K.

    1991-01-01

    As parallel systems expand in size and complexity, the absence of performance tools for these parallel systems exacerbates the already difficult problems of application program and system software performance tuning. Moreover, given the pace of technological change, we can no longer afford to develop ad hoc, one-of-a-kind performance instrumentation software; we need scalable, portable performance analysis tools. We describe an environment prototype based on the lessons learned from two previous generations of performance data analysis software. Our environment prototype contains a set of performance data transformation modules that can be interconnected in user-specified ways. It is the responsibility of the environment infrastructure to hide details of module interconnection and data sharing. The environment is written in C++ with the graphical displays based on X windows and the Motif toolkit. It allows users to interconnect and configure modules graphically to form an acyclic, directed data analysis graph. Performance trace data are represented in a self-documenting stream format that includes internal definitions of data types, sizes, and names. The environment prototype supports the use of head-mounted displays and sonic data presentation in addition to the traditional use of visual techniques.
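The user-configured module graph can be sketched as a small topological-order evaluator. The module names and functions here are invented for illustration (the actual environment described is a C++/Motif system, not Python):

```python
from collections import deque

class Module:
    # A data-transformation module: a name, a function, and named input modules
    def __init__(self, name, fn, inputs=()):
        self.name, self.fn, self.inputs = name, fn, list(inputs)

def run_graph(modules):
    # Evaluate an acyclic, directed data-analysis graph in topological order.
    indeg = {m.name: len(m.inputs) for m in modules}
    by_name = {m.name: m for m in modules}
    downstream = {m.name: [] for m in modules}
    for m in modules:
        for src in m.inputs:
            downstream[src].append(m.name)
    ready = deque(n for n, d in indeg.items() if d == 0)
    results = {}
    while ready:
        n = ready.popleft()
        m = by_name[n]
        results[n] = m.fn(*[results[s] for s in m.inputs])
        for d in downstream[n]:
            indeg[d] -= 1
            if indeg[d] == 0:
                ready.append(d)
    if len(results) != len(modules):
        raise ValueError("cycle detected: graph must be acyclic")
    return results

# Hypothetical pipeline: a trace source feeding two analysis modules
mods = [
    Module("trace", lambda: [3.0, 1.0, 4.0, 1.0, 5.0]),
    Module("mean", lambda xs: sum(xs) / len(xs), ["trace"]),
    Module("peak", max, ["trace"]),
]
out = run_graph(mods)
```

Hiding the interconnection and data sharing behind a graph evaluator like this is what lets users rewire modules without touching the modules themselves.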

  8. Performance assessment of human resource by integration of HSE and ergonomics and EFQM management system.

    PubMed

    Sadegh Amalnick, Mohsen; Zarrin, Mansour

    2017-03-13

    Purpose The purpose of this paper is to present an integrated framework for performance evaluation and analysis of human resources (HR) with respect to the factors of the health, safety, environment and ergonomics (HSEE) management system, as well as the criteria of the European Foundation for Quality Management (EFQM), one of the well-known business excellence models. Design/methodology/approach In this study, an intelligent algorithm based on an adaptive neuro-fuzzy inference system (ANFIS) along with fuzzy data envelopment analysis (FDEA) is developed and employed to assess the performance of the company. Furthermore, the impact of the factors on the company's performance, as well as their strengths and weaknesses, is identified by conducting a sensitivity analysis on the results. Similarly, a design of experiments is performed to prioritize the factors in order of importance. Findings The results show that the EFQM model has a far greater impact on the company's performance than the HSEE management system. According to the obtained results, it can be argued that the integration of HSEE and EFQM leads to performance improvement in the company. Practical implications In the current study, the required data for executing the proposed framework were collected via valid questionnaires filled in by the staff of an aviation industry company located in Tehran, Iran. Originality/value Managing HR performance results in improved usability, maintainability and reliability, and ultimately in a significant reduction in the commercial aviation accident rate. Moreover, studying the factors affecting HR performance helps authorities develop systems that enable operators to better manage human error. This paper for the first time presents an intelligent framework based on ANFIS, FDEA and statistical tests for HR performance assessment and analysis, with the ability to handle the uncertainty and vagueness existing in real-world environments.

  9. Sleep Disturbance, Daytime Symptoms, and Functional Performance in Patients With Stable Heart Failure: A Mediation Analysis.

    PubMed

    Jeon, Sangchoon; Redeker, Nancy S

    2016-01-01

    Sleep disturbance is common among patients with heart failure (HF), who also experience symptom burden and poor functional performance. We evaluated the extent to which sleep-related daytime symptoms (fatigue, excessive daytime sleepiness, and depressive symptoms) mediate the relationship between sleep disturbance and functional performance among patients with stable HF. We recruited patients with stable HF for this secondary analysis of data from a cross-sectional, observational study. Participants completed unattended ambulatory polysomnography, from which the Respiratory Disturbance Index was calculated, along with a Six-Minute Walk Test, questionnaires to elicit sleep disturbance (Pittsburgh Sleep Quality Index, Insomnia Symptoms from the Sleep Habits Questionnaire), daytime symptoms (Center for Epidemiologic Studies Depression Scale, Global Fatigue Index, Epworth Sleepiness Scale), and self-reported functional performance (Medical Outcomes Study SF36 V2 Physical Function Scale). We used structural equation modeling with latent variables for the key analysis. Follow-up exploratory regression analysis with bootstrapped samples was used to examine the extent to which individual daytime symptoms mediated effects of sleep disturbance on functional performance after controlling for clinical and demographic covariates. The sample included 173 New York Heart Association Class I-IV HF patients (n = 60 [34.7%] women; mean age 60.7 years, SD = 16.07). Daytime symptoms mediated the relationship between sleep disturbance and functional performance. Fatigue and depression mediated the relationship between insomnia symptoms and self-reported functional performance, whereas fatigue and sleepiness mediated the relationship between sleep quality and functional performance. Sleepiness mediated the relationship between the respiratory index and self-reported functional performance only in people who did not report insomnia. 
Daytime symptoms explain the relationships between sleep disturbance and functional performance in stable HF.
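The mediation logic above (a daytime symptom carrying the effect of sleep disturbance on functional performance) can be sketched as the classic product-of-coefficients estimate. The simulated data are purely illustrative, and the study itself used structural equation modeling with latent variables rather than this single-mediator OLS sketch:

```python
import random

def ols(rows, y):
    # Least squares via normal equations (X'X) beta = X'y, Gaussian elimination.
    k = len(rows[0])
    A = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    for col in range(k):                        # forward elimination with pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):              # back substitution
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

def indirect_effect(x, m, y):
    # a*b estimate: a from M ~ X, b from Y ~ X + M (mediator coefficient)
    a = ols([[1.0, xi] for xi in x], m)[1]
    b = ols([[1.0, xi, mi] for xi, mi in zip(x, m)], y)[2]
    return a * b

# Simulated sleep-disturbance (x), symptom (m), performance (y) data
random.seed(1)
x = [random.gauss(0, 1) for _ in range(200)]
m = [xi + random.gauss(0, 0.3) for xi in x]
y = [mi + 0.5 * xi + random.gauss(0, 0.3) for xi, mi in zip(x, m)]
est = indirect_effect(x, m, y)
```

A bootstrap over resampled rows of (x, m, y) would give the confidence interval the study's exploratory analysis relies on.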

  10. An analysis of the relationship of seven selected variables to State Board Test Pool Examination performance of the University of Tennessee, Knoxville, College of Nursing.

    PubMed

    Sharp, T G

    1984-02-01

    The study was designed to determine whether any one of seven selected variables, or a combination of them, is predictive of performance on the State Board Test Pool Examination (SBTPE). The selected variables studied were: high school grade point average (HSGPA), The University of Tennessee, Knoxville, College of Nursing grade point average (GPA), and American College Test Assessment (ACT) standard scores (English, ENG; mathematics, MA; social studies, SS; natural sciences, NSC; composite, COMP). Data utilized were from graduates of the baccalaureate program of The University of Tennessee, Knoxville, College of Nursing from 1974 through 1979. The sample of 322 was selected from a total population of 572. Statistical Analysis System (SAS) procedures were used for three analyses: the predictive relationship of each of the seven selected variables to SBTPE performance (pass or fail); a stepwise discriminant analysis to determine the strongest predictive combination of the independent variables for overall SBTPE performance (pass or fail); and a stepwise multiple regression analysis to determine the strongest predictive combination of selected variables for each of the five subexams of the SBTPE. Each of the selected variables was found to be predictive of SBTPE performance (pass or fail). The strongest combination for predicting SBTPE performance (pass or fail) was found to be GPA, MA, and NSC.

  11. The acoustic and perceptual differences to the non-singer's singing voice before and after a singing vocal warm-up

    NASA Astrophysics Data System (ADS)

    DeRosa, Angela

    The present study analyzed the acoustic and perceptual differences in non-singers' singing voices before and after a vocal warm-up. Experiments were conducted with 12 females who had no singing experience and considered themselves non-singers. Participants were recorded performing three tasks: a musical scale stretching to their most comfortable high and low pitches, sustained productions of the vowels /a/ and /i/, and a singing performance of the "Star Spangled Banner." Participants were recorded performing these three tasks before a vocal warm-up, after a vocal warm-up, and again 2-3 weeks later, following 2-3 weeks of practice. Acoustical analysis consisted of formant frequency analysis, singer's formant/singing power ratio analysis, maximum phonation frequency range analysis, and an analysis of jitter, noise-to-harmonic ratio (NHR), relative average perturbation (RAP), and voice turbulence index (VTI). A perceptual analysis was also conducted, with 12 listeners rating comparison performances of before vs. after the first vocal warm-up, before vs. after the second vocal warm-up, and after both vocal warm-ups. There were no significant findings for the formant frequency analysis of the vowel /a/, but there was a significant difference in the first formant frequency for the vowel /i/. The singer's formant, analyzed via singing power ratio, showed significance only for the vowel /i/. Maximum phonation frequency range showed a significant increase after the vocal warm-ups. There were no significant findings for the acoustic measures of jitter, NHR, RAP, and VTI. Perceptual analysis showed a significant difference after a vocal warm-up. The results indicate that a singing vocal warm-up can have a significant positive influence on the singing voice of non-singers.
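Two of the perturbation measures named above can be computed directly from a sequence of glottal cycle periods. The period values below are made up, and clinical tools define several jitter variants beyond these two:

```python
def jitter_local(periods):
    # Local jitter: mean absolute difference between consecutive cycle
    # periods, normalized by the mean period
    diffs = [abs(a - b) for a, b in zip(periods[1:], periods[:-1])]
    return (sum(diffs) / len(diffs)) / (sum(periods) / len(periods))

def rap(periods):
    # Relative average perturbation: deviation of each period from the
    # 3-point moving average, normalized by the mean period
    dev = [abs(periods[i] - sum(periods[i - 1:i + 2]) / 3)
           for i in range(1, len(periods) - 1)]
    return (sum(dev) / len(dev)) / (sum(periods) / len(periods))

# Hypothetical cycle periods (ms) extracted from a sustained vowel
cycles_ms = [4.54, 4.57, 4.52, 4.56, 4.53, 4.55]
jit, r = jitter_local(cycles_ms), rap(cycles_ms)
```

A perfectly periodic voice gives zero for both measures, which provides a convenient sanity check.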

  12. DSRC standards testing : 5MHz band-plan analysis, clustered system architecture and communication in emergency scenarios.

    DOT National Transportation Integrated Search

    2011-12-01

    Researchers performed a system level technical study of physical layer and network layer performance of vehicular communication in a specially licensed Dedicated Short Range Communication (DSRC) 5.9 GHz frequency band. Physical layer analysis provide...

  13. Integrated corridor management initiative : demonstration phase evaluation – Dallas corridor performance analysis test plan.

    DOT National Transportation Integrated Search

    2012-08-01

    This report presents the test plan for conducting the Corridor Performance Analysis for the United States Department of Transportation (U.S. DOT) evaluation of the Dallas U.S. 75 Integrated Corridor Management (ICM) Initiative Demonstration. The ICM ...

  14. Integrated corridor management initiative : demonstration phase evaluation – San Diego corridor performance analysis test plan.

    DOT National Transportation Integrated Search

    2012-08-01

    This report presents the test plan for conducting the Corridor Performance Analysis for the United States Department of Transportation (U.S. DOT) evaluation of the San Diego Integrated Corridor Management (ICM) Initiative Demonstration. The ICM proje...

  15. Using Ratio Analysis to Evaluate Financial Performance.

    ERIC Educational Resources Information Center

    Minter, John; And Others

    1982-01-01

    The ways in which ratio analysis can help in long-range planning, budgeting, and asset management to strengthen financial performance and help avoid financial difficulties are explained. Types of ratios considered include balance sheet ratios, net operating ratios, and contribution and demand ratios. (MSE)

  16. The age of peak performance in Ironman triathlon: a cross-sectional and longitudinal data analysis

    PubMed Central

    2013-01-01

    Background The aims of the present study were, firstly, to investigate in a cross-sectional analysis the age of peak Ironman performance within one calendar year in all qualifiers for Ironman Hawaii and Ironman Hawaii; secondly, to determine in a longitudinal analysis on a qualifier for Ironman Hawaii whether the age of peak Ironman performance and Ironman performance itself change across years; and thirdly, to determine the gender difference in performance. Methods In a cross-sectional analysis, the age of the top ten finishers for all qualifier races for Ironman Hawaii and Ironman Hawaii was determined in 2010. For a longitudinal analysis, the age and the performance of the annual top ten female and male finishers in a qualifier for Ironman Hawaii was determined in Ironman Switzerland between 1995 and 2010. Results In 19 of the 20 analyzed triathlons held in 2010, there was no difference in the age of peak Ironman performance between women and men (p > 0.05). The only difference in the age of peak Ironman performance between genders was in ‘Ironman Canada’ where men were older than women (p = 0.023). For all 20 races, the age of peak Ironman performance was 32.2 ± 1.5 years for men and 33.0 ± 1.6 years for women (p > 0.05). In Ironman Switzerland, there was no difference in the age of peak Ironman performance between genders for top ten women and men from 1995 to 2010 (F = 0.06, p = 0.8). The mean age of top ten women and men was 31.4 ± 1.7 and 31.5 ± 1.7 years (Cohen's d = 0.06), respectively. The gender difference in performance in the three disciplines and for overall race time decreased significantly across years. Men and women improved overall race times by approximately 1.2 and 4.2 min/year, respectively. Conclusions Women and men peak at a similar age of 32–33 years in an Ironman triathlon with no gender difference. In a qualifier for Ironman Hawaii, the age of peak Ironman performance remained unchanged across years. 
In contrast, gender differences in performance in Ironman Switzerland decreased during the studied period, suggesting that elite female Ironman triathletes might still narrow the gender gap in the future. PMID:24004814
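The reported per-year improvements are slopes of a linear trend over the annual top-ten means. A least-squares slope can be computed as below; the race times are fabricated to match a 1.2 min/year improvement and are not the study's data:

```python
def trend_per_year(years, times_min):
    # Ordinary least-squares slope: cov(year, time) / var(year)
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(times_min) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, times_min))
    var = sum((x - mean_x) ** 2 for x in years)
    return cov / var

# Hypothetical annual top-ten mean finish times (minutes), 1995-2010
years = list(range(1995, 2011))
times = [620.0 - 1.2 * (y - 1995) for y in years]
slope = trend_per_year(years, times)   # negative slope = improvement per year
```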

  17. Automotive Exterior Noise Optimization Using Grey Relational Analysis Coupled with Principal Component Analysis

    NASA Astrophysics Data System (ADS)

    Chen, Shuming; Wang, Dengfeng; Liu, Bo

    This paper investigates optimization design of the thickness of the sound package applied to a passenger automobile. The major performance characteristics selected to evaluate the process are the SPL of the exterior noise and the weight of the sound package, and the corresponding parameters of the sound package are the thickness of the glass wool with aluminum foil for the first layer, the thickness of the glass fiber for the second layer, and the thickness of the PE foam for the third layer. The process fundamentally involves multiple performance characteristics; thus, grey relational analysis, which uses the grey relational grade as the performance index, is employed to determine the optimal combination of thicknesses of the different layers for the designed sound package. Additionally, in order to evaluate the weighting values corresponding to the various performance characteristics, principal component analysis is used to reflect their relative importance properly and objectively. The results of the confirmation experiments show that grey relational analysis coupled with principal component analysis can successfully be applied to find the optimal combination of thicknesses for each layer of the sound package material. Therefore, the presented method can be an effective tool to improve vehicle exterior noise and lower the weight of the sound package. It will also be helpful for other applications in the automotive industry, such as the First Automobile Works in China, Changan Automobile in China, etc.
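The grey relational grade used as the performance index can be sketched as follows for smaller-the-better responses (lower SPL, lower weight). The response values and weights are hypothetical, and in the paper the weights come from principal component analysis rather than being fixed by hand:

```python
def grey_relational_grades(matrix, weights, zeta=0.5):
    # matrix rows: design alternatives; columns: smaller-the-better responses.
    # Assumes each response column is not constant (max > min).
    cols = list(zip(*matrix))
    norm = [[(max(c) - v) / (max(c) - min(c)) for v in c] for c in cols]
    norm_rows = list(zip(*norm))
    # deviation from the ideal (all-ones) reference sequence
    dev = [[1.0 - v for v in row] for row in norm_rows]
    dmin = min(min(r) for r in dev)
    dmax = max(max(r) for r in dev)
    # grey relational coefficients with distinguishing coefficient zeta
    coef = [[(dmin + zeta * dmax) / (d + zeta * dmax) for d in row] for row in dev]
    # weighted mean = grey relational grade per alternative
    return [sum(w * c for w, c in zip(weights, row)) for row in coef]

# Hypothetical: two thickness combinations, responses (SPL dB, weight kg)
grades = grey_relational_grades([[70.0, 2.0], [75.0, 3.0]], [0.6, 0.4])
```

The alternative with the highest grade is the preferred thickness combination.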

  18. Validating the performance of one-time decomposition for fMRI analysis using ICA with automatic target generation process.

    PubMed

    Yao, Shengnan; Zeng, Weiming; Wang, Nizhuan; Chen, Lei

    2013-07-01

    Independent component analysis (ICA) has been proven effective for functional magnetic resonance imaging (fMRI) data analysis. However, ICA decomposition requires iterative optimization of the unmixing matrix, whose initial values are generated randomly, and this randomness of initialization leads to different decomposition results. A single one-time decomposition is therefore not usually reliable for fMRI data analysis. Under this circumstance, several methods based on repeated decompositions with ICA (RDICA) were proposed to reveal the stability of ICA decomposition. Although RDICA has achieved satisfying results in validating the performance of ICA decomposition, it costs much computing time. To mitigate this problem, we propose a method, named ATGP-ICA, for fMRI data analysis. This method generates fixed initial values with an automatic target generation process (ATGP) instead of producing them randomly. We performed experimental tests on both hybrid data and fMRI data to show the effectiveness of the new method, and made a performance comparison of traditional one-time decomposition with ICA (ODICA), RDICA and ATGP-ICA. The proposed method not only eliminates the randomness of ICA decomposition, but also saves much computing time compared to RDICA. Furthermore, ROC (receiver operating characteristic) power analysis indicated better signal reconstruction performance for ATGP-ICA than for RDICA. Copyright © 2013 Elsevier Inc. All rights reserved.

  19. Machine learning-based analysis of MR radiomics can help to improve the diagnostic performance of PI-RADS v2 in clinically relevant prostate cancer.

    PubMed

    Wang, Jing; Wu, Chen-Jiang; Bao, Mei-Ling; Zhang, Jing; Wang, Xiao-Ning; Zhang, Yu-Dong

    2017-10-01

    To investigate whether machine learning-based analysis of MR radiomics can help improve the performance of PI-RADS v2 in clinically relevant prostate cancer (PCa). This IRB-approved study included 54 patients with PCa undergoing multi-parametric (mp) MRI before prostatectomy. Imaging analysis was performed on 54 tumours, 47 normal peripheral zone (PZ) and 48 normal transitional zone (TZ) regions based on histological-radiological correlation. Mp-MRI was scored via PI-RADS and quantified by measuring radiomic features. A predictive model was developed using a novel support vector machine trained with: (i) radiomics, (ii) PI-RADS scores, (iii) radiomics and PI-RADS scores. Paired comparison was made via ROC analysis. For PCa versus normal TZ, the model trained with radiomics had a significantly higher area under the ROC curve (Az) (0.955 [95% CI 0.923-0.976]) than PI-RADS (Az: 0.878 [0.834-0.914], p < 0.001). The difference in Az between them was not significant for PCa versus PZ (0.972 [0.945-0.988] vs. 0.940 [0.905-0.965], p = 0.097). When radiomics was added, the performance of PI-RADS was significantly improved for PCa versus PZ (Az: 0.983 [0.960-0.995]) and PCa versus TZ (Az: 0.968 [0.940-0.985]). Machine learning analysis of MR radiomics can help improve the performance of PI-RADS in clinically relevant PCa. • Machine learning-based analysis of MR radiomics outperformed PI-RADS for TZ cancer. • Adding MR radiomics significantly improved the performance of PI-RADS. • DKI-derived Dapp and Kapp were two strong markers for the diagnosis of PCa.
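The area under the ROC curve (Az) compared above has a simple rank interpretation: the probability that a randomly chosen cancer case scores higher than a randomly chosen normal-zone sample. A minimal sketch, with invented scores:

```python
def auc(pos_scores, neg_scores):
    # Mann-Whitney form of AUC: fraction of (positive, negative) pairs
    # where the positive outranks the negative; ties count as 0.5
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            wins += 1.0 if p > n else 0.5 if p == n else 0.0
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical model outputs: tumour vs. normal-zone scores
az = auc([0.9, 0.8, 0.7], [0.6, 0.5])
```

An Az of 1.0 means perfect separation and 0.5 means chance-level ranking, which matches how the paper reads its Az values.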

  20. Classifying Facial Actions

    PubMed Central

    Donato, Gianluca; Bartlett, Marian Stewart; Hager, Joseph C.; Ekman, Paul; Sejnowski, Terrence J.

    2010-01-01

    The Facial Action Coding System (FACS) [23] is an objective method for quantifying facial movement in terms of component actions. This system is widely used in behavioral investigations of emotion, cognitive processes, and social interaction. The coding is presently performed by highly trained human experts. This paper explores and compares techniques for automatically recognizing facial actions in sequences of images. These techniques include analysis of facial motion through estimation of optical flow; holistic spatial analysis, such as principal component analysis, independent component analysis, local feature analysis, and linear discriminant analysis; and methods based on the outputs of local filters, such as Gabor wavelet representations and local principal components. Performance of these systems is compared to naive and expert human subjects. Best performances were obtained using the Gabor wavelet representation and the independent component representation, both of which achieved 96 percent accuracy for classifying 12 facial actions of the upper and lower face. The results provide converging evidence for the importance of using local filters, high spatial frequencies, and statistical independence for classifying facial actions. PMID:21188284

  1. Analysis of airframe/engine interactions in integrated flight and propulsion control

    NASA Technical Reports Server (NTRS)

    Schierman, John D.; Schmidt, David K.

    1991-01-01

    An analysis framework for the assessment of dynamic cross-coupling between airframe and engine systems from the perspective of integrated flight/propulsion control is presented. This analysis involves determining the significance of the interactions with respect to deterioration in stability robustness and performance, as well as the critical frequency ranges where problems may occur due to these interactions. The analysis illustrated here investigates both the airframe's effects on the engine control loops and the engine's effects on the airframe control loops in two case studies. The second case study involves a multi-input/multi-output analysis of the airframe. Sensitivity studies are performed on critical interactions to examine the degradation in the system's stability robustness and performance. The magnitudes of the interactions required to cause instabilities, as well as the frequencies at which the instabilities occur, are recorded. Finally, the analysis framework is expanded to include control laws which contain cross-feeds between the airframe and engine systems.

  2. Universal microfluidic automaton for autonomous sample processing: application to the Mars Organic Analyzer.

    PubMed

    Kim, Jungkyu; Jensen, Erik C; Stockton, Amanda M; Mathies, Richard A

    2013-08-20

    A fully integrated multilayer microfluidic chemical analyzer for automated sample processing and labeling, as well as analysis using capillary zone electrophoresis is developed and characterized. Using lifting gate microfluidic control valve technology, a microfluidic automaton consisting of a two-dimensional microvalve cellular array is fabricated with soft lithography in a format that enables facile integration with a microfluidic capillary electrophoresis device. The programmable sample processor performs precise mixing, metering, and routing operations that can be combined to achieve automation of complex and diverse assay protocols. Sample labeling protocols for amino acid, aldehyde/ketone and carboxylic acid analysis are performed automatically followed by automated transfer and analysis by the integrated microfluidic capillary electrophoresis chip. Equivalent performance to off-chip sample processing is demonstrated for each compound class; the automated analysis resulted in a limit of detection of ~16 nM for amino acids. Our microfluidic automaton provides a fully automated, portable microfluidic analysis system capable of autonomous analysis of diverse compound classes in challenging environments.

  3. Performance review using sequential sampling and a practice computer.

    PubMed

    Difford, F

    1988-06-01

    The use of sequential sample analysis for repeated performance review is described with examples from several areas of practice. The value of a practice computer in providing a random sample from a complete population, evaluating the parameters of a sequential procedure, and producing a structured worksheet is discussed. It is suggested that sequential analysis has advantages over conventional sampling in the area of performance review in general practice.
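Sequential sample analysis of the kind described (Wald's sequential probability ratio test) can be sketched for an audit stream of pass/fail records. The error-rate hypotheses and risk levels below are illustrative choices, not values from the paper:

```python
import math

def sprt_decision(errors_seq, p0=0.05, p1=0.15, alpha=0.05, beta=0.2):
    # Wald's SPRT on a stream of audit outcomes (1 = error found, 0 = ok).
    # H0: error rate = p0 (acceptable); H1: error rate = p1 (problem).
    upper = math.log((1 - beta) / alpha)   # cross above: reject H0
    lower = math.log(beta / (1 - alpha))   # cross below: accept H0
    llr = 0.0
    for i, err in enumerate(errors_seq, 1):
        llr += math.log(p1 / p0) if err else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "reject_H0", i
        if llr <= lower:
            return "accept_H0", i
    return "continue", len(errors_seq)
```

The appeal, as the abstract notes, is that the review can stop as soon as a boundary is crossed instead of auditing a fixed-size sample.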

  4. Thermal Deformation and RF Performance Analyses for the SWOT Large Deployable Ka-Band Reflectarray

    NASA Technical Reports Server (NTRS)

    Fang, H.; Sunada, E.; Chaubell, J.; Esteban-Fernandez, D.; Thomson, M.; Nicaise, F.

    2010-01-01

    A large deployable antenna technology for the NASA Surface Water and Ocean Topography (SWOT) Mission is currently being developed by JPL in response to NRC Earth Science Tier 2 Decadal Survey recommendations. This technology is required to enable the SWOT mission due to the fact that no currently available antenna is capable of meeting SWOT's demanding Ka-Band remote sensing requirements. One of the key aspects of this antenna development is to minimize the effect of the on-orbit thermal distortion to the antenna RF performance. An analysis process which includes: 1) the on-orbit thermal analysis to obtain the temperature distribution; 2) structural deformation analysis to get the geometry of the antenna surface; and 3) the RF performance with the given deformed antenna surface has been developed to accommodate the development of this antenna technology. The detailed analysis process and some analysis results will be presented and discussed by this paper.

  5. CASAS: Cancer Survival Analysis Suite, a web based application

    PubMed Central

    Rupji, Manali; Zhang, Xinyan; Kowalski, Jeanne

    2017-01-01

    We present CASAS, a Shiny R-based tool for interactive survival analysis and visualization of results. The tool provides a web-based one-stop shop to perform the following types of survival analysis: quantile, landmark and competing risks, in addition to standard survival analysis. The interface makes it easy to perform such survival analyses and obtain results using interactive Kaplan-Meier and cumulative incidence plots. Univariate analysis can be performed on one or several user-specified variable(s) simultaneously, the results of which are displayed in a single table that includes log-rank p-values and hazard ratios along with their significance. For several quantile survival analyses from multiple cancer types, a single summary grid is constructed. The CASAS package has been implemented in R and is available via http://shinygispa.winship.emory.edu/CASAS/. The developmental repository is available at https://github.com/manalirupji/CASAS/. PMID:28928946

  7. Using Solid State Disk Array as a Cache for LHC ATLAS Data Analysis

    NASA Astrophysics Data System (ADS)

    Yang, W.; Hanushevsky, A. B.; Mount, R. P.; Atlas Collaboration

    2014-06-01

    User data analysis in high energy physics presents a challenge to spinning-disk based storage systems. The analysis is data-intensive, yet reads are small, sparse, and cover a large volume of data files. It is also unpredictable, because users react to storage performance. We describe here a system with an array of Solid State Disks as a non-conventional, standalone file-level cache in front of the spinning-disk storage to help improve the performance of LHC ATLAS user analysis at SLAC. The system uses several days of data access records to make caching decisions, and can also use information from other sources such as a workflow management system. We evaluate the performance of the system both in terms of caching and in terms of its impact on user analysis jobs. The system currently uses Xrootd technology, but the technique can be applied to any storage system.
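A caching decision driven by several days of access records can be sketched as a greedy selection under a capacity budget. The log format, field names, and numbers are hypothetical; the actual Xrootd-based system uses a richer policy than this:

```python
from collections import Counter

def pick_cached_files(accesses, capacity_gb, sizes_gb, window_days, today):
    # accesses: iterable of (day, filename) records from the access log.
    # Count accesses inside the recent window, then greedily cache the
    # hottest files that still fit in the SSD capacity budget.
    recent = Counter(f for day, f in accesses if today - day <= window_days)
    chosen, used = [], 0.0
    for f, _count in recent.most_common():
        if used + sizes_gb[f] <= capacity_gb:
            chosen.append(f)
            used += sizes_gb[f]
    return chosen

# Hypothetical log: file "c" was hot long ago, "a" is hot recently
log = [(9, "a"), (9, "b"), (8, "a"), (1, "c")]
sizes = {"a": 1.0, "b": 1.0, "c": 1.0}
cached = pick_cached_files(log, capacity_gb=1.5, sizes_gb=sizes,
                           window_days=3, today=10)
```

Greedy-by-recent-count is the simplest file-level policy; weighting by bytes read or by workflow hints would refine it.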

  8. Heave-pitch-roll analysis and testing of air cushion landing systems

    NASA Technical Reports Server (NTRS)

    Boghani, A. B.; Captain, K. M.; Wormley, D. N.

    1978-01-01

    The analytical tools (analysis and computer simulation) needed to explain and predict the dynamic operation of air cushion landing systems (ACLS) are described. The following tasks were performed: development of improved analytical models for the fan and the trunk; formulation of a heave-pitch-roll analysis for the complete ACLS; development of a general-purpose computer simulation to evaluate landing and taxi performance of an ACLS-equipped aircraft; and verification and refinement of the analysis by comparison with test data obtained through laboratory testing of a prototype cushion. Simulation capabilities are demonstrated through typical landing and taxi simulations of an ACLS aircraft. Initial results show that fan dynamics have a major effect on system performance. Comparison with laboratory test data (zero forward speed) indicates that the analysis can predict most of the key static and dynamic parameters (pressure, deflection, acceleration, etc.) within a margin of 10 to 25 percent.

  9. Wearable inertial sensors in swimming motion analysis: a systematic review.

    PubMed

    de Magalhaes, Fabricio Anicio; Vannozzi, Giuseppe; Gatta, Giorgio; Fantozzi, Silvia

    2015-01-01

    The use of contemporary technology is widely recognised as a key tool for enhancing competitive performance in swimming. Video analysis is traditionally used by coaches to acquire reliable biomechanical data about swimming performance; however, this approach requires a huge computational effort, thus introducing a delay in providing quantitative information. Inertial and magnetic sensors, including accelerometers, gyroscopes and magnetometers, have been recently introduced to assess the biomechanics of swimming performance. Research in this field has attracted a great deal of interest in the last decade due to the gradual improvement of the performance of sensors and the decreasing cost of miniaturised wearable devices. With the aim of describing the state of the art of current developments in this area, a systematic review of the existing methods was performed using the following databases: PubMed, ISI Web of Knowledge, IEEE Xplore, Google Scholar, Scopus and Science Direct. Twenty-seven articles published in indexed journals and conference proceedings, focusing on the biomechanical analysis of swimming by means of inertial sensors were reviewed. The articles were categorised according to sensor's specification, anatomical sites where the sensors were attached, experimental design and applications for the analysis of swimming performance. Results indicate that inertial sensors are reliable tools for swimming biomechanical analyses.

  10. Latent Profile Analysis of Schizotypy and Paranormal Belief: Associations with Probabilistic Reasoning Performance

    PubMed Central

    Denovan, Andrew; Dagnall, Neil; Drinkwater, Kenneth; Parker, Andrew

    2018-01-01

    This study assessed the extent to which within-individual variation in schizotypy and paranormal belief influenced performance on probabilistic reasoning tasks. A convenience sample of 725 non-clinical adults completed measures assessing schizotypy (Oxford-Liverpool Inventory of Feelings and Experiences; O-Life brief), belief in the paranormal (Revised Paranormal Belief Scale; RPBS) and probabilistic reasoning (perception of randomness, conjunction fallacy, paranormal perception of randomness, and paranormal conjunction fallacy). Latent profile analysis (LPA) identified four distinct groups: class 1, low schizotypy and low paranormal belief (43.9% of sample); class 2, moderate schizotypy and moderate paranormal belief (18.2%); class 3, moderate schizotypy (high cognitive disorganization) and low paranormal belief (29%); and class 4, moderate schizotypy and high paranormal belief (8.9%). Identification of homogeneous classes provided a nuanced understanding of the relative contribution of schizotypy and paranormal belief to differences in probabilistic reasoning performance. Multivariate analysis of covariance revealed that groups with lower levels of paranormal belief (classes 1 and 3) performed significantly better on perception of randomness, but not conjunction problems. Schizotypy had only a negligible effect on performance. Further analysis indicated that framing perception of randomness and conjunction problems in a paranormal context facilitated performance for all groups but class 4. PMID:29434562

  11. The impact of Lean bundles on hospital performance: does size matter?

    PubMed

    Al-Hyari, Khalil; Abu Hammour, Sewar; Abu Zaid, Mohammad Khair Saleem; Haffar, Mohamed

    2016-10-10

    Purpose The purpose of this paper is to study the effect of the implementation of Lean bundles on hospital performance in private hospitals in Jordan and to evaluate how much organization size affects the relationship between Lean bundle implementation and hospital performance. Design/methodology/approach The research follows a quantitative method (descriptive and hypothesis testing). Three statistical techniques were adopted to analyse the data: structural equation modeling and multi-group analysis were used to examine the research hypotheses and to perform the required statistical analysis of the survey data, while reliability analysis and confirmatory factor analysis were used to test construct validity, reliability and measurement loadings. Findings Lean bundles have been identified as an effective approach that can dramatically improve the organizational performance of private hospitals in Jordan. The main Lean bundles - just-in-time, human resource management, and total quality management - are applicable to large, small and medium hospitals, without significant size-dependent differences in benefits. Originality/value To the researchers' best knowledge, this is the first study of the impact of Lean bundle implementation in the healthcare sector in Jordan. It also makes a significant contribution for decision makers in healthcare by increasing their awareness of Lean bundles.

  12. Thermal Analysis of a Disposable, Instrument-Free DNA Amplification Lab-on-a-Chip Platform.

    PubMed

    Pardy, Tamás; Rang, Toomas; Tulp, Indrek

    2018-06-04

    Novel second-generation rapid diagnostics based on nucleic acid amplification tests (NAAT) offer performance metrics on par with clinical laboratories in detecting infectious diseases at the point of care. The diagnostic assay is typically performed within a Lab-on-a-Chip (LoC) component with integrated temperature regulation. However, constraints on device dimensions, cost and power supply inherent with the device format apply to temperature regulation as well. Thermal analysis on simplified thermal models for the device can help overcome these barriers by speeding up thermal optimization. In this work, we perform experimental thermal analysis on the simplified thermal model for our instrument-free, single-use LoC NAAT platform. The system is evaluated further by finite element modelling. Steady-state as well as transient thermal analysis are performed to evaluate the performance of a self-regulating polymer resin heating element in the proposed device geometry. Reaction volumes in the target temperature range of the amplification reaction are estimated in the simulated model to assess compliance with assay requirements. Using the proposed methodology, we demonstrated our NAAT device concept capable of performing loop-mediated isothermal amplification in the 20-25 °C ambient temperature range with 32 min total assay time.
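
    The transient side of such an analysis is often first approximated with a lumped-capacitance model. The sketch below (the 65 °C setpoint, 25 °C ambient and 300 s time constant are assumed round numbers, not the paper's measured values) estimates how long a first-order system takes to come within 1 °C of the amplification temperature:

```python
# First-order lumped-capacitance warm-up sketch (assumed parameters)
import math

def chamber_temp(t, T_amb=25.0, T_set=65.0, tau=300.0):
    """Step response T(t) = T_set - (T_set - T_amb) * exp(-t / tau)."""
    return T_set - (T_set - T_amb) * math.exp(-t / tau)

# Time to come within 1 degC of the setpoint: t = tau * ln((T_set - T_amb) / 1)
t_settle = 300.0 * math.log(40.0 / 1.0)
print(round(chamber_temp(t_settle), 1), round(t_settle / 60, 1))
```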

  14. Some applications of categorical data analysis to epidemiological studies.

    PubMed Central

    Grizzle, J E; Koch, G G

    1979-01-01

    Several examples of categorized data from epidemiological studies are analyzed to illustrate that more informative analysis than tests of independence can be performed by fitting models. All of the analyses fit into a unified conceptual framework that can be performed by weighted least squares. The methods presented show how to calculate point estimates of parameters, asymptotic variances, and asymptotically valid chi-squared tests. The examples presented are analysis of relative risks estimated from several 2 x 2 tables, analysis of selected features of life tables, construction of synthetic life tables from cross-sectional studies, and analysis of dose-response curves. PMID:540590
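
    As a concrete instance of the first example (the 2 x 2 counts below are hypothetical, not the paper's data), the relative risk from a single table together with its delta-method variance on the log scale, i.e. the per-table quantities that a weighted least squares analysis would combine across tables:

```python
# Hypothetical 2x2-table relative risk with asymptotic log-scale variance
import math

def relative_risk(a, b, c, d):
    """Rows: exposed (a events, b non-events) and unexposed (c, d)."""
    rr = (a / (a + b)) / (c / (c + d))
    var_log = 1 / a - 1 / (a + b) + 1 / c - 1 / (c + d)  # delta method
    return rr, var_log

rr, v = relative_risk(30, 70, 15, 85)   # invented counts
# 95% confidence interval on the log scale, exponentiated back
lo, hi = (math.exp(math.log(rr) + s * 1.96 * math.sqrt(v)) for s in (-1, 1))
print(round(rr, 2), round(lo, 2), round(hi, 2))
```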

  15. Automated Hazard Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riddle, F. J.

    2003-06-26

    The Automated Hazard Analysis (AHA) application is a software tool used to conduct job hazard screening and analysis of tasks to be performed in Savannah River Site facilities. The AHA application provides a systematic approach to the assessment of safety and environmental hazards associated with specific tasks, and the identification of controls, regulations, and other requirements needed to perform those tasks safely. AHA is to be integrated into existing Savannah River Site work control and job hazard analysis processes. Utilization of AHA will improve the consistency and completeness of hazard screening and analysis, and increase the effectiveness of the work planning process.

  16. Analysis Tools for CFD Multigrid Solvers

    NASA Technical Reports Server (NTRS)

    Mineck, Raymond E.; Thomas, James L.; Diskin, Boris

    2004-01-01

    Analysis tools are needed to guide the development and evaluate the performance of multigrid solvers for the fluid flow equations. Classical analysis tools, such as local mode analysis, often fail to accurately predict performance. Two-grid analysis tools, herein referred to as Idealized Coarse Grid and Idealized Relaxation iterations, have been developed and evaluated within a pilot multigrid solver. These new tools are applicable to general systems of equations and/or discretizations and point to problem areas within an existing multigrid solver. Idealized Relaxation and Idealized Coarse Grid are applied in developing textbook-efficient multigrid solvers for incompressible stagnation flow problems.

  17. Rapid isolation of biomarkers for compound specific radiocarbon dating using high-performance liquid chromatography and flow injection analysis-atmospheric pressure chemical ionisation mass spectrometry.

    PubMed

    Smittenberg, Rienk H; Hopmans, Ellen C; Schouten, Stefan; Sinninghe Damsté, Jaap S

    2002-11-29

    Repeated semi-preparative normal-phase HPLC was performed to isolate selected biomarkers from sediment extracts for radiocarbon analysis. Flow injection analysis-mass spectrometry was used for rapid analysis of collected fractions to evaluate the separation procedure, taking only 1 min per fraction. In this way 100-1000 microg of glycerol dialkyl glycerol tetraethers, sterol fractions and chlorophyll-derived phytol were isolated from typically 100 g of marine sediment, i.e., in sufficient quantities for radiocarbon analysis, without significant carbon isotopic fractionation or contamination.

  18. Capability of the Gas Analysis and Testing Laboratory at the NASA Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Broerman, Craig; Jimenez, Javier; Sweterlitsch, Jeff

    2012-01-01

    The Gas Analysis and Testing Laboratory is an integral part of the testing performed at the NASA Johnson Space Center. The Gas Analysis and Testing Laboratory is a high performance laboratory providing real time analytical instruments to support manned and unmanned testing. The lab utilizes precision gas chromatographs, gas analyzers and spectrophotometers to support the technology development programs within the NASA community. The Gas Analysis and Testing Laboratory works with a wide variety of customers and provides engineering support for user-specified applications in compressed gas, chemical analysis, general and research laboratory.

  19. Capability of the Gas Analysis and Testing Laboratory at the NASA Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Broerman, Craig; Jimenez, Javier; Sweterlitsch, Jeff

    2011-01-01

    The Gas Analysis and Testing Laboratory is an integral part of the testing performed at the NASA Johnson Space Center. The Gas Analysis and Testing Laboratory is a high performance laboratory providing real time analytical instruments to support manned and unmanned testing. The lab utilizes precision gas chromatographs, gas analyzers and spectrophotometers to support the technology development programs within the NASA community. The Gas Analysis and Testing Laboratory works with a wide variety of customers and provides engineering support for user-specified applications in compressed gas, chemical analysis, general and research laboratory.

  20. Optical ensemble analysis of intraocular lens performance through a simulated clinical trial with ZEMAX.

    PubMed

    Zhao, Huawei

    2009-01-01

    A ZEMAX model was constructed to simulate a clinical trial of intraocular lenses (IOLs) based on a clinically oriented Monte Carlo ensemble analysis using postoperative ocular parameters. The purpose of this model is to test the feasibility of streamlining and optimizing both the design process and the clinical testing of IOLs. This optical ensemble analysis (OEA) is also validated. Simulated pseudophakic eyes were generated by using the tolerancing and programming features of ZEMAX optical design software. OEA methodology was verified by demonstrating that the results of clinical performance simulations were consistent with previously published clinical performance data using the same types of IOLs. From these results we conclude that the OEA method can objectively simulate the potential clinical trial performance of IOLs.

  1. Computer program for design and performance analysis of navigation-aid power systems. Program documentation. Volume 1: Software requirements document

    NASA Technical Reports Server (NTRS)

    Goltz, G.; Kaiser, L. M.; Weiner, H.

    1977-01-01

    A computer program has been developed for designing and analyzing the performance of solar array/battery power systems for the U.S. Coast Guard Navigational Aids. This program is called the Design Synthesis/Performance Analysis (DSPA) Computer Program. The basic function of the Design Synthesis portion of the DSPA program is to evaluate functional and economic criteria to provide specifications for viable solar array/battery power systems. The basic function of the Performance Analysis portion of the DSPA program is to simulate the operation of solar array/battery power systems under specific loads and environmental conditions. This document establishes the software requirements for the DSPA computer program, discusses the processing that occurs within the program, and defines the necessary interfaces for operation.

  2. Analysis of the Energy Performance of the Chesapeake Bay Foundation's Philip Merrill Environmental Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Griffith, B.; Deru, M.; Torcellini, P.

    2005-04-01

    The Chesapeake Bay Foundation designed their new headquarters building to minimize its environmental impact on the already highly polluted Chesapeake Bay by incorporating numerous high-performance energy saving features into the building design. CBF then contacted NREL to perform a nonbiased energy evaluation of the building. Because their building attracted much attention in the sustainable design community, an unbiased evaluation was necessary to help designers replicate successes and identify and correct problem areas. This report focuses on NREL's monitoring and analysis of the overall energy performance of the building.

  3. Measuring the performance of Internet companies using a two-stage data envelopment analysis model

    NASA Astrophysics Data System (ADS)

    Cao, Xiongfei; Yang, Feng

    2011-05-01

    In exploring the business operation of Internet companies, few researchers have used data envelopment analysis (DEA) to evaluate their performance. Since Internet companies have a two-stage production process - marketability and profitability - this study employs a relational two-stage DEA model to assess the efficiency of 40 dot-com firms. The results show that the model performs better in measuring efficiency and is able to discriminate the causes of inefficiency, thus helping business management to be more effective by providing more guidance for business performance improvement.
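
    The building block of such a model is the standard input-oriented CCR efficiency score, computed per decision-making unit (DMU) by linear programming. A hedged sketch with one input, one output and three invented DMUs (not the study's 40 firms; the relational two-stage linkage is omitted):

```python
# Input-oriented CCR DEA sketch via linear programming (toy data)
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 4.0, 3.0]])  # inputs:  one row per input, one column per DMU
Y = np.array([[2.0, 2.0, 3.0]])  # outputs: one row per output

def ccr_efficiency(o):
    n = X.shape[1]
    c = np.r_[1.0, np.zeros(n)]                         # minimize theta
    A_in = np.hstack([-X[:, [o]], X])                   # sum(lam*x) <= theta*x_o
    A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y])  # sum(lam*y) >= y_o
    b = np.r_[np.zeros(X.shape[0]), -Y[:, o]]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=b,
                  bounds=[(0, None)] * (n + 1))
    return round(res.fun, 3)

print([ccr_efficiency(o) for o in range(3)])  # the second DMU is inefficient
```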

  4. Combined cumulative sum (CUSUM) and chronological environmental analysis as a tool to improve the learning environment for linear-probe endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA) trainees: a pilot study.

    PubMed

    Norisue, Yasuhiro; Tokuda, Yasuharu; Juarez, Mayrol; Uchimido, Ryo; Fujitani, Shigeki; Stoeckel, David A

    2017-02-07

    Cumulative sum (CUSUM) analysis can be used to continuously monitor the performance of an individual or process and detect deviations from a preset or standard level of achievement. However, no previous study has evaluated the utility of CUSUM analysis in facilitating timely environmental assessment and interventions to improve performance of linear-probe endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA). The aim of this study was to evaluate the usefulness of combined CUSUM and chronological environmental analysis as a tool to improve the learning environment for EBUS-TBNA trainees. This study was an observational chart review. To determine if performance was acceptable, CUSUM analysis was used to track procedural outcomes of trainees in EBUS-TBNA. To investigate chronological changes in the learning environment, multivariate logistic regression analysis was used to compare several indices before and after time points when significant changes occurred in proficiency. Presence of an additional attending bronchoscopist was inversely associated with nonproficiency (odds ratio, 0.117; 95% confidence interval, 0-0.749; P = 0.019). Other factors, including presence of an on-site cytopathologist and dose of sedatives used, were not significantly associated with duration of nonproficiency. Combined CUSUM and chronological environmental analysis may be useful in hastening interventions that improve performance of EBUS-TBNA.
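
    The CUSUM chart used for this kind of procedural monitoring is typically the log-likelihood-ratio form for binary outcomes. A small sketch (the acceptable and unacceptable failure rates p0 and p1, and the outcome sequence, are invented, not the study's values):

```python
# Log-likelihood-ratio CUSUM for binary procedural outcomes (assumed rates)
import math

def cusum_scores(outcomes, p0=0.1, p1=0.3):
    """Score rises on each failure, falls on each success, floored at zero;
    crossing a preset decision limit h would flag non-proficient performance."""
    s_fail = math.log(p1 / p0)
    s_succ = math.log((1 - p1) / (1 - p0))
    scores, s = [], 0.0
    for failed in outcomes:
        s = max(0.0, s + (s_fail if failed else s_succ))
        scores.append(round(s, 3))
    return scores

# 1 = inadequate aspiration, 0 = adequate sample
print(cusum_scores([1, 1, 0, 1, 0, 0, 0]))
```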

  5. Grey Relational Analysis Coupled with Principal Component Analysis for Optimization of Stereolithography Process to Enhance Part Quality

    NASA Astrophysics Data System (ADS)

    Raju, B. S.; Sekhar, U. Chandra; Drakshayani, D. N.

    2017-08-01

    The paper investigates optimization of the stereolithography process for SL5530 epoxy resin material to enhance part quality. The major performance characteristics selected to evaluate the process are tensile strength, flexural strength, impact strength and density, and the corresponding process parameters are layer thickness, orientation and hatch spacing. Because the process intrinsically involves tuning multiple parameters, grey relational analysis, which uses the grey relational grade as a performance index, is adopted to determine the optimal combination of process parameters. Moreover, principal component analysis is applied to evaluate the weighting values corresponding to the various performance characteristics, so that their relative importance can be properly and objectively described. The results of confirmation experiments reveal that grey relational analysis coupled with principal component analysis can effectively acquire the optimal combination of process parameters. Hence, the proposed approach can be a useful tool for improving process parameters in stereolithography, which is valuable information for machine designers as well as RP machine users.
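
    The grey relational step can be sketched as follows (the three trials and their response values are invented, and equal weights stand in for the PCA-derived weights the paper uses):

```python
# Grey relational grade sketch: larger-the-better normalization, then
# grey relational coefficients (zeta = 0.5) averaged per trial (toy data)
import numpy as np

def grey_relational_grade(R, zeta=0.5):
    Z = (R - R.min(axis=0)) / (R.max(axis=0) - R.min(axis=0))
    delta = np.abs(1.0 - Z)  # deviation from the ideal (normalized) sequence
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coeff.mean(axis=1)  # equal weights; PCA would reweight responses

# Rows: trials; columns: tensile, flexural, impact strength (hypothetical)
R = np.array([[52.0, 88.0, 0.30],
              [60.0, 95.0, 0.42],
              [57.0, 90.0, 0.38]])
grades = grey_relational_grade(R)
print(np.round(grades, 3), int(grades.argmax()))  # the second trial ranks best
```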

  6. How the study of online collaborative learning can guide teachers and predict students' performance in a medical course.

    PubMed

    Saqr, Mohammed; Fors, Uno; Tedre, Matti

    2018-02-06

    Collaborative learning facilitates reflection, diversifies understanding and stimulates skills of critical and higher-order thinking. Although the benefits of collaborative learning have long been recognized, it is still rarely studied by social network analysis (SNA) in medical education, and the relationship of parameters that can be obtained via SNA with students' performance remains largely unknown. The aim of this work was to assess the potential of SNA for studying online collaborative clinical case discussions in a medical course and to find out which activities correlate with better performance and help predict final grade or explain variance in performance. Interaction data were extracted from the learning management system (LMS) forum module of the Surgery course in Qassim University, College of Medicine. The data were analyzed using social network analysis. The analysis included visual as well as statistical analysis. Correlation with students' performance was calculated, and automatic linear regression was used to predict students' performance. By using social network analysis, we were able to analyze a large number of interactions in online collaborative discussions and gain an overall insight of the course social structure, track the knowledge flow and the interaction patterns, as well as identify the active participants and the prominent discussion moderators. When augmented with calculated network parameters, SNA offered an accurate view of the course network, each user's position, and level of connectedness. Results from correlation coefficients, linear regression, and logistic regression indicated that a student's position and role in information relay in online case discussions, combined with the strength of that student's network (social capital), can be used as predictors of performance in relevant settings. By using social network analysis, researchers can analyze the social structure of an online course and reveal important information about students' and teachers' interactions that can be valuable in guiding teachers, improving students' engagement, and contributing to learning analytics insights.
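
    A minimal sketch of the idea (the forum reply edges, student names and grades below are fabricated for illustration; the study used richer centrality measures and regression models):

```python
# Toy SNA sketch: out-degree centrality from reply edges vs. final grades
import numpy as np
from collections import Counter

# (poster, replied_to) pairs extracted from a hypothetical discussion forum
edges = [("a", "b"), ("a", "c"), ("a", "d"), ("b", "a"),
         ("b", "c"), ("c", "a"), ("d", "a"), ("a", "e")]
out_degree = Counter(u for u, _ in edges)

students = ["a", "b", "c", "d", "e"]
grades = [92.0, 80.0, 74.0, 70.0, 60.0]
centrality = [out_degree.get(s, 0) for s in students]

# Pearson correlation between forum activity and performance
r = np.corrcoef(centrality, grades)[0, 1]
print(centrality, round(r, 2))
```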

  7. 10 CFR 431.445 - Determination of small electric motor efficiency.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... statistical analysis, computer simulation or modeling, or other analytic evaluation of performance data. (3... statistical analysis, computer simulation or modeling, and other analytic evaluation of performance data on.... (ii) If requested by the Department, the manufacturer shall conduct simulations to predict the...

  8. Analytical techniques and instrumentation: A compilation. [analytical instrumentation, materials performance, and systems analysis

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.

  9. Static analysis techniques for semiautomatic synthesis of message passing software skeletons

    DOE PAGES

    Sottile, Matthew; Dagit, Jason; Zhang, Deli; ...

    2015-06-29

    The design of high-performance computing architectures demands performance analysis of large-scale parallel applications to derive various parameters concerning hardware design and software development. The process of performance analysis and benchmarking an application can be done in several ways with varying degrees of fidelity. One of the most cost-effective ways is to do a coarse-grained study of large-scale parallel applications through the use of program skeletons. The concept of a “program skeleton” that we discuss in this article is an abstracted program that is derived from a larger program where source code that is determined to be irrelevant is removed for the purposes of the skeleton. In this work, we develop a semiautomatic approach for extracting program skeletons based on compiler program analysis. Finally, we demonstrate correctness of our skeleton extraction process by comparing details from communication traces, as well as show the performance speedup of using skeletons by running simulations in the SST/macro simulator.

  10. High performance liquid chromatography fingerprint analysis for quality control of brotowali (Tinospora crispa)

    NASA Astrophysics Data System (ADS)

    Syarifah, V. B.; Rafi, M.; Wahyuni, W. T.

    2017-05-01

    Brotowali (Tinospora crispa) is widely used in Indonesia as an ingredient of herbal medicine formulations. To ensure the quality, safety, and efficacy of herbal medicine products, their chemical constituents should be continuously evaluated. High performance liquid chromatography (HPLC) fingerprinting is a powerful technique for this quality control process. In this study, an HPLC fingerprint analysis method was developed for quality control of brotowali. HPLC analysis was performed on a C18 column with detection by a photodiode array detector. The optimum mobile phase for the brotowali fingerprint was acetonitrile (ACN) and 0.1% formic acid in gradient elution mode at a flow rate of 1 mL/min. The number of peaks detected in the HPLC fingerprint of brotowali was 32 for stems and 23 for leaves. Berberine, used as the marker compound, was detected at a retention time of 20.525 minutes. Evaluation of analytical performance, including precision, reproducibility, and stability, proved that this HPLC fingerprint analysis is reliable and can be applied for quality control of brotowali.

  11. Convective Array Cooling for a Solar Powered Aircraft

    NASA Technical Reports Server (NTRS)

    Colozza, Anthony J.; Dolce, James (Technical Monitor)

    2003-01-01

    A general characteristic of photovoltaics is that they increase in efficiency as their operating temperature decreases. Based on this principle, the ability to increase a solar aircraft's performance by cooling the solar cells was examined. The solar cells were cooled by channeling some air underneath the cells, providing a convective cooling path to the back side of the array. A full energy balance and flow analysis of the air within the cooling passage was performed. The analysis was first performed at a preliminary level to estimate the benefits of the cooling passage, and it established a clear benefit. Based on these results, a more detailed analysis was performed, from which cell temperatures were calculated and array output power throughout the day was determined with and without the cooling passage. The results showed that if the flow through the cooling passage remains laminar, the benefit in increased output power more than offsets the drag induced by the cooling passage.

  12. Performance Analysis, Design Considerations, and Applications of Extreme-Scale In Situ Infrastructures

    DOE PAGES

    Ayachit, Utkarsh; Bauer, Andrew; Duque, Earl P. N.; ...

    2016-11-01

    A key trend facing extreme-scale computational science is the widening gap between computational and I/O rates, and the challenge that follows is how to best gain insight from simulation data when it is increasingly impractical to save it to persistent storage for subsequent visual exploration and analysis. One approach to this challenge is centered around the idea of in situ processing, where visualization and analysis processing is performed while data is still resident in memory. Our paper examines several key design and performance issues related to the idea of in situ processing at extreme scale on modern platforms: scalability, overhead, performance measurement and analysis, comparison and contrast with a traditional post hoc approach, and interfacing with simulation codes. We illustrate these principles in practice with studies, conducted on large-scale HPC platforms, that include a miniapplication and multiple science application codes, one of which demonstrates in situ methods in use at greater than 1M-way concurrency.

  13. Supersonic through-flow fan assessment

    NASA Technical Reports Server (NTRS)

    Kepler, C. E.; Champagne, G. A.

    1988-01-01

    A study was conducted to assess the performance potential of a supersonic through-flow fan engine for supersonic cruise aircraft. It included a mean-line analysis of fans designed to operate with in-flow velocities ranging from subsonic to high supersonic speeds. The fan performance generated was used to estimate the performance of supersonic fan engines designed for four applications: a Mach 2.3 supersonic transport, a Mach 2.5 fighter, a Mach 3.5 cruise missile, and a Mach 5.0 cruise vehicle. For each application an engine was conceptualized, fan performance and engine performance calculated, weight estimates made, engine installed in a hypothetical vehicle, and mission analysis was conducted.

  14. Performance analysis of LAN bridges and routers

    NASA Technical Reports Server (NTRS)

    Hajare, Ankur R.

    1991-01-01

    Bridges and routers are used to interconnect Local Area Networks (LANs). The performance of these devices is important since they can become bottlenecks in large multi-segment networks. Performance metrics and test methodologies for bridges and routers have not been standardized, and performance data reported by vendors are not applicable to the actual scenarios encountered in an operational network. However, vendor-provided data can be used to calibrate models of bridges and routers that, along with other models, yield performance data for a network. Several tools are available for modeling bridges and routers; Network II.5 was used in this work. The results of the analysis of some bridges and routers are presented.

  15. Multiple Criteria and Multiple Periods Performance Analysis: The Comparison of North African Railways

    NASA Astrophysics Data System (ADS)

    Sabri, Karim; Colson, Gérard E.; Mbangala, Augustin M.

    2008-10-01

    Multi-period differences in technical and financial performance are analysed by comparing five North African railways over the period 1990-2004. A first approach is based on the Malmquist DEA TFP index for measuring total factor productivity change, decomposed into technical efficiency change and technological change. A multiple criteria analysis is also performed using the PROMETHEE II method and the software ARGOS. These methods provide complementary detailed information: Malmquist discriminates between technological and management progress, while PROMETHEE captures two dimensions of performance, service to the community and enterprise performance, which are often in conflict.
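
    The PROMETHEE II ranking step amounts to computing a net outranking flow for each alternative. The sketch below uses the "usual" preference function, equal criterion weights and three invented alternatives (not the five railways' actual indicator values):

```python
# PROMETHEE II net flow sketch: usual criterion, equal weights (toy data)
import numpy as np

def promethee_net_flow(A, weights):
    n = len(A)
    phi = np.zeros(n)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            # usual criterion: full preference on every criterion where i beats j
            pref = (A[i] > A[j]).astype(float) @ weights
            phi[i] += pref   # contribution to i's positive flow
            phi[j] -= pref   # contribution to j's negative flow
    return phi / (n - 1)     # net flow; higher ranks better

A = np.array([[0.8, 0.6],    # rows: alternatives, columns: criteria
              [0.7, 0.9],
              [0.5, 0.4]])
w = np.array([0.5, 0.5])
print(promethee_net_flow(A, w))
```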

  16. Game Performance Versus Competitive Performance in the World Championship of Handball 2011

    PubMed Central

    Gutiérrez, Óscar; Ruiz, José L.

    2013-01-01

    This article assesses the game performance of the teams participating in the Men’s World Championship of Handball of 2011 by using Data Envelopment Analysis (DEA) and the cross-efficiency evaluation. DEA uses Linear Programming to yield a measure of the overall performance of the game of particular teams, and allows us to identify relative strengths and weaknesses by means of benchmarking analysis. The cross-efficiency evaluation provides a peer-appraisal of the teams with different patterns of game, and makes it possible to rank them. Comparisons between this ranking and the final classification in the championship provide an insight into the game performance of the teams versus their competitive performance. We highlight the fact that France, which is the world champion, is also identified as an “all-round” performer in our game performance assessment. PMID:23717363

  17. ANALYSING PERFORMANCE ASSESSMENT IN PUBLIC SERVICES: HOW USEFUL IS THE CONCEPT OF A PERFORMANCE REGIME?

    PubMed

    Martin, Steve; Nutley, Sandra; Downe, James; Grace, Clive

    2016-03-01

    Approaches to performance assessment have been described as 'performance regimes', but there has been little analysis of what is meant by this concept and whether it has any real value. We draw on four perspectives on regimes - 'institutions and instruments', 'risk regulation regimes', 'internal logics and effects' and 'analytics of government' - to explore how the concept of a multi-dimensional regime can be applied to performance assessment in public services. We conclude that the concept is valuable. It helps to frame comparative and longitudinal analyses of approaches to performance assessment and draws attention to the ways in which public service performance regimes operate at different levels, how they change over time and what drives their development. Areas for future research include analysis of the impacts of performance regimes and interactions between their visible features (such as inspections, performance indicators and star ratings) and the veiled rationalities which underpin them.

  18. Numerical Modeling of Pulse Detonation Rocket Engine Gasdynamics and Performance

    NASA Technical Reports Server (NTRS)

    2003-01-01

    This paper presents viewgraphs on the numerical modeling of pulse detonation rocket engines (PDRE), with an emphasis on the Gasdynamics and performance analysis of these engines. The topics include: 1) Performance Analysis of PDREs; 2) Simplified PDRE Cycle; 3) Comparison of PDRE and Steady-State Rocket Engines (SSRE) Performance; 4) Numerical Modeling of Quasi 1-D Rocket Flows; 5) Specific PDRE Geometries Studied; 6) Time-Accurate Thrust Calculations; 7) PDRE Performance (Geometries A B C and D); 8) PDRE Blowdown Gasdynamics (Geom. A B C and D); 9) PDRE Geometry Performance Comparison; 10) PDRE Blowdown Time (Geom. A B C and D); 11) Specific SSRE Geometry Studied; 12) Effect of F-R Chemistry on SSRE Performance; 13) PDRE/SSRE Performance Comparison; 14) PDRE Performance Study; 15) Grid Resolution Study; and 16) Effect of F-R Chemistry on SSRE Exit Species Mole Fractions.

  19. Kinetic performance comparison of fully and superficially porous particles with a particle size of 5 µm: intrinsic evaluation and application to the impurity analysis of griseofulvin.

    PubMed

    Kahsay, Getu; Broeckhoven, Ken; Adams, Erwin; Desmet, Gert; Cabooter, Deirdre

    2014-05-01

    After the great commercial success of sub-3 µm superficially porous particles, vendors are now also starting to commercialize 5 µm superficially porous particles, as an alternative to their fully porous counterparts which are routinely used in pharmaceutical analysis. In this study, the performance of 5 µm superficially porous particles was compared to that of fully porous 5 µm particles in terms of efficiency, separation performance and loadability on a conventional HPLC instrument. Van Deemter and kinetic plots were first used to evaluate the efficiency and performance of both particle types using alkylphenones as a test mixture. The van Deemter and kinetic plots showed that the superficially porous particles provide a superior kinetic performance compared to the fully porous particles over the entire relevant range of separation conditions, when both support types were evaluated at the same operating pressure. The same observations were made both for isocratic and gradient analysis. The superior performance was further demonstrated for the separation of a pharmaceutical compound (griseofulvin) and its impurities, where a gain in analysis time of around a factor of 2 could be obtained using the superficially porous particles. Finally, both particle types were evaluated in terms of loadability by plotting the resolution of the active pharmaceutical ingredient and its closest impurity as a function of the signal-to-noise ratio obtained for the smallest impurity. It was demonstrated that the superficially porous particles show better separation performance for griseofulvin and its impurities without significantly compromising sensitivity due to loadability issues in comparison with their fully porous counterparts. Moreover, these columns can be used on conventional equipment without modifications to obtain a significant improvement in analysis time.
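
    The van Deemter evaluation mentioned above fits plate height against linear velocity, H(u) = A + B/u + C·u, whose optimum lies at u_opt = sqrt(B/C) with H_min = A + 2·sqrt(B·C). A numerical sketch (the coefficient values below are assumed round numbers, not the measured column data):

```python
# Van Deemter curve sketch with assumed coefficients (not measured data)
import math

A, B, C = 1.5, 5.0, 0.05  # eddy, longitudinal-diffusion, mass-transfer terms

def plate_height(u):
    return A + B / u + C * u

u_opt = math.sqrt(B / C)          # linear velocity minimizing plate height
h_min = A + 2 * math.sqrt(B * C)  # minimum plate height at u_opt
print(u_opt, h_min, plate_height(u_opt))
```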

  20. How Expert Pilots Think: Cognitive Processes in Expert Decision Making

    DTIC Science & Technology

    1993-02-01

    Management (CRM)...Advanced Qualification Program (AQP)...Cognitive Task Analysis (CTA)...Selecting realistic EDM scenarios with critical events and performing a cognitive task analysis of novice vs. expert decision making for these events is a basic requirement for

  1. Runtime Speculative Software-Only Fault Tolerance

    DTIC Science & Technology

    2012-06-01

    reliability of RSFT, an in-depth analysis of its window of vulnerability is also discussed and measured via simulated fault injection. The performance...propagation of faults through the entire program. For optimal performance, these techniques have to use heroic alias analysis to find the minimum set of...affect program output. No program source code or alias analysis is needed to analyze the fault propagation ahead of time. 2.3 Limitations of Existing
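
    The redundant-execution idea behind software-only fault tolerance, and the simulated fault injection used to measure it, can be illustrated with a minimal sketch. The single-bit-flip injection model here is a common assumption in fault-injection studies, not RSFT's actual mechanism:

```python
import random

def detect_fault(compute, x, inject=False):
    """Run the computation twice and compare: software-only redundancy
    catches a transient fault when the two copies disagree."""
    a = compute(x)
    b = compute(x)
    if inject:                           # simulated fault injection:
        a ^= 1 << random.randrange(8)    # flip one low-order bit of copy A
    return a == b                        # True -> results agree, no fault seen

def square(v):
    return v * v

print(detect_fault(square, 12))               # True  (fault-free run)
print(detect_fault(square, 12, inject=True))  # False (injected fault caught)
```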

  2. Prospective Molecular Characterization of Burn Wound Colonization: Novel Tools and Analysis

    DTIC Science & Technology

    2014-02-01

    from patients with endocarditis and wound/soft tissue infections, have been sequenced and an initial analysis performed. Finally, enrollment in the...Price LB. Analysis of S. aureus isolates from endocarditis and skin/soft tissue infections A strict blast search was performed on the S. aureus...targets differentiating the cellulitis and endocarditis isolates. This was followed with a basic Pearson’s Chi-squared test with Yates’ continuity

  3. Is Heart Rate Variability Better Than Routine Vital Signs for Prehospital Identification of Major Hemorrhage?

    DTIC Science & Technology

    2015-01-01

    The primary outcome was hemorrhagic injury plus different PRBC transfusion volumes. We performed multivariate regression analysis using HRV metrics and routine vital signs to test the hypothesis that...study sponsors did not have any role in the study design, data collection, analysis and interpretation of data, report writing, or the decision to...

  4. Acoustic prediction methods for the NASA generalized advanced propeller analysis system (GAPAS)

    NASA Technical Reports Server (NTRS)

    Padula, S. L.; Block, P. J. W.

    1984-01-01

    Classical methods of propeller performance analysis are coupled with state-of-the-art Aircraft Noise Prediction Program (ANOPP) techniques to yield a versatile design tool, the NASA Generalized Advanced Propeller Analysis System (GAPAS), for novel quiet and efficient propellers. ANOPP is a collection of modular specialized programs. GAPAS as a whole addresses blade geometry and aerodynamics, rotor performance and loading, and subsonic propeller noise.

  5. Quality assessment of crude and processed Arecae semen based on colorimeter and HPLC combined with chemometrics methods.

    PubMed

    Sun, Meng; Yan, Donghui; Yang, Xiaolu; Xue, Xingyang; Zhou, Sujuan; Liang, Shengwang; Wang, Shumei; Meng, Jiang

    2017-05-01

    Raw Arecae Semen, the seed of Areca catechu L., as well as Arecae Semen Tostum and Arecae Semen Carbonisata are traditionally processed by stir-baking for subsequent use in a variety of clinical applications. These three Arecae Semen types, important Chinese herbal drugs, have been used in China and other Asian countries for thousands of years. In this study, the sensory technologies of a colorimeter and sensitive validated high-performance liquid chromatography with diode array detection were employed to discriminate raw Arecae Semen and its processed drugs. The color parameters of the samples were determined by a colorimeter instrument CR-410. Moreover, the fingerprints of the four alkaloids arecaidine, guvacine, arecoline and guvacoline were surveyed by high-performance liquid chromatography. Subsequently, Student's t test, analysis of variance, fingerprint similarity analysis, hierarchical cluster analysis, principal component analysis, factor analysis and Pearson's correlation test were performed for final data analysis. The results obtained demonstrated a significant color change characteristic for components in raw Arecae Semen and its processed drugs. Crude and processed Arecae Semen could be determined based on colorimetry and high-performance liquid chromatography with a diode array detector coupled with chemometrics methods for a comprehensive quality evaluation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
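
    Of the chemometrics methods listed, Pearson's correlation test is the simplest to sketch. The colour-parameter and alkaloid values below are invented for illustration, not data from the study:

```python
import math

def pearson_r(x, y):
    """Pearson's correlation coefficient (pure-Python sketch)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented values: one colour parameter vs. arecoline content per sample
colour = [60.1, 55.3, 48.7, 42.2, 35.9]
arecoline = [0.51, 0.46, 0.40, 0.33, 0.27]
print(round(pearson_r(colour, arecoline), 3))  # near 1 for this fabricated data
```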

  6. Space Life-Support Engineering Program

    NASA Technical Reports Server (NTRS)

    Seagrave, Richard C. (Principal Investigator)

    1995-01-01

    This report covers the seventeen months of work performed under an extended one year NASA University Grant awarded to Iowa State University to perform research on topics relating to the development of closed-loop long-term life support systems with the initial principal focus on space water management. In the first phase of the program, investigators from chemistry and chemical engineering with demonstrated expertise in systems analysis, thermodynamics, analytical chemistry and instrumentation, performed research and development in two major related areas; the development of low-cost, accurate, and durable sensors for trace chemical and biological species, and the development of unsteady-state simulation packages for use in the development and optimization of control systems for life support systems. In the second year of the program, emphasis was redirected towards concentrating on the development of dynamic simulation techniques and software and on performing a thermodynamic systems analysis, centered on availability or energy analysis, in an effort to begin optimizing the systems needed for water purification. The third year of the program, the subject of this report, was devoted to the analysis of the water balance for the interaction between humans and the life support system during space flight and exercise, to analysis of the cardiopulmonary systems of humans during space flight, and to analysis of entropy production during operation of the air recovery system during space flight.

  7. Dehydrated Carbon Coupled with Laser-Induced Breakdown Spectrometry (LIBS) for the Determination of Heavy Metals in Solutions.

    PubMed

    Niu, Guanghui; Shi, Qi; Xu, Mingjun; Lai, Hongjun; Lin, Qingyu; Liu, Kunping; Duan, Yixiang

    2015-10-01

    In this article, a novel and alternative method of laser-induced breakdown spectroscopy (LIBS) analysis for liquid samples is proposed, which involves the removal of metal ions from a liquid to a solid substrate using a cost-efficient adsorbent, dehydrated carbon, obtained via a dehydration reaction. Using this new technique, researchers can detect trace metal ions in solutions qualitatively and quantitatively, and the drawbacks of performing liquid analysis using LIBS can be avoided because the analysis is performed on a solid surface. To achieve better performance using this technique, we considered parameters potentially influencing both adsorption performance and LIBS analysis. The calibration curves were evaluated, and the limits of detection obtained for Cu(2+), Pb(2+), and Cr(3+) were 0.77, 0.065, and 0.46 mg/L, respectively, which are better than those in previous studies. In addition, compared to other adsorbents, the adsorbent used in this technique is much cheaper, easier to obtain, and contains few or no elements other than C, H, and O that could cause spectral interference during analysis. We also used the recommended method to analyze spiked samples, obtaining satisfactory results. Thus, this new technique is helpful and promising for use in wastewater analysis and management.
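
    A calibration curve and the common 3·sigma/slope detection-limit estimate can be sketched as follows. The concentrations, intensities, and blank-noise value below are hypothetical, not the paper's measurements:

```python
def linfit(x, y):
    """Ordinary least-squares slope/intercept of a calibration curve."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sxx
    return slope, my - slope * mx

def lod(sigma_blank, slope):
    """Common detection-limit estimate: LOD = 3 * sigma_blank / slope."""
    return 3.0 * sigma_blank / slope

# Hypothetical Pb calibration: concentration (mg/L) vs. line intensity (a.u.)
conc = [0.0, 0.5, 1.0, 2.0, 4.0]
inten = [12.0, 112.0, 214.0, 410.0, 812.0]
m, b = linfit(conc, inten)
print(round(lod(2.2, m), 3))  # LOD in mg/L, assuming blank noise sigma = 2.2
```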

  8. The Effects of Music on Microsurgical Technique and Performance: A Motion Analysis Study.

    PubMed

    Shakir, Afaaf; Chattopadhyay, Arhana; Paek, Laurence S; McGoldrick, Rory B; Chetta, Matthew D; Hui, Kenneth; Lee, Gordon K

    2017-05-01

    Music is commonly played in operating rooms (ORs) throughout the country. If a preferred genre of music is played, surgeons have been shown to perform surgical tasks more quickly and with greater accuracy. However, there are currently no studies investigating the effects of music on microsurgical technique. Motion analysis technology has recently been validated in the objective assessment of plastic surgery trainees' performance of microanastomoses. Here, we aimed to examine the effects of music on microsurgical skills using motion analysis technology as a primary objective assessment tool. Residents and fellows in the Plastic and Reconstructive Surgery program were recruited to complete a demographic survey and participate in microsurgical tasks. Each participant completed 2 arterial microanastomoses on a chicken foot model, one with music playing, and the other without music playing. Participants were blinded to the study objectives and encouraged to perform their best. The order of music and no music was randomized. Microanastomoses were video recorded using a digitalized S-video system and deidentified. Video segments were analyzed using ProAnalyst motion analysis software for automatic noncontact markerless video tracking of the needle driver tip. Nine residents and 3 plastic surgery fellows were tested. Reported microsurgical experience ranged from 1 to 10 arterial anastomoses performed (n = 2), 11 to 100 anastomoses (n = 9), and 101 to 500 anastomoses (n = 1). Mean age was 33 years (range, 29-36 years), with 11 participants right-handed and 1 ambidextrous. Of the 12 subjects tested, 11 (92%) preferred music in the OR. Composite instrument motion analysis scores significantly improved with playing preferred music during testing versus no music (paired t test, P < 0.001). 
Improvement with music was significant even after stratifying scores by order in which variables were tested (music first vs no music first), postgraduate year, and number of anastomoses (analysis of variance, P < 0.01). Preferred music in the OR may have a positive effect on trainees' microsurgical performance; as such, trainees should be encouraged to participate in setting the conditions of the OR to optimize their comfort and, possibly, performance. Moreover, motion analysis technology is a useful tool with a wide range of applications for surgical education and outcomes optimization.
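
    The paired t test used for the composite scores can be reproduced in a few lines; the trainee scores below are invented for illustration, not the study's data:

```python
import math

def paired_t(before, after):
    """Paired t statistic for within-subject scores (music vs. no music)."""
    d = [a - b for a, b in zip(after, before)]
    n = len(d)
    md = sum(d) / n
    sd = math.sqrt(sum((x - md) ** 2 for x in d) / (n - 1))
    return md / (sd / math.sqrt(n))

# Invented composite motion-analysis scores for six trainees
no_music = [61, 55, 70, 64, 58, 66]
music = [68, 60, 74, 71, 63, 70]
print(round(paired_t(no_music, music), 2))  # 9.56
```

    The statistic is then compared against the t distribution with n - 1 degrees of freedom to obtain the P value.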

  9. Probabilistic Finite Element Analysis & Design Optimization for Structural Designs

    NASA Astrophysics Data System (ADS)

    Deivanayagam, Arumugam

    This study focuses on implementing the probabilistic nature of material properties (Kevlar® 49) in the existing deterministic finite element analysis (FEA) of fabric-based engine containment systems through Monte Carlo simulations (MCS), and on implementing probabilistic analysis in engineering designs through Reliability Based Design Optimization (RBDO). First, the emphasis is on experimental data analysis focusing on probabilistic distribution models which characterize the randomness associated with the experimental data. The material properties of Kevlar® 49 are modeled using experimental data analysis and implemented along with an existing spiral modeling scheme (SMS) and user defined constitutive model (UMAT) for fabric-based engine containment simulations in LS-DYNA. MCS of the model are performed to observe the failure pattern and exit velocities of the models. The solutions are then compared with NASA experimental tests and deterministic results. MCS with probabilistic material data give a better perspective on the results than a single deterministic simulation. The next part of the research is to implement the probabilistic material properties in engineering designs. The main aim of structural design is to obtain optimal solutions. In a deterministic optimization problem, even though the structure is cost-effective, it becomes highly unreliable if the uncertainty that may be associated with the system (material properties, loading, etc.) is not represented or considered in the solution process. A reliable and optimal solution can be obtained by performing reliability optimization along with the deterministic optimization, which is RBDO. In the RBDO problem formulation, in addition to structural performance constraints, reliability constraints are also considered. 
This part of the research starts with an introduction to reliability analysis, such as first-order and second-order reliability analysis, followed by the simulation techniques performed to obtain the probability of failure and reliability of structures. Next, a decoupled RBDO procedure is proposed with a new reliability analysis formulation including sensitivity analysis, which is performed to remove the highly reliable constraints in the RBDO, thereby reducing the computational time and function evaluations. Finally, the reliability analysis concepts and RBDO are implemented in finite element 2D truss problems and a planar beam problem, and the results are presented and discussed.
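
    A minimal Monte Carlo reliability estimate of the kind described above can be sketched as follows; the Gaussian strength and load distributions are purely illustrative assumptions, not the study's material model:

```python
import random

def failure_probability(n=100_000, seed=42):
    """Crude Monte Carlo reliability estimate: a trial fails when a
    random load sample exceeds a random strength sample (both Gaussian)."""
    rng = random.Random(seed)
    fails = sum(
        rng.gauss(mu=300.0, sigma=30.0) < rng.gauss(mu=200.0, sigma=25.0)
        for _ in range(n)
    )
    return fails / n

pf = failure_probability()
print(pf)  # small but nonzero probability of failure
```

    In RBDO, an estimate like this (or its analytical FORM/SORM counterpart) becomes a constraint alongside the usual structural performance constraints.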

  10. Fluorescence spectroscopy for neoplasms control

    NASA Astrophysics Data System (ADS)

    Bratchenko, I. A.; Kristoforova, Yu. A.; Myakinin, O. O.; Artemyev, D. N.; Kozlov, S. V.; Moryatov, A. A.; Zakharov, V. P.

    2016-04-01

    An investigation of malignant skin tumor diagnosis was performed using two setups for native-tissue fluorescence measurement in the visible and near-infrared regions. A combined fluorescence analysis of malignant melanomas and basal cell carcinomas of the skin was performed. Autofluorescence spectra of normal skin and oncological pathologies stimulated by 457 nm and 785 nm lasers were registered for 74 skin tissue samples. Spectra of 10 melanomas and 27 basal cell carcinomas were registered ex vivo. Skin tumor analysis was based on the intensity and curvature of the autofluorescence spectra, targeting porphyrins, lipo-pigments, flavins and melanin. Melanomas and basal cell carcinomas were separated by discriminant analysis. The overall accuracy of separating basal cell carcinomas from malignant melanomas in the current study reached 86.5%, with 70% sensitivity and 92.6% specificity.
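
    The reported figures are mutually consistent, which can be checked from the 2x2 confusion matrix they imply: 70% sensitivity on 10 melanomas gives 7 true positives, 92.6% specificity on 27 basal cell carcinomas gives 25 true negatives, and the accuracy is then (7 + 25)/37 = 86.5%:

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity and overall accuracy from a 2x2 confusion matrix."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    acc = (tp + tn) / (tp + fn + tn + fp)
    return sens, spec, acc

# Counts implied by the record: 10 melanomas (positives), 27 BCCs (negatives)
sens, spec, acc = diagnostic_metrics(tp=7, fn=3, tn=25, fp=2)
print(round(sens, 3), round(spec, 3), round(acc, 3))  # 0.7 0.926 0.865
```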

  11. Comparison of fluorescent tags for analysis of mannose-6-phosphate glycans.

    PubMed

    Kang, Ji-Yeon; Kwon, Ohsuk; Gil, Jin Young; Oh, Doo-Byoung

    2016-05-15

    Mannose-6-phosphate (M-6-P) glycan analysis is important for quality control of therapeutic enzymes for lysosomal storage diseases. Here, we found that the analysis of glycans containing two M-6-Ps was highly affected by the hydrophilicity of the elution solvent used in high-performance liquid chromatography (HPLC). In addition, the performances of three fluorescent tags--2-aminobenzoic acid (2-AA), 2-aminobenzamide (2-AB), and 3-(acetyl-amino)-6-aminoacridine (AA-Ac)--were compared with each other for M-6-P glycan analysis using HPLC and matrix-assisted laser desorption/ionization time-of-flight mass spectrometry. The best performance for analyzing M-6-P glycans was shown by 2-AA labeling in both analyses. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Performance analysis of a coherent free space optical communication system based on experiment.

    PubMed

    Cao, Jingtai; Zhao, Xiaohui; Liu, Wei; Gu, Haijun

    2017-06-26

    Based on our previous study and a designed experimental AO system with a 97-element continuous-surface deformable mirror, we conduct a performance analysis of a coherent free space optical communication (FSOC) system in terms of mixing efficiency (ME), bit error rate (BER), and outage probability under different Greenwood frequencies and atmospheric coherence lengths. The results show that the influence of the atmospheric temporal characteristics on performance is slightly stronger than that of the spatial characteristics when the receiving aperture and the number of sub-apertures are given. This analysis provides a reference for the design of coherent FSOC systems.
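
    As a rough illustration of how mixing efficiency feeds into BER, the textbook homodyne-BPSK expression below scales the effective SNR by the ME. This is a simplified model chosen for illustration, not the paper's exact analysis:

```python
import math

def ber_coherent(snr, mixing_efficiency):
    """BER of an idealized coherent (homodyne BPSK) link, with the
    effective SNR scaled by the mixing efficiency achieved after AO
    correction; a simplified textbook model."""
    return 0.5 * math.erfc(math.sqrt(mixing_efficiency * snr))

for me in (0.2, 0.5, 0.8):  # better AO correction -> higher ME -> lower BER
    print(me, ber_coherent(10.0, me))
```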

  13. Statistical analysis of microgravity experiment performance using the degrees of success scale

    NASA Technical Reports Server (NTRS)

    Upshaw, Bernadette; Liou, Ying-Hsin Andrew; Morilak, Daniel P.

    1994-01-01

    This paper describes an approach to identify factors that significantly influence microgravity experiment performance. Investigators developed the 'degrees of success' scale to provide a numerical representation of success. A degree of success was assigned to 293 microgravity experiments. Experiment information including the degree of success rankings and factors for analysis was compiled into a database. Through an analysis of variance, nine significant factors in microgravity experiment performance were identified. The frequencies of these factors are presented along with the average degree of success at each level. A preliminary discussion of the relationship between the significant factors and the degree of success is presented.
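
    The analysis of variance used above to flag significant factors can be sketched with a one-way F statistic; the degree-of-success scores and factor levels below are invented for illustration:

```python
def anova_f(groups):
    """One-way ANOVA F statistic across groups of scores."""
    all_x = [x for g in groups for x in g]
    n, k = len(all_x), len(groups)
    grand = sum(all_x) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Invented degree-of-success scores at three levels of one factor
levels = [[4, 5, 5, 4], [3, 3, 2, 4], [5, 5, 4, 5]]
print(round(anova_f(levels), 2))  # 8.6
```

    A large F relative to the F distribution's critical value marks the factor as significant, which is how the nine factors in the record would have been identified.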

  14. Seventy-meter antenna performance predictions: GTD analysis compared with traditional ray-tracing methods

    NASA Technical Reports Server (NTRS)

    Schredder, J. M.

    1988-01-01

    A comparative analysis was performed, using both the Geometrical Theory of Diffraction (GTD) and traditional pathlength error analysis techniques, for predicting RF antenna gain performance and pointing corrections. The NASA/JPL 70 meter antenna with its shaped surface was analyzed for gravity loading over the range of elevation angles. Also analyzed were the effects of lateral and axial displacements of the subreflector. Significant differences were noted between the predictions of the two methods, in the effect of subreflector displacements, and in the optimal subreflector positions to focus a gravity-deformed main reflector. The results are of relevance to future design procedures.

  15. On-line evaluation of multiloop digital controller performance

    NASA Technical Reports Server (NTRS)

    Wieseman, Carol D.

    1993-01-01

    The purpose of this presentation is to inform the Guidance and Control community of capabilities which were developed by the Aeroservoelasticity Branch to evaluate the performance of multivariable control laws, on-line, during wind-tunnel testing. The capabilities are generic enough to be useful for all kinds of on-line analyses involving multivariable control in experimental testing. Consequently, it was decided to present this material at this workshop even though it has been presented elsewhere. Topics covered include: essential on-line analysis requirements; on-line analysis capabilities; on-line analysis software; frequency-domain procedures; controller performance evaluation; frequency-domain flutter suppression; and plant determination.

  16. Ultrasonic Bolt Gage

    NASA Technical Reports Server (NTRS)

    Gleman, Stuart M. (Inventor); Rowe, Geoffrey K. (Inventor)

    1999-01-01

    An ultrasonic bolt gage is described which uses a cross-correlation algorithm to determine the tension applied to a fastener, such as a bolt. The cross-correlation analysis is preferably performed using a processor operating on a series of captured ultrasonic echo waveforms. The ultrasonic bolt gage is further described as using the captured ultrasonic echo waveforms to perform additional modes of analysis, such as feature recognition. Multiple tension data outputs, therefore, can be obtained from a single data acquisition for increased measurement reliability. In addition, one embodiment of the gage has been described as multi-channel, having a multiplexer for performing a tension analysis on one of a plurality of bolts.
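
    The cross-correlation step can be sketched as a search for the lag that best aligns two echo captures, since bolt tension tracking reduces to measuring this time-of-flight shift. The waveforms below are synthetic, not instrument data:

```python
def best_lag(ref, echo, max_lag=8):
    """Lag (in samples) that maximises the cross-correlation between a
    reference echo and a later capture."""
    def xcorr(lag):
        return sum(ref[i] * echo[i + lag]
                   for i in range(len(ref))
                   if 0 <= i + lag < len(echo))
    return max(range(-max_lag, max_lag + 1), key=xcorr)

pulse = [0, 1, 3, 5, 3, 1, 0, 0, 0, 0, 0, 0]
shifted = [0, 0, 0, 0, 1, 3, 5, 3, 1, 0, 0, 0]  # same pulse, 3 samples later
print(best_lag(pulse, shifted))  # 3
```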

  17. Data Envelopment Analysis (DEA) Model in Operation Management

    NASA Astrophysics Data System (ADS)

    Malik, Meilisa; Efendi, Syahril; Zarlis, Muhammad

    2018-01-01

    Quality management is an effective system in operation management that develops, maintains, and improves the quality produced by a group of companies, allowing marketing, production, and service at the most economical level while ensuring customer satisfaction. Many companies practice quality management to improve their business performance. One approach to performance measurement is measuring efficiency, and one of the tools that can be used to assess the efficiency of company performance is Data Envelopment Analysis (DEA). The aim of this paper is to use Data Envelopment Analysis (DEA) models to assess the efficiency of quality management. The paper explains the CCR, BCC, and SBM models for assessing this efficiency.
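
    For the special case of a single input and a single output, CCR efficiency reduces to a ratio against the best performer and can be sketched without a linear-programming solver; the full CCR, BCC, and SBM models need an LP solver. The DMU data below are hypothetical:

```python
def ccr_efficiency(inputs, outputs):
    """CCR efficiency in the single-input/single-output special case,
    where the LP collapses to each DMU's output/input ratio divided by
    the best ratio in the set."""
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical DMUs: quality-management cost (input) vs. conforming units (output)
cost = [20.0, 30.0, 40.0, 25.0]
units = [100.0, 120.0, 200.0, 90.0]
print([round(e, 2) for e in ccr_efficiency(cost, units)])  # [1.0, 0.8, 1.0, 0.72]
```

    DMUs with efficiency 1.0 lie on the efficient frontier; the others are benchmarked against them.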

  18. Using enterprise architecture to analyse how organisational structure impact motivation and learning

    NASA Astrophysics Data System (ADS)

    Närman, Pia; Johnson, Pontus; Gingnell, Liv

    2016-06-01

    When technology, environment, or strategies change, organisations need to adjust their structures accordingly. These structural changes do not always enhance organisational performance as intended, partly because organisational developers do not understand the performance consequences of structural changes. This article presents a model-based analysis framework for quantitative analysis of the effect of organisational structure on organisational performance in terms of employee motivation and learning. The model is based on Mintzberg's work on organisational structure. The quantitative analysis is formalised using the Object Constraint Language (OCL) and the Unified Modelling Language (UML) and implemented in an enterprise architecture tool.

  19. Application of the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) for Dynamic Systems Analysis

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey; Zinnecker, Alicia

    2014-01-01

    Systems analysis involves steady-state simulations of combined components to evaluate the steady-state performance, weight, and cost of a system; dynamic considerations are not included until later in the design process. The Dynamic Systems Analysis task, under NASA's Fixed Wing project, is developing the capability for assessing dynamic issues at earlier stages during systems analysis. To provide this capability the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) has been developed to design a single flight condition controller (defined as altitude and Mach number) and, ultimately, provide an estimate of the closed-loop performance of the engine model. This tool has been integrated with the Commercial Modular Aero-Propulsion System Simulation 40,000 (CMAPSS40k) engine model to demonstrate the additional information TTECTrA makes available for dynamic systems analysis. This dynamic data can be used to evaluate the trade-off between performance and safety, which could not be done with steady-state systems analysis data. TTECTrA has been designed to integrate with any turbine engine model that is compatible with the MATLAB/Simulink (The MathWorks, Inc.) environment.

  20. Application of the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) for Dynamic Systems Analysis

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey Thomas; Zinnecker, Alicia Mae

    2014-01-01

    Systems analysis involves steady-state simulations of combined components to evaluate the steady-state performance, weight, and cost of a system; dynamic considerations are not included until later in the design process. The Dynamic Systems Analysis task, under NASA's Fixed Wing project, is developing the capability for assessing dynamic issues at earlier stages during systems analysis. To provide this capability the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) has been developed to design a single flight condition controller (defined as altitude and Mach number) and, ultimately, provide an estimate of the closed-loop performance of the engine model. This tool has been integrated with the Commercial Modular Aero-Propulsion System Simulation 40,000 (CMAPSS 40k) engine model to demonstrate the additional information TTECTrA makes available for dynamic systems analysis. This dynamic data can be used to evaluate the trade-off between performance and safety, which could not be done with steady-state systems analysis data. TTECTrA has been designed to integrate with any turbine engine model that is compatible with the MATLAB/Simulink (The MathWorks, Inc.) environment.
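
    The closed-loop transient behaviour such a tool evaluates can be illustrated with a toy discrete-time plant under PI control; the first-order plant and controller gains below are arbitrary illustrative values, not CMAPSS40k or TTECTrA parameters:

```python
def closed_loop_step(kp=0.8, ki=0.5, a=0.9, b=0.1, n=200):
    """Step response of a discrete PI controller around the first-order
    plant y[k+1] = a*y[k] + b*u[k]; the transient (overshoot, settling)
    is the kind of closed-loop behaviour a dynamic-analysis tool reports."""
    y, integ, out = 0.0, 0.0, []
    for _ in range(n):
        e = 1.0 - y              # error against a unit step reference
        integ += e               # integral state
        u = kp * e + ki * integ  # PI control law
        y = a * y + b * u        # plant update
        out.append(y)
    return out

resp = closed_loop_step()
print(round(resp[-1], 3))  # settles at the reference of 1.0
```

    Sweeping the gains and inspecting the resulting transients is a hand-rolled version of the trade-off between performance and safety margins described in the record.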
