Dispersion analysis for baseline reference mission 2
NASA Technical Reports Server (NTRS)
Snow, L. S.
1975-01-01
A dispersion analysis considering uncertainties (or perturbations) in platform, vehicle, and environmental parameters was performed for baseline reference mission (BRM) 2. The dispersion analysis is based on the nominal trajectory for BRM 2. The analysis was performed to determine state vector and performance dispersions (or variations) which result from the indicated uncertainties. The dispersions are determined at major mission events and fixed times from liftoff (time slices). The dispersion results will be used to evaluate the capability of the vehicle to perform the mission within a specified level of confidence and to determine flight performance reserves.
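As an aside for readers unfamiliar with this kind of study, the sketch below illustrates, in minimal form, how parameter uncertainties can be propagated to state dispersions at fixed time slices via Monte Carlo sampling. It is not the BRM 2 tooling: the trajectory model, parameter set, and 1-sigma values are all placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 2000
time_slices = np.array([0.0, 60.0, 120.0, 300.0])   # seconds from liftoff (illustrative)

# Hypothetical 1-sigma uncertainties: thrust scale factor, drag scale factor, wind bias (m/s)
sigma = np.array([0.01, 0.05, 5.0])

def propagate(thrust_scale, drag_scale, wind_bias, t):
    """Placeholder trajectory model: returns downrange position (m) at times t."""
    a = 30.0 * thrust_scale - 0.5 * drag_scale       # crude net acceleration, m/s^2
    return 0.5 * a * t**2 + wind_bias * t

# Sample perturbed parameter sets and propagate each one
perturbed = rng.normal(0.0, 1.0, size=(n_samples, 3)) * sigma
states = np.array([propagate(1.0 + p[0], 1.0 + p[1], p[2], time_slices) for p in perturbed])

nominal = propagate(1.0, 1.0, 0.0, time_slices)
print("3-sigma downrange dispersion [m]:", np.round(3.0 * states.std(axis=0), 1))
print("mean deviation from nominal [m]:", np.round((states - nominal).mean(axis=0), 1))
```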
EEG source analysis of data from paralysed subjects
NASA Astrophysics Data System (ADS)
Carabali, Carmen A.; Willoughby, John O.; Fitzgibbon, Sean P.; Grummett, Tyler; Lewis, Trent; DeLosAngeles, Dylan; Pope, Kenneth J.
2015-12-01
One of the limitations of electroencephalography (EEG) data is its quality, as it is usually contaminated with electrical signal from muscle. This research studies the results of two EEG source analysis methods applied to scalp recordings taken during paralysis and under normal conditions while a cognitive task was performed. The aim is to determine which types of analysis are appropriate for dealing with EEG data containing myogenic components. The data used are the scalp recordings of six subjects in normal conditions and during paralysis while performing different cognitive tasks, including the oddball task, which is the object of this research. The data were pre-processed by filtering and artefact correction; then, one-second epochs for targets and distractors were extracted. Distributed source analysis was performed in BESA Research 6.0; using its results and information from the literature, nine ideal locations for source dipoles were identified. The nine dipoles were used to perform discrete source analysis, fitting them to the averaged epochs to obtain source waveforms. The results were statistically analysed by comparing the outcomes before and after the subjects were paralysed. Finally, frequency analysis was performed to better explain the results. The findings were that distributed source analysis could produce confounded results for EEG contaminated with myogenic signals; conversely, statistical analysis of the results from discrete source analysis showed that this method could help in dealing with EEG data contaminated with muscle electrical signal.
NASA Technical Reports Server (NTRS)
Kirlik, Alex; Kossack, Merrick Frank
1993-01-01
This status report consists of a thesis entitled 'Ecological Task Analysis: A Method for Display Enhancements.' Previous use of various analysis processes for display interface design or enhancement has run the risk of failing to improve user performance, because the analysis often results in only a sequential listing of user tasks. Adopting an ecological approach to the task analysis, however, may provide the modeling of an unpredictable and variable task domain that is required to improve user performance. Kirlik has proposed an Ecological Task Analysis framework designed for this purpose. The purpose of this research is to measure the framework's effectiveness at enhancing display interfaces in order to improve user performance. Following the proposed framework, an ecological task analysis of experienced users of a complex and dynamic laboratory task, Star Cruiser, was performed. Based on this analysis, display enhancements were proposed and implemented. An experiment was then conducted to compare this new version of Star Cruiser to the original. By measuring user performance at different tasks, it was determined that during early sessions, use of the enhanced display contributed to better user performance than that achieved using the original display. Furthermore, the results indicate that the enhancements proposed as a result of the ecological task analysis affected user performance differently depending on whether they aid in the selection of a possible action or in the performance of an action. Generalizations of these findings to larger, more complex systems were avoided since the analysis was performed on only this one system.
Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0
Huck, Kevin A.; Malony, Allen D.; Shende, Sameer; ...
2008-01-01
The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments present a challenge for managing and processing the information. Simply characterizing the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering, and correlation analysis of individual trials of large dimensions, and which can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We give examples of large-scale analysis results and discuss the future development of the framework, including the encoding and processing of expert performance rules and the increasing use of performance metadata.
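The abstract mentions dimension reduction and clustering of performance trials; the sketch below shows that style of analysis using scikit-learn rather than PerfExplorer's own API (which is not reproduced here). The per-rank timing data are synthetic.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Hypothetical data: 512 MPI ranks x 40 timed code regions (seconds spent per region)
profiles = np.vstack([
    rng.normal(1.0, 0.05, size=(384, 40)),   # majority behavior
    rng.normal(1.4, 0.05, size=(128, 40)),   # ranks spending extra time (e.g., communication)
])

reduced = PCA(n_components=3).fit_transform(profiles)            # dimension reduction
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(reduced)

for k in range(2):
    members = profiles[labels == k]
    print(f"cluster {k}: {len(members)} ranks, mean region time {members.mean():.2f} s")
```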
Measurement uncertainty analysis techniques applied to PV performance measurements
NASA Astrophysics Data System (ADS)
Wells, C.
1992-10-01
The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis, but uncertainty analysis, a more recent development, gives greater insight into measurement processes and into test, experiment, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis increases the credibility and value of research results; allows comparisons of results from different laboratories; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
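A minimal worked example of the kind of uncertainty propagation described, assuming a simple PV power measurement P = V * I with independent error sources and made-up values; it is not drawn from the presentation itself.

```python
import math

# Hypothetical PV measurement: P = V * I with standard uncertainties u_V and u_I
V, u_V = 35.2, 0.05   # volts
I, u_I = 8.1, 0.02    # amperes
P = V * I

# First-order propagation for a product of independent quantities:
# (u_P / P)^2 = (u_V / V)^2 + (u_I / I)^2
u_P = P * math.sqrt((u_V / V) ** 2 + (u_I / I) ** 2)
k = 2.0               # coverage factor for an approximately 95% interval
print(f"P = {P:.1f} W +/- {k * u_P:.1f} W (k = {k})")
```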
COBRA ATD minefield detection model initial performance analysis
NASA Astrophysics Data System (ADS)
Holmes, V. Todd; Kenton, Arthur C.; Hilton, Russell J.; Witherspoon, Ned H.; Holloway, John H., Jr.
2000-08-01
A statistical performance analysis of the USMC Coastal Battlefield Reconnaissance and Analysis (COBRA) Minefield Detection (MFD) Model has been performed in support of the COBRA ATD Program under execution by the Naval Surface Warfare Center/Dahlgren Division/Coastal Systems Station. This analysis uses the Veridian ERIM International MFD model from the COBRA Sensor Performance Evaluation and Computational Tools for Research Analysis modeling toolbox and a collection of multispectral mine detection algorithm response distributions for mines and minelike clutter objects. These mine detection response distributions were generated from actual COBRA ATD test missions over littoral zone minefields. This analysis serves to validate both the utility and effectiveness of the COBRA MFD Model as a predictive MFD performance tool. COBRA ATD minefield detection model algorithm performance results based on a simulated baseline minefield detection scenario are presented, as well as results of an MFD model algorithm parametric sensitivity study.
Orbit Transfer Vehicle (OTV) engine, phase A study. Volume 2: Study
NASA Technical Reports Server (NTRS)
Mellish, J. A.
1979-01-01
The hydrogen-oxygen engine used in the orbit transfer vehicle is described. The engine design is analyzed, and minimum engine performance and man-rating requirements are discussed. Reliability and safety analysis test results are presented, and payload, risk and cost, and engine installation parameters are defined. Engine analyses were performed, including performance analysis, structural analysis, thermal analysis, turbomachinery analysis, controls analysis, and cycle analysis.
Elastic-plastic mixed-iterative finite element analysis: Implementation and performance assessment
NASA Technical Reports Server (NTRS)
Sutjahjo, Edhi; Chamis, Christos C.
1993-01-01
An elastic-plastic algorithm based on the von Mises yield criterion and associative flow rule is implemented in MHOST, a mixed-iterative finite element analysis computer program developed by NASA Lewis Research Center. The performance of the resulting elastic-plastic mixed-iterative analysis is examined through a set of convergence studies. Membrane and bending behaviors of 4-node quadrilateral shell finite elements are tested for elastic-plastic performance. Generally, the membrane results are excellent, indicating that the implementation of elastic-plastic mixed-iterative analysis is appropriate.
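For readers unfamiliar with the yield criterion named above, a small stand-alone sketch of a von Mises equivalent-stress check is given below; the stress values are illustrative and the code is unrelated to the MHOST implementation.

```python
import numpy as np

def von_mises(stress):
    """Equivalent (von Mises) stress from a 3x3 Cauchy stress tensor."""
    dev = stress - np.trace(stress) / 3.0 * np.eye(3)   # deviatoric part
    return np.sqrt(1.5 * np.sum(dev * dev))

stress = np.array([[120.0, 30.0, 0.0],
                   [ 30.0, 80.0, 0.0],
                   [  0.0,  0.0, 0.0]])   # MPa, illustrative plane-stress state
yield_stress = 250.0                      # MPa, illustrative

sigma_eq = von_mises(stress)
print(f"von Mises stress {sigma_eq:.1f} MPa ->",
      "plastic" if sigma_eq > yield_stress else "elastic")
```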
Structural-Thermal-Optical-Performance (STOP) Analysis
NASA Technical Reports Server (NTRS)
Bolognese, Jeffrey; Irish, Sandra
2015-01-01
The presentation will be given at the 26th Annual Thermal and Fluids Analysis Workshop (TFAWS 2015) hosted by the Goddard Space Flight Center (GSFC) Thermal Engineering Branch (Code 545). A STOP analysis is a multidisciplinary analysis, consisting of Structural, Thermal, and Optical Performance analyses, that is performed for all space flight instruments and satellites. This course will explain the different parts of performing this analysis. The student will learn how to effectively interact with each discipline in order to accurately obtain the system analysis results.
Optimization of analytical laboratory work using computer networking and databasing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Upp, D.L.; Metcalf, R.A.
1996-06-01
The Health Physics Analysis Laboratory (HPAL) performs around 600,000 analyses for radioactive nuclides each year at Los Alamos National Laboratory (LANL). Analysis matrices vary from nasal swipes, air filters, work area swipes, and liquids to the bottoms of shoes and cat litter. HPAL uses 8 liquid scintillation counters, 8 gas proportional counters, and 9 high-purity germanium detectors in 5 laboratories to perform these analyses. HPAL has developed a computer network between the labs and software to produce analysis results. The software and hardware package includes barcode sample tracking, log-in, chain of custody, analysis calculations, analysis result printing, and utility programs. All data are written to a database, mirrored on a central server, and eventually written to CD-ROM to provide online historical results. This system has greatly reduced the work required to provide analysis results as well as improving the quality of the work performed.
Faramarzi, Salar; Shamsi, Abdolhossein; Samadi, Maryam; Ahmadzade, Maryam
2015-01-01
Introduction: With due attention to the importance of learning disabilities and the necessity of presenting interventions to improve these disorders in order to prevent future problems, this study used a meta-analysis research model to examine the impact of psychological and educational interventions on the academic performance of students with learning disabilities. Methods: Using the meta-analysis method to integrate the results of various studies, this study specifies the effect of psychological and educational interventions. To this end, 57 studies whose methodology was acceptable were selected, and meta-analysis was performed on them. The research instrument was a meta-analysis checklist. Results: The effect sizes for the effectiveness of psychological-educational interventions on improving the academic performance of students with mathematics disorder (0.57), impaired writing (0.50), and dyslexia (0.55) were reported. Conclusions: The result of the meta-analysis showed that, according to Cohen's table, the effect size is above average, and it can be said that educational and psychological interventions improve the academic performance of students with learning disabilities. PMID:26430685
Data Transmission Signal Design and Analysis
NASA Technical Reports Server (NTRS)
Moore, J. D.
1972-01-01
The error performances of several digital signaling methods are determined as a function of a specified signal-to-noise ratio. Results are obtained for Gaussian noise and impulse noise. Performance of a receiver for differentially encoded biphase signaling is obtained by extending the results of differential phase shift keying. The analysis presented obtains a closed-form answer through the use of some simplifying assumptions. The results give insight into the analysis problem; however, the actual error performance may show a degradation because of the assumptions made in the analysis. Bipolar signaling decision-threshold selection is investigated. The optimum threshold depends on the signal-to-noise ratio and requires the use of an adaptive receiver.
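As a hedged illustration of the closed-form error-rate analysis described (not the report's own derivation), the sketch below compares theoretical BER for coherent BPSK and differentially coherent DPSK in Gaussian noise with a quick Monte Carlo check for BPSK.

```python
import numpy as np
from math import erfc, exp, sqrt

def ber_bpsk(ebn0):   # coherent BPSK: Q(sqrt(2*Eb/N0)) = 0.5*erfc(sqrt(Eb/N0))
    return 0.5 * erfc(sqrt(ebn0))

def ber_dpsk(ebn0):   # differentially coherent binary DPSK: 0.5*exp(-Eb/N0)
    return 0.5 * exp(-ebn0)

rng = np.random.default_rng(2)
ebn0 = 10 ** (6.0 / 10)                            # Eb/N0 of 6 dB
bits = rng.integers(0, 2, 200_000)
noise = rng.normal(0.0, sqrt(1.0 / (2.0 * ebn0)), bits.size)
rx = (2 * bits - 1) + noise                        # unit-energy antipodal signalling
simulated = np.mean((rx > 0).astype(int) != bits)

print(f"BPSK theory {ber_bpsk(ebn0):.2e}, BPSK simulated {simulated:.2e}, "
      f"DPSK theory {ber_dpsk(ebn0):.2e}")
```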
Performance Reports: Mirror alignment system performance prediction comparison between SAO and EKC
NASA Technical Reports Server (NTRS)
Tananbaum, H. D.; Zhang, J. P.
1994-01-01
The objective of this study is to perform an independent analysis of the residual high resolution mirror assembly (HRMA) mirror distortions caused by force and moment errors in the mirror alignment system (MAS) in order to statistically predict HRMA performance. These performance predictions are then compared with those made by Kodak to verify their analysis results.
NASA Astrophysics Data System (ADS)
Nakashima, Hiroshi; Takatsu, Yuzuru; Shinone, Hisanori; Matsukawa, Hisao; Kasetani, Takahiro
Soil-tire system interaction is a fundamental and important research topic in terramechanics. We applied a 2D finite element-discrete element method (FE-DEM), using FEM for the tire and the bottom soil layer and DEM for the surface soil layer, and satisfactory performance analysis was achieved. In this study, to clarify the capabilities and limitations of the method for soil-tire interaction analysis, the tractive performance of real automobile tires with two different tread patterns (smooth and grooved) was analyzed by FE-DEM, and the numerical results were compared with the experimental results obtained using an indoor traction measurement system. The analysis of tractive performance could be performed with sufficient accuracy by the proposed 2D dynamic FE-DEM. FE-DEM obtained a larger drawbar pull for the tire with a grooved tread pattern, which was verified by the experimental results. Moreover, the result for the grooved tire showed almost the same gross tractive effort and similar running resistance as in the experiments. However, for the tire with a smooth tread pattern, the analyzed gross tractive effort and running resistance behaved differently from the experimental results, largely due to the difference in tire sinkage in FE-DEM.
Lunar Exploration Architecture Level Key Drivers and Sensitivities
NASA Technical Reports Server (NTRS)
Goodliff, Kandyce; Cirillo, William; Earle, Kevin; Reeves, J. D.; Shyface, Hilary; Andraschko, Mark; Merrill, R. Gabe; Stromgren, Chel; Cirillo, Christopher
2009-01-01
Strategic-level analysis of the integrated behavior of lunar transportation and lunar surface systems architecture options is performed to assess the benefit, viability, affordability, and robustness of system design choices. This analysis employs both deterministic and probabilistic modeling techniques so that the extent of potential future uncertainties associated with each option is properly characterized. The results of these analyses are summarized in a predefined set of high-level Figures of Merit (FOMs) so as to provide senior NASA Constellation Program (CxP) and Exploration Systems Mission Directorate (ESMD) management with pertinent information to better inform strategic-level decision making. The strategic-level exploration architecture model is designed to perform analysis at as high a level as possible while still capturing those details that have major impacts on system performance. The strategic analysis methodology focuses on integrated performance, affordability, and risk analysis, and captures the linkages and feedbacks among these three areas. Each of these results leads into the determination of the high-level FOMs. This strategic-level analysis methodology has been previously applied to Space Shuttle and International Space Station assessments and is now being applied to the development of the Constellation Program point-of-departure lunar architecture. This paper provides an overview of the strategic analysis methodology and the lunar exploration architecture analyses to date. In studying these analysis results, the strategic analysis team has identified and characterized key drivers affecting the integrated architecture behavior. These key drivers include inclusion of a cargo lander, mission rate, mission location, fixed-versus-variable costs/return on investment, and the requirement for probabilistic analysis. Results of sensitivity analysis performed on lunar exploration architecture scenarios are also presented.
Finite wordlength implementation of a megachannel digital spectrum analyzer
NASA Technical Reports Server (NTRS)
Satorius, E. H.; Grimm, M. J.; Zimmerman, G. A.; Wilck, H. C.
1986-01-01
The results of an extensive system analysis of the megachannel spectrum analyzer currently being developed for use in various applications of the Deep Space Network are presented. The intent of this analysis is to quantify the effects of digital quantization errors on system performance. The results of this analysis provide useful guidelines for choosing various system design parameters to enhance system performance.
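A back-of-the-envelope sketch of one quantization effect such an analysis quantifies: the ideal signal-to-quantization-noise ratio of an N-bit quantizer (6.02N + 1.76 dB for a full-scale sinusoid) versus a value measured on a synthetic tone. The bit depth and test tone here are arbitrary, not parameters of the DSN analyzer.

```python
import numpy as np

def ideal_sqnr_db(bits):
    """Ideal SNR for an N-bit quantizer driven by a full-scale sinusoid."""
    return 6.02 * bits + 1.76

bits = 8
n = 65536
x = np.sin(2 * np.pi * 1001 / n * np.arange(n))                    # full-scale test tone
levels = 2 ** bits
xq = np.round((x + 1) / 2 * (levels - 1)) / (levels - 1) * 2 - 1   # uniform quantizer
noise = xq - x
measured = 10 * np.log10(np.mean(x**2) / np.mean(noise**2))
print(f"ideal {ideal_sqnr_db(bits):.1f} dB, measured {measured:.1f} dB")
```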
Grid orthogonality effects on predicted turbine midspan heat transfer and performance
NASA Technical Reports Server (NTRS)
Boyle, R. J.; Ameri, A. A.
1995-01-01
The effect of five different C type grid geometries on the predicted heat transfer and aerodynamic performance of a turbine stator is examined. Predictions were obtained using two flow analysis codes. One was a finite difference analysis, and the other was a finite volume analysis. Differences among the grids in terms of heat transfer and overall performance were small. The most significant difference among the five grids occurred in the prediction of pitchwise variation in total pressure. There was consistency between results obtained with each of the flow analysis codes when the same grid was used. A grid generating procedure in which the viscous grid is embedded within an inviscid type grid resulted in the best overall performance.
Comprehensive Analysis Modeling of Small-Scale UAS Rotors
NASA Technical Reports Server (NTRS)
Russell, Carl R.; Sekula, Martin K.
2017-01-01
Multicopter unmanned aircraft systems (UAS), or drones, have continued their explosive growth in recent years. With this growth comes demand for increased performance as the limits of existing technologies are reached. In order to better design multicopter UAS aircraft, better performance prediction tools are needed. This paper presents the results of a study aimed at using the rotorcraft comprehensive analysis code CAMRAD II to model a multicopter UAS rotor in hover. Parametric studies were performed to determine the level of fidelity needed in the analysis code inputs to achieve results that match test data. Overall, the results show that CAMRAD II is well suited to model small-scale UAS rotors in hover. This paper presents the results of the parametric studies as well as recommendations for the application of comprehensive analysis codes to multicopter UAS rotors.
Shin, Sang Soo; Shin, Young-Jeon
2016-01-01
With an increasing number of studies highlighting regional social capital (SC) as a determinant of health, many studies use multi-level analysis with merged and averaged scores of community residents' survey responses calculated from community SC data. Sufficient examination is required to validate whether the merged and averaged data can represent the community. Therefore, this study analyzes the validity of the selected indicators and their applicability in multi-level analysis. Within and between analysis (WABA) was performed after creating community variables using merged and averaged data of community residents' responses from the 2013 Community Health Survey in Korea, using subjective self-rated health as a dependent variable. Further analysis was performed following the model suggested by the WABA result. Both the E-test results (1) and the WABA results (2) revealed that single-level analysis needs to be performed using the qualitative SC variable with cluster mean centering. Through single-level multivariate regression analysis, qualitative SC with cluster mean centering showed a positive effect on self-rated health (0.054, p<0.001), although there was no substantial difference in comparison with analysis using SC variables without cluster mean centering or with multi-level analysis. As variation in qualitative SC was larger within the community than between communities, we validate that relational analysis of individual self-rated health can be performed within the group, using cluster mean centering. Other tests besides WABA can be performed in the future to confirm the validity of using community variables and their applicability in multi-level analysis.
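For clarity, cluster (community) mean centering amounts to expressing each respondent's score as a deviation from their community mean before regression. A minimal pandas sketch with made-up survey rows is shown below; it does not use the Community Health Survey data.

```python
import pandas as pd

# Hypothetical survey rows: community, a social-capital score, and self-rated health
df = pd.DataFrame({
    "community": ["A", "A", "A", "B", "B", "B"],
    "social_capital": [3.0, 4.0, 5.0, 2.0, 2.5, 3.5],
    "self_rated_health": [3, 4, 4, 2, 3, 3],
})

# Cluster mean centering: subtract each community's mean from its members' scores
df["sc_community_mean"] = df.groupby("community")["social_capital"].transform("mean")
df["sc_centered"] = df["social_capital"] - df["sc_community_mean"]
print(df[["community", "social_capital", "sc_centered"]])
```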
Performance analysis of mini-propellers based on FlightGear
NASA Astrophysics Data System (ADS)
Vogeltanz, Tomáš
2016-06-01
This paper presents a performance analysis of three mini-propellers based on the FlightGear flight simulator. Although a basic propeller analysis has to be performed before the use of FlightGear, for a complex and more practical performance analysis, it is advantageous to use a propeller model in cooperation with a particular aircraft model. This approach may determine whether the propeller has sufficient quality in respect of aircraft requirements. In the first section, the software used for the analysis is illustrated. Then, the parameters of the analyzed mini-propellers and the tested UAV are described. Finally, the main section shows and discusses the results of the performance analysis of the mini-propellers.
NASA Technical Reports Server (NTRS)
Gaston, S.; Wertheim, M.; Orourke, J. A.
1973-01-01
Summary, consolidation and analysis of specifications, manufacturing process and test controls, and performance results for OAO-2 and OAO-3 lot 20 Amp-Hr sealed nickel cadmium cells and batteries are reported. Correlation of improvements in control requirements with performance is a key feature. Updates for a cell/battery computer model to improve performance prediction capability are included. Applicability of regression analysis computer techniques to relate process controls to performance is checked.
Apollo 15 mission report, supplement 4: Descent propulsion system final flight evaluation
NASA Technical Reports Server (NTRS)
Avvenire, A. T.; Wood, S. C.
1972-01-01
The results of a postflight analysis of the LM-10 Descent Propulsion System (DPS) during the Apollo 15 Mission are reported. The analysis determined the steady state performance of the DPS during the descent phase of the manned lunar landing. Flight measurement discrepancies are discussed. Simulated throttle performance results are cited along with overall performance results. Evaluations of the propellant quantity gaging system, propellant loading, pressurization system, and engine are reported. Graphic illustrations of the evaluations are included.
CFD Predictions for Transonic Performance of the ERA Hybrid Wing-Body Configuration
NASA Technical Reports Server (NTRS)
Deere, Karen A.; Luckring, James M.; McMillin, S. Naomi; Flamm, Jeffrey D.; Roman, Dino
2016-01-01
A computational study was performed for a Hybrid Wing Body configuration that was focused at transonic cruise performance conditions. In the absence of experimental data, two fully independent computational fluid dynamics analyses were conducted to add confidence to the estimated transonic performance predictions. The primary analysis was performed by Boeing with the structured overset-mesh code OVERFLOW. The secondary analysis was performed by NASA Langley Research Center with the unstructured-mesh code USM3D. Both analyses were performed at full-scale flight conditions and included three configurations customary to drag buildup and interference analysis: a powered complete configuration, the configuration with the nacelle/pylon removed, and the powered nacelle in isolation. The results in this paper are focused primarily on transonic performance up to cruise and through drag rise. Comparisons between the CFD results were very good despite some minor geometric differences in the two analyses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, D.W.; Yambert, M.W.; Kocher, D.C.
1994-12-31
A performance assessment of the operating Solid Waste Storage Area 6 (SWSA 6) facility for the disposal of low-level radioactive waste at the Oak Ridge National Laboratory has been prepared to provide the technical basis for demonstrating compliance with the performance objectives of DOE Order 5820.2A, Chapter III. An analysis of the uncertainty incorporated into the assessment was performed which addressed the quantitative uncertainty in the data used by the models, the subjective uncertainty associated with the models used for assessing performance of the disposal facility and site, and the uncertainty in the models used for estimating dose and human exposure. The results of the uncertainty analysis were used to interpret results and to formulate conclusions about the performance assessment. This paper discusses the approach taken in analyzing the uncertainty in the performance assessment and the role of uncertainty in performance assessment.
Dissociation and Recombination Effects on the Performance of Pulse Detonation Engines
NASA Technical Reports Server (NTRS)
Povinelli, Louis A.
2003-01-01
This paper summarizes major theoretical results for pulse detonation engine performance taking into account real gas chemistry, as well as significant performance differences resulting from the presence of ram and compression heating. An unsteady CFD analysis, as well as a thermodynamic cycle analysis, was conducted in order to determine the actual and the ideal performance for an air-breathing pulse detonation engine (PDE) using either a hydrogen-air or ethylene-air mixture over a flight Mach number range from 0 to 4. The results clearly elucidate the competitive regime of PDE application relative to ramjets and gas turbines.
Performance analysis of the ascent propulsion system of the Apollo spacecraft
NASA Technical Reports Server (NTRS)
Hooper, J. C., III
1973-01-01
Activities involved in the performance analysis of the Apollo lunar module ascent propulsion system are discussed. A description of the ascent propulsion system, including hardware, instrumentation, and system characteristics, is included. The methods used to predict the inflight performance and to establish performance uncertainties of the ascent propulsion system are discussed. The techniques of processing the telemetered flight data and performing postflight performance reconstruction to determine actual inflight performance are discussed. Problems that have been encountered and results from the analysis of the ascent propulsion system performance during the Apollo 9, 10, and 11 missions are presented.
Comparison of variance estimators for meta-analysis of instrumental variable estimates
Schmidt, AF; Hingorani, AD; Jefferis, BJ; White, J; Groenwold, RHH; Dudbridge, F
2016-01-01
Background: Mendelian randomization studies perform instrumental variable (IV) analysis using genetic IVs. Results of individual Mendelian randomization studies can be pooled through meta-analysis. We explored how different variance estimators influence the meta-analysed IV estimate. Methods: Two versions of the delta method (IV before or after pooling), four bootstrap estimators, a jack-knife estimator and a heteroscedasticity-consistent (HC) variance estimator were compared using simulation. Two types of meta-analyses were compared: a two-stage meta-analysis pooling results, and a one-stage meta-analysis pooling datasets. Results: Using a two-stage meta-analysis, coverage of the point estimate using bootstrapped estimators deviated from nominal levels at weak instrument settings and/or outcome probabilities ≤ 0.10. The jack-knife estimator was the least biased resampling method, the HC estimator often failed at outcome probabilities ≤ 0.50, and overall the delta method estimators were the least biased. In the presence of between-study heterogeneity, the delta method before meta-analysis performed best. Using a one-stage meta-analysis, all methods performed equally well and better than a two-stage meta-analysis of greater or equal size. Conclusions: In the presence of between-study heterogeneity, two-stage meta-analyses should preferentially use the delta method before meta-analysis. Weak instrument bias can be reduced by performing a one-stage meta-analysis. PMID:27591262
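As a small illustration of the delta-method variance estimators compared above, the sketch below computes first- and second-order delta-method standard errors for a single Wald-ratio IV estimate; the coefficient and standard-error values are invented.

```python
import math

# Invented summary statistics for a single genetic instrument
b_y, se_y = 0.08, 0.02   # association of the instrument with the outcome
b_x, se_x = 0.40, 0.05   # association of the instrument with the exposure

beta_iv = b_y / b_x                                   # Wald ratio IV estimate
var_first = se_y**2 / b_x**2                          # first-order delta method
var_second = var_first + (b_y**2 * se_x**2) / b_x**4  # adds the exposure-coefficient term

print(f"IV estimate {beta_iv:.3f}, "
      f"SE (1st order) {math.sqrt(var_first):.3f}, "
      f"SE (2nd order) {math.sqrt(var_second):.3f}")
```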
Effect of semen preparation on casa motility results in cryopreserved bull spermatozoa.
Contri, Alberto; Valorz, Claudio; Faustini, Massimo; Wegher, Laura; Carluccio, Augusto
2010-08-01
Computer-assisted sperm analyzers (CASA) have become the standard tool for evaluating sperm motility and kinetic patterns because they provide objective data for thousands of sperm tracks. However, these devices are not ready-to-use, and standardization of analytical practices is a fundamental requirement. In this study, we evaluated the effects of several settings, such as frame rate and frames per field, chamber and time of analysis, and sample preparation, including thawing temperature, sperm sample concentration, and media used for dilution, on the kinetic results of bovine frozen-thawed semen using a CASA. In Experiment 1, the frame rate (30-60 frames/s) significantly affected motility parameters, whereas the number of frames per field (30 or 45) did not seem to affect sperm kinetics. In Experiment 2, the thawing protocol affected sperm motility and kinetic parameters. Sperm sample concentration significantly limited the opportunity to perform the analysis and affected the kinetic results. Concentrations of 100 and 50 x 10^6 sperm/mL limited the device's ability to perform the analysis or gave erroneous results, whereas concentrations of 5, 10, 20, and 30 x 10^6 sperm/mL allowed the analysis to be performed, but with different results (Experiment 3). The medium used for dilution of the sample, which is fundamental for correct sperm head detection, affected sperm motility results (Experiment 4). In this study, Makler and Leja chambers were used to perform the semen analysis with CASA devices. The chamber used significantly affected motility results (Experiment 5). The time between chamber loading and analysis affected sperm velocities, regardless of the chamber used. Based on the results recorded in this study, we propose that the CASA evaluation of motility of bovine frozen-thawed semen using the Hamilton-Thorne IVOS 12.3 should be performed using a frame rate of 60 frames/s and 30 frames per field. Semen should be diluted at least to 20 x 10^6 sperm/mL using PBS. Furthermore, it is necessary to consider the type of chamber used and to perform the analysis within 1 or 2 min, regardless of the chamber used. Copyright 2010 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Chapman, Randall G.
1993-01-01
A study investigated the utility of importance-performance analysis, a marketing tool for assessing marketing position and performance, in learning how college applicants perceive their chosen college in comparison with others. Findings reflect the complexity of student decisions and suggest the "average" college performs above average…
NASA Technical Reports Server (NTRS)
Kuhn, A. E.
1975-01-01
A dispersion analysis considering 3 sigma uncertainties (or perturbations) in platform, vehicle, and environmental parameters was performed for the baseline reference mission (BRM) 1 of the space shuttle orbiter. The dispersion analysis is based on the nominal trajectory for the BRM 1. State vector and performance dispersions (or variations) which result from the indicated 3 sigma uncertainties were studied. The dispersions were determined at major mission events and fixed times from lift-off (time slices) and the results will be used to evaluate the capability of the vehicle to perform the mission within a 3 sigma level of confidence and to determine flight performance reserves. A computer program is given that was used for dynamic flight simulations of the space shuttle orbiter.
CASAS: Cancer Survival Analysis Suite, a web based application
Rupji, Manali; Zhang, Xinyan; Kowalski, Jeanne
2017-01-01
We present CASAS, a shiny R based tool for interactive survival analysis and visualization of results. The tool provides a web-based one stop shop to perform the following types of survival analysis: quantile, landmark and competing risks, in addition to standard survival analysis. The interface makes it easy to perform such survival analyses and obtain results using the interactive Kaplan-Meier and cumulative incidence plots. Univariate analysis can be performed on one or several user specified variable(s) simultaneously, the results of which are displayed in a single table that includes log rank p-values and hazard ratios along with their significance. For several quantile survival analyses from multiple cancer types, a single summary grid is constructed. The CASAS package has been implemented in R and is available via http://shinygispa.winship.emory.edu/CASAS/. The developmental repository is available at https://github.com/manalirupji/CASAS/. PMID:28928946
CASAS: Cancer Survival Analysis Suite, a web based application.
Rupji, Manali; Zhang, Xinyan; Kowalski, Jeanne
2017-01-01
We present CASAS, a shiny R based tool for interactive survival analysis and visualization of results. The tool provides a web-based one stop shop to perform the following types of survival analysis: quantile, landmark and competing risks, in addition to standard survival analysis. The interface makes it easy to perform such survival analyses and obtain results using the interactive Kaplan-Meier and cumulative incidence plots. Univariate analysis can be performed on one or several user specified variable(s) simultaneously, the results of which are displayed in a single table that includes log rank p-values and hazard ratios along with their significance. For several quantile survival analyses from multiple cancer types, a single summary grid is constructed. The CASAS package has been implemented in R and is available via http://shinygispa.winship.emory.edu/CASAS/. The developmental repository is available at https://github.com/manalirupji/CASAS/.
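CASAS itself is an R/shiny application; the Python sketch below (using the third-party lifelines package) only illustrates the underlying analyses it wraps, a Kaplan-Meier fit and log-rank test on made-up survival data, and is not CASAS code.

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Made-up survival data: time to event, event indicator, and a two-level group
df = pd.DataFrame({
    "time":  [5, 8, 12, 14, 20, 21, 25, 30, 33, 40],
    "event": [1, 1, 0, 1, 1, 0, 1, 1, 0, 1],
    "group": ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"],
})

for name, g in df.groupby("group"):
    km = KaplanMeierFitter().fit(g["time"], g["event"], label=name)
    print(name, "median survival:", km.median_survival_time_)

a, b = df[df.group == "A"], df[df.group == "B"]
result = logrank_test(a["time"], b["time"], a["event"], b["event"])
print("log-rank p-value:", round(result.p_value, 3))
```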
Temporal geospatial analysis of secondary school students’ examination performance
NASA Astrophysics Data System (ADS)
Nik Abd Kadir, ND; Adnan, NA
2016-06-01
Malaysia's Ministry of Education has improved the organization of its data by developing a geographical information system (GIS) school database. However, no further analysis has been done using geospatial analysis tools. Mapping has emerged as a communication tool and has become an effective way to publish digital and statistical data such as school performance results. The objective of this study is to analyse secondary school students' performance in the science and mathematics scores of the Sijil Pelajaran Malaysia examination results from 2010 to 2014 for Kelantan's state schools with the aid of GIS software and geospatial analysis. School performance according to school grade point average (GPA), from Grade A to Grade G, was interpolated and mapped, and query analysis using geospatial tools could be performed. This study will be beneficial to the education sector for analysing student performance not only in Kelantan but across the whole of Malaysia, and it offers a good method for publishing results in map form towards better planning and decision making to prepare young Malaysians for the challenges of the education system and performance.
Multidisciplinary design optimization using multiobjective formulation techniques
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Pagaldipti, Narayanan S.
1995-01-01
This report addresses the development of a multidisciplinary optimization procedure using an efficient semi-analytical sensitivity analysis technique and multilevel decomposition for the design of aerospace vehicles. A semi-analytical sensitivity analysis procedure is developed for calculating computational grid sensitivities and aerodynamic design sensitivities. Accuracy and efficiency of the sensitivity analysis procedure are established through comparison of the results with those obtained using a finite difference technique. The developed sensitivity analysis technique is then used within a multidisciplinary optimization procedure for designing aerospace vehicles. The optimization problem, with the integration of aerodynamics and structures, is decomposed into two levels. Optimization is performed for improved aerodynamic performance at the first level and improved structural performance at the second level. Aerodynamic analysis is performed by solving the three-dimensional parabolized Navier-Stokes equations. A nonlinear programming technique and an approximate analysis procedure are used for optimization. The procedure developed is applied to design the wing of a high-speed aircraft. Results obtained show significant improvements in the aircraft aerodynamic and structural performance when compared to a reference or baseline configuration. The use of the semi-analytical sensitivity technique provides significant computational savings.
Faramarzi, Salar; Shamsi, Abdolhossein; Samadi, Maryam; Ahmadzade, Maryam
2015-01-01
With due attention to the importance of learning disabilities and the necessity of presenting interventions to improve these disorders in order to prevent future problems, this study used a meta-analysis research model to examine the impact of psychological and educational interventions on the academic performance of students with learning disabilities. Using the meta-analysis method to integrate the results of various studies, this study specifies the effect of psychological and educational interventions. To this end, 57 studies whose methodology was acceptable were selected, and meta-analysis was performed on them. The research instrument was a meta-analysis checklist. The effect sizes for the effectiveness of psychological-educational interventions on improving the academic performance of students with mathematics disorder (0.57), impaired writing (0.50) and dyslexia (0.55) were reported. The result of the meta-analysis showed that, according to Cohen's table, the effect size is above average, and it can be said that educational and psychological interventions improve the academic performance of students with learning disabilities.
Mueller, Evelyn A; Bengel, Juergen; Wirtz, Markus A
2013-12-01
This study aimed to develop a self-description assessment instrument to measure work performance in patients with musculoskeletal diseases. In terms of the International Classification of Functioning, Disability and Health (ICF), work performance is defined as the degree of meeting the work demands (activities) at the actual workplace (environment). To account for the fact that work performance depends on the work demands of the job, we strived to develop item banks that allow a flexible use of item subgroups depending on the specific work demands of the patients' jobs. Item development included the collection of work tasks from literature and content validation through expert surveys and patient interviews. The resulting 122 items were answered by 621 patients with musculoskeletal diseases. Exploratory factor analysis to ascertain dimensionality and Rasch analysis (partial credit model) for each of the resulting dimensions were performed. Exploratory factor analysis resulted in four dimensions, and subsequent Rasch analysis led to the following item banks: 'impaired productivity' (15 items), 'impaired cognitive performance' (18), 'impaired coping with stress' (13) and 'impaired physical performance' (low physical workload 20 items, high physical workload 10 items). The item banks exhibited person separation indices (reliability) between 0.89 and 0.96. The assessment of work performance adds the activities component to the more commonly employed participation component of the ICF-model. The four item banks can be adapted to specific jobs where necessary without losing comparability of person measures, as the item banks are based on Rasch analysis.
Lobb, Eric C
2016-07-08
Version 6.3 of the RITG148+ software package offers eight automated analysis routines for quality assurance of the TomoTherapy platform. A performance evaluation of each routine was performed in order to compare RITG148+ results with traditionally accepted analysis techniques and verify that simulated changes in machine parameters are correctly identified by the software. Reference films were exposed according to AAPM TG-148 methodology for each routine, and the RITG148+ results were compared with either alternative software analysis techniques or manual analysis techniques in order to assess baseline agreement. Changes in machine performance were simulated through translational and rotational adjustments to subsequently irradiated films, and these films were analyzed to verify that the applied changes were accurately detected by each of the RITG148+ routines. For the Hounsfield unit routine, an assessment of the "Frame Averaging" functionality and the effects of phantom roll on the routine results are presented. All RITG148+ routines reported acceptable baseline results consistent with alternative analysis techniques, with 9 of the 11 baseline test results showing agreement of 0.1 mm/0.1° or better. Simulated changes were correctly identified by the RITG148+ routines within approximately 0.2 mm/0.2°, with the exception of the Field Center vs. Jaw Setting routine, which was found to have limited accuracy in cases where field centers were not aligned for all jaw settings due to inaccurate autorotation of the film during analysis. The performance of the RITG148+ software package was found to be acceptable for introduction into our clinical environment as an automated alternative to traditional analysis techniques for routine TomoTherapy quality assurance testing.
Performance of Blind Source Separation Algorithms for FMRI Analysis using a Group ICA Method
Correa, Nicolle; Adali, Tülay; Calhoun, Vince D.
2007-01-01
Independent component analysis (ICA) is a popular blind source separation (BSS) technique that has proven to be promising for the analysis of functional magnetic resonance imaging (fMRI) data. A number of ICA approaches have been used for fMRI data analysis, and even more ICA algorithms exist; however, the impact of using different algorithms on the results is largely unexplored. In this paper, we study the performance of four major classes of algorithms for spatial ICA, namely information maximization, maximization of non-Gaussianity, joint diagonalization of cross-cumulant matrices, and second-order correlation based methods, when they are applied to fMRI data from subjects performing a visuo-motor task. We use a group ICA method to study the variability among different ICA algorithms and propose several analysis techniques to evaluate their performance. We compare how different ICA algorithms estimate activations in expected neuronal areas. The results demonstrate that the ICA algorithms using higher-order statistical information prove to be quite consistent for fMRI data analysis. Infomax, FastICA, and JADE all yield reliable results, each having its strengths in specific areas. EVD, an algorithm using second-order statistics, does not perform reliably for fMRI data. Additionally, for the iterative ICA algorithms, it is important to investigate the variability of the estimates from different runs. We test the consistency of the iterative algorithms, Infomax and FastICA, by running each algorithm a number of times with different initializations, and we note that they yield consistent results over these multiple runs. Our results greatly improve our confidence in the consistency of ICA for fMRI data analysis. PMID:17540281
Performance of blind source separation algorithms for fMRI analysis using a group ICA method.
Correa, Nicolle; Adali, Tülay; Calhoun, Vince D
2007-06-01
Independent component analysis (ICA) is a popular blind source separation technique that has proven to be promising for the analysis of functional magnetic resonance imaging (fMRI) data. A number of ICA approaches have been used for fMRI data analysis, and even more ICA algorithms exist; however, the impact of using different algorithms on the results is largely unexplored. In this paper, we study the performance of four major classes of algorithms for spatial ICA, namely, information maximization, maximization of non-Gaussianity, joint diagonalization of cross-cumulant matrices and second-order correlation-based methods, when they are applied to fMRI data from subjects performing a visuo-motor task. We use a group ICA method to study variability among different ICA algorithms, and we propose several analysis techniques to evaluate their performance. We compare how different ICA algorithms estimate activations in expected neuronal areas. The results demonstrate that the ICA algorithms using higher-order statistical information prove to be quite consistent for fMRI data analysis. Infomax, FastICA and joint approximate diagonalization of eigenmatrices (JADE) all yield reliable results, with each having its strengths in specific areas. Eigenvalue decomposition (EVD), an algorithm using second-order statistics, does not perform reliably for fMRI data. Additionally, for iterative ICA algorithms, it is important to investigate the variability of estimates from different runs. We test the consistency of the iterative algorithms Infomax and FastICA by running the algorithm a number of times with different initializations, and we note that they yield consistent results over these multiple runs. Our results greatly improve our confidence in the consistency of ICA for fMRI data analysis.
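A hedged illustration of the kind of ICA estimation compared in these two records, using scikit-learn's FastICA on synthetic mixed sources rather than real fMRI data or the authors' group ICA pipeline.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)
t = np.linspace(0, 8, 2000)
sources = np.c_[np.sin(2 * t),                    # smooth oscillation
                np.sign(np.sin(3 * t)),           # square wave
                rng.laplace(size=t.size)]         # super-Gaussian noise source
mixing = rng.normal(size=(3, 3))
observed = sources @ mixing.T                     # simulated mixtures ("voxels")

ica = FastICA(n_components=3, random_state=0)
estimated = ica.fit_transform(observed)           # recovered components

# Components come back up to sign and permutation; check |correlation| with true sources
corr = np.corrcoef(estimated.T, sources.T)[:3, 3:]
print(np.round(np.abs(corr), 2))
```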
Performance Analysis of HF Band FB-MC-SS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hussein Moradi; Stephen Andrew Laraway; Behrouz Farhang-Boroujeny
In a recent paper [1] the filter bank multicarrier spread spectrum (FB-MC-SS) waveform was proposed for wideband spread spectrum HF communications. A significant benefit of this waveform is robustness against narrow and partial band interference. Simulation results in [1] demonstrated good performance in a wideband HF channel over a wide range of conditions. In this paper we present a theoretical analysis of the bit error probability for this system. Our analysis tailors the results from [2], where BER performance was analyzed for maximum ratio combining systems that accounted for correlation between subcarriers and channel estimation error. Equations are given for BER that closely match the simulated performance in most situations.
SU-F-T-295: MLCs Performance and Patient-Specific IMRT QA Using Log File Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Osman, A; American University of Biuret Medical Center, Biuret; Maalej, N
2016-06-15
Purpose: To analyze the performance of the multi-leaf collimators (MLCs) from the log files recorded during intensity modulated radiotherapy (IMRT) treatment, and to construct the relative fluence maps and perform the gamma analysis to compare the planned and executed MLC movement. Methods: We developed a program to extract and analyze the data from dynamic log files (dynalog files) generated from sliding window IMRT delivery treatments. The program extracts the planned and executed (actual or delivered) MLC movement and calculates and compares the relative planned and executed fluences. The fluence maps were used to perform the gamma analysis (with 3% dose difference and 3 mm distance to agreement) for 3 IMRT patients. We compared our gamma analysis results with those obtained from the portal dose image prediction (PDIP) algorithm performed using the EPID. Results: For 3 different IMRT patient treatments, the maximum difference between the planned and the executed MLC positions was 1.2 mm. The gamma analysis results of the planned and delivered fluences were in good agreement with the gamma analysis from portal dosimetry. The maximum difference in the number of pixels passing the gamma criteria (3%/3mm) was 0.19% with respect to the portal dosimetry results. Conclusion: MLC log files can be used to verify the performance of the MLCs. Patient-specific IMRT QA based on MLC movement log files gives similar results to EPID dosimetry results. This promising method for patient-specific IMRT QA is fast, does not require dose measurements in a phantom, can be done before the treatment and for every fraction, and significantly reduces the IMRT workload. The author would like to thank King Fahd University of Petroleum and Minerals for the support.
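The gamma criterion mentioned (3% dose difference, 3 mm distance to agreement) can be illustrated with a simplified one-dimensional sketch; the profiles below are synthetic Gaussians, not dynalog-derived fluences, and global normalization is assumed.

```python
import numpy as np

def gamma_1d(ref_dose, ref_x, eval_dose, eval_x, dose_tol=0.03, dta_mm=3.0):
    """Gamma value at each reference point, global normalization to the reference maximum."""
    norm = ref_dose.max()
    gammas = []
    for d_r, x_r in zip(ref_dose, ref_x):
        dd = (eval_dose - d_r) / (dose_tol * norm)
        dx = (eval_x - x_r) / dta_mm
        gammas.append(np.sqrt(dd**2 + dx**2).min())
    return np.array(gammas)

x = np.linspace(-50, 50, 201)                       # mm
planned = np.exp(-x**2 / (2 * 15**2))               # synthetic planned profile
delivered = np.exp(-(x - 1.0)**2 / (2 * 15**2))     # synthetic delivered profile, 1 mm shift

g = gamma_1d(planned, x, delivered, x)
print(f"gamma pass rate (3%/3 mm): {100 * np.mean(g <= 1):.1f}%")
```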
Sharp, T G
1984-02-01
The study was designed to determine whether any one of seven selected variables, or a combination of the variables, is predictive of performance on the State Board Test Pool Examination. The selected variables studied were: high school grade point average (HSGPA); The University of Tennessee, Knoxville, College of Nursing grade point average (GPA); and American College Test Assessment (ACT) standard scores (English, ENG; mathematics, MA; social studies, SS; natural sciences, NSC; composite, COMP). Data utilized were from graduates of the baccalaureate program of The University of Tennessee, Knoxville, College of Nursing from 1974 through 1979. The sample of 322 was selected from a total population of 572. A Statistical Analysis System (SAS) procedure was designed to analyze the predictive relationship of each of the seven selected variables to State Board Test Pool Examination performance (result of pass or fail), a stepwise discriminant analysis was designed to determine the predictive relationship of the strongest combination of the independent variables to overall State Board Test Pool Examination performance (result of pass or fail), and a stepwise multiple regression analysis was designed to determine the strongest predictive combination of selected variables for each of the five subexams of the State Board Test Pool Examination. The selected variables were each found to be predictive of SBTPE performance (result of pass or fail). The strongest combination for predicting SBTPE performance (result of pass or fail) was found to be GPA, MA, and NSC.
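As an illustration of the discriminant-analysis step (not the original SAS stepwise procedure), the sketch below fits scikit-learn's LinearDiscriminantAnalysis to synthetic GPA/ACT-style predictors and a pass/fail label.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(4)
n = 300
gpa = rng.normal(3.0, 0.4, n)          # synthetic nursing GPA
act_math = rng.normal(22, 4, n)        # synthetic ACT mathematics score
act_nsc = rng.normal(23, 4, n)         # synthetic ACT natural sciences score
X = np.c_[gpa, act_math, act_nsc]

# Synthetic pass/fail label loosely driven by the predictors
latent = 1.5 * gpa + 0.05 * act_math + 0.05 * act_nsc + rng.normal(0, 0.5, n)
y = (latent > np.median(latent)).astype(int)

lda = LinearDiscriminantAnalysis().fit(X, y)
print("in-sample accuracy:", round(lda.score(X, y), 2))
print("discriminant coefficients:", np.round(lda.coef_, 2))
```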
Herbicide Orange Site Characterization Study Naval Construction Battalion Center
1987-01-01
...U.S. Testing Laboratories for analysis. Over 200 additional analyses were performed for a variety of quality assurance criteria. The resultant data... [Table 9, "NCBC Performance Audit Sample Analysis Summary (Series 1)," tabulates reported TCDD concentration (ppb), detection limit, and relative difference by sample number.] ...limit rather than estimating the variance of the results. The sample results were transformed using the natural logarithm. The Shapiro-Wilk W test...
Caffeine ingestion enhances Wingate performance: a meta-analysis.
Grgic, Jozo
2018-03-01
The positive effects of caffeine ingestion on aerobic performance are well-established; however, recent findings suggest that caffeine ingestion might also enhance components of anaerobic performance. A commonly used test of anaerobic performance and power output is the 30-second Wingate test. Several studies explored the effects of caffeine ingestion on Wingate performance, with equivocal findings. To elucidate this topic, this paper aims to determine the effects of caffeine ingestion on Wingate performance using meta-analytic statistical techniques. Following a search through PubMed/MEDLINE, Scopus, and SportDiscus®, 16 studies were found meeting the inclusion criteria (pooled number of participants = 246). Random-effects meta-analysis of standardized mean differences (SMD) for peak power output and mean power output was performed. Study quality was assessed using the modified version of the PEDro checklist. Results of the meta-analysis indicated a significant difference (p = .005) between the placebo and caffeine trials on mean power output, with SMD values of small magnitude (0.18; 95% confidence interval: 0.05, 0.31; +3%). The meta-analysis performed for peak power output indicated a significant difference (p = .006) between the placebo and caffeine trials (SMD = 0.27; 95% confidence interval: 0.08, 0.47 [moderate magnitude]; +4%). The results from the PEDro checklist indicated that, in general, the studies are of good and excellent methodological quality. This meta-analysis adds to the current body of evidence showing that caffeine ingestion can also enhance components of anaerobic performance. The results presented herein may be helpful for developing more efficient evidence-based recommendations regarding caffeine supplementation.
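A minimal sketch of DerSimonian-Laird random-effects pooling of standardized mean differences, the general type of computation behind the SMDs reported above; the study effect sizes and variances are made up and the details may differ from the paper's software.

```python
import numpy as np

smd = np.array([0.10, 0.25, 0.30, 0.15, 0.40])   # per-study standardized mean differences
var = np.array([0.02, 0.03, 0.04, 0.02, 0.05])   # per-study sampling variances

w = 1.0 / var                                     # fixed-effect weights
fe_mean = np.sum(w * smd) / w.sum()
q = np.sum(w * (smd - fe_mean) ** 2)              # Cochran's Q
c = w.sum() - np.sum(w**2) / w.sum()
tau2 = max(0.0, (q - (len(smd) - 1)) / c)         # DerSimonian-Laird between-study variance

w_re = 1.0 / (var + tau2)                         # random-effects weights
pooled = np.sum(w_re * smd) / w_re.sum()
se = np.sqrt(1.0 / w_re.sum())
print(f"pooled SMD {pooled:.2f} (95% CI {pooled - 1.96 * se:.2f} to {pooled + 1.96 * se:.2f})")
```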
Posttest analysis of the FFTF inherent safety tests
DOE Office of Scientific and Technical Information (OSTI.GOV)
Padilla, A. Jr.; Claybrook, S.W.
Inherent safety tests were performed during 1986 in the 400-MW (thermal) Fast Flux Test Facility (FFTF) reactor to demonstrate the effectiveness of an inherent shutdown device called the gas expansion module (GEM). The GEM device provided a strong negative reactivity feedback during loss-of-flow conditions by increasing the neutron leakage as a result of an expanding gas bubble. The best-estimate pretest calculations for these tests were performed using the IANUS plant analysis code (Westinghouse Electric Corporation proprietary code) and the MELT/SIEX3 core analysis code. These two codes were also used to perform the required operational safety analyses for the FFTF reactor and plant. Although it was intended to also use the SASSYS systems (core and plant) analysis code, the calibration of the SASSYS code for FFTF core and plant analysis was not completed in time to perform pretest analyses. The purpose of this paper is to present the results of the posttest analysis of the 1986 FFTF inherent safety tests using the SASSYS code.
NASA Technical Reports Server (NTRS)
Andrews, E. H., Jr.; Mackley, E. A.
1976-01-01
An aerodynamic engine inlet analysis was performed on the experimental results obtained at nominal Mach numbers of 5, 6, and 7 from the NASA Hypersonic Research Engine (HRE) Aerothermodynamic Integration Model (AIM). Incorporation on the AIM of the mixed-compression inlet design represented the final phase of an inlet development program of the HRE Project. The purpose of this analysis was to compare the AIM inlet experimental results with theoretical results. Experimental performance was based on measured surface pressures used in a one-dimensional force-momentum theorem. Results of the analysis indicate that surface static-pressure measurements agree reasonably well with theoretical predictions except in the regions where the theory predicts large pressure discontinuities. Experimental and theoretical results both based on the one-dimensional force-momentum theorem yielded inlet performance parameters as functions of Mach number that exhibited reasonable agreement. Previous predictions of inlet unstart that resulted from pressure disturbances created by fuel injection and combustion appeared to be pessimistic.
Sadegh Amalnick, Mohsen; Zarrin, Mansour
2017-03-13
Purpose The purpose of this paper is to present an integrated framework for performance evaluation and analysis of human resources (HR) with respect to the factors of the health, safety, environment and ergonomics (HSEE) management system, and also the criteria of the European Foundation for Quality Management (EFQM), one of the well-known business excellence models. Design/methodology/approach In this study, an intelligent algorithm based on an adaptive neuro-fuzzy inference system (ANFIS) along with fuzzy data envelopment analysis (FDEA) is developed and employed to assess the performance of the company. Furthermore, the impact of the factors on the company's performance, as well as their strengths and weaknesses, is identified by conducting a sensitivity analysis on the results. Similarly, a design of experiment is performed to prioritize the factors in order of importance. Findings The results show that the EFQM model has a far greater impact upon the company's performance than the HSEE management system. According to the obtained results, it can be argued that the integration of HSEE and EFQM leads to performance improvement in the company. Practical implications In the current study, the required data for executing the proposed framework are collected via valid questionnaires filled in by the staff of an aviation industry located in Tehran, Iran. Originality/value Managing HR performance results in improving usability, maintainability and reliability, and finally in a significant reduction in the commercial aviation accident rate. Also, studying the factors affecting HR performance helps authorities participate in developing systems that help operators better manage human error. This paper for the first time presents an intelligent framework based on ANFIS, FDEA and statistical tests for HR performance assessment and analysis, with the ability to handle the uncertainty and vagueness existing in real-world environments.
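The paper's FDEA formulation is not reproduced here, but the core of a crisp, input-oriented CCR data envelopment analysis can be sketched as a linear program; the units, inputs, and outputs below are hypothetical, and the fuzzy extension used by the authors is omitted.

```python
# A minimal input-oriented CCR data envelopment analysis (DEA) sketch using
# scipy.optimize.linprog. The unit/input/output data are hypothetical.
import numpy as np
from scipy.optimize import linprog

X = np.array([[4.0, 2.0], [6.0, 3.0], [8.0, 5.0]])   # inputs, shape (n_units, n_inputs)
Y = np.array([[60.0], [90.0], [80.0]])                # outputs, shape (n_units, n_outputs)
n, m = X.shape
s = Y.shape[1]

def ccr_efficiency(o):
    # Decision variables: [theta, lambda_1, ..., lambda_n]; minimize theta.
    c = np.r_[1.0, np.zeros(n)]
    # Inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.c_[-X[o], X.T]
    b_in = np.zeros(m)
    # Outputs: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.c_[np.zeros((s, 1)), -Y.T]
    b_out = -Y[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

for o in range(n):
    print(f"unit {o}: efficiency = {ccr_efficiency(o):.3f}")
```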
Improving Student Naval Aviator Aircraft Carrier Landing Performance
ERIC Educational Resources Information Center
Sheppard, Thomas H.; Foster, T. Chris
2008-01-01
This article discusses the use of human performance technology (HPT) to improve qualification rates for learning to land onboard aircraft carriers. This project started as a request for a business case analysis and evolved into a full-fledged performance improvement project, from mission analysis through evaluation. The result was a significant…
Discrete retardance second harmonic generation ellipsometry.
Dehen, Christopher J; Everly, R Michael; Plocinik, Ryan M; Hedderich, Hartmut G; Simpson, Garth J
2007-01-01
A new instrument was constructed to perform discrete retardance nonlinear optical ellipsometry (DR-NOE). The focus of the design was to perform second harmonic generation NOE while maximizing sample and application flexibility and minimizing data acquisition time. The discrete retardance configuration results in relatively simple computational algorithms for performing nonlinear optical ellipsometric analysis. NOE analysis of a disperse red 19 monolayer yielded results that were consistent with previously reported values for the same surface system, but with significantly reduced acquisition times.
Initial empirical analysis of nuclear power plant organization and its effect on safety performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olson, J.; McLaughlin, S.D.; Osborn, R.N.
This report contains an analysis of the relationship between selected aspects of organizational structure and the safety-related performance of nuclear power plants. The report starts by identifying and operationalizing certain key dimensions of organizational structure that may be expected to be related to plant safety performance. Next, indicators of plant safety performance are created by combining existing performance measures into more reliable indicators. Finally, the indicators of plant safety performance are related to the organizational structure measures using correlational and discriminant analysis. The overall results show that plants with better developed coordination mechanisms, shorter vertical hierarchies, and a greater number of departments tend to perform more safely.
A Finite Rate Chemical Analysis of Nitric Oxide Flow Contamination Effects on Scramjet Performance
NASA Technical Reports Server (NTRS)
Cabell, Karen F.; Rock, Kenneth E.
2003-01-01
The level of nitric oxide contamination in the test gas of the Langley Research Center Arc-Heated Scramjet Test Facility and the effect of the contamination on scramjet test engine performance were investigated analytically. A finite rate chemical analysis was performed to determine the levels of nitric oxide produced in the facility at conditions corresponding to Mach 6 to 8 flight simulations. Results indicate that nitric oxide levels range from one to three mole percent, corroborating previously obtained measurements. A three-stream combustor code with finite rate chemistry was used to investigate the effects of nitric oxide on scramjet performance. Results indicate that nitric oxide in the test gas causes a small increase in heat release and thrust performance for the test conditions investigated. However, a rate constant uncertainty analysis suggests that the effect of nitric oxide ranges from no net effect, to an increase of about 10 percent in thrust performance.
Motivation, Compensation, and Performance for Science and Technological Teachers
NASA Astrophysics Data System (ADS)
Abast, R. M.; Sangi, N. M.; Tumanduk, M. S. S. S.; Roring, R.
2018-02-01
This research is aimed at obtaining the results of analysis and interpretation of the relationships of achievement motive and compensation with performance at a junior high school in Manado, Indonesia. The research applies a quantitative approach with a correlation analysis method and was conducted at one junior high school in Manado. The results showed that achievement motive among the school's teachers is quite high, which means that, in general, the teachers have a desire to improve their achievement, and that teacher performance at the school is good and increasing. The linkage degree and determinative power between achievement motive and teacher performance amounted to 0.773, or 77.3%. Compensation for the school's teachers is also good enough, which means that the compensation received is satisfactory; the linkage degree and determinative power between compensation and performance amounted to 0.582, or 58.2%.
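As a small aside on the reported "linkage degree and determinative power" figures, the snippet below illustrates how a correlation coefficient and the determination coefficient (the share of variance explained) relate; the abstract does not state which of the two its figures are, so the scores here are purely illustrative.

```python
# Illustration of the relation between correlation (linkage) and r**2
# (determinative power). The teacher scores below are hypothetical.
import numpy as np

motive = np.array([3.2, 3.8, 4.1, 2.9, 4.5, 3.6])       # achievement-motive scores
performance = np.array([3.0, 3.9, 4.0, 3.1, 4.6, 3.5])  # teacher performance scores

r = np.corrcoef(motive, performance)[0, 1]
print(f"correlation r = {r:.3f}, determination r^2 = {r**2:.3f} ({r**2:.1%})")
```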
Fan, Chunlin; Deng, Jiewei; Yang, Yunyun; Liu, Junshan; Wang, Ying; Zhang, Xiaoqi; Fai, Kuokchiu; Zhang, Qingwen; Ye, Wencai
2013-10-01
An ultra-performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry (UPLC-QTOF-MS) method integrating multi-ingredient determination and fingerprint analysis has been established for quality assessment and control of leaves from Ilex latifolia. The method possesses the advantages of speed, efficiency, and accuracy, and allows the multi-ingredient determination and fingerprint analysis in one chromatographic run within 13 min. Multi-ingredient determination was performed based on the extracted ion chromatograms of the exact pseudo-molecular ions (with a 0.01 Da window), and fingerprint analysis was performed based on the base peak chromatograms, obtained by negative-ion electrospray ionization QTOF-MS. The method validation results demonstrated that the developed method possesses desirable specificity, linearity, precision and accuracy. The method was utilized to analyze 22 I. latifolia samples from different origins. The quality assessment was achieved by using both similarity analysis (SA) and principal component analysis (PCA), and the results from SA were consistent with those from PCA. Our experimental results demonstrate that the strategy integrating multi-ingredient determination and fingerprint analysis using the UPLC-QTOF-MS technique is a useful approach for rapid pharmaceutical analysis, with promising prospects for the differentiation of origin, the determination of authenticity, and the overall quality assessment of herbal medicines. Copyright © 2013 Elsevier B.V. All rights reserved.
Analysis of a Rocket Based Combined Cycle Engine during Rocket Only Operation
NASA Technical Reports Server (NTRS)
Smith, T. D.; Steffen, C. J., Jr.; Yungster, S.; Keller, D. J.
1998-01-01
The all rocket mode of operation is a critical factor in the overall performance of a rocket based combined cycle (RBCC) vehicle. However, outside of performing experiments or a full three dimensional analysis, there are no first order parametric models to estimate performance. As a result, an axisymmetric RBCC engine was used to analytically determine specific impulse efficiency values based upon both full flow and gas generator configurations. Design of experiments methodology was used to construct a test matrix and statistical regression analysis was used to build parametric models. The main parameters investigated in this study were: rocket chamber pressure, rocket exit area ratio, percent of injected secondary flow, mixer-ejector inlet area, mixer-ejector area ratio, and mixer-ejector length-to-inject diameter ratio. A perfect gas computational fluid dynamics analysis was performed to obtain values of vacuum specific impulse. Statistical regression analysis was performed based on both full flow and gas generator engine cycles. Results were also found to be dependent upon the entire cycle assumptions. The statistical regression analysis determined that there were five significant linear effects, six interactions, and one second-order effect. Two parametric models were created to provide performance assessments of an RBCC engine in the all rocket mode of operation.
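A sketch of the kind of regression model described above (main effects, a two-factor interaction, and a second-order term fitted by least squares) is given below; the coded factor levels and specific-impulse-efficiency responses are synthetic stand-ins, not the CFD results from the study.

```python
# Minimal response-surface regression sketch: fit linear, interaction, and
# second-order terms by least squares. Factor settings and responses are synthetic.
import numpy as np

rng = np.random.default_rng(0)
# Coded factor levels (e.g. chamber pressure, area ratio, secondary flow fraction).
X = rng.uniform(-1, 1, size=(30, 3))
y = 0.90 + 0.03*X[:, 0] - 0.02*X[:, 1] + 0.01*X[:, 0]*X[:, 1] - 0.015*X[:, 2]**2 \
    + rng.normal(0, 0.002, 30)

# Design matrix: intercept, main effects, one interaction, one squared term.
A = np.column_stack([np.ones(len(y)), X[:, 0], X[:, 1], X[:, 2],
                     X[:, 0]*X[:, 1], X[:, 2]**2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("fitted coefficients:", np.round(coef, 4))
```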
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Huang, Zhenyu; Rice, Mark J.
Contingency analysis studies are necessary to assess the impact of possible power system component failures. The results of the contingency analysis are used to ensure grid reliability, and in power market operation for the feasibility test of market solutions. Currently, these studies are performed in real time based on the current operating conditions of the grid with a pre-selected contingency list, which might result in overlooking some critical contingencies caused by variable system status. To have a complete picture of a power grid, more contingencies need to be studied to improve grid reliability. High-performance computing techniques hold the promise of being able to perform the analysis for more contingency cases within a much shorter time frame. This paper evaluates the performance of counter-based dynamic load balancing schemes for a massive contingency analysis program on 10,000+ cores. One million N-2 contingency analysis cases with a Western Electricity Coordinating Council power grid model have been used to demonstrate the performance. Speedups of 3964 with 4096 cores and 7877 with 10,240 cores are obtained. This paper reports the performance of the load balancing scheme with a single counter and with two counters, describes disk I/O issues, and discusses other potential techniques for further improving the performance.
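The essence of a single-counter dynamic load balancing scheme can be sketched in a few lines: each worker atomically claims the next contingency case from a shared counter, so faster workers naturally process more cases. The sketch below uses Python multiprocessing on a stub solver and only illustrates the idea; it is not the authors' HPC implementation.

```python
# Counter-based dynamic load balancing sketch: workers atomically increment a
# shared counter to claim the next contingency case. The solver is a stub.
import multiprocessing as mp
import time, random

N_CASES = 100

def solve_contingency(case_id):
    time.sleep(random.uniform(0.001, 0.01))   # stand-in for a power-flow solve
    return (case_id, "ok")

def worker(counter, lock, results):
    while True:
        with lock:                             # atomically claim the next case
            case_id = counter.value
            counter.value += 1
        if case_id >= N_CASES:
            break
        results.append(solve_contingency(case_id))

if __name__ == "__main__":
    manager = mp.Manager()
    results = manager.list()
    counter = mp.Value("i", 0)
    lock = mp.Lock()
    procs = [mp.Process(target=worker, args=(counter, lock, results)) for _ in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print(f"{len(results)} contingency cases solved")
```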
Janssen, Christian P; Brumby, Duncan P; Dowell, John; Chater, Nick; Howes, Andrew
2011-01-01
We report the results of a dual-task study in which participants performed a tracking and typing task under various experimental conditions. An objective payoff function was used to provide explicit feedback on how participants should trade off performance between the tasks. Results show that participants' dual-task interleaving strategy was sensitive to changes in the difficulty of the tracking task and resulted in differences in overall task performance. To test the hypothesis that people select strategies that maximize payoff, a Cognitively Bounded Rational Analysis model was developed. This analysis evaluated a variety of dual-task interleaving strategies to identify the optimal strategy for maximizing payoff in each condition. The model predicts that the region of optimum performance is different between experimental conditions. The correspondence between human data and the prediction of the optimal strategy is found to be remarkably high across a number of performance measures. This suggests that participants were honing their behavior to maximize payoff. Limitations are discussed. Copyright © 2011 Cognitive Science Society, Inc.
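The core of the payoff-maximization argument can be illustrated with a toy model: enumerate interleaving strategies (how many characters to type before switching back to tracking), score each with an explicit payoff function, and take the best one. The drift model and payoff weights below are invented for illustration and are not the study's actual payoff function.

```python
# Toy strategy-evaluation sketch: pick the interleaving strategy that maximizes
# an explicit payoff. Strategy space, drift model, and weights are hypothetical.
import numpy as np

def payoff(chars_before_switch, drift_rate, typing_gain=1.0, error_cost=0.5):
    # Type a 20-character string in bursts; the cursor drifts while typing and
    # is re-centered on each switch back to the tracking task.
    total_chars, t_char, t_switch = 20, 0.3, 0.2
    bursts = int(np.ceil(total_chars / chars_before_switch))
    time_typing = total_chars * t_char + bursts * t_switch
    # Mean tracking error grows with the length of each uninterrupted burst.
    mean_error = drift_rate * chars_before_switch * t_char / 2.0
    return typing_gain * total_chars / time_typing - error_cost * mean_error

strategies = range(1, 21)                 # characters typed before switching
for drift in (0.5, 2.0):                  # easy vs. hard tracking condition
    best = max(strategies, key=lambda s: payoff(s, drift))
    print(f"drift {drift}: best burst length = {best} characters")
```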
LOX/LH2 vane pump for auxiliary propulsion systems
NASA Technical Reports Server (NTRS)
Hemminger, J. A.; Ulbricht, T. E.
1985-01-01
Positive displacement pumps offer potential efficiency advantages over centrifugal pumps for future low thrust space missions. Low flow rate applications, such as space station auxiliary propulsion or dedicated low thrust orbiter transfer vehicles, are typical of missions where low flow and high head rise challenge centrifugal pumps. The positive displacement vane pump for pumping of LOX and LH2 is investigated. This effort has included: (1) a testing program in which pump performance was investigated for differing pump clearances and for differing pump materials while pumping LN2, LOX, and LH2; and (2) an analysis effort, in which a comprehensive pump performance analysis computer code was developed and exercised. An overview of the theoretical framework of the performance analysis computer code is presented, along with a summary of analysis results. Experimental results are presented for the pump operating in liquid nitrogen. Included are data on the effects of pump clearance, speed, and pressure rise on pump performance. Pump suction performance is also presented.
Design and performance of an analysis-by-synthesis class of predictive speech coders
NASA Technical Reports Server (NTRS)
Rose, Richard C.; Barnwell, Thomas P., III
1990-01-01
The performance of a broad class of analysis-by-synthesis linear predictive speech coders is quantified experimentally. The class of coders includes a number of well-known techniques as well as a very large number of speech coders which have not been named or studied. A general formulation for deriving the parametric representation used in all of the coders in the class is presented. A new coder, named the self-excited vocoder, is discussed because of its good performance with low complexity, and because of the insight this coder gives to analysis-by-synthesis coders in general. The results of a study comparing the performances of different members of this class are presented. The study takes the form of a series of formal subjective and objective speech quality tests performed on selected coders. The results of this study lead to some interesting and important observations concerning the controlling parameters for analysis-by-synthesis speech coders.
Performance analysis of a coherent free space optical communication system based on experiment.
Cao, Jingtai; Zhao, Xiaohui; Liu, Wei; Gu, Haijun
2017-06-26
Based on our previous study and a designed experimental AO system with a 97-element continuous-surface deformable mirror, we conduct a performance analysis of a coherent free space optical communication (FSOC) system in terms of mixing efficiency (ME), bit error rate (BER) and outage probability under different Greenwood frequencies and atmospheric coherence lengths. The results show that the influence of the atmospheric temporal characteristics on the performance is slightly stronger than that of the spatial characteristics when the receiving aperture and the number of sub-apertures are given. This analysis result provides a reference for the design of coherent FSOC systems.
Cassimatis, Constantine; Liu, Karen P Y; Fahey, Paul; Bissett, Michelle
2016-09-01
A systematic review with meta-analysis was performed to investigate the effect of external sensory cued therapy on activities of daily living (ADL) performance, including walking and daily tasks such as dressing, for individuals with Parkinson's disease (PD). A detailed computer-aided search of the literature was applied to MEDLINE, Cumulative Index to Nursing and Allied Health Literature, EMBASE and PubMed. Studies investigating the effects of external sensory cued therapy on ADL performance for individuals with PD in all stages of disease progression were collected. Relevant articles were critically reviewed and study results were synthesized by two independent researchers. A data-analysis method was used to extract data from the selected articles, and a meta-analysis was carried out for all randomized controlled trials. Six studies with 243 individuals with PD were included in this review, and all six yielded positive findings in favour of external sensory cues. The meta-analysis showed that ADL performance improved significantly after treatment (P=0.011) and at follow-up (P<0.001) with external sensory cued therapy. The results of this review provide evidence of an improvement in ADL performance in general in individuals with PD. It is recommended that clinicians incorporate external sensory cues into training programmes focused on improving daily task performance.
Performance of concrete members subjected to large hydrocarbon pool fires
Zwiers, Renata I.; Morgan, Bruce J.
1989-01-01
The authors discuss an investigation to determine analytically if the performance of concrete beams and columns in a hydrocarbon pool test fire would differ significantly from their performance in a standard test fire. The investigation consisted of a finite element analysis to obtain temperature distributions in typical cross sections, a comparison of the resulting temperature distribution in the cross section, and a strength analysis of a beam based on temperature distribution data. Results of the investigation are reported.
Hao, Bibo; Sun, Wen; Yu, Yiqin; Li, Jing; Hu, Gang; Xie, Guotong
2016-01-01
Recent advances in cloud computing and machine learning have made it more convenient for researchers to gain insights from massive healthcare data, but performing analyses on healthcare data in current practice still lacks efficiency, and collaborating among different researchers and sharing analysis results remain challenging. In this paper, we describe a practice that makes the analytics process collaborative and analysis results reproducible by exploiting and extending Jupyter Notebook. After applying this practice in our use cases, we can perform analyses and deliver results with less effort and in a shorter time compared with our previous practice.
Design and performance analysis of gas and liquid radial turbines
NASA Astrophysics Data System (ADS)
Tan, Xu
In the first part of the research, pumps running in reverse as turbines are studied. This work uses experimental data for a wide range of pumps representative of centrifugal pump configurations in terms of specific speed. Based on specific speed and specific diameter, an accurate correlation is developed to predict the performance at the best efficiency point of a centrifugal pump in its turbine mode of operation. The proposed prediction method yields very good results compared with previous attempts; it is compared with nine previous methods found in the literature, and the comparison shows that the method proposed in this paper is the most accurate. The proposed method can be further complemented and refined by future tests to increase its accuracy, and it is meaningful because it is based on both specific speed and specific diameter. The second part of the research is focused on the design and analysis of a radial gas turbine. The specification of the turbine is obtained from a solar biogas hybrid system, which is theoretically analyzed and constructed around a purchased compressor. Theoretical analysis results in a specification of 100 lb/min mass flow, 900°C inlet total temperature and 1.575 atm inlet total pressure. The 1-D and 3-D geometry of the rotor is generated based on Aungier's method, and 1-D loss model analysis and 3-D CFD simulations are performed to examine the performance of the rotor. The total-to-total efficiency of the rotor is more than 90%. With the help of CFD analysis, modifications to the preliminary design yielded optimized aerodynamic performance. Finally, the theoretical performance analysis of the hybrid system is performed with the designed turbine.
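The two similarity parameters on which the proposed correlation rests can be computed as below; the formulas for dimensionless specific speed and specific diameter are standard, while the operating-point numbers are placeholders and the paper's fitted correlation itself is not reproduced.

```python
# Dimensionless specific speed and specific diameter at the best efficiency
# point of a pump running as a turbine. Operating-point values are made up.
import math

def specific_speed(omega, Q, g, H):
    """ns = omega * sqrt(Q) / (g*H)**0.75, with omega in rad/s, Q in m^3/s, H in m."""
    return omega * math.sqrt(Q) / (g * H) ** 0.75

def specific_diameter(D, Q, g, H):
    """ds = D * (g*H)**0.25 / sqrt(Q), with D the impeller diameter in m."""
    return D * (g * H) ** 0.25 / math.sqrt(Q)

omega = 1450 * 2 * math.pi / 60          # shaft speed, rad/s
Q, H, D, g = 0.05, 30.0, 0.25, 9.81      # flow, head, impeller diameter, gravity
print(f"ns = {specific_speed(omega, Q, g, H):.3f}, ds = {specific_diameter(D, Q, g, H):.3f}")
```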
Development of parallel algorithms for electrical power management in space applications
NASA Technical Reports Server (NTRS)
Berry, Frederick C.
1989-01-01
The application of parallel techniques for electrical power system analysis is discussed. The Newton-Raphson method of load flow analysis was used along with the decomposition-coordination technique to perform load flow analysis. The decomposition-coordination technique enables tasks to be performed in parallel by partitioning the electrical power system into independent local problems. Each independent local problem represents a portion of the total electrical power system on which a load flow analysis can be performed. The load flow analysis is performed on these partitioned elements by using the Newton-Raphson load flow method. These independent local problems will produce results for voltage and power which can then be passed to the coordinator portion of the solution procedure. The coordinator problem uses the results of the local problems to determine if any correction is needed on the local problems. The coordinator problem is also solved by an iterative method much like the local problem. The iterative method for the coordination problem will also be the Newton-Raphson method. Therefore, each iteration at the coordination level will result in new values for the local problems. The local problems will have to be solved again along with the coordinator problem until some convergence conditions are met.
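A minimal illustration of the Newton-Raphson load flow step used inside each local problem is sketched below for a two-bus system with a numerically evaluated Jacobian; the line impedance and load are hypothetical and the decomposition-coordination layer is omitted.

```python
# Newton-Raphson load flow for a 2-bus system (slack bus plus one PQ bus).
# Line impedance and load values are hypothetical; Jacobian is numerical.
import numpy as np

y12 = 1.0 / complex(0.02, 0.08)          # line admittance between bus 1 and bus 2
V1 = complex(1.0, 0.0)                   # slack bus voltage
P2_spec, Q2_spec = -0.8, -0.4            # specified injection at the PQ bus (a load)

def mismatch(x):
    v2, th2 = x
    V2 = v2 * np.exp(1j * th2)
    S2 = V2 * np.conj((V2 - V1) * y12)   # complex power injected at bus 2
    return np.array([S2.real - P2_spec, S2.imag - Q2_spec])

x = np.array([1.0, 0.0])                 # flat start: |V2| = 1.0 p.u., angle 0
for it in range(10):
    f = mismatch(x)
    if np.max(np.abs(f)) < 1e-8:
        break
    # Numerical Jacobian by forward differences.
    J = np.zeros((2, 2))
    for k in range(2):
        dx = np.zeros(2)
        dx[k] = 1e-6
        J[:, k] = (mismatch(x + dx) - f) / 1e-6
    x = x - np.linalg.solve(J, f)

print(f"|V2| = {x[0]:.4f} p.u., angle = {np.degrees(x[1]):.3f} deg after {it} iterations")
```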
Spotlight-8 Image Analysis Software
NASA Technical Reports Server (NTRS)
Klimek, Robert; Wright, Ted
2006-01-01
Spotlight is a cross-platform GUI-based software package designed to perform image analysis on sequences of images generated by combustion and fluid physics experiments run in a microgravity environment. Spotlight can perform analysis on a single image in an interactive mode or perform analysis on a sequence of images in an automated fashion. Image processing operations can be employed to enhance the image before various statistics and measurement operations are performed. An arbitrarily large number of objects can be analyzed simultaneously with independent areas of interest. Spotlight saves results in a text file that can be imported into other programs for graphing or further analysis. Spotlight can be run on Microsoft Windows, Linux, and Apple OS X platforms.
The Effect of Laminar Flow on Rotor Hover Performance
NASA Technical Reports Server (NTRS)
Overmeyer, Austin D.; Martin, Preston B.
2017-01-01
The topic of laminar flow effects on hover performance is introduced with respect to some historical efforts where laminar flow was either measured or attempted. An analysis method is outlined using a combined blade element and momentum method coupled to an airfoil analysis method that includes the full e(sup N) transition model. The analysis results compared well with the measured hover performance, including the measured location of transition on both the upper and lower blade surfaces. The analysis method is then used to understand the upper limits of hover efficiency as a function of disk loading. The impact of laminar flow is higher at low disk loading, but significant improvement in terms of power loading appears possible even at high disk loadings approaching 20 psf. An optimum planform design equation is derived for cases of zero profile drag and finite drag levels. These results are intended to be a guide for design studies and as a benchmark for comparison with higher fidelity analysis results. The details of the analysis method are given to enable other researchers to use the same approach for comparison with other approaches.
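The disk-loading trend referred to above follows directly from momentum theory: ideal induced power per unit thrust grows as the square root of disk loading, so power loading falls as disk loading rises. A short sketch with an assumed constant figure of merit is given below; the numbers are illustrative only.

```python
# Momentum-theory trend of hover power loading with disk loading.
# The figure of merit is an assumed constant, not a computed value.
import numpy as np

rho = 0.002377                              # sea-level air density, slug/ft^3
disk_loading = np.linspace(2.0, 20.0, 7)    # T/A in lb/ft^2 (psf)
fm = 0.75                                   # assumed figure of merit

# Ideal induced power per unit thrust: P/T = sqrt(DL / (2*rho)), in ft*lb/s per lb.
p_over_t_ideal = np.sqrt(disk_loading / (2.0 * rho))
power_loading = 550.0 * fm / p_over_t_ideal  # lb of thrust per horsepower

for dl, pl in zip(disk_loading, power_loading):
    print(f"DL = {dl:5.1f} psf -> power loading ~ {pl:5.2f} lb/hp")
```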
Planar doped barrier devices for subharmonic mixers
NASA Technical Reports Server (NTRS)
Lee, T. H.; East, J. R.; Haddad, G. I.
1991-01-01
An overview is given of planar doped barrier (PDB) devices for subharmonic mixer applications. A simplified description is given of PDB characteristics along with a more complete numerical analysis of the current versus voltage characteristics of typical structures. The analysis points out the tradeoffs between the device structure and the resulting characteristics that are important for mixer performance. Preliminary low-frequency characterization results are given for the device structures, and a computer analysis of subharmonic mixer parameters and performance is presented.
Analysis of Rosen piezoelectric transformers with a varying cross-section.
Xue, H; Yang, J; Hu, Y
2008-07-01
We study the effects of a varying cross-section on the performance of Rosen piezoelectric transformers operating with length extensional modes of rods. A theoretical analysis is performed using an extended version of a one-dimensional model developed in a previous paper. Numerical results based on the theoretical analysis are presented.
Frequency Spectrum Method-Based Stress Analysis for Oil Pipelines in Earthquake Disaster Areas
Wu, Xiaonan; Lu, Hongfang; Huang, Kun; Wu, Shijuan; Qiao, Weibiao
2015-01-01
When a long distance oil pipeline crosses an earthquake disaster area, inertial force and strong ground motion can cause the pipeline stress to exceed the failure limit, resulting in bending and deformation failure. To date, researchers have performed limited safety analyses of oil pipelines in earthquake disaster areas that include stress analysis. Therefore, using the spectrum method and theory of one-dimensional beam units, CAESAR II is used to perform a dynamic earthquake analysis for an oil pipeline in the XX earthquake disaster area. This software is used to determine if the displacement and stress of the pipeline meet the standards when subjected to a strong earthquake. After performing the numerical analysis, the primary seismic action axial, longitudinal and horizontal displacement directions and the critical section of the pipeline can be located. Feasible project enhancement suggestions based on the analysis results are proposed. The designer is able to utilize this stress analysis method to perform an ultimate design for an oil pipeline in earthquake disaster areas; therefore, improving the safe operation of the pipeline. PMID:25692790
NDARC - NASA Design and Analysis of Rotorcraft Validation and Demonstration
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2010-01-01
Validation and demonstration results from the development of the conceptual design tool NDARC (NASA Design and Analysis of Rotorcraft) are presented. The principal tasks of NDARC are to design a rotorcraft to satisfy specified design conditions and missions, and then analyze the performance of the aircraft for a set of off-design missions and point operating conditions. The aircraft chosen as NDARC development test cases are the UH-60A single main-rotor and tail-rotor helicopter, the CH-47D tandem helicopter, the XH-59A coaxial lift-offset helicopter, and the XV-15 tiltrotor. These aircraft were selected because flight performance data, a weight statement, detailed geometry information, and a correlated comprehensive analysis model are available for each. Validation consists of developing the NDARC models for these aircraft by using geometry and weight information, airframe wind tunnel test data, engine decks, rotor performance tests, and comprehensive analysis results; and then comparing the NDARC results for aircraft and component performance with flight test data. Based on the calibrated models, the capability of the code to size rotorcraft is explored.
Zhao, Huawei
2009-01-01
A ZEMAX model was constructed to simulate a clinical trial of intraocular lenses (IOLs) based on a clinically oriented Monte Carlo ensemble analysis using postoperative ocular parameters. The purpose of this model is to test the feasibility of streamlining and optimizing both the design process and the clinical testing of IOLs. This optical ensemble analysis (OEA) is also validated. Simulated pseudophakic eyes were generated by using the tolerancing and programming features of ZEMAX optical design software. OEA methodology was verified by demonstrating that the results of clinical performance simulations were consistent with previously published clinical performance data using the same types of IOLs. From these results we conclude that the OEA method can objectively simulate the potential clinical trial performance of IOLs.
The Impact of Measurement Noise in GPA Diagnostic Analysis of a Gas Turbine Engine
NASA Astrophysics Data System (ADS)
Ntantis, Efstratios L.; Li, Y. G.
2013-12-01
The performance diagnostic analysis of a gas turbine is accomplished by estimating a set of internal engine health parameters from available sensor measurements. No physical measuring instruments, however, can completely eliminate measurement uncertainties: sensor measurements are often distorted by noise and bias, leading to inaccurate estimation results. This paper explores the impact of measurement noise on gas turbine GPA analysis. The analysis is demonstrated with a test case in which the gas turbine performance simulation and diagnostics code TURBOMATCH is used to build a performance model of an engine similar to the Rolls-Royce Trent 500 turbofan and to carry out the diagnostic analysis in the presence of different levels of measurement noise. Finally, to improve the reliability of the diagnostic results, a statistical analysis of the data scattering caused by sensor uncertainties is made. The diagnostic tool used for the statistical analysis of the measurement noise impact is a model-based method utilizing a non-linear GPA.
Dynamic test/analysis correlation using reduced analytical models
NASA Technical Reports Server (NTRS)
Mcgowan, Paul E.; Angelucci, A. Filippo; Javeed, Mehzad
1992-01-01
Test/analysis correlation is an important aspect of the verification of analysis models which are used to predict on-orbit response characteristics of large space structures. This paper presents results of a study using reduced analysis models for performing dynamic test/analysis correlation. The reduced test-analysis model (TAM) has the same number and orientation of DOF as the test measurements. Two reduction methods, static (Guyan) reduction and the Improved Reduced System (IRS) reduction, are applied to the test/analysis correlation of a laboratory truss structure. Simulated test results and modal test data are used to examine the performance of each method. It is shown that selection of DOF to be retained in the TAM is critical when large structural masses are involved. In addition, the use of modal test results may provide difficulties in TAM accuracy even if a large number of DOF are retained in the TAM.
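For reference, the Guyan (static) reduction used to build the TAM condenses the slave DOF onto the retained master DOF through the static constraint x_s = -Kss^(-1) Ksm x_m. The sketch below applies it to small random stand-in matrices rather than the laboratory truss model discussed in the paper.

```python
# Guyan (static) reduction sketch: retain "master" DOF corresponding to test
# measurements and condense the rest. K and M are random stand-in matrices.
import numpy as np

rng = np.random.default_rng(1)
n = 8
A = rng.normal(size=(n, n))
K = A @ A.T + n * np.eye(n)              # symmetric positive definite "stiffness"
M = np.diag(rng.uniform(1.0, 2.0, n))    # lumped "mass"

masters = [0, 2, 5]                      # DOF kept in the test-analysis model (TAM)
slaves = [i for i in range(n) if i not in masters]

Ksm = K[np.ix_(slaves, masters)]
Kss = K[np.ix_(slaves, slaves)]

# Guyan assumption: slave DOF follow the masters statically, x_s = -Kss^{-1} Ksm x_m.
T = np.zeros((n, len(masters)))
T[masters, :] = np.eye(len(masters))
T[slaves, :] = -np.linalg.solve(Kss, Ksm)

K_tam = T.T @ K @ T
M_tam = T.T @ M @ T
print("TAM stiffness:\n", np.round(K_tam, 2))
```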
Task versus relationship conflict, team performance, and team member satisfaction: a meta-analysis.
De Dreu, Carsten K W; Weingart, Laurie R
2003-08-01
This study provides a meta-analysis of research on the associations between relationship conflict, task conflict, team performance, and team member satisfaction. Consistent with past theorizing, results revealed strong and negative correlations between relationship conflict, team performance, and team member satisfaction. In contrast to what has been suggested in both academic research and introductory textbooks, however, results also revealed strong and negative (instead of the predicted positive) correlations between task conflict, team performance, and team member satisfaction. As predicted, conflict had stronger negative relations with team performance in highly complex (decision making, project, mixed) than in less complex (production) tasks. Finally, task conflict was less negatively related to team performance when task conflict and relationship conflict were weakly, rather than strongly, correlated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kupca, L.; Beno, P.
A very brief summary is provided of a primary circuit piping material properties analysis. The analysis was performed for the Bohunice V-1 reactor and the Kola-1 and -2 reactors. Assessment was performed on Bohunice V-1 archive materials and primary piping material cut from the Kola units after 100,000 hours of operation. Main research program tasks included analysis of mechanical properties, corrosion stability, and microstructural properties. Analysis results are not provided.
Prior elicitation and Bayesian analysis of the Steroids for Corneal Ulcers Trial.
See, Craig W; Srinivasan, Muthiah; Saravanan, Somu; Oldenburg, Catherine E; Esterberg, Elizabeth J; Ray, Kathryn J; Glaser, Tanya S; Tu, Elmer Y; Zegans, Michael E; McLeod, Stephen D; Acharya, Nisha R; Lietman, Thomas M
2012-12-01
To elicit expert opinion on the use of adjunctive corticosteroid therapy in bacterial corneal ulcers, and to perform a Bayesian analysis of the Steroids for Corneal Ulcers Trial (SCUT), using expert opinion as a prior probability. The SCUT was a placebo-controlled trial assessing visual outcomes in patients receiving topical corticosteroids or placebo as adjunctive therapy for bacterial keratitis. Questionnaires were conducted at scientific meetings in India and North America to gauge expert consensus on the perceived benefit of corticosteroids as adjunct treatment. Bayesian analysis, using the questionnaire data as a prior probability and the primary outcome of SCUT as a likelihood, was performed. For comparison, an additional Bayesian analysis was performed using the results of the SCUT pilot study as a prior distribution. Indian respondents believed there to be a 1.21 Snellen line improvement, and North American respondents believed there to be a 1.24 line improvement with corticosteroid therapy. The SCUT primary outcome found a non-significant 0.09 Snellen line benefit with corticosteroid treatment. The results of the Bayesian analysis estimated a slightly greater benefit than did the SCUT primary analysis (0.19 lines versus 0.09 lines). Indian and North American experts had similar expectations of the effectiveness of corticosteroids in bacterial corneal ulcers, namely that corticosteroids would markedly improve visual outcomes. Bayesian analysis produced results very similar to those produced by the SCUT primary analysis. The similarity in result is likely due to the large sample size of SCUT and helps validate the results of SCUT.
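The Bayesian update itself can be illustrated with a conjugate normal-normal sketch in which the posterior mean is the precision-weighted average of the elicited prior and the trial estimate; the standard errors below are assumptions chosen only for illustration, so the numbers will not reproduce the 0.19-line result reported above.

```python
# Conjugate normal-normal update: combine an elicited prior on the Snellen-line
# benefit with the trial estimate. The standard errors are hypothetical.
prior_mean, prior_se = 1.2, 1.0        # elicited expert prior (approx. 1.2 lines)
trial_mean, trial_se = 0.09, 0.20      # trial estimate of the treatment effect

prior_prec = 1.0 / prior_se**2
trial_prec = 1.0 / trial_se**2
post_mean = (prior_prec * prior_mean + trial_prec * trial_mean) / (prior_prec + trial_prec)
post_se = (prior_prec + trial_prec) ** -0.5
print(f"posterior mean = {post_mean:.2f} lines, posterior SE = {post_se:.2f}")
```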
Statistical Analysis of NAS Parallel Benchmarks and LINPACK Results
NASA Technical Reports Server (NTRS)
Meuer, Hans-Werner; Simon, Horst D.; Strohmeier, Erich; Lasinski, T. A. (Technical Monitor)
1994-01-01
In the last three years extensive performance data have been reported for parallel machines, based both on the NAS Parallel Benchmarks and on LINPACK. In this study we have used the reported benchmark results and performed a number of statistical experiments using factor, cluster, and regression analyses. In addition to the performance results of LINPACK and the eight NAS Parallel Benchmarks, we have also included the peak performance of the machine and the LINPACK n and n(sub 1/2) values. Some of the results and observations can be summarized as follows: 1) All benchmarks are strongly correlated with peak performance. 2) LINPACK and EP each have a unique signature. 3) The remaining NPB can be grouped into three groups as follows: (CG and IS), (LU and SP), and (MG, FT, and BT). Hence three (or four with EP) benchmarks are sufficient to characterize the overall NPB performance. Our poster presentation will follow a standard poster format and will present the data of our statistical analysis in detail.
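A sketch of the clustering step (grouping benchmarks by the similarity of their performance profiles across machines) is shown below using SciPy's hierarchical clustering; the performance table is random stand-in data, so the resulting clusters will not be the (CG, IS), (LU, SP), (MG, FT, BT) grouping reported above.

```python
# Hierarchical clustering of benchmarks by the similarity of their (log)
# performance across machines. The performance table is random stand-in data.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
benchmarks = ["LINPACK", "EP", "CG", "IS", "LU", "SP", "MG", "FT", "BT"]
perf = np.exp(rng.normal(size=(len(benchmarks), 12)))   # benchmarks x machines

# Correlation-based distance between benchmark performance profiles.
corr = np.corrcoef(np.log(perf))
dist = 1.0 - corr[np.triu_indices(len(benchmarks), k=1)]  # condensed distance vector

Z = linkage(dist, method="average")
labels = fcluster(Z, t=3, criterion="maxclust")
for name, lab in zip(benchmarks, labels):
    print(f"{name}: cluster {lab}")
```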
Performance analysis and dynamic modeling of a single-spool turbojet engine
NASA Astrophysics Data System (ADS)
Andrei, Irina-Carmen; Toader, Adrian; Stroe, Gabriela; Frunzulica, Florin
2017-01-01
The purposes of modeling and simulation of a turbojet engine are steady state analysis and transient analysis. The steady state analysis, which consists of the investigation of the operating (equilibrium) regimes and is based on appropriate modeling of the turbojet engine at design and off-design regimes, yields the performance analysis, summarized by the engine's operational maps (i.e. the altitude map, velocity map and speed map) and the engine's universal map. The mathematical model that allows the calculation of the design and off-design performance of a single-spool turbojet is detailed. An in-house code was developed and calibrated using the J85 turbojet engine as the test case. The dynamic modeling of the turbojet engine is obtained from the energy balance equations for the compressor, combustor and turbine as the engine's main parts. The transient analysis, which is based on appropriate modeling of the engine and its main parts, expresses the dynamic behavior of the turbojet engine and provides details regarding the engine's control. The aim of the dynamic analysis is to determine a control program for the turbojet, based on the results provided by the performance analysis. In the case of a single-spool turbojet engine with fixed nozzle geometry, the thrust is controlled by one parameter, the fuel flow rate. The design and management of the aircraft engine controls are based on the results of the transient analysis. The construction of the design model is complex, since it is based on both steady-state and transient analysis, further allowing flight path cycle analysis and optimizations. This paper presents numerical simulations for a single-spool turbojet engine (J85 as test case), with appropriate modeling for steady-state and dynamic analysis.
ERIC Educational Resources Information Center
Benligiray, Serdar; Onay, Ahmet
2017-01-01
The objective of this study is to explore business courses performance factors with a focus on accounting and finance. Course score interrelations are assumed to represent interpretable constructs of these factors. Factor analysis is proposed to identify the constructs that explain the correlations. Factor analysis results identify three…
Modeling and Analysis of Actinide Diffusion Behavior in Irradiated Metal Fuel
NASA Astrophysics Data System (ADS)
Edelmann, Paul G.
There have been numerous attempts to model fast reactor fuel behavior in the last 40 years. The US currently does not have a fully reliable tool to simulate the behavior of metal fuels in fast reactors. The experimental database necessary to validate the codes is also very limited. The DOE-sponsored Advanced Fuels Campaign (AFC) has performed various experiments that are ready for analysis. Current metal fuel performance codes are either not available to the AFC or have limitations and deficiencies in predicting AFC fuel performance. A modified version of a new fuel performance code, FEAST-Metal, was employed in this investigation with useful results. This work explores the modeling and analysis of AFC metallic fuels using FEAST-Metal, particularly in the area of constituent actinide diffusion behavior. The FEAST-Metal code calculations for this work were conducted at Los Alamos National Laboratory (LANL) in support of on-going activities related to sensitivity analysis of fuel performance codes. A sensitivity analysis of FEAST-Metal was completed to identify important macroscopic parameters of interest to modeling and simulation of metallic fuel performance. A modification was made to the FEAST-Metal constituent redistribution model to enable accommodation of newer AFC metal fuel compositions with verified results. Applicability of this modified model for sodium fast reactor metal fuel design is demonstrated.
Risović, Dubravko; Pavlović, Zivko
2013-01-01
Processing of gray scale images in order to determine the corresponding fractal dimension is very important due to the widespread use of imaging technologies and the application of fractal analysis in many areas of science, technology, and medicine. To this end, many methods for estimation of fractal dimension from gray scale images have been developed and routinely used. Unfortunately, different methods (dimension estimators) often yield significantly different results in a manner that makes interpretation difficult. Here, we report results of a comparative assessment of the performance of several of the most frequently used algorithms/methods for estimation of fractal dimension. For that purpose, we have used scanning electron microscope images of aluminum oxide surfaces with different fractal dimensions. The performance of the algorithms/methods was evaluated using the statistical Z-score approach. The differences between the performances of six methods are discussed and further compared with results obtained by electrochemical impedance spectroscopy on the same samples. The analysis of results shows that the performance of the investigated algorithms varies considerably and that systematically erroneous fractal dimensions could be estimated using certain methods. The differential cube counting, triangulation, and box counting algorithms showed satisfactory performance over the whole investigated range of fractal dimensions. The difference statistic proved to be less reliable, generating 4% unsatisfactory results. The performances of the power spectrum, partitioning and EIS methods were unsatisfactory in 29%, 38%, and 75% of estimations, respectively. The results of this study should be useful and provide guidelines to researchers using/attempting fractal analysis of images obtained by scanning microscopy or atomic force microscopy. © Wiley Periodicals, Inc.
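For orientation, the box counting estimator mentioned above can be sketched compactly for a binary image: count occupied boxes at several box sizes and take the slope of log(count) against log(1/size). Extending this to gray scale images (e.g. differential box counting, as used in the paper) is not shown here.

```python
# Box-counting fractal dimension sketch for a binary image. Gray scale surfaces
# require a gray-level extension (e.g. differential box counting), not shown.
import numpy as np

def box_count_dimension(img, sizes=(2, 4, 8, 16, 32)):
    counts = []
    for s in sizes:
        h, w = img.shape[0] // s * s, img.shape[1] // s * s
        blocks = img[:h, :w].reshape(h // s, s, w // s, s)
        occupied = blocks.any(axis=(1, 3)).sum()   # boxes containing any "mass"
        counts.append(occupied)
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes, float)), np.log(counts), 1)
    return slope

# Quick check on a filled square (expected dimension close to 2).
img = np.zeros((256, 256), dtype=bool)
img[64:192, 64:192] = True
print(f"estimated box-counting dimension: {box_count_dimension(img):.2f}")
```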
CMS endcap RPC performance analysis
NASA Astrophysics Data System (ADS)
Teng, H.; CMS Collaboration
2014-08-01
The Resistive Plate Chamber (RPC) detector system in the LHC-CMS experiment is designed for triggering purposes. The endcap RPC system has been successfully operated from the commissioning period (2008) to the end of Run 1 (2013). We have developed an analysis tool for endcap RPC performance and validated the efficiency calculation algorithm, focusing on the first endcap station, which was assembled and tested by the Peking University group. We cross-checked the results obtained with those extracted with alternative methods and found good agreement in terms of performance parameters [1]. The results show that the CMS RPC endcap system fulfilled the performance expected in the Technical Design Report [2].
NASA Technical Reports Server (NTRS)
Walsh, J. L.; Weston, R. P.; Samareh, J. A.; Mason, B. H.; Green, L. L.; Biedron, R. T.
2000-01-01
An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity finite-element structural analysis and computational fluid dynamics aerodynamic analysis in a distributed, heterogeneous computing environment that includes high performance parallel computing. A software system has been designed and implemented to integrate a set of existing discipline analysis codes, some of them computationally intensive, into a distributed computational environment for the design of a high-speed civil transport configuration. The paper describes both the preliminary results from implementing and validating the multidisciplinary analysis and the results from an aerodynamic optimization. The discipline codes are integrated by using the Java programming language and a Common Object Request Broker Architecture compliant software product. A companion paper describes the formulation of the multidisciplinary analysis and optimization system.
Job-mix modeling and system analysis of an aerospace multiprocessor.
NASA Technical Reports Server (NTRS)
Mallach, E. G.
1972-01-01
An aerospace guidance computer organization, consisting of multiple processors and memory units attached to a central time-multiplexed data bus, is described. A job mix for this type of computer is obtained by analysis of Apollo mission programs. Multiprocessor performance is then analyzed using: 1) queuing theory, under certain 'limiting case' assumptions; 2) Markov process methods; and 3) system simulation. Results of the analyses indicate: 1) Markov process analysis is a useful and efficient predictor of simulation results; 2) efficient job execution is not seriously impaired even when the system is so overloaded that new jobs are inordinately delayed in starting; 3) job scheduling is significant in determining system performance; and 4) a system having many slow processors may or may not perform better than a system of equal power having few fast processors, but will not perform significantly worse.
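The Markov-process style of analysis mentioned above can be illustrated by solving a small discrete-time chain for its steady-state distribution; the three states and transition probabilities below are hypothetical, not the actual bus/processor model of the paper.

```python
# Steady-state distribution of a small discrete-time Markov chain whose states
# might represent, e.g., "bus idle / bus busy / processor waiting" (hypothetical).
import numpy as np

P = np.array([[0.80, 0.15, 0.05],
              [0.30, 0.60, 0.10],
              [0.20, 0.50, 0.30]])   # transition matrix; rows sum to 1

# Solve pi = pi P together with sum(pi) = 1.
n = P.shape[0]
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.r_[np.zeros(n), 1.0]
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print("steady-state probabilities:", np.round(pi, 4))
```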
NASA Technical Reports Server (NTRS)
Mcfarland, E.; Tabakoff, W.; Hamed, A.
1977-01-01
An investigation of the effects of coolant injection on the aerodynamic performance of cooled turbine blades is presented. The coolant injection is modeled in the inviscid irrotational adiabatic flow analysis through the cascade using the distributed singularities approach. The resulting integral equations are solved using a minimized surface singularity density criterion. The aerodynamic performance was evaluated using this solution in conjunction with an existing mixing theory analysis. The results of the present analysis are compared with experimental measurements in cold flow tests.
ERIC Educational Resources Information Center
Shinn, Mark; And Others
Two studies were conducted to (1) analyze the subtest characteristics of the Woodcock-Johnson Psycho-Educational Battery, and (2) apply those results to an analysis of 50 fourth grade learning disabled (LD) students' performance on the Battery. Analyses indicated that the poorer performance of LD students on the Woodcock-Johnson Tests of Cognitive…
Design and Evaluation of Glass/epoxy Composite Blade and Composite Tower Applied to Wind Turbine
NASA Astrophysics Data System (ADS)
Park, Hyunbum
2018-02-01
In this study, the analysis and manufacturing of a small-class wind turbine blade were performed. In the structural design, the loading conditions are first defined through load case analysis. The proposed structural configuration of the blade is a sandwich-type composite structure with E-glass/epoxy face sheets and a urethane foam core for lightness, structural stability, low manufacturing cost and an easy manufacturing process. This work also proposes a design procedure and results for the tower of a small-scale wind turbine system. Structural analysis of the blade, including load cases, stress, deformation, buckling, vibration and fatigue life, was performed using the finite element method, load spectrum analysis and the Miner rule. Moreover, the structural safety of the tower was verified through FEM structural analysis. The manufacturing of the blade and tower was performed based on the structural design. To investigate the designed structure, structural tests were conducted and their results were compared with the calculated results. It is confirmed that the final proposed blade and tower meet the design requirements.
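The fatigue life assessment mentioned above rests on the Palmgren-Miner linear damage rule, sketched below with a hypothetical load spectrum and S-N data; the blade's actual load cases are not reproduced.

```python
# Palmgren-Miner linear damage rule: damage is the sum of applied cycles over
# allowable cycles in each load bin. Spectrum and S-N values are hypothetical.
stress_bins = [  # (label, applied cycles n_i per year, allowable cycles N_i)
    ("gust high", 1.0e4, 2.0e5),
    ("gust medium", 5.0e5, 5.0e6),
    ("normal operation", 5.0e6, 1.0e8),
]

damage = sum(n_i / N_i for _, n_i, N_i in stress_bins)
print(f"Miner damage sum per year of spectrum: {damage:.3f}")
if damage > 0:
    print(f"estimated fatigue life: {1.0 / damage:.1f} years")
```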
Experience with HEP analysis on mounted filesystems.
NASA Astrophysics Data System (ADS)
Fuhrmann, Patrick; Gasthuber, Martin; Kemp, Yves; Ozerov, Dmitry
2012-12-01
We present results on different approaches to mounted filesystems in use or under investigation at DESY. dCache, long established as a storage system for physics data, has implemented the NFS v4.1/pNFS protocol. New performance results will be shown with the most current version of the dCache server. In addition to the native usage of the mounted filesystem in a LAN environment, results are given for the performance of dCache NFS v4.1/pNFS in the WAN case. Several commercial vendors are currently in the alpha or beta phase of adding the NFS v4.1/pNFS protocol to their storage appliances. We will test some of these vendor solutions for their readiness for HEP analysis. DESY has recently purchased an IBM Sonas system. We will present the results of a thorough performance evaluation using the native protocols NFS (v3 or v4) and GPFS. As the emphasis is on usability for end user analysis, we will use the latest ROOT versions and current end user analysis code for the benchmark scenarios.
EPA/ECLSS consumables analyses for the Spacelab 1 flight
NASA Technical Reports Server (NTRS)
Steines, G. J.; Pipher, M. D.
1976-01-01
The results of electrical power system (EPS) and environmental control/life support system (ECLSS) consumables analyses of the Spacelab 1 mission are presented. The analyses were performed to assess the capability of the orbiter systems to support the proposed mission and to establish the various non propulsive consumables requirements. The EPS analysis was performed using the shuttle electrical power system (SEPS) analysis computer program. The ECLSS analysis was performed using the shuttle environmental consumables requirements evaluation tool (SECRET) program.
Optical Performance Of The Gemini Carbon Dioxide Laser Fusion System
NASA Astrophysics Data System (ADS)
Viswanathan, V. K.; Hayden, J. J.; Liberman, I.
1980-11-01
The performance of the Gemini two-beam carbon dioxide laser fusion system was recently upgraded by installation of optical components with improved quality in the final amplifier. A theoretical analysis was conducted in conjunction with measurements of the new performance. The analysis and experimental procedures and the results obtained are reported and compared. Good agreement was found, within the uncertainties of the analysis and the inaccuracies of the experiments. The focal spot Strehl ratio was between 0.24 and 0.3 for both beams.
Single-Cell RNA-Sequencing: Assessment of Differential Expression Analysis Methods.
Dal Molin, Alessandra; Baruzzo, Giacomo; Di Camillo, Barbara
2017-01-01
The sequencing of the transcriptomes of single cells, or single-cell RNA-sequencing, has now become the dominant technology for the identification of novel cell types and for the study of stochastic gene expression. In recent years, various tools for analyzing single-cell RNA-sequencing data have been proposed, many of them with the purpose of performing differential expression analysis. In this work, we compare four different tools for single-cell RNA-sequencing differential expression, together with two popular methods originally developed for the analysis of bulk RNA-sequencing data but largely applied to single-cell data. We discuss results obtained on two real and one synthetic dataset, along with considerations about the perspectives of single-cell differential expression analysis. In particular, we explore the methods' performance in four different scenarios, mimicking different unimodal or bimodal distributions of the data, as characteristic of single-cell transcriptomics. We observed marked differences between the selected methods in terms of precision and recall, the number of detected differentially expressed genes and the overall performance. Globally, the results obtained in our study suggest that it is difficult to identify a best performing tool and that efforts are needed to improve the methodologies for single-cell RNA-sequencing data analysis and to gain better accuracy of results.
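The precision/recall comparison used to benchmark the tools reduces to comparing each tool's set of differential expression calls against the genes known to be differentially expressed in the synthetic data, as in the toy sketch below (the gene sets are made up).

```python
# Precision/recall of differential expression (DE) calls against a known truth
# set. The truth set and per-tool call sets below are made-up examples.
truly_de = {"geneA", "geneB", "geneC", "geneD"}
calls = {
    "tool1": {"geneA", "geneB", "geneX"},
    "tool2": {"geneA", "geneB", "geneC", "geneD", "geneX", "geneY"},
}

for tool, called in calls.items():
    tp = len(called & truly_de)
    precision = tp / len(called) if called else 0.0
    recall = tp / len(truly_de)
    print(f"{tool}: precision = {precision:.2f}, recall = {recall:.2f}")
```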
Convective Array Cooling for a Solar Powered Aircraft
NASA Technical Reports Server (NTRS)
Colozza, Anthony J.; Dolce, James (Technical Monitor)
2003-01-01
A general characteristic of photovoltaics is that they increase in efficiency as their operating temperature decreases. Based on this principle, the ability to increase a solar aircraft's performance by cooling the solar cells was examined. The solar cells were cooled by channeling some air underneath the cells, providing a convective cooling path to the back side of the array. A full energy balance and flow analysis of the air within the cooling passage was performed. The analysis was first performed at a preliminary level to estimate the benefits of the cooling passage and established a clear benefit. Based on these results, a more detailed analysis was performed, from which cell temperatures were calculated and array output power throughout a day period was determined with and without the cooling passage. The results showed that if the flow through the cooling passage remained laminar, then the benefit in increased output power more than offset the drag induced by the cooling passage.
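The kind of energy balance described above can be sketched for a single array element: absorbed solar flux is balanced against electrical extraction, convection to the channel air, and radiation, with cell efficiency assumed to fall linearly with temperature. Every coefficient in the sketch is an assumption for illustration, not a value from the study.

```python
# Simplified steady-state energy balance for one array element over a cooling
# passage. All coefficient values are assumptions for illustration only.
sigma = 5.67e-8        # Stefan-Boltzmann constant, W/m^2/K^4
G, alpha = 950.0, 0.85 # incident flux (W/m^2) and absorptance
h = 25.0               # assumed convective coefficient in the passage, W/m^2/K
T_air, T_sky = 260.0, 230.0
eta_ref, beta, T_ref = 0.17, 0.0045, 298.15   # reference efficiency and temp. coefficient

def residual(T):
    q_elec = G * eta_ref * (1.0 - beta * (T - T_ref))   # electrical power extracted
    return alpha * G - q_elec - h * (T - T_air) - sigma * (T**4 - T_sky**4)

# Simple bisection for the equilibrium cell temperature (residual changes sign).
lo, hi = 200.0, 400.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if residual(lo) * residual(mid) <= 0:
        hi = mid
    else:
        lo = mid
T_cell = 0.5 * (lo + hi)
eta = eta_ref * (1.0 - beta * (T_cell - T_ref))
print(f"equilibrium cell temperature ~ {T_cell:.1f} K, efficiency ~ {eta:.3f}")
```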
An Integrated Low-Speed Performance and Noise Prediction Methodology for Subsonic Aircraft
NASA Technical Reports Server (NTRS)
Olson, E. D.; Mavris, D. N.
2000-01-01
An integrated methodology has been assembled to compute the engine performance, takeoff and landing trajectories, and community noise levels for a subsonic commercial aircraft. Where feasible, physics-based noise analysis methods have been used to make the results more applicable to newer, revolutionary designs and to allow for a more direct evaluation of new technologies. The methodology is intended to be used with approximation methods and risk analysis techniques to allow for the analysis of a greater number of variable combinations while retaining the advantages of physics-based analysis. Details of the methodology are described and limited results are presented for a representative subsonic commercial aircraft.
Solar energy system economic evaluation: Fern Tunkhannock, Tunkhannock, Pennsylvania
NASA Astrophysics Data System (ADS)
1980-09-01
The economic performance of an Operational Test Site (OTS) is described. The long term economic performance of the system at its installation site and extrapolation to four additional selected locations to demonstrate the viability of the design over a broad range of environmental and economic conditions is reported. Topics discussed are: system description, study approach, economic analysis and system optimization, and technical and economical results of analysis. Data for the economic analysis are generated through evaluation of the OTS. The simulation is based on the technical results of the seasonal report simulation. In addition localized and standard economic parameters are used for economic analysis.
Solar energy system economic evaluation: Fern Tunkhannock, Tunkhannock, Pennsylvania
NASA Technical Reports Server (NTRS)
1980-01-01
The economic performance of an Operational Test Site (OTS) is described. The long term economic performance of the system at its installation site and extrapolation to four additional selected locations to demonstrate the viability of the design over a broad range of environmental and economic conditions is reported. Topics discussed are: system description, study approach, economic analysis and system optimization, and technical and economical results of analysis. Data for the economic analysis are generated through evaluation of the OTS. The simulation is based on the technical results of the seasonal report simulation. In addition localized and standard economic parameters are used for economic analysis.
Material nonlinear analysis via mixed-iterative finite element method
NASA Technical Reports Server (NTRS)
Sutjahjo, Edhi; Chamis, Christos C.
1992-01-01
The performance of elastic-plastic mixed-iterative analysis is examined through a set of convergence studies. Membrane and bending behaviors are tested using 4-node quadrilateral finite elements. The membrane results are excellent, indicating that the implementation of elastic-plastic mixed-iterative analysis is appropriate. On the other hand, further research to improve the bending performance of the method appears to be warranted.
Performance testing and analysis results of AMTEC cells for space applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borkowski, C.A.; Barkan, A.; Hendricks, T.J.
1998-01-01
Testing and analysis have shown that AMTEC (Alkali Metal Thermal to Electric Conversion) (Weber, 1974) cells can reach the performance (power) levels required by a variety of space applications. The performance of an AMTEC cell is highly dependent on the thermal environment to which it is subjected. A guard heater assembly has been designed, fabricated, and used to expose individual AMTEC cells to various thermal environments. The design and operation of the guard heater assembly are discussed. Performance test results of an AMTEC cell operated under guard-heated conditions to simulate an adiabatic cell wall thermal environment are presented. Experimental data and analytic model results are compared to illustrate validation of the model. © 1998 American Institute of Physics.
Tao, Lingyan; Zhang, Qing; Wu, Yongjiang; Liu, Xuesong
2016-12-01
In this study, a fast and effective high-performance liquid chromatography method was developed to obtain a fingerprint chromatogram and, simultaneously, quantitative analysis of four index compounds (gallic acid, chlorogenic acid, albiflorin, and paeoniflorin) of the traditional Chinese medicine Moluodan Concentrated Pill. The method was performed using a Waters X-bridge C18 reversed-phase column on an Agilent 1200S high-performance liquid chromatography system coupled with diode array detection. The mobile phase was composed of 20 mmol/L phosphate solution and acetonitrile at a flow rate of 1 mL/min, at a temperature of 30°C and a UV detection wavelength of 254 nm. After the methodology validation, 16 batches of Moluodan Concentrated Pill were analyzed by this high-performance liquid chromatography method, and both qualitative and quantitative evaluation results were achieved by similarity analysis, principal component analysis, and hierarchical cluster analysis. The results of these three chemometric methods were in good agreement and all indicated that batch 10 and batch 16 showed significant differences from the other 14 batches. This suggests that the developed high-performance liquid chromatography method could be applied in the quality evaluation of Moluodan Concentrated Pill. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
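A minimal sketch of the chemometric step described above (PCA plus hierarchical cluster analysis on a batches-by-peak-areas fingerprint matrix); the random matrix stands in for the 16 batches, and all variable names are illustrative assumptions rather than the study's actual data.

```python
# Sketch: chemometric evaluation of HPLC fingerprints with PCA and
# hierarchical clustering on a batches x peak-areas matrix.
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
X = rng.normal(size=(16, 25))                   # 16 batches x 25 fingerprint peak areas

scores = PCA(n_components=2).fit_transform(X)   # PCA scores for visual grouping
Z = linkage(X, method="ward")                   # hierarchical cluster analysis
labels = fcluster(Z, t=2, criterion="maxclust") # split batches into 2 clusters
print(scores[:3])
print(labels)
```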
Nuclear reactor descriptions for space power systems analysis
NASA Technical Reports Server (NTRS)
Mccauley, E. W.; Brown, N. J.
1972-01-01
For the small, high-performance reactors required for space electric applications, adequate neutronic analysis is of crucial importance, but in terms of computational time consumed, nuclear calculations probably yield the least amount of detail for a mission analysis study. It has been found possible, after generating only a few designs of a reactor family in elaborate thermomechanical and nuclear detail, to use simple curve-fitting techniques to assure the desired neutronic performance while still performing the thermomechanical analysis in explicit detail. The resulting speed-up in computation time permits a broad, detailed examination of constraints by the mission analyst.
NASA Technical Reports Server (NTRS)
Steele, Glen; Lansdowne, Chatwin; Zucha, Joan; Schlensinger, Adam
2013-01-01
The Soft Decision Analyzer (SDA) is an instrument that combines hardware, firmware, and software to perform real-time, closed-loop, end-to-end statistical analysis of single- or dual-channel serial digital RF communications systems operating in very low signal-to-noise conditions. As an innovation, the unique SDA capabilities allow it to analyze situations where the receiving communication system slips bits due to low signal-to-noise conditions or experiences constellation rotations resulting in channel polarity inversions or channel assignment swaps. The SDA's closed-loop detection allows it to instrument a live system and correlate observations with frame, codeword, and packet losses, as well as Quality of Service (QoS) and Quality of Experience (QoE) events. The SDA's abilities are not confined to performing analysis in low signal-to-noise conditions. Its analysis provides in-depth insight into a communication system's receiver performance in a variety of operating conditions. The SDA incorporates two techniques for identifying slips. The first is an examination of the received data stream's content relative to the transmitted data content, and the second is a direct examination of the receiver's recovered clock signals relative to a reference. Both techniques provide benefits in different ways and give the communication engineer evaluating test results increased confidence in and understanding of receiver performance. Direct examination of data contents is performed by two different techniques, power correlation or a modified Massey correlation, and can be applied to soft decision data widths of 1 to 12 bits over a correlation depth ranging from 16 to 512 samples. The SDA detects receiver bit slips within a 4-bit window and can handle systems with up to four quadrants (QPSK, SQPSK, and BPSK systems). The SDA continuously monitors correlation results to characterize slips and quadrant changes and is capable of performing analysis even when the receiver under test is subjected to conditions where its performance degrades to high error rates (30 percent or beyond). The design incorporates a number of features, such as watchdog triggers, that permit the SDA system to recover from large receiver upsets automatically and continue accumulating performance analysis unaided by operator intervention. This accommodates tests that can last on the order of days in order to gain statistical confidence in results, and is also useful for capturing snapshots of rare events.
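A minimal sketch of the general idea behind correlation-based slip detection: slide the received decisions against the known transmitted pattern and take the offset with the highest agreement. The window size, error rate, and function names are illustrative assumptions, not the SDA's actual parameters or algorithms.

```python
# Sketch: estimate a bit-slip offset by correlating received decisions
# against the reference transmitted pattern.
import numpy as np

def best_offset(reference, received, max_slip=4):
    """Return the bit offset (within +/-max_slip) that best aligns the streams."""
    n = len(received)
    scores = {}
    for k in range(-max_slip, max_slip + 1):
        ref = np.roll(reference, k)[:n]
        scores[k] = np.mean(ref == received)   # fraction of agreeing bits
    return max(scores, key=scores.get), scores

rng = np.random.default_rng(1)
tx = rng.integers(0, 2, 512)
rx = np.roll(tx, 2)                            # simulate a 2-bit slip
rx[rng.random(512) < 0.05] ^= 1                # plus 5 percent bit errors
offset, _ = best_offset(tx, rx)
print("estimated slip:", offset)               # expected: 2
```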
NASA Technical Reports Server (NTRS)
Ruf, Joseph; Holt, James B.; Canabal, Francisco
1999-01-01
This paper presents the status of analyses on three Rocket Based Combined Cycle configurations underway in the Applied Fluid Dynamics Analysis Group (TD64). TD64 is performing computational fluid dynamics analysis on a Penn State RBCC test rig, the proposed Draco axisymmetric RBCC engine and the Trailblazer engine. The intent of the analysis on the Penn State test rig is to benchmark the Finite Difference Navier Stokes code for ejector mode fluid dynamics. The Draco engine analysis is a trade study to determine the ejector mode performance as a function of three engine design variables. The Trailblazer analysis is to evaluate the nozzle performance in scramjet mode. Results to date of each analysis are presented.
Buckling Testing and Analysis of Space Shuttle Solid Rocket Motor Cylinders
NASA Technical Reports Server (NTRS)
Weidner, Thomas J.; Larsen, David V.; McCool, Alex (Technical Monitor)
2002-01-01
A series of full-scale buckling tests were performed on the space shuttle Reusable Solid Rocket Motor (RSRM) cylinders. The tests were performed to determine the buckling capability of the cylinders and to provide data for analytical comparison. A nonlinear ANSYS Finite Element Analysis (FEA) model was used to represent and evaluate the testing. Analytical results demonstrated excellent correlation to test results, predicting the failure load within 5%. The analytical value was on the conservative side, predicting a lower failure load than was applied to the test. The resulting study and analysis indicated the important parameters for FEA to accurately predict buckling failure. The resulting method was subsequently used to establish the pre-launch buckling capability of the space shuttle system.
Safety Culture Assessment in Petrochemical Industry: A Comparative Study of Two Algerian Plants
Boughaba, Assia; Hassane, Chabane; Roukia, Ouddai
2014-01-01
Background To elucidate the relationship between safety culture maturity and the safety performance of a particular company. Methods To identify the factors that contribute to a safety culture, a survey questionnaire was created based mainly on the studies of Fernández-Muñiz et al. The survey was randomly distributed to 1000 employees of two oil companies and achieved a valid response rate of 51%. Minitab 16 software was used, and diverse tests, including descriptive statistical analysis, factor analysis, reliability analysis, mean analysis, and correlation, were used for the analysis of data. Ten factors were extracted using factor analysis to represent safety culture and safety performance. Results The results of this study showed that managers' commitment, training, incentives, communication, and employee involvement are the priority domains on which improvement efforts should be focused, as all of their descriptive average values were lower than 3.0 at Company B. Furthermore, the results also showed that safety culture influences the safety performance of the company: Company A, with a good safety culture (descriptive average values above 4.0), is more successful than Company B in terms of accident rates. Conclusion The comparison between the two petrochemical plants of the Sonatrach group confirms these results, in which Company A, whose managers are English and Norwegian and which distinguishes itself by the maturity of its safety culture, has significantly higher evaluations than Company B, which is staffed by Algerian personnel, in terms of safety management practices and safety performance. PMID:25180135
Thermodynamic Cycle and CFD Analyses for Hydrogen Fueled Air-breathing Pulse Detonation Engines
NASA Technical Reports Server (NTRS)
Povinelli, Louis A.; Yungster, Shaye
2002-01-01
This paper presents the results of a thermodynamic cycle analysis of a pulse detonation engine (PDE) using a hydrogen-air mixture at static conditions. The cycle performance results, namely the specific thrust, fuel consumption and impulse are compared to a single cycle CFD analysis for a detonation tube which considers finite rate chemistry. The differences in the impulse values were indicative of the additional performance potential attainable in a PDE.
NASA Technical Reports Server (NTRS)
Wells, C.; Kolkhorst, H. E.
1971-01-01
The consumables analysis was performed for the Skylab 2, 3, and 4 Preliminary Reference Interim Revision Flight Plan. The analysis and the results are based on the mission requirements as specified in the flight plan and on other available data. The results indicate that the consumables requirements for the Skylab missions allow for the following nominal remaining margins (in percent) of oxygen, nitrogen, and water: 83.5, 90.8, and 88.7 for mission SL-2; 57.1, 64.1, and 67.3 for SL-3; and 30.8, 44.3, and 46.5 for SL-4. Performance of experiment M509 as scheduled in the flight plan results in venting the cluster atmosphere overboard. This is due to the addition of nitrogen for propulsion and to the additional oxygen introduced into the cabin when the experiment is performed with the crewman suited.
Digital microarray analysis for digital artifact genomics
NASA Astrophysics Data System (ADS)
Jaenisch, Holger; Handley, James; Williams, Deborah
2013-06-01
We implement a Spatial Voting (SV) based analogy of microarray analysis for digital gene marker identification in malware code sections. We examine a well-known set of malware formally analyzed by Mandiant and code named Advanced Persistent Threat 1 (APT1). APT1 is a Chinese organization formed with the specific intent to infiltrate and exploit US resources. Mandiant provided a detailed behavior and string analysis report for the 288 malware samples available. We performed an independent analysis using a new alternative to traditional dynamic analysis and static analysis that we call Spatial Analysis (SA). We perform unsupervised SA on the APT1 malware code sections and report our findings. We also show the results of SA performed on some members of the families identified by Mandiant. We conclude that SV-based SA is a practical, fast alternative to dynamic analysis and static analysis.
Solnica, Bogdan
2009-01-01
In this issue of Journal of Diabetes Science and Technology, Chang and colleagues present the analytical performance evaluation of the OneTouch® UltraVue™ blood glucose meter. This device is an advanced construction with a color display, used-strip ejector, no-button interface, and short assay time. Accuracy studies were performed using a YSI 2300 analyzer, considered the reference. Altogether, 349 pairs of results covering a wide range of blood glucose concentrations were analyzed. Patients with diabetes performed a significant part of the tests. Obtained results indicate good accuracy of OneTouch UltraVue blood glucose monitoring system, satisfying the International Organization for Standardization recommendations and thereby locating >95% of tests within zone A of the error grid. Results of the precision studies indicate good reproducibility of measurements. In conclusion, the evaluation of the OneTouch UltraVue meter revealed good analytical performance together with convenient handling useful for self-monitoring of blood glucose performed by elderly diabetes patients. PMID:20144432
Solnica, Bogdan
2009-09-01
In this issue of Journal of Diabetes Science and Technology, Chang and colleagues present the analytical performance evaluation of the OneTouch UltraVue blood glucose meter. This device is an advanced construction with a color display, used-strip ejector, no-button interface, and short assay time. Accuracy studies were performed using a YSI 2300 analyzer, considered the reference. Altogether, 349 pairs of results covering a wide range of blood glucose concentrations were analyzed. Patients with diabetes performed a significant part of the tests. Obtained results indicate good accuracy of OneTouch UltraVue blood glucose monitoring system, satisfying the International Organization for Standardization recommendations and thereby locating >95% of tests within zone A of the error grid. Results of the precision studies indicate good reproducibility of measurements. In conclusion, the evaluation of the OneTouch UltraVue meter revealed good analytical performance together with convenient handling useful for self-monitoring of blood glucose performed by elderly diabetes patients. 2009 Diabetes Technology Society.
Statistical evaluation of vibration analysis techniques
NASA Technical Reports Server (NTRS)
Milner, G. Martin; Miller, Patrice S.
1987-01-01
An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
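A minimal sketch of the detection-performance quantification mentioned above, expressed as probability of detection versus probability of false alarm for a simple threshold detector; the simulated statistics and thresholds are assumptions for illustration, not the paper's data.

```python
# Sketch: Pd versus Pfa for a threshold detector applied to a detection statistic.
import numpy as np

rng = np.random.default_rng(0)
healthy = rng.normal(0.0, 1.0, 5000)        # detection statistic, no fault present
faulty = rng.normal(1.5, 1.0, 5000)         # detection statistic, fault present

for thresh in (0.5, 1.0, 1.5, 2.0):
    pfa = np.mean(healthy > thresh)         # probability of false alarm
    pd = np.mean(faulty > thresh)           # probability of detection
    print(f"threshold={thresh:.1f}  Pfa={pfa:.3f}  Pd={pd:.3f}")
```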
Millard, Heather A Towle; Millard, Ralph P; Constable, Peter D; Freeman, Lyn J
2014-02-01
To determine the relationships among traditional and laparoscopic surgical skills, spatial analysis skills, and video gaming proficiency of third-year veterinary students. Prospective, randomized, controlled study. A convenience sample of 29 third-year veterinary students. The students had completed basic surgical skills training with inanimate objects but had no experience with soft tissue, orthopedic, or laparoscopic surgery; the spatial analysis test; or the video games that were used in the study. Scores for traditional surgical, laparoscopic, spatial analysis, and video gaming skills were determined, and associations among these were analyzed by means of Spearman's rank order correlation coefficient (rs). A significant positive association (rs = 0.40) was detected between summary scores for video game performance and laparoscopic skills, but not between video game performance and traditional surgical skills scores. Spatial analysis scores were positively (rs = 0.30) associated with video game performance scores; however, that result was not significant. Spatial analysis scores were not significantly associated with laparoscopic surgical skills scores. Traditional surgical skills scores were not significantly associated with laparoscopic skills or spatial analysis scores. Results of this study indicated video game performance of third-year veterinary students was predictive of laparoscopic but not traditional surgical skills, suggesting that laparoscopic performance may be improved with video gaming experience. Additional studies would be required to identify methods for improvement of traditional surgical skills.
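A minimal sketch of the Spearman rank-order correlation used above to relate summary scores; the score vectors are made-up placeholders, not the study's measurements.

```python
# Sketch: Spearman's rank-order correlation (rs) between two score sets.
from scipy.stats import spearmanr

video_game_scores = [55, 72, 64, 80, 49, 91, 60, 77]
laparoscopic_scores = [60, 70, 62, 85, 50, 88, 58, 75]

rs, p_value = spearmanr(video_game_scores, laparoscopic_scores)
print(f"rs={rs:.2f}, p={p_value:.3f}")
```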
Determining team cognition from delay analysis using cross recurrence plot.
Hajari, Nasim; Cheng, Irene; Bin Zheng; Basu, Anup
2016-08-01
Team cognition is an important factor in evaluating and determining team performance. Forming a team with good shared cognition is even more crucial for laparoscopic surgery applications. In this study, we analyzed the eye tracking data of two surgeons during a simulated laparoscopic operation and then performed Cross Recurrence Analysis (CRA) on the recorded data to study the delay behaviour of good-performer and poor-performer teams. Dual eye tracking data for twenty-two dyad teams were recorded during a laparoscopic task, and the teams were then divided into good-performer and poor-performer teams based on task times. Finally, we studied the delay between the two team members for good- and poor-performer teams. The results indicated that good-performer teams show a smaller delay compared to poor-performer teams. This finding is consistent with gaze overlap analysis between team members and therefore provides good evidence of shared cognition between team members.
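A minimal sketch of one way to read a delay out of a cross recurrence computation for two gaze time series: mark the time pairs where the two signals fall within a radius of each other, then take the diagonal with the highest recurrence rate. The signals, radius, and lag range are illustrative assumptions, not the study's actual processing.

```python
# Sketch: cross recurrence matrix and delay estimate for two gaze traces.
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(500)
gaze_a = np.sin(0.05 * t) + 0.1 * rng.normal(size=t.size)
gaze_b = np.roll(gaze_a, 12)                 # member B lags member A by 12 samples

radius = 0.2
# Cross recurrence matrix: cr[i, j] = 1 when |a_i - b_j| < radius
cr = (np.abs(gaze_a[:, None] - gaze_b[None, :]) < radius).astype(int)

# Recurrence rate along each diagonal; the peak indicates the dominant delay
lags = list(range(-50, 51))
rates = [np.mean(np.diagonal(cr, offset=k)) for k in lags]
print("estimated delay:", lags[int(np.argmax(rates))])   # expected: about 12
```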
NASA Technical Reports Server (NTRS)
Ruf, Joseph H.; Holt, James B.; Canabal, Francisco
2001-01-01
This paper presents the status of analyses on three Rocket Based Combined Cycle (RBCC) configurations underway in the Applied Fluid Dynamics Analysis Group (TD64). TD64 is performing computational fluid dynamics (CFD) analysis on a Penn State RBCC test rig, the proposed Draco axisymmetric RBCC engine and the Trailblazer engine. The intent of the analysis on the Penn State test rig is to benchmark the Finite Difference Navier Stokes (FDNS) code for ejector mode fluid dynamics. The Draco analysis was a trade study to determine the ejector mode performance as a function of three engine design variables. The Trailblazer analysis is to evaluate the nozzle performance in scramjet mode. Results to date of each analysis are presented.
An Overview of the NIRA Status
NASA Technical Reports Server (NTRS)
Hughes, William
2003-01-01
The NASA Glenn Research Center (GRC) has been tasked by NASA JSC's ISS Payloads Office to perform the NIRA (Non-Isolated Rack Assessment) microgravity prediction analysis task for the International Space Station. Previously, the NIRA analysis task had been performed by Boeing/Houston. Boeing's last NIRA analysis was released in 1999 and was denoted "NIRA 99." GRC is currently close to completing its first full NIRA analysis (encompassing the frequency range from 0 to 50 Hz), to be released as "NIRA 2003." This presentation will focus on describing the NIRA analysis, the transition of this analysis task from Boeing to GRC, and the current status and schedule for release of the NIRA 2003 results. Additionally, the results obtained from a mini-NIRA analysis requested by ESA and completed by GRC in the spring of 2003 will be shown. This mini-analysis focused solely on predicting the microgravity environment at the COF-EPF (Columbus Orbiting Facility - External Payload Facility).
NUMERICAL FLOW AND TRANSPORT SIMULATIONS SUPPORTING THE SALTSTONE FACILITY PERFORMANCE ASSESSMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flach, G.
2009-02-28
The Saltstone Disposal Facility Performance Assessment (PA) is being revised to incorporate requirements of Section 3116 of the Ronald W. Reagan National Defense Authorization Act for Fiscal Year 2005 (NDAA), and updated data and understanding of vault performance since the 1992 PA (Cook and Fowler 1992) and related Special Analyses. A hybrid approach was chosen for modeling contaminant transport from vaults and future disposal cells to exposure points. A higher-resolution, largely deterministic analysis is performed on a best-estimate Base Case scenario using the PORFLOW numerical analysis code, and a few additional sensitivity cases are simulated to examine alternative scenarios and parameter settings. Stochastic analysis is performed on a simpler representation of the SDF system using the GoldSim code to estimate uncertainty and sensitivity about the Base Case. This report describes the development of PORFLOW models supporting the SDF PA and presents sample results to illustrate model behaviors and define impacts relative to key facility performance objectives. The SDF PA document, when issued, should be consulted for a comprehensive presentation of results.
A case study by life cycle assessment
NASA Astrophysics Data System (ADS)
Li, Shuyun
2017-05-01
This article aims to assess the potential environmental impact of an electrical grinder during its life cycle. The Life Cycle Inventory Analysis was conducted based on the Simplified Life Cycle Assessment (SLCA) drivers calculated from the Valuation of Social Cost and Simplified Life Cycle Assessment Model (VSSM). The detailed results for the LCI can be found in Appendix II. The Life Cycle Impact Assessment was performed based on the Eco-indicator 99 method. The analysis results identified the major contributor to the environmental impact, which accounts for over 60% of the overall SLCA output; of this, 60% of the emissions resulted from the logistics required for maintenance activities. This was determined by conducting a hotspot analysis. After performing a sensitivity analysis, it is evident that changing the fuel type results in a significant decrease in environmental footprint. An environmental benefit can also be seen in the negative output values of the recycling activities. By conducting the Life Cycle Assessment analysis, the potential environmental impact of the electrical grinder was investigated.
Parametric and experimental analysis using a power flow approach
NASA Technical Reports Server (NTRS)
Cuschieri, J. M.
1988-01-01
Having defined and developed a structural power flow approach for the analysis of structure-borne transmission of structural vibrations, the technique is used to perform an analysis of the influence of structural parameters on the transmitted energy. As a base for comparison, the parametric analysis is first performed using a Statistical Energy Analysis approach and the results compared with those obtained using the power flow approach. The advantages of using structural power flow are thus demonstrated by comparing the type of results obtained by the two methods. Additionally, to demonstrate the advantages of using the power flow method and to show that the power flow results represent a direct physical parameter that can be measured on a typical structure, an experimental investigation of structural power flow is also presented. Results are presented for an L-shaped beam for which an analytical solution has already been obtained. Furthermore, the various methods available to measure vibrational power flow are compared to investigate the advantages and disadvantages of each method.
Between Domain Cognitive Dispersion and Functional Abilities in Older Adults
Fellows, Robert P.; Schmitter-Edgecombe, Maureen
2016-01-01
Objective Within-person variability in cognitive performance is related to neurological integrity, but the association with functional abilities is less clear. The primary aim of this study was to examine the association between cognitive dispersion, or within-person variability, and everyday multitasking and the way in which these variables may influence performance on a naturalistic assessment of functional abilities. Method Participants were 156 community-dwelling adults, age 50 or older. Cognitive dispersion was calculated by measuring within-person variability in cognitive domains, established through principal components analysis. Path analysis was used to determine the independent contribution of cognitive dispersion to functional ability, mediated by multitasking. Results Results of the path analysis revealed that the number of subtasks interweaved (i.e., multitasked) mediated the association between cognitive dispersion and task sequencing and accuracy. Although increased multitasking was associated with worse task performance in the path model, secondary analyses revealed that for individuals with low cognitive dispersion, increased multitasking was associated with better task performance, whereas for those with higher levels of dispersion multitasking was negatively correlated with task performance. Conclusion These results suggest that cognitive dispersion between domains may be a useful indicator of multitasking and daily living skills among older adults. PMID:26300441
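A minimal sketch of a dispersion index of the kind used above: within-person variability across standardized cognitive domain scores. The domain scores are invented for illustration; the study's actual domains came from a principal components analysis.

```python
# Sketch: cognitive dispersion as the within-person SD of z-scored domain scores.
import numpy as np

# rows = participants, columns = cognitive domains (e.g., memory, attention, speed)
scores = np.array([
    [52.0, 48.0, 50.0],
    [61.0, 39.0, 55.0],
    [45.0, 47.0, 44.0],
])

z = (scores - scores.mean(axis=0)) / scores.std(axis=0)   # standardize each domain
dispersion = z.std(axis=1)                                # within-person SD across domains
print(dispersion)   # larger values indicate a more uneven cognitive profile
```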
Structural dynamics of shroudless, hollow fan blades with composite in-lays
NASA Technical Reports Server (NTRS)
Aiello, R. A.; Hirschbein, M. S.; Chamis, C. C.
1982-01-01
Structural and dynamic analyses are presented for a shroudless, hollow titanium fan blade proposed for future use in aircraft turbine engines. The blade was modeled and analyzed using the composite blade structural analysis computer program (COBSTRAN); an integrated program consisting of mesh generators, composite mechanics codes, NASTRAN, and pre- and post-processors. Vibration and impact analyses are presented. The vibration analysis was conducted with COBSTRAN. Results show the effect of the centrifugal force field on frequencies, twist, and blade camber. Bird impact analysis was performed with the multi-mode blade impact computer program. This program uses the geometric model and modal analysis from the COBSTRAN vibration analysis to determine the gross impact response of the fan blades to bird strikes. The structural performance of this blade is also compared to a blade of similar design but with composite in-lays on the outer surface. Results show that the composite in-lays can be selected (designed) to substantially modify the mechanical performance of the shroudless, hollow fan blade.
NASA Technical Reports Server (NTRS)
Farley, Gary L.; Seshadri, Banavara R.
2005-01-01
An analysis-based investigation of aluminum single- and multi-hole specimens selectively reinforced with metal matrix composite was performed, and the results were compared with results from geometrically comparable non-reinforced specimens. All reinforced specimens exhibited a significant increase in performance; increases of up to 170 percent were achieved. Specimen failure modes were consistent with results from reinforced polymeric matrix composite specimens. Localized (circular) reinforcement application proved as effective as broader-area (strip) reinforcement. Selective reinforcement is also an excellent method of increasing the performance of multi-hole specimens.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fuches, J.L.
1998-02-01
The Nuclear Regulatory Commission's (NRC) strategic plan [NUREG-1614, Vol. 1, September 1997] establishes a strategic framework that will guide future decision-making and will help the NRC continue to meet its responsibility for protecting public health and safety, promoting the common defense and security, and protecting the environment. This performance plan complements the agency's strategic plan by setting annual goals with measurable target levels of performance for FY 1999, as required by the Government Performance and Results Act. No significant contribution was made to the preparation of the performance plan by any non-Federal entity; however, a contractor was used to help facilitate discussions and resolution of issues. Within six months after the close of FY 1999, the NRC will submit to the President and the Congress a report on program performance for FY 1999. This performance report will review the success of the agency in achieving the performance goals established for FY 1999. Where those goals have been achieved, the underlying assumptions and strategies will be examined to ensure that continued applicability is warranted in the future. If any of the FY 1999 performance goals are not met, the agency will conduct a thorough analysis of why it did not meet the goal and of the actions necessary to meet the goal in the future. One result of this analysis will be the documentation of plans and schedules for achieving the established performance goal. If the analysis should indicate that the performance goal is impractical or infeasible, the performance report will document why that is the case and what action is recommended.
Magnetic resonance angiography for the nonpalpable testis: a cost and cancer risk analysis.
Eggener, S E; Lotan, Y; Cheng, E Y
2005-05-01
For the unilateral nonpalpable testis, standard management is open surgical or laparoscopic exploration. An ideal imaging technique would reliably identify testicular nubbins and safely allow children to forgo surgical exploration without compromising future health or fertility. Our goal was to perform a cost and risk analysis of magnetic resonance angiography (MRA) for unilateral nonpalpable cryptorchid testes. A search of the English medical literature revealed 3 studies addressing the usefulness of MRA for the nonpalpable testicle. We performed a meta-analysis and applied the results to a hypothetical set of patients using historical testicular localization data. Analysis was then performed using 3 different management protocols: MRA with removal of testicular nubbin tissue, MRA with observation of testicular nubbin tissue, and diagnostic laparoscopy. A cancer risk and cost analysis was then performed. MRA with observation of testicular nubbin tissue results in 29% of patients avoiding surgery without any increased cost of care. Among the 29% of boys with testicular nubbins left in situ and observed, the highest estimated risk was 1 in 300 of cancer developing and 1 in 5,300 of dying of cancer. A protocol using MRA with observation of inguinal nubbins results in nearly a third of boys avoiding surgical intervention at a cost similar to standard care, without any significant increased risk of development of testis cancer.
Preliminary basic performance analysis of the Cedar multiprocessor memory system
NASA Technical Reports Server (NTRS)
Gallivan, K.; Jalby, W.; Turner, S.; Veidenbaum, A.; Wijshoff, H.
1991-01-01
Some preliminary basic results on the performance of the Cedar multiprocessor memory system are presented. Empirical results are presented and used to calibrate a memory system simulator which is then used to discuss the scalability of the system.
A Multidimensional Analysis Tool for Visualizing Online Interactions
ERIC Educational Resources Information Center
Kim, Minjeong; Lee, Eunchul
2012-01-01
This study proposes and verifies the performance of an analysis tool for visualizing online interactions. A review of the most widely used methods for analyzing online interactions, including quantitative analysis, content analysis, and social network analysis methods, indicates these analysis methods have some limitations resulting from their…
Probabilistic Finite Element Analysis & Design Optimization for Structural Designs
NASA Astrophysics Data System (ADS)
Deivanayagam, Arumugam
This study focuses on implementing the probabilistic nature of material properties (Kevlar® 49) in the existing deterministic finite element analysis (FEA) of a fabric-based engine containment system through Monte Carlo simulations (MCS), and on the implementation of probabilistic analysis in engineering designs through Reliability Based Design Optimization (RBDO). First, the emphasis is on experimental data analysis, focusing on probabilistic distribution models that characterize the randomness associated with the experimental data. The material properties of Kevlar® 49 are modeled using experimental data analysis and implemented along with an existing spiral modeling scheme (SMS) and a user-defined constitutive model (UMAT) for fabric-based engine containment simulations in LS-DYNA. MCS of the model are performed to observe the failure pattern and exit velocities of the models, and the solutions are compared with NASA experimental tests and deterministic results. MCS with probabilistic material data give a better perspective on results than a single deterministic simulation. The next part of the research is to implement the probabilistic material properties in engineering designs. The main aim of structural design is to obtain optimal solutions. However, in a deterministic optimization problem, even though the structures are cost effective, they become highly unreliable if the uncertainty that may be associated with the system (material properties, loading, etc.) is not represented or considered in the solution process. A reliable and optimal solution can be obtained by performing reliability optimization along with the deterministic optimization, which is RBDO. In the RBDO problem formulation, in addition to structural performance constraints, reliability constraints are also considered. This part of the research starts with an introduction to reliability analysis, such as first-order and second-order reliability analysis, followed by simulation techniques that are performed to obtain the probability of failure and reliability of structures. Next, a decoupled RBDO procedure is proposed with a new reliability analysis formulation with sensitivity analysis, which is performed to remove the highly reliable constraints in the RBDO, thereby reducing the computational time and function evaluations. Finally, the implementation of the reliability analysis concepts and RBDO in finite element 2D truss problems and a planar beam problem is presented and discussed.
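A minimal sketch of the Monte Carlo reliability step described above: sample a random capacity and a random demand, evaluate a limit-state function, and estimate the probability of failure. The distributions and limit state are assumptions for illustration, not the Kevlar® 49 data or the study's FEA model.

```python
# Sketch: Monte Carlo estimate of probability of failure for a simple limit state.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
strength = rng.normal(loc=500.0, scale=40.0, size=n)   # capacity samples
load = rng.normal(loc=380.0, scale=50.0, size=n)       # demand samples

g = strength - load                 # limit-state function: failure when g < 0
pf = np.mean(g < 0.0)               # probability of failure
beta = np.mean(g) / np.std(g)       # reliability index for a normally distributed g
print(f"Pf ~ {pf:.4f}, beta ~ {beta:.2f}")
```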
Analysis of the Department of Defense Pre-Award Contracting Process
2014-12-01
... meets desired results. Results-based performance measurement establishes key performance indicators (KPIs) that determine whether procurement ... or goals, and underlying business processes (Cullen, 2009, p. 38). Within each quadrant, Cullen provided examples of KPIs that serve to measure ...
Azadeh, Ali; Sheikhalishahi, Mohammad
2014-01-01
Background A unique framework for performance optimization of generation companies (GENCOs) based on health, safety, environment, and ergonomics (HSEE) indicators is presented. Methods To rank this sector of industry, a combination of data envelopment analysis (DEA), principal component analysis (PCA), and Taguchi methods is used for all branches of the GENCOs. These methods are applied in an integrated manner to measure GENCO performance. The preferred model among DEA, PCA, and Taguchi is selected based on sensitivity analysis and maximum correlation between rankings. To achieve the stated objectives, noise is introduced into the input data. Results The results show that Taguchi outperforms the other methods. Moreover, a comprehensive experiment is carried out to identify the most influential factor for ranking GENCOs. Conclusion The approach developed in this study could be used for continuous assessment and improvement of GENCOs' performance in supplying energy with respect to HSEE factors. The results of such studies would help managers gain a better understanding of weak and strong points in terms of HSEE factors. PMID:26106505
Niu, Guanghui; Shi, Qi; Xu, Mingjun; Lai, Hongjun; Lin, Qingyu; Liu, Kunping; Duan, Yixiang
2015-10-01
In this article, a novel and alternative method of laser-induced breakdown spectroscopy (LIBS) analysis for liquid samples is proposed, which involves the removal of metal ions from a liquid to a solid substrate using a cost-efficient adsorbent, dehydrated carbon, obtained via a dehydration reaction. Using this new technique, researchers can detect trace metal ions in solutions qualitatively and quantitatively, and the drawbacks of performing liquid analysis using LIBS can be avoided because the analysis is performed on a solid surface. To achieve better performance with this technique, we considered parameters potentially influencing both adsorption performance and LIBS analysis. The calibration curves were evaluated, and the limits of detection obtained for Cu(2+), Pb(2+), and Cr(3+) were 0.77, 0.065, and 0.46 mg/L, respectively, which are better than those reported in previous studies. In addition, compared to other adsorbents, the adsorbent used in this technique is much cheaper, easier to obtain, and contains few or no elements other than C, H, and O that could result in spectral interference during analysis. We also used the recommended method to analyze spiked samples, obtaining satisfactory results. Thus, this new technique is helpful and promising for use in wastewater analysis and management.
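A minimal sketch of building a calibration curve and estimating a limit of detection with the common 3*sigma/slope convention; the concentrations, intensities, and blank standard deviation below are fabricated placeholders, not the reported data.

```python
# Sketch: linear calibration curve and LOD = 3*sigma_blank / slope.
import numpy as np

conc = np.array([0.0, 0.5, 1.0, 2.0, 5.0, 10.0])        # mg/L standards
intensity = np.array([102.0, 148.0, 201.0, 298.0, 605.0, 1110.0])

slope, intercept = np.polyfit(conc, intensity, 1)        # linear calibration fit
blank_sd = 4.0                                           # SD of repeated blank readings (assumed)
lod = 3.0 * blank_sd / slope
print(f"slope={slope:.1f}, intercept={intercept:.1f}, LOD={lod:.3f} mg/L")
```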
Predictability of Brayton electric power system performance
NASA Technical Reports Server (NTRS)
Klann, J. L.; Hettel, H. J.
1972-01-01
Data from the first tests of the 2- to 15-kilowatt space power system in a vacuum chamber were compared with predictions of both a pretest analysis and a modified version of that analysis. The pretest analysis predicted test results with differences of no more than 9 percent of the largest measured value for each quantity. The modified analysis correlated measurements. Differences in conversion efficiency and power output were no greater than plus or minus 2.5 percent. This modified analysis was used to project space performance maps for the current test system.
A feasibility study on age-related factors of wrist pulse using principal component analysis.
Jang-Han Bae; Young Ju Jeon; Sanghun Lee; Jaeuk U Kim
2016-08-01
Various analysis methods for examining wrist pulse characteristics are needed for accurate pulse diagnosis. In this feasibility study, principal component analysis (PCA) was performed to observe age-related factors of the wrist pulse from various analysis parameters. Forty subjects in their 20s and 40s participated, and their wrist pulse signal and respiration signal were acquired with a pulse tonometric device. After pre-processing of the signals, twenty analysis parameters regarded as values reflecting pulse characteristics were calculated and PCA was performed. As a result, we could reduce the complex parameters to a lower dimension, and age-related factors of the wrist pulse were observed by combining new analysis parameters derived from PCA. These results demonstrate that PCA can be a useful tool for analyzing wrist pulse signals.
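A minimal sketch of the dimension-reduction step described above: standardize a subjects-by-parameters matrix, run PCA, and compare the leading component scores between the two age groups. The random matrix stands in for the 40 subjects and 20 parameters; the data and group coding are assumptions.

```python
# Sketch: PCA on a subjects x pulse-parameters matrix, compared across age groups.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
X = rng.normal(size=(40, 20))                 # 40 subjects x 20 pulse parameters
age_group = np.array([0] * 20 + [1] * 20)     # 0 = twenties, 1 = forties

pca = PCA(n_components=3)
scores = pca.fit_transform((X - X.mean(0)) / X.std(0))
print(pca.explained_variance_ratio_)
print(scores[age_group == 0].mean(0), scores[age_group == 1].mean(0))
```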
NASA Astrophysics Data System (ADS)
Tran, Trang; Tran, Huy; Mansfield, Marc; Lyman, Seth; Crosman, Erik
2018-03-01
Four-dimensional data assimilation (FDDA) was applied in WRF-CMAQ model sensitivity tests to study the impact of observational and analysis nudging on model performance in simulating inversion layers and O3 concentration distributions within the Uintah Basin, Utah, U.S.A. in winter 2013. Observational nudging substantially improved WRF model performance in simulating surface wind fields, correcting a 10 °C warm surface temperature bias, correcting overestimation of the planetary boundary layer height (PBLH) and correcting underestimation of inversion strengths produced by regular WRF model physics without nudging. However, the combined effects of poor performance of WRF meteorological model physical parameterization schemes in simulating low clouds, and warm and moist biases in the temperature and moisture initialization and subsequent simulation fields, likely amplified the overestimation of warm clouds during inversion days when observational nudging was applied, impacting the resulting O3 photochemical formation in the chemistry model. To reduce the impact of a moist bias in the simulations on warm cloud formation, nudging with the analysis water mixing ratio above the planetary boundary layer (PBL) was applied. However, due to poor analysis vertical temperature profiles, applying analysis nudging also increased the errors in the modeled inversion layer vertical structure compared to observational nudging. Combining both observational and analysis nudging methods resulted in unrealistically extreme stratified stability that trapped pollutants at the lowest elevations at the center of the Uintah Basin and yielded the worst WRF performance in simulating inversion layer structure among the four sensitivity tests. The results of this study illustrate the importance of carefully considering the representativeness and quality of the observational and model analysis data sets when applying nudging techniques within stable PBLs, and the need to evaluate model results on a basin-wide scale.
Comparison of a quasi-3D analysis and experimental performance for three compact radial turbines
NASA Technical Reports Server (NTRS)
Simonyi, P. S.; Boyle, R. J.
1991-01-01
An experimental aerodynamic evaluation of three compact radial turbine builds was performed. Two rotors which were 40-50 percent shorter in axial length than conventional state-of-the-art radial rotors were tested. A single nozzle design was used. One rotor was tested with the nozzle at two stagger angle settings. A second rotor was tested with the nozzle in only the closed down setting. Experimental results were compared to predicted results from a quasi-3D inviscid and boundary layer analysis, called MTSB (Meridl/Tsonic/Blayer). This analysis was used to predict turbine performance. It has previously been calibrated only for axial, not radial, turbomachinery. The predicted and measured efficiencies were compared at the design point for the three turbines. At the design points the analysis overpredicted the efficiency by less than 1.7 points. Comparisons were also made at off-design operating points. The results of these comparisons showed the importance of an accurate clearance model for efficiency predictions and also that there are deficiencies in the incidence loss model used.
Comparison of a quasi-3D analysis and experimental performance for three compact radial turbines
NASA Technical Reports Server (NTRS)
Simonyi, P. S.; Boyle, R. J.
1991-01-01
An experimental aerodynamic evaluation of three compact radial turbine builds was performed. Two rotors which were 40 to 50 percent shorter in axial length than conventional state-of-the-art radial rotors were tested. A single nozzle design was used. One rotor was tested with the nozzle at two stagger angle settings. A second rotor was tested with the nozzle in only the closed-down setting. Experimental results were compared to predicted results from a quasi-3D inviscid and boundary layer analysis, called Meridl/Tsonic/Blayer (MTSB). This analysis was used to predict turbine performance. It had previously been calibrated only for axial, not radial, turbomachinery. The predicted and measured efficiencies were compared at the design point for the three turbines. At the design points the analysis overpredicted the efficiency by less than 1.7 points. Comparisons were also made at off-design operating points. The results of these comparisons showed the importance of an accurate clearance model for efficiency predictions and also that there are deficiencies in the incidence loss model used.
Tank 241-B-108, cores 172 and 173 analytical results for the final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nuzum, J.L., Fluor Daniel Hanford
1997-03-04
The Data Summary Table (Table 3) included in this report compiles analytical results in compliance with all applicable DQOs. Liquid subsamples that were prepared for analysis by an acid adjustment of the direct subsample are indicated by a 'D' in the A column in Table 3. Solid subsamples that were prepared for analysis by performing a fusion digest are indicated by an 'F' in the A column in Table 3. Solid subsamples that were prepared for analysis by performing a water digest are indicated by a 'W' or an 'I' in the A column of Table 3. Due to poor precision and accuracy in the original analysis of both Lower Half Segment 2 of Core 173 and the core composite of Core 173, fusion and water digests were performed a second time. Precision and accuracy improved with the re-preparation of the Core 173 composite. Analyses with the re-preparation of Lower Half Segment 2 of Core 173 did not show improvement and suggest sample heterogeneity. Results from both preparations are included in Table 3.
NASA Astrophysics Data System (ADS)
Ohlídal, Ivan; Vohánka, Jiří; Čermák, Martin; Franta, Daniel
2017-10-01
The modification of the effective medium approximation for randomly microrough surfaces covered by very thin overlayers based on inhomogeneous fictitious layers is formulated. The numerical analysis of this modification is performed using simulated ellipsometric data calculated using the Rayleigh-Rice theory. The system used to perform this numerical analysis consists of a randomly microrough silicon single crystal surface covered with a SiO2 overlayer. A comparison to the effective medium approximation based on homogeneous fictitious layers is carried out within this numerical analysis. For ellipsometry of the system mentioned above the possibilities and limitations of both the effective medium approximation approaches are discussed. The results obtained by means of the numerical analysis are confirmed by the ellipsometric characterization of two randomly microrough silicon single crystal substrates covered with native oxide overlayers. It is shown that the effective medium approximation approaches for this system exhibit strong deficiencies compared to the Rayleigh-Rice theory. The practical consequences implied by these results are presented. The results concerning the random microroughness are verified by means of measurements performed using atomic force microscopy.
Simulation for analysis and control of superplastic forming. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zacharia, T.; Aramayo, G.A.; Simunovic, S.
1996-08-01
A joint study was conducted by Oak Ridge National Laboratory (ORNL) and the Pacific Northwest Laboratory (PNL) for the U.S. Department of Energy Lightweight Materials (DOE-LWM) Program. The purpose of the study was to assess and benchmark current modeling capabilities with respect to accuracy of predictions and simulation time. Two simulation platforms were considered in this study: the LS-DYNA3D code installed on ORNL's high-performance computers and the finite element code MARC used at PNL. Both ORNL and PNL performed superplastic forming (SPF) analysis on a standard butter-tray geometry, which was defined by PNL, to better understand the capabilities of the respective models. The specific geometry was selected and formed at PNL, and the experimental results, such as forming time and thickness at specific locations, were provided for comparison with numerical predictions. Furthermore, comparisons between the ORNL simulation results, using elasto-plastic analysis, and PNL's results, using rigid-plastic flow analysis, were performed.
NASA Technical Reports Server (NTRS)
Davidian, Kenneth J.; Dieck, Ronald H.; Chuang, Isaac
1987-01-01
A preliminary uncertainty analysis was performed for the High Area Ratio Rocket Nozzle test program which took place at the altitude test capsule of the Rocket Engine Test Facility at the NASA Lewis Research Center. Results from the study establish the uncertainty of measured and calculated parameters required for the calculation of rocket engine specific impulse. A generalized description of the uncertainty methodology used is provided. Specific equations and a detailed description of the analysis is presented. Verification of the uncertainty analysis model was performed by comparison with results from the experimental program's data reduction code. Final results include an uncertainty for specific impulse of 1.30 percent. The largest contributors to this uncertainty were calibration errors from the test capsule pressure and thrust measurement devices.
NASA Technical Reports Server (NTRS)
Davidian, Kenneth J.; Dieck, Ronald H.; Chuang, Isaac
1987-01-01
A preliminary uncertainty analysis has been performed for the High Area Ratio Rocket Nozzle test program which took place at the altitude test capsule of the Rocket Engine Test Facility at the NASA Lewis Research Center. Results from the study establish the uncertainty of measured and calculated parameters required for the calculation of rocket engine specific impulse. A generalized description of the uncertainty methodology used is provided. Specific equations and a detailed description of the analysis are presented. Verification of the uncertainty analysis model was performed by comparison with results from the experimental program's data reduction code. Final results include an uncertainty for specific impulse of 1.30 percent. The largest contributors to this uncertainty were calibration errors from the test capsule pressure and thrust measurement devices.
2012-01-01
Background Cognitive deficits and multiple psychoactive drug regimens are both common in patients treated for opioid-dependence. Therefore, we examined whether the cognitive performance of patients in opioid-substitution treatment (OST) is associated with their drug treatment variables. Methods Opioid-dependent patients (N = 104) who were treated either with buprenorphine or methadone (n = 52 in both groups) were given attention, working memory, verbal, and visual memory tests after they had been a minimum of six months in treatment. Group-wise results were analysed by analysis of variance. Predictors of cognitive performance were examined by hierarchical regression analysis. Results Buprenorphine-treated patients performed statistically significantly better in a simple reaction time test than methadone-treated ones. No other significant differences between groups in cognitive performance were found. In each OST drug group, approximately 10% of the attention performance could be predicted by drug treatment variables. Use of benzodiazepine medication predicted about 10% of performance variance in working memory. Treatment with more than one other psychoactive drug (than opioid or BZD) and frequent substance abuse during the past month predicted about 20% of verbal memory performance. Conclusions Although this study does not prove a causal relationship between multiple prescription drug use and poor cognitive functioning, the results are relevant for psychosocial recovery, vocational rehabilitation, and psychological treatment of OST patients. Especially for patients with BZD treatment, other treatment options should be actively sought. PMID:23121989
Effects of Small-Group Learning on Transfer: A Meta-Analysis
ERIC Educational Resources Information Center
Pai, Hui-Hua; Sears, David A.; Maeda, Yukiko
2015-01-01
This study investigated the potential benefit of small-group learning on transfer performance using the method of meta-analysis. Results showed positive support for the hypothesis that small-group learning can increase students' transfer performance (average effect size of 0.30). Unlike reviews of effects of cooperation on learning, this…
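A minimal sketch of the basic computation behind a summary effect size such as the 0.30 reported above: an inverse-variance weighted average across studies. The per-study effect sizes and variances are hypothetical, not values from the meta-analysis.

```python
# Sketch: fixed-effect (inverse-variance weighted) mean effect size.
import numpy as np

effect_sizes = np.array([0.45, 0.20, 0.35, 0.10, 0.40])   # per-study standardized effects
variances = np.array([0.02, 0.05, 0.03, 0.04, 0.06])      # per-study sampling variances

weights = 1.0 / variances
mean_effect = np.sum(weights * effect_sizes) / np.sum(weights)
se = np.sqrt(1.0 / np.sum(weights))
print(f"weighted mean effect = {mean_effect:.2f} (SE {se:.2f})")
```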
Simultaneous acquisition of EEG and NIRS during cognitive tasks for an open access dataset.
Shin, Jaeyoung; von Lühmann, Alexander; Kim, Do-Won; Mehnert, Jan; Hwang, Han-Jeong; Müller, Klaus-Robert
2018-02-13
We provide an open access multimodal brain-imaging dataset of simultaneous electroencephalography (EEG) and near-infrared spectroscopy (NIRS) recordings. Twenty-six healthy participants performed three cognitive tasks: 1) n-back (0-, 2- and 3-back), 2) discrimination/selection response task (DSR) and 3) word generation (WG) tasks. The data provided includes: 1) measured data, 2) demographic data, and 3) basic analysis results. For n-back (dataset A) and DSR tasks (dataset B), event-related potential (ERP) analysis was performed, and spatiotemporal characteristics and classification results for 'target' versus 'non-target' (dataset A) and symbol 'O' versus symbol 'X' (dataset B) are provided. Time-frequency analysis was performed to show the EEG spectral power to differentiate the task-relevant activations. Spatiotemporal characteristics of hemodynamic responses are also shown. For the WG task (dataset C), the EEG spectral power and spatiotemporal characteristics of hemodynamic responses are analyzed, and the potential merit of hybrid EEG-NIRS BCIs was validated with respect to classification accuracy. We expect that the dataset provided will facilitate performance evaluation and comparison of many neuroimaging analysis techniques.
Simultaneous acquisition of EEG and NIRS during cognitive tasks for an open access dataset
Shin, Jaeyoung; von Lühmann, Alexander; Kim, Do-Won; Mehnert, Jan; Hwang, Han-Jeong; Müller, Klaus-Robert
2018-01-01
We provide an open access multimodal brain-imaging dataset of simultaneous electroencephalography (EEG) and near-infrared spectroscopy (NIRS) recordings. Twenty-six healthy participants performed three cognitive tasks: 1) n-back (0-, 2- and 3-back), 2) discrimination/selection response task (DSR) and 3) word generation (WG) tasks. The data provided includes: 1) measured data, 2) demographic data, and 3) basic analysis results. For n-back (dataset A) and DSR tasks (dataset B), event-related potential (ERP) analysis was performed, and spatiotemporal characteristics and classification results for ‘target’ versus ‘non-target’ (dataset A) and symbol ‘O’ versus symbol ‘X’ (dataset B) are provided. Time-frequency analysis was performed to show the EEG spectral power to differentiate the task-relevant activations. Spatiotemporal characteristics of hemodynamic responses are also shown. For the WG task (dataset C), the EEG spectral power and spatiotemporal characteristics of hemodynamic responses are analyzed, and the potential merit of hybrid EEG-NIRS BCIs was validated with respect to classification accuracy. We expect that the dataset provided will facilitate performance evaluation and comparison of many neuroimaging analysis techniques. PMID:29437166
A multifaceted independent performance analysis of facial subspace recognition algorithms.
Bajwa, Usama Ijaz; Taj, Imtiaz Ahmad; Anwar, Muhammad Waqas; Wang, Xuan
2013-01-01
Face recognition has emerged as the fastest growing biometric technology and has expanded considerably in recent years. Many new algorithms and commercial systems have been proposed and developed. Most of them use Principal Component Analysis (PCA) as a base for their techniques. Different and even conflicting results have been reported by researchers comparing these algorithms. The purpose of this study is to provide an independent comparative analysis considering both performance and computational complexity of six appearance-based face recognition algorithms, namely PCA, 2DPCA, A2DPCA, (2D)(2)PCA, LPP and 2DLPP, under equal working conditions. This study was motivated by the lack of unbiased comprehensive comparative analysis of some recent subspace methods with diverse distance metric combinations. For comparison with other studies, the FERET, ORL and YALE databases have been used with evaluation criteria modeled on the FERET evaluations, which closely simulate real-life scenarios. A comparison of results with previous studies is performed and anomalies are reported. An important contribution of this study is that it presents the suitable performance conditions for each of the algorithms under consideration.
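For readers unfamiliar with this family of appearance-based methods, the sketch below shows the simplest member, classical PCA ("eigenfaces") with a nearest-neighbour matcher; the image sizes, component count, and Euclidean metric are illustrative assumptions rather than the configurations evaluated in the study.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical gallery: 200 face images of size 32x32, flattened to vectors.
rng = np.random.default_rng(1)
X_train = rng.random((200, 32 * 32))
y_train = rng.integers(0, 40, size=200)   # 40 identities
X_probe = rng.random((20, 32 * 32))

# Project faces onto the leading principal components ("eigenfaces").
pca = PCA(n_components=50).fit(X_train)
Z_train = pca.transform(X_train)
Z_probe = pca.transform(X_probe)

# Match each probe to the closest gallery face in the subspace.
matcher = KNeighborsClassifier(n_neighbors=1, metric="euclidean")
matcher.fit(Z_train, y_train)
predicted_ids = matcher.predict(Z_probe)
print(predicted_ids[:5])
```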
NASA Technical Reports Server (NTRS)
Egolf, T. A.; Landgrebe, A. J.
1981-01-01
The theory for the UTRC Wind Energy Conversion System Performance Analysis (WECSPER) for the prediction of horizontal axis wind turbine performance is presented. Major features of the analysis are the ability to: (1) treat the wind turbine blades as lifting lines with a prescribed wake model; (2) solve for the wake-induced inflow and blade circulation using real nonlinear airfoil data; and (3) iterate internally to obtain a compatible wake transport velocity and blade loading solution. This analysis also provides an approximate treatment of wake distortions due to tower shadow or wind shear profiles. Finally, selected results of internal UTRC application of the analysis to existing wind turbines and correlation with limited test data are described.
Liu, Xiaona; Zhang, Qiao; Wu, Zhisheng; Shi, Xinyuan; Zhao, Na; Qiao, Yanjiang
2015-01-01
Laser-induced breakdown spectroscopy (LIBS) was applied to perform a rapid elemental analysis and provenance study of Blumea balsamifera DC. Principal component analysis (PCA) and partial least squares discriminant analysis (PLS-DA) were implemented to exploit the multivariate nature of the LIBS data. Scores and loadings of computed principal components visually illustrated the differing spectral data. The PLS-DA algorithm showed good classification performance. The PLS-DA model using complete spectra as input variables had similar discrimination performance to using selected spectral lines as input variables. The down-selection of spectral lines was specifically focused on the major elements of B. balsamifera samples. Results indicated that LIBS could be used to rapidly analyze elements and to perform provenance study of B. balsamifera. PMID:25558999
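A minimal sketch of the PLS-DA step, assuming spectra arranged as a samples-by-channels matrix and implementing the discriminant part as PLS regression against one-hot class indicators followed by an argmax (a common, though not the only, formulation), might look as follows; the data shapes and class count are invented.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Hypothetical LIBS spectra: 60 samples x 2048 spectral channels, 3 provenance classes.
rng = np.random.default_rng(2)
X = rng.random((60, 2048))
y = rng.integers(0, 3, size=60)

# PLS-DA: regress one-hot class indicators on the spectra, then take the argmax.
Y = np.eye(3)[y]
pls = PLSRegression(n_components=5).fit(X, Y)
y_pred = np.argmax(pls.predict(X), axis=1)
print((y_pred == y).mean())  # training accuracy of the sketch
```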
Crop Identification Technology Assessment for Remote Sensing (CITARS)
NASA Technical Reports Server (NTRS)
Bauer, M. E.; Cary, T. K.; Davis, B. J.; Swain, P. H.
1975-01-01
The results of classifications and experiments performed for the Crop Identification Technology Assessment for Remote Sensing (CITARS) project are summarized. Fifteen data sets were classified using two analysis procedures. One procedure used class weights while the other assumed equal probabilities of occurrence for all classes. In addition, 20 data sets were classified using training statistics from another segment or date. The results of both the local and non-local classifications in terms of classification and proportion estimation are presented. Several additional experiments are described which were performed to provide additional understanding of the CITARS results. These experiments investigated alternative analysis procedures, training set selection and size, effects of multitemporal registration, the spectral discriminability of corn, soybeans, and "other", and analysis of aircraft multispectral data.
Hydrocarbon group type determination in jet fuels by high performance liquid chromatography
NASA Technical Reports Server (NTRS)
Antoine, A. C.
1977-01-01
Results are given for the analysis of some jet and diesel fuel samples which were prepared from oil shale and coal syncrudes. Thirty-two samples of varying chemical composition and physical properties were obtained. Hydrocarbon types in these samples were determined by fluorescent indicator adsorption (FIA) analysis, and the results from three laboratories are presented and compared. Recently, rapid high performance liquid chromatography (HPLC) methods have been proposed for hydrocarbon group type analysis, with some suggestion for their use as a replacement of the FIA technique. Two of these methods were used to analyze some of the samples, and these results are also presented and compared. Two samples of petroleum-based Jet A fuel are similarly analyzed.
[Video-based self-control in surgical teaching. A new tool in a new concept].
Dahmen, U; Sänger, C; Wurst, C; Arlt, J; Wei, W; Dondorf, F; Richter, B; Settmacher, U; Dirsch, O
2013-10-01
Image and video-based results and process control are essential tools of a new teaching concept for conveying surgical skills. The new teaching concept integrates approved teaching principles and new media. Each exercise performance is videotaped and the result is recorded photographically. The quality of both process and result thus becomes accessible to analysis by the teacher and the learner. The learner is instructed to perform a criteria-based self-analysis of the video and image material. The new learning concept has so far been successfully applied in seven rounds within the newly designed modular class "Intensivkurs Chirurgische Techniken" (Intensive training of surgical techniques). Documentation and analysis of results via digital images were completed by almost every student. The quality of the results was high. Interestingly, the result quality did not correlate with the time needed for the exercise. The training success had a lasting effect. The new and elaborate concept improves the quality of teaching. In the long run, resources for patient care should be saved when training students according to this concept prior to performing tasks in the operating theater. These resources should be allocated for further refining innovative teaching concepts.
NASA Technical Reports Server (NTRS)
1975-01-01
Gas turbine engines were assessed for application to heavy-duty transportation. A summary of the assumptions, applications, and methods of analysis is included along with a discussion of the approach taken, the technical program flow chart, and weighting criteria used for performance evaluation. The various engines are compared on the bases of weight, performance, emissions and noise, technology status, and growth potential. The results of the engine screening phase and the conceptual design phase are presented.
Multi-Core Processor Memory Contention Benchmark Analysis Case Study
NASA Technical Reports Server (NTRS)
Simon, Tyler; McGalliard, James
2009-01-01
Multi-core processors dominate current mainframe, server, and high performance computing (HPC) systems. This paper provides synthetic kernel and natural benchmark results from an HPC system at the NASA Goddard Space Flight Center that illustrate the performance impacts of multi-core (dual- and quad-core) vs. single core processor systems. Analysis of processor design, application source code, and synthetic and natural test results all indicate that multi-core processors can suffer from significant memory subsystem contention compared to similar single-core processors.
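A rough way to reproduce the memory-subsystem contention described here is to run the same memory-bound kernel in one process and then in several processes concurrently and compare per-process throughput; the sketch below uses a STREAM-triad-like NumPy kernel with arbitrary array sizes and process counts, and is only a qualitative stand-in for the synthetic kernels used in the paper.

```python
import time
import numpy as np
from multiprocessing import Pool

N = 10_000_000      # ~80 MB per array, large enough to spill out of typical caches
REPEATS = 10

def triad_rate(_unused):
    """Effective memory traffic rate (GB/s) of a STREAM-like triad a = b + 2*c."""
    b = np.ones(N)
    c = np.ones(N)
    start = time.perf_counter()
    for _ in range(REPEATS):
        a = b + 2.0 * c          # memory-bound: two streams read, one written
    elapsed = time.perf_counter() - start
    bytes_moved = 3 * 8 * N * REPEATS
    return bytes_moved / elapsed / 1e9

if __name__ == "__main__":
    for nproc in (1, 2, 4):
        with Pool(nproc) as pool:
            rates = pool.map(triad_rate, range(nproc))
        # If the memory subsystem is contended, per-process bandwidth drops as nproc grows.
        print(f"{nproc} process(es): {sum(rates) / len(rates):.1f} GB/s per process")
```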
NASA Technical Reports Server (NTRS)
Oman, B. H.
1977-01-01
The NASA Langley Research Center vehicle design evaluation program (VDEP-2) was expanded by (1) incorporating into the program a capability to conduct preliminary design studies on subsonic commercial transport type aircraft using both JP and such alternate fuels as hydrogen and methane;(2) incorporating an aircraft detailed mission and performance analysis capability; and (3) developing and incorporating an external loads analysis capability. The resulting computer program (VDEP-3) provides a preliminary design tool that enables the user to perform integrated sizing, structural analysis, and cost studies on subsonic commercial transport aircraft. Both versions of the VDEP-3 Program which are designated preliminary Analysis VDEP-3 and detailed Analysis VDEP utilize the same vehicle sizing subprogram which includes a detailed mission analysis capability, as well as a geometry and weight analysis for multibodied configurations.
Yao, Shengnan; Zeng, Weiming; Wang, Nizhuan; Chen, Lei
2013-07-01
Independent component analysis (ICA) has been proven to be effective for functional magnetic resonance imaging (fMRI) data analysis. However, ICA decomposition requires iterative optimization of the unmixing matrix, whose initial values are generated randomly. Thus, the randomness of the initialization leads to different ICA decomposition results, and a single one-time decomposition is usually not reliable for fMRI data analysis. Under this circumstance, several methods based on repeated decompositions with ICA (RDICA) were proposed to reveal the stability of ICA decomposition. Although utilizing RDICA has achieved satisfying results in validating the performance of ICA decomposition, RDICA costs considerable computing time. To mitigate the problem, in this paper, we propose a method, named ATGP-ICA, for fMRI data analysis. This method generates fixed initial values with the automatic target generation process (ATGP) instead of producing them randomly. We performed experimental tests on both hybrid data and fMRI data to indicate the effectiveness of the new method and made a performance comparison of the traditional one-time decomposition with ICA (ODICA), RDICA and ATGP-ICA. The proposed method not only eliminates the randomness of ICA decomposition but also saves considerable computing time compared to RDICA. Furthermore, the ROC (Receiver Operating Characteristic) power analysis also indicated better signal reconstruction performance for ATGP-ICA than for RDICA. Copyright © 2013 Elsevier Inc. All rights reserved.
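The core idea, replacing the random initial unmixing matrix with a fixed one, can be illustrated with scikit-learn's FastICA, which accepts an explicit initial matrix through its w_init argument; the ATGP step itself is not reproduced here, so the identity matrix below is only a deterministic placeholder for the ATGP-derived initialization.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Hypothetical fMRI-like data: 500 time points x 2000 voxels, reduced to 10 components.
rng = np.random.default_rng(3)
X = rng.standard_normal((500, 2000))
n_comp = 10

# A fixed initial unmixing matrix (here simply the identity) makes the decomposition
# reproducible; in ATGP-ICA this matrix would instead be derived from the ATGP targets.
w0 = np.eye(n_comp)

S_a = FastICA(n_components=n_comp, w_init=w0, max_iter=500).fit_transform(X)
S_b = FastICA(n_components=n_comp, w_init=w0, max_iter=500).fit_transform(X)
print(np.allclose(S_a, S_b))  # identical repeated runs: the randomness is gone
```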
Examining Evolving Performance on the Force Concept Inventory Using Factor Analysis
ERIC Educational Resources Information Center
Semak, M. R.; Dietz, R. D.; Pearson, R. H.; Willis, C. W
2017-01-01
The application of factor analysis to the "Force Concept Inventory" (FCI) has proven to be problematic. Some studies have suggested that factor analysis of test results serves as a helpful tool in assessing the recognition of Newtonian concepts by students. Other work has produced at best ambiguous results. For the FCI administered as a…
Shaded-Color Picture Generation of Computer-Defined Arbitrary Shapes
NASA Technical Reports Server (NTRS)
Cozzolongo, J. V.; Hermstad, D. L.; Mccoy, D. S.; Clark, J.
1986-01-01
SHADE computer program generates realistic color-shaded pictures from computer-defined arbitrary shapes. Objects defined for computer representation displayed as smooth, color-shaded surfaces, including varying degrees of transparency. Results also used for presentation of computational results. By performing color mapping, SHADE colors model surface to display analysis results such as pressures, stresses, and temperatures. NASA has used SHADE extensively in the design and analysis of high-performance aircraft. Industry should find applications for SHADE in computer-aided design and computer-aided manufacturing. SHADE written in VAX FORTRAN and MACRO Assembler for either interactive or batch execution.
NASA Technical Reports Server (NTRS)
Campbell, B. H.
1974-01-01
A study is described which was initiated to identify and quantify the interrelationships between and within the performance, safety, cost, and schedule parameters for unmanned, automated payload programs. The result of the investigation was a systems cost/performance model which was implemented as a digital computer program and could be used to perform initial program planning, cost/performance tradeoffs, and sensitivity analyses for mission model and advanced payload studies. Program objectives and results are described briefly.
Statistical Analysis of Deflation in Covariance and Resultant Pc Values for AQUA, AURA and TERRA
NASA Technical Reports Server (NTRS)
Hasan, Syed O.
2016-01-01
This presentation will display a statistical analysis of raw conjunction data messages (CDMs) received for the EOS Aqua, Aura and Terra satellites within the period of February 2015 through July 2016. The analysis indicates a discernible deflation in the covariance calculated at the JSpOC after the dynamic drag consider parameter was implemented operationally in May 2015. As a result, the overall diminution in the conjunction plane intersection of the primary and secondary objects appears to be leading to reduced probability of collision (Pc) values for these conjunction events. This presentation also displays evidence for this theory with analysis of Pc trending plots using data calculated by the SpaceNav CRMS system.
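For context, the two-dimensional probability of collision referred to here is conventionally computed by integrating the combined position-error covariance, projected into the conjunction plane, over a disc whose radius is the combined hard-body radius; the sketch below does this numerically with invented miss-distance, covariance, and radius values rather than data from these events.

```python
import numpy as np
from scipy.stats import multivariate_normal

def probability_of_collision(miss, cov, hbr, n=400):
    """2-D Pc: integrate a bivariate normal centred at the relative miss vector
    over a disc of radius hbr (the combined hard-body radius), on a simple grid.

    miss : relative position of the secondary in the conjunction plane (m)
    cov  : 2x2 combined covariance projected into the conjunction plane (m^2)
    hbr  : combined hard-body radius (m)
    """
    xs = np.linspace(-hbr, hbr, n)
    ys = np.linspace(-hbr, hbr, n)
    X, Y = np.meshgrid(xs, ys)
    inside = X**2 + Y**2 <= hbr**2
    pdf = multivariate_normal(mean=miss, cov=cov).pdf(np.dstack((X, Y)))
    cell = (xs[1] - xs[0]) * (ys[1] - ys[0])
    return float(np.sum(pdf[inside]) * cell)

# Illustrative numbers only: ~290 m miss distance, anisotropic covariance, 20 m HBR.
pc = probability_of_collision(miss=[250.0, 150.0],
                              cov=[[200.0**2, 0.0], [0.0, 80.0**2]],
                              hbr=20.0)
print(f"Pc = {pc:.2e}")
# Deflating (shrinking) the covariance changes how much probability mass falls
# inside the hard-body disc, which is how the covariance behaviour discussed
# above feeds through to the reported Pc values.
```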
NASA Technical Reports Server (NTRS)
1972-01-01
The results of the telecommunications subsystem analysis are presented. The relay system requirements and constraints, interference analysis, frequency selection, modulation and coding analysis, and the performance analysis of the relay system are included.
SIMS prototype system 3 test results: Engineering analysis
NASA Technical Reports Server (NTRS)
1978-01-01
The results obtained during testing of a closed hydronic drain-down solar system designed for space and hot-water heating are presented. A data analysis is included which documents the system performance and verifies the suitability of SIMS Prototype System 3 for field installation.
NASA Astrophysics Data System (ADS)
Murata, Isao; Ohta, Masayuki; Miyamaru, Hiroyuki; Kondo, Keitaro; Yoshida, Shigeo; Iida, Toshiyuki; Ochiai, Kentaro; Konno, Chikara
2011-10-01
Nuclear data are indispensable for development of fusion reactor candidate materials. However, benchmarking of the nuclear data in the MeV energy region is not yet adequate. In the present study, benchmark performance in the MeV energy region was investigated theoretically for experiments by using a 14 MeV neutron source. We carried out a systematic analysis for light to heavy materials. As a result, the benchmark performance for the neutron spectrum was confirmed to be acceptable, while for gamma-rays it was not sufficiently accurate. Consequently, a spectrum shifter has to be applied. Beryllium had the best performance as a shifter. Moreover, a preliminary examination was made of whether it is acceptable to consider only the spectrum before the last collision in the benchmark performance analysis. It was pointed out that not only the last collision but also earlier collisions should be considered equally in the benchmark performance analysis.
Zarya Energy Balance Analysis: The Effect of Spacecraft Shadowing on Solar Array Performance
NASA Technical Reports Server (NTRS)
Hoffman, David J.; Kolosov, Vladimir
1999-01-01
The first element of the International Space Station (ISS), Zarya, was funded by NASA and built by the Russian aerospace company Khrunichev State Research and Production Space Center (KhSC). NASA Glenn Research Center (GRC) and KhSC collaborated in performing analytical predictions of the on-orbit electrical performance of Zarya's solar arrays. GRC assessed the pointing characteristics of and shadow patterns on Zarya's solar arrays to determine the average solar energy incident on the arrays. KhSC used the incident energy results to determine Zarya's electrical power generation capability and orbit-average power balance. The power balance analysis was performed over a range of solar beta angles and vehicle operational conditions. This analysis enabled identification of problems that could impact the power balance for specific flights during ISS assembly and was also used as the primary means of verifying that Zarya complied with electrical power requirements. Analytical results are presented for select stages in the ISS assembly sequence along with a discussion of the impact of shadowing on the electrical performance of Zarya's solar arrays.
Structural Analysis of Women’s Heptathlon
Gassmann, Freya; Fröhlich, Michael; Emrich, Eike
2016-01-01
The heptathlon comprises the results of seven individual disciplines, under the assumption of an equal influence from each discipline on the overall result, depending on the measured performance. Data analysis was based on the data recorded for the individual performances of the 10 winning heptathletes in the World Athletics Championships from 1987 to 2013 and the Olympic Games from 1988 to 2012. In addition to descriptive analysis methods, correlations, bivariate and multivariate linear regressions, and panel data regressions were used. The transformation of the performances from seconds, centimeters, and meters into points showed that the individual disciplines do not equally affect the overall competition result. The currently valid conversion formula for the run, jump, and throw disciplines favors the sprint and jump disciplines but penalizes athletes in the 800 m run, javelin throw, and shot put. Furthermore, 21% to 48% of the variance of the sum of points can be attributed to the performances in the disciplines of long jump, 200 m sprint, 100 m hurdles, and high jump. To balance the effects of the single disciplines in the heptathlon, the formula to calculate points should be reevaluated. PMID:29910260
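For reference, the combined-events scoring tables convert each raw result into points with power-law formulas of the following form, where the event-specific constants a, b and c are fixed by the IAAF/World Athletics scoring tables (not reproduced here); because these constants differ by event, equal athletic improvements do not yield equal point gains, which is the imbalance quantified above:

$P_{\text{track}} = \lfloor a\,(b - T)^{c} \rfloor$ for the running events, with T the time in seconds, and
$P_{\text{field}} = \lfloor a\,(M - b)^{c} \rfloor$ for the jumps and throws, with M the measured height or distance.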
Conceptual Design and Performance Analysis for a Large Civil Compound Helicopter
NASA Technical Reports Server (NTRS)
Russell, Carl; Johnson, Wayne
2012-01-01
A conceptual design study of a large civil compound helicopter is presented. The objective is to determine how a compound helicopter performs when compared to both a conventional helicopter and a tiltrotor using a design mission that is shorter than optimal for a tiltrotor and longer than optimal for a helicopter. The designs are generated and analyzed using conceptual design software and are further evaluated with a comprehensive rotorcraft analysis code. Multiple metrics are used to determine the suitability of each design for the given mission. Plots of various trade studies and parameter sweeps as well as comprehensive analysis results are presented. The results suggest that the compound helicopter examined for this study would not be competitive with a tiltrotor or conventional helicopter, but multiple possibilities are identified for improving the performance of the compound helicopter in future research.
NASA Astrophysics Data System (ADS)
Mercer, Gary J.
This quantitative study examined the relationship between secondary students with math anxiety and physics performance in an inquiry-based constructivist classroom. The Revised Math Anxiety Rating Scale was used to evaluate math anxiety levels. The results were then compared to the performance on a physics standardized final examination. A simple correlation was performed, followed by a multivariate regression analysis to examine effects based on gender and prior math background. The correlation showed statistical significance between math anxiety and physics performance. The regression analysis showed statistical significance for math anxiety, physics performance, and prior math background, but did not show statistical significance for math anxiety, physics performance, and gender.
Measuring the performance of Internet companies using a two-stage data envelopment analysis model
NASA Astrophysics Data System (ADS)
Cao, Xiongfei; Yang, Feng
2011-05-01
In exploring the business operation of Internet companies, few researchers have used data envelopment analysis (DEA) to evaluate their performance. Since the Internet companies have a two-stage production process: marketability and profitability, this study employs a relational two-stage DEA model to assess the efficiency of the 40 dot com firms. The results show that our model performs better in measuring efficiency, and is able to discriminate the causes of inefficiency, thus helping business management to be more effective through providing more guidance to business performance improvement.
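As a sketch of the basic building block underlying such models, the input-oriented CCR efficiency score of a single decision-making unit can be obtained from a small linear program; the code below uses SciPy with made-up input/output data and does not reproduce the relational two-stage linking used in the paper.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR (envelopment form) efficiency of DMU o.

    X : inputs,  shape (n_dmus, n_inputs)
    Y : outputs, shape (n_dmus, n_outputs)
    """
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables: [theta, lambda_1, ..., lambda_n]; minimize theta.
    c = np.r_[1.0, np.zeros(n)]
    # Input constraints:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.c_[-X[o].reshape(-1, 1), X.T]
    b_in = np.zeros(m)
    # Output constraints: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.c_[np.zeros((s, 1)), -Y.T]
    b_out = -Y[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

# Toy data: 5 firms, 2 inputs (e.g. marketing spend, employees), 1 output (revenue).
X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0], [2.0, 4.0]])
Y = np.array([[1.0], [1.0], [1.0], [1.0], [1.0]])
scores = [ccr_efficiency(X, Y, o) for o in range(len(X))]
print(np.round(scores, 3))
```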
Analysis of Developmental Data: Comparison Among Alternative Methods
ERIC Educational Resources Information Center
Wilson, Ronald S.
1975-01-01
To examine the ability of the correction factor epsilon to counteract statistical bias in univariate analysis, an analysis of variance (adjusted by epsilon) and a multivariate analysis of variance were performed on the same data. The results indicated that univariate analysis is a fully protected design when used with epsilon. (JMB)
Safety culture assessment in the petrochemical industry: a comparative study of two Algerian plants.
Boughaba, Assia; Hassane, Chabane; Roukia, Ouddai
2014-06-01
The aim was to elucidate the relationship between safety culture maturity and the safety performance of a particular company. To identify the factors that contribute to a safety culture, a survey questionnaire was created based mainly on the studies of Fernández-Muñiz et al. The survey was randomly distributed to 1000 employees of two oil companies and achieved a valid response rate of 51%. Minitab 16 software was used, and diverse tests, including descriptive statistical analysis, factor analysis, reliability analysis, mean analysis, and correlation, were applied in the analysis of the data. Ten factors were extracted through factor analysis to represent safety culture and safety performance. The results of this study showed that managers' commitment, training, incentives, communication, and employee involvement are the priority domains on which improvement efforts should be concentrated, all of which had descriptive average values lower than 3.0 for Company B. Furthermore, the results also showed that safety culture influences the safety performance of the company. Therefore, Company A, with a good safety culture (descriptive average values above 4.0), is more successful than Company B in terms of accident rates. The comparison between the two petrochemical plants of the Sonatrach group confirms these results: Company A, whose managers are English and Norwegian, distinguishes itself by the maturity of its safety culture and has significantly higher evaluations than Company B, which is staffed by Algerian personnel, in terms of safety management practices and safety performance.
The psychoacoustics of musical articulation
NASA Astrophysics Data System (ADS)
Spiegelberg, Scott Charles
This dissertation develops psychoacoustical definitions of notated articulations, the necessary first step in articulation research. This research can be useful to theorists interested in timbre analysis, the psychology of performance, analysis and performance, the psychology of style differentiation, and performance pedagogy. An explanation of wavelet transforms precedes the development of new techniques for analyzing transient sounds. A history of timbre perception research reveals the inadequacies of current sound segmentation models, resulting in the creation of a new model, the Pitch/Amplitude/Centroid Trajectory (PACT) model of sound segmentation. The new analysis techniques and PACT model are used to analyze recordings of performers playing a melodic fragment in a series of notated articulations. Statistical tests showed that the performers generally agreed on the interpretation of five different articulation groups. A cognitive test of articulation similarity, using musicians and non-musicians as participants, revealed a close correlation between similarity judgments and physical attributes, though additional unknown factors are clearly present. A second psychological test explored the perceptual salience of articulation notation, by asking musically-trained participants to match stimuli to the same notations the performers used. The participants also marked verbal descriptors for each articulation, such as short/long, sharp/dull, loud/soft, harsh/gentle, and normal/extreme. These results were matched against the results of Chapters Five and Six, providing an overall interpretation of the psychoacoustics of articulation.
Evaluation of field performance of poplar clones using selected competition indices.
Chandler Brodie; D.S. DeBell
2004-01-01
Use of competition indices in the analysis of forestry experiments may improve detection and understanding of treatment effects, and thereby improve the application of results. In this paper, we compared the performance of eight indices in an analysis of a spacing trial of four Populus clones planted in pure and mixed clonal plots. Indices were...
Ninety six gasoline samples were collected from around the U.S. in Autumn 2004. A detailed hydrocarbon analysis was performed on each sample resulting in a data set of approximately 300 chemicals per sample. Statistical analyses were performed on the entire suite of reported chem...
Catching up with Harvard: Results from Regression Analysis of World Universities League Tables
ERIC Educational Resources Information Center
Li, Mei; Shankar, Sriram; Tang, Kam Ki
2011-01-01
This paper uses regression analysis to test if the universities performing less well according to Shanghai Jiao Tong University's world universities league tables are able to catch up with the top performers, and to identify national and institutional factors that could affect this catching up process. We have constructed a dataset of 461…
ERIC Educational Resources Information Center
Muslihah, Oleh Eneng
2015-01-01
The research examines the correlation between the understanding of school-based management, emotional intelligences and headmaster performance. Data was collected, using quantitative methods. The statistical analysis used was the Pearson Correlation, and multivariate regression analysis. The results of this research suggest firstly that there is…
ERIC Educational Resources Information Center
Ibourk, Aomar
2013-01-01
Based on data from international surveys measuring learning (TIMSS), this article focuses on the analysis of the academic performance of Moroccan students. The results of the econometric model show that the students' characteristics, their family environment and school context are key determinants of these performances. The study also shows that the…
Rocket-Based Combined Cycle Engine Technology Development: Inlet CFD Validation and Application
NASA Technical Reports Server (NTRS)
DeBonis, J. R.; Yungster, S.
1996-01-01
A CFD methodology has been developed for inlet analyses of Rocket-Based Combined Cycle (RBCC) Engines. A full Navier-Stokes analysis code, NPARC, was used in conjunction with pre- and post-processing tools to obtain a complete description of the flow field and integrated inlet performance. This methodology was developed and validated using results from a subscale test of the inlet to an RBCC 'Strut-Jet' engine performed in the NASA Lewis 1 x 1 ft. supersonic wind tunnel. Results obtained from this study include analyses at flight Mach numbers of 5 and 6 for super-critical operating conditions. These results showed excellent agreement with experimental data. The analysis tools were also used to obtain pre-test performance and operability predictions for the RBCC demonstrator engine planned for testing in the NASA Lewis Hypersonic Test Facility. This analysis calculated the baseline fuel-off internal force of the engine which is needed to determine the net thrust with fuel on.
Trajectory-based heating analysis for the European Space Agency/Rosetta Earth Return Vehicle
NASA Technical Reports Server (NTRS)
Henline, William D.; Tauber, Michael E.
1994-01-01
A coupled, trajectory-based flowfield and material thermal-response analysis is presented for the European Space Agency proposed Rosetta comet nucleus sample return vehicle. The probe returns to earth along a hyperbolic trajectory with an entry velocity of 16.5 km/s and requires an ablative heat shield on the forebody. Combined radiative and convective ablating flowfield analyses were performed for the significant heating portion of the shallow ballistic entry trajectory. Both quasisteady ablation and fully transient analyses were performed for a heat shield composed of carbon-phenolic ablative material. Quasisteady analysis was performed using the two-dimensional axisymmetric codes RASLE and BLIMPK. Transient computational results were obtained from the one-dimensional ablation/conduction code CMA. Results are presented for heating, temperature, and ablation rate distributions over the probe forebody for various trajectory points. Comparison of transient and quasisteady results indicates that, for the heating pulse encountered by this probe, the quasisteady approach is conservative from the standpoint of predicted surface recession.
Cost and performance: complements for improvement.
Rouse, Paul; Harrison, Julie; Turner, Nikki
2011-10-01
Activity-based costing (ABC) and Data Envelopment Analysis (DEA) share similar views of resource consumption in the production of outputs. While DEA has a high level focus typically using aggregated data in the form of inputs and outputs, ABC is more detailed and oriented around very disaggregated data. We use a case study of immunisation activities in 24 New Zealand primary care practices to illustrate how DEA and ABC can be used in conjunction to improve performance analysis and benchmarking. Results show that practice size, socio-economic environment, parts of the service delivery process as well as regular administrative tasks are major cost and performance drivers for general practices in immunisation activities. It is worth noting that initial analyses of the ABC results, using contextual information and conventional methods of analysis such as regression and correlations, did not result in any patterns of significance. Reorganising this information using the DEA efficiency scores has revealed trends that make sense to practitioners and provide insights into where to place efforts for improvement.
Ortiz, Glorimar; Schacht, Lucille
2012-01-01
Measurement of consumers' satisfaction in psychiatric settings is important because it has been correlated with improved clinical outcomes and administrative measures of high-quality care. These consumer satisfaction measurements are actively used as performance measures required by the accreditation process and for quality improvement activities. Our objectives were (i) to re-evaluate, through exploratory factor analysis (EFA) and confirmatory factor analysis (CFA), the structure of an instrument intended to measure consumers' satisfaction with care in psychiatric settings and (ii) to examine and publish the psychometric characteristics, validity and reliability, of the Inpatient Consumer Survey (ICS). To psychometrically test the structure of the ICS, 34 878 survey results, submitted by 90 psychiatric hospitals in 2008, were extracted from the Behavioral Healthcare Performance Measurement System (BHPMS). Basic descriptive item-response and correlation analyses were performed for total surveys. Two datasets were randomly created for analysis. A random sample of 8229 survey results was used for EFA. Another random sample of 8261 consumer survey results was used for CFA. This same sample was used to perform validity and reliability analyses. The item-response analysis showed that the mean range for a disagree/agree five-point scale was 3.10-3.94. Correlation analysis showed a strong relationship between items. Six domains (dignity, rights, environment, empowerment, participation, and outcome) with internal reliabilities between good to moderate (0.87-0.73) were shown to be related to overall care satisfaction. Overall reliability for the instrument was excellent (0.94). Results from CFA provided support for the domains structure of the ICS proposed through EFA. The overall findings from this study provide evidence that the ICS is a reliable measure of consumer satisfaction in psychiatric inpatient settings. The analysis has shown the ICS to provide valid and reliable results and to focus on the specific concerns of consumers of psychiatric inpatient care. Scores by item indicate that opportunity for improvement exists across healthcare organizations.
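Assuming the reported reliabilities are Cronbach's alpha (the usual internal-consistency statistic for such scales), the computation is straightforward; the sketch below uses an invented response matrix rather than ICS data.

```python
import numpy as np

def cronbach_alpha(responses):
    """Cronbach's alpha for an (n_respondents, n_items) matrix of item scores."""
    responses = np.asarray(responses, dtype=float)
    k = responses.shape[1]
    item_vars = responses.var(axis=0, ddof=1)
    total_var = responses.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Hypothetical domain with 5 items rated 1-5 by 8 respondents.
scores = np.array([[4, 5, 4, 4, 5],
                   [3, 3, 4, 3, 3],
                   [5, 5, 5, 4, 5],
                   [2, 2, 3, 2, 2],
                   [4, 4, 4, 5, 4],
                   [3, 4, 3, 3, 3],
                   [5, 4, 5, 5, 5],
                   [2, 3, 2, 2, 3]])
print(round(cronbach_alpha(scores), 2))
```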
Meta-analysis of the relationship between TQM and Business Performance
NASA Astrophysics Data System (ADS)
F, Ahmad M.; N, Zakuan; A, Jusoh; Z, Tasir; J, Takala
2013-06-01
A meta-analysis has been conducted based on 20 previous works covering 4,040 firms in 16 countries across Asia, Europe and America. Through this meta-analysis, the paper reviews the relationships between TQM and business performance across the regions. The meta-analysis concludes that the average corrected correlation r_c is 0.47: Asia (r_c = 0.54), America (r_c = 0.43) and Europe (r_c = 0.38). The analysis also shows that the developed countries of Asia have the greatest impact of TQM (r_c = 0.56). However, ANOVA and t-tests show that there is no significant difference among country types (developed and developing countries) and regions at p = 0.05. In addition, the average r_c^2 is 0.24: Asia (r_c^2 = 0.33), America (r_c^2 = 0.22) and Europe (r_c^2 = 0.15). Meanwhile, r_c^2 in developing countries (r_c^2 = 0.28) is higher than in developed countries (r_c^2 = 0.21).
Thermal analysis of a conceptual design for a 250 We GPHS/FPSE space power system
NASA Technical Reports Server (NTRS)
Mccomas, Thomas J.; Dugan, Edward T.
1991-01-01
A thermal analysis has been performed for a 250-We space nuclear power system which combines the US Department of Energy's general purpose heat source (GPHS) modules with a state-of-the-art free-piston Stirling engine (FPSE). The focus of the analysis is on the temperature of the iridium fuel clad within the GPHS modules. The thermal analysis results indicate fuel clad temperatures slightly higher than the design goal temperature of 1573 K. The results are considered favorable due to numerous conservative assumptions used. To demonstrate the effects of the conservatism, a brief sensitivity analysis is performed in which a few of the key system parameters are varied to determine their effect on the fuel clad temperatures. It is shown that analysis of a more detailed thermal model should yield fuel clad temperatures below 1573 K.
Faradji, Farhad; Ward, Rabab K; Birch, Gary E
2009-06-15
The feasibility of having a self-paced brain-computer interface (BCI) based on mental tasks is investigated. The EEG signals of four subjects performing five mental tasks each are used in the design of a 2-state self-paced BCI. The output of the BCI should only be activated when the subject performs a specific mental task and should remain inactive otherwise. For each subject and each task, the feature coefficient and the classifier that yield the best performance are selected, using the autoregressive coefficients as the features. The classifier with a zero false positive rate and the highest true positive rate is selected as the best classifier. The classifiers tested include: linear discriminant analysis, quadratic discriminant analysis, Mahalanobis discriminant analysis, support vector machine, and radial basis function neural network. The results show that: (1) some classifiers obtained the desired zero false positive rate; (2) the linear discriminant analysis classifier does not yield acceptable performance; (3) the quadratic discriminant analysis classifier outperforms the Mahalanobis discriminant analysis classifier and performs almost as well as the radial basis function neural network; and (4) the support vector machine classifier has the highest true positive rates but unfortunately has nonzero false positive rates in most cases.
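The selection rule described, keep only classifiers with a zero false-positive rate and among those take the highest true-positive rate, can be sketched as follows; the synthetic features stand in for the autoregressive coefficients, only a subset of the classifier families is shown, and an RBF-kernel SVM is used here as a rough stand-in for the radial basis function neural network.

```python
import numpy as np
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

# Hypothetical AR-coefficient features: 1 = intentional control (task), 0 = idle.
rng = np.random.default_rng(4)
X = rng.standard_normal((600, 6))
y = (X[:, 0] + 0.5 * X[:, 1] + 0.3 * rng.standard_normal(600) > 0.8).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

candidates = {
    "LDA": LinearDiscriminantAnalysis(),
    "QDA": QuadraticDiscriminantAnalysis(),
    "SVM (linear)": SVC(kernel="linear"),
    "SVM (RBF)": SVC(kernel="rbf"),   # stand-in for the RBF neural network
}

best_name, best_tpr = None, -1.0
for name, clf in candidates.items():
    y_hat = clf.fit(X_tr, y_tr).predict(X_te)
    tn, fp, fn, tp = confusion_matrix(y_te, y_hat, labels=[0, 1]).ravel()
    fpr, tpr = fp / (fp + tn), tp / (tp + fn)
    # Self-paced BCI requirement: never activate during idle periods (FPR == 0).
    if fpr == 0.0 and tpr > best_tpr:
        best_name, best_tpr = name, tpr
    print(f"{name:14s} FPR={fpr:.3f} TPR={tpr:.3f}")

if best_name is None:
    print("No candidate achieved zero FPR on this split.")
else:
    print(f"Selected: {best_name} (TPR={best_tpr:.3f})")
```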
NASA Technical Reports Server (NTRS)
Jeanneret, P. R.
1988-01-01
The development and use of a menu of performance tests that can be self-administered on a portable microcomputer are investigated. In order to identify, develop, or otherwise select the relevant human capabilities/attributes to measure and hence include in the performance battery, it is essential that an analysis be conducted of the jobs or functions that will be performed throughout a space shuttle mission. The primary job analysis instrument, the Position Analysis Questionnaire (PAQ), is discussed in detail so the reader will have sufficient background for understanding the application of the instrument to the various work activities included within the scope of the study, and the derivation of the human requirements (abilities/attributes) from the PAQ analyses. The research methodology is described and includes the procedures used for gathering the PAQ data. The results are presented in detail with specific emphasis on identifying critical requirements that can be measured with a portable computerized assessment battery. A discussion of the results is given with implications for future research.
Azadeh, Ali; Sheikhalishahi, Mohammad
2015-06-01
A unique framework for performance optimization of generation companies (GENCOs) based on health, safety, environment, and ergonomics (HSEE) indicators is presented. To rank this sector of industry, the combination of data envelopment analysis (DEA), principal component analysis (PCA), and Taguchi are used for all branches of GENCOs. These methods are applied in an integrated manner to measure the performance of GENCO. The preferred model between DEA, PCA, and Taguchi is selected based on sensitivity analysis and maximum correlation between rankings. To achieve the stated objectives, noise is introduced into input data. The results show that Taguchi outperforms other methods. Moreover, a comprehensive experiment is carried out to identify the most influential factor for ranking GENCOs. The approach developed in this study could be used for continuous assessment and improvement of GENCO's performance in supplying energy with respect to HSEE factors. The results of such studies would help managers to have better understanding of weak and strong points in terms of HSEE factors.
Performance Analysis: Control of Hazardous Energy
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Grange, Connie E.; Freeman, Jeff W.; Kerr, Christine E.
2010-10-06
LLNL experienced 26 occurrences related to the control of hazardous energy from January 1, 2008 through August 2010. These occurrences were 17% of the total number of reported occurrences during this 32-month period. The Performance Analysis and Reporting Section of the Contractor Assurance Office (CAO) routinely analyzes reported occurrences and issues looking for patterns that may indicate changes in LLNL's performance and early indications of performance trends. It became apparent through these analyses that LLNL might have experienced a change in the control of hazardous energy and that these occurrences should be analyzed in more detail to determine if the perceived change in performance was real, whether that change is significant and if the causes of the occurrences are similar. This report documents the results of this more detailed analysis.
Performance analysis of Aloha networks with power capture and near/far effect
NASA Astrophysics Data System (ADS)
McCartin, Joseph T.
1989-06-01
An analysis is presented for the throughput characteristics for several classes of Aloha packet networks. Specifically, the throughput for variable packet length Aloha utilizing multiple power levels to induce receiver capture is derived. The results are extended to an analysis of a selective-repeat ARQ Aloha network. Analytical results are presented which indicate a significant increase in throughput for a variable packet network implementing a random two power level capture scheme. Further research into the area of the near/far effect on Aloha networks is included. Improvements in throughput for mobile radio Aloha networks which are subject to the near/far effect are presented. Tactical Command, Control and Communications (C3) systems of the future will rely on Aloha ground mobile data networks. The incorporation of power capture and the near/far effect into future tactical networks will result in improved system analysis, design, and performance.
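For orientation, the classical throughput-versus-offered-load relations that such analyses extend (before adding capture, variable packet lengths, or near/far effects) are

$S_{\text{pure}} = G\,e^{-2G}$ for unslotted (pure) Aloha and $S_{\text{slotted}} = G\,e^{-G}$ for slotted Aloha,

where G is the offered traffic in packets per mean packet time; power capture raises the achievable throughput above these baselines because a sufficiently strong packet can survive a collision.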
Thermal Design, Analysis, and Testing of the Quench Module Insert Bread Board
NASA Technical Reports Server (NTRS)
Breeding, Shawn; Khodabandeh, Julia
2002-01-01
Contents include the following: Quench Module Insert (QMI) science requirements. QMI interfaces. QMI design layout. QMI thermal analysis and design methodology. QMI bread board testing and instrumentation approach. QMI thermal probe design parameters. Design features for gradient measurement. Design features for heated zone measurements. Thermal gradient analysis results. Heated zone analysis results. Bread board thermal probe layout. QMI bread board correlation and performance. Summary and conclusions.
Performance optimisations for distributed analysis in ALICE
NASA Astrophysics Data System (ADS)
Betev, L.; Gheata, A.; Gheata, M.; Grigoras, C.; Hristov, P.
2014-06-01
Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlation or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system still has to be improved by a significant factor to satisfy the analysis needs. We have instrumented all analysis jobs with "sensors" collecting comprehensive monitoring information on the job running conditions and performance in order to identify bottlenecks in the data processing flow. These data are collected by the MonALISA-based ALICE Grid monitoring system and are used to steer and improve the job submission and management policy, to identify operational problems in real time and to perform automatic corrective actions. In parallel with an upgrade of our production system, we are aiming for low-level improvements related to data format, data management and merging of results to allow for better-performing ALICE analysis.
ERIC Educational Resources Information Center
Bragg, Theresa A.
This paper discusses the validity of licensure examination results as an indicator of the performance of higher educational institutions. Licensure examination scores are available to departments for a variety of disciplines and analysis is best performed within the departments. The quality of feedback may dictate the usefulness of results. Using…
Strengths and weaknesses in the cognitive profile of youngsters with Prader-Willi syndrome.
Curfs, L M; Wiegers, A M; Sommers, J R; Borghgraef, M; Fryns, J P
1991-12-01
In this report we present the results of a study of the intellectual functioning and cognitive profile of 26 Prader-Willi syndrome (PWS) patients. The mean IQ score was 62.3 (range 39-96). In 13 patients a significant difference between verbal and performance IQ was found. In 10 of them the performance IQ was higher than the verbal. The results of subtest analysis indicate that cognitive strengths are more visible than cognitive weaknesses. Highest scores were noted especially in the performance scale, i.e. Block Design (9 children) and Coding or Mazes (5 children). Analysis of all available data indicates that PWS patients score better on visual motor discrimination skills than on auditory verbal processing skills. These results are promising for intervention programs and education strategies.
Increasing Transparency Through a Multiverse Analysis.
Steegen, Sara; Tuerlinckx, Francis; Gelman, Andrew; Vanpaemel, Wolf
2016-09-01
Empirical research inevitably includes constructing a data set by processing raw data into a form ready for statistical analysis. Data processing often involves choices among several reasonable options for excluding, transforming, and coding data. We suggest that instead of performing only one analysis, researchers could perform a multiverse analysis, which involves performing all analyses across the whole set of alternatively processed data sets corresponding to a large set of reasonable scenarios. Using an example focusing on the effect of fertility on religiosity and political attitudes, we show that analyzing a single data set can be misleading and propose a multiverse analysis as an alternative practice. A multiverse analysis offers an idea of how much the conclusions change because of arbitrary choices in data construction and gives pointers as to which choices are most consequential in the fragility of the result. © The Author(s) 2016.
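Operationally, a multiverse analysis amounts to enumerating every combination of defensible processing choices and re-running the same analysis on each resulting data set; the sketch below illustrates that enumeration pattern with invented choice names and a deliberately simplified placeholder analysis (a one-sample t-test), not the fertility analysis used in the article.

```python
import itertools
import numpy as np
from scipy import stats

# Hypothetical processing choices (names invented for illustration).
choices = {
    "exclusion_rule": ["none", "drop_outliers_3sd", "drop_missing"],
    "transform":      ["raw", "log"],
    "coding":         ["binary", "three_level"],
}

def process_and_analyze(raw, exclusion_rule, transform, coding):
    """Build one 'universe' of the data and return a placeholder p-value."""
    x = raw.copy()
    if exclusion_rule == "drop_outliers_3sd":
        x = x[np.abs(x - x.mean()) < 3 * x.std()]
    elif exclusion_rule == "drop_missing":
        x = x[~np.isnan(x)]
    if transform == "log":
        x = np.log(np.abs(x) + 1.0)
    # A real multiverse would also branch on 'coding' and fit the substantive model here.
    return float(stats.ttest_1samp(x, 0.0).pvalue)

raw = np.random.default_rng(5).normal(loc=0.1, size=500)
results = []
for combo in itertools.product(*choices.values()):
    kwargs = dict(zip(choices.keys(), combo))
    results.append((kwargs, process_and_analyze(raw, **kwargs)))

# The spread of p-values across universes indicates how fragile the conclusion is.
pvals = [p for _, p in results]
print(f"{len(results)} universes, p-value range {min(pvals):.3f}-{max(pvals):.3f}")
```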
Berretta, Massimiliano; Micek, Agnieszka; Lafranconi, Alessandra; Rossetti, Sabrina; Di Francia, Raffaele; De Paoli, Paolo; Rossi, Paola; Facchini, Gaetano
2018-04-17
Coffee consumption has been associated with numerous cancers, but evidence on ovarian cancer risk is controversial. Therefore, we performed a meta-analysis of prospective cohort studies in order to review the evidence on coffee consumption and risk of ovarian cancer. Studies were identified by searching the PubMed and MEDLINE databases up to March 2017. Risk estimates were retrieved from the studies, and the dose-response relationship was modelled by using restricted cubic splines. Additionally, a stratified analysis by menopausal status was performed. A total of 8 studies were eligible for the dose-response meta-analysis. Studies included in the analysis comprised 787,076 participants and 3,541 ovarian cancer cases. The results showed that coffee intake was not associated with ovarian cancer risk (RR = 1.06, 95% CI: 0.89, 1.26). Stratified and subgroup analyses showed consistent results. This comprehensive meta-analysis did not find evidence of an association between the consumption of coffee and risk of ovarian cancer.
Range Finding with a Plenoptic Camera
2014-03-27
Contents (excerpt): Experimental Results; Simulated Camera Analysis: Varying Lens Diameter; Simulated Camera Analysis: Varying Detector Size; Matching Framework; Simulated Camera Performance with SIFT.
Surface Management System Departure Event Data Analysis
NASA Technical Reports Server (NTRS)
Monroe, Gilena A.
2010-01-01
This paper presents a data analysis of the Surface Management System (SMS) performance of departure events, including push-back and runway departure events. The paper focuses on the detection performance, or the ability to detect departure events, as well as the prediction performance of SMS. The results detail a modest overall detection performance of push-back events and a significantly high overall detection performance of runway departure events. The overall detection performance of SMS for push-back events is approximately 55%. The overall detection performance of SMS for runway departure events nears 100%. This paper also presents the overall SMS prediction performance for runway departure events as well as the timeliness of the Aircraft Situation Display for Industry data source for SMS predictions.
ERIC Educational Resources Information Center
vonWurmb, Elizabeth C.
2013-01-01
This dissertation undertakes an analysis of 1,044 performance evaluations from New York State School Music Association (NYSSMA) Spring Festival solo adjudication ratings of student performers from a large suburban school district. It relies on results of evaluations of observed performances, and takes these evaluations as assessments of what the…
Performance Improvement of Power Analysis Attacks on AES with Encryption-Related Signals
NASA Astrophysics Data System (ADS)
Lee, You-Seok; Lee, Young-Jun; Han, Dong-Guk; Kim, Ho-Won; Kim, Hyoung-Nam
A power analysis attack is a well-known side-channel attack, but the efficiency of the attack is frequently degraded by power components unrelated to the encryption that are present in the signals used for the attack. To enhance the performance of the power analysis attack, we propose a preprocessing method based on extracting encryption-related parts from the measured power signals. Experimental results show that attacks with the preprocessed signals detect correct keys with far fewer signals, compared to conventional power analysis attacks.
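To make the setting concrete, a standard correlation power analysis correlates, for each key guess, a hypothetical leakage value against the measured power samples; the sketch below uses simulated traces and a deliberately simplified first-round leakage model (Hamming weight of plaintext byte XOR key guess), and the preprocessing proposed in the paper would be applied to the traces before this correlation step.

```python
import numpy as np

def hamming_weight(v):
    return bin(int(v)).count("1")

rng = np.random.default_rng(6)
n_traces, n_samples, true_key = 500, 100, 0x3C

# Simulated acquisition: leakage of HW(plaintext ^ key) at sample 40 plus noise.
plaintexts = rng.integers(0, 256, size=n_traces)
traces = rng.normal(0.0, 1.0, size=(n_traces, n_samples))
leak = np.array([hamming_weight(p ^ true_key) for p in plaintexts], dtype=float)
traces[:, 40] += 0.8 * leak

# CPA: for every key guess, correlate the hypothetical leakage with each time sample.
best_guess, best_corr = None, 0.0
for guess in range(256):
    hyp = np.array([hamming_weight(p ^ guess) for p in plaintexts], dtype=float)
    hz = (hyp - hyp.mean()) / hyp.std()
    tz = (traces - traces.mean(axis=0)) / traces.std(axis=0)
    corr = np.abs(hz @ tz) / n_traces        # Pearson |r| against all samples at once
    if corr.max() > best_corr:
        best_guess, best_corr = guess, corr.max()

print(f"recovered key byte: {best_guess:#04x} (true {true_key:#04x}), peak |r| = {best_corr:.2f}")
```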
Reliability Modeling Methodology for Independent Approaches on Parallel Runways Safety Analysis
NASA Technical Reports Server (NTRS)
Babcock, P.; Schor, A.; Rosch, G.
1998-01-01
This document is an adjunct to the final report An Integrated Safety Analysis Methodology for Emerging Air Transport Technologies. That report presents the results of our analysis of the problem of simultaneous but independent approaches of two aircraft on parallel runways (independent approaches on parallel runways, or IAPR). This introductory chapter presents a brief overview and perspective of approaches and methodologies for performing safety analyses for complex systems. Ensuing chapters provide the technical details that underlie the approach that we have taken in performing the safety analysis for the IAPR concept.
Lessons Learned During TBCC Design for the NASA-AFRL Joint System Study
NASA Technical Reports Server (NTRS)
Snyder, Christopher A.; Espinosa, A. M.
2013-01-01
NASA and the Air Force Research Laboratory are involved in a Joint System Study (JSS) on Two-Stage-to-Orbit (TSTO) vehicles. The JSS will examine the performance, operability and analysis uncertainty of unmanned, fully reusable, TSTO launch vehicle concepts. NASA is providing a vehicle concept using turbine-based combined cycle (TBCC) propulsion on the booster stage and an all-rocket orbiter. The variation in vehicle and mission requirements for different potential customers, combined with analysis uncertainties, makes it problematic to define optimum vehicle types or concepts, but the study is being used by NASA for tool assessment and development, and to identify technology gaps. Preliminary analyses were performed on the entire TBCC booster concept; then higher-fidelity analyses were performed for particular areas to verify results or reduce analysis uncertainties. Preliminary TBCC system analyses indicated that there would be sufficient thrust margin over its mission portion. The higher-fidelity analyses, which included inlet and nozzle performance corrections for significant area mismatches between TBCC propulsion requirements and the vehicle design, resulted in significant performance penalties relative to the preliminary results. TBCC system design and vehicle operation assumptions were reviewed to identify items to mitigate these performance penalties. The most promising items were then applied and analyses rerun to update performance predictions. A study overview is given to orient the reader, quickly focusing upon the NASA TBCC booster and low-speed propulsion system. Details of the TBCC concept and the analyses performed are described. Finally, a summary of "Lessons Learned" is discussed with suggestions to improve future study efforts.
Algorithm sensitivity analysis and parameter tuning for tissue image segmentation pipelines
Kurç, Tahsin M.; Taveira, Luís F. R.; Melo, Alba C. M. A.; Gao, Yi; Kong, Jun; Saltz, Joel H.
2017-01-01
Motivation: Sensitivity analysis and parameter tuning are important processes in large-scale image analysis. They are very costly because the image analysis workflows are required to be executed several times to systematically correlate output variations with parameter changes or to tune parameters. An integrated solution with minimum user interaction that uses effective methodologies and high performance computing is required to scale these studies to large imaging datasets and expensive analysis workflows. Results: The experiments with two segmentation workflows show that the proposed approach can (i) quickly identify and prune parameters that are non-influential; (ii) search a small fraction (about 100 points) of the parameter search space with billions to trillions of points and improve the quality of segmentation results (Dice and Jaccard metrics) by as much as 1.42× compared to the results from the default parameters; (iii) attain good scalability on a high performance cluster with several effective optimizations. Conclusions: Our work demonstrates the feasibility of performing sensitivity analyses, parameter studies and auto-tuning with large datasets. The proposed framework can enable the quantification of error estimations and output variations in image segmentation pipelines. Availability and Implementation: Source code: https://github.com/SBU-BMI/region-templates/. Contact: teodoro@unb.br Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28062445
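As a much-simplified illustration of the sensitivity-analysis idea (not the sampling strategy or cluster framework described in the paper), a one-at-a-time screen perturbs each parameter around a baseline and ranks parameters by how much the output metric moves; the workflow function and parameter names below are placeholders.

```python
import numpy as np

def segmentation_quality(params):
    """Placeholder for an expensive image-analysis workflow returning, e.g., a Dice score."""
    return (0.8
            - 0.3 * (params["threshold"] - 0.5) ** 2
            + 0.05 * np.log(params["min_size"])
            + 0.0 * params["smoothing"])          # deliberately non-influential parameter

baseline = {"threshold": 0.5, "min_size": 20.0, "smoothing": 1.0}
base_score = segmentation_quality(baseline)

effects = {}
for name, value in baseline.items():
    scores = []
    for factor in (0.8, 1.2):                     # +/-20% one-at-a-time perturbation
        perturbed = dict(baseline, **{name: value * factor})
        scores.append(segmentation_quality(perturbed))
    effects[name] = max(abs(s - base_score) for s in scores)

# Parameters with near-zero effect are candidates for pruning before auto-tuning.
for name, eff in sorted(effects.items(), key=lambda kv: -kv[1]):
    print(f"{name:10s} |delta score| = {eff:.4f}")
```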
Enabling Efficient Climate Science Workflows in High Performance Computing Environments
NASA Astrophysics Data System (ADS)
Krishnan, H.; Byna, S.; Wehner, M. F.; Gu, J.; O'Brien, T. A.; Loring, B.; Stone, D. A.; Collins, W.; Prabhat, M.; Liu, Y.; Johnson, J. N.; Paciorek, C. J.
2015-12-01
A typical climate science workflow often involves a combination of acquisition of data, modeling, simulation, analysis, visualization, publishing, and storage of results. Each of these tasks provide a myriad of challenges when running on a high performance computing environment such as Hopper or Edison at NERSC. Hurdles such as data transfer and management, job scheduling, parallel analysis routines, and publication require a lot of forethought and planning to ensure that proper quality control mechanisms are in place. These steps require effectively utilizing a combination of well tested and newly developed functionality to move data, perform analysis, apply statistical routines, and finally, serve results and tools to the greater scientific community. As part of the CAlibrated and Systematic Characterization, Attribution and Detection of Extremes (CASCADE) project we highlight a stack of tools our team utilizes and has developed to ensure that large scale simulation and analysis work are commonplace and provide operations that assist in everything from generation/procurement of data (HTAR/Globus) to automating publication of results to portals like the Earth Systems Grid Federation (ESGF), all while executing everything in between in a scalable environment in a task parallel way (MPI). We highlight the use and benefit of these tools by showing several climate science analysis use cases they have been applied to.
Gonzalez, Aroa Garcia; Taraba, Lukáš; Hraníček, Jakub; Kozlík, Petr; Coufal, Pavel
2017-01-01
Dasatinib is a novel oral prescription drug proposed for treating adult patients with chronic myeloid leukemia. Three analytical methods, namely ultra high performance liquid chromatography, capillary zone electrophoresis, and sequential injection analysis, were developed, validated, and compared for determination of the drug in the tablet dosage form. The total analysis time of the optimized ultra high performance liquid chromatography and capillary zone electrophoresis methods was 2.0 and 2.2 min, respectively. Direct ultraviolet detection with a detection wavelength of 322 nm was employed in both cases. The optimized sequential injection analysis method was based on spectrophotometric detection of dasatinib after a simple colorimetric reaction with Folin-Ciocalteu reagent forming a blue-colored complex with an absorbance maximum at 745 nm. The total analysis time was 2.5 min. The ultra high performance liquid chromatography method provided the lowest detection and quantitation limits and the most precise and accurate results. All three newly developed methods were demonstrated to be specific, linear, sensitive, precise, and accurate, providing results satisfactorily meeting the requirements of the pharmaceutical industry, and can be employed for the routine determination of the active pharmaceutical ingredient in the tablet dosage form. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
11th Annual CMMI Technology Conference and User Group
2011-11-17
Examples of triggers may include: – Cost performance – Schedule performance – Results of management reviews – Occurrence of the risk • as a ... Analysis (PHA) – Method 3 – Through bottom-up analysis of design data (e.g., flow diagrams, Failure Mode Effects and Criticality Analysis (FMECA) ... of formal reviews and the setting up of delta or follow-up reviews can be used to give the organization more places to look at the products as they
Meta‐analysis of test accuracy studies using imputation for partial reporting of multiple thresholds
Deeks, J.J.; Martin, E.C.; Riley, R.D.
2017-01-01
Introduction For tests reporting continuous results, primary studies usually provide test performance at multiple but often different thresholds. This creates missing data when performing a meta‐analysis at each threshold. A standard meta‐analysis (no imputation [NI]) ignores such missing data. A single imputation (SI) approach was recently proposed to recover missing threshold results. Here, we propose a new method that performs multiple imputation of the missing threshold results using discrete combinations (MIDC). Methods The new MIDC method imputes missing threshold results by randomly selecting from the set of all possible discrete combinations which lie between the results for 2 known bounding thresholds. Imputed and observed results are then synthesised at each threshold. This is repeated multiple times, and the multiple pooled results at each threshold are combined using Rubin's rules to give final estimates. We compared the NI, SI, and MIDC approaches via simulation. Results Both imputation methods outperform the NI method in simulations. There was generally little difference in the SI and MIDC methods, but the latter was noticeably better in terms of estimating the between‐study variances and generally gave better coverage, due to slightly larger standard errors of pooled estimates. Given selective reporting of thresholds, the imputation methods also reduced bias in the summary receiver operating characteristic curve. Simulations demonstrate the imputation methods rely on an equal threshold spacing assumption. A real example is presented. Conclusions The SI and, in particular, MIDC methods can be used to examine the impact of missing threshold results in meta‐analysis of test accuracy studies. PMID:29052347
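Rubin's rules, used above to combine results across imputations, pool the per-imputation estimates and their variances as sketched below; the estimates and squared standard errors are hypothetical placeholders, not values from the study.

```python
import numpy as np

def rubins_rules(estimates, variances):
    """Pool point estimates and within-imputation variances from m imputed analyses."""
    q = np.asarray(estimates, dtype=float)   # estimate from each imputed dataset
    u = np.asarray(variances, dtype=float)   # squared standard error from each
    m = len(q)
    q_bar = q.mean()                         # pooled estimate
    u_bar = u.mean()                         # within-imputation variance
    b = q.var(ddof=1)                        # between-imputation variance
    t = u_bar + (1 + 1 / m) * b              # total variance
    return q_bar, np.sqrt(t)

# e.g. pooling a logit-sensitivity at one threshold across 5 imputations (made-up numbers)
est = [0.82, 0.90, 0.78, 0.85, 0.88]
se2 = [0.04, 0.05, 0.04, 0.05, 0.04]
pooled, pooled_se = rubins_rules(est, se2)
print(pooled, pooled_se)
```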
Krista M. Gebert; Susan L. Odell
2007-01-01
This report summarizes the results of a 2004 analysis of county-level eligibility for financial and technical assistance through the USDA Forest Service Economic Recovery program and contrasts those results to the initial eligibility analysis performed in 1993. County-level eligibility was based on three criteria: (1) proximity to a National Forest or National...
RLV Turbine Performance Optimization
NASA Technical Reports Server (NTRS)
Griffin, Lisa W.; Dorney, Daniel J.
2001-01-01
A task was developed at NASA/Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. There are four major objectives of this task: 1) to develop, enhance, and integrate advanced turbine aerodynamic design and analysis tools; 2) to develop the methodology for application of the analytical techniques; 3) to demonstrate the benefits of the advanced turbine design procedure through its application to a relevant turbine design point; and 4) to verify the optimized design and analysis with testing. Final results of the preliminary design and the results of the two-dimensional (2D) detailed design of the first-stage vane of a supersonic turbine suitable for a reusable launch vehicle (RLV) are presented. Analytical techniques for obtaining the results are also discussed.
NASA Astrophysics Data System (ADS)
Lü, Chengxu; Jiang, Xunpeng; Zhou, Xingfan; Zhang, Yinqiao; Zhang, Naiqian; Wei, Chongfeng; Mao, Wenhua
2017-10-01
Wet gluten is a useful quality indicator for wheat, and short-wave near infrared spectroscopy (NIRS) is a high-performance technique with the advantages of being economical, rapid, and nondestructive. To study the feasibility of short-wave NIRS for analyzing wet gluten directly from wheat seed, 54 representative wheat seed samples were collected and scanned by spectrometer. Eight spectral pretreatment methods and a genetic algorithm (GA) variable selection method were used to optimize the analysis. Both quantitative and qualitative models of wet gluten were built by partial least squares regression and discriminant analysis. For quantitative analysis, normalization was the optimal pretreatment method, 17 wet-gluten-sensitive variables were selected by GA, and the GA model performed better than the all-variable model, with R2V=0.88 and RMSEV=1.47. For qualitative analysis, automatic weighted least squares baseline was the optimal pretreatment method, and the all-variable models performed better than the GA models. The correct classification rates for the three classes (<24%, 24-30%, and >30% wet gluten content) were 95.45%, 84.52%, and 90.00%, respectively. The short-wave NIRS technique shows potential for both quantitative and qualitative analysis of wet gluten in wheat seed.
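A partial least squares calibration of the kind described can be sketched with scikit-learn; the spectra below are synthetic stand-ins for the 54 seed samples, and the variable counts and split are illustrative only, not the paper's data or protocol.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)
# Surrogate data standing in for 54 seed spectra (125 hypothetical short-wave NIR variables)
X = rng.normal(size=(54, 125))
wet_gluten = X[:, 10] * 3.0 + X[:, 40] * 2.0 + rng.normal(scale=0.5, size=54) + 27.0

X_cal, X_val, y_cal, y_val = train_test_split(X, wet_gluten, test_size=0.3, random_state=0)
pls = PLSRegression(n_components=5).fit(X_cal, y_cal)
y_hat = pls.predict(X_val).ravel()
print("R2_V =", r2_score(y_val, y_hat), "RMSEV =", mean_squared_error(y_val, y_hat) ** 0.5)
```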
Peri-Implant Crevicular Fluid Analysis, Enzymes and Biomarkers: a Systematic Review
Dursun, Erhan
2016-01-01
Objectives To review the current understanding of the biomarkers and enzymes associated with different forms of peri-implant diseases and how their level changes influence the pathogenesis of the inflammatory diseases around dental implants. Material and Methods An electronic search in two different databases was performed, including MEDLINE (PubMed) and EMBASE, between 1996 and 2016. Human studies analysing peri-implant crevicular fluid (PICF) biomarker and enzyme levels of implants with peri-implant mucositis and peri-implantitis, published in the English language, were evaluated. A systematic review was performed to assess which biomarkers and enzymes in PICF were used to identify the inflammatory conditions around dental implants. Results Fifty-one articles were identified, of which 41 were further evaluated and included in the analysis. Due to significant heterogeneity between included studies, a meta-analysis could not be performed. Instead, a systematic descriptive review was performed. Conclusions Biomarkers and enzymes in peri-implant crevicular fluid have shown promising results in differentiating peri-implant disease from health. However, due to inconsistent results and because much of the evidence comes from cross-sectional studies, additional evidence supported by randomized-controlled trials is needed to validate the links reported. PMID:27833734
NASA Technical Reports Server (NTRS)
Pham, Timothy T.; Machuzak, Richard J.; Bedrossian, Alina; Kelly, Richard M.; Liao, Jason C.
2012-01-01
This software provides an automated capability to measure and qualify the frequency stability performance of the Deep Space Network (DSN) ground system, using daily spacecraft tracking data. The results help to verify if the DSN performance is meeting its specification, therefore ensuring commitments to flight missions; in particular, the radio science investigations. The rich set of data also helps the DSN Operations and Maintenance team to identify trends and patterns, allowing them to identify the antennas of lower performance and implement corrective action in a timely manner. Unlike the traditional approach, where the performance can only be obtained from special calibration sessions that are both time-consuming and require manual setup, the new method taps into the daily spacecraft tracking data. This new approach significantly increases the amount of data available for analysis, roughly by two orders of magnitude, making it possible to conduct trend analysis with good confidence. The software is built with automation in mind for end-to-end processing. From input gathering to computational analysis and data visualization of the results, all steps are done automatically, making data production essentially cost-free. This allows the limited engineering resource to focus on high-level assessment and to follow up with the exceptions/deviations. To make it possible to process the continual stream of daily incoming data without much effort, and to understand the results quickly, the processing needs to be automated and the data summarized at a high level. Special attention needs to be given to data gathering, input validation, handling anomalous conditions, computation, and presenting the results in a visual form that makes it easy to spot items of exception/deviation so that further analysis can be directed and corrective actions followed.
Study on the CO2 electric driven fixed swash plate type compressor for eco-friendly vehicles
NASA Astrophysics Data System (ADS)
Nam, Donglim; Kim, Kitae; Lee, Jehie; Kwon, Yunki; Lee, Geonho
2017-08-01
The purpose of this study is to test and analyse the performance of an electric-driven fixed swash plate compressor using the alternative refrigerant R744 (CO2). A comprehensive simulation model of an electric-driven compressor using CO2 for eco-friendly vehicles is presented. This model consists of a compression model and a dynamic model. The compression model includes valve dynamics, leakage, and heat transfer models. The dynamic model includes frictional loss between the piston ring and cylinder wall, frictional loss between the shoe and swash plate, frictional loss of the bearings, and electric efficiency. In particular, because the efficiency of the electric parts (motor and inverter) in the compressor affects the compressor losses, a dynamometer test was performed. The designed compressor was manufactured and its performance was tested under a variety of pressure conditions. The performance analysis results were then compared with the performance test results.
ERIC Educational Resources Information Center
Kai, Jiang
2009-01-01
Accountability, which is closely related to evaluation of efficiency, effectiveness, and performance, requires proving that higher education has achieved planned results and performance in an effective manner. Highlighting efficiency and effectiveness and emphasizing results and outcomes are the basic characteristics of accountability in higher…
Teodoro, George; Kurc, Tahsin; Andrade, Guilherme; Kong, Jun; Ferreira, Renato; Saltz, Joel
2015-01-01
We carry out a comparative performance study of multi-core CPUs, GPUs and Intel Xeon Phi (Many Integrated Core-MIC) with a microscopy image analysis application. We experimentally evaluate the performance of computing devices on core operations of the application. We correlate the observed performance with the characteristics of computing devices and data access patterns, computation complexities, and parallelization forms of the operations. The results show a significant variability in the performance of operations with respect to the device used. The performance of operations with regular data access on a MIC is comparable to, and sometimes better than, that on a GPU. GPUs are more efficient than MICs for operations that access data irregularly, because of the lower bandwidth of the MIC for random data accesses. We propose new performance-aware scheduling strategies that consider variabilities in operation speedups. Our scheduling strategies significantly improve application performance compared to classic strategies in hybrid configurations. PMID:28239253
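As a toy illustration of what a speedup-aware assignment of operations to devices could look like, consider the greedy sketch below; the operation names, speedup figures and policy are invented for illustration and are not the scheduling strategies proposed in the paper.

```python
# Hypothetical per-operation speedups relative to a single CPU core (illustrative numbers only)
speedup = {
    "morph_recon":  {"cpu": 1.0, "gpu": 15.0, "mic": 6.0},   # irregular data access
    "color_deconv": {"cpu": 1.0, "gpu": 11.0, "mic": 10.0},  # regular data access
    "distance_map": {"cpu": 1.0, "gpu": 9.0,  "mic": 4.0},
}

def performance_aware_schedule(tasks, devices=("cpu", "gpu", "mic")):
    """Greedily assign each task to the device that finishes it earliest,
    accounting for per-device speedups and the load already scheduled."""
    finish_time = {d: 0.0 for d in devices}
    plan = []
    # Handle the most speedup-sensitive operations first
    for op, base_cost in sorted(tasks, key=lambda t: -max(speedup[t[0]].values())):
        best = min(devices, key=lambda d: finish_time[d] + base_cost / speedup[op][d])
        finish_time[best] += base_cost / speedup[op][best]
        plan.append((op, best))
    return plan, max(finish_time.values())

tasks = [("morph_recon", 30.0), ("color_deconv", 20.0), ("distance_map", 10.0)] * 4
print(performance_aware_schedule(tasks))
```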
Multiplex network analysis of employee performance and employee social relationships
NASA Astrophysics Data System (ADS)
Cai, Meng; Wang, Wei; Cui, Ying; Stanley, H. Eugene
2018-01-01
In human resource management, employee performance is strongly affected by both formal and informal employee networks. Most previous research on employee performance has focused on monolayer networks that can represent only single categories of employee social relationships. We study employee performance by taking into account the entire multiplex structure of underlying employee social networks. We collect three datasets consisting of five different employee relationship categories in three firms, and predict employee performance using degree centrality and eigenvector centrality in a superimposed multiplex network (SMN) and an unfolded multiplex network (UMN). We use a quadratic assignment procedure (QAP) analysis and a regression analysis to demonstrate that the different categories of relationship are mutually embedded and that the strength of their impact on employee performance differs. We also use weighted/unweighted SMN/UMN to measure the predictive accuracy of this approach and find that employees with high centrality in a weighted UMN are more likely to perform well. Our results shed new light on how social structures affect employee performance.
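The two centrality measures used above can be computed on a superimposed multiplex network with networkx, as in the sketch below; the employees, layers and ties are hypothetical, not the firms' data.

```python
import networkx as nx

# Hypothetical relationship layers among five employees (names and ties are illustrative)
layers = {
    "advice":     [("A", "B"), ("A", "C"), ("B", "D")],
    "friendship": [("B", "C"), ("C", "D"), ("D", "E")],
    "workflow":   [("A", "D"), ("B", "E")],
}

# Superimposed multiplex network (SMN): merge layers, weighting each edge by
# the number of layers in which it appears
smn = nx.Graph()
for edges in layers.values():
    for u, v in edges:
        w = smn[u][v]["weight"] + 1 if smn.has_edge(u, v) else 1
        smn.add_edge(u, v, weight=w)

degree = nx.degree_centrality(smn)
eigen = nx.eigenvector_centrality_numpy(smn, weight="weight")
print({n: round(degree[n], 2) for n in smn}, {n: round(eigen[n], 2) for n in smn})
```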
77 FR 37717 - Electrical Cable Test Results and Analysis During Fire Exposure
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-22
... Fire Exposure AGENCY: Nuclear Regulatory Commission. ACTION: Draft NUREG; request for comment. SUMMARY...-2128, ``Electrical Cable Test Results and Analysis during Fire Exposure (ELECTRA-FIRE), A Consolidation of the Three Major Fire-Induced Circuit and Cable Failure Experiments Performed between 2001 and 2011...
Delakis, Ioannis; Wise, Robert; Morris, Lauren; Kulama, Eugenia
2015-11-01
The purpose of this work was to evaluate the contrast-detail performance of full field digital mammography (FFDM) systems using ideal (Hotelling) observer Signal-to-Noise Ratio (SNR) methodology and ascertain whether it can be considered an alternative to the conventional, automated analysis of CDMAM phantom images. Five FFDM units currently used in the national breast screening programme were evaluated, which differed with respect to age, detector, Automatic Exposure Control (AEC) and target/filter combination. Contrast-detail performance was analysed using CDMAM and ideal observer SNR methodology. The ideal observer SNR was calculated for input signals originating from gold discs of varying thicknesses and diameters, and then used to estimate the threshold gold thickness for each diameter as per CDMAM analysis. The variability of both methods and the dependence of CDMAM analysis on phantom manufacturing discrepancies were also investigated. Results from both CDMAM and ideal observer methodologies were informative differentiators of FFDM systems' contrast-detail performance, displaying comparable patterns with respect to the FFDM systems' type and age. CDMAM results suggested higher threshold gold thickness values compared with the ideal observer methodology, especially for small-diameter details, which can be attributed to the behaviour of the CDMAM phantom used in this study. In addition, ideal observer methodology results showed lower variability than CDMAM results. The ideal observer SNR methodology can provide a useful metric of the FFDM systems' contrast-detail characteristics and could be considered a surrogate for conventional, automated analysis of CDMAM images. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
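For a known signal in Gaussian noise, the ideal (Hotelling) observer SNR reduces to SNR^2 = s^T K^-1 s. A minimal one-dimensional sketch follows, with an invented signal profile and noise covariance rather than measured FFDM data; a real implementation would work on 2-D disc templates and an estimated image noise covariance.

```python
import numpy as np

def hotelling_snr(signal: np.ndarray, noise_cov: np.ndarray) -> float:
    """Ideal (Hotelling) observer SNR for a known signal in Gaussian noise."""
    s = signal.ravel()
    return float(np.sqrt(s @ np.linalg.solve(noise_cov, s)))

# Toy 1-D example: a small disc-like contrast profile in stationary correlated noise
n = 64
x = np.arange(n)
signal = 0.02 * np.exp(-0.5 * ((x - n / 2) / 3.0) ** 2)            # "gold disc" profile
noise_cov = 0.01 * np.exp(-np.abs(x[:, None] - x[None, :]) / 5.0)  # exponential correlation
print("ideal observer SNR:", hotelling_snr(signal, noise_cov))
```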
Comparison of normalization methods for the analysis of metagenomic gene abundance data.
Pereira, Mariana Buongermino; Wallroth, Mikael; Jonsson, Viktor; Kristiansson, Erik
2018-04-20
In shotgun metagenomics, microbial communities are studied through direct sequencing of DNA without any prior cultivation. By comparing gene abundances estimated from the generated sequencing reads, functional differences between the communities can be identified. However, gene abundance data is affected by high levels of systematic variability, which can greatly reduce the statistical power and introduce false positives. Normalization, which is the process where systematic variability is identified and removed, is therefore a vital part of the data analysis. A wide range of normalization methods for high-dimensional count data has been proposed, but their performance on the analysis of shotgun metagenomic data has not been evaluated. Here, we present a systematic evaluation of nine normalization methods for gene abundance data. The methods were evaluated through resampling of three comprehensive datasets, creating a realistic setting that preserved the unique characteristics of metagenomic data. Performance was measured in terms of the methods' ability to identify differentially abundant genes (DAGs), correctly calculate unbiased p-values and control the false discovery rate (FDR). Our results showed that the choice of normalization method has a large impact on the end results. When the DAGs were asymmetrically present between the experimental conditions, many normalization methods had a reduced true positive rate (TPR) and a high false positive rate (FPR). The methods trimmed mean of M-values (TMM) and relative log expression (RLE) had the overall highest performance and are therefore recommended for the analysis of gene abundance data. For larger sample sizes, CSS also showed satisfactory performance. This study emphasizes the importance of selecting a suitable normalization method in the analysis of data from shotgun metagenomics. Our results also demonstrate that improper methods may result in unacceptably high levels of false positives, which in turn may lead to incorrect or obfuscated biological interpretation.
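The RLE method recommended above is essentially a median-of-ratios scaling; a simplified sketch on synthetic counts follows (TMM adds trimming steps not shown here, and the data are invented).

```python
import numpy as np

def rle_size_factors(counts: np.ndarray) -> np.ndarray:
    """Relative log expression (median-of-ratios) scaling factors.
    `counts` is a genes x samples matrix of gene abundances."""
    logs = np.log(counts.astype(float))
    # Reference: per-gene geometric mean across samples, using genes seen in all samples
    finite = np.all(np.isfinite(logs), axis=1)
    ref = logs[finite].mean(axis=1, keepdims=True)
    # Per-sample factor: median ratio of that sample to the reference
    return np.exp(np.median(logs[finite] - ref, axis=0))

rng = np.random.default_rng(1)
counts = rng.poisson(lam=50, size=(500, 4)) * np.array([1.0, 2.0, 0.5, 1.5])  # library-size shifts
factors = rle_size_factors(counts)
print(factors)                    # roughly proportional to 1.0, 2.0, 0.5, 1.5
normalized = counts / factors     # abundances on a comparable scale across samples
```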
NASA Technical Reports Server (NTRS)
Johnston, John D.; Parrish, Keith; Howard, Joseph M.; Mosier, Gary E.; McGinnis, Mark; Bluth, Marcel; Kim, Kevin; Ha, Hong Q.
2004-01-01
This is a continuation of a series of papers on modeling activities for JWST. The structural-thermal-optical, often referred to as "STOP", analysis process is used to predict the effect of thermal distortion on optical performance. The benchmark STOP analysis for JWST assesses the effect of an observatory slew on wavefront error. The paper begins with an overview of multi-disciplinary engineering analysis, or integrated modeling, which is a critical element of the JWST mission. The STOP analysis process is then described. This process consists of the following steps: thermal analysis, structural analysis, and optical analysis. Temperatures predicted using geometric and thermal math models are mapped to the structural finite element model in order to predict thermally-induced deformations. Motions and deformations at optical surfaces are input to optical models and optical performance is predicted using either an optical ray trace or WFE estimation techniques based on prior ray traces or first order optics. Following the discussion of the analysis process, results are presented based on models representing the design at the time of the System Requirements Review. In addition to baseline performance predictions, sensitivity studies are performed to assess modeling uncertainties. Of particular interest is the sensitivity of optical performance to uncertainties in temperature predictions and variations in metal properties. The paper concludes with a discussion of modeling uncertainty as it pertains to STOP analysis.
NASA Technical Reports Server (NTRS)
Scola, Salvatore; Stavely, Rebecca; Jackson, Trevor; Boyer, Charlie; Osmundsen, Jim; Turczynski, Craig; Stimson, Chad
2016-01-01
Performance-related effects of system level temperature changes can be a key consideration in the design of many types of optical instruments. This is especially true for space-based imagers, which may require complex thermal control systems to maintain alignment of the optical components. Structural-Thermal-Optical-Performance (STOP) analysis is a multi-disciplinary process that can be used to assess the performance of these optical systems when subjected to the expected design environment. This type of analysis can be very time consuming, which makes it difficult to use as a trade study tool early in the project life cycle. In many cases, only one or two iterations can be performed over the course of a project. This limits the design space to best practices since it may be too difficult, or take too long, to test new concepts analytically. In order to overcome this challenge, automation and a standard procedure for performing these studies are essential. A methodology was developed within the framework of the Comet software tool that captures the basic inputs, outputs, and processes used in most STOP analyses. This resulted in a generic, reusable analysis template that can be used for design trades for a variety of optical systems. The template captures much of the upfront setup such as meshing, boundary conditions, data transfer, naming conventions, and post-processing, and therefore saves time for each subsequent project. A description of the methodology and the analysis template is presented, and results are described for a simple telescope optical system.
Probabilistic Structural Evaluation of Uncertainties in Radiator Sandwich Panel Design
NASA Technical Reports Server (NTRS)
Kuguoglu, Latife; Ludwiczak, Damian
2006-01-01
The Jupiter Icy Moons Orbiter (JIMO) Space System is part of the NASA's Prometheus Program. As part of the JIMO engineering team at NASA Glenn Research Center, the structural design of the JIMO Heat Rejection Subsystem (HRS) is evaluated. An initial goal of this study was to perform sensitivity analyses to determine the relative importance of the input variables on the structural responses of the radiator panel. The desire was to let the sensitivity analysis information identify the important parameters. The probabilistic analysis methods illustrated here support this objective. The probabilistic structural performance evaluation of a HRS radiator sandwich panel was performed. The radiator panel structural performance was assessed in the presence of uncertainties in the loading, fabrication process variables, and material properties. The stress and displacement contours of the deterministic structural analysis at mean probability were computed and the results presented. This was followed by a probabilistic evaluation to determine the effect of the primitive variables on the radiator panel structural performance. Based on uncertainties in material properties, structural geometry and loading, the results of the displacement and stress analysis are used as an input file for the probabilistic analysis of the panel. The sensitivity of the structural responses, such as maximum displacement and maximum tensile and compressive stresses of the facesheet in x and y directions and maximum von Mises stresses of the tube, to the loading and design variables is determined under the boundary condition where all edges of the radiator panel are pinned. Based on this study, design critical material and geometric parameters of the considered sandwich panel are identified.
Wang, Tong; Wu, Hai-Long; Xie, Li-Xia; Zhu, Li; Liu, Zhi; Sun, Xiao-Dong; Xiao, Rong; Yu, Ru-Qin
2017-04-01
In this work, a smart chemometrics-enhanced strategy, high-performance liquid chromatography, and diode array detection coupled with second-order calibration method based on alternating trilinear decomposition algorithm was proposed to simultaneously quantify 12 polyphenols in different kinds of apple peel and pulp samples. The proposed strategy proved to be a powerful tool to solve the problems of coelution, unknown interferences, and chromatographic shifts in the process of high-performance liquid chromatography analysis, making it possible for the determination of 12 polyphenols in complex apple matrices within 10 min under simple conditions of elution. The average recoveries with standard deviations, and figures of merit including sensitivity, selectivity, limit of detection, and limit of quantitation were calculated to validate the accuracy of the proposed method. Compared to the quantitative analysis results from the classic high-performance liquid chromatography method, the statistical and graphical analysis showed that our proposed strategy obtained more reliable results. All results indicated that our proposed method used in the quantitative analysis of apple polyphenols was an accurate, fast, universal, simple, and green one, and it was expected to be developed as an attractive alternative method for simultaneous determination of multitargeted analytes in complex matrices. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Probabilistic Evaluation of Advanced Ceramic Matrix Composite Structures
NASA Technical Reports Server (NTRS)
Abumeri, Galib H.; Chamis, Christos C.
2003-01-01
The objective of this report is to summarize the deterministic and probabilistic structural evaluation results of two structures made with advanced ceramic composites (CMC): internally pressurized tube and uniformly loaded flange. The deterministic structural evaluation includes stress, displacement, and buckling analyses. It is carried out using the finite element code MHOST, developed for the 3-D inelastic analysis of structures that are made with advanced materials. The probabilistic evaluation is performed using the integrated probabilistic assessment of composite structures computer code IPACS. The effects of uncertainties in primitive variables related to the material, fabrication process, and loadings on the material property and structural response behavior are quantified. The primitive variables considered are: thermo-mechanical properties of fiber and matrix, fiber and void volume ratios, use temperature, and pressure. The probabilistic structural analysis and probabilistic strength results are used by IPACS to perform reliability and risk evaluation of the two structures. The results will show that the sensitivity information obtained for the two composite structures from the computational simulation can be used to alter the design process to meet desired service requirements. In addition to detailed probabilistic analysis of the two structures, the following were performed specifically on the CMC tube: (1) predicted the failure load and the buckling load, (2) performed coupled non-deterministic multi-disciplinary structural analysis, and (3) demonstrated that probabilistic sensitivities can be used to select a reduced set of design variables for optimization.
Plenis, Alina; Rekowska, Natalia; Bączek, Tomasz
2016-01-01
This article focuses on correlating the column classification obtained from the method created at the Katholieke Universiteit Leuven (KUL), with the chromatographic resolution attained in biomedical separation. In the KUL system, each column is described with four parameters, which enables estimation of the FKUL value characterising similarity of those parameters to the selected reference stationary phase. Thus, a ranking list based on the FKUL value can be calculated for the chosen reference column, then correlated with the results of the column performance test. In this study, the column performance test was based on analysis of moclobemide and its two metabolites in human plasma by liquid chromatography (LC), using 18 columns. The comparative study was performed using traditional correlation of the FKUL values with the retention parameters of the analytes describing the column performance test. In order to deepen the comparative assessment of both data sets, factor analysis (FA) was also used. The obtained results indicated that the stationary phase classes, closely related according to the KUL method, yielded comparable separation for the target substances. Therefore, the column ranking system based on the FKUL-values could be considered supportive in the choice of the appropriate column for biomedical analysis. PMID:26805819
Performance deterioration based on existing (historical) data; JT9D jet engine diagnostics program
NASA Technical Reports Server (NTRS)
Sallee, G. P.
1978-01-01
The results of the collection and analysis of historical data pertaining to the deterioration of JT9D engine performance are presented. The results of analyses of prerepair and postrepair engine test stand performance data from a number of airlines to establish the individual as well as average losses in engine performance with respect to service use are included. Analysis of the changes in mechanical condition of parts, obtained by inspection of used gas-path parts of varying age, allowed preliminary assessments of component performance deterioration levels and identification of the causative factors. These component performance estimates, refined by data from special engine back-to-back testing related to module performance restoration, permitted the development of preliminary models of engine component/module performance deterioration with respect to usage. The preliminary assessment of the causes of module performance deterioration and the trends with usage are explained, along with the role each module plays in overall engine performance deterioration. Preliminary recommendations with respect to operating and maintenance practices which could be adopted to control the level of performance deterioration are presented. The needs for additional component sensitivity testing as well as outstanding issues are discussed.
NASA Technical Reports Server (NTRS)
Hailperin, M.
1993-01-01
This thesis provides design and analysis of techniques for global load balancing on ensemble architectures running soft-real-time object-oriented applications with statistically periodic loads. It focuses on estimating the instantaneous average load over all the processing elements. The major contribution is the use of explicit stochastic process models for both the loading and the averaging itself. These models are exploited via statistical time-series analysis and Bayesian inference to provide improved average load estimates, and thus to facilitate global load balancing. This thesis explains the distributed algorithms used and provides some optimality results. It also describes the algorithms' implementation and gives performance results from simulation. These results show that the author's techniques allow more accurate estimation of the global system loading, resulting in fewer object migrations than local methods. The author's method is shown to provide superior performance, relative not only to static load-balancing schemes but also to many adaptive load-balancing methods. Results from a preliminary analysis of another system and from simulation with a synthetic load provide some evidence of more general applicability.
Performance analysis of LAN bridges and routers
NASA Technical Reports Server (NTRS)
Hajare, Ankur R.
1991-01-01
Bridges and routers are used to interconnect Local Area Networks (LANs). The performance of these devices is important since they can become bottlenecks in large multi-segment networks. Performance metrics and test methodology for bridges and routers were not standardized. Performance data reported by vendors is not applicable to the actual scenarios encountered in an operational network. However, vendor-provided data can be used to calibrate models of bridges and routers that, along with other models, yield performance data for a network. Several tools are available for modeling bridges and routers - Network II.5 was used. The results of the analysis of some bridges and routers are presented.
The Effects of Videotape Modeling on Staff Acquisition of Functional Analysis Methodology
Moore, James W; Fisher, Wayne W
2007-01-01
Lectures and two types of video modeling were compared to determine their relative effectiveness in training 3 staff members to conduct functional analysis sessions. Video modeling that contained a larger number of therapist exemplars resulted in mastery-level performance eight of the nine times it was introduced, whereas neither lectures nor partial video modeling produced significant improvements in performance. Results demonstrated that video modeling provided an effective training strategy but only when a wide range of exemplars of potential therapist behaviors were depicted in the videotape. PMID:17471805
NASA Astrophysics Data System (ADS)
Xu, Jun; Dang, Chao; Kong, Fan
2017-10-01
This paper presents a new method for efficient structural reliability analysis. In this method, a rotational quasi-symmetric point method (RQ-SPM) is proposed for evaluating the fractional moments of the performance function. Then, the derivation of the performance function's probability density function (PDF) is carried out based on the maximum entropy method in which constraints are specified in terms of fractional moments. In this regard, the probability of failure can be obtained by a simple integral over the performance function's PDF. Six examples, including a finite element-based reliability analysis and a dynamic system with strong nonlinearity, are used to illustrate the efficacy of the proposed method. All the computed results are compared with those by Monte Carlo simulation (MCS). It is found that the proposed method can provide very accurate results with low computational effort.
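The fractional moments E[|g|^alpha] that serve as constraints in the maximum-entropy step, and a reference failure probability, can be illustrated on a toy performance function; the function below is invented, and the brute-force Monte Carlo sampling shown is only a baseline for comparison, not the RQ-SPM scheme itself.

```python
import numpy as np

rng = np.random.default_rng(42)

def g(x):
    """Toy performance function: failure when g(x) <= 0 (illustrative, not from the paper)."""
    return 3.0 - x[:, 0] ** 2 - 0.5 * x[:, 1]

# Two standard-normal random inputs
x = rng.standard_normal((1_000_000, 2))
gx = g(x)

# Fractional moments E[|g|^alpha] that a maximum-entropy fit would use as constraints
for alpha in (0.25, 0.5, 0.75):
    print(f"E[|g|^{alpha}] ~ {np.mean(np.abs(gx) ** alpha):.4f}")

# Brute-force Monte Carlo reference for the failure probability
print("Pf (MCS) ~", np.mean(gx <= 0.0))
```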
MTF measurements on real time for performance analysis of electro-optical systems
NASA Astrophysics Data System (ADS)
Stuchi, Jose Augusto; Signoreto Barbarini, Elisa; Vieira, Flavio Pascoal; dos Santos, Daniel, Jr.; Stefani, Mário Antonio; Yasuoka, Fatima Maria Mitsue; Castro Neto, Jarbas C.; Linhari Rodrigues, Evandro Luis
2012-06-01
The need for methods and tools that assist in determining the performance of optical systems is currently increasing. One of the most widely used methods to analyze optical systems is to measure the Modulation Transfer Function (MTF). The MTF represents a direct and quantitative verification of image quality. This paper presents the implementation of software to calculate the MTF of electro-optical systems. The software was used for calculating the MTF of a Digital Fundus Camera, a Thermal Imager and an Ophthalmologic Surgery Microscope. The MTF information aids the analysis of alignment and the measurement of optical quality, and also defines the limiting resolution of optical systems. The results obtained with the Fundus Camera and Thermal Imager were compared with the theoretical values. For the Microscope, the results were compared with the MTF measured on a Zeiss model microscope, which is the quality standard for ophthalmological microscopes.
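One common way to obtain the MTF is as the normalized Fourier magnitude of a measured line spread function; a minimal sketch on a synthetic Gaussian LSF follows (the software described works on imaged targets and involves further processing, so this is only an illustration of the final step).

```python
import numpy as np

def mtf_from_lsf(lsf: np.ndarray, pixel_pitch_mm: float):
    """MTF as the normalized FFT magnitude of a measured line spread function (LSF)."""
    lsf = lsf - lsf.min()
    lsf = lsf / lsf.sum()
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]                                        # normalize to 1 at zero frequency
    freqs = np.fft.rfftfreq(lsf.size, d=pixel_pitch_mm)  # cycles per mm
    return freqs, mtf

# Synthetic Gaussian LSF sampled at a hypothetical 10 micrometre pixel pitch
x = np.arange(-64, 64)
lsf = np.exp(-0.5 * (x / 4.0) ** 2)
freqs, mtf = mtf_from_lsf(lsf, pixel_pitch_mm=0.01)
print("MTF50 ~", freqs[np.argmax(mtf < 0.5)], "cycles/mm")
```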
A Shot Number Based Approach to Performance Analysis in Table Tennis
Yoshida, Kazuto; Yamada, Koshi
2017-01-01
The current study proposes a novel approach that improves the conventional performance analysis in table tennis by introducing the concept of frequency, or the number of shots, of each shot number. The improvements over the conventional method are as follows: better accuracy in the evaluation of the skills and tactics of players, additional insights into scoring and returning skills, and ease of understanding the results with a single criterion. The performance analysis of matches played at the 2012 Summer Olympics in London was conducted using the proposed method. The results showed some effects of the shot number and gender differences in table tennis. Furthermore, comparisons were made between Chinese players and players from other countries, which shed light on the skills and tactics of the Chinese players. The present findings demonstrate that the proposed method provides useful information and has some advantages over the conventional method. PMID:28210334
Aerocapture Systems Analysis for a Neptune Mission
NASA Technical Reports Server (NTRS)
Lockwood, Mary Kae; Edquist, Karl T.; Starr, Brett R.; Hollis, Brian R.; Hrinda, Glenn A.; Bailey, Robert W.; Hall, Jeffery L.; Spilker, Thomas R.; Noca, Muriel A.; O'Kongo, N.
2006-01-01
A Systems Analysis was completed to determine the feasibility, benefit and risk of an aeroshell aerocapture system for Neptune and to identify technology gaps and technology performance goals. The systems analysis includes the following disciplines: science; mission design; aeroshell configuration; interplanetary navigation analyses; atmosphere modeling; computational fluid dynamics for aerodynamic performance and aeroheating environment; stability analyses; guidance development; atmospheric flight simulation; thermal protection system design; mass properties; structures; spacecraft design and packaging; and mass sensitivities. Results show that aerocapture is feasible and performance is adequate for the Neptune mission. Aerocapture can deliver 1.4 times more mass to Neptune orbit than an all-propulsive system for the same launch vehicle and results in a 3-4 year reduction in trip time compared to all-propulsive systems. Enabling technologies for this mission include TPS manufacturing; and aerothermodynamic methods for determining coupled 3-D convection, radiation and ablation aeroheating rates and loads.
Posttest RELAP4 analysis of LOFT experiment L1-4
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grush, W.H.; Holmstrom, H.L.O.
Results of posttest analysis of LOFT loss-of-coolant experiment L1-4 with the RELAP4 code are presented. The results are compared with the pretest prediction and the test data. Differences between the RELAP4 model used for this analysis and that used for the pretest prediction are in the areas of initial conditions, nodalization, emergency core cooling system, broken loop hot leg, and steam generator secondary. In general, these changes made only minor improvement in the comparison of the analytical results to the data. Also presented are the results of a limited study of LOFT downcomer modeling which compared the performance of the conventional single downcomer model with that of the new split downcomer model. A RELAP4 sensitivity calculation with artificially elevated emergency core coolant temperature was performed to highlight the need for an ECC mixing model in RELAP4.
Bhaduri, Anirban; Ghosh, Dipak
2016-01-01
The cardiac dynamics during meditation is explored quantitatively with two chaos-based non-linear techniques, viz. multi-fractal detrended fluctuation analysis and visibility network analysis. The data used are the instantaneous heart rate (in beats/minute) of subjects performing Kundalini Yoga and Chi meditation from PhysioNet. The results show consistent differences between the quantitative parameters obtained by both analysis techniques. This indicates an interesting phenomenon of change in the complexity of the cardiac dynamics during meditation, supported by quantitative parameters. The results also provide preliminary evidence that these techniques can be used as a measure of physiological impact on subjects performing meditation. PMID:26909045
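A monofractal detrended fluctuation analysis, the building block that the multi-fractal variant above generalizes with q-order moments, can be sketched as follows on a synthetic heart-rate series; the series and scales are illustrative, not the PhysioNet data.

```python
import numpy as np

def dfa(series: np.ndarray, scales=(4, 8, 16, 32, 64)) -> float:
    """Monofractal detrended fluctuation analysis: returns the scaling exponent,
    i.e. the slope of log F(s) versus log s."""
    y = np.cumsum(series - np.mean(series))              # integrated profile
    f = []
    for s in scales:
        n_seg = y.size // s
        rms = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # local linear detrending
            rms.append(np.mean((seg - trend) ** 2))
        f.append(np.sqrt(np.mean(rms)))
    slope, _ = np.polyfit(np.log(scales), np.log(f), 1)
    return slope

# Synthetic heart-rate-like series (beats/minute); white noise should give an exponent near 0.5
rng = np.random.default_rng(3)
hr = 70 + 3 * rng.standard_normal(4096)
print("DFA exponent ~", round(dfa(hr), 2))
```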
Analysis of 238Pu and 56Fe Evaluated Data for Use in MYRRHA
NASA Astrophysics Data System (ADS)
Díez, C. J.; Cabellos, O.; Martínez, J. S.; Stankovskiy, A.; Van den Eynde, G.; Schillebeeckx, P.; Heyse, J.
2014-04-01
A sensitivity analysis of the multiplication factor, keff, to the cross section data has been carried out for the MYRRHA critical configuration in order to show the most relevant reactions. With these results, a further analysis of the 238Pu and 56Fe cross sections has been performed, comparing the evaluations provided in the JEFF-3.1.2 and ENDF/B-VII.1 libraries for these nuclides. Then, the effect in MYRRHA of the differences between evaluations is analysed, presenting the source of the differences. With these results, recommendations for the 56Fe and 238Pu evaluations are suggested. These calculations have been performed with SCALE6.1 and MCNPX-2.7e.
Specialized data analysis of SSME and advanced propulsion system vibration measurements
NASA Technical Reports Server (NTRS)
Coffin, Thomas; Swanson, Wayne L.; Jong, Yen-Yi
1993-01-01
The basic objectives of this contract were to perform detailed analysis and evaluation of dynamic data obtained during Space Shuttle Main Engine (SSME) test and flight operations, including analytical/statistical assessment of component dynamic performance, and to continue the development and implementation of analytical/statistical models to effectively define nominal component dynamic characteristics, detect anomalous behavior, and assess machinery operational conditions. This study was to provide timely assessment of engine component operational status, identify probable causes of malfunction, and define feasible engineering solutions. The work was performed under three broad tasks: (1) Analysis, Evaluation, and Documentation of SSME Dynamic Test Results; (2) Data Base and Analytical Model Development and Application; and (3) Development and Application of Vibration Signature Analysis Techniques.
Aeroelastic Stability of Idling Wind Turbines
NASA Astrophysics Data System (ADS)
Wang, Kai; Riziotis, Vasilis A.; Voutsinas, Spyros G.
2016-09-01
Wind turbine rotors in idling operation mode can experience high angles of attack within the post-stall region, which are capable of triggering stall-induced vibrations. In the present paper rotor stability in slow idling operation is assessed on the basis of non-linear time domain and linear eigenvalue analysis. Analysis is performed for a 10 MW conceptual wind turbine designed by DTU. First, the flow conditions that are likely to favour stall-induced instabilities are identified through non-linear time domain aeroelastic analysis. Next, for the above specified conditions, eigenvalue stability simulations are performed aiming at identifying the low damped modes of the turbine. Finally, the results of the eigenvalue analysis are evaluated through computations of the work of the aerodynamic forces by imposing harmonic vibrations following the shape and frequency of the various modes. Eigenvalue analysis indicates that the asymmetric and symmetric out-of-plane modes have the lowest damping. The results of the eigenvalue analysis agree well with those of the time domain analysis.
Blood pressure and neuropsychological test performance in healthy postmenopausal women.
Alsumali, Adnan; Mekary, Rania A; Seeger, John; Regestein, Quentin
2016-06-01
To study the association between blood pressure and neuropsychological test performance in healthy postmenopausal women. Data from 88 healthy postmenopausal women aged 46-73 years, who were not experiencing hot flashes, and who had participated in a prior drug trial, were analyzed to determine whether baseline blood pressure was associated with impaired performance on neuropsychological testing done at 3 follow-up visits separated by 4 weeks. Factor analysis was used to reduce the dimensions of neuropsychological test performance. Mixed linear modeling was used to evaluate the association between baseline blood pressure and repeatedly measured neuropsychological test performance at follow-up in a complete case analysis (n=53). In a sensitivity analysis (n=88), multiple imputation using the Markov Chain Monte Carlo method was used to account for missing data (blood pressure results) for some visits. The variables recording neuropsychological test performance were reduced to two main factors (Factor 1=selective attention; Factor 2=complex processing). In the complete case analysis, the association between a 20-mmHg increase in diastolic blood pressure and Factor 1 remained statistically significant after adjusting for potential confounders, both before adjusting for systolic blood pressure (slope=0.60; 95%CI=0.04, 1.16) and after adjusting for systolic blood pressure (slope=0.76; 95%CI=0.06, 1.47). The positive slopes indicated an increase in the time spent performing a given task (i.e., a decrease in neuropsychological test performance). No other significant associations were found between systolic blood pressure and either factor. The results did not materially change after applying the multiple-imputation method. An increase in diastolic blood pressure was associated with a decrease in neuropsychological test performance among older healthy postmenopausal women not experiencing hot flashes. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
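A mixed linear model with a random intercept per participant, of the kind described, can be sketched with statsmodels; the data and variable names below are simulated stand-ins, not the study's measurements.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in for the repeated-measures design (3 follow-up visits per woman);
# the variable names are illustrative, not those of the original study.
rng = np.random.default_rng(7)
n, visits = 60, 3
dbp = np.repeat(rng.normal(80, 10, n), visits)        # baseline diastolic BP (mmHg)
subject = np.repeat(np.arange(n), visits)
subj_effect = np.repeat(rng.normal(0, 2, n), visits)  # person-level random intercept
# Higher DBP -> more time on the selective-attention task (worse performance)
attention_time = 30 + 0.03 * dbp + subj_effect + rng.normal(0, 2, n * visits)
df = pd.DataFrame({"attention_time": attention_time, "dbp": dbp, "subject": subject})

# Mixed linear model with a random intercept per participant
fit = smf.mixedlm("attention_time ~ dbp", df, groups=df["subject"]).fit()
print(fit.summary())
```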
NASA Technical Reports Server (NTRS)
Schredder, J. M.
1988-01-01
A comparative analysis was performed, using both the Geometrical Theory of Diffraction (GTD) and traditional pathlength error analysis techniques, for predicting RF antenna gain performance and pointing corrections. The NASA/JPL 70 meter antenna with its shaped surface was analyzed for gravity loading over the range of elevation angles. Also analyzed were the effects of lateral and axial displacements of the subreflector. Significant differences were noted between the predictions of the two methods, in the effect of subreflector displacements, and in the optimal subreflector positions to focus a gravity-deformed main reflector. The results are of relevance to future design procedure.
Papantoniou, Panagiotis
2018-04-03
The present research has 2 main objectives. The first is to investigate whether latent model analysis through a structural equation model can be implemented on driving simulator data in order to define an unobserved driving performance variable. Subsequently, the second objective is to investigate and quantify the effect of several risk factors, including distraction sources, driver characteristics, and road and traffic environment, on overall driving performance rather than on independent driving performance measures. For the scope of the present research, 95 participants from all age groups were asked to drive under different types of distraction (conversation with passenger, cell phone use) in urban and rural road environments with low and high traffic volume in a driving simulator experiment. Then, in the framework of the statistical analysis, a correlation table is presented investigating the statistical relationships between driving simulator measures, and a structural equation model is developed in which overall driving performance is estimated as a latent variable based on several individual driving simulator measures. Results confirm the suitability of the structural equation model and indicate that the selection of the specific performance measures that define overall performance should be guided by a rule of representativeness between the selected variables. Moreover, conversation with the passenger was not found to have a statistically significant effect, indicating that drivers do not change their performance while conversing with a passenger compared to undistracted driving. On the other hand, results support the hypothesis that cell phone use has a negative effect on driving performance. Furthermore, regarding driver characteristics, age, gender, and experience all have a significant effect on driving performance, indicating that driver-related characteristics play the most crucial role in overall driving performance. The findings of this study allow a new approach to the investigation of driving behavior in driving simulator experiments and in general. By the successful implementation of the structural equation model, driving behavior can be assessed in terms of overall performance and not through individual performance measures, which allows an important scientific step forward from piecemeal analyses to a sound combined analysis of the interrelationship between several risk factors and overall driving performance.
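As a simplified stand-in for the structural equation model, a latent overall-performance score can be extracted with a one-factor model and then regressed on risk factors; the simulator measures, factor names and generated data below are invented for illustration and are not the study's variables or SEM specification.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler
import statsmodels.formula.api as smf

# Simulated driving-simulator measures for 95 drivers; names are illustrative only.
rng = np.random.default_rng(11)
n = 95
latent = rng.standard_normal(n)                    # unobserved "overall performance"
measures = pd.DataFrame({
    "speed_var":  latent + 0.5 * rng.standard_normal(n),
    "lane_dev":   latent + 0.5 * rng.standard_normal(n),
    "react_time": latent + 0.5 * rng.standard_normal(n),
})
cell_phone = rng.integers(0, 2, n)                 # distraction condition (0/1)
age = rng.integers(20, 75, n)

# One-factor model standing in for the SEM's latent driving-performance variable
score = FactorAnalysis(n_components=1, random_state=0).fit_transform(
    StandardScaler().fit_transform(measures)).ravel()

df = pd.DataFrame({"performance": score, "cell_phone": cell_phone, "age": age})
print(smf.ols("performance ~ cell_phone + age", df).fit().summary())
```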
Free wake analysis of hover performance using a new influence coefficient method
NASA Technical Reports Server (NTRS)
Quackenbush, Todd R.; Bliss, Donald B.; Ong, Ching Cho; Ching, Cho Ong
1990-01-01
A new approach to the prediction of helicopter rotor performance using a free wake analysis was developed. This new method uses a relaxation process that does not suffer from the convergence problems associated with previous time marching simulations. This wake relaxation procedure was coupled to a vortex-lattice, lifting surface loads analysis to produce a novel, self contained performance prediction code: EHPIC (Evaluation of Helicopter Performance using Influence Coefficients). The major technical features of the EHPIC code are described and a substantial amount of background information on the capabilities and proper operation of the code is supplied. Sample problems were undertaken to demonstrate the robustness and flexibility of the basic approach. Also, a performance correlation study was carried out to establish the breadth of applicability of the code, with very favorable results.
A conflict analysis of 4D descent strategies in a metered, multiple-arrival route environment
NASA Technical Reports Server (NTRS)
Izumi, K. H.; Harris, C. S.
1990-01-01
A conflict analysis was performed on multiple arrival traffic at a typical metered airport. The Flow Management Evaluation Model (FMEM) was used to simulate arrival operations using Denver Stapleton's arrival route structure. Sensitivities of conflict performance to three different 4-D descent strategies (clean-idle Mach/Constant AirSpeed (CAS), constant descent angle Mach/CAS and energy optimal) were examined for three traffic mixes represented by those found at Denver Stapleton, John F. Kennedy and typical en route metering (ERM) airports. The Monte Carlo technique was used to generate simulation entry point times. Analysis results indicate that the clean-idle descent strategy offers the best compromise in overall performance. Performance measures primarily include susceptibility to conflict and conflict severity. Fuel usage performance is extrapolated from previous descent strategy studies.
Analysis of Phenix end-of-life natural convection test with the MARS-LMR code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeong, H. Y.; Ha, K. S.; Lee, K. L.
The end-of-life test of the Phenix reactor performed by the CEA provided an opportunity to obtain reliable and valuable test data for the validation and verification of an SFR system analysis code. KAERI joined this international program for the analysis of the Phenix end-of-life natural circulation test, coordinated by the IAEA, from 2008. The main objectives of this study were to evaluate the capability of the existing SFR system analysis code MARS-LMR and to identify any limitations of the code. The analysis was performed in three stages: pre-test analysis, blind post-test analysis, and final post-test analysis. In the pre-test analysis, the design conditions provided by the CEA were used to obtain a prediction of the test. The blind post-test analysis was based on the test conditions measured during the tests, but the test results were not provided by the CEA. The final post-test analysis was performed to predict the test results as accurately as possible by improving the previous modeling of the test. Based on the pre-test analysis and blind test analysis, the modeling of heat structures in the hot pool and cold pool, steel structures in the core, heat loss from the roof and vessel, and the flow path at the core outlet was reinforced in the final analysis. The results of the final post-test analysis could be characterized into three different phases. In the early phase, MARS-LMR simulated the heat-up process correctly due to the enhanced heat structure modeling. In the mid phase, before the opening of the SG casing, the code reproduced the decrease of core outlet temperature successfully. Finally, in the later phase, the increase of heat removal from the opening of the SG casing was well predicted with the MARS-LMR code. (authors)
DOT National Transportation Integrated Search
2010-05-01
This report documents the results of a strategic job analysis that examined the job tasks and knowledge, skills, abilities, and other characteristics (KSAOs) needed to perform the job of a work schedule manager. The strategic job analysis compared in...
2017-01-01
Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers (QIBs) to measure changes in these features. Critical to the performance of a QIB in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in design, analysis methods, and the metrics used to assess a QIB for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design new studies. The Radiological Society of North America (RSNA) and the Quantitative Imaging Biomarker Alliance (QIBA), with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of QIB performance studies so that results from multiple studies can be compared, contrasted, or combined. PMID:24919831
Analysis of swimming performance: perceptions and practices of US-based swimming coaches.
Mooney, Robert; Corley, Gavin; Godfrey, Alan; Osborough, Conor; Newell, John; Quinlan, Leo Richard; ÓLaighin, Gearóid
2016-01-01
In elite swimming, a broad range of methods are used to assess performance, inform coaching practices and monitor athletic progression. The aim of this paper was to examine the performance analysis practices of swimming coaches and to explore the reasons behind the decisions that coaches take when analysing performance. Survey data were analysed from 298 Level 3 competitive swimming coaches (245 male, 53 female) based in the United States. Results were compiled to provide a generalised picture of practices and perceptions and to examine key emerging themes. It was found that a disparity exists between the importance swim coaches place on biomechanical analysis of swimming performance and the types of analyses that are actually conducted. Video-based methods are most frequently employed, with over 70% of coaches using these methods at least monthly; these analyses are mainly qualitative rather than quantitative. Barriers to the more widespread use of quantitative biomechanical analysis in elite swimming environments were explored. Constraints include time, cost and availability of resources, but other factors such as sources of information on swimming performance and analysis and control over service provision are also discussed, with particular emphasis on video-based methods and emerging sensor-based technologies.
Quantitative analysis of regional myocardial performance in coronary artery disease
NASA Technical Reports Server (NTRS)
Stewart, D. K.; Dodge, H. T.; Frimer, M.
1975-01-01
Findings are presented for a group of subjects with significant coronary artery stenosis and for a group of controls, determined by use of a quantitative method for the study of regional myocardial performance based on the frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions, timing, and velocity of contraction. The results were compared with the method of subjective assessment used clinically.
Dual nozzle aerodynamic and cooling analysis study. [dual throat and dual expander nozzles
NASA Technical Reports Server (NTRS)
Meagher, G. M.
1980-01-01
Geometric, aerodynamic flow field, performance prediction, and heat transfer analyses are considered for two advanced chamber nozzle concepts applicable to Earth-to-orbit engine systems. Topics covered include improvements to the dual throat aerodynamic and performance prediction program; geometric and flow field analyses of the dual expander concept; heat transfer analysis of both concepts; and engineering analysis of data from the NASA/MSFC hot-fire testing of a dual throat thruster model thrust chamber assembly. Preliminary results obtained are presented in graphs.
NASA Technical Reports Server (NTRS)
Smith, Timothy D.; Steffen, Christopher J., Jr.; Yungster, Shaye; Keller, Dennis J.
1998-01-01
The all-rocket mode of operation is shown to be a critical factor in the overall performance of a rocket-based combined cycle (RBCC) vehicle. An axisymmetric RBCC engine was used to determine specific impulse efficiency values for both full-flow and gas generator configurations. Design-of-experiments methodology was used to construct a test matrix, and multiple linear regression analysis was used to build parametric models. The main parameters investigated in this study were: rocket chamber pressure, rocket exit area ratio, injected secondary flow, mixer-ejector inlet area, mixer-ejector area ratio, and mixer-ejector length-to-inlet diameter ratio. A perfect-gas computational fluid dynamics analysis, using both the Spalart-Allmaras and k-omega turbulence models, was performed with the NPARC code to obtain values of vacuum specific impulse. Results from the multiple linear regression analysis showed that, for both the full-flow and gas generator configurations, increasing the mixer-ejector area ratio and rocket area ratio increases performance, while increasing the mixer-ejector inlet area ratio and mixer-ejector length-to-diameter ratio decreases performance. Increasing injected secondary flow increased performance for the gas generator analysis, but was not statistically significant for the full-flow analysis. Chamber pressure was found to be not statistically significant.
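As a rough illustration of the regression step described above, the sketch below fits a multiple linear regression to a small coded test matrix and reports the fitted coefficients. It is a minimal sketch only: the coded levels form a resolution-III fractional factorial chosen here for illustration, and the Isp values are invented; neither comes from the NPARC study.

import numpy as np

# Six factors at two coded levels (-1/+1): a 2^(6-3) fractional factorial.
# Columns: chamber pressure, rocket area ratio, secondary flow,
# mixer-ejector inlet area, mixer-ejector area ratio, L/D.
X = np.array([
    [-1, -1, -1,  1,  1,  1],
    [ 1, -1, -1, -1, -1,  1],
    [-1,  1, -1, -1,  1, -1],
    [ 1,  1, -1,  1, -1, -1],
    [-1, -1,  1,  1, -1, -1],
    [ 1, -1,  1, -1,  1, -1],
    [-1,  1,  1, -1, -1,  1],
    [ 1,  1,  1,  1,  1,  1],
], dtype=float)
isp = np.array([342.0, 351.0, 355.0, 347.0, 349.0, 344.0, 358.0, 352.0])  # synthetic responses

A = np.column_stack([np.ones(len(X)), X])       # add intercept column
coef, *_ = np.linalg.lstsq(A, isp, rcond=None)  # ordinary least-squares fit
names = ["intercept", "Pc", "rocket AR", "sec. flow", "inlet area", "mixer AR", "L/D"]
for name, c in zip(names, coef):
    print(f"{name:>10s}: {c:+.2f}")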
How to Perform a Systematic Review and Meta-analysis of Diagnostic Imaging Studies.
Cronin, Paul; Kelly, Aine Marie; Altaee, Duaa; Foerster, Bradley; Petrou, Myria; Dwamena, Ben A
2018-05-01
A systematic review is a comprehensive search, critical evaluation, and synthesis of all the relevant studies on a specific (clinical) topic that can be applied to the evaluation of diagnostic and screening imaging studies. It can be a qualitative or a quantitative (meta-analysis) review of available literature. A meta-analysis uses statistical methods to combine and summarize the results of several studies. In this review, a 12-step approach to performing a systematic review (and meta-analysis) is outlined under the four domains: (1) Problem Formulation and Data Acquisition, (2) Quality Appraisal of Eligible Studies, (3) Statistical Analysis of Quantitative Data, and (4) Clinical Interpretation of the Evidence. This review is specifically geared toward the performance of a systematic review and meta-analysis of diagnostic test accuracy (imaging) studies. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
Integrative analysis of environmental sequences using MEGAN4.
Huson, Daniel H; Mitra, Suparna; Ruscheweyh, Hans-Joachim; Weber, Nico; Schuster, Stephan C
2011-09-01
A major challenge in the analysis of environmental sequences is data integration. The question is how to analyze different types of data in a unified approach, addressing both the taxonomic and functional aspects. To facilitate such analyses, we have substantially extended MEGAN, a widely used taxonomic analysis program. The new program, MEGAN4, provides an integrated approach to the taxonomic and functional analysis of metagenomic, metatranscriptomic, metaproteomic, and rRNA data. While taxonomic analysis is performed based on the NCBI taxonomy, functional analysis is performed using the SEED classification of subsystems and functional roles or the KEGG classification of pathways and enzymes. A number of examples illustrate how such analyses can be performed, and show that one can also import and compare classification results obtained using others' tools. MEGAN4 is freely available for academic purposes, and installers for all three major operating systems can be downloaded from www-ab.informatik.uni-tuebingen.de/software/megan.
DOD/NASA system impact analysis (study 2.1). Volume 2: Study results
NASA Technical Reports Server (NTRS)
1973-01-01
Results of the tug turnaround cost study and the space transportation system (STS) abort modes and effects study are presented for the DOD/NASA system impact analysis. Cost estimates are given for tug turnaround; and vehicle description, abort assessment, and abort performance capability are given for the STS.
Role of Copper in the Performance of CdS/CdTe Solar Cells (Poster)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Demtsu, S.; Albin, D.; Sites, J.
2006-05-01
The performance of CdS/CdTe solar cells made with evaporated Cu as a primary back contact was studied through current-voltage (JV) at different intensities, quantum efficiency (QE) under light and voltage bias, capacitance-voltage (CV), and drive-level capacitance profiling (DLCP) measurements. The results show that while modest amounts of Cu enhance cell performance, excessive amounts degrade device quality and reduce performance. The analysis is supported with numerical simulations to reproduce and explain some of the experimental results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zaininger, H.W.
1998-08-01
This report describes the results of an analysis to determine the economic and operational value of battery storage to wind and photovoltaic (PV) generation technologies to the Sacramento Municipal Utility District (SMUD) system. The analysis approach consisted of performing a benefit-cost economic assessment using established SMUD financial parameters, system expansion plans, and current system operating procedures. This report presents the results of the analysis. Section 2 describes expected wind and PV plant performance. Section 3 describes expected benefits to SMUD associated with employing battery storage. Section 4 presents preliminary benefit-cost results for battery storage added at the Solano wind plant and the Hedge PV plant. Section 5 presents conclusions and recommendations resulting from this analysis. The results of this analysis should be reviewed subject to the following caveat. The assumptions and data used in developing these results were based on reports available from and interaction with appropriate SMUD operating, planning, and design personnel in 1994 and early 1995 and are compatible with financial assumptions and system expansion plans as of that time. Assumptions and SMUD expansion plans have changed since then. In particular, SMUD did not install the additional 45 MW of wind that was planned for 1996. Current SMUD expansion plans and assumptions should be obtained from appropriate SMUD personnel.
Hong, Do-Kwan; Joo, Dae-Suk; Woo, Byung-Chul; Koo, Dae-Hyun; Ahn, Chan-Woo
2014-01-01
The objective of the present study was to address the rotordynamics of the rotor of an ultra-high-speed PM-type synchronous motor-generator for a 500 W rated micro gas turbine generator. This paper introduces the dynamic analysis of, and experiments on, the motor-generator. The focus is placed on an analytical approach to the mechanical dynamic problems, since dynamic stability is essential at ultra-high speeds. Unbalance response analysis is performed by calculating the unbalance with and without balancing using a balancing machine. Critical speed analysis is performed to determine an operating speed with a sufficient separation margin. The unbalance response analysis is compared with the experimental results, considering the balancing grade (ISO 1940-1) and the predicted vibration displacement with and without balancing. Based on these results, a high-speed motor-generator was successfully developed. PMID:25177804
Pyrolysis of coal, biomass and their blends: performance assessment by thermogravimetric analysis.
Ferrara, Francesca; Orsini, Alessandro; Plaisant, Alberto; Pettinau, Alberto
2014-11-01
With the aim of supporting the experimental tests in a gasification pilot plant, the thermal decomposition of coal, biomass and their mixtures has been characterized through thermogravimetric analysis (TGA) and a simplified kinetic analysis. The TGA of the pure fuels indicates the low reactivity of South African coal and the relatively high reactivity of Sardinian Sulcis coal during pyrolysis. Among the tested fuels, biomass (stone pine wood chips) is the most reactive. These results fully confirm those obtained during the experimental tests in the gasification pilot plant. As for the fuel blends, the analysis shows that the synergic effects between the considered coals and biomass are negligible when they are co-pyrolyzed. The results of the analysis confirm that TGA can be very useful for broadly predicting gasification performance and for optimizing experimental campaigns in pilot-scale gasification plants. Copyright © 2014 Elsevier Ltd. All rights reserved.
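For readers unfamiliar with what a simplified kinetic analysis of TGA data can look like, the sketch below estimates an apparent activation energy from a synthetic mass-loss curve, assuming first-order decomposition and an Arrhenius rate. The conversion curve, the first-order assumption, and the fitted range are illustrative choices and do not represent the authors' procedure.

import numpy as np

R = 8.314                      # J/(mol K)
T = np.linspace(500, 800, 31)  # K, constant heating rate assumed
alpha = 1 - np.exp(-((T - 480) / 180.0) ** 3)   # synthetic conversion curve
alpha = np.clip(alpha, 1e-6, 0.999)

dalpha_dT = np.gradient(alpha, T)
k = dalpha_dT / (1 - alpha)            # first-order "rate" per kelvin of temperature rise
mask = (alpha > 0.05) & (alpha < 0.95) # fit only the main decomposition range

# Arrhenius plot: ln k = ln A - Ea / (R T); the slope gives -Ea/R
slope, intercept = np.polyfit(1.0 / T[mask], np.log(k[mask]), 1)
print(f"apparent Ea ~ {-slope * R / 1000:.1f} kJ/mol")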
NASA Technical Reports Server (NTRS)
Muraca, R. J.; Stephens, M. V.; Dagenhart, J. R.
1975-01-01
A general analysis capable of predicting performance characteristics of cross-wind axis turbines was developed, including the effects of airfoil geometry, support struts, blade aspect ratio, windmill solidity, blade interference and curved flow. The results were compared with available wind tunnel results for a catenary blade shape. A theoretical performance curve for an aerodynamically efficient straight blade configuration was also presented. In addition, a linearized analytical solution applicable for straight configurations was developed. A listing of the computer program developed for numerical solutions of the general performance equations is included in the appendix.
NASA Technical Reports Server (NTRS)
Caglayan, A. K.; Godiwala, P. M.; Morrell, F. R.
1985-01-01
This paper presents the performance analysis results of a fault inferring nonlinear detection system (FINDS) using integrated avionics sensor flight data for the NASA ATOPS B-737 aircraft in a Microwave Landing System (MLS) environment. First, an overview of the FINDS algorithm structure is given. Then, aircraft state estimate time histories and statistics for the flight data sensors are discussed. This is followed by an explanation of modifications made to the detection and decision functions in FINDS to improve false alarm and failure detection performance. Next, the failure detection and false alarm performance of the FINDS algorithm are analyzed by injecting bias failures into fourteen sensor outputs over six repetitive runs of the five minutes of flight data. Results indicate that the detection speed, failure level estimation, and false alarm performance show a marked improvement over the previously reported simulation runs. In agreement with earlier results, detection speed is faster for filter measurement sensors such as MLS than for filter input sensors such as flight control accelerometers. Finally, the progress in modifications of the FINDS algorithm design to accommodate flight computer constraints is discussed.
NASA Technical Reports Server (NTRS)
Whitcomb, John D.
1989-01-01
Strain-energy release rates are often used to predict when delamination growth will occur in laminates under compression. Because of the inherently high computational cost of performing such analyses, less rigorous analyses such as thin-film plate analysis were used. The assumptions imposed by plate theory restrict the analysis to the calculation of the total strain-energy release rate, G(sub t). The objective is to determine the accuracy of thin-film plate analysis by comparing the distribution of G(sub t) calculated using fully three-dimensional (3D), thin-film 3D, and thin-film plate analyses. Thin-film 3D analysis is the same as thin-film plate analysis, except that 3D analysis is used to model the sublaminate. The 3D stress analyses were performed using the finite element program NONLIN3D. The plate analysis results were obtained from published data, which used STAGS. Strain-energy release rates were calculated using variations of the virtual crack closure technique. The results demonstrate that thin-film plate analysis can predict the distribution of G(sub t) quite well, at least for the configurations considered. Also, these results verify the accuracy of the strain-energy release rate procedure for plate analysis.
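As a pointer to how the virtual crack closure technique mentioned above works in its simplest 2-D form, the sketch below evaluates a mode-I energy release rate from a crack-tip nodal force and the opening displacement of the node pair behind the tip. The numerical values are invented for illustration and are not taken from the NONLIN3D or STAGS analyses.

# Virtual crack closure technique (VCCT), 2-D, mode I: G_I = F_y * dv / (2 * da * b)
F_y = 12.5      # N, nodal force at the crack-tip node, normal to the crack plane
dv  = 4.0e-5    # m, relative opening displacement of the node pair behind the tip
da  = 5.0e-4    # m, length of the crack-tip element
b   = 1.0       # m, unit thickness

G_I = F_y * dv / (2.0 * da * b)   # J/m^2
print(f"G_I = {G_I:.2f} J/m^2")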
Analysis of high vacuum systems using SINDA'85
NASA Technical Reports Server (NTRS)
Spivey, R. A.; Clanton, S. E.; Moore, J. D.
1993-01-01
The theory, algorithms, and test data correlation analysis of a math model developed to predict performance of the Space Station Freedom Vacuum Exhaust System are presented. The theory used to predict the flow characteristics of viscous, transition, and molecular flow is presented in detail. Development of user subroutines which predict the flow characteristics in conjunction with the SINDA'85/FLUINT analysis software are discussed. The resistance-capacitance network approach with application to vacuum system analysis is demonstrated and results from the model are correlated with test data. The model was developed to predict the performance of the Space Station Freedom Vacuum Exhaust System. However, the unique use of the user subroutines developed in this model and written into the SINDA'85/FLUINT thermal analysis model provides a powerful tool that can be used to predict the transient performance of vacuum systems and gas flow in tubes of virtually any geometry. This can be accomplished using a resistance-capacitance (R-C) method very similar to the methods used to perform thermal analyses.
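To make the resistance-capacitance analogy concrete, the sketch below combines a few series tube conductances, computes the effective pumping speed seen by a chamber, and estimates a pump-down time. The conductance, pump, and volume values are invented, and the constant-speed pump-down formula is a textbook simplification rather than the SINDA'85/FLUINT model described above.

import math

conductances = [120.0, 85.0, 60.0]        # L/s, series tube segments
C_total = 1.0 / sum(1.0 / c for c in conductances)   # series conductances add like resistors

S_pump = 200.0                            # L/s, pump speed at the far end of the line
S_eff = S_pump * C_total / (S_pump + C_total)        # effective speed at the chamber

V, P0, P1 = 500.0, 760.0, 1e-3            # chamber volume (L), start/end pressure (torr)
t = (V / S_eff) * math.log(P0 / P1)       # constant-speed pump-down estimate
print(f"C_total = {C_total:.1f} L/s, S_eff = {S_eff:.1f} L/s, t = {t:.0f} s")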
Kaji, Amy H; Langford, Vinette; Lewis, Roger J
2008-09-01
There is currently no validated method for assessing hospital disaster preparedness. We determine the degree of correlation between the results of 3 methods for assessing hospital disaster preparedness: administration of an on-site survey, drill observation using a structured evaluation tool, and video analysis of team performance in the hospital incident command center. This was a prospective, observational study conducted during a regional disaster drill, comparing the results from an on-site survey, a structured disaster drill evaluation tool, and a video analysis of teamwork, performed at six 911-receiving hospitals in Los Angeles County, CA. The on-site survey was conducted separately from the drill and assessed hospital disaster plan structure, vendor agreements, modes of communication, medical and surgical supplies, involvement of law enforcement, mutual aid agreements with other facilities, drills and training, surge capacity, decontamination capability, and pharmaceutical stockpiles. The drill evaluation tool, developed by Johns Hopkins University under contract from the Agency for Healthcare Research and Quality, was used to assess various aspects of drill performance, such as the availability of the hospital disaster plan, the geographic configuration of the incident command center, whether drill participants were identifiable, whether the noise level interfered with effective communication, and how often key information (eg, number of available staffed floor, intensive care, and isolation beds; number of arriving victims; expected triage level of victims; number of potential discharges) was received by the incident command center. Teamwork behaviors in the incident command center were quantitatively assessed using the MedTeams analysis of the video recordings obtained during the disaster drill. Spearman rank correlations of the results between pairwise groupings of the 3 assessment methods were calculated. The 3 evaluation methods demonstrated qualitatively different results with respect to each hospital's level of disaster preparedness. The Spearman rank correlation coefficient between the results of the on-site survey and the video analysis of teamwork was -0.34; between the results of the on-site survey and the structured drill evaluation tool, 0.15; and between the results of the video analysis and the drill evaluation tool, 0.82. The disparate results obtained from the 3 methods suggest that each measures distinct aspects of disaster preparedness, and perhaps no single method adequately characterizes overall hospital preparedness.
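The pairwise rank comparison reported above is straightforward to reproduce on any three sets of hospital scores. The sketch below does so with made-up preparedness scores for six hospitals under three methods, using scipy's spearmanr; the scores and labels are illustrative only and are not the study's data.

from itertools import combinations
from scipy.stats import spearmanr

# Hypothetical preparedness scores for six hospitals under three methods
scores = {
    "survey": [72, 65, 80, 58, 69, 75],
    "drill":  [3.1, 2.8, 3.6, 2.5, 3.0, 3.4],
    "video":  [0.60, 0.55, 0.40, 0.70, 0.52, 0.45],
}

for a, b in combinations(scores, 2):
    rho, p = spearmanr(scores[a], scores[b])   # rank correlation and p-value
    print(f"{a} vs {b}: rho = {rho:+.2f} (p = {p:.2f})")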
Working Performance Analysis of Rolling Bearings Used in Mining Electric Excavator Crowd Reducer
NASA Astrophysics Data System (ADS)
Zhang, Y. H.; Hou, G.; Chen, G.; Liang, J. F.; Zheng, Y. M.
2017-12-01
Referring to statistical load data for the digging process, and building on a dynamics simulation of the crowd reducer system, a simulation analysis of the working performance of the rolling bearings used in the crowd reducer of a large mining electric excavator was completed. The simulation analysis covers the internal load distribution, the contact stresses on the rolling elements, and the fatigue life of the rolling bearings. The internal load characteristics of the rolling elements in the cylindrical roller bearings are obtained. The results show that all of the rolling bearings satisfy the requirements for contact strength and fatigue life, and the rationality of the bearing selection and arrangement is thereby verified.
Evaluating Web-Based Nursing Education's Effects: A Systematic Review and Meta-Analysis.
Kang, Jiwon; Seomun, GyeongAe
2017-09-01
This systematic review and meta-analysis investigated whether using web-based nursing educational programs increases a participant's knowledge and clinical performance. We performed a meta-analysis of studies published between January 2000 and July 2016 and identified through RISS, CINAHL, ProQuest Central, Embase, the Cochrane Library, and PubMed. Eleven studies were eligible for inclusion in this analysis. The results of the meta-analysis demonstrated significant differences not only for the overall effect but also specifically for blended programs and short (2 weeks or 4 weeks) intervention periods. To present more evidence supporting the effectiveness of web-based nursing educational programs, further research is warranted.
Examining the impact of cell phone conversations on driving using meta-analytic techniques.
Horrey, William J; Wickens, Christopher D
2006-01-01
The performance costs associated with cell phone use while driving were assessed meta-analytically using standardized measures of effect size along five dimensions. There have been many studies on the impact of cell phone use on driving, showing some mixed findings. Twenty-three studies (contributing 47 analysis entries) met the appropriate conditions for the meta-analysis. The statistical results from each of these studies were converted into effect sizes and combined in the meta-analysis. Overall, there were clear costs to driving performance when drivers were engaged in cell phone conversations. However, subsequent analyses indicated that these costs were borne primarily by reaction time tasks, with far smaller costs associated with tracking (lane-keeping) performance. Hands-free and handheld phones revealed similar patterns of results for both measures of performance. Conversation tasks tended to show greater costs than did information-processing tasks (e.g., word games). There was a similar pattern of results for passenger and remote (cell phone) conversations. Finally, there were some small differences between simulator and field studies, though both exhibited costs in performance for cell phone use. We suggest that (a) there are significant costs to driver reactions to external hazards or events associated with cell phone use, (b) hands-free cell phones do not eliminate or substantially reduce these costs, and (c) different research methodologies or performance measures may underestimate these costs. Potential applications of this research include the assessment of performance costs attributable to different types of cell phones, cell phone conversations, experimental measures, or methodologies.
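As a schematic of how standardized effect sizes from individual studies are combined, the sketch below performs a simple inverse-variance, fixed-effect pooling of a few invented Cohen's d values. The effect sizes, sample sizes, and the fixed-effect choice are assumptions for illustration, not the meta-analysis reported above.

import math

# (Cohen's d, n per group) for a handful of hypothetical studies
studies = [(0.45, 24), (0.60, 18), (0.30, 40), (0.52, 30)]

weights, d_vals = [], []
for d, n in studies:
    var = (2 / n) + d**2 / (4 * n)   # approximate variance of d, two equal groups of size n
    weights.append(1.0 / var)
    d_vals.append(d)

# Inverse-variance weighted mean and its standard error
d_pooled = sum(w * d for w, d in zip(weights, d_vals)) / sum(weights)
se = math.sqrt(1.0 / sum(weights))
print(f"pooled d = {d_pooled:.2f} +/- {1.96 * se:.2f} (95% CI half-width)")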
NASA Astrophysics Data System (ADS)
Marisarla, Soujanya; Ghia, Urmila; Ghia, Kirti "Karman"
2002-11-01
Towards a comprehensive aeroelastic analysis of a joined wing, fluid dynamics and structural analyses are initially performed separately. Steady flow calculations are currently performed using the 3-D compressible Navier-Stokes equations. A flow analysis of the ONERA M6 wing served to validate the software for the fluid dynamics analysis. The complex flow field of the joined wing is analyzed and the prevailing fluid dynamic forces are computed using the COBALT software. Currently, these forces are being transferred as fluid loads onto the structure. For the structural analysis, several test cases were run treating the wing as a cantilever beam; these served as validation cases. A nonlinear structural analysis of the wing is being performed using the ANSYS software to predict the deflections and stresses of the joined wing. Issues related to modeling and selecting an appropriate mesh for the structure were addressed by first performing a linear analysis. The frequencies and mode shapes of the deformed wing are obtained from modal analysis. Both static and dynamic analyses are carried out, and the results obtained are carefully analyzed. Loose coupling between the fluid and structural analyses is currently being examined.
NASA Technical Reports Server (NTRS)
Ulbricht, T. E.; Hemminger, J. A.
1986-01-01
The low flow rate and high head rise requirements of hydrogen/oxygen auxiliary propulsion systems make the application of centrifugal pumps difficult. Positive displacement pumps are well-suited for these flow conditions, but little is known about their performance and life characteristics in liquid hydrogen. An experimental and analytical investigation was conducted to determine the performance and life characteristics of a vane-type, positive displacement pump. In the experimental part of this effort, mass flow rate and shaft torque were determined as functions of shaft speed and pump pressure rise. Since liquid hydrogen offers little lubrication in a rubbing situation, pump life is an issue. During the life test, the pump was operated intermittently for 10 hr at the steady-state point of 0.074 lbm/sec (0.03 kg/sec) flow rate, 3000 psid (2.07 MPa) pressure rise, and 8000 rpm (838 rad/sec) shaft speed. Pump performance was monitored during the life test series and the results indicated no loss in performance. Material loss from the vanes was recorded and wear of the other components was documented. In the analytical part of this effort, a comprehensive pump performance analysis computer code, developed in-house, was used to predict pump performance. The results of the experimental investigation are presented and compared with the results of the analysis. Results of the life test are also presented.
A Spatial Analysis of Contextual Effects on Educational Accountability in Kentucky.
ERIC Educational Resources Information Center
Pitts, Timothy C.; Reeves, Edward B.
A cornerstone of the Kentucky Education Reform Act of 1990 was the creation of a high-stakes performance assessment program called the Kentucky Instructional Results Information System (KIRIS). KIRIS test results were the basis for granting monetary rewards to schools and school districts where student test performance improved significantly and…
Effect analysis of design variables on the disc in a double-eccentric butterfly valve.
Kang, Sangmo; Kim, Da-Eun; Kim, Kuk-Kyeom; Kim, Jun-Oh
2014-01-01
We have performed a shape optimization of the disc in an industrial double-eccentric butterfly valve using an effect analysis of the design variables to enhance the valve performance. For the optimization, we select three performance quantities, pressure drop, maximum stress, and mass (weight), as the responses and three dimensions related to the disc shape as the design variables. Subsequently, we compose an orthogonal array (L16) layout by performing numerical simulations of the flow and structure using a commercial package, ANSYS v13.0, and then carry out an effect analysis of the design variables on the responses using the design of experiments. Finally, we formulate a multiobjective function consisting of the three responses and propose an optimal combination of the design variables to maximize the valve performance. Simulation results show that the disc thickness has the most significant effect on the performance and that the optimal design provides better performance than the initial design.
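To illustrate the kind of main-effects and multiobjective bookkeeping such a design-of-experiments study involves, the sketch below computes main effects over a small two-level, three-factor array and ranks the runs with a weighted score. The factor names, responses, and weights are invented and are a simplification of the paper's L16 array, not its data.

import numpy as np

# Two-level, three-factor full factorial (coded -1/+1), standing in for the L16 array
design = np.array([
    [-1, -1, -1], [ 1, -1, -1], [-1,  1, -1], [ 1,  1, -1],
    [-1, -1,  1], [ 1, -1,  1], [-1,  1,  1], [ 1,  1,  1],
])
pressure_drop = np.array([2.1, 2.4, 1.9, 2.2, 2.0, 2.5, 1.8, 2.3])   # synthetic responses
max_stress    = np.array([180, 150, 175, 148, 190, 160, 185, 155])
mass          = np.array([9.5, 10.2, 9.1, 9.8, 10.5, 11.2, 10.0, 10.8])

# Main effect of each factor = mean(response at +1) - mean(response at -1)
for j, name in enumerate(["thickness", "hub dia.", "rib width"]):
    eff = pressure_drop[design[:, j] == 1].mean() - pressure_drop[design[:, j] == -1].mean()
    print(f"effect of {name} on pressure drop: {eff:+.2f}")

def normalize(x):
    # scale each response to [0, 1] so they can be combined
    return (x - x.min()) / (x.max() - x.min())

# Simple weighted multiobjective score (lower is better for all three responses)
score = 0.5 * normalize(pressure_drop) + 0.3 * normalize(max_stress) + 0.2 * normalize(mass)
print("best run:", int(np.argmin(score)) + 1)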
Emergency department visual urinalysis versus laboratory urinalysis.
Worrall, James C
2009-11-01
The primary objective of this study was to compare the results of nurse-performed urinalysis (NPU) interpreted visually in the emergency department (ED) with laboratory-performed urinalysis (LPU) interpreted by reflectance photometry. This was a prospective observational study based on a convenience sample from my emergency practice. Emergency nurses, who were unaware of the study, performed the usual dipstick analysis before sending the same urine sample to the laboratory for testing. Of 140 urinalyses performed during the study period, 124 were suitable for analysis. When compared with the reference standard LPU, the NPU had an overall sensitivity of 100% (95% confidence interval [CI] 95%-100%) and a specificity of 49% (95% CI 33%-65%) for the presence of any 1 of blood, leukocyte esterase, nitrites, protein, glucose or ketones in the urine. Of 20 false-positive NPUs, 18 were a result of the nurse recording 1 or more components as "trace" positive. Although NPU does not yield identical results to LPU, a negative LPU is expected when the initial NPU in the ED is negative.
NASA Technical Reports Server (NTRS)
Brown, Andrew M.; Schmauch, Preston
2012-01-01
Turbine blades in rocket and jet engine turbomachinery experience enormous harmonic loading conditions. These loads result from the integer number of upstream and downstream stator vanes as well as the other turbine stages. The standard technique for forced response analysis to assess structural integrity is to decompose a CFD generated flow field into its harmonic components, and to then perform a frequency response analysis at the problematic natural frequencies. Recent CFD analysis and water-flow testing at NASA/MSFC, though, indicates that this technique may miss substantial harmonic and non-harmonic excitation sources that become present in complex flows. These complications suggest the question of whether frequency domain analysis is capable of capturing the excitation content sufficiently. Two studies comparing frequency response analysis with transient response analysis, therefore, have been performed. The first is of a bladed disk with each blade modeled by simple beam elements. It was hypothesized that the randomness and other variation from the standard harmonic excitation would reduce the blade structural response, but the results showed little reduction. The second study was of a realistic model of a bladed-disk excited by the same CFD used in the J2X engine program. The results showed that the transient analysis results were up to 10% higher for "clean" nodal diameter excitations and six times larger for "messy" excitations, where substantial Fourier content around the main harmonic exists.
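The harmonic decomposition referred to above amounts to taking the Fourier content of an unsteady loading signal. The sketch below does this for a synthetic signal containing an assumed vane-passing tone plus noise and reads off the amplitude at that tone; the sample rate, frequency, and signal are illustrative and are not the J2X CFD data.

import numpy as np

fs = 20000.0                         # Hz, sample rate of the (synthetic) loading history
t = np.arange(0, 0.2, 1.0 / fs)
vane_pass = 1200.0                   # Hz, assumed vane-passing frequency
signal = (1.0 * np.sin(2 * np.pi * vane_pass * t)
          + 0.3 * np.sin(2 * np.pi * 2 * vane_pass * t)
          + 0.2 * np.random.default_rng(0).standard_normal(t.size))

spec = np.fft.rfft(signal) / (t.size / 2)     # single-sided amplitude spectrum
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
k = np.argmin(np.abs(freqs - vane_pass))      # bin nearest the vane-passing tone
print(f"amplitude at {freqs[k]:.0f} Hz ~ {abs(spec[k]):.2f}")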
Yang, Yongxin; Zhou, Rui; Ge, Yaojun; Du, Yanliang; Zhang, Lihai
2018-06-27
In this study, the influence of two critical geometrical parameters (i.e., the angles of the wind fairing, α, and of the lower inclined web, β) on the aerodynamic performance of closed-box girder bridges was systematically investigated through a theoretical analysis and wind tunnel testing using laser displacement sensors. The results show that, for a particular inclined web angle β, a closed-box girder with a sharper wind fairing angle of α = 50° has better flutter and vortex-induced vibration (VIV) performance than one with α = 60°, while an inclined web angle of β = 14° produces the best VIV performance. In addition, the results from particle image velocimetry (PIV) tests indicate that a wind fairing angle of α = 50° produces better flutter performance by inducing a single vortex structure and a balanced distribution of vorticity strength in both the upper and lower parts of the wake region. Furthermore, two-dimensional three-degrees-of-freedom (2D-3DOF) analysis results demonstrate that the absolute values of Part A (referenced to the flutter derivative A₂*) and Part D (referenced to A₁*H₃*) generally decrease with increasing β, while the participation level of the heaving degree of freedom (DOF) in torsion-dominated coupled flutter initially increases, reaches a peak, and then decreases with increasing β.
Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun
2015-02-01
Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often cause violations of assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test-retest repeatability data for illustrative purposes. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
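For context on the repeatability metrics such a meta-analysis pools, the sketch below computes a within-subject coefficient of variation and a 95% repeatability coefficient from a small synthetic test-retest dataset. The values are invented and are not the FDG-PET data referenced above.

import math

# Hypothetical test-retest measurements of a biomarker for five subjects
test   = [5.2, 3.8, 7.1, 4.4, 6.0]
retest = [5.5, 3.6, 7.4, 4.1, 6.3]

# Within-subject SD from paired replicates: wSD^2 = sum(d_i^2) / (2 n)
num = sum((a - b) ** 2 / 2.0 for a, b in zip(test, retest))
means = [(a + b) / 2.0 for a, b in zip(test, retest)]
wsd = math.sqrt(num / len(test))
wcv = wsd / (sum(means) / len(means))   # within-subject coefficient of variation
rc = 2.77 * wsd                         # 95% repeatability coefficient

print(f"wCV = {100 * wcv:.1f}%, RC = {rc:.2f} (same units as the biomarker)")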
Chen, Qixuan; Li, Jingguang
2014-05-01
Many recent studies have examined the association between number acuity, which is the ability to rapidly and non-symbolically estimate the quantity of items appearing in a scene, and symbolic math performance. However, various contradictory results have been reported. To comprehensively evaluate the association between number acuity and symbolic math performance, we conduct a meta-analysis to synthesize the results observed in previous studies. First, a meta-analysis of cross-sectional studies (36 samples, N = 4705) revealed a significant positive correlation between these skills (r = 0.20, 95% CI = [0.14, 0.26]); the association remained after considering other potential moderators (e.g., whether general cognitive abilities were controlled). Moreover, a meta-analysis of longitudinal studies revealed 1) that number acuity may prospectively predict later math performance (r = 0.24, 95% CI = [0.11, 0.37]; 6 samples) and 2) that number acuity is retrospectively correlated to early math performance as well (r = 0.17, 95% CI = [0.07, 0.26]; 5 samples). In summary, these pieces of evidence demonstrate a moderate but statistically significant association between number acuity and math performance. Based on the estimated effect sizes, power analyses were conducted, which suggested that many previous studies were underpowered due to small sample sizes. This may account for the disparity between findings in the literature, at least in part. Finally, the theoretical and practical implications of our meta-analytic findings are presented, and future research questions are discussed. Copyright © 2014 Elsevier B.V. All rights reserved.
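The power analyses mentioned above can be approximated with the Fisher z transform. The sketch below computes the sample size needed to detect the pooled correlation of roughly r = 0.20 with 80% power at a two-sided alpha of 0.05; the alpha and power targets are conventional defaults assumed here, not values stated in the abstract.

import math
from scipy.stats import norm

r, alpha, power = 0.20, 0.05, 0.80
z_r = 0.5 * math.log((1 + r) / (1 - r))            # Fisher z transform of r
n = ((norm.ppf(1 - alpha / 2) + norm.ppf(power)) / z_r) ** 2 + 3
print(f"required n ~ {math.ceil(n)}")              # on the order of 190 participants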
Quantifying Low Energy Proton Damage in Multijunction Solar Cells
NASA Technical Reports Server (NTRS)
Messenger, Scott R.; Burke, Edward A.; Walters, Robert J.; Warner, Jeffrey H.; Summers, Geoffrey P.; Lorentzen, Justin R.; Morton, Thomas L.; Taylor, Steven J.
2007-01-01
An analysis of the effects of low energy proton irradiation on the electrical performance of triple junction (3J) InGaP2/GaAs/Ge solar cells is presented. The Monte Carlo ion transport code (SRIM) is used to simulate the damage profile induced in a 3J solar cell under the conditions of typical ground testing and that of the space environment. The results are used to present a quantitative analysis of the defect, and hence damage, distribution induced in the cell active region by the different radiation conditions. The modelling results show that, in the space environment, the solar cell will experience a uniform damage distribution through the active region of the cell. Through an application of the displacement damage dose analysis methodology, the implications of this result on mission performance predictions are investigated.
14 CFR 437.29 - Hazard analysis.
Code of Federal Regulations, 2010 CFR
2010-01-01
14 Aeronautics and Space, § 437.29 Hazard analysis (2010 CFR). (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...
14 CFR 437.29 - Hazard analysis.
Code of Federal Regulations, 2011 CFR
2011-01-01
14 Aeronautics and Space, § 437.29 Hazard analysis (2011 CFR). (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...
Identification of human operator performance models utilizing time series analysis
NASA Technical Reports Server (NTRS)
Holden, F. M.; Shinners, S. M.
1973-01-01
The results of an effort performed by Sperry Systems Management Division for AMRL in applying time series analysis as a tool for modeling the human operator are presented. This technique is utilized for determining the variation of the human transfer function under various levels of stress. The human operator's model is determined based on actual input and output data from a tracking experiment.
Radon Assessment of Occupational Facilities, Grissom ARB, IN
2012-11-28
4) EPA 402-R-92-014, Radon Measurements in Schools, July 1993. b. Measurement Device Protocols: The following protocols were used when placing... Performance Tests: A biennial performance test from commercial vendors evaluates the proficiency of USAFSAM's radon analysis. A proficiency test... listing of duplicates and analysis. (4) Calibration Tests: Please see Attachment 4 for calibration certificates. 3. RESULTS: In total, 106 radon monitors
NASA Technical Reports Server (NTRS)
Gates, R. M.; Williams, J. E.
1974-01-01
Results are given of analytical studies performed in support of the design, implementation, checkout and use of NASA's dynamic docking test system (DDTS). Included are analyses of simulator components, a list of detailed operational test procedures, a summary of simulator performance, and an analysis and comparison of docking dynamics and loads obtained by test and analysis.
ERIC Educational Resources Information Center
Uzzell, Renata; Fernandez, Jeannette; Palacios, Moses; Hart, Ray; Casserly, Michael
2014-01-01
The Council of the Great City Schools prepared this thirteenth edition of "Beating the Odds" to give the nation an in-depth look at how big-city schools are performing on the academic goals and standards set by the states. This analysis examines student achievement in mathematics and reading from spring 2010 through spring 2013; measures…
Development of the performance confirmation program at YUCCA mountain, nevada
LeCain, G.D.; Barr, D.; Weaver, D.; Snell, R.; Goodin, S.W.; Hansen, F.D.
2006-01-01
The Yucca Mountain Performance Confirmation program consists of tests, monitoring activities, experiments, and analyses to evaluate the adequacy of assumptions, data, and analyses that form the basis of the conceptual and numerical models of flow and transport associated with a proposed radioactive waste repository at Yucca Mountain, Nevada. The Performance Confirmation program uses an eight-stage risk-informed, performance-based approach. Selection of the Performance Confirmation activities for inclusion in the program was done using a risk-informed, performance-based decision analysis. The result of this analysis was a Performance Confirmation base portfolio that consists of 20 activities. The 20 Performance Confirmation activities include geologic, hydrologic, and construction/engineering testing. Some of the activities began during site characterization, and others will begin during construction or post-emplacement and continue until repository closure.
Energy localization and frequency analysis in the locust ear.
Malkin, Robert; McDonagh, Thomas R; Mhatre, Natasha; Scott, Thomas S; Robert, Daniel
2014-01-06
Animal ears are exquisitely adapted to capture sound energy and perform signal analysis. Studying the ear of the locust, we show how frequency signal analysis can be performed solely by using the structural features of the tympanum. Incident sound waves generate mechanical vibrational waves that travel across the tympanum. These waves shoal in a tsunami-like fashion, resulting in energy localization that focuses vibrations onto the mechanosensory neurons in a frequency-dependent manner. Using finite element analysis, we demonstrate that two mechanical properties of the locust tympanum, distributed thickness and tension, are necessary and sufficient to generate frequency-dependent energy localization.
Desova, A A; Dorofeyuk, A A; Anokhin, A M
2017-01-01
We performed a comparative analysis of the types of spectral density typical of various parameters of the pulse signal. The experimental material was obtained during the examination of school-age children with various psychosomatic disorders. We also performed a typological analysis of the spectral density functions corresponding to the time series of different parameters of a single oscillation of the pulse signal; the results of their comparative analysis are presented. We determined the most significant spectral components for two disorders in children: arterial hypertension and mitral valve prolapse.
NASA Astrophysics Data System (ADS)
Kim, Jeong-Man; Koo, Min-Mo; Jeong, Jae-Hoon; Hong, Keyyong; Cho, Il-Hyoung; Choi, Jang-Young
2017-05-01
This paper reports the design and analysis of a tubular permanent magnet linear generator (TPMLG) for a small-scale wave-energy converter. The analytical field computation is performed by applying a magnetic vector potential and a 2-D analytical model to determine design parameters. Based on analytical solutions, parametric analysis is performed to meet the design specifications of a wave-energy converter (WEC). Then, 2-D FEA is employed to validate the analytical method. Finally, the experimental result confirms the predictions of the analytical and finite element analysis (FEA) methods under regular and irregular wave conditions.
NASA Astrophysics Data System (ADS)
Wallace, Brian D.
A series of field tests and theoretical analyses were performed on various wind turbine rotor designs at two Penn State residential-scale wind-electric facilities. This work involved the prediction and experimental measurement of the electrical and aerodynamic performance of three wind turbines: a 3 kW rated Whisper 175, a 2.4 kW rated Skystream 3.7, and the Penn State designed Carolus wind turbine. Both the Skystream and Whisper 175 wind turbines are OEM blades which were originally installed at the facilities. The Carolus rotor is a carbon-fiber composite 2-bladed machine, designed and assembled at Penn State, with the intent of replacing the Whisper 175 rotor at the off-grid system. Rotor aerodynamic performance is modeled using WT_Perf, a National Renewable Energy Laboratory-developed, Blade Element Momentum theory-based performance prediction code. Steady-state power curves are predicted by coupling experimentally determined electrical characteristics with the aerodynamic performance of the rotor simulated with WT_Perf. A dynamometer test stand is used to establish the electromechanical efficiencies of the wind-electric system generator. Through the coupling of WT_Perf and dynamometer test results, an aero-electro-mechanical analysis procedure is developed and provides accurate predictions of wind system performance. The analysis of three different wind turbines gives a comprehensive assessment of the capability of the field test facilities and the accuracy of aero-electro-mechanical analysis procedures. Results from this study show that the Carolus and Whisper 175 rotors are running at higher tip-speed ratios than are optimum for power production. The aero-electro-mechanical analysis predicted the high operating tip-speed ratios of the rotors and was accurate at predicting output power for the systems. It is shown that the wind turbines operate at high tip-speeds because of a mismatch between the aerodynamic drive torque and the operating torque of the wind-system generator. Through the change of load impedance on the wind generator, the research facility has the ability to modify the rotational speed of the wind turbines, allowing the rotors to perform closer to their optimum tip-speed. Comparisons between field test data and performance predictions show that the aero-electro-mechanical analysis was able to predict differences in power production and rotational speed which result from changes in the system load impedance.
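A compact way to see how an aero-electro-mechanical power estimate fits together is sketched below: a made-up rotor power-coefficient curve (standing in for WT_Perf output) is combined with a simple generator efficiency map (standing in for dynamometer data) to give electrical power versus wind speed. The Cp polynomial, efficiency map, rotor size, and rotor speed are all assumptions for illustration, not values from the thesis.

import numpy as np

rho, R = 1.225, 1.75                 # air density (kg/m^3), rotor radius (m), assumed
A = np.pi * R**2
omega = 2 * np.pi * 350 / 60.0       # rad/s, assumed fixed rotor speed

def cp(lmbda):
    # crude single-peak power-coefficient curve, maximum ~0.40 near lambda ~ 7
    return np.clip(0.40 - 0.01 * (lmbda - 7.0) ** 2, 0.0, None)

def gen_efficiency(p_shaft):
    # simple dynamometer-style efficiency map: poor at low load, approaching 0.82 near rated
    return 0.82 * p_shaft / (p_shaft + 150.0)

for v in [4, 6, 8, 10, 12]:
    lam = omega * R / v                       # tip-speed ratio
    p_shaft = 0.5 * rho * A * cp(lam) * v**3  # aerodynamic shaft power (W)
    p_elec = gen_efficiency(p_shaft) * p_shaft
    print(f"V = {v:2d} m/s  lambda = {lam:4.1f}  P_elec ~ {p_elec:6.0f} W")

Varying the assumed rotor speed in this sketch reproduces the qualitative effect described above: a fixed electrical load that forces the rotor to spin too fast pushes the tip-speed ratio off the Cp peak and reduces output power.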
D-region blunt probe data analysis using hybrid computer techniques
NASA Technical Reports Server (NTRS)
Burkhard, W. J.
1973-01-01
The feasibility of performing data reduction with a hybrid computer was studied. The data were obtained from the flight of a parachute-borne probe through the D-region of the ionosphere. A presentation of the theory of blunt probe operation is included, with emphasis on the equations necessary to perform the analysis. This is followed by a discussion of the computer program development. Included in this discussion is a comparison of computer and hand reduction results for the blunt probe launched on 31 January 1972. The comparison showed that it was both feasible and desirable to use the computer for data reduction. The results of computer data reduction performed on flight data acquired from five blunt probes are also presented.
Data resulting from the CFD analysis of ten window frames according to the UNI EN ISO 10077-2.
Baglivo, Cristina; Malvoni, Maria; Congedo, Paolo Maria
2016-09-01
The data relate to the numerical simulations performed in the study entitled "CFD modeling to evaluate the thermal performances of window frames in accordance with the ISO 10077" (Malvoni et al., 2016) [1]. The paper focuses on the results of a two-dimensional numerical analysis of the ten frame sections suggested by ISO 10077-2, performed using GAMBIT 2.2 and the ANSYS FLUENT 14.5 CFD code. The dataset specifically includes information about the CFD setup and the boundary conditions used as input values for the simulations. The trends of the isotherms show the different impacts on the thermal behaviour of the sections when the air in the cavities is treated as an equivalent solid material or modelled as an ideal gas.
Analysis of a digital RF memory in a signal-delay application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jelinek, D.A.
1992-03-01
Laboratory simulation of the approach of a radar fuze towards a target is an important factor in our ability to accurately measure the radar's performance. This simulation is achieved, in part, by dynamically delaying and attenuating the radar's transmitted pulse and sending the result back to the radar's receiver. Historically, the device used to perform the dynamic delay has been a limiting factor in the evaluation of a radar's performance and characteristics. A new device has been proposed that appears to have more capability than previous dynamic delay devices. This device is the digital RF memory. This report presents the results of an analysis of a digital RF memory used in a signal-delay application. 2 refs.
NASA Astrophysics Data System (ADS)
Li, Jiangui; Wang, Junhua; Zhigang, Zhao; Yan, Weili
2012-04-01
In this paper, an analytical analysis of a permanent magnet vernier (PMV) machine is presented. The key is to analytically solve the governing Laplacian/quasi-Poissonian field equations in the motor regions. The analytical method is verified using the time-stepping finite element method, and the analytically predicted performances of the PMV machine are quantitatively compared with the finite element results, showing good agreement. Finally, experimental results are given to further demonstrate the validity of the analysis.
Computational Predictions of the Performance of Wright 'Bent End' Propellers
NASA Technical Reports Server (NTRS)
Wang, Xiang-Yu; Ash, Robert L.; Bobbitt, Percy J.; Prior, Edwin (Technical Monitor)
2002-01-01
Computational analyses of two 1911 Wright brothers 'Bent End' wooden propeller reproductions have been performed and compared with experimental test results from the Langley Full Scale Wind Tunnel. The purpose of the analysis was to check the consistency of the experimental results and to validate the reliability of the tests. This report is one part of a project on the performance of the Wright 'Bent End' propellers, intended to document the Wright brothers' pioneering propeller design contributions. Two computer codes were used in the computational predictions. The FLO-MG code is a computational fluid dynamics (CFD) code based on the Navier-Stokes equations. It is mainly used to compute the lift and drag coefficients at specified angles of attack at different radii. Those calculated data are intermediate results of the computation and part of the necessary input for the Propeller Design Analysis Code (based on the Adkins and Liebeck method), which is a propeller design code used to compute the propeller thrust coefficient, the propeller power coefficient, and the propeller propulsive efficiency.
A meta-analysis of math performance in Turner syndrome.
Baker, Joseph M; Reiss, Allan L
2016-02-01
Studies investigating the relationship between Turner syndrome and math learning disability have used a wide variation of tasks designed to test various aspects of mathematical competencies. Although these studies have revealed much about the math deficits common to Turner syndrome, their diversity makes comparisons between individual studies difficult. As a result, the consistency of outcomes among these diverse measures remains unknown. The overarching aim of this review is to provide a systematic meta-analysis of the differences in math and number performance between females with Turner syndrome and age-matched neurotypical peers. We provide a meta-analysis of behavioral performance in Turner syndrome relative to age-matched neurotypical populations on assessments of math and number aptitude. In total, 112 comparisons collected across 17 studies were included. Although 54% of all statistical comparisons in our analyses failed to reject the null hypothesis, our results indicate that meaningful group differences exist on all comparisons except those that do not require explicit calculation. Taken together, these results help elucidate our current understanding of math and number weaknesses in Turner syndrome, while highlighting specific topics that require further investigation. © 2015 Mac Keith Press.
Yazdi, Amir Amin; German, Tim P; Defeyter, Margaret Anne; Siegal, Michael
2006-06-01
There is a change in false belief task performance across the 3-5 year age range, as confirmed in a recent meta-analysis [Wellman, H. M., Cross, D., & Watson, J. (2001). Meta-analysis of theory-of-mind development: The truth about false belief. Child Development, 72, 655-684]. This meta-analysis identified several performance factors influencing success, including manipulations that highlight the salience of the initial belief content (such as asking where Sally will look first for the marble). However, because a proportion of variance in performance remained unexplained even when identified performance factors were controlled for, the authors concluded from the standpoint of a 'theory-theory' account that children's improvement is the result of conceptual change. Further, the meta-analysis showed that manipulations such as 'look first' improve performance only in children who are in the older part of the 3-5 year range, and thus plausibly operating with a 'transitional' theory of mind, just on the point of realizing conceptual change. Here, we present three studies systematically investigating the 'look first' manipulation, which showed that: (i) the advantage for the look first question can be demonstrated in children across different cultures, (ii) look first has an effect that is additive to the improvement with age; there is no interaction such that older children gain more benefit than younger children, and (iii) performance in younger children can be, but is not always, elevated to levels that are statistically above chance. These results challenge the theory-theory account and are discussed in terms of models of belief-desire reasoning in which both conceptual competence and performance factors play central roles.
Lísa, Miroslav; Cífková, Eva; Khalikova, Maria; Ovčačíková, Magdaléna; Holčapek, Michal
2017-11-24
Lipidomic analysis of biological samples in clinical research represents a challenging task for analytical methods, owing to the large number of samples and their extreme complexity. In this work, we compare direct infusion (DI) and chromatography - mass spectrometry (MS) lipidomic approaches, represented by three analytical methods, in terms of comprehensiveness, sample throughput, and validation results for the lipidomic analysis of biological samples represented by tumor tissue, surrounding normal tissue, plasma, and erythrocytes of kidney cancer patients. Methods are compared in one laboratory using an identical analytical protocol to ensure comparable conditions. An ultrahigh-performance liquid chromatography/MS (UHPLC/MS) method in hydrophilic interaction liquid chromatography mode and a DI-MS method are used for this comparison as the most widely used methods for lipidomic analysis, together with an ultrahigh-performance supercritical fluid chromatography/MS (UHPSFC/MS) method showing promising results in metabolomic analyses. The nontargeted analysis of pooled samples is performed using all tested methods, and 610 lipid species within 23 lipid classes are identified. The DI method provides the most comprehensive results due to the identification of some polar lipid classes that are not identified by the UHPLC and UHPSFC methods. On the other hand, the UHPSFC method provides excellent sensitivity for less polar lipid classes and the highest sample throughput, with a 10 min method time. The sample consumption of the DI method is 125 times higher than for the other methods, while only 40 μL of organic solvent is used for one sample analysis compared to 3.5 mL and 4.9 mL in the case of the UHPLC and UHPSFC methods, respectively. Methods are validated for the quantitative lipidomic analysis of plasma samples with one internal standard for each lipid class. Results show the applicability of all tested methods for the lipidomic analysis of biological samples, depending on the analysis requirements. Copyright © 2017 Elsevier B.V. All rights reserved.
Liu, Wei; Wang, Dongmei; Liu, Jianjun; Li, Dengwu; Yin, Dongxue
2016-01-01
The present study was performed to assess the quality of Potentilla fruticosa L. sampled from distinct regions of China using high performance liquid chromatography (HPLC) fingerprinting coupled with a suite of chemometric methods. For this quantitative analysis, the main active phytochemical compositions and the antioxidant activity in P. fruticosa were also investigated. Considering the high percentages and antioxidant activities of phytochemicals, P. fruticosa samples from Kangding, Sichuan were selected as the most valuable raw materials. Similarity analysis (SA) of HPLC fingerprints, hierarchical cluster analysis (HCA), principal component analysis (PCA), and discriminant analysis (DA) were further employed to provide accurate classification and quality estimates of P. fruticosa. Two principal components (PCs) were extracted by PCA. PC1 separated samples from Kangding, Sichuan, capturing 57.64% of the variance, whereas PC2 contributed to further separation, capturing 18.97% of the variance. Two kinds of discriminant functions with a 100% discrimination ratio were constructed. The results strongly supported the conclusion that the eight samples from different regions were clustered into three major groups, corresponding with their morphological classification, for which HPLC analysis confirmed the considerable variation in phytochemical compositions and that P. fruticosa samples from Kangding, Sichuan were of high quality. The results of SA, HCA, PCA, and DA were in agreement and performed well for the quality assessment of P. fruticosa. Consequently, HPLC fingerprinting coupled with chemometric techniques provides a highly flexible and reliable method for the quality evaluation of traditional Chinese medicines.
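As a minimal illustration of the chemometric step described above (not the authors' code), a PCA of an HPLC fingerprint peak table and the variance explained by the first two components could be computed roughly as follows; the array shape, variable names, and random placeholder data are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# peak_areas: rows = P. fruticosa samples, columns = common HPLC fingerprint peaks
# (random placeholder data; a real analysis would load the integrated peak table)
rng = np.random.default_rng(0)
peak_areas = rng.gamma(shape=2.0, scale=1.0, size=(8, 20))

X = StandardScaler().fit_transform(peak_areas)   # autoscale each peak
pca = PCA(n_components=2)
scores = pca.fit_transform(X)                    # PC1/PC2 scores used to group samples

print("variance explained by PC1, PC2:", pca.explained_variance_ratio_)
```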
Seal Analysis for the Ares-I Upper Stage Fuel Tank Manhole Cover
NASA Technical Reports Server (NTRS)
Phillips, Dawn R.; Wingate, Robert J.
2010-01-01
Techniques for studying the performance of Naflex pressure-assisted seals in the Ares-I Upper Stage liquid hydrogen tank manhole cover seal joint are explored. To assess the feasibility of using the identical seal design for the Upper Stage as was used for the Space Shuttle External Tank manhole covers, a preliminary seal deflection analysis using the ABAQUS commercial finite element software is employed. The ABAQUS analyses are performed using three-dimensional symmetric wedge finite element models. This analysis technique is validated by first modeling a heritage External Tank liquid hydrogen tank manhole cover joint and correlating the results to heritage test data. Once the technique is validated, the Upper Stage configuration is modeled. The Upper Stage analyses are performed at 1.4 times the expected pressure to comply with the Constellation Program factor of safety requirement on joint separation. Results from the analyses performed with the External Tank and Upper Stage models demonstrate the effects of several modeling assumptions on the seal deflection. The analyses for Upper Stage show that the integrity of the seal is successfully maintained.
Performance Test Data Analysis of Scintillation Cameras
NASA Astrophysics Data System (ADS)
Demirkaya, Omer; Mazrou, Refaat Al
2007-10-01
In this paper, we present a set of image analysis tools to calculate the performance parameters of gamma camera systems from test data acquired according to the National Electrical Manufacturers Association NU 1-2001 guidelines. The calculation methods are either completely automated or require minimal user interaction; minimizing potential human errors. The developed methods are robust with respect to varying conditions under which these tests may be performed. The core algorithms have been validated for accuracy. They have been extensively tested on images acquired by the gamma cameras from different vendors. All the algorithms are incorporated into a graphical user interface that provides a convenient way to process the data and report the results. The entire application has been developed in MATLAB programming environment and is compiled to run as a stand-alone program. The developed image analysis tools provide an automated, convenient and accurate means to calculate the performance parameters of gamma cameras and SPECT systems. The developed application is available upon request for personal or non-commercial uses. The results of this study have been partially presented in Society of Nuclear Medicine Annual meeting as an InfoSNM presentation.
Implementation of probe data performance measures.
DOT National Transportation Integrated Search
2017-04-03
This report presents results from a 12-month project in which three arterial analysis tools based on probe vehicle segment speed data were developed for District 6. A case study of five arterials and two incidents was performed.
Vessel traffic service watchstander performance in routine operations
DOT National Transportation Integrated Search
1979-10-01
Human factors specialists observed and measured the performance of watchstanders at four Vessel Traffic Service (VTS) centers: Houston-Galveston, Puget Sound, New Orleans, and San Francisco. Analysis of the data yielded results amenable to mathematic...
Nuclear radiation environment analysis for thermoelectric outer planet spacecraft
NASA Technical Reports Server (NTRS)
Davis, H. S.; Koprowski, E. F.
1972-01-01
Neutron and gamma ray transport calculations were performed using Monte Carlo methods and a three-dimensional geometric model of the spacecraft. The results are compared with similar calculations performed for an earlier design.
Viscoelastic properties of chalcogenide glasses and the simulation of their molding processes
NASA Astrophysics Data System (ADS)
Liu, Weiguo; Shen, Ping; Jin, Na
In order to simulate the precision molding process, the viscoelastic properties of chalcogenide glasses at high temperatures were investigated. Thermomechanical analyses were performed to measure and analyze the thermomechanical properties of the chalcogenide glasses. The creep responses of the glasses at different temperatures were obtained. Finite element analysis was applied to simulate the molding processes. The simulation results were consistent with previously reported experimental results. Stress concentration and evolution during the molding processes were also described with the simulation results.
Structural Performance’s Optimally Analysing and Implementing Based on ANSYS Technology
NASA Astrophysics Data System (ADS)
Han, Na; Wang, Xuquan; Yue, Haifang; Sun, Jiandong; Wu, Yongchun
2017-06-01
Computer-aided Engineering (CAE) is a hotspot both in the academic field and in modern engineering practice. The Analysis System (ANSYS) simulation software has become an outstanding member of the CAE family owing to its excellent performance; it is committed to innovation in engineering simulation, helping users shorten the design process and improve product innovation and performance. Aiming to explore a structural performance optimization analysis model for engineering enterprises, this paper introduces CAE and its development, analyzes the necessity of structural optimization analysis as well as the framework of structural optimization analysis based on ANSYS technology, and uses ANSYS to implement an optimization analysis of the structural performance of a reinforced concrete slab, displaying the displacement vector chart and the stress intensity chart. Finally, the paper compares the ANSYS simulation results with the measured results, showing that ANSYS is an indispensable engineering calculation tool.
NASA Astrophysics Data System (ADS)
Chen, X.; Kumar, M.; Basso, S.; Marani, M.
2017-12-01
Storage-discharge (S-Q) relations are widely used to derive watershed properties and predict streamflow responses. These relations are often obtained using different recession analysis methods, which vary in recession period identification criteria and Q vs. -dQ/dt fitting scheme. Although previous studies have indicated that different recession analysis methods can result in significantly different S-Q relations and subsequently derived hydrological variables, this observation has often been overlooked and S-Q relations have been used in as is form. This study evaluated the effectiveness of four recession analysis methods in obtaining the characteristic S-Q relation and reconstructing the streamflow. Results indicate that while some methods generally performed better than others, none of them consistently outperformed the others. Even the best-performing method could not yield accurate reconstructed streamflow time series and its PDFs in some watersheds, implying that either derived S-Q relations might not be reliable or S-Q relations cannot be used for hydrological simulations. Notably, accuracy of the methods is influenced by the extent of scatter in the ln(-dQ/dt) vs. ln(Q) plot. In addition, the derived S-Q relation was very sensitive to the criteria used for identifying recession periods. This result raises a warning sign against indiscriminate application of recession analysis methods and derived S-Q relations for watershed characterizations or hydrologic simulations. Thorough evaluation of representativeness of the derived S-Q relation should be performed before it is used for hydrologic analysis.
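A common way to extract a storage-discharge relation of the power-law form -dQ/dt = aQ^b is a linear fit in log-log space. The short sketch below is a generic illustration of that Q vs. -dQ/dt fitting step on a single recession segment; it is not one of the four specific methods evaluated in the study, and the synthetic hydrograph is an assumption.

```python
import numpy as np

def fit_recession(q, dt=1.0):
    """Fit -dQ/dt = a * Q**b to one recession segment by linear regression in log space."""
    q = np.asarray(q, dtype=float)
    dqdt = np.diff(q) / dt                 # discharge change per time step
    qmid = 0.5 * (q[1:] + q[:-1])          # discharge at interval midpoints
    mask = dqdt < 0                        # keep strictly receding intervals only
    b, ln_a = np.polyfit(np.log(qmid[mask]), np.log(-dqdt[mask]), 1)
    return np.exp(ln_a), b

# synthetic receding hydrograph (illustration only)
t = np.arange(0, 30.0)
q = 10.0 * np.exp(-0.1 * t)
a, b = fit_recession(q)
print(f"a = {a:.3f}, b = {b:.2f}")         # b close to 1 for an exponential recession
```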
MAP stability, design, and analysis
NASA Technical Reports Server (NTRS)
Ericsson-Jackson, A. J.; Andrews, S. F.; O'Donnell, J. R., Jr.; Markley, F. L.
1998-01-01
The Microwave Anisotropy Probe (MAP) is a follow-on to the Differential Microwave Radiometer (DMR) instrument on the Cosmic Background Explorer (COBE) spacecraft. The design and analysis of the MAP attitude control system (ACS) have been refined since work previously reported. The full spacecraft and instrument flexible model was developed in NASTRAN, and the resulting flexible modes were plotted and reduced with the Modal Significance Analysis Package (MSAP). The reduced-order model was used to perform the linear stability analysis for each control mode, the results of which are presented in this paper. Although MAP is going to a relatively disturbance-free Lissajous orbit around the Earth-Sun L(2) Lagrange point, a detailed disturbance-torque analysis is required because there are only a small number of opportunities for momentum unloading each year. Environmental torques, including solar pressure at L(2), aerodynamic and gravity gradient during phasing-loop orbits, were calculated and simulated. Thruster plume impingement torques that could affect the performance of the thruster modes were estimated and simulated, and a simple model of fuel slosh was derived to model its effect on the motion of the spacecraft. In addition, a thruster mode linear impulse controller was developed to meet the accuracy requirements of the phasing loop burns. A dynamic attitude error limiter was added to improve the performance of the ACS during large attitude slews. The result of this analysis is a stable ACS subsystem that meets all of the mission's requirements.
24 CFR 3500.17 - Escrow accounts.
Code of Federal Regulations, 2013 CFR
2013-04-01
...: Aggregate (or) composite analysis, hereafter called aggregate analysis, means an accounting method a... advances funds for a borrower, then the servicer must perform an escrow account analysis before seeking.... If a servicer advances funds in paying a disbursement, which is not the result of a borrower's...
24 CFR 3500.17 - Escrow accounts.
Code of Federal Regulations, 2012 CFR
2012-04-01
...: Aggregate (or) composite analysis, hereafter called aggregate analysis, means an accounting method a... advances funds for a borrower, then the servicer must perform an escrow account analysis before seeking.... If a servicer advances funds in paying a disbursement, which is not the result of a borrower's...
24 CFR 3500.17 - Escrow accounts.
Code of Federal Regulations, 2014 CFR
2014-04-01
...: Aggregate (or) composite analysis, hereafter called aggregate analysis, means an accounting method a... advances funds for a borrower, then the servicer must perform an escrow account analysis before seeking.... If a servicer advances funds in paying a disbursement, which is not the result of a borrower's...
Linear regression analysis: part 14 of a series on evaluation of scientific publications.
Schneider, Astrid; Hommel, Gerhard; Blettner, Maria
2010-11-01
Regression analysis is an important statistical method for the analysis of medical data. It enables the identification and characterization of relationships among multiple factors. It also enables the identification of prognostically relevant risk factors and the calculation of risk scores for individual prognostication. This article is based on selected textbooks of statistics, a selective review of the literature, and our own experience. After a brief introduction of the uni- and multivariable regression models, illustrative examples are given to explain what the important considerations are before a regression analysis is performed, and how the results should be interpreted. The reader should then be able to judge whether the method has been used correctly and interpret the results appropriately. The performance and interpretation of linear regression analysis are subject to a variety of pitfalls, which are discussed here in detail. The reader is made aware of common errors of interpretation through practical examples. Both the opportunities for applying linear regression analysis and its limitations are presented.
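For readers who want to reproduce the univariable case described above, a minimal example of fitting and interpreting a simple linear regression is shown below; the data are synthetic and the variable names (age, systolic blood pressure) are purely illustrative, not taken from the article.

```python
import numpy as np
from scipy import stats

# synthetic example: does age (years) predict systolic blood pressure (mmHg)?
rng = np.random.default_rng(1)
age = rng.uniform(30, 70, size=80)
sbp = 100 + 0.6 * age + rng.normal(0, 8, size=80)

res = stats.linregress(age, sbp)
print(f"slope = {res.slope:.2f} mmHg per year, intercept = {res.intercept:.1f}")
print(f"r^2 = {res.rvalue**2:.2f}, p = {res.pvalue:.2g}")
```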
NASA Technical Reports Server (NTRS)
Bednarcyk, Brett A.; Arnold, Steven M.
2006-01-01
A framework is presented that enables coupled multiscale analysis of composite structures. The recently developed, free, Finite Element Analysis - Micromechanics Analysis Code (FEAMAC) software couples the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) with ABAQUS to perform micromechanics based FEA such that the nonlinear composite material response at each integration point is modeled at each increment by MAC/GMC. As a result, the stochastic nature of fiber breakage in composites can be simulated through incorporation of an appropriate damage and failure model that operates within MAC/GMC on the level of the fiber. Results are presented for the progressive failure analysis of a titanium matrix composite tensile specimen that illustrate the power and utility of the framework and address the techniques needed to model the statistical nature of the problem properly. In particular, it is shown that incorporating fiber strength randomness on multiple scales improves the quality of the simulation by enabling failure at locations other than those associated with structural level stress risers.
Faying Surface Lubrication Effects on Nut Factors
NASA Technical Reports Server (NTRS)
Taylor, Deneen M.; Morrison, Raymond F.
2006-01-01
Bolted joint analysis typically is performed using nut factors derived from textbooks and procedures from program requirement documents. Joint specific testing was performed for a critical International Space Station (ISS) joint. Test results indicate that for some configurations the nut factor may be significantly different than accepted textbook values. This paper presents results of joint specific testing to aid in determining if joint specific testing should be performed to insure required preloads are obtained.
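The nut factor referred to above is commonly applied through the short-form torque-preload relation T = K·D·F (torque = nut factor × nominal bolt diameter × preload). The sketch below shows how a change in the measured nut factor shifts the preload obtained at a fixed installation torque; the torque, diameter, and nut factor values are assumed for illustration and are not ISS test data.

```python
def preload_from_torque(torque_nm, nut_factor, diameter_m):
    """Preload implied by the common short-form torque-preload relation T = K * D * F."""
    return torque_nm / (nut_factor * diameter_m)

torque = 50.0       # installation torque, N*m (assumed)
d = 0.0095          # nominal bolt diameter, m (assumed, roughly 3/8 in)
for K in (0.12, 0.20):   # lubricated vs. typical dry textbook nut factors (illustrative)
    F = preload_from_torque(torque, K, d)
    print(f"K = {K:.2f} -> preload = {F / 1000:.1f} kN")
```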
Effects of Rating Purpose and Rater Self-Esteem on Performance Ratings.
1983-03-01
Examined in a laboratory study using a 2x2 analysis of variance design. Results indicate that low self-esteem raters assign significantly higher performance ratings when performance appraisal information will be used... Studies indicated that individuals low in self-esteem have less self-confidence, feel less competent, and rely more on others' opinions than do individuals...
40 CFR 265.200 - Waste analysis and trial tests.
Code of Federal Regulations, 2010 CFR
2010-07-01
... operator to place the results from each waste analysis and trial test, or the documented information, in... 40 Protection of Environment 25 2010-07-01 2010-07-01 false Waste analysis and trial tests. 265... DISPOSAL FACILITIES Tank Systems § 265.200 Waste analysis and trial tests. In addition to performing the...
40 CFR 265.200 - Waste analysis and trial tests.
Code of Federal Regulations, 2013 CFR
2013-07-01
... operator to place the results from each waste analysis and trial test, or the documented information, in... 40 Protection of Environment 27 2013-07-01 2013-07-01 false Waste analysis and trial tests. 265... DISPOSAL FACILITIES Tank Systems § 265.200 Waste analysis and trial tests. In addition to performing the...
40 CFR 265.200 - Waste analysis and trial tests.
Code of Federal Regulations, 2011 CFR
2011-07-01
... operator to place the results from each waste analysis and trial test, or the documented information, in... 40 Protection of Environment 26 2011-07-01 2011-07-01 false Waste analysis and trial tests. 265... DISPOSAL FACILITIES Tank Systems § 265.200 Waste analysis and trial tests. In addition to performing the...
40 CFR 265.200 - Waste analysis and trial tests.
Code of Federal Regulations, 2012 CFR
2012-07-01
... operator to place the results from each waste analysis and trial test, or the documented information, in... 40 Protection of Environment 27 2012-07-01 2012-07-01 false Waste analysis and trial tests. 265... DISPOSAL FACILITIES Tank Systems § 265.200 Waste analysis and trial tests. In addition to performing the...
40 CFR 265.200 - Waste analysis and trial tests.
Code of Federal Regulations, 2014 CFR
2014-07-01
... operator to place the results from each waste analysis and trial test, or the documented information, in... 40 Protection of Environment 26 2014-07-01 2014-07-01 false Waste analysis and trial tests. 265... DISPOSAL FACILITIES Tank Systems § 265.200 Waste analysis and trial tests. In addition to performing the...
Lui, Justin T; Hoy, Monica Y
2017-06-01
Background: The increasing prevalence of virtual reality simulation in temporal bone surgery warrants an investigation to assess training effectiveness. Objectives: To determine if temporal bone simulator use improves mastoidectomy performance. Data Sources: Ovid Medline, Embase, and PubMed databases were systematically searched per the PRISMA guidelines. Review Methods: Inclusion criteria were peer-reviewed publications that utilized quantitative data of mastoidectomy performance following the use of a temporal bone simulator. The search was restricted to human studies published in English. Studies were excluded if they were in non-peer-reviewed format, were descriptive in nature, or failed to provide surgical performance outcomes. Meta-analysis calculations were then performed. Results: A meta-analysis based on the random-effects model revealed an improvement in overall mastoidectomy performance following training on the temporal bone simulator. A standardized mean difference of 0.87 (95% CI, 0.38-1.35) was generated in the setting of a heterogeneous study population (I² = 64.3%, P < .006). Conclusion: In the context of a diverse population of virtual reality simulation temporal bone surgery studies, meta-analysis calculations demonstrate an improvement in trainee mastoidectomy performance with virtual simulation training.
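To illustrate the random-effects pooling used to obtain a standardized mean difference of this kind, here is a generic DerSimonian-Laird sketch with made-up per-study effects and variances; these numbers are not the studies included in the review above.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Pool per-study effect sizes with the DerSimonian-Laird random-effects model."""
    y = np.asarray(effects, float)
    v = np.asarray(variances, float)
    w = 1.0 / v                                   # fixed-effect weights
    y_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fe) ** 2)               # Cochran's Q
    df = len(y) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-study variance
    w_re = 1.0 / (v + tau2)
    pooled = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2

# made-up standardized mean differences and variances, for illustration only
smd, ci, i2 = dersimonian_laird([0.5, 1.2, 0.9, 0.4], [0.04, 0.09, 0.06, 0.05])
print(f"pooled SMD = {smd:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f}), I^2 = {i2:.0f}%")
```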
Anomalous heat transfer modes of nanofluids: a review based on statistical analysis
NASA Astrophysics Data System (ADS)
Sergis, Antonis; Hardalupas, Yannis
2011-05-01
This paper contains the results of a concise statistical review analysis of a large amount of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practise with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate in presenting objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular proposed mechanisms in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers to the control mechanisms and influential parameters of heat transfer in nanofluids.
Assessing performance of an Electronic Health Record (EHR) using Cognitive Task Analysis.
Saitwal, Himali; Feng, Xuan; Walji, Muhammad; Patel, Vimla; Zhang, Jiajie
2010-07-01
Many Electronic Health Record (EHR) systems fail to provide user-friendly interfaces due to the lack of systematic consideration of human-centered computing issues. Such interfaces can be improved to provide easy to use, easy to learn, and error-resistant EHR systems to the users. To evaluate the usability of an EHR system and suggest areas of improvement in the user interface. The user interface of the AHLTA (Armed Forces Health Longitudinal Technology Application) was analyzed using the Cognitive Task Analysis (CTA) method called GOMS (Goals, Operators, Methods, and Selection rules) and an associated technique called KLM (Keystroke Level Model). The GOMS method was used to evaluate the AHLTA user interface by classifying each step of a given task into Mental (Internal) or Physical (External) operators. This analysis was performed by two analysts independently and the inter-rater reliability was computed to verify the reliability of the GOMS method. Further evaluation was performed using KLM to estimate the execution time required to perform the given task through application of its standard set of operators. The results are based on the analysis of 14 prototypical tasks performed by AHLTA users. The results show that on average a user needs to go through 106 steps to complete a task. To perform all 14 tasks, they would spend about 22 min (independent of system response time) for data entry, of which 11 min are spent on more effortful mental operators. The inter-rater reliability analysis performed for all 14 tasks was 0.8 (kappa), indicating good reliability of the method. This paper empirically reveals and identifies the following finding related to the performance of AHLTA: (1) large number of average total steps to complete common tasks, (2) high average execution time and (3) large percentage of mental operators. The user interface can be improved by reducing (a) the total number of steps and (b) the percentage of mental effort, required for the tasks. 2010 Elsevier Ireland Ltd. All rights reserved.
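As a rough illustration of the KLM estimate described above, the sketch below sums operator counts against the commonly cited Card-Moran-Newell operator times; the operator counts for the example task are invented and are not AHLTA data.

```python
# Commonly cited Keystroke-Level Model operator times (seconds).
KLM_TIMES = {
    "K": 0.20,   # keystroke or button press (average skilled typist)
    "P": 1.10,   # point with mouse to a target
    "H": 0.40,   # home hands between keyboard and mouse
    "M": 1.35,   # mental preparation
}

def klm_estimate(op_counts):
    """Execution time estimate: sum of operator counts times standard operator times."""
    return sum(KLM_TIMES[op] * n for op, n in op_counts.items())

# invented operator counts for one hypothetical data-entry task
task = {"M": 12, "P": 9, "K": 40, "H": 4}
print(f"estimated execution time: {klm_estimate(task):.1f} s")
```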
Determining characteristics of artificial near-Earth objects using observability analysis
NASA Astrophysics Data System (ADS)
Friedman, Alex M.; Frueh, Carolin
2018-03-01
Observability analysis is a method for determining whether a chosen state of a system can be determined from the output or measurements. Knowledge of state information availability resulting from observability analysis leads to improved sensor tasking for observation of orbital debris and better control of active spacecraft. This research performs numerical observability analysis of artificial near-Earth objects. Analysis of linearization methods and state transition matrices is performed to determine the viability of applying linear observability methods to the nonlinear orbit problem. Furthermore, pre-whitening is implemented to reformulate classical observability analysis. In addition, the state in observability analysis is typically composed of position and velocity; however, including object characteristics beyond position and velocity can be crucial for precise orbit propagation. For example, solar radiation pressure has a significant impact on the orbit of high area-to-mass ratio objects in geosynchronous orbit. Therefore, determining the time required for solar radiation pressure parameters to become observable is important for understanding debris objects. In order to compare observability analysis results with and without measurement noise and an extended state, quantitative measures of observability are investigated and implemented.
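For the linear(ized) case mentioned above, observability is classically checked by the rank of the observability matrix O = [C; CA; ...; CA^(n-1)]. The sketch below uses a generic two-state double-integrator with position-only measurements, not the orbit dynamics or extended state of the paper.

```python
import numpy as np

def observability_matrix(A, C):
    """Stack C, CA, ..., CA^(n-1) for a linear system x' = Ax, y = Cx."""
    n = A.shape[0]
    blocks = [C]
    for _ in range(n - 1):
        blocks.append(blocks[-1] @ A)
    return np.vstack(blocks)

# generic example: position-only measurements of a double integrator
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
C = np.array([[1.0, 0.0]])
O = observability_matrix(A, C)
print("rank =", np.linalg.matrix_rank(O), "of", A.shape[0])   # full rank -> observable
```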
Memory and Obstructive Sleep Apnea: A Meta-Analysis
Wallace, Anna; Bucks, Romola S.
2013-01-01
Study Objectives: To examine episodic memory performance in individuals with obstructive sleep apnea (OSA). Design: Meta-analysis was used to synthesize results from individual studies examining the impact of OSA on episodic memory performance. The performance of individuals with OSA was compared to healthy controls or normative data. Participants: Forty-two studies were included, comprising 2,294 adults with untreated OSA and 1,364 healthy controls. Studies that recorded information about participants at baseline prior to treatment interventions were included in the analysis. Measurements: Participants were assessed with tasks that included a measure of episodic memory: immediate recall, delayed recall, learning, and/or recognition memory. Results: The results of the meta-analyses provide evidence that individuals with OSA are significantly impaired when compared to healthy controls on verbal episodic memory (immediate recall, delayed recall, learning, and recognition) and visuo-spatial episodic memory (immediate and delayed recall), but not visual immediate recall or visuo-spatial learning. When patients were compared to norms, negative effects of OSA were found only in verbal immediate and delayed recall. Conclusions: This meta-analysis contributes to understanding of the nature of episodic memory deficits in individuals with OSA. Impairments to episodic memory are likely to affect the daily functioning of individuals with OSA. Citation: Wallace A; Bucks RS. Memory and obstructive sleep apnea: a meta-analysis. SLEEP 2013;36(2):203-220. PMID:23372268
An Approach to Experimental Design for the Computer Analysis of Complex Phenomenon
NASA Technical Reports Server (NTRS)
Rutherford, Brian
2000-01-01
The ability to make credible system assessments, predictions and design decisions related to engineered systems and other complex phenomenon is key to a successful program for many large-scale investigations in government and industry. Recently, many of these large-scale analyses have turned to computational simulation to provide much of the required information. Addressing specific goals in the computer analysis of these complex phenomenon is often accomplished through the use of performance measures that are based on system response models. The response models are constructed using computer-generated responses together with physical test results where possible. They are often based on probabilistically defined inputs and generally require estimation of a set of response modeling parameters. As a consequence, the performance measures are themselves distributed quantities reflecting these variabilities and uncertainties. Uncertainty in the values of the performance measures leads to uncertainties in predicted performance and can cloud the decisions required of the analysis. A specific goal of this research has been to develop methodology that will reduce this uncertainty in an analysis environment where limited resources and system complexity together restrict the number of simulations that can be performed. An approach has been developed that is based on evaluation of the potential information provided for each "intelligently selected" candidate set of computer runs. Each candidate is evaluated by partitioning the performance measure uncertainty into two components - one component that could be explained through the additional computational simulation runs and a second that would remain uncertain. The portion explained is estimated using a probabilistic evaluation of likely results for the additional computational analyses based on what is currently known about the system. The set of runs indicating the largest potential reduction in uncertainty is then selected and the computational simulations are performed. Examples are provided to demonstrate this approach on small scale problems. These examples give encouraging results. Directions for further research are indicated.
Holmquist-Johnson, C. L.
2009-01-01
River-spanning rock structures are being constructed for water delivery as well as to enable fish passage at barriers and to provide or improve aquatic habitat for endangered fish species. Current design methods are based upon anecdotal information applicable to a narrow range of channel conditions. The complex flow patterns and performance of rock weirs are not well understood. Without an accurate understanding of their hydraulics, designers cannot address the failure mechanisms of these structures. Flow characteristics such as jets, near-bed velocities, recirculation, eddies, and plunging flow govern scour pool development. These detailed flow patterns can be replicated using a 3D numerical model. Numerical studies inexpensively simulate a large number of cases, resulting in an increased range of applicability, in order to develop design tools and predictive capability for analysis and design. The analysis and results of the numerical modeling, laboratory modeling, and field data provide a process-based method for understanding how structure geometry affects flow characteristics, scour development, fish passage, water delivery, and overall structure stability. Results of the numerical modeling allow designers to determine the appropriate geometry for generating desirable flow parameters. The end product of this research will be tools and guidelines for more robust structure design or retrofits based upon predictable engineering and hydraulic performance criteria. ?? 2009 ASCE.
Internal Flow of Contra-Rotating Small Hydroturbine at Off- Design Flow Rates
NASA Astrophysics Data System (ADS)
SHIGEMITSU, Toru; TAKESHIMA, Yasutoshi; OGAWA, Yuya; FUKUTOMI, Junichiro
2016-11-01
Small hydropower generation is an important alternative energy source, and enormous potential lies in small hydropower. However, the efficiency of small hydroturbines is lower than that of large ones, so there is a demand for small hydroturbines that maintain high performance over a wide flow rate range. Therefore, we adopted contra-rotating rotors, which can be expected to achieve high performance. In this research, the performance of a contra-rotating small hydroturbine with a 60 mm casing diameter was investigated by experiment and numerical analysis. The efficiency of the contra-rotating small hydroturbine was high for a pico-hydroturbine, and high efficiency could be maintained over a wide flow rate range; however, the performance of the rear rotor decreased significantly at partial flow rates. The internal flow condition, which was difficult to measure experimentally, was therefore investigated by numerical flow analysis, and the relation between the performance and the internal flow condition was considered using the numerical analysis results.
NASA Technical Reports Server (NTRS)
Charlton, Eric F.
1998-01-01
Aerodynamic analyses are performed using the Lockheed-Martin Tactical Aircraft Systems (LMTAS) Splitflow computational fluid dynamics code to investigate the computational prediction capabilities for vortex-dominated flow fields of two different tailless aircraft models at large angles of attack and sideslip. These computations are performed with the goal of providing useful stability and control data to designers of high performance aircraft. Appropriate metrics for accuracy, time, and ease of use are determined in consultation with both the LMTAS Advanced Design and Stability and Control groups. Results are obtained and compared to wind-tunnel data for all six components of forces and moments. Moment data are combined to form a "falling leaf" stability analysis. Finally, a handful of viscous simulations were also performed to further investigate nonlinearities and possible viscous effects in the differences between the accumulated inviscid computational and experimental data.
ERIC Educational Resources Information Center
Burton, D. Bradley; And Others
1994-01-01
A maximum-likelihood confirmatory factor analysis was performed by applying LISREL VII to the Wechsler Adult Intelligence Scale-Revised results of a normal elderly sample of 225 adults. Results indicate that a three-factor model fits best across all sample combinations. A mild gender effect is discussed. (SLD)
NASA Technical Reports Server (NTRS)
Traversi, M.
1979-01-01
Data are presented on the sensitivity of: (1) mission analysis results to the boundary values given for number of passenger cars and average annual vehicle miles traveled per car; (2) vehicle characteristics and performance to specifications; and (3) tradeoff study results to the expected parameters.
Stress Analysis and Fracture in Nanolaminate Composites
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2008-01-01
A stress analysis is performed on a nanolaminate subjected to bending. A composite mechanics computer code that is based on constituent properties and nanoelement formulation is used to evaluate the nanolaminate stresses. The results indicate that the computer code is sufficient for the analysis. The results also show that when a stress concentration is present, the nanolaminate stresses exceed their corresponding matrix-dominated strengths and the nanofiber fracture strength.
NASA Technical Reports Server (NTRS)
Repa, B. S.; Zucker, R. S.; Wierwille, W. W.
1977-01-01
The influence of vehicle transient response characteristics on driver-vehicle performance in discrete maneuvers as measured by integral performance criteria was investigated. A group of eight ordinary drivers was presented with a series of eight vehicle transfer function configurations in a driving simulator. Performance in two discrete maneuvers was analyzed by means of integral performance criteria. Results are presented.
NASA Technical Reports Server (NTRS)
1995-01-01
The purpose of the Advanced Transportation System Studies (ATSS) Technical Area 2 (TA-2) Heavy Lift Launch Vehicle Development contract was to provide advanced launch vehicle concept definition and analysis to assist NASA in the identification of future launch vehicle requirements. Contracted analysis activities included vehicle sizing and performance analysis, subsystem concept definition, propulsion subsystem definition (foreign and domestic), ground operations and facilities analysis, and life cycle cost estimation. This document is Volume 2 of the final report for the contract. It provides documentation of selected technical results from various TA-2 analysis activities, including a detailed narrative description of the SSTO concept assessment results, a user's guide for the associated SSTO sizing tools, an SSTO turnaround assessment report, an executive summary of the ground operations assessments performed during the first year of the contract, a configuration-independent vehicle health management system requirements report, a copy of all major TA-2 contract presentations, a copy of the FLO launch vehicle final report, and references to Pratt & Whitney's TA-2 sponsored final reports regarding the identification of Russian main propulsion technologies.
[Role of biometric analysis in the retrospective assessment of exposure to asbestos].
Pairon, J C; Dumortier, P
1999-12-01
Despite intrinsic limitations due to differences in the bio-persistence of the various asbestos types, in the definition of control populations and in analytical techniques used by the laboratories, mineralogical analysis of biological samples is useful in the assessment of past exposure to asbestos. It provides additional information to occupational and environmental questionnaires, particularly when exposure to asbestos is doubtful, unknown or forgotten by a subject. Results should be interpreted taking into account clinical information. A positive result does not mean existence of asbestos-related disease. A negative result does not exclude previous significant asbestos exposure, clearly identified by an occupational questionnaire (particularly for exposure to chrysotile). Threshold values indicative of a high probability of previous asbestos exposure have been established for bronchoalveolar lavage fluid (BALF) samples and lung tissue samples. Quantification of asbestos bodies by light microscopy is easy to perform. Sensitivity and specificity of this analysis towards the total pulmonary asbestos fiber burden is good. Therefore this analysis should be performed first. Mineralogical analysis in BALF or lung tissue should be considered only when sampling is supported by diagnostic or therapeutic implications.
Uncertainty in the analysis of the overall equipment effectiveness on the shop floor
NASA Astrophysics Data System (ADS)
Rößler, M. P.; Abele, E.
2013-06-01
In this article, an approach is presented that supports transparency regarding the effectiveness of manufacturing equipment by combining fuzzy set theory with the method of overall equipment effectiveness analysis. One of the key principles of lean production, and also a fundamental task in production optimization projects, is the prior analysis of the current state of a production system using key performance indicators in order to derive possible future states. The current state of the art in overall equipment effectiveness analysis is usually a cumulation of different machine states by means of decentralized data collection, without consideration of uncertainty. In manual data collection or semi-automated plant data collection systems, the quality of the derived data often diverges and leads optimization teams to distorted conclusions about the real optimization potential of manufacturing equipment. The method discussed in this paper is intended to help practitioners obtain more reliable results in the analysis phase and thus better results from optimization projects. The results obtained are discussed in the context of a case study.
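The underlying key performance indicator is conventionally computed as OEE = availability × performance rate × quality rate. The sketch below shows a crisp OEE calculation and a simple interval-valued variant as a stand-in for propagating uncertain, manually collected machine-state data; the interval treatment is a simplification of, not a substitute for, the fuzzy-set formulation of the paper, and all numbers are illustrative.

```python
def oee(availability, performance, quality):
    """Conventional crisp OEE = availability * performance rate * quality rate."""
    return availability * performance * quality

def oee_interval(a_rng, p_rng, q_rng):
    """Propagate (low, high) intervals for each factor into an OEE interval."""
    lo = a_rng[0] * p_rng[0] * q_rng[0]
    hi = a_rng[1] * p_rng[1] * q_rng[1]
    return lo, hi

print(f"crisp OEE: {oee(0.90, 0.85, 0.98):.3f}")

# uncertain manually collected machine states -> interval-valued factors (illustrative)
lo, hi = oee_interval((0.85, 0.93), (0.80, 0.88), (0.97, 0.99))
print(f"OEE range under uncertainty: {lo:.3f} - {hi:.3f}")
```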
Macro elemental analysis of food samples by nuclear analytical technique
NASA Astrophysics Data System (ADS)
Syahfitri, W. Y. N.; Kurniawati, S.; Adventini, N.; Damastuti, E.; Lestiani, D. D.
2017-06-01
Energy-dispersive X-ray fluorescence (EDXRF) spectrometry is a non-destructive, rapid, multi-elemental, accurate, and environmentally friendly analysis method compared with other detection methods; thus, EDXRF spectrometry is applicable to food inspection. The macro elements calcium and potassium are important nutrients required by the human body for optimal physiological function. Therefore, the determination of Ca and K content in various foods needs to be done. The aim of this work is to demonstrate the applicability of EDXRF for food analysis. The analytical performance of non-destructive EDXRF was compared with other analytical techniques: neutron activation analysis (NAA) and atomic absorption spectrometry (AAS). The comparison of methods was performed as a cross-check of the analysis results and to overcome the limitations of the three methods. The results showed that Ca in food determined by EDXRF and AAS was not significantly different (p-value 0.9687), whereas the p-value for K between EDXRF and NAA was 0.6575. The correlation between those results was also examined: the Pearson correlations for Ca and K were 0.9871 and 0.9558, respectively. Method validation using SRM NIST 1548a Typical Diet was also performed. The results showed good agreement between methods; therefore, the EDXRF method can be used as an alternative method for the determination of Ca and K in food samples.
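A comparison of this kind, with paired concentrations from two methods on the same samples, is typically evaluated with a paired test and a Pearson correlation. The sketch below uses placeholder arrays, not the study's measurements; the sample values and units are assumptions.

```python
import numpy as np
from scipy import stats

# placeholder paired Ca results (mg/100 g) from two methods on the same food samples
ca_edxrf = np.array([120.0, 95.0, 210.0, 48.0, 160.0, 75.0])
ca_aas   = np.array([118.0, 99.0, 205.0, 50.0, 166.0, 72.0])

t_stat, p_value = stats.ttest_rel(ca_edxrf, ca_aas)   # paired comparison of methods
r, _ = stats.pearsonr(ca_edxrf, ca_aas)               # agreement between methods
print(f"paired t-test p = {p_value:.3f}, Pearson r = {r:.4f}")
```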
Multiscale Medical Image Fusion in Wavelet Domain
Khare, Ashish
2013-01-01
Wavelet transforms have emerged as a powerful tool in image fusion. However, the study and analysis of medical image fusion is still a challenging area of research. Therefore, in this paper, we propose a multiscale fusion of multimodal medical images in wavelet domain. Fusion of medical images has been performed at multiple scales varying from minimum to maximum level using maximum selection rule which provides more flexibility and choice to select the relevant fused images. The experimental analysis of the proposed method has been performed with several sets of medical images. Fusion results have been evaluated subjectively and objectively with existing state-of-the-art fusion methods which include several pyramid- and wavelet-transform-based fusion methods and principal component analysis (PCA) fusion method. The comparative analysis of the fusion results has been performed with edge strength (Q), mutual information (MI), entropy (E), standard deviation (SD), blind structural similarity index metric (BSSIM), spatial frequency (SF), and average gradient (AG) metrics. The combined subjective and objective evaluations of the proposed fusion method at multiple scales showed the effectiveness and goodness of the proposed approach. PMID:24453868
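A minimal sketch of multiscale fusion with the maximum selection rule on wavelet detail coefficients is shown below, using PyWavelets. Averaging the coarsest approximation band is one common choice and not necessarily the paper's exact rule; the wavelet, decomposition level, and random placeholder images are assumptions.

```python
import numpy as np
import pywt

def fuse_multiscale(img_a, img_b, wavelet="db2", level=3):
    """Wavelet-domain fusion: average the approximation band, take the
    maximum-magnitude detail coefficient at every scale and orientation."""
    ca = pywt.wavedec2(img_a, wavelet, level=level)
    cb = pywt.wavedec2(img_b, wavelet, level=level)
    fused = [(ca[0] + cb[0]) / 2.0]
    for details_a, details_b in zip(ca[1:], cb[1:]):
        fused.append(tuple(np.where(np.abs(x) >= np.abs(y), x, y)
                           for x, y in zip(details_a, details_b)))
    return pywt.waverec2(fused, wavelet)

# placeholder "modalities" (random arrays standing in for registered MRI/CT slices)
rng = np.random.default_rng(4)
mri = rng.random((128, 128))
ct = rng.random((128, 128))
fused = fuse_multiscale(mri, ct)
print(fused.shape)
```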
NASA Astrophysics Data System (ADS)
Hou, Zhenlong; Huang, Danian
2017-09-01
In this paper, we first study the inversion of probability tomography (IPT) with gravity gradiometry data. The spatial resolution of the results is improved by multi-tensor joint inversion, a depth weighting matrix, and other methods. To address the problems posed by large data volumes in exploration, we present a parallel algorithm and its performance analysis, combining Compute Unified Device Architecture (CUDA) with Open Multi-Processing (OpenMP) based on Graphics Processing Unit (GPU) acceleration. In tests on a synthetic model and real data from Vinton Dome, we obtain improved results, and the improved inversion algorithm is shown to be effective and feasible. The performance of the parallel algorithm we designed is better than that of the other CUDA-based implementations; the maximum speedup can exceed 200. In the performance analysis, multi-GPU speedup and multi-GPU efficiency are applied to analyze the scalability of the multi-GPU programs. The designed parallel algorithm is demonstrated to be able to process larger scales of data, and the new analysis method is practical.
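The scalability measures referred to above are commonly defined as speedup S = T_reference / T_parallel and parallel efficiency E = S / N_devices (with the single-GPU run as the reference for multi-GPU scaling). The sketch below uses assumed wall-clock times for illustration, not the paper's measured timings.

```python
def speedup(t_ref, t):
    """Speedup of a run with wall-clock time t relative to a reference time t_ref."""
    return t_ref / t

# assumed wall-clock times (seconds) for one inversion, illustration only
t_cpu, t_1gpu = 8200.0, 38.0
print(f"GPU vs. CPU speedup: {speedup(t_cpu, t_1gpu):.0f}x")

for n, t in [(2, 21.0), (4, 11.0)]:
    s = speedup(t_1gpu, t)                 # multi-GPU speedup vs. one GPU
    print(f"{n} GPUs: speedup = {s:.2f}x, efficiency = {s / n:.2f}")
```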
Illumination analysis of LAPAN's IR micro bolometer
NASA Astrophysics Data System (ADS)
Bustanul, A.; Irwan, P.; Andi M., T.
2016-10-01
For the past two years we have been conducting research on an IR micro bolometer intended to fulfill our office's (LAPAN) desire to include it as one of the payloads on LAPAN's next microsatellite project, either LAPAN A4 or LAPAN A5. Because of our limited experience with the subject, the work began with a spectral radiance analysis tailored to catastrophe sources in Indonesia, mainly wildfires (forest fires) and active volcanoes. Based on the selected spectral radiance wavelength band, 3.8-4 μm, and the field of view (FOV), we then proceeded to the next stage, the optical analysis. Focusing on illumination, the analysis was performed using Zemax software. Optical path interference and stray light were the two main concerns throughout the work; they also served to evaluate and optimize the illumination performance of our optical design. The results show that the design performance is close to diffraction limited and that the geometrical image blur produced by LAPAN's IR micro bolometer lenses falls within the pixel area. Therefore, the optical design performance is relatively good and will produce high-quality images. In this paper, the illumination analysis and process of LAPAN's infrared (IR) micro bolometer are presented.
A reliable unipedal stance test for the assessment of balance using a force platform.
Ponce-González, J G; Sanchis-Moysi, J; González-Henriquez, J J; Arteaga-Ortiz, R; Calbet, J A L; Dorado, C
2014-02-01
The aim was to develop a unipedal stance test for the assessment of balance using a force platform. A single-leg balance test was conducted in 23 students (mean ± SD age: 23 ± 3 years) in a standard position limiting the movement of the arms and the non-supporting leg. Six attempts, with both the jumping leg (JL) and the contralateral leg (CL), were performed under 3 conditions: 1) eyes opened; 2) eyes closed; 3) eyes opened and executing a precision task. The same protocol was repeated two weeks apart. The mean and the best result of the six attempts performed each day were taken as representative of balance. The speed of the centre of pressure (CP-Speed) showed excellent reliability for the "best result" analysis in all tests (ICCs 0.87-0.97), except in the test with the eyes closed performed on the CL (ICC<0.4). The CP-Speed had better reliability with the "best result" than with the "mean result" analysis (P<0.05), whilst no significant differences were observed between the JL and the CL (P=0.71 and P=0.96 for the mean and best result analyses, respectively). Lower dispersion in the Bland and Altman graphs was observed with the eyes opened than with the eyes closed or during the dynamic test. The single-leg stance balance test proposed is a reliable method to assess balance, especially when performed in a static position, with the eyes opened, and using the best result of six attempts as the reference, independently of the stance leg.
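The CP-Speed measure used above is essentially the centre-of-pressure path length divided by the trial duration. The sketch below computes it from force-platform CoP coordinates; the synthetic signal and the 100 Hz sampling rate are assumptions, not the study's recordings.

```python
import numpy as np

def cop_speed(cop_x, cop_y, fs):
    """Mean centre-of-pressure speed: CoP path length divided by trial duration."""
    path = np.sum(np.hypot(np.diff(cop_x), np.diff(cop_y)))
    duration = (len(cop_x) - 1) / fs
    return path / duration

# synthetic 30 s single-leg stance CoP trace sampled at 100 Hz (illustration only)
fs = 100.0
t = np.arange(0, 30.0, 1.0 / fs)
rng = np.random.default_rng(2)
cop_x = 0.005 * np.sin(2 * np.pi * 0.4 * t) + rng.normal(0, 0.0005, t.size)  # metres
cop_y = 0.004 * np.cos(2 * np.pi * 0.3 * t) + rng.normal(0, 0.0005, t.size)  # metres
print(f"CP-Speed = {cop_speed(cop_x, cop_y, fs) * 1000:.1f} mm/s")
```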
Results of a collaborative study on DNA identification of aged bone samples
Vanek, Daniel; Budowle, Bruce; Dubska-Votrubova, Jitka; Ambers, Angie; Frolik, Jan; Pospisek, Martin; Al Afeefi, Ahmed Anwar; Al Hosani, Khalid Ismaeil; Allen, Marie; Al Naimi, Khudooma Saeed; Al Salafi, Dina; Al Tayyari, Wafa Ali Rashid; Arguetaa, Wendy; Bottinelli, Michel; Bus, Magdalena M.; Cemper-Kiesslich, Jan; Cepil, Olivier; De Cock, Greet; Desmyter, Stijn; El Amri, Hamid; El Ossmani, Hicham; Galdies, Ruth; Grün, Sebastian; Guidet, Francois; Hoefges, Anna; Iancu, Cristian Bogdan; Lotz, Petra; Maresca, Alessandro; Nagy, Marion; Novotny, Jindrich; Rachid, Hajar; Rothe, Jessica; Stenersen, Marguerethe; Stephenson, Mishel; Stevanovitch, Alain; Strien, Juliane; Sumita, Denilce R.; Vella, Joanna; Zander, Judith
2017-01-01
Aim: A collaborative exercise with several institutes was organized by the Forensic DNA Service (FDNAS) and the Institute of Legal Medicine, 2nd Faculty of Medicine, Charles University in Prague, Czech Republic, with the aim of testing the performance of different laboratories carrying out DNA analysis of relatively old bone samples. Methods: Eighteen laboratories participating in the collaborative exercise were asked to perform DNA typing of two samples of bone powder. The two bone samples, provided by the National Museum and the Institute of Archaeology in Prague, Czech Republic, came from archeological excavations and were estimated to be approximately 150 and 400 years old. The methods of genetic characterization, including autosomal, gonosomal, and mitochondrial markers, were selected solely at the discretion of the participating laboratory. Results: Although the participating laboratories used different extraction and amplification strategies, concordant results were obtained from the relatively intact 150-year-old bone sample. Typing was more problematic for the 400-year-old bone sample due to its poorer quality. Conclusion: The laboratories performing identification DNA analysis of bone and teeth samples should regularly test their ability to correctly perform DNA-based identification on bone samples containing degraded DNA and potential inhibitors and demonstrate that the risk of contamination is minimized. PMID:28613037
Shenoy, Shailesh M
2016-07-01
A challenge in any imaging laboratory, especially one that uses modern techniques, is to achieve a sustainable and productive balance between using open source and commercial software to perform quantitative image acquisition, analysis and visualization. In addition to considering the expense of software licensing, one must consider factors such as the quality and usefulness of the software's support, training and documentation. Also, one must consider the reproducibility with which multiple people generate results using the same software to perform the same analysis, how one may distribute their methods to the community using the software and the potential for achieving automation to improve productivity.
Probabilistic Analysis of Solid Oxide Fuel Cell Based Hybrid Gas Turbine System
NASA Technical Reports Server (NTRS)
Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.
2003-01-01
The emergence of fuel cell systems and hybrid fuel cell systems requires the evolution of analysis strategies for evaluating thermodynamic performance. A gas turbine thermodynamic cycle integrated with a fuel cell was computationally simulated and probabilistically evaluated in view of the several uncertainties in the thermodynamic performance parameters. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the uncertainties in the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design and make it cost effective. The analysis leads to the selection of criteria for gas turbine performance.
Dunphy, C H; Polski, J M; Evans, H L; Gardner, L J
2001-08-01
Immunophenotyping of bone marrow (BM) specimens with acute myelogenous leukemia (AML) may be performed by flow cytometric (FC) or immunohistochemical (IH) techniques. Some markers (CD34, CD15, and CD117) are available for both techniques. Myeloperoxidase (MPO) analysis may be performed by enzyme cytochemical (EC) or IH techniques. To determine the reliability of these markers and MPO by these techniques, we designed a study to compare the results of analyses of these markers and MPO by FC (CD34, CD15, and CD117), EC (MPO), and IH (CD34, CD15, CD117, and MPO) techniques. Twenty-nine AMLs formed the basis of the study. These AMLs all had been immunophenotyped previously by FC analysis; 27 also had had EC analysis performed. Of the AMLs, 29 had BM core biopsies and 26 had BM clots that could be evaluated. The paraffin blocks of the 29 BM core biopsies and 26 BM clots were stained for CD34, CD117, MPO, and CD15. These results were compared with results by FC analysis (CD34, CD15, and CD117) and EC analysis (MPO). Immunodetection of CD34 expression in AML had a similar sensitivity by FC and IH techniques. Immunodetection of CD15 and CD117 had a higher sensitivity by FC analysis than by IH analysis. Detection of MPO by IH analysis was more sensitive than by EC analysis. There was no correlation of French-American-British (FAB) subtype of AML with CD34 or CD117 expression. Expression of CD15 was associated with AMLs with a monocytic component. Myeloperoxidase reactivity by IH analysis was observed in AMLs originally FAB subtyped as M0. CD34 can be equally detected by FC and IH techniques. CD15 and CD117 are better detected by FC analysis and MPO is better detected by IH analysis.
The CF6 engine performance improvement
NASA Technical Reports Server (NTRS)
Fasching, W. A.
1982-01-01
As part of the NASA-sponsored Engine Component Improvement (ECI) Program, a feasibility analysis of performance improvement and retention concepts for the CF6-6 and CF6-50 engines was conducted, and seven concepts were identified for development and ground testing: new fan, new front mount, high pressure turbine aerodynamic performance improvement, high pressure turbine roundness, high pressure turbine active clearance control, low pressure turbine active clearance control, and short core exhaust nozzle. The development work and ground testing are summarized, and the major test results and an economic analysis for each concept are presented.
Numerical flow analysis of axial flow compressor for steady and unsteady flow cases
NASA Astrophysics Data System (ADS)
Prabhudev, B. M.; Satish kumar, S.; Rajanna, D.
2017-07-01
The performance of a jet engine depends on the performance of its compressor. This paper presents a numerical study of the performance characteristics of an axial compressor. The test rig is located at the CSIR laboratory, Bangalore. Flow domains are meshed and the fluid dynamic equations are solved using the ANSYS package. Analysis is performed for six different speeds and for operating conditions such as choke, maximum efficiency, and the point just before stall. Different plots are compared and the results are discussed. Shock displacement, vortex flows, and leakage patterns are presented along with an unsteady FFT plot and a time-step plot.
Grasso, Chiara; Trevisan, Morena; Fiano, Valentina; Tarallo, Valentina; De Marco, Laura; Sacerdote, Carlotta; Richiardi, Lorenzo; Merletti, Franco; Gillio-Tos, Anna
2016-01-01
Pyrosequencing has emerged as an alternative method of nucleic acid sequencing, well suited for many applications which aim to characterize single nucleotide polymorphisms, mutations, microbial types and CpG methylation in the target DNA. The commercially available pyrosequencing systems can harbor two different types of software which allow analysis in AQ or CpG mode, respectively, both widely employed for DNA methylation analysis. The aim of the study was to assess the performance for DNA methylation analysis at CpG sites of the two pyrosequencing software packages which allow analysis in AQ or CpG mode, respectively. Although CpG mode was specifically designed for CpG methylation quantification, many investigations on this topic have been carried out with AQ mode. As proof of equivalent performance of the two software packages for this type of analysis is not available, the focus of this paper was to evaluate whether the two modes currently used for CpG methylation assessment by pyrosequencing give overlapping results. We compared the performance of the two software packages in quantifying DNA methylation in the promoters of selected genes (GSTP1, MGMT, LINE-1) by testing two case series, which included DNA from paraffin-embedded prostate cancer tissues (PC study, N = 36) and DNA from blood fractions of healthy people (DD study, N = 28), respectively. We found discrepancies between the two pyrosequencing software packages in the quality assignment of DNA methylation assays. Compared to the software for analysis in AQ mode, less permissive criteria are applied by the Pyro Q-CpG software, which enables analysis in CpG mode. CpG mode warns operators about potentially unsatisfactory performance of the assay and ensures a more accurate quantitative evaluation of DNA methylation at CpG sites. The implementation of CpG mode is strongly advisable in order to improve the reliability of the methylation analysis results achievable by pyrosequencing.
Joint modality fusion and temporal context exploitation for semantic video analysis
NASA Astrophysics Data System (ADS)
Papadopoulos, Georgios Th; Mezaris, Vasileios; Kompatsiaris, Ioannis; Strintzis, Michael G.
2011-12-01
In this paper, a multi-modal context-aware approach to semantic video analysis is presented. Overall, the examined video sequence is initially segmented into shots and for every resulting shot appropriate color, motion and audio features are extracted. Then, Hidden Markov Models (HMMs) are employed for performing an initial association of each shot with the semantic classes that are of interest separately for each modality. Subsequently, a graphical modeling-based approach is proposed for jointly performing modality fusion and temporal context exploitation. Novelties of this work include the combined use of contextual information and multi-modal fusion, and the development of a new representation for providing motion distribution information to HMMs. Specifically, an integrated Bayesian Network is introduced for simultaneously performing information fusion of the individual modality analysis results and exploitation of temporal context, contrary to the usual practice of performing each task separately. Contextual information is in the form of temporal relations among the supported classes. Additionally, a new computationally efficient method for providing motion energy distribution-related information to HMMs, which supports the incorporation of motion characteristics from previous frames to the currently examined one, is presented. The final outcome of this overall video analysis framework is the association of a semantic class with every shot. Experimental results as well as comparative evaluation from the application of the proposed approach to four datasets belonging to the domains of tennis, news and volleyball broadcast video are presented.
Knapp, F; Viechtbauer, W; Leonhart, R; Nitschke, K; Kaller, C P
2017-08-01
Despite a large body of research on planning performance in adult schizophrenia patients, results of individual studies are equivocal, suggesting either no, moderate or severe planning deficits. This meta-analysis therefore aimed to quantify planning deficits in schizophrenia and to examine potential sources of the heterogeneity seen in the literature. The meta-analysis comprised outcomes of planning accuracy of 1377 schizophrenia patients and 1477 healthy controls from 31 different studies which assessed planning performance using tower tasks such as the Tower of London, the Tower of Hanoi and the Stockings of Cambridge. A meta-regression analysis was applied to assess the influence of potential moderator variables (i.e. sociodemographic and clinical variables as well as task difficulty). The findings indeed demonstrated a planning deficit in schizophrenia patients (95% confidence interval of the mean effect size: 0.56-0.78) that was moderated by task difficulty in terms of the minimum number of moves required for a solution. The results did not reveal any significant relationship between the extent of planning deficits and sociodemographic or clinical variables. The current results provide the first meta-analytic evidence for the commonly assumed impairments of planning performance in schizophrenia. Deficits are more likely to become manifest in problem items with higher demands on planning ahead, which may at least partly explain the heterogeneity of previous findings. As only a small fraction of studies reported coherent information on sample characteristics, future meta-analyses would benefit from more systematic reporting of those variables.
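For readers unfamiliar with the pooling step behind such an effect size, the following is a minimal DerSimonian-Laird random-effects sketch in Python; the per-study effect sizes and variances are invented placeholders, not the 31 studies analysed here.

```python
# Minimal DerSimonian-Laird random-effects pooling, the kind of calculation
# underlying a pooled planning-deficit effect size. The per-study effect sizes
# and variances below are invented placeholders, not data from this meta-analysis.
import numpy as np

g = np.array([0.45, 0.80, 0.62, 0.55, 0.90, 0.70])   # per-study standardized effects
v = np.array([0.04, 0.06, 0.05, 0.03, 0.08, 0.05])   # per-study sampling variances

w = 1.0 / v                                           # fixed-effect weights
q = np.sum(w * (g - np.sum(w * g) / np.sum(w)) ** 2)  # Cochran's Q
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (q - (len(g) - 1)) / c)               # between-study variance estimate

w_re = 1.0 / (v + tau2)                               # random-effects weights
mean = np.sum(w_re * g) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
print(f"pooled effect {mean:.2f}, 95% CI [{mean - 1.96*se:.2f}, {mean + 1.96*se:.2f}], tau^2 {tau2:.3f}")
```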
Hormann, Wymke; Hahn, Melanie; Gerlach, Stefan; Hochstrate, Nicola; Affeldt, Kai; Giesen, Joyce; Fechner, Kai; Damoiseaux, Jan G M C
2017-11-27
Antibodies directed against dsDNA are a highly specific diagnostic marker for the presence of systemic lupus erythematosus and of particular importance in its diagnosis. To assess anti-dsDNA antibodies, the Crithidia luciliae-based indirect immunofluorescence test (CLIFT) is considered one of the best assay choices. To overcome the drawback of subjective result interpretation that is inherent to indirect immunofluorescence assays in general, automated systems have been introduced to the market in recent years. Among these systems is the EUROPattern Suite, an advanced automated fluorescence microscope equipped with different software packages, capable of automated pattern interpretation and result suggestion for ANA, ANCA and CLIFT analysis. We analyzed the performance of the EUROPattern Suite with its automated fluorescence interpretation for CLIFT in a routine setting, reflecting the everyday practice of a diagnostic laboratory. Three hundred and twelve consecutive samples, sent to the Central Diagnostic Laboratory of the Maastricht University Medical Centre with a request for anti-dsDNA analysis, were collected over a period of 7 months. Agreement between EUROPattern assay analysis and the visual read was 93.3%. Sensitivity and specificity were 94.1% and 93.2%, respectively. The EUROPattern Suite performed reliably and greatly supported result interpretation. Automated image acquisition is readily performed and automated image classification gives a reliable recommendation for assay evaluation to the operator. The EUROPattern Suite optimizes workflow and contributes to standardization between different operators or laboratories.
Quantitative analysis of peel-off degree for printed electronics
NASA Astrophysics Data System (ADS)
Park, Janghoon; Lee, Jongsu; Sung, Ki-Hak; Shin, Kee-Hyun; Kang, Hyunkyoo
2018-02-01
We suggest a facile methodology for evaluating the peel-off degree of printed electronics by image processing. The quantification of peeled and printed areas was performed using open source programs. To verify the accuracy of the method, we manually removed areas from the printed circuit and measured them, obtaining 96.3% accuracy. The sintered patterns showed a decreasing tendency as the energy density of the infrared lamp increased, while the peel-off degree increased; a comparison between both results is presented. Finally, the correlation between performance characteristics was determined by quantitative analysis.
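As an illustration of the kind of open-source quantification described above, the sketch below estimates a peel-off degree by grayscale thresholding with OpenCV and NumPy; the threshold value and file names are assumptions for the example, not the authors' actual toolchain or settings.

```python
# Minimal sketch: estimate a peel-off degree as the fraction of the printed-pattern
# area lost after the peel test, using simple grayscale thresholding.
# Threshold and file names are illustrative assumptions, not the authors' setup.
import cv2
import numpy as np

def peel_off_degree(before_path: str, after_path: str, thresh: int = 128) -> float:
    """Return the fraction of the printed area lost after the peel test."""
    before = cv2.imread(before_path, cv2.IMREAD_GRAYSCALE)
    after = cv2.imread(after_path, cv2.IMREAD_GRAYSCALE)

    # Dark pixels are assumed to be printed (conductive) material.
    printed_before = before < thresh
    printed_after = after < thresh

    peeled = np.logical_and(printed_before, ~printed_after)
    return peeled.sum() / printed_before.sum()

if __name__ == "__main__":
    degree = peel_off_degree("pattern_before.png", "pattern_after.png")
    print(f"peel-off degree: {degree:.1%}")
```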
Power Plant Emission Reductions Using a Generation Performance Standard
2001-01-01
In an earlier analysis completed in response to a request received from Representative David McIntosh, Chairman of the Subcommittee on National Economic Growth, Natural Resources, and Regulatory Affairs, the Energy Information Administration analyzed the impacts of power sector caps on nitrogen oxides, sulfur dioxide, and carbon dioxide emissions, assuming a policy instrument patterned after the sulfur dioxide allowance program created in the Clean Air Act Amendments of 1990. This paper compares the results of that work with the results of an analysis that assumes the use of a dynamic generation performance standard as an instrument for reducing carbon dioxide emissions.
NASA Astrophysics Data System (ADS)
Boy, M.; Yaşar, N.; Çiftçi, İ.
2016-11-01
In recent years, turning of hardened steels has replaced grinding for finishing operations. Compared to grinding, hard turning offers higher material removal rates, greater process flexibility, lower equipment costs, and shorter setup times. CBN or ceramic cutting tools are widely used in hard part machining. For successful application of hard turning, selection of suitable cutting parameters for a given cutting tool is an important step. For this purpose, an experimental investigation was conducted to determine the effects of cutting tool edge geometry, feed rate and cutting speed on surface roughness and resultant cutting force in hard turning of AISI H13 steel with ceramic cutting tools. Machining experiments were conducted on a CNC lathe based on a Taguchi experimental design (L16) at different levels of the cutting parameters. In the experiments, a Kistler 9257 B piezoelectric dynamometer was used to measure the three cutting force components (Fc, Ff and Fr). Surface roughness measurements were performed using a Mahrsurf PS1 device. For statistical analysis, analysis of variance was performed and mathematical models were developed for surface roughness and resultant cutting force. The analysis of variance results showed that cutting edge geometry, cutting speed and feed rate were the most significant factors for the resultant cutting force, while cutting edge geometry and feed rate were the most significant factors for surface roughness. Regression analysis was applied to predict the outcomes of the experiment; the predicted and measured values were very close to each other. Afterwards, confirmation tests were performed to compare the predicted and measured results. According to the confirmation test results, the measured values are within the 95% confidence interval.
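A minimal sketch of this type of statistical treatment (ANOVA plus a regression model usable for confirmation-test prediction) is given below with statsmodels; the design here is a 16-run full-factorial stand-in rather than the actual Taguchi L16 orthogonal array, and the factor levels and roughness values are made-up placeholders, not the paper's data.

```python
# Illustrative ANOVA + regression over a 16-run factor table (a full-factorial
# stand-in for an L16-style design). All levels and Ra responses are invented.
import itertools
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

runs = list(itertools.product(["honed", "chamfered"], [100, 150], [0.05, 0.10, 0.15, 0.20]))
data = pd.DataFrame(runs, columns=["edge_geometry", "cutting_speed", "feed_rate"])
# Made-up surface roughness responses (micrometres), one per run
data["Ra"] = [0.42, 0.55, 0.61, 0.74, 0.40, 0.52, 0.58, 0.70,
              0.48, 0.62, 0.69, 0.83, 0.46, 0.59, 0.66, 0.79]

model = ols("Ra ~ C(edge_geometry) + cutting_speed + feed_rate", data=data).fit()
print(sm.stats.anova_lm(model, typ=2))   # which factors significantly affect Ra?

# Predict Ra for a hypothetical confirmation-test condition
new_run = pd.DataFrame({"edge_geometry": ["honed"], "cutting_speed": [125], "feed_rate": [0.12]})
print(model.predict(new_run))
```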
NASA Technical Reports Server (NTRS)
Egolf, T. Alan; Anderson, Olof L.; Edwards, David E.; Landgrebe, Anton J.
1988-01-01
A computer program, the Propeller Nacelle Aerodynamic Performance Prediction Analysis (PANPER), was developed for the prediction and analysis of the performance and airflow of propeller-nacelle configurations operating over a forward speed range inclusive of high speed flight typical of recent propfan designs. A propeller lifting line, wake program was combined with a compressible, viscous center body interaction program, originally developed for diffusers, to compute the propeller-nacelle flow field, blade loading distribution, propeller performance, and the nacelle forebody pressure and viscous drag distributions. The computer analysis is applicable to single and coaxial counterrotating propellers. The blade geometries can include spanwise variations in sweep, droop, taper, thickness, and airfoil section type. In the coaxial mode of operation the analysis can treat both equal and unequal blade number and rotational speeds on the propeller disks. The nacelle portion of the analysis can treat both free air and tunnel wall configurations including wall bleed. The analysis was applied to many different sets of flight conditions using selected aerodynamic modeling options. The influence of different propeller nacelle-tunnel wall configurations was studied. Comparisons with available test data for both single and coaxial propeller configurations are presented along with a discussion of the results.
Software Engineering Laboratory (SEL) Ada performance study report
NASA Technical Reports Server (NTRS)
Booth, Eric W.; Stark, Michael E.
1991-01-01
The goals and scope of the Ada Performance Study are described, and the background of Ada development in the Flight Dynamics Division (FDD) is presented. The organization and overall purpose of each test are discussed, along with the purpose, methods, and results of each test and analyses of these results. The approach used on the performance tests is described, and guidelines for future Ada development efforts, based on the analysis of results from this study, are provided.
Performance Evaluation of Nano-JASMINE
NASA Astrophysics Data System (ADS)
Hatsutori, Y.; Kobayashi, Y.; Gouda, N.; Yano, T.; Murooka, J.; Niwa, Y.; Yamada, Y.
We report the results of performance evaluation of the first Japanese astrometry satellite, Nano-JASMINE. It is a very small satellite, weighing only 35 kg, which aims to carry out astrometric measurements of nearby bright stars (z ≤ 7.5 mag) with an accuracy of 3 milli-arcseconds. Nano-JASMINE will be launched by a Cyclone-4 rocket in August 2011 from Brazil. The satellite is currently in the performance evaluation phase, and a series of performance tests and numerical analyses was conducted. As a result, the engineering model (EM) of the telescope was measured to achieve diffraction-limited performance, confirming that it has sufficient performance for scientific astrometry.
ERIC Educational Resources Information Center
Bransky, Judith; Qualter, Anne
1993-01-01
Describes the findings of secondary analysis of data from the Assessment of Performance Unit (APU) Science. The most striking feature of the study is the extremely low level of scores obtained for questions which invite a written response. The results also clearly show the consistent negative reaction of girls to the technical context of…
Gao, Meng; Wang, Yuesheng; Wei, Huizhen; Ouyang, Hui; He, Mingzhen; Zeng, Lianqing; Shen, Fengyun; Guo, Qiang; Rao, Yi
2014-06-01
A method was developed for the determination of amygdalin and its metabolite prunasin in rat plasma after intragastric administration of Maxing shigan decoction. The analytes were identified by ultra-high performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry and quantitatively determined by ultra-high performance liquid chromatography coupled with triple quadrupole mass spectrometry. After purification by liquid-liquid extraction, the qualitative analysis of amygdalin and prunasin in the plasma samples was performed on a Shim-pack XR-ODS III HPLC column (75 mm x 2.0 mm, 1.6 microm) using an acetonitrile-0.1% (v/v) aqueous formic acid mobile phase, with detection on a Triple TOF 5600 quadrupole time-of-flight mass spectrometer. The quantitative analysis of amygdalin and prunasin in the plasma samples was performed on an Agilent C18 HPLC column (50 mm x 2.1 mm, 1.7 microm) using the same acetonitrile-0.1% (v/v) aqueous formic acid mobile phase, with detection on an AB Q-TRAP 4500 triple quadrupole mass spectrometer equipped with an electrospray ionization (ESI) interface operated in negative ion mode with multiple-reaction monitoring (MRM). The qualitative analysis showed that amygdalin and its metabolite prunasin were detected in the plasma samples. The quantitative analysis showed that the linear range of amygdalin was 1.05-4200 ng/mL with a correlation coefficient of 0.9990, and the linear range of prunasin was 1.25-2490 ng/mL with a correlation coefficient of 0.9970. The method had good precision, with relative standard deviations (RSDs) lower than 9.20%, and the overall recoveries varied from 82.33% to 95.25%. The limits of detection (LODs) of amygdalin and prunasin were 0.50 ng/mL. With good reproducibility, the method is simple, fast and effective for the qualitative and quantitative analysis of amygdalin and prunasin in plasma samples of rats administered Maxing shigan decoction.
Thermal Deformation and RF Performance Analyses for the SWOT Large Deployable Ka-Band Reflectarray
NASA Technical Reports Server (NTRS)
Fang, H.; Sunada, E.; Chaubell, J.; Esteban-Fernandez, D.; Thomson, M.; Nicaise, F.
2010-01-01
A large deployable antenna technology for the NASA Surface Water and Ocean Topography (SWOT) Mission is currently being developed by JPL in response to NRC Earth Science Tier 2 Decadal Survey recommendations. This technology is required to enable the SWOT mission because no currently available antenna is capable of meeting SWOT's demanding Ka-Band remote sensing requirements. One of the key aspects of this antenna development is to minimize the effect of on-orbit thermal distortion on the antenna RF performance. To support the development of this antenna technology, an analysis process has been developed that includes: 1) on-orbit thermal analysis to obtain the temperature distribution; 2) structural deformation analysis to obtain the deformed geometry of the antenna surface; and 3) RF performance analysis with the deformed antenna surface. The detailed analysis process and some analysis results are presented and discussed in this paper.
Heave-pitch-roll analysis and testing of air cushion landing systems
NASA Technical Reports Server (NTRS)
Boghani, A. B.; Captain, K. M.; Wormley, D. N.
1978-01-01
The analytical tools (analysis and computer simulation) needed to explain and predict the dynamic operation of air cushion landing systems (ACLS) are described. The following tasks were performed: development of improved analytical models for the fan and the trunk; formulation of a heave-pitch-roll analysis for the complete ACLS; development of a general purpose computer simulation to evaluate landing and taxi performance of an ACLS-equipped aircraft; and verification and refinement of the analysis by comparison with test data obtained through laboratory testing of a prototype cushion. Demonstrations of the simulation capabilities through typical landing and taxi simulations of an ACLS aircraft are given. Initial results show that fan dynamics have a major effect on system performance. Comparison with laboratory test data (zero forward speed) indicates that the analysis can predict most of the key static and dynamic parameters (pressure, deflection, acceleration, etc.) within a margin of 10 to 25 percent.
Skills for the 21st Century Supervisor: What Factory Personnel Think.
ERIC Educational Resources Information Center
Hotek, Douglas R.
2002-01-01
Discusses supervisory skills that factory personnel believe are important for leading and improving employee performance in complex manufacturing environments. Highlights include a historical perspective; manufacturing technologies; results of Pareto analysis, comparative analysis, and analysis of variance; and a Taxonomy of Supervisory Skills.…
Assessment of statewide intersection safety performance.
DOT National Transportation Integrated Search
2011-06-01
This report summarizes the results of an analysis of the safety performance of Oregon's intersections. Following a pilot study, a database of 500 intersections randomly sampled from around the state of Oregon in both urban and rural environment...
WASTE COMBUSTION SYSTEM ANALYSIS
The report gives results of a study of biomass combustion alternatives. The objective was to evaluate the thermal performance and costs of available and developing biomass systems. The characteristics of available biomass fuels were reviewed, and the performance parameters of alt...
Performance Management and Reward
NASA Astrophysics Data System (ADS)
Yiannis, Triantafyllopoulos; Ioannis, Seimenis; Nikolaos, Konstantopoulos
2009-08-01
The article aims to examine current Performance Management practices on Reward, financial or non-financial, using lessons from the literature and the results of a qualitative analysis, as revealed from interviews with executive members of Greek companies.
Optimizing construction quality management of pavements using mechanistic performance analysis.
DOT National Transportation Integrated Search
2004-08-01
This report presents a statistical-based algorithm that was developed to reconcile the results from several pavement performance models used in the state of practice with systematic process control techniques. These algorithms identify project-specif...
ERIC Educational Resources Information Center
Chiang, Hsu-Min; Tsai, Luke Y.; Cheung, Ying Kuen; Brown, Alice; Li, Huacheng
2014-01-01
A meta-analysis was performed to examine differences in IQ profiles between individuals with Asperger's disorder (AspD) and high-functioning autism (HFA). Fifty-two studies were included for this study. The results showed that (a) individuals with AspD had significantly higher full-scale IQ, verbal IQ (VIQ), and performance IQ (PIQ) than did…
IUE Data Analysis Software for Personal Computers
NASA Technical Reports Server (NTRS)
Thompson, R.; Caplinger, J.; Taylor, L.; Lawton , P.
1996-01-01
This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.
Performance analysis and improvement of WPAN MAC for home networks.
Mehta, Saurabh; Kwak, Kyung Sup
2010-01-01
The wireless personal area network (WPAN) is an emerging wireless technology for future short range indoor and outdoor communication applications. The IEEE 802.15.3 medium access control (MAC) is proposed to coordinate the access to the wireless medium among the competing devices, especially for short range and high data rate applications in home networks. In this paper we use analytical modeling to study the performance analysis of WPAN (IEEE 802.15.3) MAC in terms of throughput, efficient bandwidth utilization, and delay with various ACK policies under error channel condition. This allows us to introduce a K-Dly-ACK-AGG policy, payload size adjustment mechanism, and Improved Backoff algorithm to improve the performance of the WPAN MAC. Performance evaluation results demonstrate the impact of our improvements on network capacity. Moreover, these results can be very useful to WPAN application designers and protocol architects to easily and correctly implement WPAN for home networking.
Optimum performance of hovering rotors
NASA Technical Reports Server (NTRS)
Wu, J. C.; Goorjian, P. M.
1972-01-01
A theory for the optimum performance of a rotor hovering out of ground effect is developed. The performance problem is formulated using general momentum theory for an infinitely bladed rotor, and the effect of a finite number of blades is estimated. The analysis takes advantage of the fact that a simple relation exists between the radial distributions of static pressure and angular velocity in the ultimate wake, far downstream of the rotor, since the radial velocity vanishes there. This relation permits the establishment of an optimum performance criterion in terms of the ultimate wake velocities by introducing a small local perturbation of the rotational velocity and requiring the resulting ratio of thrust and power changes to be independent of the radial location of the perturbation. This analysis fully accounts for the changes in static pressure distribution and axial velocity distribution throughout the wake as the result of the local perturbation of the rotational velocity component.
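For orientation, the standard momentum-theory result for an ideal, uniformly loaded hovering rotor is reproduced below; it is background context only, not the paper's optimum criterion, which is formulated on the ultimate-wake pressure and swirl distributions. Here T is the thrust, rho the air density, A the disk area, and v_i the induced velocity at the disk.

```latex
% Ideal hover from simple momentum theory (background only, not the paper's criterion)
v_i = \sqrt{\frac{T}{2\rho A}}, \qquad
P_{\text{ideal}} = T\,v_i = \frac{T^{3/2}}{\sqrt{2\rho A}}
```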
Amplify-and-forward cooperative diversity for green UWB-based WBSNs.
Shaban, Heba; Abou El-Nasr, Mohamad
2013-01-01
This paper proposes a novel green cooperative diversity technique based on suboptimal template-based ultra-wideband (UWB) wireless body sensor networks (WBSNs) using amplify-and-forward (AF) relays. In addition, it analyzes the bit-error-rate (BER) performance of the proposed nodes. The analysis is based on the moment-generating function (MGF) of the total signal-to-noise ratio (SNR) at the destination. It also provides an approximate value for the total SNR. The analysis studies the performance of equally correlated binary pulse position modulation (EC-BPPM) assuming the sinusoidal and square suboptimal template pulses. Numerical results are provided for the performance evaluation of optimal and suboptimal template-based nodes with and without relay cooperation. Results show that one relay node provides ~23 dB performance enhancement at 1e - 3 BER, which mitigates the effect of the nondesirable non-line-of-sight (NLOS) links in WBSNs.
Noise and linearity optimization methods for a 1.9GHz low noise amplifier.
Guo, Wei; Huang, Da-Quan
2003-01-01
Noise and linearity performances are critical characteristics for radio frequency integrated circuits (RFICs), especially for low noise amplifiers (LNAs). In this paper, a detailed analysis of noise and linearity for the cascode architecture, a widely used circuit structure in LNA designs, is presented. Noise and linearity improvement techniques for cascode structures are also developed and have been verified by computer simulation experiments. Theoretical analysis and simulation results showed that, for cascode-structure LNAs, the first metal-oxide-semiconductor field effect transistor (MOSFET) dominates the noise performance of the LNA, while the second MOSFET contributes more to the linearity. It is thus concluded that the first and second MOSFETs of the LNA can be designed to optimize the noise performance and the linearity performance separately, without trade-offs. Simulation results for a 1.9 GHz Complementary Metal-Oxide-Semiconductor (CMOS) LNA are also given as an application of the developed theory.
NASA Aeronautics and Space Database for bibliometric analysis
NASA Technical Reports Server (NTRS)
Powers, R.; Rudman, R.
2004-01-01
The authors use the NASA Aeronautics and Space Database to perform bibliometric analysis of citations. This paper explains their research methodology and gives some sample results showing collaboration trends between NASA Centers and other institutions.
Letter Report: Stable Hydrogen and Oxygen Isotope Analysis of B-Complex Groundwater Samples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Brady D.; Moran, James J.; Nims, Megan K.
Report summarizing stable oxygen and hydrogen isotope analysis of two groundwater samples from the B-Complex. Results from analyses were compared to perched water and pore water analyses performed previously.
Caous, Cristofer André; Machado, Birajara; Hors, Cora; Zeh, Andrea Kaufmann; Dias, Cleber Gustavo; Amaro Junior, Edson
2012-01-01
To propose a measure (index) of expected risks to evaluate and follow up the performance of research projects, considering financial parameters and the structure adequate for their development. A ranking of acceptable results for research projects with complex variables was used as an index to gauge project performance. To implement this method, the ulcer index was applied as the basic model to accommodate the following variables: costs, high-impact publications, fund raising, and patent registries. The proposed structured analysis, named here RoSI (Return on Scientific Investment), comprises a pipeline of analysis to characterize risk, based on a modeling tool that handles multiple variables interacting in semi-quantitative environments. The method was tested with data from three different projects in our Institution (projects A, B and C). Different curves reflected the ulcer indexes, identifying the project with the lowest risk (project C) in terms of development and expected results relative to the initial or full investment. The results showed that this model contributes significantly to risk analysis and planning, as well as to the definition of the investments needed when considering contingency actions, with benefits to the different stakeholders: the investor or donor, the project manager and the researchers.
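The ulcer index referenced above originates in finance as a root-mean-square measure of drawdowns from a running peak; a minimal sketch of that base formula, which RoSI adapts to project variables, is shown below with an invented series.

```python
# Minimal sketch of the classical ulcer index: the root-mean-square percentage
# drawdown of a series from its running maximum. RoSI adapts this base measure
# to project variables (costs, publications, funding, patents); the series here
# is an invented placeholder.
import numpy as np

def ulcer_index(values):
    values = np.asarray(values, dtype=float)
    running_peak = np.maximum.accumulate(values)
    drawdown_pct = 100.0 * (values - running_peak) / running_peak
    return np.sqrt(np.mean(drawdown_pct ** 2))

# e.g. a cumulative project-performance series sampled at reporting milestones
series = [100, 104, 98, 95, 102, 110, 107]
print(f"ulcer index: {ulcer_index(series):.2f}")
```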
NASA Astrophysics Data System (ADS)
Klimczak, Marcin; Bojarski, Jacek; Ziembicki, Piotr; Kęskiewicz, Piotr
2017-11-01
The requirements concerning the energy performance of buildings and their internal installations, particularly HVAC systems, have been growing continuously in Poland and all over the world. The existing, traditional calculation methods derived from the static heat exchange model are frequently not sufficient for a sound heating design of a building. Both in Poland and elsewhere in the world, methods and software are employed which allow a detailed simulation of the heating and moisture conditions in a building, as well as an analysis of the performance of HVAC systems within it. However, these tools are usually complex and difficult to use. In addition, developing a simulation model that adequately represents the real building requires considerable designer involvement and is time-consuming and laborious. Simplifying the simulation model of a building makes it possible to reduce the cost of computer simulations. The paper analyses in detail the effect of introducing a number of different variants of the simulation model developed in Design Builder on the quality of the final results obtained. The objective of this analysis is to find simplifications which yield simulation results with an acceptable level of deviation from the detailed model, thus facilitating a quick energy performance analysis of a given building.
Model-based engineering for laser weapons systems
NASA Astrophysics Data System (ADS)
Panthaki, Malcolm; Coy, Steve
2011-10-01
The Comet Performance Engineering Workspace is an environment that enables integrated, multidisciplinary modeling and design/simulation process automation. One of the many multi-disciplinary applications of the Comet Workspace is for the integrated Structural, Thermal, Optical Performance (STOP) analysis of complex, multi-disciplinary space systems containing Electro-Optical (EO) sensors such as those which are designed and developed by and for NASA and the Department of Defense. The Comet software is currently able to integrate performance simulation data and processes from a wide range of 3-D CAD and analysis software programs including CODE V from Optical Research Associates and SigFit from Sigmadyne Inc. which are used to simulate the optics performance of EO sensor systems in space-borne applications. Over the past year, Comet Solutions has been working with MZA Associates of Albuquerque, NM, under a contract with the Air Force Research Laboratories. This funded effort is a "risk reduction effort", to help determine whether the combination of Comet and WaveTrain, a wave optics systems engineering analysis environment developed and maintained by MZA Associates and used by the Air Force Research Laboratory, will result in an effective Model-Based Engineering (MBE) environment for the analysis and design of laser weapons systems. This paper will review the results of this effort and future steps.
NASA Astrophysics Data System (ADS)
Wang, Huiqin; Wang, Xue; Lynette, Kibe; Cao, Minghua
2018-06-01
The performance of multiple-input multiple-output wireless optical communication systems that adopt Q-ary pulse position modulation over a spatially correlated log-normal fading channel is analyzed in terms of the uncoded bit error rate and the ergodic channel capacity. The analysis is based on Wilkinson's method, which approximates the distribution of a sum of correlated log-normal random variables by a single log-normal random variable. The analytical and simulation results corroborate that increasing the correlation coefficients among sub-channels leads to system performance degradation. Moreover, receiver diversity performs better in resisting the channel fading caused by spatial correlation.
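A minimal numerical sketch of the Wilkinson (moment-matching) approximation mentioned above is given below; the Gaussian means, standard deviations, and the common correlation coefficient are illustrative values, not the paper's channel parameters.

```python
# Wilkinson-style moment matching: approximate the sum of correlated log-normal
# random variables exp(Y_i), Y ~ N(mu, Sigma), by a single log-normal exp(W).
# All parameter values below are illustrative placeholders.
import numpy as np

mu = np.array([0.0, 0.0, 0.0])           # means of the underlying Gaussians
sigma = np.array([0.5, 0.5, 0.5])        # standard deviations
rho = 0.3                                # common correlation coefficient
R = np.full((3, 3), rho) + (1 - rho) * np.eye(3)
Sigma = np.outer(sigma, sigma) * R       # covariance matrix

# First moment of the sum
u1 = np.sum(np.exp(mu + 0.5 * sigma ** 2))
# Second moment: E[X_i X_j] = exp(mu_i + mu_j + 0.5*(s_i^2 + s_j^2) + cov_ij)
u2 = sum(np.exp(mu[i] + mu[j] + 0.5 * (sigma[i] ** 2 + sigma[j] ** 2) + Sigma[i, j])
         for i in range(3) for j in range(3))

sigma_w2 = np.log(u2) - 2.0 * np.log(u1)     # variance of the matched Gaussian
mu_w = 2.0 * np.log(u1) - 0.5 * np.log(u2)   # mean of the matched Gaussian
print(f"matched log-normal: mu = {mu_w:.4f}, sigma^2 = {sigma_w2:.4f}")
```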
Virtual DRI dataset development
NASA Astrophysics Data System (ADS)
Hixson, Jonathan G.; Teaney, Brian P.; May, Christopher; Maurer, Tana; Nelson, Michael B.; Pham, Justin R.
2017-05-01
The U.S. Army RDECOM CERDEC NVESD MSD's target acquisition models have been used for many years by the military analysis community for sensor design, trade studies, and field performance prediction. This paper analyzes the results of perception tests performed to compare the results of a field DRI (Detection, Recognition, and Identification) test performed in 2009 with current Soldier performance viewing the same imagery in a laboratory environment and viewing simulated imagery of the same data set. The purpose of the experiment is to build a robust data set for use in the virtual prototyping of infrared sensors. This data set will provide a strong foundation relating model predictions, field DRI results, and simulated imagery.
NASA Technical Reports Server (NTRS)
Kim, S.; Trinh, H. P.
1992-01-01
The paper discusses the performance effects resulting from plugged LOX posts of the Space Shuttle Main Engine injector. The simulation was performed with the REFLEQS 2-D code. The analysis was performed axisymmetrically, and the injector surface was divided into several regions to account for the mixture ratio variation across the injector surface. The reduction in vacuum specific impulse was approximately 0.01 second per plugged LOX post. This reduction is an order of magnitude higher than the result from Space Shuttle flight reconstruction data. It is presumed that this overprediction is due to the axisymmetric simulation, which smears out local effects.
Thermal and Alignment Analysis of the Instrument-Level ATLAS Thermal Vacuum Test
NASA Technical Reports Server (NTRS)
Bradshaw, Heather
2012-01-01
This paper describes the thermal analysis and test design performed in preparation for the ATLAS thermal vacuum test. NASA's Advanced Topographic Laser Altimeter System (ATLAS) will be flown as the sole instrument aboard the Ice, Cloud, and land Elevation Satellite-2 (ICESat-2). It will be used to take measurements of topography and ice thickness for Arctic and Antarctic regions, providing crucial data used to predict future changes in worldwide sea levels. Due to the precise measurements ATLAS is taking, the laser altimeter has very tight pointing requirements. Therefore, the instrument is very sensitive to temperature-induced thermal distortions. For this reason, it is necessary to perform a Structural, Thermal, Optical Performance (STOP) analysis not only for flight, but also to ensure performance requirements can be operationally met during instrument-level thermal vacuum testing. This paper describes the thermal model created for the chamber setup, which was used to generate inputs for the environmental STOP analysis. This paper also presents the results of the STOP analysis, which indicate that the test predictions adequately replicate the thermal distortions predicted for flight. This is a new application of an existing process, as STOP analyses are generally performed to predict flight behavior only. Another novel aspect of this test is that it presents the opportunity to verify pointing results of a STOP model, which is not generally done. It is possible in this case, however, because the actual pointing will be measured using flight hardware during thermal vacuum testing and can be compared to STOP predictions.
Pérez, Teresa; Makrestsov, Nikita; Garatt, John; Torlakovic, Emina; Gilks, C Blake; Mallett, Susan
The Canadian Immunohistochemistry Quality Control program monitors clinical laboratory performance for estrogen receptor and progesterone receptor tests used in breast cancer treatment management in Canada. Current methods assess sensitivity and specificity at each time point, compared with a reference standard. We investigate alternative performance analysis methods to enhance the quality assessment. We used 3 methods of analysis: meta-analysis of sensitivity and specificity of each laboratory across all time points; sensitivity and specificity at each time point for each laboratory; and fitting models for repeated measurements to examine differences between laboratories adjusted by test and time point. Results show 88 laboratories participated in quality control at up to 13 time points using typically 37 to 54 histology samples. In meta-analysis across all time points no laboratories have sensitivity or specificity below 80%. Current methods, presenting sensitivity and specificity separately for each run, result in wide 95% confidence intervals, typically spanning 15% to 30%. Models of a single diagnostic outcome demonstrated that 82% to 100% of laboratories had no difference to reference standard for estrogen receptor and 75% to 100% for progesterone receptor, with the exception of 1 progesterone receptor run. Laboratories with significant differences to reference standard identified with Generalized Estimating Equation modeling also have reduced performance by meta-analysis across all time points. The Canadian Immunohistochemistry Quality Control program has a good design, and with this modeling approach has sufficient precision to measure performance at each time point and allow laboratories with a significantly lower performance to be targeted for advice.
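To illustrate why per-run estimates from 37 to 54 samples carry wide confidence intervals, the sketch below computes a Wilson score interval for sensitivity in a single hypothetical run; the counts are invented placeholders, not program data.

```python
# Why per-run intervals are wide: Wilson score interval for sensitivity estimated
# from a single run of ~40 samples. The counts below are invented placeholders.
from statsmodels.stats.proportion import proportion_confint

true_pos, false_neg = 22, 2                       # hypothetical single-run counts
n_pos = true_pos + false_neg
sens = true_pos / n_pos
lo, hi = proportion_confint(true_pos, n_pos, alpha=0.05, method="wilson")
print(f"sensitivity {sens:.1%}, 95% CI [{lo:.1%}, {hi:.1%}]  (width {hi - lo:.1%})")
```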
Performance assessment for continuing and future operations at Solid Waste Storage Area 6
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1994-02-01
This radiological performance assessment for the continued disposal operations at Solid Waste Storage Area 6 (SWSA 6) on the Oak Ridge Reservation (ORR) has been prepared to demonstrate compliance with the requirements of the US DOE. The analysis of SWSA 6 required the use of assumptions to supplement the available site data when the available data were incomplete for the purpose of analysis. Results indicate that SWSA 6 does not presently meet the performance objectives of DOE Order 5820.2A. Changes in operations and continued work on the performance assessment are expected to demonstrate compliance with the performance objectives for continuing operations at the Interim Waste Management Facility (IWMF). All other disposal operations in SWSA 6 are to be discontinued as of January 1, 1994. The disposal units at which disposal operations are discontinued will be subject to CERCLA remediation, which will result in acceptable protection of the public health and safety.
NASA Technical Management Report (533Q)
NASA Technical Reports Server (NTRS)
Klosko, S. M.; Sanchez, B. (Technical Monitor)
2001-01-01
The objective of this task is analytical support of the NASA Satellite Laser Ranging (SLR) program in the areas of SLR data analysis, software development, assessment of SLR station performance, development of improved models for atmospheric propagation and interpretation of station calibration techniques, and science coordination and analysis functions for the NASA led Central Bureau of the International Laser Ranging Service (ILRS). The contractor shall in each year of the five year contract: (1) Provide software development and analysis support to the NASA SLR program and the ILRS. Attend and make analysis reports at the monthly meetings of the Central Bureau of the ILRS covering data received during the previous period. Provide support to the Analysis Working Group of the ILRS including special tiger teams that are established to handle unique analysis problems. Support the updating of the SLR Bibliography contained on the ILRS web site; (2) Perform special assessments of SLR station performance from available data to determine unique biases and technical problems at the station; (3) Develop improvements to models of atmospheric propagation and for handling pre- and post-pass calibration data provided by global network stations; (4) Provide review presentation of overall ILRS network data results at one major scientific meeting per year; (5) Contribute to and support the publication of NASA SLR and ILRS reports highlighting the results of SLR analysis activity.
A value-based medicine cost-utility analysis of idiopathic epiretinal membrane surgery.
Gupta, Omesh P; Brown, Gary C; Brown, Melissa M
2008-05-01
To perform a reference case, cost-utility analysis of epiretinal membrane (ERM) surgery using current literature on outcomes and complications. Computer-based, value-based medicine analysis. Decision analyses were performed under two scenarios: ERM surgery in better-seeing eye and ERM surgery in worse-seeing eye. The models applied long-term published data primarily from the Blue Mountains Eye Study and the Beaver Dam Eye Study. Visual acuity and major complications were derived from 25-gauge pars plana vitrectomy studies. Patient-based, time trade-off utility values, Markov modeling, sensitivity analysis, and net present value adjustments were used in the design and calculation of results. Main outcome measures included the number of discounted quality-adjusted-life-years (QALYs) gained and dollars spent per QALY gained. ERM surgery in the better-seeing eye compared with observation resulted in a mean gain of 0.755 discounted QALYs (3% annual rate) per patient treated. This model resulted in $4,680 per QALY for this procedure. When sensitivity analysis was performed, utility values varied from $6,245 to $3,746/QALY gained, medical costs varied from $3,510 to $5,850/QALY gained, and ERM recurrence rate increased to $5,524/QALY. ERM surgery in the worse-seeing eye compared with observation resulted in a mean gain of 0.27 discounted QALYs per patient treated. The $/QALY was $16,146 with a range of $20,183 to $12,110 based on sensitivity analyses. Utility values ranged from $21,520 to $12,916/QALY and ERM recurrence rate increased to $16,846/QALY based on sensitivity analysis. ERM surgery is a very cost-effective procedure when compared with other interventions across medical subspecialties.
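The basic arithmetic behind such a cost-utility figure is sketched below: annual utility gains are discounted at 3% per year and the incremental cost is divided by the discounted QALYs gained. The utility gain, time horizon, and cost are placeholders, not the study's inputs.

```python
# Minimal cost-utility arithmetic: discount an annual utility gain at 3% per year
# and divide incremental cost by discounted QALYs gained. All values below are
# placeholders, not the figures from this study.
def discounted_qalys(annual_utility_gain: float, years: int, rate: float = 0.03) -> float:
    return sum(annual_utility_gain / (1.0 + rate) ** t for t in range(1, years + 1))

incremental_cost = 3500.0          # hypothetical incremental cost of surgery ($)
qalys_gained = discounted_qalys(annual_utility_gain=0.08, years=12)
print(f"QALYs gained: {qalys_gained:.3f}")
print(f"cost per QALY: ${incremental_cost / qalys_gained:,.0f}")
```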
RNA-Seq workflow: gene-level exploratory analysis and differential expression
Love, Michael I.; Anders, Simon; Kim, Vladislav; Huber, Wolfgang
2015-01-01
Here we walk through an end-to-end gene-level RNA-Seq differential expression workflow using Bioconductor packages. We will start from the FASTQ files, show how these were aligned to the reference genome, and prepare a count matrix which tallies the number of RNA-seq reads/fragments within each gene for each sample. We will perform exploratory data analysis (EDA) for quality assessment and to explore the relationship between samples, perform differential gene expression analysis, and visually explore the results. PMID:26674615
Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives
NASA Technical Reports Server (NTRS)
Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.
2016-01-01
A new engine cycle analysis tool, called Pycycle, was recently built using the OpenMDAO framework. This tool uses equilibrium chemistry based thermodynamics, and provides analytic derivatives. This allows for stable and efficient use of gradient-based optimization and sensitivity analysis methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a multi-point turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
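The cycle tool itself is built on OpenMDAO; as a stand-in illustration of why analytic derivatives matter for gradient-based optimization, the sketch below minimizes a toy function with SciPy once with an exact gradient and once with finite-difference approximation. It is not the PyCycle engine model.

```python
# Toy illustration of analytic vs finite-difference derivatives (not the
# PyCycle/OpenMDAO engine model): minimize the Rosenbrock function with an
# exact analytic gradient and with finite-difference gradient approximation.
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

x0 = np.array([-1.2, 1.0, 0.8, 1.1])

analytic = minimize(rosen, x0, jac=rosen_der, method="BFGS")
finite_diff = minimize(rosen, x0, jac=None, method="BFGS")  # gradient by finite differences

print(f"analytic:    {analytic.nfev} function evals, f = {analytic.fun:.2e}")
print(f"finite diff: {finite_diff.nfev} function evals, f = {finite_diff.fun:.2e}")
```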
Numerical Analysis of Coolant Flow and Heat Transfer in ITER Diagnostic First Wall
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khodak, A.; Loesser, G.; Zhai, Y.
2015-07-24
We performed numerical simulations of the ITER Diagnostic First Wall (DFW) using ANSYS Workbench. During operation the DFW will include a solid main body as well as liquid coolant. Thus, thermal and hydraulic analysis of the DFW was performed using a conjugated heat transfer approach, in which heat transfer was resolved in both the solid and liquid parts, while fluid dynamics analysis was performed only in the liquid part. This approach includes the interface between the solid and liquid parts of the system. The analysis was performed using ANSYS CFX software, which allows solution of the heat transfer equations in the solid and liquid parts and solution of the flow equations in the liquid part. Coolant flow in the DFW was assumed turbulent and was resolved using the Reynolds-averaged Navier-Stokes equations with the Shear Stress Transport turbulence model. Meshing was performed using the CFX method available within ANSYS. The data cloud for thermal loading, consisting of volumetric heating and surface heating, was imported into CFX. The volumetric heating source was generated using Attila software, and surface heating was obtained using radiation heat transfer analysis. Our results allowed us to identify areas of excessive heating. Proposals for cooling channel relocation were made, and additional suggestions were made to improve the hydraulic performance of the cooling system.
Structural Analysis of Kufasat Using Ansys Program
NASA Astrophysics Data System (ADS)
Al-Maliky, Firas T.; AlBermani, Mohamed J.
2018-03-01
The current work focuses on vibration and modal analysis of the KufaSat structure using the ANSYS 16 program. Three types of aluminum alloys (5052-H32, 6061-T6 and 7075-T6) were selected for investigation of the structure under design loads. Finite element analysis (FEA) under a design static load of 51 g was performed. The natural frequencies for five modes were estimated using modal analysis. In order to ensure that KufaSat could withstand the various conditions during launch, the margin of safety was calculated. Deformation and Von Mises stress results from a linear buckling analysis were also obtained. The data were compared in order to select the optimum material for the KufaSat structure.
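The margin-of-safety bookkeeping mentioned above typically follows the standard form sketched below; the allowable stress, applied stress, and factor of safety are placeholders rather than the KufaSat values.

```python
# Standard margin-of-safety bookkeeping used in structural sizing; the stresses
# and factor of safety below are placeholders, not the KufaSat analysis values.
def margin_of_safety(allowable_stress_mpa, applied_stress_mpa, factor_of_safety=1.25):
    return allowable_stress_mpa / (factor_of_safety * applied_stress_mpa) - 1.0

# e.g. an Al 7075-T6 yield allowable (~503 MPa) against a hypothetical peak Von Mises stress
ms = margin_of_safety(allowable_stress_mpa=503.0, applied_stress_mpa=210.0)
print(f"margin of safety: {ms:+.2f}  (positive means the design passes)")
```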
ATD-1 Avionics Phase 2: Post-Flight Data Analysis Report
NASA Technical Reports Server (NTRS)
Scharl, Julien
2017-01-01
This report aims to satisfy Air Traffic Management Technology Demonstration - 1 (ATD-1) Statement of Work (SOW) 3.6.19 and serves as the delivery mechanism for the analysis described in Annex C of the Flight Test Plan. The report describes the data collected and derived as well as the analysis methodology and associated results extracted from the data set collected during the ATD-1 Flight Test. All analyses described in the SOW were performed and are covered in this report except for the analysis of Final Approach Speed and its effect on performance. This analysis was de-prioritized and, at the time of this report, is not considered feasible in the schedule and costs remaining.
NASA Technical Reports Server (NTRS)
Ling, Lisa
2014-01-01
For the purpose of performing safety analysis and risk assessment for a potential off-nominal atmospheric reentry resulting in vehicle breakup, a synthesis of trajectory propagation coupled with thermal analysis and the evaluation of node failure is required to predict the sequence of events, the timeline, and the progressive demise of spacecraft components. To provide this capability, the Simulation for Prediction of Entry Article Demise (SPEAD) analysis tool was developed. The software and methodology have been validated against actual flights, telemetry data, and validated software, and safety/risk analyses were performed for various programs using SPEAD. This report discusses the capabilities, modeling, validation, and application of the SPEAD analysis tool.
Caudullo, Giorgio; Caruso, Valentina; Cappella, Annalisa; Sguazza, Emanuela; Mazzarelli, Debora; Amadasi, Alberto; Cattaneo, Cristina
2017-01-01
When forensic pathologists and anthropologists have to deal with the evaluation of the post-mortem interval (PMI) in skeletal remains, luminol testing is frequently performed as a preliminary screening method. However, the repeatability of this test on the same bone, as well as comparative studies on different bones of the same individual, has never been assessed. Therefore, with the aim of investigating the influence that different types of bones may exert on the response to the luminol test, the present study analysed three different skeletal elements (femoral diaphysis, vertebra and cranial vault) gathered from ten recently exhumed skeletons (all with a 20-year PMI). The analysis was performed twice on the same bone, 2 months apart: the analysis at time 0 concerned the whole bone, whereas the second concerned only a part of the same bone taken during the first test (which had already been broken). The overall results showed different responses depending on the type of bone and on the integrity of the samples. Negative results at the first analysis (6.6% of the total samples) are consistent with what is reported in the literature, whilst at the second analysis the increase of about 20% in false-negative results highlights that the luminol test ought to be performed with caution in the case of broken bones or elements which are taphonomically altered. Results have thus shown that exposure to environmental agents might result in haemoglobin (Hb) loss, as detected even after only 2 months. The study also focused on the crucial issue of the type of bone subjected to testing, remarking on the suitability of the femoral diaphysis (100% positive responses at the first analysis vs only 18% false-negative results at the second test, corresponding to 5% of total false-negative results) as opposed to other bone elements that showed a low yield. In particular, the cranial vault gave poor results, with 40% discrepancy between the results of the two analyses, which suggests caution in choosing the type of bone sample to test. In conclusion, luminol testing should be used with caution on bones other than long bones or on non-intact bones.
Rotor design optimization using a free wake analysis
NASA Technical Reports Server (NTRS)
Quackenbush, Todd R.; Boschitsch, Alexander H.; Wachspress, Daniel A.; Chua, Kiat
1993-01-01
The aim of this effort was to develop a comprehensive performance optimization capability for tiltrotor and helicopter blades. The analysis incorporates the validated EHPIC (Evaluation of Hover Performance using Influence Coefficients) model of helicopter rotor aerodynamics within a general linear/quadratic programming algorithm that allows optimization using a variety of performance-based objective functions. The resulting computer code, EHPIC/HERO (HElicopter Rotor Optimization), improves upon several features of the previous EHPIC performance model and allows optimization utilizing a wide spectrum of design variables, including twist, chord, anhedral, and sweep. The new analysis supports optimization of a variety of objective functions, including weighted measures of rotor thrust, power, and propulsive efficiency. The fundamental strength of the approach is that an efficient search for improved versions of the baseline design can be carried out while retaining the demonstrated accuracy inherent in the EHPIC free wake/vortex lattice performance analysis. Sample problems are described that demonstrate the success of this approach for several representative rotor configurations in hover and axial flight. Features that were introduced to convert earlier demonstration versions of this analysis into a generally applicable tool for researchers and designers are also discussed.
NASA Astrophysics Data System (ADS)
Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.
2017-08-01
While advanced numerical techniques in slope stability analysis are successfully used in deterministic studies, they have so far found limited use in probabilistic analyses due to their high computational cost. The first-order reliability method (FORM) is one of the most efficient probabilistic techniques for performing probabilistic stability analysis by considering the associated uncertainties in the analysis parameters. However, it is not possible to use FORM directly in numerical slope stability evaluations, as it requires the definition of a limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on the response surface method, in which an explicit performance function is fitted to the results of numerical simulations so that FORM can be applied. The implementation of the proposed methodology is performed by considering a large potential rock wedge at the Sumela Monastery, Turkey. The ability of the developed performance function to represent the limit state surface accurately is evaluated by monitoring the slope behavior. The calculated probability of failure is compared with that from the Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, while its accuracy decreases, with an error of 24%.
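As a hedged sketch of the FORM step described above (not the authors' implementation), the following code assumes an invented quadratic response-surface performance function and invented random-variable statistics, and applies the standard Hasofer-Lind-Rackwitz-Fiessler (HL-RF) iteration to obtain the reliability index and the FORM estimate of the probability of failure.

```python
# Hedged sketch: FORM (HL-RF iteration) applied to an assumed quadratic
# response-surface performance function g(x) for a rock wedge.  All
# coefficients and random-variable statistics are invented for illustration.
import numpy as np
from scipy.stats import norm

def g(x):
    phi, c = x                      # friction angle [deg], cohesion [kPa]
    return -1.0 + 0.02 * phi + 0.015 * c + 1e-4 * phi * c

mu = np.array([35.0, 25.0])         # assumed means
sigma = np.array([5.0, 8.0])        # assumed standard deviations

def grad_g(x, h=1e-6):
    """Central-difference gradient of g in physical space."""
    out = np.zeros(2)
    for i in range(2):
        d = np.zeros(2); d[i] = h
        out[i] = (g(x + d) - g(x - d)) / (2 * h)
    return out

u = np.zeros(2)                     # start at the mean in standard normal space
for _ in range(100):
    x = mu + sigma * u
    gx, grad_u = g(x), grad_g(x) * sigma                       # chain rule to u-space
    u_new = ((grad_u @ u - gx) / (grad_u @ grad_u)) * grad_u   # HL-RF update
    if np.linalg.norm(u_new - u) < 1e-9:
        u = u_new
        break
    u = u_new

beta = np.linalg.norm(u)
print(f"reliability index beta = {beta:.3f}, Pf (FORM) = {norm.cdf(-beta):.3e}")
```

In a study like the one above, the response-surface coefficients would be refitted from the numerical slope-stability runs and the FORM failure probability cross-checked against Monte Carlo simulation.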
Session 6: Dynamic Modeling and Systems Analysis
NASA Technical Reports Server (NTRS)
Csank, Jeffrey; Chapman, Jeffryes; May, Ryan
2013-01-01
These presentations cover some of the ongoing work in dynamic modeling and dynamic systems analysis. The first presentation discusses dynamic systems analysis and how to integrate dynamic performance information into the systems analysis. The ability to evaluate the dynamic performance of an engine design may allow tradeoffs between the dynamic performance and operability of a design, resulting in a more efficient engine design. The second presentation discusses the Toolbox for Modeling and Analysis of Thermodynamic Systems (T-MATS). T-MATS is a simulation system with a library containing the basic building blocks that can be used to create dynamic thermodynamic systems. Some of the key features include turbomachinery components, such as turbines and compressors, and basic control system blocks. T-MATS is written in the MATLAB/Simulink environment and is open source software. The third presentation focuses on getting additional performance from the engine by allowing the limit regulators to be active only when a limit is in danger of being violated. Typical aircraft engine control architecture is based on a min-max scheme, which is designed to keep the engine operating within prescribed mechanical/operational safety limits. Using a conditionally active min-max limit regulator scheme, additional performance can be gained by disabling non-relevant limit regulators.
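To make the conditionally active limit-regulation idea concrete, here is a heavily simplified, hedged sketch (not T-MATS code); the regulators, gains, limits, and margins are invented, and real engine controls would use scheduled PI regulators rather than the deliberately conservative proportional stand-ins shown.

```python
# Simplified illustration of a classic min-max limit-selection scheme versus a
# conditionally active variant for a notional engine fuel-flow command.
# Gains, limits, and signal values are invented for demonstration only.

def setpoint_cmd(speed, speed_ref, k=0.02):
    return k * (speed_ref - speed)            # proportional speed regulator

def limit_cmd(value, limit, k=0.01):
    return k * (limit - value)                # proportional limit regulator

def minmax_select(speed, speed_ref, t4, t4_max, ps3, ps3_min):
    """Classic scheme: every limit regulator is always in the selection."""
    cmd = setpoint_cmd(speed, speed_ref)
    cmd = min(cmd, limit_cmd(t4, t4_max))     # max limits go through min-select
    cmd = max(cmd, limit_cmd(ps3, ps3_min))   # min limits go through max-select
    return cmd

def conditional_minmax(speed, speed_ref, t4, t4_max, ps3, ps3_min,
                       t4_margin=50.0, ps3_margin=2.0):
    """Conditionally active scheme: a limit regulator joins the selection only
    when its variable is within an assumed margin of its limit."""
    cmd = setpoint_cmd(speed, speed_ref)
    if t4 >= t4_max - t4_margin:
        cmd = min(cmd, limit_cmd(t4, t4_max))
    if ps3 <= ps3_min + ps3_margin:
        cmd = max(cmd, limit_cmd(ps3, ps3_min))
    return cmd

# Far from the T4 limit, the always-active regulator still caps the command
# (4.0 here), while the conditional scheme keeps the full request (10.0).
print(minmax_select(9000.0, 9500.0, t4=1500.0, t4_max=1900.0, ps3=30.0, ps3_min=10.0))
print(conditional_minmax(9000.0, 9500.0, t4=1500.0, t4_max=1900.0, ps3=30.0, ps3_min=10.0))
```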
Experimental BCAS Performance Results
DOT National Transportation Integrated Search
1978-07-01
The results of the (Litchford) Beacon-based Collision Avoidance System concept feasibility evaluation are reported. Included are a description of the concept, analysis and flight test results. The system concept is based on the range and bearing meas...
Influence of Averaging Preprocessing on Image Analysis with a Markov Random Field Model
NASA Astrophysics Data System (ADS)
Sakamoto, Hirotaka; Nakanishi-Ohno, Yoshinori; Okada, Masato
2018-02-01
This paper describes our investigations into the influence of averaging preprocessing on the performance of image analysis. Averaging preprocessing involves a trade-off: image averaging is often undertaken to reduce noise, but it decreases the number of image data available for image analysis. We formulated a process of generating image data by using a Markov random field (MRF) model to achieve image analysis tasks such as image restoration and hyper-parameter estimation by a Bayesian approach. Following the notions of Bayesian inference, posterior distributions were analyzed to evaluate the influence of averaging. There are three main results. First, we found that the performance of image restoration with a predetermined value for the hyper-parameters is invariant regardless of whether averaging is conducted. We then found that the performance of hyper-parameter estimation deteriorates due to averaging. Our analysis of the negative logarithm of the posterior probability, which is called the free energy based on an analogy with statistical mechanics, indicated that the confidence of hyper-parameter estimation remains higher without averaging. Finally, we found that when the hyper-parameters are estimated from the data, the performance of image restoration worsens as averaging is undertaken. We conclude that averaging adversely influences the performance of image analysis through hyper-parameter estimation.
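A minimal, hedged sketch of Gaussian-MRF image restoration with fixed hyper-parameters is given below; the toy image, noise variance, smoothness weight, and plain gradient descent are assumptions for illustration and do not reproduce the paper's formulation or its free-energy analysis.

```python
# Minimal Gaussian-MRF image restoration sketch (MAP estimate with fixed
# hyper-parameters).  Energy: E(x) = ||y - x||^2 / (2*s2) + (lam/2) * sum of
# squared differences between 4-neighbours.  All values below are assumptions.
import numpy as np

rng = np.random.default_rng(0)
clean = np.zeros((32, 32)); clean[8:24, 8:24] = 1.0     # toy image
s2, lam = 0.1, 2.0                                      # assumed hyper-parameters
y = clean + rng.normal(0.0, np.sqrt(s2), clean.shape)   # noisy observation

def neighbour_sum(x):
    """Sum of 4-connected neighbours, with replicated borders."""
    p = np.pad(x, 1, mode='edge')
    return p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]

x = y.copy()
step = 0.05
for _ in range(500):
    # gradient of the energy: (x - y)/s2 + lam * (4*x - neighbour_sum(x))
    grad = (x - y) / s2 + lam * (4.0 * x - neighbour_sum(x))
    x -= step * grad

print("noisy MSE    :", np.mean((y - clean) ** 2))
print("restored MSE :", np.mean((x - clean) ** 2))
```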
Response of the Alliance 1 Proof-of-Concept Airplane Under Gust Loads
NASA Technical Reports Server (NTRS)
Naser, A. S.; Pototzky, A. S.; Spain, C. V.
2001-01-01
This report presents the work performed by Lockheed Martin's Langley Program Office in support of NASA's Environmental Research Aircraft and Sensor Technology (ERAST) program. The primary purpose of this work was to develop and demonstrate a gust analysis method which accounts for the span-wise variation of gust velocity. This is important because these unmanned aircraft, which have high aspect ratios and low wing loading, are very flexible and fly at low speeds. The main focus of the work was therefore to perform a two-dimensional Power Spectral Density (PSD) analysis of the Alliance 1 Proof-of-Concept Unmanned Aircraft. As of this writing, none of the aircraft described in this report have been constructed; they are concepts represented by analytical models. The process first involved the development of suitable structural and aeroelastic Finite Element Models (FEM). This was followed by development of a one-dimensional PSD gust analysis, and then the two-dimensional PSD analysis of the Alliance 1. For further validation and comparison, two additional analyses were performed. A two-dimensional PSD gust analysis was performed on a simple MSC/NASTRAN example problem. Finally, a one-dimensional discrete gust analysis was performed on the Alliance 1. This report describes this process, shows the relevant comparisons between analytical methods, and discusses the physical meanings of the results.
Impact of Embedded Military Metal Alloys on Skeletal Physiology in an Animal Model
2017-04-04
turnover were completed and statistical comparison performed for each time point. Each ELISA was performed according to the instructions within each kit...expectations for controls. Results of osteocalcin ELISA were evaluated and any results with a coefficient of variation greater than 25% were omitted...Results of TRAP5b ELISA were evaluated and any results with a coefficient of variation greater than 25% were omitted from analysis. Measures of TRAP5b
Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C
2015-02-01
Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in design, analysis methods, and the metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance, with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined. © The Author(s) 2014.
Task complexity, student perceptions of vocabulary learning in EFL, and task performance.
Wu, Xiaoli; Lowyck, Joost; Sercu, Lies; Elen, Jan
2013-03-01
The study deepened our understanding of how students' self-efficacy beliefs contribute to learning in the context of teaching English as a foreign language, within the framework of the cognitive mediational paradigm, at a fine-tuned, task-specific level. The aim was to examine the relationship among task complexity, self-efficacy beliefs, domain-related prior knowledge, learning strategy use, and task performance as they were applied to English vocabulary learning from reading tasks. Participants were 120 second-year university students (mean age 21) from a Chinese university. The experiment had two conditions (simple/complex). A vocabulary level test was first conducted to measure participants' prior knowledge of English vocabulary. Participants were then randomly assigned to one of the learning tasks. Participants were administered task booklets together with the self-efficacy scales, measures of learning strategy use, and post-tests. Data obtained were submitted to multivariate analysis of variance (MANOVA) and path analysis. Results from the MANOVA model showed a significant effect of vocabulary level on self-efficacy beliefs, learning strategy use, and task performance. Task complexity showed no significant effect; however, an interaction effect between vocabulary level and task complexity emerged. Results from the path analysis showed that self-efficacy beliefs had an indirect effect on performance. Our results highlighted the mediating role of self-efficacy beliefs and learning strategy use. Our findings indicate that students' prior knowledge plays a crucial role in both self-efficacy beliefs and task performance, and that the predictive power of self-efficacy on task performance may lie in its association with learning strategy use. © 2011 The British Psychological Society.
Diffusive deposition of aerosols in Phebus containment during FPT-2 test
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kontautas, A.; Urbonavicius, E.
2012-07-01
At present, lumped-parameter codes are the main tools for investigating the complex response of a Nuclear Power Plant containment in case of an accident. Continuous development and validation of the codes is required to perform realistic investigation of the processes that determine the possible source term of radioactive products to the environment. Validation of the codes is based on the comparison of the calculated results with the measurements performed in experimental facilities. The most extensive experimental program to investigate fission product release from the molten fuel, transport through the cooling circuit and deposition in the containment is performed in the PHEBUS test facility. Test FPT-2, performed in this facility, is considered for analysis of the processes taking place in the containment. Earlier investigations using the COCOSYS code showed that the code could be successfully used for analysis of thermal-hydraulic processes and deposition of aerosols, but it was also noticed that diffusive deposition on the vertical walls did not fit well with the measured results. The CPA module of the ASTEC code implements a different model for diffusive deposition; therefore, the PHEBUS containment model was transferred from the COCOSYS code to ASTEC-CPA to investigate the influence of the diffusive deposition modelling. The analysis was performed using a 16-node PHEBUS containment model. The calculated thermal-hydraulic parameters are in good agreement with the measured results, which gives a basis for realistic simulation of aerosol transport and deposition processes. The investigations showed that the diffusive deposition model has an influence on the aerosol deposition distribution on different surfaces in the test facility. (authors)
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-23
... over time. (This study is an institutional analysis only, not a technical analysis, and it is not... Adam Hopps at (202) 680-0091. The ITS JPO will present results from an early analysis of organizational models. This analysis will describe the functions that need to be performed by a CME; identify key...
NASA Astrophysics Data System (ADS)
Song, Young-Joo; Bae, Jonghee; Kim, Young-Rok; Kim, Bang-Yeop
2016-12-01
In this study, the uncertainty requirements for orbit, attitude, and burn performance were estimated and analyzed for the execution of the 1st lunar orbit insertion (LOI) maneuver of the Korea Pathfinder Lunar Orbiter (KPLO) mission. During the early design phase of the system, the associated analysis is an essential design factor, as the 1st LOI maneuver is the largest burn that utilizes the onboard propulsion system; the success of the lunar capture is directly affected by the performance achieved. For the analysis, the spacecraft is assumed to have already approached the periselene with a hyperbolic arrival trajectory around the moon. In addition, diverse arrival conditions and mission constraints were considered, such as varying periselene approach velocity, altitude, and orbital period of the capture orbit after execution of the 1st LOI maneuver. The current analysis assumed an impulsive LOI maneuver, and two-body equations of motion were adopted to simplify the problem for a preliminary analysis. Monte Carlo simulations were performed to statistically analyze the diverse uncertainties that might arise at the moment the maneuver is executed. As a result, three major requirements were analyzed and estimated for the early design phase. First, the minimum requirements on burn performance for the spacecraft to be captured around the moon were estimated. Second, the requirements for orbit, attitude, and maneuver burn performance were simultaneously estimated and analyzed to maintain the 1st elliptical orbit achieved around the moon within the specified orbital period. Finally, the dispersion requirements on the B-plane aiming at target points to meet the target insertion goal were analyzed and can be utilized as reference target guidelines for a mid-course correction (MCC) maneuver during the transfer. More detailed system requirements for the KPLO mission, particularly for the spacecraft bus itself and for the flight dynamics subsystem at the ground control center, are expected to be prepared and established based on the current results, including a contingency trajectory design plan.
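The kind of impulsive, two-body Monte Carlo analysis described above can be sketched as follows; all spacecraft, arrival, and burn values are assumed placeholders rather than KPLO parameters, and the dispersions are limited to burn magnitude and pointing.

```python
# Illustrative Monte Carlo of an impulsive lunar orbit insertion burn under
# two-body assumptions.  All numerical values are assumptions for this sketch,
# not KPLO mission parameters.
import numpy as np

MU_MOON = 4902.8            # km^3/s^2, lunar gravitational parameter
r_p     = 1737.4 + 100.0    # km, assumed periselene radius (100 km altitude)
v_arr   = 2.45              # km/s, assumed hyperbolic periselene speed
dv_nom  = 0.80              # km/s, assumed nominal retrograde delta-v

rng = np.random.default_rng(1)
n = 100_000
dv_mag = dv_nom * (1.0 + rng.normal(0.0, 0.01, n))   # 1% burn-magnitude error
point  = np.deg2rad(rng.normal(0.0, 1.0, n))         # 1 deg pointing error

# Post-burn speed for a retrograde burn with a pointing error (law of cosines).
v_post = np.sqrt(v_arr**2 + dv_mag**2 - 2.0 * v_arr * dv_mag * np.cos(point))

# Specific orbital energy after the burn; capture requires negative energy.
energy = 0.5 * v_post**2 - MU_MOON / r_p
p_capture = np.mean(energy < 0.0)

# Two-body orbital period of the captured cases, for a period-band requirement.
a = -MU_MOON / (2.0 * energy[energy < 0.0])
period_hr = 2.0 * np.pi * np.sqrt(a**3 / MU_MOON) / 3600.0
print(f"capture probability: {p_capture:.4f}")
print(f"median period of captured orbits: {np.median(period_hr):.1f} h")
```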
Development of Flight-Test Performance Estimation Techniques for Small Unmanned Aerial Systems
NASA Astrophysics Data System (ADS)
McCrink, Matthew Henry
This dissertation provides a flight-testing framework for assessing the performance of fixed-wing, small-scale unmanned aerial systems (sUAS) by leveraging sub-system models of components unique to these vehicles. The development of the sub-system models, and their links to broader impacts on sUAS performance, is the key contribution of this work. The sub-system modeling and analysis focuses on the vehicle's propulsion, navigation and guidance, and airframe components. Quantification of the uncertainty in the vehicle's power available and control states is essential for assessing the validity of both the methods and results obtained from flight-tests. Therefore, detailed propulsion and navigation system analyses are presented to validate the flight testing methodology. Propulsion system analysis required the development of an analytic model of the propeller in order to predict the power available over a range of flight conditions. The model is based on the blade element momentum (BEM) method. Additional corrections are added to the basic model in order to capture the Reynolds-dependent scale effects unique to sUAS. The model was experimentally validated using a ground-based testing apparatus. The BEM predictions and experimental analysis allow for a parameterized model relating the electrical power, measurable during flight, to the power available required for vehicle performance analysis. Navigation system details are presented with a specific focus on the sensors used for state estimation, and the resulting uncertainty in vehicle state. Uncertainty quantification is provided by detailed calibration techniques validated using quasi-static and hardware-in-the-loop (HIL) ground-based testing. The HIL methods introduced use a soft real-time flight simulator to provide inertial quality data for assessing overall system performance. Using this tool, the uncertainty in vehicle state estimation based on a range of sensors and vehicle operational environments is presented. The propulsion and navigation system models are used to evaluate flight-testing methods for evaluating fixed-wing sUAS performance. A brief airframe analysis is presented to provide a foundation for assessing the efficacy of the flight-test methods. The flight-testing presented in this work is focused on validating the aircraft drag polar, zero-lift drag coefficient, and span efficiency factor. Three methods are detailed and evaluated for estimating these design parameters. Specific focus is placed on the influence of propulsion and navigation system uncertainty on the resulting performance data. Performance estimates are used in conjunction with the propulsion model to estimate the impact of sensor and measurement uncertainty on the endurance and range of a fixed-wing sUAS. Endurance and range results for a simplistic power available model are compared to the Reynolds-dependent model presented in this work. Additional parameter sensitivity analysis related to state estimation uncertainties encountered in flight-testing is presented. Results from these analyses indicate that the sub-system models introduced in this work are of first-order importance, on the order of a 5-10% change in range and endurance, in assessing the performance of a fixed-wing sUAS.
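The dissertation's parameterized link between electrical power measured in flight and power available can be illustrated with the hedged sketch below; the component efficiencies and the propeller-efficiency curve are invented placeholders standing in for the Reynolds-corrected BEM model developed in the work.

```python
# Hedged sketch: mapping measured electrical power to power available for a
# small electric sUAS.  Component efficiencies and the propeller-efficiency
# curve are illustrative assumptions, not values from the dissertation.
import numpy as np

ETA_ESC   = 0.95          # assumed speed-controller efficiency
ETA_MOTOR = 0.85          # assumed motor efficiency

def eta_prop(J):
    """Assumed propeller efficiency vs. advance ratio J = V / (n * D)."""
    return np.clip(-2.2 * (J - 0.55) ** 2 + 0.72, 0.0, None)

def power_available(p_elec_w, v_mps, rpm, diameter_m):
    """Shaft power is the electrical power degraded by ESC and motor losses;
    the propeller then converts it to thrust power with efficiency eta_prop."""
    n = rpm / 60.0                       # rev/s
    J = v_mps / (n * diameter_m)
    return ETA_ESC * ETA_MOTOR * eta_prop(J) * p_elec_w

# Example flight-test point (all numbers assumed): 180 W electrical power,
# 18 m/s airspeed, 7000 RPM, 0.33 m propeller.
print(f"power available ~ {power_available(180.0, 18.0, 7000.0, 0.33):.1f} W")
```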
More on Time Series Designs: A Reanalysis of Mayer and Kozlow's Data.
ERIC Educational Resources Information Center
Willson, Victor L.
1982-01-01
Differentiating between time-series design and time-series analysis, examines design considerations and reanalyzes data previously reported by Mayer and Kozlow in this journal. The current analysis supports the analysis performed by Mayer and Kozlow but puts the results on a somewhat firmer statistical footing. (Author/JN)
Teodoro, George; Kurc, Tahsin; Kong, Jun; Cooper, Lee; Saltz, Joel
2014-01-01
We study and characterize the performance of operations in an important class of applications on GPUs and Many Integrated Core (MIC) architectures. Our work is motivated by applications that analyze low-dimensional spatial datasets captured by high resolution sensors, such as image datasets obtained from whole slide tissue specimens using microscopy scanners. Common operations in these applications involve the detection and extraction of objects (object segmentation), the computation of features of each extracted object (feature computation), and characterization of objects based on these features (object classification). In this work, we have identified the data access and computation patterns of operations in the object segmentation and feature computation categories. We systematically implement and evaluate the performance of these operations on modern CPUs, GPUs, and MIC systems for a microscopy image analysis application. Our results show that the performance on a MIC of operations that perform regular data access is comparable to, or sometimes better than, that on a GPU. On the other hand, GPUs are significantly more efficient than MICs for operations that access data irregularly. This is a result of the low performance of MICs for random data access. We have also examined the coordinated use of MICs and CPUs. Our experiments show that using a performance-aware task scheduling strategy for application operations improves performance by about 1.29× over a first-come-first-served strategy. This allows applications to obtain high performance efficiency on CPU-MIC systems - the example application attained an efficiency of 84% on 192 nodes (3072 CPU cores and 192 MICs). PMID:25419088
Probabilistic Analysis of Gas Turbine Field Performance
NASA Technical Reports Server (NTRS)
Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.
2002-01-01
A gas turbine thermodynamic cycle was computationally simulated and probabilistically evaluated in view of the several uncertainties in the performance parameters, which are indices of gas turbine health. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design, enhance performance, increase system availability and make it cost effective. The analysis leads to the selection of the appropriate measurements to be used in the gas turbine health determination and to the identification of both the most critical measurements and parameters. Probabilistic analysis aims at unifying and improving the control and health monitoring of gas turbine aero-engines by increasing the quality and quantity of information available about the engine's health and performance.
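A hedged sketch of this kind of probabilistic cycle evaluation is shown below (it is not the NASA analysis): assumed distributions for component efficiencies and turbine inlet temperature are sampled through a textbook ideal-gas Brayton model, and rank correlations stand in for formal sensitivity factors.

```python
# Hedged probabilistic sketch of a simple Brayton cycle (not the NASA model).
# Component efficiencies and turbine inlet temperature are random variables
# with assumed distributions; thermal efficiency is evaluated, and Spearman
# rank correlations serve as crude sensitivity factors.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(2)
n = 50_000
cp, gamma, T1, PR = 1005.0, 1.4, 288.15, 12.0        # assumed fixed parameters

eta_c = rng.normal(0.86, 0.01, n)                    # compressor efficiency
eta_t = rng.normal(0.89, 0.01, n)                    # turbine efficiency
T3    = rng.normal(1450.0, 15.0, n)                  # turbine inlet temp, K

tau = PR ** ((gamma - 1.0) / gamma)
w_c = cp * T1 * (tau - 1.0) / eta_c                  # compressor work, J/kg
T2  = T1 + w_c / cp
w_t = eta_t * cp * T3 * (1.0 - 1.0 / tau)            # turbine work, J/kg
eta_th = (w_t - w_c) / (cp * (T3 - T2))              # thermal efficiency

print(f"mean eta_th = {eta_th.mean():.3f}, "
      f"1st percentile = {np.percentile(eta_th, 1):.3f}")
for name, x in [("eta_c", eta_c), ("eta_t", eta_t), ("T3", T3)]:
    print(f"sensitivity (Spearman) of eta_th to {name}: "
          f"{spearmanr(x, eta_th)[0]:+.2f}")
```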
Thermodynamic analysis of a new dual evaporator CO2 transcritical refrigeration cycle
NASA Astrophysics Data System (ADS)
Abdellaoui, Ezzaalouni Yathreb; Kairouani, Lakdar Kairouani
2017-03-01
In this work, a new dual-evaporator CO2 transcritical refrigeration cycle with two ejectors is proposed. In this new system, the condensation energy otherwise lost at the gas cooler exit is recovered so that the refrigeration cycle ejectors operate at no additional energy cost, enhancing the system performance and providing dual-temperature refrigeration simultaneously. The effects of some key parameters on the thermodynamic performance of the modified cycle are theoretically investigated based on energetic and exergetic analysis. The simulation results for the modified cycle indicate a system performance improvement of up to 46% over the single-ejector CO2 vapor compression cycle that uses an ejector as an expander. An exergetic analysis of the system is also carried out. The performance characteristics of the proposed cycle show its promise for dual-evaporator refrigeration systems.
Antiwindup analysis and design approaches for MIMO systems
NASA Technical Reports Server (NTRS)
Marcopoli, Vincent R.; Phillips, Stephen M.
1994-01-01
Performance degradation of multiple-input multiple-output (MIMO) control systems having limited actuators is often handled by augmenting the controller with an antiwindup mechanism, which attempts to maintain system performance when limits are encountered. The goals of this paper are: (1) To develop a method to analyze antiwindup systems to determine precisely what stability and performance degradation is incurred under limited conditions. It is shown that by reformulating limited actuator commands as resulting from multiplicative perturbations to the corresponding controller requests, mu-analysis tools can be utilized to obtain quantitative measures of stability and performance degradation. (2) To propose a linear, time invariant (LTI) criterion on which to base the antiwindup design. These analysis and design methods are illustrated through the evaluation of two competing antiwindup schemes augmenting the controller of a Short Take-Off and Vertical Landing (STOVL) aircraft in transition flight.
Antiwindup analysis and design approaches for MIMO systems
NASA Technical Reports Server (NTRS)
Marcopoli, Vincent R.; Phillips, Stephen M.
1993-01-01
Performance degradation of multiple-input multiple-output (MIMO) control systems having limited actuators is often handled by augmenting the controller with an antiwindup mechanism, which attempts to maintain system performance when limits are encountered. The goals of this paper are: 1) to develop a method to analyze antiwindup systems to determine precisely what stability and performance degradation is incurred under limited conditions. It is shown that by reformulating limited actuator commands as resulting from multiplicative perturbations to the corresponding controller requests, mu-analysis tools can be utilized to obtain quantitative measures of stability and performance degradation. 2) To propose a linear, time invariant (LTI) criterion on which to base the antiwindup design. These analysis and design methods are illustrated through the evaluation of two competing antiwindup schemes augmenting the controller of a Short Take-Off and Vertical Landing (STOVL) aircraft in transition flight.
Post2 End-to-End Descent and Landing Simulation for ALHAT Design Analysis Cycle 2
NASA Technical Reports Server (NTRS)
Davis, Jody L.; Striepe, Scott A.; Maddock, Robert W.; Johnson, Andrew E.; Paschall, Stephen C., II
2010-01-01
The ALHAT project is an agency-level program involving NASA centers, academia, and industry, with a primary goal to develop a safe, autonomous, precision-landing system for robotic and crew-piloted lunar and planetary descent vehicles. POST2 is used as the 6DOF descent and landing trajectory simulation for determining integrated system performance of the ALHAT landing-system models and lunar environment models. This paper presents updates in the development of the ALHAT POST2 simulation, as well as preliminary system performance analysis for ALDAC-2 used for the testing and assessment of ALHAT system models. The ALDAC-2 POST2 Monte Carlo simulation results have been generated and focus on HRN model performance with the fully integrated system, as well as performance improvements of the AGNC and TSAR models since the previous design analysis cycle.
Summary Results of the 1980 NACUBO Comparative Performance Study and Investment Questionnaire.
ERIC Educational Resources Information Center
National Association of College and University Business Officers, Washington, DC.
This summary report contains information on the investments of college and university funds. It is the tenth annual study in the series by the National Association of College and University Business Officers (NACUBO). Sections present data and brief analysis on investment performance and endowment characteristics. Under investment performance,…
ERIC Educational Resources Information Center
And Others; Townsend, J. William
1974-01-01
The present study investigated the efficiency of various existing measures, mainly psychological tests, for predicting job performance of mentally retarded workers in a sheltered occupational shop. Results indicated that existing measures are predictive of performance on some but not all jobs in a sheltered workshop. (Author)
ERIC Educational Resources Information Center
Florin-Thuma, Beth C.; Boudreau, John W.
1987-01-01
Investigated the frequent but previously untested assertion that utility analysis can improve communication and decision making about human resource management programs by examining a performance feedback intervention in a small fast-food store. Results suggest substantial payoffs from performance feedback, though the store's owner-managers had…
NASA Technical Reports Server (NTRS)
Hailperin, Max
1993-01-01
This thesis provides design and analysis of techniques for global load balancing on ensemble architectures running soft-real-time object-oriented applications with statistically periodic loads. It focuses on estimating the instantaneous average load over all the processing elements. The major contribution is the use of explicit stochastic process models for both the loading and the averaging itself. These models are exploited via statistical time-series analysis and Bayesian inference to provide improved average load estimates, and thus to facilitate global load balancing. This thesis explains the distributed algorithms used and provides some optimality results. It also describes the algorithms' implementation and gives performance results from simulation. These results show that our techniques allow more accurate estimation of the global system loading, resulting in fewer object migrations than local methods. Our method is shown to provide superior performance, relative not only to static load-balancing schemes but also to many adaptive methods.
The magnetotelluric phase tensor analysis of the Sembalun-Propok area, West Nusa Tenggara, Indonesia
NASA Astrophysics Data System (ADS)
Febriani, F.; Widarto, D. S.; Gaffar, E.; Nasution, A.; Grandis, H.
2017-04-01
The subsurface structure of the Sembalun-Propok area, NTB, Indonesia, has been investigated using the magnetotelluric (MT) method. To obtain information on the dimensionality of the regional structure and to determine the regional strike of the study area, phase tensor analysis was performed in this study. The results show that most of the skew angle values (β) are distributed within ± 5°, indicating that the regional structure of the study area can be assumed to be two-dimensional. In addition, to determine the regional strike of the study area, we also calculated the major axes of the phase tensor. The results show that the regional strike of the study area is about N330°E. Based on the results of the phase tensor analysis, we rotated the impedance tensor to N330°E and performed 2-D inversion modeling. The resulting subsurface model is consistent with the geological background of the study area.
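For orientation, the snippet below sketches the standard phase-tensor quantities (after Caldwell et al., 2004) for a single, invented impedance tensor; the skew angle and major-axis azimuth are the quantities the abstract uses to argue two-dimensionality and to estimate the strike.

```python
# Sketch of magnetotelluric phase-tensor quantities from one impedance tensor
# (after Caldwell et al., 2004).  The example impedance values are invented.
import numpy as np

Z = np.array([[0.2 + 0.3j, 4.0 + 5.0j],
              [-3.5 - 4.2j, -0.1 - 0.2j]])            # mV/km/nT, assumed values

X, Y = Z.real, Z.imag
Phi = np.linalg.inv(X) @ Y                            # phase tensor

beta = 0.5 * np.degrees(np.arctan2(Phi[0, 1] - Phi[1, 0],
                                   Phi[0, 0] + Phi[1, 1]))   # skew angle
alpha = 0.5 * np.degrees(np.arctan2(Phi[0, 1] + Phi[1, 0],
                                    Phi[0, 0] - Phi[1, 1]))
strike = alpha - beta                                 # major-axis azimuth

print(f"skew angle beta = {beta:6.2f} deg (|beta| < ~5 deg suggests 2-D)")
print(f"phase-tensor strike estimate = {strike:6.2f} deg (90 deg ambiguity)")
```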
Measurement system analysis of viscometers used for drilling mud characterization
NASA Astrophysics Data System (ADS)
Mat-Shayuti, M. S.; Adzhar, S. N.
2017-07-01
Viscometers in the Faculty of Chemical Engineering, Universiti Teknologi MARA, are subject to heavy utilization by members of the faculty. Due to doubts surrounding their result integrity and maintenance management, a Measurement System Analysis was carried out. Five samples of drilling mud with barite contents varying from 5 to 25 weight% were prepared, and their rheological properties were determined in three trials by three operators using the viscometers. A Gage Linearity and Bias Study was performed using Minitab software; the results show high biases in the range of 19.2% to 38.7%, with a non-linear trend across the measurement span. A nested Gage Repeatability & Reproducibility analysis later produced Percent Repeatability & Reproducibility above 7.7% and Percent Tolerance above 30%. Lastly, good and marginal numbers of Distinct Categories are seen among the results. Despite acceptable performance of the measurement system in Distinct Categories, the poor results in accuracy, linearity, and Percent Repeatability & Reproducibility render the gage generally not capable. Improvement to the measurement system is imminent.
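The bias and linearity portion of such a study can be reproduced in a few lines; the reference and measured values below are invented (they are not the study's data), and Minitab additionally normalizes bias by process variation, which this sketch does not attempt.

```python
# Hedged sketch of a gage bias and linearity calculation on invented data
# (reference vs. measured dial readings); not the study's measurements.
import numpy as np

reference = np.repeat([10.0, 20.0, 30.0, 40.0, 50.0], 3)   # assumed references
measured  = np.array([12.1, 11.8, 12.4, 23.0, 22.6, 23.3,
                      35.1, 34.7, 35.4, 47.5, 47.0, 47.9,
                      60.2, 59.6, 60.5])                    # assumed readings

bias = measured - reference
print(f"overall bias          : {bias.mean():.2f}")
# Crude %bias relative to the mean reference (Minitab normalizes differently).
print(f"overall %bias (crude) : {100 * bias.mean() / reference.mean():.1f}%")

# Linearity: slope of bias vs. reference value; a nonzero slope means the bias
# changes across the measurement span, as reported for these viscometers.
slope, intercept = np.polyfit(reference, bias, 1)
print(f"linearity slope       : {slope:.3f} (bias change per unit reference)")
print(f"predicted bias @ 50   : {slope * 50 + intercept:.2f}")
```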
Automated Thermal Sample Acquisition with Applications
NASA Astrophysics Data System (ADS)
Kooshesh, K. A.; Lineberger, D. H.
2012-03-01
We created an Arduino®-based robot to detect samples subject to an experiment, perform measurements once each sample is located, and store the results for further analysis. We then relate the robot’s performance to an experiment on thermal inertia.
NASA Technical Reports Server (NTRS)
Kapoor, Kamlesh; Anderson, Bernhard H.; Shaw, Robert J.
1994-01-01
A full Navier-Stokes analysis was performed to evaluate the performance of the subsonic diffuser of a NASA Lewis Research Center 70/30 mixed-compression bifurcated supersonic inlet for high speed civil transport application. The PARC3D code was used in the present study. The computations were also performed when approximately 2.5 percent of the engine mass flow was allowed to bypass through the engine bypass doors. The computational results were compared with the available experimental data which consisted of detailed Mach number and total pressure distribution along the entire length of the subsonic diffuser. The total pressure recovery, flow distortion, and crossflow velocity at the engine face were also calculated. The computed surface ramp and cowl pressure distributions were compared with experiments. Overall, the computational results compared well with experimental data. The present CFD analysis demonstrated that the bypass flow improves the total pressure recovery and lessens flow distortions at the engine face.
Statistical analysis of the determinations of the Sun's Galactocentric distance
NASA Astrophysics Data System (ADS)
Malkin, Zinovy
2013-02-01
Based on several tens of R0 measurements made during the past two decades, several studies have been performed to derive the best estimate of R0. Some used just simple averaging to derive a result, whereas others provided comprehensive analyses of possible errors in published results. In either case, detailed statistical analyses of data used were not performed. However, a computation of the best estimates of the Galactic rotation constants is not only an astronomical but also a metrological task. Here we perform an analysis of 53 R0 measurements (published in the past 20 years) to assess the consistency of the data. Our analysis shows that they are internally consistent. It is also shown that any trend in the R0 estimates from the last 20 years is statistically negligible, which renders the presence of a bandwagon effect doubtful. On the other hand, the formal errors in the published R0 estimates improve significantly with time.
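The consistency and trend checks described above can be sketched as follows; the handful of (R0, sigma, year) triples is invented for illustration and does not reproduce the 53 published determinations analyzed in the paper.

```python
# Hedged sketch of the statistical treatment described above: weighted mean,
# a reduced chi-square consistency check, and a simple trend test on R0
# estimates.  The data triples (value kpc, sigma kpc, year) are invented.
import numpy as np
from scipy import stats

data = np.array([  # R0 [kpc], sigma [kpc], publication year -- assumptions
    [8.0, 0.5, 1995], [7.9, 0.4, 1998], [8.3, 0.3, 2002],
    [8.1, 0.4, 2005], [8.4, 0.4, 2008], [8.2, 0.3, 2010],
    [8.3, 0.2, 2012]])
r0, sig, year = data.T

w = 1.0 / sig**2
r0_mean = np.sum(w * r0) / np.sum(w)
err = 1.0 / np.sqrt(np.sum(w))
print(f"weighted mean R0 = {r0_mean:.2f} +/- {err:.2f} kpc")

# Internal consistency: a reduced chi-square near 1 means the quoted errors
# explain the observed scatter.
chi2_red = np.sum(w * (r0 - r0_mean) ** 2) / (len(r0) - 1)
print(f"reduced chi-square = {chi2_red:.2f}")

# Trend with publication year (a significant slope could hint at a bandwagon
# effect); an ordinary least-squares fit is used here for simplicity.
res = stats.linregress(year, r0)
print(f"trend = {res.slope:+.4f} kpc/yr (p = {res.pvalue:.2f})")
```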
Examination of Spectral Transformations on Spectral Mixture Analysis
NASA Astrophysics Data System (ADS)
Deng, Y.; Wu, C.
2018-04-01
While many spectral transformation techniques have been applied to spectral mixture analysis (SMA), few studies have examined their necessity and applicability. This paper focuses on exploring the differences between spectrally transformed schemes and the untransformed scheme to find out which transformed scheme performs better in SMA. In particular, nine spectrally transformed schemes as well as the untransformed scheme were examined in two study areas. Each transformed scheme was tested 100 times using different endmember classes' spectra under the vegetation-high albedo impervious surface area-low albedo impervious surface area-soil (V-ISAh-ISAl-S) endmember model. Performance of each scheme was assessed based on the mean absolute error (MAE). A statistical analysis technique, the paired-samples t test, was applied to test the significance of the difference in mean MAEs between transformed and untransformed schemes. Results demonstrated that only NSMA exceeded the untransformed scheme in both study areas. Some transformed schemes showed unstable performance, since they outperformed the untransformed scheme in one area but weakened the SMA result in the other.
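As a hedged illustration of fully constrained SMA under the four-endmember model mentioned above, the sketch below unmixes one invented pixel against invented endmember spectra and reports a per-band MAE; the study instead averages MAE over many modeled pixels and repeats the test for each transformed scheme.

```python
# Hedged sketch of fully constrained spectral mixture analysis (SMA) for a
# V-ISAh-ISAl-S endmember model.  Endmember spectra and the observed pixel
# are invented numbers, not data from the study.
import numpy as np
from scipy.optimize import lsq_linear

# Rows: bands; columns: endmembers (veg, high-albedo ISA, low-albedo ISA, soil)
E = np.array([[0.05, 0.60, 0.08, 0.20],
              [0.08, 0.62, 0.09, 0.25],
              [0.45, 0.65, 0.10, 0.30],
              [0.50, 0.68, 0.11, 0.35],
              [0.30, 0.70, 0.12, 0.40],
              [0.25, 0.72, 0.13, 0.45]])
pixel = np.array([0.20, 0.23, 0.33, 0.37, 0.33, 0.34])   # observed reflectance

# Enforce sum-to-one with a heavily weighted constraint row, and
# non-negativity through bounds (a common practical trick for FCLS).
W = 1e3
A = np.vstack([E, W * np.ones(E.shape[1])])
b = np.concatenate([pixel, [W]])
fractions = lsq_linear(A, b, bounds=(0.0, 1.0)).x

recon = E @ fractions
mae = np.mean(np.abs(pixel - recon))
print("fractions (V, ISAh, ISAl, S):", np.round(fractions, 3))
print(f"per-band MAE for this pixel: {mae:.4f}")
```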
Van Dessel, E; Fierens, K; Pattyn, P; Van Nieuwenhove, Y; Berrevoet, F; Troisi, R; Ceelen, W
2009-01-01
Approximately 5%-20% of colorectal cancer (CRC) patients present with synchronous, potentially resectable liver metastatic disease. Preclinical and clinical studies suggest a benefit of the 'liver first' approach, i.e. resection of the liver metastasis followed by resection of the primary tumour. A formal decision analysis may support a rational choice between several therapy options. Survival and morbidity data were retrieved from relevant clinical studies identified by a Web of Science search. Data were entered into decision analysis software (TreeAge Pro 2009, Williamstown, MA, USA). Transition probabilities, including the risk of death from complications or disease progression associated with individual therapy options, were entered into the model. Sensitivity analysis was performed to evaluate the model's validity under a variety of assumptions. The result of the decision analysis confirms the superiority of the 'liver first' approach. Sensitivity analysis demonstrated that this conclusion remains valid on condition that the mortality associated with performing the hepatectomy first is < 4.5%, and that the mortality of colectomy performed after hepatectomy is < 3.2%. The results of this decision analysis suggest that, in patients with synchronous resectable colorectal liver metastases, the 'liver first' approach is to be preferred. Randomized trials will be needed to confirm the results of this simulation-based outcome.
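A stripped-down version of such a decision analysis is sketched below; apart from the two mortality thresholds quoted in the abstract, every probability and survival payoff is an invented placeholder, so the output is illustrative only and not clinical guidance.

```python
# Hedged sketch of a two-branch decision analysis ('liver first' vs.
# 'primary first').  Probabilities and survival payoffs are invented except
# for the mortality thresholds quoted in the abstract.

def expected_survival(op1_mortality, op2_mortality, survival_if_completed,
                      survival_if_progression, p_progression_between=0.10):
    """Expected survival (years) for a two-stage strategy.  Patients who die
    at either surgery contribute zero; those who progress between stages get
    the progression payoff; those completing both resections get the full
    payoff."""
    p_alive_1 = 1.0 - op1_mortality
    p_reach_2 = p_alive_1 * (1.0 - p_progression_between)
    p_complete = p_reach_2 * (1.0 - op2_mortality)
    return ((p_alive_1 - p_reach_2) * survival_if_progression
            + p_complete * survival_if_completed)

# 'Liver first': hepatectomy (mortality threshold 4.5% from the abstract),
# then colectomy (threshold 3.2%).  'Primary first' uses assumed values.
liver_first   = expected_survival(0.045, 0.032, 5.0, 1.5)
primary_first = expected_survival(0.020, 0.050, 4.5, 1.5,
                                  p_progression_between=0.20)
print(f"E[survival] liver first  : {liver_first:.2f} y")
print(f"E[survival] primary first: {primary_first:.2f} y")

# One-way sensitivity on hepatectomy mortality, mirroring the threshold
# analysis reported in the abstract.
for m in (0.02, 0.045, 0.08):
    print(f"hepatectomy mortality {m:.3f} -> liver-first EV "
          f"{expected_survival(m, 0.032, 5.0, 1.5):.2f} y")
```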
Time-frequency analysis of human motion during rhythmic exercises.
Omkar, S N; Vyas, Khushi; Vikranth, H N
2011-01-01
Biomechanical signals due to human movements during exercise are represented in the time-frequency domain using the Wigner Distribution Function (WDF). Analysis based on the WDF reveals instantaneous spectral and power changes during a rhythmic exercise. Investigations were carried out on 11 healthy subjects who performed 5 cycles of sun salutation, with a body-mounted Inertial Measurement Unit (IMU) as a motion sensor. The variance of the Instantaneous Frequency (IF) and Instantaneous Power (IP) for performance analysis of the subjects is estimated using a one-way ANOVA model. Results reveal that joint time-frequency analysis of biomechanical signals during motion facilitates a better understanding of grace and consistency during rhythmic exercise.
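For readers unfamiliar with the technique, a minimal discrete pseudo-Wigner-Ville computation on a synthetic chirp is sketched below (the chirp stands in for the IMU signal); the instantaneous frequency is read from the ridge of the distribution, and its variance per repetition is what a one-way ANOVA would compare.

```python
# Hedged sketch of a discrete pseudo Wigner-Ville distribution (WVD) on a
# synthetic chirp; the chirp stands in for the IMU data used in the study.
import numpy as np
from scipy.signal import hilbert

fs, T = 200.0, 4.0
t = np.arange(0.0, T, 1.0 / fs)
x = np.cos(2 * np.pi * (2.0 * t + 1.5 * t ** 2))   # chirp, about 2 -> 14 Hz
z = hilbert(x)                                     # analytic signal
N, L = len(z), 64                                  # L = half lag-window length

wvd = np.zeros((N, 2 * L))
zp = np.pad(z, (L, L))                             # zero-pad for edge lags
k = np.arange(-L, L)
for n in range(N):
    r = zp[L + n + k] * np.conj(zp[L + n - k])     # instantaneous autocorrelation
    wvd[n] = np.abs(np.fft.fft(r))                 # magnitude spectrum over lag

# The discrete WVD doubles frequencies, so divide the frequency axis by two.
freqs = np.abs(np.fft.fftfreq(2 * L, d=1.0 / fs)) / 2.0
inst_freq = freqs[np.argmax(wvd, axis=1)]          # ridge = instantaneous freq
print(f"instantaneous frequency near t=1 s: {inst_freq[int(1 * fs)]:.1f} Hz")
print(f"instantaneous frequency near t=3 s: {inst_freq[int(3 * fs)]:.1f} Hz")
# np.var(inst_freq) per exercise repetition could then feed the one-way ANOVA.
```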
Optimization of structures on the basis of fracture mechanics and reliability criteria
NASA Technical Reports Server (NTRS)
Heer, E.; Yang, J. N.
1973-01-01
A systematic summary of the factors involved in the optimization of a given structural configuration is part of a report resulting from a study of the analysis of the objective function. The predicted reliability of the performance of the finished structure is sharply dependent upon the results of coupon tests. The optimization analysis developed by the study also involves the expected cost of proof testing.
ERIC Educational Resources Information Center
Smedema, Susan Miller; Pfaller, Joseph; Moser, Erin; Tu, Wei-Mo; Chan, Fong
2013-01-01
Objective: To evaluate the measurement structure of the Trait Hope Scale (THS) among individuals with spinal cord injury. Design: Confirmatory factor analysis and reliability and validity analyses were performed. Participants: 242 individuals with spinal cord injury. Results: Results support the two-factor measurement model for the THS with agency…
NASA Technical Reports Server (NTRS)
Hulka, J. R.; Jones, G. W.
2010-01-01
Liquid rocket engines using oxygen and methane propellants are being considered by the National Aeronautics and Space Administration (NASA) for in-space vehicles. This propellant combination has not been previously used in a flight-qualified engine system, so limited test data and analysis results are available at this stage of early development. NASA has funded several hardware-oriented activities with oxygen and methane propellants over the past several years with the Propulsion and Cryogenic Advanced Development (PCAD) project, under the Exploration Technology Development Program. As part of this effort, the NASA Marshall Space Flight Center has conducted combustion, performance, and combustion stability analyses of several of the configurations. This paper summarizes the analyses of combustion and performance as a follow-up to a paper published in the 2008 JANNAF/LPS meeting. Combustion stability analyses are presented in a separate paper. The current paper includes test and analysis results of coaxial element injectors using liquid oxygen and liquid methane or gaseous methane propellants. Several thrust chamber configurations have been modeled, including thrust chambers with multi-element swirl coax element injectors tested at the NASA MSFC, and a uni-element chamber with shear and swirl coax injectors tested at The Pennsylvania State University. Configurations were modeled with two one-dimensional liquid rocket combustion analysis codes, the Rocket Combustor Interaction Design and Analysis (ROCCID), and the Coaxial Injector Combustion Model (CICM). Significant effort was applied to show how these codes can be used to model combustion and performance with oxygen/methane propellants a priori, and what anchoring or calibrating features need to be applied or developed in the future. This paper describes the test hardware configurations, presents the results of all the analyses, and compares the results from the two analytical methods
The efficiency and budgeting of public hospitals: case study of iran.
Yusefzadeh, Hasan; Ghaderi, Hossein; Bagherzade, Rafat; Barouni, Mohsen
2013-05-01
Hospitals are the most costly and important components of any health care system, so it is important to know their economic value, pay attention to their efficiency, and consider factors affecting them. The aim of this study was to assess the technical, scale, and economic efficiency of hospitals in the West Azerbaijan province of Iran, for which Data Envelopment Analysis (DEA) was used to propose a model for operational budgeting. This was a descriptive-analytical study conducted in 2009 with three inputs and two outputs. DEAP 2.1 software was used for data analysis. Slack and radial movements and the surplus of inputs were calculated for the selected hospitals. Finally, a model was proposed for performance-based budgeting of hospitals and health sectors using the DEA technique. The average scores of technical efficiency, pure technical efficiency (managerial efficiency), and scale efficiency of the hospitals were 0.584, 0.782, and 0.771, respectively. In other words, the potential for efficiency improvement in the hospitals, without any increase in costs and with the same amount of inputs, was about 41.5%. Only four hospitals had the maximum level of technical efficiency. Moreover, surplus production factors were evident in these hospitals. Reduction of surplus production factors through comprehensive planning based on the results of the Data Envelopment Analysis can play a major role in cost reduction for hospitals and health sectors. In hospitals with a technical efficiency score of less than one, the original and projected values of inputs were different, resulting in a surplus. Hence, these hospitals should reduce their input values to achieve maximum efficiency and optimal performance. The results of this method can be applied by hospitals as a benchmark for making decisions about resource allocation, linking budgets to performance results, and controlling and improving hospital performance.
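The input-oriented CCR efficiency scores underlying such a DEA study can be computed with a small linear program per hospital; the three-input, two-output data below are invented, not the study's figures, and DEAP 2.1 additionally reports scale efficiency and slacks that this sketch omits.

```python
# Hedged sketch of input-oriented CCR DEA efficiency scores via linear
# programming (scipy linprog); the hospital data below are invented.
import numpy as np
from scipy.optimize import linprog

# Rows = hospitals (DMUs); X: inputs (beds, doctors, nurses), Y: outputs
# (admissions, outpatient visits) -- all values assumed for illustration.
X = np.array([[100.0, 30.0, 80.0],
              [150.0, 45.0, 120.0],
              [90.0,  20.0, 60.0],
              [200.0, 70.0, 150.0]])
Y = np.array([[5000.0, 20000.0],
              [6000.0, 22000.0],
              [4800.0, 19000.0],
              [6500.0, 21000.0]])
n, m = X.shape
s = Y.shape[1]

for o in range(n):
    # Decision variables: [theta, lambda_1..lambda_n]; minimize theta.
    c = np.concatenate([[1.0], np.zeros(n)])
    # Input constraints:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])
    b_in = np.zeros(m)
    # Output constraints: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    b_out = -Y[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(None, None)] + [(0.0, None)] * n)
    print(f"hospital {o + 1}: technical efficiency = {res.x[0]:.3f}")
```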
Hayes, Timothy; Usami, Satoshi; Jacobucci, Ross; McArdle, John J
2015-12-01
In this article, we describe a recent development in the analysis of attrition: using classification and regression trees (CART) and random forest methods to generate inverse sampling weights. These flexible machine learning techniques have the potential to capture complex nonlinear, interactive selection models, yet to our knowledge, their performance in the missing data analysis context has never been evaluated. To assess the potential benefits of these methods, we compare their performance with commonly employed multiple imputation and complete case techniques in 2 simulations. These initial results suggest that weights computed from pruned CART analyses performed well in terms of both bias and efficiency when compared with other methods. We discuss the implications of these findings for applied researchers. (c) 2015 APA, all rights reserved.
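A hedged sketch of the CART-based inverse-probability-of-observation weighting described above is given below, using scikit-learn on a simulated attrition mechanism; the selection model and tree settings are assumptions (depth and leaf-size limits stand in for formal cost-complexity pruning).

```python
# Hedged sketch of CART-based inverse-probability-of-observation weights on
# simulated attrition data (the selection model and sample are invented).
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(3)
n = 2000
x1 = rng.normal(size=n)                    # baseline covariates
x2 = rng.normal(size=n)
# Nonlinear, interactive selection: dropout depends on covariate combinations.
p_observed = 1.0 / (1.0 + np.exp(-(0.5 + 1.5 * (x1 * x2 > 0) - x1**2 / 2)))
observed = rng.random(n) < p_observed

# Fit a shallow tree to predict being observed at follow-up, then weight the
# complete cases by the inverse of their predicted observation probability.
tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50, random_state=0)
tree.fit(np.column_stack([x1, x2]), observed)
p_hat = tree.predict_proba(np.column_stack([x1, x2]))[:, 1].clip(0.05, 1.0)
weights = 1.0 / p_hat[observed]

print(f"observed fraction : {observed.mean():.2f}")
print(f"weight range      : {weights.min():.2f} - {weights.max():.2f}")
# These weights would then feed a weighted analysis of the observed cases
# (e.g., weighted regression) to adjust for the modeled attrition.
```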
Initial Data Analysis Results for ATD-2 ISAS HITL Simulation
NASA Technical Reports Server (NTRS)
Lee, Hanbong
2017-01-01
To evaluate the operational procedures and information requirements for the core functional capabilities of the ATD-2 project, such as the tactical surface metering tool, the APREQ-CFR procedure, and data element exchanges between ramp and tower, human-in-the-loop (HITL) simulations were performed in March 2017. This presentation shows the initial data analysis results from the HITL simulations. With respect to the different runway configurations and metering values in the tactical surface scheduler, various airport performance metrics were analyzed and compared. These metrics include gate holding time, taxi-out time, runway throughput, queue size and wait time in queue, and TMI flight compliance. In addition to the metering value, other factors affecting airport performance in the HITL simulation, including run duration, runway changes, and TMI constraints, are also discussed.
8 years of CPV: ISFOC CPV plants, long-term performance analysis and results
NASA Astrophysics Data System (ADS)
Martínez, María; Sánchez, Daniel; Calvo-Parra, Gustavo; Gil, Eduardo; Hipólito, Ángel; de Gregorio, Fernando; de la Rubia, Oscar
2017-09-01
ISFOC is an R&D center focused on CPV in Puertollano (Spain). It was founded in 2006 and has 2.3 MW of CPV plants in operation and connected to the grid since 2008. Therefore, at the time of the conference, ISFOC has more than 8 years of real operation data. The performance analysis has been focused on the ISFOC - La Nava CPV plant: 800 kW of Concentrix (Soitec), SolFocus, and Isofotón systems, and one flat PV plant mounted on a two-axis tracker. The main result obtained is that the rate of performance decrease for a mature, IEC 62108-certified CPV technology is in the range of flat PV values; this means that CPV technology does not present higher degradation rates than flat PV.
Some Observations on the Current Status of Performing Finite Element Analyses
NASA Technical Reports Server (NTRS)
Raju, Ivatury S.; Knight, Norman F., Jr; Shivakumar, Kunigal N.
2015-01-01
Aerospace structures are complex high-performance structures. Advances in reliable and efficient computing and modeling tools are enabling analysts to consider complex configurations, build complex finite element models, and perform analysis rapidly. Many of the early-career engineers of today are very proficient in the usage of modern computers, computing engines, complex software systems, and visualization tools. These young engineers are becoming increasingly efficient in building complex 3D models of complicated aerospace components. However, current trends demonstrate a blind acceptance of finite element analysis results. This paper is aimed at raising awareness of this situation. Examples of commonly encountered problems are presented. To overcome the current trends, some guidelines and suggestions for analysts, senior engineers, and educators are offered.
Analysis of a digital RF memory in a signal-delay application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jelinek, D.A.
1992-03-01
Laboratory simulation of the approach of a radar fuze towards a target is an important factor in our ability to accurately measure the radar's performance. This simulation is achieved, in part, by dynamically delaying and attenuating the radar's transmitted pulse and sending the result back to the radar's receiver. Historically, the device used to perform the dynamic delay has been a limiting factor in the evaluation of a radar's performance and characteristics. A new device has been proposed that appears to have more capability than previous dynamic delay devices. This device is the digital RF memory. This report presents the results of an analysis of a digital RF memory used in a signal-delay application. 2 refs.
Cluster Correspondence Analysis.
van de Velden, M; D'Enza, A Iodice; Palumbo, F
2017-03-01
A method is proposed that combines dimension reduction and cluster analysis for categorical data by simultaneously assigning individuals to clusters and optimal scaling values to categories in such a way that a single between-cluster variance maximization objective is achieved. In a unified framework, a brief review of alternative methods is provided, and we show that the proposed method is equivalent to GROUPALS applied to categorical data. Performance of the methods is appraised by means of a simulation study. The results of the joint dimension reduction and clustering methods are compared with the so-called tandem approach, a sequential analysis of dimension reduction followed by cluster analysis. The tandem approach is conjectured to perform worse when variables are added that are unrelated to the cluster structure. Our simulation study confirms this conjecture. Moreover, the results of the simulation study indicate that the proposed method also consistently outperforms alternative joint dimension reduction and clustering methods.
Straylight analysis of the BepiColombo Laser Altimeter
NASA Astrophysics Data System (ADS)
Weigel, T.; Rugi-Grond, E.; Kudielka, K.
2008-09-01
The BepiColombo Laser Altimeter (BELA) shall profile the surface of planet Mercury and operates on the day side as well as on the night side. Because of the high thermal loads, most interior surfaces of the front optics, including the baffle, are highly reflective and specular. This handicaps the straylight performance, which is needed to limit the solar background. We present the design measures used to reach an attenuation of about 10^-8. We review the method of backward straylight analysis, which starts the rays at the detector and analyses the results in object space. The backward analysis can be set up quickly and places the burden on computer resources rather than on labor. This is very useful in a conceptual design phase, when a design is iterated and trade-offs are to be performed. For one design, we compare the results with values obtained from a forward analysis.
Design Considerations for a New Terminal Area Arrival Scheduler
NASA Technical Reports Server (NTRS)
Thipphavong, Jane; Mulfinger, Daniel
2010-01-01
Design of a terminal area arrival scheduler depends on the interrelationship between throughput, delay, and controller intervention. The main contribution of this paper is an analysis of this interdependence for several stochastic behaviors of the expected system performance distributions of the aircraft's time of arrival at the meter fix and runway. Results of this analysis serve to guide the scheduler design choices for key control variables. Two types of variables are analyzed: separation buffers and terminal delay margins. The choice of these decision variables was tested using sensitivity analysis. The analysis suggests that it is best to set the separation buffer at the meter fix to its minimum and adjust the runway buffer to attain the desired system performance. Delay margin was found to have the least effect. These results help characterize the variables most influential in the scheduling of terminal area arrivals.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fricke, Brian A; Abdelaziz, Omar; Vineyard, Edward Allan
In this paper, Life Cycle Climate Performance (LCCP) analysis is used to estimate lifetime direct and indirect carbon dioxide equivalent gas emissions of various refrigerant options and commercial refrigeration system designs, including the multiplex DX system with various hydrofluorocarbon (HFC) refrigerants, the HFC/R744 cascade system incorporating a medium-temperature R744 secondary loop, and the transcritical R744 booster system. The results of the LCCP analysis are presented, including the direct and indirect carbon dioxide equivalent emissions for each refrigeration system and refrigerant option. Based on the results of the LCCP analysis, recommendations are given for the selection of low-GWP replacement refrigerants for use in existing commercial refrigeration systems, as well as for the selection of commercial refrigeration system designs with low carbon dioxide equivalent emissions, suitable for new installations.
High-performance parallel analysis of coupled problems for aircraft propulsion
NASA Technical Reports Server (NTRS)
Felippa, C. A.; Farhat, C.; Lanteri, S.; Maman, N.; Piperno, S.; Gumaste, U.
1994-01-01
This research program deals with the application of high-performance computing methods to the analysis of complete jet engines. We initiated this program by applying the two-dimensional parallel aeroelastic codes to the interior gas flow problem of a bypass jet engine. The fluid mesh generation, domain decomposition, and solution capabilities were successfully tested. We then focused attention on methodology for the partitioned analysis of the interaction of the gas flow with a flexible structure and with the fluid mesh motion that results from these structural displacements. This is treated by a new arbitrary Lagrangian-Eulerian (ALE) technique that models the fluid mesh motion as that of a fictitious mass-spring network. New partitioned analysis procedures to treat this coupled three-component problem are developed. These procedures involve delayed corrections and subcycling. Preliminary results on the stability, accuracy, and MPP computational efficiency are reported.
García Vicente, Ana María; Delgado-Bolton, Roberto C; Amo-Salas, Mariano; López-Fidalgo, Jesús; Caresia Aróztegui, Ana Paula; García Garzón, José Ramón; Orcajo Rincón, Javier; García Velloso, María José; de Arcocha Torres, María; Alvárez Ruíz, Soledad
2017-08-01
The detection of occult cancer in patients suspected of having a paraneoplastic neurological syndrome (PNS) poses a diagnostic challenge. The aim of our study was to perform a systematic review and meta-analysis to assess the diagnostic performance of FDG PET for the detection of occult malignant disease responsible for PNS. A systematic review of the literature (MEDLINE, EMBASE, Cochrane, and DARE) was undertaken to identify studies published in any language. The search strategy was structured after addressing clinical questions regarding the validity or usefulness of the test, following the PICO framework. Inclusion criteria were studies involving patients with PNS in whom FDG PET was performed to detect malignancy, and which reported sufficient primary data to allow calculation of diagnostic accuracy parameters. When possible, a meta-analysis was performed to calculate the joint sensitivity, specificity, and detection rate for malignancy (with 95% confidence intervals [CIs]), as well as a subgroup analysis based on patient characteristics (antibodies, syndrome). The comprehensive literature search revealed 700 references. Sixteen studies met the inclusion criteria and were ultimately selected. Most of the studies were retrospective (12/16). For the quality assessment, the QUADAS-2 tool was applied to assess the risk of bias. Across 16 studies (793 patients), the joint sensitivity, specificity, and detection rate for malignancy with FDG PET were 0.87 (95% CI: 0.80-0.93), 0.86 (95% CI: 0.83-0.89), and 14.9% (95% CI: 11.5-18.7), respectively. The area under the curve (AUC) of the summary ROC curve was 0.917. Homogeneity of results was observed for sensitivity but not for specificity. Some of the individual studies showed large 95% CIs as a result of small sample size. The results of our meta-analysis reveal high diagnostic performance of FDG PET in the detection of malignancy responsible for PNS, not affected by the presence of onconeural antibodies or clinical characteristics.
Donato, Gianluca; Bartlett, Marian Stewart; Hager, Joseph C.; Ekman, Paul; Sejnowski, Terrence J.
2010-01-01
The Facial Action Coding System (FACS) [23] is an objective method for quantifying facial movement in terms of component actions. This system is widely used in behavioral investigations of emotion, cognitive processes, and social interaction. The coding is presently performed by highly trained human experts. This paper explores and compares techniques for automatically recognizing facial actions in sequences of images. These techniques include analysis of facial motion through estimation of optical flow; holistic spatial analysis, such as principal component analysis, independent component analysis, local feature analysis, and linear discriminant analysis; and methods based on the outputs of local filters, such as Gabor wavelet representations and local principal components. Performance of these systems is compared to naive and expert human subjects. Best performances were obtained using the Gabor wavelet representation and the independent component representation, both of which achieved 96 percent accuracy for classifying 12 facial actions of the upper and lower face. The results provide converging evidence for the importance of using local filters, high spatial frequencies, and statistical independence for classifying facial actions. PMID:21188284
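The passage above names Gabor wavelet representations as one of the best-performing feature types. The sketch below shows a generic Gabor filter-bank feature extractor of that kind; it is not the authors' pipeline, and the filter parameters, patch size, and the use of OpenCV are assumptions made for illustration.

```python
# Minimal sketch (not the authors' pipeline): a Gabor filter bank applied to a
# face image patch, producing the kind of local, multi-scale/orientation
# features that the study found effective for facial action classification.
import cv2
import numpy as np

def gabor_features(gray_image, scales=(4, 8, 16), orientations=8):
    """Return a feature vector of filter-response magnitudes."""
    feats = []
    for lambd in scales:                      # wavelength of the sinusoid
        for k in range(orientations):
            theta = np.pi * k / orientations  # filter orientation
            kernel = cv2.getGaborKernel(
                ksize=(31, 31), sigma=0.56 * lambd, theta=theta,
                lambd=lambd, gamma=0.5, psi=0)
            response = cv2.filter2D(gray_image.astype(np.float32), cv2.CV_32F, kernel)
            # Downsample the magnitude response to keep the vector small
            feats.append(cv2.resize(np.abs(response), (8, 8)).ravel())
    return np.concatenate(feats)

# Hypothetical usage: feed the vectors to any standard classifier
# (e.g., nearest-neighbour or a linear discriminant, as in the comparison study).
img = np.random.rand(64, 64).astype(np.float32)  # placeholder for an aligned face patch
print(gabor_features(img).shape)
```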
Kim, Jungkyu; Jensen, Erik C; Stockton, Amanda M; Mathies, Richard A
2013-08-20
A fully integrated multilayer microfluidic chemical analyzer for automated sample processing and labeling, as well as analysis using capillary zone electrophoresis is developed and characterized. Using lifting gate microfluidic control valve technology, a microfluidic automaton consisting of a two-dimensional microvalve cellular array is fabricated with soft lithography in a format that enables facile integration with a microfluidic capillary electrophoresis device. The programmable sample processor performs precise mixing, metering, and routing operations that can be combined to achieve automation of complex and diverse assay protocols. Sample labeling protocols for amino acid, aldehyde/ketone and carboxylic acid analysis are performed automatically followed by automated transfer and analysis by the integrated microfluidic capillary electrophoresis chip. Equivalent performance to off-chip sample processing is demonstrated for each compound class; the automated analysis resulted in a limit of detection of ~16 nM for amino acids. Our microfluidic automaton provides a fully automated, portable microfluidic analysis system capable of autonomous analysis of diverse compound classes in challenging environments.
The dependability of medical students' performance ratings as documented on in-training evaluations.
van Barneveld, Christina
2005-03-01
To demonstrate an approach to obtain an unbiased estimate of the dependability of students' performance ratings during training, when the data-collection design includes nesting of student in rater, unbalanced nest sizes, and dependent observations. In 2003, two variance components analyses of in-training evaluation (ITE) report data were conducted using urGENOVA software. In the first analysis, the dependability for the nested and unbalanced data-collection design was calculated. In the second analysis, an approach using multiple generalizability studies was used to obtain an unbiased estimate of the student variance component, resulting in an unbiased estimate of dependability. Results suggested that there is bias in estimates of the dependability of students' performance on ITEs that are attributable to the data-collection design. When the bias was corrected, the results indicated that the dependability of ratings of student performance was almost zero. The combination of the multiple generalizability studies method and the use of specialized software provides an unbiased estimate of the dependability of ratings of student performance on ITE scores for data-collection designs that include nesting of student in rater, unbalanced nest sizes, and dependent observations.
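For readers unfamiliar with generalizability theory, the sketch below shows how a dependability (phi) coefficient is formed from estimated variance components in a simple crossed person-by-rater design. The study itself used a nested, unbalanced design analysed with urGENOVA, so this is only a schematic illustration; the variance-component values are hypothetical.

```python
# Minimal sketch (assumed simple crossed p x r design, unlike the nested,
# unbalanced design analysed with urGENOVA in the study): dependability (phi)
# from estimated variance components.
def dependability(var_person, var_rater, var_residual, n_raters):
    """Phi coefficient: universe-score variance over itself plus absolute error."""
    abs_error = (var_rater + var_residual) / n_raters
    return var_person / (var_person + abs_error)

# Hypothetical variance-component estimates; a small person component relative
# to rater and residual components drives dependability toward zero.
print(round(dependability(var_person=0.05, var_rater=0.30,
                          var_residual=0.90, n_raters=4), 3))
```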
Effects of user mental state on EEG-BCI performance.
Myrden, Andrew; Chau, Tom
2015-01-01
Changes in psychological state have been proposed as a cause of variation in brain-computer interface performance, but little formal analysis has been conducted to support this hypothesis. In this study, we investigated the effects of three mental states-fatigue, frustration, and attention-on BCI performance. Twelve able-bodied participants were trained to use a two-class EEG-BCI based on the performance of user-specific mental tasks. Following training, participants completed three testing sessions, during which they used the BCI to play a simple maze navigation game while periodically reporting their perceived levels of fatigue, frustration, and attention. Statistical analysis indicated that there is a significant relationship between frustration and BCI performance while the relationship between fatigue and BCI performance approached significance. BCI performance was 7% lower than average when self-reported fatigue was low and 7% higher than average when self-reported frustration was moderate. A multivariate analysis of mental state revealed the presence of contiguous regions in mental state space where BCI performance was more accurate than average, suggesting the importance of moderate fatigue for achieving effortless focus on BCI control, frustration as a potential motivating factor, and attention as a compensatory mechanism to increasing frustration. Finally, a visual analysis showed the sensitivity of underlying class distributions to changes in mental state. Collectively, these results indicate that mental state is closely related to BCI performance, encouraging future development of psychologically adaptive BCIs.
NASA Astrophysics Data System (ADS)
Hasyim, M.; Prastyo, D. D.
2018-03-01
Survival analysis models the relationship between independent variables and survival time as the dependent variable. In practice, not all survival data can be recorded completely, for various reasons; in such situations the data are called censored data. Moreover, several models for survival analysis require strict assumptions. One approach in survival analysis is nonparametric modeling, which imposes more relaxed assumptions. In this research, the nonparametric approach employed is Multivariate Adaptive Regression Splines (MARS). This study aims to measure the performance of private university lecturers. The survival time in this study is the duration a lecturer needs to obtain their professional certificate. The results show that research activity is a significant factor, along with developing course materials, publishing in international or national journals, and participating in research collaborations.
Methodological aspects of fuel performance system analysis at raw hydrocarbon processing plants
NASA Astrophysics Data System (ADS)
Kulbjakina, A. V.; Dolotovskij, I. V.
2018-01-01
The article discusses the methodological aspects of fuel performance system analysis at raw hydrocarbon (RH) processing plants. Modern RH processing facilities are major consumers of energy resources (ER) for their own needs. Reducing ER consumption, including fuel, and developing a rational fuel system structure are complex and relevant scientific tasks that can only be accomplished using system analysis and complex system synthesis. In accordance with the principles of system analysis, the hierarchical structure of the fuel system, the block scheme for synthesizing the most efficient alternative fuel system configuration using mathematical models, and the set of performance criteria have been developed for the main stages of the study. Results from the introduction of specific engineering solutions for developing in-house energy supply sources at RH processing facilities are provided.
Musicians, postural quality and musculoskeletal health: A literature review.
Blanco-Piñeiro, Patricia; Díaz-Pereira, M Pino; Martínez, Aurora
2017-01-01
An analysis was made of the salient characteristics of research papers published between 1989 and 2015 that evaluate the relationship between postural quality during musical performance and various performance-quality and health factors, with emphasis on musculoskeletal health variables. Medline, Scopus and Google Scholar were searched for papers addressing the study objective. The following MeSH descriptors were used: posture; postural balance; muscle, skeletal; task performance and analysis; back; and spine and music. A descriptive statistical analysis of their methodology (sample types, temporal design, and the postural, health and other variables analysed) and findings was made. The inclusion criterion was that the body postural quality of the musicians during performance was included among the target study variables. Forty-one relevant empirical studies were found, all written in English. Comparison and analysis of their results was hampered by great disparities in measuring instruments and operationalization of variables. Despite the growing interest in the relationships among these variables, the empirical knowledge base still has many limitations, making rigorous comparative analysis difficult. Copyright © 2016 Elsevier Ltd. All rights reserved.
Transmission Index Research of Parallel Manipulators Based on Matrix Orthogonal Degree
NASA Astrophysics Data System (ADS)
Shao, Zhu-Feng; Mo, Jiao; Tang, Xiao-Qiang; Wang, Li-Ping
2017-11-01
A performance index is the standard of performance evaluation and the foundation of both performance analysis and optimal design for the parallel manipulator. Seeking suitable kinematic indices is always an important and challenging issue for the parallel manipulator. Although there are extensive studies in this field, few existing indices meet all the requirements of being simple, intuitive, and universal. To solve this problem, the matrix orthogonal degree is adopted, and generalized transmission indices that can evaluate the motion/force transmissibility of fully parallel manipulators are proposed. A transmission performance analysis of typical branches, end effectors, and parallel manipulators is given to illustrate the proposed indices and analysis methodology. Simulation and analysis results reveal that the proposed transmission indices possess significant advantages: they are normalized and finite (ranging from 0 to 1), dimensionally homogeneous, frame-free, intuitive, and easy to calculate. Moreover, the proposed indices indicate the good-transmission region and the relation to singularity with better resolution than the traditional local conditioning index, and they provide a novel tool for kinematic analysis and optimal design of fully parallel manipulators.
Consistency of performance of robot-assisted surgical tasks in virtual reality.
Suh, I H; Siu, K-C; Mukherjee, M; Monk, E; Oleynikov, D; Stergiou, N
2009-01-01
The purpose of this study was to investigate consistency of performance of robot-assisted surgical tasks in a virtual reality environment. Eight subjects performed two surgical tasks, bimanual carrying and needle passing, with both the da Vinci surgical robot and a virtual reality equivalent environment. Nonlinear analysis was utilized to evaluate consistency of performance by calculating the regularity and the amount of divergence in the movement trajectories of the surgical instrument tips. Our results revealed that movement patterns for both training tasks were statistically similar between the two environments. Consistency of performance as measured by nonlinear analysis could be an appropriate methodology to evaluate the complexity of the training tasks between actual and virtual environments and assist in developing better surgical training programs.
Performance management and goal ambiguity: managerial implications in a single payer system.
Calciolari, Stefano; Cantù, Elena; Fattore, Giovanni
2011-01-01
Goal ambiguity influences the effectiveness of performance management systems to drive organizations toward enhanced results. The literature analyzes the antecedents of goal ambiguity and shows the influence of goal ambiguity on the performance of U.S. federal agencies. However, no study has analyzed goal ambiguity in other countries or in health care systems. This study has three aims: to test the validity of a measurement instrument for goal ambiguity, to investigate its main antecedents, and to explore the relationship between goal ambiguity and organizational performance in a large, public, Beveridge-type health care system. A nationwide survey of general managers of the Italian national health system was performed. A factor analysis was used to validate the mono-dimensionality of an instrument that measured goal ambiguity. Structural equation modeling was used to test both the antecedents and the influence of goal ambiguity on organizational performance. Data from 135 health care organizations (53% response rate) were available for analysis. The results confirm the mono-dimensionality of the instrument, the existence of two environmental sources of ambiguity (political endorsement and governance commitment), and the negative relationship between goal ambiguity and organizational performance. Goal ambiguity matters because it may hamper organizational performance. Therefore, performance should be fostered by reducing goal ambiguity (e.g., goal-setting model, funding arrangements, and political support). Mutatis mutandis, our results may apply to public health care systems of other countries or other "public interest" sectors, such as social care and education.
A novel integrated assessment methodology of urban water reuse.
Listowski, A; Ngo, H H; Guo, W S; Vigneswaran, S
2011-01-01
Wastewater is no longer considered a waste product, and water reuse needs to play a stronger part in securing urban water supply. Although treatment technologies for water reclamation have significantly improved, the question that deserves further analysis is how the selection of a particular wastewater treatment technology relates to performance and sustainability. The proposed assessment model integrates: (i) technology, characterised by selected quantity and quality performance parameters; (ii) productivity, efficiency and reliability criteria; (iii) quantitative performance indicators; and (iv) development of an evaluation model. The challenges related to the hierarchy and selection of performance indicators have been resolved through the case study analysis. The goal of this study is to validate a new assessment methodology in relation to the performance of microfiltration (MF) technology, a key element of the treatment process. Specific performance data and measurements were obtained at dedicated Control and Data Acquisition Points (CP) to satisfy the input-output inventory in relation to water resources, products, material flows, energy requirements, chemicals use, etc. The performance assessment process contains analysis and the necessary linking across important parametric functions, leading to reliable outcomes and results.
RERTR-13 Irradiation Summary Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
D. M. Perez; M. A. Lillo; G. S. Chang
2012-09-01
The Reduced Enrichment for Research and Test Reactor (RERTR) experiment RERTR-13 was designed to assess the performance of different types of neutron absorbers that can potentially be used as burnable poisons in low-enriched uranium-molybdenum based dispersion and monolithic fuels. The following report summarizes the life of the RERTR-13 experiment through the end of irradiation, including as-run neutronic analysis results, thermal analysis results, and hydraulic testing results.
Pediatric Eye Screening Instrumentation
NASA Astrophysics Data System (ADS)
Chen, Ying-Ling; Lewis, J. W. L.
2001-11-01
Computational evaluations are presented for binocular eye screening using the off-axis digital retinascope. The retinascope, such as the iScreen digital screening system, has been employed to perform pediatric binocular screening using a flash lamp and single-shot camera recording. The digital images are transferred electronically to a reading center for analysis. The method has been shown to detect refractive error, amblyopia, anisocoria, and ptosis. This computational work improves the performance of the system and forms the basis for automated data analysis. For this purpose, various published eye models are evaluated with simulated retinascope images. Two to ten million rays are traced in each image calculation. The poster will present the simulation results for a range of eye conditions: refractive error of -20 to +20 diopters with 0.5- to 1-diopter resolution, pupil size of 3 to 8 mm diameter (1-mm increment), and staring angle of 2 to 12 degrees (2-degree increment). The variation of the results with system conditions such as the off-axis distance of the light source and the shutter size of the camera is also evaluated. The quantitative analysis for each eye and system condition is then performed to obtain parameters for automatic reading. A summary of the system performance is given and performance-enhancing design modifications are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nichols, Will E.
In accordance with U.S. Department of Energy (DOE) requirements in DOE O 435.1 Chg 1 and DOE M 435.1-1 Chg 1, a determination of continued adequacy of the performance assessment (PA), composite analysis (CA), and disposal authorization statement (DAS) is required on an annual basis. This determination must consider the results of data collection and analysis from research, field studies, and monitoring.
Meta-analysis identifies a MECOM gene as a novel predisposing factor of osteoporotic fracture
Hwang, Joo-Yeon; Lee, Seung Hun; Go, Min Jin; Kim, Beom-Jun; Kou, Ikuyo; Ikegawa, Shiro; Guo, Yan; Deng, Hong-Wen; Raychaudhuri, Soumya; Kim, Young Jin; Oh, Ji Hee; Kim, Youngdoe; Moon, Sanghoon; Kim, Dong-Joon; Koo, Heejo; Cha, My-Jung; Lee, Min Hye; Yun, Ji Young; Yoo, Hye-Sook; Kang, Young-Ah; Cho, Eun-Hee; Kim, Sang-Wook; Oh, Ki Won; Kang, Moo II; Son, Ho Young; Kim, Shin-Yoon; Kim, Ghi Su; Han, Bok-Ghee; Cho, Yoon Shin; Cho, Myeong-Chan; Lee, Jong-Young; Koh, Jung-Min
2014-01-01
Background Osteoporotic fracture (OF) as a clinical endpoint is a major complication of osteoporosis. To screen for OF susceptibility genes, we performed a genome-wide association study and carried out de novo replication analysis in an East Asian population. Methods Association was tested using logistic regression analysis. A meta-analysis was performed on the combined results using the effect size and standard error estimated for each study. Results In a combined meta-analysis of a discovery cohort (288 cases and 1139 controls), three hospital-based sets in replication stage I (462 cases and 1745 controls), and an independent ethnic group in replication stage II (369 cases and 560 controls), we identified a new locus associated with OF (rs784288 in the MECOM gene) that showed genome-wide significance (p = 3.59 × 10^-8; OR 1.39). RNA interference revealed that MECOM knockdown suppresses osteoclastogenesis. Conclusions Our findings provide new insights into the genetic architecture underlying OF in East Asians. PMID:23349225
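The combined analysis described above is a standard fixed-effect, inverse-variance meta-analysis of per-study effect sizes. The sketch below illustrates that calculation; it is not the authors' software, and the per-study betas and standard errors are placeholders rather than values from the paper.

```python
# Minimal sketch (not the authors' software): fixed-effect, inverse-variance
# meta-analysis of per-study log odds ratios, as typically used to combine
# GWAS discovery and replication results. Betas/SEs below are placeholders.
import numpy as np
from scipy import stats

beta = np.array([0.34, 0.31, 0.38])   # hypothetical per-study log(OR)
se   = np.array([0.11, 0.09, 0.13])   # hypothetical standard errors

w = 1.0 / se**2                        # inverse-variance weights
beta_meta = np.sum(w * beta) / np.sum(w)
se_meta = np.sqrt(1.0 / np.sum(w))
z = beta_meta / se_meta
p = 2 * stats.norm.sf(abs(z))          # two-sided p-value

print(f"OR = {np.exp(beta_meta):.2f}, p = {p:.2e}")
```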
Numerical prediction of Pelton turbine efficiency
NASA Astrophysics Data System (ADS)
Jošt, D.; Mežnar, P.; Lipej, A.
2010-08-01
This paper presents a numerical analysis of flow in a 2-jet Pelton turbine with a horizontal axis. The analysis was done for the model at several operating points in different operating regimes. The results were compared to the results of a test of the model. The analysis was performed using the ANSYS CFX-12.1 computer code. A k-ω SST turbulence model was used. Free surface flow was modelled by a two-phase homogeneous model. At first, a steady-state analysis of flow in the distributor with two injectors was performed for several needle strokes. This provided us with data on flow energy losses in the distributor and the shape and velocity of the jets. The second step was an unsteady analysis of the runner with jets. Torque on the shaft was then calculated from pressure distribution data. Averaged torque values are smaller than measured ones. Consequently, the calculated turbine efficiency is also smaller than the measured values; the difference is about 4%. The shape of the efficiency diagram conforms well to the measurements.
Methodology for CFD Design Analysis of National Launch System Nozzle Manifold
NASA Technical Reports Server (NTRS)
Haire, Scot L.
1993-01-01
The current design environment dictates that high technology CFD (Computational Fluid Dynamics) analysis produce quality results in a timely manner if it is to be integrated into the design process. The design methodology outlined describes the CFD analysis of an NLS (National Launch System) nozzle film cooling manifold. The objective of the analysis was to obtain a qualitative estimate for the flow distribution within the manifold. A complex, 3D, multiple zone, structured grid was generated from a 3D CAD file of the geometry. A Euler solution was computed with a fully implicit compressible flow solver. Post processing consisted of full 3D color graphics and mass averaged performance. The result was a qualitative CFD solution that provided the design team with relevant information concerning the flow distribution in and performance characteristics of the film cooling manifold within an effective time frame. Also, this design methodology was the foundation for a quick turnaround CFD analysis of the next iteration in the manifold design.
Orion Orbit Control Design and Analysis
NASA Technical Reports Server (NTRS)
Jackson, Mark; Gonzalez, Rodolfo; Sims, Christopher
2007-01-01
The analysis of candidate thruster configurations for the Crew Exploration Vehicle (CEV) is presented. Six candidate configurations were considered for the prime contractor baseline design. The analysis included analytical assessments of control authority, control precision, efficiency and robustness, as well as simulation assessments of control performance. The principles used in the analytic assessments of controllability, robustness and fuel performance are covered and results provided for the configurations assessed. Simulation analysis was conducted using a pulse width modulated, 6 DOF reaction system control law with a simplex-based thruster selection algorithm. Control laws were automatically derived from hardware configuration parameters including thruster locations, directions, magnitude and specific impulse, as well as vehicle mass properties. This parameterized controller allowed rapid assessment of multiple candidate layouts. Simulation results are presented for final phase rendezvous and docking, as well as low lunar orbit attitude hold. Finally, on-going analysis to consider alternate Service Module designs and to assess the pilot-ability of the baseline design are discussed to provide a status of orbit control design work to date.
NASA Technical Reports Server (NTRS)
Ray, Ronald J.
1994-01-01
New flight test maneuvers and analysis techniques for evaluating the dynamic response of in-flight thrust models during throttle transients have been developed and validated. The approach is based on the aircraft and engine performance relationship between thrust and drag. Two flight test maneuvers, a throttle step and a throttle frequency sweep, were developed and used in the study. Graphical analysis techniques, including a frequency domain analysis method, were also developed and evaluated. They provide quantitative and qualitative results. Four thrust calculation methods were used to demonstrate and validate the test technique. Flight test applications on two high-performance aircraft confirmed the test methods as valid and accurate. These maneuvers and analysis techniques were easy to implement and use. Flight test results indicate the analysis techniques can identify the combined effects of model error and instrumentation response limitations on the calculated thrust value. The methods developed in this report provide an accurate approach for evaluating, validating, or comparing thrust calculation methods for dynamic flight applications.
Preliminary Performance Analyses of the Constellation Program ARES 1 Crew Launch Vehicle
NASA Technical Reports Server (NTRS)
Phillips, Mark; Hanson, John; Shmitt, Terri; Dukemand, Greg; Hays, Jim; Hill, Ashley; Garcia, Jessica
2007-01-01
By the time NASA's Exploration Systems Architecture Study (ESAS) report had been released to the public in December 2005, engineers at NASA's Marshall Space Flight Center had already initiated the first of a series of detailed design analysis cycles (DACs) for the Constellation Program Crew Launch Vehicle (CLV), which has been given the name Ares I. As a major component of the Constellation Architecture, the CLV's initial role will be to deliver crew and cargo aboard the newly conceived Crew Exploration Vehicle (CEV) to a staging orbit for eventual rendezvous with the International Space Station (ISS). However, the long-term goal and design focus of the CLV will be to provide launch services for a crewed CEV in support of lunar exploration missions. Key to the success of the CLV design effort and an integral part of each DAC is a detailed performance analysis tailored to assess nominal and dispersed performance of the vehicle, to determine performance sensitivities, and to generate design-driving dispersed trajectories. Results of these analyses provide valuable design information to the program for the current design as well as provide feedback to engineers on how to adjust the current design in order to maintain program goals. This paper presents a condensed subset of the CLV performance analyses performed during the CLV DAC-1 cycle. Deterministic studies include development of the CLV DAC-1 reference trajectories, identification of vehicle stage impact footprints, an assessment of launch window impacts to payload performance, and the computation of select CLV payload partials. Dispersion studies include definition of input uncertainties, Monte Carlo analysis of trajectory performance parameters based on input dispersions, assessment of CLV flight performance reserve (FPR), assessment of orbital insertion accuracy, and an assessment of bending load indicators due to dispersions in vehicle angle of attack and side slip angle. A short discussion of the various customers for the dispersion results, along with results and ramifications of each study, are also provided.
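The dispersion studies described above follow the usual Monte Carlo pattern: sample the input uncertainties, propagate each sample through the trajectory simulation, and summarize the dispersed outputs to size quantities such as the flight performance reserve. The sketch below illustrates that pattern with a toy surrogate model; all parameter values and the surrogate itself are hypothetical and greatly simplified relative to a DAC trajectory simulation.

```python
# Minimal sketch of a Monte Carlo dispersion study (hypothetical, highly
# simplified stand-in for the DAC trajectory simulations): input uncertainties
# are sampled, propagated through a vehicle model, and the dispersed insertion
# states are summarized.
import numpy as np

rng = np.random.default_rng(0)
N = 2000

# Hypothetical dispersed inputs (3-sigma uncertainties converted to 1-sigma draws)
isp    = rng.normal(450.0, 2.0 / 3.0, N)      # specific impulse, s
thrust = rng.normal(1.0, 0.01 / 3.0, N)       # thrust multiplier
drag   = rng.normal(1.0, 0.05 / 3.0, N)       # drag multiplier

def insertion_velocity(isp, thrust, drag):
    """Toy linear surrogate for a full trajectory simulation."""
    return 7800.0 + 8.0 * (isp - 450.0) + 600.0 * (thrust - 1.0) - 150.0 * (drag - 1.0)

v = insertion_velocity(isp, thrust, drag)
print(f"mean = {v.mean():.1f} m/s, 3-sigma dispersion = {3 * v.std(ddof=1):.1f} m/s")
# A flight performance reserve would be sized to cover the low-tail shortfall,
# e.g. the 0.135th percentile (3-sigma low) relative to the requirement.
print(f"0.135th percentile = {np.percentile(v, 0.135):.1f} m/s")
```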
Ferrone, Carol; Galgano, Jessica; Ramig, Lorraine Olson
2011-05-01
To test the hypothesis that extensive use of La MaMa vocal technique may result in symptoms of vocal abuse, an evaluation of the acoustic and perceptual characteristics of voice for eight performers from the Great Jones Repertory Company of the La MaMa Experimental Theater was conducted. This vocal technique includes wide ranges of frequency from 46 to 2003 Hz and vocal intensity that is sustained at 90-108 dB sound pressure level with a mouth-to-microphone distance of 30 cm for 3-4 hours per performance. The actors rehearsed for 4 hours per day, 5 days per week for 14 weeks before the series of performances. Thirty-nine performances were presented in 6 weeks. Three pretraining, three posttraining, and two postperformance series data collection sessions were carried out for each performer. Speech samples were gathered using the CSL 4500 and analyzed using Real-Time Pitch program and Multidimensional Voice Program. Acoustic analysis was performed on 48 tokens of sustained vowel phonation for each subject. Statistical analysis was performed using the Friedman test of related samples. Perceptual analysis included professional listeners rating voice quality in pretraining, posttraining, and postperformance samples of the Rainbow Passage and sample lines from the plays. The majority of professional listeners (11/12) judged that this technique would result in symptoms of vocal abuse; however, acoustic data revealed statistically stable or improved measurements for all subjects in most dependent acoustic variables when compared with both posttraining and postperformance trials. These findings add support to the notion that a technique that may be perceived as vocally abusive, generating 90-100 dB sound pressure level and sustained over 6 weeks of performances, actually resulted in improved vocal strength and flexibility. Copyright © 2011 The Voice Foundation. Published by Mosby, Inc. All rights reserved.
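The Friedman test of related samples mentioned above compares each performer's repeated measurements across the pretraining, posttraining, and postperformance sessions. A minimal sketch with scipy is shown below; the acoustic values are illustrative, not the study's data.

```python
# Minimal sketch (illustrative values, not the study's data): Friedman test of
# related samples across the pretraining, posttraining, and postperformance
# sessions for one acoustic measure per performer.
from scipy.stats import friedmanchisquare

# Hypothetical per-performer scores for one acoustic variable at three time points
pre  = [52, 48, 55, 60, 47, 53, 58, 50]
post = [54, 50, 56, 62, 49, 55, 59, 53]
perf = [55, 51, 58, 61, 50, 56, 60, 54]

stat, p = friedmanchisquare(pre, post, perf)
print(f"Friedman chi-square = {stat:.2f}, p = {p:.3f}")
```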
Independent Orbiter Assessment (IOA): Assessment of the extravehicular mobility unit, volume 1
NASA Technical Reports Server (NTRS)
Raffaelli, Gary G.
1988-01-01
The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA effort performed an independent analysis of the Extravehicular Mobility Unit (EMU) hardware and system, generating draft failure modes, criticalities, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The IOA results were then compared to the most recent proposed Post 51-L NASA FMEA/CIL baseline. A resolution of each discrepancy from the comparison was provided through additional analysis as required. This report documents the results of that comparison for the Orbiter EMU hardware.
Does Exercise Improve Cognitive Performance? A Conservative Message from Lord's Paradox.
Liu, Sicong; Lebeau, Jean-Charles; Tenenbaum, Gershon
2016-01-01
Although extant meta-analyses support the notion that exercise results in cognitive performance enhancement, methodological shortcomings are noted in the primary evidence. The present study examined relevant randomized controlled trials (RCTs) published in the past 20 years (1996-2015) for methodological concerns arising from Lord's paradox. Our analysis revealed that RCTs supporting the positive effect of exercise on cognition are likely to include Type I error(s). This result can be attributed to the use of gain score analysis on pretest-posttest data as well as the presence of control group superiority over the exercise group on baseline cognitive measures. To improve the accuracy of causal inferences in this area, analysis of covariance on pretest-posttest data is recommended under the assumption of group equivalence. Important experimental procedures are discussed to maintain group equivalence.
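The contrast at the heart of the argument above, gain-score analysis versus analysis of covariance on pretest-posttest data, can be made concrete with a small simulation. The sketch below uses simulated data (not drawn from any of the reviewed RCTs) to show how the two analyses can disagree when baselines differ.

```python
# Minimal sketch (simulated data) contrasting the two analyses discussed in the
# paper: a t-test on gain scores versus ANCOVA on posttest with pretest as a
# covariate. With non-equivalent baselines the two can disagree (Lord's paradox).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(1)
n = 60
group = np.repeat([0, 1], n)                        # 0 = control, 1 = exercise
pre = rng.normal(100 + 10 * (1 - group), 10)        # control higher at baseline
post = 0.7 * pre + rng.normal(30, 5, 2 * n)         # no true treatment effect
df = pd.DataFrame({"group": group, "pre": pre, "post": post, "gain": post - pre})

t, p_gain = stats.ttest_ind(df.loc[df.group == 1, "gain"],
                            df.loc[df.group == 0, "gain"])
ancova = smf.ols("post ~ pre + C(group)", data=df).fit()

# With these settings the gain-score test typically flags a spurious "effect",
# while the ANCOVA group term typically does not.
print(f"gain-score t-test p = {p_gain:.3f}")
print(f"ANCOVA group effect p = {ancova.pvalues['C(group)[T.1]']:.3f}")
```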
Droplet-Based Segregation and Extraction of Concentrated Samples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buie, C R; Buckley, P; Hamilton, J
2007-02-23
Microfluidic analysis often requires sample concentration and separation techniques to isolate and detect analytes of interest. Complex or scarce samples may also require an orthogonal separation and detection method or off-chip analysis to confirm results. To perform these additional steps, the concentrated sample plug must be extracted from the primary microfluidic channel with minimal sample loss and dilution. We investigated two extraction techniques: injection of immiscible fluid droplets into the sample stream ("capping") and injection of the sample into an immiscible fluid stream ("extraction"). From our results we conclude that capping is the more effective partitioning technique. Furthermore, this functionality enables additional off-chip post-processing procedures such as DNA/RNA microarray analysis, real-time polymerase chain reaction (RT-PCR), and culture growth to validate chip performance.
NASA Astrophysics Data System (ADS)
Jian, X. H.; Dong, F. L.; Xu, J.; Li, Z. J.; Jiao, Y.; Cui, Y. Y.
2018-05-01
The feasibility of differentiating tissue components by performing frequency-domain analysis of photoacoustic images acquired at different wavelengths was studied in this paper. First, based on the basic theory of photoacoustic imaging, a brief theoretical model for frequency-domain analysis of multiwavelength photoacoustic signals was derived. The experimental results showed that different targets behave quite differently in the frequency domain. In particular, the characteristic peaks of the acoustic spectrum are unique to each target: 2.93 MHz, 5.37 MHz, 6.83 MHz, and 8.78 MHz for the PDMS phantom, versus 13.20 MHz, 16.60 MHz, 26.86 MHz, and 29.30 MHz for pork fat. The results indicate that the acoustic spectrum of photoacoustic imaging signals can potentially be used for tissue composition characterization.
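The frequency-domain analysis described above amounts to taking the spectrum of the photoacoustic time signal and locating its characteristic peaks. The sketch below illustrates this on a synthetic signal; the sampling rate, signal model, and peak threshold are assumptions for illustration only.

```python
# Minimal sketch (synthetic signal, not the paper's data): frequency-domain
# analysis of a photoacoustic A-line and location of its spectral peaks, the
# kind of acoustic-spectrum features used to distinguish tissue targets.
import numpy as np
from scipy.signal import find_peaks

fs = 100e6                                   # 100 MHz sampling rate (assumed)
t = np.arange(0, 10e-6, 1 / fs)
# Synthetic signal with components near 2.9 MHz and 6.8 MHz plus noise
sig = np.sin(2 * np.pi * 2.9e6 * t) + 0.6 * np.sin(2 * np.pi * 6.8e6 * t)
sig += 0.1 * np.random.default_rng(0).standard_normal(t.size)

# Windowed magnitude spectrum and its frequency axis
spectrum = np.abs(np.fft.rfft(sig * np.hanning(t.size)))
freqs = np.fft.rfftfreq(t.size, 1 / fs)

peaks, _ = find_peaks(spectrum, height=0.2 * spectrum.max())
print("characteristic peaks (MHz):", np.round(freqs[peaks] / 1e6, 2))
```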
Effect of Geometrical Imperfection on Buckling Failure of ITER VVPSS Tank
NASA Astrophysics Data System (ADS)
Jha, Saroj Kumar; Gupta, Girish Kumar; Pandey, Manish Kumar; Bhattacharya, Avik; Jogi, Gaurav; Bhardwaj, Anil Kumar
2017-04-01
The ‘Vacuum Vessel Pressure Suppression System’ (VVPSS) is part of the ITER machine and is designed to protect the ITER Vacuum Vessel and its connected systems from an over-pressure situation. It comprises a partially evacuated stainless steel tank, approximately 46 m long, 6 m in diameter, and 30 mm thick, which is to hold approximately 675 tonnes of water at room temperature to condense the steam resulting from an adverse water leakage into the Vacuum Vessel chamber. For any vacuum vessel, geometrical imperfection has a significant effect on buckling failure and structural integrity. The major geometrical imperfection in the VVPSS tank depends on form tolerances. To study the effect of geometrical imperfection on buckling failure of the VVPSS tank, finite element analysis (FEA) has been performed in line with ASME Section VIII Division 2 Part 5 [1], the ‘design by analysis’ method. A linear buckling analysis was performed to obtain the buckled shape and displacement. Geometrical imperfection due to form tolerance is incorporated in the FEA model of the VVPSS tank by scaling the resulting buckled shape by a factor of 60. This buckled-shape model is used as the input geometry for plastic collapse and buckling failure assessment. Plastic collapse and buckling failure of the VVPSS tank have been assessed using the elastic-plastic analysis method. The analysis was performed for different values of form tolerance. The results show that the load proportionality factor (LPF) varies inversely with form tolerance: for higher values of form tolerance, the LPF reduces significantly, accompanied by high values of displacement.
The Need for Anticoagulation Following Inferior Vena Cava Filter Placement: Systematic Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ray, Charles E.; Prochazka, Allan
Purpose. To perform a systematic review to determine the effect of anticoagulation on the rates of venous thromboembolism (pulmonary embolus, deep venous thrombosis, inferior vena cava (IVC) filter thrombosis) following placement of an IVC filter. Methods. A comprehensive computerized literature search was performed to identify relevant articles. Data were abstracted by two reviewers. Studies were included if it could be determined whether or not subjects received anticoagulation following filter placement, and if follow-up data were presented. A meta-analysis of patients from all included studies was performed. A total of 14 articles were included in the final analysis, but the data from only nine articles could be used in the meta-analysis; five studies were excluded because they did not present raw data that could be analyzed in the meta-analysis. A total of 1,369 subjects were included in the final meta-analysis. Results. The summary odds ratio for the effect of anticoagulation on venous thromboembolism rates following filter deployment was 0.639 (95% CI 0.351 to 1.159, p = 0.141). There was significant heterogeneity in the results from different studies [Q statistic of 15.95 (p = 0.043)]. Following the meta-analysis, there was a trend toward decreased venous thromboembolism rates in patients with post-filter anticoagulation (12.3% vs. 15.8%), but the result failed to reach statistical significance. Conclusion. Inferior vena cava filters can be placed in patients who cannot receive concomitant anticoagulation without placing them at significantly higher risk of development of venous thromboembolism.
Uncertainty analysis of hydrological modeling in a tropical area using different algorithms
NASA Astrophysics Data System (ADS)
Rafiei Emam, Ammar; Kappas, Martin; Fassnacht, Steven; Linh, Nguyen Hoang Khanh
2018-01-01
Hydrological modeling outputs are subject to uncertainty resulting from different sources of error (e.g., errors in input data, model structure, and model parameters), making quantification of uncertainty in hydrological modeling imperative in order to improve the reliability of modeling results. Uncertainty analysis must address difficulties in the calibration of hydrological models, which increase further in areas with data scarcity. The purpose of this study is to apply four uncertainty analysis algorithms to a semi-distributed hydrological model, quantifying different sources of uncertainty (especially parameter uncertainty) and evaluating their performance. In this study, the Soil and Water Assessment Tool (SWAT) eco-hydrological model was implemented for a watershed in the center of Vietnam. The sensitivity of the parameters was analyzed, and the model was calibrated. The uncertainty analysis for the hydrological model was conducted based on four algorithms: Generalized Likelihood Uncertainty Estimation (GLUE), Sequential Uncertainty Fitting (SUFI), the Parameter Solution method (ParaSol), and Particle Swarm Optimization (PSO). The performance of the algorithms was compared using the P-factor and R-factor, the coefficient of determination (R²), the Nash-Sutcliffe efficiency (NSE), and Percent Bias (PBIAS). The results showed the high performance of SUFI and PSO, with P-factor > 0.83, R-factor < 0.56, R² > 0.91, NSE > 0.89, and 0.18
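Two of the performance measures named above, NSE and PBIAS, are simple closed-form statistics of the paired observed and simulated series. A minimal sketch is given below; the discharge values are illustrative, and the PBIAS sign convention (positive indicating underestimation) is the commonly used one, which the study may or may not follow.

```python
# Minimal sketch of two of the performance measures named above, computed from
# paired observed/simulated discharge series (values are illustrative).
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect, < 0 is worse than the mean of obs."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    """Percent bias: positive values indicate underestimation (common convention)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

obs = [12.0, 15.5, 30.2, 22.1, 18.7, 10.3]
sim = [11.2, 16.0, 27.8, 23.5, 17.9, 11.1]
print(f"NSE = {nse(obs, sim):.2f}, PBIAS = {pbias(obs, sim):.1f}%")
```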
Evaluation and Analysis of Regional Best Management Practices in San Diego, California (USA)
NASA Astrophysics Data System (ADS)
Flint, K.; Kinoshita, A. M.
2017-12-01
In urban areas, surface water quality is often impaired due to pollutants transported by stormwater runoff. To maintain and improve surface water quality, the United States Clean Water Act (CWA) requires an evaluation of available water quality information to develop a list of impaired water bodies and establish contaminant restrictions. Structural Best Management Practices (BMPs) are designed to reduce runoff volume and/or pollutant concentrations to comply with CWA requirements. Local-level policy makers and managers require an improved understanding of the costs and benefits associated with BMP installation, performance, and maintenance. The International Stormwater BMP Database (Database) is an online platform for submittal of information about existing BMPs, such as cost, design details, and statistical analysis of influent and effluent pollutant concentrations. While the Database provides an aggregation of data which supports analysis of overall BMP performance at international and national scales, the sparse spatial distribution of the data is not suitable for regional and local analysis. This research conducts an extensive review of local inventory and spatial analysis of existing permanent BMPs throughout the San Diego River watershed in California, USA. Information collected from cities within the San Diego River watershed will include BMP types, locations, dates of installation, costs, expected removal efficiencies, monitoring data, and records of maintenance. Aggregating and mapping this information will facilitate BMP evaluation, specifically the identification of spatial trends, inconsistencies in BMP performance, and gaps in current records. Regression analysis will provide insight into the nature and significance of correlations between BMP performance and physical characteristics such as land use, soil type, and proximity to impaired waters. This analysis will also result in a metric of relative BMP performance and will provide a basis for future predictions of BMP effectiveness. Ultimately, results from this work will provide information to local governments and agencies for prioritizing, maintaining and monitoring BMPs, and improvement of hydrologic and water quality modeling in urban systems subject to compliance.
Multidisciplinary Shape Optimization of a Composite Blended Wing Body Aircraft
NASA Astrophysics Data System (ADS)
Boozer, Charles Maxwell
A multidisciplinary shape optimization tool coupling aerodynamics, structure, and performance was developed for battery-powered aircraft. Utilizing high-fidelity computational fluid dynamics analysis tools and a structural wing weight tool, coupled based on the multidisciplinary feasible optimization architecture, the aircraft geometry is modified to optimize the aircraft's range or endurance. The developed tool is applied to three geometries: a hybrid blended wing body delta wing UAS, the ONERA M6 wing, and a modified ONERA M6 wing. First, the optimization problem is presented with the objective function, constraints, and design vector. Next, the tool's architecture and the analysis tools that are utilized are described. Finally, various optimizations are described and their results analyzed for all test subjects. Results show that less computationally expensive inviscid optimizations yield positive performance improvements using planform, airfoil, and three-dimensional degrees of freedom. From the results obtained through a series of optimizations, it is concluded that the newly developed tool is both effective at improving performance and serves as a platform ready to receive additional performance modules, further improving its computational design support potential.
Indicators of suboptimal performance embedded in the Wechsler Memory Scale-Fourth Edition (WMS-IV).
Bouman, Zita; Hendriks, Marc P H; Schmand, Ben A; Kessels, Roy P C; Aldenkamp, Albert P
2016-01-01
Recognition and visual working memory tasks from the Wechsler Memory Scale-Fourth Edition (WMS-IV) have previously been documented as useful indicators for suboptimal performance. The present study examined the clinical utility of the Dutch version of the WMS-IV (WMS-IV-NL) for the identification of suboptimal performance using an analogue study design. The patient group consisted of 59 mixed-etiology patients; the experimental malingerers were 50 healthy individuals who were asked to simulate cognitive impairment as a result of a traumatic brain injury; the last group consisted of 50 healthy controls who were instructed to put forth full effort. Experimental malingerers performed significantly lower on all WMS-IV-NL tasks than did the patients and healthy controls. A binary logistic regression analysis was performed on the experimental malingerers and the patients. The first model contained the visual working memory subtests (Spatial Addition and Symbol Span) and the recognition tasks of the following subtests: Logical Memory, Verbal Paired Associates, Designs, Visual Reproduction. The results showed an overall classification rate of 78.4%, and only Spatial Addition explained a significant amount of variation (p < .001). Subsequent logistic regression analysis and receiver operating characteristic (ROC) analysis supported the discriminatory power of the subtest Spatial Addition. A scaled score cutoff of <4 produced 93% specificity and 52% sensitivity for detection of suboptimal performance. The WMS-IV-NL Spatial Addition subtest may provide clinically useful information for the detection of suboptimal performance.
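The logistic regression and ROC analysis described above can be sketched as follows on simulated scaled scores (not the WMS-IV-NL data); the score distributions, group sizes, and cutoff handling are assumptions for illustration.

```python
# Minimal sketch (simulated scores, not the WMS-IV-NL data): a logistic
# regression on a single embedded indicator and an ROC/cutoff analysis,
# mirroring the kind of analysis described above.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
# Hypothetical scaled scores on one subtest: patients vs. experimental malingerers
patients    = rng.normal(8, 3, 59).clip(1, 19)
malingerers = rng.normal(4, 2, 50).clip(1, 19)
scores = np.concatenate([patients, malingerers])
y = np.concatenate([np.zeros(59), np.ones(50)])   # 1 = suboptimal performance

model = LogisticRegression().fit(scores.reshape(-1, 1), y)
prob = model.predict_proba(scores.reshape(-1, 1))[:, 1]
print(f"AUC = {roc_auc_score(y, prob):.2f}")

# Sensitivity/specificity for a scaled-score cutoff of < 4
pred = scores < 4
sens = np.mean(pred[y == 1])     # flagged among malingerers
spec = np.mean(~pred[y == 0])    # not flagged among patients
print(f"cutoff < 4: sensitivity = {sens:.2f}, specificity = {spec:.2f}")
```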
Lifetime assessment analysis of Galileo Li/SO2 cells: Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Levy, S.C.; Jaeger, C.D.; Bouchard, D.A.
Galileo Li/SO2 cells from five lots and five storage temperatures were studied to establish a database from which the performance of flight modules may be predicted. Nondestructive tests consisting of complex impedance analysis and a 15-s pulse were performed on all cells. Chemical analysis was performed on one cell from each lot/storage group, and the remaining cells were discharged at Galileo mission loads. An additional number of cells were placed on high-temperature accelerated aging storage for 6 months and then discharged. All data were statistically analyzed. Results indicate that the present Galileo design Li/SO2 cell will satisfy electrical requirements for a 10-year mission. 10 figs., 4 tabs.
NASA Technical Reports Server (NTRS)
Csank, Jeffrey T.; Zinnecker, Alicia M.
2014-01-01
The aircraft engine design process seeks to achieve the best overall system-level performance, weight, and cost for a given engine design. This is achieved by a complex process known as systems analysis, where steady-state simulations are used to identify trade-offs that should be balanced to optimize the system. The steady-state simulations and data on which systems analysis relies may not adequately capture the true performance trade-offs that exist during transient operation. Dynamic Systems Analysis provides the capability for assessing these trade-offs at an earlier stage of the engine design process. The concept of dynamic systems analysis and the type of information available from this analysis are presented in this paper. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) was developed. This tool aids a user in the design of a power management controller to regulate thrust, and a transient limiter to protect the engine model from surge at a single flight condition (defined by an altitude and Mach number). Results from simulation of the closed-loop system may be used to estimate the dynamic performance of the model. This enables evaluation of the trade-off between performance and operability, or safety, in the engine, which could not be done with steady-state data alone. A design study is presented to compare the dynamic performance of two different engine models integrated with the TTECTrA software.
Impact damage in composite plates
NASA Technical Reports Server (NTRS)
Shahid, I.; Lee, S.; Chang, F. K.; Shah, B. M.
1995-01-01
The objective of this research was to link two computer codes, PDCOMP (for progressive damage analysis of laminated composites) and 3DIMPACT (for prediction of the extent of delaminations in laminated composites resulting from point impact loads), in order to predict impact damage by taking into account local damage and material degradation and to estimate the residual stiffness of composites after impact. Graphs comparing the analysis with test results are presented, along with conclusions on the performance of the codes.
NASA Technical Reports Server (NTRS)
Veres, Joseph P.; Jorgenson, Philip, C. E.; Jones, Scott M.
2014-01-01
The main focus of this study is to apply a computational tool for the flow analysis of the engine that has been tested with ice crystal ingestion in the Propulsion Systems Laboratory (PSL) of NASA Glenn Research Center. A data point was selected for analysis during which the engine experienced a full roll back event due to the ice accretion on the blades and flow path of the low pressure compressor. The computational tool consists of the Numerical Propulsion System Simulation (NPSS) engine system thermodynamic cycle code, and an Euler-based compressor flow analysis code, that has an ice particle melt estimation code with the capability of determining the rate of sublimation, melting, and evaporation through the compressor blade rows. Decreasing the performance characteristics of the low pressure compressor (LPC) within the NPSS cycle analysis resulted in matching the overall engine performance parameters measured during testing at data points in short time intervals through the progression of the roll back event. Detailed analysis of the fan-core and LPC with the compressor flow analysis code simulated the effects of ice accretion by increasing the aerodynamic blockage and pressure losses through the low pressure compressor until achieving a match with the NPSS cycle analysis results, at each scan. With the additional blockages and losses in the LPC, the compressor flow analysis code results were able to numerically reproduce the performance that was determined by the NPSS cycle analysis, which was in agreement with the PSL engine test data. The compressor flow analysis indicated that the blockage due to ice accretion in the LPC exit guide vane stators caused the exit guide vane (EGV) to be nearly choked, significantly reducing the air flow rate into the core. This caused the LPC to eventually be in stall due to increasing levels of diffusion in the rotors and high incidence angles in the inlet guide vane (IGV) and EGV stators. The flow analysis indicating compressor stall is substantiated by the video images of the IGV taken during the PSL test, which showed water on the surface of the IGV flowing upstream out of the engine, indicating flow reversal, which is characteristic of a stalled compressor.
Model Performance Evaluation and Scenario Analysis ...
This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit measures that capture magnitude-only, sequence-only, and combined magnitude and sequence errors. The performance measures include error analysis, the coefficient of determination, Nash-Sutcliffe efficiency, and a new weighted rank method. These performance metrics only provide useful information about the overall model performance. Note that MPESA is based on the separation of observed and simulated time series into magnitude and sequence components. The separation of time series into magnitude and sequence components, and the reconstruction back into time series, provides diagnostic insights to modelers. For example, traditional approaches lack the capability to identify whether the source of uncertainty in the simulated data is due to the quality of the input data or to the way the analyst adjusted the model parameters. This report presents a suite of model diagnostics that identify whether mismatches between observed and simulated data result from magnitude- or sequence-related errors. MPESA offers graphical and statistical options that allow HSPF users to compare observed and simulated time series and identify the parameter values to adjust or the input data to modify. The scenario analysis part of the tool
Relation between brain architecture and mathematical ability in children: a DBM study.
Han, Zhaoying; Davis, Nicole; Fuchs, Lynn; Anderson, Adam W; Gore, John C; Dawant, Benoit M
2013-12-01
Population-based studies indicate that between 5 and 9 percent of US children exhibit significant deficits in mathematical reasoning, yet little is understood about the brain morphological features related to mathematical performances. In this work, deformation-based morphometry (DBM) analyses have been performed on magnetic resonance images of the brains of 79 third graders to investigate whether there is a correlation between brain morphological features and mathematical proficiency. Group comparison was also performed between Math Difficulties (MD-worst math performers) and Normal Controls (NC), where each subgroup consists of 20 age and gender matched subjects. DBM analysis is based on the analysis of the deformation fields generated by non-rigid registration algorithms, which warp the individual volumes to a common space. To evaluate the effect of registration algorithms on DBM results, five nonrigid registration algorithms have been used: (1) the Adaptive Bases Algorithm (ABA); (2) the Image Registration Toolkit (IRTK); (3) the FSL Nonlinear Image Registration Tool; (4) the Automatic Registration Tool (ART); and (5) the normalization algorithm available in SPM8. The deformation field magnitude (DFM) was used to measure the displacement at each voxel, and the Jacobian determinant (JAC) was used to quantify local volumetric changes. Results show there are no statistically significant volumetric differences between the NC and the MD groups using JAC. However, DBM analysis using DFM found statistically significant anatomical variations between the two groups around the left occipital-temporal cortex, left orbital-frontal cortex, and right insular cortex. Regions of agreement between at least two algorithms based on voxel-wise analysis were used to define Regions of Interest (ROIs) to perform an ROI-based correlation analysis on all 79 volumes. Correlations between average DFM values and standard mathematical scores over these regions were found to be significant. We also found that the choice of registration algorithm has an impact on DBM-based results, so we recommend using more than one algorithm when conducting DBM studies. To the best of our knowledge, this is the first study that uses DBM to investigate brain anatomical features related to mathematical performance in a relatively large population of children. © 2013.
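The two voxel-wise quantities used above, deformation field magnitude (DFM) and the Jacobian determinant (JAC), are both derived from the dense displacement field produced by non-rigid registration. The sketch below computes them on a synthetic field with numpy; it is not tied to any of the five registration packages listed in the study.

```python
# Minimal sketch (synthetic field, not the registration outputs used in the
# study): the two voxel-wise DBM quantities described above, computed from a
# dense displacement field u(x) on a regular grid.
import numpy as np

rng = np.random.default_rng(3)
shape = (32, 32, 32)
u = 0.5 * rng.standard_normal((3,) + shape)   # displacement field, voxel units

# Deformation field magnitude (DFM): Euclidean norm of the displacement
dfm = np.sqrt(np.sum(u ** 2, axis=0))

# Jacobian determinant (JAC) of the mapping x -> x + u(x)
grads = [np.gradient(u[i]) for i in range(3)]   # grads[i][j] = d u_i / d x_j
J = np.zeros(shape + (3, 3))
for i in range(3):
    for j in range(3):
        J[..., i, j] = (i == j) + grads[i][j]
jac = np.linalg.det(J)

print(f"mean DFM = {dfm.mean():.3f} voxels, mean JAC = {jac.mean():.3f}")
```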
TASS Model Application for Testing the TDWAP Model
NASA Technical Reports Server (NTRS)
Switzer, George F.
2009-01-01
One of the operational modes of the Terminal Area Simulation System (TASS) model simulates the three-dimensional interaction of wake vortices within turbulent domains in the presence of thermal stratification. The model allows the investigation of the effects of turbulence and stratification on vortex transport and decay. The model simulations for this work all assumed fully periodic boundary conditions to remove the effects of any surface interaction. During the Base Period of this contract, NWRA completed generation of these datasets but only presented analysis for the neutral stratification runs of that set (Task 3.4.1). Phase 1 work began with the analysis of the remaining stratification datasets, and in the analysis we discovered discrepancies with the vortex time-to-link predictions. This finding necessitated investigating the source of the anomaly, and we found a problem with the background turbulence. Using the most up-to-date version of TASS with some important defect fixes, we regenerated a larger turbulence domain and verified the vortex time to link for a few cases before proceeding to regenerate the entire 25-case set (Task 3.4.2). The effort of Phase 2 (Task 3.4.3) concentrated on analysis of several scenarios investigating the effects of closely spaced aircraft. The objective was to quantify the minimum aircraft separations necessary to avoid vortex interactions between neighboring aircraft. The results consist of spreadsheets of wake data and presentation figures prepared for NASA technical exchanges. For these formation cases, NASA carried out the actual TASS simulations and NWRA performed the analysis of the results by making animations, line plots, and other presentation figures. This report contains the description of the work performed during this final phase of the contract, the analysis procedures adopted, and sample plots of the results from the analysis performed.
NASA Astrophysics Data System (ADS)
Folley, Christopher; Bronowicki, Allen
2005-09-01
Prediction of optical performance for large, deployable telescopes under environmental conditions and mechanical disturbances is a crucial part of the design verification process of such instruments for all phases of design and operation: ground testing, commissioning, and on-orbit operation. A Structural-Thermal-Optical-Performance (STOP) analysis methodology is often created that integrates the output of one analysis with the input of another. The integration of thermal environment predictions with structural models is relatively well understood, while the integration of structural deformation results into optical analysis/design software is less straightforward. A Matlab toolbox has been created that effectively integrates the predictions of mechanical deformations on optical elements generated by, for example, finite element analysis, and computes optical path differences for the distorted prescription. The engine of the toolbox is a real ray-tracing algorithm that allows the optical surfaces to be defined in a single, global coordinate system, thereby allowing automatic alignment of the mechanical coordinate system with the optical coordinate system. Therefore, the physical location of the optical surfaces is identical in the optical prescription and the finite element model. The application of rigid body displacements to optical surfaces, however, is more general than for use solely in STOP analysis, such as the analysis of misalignments during the commissioning process. Furthermore, all the functionality of Matlab is available for optimization and control. Since this is a new tool for use on flight programs, it has been verified against CODE V. The toolbox's functionality, to date, is described; verification results are presented; and, as an example of its utility, results of a thermal distortion analysis are presented using the James Webb Space Telescope (JWST) prescription.
Analysis method comparison of on-time and on-budget data.
DOT National Transportation Integrated Search
2007-02-01
New Mexico Department of Transportation (NMDOT) results for On-Time and On-Budget performance measures as reported in (AASHTO/SCoQ) NCHRP 20-24(37) Project Measuring Performance Among State DOTs (Phase I) are lower than construction personnel kno...
NASA Technical Reports Server (NTRS)
Gale, R. L.; Nease, A. W.; Nelson, D. J.
1978-01-01
Computer program mathematically describes complete hydraulic systems to study their dynamic performance. Program employs subroutines that simulate components of hydraulic system, which are then controlled by main program. Program is useful to engineers working with detailed performance results of aircraft, spacecraft, or similar hydraulic systems.
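The structure described, component subroutines driven by a main program, can be sketched in a few lines. The example below is a hypothetical lumped-parameter loop (pump, orifice, and a compressible line volume) written in Python only for illustration; it is not the NASA program, and all fluid properties and component sizes are assumed.

import math

RHO = 850.0           # hydraulic fluid density, kg/m^3 (assumed)
BULK_MODULUS = 1.4e9  # fluid bulk modulus, Pa (assumed)

def pump_flow(speed_rpm, displacement_m3_per_rev):
    """Fixed-displacement pump treated as an ideal flow source."""
    return speed_rpm / 60.0 * displacement_m3_per_rev

def orifice_flow(dp, cd_area):
    """Turbulent orifice: Q = Cd*A*sqrt(2*|dp|/rho), signed with dp."""
    return math.copysign(cd_area * math.sqrt(2.0 * abs(dp) / RHO), dp)

def simulate(t_end=0.5, dt=1e-4):
    p_line = 1.0e5    # line pressure, Pa
    p_return = 1.0e5  # return pressure, Pa (held constant)
    volume = 2.0e-4   # trapped line volume, m^3
    t = 0.0
    while t < t_end:
        q_in = pump_flow(speed_rpm=1800.0, displacement_m3_per_rev=8e-6)
        q_out = orifice_flow(p_line - p_return, cd_area=2.0e-6)
        # Compressibility of the trapped volume: dP/dt = (beta / V) * (Q_in - Q_out)
        p_line += dt * BULK_MODULUS / volume * (q_in - q_out)
        t += dt
    return p_line

print(f"line pressure after 0.5 s: {simulate() / 1e5:.1f} bar")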
Verification of Orthogrid Finite Element Modeling Techniques
NASA Technical Reports Server (NTRS)
Steeve, B. E.
1996-01-01
The stress analysis of orthogrid structures, specifically with I-beam sections, is regularly performed using finite elements. Various modeling techniques are often used to simplify the modeling process yet still adequately capture the actual hardware behavior. The accuracy of such "short cuts" is sometimes in question. This report compares three modeling techniques to actual test results from a loaded orthogrid panel. The finite element models include a beam, a shell, and a mixed beam-and-shell element model. Results show that the shell element model performs the best, but that the simpler beam and mixed beam-and-shell element models provide reasonable to conservative results for a stress analysis. When deflection and stiffness are critical, it is important to capture the effect of the orthogrid nodes in the model.
Portable Life Support Subsystem Thermal Hydraulic Performance Analysis
NASA Technical Reports Server (NTRS)
Barnes, Bruce; Pinckney, John; Conger, Bruce
2010-01-01
This paper presents the current state of the thermal hydraulic modeling efforts being conducted for the Constellation Space Suit Element (CSSE) Portable Life Support Subsystem (PLSS). The goal of these efforts is to provide realistic simulations of the PLSS under various modes of operation. The PLSS thermal hydraulic model simulates the thermal, pressure, and flow characteristics of the PLSS, as well as the related human thermal comfort. This paper presents modeling approaches and assumptions as well as component model descriptions. Results from the models are presented that show PLSS operation at steady-state and transient conditions. Finally, conclusions and recommendations are offered that summarize results, identify PLSS design weaknesses uncovered during review of the analysis results, and propose areas for improvement to increase model fidelity and accuracy.
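As a rough illustration of the kind of transient energy balance such a model evaluates, the sketch below integrates a single coolant-loop temperature under a metabolic heat load and an idealized heat exchanger. It is not the CSSE PLSS model; the loop mass, flow rate, sink temperature, and heat exchanger effectiveness are invented for the example.

def simulate_coolant_loop(q_metabolic_w=300.0, t_end_s=3600.0, dt_s=1.0):
    """Explicit-Euler transient of a single lumped coolant-loop temperature (illustrative)."""
    cp = 4186.0          # water specific heat, J/(kg*K)
    m_loop = 2.0         # coolant mass in the loop, kg (assumed)
    m_dot = 0.03         # coolant mass flow rate, kg/s (assumed)
    t_sink = 278.0       # effective heat-rejection sink temperature, K (assumed)
    effectiveness = 0.7  # heat exchanger effectiveness (assumed)
    t_loop = 295.0       # initial loop temperature, K
    t = 0.0
    while t < t_end_s:
        q_rejected = effectiveness * m_dot * cp * (t_loop - t_sink)
        # Lumped thermal mass: M*cp*dT/dt = Q_in - Q_out
        t_loop += dt_s * (q_metabolic_w - q_rejected) / (m_loop * cp)
        t += dt_s
    return t_loop

print(f"loop temperature after one hour: {simulate_coolant_loop() - 273.15:.1f} C")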
Using Job Analysis Techniques to Understand Training Needs for Promotores de Salud.
Ospina, Javier H; Langford, Toshiko A; Henry, Kimberly L; Nelson, Tristan Q
2018-04-01
Despite the value of community health worker programs, such as Promotores de Salud, for addressing health disparities in the Latino community, little consensus has been reached to formally define the unique roles and duties associated with the job, thereby creating unique job training challenges. Understanding the job tasks and worker attributes central to this work is a critical first step for developing the training and evaluation systems of promotores programs. Here, we present the process and findings of a job analysis conducted for promotores working for Planned Parenthood. We employed a systematic approach, the combination job analysis method, to define the job in terms of its work and worker requirements, identifying key job tasks, as well as the worker attributes necessary to effectively perform them. Our results suggest that the promotores' job encompasses a broad range of activities and requires an equally broad range of personal characteristics to perform. These results played an important role in the development of our training and evaluation protocols. In this article, we introduce the technique of job analysis, provide an overview of the results from our own application of this technique, and discuss how these findings can be used to inform a training and performance evaluation system. This article provides a template for other organizations implementing similar community health worker programs and illustrates the value of conducting a job analysis for clarifying job roles, developing and evaluating job training materials, and selecting qualified job candidates.
Factor analysis of serogroups botanica and aurisina of Leptospira biflexa.
Cinco, M
1977-11-01
Factor analysis is performed on serovars of the Botanica and Aurisina serogroups of Leptospira biflexa. The results show the arrangement of the main serovar-specific and serogroup-specific factors, as well as the antigens shared with serovars of heterologous serogroups.
NASA Technical Reports Server (NTRS)
1979-01-01
Structural analysis and certification of the collector system are presented. System verification against the interim performance criteria is presented and indicated by matrices. The verification discussion, analysis, and test results are also given.
Performance assessment in algebra learning process
NASA Astrophysics Data System (ADS)
Lestariani, Ida; Sujadi, Imam; Pramudya, Ikrar
2017-12-01
The purpose of this research is to describe the implementation of performance assessment in the algebra learning process. The subject of this research is a class X mathematics educator at SMAN 1 Ngawi. This is a descriptive qualitative study. Data were collected by observation, interview, and documentation, and analysed through data reduction, data presentation, and conclusion drawing. The results indicate that the steps taken by the educator in applying performance assessment are 1) preparing individual worksheets and group worksheets, 2) preparing assessment rubrics for the individual and group worksheets, and 3) applying the performance assessment rubrics to learners' results on individual or group tasks.
Performance characteristics of LOX-H2, tangential-entry, swirl-coaxial, rocket injectors
NASA Technical Reports Server (NTRS)
Howell, Doug; Petersen, Eric; Clark, Jim
1993-01-01
Development of a high-performing swirl-coaxial injector requires an understanding of fundamental performance characteristics. This paper addresses the findings of cold-flow atomization characterization studies, which provided information on the influence of fluid properties and element operating conditions on the produced droplet sprays. These findings are applied to actual rocket conditions. The performance characteristics of swirl-coaxial injection elements under multi-element hot-fire conditions were obtained by analysis of combustion performance data from three separate test series. The injection elements are described, and test results are analyzed using multi-variable linear regression. A direct comparison of test results indicated that reduced fuel injection velocity improved injection element performance through improved propellant mixing.
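The multi-variable linear regression step can be illustrated with synthetic data. In the sketch below the regressors (fuel injection velocity, mixture ratio, chamber pressure) and the efficiency response are fabricated stand-ins for the hot-fire data, chosen only to show the least-squares mechanics, not to reproduce the reported results.

import numpy as np

rng = np.random.default_rng(0)
n = 40
fuel_velocity = rng.uniform(30.0, 120.0, n)   # m/s (synthetic)
mixture_ratio = rng.uniform(4.0, 7.0, n)      # O/F (synthetic)
chamber_pressure = rng.uniform(3.0, 10.0, n)  # MPa (synthetic)
# Synthetic "efficiency" response with a mild negative dependence on fuel velocity plus noise.
eta = 0.99 - 4e-4 * fuel_velocity + 0.002 * mixture_ratio + rng.normal(0.0, 0.002, n)

# Design matrix with an intercept column, solved in the least-squares sense.
X = np.column_stack([np.ones(n), fuel_velocity, mixture_ratio, chamber_pressure])
coeffs, *_ = np.linalg.lstsq(X, eta, rcond=None)
for name, c in zip(["intercept", "fuel velocity", "mixture ratio", "chamber pressure"], coeffs):
    print(f"{name:>16s}: {c:+.3e}")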
Kang, Sang Wook; Kim, Su Kang; Jung, Hee-Jae; Kim, Kwan-Il; Kim, Jinju
2016-01-01
The relationship between polymorphism of the angiotensin I converting enzyme (ACE) gene and chronic obstructive pulmonary disease (COPD) has been examined in many previous studies. However, their results were controversial. Therefore, we performed a meta-analysis to evaluate the relationship between the ACE gene and the risk of COPD. Fourteen case-control studies were included in this meta-analysis. The pooled p value, odds ratio (OR), and 95% confidence interval (95% CI) were used to investigate the strength of the association. The meta-analysis was performed using comprehensive meta-analysis software. Our meta-analysis results revealed that ACE polymorphisms were not related to the risk of COPD (p > 0.05 in each model). In further analyses based on ethnicity, we observed an association between insertion/deletion polymorphism of the ACE gene and risk of COPD in the Asian population (codominant 2, OR = 3.126, 95% CI = 1.919–5.093, p < 0.001; recessive, OR = 3.326, 95% CI = 2.190–5.050, p < 0.001) but not in the Caucasian population (p > 0.05 in each model). In conclusion, the present meta-analysis indicated that the insertion/deletion polymorphism of the ACE gene may be associated with susceptibility to COPD in the Asian population but not in the Caucasian population. However, the results of the present meta-analysis need to be confirmed in a larger sample. PMID:27830153
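The pooling arithmetic behind such a meta-analysis can be shown with a minimal fixed-effect, inverse-variance example. The study counts below are invented and the model is far simpler than the codominant and recessive models reported above; the sketch only illustrates how a pooled odds ratio and its 95% confidence interval are formed.

import math

# (cases exposed, cases unexposed, controls exposed, controls unexposed) per study (invented)
studies = [(30, 70, 20, 80), (45, 55, 35, 65), (25, 75, 15, 85)]

weights, weighted_log_or = [], []
for a, b, c, d in studies:
    log_or = math.log((a * d) / (b * c))
    var = 1.0 / a + 1.0 / b + 1.0 / c + 1.0 / d   # Woolf variance of the log odds ratio
    weights.append(1.0 / var)
    weighted_log_or.append(log_or / var)

pooled_log_or = sum(weighted_log_or) / sum(weights)
se = math.sqrt(1.0 / sum(weights))
lo, hi = math.exp(pooled_log_or - 1.96 * se), math.exp(pooled_log_or + 1.96 * se)
print(f"pooled OR = {math.exp(pooled_log_or):.2f} (95% CI {lo:.2f}-{hi:.2f})")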
Algorithm sensitivity analysis and parameter tuning for tissue image segmentation pipelines.
Teodoro, George; Kurç, Tahsin M; Taveira, Luís F R; Melo, Alba C M A; Gao, Yi; Kong, Jun; Saltz, Joel H
2017-04-01
Sensitivity analysis and parameter tuning are important processes in large-scale image analysis. They are very costly because the image analysis workflows are required to be executed several times to systematically correlate output variations with parameter changes or to tune parameters. An integrated solution with minimum user interaction that uses effective methodologies and high performance computing is required to scale these studies to large imaging datasets and expensive analysis workflows. The experiments with two segmentation workflows show that the proposed approach can (i) quickly identify and prune parameters that are non-influential; (ii) search a small fraction (about 100 points) of the parameter search space with billions to trillions of points and improve the quality of segmentation results (Dice and Jaccard metrics) by as much as 1.42× compared to the results from the default parameters; (iii) attain good scalability on a high performance cluster with several effective optimizations. Our work demonstrates the feasibility of performing sensitivity analyses, parameter studies and auto-tuning with large datasets. The proposed framework can enable the quantification of error estimations and output variations in image segmentation pipelines. Source code: https://github.com/SBU-BMI/region-templates/ . teodoro@unb.br. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
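A toy version of the evaluation loop, scoring candidate parameter values against a reference mask with Dice and Jaccard metrics, is sketched below. It uses a synthetic image and a single threshold parameter and is not the framework or segmentation workflow from the paper.

import numpy as np

def dice(a, b):
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def jaccard(a, b):
    return np.logical_and(a, b).sum() / np.logical_or(a, b).sum()

rng = np.random.default_rng(1)
truth = rng.random((128, 128)) < 0.2                    # stand-in ground-truth mask
image = rng.normal(0.0, 1.0, (128, 128)) + truth * 3.0  # noisy synthetic image

# Sweep a single threshold parameter and keep the value with the best Dice score.
thresholds = np.linspace(0.5, 3.0, 26)
best_t = max(thresholds, key=lambda t: dice(image > t, truth))
print(f"best threshold {best_t:.2f}: Dice {dice(image > best_t, truth):.3f}, "
      f"Jaccard {jaccard(image > best_t, truth):.3f}")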
Kriston, Levente; Meister, Ramona
2014-03-01
Judging applicability (relevance) of meta-analytical findings to particular clinical decision-making situations remains challenging. We aimed to describe an evidence synthesis method that accounts for possible uncertainty regarding applicability of the evidence. We conceptualized uncertainty regarding applicability of the meta-analytical estimates to a decision-making situation as the result of uncertainty regarding applicability of the findings of the trials that were included in the meta-analysis. This trial-level applicability uncertainty can be directly assessed by the decision maker and allows for the definition of trial inclusion probabilities, which can be used to perform a probabilistic meta-analysis with unequal probability resampling of trials (adaptive meta-analysis). A case study with several fictitious decision-making scenarios was performed to demonstrate the method in practice. We present options to elicit trial inclusion probabilities and perform the calculations. The result of an adaptive meta-analysis is a frequency distribution of the estimated parameters from traditional meta-analysis that provides individually tailored information according to the specific needs and uncertainty of the decision maker. The proposed method offers a direct and formalized combination of research evidence with individual clinical expertise and may aid clinicians in specific decision-making situations. Copyright © 2014 Elsevier Inc. All rights reserved.
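One way to realize the resampling idea is sketched below: each trial carries a user-assigned inclusion probability, trials are repeatedly resampled according to those probabilities, and a fixed-effect estimate is pooled in each replicate, yielding a distribution of pooled effects. The effect sizes and probabilities are invented, and the exact resampling scheme of the paper may differ from this Bernoulli-inclusion version.

import random
import statistics

# (effect estimate, standard error, assumed applicability/inclusion probability) per trial
trials = [(0.30, 0.10, 0.9), (0.10, 0.08, 0.5), (0.45, 0.15, 0.7), (0.05, 0.12, 0.2)]

def pooled_effect(selected):
    """Fixed-effect, inverse-variance pooled estimate over the selected trials."""
    weights = [1.0 / se ** 2 for _, se, _ in selected]
    return sum(w * eff for (eff, _, _), w in zip(selected, weights)) / sum(weights)

replicates = []
for _ in range(5000):
    sample = [t for t in trials if random.random() < t[2]]
    if sample:  # skip the rare replicate in which no trial is included
        replicates.append(pooled_effect(sample))

replicates.sort()
median = statistics.median(replicates)
lo, hi = replicates[int(0.025 * len(replicates))], replicates[int(0.975 * len(replicates))]
print(f"pooled effect: median {median:.3f}, 95% interval {lo:.3f} to {hi:.3f}")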
Evolutionary computing for the design search and optimization of space vehicle power subsystems
NASA Technical Reports Server (NTRS)
Kordon, Mark; Klimeck, Gerhard; Hanks, David; Hua, Hook
2004-01-01
Evolutionary computing has proven to be a straightforward and robust approach for optimizing a wide range of difficult analysis and design problems. This paper discusses the application of these techniques to an existing space vehicle power subsystem resource and performance analysis simulation in a parallel processing environment. Our preliminary results demonstrate that this approach has the potential to improve the space system trade study process by allowing engineers to statistically weight subsystem goals of mass, cost, and performance, then automatically size power elements based on anticipated performance of the subsystem rather than on worst-case estimates.
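A minimal genetic algorithm in this spirit is sketched below: candidate power-subsystem sizings (solar array area and battery capacity) are scored with weighted mass, cost, and performance-margin terms and evolved by selection, crossover, and mutation. All sizing relations and numbers are invented for illustration; this is not the simulation or tool described above.

import random

WEIGHTS = {"mass": 0.4, "cost": 0.3, "performance": 0.3}  # engineer-chosen goal weights

def score(array_area_m2, battery_kwh):
    """Higher is better: penalize any performance shortfall, then trade mass against cost."""
    mass = 4.0 * array_area_m2 + 12.0 * battery_kwh   # kg (assumed densities)
    cost = 0.5 * array_area_m2 + 0.8 * battery_kwh    # $M (assumed)
    power_margin = 300.0 * array_area_m2 - 2000.0     # W above a 2 kW sunlit load (assumed)
    energy_margin = 1000.0 * battery_kwh - 1500.0     # Wh above eclipse demand (assumed)
    shortfall = min(0.0, power_margin) + min(0.0, energy_margin)
    return (WEIGHTS["performance"] * 100.0 * shortfall
            - WEIGHTS["mass"] * mass - WEIGHTS["cost"] * 100.0 * cost)

def evolve(generations=80, pop_size=30, sigma=0.3):
    pop = [(random.uniform(5.0, 20.0), random.uniform(1.0, 5.0)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: score(*ind), reverse=True)
        parents = pop[: pop_size // 2]          # keep the fitter half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)    # crossover by averaging, plus Gaussian mutation
            child = ((a[0] + b[0]) / 2.0 + random.gauss(0.0, sigma),
                     (a[1] + b[1]) / 2.0 + random.gauss(0.0, sigma))
            children.append((max(child[0], 0.1), max(child[1], 0.1)))
        pop = parents + children
    return max(pop, key=lambda ind: score(*ind))

area, battery = evolve()
print(f"selected sizing: {area:.1f} m^2 array, {battery:.1f} kWh battery")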
NPAC-Nozzle Performance Analysis Code
NASA Technical Reports Server (NTRS)
Barnhart, Paul J.
1997-01-01
A simple and accurate nozzle performance analysis methodology has been developed. The geometry modeling requirements are minimal and very flexible, allowing rapid design evaluations. The solution techniques accurately couple the continuity, momentum, energy, state, and other relations, permitting fast and accurate calculation of nozzle gross thrust. The control volume and internal flow analyses are capable of accounting for the effects of over/under expansion, flow divergence, wall friction, heat transfer, and mass addition/loss across surfaces. The results from the nozzle performance methodology are shown to be in excellent agreement with experimental data for a variety of nozzle designs over a range of operating conditions.
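The core gross-thrust bookkeeping, choked mass flow, exit velocity, and the pressure-area term, can be written down from one-dimensional isentropic relations. The sketch below covers only that ideal case and omits the divergence, friction, heat-transfer, and mass-addition effects that the methodology above accounts for; the inlet conditions in the example are arbitrary.

import math

def exit_mach(area_ratio, gamma=1.4, supersonic=True):
    """Invert the isentropic area-Mach relation A/A* = f(M) by bisection."""
    def ar(mach):
        term = (2.0 / (gamma + 1.0)) * (1.0 + 0.5 * (gamma - 1.0) * mach ** 2)
        return term ** ((gamma + 1.0) / (2.0 * (gamma - 1.0))) / mach
    lo, hi = (1.0, 50.0) if supersonic else (1e-6, 1.0)
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if (ar(mid) > area_ratio) == supersonic:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

def gross_thrust(p_total, t_total, a_throat, area_ratio, p_ambient, gamma=1.4, r_gas=287.0):
    """Gross thrust = exit momentum flux plus the (Pe - Pamb)*Ae pressure-area term."""
    m_exit = exit_mach(area_ratio, gamma)
    p_exit = p_total * (1.0 + 0.5 * (gamma - 1.0) * m_exit ** 2) ** (-gamma / (gamma - 1.0))
    t_exit = t_total / (1.0 + 0.5 * (gamma - 1.0) * m_exit ** 2)
    v_exit = m_exit * math.sqrt(gamma * r_gas * t_exit)
    # Choked mass flow through the throat.
    mdot = (p_total * a_throat / math.sqrt(r_gas * t_total) * math.sqrt(gamma)
            * (2.0 / (gamma + 1.0)) ** ((gamma + 1.0) / (2.0 * (gamma - 1.0))))
    return mdot * v_exit + (p_exit - p_ambient) * area_ratio * a_throat

print(f"gross thrust ~ {gross_thrust(6.0e5, 900.0, 0.05, 4.0, 101325.0) / 1000.0:.1f} kN")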
Development and Performance Analysis of a Photonics-Assisted RF Converter for 5G Applications
NASA Astrophysics Data System (ADS)
Borges, Ramon Maia; Muniz, André Luiz Marques; Sodré Junior, Arismar Cerqueira
2017-03-01
This article presents a simple, ultra-wideband and tunable radiofrequency (RF) converter for 5G cellular networks. The proposed optoelectronic device performs broadband photonics-assisted upconversion and downconversion using a single optical modulator. Experimental results demonstrate RF conversion from DC to millimeter waves, including 28 and 38 GHz that are potential frequency bands for 5G applications. Narrow linewidth and low phase noise characteristics are observed in all generated RF carriers. An experimental digital performance analysis using different modulation schemes illustrates the applicability of the proposed photonics-based device in reconfigurable optical wireless communications.
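The frequency translation itself is ordinary mixer arithmetic: multiplying an intermediate-frequency tone by a local oscillator produces sum and difference products. The short numeric demonstration below illustrates only that arithmetic; it does not model the optical modulator or any photonic hardware, and the 2 GHz and 26 GHz tones are arbitrary choices.

import numpy as np

fs = 200e9                       # sample rate, Hz
t = np.arange(0.0, 2e-7, 1 / fs) # 200 ns of signal
if_tone = np.cos(2 * np.pi * 2e9 * t)
lo_tone = np.cos(2 * np.pi * 26e9 * t)
mixed = if_tone * lo_tone        # ideal multiplicative mixer

spectrum = np.abs(np.fft.rfft(mixed))
freqs = np.fft.rfftfreq(len(mixed), 1 / fs)
top = sorted(float(f) / 1e9 for f in freqs[np.argsort(spectrum)[-2:]])
print(f"dominant mixing products: {top[0]:.1f} GHz and {top[1]:.1f} GHz")  # expect 24 and 28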
Three-dimensional analysis of the cranio-cervico-mandibular complex during piano performance.
Clemente, M; Lourenço, S; Coimbra, D; Silva, A; Gabriel, J; Pinho, Jc
2014-09-01
Piano players, as well as other musicians, spend a long time training to achieve the best results, sometimes adopting unnatural body positions that may cause musculoskeletal pain. This paper presents the preliminary results of a study targeting the analysis of the head and cervical postures of 17 piano players during musical performance. It was found, as a common feature, that the players tilt the head to the right and forward towards the score and keyboard. Players who know the score by heart tend to move their heads more compared to the ones who have to keep their eyes on the score.
Goddard high resolution spectrograph science verification and data analysis
NASA Technical Reports Server (NTRS)
1992-01-01
The data analysis performed to support the Orbital Verification (OV) and Science Verification (SV) of the GHRS was in the areas of the Digicon detector's performance and stability, wavelength calibration, and geomagnetically induced image motion. The results of the analyses are briefly described; detailed results are given in the form of attachments. Specialized software was developed for the analyses. Calibration files were formatted according to the specifications in a Space Telescope Science report. IRAS images of the Large Magellanic Cloud were restored using a blocked iterative algorithm. The algorithm works with the raw data scans without regridding or interpolating the data onto an equally spaced image grid.
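As a generic illustration of iterative restoration, and not the blocked iterative algorithm referred to above, the sketch below applies a one-dimensional Richardson-Lucy update to recover point sources from data blurred by a known point-spread function; the PSF and source values are invented.

import numpy as np

def gaussian_psf(width=9, sigma=1.5):
    x = np.arange(width) - width // 2
    psf = np.exp(-0.5 * (x / sigma) ** 2)
    return psf / psf.sum()

def richardson_lucy(observed, psf, iterations=50):
    """Iteratively refine an estimate so that, reblurred by the PSF, it matches the data."""
    estimate = np.full_like(observed, observed.mean())
    psf_flipped = psf[::-1]
    for _ in range(iterations):
        reblurred = np.convolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(reblurred, 1e-12)
        estimate *= np.convolve(ratio, psf_flipped, mode="same")
    return estimate

# Synthetic test: two point sources blurred by the PSF.
truth = np.zeros(200)
truth[[60, 140]] = [5.0, 3.0]
psf = gaussian_psf()
observed = np.convolve(truth, psf, mode="same")

restored = richardson_lucy(observed, psf)
print(f"peak at index 60: observed {observed[60]:.2f}, restored {restored[60]:.2f} (true 5.00)")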
Deductive Evaluation: Formal Code Analysis With Low User Burden
NASA Technical Reports Server (NTRS)
Di Vito, Ben L.
2016-01-01
We describe a framework for symbolically evaluating iterative C code using a deductive approach that automatically discovers and proves program properties. Although verification is not performed, the method can infer detailed program behavior. Software engineering work flows could be enhanced by this type of analysis. Floyd-Hoare verification principles are applied to synthesize loop invariants, using a library of iteration-specific deductive knowledge. When needed, theorem proving is interleaved with evaluation and performed on the fly. Evaluation results take the form of inferred expressions and type constraints for values of program variables. An implementation using PVS (Prototype Verification System) is presented along with results for sample C functions.
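As a much weaker, runtime-checked illustration of the Floyd-Hoare idea that the framework handles symbolically, the sketch below asserts a candidate loop invariant on each iteration of a simple summation loop and checks that the invariant together with the exit condition gives the postcondition; it performs no symbolic reasoning or theorem proving.

def sum_to(n):
    total, i = 0, 0
    while i < n:
        # Candidate loop invariant: total equals the sum of 0..i-1, i.e. i*(i-1)/2.
        assert total == i * (i - 1) // 2
        total += i
        i += 1
    # Invariant plus the exit condition (i == n) yields the postcondition.
    assert total == n * (n - 1) // 2
    return total

for n in range(10):
    sum_to(n)
print("candidate invariant held for all tested inputs")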