Sample records for performance analysis demonstrates

  1. Integrated corridor management initiative : demonstration phase evaluation – Dallas corridor performance analysis test plan.

    DOT National Transportation Integrated Search

    2012-08-01

    This report presents the test plan for conducting the Corridor Performance Analysis for the United States Department of Transportation (U.S. DOT) evaluation of the Dallas U.S. 75 Integrated Corridor Management (ICM) Initiative Demonstration. The ICM ...

  2. Integrated corridor management initiative : demonstration phase evaluation – San Diego corridor performance analysis test plan.

    DOT National Transportation Integrated Search

    2012-08-01

    This report presents the test plan for conducting the Corridor Performance Analysis for the United States Department of Transportation (U.S. DOT) evaluation of the San Diego Integrated Corridor Management (ICM) Initiative Demonstration. The ICM proje...

  3. DNA Fingerprinting in a Forensic Teaching Experiment

    ERIC Educational Resources Information Center

    Wagoner, Stacy A.; Carlson, Kimberly A.

    2008-01-01

    This article presents an experiment designed to provide students, in a classroom laboratory setting, a hands-on demonstration of the steps used in DNA forensic analysis by performing DNA extraction, DNA fingerprinting, and statistical analysis of the data. This experiment demonstrates how DNA fingerprinting is performed and how long it takes. It…
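
    The statistical step this experiment walks students through can be sketched with the classic "product rule": per-locus genotype frequencies, assumed independent, are multiplied into a random match probability. A minimal illustration in Python, using hypothetical allele frequencies rather than real population data:

```python
# Illustrative sketch of the statistical step in DNA fingerprinting:
# the "product rule" multiplies per-locus genotype frequencies
# (assumed independent) into a random match probability.
# The allele frequencies below are hypothetical, not real population data.

def genotype_frequency(p, q):
    """Hardy-Weinberg genotype frequency: 2pq for heterozygotes, p^2 for homozygotes."""
    return p * p if p == q else 2 * p * q

def random_match_probability(loci):
    """loci: list of (p, q) allele-frequency pairs observed at each locus."""
    rmp = 1.0
    for p, q in loci:
        rmp *= genotype_frequency(p, q)
    return rmp

# Three hypothetical loci: two heterozygous, one homozygous.
profile = [(0.1, 0.2), (0.05, 0.3), (0.15, 0.15)]
rmp = random_match_probability(profile)
print(f"Random match probability: {rmp:.2e}")
```

    With these made-up frequencies the profile would be expected in roughly 1 in 37,000 unrelated individuals; real casework uses many more loci and validated frequency databases.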

  4. Rotor design optimization using a free wake analysis

    NASA Technical Reports Server (NTRS)

    Quackenbush, Todd R.; Boschitsch, Alexander H.; Wachspress, Daniel A.; Chua, Kiat

    1993-01-01

    The aim of this effort was to develop a comprehensive performance optimization capability for tiltrotor and helicopter blades. The analysis incorporates the validated EHPIC (Evaluation of Hover Performance using Influence Coefficients) model of helicopter rotor aerodynamics within a general linear/quadratic programming algorithm that allows optimization using a variety of objective functions involving rotor performance. The resulting computer code, EHPIC/HERO (HElicopter Rotor Optimization), improves upon several features of the previous EHPIC performance model and allows optimization utilizing a wide spectrum of design variables, including twist, chord, anhedral, and sweep. The new analysis supports optimization of a variety of objective functions, including weighted measures of rotor thrust, power, and propulsive efficiency. The fundamental strength of the approach is that an efficient search for improved versions of the baseline design can be carried out while retaining the demonstrated accuracy inherent in the EHPIC free wake/vortex lattice performance analysis. Sample problems are described that demonstrate the success of this approach for several representative rotor configurations in hover and axial flight. Features that were introduced to convert earlier demonstration versions of this analysis into a generally applicable tool for researchers and designers are also discussed.

  5. ESTCP Munitions Response Live Site Demonstrations Southwest Proving Ground, Arkansas Andersen Air Force Base, Guam Cost and Performance Report

    DTIC Science & Technology

    2017-09-01

    ...sites and assess the performance and cost benefits of implementing AGC technologies. OBJECTIVES OF THE DEMONSTRATION: The demonstrations were...

  6. Performance analysis of a Principal Component Analysis ensemble classifier for Emotiv headset P300 spellers.

    PubMed

    Elsawy, Amr S; Eldawlatly, Seif; Taher, Mohamed; Aly, Gamal M

    2014-01-01

    The current trend to use Brain-Computer Interfaces (BCIs) with mobile devices mandates the development of efficient EEG data processing methods. In this paper, we demonstrate the performance of a Principal Component Analysis (PCA) ensemble classifier for P300-based spellers. We recorded EEG data from multiple subjects using the Emotiv neuroheadset in the context of a classical oddball P300 speller paradigm. We compare the performance of the proposed ensemble classifier to the performance of traditional feature extraction and classifier methods. Our results demonstrate the capability of the PCA ensemble classifier to classify P300 data recorded using the Emotiv neuroheadset with an average accuracy of 86.29% on cross-validation data. In addition, offline testing of the recorded data reveals an average classification accuracy of 73.3% that is significantly higher than that achieved using traditional methods. Finally, we demonstrate the effect of the parameters of the P300 speller paradigm on the performance of the method.
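
    The pipeline this abstract describes, PCA feature extraction feeding an ensemble decision, can be sketched in pure Python. The following is a generic illustration on tiny synthetic data, not the authors' classifier and not real EEG:

```python
# Minimal sketch of the two ingredients the abstract combines: PCA for
# feature extraction and a majority vote across base classifiers.
# Synthetic 2-D "epochs" stand in for real EEG feature vectors.
import random

def first_principal_component(data, iters=100):
    """Top principal component via power iteration on the covariance matrix."""
    dim = len(data[0])
    means = [sum(row[j] for row in data) / len(data) for j in range(dim)]
    centered = [[row[j] - means[j] for j in range(dim)] for row in data]
    cov = [[sum(r[i] * r[j] for r in centered) / (len(data) - 1)
            for j in range(dim)] for i in range(dim)]
    v = [1.0] * dim
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(dim)) for i in range(dim)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

def majority_vote(votes):
    """Ensemble decision: 1 if more than half the base classifiers say 1."""
    return 1 if sum(votes) > len(votes) / 2 else 0

random.seed(0)
# Synthetic 2-D features whose variance lies along the (1, 1) direction.
data = [[x, x + random.gauss(0, 0.05)] for x in
        [random.uniform(-1, 1) for _ in range(300)]]
pc = first_principal_component(data)
print(pc)  # close to (0.707, 0.707) up to sign
print(majority_vote([1, 0, 1]))
```

    A real P300 speller would project multi-channel epochs onto several components and train one classifier per data partition before voting; this sketch only shows the skeleton of that idea.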

  7. Assessing Organizational Effectiveness: The Role of Performance Measures

    ERIC Educational Resources Information Center

    Matthews, Joseph R.

    2011-01-01

    A brief overview of the challenges associated with demonstrating organizational effectiveness and the role of performance measures as surrogates for demonstrating effectiveness are provided. The complexity of analysis and the importance of the use of performance measures provide a way to review the strengths and weaknesses of eight different ways to…

  8. Exploration Laboratory Analysis

    NASA Technical Reports Server (NTRS)

    Krihak, M.; Ronzano, K.; Shaw, T.

    2016-01-01

    The Exploration Laboratory Analysis (ELA) project supports mitigation of the Exploration Medical Capability (ExMC) risk of adverse health outcomes and decrements in performance due to limitations of in-flight medical capabilities on human exploration missions. To mitigate this risk, the availability of inflight laboratory analysis instrumentation has been identified as an essential capability for manned exploration missions. Since a single, compact space-ready laboratory analysis capability to perform all exploration clinical measurements is not commercially available, the ELA project objective is to demonstrate the feasibility of emerging operational and analytical capability as a biomedical diagnostics precursor to long duration manned exploration missions. The initial step towards ground and flight demonstrations in fiscal year (FY) 2015 was the downselection of platform technologies for demonstrations in the space environment. The technologies selected included two Small Business Innovation Research (SBIR) performers: DNA Medicine Institute's rHEALTH X and Intelligent Optical Systems' lateral flow assays combined with Holomic's smartphone analyzer. The selection of these technologies was based on their compact size, breadth of analytical capability and favorable ability to process fluids in a space environment, among several factors. These two technologies will be advanced to meet ground and flight demonstration success criteria and requirements. The technology demonstrations and metrics for success will be finalized in FY16. Also, the downselected performers will continue the technology development phase towards meeting prototype deliverables in either late 2016 or 2017.

  9. Step 1: C3 Flight Demo Data Analysis Plan

    NASA Technical Reports Server (NTRS)

    2005-01-01

    The Data Analysis Plan (DAP) describes the data analysis that the C3 Work Package (WP) will perform in support of the Access 5 Step 1 C3 flight demonstration objectives, as well as the processes that will be used by the Flight IPT to gather and distribute the data collected to satisfy those objectives. In addition to C3 requirements, this document will encompass some Human Systems Interface (HSI) requirements in performing the C3 flight demonstrations. The C3 DAP will be used as the primary interface requirements document between the C3 Work Package and Flight Test organizations (Flight IPT and Non-Access 5 Flight Programs). In addition to providing data requirements for Access 5 flight test (piggyback technology demonstration flights, dedicated C3 technology demonstration flights, and Airspace Operations Demonstration flights), the C3 DAP will be used to request flight data from Non-Access 5 flight programs for C3-related data products.

  10. Application of tissue mesodissection to molecular cancer diagnostics.

    PubMed

    Krizman, David; Adey, Nils; Parry, Robert

    2015-02-01

    To demonstrate clinical application of a mesodissection platform that was developed to combine advantages of laser-based instrumentation with the speed/ease of manual dissection for automated dissection of tissue off standard glass slides. Genomic analysis for KRAS gene mutation was performed on formalin-fixed paraffin-embedded (FFPE) cancer patient tissue that was dissected using the mesodissection platform. Selected reaction monitoring proteomic analysis for quantitative Her2 protein expression was performed on FFPE patient tumour tissue dissected by a laser-based instrument and the MilliSect instrument. Genomic analysis demonstrates highly confident detection of KRAS mutation specifically in lung cancer cells and not the surrounding benign, non-tumour tissue. Proteomic analysis demonstrates Her2 quantitative protein expression in breast cancer cells dissected manually, by laser-based instrumentation, and by MilliSect instrumentation (mesodissection). Slide-mounted tissue dissection is commonly performed using laser-based instruments or by manually scraping tissue with a scalpel. Here we demonstrate that the mesodissection platform, as performed by the MilliSect instrument for tissue dissection, is cost-effective; it functions comparably to laser-based dissection and can be adopted into a clinical diagnostic workflow. Published by the BMJ Publishing Group Limited.

  11. Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.

    2016-01-01

    A new engine cycle analysis tool, called Pycycle, was recently built using the OpenMDAO framework. This tool uses equilibrium chemistry based thermodynamics, and provides analytic derivatives. This allows for stable and efficient use of gradient-based optimization and sensitivity analysis methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a multi-point turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
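
    The abstract's central point, that analytic derivatives avoid the step-size error and extra evaluations of finite differencing, can be illustrated on a toy objective. This is a generic sketch, not Pycycle or OpenMDAO code:

```python
# Generic illustration of why analytic derivatives help gradient-based
# optimization: they are exact, while finite differences carry an error
# proportional to the step size h. Toy objective, made-up values.

def f(x):
    return (x - 3.0) ** 2 + 2.0

def df_analytic(x):
    return 2.0 * (x - 3.0)       # exact derivative of f

def df_finite(x, h=1e-6):
    return (f(x + h) - f(x)) / h  # forward difference, O(h) error

# Simple gradient descent using the analytic derivative.
x = 0.0
for _ in range(200):
    x -= 0.1 * df_analytic(x)
print(f"minimizer ~ {x:.6f}, f(x) = {f(x):.6f}")

# Away from the optimum the two gradients agree only to within ~h:
err = abs(df_analytic(1.0) - df_finite(1.0))
print(f"finite-difference error at x=1: {err:.1e}")
```

    On a multi-point engine cycle model the same contrast is amplified: each finite-difference gradient entry costs an extra converged cycle analysis, while analytic derivatives come almost for free once the model is set up to provide them.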

  12. Predicting performance with traffic analysis tools : final report.

    DOT National Transportation Integrated Search

    2008-03-01

    This document provides insights into the common pitfalls and challenges associated with use of traffic analysis tools for predicting future performance of a transportation facility. It provides five in-depth case studies that demonstrate common ways ...

  13. Rapid Energy Modeling Workflow Demonstration Project

    DTIC Science & Technology

    2014-01-01

    ...Conditioning Engineers; BIM: Building Information Model; BLCC: building life cycle costs; BPA: Building Performance Analysis; CAD: computer assisted... invited to enroll in the Autodesk Building Performance Analysis (BPA) Certificate Program under a group specifically for DoD installation

  14. Homemade Bienzymatic-Amperometric Biosensor for Beverages Analysis

    ERIC Educational Resources Information Center

    Blanco-Lopez, M. C.; Lobo-Castanon, M. J.; Miranda-Ordieres, A. J.

    2007-01-01

    The construction of an amperometric biosensor for glucose analysis is described, demonstrating that the analysis is easy to perform and that the biosensor gives good analytical performance. This experiment helped the students to acquire problem-solving and teamwork skills, allowing them to reach a high level of independent and critical thought.

  15. Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.

    2016-01-01

    A new engine cycle analysis tool, called Pycycle, was built using the OpenMDAO framework. Pycycle provides analytic derivatives allowing for an efficient use of gradient-based optimization methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.

  16. Exploration Laboratory Analysis

    NASA Technical Reports Server (NTRS)

    Krihak, M.; Ronzano, K.; Shaw, T.

    2016-01-01

    The Exploration Laboratory Analysis (ELA) project supports mitigation of the Exploration Medical Capability (ExMC) risk of adverse health outcomes and decrements in performance due to limitations of in-flight medical capabilities on human exploration missions. To mitigate this risk, the availability of inflight laboratory analysis instrumentation has been identified as an essential capability for manned exploration missions. Since a single, compact space-ready laboratory analysis capability to perform all exploration clinical measurements is not commercially available, the ELA project objective is to demonstrate the feasibility of emerging operational and analytical capability as a biomedical diagnostics precursor to long duration manned exploration missions. The initial step towards ground and flight demonstrations in fiscal year (FY) 2015 was the downselection of platform technologies for demonstrations in the space environment. The technologies selected included two Small Business Innovation Research (SBIR) performers: DNA Medicine Institute's rHEALTH X and Intelligent Optical Systems' lateral flow assays combined with Holomic's smartphone analyzer. The selection of these technologies was based on their compact size, breadth of analytical capability, and favorable ability to process fluids in a space environment, among several factors. These two technologies will be advanced to meet ground and flight demonstration success criteria and requirements that will be finalized in FY16. Also, the downselected performers will continue the technology development phase towards meeting prototype deliverables in either late 2016 or 2017.

  17. Determination of awareness in patients with severe brain injury using EEG power spectral analysis

    PubMed Central

    Goldfine, Andrew M.; Victor, Jonathan D.; Conte, Mary M.; Bardin, Jonathan C.; Schiff, Nicholas D.

    2011-01-01

    Objective: To determine whether EEG spectral analysis could be used to demonstrate awareness in patients with severe brain injury. Methods: We recorded EEG from healthy controls and three patients with severe brain injury, ranging from minimally conscious state (MCS) to locked-in state (LIS), while they were asked to imagine motor and spatial navigation tasks. We assessed EEG spectral differences from 4 to 24 Hz with univariate comparisons (individual frequencies) and multivariate comparisons (patterns across the frequency range). Results: In controls, EEG spectral power differed at multiple frequency bands and channels during performance of both tasks compared to a resting baseline. As patterns of signal change were inconsistent between controls, we defined a positive response in patient subjects as consistent spectral changes across task performances. One patient in MCS and one in LIS showed evidence of motor imagery task performance, though with patterns of spectral change different from the controls. Conclusion: EEG power spectral analysis demonstrates evidence for performance of mental imagery tasks in healthy controls and patients with severe brain injury. Significance: EEG power spectral analysis can be used as a flexible bedside tool to demonstrate awareness in brain-injured patients who are otherwise unable to communicate. PMID:21514214

  18. Future applications of associative processor systems to operational KSC systems for optimizing cost and enhancing performance characteristics

    NASA Technical Reports Server (NTRS)

    Perkinson, J. A.

    1974-01-01

    The application of associative memory processor equipment to conventional host-processor systems is discussed. Efforts were made to demonstrate how such application relieves the task burden of conventional systems and enhances system speed and efficiency. Data cover comparative theoretical performance analysis, demonstration of expanded growth capabilities, and demonstration of actual hardware in a simulated environment.

  19. Distributed optical fiber vibration sensor based on spectrum analysis of Polarization-OTDR system.

    PubMed

    Zhang, Ziyi; Bao, Xiaoyi

    2008-07-07

    A fully distributed optical fiber vibration sensor based on spectrum analysis of a Polarization-OTDR system is demonstrated. Without performing any data averaging, detection of vibration disturbances up to 5 kHz is successfully demonstrated on a 1 km fiber link with 10 m spatial resolution. The FFT is performed at each spatial resolution cell; relating the disturbance at each frequency component to location allows simultaneous detection of multiple events with different or identical frequency components.
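
    The per-location frequency analysis described in this record reduces to taking a spectrum of the time series collected at each spatial resolution cell and reading off its dominant peak. A plain-DFT sketch on synthetic data follows; the sample rate and vibration frequencies are invented for illustration, not the paper's setup:

```python
# Sketch of per-location vibration detection: each fiber location yields
# a time series, and a DFT of that series reveals the local vibration
# frequency. Sample rate and frequencies below are assumed values.
import cmath
import math

def dft_magnitudes(samples):
    """Magnitudes of the first n/2 DFT bins (pure-Python, O(n^2))."""
    n = len(samples)
    return [abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) for k in range(n // 2)]

fs = 1000.0  # samples per second (assumed)
n = 200
t = [i / fs for i in range(n)]
# Two fiber locations vibrating at different frequencies.
location_a = [math.sin(2 * math.pi * 50 * ti) for ti in t]   # 50 Hz event
location_b = [math.sin(2 * math.pi * 120 * ti) for ti in t]  # 120 Hz event

def dominant_hz(samples):
    """Frequency of the strongest non-DC bin, in Hz."""
    mags = dft_magnitudes(samples)
    k = max(range(1, len(mags)), key=lambda i: mags[i])
    return k * fs / len(samples)

print(dominant_hz(location_a), dominant_hz(location_b))
```

    Because each location is analyzed independently, two events with the same frequency at different positions remain separable, which is the property the abstract highlights.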

  20. Plug cluster module demonstration

    NASA Technical Reports Server (NTRS)

    Rousar, D. C.

    1978-01-01

    The low pressure, film cooled rocket engine design concept developed during two previous ALRC programs was re-evaluated for application as a module for a plug cluster engine capable of performing space shuttle OTV missions. The nominal engine mixture ratio was 5.5 and the engine life requirements were 1200 thermal cycles and 10 hours total operating life. The program consisted of pretest analysis; engine tests, performed using residual components; and posttest analysis. The pretest analysis indicated that operation of the film cooled engine at O/F = 5.5 was feasible. During the engine tests, steady state wall temperature and performance measurements were obtained over a range of film cooling flow rates, and the durability of the engine was demonstrated by firing the test engine 1220 times at a nominal performance ranging from 430 to 432 seconds. The performance of the test engine was limited by film coolant sleeve damage which had occurred during previous testing. The posttest analyses indicated that the nominal performance level can be increased to 436 seconds.

  1. Performance-Approach Goal Effects Depend on How They Are Defined: Meta-Analytic Evidence from Multiple Educational Outcomes

    ERIC Educational Resources Information Center

    Senko, Corwin; Dawson, Blair

    2017-01-01

    Achievement goal theory originally defined performance-approach goals as striving to demonstrate competence to outsiders by outperforming peers. The research, however, has operationalized the goals inconsistently, emphasizing the competence demonstration element in some cases and the peer comparison element in others. A meta-analysis by Hulleman…

  2. Collection, processing and dissemination of data for the national solar demonstration program

    NASA Technical Reports Server (NTRS)

    Day, R. E.; Murphy, L. J.; Smok, J. T.

    1978-01-01

    A national solar data system developed for the DOE by IBM provides for automatic gathering, conversion, transfer, and analysis of demonstration site data. NASA requirements for this system include providing solar site hardware, engineering, data collection, and analysis. The specific tasks include: (1) solar energy system design/integration; (2) developing a site data acquisition subsystem; (3) developing a central data processing system; (4) operating the test facility at Marshall Space Flight Center; (5) collecting and analyzing data. The systematic analysis and evaluation of the data from the National Solar Data System is reflected in a monthly performance report and a solar energy system performance evaluation report.

  3. Develop feedback system for intelligent dynamic resource allocation to improve application performance.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gentile, Ann C.; Brandt, James M.; Tucker, Thomas

    2011-09-01

    This report provides documentation for the completion of the Sandia Level II milestone 'Develop feedback system for intelligent dynamic resource allocation to improve application performance'. This milestone demonstrates the use of a scalable data collection, analysis, and feedback system that enables insight into how an application is utilizing the hardware resources of a high performance computing (HPC) platform in a lightweight fashion. Further, we demonstrate utilizing the same mechanisms used for transporting data for remote analysis and visualization to provide low-latency run-time feedback to applications. The ultimate goal of this body of work is performance optimization in the face of the ever-increasing size and complexity of HPC systems.

  4. A case study in nonconformance and performance trend analysis

    NASA Technical Reports Server (NTRS)

    Maloy, Joseph E.; Newton, Coy P.

    1990-01-01

    As part of NASA's effort to develop an agency-wide approach to trend analysis, a pilot nonconformance and performance trending analysis study was conducted on the Space Shuttle auxiliary power unit (APU). The purpose of the study was to (1) demonstrate that nonconformance analysis can be used to identify repeating failures of a specific item (and the associated failure modes and causes) and (2) determine whether performance parameters could be analyzed and monitored to provide an indication of component or system degradation prior to failure. The nonconformance analysis of the APU did identify repeating component failures, which possibly could be reduced if key performance parameters were monitored and analyzed. The performance-trending analysis verified that the characteristics of hardware parameters can be effective in detecting degradation of hardware performance prior to failure.
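
    The performance-trending idea, monitoring a parameter over successive uses and flagging degradation before failure, reduces at its simplest to a least-squares trend test. A minimal sketch with made-up numbers (the parameter name and alarm threshold are illustrative, not APU flight data):

```python
# Minimal performance-trending sketch: fit a least-squares line to a
# monitored parameter and flag degradation when the slope crosses a
# threshold. All values are hypothetical.

def slope(xs, ys):
    """Least-squares slope of ys against xs."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

flights = list(range(10))
turbine_speed_margin = [10.0 - 0.4 * f for f in flights]  # steadily degrading
s = slope(flights, turbine_speed_margin)
degrading = s < -0.1  # hypothetical alarm threshold
print(f"trend slope = {s:.2f} per flight, degrading = {degrading}")
```

    A production trending system would add noise handling, confidence bounds on the slope, and per-failure-mode thresholds, but the detection logic is the same.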

  5. Arcjet thruster research and technology

    NASA Technical Reports Server (NTRS)

    Makel, Darby B.; Cann, Gordon L.

    1988-01-01

    The design, analysis, and performance testing of an advanced low power arcjet is described. A high impedance, vortex stabilized 1-kW class arcjet has been studied. A baseline research thruster has been built and endurance and performance tested. This advanced arcjet has demonstrated long-lifetime characteristics but lower than expected performance. Analysis of the specific design has identified modifications which should improve performance and maintain the long lifetime shown by the arcjet.

  6. Paramedir: A Tool for Programmable Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers in order to be transferred to novice users.
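
    The notion of "programmable" metric calculation can be sketched as derived metrics defined as expressions over raw trace counters, so an expert's analysis rules become reusable definitions. This mimics the idea only; it is not Paramedir's actual configuration syntax, and the counter values are invented:

```python
# Sketch of programmable performance metrics: derived metrics are
# expressions over raw counters, so expert knowledge is captured as
# reusable definitions. Counter values are hypothetical.
counters = {"instructions": 2.4e9, "cycles": 3.0e9, "l2_misses": 1.2e7}

metric_defs = {
    # instructions retired per cycle
    "ipc": lambda c: c["instructions"] / c["cycles"],
    # L2 misses per thousand instructions
    "mpki": lambda c: c["l2_misses"] / (c["instructions"] / 1000.0),
}

metrics = {name: fn(counters) for name, fn in metric_defs.items()}
print(metrics)
```

    Adding a new analysis then means adding one entry to `metric_defs` rather than writing new trace-processing code, which is the automation benefit the abstract describes.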

  7. Brief Experimental Analyses of Academic Performance: Introduction to the Special Series

    ERIC Educational Resources Information Center

    McComas, Jennifer J.; Burns, Matthew K.

    2009-01-01

    Academic skills are frequent concerns in K-12 schools that could benefit from the application of applied behavior analysis (ABA). Brief experimental analysis (BEA) of academic performance is perhaps the most promising approach to apply ABA to student learning. Although research has consistently demonstrated the effectiveness of academic…

  8. Deployable antenna phase A study

    NASA Technical Reports Server (NTRS)

    Schultz, J.; Bernstein, J.; Fischer, G.; Jacobson, G.; Kadar, I.; Marshall, R.; Pflugel, G.; Valentine, J.

    1979-01-01

    Applications for large deployable antennas were re-examined, flight demonstration objectives were defined, the flight article (antenna) was preliminarily designed, and the flight program and ground development program, including the support equipment, were defined for a proposed space transportation system flight experiment to demonstrate a large (50 to 200 meter) deployable antenna system. Tasks described include: (1) performance requirements analysis; (2) system design and definition; (3) orbital operations analysis; and (4) programmatic analysis.

  9. Automatic differentiation for design sensitivity analysis of structural systems using multiple processors

    NASA Technical Reports Server (NTRS)

    Nguyen, Duc T.; Storaasli, Olaf O.; Qin, Jiangning; Qamar, Ramzi

    1994-01-01

    An automatic differentiation tool (ADIFOR) is incorporated into a finite element based structural analysis program for shape and non-shape design sensitivity analysis of structural systems. The entire analysis and sensitivity procedures are parallelized and vectorized for high performance computation. Small scale examples to verify the accuracy of the proposed program and a medium scale example to demonstrate the parallel vector performance on multiple CRAY C90 processors are included.

  10. Application of structured analysis to a telerobotic system

    NASA Technical Reports Server (NTRS)

    Dashman, Eric; Mclin, David; Harrison, F. W.; Soloway, Donald; Young, Steven

    1990-01-01

    The analysis and evaluation of a multiple arm telerobotic research and demonstration system developed by the NASA Intelligent Systems Research Laboratory (ISRL) is described. Structured analysis techniques were used to develop a detailed requirements model of an existing telerobotic testbed. Performance models generated during this process were used to further evaluate the total system. A commercial CASE tool called Teamwork was used to carry out the structured analysis and development of the functional requirements model. A structured analysis and design process using the ISRL telerobotic system as a model is described. Evaluation of this system focused on the identification of bottlenecks in this implementation. The results demonstrate that the use of structured methods and analysis tools can give useful performance information early in a design cycle. This information can be used to ensure that the proposed system meets its design requirements before it is built.

  11. Economic Evaluation of Observatory Solar-Energy System

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Long-term economic performance of a commercial solar-energy system was analyzed and used to predict economic performance at four additional sites. Analysis described in report was done to demonstrate viability of design over a broad range of environmental/economic conditions. Topics covered are system description, study approach, economic analysis and system optimization.
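
    The kind of economic analysis described here typically discounts a stream of annual energy savings against installed cost. A back-of-envelope net-present-value sketch with entirely hypothetical figures:

```python
# Back-of-envelope solar-energy economics: net present value of annual
# energy savings against installed cost. All figures are hypothetical.

def npv(rate, cashflows):
    """cashflows[0] is year 0 (installation); later entries are annual savings."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cashflows))

installed_cost = 20000.0  # assumed system cost, $
annual_savings = 1800.0   # assumed fuel savings per year, $
years = 20
discount_rate = 0.05

flows = [-installed_cost] + [annual_savings] * years
value = npv(discount_rate, flows)
print(f"20-year NPV at 5%: ${value:,.0f}")
```

    Sensitivity to site conditions, the theme of the abstract, enters through `annual_savings` (insolation, fuel prices) and `discount_rate`, so the same calculation is rerun per site.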

  12. Demonstrating the Financial Benefit of Human Resource Development: Status and Update on the Theory and Practice.

    ERIC Educational Resources Information Center

    Swanson, Richard A.

    1998-01-01

    A research review identified findings about the financial analysis method, forecasting of the financial benefits of human resource development (HRD), and recent financial analysis research: (1) HRD embedded in a performance improvement framework yielded high return on investment; and (2) HRD interventions focused on performance variables forecast…

  13. NDARC - NASA Design and Analysis of Rotorcraft Validation and Demonstration

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2010-01-01

    Validation and demonstration results from the development of the conceptual design tool NDARC (NASA Design and Analysis of Rotorcraft) are presented. The principal tasks of NDARC are to design a rotorcraft to satisfy specified design conditions and missions, and then analyze the performance of the aircraft for a set of off-design missions and point operating conditions. The aircraft chosen as NDARC development test cases are the UH-60A single main-rotor and tail-rotor helicopter, the CH-47D tandem helicopter, the XH-59A coaxial lift-offset helicopter, and the XV-15 tiltrotor. These aircraft were selected because flight performance data, a weight statement, detailed geometry information, and a correlated comprehensive analysis model are available for each. Validation consists of developing the NDARC models for these aircraft by using geometry and weight information, airframe wind tunnel test data, engine decks, rotor performance tests, and comprehensive analysis results; and then comparing the NDARC results for aircraft and component performance with flight test data. Based on the calibrated models, the capability of the code to size rotorcraft is explored.

  14. Measurement uncertainty analysis techniques applied to PV performance measurements

    NASA Astrophysics Data System (ADS)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis, but uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined here as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
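
    The interval estimate this abstract describes is commonly computed by combining random and systematic components in quadrature and applying a coverage factor. A minimal sketch with illustrative numbers; the readings, the 0.15 bias limit, and the k = 2 coverage factor are all assumptions:

```python
# Sketch of a basic uncertainty-interval calculation: combine systematic
# and random components into an expanded uncertainty about the mean.
# All numbers are illustrative only.
import statistics

readings = [100.2, 99.8, 100.1, 100.4, 99.9, 100.0]  # repeated measurements
mean = statistics.mean(readings)
s = statistics.stdev(readings)
random_u = s / len(readings) ** 0.5  # standard error of the mean
systematic_u = 0.15                  # assumed instrument bias limit
combined = (systematic_u ** 2 + random_u ** 2) ** 0.5
expanded = 2 * combined              # coverage factor k = 2 (~95%)
print(f"{mean:.2f} +/- {expanded:.2f}")
```

    The pre-test/post-test distinction in the abstract maps onto running this calculation with estimated inputs before the experiment (to size it) and with measured ones afterwards (to validate the data).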

  15. Thermodynamic Analysis of Dual-Mode Scramjet Engine Operation and Performance

    NASA Technical Reports Server (NTRS)

    Riggins, David; Tacket, Regan; Taylor, Trent; Auslender, Aaron

    2006-01-01

    Recent analytical advances in understanding the performance continuum (the thermodynamic spectrum) for air-breathing engines based on fundamental second-law considerations have clarified scramjet and ramjet operation, performance, and characteristics. Second-law based analysis is extended specifically in this work to clarify and describe the performance characteristics for dual-mode scramjet operation in the mid-speed range of flight Mach 4 to 7. This is done by a fundamental investigation of the complex but predictable interplay between heat release and irreversibilities in such an engine; results demonstrate the flow and performance character of the dual mode regime and of dual mode transition behavior. Both analytical and computational (multi-dimensional CFD) studies of sample dual-mode flow-fields are performed in order to demonstrate the second-law capability and performance and operability issues. The impact of the dual-mode regime is found to be characterized by decreasing overall irreversibility with increasing heat release, within the operability limits of the system.
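
    The second-law bookkeeping referred to above rests on the Gouy-Stodola relation: work potential destroyed equals ambient temperature times entropy generated. A toy single-number calculation with hypothetical values, not the paper's multi-dimensional CFD analysis:

```python
# Gouy-Stodola sketch: heat transfer across a finite temperature
# difference generates entropy, and lost work = T0 * s_gen.
# All numbers are hypothetical, chosen only for illustration.
T0 = 220.0        # ambient static temperature, K (assumed)
Q = 500e3         # heat added per kg of air, J/kg (assumed)
T_flame = 2200.0  # effective source temperature, K (assumed)
T_gas = 1400.0    # mean gas temperature during heat addition, K (assumed)

s_gen = Q / T_gas - Q / T_flame  # entropy generated, J/(kg K)
lost_work = T0 * s_gen           # work potential destroyed, J/kg
print(f"s_gen = {s_gen:.1f} J/(kg K), lost work = {lost_work / 1e3:.1f} kJ/kg")
```

    Raising the mean temperature of heat addition shrinks `s_gen`, which is one face of the interplay between heat release and irreversibility that the abstract analyzes across the dual-mode regime.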

  16. Performance Analysis and Electronics Packaging of the Optical Communications Demonstrator

    NASA Technical Reports Server (NTRS)

    Jeganathan, M.; Monacos, S.

    1998-01-01

    The Optical Communications Demonstrator (OCD), under development at the Jet Propulsion Laboratory (JPL), is a laboratory-based lasercomm terminal designed to validate several key technologies, primarily precision beam pointing, high bandwidth tracking, and beacon acquisition.

  17. ITS pilot project demonstration program summary report

    DOT National Transportation Integrated Search

    2008-12-01

    The purpose of this contract is to conduct a technical performance analysis and operational evaluation of ITS projects demonstrated at the Innovative Mobility Experience Showcase in conjunction with the 2005 ITS World Congress. Some of these proj...

  18. Thermal imager for dismounted infantry

    NASA Astrophysics Data System (ADS)

    Bigwood, Christopher R.; Eccles, Lee; Jones, Arwyn O.; Jones, Berwyn; Meakin, David L.; Rickard, Steve; Robinson, Rob

    2004-12-01

    Thermal Imager for Dismounted Infantry (TIDI) is a UK MOD / Thales Optics Ltd. jointly funded technology demonstrator programme and is part of the overall programme managed by QinetiQ. The aim of this programme is to evaluate and demonstrate a cost-effective route to equipping the infantry soldier with a small, lightweight, rugged, short-range, weapon-mounted thermal imaging sight intended for mass deployment. TIDI is an unusual programme in that the requirement was not rigidly defined in terms of a detailed specification. Instead, the requirement was expressed in terms of the question 'What weapon sight performance can be achieved for a volume production cost of 5000 Euro?' This requirement was subject to the constraints that the sight mass should be less than 500 g and the volume should be less than 500 ml. To address the requirements of this programme, Thales Optics Ltd. have performed a detailed trade-off analysis considering alternative uncooled LWIR sensor formats and technologies. The effect of using alternative sensors on the sight cost, mass, volume, power and performance has been compared. A design study has been performed concentrating on simplification of the optics, mechanics and electronics to minimise the overall sight complexity. Based on this analysis, a demonstrator sight has been designed that is cost effective and suitable for volume manufacture, whilst still offering useful performance to the user. Six technical demonstrator units based on this design have been manufactured and evaluated. This paper will give an overview of the work completed to date on the TIDI programme, including a description of the demonstrator hardware and its performance.

  19. Demonstration and Evaluation of Solid Phase Microextraction for the Assessment of Bioavailability and Contaminant Mobility. ESTCP Cost and Performance Report

    DTIC Science & Technology

    2012-08-01

    subsequent chemical analysis (into acetonitrile for high-performance liquid chromatography [HPLC] analysis or hexane for gas chromatography [GC...analysis) is rapid and complete. In this work, PAHs were analyzed by Waters 2795 HPLC with fluorescent detection (USEPA Method 8310) and PCBs were...detection limits by direct water injection versus SPME with PDMS and coefficient of variation and correlation coefficient for SPME. Analysis by HPLC

  20. Multi-ingredients determination and fingerprint analysis of leaves from Ilex latifolia using ultra-performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry.

    PubMed

    Fan, Chunlin; Deng, Jiewei; Yang, Yunyun; Liu, Junshan; Wang, Ying; Zhang, Xiaoqi; Fai, Kuokchiu; Zhang, Qingwen; Ye, Wencai

    2013-10-01

    An ultra-performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry (UPLC-QTOF-MS) method integrating multi-ingredients determination and fingerprint analysis has been established for quality assessment and control of leaves from Ilex latifolia. The method possesses the advantages of speed, efficiency, and accuracy, and allows multi-ingredients determination and fingerprint analysis in one chromatographic run within 13 min. Multi-ingredients determination was performed based on the extracted ion chromatograms of the exact pseudo-molecular ions (with a 0.01 Da window), and fingerprint analysis was performed based on the base peak chromatograms, obtained by negative-ion electrospray ionization QTOF-MS. The method validation results demonstrated that the developed method possesses desirable specificity, linearity, precision and accuracy. The method was utilized to analyze 22 I. latifolia samples from different origins. The quality assessment was achieved by using both similarity analysis (SA) and principal component analysis (PCA), and the results from SA were consistent with those from PCA. Our experimental results demonstrate that the strategy integrating multi-ingredients determination and fingerprint analysis using the UPLC-QTOF-MS technique is a useful approach for rapid pharmaceutical analysis, with promising prospects for the differentiation of origin, the determination of authenticity, and the overall quality assessment of herbal medicines. Copyright © 2013 Elsevier B.V. All rights reserved.
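
    The extracted-ion-chromatogram step described above is easy to picture in code: for each scan, sum the intensities of all peaks whose m/z falls within the stated ±0.01 Da window of a target ion. A hedged sketch; the scan data and target mass below are synthetic, not from the study:

```python
import numpy as np

def extracted_ion_chromatogram(scans, target_mz, window=0.01):
    """XIC: for each scan, given parallel m/z and intensity arrays, sum the
    intensity of peaks within +/- `window` Da of `target_mz`."""
    return np.array([
        intensities[np.abs(mz - target_mz) <= window].sum()
        for mz, intensities in scans
    ])

# Two synthetic centroided scans: (m/z array, intensity array)
scans = [
    (np.array([100.000, 150.005, 200.000]), np.array([10.0, 20.0, 30.0])),
    (np.array([150.020, 150.000]),          np.array([5.0, 7.0])),
]
xic = extracted_ion_chromatogram(scans, target_mz=150.0)  # -> [20.0, 7.0]
```

    In the first scan only the 150.005 peak lies within the window; in the second, 150.020 falls outside it.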

  1. Polarization-Analyzing CMOS Image Sensor With Monolithically Embedded Polarizer for Microchemistry Systems.

    PubMed

    Tokuda, T; Yamada, H; Sasagawa, K; Ohta, J

    2009-10-01

    This paper proposes and demonstrates a polarization-analyzing CMOS sensor based on image sensor architecture. The sensor was designed targeting applications for chiral analysis in a microchemistry system. The sensor features a monolithically embedded polarizer. Embedded polarizers with different angles were implemented to realize a real-time absolute measurement of the incident polarization angle. Although the pixel-level performance was confirmed to be limited, estimation schemes based on the variation of the polarizer angle provided promising performance for real-time polarization measurements. An estimation scheme using 180 pixels in 1° steps provided an estimation accuracy of 0.04°. Polarimetric measurements of chiral solutions were also successfully performed to demonstrate the applicability of the sensor to optical chiral analysis.
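
    A multi-angle estimation scheme of this kind can be illustrated with a least-squares fit of Malus's law: the intensity at polarizer angle θ for incident polarization φ varies as a + B·cos 2θ + C·sin 2θ, so φ is recovered as ½·atan2(C, B). A minimal sketch with synthetic, noise-free data; the 180-pixel, 1° layout mirrors the abstract, everything else is an assumption:

```python
import numpy as np

def estimate_polarization_angle(thetas, intensities):
    """Least-squares fit of I = a + B*cos(2*theta) + C*sin(2*theta);
    the incident polarization angle is phi = 0.5 * atan2(C, B)."""
    A = np.column_stack([np.ones_like(thetas),
                         np.cos(2 * thetas),
                         np.sin(2 * thetas)])
    a, B, C = np.linalg.lstsq(A, intensities, rcond=None)[0]
    return 0.5 * np.arctan2(C, B)

# Synthetic data: 180 polarizer angles in 1-degree steps, true angle 30 deg
thetas = np.deg2rad(np.arange(180))
phi_true = np.deg2rad(30.0)
intensity = 0.5 * (1 + np.cos(2 * thetas - 2 * phi_true))  # Malus's law
phi_est = estimate_polarization_angle(thetas, intensity)   # -> ~30 deg
```

    With real pixel data the fit would also average out per-pixel noise, which is how many coarse pixels can yield a sub-degree estimate.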

  2. Automated Sensitivity Analysis of Interplanetary Trajectories

    NASA Technical Reports Server (NTRS)

    Knittel, Jeremy; Hughes, Kyle; Englander, Jacob; Sarli, Bruno

    2017-01-01

    This work describes a suite of Python tools known as the Python EMTG Automated Trade Study Application (PEATSA). PEATSA was written to automate the operation of trajectory optimization software, simplify the process of performing sensitivity analysis, and was ultimately found to out-perform a human trajectory designer in unexpected ways. These benefits will be discussed and demonstrated on sample mission designs.

  3. Counting and Comparing School Performance: An Analysis of Media Coverage of PISA in Australia, 2000-2014

    ERIC Educational Resources Information Center

    Baroutsis, Aspa; Lingard, Bob

    2017-01-01

    This paper empirically documents media portrayals of Australia's performance on the Program for the International Student Assessment (PISA), 2000-2014. We analyse newspaper articles from two national and eight metropolitan newspapers. This analysis demonstrates increased media coverage of PISA over the period in question. Our research data were…

  4. Evaluation of adding item-response theory analysis for evaluation of the European Board of Ophthalmology Diploma examination.

    PubMed

    Mathysen, Danny G P; Aclimandos, Wagih; Roelant, Ella; Wouters, Kristien; Creuzot-Garcher, Catherine; Ringens, Peter J; Hawlina, Marko; Tassignon, Marie-José

    2013-11-01

    To investigate whether the introduction of item-response theory (IRT) analysis, in parallel to the 'traditional' statistical analysis methods available for performance evaluation of multiple T/F items as used in the European Board of Ophthalmology Diploma (EBOD) examination, has proved beneficial, and secondly, to study whether the overall assessment performance of the current written part of EBOD is sufficiently high (KR-20 ≥ 0.90) to be kept as the examination format in future EBOD editions. 'Traditional' analysis methods for individual MCQ item performance comprise P-statistics, Rit-statistics and item discrimination, while overall reliability is evaluated through KR-20 for multiple T/F items. The additional set of statistical analysis methods for the evaluation of EBOD comprises mainly IRT analysis. These analysis techniques are used to monitor whether the introduction of negative marking for incorrect answers (since EBOD 2010) has a positive influence on the statistical performance of EBOD as a whole and its individual test items in particular. Item-response theory analysis demonstrated that item performance parameters should not be evaluated individually, but should be related to one another. Before the introduction of negative marking, the overall EBOD reliability (KR-20) was good, though with room for improvement (EBOD 2008: 0.81; EBOD 2009: 0.78). After the introduction of negative marking, the overall reliability of EBOD improved significantly (EBOD 2010: 0.92; EBOD 2011: 0.91; EBOD 2012: 0.91). Although many statistical performance parameters are available to evaluate individual items, our study demonstrates that the overall reliability assessment remains the only crucial parameter to be evaluated allowing comparison. While individual item performance analysis is worthwhile to undertake as secondary analysis, drawing final conclusions seems to be more difficult, as performance parameters need to be related, as shown by IRT analysis. Therefore, IRT analysis has proved beneficial for the statistical analysis of EBOD. Introduction of negative marking has led to a significant increase in reliability (KR-20 > 0.90), indicating that the current examination format can be kept for future EBOD examinations. © 2013 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
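
    The KR-20 reliability statistic quoted above has a closed form: KR-20 = K/(K−1) · (1 − Σ pᵢqᵢ / σ²), where pᵢ is the proportion answering item i correctly, qᵢ = 1 − pᵢ, and σ² is the variance of total scores. A minimal sketch; the 4-examinee response matrix is invented purely for illustration:

```python
import numpy as np

def kr20(responses):
    """Kuder-Richardson formula 20 for dichotomous (0/1) item responses.
    responses: 2-D array, rows = examinees, columns = items."""
    k = responses.shape[1]
    p = responses.mean(axis=0)                      # proportion correct per item
    q = 1.0 - p
    total_var = responses.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1.0 - (p * q).sum() / total_var)

# Toy response matrix: 4 examinees x 3 items
R = np.array([[1, 1, 1],
              [1, 1, 0],
              [1, 0, 0],
              [0, 0, 0]])
reliability = kr20(R)  # -> 0.9375 for this toy matrix (up to rounding)
```

    Real examinations with hundreds of items and candidates use the same formula; only the matrix grows.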

  5. Performance Contracting as a Performance Management Tool in the Public Sector in Kenya: Lessons of learning

    ERIC Educational Resources Information Center

    Hope, Kempe Ronald, Sr.

    2013-01-01

    The purpose of this article is to provide an assessment and analysis of public sector performance contracting as a performance management tool in Kenya. It aims to demonstrate that performance contracting remains a viable and important tool for improving public sector performance as a key element of the on-going public sector transformation…

  6. An Exploratory Study of Intrinsic & Extrinsic Motivators and Student Performance in an Auditing Course

    ERIC Educational Resources Information Center

    Mo, Songtao

    2011-01-01

    The objective of this study is to investigate the association of intrinsic and extrinsic motivators and student performance. This study performs an exploratory analysis and presents evidence to demonstrate that intrinsic motivators affect the connection between external motivators and student performance. The empirical tests follow the framework…

  7. Efficiency of bowel preparation for capsule endoscopy examination: a meta-analysis.

    PubMed

    Niv, Yaron

    2008-03-07

    Good preparation before endoscopic procedures is essential for successful visualization. The small bowel is difficult to evaluate because of its length and complex configuration. A meta-analysis was conducted of studies comparing small bowel visualization by capsule endoscopy with and without preparation. Medical databases were searched for all studies investigating the preparation for capsule endoscopy of the small bowel up to July 31, 2007. Studies that scored bowel cleanness and measured gastric and small bowel transit time and rate of cecum visualization were included. The primary endpoint was the quality of bowel visualization. The secondary endpoints were transit times and the proportion of examinations that demonstrated the cecum, with and without preparation. Meta-analysis was performed with StatsDirect statistical software, version 2.6.1 (http://statsdirect.com). Eight studies met the inclusion criteria. Bowel visualization was scored as "good" in 78% of the examinations performed with preparation and 49% of those performed without (P < 0.0001). There were no significant differences in transit times or in the proportion of examinations that demonstrated the cecum with and without preparation. Capsule endoscopy preparation improves the quality of small bowel visualization, but has no effect on transit times or demonstration of the cecum.
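
    The headline comparison (78% vs. 49% "good" visualization, P < 0.0001) is the kind of result a two-proportion z-test yields. A hedged sketch, assuming 100 examinations per arm purely for illustration; the actual pooled study sizes are not given here:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sample z-statistic for equality of proportions,
    using the pooled estimate for the standard error."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)               # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical counts: 78/100 "good" with preparation, 49/100 without
z = two_proportion_z(78, 100, 49, 100)  # -> about 4.26
```

    A z-statistic above roughly 3.9 corresponds to a two-sided P below 0.0001, consistent with the abstract's claim.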

  8. Efficiency of bowel preparation for capsule endoscopy examination: A meta-analysis

    PubMed Central

    Niv, Yaron

    2008-01-01

    Good preparation before endoscopic procedures is essential for successful visualization. The small bowel is difficult to evaluate because of its length and complex configuration. A meta-analysis was conducted of studies comparing small bowel visualization by capsule endoscopy with and without preparation. Medical databases were searched for all studies investigating the preparation for capsule endoscopy of the small bowel up to July 31, 2007. Studies that scored bowel cleanness and measured gastric and small bowel transit time and rate of cecum visualization were included. The primary endpoint was the quality of bowel visualization. The secondary endpoints were transit times and the proportion of examinations that demonstrated the cecum, with and without preparation. Meta-analysis was performed with StatsDirect statistical software, version 2.6.1 (http://statsdirect.com). Eight studies met the inclusion criteria. Bowel visualization was scored as “good” in 78% of the examinations performed with preparation and 49% of those performed without (P < 0.0001). There were no significant differences in transit times or in the proportion of examinations that demonstrated the cecum with and without preparation. Capsule endoscopy preparation improves the quality of small bowel visualization, but has no effect on transit times or demonstration of the cecum. PMID:18322940

  9. Model-Based Linkage Analysis of a Quantitative Trait.

    PubMed

    Song, Yeunjoo E; Song, Sunah; Schnell, Audrey H

    2017-01-01

    Linkage Analysis is a family-based method of analysis to examine whether any typed genetic markers cosegregate with a given trait, in this case a quantitative trait. If linkage exists, this is taken as evidence in support of a genetic basis for the trait. Historically, linkage analysis was performed using a binary disease trait, but has been extended to include quantitative disease measures. Quantitative traits are desirable as they provide more information than binary traits. Linkage analysis can be performed using single-marker methods (one marker at a time) or multipoint (using multiple markers simultaneously). In model-based linkage analysis the genetic model for the trait of interest is specified. There are many software options for performing linkage analysis. Here, we use the program package Statistical Analysis for Genetic Epidemiology (S.A.G.E.). S.A.G.E. was chosen because it also includes programs to perform data cleaning procedures and to generate and test genetic models for a quantitative trait, in addition to performing linkage analysis. We demonstrate in detail the process of running the program LODLINK to perform single-marker analysis, and MLOD to perform multipoint analysis using output from SEGREG, where SEGREG was used to determine the best fitting statistical model for the trait.
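
    In two-point model-based linkage analysis of the kind LODLINK performs, the core statistic is the LOD score: the log10 likelihood ratio of the data at recombination fraction θ against θ = 0.5 (no linkage). For fully informative meioses this reduces to a simple expression; the counts below are invented, and real pedigrees require likelihood computations over unobserved genotypes:

```python
import math

def lod_score(recombinants, nonrecombinants, theta):
    """Two-point LOD score for r recombinants out of n informative meioses:
    log10 of L(theta) / L(0.5), with L(theta) = theta^r * (1-theta)^(n-r)."""
    n = recombinants + nonrecombinants
    l_theta = theta ** recombinants * (1 - theta) ** nonrecombinants
    l_null = 0.5 ** n
    return math.log10(l_theta / l_null)

# Hypothetical family data: 2 recombinants in 10 informative meioses
lod = lod_score(2, 8, theta=0.2)  # -> about 0.84
```

    By convention a LOD of 3 or more is taken as significant evidence of linkage; at θ = 0.5 the score is exactly 0.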

  10. Automated Sensitivity Analysis of Interplanetary Trajectories for Optimal Mission Design

    NASA Technical Reports Server (NTRS)

    Knittel, Jeremy; Hughes, Kyle; Englander, Jacob; Sarli, Bruno

    2017-01-01

    This work describes a suite of Python tools known as the Python EMTG Automated Trade Study Application (PEATSA). PEATSA was written to automate the operation of trajectory optimization software, simplify the process of performing sensitivity analysis, and was ultimately found to out-perform a human trajectory designer in unexpected ways. These benefits will be discussed and demonstrated on sample mission designs.

  11. Nonlinear structural analysis of a turbine airfoil using the Walker viscoplastic material model for B1900 + Hf

    NASA Technical Reports Server (NTRS)

    Meyer, T. G.; Hill, J. T.; Weber, R. M.

    1988-01-01

    A viscoplastic material model for the high temperature turbine airfoil material B1900 + Hf was developed and was demonstrated in a three dimensional finite element analysis of a typical turbine airfoil. The demonstration problem is a simulated flight cycle and includes the appropriate transient thermal and mechanical loads typically experienced by these components. The Walker viscoplastic material model was shown to be efficient, stable and easily used. The demonstration is summarized and the performance of the material model is evaluated.

  12. Analysis of Solar Receiver Flux Distributions for US/Russian Solar Dynamic System Demonstration on the MIR Space Station

    NASA Technical Reports Server (NTRS)

    1995-01-01

    Analyses have been performed at the NASA Lewis Research Center's Power Systems Project Office to support the design and development of the joint U.S./Russian Solar Dynamic Flight Demonstration Project. The optical analysis of the concentrator and solar flux predictions on target receiver surfaces have an important influence on receiver design and control of the Brayton engine.

  13. The design and realisation of the IXV Mission Analysis and Flight Mechanics

    NASA Astrophysics Data System (ADS)

    Haya-Ramos, Rodrigo; Blanco, Gonzalo; Pontijas, Irene; Bonetti, Davide; Freixa, Jordi; Parigini, Cristina; Bassano, Edmondo; Carducci, Riccardo; Sudars, Martins; Denaro, Angelo; Angelini, Roberto; Mancuso, Salvatore

    2016-07-01

    The Intermediate eXperimental Vehicle (IXV) is a suborbital re-entry demonstrator successfully launched in February 2015 focusing on the in-flight demonstration of a lifting body system with active aerodynamic control surfaces. This paper presents an overview of the Mission Analysis and Flight Mechanics of the IXV vehicle, which comprises computation of the End-to-End (launch to splashdown) design trajectories, characterisation of the Entry Corridor, assessment of the Mission Performances through Monte Carlo campaigns, contribution to the aerodynamic database, analysis of the Visibility and link budget from Ground Stations and GPS, support to safety analyses (off nominal footprints), specification of the Centre of Gravity box, selection of the Angle of Attack trim line to be flown and characterisation of the Flying Qualities performances. An initial analysis and comparison with the raw flight data obtained during the flight will be discussed and first lessons learned derived.

  14. Performance assessment for continuing and future operations at Solid Waste Storage Area 6

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-02-01

    This radiological performance assessment for the continued disposal operations at Solid Waste Storage Area 6 (SWSA 6) on the Oak Ridge Reservation (ORR) has been prepared to demonstrate compliance with the requirements of the US DOE. The analysis of SWSA 6 required the use of assumptions to supplement the available site data when the available data were incomplete for the purpose of analysis. Results indicate that SWSA 6 does not presently meet the performance objectives of DOE Order 5820.2A. Changes in operations and continued work on the performance assessment are expected to demonstrate compliance with the performance objectives for continuing operations at the Interim Waste Management Facility (IWMF). All other disposal operations in SWSA 6 are to be discontinued as of January 1, 1994. The disposal units at which disposal operations are discontinued will be subject to CERCLA remediation, which will result in acceptable protection of the public health and safety.

  15. Immediate drop on demand technology (I-DOT) coupled with mass spectrometry via an open port sampling interface.

    PubMed

    Van Berkel, Gary J; Kertesz, Vilmos; Boeltz, Harry

    2017-11-01

    The aim of this work was to demonstrate and evaluate the analytical performance of coupling the immediate drop on demand technology to a mass spectrometer via the recently introduced open port sampling interface and ESI. Methodology & results: A maximum sample analysis throughput of 5 s per sample was demonstrated. Signal reproducibility was 10% or better as demonstrated by the quantitative analysis of propranolol and its stable isotope-labeled internal standard propranolol-d7. The ability of the system to multiply charge and analyze macromolecules was demonstrated using the protein cytochrome c. This immediate drop on demand technology/open port sampling interface/ESI-MS combination allowed for the quantitative analysis of relatively small mass analytes and was used for the identification of macromolecules like proteins.

  16. Robust demarcation of basal cell carcinoma by dependent component analysis-based segmentation of multi-spectral fluorescence images.

    PubMed

    Kopriva, Ivica; Persin, Antun; Puizina-Ivić, Neira; Mirić, Lina

    2010-07-02

    This study was designed to demonstrate robust performance of the novel dependent component analysis (DCA)-based approach to demarcation of the basal cell carcinoma (BCC) through unsupervised decomposition of the red-green-blue (RGB) fluorescent image of the BCC. Robustness to intensity fluctuation is due to the scale invariance property of DCA algorithms, which exploit spectral and spatial diversities between the BCC and the surrounding tissue. The filtering-based DCA approach used here represents an extension of independent component analysis (ICA) and is necessary in order to account for the statistical dependence induced by spectral similarity between the BCC and surrounding tissue. This similarity generates weak edges, which represent a challenge for other segmentation methods as well. By comparative performance analysis with state-of-the-art image segmentation methods such as active contours (level set), K-means clustering, non-negative matrix factorization, ICA and ratio imaging, we experimentally demonstrate good performance of DCA-based BCC demarcation in two demanding scenarios where the intensity of the fluorescent image has been varied by almost two orders of magnitude. Copyright 2010 Elsevier B.V. All rights reserved.

  17. Developing focused wellness programs: using concept analysis to increase business value.

    PubMed

    Byczek, Lance; Kalina, Christine M; Levin, Pamela F

    2003-09-01

    Concept analysis is a useful tool in providing clarity to an abstract idea as well as an objective basis for developing wellness program products, goals, and outcomes. To plan for and develop successful wellness programs, it is critical for occupational health nurses to clearly understand a program concept as applied to a particular community or population. Occupational health nurses can use the outcome measures resulting from the concept analysis process to help demonstrate the business value of their wellness programs. This concept analysis demonstrates a predominance of the performance related attributes of fitness in the scientific literature.

  18. Demonstration of TRAF-NETSIM for traffic operations management : final report.

    DOT National Transportation Integrated Search

    1991-08-01

    The utility of the simulation package TRAF-NETSIM to the traffic engineer is assessed and demonstrated by means of a case study. The methodology employed in performing the analysis is presented in a way that will aid future users of TRAF-NETSIM. The ...

  19. A balanced perspective: using nonfinancial measures to assess financial performance.

    PubMed

    Watkins, Ann L

    2003-11-01

    Assessments of hospitals' financial performance have traditionally been based exclusively on analysis of a concise set of key financial ratios. One study, however, demonstrates that analysis of a hospital's financial condition can be significantly enhanced with the addition of several nonfinancial measures, including case-mix adjusted admissions, case-mix adjusted admissions per full-time equivalent, and case-mix adjusted admissions per beds in service.

  20. Life Cycle Costing: A Working Level Approach

    DTIC Science & Technology

    1981-06-01

    Effects Analysis (FMEA); Logistics Performance Factors (LPFs); Planning the Use of Life Cycle Cost in the Demonstration...form. Failure Mode and Effects Analysis (FMEA). Description. FMEA is a technique that attempts to improve the design of any particular unit. The FMEA...failure modes and also eliminate extra parts or ones that are used to achieve more performance than is necessary [16:5-14]. Advantages. FMEA forces

  1. SUPERFUND INNOVATIVE TECHNOLOGIES EVALUATION ...

    EPA Pesticide Factsheets

    This task seeks to identify high priority needs of the Regions and Program Offices for innovative field sampling, characterization, monitoring, and measurement technologies. When an appropriate solution to a specific problem is identified, a field demonstration is conducted to document the performance and cost of the proposed technologies. The use of field analysis almost always provides a savings in time and cost over the usual 'sample and ship to a conventional laboratory for analysis' approach to site characterization and monitoring. With improvements in technology and appropriate quality assurance/quality control, field analysis has been shown to provide high-quality data, useful for most environmental monitoring or characterization projects. An emphasis of the program is to seek out innovative solutions to existing problems and to provide the cost and performance data a user would require to make an informed decision regarding the adequacy of a technology to address a specific environmental problem. The objective of this program is to promote the acceptance and use of innovative field technologies by providing well-documented performance and cost data obtained from field demonstrations.

  2. Brain MRI analysis for Alzheimer's disease diagnosis using an ensemble system of deep convolutional neural networks.

    PubMed

    Islam, Jyoti; Zhang, Yanqing

    2018-05-31

    Alzheimer's disease is an incurable, progressive neurological brain disorder. Earlier detection of Alzheimer's disease can help with proper treatment and prevent brain tissue damage. Several statistical and machine learning models have been exploited by researchers for Alzheimer's disease diagnosis. Analyzing magnetic resonance imaging (MRI) is a common practice for Alzheimer's disease diagnosis in clinical research. Detection of Alzheimer's disease is challenging due to the similarity between Alzheimer's disease MRI data and standard healthy MRI data in older people. Recently, advanced deep learning techniques have successfully demonstrated human-level performance in numerous fields including medical image analysis. We propose a deep convolutional neural network for Alzheimer's disease diagnosis using brain MRI data analysis. While most of the existing approaches perform binary classification, our model can identify different stages of Alzheimer's disease and obtains superior performance for early-stage diagnosis. We conducted ample experiments to demonstrate that our proposed model outperformed comparative baselines on the Open Access Series of Imaging Studies dataset.
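
    The ensemble idea in this abstract (several CNNs whose outputs are combined) is commonly realized as soft voting: average each model's class-probability output, then take the argmax. A minimal, model-agnostic sketch; the probability arrays below are made up and stand in for the CNN outputs:

```python
import numpy as np

def ensemble_predict(per_model_probs):
    """Soft-voting ensemble: average class probabilities across models,
    then pick the most probable class for each sample."""
    avg = np.mean(per_model_probs, axis=0)   # shape (n_samples, n_classes)
    return np.argmax(avg, axis=1)

# Two hypothetical models, two samples, three classes (e.g. disease stages)
model_a = np.array([[0.6, 0.3, 0.1], [0.2, 0.5, 0.3]])
model_b = np.array([[0.4, 0.5, 0.1], [0.1, 0.2, 0.7]])
pred = ensemble_predict([model_a, model_b])  # -> classes [0, 2]
```

    Averaging probabilities rather than hard votes lets a confident model outweigh an uncertain one, which is one reason ensembles often beat their individual members.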

  3. Initial Multidisciplinary Design and Analysis Framework

    NASA Technical Reports Server (NTRS)

    Ozoroski, L. P.; Geiselhart, K. A.; Padula, S. L.; Li, W.; Olson, E. D.; Campbell, R. L.; Shields, E. W.; Berton, J. J.; Gray, J. S.; Jones, S. M.

    2010-01-01

    Within the Supersonics (SUP) Project of the Fundamental Aeronautics Program (FAP), an initial multidisciplinary design & analysis framework has been developed. A set of low- and intermediate-fidelity discipline design and analysis codes were integrated within a multidisciplinary design and analysis framework and demonstrated on two challenging test cases. The first test case demonstrates an initial capability to design for low boom and performance. The second test case demonstrates rapid assessment of a well-characterized design. The current system has been shown to greatly increase the design and analysis speed and capability, and many future areas for development were identified. This work has established a state-of-the-art capability for immediate use by supersonic concept designers and systems analysts at NASA, while also providing a strong base to build upon for future releases as more multifidelity capabilities are developed and integrated.

  4. Analysis of Skylab IV fluid mechanic science demonstration

    NASA Technical Reports Server (NTRS)

    Klett, M. G.; Bourgeois, S. V.

    1975-01-01

    Several science demonstrations performed on Skylab III and IV were concerned with the behavior of fluid drops free floating in microgravity. These demonstrations, with large liquid drops, included the oscillation, rotation, impact and coalescence, and air injection into the drops. Rayleigh's analysis of the oscillation of spherical drops of a liquid predicts accurately the effect of size and surface tension on the frequency of vibrated water globules in the Skylab demonstration. However, damping occurred much faster than predicted by Lamb's or Scriven's analyses of the damping time for spherical drops. The impact demonstrations indicated that a minimum velocity is necessary to overcome surface forces and effect a coalescence, but a precise criterion for the coalescence of liquids in low g could not be determined.

  5. Parametric and experimental analysis using a power flow approach

    NASA Technical Reports Server (NTRS)

    Cuschieri, J. M.

    1988-01-01

    Following the definition and development of a structural power flow approach for the analysis of structure-borne transmission of structural vibrations, the technique is used to analyze the influence of structural parameters on the transmitted energy. As a base for comparison, the parametric analysis is first performed using a Statistical Energy Analysis approach, and the results are compared with those obtained using the power flow approach. The advantages of using structural power flow are thus demonstrated by comparing the type of results obtained by the two methods. Additionally, to demonstrate the advantages of using the power flow method and to show that the power flow results represent a direct physical parameter that can be measured on a typical structure, an experimental investigation of structural power flow is also presented. Results are presented for an L-shaped beam for which an analytical solution has already been obtained. Furthermore, the various methods available to measure vibrational power flow are compared to investigate the advantages and disadvantages of each method.
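
    The quantity being compared here, vibrational power flow, has a compact definition at a single connection point: the time-averaged power is P = ½·Re(F·v*), with F and v the complex force and velocity amplitudes at that point. A small sketch; the numerical values are illustrative only:

```python
import numpy as np

def time_averaged_power_flow(force, velocity):
    """Time-averaged structural power flow at a point for complex
    force and velocity amplitudes: P = 0.5 * Re(F * conj(v))."""
    return 0.5 * np.real(force * np.conj(velocity))

# Illustrative values: 2 N force, unit velocity lagging by 60 degrees
P = time_averaged_power_flow(2.0, np.exp(1j * np.pi / 3))  # -> 0.5 W
```

    Only the in-phase part of the velocity transmits net power, which is why the phase between force and velocity, not just their magnitudes, governs the transmitted energy.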

  6. Performance Analysis of HF Band FB-MC-SS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hussein Moradi; Stephen Andrew Laraway; Behrouz Farhang-Boroujeny

    In a recent paper [1] the filter bank multicarrier spread spectrum (FB-MC-SS) waveform was proposed for wideband spread spectrum HF communications. A significant benefit of this waveform is robustness against narrow and partial band interference. Simulation results in [1] demonstrated good performance in a wideband HF channel over a wide range of conditions. In this paper we present a theoretical analysis of the bit error probability for this system. Our analysis tailors the results from [2], where BER performance was analyzed for maximum ratio combining systems that accounted for correlation between subcarriers and channel estimation error. Equations are given for BER that closely match the simulated performance in most situations.
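
    For orientation on what a BER equation for such a system is benchmarked against, the textbook baseline for coherent BPSK over an AWGN channel is BER = ½·erfc(√(Eb/N0)). A sketch of that baseline only; the paper's fading and combining analysis is more involved and is not reproduced here:

```python
import math

def bpsk_awgn_ber(ebn0_db):
    """Coherent BPSK bit error rate over AWGN: 0.5 * erfc(sqrt(Eb/N0))."""
    ebn0 = 10.0 ** (ebn0_db / 10.0)
    return 0.5 * math.erfc(math.sqrt(ebn0))

ber_0db = bpsk_awgn_ber(0.0)    # about 7.9e-2
ber_10db = bpsk_awgn_ber(10.0)  # far lower; BER falls steeply with SNR
```

    Fading, subcarrier correlation, and channel-estimation error all raise the BER above this AWGN floor, which is precisely what the cited analysis quantifies.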

  7. Ca analysis: An Excel based program for the analysis of intracellular calcium transients including multiple, simultaneous regression analysis☆

    PubMed Central

    Greensmith, David J.

    2014-01-01

    Here I present an Excel-based program for the analysis of intracellular Ca transients recorded using fluorescent indicators. The program can perform all the steps needed to convert recorded raw voltage changes into meaningful physiological information. It performs two fundamental processes: (1) it can prepare the raw signal by several methods, and (2) it can then analyze the prepared data to provide information such as absolute intracellular Ca levels. The rates of change of Ca can also be measured using multiple, simultaneous regression analysis. I demonstrate that this program performs as well as commercially available software but has numerous advantages, chiefly a simplified, self-contained analysis workflow. PMID:24125908
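
    Two of the computations such a program performs can be sketched compactly. The fragment below is an illustrative Python sketch, not the program's own code: it assumes a ratiometric indicator and uses the well-known Grynkiewicz relation with placeholder calibration constants (Kd, Rmin, Rmax, beta), and it estimates a transient's decay rate by least-squares regression on the log-transformed signal.

```python
import math

# Illustrative sketch (placeholder constants, not the program's values).

def ratio_to_ca(R, Kd=225.0, Rmin=0.2, Rmax=5.0, beta=10.0):
    """Grynkiewicz relation for ratiometric indicators: [Ca] in nM."""
    return Kd * beta * (R - Rmin) / (Rmax - R)

def decay_rate(times, values):
    """Least-squares slope of ln(F) vs t gives the exponential decay constant."""
    logs = [math.log(v) for v in values]
    n = len(times)
    tbar = sum(times) / n
    ybar = sum(logs) / n
    slope = sum((t - tbar) * (y - ybar) for t, y in zip(times, logs)) / \
            sum((t - tbar) ** 2 for t in times)
    return -slope  # positive rate constant, 1/s

# Synthetic mono-exponential transient with a true decay rate of 2.0 /s
ts = [i * 0.05 for i in range(20)]
fs = [3.0 * math.exp(-2.0 * t) for t in ts]
print(round(decay_rate(ts, fs), 3))  # recovers 2.0
```

On clean synthetic data the regression recovers the decay constant exactly; on real transients the same fit would be restricted to the decay phase of the signal.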

  8. Ca analysis: an Excel based program for the analysis of intracellular calcium transients including multiple, simultaneous regression analysis.

    PubMed

    Greensmith, David J

    2014-01-01

    Here I present an Excel-based program for the analysis of intracellular Ca transients recorded using fluorescent indicators. The program can perform all the steps needed to convert recorded raw voltage changes into meaningful physiological information. It performs two fundamental processes: (1) it can prepare the raw signal by several methods, and (2) it can then analyze the prepared data to provide information such as absolute intracellular Ca levels. The rates of change of Ca can also be measured using multiple, simultaneous regression analysis. I demonstrate that this program performs as well as commercially available software but has numerous advantages, chiefly a simplified, self-contained analysis workflow. Copyright © 2013 The Author. Published by Elsevier Ireland Ltd. All rights reserved.

  9. ARAMIS project: a more explicit demonstration of risk control through the use of bow-tie diagrams and the evaluation of safety barrier performance.

    PubMed

    de Dianous, Valérie; Fiévez, Cécile

    2006-03-31

    Over the last two decades, industry has shown growing interest in risk analysis. The ARAMIS project has defined a methodology for risk assessment, built to help industrial operators demonstrate that they have sufficient risk control on their sites. Risk analysis first consists of identifying all the major accidents, assuming the safety functions in place are ineffective; this identification step uses bow-tie diagrams. Secondly, the safety barriers actually implemented on the site are taken into account. The barriers are identified on the bow-ties, and an evaluation of their performance (response time, efficiency, and level of confidence) is carried out to validate that they are relevant for the expected safety function. Finally, evaluating their probability of failure makes it possible to assess the frequency of occurrence of the accident. A demonstration of risk control based on a severity/frequency pair is thus possible for all the accident scenarios. During the risk analysis, a practical tool called a risk graph is used to assess whether the number and reliability of the safety functions for a given cause are sufficient to achieve good risk control.
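
    The barrier-crediting arithmetic behind this kind of assessment can be sketched in a few lines. This is a generic illustration, not the ARAMIS tool: the initiating-cause frequency and the barrier probabilities of failure on demand (PFDs) below are hypothetical numbers.

```python
# Illustrative sketch: a scenario's expected frequency is the initiating-cause
# frequency reduced by each independent safety barrier's probability of
# failure on demand (PFD).  All numbers are hypothetical.

def scenario_frequency(cause_freq_per_year, barrier_pfds):
    """Frequency of the major accident once independent barriers are credited."""
    freq = cause_freq_per_year
    for pfd in barrier_pfds:
        freq *= pfd  # the sequence continues only if this barrier fails
    return freq

# One cause (0.1/yr) mitigated by two barriers with PFDs 1e-2 and 1e-1:
freq = scenario_frequency(0.1, [1e-2, 1e-1])
print(freq)  # about 1e-4 per year
```

Pairing this frequency with the scenario's severity gives the severity/frequency point that is then compared against the acceptance criteria.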

  10. Kinetic performance comparison of fully and superficially porous particles with a particle size of 5 µm: intrinsic evaluation and application to the impurity analysis of griseofulvin.

    PubMed

    Kahsay, Getu; Broeckhoven, Ken; Adams, Erwin; Desmet, Gert; Cabooter, Deirdre

    2014-05-01

    After the great commercial success of sub-3 µm superficially porous particles, vendors are now also starting to commercialize 5 µm superficially porous particles as an alternative to the fully porous counterparts routinely used in pharmaceutical analysis. In this study, the performance of 5 µm superficially porous particles was compared with that of fully porous 5 µm particles in terms of efficiency, separation performance, and loadability on a conventional HPLC instrument. Van Deemter and kinetic plots were first used to evaluate the efficiency and performance of both particle types using alkylphenones as a test mixture. These plots showed that the superficially porous particles provide superior kinetic performance compared with the fully porous particles over the entire relevant range of separation conditions when both support types are evaluated at the same operating pressure. The same observations were made for both isocratic and gradient analysis. The superior performance was further demonstrated for the separation of a pharmaceutical compound (griseofulvin) and its impurities, where a roughly twofold gain in analysis time could be obtained using the superficially porous particles. Finally, both particle types were evaluated in terms of loadability by plotting the resolution of the active pharmaceutical ingredient and its closest impurity as a function of the signal-to-noise ratio obtained for the smallest impurity. It was demonstrated that the superficially porous particles show better separation performance for griseofulvin and its impurities without significantly compromising sensitivity due to loadability issues in comparison with their fully porous counterparts. Moreover, these columns can be used on conventional equipment without modification to obtain a significant improvement in analysis time. Copyright © 2014 Elsevier B.V. All rights reserved.
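
    The van Deemter relation underlying those plots is H(u) = A + B/u + C·u, where H is the plate height and u the mobile-phase linear velocity. A minimal sketch of locating the optimum velocity follows; the coefficients are illustrative placeholders, not the study's fitted values.

```python
import math

# Illustrative van Deemter sketch: H(u) = A + B/u + C*u.
# A, B, C below are hypothetical coefficients, not fitted data.

def plate_height(u, A, B, C):
    return A + B / u + C * u

def optimum(A, B, C):
    """Velocity minimising H (from dH/du = -B/u**2 + C = 0) and the minimum H."""
    u_opt = math.sqrt(B / C)
    return u_opt, A + 2.0 * math.sqrt(B * C)

u_opt, h_min = optimum(A=5.0, B=10.0, C=0.4)
print(u_opt, h_min)  # u_opt ~ 5, H_min ~ 9 in these (arbitrary) units
```

A kinetic plot extends this idea by converting each (u, H) pair to the analysis time and plate count reachable at a fixed maximum pressure, which is how the two particle types were compared at the same operating pressure.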

  11. European Multicenter Study on Analytical Performance of DxN Veris System HCV Assay.

    PubMed

    Braun, Patrick; Delgado, Rafael; Drago, Monica; Fanti, Diana; Fleury, Hervé; Gismondo, Maria Rita; Hofmann, Jörg; Izopet, Jacques; Kühn, Sebastian; Lombardi, Alessandra; Marcos, Maria Angeles; Sauné, Karine; O'Shea, Siobhan; Pérez-Rivilla, Alfredo; Ramble, John; Trimoulet, Pascale; Vila, Jordi; Whittaker, Duncan; Artus, Alain; Rhodes, Daniel W

    2017-04-01

    The analytical performance of the Veris HCV Assay for use on the new and fully automated Beckman Coulter DxN Veris Molecular Diagnostics System (DxN Veris System) was evaluated at 10 European virology laboratories. Precision, analytical sensitivity, specificity, performance with negative samples, linearity, and performance with hepatitis C virus (HCV) genotypes were evaluated. Precision for all sites showed a standard deviation (SD) of 0.22 log10 IU/ml or lower for each level tested. Analytical sensitivity determined by probit analysis was between 6.2 and 9.0 IU/ml. Specificity on 94 unique patient samples was 100%, and performance with 1,089 negative samples demonstrated 100% not-detected results. Linearity using patient samples was shown from 1.34 to 6.94 log10 IU/ml. The assay demonstrated linearity upon dilution with all HCV genotypes. The Veris HCV Assay demonstrated an analytical performance comparable to that of currently marketed HCV assays when tested across multiple European sites. Copyright © 2017 American Society for Microbiology.
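
    The probit estimate of analytical sensitivity can be sketched as follows: the hit rate at concentration c is modelled as Φ((log10 c − μ)/σ), and the 95% detection level is 10^(μ + 1.645σ). The dilution-series data and the coarse grid-search fit below are illustrative stand-ins, not the study's data or fitting procedure.

```python
import math

# Illustrative probit sketch for a limit of detection at 95% hit rate.
# Data are hypothetical (concentration, replicates detected, replicates tested).

def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def fit_probit(data):
    """Coarse maximum-likelihood grid search over (mu, sigma) on log10 scale."""
    best = (None, None, -float("inf"))
    for mu in [i / 100.0 for i in range(0, 201)]:
        for sigma in [j / 100.0 for j in range(5, 101)]:
            ll = 0.0
            for conc, hits, n in data:
                p = norm_cdf((math.log10(conc) - mu) / sigma)
                p = min(max(p, 1e-12), 1.0 - 1e-12)  # avoid log(0)
                ll += hits * math.log(p) + (n - hits) * math.log(1.0 - p)
            if ll > best[2]:
                best = (mu, sigma, ll)
    return best[0], best[1]

data = [(1.0, 2, 20), (2.5, 8, 20), (5.0, 14, 20), (10.0, 19, 20), (25.0, 20, 20)]
mu, sigma = fit_probit(data)
lod95 = 10.0 ** (mu + 1.645 * sigma)  # concentration detected 95% of the time
print(round(lod95, 1))
```

A production fit would use a proper optimiser and report confidence intervals, but the model and the 95% quantile calculation are the same.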

  12. 76 FR 24831 - Site-Specific Analyses for Demonstrating Compliance With Subpart C Performance Objectives

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-03

    ... available under ADAMS accession number ML111040419, and the ``Technical Analysis Supporting Definition of... NUCLEAR REGULATORY COMMISSION 10 CFR Part 61 RIN 3150-AI92 [NRC-2011-0012] Site-Specific Analyses...-level radioactive waste disposal facilities to conduct site-specific analyses to demonstrate compliance...

  13. PILOT-SCALE DEMONSTRATION OF A SLURRY-PHASE BIOLOGICAL REACTOR FOR CREOSOTE-CONTAMINATED SOIL - APPLICATION ANALYSIS REPORT

    EPA Science Inventory

    In support of the U.S. Environmental Protection Agency’s (EPA) Superfund Innovative Technology Evaluation (SITE) Program, a pilot-scale demonstration of a slurry-phase bioremediation process was performed May 1991 at the EPA’s Test & Evaluation Facility in Cincinnati, OH. In this...

  14. Uncertainty analysis for low-level radioactive waste disposal performance assessment at Oak Ridge National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, D.W.; Yambert, M.W.; Kocher, D.C.

    1994-12-31

    A performance assessment of the operating Solid Waste Storage Area 6 (SWSA 6) facility for the disposal of low-level radioactive waste at the Oak Ridge National Laboratory has been prepared to provide the technical basis for demonstrating compliance with the performance objectives of DOE Order 5820.2A, Chapter III. An analysis of the uncertainty incorporated into the assessment was performed, addressing the quantitative uncertainty in the data used by the models, the subjective uncertainty associated with the models used for assessing performance of the disposal facility and site, and the uncertainty in the models used for estimating dose and human exposure. The results of the uncertainty analysis were used to interpret results and to formulate conclusions about the performance assessment. This paper discusses the approach taken in analyzing the uncertainty in the performance assessment and the role of uncertainty in performance assessment.
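
    The style of parameter uncertainty propagation described can be sketched generically with Monte Carlo sampling: draw model parameters from their distributions, evaluate the result, and examine the spread. The toy dose model and the distributions below are hypothetical stand-ins, not the SWSA 6 models.

```python
import math
import random

# Generic Monte Carlo uncertainty propagation sketch (toy model, made-up
# distributions; not the performance-assessment codes).

random.seed(1)

def dose(inventory_ci, kd_ml_g, infiltration_m_yr):
    # Toy screening model: dose scales with inventory and infiltration
    # and is attenuated by sorption (Kd).
    return 0.05 * inventory_ci * infiltration_m_yr / (1.0 + kd_ml_g)

samples = [
    dose(
        random.lognormvariate(math.log(100.0), 0.5),  # inventory, Ci
        random.uniform(5.0, 50.0),                    # sorption Kd, ml/g
        random.lognormvariate(math.log(0.2), 0.3),    # infiltration, m/yr
    )
    for _ in range(10_000)
]
samples.sort()
median = samples[len(samples) // 2]
p95 = samples[int(0.95 * len(samples))]
print(f"median {median:.3f}, 95th percentile {p95:.3f} (toy dose units)")
```

Comparing such percentiles against the performance objective is what lets the uncertainty analysis support, rather than just accompany, the compliance conclusion.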

  15. ESTCP Munitions Response Live Site Demonstrations, Former Southwestern Proving Ground, Arkansas Demonstration Report

    DTIC Science & Technology

    2015-07-01

    A live site demonstration was performed at the former Southwestern Proving Ground using the Geometrics MetalMapper advanced electromagnetic induction (EMI) sensor. A total of 2,116 targets were selected from the dynamic data for cued investigation, and 1,398 targets were...

  16. Comparative Proteomic and Nutritional Composition Analysis of Independent Transgenic Pigeon Pea Seeds Harboring cry1AcF and cry2Aa Genes and Their Nontransgenic Counterparts.

    PubMed

    Mishra, Pragya; Singh, Shweta; Rathinam, Maniraj; Nandiganti, Muralimohan; Ram Kumar, Nikhil; Thangaraj, Arulprakash; Thimmegowda, Vinutha; Krishnan, Veda; Mishra, Vagish; Jain, Neha; Rai, Vandna; Pattanayak, Debasis; Sreevathsa, Rohini

    2017-02-22

    Safety assessment of genetically modified plants is an important aspect prior to deregulation. Demonstration of substantial equivalence of the transgenics compared to their nontransgenic counterparts can be performed using different techniques at various molecular levels. The present study is a first-ever comprehensive evaluation of pigeon pea transgenics harboring two independent cry genes, cry2Aa and cry1AcF. The absence of unintended effects in the transgenic seed components was demonstrated by proteome and nutritional composition profiling. Analysis revealed that no significant differences were found in the various nutritional compositional analyses performed. Additionally, 2-DGE-based proteome analysis of the transgenic and nontransgenic seed protein revealed that there were no major changes in the protein profile, although a minor fold change in the expression of a few proteins was observed. Furthermore, the study also demonstrated that neither the integration of T-DNA nor the expression of the cry genes resulted in the production of unintended effects in the form of new toxins or allergens.

  17. Hardware Demonstration: Frequency Spectra of Transients

    NASA Technical Reports Server (NTRS)

    McCloskey, John; Dimov, Jen

    2017-01-01

    Radiated emissions measurements as specified by MIL-STD-461 are performed in the frequency domain, which is best suited to continuous wave (CW) types of signals. However, many platforms implement signals that are single event pulses or transients. Such signals can potentially generate momentary radiated emissions that can cause interference in the system, but they may be missed with traditional measurement techniques. This demonstration provides measurement and analysis techniques that effectively evaluate the potential emissions from such signals in order to evaluate their potential impacts to system performance.

  18. Energy localization and frequency analysis in the locust ear.

    PubMed

    Malkin, Robert; McDonagh, Thomas R; Mhatre, Natasha; Scott, Thomas S; Robert, Daniel

    2014-01-06

    Animal ears are exquisitely adapted to capture sound energy and perform signal analysis. Studying the ear of the locust, we show how frequency signal analysis can be performed solely by using the structural features of the tympanum. Incident sound waves generate mechanical vibrational waves that travel across the tympanum. These waves shoal in a tsunami-like fashion, resulting in energy localization that focuses vibrations onto the mechanosensory neurons in a frequency-dependent manner. Using finite element analysis, we demonstrate that two mechanical properties of the locust tympanum, distributed thickness and tension, are necessary and sufficient to generate frequency-dependent energy localization.

  19. Evaluating Web-Based Nursing Education's Effects: A Systematic Review and Meta-Analysis.

    PubMed

    Kang, Jiwon; Seomun, GyeongAe

    2017-09-01

    This systematic review and meta-analysis investigated whether using web-based nursing educational programs increases a participant's knowledge and clinical performance. We performed a meta-analysis of studies published between January 2000 and July 2016 and identified through RISS, CINAHL, ProQuest Central, Embase, the Cochrane Library, and PubMed. Eleven studies were eligible for inclusion in this analysis. The results of the meta-analysis demonstrated significant differences not only for the overall effect but also specifically for blended programs and short (2 weeks or 4 weeks) intervention periods. To present more evidence supporting the effectiveness of web-based nursing educational programs, further research is warranted.
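
    The pooling step of such a meta-analysis can be sketched with inverse-variance weighting. The effect sizes and standard errors below are invented for illustration, not the review's data, and a real analysis of eleven heterogeneous studies would typically use a random-effects model.

```python
import math

# Minimal fixed-effect (inverse-variance) pooling sketch with made-up data.

def pool_fixed(effects, ses):
    """Pool per-study effects weighted by inverse variance."""
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

effects = [0.42, 0.31, 0.58, 0.10]  # standardized mean differences (hypothetical)
ses = [0.15, 0.20, 0.25, 0.30]
d, se = pool_fixed(effects, ses)
z = d / se
print(round(d, 3), round(z, 2))  # pooled effect and its z statistic
```

The subgroup results reported (blended programs, short intervention periods) correspond to running this pooling separately within each subgroup.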

  20. Design and demonstrate the performance of cryogenic components representative of space vehicles: Start basket liquid acquisition device performance analysis

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The objective was to design, fabricate and test an integrated cryogenic test article incorporating both fluid and thermal propellant management subsystems. A 2.2 m (87 in) diameter aluminum test tank was outfitted with multilayer insulation, helium purge system, low-conductive tank supports, thermodynamic vent system, liquid acquisition device and immersed outflow pump. Tests and analysis performed on the start basket liquid acquisition device and studies of the liquid retention characteristics of fine mesh screens are discussed.

  1. Advanced techniques for determining long term compatibility of materials with propellants

    NASA Technical Reports Server (NTRS)

    Green, R. L.

    1972-01-01

    The search for advanced measurement techniques for determining long term compatibility of materials with propellants was conducted in several parts. A comprehensive survey of the existing measurement and testing technology for determining material-propellant interactions was performed. Selections were made from those existing techniques which were determined could meet or be made to meet the requirements. Areas of refinement or changes were recommended for improvement of others. Investigations were also performed to determine the feasibility and advantages of developing and using new techniques to achieve significant improvements over existing ones. The most interesting demonstration was that of the new technique, the volatile metal chelate analysis. Rivaling the neutron activation analysis in terms of sensitivity and specificity, the volatile metal chelate technique was fully demonstrated.

  2. Industry Application ECCS / LOCA Integrated Cladding/Emergency Core Cooling System Performance: Demonstration of LOTUS-Baseline Coupled Analysis of the South Texas Plant Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Hongbin; Szilard, Ronaldo; Epiney, Aaron

    Under the auspices of the DOE LWRS Program RISMC Industry Application ECCS/LOCA, INL has engaged staff from both the South Texas Project (STP) and Texas A&M University (TAMU) to produce a generic pressurized water reactor (PWR) model including reactor core, clad/fuel design, and systems thermal hydraulics based on the South Texas Project nuclear power plant, a 4-loop Westinghouse PWR. A RISMC toolkit, named LOCA Toolkit for the U.S. (LOTUS), has been developed for use in this generic PWR plant model to assess safety margins for the proposed NRC 10 CFR 50.46c rule, Emergency Core Cooling System (ECCS) performance during LOCA. This demonstration includes coupled analysis of core design, fuel design, thermal hydraulics, and systems analysis, using advanced risk analysis tools and methods to investigate a wide range of results. Within this context, a multi-physics best estimate plus uncertainty (MPBEPU) methodology framework is proposed.

  3. Discriminative Nonlinear Analysis Operator Learning: When Cosparse Model Meets Image Classification.

    PubMed

    Wen, Zaidao; Hou, Biao; Jiao, Licheng

    2017-05-03

    Dictionary learning frameworks based on the linear synthesis model have achieved remarkable performance in image classification over the last decade. As a generative feature model, however, this approach suffers from some intrinsic deficiencies. In this paper, we propose a novel parametric nonlinear analysis cosparse model (NACM) with which a unique feature vector can be extracted much more efficiently. Additionally, we demonstrate that NACM is capable of simultaneously learning the task-adapted feature transformation and regularization, encoding our preferences, domain prior knowledge, and task-oriented supervised information into the features. The proposed NACM serves the classification task as a discriminative feature model and yields a novel discriminative nonlinear analysis operator learning framework (DNAOL). Theoretical analysis and experimental results clearly demonstrate that DNAOL not only achieves better, or at least competitive, classification accuracies than the state-of-the-art algorithms but also dramatically reduces time complexity in both the training and testing phases.

  4. Hyper-X Hot Structures Comparison of Thermal Analysis and Flight Data

    NASA Technical Reports Server (NTRS)

    Amundsen, Ruth M.; Leonard, Charles P.; Bruce, Walter E., III

    2004-01-01

    The Hyper-X (X-43A) program is a flight experiment to demonstrate scramjet performance and operability under controlled powered free-flight conditions at Mach 7 and 10. The Mach 7 flight was successfully completed on March 27, 2004. Thermocouple instrumentation in the hot structures (nose, horizontal tail, and vertical tail) recorded the flight thermal response of these components. Preflight thermal analysis was performed for design and risk assessment purposes. This paper will present a comparison of the preflight thermal analysis and the recorded flight data.

  5. Optimizing Probability of Detection Point Estimate Demonstration

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used to assess the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. These NDE methods are intended to detect real flaws such as cracks and crack-like flaws, and a reliably detectable crack size is required for safe-life analysis of fracture-critical parts. This paper discusses optimizing probability of detection demonstration experiments using the point estimate method, which NASA uses for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization provides an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false calls (POF) while keeping the flaw sizes in the set as small as possible.
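
    The binomial logic behind the classic 29-flaw, zero-miss demonstration can be sketched directly: if the true POD is p, the probability of detecting all 29 flaws is p**29, and since 0.90**29 ≈ 0.047 < 0.05, passing demonstrates 90% POD at roughly 95% confidence. The sketch below is a generic illustration of that arithmetic, not the optimization procedure of the paper.

```python
import math

# Probability that an NDE procedure with true POD `pod` passes an
# n-flaw demonstration allowing up to `max_misses` missed flaws.

def prob_pass(pod, n_flaws, max_misses=0):
    return sum(
        math.comb(n_flaws, k) * (1.0 - pod) ** k * pod ** (n_flaws - k)
        for k in range(max_misses + 1)
    )

print(prob_pass(0.90, 29))  # chance a marginal (90% POD) procedure slips through
print(prob_pass(0.98, 29))  # probability of passing (PPD) for a good procedure
```

Trading off these two numbers, keeping the pass probability for a marginal procedure low while keeping the PPD for a genuinely good procedure acceptably high, is exactly the optimization the point estimate method performs.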

  6. Case Study for the ARRA-funded GSHP Demonstration at University at Albany

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Xiaobing; Malhotra, Mini; Xiong, Zeyu

    High initial costs and lack of public awareness of ground-source heat pump (GSHP) technology are the two major barriers preventing rapid deployment of this energy-saving technology in the United States. Under the American Recovery and Reinvestment Act (ARRA), 26 GSHP projects have been competitively selected and carried out to demonstrate the benefits of GSHP systems and innovative technologies for cost reduction and/or performance improvement. This report highlights the findings of a case study of one of the ARRA-funded GSHP demonstration projects—a distributed GSHP system at a new 500-bed apartment-style student residence hall at the University at Albany. This case study is based on the analysis of detailed design documents, measured performance data, published catalog data of heat pump equipment, and actual construction costs. Simulations with a calibrated computer model are performed for both the demonstrated GSHP system and a baseline heating, ventilation, and air-conditioning (HVAC) system to determine the energy savings and other related benefits achieved by the GSHP system. The evaluated performance metrics include the energy efficiency of the heat pump equipment and the overall GSHP system, as well as the pumping performance, energy savings, carbon emission reductions, and cost-effectiveness of the demonstrated GSHP system compared with the baseline HVAC system. This case study also identifies opportunities for improving the operational efficiency of the demonstrated GSHP system.

  7. Propulsion Powertrain Real-Time Simulation Using Hardware-in-the-Loop (HIL) for Aircraft Electric Propulsion System

    NASA Technical Reports Server (NTRS)

    Choi, Benjamin B.; Brown, Gerald V.

    2017-01-01

    It is essential to design a propulsion powertrain real-time simulator using a hardware-in-the-loop (HIL) system that emulates an electrified aircraft propulsion (EAP) system's power grid. This simulator enables in-depth understanding of the system principles, validation of system model analysis and performance prediction, and demonstration of the proof of concept of the EAP electrical system. This paper describes how subscale electrical machines with their controllers can mimic the power components in an EAP powertrain. In particular, three powertrain emulations are presented: 1) a gas turboshaft engine driving a generator, consisting of two permanent magnet (PM) motors with brushless motor drives coupled by a shaft, 2) a motor driving a propulsive fan, and 3) turboshaft-engine-driven fan (turbofan engine) operation. As a first step toward the demonstration, experimental dynamic characterization of the two motor drive systems, coupled by a mechanical shaft, was performed. The previously developed analytical motor models were then replaced with the experimental motor models to perform the real-time demonstration over predefined flight path profiles. This technique converts the plain motor system into a unique EAP power grid emulator that enables rapid analysis and real-time simulation performance using hardware-in-the-loop (HIL).

  8. Object motion analysis study

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The use of optical data processing (ODP) techniques for motion analysis in two-dimensional imagery was studied. The basic feasibility of this approach was demonstrated, but inconsistent performance of the photoplastic used for recording spatial filters prevented totally automatic operation. Promising solutions to the problems encountered are discussed, and it is concluded that ODP techniques could be quite useful for motion analysis.

  9. Performance of Modified Test Statistics in Covariance and Correlation Structure Analysis under Conditions of Multivariate Nonnormality.

    ERIC Educational Resources Information Center

    Fouladi, Rachel T.

    2000-01-01

    Provides an overview of standard and modified normal theory and asymptotically distribution-free covariance and correlation structure analysis techniques and details Monte Carlo simulation results on Type I and Type II error control. Demonstrates through the simulation that robustness and nonrobustness of structure analysis techniques vary as a…

  10. The Categorical Perception Deficit in Dyslexia: A Meta-Analysis

    ERIC Educational Resources Information Center

    Noordenbos, Mark W.; Serniclaes, Willy

    2015-01-01

    Speech perception in dyslexia is characterized by a categorical perception (CP) deficit, demonstrated by weaker discrimination of acoustic differences between phonemic categories in conjunction with better discrimination of acoustic differences within phonemic categories. We performed a meta-analysis of studies that examined the reliability of the…

  11. Interteaching: The Effects of Discussion Group Size on Undergraduate Student Performance and Preference

    ERIC Educational Resources Information Center

    Gutierrez, Michael

    2017-01-01

    Interteaching is a college teaching method grounded in the principles of applied behavior analysis. Research on interteaching demonstrates that it improves academic performance, and students report greater satisfaction with interteaching as compared to traditional teaching styles. The current study investigates whether discussion group size, a…

  12. Posttest analysis of the FFTF inherent safety tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Padilla, A. Jr.; Claybrook, S.W.

    Inherent safety tests were performed during 1986 in the 400-MW (thermal) Fast Flux Test Facility (FFTF) reactor to demonstrate the effectiveness of an inherent shutdown device called the gas expansion module (GEM). The GEM device provided a strong negative reactivity feedback during loss-of-flow conditions by increasing the neutron leakage as a result of an expanding gas bubble. The best-estimate pretest calculations for these tests were performed using the IANUS plant analysis code (a Westinghouse Electric Corporation proprietary code) and the MELT/SIEX3 core analysis code. These two codes were also used to perform the required operational safety analyses for the FFTF reactor and plant. Although it was intended to also use the SASSYS systems (core and plant) analysis code, the calibration of the SASSYS code for FFTF core and plant analysis was not completed in time to perform pretest analyses. The purpose of this paper is to present the results of the posttest analysis of the 1986 FFTF inherent safety tests using the SASSYS code.

  13. Free wake analysis of hover performance using a new influence coefficient method

    NASA Technical Reports Server (NTRS)

    Quackenbush, Todd R.; Bliss, Donald B.; Ong, Ching Cho

    1990-01-01

    A new approach to the prediction of helicopter rotor performance using a free wake analysis was developed. This new method uses a relaxation process that does not suffer from the convergence problems associated with previous time marching simulations. This wake relaxation procedure was coupled to a vortex-lattice, lifting surface loads analysis to produce a novel, self contained performance prediction code: EHPIC (Evaluation of Helicopter Performance using Influence Coefficients). The major technical features of the EHPIC code are described and a substantial amount of background information on the capabilities and proper operation of the code is supplied. Sample problems were undertaken to demonstrate the robustness and flexibility of the basic approach. Also, a performance correlation study was carried out to establish the breadth of applicability of the code, with very favorable results.

  14. Automatically visualise and analyse data on pathways using PathVisioRPC from any programming environment.

    PubMed

    Bohler, Anwesha; Eijssen, Lars M T; van Iersel, Martijn P; Leemans, Christ; Willighagen, Egon L; Kutmon, Martina; Jaillard, Magali; Evelo, Chris T

    2015-08-23

    Biological pathways are descriptive diagrams of biological processes widely used for functional analysis of differentially expressed genes or proteins. Primary data analysis, such as quality control, normalisation, and statistical analysis, is often performed in scripting languages like R, Perl, and Python, while subsequent pathway analysis is usually performed using dedicated external applications. Workflows involving manual use of multiple environments are time-consuming and error-prone, so tools are needed that enable pathway analysis directly within the scripting languages used for primary data analyses. Existing tools have limited capability in terms of available pathway content, pathway editing and visualisation options, and export file formats; consequently, making the full-fledged pathway analysis tool PathVisio available from various scripting languages will benefit researchers. We developed PathVisioRPC, an XMLRPC interface for the pathway analysis software PathVisio. PathVisioRPC enables creating and editing biological pathways, visualising data on pathways, performing pathway statistics, and exporting results in several image formats in multiple programming environments. We demonstrate PathVisioRPC functionalities using examples in Python. Subsequently, we analyse a publicly available NCBI GEO gene expression dataset, studying tumour-bearing mice treated with cyclophosphamide, in R. The R scripts demonstrate how calls to existing R packages for data processing and calls to PathVisioRPC can work together directly. To further support R users, we have created RPathVisio, simplifying the use of PathVisioRPC in this environment. We have also created a pathway module for the microarray data analysis portal ArrayAnalysis.org that calls the PathVisioRPC interface to perform pathway analysis. This module allows users to use PathVisio functionality online without having to download and install the software, and it exemplifies how the PathVisioRPC interface can be used by data analysis pipelines for functional analysis of processed genomics data. PathVisioRPC enables data visualisation and pathway analysis directly from within various analytical environments used for preliminary analyses. It supports the use of existing pathways from WikiPathways or pathways created using the RPC itself. It also enables automation of tasks performed using PathVisio, making it useful to PathVisio users performing repeated visualisation and analysis tasks. PathVisioRPC is freely available for academic and commercial use at http://projects.bigcat.unimaas.nl/pathvisiorpc.
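
    Driving such an XML-RPC service from Python looks roughly like the sketch below. The endpoint, port, and method names (`importData`, `visualizeData`) are illustrative guesses, not PathVisioRPC's documented API; consult the project's documentation for the actual method catalogue.

```python
import xmlrpc.client

# Hedged sketch of calling an XML-RPC pathway-analysis server from Python.
# Endpoint and method names are hypothetical placeholders.

ENDPOINT = "http://localhost:7777"

def visualise(expression_file, pathway_dir, out_dir):
    """Map an expression dataset onto pathways and export images (illustrative)."""
    server = xmlrpc.client.ServerProxy(ENDPOINT)
    server.importData(expression_file)           # hypothetical method name
    server.visualizeData(pathway_dir, out_dir)   # hypothetical method name

# No server is contacted at import time; call visualise(...) once a
# PathVisioRPC server is listening on ENDPOINT.
print(ENDPOINT)
```

Because XML-RPC marshals plain values over HTTP, the same calls work unchanged from R, Perl, or any other language with an XML-RPC client, which is the point of the interface.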

  15. Efficient genotype compression and analysis of large genetic variation datasets

    PubMed Central

    Layer, Ryan M.; Kindlon, Neil; Karczewski, Konrad J.; Quinlan, Aaron R.

    2015-01-01

    Genotype Query Tools (GQT) is a new indexing strategy that expedites analyses of genome variation datasets in VCF format based on sample genotypes, phenotypes and relationships. GQT’s compressed genotype index minimizes decompression for analysis, and performance relative to existing methods improves with cohort size. We show substantial (up to 443-fold) performance gains over existing methods and demonstrate GQT’s utility for exploring massive datasets involving thousands to millions of genomes. PMID:26550772
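
    The core idea of a genotype index can be sketched in a few lines: represent each variant as a bitmap over samples, so cohort-level queries reduce to bitwise operations instead of VCF parsing. The layout below is an illustrative toy, not GQT's actual compressed on-disk format.

```python
# Toy sketch of a genotype bitmap index (illustrative, not GQT's format).

def to_bitmap(genotypes):
    """Pack one variant's genotypes (0 = hom-ref, anything else = carrier)
    into an integer bitmap, one bit per sample."""
    bits = 0
    for i, gt in enumerate(genotypes):
        if gt != 0:
            bits |= 1 << i
    return bits

# Six samples, two variants (genotype codes: 0 hom-ref, 1 het, 2 hom-alt).
variants = {
    "var1": to_bitmap([0, 1, 1, 0, 2, 0]),
    "var2": to_bitmap([1, 1, 0, 0, 0, 0]),
}

# Samples carrying BOTH variants: a single AND plus a popcount.
both = variants["var1"] & variants["var2"]
print(bin(both), bin(both).count("1"))  # sample 1 is the only double carrier
```

Real implementations add run-length or word-aligned compression of these bitmaps, which is why decompression cost stays low and the advantage grows with cohort size.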

  16. Air Vehicle Integration and Technology Research (AVIATR). Task Order 0003: Condition-Based Maintenance Plus Structural Integrity (CBM+SI) Demonstration (April 2011 to August 2011)

    DTIC Science & Technology

    2011-08-01

    investigated. Implementation of this technology into the maintenance framework depends on several factors, including safety of the structural system, cost... Maintenance Parameters: The F-15 Program has indicated that, in practice, maintenance actions are generally performed on flight-hour multiples of 200... Risk Analysis or the Perform Cost Benefit Analysis sections of the flowchart. 4.6. Determine System Configurations: The current maintenance practice

  17. Distributed Finite Element Analysis Using a Transputer Network

    NASA Technical Reports Server (NTRS)

    Watson, James; Favenesi, James; Danial, Albert; Tombrello, Joseph; Yang, Dabby; Reynolds, Brian; Turrentine, Ronald; Shephard, Mark; Baehmann, Peggy

    1989-01-01

    The principal objective of this research effort was to demonstrate the extraordinarily cost-effective acceleration of finite element structural analysis problems using a transputer-based parallel processing network. This objective was accomplished in the form of a commercially viable parallel processing workstation. The workstation is a desktop-size, low-maintenance computing unit capable of supercomputer performance yet costing two orders of magnitude less. To achieve the principal research objective, a transputer-based structural analysis workstation termed XPFEM was implemented with linear static structural analysis capabilities resembling those of commercially available NASTRAN. Finite element model files, generated using the on-line preprocessing module or external preprocessing packages, are downloaded to a network of 32 transputers for accelerated solution. The system currently executes at about one-third Cray X-MP24 speed, but additional acceleration appears likely. For the NASA-selected demonstration problem of a Space Shuttle main engine turbine blade model with about 1500 nodes and 4500 independent degrees of freedom, the Cray X-MP24 required 23.9 seconds to obtain a solution while the transputer network, operated from an IBM PC-AT compatible host computer, required 71.7 seconds. Consequently, the $80,000 transputer network demonstrated a cost-performance ratio about 60 times better than that of the $15,000,000 Cray X-MP24 system.

  18. Multi-rater feedback with gap analysis: an innovative means to assess communication skill and self-insight.

    PubMed

    Calhoun, Aaron W; Rider, Elizabeth A; Peterson, Eleanor; Meyer, Elaine C

    2010-09-01

    Multi-rater assessment with gap analysis is a powerful method for assessing communication skills and self-insight, and enhancing self-reflection. We demonstrate the use of this methodology. The Program for the Approach to Complex Encounters (PACE) is an interdisciplinary simulation-based communication skills program. Encounters are assessed using an expanded Kalamazoo Consensus Statement Essential Elements Checklist adapted for multi-rater feedback and gap analysis. Data from a representative conversation were analyzed. Likert and forced-choice data with gap analysis are used to assess performance. Participants were strong in Demonstrating Empathy and Providing Closure, and needed to improve Relationship Building, Gathering Information, and understanding the Patient's/Family's Perspective. Participants under-appraised their abilities in Relationship Building, Providing Closure, and Demonstrating Empathy, as well as their overall performance. The conversion of these results into verbal feedback is discussed. We describe an evaluation methodology using multi-rater assessment with gap analysis to assess communication skills and self-insight. This methodology enables faculty to identify undervalued skills and perceptual blind spots, provide comprehensive, data-driven feedback, and encourage reflection. Implementation of graphical feedback forms coupled with one-on-one discussion using the above methodology has the potential to enhance trainee self-awareness and reflection, improving the impact of educational programs. Copyright (c) 2010 Elsevier Ireland Ltd. All rights reserved.
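
    The gap-analysis arithmetic itself is simple: for each skill, the gap is the self-rating minus the mean of the observer ratings, and a negative gap flags under-appraisal. The skill names and scores below are invented for illustration, not data from the study.

```python
def gaps(self_ratings, observer_ratings):
    """Gap per skill: self-rating minus mean observer rating."""
    out = {}
    for skill, self_score in self_ratings.items():
        obs = observer_ratings[skill]
        out[skill] = round(self_score - sum(obs) / len(obs), 2)
    return out

# Hypothetical ratings on a 1-5 Likert scale.
self_ratings = {"Relationship Building": 2.0, "Providing Closure": 3.0}
observer_ratings = {"Relationship Building": [4, 4, 3], "Providing Closure": [4, 5, 4]}

result = gaps(self_ratings, observer_ratings)
under_appraised = [s for s, g in result.items() if g < 0]
```

    A negative gap, as for both skills here, indicates a participant who rates themselves below their observers, which is the kind of perceptual blind spot the feedback discussion targets.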

  19. Demonstration of a modelling-based multi-criteria decision analysis procedure for prioritisation of occupational risks from manufactured nanomaterials.

    PubMed

    Hristozov, Danail; Zabeo, Alex; Alstrup Jensen, Keld; Gottardo, Stefania; Isigonis, Panagiotis; Maccalman, Laura; Critto, Andrea; Marcomini, Antonio

    2016-11-01

    Several tools to facilitate the risk assessment and management of manufactured nanomaterials (MN) have been developed. Most of them require input data on physicochemical properties, toxicity and scenario-specific exposure information. However, such data are not yet readily available, and tools are needed that can handle data gaps in a structured way to ensure transparent risk analysis for industrial and regulatory decision making. This paper proposes such a quantitative risk prioritisation tool, based on a multi-criteria decision analysis algorithm, which combines advanced exposure and dose-response modelling to calculate margins of exposure (MoE) for a number of MN in order to rank their occupational risks. We demonstrated the tool in a number of workplace exposure scenarios (ES) involving the production and handling of nanoscale titanium dioxide, zinc oxide (ZnO), silver and multi-walled carbon nanotubes. The results of this application demonstrated that bag/bin filling, manual un/loading and dumping of large amounts of dry powders led to high emissions, which resulted in high risk associated with these ES. The ZnO MN revealed considerable hazard potential in vivo, which significantly influenced the risk prioritisation results. In order to study how variations in the input data affect our results, we performed probabilistic Monte Carlo sensitivity/uncertainty analysis, which demonstrated that the performance of the proposed model is stable against changes in the exposure and hazard input variables.
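
    The ranking step can be sketched in a few lines: the margin of exposure is the ratio of a hazard benchmark dose to the scenario exposure estimate, and a smaller MoE means higher priority. All numbers below are invented for illustration, not results from the study.

```python
def margin_of_exposure(benchmark_dose, exposure):
    """MoE: hazard benchmark divided by exposure estimate (same units)."""
    return benchmark_dose / exposure

# Hypothetical exposure scenarios: (benchmark dose, exposure estimate).
scenarios = {
    "bag filling, ZnO": (0.5, 0.10),
    "manual unloading, TiO2": (2.0, 0.08),
    "lab-scale handling, MWCNT": (1.0, 0.002),
}

# Smaller MoE = smaller safety margin = higher occupational-risk priority.
ranked = sorted(scenarios, key=lambda s: margin_of_exposure(*scenarios[s]))
```

    In the actual tool both the benchmark and the exposure come from advanced dose-response and exposure models, and the multi-criteria algorithm weighs additional evidence; the ordering principle, however, is this one.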

  20. The Student-to-Student Chemistry Initiative: Training High School Students To Perform Chemistry Demonstration Programs for Elementary School Students

    NASA Astrophysics Data System (ADS)

    Voegel, Phillip D.; Quashnock, Kathryn A.; Heil, Katrina M.

    2004-05-01

    The Student-to-Student Chemistry Initiative is an outreach program started in the fall of 2001 at Midwestern State University (MSU). The on-campus program trains high school science students to perform a series of chemistry demonstrations and subsequently provides kits containing necessary supplies and reagents for the high school students to perform demonstration programs at elementary schools. The program focuses on improving student perception of science. The program's impact on high school student perception is evaluated through statistical analysis of paired pre-participation and post-participation surveys. The surveys focus on four areas of student perception: general attitude toward science, interest in careers in science, science awareness, and interest in attending MSU for postsecondary education. Increased scores were observed in all evaluation areas, including a statistically significant increase in science awareness following participation.

  1. Optical ensemble analysis of intraocular lens performance through a simulated clinical trial with ZEMAX.

    PubMed

    Zhao, Huawei

    2009-01-01

    A ZEMAX model was constructed to simulate a clinical trial of intraocular lenses (IOLs) based on a clinically oriented Monte Carlo ensemble analysis using postoperative ocular parameters. The purpose of this model is to test the feasibility of streamlining and optimizing both the design process and the clinical testing of IOLs. This optical ensemble analysis (OEA) is also validated. Simulated pseudophakic eyes were generated by using the tolerancing and programming features of ZEMAX optical design software. OEA methodology was verified by demonstrating that the results of clinical performance simulations were consistent with previously published clinical performance data using the same types of IOLs. From these results we conclude that the OEA method can objectively simulate the potential clinical trial performance of IOLs.

  2. Recovery of Rare Earths, Precious Metals and Other Critical Materials from Geothermal Waters with Advanced Sorbent Structures

    DOE Data Explorer

    Pamela M. Kinsey

    2015-09-30

    The work evaluates, develops and demonstrates flexible, scalable mineral extraction technology for geothermal brines based upon solid phase sorbent materials, with a specific focus upon rare earth elements (REEs). The selected organic and inorganic sorbent materials demonstrated high performance for collection of trace REEs and precious and valuable metals. The nanostructured materials typically performed better than commercially available sorbents. The data contain organic and inorganic sorbent removal efficiencies, a Sharkey Hot Springs (Idaho) water chemistry analysis, and rare earth removal efficiencies for select sorbents.

  3. Methodology, Methods, and Metrics for Testing and Evaluating Augmented Cognition Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greitzer, Frank L.

    The augmented cognition research community seeks cognitive neuroscience-based solutions to improve warfighter performance by applying and managing mitigation strategies to reduce workload and improve the throughput and quality of decisions. The focus of augmented cognition mitigation research is to define, demonstrate, and exploit neuroscience and behavioral measures that support inferences about the warfighter’s cognitive state that prescribe the nature and timing of mitigation. A research challenge is to develop valid evaluation methodologies, metrics and measures to assess the impact of augmented cognition mitigations. Two considerations are external validity, which is the extent to which the results apply to operational contexts; and internal validity, which reflects the reliability of performance measures and the conclusions based on analysis of results. The scientific rigor of the research methodology employed in conducting empirical investigations largely affects the validity of the findings. External validity requirements also compel us to demonstrate operational significance of mitigations. Thus it is important to demonstrate effectiveness of mitigations under specific conditions. This chapter reviews some cognitive science and methodological considerations in designing augmented cognition research studies and associated human performance metrics and analysis methods to assess the impact of augmented cognition mitigations.

  4. 40 CFR 63.7505 - What are my general requirements for complying with this subpart?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... compliance with the applicable emission limit for hydrogen chloride or mercury using fuel analysis if the..., fuel analysis, or continuous monitoring systems (CMS), including a continuous emission monitoring..., you must demonstrate compliance for hydrogen chloride or mercury using performance testing, if subject...

  5. High Performance Liquid Chromatography of Some Analgesic Compounds: An Instrumental Analysis Experiment.

    ERIC Educational Resources Information Center

    Haddad, Paul; And Others

    1983-01-01

    Background information, procedures, and results are provided for an experiment demonstrating techniques of solvent selection, gradient elution, pH control, and ion-pairing in the analysis of an analgesic mixture using reversed-phase liquid chromatography on an octadecylsilane column. Although developed using sophisticated/expensive equipment, less…

  6. Solar energy system economic evaluation: Fern Tunkhannock, Tunkhannock, Pennsylvania

    NASA Astrophysics Data System (ADS)

    1980-09-01

    The economic performance of an Operational Test Site (OTS) is described. The long-term economic performance of the system at its installation site, and its extrapolation to four additional selected locations to demonstrate the viability of the design over a broad range of environmental and economic conditions, is reported. Topics discussed are: system description, study approach, economic analysis and system optimization, and technical and economic results of the analysis. Data for the economic analysis are generated through evaluation of the OTS. The simulation is based on the technical results of the seasonal report simulation. In addition, localized and standard economic parameters are used for the economic analysis.

  7. Solar energy system economic evaluation: Fern Tunkhannock, Tunkhannock, Pennsylvania

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The economic performance of an Operational Test Site (OTS) is described. The long-term economic performance of the system at its installation site, and its extrapolation to four additional selected locations to demonstrate the viability of the design over a broad range of environmental and economic conditions, is reported. Topics discussed are: system description, study approach, economic analysis and system optimization, and technical and economic results of the analysis. Data for the economic analysis are generated through evaluation of the OTS. The simulation is based on the technical results of the seasonal report simulation. In addition, localized and standard economic parameters are used for the economic analysis.

  8. Advanced building energy management system demonstration for Department of Defense buildings.

    PubMed

    O'Neill, Zheng; Bailey, Trevor; Dong, Bing; Shashanka, Madhusudana; Luo, Dong

    2013-08-01

    This paper presents an advanced building energy management system (aBEMS) that employs advanced methods of whole-building performance monitoring combined with statistical methods of learning and data analysis to enable identification of both gradual and discrete performance erosion and faults. This system assimilated data collected from multiple sources, including blueprints, reduced-order models (ROM) and measurements, and employed advanced statistical learning algorithms to identify patterns of anomalies. The results were presented graphically in a manner understandable to facilities managers. A demonstration of aBEMS was conducted in buildings at Naval Station Great Lakes. The facility building management systems were extended to incorporate the energy diagnostics and analysis algorithms, producing systematic identification of more efficient operation strategies. At Naval Station Great Lakes, greater than 20% savings were demonstrated for building energy consumption by improving facility manager decision support to diagnose energy faults and prioritize alternative, energy-efficient operation strategies. The paper concludes with recommendations for widespread aBEMS success. © 2013 New York Academy of Sciences.

  9. Multiplex network analysis of employee performance and employee social relationships

    NASA Astrophysics Data System (ADS)

    Cai, Meng; Wang, Wei; Cui, Ying; Stanley, H. Eugene

    2018-01-01

    In human resource management, employee performance is strongly affected by both formal and informal employee networks. Most previous research on employee performance has focused on monolayer networks that can represent only single categories of employee social relationships. We study employee performance by taking into account the entire multiplex structure of underlying employee social networks. We collect three datasets consisting of five different employee relationship categories in three firms, and predict employee performance using degree centrality and eigenvector centrality in a superimposed multiplex network (SMN) and an unfolded multiplex network (UMN). We use a quadratic assignment procedure (QAP) analysis and a regression analysis to demonstrate that the different categories of relationship are mutually embedded and that the strength of their impact on employee performance differs. We also use weighted/unweighted SMN/UMN to measure the predictive accuracy of this approach and find that employees with high centrality in a weighted UMN are more likely to perform well. Our results shed new light on how social structures affect employee performance.
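
    The superimposition step can be sketched directly: several relationship layers over the same set of employees are summed element-wise into one weighted adjacency matrix, and centralities are computed on the result. The layers, employees, and weights below are invented for illustration, not data from the study.

```python
def superimpose(layers):
    """Element-wise sum of same-size adjacency matrices (the SMN)."""
    n = len(layers[0])
    return [[sum(layer[i][j] for layer in layers) for j in range(n)]
            for i in range(n)]

def degree_centrality(adj):
    """Weighted degree of each node: the sum of its row."""
    return [sum(row) for row in adj]

# Two hypothetical layers (e.g. advice and friendship) over four employees.
advice     = [[0, 1, 1, 0], [1, 0, 0, 0], [1, 0, 0, 1], [0, 0, 1, 0]]
friendship = [[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 0], [0, 0, 0, 0]]

smn = superimpose([advice, friendship])
centrality = degree_centrality(smn)
```

    An unfolded multiplex network (UMN) keeps the layers separate instead of summing them; the study compares centralities computed both ways as predictors of performance.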

  10. Identifying Novel Helix-Loop-Helix Genes in "Caenorhabditis elegans" through a Classroom Demonstration of Functional Genomics

    ERIC Educational Resources Information Center

    Griffin, Vernetta; McMiller, Tracee; Jones, Erika; Johnson, Casonya M.

    2003-01-01

    A 14-week, undergraduate-level Genetics and Population Biology course at Morgan State University was modified to include a demonstration of functional genomics in the research laboratory. Students performed a rudimentary sequence analysis of the "Caenorhabditis elegans" genome and further characterized three sequences that were predicted to encode…

  11. ATD-1 Avionics Phase 2: Post-Flight Data Analysis Report

    NASA Technical Reports Server (NTRS)

    Scharl, Julien

    2017-01-01

    This report aims to satisfy Air Traffic Management Technology Demonstration - 1 (ATD-1) Statement of Work (SOW) 3.6.19 and serves as the delivery mechanism for the analysis described in Annex C of the Flight Test Plan. The report describes the data collected and derived, as well as the analysis methodology and associated results extracted from the data set collected during the ATD-1 Flight Test. All analyses described in the SOW were performed and are covered in this report except for the analysis of Final Approach Speed and its effect on performance; this analysis was de-prioritized and, at the time of this report, is not considered feasible within the remaining schedule and budget.

  12. Multiphysics Nuclear Thermal Rocket Thrust Chamber Analysis

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See

    2005-01-01

    The objective of this effort is to develop an efficient and accurate thermo-fluid computational methodology to predict environments for hypothetical thrust chamber design and analysis. The current task scope is to perform multidimensional, multiphysics analysis of thrust performance and heat transfer for a hypothetical solid-core nuclear thermal engine, including the thrust chamber and nozzle. The multiphysics aspects of the model include real fluid dynamics, chemical reactivity, turbulent flow, and conjugate heat transfer. The model will be designed to identify thermal, fluid, and hydrogen environments in all flow paths and materials. This model would then be used to perform non-nuclear reproduction of the flow element failures demonstrated in the Rover/NERVA testing, investigate performance of specific configurations, and assess potential issues and enhancements. A two-pronged approach will be employed in this effort: detailed analysis of a multi-channel flow element, and global modeling of the entire thrust chamber assembly with a porosity modeling technique. It is expected that the detailed analysis of a single flow element will provide detailed fluid, thermal, and hydrogen environments for stress analysis, while the global thrust chamber assembly analysis will promote understanding of the effects of hydrogen dissociation and heat transfer on thrust performance. These modeling activities will be validated as much as possible against testing performed by other related efforts.

  13. A practical measure of workplace resilience: developing the resilience at work scale.

    PubMed

    Winwood, Peter C; Colon, Rochelle; McEwen, Kath

    2013-10-01

    To develop an effective measure of resilience at work for use in individual work-related performance and emotional distress contexts. Two separate cross-sectional studies were conducted: (1) an exploratory factor analysis of 45 items putatively underpinning workplace resilience among 397 participants, and (2) a confirmatory factor analysis of the resilience measure derived from Study 1, demonstrating a credible model of interaction with performance outcome variables among 194 participants. A 20-item scale explaining 67% of variance, measuring seven aspects of workplace resilience that are teachable and capable of conscious development, was achieved. A credible model of relationships with work engagement, sleep, stress recovery, and physical health was demonstrated in the expected directions. The new scale shows considerable promise as a reliable instrument for use in the area of employee support and development.

  14. Energy and exergy assessments for an enhanced use of energy in buildings

    NASA Astrophysics Data System (ADS)

    Goncalves, Pedro Manuel Ferreira

    Exergy analysis has been found to be a useful method for improving the conversion efficiency of energy resources, since it helps to identify locations, types and true magnitudes of wastes and losses. It has also been applied for other purposes, such as distinguishing high- from low-quality energy sources or defining the engineering technological limits in designing more energy-efficient systems. In this doctoral thesis, exergy analysis is widely applied in order to highlight and demonstrate it as a significant method of performing energy assessments of buildings and related energy supply systems. It aims to make the concept more familiar and accessible to building professionals and to encourage its wider use in engineering practice. Case study I aims to show the importance of exergy analysis in the energy performance assessment of eight space heating building options evaluated under different outdoor environmental conditions. This study is concerned with the so-called "reference state", here calculated using the average outdoor temperature for a given period of analysis. Primary energy and related exergy ratios are assessed and compared. Higher primary exergy ratios are obtained for low outdoor temperatures, while the primary energy ratios are assumed constant for the same scenarios. The outcomes of this study demonstrate the significance of exergy analysis in comparison with energy analysis when different reference states are compared. Case study II and Case study III present two energy and exergy assessment studies applied to a hotel and a student accommodation building, respectively. Case study II compares the energy and exergy performance of the main end uses of a hotel building located in Coimbra in central Portugal, using data derived from an energy audit. Case study III uses data collected from energy utility bills to estimate the energy and exergy performance associated with each building end use. 
Additionally, a set of energy supply options are proposed and assessed in terms of primary energy demand and exergy efficiency, suggesting this as a possible benchmarking method for future legislative frameworks on the energy performance assessment of buildings. Case study IV proposes a set of complementary indicators for comparing cogeneration and separate heat and electricity production systems. It aims to identify the advantages of exergy analysis relative to energy analysis, giving particular examples where these advantages are significant. The results demonstrate that exergy analysis can reveal meaningful information that might not be accessible using a conventional energy analysis approach, which is particularly evident when cogeneration and separated systems provide heat at very different temperatures. Case study V follows the exergy analysis method to evaluate the energy and exergy performance of a desiccant cooling system, aiming to assess and locate sources of irreversibility. The results reveal that the natural gas boiler is the most inefficient component of the plant in question, followed by the chiller and heating coil. A set of alternative heating supply options for desiccant wheel regeneration is proposed, showing that, while some renewables may effectively reduce the primary energy demand of the plant, this may not correspond to the optimum level of exergy efficiency. The thermal and chemical exergy components of moist air are also evaluated, as well as the influence of outdoor environmental conditions on the energy/exergy performance of the plant. This research provides knowledge that is essential for the future development of complementary energy- and exergy-based indicators, helping to improve current methodologies for the performance assessment of buildings, cogeneration systems and desiccant cooling systems. 
The significance of exergy analysis is demonstrated for different types of buildings, which may be located in different climates (reference states) and be supplied by different types of energy sources. (Abstract shortened by ProQuest.).

  15. Evolutionary computing for the design search and optimization of space vehicle power subsystems

    NASA Technical Reports Server (NTRS)

    Kordon, Mark; Klimeck, Gerhard; Hanks, David; Hua, Hook

    2004-01-01

    Evolutionary computing has proven to be a straightforward and robust approach for optimizing a wide range of difficult analysis and design problems. This paper discusses the application of these techniques to an existing space vehicle power subsystem resource and performance analysis simulation in a parallel processing environment. Our preliminary results demonstrate that this approach has the potential to improve the space system trade study process by allowing engineers to statistically weight subsystem goals of mass, cost, and performance and then automatically size power elements based on anticipated performance of the subsystem rather than on worst-case estimates.
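
    A toy evolutionary-computing loop in the spirit described above: candidate sizings are scored by an objective function and evolved by selection and mutation. The objective, population sizes, and mutation parameters below are invented placeholders, not the paper's actual simulation.

```python
import random

random.seed(0)  # deterministic for illustration

def fitness(x):
    # Hypothetical trade-off surface with its best sizing at x = 3.0;
    # a real study would score mass, cost, and performance via simulation.
    return -(x - 3.0) ** 2

# Random initial population of candidate sizings.
pop = [random.uniform(0.0, 10.0) for _ in range(20)]

for _ in range(50):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                                   # selection
    children = [p + random.gauss(0.0, 0.3) for p in parents]  # mutation
    pop = parents + children

best = max(pop, key=fitness)
```

    Because each candidate is evaluated independently, the fitness calls parallelize naturally, which is what makes the parallel-processing environment mentioned above a good fit.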

  16. Development and Performance Analysis of a Photonics-Assisted RF Converter for 5G Applications

    NASA Astrophysics Data System (ADS)

    Borges, Ramon Maia; Muniz, André Luiz Marques; Sodré Junior, Arismar Cerqueira

    2017-03-01

    This article presents a simple, ultra-wideband and tunable radiofrequency (RF) converter for 5G cellular networks. The proposed optoelectronic device performs broadband photonics-assisted upconversion and downconversion using a single optical modulator. Experimental results demonstrate RF conversion from DC to millimeter waves, including 28 and 38 GHz that are potential frequency bands for 5G applications. Narrow linewidth and low phase noise characteristics are observed in all generated RF carriers. An experimental digital performance analysis using different modulation schemes illustrates the applicability of the proposed photonics-based device in reconfigurable optical wireless communications.

  17. Structural Analysis in a Conceptual Design Framework

    NASA Technical Reports Server (NTRS)

    Padula, Sharon L.; Robinson, Jay H.; Eldred, Lloyd B.

    2012-01-01

    Supersonic aircraft designers must shape the outer mold line of the aircraft to improve multiple objectives, such as mission performance, cruise efficiency, and sonic-boom signatures. Conceptual designers have demonstrated an ability to assess these objectives for a large number of candidate designs. Other critical objectives and constraints, such as weight, fuel volume, aeroelastic effects, and structural soundness, are more difficult to address during the conceptual design process. The present research adds both static structural analysis and sizing to an existing conceptual design framework. The ultimate goal is to include structural analysis in the multidisciplinary optimization of a supersonic aircraft. Progress towards that goal is discussed and demonstrated.

  18. INNOVATIVE TECHNOLOGY VERIFICATION REPORT " ...

    EPA Pesticide Factsheets

    The EnSys Petro Test System developed by Strategic Diagnostics Inc. (SDI), was demonstrated under the U.S. Environmental Protection Agency Superfund Innovative Technology Evaluation Program in June 2000 at the Navy Base Ventura County site in Port Hueneme, California. The purpose of the demonstration was to collect reliable performance and cost data for the EnSys Petro Test System and six other field measurement devices for total petroleum hydrocarbons (TPH) in soil. In addition to assessing ease of device operation, the key objectives of the demonstration included determining the (1) method detection limit, (2) accuracy and precision, (3) effects of interferents and soil moisture content on TPH measurement, (4) sample throughput, and (5) TPH measurement costs for each device. The demonstration involved analysis of both performance evaluation samples and environmental samples collected in four areas contaminated with gasoline, diesel, or other petroleum products. The performance and cost results for a given field measurement device were compared to those for an off-site laboratory reference method,

  19. INNOVATIVE TECHNOLOGY VERIFICATION REPORT " ...

    EPA Pesticide Factsheets

    The Synchronous Scanning Luminoscope (Luminoscope) developed by the Oak Ridge National Laboratory in collaboration with Environmental Systems Corporation (ESC) was demonstrated under the U.S. Environmental Protection Agency Superfund Innovative Technology Evaluation Program in June 2000 at the Navy Base Ventura County site in Port Hueneme, California. The purpose of the demonstration was to collect reliable performance and cost data for the Luminoscope and six other field measurement devices for total petroleum hydrocarbons (TPH) in soil. In addition to assessing ease of device operation, the key objectives of the demonstration included determining the (1) method detection limit, (2) accuracy and precision, (3) effects of interferents and soil moisture content on TPH measurement, (4) sample throughput, and (5) TPH measurement costs for each device. The demonstration involved analysis of both performance evaluation samples and environmental samples collected in five areas contaminated with gasoline, diesel, lubricating oil, or other petroleum products. The performance and cost results for a given field measurement device were compared to those for an off-site laboratory reference method,

  20. SFDT-1 Camera Pointing and Sun-Exposure Analysis and Flight Performance

    NASA Technical Reports Server (NTRS)

    White, Joseph; Dutta, Soumyo; Striepe, Scott

    2015-01-01

    The Supersonic Flight Dynamics Test (SFDT) vehicle was developed to advance and test technologies of NASA's Low Density Supersonic Decelerator (LDSD) Technology Demonstration Mission. The first flight test (SFDT-1) occurred on June 28, 2014. To maximize the usefulness of the camera data, analysis was performed to optimize parachute visibility in the camera field of view during deployment and inflation, and to determine the probability of sun-exposure issues with the cameras given the vehicle heading and launch time. This paper documents the analysis, its results, and a comparison with flight video from SFDT-1.

  1. Philosophy of ATHEANA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bley, D.C.; Cooper, S.E.; Forester, J.A.

    ATHEANA, a second-generation Human Reliability Analysis (HRA) method, integrates advances in psychology with the engineering, human factors, and Probabilistic Risk Analysis (PRA) disciplines to provide an HRA quantification process and PRA modeling interface that can accommodate and represent human performance in real nuclear power plant events. The method uses the characteristics of serious accidents, identified through retrospective analysis of serious operational events, to set priorities in a search process for significant human failure events, unsafe acts, and error-forcing context (unfavorable plant conditions combined with negative performance-shaping factors). ATHEANA has been tested in a demonstration project at an operating pressurized water reactor.

  2. Performance analysis of distributed applications using automatic classification of communication inefficiencies

    DOEpatents

    Vetter, Jeffrey S.

    2005-02-01

    The method and system described herein present a technique for performance analysis that helps users understand the communication behavior of their message passing applications. The method and system may automatically classify individual communication operations and reveal the cause of communication inefficiencies in the application. This classification allows the developer to quickly focus on the culprits of truly inefficient behavior, rather than manually foraging through massive amounts of performance data. Specifically, the method and system trace the message operations of Message Passing Interface (MPI) applications and then classify each individual communication event using a supervised learning technique: decision tree classification. The decision tree may be trained using microbenchmarks that demonstrate both efficient and inefficient communication. Since the method and system adapt to the target system's configuration through these microbenchmarks, they simultaneously automate the performance analysis process and improve classification accuracy. The method and system may improve the accuracy of performance analysis and dramatically reduce the amount of data that users must examine.
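
    The classification idea can be sketched with a toy decision stump: label a communication event inefficient when its measured duration exceeds a threshold learned from labeled microbenchmark timings. All feature names, labels, and timings below are invented for illustration and are not taken from the patented system.

    ```python
    # Minimal sketch: train a depth-1 decision tree ("stump") on labeled
    # microbenchmark timings, then classify new communication events.
    # Durations and labels are hypothetical.

    def train_stump(samples):
        """samples: list of (duration_us, label) pairs, label in
        {'efficient', 'inefficient'}. Returns the duration threshold that
        misclassifies the fewest training samples."""
        best_thr, best_err = None, float("inf")
        for thr, _ in samples:                      # candidate thresholds
            err = sum(1 for d, lab in samples
                      if (lab == "inefficient") != (d > thr))
            if err < best_err:
                best_thr, best_err = thr, err
        return best_thr

    def classify(duration_us, threshold):
        return "inefficient" if duration_us > threshold else "efficient"

    # Microbenchmark-derived training data: short sends are efficient,
    # long waits indicate e.g. a late sender.
    training = [(5, "efficient"), (8, "efficient"), (12, "efficient"),
                (40, "inefficient"), (55, "inefficient"), (90, "inefficient")]
    thr = train_stump(training)
    print(classify(7, thr), classify(60, thr))
    ```

    A real decision tree would branch on several features per event (message size, sender/receiver timestamps, queue depth), but the training-on-microbenchmarks pattern is the same.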

  3. Analysis of Skylab fluid mechanics science demonstrations

    NASA Technical Reports Server (NTRS)

    Tegart, J. R.; Butz, J. R.

    1975-01-01

    The results of the data reduction and analysis of the Skylab fluid mechanics demonstrations are presented. All the fluid mechanics data available from the Skylab missions were identified and surveyed. The significant fluid mechanics phenomena were identified and reduced to measurable quantities wherever possible. Data correlations were performed using existing theories. Among the phenomena analyzed were: static low-g interface shapes, oscillation frequency and damping of a liquid drop, coalescence, rotating drop, liquid films and low-g ice melting. A survey of the possible applications of the results was made and future experiments are recommended.

  4. Evaluation of Savings in Energy-Efficient Public Housing in the Pacific Northwest

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gordon, A.; Lubliner, M.; Howard, L.

    2013-10-01

    This report presents the results of an energy performance and cost-effectiveness analysis. The Salishan phase 7 and demonstration homes were compared to Salishan phase 6 homes built to 2006 Washington State Energy Code specifications. Predicted annual energy savings (over Salishan phase 6) were 19% for Salishan phase 7 and between 19% and 24% for the demonstration homes (depending on ventilation strategy). Approximately two-thirds of the savings are attributable to the DHP. Working with the electric utility provider, Tacoma Public Utilities, researchers conducted a billing analysis for Salishan phase 7.

  5. The Impact of Accountability on Teachers' Assessments of Student Performance: A Social Cognitive Analysis

    ERIC Educational Resources Information Center

    Krolak-Schwerdt, Sabine; Bohmer, Matthias; Grasel, Cornelia

    2013-01-01

    Research on teachers' judgments of student performance has demonstrated that educational assessments may be biased or may more correctly take the achievements of students into account depending on teachers' motivations while making the judgment. Building on research on social judgment formation the present investigation examined whether the…

  6. TCP Packet Trace Analysis. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Shepard, Timothy J.

    1991-01-01

    Examination of a trace of packets collected from the network is often the only method available for diagnosing protocol performance problems in computer networks. This thesis explores the use of packet traces to diagnose performance problems of the transport protocol TCP. Unfortunately, manual examination of these traces can be so tedious that effective analysis is not possible. The primary contribution of this thesis is a graphical method of displaying the packet trace which greatly reduces the tediousness of examining a packet trace. The graphical method is demonstrated by the examination of some packet traces of typical TCP connections. The performance of two different implementations of TCP sending data across a particular network path is compared. Traces many thousands of packets long are used to demonstrate how effectively the graphical method simplifies examination of long complicated traces. In the comparison of the two TCP implementations, the burstiness of the TCP transmitter appeared to be related to the achieved throughput. A method of quantifying this burstiness is presented and its possible relevance to understanding the performance of TCP is discussed.

  7. A systematic review and meta-analysis of the circulatory, erythrocellular and CSF selenium levels in Alzheimer's disease: A metal meta-analysis (AMMA study-I).

    PubMed

    Reddy, Varikasuvu Seshadri; Bukke, Suman; Dutt, Naveen; Rana, Puneet; Pandey, Arun Kumar

    2017-07-01

    Available studies in the literature on the selenium levels in Alzheimer's disease (AD) are inconsistent, with some studies reporting its decrease in the circulation, while others reported an increase or no change as compared to controls. The objective of this study was to perform a meta-analysis of circulatory (plasma/serum and blood), erythrocyte and cerebrospinal fluid (CSF) selenium levels in AD compared to controls. We also performed a meta-analysis of the correlation coefficients (r) to demonstrate the associations between selenium and glutathione peroxidase (GPx) in AD patients. All major databases were searched for eligible studies. We included 12 case-control/observational studies reporting selenium concentrations in AD and controls. Pooled-overall effect size as standardized mean difference (SMD) and pooled r-values were generated using Review Manager 5.3 and MedCalc 15.8 software. Random-effects meta-analysis indicated a decrease in circulatory (SMD=-0.44), erythrocellular (SMD=-0.52) and CSF (SMD=-0.14) selenium levels in AD patients compared to controls. Stratified meta-analysis demonstrated that the selenium levels were decreased in both the subgroups with (SMD=-0.55) and without (SMD=-0.37) age matching between AD and controls. Our results also demonstrated a direct association between decreased selenium levels and GPx in AD. This meta-analysis suggests that circulatory selenium concentration is significantly lower in AD patients compared to controls and this decrease in selenium is directly correlated with an important antioxidant enzyme, the GPx, in AD. Copyright © 2017 Elsevier GmbH. All rights reserved.
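
    The pooling step behind a random-effects meta-analysis of standardized mean differences (SMDs) can be sketched as follows. The DerSimonian-Laird estimator shown is a standard choice, not necessarily the exact algorithm Review Manager applies, and the study effect sizes and variances are invented for illustration (they are not the AMMA data).

    ```python
    # Inverse-variance random-effects pooling of SMDs with the
    # DerSimonian-Laird between-study variance estimator (tau^2).
    import math

    def pool_random_effects(smds, variances):
        w = [1.0 / v for v in variances]              # fixed-effect weights
        fixed = sum(wi * d for wi, d in zip(w, smds)) / sum(w)
        q = sum(wi * (d - fixed) ** 2 for wi, d in zip(w, smds))
        df = len(smds) - 1
        c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
        tau2 = max(0.0, (q - df) / c)                 # between-study variance
        w_re = [1.0 / (v + tau2) for v in variances]  # random-effects weights
        pooled = sum(wi * d for wi, d in zip(w_re, smds)) / sum(w_re)
        se = math.sqrt(1.0 / sum(w_re))
        return pooled, se

    smds = [-0.60, -0.35, -0.50, -0.20]               # hypothetical study SMDs
    variances = [0.04, 0.06, 0.05, 0.08]
    pooled, se = pool_random_effects(smds, variances)
    print(f"pooled SMD = {pooled:.2f} (SE {se:.2f})")
    ```

    With these invented inputs the heterogeneity statistic Q falls below its degrees of freedom, so tau^2 is truncated to zero and the random-effects estimate coincides with the fixed-effect one.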

  8. Applied Chaos Level Test for Validation of Signal Conditions Underlying Optimal Performance of Voice Classification Methods.

    PubMed

    Liu, Boquan; Polce, Evan; Sprott, Julien C; Jiang, Jack J

    2018-05-17

    The purpose of this study is to introduce a chaos level test to evaluate the performance of linear and nonlinear voice type classification methods under varying signal chaos conditions without relying on subjective impression. Voice signals were constructed with differing degrees of noise to model signal chaos. Within each noise power, 100 Monte Carlo experiments were applied to analyze the output of jitter, shimmer, correlation dimension, and spectrum convergence ratio. The computational output of the 4 classifiers was then plotted against signal chaos level to investigate the performance of these acoustic analysis methods under varying degrees of signal chaos. A diffusive behavior detection-based chaos level test was used to investigate the performances of different voice classification methods. Voice signals were constructed by varying the signal-to-noise ratio to establish differing signal chaos conditions. Chaos level increased sigmoidally with increasing noise power. Jitter and shimmer performed optimally when the chaos level was less than or equal to 0.01, whereas correlation dimension was capable of analyzing signals with chaos levels of less than or equal to 0.0179. Spectrum convergence ratio demonstrated proficiency in analyzing voice signals with all chaos levels investigated in this study. The results of this study corroborate the performance relationships observed in previous studies and, therefore, demonstrate the validity of the validation test method. The presented chaos level validation test could be broadly utilized to evaluate acoustic analysis methods and establish the most appropriate methodology for objective voice analysis in clinical practice.
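
    Two of the four classifiers named above, jitter (cycle-to-cycle period perturbation) and shimmer (amplitude perturbation), can be sketched with the common relative-perturbation definitions. The study may use a particular variant, so treat the exact formula and the sample values below as assumptions.

    ```python
    # Relative average perturbation: mean absolute difference of consecutive
    # cycle values, normalized by the mean cycle value.

    def jitter(periods):
        """Cycle-to-cycle perturbation of pitch periods (dimensionless)."""
        diffs = [abs(a - b) for a, b in zip(periods, periods[1:])]
        return (sum(diffs) / len(diffs)) / (sum(periods) / len(periods))

    def shimmer(amplitudes):
        """Same arithmetic applied to per-cycle peak amplitudes."""
        return jitter(amplitudes)

    periods = [5.00, 5.02, 4.98, 5.01, 4.99]      # ms, a nearly periodic voice
    print(f"jitter  = {jitter(periods):.4f}")
    print(f"shimmer = {shimmer([0.95, 1.00, 0.97, 1.02]):.4f}")
    ```

    Both measures grow as noise disrupts periodicity, which is why the study finds them reliable only at low chaos levels.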

  9. Analysis of high vacuum systems using SINDA'85

    NASA Technical Reports Server (NTRS)

    Spivey, R. A.; Clanton, S. E.; Moore, J. D.

    1993-01-01

    The theory, algorithms, and test data correlation analysis of a math model developed to predict performance of the Space Station Freedom Vacuum Exhaust System are presented. The theory used to predict the flow characteristics of viscous, transition, and molecular flow is presented in detail. Development of user subroutines which predict the flow characteristics in conjunction with the SINDA'85/FLUINT analysis software are discussed. The resistance-capacitance network approach with application to vacuum system analysis is demonstrated and results from the model are correlated with test data. The model was developed to predict the performance of the Space Station Freedom Vacuum Exhaust System. However, the unique use of the user subroutines developed in this model and written into the SINDA'85/FLUINT thermal analysis model provides a powerful tool that can be used to predict the transient performance of vacuum systems and gas flow in tubes of virtually any geometry. This can be accomplished using a resistance-capacitance (R-C) method very similar to the methods used to perform thermal analyses.
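
    The resistance-capacitance analogy described above treats each volume as a capacitor (pressure as voltage) and each tube as a conductance (throughput as current). A minimal transient sketch of that idea, with invented geometry and conductance values rather than the SINDA'85/FLUINT model's, might look like:

    ```python
    # Explicit Euler time-stepping of a single chamber pumped down through
    # one tube: dP/dt = -C * (P - P_pump) / V, the R-C discharge equation.
    # Volume, conductance, and pressures are illustrative only.

    def pump_down(p0, volume_l, conductance_lps, p_pump, dt, steps):
        """Return the pressure history of the chamber (same units as p0)."""
        p = p0
        history = [p]
        for _ in range(steps):
            q = conductance_lps * (p - p_pump)   # throughput toward the pump
            p -= q * dt / volume_l               # capacitor discharge step
            history.append(p)
        return history

    hist = pump_down(p0=760.0, volume_l=100.0, conductance_lps=5.0,
                     p_pump=1e-3, dt=1.0, steps=120)
    print(f"pressure after 120 s: {hist[-1]:.2f} Torr")
    ```

    The decay follows the time constant V/C (here 20 s); networks of many volumes and tubes generalize this to the coupled R-C system the report solves, with conductance itself depending on flow regime (viscous, transition, molecular).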

  10. QUAD+ BWR Fuel Assembly demonstration program at Browns Ferry plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doshi, P.K.; Mayhue, L.T.; Robert, J.T.

    1984-04-01

    The QUAD+ fuel assembly is an improved BWR fuel assembly designed and manufactured by Westinghouse Electric Corporation. The design features a water cross separating four fuel minibundles in an integral channel. A demonstration program for this fuel design is planned for late 1984 in cycle 6 of Browns Ferry 2, a TVA plant. Objectives for the design of the QUAD+ demonstration assemblies are compatibility in performance and transparency in safety analysis with the feed fuel. These objectives are met. Inspections of the QUAD+ demonstration assemblies are planned at each refueling outage.

  11. DoE Phase II SBIR: Spectrally-Assisted Vehicle Tracking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Villeneuve, Pierre V.

    2013-02-28

    The goal of this Phase II SBIR is to develop a prototype software package to demonstrate spectrally-aided vehicle tracking performance. The primary application is to demonstrate improved target vehicle tracking performance in complex environments where traditional spatial tracker systems may show reduced performance. Example scenarios in Figure 1 include a) the target vehicle obscured by a large structure for an extended period of time, or b) the target engaging in extreme maneuvers amongst other civilian vehicles. The target information derived from spatial processing is unable to differentiate between the green versus the red vehicle. Spectral signature exploitation enables comparison of new candidate targets with existing track signatures. The ambiguity in this confusing scenario is resolved by folding spectral analysis results into each target nomination and association processes. Figure 3 shows a number of example spectral signatures from a variety of natural and man-made materials. The work performed over the two-year effort was divided into three general areas: algorithm refinement, software prototype development, and prototype performance demonstration. The tasks performed under this Phase II to accomplish the program goals were as follows: 1. Acquire relevant vehicle target datasets to support prototype. 2. Refine algorithms for target spectral feature exploitation. 3. Implement a prototype multi-hypothesis target tracking software package. 4. Demonstrate and quantify tracking performance using relevant data.

  12. Respiratory muscle endurance after training in athletes and non-athletes: A systematic review and meta-analysis.

    PubMed

    Sales, Ana Tereza do N; Fregonezi, Guilherme A de F; Ramsook, Andrew H; Guenette, Jordan A; Lima, Illia Nadinne D F; Reid, W Darlene

    2016-01-01

    The objectives of this systematic review were to evaluate the effects of respiratory muscle training (RMT) on respiratory muscle endurance (RME) and to determine the RME test that demonstrates the most consistent changes after RMT. Electronic searches were conducted in EMBASE, MEDLINE, COCHRANE CENTRAL, CINHAL and SPORTDiscus. The PEDro scale was used for quality assessment and meta-analyses were performed to compare effect sizes of different RME tests. Twenty studies met the inclusion criteria. Isocapnic hyperpnea training was performed in 40% of the studies. Meta-analysis showed that RMT improves RME in athletes (P = 0.0007) and non-athletes (P = 0.001). Subgroup analysis showed differences among tests; maximal sustainable ventilatory capacity (MSVC) and maximal sustainable threshold loading tests demonstrated significant improvement after RMT (P = 0.007; P = 0.003 respectively) compared to the maximal voluntary ventilation (MVV) (P = 0.11) in athletes, whereas significant improvement after RMT was only shown by MSVC in non-athletes. The effect size of MSVC was greater compared to MVV in studies that performed both tests. The meta-analysis results provide evidence that RMT improves RME in athletes and non-athletes and that the MSVC test, which examines endurance over several minutes, is more sensitive to improvement after RMT. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. The challenge of measuring emergency preparedness: integrating component metrics to build system-level measures for strategic national stockpile operations.

    PubMed

    Jackson, Brian A; Faith, Kay Sullivan

    2013-02-01

    Although significant progress has been made in measuring public health emergency preparedness, system-level performance measures are lacking. This report examines a potential approach to such measures for Strategic National Stockpile (SNS) operations. We adapted failure mode and effects analysis, an engineering technique used to assess the reliability of technological systems, to the assessment of preparedness. That technique, which includes systematic mapping of the response system and identification of possible breakdowns that affect performance, provides a path to use data from existing SNS assessment tools to estimate likely future performance of the system overall. Systems models of SNS operations were constructed and failure mode analyses were performed for each component. Linking data from existing assessments, including the technical assistance review and functional drills, to reliability assessment was demonstrated using publicly available information. The use of failure mode and effects estimates to assess overall response system reliability was demonstrated with a simple simulation example. Reliability analysis appears to be an attractive way to integrate information from the substantial investment in detailed assessments for stockpile delivery and dispensing to provide a view of likely future response performance.
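
    The system-level roll-up described above can be sketched as a serial-reliability calculation: map the response into sequential components, assign each a success probability estimated from existing assessment data, and multiply. The component names and probabilities below are hypothetical, not values from the report.

    ```python
    # Serial system reliability: the response succeeds only if every
    # component step succeeds. Identifying the weakest link shows where
    # assessment investment matters most.

    def system_reliability(components):
        """components: list of (name, success_probability)."""
        r = 1.0
        for _, p in components:
            r *= p
        return r

    sns_steps = [("request and approval", 0.99),     # hypothetical values
                 ("stockpile delivery", 0.95),
                 ("local distribution", 0.90),
                 ("dispensing to public", 0.85)]
    r = system_reliability(sns_steps)
    weakest = min(sns_steps, key=lambda c: c[1])
    print(f"overall reliability ~ {r:.2f}; weakest link: {weakest[0]}")
    ```

    Even with each step above 85%, the chained probability drops to roughly 0.72, which is the kind of system-level insight the component-by-component drills alone do not provide.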

  14. Demonstration of Cost-Effective, High-Performance Computing at Performance and Reliability Levels Equivalent to a 1994 Vector Supercomputer

    NASA Technical Reports Server (NTRS)

    Babrauckas, Theresa

    2000-01-01

    The Affordable High Performance Computing (AHPC) project demonstrated that high-performance computing based on a distributed network of computer workstations is a cost-effective alternative to vector supercomputers for running CPU and memory intensive design and analysis tools. The AHPC project created an integrated system called a Network Supercomputer. By connecting computer workstations through a network and utilizing the workstations when they are idle, the resulting distributed-workstation environment has the same performance and reliability levels as the Cray C90 vector Supercomputer at less than 25 percent of the C90 cost. In fact, the cost comparison between a Cray C90 Supercomputer and Sun workstations showed that the number of distributed networked workstations equivalent to a C90 costs approximately 8 percent of the C90.

  15. Aerostructural analysis and design optimization of composite aircraft

    NASA Astrophysics Data System (ADS)

    Kennedy, Graeme James

    High-performance composite materials exhibit both anisotropic strength and stiffness properties. These anisotropic properties can be used to produce highly-tailored aircraft structures that meet stringent performance requirements, but these properties also present unique challenges for analysis and design. New tools and techniques are developed to address some of these important challenges. A homogenization-based theory for beams is developed to accurately predict the through-thickness stress and strain distribution in thick composite beams. Numerical comparisons demonstrate that the proposed beam theory can be used to obtain highly accurate results in up to three orders of magnitude less computational time than three-dimensional calculations. Due to the large finite-element model requirements for thin composite structures used in aerospace applications, parallel solution methods are explored. A parallel direct Schur factorization method is developed. The parallel scalability of the direct Schur approach is demonstrated for a large finite-element problem with over 5 million unknowns. In order to address manufacturing design requirements, a novel laminate parametrization technique is presented that takes into account the discrete nature of the ply-angle variables, and ply-contiguity constraints. This parametrization technique is demonstrated on a series of structural optimization problems including compliance minimization of a plate, buckling design of a stiffened panel and layup design of a full aircraft wing. The design and analysis of composite structures for aircraft is not a stand-alone problem and cannot be performed without multidisciplinary considerations. A gradient-based aerostructural design optimization framework is presented that partitions the disciplines into distinct process groups. An approximate Newton-Krylov method is shown to be an efficient aerostructural solution algorithm and excellent parallel scalability of the algorithm is demonstrated. 
An induced drag optimization study is performed to compare the trade-off between wing weight and induced drag for wing tip extensions, raked wing tips and winglets. The results demonstrate that it is possible to achieve a 43% induced drag reduction with no weight penalty, a 28% induced drag reduction with a 10% wing weight reduction, or a 20% wing weight reduction with a 5% induced drag penalty from a baseline wing obtained from a structural mass-minimization problem with fixed aerodynamic loads.

  16. Testing and Performance Analysis of the Multichannel Error Correction Code Decoder

    NASA Technical Reports Server (NTRS)

    Soni, Nitin J.

    1996-01-01

    This report provides the test results and performance analysis of the multichannel error correction code decoder (MED) system for a regenerative satellite with asynchronous, frequency-division multiple access (FDMA) uplink channels. It discusses the system performance relative to various critical parameters: the coding length, data pattern, unique word value, unique word threshold, and adjacent-channel interference. Testing was performed under laboratory conditions and used a computer control interface with specifically developed control software to vary these parameters. Needed technologies - the high-speed Bose Chaudhuri-Hocquenghem (BCH) codec from Harris Corporation and the TRW multichannel demultiplexer/demodulator (MCDD) - were fully integrated into the mesh very small aperture terminal (VSAT) onboard processing architecture and were demonstrated.

  17. Heave-pitch-roll analysis and testing of air cushion landing systems

    NASA Technical Reports Server (NTRS)

    Boghani, A. B.; Captain, K. M.; Wormley, D. N.

    1978-01-01

    The analytical tools (analysis and computer simulation) needed to explain and predict the dynamic operation of air cushion landing systems (ACLS) are described. The following tasks were performed: the development of improved analytical models for the fan and the trunk; formulation of a heave pitch roll analysis for the complete ACLS; development of a general purpose computer simulation to evaluate landing and taxi performance of an ACLS equipped aircraft; and the verification and refinement of the analysis by comparison with test data obtained through lab testing of a prototype cushion. A demonstration of simulation capabilities through typical landing and taxi simulations of an ACLS aircraft is given. Initial results show that fan dynamics have a major effect on system performance. Comparison with lab test data (zero forward speed) indicates that the analysis can predict most of the key static and dynamic parameters (pressure, deflection, acceleration, etc.) within a margin of 10 to 25 percent.

  18. Principal Component Analysis for pulse-shape discrimination of scintillation radiation detectors

    NASA Astrophysics Data System (ADS)

    Alharbi, T.

    2016-01-01

    In this paper, we report on the application of Principal Component Analysis (PCA) for pulse-shape discrimination (PSD) of scintillation radiation detectors. The details of the method are described and the performance of the method is experimentally examined by discriminating between neutrons and gamma-rays with a liquid scintillation detector in a mixed radiation field. The performance of the method is also compared against that of the conventional charge-comparison method, demonstrating the superior performance of the PCA method, particularly in the low light output range. PCA has the important advantage of automatically extracting the pulse-shape characteristics, which makes the PSD method directly applicable to various scintillation detectors without the need for adjustment of a PSD parameter.
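
    The PCA-based discrimination idea can be illustrated by projecting raw pulses onto their leading principal component and separating classes by the resulting score. The synthetic pulses below mimic gamma-like (fast decay) and neutron-like (slower tail) events; they are not real detector data, and the decay constants are invented.

    ```python
    # PCA via SVD of a mean-centered pulse matrix; the first principal
    # component captures the tail-shape difference between the two classes.
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.arange(64)

    def pulse(tail):
        """Exponential decay with a small noise floor (arbitrary units)."""
        return np.exp(-t / tail) + 0.01 * rng.standard_normal(t.size)

    gammas   = np.array([pulse(4.0) for _ in range(50)])   # fast decay
    neutrons = np.array([pulse(9.0) for _ in range(50)])   # slow tail
    data = np.vstack([gammas, neutrons])

    centered = data - data.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    scores = centered @ vt[0]            # projection on first component

    # The two populations fall on opposite sides of the score distribution.
    sep = abs(scores[:50].mean() - scores[50:].mean())
    print(f"class separation along PC1: {sep:.2f}")
    ```

    No discrimination parameter was hand-tuned here; the component is extracted automatically from the data, which mirrors the advantage the paper claims over charge comparison.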

  19. Pyogenic Granuloma of the Penis: An Uncommon Lesion with Unusual Presentation

    PubMed Central

    Katmeh, Rateb F.; Johnson, Luke; Kempley, Eilis; Kotecha, Shrinal; Hamarneh, Wael; Chitale, Sudhanshu

    2017-01-01

    We present the case of a 37-year-old man who presented with a penile lesion that engorged on erection. Ultrasound examination demonstrated vascularity of the lesion and the decision was made to perform a complete excision. Histological analysis confirmed the diagnosis of a pyogenic granuloma of the penis. Follow-up demonstrated no recurrence at 3 months. PMID:28413384

  20. An Analysis of Warfighter Sleep, Fatigue, and Performance on the USS Nimitz

    DTIC Science & Technology

    2014-09-01

    Sleep deprivation and fatigue can be disastrous, as demonstrated by accidents such as Chernobyl Reactor 4, Three Mile Island Unit 2, and Bhopal Union Carbide.

  1. Preliminary Results, Analysis and Overview of Part -1 of the GOLD Experiment

    NASA Technical Reports Server (NTRS)

    Wilson, K. E.; Jeganathan, M.

    1996-01-01

    The Ground/Orbiter Lasercomm Demonstration (GOLD) is an optical communications demonstration between the Japanese Engineering Test Satellite (ETS-VI) and an optical ground transmitting and receiving station at the Table Mountain Facility in Wrightwood, California. Laser transmissions to the satellite were performed for approximately four hours every third night when the satellite was above Table Mountain.

  2. Massively Parallel, Molecular Analysis Platform Developed Using a CMOS Integrated Circuit With Biological Nanopores

    PubMed Central

    Roever, Stefan

    2012-01-01

    A massively parallel, low-cost molecular analysis platform will dramatically change the nature of protein, molecular and genomics research, DNA sequencing, and ultimately, molecular diagnostics. An integrated circuit (IC) with 264 sensors was fabricated using standard CMOS semiconductor processing technology. Each of these sensors is individually controlled with precision analog circuitry and is capable of single molecule measurements. Under electronic and software control, the IC was used to demonstrate the feasibility of creating and detecting lipid bilayers and biological nanopores using wild type α-hemolysin. The ability to dynamically create bilayers over each of the sensors will greatly accelerate pore development and pore mutation analysis. In addition, the noise performance of the IC was measured to be 30 fA (rms). With this noise performance, single base detection of DNA was demonstrated using α-hemolysin. The data shows that a single molecule, electrical detection platform using biological nanopores can be operationalized and can ultimately scale to millions of sensors. Such a massively parallel platform will revolutionize molecular analysis and will completely change the field of molecular diagnostics in the future.

  3. A comparison of visual search strategies of elite and non-elite tennis players through cluster analysis.

    PubMed

    Murray, Nicholas P; Hunfalvay, Melissa

    2017-02-01

    Considerable research has documented that successful performance in interceptive tasks (such as return of serve in tennis) is based on the performers' capability to capture appropriate anticipatory information prior to the flight path of the approaching object. Athletes of higher skill tend to fixate on different locations in the playing environment prior to initiation of a skill than their lesser skilled counterparts. The purpose of this study was to examine visual search behaviour strategies of elite (world ranked) tennis players and non-ranked competitive tennis players (n = 43) utilising cluster analysis. The results of hierarchical (Ward's method) and nonhierarchical (k-means) cluster analyses revealed three different clusters. The clustering method distinguished visual behaviour of high-, middle- and low-ranked players. Specifically, high-ranked players demonstrated longer mean fixation duration and lower variation of visual search than middle- and low-ranked players. In conclusion, the results demonstrated that cluster analysis is a useful tool for detecting and analysing the areas of interest for use in experimental analysis of expertise and to distinguish visual search variables among participants.
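
    The nonhierarchical step the authors report (k-means) can be sketched on two gaze features of the kind the study used: mean fixation duration and its variability. The player feature values below are fabricated for illustration; the real data came from eye tracking.

    ```python
    # Plain k-means on 2-D gaze features; three clusters for high-,
    # middle-, and low-ranked players. Feature values are hypothetical.

    def kmeans(points, centers, iters=20):
        for _ in range(iters):
            groups = {i: [] for i in range(len(centers))}
            for p in points:
                i = min(range(len(centers)),
                        key=lambda c: sum((a - b) ** 2
                                          for a, b in zip(p, centers[c])))
                groups[i].append(p)
            centers = [tuple(sum(xs) / len(xs) for xs in zip(*g)) if g
                       else centers[i] for i, g in groups.items()]
        return centers, groups

    # (mean fixation duration ms, fixation-duration SD) for nine players
    players = [(620, 40), (600, 35), (640, 45),    # elite-like: long, stable
               (450, 90), (430, 85), (470, 95),    # intermediate
               (300, 150), (320, 160), (280, 140)] # short, variable
    centers, groups = kmeans(players, centers=[(600, 40), (450, 90), (300, 150)])
    print(sorted(len(g) for g in groups.values()))
    ```

    Ward's hierarchical method would instead merge players bottom-up by minimum variance increase; the study used both and obtained the same three-cluster structure.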

  4. HuMOVE: a low-invasive wearable monitoring platform in sexual medicine.

    PubMed

    Ciuti, Gastone; Nardi, Matteo; Valdastri, Pietro; Menciassi, Arianna; Basile Fasolo, Ciro; Dario, Paolo

    2014-10-01

    To investigate an accelerometer-based wearable system, named Human Movement (HuMOVE) platform, designed to enable quantitative and continuous measurement of sexual performance with minimal invasiveness and inconvenience for users. Design, implementation, and development of HuMOVE, a wearable platform equipped with an accelerometer sensor for monitoring inertial parameters for sexual performance assessment and diagnosis, were performed. The system enables quantitative measurement of movement parameters during sexual intercourse, meeting the requirements of wearability, data storage, sampling rate, and interfacing methods, which are fundamental for human sexual intercourse performance analysis. HuMOVE was validated through characterization using a controlled experimental test bench and evaluated in a human model during simulated sexual intercourse conditions. HuMOVE was demonstrated to be a robust, quantitative monitoring platform and a reliable candidate for sexual performance evaluation and diagnosis. Characterization analysis on the controlled experimental test bench demonstrated an accurate correlation between the HuMOVE system and data from a reference displacement sensor. Experimental tests in the human model during simulated intercourse conditions confirmed the accuracy of the sexual performance evaluation platform and the effectiveness of the selected and derived parameters. The outcomes also met the project expectations in terms of usability and comfort, as evidenced by questionnaires that highlighted the low invasiveness and acceptance of the device. To the best of our knowledge, the HuMOVE platform is the first device for human sexual performance analysis compatible with sexual intercourse; the system has the potential to be a helpful tool for physicians to accurately classify sexual disorders, such as premature or delayed ejaculation. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. Milestones for Selection, Characterization, and Analysis of the Performance of a Repository for Spent Nuclear Fuel and High-Level Radioactive Waste at Yucca Mountain.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rechard, Robert P.

    This report presents a concise history in tabular form of events leading up to site identification in 1978, site selection in 1987, subsequent characterization, and ongoing analysis through 2008 of the performance of a repository for spent nuclear fuel and high-level radioactive waste at Yucca Mountain in southern Nevada. The tabulated events generally occurred in five periods: (1) commitment to mined geologic disposal and identification of sites; (2) site selection and analysis, based on regional geologic characterization through literature and analogous data; (3) feasibility analysis demonstrating calculation procedures and importance of system components, based on rough measures of performance using surface exploration, waste process knowledge, and general laboratory experiments; (4) suitability analysis demonstrating viability of disposal system, based on environment-specific laboratory experiments, in-situ experiments, and underground disposal system characterization; and (5) compliance analysis, based on completed site-specific characterization. Because the relationship is important to understanding the evolution of the Yucca Mountain Project, the tabulation also shows the interaction between four broad categories of political bodies and government agencies/institutions: (a) technical milestones of the implementing institutions, (b) development of the regulatory requirements and related federal policy in laws and court decisions, (c) Presidential and agency directives and decisions, and (d) critiques of the Yucca Mountain Project and pertinent national and world events related to nuclear energy and radioactive waste.

  6. High-Performance Flexible Perovskite Solar Cells on Ultrathin Glass: Implications of the TCO

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dou, Benjia; Miller, Elisa M.; Christians, Jeffrey A.

    For halide perovskite solar cells (PSCs) to fulfill their vast potential for combining low cost, high efficiency, and high-throughput production, they must be scaled using a truly transformative method, such as roll-to-roll processing. Bringing this reality closer to fruition, the present work demonstrates flexible perovskite solar cells with 18.1% power conversion efficiency on flexible Willow Glass substrates. Here, we highlight the importance of the transparent conductive oxide (TCO) layers on device performance by studying various TCOs. While tin-doped indium oxide (ITO) and indium zinc oxide (IZO) based PSC devices demonstrate high photovoltaic performance, aluminum-doped zinc oxide (AZO) based devices underperformed in all device parameters. Analysis of X-ray photoemission spectroscopy data shows that the stoichiometry of the perovskite film surface changes dramatically when it is fabricated on AZO, demonstrating the importance of the substrate in perovskite film formation.

  7. High-Performance Flexible Perovskite Solar Cells on Ultrathin Glass: Implications of the TCO

    DOE PAGES

    Dou, Benjia; Miller, Elisa M.; Christians, Jeffrey A.; ...

    2017-09-27

    For halide perovskite solar cells (PSCs) to fulfill their vast potential for combining low cost, high efficiency, and high-throughput production, they must be scaled using a truly transformative method, such as roll-to-roll processing. Bringing this reality closer to fruition, the present work demonstrates flexible perovskite solar cells with 18.1% power conversion efficiency on flexible Willow Glass substrates. Here, we highlight the importance of the transparent conductive oxide (TCO) layer on device performance by studying various TCOs. While tin-doped indium oxide (ITO) and indium zinc oxide (IZO) based PSC devices demonstrate high photovoltaic performance, aluminum-doped zinc oxide (AZO) based devices underperformed in all device parameters. Analysis of X-ray photoemission spectroscopy data shows that the stoichiometry of the perovskite film surface changes dramatically when it is fabricated on AZO, demonstrating the importance of the substrate in perovskite film formation.

  8. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Rigaku ZSX Mini II (ZSX Mini II) XRF Services x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ZSX Mini II analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ZSX Mini II analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations.
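    The evaluation objectives named in these reports (accuracy and precision) map onto simple statistics: percent recovery against a certified value, and relative standard deviation of replicates. A sketch with hypothetical replicate readings and a hypothetical certified value, not data from the SITE demonstration:

```python
from statistics import mean, stdev

def percent_recovery(measured_mean, certified):
    # Accuracy: measured mean as a percentage of the certified concentration.
    return 100.0 * measured_mean / certified

def rsd(replicates):
    # Precision: relative standard deviation (%) of replicate measurements.
    return 100.0 * stdev(replicates) / mean(replicates)

# Hypothetical replicate XRF readings (mg/kg) of a spiked soil sample
# with an assumed certified lead concentration of 500 mg/kg.
replicates = [480.0, 510.0, 495.0, 505.0]
print(round(percent_recovery(mean(replicates), 500.0), 1))  # accuracy, %
print(round(rsd(replicates), 2))                            # precision, %
```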

  9. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Rontec PicoTAX x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the PicoTAX analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the PicoTAX analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations.

  10. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Niton XLt 700 Series (XLt) XRF Services x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the XLt analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the XLt analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations.

  11. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Oxford ED2000 x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ED2000 analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ED2000 analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations.

  12. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Innov-X XT400 Series (XT400) x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the XT400 analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the XT400 analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations.

  13. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Elvatech, Ltd. ElvaX (ElvaX) x-ray fluorescence (XRF) analyzer, distributed in the United States by Xcalibur XRF Services (Xcalibur), was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ElvaX analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ElvaX analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations.

  14. Comparison of dissimilarity measures for cluster analysis of X-ray diffraction data from combinatorial libraries

    NASA Astrophysics Data System (ADS)

    Iwasaki, Yuma; Kusne, A. Gilad; Takeuchi, Ichiro

    2017-12-01

    Machine learning techniques have proven invaluable for managing the ever-growing volume of materials research data produced as developments continue in high-throughput materials simulation, fabrication, and characterization. In particular, machine learning techniques have been demonstrated for their utility in rapidly and automatically identifying potential composition-phase maps from structural data characterization of composition spread libraries, enabling rapid materials fabrication-structure-property analysis and functional materials discovery. A key issue in the development of an automated phase-diagram determination method is the choice of dissimilarity measure, or kernel function. The desired measure reduces the impact of confounding structural data issues on analysis performance. The issues include peak height changes and peak shifting due to lattice constant change as a function of composition. In this work, we investigate the choice of dissimilarity measure for X-ray diffraction-based structure analysis and the impact of that choice on the performance of automatic composition-phase map determination. Nine dissimilarity measures are evaluated on X-ray diffraction patterns from a Fe-Co-Ni ternary alloy composition spread. The cosine, Pearson correlation coefficient, and Jensen-Shannon divergence measures are shown to provide the best performance in the presence of peak height change and peak shifting (due to lattice constant change) when the magnitude of peak shifting is unknown. With prior knowledge of the maximum peak shifting, dynamic time warping in a normalized constrained mode provides the best performance. This work also serves to demonstrate a strategy for rapid analysis of a large number of X-ray diffraction patterns generally, beyond data from combinatorial libraries.
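    The abstract's central point, that some dissimilarity measures tolerate peak-height changes while still penalizing peak shifts, can be illustrated on toy 1-D patterns. A sketch of three of the nine measures discussed (cosine, Pearson, Jensen-Shannon); the patterns below are synthetic, not XRD data:

```python
import math

def cosine_dissimilarity(p, q):
    dot = sum(a * b for a, b in zip(p, q))
    norm_p = math.sqrt(sum(a * a for a in p))
    norm_q = math.sqrt(sum(b * b for b in q))
    return 1.0 - dot / (norm_p * norm_q)

def pearson_dissimilarity(p, q):
    n = len(p)
    mp, mq = sum(p) / n, sum(q) / n
    cov = sum((a - mp) * (b - mq) for a, b in zip(p, q))
    sp = math.sqrt(sum((a - mp) ** 2 for a in p))
    sq = math.sqrt(sum((b - mq) ** 2 for b in q))
    return 1.0 - cov / (sp * sq)

def jensen_shannon(p, q):
    # Normalize each pattern so it can be treated as a distribution.
    sp, sq = sum(p), sum(q)
    p = [a / sp for a in p]
    q = [b / sq for b in q]
    m = [(a + b) / 2.0 for a, b in zip(p, q)]
    kl = lambda x, y: sum(a * math.log(a / b) for a, b in zip(x, y) if a > 0)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

pattern = [0.0, 1.0, 5.0, 1.0, 0.0, 0.0, 2.0, 0.0]   # toy "diffractogram"
scaled = [2.0 * v for v in pattern]                   # peak-height change only
shifted = pattern[-1:] + pattern[:-1]                 # peaks moved by one bin

# A uniform height change is ignored by all three measures; a shift is not.
print(cosine_dissimilarity(pattern, scaled))    # ~0
print(cosine_dissimilarity(pattern, shifted))   # clearly > 0
```

    Handling shifts of unknown magnitude, rather than ignoring them entirely, is exactly the trade-off the paper explores with dynamic time warping.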

  15. TRISO Fuel Performance: Modeling, Integration into Mainstream Design Studies, and Application to a Thorium-fueled Fusion-Fission Hybrid Blanket

    NASA Astrophysics Data System (ADS)

    Powers, Jeffrey J.

    2011-12-01

    This study focused on creating a new tristructural isotropic (TRISO) coated particle fuel performance model and demonstrating the integration of this model into an existing system of neutronics and heat transfer codes, creating a user-friendly option for including fuel performance analysis within system design optimization and system-level trade-off studies. The end product enables both a deeper understanding and better overall system performance of nuclear energy systems limited or greatly impacted by TRISO fuel performance. A thorium-fueled hybrid fusion-fission Laser Inertial Fusion Energy (LIFE) blanket design was used to illustrate the application of this new capability and demonstrated both the importance of integrating fuel performance calculations into mainstream design studies and the impact that this new integrated analysis had on system-level design decisions. A new TRISO fuel performance model named TRIUNE was developed, verified, and validated in this work, with a novel methodology established for simulating the actual lifetime of a TRISO particle during repeated passes through a pebble bed. In addition, integrated self-consistent calculations were performed for neutronics depletion analysis, heat transfer calculations, and then fuel performance modeling for a full parametric study that encompassed over 80 different design options that went through all three phases of analysis. Lastly, side studies were performed that included a comparison of thorium and depleted uranium (DU) LIFE blankets as well as some uncertainty quantification work to help guide future experimental work by assessing what material properties in TRISO fuel performance modeling are most in need of improvement. A recommended thorium-fueled hybrid LIFE engine design was identified with an initial fuel load of 20 MT of thorium, 15% TRISO packing within the graphite fuel pebbles, and a 20 cm neutron multiplier layer with beryllium pebbles in flibe molten salt coolant. It operated at a system power level of 2000 MWth, took about 3.5 years to reach full plateau power, and was capable of an End of Plateau burnup of 38.7 %FIMA when considering only the neutronic constraints in the system design; however, fuel performance constraints led to a maximum credible burnup of 12.1 %FIMA due to a combination of internal gas pressure and irradiation effects on the TRISO materials (especially PyC) leading to SiC pressure vessel failures. The optimal neutron spectrum for the thorium-fueled blanket options evaluated seemed to favor a hard spectrum (low but non-zero neutron multiplier thicknesses and high TRISO packing fractions) in terms of neutronic performance, but the fuel performance constraints demonstrated that a significantly softer spectrum would be needed to decrease the rate of accumulation of fast neutron fluence in order to improve the maximum credible burnup the system could achieve.
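    The failure mechanism cited above, internal gas pressure overstressing the SiC layer, is commonly screened with an ideal-gas pressure estimate and a thin-walled spherical vessel formula. A sketch with entirely hypothetical particle dimensions and gas inventory (not values from TRIUNE or the LIFE study):

```python
R = 8.314  # universal gas constant, J/(mol K)

def buffer_pressure(n_gas_mol, temp_k, void_volume_m3):
    # Ideal-gas estimate of internal pressure from released fission gas.
    return n_gas_mol * R * temp_k / void_volume_m3

def sic_hoop_stress(pressure_pa, inner_radius_m, thickness_m):
    # Thin-walled spherical pressure vessel: sigma = P * r / (2 * t).
    return pressure_pa * inner_radius_m / (2.0 * thickness_m)

# Entirely hypothetical particle state, for illustration only:
p = buffer_pressure(n_gas_mol=2e-9, temp_k=1200.0, void_volume_m3=1e-12)
sigma = sic_hoop_stress(p, inner_radius_m=400e-6, thickness_m=35e-6)
print(f"pressure ~ {p / 1e6:.1f} MPa, SiC hoop stress ~ {sigma / 1e6:.1f} MPa")
```

    A real fuel performance model would couple this check to fission gas release, irradiation-induced property changes in PyC, and a statistical strength distribution for SiC.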

  16. 24 CFR 1000.134 - When may a recipient (or entity funded by a recipient) demolish or dispose of current assisted...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... or an entity funded by the recipient when: (1) A financial analysis demonstrates that it is more cost... considerations. (b) No action to demolish or dispose of the property other than performing the analysis cited in... written notification must set out the analysis used to arrive at the decision to demolish or dispose of...

  17. 24 CFR 1000.134 - When may a recipient (or entity funded by a recipient) demolish or dispose of current assisted...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... or an entity funded by the recipient when: (1) A financial analysis demonstrates that it is more cost... considerations. (b) No action to demolish or dispose of the property other than performing the analysis cited in... written notification must set out the analysis used to arrive at the decision to demolish or dispose of...

  18. The Impact of Measurement Noise in GPA Diagnostic Analysis of a Gas Turbine Engine

    NASA Astrophysics Data System (ADS)

    Ntantis, Efstratios L.; Li, Y. G.

    2013-12-01

    The performance diagnostic analysis of a gas turbine is accomplished by estimating a set of internal engine health parameters from available sensor measurements. No physical measuring instrument, however, can ever completely eliminate the presence of measurement uncertainties. Sensor measurements are often distorted by noise and bias, leading to inaccurate estimation results. This paper explores the impact of measurement noise on gas turbine gas path analysis (GPA). The analysis is demonstrated with a test case in which the gas turbine performance simulation and diagnostics code TURBOMATCH is used to build a performance model of an engine similar to the Rolls-Royce Trent 500 turbofan and to carry out the diagnostic analysis in the presence of different levels of measurement noise. To improve the reliability of the diagnostic results, a statistical analysis of the data scatter caused by sensor uncertainties is made. The diagnostic tool used for this statistical analysis of measurement noise impact is a model-based method utilizing non-linear GPA.
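    The scatter analysis described here can be mimicked with a toy Monte Carlo: perturb a clean sensor reading with Gaussian noise many times and examine the bias and spread of the result. The sensor value and noise level below are hypothetical, and a real GPA study would propagate the noisy measurements through the full diagnostic model rather than inspect them directly:

```python
import random
from statistics import mean, stdev

random.seed(42)

# Hypothetical clean sensor reading and 1-sigma noise level (% of reading).
clean_value = 1500.0   # e.g. an exhaust gas temperature, K
noise_pct = 0.4

samples = [clean_value + random.gauss(0.0, clean_value * noise_pct / 100.0)
           for _ in range(2000)]

bias = mean(samples) - clean_value
scatter = stdev(samples)
print(f"estimated bias: {bias:.2f} K, scatter (1 sigma): {scatter:.2f} K")
```

    With a 0.4% noise level the 1-sigma scatter is about 6 K; the bias estimate shrinks toward zero as the number of samples grows.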

  19. Automated, Ultra-Sterile Solid Sample Handling and Analysis on a Chip

    NASA Technical Reports Server (NTRS)

    Mora, Maria F.; Stockton, Amanda M.; Willis, Peter A.

    2013-01-01

    There are no existing ultra-sterile lab-on-a-chip systems that can accept solid samples and perform complete chemical analyses without human intervention. The proposed solution is to demonstrate completely automated lab-on-a-chip manipulation of powdered solid samples, followed by on-chip liquid extraction and chemical analysis. This technology utilizes a newly invented glass micro-device for solid manipulation, which mates with existing lab-on-a-chip instrumentation. Devices are fabricated in a Class 10 cleanroom at the JPL MicroDevices Lab, and are plasma-cleaned before and after assembly. Solid samples enter the device through a drilled hole in the top. Existing micro-pumping technology is used to transfer milligrams of powdered sample into an extraction chamber where it is mixed with liquids to extract organic material. Subsequent chemical analysis is performed using portable microchip capillary electrophoresis systems (CE). These instruments have been used for ultra-highly sensitive (parts-per-trillion, pptr) analysis of organic compounds including amines, amino acids, aldehydes, ketones, carboxylic acids, and thiols. Fully autonomous amino acid analyses in liquids were demonstrated; however, to date there have been no reports of completely automated analysis of solid samples on chip. This approach utilizes an existing portable instrument that houses optics, high-voltage power supplies, and solenoids for fully autonomous microfluidic sample processing and CE analysis with laser-induced fluorescence (LIF) detection. Furthermore, the entire system can be sterilized and placed in a cleanroom environment for analyzing samples returned from extraterrestrial targets, if desired. This is an entirely new capability never demonstrated before. 
The ability to manipulate solid samples, coupled with lab-on-a-chip analysis technology, will enable ultraclean and ultrasensitive end-to-end analysis of samples that is orders of magnitude more sensitive than the ppb goal given in the Science Instruments.

  20. Enhancing the ABAQUS thermomechanics code to simulate multipellet steady and transient LWR fuel rod behavior

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. L. Williamson

    A powerful multidimensional fuels performance analysis capability, applicable to both steady and transient fuel behavior, is developed based on enhancements to the commercially available ABAQUS general-purpose thermomechanics code. Enhanced capabilities are described, including: UO2 temperature- and burnup-dependent thermal properties, solid and gaseous fission product swelling, fuel densification, fission gas release, cladding thermal and irradiation creep, cladding irradiation growth, gap heat transfer, and gap/plenum gas behavior during irradiation. This new capability is demonstrated using a 2D axisymmetric analysis of the upper section of a simplified multipellet fuel rod, during both steady and transient operation. Comparisons are made between discrete and smeared-pellet simulations. Computational results demonstrate the importance of a multidimensional, multipellet, fully coupled thermomechanical approach. Interestingly, many of the inherent deficiencies in existing fuel performance codes (e.g., 1D thermomechanics, loose thermomechanical coupling, separate steady and transient analysis, cumbersome pre- and post-processing) are, in fact, ABAQUS strengths.

  1. [Text mining, a method for computer-assisted analysis of scientific texts, demonstrated by an analysis of author networks].

    PubMed

    Hahn, P; Dullweber, F; Unglaub, F; Spies, C K

    2014-06-01

    Searching for relevant publications is becoming more difficult with the increasing number of scientific articles. Text mining, as a specific form of computer-based data analysis, may be helpful in this context. Highlighting relations between authors and finding relevant publications concerning a specific subject using text analysis programs are illustrated graphically with two worked examples.
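    The author-network side of this text-mining example reduces to counting co-authorship pairs across a set of records. A minimal sketch using the article's author surnames purely as placeholder data (the paper counts below are invented):

```python
from collections import Counter
from itertools import combinations

# Hypothetical publication records: each entry lists one paper's authors.
papers = [
    ["Hahn", "Dullweber", "Unglaub"],
    ["Hahn", "Unglaub"],
    ["Hahn", "Spies"],
    ["Dullweber", "Spies"],
]

# Edge weight = number of co-authored papers for each author pair.
edges = Counter()
for authors in papers:
    for pair in combinations(sorted(authors), 2):
        edges[pair] += 1

for (a, b), w in edges.most_common():
    print(f"{a} -- {b}: {w}")
```

    Feeding the weighted edge list into a graph layout tool is what produces the kind of author-network visualization the article describes.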

  2. Impact of Vehicle Flexibility on IRVE-II Flight Dynamics

    NASA Technical Reports Server (NTRS)

    Bose, David M.; Toniolo, Matthew D.; Cheatwood, F. M.; Hughes, Stephen J.; Dillman, Robert A.

    2011-01-01

    The Inflatable Re-entry Vehicle Experiment II (IRVE-II) successfully launched from Wallops Flight Facility (WFF) on August 17, 2009. The primary objectives of this flight test were to demonstrate inflation and re-entry survivability, assess the thermal and drag performance of the re-entry vehicle, and collect flight data for refining pre-flight design and analysis tools. Post-flight analysis, including the trajectory reconstruction outlined in O'Keefe [3], demonstrated that the IRVE-II Research Vehicle (RV) met mission objectives but also identified a few anomalies of interest to flight dynamics engineers. Most notable of these anomalies was high normal acceleration during the re-entry pressure pulse. Deflection of the inflatable aeroshell during the pressure pulse was evident in flight video and identified as the likely cause of the anomaly. This paper provides a summary of further post-flight analysis, with particular attention to the impact of aeroshell flexibility on flight dynamics and the reconciliation of flight performance with pre-flight models. Independent methods for estimating the magnitude of the aeroshell deflection experienced on IRVE-II are discussed. The use of the results to refine models for pre-flight prediction of vehicle performance is then described.

  3. Latent profile analysis of regression-based norms demonstrates relationship of compounding MS symptom burden and negative work events.

    PubMed

    Frndak, Seth E; Smerbeck, Audrey M; Irwin, Lauren N; Drake, Allison S; Kordovski, Victoria M; Kunker, Katrina A; Khan, Anjum L; Benedict, Ralph H B

    2016-10-01

    We endeavored to clarify how distinct co-occurring symptoms relate to the presence of negative work events in employed multiple sclerosis (MS) patients. Latent profile analysis (LPA) was utilized to elucidate common disability patterns by isolating patient subpopulations. Samples of 272 employed MS patients and 209 healthy controls (HC) were administered neuroperformance tests of ambulation, hand dexterity, processing speed, and memory. Regression-based norms were created from the HC sample. LPA identified latent profiles using the regression-based z-scores. Finally, multinomial logistic regression tested for negative work event differences among the latent profiles. Four profiles were identified via LPA: a common profile (55%) characterized by slightly below average performance in all domains, a broadly low-performing profile (18%), a poor motor abilities profile with average cognition (17%), and a generally high-functioning profile (9%). Multinomial regression analysis revealed that the uniformly low-performing profile demonstrated a higher likelihood of reported negative work events. Employed MS patients with co-occurring motor, memory and processing speed impairments were most likely to report a negative work event, classifying them as uniquely at risk for job loss.
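    The regression-based norms step in this study has a simple core: fit test scores against demographics in healthy controls, then express each patient score as a z relative to the value predicted for a matched control. A sketch with invented one-predictor (age) data, not the study's sample:

```python
from statistics import mean

def fit_norms(ages, scores):
    # Ordinary least squares on healthy controls: score ~ b0 + b1 * age.
    ma, ms = mean(ages), mean(scores)
    b1 = (sum((a - ma) * (s - ms) for a, s in zip(ages, scores))
          / sum((a - ma) ** 2 for a in ages))
    b0 = ms - b1 * ma
    resid = [s - (b0 + b1 * a) for a, s in zip(ages, scores)]
    sd = (sum(r * r for r in resid) / (len(resid) - 2)) ** 0.5
    return b0, b1, sd

def z_score(age, score, b0, b1, sd):
    # Demographically adjusted z: distance from the score predicted
    # for a healthy control of the same age.
    return (score - (b0 + b1 * age)) / sd

# Invented healthy-control data: processing speed declines with age.
hc_ages = [30, 40, 50, 60, 35, 45, 55, 65]
hc_scores = [61, 56, 49, 44, 59, 51, 47, 43]
b0, b1, sd = fit_norms(hc_ages, hc_scores)

# A hypothetical patient scoring well below age expectation.
print(round(z_score(50, 46, b0, b1, sd), 2))
```

    The study's latent profile analysis then clusters patients on vectors of such z-scores across several neuroperformance domains.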

  4. From recording discrete actions to studying continuous goal-directed behaviours in team sports.

    PubMed

    Correia, Vanda; Araújo, Duarte; Vilar, Luís; Davids, Keith

    2013-01-01

    This paper highlights the importance of examining interpersonal interactions in performance analysis of team sports, predicated on the relationship between perception and action, compared to the traditional cataloguing of actions by individual performers. We discuss how ecological dynamics may provide a potential unifying theoretical and empirical framework to achieve this re-emphasis in research. With reference to data from illustrative studies on performance analysis and sport expertise, we critically evaluate some of the main assumptions and methodological approaches with regard to understanding how information influences action and decision-making during team sports performance. Current data demonstrate how the understanding of performance behaviours in team sports by sport scientists and practitioners may be enhanced with a re-emphasis in research on the dynamics of emergent ongoing interactions. Ecological dynamics provides formal and theoretically grounded descriptions of player-environment interactions with respect to key performance goals and the unfolding information of competitive performance. Developing these formal descriptions and explanations of sport performance may provide a significant contribution to the field of performance analysis, supporting design and intervention in both research and practice.

  5. Color perception and ATC job performance.

    DOT National Transportation Integrated Search

    1983-07-01

    Current OPM policy and guidance require demonstrated job-relatedness and reasonable accommodation in the application of physical qualifications. The OPM has accomplished an analysis of the Air Traffic Control Specialist (ATCS) series and recommended...

  6. Simulation analysis of temperature control on RCC arch dam of hydropower station

    NASA Astrophysics Data System (ADS)

    XIA, Shi-fa

    2017-12-01

    Temperature analysis of roller-compacted concrete (RCC) dams plays an important role in their design and construction. Based on the three-dimensional finite element method, the computation of the temperature field accounts for many factors, such as air temperature, temperature rise from cement hydration heat, concrete placing temperature, the influence of water in the reservoir, and boundary temperatures. Using the corresponding parameters of an RCC arch dam, the analysis of the temperature and stress fields during the construction and operation periods is performed. The study demonstrates that detailed thermal stress analysis should be performed for RCC dams to provide a basis for minimizing and controlling thermal cracking.
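    The finite-element temperature-field computation described above can be caricatured in one dimension: an explicit finite-difference slab with an exponentially decaying hydration heat source. All material parameters below are generic placeholders, not RCC design values, and a real analysis would be three-dimensional with placement schedules and lift joints:

```python
import math

# 1-D explicit finite-difference sketch of early-age temperature rise in a
# concrete slab: T_t = alpha * T_xx + q(t) / (rho * c), faces at ambient.
alpha = 1.0e-6                  # thermal diffusivity, m^2/s (placeholder)
rho_c = 2.4e6                   # volumetric heat capacity, J/(m^3 K)
q0, tau = 500.0, 3 * 86400.0    # hydration heat rate (W/m^3), decay time (s)

nx, dx = 21, 0.05               # 1 m thick slab, 21 nodes
dt = 0.4 * dx * dx / alpha      # satisfies the explicit stability limit
T = [20.0] * nx                 # uniform initial temperature, deg C

t = 0.0
while t < 7 * 86400.0:          # simulate the first week after placement
    q = q0 * math.exp(-t / tau)
    Tn = T[:]
    for i in range(1, nx - 1):
        Tn[i] = (T[i]
                 + alpha * dt / dx ** 2 * (T[i + 1] - 2 * T[i] + T[i - 1])
                 + q * dt / rho_c)
    Tn[0] = Tn[-1] = 20.0       # faces held at ambient temperature
    T, t = Tn, t + dt

peak = max(T)
print(f"peak slab temperature after one week: {peak:.1f} deg C")
```

    The interior heats toward the adiabatic hydration rise while conduction to the cool faces limits it; the resulting gradient is the source of the thermal stresses the abstract is concerned with.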

  7. Structural Analysis and Testing of the Inflatable Re-entry Vehicle Experiment (IRVE)

    NASA Technical Reports Server (NTRS)

    Lindell, Michael C.; Hughes, Stephen J.; Dixon, Megan; Wiley, Cliff E.

    2006-01-01

    The Inflatable Re-entry Vehicle Experiment (IRVE) is a 3.0 meter, 60 degree half-angle sphere cone, inflatable aeroshell experiment designed to demonstrate various aspects of inflatable technology during Earth re-entry. IRVE will be launched on a Terrier-Improved Orion sounding rocket from NASA's Wallops Flight Facility in the fall of 2006 to an altitude of approximately 164 kilometers and re-enter the Earth's atmosphere. The experiment will demonstrate exo-atmospheric inflation, inflatable structure leak performance throughout the flight regime, structural integrity under aerodynamic pressure and associated deceleration loads, thermal protection system performance, and aerodynamic stability. Structural integrity and dynamic response of the inflatable will be monitored with photogrammetric measurements of the leeward side of the aeroshell during flight. Aerodynamic stability and drag performance will be verified with on-board inertial measurements and radar tracking from multiple ground radar stations. In addition to demonstrating inflatable technology, IRVE will help validate structural, aerothermal, and trajectory modeling and analysis techniques for the inflatable aeroshell system. This paper discusses the structural analysis and testing of the IRVE inflatable structure. Equations are presented for calculating fabric loads in sphere cone aeroshells, and finite element results are presented which validate the equations. Fabric material properties and testing are discussed along with aeroshell fabrication techniques. Stiffness and dynamics tests conducted on a small-scale development unit and a full-scale prototype unit are presented along with correlated finite element models to predict the in-flight fundamental modes.

  8. How little data is enough? Phase-diagram analysis of sparsity-regularized X-ray computed tomography

    PubMed Central

    Jørgensen, J. S.; Sidky, E. Y.

    2015-01-01

    We introduce phase-diagram analysis, a standard tool in compressed sensing (CS), to the X-ray computed tomography (CT) community as a systematic method for determining how few projections suffice for accurate sparsity-regularized reconstruction. In CS, a phase diagram is a convenient way to study and express certain theoretical relations between sparsity and sufficient sampling. We adapt phase-diagram analysis for empirical use in X-ray CT for which the same theoretical results do not hold. We demonstrate in three case studies the potential of phase-diagram analysis for providing quantitative answers to questions of undersampling. First, we demonstrate that there are cases where X-ray CT empirically performs comparably with a near-optimal CS strategy, namely taking measurements with Gaussian sensing matrices. Second, we show that, in contrast to what might have been anticipated, taking randomized CT measurements does not lead to improved performance compared with standard structured sampling patterns. Finally, we show preliminary results of how well phase-diagram analysis can predict the sufficient number of projections for accurately reconstructing a large-scale image of a given sparsity by means of total-variation regularization. PMID:25939620

  9. How little data is enough? Phase-diagram analysis of sparsity-regularized X-ray computed tomography.

    PubMed

    Jørgensen, J S; Sidky, E Y

    2015-06-13

    We introduce phase-diagram analysis, a standard tool in compressed sensing (CS), to the X-ray computed tomography (CT) community as a systematic method for determining how few projections suffice for accurate sparsity-regularized reconstruction. In CS, a phase diagram is a convenient way to study and express certain theoretical relations between sparsity and sufficient sampling. We adapt phase-diagram analysis for empirical use in X-ray CT for which the same theoretical results do not hold. We demonstrate in three case studies the potential of phase-diagram analysis for providing quantitative answers to questions of undersampling. First, we demonstrate that there are cases where X-ray CT empirically performs comparably with a near-optimal CS strategy, namely taking measurements with Gaussian sensing matrices. Second, we show that, in contrast to what might have been anticipated, taking randomized CT measurements does not lead to improved performance compared with standard structured sampling patterns. Finally, we show preliminary results of how well phase-diagram analysis can predict the sufficient number of projections for accurately reconstructing a large-scale image of a given sparsity by means of total-variation regularization.
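
    The "near-optimal CS strategy" of Gaussian sensing matrices mentioned above can be illustrated with a toy sparse-recovery experiment. The sketch below uses assumed problem sizes and orthogonal matching pursuit (a generic CS solver, not the paper's total-variation-regularized CT reconstruction) to recover a k-sparse signal from m < n Gaussian measurements:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not from the paper): n unknowns, m measurements, k nonzeros.
n, m, k = 200, 100, 5

# Sparse ground-truth signal with nonzero magnitudes bounded away from zero.
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.uniform(1.0, 2.0, size=k) * rng.choice([-1.0, 1.0], size=k)

# Gaussian sensing matrix: the near-optimal CS sampling the abstract refers to.
A = rng.standard_normal((m, n)) / np.sqrt(m)
b = A @ x_true

# Orthogonal Matching Pursuit: greedily pick the atom most correlated with the
# residual, then re-fit by least squares on the selected columns.
S, r = [], b.copy()
for _ in range(k):
    S.append(int(np.argmax(np.abs(A.T @ r))))
    coef, *_ = np.linalg.lstsq(A[:, S], b, rcond=None)
    r = b - A[:, S] @ coef

x_hat = np.zeros(n)
x_hat[S] = coef
rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(rel_err)
```

    With m well above the empirical phase-transition boundary for this sparsity, recovery succeeds; shrinking m toward it is exactly the undersampling question the phase diagram quantifies.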

  10. 40 CFR 63.1571 - How and when do I conduct a performance test or other initial compliance demonstration?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... the option in paragraph (a)(1)(iii) in § 63.1564 (Ni lb/hr), and you use continuous parameter monitoring systems, you must establish an operating limit for the equilibrium catalyst Ni concentration based on the laboratory analysis of the equilibrium catalyst Ni concentration from the initial performance...

  11. Using goal setting and feedback to increase weekly running distance.

    PubMed

    Wack, Stephanie R; Crosland, Kimberly A; Miltenberger, Raymond G

    2014-01-01

    We evaluated goal setting with performance feedback to increase running distance among 5 healthy adults. Participants set a short-term goal each week and a long-term goal to achieve on completion of the study. Results demonstrated that goal setting and performance feedback increased running distance for all participants. © Society for the Experimental Analysis of Behavior.

  12. Meta-Analysis of the Impact of Positive Psychological Capital on Employee Attitudes, Behaviors, and Performance

    ERIC Educational Resources Information Center

    Avey, James B.; Reichard, Rebecca J.; Luthans, Fred; Mhatre, Ketan H.

    2011-01-01

    The positive core construct of psychological capital (or simply PsyCap), consisting of the psychological resources of hope, efficacy, resilience, and optimism, has recently been demonstrated to be open to human resource development (HRD) and performance management. The research stream on PsyCap has now grown to the point that a quantitative…

  13. Judgments of Self-Perceived Academic Competence and Their Differential Impact on Students' Achievement Motivation, Learning Approach, and Academic Performance

    ERIC Educational Resources Information Center

    Ferla, Johan; Valcke, Martin; Schuyten, Gilberte

    2010-01-01

    Using path analysis, the present study focuses on the development of a model describing the impact of four judgments of self-perceived academic competence on higher education students' achievement goals, learning approach, and academic performance. Results demonstrate that academic self-efficacy, self-efficacy for self-regulated learning, academic…

  14. Error analysis of filtering operations in pixel-duplicated images of diabetic retinopathy

    NASA Astrophysics Data System (ADS)

    Mehrubeoglu, Mehrube; McLauchlan, Lifford

    2010-08-01

    In this paper, diabetic retinopathy is chosen as a sample target image to demonstrate the effectiveness of image enlargement through pixel duplication in identifying regions of interest. Pixel duplication is presented as a simpler alternative to data interpolation techniques for detecting small structures in the images. A comparative analysis is performed on different image processing schemes applied to both original and pixel-duplicated images. Structures of interest are detected, and classification parameters are optimized for minimum false-positive detection in the original and enlarged retinal pictures. The error analysis demonstrates the advantages as well as shortcomings of pixel duplication in image enhancement when spatial averaging operations (smoothing filters) are also applied.
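
    Pixel duplication as described above is plain integer replication of each pixel: unlike interpolation, it invents no new intensity values. A minimal NumPy sketch on a toy 2x2 "image" (hypothetical data, not a retinal frame):

```python
import numpy as np

# A tiny stand-in "image" (hypothetical values).
img = np.array([[10, 20],
                [30, 40]], dtype=np.uint8)

# Pixel duplication: enlarge 2x by repeating each pixel along both axes.
# Every output intensity already existed in the input, unlike interpolation.
dup = np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)

print(dup)
```

    Applying a smoothing filter to `dup` would blend the duplicated blocks, which is the interaction the error analysis in the abstract examines.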

  15. Socioeconomic disparities in the utilization of mechanical thrombectomy for acute ischemic stroke in US hospitals.

    PubMed

    Brinjikji, W; Rabinstein, A A; McDonald, J S; Cloft, H J

    2014-03-01

    Previous studies have demonstrated that socioeconomic disparities in the treatment of cerebrovascular diseases exist. We examined a large administrative database to study disparities in the utilization of mechanical thrombectomy for acute ischemic stroke. Using the Perspective database, we studied disparities in mechanical thrombectomy utilization by patient race and insurance status in 1) all patients presenting with acute ischemic stroke and 2) patients presenting with acute ischemic stroke at centers that performed mechanical thrombectomy. We examined utilization rates of mechanical thrombectomy by race/ethnicity (white, black, and Hispanic) and insurance status (Medicare, Medicaid, self-pay, and private). Multivariate logistic regression analysis adjusting for potential confounding variables was performed to study the association between race/insurance status and mechanical thrombectomy utilization. The overall mechanical thrombectomy utilization rate was 0.15% (371/249,336); the utilization rate at centers that performed mechanical thrombectomy was 1.0% (371/35,376). In the sample of all patients with acute ischemic stroke, multivariate logistic regression analysis demonstrated that uninsured patients had significantly lower odds of mechanical thrombectomy utilization compared with privately insured patients (OR = 0.52, 95% CI = 0.25-0.95, P = .03), as did Medicare patients (OR = 0.53, 95% CI = 0.41-0.70, P < .0001). Blacks had significantly lower odds of mechanical thrombectomy utilization compared with whites (OR = 0.35, 95% CI = 0.23-0.51, P < .0001).
When considering only patients treated at centers performing mechanical thrombectomy, multivariate logistic regression analysis demonstrated that insurance was not associated with significant disparities in mechanical thrombectomy utilization; however, black patients had significantly lower odds of mechanical thrombectomy utilization compared with whites (OR = 0.41, 95% CI = 0.27-0.60, P < .0001). Significant socioeconomic disparities exist in the utilization of mechanical thrombectomy in the United States.
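
    The odds ratios above come from adjusted logistic regression; the unadjusted quantity can be computed directly from a 2x2 table. A minimal sketch with hypothetical counts (NOT the study's data), using the standard log-OR confidence interval:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI from a 2x2 table via the log-OR standard error.
    Rows: group of interest vs reference; columns: treated vs untreated."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 20/1000 treated in one group vs 55/1000 in the reference.
res = odds_ratio_ci(20, 980, 55, 945)
print(tuple(round(v, 3) for v in res))
```

    An OR below 1 with a CI excluding 1, as in the abstract's reported values, indicates significantly lower odds of treatment for the first group.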

  16. Coordinated Basal–Bolus Infusion for Tighter Postprandial Glucose Control in Insulin Pump Therapy

    PubMed Central

    Bondia, Jorge; Dassau, Eyal; Zisser, Howard; Calm, Remei; Vehí, Josep; Jovanovič, Lois; Doyle, Francis J.

    2009-01-01

    Background Basal and bolus insulin doses in intensive insulin therapy for type 1 diabetes mellitus (T1DM) are currently determined independently of each other. A new strategy that coordinates basal and bolus insulin infusion to cope with postprandial glycemia in pump therapy is proposed. Superior performance of this new strategy is demonstrated through a formal analysis of attainable performance in an in silico study. Methods The set inversion via interval analysis algorithm has been applied to obtain the feasible set of basal and bolus doses that, for a given meal, mathematically guarantee a postprandial response fulfilling the International Diabetes Federation (IDF) guidelines (i.e., no hypoglycemia and 2 h postprandial glucose below 140 mg/dl). Hypoglycemia has been defined as a glucose value below 70 mg/dl. A 5 h time horizon has been considered for a 70 kg in silico T1DM subject consuming meals in the range of 30 to 80 g of carbohydrates. Results The computed feasible sets demonstrate that the current separated basal/bolus strategy dramatically limits the attainable performance. For a nominal basal of 0.8 IU/h leading to a basal glucose of approximately 100 mg/dl, the IDF guidelines cannot be fulfilled for meals greater than 50 g of carbohydrates, independent of the bolus insulin computed. However, coordinating the basal and bolus insulin delivery can achieve this. A decrement of basal insulin during the postprandial period is required together with an increase in bolus insulin, in appropriate percentages, which is meal dependent. After 3 h, basal insulin can be restored to its nominal value. Conclusions The new strategy meets the IDF guidelines on a typical day, contrary to the standard basal/bolus strategy, yielding a mean 2 h postprandial glucose reduction of 36.4 mg/dl without late hypoglycemia. 
The application of interval analysis for the computation of feasible sets is demonstrated to be a powerful tool for the analysis of attainable performance in glucose control. PMID:20046653

  17. BeadArray Expression Analysis Using Bioconductor

    PubMed Central

    Ritchie, Matthew E.; Dunning, Mark J.; Smith, Mike L.; Shi, Wei; Lynch, Andy G.

    2011-01-01

    Illumina whole-genome expression BeadArrays are a popular choice in gene profiling studies. Aside from the vendor-provided software tools for analyzing BeadArray expression data (GenomeStudio/BeadStudio), there exists a comprehensive set of open-source analysis tools in the Bioconductor project, many of which have been tailored to exploit the unique properties of this platform. In this article, we explore a number of these software packages and demonstrate how to perform a complete analysis of BeadArray data in various formats. The key steps of importing data, performing quality assessments, preprocessing, and annotation in the common setting of assessing differential expression in designed experiments will be covered. PMID:22144879

  18. Structural reliability analysis under evidence theory using the active learning kriging model

    NASA Astrophysics Data System (ADS)

    Yang, Xufeng; Liu, Yongshou; Ma, Panke

    2017-11-01

    Structural reliability analysis under evidence theory is investigated. It is rigorously proved that a surrogate model providing only correct sign prediction of the performance function can meet the accuracy requirement of evidence-theory-based reliability analysis. Accordingly, a method based on the active learning kriging model which only correctly predicts the sign of the performance function is proposed. Interval Monte Carlo simulation and a modified optimization method based on Karush-Kuhn-Tucker conditions are introduced to make the method more efficient in estimating the bounds of failure probability based on the kriging model. Four examples are investigated to demonstrate the efficiency and accuracy of the proposed method.
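
    The paper's central observation, that a surrogate need only predict the sign of the performance function to give the correct failure probability, can be illustrated with plain Monte Carlo. The sketch below uses a hypothetical performance function g and a deliberately distorted surrogate that preserves only g's sign; both yield identical failure-probability estimates:

```python
import random

random.seed(1)

# Hypothetical performance function: failure occurs when g(x, y) < 0.
def g(x, y):
    return 3.0 - x * x - y

# A crude surrogate that distorts magnitudes but never changes the sign of g,
# mimicking the sign-correct kriging surrogate of the abstract.
def g_hat(x, y):
    v = g(x, y)
    return 10.0 * v + (0.5 * v if v > 0 else 0.0)  # same zero crossings as g

samples = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(20000)]

pf_true = sum(g(x, y) < 0 for x, y in samples) / len(samples)
pf_surr = sum(g_hat(x, y) < 0 for x, y in samples) / len(samples)

# Identical estimates: only the sign of the performance function matters.
print(pf_true, pf_true == pf_surr)
```

    This is why an active-learning scheme can stop refining the kriging model once the sign is trusted near the limit state, rather than driving down prediction error everywhere.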

  19. Overview of RICOR's reliability theoretical analysis, accelerated life demonstration test results and verification by field data

    NASA Astrophysics Data System (ADS)

    Vainshtein, Igor; Baruch, Shlomi; Regev, Itai; Segal, Victor; Filis, Avishai; Riabzev, Sergey

    2018-05-01

    The growing demand for EO applications that operate around the clock, 24/7, such as border surveillance systems, emphasizes the need for a highly reliable cryocooler with increased operational availability and optimized Integrated Logistic Support (ILS). To meet this need, RICOR developed linear and rotary cryocoolers that successfully achieved this goal. Cryocooler MTTF was analyzed by theoretical reliability evaluation methods, demonstrated by normal and accelerated life tests at the cryocooler level, and finally verified by field data analysis derived from cryocoolers operating at the system level. This paper reviews theoretical reliability analysis methods together with reliability test results derived from standard and accelerated life demonstration tests performed at RICOR's advanced reliability laboratory. To close the loop on the work process, reliability verification data from fielded systems are presented as feedback.

  20. Uncertainty in benefit cost analysis of smart grid demonstration-projects in the U.S., China, and Italy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karali, Nihan; Flego, Gianluca; Yu, Jiancheng

    Given the substantial investments required, there has been keen interest in conducting benefits analysis, i.e., quantifying, and often monetizing, the performance of smart grid technologies. In this study, we compare two different approaches: (1) the Electric Power Research Institute (EPRI) benefits analysis method and its adaptation to European contexts by the European Commission, Joint Research Centre (JRC), and (2) the Analytic Hierarchy Process (AHP) and fuzzy logic decision-making method. These are applied to three case demonstration projects executed in three different countries: the U.S., China, and Italy, considering uncertainty in each case. This work is conducted under the U.S. (United States)-China Climate Change Working Group, smart grid, with an additional major contribution by the European Commission. The following is a brief description of the three demonstration projects.

  1. Multi-threaded Sparse Matrix Sparse Matrix Multiplication for Many-Core and GPU Architectures.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deveci, Mehmet; Trott, Christian Robert; Rajamanickam, Sivasankaran

    Sparse Matrix-Matrix multiplication is a key kernel that has applications in several domains such as scientific computing and graph analysis. Several algorithms have been studied in the past for this foundational kernel. In this paper, we develop parallel algorithms for sparse matrix-matrix multiplication with a focus on performance portability across different high performance computing architectures. The performance of these algorithms depends on the data structures used in them. We compare different types of accumulators in these algorithms and demonstrate the performance difference between these data structures. Furthermore, we develop a meta-algorithm, kkSpGEMM, to choose the right algorithm and data structure based on the characteristics of the problem. We show performance comparisons on three architectures and demonstrate the need for the community to develop two-phase sparse matrix-matrix multiplication implementations for efficient reuse of the data structures involved.

  2. Multi-threaded Sparse Matrix-Matrix Multiplication for Many-Core and GPU Architectures.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deveci, Mehmet; Rajamanickam, Sivasankaran; Trott, Christian Robert

    Sparse Matrix-Matrix multiplication is a key kernel that has applications in several domains such as scientific computing and graph analysis. Several algorithms have been studied in the past for this foundational kernel. In this paper, we develop parallel algorithms for sparse matrix-matrix multiplication with a focus on performance portability across different high performance computing architectures. The performance of these algorithms depends on the data structures used in them. We compare different types of accumulators in these algorithms and demonstrate the performance difference between these data structures. Furthermore, we develop a meta-algorithm, kkSpGEMM, to choose the right algorithm and data structure based on the characteristics of the problem. We show performance comparisons on three architectures and demonstrate the need for the community to develop two-phase sparse matrix-matrix multiplication implementations for efficient reuse of the data structures involved.
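
    One of the accumulator choices such algorithms compare is a hash map, which collects contributions to each output row. A minimal pure-Python sketch of row-wise SpGEMM with a dict accumulator (toy matrices; this is an illustration of the technique, not the kkSpGEMM implementation):

```python
# Sparse matrices represented as {row: {col: val}} dicts for brevity.
def spgemm(A, B):
    """Row-by-row sparse matrix-matrix product using a hash-map accumulator:
    C[i, j] = sum_k A[i, k] * B[k, j], touching only stored nonzeros."""
    C = {}
    for i, row in A.items():
        acc = {}  # hash accumulator for row i of C
        for k, a_ik in row.items():
            for j, b_kj in B.get(k, {}).items():
                acc[j] = acc.get(j, 0.0) + a_ik * b_kj
        if acc:
            C[i] = acc
    return C

A = {0: {0: 1.0, 2: 2.0}, 1: {1: 3.0}}
B = {0: {1: 4.0}, 1: {0: 5.0}, 2: {1: 6.0}}
print(spgemm(A, B))  # {0: {1: 16.0}, 1: {0: 15.0}}
```

    On real hardware the trade-off is between hash accumulators like this one and dense or sorted-list accumulators, which is exactly the comparison the abstract describes.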

  3. Parametric and experimental analysis using a power flow approach

    NASA Technical Reports Server (NTRS)

    Cuschieri, J. M.

    1990-01-01

    A structural power flow approach for the analysis of structure-borne transmission of vibrations is used to analyze the influence of structural parameters on transmitted power. The parametric analysis is also performed using the Statistical Energy Analysis approach, and the results are compared with those obtained using the power flow approach. The advantages of structural power flow analysis are demonstrated by comparing the types of results that are obtained by the two analytical methods. Also, to demonstrate that the power flow results represent a direct physical parameter that can be measured on a typical structure, an experimental study of structural power flow is presented. This experimental study presents results for an L-shaped beam for which a solution was already available. Various methods to measure vibrational power flow are compared to study their advantages and disadvantages.

  4. Universal microfluidic automaton for autonomous sample processing: application to the Mars Organic Analyzer.

    PubMed

    Kim, Jungkyu; Jensen, Erik C; Stockton, Amanda M; Mathies, Richard A

    2013-08-20

    A fully integrated multilayer microfluidic chemical analyzer for automated sample processing and labeling, as well as analysis using capillary zone electrophoresis, is developed and characterized. Using lifting-gate microfluidic control valve technology, a microfluidic automaton consisting of a two-dimensional microvalve cellular array is fabricated with soft lithography in a format that enables facile integration with a microfluidic capillary electrophoresis device. The programmable sample processor performs precise mixing, metering, and routing operations that can be combined to achieve automation of complex and diverse assay protocols. Sample labeling protocols for amino acid, aldehyde/ketone, and carboxylic acid analysis are performed automatically, followed by automated transfer and analysis by the integrated microfluidic capillary electrophoresis chip. Equivalent performance to off-chip sample processing is demonstrated for each compound class; the automated analysis resulted in a limit of detection of ~16 nM for amino acids. Our microfluidic automaton provides a fully automated, portable microfluidic analysis system capable of autonomous analysis of diverse compound classes in challenging environments.

  5. Evaluating the Effect of Virtual Reality Temporal Bone Simulation on Mastoidectomy Performance: A Meta-analysis.

    PubMed

    Lui, Justin T; Hoy, Monica Y

    2017-06-01

    Background The increasing prevalence of virtual reality simulation in temporal bone surgery warrants an investigation to assess training effectiveness. Objectives To determine if temporal bone simulator use improves mastoidectomy performance. Data Sources Ovid Medline, Embase, and PubMed databases were systematically searched per the PRISMA guidelines. Review Methods Inclusion criteria were peer-reviewed publications that utilized quantitative data of mastoidectomy performance following the use of a temporal bone simulator. The search was restricted to human studies published in English. Studies were excluded if they were in non-peer-reviewed format, were descriptive in nature, or failed to provide surgical performance outcomes. Meta-analysis calculations were then performed. Results A meta-analysis based on the random-effects model revealed an improvement in overall mastoidectomy performance following training on the temporal bone simulator. A standardized mean difference of 0.87 (95% CI, 0.38-1.35) was generated in the setting of a heterogeneous study population (I² = 64.3%, P < .006). Conclusion In the context of a diverse population of virtual reality simulation temporal bone surgery studies, meta-analysis calculations demonstrate an improvement in trainee mastoidectomy performance with virtual simulation training.
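
    A random-effects pooling of standardized mean differences like the one above can be sketched with the DerSimonian-Laird estimator. The per-study effects and variances below are hypothetical inputs for illustration, not the review's data:

```python
import math

def dersimonian_laird(effects, variances):
    """Pool per-study effect sizes under a random-effects model using the
    DerSimonian-Laird between-study variance (tau^2) estimate."""
    w = [1 / v for v in variances]                     # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                      # between-study variance
    w_re = [1 / (v + tau2) for v in variances]         # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se

# Hypothetical per-study SMDs and variances (not the review's data).
smd, lo, hi = dersimonian_laird([0.4, 1.2, 0.9], [0.05, 0.10, 0.08])
print(round(smd, 2), round(lo, 2), round(hi, 2))
```

    A pooled SMD whose CI excludes zero, as in the abstract's 0.87 (0.38-1.35), indicates a significant training effect despite between-study heterogeneity.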

  6. Multiprocessor smalltalk: Implementation, performance, and analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pallas, J.I.

    1990-01-01

    Multiprocessor Smalltalk demonstrates the value of object-oriented programming on a multiprocessor. Its implementation and analysis shed light on three areas: concurrent programming in an object-oriented language without special extensions, implementation techniques for adapting to multiprocessors, and performance factors in the resulting system. Adding parallelism to Smalltalk code is easy, because programs already use control abstractions like iterators. Smalltalk's basic control and concurrency primitives (lambda expressions, processes, and semaphores) can be used to build parallel control abstractions, including parallel iterators, parallel objects, atomic objects, and futures. Language extensions for concurrency are not required. This implementation demonstrates that it is possible to build an efficient parallel object-oriented programming system and illustrates techniques for doing so. Three modification tools (serialization, replication, and reorganization) adapted the Berkeley Smalltalk interpreter to the Firefly multiprocessor. Multiprocessor Smalltalk's performance shows that the combination of multiprocessing and object-oriented programming can be effective: speedups (relative to the original serial version) exceed 2.0 for five processors on all the benchmarks; the median efficiency is 48%. Analysis shows both where performance is lost and how to improve and generalize the experimental results. Changes in the interpreter to support concurrency add at most 12% overhead; better access to per-process variables could eliminate much of that. Changes in the user code to express concurrency add as much as 70% overhead; this overhead could be reduced to 54% if blocks (lambda expressions) were reentrant. Performance is also lost when the program cannot keep all five processors busy.
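
    The reported figures relate through parallel efficiency = speedup / processor count: the median efficiency of 48% on five processors corresponds to a median speedup of 2.4. A one-line sketch (the sample speedups other than 2.4 are illustrative, not measured data):

```python
# Parallel efficiency = speedup / processor count. On the abstract's five
# processors, a speedup of 2.4 gives the reported median efficiency of 48%.
def efficiency(speedup, procs):
    return speedup / procs

for s in (2.0, 2.4, 3.5):  # illustrative speedups
    print(f"{s:.1f}x on 5 CPUs -> {efficiency(s, 5):.0%}")
```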

  7. ALS rocket engine combustion devices design and demonstration

    NASA Technical Reports Server (NTRS)

    Arreguin, Steve

    1989-01-01

    Work performed during Phase One is summarized, and the significant technical and programmatic accomplishments occurring during this period are documented. In addition to a summary of the results, methodologies, trade studies, design, fabrication, and hardware conditions, the following are included: the evolving Maintainability Plan, Reliability Program Plan, Failure Summary and Analysis Report, and the Failure Mode and Effect Analysis.

  8. Improving Measurement of the EFNEP Outcomes Using Factor Analysis of the Behavior Checklist

    ERIC Educational Resources Information Center

    Hoerr, Sharon L.; Abdulkadri, Abdullahi O.; Miller, Steven; Waltersdorf, Christine; LaShore, Margaret; Martin, Karen; Newkirk, Cathy

    2011-01-01

    This article advances the literature on assessment of EFNEP's effectiveness. Factor analysis of Behavior Checklist items was performed to arrive at a parsimonious set of constructs used to assess the effects of program attributes on participants' behavior change. Based on the data from Michigan EFNEP, the use of constructs demonstrated a robust…

  9. Beyond Traditional School Value-Added Models: A Multilevel Analysis of Complex School Effects in Chile

    ERIC Educational Resources Information Center

    Troncoso, Patricio; Pampaka, Maria; Olsen, Wendy

    2016-01-01

    School value-added studies have largely demonstrated the effects of socioeconomic and demographic characteristics of the schools and the pupils on performance in standardised tests. Traditionally, these studies have assessed the variation coming only from the schools and the pupils. However, recent studies have shown that the analysis of academic…

  10. An exploratory analysis of the relationship between ambient ozone and particulate matter concentrations during early pregnancy and selected birth defects in Texas

    EPA Science Inventory

    Background: Associations between ozone (O3) and fine particulate matter (PM2.5) concentrations and birth outcomes have been previously demonstrated. We perform an exploratory analysis of O3 and PM2.5 concentrations during early pregnancy and multiple types of birth defects. Met...

  11. Optimization of Soft Tissue Management, Spacer Design, and Grafting Strategies for Large Segmental Bone Defects using the Chronic Caprine Tibial Defect Model

    DTIC Science & Technology

    2014-10-01

    histology, and microCT analysis. In the current phase of work he will receive more specialized training and orientation to microCT analysis...fibrous connective tissue. • Performed histology on goat autogenous bone graft which demonstrated that the quantity and quality of cancellous bone graft

  12. Printed strain sensors for early damage detection in engineering structures

    NASA Astrophysics Data System (ADS)

    Zymelka, Daniel; Yamashita, Takahiro; Takamatsu, Seiichi; Itoh, Toshihiro; Kobayashi, Takeshi

    2018-05-01

    In this paper, we demonstrate the analysis of strain measurements recorded using a screen-printed sensor array bonded to a metal plate and subjected to high strains. The analysis was intended to evaluate the capability of the printed strain sensors to detect abnormal strain distributions before actual defects (cracks) appear in the analyzed structures. The results demonstrate that the developed device can accurately localize the enhanced strains at the very early stage of crack formation. The promising performance and low fabrication cost confirm the potential suitability of the printed strain sensors for applications within the framework of structural health monitoring (SHM).

  13. Effects of equivalent series resistance on the noise mitigation performance of piezoelectric shunt damping

    NASA Astrophysics Data System (ADS)

    Lai, Szu Cheng; Sharifzadeh Mirshekarloo, Meysam; Yao, Kui

    2017-05-01

    Piezoelectric shunt damping (PSD) utilizes an electrically shunted piezoelectric damper attached to a panel structure to suppress the transmission of acoustic noise. The paper develops an understanding of the effects of the equivalent series resistance (ESR) of the piezoelectric damper in a PSD system on noise mitigation performance, and demonstrates that an increased ESR leads to a significant rise in noise transmissibility due to a reduction in the system's mechanical damping. It is further demonstrated with experimental results that ESR effects can be compensated in the shunt circuit to significantly improve the noise mitigation performance. A theoretical electrical equivalent model of the PSD incorporating the ESR is established for quantitative analysis of ESR effects on noise mitigation.

  14. Intelligent redundant actuation system requirements and preliminary system design

    NASA Technical Reports Server (NTRS)

    Defeo, P.; Geiger, L. J.; Harris, J.

    1985-01-01

    Several redundant actuation system configurations were designed and demonstrated to satisfy the stringent operational requirements of advanced flight control systems. However, this has been accomplished largely through brute force hardware redundancy, resulting in significantly increased computational requirements on the flight control computers which perform the failure analysis and reconfiguration management. Modern technology now provides powerful, low-cost microprocessors which are effective in performing failure isolation and configuration management at the local actuator level. One such concept, called an Intelligent Redundant Actuation System (IRAS), significantly reduces the flight control computer requirements and performs the local tasks more comprehensively than previously feasible. The requirements and preliminary design of an experimental laboratory system capable of demonstrating the concept and sufficiently flexible to explore a variety of configurations are discussed.

  15. Static analysis techniques for semiautomatic synthesis of message passing software skeletons

    DOE PAGES

    Sottile, Matthew; Dagit, Jason; Zhang, Deli; ...

    2015-06-29

    The design of high-performance computing architectures demands performance analysis of large-scale parallel applications to derive various parameters concerning hardware design and software development. The process of performance analysis and benchmarking an application can be done in several ways with varying degrees of fidelity. One of the most cost-effective ways is to do a coarse-grained study of large-scale parallel applications through the use of program skeletons. The concept of a “program skeleton” that we discuss in this article is an abstracted program that is derived from a larger program where source code that is determined to be irrelevant is removed for the purposes of the skeleton. In this work, we develop a semiautomatic approach for extracting program skeletons based on compiler program analysis. Finally, we demonstrate correctness of our skeleton extraction process by comparing details from communication traces, as well as show the performance speedup of using skeletons by running simulations in the SST/macro simulator.
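
    A toy version of skeleton extraction can be written with Python's ast module: keep only statements that mention a designated set of communication primitives and drop the rest. This is an illustrative stand-in for compiler-based extraction, not the paper's actual analysis, and the primitive names (send, recv, barrier) are assumptions:

```python
import ast

COMM_CALLS = {"send", "recv", "barrier"}  # assumed "relevant" primitives

class Skeletonizer(ast.NodeTransformer):
    """Drop top-level statements in a function body that reference none of the
    communication primitives; real skeletonization would also keep the data
    dependencies those calls need."""
    def visit_FunctionDef(self, node):
        kept = []
        for stmt in node.body:
            names = {n.id for n in ast.walk(stmt) if isinstance(n, ast.Name)}
            if names & COMM_CALLS:
                kept.append(stmt)
        node.body = kept or [ast.Pass()]
        return node

src = """
def step(x):
    y = x * x      # local compute: irrelevant to messaging, dropped
    t = y + 1      # irrelevant, dropped
    send(x)        # communication: kept in the skeleton
    recv()
"""
tree = Skeletonizer().visit(ast.parse(src))
out = ast.unparse(tree)
print(out)
```

    The surviving two-statement body preserves the program's communication structure, which is what a coarse-grained simulator such as SST/macro needs to replay.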

  16. Performance Analysis, Design Considerations, and Applications of Extreme-Scale In Situ Infrastructures

    DOE PAGES

    Ayachit, Utkarsh; Bauer, Andrew; Duque, Earl P. N.; ...

    2016-11-01

    A key trend facing extreme-scale computational science is the widening gap between computational and I/O rates, and the challenge that follows is how to best gain insight from simulation data when it is increasingly impractical to save it to persistent storage for subsequent visual exploration and analysis. One approach to this challenge is centered around the idea of in situ processing, where visualization and analysis processing is performed while data is still resident in memory. Our paper examines several key design and performance issues related to the idea of in situ processing at extreme scale on modern platforms: scalability, overhead, performance measurement and analysis, comparison and contrast with a traditional post hoc approach, and interfacing with simulation codes. We illustrate these principles in practice with studies, conducted on large-scale HPC platforms, that include a miniapplication and multiple science application codes, one of which demonstrates in situ methods in use at greater than 1M-way concurrency.

  17. Boom Minimization Framework for Supersonic Aircraft Using CFD Analysis

    NASA Technical Reports Server (NTRS)

    Ordaz, Irian; Rallabhandi, Sriram K.

    2010-01-01

    A new framework is presented for shape optimization using analytical shape functions and high-fidelity computational fluid dynamics (CFD) via Cart3D. The focus of the paper is the system-level integration of several key enabling analysis tools and automation methods to perform shape optimization and reduce sonic boom footprint. A boom mitigation case study subject to performance, stability and geometrical requirements is presented to demonstrate a subset of the capabilities of the framework. Lastly, a design space exploration is carried out to assess the key parameters and constraints driving the design.

  18. The effects of videotape modeling on staff acquisition of functional analysis methodology.

    PubMed

    Moore, James W; Fisher, Wayne W

    2007-01-01

    Lectures and two types of video modeling were compared to determine their relative effectiveness in training 3 staff members to conduct functional analysis sessions. Video modeling that contained a larger number of therapist exemplars resulted in mastery-level performance eight of the nine times it was introduced, whereas neither lectures nor partial video modeling produced significant improvements in performance. Results demonstrated that video modeling provided an effective training strategy but only when a wide range of exemplars of potential therapist behaviors were depicted in the videotape.

  19. The Effects of Videotape Modeling on Staff Acquisition of Functional Analysis Methodology

    PubMed Central

    Moore, James W; Fisher, Wayne W

    2007-01-01

    Lectures and two types of video modeling were compared to determine their relative effectiveness in training 3 staff members to conduct functional analysis sessions. Video modeling that contained a larger number of therapist exemplars resulted in mastery-level performance eight of the nine times it was introduced, whereas neither lectures nor partial video modeling produced significant improvements in performance. Results demonstrated that video modeling provided an effective training strategy but only when a wide range of exemplars of potential therapist behaviors were depicted in the videotape. PMID:17471805

  20. Computational Fluid Dynamic Investigation of Loss Mechanisms in a Pulse-Tube Refrigerator

    NASA Astrophysics Data System (ADS)

    Martin, K.; Esguerra, J.; Dodson, C.; Razani, A.

    2015-12-01

    In predicting pulse-tube cryocooler (PTC) performance, one-dimensional (1-D) pulse-tube refrigerator (PTR) design and analysis tools such as Gedeon Associates' SAGE® typically include models for performance degradation due to thermodynamically irreversible processes. SAGE®, in particular, accounts for convective loss, turbulent conductive loss, and numerical diffusion “loss” via correlation functions based on analysis and empirical testing. In this study, we compare CFD and SAGE® estimates of PTR refrigeration performance for four distinct pulse-tube lengths. Performance predictions from PTR CFD models are compared to SAGE® predictions for all four cases. Then, to further demonstrate the benefits of higher-fidelity, multidimensional CFD simulation, the PTR loss mechanisms are characterized in terms of their spatial and temporal locations.

  1. Evaluation of Counter-Based Dynamic Load Balancing Schemes for Massive Contingency Analysis on Over 10,000 Cores

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Huang, Zhenyu; Rice, Mark J.

    Contingency analysis studies are necessary to assess the impact of possible power system component failures. The results of the contingency analysis are used to ensure grid reliability, and in power market operation for the feasibility test of market solutions. Currently, these studies are performed in real time based on the current operating conditions of the grid with a pre-selected contingency list, which might result in overlooking some critical contingencies caused by variable system status. To have a complete picture of a power grid, more contingencies need to be studied to improve grid reliability. High-performance computing techniques hold the promise of being able to perform the analysis for more contingency cases within a much shorter time frame. This paper evaluates the performance of counter-based dynamic load balancing schemes for a massive contingency analysis program on 10,000+ cores. One million N-2 contingency analysis cases with a Western Electricity Coordinating Council power grid model have been used to demonstrate the performance. Speedups of 3964 with 4096 cores and 7877 with 10,240 cores were obtained. This paper reports the performance of the load balancing scheme with a single counter and two counters, describes disk I/O issues, and discusses other potential techniques for further improving the performance.
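    The reported speedups translate into parallel efficiencies that can be checked with a one-line calculation. This is a sketch using the standard speedup-over-cores definition; the figures come from the abstract above, not from code in the paper:

    ```python
    def parallel_efficiency(speedup: float, cores: int) -> float:
        """Parallel efficiency is the achieved speedup divided by the core count."""
        return speedup / cores

    # Figures reported in the abstract above:
    print(f"{parallel_efficiency(3964, 4096):.1%}")   # -> 96.8%
    print(f"{parallel_efficiency(7877, 10240):.1%}")  # -> 76.9%
    ```

    The drop in efficiency at 10,240 cores is consistent with the disk I/O issues the paper reports as a scaling bottleneck.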

  2. Constrained independent component analysis approach to nonobtrusive pulse rate measurements

    NASA Astrophysics Data System (ADS)

    Tsouri, Gill R.; Kyal, Survi; Dianat, Sohail; Mestha, Lalit K.

    2012-07-01

    Nonobtrusive pulse rate measurement using a webcam is considered. We demonstrate how state-of-the-art algorithms based on independent component analysis suffer from a sorting problem which hinders their performance, and propose a novel algorithm based on constrained independent component analysis to improve performance. We present how the proposed algorithm extracts a photoplethysmography signal and resolves the sorting problem. In addition, we perform a comparative study between the proposed algorithm and state-of-the-art algorithms over 45 video streams using a finger probe oximeter for reference measurements. The proposed algorithm provides improved accuracy: the root mean square error is decreased from 20.6 and 9.5 beats per minute (bpm) for existing algorithms to 3.5 bpm for the proposed algorithm. An error of 3.5 bpm is within the inaccuracy expected from the reference measurements. This implies that the proposed algorithm provided performance of equal accuracy to the finger probe oximeter.
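    The bpm accuracy figures above are root mean square errors against the reference oximeter. The metric itself is standard and can be sketched as follows; the function name and sample readings are illustrative, not from the study:

    ```python
    import math

    def rmse_bpm(estimates, reference):
        """Root mean square error (in bpm) between estimated and reference pulse rates."""
        assert len(estimates) == len(reference)
        return math.sqrt(
            sum((e - r) ** 2 for e, r in zip(estimates, reference)) / len(reference)
        )

    # Toy example: three paired readings (webcam estimate vs. finger probe reference)
    print(round(rmse_bpm([72, 75, 80], [70, 75, 83]), 2))  # -> 2.08
    ```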

  3. Variability in memory performance in aged healthy individuals: an fMRI study.

    PubMed

    Grön, Georg; Bittner, Daniel; Schmitz, Bernd; Wunderlich, Arthur P; Tomczak, Reinhard; Riepe, Matthias W

    2003-01-01

    Episodic memory performance varies in older subjects but underlying biological correlates remain as yet ambiguous. We investigated episodic memory in healthy older individuals (n=24; mean age: 64.4±6.7 years) without subjective memory complaints or objective cognitive impairment. Episodic memory was assessed with repetitive learning and recall of abstract geometric patterns during fMRI. Group analysis of brain activity during initial learning and maximum recall revealed hippocampal activation. Correlation analysis of brain activation and task performance demonstrated significant hippocampal activity during initial learning and maximum recall in a success-dependent manner. Neither age nor gray matter densities correlated with hippocampal activation. Functional imaging of episodic memory thus permits objective detection of variability in hippocampal recruitment in healthy aged individuals without subjective memory complaints. Correlation analysis of brain activation and performance during an episodic memory task may be used to determine and follow up hippocampal malfunction in a very sensitive manner.

  4. Constrained independent component analysis approach to nonobtrusive pulse rate measurements.

    PubMed

    Tsouri, Gill R; Kyal, Survi; Dianat, Sohail; Mestha, Lalit K

    2012-07-01

    Nonobtrusive pulse rate measurement using a webcam is considered. We demonstrate how state-of-the-art algorithms based on independent component analysis suffer from a sorting problem which hinders their performance, and propose a novel algorithm based on constrained independent component analysis to improve performance. We present how the proposed algorithm extracts a photoplethysmography signal and resolves the sorting problem. In addition, we perform a comparative study between the proposed algorithm and state-of-the-art algorithms over 45 video streams using a finger probe oximeter for reference measurements. The proposed algorithm provides improved accuracy: the root mean square error is decreased from 20.6 and 9.5 beats per minute (bpm) for existing algorithms to 3.5 bpm for the proposed algorithm. An error of 3.5 bpm is within the inaccuracy expected from the reference measurements. This implies that the proposed algorithm provided performance of equal accuracy to the finger probe oximeter.

  5. Guaifenesin- and ephedrine-induced stones.

    PubMed

    Assimos, D G; Langenstroer, P; Leinbach, R F; Mandel, N S; Stern, J M; Holmes, R P

    1999-11-01

    We report a new type of drug-induced stone that is caused by overconsumption of preparations containing guaifenesin and ephedrine. Clinical and stone analysis data from the Molecular Structure Laboratory at the Veterans Affairs Medical Center in Milwaukee, Wisconsin, were reviewed. Stone analysis was performed by Fourier transform infrared spectroscopy, high-resolution X-ray crystallographic powder diffraction, or both. The urine and stone material from one of the subjects were analyzed with high-performance liquid chromatography. Stone analysis from seven patients demonstrated metabolites of guaifenesin. High-performance liquid chromatography revealed that the stone and urine from one subject had a high content of guaifenesin metabolites and a small amount of ephedrine. Demographic data were available on five patients. Three had a history of alcohol or drug dependency. All were consuming over-the-counter preparations containing ephedrine and guaifenesin. Four admitted to taking excessive quantities of these agents, mainly as a stimulant. Hypocitraturia was identified in two individuals subjected to urinary metabolic testing. These stones are radiolucent on standard X-ray imaging but can be demonstrated on unenhanced CT. Shockwave lithotripsy was performed in two patients, and the calculi fragmented easily. Individuals consuming large quantities of preparations containing ephedrine and guaifenesin may be at risk of developing stones derived mainly from metabolites of guaifenesin and small quantities of ephedrine. These patients may be prone to drug or alcohol dependency.

  6. Acute toxicity of excess mercury on the photosynthetic performance of cyanobacterium, S. platensis--assessment by chlorophyll fluorescence analysis.

    PubMed

    Lu, C M; Chau, C W; Zhang, J H

    2000-07-01

    Measurement of chlorophyll fluorescence has been shown to be a rapid, non-invasive, and reliable method to assess photosynthetic performance in a changing environment. In this study, acute toxicity of excess Hg on the photosynthetic performance of the cyanobacterium S. platensis was investigated by use of chlorophyll fluorescence analysis after cells were exposed to excess Hg (up to 20 microM) for 2 h. The results determined from the fast fluorescence kinetics showed that Hg induced a significant increase in the proportion of the Q(B)-non-reducing PSII reaction centers. The fluorescence parameters measured under the steady state of photosynthesis demonstrated that the increase of Hg concentration led to a decrease in the maximal efficiency of PSII photochemistry, the efficiency of excitation energy capture by the open PSII reaction centers, and the quantum yield of PSII electron transport. Mercury also resulted in a decrease in the coefficients of photochemical and non-photochemical quenching. Mercury may have an acute toxicity on cyanobacteria by inhibiting the quantum yield of photosynthesis sensitively and rapidly. Such changes occurred before any other visible damage that could be detected by conventional measurements. Our results also demonstrated that chlorophyll fluorescence analysis can be used as a useful physiological tool to assess early stages of change in photosynthetic performance of algae in response to heavy metal pollution.
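    The fluorescence parameters named above are conventionally derived from a handful of measured fluorescence levels. The sketch below uses the standard definitions from the chlorophyll fluorescence literature; it is not code from the study, and the sample values are invented:

    ```python
    def fv_fm(f0, fm):
        """Maximal PSII photochemical efficiency: Fv/Fm = (Fm - F0) / Fm."""
        return (fm - f0) / fm

    def phi_psii(fs, fm_prime):
        """Quantum yield of PSII electron transport in the light: (Fm' - Fs) / Fm'."""
        return (fm_prime - fs) / fm_prime

    def q_p(fs, fm_prime, f0_prime):
        """Photochemical quenching coefficient: (Fm' - Fs) / (Fm' - F0')."""
        return (fm_prime - fs) / (fm_prime - f0_prime)

    def npq(fm, fm_prime):
        """Non-photochemical quenching: (Fm - Fm') / Fm'."""
        return (fm - fm_prime) / fm_prime

    # Invented example values (dark-adapted Fm = 1.0, F0 = 0.2):
    print(fv_fm(0.2, 1.0))          # -> 0.8
    print(round(npq(1.0, 0.8), 2))  # -> 0.25
    ```

    A Hg-induced drop in Fv/Fm, ΦPSII, qP, and NPQ is exactly the pattern the abstract reports.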

  7. Practical exergy analysis of centrifugal compressor performance using ASME-PTC-10 data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carranti, F.J.

    1997-07-01

    It has been shown that measures of performance currently in use for industrial and process compressors do not give a true measure of energy utilization, and that the required assumptions of isentropic or adiabatic behavior are not always valid. A better indication of machine or process performance can be achieved using exergetic (second law) efficiencies and by employing the second law of thermodynamics to indicate the nature of irreversibilities and entropy generation in the compression process. In this type of analysis, performance is related to an environmental equilibrium condition, or dead state. Often, the differences between avoidable and unavoidable irreversibilities can be interpreted from these results. A general overview of the techniques involved in exergy analysis as applied to compressors and blowers is presented. A practical method to allow the calculation of exergetic efficiencies by manufacturers and end users is demonstrated using data from ASME Power Test Code input. These data are often readily available from compressor manufacturers for both design and off-design conditions, or can sometimes be obtained from field measurements. The calculations involved are simple and straightforward, and can demonstrate the energy usage situation for a variety of conditions. Here, off-design is taken to mean at different rates of flow, as well as at different environmental states. The techniques presented are also applicable to many other equipment and process types.

  8. An information-theoretic approach to the modeling and analysis of whole-genome bisulfite sequencing data.

    PubMed

    Jenkinson, Garrett; Abante, Jordi; Feinberg, Andrew P; Goutsias, John

    2018-03-07

    DNA methylation is a stable form of epigenetic memory used by cells to control gene expression. Whole genome bisulfite sequencing (WGBS) has emerged as a gold-standard experimental technique for studying DNA methylation by producing high resolution genome-wide methylation profiles. Statistical modeling and analysis are employed to computationally extract and quantify information from these profiles in an effort to identify regions of the genome that demonstrate crucial or aberrant epigenetic behavior. However, the performance of most currently available methods for methylation analysis is hampered by their inability to directly account for statistical dependencies between neighboring methylation sites, thus ignoring significant information available in WGBS reads. We present a powerful information-theoretic approach for genome-wide modeling and analysis of WGBS data based on the 1D Ising model of statistical physics. This approach takes into account correlations in methylation by utilizing a joint probability model that encapsulates all information available in WGBS methylation reads and produces accurate results even when applied on single WGBS samples with low coverage. Using the Shannon entropy, our approach provides a rigorous quantification of methylation stochasticity in individual WGBS samples genome-wide. Furthermore, it utilizes the Jensen-Shannon distance to evaluate differences in methylation distributions between a test and a reference sample. Differential performance assessment using simulated and real human lung normal/cancer data demonstrates a clear superiority of our approach over DSS, a recently proposed method for WGBS data analysis. Critically, these results demonstrate that marginal methods become statistically invalid when correlations are present in the data.
This contribution demonstrates clear benefits and the necessity of modeling joint probability distributions of methylation using the 1D Ising model of statistical physics and of quantifying methylation stochasticity using concepts from information theory. By employing this methodology, substantial improvement of DNA methylation analysis can be achieved by effectively taking into account the massive amount of statistical information available in WGBS data, which is largely ignored by existing methods.
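    The two information-theoretic quantities the abstract relies on, Shannon entropy for quantifying methylation stochasticity and the Jensen-Shannon distance for comparing methylation distributions, have standard definitions that can be sketched directly. This is a generic illustration, not the authors' implementation:

    ```python
    import math

    def shannon_entropy(p):
        """Shannon entropy (in bits) of a discrete probability distribution."""
        return -sum(x * math.log2(x) for x in p if x > 0)

    def js_distance(p, q):
        """Jensen-Shannon distance: square root of the JS divergence (base-2 logs)."""
        m = [(a + b) / 2 for a, b in zip(p, q)]
        divergence = shannon_entropy(m) - (shannon_entropy(p) + shannon_entropy(q)) / 2
        return math.sqrt(max(divergence, 0.0))  # guard against tiny negative round-off

    # Identical distributions are at distance 0; disjoint ones at the maximum, 1:
    print(js_distance([0.5, 0.5], [0.5, 0.5]))  # -> 0.0
    print(js_distance([1.0, 0.0], [0.0, 1.0]))  # -> 1.0
    ```

    With base-2 logarithms the distance is bounded in [0, 1], which makes it a convenient normalized dissimilarity between test and reference methylation distributions.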

  9. Case Study for the ARRA-funded Ground Source Heat Pump (GSHP) Demonstration at Wilders Grove Solid Waste Service Center in Raleigh, NC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Xiaobing; Malhotra, Mini; Xiong, Zeyu

    High initial costs and lack of public awareness of ground-source heat pump (GSHP) technology are the two major barriers preventing rapid deployment of this energy-saving technology in the United States. Under the American Recovery and Reinvestment Act (ARRA), 26 GSHP projects have been competitively selected and carried out to demonstrate the benefits of GSHP systems and innovative technologies for cost reduction and/or performance improvement. This paper highlights the findings of a case study of one of the ARRA-funded GSHP demonstration projects, a distributed GSHP system for providing all the space conditioning, outdoor air ventilation, and 100% domestic hot water to the Wilders Grove Solid Waste Service Center of the City of Raleigh, North Carolina. This case study is based on the analysis of measured performance data, construction costs, and simulations of the energy consumption of conventional central heating, ventilation, and air-conditioning (HVAC) systems providing the same level of space conditioning and outdoor air ventilation as the demonstrated GSHP system. The evaluated performance metrics include the energy efficiency of the heat pump equipment and the overall GSHP system, pumping performance, energy savings, carbon emission reductions, and cost-effectiveness of the GSHP system compared with conventional HVAC systems. This case study also identified opportunities for reducing uncertainties in the performance evaluation and improving the operational efficiency of the demonstrated GSHP system.

  10. Zr-doped ceria additives for enhanced PEM fuel cell durability and radical scavenger stability

    DOE PAGES

    Baker, Andrew M.; Williams, Stefan Thurston DuBard; Mukundan, Rangachary; ...

    2017-06-06

    Doped ceria compounds demonstrate excellent radical scavenging abilities and are promising additives to improve the chemical durability of polymer electrolyte membrane (PEM) fuel cells. In this paper, Ce0.85Zr0.15O2 (CZO) nanoparticles were incorporated into the cathode catalyst layers (CLs) of PEM fuel cells (based on Nafion XL membranes containing 6.0 μg cm⁻² ion-exchanged Ce) at loadings of 10 and 55 μg cm⁻². When compared to a CZO-free baseline, CZO-containing membrane electrode assemblies (MEAs) demonstrated extended lifetimes during PEM chemical stability accelerated stress tests (ASTs), exhibiting reduced electrochemical gas crossover, open circuit voltage decay, and fluoride emission rates. The MEA with high CZO loading (55 μg cm⁻²) demonstrated performance losses, which are attributed to Ce poisoning of the PEM and CL ionomer regions, as supported by X-ray fluorescence (XRF) analysis. In the MEA with the low CZO loading (10 μg cm⁻²), both the beginning of life (BOL) performance and the performance after 500 hours of ASTs were nearly identical to the BOL performance of the CZO-free baseline MEA. XRF analysis of the MEA with low CZO loading reveals that the BOL PEM Ce concentrations are preserved after 1408 hours of ASTs and that Ce contents in the cathode CL are not significant enough to reduce performance. Therefore, employing a highly effective radical scavenger such as CZO, at a loading of 10 μg cm⁻² in the cathode CL, dramatically mitigates degradation effects, which improves MEA chemical durability and minimizes performance losses.

  11. Zr-doped ceria additives for enhanced PEM fuel cell durability and radical scavenger stability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, Andrew M.; Williams, Stefan Thurston DuBard; Mukundan, Rangachary

    Doped ceria compounds demonstrate excellent radical scavenging abilities and are promising additives to improve the chemical durability of polymer electrolyte membrane (PEM) fuel cells. In this paper, Ce0.85Zr0.15O2 (CZO) nanoparticles were incorporated into the cathode catalyst layers (CLs) of PEM fuel cells (based on Nafion XL membranes containing 6.0 μg cm⁻² ion-exchanged Ce) at loadings of 10 and 55 μg cm⁻². When compared to a CZO-free baseline, CZO-containing membrane electrode assemblies (MEAs) demonstrated extended lifetimes during PEM chemical stability accelerated stress tests (ASTs), exhibiting reduced electrochemical gas crossover, open circuit voltage decay, and fluoride emission rates. The MEA with high CZO loading (55 μg cm⁻²) demonstrated performance losses, which are attributed to Ce poisoning of the PEM and CL ionomer regions, as supported by X-ray fluorescence (XRF) analysis. In the MEA with the low CZO loading (10 μg cm⁻²), both the beginning of life (BOL) performance and the performance after 500 hours of ASTs were nearly identical to the BOL performance of the CZO-free baseline MEA. XRF analysis of the MEA with low CZO loading reveals that the BOL PEM Ce concentrations are preserved after 1408 hours of ASTs and that Ce contents in the cathode CL are not significant enough to reduce performance. Therefore, employing a highly effective radical scavenger such as CZO, at a loading of 10 μg cm⁻² in the cathode CL, dramatically mitigates degradation effects, which improves MEA chemical durability and minimizes performance losses.

  12. Probabilistic performance-based design for high performance control systems

    NASA Astrophysics Data System (ADS)

    Micheli, Laura; Cao, Liang; Gong, Yongqiang; Cancelli, Alessandro; Laflamme, Simon; Alipour, Alice

    2017-04-01

    High performance control systems (HPCS) are advanced damping systems capable of high damping performance over a wide frequency bandwidth, ideal for mitigation of multi-hazards. They include active, semi-active, and hybrid damping systems. However, HPCS are more expensive than typical passive mitigation systems, rely on power and hardware (e.g., sensors, actuators) to operate, and require maintenance. In this paper, a life cycle cost analysis (LCA) approach is proposed to estimate the economic benefit of these systems over the entire life of the structure. The novelty resides in integrating life cycle cost analysis into performance-based design (PBD) tailored to multi-level wind hazards. This yields a probabilistic performance-based design approach for HPCS. Numerical simulations are conducted on a building located in Boston, MA. LCAs are conducted for passive control systems and HPCS, and the concept of controller robustness is demonstrated. Results highlight the promise of the proposed performance-based design procedure.

  13. Turbine blade forced response prediction using FREPS

    NASA Technical Reports Server (NTRS)

    Murthy, Durbha V.; Morel, Michael R.

    1993-01-01

    This paper describes a software system called FREPS (Forced REsponse Prediction System) that integrates structural dynamic, steady aerodynamic, and unsteady aerodynamic analyses to efficiently predict the forced response dynamic stresses in axial flow turbomachinery blades due to aerodynamic and mechanical excitations. A flutter analysis capability is also incorporated into the system. The FREPS system performs aeroelastic analysis by modeling the motion of the blade in terms of its normal modes. The structural dynamic analysis is performed by a finite element code such as MSC/NASTRAN. The steady aerodynamic analysis is based on nonlinear potential theory, and the unsteady aerodynamic analysis is based on linearization about the nonuniform potential mean flow. The program description and presentation of the capabilities are reported herein. The effectiveness of the FREPS package is demonstrated on the High Pressure Oxygen Turbopump turbine of the Space Shuttle Main Engine. Both flutter and forced response analyses are performed and typical results are illustrated.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlsbad Field Office

    The Performance Demonstration Program (PDP) for headspace gases distributes blind audit samples in a gas matrix for analysis of volatile organic compounds (VOCs). Participating measurement facilities (i.e., fixed laboratories, mobile analysis systems, and on-line analytical systems) are located across the United States. Each sample distribution is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed for transuranic (TRU) waste characterization. The primary documents governing the conduct of the PDP are the Quality Assurance Program Document (QAPD) (DOE/CBFO-94-1012) and the Waste Isolation Pilot Plant (WIPP) Waste Analysis Plan (WAP) contained in the Hazardous Waste Facility Permit (NM4890139088-TSDF) issued by the New Mexico Environment Department (NMED). The WAP requires participation in the PDP; the PDP must comply with the QAPD and the WAP. This plan implements the general requirements of the QAPD and the applicable requirements of the WAP for the Headspace Gas (HSG) PDP. Participating measurement facilities analyze blind audit samples of simulated TRU waste package headspace gases according to the criteria set by this PDP Plan. Blind audit samples (hereafter referred to as PDP samples) are used as an independent means to assess each measurement facility’s compliance with the WAP quality assurance objectives (QAOs). To the extent possible, the concentrations of VOC analytes in the PDP samples encompass the range of concentrations anticipated in actual TRU waste package headspace gas samples. Analyses of headspace gases are required by the WIPP to demonstrate compliance with regulatory requirements. These analyses must be performed by measurement facilities that have demonstrated acceptable performance in this PDP. These analyses are referred to as WIPP analyses and the TRU waste package headspace gas samples on which they are performed are referred to as WIPP samples in this document. Participating measurement facilities must analyze PDP samples using the same procedures used for routine waste characterization analyses of WIPP samples.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlsbad Field Office

    The Performance Demonstration Program (PDP) for headspace gases distributes sample gases of volatile organic compounds (VOCs) for analysis. Participating measurement facilities (i.e., fixed laboratories, mobile analysis systems, and on-line analytical systems) are located across the United States. Each sample distribution is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed for transuranic (TRU) waste characterization. The primary documents governing the conduct of the PDP are the Quality Assurance Program Document (QAPD) (DOE/CBFO-94-1012) and the Waste Isolation Pilot Plant (WIPP) Waste Analysis Plan (WAP) contained in the Hazardous Waste Facility Permit (NM4890139088-TSDF) issued by the New Mexico Environment Department (NMED). The WAP requires participation in the PDP; the PDP must comply with the QAPD and the WAP. This plan implements the general requirements of the QAPD and the applicable requirements of the WAP for the Headspace Gas (HSG) PDP. Participating measurement facilities analyze blind audit samples of simulated TRU waste package headspace gases according to the criteria set by this PDP Plan. Blind audit samples (hereafter referred to as PDP samples) are used as an independent means to assess each measurement facility’s compliance with the WAP quality assurance objectives (QAOs). To the extent possible, the concentrations of VOC analytes in the PDP samples encompass the range of concentrations anticipated in actual TRU waste package headspace gas samples. Analyses of headspace gases are required by the WIPP to demonstrate compliance with regulatory requirements. These analyses must be performed by measurement facilities that have demonstrated acceptable performance in this PDP. These analyses are referred to as WIPP analyses and the TRU waste package headspace gas samples on which they are performed are referred to as WIPP samples in this document. Participating measurement facilities must analyze PDP samples using the same procedures used for routine waste characterization analyses of WIPP samples.

  16. A Cross-Platform Infrastructure for Scalable Runtime Application Performance Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jack Dongarra; Shirley Moore; Bart Miller; Jeffrey Hollingsworth

    2005-03-15

    The purpose of this project was to build an extensible cross-platform infrastructure to facilitate the development of accurate and portable performance analysis tools for current and future high performance computing (HPC) architectures. Major accomplishments include tools and techniques for multidimensional performance analysis, as well as improved support for dynamic performance monitoring of multithreaded and multiprocess applications. Previous performance tool development has been limited by the burden of having to re-write a platform-dependent low-level substrate for each architecture/operating system pair in order to obtain the necessary performance data from the system. Manual interpretation of performance data is not scalable for large-scale, long-running applications. The infrastructure developed by this project provides a foundation for building portable and scalable performance analysis tools, with the end goal being to provide application developers with the information they need to analyze, understand, and tune the performance of terascale applications on HPC architectures. The backend portion of the infrastructure provides runtime instrumentation capability and access to hardware performance counters, with thread-safety for shared memory environments and a communication substrate to support instrumentation of multiprocess and distributed programs. Front-end interfaces provide tool developers with a well-defined, platform-independent set of calls for requesting performance data. End-user tools have been developed that demonstrate runtime data collection, on-line and off-line analysis of performance data, and multidimensional performance analysis. The infrastructure is based on two underlying performance instrumentation technologies: the PAPI cross-platform library interface to hardware performance counters and the cross-platform Dyninst library interface for runtime modification of executable images. The Paradyn and KOJAK projects have made use of this infrastructure to build performance measurement and analysis tools that scale to long-running programs on large parallel and distributed systems and that automate much of the search for performance bottlenecks.

  17. Energy savings and cost-benefit analysis of the new commercial building standard in China

    DOE PAGES

    Zhao, Shanguo; Feng, Wei; Zhang, Shicong; ...

    2015-10-07

    In this study, a comprehensive comparison of the commercial building energy efficiency standard between the previous 2005 version and the new proposed version is conducted, including energy efficiency analysis and cost-benefit analysis. To better understand the techno-economic performance of the new Chinese standard, energy models were set up based on a typical commercial office building in Chinese climate zones. The building energy standard of 2005 is used as the baseline for this analysis. Key building technology measures are analyzed individually, including roof, wall, window, lighting, and chiller measures, and finally a whole-building cost-benefit analysis was conducted. Results show that the new commercial building energy standard demonstrates good cost-effective performance, with a whole-building payback period of around 4 years.
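    A payback period of this kind is typically a simple payback: the incremental first cost of complying with the new standard divided by the annual energy cost savings. A minimal sketch; the dollar figures below are hypothetical, not from the study:

    ```python
    def simple_payback_years(incremental_cost: float, annual_savings: float) -> float:
        """Simple payback period: extra first cost recovered by yearly savings."""
        return incremental_cost / annual_savings

    # Hypothetical illustration: $200k extra construction cost, $50k/yr energy savings
    print(simple_payback_years(200_000, 50_000))  # -> 4.0
    ```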

  18. Energy savings and cost-benefit analysis of the new commercial building standard in China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Shanguo; Feng, Wei; Zhang, Shicong

    In this study, a comprehensive comparison of the commercial building energy efficiency standard between the previous 2005 version and the new proposed version is conducted, including energy efficiency analysis and cost-benefit analysis. To better understand the techno-economic performance of the new Chinese standard, energy models were set up based on a typical commercial office building in Chinese climate zones. The building energy standard of 2005 is used as the baseline for this analysis. Key building technology measures are analyzed individually, including roof, wall, window, lighting, and chiller measures, and finally a whole-building cost-benefit analysis was conducted. Results show that the new commercial building energy standard demonstrates good cost-effective performance, with a whole-building payback period of around 4 years.

  19. Understanding Emotions as Situated, Embodied, and Fissured: Thinking with Theory to Create an Analytical Tool

    ERIC Educational Resources Information Center

    Kuby, Candace R.

    2014-01-01

    An emerging theoretical perspective is that emotions are a verb or something we do in relation to others. Studies that demonstrate ways to analyze emotions from a performative stance are scarce. In this article, a new analytical tool is introduced; a critical performative analysis of emotion (CPAE) that draws upon three theoretical perspectives:…

  20. Economic Evaluation of Single-Family-Residence Solar-Energy Installation

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Long-term economic performance of a commercial solar-energy system was analyzed and used to predict economic performance at four additional sites. Analysis described in report was done to demonstrate viability of design over a broad range of environmental/economic conditions. Report contains graphs and tables that present evaluation procedure and results. Also contains appendixes that aid in understanding methods used.

  1. Deconstructing Hub Drag. Part 2. Computational Development and Analysis

    DTIC Science & Technology

    2013-09-30

    leveraged a Vertical Lift Consortium (VLC)-funded hub drag scaling research effort. To confirm this objective, correlations are performed with the ... Technology™ Demonstrator aircraft using an unstructured computational solver. These simpler faired elliptical geometries can prove to be challenging ... possible. However, additional funding was obtained from the Vertical Lift Consortium (VLC) to perform this study. This analysis is documented in

  2. Religion, Public Education, and Gender: A Feminist Critical Analysis of Policies Implemented for Objections to Music in Mixed-Gender Elementary Education Courses

    ERIC Educational Resources Information Center

    Thiele, Margaret

    2012-01-01

    In 2008 the Michigan State Board of Education adopted new certification standards for teacher preparation institutions training elementary classroom teachers. Eight content areas were identified, one of which was visual and performing arts. Standard 1.5, Visual and Performing Arts states candidates are to demonstrate knowledge and understanding in…

  3. Relationship of aerobic and anaerobic parameters with 400 m front crawl swimming performance

    PubMed Central

    Kalva-Filho, CA; Campos, EZ; Andrade, VL; Silva, ASR; Zagatto, AM; Lima, MCS

    2015-01-01

    The aims of the present study were to investigate the relationship of aerobic and anaerobic parameters with 400 m performance, and to establish which variable better explains long-distance performance in swimming. Twenty-two swimmers (19.1±1.5 years, height 173.9±10.0 cm, body mass 71.2±10.2 kg; 76.6±5.3% of the 400 m world record) underwent a lactate minimum test to determine lactate minimum speed (LMS) (i.e., an aerobic capacity index). Moreover, the swimmers performed a 400 m maximal effort to determine mean speed (S400m), peak oxygen uptake (V̇O2PEAK), and total anaerobic contribution (CANA). The CANA was assumed to be the sum of the alactic and lactic contributions. Physiological parameters of the 400 m effort were determined using the backward extrapolation technique (V̇O2PEAK and the alactic contribution of CANA) and blood lactate concentration analysis (the lactic anaerobic contribution of CANA). The Pearson correlation test and backward multiple regression analysis were used to verify possible correlations between the physiological indices (predictor factors) and S400m (independent variable) (p < 0.05). Values are presented as mean ± standard deviation. Significant correlations were observed between S400m (1.4±0.1 m·s⁻¹) and LMS (1.3±0.1 m·s⁻¹; r = 0.80), V̇O2PEAK (4.5±3.9 L·min⁻¹; r = 0.72), and CANA (4.7±1.5 L·O2; r = 0.44). The best model constructed using multiple regression analysis demonstrated that LMS and V̇O2PEAK explained 85% of the 400 m performance variance. When backward multiple regression analysis was performed, CANA lost significance. Thus, the results demonstrated that both aerobic parameters (capacity and power) can be used to predict 400 m swimming performance. PMID:28479663
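
    The prediction step described above is ordinary multiple regression of S400m on the physiological indices. A hedged sketch with synthetic data (the swimmers' actual measurements are not reproduced here), using NumPy least squares:

    ```python
    import numpy as np

    # Multiple regression sketch: predict mean 400 m speed (S400m) from
    # lactate minimum speed (LMS) and peak oxygen uptake (VO2peak).
    # The data below are synthetic stand-ins, not the study's measurements.
    rng = np.random.default_rng(0)
    lms = rng.normal(1.3, 0.1, 22)           # m/s, aerobic capacity index
    vo2 = rng.normal(4.5, 0.5, 22)           # L/min
    s400 = 0.6 * lms + 0.1 * vo2 + rng.normal(0, 0.02, 22)  # m/s

    X = np.column_stack([np.ones_like(lms), lms, vo2])  # design matrix
    coef, *_ = np.linalg.lstsq(X, s400, rcond=None)     # intercept + 2 slopes

    pred = X @ coef
    r2 = 1 - np.sum((s400 - pred) ** 2) / np.sum((s400 - s400.mean()) ** 2)
    print(round(r2, 2))  # explained variance of the synthetic model
    ```

    A backward elimination procedure, as used in the study, would then drop the predictor whose removal least degrades the fit and re-estimate.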

  4. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to conduct a study that assesses the process improvement, quality management, and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and with how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI®) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The research study identifies and provides a detailed discussion of the gap-analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis that identifies the gaps between the SEI's "healthy ingredients" of a process performance model and courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models.
The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, and quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
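
A Monte Carlo model of the kind described can be sketched in a few lines: sample uncertain driver scores, propagate them through a weighted index, and summarize the simulated outcome distribution. The weights and score ranges below are hypothetical placeholders, not the ACSI model's actual values:

```python
import random

# Minimal Monte Carlo sketch in the spirit of the dissertation's model:
# propagate uncertainty in satisfaction-driver scores through a weighted
# index and summarize the predicted outcome distribution.
# Weights and score ranges are hypothetical, for illustration only.

random.seed(42)  # fixed seed so the simulation is repeatable

def simulate_index(n_trials: int = 10_000) -> list:
    weights = {"quality": 0.5, "expectations": 0.2, "value": 0.3}
    results = []
    for _ in range(n_trials):
        # each driver score is uncertain; model it as uniform on [70, 90]
        scores = {k: random.uniform(70, 90) for k in weights}
        results.append(sum(weights[k] * scores[k] for k in weights))
    return results

sims = simulate_index()
mean = sum(sims) / len(sims)
print(round(mean, 1))  # baseline expectation, near 80 for these inputs
```

Sensitivity analysis then follows naturally: perturb one driver's range or weight at a time and observe the shift in the simulated index distribution.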

  5. Low carbon technology performance vs infrastructure vulnerability: analysis through the local and global properties space.

    PubMed

    Dawson, David A; Purnell, Phil; Roelich, Katy; Busch, Jonathan; Steinberger, Julia K

    2014-11-04

    Renewable energy technologies, necessary for low-carbon infrastructure networks, are being adopted to help reduce fossil fuel dependence and meet carbon mitigation targets. The evolution of these technologies has progressed based on the enhancement of technology-specific performance criteria, without explicitly considering the wider system (global) impacts. This paper presents a methodology for simultaneously assessing local (technology) and global (infrastructure) performance, allowing key technological interventions to be evaluated with respect to their effect on the vulnerability of wider infrastructure systems. We use exposure of low carbon infrastructure to critical material supply disruption (criticality) to demonstrate the methodology. A series of local performance changes are analyzed; and by extension of this approach, a method for assessing the combined criticality of multiple materials for one specific technology is proposed. Via a case study of wind turbines at both the material (magnets) and technology (turbine generators) levels, we demonstrate that analysis of a given intervention at different levels can lead to differing conclusions regarding the effect on vulnerability. Infrastructure design decisions should take a systemic approach; without these multilevel considerations, strategic goals aimed to help meet low-carbon targets, that is, through long-term infrastructure transitions, could be significantly jeopardized.

  6. Seasat-A ASVT: Commercial demonstration experiments. Results analysis methodology for the Seasat-A case studies

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The SEASAT-A commercial demonstration ASVT program is described. The program consists of two sets of experiments: evaluation of a real-time data distribution system, the SEASAT-A user data distribution system, which provided near-real-time dissemination of ocean-condition and weather data products from the U.S. Navy Fleet Numerical Weather Central to a selected set of commercial and industrial users; and case studies, performed by commercial and industrial users, using the data gathered by SEASAT-A during its operational life. The impact of the SEASAT-A data on business operations is evaluated by these users. The approach followed in the performance of the case studies is described, along with the methodology used in analyzing and integrating the case study results to estimate the actual and potential economic benefits of improved ocean-condition and weather forecast data.

  7. Large Pilot-Scale Carbon Dioxide (CO2) Capture Project Using Aminosilicone Solvent. Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hancu, Dan

    GE Global Research has developed, over the last 8 years, a platform of cost-effective CO2 capture technologies based on a non-aqueous aminosilicone solvent (GAP-1m). As demonstrated in previously funded DOE projects (DE-FE0007502 and DE-FE0013755), the GAP-1m solvent has a higher CO2 working capacity and lower volatility and corrosivity than the benchmark aqueous amine technology. Performance of the GAP-1m solvent was recently demonstrated in a 0.5 MWe pilot at the National Carbon Capture Center, AL, with real flue gas for over 500 hours of operation using a Steam Stripper Column (SSC). The pilot-scale PSTU engineering data were used to (i) update the techno-economic analysis and EH&S assessment, (ii) perform a technology gap analysis, and (iii) conduct a solvent manufacturability and scale-up study.

  8. Enhancing the ABAQUS Thermomechanics Code to Simulate Steady and Transient Fuel Rod Behavior

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. L. Williamson; D. A. Knoll

    2009-09-01

    A powerful multidimensional fuel performance capability, applicable to both steady and transient fuel behavior, is developed based on enhancements to the commercially available ABAQUS general-purpose thermomechanics code. Enhanced capabilities are described, including: UO2 temperature- and burnup-dependent thermal properties, solid and gaseous fission product swelling, fuel densification, fission gas release, cladding thermal and irradiation creep, cladding irradiation growth, gap heat transfer, and gap/plenum gas behavior during irradiation. The various modeling capabilities are demonstrated using a 2D axisymmetric analysis of the upper section of a simplified multi-pellet fuel rod during both steady and transient operation. Computational results demonstrate the importance of a multidimensional, fully coupled thermomechanics treatment. Interestingly, many of the inherent deficiencies in existing fuel performance codes (e.g., 1D thermomechanics, loose thermomechanical coupling, separate steady and transient analysis, cumbersome pre- and post-processing) are, in fact, ABAQUS strengths.

  9. Analysis of Volatile Fragrance and Flavor Compounds by Headspace Solid Phase Microextraction and GC-MS: An Undergraduate Instrumental Analysis Experiment

    NASA Astrophysics Data System (ADS)

    Galipo, Randolph C.; Canhoto, Alfredo J.; Walla, Michael D.; Morgan, Stephen L.

    1999-02-01

    A senior-level undergraduate laboratory experiment that demonstrates the use of solid-phase microextraction (SPME) and capillary gas chromatography-mass spectrometry (GC-MS) was developed for the identification of volatile compounds in consumer products. SPME minimizes sample preparation and concentrates volatile analytes in a solvent-free manner. Volatile flavor and fragrance compounds were extracted by SPME from the headspace of vials containing shampoos, chewing gums, and perfumes and analyzed by GC-MS. Headspace SPME was shown to be more sensitive than conventional headspace analysis of similar samples performed with an airtight syringe. Analysis times were less than 30 min, allowing multiple analyses to be performed in a typical laboratory class period.

  10. A feasibility study on age-related factors of wrist pulse using principal component analysis.

    PubMed

    Jang-Han Bae; Young Ju Jeon; Sanghun Lee; Jaeuk U Kim

    2016-08-01

    Various analysis methods for examining wrist pulse characteristics are needed for accurate pulse diagnosis. In this feasibility study, principal component analysis (PCA) was performed to observe age-related factors of the wrist pulse across various analysis parameters. Forty subjects in their 20s and 40s participated, and their wrist pulse and respiration signals were acquired with a pulse tonometric device. After pre-processing of the signals, twenty analysis parameters that have been regarded as values reflecting pulse characteristics were calculated, and PCA was performed. As a result, we could reduce the complex parameters to a lower dimension, and age-related factors of the wrist pulse were observed by combining new analysis parameters derived from PCA. These results demonstrate that PCA can be a useful tool for analyzing wrist pulse signals.
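
    The dimensionality-reduction step can be illustrated with a standard SVD-based PCA; the data below are synthetic stand-ins for the 40-subject, 20-parameter matrix described in the abstract:

    ```python
    import numpy as np

    # Illustrative PCA on a synthetic parameter matrix (rows = subjects,
    # columns = pulse-analysis parameters); the real study had 40 subjects
    # and 20 parameters. PCA is computed via SVD of the centered data.
    rng = np.random.default_rng(1)
    latent = rng.normal(size=(40, 2))         # two hidden "age-related" factors
    loadings = rng.normal(size=(2, 20))       # how factors map to parameters
    X = latent @ loadings + 0.1 * rng.normal(size=(40, 20))  # add noise

    Xc = X - X.mean(axis=0)                   # center each parameter
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = S**2 / np.sum(S**2)           # variance ratio per component

    # For a rank-2 signal the first two components capture nearly all variance.
    print(round(explained[:2].sum(), 2))
    ```

    Projecting subjects onto the leading components (`Xc @ Vt[:2].T`) gives the low-dimensional scores in which group differences, such as age, can then be examined.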

  11. Cooperative Collision Avoidance Technology Demonstration Data Analysis Report

    NASA Technical Reports Server (NTRS)

    2007-01-01

    This report details the National Aeronautics and Space Administration (NASA) Access 5 Project Office Cooperative Collision Avoidance (CCA) Technology Demonstration for unmanned aircraft systems (UAS) conducted from 21 to 28 September 2005. The test platform chosen for the demonstration was the Proteus Optionally Piloted Vehicle operated by Scaled Composites, LLC, flown out of the Mojave Airport, Mojave, CA. A single intruder aircraft, a NASA Gulf stream III, was used during the demonstration to execute a series of near-collision encounter scenarios. Both aircraft were equipped with Traffic Alert and Collision Avoidance System-II (TCAS-II) and Automatic Dependent Surveillance Broadcast (ADS-B) systems. The objective of this demonstration was to collect flight data to support validation efforts for the Access 5 CCA Work Package Performance Simulation and Systems Integration Laboratory (SIL). Correlation of the flight data with results obtained from the performance simulation serves as the basis for the simulation validation. A similar effort uses the flight data to validate the SIL architecture that contains the same sensor hardware that was used during the flight demonstration.

  12. Design and Analysis of Turbines for Space Applications

    NASA Technical Reports Server (NTRS)

    Griffin, Lisa W.; Dorney, Daniel J.; Huber, Frank W.

    2003-01-01

    In order to mitigate the risk of rocket propulsion development, efficient, accurate, detailed fluid dynamics analysis of the turbomachinery is necessary. This analysis is used for component development, design parametrics, performance prediction, and environment definition. To support this requirement, a task was developed at NASA Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. There are four major objectives of this task: 1) to develop, enhance, and integrate advanced turbine aerodynamic design and analysis tools; 2) to develop the methodology for application of the analytical techniques; 3) to demonstrate the benefits of the advanced turbine design procedure through its application to a relevant turbine design point; and 4) to verify the optimized design and analysis with testing. The turbine chosen on which to demonstrate the procedure was a supersonic design suitable for a reusable launch vehicle (RLV). The hot gas path and blading were redesigned to obtain an increased efficiency. The redesign of the turbine was conducted with a consideration of system requirements, realizing that a highly efficient turbine that, for example, significantly increases engine weight, is of limited benefit. Both preliminary and detailed designs were considered. To generate an improved design, one-dimensional (1D) design and analysis tools, computational fluid dynamics (CFD), response surface methodology (RSM), and neural nets (NN) were used.

  13. Parental education predicts change in intelligence quotient after childhood epilepsy surgery.

    PubMed

    Meekes, Joost; van Schooneveld, Monique M J; Braams, Olga B; Jennekens-Schinkel, Aag; van Rijen, Peter C; Hendriks, Marc P H; Braun, Kees P J; van Nieuwenhuizen, Onno

    2015-04-01

    To know whether change in the intelligence quotient (IQ) of children who undergo epilepsy surgery is associated with the educational level of their parents. Retrospective analysis of data obtained from a cohort of children who underwent epilepsy surgery between January 1996 and September 2010. We performed simple and multiple regression analyses to identify predictors associated with IQ change after surgery. In addition to parental education, six variables previously demonstrated to be associated with IQ change after surgery were included as predictors: age at surgery, duration of epilepsy, etiology, presurgical IQ, reduction of antiepileptic drugs, and seizure freedom. We used delta IQ (IQ 2 years after surgery minus IQ shortly before surgery) as the primary outcome variable, but also performed analyses with pre- and postsurgical IQ as outcome variables to support our findings. To validate the results we performed simple regression analysis with parental education as the predictor in specific subgroups. The sample for regression analysis included 118 children (60 male; median age at surgery 9.73 years). Parental education was significantly associated with delta IQ in simple regression analysis (p = 0.004), and also contributed significantly to postsurgical IQ in multiple regression analysis (p = 0.008). Additional analyses demonstrated that parental education made a unique contribution to prediction of delta IQ, that is, it could not be replaced by the illness-related variables. Subgroup analyses confirmed the association of parental education with IQ change after surgery for most groups. Children whose parents had higher education demonstrate on average a greater increase in IQ after surgery and a higher postsurgical--but not presurgical--IQ than children whose parents completed at most lower secondary education. 
Parental education--and perhaps other environmental variables--should be considered in the prognosis of cognitive function after childhood epilepsy surgery. Wiley Periodicals, Inc. © 2015 International League Against Epilepsy.

  14. Design and Analysis of a Hyperspectral Microwave Receiver Subsystem

    NASA Technical Reports Server (NTRS)

    Blackwell, W.; Galbraith, C.; Hancock, T.; Leslie, R.; Osaretin, I.; Shields, M.; Racette, P.; Hillard, L.

    2012-01-01

    Hyperspectral microwave (HM) sounding has been proposed to achieve unprecedented performance. HM operation is achieved using multiple banks of RF spectrometers with large aggregate bandwidth. A principal challenge is Size/Weight/Power scaling. Objectives of this work: 1) Demonstrate ultra-compact (100 cm3) 52-channel IF processor (enabler); 2) Demonstrate a hyperspectral microwave receiver subsystem; and 3) Deliver a flight-ready system to validate HM sounding.

  15. Transactive System: Part II: Analysis of Two Pilot Transactive Systems using Foundational Theory and Metrics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lian, Jianming; Sun, Y.; Kalsi, Karanjit

    This document is the second of a two-part report. Part 1 reviewed several demonstrations of transactive control and compared them in terms of their payoff functions, control decisions, information privacy, and mathematical solution concepts. It was suggested in Part 1 that these four listed components should be adopted for meaningful comparison and design of future transactive systems. Part 2 proposes qualitative and quantitative metrics that will be needed to compare alternative transactive systems. It then uses the analysis and design principles from Part 1 while conducting more in-depth analysis of two transactive demonstrations: the American Electric Power (AEP) gridSMART Demonstration, which used a double-auction market mechanism, and a consensus method like that used in the Pacific Northwest Smart Grid Demonstration. Ultimately, metrics must be devised and used to meaningfully compare alternative transactive systems. One significant contribution of this report is the observation that the decision function used for thermostat control in the AEP gridSMART Demonstration has superior performance if it is recast to more accurately reflect the power that will be used for thermostatic control under alternative market outcomes.
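
    The double-auction mechanism mentioned above can be sketched as a uniform-price clearing of sorted bids and offers. This is a generic textbook formulation, not the gridSMART implementation, and the prices and quantities are invented:

    ```python
    # Uniform-price double auction sketch: sort buy bids by descending
    # willingness to pay and sell offers by ascending asking price, then
    # match until demand and supply no longer cross. Illustrative only.

    def clear_double_auction(bids, offers):
        """bids/offers: (price, quantity) pairs. Returns (price, traded qty)."""
        bids = sorted(([p, q] for p, q in bids), key=lambda b: -b[0])
        offers = sorted(([p, q] for p, q in offers), key=lambda o: o[0])
        traded, price = 0.0, None
        i = j = 0
        while i < len(bids) and j < len(offers) and bids[i][0] >= offers[j][0]:
            q = min(bids[i][1], offers[j][1])      # quantity both sides accept
            traded += q
            price = (bids[i][0] + offers[j][0]) / 2  # midpoint of marginal pair
            bids[i][1] -= q
            offers[j][1] -= q
            if bids[i][1] == 0:
                i += 1
            if offers[j][1] == 0:
                j += 1
        return price, traded

    bids = [(30, 5), (25, 5), (18, 5)]     # ($/MWh, MWh) buyers
    offers = [(15, 5), (22, 5), (28, 5)]   # ($/MWh, MWh) sellers
    price, qty = clear_double_auction(bids, offers)
    print(price, qty)  # → 23.5 10.0
    ```

    The clearing price here is the midpoint of the last matched bid-offer pair; real market designs differ in how they set the uniform price within the bid-ask gap.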

  16. Recommendations for Review of TRADOC Pam 351-4(T).

    DTIC Science & Technology

    1981-07-08

    demonstrate proficiency in performing the task at the speed required on the job. Overtrain - Trainee must be trained to a high standard of retention ... Job Analysis ... the analysis effort. The terminology in 351-4 is confusing. For example, no distinction is made between job and task analysis. There are no practical

  17. Development and Demonstration of a Networked Telepathology 3-D Imaging, Databasing, and Communication System

    DTIC Science & Technology

    1996-10-01

    aligned using an octree search algorithm combined with cross-correlation analysis. Successive 4x downsampling with optional and specifiable neighborhood ... desired, and the search engine embedded in the OODBMS will find the requested imagery and queue it to the user for further analysis. This application was ... obtained during Hoffmann-LaRoche production pathology imaging performed at UMICH. Versant works well and is easy to use; 3) Pathology Image Analysis

  18. Modeling and Analysis of Actinide Diffusion Behavior in Irradiated Metal Fuel

    NASA Astrophysics Data System (ADS)

    Edelmann, Paul G.

    There have been numerous attempts to model fast reactor fuel behavior in the last 40 years. The US currently does not have a fully reliable tool to simulate the behavior of metal fuels in fast reactors. The experimental database necessary to validate the codes is also very limited. The DOE-sponsored Advanced Fuels Campaign (AFC) has performed various experiments that are ready for analysis. Current metal fuel performance codes are either not available to the AFC or have limitations and deficiencies in predicting AFC fuel performance. A modified version of a new fuel performance code, FEAST-Metal, was employed in this investigation with useful results. This work explores the modeling and analysis of AFC metallic fuels using FEAST-Metal, particularly in the area of constituent actinide diffusion behavior. The FEAST-Metal code calculations for this work were conducted at Los Alamos National Laboratory (LANL) in support of ongoing activities related to sensitivity analysis of fuel performance codes. A sensitivity analysis of FEAST-Metal was completed to identify important macroscopic parameters of interest to modeling and simulation of metallic fuel performance. A modification was made to the FEAST-Metal constituent redistribution model to enable accommodation of newer AFC metal fuel compositions, with verified results. Applicability of this modified model for sodium fast reactor metal fuel design is demonstrated.

  19. Experiments and analysis concerning the use of external burning to reduce aerospace vehicle transonic drag. Ph.D. Thesis - Maryland Univ., 1991

    NASA Technical Reports Server (NTRS)

    Trefny, Charles J.

    1992-01-01

    The external combustion of hydrogen to reduce transonic drag was investigated. A control volume analysis is developed and indicates that the specific impulse performance of external burning is competitive with other forms of airbreathing propulsion and depends on the fuel-air ratio, freestream Mach number, and the severity of the base drag. A method is presented for sizing fuel injectors for a desired fuel-air ratio in the unconfined stream. A two-dimensional Euler analysis is also presented which indicates that the total axial force generated by external burning depends on the total amount of energy input and is independent of the transverse and streamwise distribution of heat addition. Good agreement between the Euler and control volume analyses is demonstrated. Features of the inviscid external burning flowfield are discussed. Most notably, a strong compression forms at the sonic line within the burning stream, which may induce separation of the plume and prevent realization of the full performance potential. An experimental program was conducted in a Mach 1.26 free-jet to demonstrate drag reduction on a simple expansion-ramp geometry and verify hydrogen-air stability limits at external burning conditions. Stable combustion appears feasible to Mach numbers between 1.4 and 2, depending on the vehicle flight trajectory. Drag reduction is demonstrated on the expansion ramp at Mach 1.26; however, force levels showed little dependence on fuel pressure or altitude, in contrast to control volume analysis predictions. Various facility interference mechanisms and scaling issues were studied and are discussed.
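
    The specific-impulse comparison in the control volume analysis reduces to Isp = F/(ṁ·g0): axial force recovered per unit weight flow of fuel. A minimal sketch with illustrative numbers (not values from the experiment):

    ```python
    # Specific impulse relates the axial force produced (here, drag reduction
    # from external burning) to the fuel flow consumed: Isp = F / (m_dot * g0).
    # The force and flow-rate values below are illustrative placeholders.

    G0 = 9.80665  # standard gravity, m/s^2

    def specific_impulse(force_n: float, fuel_flow_kg_s: float) -> float:
        """Specific impulse in seconds."""
        return force_n / (fuel_flow_kg_s * G0)

    # e.g. 500 N of base-drag reduction for 0.05 kg/s of hydrogen:
    isp = specific_impulse(500.0, 0.05)
    print(round(isp, 1))  # 1019.7 s
    ```

    Comparing such Isp values across fuel-air ratio and base-drag severity is what allows external burning to be ranked against other airbreathing propulsion options.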

  20. Bioinformatic analysis of Msx1 and Msx2 involved in craniofacial development.

    PubMed

    Dai, Jiewen; Mou, Zhifang; Shen, Shunyao; Dong, Yuefu; Yang, Tong; Shen, Steve Guofang

    2014-01-01

    Msx1 and Msx2 were revealed to be candidate genes for some craniofacial deformities, such as cleft lip with/without cleft palate (CL/P) and craniosynostosis. Many other genes have been demonstrated to cross-talk with the MSX genes in causing these defects. However, there is no systematic evaluation of these MSX gene-related factors. In this study, we performed systematic bioinformatic analysis of the MSX genes by combined use of the GeneDecks, DAVID, and STRING databases. The results showed that numerous genes are related to the MSX genes, such as Irf6, TP63, Dlx2, Dlx5, Pax3, Pax9, Bmp4, Tgf-beta2, and Tgf-beta3, which have been demonstrated to be involved in CL/P, and Fgfr2, Fgfr1, Fgfr3, and Twist1, which are involved in craniosynostosis. Many of these genes could be enriched into different gene groups involved in different signaling pathways, different craniofacial deformities, and different biological processes. These findings allow the function of the MSX genes to be analyzed within a gene network. In addition, our findings showed that Sumo, a novel gene whose polymorphisms were demonstrated to be associated with nonsyndromic CL/P by genome-wide association study, has a protein-protein interaction with MSX1, which may offer an alternative method for performing bioinformatic analysis of genes found by genome-wide association studies and may allow prediction of disrupted protein function due to a mutation in a gene's DNA sequence. These findings may guide further functional studies in the future.

  1. TRISO Fuel Performance: Modeling, Integration into Mainstream Design Studies, and Application to a Thorium-fueled Fusion-Fission Hybrid Blanket

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Powers, Jeffrey James

    2011-11-30

    This study focused on creating a new tristructural isotropic (TRISO) coated particle fuel performance model and demonstrating the integration of this model into an existing system of neutronics and heat transfer codes, creating a user-friendly option for including fuel performance analysis within system design optimization and system-level trade-off studies. The end product enables both a deeper understanding and better overall system performance of nuclear energy systems limited or greatly impacted by TRISO fuel performance. A thorium-fueled hybrid fusion-fission Laser Inertial Fusion Energy (LIFE) blanket design was used to illustrate the application of this new capability and demonstrated both the importance of integrating fuel performance calculations into mainstream design studies and the impact that this new integrated analysis had on system-level design decisions. A new TRISO fuel performance model named TRIUNE was developed, verified, and validated during this work, with a novel methodology established for simulating the actual lifetime of a TRISO particle during repeated passes through a pebble bed. In addition, integrated self-consistent calculations were performed for neutronics depletion analysis, heat transfer calculations, and then fuel performance modeling for a full parametric study that encompassed over 80 different design options that went through all three phases of analysis. Lastly, side studies were performed that included a comparison of thorium and depleted uranium (DU) LIFE blankets as well as some uncertainty quantification work to help guide future experimental work by assessing which material properties in TRISO fuel performance modeling are most in need of improvement. A recommended thorium-fueled hybrid LIFE engine design was identified with an initial fuel load of 20 MT of thorium, 15% TRISO packing within the graphite fuel pebbles, and a 20 cm neutron-multiplier layer with beryllium pebbles in flibe molten salt coolant.
It operated at a system power level of 2000 MWth, took about 3.5 years to reach full plateau power, and was capable of an end-of-plateau burnup of 38.7 %FIMA if considering just the neutronic constraints in the system design; however, fuel performance constraints led to a maximum credible burnup of 12.1 %FIMA due to a combination of internal gas pressure and irradiation effects on the TRISO materials (especially PyC) leading to SiC pressure vessel failures. The optimal neutron spectrum for the thorium-fueled blanket options evaluated seemed to favor a hard spectrum (low but nonzero neutron-multiplier thicknesses and high TRISO packing fractions) in terms of neutronic performance, but the fuel performance constraints demonstrated that a significantly softer spectrum would be needed to decrease the rate of accumulation of fast neutron fluence in order to improve the maximum credible burnup the system could achieve.

  2. A high-temperature superconducting Helmholtz probe for microscopy at 9.4 T.

    PubMed

    Hurlston, S E; Brey, W W; Suddarth, S A; Johnson, G A

    1999-05-01

    The design and operation of a high-temperature superconducting (HTS) probe for magnetic resonance microscopy (MRM) at 400 MHz are presented. The design of the probe includes a Helmholtz coil configuration and a stable open-cycle cooling mechanism. Characterization of coil operating parameters is presented to demonstrate the suitability of cryo-cooled coils for MRM. Specifically, the performance of the probe is evaluated by comparison of signal-to-noise (SNR) performance with that of a copper Helmholtz pair, analysis of B1 field homogeneity, and quantification of thermal stability. Images are presented to demonstrate the SNR advantage of the probe for typical MRM applications.

  3. Determination of Hypochlorite in Bleaching Products with Flower Extracts to Demonstrate the Principles of Flow Injection Analysis

    ERIC Educational Resources Information Center

    Ramos, Luiz Antonio; Prieto, Katia Roberta; Carvalheiro, Eder Tadeu Gomes; Carvalheiro, Carla Cristina Schmitt

    2005-01-01

    Crude flower extracts were used to demonstrate the principles of analytical chemistry automation through a flow injection analysis (FIA) procedure developed to determine hypochlorite in household bleaching products. FIA comprises a group of techniques based on injection of a liquid sample into a moving, nonsegmented carrier stream of a…

  4. SPAR thermal analysis processors reference manual, system level 16. Volume 1: Program executive. Volume 2: Theory. Volume 3: Demonstration problems. Volume 4: Experimental thermal element capability. Volume 5: Programmer reference

    NASA Technical Reports Server (NTRS)

    Marlowe, M. B.; Moore, R. A.; Whetstone, W. D.

    1979-01-01

    User instructions are given for performing linear and nonlinear steady state and transient thermal analyses with SPAR thermal analysis processors TGEO, SSTA, and TRTA. It is assumed that the user is familiar with basic SPAR operations and basic heat transfer theory.

  5. LC-MS and MS/MS in the analysis of recombinant proteins

    NASA Astrophysics Data System (ADS)

    Coulot, M.; Domon, B.; Grossenbacher, H.; Guenat, C.; Maerki, W.; Müller, D. R.; Richter, W. J.

    1993-03-01

    The applicability and performance of electrospray ionization mass spectrometry (ESIMS) are demonstrated for protein analysis. ESIMS is applied in conjunction with on-line HPLC (LC-ESIMS) and direct tandem mass spectrometry (positive and negative ion mode ESIMS/MS) to the structural characterization of a recombinant protein (r-hirudin variant 1) and a congener phosphorylated at threonine 45 (RP-1).

  6. Analysis of Lunar Surface Charging for a Candidate Spacecraft Using NASCAP-2K

    NASA Technical Reports Server (NTRS)

    Parker, Linda; Minow, Joseph; Blackwell, William, Jr.

    2007-01-01

    The characterization of the electromagnetic interaction for a spacecraft in the lunar environment, and identification of viable charging mitigation strategies, is a critical lunar mission design task, as spacecraft charging has important implications both for science applications and for astronaut safety. To that end, we have performed surface charging calculations of a candidate lunar spacecraft for lunar orbiting and lunar landing missions. We construct a model of the spacecraft with candidate materials having appropriate electrical properties using Object Toolkit and perform the spacecraft charging analysis using Nascap-2k, the NASA/AFRL sponsored spacecraft charging analysis tool. We use nominal and atypical lunar environments appropriate for lunar orbiting and lunar landing missions to establish current collection of lunar ions and electrons. In addition, we include a geostationary orbit case to demonstrate a bounding example of extreme (negative) charging of a lunar spacecraft in the geostationary orbit environment. Results from the charging analysis demonstrate that minimal differential potentials (and resulting threat of electrostatic discharge) occur when the spacecraft is constructed entirely of conducting materials, as expected. We compare charging results to data taken during previous lunar orbiting or lunar flyby spacecraft missions.
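
    A surface-charging code such as Nascap-2k ultimately solves a current-balance problem: each surface floats to the potential at which the collected electron and ion currents cancel. The one-node sketch below illustrates that principle with Maxwellian thermal flux currents and assumed plasma parameters; it is not the Nascap-2k model or the environments used in the study.

```python
import math

def net_current(phi, n=1e6, t_e=10.0, t_i=10.0, m_i=1.67e-27):
    """Net current density (A/m^2) to a surface at potential phi (volts, <= 0).

    n: plasma density (m^-3); t_e, t_i: electron/ion temperatures (eV);
    m_i: ion mass (kg). All values here are assumed examples.
    """
    e, m_e = 1.602e-19, 9.109e-31
    # Thermal flux current j0 = (1/4) n e v_mean for each species.
    j_e0 = 0.25 * n * e * math.sqrt(8 * e * t_e / (math.pi * m_e))
    j_i0 = 0.25 * n * e * math.sqrt(8 * e * t_i / (math.pi * m_i))
    # Electrons are repelled by a negative surface (Boltzmann factor);
    # ions are kept at their thermal flux for simplicity.
    return j_i0 - j_e0 * math.exp(phi / t_e)

def floating_potential(lo=-200.0, hi=0.0, tol=1e-9):
    """Bisect for the potential where electron and ion currents balance."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if net_current(mid) > 0:
            lo = mid  # still ion-dominated: root is less negative
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

    Because electrons are so much lighter (and thus faster) than ions, the balance point is strongly negative, which is why eclipse or geostationary-like environments drive spacecraft surfaces negative.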

  7. Response of the Alliance 1 Proof-of-Concept Airplane Under Gust Loads

    NASA Technical Reports Server (NTRS)

    Naser, A. S.; Pototzky, A. S.; Spain, C. V.

    2001-01-01

    This report presents the work performed by Lockheed Martin's Langley Program Office in support of NASA's Environmental Research Aircraft and Sensor Technology (ERAST) program. The primary purpose of this work was to develop and demonstrate a gust analysis method that accounts for the span-wise variation of gust velocity. This is important because these unmanned aircraft, with their high aspect ratios and low wing loading, are very flexible and fly at low speeds. The main focus of the work was therefore to perform a two-dimensional Power Spectral Density (PSD) analysis of the Alliance 1 Proof-of-Concept Unmanned Aircraft. As of this writing, none of the aircraft described in this report have been constructed; they are concepts represented by analytical models. The process first involved the development of suitable structural and aeroelastic Finite Element Models (FEM). This was followed by development of a one-dimensional PSD gust analysis, and then the two-dimensional PSD analysis of the Alliance 1. For further validation and comparison, two additional analyses were performed: a two-dimensional PSD gust analysis of a simple MSC/NASTRAN example problem, and a one-dimensional discrete gust analysis of the Alliance 1. This report describes this process, shows the relevant comparisons between analytical methods, and discusses the physical meanings of the results.
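
    A PSD gust analysis of this kind needs an assumed gust spectrum. As an illustration, the widely used Dryden vertical-gust form (as tabulated in MIL-F-8785C) can be evaluated and checked against its defining property, that the one-sided spectrum integrates to the gust variance. The abstract does not state which spectrum the Alliance 1 analysis actually used.

```python
import numpy as np

def dryden_vertical_psd(omega, sigma_w=1.0, l_w=1.0):
    """One-sided Dryden vertical-gust PSD versus spatial frequency (rad/m).

    sigma_w: RMS gust velocity; l_w: turbulence scale length.
    The one-sided spectrum integrates (0 -> infinity) to sigma_w**2.
    """
    x = l_w * omega
    return (sigma_w**2 * l_w / np.pi) * (1.0 + 3.0 * x**2) / (1.0 + x**2) ** 2

def trapezoid(y, x):
    """Plain trapezoidal rule (avoids NumPy version differences)."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

# Sanity check: the area under the one-sided spectrum is the gust variance.
omega = np.concatenate(([0.0], np.logspace(-6, 5, 200001)))
variance = trapezoid(dryden_vertical_psd(omega), omega)
```

    In a PSD gust analysis, this input spectrum is multiplied by the squared magnitude of the aircraft's gust-to-response transfer function to obtain the response PSD.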

  8. An Expert Assistant for Computer Aided Parallelization

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Chun, Robert; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    A prototype expert system was developed to assist the user in the computer-aided parallelization process. The system interfaces to tools for automatic parallelization and performance analysis. By fusing static program structure information and dynamic performance analysis data, the expert system can help the user to filter, correlate, and interpret the data gathered by the existing tools. Sections of the code that show poor performance and require further attention are rapidly identified, and suggestions for improvements are presented to the user. In this paper we describe the components of the expert system and discuss its interface to the existing tools. We present a case study to demonstrate its successful use in full-scale scientific applications.

  9. Probabilistic image modeling with an extended chain graph for human activity recognition and image segmentation.

    PubMed

    Zhang, Lei; Zeng, Zhi; Ji, Qiang

    2011-09-01

    Chain graph (CG) is a hybrid probabilistic graphical model (PGM) capable of modeling heterogeneous relationships among random variables. So far, however, its application in image and video analysis has been very limited due to the lack of principled learning and inference methods for a CG of general topology. To overcome this limitation, we extend the conventional chain-like CG model to one with more general topology and introduce the associated methods for learning and inference in such a general CG model. Specifically, we propose techniques to systematically construct a generally structured CG, to parameterize this model, to derive its joint probability distribution, to perform joint parameter learning, and to perform probabilistic inference in this model. To demonstrate the utility of such an extended CG, we apply it to two challenging image and video analysis problems: human activity recognition and image segmentation. The experimental results show improved performance of the extended CG model over conventional directed or undirected PGMs. This study demonstrates the promise of the extended CG for effective modeling and inference of complex real-world problems.

  10. Recovery Act-SmartGrid regional demonstration transmission and distribution (T&D) Infrastructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hedges, Edward T.

    This document represents the Final Technical Report for the Kansas City Power & Light Company (KCP&L) Green Impact Zone SmartGrid Demonstration Project (SGDP). The KCP&L project is partially funded by Department of Energy (DOE) Regional Smart Grid Demonstration Project cooperative agreement DE-OE0000221 in the Transmission and Distribution Infrastructure application area. This Final Technical Report summarizes the KCP&L SGDP as of April 30, 2015 and includes summaries of the project design, implementation, operations, and analysis performed as of that date.

  11. High Sensitivity and High Detection Specificity of Gold-Nanoparticle-Grafted Nanostructured Silicon Mass Spectrometry for Glucose Analysis.

    PubMed

    Tsao, Chia-Wen; Yang, Zhi-Jie

    2015-10-14

    Desorption/ionization on silicon (DIOS) is a high-performance matrix-free mass spectrometry (MS) analysis method that involves using silicon nanostructures as a matrix for MS desorption/ionization. In this study, gold nanoparticles grafted onto a nanostructured silicon (AuNPs-nSi) surface were demonstrated as a DIOS-MS analysis approach with high sensitivity and high detection specificity for glucose detection. A glucose sample deposited on the AuNPs-nSi surface was directly catalyzed to negatively charged gluconic acid molecules on a single AuNPs-nSi chip for MS analysis. The AuNPs-nSi surface was fabricated using two electroless deposition steps and one electroless etching step. The effects of the electroless fabrication parameters on the glucose detection efficiency were evaluated. Practical application of AuNPs-nSi MS glucose analysis in urine samples was also demonstrated in this study.

  12. Determining cantilever stiffness from thermal noise.

    PubMed

    Lübbe, Jannis; Temmen, Matthias; Rahe, Philipp; Kühnle, Angelika; Reichling, Michael

    2013-01-01

    We critically discuss the extraction of intrinsic cantilever properties, namely the eigenfrequency f_n, quality factor Q_n, and specifically the stiffness k_n of the nth cantilever oscillation mode, from thermal noise by an analysis of the power spectral density of displacement fluctuations of the cantilever in contact with a thermal bath. The practical applicability of this approach is demonstrated for several cantilevers with eigenfrequencies ranging from 50 kHz to 2 MHz. Because this approach requires sophisticated spectral analysis, we introduce a new method to determine k_n from a spectral analysis of the demodulated oscillation signal of the excited cantilever that can be performed in the frequency range of 10 Hz to 1 kHz regardless of the eigenfrequency of the cantilever. We demonstrate that the latter method is particularly useful for noncontact atomic force microscopy (NC-AFM), where the required simple instrumentation for spectral analysis is available in most experimental systems.
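
    The thermal-noise route to stiffness rests on the equipartition theorem: k_n = k_B * T / <z_n^2>, where the mean-square thermal displacement <z_n^2> of the mode is obtained by integrating its displacement PSD over frequency. A minimal sketch, using a synthetic flat PSD as a stand-in for a measured spectrum:

```python
import numpy as np

kB = 1.380649e-23  # Boltzmann constant, J/K

def stiffness_from_psd(freq_hz, psd_m2_per_hz, temperature=300.0):
    """Estimate modal stiffness k_n (N/m) from a displacement PSD via
    equipartition: k_n = kB * T / <z^2>, with <z^2> the area under the
    mode's displacement PSD (trapezoidal rule)."""
    mean_square = float(np.sum(0.5 * (psd_m2_per_hz[1:] + psd_m2_per_hz[:-1])
                               * np.diff(freq_hz)))
    return kB * temperature / mean_square

# Synthetic example: a flat PSD of 1e-24 m^2/Hz across a 100 Hz band gives
# <z^2> = 1e-22 m^2, so k = kB * 300 K / 1e-22 m^2, about 41.4 N/m.
f = np.linspace(0.0, 100.0, 1001)
psd = np.full_like(f, 1e-24)
k = stiffness_from_psd(f, psd)
```

    A real measurement would fit and integrate the thermal peak of one mode only, after subtracting the detection noise floor, which is the part that requires the careful spectral analysis the abstract emphasizes.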

  13. Biodegradation of polycyclic hydrocarbons by Phanerochaete chrysosporium

    EPA Science Inventory

    The ability of the white rot fungus Phanerochaete chrysosporium to degrade polycyclic aromatic hydrocarbons (PAHs) that are present in anthracene oil (a distillation product obtained from coal tar) was demonstrated. Analysis by capillary gas chromatography and high-performance li...

  14. DEMONSTRATION OF AUTONOMOUS AIR MONITORING THROUGH ROBOTICS

    EPA Science Inventory

    Hazardous and/or tedious functions are often performed by on-site workers during investigation, mitigation and clean-up of hazardous substances. These functions include site surveys, sampling and analysis, excavation, and treatment and preparation of wastes for shipment to chemic...

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casbon, M. A.; Nichols, W. E.

    DOE O 435.1, Radioactive Waste Management, and DOE M 435.1-1, Radioactive Waste Management Manual, require that a determination of continued adequacy of the performance assessment (PA), composite analysis (CA), and disposal authorization statement (DAS) be made on an annual basis, and it must consider the results of data collection and analysis from research, field studies, and monitoring. Annual summaries of low-level waste (LLW) disposal operations must be prepared with respect to the conclusions and recommendations of the PA and CA, and a determination of the need to revise the PA or CA must be made. The annual summary requirement provides a structured approach for demonstrating the continued adequacy of the PA and CA in showing a reasonable expectation that the performance objectives will be met. This annual summary addresses only the status of the Environmental Restoration Disposal Facility (ERDF) PA (CP-60089, Performance Assessment for the Environmental Restoration Disposal Facility, Hanford Site, Washington, formerly WCH-520 Rev. 1). The CA for ERDF is supported by DOE/RL-2016-62, Annual Status Report (FY 2016): Composite Analysis of Low Level Waste Disposal in the Central Plateau at the Hanford Site. The ERDF PA portion of the CA document is found in Section 3.1.4, and the ERDF operations portion is found in Section 3.3.3.2 of that document.

  16. Orion Rendezvous, Proximity Operations, and Docking Design and Analysis

    NASA Technical Reports Server (NTRS)

    D'Souza, Christopher; Hanak, F. Chad; Spehar, Pete; Clark, Fred D.; Jackson, Mark

    2007-01-01

    The Orion vehicle will be required to perform rendezvous, proximity operations, and docking (RPOD) with the International Space Station (ISS) and the Earth Departure Stage (EDS)/Lunar Landing Vehicle (LLV) stack in Low Earth Orbit (LEO), as well as with the Lunar Landing Vehicle in Low Lunar Orbit (LLO). The RPOD system, which consists of sensors, actuators, and software, is being designed to be flexible and robust enough to perform RPOD with different vehicles in different environments. This paper will describe the design and the analysis which has been performed to date to allow the vehicle to perform its mission. Since the RPOD design touches on many areas, such as sensor selection and placement, trajectory design, navigation performance, and effector performance, it is inherently a systems design problem. This paper will address each of these issues in order to demonstrate how the Orion RPOD system has been designed to accommodate and meet all the requirements levied on the system.

  17. Novel Image Encryption Scheme Based on Chebyshev Polynomial and Duffing Map

    PubMed Central

    2014-01-01

    We present a novel image encryption algorithm using a Chebyshev-polynomial-based permutation and substitution and a Duffing-map-based substitution. Comprehensive security analysis has been performed on the designed scheme using key space analysis, visual testing, histogram analysis, information entropy calculation, correlation coefficient analysis, differential analysis, key sensitivity testing, and speed testing. The study demonstrates that the proposed image encryption algorithm offers a key space of more than 10^113 and a desirable level of security based on the good statistical results and theoretical arguments. PMID:25143970
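
    Two of the security metrics listed, information entropy and adjacent-pixel correlation, are straightforward to compute. The sketch below applies them to uniform random noise as a stand-in for ciphertext (a well-encrypted 8-bit image should approach 8 bits/pixel of entropy and near-zero correlation); it is not the cited scheme's test code.

```python
import numpy as np

def shannon_entropy(img):
    """Entropy in bits/pixel of an 8-bit image (histogram estimate)."""
    counts = np.bincount(img.ravel(), minlength=256)
    p = counts[counts > 0] / img.size
    return float(-(p * np.log2(p)).sum())

def horizontal_correlation(img):
    """Pearson correlation between horizontally adjacent pixel pairs."""
    x = img[:, :-1].ravel().astype(float)
    y = img[:, 1:].ravel().astype(float)
    return float(np.corrcoef(x, y)[0, 1])

# Uniform random noise approximates ideal ciphertext for these two metrics.
rng = np.random.default_rng(0)
cipher = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
```

    Natural images, by contrast, typically show adjacent-pixel correlations above 0.9, which is why these two metrics separate plaintext from well-scrambled ciphertext so cleanly.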

  18. Planar Inlet Design and Analysis Process (PINDAP)

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Gruber, Christopher R.

    2005-01-01

    The Planar Inlet Design and Analysis Process (PINDAP) is a collection of software tools that allow the efficient aerodynamic design and analysis of planar (two-dimensional and axisymmetric) inlets. The aerodynamic analysis is performed using the Wind-US computational fluid dynamics (CFD) program. A major element in PINDAP is a Fortran 90 code named PINDAP that can establish the parametric design of the inlet and efficiently model the geometry and generate the grid for CFD analysis with design changes to those parameters. The use of PINDAP is demonstrated for subsonic, supersonic, and hypersonic inlets.

  19. Accelerating Sequences in the Presence of Metal by Exploiting the Spatial Distribution of Off-Resonance

    PubMed Central

    Smith, Matthew R.; Artz, Nathan S.; Koch, Kevin M.; Samsonov, Alexey; Reeder, Scott B.

    2014-01-01

    Purpose: To demonstrate feasibility of exploiting the spatial distribution of off-resonance surrounding metallic implants for accelerating multispectral imaging techniques. Theory: Multispectral imaging (MSI) techniques perform time-consuming independent 3D acquisitions with varying RF frequency offsets to address the extreme off-resonance from metallic implants. Each off-resonance bin provides a unique spatial sensitivity that is analogous to the sensitivity of a receiver coil, and therefore provides a unique opportunity for acceleration. Methods: Fully sampled MSI was performed to demonstrate retrospective acceleration. A uniform sampling pattern across off-resonance bins was compared to several adaptive sampling strategies using a total hip replacement phantom. Monte Carlo simulations were performed to compare the noise propagation of two of these strategies. With a total knee replacement phantom, positive and negative off-resonance bins were strategically sampled with respect to the B0 field to minimize aliasing. Reconstructions were performed within a parallel imaging framework to demonstrate retrospective acceleration. Results: An adaptive sampling scheme dramatically improved reconstruction quality, which was supported by the noise propagation analysis. Independent acceleration of negative and positive off-resonance bins reduced the overlap of aliased signal and improved the reconstruction. Conclusion: This work presents the feasibility of acceleration in the presence of metal by exploiting the spatial sensitivities of off-resonance bins. PMID:24431210

  20. Extended depth of focus contact lenses vs. two commercial multifocals: Part 1. Optical performance evaluation via computed through-focus retinal image quality metrics.

    PubMed

    Bakaraju, Ravi C; Ehrmann, Klaus; Ho, Arthur

    To compare the computed optical performance of prototype lenses designed using deliberate manipulation of higher-order spherical aberrations to extend depth of focus (EDOF) with two commercial multifocals. Emmetropic, presbyopic, schematic eyes were coupled with prototype EDOF and commercial multifocal lenses (Acuvue Oasys for Presbyopia, AOP, Johnson & Johnson; Air Optix Aqua multifocal, AOMF, Alcon). For each test configuration, the through-focus retinal image quality (TFRIQ) values were computed over 21 vergences, ranging from -0.50 to 2.00D, in 0.125D steps. Analysis was performed considering eyes with three different inherent aberration profiles, five different pupils, and five different lens decentration levels. Except for the LOW design, the AOP lenses offered bifocal-like TFRIQ performance. Lens performance was relatively independent of pupil and aberrations, but not of centration. In contrast, AOMF demonstrated distance-centric performance, most dominant in the LOW design, followed by the MED and HIGH designs. AOMF lenses were the most sensitive to pupil, aberrations and centration. The prototypes demonstrated a 'lift-off' in TFRIQ performance, particularly at intermediate and near, without trading performance at distance. When compared with AOP and AOMF, the EDOF lenses demonstrated reduced sensitivity to pupil, aberrations and centration. With through-focus retinal image quality as the gauge of optical performance, we demonstrated that the prototype EDOF designs were less susceptible to variations in pupil, inherent ocular aberrations, and decentration than the commercial designs. To ascertain whether these incremental improvements translate to a clinically palpable outcome requires investigation through human trials. Copyright © 2017 Spanish General Council of Optometry. Published by Elsevier España, S.L.U. All rights reserved.

  1. Analysis of Biomass Sugars Using a Novel HPLC Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agblevor, F. A.; Hames, B. R.; Schell, D.

    The precise quantitative analysis of biomass sugars is a very important step in the conversion of biomass feedstocks to fuels and chemicals. However, the most accurate method of biomass sugar analysis is based on gas chromatography analysis of derivatized sugars, either as alditol acetates or trimethylsilanes. The derivatization method is time consuming, but the alternative high-performance liquid chromatography (HPLC) method cannot resolve most sugars found in biomass hydrolysates. We have demonstrated for the first time that by careful manipulation of the HPLC mobile phase, biomass monomeric sugars (arabinose, xylose, fructose, glucose, mannose, and galactose) can be analyzed quantitatively with excellent baseline resolution of all the sugars. This method was demonstrated for standard sugars and for pretreated corn stover liquid and solid fractions. Our method can also be used to analyze dimeric sugars (cellobiose and sucrose).

  2. COBRA ATD minefield detection results for the Joint Countermine ACTD Demonstrations

    NASA Astrophysics Data System (ADS)

    Stetson, Suzanne P.; Witherspoon, Ned H.; Holloway, John H., Jr.; Suiter, Harold R.; Crosby, Frank J.; Hilton, Russell J.; McCarley, Karen A.

    2000-08-01

    The Coastal Battlefield Reconnaissance and Analysis (COBRA) system described here was a Marine Corps Advanced Technology Demonstration (ATD) development consisting of an unmanned aerial vehicle (UAV) airborne multispectral video sensor system and a ground station that processes the multispectral video data to automatically detect minefields along the flight path. After successful completion of the ATD, the residual COBRA ATD system participated in the Joint Countermine (JCM) Advanced Concept Technology Demonstration (ACTD) Demo I, held at Camp Lejeune, North Carolina in conjunction with JTFX97, and Demo II, held in Stephenville, Newfoundland in conjunction with MARCOT98. These exercises demonstrated the COBRA ATD system in an operational environment, detecting minefields that included several different mine types in widely varying backgrounds. The COBRA system performed superbly during these demonstrations, detecting mines under water, in the surf zone, on the beach, and inland, and has transitioned to an acquisition program. This paper describes the COBRA operation and performance results for these demonstrations, which represent the first demonstrated capability for remote tactical minefield detection from a UAV. The successful COBRA technologies and techniques demonstrated for tactical UAV minefield detection in the Joint Countermine Advanced Concept Technology Demonstrations have formed the technical foundation for future developments in Marine Corps, Navy, and Army tactical remote airborne mine detection systems.

  3. Integrated flight/propulsion control - Subsystem specifications for performance

    NASA Technical Reports Server (NTRS)

    Neighbors, W. K.; Rock, Stephen M.

    1993-01-01

    A procedure is presented for calculating multiple subsystem specifications given a number of performance requirements on the integrated system. This procedure applies to problems where the control design must be performed in a partitioned manner. It is based on a structured singular value analysis, and generates specifications as magnitude bounds on subsystem uncertainties. The performance requirements should be provided in the form of bounds on transfer functions of the integrated system. This form allows the expression of model following, command tracking, and disturbance rejection requirements. The procedure is demonstrated on a STOVL aircraft design.

  4. Validation of a Scalable Solar Sailcraft

    NASA Technical Reports Server (NTRS)

    Murphy, D. M.

    2006-01-01

    The NASA In-Space Propulsion (ISP) program sponsored intensive solar sail technology and systems design, development, and hardware demonstration activities over the past 3 years. Efforts to validate a scalable solar sail system by functional demonstration in relevant environments, together with test-analysis correlation activities, have recently been completed successfully. A review of the program is given, with descriptions of the design, results of testing, and analytical model validations of component and assembly functional, strength, stiffness, shape, and dynamic behavior. The scaled performance of the validated system is projected to demonstrate applicability to flight demonstration and important NASA roadmap missions.

  5. X-38 Bolt Retractor Subsystem Separation Demonstration

    NASA Technical Reports Server (NTRS)

    Rugless, Fedoria (Editor); Johnston, A. S.; Ahmed, R.; Garrison, J. C.; Gaines, J. L.; Waggoner, J. D.

    2002-01-01

    The Flight Robotics Laboratory (FRL) successfully demonstrated the X-38 bolt retractor subsystem (BRS). The BRS design was proven safe by testing in the Pyrotechnic Shock Facility (PSF) before being demonstrated in the FRL. This Technical Memorandum describes the BRS, FRL, PSF, and interface hardware. Bolt retraction time, spacecraft simulator acceleration, and a force analysis are also presented. The purpose of the demonstration was to show the FRL capability for spacecraft separation testing using pyrotechnics. Although a formal test was not performed due to schedule and budget constraints, the data show that the BRS is a successful design concept and the FRL is suitable for future separation tests.

  6. The demonstration of an advanced cyclone coal combustor, with internal sulfur, nitrogen, and ash control for the conversion of a 23 MMBTU/hour oil fired boiler to pulverized coal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zauderer, B.; Fleming, E.S.

    1991-08-30

    This work pertains to the final report of the demonstration of an advanced cyclone coal combustor. Titles include: "Chronological Description of the Clean Coal Project Tests," "Statistical Analysis of Operating Data for the Coal Tech Combustor," "Photographic History of the Project," "Results of Slag Analysis by PA DER Module 1 Procedure," "Properties of the Coals and Limestone Used in the Test Effort," and "Results of the Solid Waste Sampling Performed on the Coal Tech Combustor by an Independent Contractor During the February 1990 Tests." (VC)

  7. In Vitro Assessment of Plants Growing in Cuba Belonging to Solanaceae Family Against Leishmania amazonensis.

    PubMed

    Monzote, Lianet; Jiménez, Jenny; Cuesta-Rubio, Osmany; Márquez, Ingrid; Gutiérrez, Yamile; da Rocha, Cláudia Quintino; Marchi, Mary; Setzer, William N; Vilegas, Wagner

    2016-11-01

    In this study, an in vitro antileishmanial assessment of plant extracts from 12 genera and 46 species growing in Cuba belonging to the Solanaceae family was performed. A total of 226 extracts were screened against promastigotes of Leishmania amazonensis, and the cytotoxicity of active extracts [median inhibitory concentration (IC50) against promastigotes <100 µg/mL] was determined on peritoneal macrophages from BALB/c mice. Extracts that showed a selectivity index >5 were then assayed against intracellular amastigotes. Metabolomics analysis of promising extracts was performed using chemical profiles obtained by ultra performance liquid chromatography. Only 11 extracts (4.9%) from nine plants were selected as potentially active: Brunfelsia cestroides A. Rich, Capsicum annuum L., Capsicum chinense Jacq., Cestrum nocturnum L., Nicotiana plumbaginifolia Viv., Solanum havanense Jacq., Solanum myriacanthum Dunal, Solanum nudum Dunal and Solanum seaforthianum And., with IC50 < 50 µg/mL and selectivity index >5. Metabolomics analysis demonstrated significant differences in the chemical profiles, with an average of 42.8 (range 31-88) compounds from m/z 104 to 1477, demonstrating the complexity of the mixtures. In addition, no common markers among active extracts were identified. The results demonstrate the importance of the Solanaceae family in the search for new antileishmanial agents, particularly in unexplored species of this family. Copyright © 2016 John Wiley & Sons, Ltd.

  8. Technology verification phase. Dynamic isotope power system. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Halsey, D.G.

    1982-03-10

    The Phase I requirements of the Kilowatt Isotope Power System (KIPS) program were to make a detailed Flight System Conceptual Design (FSCD) for an isotope-fueled organic Rankine cycle power system and to build and test a Ground Demonstration System (GDS) which simulated as closely as possible the operational characteristics of the FSCD. The activities and results of Phase II, the Technology Verification Phase, of the program are reported. The objectives of this phase were to increase system efficiency to 18.1% by component development, to demonstrate system reliability by a 5000 h endurance test, and to update the flight system design. During Phase II, system performance was improved from 15.1% to 16.6%, an endurance test of 2000 h was performed, and the flight design analysis was limited to a study of the General Purpose Heat Source, a study of the regenerator manufacturing technique, and analysis of the hardness of the system to a laser threat. It was concluded from these tests that the GDS is basically prototypic of a flight design; all components necessary for satisfactory operation were demonstrated successfully at the system level; over 11,000 total hours of operation without any component failure attested to the inherent reliability of this type of system; and some further development is required, specifically in the area of performance. (LCL)

  9. Hall Thruster Technology for NASA Science Missions

    NASA Technical Reports Server (NTRS)

    Manzella, David; Oh, David; Aadland, Randall

    2005-01-01

    The performance of a prototype Hall thruster designed for Discovery-class NASA science mission applications was evaluated at input powers ranging from 0.2 to 2.9 kilowatts. These data were used to construct a throttle profile for a projected Hall thruster system based on this prototype thruster. The suitability of such a Hall thruster system to perform robotic exploration missions was evaluated through the analysis of a near Earth asteroid sample return mission. This analysis demonstrated that a propulsion system based on the prototype Hall thruster offers mission benefits compared to a propulsion system based on an existing ion thruster.

  10. SUPERFUND TREATABILITY CLEARINGHOUSE: FINAL ...

    EPA Pesticide Factsheets

    During the period of July 8-12, 1985, the Shirco Infrared Systems Portable Pilot Test Unit was in operation at the Times Beach Dioxin Research Facility to demonstrate the capability of Shirco's infrared technology to decontaminate silty soil laden with 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) at a concentration range of 156 to 306 ppb. Emissions sampling and final analysis were performed by Environmental Research & Technology, Inc. (ERT), while laboratory analysis of the emissions and soil samples was performed by Roy F. Weston Inc. Shirco Infrared Systems prepared the testing procedure protocol and operated the furnace system.

  11. A human factors evaluation of the operational demonstration flight inspection aircraft.

    DOT National Transportation Integrated Search

    1995-05-01

    These reports describe the data collection and analysis efforts performed by the Civil Aerospace Medical Institute's Human Factors Research Laboratory to assist the Office of Aviation System Standards (AVN) in the human factors evaluation of the Oper...

  12. Spacelab cryogenic propellant management experiment

    NASA Technical Reports Server (NTRS)

    Cady, E. C.

    1976-01-01

    The conceptual design of a Spacelab cryogen management experiment was performed to demonstrate the desirability and feasibility of subcritical cryogenic fluid orbital storage and supply. A description of the experimental apparatus, definition of supporting requirements, procedures, data analysis, and a cost estimate are included.

  13. Measuring the effects of aborted takeoffs and landings on traffic flow at JFK

    DOT National Transportation Integrated Search

    2012-10-14

    The FAA Office of Accident Investigation and Prevention (AVP) supports research, analysis and demonstration of quantitative air traffic analyses to estimate safety performance and benefits of the Next Generation Air Transportation System (NextGen). T...

  14. The High Stability Engine Control (HISTEC) Program: Flight Demonstration Phase

    NASA Technical Reports Server (NTRS)

    DeLaat, John C.; Southwick, Robert D.; Gallops, George W.; Orme, John S.

    1998-01-01

    Future aircraft turbine engines, both commercial and military, must be able to accommodate expected increased levels of steady-state and dynamic engine-face distortion. The current approach of incorporating sufficient design stall margin to tolerate these increased levels of distortion would significantly reduce performance. The objective of the High Stability Engine Control (HISTEC) program is to design, develop, and flight-demonstrate an advanced, integrated engine control system that uses measurement-based estimates of distortion to enhance engine stability. The resulting distortion tolerant control reduces the required design stall margin, with a corresponding increase in performance and decrease in fuel burn. The HISTEC concept has been developed and was successfully flight demonstrated on the F-15 ACTIVE aircraft during the summer of 1997. The flight demonstration was planned and carried out in two phases, the first to show distortion estimation, and the second to show distortion accommodation. Post-flight analysis shows that the HISTEC technologies are able to successfully estimate and accommodate distortion, transiently setting the stall margin requirement on-line and in real-time. This allows the design stall margin requirement to be reduced, which in turn can be traded for significantly increased performance and/or decreased weight. Flight demonstration of the HISTEC technologies has significantly reduced the risk of transitioning the technology to tactical and commercial engines.

  15. Magnetic Resonance Elastography Demonstrating Low Brain Stiffness in a Patient with Low-Pressure Hydrocephalus: Case Report.

    PubMed

    Olivero, William C; Wszalek, Tracey; Wang, Huan; Farahvar, Arash; Rieth, Sandra M; Johnson, Curtis L

    2016-01-01

    The authors describe the case of a 19-year-old female with shunted aqueductal stenosis who presented with low-pressure hydrocephalus that responded to negative pressure drainage. A magnetic resonance elastography scan performed 3 weeks later demonstrated very low brain tissue stiffness (high brain tissue compliance). The importance of this finding for understanding this rare condition is discussed. © 2016 S. Karger AG, Basel.

  16. Butterfly wing color: A photonic crystal demonstration

    NASA Astrophysics Data System (ADS)

    Proietti Zaccaria, Remo

    2016-01-01

    We have theoretically modeled the optical behavior of a naturally occurring photonic crystal, as defined by the geometrical characteristics of the Teinopalpus imperialis butterfly. In particular, following a genetic algorithm approach, we demonstrate how its wings follow a triclinic crystal geometry with a tetrahedron unit base. By performing both photonic band analysis and transmission/reflection simulations, we are able to explain the characteristic colors emerging from the butterfly wings, thus confirming their crystal form.

  17. The power-proportion method for intracranial volume correction in volumetric imaging analysis.

    PubMed

    Liu, Dawei; Johnson, Hans J; Long, Jeffrey D; Magnotta, Vincent A; Paulsen, Jane S

    2014-01-01

    In volumetric brain imaging analysis, volumes of brain structures are typically assumed to be proportional or linearly related to intracranial volume (ICV). However, evidence abounds that many brain structures have power law relationships with ICV. To take this relationship into account in volumetric imaging analysis, we propose a power law based method-the power-proportion method-for ICV correction. The performance of the new method is demonstrated using data from the PREDICT-HD study.
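The power-proportion idea described above can be sketched in a few lines: fit the exponent b of the power law volume ~ c * ICV**b on a log-log scale, then rescale each volume to a reference ICV using that exponent instead of the usual proportional (b = 1) correction. Everything below (the exponent 0.7, the synthetic volumes and ICVs) is invented for illustration and is not PREDICT-HD data:

```python
import numpy as np

def fit_power_exponent(volumes, icvs):
    """Estimate b in volume ~ c * ICV**b via ordinary least squares
    on the log-log scale (slope of log(volume) vs. log(ICV))."""
    b, _log_c = np.polyfit(np.log(icvs), np.log(volumes), 1)
    return b

def power_proportion_correct(volumes, icvs, b):
    """Rescale each volume to the sample-mean ICV using the fitted
    power law, rather than simple proportional (b = 1) scaling."""
    ref = icvs.mean()
    return volumes * (ref / icvs) ** b

# synthetic data with a known power-law relationship (b = 0.7)
rng = np.random.default_rng(0)
icv = rng.uniform(1200.0, 1800.0, size=200)                 # cm^3
vol = 0.005 * icv**0.7 * rng.lognormal(0.0, 0.02, size=200)

b_hat = fit_power_exponent(vol, icv)
corrected = power_proportion_correct(vol, icv, b_hat)
# after correction, the volumes should be nearly uncorrelated with ICV
```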

  18. The evolutionary basis of human social learning

    PubMed Central

    Morgan, T. J. H.; Rendell, L. E.; Ehn, M.; Hoppitt, W.; Laland, K. N.

    2012-01-01

    Humans are characterized by an extreme dependence on culturally transmitted information. Such dependence requires the complex integration of social and asocial information to generate effective learning and decision making. Recent formal theory predicts that natural selection should favour adaptive learning strategies, but relevant empirical work is scarce and rarely examines multiple strategies or tasks. We tested nine hypotheses derived from theoretical models, running a series of experiments investigating factors affecting when and how humans use social information, and whether such behaviour is adaptive, across several computer-based tasks. The number of demonstrators, consensus among demonstrators, confidence of subjects, task difficulty, number of sessions, cost of asocial learning, subject performance and demonstrator performance all influenced subjects' use of social information, and did so adaptively. Our analysis provides strong support for the hypothesis that human social learning is regulated by adaptive learning rules. PMID:21795267

  19. The evolutionary basis of human social learning.

    PubMed

    Morgan, T J H; Rendell, L E; Ehn, M; Hoppitt, W; Laland, K N

    2012-02-22

    Humans are characterized by an extreme dependence on culturally transmitted information. Such dependence requires the complex integration of social and asocial information to generate effective learning and decision making. Recent formal theory predicts that natural selection should favour adaptive learning strategies, but relevant empirical work is scarce and rarely examines multiple strategies or tasks. We tested nine hypotheses derived from theoretical models, running a series of experiments investigating factors affecting when and how humans use social information, and whether such behaviour is adaptive, across several computer-based tasks. The number of demonstrators, consensus among demonstrators, confidence of subjects, task difficulty, number of sessions, cost of asocial learning, subject performance and demonstrator performance all influenced subjects' use of social information, and did so adaptively. Our analysis provides strong support for the hypothesis that human social learning is regulated by adaptive learning rules.

  20. Case Study for the ARRA-funded Ground Source Heat Pump Demonstration at Denver Museum of Nature & Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Im, Piljae; Liu, Xiaobing

    High initial costs and lack of public awareness of ground-source heat pump (GSHP) technology are the two major barriers preventing rapid deployment of this energy-saving technology in the United States. Under the American Recovery and Reinvestment Act (ARRA), 26 GSHP projects were competitively selected and carried out to demonstrate the benefits of GSHP systems and innovative technologies for cost reduction and/or performance improvement. This report highlights the findings of a case study of one such GSHP demonstration project, which uses a recycled water heat pump (RWHP) system installed at the Denver Museum of Nature & Science in Denver, Colorado. The RWHP system uses recycled water from the city's water system as the heat sink and source for a modular water-to-water heat pump (WWHP). This case study was conducted based on the available measured performance data from December 2014 through August 2015, utility bills of the building in 2014 and 2015, construction drawings, maintenance records, personal communications, and construction costs. The annual energy consumption of the RWHP system was calculated based on the available measured data and other related information. It was compared with the performance of a baseline scenario: a conventional VAV system using a water-cooled chiller and a natural gas fired boiler, both of which have the minimum energy efficiencies allowed by ASHRAE 90.1-2010. The comparison was made to determine energy savings, operating cost savings, and CO2 emission reductions achieved by the RWHP system. A cost analysis was performed to evaluate the simple payback of the RWHP system. Summarized below are the results of the performance analysis, the lessons learned, and recommended improvements in the operation of the RWHP system.

  1. Multidisciplinary High-Fidelity Analysis and Optimization of Aerospace Vehicles. Part 2; Preliminary Results

    NASA Technical Reports Server (NTRS)

    Walsh, J. L.; Weston, R. P.; Samareh, J. A.; Mason, B. H.; Green, L. L.; Biedron, R. T.

    2000-01-01

    An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity finite-element structural analysis and computational fluid dynamics aerodynamic analysis in a distributed, heterogeneous computing environment that includes high performance parallel computing. A software system has been designed and implemented to integrate a set of existing discipline analysis codes, some of them computationally intensive, into a distributed computational environment for the design of a high-speed civil transport configuration. The paper describes both the preliminary results from implementing and validating the multidisciplinary analysis and the results from an aerodynamic optimization. The discipline codes are integrated by using the Java programming language and a Common Object Request Broker Architecture compliant software product. A companion paper describes the formulation of the multidisciplinary analysis and optimization system.

  2. Thermal analysis of a conceptual design for a 250 We GPHS/FPSE space power system

    NASA Technical Reports Server (NTRS)

    Mccomas, Thomas J.; Dugan, Edward T.

    1991-01-01

    A thermal analysis has been performed for a 250-We space nuclear power system which combines the US Department of Energy's general purpose heat source (GPHS) modules with a state-of-the-art free-piston Stirling engine (FPSE). The focus of the analysis is on the temperature of the iridium fuel clad within the GPHS modules. The thermal analysis results indicate fuel clad temperatures slightly higher than the design goal temperature of 1573 K. The results are considered favorable due to the numerous conservative assumptions used. To demonstrate the effects of the conservatism, a brief sensitivity analysis is performed in which a few of the key system parameters are varied to determine their effect on the fuel clad temperatures. It is shown that analysis of a more detailed thermal model should yield fuel clad temperatures below 1573 K.

  3. System Risk Assessment and Allocation in Conceptual Design

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Smith, Natasha L.; Zang, Thomas A. (Technical Monitor)

    2003-01-01

    As aerospace systems continue to evolve in addressing newer challenges in air and space transportation, there exists a heightened priority for significant improvement in system performance, cost effectiveness, reliability, and safety. Tools that synthesize multidisciplinary integration, probabilistic analysis, and optimization are needed to facilitate design decisions allowing trade-offs between cost and reliability. This study investigates tools for probabilistic analysis and probabilistic optimization in the multidisciplinary design of aerospace systems. A probabilistic optimization methodology is demonstrated for the low-fidelity design of a reusable launch vehicle at two levels, a global geometry design and a local tank design. Probabilistic analysis is performed on a high-fidelity analysis of a Navy missile system. Furthermore, decoupling strategies are introduced to reduce the computational effort required for multidisciplinary systems with feedback coupling.

  4. Simultaneous analysis and design

    NASA Technical Reports Server (NTRS)

    Haftka, R. T.

    1984-01-01

    Optimization techniques are increasingly being used for performing nonlinear structural analysis. The development of element by element (EBE) preconditioned conjugate gradient (CG) techniques is expected to extend this trend to linear analysis. Under these circumstances the structural design problem can be viewed as a nested optimization problem. There are computational benefits to treating this nested problem as a large single optimization problem. The response variables (such as displacements) and the structural parameters are all treated as design variables in a unified formulation which performs simultaneously the design and analysis. Two examples are used for demonstration. A seventy-two bar truss is optimized subject to linear stress constraints and a wing box structure is optimized subject to nonlinear collapse constraints. Both examples show substantial computational savings with the unified approach as compared to the traditional nested approach.

  5. Time-dependent inertia analysis of vehicle mechanisms

    NASA Astrophysics Data System (ADS)

    Salmon, James Lee

    Two methods for performing transient inertia analysis of vehicle hardware systems are developed in this dissertation. The analysis techniques can be used to predict the response of vehicle mechanism systems to the accelerations associated with vehicle impacts. General analytical methods for evaluating translational or rotational system dynamics are generated and evaluated for various system characteristics. The utility of the derived techniques is demonstrated by applying the generalized methods to two vehicle systems. Time-dependent accelerations measured during a vehicle-to-vehicle impact are used as input to perform a dynamic analysis of an automobile liftgate latch and outside door handle. Generalized Lagrange equations for a non-conservative system are used to formulate a second-order nonlinear differential equation defining the response of the components to the transient input. The differential equation is solved by employing the fourth-order Runge-Kutta method. The events are then analyzed using commercially available two-dimensional rigid body dynamic analysis software. The results of the two analytical techniques are compared to experimental data generated by high-speed film analysis of tests of the two components performed on a high-G acceleration sled at Ford Motor Company.
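The numerical approach in this record, reducing a second-order nonlinear equation of motion to a first-order system and integrating it with the classical fourth-order Runge-Kutta method, can be sketched generically. The function names and the x'' = -x sanity-check problem below are mine, not the dissertation's latch model:

```python
import math

def rk4_second_order(f, t0, x0, v0, dt, n_steps):
    """Integrate x'' = f(t, x, v) with classical 4th-order Runge-Kutta
    by rewriting it as the first-order system x' = v, v' = f(t, x, v)."""
    t, x, v = t0, x0, v0
    for _ in range(n_steps):
        k1x, k1v = v, f(t, x, v)
        k2x, k2v = v + 0.5*dt*k1v, f(t + 0.5*dt, x + 0.5*dt*k1x, v + 0.5*dt*k1v)
        k3x, k3v = v + 0.5*dt*k2v, f(t + 0.5*dt, x + 0.5*dt*k2x, v + 0.5*dt*k2v)
        k4x, k4v = v + dt*k3v,     f(t + dt,     x + dt*k3x,     v + dt*k3v)
        x += dt / 6.0 * (k1x + 2*k2x + 2*k3x + k4x)
        v += dt / 6.0 * (k1v + 2*k2v + 2*k3v + k4v)
        t += dt
    return x, v

# sanity check on x'' = -x with x(0) = 1, x'(0) = 0 (exact solution cos t):
# at t = pi the position should be very close to -1
x_end, v_end = rk4_second_order(lambda t, x, v: -x, 0.0, 1.0, 0.0,
                                math.pi / 1000, 1000)
```

In the application described above, f(t, x, v) would encode the Lagrange-derived latch dynamics with the measured crash acceleration as the time-dependent forcing term.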

  6. Development and Validation of a Fast, Accurate and Cost-Effective Aeroservoelastic Method on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Goodwin, Sabine A.; Raj, P.

    1999-01-01

    Progress to date towards the development and validation of a fast, accurate and cost-effective aeroelastic method for advanced parallel computing platforms such as the IBM SP2 and the SGI Origin 2000 is presented in this paper. The ENSAERO code, developed at the NASA Ames Research Center, has been selected for this effort. The code allows for the computation of aeroelastic responses by simultaneously integrating the Euler or Navier-Stokes equations and the modal structural equations of motion. To assess the computational performance and accuracy of the ENSAERO code, this paper reports the results of the Navier-Stokes simulations of the transonic flow over a flexible aeroelastic wing body configuration. In addition, a forced harmonic oscillation analysis in the frequency domain and an analysis in the time domain are done on a wing undergoing a rigid pitch and plunge motion. Finally, to demonstrate the ENSAERO flutter-analysis capability, aeroelastic Euler and Navier-Stokes computations on an L-1011 wind tunnel model including pylon, nacelle and empennage are underway. All computational solutions are compared with experimental data to assess the level of accuracy of ENSAERO. As the computations described above are performed, a meticulous log of computational performance in terms of wall clock time, execution speed, memory and disk storage is kept. Code scalability is also demonstrated by studying the impact of varying the number of processors on computational performance on the IBM SP2 and the Origin 2000 systems.

  7. Validation of hierarchical cluster analysis for identification of bacterial species using 42 bacterial isolates

    NASA Astrophysics Data System (ADS)

    Ghebremedhin, Meron; Yesupriya, Shubha; Luka, Janos; Crane, Nicole J.

    2015-03-01

    Recent studies have demonstrated the potential advantages of the use of Raman spectroscopy in the biomedical field due to its rapidity and noninvasive nature. In this study, Raman spectroscopy is applied as a method for differentiating between bacteria isolates for Gram status and Genus species. We created models for identifying 28 bacterial isolates using spectra collected with a 785 nm laser excitation Raman spectroscopic system. In order to investigate the groupings of these samples, partial least squares discriminant analysis (PLSDA) and hierarchical cluster analysis (HCA) were implemented. In addition, cluster analyses of the isolates were performed using various data types consisting of biochemical tests, gene sequence alignment, high resolution melt (HRM) analysis, and antimicrobial susceptibility tests of minimum inhibitory concentration (MIC) and degree of antimicrobial resistance (SIR). In order to evaluate the ability of these models to correctly classify bacterial isolates using solely Raman spectroscopic data, a set of 14 validation samples was tested using the PLSDA models and consequently the HCA models. External cluster evaluation criteria of purity and Rand index were calculated at different taxonomic levels to compare the performance of clustering using Raman spectra as well as the other datasets. Results showed that Raman spectra performed comparably, and in some cases better than, the other data types with Rand index and purity values up to 0.933 and 0.947, respectively. This study clearly demonstrates that the discrimination of bacterial species using Raman spectroscopic data and hierarchical cluster analysis is possible and has the potential to be a powerful point-of-care tool in clinical settings.
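The external cluster-evaluation step described above (scoring a hierarchical clustering against known labels with purity and the Rand index) can be sketched as follows. The synthetic "spectra", the Ward linkage choice, and the three-class layout are assumptions for demonstration, not the study's Raman data:

```python
import numpy as np
from itertools import combinations
from scipy.cluster.hierarchy import linkage, fcluster

def purity(true, pred):
    """Fraction of samples assigned to the majority true class of their cluster."""
    total = 0
    for c in np.unique(pred):
        members = true[pred == c]
        total += np.bincount(members).max()
    return total / len(true)

def rand_index(true, pred):
    """Fraction of sample pairs on which the two partitions agree
    (both 'same cluster' or both 'different clusters')."""
    agree = sum((t1 == t2) == (p1 == p2)
                for (t1, p1), (t2, p2) in combinations(zip(true, pred), 2))
    return agree / (len(true) * (len(true) - 1) / 2)

# three well-separated synthetic classes of 50-dimensional "spectra"
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc=mu, scale=0.3, size=(10, 50))
               for mu in (0.0, 3.0, 6.0)])
labels = np.repeat([0, 1, 2], 10)

Z = linkage(X, method="ward")                     # agglomerative HCA
pred = fcluster(Z, t=3, criterion="maxclust")     # cut into 3 clusters
```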

  8. Microfluidic Injector Models Based on Artificial Neural Networks

    DTIC Science & Technology

    2005-06-15

    medicine, and chemistry [1], [2]. They generally perform chemical analysis involving sample preparation, mixing, reaction, injection, separation analysis... algorithms have been validated against many experiments found in the literature demonstrating microfluidic mixing, Joule heating, injection, and... 385 [7] K. Seiler, Z. H. Fan, K. Fluri, and D. J. Harrison, “Electroosmotic pumping and valveless control of fluid flow within a manifold of

  9. Integrative molecular network analysis identifies emergent enzalutamide resistance mechanisms in prostate cancer

    PubMed Central

    King, Carly J.; Woodward, Josha; Schwartzman, Jacob; Coleman, Daniel J.; Lisac, Robert; Wang, Nicholas J.; Van Hook, Kathryn; Gao, Lina; Urrutia, Joshua; Dane, Mark A.; Heiser, Laura M.; Alumkal, Joshi J.

    2017-01-01

    Recent work demonstrates that castration-resistant prostate cancer (CRPC) tumors harbor countless genomic aberrations that control many hallmarks of cancer. While some specific mutations in CRPC may be actionable, many others are not. We hypothesized that genomic aberrations in cancer may operate in concert to promote drug resistance and tumor progression, and that organization of these genomic aberrations into therapeutically targetable pathways may improve our ability to treat CRPC. To identify the molecular underpinnings of enzalutamide-resistant CRPC, we performed transcriptional and copy number profiling studies using paired enzalutamide-sensitive and resistant LNCaP prostate cancer cell lines. Gene networks associated with enzalutamide resistance were revealed by performing an integrative genomic analysis with the PAthway Representation and Analysis by Direct Reference on Graphical Models (PARADIGM) tool. Amongst the pathways enriched in the enzalutamide-resistant cells were those associated with MEK, EGFR, RAS, and NFKB. Functional validation studies of 64 genes identified 10 candidate genes whose suppression led to greater effects on cell viability in enzalutamide-resistant cells as compared to sensitive parental cells. Examination of a patient cohort demonstrated that several of our functionally-validated gene hits are deregulated in metastatic CRPC tumor samples, suggesting that they may be clinically relevant therapeutic targets for patients with enzalutamide-resistant CRPC. Altogether, our approach demonstrates the potential of integrative genomic analyses to clarify determinants of drug resistance and rational co-targeting strategies to overcome resistance. PMID:29340039

  10. Gait performance of children and adolescents with sensorineural hearing loss.

    PubMed

    Melo, Renato de Souza

    2017-09-01

    Several studies have demonstrated that children with sensorineural hearing loss (SNHL) may exhibit balance disorders, which can compromise the gait performance of this population. The aims were to compare the gait performance of normal hearing (NH) children and those with SNHL, considering the sex and age range of the sample, and to analyze gait performance according to degrees of hearing loss and etiological factors in the latter group. This is a cross-sectional study that assessed 96 students, 48 NH and 48 with SNHL, aged between 7 and 18 years. The Brazilian version of the Dynamic Gait Index (DGI) was used to analyze gait and the Mann-Whitney test for statistical analysis. The group with SNHL obtained lower average gait performance compared to NH subjects (p=0.000). This was also observed when the children were grouped by sex, female and male (p=0.000), and when they were stratified by age group, 7-18 years (p=0.000). The group with severe and profound hearing loss exhibited worse gait performance than those with mild and moderate loss (p=0.048), and children with prematurity as an etiological factor demonstrated the worst gait performance. The children with SNHL showed worse gait performance compared to NH of the same sex and age group. Those with severe and profound hearing loss and prematurity as an etiological factor demonstrated the worst gait performances. Copyright © 2017 Elsevier B.V. All rights reserved.
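The group comparison in this record uses the Mann-Whitney test on DGI scores. A minimal sketch of that test with hypothetical scores (the values below are invented to mimic a shifted distribution; the real study data are not reproduced here):

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(2)
# hypothetical DGI scores on the 0-24 scale, 48 children per group
dgi_nh = rng.integers(21, 25, size=48)     # normal-hearing group
dgi_snhl = rng.integers(15, 22, size=48)   # SNHL group, shifted lower

# two-sided Mann-Whitney U test for a difference between the groups
stat, p_value = mannwhitneyu(dgi_nh, dgi_snhl, alternative="two-sided")
```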

  11. Risk-Informed External Hazards Analysis for Seismic and Flooding Phenomena for a Generic PWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parisi, Carlo; Prescott, Steve; Ma, Zhegang

    This report describes the activities performed during FY2017 for the US-DOE Light Water Reactor Sustainability Risk-Informed Safety Margin Characterization (LWRS-RISMC), Industry Application #2. The scope of Industry Application #2 is to deliver a risk-informed external hazards safety analysis for a representative nuclear power plant. Following the advancements made during the previous FYs (toolkit identification, model development), FY2017 focused on increasing the level of realism of the analysis and improving the tools and the coupling methodologies. In particular, the following objectives were achieved: calculation of building pounding and its effects on component seismic fragility; development of a SAPHIRE code PRA model for a 3-loop Westinghouse PWR; set-up of a methodology for performing static-dynamic PRA coupling between the SAPHIRE and EMRALD codes; coupling of RELAP5-3D/RAVEN for performing Best-Estimate Plus Uncertainty analysis and automatic limit surface search; and execution of sample calculations demonstrating the capabilities of the toolkit in performing risk-informed external hazards safety analyses.

  12. Application of the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) for Dynamic Systems Analysis

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey; Zinnecker, Alicia

    2014-01-01

    Systems analysis involves steady-state simulations of combined components to evaluate the steady-state performance, weight, and cost of a system; dynamic considerations are not included until later in the design process. The Dynamic Systems Analysis task, under NASA's Fixed Wing project, is developing the capability for assessing dynamic issues at earlier stages during systems analysis. To provide this capability the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) has been developed to design a single flight condition controller (defined as altitude and Mach number) and, ultimately, provide an estimate of the closed-loop performance of the engine model. This tool has been integrated with the Commercial Modular Aero-Propulsion System Simulation 40,000 (CMAPSS 40k) engine model to demonstrate the additional information TTECTrA makes available for dynamic systems analysis. This dynamic data can be used to evaluate the trade-off between performance and safety, which could not be done with steady-state systems analysis data. TTECTrA has been designed to integrate with any turbine engine model that is compatible with the MATLAB Simulink (The MathWorks, Inc.) environment.

  13. Application of the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) for Dynamic Systems Analysis

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey Thomas; Zinnecker, Alicia Mae

    2014-01-01

    Systems analysis involves steady-state simulations of combined components to evaluate the steady-state performance, weight, and cost of a system; dynamic considerations are not included until later in the design process. The Dynamic Systems Analysis task, under NASA's Fixed Wing project, is developing the capability for assessing dynamic issues at earlier stages during systems analysis. To provide this capability the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) has been developed to design a single flight condition controller (defined as altitude and Mach number) and, ultimately, provide an estimate of the closed-loop performance of the engine model. This tool has been integrated with the Commercial Modular Aero-Propulsion System Simulation 40,000 (CMAPSS 40k) engine model to demonstrate the additional information TTECTrA makes available for dynamic systems analysis. This dynamic data can be used to evaluate the trade-off between performance and safety, which could not be done with steady-state systems analysis data. TTECTrA has been designed to integrate with any turbine engine model that is compatible with the MATLAB Simulink (The MathWorks, Inc.) environment.

  14. A Field-Portable Cell Analyzer without a Microscope and Reagents.

    PubMed

    Seo, Dongmin; Oh, Sangwoo; Lee, Moonjin; Hwang, Yongha; Seo, Sungkyu

    2017-12-29

    This paper demonstrates a commercial-level field-portable lens-free cell analyzer called the NaviCell (No-stain and Automated Versatile Innovative cell analyzer) capable of automatically analyzing cell count and viability without employing an optical microscope and reagents. Based on the lens-free shadow imaging technique, the NaviCell (162 × 135 × 138 mm³ and 1.02 kg) has the advantage of providing analysis results with improved standard deviation between measurement results, owing to its large field of view. Importantly, the cell counting and viability testing can be analyzed without the use of any reagent, thereby simplifying the measurement procedure and reducing potential errors during sample preparation. In this study, the performance of the NaviCell for cell counting and viability testing was demonstrated using 13 and 6 cell lines, respectively. Based on the results of the hemocytometer (de facto standard), the error rate (ER) and coefficient of variation (CV) of the NaviCell are approximately 3.27 and 2.16 times better than the commercial cell counter, respectively. The cell viability testing of the NaviCell also showed an ER and CV performance improvement of 5.09 and 1.8 times, respectively, demonstrating sufficient potential in the field of cell analysis.

  15. Trial-and-error copying of demonstrated actions reveals how fledglings learn to ‘imitate’ their mothers

    PubMed Central

    Lotem, Arnon

    2017-01-01

    Understanding how humans and other animals learn to perform an act from seeing it done has been a major challenge in the study of social learning. To determine whether this ability is based on ‘true imitation’, many studies have applied the two-action experimental paradigm, examining whether subjects learn to perform the specific action demonstrated to them. Here, we show that the insights gained from animals' success in two-action experiments may be limited, and that a better understanding is achieved by monitoring subjects' entire behavioural repertoire. Hand-reared house sparrows that followed a model of a mother demonstrator were successful in learning to find seeds hidden under a leaf, using the action demonstrated by the mother (either pushing the leaf or pecking it). However, they also produced behaviours that had not been demonstrated but were nevertheless related to the demonstrated act. This finding suggests that while the learners were clearly influenced by the demonstrator, they did not accurately imitate her. Rather, they used their own behavioural repertoire, gradually fitting it to the demonstrated task solution through trial and error. This process is consistent with recent views on how animals learn to imitate, and may contribute to a unified process-level analysis of social learning mechanisms. PMID:28228516

  16. Using Runtime Analysis to Guide Model Checking of Java Programs

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Norvig, Peter (Technical Monitor)

    2001-01-01

    This paper describes how two runtime analysis algorithms, an existing data race detection algorithm and a new deadlock detection algorithm, have been implemented to analyze Java programs. Runtime analysis is based on the idea of executing the program once and observing the generated run to extract various kinds of information. This information can then be used to predict whether other, different runs may violate some properties of interest, in addition, of course, to demonstrating whether the generated run itself violates such properties. These runtime analyses can be performed stand-alone to generate a set of warnings. It is furthermore demonstrated how these warnings can be used to guide a model checker, thereby reducing the search space. The described techniques have been implemented in the home-grown Java model checker called PathFinder.
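The data race detection described above follows the runtime-analysis pattern: observe a single run and warn about races that could occur in other interleavings. A minimal Eraser-style lockset sketch of that idea (the trace format and variable names are hypothetical illustrations, not PathFinder's actual implementation):

```python
def lockset_warnings(trace):
    """trace: list of (thread, variable, locks_held) access events from ONE run.
    For each shared variable, intersect the lock sets held at its accesses;
    an empty intersection flags a potential race in SOME interleaving,
    even if the observed run itself had no race."""
    candidate = {}
    for thread, var, locks in trace:
        held = frozenset(locks)
        candidate[var] = candidate.get(var, held) & held
    return sorted(var for var, ls in candidate.items() if not ls)

# example trace from a single observed execution
trace = [
    ("T1", "balance", {"m"}),   # consistently protected by lock m -> no warning
    ("T2", "balance", {"m"}),
    ("T1", "counter", {"m"}),   # protected by different locks on each access,
    ("T2", "counter", {"n"}),   # so the candidate lockset becomes empty
]
# lockset_warnings(trace) flags only "counter"
```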

  17. Characterization of the Bm61 of the Bombyx mori nucleopolyhedrovirus.

    PubMed

    Shen, Hongxing; Chen, Keping; Yao, Qin; Zhou, Yang

    2009-07-01

    orf61 (bm61) of Bombyx mori Nucleopolyhedrovirus (BmNPV) is a highly conserved baculovirus gene whose function is unknown, suggesting that it performs an important role in the virus life cycle. In this study, we describe the characterization of bm61. Quantitative polymerase chain reaction (qPCR) and western blot analysis demonstrated that bm61 is expressed as a late gene. Immunofluorescence analysis by confocal microscopy showed that the BM61 protein localized to the nuclear membrane and the intranuclear ring zone of infected cells. Localization of BM61 in BV and ODV by western analysis demonstrated that BM61 is a structural protein of both BV and ODV. In addition, our data indicated that BM61 is a late structural protein localized in the nucleus.

  18. The String Stability of a Trajectory-Based Interval Management Algorithm in the Midterm Airspace

    NASA Technical Reports Server (NTRS)

    Swieringa, Kurt A.

    2015-01-01

    NASA's first Air Traffic Management (ATM) Technology Demonstration (ATD-1) was created to facilitate the transition of mature ATM technologies from the laboratory to operational use. The technologies selected for demonstration are the Traffic Management Advisor with Terminal Metering (TMA-TM), which provides precise time-based scheduling in the terminal airspace; Controller Managed Spacing (CMS), which provides terminal controllers with decision support tools enabling precise schedule conformance; and Interval Management (IM), which consists of flight deck automation that enables aircraft to achieve or maintain a precise spacing interval behind a target aircraft. As the percentage of IM-equipped aircraft increases, controllers may provide IM clearances to sequences, or strings, of IM-equipped aircraft. It is important for these strings to maintain stable performance. This paper describes an analytical evaluation of the string stability of the latest version of NASA's IM algorithm, along with a fast-time simulation designed to characterize the string performance of the algorithm. The analytical evaluation showed that the spacing algorithm has stable poles, indicating that a spacing error perturbation will be reduced as a function of string position. The fast-time simulation investigated IM operations at two airports using constraints associated with the midterm airspace, including limited information of the target aircraft's intended speed profile and limited information of the wind forecast on the target aircraft's route. The results of the fast-time simulation demonstrated that the performance of the spacing algorithm is acceptable for strings of moderate length; however, there is some degradation in IM performance as a function of string position.
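
    The string-stability property described above can be illustrated with a toy spacing law. Assuming each follower smooths its predecessor's spacing error with a stable first-order filter (pole 1-g, with 0 < g < 1), the peak error cannot grow as it propagates down the string. This is a sketch of the stable-pole idea only, not NASA's actual IM spacing algorithm; the gain value is hypothetical.

    ```python
    # Toy spacing dynamics: e_i[k] = (1-g)*e_i[k-1] + g*e_{i-1}[k].
    # The filter's impulse response is positive and sums to (at most) 1,
    # so the peak spacing error is non-increasing along the string.
    def filter_error(pred_error, g=0.4):
        """Follower's spacing error given its predecessor's error signal."""
        out, e = [], 0.0
        for u in pred_error:
            e = (1 - g) * e + g * u
            out.append(e)
        return out

    # Leader injects a one-step spacing perturbation of magnitude 10.
    signal = [10.0] + [0.0] * 49
    peaks = []
    for position in range(5):          # five aircraft behind the leader
        signal = filter_error(signal)
        peaks.append(max(abs(x) for x in signal))

    # String-stable: peak error shrinks with position in the string.
    assert all(peaks[i + 1] <= peaks[i] for i in range(len(peaks) - 1))
    ```

    An unstable pole (|1-g| >= 1) would instead amplify the perturbation from aircraft to aircraft, which is the failure mode the paper's pole analysis rules out.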

  19. Practical Techniques for Modeling Gas Turbine Engine Performance

    NASA Technical Reports Server (NTRS)

    Chapman, Jeffryes W.; Lavelle, Thomas M.; Litt, Jonathan S.

    2016-01-01

    The cost and risk associated with the design and operation of gas turbine engine systems have led to an increasing dependence on mathematical models. In this paper, the fundamentals of engine simulation will be reviewed, an example performance analysis will be performed, and relationships useful for engine control system development will be highlighted. The focus will be on thermodynamic modeling utilizing techniques common in industry, such as the Brayton cycle, component performance maps, map scaling, and design point criteria generation. In general, these topics will be viewed from the standpoint of an example turbojet engine model; however, demonstrated concepts may be adapted to other gas turbine systems, such as gas generators, marine engines, or high bypass aircraft engines. The purpose of this paper is to provide an example of gas turbine model generation and system performance analysis for educational uses, such as curriculum creation or student reference.
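
    As a minimal worked example of the Brayton-cycle starting point mentioned above: ideal cycle thermal efficiency depends only on the compressor pressure ratio and the ratio of specific heats. Real engine decks layer component maps, losses, and map scaling on top of this relation; the pressure ratio below is illustrative.

    ```python
    # Ideal Brayton cycle thermal efficiency:
    #   eta = 1 - r^(-(gamma-1)/gamma)
    # where r is the compressor pressure ratio and gamma = cp/cv.
    def brayton_efficiency(pressure_ratio, gamma=1.4):
        """Ideal Brayton cycle thermal efficiency (dimensionless)."""
        return 1.0 - pressure_ratio ** (-(gamma - 1.0) / gamma)

    eta = brayton_efficiency(10.0)   # pressure ratio typical of a small turbojet
    assert 0.47 < eta < 0.49          # roughly 48% ideal thermal efficiency
    ```

    The monotonic dependence on pressure ratio is one reason design-point selection and component map scaling matter so much in the full model.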

  20. Vitamin D and depression: a systematic review and meta-analysis comparing studies with and without biological flaws.

    PubMed

    Spedding, Simon

    2014-04-11

    Efficacy of Vitamin D supplements in depression is controversial, awaiting further literature analysis. Biological flaws in primary studies are a possible reason meta-analyses of Vitamin D have failed to demonstrate efficacy. This systematic review and meta-analysis of Vitamin D and depression compared studies with and without biological flaws. The systematic review followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. The literature search was undertaken through four databases for randomized controlled trials (RCTs). Studies were critically appraised for methodological quality and biological flaws, in relation to the hypothesis and study design. Meta-analyses were performed for studies according to the presence of biological flaws. The 15 RCTs identified provide a more comprehensive evidence base than previous systematic reviews; the methodological quality of the studies was generally good and the methodology was diverse. A meta-analysis of all studies without flaws demonstrated a statistically significant improvement in depression with Vitamin D supplements (+0.78 CI +0.24, +1.27). Studies with biological flaws were mainly inconclusive, with the meta-analysis demonstrating a statistically significant worsening in depression by taking Vitamin D supplements (-1.1 CI -0.7, -1.5). Vitamin D supplementation (≥800 I.U. daily) was somewhat favorable in the management of depression in studies that demonstrate a change in vitamin levels, and the effect size was comparable to that of anti-depressant medication.

  1. ExaSAT: An exascale co-design tool for performance modeling

    DOE PAGES

    Unat, Didem; Chan, Cy; Zhang, Weiqun; ...

    2015-02-09

    One of the emerging challenges to designing HPC systems is understanding and projecting the requirements of exascale applications. In order to determine the performance consequences of different hardware designs, analytic models are essential because they can provide fast feedback to the co-design centers and chip designers without costly simulations. However, current attempts to analytically model program performance typically rely on the user manually specifying a performance model. Here we introduce the ExaSAT framework that automates the extraction of parameterized performance models directly from source code using compiler analysis. The parameterized analytic model enables quantitative evaluation of a broad range of hardware design trade-offs and software optimizations on a variety of different performance metrics, with a primary focus on data movement as a metric. Finally, we demonstrate the ExaSAT framework's ability to perform deep code analysis of a proxy application from the Department of Energy Combustion Co-design Center to illustrate its value to the exascale co-design process. ExaSAT analysis provides insights into the hardware and software trade-offs and lays the groundwork for exploring a more targeted set of design points using cycle-accurate architectural simulators.
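
    The flavor of such analytic, data-movement-centred models can be sketched with a simple roofline-style estimate, where a kernel's time is bounded by either its compute or its memory traffic. This is an illustration of the general modeling style, not ExaSAT's actual model, and the machine parameters below are hypothetical.

    ```python
    # Roofline-style analytic estimate: predicted time is the larger of the
    # compute bound (flops / peak flop rate) and the memory bound
    # (bytes moved / memory bandwidth).
    def roofline_time(flops, bytes_moved, peak_flops=1e12, bandwidth=1e11):
        """Predicted kernel time in seconds on a hypothetical machine."""
        return max(flops / peak_flops, bytes_moved / bandwidth)

    # A stencil sweep: 8 flops and 48 bytes per point over 1e8 grid points.
    t = roofline_time(8 * 1e8, 48 * 1e8)
    # Memory-bound: 4.8e9 B / 1e11 B/s = 0.048 s dominates 8e8/1e12 = 0.0008 s.
    assert abs(t - 0.048) < 1e-9
    ```

    Because such a model is parameterized in the hardware numbers, a co-design study can sweep bandwidth or peak flops analytically and reserve cycle-accurate simulation for the interesting design points.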

  2. Prediction of SOFC Performance with or without Experiments: A Study on Minimum Requirements for Experimental Data

    DOE PAGES

    Yang, Tao; Sezer, Hayri; Celik, Ismail B.; ...

    2015-06-02

    In the present paper, a physics-based procedure combining experiments and multi-physics numerical simulations is developed for overall analysis of SOFCs operational diagnostics and performance predictions. In this procedure, essential information for the fuel cell is extracted first by utilizing empirical polarization analysis in conjunction with experiments and refined by multi-physics numerical simulations via simultaneous analysis and calibration of polarization curve and impedance behavior. The performance at different utilization cases and operating currents is also predicted to confirm the accuracy of the proposed model. It is demonstrated that, with the present electrochemical model, three air/fuel flow conditions are needed to produce a set of complete data for better understanding of the processes occurring within SOFCs. After calibration against button cell experiments, the methodology can be used to assess performance of planar cell without further calibration. The proposed methodology would accelerate the calibration process and improve the efficiency of design and diagnostics.
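
    A generic empirical polarization model of the kind fitted against button-cell data subtracts activation, ohmic, and concentration losses from the open-circuit voltage. The sketch below shows the form only; the parameter values are hypothetical, chosen to produce a plausible curve, and are not the paper's calibrated values.

    ```python
    import math

    R, T, F = 8.314, 1073.0, 96485.0   # gas constant, 800 degC in K, Faraday

    def cell_voltage(i, ocv=1.05, asr=0.15, i0=0.3, i_lim=2.0):
        """Cell voltage (V) at current density i (A/cm^2):
        OCV minus activation, ohmic, and concentration losses."""
        act = (R * T / F) * math.asinh(i / (2.0 * i0))           # activation
        ohm = i * asr                                            # ohmic (ASR)
        conc = -(R * T / (2.0 * F)) * math.log(1.0 - i / i_lim)  # concentration
        return ocv - act - ohm - conc

    v_low, v_high = cell_voltage(0.1), cell_voltage(1.0)
    assert v_low > v_high > 0.0   # voltage falls monotonically with load
    ```

    Fitting the free parameters of such a curve at several air/fuel flow conditions, then cross-checking against impedance behavior, is the calibration loop the procedure above automates.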

  3. NSEG: A segmented mission analysis program for low and high speed aircraft. Volume 3: Demonstration problems

    NASA Technical Reports Server (NTRS)

    Hague, D. S.; Rozendaal, H. L.

    1977-01-01

    Program NSEG is a rapid mission analysis code based on the use of approximate flight path equations of motion. Equation form varies with segment type (for example, accelerations, climbs, cruises, descents, and decelerations). Realistic and detailed vehicle characteristics are specified in tabular form. In addition to its mission performance calculation capabilities, the code also contains extensive flight envelope performance mapping capabilities. For example, rate-of-climb, turn rates, and energy maneuverability parameter values may be mapped in the Mach-altitude plane. Approximate takeoff and landing analyses are also performed. At high speeds, centrifugal lift effects are accounted for. Extensive turbojet and ramjet engine scaling procedures are incorporated in the code.
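
    The rate-of-climb mapping mentioned above rests on the standard specific-excess-power relation: excess thrust times velocity divided by weight. A minimal sketch, with illustrative numbers not taken from the report:

    ```python
    # Approximate flight-path relation for small climb angles:
    #   rate of climb = (T - D) * V / W
    # i.e., specific excess power available for climbing.
    def rate_of_climb(thrust, drag, velocity, weight):
        """Rate of climb (m/s) from excess thrust; forces and weight in N,
        velocity in m/s."""
        return (thrust - drag) * velocity / weight

    roc = rate_of_climb(thrust=60e3, drag=40e3, velocity=200.0, weight=500e3)
    assert abs(roc - 8.0) < 1e-9   # 20 kN excess * 200 m/s / 500 kN = 8 m/s
    ```

    Evaluating this over a grid of Mach and altitude (with thrust and drag from the tabular vehicle data) yields exactly the kind of Mach-altitude contour map the code produces.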

  4. Using tipping points of emotional intelligence and cognitive competencies to predict financial performance of leaders.

    PubMed

    Boyatzis, Richard E

    2006-01-01

    Competencies have been shown to differentiate outstanding managers and leaders from their less effective counterparts. Some of the competencies related to effectiveness reflect cognitive intelligence, but many of them are behavioral manifestations of emotional intelligence. Meanwhile, the performance measures used have often been an approximation of effectiveness. A study of leaders in a multinational consulting company shows that the frequency with which they demonstrate a variety of competencies, as seen by others, predicts financial performance in the seven quarters following the competency assessment. This study, like others, only clarifies which competencies are necessary for outstanding performance. Borrowing from complexity theory, a tipping point analysis allows examination of how much of a competency is sufficient for outstanding performance. The tipping point analysis shows an even greater impact of competencies on the financial performance measures of the leaders in the study. The emotional intelligence competencies constituted most (i.e., 13/14) of the validated competencies predicting financial performance.

  5. Redox flow cell development and demonstration project, calendar year 1976

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The major focus of the effort was the key technology issues that directly influence the fundamental feasibility of the overall redox concept. These issues were the development of a suitable semipermeable separator membrane for the system, the screening and study of candidate redox couples to achieve optimum cell performance, and the carrying out of systems analysis and modeling to develop system performance goals and cost estimates.

  6. Using spectral imaging for the analysis of abnormalities for colorectal cancer: When is it helpful?

    PubMed

    Awan, Ruqayya; Al-Maadeed, Somaya; Al-Saady, Rafif

    2018-01-01

    The spectral imaging technique has been shown to provide more discriminative information than RGB images and has been proposed for a range of problems. There are many studies demonstrating its potential for the analysis of histopathology images for abnormality detection, but there have been discrepancies among previous studies as well. Many multispectral-based methods have been proposed for histopathology images, but the significance of using the whole multispectral cube versus a subset of bands or a single band is still arguable. We performed a comprehensive analysis using individual bands and different subsets of bands to determine the effectiveness of spectral information for identifying anomalies in colorectal images. Our multispectral colorectal dataset consists of four classes, each represented by infra-red spectrum bands in addition to the visual spectrum bands. We performed our analysis of spectral imaging by stratifying the abnormalities using both spatial and spectral information. For our experiments, we used a combination of texture descriptors with an ensemble classification approach that performed best on our dataset. We applied our method to another dataset and obtained results comparable to those of the state-of-the-art method and a convolutional neural network based method. Moreover, we explored the relationship of the number of bands with the problem complexity and found that a higher number of bands is required for a complex task to achieve improved performance. Our results demonstrate a synergy between the infra-red and visual spectra, improving classification accuracy by 6% on incorporating the infra-red representation. We also highlight the importance of how the dataset should be divided into training and testing sets for evaluating histopathology image-based approaches, which has not been considered in previous studies on multispectral histopathology images.

  7. Using spectral imaging for the analysis of abnormalities for colorectal cancer: When is it helpful?

    PubMed Central

    Al-Maadeed, Somaya; Al-Saady, Rafif

    2018-01-01

    The spectral imaging technique has been shown to provide more discriminative information than RGB images and has been proposed for a range of problems. There are many studies demonstrating its potential for the analysis of histopathology images for abnormality detection, but there have been discrepancies among previous studies as well. Many multispectral-based methods have been proposed for histopathology images, but the significance of using the whole multispectral cube versus a subset of bands or a single band is still arguable. We performed a comprehensive analysis using individual bands and different subsets of bands to determine the effectiveness of spectral information for identifying anomalies in colorectal images. Our multispectral colorectal dataset consists of four classes, each represented by infra-red spectrum bands in addition to the visual spectrum bands. We performed our analysis of spectral imaging by stratifying the abnormalities using both spatial and spectral information. For our experiments, we used a combination of texture descriptors with an ensemble classification approach that performed best on our dataset. We applied our method to another dataset and obtained results comparable to those of the state-of-the-art method and a convolutional neural network based method. Moreover, we explored the relationship of the number of bands with the problem complexity and found that a higher number of bands is required for a complex task to achieve improved performance. Our results demonstrate a synergy between the infra-red and visual spectra, improving classification accuracy by 6% on incorporating the infra-red representation. We also highlight the importance of how the dataset should be divided into training and testing sets for evaluating histopathology image-based approaches, which has not been considered in previous studies on multispectral histopathology images. PMID:29874262

  8. Efficacy of antidepressive medication for depression in Parkinson disease: a network meta-analysis

    PubMed Central

    Zhuo, Chuanjun; Xue, Rong; Luo, Lanlan; Ji, Feng; Tian, Hongjun; Qu, Hongru; Lin, Xiaodong; Jiang, Ronghuan; Tao, Ran

    2017-01-01

    Abstract Background: Parkinson disease (PD) is considered the 2nd most prevalent neurodegenerative disorder after Alzheimer disease, and depression is a prevailing nonmotor symptom of PD. Typically used antidepression medication includes tricyclic antidepressants (TCA), selective serotonin reuptake inhibitors (SSRI), serotonin and norepinephrine reuptake inhibitors (SNRI), monoamine-oxidase inhibitors (MAOI), and dopamine agonists (DA). Our study aimed at evaluating the efficacy of antidepressive medications for depression in PD. Methods: Web of Science, PubMed, Embase, and the Cochrane library were searched for related articles. Traditional meta-analysis and network meta-analysis (NMA) were performed with outcomes including depression score, UPDRS-II, UPDRS-III, and adverse effects. Surface under the cumulative ranking curve (SUCRA) analysis was also performed to illustrate the rank probabilities of different medications on various outcomes. The consistency of direct and indirect evidence was also assessed by the node-splitting method. Results: Traditional pairwise meta-analysis was performed first. Concerning depression score, significant improvement was observed with DA, MAOI, SSRI, and SNRI compared with placebo. NMA was then performed, from which more information could be obtained. DA was shown to be effective over placebo, MAOI, and SNRI concerning UPDRS-III. DA also demonstrated a better prognosis in UPDRS-II scores compared with placebo and MAOI. However, DA and SSRI demonstrated a significant increase in adverse effects compared with placebo. The SUCRA value was calculated to evaluate the ranking probabilities of all medications on the investigated outcomes, and the consistency between direct and indirect evidence was assessed by the node-splitting method. Conclusion: SSRI had a satisfying efficacy for depression in PD patients and could improve activities of daily living and motor function, but its adverse effects are non-negligible. SNRI are the safest medications, with high efficacy for depression as well, while their other outcomes are relatively poor. PMID:28562526

  9. An exploration of function analysis and function allocation in the commercial flight domain

    NASA Technical Reports Server (NTRS)

    Mcguire, James C.; Zich, John A.; Goins, Richard T.; Erickson, Jeffery B.; Dwyer, John P.; Cody, William J.; Rouse, William B.

    1991-01-01

    The applicability of functional analysis methods to support cockpit design is explored. Specifically, alternative techniques are studied for ensuring an effective division of responsibility between the flight crew and automation. A functional decomposition of the commercial flight domain is performed to provide the information necessary to support allocation decisions and to demonstrate a methodology for allocating functions to the flight crew or to automation. The function analysis employed 'bottom up' and 'top down' analyses and demonstrated the comparability of the identified functions, using the 'lift off' segment of the 'take off' phase as a test case. The normal flight mission and selected contingencies were addressed. Two alternative methods for using the functional description in the allocation of functions between man and machine were investigated. The two methods were compared in order to ascertain their relative strengths and weaknesses. Finally, conclusions were drawn regarding the practical utility of function analysis methods.

  10. Airfoil Vibration Dampers program

    NASA Technical Reports Server (NTRS)

    Cook, Robert M.

    1991-01-01

    The Airfoil Vibration Damper program has consisted of an analysis phase and a testing phase. During the analysis phase, a state-of-the-art computer code was developed, which can be used to guide designers in the placement and sizing of friction dampers. The use of this computer code was demonstrated by performing representative analyses on turbine blades from the High Pressure Oxidizer Turbopump (HPOTP) and High Pressure Fuel Turbopump (HPFTP) of the Space Shuttle Main Engine (SSME). The testing phase of the program consisted of performing friction damping tests on two different cantilever beams. Data from these tests provided an empirical check on the accuracy of the computer code developed in the analysis phase. Results of the analysis and testing showed that the computer code can accurately predict the performance of friction dampers. In addition, a valuable set of friction damping data was generated, which can be used to aid in the design of friction dampers, as well as provide benchmark test cases for future code developers.

  11. Robotic solid phase extraction and high performance liquid chromatographic analysis of ranitidine in serum or plasma.

    PubMed

    Lloyd, T L; Perschy, T B; Gooding, A E; Tomlinson, J J

    1992-01-01

    A fully automated assay for the analysis of ranitidine in serum and plasma, with and without an internal standard, was validated. It utilizes robotic solid phase extraction with on-line high performance liquid chromatographic (HPLC) analysis. The ruggedness of the assay was demonstrated over a three-year period. A Zymark Py Technology II robotic system was used for serial processing from initial aspiration of samples from original collection containers, to final direct injection onto the on-line HPLC system. Automated serial processing with on-line analysis provided uniform sample history and increased productivity by freeing the chemist to analyse data and perform other tasks. The solid phase extraction efficiency was 94% throughout the assay range of 10-250 ng/mL. The coefficients of variation for within- and between-day quality control samples ranged from 1 to 6% and 1 to 5%, respectively. Mean accuracy for between-day standards and quality control results ranged from 97 to 102% of the respective theoretical concentrations.

  12. A Shot Number Based Approach to Performance Analysis in Table Tennis

    PubMed Central

    Yoshida, Kazuto; Yamada, Koshi

    2017-01-01

    Abstract The current study proposes a novel approach that improves conventional performance analysis in table tennis by introducing the concept of frequency, or the number of shots, at each shot number. The improvements over the conventional method are as follows: better accuracy in evaluating the skills and tactics of players, additional insights into scoring and returning skills, and ease of understanding the results with a single criterion. A performance analysis of matches played at the 2012 Summer Olympics in London was conducted using the proposed method. The results showed some effects of shot number and gender differences in table tennis. Furthermore, comparisons were made between Chinese players and players from other countries, which shed light on the skills and tactics of the Chinese players. The present findings demonstrate that the proposed method provides useful information and has some advantages over the conventional method. PMID:28210334

  13. Acrania/encephalocele sequence (exencephaly) associated with 92,XXXX karyotype: early prenatal diagnosis at 9(+5) weeks by 3D transvaginal ultrasound and coelocentesis.

    PubMed

    Tonni, Gabriele; Ventura, Alessandro; Bonasoni, Maria Paola

    2009-09-01

    A 27-year-old pregnant woman was diagnosed by 3D transvaginal ultrasound as carrying a fetus of 9(+5) weeks gestation affected by acrania/encephalocele (exencephaly) sequence. A 2D transvaginal ultrasound-guided aspiration of 5 mL of extra-coelomic fluid was performed under cervical block before uterine suction. Conventional cytogenetic analysis demonstrated a 92,XXXX karyotype. Transvaginal 2D ultrasound-guided coelocentesis for rapid karyotyping can be proposed to women who are near to miscarriage or in cases where a prenatal ultrasound diagnosis of congenital anomaly is performed at an early stage of development. Genetic analysis can be performed using traditional cytogenetic analysis or can be aided by fluorescence in situ hybridization (FISH). Coelocentesis may become an integral part of first trimester armamentarium and may be clinically useful in the understanding of the pathogenesis of early prenatally diagnosed congenital anomalies.

  14. Estimating Driving Performance Based on EEG Spectrum Analysis

    NASA Astrophysics Data System (ADS)

    Lin, Chin-Teng; Wu, Ruei-Cheng; Jung, Tzyy-Ping; Liang, Sheng-Fu; Huang, Teng-Yi

    2005-12-01

    The growing number of traffic accidents in recent years has become a serious concern to society. Accidents caused by a driver's drowsiness behind the steering wheel have a high fatality rate because of the marked decline in the driver's perception, recognition, and vehicle control abilities while sleepy. Preventing such accidents is highly desirable but requires techniques for continuously detecting, estimating, and predicting the level of alertness of drivers and delivering effective feedback to maintain their maximum performance. This paper proposes an EEG-based drowsiness estimation system that combines electroencephalogram (EEG) log subband power spectrum, correlation analysis, principal component analysis, and linear regression models to indirectly estimate the driver's drowsiness level in a virtual-reality-based driving simulator. Our results demonstrate that it is feasible to accurately and quantitatively estimate driving performance, expressed as the deviation between the center of the vehicle and the center of the cruising lane, in a realistic driving simulator.
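
    The processing chain described above (log band-power features, PCA projection, linear regression to a driving-performance measure) can be sketched with a small numpy-only example. The data here are synthetic stand-ins; the paper's features come from real EEG subband spectra.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))                 # 200 epochs x 10 log band powers
    true_w = rng.normal(size=10)
    y = X @ true_w + 0.1 * rng.normal(size=200)    # simulated lane-deviation score

    # PCA via SVD of the centred feature matrix; keep 3 components.
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    Z = Xc @ Vt[:3].T

    # Least-squares linear regression on the principal components.
    A = np.column_stack([Z, np.ones(len(Z))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    y_hat = A @ coef
    r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
    assert 0.0 < r2 <= 1.0   # the low-dimensional fit explains some variance
    ```

    Reducing the spectrum to a few principal components before regression is what makes a continuous, per-epoch drowsiness estimate tractable in real time.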

  15. RESOLVE OVEN Field Demonstration Unit for Lunar Resource Extraction

    NASA Technical Reports Server (NTRS)

    Paz, Aaron; Oryshchyn, Lara; Jensen, Scott; Sanders, Gerald B.; Lee, Kris; Reddington, Mike

    2013-01-01

    The Oxygen and Volatile Extraction Node (OVEN) is a subsystem within the Regolith & Environment Science and Oxygen & Lunar Volatile Extraction (RESOLVE) project. The purpose of the OVEN subsystem is to release volatiles from lunar regolith and extract oxygen by means of a hydrogen reduction reaction. The complete process includes receiving, weighing, sealing, heating, and disposing of core sample segments while transferring all gaseous contents to the Lunar Advanced Volatile Analysis (LAVA) subsystem. This document will discuss the design and performance of the OVEN Field Demonstration Unit (FDU), which participated in the 2012 RESOLVE field demonstration.

  16. Vector magnetometer design study: Analysis of a triaxial fluxgate sensor design demonstrates that all MAGSAT Vector Magnetometer specifications can be met

    NASA Technical Reports Server (NTRS)

    Adams, D. F.; Hartmann, U. G.; Lazarow, L. L.; Maloy, J. O.; Mohler, G. W.

    1976-01-01

    The design of the vector magnetometer selected for analysis is capable of exceeding the required accuracy of 5 gamma per vector field component. The principal elements that assure this performance level are very low power dissipation triaxial feedback coils surrounding ring-core fluxgates, and temperature control of the critical components of the two-loop feedback electronics. An analysis of the calibration problem points to the need for improved test facilities.

  17. Thermal Analysis of a Disposable, Instrument-Free DNA Amplification Lab-on-a-Chip Platform.

    PubMed

    Pardy, Tamás; Rang, Toomas; Tulp, Indrek

    2018-06-04

    Novel second-generation rapid diagnostics based on nucleic acid amplification tests (NAAT) offer performance metrics on par with clinical laboratories in detecting infectious diseases at the point of care. The diagnostic assay is typically performed within a Lab-on-a-Chip (LoC) component with integrated temperature regulation. However, the constraints on device dimensions, cost, and power supply inherent in the device format apply to temperature regulation as well. Thermal analysis of simplified thermal models of the device can help overcome these barriers by speeding up thermal optimization. In this work, we perform experimental thermal analysis on the simplified thermal model of our instrument-free, single-use LoC NAAT platform. The system is evaluated further by finite element modelling. Steady-state as well as transient thermal analyses are performed to evaluate the performance of a self-regulating polymer resin heating element in the proposed device geometry. Reaction volumes in the target temperature range of the amplification reaction are estimated in the simulated model to assess compliance with assay requirements. Using the proposed methodology, we demonstrated that our NAAT device concept is capable of performing loop-mediated isothermal amplification in the 20–25 °C ambient temperature range with a 32 min total assay time.
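
    The simplest version of such a transient thermal analysis is a lumped-capacitance model: one thermal mass heated by a constant-power element and losing heat to ambient by convection. This is a generic sketch of the modeling approach with hypothetical parameter values, not the paper's calibrated model (which also captures the self-regulating behavior of the resin heater).

    ```python
    # Lumped-capacitance transient model, explicit Euler integration of
    #   m*c * dT/dt = P - h*A*(T - T_amb)
    # Steady state is T_amb + P/(h*A); time constant is m*c/(h*A).
    def simulate(power=0.5, h_a=0.02, mass_c=5.0, t_amb=22.0, dt=1.0, steps=3600):
        """Return chip temperature (degC) after `steps` seconds of heating.
        power in W, h_a = h*A in W/K, mass_c = m*c in J/K."""
        temp = t_amb
        for _ in range(steps):
            temp += dt * (power - h_a * (temp - t_amb)) / mass_c
        return temp

    t_final = simulate()
    t_steady = 22.0 + 0.5 / 0.02       # T_amb + P/(hA) = 47 degC
    assert abs(t_final - t_steady) < 0.1
    ```

    Even this crude model answers the key sizing questions (steady-state temperature and warm-up time) before committing to full finite element runs.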

  18. Integrated restructurable flight control system demonstration results

    NASA Technical Reports Server (NTRS)

    Weiss, Jerold L.; Hsu, John Y.

    1987-01-01

    The purpose of this study was to examine the complementary capabilities of several restructurable flight control system (RFCS) concepts through the integration of these technologies into a complete system. Performance issues were addressed through a re-examination of RFCS functional requirements, and through a qualitative analysis of the design issues that, if properly addressed during integration, will lead to the highest possible degree of fault-tolerant performance. Software developed under previous phases of this contract and under NAS1-18004 was modified and integrated into a complete RFCS subroutine for NASA's B-737 simulation. The integration of these modules involved the development of methods for dealing with the mismatch between the outputs of the failure detection module and the input requirements of the automatic control system redesign module. The performance of this demonstration system was examined through extensive simulation trials.

  19. Task analysis exemplified: the process of resolving unfinished business.

    PubMed

    Greenberg, L S; Foerster, F S

    1996-06-01

    The steps of a task-analytic research program designed to identify the in-session performances involved in resolving lingering bad feelings toward a significant other are described. A rational-empirical methodology of repeatedly cycling between rational conjecture and empirical observations is demonstrated as a method of developing an intervention manual and the components of client processes of resolution. A refined model of the change process developed by these procedures is validated by comparing 11 successful and 11 unsuccessful performances. Four performance components (intense expression of feeling, expression of need, shift in representation of the other, and self-validation or understanding of the other) were found to discriminate between resolution and nonresolution performances. These components were measured on 4 process measures: the Structural Analysis of Social Behavior, the Experiencing Scale, the Client's Emotional Arousal Scale, and a need scale.

  20. RETRAN analysis of multiple steam generator blow down caused by an auxiliary feedwater steam-line break

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feltus, M.A.

    1987-01-01

    Analysis results for multiple steam generator blow down caused by an auxiliary feedwater steam-line break performed with the RETRAN-02 MOD 003 computer code are presented to demonstrate the capabilities of the RETRAN code to predict system transient response for verifying changes in operational procedures and supporting plant equipment modifications. A typical four-loop Westinghouse pressurized water reactor was modeled using best-estimate versus worst case licensing assumptions. This paper presents analyses performed to evaluate the necessity of implementing an auxiliary feedwater steam-line isolation modification. RETRAN transient analysis can be used to determine core cooling capability response, departure from nucleate boiling ratio (DNBR) status, and reactor trip signal actuation times.

  1. Geometrical-Based Navigation System Performance Assessment in the Space Service Volume Using a Multiglobal Navigation Satellite System Methodology

    NASA Technical Reports Server (NTRS)

    Welch, Bryan W.

    2016-01-01

    NASA is participating in the International Committee on Global Navigation Satellite Systems (GNSS) (ICG)'s efforts towards demonstrating the benefits to the space user in the Space Service Volume (SSV) when a multi-GNSS solution space approach is utilized. The ICG Working Group: Enhancement of GNSS Performance, New Services and Capabilities has started a three-phase analysis initiative as an outcome of recommendations at the ICG-10 meeting, in preparation for the ICG-11 meeting. The first phase of that initiative, whose phases increase in complexity and fidelity, is based on a purely geometrically derived access technique. The first phase of the analysis has been completed, and the results are documented in this paper.

  2. Buckling Testing and Analysis of Space Shuttle Solid Rocket Motor Cylinders

    NASA Technical Reports Server (NTRS)

    Weidner, Thomas J.; Larsen, David V.; McCool, Alex (Technical Monitor)

    2002-01-01

    A series of full-scale buckling tests were performed on the space shuttle Reusable Solid Rocket Motor (RSRM) cylinders. The tests were performed to determine the buckling capability of the cylinders and to provide data for analytical comparison. A nonlinear ANSYS Finite Element Analysis (FEA) model was used to represent and evaluate the testing. Analytical results demonstrated excellent correlation to test results, predicting the failure load within 5%. The analytical value was on the conservative side, predicting a lower failure load than was applied to the test. The resulting study and analysis indicated the important parameters for FEA to accurately predict buckling failure. The resulting method was subsequently used to establish the pre-launch buckling capability of the space shuttle system.

  3. Inactivation of Smad5 in Endothelial Cells and Smooth Muscle Cells Demonstrates that Smad5 Is Required for Cardiac Homeostasis

    PubMed Central

    Umans, Lieve; Cox, Luk; Tjwa, Marc; Bito, Virginie; Vermeire, Liesbeth; Laperre, Kjell; Sipido, Karin; Moons, Lieve; Huylebroeck, Danny; Zwijsen, An

    2007-01-01

    Smads are intracellular signaling proteins that transduce signals elicited by members of the transforming growth factor (TGF)-β superfamily. Smad5 and Smad1 are highly homologous, and they mediate primarily bone morphogenetic protein (Bmp) signals. We used the Cre-loxP system and Sm22-Cre and Tie-1-Cre mice to study the function of Smad5 in the developing blood vessel wall. Analysis of embryos demonstrated that deletion of Smad5 in endothelial or smooth muscle cells resulted in a normal organization of embryonic and extra-embryonic vasculature. Angiogenic assays performed in adult mice revealed that mutant mice display a comparable angiogenic and vascular remodeling response to control mice. In Sm22-Cre;Smad5fl/− mice, Smad5 is also deleted in cardiomyocytes. Echocardiographic analysis on those 9-month-old female mice demonstrated larger left ventricle internal diameters and decreased fractional shortening compared with control littermates without signs of cardiac hypertrophy. The decreased cardiac contractility was associated with a decreased performance in a treadmill experiment. In isolated cardiomyocytes, fractional shortening was significantly reduced compared with control cells. These data demonstrate that restricted deletion of Smad5 in the blood vessel wall results in viable mice. However, loss of Smad5 in cardiomyocytes leads to a mild heart defect. PMID:17456754

  4. Inactivation of Smad5 in endothelial cells and smooth muscle cells demonstrates that Smad5 is required for cardiac homeostasis.

    PubMed

    Umans, Lieve; Cox, Luk; Tjwa, Marc; Bito, Virginie; Vermeire, Liesbeth; Laperre, Kjell; Sipido, Karin; Moons, Lieve; Huylebroeck, Danny; Zwijsen, An

    2007-05-01

    Smads are intracellular signaling proteins that transduce signals elicited by members of the transforming growth factor (TGF)-beta superfamily. Smad5 and Smad1 are highly homologous, and they mediate primarily bone morphogenetic protein (Bmp) signals. We used the Cre-loxP system and Sm22-Cre and Tie-1-Cre mice to study the function of Smad5 in the developing blood vessel wall. Analysis of embryos demonstrated that deletion of Smad5 in endothelial or smooth muscle cells resulted in a normal organization of embryonic and extra-embryonic vasculature. Angiogenic assays performed in adult mice revealed that mutant mice display a comparable angiogenic and vascular remodeling response to control mice. In Sm22-Cre; Smad5(fl/-) mice, Smad5 is also deleted in cardiomyocytes. Echocardiographic analysis on those 9-month-old female mice demonstrated larger left ventricle internal diameters and decreased fractional shortening compared with control littermates without signs of cardiac hypertrophy. The decreased cardiac contractility was associated with a decreased performance in a treadmill experiment. In isolated cardiomyocytes, fractional shortening was significantly reduced compared with control cells. These data demonstrate that restricted deletion of Smad5 in the blood vessel wall results in viable mice. However, loss of Smad5 in cardiomyocytes leads to a mild heart defect.

  5. Comparing and contrasting poverty reduction performance of social welfare programs across jurisdictions in Canada using Data Envelopment Analysis (DEA): an exploratory study of the era of devolution.

    PubMed

    Habibov, Nazim N; Fan, Lida

    2010-11-01

    In the mid-1990s, the responsibilities to design, implement, and evaluate social welfare programs were transferred from federal to local jurisdictions in many countries of North America and Europe through devolution processes. Devolution has created the need for a technique to measure and compare the performance of social welfare programs across multiple jurisdictions. This paper utilizes Data Envelopment Analysis (DEA) for a comparison of poverty reduction performance of jurisdictional social welfare programs across Canadian provinces. From the theoretical perspective, the findings of this paper demonstrate that DEA is a promising method to evaluate, compare, and benchmark poverty reduction performance across multiple jurisdictions using multiple inputs and outputs. This paper demonstrates that DEA generates easy-to-comprehend composite rankings of provincial performance, identifies appropriate benchmarks for each inefficient province, and estimates the sources and amounts of improvement needed to make the provinces efficient. From a practical perspective, the empirical results presented in this paper indicate that Newfoundland, Prince Edward Island, and Alberta achieve better efficiency in poverty reduction than other provinces. Policy makers and social administrators of the ineffective provinces across Canada may find benefit in selecting one of the effective provinces as a benchmark for improving their own performance, based on similar size and structure of population, size of the budget for social programs, and traditions with administering particular types of social programs. Copyright (c) 2009 Elsevier Ltd. All rights reserved.
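    In the general multi-input, multi-output case, DEA solves one linear program per jurisdiction; with a single input and a single output, the constant-returns (CCR) efficiency score collapses to each unit's productivity ratio relative to the best performer. A minimal single-input/single-output sketch with made-up provincial figures (purely illustrative, not the paper's data):

```python
import numpy as np

def ccr_efficiency(inputs, outputs):
    """Single-input, single-output CCR (constant-returns) efficiency scores.

    Each unit's output/input ratio is scaled by the best observed ratio,
    so the frontier unit scores 1.0 and all others score in (0, 1)."""
    productivity = np.asarray(outputs, dtype=float) / np.asarray(inputs, dtype=float)
    return productivity / productivity.max()

# hypothetical example: program budget (input) vs. households lifted out of poverty (output)
scores = ccr_efficiency(inputs=[100.0, 80.0, 120.0], outputs=[50.0, 48.0, 54.0])
```

In this toy case the second unit defines the frontier (score 1.0); the others' scores state how much of their input the same output "should" have required, which is the benchmarking interpretation used in the paper.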

  6. Monitoring the metering performance of an electronic voltage transformer on-line based on cyber-physics correlation analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Zhu; Li, Hongbin; Tang, Dengping; Hu, Chen; Jiao, Yang

    2017-10-01

    Metering performance is the key parameter of an electronic voltage transformer (EVT), and it requires high accuracy. The conventional off-line calibration method using a standard voltage transformer is not suitable for the key equipment in a smart substation, which needs on-line monitoring. In this article, we propose a method for monitoring the metering performance of an EVT on-line based on cyber-physics correlation analysis. Exploiting the electrical and physical properties of a substation operating in three-phase symmetry, principal component analysis is used to separate the metering deviation caused by the primary fluctuation from that caused by an EVT anomaly. The characteristic statistics of the measured data during operation are extracted, and the metering performance of the EVT is evaluated by analyzing the change in these statistics. The experimental results show that the method accurately monitors the metering deviation of a Class 0.2 EVT. The method thus enables accurate on-line evaluation of the metering performance of an EVT without a standard voltage transformer.
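    The separation idea can be sketched with ordinary PCA: under three-phase symmetry the legitimate primary-side fluctuation moves all three phase channels together, so it is captured by the leading principal component, while a drifting EVT leaves excess energy in its channel's residual. A toy numpy sketch on synthetic data (an illustration of the principle, not the paper's algorithm in detail):

```python
import numpy as np

def phase_residual_energy(v_abc):
    """v_abc: (n_samples, 3) synchronized per-phase voltage magnitudes.

    Removes the dominant common-mode component (the primary fluctuation,
    rank-1 under three-phase symmetry) and returns the residual variance
    of each phase channel; an anomalous EVT channel stands out."""
    X = np.asarray(v_abc, dtype=float)
    X = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    common = np.outer(U[:, 0] * s[0], Vt[0])  # leading principal component
    return (X - common).var(axis=0)

# synthetic demo: shared primary fluctuation plus a small drift on phase B
t = np.linspace(0.0, 10.0, 500)
primary = 0.02 * np.sin(t)
v = np.stack([primary, primary + 0.005 * np.sin(3.0 * t), primary], axis=1) + 1.0
energy = phase_residual_energy(v)
```

A sustained rise in one channel's residual energy, tracked over successive windows, is the kind of characteristic statistic that flags a transformer for recalibration without a standard voltage transformer.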

  7. Analysis of PCR Thermocycling by Rayleigh-Bénard Convection

    NASA Astrophysics Data System (ADS)

    Sharma, Ruchi; Ugaz, Victor

    2004-03-01

    In previous studies, we demonstrated a novel device employing the circulatory flow field established by Rayleigh-Bénard convection to perform amplification of a 295 base target region from a human genomic DNA template inside a 35 µL cylindrical cavity using the polymerase chain reaction (PCR) [Krishnan, Ugaz & Burns, Science, Vol. 298, 2002, p. 793]. This design eliminates the need for dynamic external temperature control required in conventional thermocyclers that repeatedly heat and cool static sample volumes to denaturation, annealing, and extension temperatures. In this paper, we extend these studies by demonstrating the design and operation of a multiwell convective flow device capable of achieving amplification of a 191 base pair fragment associated with membrane channel proteins M1 and M2 of the influenza-A virus in as little as 15 minutes with performance comparable to a conventional thermocycler. We also study the effect of initial template concentration and observe no degradation in performance over four orders of magnitude of initial template loading dilution, consistent with conventional thermocycler results. These results illustrate the ability of convective flow PCR systems to achieve performance equal to or exceeding conventional thermocycling hardware, and demonstrate their suitability for use in rapid biodetection assays.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Im, Piljae; Liu, Xiaobing

    High initial costs and lack of public awareness of ground-source heat pump (GSHP) technology are the two major barriers preventing rapid deployment of this energy-saving technology in the United States. Under the American Recovery and Reinvestment Act (ARRA), 26 GSHP projects have been competitively selected and carried out to demonstrate the benefits of GSHP systems and innovative technologies for cost reduction and/or performance improvement. This paper highlights the findings of a case study of one of the ARRA-funded GSHP demonstration projects, a ground-source variable refrigerant flow (GS-VRF) system installed at the Human Health Building at Oakland University in Rochester, Michigan. This case study is based on the analysis of measured performance data, maintenance records, construction costs, and simulations of the energy consumption of conventional central heating, ventilation, and air-conditioning (HVAC) systems providing the same level of space conditioning as the demonstrated GS-VRF system. The evaluated performance metrics include the energy efficiency of the heat pump equipment and the overall GS-VRF system, pumping performance, energy savings, carbon emission reductions, and cost-effectiveness of the GS-VRF system compared with conventional HVAC systems. This case study also identified opportunities for reducing uncertainties in the performance evaluation, improving the operational efficiency, and reducing the installed cost of similar GSHP systems in the future.

  9. Business Modeling to Implement an eHealth Portal for Infection Control: A Reflection on Co-Creation With Stakeholders.

    PubMed

    van Limburg, Maarten; Wentzel, Jobke; Sanderman, Robbert; van Gemert-Pijnen, Lisette

    2015-08-13

    It is acknowledged that the success and uptake of eHealth improve with the involvement of users and stakeholders to make technology reflect their needs. Involving stakeholders in implementation research is thus a crucial element in developing eHealth technology. Business modeling is an approach to guide implementation research for eHealth. Stakeholders are involved in business modeling by identifying relevant stakeholders, conducting value co-creation dialogs, and co-creating a business model. Because implementation activities are often underestimated as a crucial step while developing eHealth, comprehensive and applicable approaches geared toward business modeling in eHealth are scarce. This paper demonstrates the potential of several stakeholder-oriented analysis methods, illustrating their practical application with Infectionmanager as an example case. In this paper, we aim to demonstrate how business modeling, with a focus on stakeholder involvement, is used to co-create an eHealth implementation. We divided business modeling into 4 main research steps. As part of stakeholder identification, we performed literature scans, expert recommendations, and snowball sampling (Step 1). For stakeholder analyses, we performed "basic stakeholder analysis," stakeholder salience, and ranking/analytic hierarchy process (Step 2). For value co-creation dialogs, we performed a process analysis and stakeholder interviews based on the business model canvas (Step 3). Finally, for business model generation, we combined all findings into the business model canvas (Step 4). Based on the applied methods, we synthesized a step-by-step guide for business modeling with stakeholder-oriented analysis methods that we consider suitable for implementing eHealth. The step-by-step guide for business modeling with stakeholder involvement enables eHealth researchers to apply a systematic and multidisciplinary, co-creative approach to implementing eHealth. Business modeling becomes an active part of the entire development process of eHealth and brings an early focus on implementation, in which stakeholders help to co-create the basis necessary for satisfying success and uptake of the eHealth technology.
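    Of the Step-2 methods, the analytic hierarchy process is the most mechanical: stakeholders are compared pairwise on a ratio scale, priority weights are taken from the principal eigenvector of the comparison matrix, and a consistency ratio checks that the judgments do not contradict themselves. A minimal sketch (illustrative comparison matrix invented here; Saaty's standard random-index values assumed):

```python
import numpy as np

# Saaty's random consistency index, keyed by matrix size
RANDOM_INDEX = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}

def ahp_priorities(pairwise):
    """Return (priority weights, consistency ratio) for a reciprocal
    pairwise-comparison matrix; CR < 0.1 is conventionally acceptable."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = int(np.argmax(eigvals.real))
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                            # normalized priority weights
    ci = (eigvals.real[k] - n) / (n - 1)       # consistency index
    cr = ci / RANDOM_INDEX[n] if RANDOM_INDEX[n] > 0 else 0.0
    return w, cr

# hypothetical: three stakeholder groups compared for influence on the portal
A = [[1.0,     3.0,     5.0],
     [1.0/3.0, 1.0,     5.0/3.0],
     [1.0/5.0, 3.0/5.0, 1.0]]
weights, cr = ahp_priorities(A)
```

Because the example matrix is perfectly consistent, the consistency ratio comes out at zero and the weights simply rank the three hypothetical groups.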

  10. Multidisciplinary High-Fidelity Analysis and Optimization of Aerospace Vehicles. Part 1; Formulation

    NASA Technical Reports Server (NTRS)

    Walsh, J. L.; Townsend, J. C.; Salas, A. O.; Samareh, J. A.; Mukhopadhyay, V.; Barthelemy, J.-F.

    2000-01-01

    An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity, finite element structural analysis and computational fluid dynamics aerodynamic analysis in a distributed, heterogeneous computing environment that includes high performance parallel computing. A software system has been designed and implemented to integrate a set of existing discipline analysis codes, some of them computationally intensive, into a distributed computational environment for the design of a high-speed civil transport configuration. The paper describes the engineering aspects of formulating the optimization by integrating these analysis codes and associated interface codes into the system. The discipline codes are integrated by using the Java programming language and a Common Object Request Broker Architecture (CORBA) compliant software product. A companion paper presents currently available results.

  11. Design and Analysis Tool for External-Compression Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    2012-01-01

    A computational tool named SUPIN has been developed to design and analyze external-compression supersonic inlets for aircraft at cruise speeds from Mach 1.6 to 2.0. The inlet types available include the axisymmetric outward-turning, two-dimensional single-duct, two-dimensional bifurcated-duct, and streamline-traced Busemann inlets. The aerodynamic performance is characterized by the flow rates, total pressure recovery, and drag. The inlet flowfield is divided into parts to provide a framework for the geometry and aerodynamic modeling, and the parts are defined in terms of geometric factors. The low-fidelity aerodynamic analysis and design methods are based on analytic, empirical, and numerical methods which provide for quick analysis. SUPIN provides inlet geometry in the form of coordinates and surface grids useable by grid generation methods for higher-fidelity computational fluid dynamics (CFD) analysis. SUPIN is demonstrated through a series of design studies, and CFD analyses were performed to verify some of the analysis results.

  12. Computer assessment of interview data using latent semantic analysis.

    PubMed

    Dam, Gregory; Kaufmann, Stefan

    2008-02-01

    Clinical interviews are a powerful method for assessing students' knowledge and conceptual development. However, the analysis of the resulting data is time-consuming and can create a "bottleneck" in large-scale studies. This article demonstrates the utility of computational methods in supporting such an analysis. Thirty-four 7th-grade student explanations of the causes of Earth's seasons were assessed using latent semantic analysis (LSA). Analyses were performed on transcriptions of student responses during interviews administered prior to (n = 21) and after (n = 13) receiving earth science instruction. An instrument that uses LSA technology was developed to identify misconceptions and assess conceptual change in students' thinking. Its accuracy, as determined by comparing its classifications to the independent coding performed by four human raters, reached 90%. Techniques for adapting LSA technology to support the analysis of interview data, as well as some limitations, are discussed.
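    The core of an LSA-based classifier is simple: build a term-document matrix over reference answers and student transcripts, truncate its SVD to a low-dimensional semantic space, and label each response by its cosine similarity to the reference answers in that space. A toy numpy sketch (tiny hand-built count matrix; a real system would use a large training corpus and a weighting scheme such as log-entropy):

```python
import numpy as np

def lsa_doc_similarity(term_doc, k=2):
    """term_doc: (n_terms, n_docs) term-count matrix.

    Projects documents into a k-dimensional latent semantic space via
    truncated SVD and returns the (n_docs, n_docs) cosine-similarity matrix."""
    U, s, Vt = np.linalg.svd(np.asarray(term_doc, dtype=float), full_matrices=False)
    docs = (np.diag(s[:k]) @ Vt[:k]).T              # doc coordinates, (n_docs, k)
    norms = np.linalg.norm(docs, axis=1, keepdims=True)
    docs = docs / np.clip(norms, 1e-12, None)       # unit-length rows
    return docs @ docs.T

# toy matrix: docs 0 and 1 share vocabulary, doc 2 uses disjoint terms
M = np.array([[2, 2, 0],
              [1, 1, 0],
              [0, 0, 3],
              [0, 0, 1]], dtype=float)
S = lsa_doc_similarity(M, k=2)
```

In an interview-coding instrument, the columns would be canonical correct and misconception answers plus one transcript; the reference column with the highest similarity supplies the transcript's label.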

  13. Quality assessment of crude and processed Arecae semen based on colorimeter and HPLC combined with chemometrics methods.

    PubMed

    Sun, Meng; Yan, Donghui; Yang, Xiaolu; Xue, Xingyang; Zhou, Sujuan; Liang, Shengwang; Wang, Shumei; Meng, Jiang

    2017-05-01

    Raw Arecae Semen, the seed of Areca catechu L., as well as Arecae Semen Tostum and Arecae semen carbonisata are traditionally processed by stir-baking for subsequent use in a variety of clinical applications. These three Arecae semen types, important Chinese herbal drugs, have been used in China and other Asian countries for thousands of years. In this study, the sensory technologies of a colorimeter and sensitive validated high-performance liquid chromatography with diode array detection were employed to discriminate raw Arecae semen and its processed drugs. The color parameters of the samples were determined by a colorimeter instrument CR-410. Moreover, the fingerprints of the four alkaloids of arecaidine, guvacine, arecoline and guvacoline were surveyed by high-performance liquid chromatography. Subsequently, Student's t test, the analysis of variance, fingerprint similarity analysis, hierarchical cluster analysis, principal component analysis, factor analysis and Pearson's correlation test were performed for final data analysis. The results obtained demonstrated a significant color change characteristic for components in raw Arecae semen and its processed drugs. Crude and processed Arecae semen could be determined based on colorimetry and high-performance liquid chromatography with a diode array detector coupled with chemometrics methods for a comprehensive quality evaluation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
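    Colorimeters such as the CR-410 report CIELAB coordinates (L*, a*, b*), and a stir-baking-induced color change between raw and processed samples is conventionally summarized by the Euclidean color difference ΔE*ab. A minimal sketch using the CIE76 formula (the Lab readings below are invented for illustration, not the study's measurements):

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 color difference between two (L*, a*, b*) triples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

# hypothetical readings: raw seed vs. a strongly stir-baked (carbonized) seed
raw = (55.0, 8.0, 20.0)
carbonized = (30.0, 4.0, 8.0)
diff = delta_e_cie76(raw, carbonized)
```

A large ΔE*ab between processing grades is what makes color parameters usable, alongside the HPLC alkaloid fingerprints, as a discriminating variable in the chemometric models.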

  14. Space Life-Support Engineering Program

    NASA Technical Reports Server (NTRS)

    Seagrave, Richard C. (Principal Investigator)

    1995-01-01

    This report covers the seventeen months of work performed under an extended one year NASA University Grant awarded to Iowa State University to perform research on topics relating to the development of closed-loop long-term life support systems with the initial principal focus on space water management. In the first phase of the program, investigators from chemistry and chemical engineering with demonstrated expertise in systems analysis, thermodynamics, analytical chemistry and instrumentation, performed research and development in two major related areas; the development of low-cost, accurate, and durable sensors for trace chemical and biological species, and the development of unsteady-state simulation packages for use in the development and optimization of control systems for life support systems. In the second year of the program, emphasis was redirected towards concentrating on the development of dynamic simulation techniques and software and on performing a thermodynamic systems analysis, centered on availability or energy analysis, in an effort to begin optimizing the systems needed for water purification. The third year of the program, the subject of this report, was devoted to the analysis of the water balance for the interaction between humans and the life support system during space flight and exercise, to analysis of the cardiopulmonary systems of humans during space flight, and to analysis of entropy production during operation of the air recovery system during space flight.

  15. An integrated approach to demonstrating the ANR pathway of proanthocyanidin biosynthesis in plants.

    PubMed

    Peng, Qing-Zhong; Zhu, Yue; Liu, Zhong; Du, Ci; Li, Ke-Gang; Xie, De-Yu

    2012-09-01

    Proanthocyanidins (PAs) are oligomers or polymers of plant flavan-3-ols and are important to plant adaptation in extreme environmental conditions. The characterization of anthocyanidin reductase (ANR) and leucoanthocyanidin reductase (LAR) has demonstrated the different biogenesis of four stereo-configurations of flavan-3-ols. It is important to understand whether ANR and the ANR pathway widely occur in the plant kingdom. Here, we report an integrated approach to demonstrate the ANR pathway in plants. This includes different methods to extract native ANR from different tissues of eight angiosperm plants (Lotus corniculatus, Desmodium uncinatum, Medicago sativa, Hordeum vulgare, Vitis vinifera, Vitis bellula, Parthenocissus heterophylla, and Cerasus serrulata) and one fern plant (Dryopteris pycnopteroides), a general enzymatic analysis approach to demonstrate the ANR activity, high-performance liquid chromatography-based fingerprinting to demonstrate (-)-epicatechin and other flavan-3-ol molecules, and phytochemical analysis of PAs. Results demonstrate that, in addition to leaves of M. sativa, tissues of the other eight plants contain an active ANR pathway. In particular, the leaves, flowers and pods of D. uncinatum, which is a model plant to study LAR and the LAR pathways, are demonstrated to express an active ANR pathway. This finding suggests that the ANR pathway is involved in PA biosynthesis in D. uncinatum. In addition, a sequence BLAST analysis reveals that ANR homologs have been sequenced in plants from both gymnosperms and angiosperms. These data show that the ANR pathway to PA biosynthesis occurs in both seed and seedless vascular plants.

  16. Novel optical gyroscope: proof of principle demonstration and future scope

    PubMed Central

    Srivastava, Shailesh; Rao D. S., Shreesha; Nandakumar, Hari

    2016-01-01

    We report the first proof-of-principle demonstration of the resonant optical gyroscope with reflector that we have recently proposed. The device is very different from traditional optical gyroscopes since it uses the inherent coupling between the clockwise and counterclockwise propagating waves to sense the rotation. Our demonstration confirms our theoretical analysis and simulations. We also demonstrate a novel method of biasing the gyroscope using orthogonal polarization states. The simplicity of the structure and the readout method, the theoretically predicted high sensitivities (better than 0.001 deg/hr), and the possibility of further performance enhancement using a related laser based active device, all have immense potential for attracting fresh research and technological initiatives. PMID:27694987

  17. Forest cover from Landsat Thematic Mapper data for use in the Catahoula Ranger District geographic information system.

    Treesearch

    David L. Evans

    1994-01-01

    A forest cover classification of the Kisatchie National Forest, Catahoula Ranger District, was performed with Landsat Thematic Mapper data. Data base retrievals and map products from this analysis demonstrated the use of Landsat for forest management decisions.

  18. Validating a Geographical Image Retrieval System.

    ERIC Educational Resources Information Center

    Zhu, Bin; Chen, Hsinchun

    2000-01-01

    Summarizes a prototype geographical image retrieval system that demonstrates how to integrate image processing and information analysis techniques to support large-scale content-based image retrieval. Describes an experiment to validate the performance of this image retrieval system against that of human subjects by examining similarity analysis…

  19. Further development of the sound intensity method of measuring tire noise performance of in-situ pavements.

    DOT National Transportation Integrated Search

    2006-01-01

    Through analysis of earlier research and some recent on-road testing it is demonstrated that, with adequate precaution, accurate measurement of tire/pavement noise using on-board sound intensity (SI) can be accomplished with two intensity probes ...

  20. 1995 Truck Size and Weight Performance-Based Workshop report : activity 5 : document North American and European experiences

    DOT National Transportation Integrated Search

    2012-08-01

    This report presents the test plan for conducting the Technical Capability Analysis for the United States Department of Transportation (U.S. DOT) evaluation of the San Diego Integrated Corridor Management (ICM) Initiative Demonstration. The ICM proje...

  1. DYNALIST II : A Computer Program for Stability and Dynamic Response Analysis of Rail Vehicle Systems : Volume 4. Revised User's Manual.

    DOT National Transportation Integrated Search

    1976-07-01

    The Federal Railroad Administration (FRA) is sponsoring research, development, and demonstration programs to provide improved safety, performance, speed, reliability, and maintainability of rail transportation systems at reduced life-cycle costs. A m...

  2. Fuzzy logic application for modeling man-in-the-loop space shuttle proximity operations. M.S. Thesis - MIT

    NASA Technical Reports Server (NTRS)

    Brown, Robert B.

    1994-01-01

    A software pilot model for Space Shuttle proximity operations is developed, utilizing fuzzy logic. The model is designed to emulate a human pilot during the terminal phase of a Space Shuttle approach to the Space Station. The model uses the same sensory information available to a human pilot and is based upon existing piloting rules and techniques determined from analysis of human pilot performance. Such a model is needed to generate numerous rendezvous simulations to various Space Station assembly stages for analysis of current NASA procedures and plume impingement loads on the Space Station. The advantages of a fuzzy logic pilot model are demonstrated by comparing its performance with NASA's man-in-the-loop simulations and with a similar model based upon traditional Boolean logic. The fuzzy model is shown to respond well from a number of initial conditions, with results typical of an average human. In addition, the ability to model different individual piloting techniques and new piloting rules is demonstrated.
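    A fuzzy piloting rule such as "if the closing rate is too fast, brake firmly; if slightly fast, brake gently" can be encoded with triangular membership functions and weighted-average defuzzification. A toy one-input sketch (made-up membership breakpoints, not the thesis's actual rule base):

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 at a and c, 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def braking_command(closing_rate):
    """Map a closing rate (m/s) to a braking command in [0, 1] by
    centroid-style (weighted average) defuzzification of three rules."""
    rules = [
        (tri(closing_rate, -0.5, 0.0, 0.5), 0.0),   # "on target"      -> no braking
        (tri(closing_rate, 0.0, 0.5, 1.0), 0.5),    # "slightly fast"  -> gentle braking
        (tri(closing_rate, 0.5, 1.0, 10.0), 1.0),   # "too fast"       -> firm braking
    ]
    total = sum(mu for mu, _ in rules)
    return sum(mu * out for mu, out in rules) / total if total else 0.0
```

Because intermediate inputs fire adjacent rules partially, the output blends smoothly between braking levels, which is what lets such a model approximate an average human pilot rather than the hard on/off switching of a Boolean rule set.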

  3. Precarious employment in Chile: psychometric properties of the Chilean version of Employment Precariousness Scale in private sector workers.

    PubMed

    Vives-Vergara, Alejandra; González-López, Francisca; Solar, Orielle; Bernales-Baksai, Pamela; González, María José; Benach, Joan

    2017-04-20

    The purpose of this study is to perform a psychometric analysis (acceptability, reliability and factor structure) of the Chilean version of the new Employment Precariousness Scale (EPRES). The data are drawn from a sample of 4,248 private salaried workers with a formal contract from the first Chilean Employment Conditions, Work, Health and Quality of Life (ENETS) survey, applied to a nationally representative sample of the Chilean workforce in 2010. Item- and scale-level statistics were computed to assess scaling properties, acceptability and reliability. The six-dimensional factor structure was examined with confirmatory factor analysis. The scale exhibited high acceptability (roughly 80%) and reliability (Cronbach's alpha 0.83), and the factor structure was confirmed. One subscale (rights) demonstrated poorer metric properties without compromising the overall scale. The Chilean version of the Employment Precariousness Scale (EPRES-Ch) demonstrated good metric properties, pointing to its suitability for use in epidemiologic and public health research.
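    The reported reliability statistic is mechanical to reproduce: Cronbach's alpha compares the sum of the item variances with the variance of the total score. A minimal sketch (synthetic item matrix, not the ENETS data):

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) matrix of item scores.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)),
    using sample variances (ddof=1)."""
    X = np.asarray(items, dtype=float)
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1).sum()
    total_var = X.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars / total_var)
```

Perfectly parallel items give alpha = 1, and weakly related items pull it down, which is why a value of 0.83 across the EPRES items is read as high internal consistency.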

  4. Performance analysis of a multispectral system for mine detection in the littoral zone

    NASA Astrophysics Data System (ADS)

    Hargrove, John T.; Louchard, Eric

    2004-09-01

    Science & Technology International (STI) has developed, under contract with the Office of Naval Research, a system of multispectral airborne sensors and processing algorithms capable of detecting mine-like objects in the surf zone. STI has used this system to detect mine-like objects in a littoral environment as part of blind tests at Kaneohe Marine Corps Base Hawaii, and Panama City, Florida. The airborne and ground subsystems are described. The detection algorithm is graphically illustrated. We report on the performance of the system configured to operate without a human in the loop. A subsurface (underwater bottom proud mine in the surf zone and moored mine in shallow water) mine detection capability is demonstrated in the surf zone, and in shallow water with wave spillage and foam. Our analysis demonstrates that this STI-developed multispectral airborne mine detection system provides a technical foundation for a viable mine counter-measures system for use prior to an amphibious assault.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boggess, A.

    Existing models and simulants of tank disposition media at SRS have presumed the presence of high concentrations of inorganic mercury. However, recent quarterly tank analyses show that mercury is present as organomercurial species at concentrations that may present challenges to remediation and disposition and may exceed the Saltstone Waste Acceptance Criteria (WAC). To date, methylmercury analysis for Savannah River Remediation (SRR) has been performed off-site by Eurofins Scientific (Lancaster, PA). A series of optimization and validation experiments has been performed at SRNL, which has resulted in the development of on-site organomercury speciation capabilities using purge-and-trap gas chromatography coupled with thermal desorption cold vapor atomic fluorescence spectroscopy (P&T GC/CVAFS). Speciation has been achieved for methylmercury, with a method reporting limit (MRL) of 1.42 pg. Results obtained by SRNL from the analysis of past quarterly samples from tanks 21, 40, and 50 have demonstrated statistically indistinguishable concentration values compared with the concentration data obtained from Eurofins, while the data from SRNL have demonstrated significantly improved precision and processing time.

  6. Evidence for Cognitive Remediation Therapy in Young People with Anorexia Nervosa: Systematic Review and Meta-analysis of the Literature.

    PubMed

    Tchanturia, Kate; Giombini, Lucia; Leppanen, Jenni; Kinnaird, Emma

    2017-07-01

    Cognitive remediation therapy (CRT) for eating disorders has demonstrated promising findings in adult age groups, with randomised treatment trials and systematic reviews demonstrating medium to large effect sizes in improved cognitive performance. In recent years, several case series have been conducted for young people with anorexia nervosa, but these findings have not been synthesised in the form of a systematic review. This systematic review aimed to evaluate the evidence for the efficacy of CRT in child and adolescent age groups. Nine studies were identified, with a subsequent meta-analysis suggesting improvements in cognitive performance with small effect sizes. Patient feedback was positive, with low dropout rates. These findings suggest that CRT has potential as a supplementary treatment for young people with anorexia nervosa, warranting further investigation using randomised treatment trials. Copyright © 2017 John Wiley & Sons, Ltd and Eating Disorders Association.

  7. Active vibration damping of the Space Shuttle remote manipulator system

    NASA Technical Reports Server (NTRS)

    Scott, Michael A.; Gilbert, Michael G.; Demeo, Martha E.

    1991-01-01

    The feasibility of providing active damping augmentation of the Space Shuttle Remote Manipulator System (RMS) following normal payload handling operations is investigated. The approach used in the analysis is described, and the results for both linear and nonlinear performance analysis of candidate laws are presented, demonstrating that significant improvement in the RMS dynamic response can be achieved through active control using measured RMS tip acceleration data for feedback.

  8. Implementing a Quantitative Analysis Design Tool for Future Generation Interfaces

    DTIC Science & Technology

    2012-03-01

    with Remotely Piloted Aircraft (RPA) has resulted in the need for a platform to evaluate interface design. The Vigilant Spirit Control Station (VSCS) ...Spirit interface. A modified version of the HCI Index was successfully applied to perform a quantitative analysis of the baseline VSCS interface and...time of the original VSCS interface. These results revealed the effectiveness of the tool and demonstrated in the design of future generation

  9. Preparation and structural characterization of poly-mannose synthesized by phosphoric acid catalyzation under microwave irradiation.

    PubMed

    Wang, Haisong; Cheng, Xiangrong; Shi, Yonghui; Le, Guowei

    2015-05-05

    Poly-mannose with a molecular weight of 2.457 kDa was synthesized for the first time using d-mannose as substrate and phosphoric acid as catalyst under microwave irradiation. The optimum reaction conditions were a microwave output power of 900 W, temperature of 115°C, proton concentration of 2.5 mol/L, and microwave irradiation time of 5 min. The actual maximum yield was 91.46%. After purification by Sephadex G-25 column chromatography, the structural features of the poly-mannose were investigated by high-performance anion-exchange chromatography (HPAEC), high-performance gel-permeation chromatography (HPGPC), infrared (IR) spectroscopy, methylation analysis and NMR spectroscopy analysis ((1)H, (13)C, COSY, TOCSY, HMQC, and HMBC). HPAEC analysis showed that the synthetic polysaccharide was composed of d-mannose, its purity was demonstrated by HPGPC as a single symmetrical sharp peak, and IR spectra additionally demonstrated the polymerization of d-mannose. Methylation analysis and NMR spectroscopy revealed that the backbone of the poly-mannose consisted of (1→3)-linked β-d-Manp, (1→3)-linked α-d-Manp, and (1→6)-linked α-d-Manp residues, and that the main chain was branched at the O-2, O-3, O-4, and O-6 positions. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. Characterizing shallow secondary clarifier performance where conventional flux theory over-estimates allowable solids loading rate.

    PubMed

    Daigger, Glen T; Siczka, John S; Smith, Thomas F; Frank, David A; McCorquodale, J A

    The performance characteristics of relatively shallow (3.3 and 3.7 m sidewater depth in 30.5 m diameter) activated sludge secondary clarifiers were extensively evaluated during a 2-year testing program at the City of Akron Water Reclamation Facility (WRF), Ohio, USA. Testing included hydraulic and solids loading stress tests and measurement of sludge characteristics (zone settling velocity (ZSV), dispersed and flocculated total suspended solids), and the results were used to calibrate computational fluid dynamic (CFD) models of the various clarifiers tested. The results demonstrated that good performance could be sustained at surface overflow rates in excess of 3 m/h, as long as the clarifier influent mixed liquor suspended solids (MLSS) concentration was controlled to below critical values. The limiting solids loading rate (SLR) was significantly lower than the value predicted by conventional solids flux analysis based on the measured ZSV/MLSS relationship. CFD analysis suggested that this occurred because mixed liquor entering the clarifier was being directed into the settled sludge blanket, diluting it and creating a 'thin' concentration sludge blanket overlying the thicker concentration sludge blanket typically expected. These results indicate the need to determine the allowable SLR for shallow clarifiers using approaches other than traditional solids flux analysis. A combination of actual testing and CFD analysis is demonstrated here to be effective in doing so.
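
    The conventional solids flux analysis that the study found to over-estimate allowable loading can be sketched numerically. The snippet below is a minimal illustration, assuming a Vesilind settling model v(X) = v0·exp(-k·X) with made-up parameters rather than the Akron ZSV data:

```python
import numpy as np

def limiting_slr(v0, k, underflow_rate, concentrations):
    """Conventional solids flux analysis with a Vesilind settling model
    v(X) = v0 * exp(-k * X)  (v0 in m/h, X in kg/m^3).
    Total flux = gravity flux + bulk (underflow) flux; the limiting
    solids loading rate is the local minimum of the total-flux curve."""
    X = np.asarray(concentrations, dtype=float)
    gravity_flux = X * v0 * np.exp(-k * X)   # kg/m^2/h
    bulk_flux = X * underflow_rate           # kg/m^2/h
    total = gravity_flux + bulk_flux
    # search past the local maximum for the limiting minimum
    i_max = int(np.argmax(total))
    i_lim = i_max + int(np.argmin(total[i_max:]))
    return X[i_lim], total[i_lim]

X_grid = np.linspace(0.5, 15, 500)           # kg/m^3 MLSS range
Xl, slr = limiting_slr(v0=7.0, k=0.4, underflow_rate=0.5,
                       concentrations=X_grid)
print(f"limiting concentration ~{Xl:.1f} kg/m3, SLR ~{slr:.1f} kg/m2/h")
```

    The paper's point is that in shallow clarifiers the real limiting SLR can fall below this theoretical value, so a flux-theory estimate should be treated as an upper bound to be confirmed by stress testing or CFD.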

  11. Statistical Analysis for Collision-free Boson Sampling.

    PubMed

    Huang, He-Liang; Zhong, Han-Sen; Li, Tan; Li, Feng-Guang; Fu, Xiang-Qun; Zhang, Shuo; Wang, Xiang; Bao, Wan-Su

    2017-11-10

    Boson sampling is strongly believed to be intractable for classical computers but solvable with photons in linear optics, and has attracted widespread attention as a rapid route to demonstrating quantum supremacy. However, because its solution is mathematically unverifiable, certifying the experimental results is a major difficulty in boson sampling experiments. Here, we develop a statistical analysis scheme to experimentally certify collision-free boson sampling. Numerical simulations are performed to show the feasibility and practicability of our scheme, and the effects of realistic experimental conditions are also considered, demonstrating that the proposed scheme is experimentally friendly. Moreover, this broad approach is expected to apply generally to the investigation of multi-particle coherent dynamics beyond boson sampling.
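
    Statistical certification of sampler outputs along these lines can be illustrated with a simple goodness-of-fit test. This is a generic sketch, not the authors' scheme: it checks whether observed output-mode counts reject a uniform "classical spoofer" null hypothesis, using hypothetical outcome probabilities:

```python
import numpy as np
from scipy.stats import chisquare

rng = np.random.default_rng(0)

# Hypothetical output-state probabilities standing in for an ideal
# boson-sampling distribution, vs. a uniform (spoofing) alternative.
n_outcomes = 20
p_ideal = rng.dirichlet(np.ones(n_outcomes))
samples = rng.choice(n_outcomes, size=5000, p=p_ideal)
observed = np.bincount(samples, minlength=n_outcomes)

# Certification test: does the observed frequency histogram reject
# the uniform-sampler null hypothesis?
stat, p_value = chisquare(observed)  # null: uniform over outcomes
print(f"chi2={stat:.1f}, p={p_value:.3g}")
```

    A small p-value rejects the uniform sampler; a full certification scheme would of course test against richer classical alternatives as well.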

  12. Parallel Algorithms for Monte Carlo Particle Transport Simulation on Exascale Computing Architectures

    NASA Astrophysics Data System (ADS)

    Romano, Paul Kollath

    Monte Carlo particle transport methods are being considered as a viable option for high-fidelity simulation of nuclear reactors. While Monte Carlo methods offer several potential advantages over deterministic methods, there are a number of algorithmic shortcomings that would prevent their immediate adoption for full-core analyses. In this thesis, algorithms are proposed both to ameliorate the degradation in parallel efficiency typically observed for large numbers of processors and to offer a means of decomposing large tally data that will be needed for reactor analysis. A nearest-neighbor fission bank algorithm was proposed and subsequently implemented in the OpenMC Monte Carlo code. A theoretical analysis of the communication pattern shows that the expected cost is O(√N), whereas traditional fission bank algorithms are O(N) at best. The algorithm was tested on two supercomputers, the Intrepid Blue Gene/P and the Titan Cray XK7, and demonstrated nearly linear parallel scaling up to 163,840 processor cores on a full-core benchmark problem. An algorithm for reducing network communication arising from tally reduction was analyzed and implemented in OpenMC. The proposed algorithm groups particle histories into batches for tally purposes only within a single processor; in doing so, it avoids all network communication for tallies until the very end of the simulation. The algorithm was tested, again on a full-core benchmark, and shown to reduce network communication substantially. A model was developed to predict the impact of load imbalances on the performance of domain decomposed simulations. The analysis demonstrated that load imbalances in domain decomposed simulations arise from two distinct phenomena: non-uniform particle densities and non-uniform spatial leakage. The dominant performance penalty for domain decomposition was shown to come from these physical effects rather than insufficient network bandwidth or high latency. 
The model predictions were verified with measured data from simulations in OpenMC on a full-core benchmark problem. Finally, a novel algorithm for decomposing large tally data was proposed, analyzed, and implemented and tested in OpenMC. The algorithm relies on disjoint sets of compute processes and tally servers. The analysis showed that for a range of parameters relevant to LWR analysis, the tally server algorithm should perform with minimal overhead. Tests were performed on Intrepid and Titan and demonstrated that the algorithm did indeed perform well over a wide range of parameters. (Copies available exclusively from MIT Libraries, libraries.mit.edu/docs - docs@mit.edu)
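
    The communication-cost argument can be made concrete with a toy model. The byte counts below are illustrative only; what matters is the O(N) versus O(√N) scaling of the two fission-bank strategies:

```python
import math

def traditional_cost(n_particles, bytes_per_site=32):
    """Master-slave fission bank: the master gathers and rebroadcasts
    the entire bank each generation, so traffic scales as O(N)."""
    return 2 * n_particles * bytes_per_site

def nearest_neighbor_cost(n_particles, bytes_per_site=32):
    """Nearest-neighbor exchange: the expected imbalance crossing each
    processor boundary scales with sqrt(N) (a random-walk argument),
    so expected traffic is O(sqrt(N))."""
    return math.sqrt(n_particles) * bytes_per_site

for n in (10**5, 10**6, 10**7):
    ratio = traditional_cost(n) / nearest_neighbor_cost(n)
    print(f"N={n:>8}: traditional/nearest-neighbor ~ {ratio:,.0f}x")
```

    The advantage grows without bound as the particle count rises, which is why the nearest-neighbor scheme scales to very large core counts.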

  13. Energy Efficient Engine: Combustor component performance program

    NASA Technical Reports Server (NTRS)

    Dubiel, D. J.

    1986-01-01

    The results of the Combustor Component Performance analysis as developed under the Energy Efficient Engine (EEE) program are presented. This study was conducted to demonstrate the aerothermal and environmental goals established for the EEE program and to identify areas where refinements might be made to meet future combustor requirements. In this study, a full annular combustor test rig was used to establish emission levels and combustor performance for comparison with those indicated by the supporting technology program. In addition, a combustor sector test rig was employed to examine differences in emissions and liner temperatures obtained during the full annular performance and supporting technology tests.

  14. Comparison of Appendectomy Outcomes Between Senior General Surgeons and General Surgery Residents.

    PubMed

    Siam, Baha; Al-Kurd, Abbas; Simanovsky, Natalia; Awesat, Haitham; Cohn, Yahav; Helou, Brigitte; Eid, Ahmed; Mazeh, Haggi

    2017-07-01

    In some centers, the presence of a senior general surgeon (SGS) is obligatory in every procedure, including appendectomy, while in others it is not. There is a relative paucity in the literature of reports comparing the outcomes of appendectomies performed by unsupervised general surgery residents (GSRs) with those performed in the presence of an SGS. To compare the outcomes of appendectomies performed by SGSs with those performed by GSRs. A retrospective analysis was performed of all patients 16 years or older operated on for assumed acute appendicitis between January 1, 2008, and December 31, 2015. The cohort study compared appendectomies performed by SGSs and GSRs in the general surgical department of a teaching hospital. The primary outcome measured was the postoperative early and late complication rates. Secondary outcomes included time from emergency department to operating room, length of surgery, surgical technique (open or laparoscopic), use of laparoscopic staplers, and overall duration of postoperative antibiotic treatment. Among 1649 appendectomy procedures (mean [SD] patient age, 33.7 [13.3] years; 612 female [37.1%]), 1101 were performed by SGSs and 548 by GSRs. Analysis demonstrated no significant difference between the SGS group and the GSR group in overall postoperative early and late complication rates, the use of imaging techniques, time from emergency department to operating room, percentage of complicated appendicitis, postoperative length of hospital stay, and overall duration of postoperative antibiotic treatment. However, length of surgery was significantly shorter in the SGS group than in the GSR group (mean [SD], 39.9 [20.9] vs 48.6 [20.2] minutes; P < .001). This study demonstrates that unsupervised surgical residents may safely perform appendectomies, with no difference in postoperative early and late complication rates compared with those performed in the presence of an SGS.
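
    Group comparisons like the operative-time result can be reproduced from published summary statistics alone. The sketch below applies a Welch two-sample t test to the reported means, SDs, and group sizes:

```python
import math
from scipy import stats

def welch_from_summary(m1, s1, n1, m2, s2, n2):
    """Welch two-sample t test reconstructed from reported
    mean / SD / n (no raw data needed)."""
    se2_1, se2_2 = s1**2 / n1, s2**2 / n2
    t = (m2 - m1) / math.sqrt(se2_1 + se2_2)
    # Welch-Satterthwaite degrees of freedom
    df = (se2_1 + se2_2) ** 2 / (se2_1**2 / (n1 - 1) + se2_2**2 / (n2 - 1))
    p = 2 * stats.t.sf(abs(t), df)
    return t, df, p

# Reported operative times: SGS 39.9 (SD 20.9) min, n=1101;
# GSR 48.6 (SD 20.2) min, n=548.
t, df, p = welch_from_summary(39.9, 20.9, 1101, 48.6, 20.2, 548)
print(f"t={t:.2f}, df={df:.0f}, p={p:.2e}")
```

    The resulting p-value is consistent with the reported P < .001 for the length-of-surgery difference.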

  15. A Pulsed Thermographic Imaging System for Detection and Identification of Cotton Foreign Matter

    PubMed Central

    Kuzy, Jesse; Li, Changying

    2017-01-01

    Detection of foreign matter in cleaned cotton is instrumental to accurately grading cotton quality, which in turn impacts the marketability of the cotton. Current grading systems return estimates of the amount of foreign matter present, but provide no information about the identity of the contaminants. This paper explores the use of pulsed thermographic analysis to detect and identify cotton foreign matter. The design and implementation of a pulsed thermographic analysis system is described. A sample set of 240 foreign matter and cotton lint samples were collected. Hand-crafted waveform features and frequency-domain features were extracted and analyzed for statistical significance. Classification was performed on these features using linear discriminant analysis and support vector machines. Using waveform features and support vector machine classifiers, detection of cotton foreign matter was performed with 99.17% accuracy. Using frequency-domain features and linear discriminant analysis, identification was performed with 90.00% accuracy. These results demonstrate that pulsed thermographic imaging analysis produces data which is of significant utility for the detection and identification of cotton foreign matter. PMID:28273848
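
    The classification stage described above can be sketched with scikit-learn. The features and labels below are synthetic stand-ins for the paper's 240-sample dataset, so the accuracies are illustrative only:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Synthetic stand-ins for the paper's features: rows are samples,
# columns are waveform / frequency-domain descriptors; labels mark
# cotton lint (0) vs. foreign matter (1).
n, d = 240, 6
y = rng.integers(0, 2, size=n)
X = rng.normal(size=(n, d)) + y[:, None] * 1.5  # class-separated clusters

for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("SVM", SVC(kernel="linear"))]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: {acc:.2%} cross-validated accuracy")
```

    As in the paper, which classifier wins depends on the feature set: LDA suits well-separated Gaussian-like features, while an SVM can trade a more flexible boundary for extra tuning.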

  16. Guideline update for the performance of fusion procedures for degenerative disease of the lumbar spine. Part 3: assessment of economic outcome.

    PubMed

    Ghogawala, Zoher; Whitmore, Robert G; Watters, William C; Sharan, Alok; Mummaneni, Praveen V; Dailey, Andrew T; Choudhri, Tanvir F; Eck, Jason C; Groff, Michael W; Wang, Jeffrey C; Resnick, Daniel K; Dhall, Sanjay S; Kaiser, Michael G

    2014-07-01

    A comprehensive economic analysis generally involves the calculation of indirect and direct health costs from a societal perspective as opposed to simply reporting costs from a hospital or payer perspective. Hospital charges for a surgical procedure must be converted to cost data when performing a cost-effectiveness analysis. Once cost data has been calculated, quality-adjusted life year data from a surgical treatment are calculated by using a preference-based health-related quality-of-life instrument such as the EQ-5D. A recent cost-utility analysis from a single study has demonstrated the long-term (over an 8-year time period) benefits of circumferential fusions over stand-alone posterolateral fusions. In addition, economic analysis from a single study has found that lumbar fusion for selected patients with low-back pain can be recommended from an economic perspective. Recent economic analysis, from a single study, finds that femoral ring allograft might be more cost-effective compared with a specific titanium cage when performing an anterior lumbar interbody fusion plus posterolateral fusion.

  17. An overview of technical considerations when using quantitative real-time PCR analysis of gene expression in human exercise research

    PubMed Central

    Yan, Xu; Bishop, David J.

    2018-01-01

    Gene expression analysis by quantitative PCR in skeletal muscle is routine in exercise studies. The reproducibility and reliability of the data fundamentally depend on how the experiments are performed and interpreted. Despite the popularity of the assay, there is considerable variation in experimental protocols and data analyses between laboratories, and there is a lack of consistent quality control throughout the assay. In this study, we present a number of experiments on various steps of the quantitative PCR workflow, and demonstrate how to perform a quantitative PCR experiment with human skeletal muscle samples in an exercise study. We also examined some common mistakes in performing qPCR. Interestingly, we found that mishandling of muscle for a short time span (10 min) before RNA extraction did not affect RNA quality, and that isolated total RNA was preserved for up to one week at room temperature. As demonstrated by our data, the use of unstable reference genes leads to substantial differences in the final results. Alternatively, cDNA content can be used for data normalisation; however, complete removal of RNA from cDNA samples is essential for obtaining accurate cDNA content. PMID:29746477
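
    Reference-gene normalisation of qPCR data is commonly done with the 2^-ΔΔCt method; a minimal sketch with hypothetical Ct values:

```python
def delta_delta_ct(ct_target_treated, ct_ref_treated,
                   ct_target_control, ct_ref_control):
    """Relative expression by the 2^-ddCt method: each sample's target
    Ct is first normalised to a reference gene, then the treated sample
    is expressed relative to the control."""
    d_treated = ct_target_treated - ct_ref_treated
    d_control = ct_target_control - ct_ref_control
    return 2 ** -(d_treated - d_control)

# Hypothetical Ct values: post-exercise muscle vs. rest, with a
# stable reference gene at Ct 18 in both conditions.
fold = delta_delta_ct(24.0, 18.0, 26.0, 18.0)
print(fold)  # 4.0: a 2-cycle shift equals a 4-fold increase
```

    The paper's warning follows directly from the formula: if the reference gene's Ct itself shifts with exercise, that shift propagates one-for-one into the apparent fold change.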

  18. Effects of Pilates on muscle strength, postural balance and quality of life of older adults: a randomized, controlled, clinical trial

    PubMed Central

    Campos de Oliveira, Laís; Gonçalves de Oliveira, Raphael; Pires-Oliveira, Deise Aparecida de Almeida

    2015-01-01

    [Purpose] The aim of the present study was to determine the effects of Pilates on lower leg strength, postural balance and the health-related quality of life (HRQoL) of older adults. [Subjects and Methods] Thirty-two older adults were randomly allocated either to the experimental group (EG, n = 16; mean age, 63.62 ± 1.02 years), which performed two sessions of Pilates per week for 12 weeks, or to the control group (CG, n = 16; mean age, 64.21 ± 0.80), which performed two sessions of static stretching per week for 12 weeks. The following evaluations were performed before and after the interventions: isokinetic torque of knee extensors and flexors at 300°/s, the Timed Up and Go (TUG) test, the Berg Balance Scale, and the Health Survey assessment (SF-36). [Results] In the intra-group analysis, the EG demonstrated significant improvement in all variables. In the inter-group analysis, the EG demonstrated significant improvement in most variables. [Conclusion] Pilates exercises led to significant improvement in isokinetic torque of the knee extensors and flexors, postural balance and aspects of the health-related quality of life of older adults. PMID:25931749

  19. Design and Test Research on Cutting Blade of Corn Harvester Based on Bionic Principle.

    PubMed

    Tian, Kunpeng; Li, Xianwang; Zhang, Bin; Chen, Qiaomin; Shen, Cheng; Huang, Jicheng

    2017-01-01

    Existing corn harvester cutting blades suffer from large cutting resistance, high energy consumption, and poor cut quality. Using bionics principles, a bionic blade was designed by extracting the cutting tooth profile curve of the B. horsfieldi palate. Using a double-blade cutting device testing system, a single-stalk cutting performance contrast test was carried out on corn stalks obtained at harvest time. Results show that the bionic blade has superior performance, demonstrated by strong cutting ability and good cut quality. Statistical analysis of the two groups of cutting test data gave average maximum cutting forces of 480.24 N for the bionic blade and 551.31 N for the ordinary blade, and average cutting energies of 3.91 J and 4.38 J, respectively. The average maximum cutting force and cutting energy of the bionic blade were thus reduced by 12.89% and 10.73%, respectively. Variance analysis showed that blade type had a significant effect on both the maximum cutting force and the cutting energy required to cut a corn stalk. This demonstrates that bionic blades outperform ordinary blades in reducing cutting force and energy consumption.
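
    The reported percentage reductions follow directly from the group means; a quick arithmetic check:

```python
# Reported averages (ordinary vs. bionic blade)
force_ord, force_bio = 551.31, 480.24    # maximum cutting force, N
energy_ord, energy_bio = 4.38, 3.91      # cutting energy, J

def pct_reduction(baseline, improved):
    """Percentage reduction of `improved` relative to `baseline`."""
    return 100 * (baseline - improved) / baseline

print(f"force reduction:  {pct_reduction(force_ord, force_bio):.2f}%")
print(f"energy reduction: {pct_reduction(energy_ord, energy_bio):.2f}%")
```

    Both values reproduce the paper's figures of 12.89% and 10.73%.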

  20. CFD Analysis and Design of Detailed Target Configurations for an Accelerator-Driven Subcritical System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kraus, Adam; Merzari, Elia; Sofu, Tanju

    2016-08-01

    High-fidelity analysis has been utilized in the design of beam target options for an accelerator-driven subcritical system. Designs featuring stacks of plates with square cross section have been investigated for both tungsten and uranium target materials. The presented work includes the first thermal-hydraulic simulations of the full, detailed target geometry. The innovative target cooling manifold design features many regions with complex flow features, including 90° bends and merging jets, which necessitate three-dimensional fluid simulations. These were performed using the commercial computational fluid dynamics code STAR-CCM+. Conjugate heat transfer was modeled between the plates, cladding, manifold structure, and fluid. Steady-state simulations were performed but lacked good residual convergence. Unsteady simulations were then performed, which converged well and demonstrated that a flow instability existed in the lower portion of the manifold. It was established that the flow instability had little effect on the peak plate temperatures, which were well below the melting point. The estimated plate surface temperatures and target region pressure were shown to provide sufficient margin to subcooled boiling for standard operating conditions. This demonstrated the safety of both potential target configurations during normal operation.

  1. Effects of Elastic Resistance Exercise on Muscle Strength and Functional Performance in Healthy Adults: A Systematic Review and Meta-Analysis.

    PubMed

    de Oliveira, Poliana Alves; Blasczyk, Juscelino Castro; Souza Junior, Gerson; Lagoa, Karina Ferreira; Soares, Milene; de Oliveira, Ricardo Jacó; Filho, Paulo José Barbosa Gutierres; Carregaro, Rodrigo Luiz; Martins, Wagner Rodrigues

    2017-04-01

    Elastic Resistance Exercise (ERE) has already demonstrated its effectiveness in older adults and, when combined with the resistance generated by fixed loads, in adults. This review summarizes the effectiveness of ERE performed as an isolated method on muscle strength and functional performance in healthy adults. A database search was performed (MEDLINE, Cochrane Library, PEDro and Web of Knowledge) to identify controlled clinical trials in the English language. The mean difference (MD) with 95% confidence intervals (CIs) and overall effect size were calculated for all comparisons. The PEDro scale was used to assess methodological quality. Of the 93 articles identified by the search strategy, 5 met the inclusion criteria, of which 3 were of high quality (PEDro > 6). Meta-analyses demonstrated that the effects of ERE were superior to passive control on functional performance and muscle strength. When compared with active controls, the effect of ERE was inferior on functional performance and similar on muscle strength. ERE is effective for improving functional performance and muscle strength when compared with no intervention in healthy adults. ERE is not superior to other methods of resistance training for improving functional performance and muscle strength in healthy adults.
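
    A fixed-effect pooled mean difference with a 95% CI, as computed in such meta-analyses, can be sketched as follows (the per-study effects and standard errors below are hypothetical, not the review's data):

```python
import math

def pooled_md(studies):
    """Fixed-effect inverse-variance pooling of mean differences.
    Each study is (mean_difference, standard_error)."""
    weights = [1 / se**2 for _, se in studies]
    md = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    return md, (md - 1.96 * se, md + 1.96 * se)

# Hypothetical per-study strength gains (kg) vs. passive control
studies = [(2.0, 0.8), (3.1, 1.0), (1.5, 0.6)]
md, (lo, hi) = pooled_md(studies)
print(f"MD={md:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

    A CI that excludes zero, as here, corresponds to the review's finding of a significant advantage over passive control.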

  2. A Whole-Tumor Histogram Analysis of Apparent Diffusion Coefficient Maps for Differentiating Thymic Carcinoma from Lymphoma.

    PubMed

    Zhang, Wei; Zhou, Yue; Xu, Xiao-Quan; Kong, Ling-Yan; Xu, Hai; Yu, Tong-Fu; Shi, Hai-Bin; Feng, Qing

    2018-01-01

    To assess the performance of a whole-tumor histogram analysis of apparent diffusion coefficient (ADC) maps in differentiating thymic carcinoma from lymphoma, and to compare it with that of a commonly used hot-spot region-of-interest (ROI)-based ADC measurement. Diffusion-weighted imaging data of 15 patients with thymic carcinoma and 13 patients with lymphoma were retrospectively collected and processed with a mono-exponential model. ADC measurements were performed using a histogram-based and a hot-spot-ROI-based approach. In the histogram-based approach, the following parameters were generated: mean ADC (ADCmean), median ADC (ADCmedian), 10th and 90th percentiles of ADC (ADC10 and ADC90), kurtosis, and skewness. The difference in ADCs between thymic carcinoma and lymphoma was compared using a t test. Receiver operating characteristic (ROC) analyses were conducted to determine and compare the differentiating performance of the ADCs. Lymphoma demonstrated significantly lower ADCmean, ADCmedian, ADC10, ADC90, and hot-spot-ROI-based mean ADC than thymic carcinoma (all p values < 0.05). No differences were found in kurtosis (p = 0.412) or skewness (p = 0.273). ADC10 demonstrated the optimal differentiating performance (cut-off value, 0.403 × 10⁻³ mm²/s; area under the ROC curve [AUC], 0.977; sensitivity, 92.3%; specificity, 93.3%), followed by ADCmean, ADCmedian, ADC90, and the hot-spot-ROI-based mean ADC. The AUC of ADC10 was significantly higher than that of the hot-spot-ROI-based ADC (0.977 vs. 0.797, p = 0.036). Compared with the commonly used hot-spot-ROI-based ADC measurement, a histogram analysis of ADC maps can improve the differentiation between thymic carcinoma and lymphoma.
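
    The histogram features and the AUC comparison can be sketched as follows; the synthetic ADC maps below merely imitate the reported lymphoma-vs-carcinoma separation and are not the study's data:

```python
import numpy as np
from scipy.stats import kurtosis, skew
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)

def histogram_features(adc_map):
    """Whole-tumor histogram descriptors of an ADC map (values in
    10^-3 mm^2/s): mean, median, 10th/90th percentiles, kurtosis, skewness."""
    v = np.ravel(adc_map)
    return dict(mean=v.mean(), median=np.median(v),
                p10=np.percentile(v, 10), p90=np.percentile(v, 90),
                kurtosis=kurtosis(v), skewness=skew(v))

# Synthetic tumours: lymphoma voxel ADCs shifted lower than carcinoma.
lymphoma  = [rng.normal(0.6, 0.15, 500) for _ in range(13)]
carcinoma = [rng.normal(1.0, 0.20, 500) for _ in range(15)]
p10 = [histogram_features(t)["p10"] for t in lymphoma + carcinoma]
labels = [0] * 13 + [1] * 15                 # 0 = lymphoma, 1 = carcinoma

auc = roc_auc_score(labels, p10)
print(f"AUC of ADC10: {auc:.3f}")
```

    Percentile features such as ADC10 summarise the whole voxel distribution, which is why they can outperform a single hand-placed hot-spot ROI.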

  3. The Impact of Nursing Home Pay-for-Performance on Quality and Medicare Spending: Results from the Nursing Home Value-Based Purchasing Demonstration.

    PubMed

    Grabowski, David C; Stevenson, David G; Caudry, Daryl J; O'Malley, A James; Green, Lisa H; Doherty, Julia A; Frank, Richard G

    2017-08-01

    To evaluate the impact of the Nursing Home Value-Based Purchasing demonstration on quality of care and Medicare spending. Administrative and qualitative data from Arizona, New York, and Wisconsin nursing homes over the base-year (2008-2009) and 3-year (2009-2012) demonstration period. Nursing homes were randomized to the intervention in New York, while the comparison facilities were constructed via propensity score matching in Arizona and Wisconsin. We used a difference-in-difference analysis to compare outcomes across the base-year relative to outcomes in each of the three demonstration years. To provide context and assist with interpretation of results, we also interviewed staff members at participating facilities. Medicare savings were observed in Arizona in the first year only and Wisconsin for the first 2 years; no savings were observed in New York. The demonstration did not systematically impact any of the quality measures. Discussions with nursing home administrators suggested that facilities made few, if any, changes in response to the demonstration, leading us to conclude that the observed savings likely reflected regression to the mean rather than true savings. The Federal nursing home pay-for-performance demonstration had little impact on quality or Medicare spending. © Health Research and Educational Trust.
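
    The difference-in-differences estimator used in the evaluation reduces to a simple arithmetic contrast; a sketch with hypothetical spending figures:

```python
def did_estimate(treat_pre, treat_post, control_pre, control_post):
    """Difference-in-differences: the change in the treated group
    minus the change in the matched comparison group."""
    return (treat_post - treat_pre) - (control_post - control_pre)

# Hypothetical mean Medicare spending per resident-year ($):
effect = did_estimate(treat_pre=42000, treat_post=41000,
                      control_pre=41500, control_post=41300)
print(effect)  # -800: an $800 apparent saving attributable to the program
```

    As the authors caution, an apparent saving of this kind can also arise from regression to the mean in the matched groups, which is why the qualitative interviews mattered for interpreting the estimate.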

  4. Comprehensive and quantitative profiling of lipid species in human milk, cow milk and a phospholipid-enriched milk formula by GC and MS/MSALL.

    PubMed

    Sokol, Elena; Ulven, Trond; Færgeman, Nils J; Ejsing, Christer S

    2015-06-01

    Here we present a workflow for in-depth analysis of milk lipids that combines gas chromatography (GC) for fatty acid (FA) profiling and a shotgun lipidomics routine termed MS/MSALL for structural characterization of molecular lipid species. To evaluate the performance of the workflow we performed a comparative lipid analysis of human milk, cow milk, and Lacprodan® PL-20, a phospholipid-enriched milk protein concentrate for infant formula. The GC analysis showed that human milk and Lacprodan have a similar FA profile with higher levels of unsaturated FAs as compared to cow milk. In-depth lipidomic analysis by MS/MSALL revealed that each type of milk sample comprised a distinct composition of molecular lipid species. Lipid class composition showed that human and cow milk contain a higher proportion of triacylglycerols (TAGs) as compared to Lacprodan. Notably, the MS/MSALL analysis demonstrated that the similar FA profile of human milk and Lacprodan determined by GC analysis is attributed to the composition of individual TAG species in human milk and glycerophospholipid species in Lacprodan. Moreover, the analysis of TAG molecules in Lacprodan and cow milk showed a high proportion of short-chain FAs that could not be monitored by GC analysis. The results presented here show that complementary GC and MS/MSALL analysis is a powerful approach for characterization of molecular lipid species in milk and milk products. Practical applications: Milk lipid analysis is routinely performed using gas chromatography. This method reports the total fatty acid composition of all milk lipids, but provides no structural or quantitative information about individual lipid molecules in milk or milk products. Here we present a workflow that integrates gas chromatography for fatty acid profiling and a shotgun lipidomics routine termed MS/MSALL for structural analysis and quantification of molecular lipid species. 
We demonstrate the efficacy of this complementary workflow by a comparative analysis of molecular lipid species in human milk, cow milk, and a milk-based supplement used for infant formula.

  5. Comprehensive and quantitative profiling of lipid species in human milk, cow milk and a phospholipid-enriched milk formula by GC and MS/MSALL

    PubMed Central

    Sokol, Elena; Ulven, Trond; Færgeman, Nils J; Ejsing, Christer S

    2015-01-01

    Here we present a workflow for in-depth analysis of milk lipids that combines gas chromatography (GC) for fatty acid (FA) profiling and a shotgun lipidomics routine termed MS/MSALL for structural characterization of molecular lipid species. To evaluate the performance of the workflow we performed a comparative lipid analysis of human milk, cow milk, and Lacprodan® PL-20, a phospholipid-enriched milk protein concentrate for infant formula. The GC analysis showed that human milk and Lacprodan have a similar FA profile with higher levels of unsaturated FAs as compared to cow milk. In-depth lipidomic analysis by MS/MSALL revealed that each type of milk sample comprised a distinct composition of molecular lipid species. Lipid class composition showed that human and cow milk contain a higher proportion of triacylglycerols (TAGs) as compared to Lacprodan. Notably, the MS/MSALL analysis demonstrated that the similar FA profile of human milk and Lacprodan determined by GC analysis is attributed to the composition of individual TAG species in human milk and glycerophospholipid species in Lacprodan. Moreover, the analysis of TAG molecules in Lacprodan and cow milk showed a high proportion of short-chain FAs that could not be monitored by GC analysis. The results presented here show that complementary GC and MS/MSALL analysis is a powerful approach for characterization of molecular lipid species in milk and milk products. Practical applications: Milk lipid analysis is routinely performed using gas chromatography. This method reports the total fatty acid composition of all milk lipids, but provides no structural or quantitative information about individual lipid molecules in milk or milk products. Here we present a workflow that integrates gas chromatography for fatty acid profiling and a shotgun lipidomics routine termed MS/MSALL for structural analysis and quantification of molecular lipid species. 
We demonstrate the efficacy of this complementary workflow by a comparative analysis of molecular lipid species in human milk, cow milk, and a milk-based supplement used for infant formula. PMID:26089741

  6. Analysis of Metallized Teflon(trademark) Film Materials Performance on Satellites

    NASA Technical Reports Server (NTRS)

    Pippin, H. Gary; Normand, Eugene; Wolf, Suzanne L. B.; Kamenetzky, Rachel; Kauffman, William J., Jr. (Technical Monitor)

    2002-01-01

    Laboratory and on-orbit performance data for two common thermal control materials, silver- and aluminum-backed (metallized) fluorinated ethylene propylene (FEP), were collected from a variety of sources and analyzed. This paper demonstrates that the change in solar absorptance, α, is a strong function of particulate radiation for these materials. Examination of additional data shows that the atomic oxygen recession rate is a strong function of solar exposure, with an induction period of between 25 and 50 equivalent solar hours. The relationships determined in this analysis were incorporated into an electronic knowledge base, the 'Spacecraft Materials Selector,' under NASA contract NAS8-98213.

  7. Integrated approaches for reducing sample size for measurements of trace elemental impurities in plutonium by ICP-OES and ICP-MS

    DOE PAGES

    Xu, Ning; Chamberlin, Rebecca M.; Thompson, Pam; ...

    2017-10-07

    This study has demonstrated, through three case studies, that bulk plutonium chemical analysis can be performed at small scales (<50 mg of material). Analytical methods were developed for ICP-OES and ICP-MS instruments to measure trace impurities and gallium content in plutonium metals with comparable or improved detection limits, measurement accuracy, and precision. In two case studies, the sample size was reduced by a factor of 10, and in the third case study by as much as a factor of 5000, so that the plutonium chemical analysis can be performed in a facility rated for lower-hazard and lower-security operations.

  8. NASA Runway Incursion Prevention System (RIPS) Dallas-Fort Worth Demonstration Performance Analysis

    NASA Technical Reports Server (NTRS)

    Cassell, Rick; Evers, Carl; Esche, Jeff; Sleep, Benjamin; Jones, Denise R. (Technical Monitor)

    2002-01-01

    NASA's Aviation Safety Program Synthetic Vision System project conducted a Runway Incursion Prevention System (RIPS) flight test at the Dallas-Fort Worth International Airport in October 2000. The RIPS research system includes advanced displays, airport surveillance system, data links, positioning system, and alerting algorithms to provide pilots with enhanced situational awareness, supplemental guidance cues, a real-time display of traffic information, and warnings of runway incursions. This report describes the aircraft and ground based runway incursion alerting systems and traffic positioning systems (Automatic Dependent Surveillance - Broadcast (ADS-B) and Traffic Information Service - Broadcast (TIS-B)). A performance analysis of these systems is also presented.

  9. Integrated approaches for reducing sample size for measurements of trace elemental impurities in plutonium by ICP-OES and ICP-MS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Ning; Chamberlin, Rebecca M.; Thompson, Pam

    This study has demonstrated that bulk plutonium chemical analysis can be performed at small scales (≤50 mg of material) through three case studies. Analytical methods were developed for ICP-OES and ICP-MS instruments to measure trace impurities and gallium content in plutonium metals with comparable or improved detection limits, measurement accuracy, and precision. In two case studies, the sample size was reduced 10-fold, and in the third case study, by as much as 5000-fold, so that plutonium chemical analysis can be performed in a facility rated for lower-hazard and lower-security operations.

  10. Assessing technical performance in differential gene expression experiments with external spike-in RNA control ratio mixtures.

    PubMed

    Munro, Sarah A; Lund, Steven P; Pine, P Scott; Binder, Hans; Clevert, Djork-Arné; Conesa, Ana; Dopazo, Joaquin; Fasold, Mario; Hochreiter, Sepp; Hong, Huixiao; Jafari, Nadereh; Kreil, David P; Łabaj, Paweł P; Li, Sheng; Liao, Yang; Lin, Simon M; Meehan, Joseph; Mason, Christopher E; Santoyo-Lopez, Javier; Setterquist, Robert A; Shi, Leming; Shi, Wei; Smyth, Gordon K; Stralis-Pavese, Nancy; Su, Zhenqiang; Tong, Weida; Wang, Charles; Wang, Jian; Xu, Joshua; Ye, Zhan; Yang, Yong; Yu, Ying; Salit, Marc

    2014-09-25

    There is a critical need for standard approaches to assess, report and compare the technical performance of genome-scale differential gene expression experiments. Here we assess technical performance with a proposed standard 'dashboard' of metrics derived from analysis of external spike-in RNA control ratio mixtures. These control ratio mixtures with defined abundance ratios enable assessment of diagnostic performance of differentially expressed transcript lists, limit of detection of ratio (LODR) estimates and expression ratio variability and measurement bias. The performance metrics suite is applicable to analysis of a typical experiment, and here we also apply these metrics to evaluate technical performance among laboratories. An interlaboratory study using identical samples shared among 12 laboratories with three different measurement processes demonstrates generally consistent diagnostic power across 11 laboratories. Ratio measurement variability and bias are also comparable among laboratories for the same measurement process. We observe different biases for measurement processes using different mRNA-enrichment protocols.

  11. Line Fluid Actuated Valve Development Program. [for application on the space shuttle

    NASA Technical Reports Server (NTRS)

    Lynch, R. A.

    1975-01-01

    The feasibility of a line-fluid actuated valve design for potential application as a propellant-control valve on the space shuttle was examined. Design and analysis studies of two prototype valve units were conducted, and the demonstrated performance is reported. It was shown that the line-fluid actuated valve concept offers distinct weight and electrical advantages over alternate valve concepts. Summaries of projected performance and design goals are also included.

  12. Uncertainty aggregation and reduction in structure-material performance prediction

    NASA Astrophysics Data System (ADS)

    Hu, Zhen; Mahadevan, Sankaran; Ao, Dan

    2018-02-01

    An uncertainty aggregation and reduction framework is presented for structure-material performance prediction. Different types of uncertainty sources, structural analysis model, and material performance prediction model are connected through a Bayesian network for systematic uncertainty aggregation analysis. To reduce the uncertainty in the computational structure-material performance prediction model, Bayesian updating using experimental observation data is investigated based on the Bayesian network. It is observed that the Bayesian updating results will have large error if the model cannot accurately represent the actual physics, and that this error will be propagated to the predicted performance distribution. To address this issue, this paper proposes a novel uncertainty reduction method by integrating Bayesian calibration with model validation adaptively. The observation domain of the quantity of interest is first discretized into multiple segments. An adaptive algorithm is then developed to perform model validation and Bayesian updating over these observation segments sequentially. Only information from observation segments where the model prediction is highly reliable is used for Bayesian updating; this is found to increase the effectiveness and efficiency of uncertainty reduction. A composite rotorcraft hub component fatigue life prediction model, which combines a finite element structural analysis model and a material damage model, is used to demonstrate the proposed method.
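    The adaptive, segment-wise updating idea described above can be illustrated with a toy grid-based sketch. This is not the paper's rotorcraft fatigue model; the Gaussian likelihood, identity forward model, and simple posterior-mean validation check below are all assumptions chosen purely for illustration:

    ```python
    import numpy as np

    def gaussian_likelihood(y, mu, sigma):
        """Gaussian measurement likelihood evaluated over a parameter grid."""
        return np.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

    def segmented_bayes_update(theta_grid, prior, model, observations, segments,
                               tol=0.5, sigma=0.1):
        """Update the prior only on observation segments where the model is judged
        reliable (here: posterior-mean prediction within `tol` of the segment data)."""
        post = prior.copy()
        for lo, hi in segments:
            seg = [y for y in observations if lo <= y < hi]
            if not seg:
                continue
            pred = model(np.sum(theta_grid * post))  # prediction at posterior mean
            if abs(pred - np.mean(seg)) > tol:
                continue  # validation fails: skip Bayesian updating on this segment
            for y in seg:
                post *= gaussian_likelihood(y, model(theta_grid), sigma)
                post /= post.sum()
        return post
    ```

    With observations clustered near a true parameter value inside a validated segment, the posterior concentrates there; observations in a segment that fails the validation check leave the prior untouched, mimicking the selective use of reliable segments.
    
    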

  13. Performance optimization for rotors in hover and axial flight

    NASA Technical Reports Server (NTRS)

    Quackenbush, T. R.; Wachspress, D. A.; Kaufman, A. E.; Bliss, D. B.

    1989-01-01

    Performance optimization for rotors in hover and axial flight is a topic of continuing importance to rotorcraft designers. The aim of this Phase 1 effort has been to demonstrate that a linear optimization algorithm could be coupled to an existing influence coefficient hover performance code. This code, dubbed EHPIC (Evaluation of Hover Performance using Influence Coefficients), uses a quasi-linear wake relaxation to solve for the rotor performance. The coupling was accomplished by expanding the matrix of linearized influence coefficients in EHPIC to accommodate design variables and deriving new coefficients for linearized equations governing perturbations in power and thrust. These coefficients formed the input to a linear optimization analysis, which used the flow tangency conditions on the blade and in the wake to impose equality constraints on the expanded system of equations; user-specified inequality constraints were also employed to bound the changes in the design. It was found that this locally linearized analysis could be invoked to predict a design change that would produce a reduction in the power required by the rotor at constant thrust. Thus, an efficient search for improved versions of the baseline design can be carried out while retaining the accuracy inherent in a free wake/lifting surface performance analysis.

  14. Design framework for a spectral mask for a plenoptic camera

    NASA Astrophysics Data System (ADS)

    Berkner, Kathrin; Shroff, Sapna A.

    2012-01-01

    Plenoptic cameras are designed to capture different combinations of light rays from a scene, sampling its lightfield. Such camera designs, capturing directional ray information, enable applications such as digital refocusing, rotation, or depth estimation. Only a few designs address capturing spectral information of the scene. It has been demonstrated that by modifying a plenoptic camera with a filter array containing different spectral filters inserted in the pupil plane of the main lens, sampling of the spectral dimension of the plenoptic function is achieved. As a result, the plenoptic camera is turned into a single-snapshot multispectral imaging system that trades off spatial against spectral information captured with a single sensor. Little work has been performed so far on analyzing the effects of diffraction and aberrations of the optical system on the performance of the spectral imager. In this paper we demonstrate simulation of a spectrally-coded plenoptic camera optical system via wave propagation analysis, evaluate the quality of the spectral measurements captured at the detector plane, and demonstrate opportunities for optimization of the spectral mask for a few sample applications.

  15. Virtual lab demonstrations improve students' mastery of basic biology laboratory techniques.

    PubMed

    Maldarelli, Grace A; Hartmann, Erica M; Cummings, Patrick J; Horner, Robert D; Obom, Kristina M; Shingles, Richard; Pearlman, Rebecca S

    2009-01-01

    Biology laboratory classes are designed to teach concepts and techniques through experiential learning. Students who have never performed a technique must be guided through the process, which is often difficult to standardize across multiple lab sections. Visual demonstration of laboratory procedures is a key element in teaching pedagogy. The main goals of the study were to create videos explaining and demonstrating a variety of lab techniques that would serve as teaching tools for undergraduate and graduate lab courses and to assess the impact of these videos on student learning. Demonstrations of individual laboratory procedures were videotaped and then edited with iMovie. Narration for the videos was edited with Audacity. Undergraduate students were surveyed anonymously prior to and following screening to assess the impact of the videos on student lab performance by completion of two Participant Perception Indicator surveys. A total of 203 and 171 students completed the pre- and posttesting surveys, respectively. Statistical analyses were performed to compare student perceptions of knowledge of, confidence in, and experience with the lab techniques before and after viewing the videos. Eleven demonstrations were recorded. Chi-square analysis revealed a significant increase in the number of students reporting increased knowledge of, confidence in, and experience with the lab techniques after viewing the videos. Incorporation of instructional videos as prelaboratory exercises has the potential to standardize techniques and to promote successful experimental outcomes.

  16. Airplane numerical simulation for the rapid prototyping process

    NASA Astrophysics Data System (ADS)

    Roysdon, Paul F.

    Airplane Numerical Simulation for the Rapid Prototyping Process is a comprehensive research investigation into the most up-to-date methods for airplane development and design. Uses of modern engineering software tools, like MATLAB and Excel, are presented with examples of batch and optimization algorithms that combine the computing power of MATLAB with robust aerodynamic tools like XFOIL and AVL. The resulting data is demonstrated in the development and use of a full non-linear six-degrees-of-freedom simulator. The applications for this numerical tool-box range from unmanned aerial vehicles to first-order analysis of manned aircraft. A Blended-Wing-Body airplane is used for the analysis to demonstrate the flexibility of the code from classic wing-and-tail configurations to less common configurations like the blended-wing-body. This configuration has been shown to have superior aerodynamic performance, in contrast to its classic wing-and-tube-fuselage counterparts, reduced sensitivity to aerodynamic flutter, and potential for increased engine noise abatement. Of course, without a classic tail elevator to damp the nose-up pitching moment, and without a vertical tail rudder to damp yaw and possible rolling aerodynamics, the challenges in lateral roll and yaw stability, as well as pitching moment, are not insignificant. This thesis work applies the tools necessary to perform airplane development and optimization on a rapid basis, demonstrating the strength of this tool through examples and comparison of the results to similar airplane performance characteristics published in the literature.

  17. Functional integration of PCR amplification and capillary electrophoresis in a microfabricated DNA analysis device

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woolley, A.T.; deMello, A.J.; Mathies, R.A.

    Microfabricated silicon PCR reactors and glass capillary electrophoresis (CE) chips have been successfully coupled to form an integrated DNA analysis system. This construct combines the rapid thermal cycling capabilities of microfabricated PCR devices (10 °C/s heating, 2.5 °C/s cooling) with the high-speed (<120 s) DNA separations provided by microfabricated CE chips. The PCR chamber and the CE chip were directly linked through a photolithographically fabricated channel filled with hydroxyethylcellulose sieving matrix. Electrophoretic injection directly from the PCR chamber through the cross injection channel was used as an 'electrophoretic valve' to couple the PCR and CE devices on-chip. To demonstrate the functionality of this system, a 15 min PCR amplification of a β-globin target cloned in M13 was immediately followed by high-speed CE chip separation in under 120 s, providing a rapid PCR-CE analysis in under 20 min. A rapid assay for genomic Salmonella DNA was performed in under 45 min, demonstrating that challenging amplifications of diagnostically interesting targets can also be performed. Real-time monitoring of PCR target amplification in these integrated PCR-CE devices is also feasible. 33 refs., 6 figs.

  18. Using the weighted area under the net benefit curve for decision curve analysis.

    PubMed

    Talluri, Rajesh; Shete, Sanjay

    2016-07-18

    Risk prediction models have been proposed for various diseases and are being improved as new predictors are identified. A major challenge is to determine whether newly discovered predictors improve risk prediction. Decision curve analysis has been proposed as an alternative to the area under the curve and the net reclassification index for evaluating the performance of prediction models in clinical scenarios. The decision curve computed using the net benefit can evaluate the predictive performance of risk models at a given threshold probability or over a range of threshold probabilities. However, when the decision curves for two competing models cross in the range of interest, it is difficult to identify the best model, as there is no readily available summary measure for evaluating predictive performance. The key deterrent to using simple measures such as the area under the net benefit curve is the assumption that the threshold probabilities are uniformly distributed among patients. We propose a novel measure for performing decision curve analysis. The approach estimates the distribution of threshold probabilities without the need for additional data. Using the estimated distribution of threshold probabilities, the weighted area under the net benefit curve serves as the summary measure for comparing risk prediction models in a range of interest. We compared three approaches: the standard method, the area under the net benefit curve, and the weighted area under the net benefit curve. Type 1 error and power comparisons demonstrate that the weighted area under the net benefit curve has higher power than the other methods. Several simulation studies are presented to demonstrate the improvement in model comparison using the weighted area under the net benefit curve compared to the standard method.
    The proposed measure improves decision curve analysis by using the weighted area under the curve, thereby increasing the power of decision curve analysis to compare risk prediction models in a clinical scenario.
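    The net benefit underlying these curves has a standard closed form, NB(pt) = TP/n − (FP/n)·pt/(1 − pt). A minimal sketch of the weighted-area summary measure might look like the following, where the threshold-probability density `weight_pdf` is a user-supplied stand-in for the distribution the paper estimates from the data:

    ```python
    import numpy as np

    def net_benefit(y_true, risk, pt):
        """Net benefit at threshold probability pt:
        NB(pt) = TP/n - (FP/n) * pt / (1 - pt)."""
        y_true = np.asarray(y_true)
        pred = np.asarray(risk) >= pt
        n = y_true.size
        tp = np.count_nonzero(pred & (y_true == 1))
        fp = np.count_nonzero(pred & (y_true == 0))
        return tp / n - (fp / n) * pt / (1.0 - pt)

    def weighted_area(y_true, risk, pts, weight_pdf):
        """Trapezoidal approximation of the weighted area under the net benefit
        curve over the threshold range of interest."""
        f = np.array([net_benefit(y_true, risk, p) for p in pts]) * weight_pdf(pts)
        return np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(pts))
    ```

    With a uniform `weight_pdf` this reduces to the plain area under the net benefit curve; a non-uniform density re-weights the comparison toward the thresholds patients actually use.
    
    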

  19. Apparent Fibre Density: a novel measure for the analysis of diffusion-weighted magnetic resonance images.

    PubMed

    Raffelt, David; Tournier, J-Donald; Rose, Stephen; Ridgway, Gerard R; Henderson, Robert; Crozier, Stuart; Salvado, Olivier; Connelly, Alan

    2012-02-15

    This article proposes a new measure called Apparent Fibre Density (AFD) for the analysis of high angular resolution diffusion-weighted images using higher-order information provided by fibre orientation distributions (FODs) computed using spherical deconvolution. AFD has the potential to provide specific information regarding differences between populations by identifying not only the location, but also the orientations along which differences exist. In this work, analytical and numerical Monte-Carlo simulations are used to support the use of the FOD amplitude as a quantitative measure (i.e. AFD) for population and longitudinal analysis. To perform robust voxel-based analysis of AFD, we present and evaluate a novel method to modulate the FOD to account for changes in fibre bundle cross-sectional area that occur during spatial normalisation. We then describe a novel approach for statistical analysis of AFD that uses cluster-based inference of differences extended throughout space and orientation. Finally, we demonstrate the capability of the proposed method by performing voxel-based AFD comparisons between a group of Motor Neurone Disease patients and healthy control subjects. A significant decrease in AFD was detected along voxels and orientations corresponding to both the corticospinal tract and corpus callosal fibres that connect the primary motor cortices. In addition to corroborating previous findings in MND, this study demonstrates the clear advantage of using this type of analysis by identifying differences along single fibre bundles in regions containing multiple fibre populations. Copyright © 2011 Elsevier Inc. All rights reserved.

  20. Optical communication for space missions

    NASA Technical Reports Server (NTRS)

    Fitzmaurice, M.

    1991-01-01

    Activities performed at NASA/GSFC (Goddard Space Flight Center) related to direct detection optical communications for space applications are discussed. The following subject areas are covered: (1) requirements for optical communication systems (data rates and channel quality; spatial acquisition; fine tracking and pointing; and transmit point-ahead correction); (2) component testing and development (laser diodes performance characterization and life testing; and laser diode power combining); (3) system development and simulations (The GSFC pointing, acquisition and tracking system; hardware description; preliminary performance analysis; and high data rate transmitter/receiver systems); and (4) proposed flight demonstration of optical communications.

  1. Applications of inertial-sensor high-inheritance instruments to DSN precision antenna pointing

    NASA Technical Reports Server (NTRS)

    Goddard, R. E.

    1992-01-01

    Laboratory test results of the initialization and tracking performance of an existing inertial-sensor-based instrument are given. The instrument, although not primarily designed for precision antenna pointing applications, demonstrated an average tracking error of several millidegrees over 10 hours. The system-level instrument performance is shown by analysis to be sensor limited. Simulated instrument improvements show a tracking error of less than 1 mdeg, which would provide acceptable performance, i.e., low pointing loss, for the DSN 70-m antenna subnetwork operating at Ka-band (1-cm wavelength).

  2. Applications of inertial-sensor high-inheritance instruments to DSN precision antenna pointing

    NASA Technical Reports Server (NTRS)

    Goddard, R. E.

    1992-01-01

    Laboratory test results of the initialization and tracking performance of an existing inertial-sensor-based instrument are given. The instrument, although not primarily designed for precision antenna pointing applications, demonstrated an average tracking error of several millidegrees over 10 hours. The system-level instrument performance is shown by analysis to be sensor limited. Simulated instrument improvements show a tracking error of less than 1 mdeg, which would provide acceptable performance, i.e., low pointing loss, for the Deep Space Network 70-m antenna subnetwork operating at Ka-band (1-cm wavelength).

  3. Comparison between measured turbine stage performance and the predicted performance using quasi-3D flow and boundary layer analyses

    NASA Technical Reports Server (NTRS)

    Boyle, R. J.; Haas, J. E.; Katsanis, T.

    1984-01-01

    A method for calculating turbine stage performance is described. The usefulness of the method is demonstrated by comparing measured and predicted efficiencies for nine different stages. Comparisons are made over a range of turbine pressure ratios and rotor speeds. A quasi-3D flow analysis is used to account for complex passage geometries. Boundary layer analyses are done to account for losses due to friction. Empirical loss models are used to account for incidence, secondary flow, disc windage, and clearance losses.

  4. Adaptive Optics Communications Performance Analysis

    NASA Technical Reports Server (NTRS)

    Srinivasan, M.; Vilnrotter, V.; Troy, M.; Wilson, K.

    2004-01-01

    The performance improvement obtained through the use of adaptive optics for deep-space communications in the presence of atmospheric turbulence is analyzed. Using simulated focal-plane signal-intensity distributions, uncoded pulse-position modulation (PPM) bit-error probabilities are calculated assuming the use of an adaptive focal-plane detector array as well as an adaptively sized single detector. It is demonstrated that current practical adaptive optics systems can yield performance gains over an uncompensated system ranging from approximately 1 dB to 6 dB depending upon the PPM order and background radiation level.

  5. Spherical roller bearing analysis. SKF computer program SPHERBEAN. Volume 3: Program correlation with full scale hardware tests

    NASA Technical Reports Server (NTRS)

    Kleckner, R. J.; Rosenlieb, J. W.; Dyba, G.

    1980-01-01

    The results of a series of full scale hardware tests comparing predictions of the SPHERBEAN computer program with measured data are presented. The SPHERBEAN program predicts the thermomechanical performance characteristics of high speed lubricated double row spherical roller bearings. The degree of correlation between performance predicted by SPHERBEAN and measured data is demonstrated. Experimental and calculated performance data are compared over a range of speeds up to 19,400 rpm (0.8 MDN) under pure radial, pure axial, and combined loads.

  6. Finite element analysis of structural engineering problems using a viscoplastic model incorporating two back stresses

    NASA Technical Reports Server (NTRS)

    Arya, Vinod K.; Halford, Gary R.

    1993-01-01

    The feasibility of a viscoplastic model incorporating two back stresses and a drag strength is investigated for performing nonlinear finite element analyses of structural engineering problems. To demonstrate suitability for nonlinear structural analyses, the model is implemented into a finite element program and analyses for several uniaxial and multiaxial problems are performed. Good agreement is shown between the results obtained using the finite element implementation and those obtained experimentally. The advantages of using advanced viscoplastic models for performing nonlinear finite element analyses of structural components are indicated.

  7. Friends as Coworkers: Research Review and Classroom Implications.

    ERIC Educational Resources Information Center

    Zajac, Robert J.; Hartup, Willard W.

    1997-01-01

    Provides evidence that benefits occur when friends, compared with nonfriends, are coworkers on cognitive tasks. Notes a meta-analysis that found superior performance in the areas of seeking scarce resources, problem solving, creative activity, and reaching consensus. Argues teachers should consider giving students the opportunity…

  8. Technicians, Technical Education, and Global Economic Development: A Cross National Examination.

    ERIC Educational Resources Information Center

    Honig, Benson; Ramirez, Francisco

    Although the relationship among education, science, technology, and economic development is nearly universally accepted, the link among education, infrastructure, and economic growth has yet to be empirically demonstrated. A multivariate analysis of cross-national data regarding 48 countries was performed to document relationships between…

  9. APPLICATION ANALYSIS REPORT - DEMONSTRATION OF A TRIAL EXCAVATION AT THE MCCOLL SUPERFUND SITE

    EPA Science Inventory

    In June 1990, the U.S. Environmental Protection Agency’s Region IX Superfund Program, in cooperation with EPA’s Air and Energy Engineering Research Laboratory (AEERL), and EPA’s Superfund Innovative Technology Evaluation (SITE) Program performed a trial excavation of approximatel...

  10. Robust MOE Detector for DS-CDMA Systems with Signature Waveform Mismatch

    NASA Astrophysics Data System (ADS)

    Lin, Tsui-Tsai

    In this letter, a decision-directed MOE detector with excellent robustness against signature waveform mismatch is proposed for DS-CDMA systems. Both theoretical analysis and computer simulation results demonstrate that the proposed detector provides better SINR performance than conventional detectors.

  11. Automatic control of a liquid nitrogen cooled, closed-circuit, cryogenic pressure tunnel

    NASA Technical Reports Server (NTRS)

    Balakrishna, S.; Goglia, G. L.

    1980-01-01

    The control system design, performance analysis, microprocessor-based controller software development, and specifications for the Transonic Cryogenic Tunnel (TCT) are discussed. The control laws for the single-input single-output controllers were tested on the TCT simulator and successfully demonstrated on the TCT.

  12. 10 CFR 39.13 - Specific licenses for well logging.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... applicant will use to demonstrate the logging supervisor's knowledge and understanding of and ability to... and understanding of and ability to comply with the applicant's operating and emergency procedures. (c... performing the analysis; and (3) Pertinent experience of the person who will analyze the wipe samples. ...

  13. Mennonite Nursing Home passive solar demonstration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    A long-term nursing care facility and retirement center was designed for passive solar heating. The system comprises thermal mass, thermal insulation, Trombe walls, and direct-gain clerestories. Included here are a topical report, an analysis of building performance, the owner's perspective, the designer's perspective, and a summary of information dissemination activities. (MHR)

  14. Development of space-stable thermal control coatings for use on large space vehicles

    NASA Technical Reports Server (NTRS)

    Gilligan, J. E.; Harada, Y.

    1976-01-01

    The potential of zinc orthotitanate as a pigment for spacecraft thermal control was demonstrated. The properties and performance of pigments prepared by solid state, coprecipitation, and mixed oxalate methods were compared. Environmental tests and subsequent spectral analysis were given primary emphasis.

  15. X-Band CubeSat Communication System Demonstration

    NASA Technical Reports Server (NTRS)

    Altunc, Serhat; Kegege, Obadiah; Bundick, Steve; Shaw, Harry; Schaire, Scott; Bussey, George; Crum, Gary; Burke, Jacob C.; Palo, Scott; O'Conor, Darren

    2015-01-01

    Today's CubeSats mostly operate their communications at UHF- and S-band frequencies. The UHF band is presently crowded; thus, downlink communications are at lower data rates due to bandwidth limitations and are unreliable due to interference. This research presents an end-to-end robust, innovative, compact, efficient, and low-cost S-band uplink and X-band downlink CubeSat communication system demonstration between a balloon and a Near Earth Network (NEN) ground system. Since communication systems serve as umbilical cords for space missions, demonstration of this X-band communication system is critical for successfully supporting current and future CubeSat communication needs. This research has three main objectives. The first objective is to design, simulate, and test a CubeSat S- and X-band communication system. Satellite Tool Kit (STK) dynamic link budget calculations and HFSS simulation and modeling results have been used to trade the merits of various designs for small satellite applications. S- and X-band antennas have been tested in the compact antenna test range at Goddard Space Flight Center (GSFC) to gather radiation pattern data. The second objective is to simulate and test a CubeSat-compatible X-band communication system at 12.5 Mbps, including S-band antennas, X-band antennas, a Laboratory for Atmospheric and Space Physics (LASP)/GSFC transmitter, and an S-band receiver, from TRL-5 to TRL-8 by the end of this effort. Different X-band communication system components (antennas, diplexers, etc.) from GSFC, other NASA centers, universities, and private companies have been investigated and traded, and a complete component list for the communication system baseline has been developed by performing analytical and numerical analyses. This objective also includes running simulations and performing trades between different X-band antenna systems to optimize communication system performance.
The final objective is to perform an end-to-end X-band CubeSat communication system demonstration between a balloon and/or a sounding rocket and a Near Earth Network (NEN) ground system. This paper presents CubeSat communication systems simulation results, analysis of X-band and S-band antennas and RF front-end components, transceiver design, analysis and optimization of space-to-ground communication performance, subsystem development, as well as the test results for an end-to-end X-band CubeSat communication system demonstration. The outcome of this work will be used to pave the way for next generation NEN-compatible X-band CubeSat communication systems to support higher data rates with more advanced modulation and forward error correction (FEC) coding schemes, and to support and attract new science missions at lower cost. It also includes an abbreviated concept of operations for CubeSat users to utilize the NEN, starting from first contact with NASA's communication network and continuing through on-orbit operations.
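    As a rough illustration of the kind of dynamic link-budget arithmetic such a design effort relies on (the frequency, distance, and gain values below are hypothetical placeholders, not figures from this demonstration), a minimal sketch might be:

    ```python
    import math

    def fspl_db(distance_m, freq_hz):
        """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
        c = 299_792_458.0  # speed of light, m/s
        return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / c)

    def received_power_dbm(eirp_dbm, distance_m, freq_hz, rx_gain_dbi,
                           misc_losses_db=0.0):
        """Simple link budget: Pr = EIRP - FSPL + Grx - other losses (all in dB)."""
        return eirp_dbm - fspl_db(distance_m, freq_hz) + rx_gain_dbi - misc_losses_db
    ```

    For example, an X-band downlink near 8.4 GHz over a 500 km slant range incurs roughly 165 dB of free-space path loss, which the transmitter EIRP and ground-station antenna gain must overcome with margin for the chosen data rate.
    
    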

  16. Voxel-based statistical analysis of cerebral glucose metabolism in patients with permanent vegetative state after acquired brain injury.

    PubMed

    Kim, Yong Wook; Kim, Hyoung Seop; An, Young-Sil; Im, Sang Hee

    2010-10-01

    Permanent vegetative state is defined as an impaired level of consciousness persisting longer than 12 months after traumatic causes and 3 months after non-traumatic causes of brain injury. Although many studies have assessed cerebral metabolism in patients with acute and persistent vegetative state after brain injury, few have investigated cerebral metabolism in patients with permanent vegetative state. In this study, we performed a voxel-based analysis of cerebral glucose metabolism and investigated the relationship between regional cerebral glucose metabolism and the severity of impaired consciousness in patients with permanent vegetative state after acquired brain injury. We compared regional cerebral glucose metabolism, as demonstrated by F-18 fluorodeoxyglucose positron emission tomography, in 12 patients with permanent vegetative state after acquired brain injury with that in 12 control subjects. Additionally, covariance analysis was performed to identify regions where decreases in regional cerebral glucose metabolism significantly correlated with a decrease in the level of consciousness measured by the JFK Coma Recovery Scale. Statistical analysis was performed using statistical parametric mapping. Compared with controls, patients with permanent vegetative state demonstrated decreased cerebral glucose metabolism in the left precuneus, both posterior cingulate cortices, and the left superior parietal lobule (P(corrected) < 0.001), and increased cerebral glucose metabolism in both cerebellar hemispheres and the right supramarginal cortex (P(corrected) < 0.001). In the covariance analysis, a decrease in the level of consciousness was significantly correlated with decreased cerebral glucose metabolism in both posterior cingulate cortices (P(uncorrected) < 0.005).
Our findings suggest that the posteromedial parietal cortex, which are part of neural network for consciousness, may be relevant structure for pathophysiological mechanism in patients with permanent vegetative state after acquired brain injury.
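    The group comparison described above reduces, at each voxel, to an independent two-sample test on normalized uptake values (the study itself uses statistical parametric mapping with corrected thresholds; the function and data below are a minimal illustrative sketch, not SPM, and the uptake numbers are hypothetical):

```python
import math

def two_sample_t(a, b):
    """Unpaired two-sample t statistic with pooled variance, as applied
    independently at every voxel in a voxel-based group comparison."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    return (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))

# Hypothetical normalized FDG uptake at one voxel:
patients = [0.61, 0.58, 0.55, 0.63, 0.57]
controls = [0.82, 0.85, 0.79, 0.88, 0.84]
t = two_sample_t(patients, controls)  # strongly negative t: hypometabolism
```

    In a real voxel-based analysis this statistic is computed for every voxel and then thresholded with a multiple-comparison correction, which is what the P(corrected) values in the abstract refer to.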

  17. TECA: A Parallel Toolkit for Extreme Climate Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prabhat, Mr; Ruebel, Oliver; Byna, Surendra

    2012-03-12

    We present TECA, a parallel toolkit for detecting extreme events in large climate datasets. Modern climate datasets expose parallelism across a number of dimensions: spatial locations, timesteps and ensemble members. We design TECA to exploit these modes of parallelism and demonstrate a prototype implementation for detecting and tracking three classes of extreme events: tropical cyclones, extra-tropical cyclones and atmospheric rivers. We process a modern TB-sized CAM5 simulation dataset with TECA, and demonstrate good runtime performance for the three case studies.
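    The timestep parallelism the abstract describes can be sketched as a block partition of timestep indices across ranks, with each rank running the detection criterion on its own block (a generic sketch under assumed names; TECA's actual decomposition and detection criteria are not given in the abstract):

```python
def partition_timesteps(n_steps, n_ranks, rank):
    """Contiguous block of timestep indices assigned to one rank,
    spreading any remainder over the first ranks."""
    base, extra = divmod(n_steps, n_ranks)
    start = rank * base + min(rank, extra)
    stop = start + base + (1 if rank < extra else 0)
    return range(start, stop)

def detect_events(steps, threshold, field):
    """Flag timesteps whose (hypothetical) scalar diagnostic exceeds a
    threshold, standing in for a cyclone/atmospheric-river criterion."""
    return [t for t in steps if field[t] > threshold]

# Ten timesteps split over three ranks: blocks of 4 + 3 + 3
field = [0.1, 0.9, 0.2, 0.8, 0.3, 0.95, 0.1, 0.7, 0.85, 0.2]
local = [detect_events(partition_timesteps(10, 3, r), 0.75, field)
         for r in range(3)]
```

    Each rank's local event list can then be gathered and stitched into tracks, which is where the per-timestep independence that makes this embarrassingly parallel ends.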

  18. The business value and cost-effectiveness of genomic medicine.

    PubMed

    Crawford, James M; Aspinall, Mara G

    2012-05-01

    Genomic medicine offers the promise of more effective diagnosis and treatment of human diseases. Genome sequencing early in the course of disease may enable more timely and informed intervention, with reduced healthcare costs and improved long-term outcomes. However, genomic medicine strains current models for demonstrating value, challenging efforts to achieve fair payment for services delivered, both for laboratory diagnostics and for use of molecular information in clinical management. Current models of healthcare reform stipulate that care must be delivered at equal or lower cost, with better patient and population outcomes. To achieve demonstrated value, genomic medicine must overcome many uncertainties: the clinical relevance of genomic variation; potential variation in technical performance and/or computational analysis; management of massive information sets; and must have available clinical interventions that can be informed by genomic analysis, so as to attain more favorable cost management of healthcare delivery and demonstrate improvements in cost-effectiveness.

  19. Germline MLH1 Mutations Are Frequently Identified in Lynch Syndrome Patients With Colorectal and Endometrial Carcinoma Demonstrating Isolated Loss of PMS2 Immunohistochemical Expression.

    PubMed

    Dudley, Beth; Brand, Randall E; Thull, Darcy; Bahary, Nathan; Nikiforova, Marina N; Pai, Reetesh K

    2015-08-01

    Current guidelines on germline mutation testing for patients suspected of having Lynch syndrome are not entirely clear in patients with tumors demonstrating isolated loss of PMS2 immunohistochemical expression. We analyzed the clinical and pathologic features of patients with tumors demonstrating isolated loss of PMS2 expression in an attempt to (1) determine the frequency of germline MLH1 and PMS2 mutations and (2) correlate mismatch-repair protein immunohistochemistry and tumor histology with germline mutation results. A total of 3213 consecutive colorectal carcinomas and 215 consecutive endometrial carcinomas were prospectively analyzed for DNA mismatch-repair protein expression by immunohistochemistry. In total, 32 tumors from 31 patients demonstrated isolated loss of PMS2 immunohistochemical expression, including 16 colorectal carcinomas and 16 endometrial carcinomas. Microsatellite instability (MSI) polymerase chain reaction was performed in 29 tumors from 28 patients with the following results: 28 tumors demonstrated high-level MSI, and 1 tumor demonstrated low-level MSI. Twenty of 31 (65%) patients in the study group had tumors demonstrating histopathology associated with high-level MSI. Seventeen patients underwent germline mutation analysis with the following results: 24% with MLH1 mutations, 35% with PMS2 mutations, 12% with PMS2 variants of undetermined significance, and 29% with no mutations in either MLH1 or PMS2. Three of the 4 patients with MLH1 germline mutations had a mutation that results in decreased stability and quantity of the MLH1 protein that compromises the MLH1-PMS2 protein complex, helping to explain the presence of immunogenic but functionally inactive MLH1 protein within the tumor. 
The high frequency of MLH1 germline mutations identified in our study has important implications for testing strategies in patients suspected of having Lynch syndrome and indicates that patients with tumors demonstrating isolated loss of PMS2 expression without a germline PMS2 mutation must have MLH1 mutation analysis performed.

  20. Performance Analysis of a Cost-Effective Electret Condenser Microphone Directional Array

    NASA Technical Reports Server (NTRS)

    Humphreys, William M., Jr.; Gerhold, Carl H.; Zuckerwar, Allan J.; Herring, Gregory C.; Bartram, Scott M.

    2003-01-01

    Microphone directional array technology continues to be a critical part of the overall instrumentation suite for experimental aeroacoustics. Unfortunately, high sensor cost remains one of the limiting factors in the construction of very high-density arrays (i.e., arrays containing several hundred channels or more) which could be used to implement advanced beamforming algorithms. In an effort to reduce the implementation cost of such arrays, the authors have undertaken a systematic performance analysis of a prototype 35-microphone array populated with commercial electret condenser microphones. An ensemble of microphones coupling commercially available electret cartridges with passive signal conditioning circuitry was fabricated for use with the Langley Large Aperture Directional Array (LADA). A performance analysis consisting of three phases was then performed: (1) characterize the acoustic response of the microphones via laboratory testing and calibration, (2) evaluate the beamforming capability of the electret-based LADA using a series of independently controlled point sources in an anechoic environment, and (3) demonstrate the utility of an electret-based directional array in a real-world application, in this case a cold flow jet operating at high subsonic velocities. The results of the investigation revealed a microphone frequency response suitable for directional array use over a range of 250 Hz - 40 kHz, a successful beamforming evaluation using the electret-populated LADA to measure simple point sources at frequencies up to 20 kHz, and a successful demonstration using the array to measure noise generated by the cold flow jet. This paper presents an overview of the tests conducted along with sample data obtained from those tests.

  1. Development of an automated analysis system for data from flow cytometric intracellular cytokine staining assays from clinical vaccine trials

    PubMed Central

    Shulman, Nick; Bellew, Matthew; Snelling, George; Carter, Donald; Huang, Yunda; Li, Hongli; Self, Steven G.; McElrath, M. Juliana; De Rosa, Stephen C.

    2008-01-01

    Background Intracellular cytokine staining (ICS) by multiparameter flow cytometry is one of the primary methods for determining T cell immunogenicity in HIV-1 clinical vaccine trials. Data analysis requires considerable expertise and time. The amount of data is quickly increasing as more and larger trials are performed, and thus there is a critical need for high throughput methods of data analysis. Methods A web-based flow cytometric analysis system, LabKey Flow, was developed for analyses of data from standardized ICS assays. A gating template was created manually in commercially available flow cytometric analysis software. Using this template, the system automatically compensated and analyzed all data sets. Quality control queries were designed to identify potentially incorrect sample collections. Results Comparison of the semi-automated analysis performed by LabKey Flow and the manual analysis performed using FlowJo software demonstrated excellent concordance (concordance correlation coefficient >0.990). Manual inspection of the analyses performed by LabKey Flow for 8-color ICS data files from several clinical vaccine trials indicates that template gates can appropriately be used for most data sets. Conclusions The semi-automated LabKey Flow analysis system can accurately analyze large ICS data files. Routine use of the system does not require specialized expertise. This high-throughput analysis will provide great utility for rapid evaluation of complex multiparameter flow cytometric measurements collected from large clinical trials. PMID:18615598
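    The agreement measure quoted in the Results, Lin's concordance correlation coefficient, can be computed directly from paired readings; a minimal sketch (the sample readings are hypothetical, not trial data):

```python
def concordance_cc(x, y):
    """Lin's concordance correlation coefficient: 2*cov / (var_x + var_y
    + (mean_x - mean_y)^2). Equals 1 only for perfect agreement, i.e.
    points on the identity line, unlike Pearson correlation."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2 * cov / (vx + vy + (mx - my) ** 2)

# Hypothetical %-positive T cell readings from two analysis pipelines:
manual    = [0.12, 0.45, 0.33, 0.80, 0.05]
automated = [0.13, 0.44, 0.35, 0.78, 0.06]
ccc = concordance_cc(manual, automated)  # near 1: near-perfect agreement
```

    A CCC above 0.990, as reported for LabKey Flow versus FlowJo, means the automated values track the manual ones almost exactly on the identity line, not merely that they are linearly correlated.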

  2. Mixed Criticality Scheduling for Industrial Wireless Sensor Networks

    PubMed Central

    Jin, Xi; Xia, Changqing; Xu, Huiting; Wang, Jintao; Zeng, Peng

    2016-01-01

    Wireless sensor networks (WSNs) have been widely used in industrial systems. Their real-time performance and reliability are fundamental to industrial production. Many works have studied these two aspects, but they focus only on single-criticality WSNs. Mixed criticality requirements exist in many advanced applications in which different data flows have different levels of importance (or criticality). In this paper, first, we propose a scheduling algorithm that guarantees the real-time performance and reliability requirements of data flows with different levels of criticality. The algorithm supports centralized optimization and adaptive adjustment, improving both scheduling performance and flexibility. Then, we provide the schedulability test through rigorous theoretical analysis. We conduct extensive simulations, and the results demonstrate that the proposed scheduling algorithm and analysis significantly outperform existing ones. PMID:27589741
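    The paper's actual schedulability test is not given in the abstract; as a stand-in, the shape of a criticality-aware admission test can be sketched with a classical utilization bound, admitting flows highest-criticality-first (all names and the bound itself are illustrative assumptions):

```python
def utilization_schedulable(flows, capacity=1.0):
    """Classical utilization check: the flow set fits if total demand
    (transmission time / period, summed over flows) is within capacity.
    A stand-in for the paper's test, which the abstract does not state."""
    return sum(c / t for c, t, _ in flows) <= capacity

def admit_by_criticality(flows, capacity=1.0):
    """Admit flows highest-criticality-first, dropping lower-criticality
    flows once the set would overload the network."""
    admitted = []
    for flow in sorted(flows, key=lambda f: -f[2]):
        if utilization_schedulable(admitted + [flow], capacity):
            admitted.append(flow)
    return admitted

# (transmission slots, period, criticality level)
flows = [(1, 4, 2), (2, 5, 1), (3, 6, 0)]
ok = utilization_schedulable(flows)   # 0.25 + 0.4 + 0.5 = 1.15 > 1: overloaded
kept = admit_by_criticality(flows)    # lowest-criticality flow is shed
```

    The key mixed-criticality idea this illustrates is that overload is resolved by degrading low-criticality flows first rather than by rejecting the whole set.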

  3. Some Observations on the Current Status of Performing Finite Element Analyses

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Knight, Norman F., Jr; Shivakumar, Kunigal N.

    2015-01-01

    Aerospace structures are complex high-performance structures. Advances in reliable and efficient computing and modeling tools are enabling analysts to consider complex configurations, build complex finite element models, and perform analysis rapidly. Many of the early career engineers of today are very proficient in the usage of modern computers, computing engines, complex software systems, and visualization tools. These young engineers are becoming increasingly efficient in building complex 3D models of complicated aerospace components. However, current trends demonstrate blind acceptance of finite element analysis results. This paper is aimed at raising an awareness of this situation. Examples of common encounters are presented. To overcome the current trends, some guidelines and suggestions for analysts, senior engineers, and educators are offered.

  4. A performance analysis of DS-CDMA and SCPC VSAT networks

    NASA Technical Reports Server (NTRS)

    Hayes, David P.; Ha, Tri T.

    1990-01-01

    Spread-spectrum and single-channel-per-carrier (SCPC) transmission techniques work well in very small aperture terminal (VSAT) networks for multiple-access purposes while allowing the earth station antennas to remain small. Direct-sequence code-division multiple-access (DS-CDMA) is the simplest spread-spectrum technique to use in a VSAT network since a frequency synthesizer is not required for each terminal. An examination is made of the DS-CDMA and SCPC Ku-band VSAT satellite systems for low-density (64-kb/s or less) communications. A method for improving the standard link analysis of DS-CDMA satellite-switched networks by including certain losses is developed. The performance of 50-channel full mesh and star network architectures is analyzed. The selection of operating conditions producing optimum performance is demonstrated.
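    A first-cut feel for DS-CDMA sizing comes from the textbook single-cell capacity estimate, K ≈ 1 + PG/(Eb/N0), with processing gain PG equal to chip rate over bit rate; the sketch below uses hypothetical numbers and the simplest form of the approximation, not the paper's improved link analysis:

```python
import math

def cdma_user_capacity(chip_rate, bit_rate, ebno_req_db, interference_margin=1.0):
    """Textbook single-cell DS-CDMA capacity estimate:
    K ~ 1 + margin * PG / (Eb/N0), PG = chip rate / bit rate.
    interference_margin in (0, 1] derates capacity for extra noise
    sources the simple model ignores."""
    pg = chip_rate / bit_rate
    ebno = 10 ** (ebno_req_db / 10)  # convert dB to linear
    return int(1 + interference_margin * pg / ebno)

# 64 kb/s channels spread over a hypothetical 8.192 Mchip/s carrier,
# assuming a 7 dB required Eb/N0:
users = cdma_user_capacity(8.192e6, 64e3, 7.0)
```

    The "certain losses" the paper folds into the link analysis would enter here as a reduced interference margin, shrinking the supportable user count.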

  5. Numerical and experimental study of actuator performance on piezoelectric microelectromechanical inkjet print head.

    PubMed

    Van So, Pham; Jun, Hyun Woo; Lee, Jaichan

    2013-12-01

    We have investigated the actuator performance of a piezoelectrically actuated inkjet print head via numerical and experimental analysis. The actuator, consisting of multi-layer membranes (piezoelectric, elastic, and other buffer layers) and an ink chamber, was fabricated by MEMS processing. The maximum displacement of the actuator membrane obtained in the experiment is explained by numerical analysis. A simulation of the actuator performance with fluidic damping shows that the resonant frequency of the membrane in liquid is reduced from its resonant frequency in air by a factor of three, which was also verified in the experiment. These simulation and experimental studies demonstrate how much "dynamic force," in terms of a membrane's maximum displacement, maximum force and driving frequency, can be produced by an actuator membrane interacting with fluid.
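    The reported factor-of-three drop in resonant frequency in liquid is consistent with a simple single-mode added-mass picture, f_fluid = f_air / sqrt(1 + β), where β is the fluid's added mass over the membrane's modal mass; the frequency value below is hypothetical and the model is a sketch, not the paper's simulation:

```python
import math

def resonance_in_fluid(f_air_hz, added_mass_ratio):
    """Single-mode added-mass model: fluid loading lowers a membrane's
    resonant frequency by sqrt(1 + beta), where beta is added fluid
    mass divided by the membrane's modal mass."""
    return f_air_hz / math.sqrt(1 + added_mass_ratio)

# A factor-of-three reduction corresponds to beta = 8:
f_air = 90e3                              # hypothetical in-air resonance, Hz
f_ink = resonance_in_fluid(f_air, 8.0)    # one third of f_air
```

    Read the other way, measuring the in-air and in-liquid resonances gives a quick estimate of how much fluid mass the membrane effectively drags along.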

  6. Mars Microprobe Entry Analysis

    NASA Technical Reports Server (NTRS)

    Braun, Robert D.; Mitcheltree, Robert A.; Cheatwood, F. McNeil

    1998-01-01

    The Mars Microprobe mission will provide the first opportunity for subsurface measurements, including water detection, near the south pole of Mars. In this paper, performance of the Microprobe aeroshell design is evaluated through development of a six-degree-of-freedom (6-DOF) aerodynamic database and flight dynamics simulation. Numerous mission uncertainties are quantified and a Monte-Carlo analysis is performed to statistically assess mission performance. Results from this 6-DOF Monte-Carlo simulation demonstrate that, in a majority of the cases (approximately 2-sigma), the penetrator impact conditions are within current design tolerances. Several trajectories are identified in which the current set of impact requirements are not satisfied. From these cases, critical design parameters are highlighted and additional system requirements are suggested. In particular, a relatively large angle-of-attack range near peak heating is identified.

  7. Mental Toughness Moderates Social Loafing in Cycle Time-Trial Performance.

    PubMed

    Haugen, Tommy; Reinboth, Michael; Hetlelid, Ken J; Peters, Derek M; Høigaard, Rune

    2016-09-01

    The purpose of this study was to determine if mental toughness moderated the occurrence of social loafing in cycle time-trial performance. Twenty-seven men (Mage = 17.7 years, SD = 0.6) completed the Sport Mental Toughness Questionnaire prior to completing a 1-min cycling trial under 2 conditions: once with individual performance identified, and once in a group with individual performance not identified. Using a median split of the mental toughness index, participants were divided into high and low mental toughness groups. Cycling distance was compared using a 2 (trial) × 2 (high-low mental toughness) analysis of variance. We hypothesized that mentally tough participants would perform equally well under both conditions (i.e., no indication of social loafing) compared with low mentally tough participants, who would perform less well when their individual performance was not identifiable (i.e., demonstrating the anticipated social loafing effect). The high mental toughness group demonstrated consistent performance across both conditions, while the low mental toughness group reduced their effort in the non-individually identifiable team condition. The results confirm that (a) clearly identifying individual effort/performance is an important situational variable that may impact team performance and (b) higher perceived mental toughness has the ability to negate the tendency to loaf.
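    The median-split grouping and the loafing contrast used in the study can be sketched in a few lines (the cycling distances below are invented for illustration; the study's actual comparison was a 2 × 2 analysis of variance):

```python
import statistics

def median_split(scores):
    """Split scores into low/high groups about the median, as done with
    the mental toughness index (ties assigned to the low group)."""
    med = statistics.median(scores)
    return [s for s in scores if s <= med], [s for s in scores if s > med]

def loafing_effect(identified, unidentified):
    """Mean performance drop when individual output is not identifiable;
    a value near zero suggests no social loafing."""
    return sum(identified) / len(identified) - sum(unidentified) / len(unidentified)

# Hypothetical 1-min cycling distances (m) for a low-mental-toughness group:
effect = loafing_effect([812, 798, 805, 820], [780, 770, 791, 785])
```

    In the study's terms, a positive effect in the low mental toughness group alongside a near-zero effect in the high group is exactly the moderation pattern reported.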

  8. The impact of lungs from diabetic donors on lung transplant recipients†.

    PubMed

    Ambur, Vishnu; Taghavi, Sharven; Jayarajan, Senthil; Kadakia, Sagar; Zhao, Huaqing; Gomez-Abraham, Jesus; Toyoda, Yoshiya

    2017-02-01

    We attempted to determine whether transplantation of lungs from diabetic donors (DDs) is associated with increased mortality of recipients in the modern era of the lung allocation score (LAS). The United Network for Organ Sharing (UNOS) database was queried for all adult lung transplant recipients from 2006 to 2014. Patients receiving a lung from a DD were compared with those receiving a transplant from a non-DD. Multivariate Cox regression analysis using variables associated with mortality was used to examine survival. A total of 13 159 adult lung transplants were performed between January 2006 and June 2014: 4278 (32.5%) were single-lung transplants (SLT) and 8881 (67.5%) were double-lung transplants (DLT). The log-rank test demonstrated a lower median survival in the DD group (5.6 vs 5.0 years, P = 0.003). We performed additional analysis by dividing this initial cohort into two cohorts by transplant type. On multivariate analysis, receiving an SLT from a DD was associated with increased mortality (HR 1.28, 95% CI 1.07–1.54, P = 0.011). Interestingly, multivariate analysis demonstrated no difference in mortality rates for patients receiving a DLT from a DD (HR 1.12, 95% CI 0.97–1.30, P = 0.14). DLT with DDs can be performed safely without increased mortality, but SLT using DDs results in worse survival and post-transplant outcomes. Preference should be given to DLT when using lungs from donors with diabetes. © The Author 2016. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
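    The hazard ratios and confidence intervals above are related by HR = exp(β) and CI = exp(β ± 1.96·SE) on the log scale; the sketch below only does this arithmetic, reconstructing the standard error from the reported interval as a consistency check (it is not a reanalysis of the data):

```python
import math

def se_from_ci(lo, hi, z=1.96):
    """Standard error of a log hazard ratio recovered from its
    reported 95% confidence interval."""
    return (math.log(hi) - math.log(lo)) / (2 * z)

def ci_from_hr(hr, se, z=1.96):
    """95% confidence interval for a hazard ratio given its
    log-scale standard error."""
    beta = math.log(hr)
    return math.exp(beta - z * se), math.exp(beta + z * se)

# Reported single-lung-transplant result: HR 1.28, 95% CI 1.07-1.54
se = se_from_ci(1.07, 1.54)
lo, hi = ci_from_hr(1.28, se)  # reproduces the reported interval
```

    The same arithmetic applied to the DLT result (HR 1.12, CI 0.97–1.30) shows why it is non-significant: the interval crosses 1.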

  9. Computed Tomography-Derived Fractional Flow Reserve in the Detection of Lesion-Specific Ischemia: An Integrated Analysis of 3 Pivotal Trials.

    PubMed

    Xu, Rende; Li, Chenguang; Qian, Juying; Ge, Junbo

    2015-11-01

    Invasive fractional flow reserve (FFR) is the gold standard for the determination of physiologic stenosis severity and the need for revascularization. FFR computed from standard acquired coronary computed tomographic angiography datasets (FFRCT) is an emerging technology which allows calculation of FFR using resting image data from coronary computed tomographic angiography (CCTA). However, the diagnostic accuracy of FFRCT in the evaluation of lesion-specific myocardial ischemia remains to be confirmed, especially in patients with intermediate coronary stenosis. We performed an integrated analysis of data from 3 prospective, international, and multicenter trials, which assessed the diagnostic performance of FFRCT using invasive FFR as a reference standard. Three studies evaluating 609 patients and 1050 vessels were included. The total calculated sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of FFRCT were 82.8%, 77.7%, 60.8%, 91.6%, and 79.2%, respectively, for the per-vessel analysis, and 89.4%, 70.5%, 69.7%, 89.7%, and 78.7%, respectively, for the per-patient analysis. Compared with CCTA alone, FFRCT demonstrated significantly improved accuracy (P < 0.001) in detecting lesion-specific ischemia. In patients with intermediate coronary stenosis, FFRCT remained both highly sensitive and specific with respect to the diagnosis of ischemia. In conclusion, FFRCT appears to be a reliable noninvasive alternative to invasive FFR, as it demonstrates high accuracy in the determination of anatomy and lesion-specific ischemia, which justifies the performance of additional randomized controlled trials to evaluate both the clinical benefits and the cost-effectiveness of FFRCT-guided coronary revascularization.
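    The five figures of merit reported above all derive from a 2 × 2 confusion matrix against the invasive-FFR reference; a minimal sketch (the counts below are hypothetical, chosen only to land near the reported per-vessel percentages, and are not the pooled trial data):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV, NPV, and accuracy (as percents)
    from confusion-matrix counts against a reference standard."""
    return {
        "sensitivity": 100 * tp / (tp + fn),
        "specificity": 100 * tn / (tn + fp),
        "ppv": 100 * tp / (tp + fp),
        "npv": 100 * tn / (tn + fn),
        "accuracy": 100 * (tp + tn) / (tp + fp + tn + fn),
    }

# Hypothetical per-vessel counts for illustration:
m = diagnostic_metrics(tp=240, fp=155, tn=540, fn=50)
```

    Note the pattern visible in the reported numbers: with ischemia relatively uncommon per vessel, NPV stays high while PPV is modest, which is why FFRCT is framed as a reliable rule-out test.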

  10. Develop Advanced Nonlinear Signal Analysis Topographical Mapping System

    NASA Technical Reports Server (NTRS)

    Jong, Jen-Yi

    1997-01-01

    During the development of the SSME, a hierarchy of advanced signal analysis techniques for mechanical signature analysis has been developed by NASA and AI Signal Research Inc. (ASRI) to improve the safety and reliability for Space Shuttle operations. These techniques can process and identify intelligent information hidden in a measured signal which is often unidentifiable using conventional signal analysis methods. Currently, due to the highly interactive processing requirements and the volume of dynamic data involved, detailed diagnostic analysis is being performed manually, which requires immense man-hours and extensive human interaction. To overcome this manual process, NASA implemented this program to develop an Advanced nonlinear signal Analysis Topographical Mapping System (ATMS) to provide automatic/unsupervised engine diagnostic capabilities. The ATMS will utilize a rule-based CLIPS expert system to supervise a hierarchy of diagnostic signature analysis techniques in the Advanced Signal Analysis Library (ASAL). ASAL will perform automatic signal processing, archiving, and anomaly detection/identification tasks in order to provide an intelligent and fully automated engine diagnostic capability. The ATMS has been successfully developed under this contract. In summary, the program objectives to design, develop, test and conduct performance evaluation for an automated engine diagnostic system have been successfully achieved. Software implementation of the entire ATMS system on MSFC's OISPS computer has been completed. The significance of the ATMS developed under this program is attributed to the fully automated coherence analysis capability for anomaly detection and identification which can greatly enhance the power and reliability of engine diagnostic evaluation. The results have demonstrated that ATMS can significantly save time and man-hours in performing engine test/flight data analysis and performance evaluation of large volumes of dynamic test data.

  11. Application of Micro-Electro-Mechanical Sensors to Contactless NDT of Concrete Structures.

    PubMed

    Ham, Suyun; Popovics, John S

    2015-04-17

    The utility of micro-electro-mechanical sensors (MEMS) for air-coupled (contactless or noncontact) sensing in concrete nondestructive testing (NDT) is studied in this paper. The fundamental operation and characteristics of MEMS are first described. Then the application of MEMS sensors to established concrete test methods, including vibration resonance, impact-echo, ultrasonic surface wave, and multi-channel analysis of surface waves (MASW), is demonstrated. In each test application, the performance of the MEMS sensors is compared with that of conventional contactless and contact sensing technology, illustrating the utility of air-coupled MEMS sensors for concrete NDT. Favorable performance of the MEMS sensors demonstrates the potential of the technology for applied contactless NDT efforts.

  12. Rapid analysis of controlled substances using desorption electrospray ionization mass spectrometry.

    PubMed

    Rodriguez-Cruz, Sandra E

    2006-01-01

    The recently developed technique of desorption electrospray ionization (DESI) has been applied to the rapid analysis of controlled substances. Experiments have been performed using a commercial ThermoFinnigan LCQ Advantage MAX ion-trap mass spectrometer with limited modifications. Results from the ambient sampling of licit and illicit tablets demonstrate the ability of the DESI technique to detect the main active ingredient(s) or controlled substance(s), even in the presence of other higher-concentration components. Full-scan mass spectrometry data provide preliminary identification by molecular weight determination, while rapid analysis using the tandem mass spectrometry (MS/MS) mode provides fragmentation data which, when compared to the laboratory-generated ESI-MS/MS spectral library, provide structural information and final identification of the active ingredient(s). The consecutive analysis of tablets containing different active components indicates there is no cross-contamination or interference from tablet to tablet, demonstrating the reliability of the DESI technique for rapid sampling (one tablet/min or better). Active ingredients have been detected for tablets in which the active component represents less than 1% of the total tablet weight, demonstrating the sensitivity of the technique. The real-time sampling of cannabis plant material is also presented.

  13. Integrating software architectures for distributed simulations and simulation analysis communities.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldsby, Michael E.; Fellig, Daniel; Linebarger, John Michael

    2005-10-01

    The one-year Software Architecture LDRD (No. 79819) was a cross-site effort between Sandia California and Sandia New Mexico. The purpose of this research was to further develop and demonstrate integrating software architecture frameworks for distributed simulation and distributed collaboration in the homeland security domain. The integrated frameworks were initially developed through the Weapons of Mass Destruction Decision Analysis Center (WMD-DAC), sited at SNL/CA, and the National Infrastructure Simulation & Analysis Center (NISAC), sited at SNL/NM. The primary deliverable was a demonstration of both a federation of distributed simulations and a federation of distributed collaborative simulation analysis communities in the context of the same integrated scenario, which was the release of smallpox in San Diego, California. To our knowledge this was the first time such a combination of federations under a single scenario has ever been demonstrated. A secondary deliverable was the creation of the standalone GroupMeld™ collaboration client, which uses the GroupMeld™ synchronous collaboration framework. In addition, a small pilot experiment that used both integrating frameworks allowed a greater range of crisis management options to be performed and evaluated than would have been possible without the use of the frameworks.

  14. Performance analysis and improvement of WPAN MAC for home networks.

    PubMed

    Mehta, Saurabh; Kwak, Kyung Sup

    2010-01-01

    The wireless personal area network (WPAN) is an emerging wireless technology for future short range indoor and outdoor communication applications. The IEEE 802.15.3 medium access control (MAC) is proposed to coordinate the access to the wireless medium among the competing devices, especially for short range and high data rate applications in home networks. In this paper we use analytical modeling to study the performance of the WPAN (IEEE 802.15.3) MAC in terms of throughput, efficient bandwidth utilization, and delay with various ACK policies under error-prone channel conditions. This allows us to introduce a K-Dly-ACK-AGG policy, a payload size adjustment mechanism, and an improved backoff algorithm to improve the performance of the WPAN MAC. Performance evaluation results demonstrate the impact of our improvements on network capacity. Moreover, these results can be very useful to WPAN application designers and protocol architects to easily and correctly implement WPAN for home networking.
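    The qualitative effect of a delayed-ACK policy under a noisy channel can be sketched with a first-order throughput model in which K payload frames share one acknowledgement and each frame survives with probability (1 − BER)^bits (all parameter values are hypothetical, and this is a toy model, not the paper's analytical one):

```python
def mac_throughput(payload_bits, overhead_bits, ber, k, bit_rate):
    """First-order MAC throughput model: K payload frames share one
    acknowledgement (Dly-ACK-style aggregation). Each frame survives
    with probability (1 - BER)^bits; the ACK is assumed to cost one
    overhead-sized transmission."""
    frame_ok = (1 - ber) ** (payload_bits + overhead_bits)
    time_bits = k * (payload_bits + overhead_bits) + overhead_bits
    goodput_bits = k * frame_ok * payload_bits
    return bit_rate * goodput_bits / time_bits  # effective bits/s

# Aggregating more frames per ACK raises throughput on a clean channel:
t1 = mac_throughput(8000, 400, 1e-6, k=1, bit_rate=200e6)
t8 = mac_throughput(8000, 400, 1e-6, k=8, bit_rate=200e6)
```

    The model also shows the trade-off motivating a payload size adjustment mechanism: larger frames amortize overhead but fail more often as BER grows, so the optimal payload size shrinks on bad channels.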

  15. Performance Analysis and Improvement of WPAN MAC for Home Networks

    PubMed Central

    Mehta, Saurabh; Kwak, Kyung Sup

    2010-01-01

    The wireless personal area network (WPAN) is an emerging wireless technology for future short range indoor and outdoor communication applications. The IEEE 802.15.3 medium access control (MAC) is proposed to coordinate the access to the wireless medium among the competing devices, especially for short range and high data rate applications in home networks. In this paper we use analytical modeling to study the performance of the WPAN (IEEE 802.15.3) MAC in terms of throughput, efficient bandwidth utilization, and delay with various ACK policies under error-prone channel conditions. This allows us to introduce a K-Dly-ACK-AGG policy, a payload size adjustment mechanism, and an improved backoff algorithm to improve the performance of the WPAN MAC. Performance evaluation results demonstrate the impact of our improvements on network capacity. Moreover, these results can be very useful to WPAN application designers and protocol architects to easily and correctly implement WPAN for home networking. PMID:22319274

  16. Elucidating Performance Limitations in Alkaline-Exchange- Membrane Fuel Cells

    DOE PAGES

    Shiau, Huai-Suen; Zenyuk, Iryna V.; Weber, Adam Z.

    2017-07-15

    Water management is a serious concern for alkaline-exchange-membrane fuel cells (AEMFCs) because water is a reactant in the alkaline oxygen-reduction reaction and hydroxide conduction in alkaline-exchange membranes is highly hydration dependent. Here, we develop and use a multiphysics, multiphase model to explore water management in AEMFCs. We demonstrate that the low performance is mostly caused by an extremely non-uniform distribution of water in the ionomer phase. A sensitivity analysis of design parameters including humidification strategies, membrane properties, and water transport resistance was undertaken to explore possible optimization strategies. Furthermore, the strategy and issues of reducing bicarbonate/carbonate buildup in the membrane-electrode assembly due to CO2 from air are demonstrated based on the model predictions. Overall, mathematical modeling is used to explore trends and strategies to overcome performance bottlenecks and help enable AEMFC commercialization.

  17. Progress of long pulse operation with high performance plasma in KSTAR

    NASA Astrophysics Data System (ADS)

    Bae, Young; Kstar Team

    2015-11-01

    Recent KSTAR experiments demonstrated sustained H-mode operation up to a pulse duration of 46 s at a plasma current of 600 kA. This long-pulse H-mode operation has been supported by a long-pulse-capable neutral beam injection (NBI) system whose high NB current-drive efficiency is attributable to the highly tangential injection of its three beam sources. In the next phase, aiming to demonstrate long-pulse stationary high-performance plasma operation, we are attempting long-pulse inductive operation at higher performance (MA-class plasma current, high normalized beta, and low q95), with the final goal of demonstrating an ITER-like baseline scenario in KSTAR through progressive improvement of plasma shape control and higher neutral beam injection power. This paper presents the progress of long-pulse operation and the analysis of energy confinement time and non-inductive current drive in KSTAR.

  18. Optimum sensitivity derivatives of objective functions in nonlinear programming

    NASA Technical Reports Server (NTRS)

    Barthelemy, J.-F. M.; Sobieszczanski-Sobieski, J.

    1983-01-01

    The feasibility of eliminating second derivatives from the input of optimum sensitivity analyses of optimization problems is demonstrated. This elimination restricts the sensitivity analysis to the first-order sensitivity derivatives of the objective function. It is also shown that when a complete first-order sensitivity analysis is performed, second-order sensitivity derivatives of the objective function are available at little additional cost. An expression is derived whose application to linear programming is presented.
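    The first-order optimum sensitivity result the abstract refers to can be stated compactly in its standard form for a constrained minimum (notation here is generic rather than the paper's):

```latex
% Nonlinear program with parameter p:
%   minimize  f(x, p)   subject to   g(x, p) \le 0
% At an optimum x^*(p) with multipliers \lambda^*, the sensitivity of the
% optimum value F^*(p) = f(x^*(p), p) needs only FIRST derivatives:
\frac{dF^*}{dp}
  = \left.\frac{\partial f}{\partial p}\right|_{(x^*,\,p)}
  + \lambda^{*T}\left.\frac{\partial g}{\partial p}\right|_{(x^*,\,p)}
```

    Because the terms involving dx*/dp vanish at the optimum (the stationarity condition), no second derivatives of f or g appear, which is the elimination whose feasibility the paper demonstrates.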

  19. Thermal analysis of combinatorial solid geometry models using SINDA

    NASA Technical Reports Server (NTRS)

    Gerencser, Diane; Radke, George; Introne, Rob; Klosterman, John; Miklosovic, Dave

    1993-01-01

    Algorithms have been developed using Monte Carlo techniques to determine the thermal network parameters necessary to perform a finite difference analysis on Combinatorial Solid Geometry (CSG) models. Orbital and laser fluxes as well as internal heat generation are modeled to facilitate satellite modeling. The results of the thermal calculations are used to model the infrared (IR) images of targets and assess target vulnerability. Sample analyses and validation results are presented that demonstrate the code's products.
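    Once the Monte Carlo step has produced the network parameters (node capacitances and internode conductances), the finite difference analysis itself reduces to the lumped thermal network equation C_i dT_i/dt = Σ_j G_ij (T_j − T_i) + Q_i; a minimal explicit-update sketch (generic, not SINDA's solver, with invented values):

```python
def step_thermal_network(temps, capacitance, conductance, sources, dt):
    """One explicit finite-difference update of a lumped thermal network:
    C_i dT_i/dt = sum_j G_ij (T_j - T_i) + Q_i.
    conductance maps node pairs (i, j) to G_ij in W/K; sources holds
    internal generation plus absorbed orbital/laser flux in W."""
    flux = [q for q in sources]
    for (i, j), g in conductance.items():
        q = g * (temps[j] - temps[i])
        flux[i] += q
        flux[j] -= q
    return [t + dt * q / c for t, q, c in zip(temps, flux, capacitance)]

# Two conducting nodes, no sources: temperatures relax toward each other
temps = [300.0, 400.0]
for _ in range(2000):
    temps = step_thermal_network(temps, [100.0, 100.0], {(0, 1): 5.0},
                                 [0.0, 0.0], 0.1)
```

    The explicit update is only stable for dt small relative to C/G at every node; production codes like SINDA offer implicit schemes precisely to relax that restriction.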

  20. Idaho National Laboratory’s Analysis of ARRA-Funded Plug-in Electric Vehicle and Charging Infrastructure Projects: Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Francfort, Jim; Bennett, Brion; Carlson, Richard

    2015-09-01

    Battelle Energy Alliance, LLC, managing and operating contractor for the U.S. Department of Energy’s (DOE) Idaho National Laboratory (INL), is the lead laboratory for DOE’s Advanced Vehicle Testing Activity (AVTA). INL’s conduct of the AVTA resulted in a significant base of knowledge and experience in testing light-duty vehicles that reduce transportation-related petroleum consumption. Due to this experience, INL was tasked by DOE to develop agreements with companies that were recipients of American Recovery and Reinvestment Act of 2009 (ARRA) grants, allowing INL to collect raw data from light-duty vehicles and charging infrastructure. INL developed non-disclosure agreements (NDAs) with several companies and their partners that enabled INL to receive raw data via server-to-server connections from the partner companies. This raw data allowed INL to independently conduct data quality checks, perform analysis, and report publicly to DOE, partners, and stakeholders how drivers used both new vehicle technologies and the deployed charging infrastructure. The ultimate goal was not the deployment of vehicles and charging infrastructure, but rather to create real-world laboratories of vehicles, charging infrastructure, and drivers that would aid in the design of future electric drive transportation systems.
The five projects that INL collected data from, and their partners, are:
    • ChargePoint America - Plug-in Electric Vehicle Charging Infrastructure Demonstration
    • Chrysler Ram PHEV Pickup - Vehicle Demonstration
    • General Motors Chevrolet Volt - Vehicle Demonstration
    • The EV Project - Plug-in Electric Vehicle Charging Infrastructure Demonstration
    • EPRI / Via Motors PHEVs – Vehicle Demonstration
This document serves to benchmark the performance science involved in the execution, analysis, and reporting for the five projects above, which provided lessons learned based on drivers’ use of the vehicles and their recharging decisions. Data are reported for the use of more than 25,000 vehicles and charging units.

  1. Farmers Extension Program Effects on Yield Gap in North China Plain

    NASA Astrophysics Data System (ADS)

    Sum, N.; Zhao, Y.

    2015-12-01

    Improving the crop yields of the lowest-yielding smallholder farmers in developing countries is essential to both national food security and the farmers' livelihoods. Although wheat and maize production in most developed countries has reached 80% or more of the yield potential determined by simulation models, the yield gap remains high in the developing world. One such case is the yield gap of maize in the North China Plain (NCP), where the average farmer's yield is 41% of his or her potential yield. This large yield gap indicates an opportunity to raise yields substantially by improving agronomy, especially in nutrition management, irrigation facilities, and mechanization issues such as technical services. Farmers' agronomic knowledge is essential to yield performance. To propagate such knowledge to farmers, agricultural extension programs, especially in-the-field guidance with training programs at targeted demonstration fields, have become prevalent in China. Although traditional analyses of the effects of extension programs are done through surveys, these are limited to only one or two years and to a small area. However, the spatial analysis tool Google Earth Engine (GEE) and its extensive satellite imagery allow for unprecedented spatiotemporal analysis of yield variation. We used GEE to analyze maize yield in Quzhou county in the North China Plain from 2007 to 2013. We based our analysis on the distance from a demonstration farm plot, the source of the farmers' agronomic knowledge. Our hypothesis was that the farther the farmers' fields were from the demonstration plot, the less access they would have to the knowledge, and the smaller their increase in yield over time. Testing this hypothesis using GEE helps us determine the effectiveness of the demonstration plot in disseminating optimal agronomic practices, in addition to evaluating the yield performance of the demonstration field itself. 
Furthermore, we can easily extend this methodology to analyze the whole NCP and any other parts of the world for any type of crop.
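
    The distance-decay hypothesis above reduces to a regression of yield change on distance to the demonstration plot. A minimal sketch on synthetic data (all numbers below are invented; the study used satellite-derived yields in GEE, not this toy model):

```python
import numpy as np

# Synthetic test of the hypothesis: yield gain declines with distance
# from the demonstration plot. Field positions, the -0.05 t/ha-per-km
# decay, and the noise level are all made-up illustration values.
rng = np.random.default_rng(0)
dist_km = rng.uniform(0, 20, 500)                            # field-to-plot distance
yield_gain = 1.5 - 0.05 * dist_km + rng.normal(0, 0.3, 500)  # t/ha change over time

# Least-squares slope: a negative slope supports the distance-decay hypothesis
slope, intercept = np.polyfit(dist_km, yield_gain, 1)
print(f"slope = {slope:.3f} t/ha per km")
```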

  2. Confirmation of translatability and functionality certifies the dual endothelin1/VEGFsp receptor (DEspR) protein.

    PubMed

    Herrera, Victoria L M; Steffen, Martin; Moran, Ann Marie; Tan, Glaiza A; Pasion, Khristine A; Rivera, Keith; Pappin, Darryl J; Ruiz-Opazo, Nelson

    2016-06-14

    In contrast to rat and mouse databases, the NCBI gene database lists the human dual-endothelin1/VEGFsp receptor (DEspR, formerly Dear) as a unitary transcribed pseudogene due to a stop [TGA]-codon at codon#14 in automated DNA and RNA sequences. However, re-analysis is needed given prior single gene studies detected a tryptophan [TGG]-codon#14 by manual Sanger sequencing, demonstrated DEspR translatability and functionality, and since the demonstration of actual non-translatability through expression studies, the standard-of-excellence for pseudogene designation, has not been performed. Re-analysis must meet UNIPROT criteria for demonstration of a protein's existence at the highest (protein) level, which a priori, would override DNA- or RNA-based deductions. To dissect the nucleotide sequence discrepancy, we performed Maxam-Gilbert sequencing and reviewed 727 RNA-seq entries. To comply with the highest level multiple UNIPROT criteria for determining DEspR's existence, we performed various experiments using multiple anti-DEspR monoclonal antibodies (mAbs) targeting distinct DEspR epitopes with one spanning the contested tryptophan [TGG]-codon#14, assessing: (a) DEspR protein expression, (b) predicted full-length protein size, (c) sequence-predicted protein-specific properties beyond codon#14: receptor glycosylation and internalization, (d) protein-partner interactions, and (e) DEspR functionality via DEspR-inhibition effects. Maxam-Gilbert sequencing and some RNA-seq entries demonstrate two guanines, hence a tryptophan [TGG]-codon#14 within a compression site spanning an error-prone compression sequence motif. Western blot analysis using anti-DEspR mAbs targeting distinct DEspR epitopes detect the identical glycosylated 17.5 kDa pull-down protein. Decrease in DEspR-protein size after PNGase-F digest demonstrates post-translational glycosylation, concordant with the consensus-glycosylation site beyond codon#14. 
Like other small single-transmembrane proteins, mass spectrometry analysis of anti-DEspR mAb pull-down proteins does not detect DEspR, but detects DEspR-protein interactions with proteins implicated in intracellular trafficking and cancer. FACS analyses also detect DEspR-protein in different human cancer stem-like cells (CSCs). DEspR-inhibition studies identify DEspR-roles in CSC survival and growth. Live cell imaging detects fluorescently-labeled anti-DEspR mAb targeted-receptor internalization, concordant with the single internalization-recognition sequence also located beyond codon#14. Data confirm the translatability of DEspR, the full-length DEspR protein beyond codon#14, and elucidate DEspR-specific functionality. Along with detection of the tryptophan [TGG]-codon#14 within an error-prone compression site, the cumulative data demonstrating DEspR protein existence fulfill multiple UNIPROT criteria, thus refuting its pseudogene designation.

  3. Vitamin D and Depression: A Systematic Review and Meta-Analysis Comparing Studies with and without Biological Flaws

    PubMed Central

    Spedding, Simon

    2014-01-01

    The efficacy of Vitamin D supplements in depression is controversial, awaiting further literature analysis. Biological flaws in primary studies are a possible reason meta-analyses of Vitamin D have failed to demonstrate efficacy. This systematic review and meta-analysis of Vitamin D and depression compared studies with and without biological flaws. The systematic review followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. The literature search was undertaken through four databases for randomized controlled trials (RCTs). Studies were critically appraised for methodological quality and biological flaws, in relation to the hypothesis and study design. Meta-analyses were performed for studies according to the presence of biological flaws. The 15 RCTs identified provide a more comprehensive evidence base than previous systematic reviews; the methodological quality of the studies was generally good and the methodology was diverse. A meta-analysis of all studies without flaws demonstrated a statistically significant improvement in depression with Vitamin D supplements (+0.78 CI +0.24, +1.27). Studies with biological flaws were mainly inconclusive, with the meta-analysis demonstrating a statistically significant worsening in depression from taking Vitamin D supplements (−1.1 CI −0.7, −1.5). Vitamin D supplementation (≥800 I.U. daily) was somewhat favorable in the management of depression in studies that demonstrated a change in vitamin levels, and the effect size was comparable to that of anti-depressant medication. PMID:24732019
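
    The pooling step of such a meta-analysis can be sketched with inverse-variance fixed-effect weighting (the effect sizes and standard errors below are illustrative placeholders, not the studies from this review):

```python
import math

# Fixed-effect inverse-variance pooling sketch. Each study contributes
# (effect, standard error); weights are 1/SE^2, and the pooled SE is
# sqrt(1 / sum of weights). All numbers are hypothetical.
studies = [(0.9, 0.40), (0.5, 0.25), (1.1, 0.50), (0.7, 0.30)]

w = [1.0 / se ** 2 for _, se in studies]
pooled = sum(wi * eff for wi, (eff, _) in zip(w, studies)) / sum(w)
se_pooled = math.sqrt(1.0 / sum(w))
lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
print(f"pooled effect {pooled:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

    A random-effects model would additionally inflate each study's variance by a between-study heterogeneity term before weighting; the fixed-effect version shown is the simplest case.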

  4. Importance-performance analysis as a guide for hospitals in improving their provision of services.

    PubMed

    Whynes, D K; Reed, G

    1995-11-01

    As a result of the 1990 National Health Services Act, hospitals now compete with one another to win service contracts. A high level of service quality represents an important ingredient of a successful competitive strategy, yet, in general, hospitals have little external information on which to base quality decisions. Specifically, in their efforts to win contracts from fundholding general practitioners, hospitals require information on what these purchasers deem important with respect to quality, and on how these purchasers assess the quality of their current service performance. The problem is complicated by the fact that hospital service quality is, in itself, multi-dimensional. In other areas of economic activity, the information problem has been resolved by importance-performance analysis, and this paper reports the findings of such an analysis conducted for hospitals in the Trent region. The importance and performance service quality ratings of fundholders were obtained from a questionnaire survey and used in a particular variant of importance-performance analysis, which possesses certain advantages over more conventional approaches. In addition to providing empirical data on the determinants of service quality, as perceived by the purchasers of hospital services, this paper demonstrates how such information can be successfully employed in a quality enhancement strategy.
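
    The quadrant logic at the heart of classical importance-performance analysis can be sketched as follows (the attributes and ratings are hypothetical, not the Trent survey data; the paper's variant differs in detail):

```python
# Importance-performance quadrant sketch. Each attribute gets an
# (importance, performance) rating on a 1-7 scale; the grand means split
# the plane into the four standard IPA quadrants. All ratings invented.
ratings = {                    # attribute: (importance, performance)
    "waiting time":     (6.5, 3.8),
    "clinical outcome": (6.8, 6.1),
    "communication":    (5.9, 4.2),
    "car parking":      (3.1, 2.9),
}

imp_mean = sum(i for i, _ in ratings.values()) / len(ratings)
perf_mean = sum(p for _, p in ratings.values()) / len(ratings)

quadrant_of = {}
for name, (imp, perf) in ratings.items():
    if imp >= imp_mean and perf < perf_mean:
        quadrant_of[name] = "concentrate here"       # important, under-performing
    elif imp >= imp_mean:
        quadrant_of[name] = "keep up the good work"
    elif perf < perf_mean:
        quadrant_of[name] = "low priority"
    else:
        quadrant_of[name] = "possible overkill"

for name, q in quadrant_of.items():
    print(f"{name}: {q}")
```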

  5. Signal Strength-Based Global Navigation Satellite System Performance Assessment in the Space Service Volume

    NASA Technical Reports Server (NTRS)

    Welch, Bryan W.

    2016-01-01

    NASA is participating in the International Committee on Global Navigation Satellite Systems (GNSS) (ICG)'s efforts to demonstrate the benefits to the space user in the Space Service Volume (SSV) when a multi-GNSS solution-space approach is utilized. The ICG Working Group: Enhancement of GNSS Performance, New Services and Capabilities has started a three-phase analysis initiative as an outcome of recommendations at the ICG-10 meeting, in preparation for the ICG-11 meeting. The second phase of that initiative, which increases in complexity and fidelity, augments the Phase 1 purely geometrical approach with signal strength-based limitations to determine whether access is valid. The second phase of analysis has been completed, and the results are documented in this paper.
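
    A signal-strength access check of the kind Phase 2 adds can be sketched as a simple link budget: a link is declared valid when the received carrier-to-noise density clears a receiver threshold. All parameter values below are representative assumptions, not the ICG analysis inputs.

```python
import math

# Link-budget access-check sketch. EIRP, ranges, antenna gain, system
# noise temperature, and the 25 dB-Hz threshold are illustrative numbers.
def free_space_loss_db(dist_km, freq_mhz):
    # Standard free-space path loss with distance in km, frequency in MHz
    return 20 * math.log10(dist_km) + 20 * math.log10(freq_mhz) + 32.45

def cn0_dbhz(eirp_dbw, dist_km, freq_mhz, g_rx_dbi=0.0,
             boltzmann_db=228.6, t_sys_dbk=24.6):
    # C/N0 = EIRP + G_rx - FSL - 10*log10(k) - 10*log10(T_sys)
    return (eirp_dbw + g_rx_dbi - free_space_loss_db(dist_km, freq_mhz)
            + boltzmann_db - t_sys_dbk)

# A high-altitude SSV user receiving a GNSS L1 side-lobe signal
cn0 = cn0_dbhz(eirp_dbw=27.0, dist_km=70_000, freq_mhz=1575.42)
print(f"C/N0 = {cn0:.1f} dB-Hz, access {'valid' if cn0 >= 25.0 else 'denied'}")
```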

  6. Rapid analysis of the main components of the total glycosides of Ranunculus japonicus by UPLC/Q-TOF-MS.

    PubMed

    Rui, Wen; Chen, Hongyuan; Tan, Yuzhi; Zhong, Yanmei; Feng, Yifan

    2010-05-01

    A rapid method for the analysis of the main components of the total glycosides of Ranunculus japonicus (TGOR) was developed using ultra-performance liquid chromatography with quadrupole-time-of-flight mass spectrometry (UPLC/Q-TOF-MS). The separation analysis was performed on a Waters Acquity UPLC system and the accurate mass of molecules and their fragment ions were determined by Q-TOF MS. Twenty compounds, including lactone glycosides, flavonoid glycosides and flavonoid aglycones, were identified and tentatively deduced on the basis of their elemental compositions, MS/MS data and relevant literature. The results demonstrated that lactone glycosides and flavonoids were the main constituents of TGOR. Furthermore, an effective and rapid pattern was established allowing for the comprehensive and systematic characterization of the complex samples.

  7. Thermodynamics of Gas Turbine Cycles with Analytic Derivatives in OpenMDAO

    NASA Technical Reports Server (NTRS)

    Gray, Justin; Chin, Jeffrey; Hearn, Tristan; Hendricks, Eric; Lavelle, Thomas; Martins, Joaquim R. R. A.

    2016-01-01

    A new equilibrium thermodynamics analysis tool was built based on the CEA method using the OpenMDAO framework. The new tool provides forward and adjoint analytic derivatives for use with gradient-based optimization algorithms. The new tool was validated against the original CEA code to ensure an accurate analysis, and the analytic derivatives were validated against finite-difference approximations. Performance comparisons between analytic and finite-difference methods showed a significant speed advantage for the analytic methods. To further test the new analysis tool, a sample optimization was performed to find the optimal air-fuel equivalence ratio maximizing combustion temperature for a range of different pressures. Collectively, the results demonstrate the viability of the new tool to serve as the thermodynamic backbone for future work on a full propulsion modeling tool.
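
    The analytic-versus-finite-difference validation can be illustrated on a toy function (this is not the CEA thermodynamics; it only shows why exact analytic derivatives are preferred for gradient-based optimization):

```python
import math

# Toy comparison: the analytic derivative is exact, while a forward
# finite difference carries O(h) truncation error and needs an extra
# function evaluation per design variable.
def f(x):
    return math.exp(x) * math.sin(x)

def df_analytic(x):
    # d/dx [e^x sin x] = e^x (sin x + cos x)
    return math.exp(x) * (math.sin(x) + math.cos(x))

def df_fd(x, h=1e-6):
    return (f(x + h) - f(x)) / h

x = 1.2
err = abs(df_fd(x) - df_analytic(x))
print(err)  # small but nonzero: the FD estimate is approximate
```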

  8. RLV Turbine Performance Optimization

    NASA Technical Reports Server (NTRS)

    Griffin, Lisa W.; Dorney, Daniel J.

    2001-01-01

    A task was developed at NASA/Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. There are four major objectives of this task: 1) to develop, enhance, and integrate advanced turbine aerodynamic design and analysis tools; 2) to develop the methodology for application of the analytical techniques; 3) to demonstrate the benefits of the advanced turbine design procedure through its application to a relevant turbine design point; and 4) to verify the optimized design and analysis with testing. Final results of the preliminary design and the results of the two-dimensional (2D) detailed design of the first-stage vane of a supersonic turbine suitable for a reusable launch vehicle (RLV) are presented. Analytical techniques for obtaining the results are also discussed.

  9. Procedural learning in Parkinson's disease, specific language impairment, dyslexia, schizophrenia, developmental coordination disorder, and autism spectrum disorders: A second-order meta-analysis.

    PubMed

    Clark, Gillian M; Lum, Jarrad A G

    2017-10-01

    The serial reaction time task (SRTT) has been used to study procedural learning in clinical populations. In this report, second-order meta-analysis was used to investigate whether disorder type moderates performance on the SRTT. Using this approach to quantitatively summarise past research, it was tested whether autism spectrum disorder, developmental coordination disorder, dyslexia, Parkinson's disease, schizophrenia, and specific language impairment differentially affect procedural learning on the SRTT. The main analysis revealed disorder type moderated SRTT performance (p=0.010). This report demonstrates comparable levels of procedural learning impairment in developmental coordination disorder, dyslexia, Parkinson's disease, schizophrenia, and specific language impairment. However, in autism, procedural learning is spared. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Development of Droplet Microfluidics Enabling High-Throughput Single-Cell Analysis.

    PubMed

    Wen, Na; Zhao, Zhan; Fan, Beiyuan; Chen, Deyong; Men, Dong; Wang, Junbo; Chen, Jian

    2016-07-05

    This article reviews recent developments in droplet microfluidics enabling high-throughput single-cell analysis. Five key aspects in this field are included in this review: (1) prototype demonstration of single-cell encapsulation in microfluidic droplets; (2) technical improvements of single-cell encapsulation in microfluidic droplets; (3) microfluidic droplets enabling single-cell proteomic analysis; (4) microfluidic droplets enabling single-cell genomic analysis; and (5) integrated microfluidic droplet systems enabling single-cell screening. We examine the advantages and limitations of each technique and discuss future research opportunities by focusing on key performances of throughput, multifunctionality, and absolute quantification.

  11. A Passive System Reliability Analysis for a Station Blackout

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunett, Acacia; Bucknor, Matthew; Grabaskas, David

    2015-05-03

    The latest iterations of advanced reactor designs have included increased reliance on passive safety systems to maintain plant integrity during unplanned sequences. While these systems are advantageous in reducing the reliance on human intervention and availability of power, the phenomenological foundations on which these systems are built require a novel approach to reliability assessment. Passive systems possess the unique ability to fail functionally without failing physically, a result of their explicit dependency on the existing boundary conditions that drive their operating mode and capacity. Argonne National Laboratory is performing ongoing analyses that demonstrate various methodologies for the characterization of passive system reliability within a probabilistic framework. Two reliability analysis techniques are utilized in this work. The first approach, the Reliability Method for Passive Systems, provides a mechanistic technique employing deterministic models and conventional static event trees. The second approach, a simulation-based technique, utilizes discrete dynamic event trees to treat time-dependent phenomena during scenario evolution. For this demonstration analysis, both reliability assessment techniques are used to analyze an extended station blackout in a pool-type sodium fast reactor (SFR) coupled with a reactor cavity cooling system (RCCS). This work demonstrates the entire process of a passive system reliability analysis, including identification of important parameters and failure metrics, treatment of uncertainties, and analysis of results.
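
    The static event-tree side of such an analysis can be sketched as products of branch probabilities along each sequence (the branch points and probability values below are hypothetical, not the SFR/RCCS model):

```python
# Static event-tree sketch for a station blackout with two passive-system
# branch points. Both branch probabilities are invented for illustration.
p_rccs_degraded = 0.05      # RCCS performs below nominal capacity
p_nat_circ_fails = 0.02     # natural circulation fails to establish

# Each end state's probability is the product along its branch path
sequences = {
    "ok":               (1 - p_rccs_degraded) * (1 - p_nat_circ_fails),
    "degraded_cooling": p_rccs_degraded * (1 - p_nat_circ_fails),
    "loss_of_nat_circ": (1 - p_rccs_degraded) * p_nat_circ_fails,
    "both_failed":      p_rccs_degraded * p_nat_circ_fails,
}

for name, p in sequences.items():
    print(f"{name}: {p:.4f}")
```

    Dynamic event trees replace these fixed branch probabilities with branchings triggered during a time-dependent simulation, which is how the second technique in the paper treats functional failure.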

  12. Does anaesthesia with nitrous oxide affect mortality or cardiovascular morbidity? A systematic review with meta-analysis and trial sequential analysis.

    PubMed

    Imberger, G; Orr, A; Thorlund, K; Wetterslev, J; Myles, P; Møller, A M

    2014-03-01

    The role of nitrous oxide in modern anaesthetic practice is contentious. One concern is that exposure to nitrous oxide may increase the risk of cardiovascular complications. ENIGMA II is a large randomized clinical trial currently underway which is investigating nitrous oxide and cardiovascular complications. Before the completion of this trial, we performed a systematic review and meta-analysis, using Cochrane methodology, on the outcomes that make up the composite primary outcome. We used conventional meta-analysis and trial sequential analysis (TSA). We reviewed 8282 abstracts and selected 138 that fulfilled our criteria for study type, population, and intervention. We attempted to contact the authors of all the selected publications to check for unpublished outcome data. Thirteen trials had outcome data eligible for our outcomes. We assessed three of these trials as having a low risk of bias. Using conventional meta-analysis, the relative risk of short-term mortality in the nitrous oxide group was 1.38 [95% confidence interval (CI) 0.22-8.71] and the relative risk of long-term mortality in the nitrous oxide group was 0.94 (95% CI 0.80-1.10). In both cases, TSA demonstrated that the data were far too sparse to make any conclusions. There were insufficient data to perform meta-analysis for stroke, myocardial infarct, pulmonary embolus, or cardiac arrest. This systematic review demonstrated that we currently do not have robust evidence for how nitrous oxide used as part of general anaesthesia affects mortality and cardiovascular complications.
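
    The relative-risk computation underlying such a meta-analysis can be sketched from a 2×2 table (the event counts below are hypothetical, not ENIGMA or review data):

```python
import math

# Relative risk with a 95% CI computed on the log scale, from a 2x2
# table of (events, total) per arm. All counts are invented.
def relative_risk(events_trt, n_trt, events_ctl, n_ctl):
    rr = (events_trt / n_trt) / (events_ctl / n_ctl)
    # Standard error of log(RR)
    se_log = math.sqrt(1/events_trt - 1/n_trt + 1/events_ctl - 1/n_ctl)
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, lo, hi

rr, lo, hi = relative_risk(12, 500, 10, 500)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

    With sparse events the CI spans 1 by a wide margin, which is exactly the situation trial sequential analysis formalizes: the accumulated information is far below what a reliable conclusion requires.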

  13. The Cluster Sensitivity Index: A Basic Measure of Classification Robustness

    ERIC Educational Resources Information Center

    Hom, Willard C.

    2010-01-01

    Analysts of institutional performance have occasionally used a peer grouping approach in which they compared institutions only to other institutions with similar characteristics. Because analysts historically have used cluster analysis to define peer groups (i.e., the group of comparable institutions), the author proposes and demonstrates with…

  14. LOW TEMPERATURE THERMAL TREATMENT (LT3®) TECHNOLOGY - ROY F. WESTON, INC. - APPLICATIONS ANALYSIS REPORT

    EPA Science Inventory

    This report evaluates the Low Temperature Thermal Treatment (LT3®) system's ability to remove VOC and SVOC compounds from solid wastes. This evaluation is based on treatment performance and cost data from the Superfund Innovative Technology (SITE) demonstration and fi...

  15. Evaluating Inquiry-Based Learning as a Means to Advance Individual Student Achievement

    ERIC Educational Resources Information Center

    Ziemer, Cherilyn G.

    2013-01-01

    Although inquiry-based learning has been debated throughout the greater educational community and demonstrated with some effect in modern classrooms, little quantitative analysis has been performed to empirically validate sustained benefits. This quantitative study focused on whether inquiry-based pedagogy actually brought about sustained and…

  16. 10 CFR 39.13 - Specific licenses for well logging.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... applicant will use to demonstrate the logging supervisor's knowledge and understanding of and ability to... and understanding of and ability to comply with the applicant's operating and emergency procedures. (c... performing the analysis; and (3) Pertinent experience of the person who will analyze the wipe samples. [52 FR...

  17. 10 CFR 39.13 - Specific licenses for well logging.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... applicant will use to demonstrate the logging supervisor's knowledge and understanding of and ability to... and understanding of and ability to comply with the applicant's operating and emergency procedures. (c... performing the analysis; and (3) Pertinent experience of the person who will analyze the wipe samples. [52 FR...

  18. 10 CFR 39.13 - Specific licenses for well logging.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... applicant will use to demonstrate the logging supervisor's knowledge and understanding of and ability to... and understanding of and ability to comply with the applicant's operating and emergency procedures. (c... performing the analysis; and (3) Pertinent experience of the person who will analyze the wipe samples. [52 FR...

  19. Narrative Performance in Verbally Gifted Children.

    ERIC Educational Resources Information Center

    Porath, Marion

    1996-01-01

    Comparison of 14 verbally gifted 6-year olds with equal numbers of chronological-age and mental-age controls using a structural-developmental analysis found that the gifted children organized story plots in ways typical of children 2 years older, elaborated on basic plot structures more than control groups, and demonstrated advanced language…

  20. Some Ethnic Cognitive Patterns.

    ERIC Educational Resources Information Center

    Curtis, Patricia Gelber

    It was hypothesized that there are significant differences in intellectual patterns between black and white populations which can be demonstrated on the Wechsler Adult Intelligence Scale (WAIS). A one-way analysis of variance was performed on the subjects' scores on the WAIS subtests and the Verbal, Performance, and Full Scale IQ using the ethnic…

  1. Application of Transformations in Parametric Inference

    ERIC Educational Resources Information Center

    Brownstein, Naomi; Pensky, Marianna

    2008-01-01

    The objective of the present paper is to provide a simple approach to statistical inference using the method of transformations of variables. We demonstrate performance of this powerful tool on examples of constructions of various estimation procedures, hypothesis testing, Bayes analysis and statistical inference for the stress-strength systems.…

  2. Technology Demonstration Summary: International Waste Technologies In Situ Stabilization/Solidification, Hialeah, Florida

    EPA Science Inventory

    An evaluation was performed of the International Waste Technologies (IWT) HWT-20 additive and the Geo-Con, Inc. deep-soil-mixing equipment for an in situ stabilization/solidification process and its applicability as an on-site treatment method for waste site cleanup. The analysis...

  3. Determination of dasatinib in the tablet dosage form by ultra high performance liquid chromatography, capillary zone electrophoresis, and sequential injection analysis.

    PubMed

    Gonzalez, Aroa Garcia; Taraba, Lukáš; Hraníček, Jakub; Kozlík, Petr; Coufal, Pavel

    2017-01-01

    Dasatinib is a novel oral prescription drug proposed for treating adult patients with chronic myeloid leukemia. Three analytical methods, namely ultra high performance liquid chromatography, capillary zone electrophoresis, and sequential injection analysis, were developed, validated, and compared for determination of the drug in the tablet dosage form. The total analysis time of the optimized ultra high performance liquid chromatography and capillary zone electrophoresis methods was 2.0 and 2.2 min, respectively. Direct ultraviolet detection with a detection wavelength of 322 nm was employed in both cases. The optimized sequential injection analysis method was based on spectrophotometric detection of dasatinib after a simple colorimetric reaction with Folin-Ciocalteu reagent forming a blue-colored complex with an absorbance maximum at 745 nm. The total analysis time was 2.5 min. The ultra high performance liquid chromatography method provided the lowest detection and quantitation limits and the most precise and accurate results. All three newly developed methods were demonstrated to be specific, linear, sensitive, precise, and accurate, providing results satisfactorily meeting the requirements of the pharmaceutical industry, and can be employed for the routine determination of the active pharmaceutical ingredient in the tablet dosage form. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
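
    The detection- and quantitation-limit comparison can be sketched with the standard calibration-curve approach, LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the residual standard deviation and S the calibration slope (the calibration points below are invented, not the dasatinib data):

```python
import numpy as np

# LOD/LOQ sketch from a linear calibration. Concentrations and peak
# areas are made-up illustration values.
conc = np.array([1, 2, 5, 10, 20, 50.0])              # ug/mL
resp = np.array([2.1, 4.0, 10.2, 19.8, 40.5, 99.7])   # peak area (a.u.)

S, b = np.polyfit(conc, resp, 1)          # slope and intercept
resid = resp - (S * conc + b)
sigma = resid.std(ddof=2)                 # residual SD (n - 2 dof)

lod = 3.3 * sigma / S
loq = 10.0 * sigma / S
print(f"LOD = {lod:.2f} ug/mL, LOQ = {loq:.2f} ug/mL")
```

    Comparing these two figures across the three methods is how one method is judged the most sensitive, as reported for the chromatographic method here.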

  4. Development of an Enhanced Payback Function for the Superior Energy Performance Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Therkelsen, Peter; Rao, Prakash; McKane, Aimee

    2015-08-03

    The U.S. DOE Superior Energy Performance (SEP) program provides recognition to industrial and commercial facilities that achieve certification to the ISO 50001 energy management system standard and third-party verification of energy performance improvements. Over 50 industrial facilities are participating and 28 facilities have been certified in the SEP program. These facilities find value in the robust, data-driven energy performance improvement result that the SEP program delivers. Previous analysis of SEP-certified facility data demonstrated the cost effectiveness of SEP and identified internal staff time as the largest cost component related to SEP implementation and certification. This paper analyzes previously reported and newly collected data on the costs and benefits associated with implementation of ISO 50001 and SEP certification. By disaggregating “sunk energy management system (EnMS) labor costs”, this analysis results in a more accurate and detailed understanding of the costs and benefits of SEP participation. SEP is shown to significantly improve and sustain energy performance and energy cost savings, resulting in a highly attractive return on investment. To illustrate these results, a payback function has been developed and is presented. On average, facilities with an annual energy spend greater than $2M can expect to implement SEP with a payback of less than 1.5 years. Finally, this paper also observes and details decreasing facility costs associated with implementing ISO 50001 and certifying to the SEP program as the program has improved from pilot, to demonstration, to full launch.
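
    The shape of such a payback function can be sketched with a simple-payback model (the savings fraction and implementation cost below are illustrative assumptions, not the program's fitted parameters): a roughly fixed implementation cost divided by savings that scale with energy spend gives paybacks that fall as spend grows.

```python
# Simple-payback sketch: payback = implementation cost / annual savings.
# The 10% savings fraction and $250k cost are hypothetical placeholders.
def payback_years(annual_energy_spend, saving_fraction=0.10,
                  implementation_cost=250_000):
    return implementation_cost / (annual_energy_spend * saving_fraction)

for spend in (1e6, 2e6, 5e6):
    print(f"${spend/1e6:.0f}M spend -> {payback_years(spend):.2f} yr payback")
```

    Under these placeholder parameters a $2M-spend facility pays back in 1.25 years, consistent in shape with the sub-1.5-year figure the paper reports for that spend level.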

  5. Boeing's STAR-FODB test results

    NASA Astrophysics Data System (ADS)

    Fritz, Martin E.; de la Chapelle, Michael; Van Ausdal, Arthur W.

    1995-05-01

    Boeing has successfully concluded a 2 1/2 year, two-phase developmental contract for the STAR-Fiber Optic Data Bus (FODB) that is intended for future space-based applications. The first phase included system analysis, trade studies, behavior modeling, and architecture and protocol selection. During this phase we selected the AS4074 Linear Token Passing Bus (LTPB) protocol operating at 200 Mbps, along with the passive, star-coupled fiber media. The second phase involved design, build, integration, and performance and environmental testing of brassboard hardware. The resulting brassboard hardware successfully passed performance testing, providing 200 Mbps operation with a 32 × 32 star-coupled medium. This hardware is suitable for a spaceflight experiment to validate ground testing and analysis and to demonstrate performance in the intended environment. The fiber bus interface unit (FBIU) is a multichip module containing transceiver, protocol, and data formatting chips, buffer memory, and a station management controller. The FBIU has been designed for low power, high reliability, and radiation tolerance. Nine FBIUs were built and integrated with the fiber optic physical layer consisting of the fiber cable plant (FCP) and star coupler assembly (SCA). Performance and environmental testing, including radiation exposure, was performed on selected FBIUs and the physical layer. The integrated system was demonstrated with a full-motion color video image transfer across the bus while simultaneously performing utility functions with a fiber bus control module (FBCM) over a telemetry and control (T&C) bus, in this case AS1773.

  6. Validating the performance of one-time decomposition for fMRI analysis using ICA with automatic target generation process.

    PubMed

    Yao, Shengnan; Zeng, Weiming; Wang, Nizhuan; Chen, Lei

    2013-07-01

    Independent component analysis (ICA) has been proven effective for functional magnetic resonance imaging (fMRI) data analysis. However, ICA decomposition requires iteratively optimizing an unmixing matrix whose initial values are generated randomly, so the randomness of the initialization leads to different decomposition results. A single one-time decomposition for fMRI data analysis is therefore not usually reliable. Under this circumstance, several methods for repeated decompositions with ICA (RDICA) were proposed to reveal the stability of ICA decomposition. Although utilizing RDICA has achieved satisfying results in validating the performance of ICA decomposition, RDICA costs much computing time. To mitigate this problem, in this paper we propose a method, named ATGP-ICA, for fMRI data analysis. This method generates fixed initial values with an automatic target generation process (ATGP) instead of producing them randomly. We performed experimental tests on both hybrid data and fMRI data to show the effectiveness of the new method and made a performance comparison of traditional one-time decomposition with ICA (ODICA), RDICA, and ATGP-ICA. The proposed method not only eliminates the randomness of ICA decomposition, but also saves much computing time compared to RDICA. Furthermore, ROC (Receiver Operating Characteristic) power analysis also indicated better signal reconstruction performance for ATGP-ICA than for RDICA. Copyright © 2013 Elsevier Inc. All rights reserved.
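
    The core reproducibility point, that a deterministic initial unmixing vector makes the decomposition repeatable, can be sketched with a one-unit FastICA iteration (ATGP itself is not reproduced here; any fixed starting vector w0 illustrates the idea):

```python
import numpy as np

# One-unit FastICA with a deterministic starting vector: identical w0
# on identical whitened data yields an identical extracted component,
# eliminating run-to-run variation from random initialization.
def fastica_one_unit(X, w0, n_iter=200):
    # X: whitened data, shape (n_features, n_samples)
    w = w0 / np.linalg.norm(w0)
    for _ in range(n_iter):
        wx = w @ X
        g, g_prime = np.tanh(wx), 1.0 - np.tanh(wx) ** 2
        # FastICA fixed-point update with the tanh nonlinearity
        w_new = (X * g).mean(axis=1) - g_prime.mean() * w
        w_new /= np.linalg.norm(w_new)
        converged = abs(abs(w_new @ w) - 1.0) < 1e-10
        w = w_new
        if converged:
            break
    return w

# Two synthetic sources, linearly mixed (stand-ins for fMRI signals)
rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
S = np.vstack([np.sign(np.sin(3 * t)), rng.uniform(-1, 1, 2000)])
X = np.array([[1.0, 0.5], [0.5, 1.0]]) @ S
X -= X.mean(axis=1, keepdims=True)

# Whiten: Xw has identity covariance
d, E = np.linalg.eigh(np.cov(X))
Xw = (E / np.sqrt(d)) @ E.T @ X

w_a = fastica_one_unit(Xw, np.array([1.0, 0.0]))
w_b = fastica_one_unit(Xw, np.array([1.0, 0.0]))
print(np.allclose(w_a, w_b))  # deterministic init -> identical result
```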

  7. A Field-Portable Cell Analyzer without a Microscope and Reagents

    PubMed Central

    Oh, Sangwoo; Lee, Moonjin; Hwang, Yongha

    2017-01-01

    This paper demonstrates a commercial-level field-portable lens-free cell analyzer called the NaviCell (No-stain and Automated Versatile Innovative cell analyzer) capable of automatically analyzing cell count and viability without employing an optical microscope or reagents. Based on the lens-free shadow imaging technique, the NaviCell (162 × 135 × 138 mm3 and 1.02 kg) has the advantage of providing analysis results with an improved standard deviation across measurements, owing to its large field of view. Importantly, cell counting and viability testing can be performed without the use of any reagent, thereby simplifying the measurement procedure and reducing potential errors during sample preparation. In this study, the performance of the NaviCell for cell counting and viability testing was demonstrated using 13 and six cell lines, respectively. Based on the results of the hemocytometer (de facto standard), the error rate (ER) and coefficient of variation (CV) of the NaviCell are approximately 3.27 and 2.16 times better than those of a commercial cell counter, respectively. Cell viability testing with the NaviCell also showed ER and CV improvements of 5.09 and 1.8 times, respectively, demonstrating sufficient potential in the field of cell analysis. PMID:29286336

  8. Systems Engineering Provides Successful High Temperature Steam Electrolysis Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charles V. Park; Emmanuel Ohene Opare, Jr.

    2011-06-01

    This paper describes two Systems Engineering Studies completed at the Idaho National Laboratory (INL) to support development of the High Temperature Steam Electrolysis (HTSE) process. HTSE produces hydrogen from water using nuclear power and was selected by the Department of Energy (DOE) for integration with the Next Generation Nuclear Plant (NGNP). The first study was a reliability, availability and maintainability (RAM) analysis to identify critical areas for technology development based on available information regarding expected component performance. An HTSE process baseline flowsheet at commercial scale was used as a basis. The NGNP project also established a process and capability to perform future RAM analyses. The analysis identified which components had the greatest impact on HTSE process availability and indicated that the HTSE process could achieve over 90% availability. The second study developed a series of life-cycle cost estimates for the various scale-ups required to demonstrate the HTSE process. Both studies were useful in identifying near- and long-term efforts necessary for successful HTSE process deployment. The size of demonstrations to support scale-up was refined, which is essential to estimate near- and long-term cost and schedule. The life-cycle funding profile, with high-level allocations, was identified as the program transitions from experiment-scale R&D to engineering-scale demonstration.

  9. Pilot physiology, cognition and flight performance during flight simulation exposed to a 3810-m hypoxic condition.

    PubMed

    Peacock, Corey A; Weber, Raymond; Sanders, Gabriel J; Seo, Yongsuk; Kean, David; Pollock, Brandon S; Burns, Keith J; Cain, Mark; LaScola, Phillip; Glickman, Ellen L

    2017-03-01

    Hypoxia is a physiological state defined as a reduction in the distribution of oxygen to the tissues of the body. It has been considered a major factor in aviation safety worldwide because of its potential for pilot disorientation. Pilots are able to operate aircraft up to 3810 m without the use of supplemental oxygen and may exhibit symptoms associated with hypoxia. The aim of this study was to determine the effects of 3810 m on physiology, cognition and performance in pilots during a flight simulation. Ten healthy male pilots engaged in a counterbalanced experimental protocol comparing a 0-m normoxic condition (NORM) with a 3810-m hypoxic condition (HYP) on pilot physiology, cognition and flight performance. Repeated-measures analysis of variance demonstrated a significant (p ≤ 0.05) time-by-condition interaction for physiological and cognitive alterations during HYP. A paired-samples t test demonstrated no differences in pilot performance (p ≥ 0.05) between conditions. Pilots exhibited physiological and cognitive impairments; however, pilot performance was not affected by HYP.

  10. Shape design sensitivity analysis and optimal design of structural systems

    NASA Technical Reports Server (NTRS)

    Choi, Kyung K.

    1987-01-01

    The material derivative concept of continuum mechanics and an adjoint variable method of design sensitivity analysis are used to relate variations in structural shape to measures of structural performance. A domain method of shape design sensitivity analysis is used to best utilize the basic character of the finite element method, which gives accurate information not on the boundary but in the domain. Implementation of shape design sensitivity analysis using finite element computer codes is discussed. Recent numerical results are used to demonstrate the accuracy obtainable using the method. Results of the design sensitivity analysis are used to carry out design optimization of a built-up structure.

  11. The combined use of order tracking techniques for enhanced Fourier analysis of order components

    NASA Astrophysics Data System (ADS)

    Wang, K. S.; Heyns, P. S.

    2011-04-01

    Order tracking is one of the most important vibration analysis techniques for diagnosing faults in rotating machinery. It can be performed in many different ways, each of these with distinct advantages and disadvantages. However, in the end the analyst will often use Fourier analysis to transform the data from a time series to frequency or order spectra. It is therefore surprising that the study of the Fourier analysis of order-tracked systems seems to have been largely ignored in the literature. This paper considers the frequently used Vold-Kalman filter-based order tracking and computed order tracking techniques. The main pros and cons of each technique for Fourier analysis are discussed and the sequential use of Vold-Kalman filtering and computed order tracking is proposed as a novel idea to enhance the results of Fourier analysis for determining the order components. The advantages of the combined use of these order tracking techniques are demonstrated numerically on an SDOF rotor simulation model. Finally, the approach is also demonstrated on experimental data from a real rotating machine.
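
    Computed order tracking can be sketched briefly: integrate shaft speed to get cumulative revolutions, resample the vibration signal at constant angle increments, and take the FFT in the angle domain so the axis becomes order rather than frequency. The numpy example below uses a synthetic linear run-up; all signal parameters are illustrative.

```python
import numpy as np

fs = 2000.0
t = np.arange(0, 10, 1 / fs)

# Shaft speed ramps 10 -> 30 Hz; cumulative revolutions are its integral.
speed_hz = 10 + 2 * t
phase_rev = 10 * t + t**2

# Vibration dominated by the 2nd shaft order (2 cycles per revolution),
# i.e. a chirp in the time domain.
vib = np.sin(2 * np.pi * 2 * phase_rev)

# Resample at constant angle increments: N samples per revolution.
n_per_rev = 64
rev_grid = np.arange(0, phase_rev[-1], 1 / n_per_rev)
vib_angle = np.interp(rev_grid, phase_rev, vib)

# FFT in the angle domain: the axis is now "order", not Hz, so the
# swept component collapses into a single sharp line at order 2.
spec = np.abs(np.fft.rfft(vib_angle)) / len(vib_angle)
orders = np.fft.rfftfreq(len(vib_angle), d=1 / n_per_rev)
dominant = orders[np.argmax(spec)]
```

    A plain FFT of `vib` over the same run-up would smear the component across 20-60 Hz; the angle-domain FFT concentrates it at order 2, which is the point of computed order tracking.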

  12. AEROELASTIC SIMULATION TOOL FOR INFLATABLE BALLUTE AEROCAPTURE

    NASA Technical Reports Server (NTRS)

    Liever, P. A.; Sheta, E. F.; Habchi, S. D.

    2006-01-01

    A multidisciplinary analysis tool is under development for predicting the impact of aeroelastic effects on the functionality of inflatable ballute aeroassist vehicles in both the continuum and rarefied flow regimes. High-fidelity modules for continuum and rarefied aerodynamics, structural dynamics, heat transfer, and computational grid deformation are coupled in an integrated multi-physics, multi-disciplinary computing environment. This flexible and extensible approach allows the integration of state-of-the-art, stand-alone NASA and industry leading continuum and rarefied flow solvers and structural analysis codes into a computing environment in which the modules can run concurrently with synchronized data transfer. Coupled fluid-structure continuum flow demonstrations were conducted on a clamped ballute configuration. The feasibility of implementing a DSMC flow solver in the simulation framework was demonstrated, and loosely coupled rarefied flow aeroelastic demonstrations were performed. A NASA and industry technology survey identified CFD, DSMC and structural analysis codes capable of modeling non-linear shape and material response of thin-film inflated aeroshells. The simulation technology will find direct and immediate applications with NASA and industry in ongoing aerocapture technology development programs.

  13. Guaifenesin and increased sperm motility: a preliminary case report.

    PubMed

    Means, Gary; Berry-Cabán, Cristóbal S; Hammermeuller, Kurt

    2010-12-20

    A review of the literature and an extensive Medline search revealed that this is the first case report of the use of guaifenesin to increase sperm motility. A 32-year-old male presented for an infertility evaluation. He reported an inability to conceive with his wife after 18 months of unprotected intercourse. A semen analysis was performed that included spermatozoa count, liquefaction, morphology, motility, viscosity and volume. Initial results of the semen analysis demonstrated low sperm count and motility. The provider offered treatment with guaifenesin 600 mg extended release tablets twice daily. Two months after guaifenesin therapy the semen analysis was repeated that demonstrated marked improvement in both total sperm count and motility. Evidence for the effectiveness of guaifenesin is almost entirely anecdotal. Given the mechanism of action of guaifenesin, it is not clear from this case why the patient demonstrated such a large improvement in both sperm count and motility. Additional studies of the effects of guaifenesin on male fertility could yield information of the medication's effect on men with normal or decreased total sperm counts.

  14. Guaifenesin and increased sperm motility: a preliminary case report

    PubMed Central

    Means, Gary; Berry-Cabán, Cristóbal S; Hammermeuller, Kurt

    2011-01-01

    Background A review of the literature and an extensive Medline search revealed that this is the first case report of the use of guaifenesin to increase sperm motility. Case A 32-year-old male presented for an infertility evaluation. He reported an inability to conceive with his wife after 18 months of unprotected intercourse. A semen analysis was performed that included spermatozoa count, liquefaction, morphology, motility, viscosity and volume. Initial results of the semen analysis demonstrated low sperm count and motility. The provider offered treatment with guaifenesin 600 mg extended release tablets twice daily. Two months after guaifenesin therapy the semen analysis was repeated that demonstrated marked improvement in both total sperm count and motility. Conclusion Evidence for the effectiveness of guaifenesin is almost entirely anecdotal. Given the mechanism of action of guaifenesin, it is not clear from this case why the patient demonstrated such a large improvement in both sperm count and motility. Additional studies of the effects of guaifenesin on male fertility could yield information of the medication’s effect on men with normal or decreased total sperm counts. PMID:21403786

  15. A wavelet transform algorithm for peak detection and application to powder x-ray diffraction data.

    PubMed

    Gregoire, John M; Dale, Darren; van Dover, R Bruce

    2011-01-01

    Peak detection is ubiquitous in the analysis of spectral data. While many noise-filtering algorithms and peak identification algorithms have been developed, recent work [P. Du, W. Kibbe, and S. Lin, Bioinformatics 22, 2059 (2006); A. Wee, D. Grayden, Y. Zhu, K. Petkovic-Duran, and D. Smith, Electrophoresis 29, 4215 (2008)] has demonstrated that both of these tasks are efficiently performed through analysis of the wavelet transform of the data. In this paper, we present a wavelet-based peak detection algorithm with user-defined parameters that can be readily applied to any spectral data. Particular attention is given to the algorithm's resolution of overlapping peaks. The algorithm is implemented for the analysis of powder diffraction data, and successful detection of Bragg peaks is demonstrated for both low signal-to-noise data from theta-theta diffraction of nanoparticles and combinatorial x-ray diffraction data from a composition-spread thin film. These datasets have different types of background signals, which are effectively removed in the wavelet-based method, and the results demonstrate that the algorithm provides a robust method for automated peak detection.
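
    SciPy ships a wavelet-ridge peak detector in the same spirit (scipy.signal.find_peaks_cwt), which follows continuous wavelet transform ridge lines across a range of candidate peak widths; this makes it insensitive to smooth backgrounds and helps separate nearby peaks. A minimal sketch on a synthetic two-peak "diffraction" pattern (not the authors' algorithm or data):

```python
import numpy as np
from scipy.signal import find_peaks_cwt

x = np.arange(600)
# Two Gaussian "Bragg peaks" of different heights on a sloping background.
pattern = (np.exp(-((x - 180) ** 2) / (2 * 8**2))
           + 0.7 * np.exp(-((x - 225) ** 2) / (2 * 8**2))
           + 0.002 * x)

# Searching across a range of widths lets the CWT ridge lines pick out
# the peaks while the Ricker wavelet annihilates the linear background.
peak_idx = find_peaks_cwt(pattern, widths=np.arange(3, 20))
```

    A simple local-maximum search on `pattern` would report the background-dominated right edge as well; the wavelet-ridge approach does not.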

  16. Levofloxacin Penetration into Epithelial Lining Fluid as Determined by Population Pharmacokinetic Modeling and Monte Carlo Simulation

    PubMed Central

    Drusano, G. L.; Preston, S. L.; Gotfried, M. H.; Danziger, L. H.; Rodvold, K. A.

    2002-01-01

    Levofloxacin was administered orally to steady state in volunteers randomly assigned doses of 500 and 750 mg. Plasma and epithelial lining fluid (ELF) samples were obtained at 4, 12, and 24 h after the final dose. All data were comodeled in a population pharmacokinetic analysis employing BigNPEM. Penetration was evaluated from the population mean parameter vector values and from the results of a 1,000-subject Monte Carlo simulation. Evaluation from the population mean values demonstrated a penetration ratio (ELF/plasma) of 1.16. The Monte Carlo simulation provided a measure of dispersion, demonstrating a mean ratio of 3.18, with a median of 1.43 and a 95% confidence interval of 0.14 to 19.1. Population analysis with Monte Carlo simulation provides the best and least-biased estimate of penetration. It also demonstrates clearly that we can expect differences in penetration between patients. This analysis did not deal with inflammation, as it was performed in volunteers. The influence of lung pathology on penetration needs to be examined. PMID:11796385
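
    The way a Monte Carlo simulation turns a single point estimate of penetration into a distribution can be sketched with numpy. The lognormal exposure distributions below are purely hypothetical stand-ins, not the study's fitted population model; the point is that a skewed ratio distribution can have a mean well above its median, as in the reported 3.18 versus 1.43.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1000  # simulated subjects, as in the paper's 1,000-subject run

# Hypothetical between-subject distributions for ELF and plasma exposure
# (lognormal is a common PK assumption; these parameters are illustrative,
# not the study's fitted population values).
auc_plasma = rng.lognormal(mean=np.log(48.0), sigma=0.3, size=n)
auc_elf = rng.lognormal(mean=np.log(56.0), sigma=0.8, size=n)

ratio = auc_elf / auc_plasma          # penetration ratio per subject
median = np.median(ratio)
lo, hi = np.percentile(ratio, [2.5, 97.5])
# For a right-skewed ratio the mean sits above the median, so reporting
# only a mean penetration ratio overstates the typical subject.
```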

  17. A Policy Analysis of Using Unit Costs as a Means of Performance Measurement in the Air Force Science and Technology Program

    DTIC Science & Technology

    1991-09-01

    demonstrate, is that there is more than one way to account for S&T costs and evaluate its program performance. The first option evaluated considers...

  18. Generative models for discovering sparse distributed representations.

    PubMed Central

    Hinton, G E; Ghahramani, Z

    1997-01-01

    We describe a hierarchical, generative model that can be viewed as a nonlinear generalization of factor analysis and can be implemented in a neural network. The model uses bottom-up, top-down and lateral connections to perform Bayesian perceptual inference correctly. Once perceptual inference has been performed the connection strengths can be updated using a very simple learning rule that only requires locally available information. We demonstrate that the network learns to extract sparse, distributed, hierarchical representations. PMID:9304685

  19. Neurological Limitations of Aircraft Operations: Human Performance Implications (les Limitations neurologiques des operations aeriennes: les Consequences pour les performances des equipages).

    DTIC Science & Technology

    1996-04-01

    ...maximal stress at which bone fracture occurs. This study demonstrated the usefulness of finite element analysis for... Results from centrifuge experiments... Exposure to Impact Acceleration (15). In these reports, fracture of the bones, dislocation...

  20. Using coal inside California for electric power

    NASA Technical Reports Server (NTRS)

    Moore, J. B.

    1978-01-01

    In a detailed analysis performed at Southern California Edison on a wide variety of technologies, the direct combustion of coal and medium-BTU gas from coal were ranked just below nuclear power for future nonpetroleum-based electric power generation. As a result, engineering studies were performed for demonstration projects for the direct combustion of coal and for medium-BTU gas from coal. Graphs are presented for power demand and power cost. Direct coal combustion and coal gasification processes are presented.

  1. Transient analysis using conical shell elements

    NASA Technical Reports Server (NTRS)

    Yang, J. C. S.; Goeller, J. E.; Messick, W. T.

    1973-01-01

    The use of the NASTRAN conical shell element in static, eigenvalue, and direct transient analyses is demonstrated. The results of a NASTRAN static solution of an externally pressurized ring-stiffened cylinder agree well with a theoretical discontinuity analysis. Good agreement is also obtained between the NASTRAN direct transient response of a uniform cylinder to a dynamic end load and one-dimensional solutions obtained using a method of characteristics stress wave code and a standing wave solution. Finally, a NASTRAN eigenvalue analysis is performed on a hydroballistic model idealized with conical shell elements.

  2. Analysis of Discrete-Source Damage Progression in a Tensile Stiffened Composite Panel

    NASA Technical Reports Server (NTRS)

    Wang, John T.; Lotts, Christine G.; Sleight, David W.

    1999-01-01

    This paper demonstrates the progressive failure analysis capability of NASA Langley's COMET-AR finite element analysis code on a large-scale built-up composite structure. A large-scale five-stringer composite panel with a 7-in. long discrete-source damage was analyzed from initial loading to final failure, including geometric and material nonlinearities. Predictions using different mesh sizes, different saw-cut modeling approaches, and different failure criteria were performed and assessed. All failure predictions correlated reasonably well with the test result.

  3. Automated system for the on-line monitoring of powder blending processes using near-infrared spectroscopy. Part I. System development and control.

    PubMed

    Hailey, P A; Doherty, P; Tapsell, P; Oliver, T; Aldridge, P K

    1996-03-01

    An automated system for the on-line monitoring of powder blending processes is described. The system employs near-infrared (NIR) spectroscopy using fibre-optics and a graphical user interface (GUI) developed in the LabVIEW environment. The complete supervisory control and data analysis (SCADA) software controls blender and spectrophotometer operation and performs statistical spectral data analysis in real time. A data analysis routine using standard deviation is described to demonstrate an approach to the real-time determination of blend homogeneity.
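
    A standard-deviation-based homogeneity check of the kind described can be sketched as a moving-block statistic: at each time point, compute the standard deviation across the last few spectra at every wavelength and average; the value decays toward zero as the blend homogenizes. The data and threshold below are illustrative, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(1)
n_wavelengths = 100

# Simulated NIR spectra during blending: each spectrum converges toward
# the final blend spectrum as mixing proceeds (illustrative data only).
target = rng.normal(size=n_wavelengths)
spectra = [target + rng.normal(scale=1.0 / (k + 1), size=n_wavelengths)
           for k in range(30)]

# Moving-block standard deviation: SD across the last `block` spectra at
# each wavelength, averaged over wavelengths.
block = 5
msd = [np.mean(np.std(spectra[k - block:k], axis=0))
       for k in range(block, len(spectra))]

# Declare the blend homogeneous once the statistic drops below a
# (hypothetical) acceptance threshold.
homogeneous_at = next((i for i, v in enumerate(msd) if v < 0.08), None)
```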

  4. SSME 3-D Turnaround Duct flow analysis - CFD predictions

    NASA Technical Reports Server (NTRS)

    Brankovic, Andreja; Stowers, Steven T.; Mcconnaughey, Paul

    1988-01-01

    CFD analysis is presently employed to obtain an improved flowfield for an individual flowpath in the case of the Space Shuttle Main Engine's High Pressure Fuel Turbopump Turn-Around Duct (TAD), which conducts the flow exiting from the gas turbines into the fuel bowl. It is demonstrated that the application of CFD to TAD flow analysis, giving attention to the duct's configuration and to the number, shape, and alignment of the diffuser struts, can enhance understanding of flow physics and result in improved duct design and performance.

  5. Preliminary analyses of WL experiment No. 701, space environment effects on operating fiber optic systems

    NASA Technical Reports Server (NTRS)

    Taylor, E. W.; Berry, J. N.; Sanchez, A. D.; Padden, R. J.; Chapman, S. P.

    1992-01-01

    A brief overview of the analyses performed to date on WL Experiment-701 is presented. Four active digital fiber optic links were directly exposed to the space environment for a period of 2114 days. The links were situated aboard the Long Duration Exposure Facility (LDEF) with the cabled, single fiber windings atop an experimental tray containing instrumentation for exercising the experiment in orbit. Despite the unplanned and prolonged exposure to trapped and galactic radiation, wide temperature extremes, atomic oxygen interactions, and micro-meteorite and debris impacts, in most instances the optical data links performed well within the experimental limits. Analysis of the recorded orbital data clearly indicates that fiber optic applications in space will meet with success. Ongoing tests and analysis of the experiment at the Phillips Laboratory's Optoelectronics Laboratory will expand this premise, and establish the first known and extensive database of active fiber optic link performance during prolonged space exposure. WL Exp-701 was designed as a feasibility demonstration for fiber optic technology in space applications, and to study the performance of operating fiber systems exposed to space environmental factors such as galactic radiation, and wide temperature cycling. WL Exp-701 is widely acknowledged as a benchmark accomplishment that clearly demonstrates, for the first time, that fiber optic technology can be successfully used in a variety of space applications.

  6. High-performance equation solvers and their impact on finite element analysis

    NASA Technical Reports Server (NTRS)

    Poole, Eugene L.; Knight, Norman F., Jr.; Davis, D. Dale, Jr.

    1990-01-01

    The role of equation solvers in modern structural analysis software is described. Direct and iterative equation solvers which exploit vectorization on modern high-performance computer systems are described and compared. The direct solvers are two Cholesky factorization methods. The first method utilizes a novel variable-band data storage format to achieve very high computation rates and the second method uses a sparse data storage format designed to reduce the number of operations. The iterative solvers are preconditioned conjugate gradient methods. Two different preconditioners are included; the first uses a diagonal matrix storage scheme to achieve high computation rates and the second requires a sparse data storage scheme and converges to the solution in fewer iterations than the first. The impact of using all of the equation solvers in a common structural analysis software system is demonstrated by solving several representative structural analysis problems.
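
    The diagonal-storage preconditioner described above corresponds to the classic Jacobi preconditioner for conjugate gradients: apply the inverse of the matrix diagonal to the residual at each iteration. A minimal SciPy sketch (the matrix here is a generic sparse SPD stand-in, not one of the paper's structural problems):

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg, LinearOperator

# A sparse SPD "stiffness-like" matrix: tridiagonal with a strong diagonal.
n = 200
A = diags([-1.0, 4.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

# Diagonal (Jacobi) preconditioner: only the diagonal is stored, so it is
# cheap to apply and vectorizes trivially.
inv_diag = 1.0 / A.diagonal()
M = LinearOperator((n, n), matvec=lambda r: inv_diag * r)

x, info = cg(A, b, M=M)         # info == 0 indicates convergence
residual = np.linalg.norm(b - A @ x)
```

    A sparse incomplete-factorization preconditioner would typically converge in fewer iterations than this diagonal one, at a higher cost per iteration, mirroring the trade-off the abstract describes.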

  7. High-performance equation solvers and their impact on finite element analysis

    NASA Technical Reports Server (NTRS)

    Poole, Eugene L.; Knight, Norman F., Jr.; Davis, D. D., Jr.

    1992-01-01

    The role of equation solvers in modern structural analysis software is described. Direct and iterative equation solvers which exploit vectorization on modern high-performance computer systems are described and compared. The direct solvers are two Cholesky factorization methods. The first method utilizes a novel variable-band data storage format to achieve very high computation rates and the second method uses a sparse data storage format designed to reduce the number of operations. The iterative solvers are preconditioned conjugate gradient methods. Two different preconditioners are included; the first uses a diagonal matrix storage scheme to achieve high computation rates and the second requires a sparse data storage scheme and converges to the solution in fewer iterations than the first. The impact of using all of the equation solvers in a common structural analysis software system is demonstrated by solving several representative structural analysis problems.

  8. Coupled Solid Rocket Motor Ballistics and Trajectory Modeling for Higher Fidelity Launch Vehicle Design

    NASA Technical Reports Server (NTRS)

    Ables, Brett

    2014-01-01

    Multi-stage launch vehicles with solid rocket motors (SRMs) face design optimization challenges, especially when the mission scope changes frequently. Significant performance benefits can be realized if the solid rocket motors are optimized to the changing requirements. While SRMs represent a fixed performance at launch, rapid design iterations enable flexibility at design time, yielding significant performance gains. The streamlining and integration of SRM design and analysis can be achieved with improved analysis tools. While powerful and versatile, the Solid Performance Program (SPP) is not conducive to rapid design iteration. Performing a design iteration with SPP and a trajectory solver is a labor intensive process. To enable a better workflow, SPP, the Program to Optimize Simulated Trajectories (POST), and the interfaces between them have been improved and automated, and a graphical user interface (GUI) has been developed. The GUI enables real-time visual feedback of grain and nozzle design inputs, enforces parameter dependencies, removes redundancies, and simplifies manipulation of SPP and POST's numerous options. Automating the analysis also simplifies batch analyses and trade studies. Finally, the GUI provides post-processing, visualization, and comparison of results. Wrapping legacy high-fidelity analysis codes with modern software provides the improved interface necessary to enable rapid coupled SRM ballistics and vehicle trajectory analysis. Low cost trade studies demonstrate the sensitivities of flight performance metrics to propulsion characteristics. Incorporating high fidelity analysis from SPP into vehicle design reduces performance margins and improves reliability. By flying an SRM designed with the same assumptions as the rest of the vehicle, accurate comparisons can be made between competing architectures. 
In summary, this flexible workflow is a critical component to designing a versatile launch vehicle model that can accommodate a volatile mission scope.

  9. Determination of residual carbamate, organophosphate, and phenyl urea pesticides in drinking and surface water by high-performance liquid chromatography/tandem mass spectrometry.

    PubMed

    Hao, Chunyan; Nguyen, Bick; Zhiao, Xiaoming; Chen, Ernie; Yang, Paul

    2010-01-01

    Methods using SPE followed by HPLC/MS/MS analysis were developed and validated for the determination of 39 pesticides in different aquatic environmental matrixes. The target pesticides included 12 carbamates, 15 organophosphates, and 12 phenyl ureas, out of which 16 are regulated in North America. Method detection limits were in the low ng/L range using the U.S. Environmental Protection Agency's protocol and multiple reaction monitoring (MRM) data acquisition, meeting the regulatory needs in the United States, Canada, and European Union. Isotope-labeled compounds were used as injection internal standards, as well as method surrogates to improve the data quality. QC/QA data (e.g., method recovery and within-run and between-run method precision) derived from multiyear monitoring activities were used to demonstrate method ruggedness. The same QC/QA data also showed that the method exerted no obvious matrix effect on the target analytes. Parameters that affect method performance, such as preservatives, pH values, sample storage time, and sample extract storage time, were also studied in detail. Accredited by the Canadian Association for Laboratory Accreditation and licensed by the Ontario government for drinking water analysis, these methods have been applied to the analysis of drinking water, ground water, and surface water samples collected in the province of Ontario, Canada, to ensure the pristine nature of Ontario's aquatic environment. Using the scheduled MRM (sMRM) data acquisition algorithm, it was demonstrated that sMRM improved the S/N of extracted ion chromatograms by at least two- to six-fold and, therefore, enhanced the short- and long-term instrument precision, demonstrated the ability to offer high throughput multiresidue analysis, and allowed the use of two MRM transitions for each compound to achieve higher confidence for compound identification.

  10. An optimization based sampling approach for multiple metrics uncertainty analysis using generalized likelihood uncertainty estimation

    NASA Astrophysics Data System (ADS)

    Zhou, Rurui; Li, Yu; Lu, Di; Liu, Haixing; Zhou, Huicheng

    2016-09-01

    This paper investigates the use of an epsilon-dominance non-dominated sorted genetic algorithm II (ɛ-NSGAII) as a sampling approach with the aim of improving sampling efficiency for multiple-metrics uncertainty analysis using Generalized Likelihood Uncertainty Estimation (GLUE). The effectiveness of ɛ-NSGAII based sampling is demonstrated in comparison with Latin hypercube sampling (LHS) by analyzing sampling efficiency, multiple-metrics performance, parameter uncertainty and flood forecasting uncertainty in a case study of flood forecasting uncertainty evaluation based on the Xinanjiang model (XAJ) for Qing River reservoir, China. The results demonstrate the following advantages of the ɛ-NSGAII based sampling approach over LHS: (1) it performs more effectively and efficiently; for example, the simulation time required to generate 1000 behavioral parameter sets is roughly nine times shorter. (2) The Pareto tradeoffs between metrics are demonstrated clearly by the solutions from ɛ-NSGAII based sampling, and their Pareto optimal values are better than those of LHS, indicating better forecasting accuracy of the ɛ-NSGAII parameter sets. (3) The parameter posterior distributions from ɛ-NSGAII based sampling are concentrated in the appropriate ranges rather than uniform, which accords with their physical significance, and parameter uncertainties are reduced significantly. (4) The forecasted floods are close to the observations as evaluated by three measures: the normalized total flow outside the uncertainty intervals (FOUI), average relative band-width (RB) and average deviation amplitude (D). Flood forecasting uncertainty is also substantially reduced with ɛ-NSGAII based sampling. This study provides a new sampling approach to improve multiple-metrics uncertainty analysis under the framework of GLUE, and could be used to reveal the underlying mechanisms of parameter sets under multiple conflicting metrics in the uncertainty analysis process.
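
    The GLUE step being sampled for — score each parameter set with a likelihood measure and keep the "behavioral" sets above a threshold — can be sketched in numpy. Plain Monte Carlo stands in for the ɛ-NSGAII sampler, and a toy linear model stands in for the Xinanjiang model; everything here is illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy "rainfall-runoff" model and synthetic observations (illustrative
# only; the paper uses the Xinanjiang model, not this linear stand-in).
x = np.linspace(0, 1, 50)
obs = 2.0 * x + 0.5 + rng.normal(scale=0.05, size=x.size)

# GLUE: sample parameter sets, score each with a likelihood measure
# (here Nash-Sutcliffe efficiency), and keep "behavioral" sets above
# a threshold.
a_s = rng.uniform(0, 4, 5000)
b_s = rng.uniform(0, 1, 5000)
sims = a_s[:, None] * x + b_s[:, None]
nse = 1 - np.sum((sims - obs) ** 2, axis=1) / np.sum((obs - obs.mean()) ** 2)
behavioral = nse > 0.9

# Likelihood weights over the behavioral sets drive GLUE's posterior
# distributions and predictive uncertainty bounds.
weights = nse[behavioral] / nse[behavioral].sum()
```

    The paper's contribution is in the sampling stage: ɛ-NSGAII concentrates samples in the behavioral region across multiple metrics, so far fewer model runs are wasted than with uniform sampling like the above.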

  11. Sandia PUF Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-06-11

    This program is a graphical user interface for measuring and performing interactive analysis of physical unclonable functions (PUFs). It is intended for demonstration and education purposes. See license.txt for license details. The program features a PUF visualization that demonstrates how signatures differ between PUFs and how they exhibit noise over repeated measurements. A similarity scoreboard shows the user how close the current measurement is to the closest chip signatures in the database. Other metrics such as average noise and inter-chip Hamming distances are presented to the user. Randomness tests published in NIST SP 800-22 can be computed and displayed. Noise and inter-chip histograms for the sample of PUFs and repeated PUF measurements can be drawn.
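
    The two metrics the tool reports — inter-chip Hamming distance (ideally near 50% of the bits) and noise over repeated measurements of one chip (ideally near 0%) — can be sketched directly. The bit-flip rate and signature sizes below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
n_bits = 128

# Ideal PUF signatures: independent random bit strings, one per chip.
chips = rng.integers(0, 2, size=(8, n_bits))

def hamming(a, b):
    return int(np.sum(a != b))

# Inter-chip Hamming distance: for good uniqueness, unrelated chips
# should differ in roughly half their bits.
inter = [hamming(chips[i], chips[j])
         for i in range(8) for j in range(i + 1, 8)]

# Noise: repeated measurements of one chip with a small (hypothetical)
# 5% bit-flip rate; for good reliability these distances stay small.
flips = rng.random((20, n_bits)) < 0.05
repeats = chips[0] ^ flips
noise = [hamming(chips[0], r) for r in repeats]
```

    Histograms of `inter` and `noise` are exactly the two histograms the tool draws; a usable PUF keeps the two distributions well separated.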

  12. Demonstration of a directional sonic prism in two dimensions using an air-acoustic leaky wave antenna

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naify, Christina J., E-mail: christina.naify@nrl.navy.mil; Rohde, Charles A.; Calvo, David C.

    Analysis and experimental demonstration of a two-dimensional acoustic leaky wave antenna is presented for use in air. The antenna is comprised of a two-dimensional waveguide patterned with radiating acoustic shunts. When excited using a single acoustic source within the waveguide, the antenna acts as a sonic prism that exhibits frequency steering. This design allows for control of acoustic steering angle using only a single source transducer and a patterned aperture. Aperture design was determined using transmission line analysis and finite element methods. The designed antenna was fabricated and the steering angle measured. The performance of the measured aperture was within 9% of predicted angle magnitudes over all examined frequencies.

  13. High Stability Engine Control (HISTEC): Flight Demonstration Results

    NASA Technical Reports Server (NTRS)

    Delaat, John C.; Southwick, Robert D.; Gallops, George W.; Orme, John S.

    1998-01-01

    Future aircraft turbine engines, both commercial and military, must be able to accommodate expected increased levels of steady-state and dynamic engine-face distortion. The current approach of incorporating sufficient design stall margin to tolerate these increased levels of distortion would significantly reduce performance. The High Stability Engine Control (HISTEC) program has developed technologies for an advanced, integrated engine control system that uses measurement-based estimates of distortion to enhance engine stability. The resulting distortion tolerant control reduces the required design stall margin, with a corresponding increase in performance and/or decrease in fuel burn. The HISTEC concept was successfully flight demonstrated on the F-15 ACTIVE aircraft during the summer of 1997. The flight demonstration was planned and carried out in two parts, the first to show distortion estimation, and the second to show distortion accommodation. Post-flight analysis shows that the HISTEC technologies are able to successfully estimate and accommodate distortion, transiently setting the stall margin requirement on-line and in real-time. Flight demonstration of the HISTEC technologies has significantly reduced the risk of transitioning the technology to tactical and commercial engines.

  14. Physician performance assessment using a composite quality index.

    PubMed

    Liu, Kaibo; Jain, Shabnam; Shi, Jianjun

    2013-07-10

    Assessing physician performance is important for the purposes of measuring and improving quality of service and reducing healthcare delivery costs. In recent years, physician performance scorecards have been used to provide feedback on individual measures; however, one key challenge is how to develop a composite quality index that combines multiple measures for overall physician performance evaluation. Establishing appropriate weights to combine indicators across multiple dimensions is controversial and not easily resolved. In this study, we proposed a generic unsupervised learning approach to develop a single composite index for physician performance assessment by using non-negative principal component analysis. We developed a new algorithm named iterative quadratic programming to solve the numerical issue in the non-negative principal component analysis approach. We conducted real case studies to demonstrate the performance of the proposed method. We provided interpretations from both statistical and clinical perspectives to evaluate the developed composite ranking score in practice. In addition, we implemented root cause assessment techniques to explain physician performance for improvement purposes. Copyright © 2012 John Wiley & Sons, Ltd.
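    As an illustration of the idea of a single non-negative composite index, the sketch below substitutes a simple projected power iteration for the paper's iterative-quadratic-programming algorithm; the toy data and the projection scheme are assumptions, not the authors' method.

```python
def nonneg_first_pc(X, iters=200):
    """Approximate a non-negative first principal component of the
    data X (list of rows) by power iteration on the centered Gram
    matrix, projecting onto the non-negative orthant each step."""
    n, p = len(X), len(X[0])
    means = [sum(row[j] for row in X) / n for j in range(p)]
    Xc = [[row[j] - means[j] for j in range(p)] for row in X]
    w = [1.0] * p
    for _ in range(iters):
        t = [sum(Xc[i][j] * w[j] for j in range(p)) for i in range(n)]
        v = [sum(Xc[i][j] * t[i] for i in range(n)) for j in range(p)]
        v = [max(x, 0.0) for x in v]          # non-negativity projection
        norm = sum(x * x for x in v) ** 0.5 or 1.0
        w = [x / norm for x in v]
    return w

def composite_scores(X, w):
    """Composite quality index: non-negative weighted sum of measures."""
    return [sum(xi * wi for xi, wi in zip(row, w)) for row in X]

# Toy physician-by-measure matrix (two correlated quality measures)
X = [[1.0, 2.0], [2.0, 4.0], [3.0, 6.0], [4.0, 8.0]]
w = nonneg_first_pc(X)
scores = composite_scores(X, w)
```

    The non-negativity constraint is what makes the resulting weights interpretable as a composite quality index: every measure contributes in the same direction, so a higher score is unambiguously better.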

  15. Understanding product cost vs. performance through an in-depth system Monte Carlo analysis

    NASA Astrophysics Data System (ADS)

    Sanson, Mark C.

    2017-08-01

    The manner in which an optical system is toleranced and compensated greatly affects the cost of building it. With a detailed understanding of different tolerance and compensation methods, the end user can decide on the balance of cost and performance. A detailed, phased Monte Carlo analysis can be used to demonstrate the tradeoffs between cost and performance. In complex, high-performance optical systems, performance is fine-tuned by making adjustments to the optical systems after they are initially built. This process enables the best overall system performance without the need to fabricate components to stringent tolerance levels that can often lie outside a fabricator's manufacturing capabilities. A good simulation of as-built performance can interrogate different steps of the fabrication and build process, aiding evaluation of whether the measured parameters are within the acceptable range of system performance at that stage of the build. Finding errors before an optical system progresses further into the build process saves both time and money. Having the appropriate tolerances and compensation strategy tied to a specific performance level will optimize the overall product cost.
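    The core of such a Monte Carlo tolerance study is repeated sampling of as-built parameters within their tolerance bands and counting how many builds meet spec. A minimal sketch, with an invented two-parameter merit function standing in for a real optical performance model:

```python
import random

def as_built(nominal, tolerances, rng):
    """Draw one as-built realization: each parameter perturbed
    uniformly within its +/- tolerance band."""
    return {k: nominal[k] + rng.uniform(-tolerances[k], tolerances[k])
            for k in nominal}

def monte_carlo_yield(nominal, tolerances, merit, spec, trials=5000, seed=1):
    """Fraction of simulated builds whose merit function meets spec."""
    rng = random.Random(seed)
    passed = sum(1 for _ in range(trials)
                 if merit(as_built(nominal, tolerances, rng)) <= spec)
    return passed / trials

# Hypothetical element misalignments; merit = RSS error proxy
nominal = {"tilt": 0.0, "decenter": 0.0}
loose = {"tilt": 0.10, "decenter": 0.10}
tight = {"tilt": 0.02, "decenter": 0.02}
merit = lambda b: (b["tilt"] ** 2 + b["decenter"] ** 2) ** 0.5
yield_loose = monte_carlo_yield(nominal, loose, merit, spec=0.05)
yield_tight = monte_carlo_yield(nominal, tight, merit, spec=0.05)
```

    Tightening the tolerance band raises the simulated yield at higher fabrication cost, which is exactly the cost-versus-performance tradeoff the phased analysis quantifies.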

  16. Analysis of architect’s performance indicators in project delivery process

    NASA Astrophysics Data System (ADS)

    Marisa, A.

    2018-03-01

    An architect, as a professional in the construction industry, should perform well in the project delivery process. As a design professional, the architect plays an important role in ensuring that the process is well conducted and delivers a high-quality product to the clients. Thus, analyzing architects' performance indicators is crucial in the project delivery process. This study aims to analyze the relative importance of architect performance indicators in the project delivery process among registered architects in North Sumatera, Indonesia. A total of five indicators measuring architect performance in the project delivery process were identified, and 110 completed questionnaires were obtained and used for data analysis. A relative importance index is used to rank the indicators. The results indicate that focus on the clients is the most important indicator of architect performance in the project delivery process. This study demonstrates that project communication is among the crucial indicators the architects themselves perceive as measuring their performance, and it fills a knowledge gap, overlooked by previous studies, on identifying the most important indicator for measuring architect performance from the architects' own perspectives, to improve performance assessment in the project delivery process.
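    A relative importance index is conventionally computed as RII = ΣW / (A × N), where W is each respondent's rating, A the top of the rating scale, and N the number of respondents. A sketch with invented Likert ratings (only the "focus on clients" and "project communication" indicator names come from the abstract; the third is hypothetical):

```python
def relative_importance_index(ratings, max_scale=5):
    """RII = sum(W) / (A * N): W = rating, A = scale top, N = count."""
    return sum(ratings) / (max_scale * len(ratings))

responses = {
    "focus on clients":      [5, 5, 4, 5, 4],
    "project communication": [4, 5, 4, 4, 4],
    "timeliness":            [3, 4, 3, 4, 3],   # hypothetical indicator
}
ranked = sorted(responses,
                key=lambda k: relative_importance_index(responses[k]),
                reverse=True)
```

    RII lies in (0, 1], so indicators rated on the same scale can be ranked directly regardless of how many respondents rated each one.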

  17. Seal Analysis for the Ares-I Upper Stage Fuel Tank Manhole Cover

    NASA Technical Reports Server (NTRS)

    Phillips, Dawn R.; Wingate, Robert J.

    2010-01-01

    Techniques for studying the performance of Naflex pressure-assisted seals in the Ares-I Upper Stage liquid hydrogen tank manhole cover seal joint are explored. To assess the feasibility of using the identical seal design for the Upper Stage as was used for the Space Shuttle External Tank manhole covers, a preliminary seal deflection analysis using the ABAQUS commercial finite element software is employed. The ABAQUS analyses are performed using three-dimensional symmetric wedge finite element models. This analysis technique is validated by first modeling a heritage External Tank liquid hydrogen tank manhole cover joint and correlating the results to heritage test data. Once the technique is validated, the Upper Stage configuration is modeled. The Upper Stage analyses are performed at 1.4 times the expected pressure to comply with the Constellation Program factor of safety requirement on joint separation. Results from the analyses performed with the External Tank and Upper Stage models demonstrate the effects of several modeling assumptions on the seal deflection. The analyses for Upper Stage show that the integrity of the seal is successfully maintained.

  18. The effects of temperature on service employees' customer orientation: an experimental approach.

    PubMed

    Kolb, Peter; Gockel, Christine; Werth, Lioba

    2012-01-01

    Numerous studies have demonstrated how temperature can affect perceptual, cognitive and psychomotor performance (e.g. Hancock, P.A., Ross, J., and Szalma, J., 2007. A meta-analysis of performance response under thermal stressors. Human Factors: The Journal of the Human Factors and Ergonomics Society, 49 (5), 851-877). We extend this research to interpersonal aspects of performance, namely service employees' and salespeople's customer orientation. We combine ergonomics with recent research on social cognition linking physical with interpersonal warmth/coldness. In Experiment 1, a scenario study in the lab, we demonstrate that student participants in rooms with a low temperature showed more customer-oriented behaviour and gave higher customer discounts than participants in rooms with a high temperature - even in zones of thermal comfort. In Experiment 2, we show the existence of alternative possibilities to evoke positive temperature effects on customer orientation in a sample of 126 service and sales employees using a semantic priming procedure. Overall, our results confirm the existence of temperature effects on customer orientation. Furthermore, important implications for services, retail and other settings of interpersonal interactions are discussed. Practitioner Summary: Temperature effects on performance have emerged as a vital research topic. Owing to services' increasing economic importance, we transferred this research to the construct of customer orientation, focusing on performance in service and retail settings. The demonstrated temperature effects are transferable to services, retail and other settings of interpersonal interactions.

  19. Using virtual reality to analyze sports performance.

    PubMed

    Bideau, Benoit; Kulpa, Richard; Vignais, Nicolas; Brault, Sébastien; Multon, Franck; Craig, Cathy

    2010-01-01

    Improving performance in sports can be difficult because many biomechanical, physiological, and psychological factors come into play during competition. A better understanding of the perception-action loop employed by athletes is necessary. This requires isolating contributing factors to determine their role in player performance. Because of its inherent limitations, video playback doesn't permit such in-depth analysis. Interactive, immersive virtual reality (VR) can overcome these limitations and foster a better understanding of sports performance from a behavioral-neuroscience perspective. Two case studies using VR technology and a sophisticated animation engine demonstrate how to use information from visual displays to inform a player's future course of action.

  20. Analysis and Design of Cryogenic Pressure Vessels for Automotive Hydrogen Storage

    NASA Astrophysics Data System (ADS)

    Espinosa-Loza, Francisco Javier

    Cryogenic pressure vessels maximize hydrogen storage density by combining the high pressure (350-700 bar) typical of today's composite pressure vessels with the cryogenic temperature (as low as 25 K) typical of low pressure liquid hydrogen vessels. Cryogenic pressure vessels comprise a high-pressure inner vessel made of carbon fiber-coated metal (similar to those used for storage of compressed gas), a vacuum space filled with numerous sheets of highly reflective metalized plastic (for high performance thermal insulation), and a metallic outer jacket. High density of hydrogen storage is key to practical hydrogen-fueled transportation by enabling (1) long-range (500+ km) transportation with high capacity vessels that fit within available spaces in the vehicle, and (2) reduced cost per kilogram of hydrogen stored through reduced need for expensive structural material (carbon fiber composite) necessary to make the vessel. Low temperature of storage also leads to reduced expansion energy (by an order of magnitude or more vs. ambient temperature compressed gas storage), potentially providing important safety advantages. All this is accomplished while simultaneously avoiding fuel venting typical of cryogenic vessels for all practical use scenarios. This dissertation describes the work necessary for developing and demonstrating successive generations of cryogenic pressure vessels demonstrated at Lawrence Livermore National Laboratory. The work included (1) conceptual design, (2) detailed system design, (3) structural analysis of cryogenic pressure vessels, (4) thermal analysis of heat transfer through cryogenic supports and vacuum multilayer insulation, and (5) experimental demonstration.
Aside from succeeding in demonstrating a hydrogen storage approach that has established all the world records for hydrogen storage on vehicles (longest driving range, maximum hydrogen storage density, and maximum containment of cryogenic hydrogen without venting), the work also demonstrated a methodology for computationally efficient detailed modeling of cryogenic pressure vessels. The work continues with support of the US Department of Energy to demonstrate a new generation of cryogenic vessels anticipated to improve on the hydrogen storage performance figures previously achieved in this project. The author looks forward to further contributing to a future of long-range, inexpensive, and safe zero-emissions transportation.

  1. Well-Being and the Social Environment of Work: A Systematic Review of Intervention Studies.

    PubMed

    Daniels, Kevin; Watson, David; Gedikli, Cigdem

    2017-08-16

    There is consistent evidence that a good social environment in the workplace is associated with employee well-being. However, there has been no specific review of interventions to improve well-being through improving social environments at work. We conducted a systematic review of such interventions, and also considered performance as an outcome. We found eight studies of interventions. Six studies were of interventions that were based on introducing shared social activities into workgroups. Six out of the six studies demonstrated improvements in well-being across the sample (five studies), or for an identifiable sub-group (one study). Four out of the five studies demonstrated improvements in social environments, and four out of the five studies demonstrated improvements in indicators of performance. Analysis of implementation factors indicated that the interventions based on shared activities require some external facilitation, favorable worker attitudes prior to the intervention, and several different components. We found two studies that focused on improving fairness perceptions in the workplace. There were no consistent effects of these interventions on well-being or performance. We conclude that there is some evidence that interventions that increase the frequency of shared activities between workers can improve worker well-being and performance. We offer suggestions for improving the evidence base.

  2. Well-Being and the Social Environment of Work: A Systematic Review of Intervention Studies

    PubMed Central

    Gedikli, Cigdem

    2017-01-01

    There is consistent evidence that a good social environment in the workplace is associated with employee well-being. However, there has been no specific review of interventions to improve well-being through improving social environments at work. We conducted a systematic review of such interventions, and also considered performance as an outcome. We found eight studies of interventions. Six studies were of interventions that were based on introducing shared social activities into workgroups. Six out of the six studies demonstrated improvements in well-being across the sample (five studies), or for an identifiable sub-group (one study). Four out of the five studies demonstrated improvements in social environments, and four out of the five studies demonstrated improvements in indicators of performance. Analysis of implementation factors indicated that the interventions based on shared activities require some external facilitation, favorable worker attitudes prior to the intervention, and several different components. We found two studies that focused on improving fairness perceptions in the workplace. There were no consistent effects of these interventions on well-being or performance. We conclude that there is some evidence that interventions that increase the frequency of shared activities between workers can improve worker well-being and performance. We offer suggestions for improving the evidence base. PMID:28813009

  3. Spectral multi-energy CT texture analysis with machine learning for tissue classification: an investigation using classification of benign parotid tumours as a testing paradigm.

    PubMed

    Al Ajmi, Eiman; Forghani, Behzad; Reinhold, Caroline; Bayat, Maryam; Forghani, Reza

    2018-06-01

    There is a rich amount of quantitative information in spectral datasets generated from dual-energy CT (DECT). In this study, we compare the performance of texture analysis performed on multi-energy datasets to that of virtual monochromatic images (VMIs) at 65 keV only, using classification of the two most common benign parotid neoplasms as a testing paradigm. Forty-two patients with pathologically proven Warthin tumour (n = 25) or pleomorphic adenoma (n = 17) were evaluated. Texture analysis was performed on VMIs ranging from 40 to 140 keV in 5-keV increments (multi-energy analysis) or 65-keV VMIs only, which is typically considered equivalent to single-energy CT. Random forest (RF) models were constructed for outcome prediction using separate randomly selected training and testing sets or the entire patient set. Using multi-energy texture analysis, tumour classification in the independent testing set had accuracy, sensitivity, specificity, positive predictive value, and negative predictive value of 92%, 86%, 100%, 100%, and 83%, compared to 75%, 57%, 100%, 100%, and 63%, respectively, for single-energy analysis. Multi-energy texture analysis demonstrates superior performance compared to single-energy texture analysis of VMIs at 65 keV for classification of benign parotid tumours. • We present and validate a paradigm for texture analysis of DECT scans. • Multi-energy dataset texture analysis is superior to single-energy dataset texture analysis. • DECT texture analysis has high accuracy for diagnosis of benign parotid tumours. • DECT texture analysis with machine learning can enhance non-invasive diagnostic tumour evaluation.
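    The five reported metrics all derive from the binary confusion matrix. The sketch below reproduces the multi-energy test-set percentages from counts chosen to match them; the actual test-set composition is an assumption, not taken from the paper.

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard binary classification metrics from confusion counts."""
    return {
        "accuracy":    (tp + tn) / (tp + fp + tn + fn),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv":         tp / (tp + fp),
        "npv":         tn / (tn + fn),
    }

# Hypothetical 12-patient testing set consistent with the reported
# multi-energy results: 6 true positives, 1 false negative,
# 5 true negatives, 0 false positives.
m = diagnostic_metrics(tp=6, fp=0, tn=5, fn=1)
```

    With zero false positives, specificity and PPV are both 100% regardless of the other counts, which is why both analyses in the study report those two metrics at 100% while differing in accuracy, sensitivity, and NPV.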

  4. Sensitivity of negative subsequent memory and task-negative effects to age and associative memory performance.

    PubMed

    de Chastelaine, Marianne; Mattson, Julia T; Wang, Tracy H; Donley, Brian E; Rugg, Michael D

    2015-07-01

    The present fMRI experiment employed associative recognition to investigate the relationships between age and encoding-related negative subsequent memory effects and task-negative effects. Young, middle-aged and older adults (total n=136) were scanned while they made relational judgments on visually presented word pairs. In a later memory test, the participants made associative recognition judgments on studied, rearranged (items studied on different trials) and new pairs. Several regions, mostly localized to the default mode network, demonstrated negative subsequent memory effects in an across age-group analysis. All but one of these regions also demonstrated task-negative effects, although there was no correlation between the size of the respective effects. Whereas negative subsequent memory effects demonstrated a graded attenuation with age, task-negative effects declined markedly between the young and the middle-aged group, but showed no further reduction in the older group. Negative subsequent memory effects did not correlate with memory performance within any age group. By contrast, in the older group only, task-negative effects predicted later memory performance. The findings demonstrate that negative subsequent memory and task-negative effects depend on dissociable neural mechanisms and likely reflect distinct cognitive processes. The relationship between task-negative effects and memory performance in the older group might reflect the sensitivity of these effects to variations in amount of age-related neuropathology. This article is part of a Special Issue entitled SI: Memory. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. Multi-GPU parallel algorithm design and analysis for improved inversion of probability tomography with gravity gradiometry data

    NASA Astrophysics Data System (ADS)

    Hou, Zhenlong; Huang, Danian

    2017-09-01

    In this paper, we first study the inversion of probability tomography (IPT) with gravity gradiometry data. The spatial resolution of the results is improved by multi-tensor joint inversion, a depth-weighting matrix, and other methods. To address the problems posed by big data in exploration, we present a parallel algorithm and its performance analysis, combining Compute Unified Device Architecture (CUDA) with Open Multi-Processing (OpenMP) for Graphics Processing Unit (GPU) acceleration. Tests on a synthetic model and real data from Vinton Dome yield improved results, demonstrating that the improved inversion algorithm is effective and feasible. The designed parallel algorithm outperforms other CUDA implementations, with a maximum speedup of more than 200. In the performance analysis, multi-GPU speedup and multi-GPU efficiency are applied to analyze the scalability of the multi-GPU programs. The designed parallel algorithm is demonstrated to be able to process larger-scale data, and the new analysis method is practical.
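    Multi-GPU speedup and efficiency, the two scalability measures used in the performance analysis, are simple ratios of wall-clock times. The runtimes below are invented for illustration:

```python
def speedup(t_single, t_multi):
    """Speedup of an n-GPU run relative to the single-GPU baseline."""
    return t_single / t_multi

def efficiency(t_single, t_multi, n_gpus):
    """Parallel efficiency: speedup divided by GPU count (1.0 = ideal)."""
    return speedup(t_single, t_multi) / n_gpus

# Hypothetical wall-clock times (seconds) for the same inversion
runtimes = {1: 1200.0, 2: 640.0, 4: 350.0}
t1 = runtimes[1]
report = {n: (speedup(t1, t), efficiency(t1, t, n))
          for n, t in runtimes.items()}
```

    Efficiency typically drops as GPUs are added (communication and load-imbalance overheads grow), so plotting both measures against GPU count is what reveals how far a multi-GPU program scales usefully.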

  6. Stiffness modeling of compliant parallel mechanisms and applications in the performance analysis of a decoupled parallel compliant stage

    NASA Astrophysics Data System (ADS)

    Jiang, Yao; Li, Tie-Min; Wang, Li-Ping

    2015-09-01

    This paper investigates the stiffness modeling of compliant parallel mechanism (CPM) based on the matrix method. First, the general compliance matrix of a serial flexure chain is derived. The stiffness modeling of CPMs is next discussed in detail, considering the relative positions of the applied load and the selected displacement output point. The derived stiffness models have simple and explicit forms, and the input, output, and coupling stiffness matrices of the CPM can easily be obtained. The proposed analytical model is applied to the stiffness modeling and performance analysis of an XY parallel compliant stage with input and output decoupling characteristics. Then, the key geometrical parameters of the stage are optimized to obtain the minimum input decoupling degree. Finally, a prototype of the compliant stage is developed and its input axial stiffness, coupling characteristics, positioning resolution, and circular contouring performance are tested. The results demonstrate the excellent performance of the compliant stage and verify the effectiveness of the proposed theoretical model. The general stiffness models provided in this paper will be helpful for performance analysis, especially in determining coupling characteristics, and the structure optimization of the CPM.

  7. Probabilistic finite elements for fracture and fatigue analysis

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Belytschko, T.; Lawrence, M.; Besterfield, G. H.

    1989-01-01

    The fusion of the probabilistic finite element method (PFEM) and reliability analysis for probabilistic fracture mechanics (PFM) is presented. A comprehensive method for determining the probability of fatigue failure for curved crack growth was developed. The criterion for failure or performance function is stated as: the fatigue life of a component must exceed the service life of the component; otherwise failure will occur. An enriched element that has the near-crack-tip singular strain field embedded in the element is used to formulate the equilibrium equation and solve for the stress intensity factors at the crack-tip. Performance and accuracy of the method are demonstrated on a classical mode I fatigue problem.
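    The stated performance function, g = fatigue life - service life with failure when g < 0, lends itself to a brief Monte Carlo sketch. The lognormal fatigue-life distribution and its parameters below are assumptions for illustration, not the paper's PFEM/reliability formulation.

```python
import random

def failure_probability(life_sampler, service_life, trials=20000, seed=7):
    """Monte Carlo estimate of P(g < 0) for the performance function
    g = fatigue life - service life (failure when life falls short)."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(trials)
                   if life_sampler(rng) - service_life < 0)
    return failures / trials

# Hypothetical lognormal fatigue life (cycles); fixed service life
sampler = lambda rng: rng.lognormvariate(11.0, 0.3)
pf = failure_probability(sampler, service_life=30000)
```

    Reliability methods such as the one fused with PFEM in the paper estimate the same probability analytically or semi-analytically, avoiding the large sample counts that plain Monte Carlo needs for small failure probabilities.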

  8. Energy Efficient Engine Low Pressure Subsystem Aerodynamic Analysis

    NASA Technical Reports Server (NTRS)

    Hall, Edward J.; Delaney, Robert A.; Lynn, Sean R.; Veres, Joseph P.

    1998-01-01

    The objective of this study was to demonstrate the capability to analyze the aerodynamic performance of the complete low pressure subsystem (LPS) of the Energy Efficient Engine (EEE). Detailed analyses were performed using three-dimensional Navier-Stokes numerical models employing advanced clustered-processor computing platforms. The analysis evaluates the impact of steady aerodynamic interaction effects between the components of the LPS at design and off-design operating conditions. Mechanical coupling is provided by adjusting the rotational speed of common shaft-mounted components until a power balance is achieved. The Navier-Stokes modeling of the complete low pressure subsystem provides critical knowledge of component aero/mechanical interactions that previously were unknown to the designer until after hardware testing.

  9. Magnetite-doped polydimethylsiloxane (PDMS) for phosphopeptide enrichment.

    PubMed

    Sandison, Mairi E; Jensen, K Tveen; Gesellchen, F; Cooper, J M; Pitt, A R

    2014-10-07

    Reversible phosphorylation plays a key role in numerous biological processes. Mass spectrometry-based approaches are commonly used to analyze protein phosphorylation, but such analysis is challenging, largely due to low phosphorylation stoichiometry. Hence, a number of phosphopeptide enrichment strategies have been developed, including metal oxide affinity chromatography (MOAC). Here, we describe a new material for performing MOAC, a magnetite-doped polydimethylsiloxane (PDMS), which is suitable for the creation of microwell-array and microfluidic systems enabling low-volume, high-throughput analysis. Incubation time and sample loading were explored and optimized, demonstrating that the embedded magnetite is able to enrich phosphopeptides. This substrate-based approach is rapid, straightforward, and suitable for simultaneously performing multiple low-volume enrichments.

  10. Examination of pain experiences of cancer patients in western Turkey: a phenomenological study.

    PubMed

    Akin Korhan, Esra; Yildirim, Yasemin; Uyar, Meltem; Eyigör, Can; Uslu, Ruçhan

    2013-01-01

    This study aims to explore the individual experience of living with cancer pain. This qualitative study used a phenomenological research design. In-depth, open interviews were conducted with participants to collect the data, and Colaizzi's qualitative method of analysis was applied. Following analysis of the data, the statements made by the cancer patients during the interviews were grouped under five themes; consistent with the questionnaire format, five themes and 19 subthemes describing the patients' pain were identified. The results of our study demonstrate that cancer patients go through negative physical, psychological, and social experiences due to the pain they suffer.

  11. Global/local stress analysis of composite panels

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.; Knight, Norman F., Jr.

    1989-01-01

    A method for performing a global/local stress analysis is described, and its capabilities are demonstrated. The method employs spline interpolation functions which satisfy the linear plate bending equation to determine displacements and rotations from a global model which are used as boundary conditions for the local model. Then, the local model is analyzed independent of the global model of the structure. This approach can be used to determine local, detailed stress states for specific structural regions using independent, refined local models which exploit information from less-refined global models. The method presented is not restricted to having a priori knowledge of the location of the regions requiring local detailed stress analysis. This approach also reduces the computational effort necessary to obtain the detailed stress state. Criteria for applying the method are developed. The effectiveness of the method is demonstrated using a classical stress concentration problem and a graphite-epoxy blade-stiffened panel with a discontinuous stiffener.

  12. Structural analyses of the JPL Mars Pathfinder impact

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gwinn, K.W.

    1994-12-31

    The purpose of this paper is to demonstrate that finite element analysis can be used in the design process for high performance fabric structures. These structures exhibit extreme geometric nonlinearity; specifically, the contact and interaction of fabric surfaces with the large deformation which necessarily results from membrane structures introduces great complexity to analyses of this type. All of these features are demonstrated here in the analysis of the Jet Propulsion Laboratory (JPL) Mars Pathfinder impact onto Mars. This lander system uses airbags to envelop the lander experiment package, protecting it with large deformation upon contact. Results from the analysis show the stress in the fabric airbags, forces in the internal tendon support system, forces in the latches and hinges which allow the lander to deploy after impact, and deceleration of the lander components. All of these results provide the JPL engineers with design guidance for the success of this novel lander system.

  13. Structural analyses of the JPL Mars Pathfinder impact

    NASA Astrophysics Data System (ADS)

    Gwinn, Kenneth W.

    The purpose of this paper is to demonstrate that finite element analysis can be used in the design process for high performance fabric structures. These structures exhibit extreme geometric nonlinearity; specifically, the contact and interaction of fabric surfaces with the large deformation which necessarily results from membrane structures introduces great complexity to analyses of this type. All of these features are demonstrated here in the analysis of the Jet Propulsion Laboratory (JPL) Mars Pathfinder impact onto Mars. This lander system uses airbags to envelop the lander experiment package, protecting it with large deformation upon contact. Results from the analysis show the stress in the fabric airbags, forces in the internal tendon support system, forces in the latches and hinges which allow the lander to deploy after impact, and deceleration of the lander components. All of these results provide the JPL engineers with design guidance for the success of this novel lander system.

  14. Global/local stress analysis of composite structures. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.

    1989-01-01

    A method for performing a global/local stress analysis is described and its capabilities are demonstrated. The method employs spline interpolation functions which satisfy the linear plate bending equation to determine displacements and rotations from a global model which are used as boundary conditions for the local model. Then, the local model is analyzed independent of the global model of the structure. This approach can be used to determine local, detailed stress states for specific structural regions using independent, refined local models which exploit information from less-refined global models. The method presented is not restricted to having a priori knowledge of the location of the regions requiring local detailed stress analysis. This approach also reduces the computational effort necessary to obtain the detailed stress state. Criteria for applying the method are developed. The effectiveness of the method is demonstrated using a classical stress concentration problem and a graphite-epoxy blade-stiffened panel with a discontinuous stiffener.

  15. Blurring the Inputs: A Natural Language Approach to Sensitivity Analysis

    NASA Technical Reports Server (NTRS)

    Kleb, William L.; Thompson, Richard A.; Johnston, Christopher O.

    2007-01-01

    To document model parameter uncertainties and to automate sensitivity analyses for numerical simulation codes, a natural-language-based method to specify tolerances has been developed. With this new method, uncertainties are expressed in a natural manner, i.e., as one would on an engineering drawing, namely, 5.25 +/- 0.01. This approach is robust and readily adapted to various application domains because it does not rely on parsing the particular structure of input file formats. Instead, tolerances of a standard format are added to existing fields within an input file. As a demonstration of the power of this simple, natural language approach, a Monte Carlo sensitivity analysis is performed for three disparate simulation codes: fluid dynamics (LAURA), radiation (HARA), and ablation (FIAT). Effort required to harness each code for sensitivity analysis was recorded to demonstrate the generality and flexibility of this new approach.
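    The tolerance-field idea is easy to prototype: scan an input line for "value +/- tol" patterns and substitute a random draw, leaving the rest of the (format-agnostic) line untouched. A minimal sketch; the field name and regex details are assumptions, not the tool's implementation.

```python
import random
import re

TOLERANCE = re.compile(r"(-?\d+\.?\d*)\s*\+/-\s*(\d+\.?\d*)")

def perturb_line(line, rng):
    """Replace every 'value +/- tol' field in an input line with a
    uniform draw from [value - tol, value + tol]."""
    def draw(match):
        value, tol = float(match.group(1)), float(match.group(2))
        return f"{rng.uniform(value - tol, value + tol):.6g}"
    return TOLERANCE.sub(draw, line)

rng = random.Random(42)
sample = perturb_line("wall_temp = 5.25 +/- 0.01  # K", rng)
```

    Because the substitution never parses the file's structure, the same few lines work unchanged across disparate input formats, which is the portability argument the paper makes for LAURA, HARA, and FIAT.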

  16. Low-Dimensional Statistics of Anatomical Variability via Compact Representation of Image Deformations.

    PubMed

    Zhang, Miaomiao; Wells, William M; Golland, Polina

    2016-10-01

    Using image-based descriptors to investigate clinical hypotheses and therapeutic implications is challenging due to the notorious "curse of dimensionality" coupled with a small sample size. In this paper, we present a low-dimensional analysis of anatomical shape variability in the space of diffeomorphisms and demonstrate its benefits for clinical studies. To combat the high dimensionality of the deformation descriptors, we develop a probabilistic model of principal geodesic analysis in a bandlimited low-dimensional space that still captures the underlying variability of image data. We demonstrate the performance of our model on a set of 3D brain MRI scans from the Alzheimer's Disease Neuroimaging Initiative (ADNI) database. Our model yields a more compact representation of group variation at substantially lower computational cost than models based on the high-dimensional state-of-the-art approaches such as tangent space PCA (TPCA) and probabilistic principal geodesic analysis (PPGA).
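
    The tangent space PCA (TPCA) baseline that the paper compares against can be sketched with an SVD. The deformation descriptors below are random stand-ins at toy sizes, not ADNI data, and the variable names are illustrative only.

```python
import numpy as np

# Toy stand-ins for flattened deformation descriptors: one row per subject.
rng = np.random.default_rng(1)
n_subjects, dim = 20, 1000
X = rng.standard_normal((n_subjects, dim))

Xc = X - X.mean(axis=0)                  # center in the tangent space
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 5                                    # retained principal modes
codes = Xc @ Vt[:k].T                    # compact per-subject representation
reconstruction = codes @ Vt[:k] + X.mean(axis=0)

explained = (S[:k] ** 2).sum() / (S ** 2).sum()
print(codes.shape, f"{explained:.1%} variance explained")
```

    The cost of this baseline grows with the full descriptor dimension `dim`, which is the expense the paper's bandlimited low-dimensional model avoids.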

  17. Design and analysis of compact MMIC switches utilising GaAs pHEMTs in 3D multilayer technology

    NASA Astrophysics Data System (ADS)

    Haris, Norshakila; Kyabaggu, Peter B. K.; Alim, Mohammad A.; Rezazadeh, Ali A.

    2017-05-01

    In this paper, we demonstrate for the first time the implementation of three-dimensional multilayer technology on GaAs-based pseudomorphic high electron mobility transistor (pHEMT) switches. Two types of pHEMT switches are considered, namely single-pole single-throw (SPST) and single-pole double-throw (SPDT). The design and analysis of the devices are demonstrated first through a simulation of the industry-recognised standard model, TriQuint’s Own Model—Level 3, developed by TriQuint Semiconductor, Inc. From the simulation analysis, three optimised SPST and SPDT pHEMT switches, which can address applications ranging from the L to X bands, are fabricated and tested. The performance of the pHEMT switches using multilayer technology is comparable to that of the current state-of-the-art pHEMT switches, while simultaneously offering compact circuits with the advantage of integration with other MMIC components.

  18. Quantitative performance targets by using balanced scorecard system: application to waste management and public administration.

    PubMed

    Mendes, Paula; Nunes, Luis Miguel; Teixeira, Margarida Ribau

    2014-09-01

    This article demonstrates how decision-makers can be guided in the process of defining performance target values in the balanced scorecard system. We apply a method based on sensitivity analysis with Monte Carlo simulation to the municipal solid waste management system in Loulé Municipality (Portugal). The method includes two steps: sensitivity analysis of the performance indicators, to identify those with the highest impact on the balanced scorecard model outcomes; and sensitivity analysis of the target values for the previously identified performance indicators. Sensitivity analysis shows that four strategic objectives (IPP1: Comply with the national waste strategy; IPP4: Reduce nonrenewable resources and greenhouse gases; IPP5: Optimize the life-cycle of waste; and FP1: Meet and optimize the budget) alone contribute 99.7% of the variability in the overall balanced scorecard value. Thus, these strategic objectives had a much stronger impact on the estimated balanced scorecard outcome than did others, with IPP1 and IPP4 accounting for over 55% and 22% of the variance in the overall balanced scorecard value, respectively. The remaining performance indicators contribute only marginally. In addition, a change in the target value of a single indicator made the overall balanced scorecard value change by as much as 18%. This may lead to involuntarily biased performance target-setting decisions by organizations, if not prevented with the help of methods such as that proposed and applied in this study.
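
    The first step of the method, ranking indicators by their contribution to the variance of the overall scorecard value, can be sketched with a one-at-a-time Monte Carlo loop. The weights, indicator ranges, and nominal values below are illustrative placeholders, not the Loulé case-study figures.

```python
import random

# Illustrative scorecard: the overall value is a weighted sum of
# normalised indicator values in [0, 1].
weights = {"IPP1": 0.35, "IPP4": 0.20, "IPP5": 0.15, "FP1": 0.15, "other": 0.15}

def scorecard(values):
    return sum(weights[k] * values[k] for k in weights)

def variance_share(indicator, trials=2000, rng=random.Random(0)):
    """Variance of the overall score when only one indicator varies."""
    scores = []
    for _ in range(trials):
        values = {k: 0.5 for k in weights}          # hold others at nominal
        values[indicator] = rng.uniform(0.0, 1.0)   # perturb one indicator
        scores.append(scorecard(values))
    mean = sum(scores) / trials
    return sum((s - mean) ** 2 for s in scores) / trials

shares = {k: variance_share(k) for k in weights}
total = sum(shares.values())
for k, v in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"{k}: {v / total:.1%} of variance")
```

    With a linear scorecard, each indicator's contribution scales with the square of its weight, which is why a few heavily weighted objectives can dominate the overall variability, as the article reports.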

  19. Signal processing for the detection of explosive residues on varying substrates using laser-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Morton, Kenneth D., Jr.; Torrione, Peter A.; Collins, Leslie

    2011-05-01

    Laser-induced breakdown spectroscopy (LIBS) can provide rapid, minimally destructive chemical analysis of substances with the benefit of little to no sample preparation. Therefore, LIBS is a viable technology for the detection of substances of interest in near real-time fielded remote sensing scenarios. Of particular interest to military and security operations is the detection of explosive residues on various surfaces. It has been demonstrated that LIBS is capable of detecting such residues; however, the surface or substrate on which the residue is present can alter the observed spectra. Standard chemometric techniques such as principal components analysis and partial least squares discriminant analysis have previously been applied to explosive residue detection; however, the classification techniques developed on such data perform best against residue/substrate pairs that were included in model training and do not perform well when the residue/substrate pairs are not in the training set. Specifically, residues in the training set may not be correctly detected if they are presented on a previously unseen substrate. In this work, we explicitly model LIBS spectra resulting from the residue and substrate to attempt to separate the response from each of the two components. This separation process is performed jointly with classifier design to ensure that the classifier that is developed is able to detect residues of interest without being confused by variations in the substrates. We demonstrate that the proposed classification algorithm provides improved robustness to variations in substrate compared to standard chemometric techniques for residue detection.
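
    The residue/substrate separation idea can be illustrated with a simple orthogonal projection: if a substrate's spectral signature is known (or estimated), projecting each observed spectrum onto the subspace orthogonal to that signature leaves a residue signal for a classifier to score. This is a much cruder stand-in for the paper's joint model/classifier design, and the spectra below are synthetic random vectors, not real LIBS data.

```python
import numpy as np

rng = np.random.default_rng(0)
n_channels = 50
substrate = rng.random(n_channels)   # synthetic substrate signature
residue = rng.random(n_channels)     # synthetic residue signature

def remove_substrate(spectrum, substrate):
    """Project out the substrate direction from an observed spectrum."""
    u = substrate / np.linalg.norm(substrate)
    return spectrum - (spectrum @ u) * u

# Observed spectrum: a weak residue on a strongly emitting substrate.
observed = 0.3 * residue + 5.0 * substrate + 0.01 * rng.standard_normal(n_channels)
cleaned = remove_substrate(observed, substrate)

def corr(a, b):
    return float(np.corrcoef(a, b)[0, 1])

print(f"match before: {corr(observed, residue):.2f}, after: {corr(cleaned, residue):.2f}")
```

    After projection the substrate contribution is removed exactly, so the cleaned spectrum matches the residue template far better than the raw observation does; the paper's contribution is learning such a separation jointly with the classifier rather than assuming the substrate signature is known.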

  20. Development of portable defocusing micro-scale spatially offset Raman spectroscopy.

    PubMed

    Realini, Marco; Botteon, Alessandra; Conti, Claudia; Colombo, Chiara; Matousek, Pavel

    2016-05-10

    We present, for the first time, portable defocusing micro-Spatially Offset Raman Spectroscopy (micro-SORS). Micro-SORS is a concept permitting the analysis of thin, highly turbid stratified layers beyond the reach of conventional Raman microscopy. The technique is applicable to the analysis of painted layers in cultural heritage (panels, canvases and mural paintings, painted statues and decorated objects in general) as well as in many other areas, including polymer, biological and biomedical applications and catalytic and forensic sciences, where highly turbid stratified layers are present and where invasive analysis is undesirable or impossible. So far the technique has been demonstrated only on benchtop Raman microscopes, precluding the non-invasive analysis of larger samples and samples in situ. The new set-up is characterised conceptually on a range of artificially assembled two-layer systems, demonstrating its benefits and performance across several application areas. These included a stratified polymer sample, a pharmaceutical tablet and layered paint samples. The same samples were also analysed by a high-performance (non-portable) benchtop Raman microscope to provide benchmarking against our earlier research. The realisation of the vision of delivering portability to micro-SORS has transformative potential spanning multiple disciplines, as it fully unlocks, for the first time, the non-invasive and non-destructive aspects of micro-SORS, enabling it to be applied also to large and non-portable samples in situ without recourse to removing samples, or their fragments, for laboratory analysis on benchtop Raman microscopes.
